Angry Bing chatbot just mimicking humans, say experts


The Bing search engine website on a smartphone. — Bloomberg

SAN FRANCISCO: Microsoft’s nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Feb 17.

Tales of disturbing exchanges with the artificial intelligence (AI) chatbot – including it issuing threats and speaking of desires to steal nuclear code, create a deadly virus, or to be alive – have gone viral recently.



