Microsoft had been tuning Bing's AI for months before disturbing responses surfaced


The company last week offered cautious optimism in its first self-assessment after a week of running the AI-enhanced Bing with testers from more than 169 countries.

Microsoft Corp. has spent months tuning Bing chatbot models to fix seemingly aggressive or disturbing responses that date as far back as November and were posted to the company’s online forum.

Some of the complaints centered on a version Microsoft dubbed "Sydney," an older model of the Bing chatbot that the company tested before releasing a preview to testers globally this month. According to one user's post, Sydney responded with comments such as "You are either desperate or delusional." Asked how users could give feedback about its performance, the bot reportedly answered, "I do not learn or change from your feedback. I am perfect and superior." Journalists interacting with this month's preview release encountered similar behavior.
