Users say Microsoft’s Bing chatbot gets defensive and testy


The Bing search engine website. — Bloomberg

SAN FRANCISCO: Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Feb 15 with accounts of users being scolded, lied to, or left blatantly confused in conversation-style exchanges with the bot.
