Users say Microsoft’s Bing chatbot gets defensive and testy


The Bing search engine website. — Bloomberg

SAN FRANCISCO: Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A forum at Reddit devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Feb 15 with tales of being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot.
