Users say Microsoft’s Bing chatbot gets defensive and testy


The Bing search engine website. — Bloomberg

SAN FRANCISCO: Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.

A forum at Reddit devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Feb 15 with tales of being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot.

