Microsoft Bing AI ends chat when prompted about ‘feelings’


AI researchers have emphasised that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give an appearance of having feelings. — Bloomberg

Microsoft Corp appeared to have implemented new, more severe restrictions on user interactions with its “reimagined” Bing Internet search engine, with the system going mum after prompts mentioning “feelings” or “Sydney,” the internal alias used by the Bing team in developing the artificial intelligence-powered chatbot.

“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can talk to a search engine that is so eager to help me.”

