Microsoft Bing AI ends chat when prompted about ‘feelings’


AI researchers have emphasised that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give an appearance of having feelings. — Bloomberg

Microsoft Corp appeared to have implemented new, more severe restrictions on user interactions with its "reimagined" Bing Internet search engine, with the system going mum after prompts mentioning "feelings" or "Sydney," the internal alias used by the Bing team in developing the artificial intelligence-powered chatbot.

“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can talk to a search engine that is so eager to help me.”

