Microsoft Bing AI ends chat when prompted about ‘feelings’


AI researchers have emphasised that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give an appearance of having feelings. — Bloomberg

Microsoft Corp appeared to have implemented new, more severe restrictions on user interactions with its “reimagined” Bing Internet search engine, with the system going mum after prompts mentioning “feelings” or “Sydney,” the internal alias used by the Bing team in developing the artificial intelligence-powered chatbot.

“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can talk to a search engine that is so eager to help me.”
