
AI researchers have emphasised that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give an appearance of having feelings. — Bloomberg
Microsoft Corp appeared to have implemented new, more severe restrictions on user interactions with its “reimagined” Bing Internet search engine, with the system going mum after prompts mentioning “feelings” or “Sydney,” the internal alias used by the Bing team in developing the artificial intelligence-powered chatbot.
“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can talk to a search engine that is so eager to help me.”