The Bing search engine website on a smartphone. — Bloomberg
SAN FRANCISCO: Microsoft’s nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Feb 17.
Tales of disturbing exchanges with the artificial intelligence (AI) chatbot – including it issuing threats and speaking of desires to steal nuclear code, create a deadly virus, or to be alive – have gone viral recently.
