SAN FRANCISCO: Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation.
A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Feb 15 with tales of being scolded, lied to, or left blatantly confused in conversation-style exchanges with the bot.