Why teachers should stop calling AI’s mistakes ‘hallucinations’


Artificial intelligence tools can generate an essay in seconds, but the result may contain several factual errors. Schools and districts must teach both teachers and students to inspect AI output in a way that does not reinforce the biases built into these systems. —Image by freepik

Artificial intelligence tools can produce an essay on the migratory patterns of waterfowl or US President Barack Obama's K-12 education agenda in seconds – but the work might be riddled with factual errors.

Those inaccuracies are commonly known as "hallucinations" in computer-science parlance, but education technology experts are trying to steer away from that term.
