Why teachers should stop calling AI’s mistakes ‘hallucinations’


Artificial intelligence tools can generate an essay in seconds, but the result may contain factual errors. Schools and districts must make the need for scrutiny clear to both teachers and students, in a way that does not reinforce the biases already built into these systems. —Image by freepik

Artificial intelligence tools can produce an essay on the migratory patterns of waterfowl or former US President Barack Obama's K-12 education agenda in seconds – but the work might be riddled with factual errors.

Those inaccuracies are commonly known as "hallucinations" in computer science parlance – but education technology experts are trying to steer away from that term.

"We know that industry tends to use the term 'hallucinations' to allude to errors that are made by [AI] systems and tools," said Pati Ruiz, a senior director of ed tech and emerging technologies for Digital Promise, a nonprofit that works on technology and equity issues, during an Education Week webinar earlier this month.

But researchers who think about how to talk about AI recommend using another name for those errors – such as "mistake," Ruiz said.

First off, the word "hallucinations," Ruiz said, "make(s) light of mental health issues."

And she added that using that word for AI's errors "might give students a false sense of this tool having humanlike qualities. And that's something that we advocate against, right? We advocate for folks to understand these tools as just that, tools that will support us as humans."

‘AI systems and tools make lots of mistakes’

Ruiz noted that she and another expert who spoke during the webinar, Kip Glazer, the principal of Mountain View High School in California, wrote about this issue earlier this year.

What's more, students need to understand that they shouldn't take any information that they get from ChatGPT and similar tools at face value, Ruiz said.

"Generative AI systems and tools make lots of mistakes," she said. "We need to have expertise across content areas so that we can review the outputs of generative AI. And we recommend always questioning the outputs of generative AI systems and tools."

Schools and districts need to make that need for scrutiny clear to teachers and students. "Guidance is really important so that we can all use (AI) effectively and appropriately and in a way that doesn't perpetuate the biases that already exist in these systems," Ruiz added. – Education Week, Bethesda, Md./Tribune News Service
