Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said


A computer screen displays text produced by an AI-powered transcription program called Whisper at Cornell University in Ithaca, New York. The text preceded by '#Ground truth' is what was actually said, while the sentences preceded by 'text' show how the transcription program interpreted the words. — AP

SAN FRANCISCO: Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy”.

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text – known in the industry as hallucinations – can include racial commentary, violent rhetoric and even imagined medical treatments.
