Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said


A computer screen displays text produced by an AI-powered transcription program called Whisper at Cornell University in Ithaca, New York. The text preceded by '#Ground truth' is what was actually said, while the sentences preceded by 'text' show how the transcription program interpreted the words. — AP

SAN FRANCISCO: Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy”.

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text – known in the industry as hallucinations – can include racial commentary, violent rhetoric and even imagined medical treatments.
