Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said


A computer screen displays text produced by an AI-powered transcription program called Whisper at Cornell University in Ithaca, New York. The text preceded by '#Ground truth' is what was actually said, while the sentences preceded by 'text' show how the transcription program interpreted the words. — AP

SAN FRANCISCO: Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy”.

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text – known in the industry as hallucinations – can include racial commentary, violent rhetoric and even imagined medical treatments.
