A computer screen displays text produced by Whisper, an AI-powered transcription program, at Cornell University in Ithaca, New York. The text preceded by "#Ground truth" is what was actually said, while the sentences preceded by "text" show how the transcription program interpreted the words. — AP
SAN FRANCISCO: Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy”.
But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text – known in the industry as hallucinations – can include racial commentary, violent rhetoric and even imagined medical treatments.