AI can 'mimic voices' of loved ones – it’s being used as a scam, FCC warns


'Often the imposter claims to have been in an accident or arrested. The scammer may ask the grandparent ‘please don’t let mom and dad know,’ and may hand the phone over to someone posing as a lawyer seeking immediate payment,' the FCC said.

People are getting phone calls that claim to be from a family member – and the voice even sounds like them. But the Federal Communications Commission is warning it could be a scam.

“Unfortunately, bad actors can now use artificial intelligence technology ‘to mimic voices, convincing people, often the elderly, that their loved ones are in distress,’ according to a recent Washington Post article,” the FCC said in a statement.
