Scammers now using deepfake audio to impersonate CEOs in fraud attempts, says security company


Nisos advised employees who have received suspicious voicemails to simply call the person back using a known number and verify any instructions with them. — Dreamstime/TNS

An employee at an undisclosed tech company received a deepfake audio message impersonating the voice of the company's chief executive officer, asking for assistance to finalise an "urgent business deal", according to a security company that investigated the incident.

US-based Nisos told Vice in a report that it analysed the voicemail the employee received in June and determined it was fake: a "synthetic audio" made to fool the recipient.

