Scammers now using deepfake audio to impersonate CEOs in fraud attempts, says security company


Nisos advised employees who receive suspicious voicemails to call the person back on a known number and verify any instructions directly. — Dreamstime/TNS

An employee at an undisclosed tech company received a deepfake audio message impersonating the voice of the company's chief executive officer, asking for help to finalise an "urgent business deal", according to a security company that investigated the incident.

US-based Nisos told Vice that it analysed the voicemail the employee received in June and determined that it was fake: a piece of "synthetic audio" created to fool the recipient.


