Scammers now using deepfake audio to impersonate CEOs in fraud attempts, says security company

Nisos advised employees who have received suspicious voicemails to simply call the person back using a known number and verify any instructions with them. — Dreamstime/TNS

An employee at an undisclosed tech company received a deepfake audio voicemail impersonating the voice of the company's chief executive officer, asking for help to finalise an "urgent business deal", according to a security company that investigated the incident.

US-based Nisos told Vice in a report that it analysed the voicemail, which the employee received in June, and determined that it was fake: "synthetic audio" made to fool the recipient.


