Scammers now using deepfake audio to impersonate CEOs in fraud attempts, says security company


Technology | Tuesday, 28 Jul 2020

Nisos advised employees who have received suspicious voicemails to simply call the person back using a known number and verify any instructions with them. — Dreamstime/TNS

An employee at an undisclosed tech company received a deepfake audio voicemail impersonating the voice of the company's chief executive officer and asking for assistance to finalise an “urgent business deal”, according to a security company that investigated the incident.

US-based Nisos told Vice in a report that it analysed the voicemail, which the employee received in June, and determined that it was fake: a piece of “synthetic audio” made to fool the recipient.

Nisos shared a copy of the voicemail with Vice, in which a voice can be heard saying, “Hi (recipient's name), this is (alleged CEO's name). I need your assistance to finalise an urgent business deal”.

The employee who received the voicemail was suspicious and reported it to the company, which led to Nisos' investigation. Researchers then used Spectrum3D, a spectrogram tool, to determine if there were any anomalies.

“You could tell there was something wrong about the audio,” Nisos researcher Dev Badlu said. “It looks like they basically took every single word, chopped it up, and then pasted them back in together.”

He added that when he lowered the volume of the alleged CEO's voice, there was zero background noise, which to him was a clear sign of forgery. He also said there were too many stark peaks and valleys (abrupt highs and lows) in the audio, which do not normally occur in regular speech.
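The kind of spectrogram inspection Badlu describes can be approximated with common open source tools. The sketch below plots a spectrogram of a suspect recording so that quiet gaps with no background noise and abrupt peaks can be examined visually; it assumes a WAV copy of the audio saved as voicemail.wav (a hypothetical filename) and uses scipy and matplotlib rather than the Spectrum3D tool Nisos used.

```python
# Minimal sketch: plot a spectrogram of a suspect voicemail to look for
# anomalies such as the absence of background noise between words or
# unnaturally abrupt peaks and valleys in the signal.
# Assumes a WAV file named "voicemail.wav" (hypothetical filename).
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

sample_rate, samples = wavfile.read("voicemail.wav")
if samples.ndim > 1:          # mix stereo down to mono
    samples = samples.mean(axis=1)

freqs, times, power = spectrogram(samples, fs=sample_rate, nperseg=1024)

plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="gouraud")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Voicemail spectrogram")
plt.colorbar(label="Power (dB)")
plt.show()
```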

In a detailed report about the incident, Nisos said it also investigated the phone number the voicemail came from and determined that it belonged to a VoIP service with no user registration information, making it likely that the number was acquired as a “burner” for the fraud attempt.

The company also shared that in 2019, criminals used voice-mimicking software to copy the voice of a British executive and fooled a managing director at his company into transferring US$240,000 (RM1.02mil) to an account in Hungary.

It has also been reported that in other audio deepfake cases, criminals gathered audio snippets from conference calls, YouTube videos and TED Talks and used machine learning to mimic the speech patterns of company bosses.

In May, LifestyleTech reported that two YouTubers managed to fool a number of celebrities into believing that they were doing interviews with TV host James Corden simply by using clips of his voice from videos found on YouTube.

Nisos offered a simple piece of advice for employees to avoid getting duped by a suspicious voicemail: simply call the person back. It said deepfake technology has not yet evolved to the point of mimicking an entire live phone call.

Finally, Nisos said it would expect criminals to use deepfake audio as the first step in a fraud attempt and then add other forms of trickery to dupe their victims.

“We would anticipate a deepfake audio would be the first step in a series of social engineering attempts to get an employee to wire money to a specific location. Phishing emails, additional phone calls, or even deepfake videos purporting to authorise an action could be used in furtherance of the criminal scheme,” Nisos said in its report.
