How to make a deepfake: investigating the new face of cyber fraud

How to Make a Deepfake

Download an app. In seconds, you’ll have the unparalleled power of digital deception at your fingertips. It’s that easy to make a deepfake.

 

 

What is a deepfake?

 

A deepfake is a convincing misrepresentation of a person saying or doing things that never occurred. It is a piece of manipulated media, usually an image, video or voice recording.

 

 

“Deepfakes play with identity and agency, because you can take over someone else — you can make them do something that they’ve never done before,” said Ben-Zion Benkhin, CEO of the AI-powered lip-sync app Wombo.

 

 

These “novel forms of deception,” as the Carnegie Endowment for International Peace (CEIP) refers to deepfakes, are the products of artificial intelligence (AI). In fact, deepfake videos take their name from “deep learning,” the form of AI used to generate the fabricated faces.
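
To make “deep learning” concrete, here is a minimal sketch in PyTorch of the shared-encoder, two-decoder idea behind the original face-swap deepfakes. It is an illustration under simplifying assumptions rather than any real tool’s implementation; the layer sizes and variable names are invented for brevity.

import torch
import torch.nn as nn

# The classic face-swap recipe: one shared encoder learns a compact code for
# any face, and each identity gets its own decoder. Feeding person A's face
# through the shared encoder and person B's decoder produces the "deepfake."
# (Illustrative sketch only; dimensions and names are assumptions.)

class SharedEncoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                # 64x64 RGB face crop -> vector
            nn.Linear(64 * 64 * 3, 1024), nn.ReLU(),
            nn.Linear(1024, latent_dim),                 # compressed face code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 1024), nn.ReLU(),
            nn.Linear(1024, 64 * 64 * 3), nn.Sigmoid(),  # back to pixel values
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = SharedEncoder()
decoder_a, decoder_b = Decoder(), Decoder()              # one decoder per identity

# After training decoder_a on person A's faces and decoder_b on person B's,
# rendering A's face through B's decoder yields the swapped result.
face_of_a = torch.rand(1, 3, 64, 64)                     # stand-in for a real face crop
swapped = decoder_b(encoder(face_of_a))

Real deepfake models use convolutional networks and far more training data, but the identity-swapping principle is the same: one decoder learns to draw only person B, so whatever the shared encoder feeds it comes out wearing B’s face.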

 

 

The CEIP classifies deepfake videos, photos and writing as synthetic media that has the potential to “facilitate financial harm.” And it already has — but we’ll get to that later. 

 

 

How to make deepfakes

 

As with nearly everything these days, software is making it easy to create deepfakes.

 

 

What is deepfake software?

 

Once exclusive to Hollywood studios with powerful computers, the ability to make deepfake videos is now available to anyone with an iPhone. Simply download the Reface app to swap faces with your favorite celebrity. Turn your coworker into a virtual puppet with Avatarify. Use Wombo to make Federal Reserve Chairman Jerome Powell lip-sync Prince songs in seconds.
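
To show how low the technical barrier has become, the sketch below performs a crude face swap with nothing but off-the-shelf OpenCV. This is not how Reface, Avatarify or Wombo work internally; it is a naive cut-and-blend illustration, and the image file names are placeholders.

import cv2
import numpy as np

# OpenCV's bundled face detector plus Poisson blending is enough for a crude swap.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(image):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0]                      # (x, y, w, h) of the first detected face

source = cv2.imread("celebrity.jpg")     # face to borrow (placeholder file name)
target = cv2.imread("coworker.jpg")      # photo to paste it onto (placeholder)

sx, sy, sw, sh = first_face(source)
tx, ty, tw, th = first_face(target)

patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))
mask = 255 * np.ones(patch.shape, patch.dtype)
center = (tx + tw // 2, ty + th // 2)

# seamlessClone blends the borrowed face into the target photo, hiding the seams.
swapped = cv2.seamlessClone(patch, target, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("swapped.jpg", swapped)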

 

 

Identity spoofing is the new pastime. And while these apps are clearly designed for fun and games, they raise a serious question:

 

Are deepfakes cybercrime?

 

Yes, deepfakes can be used for cybercrime. A voice deepfake cost one UK-based energy firm approximately $243,000 in 2019. The cybercriminals employed AI-based software to impersonate the chief executive of the firm’s German parent company.

 

 

The imposter called the CEO and urged him to wire the funds to a Hungarian supplier within the hour. Recognizing what he took to be his boss’s German accent and the melody of his voice, the CEO complied with the request, The Wall Street Journal reported.

 

 

“Applying machine-learning technology to spoof voices makes cybercrime easier,” said Irakli Beridze, head of the Centre on AI and Robotics at the United Nations Interregional Crime and Justice Research Institute, in the article.

 

 

How to combat deepfakes and protect your business

 

Cybercriminals are already taking advantage of deepfakes to defraud businesses of hundreds of thousands of dollars, and the problem could grow worse in the next two years. The shift to remote working during COVID-19 has left an abundance of audio and video samples of influential business people circulating online, exactly the raw material needed to train convincing fakes.

 

 

In fact, Gartner predicts that “In 2023, 20% of successful account takeover attacks will use deepfakes to socially engineer users to turn over sensitive data or move money into criminal accounts.”[1]

 

 

However, just as cybercriminals can use AI to manipulate media and masquerade as executives, executives can use sophisticated identity verification software to expose the criminals behind those digital masks. Such technology could prove key to staying one step ahead of fraudsters as we enter the era where nothing is as it seems.
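
As one illustration of what such verification tooling can look like under the hood, the toy sketch below screens a voice clip by converting it to spectral features and scoring it against labelled real and synthetic recordings. Commercial identity verification products are far more sophisticated; the file names, labels and model choice here are assumptions for illustration only.

import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def clip_features(path):
    # Average MFCCs give a crude spectral fingerprint of a voice clip.
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical labelled clips: 1 = genuine recording, 0 = synthetic voice.
training_clips = [("ceo_townhall_01.wav", 1), ("ceo_townhall_02.wav", 1),
                  ("cloned_voice_01.wav", 0), ("cloned_voice_02.wav", 0)]

X = np.stack([clip_features(path) for path, _ in training_clips])
y = np.array([label for _, label in training_clips])

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score the recording of an "urgent" caller before anyone wires money.
incoming = clip_features("incoming_call.wav")
genuine_probability = model.predict_proba([incoming])[0][1]
print(f"Probability the voice is genuine: {genuine_probability:.2f}")

Even a simple screen like this, paired with procedural controls such as calling the executive back on a known number before moving money, raises the cost of the kind of voice fraud described above.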

 

Written by Auroriele Hans

 

 

Sources

[1] Gartner, “Predicts 2021: Artificial Intelligence and Its Impact on People and Society,” Magnus Revang et al., 31 December 2020.

 
