$35 Million Stolen Through Use of Deepfaked Company Director’s Voice

Forbes recently uncovered court documents about a $35 million heist in the United Arab Emirates. The fraudsters executed the scheme in early 2020, using deepfake technology to imitate the voice of a prominent client and request a fund transfer.

According to Forbes, UAE authorities sought the assistance of US investigators to trace $400,000 of the stolen $35 million. The UAE believes that the heist involved 17 individuals who moved the money to several accounts around the globe. The $400,000 was split between two accounts registered in the United States.

The crime occurred when a bank manager received a call from a client whose voice he recognized. Checking his inbox, the manager confirmed that a transfer had indeed been requested, so he carried it out on the caller's go-ahead. The voice, however, was an elaborate technical imitation.

Of course, banks run background checks to vet prospective clients, but once a client's identity has been verified, subsequent transactions move much faster. The impostors preyed on that existing relationship by imitating one of the parties and defrauding the other.

This is only the second known heist involving a deepfaked voice, and by far the more successful of the two. When people think of deepfakes, they normally think of images or videos, but the technology can clone voices too.

AI companies have been hard at work developing voice-cloning technology for legitimate uses in online gaming and other areas.

Unfortunately, like any technology, it is now finding criminal applications.

Upping the Fakes

It’s conceivable that the bank could have avoided this outcome had it subjected the caller to closer scrutiny.

However, that would require the bank to be aware that a voice can be faked in the first place. Most people aren’t, precisely because the technology is built to sound indistinguishable from the real thing.

As such, existing security measures, like phone number lookup services, will need to start anticipating the use of deepfake technology in fraud. Ideally, security will expand to cover multiple independent channels of verification: the more checks a request has to pass, the less likely fraudsters are to succeed.
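To make the idea concrete, here is a minimal sketch of what layered verification for a transfer request might look like. Everything in it is hypothetical: the field names, the threshold, and the approval rules are illustrative assumptions, not any bank's actual policy.

from dataclasses import dataclass

@dataclass
class TransferRequest:
    # Hypothetical signals a bank might collect before approving a transfer.
    account_id: str
    amount: float
    voice_match: bool          # voiceprint check on the call (can be deepfaked)
    email_confirmed: bool      # matching request found in prior correspondence
    callback_confirmed: bool   # client confirmed on a known, registered number
    hardware_token_ok: bool    # one-time code from a device the client holds

LARGE_TRANSFER_THRESHOLD = 10_000  # illustrative limit, not a real policy

def approve_transfer(req: TransferRequest) -> bool:
    """Require more independent checks as the amount grows."""
    checks_passed = sum([
        req.voice_match,
        req.email_confirmed,
        req.callback_confirmed,
        req.hardware_token_ok,
    ])
    if req.amount < LARGE_TRANSFER_THRESHOLD:
        return checks_passed >= 2
    # Large transfers must pass checks a cloned voice alone cannot satisfy.
    return req.callback_confirmed and req.hardware_token_ok and checks_passed >= 3

The point is not these specific rules but the layering: a recognized voice alone should never be enough to move a large sum of money.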

ID theft protection services will need to improve too. Facial and voice prints have historically been harder to fake, but technology is changing the rules of the game.

Faked biometrics are especially convincing precisely because they are so personal. As such, they pose a dangerous threat to the people being imitated.

It may no longer be enough to remove personal information from the internet. Those concerned about identity theft should now also be careful about the audio and video footage they upload.

