Deepfakes danger! On Zoom, ‘You’re on Mute’ Is Now ‘Are You Real?’ | Opinion

Scammers used AI to disguise themselves on a video conference and swipe $25 million. Here’s how to avoid the same fate.

Updated on: Feb 06 2024, 07:19 IST
A finance worker in Hong Kong transferred more than $25 million to scammers. (Pixabay)

Is the boss who's giving you an order real or just realistic? Deepfakes are now taking Zoom calls to another level of awkwardness by making us question whether our co-workers are genuine. A finance worker in Hong Kong transferred more than $25 million to scammers after they posed as his chief financial officer and other colleagues on a video conference call, marking perhaps the biggest known corporate fraud using deepfake technology to date. The worker had been suspicious about an email requesting a secret transaction, but the scammers looked and sounded so convincing on the call that he sent the money.

Corporate IT managers have spent more than a decade trying, often fruitlessly, to train office workers to spot phishing emails and resist the urge to click on dodgy attachments. Often hackers and fraudsters need just one person out of hundreds to inadvertently download the malware needed to tunnel into a corporate network. With AI-powered video tools, they're moving into territory we once considered safe, underscoring how quickly deepfake technology has developed in just the last year. While it sounds like science fiction, such elaborate frauds are now relatively easy to set up, ushering us into a new age of skepticism.

The fraud in Hong Kong almost certainly used real-time deepfakes, meaning that the fake executive mirrored the scammer as they listened, talked and nodded during the meeting. According to David Maimon, a criminology professor at Georgia State University, online fraudsters have been using real-time deepfakes on video calls since at least last year for smaller-scale fraud including romance scams.

Maimon posted a video to LinkedIn showing a demo from developers who are selling deepfake video tools to potential fraudsters. In it, you can see the real image of a man on the left and, on the right, his fake persona: a beautiful young woman scamming the male victim in the middle.

This is uncharted territory for most of us, but here's what the Hong Kong victim could have done to spot the deepfake, and what we'll all need to do in the future for sensitive video calls:

  1. Use visual cues to verify who you're talking to. Deepfakes still can't do complex movements in real time, so if in doubt, ask your video conference counterpart to write a word or phrase on a piece of paper and show it on camera. You could ask them to pick up a nearby book or perform a unique gesture, like touching their ear or waving a hand, all of which can be difficult for deepfakes to replicate convincingly in real time.  
  2. Watch the mouth. Look out for discrepancies in lip syncing or weird facial expressions that go beyond a typical connection glitch.  
  3. Employ multi-factor authentication. For sensitive meetings, consider adding a secondary confirmation via email, SMS or an authenticator app to make sure the participants are who they claim to be.  
  4. Use other secure channels. For critical meetings that will involve sensitive information or financial transactions, you and the other meeting participants could verify your identities through an encrypted messaging app like Signal or confirm decisions such as financial transactions through those same channels.  
  5. Update your software. Make sure that you're using the latest version of your video conferencing software in case it incorporates security features to detect deepfakes. (Zoom Video Communications did not reply to questions about whether it plans to make such detection technology available to its users.)  
  6. Avoid unknown video conferencing platforms. Especially for sensitive meetings, use well-known platforms like Zoom or Google Meet that have relatively strong security measures in place.  
  7. Look out for suspicious behavior and activity. Some strategies stand the test of time. Be wary of urgent requests for money, last-minute meetings that involve big decisions, or changes in tone, language or a person's style of speaking. Scammers often use pressure tactics, so beware of any attempt to rush a decision, too.  
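Tips 1, 3 and 4 above can be combined into a simple out-of-band challenge: one participant generates a random phrase, sends it to the counterpart over a separate trusted channel such as Signal, and asks them to write it on paper and hold it up on camera. Here is a minimal sketch in Python; the word list, function names and channel are illustrative assumptions, not part of any platform's API — any unpredictable shared secret delivered outside the call works the same way:

```python
import secrets

# Illustrative word list; in practice use a much larger one
# (e.g. a diceware-style list) so the phrase can't be guessed.
WORDS = [
    "anchor", "basil", "copper", "drift", "ember", "falcon",
    "garnet", "harbor", "ivory", "juniper", "kestrel", "lantern",
]

def make_challenge(n_words: int = 3) -> str:
    """Generate a one-time phrase to send over a second channel.

    The caller texts this phrase to the counterpart via a separate
    channel and asks them to write it down and show it on camera.
    A real-time deepfake rig struggles to render unscripted
    handwriting on cue.
    """
    return " ".join(secrets.choice(WORDS) for _ in range(n_words))

def verify(response: str, expected: str) -> bool:
    """Case- and whitespace-insensitive check of what was shown or read back."""
    return response.strip().lower() == expected.strip().lower()
```

The key design choice is that the phrase travels over a channel the video platform never touches, so an attacker who controls the call's video and audio still can't predict or produce it.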

Some of these tips could go out of date over time, especially visual cues. As recently as last year, you could spot a deepfake by asking the speaker to turn sideways to see them in profile. Now some deepfakes can convincingly move their heads side to side.

For years fraudsters have hacked into the computers of wealthy individuals, hoovering up their personal information to help them get through security checks with their bank. But at least in banking, managers can create new processes to force their underlings to tighten up security. The corporate world is far messier, with an array of different approaches to security that allow fraudsters to simply cast their nets wide enough to find vulnerabilities.

The more people wise up to the possibility of fakery, the less chance the scammers will have. We'll just have to pay the price as the discomfort of conference calls becomes ever more agonizing, and the old Zoom clichés about your peers being on mute morph into requests for them to scratch their noses.  

First Published Date: 06 Feb, 07:18 IST