
Fake CFO swindles company out of 180 million CNY; employees: It looked so real!

Wed, Feb 14 2024 09:23 PM EST

Unbelievable! One video conference, and 180 million CNY scammed away.

This is a real crime that took place in Hong Kong:

A finance employee at the Hong Kong branch of a multinational corporation, following instructions from the "CFO," transferred a total of 200 million Hong Kong dollars (approximately 180 million RMB) to designated accounts.

Who would have thought that the people he had just spoken with "face to face" in a video conference were entirely deepfakes!

Before rushing to criticize the employee for being careless, note that he did grow suspicious at first. But this wasn't the typical one-on-one video call seen in most scams—

Several familiar faces were participating in the meeting, and according to the victim, "everyone seemed quite real, both in appearance and voice."

However you look at it, the sheer creepiness of the case has genuinely spooked netizens.

The topic of AI-enabled crime has suddenly become the focal point of discussion across various platforms.

Marcus, a frontline critic in the AI field, wasted no time in stirring up netizens: "Wow, this feels like 'Black Mirror' come to life.

The deepfakes are getting scarily realistic."

Chronology of Events

Let's call the deceived finance employee "Xiao Shuai" for now. Here's how the whole thing went down:

Initially, Xiao Shuai received an email from the "CFO" at the UK headquarters. The email mentioned the need to carry out a confidential transaction, which at first raised doubts for Xiao Shuai, who suspected a phishing attempt.

The scammer wasted no time and immediately moved to the next step: inviting Xiao Shuai to a video conference.

Upon entering the video conference, Xiao Shuai was taken aback by what he saw: not only was the company's CFO present, but so were some of his colleagues and a few external individuals.

The crucial part was that these fake colleagues looked and sounded exactly like Xiao Shuai's genuine colleagues, making them impossible to tell apart.

During the meeting, the impostor posing as the CFO even asked Xiao Shuai to introduce himself, but then the meeting was abruptly cut short.

The scammers then kept in touch with Xiao Shuai through instant messaging platforms, email, and one-on-one video calls.

Under this sustained setup, Xiao Shuai eventually fell for the trickery. Following the instructions from the meeting, he made 15 transfers to 5 Hong Kong bank accounts, totaling 200 million Hong Kong dollars—roughly 25.6 million US dollars, or around 180 million RMB.

From the scammers' first contact, the whole ordeal lasted about a week, until Xiao Shuai checked with company headquarters and realized the entire affair was nothing but a meticulously planned scam.

He then reported it to the police. The authorities immediately launched an investigation and found that every participant in the meeting except Xiao Shuai was a digitally reconstructed persona, built by the scammers from publicly available video and audio clips.

Even the moment when Xiao Shuai introduced himself was not a genuine interaction: the virtual avatars never actually conversed with him, they simply issued instructions.

The police also revealed that the perpetrators had tried the same multi-person video call tactic on other employees of the branch, contacting two to three in total. However, the authorities did not disclose full details of those encounters.

The case is still under investigation, and no arrests have been made in it yet.

At a subsequent press conference, the Hong Kong police announced the arrest of six individuals connected to scams of this kind, indicating that similar cases abound:

Between July and September last year, eight stolen Hong Kong identity cards were used to apply for 90 loans and register 54 bank accounts;

On at least 20 occasions, AI deepfakes were used to mimic the portraits on identity cards and fool facial recognition programs.

The capabilities of AI "clones" are quite formidable.

Clearly, a major turning point in the case was the overly realistic fake personas in the video conference, which ultimately lowered Xiao Shuai's guard.

So, just how convincing are these "digital clone personas" nowadays? Let's take a look at a few examples.

Take, for instance, this clip of Taylor Swift speaking Chinese, which comes from a popular AI tool called HeyGen.

[Video Duration: 00:09]

The voice and mouth movements are near-perfect replicas, leaving netizens stunned and exclaiming in amazement.

In HeyGen's latest technology showcase, chatting in real time with a digital clone avatar has already become a reality.

Recently, there has also been a lot of buzz in AI voice synthesis around a startup called ElevenLabs (often written 11Labs).

What's impressive about ElevenLabs is that it can not only generate speech in 29 languages, but also clone anyone's voice from as little as one minute of audio, mimicking tone, intonation, and emotional nuance remarkably well.

The key point here is that these AI cloning techniques are becoming increasingly realistic, and the barriers to entry are lowering...

For instance, there are open-source projects that can get real-time face swapping running in a matter of minutes. The project homepage offers a variety of pre-trained face models of different people; if none of them satisfies you, you can also train your own.

"The digital clone version of you is about to arrive."

Perhaps influenced by this case and the recent incident in which Taylor Swift herself fell victim to deepfakes, the aforementioned real-time face-swapping project is drawing attention again, even climbing back onto GitHub's trending list.

In the project's issues section, users have long been voicing their unease.

But as many netizens point out, technology itself is morally neutral. The more pressing problem is that once the arrow has left the bow, there is no calling it back.

So many people have shifted their focus to the question of how to defend themselves.

Aside from more regulations and policies, from a technical standpoint, Nature noted in its article "Seven technologies to watch in 2024":

Incorporating invisible watermarking mechanisms into AI tools could be one solution for developers.
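As a rough illustration of the idea only (not any particular vendor's scheme), an invisible watermark can be as simple as hiding an identifier in the least significant bits of an image's pixel values; production AI watermarks use far more robust statistical or frequency-domain techniques, but the embed/extract principle looks like this:

```python
# Toy invisible watermark: hide a short ID string in the least
# significant bits of pixel values. Purely illustrative — real
# schemes must survive compression, cropping, and re-encoding.

def embed(pixels, message):
    """Write each bit of `message` into the lowest bit of one pixel."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(8)]
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite only the lowest bit
    return out

def extract(pixels, length):
    """Read back `length` bytes from the pixels' lowest bits."""
    bits = [p & 1 for p in pixels[:length * 8]]
    data = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for j, bit in enumerate(bits[i:i + 8]):
            byte |= bit << j
        data.append(byte)
    return data.decode()

pixels = list(range(200))          # stand-in for real image pixel values
marked = embed(pixels, "AI-GEN")
print(extract(marked, 6))          # -> AI-GEN
```

The change is visually imperceptible (each pixel shifts by at most 1), which is what makes such marks "invisible" to viewers while still detectable by software.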

And on a personal level, staying vigilant has become something every individual must confront in an era of exploding AIGC technology.

The Hong Kong police have also stated that, in response to this incident, they will expand their alert system to warn users early and stop them from transferring funds to accounts linked to scams.

Still, a deepfake "CFO" swindling 180 million from a company sounds almost too outrageous to believe.

So much so that many netizens have raised doubts:

A company this big has no risk-control measures? Could an insider have been involved?

So, what are your thoughts on this?