On March 4th, reports emerged of a growing trend of cybercriminals using AI to synthesize other people's faces and voices, producing highly realistic fake video and audio to carry out new forms of online fraud.
According to domestic media reports, the core of "AI face swap" is using deep learning algorithms to precisely identify facial features in video and extract key elements such as the eyes, nose, and mouth.
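To make that facial-feature step concrete, below is a minimal sketch of facial landmark extraction using the open-source MediaPipe Face Mesh model (an assumption for illustration; real face-swap pipelines rely on much larger, purpose-built models and are not described in the report):

```python
# Sketch only: detect a face in an image and return its landmark coordinates.
# Assumes `pip install mediapipe opencv-python`; not the scammers' actual tooling.
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

def extract_landmarks(image_path: str):
    """Return normalized (x, y) coordinates of detected facial landmarks."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    with mp_face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as face_mesh:
        results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return []  # no face found
    # 468 landmarks covering the eyes, nose, mouth, and face contour.
    return [(lm.x, lm.y) for lm in results.multi_face_landmarks[0].landmark]
```

Landmarks like these are what a swap model uses to align a source face onto a target face frame by frame.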
So, how can you reveal the true identity of scammers?
Experts suggest a practical test: during a video call, ask the other party to wave a hand in front of their face. The moving hand occludes the facial data the model depends on, interfering with real-time generation of the manipulated video, so a forged face may shake, flicker, or show other anomalies while the hand passes over it.
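One rough way to picture how such flicker could be spotted automatically is to measure frame-to-frame pixel change inside the face region and look for sudden spikes. The sketch below is an illustrative assumption, not a tool the experts describe; the face-box coordinates are assumed to come from a detector such as the landmark step sketched earlier.

```python
# Sketch only: score frame-to-frame change inside a fixed face box.
# Spikes in the returned scores may indicate flicker or shaking artifacts.
import cv2
import numpy as np

def flicker_scores(video_path: str, face_box: tuple) -> list:
    """Return mean absolute frame-to-frame differences within face_box (x, y, w, h)."""
    x, y, w, h = face_box
    cap = cv2.VideoCapture(video_path)
    prev, scores = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        if prev is not None:
            scores.append(float(np.mean(cv2.absdiff(roi, prev))))
        prev = roi
    cap.release()
    return scores
```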
Yu Nenghai, Executive Dean of the School of Cybersecurity at the University of Science and Technology of China, suggests another simple check: ask the other person to pinch their nose or press on their face and watch how the face responds. A real nose deforms under pressure, while an AI-generated one typically does not.
He also stresses everyday information security: protect biometric data such as facial images, voiceprints, and fingerprints; strengthen the security of personal devices like smartphones and computers; and avoid logging into suspicious websites to reduce the risk of malware infection.