- Binance founder CZ warns AI deepfakes make video call verification unreliable.
- Mai Fujimoto lost her X account after a 10-minute deepfake Zoom call with an impersonator of an acquaintance.
- Hackers first compromised a friend's Telegram account, then used a deepfake video call to install malware and steal her accounts.
Binance founder Changpeng Zhao has cautioned that AI-powered deepfake technology has made video call verification unreliable for security purposes. He also warned users to avoid installing software from unofficial sources, even when the request appears to come from a friend, since that friend's account may itself have been compromised.
CZ’s warning came in response to a sophisticated hacking incident involving cryptocurrency analyst Mai Fujimoto. The analyst lost control of her X account after falling victim to a deepfake attack during a video call.
Zhao emphasized that friends requesting software installation are “most likely hacked” and noted how cybercriminals exploit trusted relationships to distribute malware. The former Binance CEO’s warning underscores the evolution of social engineering attacks, which now use advanced AI to create increasingly convincing impersonations.
Deepfake Attack Exploits Trusted Relationships
Mai Fujimoto explained how her main X account, @missbitcoin_mai, was hijacked on June 14 through a carefully planned deepfake attack. The attack began when a friend’s Telegram account was compromised, allowing the attackers to use it to set up a video meeting. Fujimoto accepted the Zoom invitation without hesitation because the request appeared to come from a known contact.
During the 10-minute video call, Fujimoto could see what appeared to be the face of her acquaintance, but she could not hear any audio. The impersonator then sent a link purported to fix the audio issue, along with step-by-step instructions for adjusting her settings. Fujimoto believes this was when malware was installed on her computer, which subsequently led to the theft of her social media account.
Fujimoto Incident Shows the Advancement of AI Deepfakes
The deepfake was convincing enough that Fujimoto stayed on the call for its full length, believing she was speaking with her real acquaintance. Only after she lost access to her accounts did she grasp how sophisticated the attack had been and how thoroughly the attackers had won her trust.
Fujimoto acknowledged that continuing to use Zoom despite persistent audio issues should have raised red flags. However, she attributed the platform choice to her friend’s engineering background, assuming a technical preference rather than recognizing it as a potential manipulation tactic.
The attack’s success extended beyond the initial X compromise, with hackers gaining access to Fujimoto’s Telegram and MetaMask accounts. Fujimoto has expressed concern that her own likeness could be used in future deepfake attacks, warning contacts to remain suspicious of any video calls featuring her face.
Related: Binance Introduces Emergency Contact System as CZ Calls for Industry-Wide ‘Will Functions’