Binance, one of the leading cryptocurrency exchanges, has issued a warning about the growing threat of deepfake technology in crypto fraud. Deepfakes use machine learning models to produce highly convincing audio, images, or videos that mimic a person's appearance and behavior. While the technology has legitimate applications, it can also be exploited by scammers and fraudsters.
According to Binance's chief security officer, Jimmy Su, there has been an increase in fraudsters attempting to bypass the exchange's know-your-customer (KYC) verification processes using deepfake technology. The modus operandi involves finding an ordinary photo of the victim online and then using deepfake tools to generate realistic videos that can deceive the verification systems.
The advancement of deepfake tools has reached a level where scammers can even respond in real time to audio instructions meant to verify if the applicant is a human. For instance, certain verification processes may require users to perform specific actions like blinking their left eye or looking in different directions. Su explains that deepfakes have become sophisticated enough to execute these commands seamlessly. However, he emphasizes that current deepfake videos still possess detectable flaws that a human operator can identify, such as when the user is instructed to turn their head to the side.
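To make the kind of challenge Su describes more concrete, below is a minimal sketch of a head-turn liveness check. It is not Binance's actual verification system: it assumes that six 2D facial landmarks (nose tip, chin, eye corners, mouth corners) have already been extracted by a separate face-landmark detector, and the generic 3D face model, pinhole-camera approximation, and 20-degree turn threshold are all illustrative values.

```python
# Illustrative sketch of a "turn your head" liveness challenge.
# Assumes landmark coordinates come from an external face-landmark detector;
# model points and thresholds are placeholders, not production values.
import numpy as np
import cv2

# Generic 3D reference points for a neutral face (arbitrary units),
# in the same order as the 2D landmarks passed to estimate_yaw_degrees().
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),          # nose tip
    (0.0, -330.0, -65.0),     # chin
    (-225.0, 170.0, -135.0),  # left eye, outer corner
    (225.0, 170.0, -135.0),   # right eye, outer corner
    (-150.0, -150.0, -125.0), # left mouth corner
    (150.0, -150.0, -125.0),  # right mouth corner
], dtype=np.float64)

def estimate_yaw_degrees(image_points, frame_width, frame_height):
    """Estimate head yaw (left/right turn) from a (6, 2) float64 array
    of pixel coordinates matching MODEL_POINTS."""
    focal_length = frame_width  # rough pinhole-camera approximation
    camera_matrix = np.array([
        [focal_length, 0, frame_width / 2],
        [0, focal_length, frame_height / 2],
        [0, 0, 1],
    ], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume no lens distortion

    ok, rvec, tvec = cv2.solvePnP(
        MODEL_POINTS, image_points, camera_matrix, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None

    rotation_matrix, _ = cv2.Rodrigues(rvec)
    proj_matrix = np.hstack((rotation_matrix, tvec))
    # decomposeProjectionMatrix returns Euler angles (pitch, yaw, roll) in degrees.
    euler_angles = cv2.decomposeProjectionMatrix(proj_matrix)[6]
    return float(euler_angles[1, 0])  # yaw

def passed_head_turn_challenge(yaw_before, yaw_after, min_turn_deg=20.0):
    """Check whether the user actually turned their head when instructed."""
    if yaw_before is None or yaw_after is None:
        return False
    return abs(yaw_after - yaw_before) >= min_turn_deg
```

A real verification pipeline would combine several such signals (blink detection, depth cues, texture analysis) rather than relying on a single pose check, which is part of why, as Su notes, human review still catches flaws that automated checks miss.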
Binance vs artificial intelligence
Although deepfake technology poses a significant challenge, Su acknowledges that it is not an insurmountable problem. He believes that artificial intelligence will continue to evolve, eventually overcoming the limitations that currently allow human operators to detect deepfakes. He therefore cautions that relying solely on visual inspection is not a foolproof solution.
Binance had previously faced an incident involving deepfake technology when its chief communications officer, Patrick Hillmann, discovered that a “sophisticated hacking team” had created a deepfake version of him. The impostor then used the fabricated persona to conduct Zoom meetings with various cryptocurrency project teams, falsely promising them opportunities to list their assets on Binance in exchange for a fee.
Combating deepfake attacks presents a significant challenge: even if Binance can control its own videos, countless others circulate online beyond its control. Su stresses the importance of user education as a preventive measure, and Binance plans to release a series of blog posts aimed at teaching users how to identify and counter such cyber threats.
As the threat landscape continues to evolve, it is crucial for both cryptocurrency exchanges and users to remain vigilant against sophisticated fraud attempts. By staying informed about the risks associated with deepfake technology and adopting robust security practices, individuals can better protect themselves and contribute to a safer crypto ecosystem.
Source: https://www.cryptopolitan.com/binances-cso-on-deepfake-ai-threat/