Deceptive Video Call Scams Unveil the Threat of AI-Driven Fraud

In the ever-evolving landscape of online fraud, scammers are now employing sophisticated techniques such as ‘deepfake’ video and artificial intelligence to orchestrate scams through video calls. The scam artists behind this AI-driven fraud manipulate victims into transferring money urgently under the guise of a financial emergency. By leveraging AI to convincingly imitate the voices and appearances of people the victim knows, the fraudsters create an interaction that is difficult to distinguish from a genuine call.

The anatomy of online AI fraud

The modus operandi of this online AI fraud is as follows: unsuspecting individuals receive video calls from people they recognize – family members, close friends, or acquaintances – but from unfamiliar phone numbers. The callers claim to be in dire financial straits and request immediate monetary assistance. Because the call is visual and appears to come from a familiar face, victims often trust the appeal and transfer money promptly, typically using platforms like PhonePe or Google Pay.

Exploiting ‘deepfake’ and artificial intelligence techniques

This nefarious technique combines ‘deepfake’ technology with artificial intelligence to create a false sense of urgency and authenticity. Deepfakes use AI to fabricate realistic video and audio, allowing fraudsters to simulate genuine conversations. In this context, AI also enables the scammers to respond to victims in real time, so the exchange feels like a natural conversation. The perpetrators compile voice and video samples of known individuals, then manipulate this material to create seemingly authentic video calls.

A cautionary tale from the experts

Pavan Duggal, an advocate at the Supreme Court and the Chief Executive of the Artificial Intelligence Law Hub, has shed light on this emerging fraud trend. He notes that these cases were initially observed in Kerala and have since spread to other southern states. Victims receive video calls from known individuals requesting urgent financial assistance, and having seen the caller’s face – albeit on an unfamiliar number – they often transfer money without a second thought. The fraudsters exploit the trust built during the video call to push victims into transferring funds.

Amit Dubey, a cybercrime expert based in Delhi, underscores the risk this scam poses. He highlights that the perpetrators can convincingly imitate the voices and appearances of family members or close friends. This realism makes it difficult for victims to detect the deception, reinforcing the need for vigilance and a cautious approach.

Staying ahead of the scammers

As these scams become increasingly sophisticated, safeguarding oneself from falling victim requires a combination of vigilance and strategy. Experts provide the following tips to protect oneself from these types of scams:

Verify the Caller: If you receive a video call from a known individual on an unknown number, engage them in conversation and ask probing questions about the emergency they describe – details only the real person would know. This verification step can help expose fraudulent calls.

Contact Through Trusted Channels: Call the person back on a trusted phone number before transferring money. You can proceed with caution if the person’s story checks out and aligns with what was said on the video call.

Exercise Skepticism: Always maintain skepticism when faced with financial appeals over video calls, even from known individuals. Fraudsters use emotion-driven urgency to manipulate victims into making hasty decisions.

Educate Yourself: Familiarize yourself with how ‘deepfake’ and AI technologies work. Understanding these techniques can help you spot inconsistencies in video calls.

The rapidly evolving landscape of online fraud has introduced a new level of complexity through AI-driven deception. Fraudsters can now mimic known individuals convincingly, creating a potent avenue for scams. By following cautious practices, verifying claims, and staying informed about the intricacies of such scams, individuals can better protect themselves from falling prey to these fraudulent tactics. As technology advances, so do the scams, but with vigilance and awareness, people can stay one step ahead of the fraudsters.

Source: https://www.cryptopolitan.com/video-call-scams-threat-of-ai-driven-fraud/