Deepfake scammer posing as Hollywood star Jason Momoa hits widow for $600K

A British woman has lost over $600,000 to an artificial intelligence-generated scam after fraudsters led her to believe she was in a romantic relationship with actor Jason Momoa. According to reports, the scammers sent her several AI-generated videos of the actor to make their various claims seem credible.

According to reports, the scammers used those videos to convince the British widow that she could live happily with the Hollywood hunk, claiming the money she was sending would be used to build their dream home in Hawaii.

The relationship began after the Hollywood star supposedly replied to her comments on one of his fan pages. After that first contact, he started talking to her frequently, and things moved fast between them.

British widow loses funds to scammers

According to the police, the scammers, posing as Momoa, started to solicit the grandmother for cash, claiming his fortune was tied up in several film projects he was working on. The British widow eventually put her house in Cambridgeshire up for sale, transferring more than half a million from the proceeds to her supposed boyfriend. After she sent the money, the messages stopped, and her heartthrob vanished.

“This might sound far-fetched, but it’s a true story, and it left a vulnerable woman without a home,” Cambridgeshire Police said.

The police said the scammers have pulled the same scam on other victims across the United Kingdom and the United States. They claimed that another British woman was scammed out of up to £80,000 by fraudsters deploying the same Jason Momoa romance scam technique to swindle the unsuspecting victim.

That woman said she had even held a conversation with his supposed daughter, who she was told was turning 15 this year. “I was also told he was fighting his ex-wife for the house, and he said we needed a marriage certificate to keep the house. So I was gullible and paid for it,” she added.

Dave York, a fraud prevention officer, noted that scammers deliberately target those they perceive to be at their lowest, especially widows. The criminals pick up on their victims’ desperation to fill the gaps in their lives and exploit it as an opening.

Celebrities call out rise in AI-generated deepfakes

Aside from Jason Momoa, other popular figures in the United States have seen their pictures and videos exploited by criminals. A typical example is Steve Harvey, the popular host of Family Feud. Last year, he was among several celebrities whose voices were mimicked by criminals to promote a scam that promised people government-provided funds.

“I’ve been telling you guys for months to claim this free $6,400,” a voice that sounds like Harvey’s says in one video. Harvey has urged regulators to look into the issue and make sure the perpetrators are brought to book. “My concern now is the people that it affects. I don’t want fans of mine or people who aren’t fans to be hurt by something,” Harvey said.

Since the beginning of this year, there has been a sharp rise in scammers using artificial intelligence to carry out fraud. According to a previous report by Cryptopolitan, the Securities and Exchange Commission of Nigeria issued a statement warning the general public about the use of AI to create deepfakes of popular personalities. Some scammers create deepfake videos to solicit funds directly, while others use them to advertise fake investment opportunities.

Source: https://www.cryptopolitan.com/british-widow-loses-600k-to-ai-scam/