Banned NVIDIA GPUs power North Korea’s AI crypto threat

North Korea has spent nearly 30 years developing artificial intelligence capabilities that could accelerate crypto theft operations, according to a report from South Korea’s Institute for National Security Strategy.

Summary

  • INSS says North Korea is using banned NVIDIA GPUs to accelerate AI-driven cybercrime.
  • AI tools in facial recognition and voice synthesis may boost industrial-scale crypto theft.
  • Crypto hacks hit $172.5M in November, with code flaws causing most of the losses.

Researchers discovered that Pyongyang is using banned NVIDIA GeForce RTX 2070 graphics cards to power AI research focused on pattern recognition, speech processing, and data optimization.

The findings come as crypto hacks totaled $172.5 million in November 2025, with code vulnerabilities accounting for $130.2 million in losses.

Kim Min Jung, who heads the Advanced Technology Strategy Center at INSS, warned that “precise monitoring of North Korea’s AI research trends and policy responses to suppress the military and cyber diversion of related technologies are urgently needed.”

AI research targets facial recognition and voice synthesis

North Korea has strengthened AI capabilities through expanded research institutions and self-developed algorithms since the 2010s.

This year’s studies by the National Academy of Sciences’ Mathematical Research Institute and Pyongyang Lee University covered facial recognition, multi-object tracking, lightweight voice synthesis, and accent identification.

The research shows efforts to improve both accuracy and processing speed within limited computational environments.

These technologies enable target identification, movement-path prediction, and more efficient disruption of command communications or execution of social engineering attacks.

Some studies used NVIDIA’s GeForce RTX 2070, which the U.S. Department of the Treasury’s Office of Foreign Assets Control has designated as prohibited for export or re-export to North Korea.

Crypto theft could reach industrial scale with AI automation

The INSS report warned that AI capabilities could be applied to deepfake production, detection evasion, and optimized crypto asset theft.

“Utilizing high-performance AI computational resources could exponentially increase attack and theft attempts per unit time, enabling a small number of personnel to conduct operations with efficiency and precision comparable to industrial-scale efforts,” the report stated.

The multi-object tracking research could be extended into real-time surveillance systems when combined with CCTV footage or drone reconnaissance.

Closer cooperation among North Korea, China, and Russia since the start of the Ukraine war could also accelerate practical deployment of AI. The report emphasized continuous monitoring of North Korea’s AI applications in military, surveillance, and cyber domains.

CertiK Alert data shows November 2025 crypto losses reached $172.5 million, with approximately $45.5 million frozen or returned. Code vulnerabilities and wallet compromises accounted for the majority of incidents.

Source: https://crypto.news/north-korea-weaponizes-banned-gpu-to-steal-more-crypto/