In brief
- Reps. Ted Lieu and Neal Dunn have introduced the AI Fraud Deterrence Act following high-profile AI impersonation incidents.
- The bill raises maximum penalties for AI-assisted fraud, including fines of up to $2 million and prison terms of up to 30 years for bank fraud.
- Hackers used AI to impersonate White House Chief of Staff Susie Wiles and Secretary of State Marco Rubio in May and July.
Congress is cracking down on AI-powered scams with bipartisan legislation that would send fraudsters to prison for decades after brazen impersonation attacks targeted America’s top officials.
The AI Fraud Deterrence Act, introduced by Rep. Ted Lieu (D-CA) and Rep. Neal Dunn (R-FL) on Tuesday, would raise maximum fines to $2 million and extend prison sentences to up to 30 years for bank fraud committed with AI assistance, according to a statement.
Pleased to introduce the AI Fraud Deterrence Act today with @DrNealDunnFL2, which would expand penalties for AI scams and deepfakes, including impersonations of federal officials.
Our bipartisan goal is simple: Deter people from using AI to commit fraud.
— Rep. Ted Lieu (@RepTedLieu) November 25, 2025
The legislation targets wire fraud, mail fraud, money laundering, and impersonation of federal officials.
“AI has lowered the barrier of entry for scammers, which can have devastating effects,” Lieu said in the statement, warning that impersonations of U.S. officials “can be disastrous for our national security.”
AI fraud escalating
The bill comes after scammers used AI in May to breach White House Chief of Staff Susie Wiles’s cellphone, impersonating her voice in calls to senators, governors, business leaders, and other high-level contacts.
Two months later, fraudsters mimicked Secretary of State Marco Rubio’s voice in calls to three foreign ministers, a member of Congress, and a governor in an apparent attempt to obtain sensitive information and account access, according to the bill.
The bill adopts the 2020 National AI Initiative Act’s definition of AI and carves out First Amendment protections, exempting satire, parody, and other expressive uses that include a clear disclosure of inauthenticity.
AI-aided mail and wire fraud would carry up to 20 years in prison and $1 million in fines, with standard penalties rising to $2 million. AI-driven bank fraud could draw 30 years and a $2 million fine.
AI-assisted money laundering would carry up to 20 years in prison and fines of $1 million or three times the transaction value, while AI impersonation of federal officials would bring up to three years and a $1 million fine.
“AI is advancing at a rapid pace, and our laws have to keep pace with it,” Dunn noted, cautioning that when criminals use AI to steal identities or defraud Americans, “the consequences should be severe enough to match the crime.”
Meanwhile, President Trump is reportedly weighing an executive order to dismantle state AI laws and assert federal primacy, even as more than 200 state lawmakers urge Congress to reject House Republicans’ push to fold an AI-preemption clause into the defense bill.
A similar moratorium collapsed in July after a 99–1 Senate vote, and opposition has since widened, though a draft order circulating last week shows the White House considering its own path to override state rules.
Proving AI use in court
Mohith Agadi, co-founder of Provenance AI, an AI-agent and fact-checking SaaS platform backed by Fact Protocol, told Decrypt that the bipartisan nature of the bill points to a growing consensus that “AI-driven impersonation and fraud demand urgent action.”
“The real challenge is proving in court that AI was used,” Agadi said. “Synthetic content can be difficult to attribute, and existing forensic tools are inconsistent.”
“Lawmakers need to pair these penalties with investments in digital forensics and provenance systems like C2PA that clearly document content’s origin,” he noted, adding that otherwise there is a risk of creating laws that are “conceptually strong but practically hard to enforce.”