Political Consultant Indicted for Fake AI Biden Robocall Scheme

Key Takeaways

  • A Louisiana consultant was indicted for creating a fake robocall impersonating President Biden to suppress votes in New Hampshire's Democratic primary.
  • Steve Kramer faces 26 charges, including voter suppression and impersonation, and a proposed $6 million FCC fine for caller ID violations.
  • The case highlights concerns about AI's role in election interference, prompting legislative efforts to regulate AI use in political campaigns.

A Louisiana political consultant was indicted on May 23 for orchestrating a fake robocall that impersonated U.S. President Joe Biden.

The Biden robocall aimed to dissuade voters from supporting Biden in New Hampshire’s Democratic primary election.

New Hampshire’s Attorney General, John Formella, announced that the state indicted Steve Kramer on 26 charges: 13 felony counts of voter suppression and 13 misdemeanor counts of impersonating a candidate.

The Attorney General accused Kramer of suppressing voters by sending misleading phone messages that hid their true source, used a faked candidate’s voice, or provided false information to discourage voting.

In a separate announcement, the Federal Communications Commission (FCC) proposed a $6 million fine against Kramer for violating caller ID rules. Attorney Formella expressed satisfaction with the FCC’s proposed fine, emphasizing the importance of protecting consumers and voters from harmful robocalls and voter suppression.

He stated,

“I am pleased to see that our federal partners are similarly committed to protecting consumers and voters from harmful robocalls and voter suppression. I hope that our respective enforcement actions send a strong deterrent signal to anyone who might consider interfering with elections, whether through the use of artificial intelligence or otherwise.”

Biden Robocall Attacks

In January 2024, thousands of New Hampshire voters received a shocking robocall: an AI-generated voice impersonating Joe Biden urged them not to cast their ballots in that month’s primary.

The Attorney General’s office issued a public statement, warning voters to ignore the deepfake robocall of Joe Biden. Steve Kramer, who worked for Dean Phillips’ presidential campaign, admitted to orchestrating the scheme in a February interview with NBC News.

Concerns About AI Misuse in Elections

US senators have sounded the alarm on AI’s potential to disrupt elections, citing its ability to deceive voters. Examples include the AI-generated Biden robocall and a DeSantis campaign video featuring fake AI-generated images of Donald Trump.

A bipartisan group of lawmakers has also recommended that Congress provide $32 billion in extra funding to develop AI and establish safeguards around it.

However, the path to passing legislation that effectively addresses this issue remains uncertain. During a May 15 hearing, the Senate Rules Committee advanced three bills to combat the threat of AI in elections.

These are the Protect Elections from Deceptive AI Act, the AI Transparency in Elections Act, and the Preparing Election Administrators for AI Act.

The Protect Elections from Deceptive AI Act would prohibit the use of deceptive AI-generated video or audio of federal candidates in political ads. The AI Transparency in Elections Act would require political ads that use AI-generated content to disclose that fact.

Meanwhile, the Preparing Election Administrators for AI Act would require the US Election Assistance Commission to work alongside the National Institute of Standards and Technology to develop voluntary guidelines for election officials on protecting against AI threats, especially from foreign adversaries.