Man fined, charged for fake New Hampshire robocall scheme

Published May 23, 2024



Court documents show Kramer is facing 13 felony charges for allegedly creating Biden-soundalike robocalls urging voters to “save their vote” for November.

The Federal Communications Commission has issued a $6 million fine against a political consultant who sent AI-generated robocalls mimicking President Joe Biden’s voice to voters ahead of New Hampshire’s presidential primary.

Steve Kramer, who also faces two dozen criminal charges in New Hampshire, has admitted orchestrating a message that was sent to thousands of voters two days before the first-in-the-nation primary on Jan. 23. The message played an AI-generated voice similar to Biden’s that used his phrase “What a bunch of malarkey” and falsely suggested that voting in the primary would preclude voters from casting a ballot in November.

Court documents show Kramer is facing 13 felony charges alleging he violated a New Hampshire law against attempting to deter someone from voting using misleading information. He also faces 11 misdemeanor charges accusing him of falsely representing himself as a candidate by his own conduct or that of another person. The charges were filed in four counties but, as often happens with serious crimes, will be prosecuted by the state attorney general’s office.

Kramer did not immediately respond to a request for comment Thursday but previously said he was trying to send a wake-up call about the dangers of artificial intelligence.

The FCC also issued a $2 million fine against Lingo Telecom, which is accused of transmitting the calls. A company spokesperson did not immediately respond to a call seeking comment Thursday.

FCC Chairwoman Jessica Rosenworcel said regulators are committed to helping states go after perpetrators. In a statement, she called the robocalls “unnerving.”

“Because when a caller sounds like a politician you know, a celebrity you like, or a family member who is familiar, any one of us could be tricked into believing something that is not true with calls using AI technology,” she said. “It is exactly how the bad actors behind these junk calls with manipulated voices want you to react.”

Swenson reported from New York.
