FCC Wants to Penalize Artificial Intelligence (AI) Robocall Spam

The FCC is seeking to outlaw robocalls that use AI-generated voices, and has warned about a growing wave of scams built on voice cloning technology.

On Thursday, FCC Chairwoman Jessica Rosenworcel, the head of the US communications regulator, said machine-learning software could be used to persuade people to take actions such as donating money to fraudulent causes.

It’s one thing for humans to fool other humans, but quite another for computers to imitate celebrities and other trusted voices, calling people at industrial scale with convincing lures.

Last month, New Hampshire residents received a fake call imitating President Joe Biden’s voice, urging them not to vote in the state’s presidential primary election in an attempt to disrupt the results.

“AI-generated voice and image cloning is already sowing confusion by tricking consumers into believing the scams are legitimate. No matter which celebrity or politician you prefer, or what your relationship is with your loved ones when they call for help, we may all be targets of these hoax calls,” Rosenworcel said this week.

“That’s why the FCC is taking steps to recognize this emerging technology as illegal under existing law, providing our partners in state attorneys general offices across the country with new tools they can use to combat these scams and protect consumers.”

The agency believes that the use of AI voice cloning technology in robocall scams should be treated as illegal under the Telephone Consumer Protection Act of 1991. Current rules require telemarketers to obtain consumers’ explicit consent before making automated calls using “an artificial or prerecorded voice.” Rosenworcel argues that the same rules should also apply to AI-generated robocalls.

Lawmakers are also seeking to address the issue. House Rep. Frank Pallone, Jr. (D-NJ) this week introduced a bill called the “Do Not Disturb Act,” which would require telemarketers to disclose when AI has been used to automatically generate the content of a phone call or text message.

Penalties would be imposed if telemarketers impersonate someone else. Pallone said his bill closes loopholes and expands robocall rules to cover text messaging and the use of AI.

Government officials across the United States support the FCC’s ideas. Last month, Pennsylvania Attorney General Michelle Henry wrote a letter [PDF] to the FCC arguing that AI-generated voices should be classified as artificial voices, meaning the Telephone Consumer Protection Act should protect against robocall scams using voice cloning technologies. The letter was signed by 25 attorneys general from other states.

“Technology advances and expands, seemingly minute by minute, and we must ensure that these new developments are not used to stalk, deceive or manipulate consumers,” Henry said.

“This new technology cannot be used as a loophole to overwhelm consumers with illegal calls. I commend the partners in this bipartisan coalition for seeing the potential harm that AI can cause to consumers who are already overwhelmed by robocalls and text communications.”