Federal Communications Commission (FCC) declares AI-generated voices in robocalls illegal in the United States

Concerned about AI-generated voices being misused to deceive people over the phone? On February 8, the United States Federal Communications Commission (FCC) issued a ruling that takes immediate effect: robocalls using AI voices are subject to the restrictions of the Telephone Consumer Protection Act (TCPA), which also covers telemarketing calls and automatic telephone dialing systems.

It may sound like something out of the distant future, yet it is already a reality. The accessibility and potential for misuse of AI voice-cloning and image-generation tools have raised concerns, FCC Chairwoman Jessica Rosenworcel said in a statement.

Callers using AI voices must now obtain consent before placing a non-emergency call. Having a real person select the AI-generated messages does not exempt a call from the legal prohibition on initiating calls with a prerecorded or artificial voice, the declaratory ruling states. According to CNET, individuals can report such calls by filling out a form on the FCC’s website.

These calls must also include identification and disclosure information about the caller’s identity. If a call qualifies as telemarketing or advertising, it must offer an opt-out option.

“This technology can confuse us when we listen, view, and click because it can trick us into thinking all kinds of fake stuff is legitimate,” Rosenworcel continued. Tom Hanks hawking dental insurance online, a repulsive Taylor Swift video, and political candidates misrepresenting voting procedures are just a few examples of a discernible emerging trend.

In other cases, grandparents are deceived into believing the caller is their actual grandchild in desperate need of money, only to discover later that a scammer was exploiting their willingness to help a family member financially.

In January, primary voters in New Hampshire received a call, allegedly from President Biden, advising them to “save their vote” by not participating in the state’s primary. The voice on the call resembled the president’s, although it was evidently not him. Commissioner Geoffrey Starks noted that these were voice-cloning calls. By making fake robocalls more believable, generative AI has added a new challenge to voter-suppression schemes and the campaign season.
The ruling follows a Notice of Inquiry launched in November 2023, whose objective was to gain a deeper understanding of how AI can help safeguard consumers against “unwanted and illegal telephone calls and text messages under the TCPA.” Notably, AI is not only a tool for malicious actors: Rosenworcel explained how AI could assist with pattern recognition to identify AI robocalls.

“Ensuring responsible and ethical implementation of AI technologies is of utmost importance to maintain balance. It is essential to harness the benefits of AI in order to protect consumers from harm rather than exacerbating the risks they encounter in an ever-evolving digital environment,” said Commissioner Anna M. Gomez.

“With the issuance of this Declaratory Ruling, we will now possess an additional mechanism to combat voice cloning scams and eliminate this fraudulent activity,” Rosenworcel concluded.
