Consumer Reports: AI voice cloning tools have almost no security checks

DATE POSTED: March 11, 2025

Consumer Reports has found that several popular voice cloning tools lack adequate safeguards against fraud and abuse, underscoring the risks posed by AI voice technology. The study examined products from six companies: Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify.

The investigation found that only Descript and Resemble AI have implemented meaningful measures to prevent misuse. Other tools merely require users to confirm they have the legal right to clone a voice, often through self-attestation. Grace Gedye, a policy analyst at Consumer Reports, warned that without proper safety mechanisms, AI voice cloning tools could “supercharge” impersonation scams.

AI voice cloning technology has advanced significantly and can now mimic a person’s speech from minimal audio samples. A notable incident occurred during last year’s Democratic primaries, when robocalls featuring a fake Joe Biden misled voters. The political consultant behind the scheme was fined $6 million, and the Federal Communications Commission subsequently banned AI-generated robocalls.

The analysis of the six AI voice cloning tools indicated that five have bypassable safeguards, making it easy to clone voices without consent. Deepfake audio detection software often struggles to distinguish genuine voices from synthetic ones, compounding the problem. Generative AI that imitates human characteristics such as voice is subject to little federal regulation, with most ethical practices driven by the companies themselves. An executive order signed by President Biden in 2023 included safety requirements for AI, but President Trump later revoked it, dismantling those provisions.


Voice cloning technology uses audio samples from individuals to create synthetic voices. Without safeguards, anyone can upload audio from platforms such as TikTok or YouTube and have the service replicate that voice. Four of the examined services (ElevenLabs, Speechify, PlayHT, and Lovo) simply require users to check a box asserting they are authorized to create the voice clone. Resemble AI insists on a real-time audio recording, but Consumer Reports circumvented that check by playing pre-recorded audio during verification.

Only Descript offered a somewhat effective safeguard: it requires users to record a specific consent statement. This is harder to fake, though it can still be defeated by cloning the voice with another service first and playing back the synthetic statement. All six services are publicly accessible through their websites; ElevenLabs charges $5 and Resemble AI $1 to create a custom voice clone, while the others are free to use. Some companies acknowledged the potential for abuse and said they had implemented stronger safeguards to prevent deepfake creation and voice impersonation.
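To make the mechanism concrete, below is a minimal, hypothetical sketch in Python of how a consent-statement check of this kind could work. It is not Descript’s actual implementation: the templates, names, and similarity threshold are illustrative assumptions, and speech-to-text is assumed to happen upstream, so only the prompt generation and transcript matching are shown.

    import difflib
    import secrets

    # Randomized templates make the required statement unpredictable,
    # so a previously captured recording cannot simply be replayed.
    CONSENT_TEMPLATES = [
        "I, {name}, consent to having my voice cloned on {date}.",
        "My name is {name} and I authorize this voice clone, recorded on {date}.",
    ]

    def issue_prompt(name: str, date: str) -> str:
        """Pick a consent phrase at random so each session needs a fresh recording."""
        return secrets.choice(CONSENT_TEMPLATES).format(name=name, date=date)

    def verify_consent(transcript: str, expected: str, threshold: float = 0.85) -> bool:
        """Accept the recording only if its transcript closely matches the prompt."""
        ratio = difflib.SequenceMatcher(
            None, transcript.lower().strip(), expected.lower().strip()
        ).ratio()
        return ratio >= threshold

    # Example: a faithful reading passes, an unrelated recording is rejected.
    prompt = issue_prompt("Jane Doe", "March 11, 2025")
    assert verify_consent(prompt, prompt)
    assert not verify_consent("hello world", prompt)

Even so, as noted above, an attacker can clone the voice with another service first and play the synthetic reading into this check, which is why a randomized consent phrase is a deterrent rather than a complete defense.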

There are legitimate applications for AI voice cloning, such as aiding individuals with disabilities and providing audio translations. However, risks remain significant. Sarah Myers West, co-executive director of the AI Now Institute, noted that this technology could facilitate fraud, scams, and disinformation, including the impersonation of influential figures.

Research on the prevalence of AI in audio scams is limited. The Federal Trade Commission has indicated that AI may be employed in “grandparent scams,” where criminals impersonate family members in distress. Additionally, some musicians have faced challenges due to cloned voices being utilized for unauthorized music production, as exemplified by a viral 2023 song falsely attributed to Drake and the Weeknd.

Featured image credit: BRUNO CERVERA/Unsplash