Biden’s AI Chief Expresses Concern Over Audio Deep Fakes
Bruce Reed, the White House Deputy Chief of Staff and head of the Biden administration’s AI strategy, has emphasized the growing concern surrounding voice cloning technology. According to Politico, Reed stated that “voice cloning is one thing that keeps me up at night.” Despite the technology being relatively new, he described it as “frighteningly good.”
Reed further warned that society has yet to fully comprehend the potential impact of perfect voice fakes on our lives. As voice cloning technology grows more sophisticated, individuals may become unsure whether the voice on the other end of a phone call is real or fake.
Exploitation of voice cloning technology by phone scammers has already been reported. Business Insider revealed that scammers are using AI advancements to make their schemes more believable. The Federal Trade Commission (FTC) disclosed in March that scammers have been using voice cloning programs to enhance their family emergency scams, convincing individuals that their loved ones are in distress.
In a harrowing example, a mother in Arizona received a call from a scammer who had used voice cloning software to impersonate her kidnapped daughter. The impostor’s cloned voice so closely matched the daughter’s that the mother believed the call was authentic. According to a McAfee report, scammers need only a few seconds of someone’s voice to create a clone with an 85% match.
On the other hand, New York City Mayor Eric Adams shares a contrasting perspective on AI-generated voice clones. Adams has employed a voice cloning platform to robocall residents in multiple languages that he doesn’t personally speak, including Mandarin, Spanish, and Yiddish. This initiative aims to reach a broader segment of the city’s non-English speaking residents. However, some experts argue that this approach is deceptive and highlights the need for politicians to establish clear regulations on AI usage.
The accessibility and simplicity of voice cloning platforms make them susceptible to misuse. ElevenLabs, a free voice cloning platform, requires less than one minute of an individual’s audio to generate a high-quality clone. The company has acknowledged a growing number of cases of voice cloning misuse. Vice reported that audio deepfake clips generated with ElevenLabs’ software, featuring celebrities such as Joe Rogan, Ben Shapiro, and Emma Watson making offensive comments, were uploaded to the imageboard site 4chan.
As concerns over audio deep fakes continue to mount, lawmakers and experts are grappling with the ethical implications and potential dangers associated with this emerging technology.