- The Washington Times - Thursday, March 8, 2018

A leading Chinese technology company has an AI algorithm that can clone human speech within seconds.

Research from Baidu into “neural voice cloning” has members of the technology industry impressed — and concerned — about its possibilities. The company’s AI has demonstrated an ability to “reproduce thousands of speaker identities” within 30 minutes.

On Wednesday, Vice News also demonstrated samples generated from just 3.7 seconds of audio.

“The system can change a female voice to male, and a British accent to an American one — demonstrating that AI can learn to mimic different styles of speaking, personalizing text-to-speech to a new level,” the website’s technology blog reported.

“These technologies represent the kind of leaps in the advancement of AI that researchers and theorists raised concerns around when [the deepfakes algorithm] democratized machine learning-generated videos. If all that’s needed is a few seconds of someone’s voice and a dataset of their face, it becomes relatively simple to fabricate an entire interview, press conference, or news segment.”

In short, “deepfakes” technology takes video of one individual and convincingly swaps in another person’s face. Some websites, for instance, have used AI to fuse famous celebrity faces into pornographic material.


Vice News’ story comes just weeks after a popular YouTube channel called “derpfakes” unveiled an AI-generated version of President Trump.

The technology website The Next Web called recent progress on such algorithms “staggering.”

“In a year or two, as the algorithms continue improving, it’s unclear whether the average person will even be able to discern authentic videos from fakes. … In a world that already can’t agree on simple facts, the future looks pretty terrifying,” TNW reported Feb. 20.

“We have officially entered a technological era when someone will be able to put your face on security camera footage to frame you for a crime,” replied one viewer on the “derpfakes” YouTube channel.

• Douglas Ernst can be reached at dernst@washingtontimes.com.

Copyright © 2022 The Washington Times, LLC.
