Ninja Theory, other developers are using AI voice technology instead of human actors

A new report from Good Luck Have Fun reveals that several triple-A game developers are using an AI program for voice acting instead of human talent. The company behind the technology is Altered AI, whose platform includes a library of voice performances from around 20 professional voice actors.

Although it shares similarities with text-to-speech, which converts written sentences into audio, AI dubbing is considered more ethically fraught. Because these tools can change a voice actor’s tone and the character of their voice, there is concern that the technology could be used to supplant voice actors entirely.

NDAs mean that only two developers are explicitly named in the report: Hellblade developer Ninja Theory is reportedly partnering with Altered, and Neon Giant reportedly used Altered for voice acting in its 2021 game, The Ascent.

The technology is typically used for prototyping purposes, according to Altered CEO Ioannis Agiomyrgiannakis, who argued that his company did for dubbing what YouTube did for video.

“When you have dialogue, you have a level of imagination. But when you bring the dialogue to the voice actors, it comes back and doesn’t seem as dynamic as you intended,” Agiomyrgiannakis explained. “We provide an intermediate stage where they can prototype the dialogue and have a checkpoint before entering the studio.”

Video games have had a difficult relationship with voice acting: the American actors’ union SAG-AFTRA went on strike in 2017 on behalf of video game voice actors seeking better pay and benefits. Several titles were significantly affected by the strike, including the Life is Strange prequel Before the Storm and Mortal Kombat 11. A deal was eventually reached; the resulting contract expires in November this year.

The argument has been made that voice AI is particularly useful for indie developers and can be used in tandem with established voice actors. But both Horizon’s Ashly Burch and Marvel’s Spider-Man’s Yuri Lowenthal argued that there are better ways to address this.

“SAG-AFTRA has a low-budget deal to address this issue,” Burch acknowledged. “It is specially designed so that independent developers can access quality VO without breaking the bank. […] if you’re looking for something human, nuanced and alive, AI just won’t cut it.”

What does AI mean for voice actors and their image?

For Lowenthal, the concern is specifically that software like this could be used to exploit actors’ performances in games for which they are never compensated.

“I know an actress who does a lot of performance capture and voice work, and she’s seen her very distinctive movements pop up in games she’s never even worked on,” he said. “This is a scary precedent that has already been set, and I want to start a conversation with AI companies about how we could protect actors and, again, the storytelling ecosystem.”

In a statement, SAG-AFTRA wrote that it would adapt its contracts to changing technology and protect voice actors from having their performances used against them.

“These new technologies offer exciting new opportunities, but can also pose potential threats to performers’ livelihoods. It is crucial that performers have control over the exploitation of their digital selves, are appropriately compensated for its use and are able to provide informed consent,” SAG-AFTRA wrote.

“We know change is coming. SAG-AFTRA is committed to protecting our members from unauthorized or inappropriate use of their voice, image or performance, regardless of the technology used. The best way for an artist venturing into this new world is armed with a union contract.”
