Project uses AI to recreate voices of people killed by gun violence

Patricia Oliver lost her son Joaquin in 2018, when a 19-year-old man opened fire on students and staff at Marjory Stoneman Douglas High School in Parkland, Florida. Seventeen people were killed.

Now, Patricia and Manuel Oliver work to end gun violence. Their most recent project involved recreating their son's voice.

“We moved forward with the project, we were looking and putting together different videos, anything that has audio,” Oliver explained.

Using old audio messages and video, along with a message written by Joaquin's parents, artificial intelligence rendered the written words in his voice.

“We knew that it would be hard, but it’s never harder than losing Joaquin,” Patricia said.

“Hello, I am Joaquin Oliver. Six years ago I was a senior at Parkland. Many students and teachers were murdered on Valentine’s Day that year by a person using an AR-15,” the artificial intelligence message with Joaquin’s voice said.

“I’m back today because my parents used AI to recreate my voice to call you,” the AI message continued.


“It’s exactly Joaquin’s voice, so when I heard that final product, it was very impressive, very impactful, it was deep. It was like Joaquin was talking next to me, he was telling me that. And that was pretty hard,” Patricia Oliver said.

A new project called The Shotline is using artificial intelligence to recreate the voices of those killed by gun violence. The project launched this month and was put together by two organizations: March For Our Lives and Change the Ref.

The groups’ goal is to have the public send these AI messages to lawmakers to combat gun violence.

“This is something that is very honest. It’s a way we decided to move and send the message to these representatives that they are not willing to really listen, that this is a problem that is attacking every single one in every single place of this country,” Oliver said.

But this technology doesn’t come without concern.

“It’s an interesting use of a very novel technology,” said Anton Dahbura, the co-director of the Johns Hopkins Institute for Assured Autonomy.

“Freeform use of synthesizing someone’s voice is a slippery slope. And there are going to be well-intended people, as in obviously this case, and then nefarious people,” he said.

However, Dahbura said it could take years for policy and regulations at both the federal and state level to catch up with this ever-evolving technology to eliminate bad actors.

Earlier this month, the FCC announced that calls made with AI-generated voices are considered "artificial" under existing robocall rules. The ruling was specifically aimed at making common robocall scams illegal and giving officials new tools to go after bad actors.

“Public awareness, pushback, ensuring that this technology is used responsibly, those are of utmost importance right now,” Dahbura said.

The Shotline project website currently showcases AI messages using the voices of six people lost to gun violence.

As of Feb. 20, the website claimed more than 71,000 AI calls had been submitted to representatives through The Shotline, using the six available AI messages. The project may develop more messages in the future.

“This is open to bring more voices, to send more messages,” Oliver said.