Former CNN anchor Jim Acosta now hosts his own show on Substack. (Instagram/@jimacosta)

KEY POINTS

  • Jim Acosta interviewed an AI-generated version of Joaquin Oliver in a new video posted to his Substack.
  • Joaquin Oliver was 17 when he was killed in the 2018 Parkland shooting; he would have turned 25 this week.
  • Acosta said he felt he was 'really speaking' with Joaquin, and Oliver's father called the technology a 'blessing'.

Former CNN chief White House correspondent Jim Acosta has ignited a fierce ethical debate after publishing what he described as a 'one-of-a-kind' interview with a guest who is no longer alive.

The segment featured an AI-generated likeness of Joaquin Oliver, a 17-year-old student who was killed during the 2018 Parkland school shooting in Florida.

'What Happened to You?': Acosta Interviews the Digital Dead

The video, released via Acosta's new independent journalism platform on Substack, was promoted on social media as a groundbreaking moment in digital reporting. But the move has left many viewers deeply unsettled, with critics accusing Acosta of exploiting grief and crossing moral boundaries.

In the video, the virtual Joaquin — created using a photograph and generative AI — appears onscreen wearing a beanie, his expression solemn. Acosta begins by asking, 'What happened to you?' The AI responds in a stilted, robotic voice: 'I was taken from this world too soon due to gun violence while at school. It's important to talk about these issues so we can create a safer future for everyone.'

The moment was intended as a powerful tribute, but many viewers found the avatar's voice robotic and its facial expressions uncanny.

Acosta said the Oliver family had invited him to conduct the first public interview with their son's AI likeness, a project created in close collaboration with his parents, Manuel and Patricia Oliver.

Who Is Joaquin Oliver?

Joaquin Oliver was just 17 years old when he was shot and killed at Marjory Stoneman Douglas High School in Parkland, Florida, on 14 February 2018 — a tragedy that left 17 people dead and reignited America's long-running debate over gun control.

Known for his love of writing and social activism, Joaquin came to school that morning carrying flowers for his girlfriend to celebrate Valentine's Day. He never made it back home.

He would have turned 25 this week.

Following the shooting, his parents, Manuel and Patricia Oliver, became outspoken advocates for gun reform in the United States. They have since launched multiple campaigns using their son's image and voice, including AI-driven messages to Congress and artistic installations honouring his memory.

'I Really Felt Like I Was Speaking With Joaquin'

The avatar was built using voice reconstruction and animation tools trained on recordings and images of Joaquin. Acosta also sat down with Joaquin's father, Manuel, who said the experience was deeply emotional.

'I know this is not my son,' Manuel said. 'But hearing his voice, seeing him speak again — that's something I thought I would never experience. If technology can give us that, it's a blessing.'

Acosta himself appeared moved, stating: 'I really felt like I was speaking with Joaquin. It's just a beautiful thing.'

Public Backlash: 'This Isn't Journalism'

While the segment was meant to raise awareness about gun violence, many viewers reacted with outrage. Critics on social media platforms like Bluesky accused Acosta of exploiting grief and pushing the ethical boundaries of journalism.

'There are living survivors of school shootings you could interview, and it would be their words and thoughts instead of completely made-up,' one user wrote.

Others expressed discomfort at seeing AI used in such an emotionally loaded context, especially with a voice that lacks realism and facial movements that come off more like an unsettling dub than a digital resurrection.

A Growing Trend: AI in Advocacy and Legal Proceedings

This isn't the first time Joaquin Oliver's AI likeness has been used. Last year, he was part of The Shotline, a robocalling campaign in which the AI voices of six Parkland victims called members of the US Congress demanding action on gun violence.

'I'm back today because my parents used AI to re-create my voice to call you,' the avatar's message said. 'How many calls will it take for you to care?'

But the lines between tribute and manipulation continue to blur. As the technology improves, ethicists warn of the potential for misuse — from fraud and misinformation to eroding the boundaries between memory and manufactured presence.

'The use of AI to re-create the dead may seem touching, but it opens a Pandora's box of problems,' one tech ethicist noted. 'Consent, representation, authenticity — these are not small issues.'

AI-generated avatars of deceased individuals have begun to appear in courtrooms, documentaries, and now, interviews. In May, a court in Arizona even featured a statement delivered by an AI version of a road rage victim. The judge praised the message, saying, 'I loved that AI... I feel that that was genuine.'

But the public remains deeply divided on whether these AI recreations are healing, helpful, or horrifying. In Acosta's case, the criticism is still mounting, aimed not just at the technology itself but at the precedent its use in journalism sets.

Is This the Future of Grief or Exploitation?

Acosta, known for his fierce questioning of the Trump administration, now stands at the centre of a debate that spans journalism, ethics, and technology.

Critics argue that creating fictionalised interactions with the dead — especially for public broadcast — goes beyond traditional reporting. They question whether any amount of family consent justifies putting words into the mouth of someone who cannot speak for themselves.

As Acosta continues his independent media journey via Substack, the controversy raises a stark question: when does honouring the dead become speaking for them?