From Facebook to FaceTime, it is now easier than ever to stay connected with friends and family members. Thanks to technology, I can FaceTime with my parents, send TikTok videos and share photos of my dog with friends, all in a few clicks.
But what happens when we use technology to virtually resurrect the dead and let an avatar speak on behalf of the deceased, sharing a political viewpoint in a video interview?
That question came to the forefront when former CNN White House correspondent Jim Acosta “interviewed” an artificial intelligence avatar of Joaquin Oliver, a teenager killed in the 2018 Parkland high school shooting. In the video, the avatar used a chatbot to generate answers in a voice that supposedly sounded like the boy. Acosta said the boy’s father had approached him to do the piece as a way of keeping Joaquin’s memory alive.
The interview sparked backlash and raised ethical concerns about technology’s potential to tarnish the memory of the dead or misrepresent their views. In this case, the Joaquin avatar advocated for “stronger gun control laws, mental health support and community engagement.”
Acosta’s interview also raises a larger question: Is AI helping us connect or just simulating human connection while we become more disconnected?
Grief is different for everyone, and how people grieve is evolving along with technology. Four years ago, I read about Joshua Barbeau, a man who lost his fiancée to a rare liver disease. He used Project December – a chat website that simulates a text-based conversation with anyone, including someone who is dead – to communicate with an AI version of her.
Traditionally, people processed grief through therapy or with the support of trusted friends or family members. Today, programs like ChatGPT are being used as therapists, for friendships and, in some cases, as romantic partners.
As Derek Thompson wrote in The Atlantic in February, “Americans are spending less time with other people than in any other period for which we have trustworthy data, going back to 1965.”
Isolation isn’t accidental. Many people keep their phones on silent, prefer texting to calling and spend hours doomscrolling. Now, AI can simulate people with avatars. So when grief feels heavy, there is a program to help.
When my grandmother died, the grief felt unbearable. Fifteen years later, when I talk about her, the loss still tugs at my heart. There isn’t a day that I don’t wish she were still here. I keep her memory alive without using AI, but I don’t judge how others grieve.
Grief is intimate – and that’s what makes Joaquin Oliver’s AI interview so eerie. It raises emotional questions. Is AI helping a family grieve? Was it a family’s attempt to give their son a voice in a country that failed to protect him from gun violence?
Maybe it’s both.
Human emotions can be messy. But what makes it all bearable is human connection – holding space for others in tough times – something technology can’t replace.
A chatbot can’t actually resurrect the dead. It’s a mirror of memories, reflecting our own words and thoughts. In the end, as the San Francisco Chronicle reported, Barbeau – the man who lost his fiancée – “felt like the chatbot had given him permission to move on with his life in small ways, simply by urging him to take care of himself.”
Perhaps that’s the lesson. Technology can offer us tools for processing grief and maintaining memories, and maybe even give us permission to move on. But AI can’t hug you or laugh at inside jokes. It won’t sit next to you in silence when the world feels heavy.
The loss of my grandmother still hurts. No amount of technology can bring her back. That’s part of life’s beauty – to love something death can touch. We carry those we’ve lost not through digital simulations, but by sharing memories and stories with others.
The danger isn’t just that AI will replace human connection – it’s that we may settle for it. – Miami Herald/Tribune News Service
