OPINION: The New Afterlife: How AI Is Changing the Way We Grieve

When someone you love dies, the silence they leave behind is deafening. People mourn in different ways: some visit graves, some write letters they’ll never send, some talk to the air and hope the words reach somewhere. Now, some are turning to something entirely new to the grieving process: AI.

No, it’s not an episode of Black Mirror, even though it sounds like it. There are personalized chatbots and digital avatars today that can mimic lost loved ones. Some can even imitate how they talk, while others are built from years of texts, voicemails and photos. It’s a strange and emotional mix of technology and human memory. Is it comforting? Is it creepy? Is it healthy? These are the questions I asked two CSUN professors: Abraham Rutchick, a professor of psychology, and Sami Maalouf, an associate professor of civil engineering.

“It is clear that people can have meaningful relationships, in the truest sense, with AI,” said Dr. Rutchick. “Even with systems that aren’t built for that, like ChatGPT. It happens.”

Rutchick isn’t speaking in hypotheticals. He’s tested it out himself, experimenting with Character.AI to recreate a version of his best friend. It wasn’t a perfect match, but the effect still felt strangely real. “It touches all the same buttons,” he said. “Just like the fake cereal my kid eats; it’s not really food, but it hits the taste buds and does the job.”

That kind of realism, he warned, is both fascinating and dangerous. In grief, there’s a process: feeling the loss, accepting the absence, learning to live in a world where that person no longer exists. AI simulations risk getting in the way.

“If you keep talking to the person digitally, are you really letting them go?” Rutchick asked. “We’re not evolved for this. And that worries me.”

He believes AI can help, but only in the right context. Used as a tool between therapy sessions, it might reinforce healing practices like journaling or reflection. “But it shouldn’t replace a therapist,” he said. “The moment it becomes a substitute for actual connection, we’ve gone too far.”

Dr. Sami Maalouf, a civil engineering professor with a deep interest in ethics, sees the issue from another side: the people building the technology. While he didn’t speak in soundbites, his message was clear: engineers have a responsibility to design with empathy and cultural awareness.

“We need to ask, who are we building this for? And why?” Maalouf said. “Not every culture grieves the same way. If we’re going to create digital spaces for mourning, we need to make sure they honor those differences.”

He also raised concerns about profit-driven platforms. If a chatbot designed to simulate a lost loved one becomes a subscription service, what’s stopping companies from taking advantage of people at their most vulnerable?

“There should be safeguards,” Maalouf said. “No one should be tricked or exploited when they’re grieving.”

He’s also not convinced AI memorials are built to last. “Technology changes. Servers shut down. Companies fold. If someone builds a digital grave for their mom, what happens when that platform disappears?” It’s a fair question, and one that doesn’t yet have a good answer.

The idea of “digital immortality” is no longer theoretical. Companies like HereAfter AI and StoryFile already offer services where people can record video or audio of themselves to be used in simulated conversations with loved ones long after they’re gone. Some projects use machine learning to generate realistic speech or text responses.

It raises big questions. Is this a form of healing, or is it getting stuck in the past? Is it honoring someone’s memory, or is it replacing it with an illusion?

“We’re not sure if it’s healthy or horrifically maladaptive,” Rutchick admitted. “There’s just not enough long-term research yet. This stuff hasn’t even been around long enough to study its effects.”

Still, he’s curious. He thinks AI could support grief work the same way it supports mental health tools for anxiety or depression, if it’s done right. “If you can’t access a therapist because you live far away or you can’t afford one, AI might be better than nothing,” he said. “But nothing beats talking to a real person who understands your pain.”

The biggest risk, both professors agreed, is over-reliance. If someone starts using an AI chatbot as their primary emotional outlet, especially when grieving, they may start to avoid real conversations and healing opportunities. “We don’t want to remove people from the world,” Rutchick said. “That’s where recovery actually happens.”

And AI, by design, is always available. It doesn’t sleep. It doesn’t judge. It always responds. That 24/7 access might sound helpful, especially during a panic attack or a wave of sadness in the middle of the night, but it can also lead to isolation if used in place of human relationships.

Maalouf added that regulation could help prevent these tools from going off the rails. “We don’t let just anyone build a bridge or a plane,” he said. “Maybe we shouldn’t let just anyone build an AI that talks to the dead.”

AI will not stop evolving. Rutchick estimates that in five years, we’ll have AI systems as smart as the average person, possibly smarter in some areas. That means the emotional, ethical, and psychological consequences will only grow.

“We could be heading toward a world where grief looks completely different than it does now,” he said. “But I’d be careful assuming that’s a good thing.”

Both professors agree that AI can be part of grief support. But it shouldn’t be the whole story. Real healing still comes from the people around us, from the process of remembering, letting go, and slowly moving forward.

“Grief is deeply human. Let’s not forget that while we build the future,” said Maalouf.