
What AI and Grief-bots Can Teach Us About Supporting Grieving People

By MELISSA LUNARDINI

The Rise of Digital Grief Support

We’re witnessing a shift in how we process one of humanity’s most universal experiences: grief. Several companies have emerged in recent years to develop grief-related technology that lets users interact with AI versions of deceased loved ones, while other people turn to general-purpose AI platforms for grief support.

This isn’t just curiosity; it’s a response to a genuine lack of human connection and support. The rise of grief-focused AI reveals something uncomfortable about our society: people are turning to machines because they’re not getting what they need from the humans around them.

Why People Are Choosing Digital Over Human Support

The grief tech industry is ramping up, with MIT Technology Review reporting that “at least half a dozen companies” in China are offering AI services for interacting with deceased loved ones. Companies like Character.AI, Nomi, Replika, StoryFile, and HereAfter AI offer users the ability to create and engage with the “likeness” of deceased persons, while many others turn to AI to normalize their experiences and seek answers about their grief. This digital migration isn’t happening in a vacuum. It’s a direct response to the failures of our current support systems:

  • Social Discomfort: Our grief-illiterate society struggles with how to respond to loss. Friends and family often disappear within weeks, leaving mourners isolated just when they need support most, especially in the months that follow.
  • Professional Barriers: Traditional grief counseling is expensive, with long wait times. Many therapists lack proper grief training, with some reporting no grief-related education in their programs. This leaves people without accessible, qualified support when they need it most.
  • Fear of Judgment: People often feel safer sharing intimate grief experiences with AI than with humans who might judge, offer unwanted advice, or grow uncomfortable with the intensity of their grief.

The ELIZA Effect

To understand why grief-focused AI is succeeding, we must look back to 1966, when the first AI-companion program, ELIZA, was developed. Created by MIT’s Joseph Weizenbaum, ELIZA simulated conversation using simple pattern matching, specifically mimicking a Rogerian (person-centered) psychotherapist.

Rogerian therapy was perfect for this experiment because it relies heavily on mirroring what the person says. The AI companion’s role was simple: reflect back what the person said with questions like “How does that make you feel?” or “Tell me more about that.” Weizenbaum was surprised that people formed deep emotional connections with this simple program, confiding their most intimate thoughts and feelings. This phenomenon became known as the “ELIZA effect”.
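
To make the mechanics concrete, here is a minimal, illustrative sketch of ELIZA-style reflection. This is not Weizenbaum’s original script, just a hypothetical Python example of the same idea: a few keyword rules, a word swap that turns “my” into “your,” and the familiar open-ended fallbacks when nothing matches.

```python
import random
import re

# Illustrative sketch of ELIZA-style reflection (not Weizenbaum's original code):
# match a few keyword patterns, mirror the speaker's own words back as a question,
# and fall back to an open-ended prompt when nothing matches.

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "mine": "yours"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i miss (.*)", re.I), "Tell me more about {0}."),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]

FALLBACKS = ["How does that make you feel?", "Tell me more about that.", "Please go on."]


def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones ("my mom" -> "your mom")."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())


def respond(utterance: str) -> str:
    """Return a mirrored question from the first matching rule, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)).rstrip(".!?"))
    return random.choice(FALLBACKS)


if __name__ == "__main__":
    print(respond("I miss my mom."))          # -> "Tell me more about your mom."
    print(respond("The holidays are hard."))  # -> one of the open-ended fallbacks
```

Even a toy version like this shows why the effect is so strong: the program contributes almost nothing of its own, which is exactly what makes the speaker feel heard.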

ELIZA worked not because it was sophisticated but because it embodied the core principles of effective emotional support, something we as a society can learn from (or in some cases relearn).

What AI and Grief-bots Get Right

Modern grief-focused AI succeeds for the same reasons ELIZA did, but with enhanced capabilities. Here’s what AI is doing right:

  • Non-Judgmental Presence: AI doesn’t recoil from grief’s intensity. It won’t tell you to “move on,” suggest you should be “over it by now,” or change the subject when your pain becomes uncomfortable. It simply witnesses and reflects.
  • Unconditional Availability: Grief doesn’t follow business hours. It strikes at 3 AM on a Tuesday, during family gatherings, while you’re at work, or on a grocery run. AI works 24/7, providing instant support by quickly normalizing common grief experiences like “I just saw someone who looked like my mom in the grocery store, am I going mad?” AI’s response demonstrates effective validation: “You’re not going mad at all. This is actually a very common experience when grieving someone close to you. Your brain is wired to recognize familiar patterns, especially faces of people who were important to you… This is completely normal. Your mind is still processing your loss, and these moments of recognition show just how deeply your mom is still with you in your memories and awareness.” Simple, on-demand validation helps grievers instantly feel normal and understood.
  • Pure Focus on the Griever: AI doesn’t hijack your story to share its own experiences. It doesn’t offer unsolicited advice about what you “should” do or grow weary of hearing the same story repeatedly. Its attention is entirely yours.
  • Validation Without Agenda: Unlike humans, who may rush to make you feel better (often for their own comfort), AI validates emotions without trying to fix or change them. It normalizes grief without pathologizing it.
  • Privacy and Safety: AI holds space for the “good, bad, and ugly” parts of grief confidentially. There’s no fear of social judgment, no worry about burdening someone, no concern about saying the “wrong” thing.
  • No Strings Attached: AI doesn’t need emotional reciprocity. It won’t eventually need comforting, grow tired of your grief, or abandon you if your healing takes longer than expected.

AI Can Do It, But Humans Can Do It Better. Much Better.

According to a 2025 article in Harvard Business Review, the #1 use of AI so far in 2025 is therapy and companionship.

Continue reading…

Calculating Risk

Thursday I traversed the frozen surface of the pond for perhaps the last time this season. The ice is thinning quickly. I had on my rubber boots and stayed at what I felt to be a safe distance from shore: should I break through, the water would not be over my head. I got some fantastic photos and considered the little adventure a success. However, over dinner that evening, when I mentioned that I’d been on the pond earlier, David and Peter were furious. Peter wouldn’t calm down until I promised I wouldn’t go out again.

I have always considered fear the enemy, something to conquer and overcome, and I’ve had a lot of practice. Being risk tolerant and scrappy has been an asset now that I have lung cancer. As a participant in a phase I clinical trial, I accept the potential for unforeseen and possibly life-threatening side effects of the treatment itself. Before you are given your first dose of an experimental drug, you must read through and sign consent forms acknowledging this risk. It is something most healthy persons would never do. Having a terminal illness is like coming to the edge of a ravine with a tiger on your trail. Between you and safety is a rickety bridge that may or may not support your weight. However, even a chancy passage is an easy decision when the alternative is certain death.

Continue reading…