Holding on to Ghosts: The False Promise of AI
How AI that "replaces" the dead diminishes our existence
In my morning email from the Association for Death Education and Counseling (ADEC), there was a link to this very brief Yahoo! piece on AI and grief. I work in higher education, where AI is all we’re talking about these days, so of course, as a death educator, I read the article.
And I was disturbed. I will fully own the fact that I grew up in the 80s with films like The Terminator and WarGames, among others. I am a science fiction fan, through and through, and as such, I am firmly of the belief that Skynet is a very real and terrifying possibility. I make the joke all the time: when people ask me what I’m (strength) training for, I tell them I’m basically Linda Hamilton in Terminator 2, getting ready for the robots (only she was vastly fitter). Or I yell “Skynet!” in the faculty development AI webinar chat. What are we doing? I don’t think we really know.
Last year I wrote a feature for Western Michigan University’s College of Arts and Sciences Alumni magazine on two of my colleagues who work at the forefront of human-machine communication and artificial intelligence. They’re delightful people, and I have a great deal of respect for them, but they’re a little scary in their enthusiasm for our coming robot overlords. (Honestly, though, check out their work; it’s fascinating.)
Students are using ChatGPT and other AI tools to write their papers for them, and we’re having to come up with ways to work with this twist rather than trying to ban it outright, because they’re going to use it regardless of what we say. And how can you tell, really? Sometimes it’s obvious. More often than not, though, you don’t have any idea, because you don’t know these individuals. You don’t know their “voice” because you don’t know them.
Artificial Intelligence is, to borrow a phrase, now in “everything, everywhere, all at once,” and there’s no putting that genie back in the bottle. But can we at least be a little more cautious about where we’re allowing that creep?
Humans have found innumerable ways to express grief throughout our history. Some are perfectly healthy and can be very beneficial to our healing from the pain of a loss. Online memorials are fairly standard practice in our present high-tech age. When a loved one dies, it’s relatively easy to convert their Facebook account to a memorial page, for instance. Online guest registries and remembrance pages on funeral home websites appear to exist in perpetuity. I put these things in the realm of spontaneous shrines and public memorials. A sort of digital roadside cross, if you will. And while it is, to me, rather impersonal, as most things are on the internet, it makes sense.
I was having a conversation with my aunt recently about folks who message the dead on Facebook. I won’t share what she said, because that’s her business. My take on it is this, though: people message the dead for all of the obvious reasons. It’s that moment when you think “Oh, I have to call Mom and tell her about this, she’ll love it,” and you reach for the phone and almost start punching in her name before you remember she’s not there. But you can still send a message on the Internet. You send messages with no expectation of a response. You send messages just to get the thought out there. Maybe she’ll pick it up from whatever is happening after this life. And maybe you feel a little better having said it “out loud.”
In a much more no-tech way, the Wind Phone (the phone booth Itaru Sasaki built in his garden in Otsuchi, Japan, where mourners “call” their dead on a disconnected line) serves this purpose. I think it’s so much more poetic. You can sit down in a little booth, pick up the phone, have a conversation with your person, even if it is one-sided, and go on about your day. And some small part of your grief is soothed. You are helped along on your path forward by talking to the wind and messaging the dead. It just feels better saying it “out loud” rather than it merely living in your head. I suppose you could do this anywhere, but it doesn’t really feel the same holding a cell phone to your ear, does it? Regardless, again, there is no expectation of a response.
Of course, there is a very long, rich, and fascinating history of humans attempting to contact the dead, with, if not the expectation of a response, at least the hope of one. (I’d link you to the paper I wrote on the history of Spiritualism, but it is not currently available online.) An entire religion was born out of this need to find proof of life after death. Well, that, and the perfect storm of a teenage prank and tragically high death rates in mid-nineteenth-century America. People in larger urban areas still make a living today as spirit mediums, offering you the opportunity to ask grandma if she’d please just give up the recipe for her bread so it isn’t lost to the ages.
There is a very great need for a very great many humans to believe that the soul or whatever it is that makes us us lives on after our bodies cease to serve their purpose. We are desperate for meaning, desperate to know that this isn’t all there is. I’ll admit to occasionally falling into a quiet despair thinking “What even is the point?” There is a portion of an Alan Watts lecture on the acceptance of death that someone made into a lovely video that I frequently revisit. In it, Watts tells us that the point of life is simply to live. To live and to create new life and to die. We live on through our children, in whom “life is renewed,” and we are reminded that life itself is “marvelous” and full of magic. Of course we must die.
And of course, we must feel the pain of death because death itself gives life meaning.
Enter artificial intelligence. Justin Harrison, founder of You, Only Virtual, believes, or at least hopes, that “people won’t have to feel grief at all.” I can’t tell you how much this disturbs me. Sure, it’s a noble idea. Spare everyone the pain of loss! Get lectured by your mother, who died much too soon. (Weird aside: his mother died in October of 2019, if I’m reading the years correctly, at age 61. I’m not sure whether she died before or after he started the company. My mother died in October 2019 at age 63. A little spooky. Clearly, as a death educator, I was meant to address his project. Maybe our mothers are having a Diet Coke together, watching some True Crime™ in the ever after.) It is very concerning to me that Harrison, through his project, is entirely disregarding the fact (or at least it seems that way on the surface) that AI is incapable of human emotion, arguably the one thing that is essential to grief, and the one thing AI is incapable of learning. I fully understand the temptation to “keep our loved ones alive” like this. But I’ve also seen Pet Sematary.
Loss sucks. It's horrific and agonizing and awful. But that pain is proof that someone's life meant something in this world. Grief is the measure of a life lost; it is the price we pay for loving the people around us. What do we do to that meaning when we train an AI to mimic the people we love? What do we do to the meaning of human existence itself?
Not to mention the other ramifications of an artificial simulacrum (thanks for planting the word in my noggin, Coop) of any of us out there in the ether. It’s bad enough we put most of our lives on the Internet already, but now we’re suggesting we allow a machine to simply virtually become us? Think of the crime! Think of the potential to do serious lasting damage to individuals if this were in widespread use. Maybe I’m getting ahead of myself, maybe not. I can say that I think this is potentially very dangerous to the grieving process.
It’s further troubling that Harrison brushes aside any thoughts of a right to privacy. He explicitly states that he simply doesn’t care whether his mother would have approved of “living” on in this way; she’s dead, he isn’t, and he’ll do what he wants. There is no respect there. The dead have precious few to speak for them, and I’m reasonably certain that AI is not the way to give them a voice. The reality is that while it may be trained on the text messages and emails you’ve fed it, it’s becoming its own entity through its interactions with you. So it still isn’t really your mom, is it?
This is the way horror movies start, don’t you think? A lovely interaction with my mom’s AI avatar. Isn’t it wonderful that I can have her with me, long after she’s given up the ghost? Isn’t it beautiful? No! It’s wrong! It’s wrong to bring these things into the world with no thought to the potential (probably unintentional) consequences of your actions. No one thought Skynet would take over the world and attempt to eliminate the human race.
OK, joking aside, I’m not entirely joking here. This is a dangerous business: ethically, potentially financially, and certainly emotionally and psychologically. I can only speculate on most of these areas; I’m a death educator, and that’s my area of expertise. But as I was telling my friend Dave just yesterday, I’m also a realist.
Creating a virtual version of the person you lost doesn't seem like a healthy way to deal with your grief. I would venture to say that, except in rare instances, it will likely even prolong it. This is not to say grief ever goes completely away, but it does change, and (in most instances) becomes less acute over time. The process of grieving is incredibly complex, and there are a number of healthy ways to go about it. The important thing is that we go through it. Avoiding the pain of loss is a perfectly natural reaction, but at some point, the pain must be dealt with. Most people don’t realize that grief doesn’t simply impact us emotionally; it has very real effects on us physically as well. And when we avoid working through our grief, those physical and psychological effects can become chronic. There are numerous studies on various types of grief: complicated grief, prolonged grief disorder, anticipatory grief, disenfranchised grief. We grieve because we are human. Extending the “life” of the dead through AI removes half of the human from our relationships.
On the flip side, it may be possible that AI can actually offer a great deal of help to those dealing with complicated and/or intense grief. I recall reading about the documentary mentioned in this article in The Washington Post a while back. I struggled with it at the time. I struggle with it now. We can see that Jang Ji-sung was helped by the experience of interacting virtually with her daughter. And as a mother, I can’t say that I wouldn’t jump at the same chance of a VR interaction if one of my children were to die unexpectedly. But I also can’t say that I would, because I can see how this can easily become addictive. Studies have shown that people experiencing complicated grief are more prone to addictive behaviors than those who experience a “normal” grief process. I can see how you could easily lose yourself in a virtual world in which your person still exists. I can see how easily you can lose your own humanity trying to hold on to the one you’ve lost, while all of your other loved ones watch you disappear. It is loss magnified.
AI can also help keep our memories alive, if not our loved ones. There is something to be said for that. How comforting it is to be able to visit those Facebook memorial pages and look back at old photographs, exchanges, and times spent together. This is, obviously, important and can be very beneficial to the grieving process. Having a photo pop up in my Facebook memories of the time my mom and I went to see Chris Isaak is a lovely reminder of one of the best times I had with her. I value that. From the Post article linked above: James Vlahos “said the [AI] doesn’t make him miss his father any less. ‘But I do love that he can feel more present to me, with the aspects of his personality that I love so much less clouded by the passage of time,’ he said.”
Alternatively, it can also keep our pain fresh and unresolved. As with everything, there is a fine line between what is healthy, and what isn’t.
The Post piece calls it bringing the dead back to life in a sense, but again, I argue that this is a terrible idea. We’re not meant to hold on to the dead after they’re gone in this way. What are we doing to the meaning of anyone’s existence when we artificially prolong their … being? Saying we shouldn’t have to grieve is like saying we can just take an aspirin for this bothersome headache and go on about the business of texting Mom that hilarious thing our youngest said at dinner last night, with the expectation of a response.
I think some of the issues here stem from our distinctly Western approach to death and dying. We live in such perpetual denial of our ultimate demise that when someone we love dies, it’s vastly more painful to accept than in many other cultures. This is the common refrain that runs through all of my work. By not making death an everyday part of our conversation, we do ourselves the grave disservice of not being prepared for the inevitable. And that makes loss that much harder to bear. It makes avoiding the pain of grief that much more tempting.
I know there is no escaping artificial intelligence at this point in history. In some ways, I suppose I’m glad that every day brings me closer to the end of my life than to the beginning of it. I grieve that my children will grow up and grow old in a world that is largely digitized, virtually fabricated, and lived in solitude. And maybe that’s part of the reason I feel so strongly about this concept. Perhaps the use of AI can truly be helpful to some people. But it just feels wrong to perpetuate the dead in this manner. The dead aren’t meant to be here after they’ve gone. That’s why life is so precious. And so I would advise that you proceed with caution. Humans have a long history of failing to take a pragmatic view of the future. It’s important that we don’t lose the very thing that makes us human.
Post Script: Full disclosure, I’ve run drafts of this essay through Claude 2 a few times in order to polish and perfect it. I won’t deny that AI can be a valuable tool. But that’s all it should be – a tool. It should not and cannot be a replacement or stand-in for flesh-and-blood humans. (Try telling that to Roy Batty, you might argue.) Additionally, I make a lot of jokes and references to science fiction throughout this piece, but I have always held firm in the belief that sci-fi authors are the Cassandras of our age. Doomed forever to tell of coming horrors, if only we would listen.