Ghostbots: AI versions of deceased loved ones could be a serious threat to mental health

NewsDrum Desk

Dublin: We all experience loss and grief. Imagine, though, that you don’t need to say goodbye to your loved ones. You can recreate them virtually so you can have conversations and find out how they’re feeling.

For Kim Kardashian’s fortieth birthday, her then-husband, Kanye West, gave her a hologram of her dead father, Robert Kardashian. Kim Kardashian reportedly reacted with disbelief and joy to the virtual appearance of her father at her birthday party. Being able to see a long-dead, much-missed loved one moving and talking again might offer comfort to those left behind.

Resurrecting a deceased loved one might seem miraculous – and possibly more than a little creepy – but what’s the impact on our health? Are AI ghosts a help or a hindrance to the grieving process? As a psychotherapist researching how AI technology can be used to enhance therapeutic interventions, I’m intrigued by the advent of ghostbots. But I’m also more than a little concerned about the potential effects of this technology on the mental health of those who use it, especially those who are grieving.

Resurrecting dead people as avatars has the potential to cause more harm than good, perpetuating confusion, stress, depression, paranoia and, in some cases, psychosis.

Recent developments in artificial intelligence (AI) have led to the creation of ChatGPT and other chatbots that can allow users to have sophisticated human-like conversations.

Using deepfake technology, AI software can create an interactive virtual representation of a deceased person from digital content such as photographs, emails and videos.

Only a few years ago, some of these creations were the stuff of science fiction fantasy; now they are a scientific reality.

Help or hindrance?

Digital ghosts could be a comfort to the bereaved by helping them to reconnect with lost loved ones. They could give the user an opportunity to say things, or ask questions, that they never had the chance to while the person was alive.

But the ghostbots’ uncanny resemblance to a lost loved one may not be as positive as it sounds. Research suggests that death bots should be used only as a temporary aid to mourning to avoid potentially harmful emotional dependence on the technology.

AI ghosts could be harmful to people’s mental health by interfering with the grief process.

Grief takes time, and its many stages can unfold over many years. The newly bereaved might think of their deceased loved one frequently, old memories may resurface vividly, and it is quite common for a grieving person to dream more intensely about their lost loved one.

The psychoanalyst Sigmund Freud was concerned with how human beings respond to the experience of loss. He pointed out potential added difficulties for those grieving if there’s negativity surrounding a death.

For example, if a person had ambivalent feelings towards someone who died, they could be left with a sense of guilt. Or if a person died in horrific circumstances, such as murder, the bereaved might find the death more difficult to accept.

Freud referred to this as “melancholia”; it is also known as “complicated grief”. In some extreme cases, a person may hallucinate that they see the dead person and begin to believe they are alive. AI ghostbots could further traumatise someone experiencing complicated grief and may exacerbate associated problems such as hallucinations.

Chatbot horror

There are also risks that these ghostbots could say harmful things or give bad advice to someone in mourning. Generative software such as ChatGPT is already widely criticised for giving users misinformation.

Imagine if the AI technology went rogue and started making inappropriate remarks to the user – a situation journalist Kevin Roose experienced in 2023, when a Bing chatbot tried to persuade him to leave his wife. It would be very hurtful if a son or daughter conjured up their deceased father as an AI ghost, only to hear that they weren’t loved or liked, or weren’t their father’s favourite.

In a more extreme scenario, the ghostbot might suggest that the user join them in death, or that they should kill or harm someone. This may sound like the plot of a horror film, but it’s not so far-fetched. In 2023, the UK’s Labour party outlined a proposed law to prevent the training of AI to incite violence.

This was a response to the 2021 attempted assassination of the Queen by a man who had been encouraged by his chatbot girlfriend, with whom he had an “emotional and sexual” relationship.

The creators of ChatGPT currently acknowledge that the software makes errors and is still not fully reliable because it fabricates information. Who knows how a person’s texts, emails or videos will be interpreted and what content will be generated by this AI technology? In any event, it appears that no matter how far this technology advances, there will be a need for considerable oversight and human supervision.

Forgetting is healthy

This latest technology says a lot about our digital culture of seemingly limitless possibility.

Data can be stored in the cloud indefinitely; everything is retrievable and nothing is truly deleted or destroyed. Forgetting is an important element of healthy grief, but in order to forget, people need to find new and meaningful ways of remembering the deceased.

Anniversaries play a key role in helping those who are mourning not only to remember lost loved ones but also to represent the loss in new ways. Rituals and symbols can mark the end of something, allowing us to properly remember in order to properly forget. (The Conversation)
