
AI resurrections: Reviving the dead or exploiting grief?

A significant concern is that businesses might employ artificial intelligence resurrections to personalise their user marketing

Canadian singer Drake's use of AI-generated Tupac Shakur vocals, and AI-generated speeches attributed to politicians years after their deaths, show how technology is blurring the line between life and death. Beyond their alluring appeal in politics and entertainment, however, several innovative but divisive projects could soon make AI “zombies” a reality for bereaved families.

So how do AI “resurrections” actually work, and are they as troubling as we might think?

Artificial intelligence initiatives worldwide have produced digital “resurrections” of deceased people, enabling friends and family to communicate with them. Usually, users give the AI tool details about the departed: answers to personality questionnaires, for instance, or the person’s emails and text messages.

After processing the data, the AI tool converses with the user as if it were the deceased. Replika, a chatbot that mimics a person’s texting habits, is one of the most popular services in this field.

Other businesses go further, letting you watch a video of the deceased person while you converse with them.

For instance, StoryFile, a Los Angeles-based company, uses AI to let people speak at their own funerals. Before they die, a person can record a video in which they share their thoughts and life stories. When guests ask questions during the funeral, the AI pulls pertinent answers from the prerecorded video.

Additionally, US-based Eternos made headlines in June when it used AI to build a digital afterlife for a person. The project, which got under way early this year, gave Michael Bommer, 83, the opportunity to leave behind a digital legacy his family could use to stay in touch with him.

Striking an emotional chord

A 2020 video of a tearful virtual-reality reunion between a South Korean mother and an AI replica of her deceased daughter provoked a heated online debate over whether this kind of technology helps or harms its users.

These projects’ creators emphasise the agency of the users and claim that their work alleviates a deeper pain.

Most users arrive carrying an “extraordinary amount of sorrow and grief,” according to Jason Rohrer, founder of “Project December,” which likewise uses artificial intelligence to facilitate conversations with the dead. Rohrer said most users treat the programme as a coping mechanism.

Many of the people who wish to use “Project December” in this way are so overcome by grief that they are willing to try anything to get through it.

According to Rohrer, many people who use the programme to hold these conversations with the deceased find that it helps them reach closure.

Robert LoCasio, the founder of Eternos, said he started the business to record people’s life experiences and to enable their loved ones to carry on with those stories.

According to LoCasio, Bommer, his former colleague who died in June, intended to leave a digital legacy that belonged only to his family.

“Just a few days before he passed away, I spoke with [Bommer] and he told me to always remember that this was for me. This was significant to me, even if I’m not sure if they will use it in the future,” the Eternos founder remarked.

Some observers are more cautious about AI resurrections, raising concerns about their potentially harmful psychological effects and doubting whether severely bereaved individuals can make an informed decision to use them.

Alessandra Lemma, a consultant at the Anna Freud National Centre for Children and Families, said, “As a clinician, my main worry is that grief is genuinely a very significant process. The ability to acknowledge the absence of another person is a crucial component of growth.”

Lemma cautioned that prolonged use could prevent people from accepting the other person’s departure, putting them in a condition of “limbo.”

A crucial element of one AI service is its perpetual connection to the departed.

Before its recent change, the company’s website said, “Welcome to YOV (You, Only Virtual), the AI startup pioneering improved digital communications so that we Never Have to Say Goodbye to those we love.”

According to Rohrer, his grief bot has a “built-in” limit: customers pay $10 for a time-limited chat.

The cost of processing for each response varies, but the fee purchases time on a supercomputer. This means that $10 can cover one to two hours of chat, but it does not guarantee a set number of responses. Users receive a notification as the allotted time expires, allowing them to say their final goodbyes.

Lemma, who has researched the psychological effects of grief bots, said that although she is concerned about their use outside a therapeutic setting, they could be used safely as an adjunct to professional therapy.

The other side of the coin

Proponents of the technology say the digital era is simply creating new ways of preserving life stories, perhaps filling a gap left by the decline of traditional family storytelling.

“In the past, when a parent knew they were going to die, they would leave boxes full of items or books that they would want to give to a child,” Lemma said.

Seen in that light, these services may be the twenty-first-century version of that: something a parent creates before their death and passes down.

“It’s actually the most natural thing for a human to be able to share the stories of their life to friends and relatives,” said LoCasio of Eternos.

Studies and experts alike have voiced concern that these services may not be able to protect the privacy of user data, and that third parties could gain access to personal information, including the text messages users share with them.

Renee Richardson Gosline, senior lecturer at the MIT Sloan School of Management, noted that even if a corporation claims it would keep data private when someone first signs up, privacy cannot be guaranteed due to frequent adjustments to terms and conditions and potential changes in company ownership.

LoCasio and Rohrer emphasised that privacy was a central focus of their initiatives. Eternos restricts access to a digital legacy to authorised family members, while Rohrer said he only views conversations when users submit a customer support request.

Both acknowledged, though, that these worries might materialise in the case of tech behemoths or for-profit businesses.

A significant concern is that businesses might employ artificial intelligence resurrections to personalise their user marketing.

A loved one’s voice might deliver an advertisement, or their texts might push a product.

“What you’ve created is a pseudo-endorsement based on someone who never agreed to do such a thing when you’re doing it with vulnerable people. Thus, agency and power asymmetry are the true issues,” according to Gosline.

Gosline argues that these tools, designed for grieving individuals, can be dangerous, especially with the involvement of large tech companies.

“We should be concerned because the items of the most vulnerable people are usually the first to break in the fast-paced, ‘break everything’ ethos of internet businesses,” Gosline said.

“And it’s difficult for me to think of someone who is more defenceless than those who are mourning,” she added.

Experts have also criticised the ethics of digitally bringing the dead back to life, especially when users feed the AI data about the deceased without the deceased’s consent.

A growing number of people are also worried about the environmental impact of AI-powered tools and chatbots, especially those built on large language models (LLMs), programmes designed to understand and produce human-like text.

These systems require large data centres, which emit substantial carbon dioxide and release water vapour during cooling, not to mention the e-waste generated by regular hardware upgrades.

Because of the strain artificial intelligence was placing on its data centres, Google revealed in a July 2024 report that it had fallen behind its ambitious net-zero targets.

Gosline acknowledged that no software is flawless and that many people using these AI chatbots would stop at nothing to rekindle a relationship with a loved one who has passed away. However, she stated that it is the responsibility of scientists and leaders to give more attention to the kind of world they wish to build.

