AI deadbots are becoming increasingly popular as a way to find closure after the death of a loved one.
The digital afterlife industry is expected to reach a market size of $80 billion by 2034
Imagine being able to send a message or speak with a loved one who has passed away, and receiving a reply that sounds just like them. Even ten years ago, this would have been considered science fiction, but with the rise of generative artificial intelligence (AI) technology, it is possible today. Popularly known as deadbots, griefbots, and thanabots, these are essentially AI chatbots designed to mimic the dead and offer their loved ones continued conversations and comfort. But what is this technology, how does it work, and why has it sparked an ethical debate? Let's dive in.
Generative AI technology is currently used to write essays, generate images and videos, create songs, build apps and websites, and even power humanoid robots. But one use case most people perhaps did not anticipate was bringing back the dead as deadbots. Put simply, deadbots are AI chatbots or digital avatars that can mimic the personality, voice, and mannerisms of someone who has died.
Deadbots are trained on the digital footprint of an individual, such as text messages, emails, voice recordings, videos, social media profiles, and more. A large language model (LLM) is trained on this dataset, after which it can generate responses that resemble what the person would have said, in the manner they would have said it.
These AI chatbots can be text-based and function as a messaging interface, but the advanced versions can also include a deepfake-style digital avatar that looks and moves like the person it is mimicking. The latter is aimed at a more immersive experience.
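To make the idea above concrete, here is a minimal sketch of one step such a system might take: retrieving a person's past messages that are stylistically closest to an incoming message, then assembling them into a prompt an LLM could complete in that person's voice. The corpus, function names, and similarity heuristic are all invented for illustration; real services would use far larger datasets and more sophisticated models.

```python
# Hypothetical sketch of the retrieval-and-prompting step behind a deadbot.
# The corpus and all names here are invented for illustration only.
from difflib import SequenceMatcher

# A tiny stand-in for a person's digital footprint (texts, emails, etc.)
corpus = [
    "Don't worry about it, kiddo. It'll sort itself out.",
    "Call me when you land, okay?",
    "Your grandmother's recipe needs twice the garlic.",
]

def retrieve_examples(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank stored messages by rough textual similarity to the query."""
    scored = sorted(
        corpus,
        key=lambda m: SequenceMatcher(None, query, m).ratio(),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble a persona prompt an LLM could complete in the person's voice."""
    examples = "\n".join(f"- {m}" for m in retrieve_examples(query, corpus))
    return (
        "Reply in the style of these past messages:\n"
        f"{examples}\n\n"
        f"Message: {query}\nReply:"
    )

print(build_prompt("When should I call you?", corpus))
```

In practice, this kind of prompt would be sent to a fine-tuned model, and advanced services would layer voice cloning and a deepfake avatar on top of the text reply.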
The goal of all these deadbots is to offer the bereaved continued conversations with those who have passed away, helping them find closure and cope with their grief.
The most common use of AI deadbots is to keep alive the memory and essence of someone who has passed away. Multiple AI startups now provide subscription-based digital afterlife services that allow people to build their own deadbots. However, this is not the only way the technology is being used.
In 2023, Rolling Stone reported that deadbots were becoming increasingly popular among true crime fans on TikTok, who use AI to generate digital avatars of victims of violent crime and have them narrate their own horrifying stories. Some museums have also started using the technology to create interactive exhibits of historical figures.
One interesting use case was seen in May, when a deadbot of a US Army veteran, Chris Pelkey, who was killed in a road rage incident, was used inside a courtroom to deliver a victim impact statement. According to a Mashable report, the judge later acknowledged that the statement influenced him to deliver the maximum sentence to the accused.
Despite the growing popularity of deadbots, the technology remains highly divisive. While proponents argue that it brings comfort to the friends and family of the deceased and preserves their memory, critics have raised several concerns about both the technology and how it is being used.
The most common concern revolves around consent. With the technology still in its nascent stages, very few individuals have consented to having their likeness or private data used to create a digital replica after death. The instances of TikTok users creating digital replicas of children who died in violent crimes are an example of this.
Studies have claimed that deadbots could have a negative impact on the grieving process of bereaved users. The claims range from fostering a false sense of attachment to harming long-term emotional and psychological well-being by prolonging the grieving stage.
Commercialising people's grief over losing a loved one is also considered an ethical concern. A subscription-based service requires an individual to keep paying just to continue conversations with someone who has passed away. Such a system opens the door to emotional and financial exploitation by service providers.
Deadbots also come with their fair share of privacy risks. A significant amount of private data is used to create a digital avatar, and that data remains stored on the company's servers. Users' intimate conversations with the bot are likewise stored in the cloud. As such, a data breach could lead to impersonation, identity theft, and commercial exploitation.
There is no doubt that digitally resurrecting the dead is one of the most fascinating applications of AI; however, the nature of the use case calls for a strict regulatory framework, safeguards, and a deeper understanding of how it can emotionally and psychologically affect people in the long run.