AI ghosts: Researchers explore planning for death with new technology

Imagine interacting with an AI video of a deceased friend that can talk, act and converse just like they did in life. Or, having access to a texting feature that can answer questions on behalf of a dead family member regarding their life experiences or last wishes.

Jed Brubaker, right, conducts an AI ghosts Zoom meeting at the University of Colorado on Wednesday. He is joined by Dylan Doyle, left, and Camille Brutera. (Cliff Grassmick/Staff Photographer)

The reality is that this type of artificial intelligence, or AI, isn’t far off. In some cases, it’s already here.

“It’s not a question of if generative ghosts are coming, it’s a question of when,” University of Colorado Boulder professor Jed Brubaker said. “And so it’s really important that we start thinking about what that’s going to look like now so that we can design them to be the most pro-social, most positive version possible, and we can be thoughtful and avoid unanticipated and negative outcomes that we want to make sure we avoid.”

AI ghosts, also known as generative ghosts, are AI agents designed to represent a dead person by acting on their behalf or acting as them.

CU Boulder graduate student Daniel Sullivan said it’s exciting to be on the forefront of something so new and strange.

“It tends to elicit a very visceral reaction, like that is a very scary or dystopian thing,” Sullivan said. “But I think there are just so many different perspectives and ways it can exist, and it already does exist in certain forms.”

Generative ghosts will be able to do more than an AI chatbot like ChatGPT because they can generate new content based on available data, such as emails and videos.

“One way that ghosts might vary is what form of embodiment they might have, that’s the term we use,” Brubaker said. “A ghost that’s texting with you as opposed to talking with you on the phone, as opposed to virtual reality. The modality in which we’re going to interact with these things is going to vary, and that’s something we’re going to need to pay attention to, because what’s appropriate, what people like, what’s beneficial versus detrimental, in part has to do with what type of embodiment or what way people are interacting with it.”

Brubaker and his team of researchers at CU Boulder analyzed and collected information about what’s already available in terms of AI ghosts in a paper called “Generative Ghosts: Anticipating Benefits and Risks of AI Afterlives.”

The paper outlined how some start-up companies are already offering services for people to create their own AI ghosts while they are still alive.

Re;memory, for example, a service offered by DeepBrain AI, creates an interactive virtual representation of a person after a seven-hour filming and interview session, so that friends and family can engage with the person’s memory after they die. HereAfter provides an app that interviews a user to create a post-death digital representation through a chatbot.

Some museum curators and archives are creating AI ghosts of historical figures for public use. Brubaker said he has colleagues at MIT who are working on a ghost of Leonardo da Vinci for a museum in Paris, for example. The field of Holocaust studies, Brubaker said, is also trying to use AI to preserve the memories and histories of the last generation of Holocaust survivors as they near the end of their lives.

“Maybe there’s something to be said for these interactive ghosts that can help people connect with the past and honor historical heritage,” Brubaker said.

Brubaker’s lab is now studying how people feel about AI ghosts by bringing in participants to interact with them and recording their thoughts and feelings about the experience.

“At the end of the day, if you’re not taking into consideration people’s feelings, reactions and emotions, it will inevitably move fast and crumble easily,” graduate student Jack Manning said. “We’re trying to focus on the user to create something that will hopefully last and benefit the people who will be using it.”

Manning said he’s most excited about the question of reincarnation versus representation in AI ghosts.

“Should generative ghosts be designed to fully reincarnate the person they’re designed around, speaking and acting as them, or should it keep one level of removal?” Manning asked.

But there are also a variety of risks, including ethical concerns. One issue and major unknown is how an AI ghost will impact grief, an area Brubaker said needs to be researched. It’s unclear if an AI ghost would help people who are grieving or make it more difficult.

Another issue is whether a dead person intended to leave an AI ghost behind. Brubaker said if AI systems aren’t designed for death, there will be unintended harms or consequences.

“I think we’re about to enter into this era in the next three years where we’re going to start having agents for everything, and if we are not thinking about what should happen when we die, how should they handle them, we’re going to get a lot of accidental or unintentional ghosts,” Brubaker said.

The group’s paper outlines a host of other potential risks.

For instance, an AI ghost might expose true information that the dead person would not have wanted revealed, or provide false information. There’s also the potential for post-mortem identity theft and for hijacking AI ghosts to exploit living loved ones for financial gain, as well as data and privacy risks if a third party creates an AI ghost on someone else’s behalf or if someone accesses a dead person’s data without authorization.

Some people might even create malicious ghosts. For example, the team’s paper said an abusive spouse might develop a generative ghost that continues to verbally and emotionally abuse their surviving family members.

“In addition to ghosts that might engage in post-mortem harassment, stalking, trolling, or other forms of abuse of the living, malicious ghosts might be designed to engage in illicit economic activities as a way to earn income for the deceased’s estate or to support various causes including potentially criminal ones,” the paper read.

But there are also various reasons why someone might want to leave an AI ghost of themselves, Brubaker said. For example, a father with a terminal cancer diagnosis may want to write a letter for each of his child’s birthdays after he’s gone. With AI, he could create a ghost rather than leaving letters. Or, an AI ghost could help with planning memorials or clarifying final wishes as outlined in a will.

Brubaker added that grief changes over time. Maybe one year after a loved one dies, an AI ghost is too much. But maybe 10 years later, it’s nice to have the ghost speak in their voice again.

“Part of the lab work we’re doing now is figuring out those social, emotional and cultural factors that influence how people experience and benefit or not from ghosts,” Brubaker said.
