Imagine interacting with an AI video of a deceased friend who can look, act and talk like they did in life. Or imagine a text messaging feature that answers questions on behalf of a dead family member about their life experiences or final wishes.

The reality is that this type of artificial intelligence, or AI, is not too far away. In some cases, it’s already here.
“It’s not a question of whether generative ghosts are coming, it’s a question of when,” said Professor Jed Brubaker of the University of Colorado Boulder. “And it’s really important to start thinking about what they look like now, so that we can design them to be the most pro-social and positive versions possible and avoid unexpected negative consequences.”
Also known as generative ghosts, AI ghosts are AI agents designed to represent dead people, either by acting on their behalf or by acting as them.
Daniel Sullivan, a graduate student at CU Boulder, said it is exciting to be at the forefront of something so new and strange.
“It tends to elicit a very visceral response, like something very scary or dystopian,” Sullivan said. “But I think there are so many different perspectives and ways that it can exist, and it already exists in a certain way.”
Generative ghosts can create new content based on available data such as emails and videos, making them more than simple AI chatbots like ChatGPT.
“One way ghosts can vary is the form of embodiment they may have, and that’s the term we use,” Brubaker said. “A ghost might text you rather than talk to you on the phone, or appear in virtual reality. There are a variety of modalities for interacting with these things, and that’s something we need to pay attention to.”
A team of researchers led by Brubaker at CU Boulder analyzed what already exists in the landscape of AI ghosts in a paper called “Generative Ghosts: Anticipating Benefits and Risks of AI Afterlives.”
The paper notes that some startups already offer services that let people create their own AI ghosts while they are still alive.
For example, DeepBrain AI’s Re;memory creates interactive virtual representations of people after seven hours of filming and interview sessions. HereAfter provides an app that interviews users and creates digital post-mortem representations via chatbots.
Some museum curators and archivists have created AI ghosts of historical figures for public use. Brubaker said he has a colleague at MIT working on a ghost of Leonardo da Vinci for a museum in Paris, for example. He added that Holocaust studies, as an academic field, is trying to preserve the memories and histories of the last waves of Holocaust survivors as they approach the end of their lives.
“Maybe there’s something to be said for these interactive ghosts that can help people connect with the past and respect historical heritage,” Brubaker said.
Brubaker’s lab is currently studying how people feel about AI ghosts by bringing in participants to interact with them and documenting their thoughts and feelings.
“At the end of the day, if you don’t take into account people’s emotions and reactions, the technology will inevitably move too fast and fall apart,” said graduate student Jack Manning. “We’re focused on our users and trying to create something that lasts and benefits the people who use it.”
Manning said he is most interested in questions about how AI ghosts embody and represent the dead.
“Should generative ghosts be designed to fully impersonate the people they represent, speaking and acting as them, or should they maintain a level of removal?” Manning asked.
However, there are also a variety of risks, including ethical concerns. One central unknown is how AI ghosts affect grief: it is unclear whether they will help those who are grieving or make grieving even more difficult.
Another question is whether the dead intended to leave AI ghosts behind at all. Brubaker said that when AI systems are not designed with death in mind, unintended harms and consequences follow.
“I think we’re heading into an era in the next three years where we’ll start to have agents for everything. If we don’t think about what those agents should do when we die, and how they should handle it, we’ll get a lot of accidental or unintended ghosts,” Brubaker said.
The group’s paper outlines many other potential risks.
For example, an AI ghost may disclose true information the deceased did not want revealed, or provide false information about them. There is also the possibility of posthumous identity theft through the hijacking of an AI ghost. Data and privacy risks arise as well when a third party creates an AI ghost on someone else’s behalf, or when someone accesses a dead person’s data that was meant to remain private.
Some people may create malicious ghosts. For example, the team’s paper notes that an abusive spouse could develop a generative ghost that continues to verbally and emotionally abuse surviving family members.
“In addition to ghosts that could engage in post-death harassment, stalking, trolling, or other forms of abuse, malicious ghosts may be designed to engage in illegal economic activity as a way of earning income for the deceased’s estate or to support a variety of causes, including potentially criminal ones,” the paper reads.
But there are various reasons why someone might want to leave an AI ghost of their own, Brubaker said. For example, a father with a terminal cancer diagnosis might want to leave a letter for his child’s birthday each year; with AI, he could leave behind a ghost rather than letters. An AI ghost could also help clarify funeral plans and the final wishes outlined in a will.
Brubaker added that grief changes over time. A year after a loved one dies, an AI ghost might be too much. But ten years later, hearing the ghost speak in that person’s voice again might be welcome.
“Part of the work our lab is doing now is to understand the social, emotional and cultural factors that influence how people experience and benefit from ghosts,” Brubaker said.
Originally published: July 1, 2025, 12:59 p.m. EDT