AI afterlife and practical digital estate planning

A company called Eternos charges $25 a month to build a digital twin of you. You record your voice, answer personality questions, feed it your writing. When you die, your family can call you up and have a conversation. Your voice. Your mannerisms. Your opinions about whether pineapple belongs on pizza. The works.

Reuters recently profiled the pull of this category: a Brazilian son recreating his dead father's voice with ElevenLabs, and families preserving AI versions of people before they die. The emotional promise is obvious. It feels, almost, like they're still here.

I read that and felt two things. First: genuine empathy. Grief is brutal and people cope however they can. Second: deep frustration. Because while someone is having synthetic conversations with a dead parent, they might also be unable to access the bank account, cancel subscriptions, or find the life insurance policy number.

The broader digital legacy market is projected to grow from roughly $22.5 billion in 2024 to about $79 billion by 2034. That's a lot of money flowing toward digital death. Too much of the public imagination still goes to emotional simulation while the practical catastrophe of digital death goes mostly unsolved.

The comfort problem

I'm not here to judge anyone who finds solace in an AI version of someone they lost. Grief doesn't follow rules. If talking to a chatbot that sounds like your mother helps you get through the day, that's your choice.

But I will push back on the idea that this is "death tech" solving the problem of what happens when someone dies. It isn't. It's emotional technology that happens to involve dead people.

The actual problem looks like this:

Your partner dies. Within 48 hours you need to notify their employer, find out if there's life insurance, access joint accounts that only they managed, figure out which bills are on autopay from which accounts, and start dealing with legal paperwork. Within a week you need to handle social media accounts, email archives, cloud storage, crypto wallets, business partnerships, and the 200+ online accounts they accumulated over two decades.

An AI chatbot that mimics their voice cannot help with any of this. It doesn't know their passwords. It can't authorize account transfers. It doesn't have their 2FA codes. It performs presence. It doesn't solve anything.

What grief tech actually sells

The companies in this space figured out something uncomfortable: grief makes people spend money before they ask hard questions.

HereAfter AI interviews you about your life stories, then lets your family "chat" with you after you die. StoryFile records video interviews indexed for interactive Q&A. Eternos builds voice clones. Seance AI and Project December let you type messages to approximations of the dead.

Same pattern across the board. They capture the person: voice, stories, personality traits, opinions. What they are not built to capture is the operational layer: account credentials, financial structures, legal documents, and instructions for what to actually do.

One set of data helps you feel better. The other keeps your family from financial disaster. The industry's public energy is still tilted toward the first category while the second gets treated like paperwork.

The consent problem nobody talks about

Something that bothers me as someone building in this space.

AI afterlife companies need your data to function. Texts, emails, social media posts, voice recordings. They train models on who you were. The question nobody asks: who consents to this?

If you sign up yourself while alive, that's straightforward. You chose to create your digital twin. Fine.

Some of the more careful services require pre-death consent. Good. But the tools underneath this category do not require that moral architecture. Surviving family members can already scrape social media, feed old text messages into a language model, or clone a voice from home videos.

The dead person never agreed to become a chatbot. They never reviewed what the AI says on their behalf. They never got to decide which version of themselves gets immortalized. The version at 25 or at 75? The Instagram persona or the private one?

When Trustbourne handles someone's data after they die, it delivers exactly what that person chose to share, to exactly whom they chose, in exactly the way they intended. The vault owner makes every decision while they're alive. There's no interpretation, no approximation, no AI deciding what you "would have said."

That distinction matters.

What your family actually needs

I've spent the last year thinking about what happens after someone dies. Not the emotional part, though that's real. The logistical part. Where someone sits down and tries to figure out the shape of a life they only partially understood.

What survivors consistently need, roughly in order:

Immediate (first 48 hours):
- Access to email, bank accounts, employer contacts, and insurance information.
- Ability to lock or secure accounts against fraud.
- Knowledge of outstanding debts, autopay commitments, and financial obligations.

Short term (first month):
- A complete picture of digital accounts and assets.
- Access to important documents (wills, deeds, contracts).
- Instructions for crypto wallets, business accounts, and anything non-obvious.
- Contact information for lawyers, accountants, and financial advisors.

Long term:
- Ability to memorialize or close social media accounts.
- Access to photos, videos, and writing.
- Transfer of digital purchases, subscriptions, and domains.

A chatbot that sounds like the deceased appears nowhere on this list. Not because it's worthless, but because it solves a different problem than the one that actually wrecks families.

Your family can grieve without AI. They can't pay the mortgage without account access.

The uncomfortable overlap

There's a place where grief tech and practical inheritance planning could work together.

Imagine recording a video of yourself explaining your financial setup, then attaching it to your Trustbourne vault alongside the actual credentials and documents. Your family gets both the information they need and the comfort of hearing you walk them through it. That's powerful.

Some AI afterlife companies are starting to add "practical" features. Eternos mentions "knowledge sharing." HereAfter lets you record specific instructions alongside life stories. The lines are blurring.

The fundamental architecture is still wrong, though. These tools are built around simulation, not security. They're designed to capture personality, not protect credentials. Adding a password field to a voice-cloning app doesn't make it a vault. That's like adding a notes feature to your bank app and calling it a diary.

Storing someone's crypto seed phrase requires zero-knowledge encryption, access controls, and legally compliant release mechanisms. Storing their personality profile requires a good microphone. These are not the same engineering problem.
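To make "zero-knowledge" concrete, here's a minimal sketch of client-side encryption: the key is derived from a passphrase on the user's device, so a server storing the result only ever sees ciphertext. This is an illustration built from Python's standard library, not any vault's actual implementation; a production system would use a vetted library (libsodium, for example) and add authentication to the ciphertext.

```python
# Illustration only: a passphrase-derived key plus an HMAC-SHA256
# counter-mode keystream, built from the standard library so the
# zero-knowledge idea is self-contained. Not production crypto.
import hashlib
import hmac
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Key derivation happens on the user's device; the server never sees it.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Generate a pseudorandom byte stream by HMAC-ing an incrementing counter.
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    # XOR with the same keystream reverses the encryption.
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

salt = secrets.token_bytes(16)
key = derive_key("correct horse battery staple", salt)
secret = b"seed phrase: abandon ability able ..."
nonce, ct = encrypt(key, secret)
recovered = decrypt(key, nonce, ct)
```

The point of the sketch is the architecture, not the cipher: only `nonce`, `salt`, and `ct` ever need to leave the device, which is what separates a vault from a voice-cloning app with a password field.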

Where this is going

One tracker counts 948 death-tech companies worldwide, with 162 funded and about $837 million in private equity behind them. That includes funeral tools, estate administration, digital memorials, wills, grief support, the whole messy category.

Both feelings and function matter. But the attention is off. Badly.

For every dollar spent making dead people talk, we should be spending ten on helping living people plan what happens to their digital lives.

"Son hears dead father's voice again" gets more clicks than "Widow finds bank login in digital vault." But one of those changes outcomes. The other changes feelings.

What I'd tell the founders

If you're building grief tech: build the practical layer first. The emotional stuff can sit on top. Not the other way around.

If you're a user considering these services, ask yourself: Does my family know how to access my financial accounts if I die tomorrow? If the answer is no, an AI version of you won't fix that.

And if you've lost someone and find comfort in their AI avatar, I hope it helps. Sincerely. But also check whether they left you what you actually need. The bank login. The insurance number. The password to the email where all the other passwords live.

Because your family doesn't need another conversation with you. They need the one conversation you never had while you were alive.


Trustbourne is a digital vault and dead man's switch for your most important files and credentials. No AI clones. No voice simulation. Secure storage, trusted contacts, and a system that delivers what your family actually needs. Learn more
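For readers curious what a dead man's switch actually does, here is a hypothetical sketch of the core logic (my own illustration, not Trustbourne's implementation): the owner checks in on a schedule, and release to trusted contacts triggers only after a silence longer than the check-in interval plus a grace window.

```python
# Hypothetical dead-man's-switch logic: release vault contents only
# after the owner stops checking in for longer than interval + grace.
from datetime import datetime, timedelta

class DeadMansSwitch:
    def __init__(self, interval: timedelta, grace: timedelta, started: datetime):
        self.interval = interval        # expected time between check-ins
        self.grace = grace              # extra margin before release triggers
        self.last_check_in = started

    def check_in(self, now: datetime) -> None:
        # Each check-in resets the clock; a real system would send
        # reminders as the deadline approaches.
        self.last_check_in = now

    def should_release(self, now: datetime) -> bool:
        return now - self.last_check_in > self.interval + self.grace

# Simulated timeline: monthly check-ins with a two-week grace window.
t0 = datetime(2025, 1, 1)
switch = DeadMansSwitch(timedelta(days=30), timedelta(days=14), started=t0)
switch.check_in(datetime(2025, 2, 1))   # owner checks in on schedule
quiet = switch.should_release(datetime(2025, 2, 20))   # 19 days silent: no
release = switch.should_release(datetime(2025, 4, 1))  # 59 days silent: yes
```

The grace window matters: it's the difference between a missed reminder during a hospital stay and a premature release of someone's most sensitive credentials.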