
It’s time to add AI protections to your will

That means everyone, not just celebs.

By Chase DiBenedetto on December 4, 2025


[Image: A hand signing a document, surrounded by cyber imagery and a giant floating lock. Things to plan for after you die: funeral costs, inheritance, AI. Credit: Ian Moore / Mashable Composite; A. Martin UW Photography / Moment / akinbostanci / iStock / Getty]

A visibly pregnant woman stands in the middle of a bright, modern kitchen, rubbing her belly and speaking to someone on the other end of a phone. The phone screen turns. It’s a video call. And it’s not just anyone, but her mom, wearing a bright sweater and giving advice.

Ten months later, grandma is telling the toddler a bedtime story. She’s wearing the same sweater from before. Ten years go by, and the preteen is telling grandma about his day at school. We see that red sweater again. Hm. The grandson is 30 now; he’s about to be a dad. Grandma hasn’t aged a day.

The scene is an advertisement selling the services of 2wai, an app currently in beta that turns a short video clip into an AI-powered avatar. It’s one of many products trying to win people over to creating AI versions of themselves to be used after they die.

The fear of deepfakes and AI-powered legacy projects (frequently called resurrections or “deadbots”) is no longer the sole worry of celebrities. It is here, for the average person, in the hands of your family and friends.

So what if you don’t want a synthetic version of yourself giving advice to your descendants in perpetuity? Or what if your AI replica is used in advertisements, in art, or by corporations with access to your data?

It’s still uncharted territory, but you have options to ensure your digital likeness stays offline. And there are many reasons, not just legal or financial, why you might want to do it. Here’s how.


Start thinking about AI before you die

There’s one thing that needs to be stated right off the bat: Everyone should be planning for their death. 

“We invest so much time and consideration into milestones like weddings and having children, but very little thought is given to how we want to live our final months and years,” said Sarah Chavez, director of the Order of the Good Death, a global network of advocates and professionals working to reframe death and dying.

So alright, you know you need to make sure your digital ducks are in order before you get too old. But do you really need to think about AI, deepfakes, and digital likenesses, of all things?   

If you had asked Chavez this question a year ago, she would have had an entirely different response. That’s rapidly changed. “AI has become so prominent in our everyday lives, not just professionally and personally,” Chavez explained. “We’re also starting to see the dead used in a way that can have legal and social impact, too.” She points to the case of Chris Pelkey, a road rage victim whose voice was resurrected by his family to deliver his own victim impact statement. Chavez recalls the viral Shotline project, too, which used AI audio deepfakes of gun violence victims to urge politicians to pass common-sense gun reform legislation. Similar tech was used to create an AI likeness of Parkland shooting victim Joaquin Oliver.

There’s a high degree of risk associated with allowing digital versions of yourself to exist online with no parameters. Could your digital likeness be used as a tool for scammers, for example, to con your family, friends, or even strangers? What about the legal and social ramifications of a chatbot created in your image, one that may become embroiled in the same courtroom battles currently faced by ChatGPT and others? Another big question: What about your personal data privacy? Are you okay with your loved ones handing a tech company or AI developer the massive amount of data needed to personalize an AI version of you?

“It’s important to remember that these tools are created by for-profit tech companies, which raises a number of concerns about ownership of that data and how it will be used,” warns Chavez. 

Regular people, not just celebrities or those who make headlines, are seeing the fallout of unfettered access to generative AI, like targeted scams and growing misinformation. Just a handful of bullet points in your will could decide whether your digital legacy is mired in the same controversies. If there were ever a time to start planning for the end of your life, it’s now.

AI, your death, and the law 

Cody Barbo, founder of the digital estate planning tool Trust & Will, suggests people use estate planning to better control their digital footprint. The service is like TurboTax, but for writing a will, and he says he built it to help regular people who may be avoiding the conversation completely. It’s also a way to bring tech into an industry that has been slow to modernize, even as AI raises huge security and estate questions.

“Over the past decade, end-of-life planning regarding tech has primarily focused on encouraging people to include information about what they want done with their cell phone, email accounts, and social media platforms, and making sure they’ve provided passwords and login information for their accounts,” Chavez explained. With AI an emerging yet already dominant technology, the industry needs to catch up.

“We’re just at the entry point,” Barbo said. “We’re dipping our toes in the water of what an AI version of ourselves could look like. [But] we want people to know that you can be in control.”

How does that work in practice? “The challenge with trying to protect something that is so new, that is so innovative, is that there’s no legislation to help you,” explained Solomon Adote, chief information security officer of The Estate Registry and former chief security officer for the state of Delaware. “Some states say you cannot violate certain privacy protections, but nothing that explicitly says that you cannot abuse this person’s likeness, image, or other aspects of their representation.” In the background, a patchwork of state laws is trying to address these concerns by extending privacy protections, which would better safeguard your digital assets, including your data privacy, after you die.


For now, individuals have to turn to proactive estate planning. 

What are you trying to protect?

First task: Take a digital asset inventory. This involves surveying and noting all your digital accounts, log-ins, and data: social media pages and bank log-ins, but also cloud-based drives, and even text messages or DMs. It also means defining exactly what your digital likeness covers. Is it just depictions of you as an adult? Does it include your voice and physical mannerisms? What version of yourself can or cannot be turned into AI?

Some people may want to solicit the services of a digital identity trust, Adote said, which can help manage your online identity and intellectual property. 

Who will help you protect it?

Next: Assign a digital fiduciary and know the (albeit limited) law. A digital fiduciary is a person (or persons) given designated access to your digital assets, including online accounts. You can grant permission to specific assets only, or limit access entirely, through both your will and your fiduciary. You can also provide them with guidance on your digital likeness, which is itself a digital asset, Adote explained.

The boundaries of digital fiduciaries are covered under the Revised Uniform Fiduciary Access to Digital Assets Act (RUFADAA), which has not been passed by every state. Under this law, a person assigned as a digital fiduciary can legally provide or gain access to someone’s online accounts after death or even incapacitation. But only trustees and executors can access the content of those accounts, and only if the person who died consented. Tech companies like Google and Meta also operate under RUFADAA (that’s why we have things like Facebook legacy contacts and memorialized accounts now). If you don’t assign a fiduciary, your accounts default to the tech company’s terms of service.

What will you allow and who will benefit?

Once you’ve assigned a fiduciary, you need to have a direct conversation with them about what they should and should not allow. With your “explicitly written and validated position” on AI use, Adote said, fiduciaries can more easily take legal action, like issuing cease-and-desist letters over intellectual property.

Experts said you can, quite simply, write in your will that you do not consent to anyone creating an AI-generated likeness of you.

You may want to phrase this in terms of “living on in AI form” or the “publication of an AI-generated, synthetic version” of yourself. You may also want to be clear about data usage: “I do not consent to the use of my personal data to create an AI-powered digital likeness of myself.” Adote suggests your will should show clear intent, with phrasing like “I do not authorize my image or likeness to be used in any way, form, or fashion.”

Go over these with an estate attorney, as everyone’s situation and end-of-life needs are different, and state laws vary.

You can also stipulate very precise cases for how your digital likeness can be used, if it’s not a hard no. But be conservative and narrow with this language, other experts suggested. Write down, for example, exactly who is allowed to use or release it, just as you would with other assets or accounts. Explicitly list any charities or companies that are allowed to use your likeness, as well.