Is a ChatGPT-Written Will Legal?
At first glance, using AI to draft a will seems like a smart idea. It’s fast, free, and feels modern. You enter a few details, hit submit, and a professional-looking legal document appears. It’s easy to assume that if it looks like a will, it must function like one.
That assumption can lead to serious problems. Legal experts warn that AI-generated wills, while polished in appearance, may not hold up in court. Confident-sounding language does not make a document legally valid. Courts care about accuracy, procedure, and proof.
A will is more than just words on a page—it’s a formal legal instrument with strict rules. Overlooking even one requirement can render it void, leaving state law to dictate outcomes instead of your wishes. Harry Margolis, elder law attorney and author of “Get Your Ducks in a Row,” cautions Americans against relying on AI tools to produce a legally binding will.
That risk grows when AI handles something it cannot truly understand. AI does not know your family history. It does not grasp your local law unless forced through careful prompts. Even then, it cannot guarantee accuracy. That gap is where problems begin.
AI Falls Short With Legal Wills

A valid will must follow execution rules set by law. In the UK, Section 9 of the Wills Act 1837 spells this out clearly. The will must be signed by the testator.
Two witnesses must be present at the same time. They must also sign the document correctly.
AI cannot oversee that process. It cannot confirm who watched whom sign. It cannot stop a mistake in the room. Courts do not forgive these errors. If the execution fails, the will fails, no matter how good the wording sounds.
Jurisdiction adds another layer of danger. AI systems often default to American legal concepts. They do this quietly and without warning. A will based on the wrong country or state law may be invalid from the start.
In testing, AI-generated UK wills failed basic UK execution rules. The errors were corrected only after repeated and very specific prompts. Most users would never catch these flaws before signing.
Then come omissions. AI does not ask follow-up questions the way a solicitor does. It does not pause and say something feels missing. As a result, it may skip assets, ignore stepchildren, or fail to plan for disabled dependents.
These gaps are not minor. They are the main reason wills get challenged. Families end up in court because something important was left unclear or left out entirely. AI cannot protect you from that outcome.
The Hallucination Risk Experts Warn About

Lawyers are allowed to use AI, but only with strict limits. Professional rules make one thing clear. Responsibility never shifts to the software. The lawyer remains fully accountable.
One of the biggest dangers with generative AI is hallucination. That term means the system produces information that sounds right but is completely false. It does not know it is wrong. It simply fills gaps with plausible text.
This problem has already caused serious legal fallout. Lawyers have submitted briefs filled with fake cases created by AI. Courts responded with heavy sanctions and public reprimands. Judges do not treat AI mistakes lightly.
In California, one attorney was fined $10,000 after filing a brief where most citations were fabricated. In another federal case, a law firm faced penalties exceeding $30,000. These incidents are now part of the legal record.
A will may not cite case law, but the risk remains. If AI invents legal language, misstates trustees’ powers, or misunderstands inheritance rules, the damage surfaces later. By then, the person who wrote the will is gone.
That reality makes correction impossible. Courts cannot ask the deceased what they meant. They can only interpret what appears on the page. AI errors then become permanent problems.
The American Bar Association addressed this directly in Formal Opinion 512. It requires lawyers to independently verify any AI-generated content. Blind reliance violates ethical duties of competence and honesty.