July 31, 2025
Don Walsh
For most of my career, I have encountered plenty of clients who challenged my legal advice based on conversations with friends or neighbors, or on stories they have read online. My usual response to those comments is that “I will bet my law degree against that person’s law degree any day of the week.” People rarely appreciate the nuances that exist in every situation, or how the objectives of the litigants in other cases, not to mention the jurisdictions, laws and judges involved, may be very different from their own.
I now find that those challenges to my advice are sometimes based on people relying on AI for legal advice. Although that may seem like a more reliable version of an internet search and a means of crafting a solid legal solution, I seriously doubt that your AI provider has spent three grueling years of schooling learning the very specific, ever-shifting law of your jurisdiction, matched it against the nuanced proclivities of your local courts, spent 30 years helping people with very similar problems, examined your individual objectives, weighed the pros and cons of your situation, and proposed alternative means of accomplishing your goals.
There are some things in the world that AI may be aptly designed to handle: drafting a cover letter for a resume, creating simple content for a website, or even composing sappy love letters for your significant other. Providing legal advice or offering solutions to legal problems is not among them. Most attorneys appreciate and understand that ChatGPT and similar AI programs do not always offer complete solutions and can be very superficial in their analysis of a problem. We also see daily articles about attorneys and litigants being sanctioned for relying on AI-hallucinated arguments or case law.
In a recent article in Futurism found here, the author discusses another reason not to use AI for your legal issues: there is no privilege protecting the communications. If litigation ensues, your search history and the prompts you used have no protection against discovery by your opposing party. From a litigation standpoint, I can tell you that I have obtained search histories in commercial litigation before, and they have proven instrumental in demonstrating the improper objectives and illegal motives of opposing litigants. I can only imagine how effective such evidence would be in family law squabbles.
For the same reasons the medical community has been telling everyone for years that “Google is not a doctor,” the attorneys of the world are now screaming the same thing about the legal advice you may be hoping to obtain through ChatGPT. As Jessee Bundy of the Creative Counsel law firm aptly put it in the article cited above, using AI for legal advice is “playing legal Mad Libs.” I could not agree more.