AI For Law Firms

ChatGPT, LLMs, And AI In Law Firms

March 25, 2024

Commentators everywhere are predicting that AI and large language models (LLMs) will replace paralegals, lawyers, teachers, writers, artists, and everyone else. At SW&L, we try to be forward-thinking and test various tools, including AI, to deliver the best client service we can. We’ve used ChatGPT, LLMs, and other AI in actual practice. We thought we’d help separate the real from the speculation and report what we’ve learned so far.

LLMs Are Dangerous As A General Legal Reference Machine. Remember that ChatGPT and LLMs in general are text prediction programs, not reference tools. This means they’re really, really good at identifying commonalities of language (including law) and reducing them to text. The more common the law, the more likely the right text will be predicted. It also means they’re really, really bad at zeroing in on actual statutes and caselaw, especially if the source is unique. For example, North Dakota’s Noncompete Law is fairly unique compared to common law and other jurisdictions. If you ask ChatGPT to explain ND’s Noncompete Law, it very often blends ND’s law (NDCC § 9-08-06) with common law from around the United States. In other words, it’s often wrong. Worse, it’s confidently and stubbornly wrong. For questions about case law, it’s worse than useless: it blends holdings and case names to make up holdings. It’s flat-out dangerous to use ChatGPT and other LLMs as general legal research tools.

LLMs Are Better As A Specific Reference Machine. ChatGPT+ now offers the ability to create customized versions of ChatGPT by uploading documents into its knowledge database and providing custom instructions. Inquiries then search that data set pursuant to those instructions. For these inquiries, it’s a much better reference machine for the documents uploaded. It is essentially reading the document and having a conversation with you about it. The documents can be statutes, cases, or the like. A few caveats, though. First, you don’t always know whether it’s limiting its answer to the knowledge database or drawing on its larger training data, and as discussed above, referencing the general data is dangerous. Second, you’ll absolutely need to double-check answers to all inquiries. In this respect, it becomes a sort of overpowered (but rather untrustworthy) index.
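The idea of keeping the model tethered to a specific source can be sketched in code. This is only an illustration of the technique, not any particular product’s API: the helper builds a chat request that pastes the statute text into the prompt and tells the model to answer from that document alone, so a human can still verify every answer against the quoted passage.

```python
# Sketch of grounding an LLM query in a specific document rather than the
# model's general training data. The function name, instruction wording, and
# sample statute snippet are illustrative assumptions, not a vendor's API.

def build_grounded_prompt(document_text: str, question: str) -> list[dict]:
    """Assemble a chat request that restricts the model to the supplied
    document and asks it to quote its source so the answer can be checked."""
    system = (
        "You are a legal research assistant. Answer ONLY from the document "
        "provided below. If the document does not answer the question, reply "
        "'Not addressed in the provided document.' Quote the relevant passage "
        "so the answer can be verified. Do not use outside knowledge."
    )
    return [
        {"role": "system", "content": system},
        {
            "role": "user",
            "content": f"DOCUMENT:\n{document_text}\n\nQUESTION:\n{question}",
        },
    ]

# Example: ground the conversation in the statute you actually care about.
statute = "N.D.C.C. 9-08-06: Every contract by which anyone is restrained ..."
messages = build_grounded_prompt(statute, "When is a noncompete enforceable?")
```

The resulting `messages` list is what you would hand to whatever chat-completion endpoint you use; the point is that the authoritative text travels with the question instead of being left to the model’s memory.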

LLMs Are An A+ Brainstorming Machine. ChatGPT and other LLMs excel at brainstorming, on anything from causes of action, defenses, and affirmative defenses to discovery requests and contract provisions. Again, you need to double-check references on any ideas, but it’s really good at giving you ideas to chase down.

And A Very Good General Explainer. Ask it to explain “Res Judicata” and it’s going to give you a pretty good start. It won’t give you references, but you can piggyback on its keywords to go find caselaw.

Custom Instructions Help. My customized instructions serve a couple of functions. First, I use them to cut down on the hedging and qualifying: I know the dataset’s limitations, and I know it’s not an attorney. Second, I use them to dial in preferences for what I use it for. I want details. I want suggested solutions. I want contrarian ideas. Experiment with custom instructions to dial it in.
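As one hedged illustration, a custom-instruction block reflecting the preferences above might read something like this. The exact wording is yours to write; this is only a sketch:

```text
You are assisting a practicing attorney. The user already knows you are not
a lawyer and knows your training data has limits, so skip those disclaimers.
Preferences:
- Give details, not high-level summaries.
- Always propose concrete next steps or suggested solutions.
- Offer at least one contrarian or opposing view on every question.
```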

Don’t Ask It To Draft A Contract. Ask it to draft a contract, even something as simple as a residential lease, and you’ll get something so generic it causes more problems than it solves. Yes, it’s a lease. No, it won’t help you with anything beyond the most basic terms.

Do Ask It To Help “Normalize” A Contract. We’re big believers that contracts that can be understood by the parties (not just the attorneys) are best because they reduce ambiguities and disputes. ChatGPT is pretty good at translating law-speak to regular-person-speak. That tracks, too. A text prediction program is going to use words that people usually use in ways they usually use them.

Prompt Engineering Is Overrated. Once you have custom instructions dialed in, elaborate prompt engineering adds little. I’ve found a conversational approach works better anyway.

Specific Tools Are So-So. We’ve used tools marketed specifically to lawyers like CoCounsel and Westlaw’s AI-assisted research. They’re hit-and-miss. They can be very helpful in very specific circumstances, or they can completely whiff.

Here are my predictions:

AI Will Be Essential To Legal Practice. Each iteration of ChatGPT (and other LLMs) gets better and better. It stands a good chance of reducing some types of legal drudgery. It seems to me it will be effective at going through voluminous discovery and identifying trends and commonalities.

I’m Hopeful It’ll Help Increase Access To Justice. Unfortunately, economics often determine whether an attorney can assist. Obvious wrongs go unaddressed because they cost too much to pursue. AI tools might reduce the workload for attorneys and make resolution of those disputes economical. We’d love to be able to use AI tools to help people who would otherwise not have access.

Good Writing Will Become Scarce (And Valuable). Good writing is good thinking. The process is every bit as important as the product. ChatGPT and other LLMs are bland writers. Human writing has a uniqueness to it. Doing the hard work of thinking and writing will become scarce and valuable.

ChatGPT Won’t Replace Lawyers. Lawyers live in nuance, ambiguity, and the gray area. If it were easy, people wouldn’t need us. ChatGPT and LLMs really struggle in these areas. To me, it’s pretty clear why. Again, it’s a text prediction program. It’s predicting the most likely next word based on its dataset. It is not living in the edge cases where lawyers are needed. Given the foundational nature of what it is and how it does its thing, I just don’t think it’ll ever replace lawyers. If it does resolve a certain type of conflict, it’ll just push the nature of the dispute further to the edge. Or it will initiate a new type of arms race between large language models. It’s not going to resolve human conflict altogether. And where there’s human conflict, there will be lawyers.

This article was written by a human. If you have any need for a human lawyer, please contact us.

Disclaimer.