"How long before we see our first case against a computer for the unauthorized practice of law?" I half-quipped to a judge sitting next to me.
In reply I received a look of shock melting to concern followed by horror. Since it came during a lunchtime judicial roundtable between members of the bench and bar, I probably should not have been quite so flippant just as the judge bit into a sandwich.
Nonetheless, it is a question the legal profession must answer as artificial intelligence becomes an ever-growing presence in the industry. In a field notoriously slow to adapt, change is arriving faster than legal professionals can keep up with it. If that trend continues, it threatens the entire system.
Colorado had one of the most high-profile cases involving AI when attorneys for Mike Lindell, the "My Pillow Guy," submitted a brief written by AI as their own work. Unfortunately, the AI was long on artificial and short on intelligence. The brief included citations rife with errors, misquoted court precedents and applied inapposite legal principles.
In the end, the judge issued a $3,000 sanction for each of the attorneys.
While my schadenfreude faded quickly, my concern for the profession continued to grow. As in every industry over the past few years, the use of AI and its consequences cannot be ignored. That is not just a Colorado problem, but one for jurists nationwide.
In Alabama, a federal judge took a far less lenient approach. She disqualified three attorneys and referred them to the state's attorney disciplinary committee. In doing so, the judge declared that use of fabricated legal authority "demands substantially greater accountability than the reprimands and modest fines that have become common as courts confront this form of AI misuse … As a practical matter, time is telling us – quickly and loudly – that those sanctions are insufficient deterrents."
Should those lawyers be disbarred, or even suspended, it will send shock waves throughout the industry.
Yet even that will not change the direction of change. The very environment in which the legal ecosystem exists guarantees that outcome. It is made from a huge web of complex, interconnected principles and cases that lawyers have spent centuries parsing like magicians. The cost has always been time: it is stereotypical to hear about lawyers, particularly young lawyers at big firms, working 80-, 90- and 100-hour weeks.
Like a siren, AI calls to those legal professionals buried under mountains of statutory precedent and case law.
That is not to say that courts should bar the use of AI entirely. First, such a dictate would be nearly unenforceable. Second, it would deny both lawyers and lay people a valuable tool. The key is to lay groundwork to ensure the tool is not misused.
For example, I teach a law course where I encourage professionals to use AI for legal research. The students in my class often work full-time jobs while carrying full course loads. As MBA students, they do not need to develop the same expertise that law students must. But they should know how to quickly and easily access legal precepts applicable to situations they might encounter in their jobs.
For that, AI provides an excellent starting point.
A quick search helps to provide the basics. Refining searches can develop more detail. In less than half an hour, a student might have a foundation to build from. But I also point out the limitations. Due to the complexity and subtlety inherent in case law, it is easy for AI to become confused. Pulling up cited sources and skimming through them is a necessary step.
It also seems to be the step that seasoned lawyers have been skipping.
Apparently neither the My Pillow mouthpieces nor the Alabama-via-Mississippi advocates took the time to review their citation formats, much less the actual underlying cases or quotations. That made the errors easy for a trained eye to catch, and all the more glaring once a judge actually attempted to use a citation in a ruling.
As sanctions mount, lawyers will surely adjust. Putting your license on the line to save a few hours of work simply is not worth it.
But what about other people? What will courts do when pro se parties begin submitting pleadings generated by AI? Those folks are not subject to the same standards as a lawyer. They do not have the same professional ethical responsibilities. Courts have policies that afford them more leniency.
It is not difficult to imagine someone headed to court deciding that hiring a lawyer is too expensive, especially when AI can provide a quick, easy, more affordable alternative. Anyone with access to a computer and the internet could put together what appears to be a legal brief.
Who is to blame when something goes wrong then? Does OpenAI have a malpractice policy? Is ChatGPT subject to an unauthorized practice of law complaint? Could a defendant blame Gemini for ineffective assistance of counsel?
I will be sure to avoid food altogether next time I put questions like that to anyone wearing a black robe for a living.

Mario Nicolais is an attorney and columnist who writes on law enforcement, the legal system, health care and public policy. Follow him on BlueSky: @MarioNicolais.bsky.social.
