Self-Represented Litigants Turn to AI as Legal Shortcut — With Mixed Results

When Lynn White fell behind on rent at her Long Beach trailer park and faced eviction, she fought the case in court with help from a local tenant advocacy attorney — and lost. Determined to appeal, she decided to try something unconventional.

She turned to artificial intelligence.

White, who runs a small music production business and had previously used AI tools to create promotional videos, began feeding court documents and case details into generative AI platforms. She asked the programs to analyze procedural rulings, identify possible errors and draft arguments for her appeal.

“It felt like having an on-demand research assistant,” White said. “I never could have managed all that paperwork and legal language on my own.”

Months later, White succeeded in overturning her eviction notice, avoiding tens of thousands of dollars in penalties and unpaid rent. She credits AI with helping her organize her case and better understand court procedures.

White is part of a growing number of self-represented, or “pro se,” litigants turning to generative AI to help them navigate the legal system. With tools available for as little as $20 a month — or even free — some are skipping attorneys entirely and relying on algorithms to draft motions, explain statutes and outline strategies.

A Growing Trend in Courtrooms

Legal professionals across the country say they are seeing an increase in court filings that bear the hallmarks of AI assistance.

“I’ve seen more pro se litigants in the last year than in most of my career,” said Meagan Holmes, a paralegal at a Phoenix law firm. “You can usually tell when a document has been generated by AI.”

While some litigants report success in small claims or procedural disputes, attorneys warn that AI-generated filings often contain inaccuracies — some of them serious.

One of the most common issues is fabricated case law. AI systems occasionally generate citations to court decisions that do not exist, a phenomenon known as “hallucination.” In other instances, real cases are cited but misrepresented.

Damien Charlotin, a legal researcher who tracks judicial decisions involving AI-related filing errors, has documented hundreds of cases worldwide since 2023 in which courts directly addressed AI misuse.

“It’s accelerating,” Charlotin said. “The most common problem is fabricated case law, but misrepresented precedent is harder to detect and can be just as serious.”

Judges have issued sanctions in some cases, including fines and community service requirements, when litigants — and even licensed attorneys — submitted filings containing AI-generated errors.

Risk and Responsibility

Several AI companies include disclaimers stating their platforms should not replace professional legal advice. Yet when users ask legal questions, chatbots generally provide detailed responses with only minimal cautionary notes.

Legal experts say the responsibility ultimately falls on the user.

“Just like a law firm would double-check the work of a junior associate, you have to verify anything AI produces,” said one attorney who has reviewed AI-assisted filings. “Courts don’t excuse inaccuracies because a litigant used a chatbot.”

Self-represented litigants who rely heavily on AI may face consequences if filings contain false or misleading information. Judges have warned frequent filers against submitting repetitive or frivolous motions generated by automated tools.

A Tool — Not a Replacement

Despite the pitfalls, some legal professionals remain cautiously optimistic. Pro bono legal clinics in several cities are now teaching self-represented litigants how to use AI responsibly — focusing on research support, document organization and fact-checking techniques rather than legal advice.

“This is a transformative moment,” said Zoe Dolan, a supervising attorney at a Los Angeles-based nonprofit legal advocacy organization. “AI can expand access to information. But it has to be used carefully and critically.”

Even attorneys who caution against overreliance on AI admit they use it themselves — as a brainstorming or research aid.

“AI is helpful as a starting point,” said Andrew Montez, a Southern California attorney. “But it’s never a substitute for legal judgment, experience and manual verification.”

Leveling the Playing Field?

For litigants like White, AI represents something else entirely: access.

“AI gave me drafting help and organizational skills I couldn’t afford otherwise,” she said. “It felt like David and Goliath — except my slingshot was technology.”

As courts grapple with a surge of AI-assisted filings, one thing is clear: generative AI is reshaping how some Americans approach the legal system. Whether it ultimately empowers litigants or overwhelms courtrooms may depend less on the technology itself — and more on how responsibly it’s used.

The Hemet & San Jacinto Chronicle
