The warning was always there. We just filed it under "future problem."
A pre-print research paper—"Access to Justice in the Age of AI: Evidence from U.S. Federal Courts" by Anand Shah and Joshua Levy—has quietly documented something the AI industry's access-and-democratization narrative doesn't account for: unintended consequences at institutional scale. Drawing on more than 4.5 million federal civil court cases spanning 2005 to 2026, the researchers found that pro se litigation—cases filed without an attorney—held stable at 11 percent for years. After large language models became widely available in late 2022, that figure climbed to 16.8 percent by 2025. Plaintiff-side filings nearly doubled.
AI made it easier to sue. So more people sued. And the system absorbing those suits was not consulted on the plan.
A 158 Percent Workload Spike the Courts Cannot Handle
The case count is the surface story. The workload number is the one that should concern you.
Intra-case activity—total motions, filings, and docket entries within individual pro se cases—is up 158 percent from the pre-AI baseline. These aren't just more cases. They're heavier cases, generating more judicial labor per filing than the professionally represented cases that preceded them. AI helps a self-represented plaintiff draft not just the initial complaint but the motions, the procedural arguments, the follow-on documents. The output looks credible. The burden on the judge reviewing it is real.
Federal courts cannot decline to hear cases. They cannot create judges on demand. As the paper states plainly, "there is no easy margin along which to 'buy' extra judge capacity." The backlog that already existed as a structural feature of the federal judiciary is now absorbing a load it was never designed to handle—and the load is growing.
Good Intentions Don't Insulate Systems From Pressure
The access-to-justice argument is legitimate. Legal representation in the United States has always been rationed by wealth. A person with a real grievance and no money has historically had limited recourse. AI genuinely changes that calculus, and that is not nothing.
But co-author Joshua Levy's framing deserves to be quoted directly: "The door to the courts opens wider but maybe the queue to enter gets longer." Wider access through an overwhelmed system is not the same as justice. It is access to a waiting room.
The paper also flags a quality dimension that compounds the problem. AI-assisted filings are not uniformly competent. Hallucinated case citations—AI-generated references to cases that don't exist—have already resulted in judicial sanctions against licensed attorneys. The same error pattern, applied at scale by self-represented plaintiffs who may not know to verify what their AI produced, multiplies the burden on judges tasked with sorting legitimate arguments from procedurally generated noise.
The Pattern Will Repeat Everywhere AI Lowers the Cost of Entry
This is not only a courts story. It is a preview.
Anywhere that credentialed intermediaries previously filtered access to a complex system—healthcare, financial services, education, regulatory compliance—AI will reduce the cost of entry and increase the volume of demand. In each case, the question that goes unasked in the product launch announcement is the same one now confronting federal judges: what happens to the system on the other side of the door when significantly more people walk through it?
The answer, in the courts, is a 158 percent workload increase with no corresponding increase in capacity.
For marketing and growth leaders building AI into their operations, this is the cautionary frame worth internalizing. Efficiency gains at the point of entry do not automatically distribute through the system. Removing friction in one place relocates it somewhere downstream—often somewhere less visible, less resourced, and less prepared.
Levy's proposed solution—allow judges to use AI for their own routine, template-driven work while preserving human judgment for decisions—is sensible. It is also the principle that should govern any serious AI implementation strategy: automate what is genuinely routine, protect what requires judgment, and be honest about where the pressure goes when you speed up one part of a connected system.
AI did not create the access-to-justice gap. It exposed how fragile the infrastructure around it actually was.
That lesson applies well beyond the courthouse.
Source: 404 Media, April 27, 2026, reporting on "Access to Justice in the Age of AI: Evidence from U.S. Federal Courts" by Anand Shah and Joshua Levy
AI is accelerating faster than most institutions—and most businesses—are prepared for. The team at Winsome Marketing helps growth leaders think through not just what AI can do, but what it does to the systems around it. Let's talk.
Writing Team