For Chief Justice Roberts, the Year-End Report on the Federal Judiciary is less a serious assessment of the state of the federal courts than a taxpayer-funded blog post where he can express his disdain for the American people.
You might suspect that the design of an annual report of the federal judiciary would involve providing the American people with some sense that the Chief Justice of the United States grasps the issues facing the courts and, ideally, has some sort of plan for addressing them. After all, that’s the whole point of any annual report: to provide stakeholders with a sense of the successes and challenges facing an entity. It’s why a corporate 10-K can’t just decline to mention that the CEO is now wanted by Interpol.
While the federal judiciary in 2023 found itself beset by ethical scandals from top to bottom, jurists abandoning any sense of professionalism and decorum, a forum shopping crisis sparked by the lack of reform to the nationwide injunction procedure, and a criminal defendant openly attacking the judicial process and inspiring violent threats against federal judges, John Roberts addressed… none of these.
Every year, I use the Year-End Report to speak to a major issue relevant to the whole federal court system.
No, he does not.
Two reports ago, when the judiciary faced a massive recusal scandal and horrifying allegations of workplace harassment, Roberts blew off these concerns with vague hand-waving about having more training webinars and otherwise chided the public for daring to question the courts.
Alas, this would mark the last time he even tried to use the report for its intended function.
The next year, he used the report to tell a historical anecdote about a heroic judge born in 1904 — unintentionally highlighting that it’s hard to find a more recent laudatory example — and simply refusing to acknowledge anything that actually happened in 2022.
So, back to the present report, what’s the Chief’s “major issue relevant to the whole federal court system” for 2023?
As 2023 draws to a close with breathless predictions about the future of Artificial Intelligence, some may wonder whether judges are about to become obsolete.
Are you kidding me?
There may be serious concerns about whether judges are about to become obsolete, but that has less to do with AI than with administrations flooding the courts with unqualified judges and the highest court in the land being stocked with hacks who use their public duty to collect luxury vacations.
Apparently, Roberts saw the report this year as an opportunity to tickle the clickbait impulses of legal reporters eager to spill a few hundred words speculating about AI rather than focusing on the Chief’s silence on the most pressing issues undermining the legitimacy of the third branch.
This is the opening paragraph of the report.
Sometimes, the arrival of new technology can dramatically change work and life for the better. Just one century ago, for example, fewer than half of American homes had electricity. During the New Deal, the federal government set out to “bring the light” to homes across rural America. Representatives recruited farmers to join electricity co-operatives for $5 each. Then came teams of men to clear the brush, sink the poles, and wire homes to the still inert grid.
Oh. I see what you did there. Rather than open with almost any tale of technological advancement, Roberts subtly reminds us of the sort of life-improving public infrastructure project that his Court would strike down with extreme prejudice.
Trolls gonna troll.
Roberts then devotes several pages of the 2023 report to the history of typewriters and personal computers. Gut check time: is this the sort of paragraph that you as a citizen want to read in a 21st century annual report on the most relevant issues facing the judiciary?
The transition to more modern forms of document production began 150 years ago, with the appearance of the Sholes & Glidden Type Writer, first manufactured in 1873 and famous shortly thereafter as the Remington.
Most judges still wrote their drafts by hand, but the typewriter became an important tool in the dissemination of judicial opinions both internally and to the outside world. In 1905, Justice David Brewer somewhat ungenerously referred to his law clerk as “a typewriter, a fountain pen, used by the judge to facilitate his work.” Until the invention of the Dictaphone, law clerks of this vintage also had to take dictation, and at least one otherwise well qualified law clerk lost his job due to “lack of stenographic knowledge.”
This man is deeply unserious.
Before Roberts released the report, Gabe Roth of Fix the Court sent around a prebuttal outlining the organization’s proposals for dealing with judicial ethics because OBVIOUSLY the report would focus on ethics after the bombshells of 2023. Roth didn’t count on Roberts staring at his duty to address legal ethics and channeling one of those scriveners of old by declaring, “I would prefer not to.”
While obviously of secondary importance, the report is also a piss-poor analysis of artificial intelligence. Roberts exhibits the sort of mealy-mouthed and non-committal analysis of the subject that might’ve been excusable in an article about legal AI from 2014 but not 2024.
Proponents of AI tout its potential to increase access to justice, particularly for litigants with limited resources. Our court system has a monopoly on many forms of relief. If you want a discharge in bankruptcy, for example, you must see a federal judge. For those who cannot afford a lawyer, AI can help. It drives new, highly accessible tools that provide answers to basic questions, including where to find templates and court forms, how to fill them out, and where to bring them for presentation to the judge—all without leaving home. These tools have the welcome potential to smooth out any mismatch between available resources and urgent needs in our court system.
But any use of AI requires caution and humility. One of AI’s prominent applications made headlines this year for a shortcoming known as “hallucination,” which caused the lawyers using the application to submit briefs with citations to non-existent cases. (Always a bad idea.) Some legal scholars have raised concerns about whether entering confidential information into an AI tool might compromise later attempts to invoke legal privileges. In criminal cases, the use of AI in assessing flight risk, recidivism, and other largely discretionary decisions that involve predictions has generated concerns about due process, reliability, and potential bias. At least at present, studies show a persistent public perception of a “human-AI fairness gap,” reflecting the view that human adjudications, for all of their flaws, are fairer than whatever the machine spits out.
If AI had a sense of shame, ChatGPT would be embarrassed by this level of superficiality.
If Roberts had a sense of shame, he should be too.
Earlier: Chief Justice Wants You To Know He Has The Utmost Contempt For You
Chief Justice’s Annual Report Recounts 65-Year-Old Tale Of Judicial Heroism To Remind You There Isn’t Any Today
Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter if you’re interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.