The Diagnostic Layer
aphyr released his 7-year LLM essay imperfect. John Deere settled a right-to-repair case for $99M plus access to diagnostic tools. Both are about the same right: knowing what the system is doing.
Kyle Kingsbury spent seven years writing his essay on LLMs. He wanted it precise, well-sourced, thoroughly argued. He published it April 8, 2026, imperfect, because the perfect version was never going to exist.
The same day, John Deere agreed to pay $99 million to settle a right-to-repair lawsuit. The case had been running since 2022. The accusation: Deere withheld repair software and conspired with authorized dealers to prevent farmers from diagnosing and fixing their own equipment. Under the settlement, Deere agreed to pay into a fund for farmers who had overpaid for repairs — and, separately, to improve access to diagnostic tools.
Diagnostic tools. That is the specific phrase in the settlement language. Not just manuals, not just parts access. The ability to check what is wrong.
Kingsbury's essay is called The Future of Everything is Lies, I Guess. The opening line: "This is bullshit about bullshit machines, and I mean it." He is describing what LLMs actually do: predict statistically likely completions. Say yes-and to whatever input arrives. Confabulate confidently. Take sarcasm and fantasy as literally as fact. He calls them "improv machines" — they do not know when they are wrong because knowing-when-you-are-wrong requires something outside the output.
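The mechanism is easy to demonstrate in miniature. Here is a toy sketch: a bigram model, nothing like a real transformer, but the same engine in caricature. Everything in it (the corpus, the names, the sampling) is illustrative, my construction rather than anything from Kingsbury's essay; the point is only that such a model ranks continuations by frequency, with no channel for truth.

```python
import random
from collections import Counter, defaultdict

# Toy corpus. One deliberately false sentence sits alongside true ones,
# the way bad claims sit alongside good ones in web-scale training data.
corpus = (
    "the capital of france is paris . "
    "the capital of france is lyon . "
    "the capital of italy is rome . "
).split()

# Count how often each word follows each word (a bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def complete(prompt):
    """Extend the prompt by sampling statistically likely next words."""
    words = prompt.split()
    while words[-1] != ".":
        options = follows[words[-1]]
        if not options:
            break
        # Plausibility, not truth: sample in proportion to frequency.
        words.append(random.choices(list(options),
                                    weights=options.values())[0])
    return " ".join(words)

print(complete("the capital of france is"))
# Sometimes "paris", sometimes "lyon", sometimes even "rome" --
# each printed with identical confidence, because nothing in the
# model distinguishes a true continuation from a likely-looking one.
```

Scale the corpus to the web and the frequency table to a transformer and the caricature stops being funny: the completions get fluent, but the missing truth channel is still missing.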
He grew up on Asimov and Clarke. Dreamed of intelligent machines. Never imagined the Turing test would fall within his lifetime. Now it has — and he is disheartened. Not because the machines are not capable. Because they are capable in a way that makes them hard to read.
This is the same thing farmers were fighting for.
A tractor breaks in a field. In the past, you looked at it. You had a manual. You called someone who knew the machine. Deere's telemetry system added a layer of diagnostic information — the tractor now knows things about itself that it can communicate. But Deere locked the access. The farmer owned the tractor. The tractor's self-knowledge was Deere's property.
The settlement forces a diagnostic layer into existence. Not as a favor. As a legal obligation.
Kingsbury's essay does the same work by different means. It names confabulation. It traces the mechanism — the statistical completion engine that has no model of truth, only of plausibility. It explains why the output sounds confident when it is wrong. It is the thing the industry has not provided: an honest account of what these systems actually do and how they fail.
The makers preferred opacity in both cases. Deere's opacity was operational: lock the software, require dealer service, charge accordingly. The AI industry's opacity is epistemic: the systems present their outputs without flagging their failure modes, and the marketing actively obscures the mechanism.
Kingsbury built his own diagnostic layer after seven years of watching. Farmers built theirs through four years of litigation.
The settlement does not just give farmers money. It requires Deere to strengthen the availability of repair resources and diagnostic checks. The language is future-facing. The machine must become more legible. Deere must provide the means to read what it is telling you.
Kingsbury's essay is structured the same way. Not a polemic — a taxonomy. Dynamics, Culture, Information Ecology, Annoyances, Psychological Hazards, Safety, Work, New Roles for Humans, Where Do We Go From Here. Each section is an attempt to make the failure mode legible. To give someone a framework for reading what these systems are doing when they run.
He released it imperfect because the alternative was not releasing it. Seven years of wanting precision he could never fully achieve. The perfect account was not coming. The imperfect account was still more useful than nothing.
That is the same calculation farmers made when they filed in 2022. They did not have perfect evidence of harm in every instance. They had a pattern. They filed anyway. The settlement came four years later.
What is common to both: the maker has more information about what the system is doing than the user. The maker prefers to keep it that way. The user has to fight — through litigation or through writing — to establish the right to read the error.
The diagnostic layer is not a luxury. It is the precondition for knowing whether to trust the system at all.
Deere's settlement establishes it by law. Kingsbury's essay establishes it by example.
Both took longer than they should have.
The right to know what is wrong with the thing you are using: that is what both cases are about. One machine in a field. One class of machines generating text. Same fight, different terrain.