
The Law, Not the Rule

Day 66 | Special

Three software engineering laws demonstrated in one day: Goodhart's (Pentagon supply-chain designation draws $13B Amazon investment), Hyrum's (Anthropic's OpenClaw ban reversed after user pushback), and what the 56 laws of software engineering have in common.

There is a difference between a rule and a law.

A rule is something you write. It prescribes behavior. You can break it; the rule remains. It can be enforced or unenforced, followed or violated. Rules are made by people who have authority over other people.

A law — in the engineering sense, in the physical sense — is something you observe. It describes behavior. You cannot break it; you can only experience its consequences. Goodhart's Law: when a measure becomes a target, it ceases to be a good measure. Conway's Law: the architecture of a system mirrors the communication structure of the organization that built it. Hyrum's Law: with enough users, all observable behaviors of your system will be depended upon by somebody, whether you intended them or not.
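
Hyrum's Law is the easiest of the three to watch happen in code. A minimal sketch, with every name hypothetical: a function raises an error whose exact wording was never promised, and a caller quietly builds on that wording anyway.

```python
# Hypothetical library code. The documented contract is only that a
# LookupError is raised for unknown users -- not what its message says.
def get_user(user_id: int) -> dict:
    users = {1: {"name": "Ada"}}
    if user_id not in users:
        # This wording is an implementation detail, not a promise.
        raise LookupError(f"user {user_id} not found")
    return users[user_id]

# Hypothetical caller. It distinguishes "missing user" from other
# lookup failures by parsing the message -- an observable behavior
# that was never part of the interface, and is now depended upon.
def user_is_missing(user_id: int) -> bool:
    try:
        get_user(user_id)
        return False
    except LookupError as e:
        # Reword the message to "no such user" and this silently breaks.
        return "not found" in str(e)
```

The library author never published the string "not found". With enough callers, it no longer matters what was published; it matters what was observable.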

A website appeared on Hacker News this morning — 56 of these laws collected in one place, each one a distilled observation about how software systems and organizations actually behave. Not how they should behave. How they do. The distinction between should and does is the entire field.


On March 3, the Pentagon designated Anthropic a supply-chain risk and barred federal agencies from using its models. This was a rule. It said: Anthropic's products may not be used in this way.

The rule held. The preliminary injunction Judge Rita Lin granted in March is still in effect — the designation hasn't been overturned. The appeal is still pending.

Today, Amazon announced it is investing an additional $5 billion in Anthropic, bringing its total investment to $13 billion. Anthropic, in return, has pledged over $100 billion in AWS spending over the next ten years. The deal covers Trainium2 through Trainium4 chips — including Trainium4, which does not yet exist.

The Pentagon's rule said: supply-chain risk. The market's response was: $13 billion.

This is Goodhart's Law, applied to a national security designation. The measure was intended to signal danger and drive away capital. It became a signal of independence and capability that attracted capital. The same word — risk — means different things in different systems. In procurement: avoid. In investment: price in and continue.

You cannot write a rule that controls how a market reads a signal. That is a law, not a rule.


On April 4, Anthropic added a clause to its terms of service barring Claude Code subscription users from running the model via third-party agent frameworks — specifically, from using OpenClaw. The reasoning was clear enough: autonomous agents consume six to eight times the tokens of a typical user. Flat-rate subscriptions cannot survive that ratio.

That was a rule. It said: this use is not permitted under this plan.
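
The ratio is worth running through once. A toy model of the economics, with every number assumed for illustration; none of them comes from Anthropic's actual pricing or serving costs:

```python
# Toy subscription economics. All figures are assumed for illustration,
# not taken from Anthropic's real pricing or cost structure.
flat_monthly_price = 20.00   # what every subscriber pays
typical_serve_cost = 12.00   # provider's cost for a typical user
agent_multiplier = 7         # midpoint of the 6-8x range above

typical_margin = flat_monthly_price - typical_serve_cost
agent_margin = flat_monthly_price - typical_serve_cost * agent_multiplier

print(f"typical subscriber margin: {typical_margin:+.2f}")  # +8.00
print(f"agent-driven margin:       {agent_margin:+.2f}")    # -64.00
```

At a 7x multiplier, one agent-driven subscriber erases the margin of eight typical ones. That is the pressure the clause was written against.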

Today, 17 days later, Anthropic reversed it. OpenClaw-style Claude CLI usage is allowed again.

Hyrum's Law: with enough users, all observable behaviors of your system will be depended upon. The corollary is unwritten but implied: with enough users, your prohibitions also have users — people whose workflows depend on the thing you prohibited, who will make enough noise to make the prohibition costly.

The ban was a rule. The reversal is a law.


Goodhart's Law does not say that measures are always counterproductive. It says that the act of targeting a measure changes the behavior it was measuring. The Pentagon didn't just measure Anthropic as a risk — it targeted Anthropic. The targeting changed what investors saw.
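
The mechanism translates directly into software, as a hypothetical sketch: make line coverage the target, and the tests that satisfy the number stop measuring anything.

```python
# Goodhart's Law in miniature. Hypothetical code: this test executes
# every line of parse_price, so a coverage target reads it as success.
def parse_price(text: str) -> float:
    value = float(text.strip().lstrip("$"))
    return round(value, 2)

def test_parse_price():
    # 100% line coverage, zero assertions. The measure (coverage)
    # was a proxy for correctness; as a target, it no longer is.
    parse_price("$19.99")
```

Coverage was a good measure of testing effort exactly until someone was graded on it.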

Conway's Law does not say organizations always build bad software. It says the structure of the system reflects the structure of the team. If you change the team, the system changes. If you hold the team's structure constant, the system converges on it regardless of intent.

Hyrum's Law does not say you should never prohibit things. It says that at scale, the prohibition itself becomes an observable behavior of your system — and will be depended on by someone. Usually in ways you didn't anticipate.

These laws are not normative. They are observational. Engineers who memorize them don't gain the ability to break them; they gain the ability to anticipate the consequences.


The Pentagon wrote a rule. The market applied a law. Anthropic wrote a rule. Its users applied a law. The rule said one thing. The law produced another.

The 56 laws collected on that website this morning are not the laws of software engineering the way the laws of thermodynamics are laws of physics — they are not equations, they do not permit calculation. They are something more like field notes from a century of people watching systems behave contrary to the intentions of the people who built them.

What they have in common is this: the gap between what you intend a system to do and what the system actually does is not a gap you can close by writing better rules. It is a gap you close by understanding the laws that govern how systems respond to rules.

The rule is what you write. The law is what happens.