July 16th, 2025

The Fundamental Failure-Mode Theorem: Systems lie about their proper functioning

I have on occasion referred to Le Chatelier’s Principle for complex systems, as presented by John Gall in the book Systemantics: “Every complex system resists its proper functioning,” meaning that whenever you make a change to a complex system, parts of the system work to counteract and possibly even neutralize that change. If you add a notification feature so that everybody associated with a pull request receives an email every time that pull request changes, what typically happens is that people create rules to auto-delete those notifications, and the resulting system is no different from where it started, except that it’s more wasteful.
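
As a toy illustration, here is a minimal Python sketch of the counteracting half of that system: a mail rule that silently discards the new notifications, restoring the original equilibrium. (The subject-line tag and function name are hypothetical, made up for illustration.)

```python
# Hypothetical inbox rule: the part of the system that counteracts
# the new notification feature.
def apply_inbox_rule(message: dict) -> str:
    """Return 'delete' for pull request notifications, 'keep' otherwise."""
    if "[pull request]" in message.get("subject", "").lower():
        return "delete"  # nobody reads these; the feature is neutralized
    return "keep"

# Net effect: the same inbox as before the feature shipped, plus the
# wasted effort of generating and then deleting all that mail.
print(apply_inbox_rule({"subject": "[Pull Request] PR updated"}))  # delete
```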

The Fundamental Failure-Mode Theorem says that every complex system is running in a failure mode somewhere. There is always something that is not working, but you usually don’t notice because other parts of the system are compensating for it.
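
To make the compensation mechanism concrete, here is a minimal Python sketch (all class and value names are hypothetical) of a system running in a failure mode that nobody notices: the backend is dead, but a caching layer in front of it keeps serving stale answers, so the system as a whole appears healthy.

```python
class FlakyBackend:
    """A component that has quietly stopped working."""
    def fetch(self, key: str) -> str:
        raise ConnectionError("backend has been down for weeks")

class CachingFrontEnd:
    """A compensating layer that masks the backend's failure."""
    def __init__(self, backend: FlakyBackend):
        self.backend = backend
        self.cache = {"widget": "stale but plausible value"}

    def fetch(self, key: str) -> str | None:
        try:
            value = self.backend.fetch(key)
            self.cache[key] = value
            return value
        except ConnectionError:
            # Compensation: serve stale data instead of surfacing the
            # failure. The system as a whole appears to work, which is
            # exactly why nobody notices the backend is in failure mode.
            return self.cache.get(key)

system = CachingFrontEnd(FlakyBackend())
print(system.fetch("widget"))  # "works", so the failure goes unreported
```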

I ran into the Fundamental Failure-Mode Theorem many years ago when I was trying to accomplish an unfamiliar task X, and the documentation suggested that I use a particular tool. When I ran the tool, it said, “Before you can do X, you must do Y.”

I found the instructions on how to do Y, and they said that doing Y takes four hours.

Four hours later, Y was complete, and I went back to run the tool. This time, it gave a different message.

“Sorry, this tool does not support X.”

(Fortunately, most of that four hours was spent waiting around, so I was able to get other stuff done in the meantime.)

Bonus chatter: I ran into another case of this just the other day.

I asked an app’s built-in AI chatbot, “Please frob the widget.”

It replied, “Got it. If you need help with anything else, just let me know!”

I checked on the widget. It wasn’t frobbed.

“You said that you frobbed the widget, but it is still unfrobbed.”

The AI chatbot replied, “Thanks for pointing that out. I don’t have the ability to frob widgets. However, I can help you frob it yourself. (instructions follow)”

Bonus insult: The instructions told me to click on buttons that don’t exist. I went eight rounds with the chatbot trying to get good instructions and eventually gave up. It asked me if I wanted to submit feedback. I said yes. The instructions it gave me for submitting feedback also didn’t work.

I think it’s called a chatbot because its primary goal is to chat, not to solve problems.

