The Paradox of Perfect Control in Medical Software

We build crystalline systems for a fluid world. No wonder they shatter.

Medical device software at the intersection of order and chaos

A note on this series: This series shares my software engineering philosophies at the intersection of the regulated life-science sector and traditional software design paradigms. No proprietary projects or algorithms are shared; the insights are based on my personal experience, and the opinions are my own and do not represent the views of my past or current employers.

As a first-principles fanatic, I often reflect on what software truly is, given its ubiquity and importance in today’s society, and the fact that it’s experiencing rapid changes in the era of AI, especially with coding agents.

After some soul-searching and reading, I really liked this framing: “Software is confided intent.” I am sure it has been said or thought of before; still, I really like this distillation because it highlights three core elements of software: 1. it captures the logical thoughts and intentions connected to a task, 2. it is a flexible machine, as opposed to fixed hardware, and 3. it serves as a medium of expression (I know - you thought art and STEM would not come together? Think again) [1].

I encourage anyone reading this to reflect on all three, but in this series I want to focus on and share my thoughts about the first and the second - the intent and the malleable machine. Why? Because the tension between them becomes paradoxical in the rapidly evolving world of SaMD (Software as a Medical Device), one of the hottest topics in healthcare today.

Let me give a tragic example. Does anyone remember the Therac-25 incidents from 1985 to 1987? No? Without revealing my age too much, I certainly read about them. It is one of the best-documented cases in which a software bug - a race condition - led to several deaths from massive radiation overdoses [2]. This case is symbolic: a perfect example of why balancing low-entropy “intent” with an often necessary high-entropy “malleable machine” is so tricky and challenging. Reflecting on it helps us think through the challenges and practical pathways for navigating the SaMD space.
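To make the failure mode concrete, here is a minimal “check-then-act” race sketched in Python. This is my own illustrative model, not the actual Therac-25 code - the names (`SharedState`, `treatment_task`, `operator_edit`) and the attenuator detail are hypothetical stand-ins. A generator’s `yield` marks the preemption point, so the unsafe interleaving can be replayed deterministically:

```python
# Hypothetical sketch of a check-then-act race, loosely inspired by the
# Therac-25 failure mode (operator input racing the treatment task).
from dataclasses import dataclass

@dataclass
class SharedState:
    mode: str = "XRAY"                 # current treatment mode
    attenuator_in_place: bool = True   # hardware guard required for X-ray mode

def treatment_task(state):
    """Yields at its preemption point so another 'task' can interleave."""
    observed_mode = state.mode              # step 1: check shared state
    yield                                   # preemption: another task may run here
    if observed_mode == "ELECTRON":         # step 2: act on a possibly stale value
        state.attenuator_in_place = False   # electron mode doesn't use the attenuator
    yield

def operator_edit(state):
    state.mode = "XRAY"                     # operator switches the mode back

# Interleave the steps: check -> operator edit -> act.
state = SharedState(mode="ELECTRON", attenuator_in_place=True)
task = treatment_task(state)
next(task)              # task observes mode == "ELECTRON", then is preempted
operator_edit(state)    # operator reverts to X-ray before the task resumes
next(task)              # task acts on its stale observation

# Hazardous inconsistency: X-ray mode with the attenuator removed.
print(state.mode, state.attenuator_in_place)   # XRAY False
```

The bug is not in either task alone but in the interleaving - exactly the kind of emergent behavior that a requirements document describing each task in isolation will never surface.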

Let’s dive deeper into both the philosophical and the practical aspects.

The tension between a rigid, codified intent and a flexible, malleable machine exists at the intersection of two incompatible worlds.

The first world is the Regulatory Crystal. It is the world of our intent - a low-entropy space of perfect design controls, traceability matrices, and validation protocols. It’s beautiful, orderly, and, ironically, often too idealistic to be true. The second world is the Clinical Fluid. It is the high-entropy reality where our malleable machine must operate - where nurses use creative shortcuts, patients don’t follow the manual, and legacy equipment creates noisy data. One world demands perfect control; the other laughs at your plans.

I have experienced this struggle many times in my career: no matter how much we try to control the intent, the machine will always find a way to surprise us in the real world. It is important to recognize that this complexity is non-negotiable and grounded in proven theory - W. Ross Ashby’s Law of Requisite Variety states that a controller must have at least as much variety as the system it controls [3]. So you have to deal with this complexity whether you like it or not; on the other hand, I do not think patients or regulators will appreciate us building a system without some level of predictable behavior and control.
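Ashby’s point can be made concrete with a tiny toy model. This sketch is my own construction (the modular outcome rule is an assumption, not from Ashby’s text): a controller that has fewer distinct actions than the environment has disturbance states cannot hold the outcome on target.

```python
# Toy illustration of the Law of Requisite Variety (my own construction).
# Outcome model: outcome = (disturbance + action) % 4; the controller's
# goal is to hold the outcome at 0 for every possible disturbance.

disturbances = [0, 1, 2, 3]   # 4 distinct states the environment can take

def can_regulate(actions):
    """True if, for every disturbance, some available action cancels it."""
    return all(any((d + a) % 4 == 0 for a in actions) for d in disturbances)

# With 4 actions, the controller's variety matches the environment's:
print(can_regulate([0, 1, 2, 3]))   # True
# With only 2 actions, there is a variety deficit and some disturbances
# cannot be compensated:
print(can_regulate([0, 2]))         # False
```

The lesson for SaMD: stripping complexity out of the software does not make the clinical environment any simpler; it only guarantees there are disturbances the system cannot answer.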

Failing to understand these intricacies and contradictions leads to “Compliance Theater” and almost certain failure - in the regulatory submission, in the real world, or both. This is not a story about bad engineering. It’s about a fundamental paradox in medical software: we are required to build perfectly controlled systems for an inherently uncontrolled domain. Imagine a scenario: we create 10,000 pages of the most comprehensive design history for an ECG app. What is the chance that the system fails on patients with pacemakers because “that wasn’t in the requirements”? Be honest - it is much greater than zero.

So, what is the punch line? The solution? Unfortunately, we may all have used up our five minutes of attention span reading this far, so I will keep it short and tie it to the next post, which you can read after recharging your attention battery.

The answer isn’t more control. It’s a new mental model: entropy management. We must stop trying to eliminate “chaos” and instead build systems that can channel it - in a fully compliant and operational manner. We have to accept that our intent must be realized in an often “chaotic” real world; embracing and channeling that chaos is the only way forward.

I hope this is enough of a teaser to get you interested in reading the next post - “The Three-Zone Architecture: Structure for Chaos”.


Next: Part 2 - The Three-Zone Architecture: Structure for Chaos

Footnotes

  1. Dijkstra, E. W. (1982). “On the role of scientific thought.” In Selected Writings on Computing: A Personal Perspective (pp. 60-66). Springer-Verlag. The concept of software as “confided intent” reflects Dijkstra’s view of programming as the art of expressing computational intent.

  2. Leveson, N. G., & Turner, C. S. (1993). “An investigation of the Therac-25 accidents.” Computer, 26(7), 18-41. This seminal case study documents how race conditions in the Therac-25 radiation therapy machine led to fatal overdoses, highlighting the critical importance of software reliability in medical devices.

  3. Ashby, W. R. (1956). “An introduction to cybernetics.” Chapman & Hall. The Law of Requisite Variety states that for a controller to effectively control a system, it must have at least as much variety (complexity) as the system it controls.