Saturday, March 28, 2009

Entropy in Systems

The concept of entropy in thermodynamics is well known. This article covers the landscape well. In systems, especially systems that involve human and computer interactions, we have a similar notion.

Entropy, historically, has often been associated with the amount of order, disorder, and/or chaos in a thermodynamic system. The traditional definition of entropy is that it refers to changes in the status quo of the system and is a measure of "molecular disorder" and the amount of wasted energy in a dynamical energy transformation from one state or form to another.

So too in our information systems. The effort we have to expend to move a system from one state to another (essentially, to make a modification to it) comes in two parts: the effort spent actually making the change useful, and all the rest of the effort.

This applies both to the change to the system itself and any changes that we make as a result of executing the system. So, for example, if we consider some system fragment, "Visit the doctor because of pain in the shoulder", then any time/effort spent in doing something other than getting the diagnosis/treatment is wasted and increases the system entropy. This time/effort may include (but is not limited to)

  • Getting to the Dr.
  • Filling in new patient paperwork
  • Having payer status checked
  • Explaining symptoms to receptionist
  • Waiting in waiting room
  • Waiting in treatment room
  • Waiting for X-Ray results
  • Driving to radiology lab
  • Filling in radiology lab paperwork
  • Waiting at radiology lab

The point of this is that this "system" is unbelievably wasteful of a resource that is precious (at least to me): my time. From my perspective, every one of the above steps indicates great inefficiency – perhaps because of complexity, perhaps because of a lack of cohesive thinking.

Consider another example, perhaps closer to work for many – the HR portal. It is often the one information system that has been designed to almost entirely ignore the majority of its users, presenting a huge learning curve for most employees. Of course the users who specify the system – the HR department – have it well designed for their own convenience, and for what they believe is the convenience of the employees. I leave you to draw your own conclusions!

So at one level, we have the idea of the system in use with every use increasing the entropy of the system.

Now think about attempting to make a change to a system. A whole new dynamic sets in: the need to understand the system in place so it can be changed. This can involve very detailed analysis – deep understanding. The more pieces there are, and the more interconnections among them, the more understanding there has to be. So, depending on the design of the system in place, a change can have a greater or lesser effect on its entropy. If the system is very involved/convoluted, then understanding as a percentage of the useful work done will be high. If the system is relatively straightforward, that percentage will be lower.

So systems entropy might be thought of as ∑ from i=0 to n of (Wi − Vi), where Wi is the work performed at state change i and Vi is the valuable work performed at state change i.
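The summation above can be sketched in a few lines of code. This is purely illustrative – the activities and the minute values below are hypothetical numbers I've made up for the "visit the doctor" example, not measurements:

```python
# Systems entropy as the sum of wasted effort across state changes:
# entropy = sum over i of (W_i - V_i), where W_i is total work performed
# and V_i is the valuable part of that work.

def system_entropy(state_changes):
    """Total wasted effort: sum of (work performed - valuable work)."""
    return sum(work - valuable for work, valuable in state_changes)

# Each tuple is (total minutes spent, minutes that were actually useful)
# for one step of the (hypothetical) doctor visit.
doctor_visit = [
    (30, 0),   # getting to the Dr.
    (15, 5),   # filling in new patient paperwork
    (10, 0),   # waiting in the waiting room
    (20, 20),  # examination and diagnosis -- the useful part
]

print(system_entropy(doctor_visit))  # 50 wasted minutes
```

Whenever Wi exceeds Vi at a state change, the difference is pure overhead, and it accumulates across every step of the system.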

Defining and normalizing what we mean by work – and arriving at some normalized work-value equation – is, of course, complex. For example, in getting the shoulder diagnosed and treated, the system implicitly values the Dr.'s time as more valuable than mine, so the system is optimized to make sure that the Dr.'s change in entropy is least. What should be happening is that the total change in entropy is minimized.
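To make that concrete, here is a small sketch of the difference between minimizing one stakeholder's wasted effort and minimizing the total. The stakeholders and minute values are again hypothetical illustrations, not data:

```python
# Sketch: minimizing one party's wasted effort can still leave the
# system's total entropy high. All numbers are hypothetical.

def total_entropy(stakeholders):
    """Sum wasted effort (work - valuable work) over all stakeholders."""
    return sum(work - valuable for _, work, valuable in stakeholders)

# (stakeholder, total minutes spent, useful minutes) for one visit.
optimized_for_doctor = [
    ("doctor", 20, 20),    # the Dr.'s time is fully used
    ("patient", 150, 20),  # the patient absorbs all the waiting
]
balanced = [
    ("doctor", 30, 20),    # the Dr. absorbs a little slack
    ("patient", 40, 20),   # far less patient waiting
]

print(total_entropy(optimized_for_doctor))  # 130
print(total_entropy(balanced))              # 30
```

The first arrangement is "optimal" from the Dr.'s perspective, yet the total entropy increase is far larger than in the balanced one – which is exactly the trap of optimizing for a minimal number of stakeholders.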

Typically systems that are overly complex, overly bureaucratic or optimized to support a minimal number of stakeholders will exhibit the greatest increases in entropy under a given state change.

As architects we have a responsibility to be looking out across the landscape of a system as a whole and finding ways of minimizing the increases in entropy across common state changes.


Roger Sessions said...

Good points! However, I disagree that entropy increases with use; it increases with changes to the system. The problem with entropy is that I see no way to predict how it will increase and no way to measure it. That is why I think we are better focused on complexity, which, while it still cannot be measured absolutely, can at least be measured relatively.

- @RSessions (twitter)

Chris Bird said...

Roger, I didn't mean to imply that use increases entropy. However there is an increase in entropy in learning how to use. If a system is not "easy to use" (by whatever definition), then a user will have to spend time learning it. Some of that learning is just "figuring out the system" and is of little real use. So as we develop systems (and not just software), we ought to understand what we are doing to those that have to deal with it.

The Macintosh interface (and its very staunch defenders within Apple) showed us the path: keep the interface consistent (I'm not sure how simple it is, but it is very consistent) and there is less entropy increase as people learn to use new applications, because they can reuse the experience from previous learning.