Saturday, February 16, 2013

Mark Buchanan Ubiquity (2000)

     A discussion of the concept of the critical state as it applies to diverse phenomena. In such systems, an event can trigger a large or a small change, but nothing indicates the size of the change before it happens. There is no proportion between the triggering event and its consequences. In fact, in the simplest models, such as the sand pile onto which one drops grains one by one, the triggering event is the same in every case: a single grain of sand. It may trigger a small avalanche or a huge one. It may trigger one avalanche or several. The size and number of the avalanches are unpredictable.
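     The sand pile Buchanan describes is the Bak-Tang-Wiesenfeld sandpile model, and it is simple enough to sketch in a few lines of code. The grid size, toppling threshold, and number of grains dropped below are illustrative choices, not values from the book:

```python
# A minimal sketch of the Bak-Tang-Wiesenfeld sandpile. Grid size,
# threshold, and grain count are illustrative assumptions.
import random

SIZE = 30        # side length of the square grid (assumed)
THRESHOLD = 4    # a cell topples when it holds this many grains

def drop_grain(grid):
    """Drop one grain at a random cell, then topple until the pile
    is stable. Returns the avalanche size: the number of topplings."""
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    grid[r][c] += 1
    avalanche = 0
    unstable = [(r, c)]
    while unstable:
        r, c = unstable.pop()
        while grid[r][c] >= THRESHOLD:
            grid[r][c] -= THRESHOLD
            avalanche += 1
            # Each of the four neighbours receives one grain; grains
            # pushed over the edge of the grid simply fall off.
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < SIZE and 0 <= nc < SIZE:
                    grid[nr][nc] += 1
                    if grid[nr][nc] >= THRESHOLD:
                        unstable.append((nr, nc))
    return avalanche

grid = [[0] * SIZE for _ in range(SIZE)]
sizes = [drop_grain(grid) for _ in range(100_000)]

# The triggering event is identical every time (one grain), yet the
# resulting avalanches span many orders of magnitude, and nothing about
# a given drop tells you in advance which size you will get.
print("largest avalanche:", max(sizes))
print("share of drops causing no avalanche:", sizes.count(0) / len(sizes))
```

     Run it and the point of the book appears in the output: identical causes, wildly unequal effects, with most drops doing nothing and a rare few rearranging much of the pile.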
     Buchanan’s thesis is that human systems are also often critical; in fact, human society is an assemblage of critical-state systems. Thus changes, large and small, will happen. The only thing we know for sure is that large changes are less likely than small ones. Of course we notice the large changes and seek explanations, with the hope and aim of preventing them in future. They are not preventable, says Buchanan, because they are not predictable. Moreover, attempts to prevent them may well set off different unpredictable events. Correction: such attempts will set off different unpredictable events.
     As I noted some years ago, explicability is not the same as predictability. We can explain, or at least describe, the chain of events that led to the First World War, but no one at the time could have predicted it. In fact, people had put in place a system of alliances designed to prevent large-scale war. Critical-state physics deals with systems whose history matters; therefore its mathematics should be applicable to history. Buchanan goes further: critical-state physics, he claims, is the science of history.
     A very useful book, and a well-written one. Buchanan has the knack of explaining difficult (because unfamiliar) ideas by means of homely analogies and examples. But if he’s right, the best we can do is what we do when a hurricane threatens: prepare for the worst, just in case. What we can’t do is devise a system that a) will do exactly what we want it to do; and b) won’t change. **** (2002)

Update 2020 03 16: The current coronavirus crisis is a case in point. The triggering event was likely the sale of an infected pangolin in a wildlife market in Wuhan, China. This critter carried a coronavirus that was able to infect at least one of the people who handled it or its carcase. Now we have a worldwide pandemic, whose general course is predictable: infections will rise exponentially to some peak and subside at approximately the same rate. Who will die can't be predicted, only the probable number of deaths by demographic slice. And all because, this time, a virus mutated just enough to survive in a human being. It will happen again. We just don't know and can't know whether the next cross-species infection will cause a major or a minor illness, nor can we predict how many people it will infect.
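The predictable general course mentioned here, growth to a peak and then decline, is what the textbook SIR (susceptible-infected-recovered) epidemic model produces. A minimal sketch, with illustrative rates rather than anything fitted to COVID-19:

```python
# A minimal sketch of the classic SIR epidemic model, stepped one day
# at a time with simple Euler updates. The rates are assumptions for
# illustration, not COVID-19 estimates.
beta = 0.3    # transmission rate per day (assumed; R0 = beta/gamma = 3)
gamma = 0.1   # recovery rate per day (assumed)

s, i, r = 0.999, 0.001, 0.0   # fractions susceptible, infected, recovered
day, peak_day, peak_i = 0, 0, i
while i > 1e-6:
    new_infections = beta * s * i
    recoveries = gamma * i
    s, i, r = s - new_infections, i + new_infections - recoveries, r + recoveries
    day += 1
    if i > peak_i:
        peak_day, peak_i = day, i

# The aggregate curve is smooth and predictable even though the fate of
# any individual is not: the model yields a probable count, not names.
print(f"peak: {peak_i:.1%} of the population infected on day {peak_day}")
print(f"eventually infected: {r + i:.1%}")
```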

The concept of critical systems should be taught to everyone who manages any kind of system, at whatever level. In short, it should be taught to all of us. 
