Pundita knew she was biting off more than she could chew with the Stuck at the Intersection of Government and the Mass Age essays. But how much I'd taken on wasn't evident until I received a letter in response to the second essay, one that began with a discussion of the Second Law of Thermodynamics.
Well, if you start something, you must tend to it. So I caution in the strongest terms that Guru David's First Law of large-scale systems design should not be analogized to the Second Law of Thermodynamics, because entropy is not the same thing as heavy use of a system causing it to crash (versus "wind down").
If the new reader is bewildered ("I thought this was a blog about foreign policy"), rest assured the connection between this discussion and US foreign policy (and the foreign policy of other developed nations) is profound. Yet the connection has received virtually no attention in the world's press, and so the general public is unaware of the issues involved. This is having tragic and horrific consequences, worldwide and right here in the USA.
So while I am not involved in government planning and not credentialed in any field connected with large-scale systems design, I'm trying to develop a language and context for discussion that the general reader can understand.
Here is the specific connection between large-scale systems engineering and foreign policy: The world's superpower nation (the USA) is expected by other nations to lead the way in solving huge problems that involve large numbers of people and highly complex social/economic systems. Eradicating extreme poverty in Africa is one example among many.
However, the people who are the decision makers--the ones in Congress and the federal government who are charged with releasing funds, determining priorities, and overseeing projects--are not qualified to determine whether a project is a good one. There are virtually no exceptions to this rule.
The reason for this is twofold. First, the people who do have the qualifications (e.g., city planners with engineering degrees) are not in the decision-making role.
Second, even those with the engineering credentials in a specific field are generally not qualified to analyze the problems that can arise from successful application of a solution to a large population. Why is that? Because the formal, organized discipline for that kind of analysis is still being born.
Right now that kind of analysis is scattered loosely across many disciplines. But the concept is not new. Say that you want to build a soccer field in your town. Somebody on your planning board says, "We should hire a consultant to help us project what could happen if the field is so popular that it generates gridlock on the roads leading to it."
But once you get into doing these kinds of projections, you're talking serious money. What about the drain on your town's water supply and trash pickup services?
So at some point, somebody mentions that if you keep doing all these studies, you won't have the money to build the soccer field. From that point on, the time-honored guess-and-by-golly method of problem solving takes over:
Build the blasted field. If it's not that successful, well, that means you didn't need all those high-priced studies. If it is so successful that it creates traffic jams and a strain on municipal facilities--well, cross that bridge when you come to it, which hopefully is years after everyone on the planning board is retired.
In the grand scheme, that time-honored approach is okay. Because if there is catastrophic failure of municipal systems due to the success of the soccer field, this only affects a tiny region of the country--and the failure is too small to be part of the global domino effect.
The same cannot be said for catastrophic failures that are imposed on wide regions and large populations. So we're in a phase now where the guess-and-by-golly method doesn't cut it. Not if you're a superpower nation charged with devising solutions to mega-problems involving mega-populations. And charged with handling foreign policy nightmares that arise from catastrophic failures of your solutions and solutions imposed by foreign governments.
The twist is that many failures arise not from bad planning but from the failure to plan for what happens if the plan succeeds. What we have seen in the past century is unmanaged specialization. The specialist who is called in by a government to solve a problem is naturally focused on a solution that works--one that's successful. But the more successful the solution, the faster the system it generates will break down.
Once you know that--and it wasn't until the latter part of the past century that this truism became formal knowledge--you must factor it into your planning. In other words, given today's population numbers, it's no longer enough to come up with bright solutions. You have to ask, "What happens if the bright solution works so well that it crashes the system?"
For many reasons, governments are not there yet. Neither are the development banks. This is why there's a crying need for wide-scale citizen action to make up the shortfall, until governments mosey into the 21st Century.
All right; I hope that's enough of an introduction for the newcomers. Now I'll return to chewing the mouthful I bit off in the earlier essay before I get more letters about the Second Law of Thermodynamics. To repeat: Guru David's First Law of large-scale systems design should not be analogized to the Second Law of Thermodynamics, because entropy is not the same thing as heavy use of a system causing it to crash (versus "wind down").
Systems designed to serve large numbers of people set up conditions for catastrophic (and sudden) failure if the systems are well designed and therefore found to be very useful. The extent to which people find the system to be useful can govern the speed with which the system will break down.
This law is readily grasped by considering the server crashes that accompany a large influx of visitors to a particular website. The site is set up to handle only a certain number of visitors, but sudden popularity ('success') quickly brings about catastrophic failure; i.e., the site crashes.
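To make that shape of failure concrete, here is a minimal sketch in Python (my own toy illustration, not any real site's code; the capacity and growth figures are invented). Traffic grows steadily as word spreads, and the site serves every request normally right up to the minute demand exceeds the design capacity--at which point service fails all at once rather than winding down.

```python
# A minimal sketch (invented numbers) of success triggering sudden failure:
# the site works perfectly until the first minute demand exceeds capacity.

CAPACITY = 1_000   # requests per minute the site was designed to handle (invented)
visitors = 100     # starting traffic (invented)
GROWTH = 1.5       # traffic multiplies each minute as word of the site spreads

minute = 0
while True:
    minute += 1
    visitors = int(visitors * GROWTH)   # popularity compounds
    if visitors <= CAPACITY:
        print(f"minute {minute}: {visitors} requests -- served normally")
    else:
        # No gradual "winding down": the moment demand exceeds capacity,
        # the whole site goes down for everyone at once.
        print(f"minute {minute}: {visitors} requests -- server crashes")
        break
```

The point of the toy is the cliff: nothing in the system decays gradually; the very popularity the designers hoped for is what takes the site down.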
I realize that today's essay wandered from this concept with the second example I provided (roads through the Amazon). In that example, the success of the road system did not (to my knowledge) cause the system to crash; e.g., heavy use of the highway system by migrating farmers in Brazil did not create gridlock or cause roads to buckle.
However, heavy use of the roads--their very success--had a catastrophic consequence. This is because the planners did not factor in that a road can be used for purposes other than transporting produce from rural areas to markets in town hubs. A road can also be used by many people for swift migration.
It was this oversight--the failure to realize how the success of the roads through the Amazon would play out--that touched off a system crash, if one thinks of the Amazon jungle as an ecosystem.
The very density of the jungle and the lack of roads made human migration a hard and slow process. This kept the rate and scope of fires in the Amazon low after residents turned to farming practices that used the slash-and-burn method of clearing land. The Amazon ecosystem, which had taken eons to develop, was able to absorb that burn rate in the same way it absorbed fires from lightning strikes.
But the introduction of road systems, which allowed for large and swift migration of farmers using the slash-and-burn method of clearing land, sent vast portions of the Amazon up in flames. That, in combination with the 'natural' burn rate, overloaded the Amazon ecosystem.
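For readers who like to see the arithmetic, here is another minimal sketch (again my own toy model; every number is invented for illustration). The ecosystem can regenerate only so many hectares a year; before the roads, lightning fires plus slash-and-burn clearing stay under that capacity, so nothing is permanently lost. Once the roads multiply the clearing rate, the total burn overshoots the absorption capacity and the excess becomes cumulative, permanent loss.

```python
# A toy model (invented numbers) of the Amazon example: the roads changed
# the burn rate, not the ecosystem's capacity to absorb burning.

ABSORB = 100        # hectares/year the ecosystem can regenerate (invented)
natural_burn = 40   # hectares/year lost to lightning fires (invented)
slash_burn = 30     # hectares/year cleared by farmers before the roads (invented)
forest = 10_000     # starting forest area in hectares (invented)

for year in range(1, 11):
    if year == 5:
        slash_burn *= 5                  # roads arrive: migration multiplies clearing
    burned = natural_burn + slash_burn
    regrown = min(ABSORB, burned)        # regrowth cannot exceed absorption capacity
    loss = max(0, burned - regrown)      # only the excess is permanent loss
    forest -= loss
    print(f"year {year}: burned {burned} ha, permanent loss {loss} ha, forest {forest} ha")
```

Up to year four the toy system absorbs everything, just as the real one absorbed lightning fires for eons; after the roads, the same mechanism that made the project a success pushes the burn rate past the threshold, and the losses never stop accumulating.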
One can't blame the entire chain of events on a few Brazilian government planners. But I used the example because it's a very dramatic illustration of what can happen when planners don't think about how success can lead to catastrophic failure of a system. In that particular case, the planners didn't stop to consider that if a road system through the jungle was actually successful (used by many migrating farmers), this could touch off other system crashes.
During the era in which all this happened, the discipline of large-scale systems design was just being born, and it's still in its infancy. So perhaps I was a little too hard on the Brazilian planners, but that wouldn't negate my point.