Serendipity's a funny thing. When I started planning out this post a couple of days ago, I knew that I was going to have to pull my battered copy of Gregory Bateson's Mind and Nature off the bookshelf where I keep basic texts on systems philosophy, since it's almost impossible to talk about information in any useful way without banking off Bateson's ideas. I didn't have any similar intention when I checked out science reporter Charles Seife's Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking from the local library, much less when I took a break from writing the other evening to watch "Monty Python and the Holy Grail" for the first time since my teens.
Still, I'm not at all sure I could have chosen better, for both of these latter turned out to have plenty of relevance to the theme of this week's post. Fifty years of failed research and a minor masterpiece of giddy British absurdity may not seem to have much to do with each other, much less with information, Gregory Bateson, or a "green wizardry" fitted to the hard limits and pressing needs of the end of the industrial age. Yet the connections are there, and the process of tracing them out will help more than a little to make sense of how information works – and also how it fails to work.
Let's start with a few basics. Information is the third element of the triad of fundamental principles that flow through whole systems of every kind, and thus need to be understood to build viable appropriate tech systems. We have at least one huge advantage in understanding information that people a century ago didn't have: a science of information flow in whole systems, variously called cybernetics and systems theory, that was one of the great intellectual adventures of the twentieth century and deserves much more attention than most people give it these days.
Unfortunately we also have at least one huge disadvantage in understanding information that people a century ago didn't have, either. The practical achievements of cybernetics, especially but not only in the field of computer science, have given rise to attitudes toward information in popular culture that impose bizarre distortions on the way most people nowadays approach the subject. You can see these attitudes in an extreme form in the notion, common in some avant-garde circles, that since the amount of information available to industrial civilization is supposedly increasing at an exponential rate, and exponential curves approach infinity asymptotically in a finite time, then at some point not too far in the future, industrial humanity will know everything and achieve something like omnipotence.
I've pointed out several times in these essays that this faith in the so-called "singularity" is a rehash of Christian apocalyptic myth in the language of cheap science fiction, complete with a techno-Rapture into a heaven lightly redecorated to make it look like outer space. It might also make a good exhibit A in a discussion of the way that any exponential curve taken far enough results in absurdity. There's another point here, though, which is that the entire notion of the singularity is rooted in a fundamental misunderstanding of what information is and what it does.
Bateson's work is a good place to start clearing up the mess. He defines information as "a difference that makes a difference." This is a subtle definition, and it implies much more than it states. Notice in particular that whether a difference "makes a difference" is not an objective quality; it depends on an observer, to whom the difference makes a difference. To make the same point in the language of philosophy, information can't be separated from intentionality.
What is intentionality? The easiest way to understand this concept is to turn toward the nearest window. Notice that you can look through the window and see what's beyond it, or you can look at the window and see the window itself. If you want to know what's happening in the street outside, you look through the window; if you want to know how dirty the window glass is, you look at the window. The window presents you with the same collection of photons in either case; what turns that collection into information of one kind or another, and makes the difference between seeing the street and seeing the glass, is your intentionality.
The torrent of raw difference that deluges every human being during every waking second, in other words, is not information. That torrent is data – a Latin word that means "that which is given." Only when we approach data with intentionality, looking for differences that make a difference, does data become information – another Latin word that means "that which puts form into something." Data that isn't relevant to a given intentionality, such as the dirt on a window when you're trying to see what's outside, has a different name, one that doesn't come from Latin: noise.
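For readers who think more easily in code, the window example can be sketched in a few lines. This is purely illustrative: the data strings and the two predicate functions are invented for the sake of the sketch, and stand in for the far richer streams and intentionalities of real perception.

```python
# Data becomes information only relative to an intentionality.
# The "data" below is a toy stand-in for the photons reaching your eye;
# each intentionality is just a predicate that selects what matters to it.

data = [
    "street: bus passing",
    "glass: smudge upper left",
    "street: neighbor waving",
    "glass: rain spots",
]

# Two intentionalities toward the same data: looking THROUGH the window
# versus looking AT the window.
looking_through = lambda d: d.startswith("street:")
looking_at = lambda d: d.startswith("glass:")

def extract(data, intentionality):
    """Split the same data into information and noise.
    The same datum is information under one intentionality
    and noise under the other."""
    information = [d for d in data if intentionality(d)]
    noise = [d for d in data if not intentionality(d)]
    return information, noise

info, noise = extract(data, looking_through)
print(info)   # the street scenes; the smudges on the glass are noise
info, noise = extract(data, looking_at)
print(info)   # the state of the glass; the street is now noise
```

Note that nothing in the data itself changes between the two calls; only the intentionality brought to it does.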
Thus the mass production of data in which believers in the singularity place their hope of salvation can very easily have the opposite of the effect they claim for it. Information only comes into being when data is approached from within a given intentionality, so it's nonsense to speak of it as increasing exponentially in some objective sense. Data can increase exponentially, to be sure, but this simply increases the amount of noise that has to be filtered before information can be made from it. This is particularly true in that a very large fraction of the data that's exponentially increasing these days consists of such important material as, say, gossip about Kate Hudson's breast implants.
The need to keep data within bounds to make getting information from it easier explains why the sense organs of living things have been shaped by evolution to restrict, often very sharply, the data they accept. Every species of animal has different information needs, and thus limits its intake of data in a different way. You're descended from mammals that spent a long time living in trees, for example, which is why your visual system is very good at depth perception and seeing the colors that differentiate ripe from unripe fruit, and very poor at a lot of other things.
A honeybee has different needs for information, and so its senses select different data. It sees colors well up into the ultraviolet, which you can't, because many flowers use reflectivity in the ultraviolet to signal where the nectar is, and it also sees the polarization angle of light, which you don't, since this helps it navigate to and from the hive. You don't "see" heat with a special organ on your face, the way a rattlesnake does, or sense electrical currents the way many fish do; around you at every moment is a world of data that you will never perceive, because your ancestors over millions of generations survived better by excluding that data, so they could extract information from the remainder, than they would have done by including it.
Human social evolution parallels biological evolution, and so it's not surprising that much of the data processing in human societies consists of excluding most data so that useful information can emerge from the little that's left over. This is necessary but it's also problematic, for a set of filters that limit data to what's useful in one historical or ecological context can screen out exactly the data that might be most useful in a different context, and the filters don't necessarily change as fast as the context.
The history of fusion power research provides a superb example. For more than half a century now, leading scientists in the world's industrial nations have insisted repeatedly, and inaccurately, that they were on the brink of opening the door to commercially viable fusion power. Trillions of dollars have gone down what might best be described as a collection of high-tech ratholes as the same handful of devices get rebuilt in bigger and fancier models, and result in bigger and costlier flops. They're still at it; the money the US government alone is paying to fund the two fusion megaprojects du jour, the National Ignition Facility and the ITER, would very likely buy a solar hot water system for every residence in the United States and thus cut the country's household energy use by around 10% at a single stroke. Instead, it's being spent on projects that even their most enthusiastic proponents admit will only be one more inconclusive step toward fusion power.
The information that is being missed here is that fusion power isn't a viable option. Even if sustained fusion can be done at all outside the heart of a star, and the odds of that don't look good just now, it's been shown beyond a doubt that the cost of building enough fusion power plants to make a difference will be so high that no nation on Earth can afford them. There are plenty of reasons why that information is being missed, but an important one is that industrial society learned a long time ago to filter out data that suggested that any given technology wasn't going to be viable. During the last three centuries, as fossil fuel extraction sent energy per capita soaring to unparalleled heights, that was an adaptive choice; the inevitable failures – and there have been wowsers – were more than outweighed by the long shots that came off, and the steady expansion of economic wealth powered by fossil fuels made covering the costs of failures and long shots alike a minor matter.
We don't live in that kind of world any longer. With the peak of world conventional petroleum production receding in the rear view mirror, energy per capita is contracting, not expanding. At the same time, most of the low hanging fruit in science and engineering has long since been harvested, and most of what's left – fusion power here again is a good example – demands investment on a gargantuan scale with no certainty of payback. The assumption that innovation always pays off, and that data contradicting that belief is to be excluded, has become hopelessly maladaptive, but it remains welded in place; consider the number of people who insist that the proper response to peak oil is some massive program that would gamble the future on some technology that hasn't yet left the drawing boards.
It's at this point that the sound of clattering coconut hulls can be heard in the distance, for the attempt to create information out of data that won't fit it is the essence of the absurd, and absurdity was the stock in trade of the crew of British comics who performed under the banner of Monty Python. What makes "Monty Python and the Holy Grail" so funny is the head-on collisions between intentionalities and data deliberately chosen to conflict with them; any given collision may involve the intentionality the audience has been lured into accepting, or the intentionality one of the characters is pursuing, or both at once, but in every scene, cybernetically speaking, that's what's happening.
Consider King Arthur's encounter with the Black Knight. The audience and Arthur both approach the scene with an intentionality borrowed from chivalric romance, in which knightly combat determines a winner and a loser out of the background data. The Black Knight, by contrast, approaches the fight with an intentionality that excludes any data that would signal his defeat. No matter how many of the Black Knight's limbs get chopped off – and by the end of the scene, he's got four bloody stumps – he insists on his invincibility and accuses Arthur of cowardice for refusing to continue the fight. There's some resemblance here to the community of fusion researchers, whose unchanging response to half a century of utter failure is to keep repeating that fusion power is just twenty (more) years in the future.
Doubtless believers in the singularity will be saying much the same thing fifty years from now, if there are still any believers in the singularity around then. The simple logical mistake they're making is the same one that fusion researchers have been making for half a century; they've forgotten that the words "this can't be done" also convey information, and a very important kind of information at that. Just as it's very likely at this point that fusion research will end up discovering that fusion power won't work on any scale smaller than a star, it's entirely plausible that even if we did achieve infinite knowledge about the nature of the universe, what we would learn from it is that the science fiction fantasies retailed by believers in the singularity are permanently out of reach, and we simply have to grit our teeth and accept the realities of human existence after all.
All these points, even those involving Black Knights, have to be kept in mind in making sense of the flow of information through whole systems. Every system has its own intentionality, and every functional system filters the data given to it so that it can create the information it needs. Even so simple a system as a thermostat connected to a furnace has an intentionality – it "looks" at the air temperature around the thermostat, and "sees" if that temperature is low enough to justify turning the furnace on, or high enough to justify turning it off. The better the thermostat, the more completely it ignores any data that has no bearing on its intentionality; conversely, most of the faults thermostats can suffer can be understood as ways that other bits of data (for example, the insulating value of the layer of dust on the thermostat) insert themselves where they're not wanted.
The function of the thermostat-furnace system in the larger system to which it belongs – the system of the house that it's supposed to keep at a more or less stable temperature – is another matter, and requires a subtly different intentionality. The homeowner, whose job it is to make information out of the available data, monitors the behavior of the thermostat-furnace system and, if something goes wrong, has to figure out where the trouble is and fix it. The thermostat-furnace system's intentionality is to turn certain ranges of air temperature, as perceived by the thermostat, into certain actions performed by the furnace; the homeowner's intentionality is to make sure that this intentionality produces the effect that it's supposed to produce.
One way or another, this same two-level system plays a role in every part of the green wizard's work. It's possible to put additional levels between the system on the spot (in the example, the thermostat-furnace system) and the human being who manages the system, but in appropriate tech it's rarely a good option; the Jetsons fantasy of the house that runs itself is one of the things most worth jettisoning as the age of cheap energy comes to a close. Your goal in crafting systems is to come up with stable, reliable systems that will pursue their own intentionalities without your interference most of the time, while you monitor the overall output of the system and keep tabs on the very small range of data that will let you know if something has gone haywire.
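The two-level arrangement can be sketched in code. This is a minimal illustration under stated assumptions, not any real thermostat's firmware: the setpoints, temperature readings, and the simple "cold house with the furnace on" check are all invented for the sketch.

```python
# A minimal sketch of the two-level thermostat/homeowner system.
# Setpoints, readings, and the fault check are illustrative assumptions.

class Thermostat:
    """Lower level: turns temperature readings into furnace on/off actions.
    Its 'intentionality' is the setpoint band; all other data is noise."""

    def __init__(self, low=66.0, high=70.0):
        self.low = low        # turn the furnace ON below this temperature (F)
        self.high = high      # turn the furnace OFF above this temperature (F)
        self.furnace_on = False

    def update(self, temperature):
        # Hysteresis: only crossings of the band make a difference;
        # readings inside the band change nothing.
        if temperature < self.low:
            self.furnace_on = True
        elif temperature > self.high:
            self.furnace_on = False
        return self.furnace_on


def homeowner_check(readings, furnace_states, low=66.0):
    """Upper level: the homeowner ignores moment-to-moment data and watches
    for the one difference that makes a difference at this level:
    a house well below the setpoint while the furnace is supposedly running."""
    return any(t < low - 5 and on for t, on in zip(readings, furnace_states))


stat = Thermostat()
temps = [68.0, 65.0, 67.0, 71.0, 69.0]
states = [stat.update(t) for t in temps]
print(states)                          # furnace switches on below 66, off above 70
print(homeowner_check(temps, states))  # False: the lower level is doing its job
```

The design point is that each level filters its own data: the thermostat never sees "is the system broken," and the homeowner never needs to see every individual reading.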
That same two-level system also applies, interestingly enough, to the process of learning to become a green wizard. The material on appropriate technology I've asked readers to collect embodies a wealth of data; what prospective green wizards have to do, in turn, is to decide on their own intentionality toward the data they have, and begin turning it into information. This is the exercise for this week.
Here's how it works. Go through the Master Conserver files you downloaded, and any appropriate tech books you've been able to collect. On a sheet of paper, or perhaps in a notebook, note down each project you encounter – for example, weatherstripping your windows, or building a solar greenhouse. Mark any of the projects you've already done with a check mark. Then mark each of the projects you haven't done with one of four numbers and one of four letters:
1 – this is a project that you could do easily with the resources available to you.
2 – this is a project that you could do, though it would take some effort to get the resources.
3 – this is a project that you could do if you really had to, but it would be a serious challenge.
4 – this is a project that, for one reason or another, is out of reach for you.
A – this is a project that is immediately and obviously useful in your life and situation right now.
B – this is a project that could be useful to you given certain changes in your life and situation.
C – this is a project that might be useful if your life and situation were to change drastically.
D – this is a project that, for one reason or another, is useless or irrelevant to you.
This exercise will produce a very rough and general intentionality, to be sure, but you'll find it tolerably easy to refine from there. Once you decide, let's say, that weatherstripping the leaky windows of your apartment before winter arrives is a 1-A project – easy as well as immediately useful – you've set up an intentionality that allows you to winnow through a great deal of data and find the information you need: for example, what kinds of weatherstripping are available at the local hardware store, and which of those you can use without spending a lot of money or annoying your landlord. Once you decide that building a brand new ecovillage in the middle of nowhere is a 4-D project, equally, you can set aside data relevant to that project and pay attention to things that matter.
Of course you're going to find 1-D and 4-A projects as well – things that are possible but irrelevant, and things that would be splendidly useful but are out of your reach. Recognizing these limits is part of the goal of the exercise; learning to focus your efforts where they will accomplish the most soonest is another part; recognizing that you'll be going back over these lists later on, as you learn more, and potentially changing your mind about some of the rankings, is yet another. Give it a try, and see where it takes you.
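For those who would rather keep their project list in a file than a notebook, the triage above amounts to a simple sort. The sketch below is one possible way to do it; the sample projects and their rankings are invented for illustration, and the ordering rule (easiest-and-most-useful first) is just the one the exercise implies.

```python
# A sketch of the triage exercise as a sortable list.
# Each project carries a feasibility number (1-4) and a usefulness letter (A-D);
# the sample entries and rankings below are illustrative assumptions.

projects = [
    ("weatherstrip apartment windows", 1, "A"),
    ("build a solar greenhouse",       3, "B"),
    ("found an ecovillage",            4, "D"),
    ("install solar hot water",        2, "A"),
]

# Sort by feasibility (1 = easiest) and then usefulness (A = most useful):
# the 1-A projects surface at the top, the 4-D ones drop to the bottom.
triaged = sorted(projects, key=lambda p: (p[1], p[2]))

for name, effort, use in triaged:
    print(f"{effort}-{use}: {name}")
```

The sort works because Python compares the `(number, letter)` key tuples element by element, and the letters A through D happen to sort in the order the exercise assigns them.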