Programming the 21st Century

The recently ended twentieth century is known for any number of positives: humanity's first walk on the moon, the harnessing of hydropower and nuclear power to generate electricity, the growth of the human population to six billion, the triumph over smallpox and polio, and a general worldwide increase in wealth that, by the century's end, had made the numbers of overweight and starving people nearly equal. Even so, the Y2K problem prompted a number of wags to note that, since the computer systems would not know that the year was not 1900, perhaps we could get a "do-over" on the whole century.

The reason for such wishful thinking is clear: the twentieth century saw two massive world wars, a cold war with threats of nuclear annihilation, and a gripping depression that birthed National Socialism and the Soviet campaign of starvation in Ukraine. R. J. Rummel estimated that more than 170 million human beings were murdered by various states, with the toll possibly as high as 200 million.

This came in great contrast to the century that preceded it. The achievements of the Western world in the nineteenth century, which we will date from Beethoven's Eroica symphony in 1804 to Mahler's Eighth Symphony and the rise of Schoenberg's atonal music in 1908, present a record of achievement and growth that is stunning to contemplate today. In motion, we have the steamboat in 1807, the railroad in 1825 or 1830, the automobile in 1885, electric traction and independent multiple control in 1887, and the airplane at Kitty Hawk in 1903. Electricity, the telephone, the fax machine, the telegraph and its wires, and international connectivity under the ocean all saw first light in this timeframe. Scientific discoveries abounded, with X-rays, the germ theory of disease, and the periodic table of the elements being only three small examples from physics, biology and chemistry. In art music, the Romantic era launched by Beethoven swept Wagner, Bruckner, Mahler and others along in its wake; Verdi and Puccini created timeless compositions for the dramatic stage. Politically, regimes infringed on fewer human rights, especially in the area of economics, and slavery, an ancient and worldwide practice, was reduced to an insignificant institution by mostly nonviolent means, Haiti and the United States excepted. For information systems purposes, George Boole created a radically simplified logic that could be decided by simple machines, Charles Babbage designed a machine that could calculate, and Ada Lovelace demonstrated the art of programming such machines.

It would be understandable to fault the twentieth century for performing poorly. However, we must also look to some of the problems bequeathed to it by the nineteenth. First, that century spat forth the political leaders responsible for most of those deaths by government, with Stalin (1878), Franklin Delano Roosevelt (1882), Mussolini (1883), Hitler (1889), and Mao (1893) being five of the top offenders. But what allowed such leaders to pursue their campaigns grew from an unresolved philosophical crisis as outlined by Friedrich Nietzsche: the "death of God." As explained by Tom Wolfe, "Nietzsche said this was not a declaration of atheism, although he was in fact an atheist, but simply the news of an event. He called the death of God a 'tremendous event,' the greatest event of modern history. The news was that educated people no longer believed in God, as a result of the rise of rationalism and scientific thought, including Darwinism, over the preceding 250 years. But before you atheists run up your flags of triumph, he said, think of the implications. 'The story I have to tell,' wrote Nietzsche, 'is the history of the next two centuries.' He predicted (in Ecce Homo) that the twentieth century would be a century of 'wars such as have never happened on earth,' wars catastrophic beyond all imagining. And why? Because human beings would no longer have a god to turn to, to absolve them of their guilt; but they would still be wracked by guilt, since guilt is an impulse instilled in children when they are very young, before the age of reason. As a result, people would loathe not only one another but themselves. The blind and reassuring faith they formerly poured into their belief in God, said Nietzsche, they would now pour into a belief in barbaric nationalistic brotherhoods: 'If the doctrines…of the lack of any cardinal distinction between man and animal, doctrines I consider true but deadly' – he says in an allusion to Darwinism in Untimely Meditations – 'are hurled into the people for another generation…then nobody should be surprised when…brotherhoods with the aim of the robbery and exploitation of the non-brothers…will appear in the arena of the future.'"

The first vision of this death, however, came from one of the great discoveries of nineteenth-century physics, the second law of thermodynamics. Stated simply, it says that disorder (entropy) always increases in a closed system; if the universe itself is a closed system, it will eventually dissipate into nothingness. Thus the phrase "the heat death of the universe" entered the vocabulary.
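
In symbols, the second law can be stated in a single line; the following is the standard textbook formulation for an isolated system, added here for reference rather than drawn from the lecture:

```latex
% Second law of thermodynamics for an isolated system:
% entropy S never decreases with time t.
\[
  \frac{dS}{dt} \;\ge\; 0
\]
% Extrapolated to the universe as a whole, S approaches a maximum, no free
% energy remains to do work, and we arrive at the "heat death of the universe."
```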

The dominant school of thought of the nineteenth century, if summarized in one word, was materialism. Its influence extended to a number of areas, including Marxian economics, Darwinian evolution, legal positivism, and atonal music; a logical consequence of materialism is the determinism that underlies much of Skinner's behaviorism. Indeed, if biology is naught but applied chemistry, and chemistry is a science that follows exacting, deterministic laws, then human thought and action, a biological process, is nothing more than the end result of a series of predictable chemical reactions (with, perhaps, some randomness thrown into the chemistry by cosmic rays) and is completely lacking in what philosophers and theologians call free will.

Against this philosophic backdrop the (chronological) twentieth century sent two intellectual warriors. Werner Heisenberg, born in 1901, and Kurt Gödel, born in 1906, were two of the central players undermining this deterministic, closed world. Heisenberg is best known for the uncertainty principle, which states that it is not possible to know simultaneously both the position and the momentum of a particle such as an electron; determining one necessarily disturbs the other. In other words, there is no objective, predetermined result of an observation; at a low enough level, the observer himself changes what is observed. Janna Levin observes that quantum mechanics means chemical reactions in the brain can no longer be deterministic, even while she doubts that "random" provides any more room for a free will to exist. Heisenberg became one of the founders of quantum mechanics, which demonstrated that physics at the subatomic level is probabilistic.
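
The uncertainty principle itself fits in one line; the modern formulation below is a standard statement, included as a reference point rather than as part of the original lecture:

```latex
% Heisenberg uncertainty principle: the product of the uncertainties in
% position (x) and momentum (p) of a particle has a fixed lower bound.
\[
  \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
\]
% hbar is the reduced Planck constant. No measurement can drive both
% uncertainties to zero at once, so the subatomic world is irreducibly probabilistic.
```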

Gödel was a mathematician who worked on problems in the Principia Mathematica (PM); as explained by Hofstadter, PM was designed by Bertrand Russell (with Alfred North Whitehead) to exclude "strange loops," one specifically stated as "Let S be the set of all sets that don't contain themselves. Does S contain itself?" PM was supposed to be a system that excluded such self-referential items by establishing a series of axioms and the conclusions drawn from them; everything true in PM was supposed to be derivable from other things held to be true. Gödel discovered a second, obscure meaning in PM, however: each statement in PM could also be represented by a number, and it was possible, using these numbers, to construct true statements that were not derivable in PM. The statement he created was, essentially, "I am not provable in PM." (If it is provable in PM, then it is not true, which makes PM inconsistent; if it is not provable, it is true, but PM is incomplete, since not all true statements expressible in PM can be derived within it.) In essence, in the most precise system of rules designed by mankind, it was not possible to derive all the true statements from within the system itself; a logical consequence is that no comparably powerful system of rules can be complete, undermining any number of areas, including scientific socialism.
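
Schematically, Gödel's self-referential sentence can be written as follows. This is a standard textbook rendering, not Hofstadter's notation: Prov_PM is the provability predicate for PM, and the corner brackets denote the Gödel number of the sentence itself:

```latex
% Goedel's construction, in schematic form: a sentence G that asserts
% its own unprovability within Principia Mathematica (PM).
\[
  G \;\leftrightarrow\; \neg\,\mathrm{Prov}_{PM}\!\left(\ulcorner G \urcorner\right)
\]
% If PM proves G, then G is false and PM is inconsistent;
% if PM cannot prove G, then G is true but unprovable, so PM is incomplete.
```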

These two developments, quantum mechanics and the incompleteness theorem, help to guide the direction that information systems takes. The incompleteness theorem is used, through the Church-Turing thesis, to demonstrate the limits of what is computable. Quantum mechanics is the science that drives the increasing capacity of information technology. In his 1960 article "There's Plenty of Room at the Bottom," physicist Richard Feynman discussed the abundant space available for storing, say, the Encyclopedia Britannica on the head of a pin (the religiously oriented will recall the vision of heaven offered by C. S. Lewis in 1956's The Last Battle, an onion whose inner layers were "larger" than its outer layers, a strikingly similar idea from the field of literature). That vision became reality in the latter part of the twentieth century and the beginning of the next.

Two "laws" discuss the resulting growth in capacity, Moore's Law and Metcalfe's Law. They describe exponential processes of growth in transistors and network value, respectively, and both go to explain a conundrum as confusing as quantum mechanics for the person seeking to manage change in information systems: the oncoming "free economy." As explained by Anderson, the logic is this: if the cost of a transistor drops by 50% every 18 months, eventually it will be so low as to essentially be free; the same effect obtains in computer disk storage where capacity has grown even faster than transistor density. (Indeed, quantum mechanics undergirds this growth, and will need to be harnessed to continue it.)

Metcalfe's Law describes the increase in the value of a network as you add more nodes to it; the increase is greater than linear, so that additional users increase value more than they increase costs. (Think, for example, of the utility of eBay, where having more buyers and sellers increases the overall value of the network.) The ultimate example of the value of a network comes from the World Wide Web. Here the creators of Google found, like Gödel, a second, hidden meaning in the links that connect Web pages, a meta-meaning that helped decide which pages were most relevant to a given query. The resulting search engine has become one of the greatest machines invented by mankind, and is the basis for a corporation whose market capitalization as of May 1, 2008 was $180 billion, this for a company that did not exist ten years prior.
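
Metcalfe's observation is usually expressed as value growing with the square of the number of nodes (more precisely, with the number of possible connections), while cost grows only linearly. The sketch below illustrates that crossover; the per-link value and per-node cost are made-up numbers for illustration, not data from the article:

```python
# Illustrative sketch of Metcalfe's Law: possible connections among n nodes
# grow as n*(n-1)/2, so value (proportional to connections) eventually
# dwarfs a cost that grows only linearly with n.
def network_value(n: int, value_per_link: float = 1.0) -> float:
    return value_per_link * n * (n - 1) / 2

def network_cost(n: int, cost_per_node: float = 10.0) -> float:
    return cost_per_node * n

if __name__ == "__main__":
    for n in (2, 10, 100, 1000):
        v, c = network_value(n), network_cost(n)
        print(f"n={n:5}  value={v:12.0f}  cost={c:8.0f}  value/cost={v / c:8.2f}")
```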

The world of information systems can offer exponential growth and wealth creation. Integrating the exponentially increasing bounty obtained from harnessing quantum mechanics with the scarcity arising from the material world is the central challenge of the 21st century. Information systems is critical to this challenge; as a calculating and counting machine, the computer has brought us the great age of quantification, in which most of our information is represented as a string of digits (think of MP3 files, DVDs, and this website, for three examples). Quantification leads to the perfection of Taylorism, or scientific management, with its reduction of humans to cogs in a machine, while paradoxically providing exponential growth and non-linear, unquantifiable changes to companies and society.

Time, however, is not unlimited for information systems in this integration. Two events in 2006 are indicative. For one, on some day in 2005 or 2006, more oil was pumped from the earth's crust than on any day before or since: this is the concept of peak oil, and it suggests the literal and figurative "end of the road" for an age of material progress built largely on the extremely profitable "energy return on energy invested" obtainable from crude oil. On the plus side, Tan Dun's opera "The First Emperor" premiered on December 21, 2006, the work of a composer whose success has depended on satisfying paying customers in Wagner's brainchild, the movie industry, and of a rising economic power, China; the music lies outside the mainstream of tonal Western art music, but it holds the promise of a new and exciting age of art-music composition, and perhaps an end to the century we have credited as "beginning" in 1908. It would still seem possible to find a deeper order and harmony underneath the structure of musical notes, and perhaps in other areas as well.

Understanding these contrasting and conflicting centuries, and synthesizing from them new strategies and possibilities for personal, organizational, and societal advantage and profit, will be your task as a strategic thinker in information systems. Know that you will make decisions in a field essential both to the materialist measuring of the world and to non-linear, qualitative changes in it. The work is challenging but rewarding, and critical to the advancement of civilization.

Adapted from a course lecture.

May 16, 2008