The Complexity of Complexity

March 15, 2011

Book Review
Philip J. Davis

Complexity: A Guided Tour. By Melanie Mitchell, Oxford University Press USA, New York, 2009, 349 pages, $29.95.

Seek simplicity and distrust it.
---Alfred North Whitehead

Complexity, an ill-defined, paradoxical, and notoriously controversial subject, is an aspect of life that impacts us all in multiple ways. Even recipes for a pot of chili exhibit levels of complexity. It is no surprise, then, that what might familiarly be called a Think Tank for Complexity is embedded in the Santa Fe Institute, and that Melanie Mitchell, associated both with SFI and with Portland State University, is eager to explain this notion to the Wide Wide World.

What is complexity? Mitchell asks. She informs us, in 349 pages, that she has run into dozens of answers. Among them, as starters: complexity as size, as entropy, as information content, as fractal dimension, as logical depth, as thermodynamic depth, as statistical involvement. Considering a recipe for a pot of chili, one might measure its complexity variously: by the number of English (or Spanish) words used, by the number of ingredients, by the cost per bowl, by the difficulty of obtaining a specified ingredient, e.g., chipotle peppers.

The preceding definitions of complexity appear to be based on objective, measurable concepts. One might wonder whether there exists also a subjective aspect. Why not say, simply, that complexity is the opposite of simplicity? "Simplicity" does not appear in the book's index. But these thoughts have led me to a dilemma. A few days ago I was home, totally snowbound. The snow complexified my life because I had to call in a half dozen cancellations. But it simultaneously simplified my life that day because I no longer needed to go to these appointments. Is it the case, then, that for every complexification there is an equal and opposite simplification? Less is more and more is less?

An instance of this duality can be drawn from the history of mathematics. Cardano and later mathematicians, by introducing and using complex numbers, simplified the discussion of the roots of polynomials, relegating Descartes' rule of signs to the bottom of the list of things that every serious student of mathematics ought to know. Thus, as a corollary to my hypothesis, the complexity of a process may very well be time-varying and context-dependent.

Mitchell is a painstaking author, elaborating slowly and often with a light touch matters that need to be explained to her presumably general audience. I regard myself as a member of that audience, and I infer from her writing that she must be an enthusiastic and inspiring teacher. But Mitchell's didacticism has a downside: It sometimes converts her explanations into tutorial sessions during which I am snowed under by information. (PowerPoint presentations have the same effect on me.)

Her guided tour leads us to many topics and subtopics: to dynamics, information theory, evolution, genetics (she's very hot on genetics), cellular automata. She presents "production numbers"---in the terminology of Hollywood musicals---on the development of the theories of evolution and of models of various kinds. If my head grew heavy in attempting to amalgamate all the details (where the devil is said to reside), it was lightened by Mitchell's spicy discussions of the many ambiguities, controversies, acrimonious disagreements that swirl like eddies in Complexityland.

I was attracted particularly to chapter 17, The Mystery of Scaling. The text proceeds from Rubner's law (of the 1880s), which tells us that the metabolic rate of animals varies as the 2/3 power of their body mass, to Kleiber's law (1930s), which upped the power to 3/4. In this connection, I wondered whether the "Average Reader" (if that is, in fact, Mitchell's target audience) would understand what is meant by the 3/4 power of a number without a long lecture on real variable theory. From Kleiber, the text moves to branching structures, thence to fractal dimension, and it winds up with Zipf's law.
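
What is meant by the 3/4 power is easy enough to illustrate without real variable theory. A few lines of Python (my own sketch, not Mitchell's) make the point: under Kleiber's law, multiplying an animal's mass by a factor r multiplies its metabolic rate by r to the 3/4 power, so a creature 16 times as massive metabolizes only 8 times as fast.

```python
# A small illustration (mine, not the book's) of Kleiber's 3/4-power scaling:
# metabolic rate is proportional to mass ** 0.75, so multiplying the mass by r
# multiplies the metabolic rate by r ** 0.75, not by r itself.
for ratio in (2, 10, 16):
    print(f"{ratio:>3}x the mass -> {ratio ** 0.75:.2f}x the metabolic rate")
```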

According to Zipf's law---one of my favorites---given an extended sensible text in a natural language, the frequency of any word is inversely proportional to its rank in the frequency table. Francis and Kucera (colleagues at Brown) and later researchers have conducted extensive investigations of Zipfiana, employing huge databases of natural language texts. Many and diverse explanations have been given for this law, amid controversies as to whether such laws have any universality. One might suppose that physical laws simplify matters, but I recall Philipp Frank's words to the students in his class in mechanics: If you think that Newton's laws are simple matters, you'd better think again.
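
Readers who want to poke at Zipf's law for themselves need nothing as grand as the Francis and Kucera apparatus. A short Python sketch (my own, with "sample.txt" standing in for any long English text) ranks the words of a text by frequency and compares each observed count with the law's prediction, namely the count of the top-ranked word divided by the rank.

```python
# A minimal sketch of testing Zipf's law: the law predicts that the word of
# rank r occurs roughly (frequency of the most common word) / r times.
# "sample.txt" is a placeholder for any long natural-language text.
from collections import Counter

words = open("sample.txt", encoding="utf-8").read().lower().split()
ranked = Counter(words).most_common(10)
top_count = ranked[0][1]
for rank, (word, count) in enumerate(ranked, start=1):
    print(f"{rank:>2}  {word:<12} observed {count:>6}   Zipf predicts {round(top_count / rank):>6}")
```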

Mitchell picks up a question much beloved by the sci-fi community and movie makers: Are computers alive? Here are the five reasons she cites for concluding that computers are not alive: (1) no autonomy, (2) no metabolism, (3) no self-reproduction, (4) no survival instinct, (5) no adaptive evolution. She then intimates that all five can easily be refuted.

Yet these days, when we make doctor's appointments or train reservations by talking not to warm, human flesh but rather to a machine, when we then listen to the beautiful voice of Amtrak's Julie, and recall Marvin Minsky's dictum that humans are only "meat machines" (a term picked up by later writers), we can easily be seduced into believing that a machine can be alive. There is a real Julie; I read an article about her. But Julie has also metamorphosed into a machine and now enjoys a dual nature. Is the machine Julie endowed with an "I"? Steven Pinker, a distinguished cognitive scientist, carries forward the idea of a human as a tremendously complex meat machine and, as a consequence, relegates to the World of Illusion such concepts as the soul and the sense of a person's individual "I" as the center of consciousness.

On our guided tour of Complexityland, we run into apodictic, ex cathedra statements by mavens telling us what we are to believe. We are told, for example, that the universe is a cellular automaton. If you listen on YouTube to Marlene Dietrich's famous song from The Blue Angel, you'll get a second opinion.

Mitchell's Guided Tour is filled with names, biographical material, and portraits. It presents many diagrams and contains a bibliography of some three hundred items. All these inclusions make it easy for the interested and curious reader to join in Back Room discussions in Complexityland.

One final remark: An aspect of this book that I find refreshing is its candid display of all the ifs, buts, maybes, and I dunno's that populate the field. Mitchell convinces us that complexity, like the universe, is not a simple matter that can be summed up by a few laws or in a few punch lines.

Philip J. Davis, professor emeritus of applied mathematics at Brown University, is an independent writer, scholar, and lecturer. He lives in Providence, Rhode Island, and can be reached at [email protected].

