The Dawn of the Age of Ignorance and Misinformation
December 18, 2012
William I. Newman
There emerged after the Renaissance a remarkable period (beginning around 1650–1700) often called the Age of Enlightenment, when Western Europe was freed from ignorance and misinformation. There evolved a belief that the application of intellectual approaches together with experiments could provide reproducible explanations for essentially all the mysteries of the cosmos. The scientific method had been born. Out of this rationalist milieu emerged the notion that, if we knew all the laws of nature and possessed a machine capable of computing the dynamical evolution of every atom, we would be able to calculate indefinitely into the future the state of all things. In many regards, this perspective established the raison d'être of the scientific community. With overwhelming evidence in support of these ideas, the general population came to accept the outcome of scientific investigations in all things.
In the last half century, we have witnessed a grave decline in the public's acceptance of scientific evidence and even the scientific method. Postmodernism presented many challenges in academe and, of wider concern, promoted the notion that science is but one of many approaches to problem solving and that "scientific results" are not necessarily better, let alone reliable. During recent weeks, we have witnessed the public's abandonment of science in connection with a number of events. Among them are the Italian trial and sentencing of six scientists and a government official to six years in prison for failing to predict the 2009 magnitude 6.3 L'Aquila earthquake, the warnings issued for Hurricane Sandy and its destructive potential, and the magnitude 7.7 Queen Charlotte Islands earthquake and subsequent tsunami warning. Also portending grave consequences is the public's response to the emergent role of climate change and its effects on humanity.
The underlying complexity of the natural world greatly exacerbates this problem. The Earth is a remarkably complicated place. The interactions of its oceans and atmosphere are governed by physical laws that are relatively well known. However, the behavior of its land masses and interior is characterized by material heterogeneity and intricate geometries. While mathematicians take pride in the historical evolution of calculus from Newton's quintessential F = ma describing the trajectory of an individual particle, the real world is infinitely more complicated.
The Navier–Stokes equations for fluid motion, which possess a beautiful mathematical underpinning, are in practice profoundly difficult to solve in all but the most pristine circumstances. Hydrodynamic problems couple complicated geometries, as might be produced by terrestrial topography, to cumbersome equations of state describing the phase changes of water and to frequency-dependent radiative transfer regulating the flow of energy from the sun to the Earth and back again in the presence of clouds and aerosols. Solution of such problems is a monumental undertaking. Adding to this physical complexity is mathematical complexity: We are trying to describe a chaotic system with infinite degrees of freedom over an extended period of time. Thus, when we look at the challenges faced by meteorologists, especially those who study the origin and development of hurricanes using space-based observations coupled to mathematical and computational tools, it is truly remarkable that their predictions regarding Hurricane Sandy were so accurate.
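For readers who would like the equations in front of them, one standard simplified statement (the incompressible, constant-viscosity form, stripped of the moisture, rotation, and radiative terms that real atmospheric models require) is
\[
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p + \mu\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad \nabla\cdot\mathbf{u} = 0,
\]
where \(\mathbf{u}\) is the fluid velocity, \(p\) the pressure, \(\rho\) the density, \(\mu\) the viscosity, and \(\mathbf{f}\) any body force. Even this pared-down system resists general solution.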
The study of climate builds upon atmospheric and ocean science in a special way; climate is much more than an average of the weather. While weather forecasts involve all the ingredients described above (and others), the investigation of climate must fold into these considerations changes in the ambient environment that meteorologists generally regard as static---such as the variability in time of greenhouse gas abundance, of the reflectivity (albedo) of the ground due to melting and freezing glacial ice masses, and of ground cover (vegetation). Humans presently inject 30 billion metric tons of carbon dioxide into the atmosphere each year---more than 4 tons per person worldwide (18 tons per person in the U.S. and other developed countries)---a significant fraction of which enters and influences our oceans. This human imprint, which commenced at the beginning of the industrial revolution, remains firm and is growing at a rate of 2.5% per year.
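The per-capita figure is a quick arithmetic check, assuming a world population of roughly 7 billion (approximately the 2012 value):
\[
\frac{30\times 10^{9}\ \text{metric tons/yr}}{7\times 10^{9}\ \text{people}} \approx 4.3\ \text{metric tons per person per year};
\]
and at a sustained growth rate of 2.5% per year, annual emissions would double in about \(\ln 2/\ln(1.025) \approx 28\) years.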
The past decade was the hottest on record---and the surface temperature of the Caribbean has risen by several degrees. The increased temperature of these waters increases the capacity of the atmosphere above them to carry water vapor and transport it over land. The latent heat present in this water vapor drives a phenomenally powerful engine; the heated air, upon interacting with colder air masses from the north, can have cataclysmic results. Similar considerations apply to the Indian Ocean, where tropical cyclones (the regional counterpart of hurricanes) have resulted in power outages afflicting more than half a billion people and displacing hundreds of millions. Attendant food shortages and the spread of disease threaten to destabilize a particularly volatile part of the globe. Climate change is no longer speculative; we are witnessing its impact, which can only be expected to grow.
The fluid processes mentioned above contribute to an even more complicated class of scientific and societal problems: the nature of earthquakes and tsunamis. Convection in the Earth's mantle drives the continental land masses (more correctly, tectonic plates) on the surface like croutons floating in a boiling pot of soup. Areas of contact between the plates, known as earthquake faults, provide frictional resistance to their relative motion, which otherwise would be of order 50 mm/yr. In reality, the distribution of faults is more complicated than this monolithic picture suggests. Earthquake faults form complicated networks and even display some fractal properties. A good metaphor is a piece of fine china that has broken into many fragments with a wide range of sizes.
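A rough order-of-magnitude check, using the 50 mm/yr figure above, shows what such slow motion accumulates to over geologic time:
\[
50\ \text{mm/yr} \times 10^{8}\ \text{yr} = 5\times 10^{9}\ \text{mm} = 5000\ \text{km},
\]
roughly the width of an ocean basin.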
Observed over hundreds of millions of years, this slow motion has resulted in the displacement of the continents from one another over thousands of kilometers. In this ongoing process, the rock material on both sides of a fault undergoes steadily increasing stress until something gives---after which the stress on both sides of the fault is relieved, only to grow again. This so-called elastic-rebound theory, developed by Reid to explain the Great San Francisco Earthquake of 1906, is completely descriptive; efforts to model earthquakes are fundamentally phenomenological, attempting to incorporate in a quantitative way the nature of friction, the rheological properties of the rocks involved, the geometrical configuration, etc.
A tree branch bent with a continually increasing applied stress provides a good analogy for earthquake prediction: It is evident that the branch will ultimately snap, but when? That is the fundamental dilemma posed by efforts to forecast earthquake events. Although the intuitive basis is clear-cut, the quantitative and physics-based ingredients are poorly understood, and first-principles equations describing such events are essentially nonexistent. Earthquake phenomenology preserves some features reminiscent of fluid turbulence, such as cascades, but we have no formal equivalent of the Navier–Stokes equations to help explain, say, the formation of mountain ranges. Seismologists can make reasoned assessments of the locations at greatest risk over some extended window of time. To make "predictions" of the sort that our colleagues in meteorology do (with all their attendant uncertainties) remains an impossible dream. Moreover, major earthquake events lack clear premonitory patterns: Only about 30% of earthquakes are preceded by "foreshocks," i.e., events of lower magnitude or other possibly relevant physical manifestations.
Given the profoundly difficult nature of earthquake forecasting, it is utterly unreasonable to expect accurate predictions of these largely stochastic events. The Italian seismologists convicted for failing to provide adequate warning of the L'Aquila earthquake are the scapegoats of a system that failed to ensure that adequate building codes and other precautions were in place to address behavior that is truly inevitable yet unpredictable.
Days after the egregious decision handed down by the Italian court, we experienced a magnitude 7.7 earthquake off Canada's Queen Charlotte Islands, just north of the Cascadia fault zone. This event was closely monitored because its epicenter is loosely related to a fault---more precisely a subduction zone, where the Juan de Fuca plate plunges beneath the North American plate---centered off the British Columbia–Washington–Oregon coast. In 1700, the so-called Cascadia earthquake, magnitude approximately 8.7–9.2, triggered a devastating tsunami that struck the coast of Japan. We are now possibly overdue for an event of comparable magnitude. When one tectonic plate precipitously slips beneath another, as in last year's disastrous Tohoku earthquake, it is possible, depending on the geometry of the plates and their relative motion, that a massive shallow-water wave will be triggered. That tsunami caused more than 15,000 deaths, a number that will never be completely known. In 2004, the tsunami associated with the Great Sumatra–Andaman earthquake killed more than 230,000 people to the west of its epicenter.
When an underwater earthquake occurs, it is not immediately clear whether a tsunami hazard exists. Only when the geometry of the seismic source, the orientation of the water motion, and related facts are established does the true extent of the risk become clear. The prudent policy, given that a large submarine seismic event can always trigger a tsunami, is to issue an alert and then rescind the warning if emerging evidence shows that the risk is minimal. The Queen Charlotte Islands tsunami alert elicited many complaints from the public, underscoring the dilemma that arises with so many natural hazards: The public expects scientists to accurately predict potential disasters, yet never to warn of events that ultimately do not come to pass. This misbegotten expectation makes clear that the public has lost its appreciation for the scientific method and the capabilities of scientists. The truth is that we can project the behavior of a complex system into the near future only with significant growth of uncertainty.
We are also witnessing in some quarters the rejection of knowledge obtained through the scientific method. A case in point is Senator James Inhofe's recent book The Greatest Hoax, which dismisses the findings of 97–98% of climate scientists, as reported in the Proceedings of the U.S. National Academy of Sciences, that climate change is occurring and is largely due to human activity, in favor of the opinion of one dissenting scientist. Should that scientist's opinion not prevail, Inhofe expresses his unwavering belief that certain Biblical passages guarantee that divine intervention will safeguard our future.
We are now in an era in which global climate change is unequivocal and societal response is critical. Meanwhile, non-climate-related natural hazards, such as earthquakes, emerge in locations that are especially desirable for human habitation---including the West Coast of the U.S., southern Europe, and much of Southeast Asia---where population growth is proceeding at an exponential pace. Science is capable of providing essential insights into the prevailing problems, but it cannot predict with unerring accuracy how natural hazards will result in disasters and, potentially, catastrophes. The diminution, in the public's eyes, of the significance and credibility of science threatens to deny us the one tool that could spare humanity. Indeed, we must confront the growth of ignorance and misinformation if we are to survive.
William I. Newman is a professor of earth and space sciences, physics and astronomy, and mathematics at UCLA.