Debunking or Telling It as It Was?

September 29, 2003

Gregor Mendel: He was no Mendelian. From Einstein's Luck.

Book Review
Philip J. Davis

Einstein's Luck: The Truth Behind Some of the Greatest Scientific Discoveries. By John Waller, Oxford University Press, New York, 2003, 308 pages, $30.00.

A lot of reputation-trashing goes on these days. Within my personal reading experience, the iconoclasm began with Lytton Strachey's 1918 Eminent Victorians. Earlier 19th-century biographies were generally effulgent with praise for their subjects. Strachey, who painted acerbic, satiric pictures of Florence Nightingale, Dr. Thomas Arnold, Cardinal Manning, and General Charles Gordon, among others, created a new style of biography, much admired in its day and much imitated ever since. In its way, it has led to modern biographers' bugging of the bedrooms and examination of the detritus of the personal lives of their subjects.

We now have vast biographies, in which every movement, every spoken or written sentiment, actual or implied, is documented. It's hard to blame the biographers; we readers scan for scandal, for the most part deeming soporific what is not scandal. Paradoxically, the rehabilitation of a reputation (as in the recent case of Rosalind Franklin and her role in elucidating the structure of DNA) may itself come under the rubric of scandal.

I was certainly conned by the advertising material that accompanied Einstein's Luck ("myths debunked and icons shattered") into thinking that this book would be another hatchet job. It turns out that the book appeared in the UK under the title Fabulous Science, which was altered for the American market so as to drag in the person of Einstein. He appears only as a background figure in one of the dozen cases recounted. The word "fabulous" is ambiguous; "Einstein" sells.
But simple debunking is not what we have here. John Waller, a research fellow in the history of medicine at University College, London, wanted to present the development of science, "warts and all." In his introductory pages, Waller states clearly the general ideas he wants to drive home:

"The conduct of scientific inquiry is a lot more haphazard than we tend to think."

"As in all other branches of history, the 'great man' approaches underplay the contributions made by the myriad individuals who did not achieve this honored status."

"We need to treat the received accounts of scientific genius with the utmost circumspection."

"The role of the prevailing scientific paradigm, the social and political context, the vagaries of chance" must be stressed so as to avoid "the error of what modern historians call 'presentism.' "

(Presentism, also called "Whig history" in the trade and on many Web sites, is the interpretation of the past through the lens of present knowledge and attitudes. It implies that the present is the direct linear descendant and inevitable outcome of the past.)

I cannot judge the interpretation of the individual case histories presented. What follows is the barest outline of several of them.

Over-ascription to one person: Sir Alexander Fleming. (He discovered penicillin by chance, but then did little to make it useful.)

Deletion of experimental data: Sir Arthur Eddington. (Confirmation of Einstein's General Relativity Theory. Also Robert Millikan's famous oil drop experiment. They both discarded data.)

Failure to practice what he preached: Joseph Lister. (He was the founder of antiseptic surgery but ran a hospital that was no model of hygiene.)

False claim of importance: Frederick Banting. (He diminished the role of Charles Best in the discovery of insulin for the treatment of diabetes.)

Misrepresentation of an event: The famous Thomas Huxley-Bishop Samuel Wilberforce debate of 1860. (Huxley's self-promoting claim of "victory" does not agree with the record.)

Belief in two incompatible theories: Charles Darwin. (He never abandoned his belief in the inheritance of acquired characteristics.)

Inaccurate claims of having been ahead of his time: Gregor Mendel. (He was no Mendelian.)

Irrationality that leads to progress: (Robert Millikan's "ability to ignore cogent criticism played a positive part in the development of better theories.")

Insufficient evidence for claims made: Frederick Taylor. (He claimed all sorts of benefits from scientific management arrangements that didn't prove out.)

Though the particulars are of great interest, I doubt that any of these "revelations" will raise the eyebrows of scientists who know the least bit of history of their trade. But general readers who have been fed the honeydew of hype, who have been presented with an idealized vision of the scientific method, who have swallowed the belief that scientists are pure, naive souls lacking the grosser passions of ordinary mortals, may suffer temporary shock.

Consider the deletion of experimental data. David Park, a quantum physicist and historian of physics, sent me his thoughts on the subject:

"I recall the shocked silence when my old teacher George Uhlenbeck said at a dinner where there were humanists present: 'Nine times out of ten, when experiment and theory disagree, it's the experiment that's wrong.' A good physicist pays no attention to a controversial experimental result until he knows the maximum about how the experiment was performed. I think I have read somewhere that a search of citations shows that the average experiment adds nothing whatever to the total of human knowledge."

Consider the belief in (superficially) incompatible theories. An electron: What is it---a wave or a particle? Depending on the phenomenon of interest, it is more efficient to think of an electron sometimes as a particle, and other times as a wave. The full quantum mechanical representation contains both possibilities.

Mendel wasn't a Mendelian. Of course not: Semantic change has set in. The seed of an idea is not identical with its flower, although there is a tendency to confuse the two. Buddha wasn't a Buddhist. Jefferson, who owned slaves, wasn't a democrat. Marx wasn't a Marxist.

As to incomplete evidence, recall Nobelist Linus Pauling's prescription of massive doses of Vitamin C to ward off the common cold. Question: What constitutes "complete evidence"? Do we ever arrive at it?


Because Waller has not included any mathematical case histories, I looked for parallels to some of his points within the world of mathematical research. The remainder of this review is devoted to examples.

Irrationality (in the sense of something that seems absurd or illogical). The history of mathematics is full of stories in which irrationality has led to progress: surds, imaginary numbers, the Dirac delta function, non-commutativity. . . .

Belief in incompatible theories. Euclidean vs. non-Euclidean geometry. In its day, the emergence of non-Euclidean geometry sparked a crisis of sorts that helped forge the distinction between an abstract theory and a model of reality.

Jumping to conclusions from incomplete evidence (e.g., induction). This is done all the time. Despite the constant caution that a statement S that is true for n = 1,2,3 is not necessarily true for all n, people formulate conjectures on the basis of incomplete evidence. George Pólya wrote books on how to do it.* Occasionally, a person can reach opposing conclusions from the available evidence. Thus, with the notorious as yet unproved Riemann hypothesis, James Franklin in 1987 listed five reasons for believing the RH and three against.†
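A standard illustration of this trap (not drawn from Waller's book) is Euler's polynomial n² + n + 41, which produces primes for every n from 0 through 39, tempting the conjecture "prime for all n"; it fails at n = 40, where the value is 41². A minimal sketch:

```python
def is_prime(k: int) -> bool:
    """Trial division; adequate for the small values checked here."""
    if k < 2:
        return False
    i = 2
    while i * i <= k:
        if k % i == 0:
            return False
        i += 1
    return True

# Euler's polynomial n^2 + n + 41: prime for n = 0..39 ...
print(all(is_prime(n * n + n + 41) for n in range(40)))  # True

# ... but composite at n = 40, since 40^2 + 40 + 41 = 1681 = 41^2.
print(is_prime(40 * 40 + 40 + 41))  # False
```

Forty consecutive confirmations, and the conjecture still collapses at the forty-first case.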

In any case, one may ask: What is "complete evidence" for the belief in a mathematical statement S? The fact that S has been proved? Many accepted proofs have later been judged erroneous or incomplete.

Great Man system. This is widely accepted both by the mathematics establishment and by the general public. Journalists parlay stars into superstars. We need heroes, larger-than-life figures, even at the price of heroic acts overblown. It makes us ordinary mortals feel good to pluck stars down from the heavens.

As philosopher Richard Watson has written, "We are so in need of squeaky-clean heroes that we present our great thinkers as Paradigms of Truth and Virtue rather than as the cranks they really were."‡

Underrated talents. There are many candidates. The latest nomination I received was for Mario Pieri (1860-1913), who contributed to the foundations of geometry. (Pieri gets "also-ran" citations in the histories of Kline and of Grattan-Guinness. Boyer-Merzbach ignores him.)

Claims or inferences of priority. The Spiral of Bernoulli was considered by Thomas Harriot (1560-1621), if not earlier. And think of the well-researched controversy between Newton and Leibniz over the discovery of the calculus.

Luck. Yes, luck plays a role. I've experienced the feeling of luck in my own work. But is there a distinction between luck and the sudden insight (the Aha! phenomenon)? And recall the adage "Chance favors the prepared mind."

Semantic change. Descartes wasn't a Cartesian. Look as hard as you like for a cartesian coordinate system in Descartes' writings; you won't find it.

"Presentism" in histories of mathematics. The development of mathematics is haphazard. Though mathematics often proceeds in fits, starts, and bungling, its material is presented in a linear mode as a fait accompli. Related material that was wrong or that, if correct, didn't go anywhere is mostly neglected. Also neglected is the mental furniture of individual mathematicians and of the age in which they worked.

With the exception of a specialist's treatment of a very limited subject over a very short time, I believe that presentism is difficult to eliminate. And it is impossible if the historian's goal is to delineate a long sweep of history. Knowledge of what happened next, and the contemporary prejudices of the writer, inevitably shape interpretation. Replacement of ancient verbal mathematics by an "equivalent" contemporary formulaic version, for example, creates the false impression that the original frame of mind has been duplicated. A further example: Newton's work is inseparable from his religious beliefs; the latter were not, as is sometimes thought, the aberration of a great mind.

Waller's book should counteract naive views on how scientific progress has occurred and help us distinguish between a hatchet job and objective history.

*One example is Mathematics and Plausible Reasoning, 1967.
†See: John Derbyshire, Prime Obsession, Joseph Henry Press, 2003.
‡Richard Watson, Cogito Ergo Sum: The Life of René Descartes, 2002.

Philip J. Davis, professor emeritus of applied mathematics at Brown University, is an independent writer, scholar, and lecturer. He lives in Providence, Rhode Island, and can be reached at [email protected].
