"How Are We Going to Keep This Machine Busy?"

April 3, 2002

Book Review
James Case

Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists, and Iconoclasts---The Programmers Who Created the Software Revolution. By Steve Lohr, Basic Books, New York, 2001, x + 250 pages, $27.50.

The dust jacket of this, Steve Lohr's second book, describes him as a senior writer and technology correspondent at The New York Times. Here he tells the story of software engineering, from its birth in the laboratories that housed the ENIAC, EDVAC, EDSAC, JOHNNIAC, Whirlwind, and other early digital electronic computers to the present day. As the subtitle suggests, he focuses mainly on the people who forged the software revolution, their motives for creating the software they did, and the difficulties they encountered. In the interest of readability, his coverage of technical issues is sketchy at best.

The title refers to the debate inaugurated in 1968 by structured programming guru Edsger Dijkstra, who claimed in an oft-quoted letter to the editor of Communications of the ACM that "the quality of programmers is a decreasing function of the density of GO TO statements in the programs they produce." The presence of the GO TO statement in the Fortran language of the 1950s had caused no immediate damage; when sprinkled throughout the incomparably larger software programs of the 1960s, however, it quickly became an enemy of structure and a mother of dysfunction. Dijkstra's years of experience in trying to disentangle knots of ill-constructed code led him to conclude that "the GO TO statement should be abolished from all 'higher level' languages."
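A minimal sketch (mine, not Lohr's or Dijkstra's) of the kind of tangle Dijkstra had in mind, written here in C, which still carries a goto statement: the first routine below expresses a simple search as a web of labeled jumps, while the second expresses the same search as a single structured loop. At a dozen lines the difference is cosmetic; multiplied across the million-line systems of the 1960s, it is the difference between code that can be read and code that must be traced.

    #include <stdio.h>

    /* Unstructured version: the reader must follow the labels to
       discover that this is just a linear search. */
    int find_goto(const int *a, int n, int key)
    {
        int i = 0;
    top:
        if (i >= n) goto not_found;
        if (a[i] == key) goto found;
        i++;
        goto top;
    found:
        return i;
    not_found:
        return -1;
    }

    /* Structured version: the same search reads straight through,
       with no labels to chase. */
    int find_structured(const int *a, int n, int key)
    {
        for (int i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }

    int main(void)
    {
        int data[] = {4, 8, 15, 16, 23, 42};
        printf("%d %d\n", find_goto(data, 6, 15),
                          find_structured(data, 6, 15));
        return 0;
    }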

Lohr attributes the first use of the term "computer software," as distinct from "computer hardware," to John Tukey, in 1958. In an article he contributed that year to the American Mathematical Monthly, Tukey asserted that "today the 'software' comprising the carefully planned interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its 'hardware' of [vacuum] tubes, transistors, wires, [magnetic] tapes and the like." This was by no means the prevailing view at a time when physicists and electrical engineers dominated the field, and when computer science departments had yet to sprout throughout the developed world like mushrooms after a rain.

Cuthbert Hurd, an IBM executive during the 1950s, vividly recalled a private showing of the firm's then-new 701 computer---originally known as the Defense Calculator, in deference to the source of funding for its development---at its Poughkeepsie facility during the summer of 1952. Each of the dozen or so prospective customers within what is now called the military-industrial complex was invited to submit a (pretested) program for the machine to run. "They each got a shot at the computer," he recalled. "They would feed a program into the computer, and bam you got the result. . . . We all sat there and said, How are we going to keep this machine busy? It's so tremendously fast. How are we going to do that?" Crude though the 700 series was by current standards, it seemed lightning fast to a generation weaned on hand-cranked business accounting machines.

The answer to Hurd's question was simple enough: Provide the machine with an increased flow of problems to solve. But that would require expansion of the exclusive "priesthoods" that controlled access to the machines of the day---as to some cryptic ancient oracle---by virtue of the fact that they alone could communicate with what the media of the day tended to describe as "mechanical minds." Those priesthoods would be difficult to expand because "programming"---a term then newly imported from England---was by no means easy to learn. As 701 programmer John Backus suggested in a letter addressed to Hurd late in 1953, there had to be "an easier way." Backus asked to be assigned to the search for that easier way, and Hurd signed off on the request. The result, which took until 1957 to produce, was Fortran, a contraction of FORmula TRANslating system. The language was originally intended to simplify programming on the IBM 704. Portability came later. The project was administered with a light corporate touch---Backus never submitted a formal budget---and the delivery date slipped again and again.

The team that created Fortran acquired new members, one by one, to a total of ten. All were still in their twenties or early thirties when the product was finally released. Lohr names names, and outlines the career path each had followed before joining the group. Most had some training in mathematics, since the bulk of the early applications were scientific. "Still," writes Lohr, "it was an eclectic bunch---a chess wizard, a crystallographer, a cryptographer, an employee on loan from United Aircraft, a researcher from MIT, [and] a young woman who joined the project straight out of Vassar." They found time for Kriegspiel (aka double-blind chess) at lunchtime, and impromptu snowball fights when sufficient quantities of snow accumulated on the fifth-floor window ledges of their New York office building. They programmed mainly in octal, and even made jokes about it, as in "Why can't programmers tell the difference between Christmas and New Year's Eve? Because 25 in decimal is 31 in octal." Much of their work was done at night, to make the most of scarce and expensive machine time. The odd hours and sometimes intense frustrations bred camaraderie and lasting friendships.

There was substantial resistance from the machine-language priesthood to the notion of a higher-level language. Early compilers, such as the one built at MIT for the government-funded Whirlwind project, produced inefficient code. The programs so written took nearly ten times longer to run than hand-coded programs implementing the same calculations. Such was the prejudice the Backus team knew it was up against. When Fortran was presented to the world, in February 1957, potential customers were invited to propose real-world engineering problems---like calculating the airflow around a wing, or the heat transfer in a cooling system---and to code them in machine language. The IBM team would code and compile the same problems in Fortran, and the results would be compared.

To the surprise of all in attendance, the Fortran programs ran as---or nearly as---efficiently as the hand-coded products. "It was a revelation," a team member recalled recently. "At that point we knew we had something special." Daniel McCracken's A Guide to FORTRAN Programming, published in 1961, eventually sold 300,000 copies. Such manuals represented a great leap forward in the quest to unite computers with their ultimate beneficiaries, the public at large. With their publication, the feasibility of higher-level languages was no longer in doubt. Other such languages, notably Algol and Lisp, quickly followed.

As efficient and user friendly as Fortran was---at least for its day---it was of little use to IBM's traditional clientele, the purchasers of desktop accounting machines. These corporate giants wanted their new machines to automate the payroll, logistics, purchasing, accounts payable, manufacturing, and franchise operations they were creating in the aftermath of World War II. The director of data systems research at the Department of Defense was soon persuaded to chair a Committee on Data Systems Languages, dedicated to filling the need for a language incorporating "maximum use of simple English" in order "to broaden the base of those who can state problems to computers."

An executive committee was formed to oversee the six men and three women who actually wrote the Cobol language during the last half of 1959. Although the goal of "programming in English" was soon abandoned, the result contributed significantly to the development of database technology, particularly through its ability to arrange data in hierarchical tables, a capability neither Fortran nor Algol could match. The mere fact that DOD refused to buy or lease computers unless they spoke Cobol ensured that it would become an industry standard.
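For readers who have never seen a Cobol data description, the point is easiest to see in miniature. The sketch below is my own illustration, not an example from the book: Cobol let a programmer declare a record whose fields are grouped into nested levels---a customer record containing an address group, for instance---whereas the Fortran of that era offered only flat arrays of numbers. Rendered in C, the idea looks roughly like this:

    #include <stdio.h>

    /* A hierarchical record in miniature: the address group is nested
       one level inside the customer record, much as Cobol's level
       numbers (01, 05, 10) nest groups inside a record description. */
    struct address {
        char street[30];
        char city[20];
        char state[3];
    };

    struct customer {
        char name[30];
        struct address addr;    /* nested group */
        double balance;
    };

    int main(void)
    {
        struct customer c = {
            "ACME MANUFACTURING",
            { "1 MAIN ST", "POUGHKEEPSIE", "NY" },
            125.50
        };
        printf("%s, %s %s: %.2f\n", c.name, c.addr.city, c.addr.state, c.balance);
        return 0;
    }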

Even as higher-level languages were expanding the market for computers, hardware was improving by leaps and bounds. MIT had a (four-terminal) time-sharing demonstration project up and running by the late 1950s, with plans to go campus-wide. IBM announced its 360 series in April 1964. But workers in both places soon discovered that large-scale software programs were unexpectedly difficult to build and operate reliably. Watts Humphrey, put in charge of "programming support" for the 360 project in 1966, was dismayed to discover that work on the early models in the line was far behind schedule---so much so that, although sales literature was circulating freely, the software project supporting the model 91 (the last and largest machine in the line) had been neither staffed nor funded. He was forced to issue a correction stating in effect that customers for the model 91 would have to wait---no one had even begun to write the necessary software.

The 360 very nearly became the Edsel of the computer business. Now firmly in charge, Thomas J. Watson, Jr., had resolved to "bet the company" on this particular "great leap forward." He was bailed out only by Big Blue's deep pockets, established reputation, and a shrewd engineering decision: to equip the new machines with "emulator technology"---special programming and circuitry that enabled customers to run the software they had developed for the older 7000 and 1400 series machines on their new 360s. As long as they could run their existing applications, customers proved willing to take delivery of the hardware with the understanding that the ambitious OS/360 operating system would follow in due course. Lohr characterizes the design of that system as "the signature software project of the 1960s." Even today, he observes, "much of the industrial strength corporate and government computing in the world is done on mainframe computers running software that has been updated time and again, but is a direct descendant of the OS/360."

Even as IBM was sorting out its problems with the 360, colleges and universities throughout the U.S. and U.K. were losing interest in mainframe computing. Smaller machines, such as the PDP-11 from Digital Equipment Corporation, were emerging to fill their needs at a fraction of the cost. Many such computers ran the Unix operating system, which was leaner, simpler, and more elegant than OS/360; the IBM system had been designed to meet specifications supplied by the IBM sales force, in an effort to embrace the largest possible market. Unix, on the other hand, was designed at Bell Labs by computer scientists for computer scientists, with no commercial market in mind. The same can be said of C, a language developed more or less simultaneously at Bell Labs specifically to implement Unix. Both C and Unix seemed well attuned to the "hacker culture" that grew up during the 1960s and 1970s in computer centers from coast to coast.

Whether connected to mainframes or minicomputers, time-sharing terminals were the media through which most computer users communicated with their unseen machines. While some programmed in Fortran, Algol, or C, most were introduced to the discipline via Beginner's All-purpose Symbolic Instruction Code (Basic), a language designed at Dartmouth to meet the needs of (mainly) liberal arts students. It was first used there in 1964, and spread quickly throughout the educational establishment. In its heyday, it was the language of choice at high schools, junior highs, and even a few elementary schools. Basic also furnished a bridge to the personal computer, which began to appear during the mid-1970s.

Microsoft Basic was the product that enabled Microsoft to become a player in the PC revolution. It was also the forerunner of Visual Basic, which enabled the firm to parlay its Windows operating system into the virtual monopoly it currently enjoys in the software industry. The first version of Windows was shipped in 1985. Despite improvements, Lohr writes, Windows didn't begin to dominate the market for PC operating systems until 1991, when Visual Basic became available to facilitate the writing of programs that could run in Windows. "Over the years," observed Bill Gates, "BASIC in all its forms has been the key to much of our success."

The final chapters of the book chronicle the development of VisiCalc, the first "killer app" for the PC, of the word-processing program Word by Microsoft's own Charles Simonyi, of Java and C++ for the Web, and of Linux and the open-source movement. In each case, Lohr appears to have done his homework thoroughly. He identifies the key players and plumbs motivation in simple, believable terms. The result is an informative yet entertaining read.

Lohr finds at least two morals in his story. The first involves the makeup of effective software production teams. Lohr quotes Harlan Mills, a former IBM manager and computer scientist who wrote extensively on "software productivity," to the effect that "surgical" teams---on which the chief surgeon does most of the cutting, while the rest stand by to offer assistance---perform better than "hog butchering" teams, on which every member answers to more or less the same job description. A surgical team is a hierarchical organization designed to maximize the productivity of the most productive member, the lead surgeon or programmer.

The reason for the effectiveness of such a team is simple. Almost everyone involved in programming for any length of time seems convinced that "star programmers" are many times more productive than merely competent ones. Java creator James Gosling describes them as having the "geek gene." "There is an odd obsessive side to it," he once explained. "The people who are best at it have a temperament that makes them the kind of people who are intellectually drawn to something like it's magnetic, sucked into it, and they really don't know why." Donald Knuth, even in his sixties, still feels the need to write programs every day. "I have to program because of the aesthetics of it," he told Lohr. "I love to see the way it fits together and sort of sings to you."

Lohr's second moral concerns software quality. He quotes Fred Brooks, a veteran of IBM's 360 wars who left the firm in 1965 to become the founding chairman of the computer science department at the University of North Carolina. Brooks is the author of The Mythical Man-Month, a book in which he chronicled the OS/360 saga and enunciated an empirical "law" stating that "adding manpower to a late software project makes it later." In a speech on the history of programming languages and operating systems, he observed that some such products spawn fan clubs, while others don't.

Fortran, Pascal, C, Unix, and the Macintosh operating system, according to Brooks, all attracted "fanatical" fan clubs, while Cobol, OS/360, and Microsoft's Disk Operating System did not. The difference, in his opinion, is that the ones that attracted fan clubs were "originally designed to satisfy a designer or a very small group of designers," whereas the others were "designed to satisfy a large set of requirements" gleaned at substantial cost from potential customers, sales forces, and other supposed barometers of demand.

James Case writes from Baltimore, Maryland.

