Harvard Magazine
Main Menu · Search · Current Issue · Contact · Archives · Centennial · Letters to the Editor · FAQs

The Browser

Computing's Cranky Pioneer

Tracing the intellectual family tree of computing to its roots

by Harry R. Lewis

A hands-on engineer, Howard Aiken cared most about the mathematical uses of the "automatic sequence controlled calculator" he brought into being with IBM. Photograph courtesy Harvard University Archives

Howard Aiken's ghost was in the Harvard of my youth--his ghost, but not his memory. When I started doing research in the Computation Lab in 1966, Aiken [Ph.D. '39, S.D. '65] had been retired only five years, and a massive segment of his behemoth Mark I electromechanical computer stood in the lobby. The machine had been built by IBM in the early 1940s, installed at Harvard in 1944, and finally turned off in 1959, but until it was moved to the Science Center a couple of years ago one could still flip a switch and watch the giant drive shaft power the decimal counters. Though I was for many years surrounded by Aiken's students and colleagues, I don't remember anyone talking much about the man or his ideas. In fact, my closest personal link to Aiken was in talking to Al Cheverie, an elderly janitor, after I joined the Harvard faculty. When Al came to sweep out my office--which had been Aiken's--he would sometimes lean on his broom and reminisce about the cranky authoritarian computer pioneer.

I. Bernard Cohen's new biography, Howard Aiken, provides answers to a number of questions about the man. To Harvard buffs, the most interesting one may be how this gigantic figure could disappear from the Harvard landscape, both personally and intellectually, so quickly and so thoroughly. The biography, and its companion, Makin' Numbers, a collection of essays and retrospectives by Aiken's contemporaries, guarantee that the Aiken lore will not be lost. But they provide much more. These books are based on interviews with Aiken and his contemporaries, on many early reports and other documents, and to some degree on Cohen's personal knowledge of Aiken from the time when Cohen, now Thomas professor of the history of science emeritus, started to teach at Harvard.

Aiken is a fascinating character. His interest in computing, unlike that of many later computer pioneers, was in getting results ("makin' numbers," as he put it); technology was a means to an end. He did not delight in making tools so others could use them; he got his rewards from seeing numbers coming out of his machines every day, and as many hours of each day as possible. He favored reliability over speed, but above all the use of whatever technology would get answers the soonest, start to finish--and that included the time needed to line up a sponsor.

Howard Aiken: Portrait of a Computer Pioneer, by I. Bernard Cohen '37, Ph.D. '47 (MIT Press, $34.95).

Makin' Numbers: Howard Aiken and the Computer, edited by I. Bernard Cohen and Gregory W. Welch '85 (MIT Press, $40).

His computer designs had many innovations, and many oddities to the modern eye. The Mark I was based on decimal notation for numbers, because he thought the time to convert binary to decimal on input and output would be wasted, even though it would have been quite modest in fact. The Mark I, built out of probably the largest and slowest components ever used for a computer, had a word length of an astonishing 23 decimal digits, the equivalent of more than 64 bits, providing far more precision than could possibly have been useful. Grace Murray Hopper and other Mark I programmers used to physically truncate registers to speed up computations. Yet Aiken railed against the decision by IBM to use a one-punch-in-ten code for decimal digits on punched cards, rather than a binary-coded decimal system that would have saved 60 percent of all the trees cut down to make IBM cards over the years.
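The two figures in that paragraph can be checked with a little arithmetic. The following sketch is my own back-of-envelope reading of the claims, not a calculation from the books; the "60 percent" figure is here interpreted as the row savings of 4-bit binary-coded decimal over one-punch-in-ten coding:

```python
import math

# Information content of a register: d decimal digits carry d * log2(10) bits.
digits = 23
bits = digits * math.log2(10)
print(f"{digits} decimal digits ~ {bits:.1f} bits")  # just over 76 bits, i.e. well past 64

# Card coding: one-punch-in-ten spends 10 possible punch rows per decimal digit;
# binary-coded decimal needs only 4 rows (bits) per digit.
rows_one_punch = 10
rows_bcd = 4
savings = 1 - rows_bcd / rows_one_punch
print(f"BCD saves {savings:.0%} of the punch rows")  # 60%
```

So a 23-digit word is actually equivalent to roughly 76 bits, comfortably "more than 64," and the BCD saving works out to exactly the 60 percent Aiken cited.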

The Mark I team made many discoveries that recurred in later computers, and Cohen documents these well. Subroutines and the "Harvard architecture" (separate instruction and data storage) are but two examples. It is equally interesting to note the mistakes that were also repeated--for example, the decision to devote lots of hardware to circuits for computing certain mathematical functions, even though these were rarely used and ran no faster than the same methods encoded in software, which could far more easily be replaced by better versions. (This mistake was still being made as late as the 1980s.) Yet these innovations, good and bad, seem to be natural features of the computational landscape: later exemplars were rediscoveries, not descendants, of the work of Aiken. Is it possible Aiken was the Leif Ericson of computer science--he got there first, but it didn't really matter in the long run?

Aiken was a remarkable character and the stories recorded about him are entertaining, though some of the books' anecdotes do not inform any larger themes in the history of computing. At their best moments, and there are many of these, the books provide a fine scientific biography--showing the deeper significance of what seem to be purely accidental episodes.

A good example is the split between Aiken and IBM following the delivery and dedication of Mark I at Harvard. The immediate source of the bitterness was the Harvard press release, which slighted IBM's contribution to the invention, something Aiken himself did not do in his speech at the event. It is easy to write this incident off to petty publicity-consciousness, perhaps another foreshadowing of later trends in technological innovation. A more informed and sophisticated view might attribute the problem to a personal conflict between two colossal figures, Aiken and Thomas J. Watson Sr., the president of IBM. But through his careful interviews with contemporaries, and even a textual analysis of versions of the press release, Cohen reveals a far more interesting explanation: this computer was an entirely new kind of thing, and building it was a new kind of engineering. Any mechanism has a certain kind of logical design, but the insubstantial aspect of a computing device was of a new order. What was design and what was implementation had not been sorted out. It was not unreasonable for different parties to have different views of what lay at the creative essence of the project.

With the bulk of a boxcar, Mark I resembled early mainframe computers far more than today's PCs and laptop machines. HARVARD UNIVERSITY ARCHIVES

Aiken was a study in contradictions, a physically huge man who classified people into colleagues and objects of contempt. Once a person was in one category--and the classification tended to be made very quickly--it was almost impossible to move to the other. He was generous with his time and his guidance, while expecting people to work things out on their own. Even his protégés could be the object of curtness bordering on cruelty. Anthony Oettinger '51, Ph.D. '54, one of Aiken's doctoral students and today Gordon McKay professor of applied mathematics and professor of information resources policy, writes in Makin' Numbers: "I remember working on something, perhaps a thesis draft, on which I wanted his opinion. He hoisted up his forbidding pince-nez on his big face with the Mephistophelian eyebrows and the sardonic grin, flicked through the pages, and tossed it back on my desk. 'It stinks,' he threw over his shoulder as he turned to stalk out." Yet his level of personal care for the Comp Lab team could be equally intense. Richard Bloch, one of the Mark I programmers, recalls Aiken walking him home after Bloch had pulled two consecutive all-nighters coding a problem, and not leaving until Aiken had seen Bloch into his pajamas, pulled down the shades, and issued an order that he should not return to the laboratory for at least 24 hours.

Aiken had his self-contradictions as an inventor as well. He set a high priority on documenting his team's work--the Annals of the Harvard Computation Laboratory ran to 40 volumes--and he insisted on the highest quality of writing in anything coming out of the Lab. He hosted the major figures in the pioneering days of computers, and traveled extensively to visit their laboratories as well. He ran two major conferences at Harvard, bringing together hundreds of scientific visitors to talk about the latest trends and inventions in the field. He had almost no interest in what we would today call "intellectual property"--patents or other methods for securing financial rewards from his inventions. Cohen quotes Frederick Brooks, Ph.D. '56, another Aiken student with a long and significant career in computing, as reporting that Aiken thought "the problem was not to keep people from stealing your ideas, but to make them steal them."

In another telling anecdote, Cohen describes the disastrous attempt to rebuild the Harvard-IBM bond, many years after the rift at the dedication of the Mark I. Aiken was invited to IBM, was flown to Endicott, New York, in an IBM private jet, and after a limousine ride to the facility, was received by a very high-level delegation, including Thomas J. Watson Jr. Upon his arrival, he was asked to sign a standard nondisclosure agreement--something so routine at IBM that none of those planning the visit had given it any thought. Aiken refused, stating that everything he did was out in the open and he never received any confidential information. After an awkward silence between Aiken and Watson, Aiken got back in the limousine without seeing anything, and was returned to the airport to be flown back to Boston.

Yet this stubborn advocate of free information headed a laboratory that became increasingly isolated from important trends in the computer field. Little that happened elsewhere had much influence on the development of Aiken's line of computers. His categorical rejection of the stored-program concept--which was essential to the development of compilers and other basic software technology--was but one of the ways in which Aiken's very strength attenuated his influence in the long run.

Cohen asserts, and he is probably right, that it was through Aiken's influence as an educator that he has left his imprint on the computer world. Whether or not the Mark I was the first computer, and that is a matter of semantics, there seems little doubt that the program Aiken established at Harvard was the first graduate program in computer science anywhere. There was a curriculum, there were graduate students, there were courses and lectures (Makin' Numbers includes some of Aiken's final examinations). Aiken was a splendid lecturer, speaking from a minimum of notes and always finishing on the hour with a flourish. Intellectually conservative though he was, he always discarded his notes so as not to be tempted to repeat the same lectures the following year. Of his 17 Ph.D. students, more than half became important academic or research figures in the computing field, and many of the rest were influential figures in industry.

Harvard colleagues presented Aiken with this replica of Mark I, crafted by a Cape Ann silversmith, as a retirement memento. HARVARD UNIVERSITY ARCHIVES

Though he was an applied mathematician who put his machines to work mainly on scientific and numerical problems, Aiken encouraged his colleagues and students to explore other areas of possible application of the technology. He pushed Oettinger and others to work on linguistics and on business applications. He anticipated the miniaturization of computing equipment, the universality of its application, and, most interestingly, the fact that it would not be a laborsaving innovation. His reasoning was simple: there never had been a laborsaving device, only devices that enabled people to be more productive while also working longer hours. Computers would be like that, too.

An appendix to the biography reprints a 1959 "Computer Tree" issued by the National Science Foundation that roots all computing machines in the Mark I, and describes the obvious inaccuracy of this lineage and some later variations on it. I wonder what an intellectual family tree might look like. Constructing one would be a subtle problem, because the objective would not be simply a tree of Ph.D. advisers and their students, but a record of the growth and development of ideas, with John von Neumann's stored-program computer emanating from Alan Turing's universal machine, and so on.

What larger lessons have descended from Aiken? In this lineage, the "Mythical Man-Month" of Frederick Brooks's classic book on software engineering would have some relationship to Aiken's labor-creating computers, and Oettinger, Kenneth Iverson, Ph.D. '54, Gerard Salton, Ph.D. '58, and the others would also show the influence of their parentage. Perhaps such a tree would also explain what did not happen--for example, why computer science went through such a dry spell at Harvard after Aiken's retirement. I suspect the answer to that question is more complicated than the institutional misgivings about applied science and outside funding that are cited in the essays on this subject in Makin' Numbers.

The whole story of the intellectual lineage of computer science will have to wait another generation to be understood. Cohen has told enough of it to make clear that Aiken in fact was no Leif Ericson--he left his mark, in human terms, through his academic descendants and the educational structure he created. These books document the explosion of novelties in the Comp Lab at a time when almost nothing was known about computing, and have done us a great service by preserving, fondly but dispassionately and exhaustively, the memory of Aiken and those who worked with him.

Harry R. Lewis '68, Ph.D. '74, is dean of Harvard College and Gordon McKay professor of computer science.
