

Eight Questions for George Dyson

by chromatic

Editor's note: George Dyson is a Director's Visitor at the Institute for Advanced Study and a historian. His OSCON keynote explores the pioneering work of John von Neumann and others at the IAS in computation and computational biology, and draws parallels between that world and modern open source development. We were fortunate enough to engage George in a brief conversation about his upcoming OSCON 2003 presentation.

O'Reilly Network: What should an OSCON attendee expect to understand after your keynote?

George Dyson: OSCON (as I understand it) consists of people who are working in the digital universe (including its open and less-open subsystems) of today. My contribution (as a historian, or, in this case, more of a paleontologist) is to offer a firsthand glimpse into the birth of this universe, a little over 50 years ago. When von Neumann's engineers here at the Institute for Advanced Study started building the first 5,000 bytes of high-speed random-access memory in 1946, it was an empty universe, completely devoid of code. By the time they got the machine running, in 1951, both code and data were waiting, and the machine was operated more or less without interruption until it was shut down (in 1958), having been superseded by its offspring all around the globe.

I will be speaking in very specific terms, showing pages from the original machine log books and other documents that have been hidden (for OSCON-interesting reasons) for fifty years. General conclusions, of course, can be drawn. In the beginning, all software was open source, and there are good reasons to conclude that so it will be in the end.

One of the first significant expenditures of machine cycles at IAS (second only to thermonuclear bomb calculations and meteorology) was a series of experiments conducted by the viral geneticist Nils Aall Barricelli to see if code could be prompted to evolve, within the "artificial universe" of the von Neumann computer, on its own. All the questions raised by Barricelli are equally applicable and equally instructive with regard to the evolution of software "in the wild" today.
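Barricelli's actual rules were considerably more intricate, but the flavor of such an experiment can be sketched in a few lines. In this toy version (the collision rule here is a simplification of my own, not Barricelli's), integer "organisms" living in a small circular universe reproduce by projecting copies of themselves a distance equal to their own value, and conflicting claims on a cell leave it empty:

```python
def step(universe):
    """One generation: each nonzero number n at cell i persists and also
    copies itself to cell (i + n) mod size; a cell claimed by two
    different numbers is left empty (a toy conflict rule)."""
    size = len(universe)
    nxt = [0] * size
    claims = {}  # target cell -> set of distinct numbers claiming it
    for i, n in enumerate(universe):
        if n == 0:
            continue
        claims.setdefault(i, set()).add(n)             # parent persists
        claims.setdefault((i + n) % size, set()).add(n)  # offspring lands here
    for cell, nums in claims.items():
        if len(nums) == 1:          # uncontested cell survives
            nxt[cell] = nums.pop()
    return nxt

# seed a small circular universe with two numeric "organisms"
universe = [0] * 16
universe[0] = 3
universe[8] = 5
for _ in range(4):
    universe = step(universe)
print(universe)  # the 3s and 5s have each spread; one contested cell is empty
```

After four generations the two species have propagated around the ring until their claims collide, which is exactly the kind of boundary dynamics Barricelli studied in his "symbioorganism" runs.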

ORN: You've spoken in terms that are normally reserved for life: "empty universe", "devoid of...", "offspring". How far should we take this analogy? Is it an analogy?

GD: I believe it's more than analogy, but Barricelli can answer that better than I can:

Are they the beginning of, or some sort of, foreign life forms? Are they only models? They are not models, not any more than living organisms are models. They are a particular class of self-reproducing structures already defined. It does not make sense to ask whether [numerical] organisms are living as long as no clearcut definition of 'living' has been given... The distinction between an evolution experiment performed by numbers in a computer or by nucleotides in a chemical laboratory is a rather subtle one... Whether numbers or nucleotides or magnetic charges in the memory of a high speed computer are used as self-reproducing elements has no influence on the result.

ORN: Would it be proper to suggest that Barricelli's work was on the first "genetic algorithms"? There certainly seems to be a semantic connection.

GD: Yes, although later people usually get the credit for it.

ORN: One of the unique points of the von Neumann architecture is making no distinction between code and data. Was this deliberate?

GD: Yes. I think it's in the abstract for OSCON that the whole business began with von Neumann's prerequisite, announced in 1945, that "'Words' coding the orders are handled in the memory just like numbers." Once you break the distinction between numbers that mean things and numbers that do things, all hell breaks loose, as anyone working with software understands...
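The principle that "'Words' coding the orders are handled in the memory just like numbers" can be shown in miniature. The following toy machine (hypothetical opcodes, not the IAS order code) keeps instructions and data in one memory, so an instruction can rewrite another instruction's operand at runtime — a number that meant something becomes a number that does something:

```python
def run(mem):
    """A toy stored-program machine: each word is an (opcode, operand)
    pair, and code and data share the single memory `mem`."""
    acc, pc = 0, 0
    while True:
        op, arg = mem[pc]
        pc += 1
        if op == "LOAD":
            acc = mem[arg][1]                 # read a word as data
        elif op == "ADD":
            acc += mem[arg][1]
        elif op == "SETOP":
            mem[arg] = (mem[arg][0], acc)     # rewrite an instruction:
                                              # code is just data
        elif op == "HALT":
            return acc

mem = [
    ("LOAD", 7),   # 0: acc = 9 (an *address* stored as data at word 7)
    ("SETOP", 3),  # 1: patch instruction 3's operand to 9
    ("LOAD", 8),   # 2: acc = 30
    ("ADD", 0),    # 3: placeholder operand, filled in while running
    ("HALT", 0),   # 4: stop
    ("DATA", 0),   # 5: unused
    ("DATA", 0),   # 6: unused
    ("DATA", 9),   # 7: the address that instruction 3 will be patched with
    ("DATA", 30),  # 8: a data word
    ("DATA", 12),  # 9: the word instruction 3 ends up adding
]
result = run(mem)
print(result)  # 42 — the operand of instruction 3 was computed at runtime
```

Nothing in the memory marks word 7 as "data" and word 3 as "code"; only the path of execution decides, which is the hell-breaking-loose von Neumann's prerequisite made possible.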

ORN: Aren't these machines limited by their hardware? How do you get past that?

GD: Yes, but when you point out that hardware is proliferating by millions of microprocessors every day, infiltrating every corner of our existence, people then say, "Don't worry, it's limited by software." I wouldn't count on it!

ORN: Maybe the ultimate argument is that of DNA, where we're not sure which is code, which is data, and how it all works. Is modern computer science missing something by treating code and data separately? It's still von Neumann's architecture.

GD: The DNA example is surprisingly close. Nature discovered template-based addressing a long time ago, and in the world of computer-based information processing we're on the same path.
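Template-based addressing can be illustrated in miniature: instead of fetching a word by its numeric location, you fetch whatever fragments a template binds to, the way a DNA strand finds its complement. This is a toy base-pairing rule for illustration, not real biochemistry:

```python
# Watson-Crick pairing: A binds T, C binds G
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(template):
    """The sequence a template would physically bind to."""
    return "".join(PAIR[base] for base in template)

def fetch(memory, template):
    """Content-addressed lookup: return every stored fragment the
    template binds to, with no notion of a numeric address."""
    target = complement(template)
    return [frag for frag in memory if target in frag]

memory = ["ATTACGGA", "GGCCATTA", "CCGGTTAA"]
print(fetch(memory, "TAAT"))  # the template TAAT binds the subsequence ATTA
```

The same fragment can sit anywhere in the pool and still be found, which is why relocation is a non-problem in biology and an engineering chore in absolute-addressed machines.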

I don't believe code and data are as separate as we sometimes think. Von Neumann was thinking about many different architectures besides the one that got named after him, and I suspect it's the decidedly non-von-Neumann architectures (or higher-level architectures constructed within a von Neumann architecture) that would be of most interest to him today. But that's a guess, and goes beyond what I'll talk about in July.

ORN: As a historian, could you offer a hypothesis as to why the code was open back then? Two possibilities come to mind. One, that's how research is done. Two, there was no grand opportunity to profit.

GD: As far as the IAS machine (and its clones) went, the codes were initially open because at that time there really was no distinction yet: there were no assembly languages or compilers or any of those levels of abstraction that allow us to make a distinction between code and source. Everything was programmed in "absolute"; there wasn't even relative addressing yet. There certainly was grand opportunity to profit, and some companies found it irresistible to develop closed systems, whereas others saw that true profitability lay in building machines that could run (and attract) large open libraries of code. But that gets into later (1960s rather than 1950s) history, which is not my area of expertise.
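The difference between "absolute" and relative coding can be sketched with a toy instruction encoding (not any real machine's): an absolute jump names a fixed memory cell, so relocating the program means rewriting it by hand, while a PC-relative jump names an offset from itself and survives relocation unchanged:

```python
def resolve(program, base):
    """Place a program at address `base` and compute each jump's
    real target. ABS targets ignore where the code sits; REL targets
    are offsets from the instruction's own location."""
    targets = []
    for offset, (mode, addr) in enumerate(program):
        if mode == "ABS":
            targets.append(addr)                    # fixed cell, come what may
        else:  # "REL"
            targets.append(base + offset + addr)    # moves with the code
    return targets

prog = [("ABS", 100), ("REL", 2)]
print(resolve(prog, 0))    # [100, 3]
print(resolve(prog, 50))   # [100, 53] — ABS unchanged, REL relocated with it
```

With only absolute coding available, every program was welded to the specific cells it occupied — one practical reason the early codes circulated openly as complete, self-describing artifacts.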

ORN: What kind of arguments would you like to hear in the hallways afterwards?

GD: I expect the slides of Barricelli's experiments will prompt considerable argument in the hallways as to whether his view of the digital universe in 1953 applies to the digital universe of today.

chromatic manages Onyx Neon Press, an independent publisher.
