Time of Growing Pains for Information Age

By Dennis Overbye, 8.7.2001

Sitting under a tree to debate the meaning of a revolution.

BETHLEHEM, Conn.

These would seem to be heady times to be a computer scientist. This is the information age, in which, we are told, biology is defined by a three-billion-letter instruction manual called the genome and human thoughts are analogous to digital bits flowing through a computer. And, we are warned, human intellect will soon be dwarfed by superintelligent machines.

"All kinds of people," said Jaron Lanier, a computer scientist and musician, "are happy to tell us what we do is the central metaphor, the best explanation of everything from biology to economics to aesthetics to child rearing, sex, you name it. It's very ego-gratifying."

Mr. Lanier is the lead scientist of the National Tele-Immersion Initiative, a virtual reality system that has been designed for the Internet.

He and six other scientists were sitting under a maple tree one recent afternoon, worrying whether this headiness was justified. They found instead that they could not even agree on useful definitions of their field's most common terms, like "information" and "complexity," let alone the meaning and future of this revolution.

The other scientists were two computer science professors, Dr. David Gelernter of Yale and Dr. Jordan Pollack of Brandeis University; three physicists, Dr. Brian Greene of Columbia, Dr. Alan Guth of the Massachusetts Institute of Technology and Dr. Lee Smolin of the Center for Gravitational Physics and Geometry at Penn State; and a psychologist and neuroscientist, Dr. Marc Hauser of Harvard.

 

John Brockman, a literary agent who represents these scientists, had convened them at the country house here that he shares with his wife and partner, Katinka Matson. Mr. Brockman said he had been inspired to gather the group by a conversation with Dr. Seth Lloyd, a professor of mechanical engineering and quantum computing expert at M.I.T. Mr. Brockman recently posted Dr. Lloyd's statement on his Web site, www.edge.org: "Of course, one way of thinking about all of life and civilization," Dr. Lloyd said, "is as being about how the world registers and processes information. Certainly that's what sex is about; that's what history is about."

 

Humans have always tended to envision the world and themselves in terms of the latest technology. In the 17th and 18th centuries, for example, the workings of the cosmos were thought of as the workings of a clock, and the building of clockwork automata was fashionable. But not everybody in the world of computers and science agrees with Dr. Lloyd that the computation metaphor is ready for prime time.

 

Several of the people gathered under the maple tree had come in the hope of debating that issue with Dr. Lloyd, but at the last moment he was unable to attend. Others were drawn by what Dr. Greene called "the glimmer of a unified language" in which to talk about physics, biology, neuroscience and other realms of thought. What happened instead was an illustration of how hard it is to define a revolution from the inside.

 

Scientifically, the information age can be said to have begun in 1948, when Dr. Claude E. Shannon, a researcher at Bell Laboratories, proposed that information could be defined as the number of ones and zeros—bits—that it took to encode a message in binary language. His mathematical formulation implied a link between information and entropy, a measure of disorder that plays a deep role in areas of physics involving heat, gases and black holes.
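
As a back-of-the-envelope illustration (ours, not the article's), Shannon's formula makes "bits" concrete: a source's average information content depends on how predictable its symbols are. A minimal sketch in Python:

    import math
    from collections import Counter

    def entropy_bits_per_symbol(message: str) -> float:
        """Shannon entropy: average bits needed per symbol of the message."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    msg = "in the beginning was the bit"
    h = entropy_bits_per_symbol(msg)
    print(f"about {h:.2f} bits per symbol, {h * len(msg):.0f} bits for the whole message")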

 

Some scientists have suggested applying Dr. Shannon's definition of information to situations other than the sending of messages, to things like the human genome or even the universe. According to one interpretation of recent theories in high-energy particle physics, the universe can be portrayed as nothing more than a vast network of information flow—that is to say, as a computer.

 

In principle, Dr. Smolin said, a piece of metal or an orange could be said to have an information content and the object's behavior, rusting or ripening, could be thought of as a computation. But he added, "We have not yet constructed physics on those grounds and have no idea if we will be able to."

 

The assembled scientists, however, argued that Dr. Shannon's definition of information, based on counting bits, did not give a meaningful result in every situation. For example, if you have two copies of a book, you have twice as many bits and thus twice as much information, but you are not necessarily better informed.
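
A toy experiment makes their objection concrete. Doubling a text doubles its raw bit count, but a compressor (Python's zlib here, standing in for an idealized measure of information) squeezes nearly all of the redundancy back out; the setup below is illustrative, not from the article:

    import zlib

    book = ("It was the best of times, it was the worst of times. " * 200).encode()

    one_copy = len(zlib.compress(book))
    two_copies = len(zlib.compress(book + book))

    print(f"raw size doubles: {len(book)} -> {2 * len(book)} bytes")
    print(f"compressed size barely grows: {one_copy} -> {two_copies} bytes")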

 

Not all bits are equal, after all. The words "I do," spoken in one context, say, by the participants in a wedding, have a much larger consequence than if spoken elsewhere. An article on the front page of a newspaper receives more attention than one of equal length inside.

 

The recently completed sequencing of the human genome furnishes another example of the futility of counting bits. The first analysis revealed far fewer genes than expected. The potency of the genome, biologists are saying, arises from relationships among the genes and from their interactions in the environment of the womb, rather than from their sheer number. The different parts of the genetic code are, in effect, able to turn one another on and off.

 

Such effects, in a long computer code or in the various contexts of real life, Mr. Lanier said, "create lenses" that magnify some bits while the rest are lost in statistical noise and never affect the world. How to distinguish and quantify the bits that have an impact? Mr. Lanier suggested making a distinction between the maximum potential information in a system—the total number of bits—and the causal impact of those bits.

 

The ambiguities in the meaning of information make it even more difficult to define a way to measure another much bandied-about concept in computer science, namely complexity. That is a problem, Dr. Pollack explained, because without a quantitative definition of complexity, scientists have no objective way to evaluate the intelligence of a new robot or a computer program, or of the process that created them.

 

One difficulty, Dr. Smolin and the others pointed out, is that a simple program can give a complex result. An example is the beautiful fractal pattern of shapes within shapes within shapes known as the Mandelbrot set, after the mathematician Benoit Mandelbrot. It can be generated by a few lines of computer code. So where does the complexity reside?
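
The generating program really is short. As a sketch (ours, not anything shown at the meeting): a point c belongs to the Mandelbrot set if repeatedly applying z -> z*z + c never lets z escape, and a dozen lines of Python will draw a coarse picture of it:

    def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
        """c is in the set if z -> z*z + c stays within radius 2."""
        z = 0j
        for _ in range(max_iter):
            z = z * z + c
            if abs(z) > 2:
                return False
        return True

    # Coarse ASCII rendering: real axis -2..1, imaginary axis -1..1.
    for y in range(21):
        print("".join(
            "*" if in_mandelbrot(complex(-2 + x * 0.05, -1 + y * 0.1)) else " "
            for x in range(61)))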

 

Another example, maintained Dr. Pollack, is life itself, which, he pointed out, had begun billions of years ago as a pool of random chemicals, running its own "simple program" and baking in the sun. "Over time you would wind up with something that I would argue was more complex than the initial program," he said.

 

On the other hand there are things that appear simpler than their beginnings, like a fine novel whose polished prose belies the many drafts and years of struggle it took to write. Living organisms today are built from a relatively small number of genes and amino acids, but that is the result of billions of years of evolution in nature's computer.

 

Any useful definition of the complexity of a system, the scientists suggested, should include some measure of how much energy or time or computation it took to produce or change the system. "I think complexity is mostly sort of crummy stuff that is there because it's too expensive to change the interface," Mr. Lanier said.

 

Woven among the day's discussions was a running criticism, often scathing, of the state of modern computer technology as it applies to ordinary people. Dr. Gelernter described current software as "crummy," and modern computers, laden with incomprehensible user manuals and unneeded features, as "more a source of irritation, dissatisfaction and angst than a positive benefit."

 

He was particularly scornful of the ubiquitous organization of information into folders and files, a notion he said belonged in offices of the 1940's. He said he expected it to be replaced by a scheme in which computers receive and catalog information chronologically in a continuous stream.

 

But his colleagues saw little hope for such a change, as attractive as it might be. The reason, Dr. Pollack and Mr. Lanier both said, was that in the evolution of computer systems, as in biological evolution, old and inefficient features often became embedded in the system, like the blind spot in the retina of the eye. Natural selection, they said, does not optimize every aspect of an organism. In short, files now, files forever.

 

Dr. Pollack had come armed with a demonstration of the vicissitudes of evolution in the form of his lab's latest robot, a tinkertoy assemblage of arms and joints. It was the knobby result of a process of evolution that had taken place inside a computer commanded to design a robot that would move, then to simulate the behavior of the robot, tweak the design, simulate the behavior of the next robot, and so on. Another computer was then commanded to build the robot that resulted.
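
In outline, the loop described above is a standard evolutionary algorithm. The sketch below is not Dr. Pollack's actual system; it evolves a made-up "design" (a list of joint parameters) against a stand-in fitness function, just to show the simulate-tweak-repeat structure:

    import random

    def simulate(design: list[float]) -> float:
        """Stand-in for the physics simulator: toy fitness that rewards
        designs whose neighboring joints differ (a crude proxy for motion)."""
        return sum(abs(a - b) for a, b in zip(design, design[1:]))

    def mutate(design: list[float]) -> list[float]:
        """Tweak one joint parameter at random."""
        child = design[:]
        child[random.randrange(len(child))] += random.gauss(0, 0.5)
        return child

    design = [0.0] * 8  # start from an inert design
    best = simulate(design)
    for _ in range(1000):  # evolve: mutate, simulate, keep the better design
        child = mutate(design)
        score = simulate(child)
        if score > best:
            design, best = child, score

    print(f"evolved fitness: {best:.2f}")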

 

Let loose, the battery-powered "tinkerbot" stretched and folded sporadically, like a crab shuffling across the table. Someone asked what a pair of extensions sticking out sideways from the middle of the robot were for. "We don't know," Dr. Pollack answered. Then one of the robot's legs fell off, perhaps illustrating its lack of Darwinian fitness.

 

Extracting a conclusion from this daylong session was an exercise in information and complexity theory itself. Dr. Gelernter expressed dismay at what he called his colleagues' "defeatist" attitude toward current technology. "The idea that we can't change is indefensible," he said. "They know the field has been turned over every five years."

 

Mr. Lanier judged the day as "a notch upward" in his efforts to engage the physicists in a dialogue. Dr. Greene said the most interesting aspect was "how hard it was to define something that seems so obvious."

 

"I'm still trying to figure out what happened," said Dr. Pollack, who described the day as "searching for a vocabulary." He said he been surprised and heartened to find that other people were asking the same questions he was, even if they didn't get the same answers. "I felt like I found a soul mate," he said. ■

First published by The New York Times, August 7, 2001