Physics and the Information Revolution
By Joel Birnbaum
In the fourth century B.C., Pythias was condemned to death by Dionysius, the tyrant of Syracuse, but obtained leave to go home to arrange his affairs after his friend Damon had agreed to take his place and be executed should Pythias not return. Pythias returned in time to save Damon, and Dionysius was so struck with this honorable friendship that he released both of them.
The decades-old friendship of computer technology and physics has also been an honorable one, and it, too, has produced benevolent results. Modern experimental and theoretical physics depend on computing, a debt repaid many times over by physicists' fundamental contributions to hardware, software, and systems technologies.
For many years now, I have dreamed of the day when computers would become a pervasive technology, part of everyday life for most people, and more noticeable by their absence than by their presence. Electric motors are a good example. The average American home contains two dozen or more electric motors, buried in consumer appliances such as vacuum cleaners, electric toothbrushes, washing machines, and VCRs. In the next generation, the same will be true for computers, most of which will be embedded in information appliances: enormously powerful because their parallel architectures will be tailored to particular tasks, and inexpensive because of huge production volumes. Just as most electrical consumer appliances depend upon the availability of a ubiquitous electric power utility, most information appliances will derive their computational power from a digital information utility.
However, for any technology to become truly pervasive it must transcend being merely manufacturable and commonplace. It must become intuitively accessible to ordinary people, and it must deliver sufficient value to justify the large investment needed to create the supporting infrastructure.
For many years I despaired of this dream ever coming true for computers, because of the political and economic bickering that precluded the creation of a standard that would allow systems to be interconnected easily and freely, and because of the needless, shortsighted profusion of proprietary, arbitrarily different, complex user interfaces. Isaac Newton was once asked how he had achieved his great accomplishments. He answered modestly that if he had seen far, it was because he had stood on the shoulders of giants. In the computing industry, we have mostly stood on each other's feet.
Web and Mosaic
And then, just a few years ago, a miracle occurred: Both problems were solved by the physics community, long the most sophisticated users of computer technology. Out of CERN came the World Wide Web, based on the quarter-century-old Internet, but with benefits so profound that a de facto connectivity standard seemed to emerge overnight and to spread like wildfire. Soon thereafter, the disarmingly simple Mosaic point-and-click browser technology was invented at NCSA (National Center for Supercomputing Applications), clearing the way for truly intuitive access, and suddenly the basis for a true global information infrastructure was born. Many now believe that, taken together, these two creations, spawned by the needs of physics users, will rank among the most important developments in the history of civilization, and that all aspects of how we work, learn and live will be forever changed by them.
By the end of the next decade, a new generation of information appliances will have emerged. They will be much more intuitive to use than today's small, mobile, general-purpose computers. Dedicated to a particular task, they will be named by that task just as consumer appliances are: users will think of them in terms of what they do, not how they do it. We expect appliances to evolve to hide their own complexity, just as the one-button, automatically-tuned television set of today has replaced its less complex, but harder-to-adjust ancestor. Because of continued advances in semiconductors and software, the information appliances of tomorrow will be able to do the same. Many of the most interesting appliances will include sensors and communications capabilities, and soon whole families of appliances will communicate directly, some wirelessly. As network bandwidth increases and becomes far less expensive, the cost of appliances will drop sharply, since much of the computation to support both the multimedia human interface and the application will reside on the information utility.
However, while wonderfully useful, the Internet and Web today are a far cry from an information utility, which must have the characteristics common to all utilities. We should notice a utility only when it fails. It must be secure, reliable, ubiquitous, and the standards for its use must endure. It must be perceived to have a value great enough to justify the huge investments required.
We know that utilities catalyze new industries. They are also invariably lucrative; the information infrastructure has already produced new Internet-based companies that defy all conventional economic logic. These are just early precursors of new entities for electronic commerce, communications, electronic publishing, and Internet medicine, to name just a few of the industries that are already being created or transformed. The notion of an information infrastructure built upon the standards already set for the Web, but extending them to improve robustness, performance, manageability, and security as the systems scale to huge numbers of users, is at the heart of most industrial efforts today.
The fundamental characteristic of an information utility is that it transforms computing from a capital investment to a competitive service, with costs amortized over many users and paid for by usage. We could provide electricity for our homes by buying a generator, but most of us prefer to subscribe to a service. For many users, the same will be true for computing. I think of the utility as a natural evolution of open computing which will enable a web of electronic services to be built by composition of existing and new services. It will do for computation and services what the Web did for data. An HP Labs prototype of an information utility has been running for some time; it has all of the attributes mentioned, and is now being scaled to a large number of users. The evolution of the information utility will be an industry phenomenon driven initially by the acceptance of the Internet as a surrogate for the enterprise backbone and the economic promise of electronic commerce, telephony and entertainment.
Gordon Moore of Intel was the first to quantify the improvement in gate density when he noticed that the number of transistors on a chip increased exponentially with time. Over the past 24 years, that exponential growth rate has corresponded to a factor-of-four increase in the number of bits that can be stored on a memory chip with each device generation, about every 3.4 years, for a total increase of roughly 16,000 times. This exponential growth in chip functionality is closely tied to the exponential growth of the chip market, which has been approximately doubling every five years.
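The scaling quoted above can be checked with simple arithmetic. The sketch below uses only the numbers given in the text (a factor of four per device generation, one generation every 3.4 years, sustained over 24 years); the variable names are illustrative:

```python
# Back-of-the-envelope check of the Moore's Law figures quoted in the text.
years = 24
years_per_generation = 3.4
factor_per_generation = 4

# About seven device generations fit in 24 years.
generations = round(years / years_per_generation)

# Compounding a factor of 4 over those generations gives the total gain.
total_factor = factor_per_generation ** generations

print(generations)    # 7
print(total_factor)   # 16384, i.e. roughly the "16,000 times" in the text
```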
At the present time, there are two recognized factors that could bring Moore's Law scaling to an end. The first is economic. The cost of building fabrication facilities to manufacture chips has also been increasing exponentially, about a factor of two every chip generation. Thus, the cost of manufacturing chips is increasing significantly faster than the market is expanding, and at some point a saturation effect should slow the exponential growth to yield a classic s-curve for expanding populations.
The second factor threatening Moore's Law is that the engine that has brought us to this point, the CMOS (Complementary Metal Oxide Semiconductor) field-effect transistor, can only get us part of the way to where we want to go. The Semiconductor Industry Association has established a National Technology Roadmap that sets as a goal the continuation of the current exponential increases in capacity and performance through the year 2010. That projection calls for chips that are 256 times more capable than current designs with no increase in power dissipation. If this goal is attained, then the silicon-based integrated circuit will have accomplished a more than six-order-of-magnitude performance improvement, using energy as a metric, with a single manufacturing paradigm. Compared to the advances experienced in most human endeavors, that increase is extraordinary.
If we are to have any hope of sustaining the benefits that Moore's Law has provided to the national economy, we have no choice but to develop quantum switches and the means to interconnect them. Fundamental limits are now becoming an essential issue. It does not make sense to make the enormous investments in research, development and manufacturing that will be required to replace semiconductor switches by the year 2010 if the new technology can offer only marginally better performance. Achieving comparable advances in the future will require a totally different type of computational machinery. The requirement for inventing a new technology paradigm, coupled with the economic rewards that would follow from such a development, has created exciting research opportunities for mathematicians, physicists, physical chemists, and scientists of many disciplines, as well as for computer technologists. In fact, much of the current interest in interdisciplinary research areas such as nanofabrication, self-assembly and molecular electronics is being driven by this search for a new computing archetype.
A common theme that underlies many schemes is the push to fabricate logic devices on the nanometer-length scale, which will therefore be dominated by quantum mechanical effects. An additional huge increase in performance could result from reversible machines executing what has come to be known as quantum logic; in principle, very clever algorithms could exploit the inherent parallelism of the superposition of quantum states. If we could solve knotty problems of decoherence, programming and input/output, to name a few, quantum logic would enable the solution of some classes of computationally intractable problems, such as factorization and search, which are important in cryptography and Fourier analysis. For some applications the reversibility and inherent parallel nature of quantum logic represent a leap far beyond what ideal nonreversible computing can offer, perhaps by still another nine orders of magnitude or more.
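The scale of the parallelism described above is easy to quantify: a register of n qubits can exist in a superposition of 2^n basis states at once. The sketch below illustrates the arithmetic only; the choice of n = 30 is an illustrative assumption, picked because 2^30 corresponds to the "nine orders of magnitude" mentioned in the text:

```python
import math

# A register of n qubits can hold a superposition of 2**n basis states;
# this is the "inherent parallelism" of quantum logic described above.
def superposed_states(n_qubits: int) -> int:
    return 2 ** n_qubits

n = 30  # illustrative choice, not from the original text
states = superposed_states(n)

print(states)                      # 1073741824
print(round(math.log10(states)))  # 9, i.e. about nine orders of magnitude
```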
Quantum logic is a fascinating prospect, but it does not seem likely to me that this can become a reality in any widespread practical sense before 2025, and many are less optimistic than that. In any case, barring some currently unimagined breakthrough, it is even more unlikely that an entire system would be built this way. However, there are tremendous advances possible for computing, even if quantum logic never becomes a reality. A physicist's workstation of the future may well run Windows 17 on a Decium, with lots of RAM, but with a reconfigurable, application-specific quantum-switch-based, supercomputer attached.
Winston Churchill observed that the further backward you can look, the farther forward you are likely to see. It is possible that history is about to repeat itself, with the introduction of a new disruptive technology for computation in the 21st Century. Today, we have the silicon FET, but we speculate that a quantum-state switch could be better. A large number of laboratories are now engaged in basic research in the fabrication of materials into arbitrary shapes and sizes, and are searching for the device concept that will lead to a disruptive new technology.
Breakthroughs will require significant advances in the understanding of fundamental issues, and will undoubtedly act as the foundation for new mathematical and scientific disciplines; those companies that convert the breakthroughs to a new, manufacturable technology will be the survivors of the quantum age of information processing. It is a noble quest, and we computer technologists are being held hostage by the laws of physics. We can only hope that once again physicists, just like Pythias, will arrive in time to save the day.
Joel Birnbaum is Chief Scientist at Hewlett Packard. The above text was excerpted from his plenary speech at the APS Centennial Meeting in Atlanta, GA. The full version can be found online at http://www.hpl.hp.com/speeches/birnbaum_aps.cfm.
©1995 - 2016, AMERICAN PHYSICAL SOCIETY
APS encourages the redistribution of the materials included in this newspaper provided that attribution to the source is noted and the materials are not truncated or changed.
Editor: Barrett H. Ripin
Associate Editor: Jennifer Ouellette