What Has Happened to Research at Industrial Laboratories?
by Alan B. Fowler
There have been several articles recently seeking to justify the drastic changes taking place at industrial laboratories like AT&T and IBM. The general argument is that existing technologies fill the foreseeable needs, and that industry should focus on incremental research and development tailored to meet ever-greater competition and shorter product cycles. These ideas were espoused at least 10 years ago by Ralph Gomory when he championed "evolutionary rather than revolutionary" research, although he retained curiosity-driven research at IBM. The question is of vital importance to the country's economy, and peripherally to the universities that serve as the training ground for industrial scientists.
Industrial research has always been a tenuous occupation, more so the further one departs from applications. A pattern exists not only for recent events, but also historically. Almost universally, research has been reduced drastically or shut down when the funding corporations have experienced financial difficulties and sought to reduce operating expenses. This is often followed by a carefully reasoned analysis of why research should not be supported. Very seldom after recovery, if it occurs, do companies reinvest in their central research labs, at least partly because corporate management looks upon research and development as an expense rather than an investment.
Industrial laboratories of the 1930s were not so different from those of today, except in size and in the slower pace. Work was generally confined to problems that were central to the product interests of the company, usually on what was considered a short time scale for that period. Central laboratories existed for only a few companies, such as General Electric and Bell. Nonetheless, physicists did work at various industrial jobs. After World War II, when the contributions of scientists, and especially physicists, were more generally recognized, a number of new laboratories were created to take advantage of the inventiveness and problem-solving capabilities of our profession.
Following the invention of the transistor in 1948, almost all large corporations involved in technologically based fields opened a laboratory, hoping to benefit soon from some marvelous invention. They were generally disappointed. In general, the research supported was tightly coupled to product development programs. The transistor was not the result of a few free spirits playing in a sand pile. It resulted from an intelligently directed and company-motivated program, and was the culmination of some 20 years of research that laid the basis for its invention. There was a huge jump in semiconductor research, along with expanded research on other electronic components, such as ferrite microwave devices, but the emphasis was always product-oriented. Laboratories that did not concentrate on problems related to products tended to be short-lived.
Only two laboratories - AT&T and IBM - have supported a large amount of work uncoupled from company goals. Such work enhanced the reputation of the larger organization; it made university relations and recruiting of top-quality scientists easier; and it provided in-house consulting of the highest caliber. Both could afford the investment: as a public utility, AT&T made a fixed return on its investments, and research was considered an investment; IBM was getting an enormous return on the mainframe business it dominated. Both these happy circumstances changed.
In the case of IBM, profits began to fall in the mid-1980s. The corporate and research management had failed to perceive the changes taking place in the computer industry, particularly the loss of the mainframe computer as a cash cow with the rise of PCs. Yet the laboratory continued to concentrate on high-performance technologies, despite the dwindling customer base.
The climax came in the early 1990s. Although IBM as a whole had been drastically downsizing for several years, research was only slightly affected. But in 1993, new management sharply reduced research as a whole, and hard sciences in particular. They could no longer afford the luxury. Many scientists took advantage of a generous corporate early retirement program or left for university posts. Other physicists moved into areas like software, parallel computing, circuit design, and applications of measurement and computer skills to customer problems. In about three years the size of the Yorktown facility's Physical Sciences Department was reduced by half, and the support structure was even more strongly curtailed. Today, almost all work uncoupled from a company goal is being eliminated, although there are several notable exceptions.
What effect has all this had on the research stemming from these laboratories? For those who desire more than anecdotal evidence, publication records are perhaps a better measure than counting the number of people doing industrial research. The number of papers submitted for publication in Physical Review and Physical Review Letters by AT&T/Bell Telephone Laboratories, by IBM, and by a group of other large industrial laboratories has been tabulated for the years 1984, 1989 and 1994. IBM saw a precipitous drop in that time period. Bell declined by a factor of 2, IBM by 2.5, and the other labs queried declined by a factor of 1.5. Xerox and Exxon were basically stable. There was a corresponding decline in the number of invited papers at the annual APS March Meeting.
Many of the reasons commonly cited for shortening the attention span of industrial research should not be accepted in their entirety. For example, there is the argument that we are overloaded with technologies and that most work should be incremental to contend with market competition and 18-month product cycles. Many companies can afford to do little more. Proponents can point to extremely successful companies like Intel, as well as companies in Europe and Asia, that support no long-term research; but these are generally horizontally integrated companies with very narrow interests. (The recent AT&T split-up is a move toward less vertical integration.)
No matter how successful incremental improvements are in the short term, if pursued exclusively they lead to disaster in the long term. There is a danger that if something does not fill the role of the great industrial laboratories, not only the companies themselves, but the entire American economy will be made obsolete by developments from overseas. As much as some companies, many politicians, and many ordinary citizens would welcome a chance to catch their breath after increasingly rapid changes, we have a tiger by the tail.
Another argument against supporting long-term research is that seldom has the company that opens a field benefited from it in a way that is commensurate with the original expense. There are a multitude of cases to support this thesis. In 1955, seven years after the invention of the transistor, its main application was in hearing aids. Basic patents are often worthless, given the 10-15 years required for a revolutionary invention like the transistor or the injection laser to make a real economic impact. It seems unlikely that the life of a patent will be raised beyond the present 20 years; thus, there is little incentive for industry to support research on more than a 10-year scale.
Some industrial scientists have rather weakly suggested that universities should carry out the function of exploring new technologies, as they have done in some areas, most notably high-temperature superconductors and quantum blockade devices. However, in general, universities are poorly constituted to dig into an innovation on the three- to 10-year span. They tend to be very conservative and Byzantine in their politics, and they know little about existing technology or needs. Furthermore, they are not generally adept at the cross-disciplinary efforts required, and their primary responsibility is teaching. A total rethinking of the universities' roles and funding would be required. It is possible that government could pick up the slack, but improbable for many of the same reasons.
Neither of these possibilities is likely to be explored seriously if the present Congress has its way. It has its own, almost Luddite logic. Today's Congress believes that the government should not interfere with the path of business development, and should therefore not try to make technological decisions for industry. This policy ignores the fact that most industry has abandoned the mid-term work, most of which is done to determine feasibility. Industry would still have to decide whether to manufacture.
One does not expect consistency from politicians, and in truth, it is often a very dangerous trait in the breed. They do not seem to apply this doctrine to other business subsidies, nor to agricultural subsidies or research. The present Congress came to power vowing to eliminate all of the $15-100 billion in subsidies and tax loopholes to industry; they have eliminated $1.5 billion, half of which was aimed at programs that support research and development. The fate of NIST is still uncertain.
Thus, the model for most of the American electronics industry has become what is the norm for some of the fastest-growing and most successful companies of recent years: exploit existing technological innovations very well and do not address the future. Presently it is the Japanese who are providing most of the innovation in silicon technology, and who are investing in long-term research in the U.S. and England as well. If they are forced to change direction by economics and competition from other countries feeding off their research, perhaps the Chinese or Koreans will take over. More likely, the rate of development will slow or stop.
In this brave - or not so brave - new world, I hope that the more nearly apposite poem from the 1920s is not, "This is the way the world ends/ Not with a bang but a whimper." Rather, I prefer mehitabel's jaunty and lowercase cry: "there's a dance in the old girl yet/toujours gai, toujours gai."
Alan B. Fowler is an IBM Fellow Emeritus at IBM in Yorktown Heights, New York.
©1995 - 2017, AMERICAN PHYSICAL SOCIETY
The End of Generalists in APS?
Discussion of the demise of the General Meetings of the Society in APS News, October 1995, "APS/AAPT Joint Spring Meeting to Rotate Sites," is long overdue. For early recognition of the facts and a plausible analysis of the cause we should thank a former executive officer of AAPT, Jack M. Wilson. His editorial in the Announcer 19(1), 20 (1989), "The Balkanization of Physics," described with telling accuracy the transformation of APS from a Society united by a devotion to physics to a federation of specialists. Today our disunity is plaintively recognized by sessions on the Unity of Physics.
The change began with the constitutional changes of 1966 which gave specialists the upper hand in the Council of the Society.
The question of the day is whether those effects of constitutional change can be undone by merely rotating the sites of our General Meetings. It is a remedy which has been tried and has failed before. To this observer it is like applying a band-aid to a cancer lesion. It seems reasonable first to diagnose a malady before prescribing a cure, and if my diagnosis is correct, what is called for is a major constitutional change back to where we were 30 years ago, when the General Meetings of the Society were thronged with enthusiastic physicists. Perhaps that is a forlorn hope, but at least it has a sound historical justification, and how we react will tell us much about our courage or lack of it in facing past mistakes.
Restoring the influence of generalists in physics will do much more than merely enhance the interest of our General Meetings. It will boost pride in membership in APS and in physics as a career, and should enlarge much-needed opportunities for employment. The alternative is the continued decline of the tradition of generalism in physics in America.