Why is it so easy to forecast the relentless pace of information technology at the “Moore’s Law” level, and so difficult to forecast the way that business will use technology? Moore’s law is the well-known forecast made in 1965 by Gordon Moore, later a co-founder of Intel, who predicted that the capability of a chip would continue to double every eighteen months, giving rise to the doubling of computer power at the same price every eighteen months or so.
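To get a feel for what that doubling rate implies, here is a minimal sketch (illustrative only; the eighteen-month figure is the popular formulation of the law, and the numbers are growth factors, not actual chip specifications):

```python
# Illustrative only: the compound growth implied by doubling every 18 months.
def growth_factor(years: float, doubling_period: float = 1.5) -> float:
    """Multiplicative gain after `years` of steady doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for years in (3, 9, 15, 30):
    print(f"{years:2d} years -> about {growth_factor(years):,.0f}x the capability")
```

Thirty years at that rate compounds to roughly a millionfold gain, which is part of why the downstream consequences are so hard to imagine even when the rate itself is easy to forecast.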
I got some insight into this recently while listening to Seth Lloyd, an MIT researcher, speaking at a conference on Complexity in Seattle. He was actually talking about quantum-mechanical computers, which he has written about in Scientific American (October 1995, pp. 140-145). The details of quantum-mechanical computers based on the manipulation of atoms are best left to him; interested readers can review his paper.
What caught my attention was his forecast: quantum-mechanical computing will become real in about 2015-2025, in spite of significant technical barriers that exist today. The basis? This is the timeframe when Moore’s law runs out, based on the physics of creating ever more capable silicon chips.
Ingenuity
It is important to understand that Moore’s law is not a physical law, nor a moral law. Rather, it is a statement about ingenuity, which has been remarkably consistent over this period, and Lloyd’s forecast rests on his belief that this ingenuity will continue. Whether that belief is valid, I don’t know. From the early fifties until about 2010 or so (the next few years are fairly well mapped out), performance has tracked this prediction so consistently that this area of technology forecasting is a very boring subject indeed.
If things are this stable in the basic technology level, why was everyone surprised by the Web? And what about the forecasts on how business will use computing and telecommunications to transform not only how things get done, but what gets done? Who was talking about the “dot com” phenomenon five years ago? Through this period, forecasts of what tools will emerge from the underlying technology revolution, and how businesses will use them, have been as surprising as the technology forecasts have been boring.
The reasons for this difference can be found in the motivation for change in the three areas. At the base technology level, there is both a scientific and a business drive to continue the innovation. The situation is quite different at the tools and business applications levels.
In developing tools to exploit the technology, there are many more directions the work can take. Further, there is often resistance in the marketplace to new tools until users can figure out their value. An interesting case study of a similar phenomenon is the VCR, which was developed in the 1950s but didn’t find its market niche until almost 1980. Wearable computers and combination telephone/computer/video/TV devices are technically feasible, but have not caught on the way some thought they would.
It takes a great deal of imagination to see the true potential of a new technology when it emerges, and to reject ideas that are possible but poor. Witness some of these early predictions:
“I think there is a world market for maybe five computers.”
– Thomas Watson, chairman of IBM, 1943
“There is no reason anyone would want a computer in their home.”
– Ken Olsen, chairman and founder, Digital Equipment Corporation, 1977
Predictions about the way technology tools will be applied to business are even more difficult. It is not surprising that predictions for this area have failed as well:
“I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won’t last out the year.”
– Editor for business books, Prentice Hall, 1957
“The Internet has tremendous potential, but it is important that expectations aren’t cranked too high. Fewer than 10% of PC users subscribe to online service. And attrition rate is very high — many subscribers drop off after less than a year.”
– Bill Gates, chairman, Microsoft
The Road Ahead, 1995
Predictions
These forecasts are shared not to poke fun at the forecasters, who are very bright people, but to underscore how difficult predictions of business change are. There are many reasons for this difficulty.
First, there is a natural tendency to use new technology to accelerate what is already being done. A change that simply automates old methods and practices almost always results in minimal gain and higher costs. Going through this phase seems to be a necessary detour: avoiding it requires that a business understand the potential of the new technology before beginning the transformation process, and that is not an easily accepted idea.
Second, it may take years for a large company to redesign or develop a large information system. During this time, the capability of the technology will change significantly. Few companies would want to change their design mid-stream, and doing so may mean never benefiting from the results. Yet when a project is finally finished, it is very difficult to begin again based on the capability of newer technology and tools.
Third, in a large company the complex processes may be fully understood only by those managing them. But those managers are often measured on preserving stability and predictability. Worse, the technology may eliminate a process altogether, which is risky not only for the process but for the individual who manages it.
Fourth, the scientific and business pressures that drive Moore’s law at the technology level are often lacking at the tool level, and at the business level there are forces actively fighting change. Competition can create the need for change, but even here it is not always obvious how a competitor is producing a higher-quality product at lower cost.
For all of these reasons, the predictable change in the base technology is difficult to translate to predictions about technology products, and even more difficult to translate to transformations in the business.
One thing should be evident. Consider the changes in business between 1990 and 2000 that were based on technology: the emergence of the Web, the “dot com” phenomenon, and the change in the speed of business were all enabled by the underlying technology change. I also believe we are still at the front end of the business change brought about by the Web, not the back end. We can expect much more change for businesses to come from what the Web makes possible. Some suggestions along these lines can be found in The Cluetrain Manifesto, reviewed on page 8 of this issue.
Beyond this, exactly how business will be transformed between now and 2010 is tough to predict, but that it will be transformed is easy to predict. We can expect something as radical as the Web to come from the underlying changes in technology over the next ten to fifteen years. We can expect some changes at the business level that we cannot envision today.
This suggests that we need to get a great deal better at two things, at least. The first is designing robust systems that can survive the unpredictable change we know will come. This is an important area of research now, and one that deserves even more attention; the old systems were difficult and costly to change, as we found out when dealing with Y2K issues. This is one of the findings of a National Academy research report issued in May 2000, which identifies research directions for information technology for the next decade.
The second is that businesses must know what needs to change and what does not. When everything is changing at a rapid pace, there is little time to consider this in the middle of a project. This, in part, was the message of Built to Last by Jim Collins and Jerry Porras. If that message was important in the mid-1990s, it is even more important today.
Al Erisman is executive editor of Ethix, which he co-founded in 1998.
He spent 32 years at The Boeing Company, the last 11 as director of technology.
He was selected as a senior technical fellow of The Boeing Company in 1990,
and received his Ph.D. in applied mathematics from Iowa State University.