Introduction
Information has always been recognized as central to business. Financial and marketing information is at the heart of business decisions, regardless of the business. Some businesses, such as an airline with its reservation system or an insurance company, have information at the core of their strategy. Information is the key to product improvement for heavy manufacturing businesses. Information is often used by a company to control the way it deals with its employees and customers. Some businesses have information as their product (newspapers, product evaluation services, travel agents). Information also plays a key role in education.
With this pervasive role for information, it is not surprising that the radical changes in capability brought by information technology (defined as the combination of computing, telecommunications, and software that makes up the computing system) have had a profound effect on business at the end of the 20th century.
In the last decade or so, the role of information technology in business has become a dominant issue. Evidence is all around us: the growing number of books on the shelves of today’s mega bookstores, the role of technology stocks in our economy, spending by major corporations, the emergence of Chief Technology or Chief Information Officers at many companies, and the occupations of those on lists of the wealthiest people.
If information has always been important to business, what has happened lately to cause this change in the perception of information technology? Is there systemic change happening in business, or simply a lot of hype? And will this change continue in the same way in the near future?
The first question is easier to answer than the others, but I will comment on all three.
Information Technology
Capability in information technology is not a recent phenomenon. In 1965, Gordon Moore predicted that the capacity of the computer chip would double roughly every 18 months. The remarkable thing is that chip capacity has followed or exceeded this trend to this day. This translates to a more-than-doubling of the price performance of computers every 18 months over that period, and into the foreseeable future. Even before 1965, computers had begun to leave specialty labs and become an integral part of business. So what has happened recently?
Perhaps it is simply the reality of the old fable about the king who promised a victorious knight either half his kingdom or a kernel of wheat on the first square of a checkerboard, two on the second, four on the third, and so on, doubling with each square. The total across all 64 squares would be more than all of the wheat in the world. Doubling has a profound effect. Perhaps we have gone through enough doubling phases to realize we have a lot of wheat on the board!
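To make the fable concrete, here is a minimal back-of-the-envelope sketch of the arithmetic; the kernels-per-kilogram figure is an assumed round number for illustration, not something from the fable itself.

```python
# The checkerboard fable: 1 kernel on the first square, doubling on each
# of the 64 squares, so the total is 2**64 - 1 kernels.
total_kernels = sum(2 ** square for square in range(64))

# Assumption for illustration: roughly 25,000 wheat kernels per kilogram.
KERNELS_PER_KG = 25_000
total_tonnes = total_kernels / KERNELS_PER_KG / 1_000

print(f"{total_kernels:,} kernels")                   # 18,446,744,073,709,551,615
print(f"roughly {total_tonnes:.1e} tonnes of wheat")  # on the order of 7e+11
```

That works out to hundreds of billions of tonnes, while the world’s annual wheat harvest is measured in hundreds of millions of tonnes, so the board does indeed hold far more than all of the wheat in the world.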
But it is not just this. It is what people have done with these ever more powerful computer chips that matters. In 1965, only a few specialized people could use a computer, and its cost put it out of reach for most businesses. This doubling has led to things like today’s $1500 PC having similar capability to a $15 million Cray supercomputer from the early 1980s. This makes computers broadly available. Much of the increase in the power of computers has been used to improve graphics and user interfaces. This makes computers broadly accessible.
Another reason for the recent attention is the emergence of the Web. It was made possible by the capability of accessible and available computers, combined with increasingly capable telecommunications networks. The Web has generated interest, and possibilities for accessing information, for everyone from the school child to the CEO. Even the remote Havasupai tribe in Arizona has a Web page, though its village has no road access!
Systemic Business Change?
But what does this mean for the way we work? Having the capability without the business requirement usually results in cost rather than value for business. In 1990, Paul Strassmann came out with a book entitled The Business Value of Computers, in which he produced a rather fascinating graph plotting spending on computing against profitability. It was a pure scatter chart, with no trend at all. Using computing is not enough.
A key change came with the relatively recent recognition that computing doesn’t simply speed up what used to be done; it enables new things. Businesses started reevaluating their processes, or even recognizing that they had processes, inspired by Japanese business practices and by books such as Process Innovation and Reengineering the Corporation. Much of this has happened in the last decade.
Some of this process change actually led to undoing things that had been computerized. For example, tracking the progress of a part through a factory was often done with very complex computing implementations. Reworking the process into a work cell meant that the computer was not needed because you could see where the part was!
Aside from these poor implementations of the past, however, new ways of working arose from the ability to access information worldwide. For example, tracking a package in an overnight delivery service fundamentally changed the service. Automated teller machines did not shorten the lines in the bank (though that was their original intention), but they opened up the possibility of 24-hour banking.
On the other side, productivity figures for white-collar workers continue to stagnate. Could this be because we are measuring the wrong things in assessing productivity? The question is raised because wages are increasing, the job market is tightening, and inflation is flat. Perhaps these measures need to be reengineered as well!
Where from Here?
Information technology shows no signs of slowing its pace. Business process reengineering is accelerating, driven by the forces of competition and globalization. Does this mean that everything will get better and better? That depends in large part on the measure of goodness and the sustainability of the pace.
A dark cloud is the pressure people experience on the job, resulting in growing dissatisfaction with work. This is more than a reaction to downsizing and business pressure. Another issue, for example, is the way technology is often used to “automate” the more obvious and creative tasks (in computer-aided design, for example), causing the less interesting parts of the work to become a growing percentage of what the person does.
Perhaps part of this is because people have been left out of the process of reengineering. It is not that people are not involved in reengineering, but that they are rarely considered an integral part of the business system. What are the people good at? What do they enjoy doing? Can the process be reworked to take advantage of these capabilities, and to create attractive systems that people want to participate in, leading to higher productivity?
Looking back in history, the worker was most certainly the forgotten factor in the auto assembly line. People are not machines, and assembly lines do not take advantage of their creative capabilities.
Wouldn’t it be interesting to see how information technology and new processes might be developed to support an engaging, creative job for the engineer? Would this improve productivity and the overall process, leading to competitiveness? The same concept could be applied to work in the factory, the office, or the school.
Al Erisman is executive editor of Ethix, which he co-founded in 1998. He spent 32 years at The Boeing Company, the last 11 as director of technology. He was selected as a senior technical fellow of The Boeing Company in 1990, and received his Ph.D. in applied mathematics from Iowa State University.