TechWatch: Peer-to-Peer Computing: Hype or Progress?

Peer-to-peer computing is “hot” — the number of articles recently in the New York Times, The Wall Street Journal, and many of the trade publications is but one indication. The attraction is obvious. The term refers to direct interaction among workstations, PCs, or other smaller devices that enables work to get done. The key is doing this peer to peer — without going through mainframes and servers. Eliminating mainframes and servers reduces single points of failure as well as cost.

Like so many other terms in the technology and business fields, however, once something is “hot,” all sorts of products appear that carry the label, whether they follow the concept or not. Then the term rapidly goes out of favor because it is over-hyped.

In the technology field, we have seen this happen with artificial intelligence and knowledge-based systems. In the business world it has happened to business process re-engineering and e-business. One publication recently reported that an e-label marks a company as mid-90s; the modern Internet-based company carries an i-label. I visited a used bookstore recently that no longer carries business or technology books because “they are obsolete before they go on the shelf.”

In spite of the potential for too much hype before all of the difficulties are solved (certainly true today), peer-to-peer is such a powerful concept that at some point it will revolutionize how computing gets done. It is true in this example as in many others that once the new way of computing is established, a new way of doing business using this paradigm will follow. It is the latter that offers the most potential benefit.

Most computing today has come from the mainframe way of thinking, just as much of business today is managed hierarchically. These two approaches fit together naturally. The death of the mainframe has been predicted prematurely for some time, but even when mainframe computers are gone, mainframe computing will live on. Most businesses rely far less on the mainframe computer today, yet they are moving their large servers into the “glass house,” managing the nerve center of computing in a centralized way.

The big idea behind peer-to-peer is eliminating the centralized way of thinking. A similar transition took place in the industrial era, and will happen in the knowledge era as well.

The industrial revolution had its own centralized concept. When power came from the large water wheel, the tools driven by that source of power had to be linked to it through a series of belts. When the first motors came to the factory, they too were large centralized devices that drove all the machines in the factory through a series of belts. Early in the 1900s Sears Roebuck offered a home motor that could drive many of your home appliances. It took a while, but ultimately people recognized that what the motor did was more important than the motor itself. Motors became small and integrated into appliances, and the emphasis on the motor went away.

The analogy with motors is a good one, but just an analogy. There is a danger of oversimplification here. When motors were distributed throughout the factory as part of individual tools, the only thing they needed in common was a power source. Distributing information is more complex, since the distributed devices still need to communicate with each other. It is for this reason that fully distributed computing, without the centralized concept, has still not materialized except in very focused circumstances.

In a real sense, the power of the computer comes when the computer itself “disappears.” The focus belongs on the generation, management, and distribution of information rather than on the computers and telecommunication that make it happen. Peer-to-peer computing promises the ultimate fulfillment of this dream. Computers themselves no longer are the focus. Rather, almost everything (clothing, refrigerators, machines, offices, people) contains computer chips, and work gets done through their communication with each other.

An intriguing example of one form of peer-to-peer computing is using the collection of resources over the network to solve very large-scale computational or information management problems. These sometimes fit under the concept of “grid computing,” since the grid of telecommunications and computers is used as one large computer.

This idea is attractive for two reasons. First, there are idle computer cycles all over the typical company. Particularly at night when most PCs are doing nothing, capturing these excess cycles to create the world’s largest computer (without spending additional capital) sounds like a “free lunch.” This idea can be extended outside the walls of the company using the Internet itself as the computer. In addition to being free, this “Internet computer” is more powerful than any computer that could be purchased at any price.

The idea is not without its difficulties, however. The coordination of tasks using many resources, proper handling of resources that go off-line for any reason, distribution of data, and security are but a few of the issues that need to be addressed. Several companies (including Legion and Centrata) are providing tools to make this task easier.

The search for extraterrestrial intelligence has popularized this notion. Users donate idle time on their computers so that someone else can “use the computer” to analyze signals potentially from outer space. While this approach fits within peer-to-peer computing, it is still hierarchical in nature. There is a single problem that needs to be solved. Someone is in charge. Resources from many places are used to solve the single large problem.
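The hierarchical structure of this model can be made concrete with a small sketch. A minimal master-worker pattern, with illustrative names (the work units and the `analyze` function are stand-ins, not drawn from any real signal-analysis system): one coordinator owns the problem, hands out pieces, and combines the answers — resources are distributed, but authority is not.

```python
from queue import Queue
from threading import Thread

def analyze(work_unit):
    """Stand-in for real signal analysis: here, just sum the samples."""
    return sum(work_unit)

def worker(tasks, results):
    # Each volunteer machine pulls a unit, computes, and reports back
    # to the single coordinator.
    while True:
        unit = tasks.get()
        if unit is None:              # sentinel: no more work
            tasks.task_done()
            break
        results.put(analyze(unit))
        tasks.task_done()

tasks, results = Queue(), Queue()
work_units = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]   # the one large problem, split up

threads = [Thread(target=worker, args=(tasks, results)) for _ in range(3)]
for t in threads:
    t.start()
for unit in work_units:
    tasks.put(unit)
for _ in threads:
    tasks.put(None)                   # one stop signal per worker
tasks.join()

# The coordinator, not the peers, assembles the final answer.
total = sum(results.get() for _ in work_units)
print(total)   # 45
```

However many machines participate, the shape is the same: work flows down from one place and answers flow back up to it, which is why this style of peer-to-peer remains hierarchical.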

Another peer-to-peer model enables file sharing over the network. Napster has received the most publicity because of its legal battles over sharing music without compensation, but its approach demonstrates another form of peer-to-peer. Other Napster-like efforts are also wrestling with economic realities, but the technology is being proven in this area.

There is another set of problems not addressed by the previous description. Here, people may want to work together to solve one or many problems, where authority does not rest in a single person. Doing so reliably and securely adds to the challenge.

To take a simple example, suppose 22 people wanted to participate in an electronic football game, where each player in the game represented a single player on the field. A mainframe solution for this problem would have all of the action taking place on the centralized server, with messages going back and forth between each player and the server. A peer-to-peer idea would allow each player to use his or her own computer, with all of the updates passed between the various players.
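The football example can be sketched in a few lines. This is a toy model, with illustrative class and method names: each player's machine keeps its own copy of the game state and sends position updates directly to every other peer, so no central server holds the “real” state.

```python
class Peer:
    """One player's machine in a serverless game."""

    def __init__(self, player_id):
        self.player_id = player_id
        self.peers = []     # direct links to the other players
        self.state = {}     # this machine's local copy of everyone's position

    def connect(self, other):
        self.peers.append(other)
        other.peers.append(self)

    def move(self, position):
        # Update locally, then tell every peer directly -- these are the
        # messages that would otherwise have gone through a server.
        self.state[self.player_id] = position
        for peer in self.peers:
            peer.receive(self.player_id, position)

    def receive(self, player_id, position):
        self.state[player_id] = position

# Three of the 22 players, fully connected to one another.
a, b, c = Peer("A"), Peer("B"), Peer("C")
a.connect(b); a.connect(c); b.connect(c)

a.move((10, 40))
b.move((35, 12))

# Every machine now holds the same picture of the field, with no server.
print(c.state)
```

The trade-off is visible even in this sketch: with 22 players, every move means 21 direct messages, and keeping all the copies consistent when messages are lost or delayed is exactly the kind of problem mainframes and servers solved long ago.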

Similarly, suppose a large manufacturing company and its suppliers are working together on the design of a truck. Different participants have different roles in the design process, and there are many smaller, impromptu meetings called by various subgroups to collaborate on the design. Again, a mainframe solution would put all of the real work on the server, while a peer-to-peer solution would take advantage of the distributed devices, sharing updates through multiple communications paths, eliminating both the need for the server and the single point of failure.

Mainframes and servers have had years to mature all of the technology to assure accurate, reliable, timely solutions which are robust in the presence of multiple users trying to use the same data in different ways.

Peer-to-peer tools have not yet reached this same maturity, even as they promise to solve much larger problems with fewer resources. This powerful concept has the potential to change both the computing world and the way work gets done. How quickly will this happen? Not as quickly as peer-to-peer computing will become over-hyped and used so broadly that it loses its meaning.

When successful peer-to-peer computing comes, it will change not only the cost and management of the computing system, but the nature of work itself. In a true peer-to-peer system, collaboration is less dependent on a hierarchical management structure. Many of the typical management functions will go the way of mainframes and servers.


Al Erisman is executive editor of Ethix, which he co-founded in 1998.
He spent 32 years at The Boeing Company, the last 11 as director of technology.
He was selected as a senior technical fellow of The Boeing Company in 1990,
and received his Ph.D. in applied mathematics from Iowa State University.