TechWatch: Clearing the Technology Fog

In the late ’90s, we reached the peak of hype around technology. When the “dot-coms” faded, some dismissed technology as being unimportant in business. Nicholas Carr put the exclamation mark on this way of thinking with his book Does IT Matter?

Yet technology continues to be at the center of overstatement and understatement. This was brought home by two articles that colleagues sent to me in the past month. The first focused on the dangers of new technologies, and the second trumpeted the near-term demise of the Internet because of some supposed breakthrough technologies. In both cases, the premises are expressed in popular, easily understood language, but the details of the technology make an obvious point much less obvious, even wrong.

Dangerous Technologies

The first article was from eWeek by Jim Rapoza titled, “The Most Dangerous New Technologies,” published March 18, 2008. The opening statement lays out the premise:

“Cutting-edge technologies offer the hope of a better world, potentially bringing welcome solutions to everything from disease to environmental damage. But these same technologies can also bring danger, by aiding criminals and terrorists, invading personal privacy, and even potentially creating diseases and damaging the environment.”

As a broad and general statement, this is true. But it does not apply just to new, cutting-edge technologies. All technology offers both the promise of something better and the potential for harm. Neil Postman in Technopoly even points out the downsides of the railroad, the telephone, and the printing press. These created enormous new opportunities, while at the same time leading to the fragmentation of families and societies as we became more mobile, and the loss of memory as we became dependent on the written page to tell our stories. This two-sided nature of technology was also the theme of Edward Tenner’s book, Why Things Bite Back.

Rapoza identifies these technologies for his very brief comments:

  1. Nanotechnology
  2. RFID (radio frequency identification)
  3. Artificial intelligence
  4. Mobile payment
  5. Digital identities
  6. Web OS
  7. GPS
  8. Networked computerized cars
  9. Anonymity tools
  10. Cybernetic implants

There are, of course, upsides and downsides in each area. Nanotechnology is indeed in its infancy. The article on pp. 18–19 provides a more thoughtful discussion of how one group sees this technology, and how we should be cautious in its implementation.

Artificial intelligence (AI), on the other hand, is very mature. The author’s statement, “When AI finally does start to work, it’s a good bet that people will use it in areas where the smartest and safest course would be to stick with human intuition,” indicates a complete lack of understanding of AI.

He implies that AI has not yet started to work, but AI tools have already become a part of our everyday life. We have the search capability we use through Google and other search engines, the voice recognition systems we encounter when calling the airlines, and the spelling and grammar checkers built into our word processors. The computer can play competitive chess, warn us when we are backing a car too close to a tree, and build cars in our manufacturing plants through robotics. Robotic surgery, reasoning diagnostic systems, and advances in pharmaceutical automation all can reduce one kind of risk while introducing another kind of risk that needs to be managed.

So, yes, like every technology, old and new, there will be upsides and downsides. The title, “The Most Dangerous New Technologies,” is pure hype.

The End of the Internet

The second article I received this month was titled, “‘The Grid’ Could Soon Make the Internet Obsolete.” It was published by The Times on April 7, 2008. No author was identified with the story. Here is the opening paragraph:

“The Internet could soon be made obsolete. The scientists who pioneered it have now built a lightning-fast replacement capable of downloading entire feature films within seconds. At speeds about 10,000 times faster than a typical broadband connection, “the Grid” will be able to send the entire Rolling Stones back catalogue from Britain to Japan in less than two seconds.”

The article went on to highlight the need for moving large amounts of data based on experiments at CERN, the Swiss research center. The thrust of the article was that scientists (some of whom were involved in starting the present-day Internet) were motivated to solve the problem of moving large amounts of experimental data between laboratories, including one in England, and that this work would soon benefit the general public with a faster Internet.
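To be fair, the arithmetic in the quoted claim does hold together on its own terms. A quick back-of-envelope sketch (the broadband speed and catalogue size here are my assumptions for illustration, not figures from the article):

```python
# Sanity check of the article's "less than two seconds" claim.
# Both input figures are assumptions, chosen only for illustration.
broadband_bps = 5e6                  # assumed typical 2008 broadband: 5 Mbit/s
grid_bps = 10_000 * broadband_bps    # "10,000 times faster," per the article
catalogue_bits = 4 * 8e9             # assumed ~4 GB back catalogue, in bits

seconds = catalogue_bits / grid_bps
print(f"Transfer time: {seconds:.2f} s")  # well under the claimed two seconds
```

At these assumed figures the transfer takes well under a second, so the claim is internally consistent; the problems lie elsewhere.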

There are two problems with this story.

First, if it were true that a replacement for the Internet could run 10,000 times faster, its implications would be about much more than speed. It would fundamentally change whole industries. Downloading digital content would eliminate the need for CDs and DVDs and the industry surrounding them. It is a useful exercise to ask what would happen when some portion (in this case the network) of an IT system changes dramatically. The article barely scratched the surface.

But, alas, the second problem is more serious: the article is just plain wrong. Yes, there are point-to-point networks for scientific use that will enable much faster transmission of data, but the general availability of this technology to each home or business remains a dream, perhaps a far-off dream. The obstacles are the “last mile” problem of linking homes and businesses, and the issue of cost.
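The last-mile point can be made concrete: end-to-end throughput is governed by the slowest link in the chain, so a blazing research backbone does nothing for a home still served by a slow local connection. A minimal sketch (all link speeds below are hypothetical figures, not from the article):

```python
# End-to-end throughput is limited by the slowest link in the path.
# All figures are hypothetical, chosen only to illustrate the bottleneck.
links_bps = {
    "research backbone (lambda link)": 50e9,  # 50 Gbit/s dedicated fiber
    "ISP aggregation network": 1e9,           # 1 Gbit/s
    "last mile (home DSL)": 5e6,              # 5 Mbit/s
}

bottleneck_link = min(links_bps, key=links_bps.get)
effective_bps = links_bps[bottleneck_link]
print(f"Effective rate: {effective_bps / 1e6:.0f} Mbit/s, "
      f"limited by the {bottleneck_link}")
```

No matter how fast the backbone grows, the home user sees only the last-mile rate.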

Scientific Perspectives

I asked several noted scientists to wade in on the article. Here are some of their insights:

John Reid, scientist at the Rutherford Laboratories in England (connected to CERN) said,

“Yes, we do have a huge cluster in this building that has a high-speed link to CERN. [But the article] looks like a lot of speculation to me.”

Ken Neves, chief information advisor for Lawrence Livermore National Laboratory in California, said,

“This sounds like enormous hype. First of all the Grid offers nothing in the way of connectivity, unless the Grid complex used bypasses the ad hoc Internet “star connectivity.” If CERN exploited “lambda” multiplexing of dedicated fiber channels, a huge advantage could be gained. BUT that requires a network; Grids do not replace networks. Our physics friends have always had dedicated high-bandwidth connections to the major world labs including CERN. The article suggests a special network to do the job of moving massive amounts of data. How would this scale to replace the Internet for the world community? Very bizarre article.”

Ed Lazowska, the Bill and Melinda Gates chair of computer science at the University of Washington, and the subject of the Ethix Conversation in January/February 2003, responded:

“This is a preposterous piece of PR from CERN.

Beginning with the first paragraph, the people who invented the Internet have nothing to do with the Grid, and CERN had nothing to do with inventing the Internet. As is correctly stated in the third paragraph, one specific individual at CERN did indeed pioneer the Web — that’s Tim Berners-Lee, who immediately left CERN for MIT, where he has been for more than a decade. CERN is the European nuclear-physics research center, which is home to a major international collaboration that will come online soon.

Second, the Grid, at its inception, was a U.S. National Science Foundation project, essentially a life ring for the federal Supercomputer Centers. Most of the technology for the Grid comes from people at Argonne National Laboratory — Ian Foster, Charlie Catlett, and others. Larry Smarr, formerly of NCSA (the Illinois supercomputer center) and now of UCSD, gets credit for the so-called Lambda Grid — the use of direct fiber-optic links rather than more traditional routed packets. Much of Larry’s technology actually comes from a University of Washington-spawned consortium, NLR (National Lambda Rail).

It is the case that LHC (Large Hadron Collider) is an interesting science project that is going to have to move lots of data around. The instrument, located at CERN, will produce a huge volume of data, and because of the nature of this international collaboration, the data will be retained and analyzed at various sites around the world.

Third, the Internet is only as good as last-mile bandwidth. This sort of lambda-switching has nothing to do with getting high-speed data to your home or business. You can have a highway that accommodates rocket-sleds, but if you still need to get to your garage by means of residential streets, your trip time doesn’t decrease a whole lot. (Taken a trip by air lately?)”


Misinformation, whether strong statements about the downsides of technology or overstatements of its payoff, can readily find its way into the press. Business leaders need a good set of filters when looking at this kind of information. A recommendation I have made before is that business leaders need trusted technical advisors who can assist in finding the dose of reality. Too often, truth and outlandish fiction are separated by what some would view as a subtle technical detail (the state of AI development, or the meaning and appropriate application of lambda switching!).

By the way, Nicholas Carr has a new book out: The Big Switch. It’s about the Grid. Be careful!


Al Erisman is executive editor of Ethix, which he co-founded in 1998. He spent 32 years at The Boeing Company, the last 11 as director of technology. He was selected as a senior technical fellow of The Boeing Company in 1990, and received his Ph.D. in applied mathematics from Iowa State University.