The demand for IT workers is back, at least in some areas. Anne Fisher wrote in Fortune (May 25, 2006):
To look at what’s been going on at the fastest-growing tech companies — including Adobe and many lesser-known firms — you’d almost think it’s the late ’90s all over again.
These employers grew their payrolls by an average of 16 percent in 2005, hiring more than 70,000 people. Meanwhile, Google has snapped up so much tech talent over the past couple of years that it’s pushed salaries higher all over Silicon Valley. Could it be that the IT job market’s long slump is finally over?
Yet at the same time, enrollment in computer science programs across the United States began declining in 2000 and has continued to fall, leaving a shortage of people to fill available positions. For example, Vanderbilt University’s undergraduate computer science enrollment dropped 38 percent between 2000 and 2005, according to a 2005 Vanderbilt City News article. How do we explain declining enrollments and increasing demand?
I believe a number of seemingly unrelated events have come together in a perfect storm to create the situation we now face. These factors led to a loss of jobs in computing, hence a reduction in enrollment in computer science programs, hence a lack of available skills to fill the current need. The factors contributing to this storm include Y2K, dot-coms, 9/11, Moore’s Law, macroeconomic cycles, and short-term thinking. While some of these factors will not return, the remaining ones will likely continue and some form of this problem will appear again.
Most people had hoped we would never talk about Y2K again. But for several years leading up to 2000, organizations of all kinds and sizes worked to make certain their hardware and software systems would continue to work properly when the last two digits of the calendar rolled over to “00.” The demand for IT workers was high at this time, so some organizations discovered the opportunity of outsourcing software maintenance. Where this worked well, those jobs never returned.
A second thing happened with Y2K. Normally, technology refresh (upgrading hardware and software) takes place according to each company’s internal schedule. But many companies accelerated spending on their computer systems while becoming Y2K compliant, aligning these refresh cycles at the end of 1999 and producing a dip in expenditures in 2000 and 2001.
The hype surrounding the dot-coms built a bubble in the NASDAQ that finally burst in April 2000. The buildup of the dot-coms had led even traditional business people to believe there was something magic about technology; the burst led to the opposite and equally fallacious view that technology offered nothing special to business. Combine this with the broad availability of surplus hardware from now-defunct dot-com companies, and with technology purchases already declining after Y2K, and the result was a shrinking market for technology sellers and an oversupply of technology people.
Those who bought stock in the early 1980s and held on might have believed the market could only go one way: up. But around the same time as Y2K and the demise of the dot-coms, this upward trend ended. Any economist will recognize that the economy goes both up and down. Unfortunately for technology, the cycle turned down early in the new century (or late in the old one, depending on whether you see 2000 as the end of the old century or the beginning of the new). Combined with the pre-spending for Y2K and the collapse of the dot-coms, we can now add a normal economic cycle to the mix, with a strong impact on technology.
The terrorist attacks on New York, Washington, D.C., and Pennsylvania were horrible in their own right. But the fallout from these attacks did two more things that affected the tech cycle. First, it accelerated the decline of a macroeconomic cycle that had already begun. Second, it led to the Patriot Act, which made it much more difficult for foreign workers and foreign students to get U.S. visas. Universities that at one time drew a significant percentage of their computer science students from other nations suddenly saw a decline in this part of the student population. “The number of foreign undergraduates enrolled in U.S. academic institutions in all fields decreased almost 5% from academic year 2002-03 to 2003-04, the second consecutive decline after record increases during the 1990s,” according to Science and Engineering Indicators 2006. Thus the decline in computer science enrollment can be traced in part to 9/11.
I have frequently commented on some aspect of Moore’s Law, first stated by Gordon Moore in 1965 and popularly rendered as: “The density of the chip will double every eighteen months.” This law has had profound effects on technology, including the most obvious: every five years or so, the computer you buy will be a factor of 10 faster or cheaper (or some combination of the two). But the implications of this quantitative observation are even more profound.
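The factor-of-10 figure follows directly from the doubling period: a quick back-of-the-envelope calculation (assuming the popular 18-month doubling period) shows how the arithmetic works out.

```python
# Back-of-the-envelope check of the "factor of 10 every five years" claim,
# assuming the popular 18-month doubling period for chip density.
doubling_period_years = 1.5
years = 5

doublings = years / doubling_period_years   # about 3.33 doublings in 5 years
growth_factor = 2 ** doublings              # compound growth: 2^3.33

print(f"{doublings:.2f} doublings -> {growth_factor:.1f}x improvement")
# -> roughly a 10x improvement, matching the rule of thumb
```

The same compounding explains why the effect feels so dramatic over a career: at this rate, two decades yield more than a 10,000-fold improvement.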
One such implication relevant here is that the work of a computer scientist will go through profound shifts every decade or so. Ten years ago, a non-technology company would hire computer scientists to build software. This is far less likely now, as most companies purchase their software from a growing industry of applications providers. Does this mean computer scientists are no longer needed at places such as Kraft Foods, Boeing, and General Motors? Not nearly as many for software development (see the comment by Scott Griffin, Boeing CIO, in “The Ethix Conversation,” pp. 6-13). But they are needed for issues of computer security, wireless applications, application integration, and a whole host of areas that were very small or nonexistent 10 years ago.
Of course, the technology companies are still hiring programmers, hardware engineers, compiler developers, and the like, but even here they are working with new tools.
This is not a new cycle; significant changes have taken place over the decades. Computing went from mainframes to minicomputers to personal computers. Telecommunications and information security have grown in importance, as have user interfaces, software integration, and business-specific applications. I remember a computer scientist who set up his company in the mid-1990s and created one of the early Web sites. Several months after he finished building the site, a software program came out that enabled the same work to be done in several hours.
Every one of these cycles produces a shift in the requirements for computer science people. The top people are incredibly adaptable, moving from technology to technology. But with every shift there is a group of people who no longer fit, giving the appearance that computer science people can’t get work. This combines with the real decline due to the other forces cited earlier.
Now let’s look at one more trend: short-term thinking. We have seen it over the past decade in business, on Wall Street, and everywhere else. In part because of the ready availability of information, there is a temptation to rely on short-term measurements because they are easy, rather than on the harder long-term measurements that are more important.
I believe this affects computer science in three ways. First, students (and their parents) saw short-term results, such as the combination of events in 2000-02, and concluded that there were no jobs in computer science, so they didn’t enter the program. The reduction in foreign students due to security issues only amplifies the enrollment decline now passing through our universities. Also, computer science is hard, and in our era of short-term thinking, some don’t want to pay the price.
Second, businesses are also looking for the shortcut. In the same Fortune article cited earlier, Fisher said the hot skills are things like Microsoft .Net, BizTalk, Oracle Financials, and SAP. Businesses often take the short-term approach of hiring people with skills that are immediately useful, then discarding them when the skill is no longer needed. A well-trained computer scientist, by contrast, can readily pick up new applications.
Third, universities can fall into this same short-term thinking. Turning a computer science program into a training program for hot skills is a tempting way to attract students. It is not a mistake to include familiarity with the latest applications in the curriculum. But it is a mistake to do so while eliminating the fundamental skills that must be there for the next applications. Universities are challenged to broaden their computer science curriculum in recognition of the broader needs of today’s technology, while maintaining the core knowledge that will be vital in our 21st-century technological age.
A number of factors have come together to create a significant dip in the supply of computing people just as demand is rising. Unfortunately, the time between educational choice and graduation doesn’t mesh well with the trends for jobs. This puts pressure on students, businesses, and universities alike. The danger is reacting too quickly to short-term cycles, and this is amplified in our “instant” age.
Y2K, one of the key factors, is gone for good. We would like to say that 9/11 and the dot-coms are behind us as well, but some new variation on these themes is all too likely to recur. Moore’s Law, macroeconomic cycles, and short-term thinking, however, will continue to be a part of business and technology as we move forward. We could encounter this disparity between supply and demand in the computing field again. A new emphasis on math and science education, a long-term view for businesses, and broad thinking about the computer science curriculum are all required to avoid future pitfalls.
Al Erisman is executive editor of Ethix, which he co-founded in 1998. He spent 32 years at The Boeing Company, the last 11 as director of technology. He was selected as a senior technical fellow of The Boeing Company in 1990, and received his Ph.D. in applied mathematics from Iowa State University.