Economic changes blamed for IT labor shortage
(IDG) -- Starting salaries for C, C++, and Visual Basic applications programmers are expected to rise an average of 18.4 percent in 1999 compared with 1998, according to RHI Consulting, in Menlo Park, Calif. Database administrators' starting salaries are expected to rise 16.3 percent.
With salaries continuing to increase, turnover among IT workers has followed suit.
"Many companies are reporting annual turnover percentages in the high teens to the low 20s. You might expect that in the fast-food business, but not for people making salaries north of $60,000," says Harris Miller, president of the Information Technology Association of America (ITAA), in Arlington, Va.
How did the U.S. IT industry get to this point? The answer, in hindsight, is simple: For more than 20 years, potential IT employees have been receiving mixed messages about job security and opportunity, which served to alternately encourage and discourage them from considering IT careers. Today, the discouraging messages of the past have contributed to what some call the worst IT-worker shortage ever.
"We are short somewhere between 290,000 and 350,000 IT workers, but no one knows for sure. It looks like it's both a shortage of people and a mismatch of skills in the marketplace that produce vacancies," says Howard Rubin, a senior consultant at the Cutter Consortium, a New York-based IT research group, and chairman of the computer science department at Hunter College, in New York. "But it doesn't matter what the cause is if you can't fill jobs that support income and revenue for your company -- the effect is there."
There's no indication that the shortage will be alleviated anytime soon, Rubin says.
"There are more college kids in IT-related programs, but universities still are not keeping up with industry demand," Rubin says. "About 30,000 to 35,000 students will graduate this year in all computer science-related fields. But that represents a 10-1 spread between the supply of students and the shortage."
Going to the source
What are the roots of the IT industry's troubles? To understand, we must look back 40 years.
"In the late 1950s, in line with the space race and the Cold War, the government put a tremendous emphasis on funding engineers and computer people," Miller says. "The country had a massive focus on subsidizing students, supporting government labs, and supporting colleges and universities. So there was a huge growth spurt in the talent pool in the 1960s as colleges and universities pumped out large numbers of technically trained people."
Then came the end of the 1960s growth phase in the supply of IT workers.
"The U.S. government's commitment to continuing to pump out these graduates began to drop off in the 1970s," Miller says. "The space race had been won. There was a recession in the late 1960s. The country was hurt by the oil crisis in the early 1970s. And the country had moved away from the attitude of the 1960s that you could solve the world's problems with government spending."
College students were less motivated to seek out IT careers when demand dipped in the late 1970s. And, because the government was no longer creating momentum in IT, there was a sense among students that IT was no longer challenging, Miller says.
Then things changed again, and prospective IT workers got a more positive message.
Although demand for IT workers slumped in some regions and industries during the recession of the early 1980s, the decade saw an overall upturn in demand for IT professionals. It was a growth period for the defense industry, a result of the Cold War. The number of computer science graduates peaked at more than 40,000 in 1986; although computer science graduates fill only some IT positions, the number of graduates is one indicator of how many people are prepared to enter the field.
Then came the downturn in the national economy in the late 1980s. The Cold War ended, and a major consolidation of the defense industry began. College and university students responded to the downturn in demand for IT by looking elsewhere for employment.
"The cutbacks in aerospace and defense sent a signal to the job market that engineers and math and science degrees were not going to be as much in demand in the 1990s as they were, in fact, going to be," Miller says. By the early '90s, the number of computer science graduates in the United States had dropped to about 25,000.
"Students were seeing the end of the Cold War, corporate restructuring, and layoffs. To tell them that they should focus on a computer science degree rather than a business or law school degree was not an easy sell," Miller says.
Norman Imamshah, director for computing and telecommunications services at Central Washington University, in Ellensburg, Wash., recalls that the IT problems of the late 1980s involved not just fewer jobs, but a mismatch of skills.
"The end of the Cold War cast into the workforce many qualified but narrowly focused people who were not suited to the general business community," Imamshah says. "As a result, a lot of people in college turned away from IT in the 1980s, feeling that the bubble had burst."
Nate Viall, an IT market researcher at Nate Viall and Associates, in Des Moines, Iowa, says the late-1980s recession began with big layoffs in 1989 at Digital Equipment, followed by cuts at IBM. That was followed by the corporate "merger mania" of the early 1990s, which also resulted in thousands of IT workers losing their jobs, he says.
"All through the 1990s to about 1995, there were few months when there was not some news headline about IT layoffs," Viall says.
Several other events of the '90s, however, helped create today's skills shortage: the rise of client/server computing, the widespread use of PCs in the corporate world, and the blooming of the Internet.
Learning from the past
Today's IT shortage has brought into focus two big mistakes the business community made in the past, Rubin says.
First, IT was treated as a cost center rather than an asset, meaning that when times were tough, IT budgets were cut.
Also, Rubin says, corporations focused on hiring people with four-year college degrees, when other institutions, such as technical schools, might have helped to fill jobs.
"I'm not talking about lowering hiring standards; I'm talking about rethinking those standards," Rubin says.
Why didn't corporate America take a more enlightened stance toward IT in the late 1980s through the mid-1990s? It seems that few foresaw the far-reaching technological changes that would make IT a vital part of any business.
"I don't think anyone then imagined how quickly IT would move to being on the asset side of the balance sheet," says Greg Scileppi, executive director of RHI Consulting.
Steve Alexander is a free-lance writer in Edina, Minn.
© 2000 Cable News Network. All Rights Reserved.