From technology to politics to video games: these are the random thoughts of a geek with too much time on his hands
Outsourcing is merely the first of many problems to come...
Published on March 22, 2004 By Zoomba In Pure Technology
Outsourcing has become the largest problem for the IT industry since the dot-com bomb. It is not, however, the last major shake-up we'll see in this industry, nor is it a sign of the industry's eventual demise in the US. IT is simply advancing along the same trend as every other form of technology throughout history. First there is a stage where only masters and experts can participate, making those individuals very valuable. Then the number of people who can work with the technology slowly increases as people flock to a profitable field. Eventually employment reaches a point where there are simply too many people chasing too little work, and you see a bust as people are laid off and/or the work moves somewhere cheaper. We've seen this pattern with every technical trade: printing presses, watch and clock makers, engineers, and now computers. These two adjustments (the bust, and then the redistribution of the workforce to cheaper locations) are just the first we'll see in the coming years.

The next adjustment is already taking place: the commoditization of computer technology. In the beginning it was 1s and 0s, mainframes that took up entire buildings, and dumb terminals where you would input a program. At first you needed to know how to program the computer at a very basic level to be able to do anything with it. Slowly, as components shrank and became cheaper and the technology itself improved (faster processors, more memory, etc.), computers were made easier and easier to use, opening them up to an ever-expanding audience. The introduction of the Personal Computer was the first step toward the eventual doom of the industry as a large, long-term employer. But even when they first came on the market, the C64, Amiga, Apple, and IBM machines were still obtuse and difficult for the average person to learn. Command-line interfaces kept the majority of the populace away from PCs unless they were forced to use them for work or school. Computer geeks were still a necessary resource, as no one else could figure out what to do when they got a cryptic error message or their word processor stopped working. The machines were counterintuitive enough for the common user that people avoided them at all costs if they could.

So, in an attempt to improve the technology further, we created the graphical user interface, added the mouse, and worked to continually refine the end-user experience. This extended to every aspect of computing, even to software development (those nice fancy IDEs) and system administration (Red Hat and its push to be extremely easy to set up as a robust server), the last bastions of the computer geek. Applications have also been developed to automate many tasks, so that the only real input required from a user is to set the initial configuration and click "go."

The drive now is to make everything as simple and intuitive as possible, and I'm not saying there's anything wrong with that. It's the natural progression of any end-user technology, since the reason for its existence is to make life better for the people using it. A device that is cryptic, frustrating, and clunky doesn't generally make anything better; it just pisses you off. However, that trend toward simple, easy-to-use, low-configuration technology is what will cause increasingly large job losses in IT over the next few decades.

Aside from your PC, how often does anything you use that has a computer at its core break down or crash? Does your VCR or DVD player pop up a blue screen of death? How often do you get an illegal-operation error on your microwave, video game console, or car? In most cases now, once something is configured, that's all that needs to be done. It used to be that any computer system required continual maintenance and monitoring, which justified employing a lot of computer workers. But with everything becoming fire-and-forget, we don't need to employ as many people. The general population is also getting better at dealing with technology. Everyone from your two-year-old niece to your 90-year-old grandma is probably using computers now, surfing the web, playing games, and generally savvy about the basics, and the children will only get better with age.

Add the increasing ease and simplicity of technology to a user base that is growing ever more sophisticated, plus the end of the initial burst of development and dissemination, and you have a recipe for a constant and inevitable shrinking of the job market in this area. There will still be booms and busts as new iterations of technology come along that require businesses to retool their operations (currently middleware is the big thing). But unless some shocking new development totally revolutionizes computing, we're going to see IT fade to the point where it is no more or less a major industry than any other field. We aren't going to be the industry with all the hot-to-trot jobs the way we were in the '90s.

Does this mean all is lost and we should all start looking for new jobs in other industries? Heck no! What will happen is that all the people who joined the field because they heard it paid well, rather than because they love computers and technology, will sort themselves out and move on to the next big moneymaker. Once that happens, it will mostly be the people who actually enjoy this stuff who are still around doing the work. It's also the truly devoted who will find new opportunities within the field, who will create and advance technology in new directions, generating additional work and more money. The good ones will make out better in the long run, but it won't be easy going for the next few years.

This is not a field where you will find job security for some time, but those who stick it out and keep at it will likely benefit greatly and find it was worth the blood, sweat, and tears. It's just going to be hard to predict the industry's next direction, and it will be even harder to successfully weather the dips and busts we're going to see over the next several years.

Outsourcing and the dot-com bust are merely the first steps in the evolution of IT from a new profession into an established one.

Comments
on Mar 22, 2004
Yes, I can't wait for the non-geeks to leave the industry. Not even for the money aspect; I knew the money of the '90s wouldn't last forever. But I just despise having to work with people who can do the job that's laid out in the S.O.P.s but don't really "get it". It frustrates me.
on Mar 22, 2004
Penn State created a new academic school called Information Sciences & Technology right at the high point of the '90s boom... I was in the second class to go through the program, and all around me were people who didn't even know how to use their own PCs but were in the major because they thought it would make them a lot of money. Most of them don't even like computers. Because 90% of the school's student body was computer illiterate, they had to spend about three semesters just teaching the basics. I wanted to scream and throw things in frustration as we were still learning basic HTML in my senior year! And people still didn't get it! The most frustrating thing is that many of the people I had classes with, who can't turn a PC on unless the button is clearly labeled with a neon sign, all got great jobs starting out, because they either outright lied on their resumes (having taken an intro C++ class that goes as far as one-dimensional arrays does NOT mean you know how to program in C++) or completely BSed their way through interviews.

These are the people who, thankfully, will be sorted out the quickest by this current resettling.
on Mar 23, 2004
>>> Outsourcing has become the largest problem for the IT industry since the dot-com bomb.

Why is outsourcing a problem? Read this article for a different point of view.

>>> Add the increasing ease and simplicity of technology to a user base that is growing ever more sophisticated, plus the end of the initial burst of development and dissemination, and you have a recipe for a constant and inevitable shrinking of the job market in this area.

Funny... I came to the opposite conclusion. Greater requirements (both in ease of use and demand by sophisticated users) mean more resource needs, which means more jobs. I don't follow your argument here. Greater use of computers by an increasing demographic (2- to 90-year-olds) only compounds the problem.

When software can be developed in a more automated fashion, at a higher level of abstraction, then we will see a reduced need for human resources. I don't see that happening to a great degree... yet.

>>> Outsourcing is merely the first of many problems to come...

What are the other problems you foresee?
on Mar 23, 2004
>>>Why is outsourcing a problem? Read this article for a different point of view.

I understand there are different takes on outsourcing; I am merely speaking from the point of view of an IT worker in the US who has to deal with jobs fleeing overseas. Outsourcing is a great solution for upper management looking to make a company more cost-effective (which is the entire point, I do understand that), but it comes at the expense of workers. We had the same problem with manufacturing jobs in previous years, but now it has hit white-collar work... it's always a management vs. workforce issue, as the two have very conflicting interests. I consider outsourcing a problem from the perspective of the workforce.

>>>Funny... I came to the opposite conclusion. Greater requirements (both in ease of use and demand by sophisticated users) mean more resource needs, which means more jobs. I don't follow your argument here. Greater use of computers by an increasing demographic (2- to 90-year-olds) only compounds the problem.

I'll use clocks and watches as an example to better illustrate my point.
Go back to the late 1800s, and you'll find that clocks and watches were some of the most complex pieces of machinery the average person encountered outside of factory work. Those capable of making and repairing these devices were rare and had to be masters of a relatively obscure trade. Over time, however, these devices became simpler, more commonplace, and easier and cheaper to produce. Eventually, just about anyone could pick up a $10 digital watch at a store that cost $2 to make. Sure, there are still really fancy, expensive watches that require a true master to work on, but those are the exception rather than the rule. The same thing applies to the advance of any technology: it starts out expensive and confusing and requires an army of skilled workers to maintain and develop, but over time the process is refined, the product is made easier and easier to use, and we no longer need as many skilled workers in the field.

Computers are moving in that direction. With the exception of high-end, complex systems, the trend is toward technology so easy to use that we don't even really notice it (like watches now... once the marvel of the technical world, they're effectively just a piece of clothing). People are also becoming increasingly familiar with how it all works. Even five years ago I was explaining to people how to use e-mail and the web. Now everyone knows the basics, and a larger number of people are becoming power users (not experts, but good enough to handle most common problems and tasks). If a user base grows up with a technology, it's not something they have to learn or adapt to; it's just part of their being and culture. Do we consider a telephone (not a cell phone, that's a different story) to be something that needs a huge support infrastructure anymore? Do we need classes on how to dial numbers? Nope. It's considered so commonplace, such an essential and standard thing to have, that it's a seamless part of our world. Computers are moving that way.

Keep in mind I'm not talking about programming here; I'm talking about all the other aspects of the IT world (networking, sysadmin, helpdesk, consulting, etc.). As technology becomes easier to manage and use, the number of people needed to manage and support it shrinks. Everyone has a watch; do we have a huge industry supporting watches? The size of the user base only matters up to a point. Once technology passes a certain ease-of-use threshold, the numbers don't matter nearly as much.

>>>What are the other problems you foresee?
1) A glut of unemployed senior/experienced workers makes it harder for new entrants to the field to find jobs.
2) A shrinking need for support infrastructure as computers become a more integrated part of life (like TVs, VCRs, and watches, which have largely been reduced to manufacturing industries with a small amount of support infrastructure and innovation).
3) A boom-and-bust economy is forming. Since tech adoption and integration are starting to level out, we're going to see more and more waves where there's a sudden need to upgrade to XYZ methodology/system/technology/application, demand that quickly goes away once the upgrade is done.

----
Now, keep in mind I'm not saying this is the end of the IT industry; this is just the direction I see things moving over the next 20-40 years given the current technology and the current ways we build and use things. It's going to take some new and innovative thinking to move us in a new direction, and it's the IT workers who are worth their pay who will lead that change and be able to adapt to it. Currently we're growing too comfortable with what we have, and we're seeing iteration rather than innovation.
on Mar 23, 2004
Interesting view.

I think that outsourcing will make things more competitive, but that's a good thing. While computers are getting easier to use, their uses are getting much more complex.

Sure, most people know how to use e-mail, but how are they managing all that data? What will happen when people start e-mailing self-contained applications? I look at the possibilities the Longhorn builds offer, and my mind starts spinning at all the back-end stuff that has to be in place to make that technology seem so easy: version control, data management, security, etc. These issues are going to get much more complex before they get easier.

I believe that the number of jobs requiring lower-level skills will shrink due to advances in the technology, but higher-level skills will be in more demand due to the increased complexity and interaction of more and more systems.