I started in IT while I was in college, working in the computing lab. I would have done the job for free, just to have the access, but they actually paid me to do it, and more money than I had ever seen as a dishwasher. I thought to myself, "this could lead to something...."
At that time our computer lab was a collection of TeleVideo terminals, IBM PCs, DEC Rainbows, and a single Mac Plus. And we had a hardware tech who spent the majority of his time inserting expansion cards into the various PCs. You have to remember that PCs were pretty expensive then, so you would typically buy a computer that just met your immediate requirements. Later, when you had more money, or a need arose, you might add RAM, upgrade your graphics card, or put in a bigger hard disk. Adding expansion cards to PCs was not a simple thing to do in those days, and you had to have a fair bit of understanding of the internal PC architecture in order to do it properly. Specifically, you had to understand three critical resources used by expansion cards: shared memory (the memory between 640K and 1M), interrupt requests (IRQs), and Direct Memory Access (DMA) channels. You had to understand how these resources were being used in the computer you were upgrading, and which ones the card needed. You then had to configure the card (usually by means of jumper pins or DIP switches) to dovetail its resource consumption with what was available.
Today nobody except hardware designers has to know a thing about those resources or how they are allocated. There are many reasons for this. First, PCs are a lot cheaper, so often you just buy the computer with all the features you need and never bother to do any expansion. But more importantly, if your machine even has expansion slots, the underlying hardware is sophisticated enough to figure out how to allocate those resources correctly. And most peripheral expansion today is via USB, which is also self-configuring.
Would anyone want to go back to the old days? Is it not self-evident that having the hardware configure itself automatically is a much better world than one in which we have to figure it out ourselves? Just imagine how much less would get done if every time you needed to connect new hardware to your computer you had to crack the case, consult the manuals for both the new hardware and the old, and then fiddle with the settings.
There was a time, not so very long ago, when if you wanted cash, you had to go to the bank. When it was open. Does the term "banker's hours" ring a bell? You would go to the bank, stand in line, and then withdraw your cash after interacting with the teller. Today we have ATMs (Automated Teller Machines). The process of verifying your identity and bank balance and then dispensing cash has been completely automated. Could you even imagine life without them now? Would anyone want to go back to the days before ATMs?
There is a great story that has been floating around the net forever called The Story of Mel. It chronicles a programmer whose code was so meticulously optimized for the underlying hardware that it took the story's author two weeks to figure out how the program exited a loop.
The story might make one wistful for the days when people had such deep understanding of the technology they worked with, but I ask: would we really want an army of Mels writing code, or would we rather use software tools that, at some expense in terms of performance, allow other developers to easily understand what the code is doing and modify it if needed? Could we even imagine going back to a time when people wrote software that way?
I am sure there used to be an army of hardware technicians, tellers, and hex-level hardware programmers. As these changes came, many of them lost their jobs. Sure, there are still a few of each left, just as there are still a few elevator operators and even farriers. But the vast majority of people in those professions are no longer doing those tasks. Those jobs simply dried up. Of course new ones arose, and those who could adapt to the changes brought about by the new technology did well. But if you were a hardware tech or a teller in the '80s, or a COBOL or assembly language programmer in the '70s, you had a choice to make: learn new technologies and adapt, become the last person doing what you do, or find yourself out of work. And you don't want to be the last person doing what you do.
Over the years of my career I have seen a lot of people who do not understand this dynamic. They become "the guy" who does a particular thing, and they jealously protect their domain, neither sharing their knowledge nor doing anything to enable others to do what they do. They are fierce resisters of anything that diminishes their role as the unique provider of a particular service. And then I see people who realize that our industry is in a constant state of flux. These people realize that whatever they're doing today, a good part of it will be automated soon. The best ones try to implement that automation themselves. They see their value not in terms of a particular set of knowledge or skills, but in being people who help the business by bringing technology to bear on the problems of the day. To them, technology is not an end in itself, but a means to an end. Such people are always trying to learn new skills and new ways of doing things, even as they facilitate the automation and commoditization of what they're doing now. That is how to really add value!
Some people say that Amazon is destroying bookselling. In reply to this, Jeff Bezos said, "Amazon is not what's happening to bookselling; the future is what's happening to bookselling." And folks, the future is what is happening to the datacenter. Today's virtualized infrastructures are growing exponentially in size, scope, and complexity. Software-driven control is here now, and more and more companies are realizing that it is the only way to run these environments. And while we are in transition to this way of doing things now, the day is coming soon when nobody will be able to imagine going back to a world where the IT staff spends its time sizing and placing virtual machines. That job will seem as quaint as the farrier's.
So what is a virtualization engineer to do? For what it is worth, my advice is to embrace change. And when I say embrace, I don't mean the way you embrace your obnoxious uncle when he comes over for Thanksgiving; I mean the way you embrace your dear ones: with love and affection. Do not fear the changes that are coming, because fear will simply paralyze you. Change is coming, change is inevitable, so if you want to do well in IT for the long term, you must be constantly in motion, always experimenting, and always learning. You must not find your identity or see your value in the particular task you are doing, but in your ability to apply whatever technology is available to solving the problems your organization faces.
It may not be easy to predict which technology will ultimately take over, but you can be sure that whatever you are doing now will not last. For those who adopt this perspective, the future is very bright.