Going to university and learning – it's harder in a way you never thought possible
Many readers and visitors here are average Joes with computers. I know I used to be one. Many are happy with their current situation: their job allows their hobby to flourish, and for the odd person here, their relationship allows it too.
But what about people like me, who were not happy with their non-existent career or job prospects? The same applies to younger readers who want to pursue a career in the IT industry.
After extensively developing my hobby of computers, maintaining a personal home network, and getting deeply involved in the details of IT and how it all works, I realised that my talents and knowledge were being wasted. I returned to higher education, in the form of university, to get official credit for what I had already learned.
Any of you like me would have a base to start from, and would likely take few notes in lectures, if any. We know this, we know that; we just turn up for the attendance mark and try to catch any comment that might reveal something new and valuable. I know I'm doing exactly that. Many of us would have no trouble with the main areas of knowledge we have concentrated on in our hobby.
But it isn’t that easy. The problem is that sometimes you learn more than you could ever need.
What surprised me most about going back was being taught obsolete technology. Technology moves fast in this day and age. I can walk into a lecture knowing what is hot in flash memory, and it will have gone obsolete by the time I walk out two hours later, almost literally.
Other technology moves more slowly, though. The typical memory you install in your computer to increase your RAM is still fundamentally the same technology, merely tweaked over time. CPU architecture also moves slowly, with changes to how the CPU handles different instructions being the only significant development. The rest of the CPU is still quite similar to previous generations; there is just more of it, and it is faster.
Some people have pointed out that we should start with the basics. Yes, but when you’re being taught or shown technology that uses valves and is now only ever found in museums, or instruction sets so obsolete that you would never use anything like them, you become very surprised and possibly, like me and many of my classmates, disgusted!
The best example is probably Level 2 cache. We are currently being taught that it sits directly on the motherboard itself. While that is correct in a very few cases, Level 2 cache has been located directly on the CPU die for the best part of a decade. But we know the exam we have to take in a few weeks may ask that type of question, so we have to revert to teachings and texts written before anyone had even heard of the dot-com bubble.
That’s the hardest thing we all have to confront: dealing with obsolete technologies, theories and practices, while also separating out the relevant parts of the course and keeping up with current technology, theories and practices in our own time.
This is in contrast to what we get on the software development side. The university I am attending has a Microsoft Academic Alliance: we get free software from Microsoft and the latest information available. When we leave, we will be at the cutting edge of the IT industry. We even get the chance to walk into Microsoft jobs at the end of our degree!
So beware: it will be easy, but it can also be extremely frustrating at times. If you are looking at going into, or returning to, higher education, this will almost certainly be your biggest stumbling block: adapting to and coping with too much information, especially old information that you certainly won’t need. You may not experience any of these issues, or you may hit different problems if you follow my path, but something will get you, so once again, beware!
Finally, despite all I’ve said, it’s worth it. Do it; you won’t regret it!