I’m sure I’m not the first person to cover this topic, but I thought I’d add my bit.
One thing that amazes me is the cycle of technology that goes on. The old cliché of “put the old dress in the closet and 20 years from now it will be back in fashion” seems to apply equally to IT. I think it was seeing those “retro handsets” they are offering everywhere for iPhones that got me thinking. Let’s take a few examples…
The first “real” laptop computers (from about 1989/90 or so) came out with specs more or less similar to a conventional desktop and screens between 14 and 15 inches. Then in the late 90s there was a rage of “Micro Notebooks”. I had an IBM X240 which was about 8 or 9 inches wide, and I was given an original Toshiba Libretto (Pentium 166) as a keepsake by a friend a few years back (still the smallest production notebook I’ve ever seen, and it still works perfectly). They were popular for a while but quickly became stale as people realised they didn’t have the horsepower to do any real work and had crummy battery life. So we moved to the “ultra portables” – with Toshiba leading the way again with the Portégé – which were slightly bigger, had much more grunt and better battery life, and a much bigger price tag. Execs and graphic designers had them, but the majority went back to notebooks with specs more or less similar to a conventional desktop and screens between 15 and 15.4 inches.
Along comes 2007/8 and suddenly everyone needs a “Netbook” (i.e. a Micro Notebook). Vendors can’t build them fast enough and every one of them jumps on the bandwagon. In Australia, the government gave away literally tens of thousands of them to high-school students as part of its “Digital Education Revolution” funding program…only to realise that they didn’t have the horsepower to do any real work and had crummy battery life (this was particularly depressing for students, as they tend to be power users, and finding out that you can’t edit photos or videos on your brand new notebook was not a good thing). The buzz this year is for “ultra portables”, which are slightly bigger, have much more grunt and a much bigger price tag. Sound familiar?
About 15 years ago, Microsoft brought out a product called Windows CE (sometimes horribly abbreviated to WinCE) designed specifically for portable computing devices, often employing touch-screens. At around the same time, Apple brought out the Newton, its own portable touch-screen computing device. There was a brief flurry of interest and activity, with many vendors (most notably HP) bringing Windows CE devices to market and many die-hard Apple fans swearing by their Newton. Eventually, Steve Jobs killed off the Newton when he took back the reins of Apple and set about transforming the company, whilst Windows CE is still around but used mostly for devices like Thin Clients. About 18 months ago, Apple released a device called the “iPad”, and once again we are off and running with touch-screen computing devices (AKA Tablets).
Finally (for this article) there’s cloud computing. 25+ years ago pretty much all business computing was centralised. Big Data Centres housed monster computers that consumed so much electricity and produced so much heat that they needed dedicated power distribution and cooling separate from the other buildings around them. Everyone connected to them via relatively slow data links, which didn’t matter too much since all the work was done centrally. Then we got the PC, which placed a reasonable fraction of that processing power directly on the business user’s desktop, allowing them to work discretely without having to share the central computer with hundreds of other people; this was followed shortly thereafter by a slightly more powerful PC called a “File Server”, which further allowed businesses to localise their data and processing all the way down to the branch and individual user level. The Information Sprawl was thus born, and the amount of data and information gathered and collected (and dispersed) started growing exponentially as more and more devices were purchased and deployed, until virtually no organisation could even come close to accounting for where all its data was actually located.

Around 1998, a company called VMware popped up, promoting technology that allowed companies to carve up their existing File Servers into multiple “virtual file servers”. They called this technology a “Hypervisor” (they didn’t invent it, but they were certainly the first company to make it work on commodity PC hardware). At the same time, a company called Citrix was starting to gain decent acceptance of its “WinFrame” (later “MetaFrame”) product, which allowed businesses to carve up an existing File Server into multiple “virtual desktops” that users accessed remotely in order to reach company applications and data.
Both these technologies developed a reasonable following but were limited to very large organisations: even though they didn’t need much network bandwidth to operate (i.e. capacity on the link sitting between the user and the file servers), they needed more than most home users, small or even medium businesses had available (generally dial-up links). Then we finally got the last main piece of the puzzle: the wide availability of high-speed, low-cost data links to almost everywhere in the world. So now businesses are moving all their data and processing into Big Data Centres housing monstrous arrays of computers that consume so much electricity and produce so much heat that they need dedicated power distribution and cooling separate from the other buildings around them.
What’s next on the technology cycle?