
Monthly Archives: November 2008

More Than Blue Sky Thinking

Posted by Réza Malekzadeh
Sr. Director, Product Marketing & Alliances

Looming on the horizon are the nimbus, cirrus, stratus and cumulus that threaten to deliver us cloud computing imminently.  Promising an end to most of the challenges and frustrations of IT systems as we know them, the concept of cloud computing is thundering through the business community to become one of the most talked about and revered subjects of the day.

Behind the hype seems to be a reality that, for once, the IT industry may be onto something truly game-changing that will not only radically cut costs, but also deliver a far better experience to the business or consumer user.

The expectations are huge. Banking analysts say that cloud computing will be a $160 billion market within the next five years, and every major IT company from Microsoft to Google, from IBM to Dell, is desperate to be the rainmaker.

The question that comes to mind, though, is not “what” cloud computing is, but rather “why”. If it is such a great idea, then why has it taken until now for the gurus of technology to deliver it?

The “what” question is just too easy: imagine a world where you could walk up to any computer, anywhere in the world, and instantly access all your data and applications just as you left them the last time you logged on – and somewhere, up in the clouds, a huge IT infrastructure is whirring and churning to deliver the IT services to you. Basically, think of the ease of getting electricity from a socket in your home that somehow connects to a generating station and you start to get the idea.

Why has it taken so long? Go back far enough in time and IT professionals always thought that computing would be delivered from the cloud, and that the personal computer was nothing more than an aberration. Early mainframes were constructed to deliver IT services down wires to dumb terminals that could do no more than display text on a screen and send back characters typed on a keyboard.

These mainframes could handle hundreds or even thousands of users, and if they had carried on evolving, we would probably have had cloud computing in 1988 – rather than 20 years later.

In fact, Thomas J. Watson, founder of IBM, is supposed to have remarked that “there is a world market for about five computers”. He didn’t mean that these newfangled devices would never catch on (as Lloyd George unfortunately said about TV), but that his vision was of a few massive number-crunching mainframes in the sky that could deliver their computational power to users remotely.

What was not understood, though, were the challenges that cloud computing would have to overcome. And this is where the answer to the “why now” question lies.

To deliver cloud computing requires five critical components: the scalability of the infrastructure to meet users’ needs; the resilience to accommodate the unexpected; the network to distribute the applications; the ability to deliver an acceptable experience to the user; and a reasonable cost.

When it came to scalability, the reality was that you built or rebuilt your datacenters once every five years to fit an estimated workload of users and traffic. “Dynamic” or “on demand” capacity existed only on paper. But something fundamental changed at the start of the 21st century, when server virtualisation suddenly arrived on the scene, as a result of innovations led by VMware.

Where previously you had attached a given application to a server, only to see users slow to a deathly halt during periods of peak usage, you could now vary the server capacity or resources available to a virtualised application and so scale it up or down according to demand. This was freedom for the CIO and MIS staff, as they suddenly could adapt their systems to the needs of the user community. It wasn’t cloud computing yet, but it was perhaps the forerunner.
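To make the idea concrete, here is a toy sketch (not VMware’s API – the function and thresholds are invented for illustration) of the scale-up/scale-down rule the paragraph above describes: watch demand, and grow or shrink the resources given to a virtualised application accordingly.

```python
def scale(current_vcpus, cpu_utilisation, min_vcpus=1, max_vcpus=16):
    """Return a new vCPU allocation for a virtualised application.

    cpu_utilisation is the fraction (0.0-1.0) of the current
    allocation in use. Above 80% we add capacity for peak demand;
    below 30% we hand capacity back to the shared pool.
    (Hypothetical policy for illustration only.)
    """
    if cpu_utilisation > 0.8 and current_vcpus < max_vcpus:
        return min(max_vcpus, current_vcpus * 2)       # scale up
    if cpu_utilisation < 0.3 and current_vcpus > min_vcpus:
        return max(min_vcpus, current_vcpus // 2)      # scale down
    return current_vcpus                               # demand in range

# A workload spikes, then goes quiet:
print(scale(4, 0.95))  # peak demand -> 8
print(scale(8, 0.10))  # quiet period -> 4
```

The point of the sketch is the contrast with the old model: the decision happens continuously at runtime, not once every five years when the datacenter is built.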

Resilience was probably the biggest killer of the original IT model and gave rise to the PC almost by itself. Despite cloud computing being the ideal solution for IT architecture, the repeated, sustained or catastrophic breakdowns of mainframes led users to revolt against the tyranny of the IT director. The phrase that sends shudders through the souls of many middle-aged ex-programmers is “unscheduled outage”, as hours of work would be lost to some minor bug on a given server. Evolution eventually kicked in with the concept of transferable workloads, made possible by innovations such as VMware’s VMotion – a technology that can move a running application from a problematic server to another with no interruption.

Bizarrely, the network was probably the least of the problems. Arpanet, the forerunner of the Internet, was up and running by the end of the 1960s and, although built for defence research, quickly proved itself within the academic community. But at an original 50 Kbps, compared to today’s multi-megabit throughputs, there is no doubt that broadband has transformed the landscape for cloud computing.

Probably the most emotive issue in IT is the end-user experience. Grown men have cried at the prospect of rebooting Windows Vista, and previous experiences of cloud computing were little different. We expect, and have a right to, an IT experience that delivers the goods. An ATM, a great example of existing cloud computing, should not take three minutes working out whether it will or won’t dispense cash. But so many factors affect that experience that IT directors have previously been powerless to control it. The era of virtualisation has radically transformed that equation, as the IT professional can now isolate, prioritise and manage applications to deliver a fantastic experience to users.

Finally, there is the age-old issue of cost. Every new era of IT has promised much, but at extra cost. PC networks, client-server computing, server-based computing – all of them demanded an extravagant outlay for the promise of a return tomorrow. Cloud computing is the very first that actually costs less. By harnessing the scalability and resilience provided through virtualisation, and by using the global networks that now exist, it delivers a massively improved user experience at a lower cost.

And if you need proof, look at any of the combatants in providing cloud computing – Amazon, Microsoft, Google, Oracle – and ask them whether they use virtualisation at the core of their infrastructure. They all do. If the drip, drip, drip effect of cloud computing works for some of the most popular IT services of today, you can be sure it will seep into mainstream IT soon.