Living with a cloud – a paradigm shift in the way we work, Part 2
Continuing from my last post, it's clear that there is colossal wastage of resources where server utilization is concerned – both for individual companies and for the industry as a whole. One solution was to build smaller server units with better efficiency and more advanced technology.
The case of large organizations
The battle between in-house resources and the data center is an old one. I remember the days when email servers were usually hosted in individual offices. For organizations with multiple offices spanning the globe, this posed a serious problem. Those tied to Microsoft technology were further hampered by its restrictive policies.
Typically, a large organization would spread its workload across a thousand servers, each running at 30 to 40% utilization. As long as there was no visible wastage, management either overlooked the inefficiency or simply bore with it, because the tech teams did not seem to have any other solution. There was always the danger of the entire IT infrastructure collapsing under overload, so no one could take chances and the status quo was maintained.
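To put rough numbers on that wastage, here is a back-of-the-envelope calculation in Python, assuming the midpoint of the 30 to 40% utilization range quoted above:

servers = 1000
avg_utilization = 0.35  # midpoint of the 30-40% range cited above

# Capacity that sits idle across the fleet, expressed as
# whole-server equivalents.
idle_server_equivalents = servers * (1 - avg_utilization)
print(f"Roughly {idle_server_equivalents:.0f} of {servers} "
      f"servers' worth of capacity sits idle.")
# -> Roughly 650 of 1000 servers' worth of capacity sits idle.

In other words, the organization is paying for roughly 650 servers' worth of hardware, power and cooling that does no useful work at any given moment.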
Managing online infrastructure resources
It was not that organizations suffered this inefficiency without waging a battle. Many in-house technologies mushroomed to counter server underutilization. Clustering was one such solution, and load-balancing techniques were steadily refined. However, the need for physical infrastructure remained, and with it came the concomitant wastage.
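To make the load-balancing idea concrete, here is a minimal round-robin sketch in Python. The server names and requests are purely hypothetical, and real load balancers track health and load rather than rotating blindly:

from itertools import cycle

class RoundRobinBalancer:
    """Spread requests evenly across a small cluster so no single
    box idles while another is overloaded."""
    def __init__(self, servers):
        self._pool = cycle(servers)

    def route(self, request):
        server = next(self._pool)
        print(f"routing {request!r} to {server}")
        return server

balancer = RoundRobinBalancer(["web-01", "web-02", "web-03"])
for req in ["GET /", "GET /about", "GET /contact", "GET /pricing"]:
    balancer.route(req)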
The world of virtualization
Virtualization is a brilliant concept in which a software layer separates the physical infrastructure from the user. Using this technology, a single resource can run multiple operations as if each user owned it independently. If I remember correctly, the animation industry was the first to pick up virtualization with gusto: each animation workstation was linked directly to the datacenter, which executed the resource-intensive operations. Pooling the infrastructure led to better efficiency. Essentially, virtualization was a major breakthrough for the animation industry.
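Here is a toy Python sketch of that pooling idea, with hypothetical workstation names: several workstations submit render jobs to one shared pool instead of each owning dedicated hardware, which keeps the shared machines busy:

from concurrent.futures import ThreadPoolExecutor

def render_frame(workstation, frame):
    # Stand-in for a resource-intensive render operation.
    return f"{workstation}: frame {frame} rendered"

# Twelve jobs submitted by four workstations...
jobs = [(f"workstation-{w}", f) for w in range(1, 5) for f in range(3)]

# ...share one two-worker pool. Each workstation behaves as if it
# owned the render farm, while the pooled hardware stays utilized.
with ThreadPoolExecutor(max_workers=2) as shared_pool:
    for result in shared_pool.map(lambda job: render_frame(*job), jobs):
        print(result)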
Enter the cloud
The next logical step was, obviously, the cloud. There is a difference between virtualization and cloud computing, but that is not the subject under review here. Suffice it to say that the virtualization concept was carried forward to give birth to cloud computing. Here I would like to point out that not all clouds are the same. The principles may be identical, but the implementations differ: Amazon's AWS is different from Microsoft's Azure, and the perceived efficiency also varies from vendor to vendor. Though there are no standards as such, the basic building block of a cloud is a server instance, which may comprise a CPU, memory and, in some cases, a separate database.
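As an illustration of that building block, here is a minimal provisioning sketch using boto3, the AWS SDK for Python. It assumes AWS credentials are already configured; the AMI ID is a placeholder, and the instance type is just one small example of a CPU-plus-memory unit:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one server instance: the basic building block of the cloud.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t2.micro",          # 1 vCPU, 1 GiB of memory
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])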
The advent of abstraction
The cloud is an abstraction layer which separates the user from the infrastructure. There is a beautiful way in which some geek once explained this. Imagine that you are driving a car: you turn the ignition, the car starts, and the rest happens automatically. If you had to comprehend the workings of the gears and the machinery, and how the gears engage the engine, before you could set off, you would never be able to drive at all. An abstraction layer separates you, the driver, from the actual working of the car. With a car, you can pick a reputable brand and sleep in peace. This brings us to the question: how do you evaluate the cloud engine you will be driving?
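The same analogy, translated into a small and purely illustrative Python sketch: the driver sees only start(), while the engine and gearbox stay hidden behind the abstraction, just as a cloud user never touches the physical servers:

class Engine:
    def ignite(self):
        return "engine running"

class Gearbox:
    def engage(self):
        return "gears engaged"

class Car:
    """The abstraction layer: one simple method hides the machinery."""
    def __init__(self):
        self._engine = Engine()
        self._gearbox = Gearbox()

    def start(self):
        self._engine.ignite()
        self._gearbox.engage()
        return "ready to drive"

print(Car().start())  # the driver never touches Engine or Gearbox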
About the Guest Author:
Sankarambadi Srinivasan, 'Srini', is a maverick writer, technopreneur, geek and online marketing enthusiast rolled into one. He began his career as a naval weapons specialist. Later, he sold his maiden venture and became head of an offshore database administration company in Mumbai. He then moved on to become Chief Technology Officer of one of the largest online entities, where he led the consolidation of 300 online servers and introduced several Web 2.0 initiatives. He holds a Master's degree in Electronics and Telecommunication.