July 13th, 2012
Desktop as a Service (DaaS) involves hosting virtualized desktops on a remote server and delivering them to many users simultaneously. This makes IT tasks considerably easier, since IT professionals no longer need to go physically to each computer to carry out repetitive tasks – for example, applying a major upgrade. Since all the machines connected to the cloud are effectively accessing the same applications and data, there is no need to apply upgrades or perform other major tasks on each employee’s desktop.
An overview of DaaS
In DaaS computing, each user’s computer is a workstation in the cloud. Using the Internet, each user can connect to their own virtual desktop, even from mobile devices such as a tablet PC or a smartphone. What each user sees on their device is actually an image of one of multiple desktops on a remote server. Being able to access this desktop from any device, anywhere in the world, gives users complete freedom and flexibility to carry out their work.
DaaS gives users the ability to carry out their work without needing an expensive and excessively powerful machine. Rather than having to purchase new memory or hardware components for each individual computer system, processing power can be scaled as needed. DaaS systems will often mean lower management costs, significantly reduced infrastructure needs, and more effective IT management. Best of all, DaaS eliminates many compatibility and operating system issues, since everything is happening on what is essentially the same machine running multiple simultaneous virtual desktops. In fact, as long as your device has an internet connection, it has the potential to access any operating system and any level of data processing power.
DaaS Is Cost Effective
Technology expenditures are often some of the highest up-front investments when building a business and setting up an office. Rather than the high cost of setting up traditional workstations, virtual desktops offer the advantage of a lower cost due to the decreased need for processing power and hardware in each workstation. In fact, hardware costs in a DaaS system will often be less than half of purchasing traditional workstations, and operating them annually is also considerably less expensive. A virtual desktop lasts longer, requires no local hard disk and memory expenditure, and consumes less energy. Operating system upgrades and migrations can be prohibitively expensive due to having to upgrade the server and all desktop and portable computers in a company. However, DaaS allows companies to invest less in each upgrade and extend the life of their hardware.
DaaS Is More Secure
Traditional workstations have a basic security disadvantage: sensitive data needs to be stored on each workstation. Because this data is distributed among numerous storage devices, it is more vulnerable to hacking, loss, or even natural disasters. Virtual desktops do not require storing data on multiple hard disks, so there is simply no local data available to be stolen, lost, or destroyed. One of the most common causes of data loss is computer users themselves. A lack of proper computer security understanding can lead to malware such as viruses and worms entering an office’s network. With a virtual desktop, this risk is almost nonexistent and, thanks to being able to update and back up a single location rather than each workstation, maintaining a company’s data security is easier and more effective. DaaS can even serve as an effective backup and disaster recovery aid in itself.
DaaS Allows Greater Mobility and Flexibility
DaaS is one of the best resources for remote workers. Being able to transfer seamlessly from a home computer to a mobile device to an office computer with all the same windows staying open and the same applications running continuously is the greatest advantage of DaaS. Using any device, workers can access their own virtual desktop without having to interrupt their work or carry around data in removable memory drives or other, similar devices. For frequent travelers, businesses that require constant mobility, freelancers, and workers from home, DaaS is the best possible solution. DaaS also allows scaling of resources and quick implementation of changes to software and applications while also eliminating compatibility problems.
Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.
About the Guest Author:
Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.
July 12th, 2012
There are significant uncertainties in launching a new game as compared to creating a business or personal-use software product. To begin with, with non-gaming applications you are fairly clear about the demand for your product and its features, so there is less overall risk. Of course you still have to do a great job and market your software well, but at least you know what works in the market.
Cutting-edge game producers, when writing a new game, are never so certain. Will the game go viral and recoup every investment a hundred times over, or will it not? You will never really know until at least the beta is released and enough hardcore gamers have had a chance to play the game against each other.
Games have also become more complex, network oriented, and multiplayer. This means that a gaming company has to make a significant investment in back-end hardware to support the game. As the last section of this article will show, even casinos are betting big on cloud-based games to attract clients.
Fortunately, using the cloud to develop and host the game offers a very practical method of reducing developmental risk. This benefits the gaming industry by making it more affordable to produce new games and offer them to users. It also benefits players by bringing down costs, making massive multiplayer games possible and improving the overall experience.
Here’s looking at some specifics –
Reducing Development Cost
You can rent access to a cloud service for very little cost and without making any long-term commitments. The advantage is that you get access to the very best computing resources without making any kind of fixed investment in servers, databases, and processing power. These low costs allow you to experiment without worry. Since you are not incurring large costs in renting hardware, approvals to experiment are far easier to obtain. Who can argue with starting with a $35 payment as against shelling out thousands of dollars to buy servers and operating systems?
Games are no longer confined to a single operating system or a single hardware platform. No one in his right mind would restrict himself so severely. Therefore you need to provide your development team a number of different target operating systems in varying hardware configurations to test their output and make sure that a consistent user experience is obtained. Previously, serious gaming companies would maintain servers running different operating systems and test their games on them. Naturally this led to higher costs. If you had limited computing resources, you ended up installing and uninstalling multiple operating systems. I can tell you that this could be pretty frustrating. Using cloud computing, you can hire multiple instances and run a different OS on each. Testing has now become easy and you can integrate it tightly into your development process. This saves time and money and keeps your developers smiling.
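The test-matrix idea above can be sketched in a few lines – one short-lived cloud instance per OS/hardware combination instead of endless local reinstalls. The OS names and hardware profiles below are purely illustrative and not tied to any real cloud API:

```python
# Sketch: enumerate the OS/hardware test matrix a cloud setup makes cheap.
# Each pair would correspond to a rented instance in practice.
import itertools

operating_systems = ["windows", "linux", "macos"]      # hypothetical targets
hardware_profiles = ["low-end", "mid-range", "high-end"]

test_matrix = list(itertools.product(operating_systems, hardware_profiles))

for os_name, profile in test_matrix:
    print(f"provision test instance: {os_name} / {profile}")

print(len(test_matrix), "instances total")  # 3 x 3 = 9
```

Renting nine small instances for an afternoon of testing is exactly the kind of experiment that a fixed server room makes expensive and the cloud makes trivial.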
Using cloud services, testing can be extended to real users as well, at the lowest possible cost. You just need a small and controlled expenditure to deploy your game fully. There is no need to scrimp, because the costs are so low.
The production version can be deployed rapidly and scaled up in near real time as traffic grows. This ensures that costs only increase as revenue picks up, and you do not eat into company resources upfront.
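The scale-with-traffic principle can be illustrated with a minimal capacity rule. The function name, player counts, and the 200-players-per-server threshold below are invented for illustration and do not come from any real provider SDK:

```python
# Sketch: capacity (and therefore cost) tracks traffic instead of being
# bought upfront. Thresholds here are hypothetical.

def desired_instances(current_players, players_per_server=200, minimum=1):
    """Return how many game servers to run for the current player count,
    rounding up so no single server is overloaded."""
    needed = -(-current_players // players_per_server)  # ceiling division
    return max(minimum, needed)

# As traffic picks up, the fleet (and the bill) grows with it:
for players in (0, 150, 800, 5000):
    print(players, "players ->", desired_instances(players), "servers")
```

A rule like this, wired to a provider's scaling API, is what lets costs rise only as revenue does.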
Quality Issues
For some time there was a genuine concern about quality. Would cloud-based games be able to deliver the same video experience that a top-of-the-line graphics card could provide on a local host? This worry has been laid to rest with the coming of age of extremely powerful graphics processing capability in the cloud. As a result, you can shift the processing to the cloud rather than rely on the user’s end device. This ensures that all your users, regardless of the device they use, will get the same video experience. On the cloud you can hire increasingly powerful processors to ensure that your users get an amazing experience and the “wow!” factor.
From Individual Players to Casinos
Many casinos are shifting to using cloud computing for their games. Imagine the benefits – there are no coins to manage, accounts are credited instantly, and cash handling becomes much easier. All of this comes with a user experience that is secure and amazing, with frequently upgraded games and with previews provided to clients even before they set foot in the casino.
Cloud computing has changed the gaming industry beyond recognition. Everything now looks real and possible in the virtual world.
About the Guest Author:
Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and in creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.
July 11th, 2012
The United States Department of Defense (DoD) has long been one of the foremost early adopters of new technology. In fact, the DoD has played an essential role in accelerating the advance of various technologies, with a tremendous impact on the private sector and commercial market. This is especially true in areas such as aviation, computing, and energy. Indeed, without the DoD’s support and development of new technology, technologies such as GPS and the internet itself would probably be considerably underdeveloped today – if they even existed in their present form at all! Because of this, the strategies used by the DoD, which are thoroughly researched and developed, can often serve as a strong guideline for private sector companies looking to adopt new technology.
Businesses today can view the DoD’s dramatic expansion of its use of cloud computing technology in the coming years as a massive endorsement of this technology and its positive effects on efficiency, security, and especially IT costs. In the DoD Cloud Computing Strategy document, released this week, the DoD outlines a four-step plan designed to promote the use of cloud computing within the DoD, as well as to train those responsible for acquisitions on how to contract cloud services and understand cloud computing technology. This lengthy document, outlining in detail the many aspects of cloud computing, can serve as a great resource for large companies looking to implement a cloud computing strategy in their operations.
According to the DoD Cloud Computing Strategy, the following four points will be essential for moving to a combined government and commercial cloud strategy:
The four points enumerated above can serve as a basic template for businesses or government agencies wishing to make a large-scale move towards cloud technology. Some of the cloud services that will be essential in this move include cloud-based collaboration; web-based communication via instant messaging, temporary chat rooms, conferencing, and email; and integrating multimedia and data services on the cloud to ensure easy access from multiple locations.
Nearly all of the DoD’s cloud services will be managed by the Defense Information Systems Agency, with the main authority residing in the DoD’s chief information officer, Teri Takai. According to the DoD Cloud Computing Strategy, the goal of moving to the cloud is to ensure that the people involved in the DoD can access data whenever they need it, from any device, regardless of where they are. This freedom is what distinguishes cloud computing and puts it leagues ahead of traditional data storage and management.
To mitigate the possible security risks of using commercial cloud services, the DoD has indicated that it will maintain rigorous control over the data stored on them. Mission-critical data whose loss could compromise or interrupt DoD operations will not be hosted on commercial clouds. These decisions will be made by the Defense Information Systems Agency (DISA). This combined model of government and commercial clouds is a hybrid cloud model that has come to be accepted as an industry standard for ensuring better scalability and increased security.
So how can a business apply the DoD’s strategies to its own implementation of cloud computing? Perhaps by establishing the following three recommendations learned from the DoD’s strategy:
About the Guest Author:
Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.
July 10th, 2012
More and more businesses are discovering that cloud computing can save them a lot of headaches when it comes to managing their servers. By offloading the tasks of running hardware and software to an offsite cloud provider, cloud computing enables companies to focus better on core business services and management issues.
In addition, businesses save on software licensing costs as well as overhead and IT costs. Instead, they only pay subscription fees to the cloud provider for the online applications they use. At the same time, businesses can increase productivity with regards to software use; cloud computing gives you the power to provide your employees unlimited access to shared resources and databases. Cloud-based services are also scalable – you can expand in a snap as your business grows.
Because of these benefits, cloud computing has attracted many start-ups and small businesses who don’t have the budget for maintaining an onsite IT team or a dedicated physical infrastructure to depend on.
Migrating to the cloud, however, is much more complex for companies with legacy systems. How do you relocate old systems and software to the shiny new world of the cloud?
With proper planning and execution, these hurdles can be easily overcome. Here are the steps on how to migrate legacy systems without a fuss:
Create a Plan of Action
Migrating legacy systems to the cloud is not the same from business to business. Each business will have different system specifications and requirements, so you need to come up with a solid strategy.
Planning the migration of legacy systems to the cloud well allows a business to determine the functionalities it needs. More importantly, it allows the business to improve its processes and increase the chances of a successful migration. When creating a plan of action, you should:
Determine the resources you need. In order for migration to be successful, you have to identify your current workloads, their configuration, licensing requirements, apps, hardware, etc.
Categorize workloads into easy, slightly complex, and complex to migrate. You can determine this by looking into the following elements: type of application (e.g. web server, streaming service); resource usage (e.g. CPU, memory); shared components, etc.
Set a schedule for migration. Migration means server downtime, so you should decide on a date and time when migration will not interfere much with business operations. Start with workloads that are easy to migrate – basically those that use fewer resources, run on a single server, or are low-risk applications.
To migrate or not to migrate. You should understand that not everything can be migrated to the cloud. You may succeed in migrating some applications using processes such as recompilation, while others should remain as they are. Some assets you may want to keep in-house include HR payroll software and Photoshop. Others are best relocated to the cloud, particularly if you have workers in different locations or constantly roaming workers. Before deciding whether to keep or migrate things, you should be clear on the risks and merits of your choices. An experienced IT professional will be able to help you identify what you need to move to the cloud and what is best kept in its current state.
In addition to the technical requirements, businesses should also identify their business objectives – are they in line with the scope of a private cloud? Some of the criteria you need to review include infrastructure cost and acquisition times.
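The workload-categorization step above can be sketched as a small scoring routine. The scoring rules below are hypothetical and only illustrate the idea; a real assessment would weigh many more factors:

```python
# Toy sketch of sorting workloads into easy / slightly complex / complex
# to migrate, based on the elements mentioned above. Rules are illustrative.

def categorize(workload):
    """Score a workload description (a plain dict) and map the score
    to one of the three migration-difficulty buckets."""
    score = 0
    if workload.get("servers", 1) > 1:        # spans multiple servers
        score += 1
    if workload.get("shared_components"):     # depends on shared pieces
        score += 1
    if workload.get("high_resource_usage"):   # heavy CPU/memory use
        score += 1
    return ("easy", "slightly complex", "complex")[min(score, 2)]

print(categorize({"servers": 1}))                             # easy
print(categorize({"servers": 1, "high_resource_usage": True}))
print(categorize({"servers": 3, "shared_components": True}))
```

Even a crude triage like this makes it obvious which workloads belong in the first migration wave and which need more planning.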
Act on the Plan
Now that you have a plan, you need to put it into action. First, you have to look for cloud providers you can trust. You also need to inform business departments as well as stakeholders about the big change. Ask them to refrain from making urgent requests of IT staff until after the migration process has been completed.
Finally, you need to train managers and operational staff on how to use the new cloud-based system.
Monitoring Results
Cloud computing is a relatively new technology. Though your cloud provider may have the best intentions, things may go wrong during the relocation, so you need to monitor post-migration events.
Some of the issues you may encounter include over-configuration of virtual servers and the absence of records of old versions. These problems should be addressed immediately.
About the Guest Author:
Rodolfo Lentejas, Jr. is a full-time freelance writer based in Toronto. He is the founder of PostSckrippt, a growing online writing business dedicated to producing top quality, original and fresh content. To know more about him, please visit www.postsckrippt.ca. Like him on Facebook or follow him on Twitter, Google+ and Pinterest.
July 9th, 2012
Many enterprises are starting to recognize the benefits of migrating legacy IT systems to the cloud. However, for some organizations, concerns over security and privacy still prevail, especially when it comes to a public cloud environment. Although this is a legitimate concern and one that requires a lot of planning on the part of the organization, there are applications which are suitable for the public cloud even if your cloud host is located near your competitors. According to studies conducted by Gartner and others, more than three quarters of enterprises are currently using the public cloud for enterprise applications or are planning to use a public cloud model in the near future. The majority of these enterprises have more than 600 staff members and are using or planning to use public cloud environments for applications such as CRM, email, and other purposes. Some of these applications are SaaS (Software as a Service), while others are not mission critical to company operations. Migration to the cloud involves many considerations during the planning phase. Meanwhile, there are enterprise applications which are fitting for a public cloud model regardless of your type of business.
Pay as You Go Development Apps
Development and testing can be a costly proposition for enterprises, but at the same time it is a necessary activity for business growth and expansion. When it comes to business applications and SQL, server sprawl can be significant, which results in under-utilization of server resources. Instead, development and testing can be accomplished using a public cloud model, which allows you to pay as you go for public cloud services. This alleviates concerns over performance and storage when implementing the continuous integration and agile methodologies which are a necessary part of development and testing.
CRM Apps
Customer relationship management is typically delivered via a SaaS model which is not directly connected with other enterprise applications except for tools such as email and order tracking and management. This allows a CRM solution to be easily migrated to a public cloud service. It is important to note that this does not include customer data since there are compliance regulations which must be met to ensure data security and privacy.
Message Archives
Many enterprises face regulations in terms of being required to archive past email messages for a specified length of time. These are daily correspondences which are typically managed on an internal server but in recent years have been moved to a cloud environment. By using public cloud storage, message archives can be easily managed and stored at a much lower cost than an in-house infrastructure.
Occasional Apps
If you take a look at the many applications which are used for enterprise operations chances are you will discover that you have applications which are used infrequently or hardly ever. These are applications which can be moved to a public cloud environment. Since public cloud services are typically on a pay as you go basis you can save on costs while increasing space and improving the performance of mission critical applications which are accessed on a daily basis.
Large Data Archives
If you have a massive amount of data which must be placed in the archives within a relatively short period of time, moving it to the public cloud simplifies the process while saving your enterprise time and money. If the amount of data is large enough it could potentially require as much as several hundred servers. By archiving the data to a public cloud this can be accomplished within a matter of hours and without the exorbitant cost of procuring the hardware infrastructure to archive the data.
Finally, any applications which are supportive rather than mission critical are good candidates for the public cloud. These include website components such as PDF documents and images which are public information, customer support applications, training server applications, and other apps which can safely be moved to a public cloud service while utilizing the level of security offered by the service. Many public cloud services are deploying new technologies for security and privacy, which is a sign that senior managers and IT professionals will be making more use of the public cloud in the near future. Meanwhile, a careful review of the purpose of different business applications and how they are used avoids a lot of the concern that surrounds security and privacy. As a result, these applications can be easily moved to the public cloud with a flexible payment arrangement which helps enterprises save on IT costs while increasing productivity.
About the Guest Author:
Aeyne Schriber has more than two decades of accumulated experience in IT security, computer technology, and internet marketing, including technology education and administration at both the public school and college levels. She works worldwide, helping companies from small businesses to large enterprises establish an online presence. Her skills as a published copywriter and marketer also include consulting and training corporate personnel and entrepreneurs. Visit her site at www.digitalnewmediamarketing.com
July 6th, 2012
Not too long ago, many organizations would wrestle with the opportunities and negative consequences of cloud technology. Back then, many had qualms about placing their valuable data in a faraway storage location that, if jeopardized, could potentially cause a major operational disturbance. What makes the cloud seem even more doubtful is that it’s intangible – there’s no hardware that you can monitor 24/7.
So the logical thing to do is to spread the risk either over several cloud providers or to off-cloud infrastructure, or both. But with the SLAs and redundancy being offered by most cloud providers today, it seems that such a move may be unnecessary. However, we can’t discount historical events that suggest otherwise. Remember the Amazon EC2 outage last year?
One cloud still poses risks
There are many cloud providers today that claim to have an all-in-one package that’s simple yet scalable, powerful, and easy to install. However, such providers don’t really suggest that you distribute cloud-associated risks over multiple solutions. Instead, they offer a combination of private and public clouds or hybrid clouds. But at the end of the day, it’s still one and the same cloud they’re offering you, just with an extra location.
But apart from hedging outage risks, there are other reasons why your organization should opt for multiple cloud providers. For one, you’ll benefit from geographical diversity and get served from different data centers in different locations. Next, you’ll get to choose which provider offers the best infrastructure for specific workloads. Think of cloud providers as suppliers: keep your independence and prevent supplier lock-in by sustaining good relationships with many suppliers. This way, you’ll be able to determine the strengths and weaknesses of each and choose which of them can cater to your requirements. Then there are also legal issues, such as country laws (e.g. in Germany and the UK) requiring that data that originated in a specific country remain within it. And so on.
Multi-cloud platforms are becoming the trend
In the 2012 State of Cloud Computing research done by InformationWeek, results reveal that a staggering 73% of survey respondents were utilizing several cloud providers. This is because the IT teams of most organizations today support different applications, hardware infrastructure, and operating systems. Hence, it’s not surprising that they have embraced a multi-cloud provider approach.
Having multiple cloud providers has a downside too
It’s good to know that there are many deployment platforms out there that seek to make the use of multiple clouds easier. But even with such platforms in place, you’d still have to fully know the requirements of each and every cloud provider you choose.
Take note that while the main objective of a multi-cloud solution is to minimize risk, it inevitably increases complexity. You’d have double, triple, or even more – depending on how many cloud providers you use – configurations and changes to constantly monitor. What makes things even more difficult is that you need to deploy your system across cloud platforms that differ from each other, meaning you’d have to pay closer attention to the specific requirements of each particular stack.
Bear in mind that if your organization is unable to account for the added complexities of a multiple-cloud approach, the reliability of the applications in the cloud will be jeopardized. Performance and security issues may arise as a result of the failure to properly identify and keep track of the changing configurations that are specific to cloud platforms.
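The configuration-tracking problem described above can be illustrated with a small sketch: compare each provider's deployed settings against a single baseline and flag the drift. Provider names and settings below are entirely hypothetical:

```python
# Sketch: detect per-provider configuration drift in a multi-cloud setup.
# Settings and provider names are invented for illustration.

baseline = {"tls": "1.2", "backup": "daily", "region_redundancy": True}

deployments = {
    "provider_a": {"tls": "1.2", "backup": "daily", "region_redundancy": True},
    "provider_b": {"tls": "1.0", "backup": "daily", "region_redundancy": False},
}

def config_drift(deployments, baseline):
    """Return, per provider, the settings that deviate from the baseline."""
    drift = {}
    for name, cfg in deployments.items():
        diffs = {k: cfg.get(k) for k in baseline if cfg.get(k) != baseline[k]}
        if diffs:
            drift[name] = diffs
    return drift

print(config_drift(deployments, baseline))
```

Running a check like this regularly is one way to keep the platform-specific configurations the article warns about from silently diverging.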
Use multiple cloud providers to achieve redundancy
Remember that redundancy in applications, data, and systems in the cloud simply cannot be attained with a single cloud provider. If you really want true redundancy, you have to enlist the services of several cloud providers, but at the same time you have to prepare yourself for monitoring servers, bandwidth, application performance, configuration, and so on. Of course, there are services out there that will help you look after your cloud deployments and guide you in building the right infrastructure for your organization’s specific requirements. You would be wise to avail of such services to ensure that your multi-cloud setup works well for you.
About the Guest Author
Rodolfo Lentejas, Jr. is a full-time freelance writer based in Toronto. He is the founder of PostSckrippt, a growing online writing business dedicated to producing top quality, original and fresh content. To know more about him, please visit www.postsckrippt.ca. Like him on Facebook or follow him on Twitter, Google+ and Pinterest.
July 5th, 2012
It is interesting to note the unprecedented increase in the use of cloud computing in all categories of businesses nowadays, most especially in the gaming industry. According to Gartner, Inc. – the world’s leading information technology research and advisory company – one-third of consumers’ digital content could be stored in the cloud by the year 2013. An optimistic view of how broadly cloud computing is being accepted by corporate decision-makers!
It is not surprising that Nintendo, a frontline game company, is gearing up to provide a Wii U “cloud storage” feature for its new console in 2013, so that gamers can have peace of mind with their game saves and player profiles securely stored through the new Nintendo Network. This network is reportedly under construction with the help of Mozy, Inc. – a Seattle-based company that has been offering cloud backup and storage for consumers since 2005.
Nintendo is expected to offer 512MB of online storage to every gamer, solely for storing their game saves and player profile data on the company’s network. This means that players cannot store any other digital content on the Nintendo Network. The benefit is projected to be available to gamers in Japan first in 2013 before being launched globally for Nintendo’s international consumers.
One may conjecture that Nintendo decided to offer online storage of game saves and player data to keep pace with competitors that are giving their players similar benefits. Currently, Microsoft offers its players 512 MB of free cloud storage for game saves and user data through Xbox Live. Sony has done likewise with the newly released PS3 Firmware 3.60, though only for PlayStation Plus users for now. Sony makes it easy for PlayStation Plus users to upload their game saves to Sony’s cloud storage, securely protecting their games while allowing unrestricted access to them anywhere.
Gamers have a lot to gain from these giant game companies offering them free cloud storage for their game saves and personal data. They can rest assured that their games and data are securely stored in these companies’ cloud storage facilities, from where they can access them anywhere and anytime.
The cost of storing consumers' digital content in their own cloud storage, however, could be steep for these gaming companies. Though exact figures are not known, maintaining the cloud storage networks would likely run into millions of dollars per year.
Outsourcing cloud storage to third parties, however, could save these gaming companies from dedicating time and a large part of their budgets to developing and maintaining such networks.
There is no doubt that cloud computing will impact the gaming industry positively as players and gaming companies discover better ways to protect player profiles and game saves. For a start, cloud storage spares players the extra cost of setting up physical backup systems, and keeps the hard drives of their computers and gaming devices from being crowded with game saves and other important data they need to keep in a secure place.
Cloud storage also provides greater storage capacity for both players and gaming companies, so neither needs to fret over how much data can be stored as the volume of consumer digital content grows over the years. This gives players maximum satisfaction and removes any fear that they might lose their game saves, personal information or profiles over the long term.
Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.
About the Guest Author:
Jerry Olasakinju, a Bachelor of Technology (B.Tech) degree holder, is a passionate researcher and writer whose interest in everything computing is unparalleled. He blogs about his literary works at http://jerryolasakinju.blogspot.jp/
July 4th, 2012
When companies migrate their business processes to the cloud, service assurance is an important aspect of the migration. CEOs need reassurance that a minimum acceptable level of service will be provided and that there are legal safeguards to ensure this level of service can be enforced. Only when a vendor offers guarantees instead of disclaimers will a prudent CEO agree to move away from the certainty of on-premise IT.
All competent cloud service providers therefore understand that a well-written Service Level Agreement (SLA) instills confidence in the corporate user and is instrumental in winning customers. There are also nuances to the SLA that users must understand if they are to get the best out of their solutions.
In a number of cases, vendors specify an uptime of 99.5% or more for your service, but they also state that to get this level of service you must subscribe to the service in two different geographic locations. Obviously, if you do not subscribe correctly, you will not get the guaranteed uptime. This is a typical case of needing to read the SLA carefully.
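It is worth translating such a percentage into concrete downtime. A minimal sketch of the arithmetic (using 8,766 hours as the average length of a year, including leap years):

```python
def allowed_downtime_hours(uptime_pct, hours_per_year=8766.0):
    """Annual downtime permitted by an uptime guarantee."""
    return (1 - uptime_pct / 100) * hours_per_year

print(round(allowed_downtime_hours(99.5), 1))  # 43.8 hours per year
print(round(allowed_downtime_hours(99.9), 1))  # 8.8 hours per year
```

In other words, a 99.5% guarantee still permits almost two full days of outage per year, which is why the fine print about how that figure is measured matters.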
There are several issues in cloud SLAs that differ from other service level agreements, owing to the very nature of the cloud itself, and these factors need to be well understood. Some of them are discussed below; each must be handled carefully in the SLA.
Who owns the Metadata?
The first of these relates to the ownership of data. If you were simply hiring storage space in the cloud, things would be straightforward: whatever data you store is yours. But if you are hiring Software as a Service, things get more complex. The business data the software generates is obviously yours, but what about the other data it creates: usage trends, failed login attempts, metadata, user feedback, preferences and complaints, traffic statistics? Does this belong to the client hiring the software or to the company hosting it? You need to understand these issues, and how critical they are to your business, before you sit down to draft the SLA.
What happens if the government wants to see your data?
Your service level agreement must define what will happen if a government agency makes a request for your data. While the service provider may be legally bound to comply with the request, your SLA must stipulate that your company will be informed before any data is actually handed over, so that you have the opportunity to take necessary action such as contesting the demand or obtaining legal counsel. On the other hand, there are legal requirements for archiving and maintaining your data, specifically for financial or medical data. The service provider must be told this explicitly in the SLA so that it becomes duty-bound to protect the data. Also make clear what legal penalties the service provider will be liable for should it fail to meet these obligations.
Well-Defined Metrics
The SLA must also describe in detail how the level of service and the uptime the vendor provides will be measured. How will service requests be handled, and in what timeframe? What will the penalties be for not meeting them?
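One simple way such a measurement might be defined is as the fraction of periodic health checks the service answers; this is a hypothetical sketch, not a standard method, and the probe counts below are illustrative:

```python
# Hypothetical sketch: measured uptime from periodic health checks.
# Sample counts and probe interval are illustrative, not from any real SLA.
checks = [True] * 997 + [False] * 3   # e.g. one probe per hour; True = responded

measured_uptime = 100 * sum(checks) / len(checks)
print(round(measured_uptime, 2))      # 99.7
print(measured_uptime >= 99.5)        # True: within a 99.5% guarantee
```

The SLA should pin down exactly these details: who runs the probes, how often, and what response counts as "up".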
Data Security
Other important issues the SLA must address in detail are the levels of data security and segregation the vendor will provide, and what will constitute a security breach. While it may seem unnecessary to define what a breach is, it is important to do so, because at times an incident may not result in any loss of data or business but is nevertheless significant. The SLA must specify that every breach or incident affecting your company or its data is reported to you, along with the remedial action taken. You must also ensure that the provider gives you advance warning of any maintenance that may slow down your service. Finally, it is critical to establish what the service provider's role will be in supporting your business continuity in case of a disaster, natural or man-made.
Termination
There are also issues around termination of the relationship: what assistance will the vendor provide if you want to migrate your applications to another vendor or bring them back in-house? Whether the vendor can outsource your work to another cloud provider, and if so which terms of the SLA will apply to the new vendor, must likewise be clarified in the SLA.
Is all of this possible?
Will you be able to secure all these terms and more when you set out to move to the cloud? If you are a small company without much leverage, chances are you will not. You may never even meet the cloud company's staff, since all activities may be completed online under a standard SLA. These are nevertheless issues you must think about. Obviously, if you are a large company bringing the vendor big business, you will have more leverage; but whether you are big or small, the importance of understanding the SLA clearly cannot be overstated.
About the Guest Author:
Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and in creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.
July 3rd, 2012
When it comes to transitioning to a cloud model, regardless of the industry, change is a necessary part of ensuring that the migration is successful and appropriately meets the needs of the business. Organizational changes must be identified in terms of both staff and technology if cloud migration is to solve business problems; this is what is meant by mapping IT resources to business needs when deploying a cloud model. The whole focus of cloud migration is to solve business problems, not merely technical problems within an IT infrastructure. By streamlining business processes through cloud deployment, you help a business move forward. This means organizational changes must focus on meeting mission-critical business needs when planning a cloud migration.
Organizational Changes That Help to Meet Business Needs
To ensure that a new cloud model will meet business needs, certain organizational changes must take place to keep the focus on solving business problems rather than technical ones. This requires laying the initial groundwork with people instead of technology. The more time you invest in working with key people within your organization, the more influence you will have when planning a transition to the cloud that helps the business move forward. This will also reduce the resistance you face from those who are reluctant to change, even when they know the change will better serve business needs and help the organization operate more efficiently.
To implement business-enabling change you must identify the key players within the organization: those focused on business tasks, professionals focused on technology and IT initiatives, administrators who are IT task oriented, and planning professionals adept at looking at the big picture of organizational change. Then you must identify the roles in each of these categories so that changes to those roles can be identified and implemented. In some of these categories the changes needed will be greater than in others.
Scenarios of Role-Based Changes
When cloud-based deployment is focused on business needs, it is important for staff to work closely with management to identify new roles in the key areas mentioned above. This requires maintaining a focus on quickly delivering services that solve business problems and appropriately serve the needs of the organization. A typical cloud deployment scenario of this kind is geared toward business solutions more than toward solving technology problems.
So when you invest your time more heavily in working with people, in terms of organizational change and identifying what is working and what needs improvement, the technology will tend to take shape much more easily, with many technical problems addressed automatically once you make the human factor the priority during the deployment of a cloud model.
About the Guest Author:
Aeyne Schriber has more than two decades of experience in IT security, computer technology and internet marketing, as well as in technology education and administration at both the public school and college levels. She works worldwide, helping companies from small businesses to large enterprises establish an online presence. Her skills as a published copywriter and marketer also include consulting and training corporate personnel and entrepreneurs. For more information, please visit www.digitalnewmediamarketing.com
July 2nd, 2012
As cloud computing is gradually adopted across enterprises, a number of misconceptions about public cloud hosting persist. One of these concerns is: does it bring about true cost reduction, or is it a mere cost transfer?
Corporate decision-makers are constantly looking for ways to cut the cost of running their businesses, and since cloud computing came along, a number of companies have reportedly used it to reduce their overheads. But how did they do it?
Depending on their size and the nature of their operations, companies normally establish in-house IT systems to handle ongoing activities that include, but are not limited to, e-mail, data processing, data storage, communications, electronic publicity, decision support and marketing applications. All of these functions are primarily hosted in-house, which in most cases costs a great deal to implement, maintain and customize in pursuit of greater productivity.
A public cloud is one way of consuming Infrastructure as a Service (IaaS), giving business managers the opportunity to have their IT managed by external providers. In practice, this should allow companies to reduce spending on purchasing, implementing and developing software (both at end-user workstations and at the server level); on servers and their accompanying hardware, firewall infrastructure and storage facilities (including backup hardware and backup servers for security); on physical accommodation (space); and on the associated personnel expenses (salaries, benefits and consultants' fees). Using a public cloud service should take these responsibilities off the shoulders of corporate IT managers and allow them to focus on their core business.
In principle, cloud computing provides a buffer for corporate executives: they no longer need to bother with managing complex in-house IT systems, and they can enjoy peace of mind knowing that their IT is hosted securely in an external, non-intrusive environment to which they have unrestricted access. This practice helps business managers save on their IT costs. However, some pundits have argued that this is not always true; some believe that hosting a company's IT systems in a public cloud amounts to a mere cost transfer from the company to the external cloud provider. Could this observation be true?
Typically, a mid-size company can be expected to spend between $130,000 and $200,000 a year on its IT; consider this illustrative breakdown of its possible yearly expenses.
Pre-Cloud Technology Adoption Expenses
Description | Cost ($)
Software & hardware (purchase, implementation, customization) | 50,000
Servers and server maintenance | 40,000
Personnel expenses (salaries etc.) | 40,000
Total | 130,000
Post-Cloud Technology Adoption Expenses
Description | Cost ($)
Software & hardware (purchase, implementation, customization) | 30,000
Servers and server maintenance | None
Personnel expenses (salaries etc.) | None
Public cloud hosting & other services | 60,000
Total | 90,000
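Using the illustrative figures in the tables above, the projected saving can be computed directly (a sketch of the arithmetic only; the line items are the hypothetical ones from the breakdown, not real vendor quotes):

```python
# Savings implied by the illustrative pre- and post-cloud figures above.
pre_cloud = 50_000 + 40_000 + 40_000       # software/hardware, servers, personnel
post_cloud = 30_000 + 60_000               # software/hardware, cloud hosting & services

saving = pre_cloud - post_cloud
print(saving)                               # 40000 dollars per year
print(round(100 * saving / pre_cloud, 1))   # 30.8 percent
```

On these numbers, the company saves $40,000 a year, roughly 31% of its former IT budget, rather than merely transferring the same cost to the provider.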
Businesses can save on their IT infrastructure costs through public cloud services because they pay only for the resources they actually use, rather than purchasing additional capacity in case usage increases through business expansion or other factors.
Public cloud computing also enables an "agile" IT system, allowing businesses to increase or decrease resources (processing power, memory and storage capacity) according to their business requirements. Such flexible scalability is not possible in a rigid in-house IT structure.
One caveat is the need to identify the particular requirements of the organization in order to maximize the use of resources. Once those needs are identified, the structure can be designed to minimize resource wastage.
There are several ways to configure a public cloud, and different configurations are commonly used by different kinds of business users.
About the Guest Writer:
Jerry Olasakinju, a Bachelor of Technology (B.Tech) degree holder, is a passionate researcher and writer whose interest in everything computing is unparalleled. He blogs about his literary works at http://jerryolasakinju.blogspot.jp/