
Cloud Backup and Disaster Recovery Woes In the Financial Sector

Many companies today still rely on traditional backup and disaster-recovery solutions, overlooking the fact that cloud computing and device virtualization have far more to offer. Natural and man-made disasters, from political turmoil in the Middle East to the tragic tsunami that hit Japan, have prompted businesses to rethink their backup and disaster-recovery strategies. Yet studies indicate that many of the world’s largest corporations are still woefully lacking in backup and disaster recovery, and that this is especially true of financial services businesses in the United States.

United States financial institutions particularly vulnerable

Because data processing and acquisition are essential to financial services, this lack of preparedness is particularly worrying. Managing billions of contracts and tens of thousands of credit card transactions per second, the world’s largest banks rely on a strong network and computing infrastructure. This reliance on stable data centers will only increase with the rise of online banking and its accompanying security risks.

Yet the 2012 Disaster Recovery Index’s findings make worrying reading for customers: only 35% of financial institutions in the United States are confident their IT departments can execute effective disaster recovery operations, and only about 30% are confident of a quick recovery from system downtime. Nearly half of the surveyed financial service providers indicated that system downtime costs them more than a quarter of a million dollars every year!

With data recovery playing such a critical role in the financial sector, why are more companies not ensuring they are protected against a possible disaster? IT managers indicate the problem stems from a lack of resources and a lack of support from management. The complexity of the technologies involved can also be an issue: the combination of the highest possible security requirements, the need for speed, and large-scale operations is difficult to translate into infrastructure. Often the root cause is an incomplete adoption of new technologies and a failure to consolidate existing infrastructure into optimized, manageable units.

The largest financial firms require a complex infrastructure that integrates physical platforms with private and public clouds, as well as physical and virtual devices. Establishing a backup and disaster recovery solution that encompasses all of a firm’s infrastructure is both expensive and complex. More than three quarters of financial firms surveyed in the 2012 Disaster Recovery Index indicated that no comprehensive solution currently integrates all of these platforms into a single, manageable, and affordable disaster recovery strategy. Confusion and costly management errors are increasingly commonplace when multiple vendors manage multiple platforms. However, recent advances suggest that a mostly cloud-based strategy can solve many of these problems, saving both money and effort in the long term.

Cloud backup and disaster recovery

Although cloud computing has an inherent advantage in keeping data accessible and resilient against disasters, it is still important to back up virtual data. With the cloud offering such backup solutions, more companies are starting to adopt a remote backup service for their most crucial data. In fact, migrating data to the cloud can significantly reduce IT operating costs as well as simplify backup and disaster recovery operations.
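As a concrete illustration, here is a minimal off-site backup sketch, assuming an S3-compatible object store reachable through the boto3 SDK; the bucket name and file path are placeholders rather than any particular provider’s details:

```python
# Minimal off-site backup sketch. Bucket name and file path are
# placeholders; credentials and region come from the environment.
import datetime
import boto3

def backup_file(path: str, bucket: str = "example-backup-bucket") -> str:
    s3 = boto3.client("s3")
    # Timestamped key so each day's backup is retained separately.
    key = f"backups/{datetime.date.today()}/{path.rsplit('/', 1)[-1]}"
    s3.upload_file(path, bucket, key)
    return key

if __name__ == "__main__":
    print(backup_file("/var/data/transactions.db"))
```

Scheduled nightly with cron, a job like this keeps a rolling off-site copy of critical data without any dedicated backup hardware.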

That said, blindly adopting the latest cloud and device virtualization innovations is not always the best option from a backup and data recovery standpoint. While the cloud itself offers clear advantages here, it is the unwieldy combination of physical platforms and multiple virtual platforms that makes backup and data recovery significantly more difficult. A common mistake among financial companies has been to bolt new technologies onto their existing strategy without first planning the underlying infrastructure. To avoid the confusion and high costs that result, a careful IT infrastructure and implementation strategy should be laid out before adopting any new computing technology.

An equally important aspect, of course, is security. Before planning disaster recovery, businesses need to make sure their security setup is reliable and effective. What are the attributes of good security? Visit GMO Cloud’s Security page to see how much value is placed on this aspect of the business.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Nida Rasheed

 

Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.

Share on LinkedIn

When Data Grows – the Cloud is Built for Big Data

For many years there was stability in the database segment of the IT industry. Relational Database Management Systems (RDBMS) such as Oracle, SQL Server and MySQL (to name a few) had proven themselves capable of handling vast amounts of data. One of the leading solution providers in the ERP space said that its ERP database, with some 30,000 tables, could handle the complexity of practically any business.

But the advent of Big Data changed the database scenario overwhelmingly. Many IT professionals and business executives still do not comprehend the challenges and opportunities presented by Big Data and how the cloud is uniquely positioned to handle this. Understanding a few simple concepts and capabilities will help you extract the best from this emerging area.

What is Big Data?

In computing parlance, Big Data is defined as “datasets so large as to be unmanageable for conventional RDBMS.” Handling zettabytes (10²¹ bytes) of data is a different task altogether, and volumes of this order arise in scientific fields, weather forecasting and pharmaceutical research. Similarly, any large city, with its thousands of security cameras all recording video footage, could easily reach this kind of archival volume. Large international retailers, relying extensively on RFID chips to manage the movement of inventory across an international supply chain, also generate enormous quantities of data.

While business analysts agree that this data holds gold mines of business information, the challenge is to extract the trends and put them to business use. That requires the capability to handle Big Data, and currently only cloud-based applications provide it.

Turning to the Cloud to handle Big Data

One initiative that has proven extremely successful in handling enormous quantities of data is the open-source Big Data management system Hadoop, a natural fit for cloud infrastructure and built from day one with the requirements of Big Data in mind. Hadoop combines data-processing capability with the cluster management needed to run very large collections of servers and data. For example, in 2008, Yahoo! reported that it was running the world’s largest Hadoop application on a 10,000-core Linux cluster, producing data used in every Yahoo! web search query.
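To make the data-processing half concrete, here is a minimal sketch of the kind of job Hadoop distributes across a cluster: a word count written for Hadoop Streaming, which lets the mapper and reducer be plain Python scripts reading stdin and writing stdout. File names and paths here are illustrative.

```python
#!/usr/bin/env python3
# mapper.py: emit "word<TAB>1" for every word read from stdin.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py: sum the counts for each word. Hadoop sorts mapper
# output by key, so identical words arrive on consecutive lines.
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, n = line.rsplit("\t", 1)
    if word != current_word:
        if current_word is not None:
            print(f"{current_word}\t{count}")
        current_word, count = word, 0
    count += int(n)
if current_word is not None:
    print(f"{current_word}\t{count}")
```

The pair would be launched with the hadoop-streaming jar that ships with Hadoop (the exact path depends on the installation), along the lines of: hadoop jar hadoop-streaming.jar -input pages/ -output counts/ -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py. Hadoop splits the input across the cluster, runs the mapper on every split, sorts by key, and feeds the reducer – the same pattern scales from one machine to thousands.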

Two years later, in 2010, Facebook announced that it was running the largest Hadoop cluster, handling 21 PB of storage (1 PB = 10¹⁵ bytes); this grew to 30 PB soon after. Storage and processing capability on this scale simply cannot be built in a conventional in-house IT center. You need the kind of capabilities that only the cloud can provide. In yet another example of Big Data management, the New York Times used 100 Amazon Elastic Compute Cloud instances to process 4 TB of raw images into 11 million PDF files. The task took about 24 hours and cost – hold your breath – all of $240. This is the kind of capability rendered possible by cloud computing.
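The arithmetic behind that figure is worth a glance. The per-instance-hour rate below is inferred from the numbers in the story itself, not quoted from any price list:

```python
# Back-of-the-envelope check on the New York Times example:
# 100 EC2 instances running for about 24 hours at a total cost
# of $240 implies roughly $0.10 per instance-hour.
instances = 100
hours = 24
total_cost = 240.0

rate = total_cost / (instances * hours)
print(f"${rate:.2f} per instance-hour")  # -> $0.10 per instance-hour
```

Renting 2,400 machine-hours for the price of a dinner is precisely the economics that make such one-off Big Data jobs feasible.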

In yet another case of using the cloud to handle Big Data efficiently, a cancer research team assembled a 50,000-core supercomputer, spread across data centers around the globe, to perform core cancer research computations. It ran for three hours at a cost of $4,828 per hour. Such a supercomputing cluster can be set up by any qualified person in a matter of hours, without any external help.

Maybe you are not yet into Big Data, but if you are passionate about growing your company, it is only a matter of time before you encounter Big Data with its many challenges and opportunities. Big Data need not be a constraint! You can take advantage of cloud computing techniques to get the best out of the large data sets your competitors may be afraid to use.

To gain a better understanding of how big data can be optimally managed in the cloud, visit GMO Cloud’s High Availability Configuration page.

 

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Due to a military background, his focus has always been on the stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of rural folk.

Share on LinkedIn

Key Benefits of Desktop as a Service (DaaS)

Desktop as a Service (DaaS) involves hosting multiple virtualized desktops simultaneously on a remote server, which makes many IT tasks dramatically easier. IT professionals no longer need to visit each computer physically to carry out repetitive tasks – applying a major upgrade, for example – since all machines connected to the cloud are effectively accessing the same applications and data.

An overview of DaaS

In DaaS computing, each user’s computer is a workstation in the cloud. Using the Internet, each user can connect to their own virtual desktop, even from mobile devices such as a tablet PC or smartphone. What each user sees on their device is actually an image of one of multiple desktops on a remote server. Being able to access this desktop from any device, anywhere in the world, gives users complete freedom and flexibility to carry out their work.

Better still, DaaS doesn’t require expensive, excessively powerful client machines. Rather than purchasing new memory or hardware components for each individual computer, processing power can be scaled as needed. DaaS systems often mean lower management costs, significantly reduced infrastructure needs, and more effective IT management. DaaS also eliminates many compatibility and operating system issues, since everything happens on what is essentially the same machine hosting multiple simultaneous virtual desktops. In fact, as long as your device has an internet connection, it can potentially access any operating system and any level of data-processing power.

DaaS Is Cost Effective

Rather than bearing the high cost of setting up traditional workstations, virtual desktops cost less because each workstation needs less processing power and hardware. In fact, hardware costs in a DaaS system will often be less than half those of traditional workstations, and annual operating costs are also considerably lower: a virtual desktop lasts longer, requires no local hard disk or memory expenditure, and consumes less energy. Operating system upgrades and migrations can be prohibitively expensive when the server and every desktop and portable computer must be upgraded; DaaS, on the other hand, allows companies to invest less in each upgrade and extend the life of their hardware.
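To see how those savings compound over a machine’s life, here is a rough five-year comparison. Every figure below is a hypothetical placeholder chosen only to illustrate the calculation, not vendor pricing:

```python
# Hypothetical five-year cost comparison: traditional workstations
# versus DaaS thin clients. All figures are illustrative placeholders.
SEATS = 50
YEARS = 5

# Traditional: $900 PC per seat plus $250/year upkeep and upgrades.
traditional = SEATS * (900 + 250 * YEARS)

# DaaS: $300 thin client per seat plus $180/year subscription.
daas = SEATS * (300 + 180 * YEARS)

print(f"Traditional: ${traditional:,}")  # Traditional: $107,500
print(f"DaaS:        ${daas:,}")         # DaaS:        $60,000
```

The exact numbers will vary by provider, but the structure of the saving is the same: cheaper, longer-lived endpoints and a predictable subscription in place of recurring hardware refreshes.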

DaaS Is More Secure

Traditional workstations have a basic security disadvantage: sensitive data must be stored on each workstation. Distributed among numerous memory devices, that data is more vulnerable to hacking, loss, or even natural disasters. Virtual desktops do not store data on multiple local hard disks, so there is simply no local data to be stolen, lost, or destroyed. One of the most common causes of data loss is computer users themselves: a lack of basic security awareness can lead to malware infestations that let viruses and worms into an office’s network. With a virtual desktop this risk is almost nonexistent, and because a single location can be updated and backed up rather than each workstation, maintaining a company’s data security is easier and more effective. DaaS can even serve as a backup and disaster recovery aid in itself.

DaaS Allows Greater Mobility and Flexibility

DaaS is one of the best resources for remote workers. Its greatest advantage is the ability to move seamlessly from a home computer to a mobile device to an office computer with the same windows open and the same applications running continuously. Using any device, workers can access their own virtual desktop without interrupting their work or carrying data around on removable drives. For frequent travelers, businesses that require constant mobility, freelancers, and home-based workers, DaaS is the best possible solution. DaaS also allows resources to be scaled and changes to software and applications to be implemented quickly, while eliminating compatibility problems.

Of course, the very first step toward your organization’s own DaaS is to get your own cloud server. Visit GMO Cloud’s Hosting page to see which essential functions to consider when buying a cloud server.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.

Share on LinkedIn

Casinos Bet Big on the Cloud

There are significant uncertainties in launching a new game compared to creating a business or a personal-use software product. In non-gaming applications, you are fairly clear about the demand for your product and its features, meaning less overall risk. Of course you still have to do a great job and market your software, but at least you know what works in the market.

Cutting-edge game producers are never so certain when writing a new game. Will it go viral and recoup every investment a hundred times over? Or will it completely bomb? You will never really know until at least the beta is released and enough hard-core gamers have had a chance to play.

Games have gradually become more complex, network-oriented and multiplayer in nature, meaning gaming companies must invest significantly in back-end hardware to support them. As the last section of this article will show, even casinos are betting big on cloud-based games to attract clients.

Fortunately, using the cloud to develop and host the game offers a very practical method of reducing developmental risk, making it more affordable to produce new games and offer them to users. This also benefits players by bringing down costs, making massive multi-player games possible and improving the overall experience.

Here are some specifics –

Reducing Development Cost

You can rent access to a cloud service at little cost and without long-term commitments, getting the very best computing resources with no fixed investment in servers, databases or processing power. These low costs let you experiment without worry, making approvals far easier to obtain. Who can argue over starting with a $35 down payment as opposed to thousands of dollars spent on servers and operating systems?

Games are no longer confined to a single operating system or a single hardware platform – no developer in their right mind would restrict themselves so severely. You therefore need to provide your development team with different target operating systems in varying hardware configurations to test output and ensure a consistent user experience. Previously, serious gaming companies would maintain servers running different OS’s and test their games on them.

Naturally this led to higher costs; with limited computing resources, you ended up installing and uninstalling multiple operating systems, which was frustrating. Using cloud computing, however, you can hire multiple instances and run different OS’s across them. Testing becomes easy and can be integrated tightly into your development process, saving time and money and keeping your developers smiling.
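As a sketch of what that looks like in practice, here is a test matrix spun up through the real boto3 SDK against an EC2-style API; the image IDs and instance type are placeholders, not real machine images:

```python
# Sketch: launch one test instance per target OS image.
# The AMI IDs and instance type below are placeholders.
import boto3

TEST_MATRIX = {
    "ubuntu-22.04": "ami-00000000000000001",
    "windows-2022": "ami-00000000000000002",
    "amazon-linux": "ami-00000000000000003",
}

ec2 = boto3.client("ec2")  # credentials and region from the environment

for os_name, ami in TEST_MATRIX.items():
    resp = ec2.run_instances(
        ImageId=ami,
        InstanceType="t3.medium",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": f"game-test-{os_name}"}],
        }],
    )
    print(os_name, resp["Instances"][0]["InstanceId"])
```

When the test run is over, the same loop can terminate the instances, so you pay only for the hours the matrix actually runs.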

Testing is also extended to real users at the lowest cost possible. You just need to make a small and controlled expenditure to deploy your game fully. The production version can be deployed rapidly and scaled up in near real time as traffic picks up. This ensures that costs only increase as revenue picks up and do not eat into company resources upfront.

Quality Issues

For some time there was a genuine concern about quality: would cloud-based games be able to deliver the same video experience that a top-of-the-line graphics card provides on a local machine? This worry has been laid to rest by the coming of age of extremely powerful graphics processing in the cloud. You can now shift the processing to the cloud rather than rely on the user’s device, ensuring that all users, regardless of their hardware, get the same video experience. On the cloud you can hire ever more powerful processors to give your users an amazing experience and the “wow!” factor.

From Individual Players to Casinos

Many casinos are shifting their games to the cloud. Imagine the benefits: no coins to manage, instant crediting of accounts, and much easier handling of cash – all with a secure and engaging user experience, frequently upgraded games, and previews offered to clients before they even set foot in the casino.

Cloud computing has changed the gaming industry beyond recognition. Everything now looks real and possible in the virtual world.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Due to a military background, his focus has always been on the stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of local rural folk.


Share on LinkedIn

What the Private Sector Can Learn About Government Implementation of Cloud Computing

The United States Department of Defense (DoD) has long been at the forefront of adopting new technology, accelerating the development of systems that have had a tremendous impact on the private sector and the commercial market. This is especially true in areas such as aviation, computing, and energy. Without DoD support for – and development of – new technology, revolutions such as GPS and the internet itself might not even exist in their present form! Because of this, the strategies used by the DoD can serve as a guideline for private sector companies looking to adopt new technology.

Businesses today can consider the DoD’s dramatic expansion of cloud computing as a massive endorsement of the concept and its effects on efficiency, security, and IT costs. In the DoD Cloud Computing Strategy document, released this week, the DoD outlines a four-step plan designed to promote the use of cloud computing within the department and to train those responsible for acquisitions in contracting cloud services and understanding the technology.

According to the DoD Cloud Computing Strategy, the following four points will be essential for moving to a combined government and commercial cloud strategy:

1. Use an outreach campaign to promote the use of cloud technology in DoD facilities, increasing the number of cloud users and cloud service providers associated with the DoD.

2. Eliminate unnecessary software duplication and unnecessary IT services, consolidating what remains in optimized data centers.

3. Deliver cloud services through the DoD’s own vendors, departments, and associated agencies.

4. Move toward cloud-based hardware and software options in existing DoD data centers.

These four points can serve as a basic template for businesses or government agencies wishing to make a large-scale move to the cloud. Other essential services include cloud-based collaboration; web-based communication via instant messaging, temporary chat rooms, conferencing, and email; and integrated multimedia and data services that ensure easy access from multiple locations.

Nearly all DoD cloud services will be managed by the Defense Information Systems Agency (DISA), with the main authority residing in the DoD’s Chief Information Officer, Teri Takai. According to the DoD Cloud Computing Strategy, the goal of moving to the cloud is to ensure that people involved in DoD affairs can access data whenever they need it, from any device, wherever they are. This freedom is what distinguishes cloud computing from the pack, putting it leagues ahead of traditional data storage and management.

To mitigate the security risks of using commercial cloud services, the DoD has hinted at rigorous control over data stored in these domains: mission-critical data whose loss could compromise or interrupt DoD operations will not be hosted on commercial clouds, with DISA making these decisions. This combined model of government and commercial clouds is essentially the hybrid cloud model, which has come to be accepted as an industry standard for balancing scalability and security.

So how can a business apply the DoD’s four strategies to its own cloud implementation? Perhaps by adopting the following three recommendations drawn from the DoD’s strategy:

1. Actively promote cloud computing in all offices and branches of the company, incentivizing personal use, course-taking and certification, and promoting its use among company affiliates such as vendors and distributors.

2. Cut back on unnecessary IT services and avoid software and hardware duplication wherever possible to cut costs, consolidating existing computing capabilities in optimized data centers.

3. Create a branch or division within the company charged with delivering cloud services, especially in the case of a private cloud for in-house use, in order to prevent security leaks. The role and structure of DISA can serve as a guide for organizing such a branch.

One obvious way of promoting the technology is to highlight its unique features and advantages. GMO Cloud offers functions that help organizations manage systems more efficiently; see the examples on the High Availability Features page.

 

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.

Share on LinkedIn
