07.17.2012
Many companies today choose traditional backup and disaster recovery solutions, overlooking the fact that cloud computing and device virtualization have far more to offer. Natural and man-made disasters in recent years – from the political turmoil in the Middle East to the tragic tsunami that hit Japan – have made businesses rethink their backup and disaster recovery strategies. However, studies indicate that many of the world’s largest corporations are still woefully behind when it comes to backup and disaster recovery; this is especially true of many financial services businesses in the United States.
United States financial institutions are particularly vulnerable to disasters and system downtime
The fact that data processing and acquisition is an essential aspect of financial services makes this lack of preparedness particularly worrying. Managing billions of contracts and tens of thousands of credit card transactions per second, the world’s largest banks rely on a strong network and computing infrastructure in order to carry out their business. This reliance on stable data centers will only increase due to the rise of online banking and the security risks that come with it.
Customers demand security in their financial transactions, and a disaster could cripple a financial institution’s operations. The findings of the 2012 Disaster Recovery Index are therefore worrying: only 35% of financial institutions in the United States are confident that their IT departments can execute effective disaster recovery operations, and only about 30% are confident of a quick recovery from system downtime. These are not hypotheticals: nearly half of the surveyed financial service providers indicated that system downtime costs them more than a quarter of a million dollars every year.
With data recovery playing such a critical role in the financial sector, it is worrying that more companies are not making sure that they are protected from a possible disaster. Many IT managers attribute this to a combination of limited resources and a lack of support from management. The complexity of these technologies can also be an issue. The unique combination of the highest possible security requirements for most of the data being handled, the need for speed, and the sheer scale of operations makes the infrastructure difficult to design and manage. The problem often stems from incomplete adoption of new technologies and a failure to consolidate existing infrastructure into optimized, manageable units.
The largest financial firms require a complex infrastructure that integrates physical platforms with private and public clouds as well as physical and virtual devices. Establishing a backup and disaster recovery solution that encompasses all of a firm’s infrastructure is an expensive and complex matter. More than three quarters of the financial firms surveyed in the 2012 Disaster Recovery Index indicated that there is currently no comprehensive solution that integrates all of these platforms into a single, manageable, and affordable disaster recovery strategy. Confusion and costly management errors are increasingly commonplace as firms juggle multiple vendors in an effort to manage multiple platforms. However, recent advances in cloud computing suggest that a strategy based mostly on the cloud can solve many of these problems and save money and effort in the long term.
Cloud computing and backup and disaster recovery strategies
Although cloud computing has an inherent advantage in keeping data accessible and resilient against disasters, virtual data still needs to be backed up. The cloud also offers backup solutions, and more companies are starting to adopt a remote backup service for their most crucial data. In fact, migrating data to the cloud can reduce IT operating costs significantly while making backup and disaster recovery operations easier to handle. It is important to understand, however, that blindly adopting the latest cloud and device virtualization innovations may not always be the best option from a backup and disaster recovery standpoint. While the cloud itself offers clear advantages here, it is the unwieldy combination of physical platforms and multiple virtual platforms that makes backup and recovery significantly more difficult. Because of this, one of the mistakes financial companies have made is simply bolting new technologies onto their existing strategy without first planning the infrastructure. To avoid the confusion and high costs that result, a careful IT infrastructure and implementation strategy should be laid out before adopting any new computing technology.
Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.
About the Guest Author:
Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.
07.16.2012
For many years there was stability in the database segment of the IT industry. Relational Database Management Systems (RDBMS) such as Oracle, SQL Server and MySQL (to name a few) had proven themselves capable of handling vast amounts of data, and that seemed to be all that modern businesses required. One of the leading solution providers in the ERP space claimed that its ERP database, with some 30,000 tables, could handle the complexity of practically any business.
But the advent of Big Data changed the database landscape like nothing before it. Many IT professionals and business executives still do not grasp the challenges and opportunities presented by Big Data, or how the cloud is uniquely positioned to handle it. Understanding a few simple concepts and capabilities will help you extract the best from this emerging area.
What is Big Data
In computing parlance, Big Data refers to datasets so large that they cannot be handled by conventional RDBMS. Handling data that can run to zettabytes in volume (10^21 bytes) is a different task altogether. You can encounter data at these volumes in scientific fields such as weather forecasting and pharmaceutical research. Any large city, with thousands of security cameras all recording video footage, could hit this kind of archival volume in a fairly short period of time. Large international retailers, relying extensively on RFID chips to manage the movement of inventory and an international supply chain, also generate enormous quantities of data.
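To get a feel for these volumes, the city-camera example can be worked out as back-of-the-envelope arithmetic. The camera count and bitrate below are illustrative assumptions, not figures from any particular city:

```python
# Back-of-the-envelope estimate of video archive growth for a large city.
# CAMERAS and BITRATE_BPS are illustrative assumptions.
CAMERAS = 10_000          # assumed number of city security cameras
BITRATE_BPS = 2_000_000   # assumed 2 Mbps per camera stream
SECONDS_PER_DAY = 86_400

bytes_per_day = CAMERAS * (BITRATE_BPS / 8) * SECONDS_PER_DAY
terabytes_per_day = bytes_per_day / 10**12
petabytes_per_year = bytes_per_day * 365 / 10**15

print(f"{terabytes_per_day:.0f} TB/day, {petabytes_per_year:.1f} PB/year")
```

Even under these modest assumptions the archive grows by hundreds of terabytes a day – tens of petabytes a year – which is already far beyond what a conventional RDBMS is designed to manage.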
While business analysts agree that there are gold mines of business information in this data, the challenge is to identify the trends and put them to business use. Doing so requires the capability to handle Big Data, and cloud based applications are uniquely well suited to the task.
Turning to the Cloud to handle Big Data
One initiative that has proved extremely successful in handling enormous quantities of data is Hadoop, an open source, cloud based Big Data management system. It was built from day one with the requirements of Big Data in mind, combining data processing capabilities with the physical cluster management needed to handle very large clusters of database servers and data. For example, in 2008, Yahoo! reported that it was running the world’s largest Hadoop application of the time on a 10,000 core Linux cluster, handling all web queries.
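Hadoop itself is a large system, but the MapReduce programming model at its heart is simple enough to sketch in a few lines. The following is a toy, in-process Python illustration of the model (not the Hadoop API): map emits key-value pairs, the framework groups them by key, and reduce aggregates each group. Hadoop runs these same phases across thousands of machines.

```python
from collections import defaultdict

def map_phase(record):
    # Emit a (word, 1) pair for every word in the input record.
    for word in record.split():
        yield word.lower(), 1

def reduce_phase(key, values):
    # Aggregate all values emitted for one key.
    return key, sum(values)

def mapreduce(records):
    groups = defaultdict(list)        # the "shuffle" step: group by key
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, vs) for k, vs in groups.items())

counts = mapreduce(["big data big clusters", "big wins"])
print(counts)   # word frequencies across all records
```

The point of the model is that `map_phase` and `reduce_phase` contain no distribution logic at all; the framework can therefore parallelize them freely over any number of machines.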
Two years later, Facebook announced that it was running the largest cluster, handling 21 PB of storage (1 PB = 10^15 bytes); soon thereafter, this grew to 30 PB. Storage and processing capability on this scale simply cannot be built in a conventional in-house IT center; you need the kind of capacity that only the cloud can provide. In yet another example of Big Data management, the New York Times used 100 Amazon Elastic Compute Cloud instances to process 4 TB of raw images into 11 million PDF files. The task took about 24 hours and cost – hold your breath – all of $240. This is the kind of capability that cloud computing makes possible.
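The arithmetic behind the New York Times figure is worth spelling out. The hourly rate below is an assumption, chosen to match 2007-era EC2 small-instance pricing; the instance count, runtime and total come from the account above:

```python
# Cost of the New York Times conversion job as simple arithmetic.
# The $0.10 per instance-hour rate is an assumed figure roughly
# matching EC2 small-instance pricing of that era.
instances = 100
hours = 24
rate_per_instance_hour = 0.10  # assumption

total_cost = instances * hours * rate_per_instance_hour
print(f"${total_cost:.0f}")  # → $240
```

The striking part is not the total but the structure of the cost: 2,400 instance-hours were rented only for the day they were needed, instead of buying 100 machines that would sit idle afterwards.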
In yet another case of using the cloud to handle Big Data efficiently, a cancer research team assembled a 50,000 core virtual supercomputer, spanning data centers across the globe, to perform core cancer research computations. It ran for three hours and cost $4,828 per hour. A supercomputing cluster of this kind can be set up by a qualified person in a matter of hours, without external help.
Maybe you are not yet into Big Data, but if you are passionate about growing your company, it is only a matter of time before you encounter Big Data with its many challenges and opportunities. Handling Big Data need not be a constraint: cloud computing techniques let you get the best out of large data sets that your competitors may be afraid to make use of.
About the Guest Author:
Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and in creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.
07.13.2012
Desktop as a Service (DaaS) involves hosting multiple virtualized desktops simultaneously on a remote server. This makes IT tasks exponentially easier, since IT professionals no longer need to go physically to each computer to carry out repetitive tasks – applying a major upgrade, for example. Since all the machines connected to the cloud are effectively accessing the same applications and data, there is no need to apply upgrades or perform other major tasks on each employee’s desktop.
An overview of DaaS
In DaaS computing, each user’s computer is a window to a workstation in the cloud. Over the Internet, each user connects to their own virtual desktop, even from mobile devices such as a tablet or a smartphone. What each user sees on their device is actually an image of one of many desktops running on a remote server. Being able to access this desktop from any device, anywhere in the world, gives users complete freedom and flexibility to carry out their work.
DaaS gives users the ability to carry out their work without an expensive, excessively powerful machine. Rather than purchasing new memory or hardware components for each individual computer, processing power can be scaled as needed. DaaS systems often mean lower management costs, significantly reduced infrastructure needs, and more effective IT management. Best of all, DaaS eliminates many compatibility and operating system issues, since everything happens on what is essentially the same machine running multiple, simultaneous virtual desktops. As long as your device has an internet connection, it can potentially access any operating system and any level of data processing power.
DaaS Is Cost Effective
Technology expenditures are often among the highest up-front investments when building a business and setting up an office. Rather than the high cost of setting up traditional workstations, virtual desktops offer lower costs because each workstation needs less processing power and hardware. In fact, hardware costs in a DaaS system will often be less than half of those of traditional workstations, and annual operating costs are also considerably lower. A virtual desktop lasts longer, requires no local hard disk or memory expenditure, and consumes less energy. Operating system upgrades and migrations can be prohibitively expensive when the server and every desktop and portable computer in a company must be upgraded; DaaS allows companies to invest less in each upgrade and extend the life of their hardware.
DaaS Is More Secure
Traditional workstations have a basic security disadvantage: sensitive data must be stored on each workstation. Because this data is distributed among numerous storage devices, it is more vulnerable to hacking, loss, or even natural disasters. Virtual desktops do not store data on multiple local hard disks, so there is simply no local data to be stolen, lost, or destroyed. One of the most common causes of data loss is computer users themselves: a lack of basic security awareness can let malware, viruses, and worms into an office’s network. With a virtual desktop, this risk is almost nonexistent, and being able to update and back up a single location rather than each workstation makes maintaining a company’s data security easier and more effective. DaaS can be an effective backup and disaster recovery aid in itself.
DaaS Allows Greater Mobility and Flexibility
DaaS is one of the best resources for remote workers. Its greatest advantage is being able to move seamlessly from a home computer to a mobile device to an office computer with the same windows open and the same applications running continuously. Using any device, workers can access their own virtual desktop without interrupting their work or carrying data around on removable memory drives or similar devices. For frequent travelers, businesses that require constant mobility, freelancers, and those who work from home, DaaS is the best possible solution. DaaS also allows resources to be scaled and software changes implemented quickly, while eliminating compatibility problems.
About the Guest Author:
Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan and can be found on Twitter @nidarasheed.
07.12.2012
Launching a new game involves significantly more uncertainty than creating a business or personal-use software product. With non-gaming applications, you are fairly clear about the demand for your product and its features, so there is less overall risk. Of course you still have to do a great job and market your software well, but at least you know what works in the market.
Cutting edge game producers, when writing a new game, are never so certain. Will the game go viral and recoup the investment a hundred times over, or will it not? You will never really know until at least the beta is released and enough hard-core gamers have had a chance to play the game against each other.
Games have also become more complex, network oriented and multiplayer. This means that the gaming company has to make significant investments in back end hardware to support the game. As the last section of this article will show, even casinos are betting big on cloud based games to attract clients.
Fortunately, using the cloud to develop and host a game offers a very practical way of reducing development risk. This benefits the gaming industry by making it more affordable to produce new games and offer them to users. It also benefits players by bringing down costs, making massively multiplayer games possible and improving the overall experience.
Here’s a look at some specifics:
Reducing Development Cost
You can rent access to a cloud service for very little cost and without any long term commitment. The advantage is that you get access to the very best computing resources without making any fixed investment in servers, databases and processing power. These low costs allow you to experiment without worry: since you are not incurring large costs in renting hardware, approvals to experiment are far easier to obtain. Who can argue with a $35 starting payment as against shelling out thousands of dollars to buy servers and operating systems?
Games are no longer confined to a single operating system or a single hardware platform; no developer in his right mind would restrict himself so severely. You therefore need to give your development team a number of different target operating systems, in varying hardware configurations, on which to test their output and ensure a consistent user experience. Previously, serious gaming companies would maintain servers running different operating systems and test their games on them, which naturally led to higher costs. With limited computing resources, you ended up installing and uninstalling multiple operating systems – a frustrating exercise. Using cloud computing, you can hire multiple instances and run a different OS on each. Testing becomes easy and integrates tightly into your development process, saving time and money and keeping your developers smiling.
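The OS-and-hardware matrix described above can be sketched as a simple enumeration. The OS names, hardware profiles and the `run_suite` stub below are all illustrative assumptions; in practice each combination would map to a short-lived cloud instance running the game's real test suite:

```python
from itertools import product

# Hypothetical cross-platform test matrix. Names and the run_suite()
# stub are illustrative, not a real testing framework.
operating_systems = ["Windows 7", "Ubuntu 12.04", "OS X 10.7"]
hardware_profiles = ["low-end", "mid-range", "high-end"]

def run_suite(os_name, profile):
    # Placeholder for launching an instance and running the game's tests.
    return f"ran tests on {os_name} / {profile}"

results = [run_suite(os_name, profile)
           for os_name, profile in product(operating_systems, hardware_profiles)]
print(len(results), "configurations tested")
```

The matrix grows multiplicatively (here 3 × 3 = 9 configurations), which is exactly why renting instances on demand beats maintaining nine physical test machines.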
Cloud services also extend testing to real users at the lowest possible cost: a small, controlled expenditure is enough to deploy your game fully, and with costs this low there is no need to scrimp.
The production version can be deployed rapidly and scaled up in near real time as traffic picks up. This ensures that costs only increase as revenue grows, and you do not eat into company resources upfront.
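The scale-with-traffic idea can be sketched as a simple capacity policy. Everything below is a hypothetical illustration: the players-per-server figure is an assumed capacity number, and a real deployment would feed the result to a cloud provider's scaling API, which is omitted here:

```python
import math

# Hypothetical scale-with-demand policy for a game back end.
# players_per_server is an assumed capacity figure.
def desired_instances(concurrent_players, players_per_server=200,
                      minimum=1, maximum=50):
    """How many game servers to run for the current player load."""
    needed = math.ceil(concurrent_players / players_per_server)
    return max(minimum, min(maximum, needed))

# Capacity (and therefore cost) follows demand: a quiet beta runs
# one server, a viral launch scales out toward the configured cap.
for load in (0, 150, 5_000, 100_000):
    print(load, "players ->", desired_instances(load), "servers")
```

The `minimum` keeps the game reachable when empty, and the `maximum` caps spending; between the two, the bill tracks the player count rather than a fixed up-front hardware purchase.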
Quality Issues
For some time there was a genuine concern about quality: would cloud based games be able to deliver the same video experience that a top of the line graphics card provides on a local machine? This worry has been laid to rest with the coming of age of extremely powerful graphics processing capability in the cloud. As a result, you can shift the processing to the cloud rather than relying on the user’s device. This ensures that all your users, regardless of the end user device they use, get the same video experience. In the cloud you can hire increasingly powerful processors to give your users an amazing experience and the “wow!” factor.
From Individual Players to Casinos
Many casinos are shifting to cloud computing for their games. Imagine the benefits: no coins to manage, instant crediting of accounts, and much easier handling of cash. All of this with a secure and amazing user experience, frequently upgraded games, and previews provided to clients before they even set foot in the casino.
Cloud computing has changed the gaming industry beyond recognition. Everything now looks real and possible in the virtual world.
About the Guest Author:
Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and in creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.