August 10th, 2012
Although cloud computing has become mainstream in many organizations over the last few years, the transition from legacy IT systems to cloud environments is still in its infancy. While some enterprises have fully implemented cloud infrastructures, many others are still in the contemplation or planning phase. In a report entitled “Sizing the Cloud,” Forrester projected that the global market for cloud computing will reach $241 billion by 2020, a significant increase over $150 billion in 2011. The report is the first piece of cloud industry research to classify market sectors and category expansion on a global scale, which helps to forecast when each sector will hit full capacity. As demand for cloud computing increases, new technologies will be developed to support new cloud initiatives and enterprise needs. So what are some of the ways cloud computing may change by the end of this decade?
Cloud Integration and iPaaS
The current cloud software stack consists of three layers: IaaS (Infrastructure as a Service), which provides the foundation on which systems are deployed; PaaS (Platform as a Service), the middle layer that enables the integration of multiple applications as well as the development of advanced ones; and SaaS (Software as a Service), which delivers the end-user applications that support business productivity. A primary issue has been the development of these platforms in the absence of integration solutions. As cloud computing grows and more applications migrate to the new architecture, there will be a growing need for ways to integrate the disparate elements of a cloud model. iPaaS (integration Platform as a Service) offerings have recently been emerging and will be the next-generation platform for integrating cloud applications both with one another and with on-premises legacy applications.
Inexpensive Processing
As it currently stands, the 32-bit ARM architecture is in wide use; however, ARM has announced a 64-bit version of its architecture that reduces power consumption while improving performance. Although chips based on this technology will not be released until sometime next year, and they mostly target microprocessors for mobile devices, it is believed that high-performance, low-power chips will make their way into the server and storage tiers of cloud environments. Several chipmakers are already supplying ARM-based servers to providers of cloud infrastructure.
Faster Performance
The increased need for application distribution on a massive scale, coupled with faster processing, will result in data centers that run at much higher speeds. With the development of new technologies, data centers will eventually be able to deliver high-performance cloud environments running at 200 to 300 Gbps (gigabits per second) at the onset. This will allow enterprises to accomplish more in less time, and at a lower cost, than with in-house legacy IT systems.
Maintenance Automation
Current cloud environments already require less intervention from IT professionals, since many infrastructures can be controlled from a single interface rather than from the multiple locations that result from server sprawl. As new technologies emerge for maintaining hardware, software, and cloud security, cloud environments will essentially run themselves, automating routine maintenance tasks, updates, and patch management. This leaves even more time for enterprises to focus on new IT initiatives that contribute to company growth.
Cloud Specialization
Instead of the broadly differentiated services of IaaS, SaaS, and PaaS, cloud environments will become more specialized, with more providers offering customized services that would otherwise be carried out on company premises. This will allow more enterprises to shift their workloads to the cloud without having to plan for data classification or for dependencies on costly legacy IT systems.
The cloud of 2020 looks like a significant shift from the middleware PaaS model to an environment where customized solutions can easily be created. Additionally, the private cloud will take precedence over the public cloud, thanks to new technologies that will place the virtual private cloud within reach of many enterprises and organizations.
Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.
About the Guest Author:
Aeyne Schriber has more than two decades of experience in IT security, computer technology, and internet marketing, including technology education and administration at both the public school and college levels. She works worldwide helping companies, from small businesses to large enterprises, establish an online presence. Her skills as a published copywriter and marketer also include consulting and training corporate personnel and entrepreneurs. For more details, visit www.digitalnewmediamarketing.com
August 9th, 2012
The media industry has gone through major transformations in recent years, and one significant change is that some media companies have already begun moving their IT systems to the cloud for faster delivery and cost efficiency.
The new millennium has seen many media companies change their content format from analog to digital for easier storage, transmission, and upgrading. The volume of digital archives in the media industry has grown rapidly of late, putting immense pressure on media companies to devise appropriate mechanisms to protect, store, and update those archives.
Similarly, the recent surge in demand for media products and services has forced media companies to rethink the way they serve their insatiable customers: there is an unprecedented volume of demand for online videos, news, reports, commentaries, and the like, and media companies shoulder the responsibility of fulfilling their customers’ requests in the shortest time possible.
Media cloud, a term used to describe the adoption of cloud computing in the media industry, has been instrumental in helping media companies respond to customer demand in a faster, better-coordinated manner. Traditionally, customers could not access media products until some time after they had been produced. Today, with cloud technology, media companies can deliver digital products and content as soon as they are made, giving them a huge advantage in bringing digital products and services to market faster than before.
The media cloud offers greater capacity for digital content and keeps it archived for long periods so that customers can access it anytime, anywhere. This prolonged exposure of media content to customers has helped some media companies achieve greater profitability through increased sales and revenue. In addition, customers can receive the latest information in real time, because the flexibility and scalability of cloud computing allow media products to be updated without delay. In other words, cloud computing makes media products and services available globally, a quality that undoubtedly enhances business expansion and agility.
Since media companies are no longer bogged down by the costs of hiring IT personnel, maintaining inventory, and recurring capital expenditure on IT software and hardware, many find cloud computing remarkably inexpensive and stable. Cost efficiency is one of the innovation-driving metrics in business today: companies spend less on one part of their operations, say the IT system, and invest the savings elsewhere to increase performance and profitability. The media cloud encourages optimal use of technology and improves asset utilization, reducing the time it takes to bring media products and services to market.
However, the two main risks associated with the media cloud are (i) security and (ii) performance. Media companies that are reluctant to move their in-house IT infrastructure to the cloud often cite security as the main concern. They argue that exposing their content archives to subscribers whose activities they cannot control (once the content is in the cloud) may lead to the loss of vital content, and they worry that malicious users or subscribers may steal media products or tamper with their archives. Similarly, there is a grave concern about how a media cloud outage might affect operations; not all media companies are convinced that adopting the media cloud can guarantee uninterrupted delivery of products and services.
But as the issues of security and performance are dealt with in practice, the number of media companies adopting cloud computing will very likely increase in the years ahead. No success-oriented company wants to overlook the cost efficiency, speed of product and service delivery, and potential for business innovation that the media cloud promises.
About the Guest Author:
Jerry Olasakinju, a Bachelor of Technology (B.Tech) degree holder, is a passionate researcher and writer whose interest in everything computing is unparalleled. He blogs about his literary works at http://jerryolasakinju.blogspot.jp/
August 8th, 2012
While a number of security and regulatory concerns in the healthcare sector have held back universal adoption of cloud computing, there is no doubt that growth in this sector remains robust. The healthcare cloud market, worth just $1.8 billion in 2011, is expected to reach $5.4 billion by 2017. That is a compound annual growth rate of roughly 20 percent, exceptional by any standard. Yet while the market is growing robustly, no clear leaders have emerged, and no cloud service provider currently holds a market share greater than 5 percent. The next five years may therefore see plenty of consolidation, mergers, and acquisitions.
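The growth rate above is easy to sanity-check. A short Python snippet (illustrative only; the dollar figures are the ones quoted in this post) computes the compound annual growth rate implied by a market tripling from $1.8 billion to $5.4 billion over the six years from 2011 to 2017:

```python
# Compound annual growth rate: (end / start) ** (1 / years) - 1
start, end, years = 1.8, 5.4, 6   # market size in $ billions, 2011 -> 2017

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints 20.1%
```

So a tripling over six years works out to just over 20 percent compounded annually, consistent with the figure quoted in the text.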
Typically, the healthcare industry can be broken into clinical and non-clinical sectors. The clinical sector is where the sensitive work gets done, and it is the most heavily regulated. Applications in this domain include electronic health records, physician instructions and orders, investigations and imaging, radiographic software, and so on. The specific fields covered in the clinical domain are listed below –
The non-clinical domain is essentially concerned with the business end, handling issues such as patient billing, claims management, revenue cycle management, and employee management. It closely resembles most other businesses and represents the more easily implemented side. The specific modules in this domain are –
Cloud computing makes it easy to share a patient’s Electronic Health Records between doctors. If, for example, a person from New York were to face a medical emergency in Los Angeles, the doctor he consults should have the latest data about his medications, allergies, blood group, and any other medical problems. If the patient is unconscious or otherwise unable to tell, immediate access to this data could save much time. While there are no technological difficulties in implementing this, there are many regulatory issues involved; these issues and the Health Insurance Portability and Accountability Act (HIPAA) are presently delaying the full-fledged rollout of this capability.
Imaging is another major area where cloud computing can help the healthcare industry. Medical science has evolved its imaging capabilities far beyond X-rays. These images are now occupying thousands of gigabytes and must be stored because they provide an excellent record of the progress of a medical condition over time. Storage and retrieval of these images presents challenges that the cloud is uniquely poised to handle.
While security has often been cited as a concern, the fact remains that most cloud security issues have already been resolved, and large numbers of business enterprises have begun to put their faith in cloud security. In any case, it is often said that the physical and logical security that can be provided for cloud-based services and applications exceeds what most enterprises can provide for their in-house data services. Availability of services in the public cloud has also been shown to be much higher than that of captive services.
In summary, the healthcare industry faces a round of disruptive innovation as the presence of cloud computing increases. The cloud, as discussed earlier, is the perfect medium for delivering electronic health records. Once this really takes off, patients will save on duplicated tests and on the large amount of time wasted in waiting rooms. Billing will also become more transparent, with patients finally knowing exactly what they are being billed for. There is also a churn occurring in the industry, with pure cloud computing companies offering low-cost pay-as-you-go models. The new cloud-based healthcare services are also far easier to deploy and have a very shallow learning curve.
It is certain that in ten years’ time, cloud computing will change the way healthcare works in developed countries.
About the Guest Author:
Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and in creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.
August 7th, 2012
Trust is an integral part of every situation or relationship, whether business or personal. When it comes to cloud hosting, trust occupies, or at least should occupy, the top spot in core operations. This is because clients entrust their private information to a service provider that controls their data in the cloud. Giving a third party the authority to store and use your data can leave a person or company vulnerable to loss, theft, abuse, or tampering. Fear and apprehension about adopting cloud services are therefore natural feelings for any client, since a breach could be detrimental to their business and even their personal lives.
Whether a customer chooses a single cloud service provider or several, they are at risk when security measures are not in place. Some companies now opt to store their data with multiple cloud hosting providers: if one provider’s server crashes, a backup of the data survives on another server with a different provider. When your data sits on multiple servers around the world, however, it passes through more hands than usual, which raises the risk that it could be hacked. That does not necessarily mean your data is safer with a single provider; what matters most is whether your cloud hosts, one or many, implement strict security measures for your protection.
When selecting a cloud hosting provider, you should look at the types of security measures offered in your contract. This should cover protection not only for your software applications but also for the underlying hardware. One of the basic approaches is encryption, which ensures that data can be read only by authorized parties. Encryption should be required for data at rest as well as data in transit, especially when it moves between servers around the world. With encryption, it is paramount that key management be restricted to trusted individuals and that contingency plans exist for when key access is compromised. Highly advanced encryption services can, however, be expensive for the customer, which is why some companies choose to encrypt their data themselves before it ever leaves their hands.
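The idea of encrypting data before it leaves your hands can be sketched in a few lines. The toy below uses a one-time pad (XOR with a random key as long as the message) purely to illustrate the principle that the provider stores only ciphertext while the key stays with trusted parties; a real deployment would use a vetted cipher library rather than this scheme, and the "patient record" message is a made-up placeholder:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR each byte with a random key of equal length.
    # Illustrative only -- production systems should use a vetted cipher.
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"confidential client record"
key = secrets.token_bytes(len(message))  # key stays with the client

ciphertext = xor_cipher(message, key)    # only this is sent to the cloud host
assert ciphertext != message             # provider never sees the plaintext
assert xor_cipher(ciphertext, key) == message  # XOR is its own inverse
```

The key point of the sketch is the separation of duties the paragraph describes: the host stores `ciphertext`, while `key` management remains restricted to trusted individuals on the client side.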
Access controls must also be set up according to the kind of information and its level of confidentiality. When policies and procedures are strictly enforced, with corresponding penalties for internal non-compliance, data stands a better chance of remaining secure. Firewalls are also widely used to keep networks from being attacked by malicious individuals on the public internet. Exposure to harmful websites should be minimized, and authentication tokens, certifications, and login challenges should be part of the provider’s offerings.
Physical security is also a main concern in protecting data from unauthorized access or damage. Servers must be located in a highly secure facility accessible only to authorized individuals, with access codes, alarm systems, and cameras in place for heightened security. Where possible, the hardware itself should be hardened against attackers and natural disasters. When external media such as flash drives are used, they too can be encrypted, and restrictions should be placed on which media are allowed.
As you choose a cloud service provider to run your applications online, consider all of these security measures before signing a contract or agreeing to a fee that won’t be worth it in the long run. Make sure the provider prioritizes transparency, so that trust can be earned and sustained in your working relationship. Visibility into all aspects of operations allows you to monitor and identify security issues and risks that could harm your business. When cloud hosting providers remain open and communicative, customers feel safe entrusting their data for longer, and even at a higher price, because they know their privacy and protection are always the provider’s number one goal.
About the Guest Author:
Rodolfo Lentejas, Jr. is a fulltime freelance writer based in Toronto. He is the founder of the PostSckrippt, a growing online writing business dedicated to producing top quality, original and fresh content. To know more about him, please visit www.postsckrippt.ca. Like him on Facebook or follow him on Twitter, Google+ and Pinterest.
August 6th, 2012
The billion-dollar empire that is the gaming industry has hit a major breakthrough as cloud hosting rises to fame. With games hosted on virtual servers, gamers experience a new high of easier access and lower costs delivered by cloud hosting companies. With cloud gaming, data and applications live in the “cloud,” online, without the need for multiple expensive, high-maintenance, trouble-prone physical servers or machines.
Instead of purchasing high-end computers with the capacity to play the most in-demand games, anyone can sit back at home, or anywhere else, with just a laptop, tablet, or mobile phone and access a favorite video game, or play with or against someone on the other side of the world. An internet connection is all they need to stream games through ordinary computers. Simultaneously managing millions of players in any location playing the same game is just one of cloud gaming’s most astonishing feats. Not only can these virtual servers withstand high traffic without crashing or lagging; cloud hosting providers can also deliver the rich graphics that are key to attracting more players for even bigger profits.
These immense technological requirements of the gaming industry fit perfectly with the services cloud hosts can offer. Since players no longer need to visit a physical or online store to buy software, chips, or consoles, convenience is at their fingertips with cloud gaming. It is decidedly cheaper: an affordable monthly fee, no installation, and no hours spent downloading files. Though console manufacturers have taken some hits, gamers and gaming companies clearly benefit from this economical technology, which spares them from purchasing hardware or building infrastructure.
Just recently, Sony joined the cloud gaming bandwagon to further its brand and reach more gamers around the world, acquiring Gaikai, a cloud gaming provider spanning multiple platforms, for a whopping $380 million. This move by the makers of the popular PlayStation could be seen as a contingency plan to lift sales by investing in cheaper gaming services. It may also be a consequence of lower console sales, as more consumers come to prefer online-based games for their convenience, flexibility, and affordability.
The emergence of cloud gaming service providers gives gaming companies exactly the power they need to boost their business. One of the most common features available today is virtually unlimited servers, which lets gaming companies expand their offerings while keeping operations stable. Scalability options have also been an enticing feature of rising cloud providers, since they supply the necessary flexibility. Online gaming often experiences heavy load, which is why scale-up and scale-down options come in handy whenever companies need to adjust resources, such as the number of servers, to handle a surge or drop in demand.
Many cloud hosting providers on the market charge a traditional fixed price whether customers use few resources or many. However, customers now have a better option: a number of providers offer flexible terms based on the resources actually used, such as memory, disk space, and virtual CPUs. If you are starting your own gaming company, research which cloud hosting provider would be most beneficial for you. Some services charge monthly fees for a fixed period with no setup charges; but if you don’t want to be tied to a long-term commitment, you can look for companies offering services billed by the hour for more room to maneuver.
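The choice between a flat monthly fee and hourly billing comes down to a simple break-even calculation. The sketch below uses made-up prices purely for illustration (real rates vary widely by provider):

```python
# Hypothetical prices for illustration only; real provider rates vary.
flat_monthly = 120.0   # fixed monthly fee, unlimited server-hours
hourly_rate = 0.25     # pay-as-you-go rate per server-hour

# Hours of usage at which the two plans cost the same
break_even_hours = flat_monthly / hourly_rate
print(break_even_hours)  # 480.0

# Example: a test server running 8 hours a day, 20 days a month
usage_hours = 8 * 20                      # 160 hours
pay_as_you_go = usage_hours * hourly_rate
print(pay_as_you_go)     # 40.0 -- well under the flat fee
```

At these assumed rates, hourly billing wins for anything under 480 hours a month, which is why a light or bursty workload favors pay-as-you-go while a server running around the clock favors the flat fee.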
For gaming company masterminds, mobility is definitely a plus when managing a growing business in the cloud. With this in mind, it is important to choose a cloud hosting provider that lets you access and control every facet of your cloud gaming business anytime, anywhere. This kind of convenience is often a top priority for small and medium-sized gaming businesses dreaming of growing their empire worldwide. Since travel is inevitable in their business, being able to make decisions and implement changes from a laptop or phone is essential when selecting a provider. These evolving offerings only make the future of cloud hosting brighter, giving it more visibility and power in the gaming industry.
About the Guest Author:
Rodolfo Lentejas, Jr. is a fulltime freelance writer based in Toronto. He is the founder of the PostSckrippt, a growing online writing business dedicated to producing top quality, original and fresh content. To know more about him, please visit www.postsckrippt.ca. Like him on Facebook or follow him on Twitter, Google+ and Pinterest.
August 3rd, 2012
Many have regarded cloud computing as the fourth industrial revolution in information technology. There is no doubt that in the next decade, computing infrastructures will be largely cloud-based. That means big changes are coming in the industry; after all, we are still in this technology’s initial stages. Just as computer scientists in the 1960s could not have predicted the iPad as anything more than science fiction, it is very difficult to speculate on the state of cloud computing by the middle of the 21st century. However, this is a fast-evolving field, and huge growth is projected over the next decade. Analysts at Forrester, for example, have predicted that cloud computing will grow into a $150 billion industry in the next eight years. Although there are barriers to overcome, widespread use of cloud computing in most companies’ IT infrastructure will lead to sweeping change in all computing-related fields. Preparing for these changes is a great way to make sure your company is ready for what most experts agree will happen over the next eight years.
The Total Separation of Hardware and Software
The most important change will probably be SaaS becoming the norm rather than the exception. Software will no longer be tied to hardware, leading to a radically different way of conceiving IT infrastructure. In fact, most companies will no longer pin down their own IT infrastructure, relying instead on their cloud service providers. Infrastructure will, essentially, become invisible. Front-end applications built on a platform that is itself a service will entirely do away with hardware restrictions on software at the user’s end.
Thanks to the processing power inherent in the cloud’s scale, we will start to see the real power of HaaS (Hardware as a Service), with powerful modular software that can take advantage of the complexity the cloud affords. This will mean a radical change in software development, placing heavy emphasis on very large applications composed of multiple components that can be modified independently, without shutting the program down. Such modular software will require a radical change in programming mindsets and in IT management. A likely model for future software interactions can be found in today’s social media networks: software will have to become social, constantly interacting with different hardware and applications, much as Facebook users “like” different videos, posts, or pictures.
Lower Processing and Implementation Costs
For current businesses, this means lower hardware costs and cheaper access to cloud technology. Thanks to current projects such as the Open Compute Project, businesses will have access to powerful infrastructure that can be maintained quickly and easily while remaining invisible to the end user. Prices will also be driven down by the arrival of low-power, 64-bit-capable ARM chips, expected to enter the market in about a year. These processors will cut energy bills, leading to dramatically lower data center prices. And it will not only be ARM; competitors are working hard to bring their chips to the same power-usage standards, setting a trend for future data center processing.
Preparing for the Fourth Industrial IT Revolution
The greatest barrier to the future of cloud computing actually lies with the vendors and the industry. The technology is there; the main problem will be developing the openness and interaction necessary to implement the coming breakthroughs. Two things will be needed: full technical disclosure and clear industry standards. This is because the future of cloud computing is predicated on breaking computing down into its individual parts and having them interact freely, which requires everyone in the industry to interact in the same way. Of course, this will be an uphill battle: vendors that can charge for proprietary technology will not welcome standards, and full technical disclosure lets newcomers enter the market with a better chance of competing. Even so, the outlook is promising, with companies such as Huawei and HP joining the Open Compute Project and similar open source and open hardware initiatives. If anything, the real debate in the future will be where to draw the line on exactly how open vendors and businesses need to be.
About the Guest Author:
Nida Rasheed is a freelance writer and owner of an outsourcing company. Nida often finds herself wanting to write about the subjects closest to her heart. She lives in Islamabad, Pakistan, and can be found on Twitter @nidarasheed.
August 2nd, 2012
The adoption of cloud computing in the tourism industry can guarantee economies of scale and innovation.
Tourism Ireland is experiencing these possibilities first-hand, having moved its entire IT infrastructure to the cloud under a five-year contract worth €850,000.
Tourism Ireland stands to gain a lot from this development. The company manages over a dozen websites with a presence in more than 20 countries, providing tourist information for the countless visitors who arrive in Ireland every year. As the amount of information it handles each year increased, Tourism Ireland made the significant decision to reduce the cost of running its previous data center by moving its erstwhile in-house IT structure to a privately managed cloud. The company has since succeeded in cutting its hosting and energy costs by fifty percent.
This strategic move guarantees both economies of scale and greater innovation for the company. Hosting its IT infrastructure in the cloud affords Tourism Ireland the unique opportunity to concentrate on internal management and business expansion. Niall Powderly, head of IT infrastructure at the cloud company hosting Tourism Ireland’s IT systems, declared: “For enterprise organizations like Tourism Ireland, the (new) infrastructure offers levels of efficiency and opportunities to innovate that would have been unimaginable a few years ago.” Tourism Ireland thus operates optimally through the centralization of its IT systems, with no negative impact on the company’s internal or external activities.
Tourism Ireland achieves efficiency by creating a common resource pool, a combination of network, servers, storage, and customer-centered services, all hosted in the cloud. This also helps the company save on the cost of IT personnel and the contracted consultants who oversaw its previous in-house IT structure.
Niall Powderly finally gave this piece of advice: “Organizations that were reluctant to move IT off-premise are now looking at it because …networks are faster and more affordable. You can now move infrastructure to a data center at a very reasonable price and have no performance impact.”
Three significant merits of adopting cloud computing in the tourism industry stand out from Niall’s statement: networks are now faster and more affordable, infrastructure can be moved to a data center at a very reasonable price, and doing so has no performance impact.
Setting a significant example, Tourism Ireland’s move will likely motivate other tourism companies to sign up for cloud computing in the hope of reducing their overhead expenses and positioning themselves for innovation and unprecedented growth. Cloud computing can help companies in the tourism industry manage their customers’ data, operational processes, and business expansion in a way that guarantees cost efficiency and optimum performance. It is not yet clear how many European tourism companies will emulate Tourism Ireland’s efforts, but the trend is likely to continue as more firms discover what they stand to gain from adopting cloud computing strategies and technology.
Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.
About the Guest Writer:
Jerry Olasakinju, a Bachelor of Technology (B.Tech) degree holder, is a passionate researcher and writer whose interest in everything computing is unparalleled. He blogs about his literary works at http://jerryolasakinju.blogspot.jp/
August 1st, 2012
A recent survey of more than 2,000 IT decision makers in Japan and the Asia Pacific region, conducted during the first quarter of 2012, provided an overview of how enterprises are handling disaster recovery and the move away from tape backup technology in favor of cloud-hosted solutions. The study offered a snapshot of the common causes of downtime and data loss at a country-specific level, the locations and solutions used for off-site data backup and storage, areas of IT spending on data storage and disaster recovery, and the primary reasons for wanting to move from tape storage to alternative solutions in the cloud.
Data Loss and Downtime
According to the Disaster Recovery Survey 2012: Asia Pacific and Japan, conducted by the research company Vanson Bourne and commissioned by EMC, the top three causes of data loss were data corruption, hardware issues, and power loss. Sixty percent of IT decision makers indicated that hardware issues were the top reason for downtime and data loss, while 48 percent reported corrupt data as the main cause of downtime and business disruptions. Almost 45 percent cited power loss as a primary cause of data loss and downtime, with the highest percentage reported by IT decision makers from Singapore. Because power outages are rare in Singapore, the high percentage of respondents blaming power loss points to possible misconfigurations at the data center level or to variable quality in the IT infrastructure deployed in enterprises, in addition to the availability of high-speed bandwidth. In Japan, however, only 22 percent reported loss of power as the cause of data loss and downtime, while data corruption and hardware failure were cited at 52 percent. The amount of data lost over the last 12 months was just under 625GB.
Data Storage
Another key issue posed in the survey was the use of data storage and off-site backup locations. Fifty-seven percent reported that off-site data is stored at another company location within the same country. IT decision makers also reported using third-party providers for data storage and backup in the same country as the company’s primary location.
The most common response was that enterprises are seeking cloud storage and online backup solutions as a result of the increased availability of high-speed broadband. In fact, close to 70 percent of respondents reported that they wanted to move away from tape backup to alternative data storage solutions, citing faster backups (50%), the speed of data recovery and system restores (42%), and durability and life cycle (41%).
Given the cost-effectiveness and efficiency of online backup, tape no longer makes sense once cloud storage solutions are an option. With the ability to reduce costs while increasing recovery speed, off-site backup in the cloud makes tape difficult to justify. With cloud-based disaster recovery, critical infrastructure can be restored within minutes on a shared or privately hosted platform.
In terms of IT spending on disaster recovery, the study showed that most participants spend anywhere from 10 to 12 percent of their budget on disaster recovery initiatives, with over 50 percent obligated to have a disaster recovery plan in place to meet insurance and compliance guidelines and regulations.
Despite the differences in how countries in the Asia Pacific region handle data storage and backup, they share at least one challenge with enterprises in other parts of the world: moving an increasing amount of data within the same backup windows. According to the study, many enterprises in the region still rely on tape backup systems (38%) or disks (38%). By the same token, the survey indicated that a high percentage of enterprises (53%) plan to migrate from tape to faster, more efficient cloud solutions in order to improve their data backup, storage, and recovery initiatives.
About the Guest Author:
Aeyne Schriber has more than two decades of accumulated experience in IT security, computer technology, and internet marketing, including technology education and administration field both on the public school and college level. She works worldwide helping companies establish an online presence from small businesses to large enterprises. Her skills as a published copywriter and marketer also include consulting and training corporate personnel and entrepreneurs. For more details, visit www.digitalnewmediamarketing.com
July 31st, 2012
With larger numbers of companies shifting their operations to the cloud, users are becoming overwhelmed by the number of access codes, passwords, and URLs they must remember. There is also a need to manage user accounts and authentication so that the requirements of rights management and security are met. This, as any serious IT practitioner knows, is a basic cost of doing business: access control and user authentication have to be treated as a given.
The problem is becoming more complex because of the proliferation of smart phones and tablet computers. The days of only connecting through a LAN are over and any solution has to cater to a very large number of mobile devices as well.
With companies also opting for hybrid clouds, and at times choosing to keep some critical parts of their applications in-house, the problem of authentication and single sign-on is getting worse. Companies may even be working with several different cloud vendors and still expect to manage access control and user authentication from a single point. Under such circumstances, IT staff can have a difficult time supporting security, access control, and user rights management.
New SaaS Offering
A new branch of cloud services is now emerging that promises to handle these issues: access control and user authentication offered by Software as a Service (SaaS) vendors. These services work in the background with the user company’s Active Directory (AD) or Lightweight Directory Access Protocol (LDAP) service, mapping the rights and privileges granted to individual users onto the cloud-based services and applications the company uses. Individual users can simply get on with their work, while the job of managing access control falls to the service provider.
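In essence, such a service reads group memberships from the directory and translates them into roles in each cloud application. A minimal sketch of that mapping idea, in Python (the user, group, and application names are invented for illustration; a real service would query AD or LDAP rather than an in-memory dictionary):

```python
# Directory group memberships, as they might be pulled from AD/LDAP.
directory_groups = {
    "alice": ["Sales", "Managers"],
    "bob": ["Engineering"],
}

# How each directory group maps to roles in cloud applications.
group_to_app_roles = {
    "Sales": {"crm_app": "editor"},
    "Managers": {"crm_app": "admin", "reporting_app": "viewer"},
    "Engineering": {"build_app": "developer"},
}

def app_roles_for(user: str) -> dict:
    """Resolve a user's cloud-application roles from directory groups."""
    roles = {}
    for group in directory_groups.get(user, []):
        # Later groups override earlier ones: "Managers" promotes
        # alice from editor to admin in the CRM application.
        roles.update(group_to_app_roles.get(group, {}))
    return roles

print(app_roles_for("alice"))  # {'crm_app': 'admin', 'reporting_app': 'viewer'}
```

The point of the pattern is that rights live in one place (the directory), and every cloud application sees a consistent projection of them.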
A key issue in being able to offer such a service is being able to ensure near perfect availability for the authentication company itself. Most such companies take extreme steps to ensure that they have a service availability of greater than 99.99% because if they were to be off the grid for any substantial amount of time, they would have a cascading effect on their users who would not be able to log in. Even a 99.99% uptime means that there are nearly 53 minutes in a year that the service is not available and even this may not be acceptable to many users.
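The arithmetic behind that 53-minute figure is straightforward. A short sketch (the 99.99% figure comes from the text above; the function name is ours):

```python
def annual_downtime_minutes(availability_pct: float) -> float:
    """Minutes per year a service is unavailable at a given availability."""
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a non-leap year
    return minutes_per_year * (1 - availability_pct / 100)

# "Four nines" allows roughly 53 minutes of downtime per year;
# dropping to "three nines" allows nearly nine hours.
print(round(annual_downtime_minutes(99.99)))  # 53
print(round(annual_downtime_minutes(99.9)))   # 526
```

This is why each extra “nine” of promised availability is an order-of-magnitude harder commitment for the authentication vendor.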
The authentication vendor offers much more than the mere ability to log in and achieve single sign-on across several applications. It can create a workflow spanning several applications, directing the user from one to the next and giving the illusion of a single application while the user is actually being transferred between them. A comprehensive audit trail and report is maintained, and all user actions are logged. Since this is a third-party service, in some regulatory environments this audit trail can carry more weight than an internal company audit trail.
Since the authentication service is aware of the applications users are accessing, it can automatically provision additional instances of a cloud service when pre-defined thresholds are crossed and release those instances back to the cloud vendor when users log off. This enables ‘just in time’ provisioning and de-provisioning and can yield substantial savings while ensuring that a minimum level of service response is never violated.
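The threshold logic described above can be sketched in a few lines (the threshold values and function name are illustrative assumptions; a real service would call the cloud vendor’s provisioning API with the resulting count):

```python
def instances_needed(active_sessions: int,
                     sessions_per_instance: int = 100,
                     minimum_instances: int = 1) -> int:
    """Scale the instance count to current load, never dropping below a
    floor that preserves a minimum level of service response."""
    # Ceiling division: one extra instance for any partial batch of sessions.
    needed = -(-active_sessions // sessions_per_instance)
    return max(needed, minimum_instances)

# As users log in, instances are provisioned; as they log off, released.
print(instances_needed(0))    # 1 (the floor is never violated)
print(instances_needed(250))  # 3
```

Running this calculation on every login and logoff event is what makes the provisioning ‘just in time’: capacity tracks demand instead of being reserved up front.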
Major gains
What does the user company get from the service that it could not have done itself? The answer is time. Given enough time and manpower, a company could shift its Active Directory to the cloud and integrate it with every application it runs, both in the cloud and off it. But this process takes time, and for a company just beginning its move to the cloud, it is far simpler to outsource access control and rights management to an expert in the field. The overall stability of applications and processes in the cloud improves, and the company can concentrate on managing the application rather than its access control and rights management. It also shortens the time it takes to start generating revenue.
You can think of the access control layer as an overlay on all your (possibly different) applications. It makes it easier for applications to work together, since trust between the applications is inherently ensured.
About the Guest Author:
Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and in creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.
July 30th, 2012
The importance of backing up data is something everyone in IT learns very early, usually from a bad experience. It all comes down to redundancy: keeping multiple copies of everything, separated so that they cannot all fail at once. However unlikely, if it can happen, it probably will happen at some point. When adopting cloud computing for your business, making sure that your data is properly backed up and that not all of your “eggs are in one basket” can save you money and time and ensure that your data is safe. Most cloud services providers will give you options for cloud hybridization and for avoiding risk through multiple locations. However, these are often all on the same cloud, which may itself pose a risk. So how do you get around that? Using multiple cloud providers for redundancy is a better way to manage risk, since not all the responsibility for your data rests in the hands of a single vendor.
Even With Distributed or Hybrid Models, There’s a Risk When Using a Single Provider
Depending on a single cloud services provider has its advantages, but it also compounds the negative effects of a single problem. After all, if all of your data resides on a single cloud, one specific issue is enough to affect the whole negatively. Even a distributed cloud that attempts ideal regional distribution can have issues. A more fluid operating model avoids these problems by adopting multiple cloud platforms, reducing the inherent risk mentioned above. Well implemented, multiple clouds do not have to be complicated to manage, and they actually make your cloud ecosystem richer thanks to the diversity of options and greater geographic distribution. All it takes is better up-front planning of your cloud ecosystem.
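At its simplest, this fluid model amounts to keeping an ordered list of providers and falling back to the next when one fails. A minimal failover sketch (the provider names and the fetch interface are invented for illustration, not any vendor’s API):

```python
class ProviderDown(Exception):
    """Raised when a cloud provider cannot serve a request."""

def fetch_with_failover(key: str, providers: list) -> str:
    """Try each cloud provider in order until one serves the data."""
    errors = []
    for provider in providers:
        try:
            return provider(key)
        except ProviderDown as exc:
            errors.append(exc)  # record the failure, fall through to next cloud
    raise RuntimeError(f"all providers failed for {key!r}: {errors}")

# Two hypothetical clouds: the primary is down, the secondary answers.
def primary(key):
    raise ProviderDown("primary cloud outage")

def secondary(key):
    return f"value-of-{key}"

print(fetch_with_failover("customer-db", [primary, secondary]))
```

An outage in the primary provider’s cloud thus degrades to a slower lookup rather than a halted service, which is exactly the resilience the multi-cloud strategy is after.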
Managing Multiple Cloud Platforms
The critical aspect of managing multiple cloud platforms is a management platform that is strong and well implemented. Regardless of the applications being used and the various nodes involved, without strong oversight, change management, and governance, moving apps to production can be a nightmare. For this reason, every cloud needs a strong automation and management platform, and this is even more crucial when dealing with multiple clouds. A common mistake is to keep many aspects of management manual, ultimately slowing everything to a crawl. Learning to adopt broad management tools with high degrees of automation is crucial when managing multiple cloud platforms, and it is often one of the hurdles that keeps companies from adopting a more effective multi-platform cloud strategy. Having all your data on a single cloud is not ideal, but it is even worse to implement any cloud computing strategy with ineffective management tools, or to carry out all management functions manually.
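One common way to avoid per-cloud manual management is a thin common interface that each provider adapter implements, so automation scripts never touch vendor-specific calls directly. A sketch of that pattern (the class and method names are ours, not any vendor’s API):

```python
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """Common management interface every provider adapter must implement."""

    @abstractmethod
    def deploy(self, app: str) -> str:
        """Deploy an application and return a status message."""

class CloudA(CloudAdapter):
    def deploy(self, app: str) -> str:
        # A real adapter would call provider A's deployment API here.
        return f"cloud-a deployed {app}"

class CloudB(CloudAdapter):
    def deploy(self, app: str) -> str:
        # A real adapter would call provider B's deployment API here.
        return f"cloud-b deployed {app}"

def deploy_everywhere(app: str, clouds: list) -> list:
    """One automated change-management step, applied across all clouds."""
    return [cloud.deploy(app) for cloud in clouds]

print(deploy_everywhere("billing-service", [CloudA(), CloudB()]))
```

With this shape, adding a third provider means writing one new adapter; the automation and governance tooling on top does not change.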
What Your Cloud Services Vendor Needs to Provide
Not all cloud providers have services that are ideal for use with other clouds. A few important features that your cloud services vendor needs to provide include:
In conclusion, building redundancy with the use of multiple cloud platforms means that you will be more resilient to disasters and outage risks. Depending on a single vendor may mean that all of your services will be halted until that vendor solves the issue. With multiple cloud platforms, issues in a single provider’s cloud become less critical to your operations. This strategy also allows you to take advantage of each provider’s strongest features while minimizing their weaknesses. For example, you can choose the vendor with the best storage facilities to ensure that your database performance is at its best while using a cloud services provider with weaker storage facilities to cover in case of problems with the primary cloud – that’s the beauty of using multiple cloud platforms.
About the Guest Author:
Nida Rasheed is a freelance writer and the owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan, and can be found on Twitter @nidarasheed.