Blog
Cloud News & Insights

Dedicated Servers In A Cloud – Virtual Private Servers

Something new is always happening in the cloud. Until recently, small websites had no option but shared hosting, and the same was true for small businesses and enterprises. That is no longer the case: with the advent of the Virtual Private Server (VPS) in the cloud, the needs of small businesses can now be met as well.

What is VPS?

In my opinion, it was only a matter of time before someone came up with the idea of a VPS, or Virtual Private Server. With a VPS, enterprises can run their own dedicated servers in the cloud with minimal expenditure. You can think of a VPS as a form of hosting service with many advantages over traditional data centers. A VPS environment is highly secure, and built-in redundancy offers high availability. For more information on this, please visit Webkeepers.

Advantages of using a VPS

By far the most important feature of a VPS is reliability. As most users are aware, shared hosting comes with huge disadvantages. A shared server has to host a few hundred other websites, which can be disastrous if traffic to a single site suddenly peaks and brings the whole server down.

Even short of an outage, fluctuating load can slow page load times and lead to a poor user experience. Web design experts caution web portals against such poor performance: users and visitors don’t give you a second chance, and dropped website access can mean the difference between a successful transaction and a failed one. This is why shared hosting can be dangerous for enterprises. With a VPS, you don’t suffer from this limitation. You don’t have to worry about server load caused by traffic to other sites on the same host. In effect, you are the owner of a dedicated server.

Versatile Control Panel

One common grouse of any online business is the need to manage dedicated servers. This is one reason why enterprises avoid traditional dedicated servers: the need for dedicated system engineers makes the option unviable. With a VPS, although you still need a basic understanding of operating systems, the control panel is highly versatile and easy to use. In addition, you can manage ten or more domains from a single control panel.

The option to use a different operating environment

A range of virtualization technologies, combined with a broad choice of operating systems, means that enterprises and businesses can migrate from their traditional dedicated servers to a VPS without major technical issues.

Value Added Services

Many other value-added services can be incorporated on demand: anti-virus plugins, anti-spam filtering, SSL certificates to secure online financial transactions, and mobile options.

Support

We generally don’t think of support until we are in trouble. VPS providers maintain an extensive knowledge base that users can consult for troubleshooting and for understanding the processes and systems better.

Cost advantage

One of the major advantages of a VPS is the low cost of hosting. There are a number of packages, from small to large, to suit every requirement. In fact, the cost factor alone could tilt the balance in favor of a VPS. For more on this, please visit Webkeepers.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Sankarambadi Srinivasan, ‘Srini’, is a maverick writer, technopreneur, geek and online marketing enthusiast rolled into one. He began his career as a naval weapon specialist. Later, he sold his maiden venture and became head of an offshore database administration company in Mumbai. He moved on as Chief Technology Officer of one of the largest online entities, where he led the consolidation of 300 online servers and introduced several Web 2.0 initiatives. He holds a Master’s degree in Electronics and Telecommunication.


Application Development in the Cloud: Going Beyond Infrastructure as a Service

The cloud is changing the expectations both of those who use it to run businesses and of those who use it as a development platform. We have mentioned several times that software developers love the freedom a cloud environment gives them, because they can simulate any hardware/software configuration they want to test their applications on. However, this capability, part of the Infrastructure as a Service (IaaS) offering, seems to have hit a limit as far as developers are concerned.

As long as a developer is building something that can be defined within the IaaS parameters, everything goes well. But when the developer wants to use lower-level functions, he or she hits a roadblock. The high-productivity environment that IaaS provides now becomes a constraint because, in the interest of stability, it places many restrictions on what can and cannot be done.

Assembling software?

Many major cloud platforms have realized this and have begun to offer more fundamental services that work at a very low level, alongside the common services that developers are used to. As a result, major enterprise-level applications are becoming composite collections of user-developed code, low-level functions provided by cloud service providers, and functionality developed by independent software vendors. The developer now writes only the very specialized code that forms the core of the application; for the rest of the functionality, he or she is more involved in assembling applications from existing services.

Internal IT faces a challenge

Because of this approach, which goes well beyond IaaS, internal IT departments will face considerable competition from external cloud service providers. They need to find ways to work collaboratively with those providers, contributing the domain expertise that an external provider cannot. Internal IT can help by studying cloud service providers and their capabilities so that developers have ready access to the services they need.

Creation of a service catalog is extremely important. Based on the results of an in-depth study of cloud service providers, internal IT can create an approved service catalog. This ensures that developers use standardized cloud services for most of their development needs. The organization gains a degree of standardization for most parts of the application, while the developer keeps the freedom to write code for the critical parts that are not available as a service from a cloud vendor.
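To make the idea concrete, an approved service catalog can start out as nothing more than a small, structured list that internal IT publishes for developers. The sketch below is a minimal, hypothetical example in Python; the provider names, service entries and fields are invented for illustration and do not refer to any real vendor or API.

```python
# A minimal, hypothetical "approved service catalog" that internal IT might
# publish so developers reach for vetted cloud services first.
# All providers, entries and fields here are invented for illustration.

APPROVED_SERVICES = {
    "object-storage": {
        "provider": "ExampleCloud",          # hypothetical vendor name
        "service": "Blob Store",
        "approved_regions": ["ap-east", "eu-west"],
        "notes": "Use for static assets and backups.",
    },
    "relational-db": {
        "provider": "ExampleCloud",
        "service": "Managed SQL",
        "approved_regions": ["ap-east"],
        "notes": "Encryption at rest required for customer data.",
    },
}

def lookup(service_key: str) -> dict:
    """Return the approved entry for a service, or raise if it is not vetted."""
    try:
        return APPROVED_SERVICES[service_key]
    except KeyError:
        raise ValueError(
            f"'{service_key}' is not in the approved catalog; "
            "contact internal IT before using an unvetted service."
        )

if __name__ == "__main__":
    print(lookup("object-storage")["notes"])
```

In practice the catalog would live in a shared, versioned location, but even a simple lookup like this gives developers one obvious place to check before pulling in an unreviewed external service.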

Costs

All of the above brings down the cost of application development while increasing the developer’s capabilities. However, the use of cloud-based components means there will be recurring (but relatively small) costs of using the application throughout its lifetime. It is important to make this clear to both management and users, so that there is transparency all through.

A major benefit of this approach, in addition to the speed of development and the low initial costs, is the inherent capability of cloud-based services to scale up and down. This is something that in-house IT teams simply cannot achieve: your solution is built to be scalable even before the first line of code is written. To explain this better, GMO Cloud illustrates a high-availability system in its suggested configurations. Find out how scalability is at play in this type of setup.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Owing to a military background, his focus has always been on the stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.


Managing Fluctuating Traffic With Cloud Computing

This is an extension of my earlier post, where I discussed increased server load due to heavy traffic during events like Valentine’s Day. Fluctuating demand on infrastructure is not limited to such occasions. There are many situations in which one would examine cloud computing as a feasible alternative to a traditional web-server-based architecture.

The technical conundrum

Managing fluctuating traffic demand and the consequent infrastructure requirements has been a sensitive topic for several reasons. On the one hand, the tech guys in the trenches want to be certain that their infrastructure is fail-safe, so they cater to requirements by adding 20–25% to the calculated maximum load. This overcautious approach means the enterprise has to bear the cost of needless infrastructure, and management will obviously see red (if they really know what’s happening). To be fair, it’s the tech guys in the trenches who take the blame if anything goes wrong, so they are justified in keeping their yardarm clear.

Cloud Computing – an elegant solution

This is where the beauty of cloud computing comes in. The tech guys no longer need to worry about the maximum load; they can plan around the base load, optimize their in-house infrastructure, and slash costs. One can opt for a private cloud if there are other concerns, such as security or fear of compromising data. Sensitive applications can run within their own secure environment while other applications are moved to a public cloud, which is like having your cake and eating it too.

Candidates for private-public cloud architecture

The following kinds of businesses should look at the public cloud to optimize their infrastructure.

  1. Seasonal fluctuation in traffic: I discussed seasonal workload factors in my earlier post. Online shopping sites experience seasonal fluctuations in traffic, with holiday shopping around the New Year producing the maximum load. Such businesses may also experience surges in traffic during special offers and discounts. Deploying a public cloud takes care of this increased traffic: server instances can be added at short notice and withdrawn once the critical period is over (see the sketch after this list).
  2. Music and video downloads, and streaming media: These are ideal candidates for public cloud computing services. Typically, traffic is heaviest during weekends and leisure hours. Netflix, for example, has optimized its infrastructure by adding public cloud computing services to its traditional data center operations. A mix of traditional and public cloud computing is ideal for these services.
  3. E-commerce daily workload: These online businesses typically see maximum traffic during the day, so the infrastructure sits idle at night, wasting resources. Moving to a cloud-based architecture can greatly reduce the cost of running such businesses.
  4. E-Learning Enterprises: 

Here are some figures on E-Learning from the Japanese market:

The network-based e-learning market grew 1.8% over the previous year to 68.4 billion yen (Yano Economic Research Institute). Of this, the business-to-business segment contributed 56.2 billion yen (the same as the previous year), while the business-to-consumer segment reached 12.2 billion yen (an increase of 10.9% over the previous year). The network-based e-learning market using mobile phones, smartphones, and tablet PCs has also been growing rapidly. Most e-learning enterprises experience oscillating demand for their services, at a maximum during the day and a minimum at night, which makes them ideal candidates for public cloud computing.
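For the predictable daily and seasonal swings described above, the adding and withdrawing of instances can even be planned on a schedule. The sketch below is a minimal, hypothetical capacity plan in Python; the instance counts, time windows and the stubbed provisioning call are assumptions for illustration, and a real deployment would use the cloud provider’s own scaling interface.

```python
from datetime import datetime

# Hypothetical capacity plan: keep a small base fleet at night, more during
# business hours, and extra capacity during a seasonal peak. All numbers and
# windows here are illustrative assumptions.
BASE_INSTANCES = 4
DAYTIME_INSTANCES = 10
SEASONAL_PEAK_INSTANCES = 25
SEASONAL_PEAK_MONTHS = {12, 1}      # e.g. New Year holiday shopping

def desired_capacity(now: datetime) -> int:
    """Return how many instances we want running at this moment."""
    if now.month in SEASONAL_PEAK_MONTHS:
        return SEASONAL_PEAK_INSTANCES
    if 8 <= now.hour < 22:            # daytime traffic window
        return DAYTIME_INSTANCES
    return BASE_INSTANCES

def reconcile(current: int, now: datetime) -> int:
    """Add or withdraw instances to match the plan (provisioning call is a stub)."""
    target = desired_capacity(now)
    if target != current:
        print(f"Scaling from {current} to {target} instances")
        # provider_api.set_instance_count(target)  # placeholder for a real API call
    return target

if __name__ == "__main__":
    fleet = 4
    fleet = reconcile(fleet, datetime(2013, 1, 2, 14, 0))   # holiday-season daytime
    fleet = reconcile(fleet, datetime(2013, 3, 5, 2, 0))    # ordinary night
```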

Conclusion

In today’s environment of cut-throat competition and cost cutting, online businesses have to look at options to minimize their expenditure on infrastructure. However, security and other concerns deter enterprises from migrating to a cloud-based service. An ideal combination of private or traditional infrastructure with public cloud computing services can lead to better efficiency and reduced cost.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Sankarambadi Srinivasan, ‘Srini’, is a maverick writer, technopreneur, geek and online marketing enthusiast rolled into one. He began his career as a naval weapon specialist. Later, he sold his maiden venture and became head of an offshore database administration company in Mumbai. He moved on as Chief Technology Officer of one of the largest online entities, where he led the consolidation of 300 online servers and introduced several Web 2.0 initiatives. He holds a Master’s degree in Electronics and Telecommunication.


What Has Technology Got to Do with Valentine’s Day?

Valentine’s Day is a very powerful holiday. I should explain that I’m not talking about my love life here, although that would make for another interesting story. This is about my days at an internet company where, as CTO, I managed about 300 web servers. For many technologists with similar responsibilities, Valentine’s Day can be a force to be reckoned with. Let me explain.

Traditional web server based operations

Many web-based businesses experience a sudden and extraordinary spike in traffic on Valentine’s Day. Take, for example, online greeting cards. In a span of 48 hours, total traffic can hit a hundred million or more. The problem technical teams face is planning for those 48 hours, and that inevitably means scaling your infrastructure. Typically, the time required for such preparation is a week to ten days, depending on the scale. On the tail end of the event, you need to allow another week for all messages and information to be retrieved by users. In other words, say you send a Valentine’s card on the 13th. The recipient may not access the internet that day, or the next, but perhaps several days later.

You may have to hire dedicated servers for a month (a month being the typical minimum rental period for dedicated servers). The real cost of managing peak traffic for 48 hours is therefore the cost of that whole month. If you add 200 servers for Valentine’s Day at a rental of $1,000 per server, your expenditure is $200,000, including the cost of high-speed internet access for those servers. This, of course, is prohibitively high.
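As a back-of-the-envelope comparison, the short Python sketch below contrasts that month-long commitment with the much shorter cloud rental window discussed in the next section. The dedicated-server figures are taken from this post; the per-hour cloud rate is purely an illustrative assumption, not any provider’s actual price.

```python
# Back-of-the-envelope cost comparison for a 48-hour traffic peak.
# Dedicated-server figures come from the post; the cloud hourly rate is an
# assumption for illustration only.

SERVERS_NEEDED = 200
DEDICATED_MONTHLY_RATE = 1_000      # USD per server per month (from the post)
CLOUD_HOURLY_RATE = 0.50            # USD per instance-hour (illustrative assumption)
PEAK_HOURS = 3 * 24                 # provisioned for roughly three days around the peak

dedicated_cost = SERVERS_NEEDED * DEDICATED_MONTHLY_RATE
cloud_cost = SERVERS_NEEDED * CLOUD_HOURLY_RATE * PEAK_HOURS

print(f"Dedicated servers for a month: ${dedicated_cost:,.0f}")   # $200,000
print(f"Cloud instances for ~3 days:   ${cloud_cost:,.0f}")       # $7,200 at the assumed rate
```

Even if the assumed hourly rate is off by a wide margin, paying only for the peak window rather than a full month changes the economics of the event entirely.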

And that’s only one example. I’m sure you can think of many other web services which are similar in nature.

Cloud services to the rescue!

The best part about a cloud service is scaling. Many cloud service providers offer an auto-scaling feature, where capacity is automatically scaled up on demand. This feature has no doubt saved many online entities from server crashes at critical moments when traffic surges. In the example I mentioned, you can plan for an expansion in infrastructure lasting, say, three days: Valentine’s Day itself, plus the day before and the day after. This is where the power of cloud computing kicks in. In one stroke, I have reduced my cost to a fraction of what I would pay with a traditional setup.
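For readers curious what an auto-scaling rule amounts to behind the scenes, here is a minimal, generic sketch. It is not any particular provider’s API; the CPU thresholds, fleet limits and the simulated load readings are assumptions chosen only to illustrate the scale-out/scale-in idea.

```python
# Minimal, generic sketch of a reactive auto-scaling rule.
# Thresholds, limits and the simulated readings are illustrative assumptions,
# not any specific cloud provider's API.

def scale_decision(avg_cpu_percent: float, current_instances: int,
                   min_instances: int = 2, max_instances: int = 50) -> int:
    """Return the new instance count based on average CPU load."""
    if avg_cpu_percent > 75 and current_instances < max_instances:
        return current_instances + 1          # scale out under heavy load
    if avg_cpu_percent < 25 and current_instances > min_instances:
        return current_instances - 1          # scale in when load drops
    return current_instances                  # otherwise leave the fleet alone

if __name__ == "__main__":
    fleet = 2
    for load in (80, 85, 90, 40, 15, 10):     # simulated CPU readings over time
        fleet = scale_decision(load, fleet)
        print(f"load={load}% -> {fleet} instances")
```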

No cost for setup and maintenance

You just can’t ignore the cost of setting up web servers. To add 100 servers, you need two dedicated engineers. You have to worry about load balancing, setting up round-robin DNS, and assigning IPs to individual servers. Getting clean IPs, ones that have not been blacklisted, can be a huge headache. In a cloud service you don’t have to worry about any of these things, and you save on salaries for dedicated engineers, which can be a tidy sum.
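As a conceptual illustration of the round-robin idea mentioned above: each new request simply gets the next address from a fixed pool, which is essentially what round-robin DNS achieves by rotating the order of multiple A records. The sketch below is a toy model in Python, not a DNS implementation; the addresses come from the reserved documentation range and do not point at real servers.

```python
from itertools import cycle

# Toy illustration of round-robin distribution across a server pool, the same
# effect round-robin DNS produces by rotating multiple A records.
# These addresses are from the reserved documentation range (192.0.2.0/24).
SERVER_POOL = ["192.0.2.10", "192.0.2.11", "192.0.2.12"]

next_server = cycle(SERVER_POOL)

def resolve() -> str:
    """Hand the caller the next server address in rotation."""
    return next(next_server)

if __name__ == "__main__":
    for request_id in range(6):
        print(f"request {request_id} -> {resolve()}")
```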

Managing server overload

Even after taking all these precautions, you may face an isolated server overload; I have seen it happen. Auto-failover is a feature that can save you from embarrassment. A sudden disruption of service due to server downtime does affect your reputation, especially when competitors are waiting to cash in on poor quality of service.

GMO Cloud simplifies this process by implementing a high-availability solution that ensures constant uptime. Read more about the reliability of this solution on the High Availability page.

Put a ribbon on it

Valentine’s is a big day for technologists. I certainly wish you and your loved ones all the best on this special day.

And as for the data that you care about so much — may it rest in a cloud of happiness!

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

About the Guest Author:

Sankarambadi Srinivasan

Sankarambadi Srinivasan (or ‘Srini’) is a maverick writer, technopreneur, geek and online marketing enthusiast rolled into one. He began his career as a naval weapon specialist. Later he sold his maiden venture and became head of an offshore Database administration company in Mumbai. He moved on as Chief Technology Officer of one of the largest online entities, where he led consolidation of 300 online servers and introduced several Web 2.0 initiatives. He holds a Master’s degree in Electronics and Telecommunication.


Asia Pacific Companies Enter a New Stage of Growth With Cloud Computing

While cloud-based services have become an integral part of business processes for most companies around the world, Asia Pacific companies have yet to experience the full power of cloud computing technology. According to the Third Annual VMware Cloud Index 2012, cloud adoption growth reached 42% in 2012, up from 24% in 2010 and 32% in 2011.

The interesting thing about this infographic is that 75% of Asia Pacific companies believe that cloud computing technology can leverage resources and improve business performance. Even so, certain businesses have failed to adopt cloud services because of difficulties within their organizations. According to the VMware survey, three important concerns have deterred companies from moving to cloud-based services:

Data privacy

Companies that deal with huge volumes of critical data are concerned about data privacy. Cloud computing services enable people to access data and applications from anywhere, and some businesses feel that this flexibility can compromise the privacy of the client. However, the data privacy issue is effectively addressed by cloud computing companies: strict security measures are built into cloud computing procedures to ensure that only the client has access to their private data. The exact accessibility of data depends on the service model selected by the client.

Costs involved in the integration of existing systems

Another important concern for businesses is the complexity involved in integrating existing systems with the cloud. Certain critical applications might not suit the cloud environment, and companies need to assess the risk of migrating applications to the cloud. Businesses running legacy infrastructure find it difficult to identify the challenges that arise with such a migration. Moreover, the costs involved in the transition can be substantial.

However, the flexibility offered by the automatic scaling features of cloud-based services enables companies to improve agility and efficiency in business performance while reducing costs. One complication of migrating to the cloud is that integrating existing infrastructure into the cloud changes the responsibilities of IT staff, and not everyone is happy with this change in job roles. As a result, some companies have adopted a wait-and-see attitude toward integration.

Security

Data management and security is another aspect that prevents companies from migrating to the cloud. Handing control over critical data to cloud computing companies is seen as risky, and the transmission of data over the Internet is another area of concern. Moreover, data is stored in multiple locations and resources are shared between multiple clients, and this shared infrastructure is believed to create data security issues. However, businesses that have already migrated to the cloud report that security actually increases with cloud-based storage solutions. The reason is simple: a single organization cannot spend huge amounts on its own data security measures.

But cloud computing companies can afford to implement higher levels of security measures to protect large volumes of critical data. In addition, existing disaster management solutions identify the source of a data crash to eliminate subsequent data disasters. For instance, GMO Cloud has a multi-level security strategy that ensures the protection of data and operations in the cloud. The future seems to be bright for the IT industry in the Asia Pacific region as more and more companies are expected to embrace the cloud in 2013.

GMO Cloud offers highly scalable data management solutions that address these concerns. While cutting-edge technologies are used in business processes to leverage available resources, data privacy and security are highly prioritized. To experience the full power of cloud computing technology, purchase a trial package for your business and take your company to the next level.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

About the Guest Author:

Kaushik Das

Kaushik Das is an engineer, research analyst and technical writer in the areas of wireless, IT, enterprise software, next-generation hosting, storage and renewable energy. He specializes in competitive analysis, market research, industry insights, white papers and actionable web content development.

