
How Cloud Computing is Changing Research Methods

How has cloud computing changed the lives of researchers and scholars? This is an intriguing question, and I’ve come across some notable findings on the matter.

The cloud has affected research in several ways, but the most profound impact has been on how data is stored and handled. Three fundamental shifts are under way that will change the way researchers work.

1. Big data

Previously, when researchers collected data, it was only a sample, because (a) they did not have the means to collect large volumes of data and (b) they could not handle such volumes even if they had them.

Cloud computing and the advent of big data have changed all that. Suddenly the researcher is able to work with enormous quantities of data, effectively attaining freedom from sampling. Using a small quantity of data to draw conclusions about an entire population had major shortcomings and naturally led to compromises and inaccuracies.

Cloud computing and big data allow researchers to overcome this limitation. Naturally, the insights possible with this kind of data are far richer than those drawn from a small sample.
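
To make the difference concrete, here is a minimal Python sketch (with invented numbers) comparing an estimate drawn from a small sample with the exact figure computed over the full data set; the population size, sample size and metric are assumptions for illustration only.

```python
import random

random.seed(42)

# Illustrative "population": one metric recorded for a million subjects.
population = [random.gauss(100, 15) for _ in range(1_000_000)]

# Traditional approach: estimate the mean from a small sample.
sample = random.sample(population, 500)
sample_mean = sum(sample) / len(sample)

# Big-data approach: compute the exact mean over every record.
true_mean = sum(population) / len(population)

print(f"sample estimate : {sample_mean:.2f}")
print(f"full-data value : {true_mean:.2f}")
print(f"sampling error  : {abs(sample_mean - true_mean):.2f}")
```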

2. Big picture

The second major change lies in the reduced need for exact measurements. All of us familiar with database queries know that a query matches exactly the value you are looking for: SELECT * FROM EMPLOYEE WHERE AGE = 25. When data was limited, you relied on precision to get useful results.

Cloud computing and big data allow you a different luxury: you no longer need that kind of precision. Take the example of a small candy shop. Every night at closing time the owner knows exactly how much he has sold and what his profit is. This is small data; the values are known very accurately. Scale this up to the sales data of Wal-Mart. Do you think they have similar figures every night? Of course not. Even when their accounts close, do you think they are accurate down to the last dime? Surely not. Does it really matter? No, it does not. What difference does a dollar here or there make? It is the macro insights that are really important.
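
As a rough sketch of this shift from exact lookups to macro-level aggregates, the snippet below uses Python’s built-in sqlite3 module; the table layout and sales figures are invented purely for the example.

```python
import sqlite3

# In-memory toy database standing in for a retailer's sales records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("S1", "east", 120.50), ("S2", "east", 99.99),
     ("S3", "west", 240.00), ("S4", "west", 15.25)],
)

# "Small data" mindset: an exact match, every cent accounted for.
exact = conn.execute(
    "SELECT amount FROM sales WHERE store = 'S1'"
).fetchone()[0]

# "Big data" mindset: a macro-level aggregate, where a dollar here
# or there makes no difference to the insight.
by_region = conn.execute(
    "SELECT region, ROUND(SUM(amount)) FROM sales GROUP BY region"
).fetchall()

print("exact figure for one store:", exact)
print("approximate totals by region:", by_region)
```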

3. Prediction from patterns

The third change that big data has brought about in research is a shift from searching for causes to understanding patterns. Traditional research looks for the cause of an event or a phenomenon in the hope of predicting future events. With cloud computing and big data, you need not look for the cause of an event; you merely need to understand the pattern. Today, stock market analysts look at millions of tweets to gauge the mood of a population and thereby predict whether the stock market will rise or fall.
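
A bare-bones sketch of that idea, assuming we already have daily counts of upbeat keywords in tweets and the market’s move on the following day; the figures are invented and the simple correlation measure is only an illustration, not a trading model.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical daily counts of upbeat keywords seen in tweets...
positive_keyword_counts = [1200, 1500, 900, 1700, 1600, 800, 1400]
# ...and the market's percentage move on the following day.
next_day_market_move = [0.4, 0.9, -0.3, 1.1, 0.8, -0.6, 0.5]

# No attempt to explain *why* the market moves; we only check whether
# the pattern in the tweets tracks the pattern in the prices.
r = correlation(positive_keyword_counts, next_day_market_move)
print(f"correlation between tweet mood and market move: {r:.2f}")
```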

Will these three changes have a major impact on research? Will we see new patterns of research emerge? Time will tell. But one thing is certain: change is just around the corner.


About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Due to a military background, his focus has always been on the stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.


Legal Issues in Cloud Computing

Strong legal protection is a critical aspect of giving clients the confidence to move to the cloud. Naturally, no company would want to trust its critical data or processes to the cloud unless it is assured of complete safety. There are a number of legal challenges and issues involved. Some of these are:

Liability – Is the cloud vendor liable for loss of business if its services are disrupted? Is it obliged to provide military-grade protection for your data? Look at the service level agreement you are signing or accepting: it will state very clearly that you indemnify the vendor against any consequential claims.

Compliance – Who is responsible for compliance with the law – you or the vendor who is hosting your data/service? In numerous cases, it has been upheld that simply because you are using hosted services, you cannot transfer the responsibility for compliance with the law to the vendor. If you are storing medical data, then it is you and not the cloud vendor who needs to be HIPAA compliant.

Copyright – This issue flows from the previous one on compliance. If you are storing data or displaying it on your website, you are responsible for ensuring that no copyright is violated.

Data Protection – There are legal issues surrounding who is responsible for data protection. Different countries, and even different states, have different laws on the subject.

Data portability – If you decide to shift your data to another location, is the vendor required to help you? Can they deny you certain data or metadata?

The fact of the matter is that existing rules are still struggling to catch up with the rapid advance of technology. Look at the image below; it clearly illustrates the widening gap between technology and the legal framework that seeks to govern it. As long as the gap keeps widening this rapidly, legal stability is still some time away.

When it comes to privacy and data protection, a detailed study reveals that most rules governing cloud computing have their genesis in the pre-Internet era, when the issues involved were far simpler. What we are seeing now is a knee-jerk reaction from states that are struggling to understand and regulate cloud computing.

European Union regulations on cloud computing are more restrictive than those in the US. There is a reform process under way, but it will take some time to show results on the ground.

Graph presenting the gap in the progress of technology and legal development

Source: isaca.org

When it comes to entering into contracts with a cloud service vendor, remember that the agreement you ‘sign’, even if you merely select the “I Agree” check box, is enforceable. Before you select the check box, read the terms carefully to see how much of the liability the vendor is going to assume. For an example of a comprehensive agreement, visit GMO Cloud’s Terms of Use.

You may want to include a clause on auditing the services provided by the vendor. However, pause and consider whether a genuine audit can really be performed when data is stored in diverse physical locations that may be hundreds of kilometers apart. Be wary of litigation involving vendors from different countries, as costs can be prohibitive.

What about illegal content? How liable is the cloud service provider, and how liable is the person who stores the content? Under European law, a service provider that has no knowledge of the content cannot be held responsible, provided it removes or blocks the illegal content once it is brought to its notice.


About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Due to a military background, his focus has always been on the stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.


Telemedicine in Japan and Possible Cloud Computing Applications

Telemedicine literally means medicine at a distance, medicine from afar, and communications technologies have always developed alongside it. The telephone led to telemedical experiments carried out over traditional telephone lines, and the same happened with telecommunication satellites and the development of the Internet. In this article we will take a look at one of the three main branches of telemedicine, teleconsultation, and how it has developed in Japan.

1996 was a very important year for the field, because the Japanese government relaxed many of the policies that had been holding back the development of telemedicine in the country. We are now seeing the cloud being used for information and data management and for interaction with other medical professionals around the globe. One of the hopes of this field of research is to eventually develop a virtual hospital that will attend to patients directly via the web. However, there are many obstacles, including private data security, possible malpractice and vulnerability concerns, and the unreliability of the telecommunications system in life-and-death situations.

Understanding telemedicine

There are three main branches of telemedicine:

1. Remote patient care – This refers to cases where the doctor can treat and diagnose a patient from a remote location.
2. Teleconsultation – This refers to communication between medical professionals, allowing general practitioners to consult specialists, and doctors in remote areas to gain access to pathology and radiology services.
3. Medical e-learning – This refers to the use of the cloud to help doctors in remote locations receive up-to-date information in their field.

Telemedicine and Japan

The first telemedicine experiments in Japan were carried out in 1971 by the Medical Association of Wakayama. Infrastructure was set up to allow rapid communications via fax (for example, to transmit electrocardiograms). A curious detail about this period is that transmitting this kind of data through a modem was illegal due to government regulation. Although experiments continued to be carried out, it wasn’t until 1991 that telemedicine started to be used actively to treat patients and increase the efficiency of the health system. One example of this is the field of pathology.

In Japan, pathologists have always been in short supply, not because there are few of them but because they are unevenly distributed throughout the country. Because of this, one of the first successful telemedicine applications involved remote pathology consultations using high-definition images, remote control of special microscopes, and direct data links. Each year the number of new papers and research projects in remote patient care and consultation increases, and it is clear that the adoption of these technologies in Japan is now a priority.

What the cloud has to offer to traditional medicine

The main advantage of implementing cloud technology is that it gives all patients equal access to medical services, even though some regions are more remote than others. This is especially important when it comes to specialists and other services usually inaccessible outside large urban centers.

The cloud also allows quicker collaboration between medical professionals. For example, transmitting data from ambulances to the cloud can allow doctors in the emergency room to access patient information while the patient is still on the way. There are three areas where the cloud has been particularly useful in the Japanese health care system:

1. Patient care and monitoring, particularly in the case of elderly patients at home and patients with chronic diseases that require constant monitoring.
2. Transmission of MRI, CAT scans, and other images to a radiologist or other qualified professional.
3. Management of patient medical records, giving doctors a centralized source for a patient’s entire medical history, accessible from any terminal.

Telemedicine is growing at a dramatic pace in Japan, and the cloud is an integral part of its development. It is also certain that demand for home care and monitoring supplies will continue to increase. With apps capable of monitoring a patient’s vital signs, medication intake, and other diagnostic reference points, the cloud is quickly becoming an important medical tool.


About the Guest Author:

Nida Rasheed

Nida Rasheed is a freelance writer and the owner of an outsourcing company. Nida often finds herself wanting to write about the subjects that are closest to her heart. She lives in Islamabad, Pakistan, and can be found on Twitter @nidarasheed.


Big Data Touches Us All

In this series of articles, I have often spoken about ‘big data’ and how it is revolutionizing many aspects of our lives. There are ethical issues with the management of big data, which I discussed in an earlier post. But let me relate another story.

Back in 2009 much of the world was reeling under the impact of the H1N1 virus. This particular outbreak was so violent and the spread was so fast that it was already being compared to the Spanish Flu outbreak of 1918. The 1918 episode had infected nearly half a billion people and was responsible for the death of millions. The H1N1 outbreak was thought to be similar.

To be able to control H1N1, governments first had to know which regions were worst affected. Even in today’s world, with the best electronic communications, there was a problem. Typically, when people were first infected they suffered from body ache, a mild fever, a headache and related symptoms. These symptoms could occur for a hundred different reasons, including a hangover. Many would take an aspirin and hope to be better the next day. Only after a couple of days would they consult a doctor, and even then a correct diagnosis would take some time. In fact, the detection of H1N1 cases was lagging behind infection by nearly a week.

Information about a positive test for H1N1 in the US was to be sent to the Centers for Disease Control and Prevention (CDC), where a regional map was created and teams and resources were moved to the worst-affected areas. However, as the discussion so far shows, there was an inherent delay of one to two weeks in updating these maps.

The battle against H1N1 was being lost in spite of the most modern tools and medicines.

A few months prior to the outbreak, engineers had published a scientific paper discussing a scenario very similar to the H1N1 outbreak. The researchers had shown that they could predict the occurrence of the common flu in an area merely by analyzing Internet search patterns. Using a very large number of search records (typical big data) and correlating them with the actual progress of the H1N1 virus, they were able to determine that if a region was seeing a high prevalence of 45 specific keyword searches (possibly terms like body ache, fever, sneezing, headache, etc.), then there was a serious possibility that H1N1 was active in that area. This was before the disease was detected medically.
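
A toy sketch of that kind of signal is shown below, assuming we have per-region counts of flu-related searches and a normal baseline for the same period; the regions, volumes and threshold are all invented for illustration.

```python
# Hypothetical weekly counts of flu-related searches per region,
# alongside each region's normal baseline for that week of the year.
searches = {"Region A": 48_000, "Region B": 12_500, "Region C": 61_000}
baseline = {"Region A": 15_000, "Region B": 11_800, "Region C": 14_500}

# Arbitrary multiple of baseline used as the alert threshold in this sketch.
THRESHOLD = 2.5

# The pattern itself is the signal: regions are flagged well before
# laboratory confirmations would arrive.
for region, count in searches.items():
    ratio = count / baseline[region]
    status = "possible outbreak" if ratio >= THRESHOLD else "normal"
    print(f"{region}: {ratio:.1f}x baseline -> {status}")
```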

Analyzing big data on this scale is only possible with cloud technology, where a reliable infrastructure needs to be in place [1].

Using this analysis, researchers were finally able to get medical teams to sites affected by H1N1 before doctors had begun to detect the virus. And that is how the tide was turned in the fight against the virus.

Graph predicting H1N1 Virus spread

This is not just a story; it really happened. And because the public was involved and the information was not classified, we know about the situation and how it was controlled. I am sure there are thousands of other such stories as well, possibly not so well known, where big data is making a huge impact on our lives and we are simply not aware of it. I am not talking about how Wal-Mart is improving efficiencies or how Target is finding out if you are pregnant (apologies to male readers).

Such collection and analysis of big data would not have been possible without cloud computing. High quality, established cloud data centers have a big role to play in efforts like this one.


About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Due to a military background, his focus has always been on the stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.


  1. To give you an illustration of how this works, check out more information on high-availability system configuration. ↩


Managing Server Load: How Do Dedicated and Cloud Hosting Stack Up?

The advantages of cloud computing are visible all over the Internet. Scalability, stability, resource optimization and economies of scale are some of the commonly cited benefits. But I can hear some server specialists saying, “So what? My services and website run fine on my dedicated server. Why should I migrate to the cloud?” My answer is that system administrators are often restricted to their own domain and don’t really know how the capacity of their dedicated server is affecting the overall performance of their services. Here are some practical considerations to bear in mind while deciding whether or not to migrate to a cloud computing environment.

How do you estimate server traffic capacity?

Systems and hardware specialists don’t really deal with these issues unless the requirement comes from the ‘user interface and web analytics’ team. Estimating server traffic load capacity is at best an approximation. For a more detailed analysis, check out the report Measuring the capacity of a Web server under realistic loads. There are many online entities that offer such services; however, none of them reveal their methodology, and from practical experience I can say that calculations from simulated runs can at best be used as an indicator of traffic capacity. In other words, calculating traffic capacity is a ‘guesstimate.’ Remember, too, that we are talking here about a stable environment without much fluctuation in traffic; estimating capacity under volatile traffic is nearly impossible.
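
To show why such figures can only be back-of-the-envelope estimates, here is a small sketch that derives a throughput figure from a few assumed parameters; every number in it is an assumption you would have to replace with measurements from your own stack.

```python
# Assumed figures -- replace with measurements from your own server.
worker_processes = 16        # concurrent workers the server can run
avg_response_time_s = 0.25   # average time to serve one request, in seconds
peak_factor = 3.0            # how much busier the peak hour is than average

# Naive throughput estimate: each worker completes 1/avg_response_time
# requests per second, so total capacity is roughly:
requests_per_second = worker_processes / avg_response_time_s
requests_per_hour = requests_per_second * 3600

print(f"theoretical capacity : {requests_per_second:.0f} req/s")
print(f"per hour             : {requests_per_hour:,.0f} requests")
print(f"safe sustained load  : {requests_per_hour / peak_factor:,.0f} "
      "requests/hour, once peak bursts are allowed for")
```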

Will your server perform optimally at full capacity?

For the sake of simplicity, let’s assume that your server administrator provides you with a maximum traffic load figure, and that our dedicated server can serve that many clients per minute. Now let’s shift focus to the ‘user experience’ expert. What does he observe? In practice, deterioration in user experience begins at around 60% of server load. What this means is that users experience difficulty in accessing your server or the programs that reside there. The effects can take the form of slow response times, dropped connections, database connectivity problems and so on. These are irritants and cause a poor user experience. If you care about your clients and visitors, you may wish to prepare additional server capacity or add another dedicated server at this stage. However, it is nearly impossible to monitor such low-level performance issues, because final user satisfaction inherently depends on several other factors.

Now consider a situation where the server load is 70% of rated capacity. At this stage, users can definitely experience a deterioration in performance, and as your server load reaches its maximum capacity, users may grow frustrated trying to access your services. The conclusion is that you must never plan for maximum server capacity.

Planning server architecture

Now let’s imagine a traditional datacenter environment where your dedicated server is located. If you want your clients to have a smooth and faultless experience, you would plan for 50% load and add servers as required. This can be a waste of resources, because you are catering to a situation that may never arise or may occur only on very rare occasions. This is where cloud computing comes into the picture.
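
To put a number on what planning for 50% load means, here is a small sketch that sizes a dedicated-server fleet against a target utilization; the per-server capacity and peak traffic figures are placeholders, and the idle headroom it reports is exactly the waste that cloud elasticity avoids.

```python
import math

# Placeholder figures for the sketch.
capacity_per_server = 10_000    # requests/hour one server can handle
expected_peak_traffic = 42_000  # requests/hour at the busiest moment
target_utilization = 0.50       # keep each server at roughly 50% load

# Servers needed so that even at peak no box exceeds the target load.
servers_needed = math.ceil(
    expected_peak_traffic / (capacity_per_server * target_utilization)
)
headroom = servers_needed * capacity_per_server - expected_peak_traffic

print(f"servers to provision  : {servers_needed}")
print(f"idle capacity at peak : {headroom:,} requests/hour")
```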

In my next post I will discuss how a cloud computing environment can empower your server administration in managing server loads and user expectations. Meanwhile, I recommend that readers peruse more information about high-availability features.


About the Guest Author:

Sankarambadi Srinivasan, ‘Srini’, is a maverick writer, technopreneur, geek and online marketing enthusiast rolled into one. He began his career as a naval weapons specialist. Later, he sold his maiden venture and became head of an offshore database administration company in Mumbai. He moved on as Chief Technology Officer of one of the largest online entities, where he led the consolidation of 300 online servers and introduced several Web 2.0 initiatives. He holds a Master’s degree in Electronics and Telecommunication.

