Predicting Earthquakes in Japan with Big Data

A research paper estimates that Japan's big data market could reach one trillion yen (roughly US$10 billion) by 2020, helping smart cities minimize waste and improve efficiency.

A Critical Application

A typical example of processing huge volumes of data in real time is Japan's real-time earthquake warning system, whose capabilities have been greatly enhanced by cloud computing and big data. As the system matures, it will draw on more diverse data sources to improve the quality of its predictions.

How Governments are Getting Involved

With cloud computing booming, governments began to explore whether its efficiencies and benefits could be tapped. If there is one thing that governments have in plenty – apart from red tape – it is data. Mountains of it! Cloud computing has of course equipped data management with new capabilities, and this is where big data comes into its own. Properly harnessed, the information hidden in big data will give citizens:

  • Improved healthcare at lower overall cost, with predictive capabilities and cost-effective archival of all kinds of medical records – MRIs, CAT scans, X-rays, every ECG and EEG report, every test result and so on – all of it available to any medical practitioner under controlled circumstances.

  • Safer living even in the face of increased threats, plus accurate long- and short-range weather forecasting, leading to greater efficiency in agriculture and other weather-dependent occupations.

  • Much greater vehicular densities with far lower accident rates.

  • Broader insurance coverage through highly customized policies tailored to individuals, with larger claims and much lower fraud rates.

Unlike private companies, where a data-driven culture has firmly taken root and decisions and directives are implemented rapidly, decision-making in government is far more pedestrian. Governments also worry about privacy and security issues.

Technologies Required

Managing big data requires companies to handle real-time analysis and streaming of data. With massive amounts of data flowing in continuously, there has to be very large I/O capability, along with parallel processing that uses rule-based algorithms to manage the data streams. Typically, this data comes from sensors (think surveillance cameras), social media sites, video records and communication records, and it has to be handled, tagged, classified and stored as it arrives. This massive, multiprocessor-based handling is realistically possible only with cloud-based systems.
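
To make the idea of rule-based handling concrete, here is a minimal Python sketch of tagging an incoming event stream. The field names (source, magnitude), thresholds and tags are illustrative assumptions rather than details of any particular system; a production pipeline would run many such workers in parallel against a message queue.

import json

# Illustrative rules: (predicate, tag) pairs applied to every incoming record.
# The fields and thresholds below are assumptions made for this example only.
RULES = [
    (lambda e: e.get("source") == "sensor" and e.get("magnitude", 0) >= 4.0, "seismic-alert"),
    (lambda e: e.get("source") == "camera", "surveillance"),
    (lambda e: e.get("source") == "social", "social-media"),
]

def tag_event(raw: str) -> dict:
    """Parse one JSON record, attach every matching tag, and return it for storage."""
    event = json.loads(raw)
    event["tags"] = [tag for predicate, tag in RULES if predicate(event)]
    return event

if __name__ == "__main__":
    stream = [
        '{"source": "sensor", "magnitude": 5.2, "station": "TKY-01"}',
        '{"source": "social", "text": "strong shaking felt"}',
    ]
    for record in stream:
        print(tag_event(record))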

MapReduce frameworks – At this level of processing, a master server maps the data (distributes it) to many different servers, collects their processing results and 'reduces' them to a final summarized result. These servers are based on Hadoop and related technology, providing the capability to process data streams in parallel and move them to distributed servers, and they rely on the Hadoop Distributed File System (HDFS) to store the data in fault-tolerant fashion. The data can then be used for business analytics, log analysis, web search engine output and so on. Depending on the quantity of data coming in, the master server can call in additional servers or release surplus ones; this elasticity is native to cloud applications.
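
The map and reduce steps are easiest to see in miniature. The toy, single-process Python sketch below counts words across a handful of log lines to show the map, shuffle and reduce flow described above; on a real Hadoop cluster these phases would run on many distributed servers over data held in HDFS, so treat this only as an illustration of the data flow, not of the framework's API.

from collections import defaultdict

def map_phase(line: str):
    """Emit (key, 1) pairs for each word in one input line."""
    for word in line.split():
        yield word.lower(), 1

def reduce_phase(key, values):
    """Summarize all values emitted for one key."""
    return key, sum(values)

def run_job(lines):
    # 'Shuffle': group all mapped values by key, as the framework would do
    # across the cluster before handing each key to a reducer.
    grouped = defaultdict(list)
    for line in lines:
        for key, value in map_phase(line):
            grouped[key].append(value)
    return dict(reduce_phase(k, v) for k, v in grouped.items())

if __name__ == "__main__":
    logs = ["error timeout on node-3", "error disk full on node-7", "info node-3 recovered"]
    print(run_job(logs))  # e.g. {'error': 2, 'on': 2, 'node-3': 2, ...}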

In Summary

Several components work together to create an effective application cluster. A well-planned information strategy leads to an information infrastructure in which appropriate technology builds systems to handle such applications, and there have to be policies that control the management, use and protection of data in a phased execution plan. Obviously, such a complex application cannot be delivered all at once; it requires iterative development that continuously refines the application and creates new opportunities. Only the cloud can deliver this type of processing.

Be Part of Our Cloud Conversation

Our articles are written to provide you with tools and information to meet your IT and cloud solution needs. Join us on Facebook and Twitter.

 

About the Guest Author:

Sanjay Srivastava has been active in computing infrastructure and has participated in major projects on cloud computing, networking, VoIP and the creation of applications running over distributed databases. Due to a military background, his focus has always been on stability and availability of infrastructure. Sanjay was the Director of Information Technology in a major enterprise and managed the transition from legacy software to fully networked operations using private cloud infrastructure. He now writes extensively on cloud computing and networking and is about to move to his farm in Central India, where he plans to use cloud computing and modern technology to improve the lives of rural folk in India.

