

Forecasting Earthquakes with Analytics

3AI December 14, 2020

We know the quakes are coming. We just don't know how to tell enough people early enough to avoid the catastrophe ahead. Around the world, more than 13,000 people are killed each year by earthquakes, and almost 5 million have their lives affected by injury or loss of property. Add to that $12 billion a year in economic losses to the global economy (the average annual toll between 1980 and 2008). Understandably, scientists have long been asking whether earthquakes can be predicted more accurately.

Unfortunately, the conventional answer has often been "no". For many years earthquake prediction relied almost entirely on monitoring the frequency of quakes, or of natural warning events in the surroundings, and using this to estimate when they were likely to recur. A case in point is the Haicheng earthquake that struck eastern China on February 4, 1975. Just prior to this earthquake, temperatures were unusually high, pressure readings were abnormal, and many snakes and rodents emerged from the ground, a recognized warning sign. With this information, the State Seismological Bureau (SSB) was able to issue a prediction that saved many lives. However, the warning went out only on the day the earthquake struck, so heavy property losses could not be avoided. Had the earthquake been predicted a few days earlier, the affected cities could have been evacuated completely, and this is exactly where big data fits in.

Nature is constantly giving cues about impending events; it is up to us to tune in to them so that we can act accordingly. Since these cues are widespread, big data is the natural way to bring them together in one place, so that analysis and the resulting predictions become more accurate. Common signals that can be tracked in this way include the movement of animals and the atmospheric conditions that precede earthquakes.

Scientists today predict where major earthquakes are likely to occur based on the movement of tectonic plates and the location of fault zones. They calculate quake probabilities by looking at the history of earthquakes in a region and detecting where pressure is building along fault lines. These estimates can go wrong, because strain released along one section of a fault line can transfer strain to another section. This is also what happened in the recent quake, say French scientists, noting that the 1934 quake on the eastern segment had transferred part of the strain to an adjacent section, where the latest quake was triggered.
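To make the recurrence-based reasoning concrete, here is a minimal sketch, assuming the simplest textbook model: large quakes on a fault segment are treated as a Poisson process whose rate is estimated from the historical catalogue. The function and the numbers are illustrative assumptions, not any agency's actual hazard model.

```python
import math

def quake_probability(quakes_observed: int, years_observed: float, horizon_years: float) -> float:
    """Probability of at least one quake within the horizon, assuming a Poisson
    process whose annual rate is estimated from the historical record."""
    rate_per_year = quakes_observed / years_observed
    return 1.0 - math.exp(-rate_per_year * horizon_years)

# Example: 4 magnitude-7+ quakes on a segment in 200 years of records
print(quake_probability(4, 200.0, 50.0))  # ~0.63 probability within the next 50 years
```

A model like this captures only long-run recurrence, and it is exactly the kind of estimate that strain transfer between fault sections, as described above, can render misleading.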

Academics have often argued that accurate earthquake prediction is inherently impossible, since conditions for potential seismic disturbance exist along all tectonic fault lines and a build-up of small-scale seismic activity can trigger larger, more devastating quakes at any point. However, all this is changing. Big data analysis has opened up the game to a new breed of earthquake forecasters who combine satellite and atmospheric data with statistical analysis, and their striking results seem to be proving the naysayers wrong.

One of these innovators is Jersey-based Terra Seismic, which uses satellite data to predict major earthquakes anywhere in the world with 90% accuracy. Using satellite big data technology, it can in many cases forecast major (magnitude 6+) quakes from one to 30 days before they occur in all key seismically prone countries. The company uses open-source software written in Python and running on Apache web servers to process large volumes of satellite data, taken each day from regions where seismic activity is ongoing or appears imminent. Custom algorithms analyze the satellite images and sensor data to extrapolate risk, based on the historical record of which combinations of circumstances have previously led to dangerous quakes.
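Terra Seismic's algorithms and data feeds are proprietary, so the following is only a hedged sketch of what such a daily risk-scoring step might look like: a handful of hypothetical satellite-derived anomaly features for a region are combined into a single score using weights assumed to have been fitted on historical pre-quake data. Every feature name, weight, and threshold here is an illustrative assumption.

```python
import math
from dataclasses import dataclass

@dataclass
class RegionSnapshot:
    """One day's satellite- and sensor-derived readings for a monitored region (hypothetical features)."""
    region: str
    thermal_anomaly: float    # land-surface temperature deviation from baseline, in sigmas
    tec_anomaly: float        # ionospheric total electron content deviation, in sigmas
    recent_microquakes: int   # small events recorded in the past 7 days

# Weights assumed to have been learned from historical pre-quake snapshots (illustrative only).
WEIGHTS = {"thermal_anomaly": 0.5, "tec_anomaly": 0.3, "recent_microquakes": 0.02}
INTERCEPT = -2.0

def risk_score(s: RegionSnapshot) -> float:
    """Squash a weighted sum of today's anomalies into a 0-1 risk score."""
    z = (WEIGHTS["thermal_anomaly"] * s.thermal_anomaly
         + WEIGHTS["tec_anomaly"] * s.tec_anomaly
         + WEIGHTS["recent_microquakes"] * s.recent_microquakes
         + INTERCEPT)
    return 1.0 / (1.0 + math.exp(-z))

print(risk_score(RegionSnapshot("example-region", 3.1, 2.4, 35)))  # ~0.7 on this made-up input
```

In a real pipeline the snapshots would be derived from the daily satellite imagery the article describes, and regions whose score crosses a validated threshold would be flagged for a forecast.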

Of course, plenty of other organizations have monitored these signs, but it is big data analytics that is now providing the leap in accuracy. Monitored in isolation, any one of these metrics might be meaningless, given the huge number of factors involved in determining where a quake will hit and how severe it will be. But with the ability to monitor all potential quake areas, and to correlate any data point from one quake with any other, predictions can become far more precise, and far more accurate models of likely quake activity can be constructed based on statistical likelihood.
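One way to read "correlating data points based on statistical likelihood" is as fitting a classifier on historical snapshots labelled by whether a major quake followed, so that individually weak signals are weighed together rather than in isolation. The sketch below does this with scikit-learn on made-up toy data; it is an assumption about the general approach, not anyone's production model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [thermal_anomaly, tec_anomaly, recent_microquakes]; label 1 means a
# magnitude-6+ quake followed within 30 days, 0 means a quiet period (toy data).
X = np.array([
    [3.2, 2.1, 40], [0.4, 0.2, 3], [2.8, 1.9, 25], [0.1, 0.5, 8],
    [3.6, 2.7, 55], [0.9, 0.3, 5], [2.5, 2.2, 30], [0.2, 0.1, 2],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Probability that today's combined readings resemble past pre-quake conditions.
print(model.predict_proba([[3.0, 2.0, 35]])[0, 1])
```

The value of the big data approach is less the particular model than the breadth of inputs: the more regions and precursor signals are pooled centrally, the more reliable the fitted likelihoods become.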

So once again we see big data being put to use to make the seemingly impossible possible, and hopefully to cut down on the human misery and loss of life caused by natural disasters across the globe.
