Scientists build new earthquake-forecasting model

Washington, USA, 11 December 2006 – US geophysicists have developed a new earthquake-forecasting model, which they say is the most realistic to date. Until now, scientists have relied on two key sources of data for forecasting quakes, the past geological record and current GPS measurements, which sometimes give conflicting views.

As part of their study, Prof. Kaj Johnson, Assistant Professor of Geological Sciences at Indiana University, and Prof. Paul Segall of Stanford University have created a new model that weaves together everything known about how a fault moves.

Prof. Johnson said an important component of earthquake-probability assessment was determining how fast a fault moved. One technique involved GPS, which allowed seismologists to measure the movement of various points on the Earth's surface and then use those data to extrapolate underground fault movement. Another way to determine fault slip rates was to dig a trench across the fault and look for the signatures of past earthquakes, a method called paleoseismology, Prof. Johnson added.
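As a rough sketch of how slip rates are inferred from GPS, the following toy example uses the classic "deep dislocation" model for a strike-slip fault (not the authors' new model): surface velocity at distance x from the fault is v(x) = (s/π)·arctan(x/D), where s is the slip rate and D the locking depth. Given surface velocities, one can invert for s and D; all numbers here are illustrative.

```python
# Toy inversion of surface GPS velocities for fault slip rate and
# locking depth, using the standard arctangent (deep dislocation)
# model for an infinitely long strike-slip fault. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def surface_velocity(x, slip_rate, locking_depth):
    """Interseismic surface velocity (mm/yr) at distance x (km) from the fault."""
    return (slip_rate / np.pi) * np.arctan(x / locking_depth)

# Synthetic "GPS" observations: true slip rate 30 mm/yr, locking depth 15 km.
x_obs = np.linspace(-100.0, 100.0, 41)
v_obs = surface_velocity(x_obs, 30.0, 15.0)

# Invert the surface velocities for the two fault parameters.
popt, _ = curve_fit(surface_velocity, x_obs, v_obs, p0=[20.0, 10.0])
slip_rate_est, locking_depth_est = popt
print(f"slip rate ~ {slip_rate_est:.1f} mm/yr, locking depth ~ {locking_depth_est:.1f} km")
```

With noiseless synthetic data the inversion recovers the input parameters; real GPS data carry noise and require more careful modelling.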

“People say, let's compare rates of fault movement from GPS to rates of fault movement from geologic studies. But it's as if you're measuring different parts of the same thing with different tools. The discrepancy can be quite big,” Prof. Segall said. He said the idea for the model came when he was asked to speak at a conference on the “rate debate,” which is how geophysicists refer to the GPS-paleoseismology discrepancy.

Their model, the researchers say, will help close that gap and provide more credible evidence of impending earthquakes. “This is the most realistic model to date. This is something people have been asking for for years. It's the next step,” said Prof. Johnson, who began work on the modelling project several years ago as a Stanford graduate student.

“That's when I realized that the standard model doesn't take into account that fault-slippage rates vary over time. This time dependence is important, because GPS doesn't measure fault slippage directly. Rather, it measures how quickly points on the surface of the Earth are moving. Then scientists try to fit these data into mathematical models to estimate the rate of slip,” Prof. Segall said.

“Because of the time-dependent rate, your estimate depends on where you are in the earthquake cycle. So if you use a model that doesn't take that into account, you will get a slip rate that's different,” he said. With the new model, the two have already confirmed that the slip rates from GPS and from the geological record for the San Francisco Bay Area are relatively consistent.
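The bias Prof. Segall describes can be illustrated with a hypothetical toy (again, not the published model): if near-fault surface velocities are faster early in the earthquake cycle and slower late in it, here mimicked by letting the apparent locking depth vary, then fitting a time-independent model yields different slip-rate estimates depending on when the GPS data were collected.

```python
# Toy demonstration that fitting a steady (time-independent) fault
# model to cycle-dependent surface velocities biases the slip-rate
# estimate. Depths and rates are illustrative assumptions.
import numpy as np

def v_profile(x, slip_rate, depth):
    """Arctangent surface-velocity profile for a strike-slip fault."""
    return (slip_rate / np.pi) * np.arctan(x / depth)

x = np.linspace(-100.0, 100.0, 81)
g = v_profile(x, 1.0, 15.0)  # steady-model basis, assumed 15 km locking depth

def fit_slip_rate(v_obs):
    # Least-squares slip-rate estimate under the fixed steady model.
    return float(np.dot(v_obs, g) / np.dot(g, g))

true_rate = 30.0
v_early = v_profile(x, true_rate, 8.0)   # early cycle: faster near-fault motion
v_late = v_profile(x, true_rate, 25.0)   # late cycle: slower near-fault motion

print(fit_slip_rate(v_early), fit_slip_rate(v_late))
```

The two estimates bracket the true 30 mm/yr rate: the early-cycle fit overshoots and the late-cycle fit undershoots, which is the kind of cycle-dependent discrepancy a time-dependent model is meant to remove.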

“Along the San Andreas system, the numbers tend to come out in reasonable agreement,” Prof. Segall said. The two now hope that their updated model will give a more accurate picture of slip rates and reconcile the two sources of fault data.