Chapter 7
Forecasting Seismic Risk as an Earthquake Sequence Happens

J. Douglas Zechar¹, Marcus Herrmann¹, Thomas van Stiphout² and Stefan Wiemer¹

¹Swiss Seismological Service, ETH Zurich, Zurich, Switzerland; ²Independent Researcher, Zurich, Switzerland
ABSTRACT
We describe a model-based approach to forecast seismic risk during an earthquake sequence, emphasizing building damage and human injuries. This approach, which could also be used to forecast financial losses, incorporates several models of varying complexity, but it is intuitive and its output can be succinctly described with simple mathematics. We describe two applications of this approach: one in the wake of the destructive 2009 L'Aquila, Italy earthquake, and another with a hypothetical Mw6.6 earthquake in Basel, Switzerland. We discuss the challenges of short-term seismic risk forecasting and suggest potential improvements.
7.1. INTRODUCTION
When we think of seismic risk, we have in mind a place, a period, and consequences: probabilistic seismic risk quantifies the probability of those consequences at that place within that period. Because earthquakes do not occur
constantly, seismic hazard and, as a consequence, risk vary from moment to
moment, especially when earthquake activity increases. Such an increase
might follow a large earthquake, when many smaller earthquakes occur
nearby. Sometimes, small or moderate events precede a large earthquake and
seem, in retrospect, to have signaled the impending large event. The preceding
and following earthquakes are colloquially referred to as foreshocks and aftershocks, respectively, of the mainshock. Sometimes, several events of about
the same size happen and there is no mainshock; this is called a swarm. Swarm
sequences and foreshock-mainshock-aftershock sequences increase seismic
risk, and in this chapter, we consider how we can forecast seismic risk as an
earthquake sequence happens.
Forecasting seismic risk is subtly different from assessing seismic risk as
discussed in Chapter 6: the goal of that type of assessment is to estimate the
impact of earthquakes that have occurred, while the goal of forecasting is to
predict both the distribution of future earthquakes and the corresponding
effects. In this chapter, we emphasize time-varying risk forecasts that respond
to the space-time clustering of seismicity. Our motivations for doing this are so
intuitive that you likely already know them: to advise emergency managers
and respond to public concerns, thereby reducing losses. Conceptually, our
approach is also intuitive: we combine an earthquake occurrence model, a
hazard model, a damage model, and a loss model to make corresponding
probabilistic predictive statements about future seismicity, shaking, damage,
and losses. In combining these elements, we synthesize concepts from the
chapters on earthquake prediction (Wu, Chapter 16; Sobolev, Chapter 17;
Kossobokov, Chapter 18), forecast testing (Schorlemmer & Gerstenberger,
Chapter 15), seismic risk (Wyss, Chapter 6), and losses (Michel, Chapter 21).
Like those chapters, we have in mind tectonic earthquakes, not induced
seismicity. And like Wyss (Chapter 6), we use “loss” to refer to either loss of
human life or financial loss incurred from damage to the built environment.
In the following sections, we introduce notation and suggest a mathematical
formulation of the problem; illustrate our approach using data from the L’Aquila
earthquake sequence of 2009; demonstrate an extension using a spatially-varying
seismicity model for Switzerland’s SEISMO-12 preparatory exercise; and
discuss shortcomings of our implementation and directions for future research.
7.2. SEISMIC RISK
What do we need to assess seismic risk and predict the resulting losses? We are
interested in building damage, so we need to know about the earthquakes that
may have caused damage and the buildings that may have been damaged.
Specifically, we need to know where the earthquakes occurred and how big
they were, so we can estimate the shaking they caused. And we need to know
something about the strength of the potentially affected buildings, so we can
estimate the damage caused by such shaking. We are also interested in casualties, so we need to know about the people that were potentially affected by
building damage: how many people were affected, and where were they?
Consider the first-person perspective of risk assessment: imagine that a set of
earthquakes, E, is going to happen and you want to know the probability that you
are injured; in other words, that you reach a casualty degree C. We can express this risk, Pr(C|E), as a product of several (mostly conditional) probabilities:
$$\Pr(C \mid E) \;=\; \sum_{e \in E} \sum_{n=0}^{5} \sum_{k=1}^{12} \sum_{j=A}^{F} \Pr\!\left(C \mid D_n, V_j\right) \Pr\!\left(D_n \mid I_k, V_j\right) \Pr\!\left(I_k \mid e\right) \Pr\!\left(V_j\right) \qquad (1)$$
Let’s step through the product terms left to right. The first denotes the
probability that casualty degree C is reached conditional on you being in a
building with vulnerability Vj that reached damage state Dn. The next term is
the probability that the damage state was reached, conditional on the vulnerability and a given intensity Ik. This is analogous to an earthquake engineering
fragility function. The third term is the probability that the earthquake e
produced the given intensity, something we might call the conditional seismic
hazard. And the final term describes the distribution of vulnerability. Note that
we treat building vulnerability, intensity, damage states, and casualty degrees
as categorical data. In fact, when we write risk in this way, we have particular
discretizations in mind: European Macroseismic Scale (EMS-98, Grünthal,
1998) for vulnerability, intensity, and damage grade, and HAZUS (FEMA,
2010) for casualty degree. But this approach to assessing risk can be generalized to other discretizations, or to continuous data. For example, I could be
given in units of Modified Mercalli Intensity, peak ground velocity, peak
ground acceleration, and so on.
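To make the bookkeeping in Equation (1) concrete, the following sketch evaluates the nested sums directly. The probability tables are hypothetical inputs supplied by the user; the names and the plain dict/list layout are illustrative and do not belong to any particular software.

```python
# A minimal sketch of Equation (1): nested sums over earthquakes e, damage
# grades D0..D5, intensities I..XII, and EMS-98 vulnerability classes A..F.
# All probability tables are hypothetical placeholders; in practice they come
# from casualty, fragility, hazard, and building stock models.

DAMAGE_GRADES = range(6)        # D0..D5
INTENSITIES = range(1, 13)      # I..XII
VULN_CLASSES = "ABCDEF"         # EMS-98 vulnerability classes

def casualty_probability(earthquakes,
                         p_casualty,        # p_casualty[n][j]    = Pr(C | Dn, Vj)
                         p_damage,          # p_damage[k][n][j]   = Pr(Dn | Ik, Vj)
                         p_intensity,       # p_intensity[e][k]   = Pr(Ik | e)
                         p_vulnerability):  # p_vulnerability[j]  = Pr(Vj)
    """Evaluate Pr(C | E) for a set of earthquakes E, following Equation (1)."""
    total = 0.0
    for e in earthquakes:
        for n in DAMAGE_GRADES:
            for k in INTENSITIES:
                for j in VULN_CLASSES:
                    total += (p_casualty[n][j]
                              * p_damage[k][n][j]
                              * p_intensity[e][k]
                              * p_vulnerability[j])
    return total
```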
When we forecast seismic risk, we are not interested in the effect of
earthquakes that have already happened, but rather of those that may happen in
some specific future period. This puts us in the land of models; we need the
following:
1. An earthquake occurrence model to forecast when and where future
earthquakes will occur, and how big they will be.
2. A ground motion model, including site effects, to forecast the resulting
shaking.
3. A building stock model to represent the built environment.
4. A damage model to forecast the damage to the building stock caused by the
shaking.
5. A loss model to forecast human casualties (which requires the loss model
to include a model of the space-time distribution of people) and/or
financial losses (which requires a model of costs for potential loss
events).
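Schematically, these five components form a simple pipeline in which each model's output feeds the next. The sketch below only illustrates that structure; every function name and signature is hypothetical, the callables stand in for full models supplied by the user, and this is not the implementation used in the studies described later.

```python
# A schematic pipeline of the five model components; `models` is a dict of
# user-supplied callables standing in for full models.

def forecast_risk(catalog, period, building_stock, population, models):
    """Chain occurrence -> ground motion -> damage -> loss for one forecast window."""
    events  = models["occurrence"](catalog, period)      # 1. when/where/how big
    shaking = models["ground_motion"](events)             # 2. intensities incl. site effects
    damage  = models["damage"](shaking, building_stock)   # 3.-4. damage to the building stock
    return models["loss"](damage, population)             # 5. casualties and/or costs
```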
Frequently, the first two models are combined to form a seismic hazard
model, where hazard can be expressed as the probability of exceeding some
ground motion. Then we can write the time-varying, first-person-perspective
risk as a function of hazard:
$$\Pr(C, t) \;=\; \sum_{n=0}^{5} \sum_{k=1}^{12} \sum_{j=A}^{F} \Pr\!\left(C \mid D_n, V_j\right) \Pr\!\left(D_n \mid I_k, V_j\right) \Pr\!\left(I_k, t\right) \Pr\!\left(V_j\right) \qquad (2)$$
Note the explicit time dependence in the risk forecast and the hazard model
Pr(I, t): in subtle contrast to Equation 1, Equation 2 describes the probability
that you will experience a casualty degree C in the future period t. In principle,
the vulnerability distribution could also vary in time, allowing one to account
for progressive damage, but this is not yet done in practice. When an earthquake sequence is ongoing, we can update our risk forecast by updating our
earthquake occurrence model (and thus our seismic hazard model). This is
analogous to day-by-day weather forecasting: based on new observations, the
weekend weather forecast may change throughout the week.
In the following two sections, we demonstrate this approach to forecasting
short-term risk in Italy and Switzerland and give concrete examples of the
terms in the seismic risk forecast equation.
7.3. FORECASTING SEISMIC RISK DURING
THE L’AQUILA SEQUENCE
On April 6, 2009, an Mw6.3 earthquake devastated L’Aquila, Italy, causing
widespread damage and killing 299 residents. In the weeks leading up to the
deadly earthquake, many smaller events occurred near L’Aquila; with the
benefit of hindsight, we can say that those events were foreshocks. Many
authors have addressed the controversy surrounding the L’Aquila sequence,
and we will not rehash the details (see for example Jordan et al., 2011;
Marzocchi, 2012). For this chapter, we are only interested in the L’Aquila
sequence as a set of scientific circumstances, and we will ignore the legal,
philosophical, and political ramifications, despite the fact that they may have a
more important legacy than the earthquakes themselves.
In the aftermath of the L’Aquila sequence, van Stiphout et al. (2010,
hereafter vS) recognized that seismologists must be able to effectively
communicate time-varying seismic risk to emergency managers. They pioneered the model-based approach described in the previous section, and they
conducted an experiment to forecast seismic risk as the L’Aquila sequence
progressed. Note that they did this after the fact, not in real time, so the
experiment was a proof of concept that used observed data.
For the earthquake occurrence model, vS used the RJ (Reasenberg and
Jones, 1994) model, which is founded on two first-order observations of
earthquakes:
1. The distribution of magnitudes is well approximated by the Gutenberg-Richter relation (i.e., an exponential function, Gutenberg and Richter, 1944).
2. The rate of events occurring nearby and soon after a large earthquake is well approximated by the Omori-Utsu relation (i.e., a power-law decay function, Utsu, 1961).
The RJ model aims to reproduce the temporal clustering and size distribution of earthquakes and is a stochastic triggering model: it estimates the rate
and sizes of future earthquakes triggered by observed events. For the application to the L’Aquila sequence, vS used RJ parameter values that Lolli
and Gasperini (2003) estimated by fitting the model to earthquake sequences
in Italy.
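For orientation, a minimal sketch of the RJ rate and the derived exceedance probability is given below, assuming a nonstationary Poisson process with rate lambda(t, M) = 10^(a + b(Mm − M)) (t + c)^(−p). The default parameter values are illustrative placeholders, not the Italy-specific values of Lolli and Gasperini (2003).

```python
# A sketch of the Reasenberg-Jones aftershock rate and the probability of one
# or more M >= m events in a forecast window. Parameter values are illustrative
# placeholders only.

import math

def rj_rate(t_days, m, m_main, a=-1.67, b=1.0, c=0.05, p=1.08):
    """Rate (events/day) of aftershocks with magnitude >= m at time t after a
    mainshock of magnitude m_main."""
    return 10.0 ** (a + b * (m_main - m)) * (t_days + c) ** (-p)

def rj_probability(m, m_main, t_start, t_end, a=-1.67, b=1.0, c=0.05, p=1.08):
    """Probability of at least one event with magnitude >= m in [t_start, t_end] days."""
    k = 10.0 ** (a + b * (m_main - m))
    if abs(p - 1.0) < 1e-9:
        expected = k * (math.log(t_end + c) - math.log(t_start + c))
    else:
        expected = k * ((t_end + c) ** (1 - p) - (t_start + c) ** (1 - p)) / (1 - p)
    return 1.0 - math.exp(-expected)

# Example: probability of an M >= 5 event in the next 24 h, one day after an Mw6.3 shock.
print(rj_probability(m=5.0, m_main=6.3, t_start=1.0, t_end=2.0))
```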
For the ground motion model, vS combined the ground motion prediction
equation of Akkar and Bommer (2007) with the ShakeMap method using
Italy-specific parameter values (Michelini et al., 2008), and they included a
site amplification factor of 1.25 intensity units to account for local soil conditions in L’Aquila. For the building stock model, damage model, and the loss
model, vS used the module developed for the QLARM software (Trendafiloski
et al., 2011). To estimate the distribution of damage, QLARM employs the
European Macroseismic Method (EMM) (Giovinazzi, 2005), which maps intensity and vulnerability class to a probability mass function of damage.
In other words, this permits the possibility that two buildings with the same
vulnerability are subject to the same shaking intensity yet experience different
damage. For the building stock model, QLARM uses a generic model in regions where detailed models are not available; in the vS study, QLARM
assigned 30 percent of the L’Aquila buildings to EMS-98 vulnerability class A,
30 percent to B, 30 percent to C, 10 percent to D, and none to E and F (class A
is the most vulnerable to shaking and F, the least vulnerable). The QLARM
loss model maps the estimated building damage to HAZUS casualty degree
using an empirical approach calibrated using previous events (Trendafiloski
et al., 2011). For the loss model component that describes the distribution of
people, QLARM assumes that the distribution of inhabitants matches the
building stock distribution: 30 percent of the inhabitants were in class A
buildings, 30 percent in B, 30 percent in C, and 10 percent in D.
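The structure of this chain, from intensity through damage to casualties weighted by the building stock, can be sketched as follows. The lookup tables are placeholders standing in for the calibrated EMM and QLARM relations, which are not reproduced here.

```python
# A structural sketch of the QLARM-style chain described above: an EMM-like
# table maps (intensity, vulnerability class) to a probability mass function
# over damage grades D0..D5, and an empirical table maps damage grade to the
# probability of a given casualty degree. The class fractions below follow the
# generic L'Aquila building stock model; the tables themselves are user-supplied.

BUILDING_CLASS_FRACTIONS = {"A": 0.3, "B": 0.3, "C": 0.3, "D": 0.1}

def expected_casualty_probability(intensity,
                                  damage_pmf_table,   # (intensity, class) -> [Pr(D0)..Pr(D5)]
                                  casualty_table):    # damage grade -> Pr(casualty degree C)
    """Pr(C) for a person placed at random according to the building stock."""
    total = 0.0
    for vuln_class, fraction in BUILDING_CLASS_FRACTIONS.items():
        damage_pmf = damage_pmf_table[(intensity, vuln_class)]
        for grade, p_damage in enumerate(damage_pmf):
            total += fraction * p_damage * casualty_table[grade]
    return total
```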
For their study of the L’Aquila sequence, vS analyzed regional earthquakes with magnitude greater than 2.5 between November 1, 2008 and May
1, 2009 (Figure 7.1). They made 24 h forecasts that were updated
every 3 h or whenever a new earthquake happened, whichever came first.
(Three hours without any earthquake is an observation that can be used to
update the earthquake occurrence model, and therefore, the seismic risk
forecast.) Because this forecast approach is probabilistic, we can look at the
risk in several different ways: for example, imagine that you are an emergency manager and you are concerned not about your own risk but the risk of
all people in the region; call this a third-person risk perspective. Then you
would be interested in a loss curve that shows the probability of exceeding
some number of fatalities (Figure 7.2). By integrating the curve and dividing
by the population, we can recover the first-person risk forecast emphasized in
the previous section. We can also visualize the temporal evolution of the
seismic risk forecast as the sequence progresses, as shown in Figure 7.3.
Figure 7.3 illustrates the effect of the ongoing sequence on the risk
forecast: whenever a new event (a possible foreshock) occurs, the risk
suddenly increases, and as time passes and no events occur, risk gradually
decreases.
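The relationship between the third-person loss curve and the first-person risk mentioned above can be sketched as follows: summing the exceedance probabilities gives the expected number of fatalities, and dividing by the population yields the probability that a randomly chosen inhabitant is killed, assuming homogeneous exposure. The example curve is invented and serves only to illustrate the calculation.

```python
# A minimal sketch: recover first-person risk from a third-person loss curve.
# exceedance_curve[n] = P(X >= n+1) fatalities in the forecast window;
# E[X] = sum over n >= 1 of P(X >= n) for a nonnegative integer-valued X.

def first_person_risk(exceedance_curve, population):
    """Return the per-person fatality probability implied by a loss curve."""
    expected_fatalities = sum(exceedance_curve)
    return expected_fatalities / population

# Hypothetical 24 h loss curve for thresholds of 1, 2, 3, ... fatalities:
curve = [2e-2, 1.5e-2, 1.2e-2, 1e-2, 8e-3]
print(first_person_risk(curve, population=70000))
```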
FIGURE 7.1 Map of the region affected by the April 6, 2009 L'Aquila Mw6.3 earthquake (red star), including the predicted ground motion, the foreshocks between 1 November and 6 April (yellow), aftershocks between 6 April and 1 May (gray), and the settlements (black squares). Inset shows the national seismic hazard map (Meletti et al., 2008), with the white box indicating the region in the main panel.
FIGURE 7.2 Probabilistic loss curve for EMS-98 building class type A in L'Aquila on April 6, 2009 at 2 a.m. local time, for the following 24 h.
FIGURE 7.3 Probability of exceeding 100 fatalities in the next 24 h, updated after each earthquake or every 3 h (black). The time of the mainshock is indicated by a red star, and the background probability of exceeding 100 fatalities within the next 24 h based on Meletti et al. (2008) is shown with the dashed blue line. The inset shows details of the risk forecast immediately before and after the occurrence of the mainshock. Right axis: earthquake magnitudes as a function of time. Note: the probability is based on the seismicity within a box 25 by 25 km centered at L'Aquila.
7.4. FORECASTING SEISMIC RISK FOR THE SEISMO-12
SCENARIO SEQUENCE
In May 2012, the Swiss Federal Office for Defense, Civil Protection, and Sport
conducted an earthquake exercise involving a hypothetical Mw6.6 event in
Basel, Switzerland. This exercise, called SEISMO-12, was designed to explore
how authorities might react in case of a “repeat” of the 1356 Basel earthquake
(Meghraoui et al., 2001; Gisler et al., 2008; Fäh et al., 2009). Rather than
isolating the scenario mainshock, SEISMO-12 participants were asked to
respond to an entire sequence. The SEISMO-12 sequence combined events
from an automated simulation of the Epidemic Type Aftershock Sequence
(Ogata, 1988) model (specifically, the implementation of Hainzl et al., 2008)
and some manually inserted earthquakes: five foreshocks (including an Mw5.1
20 min prior to the Mw6.6 event) and several large aftershocks (M3.4–M5.9).
Figures 7.4 and 7.5 show the temporal and spatial distribution, respectively, of
the SEISMO-12 sequence.
Herrmann (2013, hereafter H13) extended the approach of vS to forecast
risk for the SEISMO-12 sequence. Rather than using the RJ earthquake
occurrence model, H13 used the Short-Term Earthquake Probability model
(STEP) (Gerstenberger et al., 2005), which is an extension of the RJ model. In
particular, STEP extends the RJ model by incorporating spatial information in
so-called aftershock zones. With these aftershock zones, STEP aims to include
an additional first-order observation: earthquakes cluster in space. As more
events are observed, STEP gradually increases the resolution of the aftershock
zone. One of the ways that STEP attempts to reproduce spatial clustering is to
estimate the geometry of the fault that ruptured in large earthquakes.
FIGURE 7.4 Time-magnitude distribution of the SEISMO-12 sequence for the 10 day duration of the exercise, with the Mw6.6 event at t = 0 days. Events from the ETAS simulation are shown in gray, the Mw6.6 event is red, manually inserted foreshocks are orange, and manually inserted aftershocks are green and blue. The blue circles denote larger aftershocks occurring after t = 3 days.
FIGURE 7.5 Map view of the SEISMO-12 sequence. The colors of the symbols are the same as in Figure 7.4.
In the original STEP implementation, a fault was represented as two line
segments with an endpoint at the epicenter of the large event, and the length
and direction of the two line segments were determined by the spatial extent of
aftershocks. But H13 noted that this fault identification algorithm was sensitive to outliers and yielded counterintuitive results for the SEISMO-12
sequence, and he therefore implemented an improvement that emphasizes
the regions with the greatest aftershock density. Figure 7.6 shows a comparison
of the fault approximation methods for the SEISMO-12 sequence.
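A rough sketch of a density-based fault approximation in this spirit is shown below: bin the aftershock epicenters into a two-dimensional histogram, keep the densest bins, and fit a line through their centers. The bin size, density threshold, and principal-component line fit are assumptions for illustration and are not necessarily the exact H13 algorithm.

```python
# A sketch of a density-based fault approximation: 2D histogram of aftershock
# epicenters, keep the densest bins, and fit a line through their centers
# using the first principal component. All tuning values are illustrative.

import numpy as np

def approximate_fault(lons, lats, nbins=30, density_quantile=0.9):
    counts, lon_edges, lat_edges = np.histogram2d(lons, lats, bins=nbins)
    threshold = np.quantile(counts[counts > 0], density_quantile)
    dense = np.argwhere(counts >= threshold)                     # indices of dense bins
    centers = np.column_stack((lon_edges[dense[:, 0]], lat_edges[dense[:, 1]]))
    centers = centers + 0.5 * np.array([np.diff(lon_edges)[0], np.diff(lat_edges)[0]])
    mean = centers.mean(axis=0)
    _, _, vt = np.linalg.svd(centers - mean)                     # first PC = fault strike
    direction = vt[0]
    t = (centers - mean) @ direction                             # projections along strike
    return mean + np.outer([t.min(), t.max()], direction)        # two inferred fault endpoints
```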
As shown in Figure 7.6(a) and (b), STEP generates spatially-varying earthquake occurrence forecasts; for the ground motion model, H13 used such forecasts as input to the intensity prediction equation of Allen et al. (2012) and
summed the results with regionally appropriate site amplifications from P. Kästli
(written communication). Rather than using the generic building stock model in
QLARM, H13 benefited from being able to use a superior building stock model
for the Basel region. Because of a geothermal project, a detailed risk assessment
of Basel had been conducted and the corresponding report was published,
including building inventory for 19 districts of Basel and 60 surrounding settlements (Baisch et al., 2009); see Figure 7.7 for a map-view representation of
these data. The report also contained population estimates for each settlement
and the district of Basel. Like the building stock model, the population dataset
offers a higher resolution than that of the default data in QLARM (which were used by vS).

FIGURE 7.6 (a) and (b) 24 h rate forecasts using the different methods of fault identification: (a) the original STEP implementation; (b) the H13 approximation. Both forecasts were generated 18 h after the mainshock, with approximately 530 aftershocks having already occurred. Using the same data, (c) and (d) demonstrate how the H13 fault approximation works: (c) is a 2D spatial histogram of the aftershocks, (d) is the same after filtering, and the red line denotes the inferred fault.
Following vS, H13 also used QLARM for the damage model and loss model. Also following vS, for the loss model H13 assumed that the distribution of inhabitants matched the local building stock distribution.
To estimate seismic risk during the SEISMO-12 sequence, H13 analyzed
those earthquakes shown in Figures 7.4 and 7.5 and generated 24 h earthquake occurrence forecasts, seismic hazard forecasts, and seismic risk
forecasts. Figure 7.8 is analogous to Figure 7.3: it shows, for the SEISMO-12 sequence, the temporal evolution of hazard and risk. Figures 7.9 and 7.10 emphasize the spatial information available in these risk forecasts: they are risk maps showing the space-time variation of risk that results from seismicity that varies in space and time, as well as vulnerability and population that vary spatially. Figure 7.10(b) shows a third-person risk perspective that
is normalized by population, a view that lets you identify relative risk across
the region.
FIGURE 7.7 Map view of the EMS vulnerability class distribution for each settlement. The area of each pie chart is proportional to the number of buildings in the corresponding settlement.
FIGURE 7.8 Time-varying probability of exceeding loss (orange and blue) and hazard thresholds (gray to black) for a few hours before and the first 6 days after the Basel scenario earthquake. The loss threshold is set to 1 per mill (‰) fatalities; the hazard is illustrated by distinct intensity levels. Thicker lines represent the mean probability of zone-related settlements (see legend), whereas the dashed lines show the maximum and minimum loss probabilities as observed in the two zones separately. The loss and hazard forecasts were issued each hour and refer to the following 24 h. To track the seismicity during this time, stems at the bottom represent earthquakes above M3 with the same color scheme as in Figure 7.4.
FIGURE 7.9 Snapshots of the regional seismic risk forecast for the following 24 h, issued at the time (relative to the Mw6.6 event) given in the top-left corner of each panel: (a) −11 min, (b) +1 min, (c) +24 hours, (d) +7 days. The color of each settlement denotes the probability of having one or more fatalities. The average and maximum probabilities for each map are reported in the bottom-right corner of each snapshot. The earthquake symbols were assigned the same color scheme as in Figure 7.4 (except the mainshock, which is white in these maps).
FIGURE 7.10 Short-term risk forecast for the next 24 h, 1 min after the Mw6.6 event. Probabilities of exceeding two different fatality thresholds are presented. Note that the color scale has changed compared to Figure 7.9. (a) Probability of more than 10 fatalities in each settlement. (b) Normalized by settlement population: probability that more than 1 per mill (‰) of the population in each settlement dies.
7.5. DISCUSSION
As Wyss (Chapter 6) highlighted, the practice of routinely estimating seismic
losses in the wake of a large earthquake is relatively new. The practice of
forecasting seismic risk as an earthquake sequence happens is even newer (vS were pioneers), and forecasting risk is inherently a more difficult problem.
Wyss (Chapter 6, Section 8) mentioned some of the unsolved problems for
estimating losses in real time; they also apply to forecasting short-term risk,
especially those related to data quality and availability.
Nearly all short-term earthquake occurrence models use a catalog of recent
events as input. Because the models we have in mind are founded on the belief
that small earthquakes can trigger large earthquakes, including small earthquakes in the catalog is important. Ironically, detecting and analyzing
small earthquakes is hardest exactly when they interest us most: when
seismic activity is higher than normal. This problem of time-varying
catalog completeness is well known and affects all of statistical seismology
(Mignan and Woessner, 2012). Some researchers have proposed partial solutions to this problem (e.g., Peng et al., 2006), but the solutions have not been
implemented in routine seismic network processing. Of course, errors in catalog data (for example, incorrect magnitude or location estimates, or records of earthquakes that did not happen) will also result in inaccurate seismic risk
forecasts. Producing very accurate and very complete catalogs in real time is
difficult primarily because the producers lack resources, computational or
otherwise. This is a pragmatic problem that is probably best addressed by
stressing the importance of high-quality catalog data to funding agencies.
Even if data quality does not improve dramatically, there are steps we
should take to improve seismic risk forecasts. For example, in this chapter,
we have neglected the topic of uncertainty, and this should be addressed.
Uncertainty in earthquake source parameters could be propagated from the
catalog into the occurrence model and on to the loss model; in other words,
from beginning to end. Beyond data uncertainty, we also cannot be sure that
we are using the best model for each component of the seismic risk forecast.
Although the examples in the previous two sections emphasized the RJ and STEP earthquake occurrence models, we could use more sophisticated models and/or
models that have been developed for a particular region or tectonic setting. Or
we could directly account for our uncertainty in selecting an earthquake
occurrence model by using an ensemble, aiming to offset the weaknesses of
any individual model with the strengths of others (Marzocchi et al., 2012). The
same principle applies to ground motion, damage, and loss models.
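As a simple illustration of the ensemble idea, the sketch below averages the rate grids of several occurrence models with weights derived from their past log-likelihood scores. This plain mixture is only illustrative; it is not the specific Bayesian scheme of Marzocchi et al. (2012).

```python
# A minimal sketch of an ensemble occurrence forecast: a weighted average of
# rate grids, with weights proportional to exponentiated past log-likelihood
# scores (a simple softmax-style weighting, chosen here for illustration).

import numpy as np

def ensemble_forecast(rate_grids, log_likelihoods):
    """rate_grids: list of 2D arrays of forecast rates; log_likelihoods: past scores."""
    scores = np.asarray(log_likelihoods, dtype=float)
    weights = np.exp(scores - scores.max())   # stabilize before normalizing
    weights /= weights.sum()
    return sum(w * grid for w, grid in zip(weights, rate_grids))
```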
In addition to improving the earthquake occurrence model, we should
leverage the numerous recent advances made by the ground motion modeling
community. For example, we could apply methods intended to help rank
ground motion prediction equations (Scherbaum et al., 2004, 2009;
Kaklamanos and Baise, 2011; Kale and Akkar, 2013), and we can also apply
the findings of impending studies based on ground motion data sets from
around the world (e.g., the Pacific Earthquake Engineering Research Center
ground motion database). We should move away from empirical, qualitative
measures such as intensity and toward physical measurements such as peak
ground velocity, peak ground acceleration, and spectral acceleration. These
quantities can be used in more sophisticated, mechanics-based damage models (e.g., Borzi et al. 2008). Such models also permit the possibility of time-varying building stock models. In other words, damage models could feed
back into the risk forecast as the sequence progresses, and we could account
for progressive damage.
We could already crudely model progressive damage with the simple damage
model that vS and H13 used. In those examples, the building stock model and
distribution of people were treated as static entries throughout the earthquake
sequence. But the damage forecasts themselves imply time-varying vulnerability: to be internally consistent, the fractions of the buildings that were
forecast to be destroyed should be removed from the building stock model, and
the remaining fraction should be updated. Moreover, as the buildings are
damaged, the number and distribution of affected people would change, even if a
complete evacuation were not economically justified (as vS claimed regarding
the L’Aquila sequence). Future seismic risk forecasts should account for progressive damage and people’s movements during an earthquake sequence;
ideally, these systematic changes would be estimated from field observations.
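A crude sketch of such progressive-damage bookkeeping is given below: after each forecast step, the fraction of buildings forecast to be destroyed is removed from the stock, and the exposed population is rescaled in proportion. The update rule is an assumption for illustration only, not a calibrated procedure.

```python
# A crude sketch of progressive-damage bookkeeping: remove buildings forecast
# to be destroyed (damage grade D5) from the stock and rescale the exposed
# population proportionally. The proportional rescaling is an assumption.

def update_building_stock(stock, population, p_destroyed):
    """stock and p_destroyed are dicts keyed by EMS-98 class; returns updated stock and population."""
    new_stock = {}
    surviving_fraction = 0.0
    total = sum(stock.values())
    for vuln_class, count in stock.items():
        survivors = count * (1.0 - p_destroyed.get(vuln_class, 0.0))
        new_stock[vuln_class] = survivors
        surviving_fraction += survivors / total
    return new_stock, population * surviving_fraction
```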
Although the previous two sections emphasized QLARM for loss estimation, Wyss (Chapter 6) mentioned a few alternatives, including Prompt
Assessment of Global Earthquake Risk (PAGER) (Jaiswal and Wald, 2010).
Indeed, the approach we described in this chapter is flexible; it is not tied to
any particular model. More generally, existing loss models only estimate
losses due to earthquake shaking, ignoring the potential impact of tsunamis,
landslides, and fires caused by earthquakes. This is a fundamental problem and
one for which we have even fewer data available to build empirical models.
Perhaps you have noticed the elephant in the room: how do we use short-term seismic risk forecasts? And how do we effectively communicate results?
We don’t know the answer, but we can say that it depends on the target user,
and in general, we should try to open the lines of communication between
seismologists, emergency managers, and social scientists (the latter group is
particularly important if our goal is to communicate with the public). In terms
of communication, vS discussed cost-benefit analysis, and H13 presented a
traffic light system, but both approaches were developed without interaction
with risk experts, and in that sense, they are only a guess as to what forecast
products might be useful. In this chapter, we emphasized a scientific, model-based method to forecast seismic risk during an earthquake sequence, but it is easy to imagine that this is only the beginning; the hardest, and most exciting, work lies ahead: figuring out how to apply such methods effectively to benefit society.
ACKNOWLEDGMENTS
This work was partially supported by the NERA (Network of European Research Infrastructure for Earthquake Risk Assessment and Mitigation) project of the European Commission's 7th Framework Programme.
REFERENCES
Akkar, S., Bommer, J.J., 2007. Empirical prediction equations for peak ground velocity derived from strong-motion records from Europe and the Middle East. Bull. Seismol. Soc. Am. 97, 511–530.
Allen, T.I., Wald, D.J., Worden, C.B., 2012. Intensity attenuation for active crustal regions. J. Seismol. 16, 409–433. http://dx.doi.org/10.1007/s10950-012-9278-7.
Baisch, S., Carbon, D., Dannwolf, U., Delacou, B., Devaux, M., Dunand, F., Jung, R., Koller, M., Martin, C., Sartori, M., Secanell, R., Vörös, R., 2009. SERIANEX – deep heat mining Basel. Tech. Rep. (in German).
Borzi, B., Crowley, H., Pinho, R., 2008. Simplified Pushover-Based Earthquake Loss Assessment (SP-BELA) method for masonry buildings. Int. J. Archit. Herit. 2, 353–376. http://dx.doi.org/10.1080/15583050701828178.
Fäh, D., Gisler, M., Jaggi, B., Kästli, P., Lutz, T., Masciadri, V., Matt, C., Meyer-Rosa, D., Rippmann, D., Schwarz-Zanetti, G., Tauber, J., Wenk, T., 2009. The 1356 Basel earthquake: an interdisciplinary revision. Geophys. J. Int. 178, 351–374.
FEMA, 2010. Hazus-MH MR5, Earthquake Model – Technical Manual.
Gerstenberger, M.C., Wiemer, S., Jones, L.M., Reasenberg, P.A., 2005. Real-time forecasts of tomorrow's earthquakes in California. Nature 435, 328–331.
Gisler, M., Fäh, D., Giardini, D., 2008. Nachbeben – Eine Geschichte der Erdbeben in der Schweiz, first ed. Haupt Verlag, Bern, p. 187 (in German).
Giovinazzi, S., 2005. The Vulnerability Assessment and the Damage Scenario in Seismic Risk Analysis (Doctoral dissertation). Department of Civil Engineering, Technical University Carolo-Wilhelmina, Braunschweig, Germany.
Grünthal, G. (Ed.), 1998. European Macroseismic Scale 1998. European Seismological Commission, Luxembourg.
Gutenberg, B., Richter, C., 1944. Frequency of earthquakes in California. Bull. Seismol. Soc. Am. 34, 185–188.
Hainzl, S., Christophersen, A., Enescu, B., 2008. Impact of earthquake rupture extensions on parameter estimations of point-process models. Bull. Seismol. Soc. Am. 98, 2066–2072.
Herrmann, M., 2013. Forecasting Losses Caused by a M6.6 Scenario Earthquake Sequence in Basel, Switzerland (Master thesis). Institut für Geophysik und Geoinformatik, TU Bergakademie Freiberg.
Jaiswal, K.S., Wald, D.J., 2010. An empirical model for global earthquake fatality estimation. Earthquake Spectra 26, 1017–1037.
Jordan, T.H., Chen, Y.-T., Gasparini, P., Madariaga, R., Main, I., Marzocchi, W., Papadopoulos, G., Sobolev, G., Yamaoka, K., Zschau, J., 2011. Operational earthquake forecasting: state of knowledge and guidelines for implementation. Ann. Geophys. 54, 315–391.
Kaklamanos, J., Baise, L.G., 2011. Model validations and comparisons of the next generation attenuation of ground motions (NGA-West) project. Bull. Seismol. Soc. Am. 101, 160–175.
Kale, O., Akkar, S., 2013. A new procedure for selecting and ranking ground-motion prediction equations (GMPEs): the Euclidean distance-based ranking (EDR) method. Bull. Seismol. Soc. Am. 103, 1069–1084.
Lolli, B., Gasperini, P., 2003. Aftershocks hazard in Italy Part I: Estimation of time-magnitude distribution model parameters and computation of probabilities of occurrence. J. Seismol., 235–257. http://dx.doi.org/10.1023/A:1023588007122.
Marzocchi, W., 2012. Putting science on trial. Phys. World 25, 17–18.
Marzocchi, W., Zechar, J.D., Jordan, T.H., 2012. Bayesian forecast evaluation and ensemble earthquake forecasting. Bull. Seismol. Soc. Am. 102, 2574–2584.
Meghraoui, M., Delouis, B., Ferry, M., Giardini, D., Huggenberger, P., Spottke, I., Granet, M., 2001. Active normal faulting in the upper Rhine graben and paleoseismic identification of the 1356 Basel earthquake. Science 293, 2070–2073.
Meletti, C., Galadini, F., Valensise, G., Stucchi, M., Basili, R., Barba, S., Vannucci, G., Boschi, E., 2008. A seismic source zone model for the seismic hazard assessment of the Italian territory. Tectonophysics 450, 85–108.
Michelini, A., Faenza, L., Lauciani, V., Malagnini, L., 2008. ShakeMap implementation in Italy. Seismol. Res. Lett. 79, 688–697.
Mignan, A., Woessner, J., 2012. Estimating the Magnitude of Completeness for Earthquake Catalogs. Community Online Resource for Statistical Seismicity Analysis. http://dx.doi.org/10.5078/corssa-00180805. Available at: http://www.corssa.org.
Ogata, Y., 1988. Statistical models for earthquake occurrences and residual analysis for point processes. J. Am. Stat. Assoc. 83, 9–27.
Peng, Z., Vidale, J.E., Houston, H., 2006. Anomalous early aftershock decay rate of the 2004 Mw6.0 Parkfield, California, earthquake. Geophys. Res. Lett. 33, L17307. http://dx.doi.org/10.1029/2006GL026744.
Reasenberg, P.A., Jones, L.M., 1994. Earthquake aftershocks: update. Science 265, 1251–1252.
Scherbaum, F., Cotton, F., Smith, P., 2004. On the use of response spectral-reference data for the selection and ranking of ground-motion models for seismic-hazard analysis in regions of moderate seismicity: the case of rock motion. Bull. Seismol. Soc. Am. 94, 2164–2185.
Scherbaum, F., Delavaud, E., Riggelsen, C., 2009. Model selection in seismic hazard analysis: an information-theoretic perspective. Bull. Seismol. Soc. Am. 99, 3234–3247.
Trendafiloski, G., Wyss, M., Rosset, P., 2011. Loss estimation module in the second generation software QLARM. In: Spence, R., So, E., Scawthorn, C. (Eds.), Human Casualties in Natural Disasters: Progress in Modeling and Mitigation. Springer, Dordrecht, Heidelberg, London, New York, pp. 95–104.
Utsu, T., 1961. A statistical study of the occurrence of aftershocks. Geophys. Mag. 30, 521–605.
van Stiphout, T., Wiemer, S., Marzocchi, W., 2010. Are short-term evacuations warranted? Case of the 2009 L'Aquila earthquake. Geophys. Res. Lett. 37, 1–5.
