# Italy Forecast Models

### Long-term (5- and 10-year) models

#### Akinci, Aybige & Lombardi, Anna Maria: HAZGRIDX

The model, HAZGRIDX, uses historical and instrumental seismicity, assuming that future earthquakes will be clustered spatially near the locations of historical main shocks. The forecast uses earthquake data with magnitudes M4.0 and greater, with no explicit use of tectonic, geologic, or geodetic information. The forecast seismicity rate is a background rate assumed constant in time. HAZGRIDX analyzes the declustered CPTI and CSI catalogs to compute the rate of earthquakes on a grid and then smooths these rates to account for the spatial distribution of future earthquakes. It uses the method proposed by Weichert (1980) to calculate model seismicity rates from a catalog whose completeness magnitude threshold changes with time, and it assumes that all earthquake magnitudes follow the Gutenberg-Richter law with a uniform b-value.
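
The conversion from a smoothed total rate to per-magnitude-bin rates under the uniform-b Gutenberg-Richter assumption can be sketched as follows (a minimal illustration, not the HAZGRIDX code; the rate, b-value, and bin edges are made-up numbers):

```python
import numpy as np

def gr_bin_rates(rate_m4, b, mag_edges):
    """Split a total rate of M >= 4.0 events into magnitude-bin rates
    under a Gutenberg-Richter law with a uniform b-value."""
    # Cumulative G-R: N(>= m) = N(>= 4) * 10**(-b * (m - 4))
    edges = np.asarray(mag_edges, dtype=float)
    cum = rate_m4 * 10.0 ** (-b * (edges - 4.0))
    return cum[:-1] - cum[1:]  # incremental rate in each [m_i, m_i+1) bin

# e.g. 10 events/yr with M >= 4.0, b = 1.0, half-magnitude bins up to M6
rates = gr_bin_rates(10.0, 1.0, np.arange(4.0, 6.01, 0.5))
```

In a gridded forecast this split is applied cell by cell, scaling each cell's smoothed rate into the experiment's magnitude bins.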

#### Chan, Chung-Han: HZA

**HZA_TI:** The approach uses a bandwidth function as a smoothing kernel to evaluate the seismicity density in the neighborhood of past earthquakes. To improve performance over large study areas, large-scale zones based solely on the large-scale geological architecture are introduced, so that parameters can be assigned to each zone separately. By averaging the seismicity density estimated from past earthquakes, we estimate the annual seismicity density and use it to forecast the spatial distribution of future earthquakes.

**HZA_TD:** This approach combines the time-independent seismicity density with time-dependent seismicity-rate changes derived from a rate-and-state friction model. To obtain a generic slip model for large earthquakes of the past, we assume homogeneous slip on a fault with dimensions derived from the empirical scaling laws of Wells and Coppersmith (1994). These slip models are the basis for calculating the Coulomb stress change imparted by each earthquake. The time-dependent seismicity-rate change imparted by each reference earthquake is derived from its Coulomb stress change and summed with the appropriate time delay.
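
The rate change produced by a Coulomb stress step can be sketched with Dieterich's (1994) rate-and-state solution; this is a generic textbook form, not the submitted model's implementation, and all parameter values below are illustrative assumptions:

```python
import math

def dieterich_rate(r_background, dcff, a_sigma, t, t_a):
    """Seismicity rate at time t after a Coulomb stress step `dcff`,
    following Dieterich's (1994) rate-and-state solution.
    a_sigma: constitutive parameter A*sigma (same units as dcff);
    t_a: aftershock relaxation time (same units as t)."""
    gamma = (math.exp(-dcff / a_sigma) - 1.0) * math.exp(-t / t_a) + 1.0
    return r_background / gamma
```

A positive stress step raises the rate by a factor exp(dcff / a_sigma) immediately after the earthquake; the rate then relaxes back to the background level over roughly t_a.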

#### Console, Rodolfo; Murru, Maura & Falcone, Giuseppe: LTST

The Long Term forecast model (LTST) is based on estimating the probability of rupture of known seismic sources in the next 5 and 10 years under the characteristic earthquake hypothesis. It is built on the fusion of a statistical time-dependent renewal model with memory, the BPT model (Brownian Passage Time, Matthews et al., 2002), with a physical model of an interacting fault population. The algorithm includes computation of the permanent coseismic static Coulomb stress change (ΔCFF) caused by all earthquakes that occurred after the latest characteristic earthquake on the fault segment of interest.
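
A BPT renewal calculation reduces to a conditional probability from the inverse Gaussian distribution. The sketch below uses the closed-form inverse Gaussian CDF; the recurrence and aperiodicity values in the example are invented, not taken from any Italian fault source:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence mu and aperiodicity alpha (Matthews et al., 2002)."""
    lam = mu / alpha ** 2  # inverse Gaussian shape parameter
    a = math.sqrt(lam / t)
    return (norm_cdf(a * (t / mu - 1.0))
            + math.exp(2.0 * lam / mu) * norm_cdf(-a * (t / mu + 1.0)))

def conditional_prob(elapsed, window, mu, alpha):
    """P(rupture within `window` years | no rupture for `elapsed` years)."""
    f_t = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + window, mu, alpha) - f_t) / (1.0 - f_t)

# e.g. hypothetical source: mean recurrence 700 yr, aperiodicity 0.5,
# 600 yr since the last characteristic event
p5 = conditional_prob(600.0, 5.0, 700.0, 0.5)
p10 = conditional_prob(600.0, 10.0, 700.0, 0.5)
```

In the full LTST model this probability is further modulated by the ΔCFF perturbations from intervening earthquakes, which the sketch omits.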

#### Faenza, Licia: PHM

The Proportional Hazard Model (PHM) is a multivariate model that characterizes the spatio-temporal distribution of large earthquakes. The model is non-parametric in the temporal domain, allowing straightforward testing of a variety of time-occurrence hypotheses. Moreover, it can account for tectonic and physical parameters that potentially influence the spatio-temporal variability, and test their relative importance. The model has previously been applied to Italian seismicity of the last four centuries using two different spatial models: a regular grid (Faenza et al., 2003) and a seismo-tectonic zonation (Cinti et al., 2004). Within the CSEP testing experiment, we adopt our model for the long-term (5- and 10-year) tests, applied to the whole Italian territory, using the two spatial distributions separately.

#### Gulia, Laura & Wiemer, Stefan: ALM, HALM

The Asperity Likelihood Model (ALM) hypothesizes that small-scale spatial variations in the b-value of the Gutenberg-Richter relationship play a central role in forecasting future seismicity (Wiemer and Schorlemmer, SRL, 2007). The physical basis of the model is the concept that the local b-value is inversely dependent on applied shear stress. Thus low b-values (b < 0.7) characterize the locked patches of faults (asperities) from which future main shocks are more likely to be generated, whereas high b-values (b > 1.1), found for example in creeping sections of faults, suggest a lower probability of large events. The b-value variability is mapped on a grid: first, the regional b-value is estimated from the entire dataset above the overall magnitude of completeness; this regional value is then compared to locally estimated b-values at each grid node for a number of sampling radii, and the local value is used if its AIC score is lower than that of the regional value.

In the ALM_IT model, we additionally first decluster the input catalog for M ≥ 2 using the method of Gardner and Knopoff (1974) and smooth the node-wise rates of the declustered catalog with a Gaussian filter. Completeness values for each node are taken from the analysis of Schorlemmer et al. (in press) using the probability-based magnitude-of-completeness method. As in ALM, the b-value at each node is selected based on AIC scores. The resulting forecast is calibrated to the average number of M4.95+ events in the CPTI08 catalog.

In the HALM (Hybrid Asperity Likelihood Model), a hybrid between a grid-based and a zoning model, the Italian territory is divided into distinct regions depending on the main tectonic regime, and the local b-value variability is mapped using independent b-values for each tectonic zone.

Wiemer S., Schorlemmer D., 2007. ALM: An Asperity-based Likelihood Model for California. Seismol. Res. Letters., 78 (1): 134-140.
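
The node-wise choice between a regional and a local b-value can be sketched with the Aki (1965) maximum-likelihood estimator and an AIC comparison. This is a toy illustration of the selection logic, not ALM's implementation: the parameter counts (regional b treated as fixed, local b as one fitted parameter) and the sample magnitudes are assumptions.

```python
import math

def b_value(mags, mc):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc."""
    m = [x for x in mags if x >= mc]
    return math.log10(math.e) / (sum(m) / len(m) - mc)

def gr_loglik(mags, mc, b):
    """Log-likelihood of an exponential (G-R) magnitude distribution."""
    beta = b * math.log(10.0)
    m = [x for x in mags if x >= mc]
    return len(m) * math.log(beta) - beta * sum(x - mc for x in m)

def select_b(mags_local, mc, b_regional):
    """AIC choice between the regional b (no free parameter at the node)
    and a locally fitted b (one free parameter)."""
    b_loc = b_value(mags_local, mc)
    aic_reg = -2.0 * gr_loglik(mags_local, mc, b_regional)      # k = 0
    aic_loc = 2.0 - 2.0 * gr_loglik(mags_local, mc, b_loc)      # k = 1
    return b_loc if aic_loc < aic_reg else b_regional

b_sel = select_b([2.1, 2.2, 2.3, 2.4, 2.5], mc=2.0, b_regional=1.0)
```

With so few events, the AIC penalty keeps the regional b-value; only a local sample that fits markedly better justifies the extra parameter.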

#### Lombardi, Anna Maria: DBM

The Double Branching Model is a stochastic time-dependent model assuming that each earthquake can generate, or is correlated with, other earthquakes through different physical mechanisms. Specifically, it consists of the sequential application of two branching processes, in which any earthquake can trigger a family of later events on different space-time scales. The first part of the model is the well-known ETAS model, describing the short-term clustering of earthquakes due to coseismic stress transfer. The second branching process works on larger space-time scales than the domains involved in short-term clustering, and aims to describe possible further correlations between events that are not ascribable to coseismic stress perturbations.

#### Meletti, Carlo: MPS04, MPS04after

The MPS04 model (MPS Working Group, 2004; http://zonesismiche.mi.ingv.it) is the reference model for seismic hazard in Italy, according to Prime Minister Ordinance 3519/2006. A direct application of MPS04 is the definition of seismic zones and response spectra in the new Building Code (2008).

MPS04 derives from a standard Cornell approach to PSHA, in which a Poissonian process is assumed. Following the MPS04 procedure, two different methodologies were used to determine the completeness time intervals: a mainly historical approach (weighted at 60%) and a mainly statistical approach (weighted at 40%).

The number of expected earthquakes in 5 and 10 years for each space-magnitude bin was evaluated starting from the seismicity rates computed in MPS04 for the branches of the logic tree in which the Gutenberg-Richter distribution was adopted. Because of the Poissonian assumption, the expected number of events in 10 years is exactly double that in 5 years.

Two further models are submitted, again for 5 and 10 years. In MPS04 a declustered catalogue was used. Thus, to account for the possibility of aftershocks (not considered in the MPS04 model), the number of earthquakes considered for MPS04 has been multiplied by a fixed coefficient of 1.24. This coefficient (following a suggestion by W. Marzocchi and L. Faenza) assumes that, given N events, first-generation aftershocks add a further 20%, and these in turn produce a further 20% of events.
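
The arithmetic behind the 1.24 coefficient is a two-generation triggering cascade:

```python
# Two generations of triggered events, each adding 20% of its parent
# generation, as described for the MPS04after models:
n = 1.0                   # declustered (main-shock) events
gen1 = 0.20 * n           # first-generation aftershocks (+20%)
gen2 = 0.20 * gen1        # aftershocks of those aftershocks (+4%)
factor = n + gen1 + gen2  # = 1.24
```

Note that summing the full geometric series 1/(1 - 0.2) would give 1.25; truncating after the second generation yields the 1.24 used here.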

#### Nanjo, Kazuyoshi: RI

Four earthquake forecast models based on the algorithm called "Relative Intensity of Seismicity" (RI) have been submitted to the Italian CSEP experiment: one for the 10-year testing class, one for the 5-year class, and two for the 3-month class. The RI working assumption is that future large earthquakes are more likely to occur at sites of higher past seismic activity. The RI algorithm, originally a binary forecast system, is modified, as required by the experiment, to produce forecast numbers of earthquakes for the predefined magnitudes at each grid point, by assuming a Gutenberg-Richter frequency-magnitude distribution with a value typical for Italy, b = 1.2. This approach thus belongs to a general class of smoothed-seismicity models. All models are based on earthquakes since 1985, except for one 3-month-class model based on roughly the last four years of seismicity. The two 3-month-class models are used to examine whether historical seismicity provides better forecasts than recent earthquakes for such an intermediate-term class.
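
The RI working assumption, stripped to its core, distributes the expected number of target events over grid cells in proportion to past activity. This is a deliberately simplified sketch (the actual RI algorithm also counts events in neighboring cells and applies the b = 1.2 magnitude split); the cell counts and forecast total are invented:

```python
import numpy as np

def ri_cell_rates(past_counts, n_forecast):
    """Distribute an expected number of target events over grid cells
    in proportion to past event counts (simplified RI assumption)."""
    counts = np.asarray(past_counts, dtype=float)
    return n_forecast * counts / counts.sum()

# four hypothetical cells with 40, 10, 0, and 50 past events,
# and 5 target events expected over the forecast horizon
rates = ri_cell_rates([40, 10, 0, 50], n_forecast=5.0)
```

A cell with no recorded past events gets zero forecast rate, which is why the choice of catalog period (since 1985 versus the last four years) matters for the 3-month comparison.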

#### Peruzza, Laura; Pace, Bruno & Visini, Francesco: LASSCI

The LASSCI model (LAyered Seismogenic Source model in Central Italy) is an earthquake rupture forecast based on a schematic representation of active faulting with three types of seismogenic sources, formally combined in a PSHA under stationary and time-dependent perspectives. The three layers are: 1) background seismicity (BK), defined by regular cells of variable a- and b-values, with G-R relationships calibrated on instrumental data; 2) seismotectonic provinces (SP), defined by large structural domains homogeneous in terms of active tectonics, with a G-R relationship calibrated on historical events not assigned to individual faults; 3) individual seismogenic sources (SB), defined as the surface projection of the known active fault (box); for these, three occurrence models (namely GR, CH, and HY, the last combining G-R behavior with the peak of a characteristic event) are permitted, and time-dependence is considered for sources with a known date of the last event.

#### Schorlemmer, Danijel & Wiemer, Stefan: ALM_IT

see above

#### Werner, Max: HiResSmoSeis

The model is based on simple, yet hotly contested, hypotheses and requires only past seismicity as data input. Future earthquakes are assumed to occur with higher probability in areas where past earthquakes have occurred. We therefore smooth the locations of past earthquakes using a power-law kernel that is adaptive, i.e., we smooth very little in regions of dense seismicity but more in sparse regions. The degree of smoothing is optimized via retrospective (out-of-sample) tests. Sequences of triggered events are removed using a simple declustering method. The magnitude of each earthquake is independently distributed according to a tapered Gutenberg-Richter distribution with corner magnitude 8.0, independent of geological setting, past earthquakes, or any other characteristic. Because the spatial density of future earthquakes is assumed independent of magnitude, small earthquakes of magnitude 2 and larger can be used to better forecast future large events: the small earthquakes indicate regions of active seismicity. We correct for missing events by estimating the completeness magnitude in space.
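
The tapered Gutenberg-Richter magnitude distribution can be sketched in its moment-space form (Kagan-style taper). This is a generic illustration, not the submitted model's code; the b-value and minimum magnitude are assumptions, while the corner magnitude 8.0 comes from the description above:

```python
import math

def tapered_gr_ccdf(m, m_min=2.0, b=1.0, m_corner=8.0):
    """Fraction of earthquakes with magnitude >= m under a tapered
    Gutenberg-Richter law, written in seismic moment."""
    def moment(mw):  # Hanks & Kanamori (1979) moment, in N*m
        return 10.0 ** (1.5 * mw + 9.05)
    beta = b / 1.5  # moment-space exponent corresponding to b
    s, s_min, s_c = moment(m), moment(m_min), moment(m_corner)
    # pure power law times an exponential taper at the corner moment
    return (s_min / s) ** beta * math.exp((s_min - s) / s_c)
```

Below the corner the taper is negligible and the law behaves like plain G-R; well above the corner the exponential term suppresses the rate far below the power-law extrapolation.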

#### Zechar, Jeremy: TripleS

The Simple Smoothed Seismicity (Triple-S) model is based on Gaussian smoothing of historical seismicity. Past epicenters make a smoothed contribution to an earthquake density estimate, with the epicenters smoothed using a fixed length scale σ; σ is optimized by minimizing the average area-skill-score misfit function in a retrospective experiment (Zechar & Jordan, 2010, PAGEOPH). The density map is scaled to match the average historical rate of seismicity.
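
Fixed-bandwidth Gaussian smoothing of epicenters can be sketched as follows (a minimal version, not the Triple-S code: it uses planar distances in km and a made-up σ, and omits the area-skill-score optimization of σ):

```python
import numpy as np

def smoothed_density(epicenters, grid, sigma):
    """Gaussian-kernel density of past epicenters evaluated at grid-cell
    centers, with a fixed smoothing length sigma (distances in km)."""
    epi = np.asarray(epicenters, dtype=float)  # shape (n_events, 2)
    pts = np.asarray(grid, dtype=float)        # shape (n_cells, 2)
    d2 = ((pts[:, None, :] - epi[None, :, :]) ** 2).sum(axis=-1)
    kern = np.exp(-d2 / (2.0 * sigma ** 2))
    dens = kern.sum(axis=1)
    return dens / dens.sum()  # normalized over the grid

# one epicenter at the origin, three cell centers along a line, sigma = 5 km
dens = smoothed_density([[0.0, 0.0]],
                        [[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]],
                        sigma=5.0)
```

The normalized density is then multiplied by the average historical event rate to produce the forecast.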

### Three-month models

#### Zechar, Jeremy: TripleS

see above

#### Nanjo, Kazuyoshi: RI

see above

### One-day models

#### Console, Rodolfo; Murru, Maura & Falcone, Giuseppe: ERS, ETES

Our group has submitted three earthquake occurrence models, two in the short-term (24-hour) class and one in the long-term (5- and 10-year) class, applied to the whole Italian territory. The two short-term models consider the short-term clustering properties of earthquakes. The first is a purely stochastic Epidemic Type Earthquake Sequence (ETES) model, in which the temporal aftershock decay rate is governed by the modified Omori law (Ogata, 1983) and the distance decay follows a power law. The second short-term forecast, the Epidemic Rate-State (ERS) model, is physically constrained by applying the Dieterich rate-and-state constitutive law to earthquake clustering. For the computation of the earthquake rate, both short-term models assume the validity of the Gutenberg-Richter frequency-magnitude distribution.
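
The space-time triggering kernel of an ETES-type model combines modified Omori decay in time with power-law decay in distance. The sketch below shows the functional form only; every parameter value is illustrative, not taken from the submitted model:

```python
def etes_kernel(t, r, k=0.05, c=0.01, p=1.1, d=1.0, q=1.5):
    """Triggering-rate contribution of one past earthquake:
    modified Omori law k / (t + c)**p in time t (days) times a
    power-law decay (r**2 + d**2)**(-q) in epicentral distance r (km)."""
    return k / (t + c) ** p / (r ** 2 + d ** 2) ** q
```

In the full model, the contributions of all past earthquakes, each scaled by a magnitude-dependent productivity, are summed and added to a background rate.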

#### Lombardi, Anna Maria: ETAS_LM

The Epidemic Type Aftershock Sequences (ETAS) model is a stochastic point process aiming to model coseismic stress-triggered aftershock sequences. It describes the seismicity rate of a specific area as the sum of two contributions: the "background rate", usually associated with the regional tectonic strain rate, and the "rate of triggered events", associated with stress perturbations caused by previous earthquakes. The background rate is assumed variable in space and constant in time. As for the triggered seismicity, each event has a magnitude-dependent ability to generate its own Omori-like aftershock decay. The spatial distribution of this triggering capability is assumed isotropic and decreasing with increasing distance from the epicenter of the triggering event.

#### Woessner, Jochen: STEP

The Short-Term Earthquake Probabilities (STEP) model (Gerstenberger et al., 2005) merges a static background model of seismicity rates, derived from a simple smoothed-seismicity approach (Zechar & Jordan, 2009), with a time-dependent model. Comparing the forecast seismicity rates of these model parts, the algorithm selects the contribution with the higher forecast rate. The time-dependent model itself is an extension of the simpler Reasenberg & Jones model (Reasenberg & Jones, 1989, 1990, 1994) and combines three elements: (1) a generic element based on the average behavior of Italian aftershock sequences; (2) an element based on the average behavior of the particular aftershock sequence; and (3) a spatially variable element in which the aftershock behavior is mapped on a 5-km-square grid. At each grid node, we apply the corrected Akaike Information Criterion (AICc) (Burnham & Anderson, 2002) to determine the best-fitting model. Because we do not wish to force selection of a single model, AIC weighting is used: a relative weight is derived for each model element from its AICc score, and the final model is a weighted sum of the three elements (Gerstenberger et al., 2005). We submitted two models, STEP_LG and STEP_NG, to the daily testing class for the Italian region (Woessner et al., 2009). The difference between the models lies in the rate forecast based on the generic element of the time-dependent model. STEP_LG uses parameters derived for Italian seismicity following the approach of Gasperini & Lolli (2006). STEP_NG (New Generic) uses a magnitude-dependent productivity parameter following Christophersen & Smith (2008) and Christophersen & Gerstenberger (2009, pers. comm.). The latter model is less productive for M ≤ 6.2 earthquakes and more productive otherwise.
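
The AICc weighting step can be sketched with standard Akaike weights (Burnham & Anderson, 2002); the three scores in the example are invented, standing in for the three model elements at one grid node:

```python
import math

def akaike_weights(aicc_scores):
    """Relative model weights from AICc scores: w_i is proportional to
    exp(-delta_i / 2), where delta_i = AICc_i - min(AICc)."""
    best = min(aicc_scores)
    raw = [math.exp(-(a - best) / 2.0) for a in aicc_scores]
    total = sum(raw)
    return [x / total for x in raw]

# hypothetical AICc scores for the generic, sequence-specific,
# and spatially variable elements at one node
w = akaike_weights([100.0, 102.0, 110.0])
```

The final node forecast is then the sum of the three element rates multiplied by their weights, so a narrowly better element dominates without the others being discarded.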