Influence of scour of foundations on the seismic performance of bridges
Abstract
Infrastructure deterioration and aging are growing challenges, compounded by budgetary constraints on bridge maintenance and retrofitting. Scour of foundations, in particular, is a significant issue for many bridges, especially as global climate change increases the frequency of extreme flood events. Indeed, scour is not only the primary cause of collapse of existing bridges but also reduces their ability to withstand subsequent catastrophic events, such as earthquakes. Given the limited financial resources available for retrofitting, it is crucial to analyze the residual static and seismic capacity of bridges with accurate models. In this regard, soil-structure interaction plays a central role both in the assessment of current conditions and in the appraisal of expected performance. Much attention has been devoted in recent years to dynamic testing of bridges to evaluate their current condition, specifically with regard to foundation scour. The principal advantage of vibration-based monitoring over conventional approaches is that the dynamic response of the bridge remains altered even when the scour hole is refilled with sediment in the aftermath of major flood events. At the same time, scour of the foundation has a substantial impact on the dynamic response of bridges and on their ability to withstand future earthquakes. Modelling of these phenomena is often over-simplified by assuming homogeneous riverbed erosion as a reference condition, whereas the actual configuration of the scour hole can significantly affect the response of the pier. For a proper use of resources, not only from an economic perspective but also for environmental sustainability, a more realistic appraisal is required. To this end, physical models can supply information for the calibration of numerical models, which can subsequently be applied to a realistic evaluation of static and seismic performance.
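As a minimal, hypothetical illustration of why scour alters the dynamic response tracked by vibration-based monitoring (this is not a model from the lecture), the Python sketch below idealizes a pier as a cantilever with a lumped deck mass: the scour hole exposes additional pier length, which lowers the lateral stiffness and hence the first natural frequency. All dimensions and material properties are assumed for illustration only.

import numpy as np

# Deliberately crude illustration (assumed values): a bridge pier idealized as a
# cantilever with a lumped deck mass at the top. Scour exposes extra pier length,
# so the effective cantilever length grows, the lateral stiffness k = 3EI/L^3
# drops, and the first natural frequency decreases.
E = 30e9                      # concrete Young's modulus [Pa]
D = 1.5                       # circular pier diameter [m]
I = np.pi * D**4 / 64         # second moment of area [m^4]
m_deck = 400e3                # tributary deck mass [kg]
L0 = 8.0                      # pier free length before scour [m]

for scour_depth in (0.0, 1.0, 2.0, 3.0):
    L = L0 + scour_depth                      # exposed length after scour [m]
    k = 3 * E * I / L**3                      # cantilever tip stiffness [N/m]
    f = np.sqrt(k / m_deck) / (2 * np.pi)     # first natural frequency [Hz]
    print(f"scour = {scour_depth:.1f} m -> f1 = {f:.2f} Hz")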
Biography
Sebastiano Foti is a Professor in Geotechnical Engineering and Vice-Rector for Education at Politecnico di Torino (Italy). His research activity is mainly devoted to geotechnical earthquake engineering and geophysical methods for geotechnical characterization. He is the author of the book “Surface wave methods for near-surface site characterization” and a member of the Project Team for Eurocode 7 - Geotechnical design - Part 2: Ground investigation and testing. He was awarded the Geotechnical Research Medal (Bishop Medal) by the Institution of Civil Engineers (UK), an Honorable Mention by the Society of Exploration Geophysicists (USA), and the Outstanding Paper Award from Earthquake Spectra by the Earthquake Engineering Research Institute.





Ground motion intensity varies significantly with changes in orientation. Although historically this variation has been ignored or not properly accounted for, it is significant and has multiple important implications for earthquake-resistant design and for evaluating the seismic performance of the built environment. A common misconception is that strong directionality only occurs in the near field; in reality, strong directionality occurs even at large distances from the rupture, where it cannot be attributed to directivity. Recent studies to quantify and model directionality will be presented, including probabilistic models of two different metrics to quantify directionality in earthquake ground motions. Novel results show an interesting negative correlation between the level of polarization in a ground motion and its duration, meaning that the level of polarization tends to decrease as the duration of the motion increases. It will be shown that for most structures, which typically have two perpendicular principal axes in the horizontal plane, the probability that orientation-independent measures of ground motion intensity such as RotD50 are exceeded along one of the principal axes of the structure is higher than 90% when RotD50 occurs at the site; therefore, the mean annual rate of exceedance of the ground motion intensity in the structure is significantly higher than the mean annual rate of exceedance of RotD50. An alternative measure of intensity, referred to as MaxRotD50, will be presented and discussed. This new measure of intensity is particularly well suited for earthquake-resistant design, where a major concern for geotechnical and structural engineers is the probability that the design ground motion intensity is exceeded in at least one of the two principal horizontal components of the structure. The presentation will also include new, emerging orientation-dependent ground motion models that allow ground motion intensities to be estimated at specific orientations, and will show that the variability of these models is in many cases smaller than that of orientation-independent measures of intensity. Finally, some applications will be presented to illustrate the importance of directionality.
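As a small illustrative sketch (assumptions mine, not part of the presentation), the Python snippet below computes a PGA-based RotD measure by rotating two as-recorded horizontal components over all non-redundant azimuths, following the standard RotD definition: RotD50 is the median of the rotated peaks and RotD100 the maximum. The MaxRotD50 measure mentioned above is not reproduced here.

import numpy as np

def rotd_percentiles(a1, a2, percentiles=(50, 100)):
    """PGA-based RotD measure: rotate the two as-recorded horizontal
    components over non-redundant azimuths (0-179 deg), take the peak of
    each rotated trace, and return percentiles of those peaks
    (RotD50 = median, RotD100 = maximum)."""
    angles = np.deg2rad(np.arange(0, 180))
    # rotated component for every azimuth: shape (n_angles, n_samples)
    rotated = np.outer(np.cos(angles), a1) + np.outer(np.sin(angles), a2)
    peaks = np.max(np.abs(rotated), axis=1)   # peak of each rotated trace
    return {p: np.percentile(peaks, p) for p in percentiles}

# Hypothetical two-component record (synthetic noise, just to exercise the code)
rng = np.random.default_rng(0)
a_ns, a_ew = rng.standard_normal(2000), 0.5 * rng.standard_normal(2000)
print(rotd_percentiles(a_ns, a_ew))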
Probabilistic seismic hazard analysis (PSHA) is the standard approach for developing earthquake ground motions for critical facilities because it accounts for the important sources of uncertainty and variability associated with the earthquake source and with ground motion prediction. The Senior Seismic Hazard Analysis Committee (SSHAC) process was initiated in 1997 by the US Nuclear Regulatory Commission to provide guidance on uncertainty and the use of experts in PSHA, and since then the SSHAC framework has been used for PSHA projects for critical facilities around the world. Site response analysis has traditionally been performed outside of the PSHA and, thus, the SSHAC process has not been utilized, despite the fact that significant uncertainties and judgments are associated with site response analysis. More recently, site response analysis has become part of the PSHA, and guidance for applying the SSHAC process to site response analysis has been developed. This presentation will introduce the SSHAC process and its application to site response analysis in recent projects. The approach to developing site adjustment factors will be described, along with the main sources of epistemic uncertainty and aleatory variability. The logic tree approach to incorporating epistemic uncertainty will be demonstrated and examples provided.
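As a hedged sketch of the logic-tree idea (the branch models, weights, medians and sigmas below are all hypothetical, not from the projects discussed), the snippet combines alternative site-amplification models, each an epistemic branch with its own aleatory variability, into a weighted probability of exceedance.

import numpy as np
from scipy.stats import lognorm

# Hypothetical logic-tree branches for the site amplification factor at one
# oscillator period: each branch is an alternative model (epistemic
# uncertainty) with a weight; within each branch the amplification is
# lognormally distributed (aleatory variability, sigma_ln).
branches = [
    {"name": "model_A", "weight": 0.40, "median_amp": 1.8, "sigma_ln": 0.30},
    {"name": "model_B", "weight": 0.35, "median_amp": 2.2, "sigma_ln": 0.25},
    {"name": "model_C", "weight": 0.25, "median_amp": 1.5, "sigma_ln": 0.35},
]
assert abs(sum(b["weight"] for b in branches) - 1.0) < 1e-9

def prob_amp_exceeds(threshold):
    """Weighted (mean-hazard style) probability that the amplification
    factor exceeds a threshold, combining all logic-tree branches."""
    return sum(
        b["weight"] * lognorm.sf(threshold, s=b["sigma_ln"], scale=b["median_amp"])
        for b in branches
    )

print(f"P(amplification > 2.0) = {prob_amp_exceeds(2.0):.3f}")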
In nuclear engineering practice, the so-called safety factor (or separation of variables) approach is generally used to develop fragility curves, owing to its systematic applicability and its ability to handle a large number of SSCs (Structures, Systems and Components). With increasing computational capabilities, it is now feasible and increasingly common to develop numerical models representing the complex and possibly nonlinear behavior of the components at stake. This talk addresses different aspects of the numerical evaluation of fragility curves, including the choice of intensity measures, uncertainty propagation, the reliability of numerical models, possible surrogates, and the introduction of knowledge through expert judgement and in-situ experience data. Different sources of information such as expert judgement, numerical simulation, qualification tests and experience feedback can be combined in a Bayesian framework to develop best-informed fragility curves. Here, we present an approach that uses generic fragility parameters and simulation to develop priors and then updates the fragility curves with experience feedback, considering both epistemic and aleatory uncertainty. In particular, we use a database that contains failure data collected in industrial plants that have experienced an earthquake. We discuss the opportunities and difficulties of this approach, related to the lack of specific data for nuclear equipment despite growing experience feedback and awareness. The PGA is generally used as the intensity measure when developing fragility and hazard curves. Finally, we consider an approach to deal with vector hazard and vector fragility curves and discuss the possible benefits for seismic risk assessment of nuclear plants. Indeed, while PGA proves to be a very good damage or failure indicator for a large number of SSCs, a few SSCs with low-frequency behavior could be better characterized by introducing a second intensity measure such as a low-frequency spectral acceleration.
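As an illustrative sketch of the Bayesian ingredient described above (the lognormal fragility form, the prior, and the experience-feedback data are all assumed for illustration, not taken from the database mentioned in the talk), the snippet below updates the median capacity of a fragility curve using binary failure/survival observations at the PGA levels experienced by the equipment.

import numpy as np
from scipy.stats import norm, lognorm

def fragility(pga, a_m, beta):
    """Lognormal fragility: P(failure | PGA) = Phi(ln(pga / a_m) / beta)."""
    return norm.cdf(np.log(pga / a_m) / beta)

# Hypothetical prior on the median capacity a_m (epistemic uncertainty);
# the composite log-standard deviation beta is treated as known for simplicity.
a_m_grid = np.linspace(0.2, 3.0, 300)                 # candidate medians [g]
prior = lognorm.pdf(a_m_grid, s=0.4, scale=1.2)
beta = 0.45

# Hypothetical experience-feedback data: (PGA experienced [g], failed?)
observations = [(0.35, False), (0.50, False), (0.62, True), (0.28, False)]

# Bernoulli likelihood of the observed failures/survivals for each candidate a_m
likelihood = np.ones_like(a_m_grid)
for pga, failed in observations:
    p_f = fragility(pga, a_m_grid, beta)
    likelihood *= p_f if failed else (1.0 - p_f)

# Grid-based posterior and its mean (trapezoid-free normalization for brevity)
da = a_m_grid[1] - a_m_grid[0]
posterior = prior * likelihood
posterior /= posterior.sum() * da
print("Posterior mean of median capacity a_m:", (a_m_grid * posterior).sum() * da)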
