Abstract

Railway bridges are essential components of any transportation system and are typically subjected to environmental and operational actions that can cause damage. Furthermore, they are not easily replaced, and their failure can have catastrophic consequences. Considering the expected lifespan of bridges, it is essential to guarantee their adequate serviceability and safety. In this context, Structural Health Monitoring (SHM) emerges as a means of identifying damage early, before it becomes critical. Damage identification is usually performed by comparing the damaged and undamaged responses obtained from monitoring data. Among the features extracted from these responses, time-series models exhibit superior performance and early damage detection capability, and can also be applied within online damage detection strategies based on unsupervised machine learning frameworks. In this paper, a review of advanced time-series methodologies for damage detection is presented. First, several time-series models often used in SHM are described, such as Autoregressive (AR) models, Recurrent Neural Networks (RNN), Gated Recurrent Units (GRU), and Long Short-Term Memory (LSTM) networks. The framework in which these models are usually applied is then detailed, including the latest developments and most relevant results. Finally, the conclusions summarize current perspectives and research gaps concerning time-series models.

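As a concrete illustration of the residual-based framework the abstract alludes to, the sketch below fits an AR model to a baseline (undamaged) vibration signal and uses the growth of its one-step prediction residuals on new data as a damage index. This is a minimal sketch of the general technique, not the procedure of any specific work reviewed; the simulated signals, AR order, and residual-ratio index are illustrative assumptions.

```python
# Minimal sketch (not taken from the reviewed works): residual-based damage
# detection with an AR model. Signal simulation, AR order, and the damage index
# definition are illustrative assumptions.
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of AR(p) coefficients a_1..a_p in x[t] = sum_k a_k * x[t-k] + e[t]."""
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])  # lagged regressors
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def ar_residuals(x, a):
    """One-step-ahead prediction errors of a previously fitted AR model on a new signal."""
    p = len(a)
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    return x[p:] - X @ a

def simulate_response(n, a1, a2, rng):
    """Toy AR(2) stand-in for an acceleration record; damage shifts the coefficients."""
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.standard_normal()
    return x

rng = np.random.default_rng(0)
baseline = simulate_response(5000, 1.6, -0.8, rng)   # healthy (training) state
test_ok  = simulate_response(5000, 1.6, -0.8, rng)   # healthy (unseen) state
test_dmg = simulate_response(5000, 1.3, -0.8, rng)   # altered dynamics, e.g. a stiffness change

a = fit_ar(baseline, p=10)                           # model learned from the healthy state only
ref_std = np.std(ar_residuals(baseline, a))
for name, sig in [("healthy", test_ok), ("damaged", test_dmg)]:
    di = np.std(ar_residuals(sig, a)) / ref_std      # residual-std ratio as a damage index
    print(f"{name:8s} damage index: {di:.2f}")       # values well above 1 flag a change of state
```

Because the model is trained on baseline data only, the approach fits the unsupervised, online setting mentioned in the abstract: new signals are scored against the healthy-state model without requiring examples of damage.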