Abstract

Displacement is a critical indicator for mechanical systems and civil structures. Conventional vision‐based displacement recognition methods mainly focus on the sparse identification of a limited number of measurement points, so representing the motion of an entire structure remains very challenging. This study proposes a novel Nodes2STRNet for structural dense displacement recognition from a handful of structural control nodes, based on a deformable structural three‐dimensional mesh model. It consists of a control node estimation subnetwork (NodesEstimate) and a pose parameter recognition subnetwork (Nodes2PoseNet). NodesEstimate calculates a dense optical flow field based on FlowNet 2.0 and generates structural control node coordinates. Nodes2PoseNet takes the structural control node coordinates as input and regresses structural pose parameters via a multilayer perceptron. A self‐supervised learning strategy with a mean square error loss and L2 regularization is designed to train Nodes2PoseNet. The effectiveness and accuracy of dense displacement recognition, as well as robustness to lighting variations, are validated by seismic shaking table tests of a four‐story building model. Comparative studies with the image‐segmentation‐based Structure‐PoseNet show that the proposed Nodes2STRNet achieves higher accuracy and better robustness against lighting variations. In addition, NodesEstimate does not require retraining when faced with new scenarios, and Nodes2PoseNet has high self‐supervised training efficiency because it uses only a few control nodes instead of fully supervised pixel‐level segmentation.
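To make the Nodes2PoseNet idea concrete, the sketch below implements a small multilayer perceptron that maps flattened control-node coordinates to pose parameters and trains it with a mean-square-error loss plus L2 regularization, as the abstract describes. This is an illustrative assumption, not the paper's actual network: the node count, layer sizes, pose parameterization, and synthetic training data are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes = 8            # assumed number of structural control nodes
in_dim = 2 * n_nodes   # (x, y) image coordinates per node, flattened
hidden = 32            # assumed hidden-layer width
out_dim = 4            # assumed number of pose parameters

# One-hidden-layer MLP parameters (stand-in for Nodes2PoseNet).
W1 = rng.normal(0, 0.1, (in_dim, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.1, (hidden, out_dim))
b2 = np.zeros(out_dim)

def forward(X):
    """Node coordinates -> hidden activations and pose-parameter estimates."""
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

def train_step(X, Y, lr=0.05, lam=1e-4):
    """One full-batch gradient step on MSE(Y_hat, Y) + lam*(||W1||^2 + ||W2||^2)."""
    global W1, b1, W2, b2
    h, Y_hat = forward(X)
    n = X.shape[0]
    dY = 2.0 * (Y_hat - Y) / n          # gradient of the MSE term
    dW2 = h.T @ dY + 2 * lam * W2       # + gradient of the L2 penalty
    db2 = dY.sum(axis=0)
    dh = (dY @ W2.T) * (1.0 - h**2)     # backprop through tanh
    dW1 = X.T @ dh + 2 * lam * W1
    db1 = dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return float(np.mean((Y_hat - Y) ** 2))

# Synthetic stand-in data: pose parameters as a linear function of node motion.
X = rng.normal(size=(256, in_dim))
A = rng.normal(size=(in_dim, out_dim))
Y = 0.1 * (X @ A)

losses = [train_step(X, Y) for _ in range(500)]
print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In the paper's actual pipeline the training signal comes from a self-supervised objective rather than ground-truth pose labels; the synthetic labels above merely show the MSE-plus-L2 optimization mechanics.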
