Digital cameras are cost-effective vision sensors capable of directly providing two-dimensional information about structural condition in monitoring and assessment applications. For example, digital cameras are essential components of unmanned aerial vehicles (UAVs) and robotic agents for mobile sensing and inspection of pipelines, buildings, and transportation infrastructure, especially for assessment after natural disasters and man-made extreme events. Additionally, while surveillance cameras have been widely used in transportation systems (e.g., traffic monitoring), when appropriately mounted on large-scale structures such as bridges, they can continuously monitor structural condition under operational loads and hazards, complementing the regular visual inspection and assessment conducted by experts. In these and other applications, efficiently and reliably transferring the structural images or videos, which are inherently large-scale data, is important and challenging, especially on wireless platforms, which are either required (e.g., UAVs and robotic agents) or more suitable (e.g., camera monitoring networks) but offer only limited power and communication resources. This paper studies computational algorithms for efficient and reliable transmission of structural monitoring images; in particular, the compressed sensing (CS) technique is explored for robust data transmission and recovery. The sparse representation, or data structure, of the structural images is exploited, leading to the central CS-based strategy: in some sparse domain, randomly encode the large-scale image data into a few relevant coefficients, which are then transferred (robustly to random data loss) and recovered (at a base station) for subsequent structural health diagnosis. Image data from a bench-scale pipe structure, a concrete structure, and a full-scale stay cable are employed to validate the CS-based method. Its performance is also compared with traditional transform coding and low-dimensional encoding (sampling), and their respective advantages and drawbacks are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
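To make the central strategy concrete, the following is a minimal, self-contained sketch of the CS encode/transmit/recover pipeline described above, not the authors' implementation. All specifics are illustrative assumptions: the signal is taken to be sparse in the identity basis (a real image would first be transformed, e.g., by a wavelet or DCT), and the problem sizes, the 20% packet-loss rate, and the use of a plain ISTA solver for the ℓ1 recovery are chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): n coefficients, m << n measurements,
# k nonzero entries in the sparse domain.
n, m, k = 256, 80, 10

# Synthetic signal that is k-sparse; stands in for an image patch after a
# sparsifying transform (wavelet, DCT, etc.).
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# Random Gaussian encoder: each measurement is an incoherent projection of the
# whole signal, so no single measurement is critical.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x  # compressed measurements to transmit over the wireless link

# Simulate random data loss in transmission: drop ~20% of the measurements.
keep = rng.random(m) > 0.2
A_rx, y_rx = A[keep], y[keep]

# Recover at the base station by iterative soft-thresholding (ISTA),
# a basic solver for the l1-regularized least-squares problem.
lam = 0.01
L = np.linalg.norm(A_rx, 2) ** 2  # Lipschitz constant of the gradient
x_hat = np.zeros(n)
for _ in range(500):
    grad = A_rx.T @ (A_rx @ x_hat - y_rx)
    z = x_hat - grad / L
    x_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

print("relative recovery error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

Because every Gaussian measurement carries information about all sparse coefficients, dropping a random subset of measurements degrades the reconstruction gracefully rather than erasing specific image regions, which is the robustness to random data loss that the abstract highlights.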