Abstract

This contribution proposes a workflow for a completely automatic orientation of historical terrestrial urban images. Automatic structure from motion (SfM) software packages often fail when applied to historical image pairs due to large radiometric and geometric differences causing challenges with feature extraction and reliable matching. As an innovative initialising step, the proposed method uses the neural network D2‐Net for feature extraction and Lowe’s mutual nearest neighbour matcher. The principal distance for every camera is estimated using vanishing point detection. The results were compared to three state‐of‐the‐art SfM workflows (Agisoft Metashape, Meshroom and COLMAP), with the proposed workflow outperforming the other SfM tools. The resulting camera orientation data are planned to be imported into a web and virtual/augmented reality (VR/AR) application for the purpose of knowledge transfer in cultural heritage.
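The abstract does not spell out the matching step; as a rough illustration only, the mutual nearest‐neighbour criterion applied to descriptors such as those produced by D2‐Net could be sketched as below. The function name and the descriptor arrays `desc1`/`desc2` are assumptions for this sketch and are not taken from the paper.

```python
import numpy as np

def mutual_nearest_neighbour_matches(desc1: np.ndarray, desc2: np.ndarray) -> np.ndarray:
    """Return index pairs (i, j) where desc1[i] and desc2[j] are each other's
    nearest neighbour in Euclidean descriptor space.

    desc1, desc2: arrays of shape (N1, D) and (N2, D) holding one descriptor per
    detected feature (e.g. 512-dimensional descriptors from a D2-Net-style extractor).
    """
    # Pairwise squared Euclidean distances between the two descriptor sets.
    d2 = (
        np.sum(desc1 ** 2, axis=1)[:, None]
        + np.sum(desc2 ** 2, axis=1)[None, :]
        - 2.0 * desc1 @ desc2.T
    )
    nn12 = np.argmin(d2, axis=1)   # best match in image 2 for each feature in image 1
    nn21 = np.argmin(d2, axis=0)   # best match in image 1 for each feature in image 2
    idx1 = np.arange(desc1.shape[0])
    keep = nn21[nn12] == idx1      # keep only mutually consistent (symmetric) pairs
    return np.stack([idx1[keep], nn12[keep]], axis=1)
```

Matches surviving this symmetry check would then serve as tie points for the subsequent relative orientation and SfM steps.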
