Abstract

This paper proposes a machine-learning-based approach to predicting the position of a light-emitting diode (LED) target measured with a new system, the multi-aperture positioning system (MAPS). The system follows a photogrammetric principle, combining an aperture mask with a single camera sensor. To achieve high positioning accuracy, it currently relies on several computationally expensive algorithms, and its accuracy matches or exceeds that of existing photogrammetric devices. We investigate whether a neural network (NN) can replace these algorithms in the system software to increase the measurement frequency while maintaining similar accuracy. The NN is trained on thousands of labeled simulated images and evaluated on real images, learning to predict the target position directly from the captured sensor data rather than through the hand-crafted algorithms used previously. We also examine whether systematic measurement errors can be avoided: not all factors affecting measurement precision are known, some cannot be determined accurately, and others change over time. An NN, by contrast, learns from all information contained in the images, implicitly capturing every influence present at training time. Results show that the trained NN matches the performance of the previously used Gaussian algorithm in less time, since no filtering or other image pre-processing is required; this directly increases the achievable measurement frequency of the MAPS. The light-spot center is detected with sub-pixel accuracy and, unlike some of the previously used algorithms, without systematic errors. Improving the simulation of the sensor images remains necessary to explore the full potential of the NN.
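The abstract does not specify the network architecture or training setup, but the core idea it describes, regressing a sub-pixel light-spot center directly from an image patch with a small convolutional NN trained on simulated spots, can be sketched as follows. Everything in this sketch (the patch size, the Gaussian spot simulator, the layer sizes, and the training schedule) is an illustrative assumption, not the authors' implementation:

```python
# Minimal sketch (illustrative assumptions, not the paper's implementation):
# train a small CNN to regress the sub-pixel (x, y) center of a light spot
# from a simulated sensor patch, as the abstract describes.
import torch
import torch.nn as nn

PATCH = 32  # assumed patch size in pixels

def simulate_spot(batch):
    """Render Gaussian light spots at random sub-pixel centers (assumed simulator)."""
    ys, xs = torch.meshgrid(torch.arange(PATCH), torch.arange(PATCH), indexing="ij")
    cx = PATCH / 2 + (torch.rand(batch) - 0.5) * 8   # random center, +/- 4 px
    cy = PATCH / 2 + (torch.rand(batch) - 0.5) * 8
    sigma = 2.0                                       # assumed spot width
    img = torch.exp(-(((xs - cx[:, None, None]) ** 2)
                      + ((ys - cy[:, None, None]) ** 2)) / (2 * sigma ** 2))
    img = img + 0.01 * torch.randn_like(img)          # additive sensor noise
    return img.unsqueeze(1), torch.stack([cx, cy], dim=1)

# Small CNN regressor: raw patch in, (cx, cy) in pixel coordinates out.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * (PATCH // 4) ** 2, 64), nn.ReLU(),
    nn.Linear(64, 2),
)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    imgs, centers = simulate_spot(64)
    loss = nn.functional.mse_loss(model(imgs), centers)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the network maps the raw patch to a center estimate in a single forward pass, no filtering or other pre-processing stage is needed at inference time, which is the mechanism behind the speed advantage the abstract reports over the Gaussian algorithm.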
