Abstract
Although walking methodologies (WMs) and machine learning (ML) have both attracted interest from urban scholars, research that integrates the two is scarce. We propose a ‘cyborg walk’ method and apply it to the study of litter in public spaces. Walking routes are created based on an unsupervised learning algorithm (k-means) used to classify public spaces. A deep learning model (YOLOv5) then collects data from geotagged photos taken by an automatic Insta360 X3 camera worn by human walkers. The image-recognition results reach accuracies between 83.7% and 95%, in line with values reported in the literature. The data collected by the machine are automatically georeferenced through the metadata generated by a GPS unit attached to the camera. WMs could benefit from the introduction of ML for informative route optimisation and for the quantification of georeferenced visual data. The links between these findings and the existing WM literature are discussed, reflecting on the parallels between this ‘cyborg walk’ experiment and the seminal cyborg metaphor proposed by Donna Haraway.
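The route-construction step rests on k-means clustering of public spaces. A minimal sketch of that idea is below; the two-dimensional feature vectors (e.g. pedestrian density, green coverage) and k = 2 are illustrative assumptions, not the paper's actual inputs.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign points to nearest centroid, then update centroids."""
    rng = np.random.default_rng(seed)
    # initialise centroids from k distinct input points
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # squared Euclidean distance from every point to every centroid
        dists = ((points[:, None] - centroids[None]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        # recompute each centroid as its cluster mean (keep old one if empty)
        new = []
        for j in range(k):
            members = points[labels == j]
            new.append(members.mean(axis=0) if len(members) else centroids[j])
        centroids = np.array(new)
    return labels, centroids

# Hypothetical features per public space: (pedestrian density, green coverage)
spaces = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
labels, centroids = kmeans(spaces, k=2)
```

A route could then be drawn so that it samples spaces from each resulting cluster, which is one way to read the paper's "informative route optimisation".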
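The detection-and-georeferencing pipeline can be sketched without the model itself: tally litter-like detections per photo, then convert the EXIF-style GPS metadata to decimal degrees. The class names, confidence threshold, and sample coordinates below are illustrative assumptions; the `detections` list stands in for the (class, confidence) pairs a detector such as YOLOv5 would emit.

```python
def count_litter(detections, classes=('bottle', 'cup', 'bag'), conf=0.5):
    """Count detections of litter-like classes whose confidence clears the threshold."""
    return sum(1 for name, c in detections if name in classes and c >= conf)

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
    dec = degrees + minutes / 60 + seconds / 3600
    return -dec if ref in ('S', 'W') else dec

# One hypothetical geotagged photo: mocked detections plus GPS metadata
photo = {
    'detections': [('bottle', 0.91), ('person', 0.88), ('cup', 0.42)],
    'gps': ((41, 23, 24.0, 'N'), (2, 9, 36.0, 'E')),
}
litter = count_litter(photo['detections'])  # only the bottle clears 0.5
lat = dms_to_decimal(*photo['gps'][0])
lon = dms_to_decimal(*photo['gps'][1])
```

Aggregating (litter, lat, lon) triples across photos yields the georeferenced litter counts the abstract describes.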