Abstract

Background: Personal belongings are often lost through human negligence or through criminal acts such as theft. The methods currently used to handle these situations remain manual and ineffective: reporting a lost item takes significant time and effort, matching the information of lost items with found ones is difficult, and tracing the original owners is time-consuming. Objectives and Methods: This research aims to develop an approach that helps the community manage lost items by incorporating an item-identification process. It proposes an iOS-based prototype model that implements image comparison and string matching. The ResNet-50 architecture extracts features from images, and the Euclidean distance method measures the similarity between these features. Natural language processing is used for text pre-processing, and the cosine similarity metric assesses textual similarity between item descriptions. Result and Conclusion: By combining the Euclidean distance and cosine similarity values, the model predicts similar lost-item reports. Image comparison achieves 29.96% accuracy, while string matching achieves 97.92%. Thorough testing and validation confirm the model's effectiveness across different reports.
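A minimal sketch of the matching pipeline described above, assuming a PyTorch/torchvision backend for ResNet-50 feature extraction and scikit-learn TF-IDF vectors for the text side; the pre-processing choices, helper names, and the score-combination weighting are illustrative assumptions, not the authors' exact implementation.

```python
import torch
from torchvision import models, transforms
from PIL import Image
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# ResNet-50 with its classification head replaced by an identity layer
# acts as a 2048-dimensional feature extractor (an assumed setup).
resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
resnet.fc = torch.nn.Identity()
resnet.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def image_features(path: str) -> torch.Tensor:
    """Extract a ResNet-50 feature vector from an image file."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return resnet(img).squeeze(0)

def image_distance(path_a: str, path_b: str) -> float:
    """Euclidean distance between two images' feature vectors (lower = more similar)."""
    return torch.dist(image_features(path_a), image_features(path_b)).item()

def text_similarity(desc_a: str, desc_b: str) -> float:
    """Cosine similarity between TF-IDF vectors of two item descriptions (0..1)."""
    vectors = TfidfVectorizer(lowercase=True, stop_words="english").fit_transform(
        [desc_a, desc_b]
    )
    return float(cosine_similarity(vectors[0], vectors[1])[0, 0])

def combined_score(img_dist: float, txt_sim: float, alpha: float = 0.5) -> float:
    """Blend both signals: map distance into (0, 1] and take a weighted average.
    This weighting scheme is an assumption made for illustration only."""
    img_sim = 1.0 / (1.0 + img_dist)
    return alpha * img_sim + (1.0 - alpha) * txt_sim
```

In such a setup, a new lost-item report would be scored against every stored found-item report with combined_score, and the highest-scoring candidates would be surfaced as likely matches.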
