Abstract

Artificial intelligence and robotics are increasingly used to prepare reagents in chemical experiments, but accurately grasping reagent bottles remains challenging when the targets are small and densely arranged, so precise localization and identification of the bottles is crucial. The wide variety of chemical reagents further complicates target detection. We therefore propose TA-YOLOv5, a model based on YOLOv5, for the reagent-bottle localization task. To enrich the dataset and prevent overfitting to a single background, we apply Mosaic and Mix-Up data augmentation during dataset preparation. Transformer Encoder and Coordinate Attention modules are incorporated to strengthen the feature representation of both small targets and global context during training, and the CIoU-NMS algorithm is used to refine prediction-box filtering in dense scenes. For the reagent-classification task, a CRNN (Convolutional Recurrent Neural Network) recognizes the character information on each bottle's label, enabling the bottles to be classified. Experimental results show that TA-YOLOv5 achieves 98.49% precision on our self-built reagent-bottle dataset, 4.12% higher than the baseline YOLOv5 model, and the CRNN reaches 97.6% accuracy on the IIIT-5K dataset.
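
The abstract only names the augmentation strategy; below is a minimal sketch of how Mix-Up is commonly applied to detection data, blending two images and keeping the boxes of both. The function name and the assumption that labels are stored as per-image box arrays (with a class column) are ours, not the paper's.

```python
import numpy as np

def mixup(img_a, boxes_a, img_b, boxes_b, alpha=8.0):
    """Blend two training images and concatenate their box labels.

    A standard detection-style Mix-Up: pixel values are mixed with a
    Beta-distributed weight, while the bounding boxes of both images
    are kept (each box retains its own class label). Assumes the two
    images have already been resized to the same shape.
    """
    lam = np.random.beta(alpha, alpha)  # mixing weight in (0, 1)
    mixed = (lam * img_a.astype(np.float32)
             + (1.0 - lam) * img_b.astype(np.float32))
    boxes = np.concatenate([boxes_a, boxes_b], axis=0)
    return mixed, boxes
```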
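Coordinate Attention is a published module (Hou et al., CVPR 2021) that factorizes global pooling into two 1-D pools along height and width, so the attention map preserves positional information useful for small objects. The PyTorch rendering below follows the original paper's standard structure; the `reduction` ratio and activation are defaults from that paper, not values confirmed by this abstract, and the paper under review may configure the module differently.

```python
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Coordinate Attention: direction-aware channel attention."""

    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # pool over width  -> (n, c, h, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # pool over height -> (n, c, 1, w)
        self.conv1 = nn.Conv2d(channels, mid, 1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, 1)
        self.conv_w = nn.Conv2d(mid, channels, 1)

    def forward(self, x):
        n, c, h, w = x.shape
        xh = self.pool_h(x)                      # (n, c, h, 1)
        xw = self.pool_w(x).permute(0, 1, 3, 2)  # (n, c, w, 1)
        # Share one bottleneck conv across both directional descriptors.
        y = self.act(self.bn(self.conv1(torch.cat([xh, xw], dim=2))))
        yh, yw = torch.split(y, [h, w], dim=2)
        ah = torch.sigmoid(self.conv_h(yh))                      # (n, c, h, 1)
        aw = torch.sigmoid(self.conv_w(yw.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        return x * ah * aw  # broadcast the two 1-D attention maps
```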
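CIoU-NMS replaces the plain IoU overlap test in greedy NMS with the Complete-IoU measure, which additionally penalizes center distance and aspect-ratio mismatch; this helps keep tightly packed bottles from suppressing one another. The NumPy sketch below uses the standard CIoU formulation; the threshold value is illustrative, not taken from the paper.

```python
import numpy as np

def ciou(box, boxes, eps=1e-9):
    """Complete-IoU between one box and an array of boxes in (x1, y1, x2, y2)."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    iou = inter / (area_a + area_b - inter + eps)

    # Squared distance between box centers.
    cx_a, cy_a = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    cx_b = (boxes[:, 0] + boxes[:, 2]) / 2
    cy_b = (boxes[:, 1] + boxes[:, 3]) / 2
    rho2 = (cx_a - cx_b) ** 2 + (cy_a - cy_b) ** 2

    # Squared diagonal of the smallest enclosing box.
    cw = np.maximum(box[2], boxes[:, 2]) - np.minimum(box[0], boxes[:, 0])
    ch = np.maximum(box[3], boxes[:, 3]) - np.minimum(box[1], boxes[:, 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Aspect-ratio consistency term.
    v = (4 / np.pi ** 2) * (
        np.arctan((box[2] - box[0]) / (box[3] - box[1] + eps))
        - np.arctan((boxes[:, 2] - boxes[:, 0]) / (boxes[:, 3] - boxes[:, 1] + eps))
    ) ** 2
    alpha = v / (1 - iou + v + eps)
    return iou - rho2 / c2 - alpha * v

def ciou_nms(boxes, scores, thresh=0.5):
    """Greedy NMS that suppresses boxes by CIoU instead of plain IoU."""
    order = scores.argsort()[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        if order.size == 1:
            break
        # Drop every remaining box whose CIoU with the kept box is too high.
        overlap = ciou(boxes[i], boxes[order[1:]])
        order = order[1:][overlap <= thresh]
    return keep
```

Because CIoU subtracts the center-distance and aspect-ratio penalties, two boxes with identical IoU are suppressed more aggressively when their centers nearly coincide, which is the behavior wanted in dense bottle arrangements.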
