Abstract

Funding agencies play a pivotal role in supporting research by allocating financial resources for data collection and analysis. However, a lack of detail about the methods used to gather and analyze data can hinder replication and reuse of the results, ultimately undermining a study's transparency and integrity. Manually annotating large datasets demands considerable labor and expense, especially when it requires engaging specialized annotators. In our crowd counting study, we used the web-based annotation tool SuperAnnotate to streamline human annotation of a dataset of 3,000 images. By integrating automated annotation tools, we achieved substantial time savings, producing 858,958 annotations in total. This underscores the significant contribution such technologies make to the efficiency of the annotation process.
