Abstract

Crowd management is an essential task for ensuring the safety and smooth running of any event. Novel technologies, including surveillance cameras, drones, and communication techniques between security agents, have made crowd control easier, yet their use is still not fully effective. This article presents an approach for crowd counting from drone data. The proposed method exploits dilated and scaled convolutional neural networks for feature extraction and crowd density estimation. A new dataset named ViseDrone2020 is used for training and testing the proposed method. For comparison, we implemented 10 state-of-the-art methods and trained them on this dataset. The experiments show that the proposed model is more efficient for crowd counting than the implemented methods, although some of these methods give relatively accurate results in terms of both the estimated crowd counts and the quality of the estimated density maps. The proposed model was further evaluated on nondrone datasets, namely, UCF_QNRF, UCF_CC_50, and ShanghaiTech (Parts A and B), producing satisfying results on all of them. In addition, the proposed method was tested on noisy images, where Gaussian noise and salt-and-pepper noise were applied to all images of the dataset with a noise density of 0.02. The analysis showed that, even in the presence of noise, the quality of the estimated density maps and the accuracy of the crowd counts remain comparable to those of existing methods evaluated without noise. After acceptance, the code of the proposed method as well as the 10 implemented methods will be available at https://github.com/elharroussomar/Crowd-Conting-on-Drone-Data.
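As a rough illustration of the noise-robustness experiment described above, the sketch below applies Gaussian noise and salt-and-pepper noise to an image with NumPy. The salt-and-pepper density of 0.02 is the value stated in the abstract; the Gaussian standard deviation is an assumed value, since the abstract does not specify it.

```python
import numpy as np

def add_gaussian_noise(img, sigma=0.1, rng=None):
    """Add zero-mean Gaussian noise to an image with values in [0, 1].

    sigma is an assumed standard deviation; the paper's abstract does
    not state the Gaussian noise parameters.
    """
    rng = np.random.default_rng() if rng is None else rng
    noisy = img + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0.0, 1.0)

def add_salt_pepper_noise(img, density=0.02, rng=None):
    """Corrupt a `density` fraction of pixels, half to 1.0 (salt),
    half to 0.0 (pepper), matching the 0.02 density used in the paper."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = img.copy()
    corrupt = rng.random(img.shape[:2]) < density   # which pixels to flip
    salt = rng.random(img.shape[:2]) < 0.5          # salt vs. pepper choice
    noisy[corrupt & salt] = 1.0
    noisy[corrupt & ~salt] = 0.0
    return noisy
```

Both functions operate on grayscale arrays of shape (H, W); for color images of shape (H, W, 3), the salt-and-pepper variant corrupts all channels of a selected pixel together, which is the usual convention for this noise model.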
