Abstract

Most disaster responses require manual processing to eliminate errors, a task currently handled by the government. Although this process ensures the quality of the information, it introduces considerable time delays. With the rapid development of the Internet and mobile devices, crowd-based platforms have been developed and used for disaster response. However, crowdsourcing has three obvious drawbacks: errors, duplications, and unstructured formats. This study develops a computational method, called the Artificial and Crowd Intelligence (ACI) filter, to overcome these drawbacks. To verify the ACI filter, 876 responses collected from an actual flood that occurred on June 10th, 2012, in Taiwan were used, and 284 volunteers recruited from the Internet participated in the testing. The results show that the ACI filter eliminated 26.25% of the responses; the rate of mistaken validations was 0.00% and the rate of mistaken eliminations was 3.91%. This demonstrates that the ACI filter, by combining machine and human intelligence, can successfully improve the accuracy of crowd responses.
