Abstract

Mobile advertisements influence customers' in-store purchases and boost in-store sales for brick-and-mortar retailers. Targeted mobile advertising has become increasingly important for competing with online shopping. The key to enabling targeted mobile advertisements and services is to learn shoppers' interests during their stay in the store, and precise shopper tracking and identification are essential to gain these insights. However, existing sensor-based or vision-based solutions are neither practical nor accurate, and no commercial solution today can be readily deployed in a large store. On the other hand, we recognize that most retail stores already have surveillance cameras installed, and most shoppers carry Bluetooth-enabled smartphones. Thus, in this paper, we propose TAR to learn shoppers' in-store interests via accurate multi-camera people tracking and identification. TAR leverages widespread camera deployments and Bluetooth proximity information to accurately track and identify shoppers in the store. TAR is composed of four novel design components: (1) deep neural network (DNN) based visual tracking, (2) user trajectory estimation from shoppers' visual and BLE proximity traces, (3) identity matching and assignment to recognize each shopper's identity, and (4) a cross-camera calibration algorithm. TAR carefully combines these components to track and identify shoppers in real time. TAR achieves 90% accuracy in two different real-life deployments, which is 20% better than the state-of-the-art solution.
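The core idea behind component (3), matching a visual trajectory to a BLE identity, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes each visual track yields a time-aligned sequence of estimated camera-to-shopper distances, each BLE trace yields a comparable RSSI-derived proximity sequence, and matching is done greedily by similarity (the function and variable names are hypothetical).

```python
def proximity_similarity(visual_trace, ble_trace):
    """Negative mean absolute difference between time-aligned proximity
    estimates (in meters); higher means the traces agree more closely."""
    n = min(len(visual_trace), len(ble_trace))
    return -sum(abs(v - b) for v, b in zip(visual_trace, ble_trace)) / n

def match_identities(visual_traces, ble_traces):
    """Greedily assign each BLE device ID to the visual track whose
    proximity trace it resembles most, one-to-one."""
    pairs = sorted(
        ((proximity_similarity(vt, bt), vid, bid)
         for vid, vt in visual_traces.items()
         for bid, bt in ble_traces.items()),
        reverse=True,
    )
    used_tracks, used_devices, matches = set(), set(), {}
    for _score, vid, bid in pairs:
        if vid not in used_tracks and bid not in used_devices:
            matches[vid] = bid
            used_tracks.add(vid)
            used_devices.add(bid)
    return matches
```

In practice a system like TAR would need to handle unequal trace lengths, missing observations, and ambiguous matches across cameras; an optimal assignment (e.g. the Hungarian algorithm) could replace the greedy step.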
