In the past 10 years, the concept of smart homes and buildings has become increasingly mainstream with the introduction of several commercial solutions and available sensor technologies. One of the most important aspects of a smart home system is sensing, which is typically leveraged to observe, understand, and predict the daily behavior of the occupant. This sensing of a smart home's occupant extends to personal location estimation, behavior detection, and activity monitoring. However, there is always a trade-off between the amount of information that can be sensed and the occupant's loss of privacy. Cameras coupled with state-of-the-art computer vision algorithms can provide far more nuanced and rich information than binary sensors such as motion sensors and door sensors. Unfortunately, cameras are not well tolerated by people in their homes and other private areas because of privacy concerns. In this article, we present an intensive study of a dataset gathered in a smart home testbed equipped with binary motion sensors. Leveraging our previous work on a concurrent activation model that describes an occupant's mobility, along with the newly defined continuity property of an occupant's mobility, we propose a bipartite sensor graph with intersection sensor nodes that models an occupant's high-level mobility. We then develop a trajectory propagation algorithm that can generate an occupant's most probable trajectories and locations. We show that, by using the concurrent activation events of a binary motion sensor network, we can extract useful high-level details of an occupant's behavior.
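To give a rough idea of the kind of structure the abstract refers to, the sketch below is a minimal, hypothetical illustration (not the authors' implementation): intersection nodes of a bipartite sensor graph are derived from concurrent activation events of adjacent motion sensors, and candidate trajectories are propagated along intersections that share a sensor, a simple stand-in for the continuity property. All sensor IDs, function names, and the specific continuity check are assumptions made for illustration only.

```python
# Hypothetical sketch of a bipartite sensor graph and trajectory propagation.
from collections import defaultdict
from itertools import combinations

# Concurrent activation events: sets of sensor IDs observed firing within the
# same short time window (assumed to come from a preprocessing step).
concurrent_events = [
    {"M01", "M02"}, {"M02", "M03"}, {"M03", "M04"}, {"M02", "M03"},
]

# One side of the bipartite graph holds sensor nodes; the other holds
# "intersection" nodes, each representing the overlap region of two sensors
# that were observed firing concurrently.
sensor_to_intersections = defaultdict(set)
for event in concurrent_events:
    for a, b in combinations(sorted(event), 2):
        intersection = (a, b)  # intersection node identified by the sensor pair
        sensor_to_intersections[a].add(intersection)
        sensor_to_intersections[b].add(intersection)

def propagate_trajectories(activation_sequence):
    """Enumerate candidate trajectories over intersection nodes.

    A step from one intersection to the next is allowed only if the two
    intersections share a sensor (a crude continuity constraint).
    """
    trajectories = [[]]
    for sensor in activation_sequence:
        extended = []
        for traj in trajectories:
            for inter in sensor_to_intersections[sensor]:
                if not traj or set(traj[-1]) & set(inter):  # continuity check
                    extended.append(traj + [inter])
        trajectories = extended or trajectories  # keep old ones if no extension
    return trajectories

# Example: a walk that triggers M01, M02, M03 in order.
for traj in propagate_trajectories(["M01", "M02", "M03"]):
    print(" -> ".join("|".join(node) for node in traj))
```

In this toy version, ranking the enumerated trajectories (e.g., by how many concurrent-activation constraints each one satisfies) would play the role of selecting the occupant's most probable trajectory; the paper's actual model and scoring are described in the full text.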