Abstract

The emergence and penetration of smart mobile devices have given rise to context-aware systems that utilize sensors to collect available data about users in order to improve various user services. Recently, the use of context-aware recommender systems (CARS), which take the user's context into account when recommending items, has expanded. Adding context to a recommender system is challenging, because incorporating various environmental contexts into the recommendation process expands its dimensionality and thus increases sparsity. Therefore, existing CARS tend to incorporate a small set of pre-defined explicit contexts that do not necessarily represent the user's context or reflect the optimal set of features for the recommendation process. We suggest a novel approach centered on representing environmental features as low-dimensional, unsupervised latent contexts. We extract data from a rich set of mobile sensors in order to infer unexplored user contexts in an unsupervised manner. The latent contexts are hidden context patterns modeled as numeric vectors that are efficiently extracted from raw sensor data. They are automatically learned for each user by applying unsupervised deep learning techniques and PCA to the data collected from the user's mobile phone. Integrating the data extracted from high-dimensional sensors into a new latent context-aware recommendation algorithm results in up to a 20% increase in recommendation accuracy.
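To make the idea concrete, the following is a minimal sketch of one of the dimensionality-reduction steps mentioned above: compressing raw mobile-sensor feature vectors into low-dimensional latent context vectors with PCA. This is not the authors' implementation; the sensor features, dimensions, and synthetic data are illustrative assumptions.

```python
# Illustrative sketch only: reduce raw mobile-sensor feature vectors to a
# low-dimensional "latent context" per session using PCA. The feature count,
# latent dimension, and synthetic data are assumptions for demonstration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical raw sensor snapshot per user session (e.g., accelerometer
# statistics, ambient light, noise level, battery level, time-of-day encoding).
n_sessions, n_raw_features = 500, 60
raw_sensor_data = rng.normal(size=(n_sessions, n_raw_features))

# Learn a compact, unsupervised latent context representation (10 dimensions here).
pca = PCA(n_components=10)
latent_contexts = pca.fit_transform(raw_sensor_data)  # shape: (500, 10)

# Each row is a numeric latent context vector that could be fed, together with
# user and item identifiers, into a context-aware recommendation model.
print(latent_contexts.shape, pca.explained_variance_ratio_.sum())
```

In the same spirit, the paper's deep-learning variant would replace PCA with an unsupervised model such as an autoencoder, using the bottleneck activations as the latent context vector.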
