Abstract

Accurately tailored support, such as advice or assistance, can increase user satisfaction with smart devices; however, achieving high accuracy requires the device to obtain and exploit private user data, which may jeopardize confidential user information. We analyze this privacy–accuracy trade-off. We assume two positive correlations: a user’s utility from a device is positively correlated both with the user’s privacy risk and with the quality of the advice or assistance the device offers. The extent of the privacy risk is unknown to the user, so privacy-concerned users may choose not to interact with devices they deem unsafe. We suggest that during the first period of usage, the device should not employ its full advice or assistance capability, since doing so may deter users from adopting it. Using three analytical propositions, we derive an optimal policy for a smart device’s exploitation of private data in its interactions with users.

Highlights

  • Personal data can improve the user experience, as users receive personally tailored advice or assistance

  • We propose a model showing that the user–device interaction exhibits a distinctive pattern for users who are moderately concerned about their privacy

  • We assume that the probability of receiving tailored support depends on the accuracy level q of the device’s support algorithm; for high-privacy-risk devices it is twice as probable as for low-privacy-risk devices, i.e., $P_a(q, H) = q$ and $P_a(q, L) = 0.5q$ (see the sketch after this list)
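
A minimal sketch of this assumption in Python. Only the probabilities $P_a(q, H) = q$ and $P_a(q, L) = 0.5q$ come from the paper; the expected-utility comparison, including the benefit and privacy-cost parameters `b` and `c`, is a hypothetical illustration of the privacy–accuracy trade-off, not the authors' model.

```python
# Sketch of the tailored-support probability assumption.
# From the paper: P_a(q, H) = q and P_a(q, L) = 0.5 * q,
# where q is the accuracy level of the device's support algorithm.

def p_adjusted(q: float, risk: str) -> float:
    """Probability that the user receives tailored support."""
    if not 0.0 <= q <= 1.0:
        raise ValueError("accuracy q must be in [0, 1]")
    if risk == "H":        # high-privacy-risk device
        return q
    if risk == "L":        # low-privacy-risk device
        return 0.5 * q
    raise ValueError("risk must be 'H' or 'L'")

# Hypothetical illustration (not from the paper): expected utility when a
# benefit b is gained from tailored support and a privacy cost c is
# incurred only with the high-risk device.
def expected_utility(q: float, risk: str, b: float = 1.0, c: float = 0.3) -> float:
    cost = c if risk == "H" else 0.0
    return p_adjusted(q, risk) * b - cost

if __name__ == "__main__":
    for risk in ("H", "L"):
        print(risk, p_adjusted(0.8, risk), round(expected_utility(0.8, risk), 2))
```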


Introduction

Personal data can improve the user experience, as users receive personally tailored advice or assistance. We consider devices that employ private user information collected from sensors or sensor-equipped devices. By wearing a smart watch, for example, a user shares information such as location, heartbeat, and movements. After an initial period of use, once the device has collected the data used to offer advice or assistance, users may become aware of the potential privacy risk. When the device is not trusted, privacy risks and privacy violations may lead users to abandon the device (known as customer churn [5]). Considerations of privacy risk therefore affect how users interact with sensors and sensor-equipped devices. Our results are not limited to any specific device; they hold for any sensor or sensor-equipped device that utilizes information through which user privacy might be compromised. We conclude with a discussion of our main findings (Section 5).

