Abstract

Privacy-by-Design (PbD) suggests designing the fundamental architecture and features of computing systems with privacy in mind. Although PbD has been widely adopted by regulatory frameworks, a growing number of critics question whether its focus on compliance with privacy regulation prevents it from addressing users' specific privacy attitudes and expectations. Motivated to enhance user-centered privacy-by-design processes, we examine the consequences of how privacy questions are framed to crowd users, and how crowd users' personal characteristics impact their responses. We recruited a total of 665 participants, of whom 456 were recruited via Amazon Mechanical Turk (AMT) and 209 were university students. We show that framing computing systems' features as data flows results in less critical evaluations of those features than framing them as descriptions of personal experiences. We also found, based on the student sample, that students with professional engineering experience are less critical than those with no work experience when assessing the features' appropriateness. We discuss how our results can be used to enhance privacy-by-design processes and encourage user-centered privacy engineering.
