“Notice and consent” is a widely used model of privacy protection that relies on users’ ability to interpret an organization’s written description of its information practices through a privacy policy and to choose whether to participate. Organizations using this model have long come under harsh criticism for providing long, legalistic notices and confusing, unclear choices. In the past few years, Facebook has sought to address these issues through privacy settings such as the “Privacy Publisher tool.” These mechanisms are new and untested, but they could potentially vindicate notice and consent by using graphics, metrics and other mechanisms to better inform and facilitate privacy decisions. The study will test the efficacy of these mechanisms in providing appropriate protection and, in doing so, evaluate their feasibility as a model for notice and consent. To conduct this evaluation, the thesis will employ a behavioral survey of how site visitors use these settings to understand their privacy risk and make decisions based on personal preferences. The study hypothesizes that, while privacy settings may provide some protection, they leave most users unprotected from many significant privacy harms.

The survey itself will consist of three sections. The first section will test respondents’ ability to use the privacy settings to protect against specific harms: the survey will present respondents with mock privacy pages and prompt them to choose settings in response to a privacy issue. One prompt might be: “Choose Facebook settings that would only allow friends to tag photos of you.” The results could indicate whether the settings discourage certain kinds of protections, such as content-sharing restrictions on Facebook. The second section will examine how respondents currently apply their privacy preferences on Facebook. Respondents will begin by submitting their Facebook privacy settings via multiple-choice answers and will then answer questions about their preferences for each privacy practice. This section will allow the study to compare users’ opinions on the issues with the settings they actually selected, an important indicator for evaluating the privacy settings. Finally, the third section will collect information on demographics and Facebook usage, such as how often a user checks Facebook for updates or posts to his or her wall.

The final report will include a statistical analysis of these results and a discussion of the conclusions. The study will conduct an analysis of variance, for instance, to understand how significant the gaps are between respondents’ selected settings and the settings that would actually protect their preferences. The study will also employ regression techniques to understand which privacy settings face similar usability obstacles and which demographics face greater risks to privacy.
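As a rough illustration of how such an analysis might proceed, the sketch below runs a one-way ANOVA and a regression on a mock data set. The variable names (gap_score, age_group, usage_freq) and values are hypothetical stand-ins for however the survey responses are ultimately coded; this is not the study’s actual code or data.

```python
# Hypothetical sketch of the planned analysis, assuming survey responses are
# coded in a pandas DataFrame with one row per respondent. Column names and
# values are illustrative only, not drawn from the study.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# gap_score is assumed to measure the distance between a respondent's chosen
# settings and the settings that would actually protect their stated preferences.
df = pd.DataFrame({
    "gap_score":  [2, 0, 3, 1, 4, 2, 0, 3],
    "age_group":  ["18-24", "25-34", "18-24", "35-44", "25-34", "18-24", "35-44", "25-34"],
    "usage_freq": ["daily", "weekly", "daily", "monthly", "daily", "weekly", "daily", "weekly"],
})

# One-way ANOVA: does the preference-settings gap differ across demographic groups?
anova_model = smf.ols("gap_score ~ C(age_group)", data=df).fit()
print(sm.stats.anova_lm(anova_model, typ=2))

# Regression: which demographic and usage factors predict a larger gap,
# i.e., a greater risk that settings fail to match stated preferences?
reg_model = smf.ols("gap_score ~ C(age_group) + C(usage_freq)", data=df).fit()
print(reg_model.summary())
```

Any real analysis would, of course, depend on how the survey responses are coded and on the sample the study obtains.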
David Krone is a recent graduate of Georgetown University’s Communications, Culture and Technology Program. David conducted research on Facebook privacy as part of his thesis, with Georgetown Professors Mark MacCarthy and Diana Owen as advisors. In addition, David works full time as a Cybersecurity and Privacy Analyst at the Office of Personnel Management, where he holds several privacy-related responsibilities, ranging from assessing IT system security risks to conducting system privacy threshold analyses (PTAs) and compiling reports on breaches involving personally identifiable information (PII). David holds the CIPP/G (Certified Information Privacy Professional – Government) certification. Finally, as a Georgetown undergraduate, David studied the intersection of society and information and communications technologies by majoring in Science, Technology and International Affairs and minoring in Computer Science.