Abstract

After Edward Snowden’s revelations, encryption of online communications at a large scale and in a usable manner has become a matter of public concern. The most advanced and popular among recently developed encryption protocols is currently the Signal protocol. While the Signal protocol is widely adopted and considered an improvement over its predecessors, it remains officially unstandardised, even though an informal draft has been elaborated towards that goal. The analysis of how this protocol was introduced and swiftly adopted by various applications, and of the subsequent transformations of the encrypted messaging ecosystem, sheds light on how a particular period in the history of secure messaging has been marked by a “de facto standardisation.” What can we learn about existing modes of governance of encryption, and about the histories of traditional standardisation bodies, by analysing the approach of “standardisation by running code” adopted by Signal? And finally, how does the Signal protocol challenge a “linear,” evolution-based vision of messaging history? Drawing on a three-year qualitative investigation of end-to-end encrypted messaging, from a perspective informed by science and technology studies (STS), we seek to unveil the ensemble of processes that make the Signal protocol a quasi-standard.
