Abstract
As scientists, we are proud of our role in developing the current digital age, which enables billions of people to communicate rapidly via social media. However, when things go wrong, we are also responsible for taking an ethical stand and trying to solve the resulting problems, and this work aims to take a step in that direction. Our goal is to lay the foundation for a mathematically formal study of how we might regulate social media and, in particular, address the echo chamber effect. An echo chamber is a closed system in which other voices are excluded by omission, causing one's beliefs to become amplified and reinforced. In turn, these bubbles can boost social polarization and extreme political views, and, unfortunately, there is strong evidence that echo chambers exist in social media. The fundamental question we try to answer is: can a regulation "break" or reduce the echo chamber effect in social media, and if so, how? Sadly, the paper's main result is an impossibility result: a general regulation function that achieves this goal (in our social media model) while obeying the core values of democratic societies (freedom of expression and user privacy) does not exist. This result leaves us with hard choices to make in the future.