Abstract
Disinformation and propaganda are characterised by the fact that the information: (i) is designed to be totally or partially false, manipulated or misleading, or uses unethical persuasion techniques; (ii) concerns a matter of public interest; (iii) is intended to generate insecurity, hostility or polarization, or attempts to undermine democratic processes; (iv) is disseminated and/or amplified through automated and aggressive tools, such as social bots, artificial intelligence, micro-targeting or paid human trolls, often with the aim of increasing the public visibility of the content. The systematic dissemination of disinformation by active politicians, parties or authorities, in particular, is a clear and immediate threat to democracy and disregards the values of the European Union enshrined in Article 2 TEU, because trust in such authoritative figures is a value choice that cannot easily be countered by rational argument. Moreover, deepfakes (audio-visual content manipulated or generated by artificial intelligence so that people appear to say or do things they never said or did) present a significant challenge for democracy, because they may sow uncertainty which may, in turn, reduce trust in news on social media and hinder civic participation in online debates. Finally, a study commissioned by the European External Action Service, published in 2021, focused on two further categories of disinformation: 'influence operations' by third countries, aimed at influencing a target audience through a range of illegitimate and deceptive means, and 'foreign interference', aimed at disrupting the free formation and expression of political will. Among the many issues addressed by the EP resolution of 1 June 2023 on foreign interference in all democratic processes in the European Union, particular mention must therefore be made of the recognition that foreign interference, including disinformation, is a national and cross-border security threat. Consequently, the EP has stressed the need for solidarity between the Member States so that such activities can be effectively combated, and has called for Article 222 TFEU (the solidarity clause) to be amended so as to include foreign interference. With respect to disinformation, in fact, a process of securitization is being promoted, consisting of applying security tools and discourses to an object that was not previously identified as a security matter. An example of this trend is the specific task force set up within the European External Action Service to address Russia's ongoing disinformation campaigns. Another example, showing that disinformation has become a CFSP issue, is Council Regulation (EU) 2022/350 of 1 March 2022, based on Council Decision (CFSP) 2022/351, concerning restrictive measures in view of Russia's actions destabilizing the situation in Ukraine. However, when determining the focus and political actions of the EU against disinformation, two opposing logics – securitization and self-regulation – coexist and compete. As a result, the EU is promoting a discourse linking disinformation to security, exceptionality and geopolitical strategies, while at the same time remaining lax about the obligations and responsibilities of the digital platform companies.
Received: 25 December 2023 / Accepted: 25 February 2024 / Published: 23 April 2024