Abstract
2022 has been called the year of “generative AI,” and the surge of interest in ChatGPT in 2023 shows that the popularity of neural networks among the mass audience is not a passing fad but a lasting trend. The number and quality of neural network models are growing, accelerating the digital transformation of society and its subsystems toward a new paradigm, Society 5.0. In this study, the authors analyze the functional potential of generative AI in the fields of social development, mass communication, and audiovisual media and identify several serious ethical challenges and dilemmas directly related to the operationalization of this digital technology. Alongside the techno-optimistic concepts of “digital happiness” and the “super smart society,” and the enthusiasm for the speed and efficiency of technologies built on foundation models (FM) and large language models (LLM), even the most progressive adherents of Society 5.0 do not deny that AI, among other factors, generates a number of risks for society and for humans. These include the threat of rapid and convincing multiplication of fake news and narratives, excessive dependence on the technology, the risk of copyright infringement and related ethical dilemmas, challenges to freedom of creativity, the instrumentalization of intellectual and artistic (including audiovisual) practices, digital manipulation of consciousness, and social exclusion associated with the deformation of the existential foundations of human life and communication.