Abstract

While most contemporary researchers argue that Artificial Intelligence (AI) provides profound technological advancements and beneficial tools used daily to advance human life on earth, a host of other researchers hold contrary views, pointing to the rising adverse ontological and existential consequences that the products of super-intelligent technologies have on an alarming number of people in the 21st century. Scholars such as Vardi, Tegmark and Greene have likened this scenario to a time bomb waiting to go off at any moment. The recent endorsement of 23 AI principles by 1,200 AI/robotics researchers and over 2,342 researchers from other disciplines at a recently concluded Future of Life Institute (FLI) conference lends credence to the worries many researchers have about the presumed benefits of AI to mankind. The study draws on a combination of Marxian alienation theory and ontological theories, which hold that rising advancements in AI technologies continue to alienate mankind from its existential human nature. The ex post facto research design of the social sciences and Derrida's deconstructive and critical-reconstructive analytic method in philosophy, for interrogating the meaning of concepts, arguments and current debates on the relevance and risks of AI, were adopted for the study. The study identified justifiable grounds and reasons for the alarm raised over current innovations in the field of AI technologies. It strengthens the resolve of current researchers to identify ways of reducing or avoiding the impending adverse consequences of evolving conscious AI and machines in the future. The research proposes the radical legal enforcement and adoption of the 23 newly established AI principles as one of the pertinent measures for saving mankind from the impending threats posed by innovations powered by advances in AI technology.
