The rapid expansion of the computer technology industry, particularly in the field of artificial intelligence, has ignited global concern that warrants immediate action. As nurses, our professional values frameworks compel us to protect public health and to address national and global health issues. When industry activities adversely affect the wellbeing of civil society and social institutions, it is important to evaluate them against the industry's 'social license to operate': a measure of public trust, credibility, and the legitimacy of its industrial and corporate citizenship. The central question is: do computer technology companies continue to have a social license to operate in civil society? Nurses are encouraged to evaluate the computer technology industry's recent 'generative artificial intelligence' chatbot activities against its tacit undertaking to be a good corporate citizen in return for social acceptance of its operations and behaviour. An evidence-based overview of chatbot impacts on societies, environmental sustainability, and human rights provides a basis for this evaluation, and basic computer technology terminology and relevant concepts are explained. This article is a direct call to action for clinical nurses and those involved in research, education, management, and policy. We have a duty to critically assess the claims made by chatbot technology vendors in both practice and social contexts. If vendors integrate chatbot technologies with the machine learning already used in nursing and healthcare technologies, the result could be detrimental effects beyond user control. By influencing decisions on technology adoption, nurses can ensure the implementation of safeguards, protect patient safety and social wellbeing, and uphold the integrity of nursing values. A closing discussion of the impacts of computer industry trust deficits on healthcare and research reflects the author's concerns and conclusions about the central question.
Readers may draw other conclusions and perhaps use the issues and evidence presented here to stimulate further investigations.