Abstract

Background

Several studies highlight the effects of artificial intelligence (AI) systems on healthcare delivery. AI-based tools may improve prognosis, diagnostics, and care planning. AI is expected to become an integral part of healthcare services in the near future and to be incorporated into several aspects of clinical care. Accordingly, many technology companies and governmental projects have invested in producing AI-based clinical tools and medical applications. Patients can be among the most important beneficiaries and users of AI-based applications, and their perceptions may affect the widespread adoption of AI-based tools. Patients must be assured that they will not be harmed by AI-based devices and that they will instead benefit from using AI technology for healthcare purposes. Although AI can enhance healthcare outcomes, the possible dimensions of concern and risk should be addressed before its integration into routine clinical care.

Methods

We develop a model based mainly on value perceptions, given the specificity of the healthcare field. This study examines the perceived benefits and risks of AI medical devices with clinical decision support (CDS) features from consumers’ perspectives. We used an online survey to collect data from 307 individuals in the United States.

Results

The proposed model identifies the sources of motivation and pressure for patients in the development of AI-based devices. The results show that technological, ethical (trust factors), and regulatory concerns significantly contribute to the perceived risks of using AI applications in healthcare. Of the three categories, technological concerns (i.e., performance and communication features) are the most significant predictors of risk beliefs.

Conclusions

This study sheds more light on the factors affecting perceived risks and proposes recommendations on how to reduce these concerns in practice.
The findings of this study provide implications for research and practice in the area of AI-based CDS. Regulatory agencies, in cooperation with healthcare institutions, should establish normative standards and evaluation guidelines for the implementation and use of AI in healthcare. Regular audits and ongoing monitoring and reporting systems can be used to continuously evaluate the safety, quality, transparency, and ethical dimensions of AI-based services.

Highlights

  • Several studies highlight the effects of artificial intelligence (AI) systems on healthcare delivery

  • Respondents were fairly young: 67.1% were younger than 40 years old

  • 80% indicated that their computer skills were good or excellent, and 74% rated their technical knowledge of AI as average or good


Introduction

Several studies highlight the effects of artificial intelligence (AI) systems on healthcare delivery. AI is expected to become an integral part of healthcare services in the near future and to be incorporated into several aspects of clinical care. Many technology companies and governmental projects have invested in producing AI-based clinical tools and medical applications. Patients must be assured that they will not be harmed by AI-based devices and that they will instead benefit from using AI technology for healthcare purposes. AI generally refers to a computerized system (hardware or software) that can perform physical tasks and cognitive functions, solve various problems, or make decisions without explicit human instructions [1]. AI can replace human tasks and activities across a wide range of industrial, intellectual, and social applications, with resulting impacts on productivity and performance. The value of using AI tools is perceived as a trade-off between possible benefit and risk: when the benefit outweighs the risk, greater value is perceived in using the technology.

