Abstract
Chronic pain is a significant source of suffering, disability, and societal cost in the US. Although the ability to detect a person's risk of developing persistent pain is desirable for timely assessment, management, treatment, and reduced health care costs, no objective measure of clinical pain intensity exists. Recent Artificial Intelligence (AI) methods have been deployed in clinical decision-making and assessment tools to enhance pain risk detection across core social and clinical domains. Yet risk assessment models are only as "good" as the data they are based on, so ensuring fairness is a critical component of equitable care in both the short and long term. This paper takes an intersectional and public health approach to AI fairness in the context of pain and invisible disability. It suggests that computational ethnography is a multimodal and participatory real-world data (RWD) methodology that can be used to enhance the curation of intersectional knowledge bases, thereby expanding the existing boundaries of AI fairness in terms of inclusiveness and transparency for pain and invisible disability use cases.