Abstract

Background: Artificial intelligence services, such as ChatGPT (generative pre-trained transformer), can provide parents with tailored responses to their pediatric orthopaedic concerns. We undertook a qualitative study to assess the accuracy of the answers provided by ChatGPT for common pediatric orthopaedic conditions in comparison to OrthoKids ("OK"), a patient-facing educational platform governed by the Pediatric Orthopaedic Society of North America (POSNA).

Methods: A cross-sectional study was performed from May 26 to June 18, 2023. The OK website (orthokids.org) was reviewed, and 30 existing questions were collected. The corresponding OK and ChatGPT responses were recorded. Two pediatric orthopaedic surgeons assessed each answer provided by ChatGPT against the OK response. Answers were graded as AGREE (accurate information; question addressed in full), NEUTRAL (accurate information; question not answered), or DISAGREE (information was inaccurate or could be detrimental to patients' health). The evaluators' responses were compiled, and discrepancies were adjudicated by a third pediatric orthopaedist. Additional characteristics of the chatbot's answers, such as unprompted treatment recommendations, bias, and referral to a healthcare provider, were recorded. Data were analyzed using descriptive statistics.

Results: The chatbot's answers were graded as AGREE for 93% of questions. Two responses were graded NEUTRAL. No responses were graded DISAGREE. Unprompted treatment recommendations were included in 55% of its responses (excluding treatment-specific questions). The chatbot encouraged users to "consult with a healthcare professional" in all responses, with a nearly even split between recommending a generic provider (46%) and specifically recommending a pediatric orthopaedist (54%). The chatbot's provider recommendations were inconsistent across related topics; for example, it recommended a pediatric orthopaedist in only 3 of 5 spine conditions.

Conclusion: The chatbot's answers to questions about common pediatric orthopaedic conditions agreed with those of a specialty society-governed website. The knowledge that chatbots deliver appropriate responses is reassuring. However, the chatbot frequently offered unsolicited treatment recommendations while inconsistently recommending an orthopaedic consultation. We urge parents to exercise caution when using artificial intelligence without also consulting a healthcare professional.

Level of Evidence: IV

Key Concepts
•Artificial intelligence chatbots are becoming increasingly popular, as demonstrated by the rapid rise of publications on the topic in the last 3 months, and they represent a novel online patient education platform.
•Across 30 common pediatric orthopaedic conditions, >90% of the chatbot's responses were judged to be in agreement with a specialty society's parent- and patient-facing education platform.
•The chatbot's responses were largely unbiased and referred patients to a healthcare professional. However, the responses lacked references or cited sources for the provided information.
