Abstract

The aim of this study was to compare levels of agreement amongst paediatric clinicians with those amongst consultant paediatric radiologists when interpreting chest radiographs (CXRs). Four paediatric radiologists used picture archiving and communication system (PACS) workstations to evaluate each of 30 CXRs independently for the presence of five radiological features of infection. The radiographs were obtained over 1 year (2008) from children aged 6 months to <16 years with fever and signs of respiratory distress. The same CXRs were interpreted a second time by the paediatric radiologists and by 21 clinicians of varying experience, using the Web 1000 viewing system and a projector. Intra- and interobserver agreement within groups, split by grade and specialty, was analysed using free-marginal multi-rater kappa. Normal CXRs were identified consistently by all 25 participants. The four paediatric radiologists showed high intraobserver agreement between methods (kappa scores between 0.53 and 1.00) and high interobserver agreement for each method (kappa scores between 0.67 and 0.96 for PACS assessment). The 21 clinicians showed varying levels of agreement, from 0.21 to 0.89. The paediatric radiologists showed high levels of agreement for all features; in general, the clinicians agreed less often than the radiologists. This study highlights the need for improved training for clinicians in interpreting CXRs, and for timely reporting of CXRs by radiologists, to allow appropriate patient management.
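For readers unfamiliar with the statistic: free-marginal multi-rater kappa (Randolph's kappa, equivalent to the Brennan-Prediger coefficient extended to multiple raters) assumes each rating category is equally likely by chance. A minimal sketch of the computation follows, using notation that is not taken from the paper (N cases, n raters per case, k categories, and n_ij the number of raters placing case i in category j):

\[
\bar{P}_o = \frac{1}{N\,n(n-1)} \sum_{i=1}^{N} \sum_{j=1}^{k} n_{ij}\,(n_{ij}-1),
\qquad
\kappa_{\mathrm{free}} = \frac{\bar{P}_o - 1/k}{1 - 1/k}.
\]

If each feature was rated as simply present or absent (k = 2, an assumption the abstract implies but does not state), chance agreement is 1/2, so a reported kappa of 0.67 would correspond to raw agreement of roughly 0.5 + 0.5 x 0.67 ≈ 0.84.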
