Comparing clinical gait analysis (CGA) data between clinical centers is critical in the treatment and rehabilitation process. However, CGA protocols and system configurations, as well as the choice of marker sets and individual variability in marker attachment, may affect the comparability of data. The aim of this study was to evaluate the reliability of CGA data collected at two separate gait analysis laboratories. Three healthy subjects underwent a standardized CGA protocol at two separate centers. Kinematic data were captured using motion capture systems from the same manufacturer but with different analysis software and camera configurations. The CGA data were analyzed by the same two observers for both centers. Interobserver reliability was calculated using single-measure intraclass correlation coefficients (ICCs). Intraobserver as well as between-laboratory intraobserver reliability were assessed using average-measure ICCs. Interobserver reliability for all joints (ICCtotal = 0.79) was significantly lower (p < 0.001) than intraobserver reliability (ICCtotal = 0.93), but significantly higher (p < 0.001) than between-laboratory intraobserver reliability (ICCtotal = 0.55). Comparison of data between the two centers revealed significant differences for 39% of the investigated parameters. Different hardware and software configurations affect CGA data and influence between-laboratory comparisons. Furthermore, both intra- and interobserver reliability were lower for ankle kinematics than for the hip and knee, particularly interobserver reliability.
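As a minimal illustration of the type of reliability computation described above (a sketch, not the study's actual analysis pipeline), single- and average-measure ICCs can be estimated in Python with the pingouin package; the long-format column names (subject, observer, angle) and the example values are assumptions introduced here for illustration.

```python
# Minimal sketch of an ICC-based reliability check (illustrative only;
# column names and data values are hypothetical, not the study's data).
import pandas as pd
import pingouin as pg

# Hypothetical long-format table: one row per (subject, observer) rating
# of a gait parameter, e.g. a peak joint angle in degrees.
data = pd.DataFrame({
    "subject":  [1, 1, 2, 2, 3, 3],
    "observer": ["A", "B", "A", "B", "A", "B"],
    "angle":    [58.2, 57.1, 61.4, 60.8, 55.9, 56.5],
})

# pingouin reports both single-measure (ICC1-ICC3) and average-measure
# (ICC1k-ICC3k) coefficients; the abstract describes single-measure ICCs
# for interobserver reliability and average-measure ICCs for the
# intraobserver and between-laboratory comparisons.
icc = pg.intraclass_corr(data=data, targets="subject",
                         raters="observer", ratings="angle")
print(icc[["Type", "ICC", "CI95%"]])
```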