Abstract

Background

As physical and cognitive rehabilitation protocols utilizing virtual environments transition from single applications to comprehensive rehabilitation programs, there is a need for a new design cycle methodology. Current human-computer interaction designs focus on usability without benchmarking technology within a user-in-the-loop design cycle. The field of virtual rehabilitation is unique in that determining the efficacy of this genre of computer-aided therapies requires prior knowledge of technology issues that may confound patient outcome measures. Benchmarking the technology (e.g., displays or data gloves) using healthy controls may provide a means of characterizing the "normal" performance range of the virtual rehabilitation system. This standard not only allows therapists to select appropriate technology for use with their patient populations, but also allows them to account for technology limitations when assessing treatment efficacy.

Methods

An overview of the proposed user-centered design cycle is given. Comparisons of two optical see-through head-worn displays provide an example of benchmarking techniques. Benchmarks were obtained using a novel vision test capable of measuring a user's stereoacuity while wearing different types of head-worn displays. Results from healthy participants who performed both virtual and real-world versions of the stereoacuity test are discussed with respect to virtual rehabilitation design.

Results

The user-centered design cycle argues for benchmarking to precede virtual environment construction, especially for therapeutic applications. Results from real-world testing illustrate the general limitations in stereoacuity attained when viewing content using a head-worn display. Further, the stereoacuity vision benchmark test highlights differences in user performance when utilizing a similar style of head-worn display. These results support the need for including benchmarks as a means of better understanding user outcomes, especially for patient populations.

Conclusions

The stereoacuity testing confirms that, without benchmarking in the design cycle, poor user performance could be misconstrued as resulting from the participant's injury state. Thus, a user-centered design cycle that includes benchmarking for the different sensory modalities is recommended for accurate interpretation of the efficacy of virtual environment-based rehabilitation programs.

Highlights

  • As physical and cognitive rehabilitation protocols utilizing virtual environments transition from single applications to comprehensive rehabilitation programs, there is a need for a new design cycle methodology

  • Stereoacuity was calculated for head-worn projection displays HWPD-1 and HWPD-2; Figures 5 and 6 show the overall mean stereoacuity values attained at each viewing distance, for each task, and for each HWPD

  • Participants wearing HWPD-1 performed more variably at the 800 mm viewing distance; as the distance was adjusted toward the optimized optical plane, participants’ performance improved significantly (MV-HD800 = 186.70 arc sec, SD = 92.10 arc sec; MV-HD1500 = 133.52 arc sec, SD = 34.6 arc sec; MV-HD3000 = 41.91 arc sec, SD = 7.18 arc sec)
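The stereoacuity values in the highlights above are thresholds expressed in arc seconds of binocular disparity. For context (this is not the authors' analysis code), the standard small-angle approximation relates a just-detectable depth interval Δd at viewing distance D to disparity δ ≈ (IPD · Δd) / D². A minimal sketch of that conversion, assuming a nominal 65 mm interpupillary distance:

```python
import math

# Conversion factor from radians to arc seconds (~206264.8).
ARCSEC_PER_RAD = 180 / math.pi * 3600

def stereoacuity_arcsec(ipd_mm, viewing_distance_mm, depth_interval_mm):
    """Small-angle approximation of the binocular disparity produced by
    a depth interval at a given viewing distance, in arc seconds.

    All inputs share the same length unit (mm here); the ratio is
    unitless, so the result depends only on the geometry.
    """
    disparity_rad = (ipd_mm * depth_interval_mm) / viewing_distance_mm ** 2
    return disparity_rad * ARCSEC_PER_RAD
```

With these assumed parameters, a ~28 mm depth interval at the 3000 mm viewing distance corresponds to roughly 42 arc sec, the order of the HWPD-1 mean reported above; note that disparity falls off with the square of viewing distance, which is why thresholds at 800 mm and 3000 mm are not directly comparable without this conversion.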



Introduction

As physical and cognitive rehabilitation protocols utilizing virtual environments transition from single applications to comprehensive rehabilitation programs, there is a need for a new design cycle methodology. “Good Fit” assessments are another suggested requirement of the virtual rehabilitation (VR) design cycle. The purpose of these assessments is to gauge how well the VE solution presents real-world attributes in a more controlled, repeatable manner that will allow for comparable results over treatment effects [10]. This point raises an important issue: VE solutions for cognitive rehabilitation are mostly designed to capture the data necessary to evaluate levels of cognitive function or transfer effects pre- and post-rehabilitation. Lack of standardization leads to redundancy of VE applications and platforms; more importantly, it makes comparisons across research endeavors difficult [11].


