Abstract

Usability, functionality, and accessibility testing of digital library information services and products is essential for providing high-quality services to users. This paper details a long-term, evolving effort to develop meaningful evaluations for assessing digital libraries. The authors have been engaged in a multi-year study to determine appropriate evaluation techniques, tools, and methodologies for the Florida Electronic Library (FEL) and other digital libraries. The evaluation protocols and approaches were designed iteratively over time, through the research team's assessments of other digital library initiatives and of multiple versions of the FEL. As such, this paper examines the process of developing, applying, and refining appropriate evaluation methodologies for the networked environment of libraries, as well as the implications of these methodologies.

The approach taken in the research described herein relies on a combination of evaluation strategies applied iteratively to assess libraries from the perspective of patron needs. A number of specific methods, as shown in this paper, can be readily developed to provide such evaluations. The paper traces the development over time of the methods and instruments that the researchers created, tested, refined, and operationalized for functionality, usability, and accessibility testing. The goals of this paper are to: 1) demonstrate the potential roles of multiple, iterative evaluation strategies in the development and refinement of digital libraries; 2) detail the methodologies that focus on how the services meet the needs of users; and 3) encourage further discussion of the uses of these multiple evaluation approaches in assessing digital libraries.