Abstract

The first half of this paper discusses how infrastructures for large-scale digitisation rely heavily on standardisation, field research, and the shifting of workflows from a project-based model to a programme-based model. It also discusses the importance of establishing a metadata practice that takes into consideration the reuse of existing data, automation, and appropriate levels of description. Issues regarding preservation metadata and metadata consistency are also viewed through the lens of the large-scale model. The second half of the paper concentrates on two large-scale case studies. The first study focuses on a complex 150,000-item photoarchive digitisation project at the Frick Art Reference Library and the essential roles that assessment, standards, and automation played in the project’s successful completion. The second study details the digitisation of the entire collection of the Dallas Museum of Art and the establishment of a flexible workflow responsive to donor requirements, space limitations, increased throughput, and the challenges introduced by data migration.
