Abstract
This paper presents our ongoing work in the Virtual Interiors project, which aims to develop 3D reconstructions as geospatial interfaces to structure and explore historical data of seventeenth-century Amsterdam. We take the reconstruction of the entrance hall of the house of the patrician Pieter de Graeff (1638–1707) as our case study and use it to illustrate the iterative process of knowledge creation, sharing, and discovery that unfolds while creating, exploring, and experiencing the 3D models in a prototype research environment. During this work, an interdisciplinary dataset was collected, various metadata and paradata were created to document both the sources and the reasoning process, and rich contextual links were added. These data were used as the basis for creating a user interface for an online research environment, taking design principles and previous user studies into account. Knowledge is shared by visualizing the 3D reconstructions along with the related complexities and uncertainties, while the integration of various underlying data and Linked Data makes it possible to discover contextual knowledge by exploring associated resources. Moreover, we outline how users of the research environment can add annotations and rearrange objects in the scene, facilitating further knowledge discovery and creation.
Highlights
In this paper, we explore the use of 3D reconstructions as heuristic tools in the research process
We offer our contribution to the current debate by discussing how we applied these principles in our ongoing work in the Virtual Interiors project, taking the reconstruction of the entrance hall of a seventeenth-century grand canal house in Amsterdam as a case study
We argue for the use of 3D reconstructions in the humanities as potentially enabling an iterative process of knowledge creation, sharing, and discovery
Summary
As Gane and Beer (2008) assert, an interface is an “in-between device,” for instance positioned between a user and a system. The 3D research environment developed in the context of this work allows direct access to this information. First of all, it comprises explicit metadata captured during the modeling process (i.e., information about the included resources, such as the type and description of an object) and references to the underlying sources used for the reconstruction. Links to the original sources of all of these pieces of information can be followed directly to delve deeper into the material underlying the reconstruction (Figure 9, top right). In this way, our environment serves as a research hub that facilitates further discovery, since users can navigate to related relevant resources beyond the information already included. Users can also explore different hypotheses about the spatial arrangement of objects, as well as experiment with their appearance.
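To make this kind of record more concrete, the sketch below shows one possible way a reconstructed object could be tied to its metadata, paradata, source references, and Linked Data links. It is a minimal illustration only: the class and field names (ReconstructedObject, SourceReference, uncertainty, and so on), the example chair, and all URLs are assumptions for the sake of the example, not the project's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SourceReference:
    """A link from a reconstructed object to an underlying historical source."""
    label: str   # e.g. an inventory entry or a painting used as evidence
    url: str     # resolvable link, so users can delve into the source itself

@dataclass
class ReconstructedObject:
    """Metadata and paradata for one object in the 3D scene."""
    object_id: str
    object_type: str                  # explicit metadata: type of the object
    description: str                  # explicit metadata: description of the object
    uncertainty: str                  # paradata: how certain the reconstruction is
    reasoning: str                    # paradata: why this interpretation was chosen
    sources: List[SourceReference] = field(default_factory=list)
    linked_data: List[str] = field(default_factory=list)  # URIs to external Linked Data
    annotations: List[str] = field(default_factory=list)  # notes added by users

# Hypothetical record for an object in the entrance hall (illustrative values only)
chair = ReconstructedObject(
    object_id="entrance-hall/chair-01",
    object_type="chair",
    description="Chair with leather upholstery",
    uncertainty="probable",
    reasoning="Mentioned in the household inventory; appearance based on comparable museum pieces.",
    sources=[SourceReference("Inventory entry", "https://example.org/inventory/123")],
    linked_data=["https://example.org/vocab/chair"],  # placeholder for e.g. a concept URI
)

# A user exploring an alternative hypothesis can attach an annotation to the object.
chair.annotations.append("Could this chair have stood nearer the window?")
print(chair.object_type, "-", chair.uncertainty)
```

In a structure along these lines, each source reference resolves to the original document, so following it from the interface mirrors the research-hub behaviour described above, while user annotations capture the further knowledge creation that the environment is meant to support.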