Abstract
In this paper we provide a critical overview of the state of the art in human-centric intelligent data management approaches for geographic visualizations when we are faced with bandwidth limitations. These limitations often force us to rethink how we design displays for geographic visualizations. We need ways to reduce the amount of data to be visualized and transmitted. This is partly because modern instruments effortlessly produce large volumes of data and Web 2.0 further allows bottom-up creation of rich and diverse content. Therefore, the amount of information we have today for creating useful and usable cartographic products is higher than ever before. However, how much of it can we really use online? To answer this question, we first calculate the bandwidth needs for geographic data sets in terms of waiting times. The calculations are based on various data volumes estimated by scholars for different scenarios. Documenting the waiting times clearly demonstrates the magnitude of the problem. Following this, we summarize the current hardware and software solutions, then the current human-centric design approaches trying to address the constraints such as various screen sizes and information overload. We also discuss a limited set of social issues touching upon the digital divide and its implications. We hope that our systematic documentation and critical review will help researchers and practitioners in the field to better understand the current state of the art.
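The waiting-time calculation described above boils down to dividing a payload size by the available bandwidth. The following sketch illustrates that arithmetic; the data volumes and link speeds are illustrative assumptions, not figures taken from the paper.

```python
# Sketch: estimating idealized waiting times for geographic data sets
# of various sizes over typical connection speeds. All sizes and
# bandwidths below are illustrative assumptions.

def waiting_time_seconds(size_bytes: float, bandwidth_bps: float) -> float:
    """Idealized transfer time: payload size divided by raw bandwidth.

    Real transfers are slower (protocol overhead, latency, congestion),
    so this is a lower bound on the user's actual wait.
    """
    return (size_bytes * 8) / bandwidth_bps

# Illustrative data volumes (bytes) and link speeds (bits per second).
datasets = {"vector tile": 50e3, "raster map": 5e6, "point cloud": 2e9}
links = {"3G (1 Mbps)": 1e6, "DSL (10 Mbps)": 10e6, "Fiber (1 Gbps)": 1e9}

for name, size in datasets.items():
    for link, bps in links.items():
        print(f"{name} over {link}: {waiting_time_seconds(size, bps):.1f} s")
```

Even this lower bound makes the magnitude of the problem visible: a multi-gigabyte point cloud takes minutes over a slow link, well past any usability threshold for response time.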
Highlights
Although in many of today's applications level-of-detail (LOD) switches help a great deal with lag times and are used relatively successfully in progressive loading models, the adaptation of the visualization is often not seamless
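A progressive loading model with LOD switches can be sketched as a simple policy: serve the finest level of detail whose estimated transfer time fits a responsiveness budget, then refine. The tile sizes and the one-second budget below are illustrative assumptions, not values from the surveyed systems.

```python
# Sketch of a bandwidth-aware LOD selection policy for progressive
# loading. LOD 0 is coarsest, LOD 3 finest; sizes are illustrative.

LOD_SIZES_BYTES = [20e3, 200e3, 2e6, 20e6]

def choose_lod(bandwidth_bps: float, budget_s: float = 1.0) -> int:
    """Return the finest LOD whose idealized transfer time
    (size / bandwidth) stays within the responsiveness budget.
    Falls back to the coarsest LOD when nothing fits."""
    best = 0
    for lod, size in enumerate(LOD_SIZES_BYTES):
        if (size * 8) / bandwidth_bps <= budget_s:
            best = lod
    return best
```

The non-seamless adaptation noted above shows up exactly at the moment `choose_lod` jumps from one level to the next: the display visibly switches rather than blending between detail levels.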
In this study we surveyed the current state of the art in the approaches to deal with bandwidth limitations when high quality geographic information is being provided
The system responsiveness is directly linked to "response time", which is a basic metric in usability and is measured in almost all user studies (e.g., [28,79])
Summary
Various forms of computer networks, along with other developments in technology, have enabled almost any kind of information imaginable to be produced and distributed in unforeseen amounts. This near-ubiquitous availability of vast amounts of information at our fingertips has enriched our lives, and continues to do so. Filtering is both a technical challenge and a cognitive one: we work on algorithms that manage the data intelligently, but at the same time we need to understand what our minds can process in order to tailor the technology and the design. If we can match cognitive limitations with bandwidth limitations, we may find a "sweet spot" where we handle the data just right and possibly improve both human and machine performance considerably