Abstract

Since various behavioral movement patterns are likely to be valid within different, unique ranges of spatial and temporal scales (e.g., instantaneous, diurnal, or seasonal) and the corresponding spatial extents, a cross-scale approach is needed for accurate classification of behaviors expressed in movement. Here, we introduce a methodology for the characterization and classification of behavioral movement data that relies on computing and analyzing movement features jointly in both the spatial and temporal domains. The proposed methodology consists of three stages. In the first stage, focusing on the spatial domain, the underlying movement space is partitioned into several zonings that correspond to different spatial scales, and features related to movement are computed for each partitioning level. In the second stage, concentrating on the temporal domain, several movement parameters are computed from trajectories across a series of temporal windows of increasing sizes, yielding another set of input features for the classification. For both the spatial and the temporal domains, the "reliable scale" is determined by an automated procedure. This is the scale at which the best classification accuracy is achieved, using only spatial or temporal input features, respectively. The third stage takes the measures from the spatial and temporal domains of movement, computed at the corresponding reliable scales, as input features for behavioral classification. With a feature selection procedure, the most relevant features contributing to known behavioral states are extracted and used to learn a classification model. The potential of the proposed approach is demonstrated on a dataset of adult zebrafish (Danio rerio) swimming movements in testing tanks, following exposure to different drug treatments. Our results show that behavioral classification accuracy greatly increases when, first, cross-scale analysis is used to determine the best analysis scale and, second, input features from both the spatial and the temporal domains of movement are combined. These results may have several important practical applications, including drug screening for biomedical research.
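To make the "reliable scale" selection concrete, the following is a minimal Python sketch, not the authors' implementation: for each candidate scale a feature table is built, a classifier is cross-validated, and the scale yielding the best accuracy is retained. The function names (select_reliable_scale, extract_features), the random-forest classifier, and the 5-fold cross-validation are illustrative assumptions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def select_reliable_scale(trajectories, labels, candidate_scales, extract_features):
    """Return the scale whose features best separate the behavioral classes.

    trajectories     : list of (n_i, 3) arrays of (t, x, y) fixes
    labels           : array of behavioral/treatment labels, one per trajectory
    candidate_scales : e.g. spatial zonings [4, 9, 16] or temporal windows [1, 5, 30] s
    extract_features : callable(trajectory, scale) -> 1-D feature vector (user-supplied)
    """
    best_scale, best_accuracy = None, -np.inf
    for scale in candidate_scales:
        # Build the feature matrix for this candidate scale.
        X = np.vstack([extract_features(traj, scale) for traj in trajectories])
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        # Cross-validated accuracy using only the features of this scale.
        accuracy = cross_val_score(clf, X, labels, cv=5).mean()
        if accuracy > best_accuracy:
            best_scale, best_accuracy = scale, accuracy
    return best_scale, best_accuracy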

Highlights

  • Understanding behavioral dynamics of moving objects is becoming the focus of many researchers in various fields of GIScience

  • As shown in Step I.1 in Figure 1, we focus on three hierarchical levels of subdivision which correspond to different spatial scales: “micro” is confined to the scale of fine-grained zones; “meso” points to the level of aggregated micro-zones; and “macro” refers to the coarsest possible spatial extent (a code sketch of such zoning follows this list)

  • For the given options of partitioning schemes, the 9-zone subdivision can be selected as the “reliable spatial scale” (Step I.4), markedly improving drug characterization based on zebrafish behavioral responses
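The hierarchical zoning referred to in the highlights can be illustrated with a short, hypothetical sketch: the tank footprint is divided into an n-by-n grid and the fraction of fixes falling in each zone is used as a spatial feature vector. The grid shapes and the occupancy feature are assumptions for illustration, not the paper's exact partitioning scheme.

import numpy as np


def zone_occupancy(xy, bounds, n_zones_per_side):
    """Fraction of time spent in each cell of an n x n partition of the tank.

    xy               : (n, 2) array of x, y positions
    bounds           : (xmin, xmax, ymin, ymax) of the tank
    n_zones_per_side : 1 for a "macro" level, e.g. 3 for a 9-zone level
    """
    xmin, xmax, ymin, ymax = bounds
    # Map positions to integer cell indices, clipping points on the upper edge.
    ix = np.clip(((xy[:, 0] - xmin) / (xmax - xmin) * n_zones_per_side).astype(int),
                 0, n_zones_per_side - 1)
    iy = np.clip(((xy[:, 1] - ymin) / (ymax - ymin) * n_zones_per_side).astype(int),
                 0, n_zones_per_side - 1)
    counts = np.bincount(iy * n_zones_per_side + ix,
                         minlength=n_zones_per_side ** 2)
    return counts / len(xy)


# Example: spatial features for one trajectory at three nested (assumed) scales.
# features = np.concatenate([zone_occupancy(xy, bounds, n) for n in (1, 3, 6)])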


Introduction

Discovering latent information about the behaviors of objects from raw movement data, typically comprised of a series of time-stamped fixes, requires more sophisticated approaches to better characterize different behavioral states. The primary interest in studying movement parameters (MPs) in movement analysis lies in characterizing different behavioral states and investigating how they change over time [30]. Since movement occurs in space and time, exploration of both the underlying spatial extent and the relevant temporal characteristics of movement processes is needed to understand the fundamental behavioral mechanisms. Since different behavioral patterns and processes are likely to be valid within their own unique range of spatial and temporal scales, understanding the functional hierarchy underlying movement processes necessitates investigation of movement mechanisms and patterns across multiple spatiotemporal scales [26].
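As an illustration of computing MPs across temporal windows of increasing size (a sketch under assumed parameter choices, not the authors' procedure), the snippet below derives mean speed and summed absolute turning angle per non-overlapping window of w samples from time-stamped fixes.

import numpy as np


def windowed_movement_parameters(t, x, y, window_sizes=(5, 25, 125)):
    """Return a dict mapping window size -> (n_windows, 2) array of
    [mean speed, summed absolute turning angle] per window."""
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    speed = np.hypot(dx, dy) / dt                            # step speeds
    heading = np.arctan2(dy, dx)
    turn = np.abs(np.angle(np.exp(1j * np.diff(heading))))   # turning angles wrapped to [0, pi]
    features = {}
    for w in window_sizes:
        n_windows = len(turn) // w
        # Aggregate step-wise parameters into non-overlapping windows of w steps.
        spd = speed[1:1 + n_windows * w].reshape(n_windows, w).mean(axis=1)
        trn = turn[:n_windows * w].reshape(n_windows, w).sum(axis=1)
        features[w] = np.column_stack([spd, trn])
    return features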

