Anonymised human location data in England for urban mobility research
The rapid expansion of sensor technologies and location-enabled mobile applications has greatly advanced the study of human mobility. Recently emerging sources such as mobile app data offer outputs similar to traditional datasets (trip chains, flows, and indicators), but the methods and decisions used to process such data often vary across location, time, and policy context, and remain poorly documented and insufficiently transparent. This variability necessitates tailored data processing and validation approaches, which remain underexplored in the existing literature. This study provides a reproducible and replicable framework for processing location-point data, using a case study of anonymised mobility records collected in November 2021 across England. We describe a modular workflow and multi-stage validation techniques that enhance the reproducibility of stay-point detection and activity labelling. Furthermore, we demonstrate how the proposed framework can generate reliable mobility indicators and origin-destination flow matrices for broader research applications. The resulting datasets, including sampled anonymised trajectories and full origin-destination flow matrices, are publicly available for research purposes, with updates to code and methodology hosted on GitHub and Zenodo.
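The abstract does not include the authors' stay-point detection code; as a minimal illustrative sketch, the widely used distance-and-duration thresholding approach can be written as below. The thresholds `dist_m` and `min_dur_s` and the point format are assumptions for illustration, not the paper's actual parameters.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def stay_points(points, dist_m=200, min_dur_s=600):
    """points: (lat, lon, unix_ts) tuples sorted by time.
    A stay is a run of points within dist_m of an anchor point that lasts
    at least min_dur_s; returns (lat, lon, arrive_ts, leave_ts) tuples."""
    stays, i, n = [], 0, len(points)
    while i < n:
        j = i + 1
        while j < n and haversine_m(points[i][:2], points[j][:2]) <= dist_m:
            j += 1
        if points[j - 1][2] - points[i][2] >= min_dur_s:
            run = points[i:j]
            lat = sum(p[0] for p in run) / len(run)
            lon = sum(p[1] for p in run) / len(run)
            stays.append((lat, lon, run[0][2], run[-1][2]))
        i = j
    return stays
```

Trips between consecutive stays can then be aggregated by origin and destination zone to build the flow matrices the abstract describes.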
- Conference Article
1
- 10.1109/igarss.2019.8897814
- Jul 1, 2019
Analysis Ready Data (ARD) are satellite data that have been pre-processed for immediate analysis with minimal user effort. The generation of Surface Reflectance (SR) from optical satellite data involves a series of corrections to standardise the data and enable meaningful comparison of data from multiple sensors and across time. Surface reflectance data are foundational for time-series analyses and for the rapid generation of other information products. Field-based validation of surface reflectance data is therefore critical to determine its fitness for purpose and its applicability for downstream product development. In this paper, an approach for continental-scale validation of the surface reflectance data from the Landsat-8 and Sentinel-2 satellites is presented, using field-based measurements that are near-synchronous to the satellite observations over multiple sites across Australia. Good-practice measurement protocols governing the acquisition of field data are outlined, including field instrument calibration, the sampling strategy, and the approach for post-collection processing and management of field spectral data. This study has been a nationally coordinated, collaborative field data collection campaign across Australia. Permanent field sites were also identified to support validation efforts within the broader Earth Observation (EO) community for continental-scale products. The approach is expected to serve as a model for coordinated ongoing validation of ARD products at continental to global scales.
- Research Article
6
- 10.1177/09544097211041879
- Aug 26, 2021
- Proceedings of the Institution of Mechanical Engineers, Part F: Journal of Rail and Rapid Transit
To increase the operational efficiency, resilience and capacity of the railway system, the development of modern railway traffic management systems (TMS) has attracted increasing attention in recent years. To support the development and implementation of the next generation of TMS and related applications, advanced data collection, transmission and processing approaches, digitalised databases, and virtual validation platforms are required. In the context of TMS development (addressed by Technology Demonstrator 2.9 of Shift2Rail Innovation Programme 2), this support is to be provided by a scalable, interoperable and standardised communication platform for internal and external communication between different subsystems, applications and clients. This paper outlines the approach of the ongoing OPTIMA project, which aims to develop a communication platform demonstrator for railway TMS based on a novel Integration Layer (IL) and its various interfaces to entities including integration layer services, TMS services, rail business services, external services and operator workstations. Further detailed discussion relates to the approach for validating the communication platform demonstrator both as a functional entity and as a virtual testing environment for validating railway traffic management and other applications. The validation approach for the applications tested on the communication platform demonstrator is also presented. The results of future implementation of this validation approach will be used to assess the functionality of the communication platform demonstrator and the initial TMS applications tested on it, and form an important step towards developing and implementing IL-based communication platforms for future TMSs.
- Conference Article
1
- 10.1109/iciecs.2009.5363874
- Dec 1, 2009
Traditional statistical data processing approaches require the distribution of the samples to be known. For anti-radiation missiles (ARM), however, when the air-defence radar uses active decoying, the sample distribution is usually unknown or has many possible forms. To improve the precision and stability of angle measurement in an active-decoying environment, a grey processing approach is proposed in this paper. Methods for removing gross errors and for parameter estimation based on the definition of grey entropy are presented. Using a grey data processing approach based on the definitions of grey distance measure and grey relation entropy, singular values are eliminated while random errors are retained. Finally, a simulation test verifies that the grey processing approach can improve the precision and stability of the angle measurer in an active-decoying environment, and can thereby improve the precision of the ARM.
- Conference Article
3
- 10.1109/dese.2011.75
- Dec 1, 2011
A data security and validation framework for an SOA-based system for the management, storage, processing and visualization of data obtained from scientific experiments is proposed in this paper. The framework covers three levels of data security: authorized user access, data encryption and data validation. To ensure authorized user access, three access levels are provided: ownership, distribution rights and read rights. To ensure data encryption, an asymmetric 128-bit encryption method based on public and private keys is implemented. A universal approach for complete data validation has been suggested and implemented as a web service. The data validation service allows the input data processed by each of the modules in the system to be validated. A declarative language for defining data validation rules and their interpretation, based on the XML standard, is specified: precise rules are expressed in XML format and processed by a library that supports this grammar. The language structures include elements describing the data, the rules, and the logic that the input data have to satisfy. The rule definitions are held in an XML definition repository, which allows additional field types or complex validation logic to be added as needed and the validation rules to be easily modified.
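The paper's actual XML rule grammar is not reproduced in the abstract; the sketch below shows, under assumed rule attributes (`name`, `type`, `required`, `min`, `max` are hypothetical, not the paper's schema), how declarative XML rules of this kind can be interpreted against input records:

```python
import xml.etree.ElementTree as ET

# Hypothetical rule document; the real framework's grammar differs.
RULES = """
<rules>
  <field name="sample_id" type="string" required="true"/>
  <field name="temperature" type="float" min="-50" max="150"/>
</rules>
"""

def validate(record, rules_xml=RULES):
    """Check a dict-like record against the declarative rules;
    return a list of human-readable error strings (empty if valid)."""
    errors = []
    for rule in ET.fromstring(rules_xml):
        name = rule.get("name")
        value = record.get(name)
        if value is None:
            if rule.get("required") == "true":
                errors.append(f"{name}: missing required field")
            continue
        if rule.get("type") == "float":
            try:
                v = float(value)
            except ValueError:
                errors.append(f"{name}: not a number")
                continue
            if rule.get("min") is not None and v < float(rule.get("min")):
                errors.append(f"{name}: below minimum")
            if rule.get("max") is not None and v > float(rule.get("max")):
                errors.append(f"{name}: above maximum")
    return errors
```

Because the rules live in data rather than code, new fields or tighter bounds can be added by editing the repository document alone, which is the design point the abstract emphasises.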
- Research Article
31
- 10.1007/s11694-018-9893-2
- Aug 11, 2018
- Journal of Food Measurement and Characterization
Fruit quality inspection and authentication instruments are an essential requirement at different stages of the fruit processing industry, from harvesting to market. In recent years, various intelligent analytical methods, such as the electronic nose, gas chromatography and mass spectrometry, UV–Vis–NIR spectroscopy, machine vision, hyperspectral imaging and many more, have evolved to assess fruit quality at different stages, such as judging the maturity of on-tree fruit, measuring the shelf life of harvested fruit, and measuring other quality parameters of fruit products in processing industries. Information extracted by these analytical methods needs to be processed using different data processing approaches and strategies, which play the major role in bringing intelligence to the analytical instruments. Although highly promising results have been reported for processing data acquired from a single type of sensory panel (the gas sensor array in an electronic nose) or a single sensing technique (impedance measurement), several challenges remain in processing data acquired from the fusion of multiple sensing techniques (similar or complementary in nature) to produce more informative results. Recently, there has been growing interest in fusing multiple sensing techniques to extract better information from fruit samples reliably and in less time. This paper presents an extensive review of classical and modern data processing approaches and strategies that have been used for single and multiple non-destructive sensing methods in the area of fruit quality inspection and authentication. Various approaches and strategies for preprocessing, data fusion, feature extraction, model design, multi-modal data processing, training, testing and validation for single and multiple sensing techniques are briefly explained.
The review also discusses the need, scope, and challenges of data processing methods for the fusion of multiple sensing techniques. Different commercially available handheld and laboratory-level analytical instruments are also reviewed on the basis of their intelligence, complexity and prediction of quality parameters.
- Book Chapter
52
- 10.1016/bs.agph.2020.07.003
- Jan 1, 2020
Seismic signal augmentation to improve generalization of deep neural networks
- Book Chapter
5
- 10.1007/978-3-319-93375-7_23
- Jan 1, 2018
In recent years, the amount of data increases continuously. With newly emerging paradigms, such as the Internet of Things, this trend will even intensify in the future. Extracting information and, consequently, knowledge from this large amount of data is challenging. To realize this, approved data analytics approaches and techniques have been applied for many years. However, those approaches are oftentimes very static, i.e., cannot be dynamically controlled. Furthermore, their implementation and modification requires deep technical knowledge only technical experts can provide, such as an IT department of a company. The special needs of the business users are oftentimes not fully considered. To cope with these issues, we introduce in this article a human-centered approach for interactive data processing and analytics. By doing so, we put the user in control of data analytics through dynamic interaction. This approach is based on requirements derived from typical case scenarios.
- Conference Article
- 10.1109/icecaa55415.2022.9936364
- Oct 13, 2022
The objective of this work is to conduct a sequenced approach for customer data processing in banking applications across various services using decision tree and k-nearest-neighbour classifiers. Materials and Methods: Two groups are considered, the decision tree and k-nearest-neighbour algorithms, with N=10 sample iterations used to test the accuracy of each model for customer data processing in banking applications across various services. Result: The Novel Decision Tree classifier model reaches an accuracy of up to 98%, while the K Nearest Neighbour model has an accuracy of 88.98%. A statistically significant difference is observed between the Decision Tree and K Nearest Neighbour results (p=0.002). Conclusion: The results for customer data processing in banking applications across various services show that the Decision Tree algorithm achieves higher accuracy and significance than the K Nearest Neighbour algorithm.
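The abstract does not describe the feature set or implementation; as a generic stand-in for the k-nearest-neighbour side of the comparison (the training data format is an assumption, not the paper's banking dataset), a minimal majority-vote kNN over Euclidean distance looks like this:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_tuple, label) pairs; query: feature tuple.
    Predict by majority vote over the k nearest neighbours, where
    nearness is Euclidean distance in feature space."""
    neighbours = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in neighbours[:k])
    return votes.most_common(1)[0][0]
```

Accuracy comparisons like the one reported would then repeat such predictions over held-out samples for each of the N=10 iterations and compare the resulting accuracy distributions.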
- Research Article
9
- 10.1186/s40537-019-0275-3
- Dec 1, 2019
- Journal of Big Data
The analysis and processing of big data is one of the most important challenges researchers are working on, seeking approaches that combine high performance, low cost and high accuracy. In this paper, a novel approach for big data processing and management is proposed that differs from existing ones: the proposed method uses not only main memory to read and handle big data, but also memory-mapped space that extends beyond memory into storage. From a methodological viewpoint, the novelty of this paper is the segmentation stage, in which big data is divided using memory mapping and all segments are broadcast to a number of processors using a parallel message passing interface. From an application viewpoint, the paper presents a high-performance approach based on a homogeneous network that works in parallel to encrypt and decrypt big data using the AES algorithm. This approach can be implemented on the Windows operating system using .NET libraries.
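A minimal sketch of the memory-mapping side of this idea: the file is mapped rather than read wholesale, and fixed-size segments are handed to a worker. The paper broadcasts segments to parallel processors over MPI and applies AES; here a sequential, caller-supplied worker stands in for both, so this is an illustration of the segmentation stage only:

```python
import mmap

def process_segments(path, seg_size, worker):
    """Map the file into memory and hand fixed-size segments to worker,
    instead of reading the whole file into heap memory at once."""
    results = []
    with open(path, "rb") as f, mmap.mmap(f.fileno(), 0,
                                          access=mmap.ACCESS_READ) as mm:
        for off in range(0, len(mm), seg_size):
            results.append(worker(mm[off:off + seg_size]))
    return results
```

In the parallel setting each slice would be sent to a different rank rather than processed in place; the mapping keeps the producer's resident memory bounded by the segment size rather than the file size.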
- Research Article
16
- 10.1155/2015/794518
- Jun 1, 2015
- International Journal of Distributed Sensor Networks
Since wireless sensor networks (WSNs) consist of nodes with limited battery power, collaborative data processing and balanced energy consumption are key issues. This paper proposes an efficient cluster head selection approach for collaborative data processing in WSNs. The proposed algorithm uses an effective energy-efficient model to select the optimal cluster heads among all nodes fairly, which helps to reduce the impact of the monitoring scheme on the network lifetime. Experimental results show that the proposed protocol reduces energy consumption and achieves higher efficiency, effectively prolonging the network lifetime compared with several existing cluster-based routing protocols.
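The paper's energy model is not given in the abstract; as a heavily simplified stand-in (node format and energy units are assumptions), an energy-aware cluster head selection can be sketched as picking the nodes with the most residual energy each round:

```python
def select_cluster_heads(energy, n_heads):
    """energy: dict mapping node id -> residual energy (J).
    Return the n_heads nodes with the most residual energy; re-running
    this each round rotates the role away from nodes that drained while
    serving as heads, balancing consumption across the network."""
    return sorted(energy, key=energy.get, reverse=True)[:n_heads]
```

A real protocol would also weigh distance to the sink and to cluster members; the residual-energy ranking alone already captures the fairness intent described in the abstract.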
- Research Article
21
- 10.1007/s12652-018-0843-y
- May 18, 2018
- Journal of Ambient Intelligence and Humanized Computing
Internet of Things (IoT) applications rely on networks composed of heterogeneous sensors and smart devices, which can constantly monitor their surroundings and gather data. This heterogeneity is reflected in the raw data collected by such systems. Additionally, these data stream continuously, leading to huge volumes of heterogeneous data that are transferred to centralized platforms for processing. Consequently, two main challenges arise. First, the heterogeneity of IoT data makes it more complex for high-level IoT applications to interpret the data and detect events in the real world. Second, sending sensory data to a centralized platform leads to issues such as extensive consumption of IoT devices' limited resources, network traffic overload, and latency, which can negatively impact response time, especially in systems designed to handle critical situations. In this paper, we propose a decentralized approach to IoT data processing that delegates this task to distributed edge devices (gateways), taking into consideration their limited resources and network bandwidth. To accomplish this, we propose a two-layer data processing approach that employs a hybrid model combining complex event processing (CEP) and semantic web (SW) techniques. The former performs aggregation and classification tasks, while the latter performs semantic filtering and annotation tasks. We evaluate the feasibility of our approach for processing sensory data in an air quality monitoring scenario, in an experiment involving established ontologies. Several benchmarks are considered, such as overall runtime, data size, and response time.
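The two-layer idea can be sketched in a few lines, with heavy simplification: layer one aggregates a window of raw readings (a CEP-style reduction), and layer two filters and annotates only the events worth forwarding. The PM2.5 property, the threshold, and the annotation keys are illustrative assumptions, not the paper's ontology terms:

```python
def aggregate(window):
    """Layer 1 (CEP-style): reduce a window of raw readings to a mean."""
    return sum(window) / len(window)

def annotate(pm25_avg, threshold=25.0):
    """Layer 2 (semantic filtering, simplified): forward an annotated
    event only when the aggregate breaches the threshold; everything
    else is dropped at the edge to save bandwidth and central load."""
    if pm25_avg <= threshold:
        return None
    return {"observedProperty": "PM2.5",
            "value": pm25_avg,
            "event": "AirQualityAlert"}
```

In the paper's setting the annotation step emits semantically typed statements against established ontologies rather than a plain dict, but the flow of data through the two layers is the same.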
- Research Article
- 10.4028/www.scientific.net/amm.654.315
- Oct 1, 2014
- Applied Mechanics and Materials
The main objective of this study is to investigate improved recognition of human motion activities. The study presents four approaches to processing human motion data for activity recognition. Data collection was performed in two ways: wearable-sensor-based (signal data) and vision-based (image data). The proposed approaches for analyzing the signal and image data are: a wearable sensor using 3-space sensing with angular velocity and elevation angle as moderators; a wearable sensor using nine existing statistical classifiers and one newly developed classifier as the classification learning system; a vision-based approach using skeletonization with humerus-radius and horizontal-radius measuring angles; and a vision-based image-signal histogram using a 2D-1D transformation method. The principal contributions of this work are the development of human motion analysis methods with a validated evaluation process tested on the proposed systems. The proposed systems achieved recognition accuracies of more than 98% for signal processing and 97% for image processing.
- Research Article
11
- 10.1016/j.optlaseng.2012.01.022
- Feb 24, 2012
- Optics and Lasers in Engineering
High temporal and spatial resolution in time resolved speckle interferometry
- Research Article
28
- 10.1007/s11207-017-1214-0
- Dec 1, 2017
- Solar Physics
Estimates of the photospheric magnetic, electric and plasma velocity fields are essential for studying the dynamics of the solar atmosphere, for example through the derived quantities of Poynting and relative helicity flux, and by using the fields to obtain the lower boundary condition for data-driven coronal simulations. In this paper we study the performance of a data processing and electric field inversion approach that requires only high-resolution, high-cadence line-of-sight or vector magnetograms, which we obtain from the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO). The approach does not require any photospheric velocity estimates; the missing velocity information is compensated for using ad hoc assumptions. We show that the free parameters of these assumptions can be optimized to reproduce the time evolution of the total magnetic energy injection through the photosphere in NOAA AR 11158, when compared with recent estimates for this active region. However, we find that the relative magnetic helicity injection is reproduced poorly, at best reaching a modest underestimation. We also discuss the effect of some data processing details on the results, including the masking of noise-dominated pixels and the method used to track the active region, both of which have received little attention in the literature so far. In most cases the effect of these details is small, but when the free parameters of the ad hoc assumptions are optimized, consistent use of the noise mask is required. The results imply that the data processing and electric field inversion approach, which uses only photospheric magnetic field information, offers a flexible and straightforward way to obtain photospheric magnetic and electric field estimates suitable for practical applications such as coronal modelling studies.
- Research Article
21
- 10.1039/c0ay00736f
- Jan 1, 2011
- Analytical Methods
A portable X-ray fluorescence (XRF) spectrometer and a portable Raman spectrophotometer were used at the Coriglia, Castel Viscardo excavation site near Orvieto, Italy to study pigments found on frescoes. Over eighty fresco samples were analyzed. Identified pigments included vermillion, red ochre, yellow ochre, terre verte, Egyptian blue, and hematite. XRF spectroscopic data were collected using three separate sets of instrument conditions. Various data processing and analysis approaches were evaluated for the XRF and Raman spectroscopic data, including scatterplots and analysis of variance on integrated peak areas, live-time correction of the XRF spectral intensities, principal component analysis on both the integrated peak areas and the spectra, and data fusion of the spectra. Fusion of the high-voltage and low-voltage-under-vacuum XRF spectroscopic data provided improved data clustering over single-technique data. Fusion of high-voltage XRF data with Raman data was also demonstrated to provide improved differentiation for certain pigments. Data from individual pigments were then evaluated with these best-performing approaches for possible source variations, with encouraging results.
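As a rough illustration of two of the evaluated steps, integrated peak areas can be computed with trapezoidal integration, and low-level data fusion can be as simple as normalising and concatenating the two spectra before clustering or PCA. The unit-maximum normalisation here is an assumption, one of several reasonable scaling choices, not necessarily the one used in the study:

```python
def peak_area(spectrum, lo, hi):
    """Trapezoidal area under the counts between channel indices lo..hi."""
    ys = spectrum[lo:hi + 1]
    return sum((a + b) / 2 for a, b in zip(ys, ys[1:]))

def fuse(xrf, raman):
    """Low-level fusion: scale each spectrum to unit maximum so neither
    technique dominates, then concatenate into one feature vector."""
    def unit_max(s):
        m = max(s)
        return [v / m for v in s]
    return unit_max(xrf) + unit_max(raman)
```

The fused vectors then feed the same multivariate tools (scatterplots, ANOVA on peak areas, PCA) used on the single-technique data.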