Abstract

A significant challenge for machine learning (ML) prognostic analyses of large-scale time series databases is variable clock skew among the multiple data acquisition (DAQ) systems deployed across a fleet of monitored assets, and even within individual assets, where sensor counts are so large that multiple DAQs, each with its own internal clock, can create significant clock-mismatch issues. For Big Data prognostic anomaly detection, we have discovered and amply demonstrated that variable clock skew in the timestamps of time series telemetry signatures degrades ML prognostic performance, resulting in high false-alarm and missed-alarm probabilities (FAPs and MAPs). This paper describes a new Analytical Resampling Process (ARP) that embodies novel time-domain and frequency-domain techniques for interpolative online normalization and optimal phase coherence, so that all system telemetry time series outputs are available in a uniform format and aligned to a common sampling frequency. More importantly, the optimality of the proposed technique lets end users trade off ultimate accuracy against lowest compute overhead for automated coherence synchronization of collections of time series signatures, whether from a few sensors or hundreds of thousands of sensors, and regardless of the sampling rates and signal-to-noise (S/N) ratios of those sensors.
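The abstract does not disclose the ARP algorithm itself, but its two core steps, interpolating asynchronously sampled signals onto a common uniform time grid and restoring phase coherence between skewed clocks, can be sketched in the time domain as follows. This is a minimal illustration only: the function names, the linear-interpolation choice, and the integer-lag cross-correlation alignment are assumptions for exposition, not the paper's method.

```python
import numpy as np

def resample_to_common_grid(timestamps, values, grid):
    """Linearly interpolate one signal onto a shared uniform time grid.

    Illustrative stand-in for the paper's interpolative normalization step.
    """
    return np.interp(grid, timestamps, values)

def phase_align(reference, signal):
    """Estimate the integer-sample lag of `signal` relative to `reference`
    via cross-correlation, then shift `signal` to cancel it.

    Illustrative stand-in for the paper's phase-coherence step.
    """
    n = len(reference)
    corr = np.correlate(signal - signal.mean(),
                        reference - reference.mean(), mode="full")
    lag = int(corr.argmax()) - (n - 1)   # >0 means `signal` is delayed
    return np.roll(signal, -lag), lag

# Two hypothetical DAQs observing the same 0.5 Hz phenomenon at different
# sampling rates; the second DAQ's clock runs 150 ms ahead (clock skew).
t1 = np.linspace(0.0, 10.0, 501)             # ~50 Hz DAQ
t2 = np.linspace(0.15, 10.15, 301)           # ~30 Hz DAQ, skewed stamps
s1 = np.sin(2 * np.pi * 0.5 * t1)
s2 = np.sin(2 * np.pi * 0.5 * (t2 - 0.15))   # same signal, skewed clock

grid = np.arange(0.2, 10.0, 0.02)            # common 50 Hz grid (overlap only)
r1 = resample_to_common_grid(t1, s1, grid)
r2 = resample_to_common_grid(t2, s2, grid)
aligned, lag = phase_align(r1, aligned_signal := r2)

print(f"estimated skew: {lag * 0.02:.2f} s")  # close to the injected 0.15 s
```

The estimated lag recovers the injected 150 ms skew to within one grid sample (20 ms); a production system would refine this with sub-sample (e.g. frequency-domain) methods, which is where the accuracy-versus-compute trade-off mentioned in the abstract arises.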
