Abstract

This study addresses the prediction of CAN bus data, a lesser-explored aspect of unsupervised anomaly detection research. We propose the Fast-Gated Attention (FGA) Transformer, a novel approach designed for accurate and efficient prediction of CAN bus data. The model uses a cross-attention window to balance computational cost and feature extraction, a gated single-head attention mechanism in place of multi-head attention, and shared parameters to minimize model size. Additionally, a generalized unbiased linear attention approximation technique speeds up computation of the attention block. On three datasets—Car-Hacking, SynCAN, and Automotive Sensors—the FGA Transformer achieves prediction root mean square errors of 1.86 × 10⁻³, 3.03 × 10⁻³, and 30.66 × 10⁻³, with processing speeds of 2178, 2768, and 3062 frames per second, respectively. The FGA Transformer delivers the best or comparable accuracy while running 6 to 170 times faster than existing methods, underscoring its potential for CAN bus data prediction.
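
To make the core architectural idea concrete, the following is a minimal, hypothetical PyTorch sketch of a gated single-head attention block of the kind the abstract describes: a single attention head whose output is modulated by a learned gate, used in place of multi-head attention, with queries drawn from the current input and keys/values from a context window of past frames. The layer names, sigmoid gate placement, and dimensions are illustrative assumptions, not the authors' implementation, and the sketch omits the linear attention approximation the paper uses for speed.

import torch
import torch.nn as nn


class GatedSingleHeadAttention(nn.Module):
    """Single-head scaled dot-product cross-attention with a learned output gate (sketch)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.gate = nn.Linear(d_model, d_model)  # gate parameters (assumed form)
        self.out_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # Cross-attention: queries from x, keys/values from a window of past frames.
        q, k, v = self.q_proj(x), self.k_proj(context), self.v_proj(context)
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        out = attn @ v
        # Modulate the single head's output elementwise with a sigmoid gate,
        # standing in for the expressiveness of multiple heads.
        return self.out_proj(torch.sigmoid(self.gate(x)) * out)


# Usage: predict next-frame signal values from a window of past CAN frames.
frames = torch.randn(8, 32, 64)     # (batch, window length, features) — illustrative sizes
block = GatedSingleHeadAttention(64)
print(block(frames, frames).shape)  # torch.Size([8, 32, 64])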
