Abstract

Time series forecasting holds significant value across a wide range of application scenarios. In practice, however, implementations often encounter low-quality time series data caused by system failures or external interference. Existing forecasting methods focus primarily on optimizing model architectures for accurate prediction and largely overlook data quality issues. In this paper, we propose optimizing data utilization according to data quality to enhance model performance, and introduce the Sample Cooperation with Sample Quality (SCSQ) method to facilitate the training of recurrent neural networks. First, we define sample quality as the degree of match between a sample and the model, and propose computing it as the attention entropy obtained through an attention mechanism. Second, we optimize the model's gradient vector based on sample quality. To handle training with samples of differing quality, we formulate a more reasonable objective function in the proposed sample gradient conflict optimization module and devise a novel algorithm that combines approximation techniques with an ADMM-based implementation. Experiments on six datasets demonstrate that SCSQ significantly improves the performance of LSTM, in some cases even surpassing state-of-the-art models. In addition, SCSQ enhances the robustness of LSTM against low-quality samples.
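
As an illustration of the sample-quality idea described above, the following is a minimal sketch, not the authors' implementation, of how attention entropy could be computed from an attention layer over an LSTM's hidden states; the architecture, layer names, and hyperparameters are assumptions made purely for illustration.

```python
# Minimal sketch (illustrative only): estimating a per-sample quality score as the
# entropy of attention weights computed over an LSTM's hidden states.
# Using attention entropy as a quality proxy follows the abstract's description;
# all concrete names and settings here are assumptions, not the paper's code.
import torch
import torch.nn as nn

class AttnEntropyLSTM(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.attn = nn.Linear(hidden_size, 1)   # one attention score per time step
        self.head = nn.Linear(hidden_size, 1)   # one-step-ahead forecast

    def forward(self, x):                        # x: (batch, seq_len, input_size)
        h, _ = self.lstm(x)                      # h: (batch, seq_len, hidden_size)
        weights = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # (batch, seq_len)
        context = (weights.unsqueeze(-1) * h).sum(dim=1)          # weighted summary
        y_hat = self.head(context).squeeze(-1)                    # (batch,)
        # Attention entropy per sample; lower entropy means more focused attention,
        # which could be read as a better match between sample and model.
        entropy = -(weights * torch.log(weights + 1e-12)).sum(dim=1)
        return y_hat, entropy

# Usage example
model = AttnEntropyLSTM(input_size=1, hidden_size=32)
x = torch.randn(8, 24, 1)                        # 8 samples, 24 time steps each
pred, sample_entropy = model(x)
print(sample_entropy.shape)                      # torch.Size([8])
```

Such a per-sample entropy could then be fed into a quality-weighted training step, e.g. to down-weight or reconcile the gradients of low-quality samples as in the paper's gradient conflict optimization module.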
