This work aims to advance the security management of complex networks so that it better aligns with evolving societal needs. The Ant Colony Optimization (ACO) algorithm is combined with Long Short-Term Memory (LSTM) neural networks to reconstruct and optimize task networks derived from time series data, and a trend-based noise smoothing scheme is introduced to suppress data noise. The approach first analyzes the historical data and applies trend-based smoothing, yielding a cleaner and more reliable processed series. The network reconstruction problem for time series generated by one-dimensional dynamic equations is then solved with an algorithm based on Stochastic Gradient Descent (SGD), which splits the time series into smaller samples and combines them with an adaptive learning rate to obtain the best learning outcome. Experimental results show that the weight matrix reconstructed by this algorithm closely matches the true weight matrix. The algorithm also converges efficiently as the data volume grows, requiring less time per iteration while still reaching the optimal solution. With a fixed sample size, its execution time grows proportionally to the square of the number of nodes; as the sample size increases, the SGD algorithm exploits the additional information and achieves better learning outcomes. Notably, at a noise standard deviation of 0.01, models based on SGD and the Least-Squares Method (LSM) both show smaller errors than at a noise standard deviation of 0.1, a comparison that highlights the sensitivity of LSM to noise. The proposed methodology offers useful insights for further research on complex networks.
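The abstract does not specify the exact form of the one-dimensional dynamics or the adaptive learning-rate schedule, so the following is only a minimal sketch of the general idea: assuming coupled dynamics of the hypothetical form dx_i/dt = -x_i + Σ_j W_ij tanh(x_j), the time series is split into mini-batches and the weight matrix is recovered by SGD with a simple decaying learning rate standing in for the paper's adaptive scheme. The functions `simulate` and `reconstruct_sgd`, as well as all hyperparameters, are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(W, T=2000, dt=0.01, noise_std=0.01):
    """Generate a noisy time series from the assumed dynamics
    dx_i/dt = -x_i + sum_j W_ij * tanh(x_j)."""
    n = W.shape[0]
    x = np.zeros((T, n))
    x[0] = rng.normal(size=n)
    for t in range(T - 1):
        dx = -x[t] + np.tanh(x[t]) @ W.T
        x[t + 1] = x[t] + dt * dx + noise_std * np.sqrt(dt) * rng.normal(size=n)
    return x

def reconstruct_sgd(x, dt=0.01, batch_size=64, epochs=50, lr0=0.1):
    """Mini-batch SGD with a decaying learning rate (a stand-in for the
    adaptive scheme). Derivatives are approximated by finite differences,
    and the residual dx/dt + x - tanh(x) @ W_hat.T is minimized in W_hat."""
    T, n = x.shape
    dxdt = (x[1:] - x[:-1]) / dt        # finite-difference derivative estimate
    g = np.tanh(x[:-1])                 # coupling term evaluated on the series
    target = dxdt + x[:-1]              # equals g @ W.T under the assumed model
    W_hat = np.zeros((n, n))
    step = 0
    for epoch in range(epochs):
        idx = rng.permutation(T - 1)
        for start in range(0, T - 1, batch_size):
            b = idx[start:start + batch_size]
            pred = g[b] @ W_hat.T
            grad = (pred - target[b]).T @ g[b] / len(b)  # gradient of the mean squared error
            lr = lr0 / (1.0 + 0.01 * step)               # decaying learning rate
            W_hat -= lr * grad
            step += 1
    return W_hat

# Usage: reconstruct a sparse random weight matrix and report the error.
W_true = rng.normal(scale=0.5, size=(10, 10)) * (rng.random((10, 10)) < 0.2)
series = simulate(W_true)
W_rec = reconstruct_sgd(series)
print("reconstruction RMSE:", np.sqrt(np.mean((W_rec - W_true) ** 2)))
```

Under these assumptions, enlarging the series length T gives the mini-batch gradient more information per epoch, which mirrors the abstract's observation that larger sample sizes improve the SGD learning outcome.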