Abstract

Time-varying delays adversely affect the performance of networked control systems (NCS) and, in the worst case, can destabilize the entire system. Modeling network delays is therefore important for designing NCS. However, modeling time-varying delays is challenging because they depend on multiple parameters, such as network length, contention, the number of connected devices, the protocol employed, and channel loading. These parameters are inherently random, and delays vary nonlinearly with time, which makes estimating random delays difficult. This investigation presents a methodology to model delays in NCS using experiments and a general regression neural network (GRNN), chosen for its ability to capture nonlinear relationships. A genetic algorithm is used to compute the optimal smoothing parameter, with the objective of minimizing the mean absolute percentage error (MAPE) of the resulting estimates. Our results illustrate that the resulting GRNN is able to predict the delays with less than 3% error. The proposed delay model provides a framework for designing compensation schemes for NCS subjected to time-varying delays.
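The abstract does not include the authors' implementation, but the pipeline it describes (a GRNN delay predictor whose smoothing parameter is tuned by a genetic algorithm minimizing MAPE) can be sketched compactly. The following minimal Python sketch is illustrative only: the single channel-load feature, the synthetic delay data, the sigma search range, and all GA settings are assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): GRNN delay prediction with a
# toy genetic algorithm searching for the smoothing parameter sigma
# that minimizes validation MAPE. All settings here are assumptions.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN estimate: Gaussian-kernel weighted average of training targets."""
    # Squared Euclidean distance between every query and training point.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))              # kernel weights
    return (w @ y_train) / (w.sum(axis=1) + 1e-12)    # weighted mean

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def ga_optimize_sigma(X_tr, y_tr, X_val, y_val,
                      bounds=(1e-3, 1.0), pop=20, gens=30, seed=0):
    """Toy real-coded GA over scalar sigma: tournament selection,
    blend crossover, Gaussian mutation, and elitism (assumed settings)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    population = rng.uniform(lo, hi, size=pop)

    def fitness(sigma):
        return mape(y_val, grnn_predict(X_tr, y_tr, X_val, sigma))

    for _ in range(gens):
        scores = np.array([fitness(s) for s in population])
        children = []
        for _ in range(pop):
            # Tournament selection of two parents (lower MAPE wins).
            i, j = rng.integers(pop, size=2)
            p1 = population[i] if scores[i] < scores[j] else population[j]
            i, j = rng.integers(pop, size=2)
            p2 = population[i] if scores[i] < scores[j] else population[j]
            # Blend crossover plus Gaussian mutation, clipped to bounds.
            child = 0.5 * (p1 + p2) + rng.normal(0.0, 0.05 * (hi - lo))
            children.append(np.clip(child, lo, hi))
        # Elitism: carry the best individual into the next generation.
        best = population[scores.argmin()]
        population = np.array(children)
        population[0] = best
    scores = np.array([fitness(s) for s in population])
    return population[scores.argmin()]

# Synthetic stand-in for measured delays: delay as a nonlinear function
# of a single channel-load feature plus noise (purely illustrative).
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(300, 1))
y = 5.0 + 20.0 * np.sin(3.0 * X[:, 0]) ** 2 + rng.normal(0.0, 0.5, 300)
X_tr, y_tr, X_val, y_val = X[:200], y[:200], X[200:], y[200:]

sigma_star = ga_optimize_sigma(X_tr, y_tr, X_val, y_val)
pred = grnn_predict(X_tr, y_tr, X_val, sigma_star)
print(f"sigma* = {sigma_star:.4f}, validation MAPE = {mape(y_val, pred):.2f}%")
```

A held-out validation split is used for the GA fitness so that sigma is not tuned on the same samples the GRNN memorizes; this mirrors the abstract's objective of minimizing MAPE, though the exact evaluation protocol in the paper is not stated.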
