Abstract

The standard assumption that a measurement signal is available at each sample in iterative learning control (ILC) is not always justified, e.g., when exploiting time-stamped data from incremental encoders or in systems with data dropouts. The aim of this paper is to develop a computationally tractable ILC framework that is capable of exploiting intermittent data while maintaining favourable properties, including monotonic convergence. A controllability and observability analysis of the intermittent ILC framework leads to appropriate monotonic convergence conditions that allow for missing data. These conditions lead to a new explicit ILC controller design that is independent of the sampling instances and is reminiscent of gradient-descent ILC. The approach is demonstrated on both an intuitive example and a practically relevant example that exploits time-varying, time-stamped data from an incremental encoder.
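
The abstract does not state the update law explicitly. As a rough illustration of the kind of scheme it describes, the sketch below shows a gradient-descent-style ILC update that uses only the error samples actually measured in a given trial. All names, the first-order plant, the 30% dropout rate, and the learning gain `alpha` are assumptions for illustration; this is not the paper's controller design.

```python
import numpy as np

def intermittent_ilc_update(u_k, e_k_measured, J, available_idx, alpha):
    """One gradient-descent-style ILC trial update using only measured samples.

    u_k           : (N,) input applied during trial k
    e_k_measured  : (m,) tracking error at the measured sample instants only
    J             : (N, N) lifted (trial-domain) plant matrix
    available_idx : indices of the samples for which data arrived in trial k
    alpha         : learning gain, chosen small enough for convergence
    """
    # Selection matrix S_k keeps only the rows corresponding to samples that
    # were actually measured (e.g., encoder time stamps, no dropout).
    S_k = np.zeros((len(available_idx), J.shape[0]))
    S_k[np.arange(len(available_idx)), available_idx] = 1.0
    # Gradient step on the squared error of the observed samples:
    # u_{k+1} = u_k + alpha * (S_k J)^T * e_k^measured
    return u_k + alpha * (S_k @ J).T @ e_k_measured

# Minimal demonstration on an assumed first-order plant with ~30% random dropouts.
N = 50
a = 0.5                                          # plant pole (assumed)
h = a ** np.arange(N)                            # impulse response samples
J = np.zeros((N, N))
for i in range(N):
    J[i, : i + 1] = h[: i + 1][::-1]             # lower-triangular Toeplitz lifted map
r = np.sin(np.linspace(0.0, 2.0 * np.pi, N))     # reference trajectory
u = np.zeros(N)
rng = np.random.default_rng(0)
for k in range(200):
    e = r - J @ u                                # full error; only part is observed
    idx = np.flatnonzero(rng.random(N) > 0.3)    # samples measured in this trial
    u = intermittent_ilc_update(u, e[idx], J, idx, alpha=0.2)
print("final tracking error norm:", np.linalg.norm(r - J @ u))
```

In this sketch the masked gradient step only corrects the measured samples in each trial, but because the set of measured samples varies from trial to trial, the error on all samples decreases over repeated trials, mirroring the intermittent-data setting described above.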
