Abstract

The Cerebellar Model Articulation Controller (CMAC) neural network uses hypercube basis function domains, called cells. Because of the local nature of the cells, the CMAC exhibits more weight drift (also called parameter drift or overlearning) than other types of neural networks. Typical robust weight update law modifications designed to prevent drift, such as the deadzone or e-modification, often sacrifice performance. This paper proposes a solution to this problem by using two CMACs in the control law, referred to as a performance CMAC and a robust CMAC. The performance CMAC reduces the state error to a small value during initial training and then turns off its weight updates according to the output of a decision algorithm, in order to capture the best performance. One proposed algorithm simply turns off the training of a weight after its cell has been activated for a certain period of time. A more advanced method, deemed the introspective algorithm, stops training in a cell when it appears that the average error measured over sequential cell domains is no longer being reduced by that cell's weight update. The robust CMAC trains in parallel using a conservative e-modification algorithm, continuing to provide robustness after the performance CMAC has stopped training. Lyapunov methods guarantee that all signals are uniformly ultimately bounded. Simulations with a quadrotor helicopter demonstrate the superior performance of the proposed method over e-modification, deadzone, and a PID controller.
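The dual-CMAC idea summarized above, a performance CMAC that freezes each cell's weights after a fixed number of activations (the simpler of the two decision algorithms), training in parallel with a robust CMAC that uses e-modification leakage, can be sketched as follows. This is a simplified one-dimensional illustration, not the paper's implementation: the cell count, generalization width, learning rate, leakage gain, and freeze threshold are all assumed values chosen for the example.

```python
import numpy as np

class DualCMAC:
    """Toy sketch of a dual-CMAC approximator for a scalar input in [0, 1).

    Performance CMAC: plain gradient updates, frozen per cell after a
    fixed number of activations (a stand-in for the decision algorithm).
    Robust CMAC: e-modification update, which keeps training indefinitely.
    All parameter values here are illustrative assumptions.
    """

    def __init__(self, n_cells=64, width=4, lr=0.2, nu=0.05, freeze_after=100):
        self.n, self.c = n_cells, width
        self.wp = np.zeros(n_cells)            # performance CMAC weights
        self.wr = np.zeros(n_cells)            # robust CMAC weights
        self.hits = np.zeros(n_cells, int)     # per-cell activation counts
        self.lr, self.nu, self.freeze = lr, nu, freeze_after

    def _cells(self, x):
        # Quantize x onto `width` overlapping hypercube cells (1-D case).
        i = int(x * (self.n - self.c)) % (self.n - self.c)
        return np.arange(i, i + self.c)

    def output(self, x):
        idx = self._cells(x)
        return self.wp[idx].sum() + self.wr[idx].sum()

    def update(self, x, err):
        idx = self._cells(x)
        self.hits[idx] += 1
        # Performance CMAC: train only cells still under the activation cap.
        live = idx[self.hits[idx] <= self.freeze]
        self.wp[live] += self.lr * err / self.c
        # Robust CMAC: e-modification leakage term -nu*|err|*w bounds the
        # weights and suppresses drift, at some cost in tracking accuracy.
        self.wr[idx] += self.lr * err / self.c - self.nu * abs(err) * self.wr[idx]
```

As a usage example, training the pair to approximate a smooth target such as `sin(2*pi*x)` shows the intended division of labor: the performance CMAC captures most of the mapping before its cells freeze, after which only the conservatively trained robust CMAC continues to adapt.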

