Abstract

In this paper, we present an optimal model reduction method based on the trust-region technique for linear time-invariant discrete-time dynamical systems. First, the error norm for single-input single-output discrete-time systems is expressed in terms of poles and residues, which leads to the error gradient and Hessian. Next, the gradient and Hessian of the error norm are derived accordingly for multiple-input multiple-output discrete-time systems. The error gradient and Hessian are then employed to establish the trust-region method for optimal model reduction, and it is shown that the proposed method produces a decreasing error sequence. The construction of the state-space realization of the reduced-order system is studied with respect to the divisions of the resulting residues. Model reduction of nonlinear discrete-time systems is also investigated. Finally, an illustrative example is used to demonstrate the performance of the proposed method.
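The first step mentioned above, writing the discrete-time H2 error norm in terms of the poles and residues of the error system, can be illustrated with a minimal sketch. The code below is not the paper's algorithm; it only evaluates the standard pole-residue identity ||E||^2_{H2} = sum_{i,k} psi_i * conj(psi_k) / (1 - mu_i * conj(mu_k)) for a stable error system E(z) = sum_i psi_i / (z - mu_i), where the error system pools the original poles/residues with the reduced-order poles and negated residues. All function and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def h2_error_norm(poles_full, residues_full, poles_red, residues_red):
    """Discrete-time H2 norm of the error system G(z) - Gr(z), with both systems
    given in pole-residue form sum_i phi_i / (z - lambda_i) and all |lambda_i| < 1."""
    # Pole-residue data of the error system: reduced-order residues enter negated.
    mu = np.concatenate([poles_full, poles_red])
    psi = np.concatenate([residues_full, -np.asarray(residues_red)])

    # Pairwise terms psi_i * conj(psi_k) / (1 - mu_i * conj(mu_k)), summed over i, k.
    denom = 1.0 - np.outer(mu, np.conj(mu))
    err_sq = np.sum(np.outer(psi, np.conj(psi)) / denom)
    return np.sqrt(err_sq.real)

# Illustrative data: a stable third-order SISO system and a first-order candidate.
lam = np.array([0.9, 0.5 + 0.3j, 0.5 - 0.3j])
phi = np.array([1.0, 0.4 - 0.2j, 0.4 + 0.2j])
lam_r = np.array([0.85])
phi_r = np.array([1.6])
print(h2_error_norm(lam, phi, lam_r, phi_r))
```

Differentiating this closed-form expression with respect to the reduced-order poles and residues is what yields the gradient and Hessian that drive the trust-region iteration described in the abstract.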
