Abstract

The Local Model State Space Network (LMSSN) is a recently developed black-box algorithm for nonlinear system identification. It has proven effective on benchmark problems as well as on real-world processes. A severe shortcoming, though, is the long computation time required for model training. Therefore, a different optimization strategy, the adaptive moment estimation (ADAM) method with mini-batches, is applied to the LMSSN and compared to the current Quasi-Newton (QN) optimization method. It is shown on a numerical Hammerstein example and on a well-known Wiener-Hammerstein benchmark that the use of ADAM with mini-batches does not limit the performance of the LMSSN algorithm and speeds up the nonlinear optimization by a factor of more than 30 per investigated split. The price to be paid, however, is higher parameter variance (less interpretability) and more tedious hyperparameter tuning.
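The paper itself provides no code here; as a rough illustration of the optimizer being compared, the sketch below shows ADAM with mini-batches (Kingma & Ba, 2015) applied to a generic least-squares objective. The data, model, and gradient are hypothetical placeholders chosen only to make the update rule concrete; they are not the LMSSN training problem from the paper.

```python
import numpy as np

# Minimal sketch of mini-batch ADAM on a toy least-squares problem.
# NOT the LMSSN training code; purely illustrative of the update rule.

rng = np.random.default_rng(0)

# Hypothetical data: linear-in-parameters toy problem y = X @ theta_true + noise.
X = rng.standard_normal((1000, 5))
theta_true = rng.standard_normal(5)
y = X @ theta_true + 0.01 * rng.standard_normal(1000)

def batch_gradient(theta, idx):
    """Gradient of the mean-squared error over one mini-batch."""
    Xb, yb = X[idx], y[idx]
    return 2.0 / len(idx) * Xb.T @ (Xb @ theta - yb)

# ADAM state and the standard hyperparameter defaults.
theta = np.zeros(5)
m = np.zeros_like(theta)   # first-moment (mean) estimate
v = np.zeros_like(theta)   # second-moment (uncentered variance) estimate
alpha, beta1, beta2, eps = 1e-2, 0.9, 0.999, 1e-8
batch_size, t = 32, 0

for epoch in range(50):
    perm = rng.permutation(len(y))
    for start in range(0, len(y), batch_size):
        t += 1
        g = batch_gradient(theta, perm[start:start + batch_size])
        m = beta1 * m + (1 - beta1) * g      # biased first-moment update
        v = beta2 * v + (1 - beta2) * g**2   # biased second-moment update
        m_hat = m / (1 - beta1**t)           # bias correction
        v_hat = v / (1 - beta2**t)
        theta -= alpha * m_hat / (np.sqrt(v_hat) + eps)

print("parameter error:", np.linalg.norm(theta - theta_true))
```

Unlike a Quasi-Newton method, which uses the full dataset and curvature estimates per iteration, each ADAM step above touches only one mini-batch, which is the source of the per-split speed-up the abstract reports.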
