Abstract

Developing a mathematical model has become an essential part of studies across disciplines. With advancements in technology, there is a growing need for increasingly complex mathematical models. System identification is a popular way of constructing mathematical models of highly complex processes when an analytical model is not feasible. One of the many model architectures used in system identification is the Local Model Network (LMN). The Hierarchical Local Model Tree (HILOMOT) is an iterative LMN training algorithm that uses axis-oblique splits to divide the input space hierarchically. The split positions of the local models directly influence the accuracy of the entire model, but finding the best split positions presents a nonlinear optimization problem. This paper presents an optimized HILOMOT algorithm with enhanced Expectation–Maximization (EM) and Particle Swarm Optimization (PSO) algorithms that include the normalization parameter and use a reduced parameter vector. Finally, the performance of the improved HILOMOT algorithm is compared with the existing algorithm by modeling the NOx emissions of a gas turbine and multiple nonlinear test functions of different orders and structures.

Highlights

  • Each neuron in the Local Model Network (LMN) represents a local model whose region of validity is defined by a validity function, φi, and which holds a Local Linear Model (LLM); the corresponding output equation is sketched after this list

  • The results demonstrate that Particle Swarm Optimization (PSO) produced the best result and the Nested Optimization algorithm produced the worst result in modeling the real system

  • This paper presents the improved Hierarchical Local Model Tree (HILOMOT) LMN training algorithm with enhanced Expectation–Maximization (EM) and Particle Swarm Optimization (PSO) split-optimization algorithms
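
A minimal sketch of the LMN output equation that the first highlight refers to; the notation (input vector u with p components, M local models, LLM parameters w_{i,j}) is assumed here and not taken from the paper:

    \hat{y}(u) = \sum_{i=1}^{M} \Phi_i(u)\,\big( w_{i,0} + w_{i,1} u_1 + \dots + w_{i,p} u_p \big), \qquad \sum_{i=1}^{M} \Phi_i(u) = 1

Each validity function Φi weights the prediction of its Local Linear Model, and because the validity functions form a partition of unity, the local predictions blend smoothly across the input space.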


Summary

Introduction

With modern advancements in engineering, there is an emerging need to develop well-formulated models. Modeling nonlinear systems presents many challenges because of their structural diversity, which requires the modeling algorithm to be universal enough to represent a wide range of systems [12]. Several architectures, such as block-oriented models [13–18], Volterra series [19], and polynomial Nonlinear Auto-regressive Networks with Exogenous Inputs (NARX) [20–23], can be found in the literature. The flexibility of the HILOMOT algorithm is a consequence of the axis-oblique split, achieved by using arbitrarily oriented sigmoid functions instead of orthogonal Gaussian functions. This flexibility comes at the cost of an expensive nonlinear optimization required to find the partition that yields the lowest training error. The Particle Swarm Optimization (PSO) algorithm, which can locate a global optimum, has been used to optimize the hierarchical decomposition of the logistic discriminant function in [70]. These works did not consider the undesired effect of split-parameter optimization on the steepness of the transition between the validity functions.
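
To make the mechanism concrete, below is a minimal, self-contained sketch (in Python with NumPy) of a single axis-oblique split whose sigmoid parameters are tuned by a basic PSO; the toy data, the steepness constant kappa, and all function names are illustrative assumptions rather than the paper's implementation:

    # Minimal sketch of one axis-oblique split optimized with a basic PSO.
    # Hypothetical names and a 1-D toy problem are assumed; this is not the paper's code.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy nonlinear data: y = sin(2*pi*u) + noise, with u in [0, 1]
    U = rng.uniform(0.0, 1.0, size=(200, 1))
    y = np.sin(2 * np.pi * U[:, 0]) + 0.05 * rng.standard_normal(200)

    X = np.hstack([np.ones((U.shape[0], 1)), U])  # regressors [1, u] for the local linear models

    def split_sigmoid(U, v, kappa=20.0):
        """Axis-oblique sigmoid splitting function Psi(u) = sigma(kappa * (v0 + v' u))."""
        z = kappa * (v[0] + U @ v[1:])
        return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))  # clip to avoid overflow in exp

    def split_loss(v):
        """Training MSE of a two-local-model LMN for split parameters v (weighted least squares)."""
        psi = split_sigmoid(U, v)
        phi = np.column_stack([psi, 1.0 - psi])  # validity functions of the two children
        y_hat = np.zeros_like(y)
        for i in range(2):
            w = phi[:, i]
            Xw = X * w[:, None]                  # weight the regressors by the validity values
            theta = np.linalg.lstsq(Xw.T @ X, Xw.T @ y, rcond=None)[0]
            y_hat += phi[:, i] * (X @ theta)     # blend the local linear predictions
        return np.mean((y - y_hat) ** 2)

    # Very small PSO over the split parameters v = [v0, v1]
    n_particles, n_iter, dim = 15, 40, 2
    pos = rng.uniform(-1.0, 1.0, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([split_loss(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.uniform(size=(2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([split_loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print("best split parameters:", gbest, "MSE:", split_loss(gbest))

In a full HILOMOT training run, a loss of this kind would be evaluated inside the hierarchical tree construction, with the children's validity functions obtained by multiplying the parent's validity by Psi and (1 - Psi), respectively.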

Partition Strategies
Grid-Based and Clustering-Based Partitioning
Data-Based Partitioning
Nonlinear Optimization-Based Partitioning
Heuristic Tree-Based Partitioning
HILOMOT Algorithm
Split Optimization Algorithms
Expectation–Maximization (EM) Algorithm
Particle Swarm Optimization (PSO) Algorithm
Nested Optimization
Results and Validation
Conclusions