Abstract

Axis-oblique decision trees have been proposed to deal effectively with high-dimensional input spaces, weakening the effects of the curse of dimensionality. Usually, the axis-oblique partitioning is obtained by nonlinear optimization techniques, which introduce additional flexibility together with an increase in variance error. In this paper, a normalized L1 regularization for optimization-based axis-oblique partitioning strategies is proposed, which penalizes only the amount of obliqueness incorporated in the partitioning. As an example, it is applied to the hierarchical local model tree (HILOMOT) algorithm, which builds local model networks (LMNs) for system identification tasks. It is shown that the proposed normalized L1 regularization keeps the number of variables used for the partitioning low and decreases the variance error.
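
The abstract does not state the exact form of the regularizer; the following is a minimal sketch of one plausible normalized L1 obliqueness penalty, assuming the split is defined by a weight vector w over the inputs and that the penalty should vanish for axis-parallel splits. The function name obliqueness_penalty and the weighting factor lam are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def obliqueness_penalty(w, lam=0.1):
    """Hypothetical normalized L1 penalty on a split direction w.

    The ratio ||w||_1 / ||w||_2 equals 1 for an axis-parallel split
    (exactly one nonzero weight) and grows as more inputs contribute
    to the split, so subtracting 1 penalizes only the obliqueness.
    """
    w = np.asarray(w, dtype=float)
    l1 = np.abs(w).sum()
    l2 = np.linalg.norm(w)
    return lam * (l1 / l2 - 1.0)

# Axis-parallel split: no penalty.
print(obliqueness_penalty([0.0, 1.0, 0.0]))   # 0.0
# Fully oblique split over three inputs: maximal penalty.
print(obliqueness_penalty([1.0, 1.0, 1.0]))   # ~0.073
```

Under this reading, the penalty would be added to the nonlinear optimization objective of the partitioning, so that additional input variables enter a split only when they reduce the fit error by more than the regularization cost.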
