Abstract

We describe and analyze algorithms for shape-constrained symbolic regression, which allow the inclusion of prior knowledge about the shape of the regression function. This is relevant in many areas of engineering, in particular when data-driven models built from measurement data must exhibit certain properties (e.g. positivity, monotonicity, or convexity/concavity). To satisfy these properties, we have extended multi-objective algorithms with shape constraints. A soft-penalty approach is used to minimize both the constraint violations and the prediction error. We use the non-dominated sorting genetic algorithm (NSGA-II) as well as the multi-objective evolutionary algorithm based on decomposition (MOEA/D). The algorithms are tested on a set of models from physics textbooks and compared against previous results achieved with single-objective algorithms. Further, we generated out-of-domain samples to test the extrapolation behavior with shape constraints and added different levels of noise to the training data to verify whether shape constraints still help keep prediction errors low and produce valid models. The results show that the multi-objective algorithms were capable of finding mostly valid models, even when using a soft-penalty approach. We also found that NSGA-II achieved the best overall ranks on high-noise instances.
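
To illustrate the idea of treating constraint violation and prediction error as separate objectives, the following is a minimal sketch, not the paper's implementation: the function names (`prediction_error`, `monotonicity_violation`, `objectives`) are hypothetical, and a finite-difference sampling check is used as a stand-in for whatever constraint-evaluation scheme (e.g. interval arithmetic) the paper actually employs.

```python
import numpy as np

def prediction_error(model, X, y):
    # Mean squared error of the candidate expression on the training data.
    return np.mean((model(X) - y) ** 2)

def monotonicity_violation(model, lo, hi, n_samples=1000, eps=1e-3):
    # Rough estimate of how strongly the model violates "non-decreasing in x":
    # sample points in the domain and accumulate any negative finite-difference
    # slopes (a sampling-based proxy for the exact constraint check).
    x = np.random.uniform(lo, hi, size=n_samples)
    slope = (model(x + eps) - model(x)) / eps
    return np.sum(np.maximum(0.0, -slope))

def objectives(model, X, y, lo, hi):
    # Two objectives handed to a multi-objective optimizer such as
    # NSGA-II or MOEA/D: (prediction error, constraint violation).
    return prediction_error(model, X, y), monotonicity_violation(model, lo, hi)

# Usage example: a candidate expression that fits the data reasonably well
# but is slightly decreasing near zero, so it incurs a nonzero violation.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, 200)
y = X ** 3 + rng.normal(0, 0.1, 200)
candidate = lambda x: x ** 3 - 0.1 * x
print(objectives(candidate, X, y, -2.0, 2.0))
```

In a soft-penalty setting, the violation term is not used to reject candidates outright; both objectives are minimized jointly, so slightly infeasible models can survive selection while the search is steered toward valid ones.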
