Abstract

We introduce GPLS (Genetic Programming for Linear Systems), a GP system that finds mathematical expressions defining an iteration matrix. Stationary iterative methods use this iteration matrix to solve a system of linear equations numerically. GPLS aims at finding iteration matrices with a low spectral radius and a high sparsity, since these properties ensure fast error reduction of the numerical solution method and enable an efficient implementation of the methods on parallel computer architectures. We study GPLS for various types of system matrices and find that it easily outperforms classical approaches like the Gauss–Seidel and Jacobi methods. GPLS not only finds iteration matrices with a much lower spectral radius for linear systems, but also finds iteration matrices for problems where the classical approaches fail. Additionally, solutions found by GPLS for small problem instances also show good performance for larger instances of the same problem.
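For readers unfamiliar with the setting, the following minimal sketch shows a generic stationary iteration x_{k+1} = G x_k + c and why the spectral radius of G governs convergence. The Jacobi-style construction of G used in the example is a standard textbook splitting chosen for illustration; it is not the GPLS system itself, and the function names and tolerances are assumptions.

```python
import numpy as np

def stationary_solve(A, b, G, c, x0=None, tol=1e-10, max_iter=10_000):
    """Generic stationary iteration x_{k+1} = G x_k + c.

    The scheme converges for every starting vector iff the spectral
    radius of G is below 1; a smaller spectral radius means faster
    error reduction.
    """
    x = np.zeros_like(b) if x0 is None else x0.copy()
    for k in range(max_iter):
        x_new = G @ x + c
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Illustrative Jacobi-style splitting A = D + R, so G = -D^{-1} R, c = D^{-1} b.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
D_inv = np.diag(1.0 / np.diag(A))
G = -D_inv @ (A - np.diag(np.diag(A)))
c = D_inv @ b
rho = max(abs(np.linalg.eigvals(G)))   # spectral radius, must be < 1
x, iters = stationary_solve(A, b, G, c)
print(rho, iters, np.allclose(A @ x, b, atol=1e-8))
```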

Highlights

  • Numerical methods are used in various disciplines to solve problems where an analytical solution does not exist or is difficult to find

  • Surprised by the extremely fast convergence of the iterative numerical methods evolved by our GP approach for linear systems (GPLS), we study whether GPLS has found, as iteration matrix G, the inverse of the system matrix A or a matrix that is very similar to it (see the sketch after this list)

  • As genetic programming (GP) is known for finding human-competitive results for many real-world problems [18], combining it with domain knowledge from classical numerical methods allows us to come up with iteration matrices that beat existing iterative numerical methods
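The check mentioned in the second highlight can be illustrated with a short, hedged sketch: given a candidate matrix G evolved for a system matrix A, one can measure how close G is to the inverse of A. The function name and the two similarity measures below are illustrative assumptions, not the analysis carried out in the paper.

```python
import numpy as np

def inverse_similarity(G, A):
    """How close is a candidate matrix G to the inverse of A?

    Returns the relative Frobenius distance to A^{-1} and the spectral
    radius of I - G @ A, which is near 0 when G acts like A^{-1}.
    """
    A_inv = np.linalg.inv(A)
    rel_dist = np.linalg.norm(G - A_inv) / np.linalg.norm(A_inv)
    residual_radius = max(abs(np.linalg.eigvals(np.eye(len(A)) - G @ A)))
    return rel_dist, residual_radius
```

Both values being close to zero would indicate that the evolved matrix is numerically indistinguishable from the inverse of the system matrix.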


Summary

Introduction

Numerical methods are used in various disciplines to solve problems where an analytical solution does not exist or is difficult to find. In the field of symbolic regression, where the aim is to find mathematical expressions solving a given problem, GP has been used to approximate even complex problems [12, 27]. This makes GP an interesting approach for finding new iterative numerical methods, as it can be used to find the mathematical expressions required to generate iteration matrices for certain classes of given system matrices. The spectral radius is an indicator for the convergence of the generated method, high sparsity provides performance advantages in the calculation and implementation of the method, and the complexity measure serves as bloat control. Following this introduction, we present background on iterative numerical methods and explain the relevant stationary iterative numerical methods, describe the discretization of partial differential equations into systems of linear equations, introduce GPLS in detail, and present our experiments and discuss the results.
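As a rough illustration of how the three criteria named above (spectral radius, sparsity, and complexity) could enter a single objective value, consider the following sketch. The weights, the density measure, and the use of expression size as the complexity term are assumptions made for illustration; they are not the objective function defined later in the paper.

```python
import numpy as np

def fitness(G, expression_size, w_rho=1.0, w_sparsity=0.1, w_complexity=0.01):
    """Toy fitness for a candidate iteration matrix G (lower is better).

    Combines the spectral radius (convergence speed), the fraction of
    non-zero entries (sparsity), and the size of the evolved expression
    (bloat control). Weights are illustrative only.
    """
    spectral_radius = max(abs(np.linalg.eigvals(G)))
    density = np.count_nonzero(G) / G.size   # 0 = fully sparse
    return (w_rho * spectral_radius
            + w_sparsity * density
            + w_complexity * expression_size)
```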

Iterative numerical methods
The Jacobi method
The Gauss–Seidel method
Successive over‐relaxation
Convergence of stationary methods
Discretization of partial differential equations
GPLS: genetic programming for linear systems
Representation of iteration matrices
Objective function
Experiments and results
Performance of GPLS for random system matrices
Generalization of iteration matrices found by GPLS
GPLS overcomes limitations of existing stationary iterative methods
Convergence analysis of iteration matrices found by GPLS
Sparse diagonally dominant band matrices
Conclusions
Findings
Future work