Abstract

The paper describes an approach to minimizing the number of simulation experiments on multidimensional signals by means of a regression neural network. Systems that simulate multivariate signals are in demand for testing real-time computing systems, but they typically have a large vector of model parameters. Selecting a parameter vector for a multidimensional-signal simulation model that ensures an adequate solution to the processing of the signals the model produces is therefore a pressing task. The authors propose a method of heuristic optimization of the input parameters, implemented with machine learning, that reduces the search over parameter values for a given optimization criterion. In this study, the ability to run signal simulations in real time serves as that criterion. The research outcomes are a description of the neural network, the rationale for its configuration, and the training and validation results.
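The surrogate-screening idea behind the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's method: a linear least-squares model stands in for the regression neural network, `simulate_runtime` is a hypothetical toy cost model, and the real-time deadline of 1.0 s per frame is an invented figure. The point is only the workflow: train a cheap regressor on a small budget of real simulation runs, then use it to screen a large grid of parameter vectors for predicted real-time feasibility instead of simulating every one.

```python
# Hypothetical stand-in for a multidimensional-signal simulator: the
# runtime of one run grows with signal dimensionality and sample count.
# The cost model and all names here are illustrative assumptions.
def simulate_runtime(dims, samples):
    return 0.05 + 0.01 * dims + 0.0005 * samples  # seconds, toy model

def fit_surrogate(runs):
    """Least-squares surrogate f(dims, samples) -> predicted runtime.

    A linear model stands in for the paper's regression neural network.
    Solves the normal equations for features [1, dims, samples] with a
    pure-Python 3x3 Gaussian elimination (partial pivoting).
    """
    X = [[1.0, d, s] for d, s, _ in runs]
    y = [t for _, _, t in runs]
    n = len(X)
    # Normal equations A w = b, where A = X^T X and b = X^T y.
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(3)]
         for i in range(3)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(3)]
    # Forward elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    w = [0.0, 0.0, 0.0]
    for i in reversed(range(3)):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, 3))) / A[i][i]
    return lambda d, s: w[0] + w[1] * d + w[2] * s

# Small training budget: a handful of real runs instead of a full sweep.
train = [(d, s, simulate_runtime(d, s))
         for d in (2, 4, 8) for s in (100, 500, 1000)]
surrogate = fit_surrogate(train)

# Screen a large candidate grid with the surrogate: keep only parameter
# vectors predicted to meet the (assumed) real-time deadline of 1.0 s.
candidates = [(d, s) for d in range(1, 17) for s in range(100, 2001, 100)]
feasible = [c for c in candidates if surrogate(*c) <= 1.0]
print(len(candidates), "candidates,", len(feasible), "predicted real-time")
```

Only the `feasible` subset would then be confirmed with actual simulation runs, which is where the reduction in the number of experiments comes from.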
