Abstract

Building mathematical models is an important part of developing digital products in many areas, including industry, medicine, geology, construction, and finance. Modeling makes it possible to optimize production processes, identify patterns, forecast time series, classify objects, and construct regressions. Quantile regression generalizes median regression and allows data to be examined in greater depth: quantile analysis estimates the model parameters and the quantiles of the dependent variable for given values of the independent variables by minimizing a loss function based on quantile values. In contrast to the method of least squares, quantile regression predicts the dependent variable more accurately when the independent variables change; in other words, it is more robust. This makes it applicable to many problems in science and business where predictions must remain reliable under changing conditions, so the study of regression construction methods and of the numerical methods behind them is of both practical and scientific interest.

Gradient descent is one of the most popular optimization methods and is widely used in machine learning. Natural gradient descent is often the preferred variant: it converges faster than the classical algorithm, is less vulnerable to local minima, and provides more accurate estimates of the model parameters. Its main drawback is computational: each step requires second-order information, namely computing and inverting the Fisher information matrix. This cost is especially acute when training neural networks, where the number of parameters is far larger than in classical regression models.

This paper examines quantile regression, natural gradient descent, and their combination for building mathematical models, and presents an algorithm for model construction based on natural gradient descent. The essence of using quantile regression within natural gradient descent is to replace the usual least-squares loss with a quantile estimate of the loss function. This allows the model to capture not only the mean of the dependent variable but also more extreme characteristics, such as the median, the 25th percentile, or the 95th percentile. The resulting method is compared with other popular gradient descent methods that support quantile regression on open data sets of different dimensionality, both in the number of factors and in the number of observations. Possibilities for further development and optimization of the method are also discussed. Illustrative sketches of the quantile loss, the natural gradient update, and their combination are given below.
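As context for the loss function mentioned above: the standard quantile (pinball) loss for quantile level $\tau \in (0, 1)$, written here in common textbook notation rather than the paper's own, is

$$
\rho_\tau(u) \;=\; u\left(\tau - \mathbb{1}[u < 0]\right) \;=\; \max\left(\tau u,\; (\tau - 1)\,u\right), \qquad u = y - \hat{y},
$$

and the regression coefficients $\beta$ are chosen to minimize $\sum_i \rho_\tau\!\left(y_i - x_i^{\top}\beta\right)$. For $\tau = 0.5$ this reduces to median (least absolute deviations) regression.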
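The natural gradient update referred to above is, in its usual textbook form (the paper's exact formulation is not shown in the abstract),

$$
\theta_{t+1} \;=\; \theta_t \;-\; \eta\, F(\theta_t)^{-1}\, \nabla_\theta L(\theta_t),
$$

where $\eta$ is the step size and $F$ is the Fisher information matrix; computing and inverting $F$ is the second-order cost that makes the method expensive for models with many parameters.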
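A minimal sketch of how the two ideas might combine, assuming an empirical-Fisher approximation with damping; all names and parameters here are illustrative, not the paper's algorithm:

```python
import numpy as np

def fit_quantile_natgrad(X, y, tau=0.5, lr=1.0, n_iter=500, damping=1e-3):
    """Sketch: quantile regression fitted with a natural-gradient-style step.

    Builds a damped empirical Fisher F ~= mean(g_i g_i^T) from per-sample
    subgradients of the pinball loss; this is an illustrative approximation,
    not the paper's method.
    """
    n, d = X.shape
    beta = np.zeros(d)
    for t in range(n_iter):
        u = y - X @ beta                          # residuals
        w = tau - (u < 0).astype(float)           # pinball-loss subgradient weights
        G = -w[:, None] * X                       # per-sample gradients, shape (n, d)
        g = G.mean(axis=0)                        # average gradient
        F = (G.T @ G) / n + damping * np.eye(d)   # damped empirical Fisher
        step = lr / np.sqrt(t + 1.0)              # decaying step for the subgradient method
        beta -= step * np.linalg.solve(F, g)      # natural-gradient step
    return beta

# Toy usage: estimate the conditional 0.9-quantile of a linear model.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(1000), rng.normal(size=1000)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=1000)
print(fit_quantile_natgrad(X, y, tau=0.9))
```

With standard normal noise, the fitted intercept should move toward $1 + z_{0.9} \approx 2.28$ while the slope stays near 2, since the conditional 0.9-quantile of $y$ is the conditional mean plus the 0.9-quantile of the noise.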
