Abstract

In this paper, motivated by recent developments in the approximation capability of neural network operators, we construct neural network (NN) operators for multivariate fractional integrals of order α, equipped with a vector-valued function ψ and its Jacobian matrix. In this system, the connection strengths between output neurons are represented by fractional mean values of the approximated multivariate function, which depend on the vector-valued function ψ. To estimate the rate of convergence (learning rate), the connection weights of the NN are equipped with a decreasing sequence (λn). The activation function of the system is constructed from a linear collection ΩΨ, which consists of density functions generated by multivariate sigmoidal functions. Our goal is to construct a flexible and productive hybrid system by leveraging diverse choices of the function ψ, the parameter α, the sequence (λn), and the activation function ΩΨ. Quantitative estimates for the operators are obtained by means of the multivariate modulus of continuity. Moreover, we provide illustrative examples with graphs to demonstrate the approximation performance of the operators for the selected activation functions. Finally, we present numerical results consisting of the maximum absolute approximation errors of the proposed operators for various choices of the vector-valued function ψ.
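As a rough illustration of how operators of this type behave, the sketch below uses the classical univariate density generated by the logistic sigmoid, φ(x) = (σ(x+1) − σ(x−1))/2, together with a simple normalized NN operator of quasi-interpolation type. This is a standard construction from the NN-operator literature and is only an assumption here; it does not reproduce the paper's multivariate collection ΩΨ, the function ψ, the fractional order α, or the sequence (λn).

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoidal function (one common choice; the paper allows others).
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # Classical density generated by a sigmoidal function:
    # phi(x) = (sigma(x + 1) - sigma(x - 1)) / 2
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def psi_multivariate(x):
    # A simple multivariate density built as a product of univariate densities
    # (one common construction; assumed here for illustration only).
    return np.prod(phi(np.asarray(x, dtype=float)))

def nn_operator(f, n, x, half_width=30):
    # Toy normalized NN operator: F_n(f)(x) = sum_k f(k/n) phi(nx - k) / sum_k phi(nx - k),
    # truncated to indices near n*x since phi decays exponentially.
    center = int(round(n * x))
    ks = np.arange(center - half_width, center + half_width + 1)
    weights = phi(n * x - ks)
    return np.sum(f(ks / n) * weights) / np.sum(weights)

if __name__ == "__main__":
    f = np.cos
    xs = np.linspace(-1.0, 1.0, 101)
    for n in (10, 50, 250):
        err = max(abs(nn_operator(f, n, x) - f(x)) for x in xs)
        print(f"n = {n:4d}  max abs error = {err:.5f}")
```

Running the script prints the maximum absolute error over [−1, 1] for increasing n; the error shrinks as n grows, mirroring the kind of maximum-absolute-error results the paper reports for its operators under various choices of ψ.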
