Abstract
This paper presents a framework based on algebraic structures for formalizing various types of neural networks. The strategy is to decompose neural networks into building blocks, the relationships between those blocks, and the operations on them. Building blocks are collections of primary components, or neurons; a neuron, in turn, is a collection of properties acting as a single entity that transforms an input into an output. We view a neuron as a function, so the flow of information through a neural network is a composition of functions. We also define an abstract data structure called a layer, a collection of entities that exist in the same time step; this concept enables parallel computation in our model. There are two types of operators in our model: recalling operators, which challenge the neural network with data, and training operators, which adjust the parameters of neurons to fit the data. From this point of view, all neural networks can be constructed or modelled using the same structures with different parameters.
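As a minimal sketch of this viewpoint, consider the following Python code, which models a neuron as a function, a layer as a collection of neurons belonging to the same time step, recall as function composition, and training as an operator that changes parameters to fit data. All names here (Neuron, Layer, recall, train_step) are illustrative assumptions, not the paper's own notation, and the finite-difference update stands in for whatever training operator a concrete network would use.

```python
# Illustrative sketch only: names and the training rule are assumptions,
# not the paper's formal definitions.
import math
import random
from dataclasses import dataclass
from typing import List


@dataclass
class Neuron:
    """A neuron viewed as a function: a bundle of parameters that
    transforms an input vector into a single output."""
    weights: List[float]
    bias: float = 0.0

    def __call__(self, inputs: List[float]) -> float:
        s = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return math.tanh(s)  # example activation


@dataclass
class Layer:
    """A collection of neurons existing in the same time step; their
    outputs are independent and could be computed in parallel."""
    neurons: List[Neuron]

    def __call__(self, inputs: List[float]) -> List[float]:
        return [n(inputs) for n in self.neurons]


def recall(layers: List[Layer], x: List[float]) -> List[float]:
    """Recalling operator: challenge the network with data by composing
    the layer functions."""
    for layer in layers:
        x = layer(x)
    return x


def train_step(layers: List[Layer], x: List[float], target: List[float],
               lr: float = 0.01, eps: float = 1e-4) -> None:
    """Training operator: adjust neuron parameters to fit the data,
    here via a crude finite-difference gradient step on a squared error."""
    def loss() -> float:
        y = recall(layers, x)
        return sum((yi - ti) ** 2 for yi, ti in zip(y, target))

    for layer in layers:
        for n in layer.neurons:
            for i in range(len(n.weights)):
                base = loss()
                n.weights[i] += eps
                grad = (loss() - base) / eps
                n.weights[i] -= eps + lr * grad  # undo probe, descend
            base = loss()
            n.bias += eps
            grad = (loss() - base) / eps
            n.bias -= eps + lr * grad


# Usage: the same structures with different parameters give a two-layer net.
random.seed(0)
net = [
    Layer([Neuron([random.uniform(-1, 1) for _ in range(2)]) for _ in range(3)]),
    Layer([Neuron([random.uniform(-1, 1) for _ in range(3)])]),
]
print(recall(net, [0.5, -0.2]))       # recall before training
train_step(net, [0.5, -0.2], [1.0])   # one training-operator application
print(recall(net, [0.5, -0.2]))       # recall after training
```

Note how recall never touches parameters while train_step never produces an output; keeping the two operator families separate mirrors the abstract's distinction between challenging a network with data and fitting it to data.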