Abstract
A probabilistic neuro-fuzzy system for solving the image classification-recognition task is proposed. The considered system is a “hybrid” of Specht’s probabilistic neural network and the Takagi-Sugeno-Kang neuro-fuzzy system. It is designed to solve tasks with overlapping classes. It is also assumed that the initial data fed to the input of the system can be represented in numerical, rank, and nominal (binary) scales. The tuning of the network is implemented with a modified lazy learning procedure based on the concept of “neurons at data points”. Such a learning approach substantially reduces the time spent on training and does not require a large training dataset. The proposed system is simple in computational implementation, is characterised by a high classification speed, and allows processing information in both batch and online modes.
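The article itself does not include code; the following is a minimal Python sketch of the Specht-style probabilistic part of such a hybrid, with one Gaussian “pattern neuron” placed at every training sample. The class name PNNClassifier, the single kernel width sigma, and the use of NumPy are illustrative assumptions; the Takagi-Sugeno-Kang fuzzy layer and the handling of rank and nominal scales are not shown.

```python
import numpy as np

class PNNClassifier:
    """Sketch of a Specht-style probabilistic neural network: one Gaussian
    'pattern neuron' per training sample, so no iterative training is needed."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma  # kernel width shared by all pattern neurons (assumed)

    def fit(self, X, y):
        # Lazy learning: memorise the data points and their class labels.
        self.X_ = np.asarray(X, dtype=float)
        self.y_ = np.asarray(y)
        self.classes_ = np.unique(self.y_)
        return self

    def predict_proba(self, X):
        X = np.asarray(X, dtype=float)
        # Squared Euclidean distances between queries and stored patterns.
        d2 = ((X[:, None, :] - self.X_[None, :, :]) ** 2).sum(axis=2)
        k = np.exp(-d2 / (2.0 * self.sigma ** 2))  # Gaussian activations
        # Summation layer: average activation of the neurons of each class.
        scores = np.stack([k[:, self.y_ == c].mean(axis=1) for c in self.classes_],
                          axis=1)
        return scores / scores.sum(axis=1, keepdims=True)

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]
```

Because fitting only memorises the data, classification cost grows with the number of stored samples; this is the trade-off the lazy learning approach accepts in exchange for the negligible training time mentioned above.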
Highlights
Today artificial neural networks have become widespread for image recognition tasks due to their universal approximating properties
It is assumed that N1 observations belong to class Cl1, N2 to Cl2, Nj to class Clj, and Nm to the m-th class Clm
It is proposed to use a modified lazy learning procedure based on the concept of “neurons at data points” [18], [19] for tuning the probabilistic neuro-fuzzy system (PNFS); a minimal sketch of this idea is given after these highlights
The probabilistic neuro-fuzzy system works faster than the evolving fuzzy-probabilistic neural network (EFPNN), although it yields to the EFPNN in classification accuracy. It has to be taken into account that the K-nearest neighbour algorithm was run on a GPU and the probabilistic neuro-fuzzy system on a CPU, which accounts for the gap in the time spent
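To make the “neurons at data points” highlight concrete, here is a minimal sketch under the assumption that tuning reduces to storing every labelled observation as a new Gaussian pattern neuron, which also enables the online mode mentioned in the abstract. The class name OnlinePNN, the method partial_fit, and the fixed kernel width are illustrative choices, not the paper’s API.

```python
import numpy as np

class OnlinePNN:
    """Sketch of the 'neurons at data points' idea: tuning is just storing
    each labelled observation as a pattern neuron, one sample at a time."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma
        self.centers = []   # one Gaussian neuron per stored data point
        self.labels = []

    def partial_fit(self, x, label):
        # Lazy learning step: no weight optimisation, just add a neuron.
        self.centers.append(np.asarray(x, dtype=float))
        self.labels.append(label)
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        C = np.vstack(self.centers)
        y = np.asarray(self.labels)
        act = np.exp(-((C - x) ** 2).sum(axis=1) / (2.0 * self.sigma ** 2))
        classes = np.unique(y)
        # Average activation per class plays the role of the summation layer.
        votes = np.array([act[y == c].mean() for c in classes])
        return classes[np.argmax(votes)]
```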
Summary
On the zero (receptor) layer of the system, vectors of observations are sequentially fed. They form a training dataset of the form X = {x(1), x(2), ..., x(k), ..., x(N)}, where x(k) = (x1(k), x2(k), ..., xi(k), ..., xn(k))^T ∈ R^n.
Keywords – Lazy learning; membership function; neural network; neuro-fuzzy system; probabilistic pattern recognition
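As a purely illustrative aside, the training dataset described above can be represented as an N × n array assembled from sequentially arriving observation vectors; the dimensions below are arbitrary and are not taken from the paper.

```python
import numpy as np

n = 4          # dimensionality of one observation x(k) ∈ R^n (arbitrary choice)
N = 6          # number of observations in the training set (arbitrary choice)

rng = np.random.default_rng(0)
dataset = []                              # X = {x(1), x(2), ..., x(N)}
for k in range(N):
    x_k = rng.normal(size=n)              # x(k) = (x_1(k), ..., x_n(k))^T arrives
    dataset.append(x_k)                   # fed sequentially to the receptor layer
X = np.vstack(dataset)                    # shape (N, n): one observation per row
print(X.shape)                            # -> (6, 4)
```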