Abstract

This paper proposes a Ratio-Memory Cellular Neural Network (RMCNN) that combines self-feedback with a modified Hebbian learning algorithm. The learnable RMCNN architecture was designed and realized in CMOS technology for associative-memory neural network applications. The proposed system can learn exemplar patterns and correctly recognize the output patterns. For all test input exemplar patterns, only the self-output pixel value is used in the A template, and the B template weights are updated from the nearest five neighboring elements. The learned ratio weights of the B template are generated by dividing each captured weight by the summation of the absolute coefficients, which enhances the features of the recognized pattern. Simulation results show that the system can learn exemplar patterns corrupted by noise and recognize the correct pattern. A 9×9 RMCNN with self-feedback and the modified Hebbian learning algorithm was implemented and verified in CMOS circuits using TSMC 0.25 µm 1P5M VLSI technology. The proposed RMCNN offers greater learning and recognition capability for variant exemplar patterns in auto-associative memory applications.

Highlights

  • The Cellular Neural Network (CNN), introduced by Chua and Yang [1, 2], is characterized by locally connected neighboring cells

  • The five weights in the B template of the Ratio-Memory Cellular Neural Network (RMCNN) are generated and updated from the exemplar patterns: each weight is ratioed against the absolute summation over the neighboring cells, and the resulting ratio weights are stored in the ratio memory

  • Simulation results verify the function of an 18×18 RMCNN: the system learned 8 patterns of the character A corrupted by white-black noise and correctly recognized the desired pattern
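The ratio-weight generation described above (Hebbian correlation followed by normalization against the absolute summation over the neighborhood) can be sketched in software. This is a minimal illustration, not the paper's circuit implementation: the function name `learn_ratio_weights` is hypothetical, patterns are assumed bipolar (+1/−1), and the four nearest neighbors are used here with the fifth (self-feedback) weight handled separately.

```python
import numpy as np

def learn_ratio_weights(patterns):
    """Learn B-template ratio weights from bipolar (+1/-1) exemplar patterns:
    accumulate Hebbian correlations with each neighbor, then divide each
    cell's weights by the sum of their absolute values (the 'ratio memory')."""
    n_rows, n_cols = patterns[0].shape
    # 4-neighborhood offsets (up, down, left, right); self-feedback is the
    # fifth weight in the paper and is treated as a separate term
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    w = np.zeros((n_rows, n_cols, len(offsets)))
    for p in patterns:
        for i in range(n_rows):
            for j in range(n_cols):
                for k, (di, dj) in enumerate(offsets):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n_rows and 0 <= nj < n_cols:
                        # Hebbian correlation between the cell and its neighbor
                        w[i, j, k] += p[i, j] * p[ni, nj]
    # Ratio step: normalize by the absolute summation over the neighborhood
    denom = np.abs(w).sum(axis=2, keepdims=True)
    denom[denom == 0] = 1.0  # avoid division by zero for untrained cells
    return w / denom
```

After the ratio step, the absolute values of each cell's neighbor weights sum to one, which is what lets the stronger correlations dominate recognition as weaker ones decay.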


Summary

INTRODUCTION

The Cellular Neural Network (CNN), introduced by Chua and Yang [1, 2], is characterized by locally connected neighboring cells. Most neural networks designed for associative memory store the processed patterns as local minima of an associative energy function. The Grossberg outstar structure incorporates the ratio memory (RM) to implement the template weights used in the neural network for various image-processing tasks. The capability and function of the 18×18 RMCNN architecture, with the embedded ratio-memory B template coupled to the modified Hebbian learning algorithm, are shown and analysed. A CNN including the ratio memory, in which the RM and SRM blocks realize the weights connecting from the neighboring cells and the self-cell, respectively, is shown in Fig. The cell outputs are gradually adjusted by the learned ratio weights toward one of the features of the training patterns. At the end of the learning period, the stored weight voltage can be written as V_zijkl.
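The recall behavior described above (cell outputs gradually adjusted by the learned ratio weights plus self-feedback until they settle on a stored pattern) can be sketched as a simple discrete-time iteration. This is an illustrative sketch under assumptions not stated in the source: the function name `recall` is hypothetical, a hard-limiting sign output replaces the CNN's analog dynamics, and `ratio_w` holds per-cell weights for the four nearest neighbors.

```python
import numpy as np

def recall(ratio_w, self_w, noisy, steps=20):
    """Iteratively drive cell outputs toward a stored pattern: each cell sums
    its neighbors' outputs through the learned ratio weights plus a
    self-feedback term, then applies a hard-limiting output function."""
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    y = noisy.astype(float).copy()
    n_rows, n_cols = y.shape
    for _ in range(steps):
        new_y = np.zeros_like(y)
        for i in range(n_rows):
            for j in range(n_cols):
                s = self_w * y[i, j]  # self-feedback contribution
                for k, (di, dj) in enumerate(offsets):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n_rows and 0 <= nj < n_cols:
                        s += ratio_w[i, j, k] * y[ni, nj]
                # hard-limited output; hold the previous value on a tie
                new_y[i, j] = np.sign(s) if s != 0 else y[i, j]
        y = new_y
    return y
```

With weights learned from a stored pattern, a noisy input with a few flipped pixels is pulled back toward that pattern over successive iterations.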
CONCLUSION
