Abstract

Hebbian-type associative memory is characterized by its simple architecture. However, hardware implementation of Hebbian-type associative memories normally becomes complicated when a large number of patterns is stored. To simplify the interconnection values of a network, a nonlinear quantization strategy is presented. The strategy exploits the property that the interconnection values are Gaussian distributed and accordingly divides them into a small number of unequal ranges, each carrying an equal amount of information; each range is then quantized to a single value. An equation for the probability of direct convergence is derived, and it shows that nonlinearly quantized networks with a small number of ranges achieve a probability of direct convergence comparable to that of their original networks. The effects of linear and nonlinear quantization are also assessed in terms of recall capability, information capacity, and the number of bits for storing interconnection values saved by quantization. The proposed nonlinear quantization strategy outperforms linear quantization while retaining a recall capability comparable to that of the original network. The approach thus reduces the number of distinct connection weights and the chip area of a Hebbian-type associative memory while approximately retaining its performance.
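The sketch below illustrates the kind of equiprobable (nonlinear) quantization the abstract describes, under two assumptions not spelled out in the text: that the range boundaries are placed at quantiles of a Gaussian fitted to the weights so each range holds an equal share of the distribution, and that each range is represented by its conditional mean under that Gaussian. The function name `nonlinear_quantize` and the choice of representative value are illustrative; the paper may use a different representative per range.

```python
import numpy as np
from scipy import stats

def nonlinear_quantize(weights, n_ranges=4):
    """Quantize roughly Gaussian weights into equiprobable ranges.

    Sketch only: boundaries come from the inverse CDF of a Gaussian
    fitted to the weights, so each range carries an equal probability
    mass (equal information); each range maps to one value.
    """
    w = np.asarray(weights, dtype=float)
    mu, sigma = w.mean(), w.std()

    # Equiprobable range boundaries from the inverse Gaussian CDF
    # (first and last edges are -inf and +inf).
    p = np.linspace(0.0, 1.0, n_ranges + 1)
    edges = stats.norm.ppf(p, loc=mu, scale=sigma)

    # Representative value per range: conditional mean of the fitted
    # Gaussian, E[X | a < X < b] = mu + sigma*(phi(a') - phi(b')) / (1/n),
    # where a', b' are standardized edges and each range has mass 1/n.
    a = (edges[:-1] - mu) / sigma
    b = (edges[1:] - mu) / sigma
    phi = stats.norm.pdf
    levels = mu + sigma * (phi(a) - phi(b)) * n_ranges

    # Assign each weight to its range and replace it with that level.
    idx = np.clip(np.searchsorted(edges, w, side="right") - 1,
                  0, n_ranges - 1)
    return levels[idx], edges, levels

# Example: quantize synthetic Gaussian-like interconnection weights
# into 4 ranges, i.e. 2 bits per stored weight value.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 1.0, size=1000)
quantized, edges, levels = nonlinear_quantize(weights, n_ranges=4)
```

With `n_ranges` levels, each interconnection value needs only `log2(n_ranges)` bits instead of a full-precision word, which is the storage saving the abstract refers to; the unequal range widths (narrow near the mean, wide in the tails) are what distinguishes this scheme from linear quantization.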
