Abstract

This paper provides a comparative study of three proposed self-organising neural network models that use forms of soft competition. Soft competition helps these networks avoid poor local minima and so provides a better representation of the data they model; the networks are also thought to be largely insensitive to initialisation conditions. The networks studied are the Deterministic Soft Competition Network (DSCN) of Yair et al., the Neural Gas network of Martinetz et al., and the Generalised Learning Vector Quantisation (GLVQ) of Pal et al. Their performance is compared with that of standard competitive networks and a Self-Organising Map over a variety of data sets. The three proposed models appear to produce enhanced results, particularly the Neural Gas network, although for the Neural Gas network and the DSCN this comes at the cost of greater computational complexity.
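To illustrate what "soft competition" means in practice, below is a minimal NumPy sketch of the rank-based update used by the Neural Gas network, in which every unit moves toward the input by an amount that decays with its distance rank, rather than only the single winning unit being updated. The function and parameter names (`neural_gas_update`, `eps`, `lam`) are illustrative, the annealing of the step size and neighbourhood range over time is omitted, and the DSCN and GLVQ employ their own, different forms of soft competition not shown here.

```python
import numpy as np

def neural_gas_update(weights, x, eps=0.05, lam=2.0):
    """One soft-competitive (Neural Gas style) update for a single input x.

    Every unit is pulled toward x, scaled by exp(-rank / lam) of its
    distance rank, instead of a hard winner-take-all update.
    """
    dists = np.linalg.norm(weights - x, axis=1)   # distance of each unit to x
    ranks = np.argsort(np.argsort(dists))         # rank 0 = closest unit
    h = np.exp(-ranks / lam)                      # soft neighbourhood factor
    return weights + eps * h[:, None] * (x - weights)

# Illustrative usage: 5 units in 2-D, one random input vector
rng = np.random.default_rng(0)
w = rng.normal(size=(5, 2))
x = rng.normal(size=2)
w = neural_gas_update(w, x)
```

Because distant units still receive a small pull toward each input, poorly initialised units are less likely to remain stranded as "dead" units, which is the intuition behind the reduced sensitivity to initialisation and the avoidance of poor local minima discussed in the paper.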
