http://swrc.ontoware.org/ontology#InProceedings
Memory capacity bound and threshold optimization in recurrent neural network with variable hysteresis threshold
en
Recurrent neural networks
Threshold optimization
Variable hysteresis threshold
Neural networks
Nakayama Kenji
Nishimura Katsuaki
Katayama Hiroshi
Proceedings of the International Joint Conference on Neural Networks
3
2603-2606
1993-10-01
IEEE (Institute of Electrical and Electronics Engineers)
The authors have previously proposed an asymmetrical associative neural network (NN) using a variable hysteresis threshold, together with its learning and association algorithms. It can drastically improve noise performance, that is, insensitivity to noise. In this paper, the memory capacity bound and threshold optimization of this associative NN are discussed further, considering binary random patterns. First, the relation between the number of patterns and the number of iterations is investigated. The number of iterations increases gradually up to a certain number of patterns, and beyond that point it increases suddenly. This peculiar turning point gives the memory capacity bound, which is about 1.56N, where N is the number of units. Next, threshold optimization is discussed. The relation between the threshold and noise performance, and the effects of the connection-weight distribution on noise performance, are analyzed theoretically. Based on these results, the ratio of the step size to the threshold is optimized to 0.5/(NP-1), where NP is the number of units on the pattern. Numerical statistical simulations demonstrate the efficiency of the proposed methods.
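As an illustration only, the recall dynamics the abstract alludes to can be sketched as a generic associative memory whose units flip only when their input exceeds a hysteresis threshold in the direction opposite their current state. This is a minimal sketch under assumed details: the Hebbian outer-product storage rule, the synchronous update scheme, the threshold value `theta`, and all variable names (`N`, `P`, `W`, `recall`) are assumptions for demonstration, not the authors' actual learning and association algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64   # number of units (assumed value for the demo)
P = 8    # number of stored binary random patterns

# Store P random bipolar patterns with a Hebbian outer-product rule
# (a standard associative-memory setup; the paper's asymmetrical
# learning rule is not specified in the abstract).
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(x, theta, iters=50):
    """Iterate the state; a unit changes sign only when its net input
    crosses the hysteresis threshold theta against its current state."""
    x = x.copy()
    for _ in range(iters):
        h = W @ x
        new = x.copy()
        # Hysteresis: +1 -> -1 only if h < -theta; -1 -> +1 only if h > theta.
        new[(x == 1) & (h < -theta)] = -1
        new[(x == -1) & (h > theta)] = 1
        if np.array_equal(new, x):  # converged to a fixed point
            break
        x = new
    return x

# Recall from a noisy probe: pattern 0 with about 10% of its bits flipped.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
out = recall(probe, theta=0.1)
print("overlap with stored pattern:", np.mean(out == patterns[0]))
```

The hysteresis band keeps weakly driven units from flipping, which is one way a threshold of this kind can suppress noise-induced state changes during association; the abstract's optimized step-size-to-threshold ratio of 0.5/(NP-1) would fix `theta` relative to the learning step in the authors' actual scheme.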