Title: Memory capacity bound and threshold optimization in recurrent neural network with variable hysteresis threshold
Authors: Kenji Nakayama, Katsuaki Nishimura, Hiroshi Katayama
Language: English
Keywords: Recurrent neural networks; Threshold optimization; Variable hysteresis threshold; Neural networks
Type: Conference Paper
Published in: Proceedings of the International Joint Conference on Neural Networks, p. 3260, 1993-10-01
Publisher: IEEE (Institute of Electrical and Electronics Engineers)
Handle: http://hdl.handle.net/2297/6797
Full text (publisher version): https://kanazawa-u.repo.nii.ac.jp/?action=repository_action_common_download&item_id=7600&item_no=1&attribute_id=26&file_no=1 (2017-10-03)

Abstract: The authors have previously proposed an asymmetrical associative neural network (NN) using a variable hysteresis threshold, together with its learning and association algorithms. It drastically improves noise performance, i.e., insensitivity to noise. In this paper, the memory capacity bound and threshold optimization of this associative NN are discussed further, using binary random patterns. First, the relation between the number of stored patterns and the number of association iterations is investigated. The iteration count grows gradually up to a certain number of patterns and then increases sharply; this is a very peculiar phenomenon, and the turning point gives the memory capacity bound, about 1.56N, where N is the number of units. Next, threshold optimization is discussed: the relation between the threshold and noise performance, and the effect of the connection-weight distribution on noise performance, are analyzed theoretically. Based on these results, the ratio of the step size to the threshold is optimized to 0.5/(NP - 1), where NP is the number of units in a pattern. Statistical simulations demonstrate the efficiency of the proposed methods.
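The abstract does not give the authors' asymmetric learning rule, so the following is only a minimal sketch of recurrent associative recall with a relaxing hysteresis threshold, under stated assumptions: standard Hebbian (autocorrelation) weights stand in for the paper's learning algorithm, the threshold starts at a hypothetical value `theta0` and is lowered by a fixed step each iteration, and the step/threshold ratio 0.5/(NP - 1) from the abstract is applied with NP read as the pattern length N. All names and parameter values here are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64   # number of units
P = 5    # stored patterns (well below the ~1.56N bound reported in the paper)
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian autocorrelation weights: a stand-in for the authors'
# (unspecified) asymmetric learning rule.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(probe, theta0=1.0, np_units=N, max_iters=100):
    """Recurrent recall with a hysteresis threshold.

    A unit flips only when its net input exceeds the current threshold
    in the direction opposite to its present state; otherwise hysteresis
    keeps it where it is.  The threshold relaxes each iteration by
    step = theta0 * 0.5 / (np_units - 1), the ratio quoted in the
    abstract (the relaxation schedule itself is a hypothetical reading).
    """
    step = theta0 * 0.5 / (np_units - 1)
    x = probe.copy()
    theta = theta0
    for it in range(max_iters):
        u = W @ x
        new = np.where(u > theta, 1.0, np.where(u < -theta, -1.0, x))
        if np.array_equal(new, x):          # fixed point reached
            return new, it
        x = new
        theta = max(theta - step, 0.0)      # relax the threshold
    return x, max_iters

# Flip 10% of the bits of pattern 0 and try to recover it.
noisy = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
noisy[flip] *= -1.0
out, iters = recall(noisy)
overlap = float(out @ patterns[0]) / N      # 1.0 means perfect recall
```

With only a few stored patterns, the relaxing threshold lets the corrupted bits flip back once the threshold drops below their net-input magnitude, while hysteresis protects the bits that already agree with the attractor.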