Title: An incremental self-organizing neural network based on enhanced competitive Hebbian learning
Authors: Hao Liu, Masahito Kurihara, Satoshi Oyama, Haruhiko Sato

Abstract: Self-organizing neural networks are important tools for unsupervised learning. A difficult open problem is incremental, efficient, and robust learning in noisy environments, where most existing techniques perform poorly. In this paper, we first propose a new topology-generating method called enhanced competitive Hebbian learning (enhanced CHL), and then propose a novel incremental self-organizing neural network based on the enhanced CHL method, called enhanced incremental growing neural gas (Hi-GNG). The experiments presented in this paper show that the Hi-GNG algorithm can automatically and efficiently generate a topological structure with a suitable number of neurons, and that the proposed algorithm is robust to noisy data.

Venue: The 2013 International Joint Conference on Neural Networks (IJCNN)
Publisher: IEEE (Institute of Electrical and Electronics Engineers)
Type: Conference Paper
Year: 2013
Language: English
Format: application/pdf
DOI: 10.1109/IJCNN.2013.6706725
ISBN: 978-1-4673-6129-3
Handle: http://hdl.handle.net/2115/65582
Full text (PDF): https://eprints.lib.hokudai.ac.jp/dspace/bitstream/2115/65582/1/ijcnn2013.pdf
Rights: © 2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
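The abstract names but does not describe the enhanced CHL method. As background only, standard competitive Hebbian learning (the baseline that enhanced CHL builds on) generates a topology by connecting, for each input sample, the two units nearest to it. A minimal sketch of that baseline, assuming Euclidean distance and a fixed set of unit positions; all names and the toy data are illustrative, not taken from the paper:

```python
import numpy as np

def competitive_hebbian_edges(units, inputs):
    """Baseline competitive Hebbian learning (not the paper's enhanced CHL):
    for each input sample, add an undirected edge between the two nearest units."""
    edges = set()
    for x in inputs:
        dists = np.linalg.norm(units - x, axis=1)   # distance to every unit
        i, j = np.argsort(dists)[:2]                # indices of the two closest units
        edges.add((min(i, j), max(i, j)))           # store edge in canonical order
    return edges

# Toy example: four units on a unit square, inputs near the bottom edge.
units = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
inputs = np.array([[0.4, 0.05], [0.6, 0.05]])
print(competitive_hebbian_edges(units, inputs))  # both inputs link units 0 and 1
```

The resulting edge set induces the topological structure on the units; GNG-style algorithms such as Hi-GNG additionally move units toward inputs, age and prune edges, and insert new units incrementally, which this sketch omits.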