Author affiliations: Univ Illinois, Dept Elect & Comp Engn, Chicago, IL 60607 USA; Univ Calif Davis, Dept Elect & Comp Engn, Davis, CA 95616 USA
Publication: 《IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS》 (IEEE Trans Circuits Syst I Fundam Theor Appl)
Year/Volume/Issue: 2002, Vol. 49, No. 12
Pages: 1876-1879
Funding: National Science Foundation (NSF), grants 97-021, 97-022, ECS-9732785, ECS-9996428
Keywords: constructive algorithm; feedforward neural networks; incremental training; linear programming; quadratic programming
Abstract: We develop, in this brief, a new constructive learning algorithm for feedforward neural networks. We employ an incremental training procedure in which training patterns are learned one by one. Our algorithm starts with a single training pattern and a single hidden-layer neuron. During the course of neural network training, when the algorithm gets stuck in a local minimum, we attempt to escape from the local minimum by using the weight scaling technique. Only after several consecutive failed attempts to escape from a local minimum do we allow the network to grow by adding a hidden-layer neuron. At this stage, we employ an optimization procedure based on quadratic/linear programming to select initial weights for the newly added neuron. Our optimization procedure tends to make the network reach the error tolerance with little or no training after adding a hidden-layer neuron. Our simulation results indicate that the present constructive algorithm can obtain neural networks very close to minimal structures (with the least possible number of hidden-layer neurons) and that convergence (to a solution) in neural network training can be guaranteed. We tested our algorithm extensively using a widely used benchmark problem, i.e., the parity problem.
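The outer loop described in the abstract (incremental pattern presentation, weight scaling to escape local minima, neuron addition only after repeated failures) can be sketched roughly as follows. This is a simplified illustration, not the paper's implementation: the gradient-descent inner loop, the fixed scaling factor of 1.2, and the random initialization of a newly added neuron are all placeholder assumptions — in the paper, the new neuron's initial weights are chosen by a quadratic/linear-programming procedure.

```python
import math
import random

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

class ConstructiveNet:
    """Single-hidden-layer sigmoid network that starts with one hidden neuron."""

    def __init__(self, n_in, rng):
        self.n_in = n_in
        self.rng = rng
        self.hidden = [self._rand_row(n_in + 1)]   # each row: n_in weights + a bias
        self.out_w = [rng.uniform(-1, 1)]          # one output weight per hidden unit
        self.out_b = rng.uniform(-1, 1)

    def _rand_row(self, n):
        return [self.rng.uniform(-1, 1) for _ in range(n)]

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row[:-1], x)) + row[-1])
             for row in self.hidden]
        y = sigmoid(sum(w * hj for w, hj in zip(self.out_w, h)) + self.out_b)
        return h, y

    def sse(self, patterns):
        return sum((t - self.forward(x)[1]) ** 2 for x, t in patterns)

    def train_epoch(self, patterns, lr=0.5):
        # Plain stochastic gradient descent (stand-in for the paper's trainer).
        for x, t in patterns:
            h, y = self.forward(x)
            dy = (y - t) * y * (1 - y)
            for j, hj in enumerate(h):
                dh = dy * self.out_w[j] * hj * (1 - hj)
                row = self.hidden[j]
                for i in range(self.n_in):
                    row[i] -= lr * dh * x[i]
                row[-1] -= lr * dh
                self.out_w[j] -= lr * dy * hj
            self.out_b -= lr * dy

    def scale_weights(self, factor):
        # Crude stand-in for the paper's weight-scaling escape technique.
        for row in self.hidden:
            for i in range(len(row)):
                row[i] *= factor

    def add_neuron(self):
        # Placeholder: the paper selects these weights via QP/LP.
        self.hidden.append(self._rand_row(self.n_in + 1))
        self.out_w.append(self.rng.uniform(-1, 1))

def constructive_train(patterns, tol=0.05, max_hidden=6, epochs=200, seed=0):
    rng = random.Random(seed)
    net = ConstructiveNet(len(patterns[0][0]), rng)
    active = []                          # incremental training: learn patterns one by one
    for p in patterns:
        active.append(p)
        failures = 0
        while net.sse(active) > tol:
            for _ in range(epochs):
                net.train_epoch(active)
            if net.sse(active) > tol:
                if failures < 3:         # first try to escape the local minimum
                    net.scale_weights(1.2)
                    failures += 1
                elif len(net.hidden) < max_hidden:
                    net.add_neuron()     # grow only after repeated failed escapes
                    failures = 0
                else:
                    break                # give up growing; structure is capped
    return net

if __name__ == "__main__":
    # 2-bit parity (XOR), the benchmark family used in the paper.
    parity = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    net = constructive_train(parity)
    print(len(net.hidden), "hidden neuron(s)")
```

The key design point mirrored here is the ordering of remedies: cheap weight scaling is tried several times before the network is allowed to grow, which is what keeps the resulting structure close to minimal.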