ISBN (print): 0780348605
The geometric learning algorithm (GLA) was proposed as an application of the affine projection algorithm (APA) for adaptive filters to the perceptron. In the GLA, the connection weight vector w(n) is updated perpendicularly toward the orthogonal complement of k patterns. The GLA exhibits a distinctive behavior when the learning rate lambda is 2: w(n) and w(n+1) are symmetric with respect to the complement. In this paper, the GLA with lambda = 2 is therefore distinguished as the "symmetric learning algorithm (SLA)", and the convergence properties of the SLA are analyzed. The convergence condition relating the order k of the SLA, the number P of patterns, and the dimension N of the patterns is derived theoretically. It is proved that k < N is a necessary condition for convergence when P >= 2N. The relation between k and the learning speed is also analyzed theoretically; it is shown that the maximum average learning speed is obtained when k = N/2. These properties are supported by computer simulations. Furthermore, the quality of the solution obtained by the SLA is investigated through computer simulation: changing the order k makes little difference in the quality of the solution.
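The update rule described above can be sketched numerically. The following is a minimal illustration, not the paper's exact formulation: it assumes the GLA-style step moves w(n) toward the orthogonal complement of the span of k patterns by subtracting lambda times the projection of w(n) onto that span, so that lambda = 2 (the SLA case) reflects w(n) across the complement. The function name `sla_update` and the random test data are this sketch's own.

```python
import numpy as np

def sla_update(w, X, lam=2.0):
    """One SLA/GLA-style update step (illustrative sketch, not the paper's code).

    X holds k patterns (rows) of dimension N. lam = 1 would project w exactly
    onto the orthogonal complement of span(X); lam = 2 reflects w across it,
    which is the symmetric (SLA) case, so w(n) and w(n+1) are mirror images
    with respect to the complement.
    """
    # Projection of w onto the span of the k patterns
    # (assumes the k patterns are linearly independent).
    G = X @ X.T                           # k x k Gram matrix
    p = X.T @ np.linalg.solve(G, X @ w)   # p = X^T (X X^T)^{-1} X w
    return w - lam * p

rng = np.random.default_rng(0)
N, k = 6, 3                               # dimension N, order k (here k < N)
X = rng.standard_normal((k, N))           # k patterns of dimension N
w = rng.standard_normal(N)                # connection weight vector w(n)

w1 = sla_update(w, X, lam=2.0)            # symmetric (reflection) case
```

With lam = 2 the step is a reflection, so the norm of w is preserved and the component of w lying in the span of the k patterns is exactly negated (X @ w1 == -(X @ w)), which is the symmetry the SLA name refers to.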