
Using Kullback-Leibler divergence to predict on artificial neural network

Authors: Ismail, I.A.; Nabil, Tamer

Affiliations: Computer Science Dept., Faculty of Computer and Informatics, Zagazig University, Zagazig, Egypt; Basic Science Dept., Faculty of Computer and Informatics, Suez Canal University, Ismailia, Egypt

Published in: Neural Network World

Year/Volume/Issue: 2004, Vol. 14, No. 6

Pages: 507–519


Subject classification: 08 [Engineering]; 0835 [Engineering – Software Engineering]; 0714 [Science – Statistics (degrees awardable in Science or Economics)]; 0701 [Science – Mathematics]; 0812 [Engineering – Computer Science and Technology (degrees awardable in Engineering or Science)]

Keywords: Neural networks

Abstract: Several algorithms have been developed for time series forecasting. In this paper, we develop a type of algorithm that uses numerical methods to optimize an objective function: the Kullback-Leibler divergence between the joint probability density function of a time series x1, x2, ..., xn and the product of its marginal distributions. The Gram-Charlier expansion is used to estimate these distributions. Using the weights obtained by the neural network, and adding to them the Kullback-Leibler divergence of these weights, we obtain new weights that are used to forecast the new value of xn+k.
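The central quantity in the abstract, the Kullback-Leibler divergence between a joint density and the product of its marginals, is the mutual information of the series with its lagged copy; it is zero exactly when the lagged samples are independent. The sketch below illustrates that quantity with a simple histogram plug-in estimator (not the paper's Gram-Charlier-based estimator; the function name, bin count, and lag are illustrative choices):

```python
import numpy as np

def kl_joint_vs_marginals(x, lag=1, bins=10):
    """Histogram estimate of KL( p(x_t, x_{t+lag}) || p(x_t) p(x_{t+lag}) ).

    A value near 0 suggests the lagged samples are close to independent;
    larger values indicate temporal dependence the forecaster can exploit.
    """
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    joint = joint / joint.sum()          # empirical joint probabilities
    px = joint.sum(axis=1)               # marginal of x_t
    py = joint.sum(axis=0)               # marginal of x_{t+lag}
    prod = np.outer(px, py)              # independence model
    mask = joint > 0                     # convention: 0 * log 0 = 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / prod[mask])))

rng = np.random.default_rng(0)
white = rng.normal(size=5000)            # i.i.d. noise: lags nearly independent
ar = np.zeros(5000)
for t in range(1, 5000):                 # AR(1): strong lag-1 dependence
    ar[t] = 0.9 * ar[t - 1] + rng.normal()

print(kl_joint_vs_marginals(white))      # small (only estimation bias)
print(kl_joint_vs_marginals(ar))         # clearly larger
```

Because the marginals are computed from the empirical joint itself, the estimate is always nonnegative, mirroring the property that makes this divergence usable as an objective function.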
