
Enlarging the margins in perceptron decision trees

Authors: Bennett, K. P.; Cristianini, N.; Shawe-Taylor, J.; Wu, D. H.

Author affiliations: Rensselaer Polytechnic Institute, Dept. of Mathematical Sciences, Troy, NY 12180, USA; University of London, Royal Holloway & Bedford New College, Dept. of Computer Science, Egham TW20 0EX, Surrey, England

Published in: MACHINE LEARNING

Year/Volume/Issue: 2000, Vol. 41, No. 3

Pages: 295-313

Subject classification: 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees conferrable in engineering or science)]

Keywords: capacity control; decision trees; perceptron; learning theory; learning algorithm

Abstract: Capacity control in perceptron decision trees is typically performed by controlling their size. We prove that other quantities can be just as relevant for reducing their flexibility and combating overfitting. In particular, we provide an upper bound on the generalization error that depends both on the size of the tree and on the margins of the decision nodes, so enlarging the margins in a perceptron decision tree reduces the upper bound on its generalization error. Based on this analysis, we introduce three new algorithms that induce large-margin perceptron decision trees. To assess the effect of the large-margin bias, OC1 (Journal of Artificial Intelligence Research, 1994, 2, 1-32), the well-known system of Murthy, Kasif and Salzberg for inducing perceptron decision trees, is used as the baseline algorithm. An extensive experimental study on real-world data showed that, with only one exception, all three new algorithms performed better than, or at least not significantly worse than, OC1 on every dataset, and OC1 performed worse than the best margin-based method on every dataset.
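The abstract's central idea is that each internal node of a perceptron decision tree is a linear separator, and biasing training toward separators with a larger margin tightens the generalization bound. As a minimal illustration (not the authors' actual algorithms), the sketch below trains a single node with a margin-driven perceptron rule: the weights are updated not only on misclassification but whenever a point falls inside a desired margin band. The function name `margin_perceptron` and all parameter values are hypothetical choices for this example.

```python
import numpy as np

def margin_perceptron(X, y, margin=0.5, lr=0.1, epochs=200):
    """Train a linear separator with a large-margin bias.

    Unlike the classic perceptron, which updates only on a
    misclassification (y * score < 0), this variant updates whenever
    y * score < margin, pushing the hyperplane away from the data.
    This is a hypothetical sketch of the large-margin bias the paper
    analyses, not the paper's own induction algorithms.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < margin:  # inside margin band or wrong side
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data with labels in {+1, -1}.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])

w, b = margin_perceptron(X, y)
preds = np.sign(X @ w + b)
```

In a full perceptron decision tree, this large-margin split would be applied recursively: the data reaching a node is divided by the node's hyperplane and each half is passed to a child node, with the tree's generalization bound depending on both its size and the margins achieved at the nodes.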
