Unsupervised learning by probabilistic latent semantic analysis

Author: Hofmann, T

Affiliation: Brown Univ, Dept Comp Sci, Providence, RI 02912, USA

Published in: MACHINE LEARNING

Year/Volume/Issue: 2001, Vol. 42, No. 1-2

Pages: 177-196

Subject classification: 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees awardable in Engineering or Science)]

Keywords: unsupervised learning; latent class models; mixture models; dimension reduction; EM algorithm; information retrieval; natural language processing; language modeling

Abstract: This paper presents a novel statistical method for factor analysis of binary and count data which is closely related to a technique known as Latent Semantic Analysis. In contrast to the latter method, which stems from linear algebra and performs a Singular Value Decomposition of co-occurrence tables, the proposed technique uses a generative latent class model to perform a probabilistic mixture decomposition. This results in a more principled approach with a solid foundation in statistical inference. More precisely, we propose to make use of a temperature controlled version of the Expectation Maximization algorithm for model fitting, which has shown excellent performance in practice. Probabilistic Latent Semantic Analysis has many applications, most prominently in information retrieval, natural language processing, machine learning from text, and in related areas. The paper presents perplexity results for different types of text and linguistic data collections and discusses an application in automated document indexing. The experiments indicate substantial and consistent improvements of the probabilistic method over standard Latent Semantic Analysis.
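To make the abstract's description concrete, the sketch below is a minimal, illustrative implementation of pLSA fitted with a tempered EM loop. It assumes a dense document-term count matrix as input; the parameter names (n_topics, beta, n_iter) and the simplified tempering (raising the whole joint term to the power beta in the E-step) are assumptions for illustration, not the paper's reference procedure.

```python
import numpy as np

def plsa_tempered_em(counts, n_topics=10, beta=0.9, n_iter=50, seed=0):
    """Fit P(z), P(d|z), P(w|z) on a (docs x words) count matrix by tempered EM."""
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape

    # Random initialisation, normalised so each distribution sums to 1.
    p_z = np.full(n_topics, 1.0 / n_topics)          # P(z)
    p_d_z = rng.random((n_topics, n_docs))           # P(d|z)
    p_d_z /= p_d_z.sum(axis=1, keepdims=True)
    p_w_z = rng.random((n_topics, n_words))          # P(w|z)
    p_w_z /= p_w_z.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # E-step (simplified tempering): P(z|d,w) ∝ [P(z) P(d|z) P(w|z)]^beta.
        joint = (p_z[:, None, None] * p_d_z[:, :, None] * p_w_z[:, None, :]) ** beta
        joint /= joint.sum(axis=0, keepdims=True) + 1e-12   # normalise over z

        # M-step: re-estimate parameters from expected counts n(d,w) * P(z|d,w).
        weighted = joint * counts[None, :, :]                # shape (z, d, w)
        p_d_z = weighted.sum(axis=2)
        p_d_z /= p_d_z.sum(axis=1, keepdims=True) + 1e-12
        p_w_z = weighted.sum(axis=1)
        p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
        p_z = weighted.sum(axis=(1, 2))
        p_z /= p_z.sum()

    return p_z, p_d_z, p_w_z

# Example usage on a tiny toy count matrix (hypothetical data).
counts = np.array([[2, 0, 1], [0, 3, 1], [1, 1, 0]], dtype=float)
p_z, p_d_z, p_w_z = plsa_tempered_em(counts, n_topics=2)
```

Setting beta = 1 recovers plain EM; values below 1 flatten the E-step posteriors, which is the regularising effect the abstract refers to as temperature control.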
