ISBN: (Print) 9780769535579
A new pattern recognition algorithm based on an RBF neural network and the Monkey-King genetic algorithm (MK-RBFNN) is presented. The algorithm consists of two parts. In the first part, the Monkey-King genetic algorithm is introduced to determine the initial positions and the number of hidden-layer centers in the RBF network. In the second part, the RBF neural network is set up to perform pattern recognition by learning and training on the input sample data. Finally, experiments are conducted on datasets such as Iris and WINES, which show that the proposed algorithm has higher recognition ability than conventional methods.
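Once the hidden-layer centers are fixed (the abstract uses the Monkey-King genetic algorithm for that step; here they are simply supplied as an argument), the second stage of such an RBF classifier reduces to a linear least-squares fit of the output weights. A minimal sketch, with illustrative function names, not the paper's implementation:

```python
import numpy as np

def rbf_features(X, centers, sigma=1.0):
    # Gaussian RBF activations: one hidden unit per center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_rbf(X, y_onehot, centers, sigma=1.0):
    # With centers fixed, the output weights follow from least squares.
    H = rbf_features(X, centers, sigma)
    W, *_ = np.linalg.lstsq(H, y_onehot, rcond=None)
    return W

def predict(X, centers, W, sigma=1.0):
    # Class scores; argmax over columns gives the predicted label.
    return rbf_features(X, centers, sigma) @ W
```

In the paper's scheme the genetic search would replace a heuristic such as k-means for choosing `centers`; the training and prediction steps stay the same.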
ISBN: (Print) 9781424441990
In many areas of pattern recognition and machine learning, low-dimensional data are often embedded in a high-dimensional space. Many dimensionality reduction and manifold learning methods have been proposed to discover the low-dimensional representation of high-dimensional data. Locality-based manifold learning methods often rely on a distance metric between neighboring points. In this paper, we propose a new distance metric named relative distance, which is learned from the data and better reflects the relative density. Combining the relative distance with Laplacian Eigenmaps (LE), we obtain a new algorithm called Relative Distance-based Laplacian Eigenmaps (RDLE) for nonlinear dimensionality reduction. Based on two different definitions of the relative distance, we give two variations of RDLE. For efficient projection of out-of-sample data, we also present the linear version of RDLE, LRDLE. Experimental results on toy problems and real-world data demonstrate the effectiveness of our methods.
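The locality-based pipeline the abstract builds on — form a k-nearest-neighbor graph under some distance metric, weight it, and embed via the graph Laplacian — can be sketched as follows. This is standard Laplacian Eigenmaps, not RDLE itself; the `dist` argument marks the point where a learned relative distance would be substituted:

```python
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=5, dim=2, dist=None):
    # dist: optional function returning an n-by-n pairwise distance matrix.
    # Plugging a learned "relative distance" in here is where an RDLE-style
    # variant would differ from the Euclidean default (illustrative hook).
    n = X.shape[0]
    D = dist(X) if dist is not None else np.sqrt(
        ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:n_neighbors + 1]  # k nearest neighbors
        W[i, idx] = np.exp(-D[i, idx] ** 2)        # heat-kernel weights
    W = np.maximum(W, W.T)                         # symmetrize the graph
    deg = W.sum(axis=1)
    L = np.diag(deg) - W                           # graph Laplacian
    d_is = 1.0 / np.sqrt(deg)                      # normalized form D^-1/2 L D^-1/2
    vals, vecs = np.linalg.eigh(d_is[:, None] * L * d_is[None, :])
    return (d_is[:, None] * vecs)[:, 1:dim + 1]    # drop trivial eigenvector
```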
ISBN: (Print) 9781424441990
Various manifold learning methods have been proposed to capture the intrinsic characteristics of nonlinear data. However, when confronting highly nonlinear data sets, existing algorithms may fail to discover the correct inner structure. In this paper, we propose a new locality-based manifold learning method, Neighborhood Balance Embedding. The proposed method shares the 'neighborhood preserving' property of other manifold learning methods, but it describes the local structure differently, making each neighborhood behave like a rigid ball and thus preventing the overlapping phenomenon that often occurs with highly nonlinear data. Experimental results on data sets with high nonlinearity show the good performance of the proposed method.
ISBN: (Print) 9781424441990
In this paper, we propose to kernelize linear learning machines, e.g., PCA and LDA, in the empirical kernel feature space, a finite-dimensional embedding space in which the distances between data points in the kernel feature space are preserved. The empirical kernel feature space provides a unified framework for the kernelization of all kinds of linear machines: by performing a linear machine in the finite-dimensional empirical feature space, its nonlinear kernel machine is established in the original input data space. This method differs from conventional kernel-trick-based kernelization, and more importantly, the final nonlinear kernel machines, called empirical kernel machines, are shown to be more efficient in many real-world applications, such as face recognition and facial expression recognition, than kernel-trick-based kernel machines.
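The empirical kernel feature space can be constructed explicitly: eigendecompose the n-by-n training kernel matrix and map any point through the resulting finite-dimensional embedding, in which pairwise inner products reproduce the kernel values; a linear machine such as PCA can then be run on these features directly. A minimal sketch with an RBF kernel (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def empirical_kernel_map(X_train, gamma=1.0, tol=1e-10):
    # Eigendecompose K = U diag(vals) U^T and keep the positive spectrum;
    # the empirical map is phi(x) = diag(vals)^-1/2 U^T k(x, X_train),
    # so that phi(x_i) . phi(x_j) recovers K_ij on the training set.
    K = rbf_kernel(X_train, X_train, gamma)
    vals, U = np.linalg.eigh(K)
    keep = vals > tol
    P = U[:, keep] / np.sqrt(vals[keep])   # n-by-r projection matrix
    def phi(X):
        return rbf_kernel(X, X_train, gamma) @ P
    return phi
```

Any off-the-shelf linear PCA or LDA applied to `phi(X)` then yields the corresponding empirical kernel machine.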
ISBN: (Print) 9781424441990
In this paper, we propose a novel supervised learning method called Global Sparse Representation Projections (GSRP) for linear dimensionality reduction. GSRP can be viewed as a combination of sparse representation and manifold learning. Unlike recent manifold learning methods such as Locality Preserving Projections (LPP), GSRP introduces global sparse representation information into the objective function. Since sparse representation can implicitly capture the "local" structure of the data by imposing a sparsity prior, we take advantage of this property to characterize the local structure. By combining the local interclass neighborhood relationship with sparse representation information, GSRP aims to preserve the sparse reconstructive relationship of the data while simultaneously maximizing interclass separability. Comprehensive comparisons and extensive experiments show that GSRP achieves higher recognition rates than state-of-the-art techniques such as LPP and Sparsity Preserving Projections (SPP).
ISBN: (Print) 9781424441990
Dimensionality reduction has been demonstrated to be an effective way to perform feature extraction in pattern recognition tasks. In this paper, a new manifold learning algorithm, Local Discriminant Space Alignment (LDSA), is developed for nonlinear dimensionality reduction. In LDSA, the discriminant structure and the local geometry of the data manifold are learned by constructing a local space for each data point through local discriminant analysis; these discriminant subspaces are then aligned to give the internal global coordinates of the data points with respect to the underlying manifold. To solve the out-of-sample problem, a linearization of LDSA (LLDSA) is also proposed and applied to face recognition. Experimental results on the ORL and Yale databases show the effectiveness of LLDSA in comparison with existing dimensionality reduction algorithms designed for feature extraction.
ISBN: (Print) 9780769535579
Malware detection is an important application of data mining. Most previously developed sequential pattern mining methods are Apriori-like and still encounter problems when a sequence database is large and/or when the sequential patterns to be mined are numerous and/or long. We therefore use a novel sequential pattern mining method, called PrefixSpan*, which uses a brief projected database instead of the projected database of PrefixSpan. In this paper, we propose a behavior-based detection system that combines data mining and expert system techniques to detect malware on our hosts. The PrefixSpan* algorithm mines association rules from the malware behavior sequence database to form a malware behavior pattern database; the expert system matches facts against rules and gives the final result. To verify the correctness and effectiveness of our algorithm, we test and analyze some samples in the experiment section.
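For orientation, classic PrefixSpan grows patterns by recursively projecting the database onto each frequent prefix; the paper's PrefixSpan* replaces these physical projected databases with brief ones, a refinement not reproduced here. A sketch of the standard algorithm over item sequences:

```python
def prefixspan(db, min_support):
    """Mine frequent sequential patterns; db is a list of item sequences."""
    results = []

    def mine(prefix, projected):
        # Count, per sequence, the items occurring in the suffix database.
        counts = {}
        for seq in projected:
            for item in set(seq):
                counts[item] = counts.get(item, 0) + 1
        for item, cnt in sorted(counts.items()):
            if cnt >= min_support:
                new_prefix = prefix + [item]
                results.append((new_prefix, cnt))
                # Project each sequence onto the suffix after the first
                # occurrence of the item, then recurse on the longer prefix.
                new_db = [seq[seq.index(item) + 1:]
                          for seq in projected if item in seq]
                mine(new_prefix, new_db)

    mine([], db)
    return results
```

Applied to malware behavior sequences (e.g. lists of API-call names), the mined patterns would populate the behavior pattern database the abstract describes.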
ISBN: (Print) 9781424441990
In this paper, a nonparametric histogram-based Fisher information embedding method is presented for clustering and visualizing data sets with non-Euclidean geometric structures. It rests on the assumption that each data set is drawn from a probability distribution that can be characterized by a probability density function (PDF) lying on a statistical manifold. From the viewpoint of information geometry, the dissimilarities between data sets can be quantified by the corresponding geodesic distances on the statistical manifold. Our method converts data distribution information into a discrete simplex representation, so that a similarity measure can be adopted from simplex geometry; clustering or visualization of the original manifolds can then be performed in a low-dimensional Euclidean subspace via manifold learning. Experiments on clustering Swiss Roll and S-curve submanifolds and on reconstructing a sphere demonstrate the effectiveness of our method.
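For multinomial histograms the Fisher information geodesic has a closed form: the map p → √p places the probability simplex on a sphere, where the geodesic distance is an arc length through the Bhattacharyya coefficient. A sketch of this dissimilarity between histograms (the paper's full embedding pipeline is not reproduced):

```python
import numpy as np

def fisher_rao_distance(p, q):
    # Fisher-Rao geodesic distance between two multinomial distributions:
    # 2 * arccos(sum_i sqrt(p_i * q_i)), i.e. twice the arc length between
    # sqrt(p) and sqrt(q) on the unit sphere.
    p = np.asarray(p, float); q = np.asarray(q, float)
    p = p / p.sum(); q = q / q.sum()
    bc = np.clip(np.sqrt(p * q).sum(), 0.0, 1.0)  # Bhattacharyya coefficient
    return 2.0 * np.arccos(bc)

def histogram_pdf(x, bins, range_):
    # Nonparametric histogram estimate of a 1-D sample's PDF.
    h, _ = np.histogram(x, bins=bins, range=range_)
    return h / h.sum()
```

A pairwise matrix of such distances between the per-data-set histograms could then feed a standard manifold learning step for the low-dimensional visualization.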
ISBN: (Print) 9780769535579
The Decision Support System (DSS) applies various data and models through a Human-Machine Interface (HMI) to assist decision-makers at every level in reaching scientific decisions. DSS originated in the 1970s but has seen rapid growth. Its form, development, and future growth are briefly introduced in this paper.
ISBN: (Print) 9781424441990
Based on UDP and MFA, we propose a new unsupervised feature extraction algorithm, LMP (Local Marginal Projection), which is built on locality. It measures non-local quantities by the nearest samples between two local neighborhoods. The goal of LMP is to find a projection that maximizes the distance between samples in different local neighborhoods relative to samples in the same neighborhood, in which case the data can be easily projected into a low-dimensional space. Moreover, this projection can handle nonlinear, high-dimensional problems. Experiments on the ORL and Yale face databases show that the LMP algorithm can describe high-dimensional data and can embed the nonlinear Swiss-Hole data into a low-dimensional space with a reasonable visual effect.