A new 3D reconstruction algorithm based on particle swarm optimization (PSO) is proposed; to the best of our knowledge, this is the first time PSO has been applied to the multi-view reconstruction problem. The proposed algorithm designs a scheme that encodes the 3D parameters in a particle, and PSO is then used to search the solution space by optimizing a fitness function. The algorithm efficiently reaches the correct solution with less time and lower complexity. Implementation details that influence the performance of the algorithm are also discussed. Experiments and comparisons on real data are reported: the accuracy of the recovered solution is compared against existing algorithms and outperforms them.
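The abstract does not spell out the particle encoding or the fitness function; the following is a minimal sketch of the general scheme, assuming each particle holds a flat vector of 3D parameters and the caller supplies a reprojection-error style cost (the function and parameter names below are illustrative, not the authors').

```python
import numpy as np

def pso_minimize(fitness, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    """Generic PSO over a flat parameter vector.

    `fitness` maps a (dim,) vector (e.g. stacked 3D point coordinates)
    to a scalar error to be minimized.
    """
    lo, hi = bounds
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    g = pbest_val.argmin()
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Velocity update: inertia + pull toward personal best + pull toward global best.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([fitness(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest_val.argmin()
        if pbest_val[g] < gbest_val:
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    return gbest, gbest_val
```

For reconstruction, `fitness` would project the candidate 3D points through the known cameras and sum the squared distances to the observed image points.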
This paper describes a region-based retrieval system built on graph cuts and global/local features. We also use a dynamic partial function (DPF), indexing by locality-sensitive hashing (LSH), and learning feedback to improve system performance. Such a system is useful for finding a sub-object in a large image database. To obtain the sub-object from a sample image, we propose an efficient graph-cuts segmentation method that cuts out the object. The system uses the segmentation results to capture the higher-level concept of images and obtains stable and accurate results, and the feedback method is efficient. Experimental and comparison results on a general-purpose database of 5,000 images are encouraging.
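Of the listed components, LSH indexing is the most mechanical; below is a minimal sketch of random-hyperplane LSH over region feature vectors, assuming cosine similarity as the ranking measure (class and parameter names are illustrative; the paper's DPF distance and feedback loop are not reproduced here).

```python
import numpy as np
from collections import defaultdict

class RandomHyperplaneLSH:
    """Random-hyperplane LSH index over region feature vectors (cosine similarity)."""

    def __init__(self, dim, n_bits=16, seed=0):
        rng = np.random.default_rng(seed)
        self.planes = rng.normal(size=(n_bits, dim))
        self.buckets = defaultdict(list)

    def _key(self, v):
        # Each bit records which side of a random hyperplane the vector falls on.
        return tuple((self.planes @ v > 0).astype(int))

    def add(self, image_id, feature):
        self.buckets[self._key(feature)].append((image_id, feature))

    def query(self, feature, top_k=10):
        # Only candidates in the same bucket are ranked exactly.
        cands = self.buckets.get(self._key(feature), [])
        scored = sorted(
            cands,
            key=lambda item: -np.dot(item[1], feature)
            / (np.linalg.norm(item[1]) * np.linalg.norm(feature) + 1e-12),
        )
        return [img for img, _ in scored[:top_k]]
```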
In designing a focused crawler, the choice of strategy for prioritizing unvisited URLs is vital. The text surrounding a link, i.e., the link context on the HTML page, is a good summary of the target page. This paper investigates several alternative methods and argues that link context derived from the referring page's HTML tag tree provides strong guidance for domain-specific web resource discovery when combined with an SVM classifier with uneven margins, which is particularly helpful for small training datasets. Little work has been done to exploit the link-context information around the seed URLs. So that the crawler can gather enough of this context, we first locate referring pages by traversing backward from the seed URLs; the method collects these resources beforehand and then uses them to steer resource discovery. A comprehensive experiment was conducted using multiple crawls over 10 topics covering thousands of pages, allowing us to derive statistically strong results.
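A rough sketch of the two steps the abstract describes, extracting link context from a page's tag tree and ranking frontier URLs by an SVM score; it uses a plain linear SVM as a stand-in (the uneven-margins variant is not part of common libraries) and treats the anchor's parent element as a cheap approximation of the tag-tree-derived context.

```python
from bs4 import BeautifulSoup  # assumes beautifulsoup4 is installed

def link_contexts(html, window=20):
    """Return (href, context) pairs, with context drawn from the anchor's parent node."""
    soup = BeautifulSoup(html, "html.parser")
    pairs = []
    for a in soup.find_all("a", href=True):
        parent_text = a.parent.get_text(" ", strip=True)
        words = parent_text.split()
        pairs.append((a["href"], " ".join(words[:window])))
    return pairs

def prioritize(frontier, contexts, vectorizer, svm):
    """Order unvisited URLs by the SVM decision value of their link context.

    `vectorizer` and `svm` are a fitted text vectorizer and linear SVM,
    e.g. scikit-learn's TfidfVectorizer and LinearSVC.
    """
    scores = svm.decision_function(vectorizer.transform(contexts))
    return [url for _, url in sorted(zip(scores, frontier), reverse=True)]
```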
Peer-to-peer systems and applications are a hotspot in network-application research. Because a peer-to-peer system has no central authority and is deployed on an open network, new security concerns arise. As an additional security measure, an intrusion detection system helps determine whether unauthorized users are attempting to access, have already accessed, or have compromised the network. Intrusion detection, as the second line of defense, is an indispensable tool for highly survivable networks. In this paper, two anomaly intrusion detection methods are proposed for peer-to-peer systems. Their main characteristic is that they detect intrusions in real time without any expert knowledge or attack signatures. One method uses a hidden Markov model to detect reflector DoS attacks; the other is based on adaptive resonance theory and learns normal behavior in an unsupervised manner. The experimental P2P system is built on FreePastry 1.401 and JDK 1.5.0. The results indicate that the methods can detect DoS attacks immediately and find new intruders with a low false alarm rate.
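A minimal sketch of the HMM-based detector, assuming observations are windows of per-node message-rate vectors and using hmmlearn as a stand-in implementation (the paper does not name a library); thresholding on per-sample log-likelihood is one simple way to decide "normal vs. attack".

```python
import numpy as np
from hmmlearn import hmm  # stand-in library, not named in the paper

def train_normal_model(normal_windows, n_states=3):
    """Fit an HMM to windows of message-rate observations recorded under normal load.

    `normal_windows` is a list of (T_i, d) arrays, one per observation window.
    """
    X = np.concatenate(normal_windows)
    lengths = [len(w) for w in normal_windows]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    # Threshold: worst per-sample log-likelihood seen on normal traffic, minus a margin.
    floor = min(model.score(w) / len(w) for w in normal_windows)
    return model, floor - 1.0

def is_attack(model, threshold, window):
    """Flag a window whose per-sample likelihood falls below the normal floor."""
    return model.score(window) / len(window) < threshold
```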
In this paper, we introduce two improvements to the Ant Colony Optimization (ACO) algorithm: route optimization and individual variation. The first is an optimized implementation of ACO, by which the running time of ants ...
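The listing truncates the description of the two improvements; for reference, here is a minimal sketch of the standard ACO transition rule that a faster route-construction implementation would accelerate (alpha and beta are the usual pheromone and heuristic exponents, not values from the paper).

```python
import numpy as np

def next_city(current, unvisited, pheromone, dist, alpha=1.0, beta=2.0, rng=None):
    """Standard ACO transition rule: choose the next city with probability
    proportional to pheromone^alpha * (1/distance)^beta over unvisited cities."""
    if rng is None:
        rng = np.random.default_rng()
    unvisited = list(unvisited)
    weights = np.array([
        pheromone[current, j] ** alpha * (1.0 / dist[current, j]) ** beta
        for j in unvisited
    ])
    probs = weights / weights.sum()
    return unvisited[rng.choice(len(unvisited), p=probs)]
```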
The fast-forward planning system (FF), which obtains heuristics from a relaxed planning graph to guide the enforced hill-climbing search strategy, has shown excellent performance in most STRIPS domains. In ADL domains, FF handles actions with conditional effects in a way similar to factored expansion. As a result, enforced hill-climbing guided by the relaxed Graphplan always fails in some ADL domains. We have found that the reason behind this issue is the relaxed Graphplan's inability to handle relationships between actions' components. We propose a novel approach called delayed partly reasoning on a naive conditional-effects planning graph (DP-CEPG). We do not ignore actions' delete effects, and we consider restricted induced component mutual exclusions between factored-expanded actions. Preliminary results show that enforced hill-climbing guided by DP-CEPG achieves clear improvements on most ADL problems in terms of both solution length and runtime.
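DP-CEPG itself is not specified here in enough detail to reproduce; below is a minimal sketch of the baseline it refines, delete-ignoring reachability over factored-expanded components, assuming propositions are hashable tokens and each conditional effect becomes one (conditions, add-effects) component.

```python
def relaxed_fixpoint(init, goals, components):
    """Delete-ignoring reachability over factored-expanded actions.

    Each component is a pair (conditions, add_effects) of proposition sets:
    an action with conditional effects contributes one component per
    "when condition -> effect" branch, with the action's precondition folded
    into `conditions`. This is the baseline relaxation that DP-CEPG refines
    by keeping delete effects and induced component mutexes.
    """
    reached = set(init)
    changed = True
    while changed and not goals <= reached:
        changed = False
        for conds, adds in components:
            if conds <= reached and not adds <= reached:
                reached |= adds
                changed = True
    return goals <= reached  # True if all goals are relaxed-reachable
```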
To overcome the shortcomings of traditional search engines, this paper advocates a prototype semantic-based search engine called CRAB. By combining semantic Web technologies, information extraction (IE), natural language processing (NLP), and a novel theme-based method, the framework can automatically extract factual knowledge from Chinese natural language documents. Instead of a list of document links, the results CRAB returns for a user's query are automatically generated, semantically coherent reports, which better satisfy users.
To handle the complex control process of grinding, an expert system for grinding control based on fuzzy control and an artificial neural network was designed. The system integrates computational intelligence, expert systems, and automatic control technology. It relies on the .NET development environment and is implemented with the object-oriented language C++ and the expert system tool CLIPS. Under the guidance of expert experience and knowledge, the system achieves accurate control and runs flexibly and stably; at the same time, it has strong adaptability and self-learning capability.
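The abstract describes the architecture rather than the control law; the following is a toy sketch of the fuzzy-control part, with made-up linguistic variables (mill-load error in, feed-rate adjustment out) standing in for whichever grinding variables the system actually uses.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_feed_adjust(load_error):
    """Toy fuzzy rule base: weighted-average defuzzification of three rules.

    Variables and rule consequents are illustrative, not taken from the paper.
    """
    rules = [
        (tri(load_error, -1.0, -0.5, 0.0), +0.4),   # load too low  -> increase feed
        (tri(load_error, -0.5,  0.0, 0.5),  0.0),   # load near set -> hold feed
        (tri(load_error,  0.0,  0.5, 1.0), -0.4),   # load too high -> decrease feed
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```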
In this paper, we put forward a fast simplification method for terrain models by integrating discrete particle swarm optimization with a hierarchical structure. In this method, each particle is represented as...
We augment Naive Bayes models with a divide-and-conquer strategy to address shortcomings of the standard Naive Bayes text classifier. The result is a generalized Bayes classifier that allows local dependence among feature subsets, a model we refer to as the Augmented Naive Bayes (ANB) classifier. ANB relaxes the independence assumptions of Naive Bayes while still permitting efficient inference and learning. Experimental studies on a set of natural domains show that ANB has clear advantages in terms of generalization ability.
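One simple reading of "local dependence among feature subsets" is to treat each subset's joint value as a single compound feature, keeping independence only across subsets; the sketch below follows that assumption (it is not the authors' exact ANB construction) and uses Laplace-style smoothing, with X a list of discrete feature vectors and y a list of labels.

```python
from collections import defaultdict
import math

class SubsetNaiveBayes:
    """Bayes classifier with independence across, but not within, feature subsets."""

    def __init__(self, subsets, alpha=1.0):
        self.subsets = subsets  # list of tuples of feature indices
        self.alpha = alpha

    def fit(self, X, y):
        self.classes = sorted(set(y))
        n = len(y)
        self.prior = {c: math.log(y.count(c) / n) for c in self.classes}
        self.counts = defaultdict(float)   # (class, subset, joint value) -> count
        self.totals = defaultdict(float)   # (class, subset) -> count
        self.vocab = defaultdict(set)      # subset -> observed joint values
        for xi, c in zip(X, y):
            for s in self.subsets:
                val = tuple(xi[j] for j in s)
                self.counts[(c, s, val)] += 1.0
                self.totals[(c, s)] += 1.0
                self.vocab[s].add(val)
        return self

    def predict(self, xi):
        def logpost(c):
            lp = self.prior[c]
            for s in self.subsets:
                val = tuple(xi[j] for j in s)
                num = self.counts[(c, s, val)] + self.alpha
                den = self.totals[(c, s)] + self.alpha * (len(self.vocab[s]) + 1)
                lp += math.log(num / den)
            return lp
        return max(self.classes, key=logpost)
```

With every subset a singleton this reduces to standard Naive Bayes; larger subsets capture local dependence at the cost of sparser joint-value counts.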