Theorem proving based on the extension rule is a new reasoning method. Building on the extension rule algorithm RIER, this paper presents a more efficient algorithm, HRIER, which uses a heuristic strategy to guide the choice of the restricted search space. The experimental results show that HRIER improves efficiency considerably while preserving the essential characteristic of the extension rule method, namely that it remains potentially complementary to resolution-based methods.
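For readers unfamiliar with the method, the sketch below illustrates the satisfiability test that underlies extension rule reasoning: a clause can only be extended to maxterms that contain all of its literals, so a clause set over n variables is unsatisfiable exactly when its clauses cover all 2^n maxterms, counted here by inclusion-exclusion. The clause encoding as signed integers and the function names are illustrative assumptions, and the heuristic restriction of the search space that distinguishes HRIER is not reproduced.

```python
from itertools import combinations

def maxterms_covered(clauses, n_vars):
    """Count, by inclusion-exclusion, how many of the 2**n maxterms the clauses
    can be extended to; the clause set is unsatisfiable iff all of them are covered."""
    total = 0
    for k in range(1, len(clauses) + 1):
        for subset in combinations(clauses, k):
            union = set().union(*subset)
            # clauses containing complementary literals share no extended maxterms
            if any(-lit in union for lit in union):
                continue
            total += (-1) ** (k + 1) * 2 ** (n_vars - len(union))
    return total

def unsatisfiable(clauses, n_vars):
    return maxterms_covered(clauses, n_vars) == 2 ** n_vars

# (p or q), (not p), (not q) is unsatisfiable over the two variables 1 and 2
print(unsatisfiable([frozenset({1, 2}), frozenset({-1}), frozenset({-2})], 2))  # True
```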
The Constraint Satisfaction Problem (CSP) is an important branch of Artificial Intelligence, and one typical way to solve it is search based on backtracking. The Dynamic Backtracking algorithm proposed by Ginsberg in 1993 is an efficient algorithm that integrates backtracking with constraint propagation. Following the basic idea of dynamic backtracking, we put forward four implementation strategies and demonstrate that they differ in efficiency and in the number of backtracks. The most efficient of the four is Strategy 2.1, which significantly improves efficiency and reduces the number of backtracks. Analyzing the experimental results, we identify the differences between the four strategies and then propose a heuristic rule for selecting an uninstantiated variable in the dynamic backtracking algorithm, the Successful Assignment Principle. Combining this principle with the Failure First Principle, we obtain an optimized strategy, Strategy 2.4. The final test results show that the efficiency of Strategy 2.4 is 1.595 to 2.227 times that of Strategy 2.1.
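As a rough illustration of the variable-ordering heuristics discussed above, the sketch below shows a plain chronological backtracking search that selects the next variable by the Failure First Principle (smallest remaining domain). It is not Ginsberg's dynamic backtracking and not the paper's Strategy 2.1 or 2.4; the solver structure, constraint encoding, and toy example are assumptions made for illustration.

```python
def backtracking_search(variables, domains, constraints):
    """Chronological backtracking with fail-first (smallest-domain) variable ordering."""
    assignment = {}

    def consistent(var, value):
        trial = {**assignment, var: value}
        # only check constraints whose scope is fully assigned in the trial assignment
        return all(check(trial) for (scope, check) in constraints
                   if var in scope and all(v in trial for v in scope))

    def select_unassigned():
        # Failure First Principle: pick the variable with the fewest remaining values
        free = [v for v in variables if v not in assignment]
        return min(free, key=lambda v: len(domains[v])) if free else None

    def solve():
        var = select_unassigned()
        if var is None:
            return dict(assignment)          # complete consistent assignment found
        for value in domains[var]:
            if consistent(var, value):
                assignment[var] = value
                result = solve()
                if result is not None:
                    return result
                del assignment[var]          # backtrack
        return None

    return solve()

# Toy colouring example: adjacent regions must take different values
variables = ["A", "B", "C"]
domains = {"A": [1, 2], "B": [1, 2, 3], "C": [1, 2, 3]}
constraints = [(("A", "B"), lambda a: a["A"] != a["B"]),
               (("B", "C"), lambda a: a["B"] != a["C"])]
print(backtracking_search(variables, domains, constraints))
```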
Classification of hyperspectral data with high spatial resolution from urban areas is investigated. The approach is an extension of existing approaches, using both spectral and spatial information for classification. The spatial information is derived by mathematical morphology and principal components of the hyperspectral data set, generating a set of different morphological profiles. The whole data set is classified by the Random Forest algorithm. However, the computational complexity as well as the increased dimensionality and redundancy of data sets based on morphological profiles are potential drawbacks. Thus, in the presented study, feature selection is applied, using nonparametric weighted feature extraction and the variable importance of the random forests. The proposed approach is applied to ROSIS data from an urban area. The experimental results demonstrate that a feature reduction is useful in terms of accuracy. Moreover, the proposed approach also shows excellent results with a limited training set.
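A minimal sketch of the variable-importance-based feature reduction step, assuming scikit-learn's RandomForestClassifier and synthetic data in place of the ROSIS scene; the nonparametric weighted feature extraction part of the approach is not shown.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a stacked spectral + morphological-profile feature set
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60))                   # 500 pixels, 60 features
y = (X[:, 3] + 0.5 * X[:, 17] > 0).astype(int)   # labels driven by two features

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by the forest's impurity-based variable importance and keep the top k
k = 10
top_features = np.argsort(forest.feature_importances_)[::-1][:k]
X_reduced = X[:, top_features]

print("selected feature indices:", sorted(top_features.tolist()))
```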
Trusted Network Connect (TNC), whose goal is to improve network security at the source, has become a hot topic in the security domain. We propose a new access model for terminals that do not meet the requirements of the TNC specifications, and we describe the workflow and communication process of the model. Simulations show that the model is feasible: in the scenario where terminals fall short of the network's requirements, they can still pass the network access authentication under the TNC specifications.
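A toy sketch of the kind of policy decision such an access model has to make, in the spirit of TNC integrity checking: compliant terminals are admitted, while non-compliant ones are isolated for remediation rather than rejected outright. The policy fields, remediable checks, and decision labels are invented for illustration and are not taken from the paper.

```python
def access_decision(measurements, policy):
    """Map a terminal's reported integrity measurements to an access decision."""
    failed = [k for k, required in policy.items() if measurements.get(k) != required]
    if not failed:
        return "allow"      # full network access
    if set(failed) <= {"antivirus_up_to_date", "patch_level_current"}:
        return "isolate"    # limited access to remediation servers only
    return "deny"

policy = {"antivirus_up_to_date": True, "patch_level_current": True, "firewall_enabled": True}
print(access_decision({"antivirus_up_to_date": False, "patch_level_current": True,
                       "firewall_enabled": True}, policy))   # isolate
print(access_decision({"firewall_enabled": False}, policy))  # deny
```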
One of the most prevalent security problems in networks is the rampant propagation of email worms. In this paper, game theory is suggested as a method for modeling and computing the probabilities of the expected behaviors of email users during email worm propagation. The game models the actions of email users under the condition that, at the moment they open an attachment, the system may become infected. Results of email worm propagation simulation in a practical simulation environment show that well-camouflaged worms spread more quickly and survive longer than naive worms; moreover, users' security consciousness and the update frequency of anti-virus software have a great impact on email worm propagation.
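The sketch below is a toy discrete-time propagation simulation in the spirit of the experiments described above: an infected user mails its contacts, and each recipient opens the attachment with a probability that depends on how well the worm is camouflaged. The opening probabilities are fixed parameters here rather than being derived from the game-theoretic model, and the contact graph and parameter values are illustrative assumptions.

```python
import random

def simulate(n_users, contacts_per_user, p_open_naive, p_open_camouflaged,
             camouflaged=True, steps=20, seed=1):
    """Toy email-worm spread over a random contact graph."""
    random.seed(seed)
    contacts = {u: random.sample(range(n_users), contacts_per_user) for u in range(n_users)}
    p_open = p_open_camouflaged if camouflaged else p_open_naive
    infected = {0}                                  # patient zero
    for _ in range(steps):
        # each infected user mails its contacts; recipients open with probability p_open
        newly = {c for u in infected for c in contacts[u]
                 if c not in infected and random.random() < p_open}
        if not newly:
            break
        infected |= newly
    return len(infected)

print("camouflaged worm infects:", simulate(1000, 5, 0.05, 0.30, camouflaged=True))
print("naive worm infects:      ", simulate(1000, 5, 0.05, 0.30, camouflaged=False))
```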
The Web has been dramatically deepened by the deep Web, which traditional information fusion systems are unable to integrate. The myriad information hidden behind the deep Web has attracted considerable attention from researchers. The key element in a deep Web information fusion system is the data source modeling problem, which determines the technical approach of the whole system. The query interfaces provided by the deep Web are the clues that disclose the hidden schemas, but the complicated semantic relationships in these interfaces reduce the generality and capability of the local-as-view (LAV) method used in traditional information fusion systems. This paper presents an approach that evaluates the semantic relationships between attributes in query interfaces by using WordNet, a typical ontology resource: the relationships between semantically related attributes are measured with WordNet's semantic similarity calculation, and meaningless attributes are instantiated with the instance information embedded in the interfaces in order to attach correlated semantic information. Experiments carried out on real-life domains show the effectiveness of the ontology-based semantic evaluation method in LAV.
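A minimal sketch of WordNet-based similarity between attribute names, assuming NLTK's WordNet interface (after running nltk.download('wordnet')) and Wu-Palmer similarity over noun senses; the paper's exact similarity measure and its instance-based handling of meaningless attributes are not reproduced.

```python
from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

def attribute_similarity(a, b):
    """Best Wu-Palmer similarity between noun senses of two attribute names."""
    synsets_a = wn.synsets(a, pos=wn.NOUN)
    synsets_b = wn.synsets(b, pos=wn.NOUN)
    scores = [s1.wup_similarity(s2) or 0.0 for s1 in synsets_a for s2 in synsets_b]
    return max(scores, default=0.0)

# e.g. matching attributes of two book-search interfaces
print(attribute_similarity("author", "writer"))
print(attribute_similarity("author", "price"))
```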
Information fusion is an interdisciplinary research field aiming to combine and merge information or data from different sources. The output of an information fusion system is a global schema through which users can obtain information or data from the dispersed sources. Building a deep Web information fusion system faces many challenges, because the nature of the Web differs from that of traditional databases and multi-databases. The major challenge is the data source modeling problem, which determines the technical approach of the whole system. The query interfaces provided by the deep Web are the clues that disclose the hidden schemas, but the complicated semantic relationships in these interfaces reduce the generality and capability of the local-as-view (LAV) method used in traditional information fusion systems. We present an approach in which the semantic relationships between semantically related attributes are evaluated with WordNet, an ontology instrument. Experiments carried out on a well-known dataset show the effectiveness of the ontology-extended LAV method in building mappings between local views and the mediator schema.
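To make the local-as-view setting concrete, the sketch below represents each deep Web source as a view over a mediated schema and selects the sources whose views cover a query's attributes; the schema, source names, and matching rule are invented for illustration and do not reflect the paper's actual mediator.

```python
# Minimal LAV illustration: each source is described as a view over the mediated schema,
# so a mediator can decide which sources are relevant to a user query.
MEDIATED_SCHEMA = {"book": ["title", "author", "price", "publisher"]}

SOURCE_VIEWS = {
    # source -> the mediated-schema attributes its query interface exposes
    "amazon_search":   {"book": ["title", "author", "price"]},
    "library_catalog": {"book": ["title", "author", "publisher"]},
}

def sources_for_query(relation, wanted_attributes):
    """Return the sources whose view covers every attribute the query asks for."""
    return [src for src, view in SOURCE_VIEWS.items()
            if relation in view and set(wanted_attributes) <= set(view[relation])]

print(sources_for_query("book", ["title", "price"]))      # ['amazon_search']
print(sources_for_query("book", ["title", "publisher"]))  # ['library_catalog']
```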
This paper studies preprocessing techniques for the constraint satisfaction problem (CSP). First, we propose the notion of entirety singleton consistency (ESC) together with an algorithm for enforcing it, and we analyze the algorithm's time and space complexity and prove its correctness. On this basis, we present a new preprocessing algorithm, SAC-ESC, based on ESC, and prove its correctness. Furthermore, we apply a divide-and-conquer strategy so that the algorithm automatically adapts to the domain partitions of different problems. In our experiments on random CSPs, pigeonhole problems, N-queens problems, and standard benchmarks, SAC-ESC is 3 to 20 times as efficient as the existing SAC-SDS and SAC-3 algorithms.
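For orientation, the sketch below shows plain singleton-consistency preprocessing: a value is removed if assigning it and propagating arc consistency wipes out some domain. It is a simplified baseline, not SAC-ESC itself; the AC-1-style propagation, binary constraint encoding, and example are assumptions made for illustration.

```python
import copy

def revise(domains, x, y, allowed):
    """Remove values of x that have no support in y under the binary relation `allowed`."""
    removed = False
    for a in list(domains[x]):
        if not any((a, b) in allowed.get((x, y), set()) for b in domains[y]):
            domains[x].discard(a)
            removed = True
    return removed

def ac(domains, allowed):
    """AC-1 style propagation; returns False if some domain is wiped out."""
    changed = True
    while changed:
        changed = any(revise(domains, x, y, allowed) for (x, y) in allowed)
        if any(not d for d in domains.values()):
            return False
    return True

def sac_preprocess(domains, allowed):
    """Drop every value whose assignment, once propagated with AC, empties some domain."""
    for x in domains:
        for a in list(domains[x]):
            test = copy.deepcopy(domains)
            test[x] = {a}
            if not ac(test, allowed):
                domains[x].discard(a)
    return domains

# Tiny example: constraint x < y with domains {1,2,3}; preprocessing removes x=3 and y=1
doms = {"x": {1, 2, 3}, "y": {1, 2, 3}}
lt = {(a, b) for a in range(1, 4) for b in range(1, 4) if a < b}
allowed = {("x", "y"): lt, ("y", "x"): {(b, a) for (a, b) in lt}}
print(sac_preprocess(doms, allowed))
```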
ATP (automated theorem proving) has always been one of the most advanced areas of computer science. The traditional idea in ATP is to try to deduce the empty clause in order to check satisfiability, as in resolution-based theorem proving, one of the most popular methods. Theorem proving based on the extension rule is a new method, complementary to resolution. Through an in-depth study of the extension rule, an important property of the rule is obtained. This paper first presents the property and an algorithm for deciding it, and analyzes and proves the algorithm's time and space complexity. Based on this work, a novel extension-rule-based theorem proving algorithm called NER is proposed. NER transforms the problem of deciding whether a clause set is satisfiable into a series of problems of deciding whether one literal set includes another, whereas the original extension rule algorithm transforms it into the problem of counting the number of maxterms that can be expanded. Experiments show that NER clearly outperforms both the original extension-rule-based algorithm ER and the directional resolution algorithm DR; in some cases the improvement reaches two orders of magnitude.
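The reduction to literal-set inclusion can be illustrated as follows: a clause can be extended to a maxterm exactly when its literal set is included in that maxterm, so a clause set is satisfiable iff some maxterm includes none of its clauses. The naive enumeration below makes the idea explicit but is exponential; how NER actually organizes and prunes these inclusion checks is not reproduced, and the signed-integer encoding is an assumption.

```python
from itertools import product

def satisfiable_by_inclusion_checks(clauses, n_vars):
    """Decide satisfiability via literal-set inclusion: the clause set is satisfiable
    iff some maxterm (a clause over all variables) includes none of the clauses."""
    for signs in product((1, -1), repeat=n_vars):
        maxterm = {s * v for v, s in zip(range(1, n_vars + 1), signs)}
        if not any(c <= maxterm for c in clauses):
            return True   # the assignment falsifying this maxterm satisfies every clause
    return False

clauses = [frozenset({1, 2}), frozenset({-1}), frozenset({-2})]
print(satisfiable_by_inclusion_checks(clauses, 2))  # False: unsatisfiable
```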
A new escrow mechanism for personal security keys based on IBE (identity-based encryption) is proposed. The mechanism constructs a security trust system composed of a private key generator, a key management center, and a user security component to set up a personal security key escrow model, providing identity validation, confidentiality, and integrity checking for the application, backup, recovery, and renewal of escrowed keys. By taking advantage of IBE, the mechanism simplifies authentication and encryption and makes it possible for users to complete the escrow of personal security keys independently. It is therefore more practical than traditional key escrow schemes.