Deductive formalisms have developed strongly in recent years; among them, answer set programming (ASP) has gained momentum and has lately been fruitfully employed in many real-world scenarios. Nonetheless, despite a large number of success stories in relevant application areas, and even in industrial contexts, deductive reasoning cannot be considered the ultimate, comprehensive solution to artificial intelligence; indeed, in several contexts, other approaches turn out to be more useful. Typical bioinformatics tasks, such as classification, are currently carried out mostly by machine learning (ML)-based solutions. In this paper, we focus on the relatively new problem of analyzing the evolution of neurological disorders. In this context, ML approaches have already proved to be a viable solution for classification tasks; here, we show how ASP can play a relevant role in the brain-evolution simulation task. In particular, we propose a general and extensible framework to support physicians and researchers in understanding the complex mechanisms underlying neurological disorders. The framework relies on a combined use of ML and ASP, and is general enough to be applied in several other application scenarios, which are outlined in the paper.
Dealing with context-dependent knowledge has led to different formalizations of the notion of context. Among them is the Contextualized Knowledge Repository (CKR) framework, which is rooted in description logics but, on the reasoning side, links strongly to logic programs and answer set programming (ASP) in particular. The CKR framework caters for reasoning with defeasible axioms and exceptions in contexts, and was extended to knowledge inheritance across contexts in a coverage (specificity) hierarchy. However, the approach supports only this single type of contextual relation, and the reasoning procedures work only for restricted hierarchies, due to nontrivial issues with model preference under exceptions. In this paper, we overcome these limitations and present a generalization of CKR hierarchies to multiple contextual relations, along with their interpretation of defeasible axioms and preference. To support reasoning, we use ASP with algebraic measures, a recent extension of ASP with weighted formulas over semirings that allows one to associate quantities with interpretations depending on the truth values of propositional atoms. Notably, we show that for a relevant fragment of CKR hierarchies with multiple contextual relations, query answering can be realized with the popular asprin framework. The algebraic-measures approach is more powerful and enables, for example, reasoning with epistemic queries over CKRs, which opens interesting perspectives for the use of quantitative ASP extensions in other applications.
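The "algebraic measures" mentioned above can be illustrated with a small Python sketch. This is a toy under stated assumptions, not the paper's system: atoms carry weights, a semiring fixes how weights combine, and the measure of an interpretation folds the weights of its true atoms. The tropical (min, +) semiring then makes the measure behave like a penalty for violated defeasible axioms, so `plus` picks the preferred interpretation.

```python
# Illustrative sketch of an algebraic measure over a semiring.
# The atom names and weights are hypothetical.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Semiring:
    plus: Callable   # combines the measures of alternative interpretations
    times: Callable  # combines weights within one interpretation
    zero: object     # neutral element of plus
    one: object      # neutral element of times

# The tropical (min, +) semiring: measures become costs to minimize.
tropical = Semiring(plus=min, times=lambda a, b: a + b,
                    zero=float("inf"), one=0)

def measure(weights: Dict[str, float], interp: Dict[str, bool],
            sr: Semiring):
    """Fold the weights of the atoms true in `interp` with `times`;
    atoms that are false contribute the neutral element `one`."""
    acc = sr.one
    for atom, w in weights.items():
        acc = sr.times(acc, w if interp.get(atom, False) else sr.one)
    return acc

# Example: prefer the interpretation violating the cheaper exception.
penalties = {"exc_a": 1.0, "exc_b": 2.0}
m1 = measure(penalties, {"exc_a": True}, tropical)  # cost 1.0
m2 = measure(penalties, {"exc_b": True}, tropical)  # cost 2.0
best = tropical.plus(m1, m2)                        # 1.0
```

Swapping in another semiring (e.g. (+, ×) over the reals) changes the measure from cost minimization to, say, weighted model counting without touching the fold itself.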
Context-sensitive global analysis of large code bases can be expensive, which can make its use impractical during software development. However, there are many situations in which modifications are small and isolated within a few components, and it is desirable to reuse as much of the previous analysis results as possible. This has been achieved to date through incremental global fixpoint analysis algorithms that achieve cost reductions at fine levels of granularity, such as changes in program lines. However, these fine-grained techniques are neither directly applicable to modular programs nor designed to take advantage of modular structure. This paper describes, implements, and evaluates an algorithm that performs efficient context-sensitive analysis incrementally on modular partitions of programs. The experimental results show that the proposed modular algorithm achieves significant improvements, in both time and memory consumption, over existing non-modular, fine-grained incremental analysis techniques. Furthermore, thanks to the proposed intermodular propagation of analysis information, our algorithm also outperforms traditional modular analysis even when analyzing from scratch.
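The module-level reuse described above can be sketched in a few lines. The data structures and staleness check below are illustrative assumptions, not the paper's algorithm: each module's analysis summary is cached under a key built from its source and the summaries of the modules it depends on, so a small change re-analyzes only the affected modules.

```python
# Toy incremental modular analysis: summaries are cached and a module
# is re-analyzed only when its source or its dependencies' summaries
# changed. `analyze` is a stand-in for a real abstract-interpretation step.

def analyze(module, source, deps_summaries):
    # stand-in: a "summary" derived from the source and dependency summaries
    return (hash(source), tuple(sorted(deps_summaries.items())))

def incremental_analyze(modules, sources, deps, cache):
    """`modules` in dependency order; `cache` maps module -> (key, summary).
    Returns the summaries and the list of modules actually re-analyzed."""
    summaries, reanalyzed = {}, []
    for m in modules:
        dep_sum = {d: summaries[d] for d in deps.get(m, [])}
        key = (sources[m], tuple(sorted(dep_sum.items())))
        hit = cache.get(m)
        if hit and hit[0] == key:
            summaries[m] = hit[1]          # reuse the previous result
        else:
            summaries[m] = analyze(m, sources[m], dep_sum)
            cache[m] = (key, summaries[m])
            reanalyzed.append(m)
    return summaries, reanalyzed

cache = {}
summaries, changed = incremental_analyze(
    ["lib", "app"], {"lib": "v1", "app": "v1"}, {"app": ["lib"]}, cache)
# first run analyzes both modules; an identical second run re-analyzes none
```

A change to `lib` invalidates `lib`'s key directly and `app`'s key transitively (through the changed summary), which is the coarse-grained analogue of the fine-grained invalidation the abstract contrasts with.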
On top of a neural network-based dependency parser and a graph-based natural language processing module, we design a Prolog-based dialog engine that interactively explores a ranked fact database extracted from a text document. We reorganize dependency graphs to focus on the most relevant content elements of a sentence and integrate sentence identifiers as graph nodes. Additionally, after ranking the graph, we take advantage of the implicit semantic information that dependency links and WordNet bring in the form of subject-verb-object, "is-a" and "part-of" relations. Working on the Prolog facts and their inferred consequences, the dialog engine specializes the text graph with respect to a query and interactively reveals the document's most relevant content elements. The open-source code of the integrated system is available at https://***/ptarau/DeepRank.
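The ranking step above can be sketched as a plain PageRank-style iteration over a graph whose nodes are words or sentence identifiers. The toy graph and parameter values below are assumptions for illustration, not the system's implementation:

```python
# Minimal PageRank over a directed graph given as a list of edges.
# Nodes that many others link to accumulate rank, which is the idea
# behind ranking content elements of a dependency-derived text graph.

def pagerank(edges, damping=0.85, iters=50):
    nodes = {n for e in edges for n in e}
    out = {n: [t for s, t in edges if s == n] for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for s in nodes:
            if out[s]:
                share = damping * rank[s] / len(out[s])
                for t in out[s]:
                    new[t] += share
            else:  # dangling node: redistribute its rank uniformly
                for t in nodes:
                    new[t] += damping * rank[s] / len(nodes)
        rank = new
    return rank

# Toy dependency-style graph: two words point at the same head.
r = pagerank([("cat", "sat"), ("mat", "sat"), ("sat", "cat")])
# "sat" receives links from two nodes and ranks highest
```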
The inexpressive Description logic (DL) FL0, which has conjunction and value restriction as its only concept constructors, had fallen into disrepute when it turned out that reasoning in FL0 w.r.t. general TBoxes is Ex...
ISBN:
(print) 9783030995270; 9783030995263
State-of-the-art solvers for constrained Horn clauses (CHC) are successfully used to generate reachability facts from symbolic encodings of programs. In this paper, we present a new application to test-case generation: if a block of code is provably unreachable, no test case can be generated for it, which allows the exploration effort to be directed at other blocks of code. Our new approach uses CHCs to incrementally construct different program unrollings and extract test cases from models of satisfiable formulas. At the same time, a CHC solver keeps track of CHCs that represent unreachable blocks of code, which makes the unrolling process more efficient. In practice, this lets our approach terminate early while guaranteeing maximal coverage. Our implementation, called HORNTINUUM, exhibits promising performance: it generates high coverage in the majority of cases and spends less time on average than state-of-the-art tools.
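The early-termination idea can be caricatured in a few lines of Python. This is a toy stand-in, not the actual system: concrete inputs play the role of models of satisfiable unrollings, and a branch whose guard is contradictory plays the role of a block proved unreachable, so covering the reachable branches suffices to stop early.

```python
# Toy coverage-driven test generation with early termination once
# every *reachable* branch is covered. The example program and its
# branch labels are hypothetical.

def program_branches(x):
    """Return the set of branch labels the input exercises."""
    taken = set()
    if x > 10:
        taken.add("gt10")
    else:
        taken.add("le10")
    if x > 10 and x < 5:   # contradictory guard: provably dead code
        taken.add("dead")
    return taken

def generate_tests(candidates, reachable_branches):
    """Keep an input only if it covers a new branch; stop at full coverage."""
    covered, tests = set(), []
    for x in candidates:
        new = program_branches(x) - covered
        if new:
            tests.append(x)
            covered |= new
        if covered == reachable_branches:
            break          # early termination: maximal coverage reached
    return tests, covered

tests, covered = generate_tests(range(0, 20), {"gt10", "le10"})
# the "dead" branch is unreachable, so covering the other two suffices
```

In the real setting the unreachability of `dead` would be a fact proved by the CHC solver rather than supplied by hand, which is exactly what justifies stopping without exhausting the candidate inputs.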
This paper presents PFLP, a library for probabilistic programming in the functional logic programming language Curry. It demonstrates how the concepts of a functional logic programming language support the implementation of a library for probabilistic programming. In fact, the paradigms of functional logic and probabilistic programming are closely connected; that is, language characteristics from one area exist in the other and vice versa. For example, the concepts of non-deterministic choice and call-time choice as known from functional logic programming are related to and coincide with stochastic memoization and probabilistic choice in probabilistic programming, respectively. We further show that an implementation based on the concepts of functional logic programming can have performance benefits compared to a standard list-based implementation, and can even compete with full-blown probabilistic programming languages, which we illustrate with several benchmarks.
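The "standard list-based implementation" that the abstract uses as a performance baseline is easy to sketch in Python: a distribution is a list of (value, probability) pairs, with probabilistic choice and monadic bind. The function and variable names below are our own, not PFLP's API.

```python
# List-based probability distributions: the baseline representation
# the abstract compares against.

def certainly(v):
    """The point distribution on a single value."""
    return [(v, 1.0)]

def choice(p, d1, d2):
    """Pick d1 with probability p, d2 with probability 1 - p."""
    return ([(v, p * q) for v, q in d1]
            + [(v, (1 - p) * q) for v, q in d2])

def bind(dist, f):
    """Sequence a distribution with a distribution-valued function."""
    return [(v2, p * q) for v, p in dist for v2, q in f(v)]

def prob(dist, pred):
    """Total probability mass of outcomes satisfying pred."""
    return sum(p for v, p in dist if pred(v))

# Two fair coin flips: probability of at least one head.
coin = choice(0.5, certainly("H"), certainly("T"))
two = bind(coin, lambda a: bind(coin, lambda b: certainly((a, b))))
p = prob(two, lambda vs: "H" in vs)   # 0.75
```

The list grows multiplicatively under `bind`, which is one reason an implementation built on native non-determinism, as in the paper, can outperform this representation.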
ISBN:
(print) 9783030720193; 9783030720186
Most automated verifiers for separation logic are based on the symbolic-heap fragment, which disallows both the magic-wand operator and the application of classical Boolean operators to spatial formulas. This is not surprising, as support for the magic wand quickly leads to undecidability, especially when combined with inductive predicates for reasoning about data structures. To circumvent these undecidability results, we propose assigning a more restrictive semantics to the separating conjunction. We argue that the resulting logic, strong-separation logic, can be used for symbolic execution and abductive reasoning just like "standard" separation logic, while remaining decidable even in the presence of both the magic wand and the list-segment predicate, a combination of features that leads to undecidability under the standard semantics.
ISBN:
(print) 9781450385671
The Computer Science Curricula 2013 (CS2013) calls for undergraduate CS programs to expose their students to the models underlying different programming languages as well as the principles on which language features are defined, composed, and implemented. Although an upper-division "Programming Languages (Foundations)" course is common in many CS programs in the United States, such a course is rare in Chinese CS undergraduate programs. This is partly due to the challenge of balancing the breadth and depth of topics covered in such a course. Another common concern is the balance between theory and practice. This paper reports our experience in designing "Foundations of Programming Languages" at USTC. Following the exemplar Programming Languages courses in CS2013, this 15-week course covers scripting, functional, logic, and systems languages, presenting their principal features and design/implementation issues. The topics include modern language features (e.g. closures), principles (e.g. lambda calculus and logic), control flow, and runtime components (e.g. memory management and concurrency). The paper describes how four selected languages (Lua for scripting, OCaml for functional, Datalog for logic, and Rust for systems) work together to provide practical experience for students on concepts that would otherwise be considered too abstract. It also includes a discussion of student performance on selected projects.