A TCSP-like concurrent language is extended with an operator for action refinement which plays a role similar to that of procedure call for sequential languages. The language is given a denotational semantics that fully expresses causality in terms of Causal Trees. These are Synchronization Trees where each arc has a richer labelling containing, besides an action name, also the set of backward pointers to those arcs "causing" the present action. An operational semantics reflecting causality is also defined in SOS style by a causal transition system, the unfoldings of which are causal trees. The denotational and operational semantics agree up to causal bisimulation, which is proved to be a congruence for all the operators of the calculus; notably, for the refinement operator. Also, a complete set of axioms is provided that characterizes the congruence classes of causal bisimulation for finite agents. The main result of the paper is an operational semantics firmly based on a view of action refinement as purely semantic substitution. Its operational definition therefore provides a "parallel copy rule," i.e., the concurrent analogue of the classic "copy rule" for sequential languages. (C) 1995 Academic Press, Inc.
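The arc labelling described above can be sketched as a small data structure. This is an illustrative rendering only, not the paper's formal definition: each branch of a tree carries an action name plus a set of backward pointers (here, counts of arcs back along the path) identifying the arcs that "cause" it.

```python
# Illustrative sketch of a causal tree: a synchronization tree whose arcs
# carry an action name and a set of backward pointers to causing arcs.
# The representation (tuples, integer depths) is an assumption for the demo.
from dataclasses import dataclass, field

@dataclass
class CausalTree:
    # Each branch is (action, causes, subtree); `causes` is a frozenset of
    # backward pointers: k points k arcs back along the current path.
    branches: list = field(default_factory=list)

def prefix(action, tree, causes=frozenset()):
    """Tree that performs `action` (caused by `causes`), then behaves as `tree`."""
    return CausalTree([(action, causes, tree)])

nil = CausalTree()

# a . b, where b is caused by a (pointer 1 = the immediately preceding arc):
t = prefix("a", prefix("b", nil, causes=frozenset({1})))
assert t.branches[0][0] == "a"
assert t.branches[0][2].branches[0][1] == frozenset({1})
```

In a plain synchronization tree the `causes` component would be absent; it is exactly this extra labelling that lets causal bisimulation distinguish, say, true concurrency from interleaving.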
The first logic programming languages, such as Prolog, used a fixed left-to-right atom scheduling rule. Recent logic programming languages, however, provide more flexible scheduling in which there is a default computation rule such as left-to-right but in which some calls are dynamically "delayed" until their arguments are sufficiently instantiated to allow the call to run efficiently. Such languages include constraint logic programming languages, since most implementations of these languages delay constraints which are "too hard." From the semantic point of view, the fact that an atom must be delayed under certain conditions causes the standard semantics of (constraint) logic programming to be no longer adequate to capture the meaning of a program. In this paper we attack this problem and develop a denotational semantics for constraint logic programming with dynamic scheduling. The key idea is that the denotation of an atom or goal is a set of closure operators, where different closure operators correspond to different sequences of rule choices. (C) 1997 Academic Press.
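The two ingredients of the abstract — delaying a call until it is sufficiently instantiated, and reading a goal's denotation as a closure operator on stores — can be illustrated with a toy sketch. Everything here (the `plus` atom, the dict-based store) is an assumption made for the demo, not the paper's formalism:

```python
# Toy illustration of dynamic scheduling: a call is delayed until enough of
# its arguments are instantiated, and the meaning of a goal conjunction acts
# as a closure operator on stores (extensive, monotone, idempotent):
# repeatedly waking delayed goals until a fixpoint adds no new bindings.

def plus(store, x, y, z):
    """plus(x, y, z): fires only once at least two arguments are ground."""
    known = [v for v in (x, y, z) if v in store]
    if len(known) < 2:
        return store                         # "too hard": stay delayed
    s = dict(store)
    if x in s and y in s:
        s.setdefault(z, s[x] + s[y])
    elif x in s and z in s:
        s.setdefault(y, s[z] - s[x])
    else:
        s.setdefault(x, s[z] - s[y])
    return s

def closure(goals, store):
    """Wake delayed goals until a fixpoint: a closure operator on stores."""
    while True:
        new = store
        for g in goals:
            new = g(new)
        if new == store:
            return store
        store = new

# plus(X,Y,Z), plus(Z,W,V) with X=1, Y=2, W=3: the second call is delayed
# until the first one instantiates Z.
goals = [lambda s: plus(s, "X", "Y", "Z"), lambda s: plus(s, "Z", "W", "V")]
result = closure(goals, {"X": 1, "Y": 2, "W": 3})
assert result == {"X": 1, "Y": 2, "W": 3, "Z": 3, "V": 6}
```

Different orders of rule choices would yield different such operators, which is why the denotation in the paper is a *set* of closure operators rather than a single one.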
In his seminal paper "A Natural Semantics for Lazy Evaluation," John Launchbury proves his semantics correct with respect to a denotational semantics, and outlines a proof of adequacy. Previous attempts to make the adequacy proof rigorous, which involves an intermediate natural semantics and an intermediate resourced denotational semantics, have failed. We devised a new, direct proof that skips the intermediate natural semantics. It is the first rigorous adequacy proof of Launchbury's semantics. We have modeled our semantics in the interactive theorem prover Isabelle and machine-checked our proofs. This not only provides a maximal level of rigor, but also serves as a tool for further work, such as a machine-checked correctness proof of a compiler transformation.
The novel field of quantum computation and quantum information has gathered significant momentum in the last few years. It has the potential to radically impact the future of information technology and influence the development of modern society. The construction of practical, general purpose quantum computers has been challenging, but quantum cryptographic and communication devices have been available in the commercial marketplace for several years. Quantum networks have been built in various cities around the world and a dedicated satellite has been launched by China to provide secure quantum communication. Such new technologies demand rigorous analysis and verification before they can be trusted in safety- and security-critical applications. Experience with classical hardware and software systems has shown the difficulty of achieving robust and reliable implementations. We present CCSq, a concurrent language for describing quantum systems, and develop verification techniques for checking equivalence between CCSq processes. CCSq has well-defined operational and superoperator semantics for protocols that are functional, in the sense of computing a deterministic input-output relation for all interleavings arising from concurrency in the system. We have implemented QEC (Quantum Equivalence Checker), a tool that takes the specification and implementation of quantum protocols, described in CCSq, and automatically checks their equivalence. QEC is the first fully automatic equivalence checking tool for concurrent quantum systems. For efficiency purposes, we restrict ourselves to Clifford operators in the stabilizer formalism, but we are able to verify protocols over all input states. We have specified and verified a collection of interesting and practical quantum protocols, ranging from quantum communication and quantum cryptography to quantum error correction.
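To see why the stabilizer restriction matters for efficiency, consider the naive alternative it avoids: checking equivalence of circuits by comparing their full unitaries up to global phase, which is exponential in the number of qubits. The sketch below (pure Python, single-qubit gates only; all names are illustrative and unrelated to QEC's internals) shows that naive check:

```python
# Naive equivalence check for tiny circuits: compare unitaries up to a
# global phase. Exponential in qubit count -- exactly what the stabilizer
# formalism used by tools like QEC avoids for Clifford circuits.
import math

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
S = [[1, 0], [0, 1j]]
Z = [[1, 0], [0, -1]]
I = [[1, 0], [0, 1]]

def equal_up_to_phase(a, b, tol=1e-9):
    """True if a = e^{i theta} * b for some global phase theta."""
    for i in range(len(a)):
        for j in range(len(a)):
            if abs(b[i][j]) > tol:
                phase = a[i][j] / b[i][j]
                return all(abs(a[p][q] - phase * b[p][q]) < tol
                           for p in range(len(a)) for q in range(len(a)))
    return True

assert equal_up_to_phase(matmul(H, H), I)   # H followed by H is the identity
assert equal_up_to_phase(matmul(S, S), Z)   # two phase gates make a Z
```

A stabilizer-based checker instead tracks how gates conjugate Pauli operators, which keeps the representation polynomial in the number of qubits for Clifford circuits.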
In most discussions about information and knowledge management, natural language is described as too fuzzy, ambiguous, and changing to serve as a basis for the development of large-scale tools and systems. Instead, artificial formal languages are developed and used to represent, hopefully in an unambiguous and precise way, the information or knowledge to be managed. Intertextual semantics (IS) adopts an almost exactly opposite point of view: natural language is the foundation on which information management tools and systems should be developed, and the usefulness of artificial formalisms used in the process lies exclusively in our ability to derive natural language from them. In this article, we introduce IS, its origins, and its underlying hypotheses and principles, and argue that even if its basic principles seem remote from current trends in design, IS is actually compatible with, and complementary to, those trends, especially semiotic engineering (C.S. de Souza, 2005a). We also hint at further possible application areas, such as interface and interaction design, and the design of concrete objects.
This article discusses the relationship between two frameworks: universal composability (UC) and robust compilation (RC). In cryptography, UC is a framework for the specification and analysis of cryptographic protocols with a strong compositionality guarantee: UC protocols remain secure even when composed with other protocols. In programming language security, RC is a novel framework for determining secure compilation by proving whether compiled programs are as secure as their source-level counterparts no matter what target-level code they interact with. Presently, these disciplines are studied in isolation, though we argue that there is a deep connection between them and that exploring this connection will benefit both research fields. This article formally proves the connection between UC and RC and then explores the benefits of this connection (focusing on perfect, rather than computational, UC). To this end, the article first identifies which conditions programming languages must fulfil in order to attain UC-like composition. Then, it proves UC of both an existing and a new commitment protocol as a corollary of the related compilers attaining RC. Finally, it mechanises these proofs in DEEPSEC, obtaining symbolic guarantees that the protocol is indeed UC. Our connection lays the groundwork towards a better and deeper understanding of both UC and RC, and the benefits we showcase from this connection provide evidence of scalable mechanised proofs for UC.
This paper describes the formalisation of the deductive object-oriented database system ROCK & ROLL, a system which integrates the deductive and object-oriented paradigms in a way that is both clean and consistent, and that has a sound theoretical foundation. The system uses a formally defined object-oriented data model as a foundation for both a logic query language and an imperative data manipulation language in such a way that impedance mismatches are minimised. This paper introduces the facilities offered by ROCK & ROLL and indicates how their formalisation has been achieved. (C) 1997 Elsevier Science B.V.
We present a generic symbolic analysis framework for imperative programming languages. Our framework is capable of computing all valid variable bindings of a program at given program points. This information is invaluable for domain-specific static program analyses such as memory leak detection, program parallelization, and the detection of superfluous bound checks, variable aliases and task deadlocks. We employ path expression algebra to model the control flow information of programs. A homomorphism maps path expressions into the symbolic domain. At the center of the symbolic domain is a compact algebraic structure called supercontext. A supercontext contains the complete control and data flow analysis information valid at a given program point. Our approach to compute supercontexts is based purely on algebra and is fully automated. This novel representation of program semantics closes the gap between program analysis and computer algebra systems, which makes supercontexts an ideal symbolic intermediate representation for all domain-specific static program analyses. Our approach is more general than existing methods because it can derive solutions for arbitrary (even intra-loop and nested loop) nodes of reducible and irreducible control flow graphs. We prove the correctness of our symbolic analysis method. Our experimental results show that the problem sizes arising from real-world applications such as the SPEC95 benchmark suite are tractable for our symbolic analysis framework. (C) 2011 Elsevier Inc. All rights reserved.
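The homomorphism idea above can be sketched on a toy scale: path expressions over CFG edges (sequence and alternation; the Kleene star for loops is omitted here) are mapped to transformers on sets of symbolic variable bindings, the set at a program point playing the role the abstract calls a supercontext. All names and the string-based symbolic expressions are assumptions made for this illustration:

```python
# Sketch: a homomorphism from path expressions to the symbolic domain.
# An edge maps to a transformer on sets of symbolic environments; path
# concatenation maps to composition, path alternation to union.

def edge(var, expr):
    """Transfer function of one assignment edge, on sets of symbolic envs."""
    def apply(envs):
        return [{**e, var: expr.format(**e)} for e in envs]
    return apply

def seq(f, g):
    """Path concatenation -> function composition."""
    return lambda envs: g(f(envs))

def alt(f, g):
    """Path alternation (a branch) -> union of the binding sets."""
    return lambda envs: f(envs) + g(envs)

# x = n; if (...) x = x + 1; else x = x * 2;
# The bindings valid at the join point, for symbolic input n0:
paths = seq(edge("x", "{n}"),
            alt(edge("x", "({x})+1"), edge("x", "({x})*2")))
ctx = paths([{"n": "n0"}])
assert ctx == [{"n": "n0", "x": "(n0)+1"}, {"n": "n0", "x": "(n0)*2"}]
```

A full framework would additionally need a closed form for loop edges (the star operator), which is where the algebraic machinery of the paper does its real work.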
Copy-paste might seem to make life easier, but it often leads to inconsistencies. Giving users the freedom to specify semantic relationships among copied objects can help rectify this "crime."
The main concern of this paper is the interplay between functionality and nondeterminism. We ask whether the analysis of parallelism in terms of sequentiality and nondeterminism, which is usual in the algebraic treatment of concurrency, remains correct in the presence of functional application and abstraction. We argue in favour of a distinction between nondeterminism and parallelism, due to the conjunctive nature of the former in contrast to the disjunctive character of the latter. This is the basis of our analysis of the operational and denotational semantics of the nondeterministic lambda-calculus, which is the classical calculus plus a choice operator, and of our choice of bounded indeterminacy as the semantic counterpart of conjunctive nondeterminism. This leads to an operational semantics based on the idea of the must preorder, coming from the classical theory of solvability and from the theory of process algebras. To characterize this relation, we build a model using the inverse limit construction over nondeterministic algebras, and we prove it fully abstract using a generalization of Böhm trees. We further prove conservativity theorems for the equational theory of the model, and for other theories related to the nondeterministic lambda-calculus, with respect to classical lambda-theories. (C) 1995 Academic Press, Inc.
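The conjunctive/disjunctive contrast drawn above can be rendered as a toy observation function. This is my own illustration, not the paper's calculus: terms are just `"value"`, `"bottom"`, or a binary `choice`, and the "must" observation requires every branch of a choice to converge, whereas the "may" observation requires only one:

```python
# "Must" convergence is conjunctive over choice: all branches must converge.
# "May" convergence is disjunctive: one converging branch suffices.
# Terms: "value", "bottom" (divergence), or ("choice", left, right).

def must_converges(term):
    if term == "value":
        return True
    if term == "bottom":
        return False
    _, left, right = term
    return must_converges(left) and must_converges(right)   # conjunctive

def may_converges(term):
    if term == "value":
        return True
    if term == "bottom":
        return False
    _, left, right = term
    return may_converges(left) or may_converges(right)      # disjunctive

t = ("choice", "value", "bottom")
assert may_converges(t)          # some run converges
assert not must_converges(t)     # but the term may diverge
```

The must preorder in the paper refines this idea from a boolean observation into an ordering on terms, which the inverse-limit model is then shown to characterize fully abstractly.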