Probabilistic programs are a powerful and convenient approach to formalising distributions over system executions. A classical verification problem for probabilistic programs is temporal inference: to compute the like...
Tractable probabilistic models such as Sum-Product Networks are a powerful category of models that offer a rich choice of fast probabilistic queries. However, they are limited in the distributions they can represent; e.g., they cannot define distributions using loops or recursion. To move towards more complex distributions, we introduce a novel neurosymbolic programming language, the Sum Product Loop Language (SPLL), along with the Neuro-Symbolic Transpiler (NeST). SPLL aims to build inference code that closely resembles tractable probabilistic models. NeST is the first neuro-symbolic transpiler, i.e., a compiler from one high-level language to another. It generates inference code from SPLL but natively supports other computing platforms, too. This way, SPLL can seamlessly interface with, e.g., pretrained (neural) models in PyTorch or Julia. The result is a language that can run probabilistic inference on more general distributions, reason on neural network outputs, and provide gradients for training.
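To make the tractability claim concrete, here is a minimal sum-product network over two binary variables, sketched in plain Python (this is not SPLL or NeST output; the structure, weights, and variable names are invented for illustration). Joint, marginal, and conditional queries each reduce to a single bottom-up pass:

```python
# A minimal sum-product network (SPN) over two binary variables X1 and X2.
# Leaves are indicators; internal nodes are weighted sums (mixtures) and
# products (factorizations). Structure and weights are invented.

def leaf(var, value):
    """Indicator leaf: 1 if evidence matches; unobserved vars marginalize to 1."""
    def f(evidence):
        v = evidence.get(var)
        return 1.0 if v is None or v == value else 0.0
    return f

def product(*children):
    def f(evidence):
        p = 1.0
        for c in children:
            p *= c(evidence)
        return p
    return f

def weighted_sum(pairs):
    def f(evidence):
        return sum(w * c(evidence) for w, c in pairs)
    return f

# P(X1, X2) as a two-component mixture of fully factorized distributions.
spn = weighted_sum([
    (0.6, product(weighted_sum([(0.8, leaf("X1", 1)), (0.2, leaf("X1", 0))]),
                  weighted_sum([(0.3, leaf("X2", 1)), (0.7, leaf("X2", 0))]))),
    (0.4, product(weighted_sum([(0.1, leaf("X1", 1)), (0.9, leaf("X1", 0))]),
                  weighted_sum([(0.5, leaf("X2", 1)), (0.5, leaf("X2", 0))]))),
])

# Each query is one bottom-up pass over the network.
joint = spn({"X1": 1, "X2": 1})       # P(X1=1, X2=1)
marginal = spn({"X1": 1})             # P(X1=1); X2 summed out via the indicators
conditional = joint / marginal        # P(X2=1 | X1=1)
```

The restriction the abstract mentions is visible here: the network is a fixed finite circuit, so a distribution defined by a loop or recursion of unbounded depth has no such representation.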
A wide range of human reasoning patterns can be explained as conditioning in probabilistic models; however, conditioning has traditionally been viewed as an operation applied to such models, not represented in such models. We describe how probabilistic programs can explicitly represent conditioning as part of a model. This enables us to describe reasoning about others' reasoning using nested conditioning. Much of human reasoning is about the beliefs, desires, and intentions of other people; we use probabilistic programs to formalize these inferences in a way that captures the flexibility and inherent uncertainty of reasoning about other agents. We express examples from game theory, artificial intelligence, and linguistics as recursive probabilistic programs and illustrate how this representation language makes it easy to explore new directions in each of these fields. We discuss the algorithmic challenges posed by these kinds of models and describe how dynamic programming techniques can help address these challenges. (C) 2014 Published by Elsevier B.V.
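Nested conditioning can be sketched as mutually recursive probabilistic programs. The following Python toy is in the spirit of the abstract's coordination examples (the agents, prior, and depth are invented): Alice samples a choice and conditions on Bob making the same choice, where Bob reasons about Alice in turn, down to a base depth; inference is rejection sampling.

```python
import random

# Schelling-style coordination by nested conditioning (invented example).
# Alice conditions on Bob agreeing; Bob conditions on Alice agreeing; the
# recursion bottoms out at depth 0, where each agent samples its prior.

P_POPULAR = 0.6   # shared prior for preferring the popular bar

def sample_prior():
    return "popular" if random.random() < P_POPULAR else "quiet"

def alice(depth):
    while True:                      # rejection sampling: retry until the
        mine = sample_prior()        # nested agent's choice agrees
        if depth == 0 or bob(depth - 1) == mine:
            return mine

def bob(depth):
    while True:
        mine = sample_prior()
        if depth == 0 or alice(depth - 1) == mine:
            return mine

random.seed(0)
samples = [alice(depth=3) for _ in range(2000)]
p_popular = samples.count("popular") / len(samples)
# Nesting amplifies the prior: p_popular climbs well above 0.6
# (analytically about 0.835 at depth 3).
```

The exponential cost of this naive rejection scheme as depth grows is exactly the kind of algorithmic challenge the dynamic programming techniques mentioned in the abstract address, by sharing repeated sub-inferences.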
Competitive pressures cause the price of high-technology products to erode over time. The purpose of this paper is to describe a decision support tool to maximize profit by prescribing three related decisions: product upgrading, pricing, and production levels. A stochastic dynamic programming (DP) model prescribes when to upgrade a product and what new technologies to incorporate to maintain product competitiveness and profit margins. It deals with the two major sources of risk: demand and the lead time required to complete an upgrade. The objective of the DP model is to maximize expected profit over a given planning horizon. Decision variables prescribe which alternative upgrades to implement and when, as well as pricing and production levels. Benchmarking computational tests are described along with an example demonstrating model application. Managers can use this DP model to coordinate decisions related to product upgrading, pricing, and operations management to better meet business objectives. (C) 2002 Elsevier Science B.V. All rights reserved.
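The structure of such a model can be sketched as a small Bellman recursion (a heavily simplified toy, not the paper's model; the horizon, costs, demand scenarios, and erosion rate are all invented): each period the firm keeps selling or launches an upgrade, margins erode with product age, and the upgrade completes within a period only with some probability, capturing lead-time risk.

```python
from functools import lru_cache

# Toy finite-horizon stochastic DP: keep selling the aging product, or pay
# to launch an upgrade that resets the age but completes this period only
# with probability P_DONE. All numbers are invented for illustration.

HORIZON = 12
UPGRADE_COST = 40.0
P_DONE = 0.7                                   # upgrade completes this period
DEMAND = [(0.3, 50), (0.5, 100), (0.2, 150)]   # (probability, units)
EXP_UNITS = sum(p * d for p, d in DEMAND)      # 95 expected units

def margin(age):
    return max(1.0 - 0.15 * age, 0.1)          # price erosion with product age

@lru_cache(maxsize=None)
def value(t, age):
    """Maximum expected profit from period t onward, given product age."""
    if t == HORIZON:
        return 0.0
    keep = EXP_UNITS * margin(age) + value(t + 1, age + 1)
    launch = (-UPGRADE_COST + EXP_UNITS * margin(age)
              + P_DONE * value(t + 1, 0)
              + (1 - P_DONE) * value(t + 1, age + 1))
    return max(keep, launch)

best_profit = value(0, 0)
```

The real model adds pricing and production levels as decision variables; here demand enters only through its expectation because the toy payoff is linear in units sold.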
Inverse problems, particularly those governed by Partial Differential Equations (PDEs), are prevalent in various scientific and engineering applications, and uncertainty quantification (UQ) of solutions to these problems is essential for informed decision-making. This second part of a two-paper series builds upon the foundation set by the first part, which introduced CUQIpy, a Python software package for computational UQ in inverse problems using a Bayesian framework. In this paper, we extend CUQIpy's capabilities to solve PDE-based Bayesian inverse problems through a general framework that allows the integration of PDEs in CUQIpy, whether expressed natively or using third-party libraries such as FEniCS. CUQIpy offers concise syntax that closely matches mathematical expressions, streamlining the modeling process and enhancing the user experience. The versatility and applicability of CUQIpy to PDE-based Bayesian inverse problems are demonstrated on examples covering parabolic, elliptic and hyperbolic PDEs. This includes problems involving the heat and Poisson equations and application case studies in electrical impedance tomography and photo-acoustic tomography, showcasing the software's efficiency, consistency, and intuitive interface. This comprehensive approach to UQ in PDE-based inverse problems provides accessibility for non-experts and advanced features for experts.
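The Bayesian workflow such software automates can be illustrated in plain NumPy for the simplest case (this is not CUQIpy syntax; the forward operator and all numbers are invented). For a linear Gaussian model y = A x + e with Gaussian prior and noise, the posterior is Gaussian in closed form:

```python
import numpy as np

# Linear Gaussian Bayesian inverse problem, solved in closed form.
# Invented toy: A is a crude integration operator, x_true a smooth signal.

rng = np.random.default_rng(0)
n = 20
A = np.tril(np.ones((n, n))) / n               # toy forward map (integration)
x_true = np.sin(np.linspace(0.0, np.pi, n))
sigma_noise, sigma_prior = 0.01, 1.0
y = A @ x_true + sigma_noise * rng.standard_normal(n)

# Posterior precision, covariance, and mean for the linear Gaussian model.
precision = A.T @ A / sigma_noise**2 + np.eye(n) / sigma_prior**2
cov = np.linalg.inv(precision)
mean = cov @ (A.T @ y) / sigma_noise**2

# Pointwise uncertainty for decision-making: posterior standard deviations,
# always tighter than the prior's.
std = np.sqrt(np.diag(cov))
```

PDE-based problems replace the matrix A with a discretized PDE solve (natively or via FEniCS, per the abstract), and non-Gaussian posteriors require sampling rather than a closed form, which is where a dedicated UQ package earns its keep.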
Graphical models in probability and statistics are a core concept in the areas of probabilistic reasoning and probabilistic programming; graphical models include Bayesian networks and factor graphs. For modeling and formal verification of probabilistic systems, probabilistic automata were introduced. This paper proposes a coherent suite of models consisting of Mixed Systems, Mixed Bayesian Networks, and Mixed Automata, which extend factor graphs, Bayesian networks, and probabilistic automata with the handling of nondeterminism. Each of these models comes with a parallel composition, and we establish clear relations between the three models. We also provide a detailed comparison between Mixed Automata and probabilistic automata.
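The interplay of nondeterminism and probability can be sketched with a toy probabilistic automaton (invented example, not the paper's Mixed Automata formalism): from each state a scheduler nondeterministically picks one of several actions, each action gives a distribution over successors, and best-/worst-case reachability is a max/min over scheduler choices, computed by value iteration.

```python
# Toy probabilistic automaton: states map to actions, actions to
# distributions over successor states. We compute the probability of
# reaching "goal" under the best and the worst scheduler.

TRANS = {
    "s0": {"a": {"s1": 0.5, "sink": 0.5}, "b": {"goal": 0.3, "sink": 0.7}},
    "s1": {"a": {"goal": 0.9, "sink": 0.1}},
}

def reach_prob(pick):
    """pick = max for the best scheduler, min for the worst."""
    v = {"s0": 0.0, "s1": 0.0, "goal": 1.0, "sink": 0.0}
    for _ in range(100):   # iterate the Bellman operator to a fixpoint
        v = {s: pick(sum(p * v[t] for t, p in dist.items())
                     for dist in TRANS[s].values()) if s in TRANS else v[s]
             for s in v}
    return v["s0"]

best = reach_prob(max)    # choose "a" in s0: 0.5 * 0.9 = 0.45
worst = reach_prob(min)   # choose "b" in s0: 0.30
```

A purely probabilistic model would collapse the scheduler's choice into one fixed distribution; keeping it nondeterministic is what makes worst-case verification questions expressible.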
Lifted graphical models provide a language for expressing dependencies between different types of entities, their attributes, and their diverse relations, as well as techniques for probabilistic reasoning in such multi-relational domains. In this survey, we review a general form for a lifted graphical model, a par-factor graph, and show how a number of existing statistical relational representations map to this formalism. We discuss inference algorithms, including lifted inference algorithms, that efficiently compute the answers to probabilistic queries over such models. We also review work in learning lifted graphical models from data. There is a growing need for statistical relational models (whether they go by that name or another), as we are inundated with data which is a mix of structured and unstructured, with entities and relations extracted in a noisy manner from text, and with the need to reason effectively with this data. We hope that this synthesis of ideas from many different research groups will provide an accessible starting point for new researchers in this expanding field.
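The core gain of lifted inference can be shown in miniature (an invented toy parfactor, not any particular algorithm from the survey): when N entities are interchangeable, a query depends only on how many of them are true, so inference sums over N + 1 counts instead of 2^N assignments.

```python
from math import comb

# Invented toy parfactor: each of N people smokes independently with
# probability p, and worlds where at least half the group smokes get
# extra weight. Grouping worlds by count makes inference O(N).

N, p = 40, 0.3

def world_weight(k):
    return 2.0 if k >= N // 2 else 1.0     # hypothetical factor on the count

def binom_pmf(k):
    return comb(N, k) * p**k * (1 - p)**(N - k)

# Lifted computation: the 2^N worlds collapse into N + 1 count classes.
Z = sum(binom_pmf(k) * world_weight(k) for k in range(N + 1))
p_half = sum(binom_pmf(k) * world_weight(k) for k in range(N // 2, N + 1)) / Z
# p_half = P(at least half the group smokes) under the reweighted model.
```

Grounding the same model would enumerate 2^40 joint assignments; exploiting exchangeability is the essence of the lifted inference algorithms the survey reviews.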
The benefits of automating design cycles for Bayesian inference-based algorithms are becoming increasingly recognized by the machine learning community. As a result, interest in probabilistic programming frameworks has increased considerably over the past few years. This paper explores a specific probabilistic programming paradigm, namely message passing in Forney-style factor graphs (FFGs), in the context of automated design of efficient Bayesian signal processing algorithms. To this end, we developed ForneyLab as a Julia toolbox for message passing-based inference in FFGs. We show by example how ForneyLab enables automatic derivation of Bayesian signal processing algorithms, including algorithms for parameter estimation and model comparison. Crucially, due to the modular makeup of the FFG framework, both the model specification and inference methods are readily extensible in ForneyLab. In order to test this framework, we compared variational message passing as implemented by ForneyLab with automatic differentiation variational inference (ADVI) and Monte Carlo methods as implemented by the state-of-the-art tools "Edward" and "Stan". In terms of performance, extensibility, and stability, ForneyLab appears to enjoy an edge relative to its competitors for automated inference in state-space models. (C) 2018 Elsevier Inc. All rights reserved.
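The elementary operation such toolboxes derive symbolically is the sum-product message. One step on a tiny two-variable factor graph, shown in Python with invented tables (not ForneyLab's Julia API): a factor-to-variable message multiplies the factor table by the incoming message and sums out the factor's other variable.

```python
import numpy as np

# Sum-product message passing on a two-node factor graph with binary
# variables X and Y. Tables are invented for illustration.

prior_x = np.array([0.7, 0.3])        # f1(x): prior over X
cond_yx = np.array([[0.9, 0.1],       # f2(x, y): P(y | x), rows indexed by x
                    [0.2, 0.8]])

msg_x_to_f2 = prior_x                 # f1 -> X -> f2: just the prior
msg_f2_to_y = (cond_yx * msg_x_to_f2[:, None]).sum(axis=0)   # sum out x

marginal_y = msg_f2_to_y / msg_f2_to_y.sum()   # P(y) = [0.69, 0.31]
```

What ForneyLab automates, per the abstract, is deriving and scheduling such updates symbolically for an entire FFG, including variational variants, rather than writing them by hand.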
Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function's (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.
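Forward-mode AD, one of the variants the paper derives, can be sketched with dual numbers (shown here in Python rather than the paper's Prolog): every value carries a (value, derivative) pair, and each arithmetic operation propagates both via the chain rule.

```python
# Forward-mode automatic differentiation with dual numbers.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, other):          # product rule for the derivative part
        o = self._lift(other)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x with derivative seed 1 and read off f'(x)."""
    return f(Dual(x, 1.0)).dot

# d/dx (x^3 + 2x) at x = 3 is 3x^2 + 2 = 29.
d = derivative(lambda x: x * x * x + 2 * x, 3.0)
```

Reverse mode instead records the computation and propagates adjoints backwards, which is cheaper when one output depends on many inputs, as in the gradient-based parameter learning applications the abstract mentions.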
A large-scale model to support the online scheduling of power generation at 5-minute intervals is developed using a form of stochastic linear programming. This model takes explicit account of the expected recourse action associated with the mismatch between dispatched generation and actual load demanded. Various experimental results on data for the S.W. Region of the CEGB demonstrate the economic efficiency of this approach over the more conventional deterministic linear programming model.
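The recourse idea can be reduced to a one-variable toy (all numbers invented; the real model is a large-scale LP): commit generation g now at a unit cost, then pay a shortfall penalty once demand is revealed. The deterministic plan covers expected demand; the stochastic plan minimizes expected total cost over scenarios and hedges upward because shortfall is expensive.

```python
# One-variable two-stage recourse toy: commit g, then pay for shortfall.
# Invented numbers; solved by grid search rather than an LP solver.

COST = 1.0                                              # per committed unit
PENALTY = 4.0                                           # per unit of shortfall
SCENARIOS = [(0.3, 80.0), (0.4, 100.0), (0.3, 130.0)]   # (prob, demand)

def expected_cost(g):
    shortfall = sum(p * max(d - g, 0.0) for p, d in SCENARIOS)
    return COST * g + PENALTY * shortfall

g_det = sum(p * d for p, d in SCENARIOS)                # expected demand: 103
g_sto = min((k / 10 for k in range(0, 2001)), key=expected_cost)
# g_sto covers the worst scenario (130) because the critical ratio
# COST / PENALTY = 0.25 is below P(demand > 100) = 0.3.
```

Planning for the mean demand leaves an expensive expected shortfall; pricing the recourse action explicitly is what gives the stochastic model its economic edge over the deterministic LP.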