The controlled branching process (CBP) is a generalization of the classical Bienaymé-Galton-Watson branching process and, in the terminology of population dynamics, is used to describe the evolution of populations in which a control of the population size is needed at each generation. In this work, we deal with the problem of estimating the offspring distribution and its main parameters for a CBP with a deterministic control function, assuming that the only observable data are the total number of individuals in each generation. We tackle the problem from a Bayesian perspective in a nonparametric context. We consider a Markov chain Monte Carlo (MCMC) method, in particular the Gibbs sampler, and approximate Bayesian computation (ABC) methodology. The first is a data-imputation method and the second relies on numerical simulations. Through a simulated experiment we evaluate the accuracy of the MCMC and ABC techniques and compare their performance.
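As a rough illustration of the ABC side of this approach, the sketch below applies plain ABC rejection to a CBP whose offspring law, control function, prior and tolerance are all assumed for the example rather than taken from the paper.

```python
# Minimal ABC rejection sketch for a controlled branching process (CBP).
# Assumptions (not from the paper): Poisson(lam) offspring law, deterministic
# control function phi(z) = min(z, K), and a Euclidean distance on the observed
# generation sizes as the ABC discrepancy.
import numpy as np

rng = np.random.default_rng(0)

def simulate_cbp(lam, z0, n_gen, K=50):
    """Simulate generation sizes Z_0,...,Z_n of a CBP with control phi(z) = min(z, K)."""
    z = [z0]
    for _ in range(n_gen):
        progenitors = min(z[-1], K)          # deterministic control function
        z.append(rng.poisson(lam, size=progenitors).sum() if progenitors else 0)
    return np.array(z)

def abc_rejection(z_obs, n_draws=20000, tol=40.0):
    """Accept lam draws whose simulated trajectory stays close to the observed one."""
    accepted = []
    for _ in range(n_draws):
        lam = rng.gamma(2.0, 1.0)            # prior on the offspring mean (assumed)
        z_sim = simulate_cbp(lam, z_obs[0], len(z_obs) - 1)
        if np.linalg.norm(z_sim - z_obs) < tol:
            accepted.append(lam)
    return np.array(accepted)

z_obs = simulate_cbp(1.3, z0=5, n_gen=12)    # synthetic "observed" generation sizes
post = abc_rejection(z_obs)
if post.size:
    print(f"{post.size} accepted, posterior mean of the offspring mean ~ {post.mean():.2f}")
```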
Constitutive models for biological tissue are typically formulated as a mixture of constituents, and the overall response is then assembled by superposition or compatibility. This ensures that the stress response of the biological tissue lies in the range of a given constitutive relationship, guaranteeing that at least one parameter combination exists so that an experimental response can be captured sufficiently well. Another, perhaps more challenging, problem is to use constitutive models as a proxy to infer the structure/function of a biological tissue from experiments. In other words, we determine the optimal set of parameters by solving an inverse problem and use these parameters to infer the integrity of the tissue constituents. In previous studies, we focused on the mechanical stress-stretch response of the murine patellar tendon at various age and healing timepoints and solved the inverse problem using three constitutive models, i.e., the Freed-Rajagopal, Gasser-Ogden-Holzapfel and Shearer models, in order of increasing microstructural detail. Herein, we extend this work by adopting a Bayesian perspective on parameter estimation and implement the constitutive relations in the tulip library for uncertainty analysis, critically analyzing parameter marginals, correlations, identifiability and sensitivity. Our results show the importance of investigating the variability of parameter estimates and that results from optimization may be misleading, particularly for models with many parameters inferred from limited experimental evidence. In our study, we show that different age and healing conditions do not correspond to a statistically significant separation among the Gasser-Ogden-Holzapfel and Shearer model parameters, while the phenomenological Freed-Rajagopal model is instead characterized by better identifiability and parameter learning. Use of the complete experimental observations rather than averaged stress-stretch responses appears to positively constrain inference and re
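A minimal sketch of the Bayesian parameter-estimation step, assuming a simplified two-parameter exponential stress-stretch relation and a plain random-walk Metropolis sampler; it does not reproduce the named constitutive models or the tulip library.

```python
# Hedged sketch of Bayesian parameter estimation for a stress-stretch curve.
# The two-parameter exponential relation below is a simplified placeholder, not
# the Freed-Rajagopal, Gasser-Ogden-Holzapfel or Shearer model, and the sampler
# is a plain random-walk Metropolis, not the tulip library.
import numpy as np

rng = np.random.default_rng(1)

def stress(params, stretch):
    c1, c2 = params
    return c1 * (np.exp(c2 * (stretch**2 - 1.0)) - 1.0)

def log_post(params, stretch, sigma_obs, noise_sd=0.05):
    if np.any(params <= 0):                          # positivity prior
        return -np.inf
    resid = sigma_obs - stress(params, stretch)
    return -0.5 * np.sum((resid / noise_sd) ** 2)    # Gaussian likelihood, flat prior on R+

# Synthetic "experiment"
stretch = np.linspace(1.0, 1.1, 25)
sigma_obs = stress((0.3, 8.0), stretch) + rng.normal(0, 0.05, stretch.size)

# Random-walk Metropolis
theta, lp, chain = np.array([0.5, 5.0]), -np.inf, []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.2])
    lp_prop = log_post(prop, stretch, sigma_obs)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])                       # discard burn-in
print(chain.mean(axis=0), np.corrcoef(chain.T))      # marginal means and parameter correlation
```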
A simple approach to the identification of geometrical and material uncertainties of wood is presented. This stochastic mechanics problem combines classical micromechanics, computational homogenization and experimental measurements with Bayesian inference to estimate the model parameters, including the characteristics of errors in the macroscopic elastic properties of wood caused by randomness of microstructural details on the one hand and by experimental errors on the other. The former source of uncertainty includes, for example, variability in microfibril angle and growth ring density. Even such a limited consideration of random input illustrates the need for a combined computational and experimental approach to a reliable prediction of the desired material properties. Tying the two approaches together in the framework of a Bayesian statistical method proves useful in addressing their limitations and thus gives a better notion of the credibility of the prediction. This is demonstrated here on one particular example of spruce wood.
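The sketch below illustrates only the Bayesian step, on a toy micro-to-macro map with a single microfibril-angle parameter and a closed-form modulus formula assumed for the example; the likelihood combines an assumed model error with an assumed measurement error.

```python
# Illustrative sketch of the Bayesian step: infer a microstructural parameter
# (here only the microfibril angle, "mfa") from measured macroscopic moduli.
# The closed-form "homogenization" map below is a toy placeholder, not the
# micromechanics/homogenization chain used in the paper.
import numpy as np

def macro_modulus(mfa_deg, E_axial=35.0):
    """Toy micro-to-macro map: stiffness drops as the microfibril angle grows (GPa)."""
    return E_axial * np.cos(np.radians(mfa_deg)) ** 4

E_meas = np.array([11.2, 10.5, 11.9, 10.8])      # hypothetical measured moduli (GPa)
meas_sd, model_sd = 0.5, 1.0                     # experimental and model error (assumed)
total_sd = np.hypot(meas_sd, model_sd)           # both error sources enter the likelihood

mfa_grid = np.linspace(0.0, 45.0, 901)           # prior: uniform on [0, 45] degrees
loglik = np.array([-0.5 * np.sum(((E_meas - macro_modulus(a)) / total_sd) ** 2)
                   for a in mfa_grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()                               # normalized posterior weights on the grid

mean_mfa = np.sum(mfa_grid * post)
print(f"posterior mean microfibril angle ~ {mean_mfa:.1f} deg")
```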
We estimate the maximizer of posterior marginals (MPM) for a multi-valued information symbol with a code-division multiple access (CDMA) demodulator using a multi-valued spreading code sequence, within the framework of Bayesian inference. We calculate the estimation error of the CDMA demodulator and the information capacity of the CDMA channel by using the replica method. By comparing the performance for various degrees of the multi-values, we clarify that, owing to the redundancy in the degree of the multi-values, the multi-valued information symbol and the multi-valued spreading code sequence are useful for improving information transmission.
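For a concrete, small-scale picture of MPM demodulation, the sketch below enumerates the posterior marginals of a toy synchronous CDMA system exactly; the alphabet, spreading length and noise level are illustrative, and no replica calculation is attempted.

```python
# Toy sketch of MPM (maximizer of posterior marginals) demodulation for a small
# synchronous CDMA system, computed by brute-force enumeration rather than the
# replica method used in the paper. All system parameters are assumptions.
import itertools
import numpy as np

rng = np.random.default_rng(3)

K, N, Q, sigma = 3, 8, 4, 0.8                       # users, spreading length, alphabet size, noise
alphabet = np.arange(Q) - (Q - 1) / 2.0             # zero-mean multi-valued symbols
S = rng.choice(alphabet, size=(N, K)) / np.sqrt(N)  # multi-valued spreading codes

x_true = rng.choice(alphabet, size=K)               # transmitted symbols
y = S @ x_true + sigma * rng.normal(size=N)         # received chip sequence

# Posterior over all Q^K symbol vectors (uniform prior), then per-user marginals.
marginals = np.zeros((K, Q))
for combo in itertools.product(range(Q), repeat=K):
    x = alphabet[list(combo)]
    logp = -0.5 * np.sum((y - S @ x) ** 2) / sigma**2
    for k, q in enumerate(combo):
        marginals[k, q] += np.exp(logp)

x_mpm = alphabet[marginals.argmax(axis=1)]          # MPM estimate maximizes each marginal
print("true:", x_true, "MPM:", x_mpm)
```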
The scope of this study is to introduce the reader to Bayesian inference applied to the evaluation of measurement uncertainty and conformity assessment in the field of radiofrequency (RF) and electromagnetic compatibility (EMC) measurements and testing. The advantages stemming from the use of Bayesian inference with respect to the consolidated theoretical framework provided by the Guide to the Expression of Uncertainty in Measurement (widely known as the "GUM") are emphasized, also in order to appreciate the reasons behind the ongoing revision of the GUM itself. An important result of Bayesian inference has already been implemented in two guides to the evaluation of EMC measurement uncertainty, namely, the IEC TR 61000-1-6 and the standard ANSI C63.23. Further, it is shown here that, through Bayesian inference, mathematical tools can be derived for the conformity assessment of a distribution of values, such as the electric field over a surface, taking into proper account both the intrinsic variability among the values of the distribution and the measurement uncertainty of each value. The theoretical background is first introduced, and then two applications of Bayesian inference to measurement uncertainty and conformity assessment in the field of RF and EMC measurements and testing are thoroughly described.
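A minimal conjugate-Gaussian sketch of the conformity-assessment idea, assuming illustrative prior, measurement and limit values rather than anything prescribed by the GUM, IEC TR 61000-1-6 or ANSI C63.23.

```python
# Minimal conjugate-Gaussian sketch of Bayesian conformity assessment: combine a
# prior on the measurand with a measurement of known uncertainty and compute the
# posterior probability that the true value stays below a limit.
import math

def gaussian_update(prior_mean, prior_sd, meas_value, meas_sd):
    """Posterior mean/sd for a Gaussian prior and a Gaussian measurement model."""
    w_prior, w_meas = 1.0 / prior_sd**2, 1.0 / meas_sd**2
    post_var = 1.0 / (w_prior + w_meas)
    post_mean = post_var * (w_prior * prior_mean + w_meas * meas_value)
    return post_mean, math.sqrt(post_var)

def prob_conform(post_mean, post_sd, limit):
    """P(measurand <= limit) under the Gaussian posterior."""
    z = (limit - post_mean) / post_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical field-strength measurement (dBuV/m) against an emission limit.
m, s = gaussian_update(prior_mean=38.0, prior_sd=4.0, meas_value=41.5, meas_sd=2.0)
print(f"posterior: {m:.1f} +/- {s:.1f} dBuV/m, P(conform) = {prob_conform(m, s, 43.0):.2f}")
```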
While probabilistic techniques have previously been investigated extensively for performing inference over the space of metric maps, no corresponding general-purpose methods exist for topological maps. We present the concept of probabilistic topological maps (PTMs), a sample-based representation that approximates the posterior distribution over topologies, given the available sensor measurements. We show that the space of topologies is equivalent to the intractably large space of set partitions of the set of available measurements. The combinatorial nature of the problem is overcome by computing an approximate, sample-based representation of the posterior. The PTM is obtained by performing Bayesian inference over the space of all possible topologies, and it provides a systematic solution to the problem of perceptual aliasing in the domain of topological mapping. In this paper, we describe a general framework for modeling measurements, and the use of a Markov chain Monte Carlo algorithm that uses specific instances of these models for odometry and appearance measurements to estimate the posterior distribution. We present experimental results that validate our technique and show that it generates good maps when using odometry and appearance, derived from panoramic images, as sensor measurements.
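The sketch below samples set partitions of a handful of measurements with a Metropolis chain, using an assumed appearance-only score in place of the paper's odometry and appearance measurement models.

```python
# Sketch of Metropolis sampling over set partitions of measurements, the space
# the paper identifies with topologies. The appearance-only score is a stand-in
# (similar descriptors favour the same group), not the full PTM models.
import numpy as np

rng = np.random.default_rng(4)

def log_score(labels, descriptors, sigma=0.3):
    """Reward partitions that group mutually similar appearance descriptors."""
    total, n = 0.0, len(labels)
    for i in range(n):
        for j in range(i + 1, n):
            if labels[i] == labels[j]:
                d = np.linalg.norm(descriptors[i] - descriptors[j])
                total += -0.5 * (d / sigma) ** 2 + 1.0   # small bonus for merging
    return total

def sample_partitions(descriptors, n_iter=5000):
    n = len(descriptors)
    labels = np.arange(n)                        # start with every measurement on its own
    lp, samples = log_score(labels, descriptors), []
    for _ in range(n_iter):
        prop = labels.copy()
        prop[rng.integers(n)] = rng.integers(n)  # move one measurement to another group id
        lp_prop = log_score(prop, descriptors)
        if np.log(rng.uniform()) < lp_prop - lp:
            labels, lp = prop, lp_prop
        samples.append(labels.copy())
    return samples

descriptors = rng.normal(size=(6, 8))            # six hypothetical appearance vectors
samples = sample_partitions(descriptors)
print(samples[-1])                               # one sampled partition, as group labels
```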
The effective ground thermal conductivity and borehole thermal resistance constitute information needed to design a ground-source heat pump (GSHP). In situ thermal response tests (TRTs) are considered reliable for obtaining these parameters, but interpreting TRT data with a deterministic approach may result in significant uncertainties in the estimates. In light of the impact of the two parameters on GSHP applications, the quantification of uncertainties is necessary. For this purpose, in this study we develop a stochastic method based on Bayesian inference to estimate the two parameters and the associated uncertainties. Numerically generated noisy TRT data and reference sandbox TRT data were used to verify the proposed method. The posterior probability density functions obtained were used to extract the point estimates of the parameters and their credible intervals. Following its verification, the proposed method was applied to in situ TRT data, and the relationship between test time and estimation accuracy was examined. The minimum TRT time of 36 h recommended by ASHRAE produced an uncertainty of approximately ±21% for the effective thermal conductivity. However, the uncertainty of estimation decreased exponentially with increasing TRT time and was ±8.3% after a TRT time of 54 h, lower than the generally acceptable uncertainty range of ±10%. Based on the obtained results, a minimum TRT time of 50 h is suggested, and a time of 72 h is expected to produce sufficiently accurate estimates for most cases.
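As an illustration of the inference step, the sketch below fits the ground thermal conductivity and borehole resistance of the infinite line-source model to synthetic TRT data with a random-walk Metropolis sampler; the heat rate, borehole radius, volumetric heat capacity and noise level are assumed values, not the paper's sandbox or in situ data.

```python
# Sketch of the Bayesian step for a thermal response test (TRT): infer ground
# thermal conductivity k and borehole resistance Rb from mean fluid temperature
# using the infinite line-source model and random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(5)
GAMMA = 0.5772156649                             # Euler-Mascheroni constant

q, r_b, C_vol, T_g = 50.0, 0.075, 2.4e6, 12.0    # W/m, m, J/(m^3 K), undisturbed temperature (assumed)

def fluid_temp(k, Rb, t):
    """Infinite line-source approximation of the mean fluid temperature."""
    alpha = k / C_vol
    return T_g + q * Rb + q / (4 * np.pi * k) * (np.log(4 * alpha * t / r_b**2) - GAMMA)

t = np.linspace(10, 48, 150) * 3600.0            # 10-48 h of test time, in seconds
T_obs = fluid_temp(2.2, 0.10, t) + rng.normal(0, 0.1, t.size)   # synthetic noisy TRT data

def log_post(theta, noise_sd=0.1):
    k, Rb = theta
    if k <= 0 or Rb <= 0:
        return -np.inf                           # positivity prior
    resid = T_obs - fluid_temp(k, Rb, t)
    return -0.5 * np.sum((resid / noise_sd) ** 2)

theta, lp, chain = np.array([1.5, 0.15]), -np.inf, []
for _ in range(30000):
    prop = theta + rng.normal(0, [0.05, 0.005])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[10000:])
lo, hi = np.percentile(chain, [2.5, 97.5], axis=0)
print("posterior means:", chain.mean(axis=0), "95% credible intervals:", lo, hi)
```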
A new, frequency-domain, behavioral modeling methodology for gallium nitride (GaN) high-electron-mobility transistors (HEMTs), based on Bayesian inference theory, is presented in this paper. Several different probability distribution (kernel) functions are examined for the Bayesian-based modeling architecture, with the optimal kernel function identified through experimental testing. These results are compared to an alternative approach based on artificial neural networks (ANNs), with the data showing that the proposed approach achieves improved accuracy while at the same time alleviating the well-known ANN overfitting issue. Model verification is performed at the fundamental and harmonic frequencies using the identified optimal kernel, through comparisons with simulated data from a reference nonlinear circuit model and with experimental data from separate 2- and 10-W GaN HEMT devices, over a wide range of load conditions. The models can accurately predict the area of optimal fundamental output power on the Smith chart and the area of optimal power efficiency. Furthermore, the ability of the model to interpolate across input power levels and input frequencies is also tested, with excellent fidelity to the simulated and measured data obtained.
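A hedged sketch of a Bayesian kernel (Gaussian-process style) regression comparing two candidate kernel functions on a toy one-dimensional power sweep; the paper's behavioural model maps multi-dimensional load, bias and frequency conditions, not a single variable.

```python
# Toy Bayesian kernel regression of output power versus input power, comparing
# two kernel functions. Data and kernels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)

def rbf(x1, x2, ell=2.0):
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ell) ** 2)

def rational_quadratic(x1, x2, ell=2.0, a=1.0):
    return (1 + ((x1[:, None] - x2[None, :]) ** 2) / (2 * a * ell**2)) ** (-a)

def gp_predict(kernel, x_train, y_train, x_test, noise=0.05):
    K = kernel(x_train, x_train) + noise**2 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return kernel(x_test, x_train) @ alpha       # posterior mean at the test inputs

p_in = np.linspace(-10, 10, 20)                  # input power sweep (dBm, hypothetical)
p_out = 20 + p_in - np.log1p(np.exp(p_in - 5.0)) + rng.normal(0, 0.05, p_in.size)  # compression-like curve

p_test = np.linspace(-10, 10, 200)
for name, kern in [("RBF", rbf), ("rational quadratic", rational_quadratic)]:
    pred = gp_predict(kern, p_in, p_out, p_test)
    print(name, "max predicted output power:", round(float(pred.max()), 2), "dBm")
```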
Development of probabilistic modelling tools to perform Bayesian inference and uncertainty quantification (UQ) is a challenging task for practical hydrogen-enriched and low-emission combustion systems, due to the need to account simultaneously for simulated fluid dynamics and detailed combustion chemistry. A large number of evaluations is required to calibrate models and estimate parameters using experimental data within the framework of Bayesian inference. This task is computationally prohibitive with high-fidelity, deterministic approaches such as large eddy simulation (LES) for designing and optimizing combustion systems. Therefore, there is a need to develop methods that (a) are suitable for Bayesian inference studies and (b) characterize a range of solutions based on the uncertainty of modelling parameters and input conditions. This paper aims to develop a computationally efficient toolchain to address these issues for probabilistic modelling of NOx emission in hydrogen-enriched and lean-premixed combustion systems. A novel method is implemented in the toolchain using a chemical reactor network (CRN) model, non-intrusive polynomial chaos expansion based on the point collocation method (NIPCE-PCM), and the Markov chain Monte Carlo (MCMC) method. First, a CRN model is generated for a combustion system burning hydrogen-enriched methane/air mixtures at high-pressure lean-premixed conditions to compute NOx emission. A set of metamodels is then developed using NIPCE-PCM as a computationally efficient alternative to the physics-based CRN model. These surrogate models and experimental data are then used in the MCMC method to perform a two-step Bayesian calibration to maximize the agreement between model predictions and measurements. The average standard deviations for the prediction of exit temperature and NOx emission are reduced by almost 90% using this method. The calibrated model is then used with confidence for global sensitivity and reliability analysis
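The toolchain logic can be sketched, under toy assumptions, as a polynomial-chaos surrogate fitted by point collocation followed by MCMC calibration; the quadratic stand-in for the CRN model and all numerical values below are illustrative.

```python
# Toolchain sketch under toy assumptions: (1) replace an expensive model with a
# polynomial-chaos surrogate fitted by point collocation (least squares), then
# (2) calibrate the uncertain parameter with random-walk Metropolis against a
# measurement. The quadratic "CRN" stand-in is purely illustrative.
import numpy as np

rng = np.random.default_rng(7)

def expensive_model(theta):
    """Stand-in for the CRN evaluation: a NOx-like output versus one parameter in [-1, 1]."""
    return 25.0 + 8.0 * theta + 3.0 * theta**2

# --- Step 1: NIPCE by point collocation with Legendre polynomials up to degree 3
def legendre_basis(x, deg=3):
    return np.array([np.polynomial.legendre.legval(x, [0] * d + [1]) for d in range(deg + 1)]).T

xi = np.linspace(-1, 1, 9)                        # collocation points
coeffs, *_ = np.linalg.lstsq(legendre_basis(xi), expensive_model(xi), rcond=None)
surrogate = lambda x: legendre_basis(np.atleast_1d(x)) @ coeffs

# --- Step 2: MCMC calibration of theta against a noisy "measurement"
y_meas, noise_sd = 27.0, 1.0
def log_post(theta):
    if abs(theta) > 1:
        return -np.inf                            # uniform prior on [-1, 1]
    return -0.5 * ((y_meas - surrogate(theta)[0]) / noise_sd) ** 2

theta, lp, chain = 0.0, log_post(0.0), []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])
print("calibrated theta:", round(chain.mean(), 3), "+/-", round(chain.std(), 3))
```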
Unlike the classical linear model, nonlinear generative models have been addressed only sparsely in the statistical learning literature. This work aims to shed light on these models and their secrecy potential. To this end, we invoke the replica method to derive the asymptotic normalized cross entropy in an inverse probability problem whose generative model is described by a Gaussian random field with a generic covariance function. Our derivations further demonstrate the asymptotic statistical decoupling of the Bayesian estimator and specify the decoupled setting for a given nonlinear model. The replica solution shows that strictly nonlinear models exhibit an all-or-nothing phase transition: there exists a critical load at which optimal Bayesian inference changes from perfect to uncorrelated learning. Based on this finding, we design a new secure coding scheme which achieves the secrecy capacity of the wiretap channel. This interesting result implies that strictly nonlinear generative models are perfectly secure without any secure coding. We justify this latter statement through the analysis of an illustrative model for perfectly secure and reliable inference.