A stochastic constitutive model is developed for describing the continuum-scale mechanical response of disordered cellular materials. In the present work, attention is restricted to finite-strain uni-axial compression under quasi-static loading conditions. The development begins with an established cellular-scale mechanical model, but departs from traditional modeling approaches by generalizing the cellular-scale model to accommodate finite strain. The continuum-scale model is obtained by averaging the cellular-scale mechanical response over an ensemble of foam cells. Various stochastic material representations are considered through the use of probability density functions for the relevant material parameters, and the effects of the various representations on the continuum-scale response are investigated. Combining cellular-scale mechanics with a stochastic material representation to derive a continuum-scale constitutive model offers a promising new approach for simulating the finite-strain response of cellular materials. Results demonstrate that increasing a material's degree of polydispersity can produce the same stiffening effects as increasing the initial solid-volume fraction. Additionally, particular stochastic material representations are shown to provide upper and lower bounds on the mechanical response of the cellular materials under investigation, while suitable choices for the stochastic representation are shown to accurately reproduce experimental stress-strain data through the large deformations associated with densification. Published by Elsevier Ltd.
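The ensemble-averaging idea described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the cellular-scale stress form, the stiffness scaling with solid-volume fraction, and the choice of a lognormal parameter PDF are toy stand-ins, not the paper's actual model.

```python
import math
import random

def cell_stress(strain, phi):
    """Toy cellular-scale stress response (assumed form, not the
    paper's model): stiffness grows with solid-volume fraction phi
    and the response stiffens sharply near densification."""
    youngs = 1.0e6 * phi ** 2          # assumed stiffness scaling
    return youngs * strain / (1.0 - strain) ** 2

def continuum_stress(strain, mean_phi=0.1, spread=0.3, n_cells=20000, seed=0):
    """Continuum-scale stress obtained by averaging cell responses over
    an ensemble whose solid-volume fractions follow an assumed
    lognormal PDF; 'spread' controls the degree of polydispersity."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + spread ** 2))
    mu = math.log(mean_phi) - 0.5 * sigma ** 2   # keeps the mean fixed
    total = 0.0
    for _ in range(n_cells):
        phi = rng.lognormvariate(mu, sigma)
        total += cell_stress(strain, phi)
    return total / n_cells

# Widening the PDF at fixed mean stiffens the averaged response,
# mimicking the stiffening effect of a larger solid-volume fraction.
monodisperse = continuum_stress(0.5, spread=0.01)
polydisperse = continuum_stress(0.5, spread=0.6)
```

Because the assumed stiffness is convex in `phi`, spreading the distribution at a fixed mean raises the ensemble average, which is the qualitative polydispersity effect the abstract reports.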
Maximum likelihood is one of the most widely used techniques to infer evolutionary histories. Although it is thought to be intractable, a proof of its hardness has been lacking. Here, we give a short proof that computing the maximum likelihood tree is NP-hard by exploiting a connection between likelihood and parsimony observed by Tuffley and Steel.
ISBN (e-book): 9781470413040
ISBN (print): 9780821841358
The book treats free probability theory, which has been extensively developed since the early 1980s. The emphasis is put on entropy and the random matrix model approach. The volume is a unique presentation demonstrating the extensive interrelation between the topics. Wigner's theorem and its broad generalizations, such as asymptotic freeness of independent matrices, are explained in detail. Consistent throughout the book is the parallelism between the normal and semicircle laws. Voiculescu's multivariate free entropy theory is presented with full proofs and extends the results to unitary operators. Some applications to operator algebras are also given. Based on lectures given by the authors in Hungary, Japan, and Italy, the book is a good reference for mathematicians interested in free probability theory and can serve as a text for an advanced graduate course.
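The normal/semicircle parallelism the blurb mentions centers on Wigner's semicircle law. As a standard reference point (not tied to this book's notation), the standard semicircle density is

```latex
\rho_{\mathrm{sc}}(x) \;=\; \frac{1}{2\pi}\sqrt{4 - x^2}\;\mathbf{1}_{[-2,2]}(x),
```

which plays the role in free probability that the Gaussian plays classically: it is the limiting eigenvalue distribution in Wigner's theorem and is stable under free convolution.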
ISBN (e-book): 9781383029673
ISBN (print): 9780198567011
Discrete or count data arise in experiments where the outcome variables are the numbers of individuals classified into unique, non-overlapping categories. This revised edition describes the statistical models used in the analysis and summary of such data, and provides a sound introduction to the subject for graduate students and practitioners needing a review of the methodology. With many numerical examples throughout, it includes topics not covered in depth elsewhere, such as the negative multinomial distribution; the many forms of the hypergeometric distribution; and coordinate-free models. A detailed treatment of sample size estimation and power is given in terms of both exact inference and asymptotic, non-central chi-squared methods. A new section covering Poisson regression has also been included. An important feature of this book, missing elsewhere, is the integration of the software into the text. Many more exercises are provided (including 84% more applied exercises) than in the previous edition, helping consolidate the reader's understanding of all subjects covered, and making the book highly suitable for use in a classroom setting. Several new datasets, mostly from the health and medical sector, are discussed, including previously unpublished data from a study of Tourette's Syndrome in children.
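Poisson regression, one of the topics the blurb highlights, can be sketched from first principles. This is a minimal illustration, not the book's software: a log-link Poisson model with one covariate fitted by iteratively reweighted least squares (IRLS), with the true coefficients, sample size, and covariate range all chosen arbitrarily for the demo.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's multiplication method for Poisson draws (fine for small rates)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= L:
            return k - 1

def fit_poisson_glm(x, y, n_iter=25):
    """Poisson regression with a log link (intercept + one covariate),
    fitted by IRLS: weights mu, working response eta + (y - mu)/mu."""
    b0, b1 = math.log(sum(y) / len(y) + 1e-9), 0.0
    for _ in range(n_iter):
        sww = swx = swxx = swz = swxz = 0.0
        for xi, yi in zip(x, y):
            eta = b0 + b1 * xi
            mu = math.exp(eta)
            z = eta + (yi - mu) / mu            # working response
            sww += mu; swx += mu * xi; swxx += mu * xi * xi
            swz += mu * z; swxz += mu * xi * z
        det = sww * swxx - swx * swx            # 2x2 weighted normal equations
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sww * swxz - swx * swz) / det
    return b0, b1

rng = random.Random(42)
true_b0, true_b1 = 0.5, 0.8                     # arbitrary demo values
x = [rng.uniform(0.0, 2.0) for _ in range(2000)]
y = [poisson_sample(rng, math.exp(true_b0 + true_b1 * xi)) for xi in x]
b0, b1 = fit_poisson_glm(x, y)
```

With a couple of thousand observations the IRLS estimates land close to the generating coefficients, which is the basic sanity check any GLM fit should pass.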
ISBN (e-book): 9781470413583
ISBN (print): 9781470418700
The book is devoted to the results on large deviations for a class of stochastic processes. Following an introduction and overview, the material is presented in three parts. Part 1 gives necessary and sufficient conditions for exponential tightness that are analogous to conditions for tightness in the theory of weak convergence. Part 2 focuses on Markov processes in metric spaces. For a sequence of such processes, convergence of Fleming's logarithmically transformed nonlinear semigroups is shown to imply the large deviation principle in a manner analogous to the use of convergence of linear semigroups in weak convergence. Viscosity solution methods provide applicable conditions for the necessary convergence. Part 3 discusses methods for verifying the comparison principle for viscosity solutions and applies the general theory to obtain a variety of new and known results on large deviations for Markov processes. In examples concerning infinite dimensional state spaces, new comparison principles are derived for a class of Hamilton-Jacobi equations in Hilbert spaces and in spaces of probability measures.
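For orientation, the large deviation principle the book studies takes the standard form: a sequence of random variables $\{X_n\}$ satisfies the LDP with rate function $I$ if, for every measurable set $A$,

```latex
-\inf_{x \in A^{\circ}} I(x)
\;\le\; \liminf_{n\to\infty} \frac{1}{n}\log P(X_n \in A)
\;\le\; \limsup_{n\to\infty} \frac{1}{n}\log P(X_n \in A)
\;\le\; -\inf_{x \in \bar{A}} I(x),
```

with the lower bound over the interior and the upper bound over the closure of $A$. Exponential tightness, the subject of Part 1, plays the role for the LDP that tightness plays for weak convergence.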
In Monte-Carlo photon-tracing methods energy-carrying particles are traced in an environment to generate hit points on object surfaces for simulating global illumination. The surface illumination can be reconstructed from particle hit points by solving a density estimation problem using an orthogonal series. The appropriate number of terms of an orthogonal series used for approximating surface illumination depends on the number of hit points (i.e. the number of samples) as well as illumination discontinuity (i.e. shadow boundaries) on a surface. Existing photon-tracing methods based on orthogonal series density estimation use a pre-specified, fixed number m of terms of an orthogonal series; this results in undesirable visual artifacts, i.e. either near-constant shading across a surface which conceals the true illumination variation when m is very small or excessive illumination oscillation when m is very large. On the other hand, interactive user specification of the number of terms for different surface patches is inefficient and inaccurate, and thus is not a practical solution. In this paper an algorithm is presented for automatically determining on the fly the optimal number of terms to be used in an orthogonal series in order to reconstruct surface illumination from surface hit points. When the optimal number of terms required is too high due to illumination discontinuity of a surface, a heuristic scheme is used to subdivide the surface along the discontinuity boundary into some smaller patches, called sub-patches, so as to allow a smaller number of terms in the orthogonal series to optimally represent illumination on these sub-patches. Experimental results are presented to show that the new method improves upon other existing orthogonal series-based density estimation methods used for global illumination in both running time and memory requirements. (c) 2005 Elsevier Ltd. All rights reserved.
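The core idea of choosing the number of series terms from the data can be sketched in one dimension. This is a generic illustration, not the paper's algorithm: a cosine-series density estimate on [0, 1] where a term is kept only while its estimated squared coefficient exceeds its approximate sampling variance, a simplified Kronmal-Tarter-style stopping rule.

```python
import math
import random

def phi(k, x):
    """Orthonormal cosine basis on [0, 1] (constant term handled separately)."""
    return math.sqrt(2.0) * math.cos(k * math.pi * x)

def series_density(samples, max_terms=30):
    """Cosine-series density estimate with a data-driven term count m:
    stop adding terms once a coefficient estimate falls below its
    noise level (variance of theta_hat_k is at most 2/n)."""
    n = len(samples)
    coeffs = []
    for k in range(1, max_terms + 1):
        theta = sum(phi(k, x) for x in samples) / n
        if theta * theta <= 2.0 / n:        # signal below noise level: stop
            break
        coeffs.append(theta)
    def density(x):
        return 1.0 + sum(t * phi(k + 1, x) for k, t in enumerate(coeffs))
    return density, len(coeffs)

rng = random.Random(1)
# Samples from density 2(1 - x) on [0, 1], drawn via the inverse CDF.
data = [1.0 - math.sqrt(rng.random()) for _ in range(5000)]
f_hat, m = series_density(data)
```

With many samples the dominant coefficients survive the threshold and the reconstruction slopes downward like the true density; in the rendering setting the same trade-off governs smoothing across a patch versus oscillation near shadow boundaries.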
This paper presents a finite element calculation of frictionless, non-adhesive, contact between a rigid plane and an elasto-plastic solid with a self-affine fractal surface. The calculations are conducted within an explicit dynamic Lagrangian framework. The elasto-plastic response of the material is described by a J(2) isotropic plasticity law. Parametric studies are used to establish general relations between contact properties and key material parameters. In all cases, the contact area A rises linearly with the applied load. The rate of increase grows as the yield stress sigma(y) decreases, scaling as a power of sigma(y) over the range typical of real materials. Results for A from different plasticity laws and surface morphologies can all be described by a simple scaling formula. Plasticity produces qualitative changes in the distributions of local pressures in the contact and of the size of connected contact regions. The probability of large local pressures is decreased, while large clusters become more likely. Loading-unloading cycles are considered and the total plastic work is found to be nearly constant over a wide range of yield stresses. (c) 2005 Elsevier Ltd. All rights reserved.
We consider a random aggregate of identical frictionless elastic spheres that has first been subjected to an isotropic compression and then sheared. We assume that the average strain provides a good description of how stress is built up in the initial isotropic compression. However, when calculating the increment in the displacement between a typical pair of contacting particles due to the shearing, we employ force equilibrium for the particles of the pair, assuming that the average strain provides a good approximation for their interactions with their neighbors. The incorporation of these additional degrees of freedom in the displacement of a typical pair relaxes the system, leading to a decrease in the effective moduli of the aggregate. The introduction of simple models for the statistics of the ordinary and conditional averages contributes an additional decrease in moduli. The resulting value of the shear modulus is in far better agreement with that measured in numerical simulations. (C) 2004 Elsevier Ltd. All rights reserved.
ISBN (e-book): 9781383024036
ISBN (print): 9780198524137
This new edition of the successful multi-disciplinary text Statistical Modelling in GLIM takes into account new developments in both statistical software and statistical modelling. Including three new chapters on mixture and random effects models, it provides a comprehensive treatment of the theory of statistical modelling with generalised linear models with an emphasis on applications to practical problems and an expanded discussion of statistical theory. A wide range of case studies is also provided, using the normal, binomial, Poisson, multinomial, gamma, exponential and Weibull distributions. This book is ideal for graduates and research students in applied statistics and a wide range of quantitative disciplines, including biology, medicine and the social sciences. Professional statisticians at all levels will also find it an invaluable desktop companion.
ISBN (e-book): 9781470421397
ISBN (print): 9780821837146
In this book, limit theorems are made accessible by stating everything in terms of a game of tossing a coin: heads or tails. The book is suitable for anyone who would like to learn more about mathematical probability and has had a one-year undergraduate course in analysis.
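The coin-tossing framing makes the central limit theorem easy to demonstrate numerically. A minimal sketch (the toss counts and tolerance are arbitrary demo choices): the standardized number of heads in n fair tosses should behave like a standard normal variable.

```python
import math
import random

def standardized_heads(n_tosses, rng):
    """Number of heads S_n in n fair tosses, centered and scaled:
    (S_n - n/2) / sqrt(n/4), which the CLT says is approximately N(0, 1)."""
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return (heads - n_tosses / 2.0) / math.sqrt(n_tosses / 4.0)

rng = random.Random(7)
zs = [standardized_heads(1000, rng) for _ in range(4000)]
# Empirical P(Z <= 1) should approach Phi(1), about 0.8413.
p_hat = sum(z <= 1.0 for z in zs) / len(zs)
```

The empirical probability lands within Monte-Carlo error of the standard normal CDF value, which is exactly the de Moivre-Laplace statement the book builds from.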