The diagnosis/prognosis problem has already been introduced by the authors in previous papers as a classification problem for survival data. In this paper, the specific aspects of the estimation of the survival functions in diagnostic classes and the evaluation of the posterior probabilities of the diagnostic classes are addressed; a latent random variable Z is defined to denote the classification of censored and uncensored individuals, where early censored individuals cannot be immediately classified because Z is not observed. Parameter estimation of the mixture survival model thus derived is carried out using a proper version of the EM algorithm with given prior probabilities on Z, and diagnostic/prognostic information provided by the observable covariates is also incorporated into the model. Numerical examples using AIDS data and a simulation study are used to better outline the main features of the model and of the estimation methodology.
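A minimal sketch of the kind of EM fit this abstract describes, reduced to a two-class exponential mixture with right censoring; the hazard rates, mixing proportion, and censoring time below are hypothetical, and the abstract's covariate information is omitted:

```python
import math
import random

random.seed(0)

# Simulate lifetimes from a two-class exponential mixture and censor at a fixed
# time; the latent class labels (the Z of the abstract) are discarded.
TRUE_RATES = (2.0, 0.3)   # hypothetical hazard rates for the two diagnostic classes
TRUE_PI = 0.6             # hypothetical prior probability of class 0
CENSOR_AT = 3.0

times, events = [], []
for _ in range(2000):
    lam = TRUE_RATES[0] if random.random() < TRUE_PI else TRUE_RATES[1]
    t = random.expovariate(lam)
    times.append(min(t, CENSOR_AT))
    events.append(1 if t < CENSOR_AT else 0)   # 1 = failure observed, 0 = censored

def contrib(t, d, lam):
    """Likelihood contribution: density f(t) if uncensored, survival S(t) if censored."""
    return (lam ** d) * math.exp(-lam * t)

pi, lam0, lam1 = 0.5, 1.0, 0.1   # starting values
for _ in range(200):
    # E-step: posterior probability that each individual belongs to class 0.
    resp = [pi * contrib(t, d, lam0) /
            (pi * contrib(t, d, lam0) + (1 - pi) * contrib(t, d, lam1))
            for t, d in zip(times, events)]
    # M-step: weighted mixing proportion and weighted events/exposure rates.
    pi = sum(resp) / len(resp)
    lam0 = (sum(r * d for r, d in zip(resp, events)) /
            sum(r * t for r, t in zip(resp, times)))
    lam1 = (sum((1 - r) * d for r, d in zip(resp, events)) /
            sum((1 - r) * t for r, t in zip(resp, times)))
```

The E-step posterior plays the role of the classification of early-censored individuals: it assigns them fractionally to the two classes rather than forcing a hard label.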
In recent years numerous advances in EM methodology have led to algorithms which can be very efficient when compared with both their EM predecessors and other numerical methods (e.g., algorithms based on Newton-Raphson). This article combines several of these new methods to develop a set of mode-finding algorithms for the popular mixed-effects model which are both fast and more reliable than such standard algorithms as PROC MIXED in SAS. We present efficient algorithms for maximum likelihood (ML), restricted maximum likelihood (REML), and computing posterior modes with conjugate proper and improper priors. These algorithms are not only useful in their own right, but also illustrate how parameter expansion, conditional data augmentation, and the ECME algorithm can be used in conjunction to form efficient algorithms. In particular, we illustrate a difficulty in using the typically very efficient PX-EM (parameter-expanded EM) for posterior calculations, but show how algorithms based on conditional data augmentation can be used instead. Finally, we present a result that extends Hobert and Casella's result on the propriety of the posterior for the mixed-effects model under an improper prior, an important concern in Bayesian analysis involving these models that, when not properly understood, has led to difficulties in several applications.
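For orientation, here is plain EM for the simplest mixed-effects model, a balanced one-way random-effects layout, treating the random effects as missing data. This is the baseline algorithm that work of this kind accelerates, not the PX-EM/ECME variants the article develops; all numbers are hypothetical:

```python
import random

random.seed(1)

# Balanced one-way layout y_ij = mu + b_i + e_ij, with b_i ~ N(0, SB^2) and
# e_ij ~ N(0, SE^2) (all values hypothetical).
MU, SB, SE = 5.0, 1.0, 0.5
m, n = 300, 5
groups = []
for _ in range(m):
    b = random.gauss(0.0, SB)
    groups.append([MU + b + random.gauss(0.0, SE) for _ in range(n)])

mu, sb2, se2 = 0.0, 1.0, 1.0   # starting values
for _ in range(500):
    # E-step: posterior mean and variance of each random effect b_i.
    k = n * sb2 / (n * sb2 + se2)       # shrinkage factor
    v = sb2 * se2 / (n * sb2 + se2)     # posterior variance of b_i
    bhat = [k * (sum(g) / n - mu) for g in groups]
    # M-step: refit the grand mean and the two variance components.
    mu = sum(sum(g) for g in groups) / (m * n) - sum(bhat) / m
    sb2 = sum(b * b + v for b in bhat) / m
    se2 = sum((y - mu - b) ** 2 + v
              for g, b in zip(groups, bhat) for y in g) / (m * n)
```

The slow geometric convergence of exactly this iteration when `se2` is small relative to `n * sb2` is what motivates parameter expansion and conditional data augmentation.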
Ambroise et al. (1996) have proposed a clustering algorithm that is well-suited for dealing with spatial data. This algorithm, derived from the EM algorithm (Dempster et al., 1977), has been designed for penalized likelihood estimation in situations with unobserved class labels. Some very satisfactory empirical results led us to believe that this algorithm converges (Ambroise et al., 1996). However, this convergence had not been proven theoretically. In this paper, we present sufficient conditions for convergence together with a proof. A practical application illustrates the use of this algorithm. (C) 1998 Published by Elsevier Science B.V. All rights reserved.
Censored data are commonly observed in industrial experiments such as life testing and reliability improvement. Analyzing censored data from highly fractionated experiments presents a challenging problem to experimenters because many traditional methods become inadequate. Motivated by the data from a fluorescent-lamp experiment, we consider in this article analyzing censored data from highly fractionated experiments using covariance adjustment based on multivariate multiple regression models, which make use of the joint distribution of multivariate response variables. The Bayesian approach is taken for the main statistical inference, and the posterior distribution of the parameters is obtained using the data augmentation algorithm. We illustrate the methodology with the fluorescent-lamp experiment data. With the real example and a simulation study, we show that covariance adjustment can lead to both dramatic variance reduction and possible bias reduction.
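The data augmentation algorithm alternates imputation of the censored responses with draws from the complete-data posterior. A minimal sketch for a single right-censored normal mean with known variance and a flat prior; all values are hypothetical, and the article's multivariate-regression setting is far richer:

```python
import random
from statistics import NormalDist

random.seed(2)
std = NormalDist()

# Right-censored normal responses (all values hypothetical): we record min(y, C).
TRUE_MU, SIGMA, C = 2.0, 1.0, 2.5
raw = [random.gauss(TRUE_MU, SIGMA) for _ in range(1000)]
obs = [min(y, C) for y in raw]
cens = [y >= C for y in raw]

def draw_truncated(mu, lower):
    """Draw from N(mu, SIGMA^2) truncated to (lower, inf) by inverse-CDF sampling."""
    lo = std.cdf((lower - mu) / SIGMA)
    u = min(random.uniform(lo, 1.0), 1.0 - 1e-12)   # keep inv_cdf's argument in (0, 1)
    return mu + SIGMA * std.inv_cdf(u)

# Data augmentation: alternate (i) imputing the censored y's from their
# truncated-normal conditional and (ii) drawing mu from the complete-data
# posterior (flat prior, SIGMA known for brevity).
mu, draws = 0.0, []
for it in range(600):
    filled = [draw_truncated(mu, C) if c else y for y, c in zip(obs, cens)]
    post_mean = sum(filled) / len(filled)
    mu = random.gauss(post_mean, SIGMA / len(filled) ** 0.5)
    if it >= 100:   # discard burn-in
        draws.append(mu)

mu_hat = sum(draws) / len(draws)   # posterior-mean estimate of mu
```

In the covariance-adjustment setting the same two-block structure applies, with the regression coefficients and covariance matrix drawn in place of the scalar mean.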
In a series of recent articles on nonparametric regression, Donoho and Johnstone developed wavelet-shrinkage methods for recovering unknown piecewise-smooth deterministic signals from noisy data. Wavelet shrinkage based on the Bayesian approach involves specifying a prior distribution on the wavelet coefficients, which is usually assumed to have a distribution with zero mean. There is no a priori reason why all prior means should be 0; indeed, one can imagine certain types of signals in which this is not a good choice of model. In this article, we take an empirical Bayes approach in which we propose an estimator for the prior mean that is "plugged into" the Bayesian shrinkage formulas. Another way we are more general than previous work is that we assume that the underlying signal is composed of a piecewise-smooth deterministic part plus a zero-mean stochastic part; that is, the signal may contain a reasonably large number of nonzero wavelet coefficients. Our goal is to predict this signal from noisy data. We also develop a new estimator for the noise variance based on a geostatistical method that considers the behavior of the variogram near the origin. Simulation studies show that our method (DecompShrink) outperforms the well-known VisuShrink and SureShrink methods for recovering a wide variety of signals. Moreover, it is insensitive to the choice of the lowest-scale cut-off parameter, which is typically not the case for other wavelet-shrinkage methods.
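The plug-in shrinkage described here can be illustrated on one resolution level: estimate the prior mean and variance from the observed coefficients, then shrink toward that mean rather than toward zero. A sketch under simplified assumptions (known noise variance, Gaussian prior; the paper's variogram-based variance estimator is not reproduced, and all numbers are hypothetical):

```python
import random

random.seed(3)

# One level of wavelet coefficients: d_j = theta_j + noise, with the theta_j
# drawn around a nonzero prior mean (all numbers hypothetical).
SIGMA2 = 0.25                      # noise variance, assumed known here
M_TRUE, TAU2_TRUE = 1.5, 1.0
theta = [random.gauss(M_TRUE, TAU2_TRUE ** 0.5) for _ in range(500)]
d = [t + random.gauss(0.0, SIGMA2 ** 0.5) for t in theta]

# Plug-in (empirical Bayes) estimates of the prior mean and variance.
m_hat = sum(d) / len(d)
var_d = sum((x - m_hat) ** 2 for x in d) / (len(d) - 1)
tau2_hat = max(var_d - SIGMA2, 0.0)

# Posterior mean shrinks each coefficient toward m_hat rather than toward zero.
w = tau2_hat / (tau2_hat + SIGMA2)
shrunk = [m_hat + w * (x - m_hat) for x in d]

mse_raw = sum((x - t) ** 2 for x, t in zip(d, theta)) / len(d)
mse_eb = sum((s - t) ** 2 for s, t in zip(shrunk, theta)) / len(d)
```

When the true prior mean is far from zero, shrinking toward zero would inflate the error; shrinking toward the estimated mean recovers the usual Bayes risk reduction.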
Author: Albert, P. S. — NCI, Biometric Research Branch, Bethesda, MD 20892, USA
Binary longitudinal data are often collected in clinical trials when interest is in assessing the effect of a treatment over time. Our application is a recent study of opiate addiction that examined the effect of a new treatment on repeated urine tests to assess opiate use over an extended follow-up. Drug addiction is episodic, and a new treatment may affect various features of the opiate-use process, such as the proportion of positive urine tests over follow-up and the time to the first occurrence of a positive test. Complications in this trial were the large amounts of dropout and intermittent missing data and the large number of observations on each subject. We develop a transitional model for longitudinal binary data subject to nonignorable missing data and propose an EM algorithm for parameter estimation. We use the transitional model to derive summary measures of the opiate-use process that can be compared across treatment groups to assess treatment effect. Through analyses and simulations, we show the importance of properly accounting for the missing-data mechanism when assessing the treatment effect in our example.
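A stripped-down illustration of a transitional (first-order Markov) model for binary sequences and the kind of summary measure it yields. Complete data only, so the nonignorable-missingness machinery of the article is not represented, and the transition probabilities are hypothetical:

```python
import random

random.seed(4)

# Simulated urine-test sequences from a two-state Markov chain (probabilities
# hypothetical; covariates, dropout, and intermittent missingness are omitted).
P01, P11 = 0.2, 0.7   # P(positive | previous negative), P(positive | previous positive)
seqs = []
for _ in range(300):
    y = [0]   # everyone starts with a negative test
    for _ in range(20):
        p = P11 if y[-1] == 1 else P01
        y.append(1 if random.random() < p else 0)
    seqs.append(y)

# Complete-data MLE: transition probabilities are conditional frequencies.
n01 = n0 = n11 = n1 = 0
for y in seqs:
    for prev, cur in zip(y, y[1:]):
        if prev == 0:
            n0 += 1
            n01 += cur
        else:
            n1 += 1
            n11 += cur
p01_hat, p11_hat = n01 / n0, n11 / n1

# Summary measure comparable across treatment arms: the stationary proportion
# of positive tests implied by the fitted chain.
prop_pos = p01_hat / (p01_hat + 1.0 - p11_hat)
```

With nonignorable missingness the conditional frequencies above are no longer the MLE, which is precisely why the article needs an EM algorithm.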
In conventional methods for detecting vanishing points and vanishing lines, the observed feature points are clustered into collections that represent different lines. The multiple lines are then detected, the vanishing points are detected as points of intersection of the lines, and the vanishing line is then detected based on those points of intersection. However, for the purpose of optimization, these processes should be integrated and achieved simultaneously. In the present paper, we assume that the observed noise model for the feature points is a two-dimensional Gaussian mixture and define a likelihood function that explicitly includes the vanishing-point and vanishing-line parameters. As a result, the simultaneous detection described above can be formulated as a maximum likelihood estimation problem. In addition, an iterative computation method for achieving this estimation is proposed based on the EM (Expectation-Maximization) algorithm. The proposed method involves new techniques by which stable convergence is achieved and computational cost is reduced. The effectiveness of the proposed method, including these techniques, is confirmed by computer simulations and real images.
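To illustrate the integrated E/M structure, here is an EM fit of a two-line mixture with Gaussian noise in the vertical direction only, a simplification of the paper's full two-dimensional Gaussian mixture with vanishing-point and vanishing-line parameters; all line parameters below are hypothetical:

```python
import math
import random

random.seed(5)

# Points from two noisy lines (slopes/intercepts hypothetical); vertical
# Gaussian noise only, a simplification of the paper's 2-D Gaussian mixture.
LINES = [(1.0, 0.0), (-0.5, 3.0)]   # (slope, intercept)
pts = []
for _ in range(400):
    a, b = random.choice(LINES)
    x = random.uniform(-2.0, 2.0)
    pts.append((x, a * x + b + random.gauss(0.0, 0.1)))

def lik(x, y, a, b, s2):
    """Gaussian likelihood of the vertical residual of (x, y) from y = a*x + b."""
    r = y - (a * x + b)
    return math.exp(-r * r / (2.0 * s2)) / math.sqrt(2.0 * math.pi * s2)

params = [(0.8, 0.5), (-0.2, 2.0)]   # rough initial lines
s2 = 0.5
for _ in range(50):
    # E-step: responsibility of each line for each point (equal mixing weights).
    resp = []
    for x, y in pts:
        w = [lik(x, y, a, b, s2) for a, b in params]
        tot = sum(w)
        resp.append([wi / tot for wi in w])
    # M-step: weighted least-squares refit of each line, then the shared variance.
    new_params = []
    for k in range(2):
        sw = sum(r[k] for r in resp)
        sx = sum(r[k] * x for r, (x, _) in zip(resp, pts))
        sy = sum(r[k] * y for r, (_, y) in zip(resp, pts))
        sxx = sum(r[k] * x * x for r, (x, _) in zip(resp, pts))
        sxy = sum(r[k] * x * y for r, (x, y) in zip(resp, pts))
        a = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
        b = (sy - a * sx) / sw
        new_params.append((a, b))
    params = new_params
    s2 = sum(r[k] * (y - params[k][0] * x - params[k][1]) ** 2
             for r, (x, y) in zip(resp, pts) for k in range(2)) / len(pts)
```

The clustering of points into lines and the fitting of the lines happen in the same iteration; the paper goes one step further by constraining the lines through shared vanishing-point parameters.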
Suppose that some components are initially operated in a certain condition and then switched to operating in a different condition. Working hours of the components in condition 1 and condition 2 are observed, respectively. Of interest is the lifetime distribution F of the component in the second condition only, i.e., the distribution without the prior exposure to the first condition. In this paper, we propose a method to transform the lifetimes obtained in condition 1 to equivalent lifetimes in condition 2 and then use the transformed data to estimate F. Both parametric and nonparametric approaches, each with complete and censored data, are discussed. Numerical studies are presented to investigate the performance of the method. (C) 2000 John Wiley & Sons, Inc.
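In the parametric exponential case, the equal-quantile transform F2^{-1}(F1(t)) reduces to rescaling by the ratio of hazard rates; a sketch with hypothetical rates:

```python
import random

random.seed(6)

# Exponential lifetimes in each condition with hypothetical rates LAM1, LAM2.
# Matching survival probabilities gives t2 = F2^{-1}(F1(t1)) = (LAM1 / LAM2) * t1:
# one condition-1 hour is "worth" LAM1/LAM2 condition-2 hours.
LAM1, LAM2 = 0.5, 0.2
t1 = [random.expovariate(LAM1) for _ in range(5000)]   # observed in condition 1

t2_equiv = [(LAM1 / LAM2) * t for t in t1]             # transformed lifetimes
mean_hat = sum(t2_equiv) / len(t2_equiv)               # estimates 1 / LAM2 = 5
```

The transformed sample can then be pooled with any directly observed condition-2 data before estimating F, parametrically or nonparametrically.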
The determination of toxicokinetic parameters is an essential component in the risk assessment of potentially harmful chemicals. It is a key step in analysing the processes involved in the formation of DNA adducts, which are connected with the development of chemically induced cancer. A general problem is the extrapolation of toxicological data from experimental animals to the human organism. The basis of a toxicokinetic species extrapolation is physiologically-based pharmacokinetic models, which require detailed information about physiological parameters as well as about the kinetic processes involved. Fundamental in the extrapolation from one species to another is the characterisation of processes valid for the whole species, i.e. of population mean parameters instead of sets of parameters for different individuals. These, again, may vary between repeated experiments at the same or at different administered doses. Nevertheless, these differences are of great importance in obtaining a more precise insight into the variability structure of the processes investigated within the test animal population, so that the final result provides a valid basis for further research. The theory of hierarchical models provides a procedure which incorporates both modelling of the variability structure and estimation of population mean parameter vectors. The present study was designed to elucidate interindividual and interoccasion variabilities of toxicokinetic parameters relevant for the biological transformation of one of the basic petrochemical industrial compounds, ethylene (ethene), which is also a physiological body constituent, to its metabolite, ethylene oxide, which is a proven carcinogen. In particular, this aspect has a potential impact on legal regulations of weak genotoxins in general. Copyright (C) 2000 John Wiley & Sons, Ltd.
Authors: McClean, S.; Devine, C. — University of Ulster, School of Information & Software Engineering, Faculty of Informatics, Coleraine BT52 1SA, Londonderry, Northern Ireland
In manpower planning it is commonly the case that employees withdraw from active service for a period of time before returning to take up post at a later date. Such periods of absence are frequently of major concern to employers, who are anxious to ensure that employees return as soon as possible. The distribution of the duration of such periods of absence is therefore of considerable interest, as is the probability that such employees will ever return to active service. In this paper we derive a nonparametric estimator for such a lifetime distribution based on renewal data which are subject to various forms of incompleteness, namely right censoring, left and right truncation, and forward recurrence. Artificial truncation is used to ensure that the data are time homogeneous. A nonparametric maximum likelihood estimator for the lifetime distribution is derived using the EM algorithm. The data analysed concern the Northern Ireland nursing profession.
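The EM iteration for the NPMLE with right censoring is the classical self-consistency equation: each censored observation redistributes its mass over the support points to its right. A sketch for right censoring only — the truncation and forward-recurrence forms of incompleteness the paper also handles are omitted, and all simulation settings are hypothetical:

```python
import random

random.seed(7)

# Right-censored renewal sample: exponential lifetimes, uniform censoring
# (both distributions hypothetical).
n = 300
data = []
for _ in range(n):
    t, c = random.expovariate(1.0), random.uniform(0.0, 3.0)
    data.append((min(t, c), t <= c))   # (observed time, event indicator)

# Support of the NPMLE: the distinct event times, plus one atom beyond the
# largest observation to carry the mass of censored tails.
support = sorted({t for t, d in data if d})
support.append(max(t for t, _ in data) + 1.0)
idx = {s: j for j, s in enumerate(support)}
p = [1.0 / len(support)] * len(support)

# Self-consistency (EM) iteration: events keep their mass; each censored
# observation spreads its mass over the support to its right, proportionally to p.
for _ in range(100):
    new = [0.0] * len(support)
    for t, d in data:
        if d:
            new[idx[t]] += 1.0
        else:
            tail = [pj if sj > t else 0.0 for sj, pj in zip(support, p)]
            tot = sum(tail)
            for j, w in enumerate(tail):
                if w:
                    new[j] += w / tot
    p = [x / n for x in new]

# Estimated survival probability at time 1.0 (true value exp(-1), about 0.368).
s_hat = sum(pj for sj, pj in zip(support, p) if sj > 1.0)
```

With right censoring alone this fixed point is the Kaplan-Meier estimator; the mass on the final atom is what allows the fitted distribution to be defective, i.e. to assign positive probability to never returning.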