The assessment of sample size in clinical trials comparing means requires a variance estimate of the main efficacy variable. If no reliable information about the variance of the key response is available at the beginning of a clinical trial, the use of data from the first 'few' patients entered in the trial ('internal pilot') may be appropriate to estimate the variance and thus to recalculate the required sample size. A SAS macro that implements the EM algorithm for carrying out and simulating such interim power evaluations without unblinding the treatment status is presented. (C) 2001 Elsevier Science Ireland Ltd. All rights reserved.
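The internal-pilot idea can be sketched in a few lines of Python. This is an illustrative sketch, not the SAS macro from the paper: the simple one-sample variance of the pooled (blinded) data stands in for the paper's EM-based blinded estimator, and all function names are hypothetical.

```python
import math
import statistics


def required_n_per_group(sigma2, delta, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for a two-sample comparison
    of means with variance sigma2 and clinically relevant difference delta."""
    z = statistics.NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sigma2 / delta ** 2)


def blinded_recalc(pooled_pilot, delta, alpha=0.05, power=0.8):
    """Recalculate n from the one-sample variance of the pooled pilot data,
    without unblinding treatment assignments. This naive blinded estimator
    absorbs the between-group mean difference into sigma^2 and therefore
    overestimates it slightly; the EM-based approach refines this."""
    sigma2_hat = statistics.variance(pooled_pilot)
    return required_n_per_group(sigma2_hat, delta, alpha, power)
```

For example, with `sigma2 = 1` and `delta = 0.5`, `required_n_per_group` reproduces the familiar 63 patients per group at 80% power.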
When analyzing Poisson count data, a high frequency of extra zeros is sometimes observed. The Zero-Inflated Poisson (ZIP) model is a popular approach to handling zero-inflation. In this paper we generalize the ZIP model and its regression counterpart to accommodate the extent of individual exposure. Empirical evidence drawn from an occupational injury data set confirms that incorporating exposure information can exert a substantial impact on the model fit. Tests for zero-inflation are also considered; their finite sample properties are examined in a Monte Carlo study.
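The exposure-adjusted ZIP likelihood can be written down concisely. The Python sketch below assumes the mean scales linearly with exposure, lambda_i = mu * t_i, which is one natural way to incorporate exposure; it is not the authors' code, and the names are hypothetical.

```python
import math


def zip_loglik(counts, exposures, p, mu):
    """Log-likelihood of a zero-inflated Poisson with individual exposure:
    with probability p the count is a structural zero; otherwise
    Y_i ~ Poisson(mu * t_i), where t_i is the exposure of subject i."""
    ll = 0.0
    for y, t in zip(counts, exposures):
        lam = mu * t
        if y == 0:
            # a zero can come from the structural-zero state or the Poisson state
            ll += math.log(p + (1 - p) * math.exp(-lam))
        else:
            # positive counts can only come from the Poisson state
            ll += math.log(1 - p) - lam + y * math.log(lam) - math.lgamma(y + 1)
    return ll
```

Setting `p = 0` recovers the ordinary Poisson log-likelihood, which is the boundary case the zero-inflation tests in the paper examine.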
This paper discusses a two-state hidden Markov Poisson regression (MPR) model for analyzing longitudinal data of epileptic seizure counts, which allows the rate of the Poisson process to depend on covariates through an exponential link function and to change according to the states of a two-state Markov chain, with its transition probabilities associated with covariates through a logit link function. This paper also considers a two-state hidden Markov negative binomial regression (MNBR) model as an alternative, using the negative binomial instead of the Poisson distribution in the proposed MPR model when there exists extra-Poisson variation conditional on the states of the Markov chain. The two proposed models relax the stationarity requirement on the Markov chain and allow for over-dispersion relative to the usual Poisson regression model and for correlation between repeated observations. The proposed methodology provides a plausible analysis for the longitudinal data of epileptic seizure counts, and the MNBR model fits the data much better than the MPR model. Maximum likelihood estimation using the EM and quasi-Newton algorithms is discussed. A Monte Carlo study for the proposed MPR model investigates the reliability of the estimation method, the choice of probabilities for the initial states of the Markov chain, and some finite sample behaviors of the maximum likelihood estimates, suggesting that (1) the estimation method is accurate and reliable as long as the total number of observations is reasonably large, and (2) the choice of probabilities for the initial states of the Markov process has little impact on the parameter estimates.
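The likelihood that the EM and quasi-Newton steps maximize is computed by the forward algorithm. A minimal Python sketch for a two-state Poisson HMM follows; the covariate links (exponential for the rates, logit for the transitions) are omitted here for brevity, and the function names are hypothetical.

```python
import math


def poisson_pmf(k, lam):
    """Poisson probability mass via logs, for numerical stability."""
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))


def hmm_poisson_loglik(counts, lam, trans, init):
    """Scaled forward-algorithm log-likelihood of a two-state hidden Markov
    Poisson model. lam[s]: emission rate in state s; trans[i][j]: probability
    of moving from state i to state j; init[s]: initial state probability."""
    alpha = [init[s] * poisson_pmf(counts[0], lam[s]) for s in (0, 1)]
    ll = 0.0
    for y in counts[1:]:
        c = sum(alpha)          # scaling constant, accumulated in the log-likelihood
        ll += math.log(c)
        alpha = [a / c for a in alpha]
        alpha = [
            sum(alpha[i] * trans[i][j] for i in (0, 1)) * poisson_pmf(y, lam[j])
            for j in (0, 1)
        ]
    return ll + math.log(sum(alpha))
```

As a sanity check, an identity transition matrix with all initial mass on state 0 reduces this to an i.i.d. Poisson log-likelihood at rate `lam[0]`.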
We describe a new class of computationally efficient algorithms designed to solve incomplete-data problems frequently encountered in image processing and computer vision. The basis of this framework is the marriage of the expectation-maximization (EM) procedure with two powerful methodologies. In particular, we have incorporated optimal multiscale estimators into the EM procedure to compute estimates and error statistics efficiently. In addition, mean-field theory (MFT) from statistical mechanics is incorporated into the EM procedure to help solve the computational problems that arise from our use of Markov random-field (MRF) modeling of the hidden data in the EM formulation. We have applied this algorithmic framework and shown that it is effective in solving a wide variety of image-processing and computer-vision problems. We demonstrate the application of our algorithmic framework to solve the problem of simultaneous anomaly detection, segmentation, and object profile estimation for noisy and speckled laser radar range images. (C) 2001 Society of Photo-Optical Instrumentation Engineers.
This paper proposes methods of estimating the lifetime distribution, assuming a parametric time-to-failure distribution, for situations where additional field data can be gathered after the warranty expires. For satisfactory inference about the parameters involved, it is desirable to incorporate these after-warranty data in the analysis. It is assumed that after-warranty data are reported with probability p1 (p1 < 1), while within-warranty data are reported with probability 1. Methods of obtaining maximum likelihood estimators are outlined, their asymptotic properties are studied, and specific formulas for the Weibull distribution are obtained. An estimation procedure using the expectation-maximization (EM) algorithm is also proposed for the case where the reporting probability is unknown. Simulation studies are performed to investigate the properties of the estimates. (C) 2001 Elsevier Science Ltd. All rights reserved.
Quantitative measurement is an accepted ideal, but pass-fail inspection remains a fact of life, even in high-technology industries. For pass-fail data, variance components do not separate gauge and material variation. This article focuses on maximum likelihood estimation of conditional misclassification rates, with and without reference evaluations to anchor the analysis. Likelihood-based confidence intervals and testing for reproducibility effects are also discussed.
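When reference evaluations are available, the conditional misclassification rates have closed-form maximum likelihood estimates as simple binomial proportions, as the Python sketch below illustrates. The names are hypothetical, and this sketch deliberately omits the latent-class likelihood machinery the paper needs for the case without reference values.

```python
def misclassification_rates(reference, decisions):
    """MLEs of the conditional misclassification rates for pass-fail data,
    given reference (true) classifications. reference[i] is True if item i
    truly conforms; decisions[i] is True if the gauge passed item i."""
    false_accepts = sum(1 for r, d in zip(reference, decisions) if not r and d)
    false_rejects = sum(1 for r, d in zip(reference, decisions) if r and not d)
    n_nonconforming = sum(1 for r in reference if not r)
    n_conforming = sum(1 for r in reference if r)
    # conditional rates: P(pass | nonconforming) and P(fail | conforming)
    return false_accepts / n_nonconforming, false_rejects / n_conforming
```

Without reference evaluations, these rates are confounded with the unknown conforming fraction, which is why a likelihood-based treatment is needed at all.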
A modified version of the SAGE algorithm is presented for joint delay-azimuth-attenuation parameter estimation in a multiuser DS-CDMA system. The introduced modification consists of using different time interval lengths when calculating the time correlations for optimizing the different channel parameters. This modification was proposed to further reduce the algorithm's computational weight when the received waves are sufficiently resolvable. Specifically, we found that short interval windows are sufficient for estimating delays and azimuth angles, which is quite effective in reducing the computational burden of their optimization processes. For the estimation of the attenuation parameters, a longer time window, equal to the preamble length, is used for more accurate estimation. Two other estimators are also proposed. The first combines the modified SAGE with a sequential estimation of the attenuation parameters and is suitable for slowly varying channels. The second, similar to the first, is primarily designed to alleviate the influence of strong interferers. Through a numerical example, the performances of the three presented estimation schemes are compared in terms of their near-far resistance, and the second combined estimator is shown to outperform the modified SAGE in environments with high MAI levels.
Conceptual and statistical issues surrounding the estimation of a background concentration distribution for arsenic are reviewed. How the background area is defined and how samples are collected are shown to affect the shape and location of the probability density function, which in turn affects the estimation and precision of the associated distributional parameters. The overall background concentration distribution is conceptualized as a mixture of a natural background distribution, an anthropogenic background distribution, and a distribution designed to accommodate the potential for contaminated-site samples being included in the background sample set. This concept is extended to a discussion of issues surrounding the estimation of natural and anthropogenic background distributions for larger geographic areas. Finally, the mixture model is formally defined and statistical approaches to estimating its parameters are discussed. (C) 2001 AEHS.
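The mixture-estimation step can be illustrated with the textbook EM algorithm for a two-component normal mixture. This is a minimal univariate sketch with a known, shared standard deviation; the background model described above has more components and unknown variances, and the names here are hypothetical.

```python
import math


def em_two_normals(x, mu1, mu2, sigma=1.0, w=0.5, iters=50):
    """EM for a two-component normal mixture with shared, known sigma.
    Returns the fitted weight of component 1 and the two component means."""
    def phi(v, mu):
        return math.exp(-0.5 * ((v - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    for _ in range(iters):
        # E-step: posterior probability that each sample came from component 1
        r = [w * phi(v, mu1) / (w * phi(v, mu1) + (1 - w) * phi(v, mu2)) for v in x]
        # M-step: update the mixing weight and the component means
        s = sum(r)
        w = s / len(x)
        mu1 = sum(ri * v for ri, v in zip(r, x)) / s
        mu2 = sum((1 - ri) * v for ri, v in zip(r, x)) / (len(x) - s)
    return w, mu1, mu2
```

With well-separated components the posterior responsibilities are nearly 0 or 1 and the fit converges in a few iterations; overlapping background and contamination distributions are exactly the harder case the paper's discussion addresses.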
Maximum likelihood algorithms for use with missing data are becoming commonplace in microcomputer packages. Specifically, 3 maximum likelihood algorithms are currently available in existing software packages: the multiple-group approach, full information maximum likelihood estimation, and the EM algorithm. Although they belong to the same family of estimators, confusion appears to exist over the differences among the 3 algorithms. This article provides a comprehensive, nontechnical overview of the 3 maximum likelihood algorithms. Multiple imputation, which is frequently used in conjunction with the EM algorithm, is also discussed.
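The EM approach to missing data can be illustrated on the textbook case of a bivariate normal where x is fully observed and some y values are missing. This is a minimal Python sketch under an ignorable (missing-at-random) assumption; the function and variable names are hypothetical and it is not taken from any of the packages discussed.

```python
def em_bivariate_missing(x, y, iters=100):
    """EM estimates of E[y], Var(y), and Cov(x, y) for a bivariate normal when
    x is complete and some entries of y are None (assumed missing at random)."""
    n = len(x)
    obs = [i for i in range(n) if y[i] is not None]
    mx = sum(x) / n
    sxx = sum((v - mx) ** 2 for v in x) / n
    # initialize the y moments from the complete cases
    my = sum(y[i] for i in obs) / len(obs)
    syy = sum((y[i] - my) ** 2 for i in obs) / len(obs)
    sxy = sum((x[i] - mx) * (y[i] - my) for i in obs) / len(obs)
    for _ in range(iters):
        beta = sxy / sxx            # slope of the regression of y on x
        resid = syy - beta * sxy    # residual variance of y given x
        # E-step: expected sufficient statistics for each missing y_i
        ey = [y[i] if y[i] is not None else my + beta * (x[i] - mx) for i in range(n)]
        ey2 = [ey[i] ** 2 if y[i] is not None else ey[i] ** 2 + resid for i in range(n)]
        # M-step: recompute the moments from the completed data
        my = sum(ey) / n
        syy = sum(ey2) / n - my ** 2
        sxy = sum(x[i] * ey[i] for i in range(n)) / n - mx * my
    return my, syy, sxy
```

The `resid` term added to the squared expectation is what distinguishes EM from naive single imputation: it keeps the variance estimate from being biased downward, which is also the issue multiple imputation addresses by sampling rather than averaging.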
Wavelet-domain hidden Markov models (HMMs), in particular the hidden Markov tree (HMT) model, have recently been introduced and applied to signal and image processing, e.g., signal denoising. In this paper, we develop a simple initialization scheme for efficient HMT model training and then propose a new four-state HMT model called HMT-2. We find that the new initialization scheme fits the HMT-2 model well. Experimental results show that signal denoising using the HMT-2 model often improves on the two-state HMT model developed by Crouse et al.