This work is part of a project concerned with the parallel reconstruction of 3D images from the data delivered by positron emission tomography (PET). The intention of this investigation is the development of a highly efficient parallel-computer-assisted PET diagnosis system. Due to the Poisson nature of positron emission processes, likelihood models can be used for their description. The investigations undertaken are based on an expectation-maximization algorithm that iterates to a maximum-likelihood estimate of the radiosity image. A parallel implementation of this algorithm is adapted to the modular architecture of a Quadrics DM-SIMD parallel computer that can be tailored for special applications. Two approaches are described. The "slice-wise" approach calculates 15 slices of a 3D data set in parallel, one slice per processor. The systolic approach provides more precise calculations based on direct matrix-vector multiplication while making more efficient use of the Quadrics architecture. Problems arising from the definition of a huge, sparse transition matrix during systolic computation are discussed. (C) 1999 Elsevier Science B.V. All rights reserved.
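For orientation, the multiplicative MLEM update referenced above can be written in a few lines of NumPy. The sketch below is purely illustrative and is not the Quadrics implementation: the sparse system ("transition") matrix, image size and counts are invented, and only the standard expectation-maximization step for the Poisson model is shown.

```python
import numpy as np
from scipy.sparse import random as sparse_random

# Minimal MLEM sketch with a hypothetical sparse system matrix A mapping
# image voxels to detector bins, and Poisson-distributed counts y.
rng = np.random.default_rng(0)
n_bins, n_voxels = 500, 200
A = sparse_random(n_bins, n_voxels, density=0.05, random_state=rng, format="csr")
x_true = rng.gamma(2.0, 1.0, size=n_voxels)          # assumed activity image
y = rng.poisson(A @ x_true)                           # Poisson measurement model

x = np.ones(n_voxels)                                 # flat initial estimate
sens = np.asarray(A.sum(axis=0)).ravel() + 1e-12      # column sums (sensitivity)
for _ in range(50):                                   # EM iterations
    forward = A @ x + 1e-12                           # expected counts
    x *= (A.T @ (y / forward)) / sens                 # multiplicative MLEM update
```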
A new class of distributions based on phase-type distributions is introduced in this paper to model lifetime data in the field of reliability analysis. It is the natural extension, to more than one cut-point, of the distribution proposed by Acal et al. (One cut-point phase-type distributions in reliability. An application to resistive random access memories. Mathematics 9(21):2734, 2021). Several measures of interest, such as the density function, hazard rate and moments, among others, are worked out for both the continuous and the discrete case. In addition, a new EM algorithm is provided to estimate the parameters by maximum likelihood. The results have been implemented computationally in R, and simulation studies reveal that this new distribution reduces the number of parameters to be estimated in the optimization process and, moreover, improves the fitting accuracy in comparison with classical phase-type distributions, especially for heavy-tailed distributions. An application is presented in the context of resistive memories, a new set of electron devices for nonvolatile memory circuits. In particular, the voltage associated with the resistive switching processes that control the internal behavior of resistive memories has been modeled with this new distribution to shed light on the physical mechanisms behind the operation of these memories.
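As background on the building block of this class, the density, survival function and hazard rate of a continuous phase-type distribution with representation (alpha, S) can be evaluated via the matrix exponential. The parameters below are made up for illustration and do not correspond to the paper's fitted models.

```python
import numpy as np
from scipy.linalg import expm

# Continuous phase-type density f(x) = alpha @ expm(S x) @ s0 with
# illustrative (assumed) parameters.
alpha = np.array([0.7, 0.3, 0.0])           # initial distribution over transient states
S = np.array([[-2.0,  1.0,  0.5],           # sub-intensity matrix of the transient states
              [ 0.0, -1.5,  1.0],
              [ 0.0,  0.0, -1.0]])
s0 = -S @ np.ones(3)                         # exit rates to the absorbing state

def ph_density(x):
    return float(alpha @ expm(S * x) @ s0)

def ph_survival(x):
    return float(alpha @ expm(S * x) @ np.ones(3))

x = 1.2
hazard = ph_density(x) / ph_survival(x)      # hazard rate h(x) = f(x) / S(x)
```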
Generalized linear mixed models are a common statistical tool that extends generalized linear models to situations where data are hierarchically clustered or correlated. It is shown that the simple but often inadequate restriction to a linear form of the predictor variables may be dropped. A class of semiparametrically structured models is proposed in which the predictor decomposes into components that may be given by a parametric term, an additive form of unspecified smooth functions of covariates, varying-coefficient terms, or terms which vary smoothly (or not) across the repetitions in a repeated measurement design. The class of models is explicitly designed as an extension of multivariate generalized linear mixed models such that ordinal responses may be treated within this framework. The modelling of smooth effects is based on basis functions such as B-splines or radial basis functions. For the estimation of parameters, a penalized marginal likelihood approach is proposed which may be based on integration techniques like Gauss-Hermite quadrature but may as well be used within the more recently developed nonparametric maximum likelihood approaches. For the maximization of the penalized marginal likelihood, the EM algorithm is adapted. It is shown that this extended EM algorithm shares the property of monotonicity with the usual EM algorithm. Moreover, an adequate form of cross-validation is developed and shown to work satisfactorily. Various examples demonstrate the flexibility of the class of models. (C) 2003 Elsevier B.V. All rights reserved.
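To make the quadrature step concrete, the following sketch approximates the marginal likelihood of a single cluster in a random-intercept logistic model by Gauss-Hermite quadrature. The model, data and number of nodes are assumptions chosen for illustration; the paper's penalized, semiparametric predictor is not reproduced.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

nodes, weights = hermgauss(15)              # physicists' Hermite nodes/weights

def cluster_marginal_lik(y, eta_fixed, sigma_b):
    """Integrate a Bernoulli likelihood over a N(0, sigma_b^2) random intercept."""
    lik = 0.0
    for z, w in zip(nodes, weights):
        b = np.sqrt(2.0) * sigma_b * z      # change of variables for the Gaussian density
        p = 1.0 / (1.0 + np.exp(-(eta_fixed + b)))
        lik += (w / np.sqrt(np.pi)) * np.prod(p**y * (1 - p)**(1 - y))
    return lik

y = np.array([1, 0, 1, 1])                  # responses of one hypothetical cluster
eta = np.array([0.2, -0.1, 0.4, 0.0])       # fixed-effect linear predictor
print(cluster_marginal_lik(y, eta, sigma_b=0.8))
```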
A new model for cross-sectional lifetime data is presented. The model is based on the length-bias assumption and is adapted to situations in which several types of censoring may occur. The NPMLE of the survival function is derived, and an EM algorithm to approximate the NPMLE is devised. The performance of the introduced estimator is investigated through simulations. A real data set collected as part of a study on unemployment duration in Spain is used for illustration purposes. (c) 2006 Elsevier B.V. All rights reserved.
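As a point of reference for the length-bias assumption, in the uncensored special case the NPMLE of the underlying distribution places mass at each observed duration proportional to 1/t_i. The sketch below illustrates only this simplified case with simulated data; the paper's EM algorithm for the censored settings is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.weibull(1.5, size=200) + 0.1        # hypothetical observed (length-biased) durations

w = (1.0 / t) / np.sum(1.0 / t)             # NPMLE point masses in the uncensored case
order = np.argsort(t)
surv = 1.0 - np.cumsum(w[order])            # estimated survival function at the sorted times
```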
A single outlier in a regression model can be detected by the effect of its deletion on the residual sum of squares. An equivalent procedure is the simple intervention in which an extra parameter is added for the mean of the observation in question. Similarly, for unobserved-components or structural time-series models, the effect of elaborations of the model on inferences can be investigated by the use of interventions involving a single parameter, such as trend or level changes. Because such time-series models contain more than one variance, the effect of the intervention is measured by the change in individual variances. We examine the effect on the estimated parameters of moving various kinds of intervention along the series. The horrendous computational problems involved are overcome by the use of score statistics combined with recent developments in filtering and smoothing. Interpretation of the resulting time-series plots of diagnostics is aided by simulation envelopes. Our procedures, illustrated with four examples, permit keen insights into the fragility of inferences to specific shocks, such as outliers and level breaks. Although the emphasis is mostly on parameter estimation, forecasts are also considered. Possible extensions include seasonal adjustment and detrending of series. (C) 1997 Elsevier Science S.A.
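The opening idea, measuring the effect of deleting one observation on the residual sum of squares, can be illustrated directly for an ordinary regression. The data below are simulated with one injected outlier; the paper's score-statistic and filtering/smoothing machinery for structural time-series models is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[10] += 5.0                                           # injected outlier

def rss(Xm, ym):
    beta, *_ = np.linalg.lstsq(Xm, ym, rcond=None)
    return float(np.sum((ym - Xm @ beta) ** 2))

full_rss = rss(X, y)
drop = np.array([full_rss - rss(np.delete(X, i, axis=0), np.delete(y, i))
                 for i in range(n)])
print(int(np.argmax(drop)))                            # observation whose deletion drops RSS most
```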
An automatic procedure to estimate proportions of components from grey level histograms is proposed. This procedure is based on statistical methods for parameter estimation in mixtures of normal distributions by maximum likelihood. The major advantage is that proportions of components can be estimated properly even when the grey level distributions of the components overlap considerably.
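A minimal sketch of the underlying technique, maximum-likelihood estimation of a normal mixture via EM applied to grey levels, is given below. The two-component setting, the simulated grey levels and the starting values are assumptions for illustration, not the paper's procedure.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
g = np.concatenate([rng.normal(80, 12, 600), rng.normal(140, 18, 400)])  # simulated grey levels

pi, mu, sd = np.array([0.5, 0.5]), np.array([60.0, 160.0]), np.array([20.0, 20.0])
for _ in range(100):
    # E-step: posterior component probabilities for every pixel
    dens = np.stack([p * norm.pdf(g, m, s) for p, m, s in zip(pi, mu, sd)])
    resp = dens / dens.sum(axis=0)
    # M-step: update proportions, means and standard deviations
    pi = resp.mean(axis=1)
    mu = (resp * g).sum(axis=1) / resp.sum(axis=1)
    sd = np.sqrt((resp * (g - mu[:, None]) ** 2).sum(axis=1) / resp.sum(axis=1))
print(pi)                                   # estimated component proportions
```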
Learning piecewise linear regression has been recognized in the literature as an effective approach to example-learning-based single-image super-resolution (SR). In this paper, we employ an expectation-maximization (EM) algorithm to further improve the SR performance of our previous multiple linear mappings (MLM) based SR method. In the training stage, the proposed method starts with a set of linear regressors obtained by the MLM-based method, and then jointly optimizes the clustering results and the low- and high-resolution subdictionary pairs for the regression functions using the metric of the reconstruction errors. In the test stage, we select the optimal regressor for SR reconstruction by accumulating the reconstruction errors of the m nearest neighbors in the training set. Thorough experimental results carried out on six publicly available datasets demonstrate that the proposed SR method can yield high-quality images with finer details and sharper edges in terms of both quantitative and perceptual image quality assessments. (C) 2017 Elsevier B.V. All rights reserved.
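The test-stage selection rule can be sketched as follows, with hypothetical arrays standing in for the trained anchors, per-regressor reconstruction errors and linear mappings; this is an illustration of the accumulated-error rule rather than the authors' code.

```python
import numpy as np

rng = np.random.default_rng(4)
n_train, n_reg, dim = 1000, 32, 36
anchors = rng.normal(size=(n_train, dim))             # LR features of training patches (assumed)
anchor_err = rng.random((n_train, n_reg))             # per-regressor reconstruction errors (assumed)
regressors = rng.normal(size=(n_reg, dim * 4, dim))   # HR <- LR linear mappings (assumed)

def super_resolve_patch(lr_patch, m=5):
    d = np.linalg.norm(anchors - lr_patch, axis=1)
    nn = np.argsort(d)[:m]                             # m nearest training patches
    k = np.argmin(anchor_err[nn].sum(axis=0))          # regressor with least accumulated error
    return regressors[k] @ lr_patch

hr = super_resolve_patch(rng.normal(size=dim))
```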
Many of the methods which deal with the reduction of dimensionality in matrices of data are based on mathematical techniques such as distance-based algorithms or matrix decomposition and eigenvalues. Recently, a group of likelihood-based finite mixture models for a data matrix with binary or count data, using basic Bernoulli or Poisson building blocks, has been developed. This work extends that approach and establishes likelihood-based multivariate methods for a data matrix with ordinal data, applying fuzzy clustering via finite mixtures to the ordered stereotype model. Model fitting is performed using the expectation-maximization (EM) algorithm, and a fuzzy allocation of rows, columns, and rows and columns simultaneously to the corresponding clusters is obtained. A simulation study covering a variety of scenarios is presented in order to test the reliability of the proposed model. Finally, the results of applying the model to two real data sets are shown. (C) 2014 Elsevier B.V. All rights reserved.
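The fuzzy allocation of rows amounts to computing posterior cluster-membership probabilities in the E-step. The sketch below illustrates this for a simplified mixture of independent categorical columns with assumed category probabilities; the ordered stereotype parametrization itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
n_rows, n_cols, n_cat, n_clust = 30, 8, 4, 2
Y = rng.integers(0, n_cat, size=(n_rows, n_cols))      # ordinal responses coded 0..n_cat-1

pi = np.array([0.6, 0.4])                               # assumed cluster proportions
theta = rng.dirichlet(np.ones(n_cat), size=n_clust)     # assumed category probabilities per cluster

# E-step: log-likelihood of each row under each cluster, then normalized posteriors
log_lik = np.stack([np.log(theta[r])[Y].sum(axis=1) for r in range(n_clust)], axis=1)
log_post = np.log(pi) + log_lik
log_post -= log_post.max(axis=1, keepdims=True)
fuzzy = np.exp(log_post) / np.exp(log_post).sum(axis=1, keepdims=True)   # rows x clusters
```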
A new connection between the distribution of component failure times of a coherent system and (adaptive) progressively Type-II censored order statistics is established. Utilizing this property, we develop inferential procedures when the data consist of all component failures until system failure, in two scenarios: in the case of complete information, we assume that the failed component is also observed, whereas in the case of incomplete information we have information only about the failure times but not about which components have failed. In the first setting, we show that inferential methods for adaptive progressively Type-II censored data can be applied directly to the problem. For incomplete information, we face the problem that the corresponding censoring plan is not observed and that the available inferential procedures depend on knowledge of the censoring plan used. To obtain estimates of the distributional parameters, we propose maximum likelihood estimators which can be computed by solving the likelihood equations directly or via an Expectation-Maximization (EM)-type procedure. For an exponential distribution, we also discuss a linear estimator of the mean. Moreover, we establish exact distributions for some estimators in the exponential case, which can be used, for example, to construct exact confidence intervals. The results are illustrated by a five-component bridge system. (c) 2015 Wiley Periodicals, Inc. Naval Research Logistics 62: 512-530, 2015
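As a point of reference for the exponential case, the textbook maximum likelihood estimator of the mean under progressive Type-II censoring is the total time on test divided by the number of observed failures. The data and censoring plan below are invented for illustration; the EM-type procedure for the incomplete-information setting is not reproduced.

```python
import numpy as np

x = np.array([0.4, 0.9, 1.3, 2.1, 3.0])   # ordered observed failure times (hypothetical)
R = np.array([1, 0, 2, 0, 1])             # units withdrawn at each failure (censoring plan)
theta_hat = np.sum((R + 1) * x) / len(x)  # MLE of the exponential mean (total time on test / m)
```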
In this paper, we introduce the generalized exponential-power series (GEPS) class of distributions, which is obtained by compounding generalized exponential and power series distributions. The compounding procedure follows the same approach as that previously used to introduce the complementary exponential-geometric (CEG) and the two-parameter Poisson-exponential (PE) lifetime distributions. This new class of distributions contains several lifetime models as special cases: the CEG, PE, generalized exponential-binomial (GEB), generalized exponential-Poisson (GEP), generalized exponential-geometric (GEG) and generalized exponential-logarithmic (GEL) distributions. The hazard function of the GEPS distributions can be increasing, decreasing or bathtub-shaped, among others. We obtain several properties of the GEPS distributions, such as moments, a maximum likelihood estimation procedure via an EM algorithm, and large-sample inference. Special distributions are studied in some detail. Finally, in order to show the flexibility and potential of the new class of distributions, we present applications to two real data sets. (C) 2012 Elsevier B.V. All rights reserved.
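A Monte Carlo sketch of the compounding construction is given below, assuming (as in the CEG and PE cases cited above) that the lifetime is the maximum of N generalized exponential variables, with N drawn from a zero-truncated Poisson law as one member of the power series family. Parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(6)
alpha, beta, lam, size = 2.0, 1.0, 3.0, 10_000

def rtrunc_poisson(lam, size):
    n = rng.poisson(lam, size)
    while np.any(n == 0):                        # resample zeros (zero-truncation)
        n[n == 0] = rng.poisson(lam, np.sum(n == 0))
    return n

def rgen_exponential(alpha, beta, size):
    u = rng.uniform(size=size)
    return -np.log(1.0 - u ** (1.0 / alpha)) / beta   # inverse CDF of the GE distribution

N = rtrunc_poisson(lam, size)
y = np.array([rgen_exponential(alpha, beta, n).max() for n in N])   # compound lifetimes
```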