For efficiently estimating the normal mean μ under right censoring (threshold c, with the standard deviation σ known), we compare two approaches within the maximum likelihood estima...
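Since this abstract is truncated, the sketch below only illustrates the setting it describes: the maximum likelihood estimate of a normal mean under Type-I right censoring with known standard deviation. The parameter values and variable names are hypothetical, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(3)
mu_true, sigma, c = 1.0, 2.0, 2.5        # sigma and the threshold c are known
x = rng.normal(mu_true, sigma, 500)
obs, cens = x[x <= c], (x > c)           # values above c are right-censored

def neg_loglik(mu):
    # Observed values contribute the density; each censored value contributes
    # log P(X > c), which is identical across censored observations.
    return -(norm.logpdf(obs, mu, sigma).sum()
             + cens.sum() * norm.logsf(c, mu, sigma))

res = minimize_scalar(neg_loglik, bounds=(-5, 5), method="bounded")
print(f"MLE of mu ≈ {res.x:.3f}")
```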
ISBN:
(Print) 9781728136899
Assuring the reliability of crude unit pipelines in the downstream oil and gas industry is essential, since unexpected failures of these pipelines can have a number of negative impacts on the business, including safety, environmental, and economic impacts. The objective of this work is to understand the degradation behavior of the piping system so that we can anticipate when a degraded pipeline will reach the minimum thickness threshold. The damage mechanisms in atmospheric crude tower overhead piping have been well researched. Hydrochloric acid (HCl) corrosion is one of the major damage mechanisms seen in atmospheric crude tower overhead piping. This type of corrosion is time-dependent and is also influenced by operational conditions such as temperature, the adequacy of neutralization of the formed HCl, and interior protection of the pipeline using corrosion inhibitors. To better monitor the degradation of these pipelines and to reduce the cost associated with scheduled inspections, a number of refineries are resorting to real-time thickness monitoring using ultrasonic instruments mounted at vantage locations on the pipeline, which provide continuous, non-destructive corrosion and erosion monitoring. The focus of this paper is to use a stochastic degradation model suitable for characterizing wall-thickness degradation data to estimate the failure probability of the pipeline in the presence of inadequate data. We model the degradation of the crude overhead pipeline using a stationary gamma process. To capture the substantial heterogeneity among different thickness-monitoring locations on the pipeline, random effects are incorporated into our stochastic degradation model. We illustrate the proposed random-effects method by using the gamma process to model pipe wall-thickness degradation data observed over a period of time.
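A minimal Monte Carlo sketch of the modeling idea described above: a stationary gamma process for cumulative wall-thickness loss, with a location-specific random effect on the scale, used to estimate the probability of crossing a minimum-thickness threshold. All numbers (initial thickness, threshold, gamma shape, random-effect distribution) are hypothetical assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration only.
shape_rate = 2.0   # gamma-process shape accumulated per year
w0 = 12.0          # initial wall thickness (mm)
w_min = 6.0        # minimum allowable thickness (mm)

def failure_probability(t_end, n_paths=100_000):
    """Monte Carlo estimate of P(thickness at time t_end < w_min) under a
    stationary gamma process with a random scale per monitoring location."""
    # Random effect: a location-specific rate drawn from a gamma distribution,
    # one simple way to capture heterogeneity across monitoring locations.
    rates = rng.gamma(shape=5.0, scale=0.2, size=n_paths)
    # Cumulative loss at t_end ~ Gamma(shape_rate * t_end, scale = 1 / rate_i).
    loss = rng.gamma(shape=shape_rate * t_end, size=n_paths) / rates
    return np.mean(w0 - loss < w_min)

for t in (5.0, 8.0, 10.0):
    print(f"t = {t:4.1f} y   P(failure) ≈ {failure_probability(t):.4f}")
```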
This work proposes an exponential computation with low computational complexity and applies this technique to the expectation-maximization (EM) algorithm for the Gaussian mixture model (GMM). For certain machine-learning techniques, such as the EM algorithm for the GMM, fast and low-cost implementations are preferred over high-precision ones. Since the exponential function is frequently used in machine-learning algorithms, this work proposes reducing computational complexity by transforming the function into powers of two and introducing a look-up table. Moreover, to improve efficiency, the look-up table is scaled. To verify the validity of the proposed technique, this work obtains simulation results for the EM algorithm used for parameter estimation and evaluates their performance in terms of mean absolute error and computational time. This work compares the proposed method with the Taylor expansion and the exp() function in a standard C library, and shows that the computational time of the EM algorithm is reduced while maintaining comparable precision in the estimation results.
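The sketch below illustrates the core trick the abstract describes: rewriting exp(x) as a power of two, 2^(x·log2 e) = 2^k · 2^f, where the integer part k is a cheap exponent shift and the fractional factor 2^f is served from a small look-up table. The table size and interface here are illustrative assumptions; the paper's scaled-table details are not reproduced.

```python
import math
import numpy as np

LOG2E = 1.4426950408889634  # log2(e)

# Look-up table for 2**f with f in [0, 1), quantized to TABLE_BITS bits.
TABLE_BITS = 8
TABLE = np.exp2(np.arange(2 ** TABLE_BITS) / 2 ** TABLE_BITS)

def fast_exp(x: float) -> float:
    """Approximate exp(x) as 2**(x * log2 e) = 2**k * 2**f, with 2**f tabulated."""
    y = x * LOG2E
    k = math.floor(y)                    # integer part: exponent adjustment
    f = y - k                            # fractional part in [0, 1)
    idx = int(f * 2 ** TABLE_BITS)       # index into the 2**f table
    return math.ldexp(TABLE[idx], k)     # TABLE[idx] * 2**k

xs = np.linspace(-5, 5, 11)
errs = [abs(fast_exp(v) - math.exp(v)) / math.exp(v) for v in xs]
print(f"max relative error ≈ {max(errs):.2e}")
```

With an 8-bit table the relative error stays below about 0.3%, which matches the paper's premise that moderate precision suffices for EM-style iterative estimation.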
ISBN:
(Print) 9783030602451; 9783030602444
As a promising paradigm that does not require a central node, decentralized computing provides better network load balance and has advantages over centralized computing in terms of data protection. Although decentralized algorithms such as decentralized gradient descent have been extensively studied, there is no such research on the expectation-maximization (EM) algorithm, which consists of the expectation step (E-step) and the maximization step (M-step) and is a popular iterative method for missing-data studies and latent variable models. In this paper, we propose decentralized EM algorithms under different communication and network topology settings, such as synchronous communication and dynamic networks. Convergence analysis of the proposed algorithms is provided in the synchronous scenario. Empirical studies show that our proposed algorithms have numerical advantages over EM algorithms based on local data or full data only, especially when there is no closed-form maximization in the M-step.
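A minimal sketch of the idea, assuming synchronous communication on a fixed ring topology: each node runs a local E-step on its own data, gossip-averages the resulting sufficient statistics with its neighbors through a doubly stochastic mixing matrix, and then performs a local M-step. This illustrates decentralized EM for a one-dimensional Gaussian mixture only; it is not the paper's algorithms.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: 4 nodes on a ring, each with local 1-D data from a 2-component GMM.
n_nodes, n_local, K = 4, 200, 2
true_means = np.array([-2.0, 3.0])
data = [np.concatenate([rng.normal(true_means[k], 1.0, n_local // K)
                        for k in range(K)]) for _ in range(n_nodes)]

# Doubly stochastic mixing matrix for the ring (Metropolis-style weights).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

mu = np.tile(np.array([-1.0, 1.0]), (n_nodes, 1))  # per-node parameters
var = np.ones((n_nodes, K))
pi = np.full((n_nodes, K), 1.0 / K)

for it in range(50):
    stats = np.zeros((n_nodes, K, 3))  # per node: [N_k, sum_k, sumsq_k]
    for i in range(n_nodes):
        x = data[i][:, None]
        # Local E-step: responsibilities under the node's current parameters.
        logp = (-0.5 * (x - mu[i]) ** 2 / var[i]
                - 0.5 * np.log(2 * np.pi * var[i]) + np.log(pi[i]))
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        stats[i, :, 0] = r.sum(axis=0)
        stats[i, :, 1] = (r * x).sum(axis=0)
        stats[i, :, 2] = (r * x ** 2).sum(axis=0)
    # One synchronous gossip round: mix sufficient statistics with neighbors.
    stats = np.einsum("ij,jkl->ikl", W, stats)
    # Local M-step from the mixed statistics.
    Nk = stats[:, :, 0]
    mu = stats[:, :, 1] / Nk
    var = stats[:, :, 2] / Nk - mu ** 2
    pi = Nk / Nk.sum(axis=1, keepdims=True)

print("per-node means after 50 iterations:\n", np.round(np.sort(mu, axis=1), 2))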
Multidimensional Item Response Theory (MIRT) is widely used in the assessment and evaluation of educational and psychological tests. It models individual response patterns by specifying a functional relationship between individuals' multiple latent traits and their responses to test items. One major challenge in parameter estimation in MIRT is that the likelihood involves intractable multidimensional integrals due to the latent variable structure. Various methods have been proposed that involve either direct numerical approximation of the integrals or Monte Carlo simulation. However, these methods have limitations: they are computationally demanding in high dimensions and rely on sampling from a posterior distribution. In the second chapter of the thesis, we propose a new Gaussian Variational EM (GVEM) algorithm, which adopts variational inference to approximate the intractable marginal likelihood by a computationally feasible lower bound. The optimal choice of variational lower bound allows us to derive closed-form updates in the EM procedure, which makes the algorithm efficient and able to scale to high dimensions. We illustrate that the proposed algorithm can also be applied to assess the dimensionality of the latent traits in an exploratory analysis. Simulation studies and real data analysis are presented to demonstrate the computational efficiency and estimation precision of the GVEM algorithm in comparison with the popular alternative, the Metropolis-Hastings Robbins-Monro algorithm. In addition, theoretical guarantees are derived to establish the consistency of the estimator from the proposed GVEM algorithm. One of the key elements in MIRT is the relationship between the items and the latent traits, the so-called test structure. The correct specification of this relationship is crucial for accurate assessment of individuals. Hence, it is of interest to study how to accurately estimate the test structure from data. In the third chapter, we propose to apply GVEM...
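For reference, a variational EM scheme of this kind rests on the standard evidence lower bound shown below, where η denotes the latent traits and q is the variational distribution; the GVEM-specific choice of q and the resulting closed-form updates are developed in the thesis itself.

```latex
\log p(\mathbf{Y} \mid \boldsymbol{\theta})
  \;\ge\;
  \mathbb{E}_{q(\boldsymbol{\eta})}\bigl[\log p(\mathbf{Y}, \boldsymbol{\eta} \mid \boldsymbol{\theta})\bigr]
  \;-\;
  \mathbb{E}_{q(\boldsymbol{\eta})}\bigl[\log q(\boldsymbol{\eta})\bigr],
```

with equality when q(η) coincides with the true posterior p(η | Y, θ); the E-step tightens the bound in q and the M-step maximizes it in θ.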
The EM algorithm is the standard tool for maximum likelihood estimation in finite mixture models. Its most important drawbacks are slow convergence, the need for a suitable stopping criterion, and the choice of initial values. In this paper, we focus on the issue of selecting initial values for the EM algorithm in mixture Poisson regression models. A new strategy, aiming to overcome the limitations of other approaches, is proposed, and a simulation study comparing its performance with two alternative strategies is carried out. In models with overlapping components and/or dissimilar mixing proportions, the new strategy proves to provide more accurate parameter estimates and to require fewer iterations until the EM algorithm converges, saving computing time.
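The abstract does not spell out the new strategy, so the sketch below only illustrates the multi-start baseline such strategies compete with: run EM from several random initial values on a two-component Poisson mixture and keep the run with the best final log-likelihood. Extending this to mixture Poisson regression would replace the closed-form M-step with weighted Poisson GLM fits.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy data: two overlapping Poisson components with unequal mixing proportions.
y = np.concatenate([rng.poisson(3.0, 300), rng.poisson(5.0, 100)])

def em_poisson_mixture(y, lam, pi, n_iter=200):
    lam, pi = np.asarray(lam, float), np.asarray(pi, float)
    for _ in range(n_iter):
        # E-step: responsibilities (log(y!) omitted, constant in parameters).
        logp = y[:, None] * np.log(lam) - lam + np.log(pi)
        m = logp.max(axis=1, keepdims=True)
        r = np.exp(logp - m)
        ll = (m[:, 0] + np.log(r.sum(axis=1))).sum()  # log-likelihood (shifted)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted rates and mixing proportions in closed form.
        Nk = r.sum(axis=0)
        lam, pi = (r * y[:, None]).sum(axis=0) / Nk, Nk / len(y)
    return lam, pi, ll

# Multi-start baseline: several random initial values, keep the best run.
runs = [em_poisson_mixture(y, rng.uniform(1, 8, 2), [0.5, 0.5])
        for _ in range(10)]
lam, pi, ll = max(runs, key=lambda t: t[2])
print(f"rates ≈ {np.sort(lam).round(2)}, proportions ≈ {pi.round(2)}")
```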
ISBN:
(Print) 9781728161365
In order to reduce the labor cost of invigilation, improve invigilation efficiency, and deal with violations in real time, this paper designs and implements an intelligent invigilation system in terms of both hardware and software. The system builds on the video monitoring infrastructure used in standardized tests. To address the sensitivity of the traditional EM algorithm to initial values, the paper proposes an improved method for supervised learning of images and recognition of human contours, extracts and analyses the features of abnormal information in the scene, and uses an adaptive threshold algorithm to improve the accuracy of automatic alarms. These techniques enable the monitoring platform to detect abnormal information and feed it back to on-site invigilators in a timely manner. The system thus realizes intelligent invigilation, improves the precision of the supervision system, and has value for wider adoption.
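As a small illustration of one component mentioned above, the snippet below applies adaptive thresholding (here OpenCV's Gaussian-weighted variant) to a grayscale frame. The file names and window parameters are hypothetical, and the paper's full pipeline (improved EM, contour recognition, alarm logic) is not reproduced.

```python
import cv2

# Hypothetical input frame from the invigilation camera.
frame = cv2.imread("exam_frame.png", cv2.IMREAD_GRAYSCALE)

# Adaptive thresholding: each pixel is compared with a Gaussian-weighted local
# mean, which is robust to uneven lighting across the exam hall, unlike a
# single global threshold.
mask = cv2.adaptiveThreshold(frame, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                             cv2.THRESH_BINARY_INV, 31, 5)
cv2.imwrite("exam_mask.png", mask)
```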
Quantile regression has emerged as an important analytical alternative to the classical mean regression model. However, the analysis can be complicated by the presence of censored measurements, due to a detection limit of the equipment, in combination with unavoidable missing values arising when, for instance, a researcher is simply unable to collect an observation. Another complication arises when measures depart significantly from normality, for instance in the presence of skewed heavy-tailed observations. For such data structures, we propose a robust quantile regression for censored and/or missing responses based on the skew-t distribution. A computationally feasible EM-based procedure is developed to carry out maximum likelihood estimation within this general framework. Moreover, the asymptotic standard errors of the model parameters are obtained explicitly via the information-based method. We illustrate our methodology using simulated data and two real data sets.
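For orientation, quantile regression estimates the conditional τ-th quantile by minimizing the standard check loss shown below; the paper's contribution is to carry this out for censored and/or missing responses under a skew-t likelihood via an EM-type algorithm, which this display does not attempt to capture.

```latex
\rho_\tau(u) = u\,\bigl(\tau - \mathbb{I}\{u < 0\}\bigr),
\qquad
\hat{\boldsymbol{\beta}}(\tau)
  = \arg\min_{\boldsymbol{\beta}} \sum_{i=1}^{n}
    \rho_\tau\!\bigl(y_i - \mathbf{x}_i^{\top}\boldsymbol{\beta}\bigr).
```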
Although there is a rapidly growing literature on dynamic connectivity methods, the primary focus has been on separate network estimation for each individual, which fails to leverage common patterns of information. We propose novel graph-theoretic approaches for estimating a population of dynamic networks that are able to borrow information across multiple heterogeneous samples in an unsupervised manner and guided by covariate information. Specifically, we develop a Bayesian product mixture model that imposes independent mixture priors at each time scan and uses covariates to model the mixture weights, which results in time-varying clusters of samples designed to pool information. The computation is carried out using an efficient Expectation-Maximization algorithm. Extensive simulation studies illustrate sharp gains in recovering the true dynamic network over existing dynamic connectivity methods. An analysis of fMRI block task data with behavioral interventions reveals subgroups of individuals with similar dynamic connectivity and identifies intervention-related dynamic network changes that are concentrated in biologically interpretable brain regions. In contrast, existing dynamic connectivity approaches detect minimal or no changes in connectivity over time, which seems biologically unrealistic and highlights the challenges resulting from the inability to systematically borrow information across samples.
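A tiny sketch of the covariate-to-weights mapping that the model relies on: mixture weights driven by covariates through a multinomial-logit (softmax) link. The coefficient values are hypothetical, and the full Bayesian prior structure and EM computation are beyond this snippet.

```python
import numpy as np

def mixture_weights(X, W):
    """Covariate-dependent mixture weights via a softmax link: row i gives the
    cluster membership probabilities for sample i at one time scan."""
    logits = X @ W                                 # (n_samples, K)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

X = np.array([[1.0, 0.2], [1.0, 1.5]])   # intercept + one behavioral covariate
W = np.array([[0.0, 0.5], [1.0, -1.0]])  # hypothetical coefficients, K = 2
print(mixture_weights(X, W))
```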
The mixture cure model is the most popular model used to analyse a major event with a potential cure fraction. However, in the real world there may exist a potential risk from other non-curable competing events. In this paper, we study the accelerated failure time model with a mixture cure component via kernel-based nonparametric maximum likelihood estimation, allowing for non-curable competing risks. An EM algorithm is developed to calculate the estimates of both the regression parameters and the unknown error densities, in which a kernel-smoothed conditional profile likelihood is maximised in the M-step, and the resulting estimates are consistent. The performance is demonstrated through comprehensive simulation studies. Finally, the proposed method is applied to the colorectal clinical trial data.
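For context, the standard mixture cure decomposition of the population survival function is shown below, where π(z) is the probability of being susceptible (uncured) and S_u is the survival function of the susceptible subjects, here following an accelerated failure time model. The paper's extension to non-curable competing risks and the kernel-smoothed estimation are developed in the text itself.

```latex
S_{\mathrm{pop}}(t \mid \mathbf{x}, \mathbf{z})
  = 1 - \pi(\mathbf{z}) + \pi(\mathbf{z})\, S_u(t \mid \mathbf{x}),
\qquad
\log T = \mathbf{x}^{\top}\boldsymbol{\beta} + \varepsilon .
```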