In this paper, an adaptive grid DOA estimation algorithm based on sparse Bayesian learning (LMSBL) is proposed. Compared with the traditional off-grid EM-SBL algorithm, the proposed algorithm overcomes the problem of ...
The failure mechanisms and life prediction of neutron tubes have attracted much attention owing to the wide application of neutron tubes in many domains. In this paper, the failure modes of prefabricated deuteriu...
Maximizing monotone submodular functions under cardinality constraints is a classic optimization task with several applications in data mining and machine learning. In this paper we study this problem in a dynamic environment with consistency constraints: elements arrive in a streaming fashion, and the goal is to maintain a constant-factor approximation to the optimal solution while keeping the solution stable (i.e., the number of changes between two consecutive solutions is bounded). We provide algorithms in this setting with different trade-offs between consistency and approximation quality. We also complement our theoretical results with an experimental analysis showing the effectiveness of our algorithms on real-world instances. Copyright 2024 by the author(s)
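For context, the static baseline this line of work measures against is the classic greedy algorithm of Nemhauser, Wolsey, and Fisher, which achieves a (1 - 1/e) approximation for monotone submodular maximization under a cardinality constraint. The sketch below illustrates it on maximum coverage (a canonical monotone submodular objective); it is not the paper's dynamic algorithm, and the input sets are an arbitrary illustrative example.

```python
def greedy_max_cover(sets, k):
    """Classic greedy for monotone submodular maximization under a
    cardinality constraint: repeatedly pick the set with the largest
    marginal coverage gain, up to k picks."""
    chosen, covered = [], set()
    for _ in range(k):
        # Element with the largest marginal gain over what is covered
        best = max(sets, key=lambda s: len(s - covered), default=None)
        if best is None or not (best - covered):
            break  # no remaining marginal gain
        chosen.append(best)
        covered |= best
    return chosen, covered

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
chosen, covered = greedy_max_cover(sets, k=2)
# With k = 2, greedy covers all six elements here.
```

The dynamic setting studied in the paper additionally requires that re-running such a procedure as elements arrive does not change the maintained solution too often, which is precisely the consistency constraint being traded off against approximation quality.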
The law of total variance states that the unconditional variance of a random variable Y is the sum of (a) the variance of the conditional expectation of Y given X and (b) the expectation of the conditional variance ...
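The decomposition stated above, Var(Y) = Var(E[Y|X]) + E[Var(Y|X)], can be verified exactly on a small discrete joint distribution. The pmf below is an arbitrary illustrative example, not taken from the paper.

```python
# Exact check of the law of total variance on a small discrete joint
# distribution over (X, Y): Var(Y) = Var(E[Y|X]) + E[Var(Y|X)].
pmf = {(0, 1): 0.2, (0, 3): 0.3, (1, 2): 0.1, (1, 5): 0.4}  # (x, y): P(X=x, Y=y)

def var(values_probs):
    """Variance of a discrete distribution given as (value, prob) pairs."""
    m = sum(v * p for v, p in values_probs)
    return sum((v - m) ** 2 * p for v, p in values_probs)

# (a') Unconditional variance of Y
var_y = var([(y, p) for (_, y), p in pmf.items()])

# Marginal of X and the conditional distributions Y | X = x
xs = sorted({x for x, _ in pmf})
px = {x: sum(p for (xx, _), p in pmf.items() if xx == x) for x in xs}
cond = {x: [(y, p / px[x]) for (xx, y), p in pmf.items() if xx == x] for x in xs}

# (a) Variance of the conditional expectation E[Y|X]
ey_given_x = {x: sum(y * p for y, p in cond[x]) for x in xs}
var_of_cond_mean = var([(ey_given_x[x], px[x]) for x in xs])

# (b) Expectation of the conditional variance Var(Y|X)
mean_of_cond_var = sum(px[x] * var(cond[x]) for x in xs)

assert abs(var_y - (var_of_cond_mean + mean_of_cond_var)) < 1e-9
```

For this pmf the terms work out to Var(Y) = 2.41, with Var(E[Y|X]) = 1.21 and E[Var(Y|X)] = 1.2.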
In this paper, we propose a blind detection method based on asymmetric constellations for multiple-input multiple-output (MIMO) systems. In block fading channels, the transmission information sequence is randomly and ...
For sparse Direction-of-Arrival (DOA) estimation problems, sparse Bayesian learning (SBL) has achieved excellent performance. As a combination of observation and priors, SBL-based methods exploit priors as regularizat...
Positron Emission Tomography (PET) is a medical imaging modality relying on numerical methods that integrate the statistical properties of the measurements and prior assumptions about the images. In order to maximize ...
In this paper, we present a multivariate bounded Kotz mixture model (BKMM) for data modeling when the data lies in a bounded support region. In BKMM, parameter estimation is performed by maximizing the log-likelihood ...
This article considers point and interval estimation for the generalized inverted exponential distribution on the basis of progressive first-failure censoring. We derive the maximum likelihood estimato...
In this work, we investigate the margin-maximization bias exhibited by gradient-based algorithms in classifying linearly separable data. We present an in-depth analysis of the specific properties of the velocity field associated with (normalized) gradients, focusing on their role in margin maximization. Inspired by this analysis, we propose a novel algorithm called Progressive Rescaling Gradient Descent (PRGD) and show that PRGD can maximize the margin at an exponential rate. This stands in stark contrast to all existing algorithms, which maximize the margin at a slow polynomial rate. Specifically, we identify mild conditions on data distribution under which existing algorithms such as gradient descent (GD) and normalized gradient descent (NGD) provably fail in maximizing the margin efficiently. To validate our theoretical findings, we present both synthetic and real-world experiments. Notably, PRGD also shows promise in enhancing the generalization performance when applied to linearly non-separable datasets and deep neural networks. Copyright 2024 by the author(s)