This paper derives the CUR-type factorization for tensors in the Tucker format based on a new variant of the discrete empirical interpolation method known as L-DEIM. This novel sampling technique allows us to construct an efficient algorithm for computing the structure-preserving decomposition, which significantly reduces the computational cost. For large-scale datasets, we incorporate the random sampling technique into the L-DEIM procedure to further improve efficiency. Moreover, we propose randomized algorithms for computing a hybrid decomposition, which yields an interpretable factorization and provides a smaller approximation error than the tensor CUR factorization. We provide a comprehensive analysis of the probabilistic errors associated with our proposed algorithms, and present numerical results that demonstrate the effectiveness of our methods.
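The DEIM idea underlying this abstract can be illustrated in the plain matrix case. The sketch below (our own illustration, not the paper's L-DEIM variant or its tensor extension; all function names are ours) greedily selects interpolation indices from a singular-vector basis and assembles a matrix CUR factorization:

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM: pick one row index per column of the basis U."""
    n, k = U.shape
    idx = np.empty(k, dtype=int)
    idx[0] = np.argmax(np.abs(U[:, 0]))
    for j in range(1, k):
        # Residual of the j-th basis vector after interpolating it
        # on the indices chosen so far; pick its largest entry.
        c = np.linalg.solve(U[idx[:j], :j], U[idx[:j], j])
        r = U[:, j] - U[:, :j] @ c
        idx[j] = np.argmax(np.abs(r))
    return idx

def cur_from_deim(A, k):
    """Matrix CUR using DEIM indices from the leading singular vectors."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    rows = deim_indices(U[:, :k])
    cols = deim_indices(Vt[:k, :].T)
    C, R = A[:, cols], A[rows, :]
    # Middle factor chosen so C @ Umid @ R projects A onto the cross.
    Umid = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, Umid, R
```

For an exactly low-rank matrix, the selected columns and rows span the column and row spaces, so the CUR product reproduces the matrix; for noisy data the selection trades accuracy for interpretability, which is the point of CUR-type factorizations.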
Randomized algorithms are efficient techniques for big-data tensor analysis. In this tutorial paper, we review and extend a variety of randomized algorithms for decomposing large-scale data tensors in the Tensor Ring (TR) format. We discuss both adaptive and nonadaptive randomized algorithms for this task. Our main focus is on the random projection technique as an efficient randomized framework and how it can be used to decompose large-scale data tensors in the TR format. Simulations are provided to support the presentation, and the efficiency and performance of the presented algorithms are compared.
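The random projection technique this tutorial centers on has a standard building block, the randomized range finder, which is applied to tensor unfoldings during a TR sweep. A minimal sketch (our own illustration; the function names and the reduction to a mode-1 unfolding are assumptions, not the tutorial's exact algorithms):

```python
import numpy as np

def randomized_range_finder(A, rank, oversample=10, seed=0):
    """Sketch Y = A @ G with a Gaussian test matrix, then orthonormalize:
    the columns of Q approximately span the range of A."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ G)
    return Q

def tr_first_step_sketch(T, rank, seed=0):
    """One compression step in the spirit of a TR sweep: reduce the
    mode-1 unfolding of a 3-way tensor via the range finder."""
    A = T.reshape(T.shape[0], -1)   # mode-1 unfolding
    Q = randomized_range_finder(A, rank, seed=seed)
    return Q, Q.T @ A               # A is approximated by Q @ (Q^T A)
```

Repeating such projection steps mode by mode, on the successively compressed remainder, is what makes randomized TR decompositions scale to large tensors.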
The singular value decomposition (SVD) of a reordering of a matrix A can be used to determine an efficient Kronecker product (KP) sum approximation to A. We present the use of an approximate truncated SVD (TSVD) to fi...
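The rearrangement-plus-SVD construction mentioned in this snippet (the classical Van Loan-Pitsianis approach) can be sketched for the rank-one case; this is our own minimal illustration with the exact SVD rather than the approximate truncated SVD the snippet refers to:

```python
import numpy as np

def nearest_kronecker(A, m1, n1, m2, n2):
    """Best Frobenius-norm approximation A ~ B (x) C via the SVD of a
    rearrangement of A: each (m2 x n2) block of A becomes one row."""
    R = A.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    # The leading singular triple of R gives vec(B) and vec(C).
    B = np.sqrt(s[0]) * U[:, 0].reshape(m1, n1)
    C = np.sqrt(s[0]) * Vt[0, :].reshape(m2, n2)
    return B, C
```

Keeping more singular triples of the rearranged matrix yields a KP *sum* approximation, one Kronecker term per triple, which is where a truncated SVD enters.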
ISBN (print): 0780363949
The quality of network services is directly affected by QoS routing algorithms, and QoS routing algorithms rely heavily on network state information specifying the resource availability at network nodes and links. In practice, the network state information is not always accurate because it is not updated in time. This paper proposes a randomized QoS routing algorithm for networks with inaccurate link-state information and develops a simulation environment. Our algorithm reduces computational cost and protocol overhead. Simulation tests demonstrate that our algorithm performs very well in practice.
The famous Tucker decomposition has been widely and successfully used in many fields. However, it often suffers from the curse of dimensionality due to the core tensor and large ranks. To tackle this issue, we introduce an additional core tensor into the Tucker decomposition and propose the so-called double-Tucker (dTucker) decomposition. The additional core can share the ranks of the original Tucker decomposition and hence greatly reduces the number of parameters of the new decomposition. We employ the alternating least squares (ALS) method, with explicit structures on the coefficient matrices of the ALS subproblems, to compute the dTucker decomposition. To characterize these structures, a new tensor product is defined. Its properties, together with the aforementioned structures, motivate an ALS-based randomized algorithm built on the Kronecker sub-sampled randomized Fourier transform for our new decomposition. A special case of the algorithm leads to a more efficient leverage-based random sampling algorithm. These randomized algorithms avoid forming the full coefficient matrices of the ALS subproblems by projecting and sampling on the factor tensors. Numerical experiments, including tensor reconstruction and multi-view subspace clustering, are presented to test our decomposition and algorithms. They show that the dTucker decomposition can effectively decrease the ranks of the classical one, and hence the total number of parameters, and that the randomized algorithms greatly reduce running time while maintaining similar accuracy. Moreover, the numerical results show that our decomposition can even outperform the popular tensor train decomposition and the newly developed tensor wheel decomposition in compressing parameters.
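The leverage-based random sampling mentioned in this abstract can be illustrated on a single overdetermined least-squares problem, the building block of every ALS subproblem. This is a minimal generic sketch (our own, computing exact leverage scores via a thin QR for clarity; the paper's algorithm exploits tensor structure to avoid forming the full coefficient matrix at all):

```python
import numpy as np

def leverage_sample_lstsq(A, b, s, seed=0):
    """Approximate the LS solution of min ||Ax - b|| by sampling s rows
    of (A, b) with probabilities given by the leverage scores of A."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(A)            # exact leverages (illustrative only)
    lev = np.sum(Q**2, axis=1)
    p = lev / lev.sum()
    idx = rng.choice(A.shape[0], size=s, replace=True, p=p)
    w = 1.0 / np.sqrt(s * p[idx])     # importance-sampling rescaling
    x, *_ = np.linalg.lstsq(w[:, None] * A[idx], w * b[idx], rcond=None)
    return x
```

Solving the rescaled sampled system costs O(s n^2) instead of O(m n^2), which is the source of the speedup when s is much smaller than the number of rows m.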
We propose two random low-rank approximation algorithms based on sparse projection, SEMHMT and SEMTropp. Compared with HMT and Tropp algorithms, we mainly introduce Sparse Embedding Matrix (SEM) as sparse projection t...
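The sparse-embedding idea in this snippet can be sketched with a CountSketch-style test matrix in place of a dense Gaussian one. This is our own generic illustration of an HMT-style low-rank approximation with a sparse projection, not the SEMHMT/SEMTropp algorithms themselves; all names are ours:

```python
import numpy as np

def sparse_embedding(m, n, seed=0):
    """CountSketch-style sparse embedding matrix: exactly one random
    +/-1 entry per column, so applying it costs one pass over the data."""
    rng = np.random.default_rng(seed)
    S = np.zeros((m, n))
    S[rng.integers(0, m, size=n), np.arange(n)] = rng.choice([-1.0, 1.0], size=n)
    return S

def sem_lowrank(A, k, oversample=10, seed=0):
    """Randomized low-rank approximation A ~ Q @ B using a sparse
    embedding as the test matrix instead of a dense Gaussian."""
    S = sparse_embedding(k + oversample, A.shape[1], seed)
    Q, _ = np.linalg.qr(A @ S.T)   # sketch the range cheaply
    B = Q.T @ A
    return Q, B
```

Because the sketch A @ S.T touches each column of A once with a single signed copy, forming it is far cheaper than a dense Gaussian multiply, which is the motivation for sparse projections.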
Kidney exchange programs have been established in several countries to organize kidney exchanges between incompatible patient-donor pairs. The core of these programs are algorithms to solve the kidney exchange problem, which can be modeled as finding a maximum weight packing of vertex-disjoint cycles of length at most some small constant L (typically 2 ≤ L ≤ 5) in a directed graph. Generally, the objective function is maximizing the number of possible kidney transplants. In this paper, we study randomized methods for the kidney exchange problem involving only 2-cycle and 3-cycle exchanges. First, we formulate the kidney exchange problem as a parameterized problem. Then we propose a randomized parameterized algorithm with running time O*(5.63^(k3) · 2^(2k2)) based on randomly partitioning the vertices. Next, by using the random divide-and-conquer technique, another randomized algorithm with running time O*(k2^(⌈log k2⌉/2) · k3^(⌈log k3⌉/2) · 4^(2k3) · 2^(2k2)) is given for the parameterized kidney exchange problem. Finally, our randomized algorithms can be extended to solve the general kidney exchange problem.
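The cycle-packing model in this abstract can be made concrete on tiny instances. The brute-force sketch below (our own illustration for intuition only, not the paper's randomized parameterized algorithms) enumerates all 2- and 3-cycles of a compatibility digraph and maximizes the number of patients covered by vertex-disjoint cycles:

```python
from itertools import permutations

def max_transplants(n, arcs):
    """Exact maximum number of vertices covered by vertex-disjoint
    2- and 3-cycles in a small digraph, by exhaustive search."""
    arcs = set(arcs)
    cycles = []
    # 2-cycles: mutually compatible pairs (count each pair once).
    for u, v in permutations(range(n), 2):
        if u < v and (u, v) in arcs and (v, u) in arcs:
            cycles.append(frozenset((u, v)))
    # Directed 3-cycles, anchored at their smallest vertex.
    for u, v, w in permutations(range(n), 3):
        if u == min(u, v, w) and (u, v) in arcs and (v, w) in arcs and (w, u) in arcs:
            cycles.append(frozenset((u, v, w)))
    best = 0
    def search(i, used, covered):
        nonlocal best
        best = max(best, covered)
        for j in range(i, len(cycles)):
            if not (cycles[j] & used):
                search(j + 1, used | cycles[j], covered + len(cycles[j]))
    search(0, frozenset(), 0)
    return best
```

This search is exponential in the number of cycles; the whole point of the paper's randomized partitioning and divide-and-conquer techniques is to bound the running time in terms of the parameters k2 and k3 instead.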
This paper presents an improved Randomized Circle Detection (RCD) algorithm that uses a circularity characteristic to detect circles in images with complex backgrounds; it is not based on the Hough Transform. The experimental results show that the algorithm can locate the circular marks on a Printed Circuit Board (PCB).
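The core RCD loop can be sketched as follows: repeatedly fit a circle to three randomly chosen edge points and keep the candidate supported by the most edge points. This is our own minimal sketch of the generic RCD idea (the paper's improvement adds a circularity test and other refinements not shown here):

```python
import numpy as np

def circle_from_3(p1, p2, p3):
    """Center and radius of the circle through three non-collinear points."""
    A = 2 * np.array([p2 - p1, p3 - p1], dtype=float)
    b = np.array([p2 @ p2 - p1 @ p1, p3 @ p3 - p1 @ p1], dtype=float)
    c = np.linalg.solve(A, b)
    return c, np.linalg.norm(p1 - c)

def rcd(points, trials=500, tol=1.0, min_inliers=20, seed=0):
    """Randomized circle detection over a set of edge pixels: return the
    (center, radius) candidate with the most inliers, or None."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best, best_count = None, 0
    for _ in range(trials):
        i = rng.choice(len(pts), size=3, replace=False)
        try:
            c, r = circle_from_3(*pts[i])
        except np.linalg.LinAlgError:   # collinear sample, no circle
            continue
        count = np.sum(np.abs(np.linalg.norm(pts - c, axis=1) - r) < tol)
        if count >= min_inliers and count > best_count:
            best, best_count = (c, r), count
    return best
```

Because each candidate is checked against all edge points, a genuine circle quickly dominates spurious triples, even with many background edge pixels.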
This article presents a new fault detection scheme for stochastic distribution systems with non-Gaussian variables based on a probabilistic framework. The available information is the output probability density function (pdf) rather than the measured outputs themselves. The square-root B-spline function is utilized to formulate the output pdfs. Probabilistic parameter models are developed to characterize system uncertainties and faults. An observer is designed to detect the multiplicative and additive faults. The randomized algorithm is adopted to design the threshold to achieve an optimal balance between the false alarm rate (FAR) and the fault detection rate (FDR). The effectiveness of the proposed method is demonstrated and compared against existing work by utilizing a continuous stirred tank reactor system.
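The randomized threshold design mentioned in this abstract can be illustrated in its simplest Monte Carlo form: draw residual samples under fault-free operation and set the threshold at an empirical quantile, then measure the resulting FAR/FDR trade-off. This is our own generic sketch, not the article's observer-based scheme:

```python
import numpy as np

def randomized_threshold(fault_free_residuals, alpha=0.01):
    """Threshold at the empirical (1 - alpha) quantile of fault-free
    residual samples, so the false alarm rate is approximately alpha."""
    return float(np.quantile(np.asarray(fault_free_residuals), 1 - alpha))

def rates(residuals, is_faulty, thr):
    """Empirical false alarm rate and fault detection rate for a threshold."""
    r = np.asarray(residuals)
    y = np.asarray(is_faulty, dtype=bool)
    far = float(np.mean(r[~y] > thr))   # fault-free samples flagged
    fdr = float(np.mean(r[y] > thr))    # faulty samples flagged
    return far, fdr
```

Sweeping alpha traces out the FAR/FDR trade-off curve; the article's contribution is choosing the operating point on that curve in an optimal, probabilistically guaranteed way.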
This paper explores and analyzes two randomized designs for robust principal component analysis employing low-dimensional data sketching. In one design, a data sketch is constructed using random column sampling followed by low-dimensional embedding, while in the other, sketching is based on random column and row sampling. Both designs are shown to bring about substantial savings in complexity and memory requirements for robust subspace learning over conventional approaches that use the full-scale data. A characterization of the sample and computational complexity of both designs is derived in the context of two distinct outlier models, namely, sparse and independent outlier models. The proposed randomized approach can provably recover the correct subspace with computational and sample complexity which depend only weakly on the size of the data (only through the coherence parameters). The results of the mathematical analysis are confirmed through numerical simulations using both synthetic and real data.
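The column-sampling stage of the designs above can be sketched in a few lines: for a low-rank, low-coherence matrix, a small random subset of columns already spans the column subspace. This is our own outlier-free illustration (the paper's contribution is the robust analysis under sparse and independent outlier models, and a second embedding stage we omit for brevity):

```python
import numpy as np

def sampled_subspace(D, n_cols, r, seed=0):
    """Estimate the r-dimensional column subspace of D from a random
    subset of n_cols of its columns."""
    rng = np.random.default_rng(seed)
    cols = rng.choice(D.shape[1], size=n_cols, replace=False)
    U, _, _ = np.linalg.svd(D[:, cols], full_matrices=False)
    return U[:, :r]

def subspace_distance(U, V):
    """Spectral-norm distance between the projectors of two subspaces
    (the sine of the largest principal angle)."""
    return np.linalg.norm(U @ U.T - V @ V.T, 2)
```

The number of sampled columns needed scales with the rank and coherence of the data rather than its ambient size, which is the "depends only weakly on the size of the data" claim in the abstract.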