A high-order symplectic FDTD (SFDTD) framework for solving the time-dependent Schrödinger equation is established. Third-order symplectic integrators and fourth-order collocated differences are employed in th...
Compressed Sensing (CS), a popular technique that captures a discrete signal with a small number of linear measurements, can compress a signal during the sampling process itself. As an iterative greedy reconstruction algorithm for practical CS, sparsity adaptive matching pursuit (SAMP) can reconstruct a signal without prior knowledge of its sparsity while recovering the original high-dimensional data from low-dimensional measurements. This paper presents a backward, adaptive matching pursuit reconstruction algorithm with a fixed step size that avoids the overestimation phenomenon of SAMP by means of a standard regularized approach. First, a fixed, relatively large step size is set to ensure that the support set of the signal to be reconstructed grows steadily. The energy difference between successive reconstructed signals is then taken as the halting condition of the iteration. A standard regularized approach post-processes the final iteration result, backward-eliminating superfluous atoms to obtain exact reconstruction. Experimental results show that this improvement of SAMP is practically feasible and achieves fast, exact reconstruction given sufficient measurements.
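The fixed-step support expansion, energy-difference halting rule, and regularized backward pruning described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact algorithm: the step size, halting threshold `eps`, and the within-a-factor-of-2 pruning rule are assumptions.

```python
import numpy as np

def fixed_step_samp(y, Phi, step=3, eps=1e-10, max_iter=30):
    """Greedy reconstruction sketch: fixed-step support expansion,
    energy-difference halting, regularized backward pruning."""
    n = Phi.shape[1]
    support = np.array([], dtype=int)
    x_prev = np.zeros(n)
    residual = y.astype(float).copy()
    x = np.zeros(n)
    for _ in range(max_iter):
        # Expand the support by `step` atoms most correlated with the residual.
        corr = np.abs(Phi.T @ residual)
        corr[support] = -np.inf              # never re-pick a chosen atom
        support = np.union1d(support, np.argsort(corr)[-step:])
        # Least-squares estimate restricted to the current support.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - Phi @ x
        # Halt when the energy difference between successive estimates is small.
        if np.sum((x - x_prev) ** 2) < eps:
            break
        x_prev = x
    # Backward stage (a simple regularized rule): keep only atoms whose
    # coefficient magnitude is within a factor of 2 of the largest one,
    # then re-solve on the pruned support.
    mags = np.abs(x[support])
    keep = support[mags > mags.max() / 2]
    coef, *_ = np.linalg.lstsq(Phi[:, keep], y, rcond=None)
    x_final = np.zeros(n)
    x_final[keep] = coef
    return x_final
```

The backward pruning is what removes atoms that a too-large fixed step size pulled into the support, which is the overestimation the regularized post-processing is meant to correct.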
The conventional 'OR' fusion rule is frequently applied in two-threshold energy detection networks, but its overall performance in terms of false-alarm and miss-detection probability is generally. The ne...
In this paper, quotient space theory based on a fuzzy tolerance relation is put forward to solve the clustering problem. The similarity matrix does not always satisfy the ultrametric inequality, theoretically and prac...
ISBN: (Print) 9781622765034
In this paper we first describe the technology of automatic annotation transformation, which is based on the annotation adaptation algorithm (Jiang et al., 2009). It can automatically transform a human-annotated corpus from one annotation guideline to another. We then propose two optimization strategies, iterative training and predict-self reestimation, to further improve the accuracy of annotation guideline transformation. Experiments on Chinese word segmentation show that the iterative training strategy together with predict-self reestimation brings significant improvement over the simple annotation transformation baseline, and leads to classifiers with significantly higher accuracy and several times faster processing than annotation adaptation. On the Penn Chinese Treebank 5.0, it achieves an F-measure of 98.43%, significantly outperforming previous works despite using a single classifier with only local features.
Non-invasive estimation of Central Aortic blood Pressure (CAP) is still a hard problem. Through a ten-year clinical research plan (CAFE, Conduit Artery Functional Endpoint) on CAP, CAP has been shown to be a vital sign for evaluating human...
In the theory of compressive sensing, the selection of the basis functions directly affects the sparse transformation, the number of observations, and the reconstruction accuracy. In this paper, we introduce the structure of three...
The weighted circles layout problem belongs to the class of layout optimization problems with performance constraints. Because it is NP-hard, it is difficult to solve in polynomial time. In this paper, a heuristic particle swarm optimization approach with a quasi-human strategy (HQHPSA) is presented for this problem. A layout scheme is constructed by the proposed heuristic method: both the circle radius and the norms of the row vectors and sub-vectors of the matrix are taken as probability factors for roulette selection, and the circles are placed counterclockwise around the periphery of the existing circles. The complexity of the proposed heuristic is only O(n) per layout scheme. The best layout obtained by the heuristic is taken as the elite particle, and PSO with the quasi-human strategy then refines the elite particle toward the optimal solution. Numerical experiments show that the proposed algorithm outperforms existing algorithms.
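The roulette selection step of the heuristic can be illustrated as follows. This is a minimal sketch: the paper combines circle radii with matrix row-vector and sub-vector norms as probability factors, whereas plain non-negative weights stand in for them here.

```python
import random

def roulette_select(weights, rng=None):
    """Roulette-wheel selection: index i is chosen with probability
    weights[i] / sum(weights).  In the HQHPSA heuristic the weight of
    a circle would combine its radius with a row-vector norm; any
    non-negative weights work here."""
    rng = rng or random.Random()
    total = sum(weights)
    pick = rng.uniform(0.0, total)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if pick <= acc:
            return i
    return len(weights) - 1  # guard against floating-point round-off
```

Biasing the selection toward larger radii is what lets the O(n) constructor place big circles first and pack the remaining circles counterclockwise around them.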
Most previous works on web video topic detection (e.g., graph-based co-clustering methods) struggle with real-time topic detection because of their high computational complexity. A fast topic detection method is therefore needed to meet users' and administrators' requirements in real-world scenarios. Along this line, we propose a fast and effective topic detection framework in which video streams are first partitioned into buckets using a time-window function; an incremental hierarchical clustering algorithm is then developed; and finally a video-based fusion strategy integrates information from multiple modalities. A series of novel similarity metrics is also defined within the framework. Experimental results on three months of YouTube videos demonstrate the effectiveness and efficiency of the proposed method.
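The bucket-then-cluster pipeline can be sketched as follows. This is a minimal single-link illustration: the window width, the similarity threshold, and the similarity function are placeholders for the paper's own multimodal metrics, which this sketch does not reproduce.

```python
from collections import defaultdict

def bucketize(videos, window):
    """Partition a time-stamped video stream into fixed-width buckets.
    `videos` is a list of (timestamp, video) pairs; `window` is the
    bucket width in the same time unit as the timestamps."""
    buckets = defaultdict(list)
    for ts, vid in videos:
        buckets[ts // window].append(vid)
    return dict(buckets)

def incremental_cluster(items, sim, threshold):
    """One pass over the items: join the most similar existing cluster
    if the (single-link) similarity exceeds `threshold`, otherwise open
    a new cluster.  `sim(a, b)` is any pairwise similarity."""
    clusters = []
    for item in items:
        best, best_sim = None, threshold
        for c in clusters:
            s = max(sim(item, member) for member in c)  # single link
            if s > best_sim:
                best, best_sim = c, s
        if best is None:
            clusters.append([item])
        else:
            best.append(item)
    return clusters
```

Processing each bucket independently is what keeps the cost per time window bounded, in contrast to co-clustering the whole graph at once.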
Local learning approaches are especially easy to parallelize, which makes them important for cloud computing. In 1997, Lotfi A. Zadeh proposed the concept of granular computing (GrC). Zadeh held that three basic concepts underlie human cognition: granulation, organization and causation, with a granule being a clump of points (objects) drawn together by indistinguishability, similarity, proximity or functionality. In this paper, we present a novel local learning approach based on granular computing, named nested local learning (NGLL). Experiments show that the NGLL approach outperforms probabilistic latent semantic analysis (PLSA).