In the skew-coordinates DCT coding method, an image is partitioned along its edges into variably shaped blocks and coded using a skew-coordinates DCT adapted to the edge direction. Compared with the conventional square-block DCT, the skew-coordinates DCT is known to improve power-packing efficiency and to reduce mosquito noise in the reconstructed image. In this paper, the entropy coding method for the skew-coordinates DCT is studied in order to improve the compression ratio. As the entropy coding method, we adopt an adaptive code allocation scheme based on a Gaussian mixture distribution model and study the construction of the mixture model. Statistical characteristics of the DCT coefficients of real images are investigated, and it is shown that the warping effect of the skew-coordinates DCT reduces the local variation of the variance distribution of the DCT coefficients, so that a simple mean-power model is suitable as the mixture model. Finally, a computer simulation experiment shows that the proposed method improves coding performance. (C) 2001 Scripta Technica.
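A minimal sketch of the kind of code allocation the abstract describes, under the assumption that the mean-power model simply sets a single Gaussian variance per block from its mean AC power; the block data, step size, and helper names below are illustrative, not the paper's implementation.

```python
# Sketch (assumption, not the paper's method): model each quantized AC
# coefficient as zero-mean Gaussian with a variance from a per-block
# mean-power model, and compute the ideal code length per index.
import math
import numpy as np

def normal_cdf(x, sigma):
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

def mean_power_variance(block):
    """Use the mean AC power of the block as the model variance."""
    ac = block.ravel()[1:]                    # skip the DC coefficient
    return max(float(np.mean(ac ** 2)), 1e-12)

def ideal_code_length(q_index, step, variance):
    """Bits for a quantizer index under the Gaussian model:
    -log2 of the probability mass falling into that index's bin."""
    sigma = math.sqrt(variance)
    p = normal_cdf((q_index + 0.5) * step, sigma) - normal_cdf((q_index - 0.5) * step, sigma)
    return -math.log2(max(p, 1e-300))

# Toy example: one 4x4 coefficient block quantized with step size 8.
rng = np.random.default_rng(0)
block = rng.normal(scale=20.0, size=(4, 4))
var = mean_power_variance(block)
bits = sum(ideal_code_length(round(c / 8.0), 8.0, var) for c in block.ravel()[1:])
print(f"model variance {var:.1f}, estimated AC bits {bits:.1f}")
```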
As the speed of communication data paths has improved drastically over the past decade due to rising data rates, new technology is needed to implement fast communication. In this paper, we focus on data compression technology to speed up the communication data path. We have proposed a stream-based data compression method called ASE coding. It compresses the data stream based on the instantaneous data entropy, without buffering or stalling the compression process, and it is well suited to hardware implementation. However, stream-based data compression works heuristically with sensitive parameters that affect the compression ratio. If these parameters are configured statically, they do not follow the dynamic data entropy, and the compression performance becomes unstable. In this paper, we describe these parameters, discuss their behavior, and propose methods for adjusting them autonomously. We also propose adjustment algorithms that follow the data entropy of the input stream. Through experimental evaluations applying these algorithms, we confirm that the parameters are adjusted according to the data entropy of the stream, and that the compression ratio becomes stable as the compressor adaptively exploits the minimal entropy.
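The abstract does not spell out ASE coding's parameters, so the following is only a hypothetical sketch of the general idea: track the instantaneous entropy of the incoming stream with a decayed histogram and nudge a compression parameter (here a made-up lookup-table size) to follow it. All names and thresholds are illustrative.

```python
# Hypothetical sketch: follow the instantaneous entropy of a byte stream and
# adjust a compression parameter accordingly. Not ASE coding itself.
import math
from collections import defaultdict

class EntropyTracker:
    def __init__(self, decay=0.99):
        self.decay = decay
        self.counts = defaultdict(float)
        self.total = 0.0

    def update(self, symbol):
        # Exponentially forget old statistics so the estimate stays "instantaneous".
        for k in self.counts:
            self.counts[k] *= self.decay
        self.total = self.total * self.decay + 1.0
        self.counts[symbol] += 1.0

    def entropy_bits(self):
        h = 0.0
        for c in self.counts.values():
            p = c / self.total
            h -= p * math.log2(p)
        return h

def adjust_table_bits(current_bits, entropy, lo=2.0, hi=6.0):
    """Grow the table when the stream looks high-entropy, shrink it when the
    entropy drops, so the compressor tracks the data (illustrative rule)."""
    if entropy > hi and current_bits < 12:
        return current_bits + 1
    if entropy < lo and current_bits > 4:
        return current_bits - 1
    return current_bits

tracker, table_bits = EntropyTracker(), 8
for b in b"aaaaaaaabbbbccddef" * 10:
    tracker.update(b)
    table_bits = adjust_table_bits(table_bits, tracker.entropy_bits())
print("final table size:", 1 << table_bits)
```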
After an internal code cross-check, we recently found some mistakes in [1, Table VIII] and [1, Figs. 12 and 13]. We have therefore reimplemented the ideas and methods described in [1]. The corrected Table VIII and Figs. 12 and 13 are shown in this correction. Our code is also available from http://*** . To reflect this correction, the following changes have to be made accordingly throughout the paper [1].
In this paper, a perceptually tuned wavelet image coder is presented. In the wavelet transform domain, wavelet coefficients are quantized using a lattice quantizer, and the resulting lattice points are losslessly and efficiently encoded within a framework of quadtree decomposition and hybrid entropy coding. The parameter used by the lattice quantizer is determined from a perceptual model and is used to confine the quantization noise to a just-noticeable distortion (JND) level, or to a minimally noticeable distortion (MND) level when the bit-rate budget is tight. Moreover, a perceptually optimized bit allocation algorithm is also investigated. The proposed coder efficiently removes both statistical redundancy and perceptual redundancy.
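A minimal sketch of the perceptual quantization step described above, under the simplifying assumption that a scalar quantizer with step size 2×JND is applied per subband so the quantization error never exceeds the JND; the subband data and JND value are invented for illustration, not the output of a real perceptual model.

```python
# Minimal sketch (not the paper's coder): quantize wavelet-subband coefficients
# with a uniform step derived from an assumed JND threshold.
import numpy as np

def jnd_step(jnd_threshold):
    """With uniform rounding, step = 2*JND bounds the error magnitude by JND."""
    return 2.0 * jnd_threshold

def quantize_subband(coeffs, jnd_threshold):
    step = jnd_step(jnd_threshold)
    indices = np.round(coeffs / step).astype(int)   # indices handed to the entropy coder
    reconstruction = indices * step
    return indices, reconstruction

band = np.random.default_rng(1).normal(0.0, 10.0, size=64)   # toy subband coefficients
jnd = 3.0                                                     # hypothetical JND for this band
idx, rec = quantize_subband(band, jnd)
print("max quantization error:", np.max(np.abs(band - rec)))  # stays <= 3.0
```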
ISBN:
(Print) 9781424449934
The aim of this study is to investigate a new method of coding neural ensemble activity via information theory and to show the validity of this method in coding different stimuli (a pulse current and Gaussian white noise). The experimental data used in this paper are simulated spatio-temporal spike trains of 120 neurons in hippocampal area CA3, computed from a small-world neural network model. We quantify neural ensemble temporal patterns with information-theoretic methods: we estimate entropy values of the neural population at a temporal resolution of 1 ms and a word length of L = 8 within a 100 ms window that moves in steps of a quarter window, and we calculate the mutual information between the responses before and after the different stimuli. An "ensemble threshold" is then applied to obtain the neuronal ensembles. The results indicate that the difference between the spike trains before and after stimulation is statistically significant (p < 0.01). Moreover, the neuronal ensembles firing in response to the pulse current and to the Gaussian white noise also differ. Our results demonstrate that external stimuli are encoded via neural ensemble activity and confirm the validity of the ensemble entropy coding method in coding different stimuli.
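A small illustrative sketch of the word-entropy procedure described above (binarize at 1 ms, form words of length L = 8, estimate entropy in a sliding 100 ms window with a quarter-window step), using one synthetic spike train rather than the paper's simulated CA3 population.

```python
# Sketch of windowed word-entropy estimation on a synthetic spike train.
import math
import random
from collections import Counter

def binarize(spike_times_ms, duration_ms, dt_ms=1):
    """Turn spike times into a 0/1 train at dt_ms resolution."""
    bins = [0] * (duration_ms // dt_ms)
    for t in spike_times_ms:
        if 0 <= t < duration_ms:
            bins[int(t // dt_ms)] = 1
    return bins

def word_entropy(bits, word_len=8):
    """Empirical entropy of non-overlapping length-L words, in bits."""
    words = [tuple(bits[i:i + word_len]) for i in range(0, len(bits) - word_len + 1, word_len)]
    counts = Counter(words)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def windowed_entropy(bits, window=100, step=25, word_len=8):
    """Entropy in each 100 ms window, advancing by a quarter window."""
    return [word_entropy(bits[s:s + window], word_len)
            for s in range(0, len(bits) - window + 1, step)]

random.seed(0)
spikes = sorted(random.sample(range(1000), 120))   # one synthetic 1 s spike train
train = binarize(spikes, 1000)
print([round(h, 2) for h in windowed_entropy(train)][:5])
```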
An improved ECG compression algorithm is presented. The ECG signal is cut beat by beat according to the QRS information. The signal samples of each beat are transformed by a lifting-based 1D wavelet transform. The transform coefficients are then quantized by a uniform scalar dead-zone quantizer. The quantized coefficients are decomposed into a Significance stream, a Position-of-the-Most-Significant-Bit stream, a Sign stream, and a Residual Bits stream. Different context models are constructed based on the inter-beat correlation to enable efficient compression of these data streams by an adaptive arithmetic encoder. Experimental results on signals from the MIT-BIH arrhythmia database indicate that the proposed method performs competitively with several 2D ECG compression methods reported in the literature.
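The following sketch illustrates two of the steps named above, a uniform scalar dead-zone quantizer and one possible reading of the four-stream decomposition; it is a simplified interpretation, not the paper's exact bitstream layout, and the step size and data are toy values.

```python
# Illustrative dead-zone quantizer plus a simplified four-stream split.
import numpy as np

def deadzone_quantize(x, step):
    """Dead-zone quantizer: the zero bin is twice as wide as the others,
    so small wavelet coefficients are discarded cheaply."""
    return (np.sign(x) * np.floor(np.abs(x) / step)).astype(int)

def deadzone_dequantize(q, step):
    """Midpoint reconstruction for nonzero indices."""
    return np.sign(q) * (np.abs(q) + 0.5) * step * (q != 0)

def decompose(q):
    """Split indices into Significance, Position-of-MSB, Sign, and Residual-Bits
    streams (one possible reading of the abstract's decomposition)."""
    significance = (q != 0).astype(int)
    nonzero = np.abs(q[q != 0])
    sign = (q[q != 0] < 0).astype(int)
    msb_pos = np.floor(np.log2(nonzero)).astype(int)
    residual = nonzero - 2 ** msb_pos          # the bits below the most significant bit
    return significance, msb_pos, sign, residual

coeffs = np.array([0.4, -7.3, 12.1, 1.9, -0.2, 25.0])   # toy wavelet coefficients of one beat
q = deadzone_quantize(coeffs, step=2.0)
print("indices:", q)
print("streams:", decompose(q))
```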
ISBN:
(Print) 0780371232
We present a novel entropy coding technique which is adaptable in that each bit to be encoded may have an associated probability estimate that depends on previously encoded bits. The technique can achieve arbitrarily small redundancy, admits a simple and fast decoder, and may have advantages over arithmetic coding.
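Since the coder itself is not described in the abstract, the sketch below only illustrates the setting: an adaptive per-bit probability model (here a simple order-1 counter model, an assumption) whose estimates depend on previously encoded bits, together with the ideal code length such a model would allow an entropy coder to approach.

```python
# Sketch of per-bit adaptive probability estimation (not the authors' coder).
import math
from collections import defaultdict

class Order1Model:
    """Estimates P(bit=1 | previous bit) from Laplace-smoothed counts."""
    def __init__(self):
        self.counts = defaultdict(lambda: [1, 1])   # [zeros, ones] per context

    def prob_of_one(self, context):
        zeros, ones = self.counts[context]
        return ones / (zeros + ones)

    def update(self, context, bit):
        self.counts[context][bit] += 1

bits = [int(b) for b in "1101101011011010" * 8]
model, prev, total_bits = Order1Model(), 0, 0.0
for b in bits:
    p1 = model.prob_of_one(prev)
    p = p1 if b == 1 else 1.0 - p1
    total_bits += -math.log2(p)        # ideal code length under the adaptive model
    model.update(prev, b)
    prev = b
print(f"{len(bits)} input bits -> {total_bits:.1f} ideal coded bits")
```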
ISBN:
(Digital) 9781728187914
ISBN:
(Print) 9781728187921
The following topics are dealt with: EMTP; fault diagnosis; power engineering computing; wavelet transforms; power transmission lines; power supply quality; neural nets; radio receivers; power transformer protection; pricing.
Joint space-frequency segmentation is a relatively new image compression technique that finds the rate-distortion-optimal representation of an image from a large set of possible space-frequency partitions and quantizer combinations. As such, the method is especially effective when the images to be coded are statistically inhomogeneous, which is certainly the case in the ultrasound modality. Unfortunately, the original paper on space-frequency segmentation did not use an actual entropy coder, relying instead on the zeroth-order entropy to guide the algorithm. In this work, we fill this gap by comparing actual entropy-coding strategies and their effect on both the resulting segmentations and the rate-distortion performance. We then apply the resulting "complete" algorithm to representative ultrasound images. The result is an effective technique that performs significantly better than SPIHT by both objective and subjective measures.
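For reference, the rate estimate the original space-frequency segmentation work relied on is the zeroth-order entropy of a block's quantizer indices; a minimal sketch of that estimate follows (the toy indices are illustrative).

```python
# Zeroth-order entropy of a block's quantizer indices as a bit-rate estimate.
import math
from collections import Counter

def zeroth_order_entropy_bits(symbols):
    """Total bits = N * H0, with H0 the empirical symbol entropy."""
    counts = Counter(symbols)
    n = len(symbols)
    h0 = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return n * h0

indices = [0, 0, 1, 0, -1, 0, 2, 0, 0, 1, 0, 0]     # toy quantized coefficients
print(f"estimated rate: {zeroth_order_entropy_bits(indices):.1f} bits")
```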
Summary form only given. Entropy coding is defined as the compression of a stream of symbols drawn from a known symbol set, where the probability of occurrence of any symbol at any given point in the stream is constant and independent of the known occurrences of any other symbols. Shannon and Fano showed that the information content of such a sequence can be calculated; measured in bits, it represents the optimum compressed length of the original sequence. If information about sequential redundancy is known, better compression may be possible with one of the substitution coding techniques of Ziv and Lempel; in the absence of such information, entropy coding provides an optimum coding strategy. Huffman proposed an optimal variable-word-length coding technique. Many years later, the arithmetic coding technique was formulated at IBM, providing compression close to the optimum. Combination coding is as efficient as arithmetic coding and is very fast, as it requires only basic integer operations for both the compression and decompression stages. The technique is in fact for the entropy coding of a binary sequence, but by using a binary tree, the entropy coding of a sequence over any number of symbols can be reduced to the entropy coding of a number of binary sequences.
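A short sketch of the binary-tree reduction mentioned at the end of the abstract: each symbol is mapped to a leaf of a fixed binary tree, and the bits collected at each internal node form a separate binary sequence that a binary entropy coder (such as combination coding, not shown here) can compress. The balanced tree and the toy data are illustrative choices.

```python
# Reduce multi-symbol entropy coding to several binary sequences via a tree.
from collections import defaultdict

def build_paths(alphabet):
    """Assign each symbol the bit path of a balanced binary tree leaf."""
    bits = max(1, (len(alphabet) - 1).bit_length())
    return {s: format(i, f"0{bits}b") for i, s in enumerate(alphabet)}

def split_into_binary_streams(symbols, paths):
    """Group path bits by tree node (bit prefix), giving one binary
    sequence per internal node to hand to a binary entropy coder."""
    streams = defaultdict(list)
    for s in symbols:
        prefix = ""
        for b in paths[s]:
            streams[prefix].append(int(b))
            prefix += b
    return dict(streams)

paths = build_paths("abcd")
streams = split_into_binary_streams("aababcdcaab", paths)
for node, bits in sorted(streams.items()):
    print(f"node '{node}': {bits}")
```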