Data compression is now used in virtually every field. It reduces the number of bits required to represent a message, which saves storage capacity, speeds up file transfers, and lowers storage hardware and bandwidth costs. Many compression methods exist; this paper focuses on Huffman coding and arithmetic coding. We compare adaptive Huffman coding and arithmetic coding on various input streams and observe which technique compresses the data more efficiently.
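As a rough illustration of the comparison described above, the following Python sketch (my own illustration, not code from the paper; the sample input and function name are hypothetical) builds a static Huffman code for a byte stream and compares the total Huffman code length against the zero-order entropy bound, which arithmetic coding can approach arbitrarily closely.

import heapq
from collections import Counter
from math import log2

def huffman_code_lengths(freqs):
    # Build a Huffman tree with a min-heap and return the code length per symbol.
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:  # degenerate single-symbol case
        return {s: 1 for s in freqs}
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

data = b"abracadabra arithmetic versus huffman"   # hypothetical sample input
freqs = Counter(data)
n = len(data)
lengths = huffman_code_lengths(freqs)

huffman_bits = sum(freqs[s] * lengths[s] for s in freqs)
entropy_bits = sum(-f * log2(f / n) for f in freqs.values())
print(f"Huffman: {huffman_bits} bits, entropy bound (approached by arithmetic coding): {entropy_bits:.1f} bits")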
ISBN (Print): 0769510310
We present a novel entropy coding technique which is based on recursive interleaving of variable-to-variable length binary source codes. The encoding is adaptable in that each bit to be encoded may have an associated probability estimate which depends on previously encoded bits. The technique may have advantages over arithmetic coding. The technique can achieve arbitrarily small redundancy, and admits a simple and fast decoder. We discuss code design and performance estimation methods, as well as practical encoding and decoding algorithms.
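The key ingredient named in the abstract, a per-bit probability estimate that depends on previously encoded bits, can be illustrated with the small Python sketch below. It is my own hedged illustration, not the paper's recursive interleaving scheme: it uses a simple adaptive counter model and reports the ideal code length of -log2 p per bit that any near-optimal entropy coder would spend.

from math import log2

def adaptive_ideal_bits(bits):
    # Krichevsky-Trofimov-style counter model: p(1) = (ones + 0.5) / (total + 1),
    # updated after every bit, so each estimate depends only on previously coded bits.
    ones, total, cost = 0, 0, 0.0
    for b in bits:
        p1 = (ones + 0.5) / (total + 1.0)
        p = p1 if b == 1 else 1.0 - p1
        cost += -log2(p)          # ideal code length for this bit
        ones += b
        total += 1
    return cost

bits = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0]   # hypothetical input bit stream
print(f"{len(bits)} bits in, about {adaptive_ideal_bits(bits):.2f} bits of ideal output")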
ISBN (Print): 9781479970698
Recent years have seen a tremendous increase in the production, transmission and storage of color images. This paper presents a new method for lossless compression of color images based on the wavelet transform and support vector machines (SVM). The wavelet transform is applied to the luminance (Y) and chrominance (Cb, Cr) components of the original color image. SVM regression then learns the dependencies from the training data, and compression is achieved by representing the original data with fewer training points (support vectors), eliminating redundancy. In addition, an effective entropy coder based on run-length and arithmetic encoders is used to encode the support vectors and weights. The compression algorithm is applied to a test set of 1024 x 1024 images encoded on 24 bits. To evaluate the results, we calculated the peak signal-to-noise ratio (PSNR) and the compression ratio (CR). Experimental results show that the compression algorithm achieves a substantial improvement.
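The two figures of merit used in the evaluation, PSNR and compression ratio, can be computed as in the short sketch below (an independent illustration with hypothetical inputs, not the authors' code), assuming 8-bit-per-channel image data.

import numpy as np

def psnr(original, reconstructed, peak=255.0):
    # Peak signal-to-noise ratio in dB for 8-bit image data.
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    # CR = uncompressed size / compressed size.
    return original_bytes / compressed_bytes

# Hypothetical 1024 x 1024 RGB image (24 bits per pixel) and an assumed compressed size.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(1024, 1024, 3), dtype=np.uint8)
recon = img  # a lossless codec reproduces the input exactly, so PSNR is infinite
print("PSNR:", psnr(img, recon), "dB")
print("CR:  ", compression_ratio(img.nbytes, 786_432))  # 786 432 is a made-up compressed size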
ISBN (Print): 9783642240423; 9783642240430
In recent years, many wavelet coders have used various spatially adaptive coding techniques to compress images. Flexibility and coding efficiency are two crucial issues in spatially adaptive methods, so this paper introduces the "spherical coder". The objective is to combine the spherical tree with the wavelet lifting technique and to compare the performance of the spherical tree across different entropy coding techniques such as arithmetic, Huffman and run-length coding. The comparison is made using PSNR and compression ratio (CR). It is shown that the spherical tree in wavelet lifting with the arithmetic coder gives the highest CR value.
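The wavelet lifting technique referred to above can be illustrated with a minimal one-level integer 5/3 lifting step in Python. This is a generic textbook lifting sketch under my own boundary-handling assumptions, not the paper's spherical coder.

import numpy as np

def lifting_53_forward(x):
    # One level of the integer 5/3 lifting wavelet on a 1-D signal of even length.
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # Predict step: detail = odd - floor(average of neighbouring evens)
    even_right = np.append(even[1:], even[-1])   # symmetric extension at the right edge
    d = odd - ((even + even_right) >> 1)
    # Update step: approximation = even + floor((left detail + detail + 2) / 4)
    d_left = np.insert(d[:-1], 0, d[0])          # symmetric extension at the left edge
    s = even + ((d_left + d + 2) >> 2)
    return s, d

signal = [12, 14, 15, 13, 10, 8, 9, 11]   # hypothetical sample signal
approx, detail = lifting_53_forward(signal)
print("approximation:", approx)
print("detail:       ", detail)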
ISBN (Print): 9781450311779
This paper considers large-scale OneMax and RoyalRoad optimization problems with up to 10^7 binary variables within a compact Estimation of Distribution Algorithms (EDA) framework. Building upon the compact Genetic Algorithm (cGA), the continuous-domain Population-Based Incremental Learning algorithm (PBILc) and the arithmetic-coding EDA, we define a novel method that is able to compactly solve regular and noisy versions of these problems with minimal memory requirements, regardless of problem or population size. This feature allows the algorithm to be run on a conventional desktop machine. Issues regarding probability model sampling, arbitrary precision of the arithmetic-coding decompression scheme, incremental fitness function evaluation and updating rules for compact learning are presented and discussed.
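The compact Genetic Algorithm that the method builds on can be sketched in a few lines of Python. The version below is my own hedged baseline, not the paper's arithmetic-coding EDA; it keeps only one probability per bit, which is the minimal-memory property the abstract highlights.

import random

def cga_onemax(n_bits=64, virtual_pop=50, max_iters=20000, seed=1):
    # Compact GA: a single probability vector replaces an explicit population.
    rng = random.Random(seed)
    p = [0.5] * n_bits
    for _ in range(max_iters):
        a = [1 if rng.random() < pi else 0 for pi in p]
        b = [1 if rng.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if sum(a) >= sum(b) else (b, a)   # OneMax fitness = number of ones
        for i in range(n_bits):
            if winner[i] != loser[i]:
                p[i] += (1.0 / virtual_pop) if winner[i] == 1 else -(1.0 / virtual_pop)
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):                    # model has converged
            break
    return p

model = cga_onemax()
print("converged bits set to 1:", sum(1 for pi in model if pi == 1.0), "of", len(model))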
ISBN (Print): 9781467310499; 9781467310475
In this paper, a performance comparison of an embedded zerotree wavelet (EZW) based codec is carried out for Huffman and arithmetic coding. The sub-band decomposition coefficients are coded into a multilayer bit stream, which is then entropy coded using Huffman and arithmetic codes. The comparison shows that the arithmetic-coded bit stream achieves a better bit rate than Huffman coding at the same threshold.
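The multilayer bit stream mentioned above comes from successive threshold passes over the wavelet coefficients. The Python sketch below is a simplified, hypothetical illustration of EZW-style significance passes, not the authors' codec; it shows how the symbol stream that is later Huffman- or arithmetic-coded is formed.

import numpy as np

def significance_passes(coeffs, n_passes=4):
    # Emit one layer of significance symbols per halved threshold (EZW-style dominant passes,
    # without the zerotree grouping used in a full EZW coder).
    c = np.asarray(coeffs, dtype=np.float64)
    threshold = 2.0 ** np.floor(np.log2(np.max(np.abs(c))))
    significant = np.zeros(c.shape, dtype=bool)
    layers = []
    for _ in range(n_passes):
        symbols = []
        for i, v in enumerate(c):
            if significant[i]:
                continue
            if abs(v) >= threshold:
                symbols.append("P" if v > 0 else "N")   # newly significant, positive/negative
                significant[i] = True
            else:
                symbols.append("Z")                      # still insignificant at this threshold
        layers.append(symbols)
        threshold /= 2.0
    return layers

coeffs = [57, -37, 5, -3, 2, 29, -14, 7]   # hypothetical sub-band coefficients
for k, layer in enumerate(significance_passes(coeffs)):
    print(f"pass {k}:", "".join(layer))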
ISBN (Print): 9781467362382
The CABAC entropy coding engine is a well known throughput bottleneck in the AVC/H.264 video codec. It was redesigned to achieve higher throughput for the latest video coding standard HEVC/H.265. Various improvements were made including reduction in context coded bins, reduction in total bins and grouping of bypass bins. This paper discusses and quantifies the impact of these techniques and introduces a new metric called Bjontegaard delta cycles (BD-cycle) to compare the CABAC throughput of HEVC vs. AVC. BD-cycle uses the Bjontegaard delta measurement method to compute the average difference between the cycles vs. bit-rate curves of HEVC and AVC. This metric is useful for estimating the throughput of an HEVC CABAC engine from an existing AVC CABAC design for a given bit-rate. Under the common conditions set by the JCT-VC standardization body, HEVC CABAC has an average BD-cycle reduction of 31.1% for all intra, 24.3% for low delay, and 25.9% for random access, when processing up to 8 bypass bins per cycle.
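A BD-style average difference between two cycles-versus-bit-rate curves can be computed as in the sketch below, which is my own hedged rendition of the Bjontegaard delta procedure applied to cycles, with made-up sample points; it is not the exact implementation used by the authors.

import numpy as np

def bd_delta(rate_a, val_a, rate_b, val_b):
    # Bjontegaard-style delta: fit a cubic of value vs log10(rate) for each curve,
    # integrate both over the overlapping rate range, and return the average difference (B - A).
    la, lb = np.log10(rate_a), np.log10(rate_b)
    pa, pb = np.polyfit(la, val_a, 3), np.polyfit(lb, val_b, 3)
    lo, hi = max(la.min(), lb.min()), min(la.max(), lb.max())
    ia = np.polyval(np.polyint(pa), hi) - np.polyval(np.polyint(pa), lo)
    ib = np.polyval(np.polyint(pb), hi) - np.polyval(np.polyint(pb), lo)
    return (ib - ia) / (hi - lo)

# Hypothetical operating points: bit-rate (kbps) vs decoder cycles per frame for two codecs.
rate_avc = np.array([1000.0, 2000.0, 4000.0, 8000.0])
cyc_avc = np.array([2.1e6, 3.9e6, 7.5e6, 14.8e6])
rate_hevc = np.array([900.0, 1900.0, 3800.0, 7600.0])
cyc_hevc = np.array([1.6e6, 2.9e6, 5.6e6, 11.0e6])
print("average cycle difference (HEVC - AVC):", bd_delta(rate_avc, cyc_avc, rate_hevc, cyc_hevc))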
ISBN (Print): 0819462896
The ultraspectral sounder data features strong correlations in disjoint spectral regions due to the same type of absorbing gases. This paper compares the compression performance of two robust data preprocessing schemes, namely Bias-Adjusted Reordering (BAR) and Minimum Spanning Tree (MST) reordering, in the context of entropy coding. Both schemes can take advantage of the strong correlations to achieve higher compression gains. The compression methods consist of the BAR or MST preprocessing schemes followed by linear prediction with context-free or context-based arithmetic coding (AC). Compression experiments on the NASA AIRS ultraspectral sounder data set show that MST without bias adjustment produces lower compression ratios than BAR and bias-adjusted MST for both context-free and context-based AC. Bias-adjusted MST outperforms BAR for context-free arithmetic coding, whereas BAR outperforms MST for context-based arithmetic coding. BAR with context-based AC yields the highest average compression ratios in comparison to MST with context-free or context-based AC.
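As a rough illustration of MST-based reordering (my own simplified sketch with a hand-rolled Prim's algorithm and hypothetical data, not the BAR or MST schemes exactly as defined in the paper), channels can be ordered so that each one is predicted from the already-coded channel it most resembles, which shrinks the residuals handed to the arithmetic coder.

import numpy as np

def mst_order(channels):
    # Prim's algorithm over mean-absolute-difference costs between channels; returns
    # a coding order and, for each channel, the previously coded channel it is predicted from.
    n = len(channels)
    cost = np.array([[np.mean(np.abs(channels[i] - channels[j])) for j in range(n)]
                     for i in range(n)])
    in_tree, order, parent = {0}, [0], {0: None}
    while len(in_tree) < n:
        _, i, j = min((cost[i][j], i, j) for i in in_tree for j in range(n) if j not in in_tree)
        in_tree.add(j)
        order.append(j)
        parent[j] = i
    return order, parent

rng = np.random.default_rng(0)
base = rng.normal(size=256)
# Hypothetical spectral channels: noisy, shifted copies of one shared signal.
chans = [base + rng.normal(scale=0.1, size=256) + k for k in (0.0, 3.0, 0.2, 2.9)]
order, parent = mst_order(chans)
residual_spread = sum(np.std(chans[j] - chans[parent[j]]) for j in order if parent[j] is not None)
print("coding order:", order, "parent map:", parent)
print("total residual standard deviation (proxy for coded bits):", round(float(residual_spread), 3))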
ISBN (Print): 9780819486370
Methods for embedding stream cipher rules into compressive Elias-type entropy coders are presented. Such systems can simultaneously compress and encrypt input data, thereby providing a fast and secure means of data compression. Procedures for maintaining compression performance are articulated, with a focus on further compression optimizations. Furthermore, a novel method is proposed which exploits the compression process to hide cipherstream information in the case of a known-plaintext attack. Simulations were performed on images from a variety of classes in order to grade and compare the compressive and computational costs of the novel system relative to traditional compression-followed-by-encryption methods.
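The baseline that the simulations compare against, traditional compression followed by encryption, can be illustrated with the short Python sketch below. It is a toy illustration under my own assumptions: zlib for compression and a hash-counter keystream standing in for a real stream cipher; it is not the paper's embedded-cipher coder, and the keystream is not cryptographically vetted.

import hashlib
import zlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy counter-mode keystream from SHA-256; stands in for a proper stream cipher.
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def compress_then_encrypt(data: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(data, level=9)
    ks = keystream(key, len(compressed))
    return bytes(c ^ k for c, k in zip(compressed, ks))

def decrypt_then_decompress(blob: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(blob))
    return zlib.decompress(bytes(c ^ k for c, k in zip(blob, ks)))

message = b"sample image scanline data " * 64      # hypothetical input
secret = compress_then_encrypt(message, b"demo-key")
assert decrypt_then_decompress(secret, b"demo-key") == message
print(f"{len(message)} bytes in -> {len(secret)} bytes compressed and encrypted")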
ISBN (Print): 9781450342322
Relational datasets are being generated at an alarmingly rapid rate across organizations and industries. Compressing these datasets could significantly reduce storage and archival costs. Traditional compression algorithms, e.g., gzip, are suboptimal for compressing relational datasets since they ignore the table structure and relationships between attributes. We study compression algorithms that leverage the relational structure to compress datasets to a much greater extent. We develop SQUISH, a system that uses a combination of Bayesian Networks and arithmetic coding to capture multiple kinds of dependencies among attributes and achieve near-entropy compression rate. SQUISH also supports user-defined attributes: users can instantiate new data types by simply implementing five functions for a new class interface. We prove the asymptotic optimality of our compression algorithm and conduct experiments to show the effectiveness of our system: SQUISH achieves a reduction of over 50% in storage size relative to systems developed in prior work on a variety of real datasets.
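Why modelling attribute dependencies helps can be seen from a tiny Python sketch with a made-up table. It is not SQUISH itself and omits the Bayesian-network learning and the actual arithmetic coder; it only shows that the ideal code length under a conditional model is much smaller than under an independence model whenever one column largely determines another.

from collections import Counter
from math import log2

rows = [("NY", "New York"), ("NY", "Buffalo"), ("CA", "San Jose"),
        ("CA", "Los Angeles"), ("NY", "New York"), ("CA", "San Jose")] * 50  # hypothetical table

def ideal_bits_independent(rows):
    # Code each column with its own marginal distribution, ignoring dependencies.
    bits, n = 0.0, len(rows)
    for col in range(2):
        counts = Counter(r[col] for r in rows)
        bits += sum(-c * log2(c / n) for c in counts.values())
    return bits

def ideal_bits_conditional(rows):
    # Code column 0 with its marginal, column 1 with P(city | state): a one-edge Bayesian network.
    n = len(rows)
    state_counts = Counter(r[0] for r in rows)
    bits = sum(-c * log2(c / n) for c in state_counts.values())
    pair_counts = Counter(rows)
    for (state, _), c in pair_counts.items():
        bits += -c * log2(c / state_counts[state])
    return bits

print(f"independent model: {ideal_bits_independent(rows):.0f} bits, "
      f"conditional model: {ideal_bits_conditional(rows):.0f} bits")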