ISBN (print): 0780390237
Low-density parity-check (LDPC) codes achieve coding performance that approaches the Shannon limit. Based on a novel method for deriving regular quasi-cyclic sub-codes, a radiation-tolerant encoder was implemented in 0.25 µm CMOS. The use of generator polynomial reconstruction, partial-product multiplication, and functional sharing in the parity register results in a highly efficient design. Only 1,492 flip-flops, together with a programmable 21-bit look-ahead scheme, are needed to achieve a 1 Gb/s data throughput; a comparable two-stage encoder requires 8,176 flip-flops.
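The paper's encoder builds on systematic cyclic encoding, where parity bits are the remainder of polynomial division by a generator polynomial. A minimal sketch of that underlying operation is below; the generator polynomial used is a toy CRC-3 example, not the quasi-cyclic sub-code from the paper, and the hardware tricks (generator polynomial reconstruction, look-ahead) are omitted:

```python
def cyclic_encode(msg, g):
    """Systematic encoder for a cyclic (n, k) code.
    msg: k message bits; g: generator polynomial coefficients,
    highest degree first (degree r = n - k).
    Returns msg followed by the r parity bits, i.e. the remainder
    of x^r * m(x) divided by g(x) over GF(2)."""
    r = len(g) - 1
    # dividend: the message shifted up by r positions
    rem = list(msg) + [0] * r
    for i in range(len(msg)):
        if rem[i]:  # XOR g(x) in whenever the leading bit is 1
            for j in range(len(g)):
                rem[i + j] ^= g[j]
    return list(msg) + rem[-r:]

# toy example with g(x) = x^3 + x + 1
codeword = cyclic_encode([1, 0, 0, 1], [1, 0, 1, 1])
```

In hardware this division is realized as a shift register with feedback taps at the nonzero coefficients of g(x), which is why the flip-flop count of the parity register dominates the encoder's area.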
ISBN (print): 0819425869
Pre-processing images prior to encoding can remove noise or unimportant detail and thus improve the overall performance of the coder. Typical objective image quality metrics are obtained by computing one or several numbers as a function of the difference image between the original and coded images. Such metrics may not reflect the improvement in performance. In a recent paper we presented a methodology that allows the quantitative determination of image quality when the image has been processed prior to encoding. We now present an extension of that work, showing global objective quality measures that quantify the value of pre-processing for image coding with a wavelet coder. Because many options are available in wavelet coder design, we limit our study to a "best" coder obtained in previous work and determine what further performance improvement can be achieved by image processing.
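The most common metric of the kind the abstract describes, a single number computed from the difference image, is PSNR. A minimal sketch (the paper's own global measures are more elaborate; this is only the baseline metric it contrasts against):

```python
import math

def psnr(original, coded, peak=255.0):
    """Peak signal-to-noise ratio: a single-number objective metric
    computed from the difference image between the original and
    coded images (lists of pixel rows)."""
    diffs = [a - b
             for row_o, row_c in zip(original, coded)
             for a, b in zip(row_o, row_c)]
    mse = sum(d * d for d in diffs) / len(diffs)
    if mse == 0:
        return float('inf')  # images are identical
    return 10.0 * math.log10(peak * peak / mse)
```

A metric like this illustrates the abstract's point: if pre-processing deliberately removes detail before encoding, the difference image grows and PSNR drops even when perceived quality improves.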
ISBN (print): 0780387201
We consider Slepian-Wolf code design (or source coding with side information) based on LDPC codes. We show that density evolution, defined in conventional channel coding, can be used to analyze Slepian-Wolf coding performance provided that a certain symmetry condition, dubbed dual symmetry, is satisfied by the hypothetical channel between the source and the side information. Exploiting this analysis, we design an efficient LDPC-code-based Slepian-Wolf coding scheme and apply it to the quadratic Gaussian Wyner-Ziv problem.
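In the standard LDPC approach to Slepian-Wolf coding, the encoder transmits only the syndrome of the source word, and the decoder recovers the source from the syndrome plus the side information. A minimal sketch of the encoder side, assuming a toy dense parity-check matrix (real designs use large sparse matrices and belief-propagation decoding, omitted here):

```python
def syndrome(H, x):
    """Slepian-Wolf encoding: send only the syndrome s = H x (mod 2),
    compressing n source bits to m = n - k syndrome bits.
    H: m x n parity-check matrix as lists of 0/1 rows; x: n source bits.
    The decoder recovers x from s and the correlated side information y."""
    return [sum(h & b for h, b in zip(row, x)) % 2 for row in H]

# toy example: 3 source bits compressed to 2 syndrome bits
s = syndrome([[1, 1, 0], [0, 1, 1]], [1, 0, 1])
```

The compression rate m/n is what the density-evolution analysis in the paper predicts: dual symmetry is the condition under which that channel-coding analysis carries over to the source-coding setting.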
ISBN (print): 9781457702518
Although IBDI, a coding tool that increases internal bit depth to improve compression performance for high-quality video, significantly improves coding efficiency, it introduces an internal memory increment problem because reference frames must be stored. Memory compression algorithms have therefore been proposed to solve the internal memory increment problem while maintaining the coding performance of IBDI. These methods reduce the reference frame memory while preserving coding efficiency by dividing a reference frame into fixed-size processing units and using additional information for each unit. However, the additional information for each processing unit also has to be stored in internal frame memory; its amount can limit the effectiveness of a memory compression scheme. To relax this limitation of the previous method, we propose a selective merging-based reference frame memory compression algorithm that dramatically reduces the amount of additional information. Simulation results show that the proposed algorithm incurs much smaller overhead than the previous algorithm while preserving the coding efficiency of IBDI.
ISBN (print): 9781424458219; 9781424458240
Distributed video coding (DVC) is a new video coding paradigm that shifts complexity from the encoder to the decoder. In DVC, the side information is known only at the decoder, which exploits the source statistics; the encoder therefore has lower complexity and the decoder higher complexity. The reconstruction function plays an important role in DVC, as it can improve the quality of the decoded video sequences. In previous studies, pixel values of video frames were modeled as continuous random variables, which is impractical. In this paper, we propose a novel reconstruction algorithm in which pixel values are treated as discrete random variables. We also present a method to model the correlation noise at the encoder to obtain a more accurate estimate of the correlation noise model (CNM). Experimental results show that the proposed reconstruction algorithm is comparable to the optimal reconstruction algorithm and achieves better rate-distortion performance than the straightforward reconstruction algorithm.
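A common form of DVC reconstruction takes the conditional expectation of the pixel over its decoded quantization bin, given the side-information pixel. A minimal sketch of that idea with discrete pixel values, as the abstract's premise requires; the Laplacian correlation-noise model and the parameter `alpha` are illustrative assumptions, not the paper's CNM:

```python
import math

def reconstruct(bin_low, bin_high, y, alpha):
    """Discrete MMSE reconstruction of a pixel known to lie in the
    decoded quantization bin [bin_low, bin_high], given the
    side-information pixel y: E[X | X in bin, Y = y] under an
    assumed Laplacian correlation noise p(x|y) ~ exp(-alpha*|x-y|).
    Pixel values are treated as discrete integers."""
    values = range(bin_low, bin_high + 1)
    weights = [math.exp(-alpha * abs(x - y)) for x in values]
    return sum(x * w for x, w in zip(values, weights)) / sum(weights)
```

When y lies inside the bin the estimate is pulled toward y; when y lies outside, the estimate is clamped toward the nearer bin edge, which is what makes such reconstruction outperform simply emitting the bin midpoint.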
Disparity compensation is widely used to remove the spatial redundancies in stereoscopic image coding. In this paper, we present a disparity estimation algorithm called hierarchical block correlation, which is a hybri...