This paper investigates the impact of various codecs and coding parameters on the power consumption of video decoding on mobile handheld devices. Given the limited battery capacity of mobile devices and the growing demand for high-quality video content, power management has become a vital concern for such devices. Besides decoding complexity, the file size produced by the investigated codecs is also taken into account. Using the SSIM (Structural SIMilarity) index, the measured results for power consumption and storage demand are correlated with perceived video quality.
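The quality correlation described above relies on SSIM. Below is a minimal sketch of computing SSIM between a reference frame and a decoded frame, assuming scikit-image is available; the file names are hypothetical.

```python
# Minimal SSIM computation between a reference frame and a decoded frame.
# Assumes scikit-image; the file names are hypothetical placeholders.
from skimage import io
from skimage.color import rgb2gray
from skimage.metrics import structural_similarity as ssim

ref = rgb2gray(io.imread("frame_reference.png"))   # original frame, float in [0, 1]
dec = rgb2gray(io.imread("frame_decoded.png"))     # frame after lossy decoding

# data_range must match the dynamic range of the float images (here 0..1)
score = ssim(ref, dec, data_range=1.0)
print(f"SSIM: {score:.4f}")   # 1.0 means identical; lower means more distortion
```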
Data compression is widely used in many scientific areas and, transparently, in various daily-life activities. Lossy data compression, e.g. in JPEG images, discards part of the original information, but usually only parts that are not essential. In this paper we present a software system for evaluating the quality of compressed JPEG images. The system can act on different stages of the JPEG pipeline and allows the resulting changes in quality and compression to be checked.
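As a rough illustration of the kind of measurement such an evaluation system performs, the sketch below re-encodes one image at several JPEG quality settings and records file size and PSNR; Pillow and scikit-image are assumed, and the input file name is hypothetical.

```python
# Re-encode an image at several JPEG quality settings and record
# the resulting file size and PSNR against the original.
# Assumes Pillow, numpy and scikit-image; "input.png" is a hypothetical file.
import io
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio as psnr

original = Image.open("input.png").convert("RGB")
ref = np.asarray(original)

for quality in (90, 70, 50, 30, 10):
    buf = io.BytesIO()
    original.save(buf, format="JPEG", quality=quality)
    size_kb = buf.tell() / 1024
    buf.seek(0)
    decoded = np.asarray(Image.open(buf).convert("RGB"))
    print(f"quality={quality:2d}  size={size_kb:7.1f} kB  "
          f"PSNR={psnr(ref, decoded, data_range=255):5.2f} dB")
```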
With the growth of multimedia technology, the demand for high-speed, real-time image compression systems has also increased. The JPEG 2000 standard was developed to cater to such application requirements. However, the sequential execution of the bit plane coder (BPC) used in this standard consumes many clock cycles. To improve the performance of the BPC, a new concurrent context modeling technique is proposed in this paper. To study the number of contexts generated in each clock cycle, an analysis is carried out on five 512 × 512 ISO grayscale images. The study reveals that, about 58% of the time, more than 4 contexts are generated in one clock cycle. A concurrent context coding architecture is therefore proposed. It is implemented on a Stratix FPGA, and its hardware requirement is reduced significantly compared with similar architectures. Moreover, the number of clock cycles required to encode a bit plane is reduced by 10%, making the design at least 2.5 times faster than similar existing designs. It operates at 164.47 MHz, which makes it suitable for encoding HDTV 1920 × 1080 4:2:2 video at 39 frames per second.
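A quick back-of-envelope check of the quoted throughput, assuming the coder processes roughly one sample per clock cycle (an assumption, not stated in the abstract):

```python
# Back-of-envelope throughput check for 1920x1080 4:2:2 at the quoted clock.
# Assumes roughly one sample processed per clock cycle (an assumption).
clock_hz = 164.47e6
luma = 1920 * 1080                  # luma samples per frame
chroma = 2 * (960 * 1080)           # 4:2:2 -> two half-width chroma planes
samples_per_frame = luma + chroma   # 4,147,200 samples

fps = clock_hz / samples_per_frame
print(f"~{fps:.1f} frames/s")       # ~39.7 fps, consistent with the 39 fps claim
```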
We propose an MPEG-2 to H.264 transcoding method for interlaced streams in which frame and field macroblocks are intermingled. The method uses the encoding information from the MPEG-2 stream and keeps as many DCT coefficients of the original MPEG-2 bitstream as possible. Experimental results show that the proposed method improves PSNR by about 0.19–0.31 dB compared with a conventional method.
In this paper, we propose a simple, non-iterative method for reducing blocking artifacts in block discrete cosine transform (DCT) compressed images, using an adaptive bilateral filter. When applied to an input image, the bilateral filter smooths out blocking artifacts by weighted averaging of pixel values without smoothing edges. Proper selection of the bilateral filter parameters is very important and significantly affects the filtering results. We select the parameters optimally through an empirical study and make them adaptive to decompressed images produced with different quantization tables. Our main contribution is this optimal, adaptive, empirically driven parameter selection in the context of image deblocking. The proposed method shows highly encouraging results, both objectively and subjectively, when compared with many state-of-the-art image deblocking schemes, including a recent method based on the adaptive bilateral filter.
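A minimal sketch of bilateral filtering applied to a decompressed image with OpenCV follows; the parameter values are placeholders, not the optimal, quantization-adaptive values the paper derives.

```python
# Apply a bilateral filter to a block-DCT decompressed image to reduce
# blocking artifacts. Parameter values below are placeholders; the paper
# selects them optimally and adapts them to the quantization table used.
import cv2

decompressed = cv2.imread("decoded_jpeg.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

d = 7             # neighbourhood diameter
sigma_color = 20  # range (intensity) sigma: larger smooths across bigger intensity steps
sigma_space = 3   # spatial sigma: larger averages over a wider window

deblocked = cv2.bilateralFilter(decompressed, d, sigma_color, sigma_space)
cv2.imwrite("deblocked.png", deblocked)
```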
This article proposes a new algorithm for handling color image sequences. The algorithm processes sequences of three consecutive frames on the same principle as fixed images. The goal is to improve the quality of the reconstructed video frames, even when transmission degrades visual quality, while keeping the transmission process as simple as possible. The algorithm combines a wavelet transform (WT), vector quantization (VQ), and color image sequences in YCbCr format. The joint use of the whole transmission chain makes it fully adaptable to complex, low-data-rate channels. Simulation results show improved performance compared with an MPEG-4 codec.
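A toy sketch of the wavelet-plus-vector-quantization building blocks on a single frame channel, assuming PyWavelets and scikit-learn; this illustrates WT + VQ only, not the authors' three-frame YCbCr scheme.

```python
# Toy wavelet transform + vector quantization on a single luma channel.
# Assumes PyWavelets, scikit-learn and numpy; this illustrates the WT + VQ
# building blocks only, not the full three-frame YCbCr scheme of the paper.
import numpy as np
import pywt
from sklearn.cluster import KMeans

frame = np.random.rand(256, 256)              # stand-in for a Y-channel frame

# One-level 2-D wavelet decomposition
cA, (cH, cV, cD) = pywt.dwt2(frame, "haar")

# Vector-quantize the detail coefficients with 4-dimensional vectors
details = np.stack([cH, cV, cD]).reshape(-1, 4)
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(details)
indices = codebook.predict(details)           # symbols that would be transmitted
quantized = codebook.cluster_centers_[indices].reshape(3, *cH.shape)

# Reconstruct the frame from the approximation and the quantized details
recon = pywt.idwt2((cA, tuple(quantized)), "haar")
```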
In this paper, we investigate the effects of lossy image compression on the performance of competitive-code-based palmprint verification algorithms. The effects of lossy compression on palmprint recognition performance are of interest in applications where image storage space and transmission time are critical. To reconcile recognition accuracy with bandwidth and storage constraints, we test the effects of lossy compression on the competitive code under different transforms, quantization schemes, and encoding algorithms, using a publicly available palmprint database. Experiments performing recognition directly in the compressed domain show that the variation of the feature template extracted from the compressed palmprint image is not consistent with the image distortion. Interestingly, the effect of JPEG is more severe than that of JPEG2000 when the compression ratio is larger than about 11, while the effect of JPEG2000 is more severe than that of JPEG when the compression ratio is smaller than about 11. In addition, the trellis-coded quantization method is not suitable for compressing palmprint images.
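A minimal sketch of competitive-code-style feature extraction (per-pixel winning Gabor orientation), assuming OpenCV; the filter parameters are illustrative, and the compression-ratio sweep itself is omitted.

```python
# Competitive-code-style feature extraction: for each pixel, record the index
# of the Gabor orientation giving the minimal (most negative) filter response.
# Assumes OpenCV and numpy; filter parameters are illustrative only.
import cv2
import numpy as np

palm = cv2.imread("palmprint.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)  # hypothetical file

orientations = [k * np.pi / 6 for k in range(6)]   # six orientations, as in the competitive code
responses = []
for theta in orientations:
    kernel = cv2.getGaborKernel(ksize=(17, 17), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    responses.append(cv2.filter2D(palm, cv2.CV_32F, kernel))

# The competitive code keeps, per pixel, the orientation of minimal response
code = np.argmin(np.stack(responses), axis=0)       # feature template with values in 0..5
```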
ISBN (print): 9781424482313
Current image steganographic detection algorithms cannot make full use of the geometry of unlabeled image examples; detection performance is limited by the few labeled examples used for training. In this paper, we propose an effective steganographic detection method for JPEG images that relies on the overall dataset. The method combines a semi-supervised kernel with the available unlabeled examples: a data adjacency graph is constructed to obtain the Gram matrix, and the graph Laplacian is then incorporated into kernel-based algorithms, effectively integrating the cluster assumption and the manifold assumption. Our method exploits the geometry of all examples through manifold regularization to produce smooth decision functions, thereby improving the performance of universal steganographic detection. Experimental results show the effectiveness of the proposed method.
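A minimal sketch of the graph-construction step behind such a semi-supervised kernel, assuming scikit-learn and scipy; the feature matrix is a stand-in, and the way the Laplacian is folded back into the Gram matrix below is one common manifold-regularization construction, not necessarily the paper's exact formulation.

```python
# Build a data adjacency graph over all (labeled + unlabeled) feature vectors,
# form its graph Laplacian, and deform an RBF Gram matrix with it.
# Assumes numpy, scipy and scikit-learn; X is a hypothetical feature matrix,
# and the deformation below is one common manifold-regularization construction.
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import kneighbors_graph

X = np.random.rand(200, 48)                 # stand-in for JPEG steganalysis features

W = kneighbors_graph(X, n_neighbors=8, mode="connectivity", include_self=False)
W = 0.5 * (W + W.T)                         # symmetrize the adjacency graph
L = laplacian(W).toarray()                  # graph Laplacian L = D - W

K = rbf_kernel(X, gamma=0.5)                # base Gram matrix
gamma_I = 0.1                               # manifold regularization weight
M = gamma_I * L
K_tilde = K - K @ np.linalg.inv(np.eye(len(X)) + M @ K) @ M @ K   # deformed Gram matrix
```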
In this paper, a novel macroblock-level hybrid temporal-spatial video coding framework is proposed. In this framework, a new Hybrid Temporal Spatial Prediction (HTSP) coding mode is adopted for each macroblock. Each macroblock is divided into two partitions: the first is temporally predicted using motion compensation and encoded, while the second is spatially predicted from the reconstruction of the first. Experimental results show coding gains of up to 0.4 dB, and 0.2 dB on average, for HD sequences.
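A schematic numpy sketch of the two-partition idea on one 16×16 macroblock; the partition shape, the motion search, and the spatial predictor here are simplistic placeholders for illustration only, not the HTSP mode itself.

```python
# Schematic sketch of hybrid temporal-spatial prediction on one 16x16 macroblock:
# the top half is predicted temporally (motion compensation, here with a zero
# motion vector for simplicity) and the bottom half spatially from the
# reconstructed top half (here a simple vertical extrapolation). Placeholder logic.
import numpy as np

ref_frame = np.random.randint(0, 256, (64, 64)).astype(np.float32)   # previous frame
cur_frame = np.random.randint(0, 256, (64, 64)).astype(np.float32)   # current frame

y, x = 16, 16                                   # macroblock position
mb = cur_frame[y:y+16, x:x+16]

# Partition 1 (rows 0-7): temporal prediction with a zero motion vector
pred_top = ref_frame[y:y+8, x:x+16]
resid_top = mb[:8] - pred_top
recon_top = pred_top + resid_top                # residual assumed losslessly coded here

# Partition 2 (rows 8-15): spatial prediction from the reconstruction of partition 1
pred_bottom = np.repeat(recon_top[-1:, :], 8, axis=0)   # repeat last reconstructed row
resid_bottom = mb[8:] - pred_bottom
```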
A joint watermarking and compression (JWC) paradigm is considered for JPEG image compression, aiming at an efficient tradeoff among the embedding rate, compression rate, embedding distortion, and robustness against a class of "natural" signal processing attacks, including spatial filtering, image resizing and rotation, random row and column deletion, and JPEG2000 recompression. This paper makes two novel contributions. First, a new JWC embedding method, the joint odd-even watermarking and JPEG compression scheme, is proposed to optimize the compression rate and embedding distortion when watermarks are embedded into JPEG compressed images. Second, low-density parity-check codes are employed in the JWC system to obtain an efficient tradeoff between the embedding rate and robustness. Experimental results show that the proposed algorithm significantly outperforms the recently designed DEW (differential energy watermarking), DQW (differential quantization watermarking) and RA-SEC (repeat-accumulate code based selectively embedding in coefficients) schemes in terms of compression rate, embedding distortion, and robustness at the same embedding rate.
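A minimal sketch of odd-even (parity) embedding on the quantized DCT coefficients of one 8×8 block, assuming scipy; this illustrates the parity idea only, not the authors' full joint odd-even watermarking and JPEG compression scheme or the LDPC coding layer.

```python
# Parity-based (odd-even) embedding of watermark bits into quantized DCT
# coefficients of a single 8x8 block. Illustrates the odd-even idea only;
# the paper's full JWC scheme and its LDPC coding are not reproduced here.
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def idct2(coeffs):
    return idct(idct(coeffs, axis=0, norm="ortho"), axis=1, norm="ortho")

block = np.random.randint(0, 256, (8, 8)).astype(np.float64)  # stand-in image block
q_step = 16                                                    # single quantization step

coeffs = np.round(dct2(block) / q_step).astype(int)            # quantized DCT coefficients

bits = [1, 0, 1]                                               # watermark bits to embed
positions = [(2, 1), (1, 2), (2, 2)]                           # mid-frequency positions
for bit, (r, c) in zip(bits, positions):
    if coeffs[r, c] % 2 != bit:                                # force parity to match the bit
        coeffs[r, c] += 1 if coeffs[r, c] <= 0 else -1

watermarked = idct2(coeffs * q_step)                           # decoded, watermarked block
extracted = [coeffs[r, c] % 2 for r, c in positions]           # parity recovers the bits
```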