ISBN (print): 9798350348941; 9798350348934
Reed-Solomon (RS) codes are widely used in storage systems to ensure data reliability. In this paper, we first propose a new construction of RS codes with three to five parity symbols over a special finite field of size 256. We show that all the operations involved in the encoding/decoding process can be implemented using only XOR and cyclic-shift operations. Second, we present a fast encoding/decoding algorithm for our codes by designing a modified Reed-Muller (RM) transform with both small computational complexity and small space complexity. We show that our codes have much lower space complexity and nearly the same computational complexity as existing RM-based RS codes. Simulation results demonstrate that, under the evaluated parameters, our codes improve encoding and decoding throughput by 34.06% and 31.66%, respectively, compared with existing RM-based RS codes.
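The claim that all field arithmetic reduces to XOR and cyclic shift rests on the structure of the particular size-256 field the authors construct, which the abstract does not spell out. As a point of reference only, the sketch below implements GF(2^8) multiplication in the usual polynomial basis with the shift-and-XOR method and the AES reduction polynomial 0x11B; the polynomial choice and the representation are illustrative assumptions, not the paper's construction.

def gf256_mul(a: int, b: int, poly: int = 0x11B) -> int:
    # Shift-and-XOR multiplication in GF(2^8), polynomial-basis representation.
    # poly = 0x11B is the AES field polynomial x^8 + x^4 + x^3 + x + 1
    # (an illustrative choice; the paper constructs a different size-256 field).
    result = 0
    while b:
        if b & 1:
            result ^= a          # "add" the current multiple (addition is XOR)
        a <<= 1                  # multiply a by x
        if a & 0x100:
            a ^= poly            # reduce modulo the field polynomial
        b >>= 1
    return result

# Example: 0x57 * 0x83 = 0xC1 in the AES field (a standard test vector).
assert gf256_mul(0x57, 0x83) == 0xC1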
This article describes a fast Reed-Solomon encoding algorithm for codes with four to seven parity symbols. First, we show that the syndrome of Reed-Solomon codes can be computed via the Reed-Muller transform. Based on this result, the fast encoding algorithm is then derived. Analysis shows that the proposed approach asymptotically requires 3 XORs per data bit, an improvement over previous algorithms. Simulations demonstrate that the performance advantage of the proposed approach grows with the code length and that it outperforms other methods. In particular, when the number of parity symbols is five, the proposed approach is about twice as fast as other state-of-the-art methods.
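For context, a syndrome of an RS codeword is simply the received polynomial evaluated at consecutive powers of a fixed field element; the article's contribution is computing these values through an RM transform rather than directly. The direct (non-fast) evaluation below is a minimal reference sketch; the field polynomial 0x11B, the element alpha = 0x03 (a generator of that field's multiplicative group), and the evaluation points alpha^1..alpha^t are illustrative assumptions, not the article's parameters.

def gf_mul(a, b, poly=0x11B):
    # GF(2^8) multiplication by shift-and-XOR (illustrative field polynomial).
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def gf_pow(a, n):
    r = 1
    for _ in range(n):
        r = gf_mul(r, a)
    return r

def rs_syndromes(received, num_parity, alpha=0x03):
    # Evaluate the received polynomial at alpha^1 .. alpha^num_parity using
    # Horner's rule; 'received' lists coefficients from highest degree down.
    # All-zero syndromes indicate an error-free codeword.
    out = []
    for j in range(1, num_parity + 1):
        point = gf_pow(alpha, j)
        s = 0
        for coeff in received:
            s = gf_mul(s, point) ^ coeff
        out.append(s)
    return out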
Low-density parity-check (LDPC) codes are linear block codes whose performance approaches the Shannon limit. Compared with other coding schemes, they offer low decoding complexity and a flexible structure, which has made them a research hotspot. In recent years, LDPC encoding and decoding algorithms have been improved and are now widely used in deep-space communications, optical-fiber communications, mobile communications, underwater communications, and other fields. Chaotic image encryption can be realized by single-layer mapping of a chaotic sequence combined with pixel scrambling; however, because some chaotic sequences have poor robustness, it is difficult to guarantee that the information survives an attack during channel transmission. In this paper, LDPC codes are introduced into chaotic image encryption, and block coding is used to provide error protection, improving the stability of the image transmission process and the transmission quality.
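The "single-layer mapping plus pixel scrambling" step can be pictured with the standard logistic map, a common choice in chaotic image encryption. The sketch below generates a chaotic sequence and uses its sort order as a pixel permutation; the map, its parameters (r = 3.99, x0 = 0.3567), and the permutation rule are illustrative assumptions, and the paper's LDPC error-protection stage is not shown.

import numpy as np

def logistic_sequence(n, x0=0.3567, r=3.99):
    # Chaotic sequence from the logistic map x <- r * x * (1 - x); the seed x0
    # and parameter r act as the (illustrative) secret key.
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def scramble_pixels(image):
    # Single-layer mapping: the sort order of the chaotic sequence defines a
    # pixel permutation over the flattened image.
    flat = image.flatten()
    perm = np.argsort(logistic_sequence(flat.size))
    return flat[perm].reshape(image.shape), perm

def unscramble_pixels(scrambled, perm):
    # Inverse permutation; the receiver regenerates perm from the shared key.
    flat = np.empty(scrambled.size, dtype=scrambled.dtype)
    flat[perm] = scrambled.flatten()
    return flat.reshape(scrambled.shape)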
Random codes based on quasigroups (RCBQ) are cryptcodes, i.e., error-correcting codes that also provide information security. Cut-Decoding and 4-Sets-Cut-Decoding algorithms for these codes are defined elsewhere, as is their performance for the transmission of text messages. In this study, the authors investigate the performance of RCBQ with the Cut-Decoding and 4-Sets-Cut-Decoding algorithms for the transmission of images and audio files through a Gaussian channel. They compare experimental results for both coding/decoding algorithms and for different values of the signal-to-noise ratio. In all experiments, the differences between the transmitted and the decoded image or audio file are considered. The experimentally obtained bit-error rate, packet-error rate, and decoding speed of the two algorithms are compared. In addition, two filters for enhancing the quality of images decoded using RCBQ are proposed.
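The comparison metrics mentioned here, bit-error rate and packet-error rate, can be computed directly from the transmitted and decoded byte streams. The sketch below shows one straightforward way to do so for uint8 arrays; the fixed packet length and the convention that any differing bit makes a packet erroneous are assumptions, since the study's exact definitions are not given in the abstract.

import numpy as np

def bit_error_rate(sent, received):
    # Fraction of differing bits between two equal-length uint8 arrays.
    diff = np.bitwise_xor(sent, received)
    return float(np.unpackbits(diff).mean())

def packet_error_rate(sent, received, packet_len):
    # Fraction of fixed-length packets containing at least one bit error
    # (assumes the total length is a multiple of packet_len).
    sent = sent.reshape(-1, packet_len)
    received = received.reshape(-1, packet_len)
    return float(np.any(sent != received, axis=1).mean())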
ISBN (print): 9783662474013; 9783662474006
In orthogonal frequency division multiple access (OFDMA) networks, relay nodes apply network coding to the received information to improve spectrum efficiency. However, network coding can spread errors: bit errors propagate along the relay's transmissions, resulting in a high BER at the receivers. To solve this error-spreading problem, we propose a new coding algorithm, the dualism network coding (DNC) algorithm, which effectively limits error propagation in 2-relay networks. In the proposed algorithm, the relay node reuses the bits decoded from the received information in the network coding of the next slot. Simulation results show that the proposed algorithm effectively mitigates error propagation.
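As background for what the relay is doing, conventional two-way relay network coding XORs the two users' bit streams and broadcasts the result, so each receiver can cancel its own contribution. The sketch below shows only this baseline operation; the DNC modification described in the abstract (reusing previously decoded bits in the next slot) is not reproduced here.

import numpy as np

def relay_encode(bits_a, bits_b):
    # Baseline two-way relay network coding: broadcast a XOR b.
    return np.bitwise_xor(bits_a, bits_b)

def receiver_decode(coded, own_bits):
    # Each destination removes its own contribution to recover the other's bits;
    # any bit error in 'coded' flips the recovered bit, which is the
    # error-spreading effect the DNC algorithm targets.
    return np.bitwise_xor(coded, own_bits)

# Example with random bits for two users.
a = np.random.randint(0, 2, 16)
b = np.random.randint(0, 2, 16)
assert np.array_equal(receiver_decode(relay_encode(a, b), a), b)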
This paper proposes a novel coding algorithm for lossy compression based on a scalar-quantization switching technique. Switching is performed by estimating the input variance and then coding with a Nonuniform Switched Scalar Compandor (NSSC). An accurate estimate of the input signal variance is needed to find the best compressor function for a compandor implementation; it enables the quantizers to be adapted to the maximal amplitudes of the input signals. In addition, we discuss the performance of coding schemes designed according to the waveform G.711 and G.712 standards and a recently presented codec standard for wideband speech and audio coding. We point out the benefits that can be achieved with our algorithm: higher quality and better compression. The main contribution of this model is achieving lossy compression with a higher signal-to-quantization-noise ratio (SQNR) over a wide range of signal volumes (variances), while retaining the necessary robustness over a broad range of input variances, together with its applicability to VoIP and to the effective coding of signals that, like speech, follow a Gaussian distribution and have time-varying characteristics.
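To make the companding and SQNR terminology concrete, the sketch below compands a Gaussian (speech-like) source with the plain G.711 mu-law characteristic, quantizes it uniformly, expands it, and measures the SQNR. The mu-law compandor, the 8-bit uniform quantizer, and the Gaussian test signal are illustrative stand-ins; the paper's switched NSSC design is not reproduced.

import numpy as np

MU = 255.0  # mu-law constant used by G.711

def mu_law_compress(x):
    # Compress a signal in [-1, 1] with the mu-law compressor function.
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def mu_law_expand(y):
    # Inverse of the compressor function.
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU

def quantize(x, bits=8):
    # Uniform quantization of the companded signal on [-1, 1].
    step = 2.0 / (2 ** bits)
    return np.clip(np.round(x / step) * step, -1.0, 1.0 - step)

def sqnr_db(x, x_hat):
    # Signal-to-quantization-noise ratio in dB.
    return 10.0 * np.log10(np.mean(x ** 2) / np.mean((x - x_hat) ** 2))

# Gaussian source, companded, quantized, expanded.
x = np.clip(np.random.normal(0.0, 0.2, 10000), -1.0, 1.0)
x_hat = mu_law_expand(quantize(mu_law_compress(x)))
print(f"SQNR = {sqnr_db(x, x_hat):.2f} dB")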
The quality of a reconstructed holographic three-dimensional image is seriously degraded by noise. A novel encoding algorithm combining the Burch code with the four-step phase-shifting method is presented to remove the noise and improve the contrast and resolution of the reconstructed image. The reconstructed three-dimensional images are compared with the results of median filtering, and the performance parameters of the two methods are analyzed. The experimental results show that the zero-order light spot, the conjugate image, and speckle noise are suppressed effectively, and the quality of the reconstructed image is noticeably improved. (C) 2013 Elsevier GmbH. All rights reserved.
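For reference, four-step phase shifting records four holograms with reference-beam phase shifts of 0, pi/2, pi, and 3*pi/2; pairwise subtraction cancels the zero-order and conjugate terms and leaves the complex object wave, which is why those artifacts are suppressed. The sketch below implements that standard combination; the sign convention and shift ordering are assumptions, and the paper's Burch-coding step is not included.

import numpy as np

def four_step_phase_shift(i1, i2, i3, i4):
    # Recover the (unscaled) complex object wave from four hologram intensities
    # recorded with reference phase shifts of 0, pi/2, pi and 3*pi/2.
    # The zero-order and conjugate terms cancel in the subtractions.
    return (i1 - i3) + 1j * (i4 - i2)

def object_phase(i1, i2, i3, i4):
    # Object phase in radians, wrapped to (-pi, pi].
    return np.angle(four_step_phase_shift(i1, i2, i3, i4))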
ISBN (print): 9781479954797
This paper proposes a structure for a digital coder that optimizes the multiplication circuit area, leading to a substantial saving in the main circuit surface and easing future decoding. The work describes the functioning process and the technical design, and tests the coder's main functions. The evolution of RS coding and the growing range of application domains are also briefly covered.
ISBN (print): 9781424441303
In coal gangue automatic selection systems, large numbers of coal gangue images are produced, and storing and transferring these images in real time is an urgent problem. This paper introduces an improved embedded zerotree wavelet (EZW) coding algorithm to compress coal gangue images. Based on wavelet transform theory, the basic features of the wavelet coefficients are analyzed. The zerotrees are divided into edge trees, texture trees, and smoothness trees, and different coding schemes are applied to each type. The low-frequency components of the coal gangue images are compressed losslessly, whereas the high-frequency wavelet coefficients in the different directions are processed with adaptive threshold values. An integer wavelet transform is adopted to preserve high fidelity in the inverse operation. The results show that the proposed algorithm improves compression efficiency and yields higher visual quality in the reconstructed images.
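The lossless low-frequency path relies on an integer wavelet transform, for which the lifting scheme guarantees exact reconstruction. The sketch below shows one level of the integer Haar (S) transform and its inverse as a minimal illustration; the paper's actual wavelet filters and decomposition depth are not specified in the abstract, so this choice is an assumption.

import numpy as np

def haar_int_forward(x):
    # One level of the integer Haar (S) transform via lifting: exactly invertible.
    a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = a - b                      # detail (high-frequency) coefficients
    s = b + (d >> 1)               # approximation (low-frequency) coefficients
    return s, d

def haar_int_inverse(s, d):
    # Undo the lifting steps in reverse order and re-interleave the samples.
    b = s - (d >> 1)
    a = b + d
    x = np.empty(s.size + d.size, dtype=np.int64)
    x[0::2], x[1::2] = a, b
    return x

# Round-trip check: the reconstruction is bit-exact (lossless).
x = np.array([5, 3, 7, 2, 9, 9, 0, 4])
s, d = haar_int_forward(x)
assert np.array_equal(haar_int_inverse(s, d), x)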
Purpose: The Food and Drug Administration's Mini-Sentinel pilot program initially aims to conduct active surveillance to refine safety signals that emerge for marketed medical products. A key facet of this surveillance is to develop and understand the validity of algorithms for identifying health outcomes of interest from administrative and claims data. This article summarizes the process and findings of the algorithm review for anaphylaxis. Methods: PubMed and Iowa Drug Information Service searches were conducted to identify citations applicable to the anaphylaxis health outcome of interest. Level 1 abstract reviews and Level 2 full-text reviews were conducted to find articles that used administrative and claims data to identify anaphylaxis and included validation estimates of the coding algorithms. Results: Our search revealed limited literature focusing on anaphylaxis that provided administrative and claims data-based algorithms and validation estimates. Only four studies identified via the literature searches provided validated algorithms; however, two additional studies were identified by Mini-Sentinel collaborators and were incorporated. The International Classification of Diseases, Ninth Revision, codes varied, as did the positive predictive value, depending on the cohort characteristics and the specific codes used to identify anaphylaxis. Conclusions: Research needs to be conducted on designing validation studies to test anaphylaxis algorithms and estimate their predictive power, sensitivity, and specificity. Copyright (C) 2012 John Wiley & Sons, Ltd.
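The validation measures discussed here (positive predictive value, sensitivity, specificity) are simple functions of a 2x2 table comparing algorithm-flagged cases against chart-confirmed cases. The sketch below computes point estimates from such counts; the example numbers are hypothetical and no confidence intervals are computed, which is a simplification for illustration rather than part of the review.

def validation_metrics(tp, fp, fn, tn):
    # Point estimates from a 2x2 table: algorithm-flagged vs. chart-confirmed cases.
    return {
        "ppv": tp / (tp + fp),          # P(true case | algorithm flags a case)
        "sensitivity": tp / (tp + fn),  # P(algorithm flags | true case)
        "specificity": tn / (tn + fp),  # P(algorithm negative | not a case)
    }

# Example with hypothetical counts.
print(validation_metrics(tp=40, fp=10, fn=20, tn=930))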