This brief proposes a low-complexity first-two-minimum-values generator for a bit-serial scheme. Since the generators account for a significant portion of the hardware complexity of a min-sum low-density parity-check (LDPC) decoder, a low-complexity generator is crucially important. To reduce hardware complexity, an existing bit-serial generator that finds only one minimum value instead of two has been proposed; however, it can cause bit error rate (BER) degradation. By contrast, the proposed low-complexity bit-serial generator finds the exact first two minimum values and thus improves the BER performance. Moreover, the proposed generator does not suffer from any throughput loss, since its latency is almost the same as that of the existing generator.
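As a point of reference for what such a generator must compute, the following is a minimal software sketch of finding the first two minimum magnitudes among a check node's inputs. It is a functional model only, not the brief's bit-serial hardware architecture, and the function name is illustrative:

def first_two_minimums(magnitudes):
    """Return (min1, min2, idx_min1): the smallest magnitude, the second
    smallest magnitude, and the position of the smallest one, as needed by a
    min-sum check-node unit."""
    min1 = min2 = float("inf")
    idx_min1 = -1
    for i, m in enumerate(magnitudes):
        if m < min1:
            min1, min2, idx_min1 = m, min1, i
        elif m < min2:
            min2 = m
    return min1, min2, idx_min1

# Example: among [5, 2, 7, 1, 3] the two minima are 1 and 2, with 1 at index 3.
print(first_two_minimums([5, 2, 7, 1, 3]))  # (1, 2, 3)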
ISBN: 9781509016983 (Print)
5G sets stringent requirements on delay, which means large packets have to be segmented into shorter ones. However, the gain of turbo coding can be severely degraded at short code sizes. In this paper, a novel coding and decoding scheme is proposed to obtain high coding gain at short block sizes. The idea is to encode multiple short code blocks cooperatively to obtain a long effective code size. During encoding, an XOR operation is applied to certain parts of previously encoded blocks to introduce correlation. A successive interference cancelation (SIC) mechanism is applied in the decoding procedure, and each code block uses the extrinsic information from the correlated parts of the other code blocks. The scheme incurs no extra delay while reducing complexity. Simulation results show that the scheme can yield performance equivalent to or better than using a traditional turbo code on a long code block.
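The coupling step can be illustrated with a toy sketch, assuming one simple pattern in which part of each previously encoded block is XORed into the next block. The exact positions and amount of coupling are not given in the abstract, so overlap and xor_couple below are illustrative:

import numpy as np

def xor_couple(encoded_blocks, overlap):
    """Toy illustration of inter-block coupling: XOR the last `overlap` bits of
    the previous (coupled) block into the first `overlap` bits of the current
    encoded block, so that decoding one block yields extrinsic information
    about its neighbour."""
    coupled = [encoded_blocks[0].copy()]
    for k in range(1, len(encoded_blocks)):
        blk = encoded_blocks[k].copy()
        blk[:overlap] ^= coupled[k - 1][-overlap:]
        coupled.append(blk)
    return coupled

# Example with three random "encoded" blocks of 16 bits and a 4-bit overlap.
rng = np.random.default_rng(0)
blocks = [rng.integers(0, 2, 16, dtype=np.uint8) for _ in range(3)]
print(xor_couple(blocks, 4))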
ISBN: 9781479989997 (Print)
A review of self-correction applied to min-sum-based decoding algorithms is presented. The complexity, performance, and average number of iterations for each algorithm are shown.
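As background, a common formulation of the self-correction rule surveyed here erases a variable-to-check message whose sign flips between consecutive iterations. A minimal sketch of that rule:

import numpy as np

def self_correct(new_msgs, prev_msgs):
    """Erase (set to zero) any variable-to-check message whose sign differs
    from the message sent on the same edge in the previous iteration; keep the
    other messages unchanged."""
    new_msgs = np.asarray(new_msgs, dtype=float)
    prev_msgs = np.asarray(prev_msgs, dtype=float)
    flipped = np.sign(new_msgs) * np.sign(prev_msgs) < 0
    return np.where(flipped, 0.0, new_msgs)

print(self_correct([1.2, -0.4, 0.9], [0.8, 0.5, 1.1]))  # [1.2, 0.0, 0.9]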
An adaptive-normalized/offset min-sum (AN-/AO-MS) algorithm for decoding low-density parity-check (LDPC) codes is proposed. Unlike the normalized/offset min-sum (NMS/OMS) algorithm, the normalization/offset factor is adaptively adjusted according to the state of check nodes in each iteration. Simulation results show that the proposed AN-/AO-MS algorithm can perform better than the NMS/OMS algorithm while still preserving the low complexity of the min-sum algorithm.
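For context, the sketch below shows a standard offset min-sum check-node update together with one hypothetical adaptation rule driven by the number of unsatisfied checks. The paper's actual adaptation rule is not given in the abstract, so adaptive_offset and its parameters are purely illustrative:

import numpy as np

def offset_min_sum_check_update(v2c, offset):
    """Offset min-sum check-node update for a single check node: each outgoing
    magnitude is max(min of the other incoming magnitudes - offset, 0), with
    the usual product-of-signs rule."""
    v2c = np.asarray(v2c, dtype=float)
    signs = np.where(v2c >= 0, 1.0, -1.0)
    mags = np.abs(v2c)
    total_sign = np.prod(signs)
    out = np.empty_like(v2c)
    for i in range(v2c.size):
        min_other = np.delete(mags, i).min()
        out[i] = total_sign * signs[i] * max(min_other - offset, 0.0)
    return out

def adaptive_offset(unsatisfied_fraction, base_offset=0.5):
    """Hypothetical per-iteration rule: shrink the offset as fewer parity
    checks remain unsatisfied (for illustration only)."""
    return base_offset * unsatisfied_fraction

print(offset_min_sum_check_update([1.5, -0.7, 2.1], adaptive_offset(0.4)))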
ISBN: 9781424458516 (Print)
A modified min-sum algorithm for low-density parity-check codes is proposed in this paper. The inaccurate check node messages of the min-sum algorithm, compared with the belief propagation (BP) algorithm, are compensated by modified variable node messages, which is different from the normalized min-sum algorithm. The modification factor is obtained before decoding, and only one extra multiplication is needed per cycle, so the increase in complexity is fairly low. The simulation results show that the bit error rate (BER) and average number of iterations of the modified min-sum algorithm are very close to those of the BP algorithm; the modified algorithm improves the BER performance compared with the min-sum algorithm while adding only slight complexity.
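A minimal sketch of a variable-node update with a single pre-computed multiplicative factor, which is one way to realise the "one extra multiplication per cycle" described above. The placement of the factor is an assumption, and alpha and the function name are illustrative:

import numpy as np

def compensated_vn_update(channel_llr, c2v_msgs, alpha):
    """Variable-node update with a fixed compensation factor `alpha` applied to
    the incoming check-to-variable messages; alpha is computed once before
    decoding starts."""
    c2v_msgs = np.asarray(c2v_msgs, dtype=float)
    scaled = alpha * c2v_msgs
    total = channel_llr + scaled.sum()
    # The message to each check excludes that check's own (scaled) contribution.
    return total - scaled

print(compensated_vn_update(0.8, [1.2, -0.5, 0.3], alpha=0.9))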
Applying the max-product (and sum-product) algorithms to loopy graphs is now quite popular for best assignment problems. This is largely due to their low computational complexity and impressive performance in practice. Still, there is no general understanding of the conditions required for convergence, for optimality of converged solutions, or both. This paper presents an analysis of both attenuated max-product decoding and weighted min-sum decoding for low-density parity-check (LDPC) codes, which guarantees convergence to a fixed point when the weight factor β is sufficiently small. It also shows that, if the fixed point satisfies some consistency conditions, then it must be both a linear-programming (LP) and maximum-likelihood (ML) decoding solution. For (d_v, d_c)-regular LDPC codes, the weight factor must satisfy β(d_v − 1) < 1 to guarantee convergence to a fixed point, whereas the results of Frey and Koetter require instead that β(d_v − 1)(d_c − 1) < 1. In addition, the range of the weight factor for a provable ML decoding solution is extended to 0 < β(d_v − 1) ≤ 1, and counterexamples show that a fixed point might not be the ML decoding solution if β(d_v − 1) > 1. Finally, connections are explored with recent work on the threshold of LP decoding.
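The two convergence conditions stated above can be compared directly; a small sketch, treating each condition as a strict upper bound on the weight factor β:

def beta_bound_this_paper(dv):
    """Convergence to a fixed point is guaranteed when beta * (dv - 1) < 1,
    i.e. beta < 1 / (dv - 1)."""
    return 1.0 / (dv - 1)

def beta_bound_frey_koetter(dv, dc):
    """Earlier condition: beta * (dv - 1) * (dc - 1) < 1."""
    return 1.0 / ((dv - 1) * (dc - 1))

# For a (3, 6)-regular LDPC code the new bound is much less restrictive:
print(beta_bound_this_paper(3))       # 0.5
print(beta_bound_frey_koetter(3, 6))  # 0.1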
The problem of improving the performance of min-sum decoding of low-density parity-check (LDPC) codes is considered in this paper. Based on the min-sum algorithm, a novel modified min-sum decoding algorithm for LDPC codes is proposed. The proposed algorithm modifies the variable node message in the iteration process by averaging the new message and the previous message if their signs are different. Compared with the standard min-sum algorithm, the modification is achieved with only a small increase in complexity but significantly improves decoding performance for both regular and irregular LDPC codes. Simulation results show that the performance of the modified decoding algorithm is very close to that of the standard sum-product algorithm for moderate-length LDPC codes.
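A minimal sketch of the sign-based averaging step described above, applied element-wise to the new and previous variable-node messages (the surrounding decoder loop is omitted):

import numpy as np

def averaged_vn_message(new_msg, prev_msg):
    """If the new and previous variable-node messages disagree in sign, replace
    the new message by the average of the two; otherwise keep the new message."""
    new_msg = np.asarray(new_msg, dtype=float)
    prev_msg = np.asarray(prev_msg, dtype=float)
    disagree = np.sign(new_msg) * np.sign(prev_msg) < 0
    return np.where(disagree, 0.5 * (new_msg + prev_msg), new_msg)

print(averaged_vn_message([1.0, -0.6, 0.2], [0.4, 0.8, 0.1]))  # [1.0, 0.1, 0.2]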
ISBN: 9781424458219; 9781424458240 (Print)
An adaptive-normalized min-sum (AN-MS) algorithm for decoding low-density parity-check (LDPC) codes is proposed. Unlike the normalized min-sum (NMS) algorithm, the normalization factor is adaptively adjusted according to the state of check nodes in each iteration. Simulation results show that the proposed AN-MS algorithm can perform better than the NMS algorithm while still preserving the low complexity of the min-sum algorithm.
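For context, a sketch of the normalized min-sum check-node update with one hypothetical way of adapting the normalization factor from the check-node state. The abstract does not give the exact adaptation rule, so adaptive_alpha and its parameters are illustrative:

import numpy as np

def normalized_min_sum_check_update(v2c, alpha):
    """Normalized min-sum check-node update: the min-of-others magnitude is
    scaled by a normalization factor alpha (0 < alpha <= 1), with the usual
    product-of-signs rule."""
    v2c = np.asarray(v2c, dtype=float)
    signs = np.where(v2c >= 0, 1.0, -1.0)
    mags = np.abs(v2c)
    total_sign = np.prod(signs)
    return np.array([total_sign * signs[i] * alpha * np.delete(mags, i).min()
                     for i in range(v2c.size)])

def adaptive_alpha(num_unsatisfied, num_checks, a_min=0.7, a_max=0.9):
    """Hypothetical per-iteration adaptation: move the factor toward a_max as
    more parity checks become satisfied (for illustration only)."""
    satisfied = 1.0 - num_unsatisfied / num_checks
    return a_min + (a_max - a_min) * satisfied

alpha = adaptive_alpha(num_unsatisfied=12, num_checks=96)
print(normalized_min_sum_check_update([1.5, -0.7, 2.1], alpha))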