To perform reliable information processing in quantum computers, quantum error correction (QEC) codes are essential for detecting and correcting errors in the qubits. Among QEC codes, topological QEC codes require interactions only between neighboring qubits, a promising property that eases implementation requirements. In addition, this locality gives them unusual tolerance to local errors. Recently, various machine-learning-based decoding algorithms have been proposed to improve the decoding performance and latency of QEC codes. In this work, we propose a new decoding algorithm for surface codes, a type of topological code, using convolutional neural networks (CNNs) tailored to the topological lattice structure of the surface codes. In particular, the proposed algorithm exploits the syndrome pattern, which is represented as part of a rectangular lattice given to the CNN as its input. The remaining part of the rectangular lattice is filled with a carefully selected incoherent value for better logical error rate performance. In addition, we show how to optimize the hyperparameters of the CNN according to the lattice structure of a given surface code. This reduces the overall decoding complexity and makes the CNN-based decoder computationally more suitable for implementation. Numerical results show that the proposed decoding algorithm effectively improves the logical error rate compared with existing algorithms under various quantum error models.
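As a concrete illustration of feeding a syndrome pattern to a CNN with incoherent padding, here is a minimal sketch; the lattice layout, fill value, channel arrangement, output classes, and layer sizes are our assumptions, not the paper's tuned design:

```python
import numpy as np
import torch.nn as nn

def syndrome_to_lattice(x_syndrome, z_syndrome, d, fill=-1.0):
    """Embed the X- and Z-stabilizer syndromes of a distance-d surface
    code into a 2-channel (d+1) x (d+1) rectangular lattice.  Positions
    that carry no stabilizer outcome are padded with a fixed 'incoherent'
    fill value, mirroring the paper's idea of completing the lattice.
    The exact stabilizer layout here is a simplifying assumption."""
    lattice = np.full((2, d + 1, d + 1), fill, dtype=np.float32)
    for (r, c), s in x_syndrome.items():   # X-type checks on one sublattice
        lattice[0, r, c] = float(s)
    for (r, c), s in z_syndrome.items():   # Z-type checks on the other
        lattice[1, r, c] = float(s)
    return lattice

class SyndromeCNN(nn.Module):
    """Minimal CNN over the padded syndrome lattice.  Predicting one of
    4 logical classes (I, X, Y, Z) is a common formulation, assumed here;
    the layer widths are illustrative, not the paper's hyperparameters."""
    def __init__(self, d, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(32 * (d + 1) * (d + 1), n_classes)

    def forward(self, x):
        h = self.features(x)
        return self.head(h.flatten(1))
```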
Previous algorithms for decoding AG codes up to the designed distance all assumed the existence of an extra rational place on the base algebraic curve. That place is used to reduce the decoding problem to linear algebra over the base field of the curve. The rationality of the place is essential, so AG codes supported by all rational places on the curve are excluded from the domain of applicability of these decoding algorithms. This paper presents a decoding algorithm for those AG codes using an extra place of higher degree. Hence all AG codes, as Goppa defined them 40 years ago, are finally equipped with a fast decoding algorithm.
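For reference, the "designed distance" here is the standard Goppa bound; for a residue AG code C_Omega(D, G) on a curve of genus g it reads (textbook background, not a result of the paper):

```latex
d^{*} \;=\; \deg G - 2g + 2
```

so decoding up to the designed distance means correcting any pattern of at most \lfloor (d^{*}-1)/2 \rfloor errors.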
In this study, we primarily aim to address the exposure bias issue in long text generation that is intrinsic to statistical language models. We propose a sentence-level heuristic tree search algorithm, specially tailored for long text generation, that mitigates the problem by managing generated texts in a tree structure and curbing the compounding of biases. Our algorithm uses two pre-trained language models: an auto-regressive model for generating new sentences and an auto-encoder model for evaluating sentence quality. These models work in tandem to perform four critical operations: expanding the text tree with new sentences, evaluating the quality of the additions, sampling promising unfinished text fragments for further generation, and pruning leaf nodes deemed unpromising. This iterative process continues until a pre-defined number of [EOS] tokens has been produced, at which point we select the highest-scoring completed text as the final output. Moreover, we introduce two novel token-level decoding techniques: nucleus sampling with temperature and diverse beam search with sampling. These methods, integrated with our sentence-level search algorithm, improve the consistency and diversity of the generated text. Experimental results, from both automated measures (including Jaccard similarity, Word2vec similarity, and unique word ratio) and human evaluations (assessing consistency, fluency, and rhetorical skill), demonstrate that our approach considerably enhances the quality of machine-generated long-form text. Through this research, we hope to inspire further innovations in sentence-level search-based text generation algorithms.
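As an illustration of one of the two token-level techniques, here is a minimal sketch of nucleus (top-p) sampling with temperature scaling; the exact variant in the paper may differ, and the parameter defaults are arbitrary:

```python
import numpy as np

def nucleus_sample_with_temperature(logits, p=0.9, temperature=0.8, rng=None):
    """Nucleus (top-p) sampling after temperature scaling: keep the
    smallest set of tokens whose cumulative probability reaches p,
    renormalize, and sample one token id from that set."""
    rng = rng or np.random.default_rng()
    scaled = logits / temperature                 # <1 sharpens, >1 flattens
    probs = np.exp(scaled - scaled.max())         # numerically stable softmax
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]               # tokens by descending prob
    cutoff = int(np.searchsorted(np.cumsum(probs[order]), p)) + 1
    nucleus = order[:cutoff]                      # smallest set covering mass p
    return int(rng.choice(nucleus, p=probs[nucleus] / probs[nucleus].sum()))
```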
A class of linear codes that extends classical Goppa codes to a non-commutative context is defined. An efficient decoding algorithm, based on the solution of a non-commutative key equation, is designed. We show how the parameters of these codes, when the alphabet is a finite field, may be adjusted to propose a McEliece-type cryptosystem.
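For context, the commutative key equation that such decoders classically solve (for a Goppa code with syndrome polynomial S(x), error-locator sigma(x), error-evaluator omega(x), and error-correcting capacity t) is the following; the paper designs and solves a non-commutative analogue, presumably over a suitable skew/Ore polynomial ring (an assumption on our part):

```latex
\sigma(x)\, S(x) \;\equiv\; \omega(x) \pmod{x^{2t}}, \qquad \deg \omega < \deg \sigma \le t
```

In the classical case this is solved efficiently with the extended Euclidean algorithm, stopped halfway.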
ISBN (print): 9783037859391
In this paper, an improved belief propagation (BP) decoding algorithm is proposed for low-density parity-check (LDPC) codes. In the proposed decoding process, erroneous bits can be detected once more after the hard decision of the conventional BP decoding algorithm. The detection criterion is based on the characteristics of the check matrix and on the difference (D-value) between the prior and posterior probabilities. Simulation results demonstrate that the improved BP decoding algorithm outperforms the conventional BP decoding algorithm.
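A hedged sketch of what such a re-detection step could look like; the specific rule combining check-matrix information with the prior/posterior D-value is our reconstruction, not the paper's exact criterion:

```python
import numpy as np

def flag_suspect_bits(prior_llr, posterior_llr, hard_decision, H,
                      delta_threshold=2.0):
    """After the hard decision in BP, flag bits for re-detection: a bit
    is suspect when it participates in at least one unsatisfied check of
    H and the gap (the 'D-value') between its prior and posterior
    reliabilities is small.  Threshold and rule are assumptions."""
    syndrome = (H @ hard_decision) % 2            # 1 marks unsatisfied checks
    unsat_per_bit = H[syndrome == 1].sum(axis=0)  # unsatisfied checks per bit
    d_value = np.abs(posterior_llr - prior_llr)
    return np.where((unsat_per_bit > 0) & (d_value < delta_threshold))[0]
```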
ISBN (print): 9781479965885
The cyclic redundancy check (CRC) aided successive cancellation list (SCL) decoding algorithm achieves better error performance than the successive cancellation (SC) decoding algorithm for short and moderate-length polar codes. However, the CRC-aided SCL (CA-SCL) decoding algorithm still suffers from long decoding latency. In this paper, a reduced latency list decoding (RLLD) algorithm for polar codes is proposed. In the proposed RLLD algorithm, all rate-0 nodes and some rate-1 nodes are decoded instantly, without traversing the corresponding subtree. A list maximum-likelihood decoding (LMLD) algorithm is proposed to decode the maximum-likelihood (ML) nodes and the remaining rate-1 nodes. Moreover, a simplified LMLD (SLMLD) algorithm is proposed to reduce the computational complexity of the LMLD algorithm. Assuming a partially parallel list decoder architecture with list size L = 4, for an (8192, 4096) polar code the proposed RLLD algorithm reduces the number of decoding clock cycles and the decoding latency by factors of 6.97 and 6.77, respectively.
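The rate-0/rate-1 shortcuts themselves are standard in fast SC-type decoding; a minimal sketch (list handling and path-metric updates, which the paper's LMLD/SLMLD algorithms address, are omitted here):

```python
import numpy as np

def decode_special_node(alpha, node_type):
    """Shortcut decoding of a subtree in SC-type decoding, in the spirit
    of the RLLD idea: rate-0 and rate-1 nodes are decided directly from
    the soft LLRs 'alpha' entering the node, without subtree traversal."""
    if node_type == "rate0":   # all-frozen subtree: codeword is all zero
        return np.zeros_like(alpha, dtype=np.int8)
    if node_type == "rate1":   # all-information subtree: ML = hard decision
        return (alpha < 0).astype(np.int8)
    raise ValueError("only rate-0 and rate-1 shortcuts are sketched here")
```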
ISBN (print): 9781479943357
A new decoding algorithm for hidden Markov models (HMMs) is presented. As opposed to the commonly used Viterbi algorithm, it is based on the Min-Cut/Max-Flow algorithm instead of dynamic programming. Therefore, non-Markovian long-term dependencies can easily be added to influence the decoding path while still finding the optimal decoding in polynomial time. We demonstrate through an experimental evaluation how these constraints can be used to improve an HMM-based handwritten word recognition system that models words via linear character-HMMs, by restricting the length of each character.
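For contrast, here is the dynamic-programming Viterbi baseline that the min-cut formulation replaces; the flow-network construction itself is not reproduced here:

```python
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """Standard Viterbi decoding for an HMM with initial log-probs
    log_pi (S,), transition log-probs log_A (S, S), and emission
    log-probs log_B (S, V); 'obs' is a sequence of symbol ids.  The
    paper replaces this DP with a min-cut/max-flow formulation so that
    non-Markovian constraints (e.g. character-length limits) can be
    imposed while retaining polynomial-time optimality."""
    S, T = log_A.shape[0], len(obs)
    dp = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    dp[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_A      # (S, S): prev -> current
        back[t] = scores.argmax(axis=0)          # best predecessor per state
        dp[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(dp[-1].argmax())]                # backtrack from best end state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```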
The rapid progress of digital devices and technology, coupled with the emergence of the internet, has amplified the risks associated with malicious attacks. Consequently, it becomes crucial to protect valuable information transmitted over the internet. Steganography is a tried-and-true technique for hiding information within digital content such as pictures, text, audio, and video. Various image steganography methodologies have been developed recently. In image analysis, edge detection segments an image into well-defined areas. This paper introduces a novel image steganography algorithm combining edge detection and XOR coding techniques. The proposed approach conceals a confidential message within the spatial domain of the original image. Because the Human Visual System (HVS) is less sensitive to variations in sharp areas than in uniform regions, an edge detection algorithm is applied to identify edge pixels. Furthermore, to enhance efficiency and reduce the embedding impact, an XOR operation is used to embed the secret message in the Least Significant Bit (LSB) of those pixels. According to the experimental results, the proposed method embeds confidential data without causing noticeable modifications to the stego image, producing imperceptible stego images with minimal embedding distortion compared to existing methods. The PSNR values achieved by the proposed method are higher than the acceptable level, and the approach outperforms conventional methods in terms of image distortion.
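A minimal sketch of XOR-coded LSB embedding restricted to edge pixels, assuming an edge mask (e.g. from a Canny detector) is already available; the raster-order pixel selection and the key usage are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def embed_bits_xor_lsb(pixels, edge_mask, secret_bits, key_bits):
    """Embed secret bits (0/1 ints) in the LSBs of edge pixels: each
    secret bit is XOR-ed with a key bit and written into the LSB of the
    next edge pixel in raster order.  Assumes the image has at least
    len(secret_bits) edge pixels (capacity check omitted)."""
    flat = pixels.flatten().copy()
    positions = np.flatnonzero(edge_mask.flatten())[:len(secret_bits)]
    for i, pos in enumerate(positions):
        coded = secret_bits[i] ^ key_bits[i % len(key_bits)]  # XOR coding
        flat[pos] = (flat[pos] & 0xFE) | coded                # overwrite LSB
    return flat.reshape(pixels.shape)
```

Extraction reverses the process: read the LSBs at the same edge positions and XOR them with the shared key bits.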
ISBN (print): 9781450391405
A b-Byzantine-robust K-out-of-l private information retrieval ((b, K, l)-BRPIR) scheme allows a user to retrieve any item of a database from l servers, even if only K of the l servers respond and at most b of the K responding servers provide false answers. Existing BRPIR schemes require either an exponential-time decoding algorithm or a communication complexity no better than O(n^{1/(2k-1)} · kl log l) with k = K - 2b. In this paper, we show a new (b, K, l)-BRPIR scheme that has both a polynomial-time decoding algorithm and a communication cost of l · exp(O((log n)^{1-1/r} (log log n)^{1/r})) for r = log(K/(2b+1)), which is more efficient as n -> infinity.
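To make the two bounds concrete, a small numerical sketch with all hidden constants set to 1 and the logarithm base in r taken as 2 (both assumptions; the asymptotic comparison does not depend on them):

```python
import math

def old_cost(n, K, b, l):
    """Prior bound O(n^(1/(2k-1)) * k*l*log l) with k = K - 2b;
    hidden constant taken as 1 purely for illustration."""
    k = K - 2 * b
    return n ** (1 / (2 * k - 1)) * k * l * math.log(l)

def new_cost(n, K, b, l):
    """New bound l * exp(O((log n)^(1-1/r) (log log n)^(1/r))) with
    r = log(K/(2b+1)); base 2 and constant 1 are assumptions.
    Requires K > 2b + 1 (so r > 0) and n large enough that log log n > 0."""
    r = math.log(K / (2 * b + 1), 2)
    return l * math.exp(math.log(n) ** (1 - 1 / r)
                        * math.log(math.log(n)) ** (1 / r))
```

For n -> infinity the new cost is sub-polynomial in n, whereas the old bound grows polynomially as n^{1/(2k-1)}, matching the abstract's efficiency claim.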
Euclidean Geometry Low-Density Parity-Check (EG-LDPC) codes can be decoded in numerous ways, namely with the Soft Bit Flipping (SBF), Sequential Peeling Decoder (SPD), Belief Propagation Decoder (BPD), Majority Logic Decoder/Detector (MLDD), and Parallel Peeling Decoder (PPD) decoding algorithms. These algorithms provide an extensive range of trade-offs between decoding latency, power consumption, hardware complexity (required resources), and error-rate performance. The problem, therefore, is to devise a technique that handles both soft and burst errors for effective information transmission. This research proposes a technique named the Hybrid SBF (HSBF) decoder for EG-LDPC codes, which reduces decoding complexity and improves signal transmission and reception. In this paper, HSBF is also referred to as the Self-Reliability-based Weighted Soft Bit Flipping (SRWSBF) decoder. The results make it evident that the proposed technique outperforms the SBF, MLDD, BPD, SPD, and PPD decoding algorithms. Using Xilinx synthesis and a SPARTAN-3E device, a simulation model is designed to investigate latency, hardware utilization, and power consumption. Average latency is found to be reduced by 16.65 percent. For the synthesis parameters considered (number of 4-input LUTs, number of slices, and number of bonded IOBs, excluding the number of slice flip-flops), hardware utilization is reduced by an average of 4.25 percent. The slice flip-flop resource use of the proposed HSBF decoding algorithm is slightly higher than that of the other decoding algorithms, i.e., by 1.85 percent. Over the decoding algorithms considered in this study, the proposed approach reduces power consumption by an average of 41.68 percent. These algorithms are used in multimedia applications and in processing systems for security and information.
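One plausible reading of a reliability-weighted soft bit-flipping iteration; the actual HSBF/SRWSBF flip rule is not specified in the abstract, so the metric below is a generic WSBF-style assumption:

```python
import numpy as np

def weighted_soft_bit_flip(llr, H, max_iters=50):
    """Generic weighted soft bit-flipping sketch: per iteration, each
    bit accumulates a flip metric from the unsatisfied checks of H it
    participates in, weighted against its own soft reliability |LLR|;
    the least reliable bit is flipped until the syndrome clears."""
    hard = (llr < 0).astype(np.int8)
    reliability = np.abs(llr)
    for _ in range(max_iters):
        syndrome = (H @ hard) % 2
        if not syndrome.any():
            break                                   # valid codeword found
        unsat = H[syndrome == 1].sum(axis=0)        # unsatisfied checks per bit
        metric = reliability - unsat * reliability.mean()
        hard[int(metric.argmin())] ^= 1             # flip weakest bit
    return hard
```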