This paper provides the Shannon theoretic coding theorems on the success probabilities of the impersonation attack and the substitution attack against secret-key authentication systems. Though there are many studies that develop lower bounds on the success probabilities, their tight upper bounds are rarely discussed. This paper characterizes the tight upper bounds in an extended secret-key authentication system that includes blocklength K and permits the decoding error probability tending to zero as K → ∞. In the extended system an encoder encrypts K source outputs to K cryptograms under K keys and transmits the K cryptograms to a decoder through a public channel in the presence of an opponent. The decoder judges whether the K cryptograms received from the public channel are legitimate or not under the K keys shared with the encoder. It is shown that 2^{-KI(W;E)} is the minimal attainable upper bound of the success probability of the impersonation attack, where I(W;E) denotes the mutual information between a cryptogram W and a key E. In addition, 2^{-KH(E|W)} is proved to be the tight upper bound of the probability that the opponent can correctly guess the K keys from the transmitted K cryptograms, where H(E|W) denotes the conditional entropy of E given W.
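For orientation, a minimal Python sketch (using a hypothetical joint distribution of a cryptogram W and a key E, not taken from the paper) computes I(W;E) and H(E|W) and evaluates the two exponential bounds quoted above for an assumed blocklength K.

```python
# Toy sketch: evaluate the bounds 2^{-K I(W;E)} and 2^{-K H(E|W)}
# for a hypothetical joint pmf of a cryptogram W and a key E.
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint pmf p(w, e): rows index W, columns index E.
p_we = np.array([[0.25, 0.10],
                 [0.05, 0.30],
                 [0.20, 0.10]])

p_w = p_we.sum(axis=1)          # marginal of the cryptogram W
p_e = p_we.sum(axis=0)          # marginal of the key E

mi = entropy(p_w) + entropy(p_e) - entropy(p_we.flatten())   # I(W;E)
h_e_given_w = entropy(p_we.flatten()) - entropy(p_w)          # H(E|W)

K = 100  # assumed blocklength
print(f"I(W;E) = {mi:.4f} bits, impersonation bound 2^(-K*I) = {2**(-K*mi):.3e}")
print(f"H(E|W) = {h_e_given_w:.4f} bits, key-guessing bound 2^(-K*H) = {2**(-K*h_e_given_w):.3e}")
```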
Source coding problems are treated for Shannon's cipher system with correlated source outputs (X, Y). Several cases are considered based on whether both X and Y, only X, or only Y must be transmitted to the receiver, whether both X and Y, only X, or only Y must be kept secret, and whether the security level is measured by ((1/K)H(X^K|W), (1/K)H(Y^K|W)) or (1/K)H(X^K Y^K|W), where W is a cryptogram. The admissible region of cryptogram rate and key rate for a given security level is derived for each case. Furthermore, two new kinds of common information of X and Y, say C_1(X;Y) and C_2(X;Y), are considered. C_1(X;Y) is defined as the rate of the attainable minimum core of (X^K, Y^K) obtained by removing as much private information as possible from (X^K, Y^K), while C_2(X;Y) is defined as the rate of the attainable maximum core V_C such that if we lose V_C, then each uncertainty of X^K and Y^K becomes H(V_C). It is proved that C_1(X;Y) = I(X;Y) and C_2(X;Y) = min{H(X), H(Y)}. C_1(X;Y) justifies our intuitive feeling that the mutual information represents a common information of X and Y.
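To make the two common-information quantities concrete, the following toy computation (a hypothetical joint distribution, not from the paper) evaluates C_1(X;Y) = I(X;Y) and C_2(X;Y) = min{H(X), H(Y)} for a binary pair (X, Y).

```python
# Toy check of the two common-information values for an assumed joint pmf.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.55]])   # hypothetical joint pmf, rows = X, cols = Y

p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
mutual_info = entropy(p_x) + entropy(p_y) - entropy(p_xy.flatten())

print(f"C_1(X;Y) = I(X;Y)          = {mutual_info:.4f} bits")
print(f"C_2(X;Y) = min(H(X), H(Y)) = {min(entropy(p_x), entropy(p_y)):.4f} bits")
```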
The security level of the Shannon cipher system is traditionally measured by the equivocation (1/N)H(Y|Z), where Y is a secret plaintext with length N and Z is its cryptogram. But Merhav and Arikan have considered another security criterion, which is measured by the number of guesses needed for a wiretapper to uncover Y from Z. Merhav has also considered a third security criterion, which is measured by the probability of correct guess by a wiretapper. On the other hand, in the case of the traditional security criterion, Yamamoto has treated a coding problem for correlated source outputs X and Y such that only X is secret against wiretappers and only Y must be transmitted to a legitimate receiver. In this correspondence, coding theorems are proved for the case where Yamamoto's coding problem is applied to the Merhav-Arikan security criterion or Merhav's security criterion.
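The guessing criterion can be illustrated with a small hypothetical example (not the papers' coding schemes): a wiretapper who guesses the value of Y in decreasing order of probability minimizes the expected number of guesses, the quantity behind the Merhav-Arikan criterion.

```python
# Toy illustration of the guessing criterion for an assumed pmf of the plaintext Y.
import numpy as np

p_y = np.array([0.4, 0.3, 0.2, 0.1])          # hypothetical pmf of Y
order = np.argsort(-p_y)                      # optimal strategy: most likely first
guess_number = np.empty_like(order)
guess_number[order] = np.arange(1, len(p_y) + 1)

expected_guesses = np.sum(p_y * guess_number)
rho = 1.0                                     # moment order used in guessing exponents
rho_moment = np.sum(p_y * guess_number.astype(float) ** rho)

print(f"E[G(Y)]     = {expected_guesses:.3f}")
print(f"E[G(Y)^rho] = {rho_moment:.3f} (rho = {rho})")
```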
The present paper continues the investigation of the capacities of measurement (quantum-classical) channels in the most general setting, initiated in [A. A. Kuznetsova and A. S. Holevo, Theory Probab. Appl., 58 (2014), pp. 264-285]. The proof of coding theorems is given for the classical capacity and the entanglement-assisted classical capacity of the measurement channel with an arbitrary output alphabet, without assuming that the channel is given by a bounded operator-valued density.
The main result of this paper is the coding theorem for the entanglement-assisted capacity of a measurement channel with an arbitrary alphabet (Theorem 3). As a by-product, a number of results concerning entropies of hybrid (classical-quantum) systems are obtained, as well as the capacities of channels with hybrid output, which are used in the proof of Theorem 3.
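As a toy illustration of information transmission through a measurement channel (an assumed qubit ensemble and a computational-basis measurement, not the papers' general constructions), the snippet below computes the classical mutual information between the input label and the measurement outcome, a single-letter quantity of the kind optimized in such capacity formulas.

```python
# Toy measurement channel: a two-outcome POVM applied to a hypothetical
# two-state input ensemble; compute I(label; outcome).
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical input ensemble: |0> and |+>, each with probability 1/2.
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
rho1 = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)
priors = np.array([0.5, 0.5])

# Measurement channel: projective measurement in the computational basis.
povm = [np.array([[1, 0], [0, 0]], dtype=complex),
        np.array([[0, 0], [0, 1]], dtype=complex)]

# Joint pmf p(label, outcome) = prior * tr(rho_label * M_outcome).
joint = np.array([[priors[i] * np.real(np.trace(rho @ M)) for M in povm]
                  for i, rho in enumerate([rho0, rho1])])

mi = entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint.flatten())
print(f"I(label; outcome) = {mi:.4f} bits")
```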
ISBN (print): 9781457705953
We consider a situation where n-tuples generated from a general source are encoded by a fixed-length code and discuss coding theorems on the worst-case redundancy, where the worst-case redundancy is defined as the maximum of the difference between the rate and the ideal codeword length per symbol with respect to all the correctly decodable n-tuples. We treat the four cases where the decoding error probability ε_n is required to satisfy (a) lim_{n→∞} ε_n = 0, (b) liminf_{n→∞} ε_n = 0, (c) limsup_{n→∞} ε_n ≤ ε, and (d) liminf_{n→∞} ε_n ≤ ε, respectively, where ε ∈ [0, 1) is an arbitrary constant. We give general formulas of the optimum worst-case redundancy that are closely related to the width of the entropy-spectrum of a source.
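A brute-force toy example (an i.i.d. Bernoulli source and an ad hoc fixed-length code, not the paper's general-source setting) shows the worst-case redundancy in action: the maximum, over correctly decodable n-tuples, of the rate minus the ideal codeword length per symbol.

```python
# Toy setup: keep the 2^{nR} most probable n-tuples of a Bernoulli(q) source and
# compute max over decodable x^n of  R - (1/n) * (-log2 p(x^n)).
import itertools
import numpy as np

n, rate, q = 10, 0.95, 0.3                     # blocklength, code rate, P(symbol = 1)
tuples = list(itertools.product([0, 1], repeat=n))
log_probs = np.array([sum(np.log2(q) if s else np.log2(1 - q) for s in t) for t in tuples])

# Fixed-length code: encode the 2^{nR} most probable tuples, declare an error otherwise.
keep = int(2 ** (n * rate))
decodable = np.sort(log_probs)[::-1][:keep]    # log-probabilities of decodable tuples

ideal_lengths = -decodable / n                 # ideal codeword length per symbol
worst_case_redundancy = np.max(rate - ideal_lengths)
error_prob = 1.0 - np.sum(2.0 ** decodable)

print(f"worst-case redundancy      = {worst_case_redundancy:.4f} bits/symbol")
print(f"decoding error probability = {error_prob:.4f}")
```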
In the present communication, a parametric R-norm fuzzy entropy of a fuzzy variable and, subsequently, an R-norm fuzzy directed divergence measure of fuzzy variables have been proposed by taking the credibility distribution measure into account, along with proofs of their validity. Based on the proposed information measures, some new results, their various properties, and their monotonic behavior have been studied in detail. Further, some important results and inequalities related to the corresponding noiseless coding theorems have been established. Some particular cases of the proposed fuzzy information measures and the fuzzy mean codeword length, expressed in terms of existing results, have also been discussed. In view of the monotonic behavior of the fuzzy mean codeword length, an application has been presented for choosing an appropriate value of the parameter R, with possible variability, so as to significantly reduce the fuzzy mean codeword length for an optimal code.
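For orientation only, the snippet below evaluates the classical (probabilistic) R-norm entropy of Boekee and van der Lubbe, H_R(P) = R/(R-1) * (1 - (Σ_i p_i^R)^(1/R)), which the credibility-based fuzzy measure in the paper parallels; the distribution is hypothetical and this is not the paper's fuzzy-variable definition.

```python
# Classical R-norm entropy of an assumed probability vector, for several R.
import numpy as np

def r_norm_entropy(p, R):
    """Classical R-norm entropy for R > 0, R != 1."""
    p = np.asarray(p, dtype=float)
    return (R / (R - 1.0)) * (1.0 - np.sum(p ** R) ** (1.0 / R))

p = [0.5, 0.25, 0.125, 0.125]
for R in (0.5, 2.0, 4.0):
    print(f"R = {R}: H_R(P) = {r_norm_entropy(p, R):.4f}")
```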
ISBN (print): 9781538682098
Recently, the δ-mutual information between uncertain variables has been introduced as a generalization of Nair's non-stochastic mutual information functional [1, 2]. Within this framework, we introduce four different notions of capacity and present corresponding coding theorems. Our definitions include an analogue of Shannon's capacity in a non-stochastic setting and a generalization of the zero-error capacity. The associated coding theorems hold for stationary, memoryless, non-stochastic uncertain channels. These results establish the relationship between the δ-mutual information and our operational definitions, providing a step towards the development of a complete non-stochastic information theory.
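The zero-error notion that the non-stochastic framework generalizes can be illustrated by brute force (a hypothetical five-input channel whose inputs are confusable with their cyclic neighbours): the number of messages transmissible with zero error in a single channel use equals the size of a maximum independent set of the confusability graph.

```python
# Brute-force one-shot zero-error count for a 5-input "pentagon" confusability graph.
import itertools

n = 5
confusable = {(i, (i + 1) % n) for i in range(n)} | {((i + 1) % n, i) for i in range(n)}

def is_independent(subset):
    """True if no two inputs in the subset are confusable."""
    return all((a, b) not in confusable for a, b in itertools.combinations(subset, 2))

best = max((s for r in range(n + 1) for s in itertools.combinations(range(n), r)
            if is_independent(s)), key=len)
print(f"one-shot zero-error messages: {len(best)} (e.g. inputs {best})")
```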
ISBN (print): 9781665421607; 9781665421591
A new framework is presented in this paper for proving coding theorems for linear codes, in which the systematic bits and the corresponding parity-check bits play different roles. Precisely, the noisy systematic bits are used to limit the list size of typical codewords, while the noisy parity-check bits are used to select the maximum-likelihood codeword from the list. This framework allows the systematic bits and the parity-check bits to be transmitted in different ways and over different channels. In particular, it unifies the source coding theorems and the channel coding theorems. With this framework, we prove that Bernoulli generator matrix codes (BGMCs) are capacity-achieving over binary-input output-symmetric (BIOS) channels and also entropy-achieving for Bernoulli sources.
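A toy encoder sketches the general idea (an assumed illustration of systematic and parity-check bits, not the paper's proof machinery): a systematic linear code over GF(2) whose parity part comes from a random Bernoulli generator matrix, so the two groups of bits can in principle be transmitted in different ways or over different channels.

```python
# Systematic GF(2) encoder with a random Bernoulli parity-generator matrix.
import numpy as np

rng = np.random.default_rng(0)
k, m, p = 8, 4, 0.5                                    # info bits, parity bits, Bernoulli(p)

G_parity = (rng.random((k, m)) < p).astype(np.uint8)   # random Bernoulli parity generator

def encode(u):
    """Return (systematic bits, parity-check bits) for an info vector u over GF(2)."""
    u = np.asarray(u, dtype=np.uint8)
    parity = (u @ G_parity) % 2
    return u, parity

u = rng.integers(0, 2, size=k, dtype=np.uint8)
systematic, parity = encode(u)
print("systematic bits :", systematic)
print("parity bits     :", parity)
```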
We explore some basic properties of the coding theory of a general quantum communication channel and its operational capacity, including: 1) adaptive measurement with feedback code; 2) reconsideration of the single-letterized capacity formula; and 3) pseudoclassicality of a channel.