We prove an exponential decay concentration inequality to bound the tail probability of the difference between the log-likelihood of discrete random variables on a finite alphabet and the negative entropy. The concentration bound we derive holds uniformly over all parameter values. The new result improves the convergence rate in an earlier result of Zhao (2020), from (K^2 log K)/n = o(1) to (log K)^2/n = o(1), where n is the sample size and K is the size of the alphabet. We further prove that the rate (log K)^2/n = o(1) is optimal. The result is extended to misspecified log-likelihoods for grouped random variables. We give applications of the new result in information theory.
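As a rough illustration of the quantity studied in this abstract (not of the paper's proof or its exact bound), the sketch below simulates the gap between the normalized log-likelihood of an i.i.d. sample on a finite alphabet and the negative entropy -H(p); the alphabet size K, sample size n, number of trials, and the Dirichlet choice of p are illustrative assumptions.

```python
# Minimal simulation sketch; K, n, trials, and the Dirichlet draw of p are
# illustrative choices, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
K, n, trials = 50, 5_000, 1_000

p = rng.dirichlet(np.ones(K))      # a distribution on a K-letter alphabet
H = -np.sum(p * np.log(p))         # Shannon entropy in nats

# Per-trial gap between the normalized log-likelihood and the negative entropy:
# (1/n) * sum_i log p(X_i) - (-H(p)).
samples = rng.choice(K, size=(trials, n), p=p)
gaps = np.log(p)[samples].mean(axis=1) + H

# Empirical tail probabilities; the paper's result says these decay exponentially,
# uniformly over p, when (log K)^2 / n is small.
for t in (0.01, 0.02, 0.05):
    print(f"empirical P(|gap| > {t}) = {np.mean(np.abs(gaps) > t):.4f}")
```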
ISBN (print): 0780382803
This paper describes the encoding of a scalar unstable Markov process into two parallel fixed-rate bit streams. Source coding theorems for stable Markov and Gaussian autoregressive (ARMA) processes under mean-squared-error distortion are used in calculating a standard information-theoretic rate-distortion function.
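For reference, the "standard information-theoretic rate-distortion function" mentioned above is, for a stationary Gaussian source with power spectral density S(ω) under mean-squared-error distortion, the classical reverse water-filling characterization (textbook background for the stable, stationary case, not the paper's treatment of the unstable process):

$$
D(\theta) = \frac{1}{2\pi}\int_{-\pi}^{\pi} \min\{\theta,\, S(\omega)\}\, d\omega,
\qquad
R(D(\theta)) = \frac{1}{4\pi}\int_{-\pi}^{\pi} \max\Big\{0,\ \log\frac{S(\omega)}{\theta}\Big\}\, d\omega,
$$

where, for a stable Gaussian AR(1) source $X_t = a X_{t-1} + W_t$ with $|a| < 1$ and innovation variance $\sigma_W^2$, one would take $S(\omega) = \sigma_W^2 / |1 - a e^{-j\omega}|^2$.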
This paper presents the source coding theorem and its converse for discrete-time cyclostationary Gaussian sources with absolutely summable autocorrelation sequence. For the proofs of these theorems, a new definition for the rate distortion function R(D) for cyclostationary Gaussian sources is presented and the existence of R(D) is proved. The transform coding scheme is used to show the existence of the optimal code. With the proof of the source coding theorem and its converse, it is shown that the newly defined rate distortion function for cyclostationary sources is consistent with the usual concept of rate distortion function as defined for stationary sources.
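For context, the "usual concept of rate distortion function as defined for stationary sources" with which the new definition is shown to be consistent is the standard limiting mutual-information characterization (textbook background, not the paper's new definition for the cyclostationary case):

$$
R(D) \;=\; \lim_{n\to\infty}\, \frac{1}{n}\,
\inf_{P_{Y^n \mid X^n}\,:\ \mathbb{E}\big[\frac{1}{n}\sum_{i=1}^{n} d(X_i, Y_i)\big] \le D}
I(X^n; Y^n).
$$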
The (noiseless) fixed-length source coding theorem states that, except for outcomes in a set of vanishing probability, a source can be encoded at its entropy but not more efficiently. It is well known that the Asymptotic Equipartition Property (AEP) is a sufficient condition for a source to be encodable at its entropy. This paper shows that the AEP is necessary for the source coding theorem to hold for nonzero-entropy finite-alphabet sources. Furthermore, we show that a nonzero-entropy finite-alphabet source satisfies the direct coding theorem if and only if it satisfies the strong converse. In addition, we introduce the more general setting of nonserial information sources, which need not put out strings of symbols. In this context, which encompasses the conventional serial setting, the AEP is equivalent to the validity of the strong coding theorem. Fundamental limits for data compression of nonserial information sources are established based on the flat-top property, a new sufficient condition for the AEP.
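For reference, in the conventional serial setting with a stationary ergodic finite-alphabet source of entropy rate H, the AEP referred to above is the statement (textbook form, not the paper's more general nonserial formulation):

$$
-\frac{1}{n}\log P(X_1,\dots,X_n) \;\longrightarrow\; H \quad \text{in probability},
$$

so that for every $\epsilon > 0$ the typical set carries probability tending to one while containing at most $2^{n(H+\epsilon)}$ strings, which is exactly what permits fixed-length encoding at rates approaching H.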
Slepian, Wolf and Wyner proved famous source coding theorems for correlated i.i.d. sources. More recently, Han and Verdu have shown source and channel coding theorems for general sources and channels whose statistics can be arbitrary; that is, no assumption such as stationarity or ergodicity is imposed. We prove source coding theorems for correlated general sources by using the method which Han and Verdu developed to prove their theorems. Also, through an example, we show some new results which are essentially different from those already obtained for the i.i.d. source cases.
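For comparison, the classical i.i.d. result referenced above is the Slepian–Wolf theorem: for correlated i.i.d. sources $(X_i, Y_i)$, separate fixed-length encoding at rates $(R_1, R_2)$ with joint decoding succeeds with vanishing error probability exactly on (the closure of) the region

$$
R_1 \ge H(X \mid Y), \qquad R_2 \ge H(Y \mid X), \qquad R_1 + R_2 \ge H(X, Y).
$$

This is standard background; the abstract's contribution concerns the general case without stationarity or ergodicity assumptions, approached with Han and Verdu's information-spectrum method.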