Summary form only given. In this paper, we bound the rate-distortion region for a four-node network. The results are the first known extension of rate-distortion theory from single-hop networks (where every source has a direct connection to each of its destinations) to multihop networks, which allow intermediate nodes. While single-hop network source coding solutions may be applied in multihop networks, such applications require explicit rate allocation for each source-destination pair, and the resulting solutions may be suboptimal. We therefore tackle the multihop network source coding problem directly using a diamond network.
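As a point of reference (not taken from the paper itself), the diamond network is usually drawn as a four-node, two-hop topology; a minimal sketch, assuming the standard labeling and with illustrative link-rate symbols $R_{12}, R_{13}, R_{24}, R_{34}$ that are not necessarily the paper's notation:
\[
1 \;\longrightarrow\; \{2,\,3\} \;\longrightarrow\; 4,
\]
i.e., the source node 1 feeds two intermediate nodes 2 and 3 over rate-limited links $R_{12}$ and $R_{13}$, each intermediate node forwards a re-encoded description to the destination node 4 over links $R_{24}$ and $R_{34}$, and node 4 must reconstruct the source within the target distortion.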
ISBN (digital): 9781665421591
ISBN (print): 9781665421607
This paper studies a special case of the problem of source coding with side information. A single transmitter describes a source to a receiver that has access to a side information observation that is unavailable at the transmitter. While the source and true side information sequences are dependent, stationary, memoryless random processes, the side information observation at the decoder is unreliable, which here means that it may or may not equal the intended side information and therefore may or may not be useful for decoding the source description. The probability of side information observation failure, caused, for example, by a faulty sensor or a source decoding error, is non-vanishing but is bounded by a fixed constant independent of the blocklength. This paper proposes a coding system that uses unreliable side information to achieve an efficient source representation subject to a fixed bound on the error probability. Results include achievability and converse bounds under two different models of the joint distribution of the source, the intended side information, and the side information observation.
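A minimal formalization of the "unreliable observation" assumption described above (the symbols $Y$, $Z$, and $p$ are ours, not necessarily the paper's): the decoder observes
\[
Z^n =
\begin{cases}
Y^n & \text{with probability at least } 1 - p,\\[2pt]
\text{an unrelated or corrupted sequence} & \text{with probability at most } p,
\end{cases}
\]
where $(X^n, Y^n)$ is drawn i.i.d. from the joint source distribution and the failure probability $p$ is a fixed constant that does not vanish with the blocklength $n$; the code must meet a prescribed error-probability bound under this observation model.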
We consider a lossless multi-terminal source coding problem with one transmitter, two receivers and side information. The achievable rate region of the problem is not well understood. In this paper, we characterise the rate region when the side information at one receiver is conditionally less noisy than the side information at the other, given this other receiver's desired source. The conditionally less noisy definition includes degraded side information and a common message as special cases, and it is motivated by the concept of less noisy broadcast channels. The key contribution of the paper is a new converse theorem employing a telescoping identity and the Csiszár sum identity.
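For reference, the Csiszár sum identity invoked in the converse reads as follows (standard form; the telescoping identity used alongside it is a closely related manipulation):
\[
\sum_{i=1}^{n} I\!\left(Y_{i+1}^{n};\, X_i \,\middle|\, X^{i-1}\right)
\;=\;
\sum_{i=1}^{n} I\!\left(X^{i-1};\, Y_i \,\middle|\, Y_{i+1}^{n}\right),
\]
which holds for any pair of random vectors $(X^n, Y^n)$ and is the standard tool for trading future terms of one sequence against past terms of the other in multi-terminal converses.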
ISBN (print): 142440505X
We characterize the rate distortion function for the source coding with decoder side information setting when the i-th reconstruction symbol is allowed to depend only on the first i + d side information symbols, for some finite lookahead d, in addition to the index from the encoder. For the case of causal side information, i.e., d = 0, we find that the penalty of causality is the omission of the subtracted mutual information term in the Wyner-Ziv rate distortion function. For d > 0, we derive a computable "infinite-letter" expression for the rate distortion function. When specialized to the near-lossless case, our results characterize the best achievable rate for the Slepian-Wolf source coding problem with limited side information lookahead, and have some surprising implications. We find that side information is useless for any fixed d when the joint PMF of the source and side information satisfies the positivity condition P(x, y) > 0 for all (x, y). More generally, the optimal rate depends on the distribution of the pair (X, Y) only through the distribution of X and the bipartite graph whose edges represent the pairs (x, y) for which P(x, y) > 0. On the other hand, if the side information lookahead d_n is allowed to grow faster than logarithmically in the blocklength n, then H(X|Y) is achievable. Finally, we apply our approach to derive a computable expression for channel capacity when state information is available at the encoder with limited lookahead.
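To make the "omitted subtracted term" statement concrete, recall the Wyner-Ziv rate-distortion function and its causal (d = 0) counterpart; the expressions below follow the standard formulations and are included for orientation rather than quoted from the paper:
\[
R_{\mathrm{WZ}}(D) \;=\; \min_{\substack{p(u\mid x),\, f:\;\; U - X - Y \\ \mathbb{E}\, d\bigl(X, f(U,Y)\bigr) \le D}} \bigl[\, I(X;U) - I(Y;U) \,\bigr],
\qquad
R_{d=0}(D) \;=\; \min_{\substack{p(u\mid x),\, f:\;\; U - X - Y \\ \mathbb{E}\, d\bigl(X, f(U,Y)\bigr) \le D}} I(X;U),
\]
so the causality penalty is exactly the dropped $I(Y;U)$ term.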
Signal representations based on low-resolution quantization of redundant expansions are an interesting source coding paradigm, the most important practical case of which is oversampled A/D conversion. Signal reconstruction from quantized coefficients of a redundant expansion and the accuracy of such representations are still not well understood; this paper studies these problems in finite-dimensional spaces. It has previously been proven that the accuracy of signal representations based on quantized redundant expansions, measured as the squared Euclidean norm of the reconstruction error, cannot be better than O(1/r^2), where r is the expansion redundancy. We give some general conditions under which 1/r^2 accuracy can be attained. We also suggest a form of structure for overcomplete families which facilitates reconstruction, and which enables efficient encoding of quantized coefficients with only a logarithmic increase of the bit-rate in redundancy.
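A minimal way to formalize the setting and the accuracy figures quoted above (the symbols $N$, $M$, and $\Delta$ are ours): a signal $x \in \mathbb{R}^N$ is expanded over an overcomplete family $\{\varphi_k\}_{k=1}^{M}$ with redundancy $r = M/N$, and the coefficients are quantized with a uniform scalar quantizer of step $\Delta$,
\[
c_k = \langle x, \varphi_k \rangle, \qquad \hat c_k = Q_\Delta(c_k), \qquad k = 1, \dots, M .
\]
Under the usual white quantization-noise approximation, plain linear reconstruction from a unit-norm tight frame gives a per-dimension mean squared error on the order of $\Delta^2/(12\,r)$, i.e., only $O(1/r)$ decay, whereas the lower bound cited in the abstract says that no reconstruction can do better than $O(1/r^2)$; the paper's contribution concerns conditions and structured families under which the $O(1/r^2)$ rate is actually attained.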
ISBN (print): 9781479904440
This paper considers source coding problems with the requirements of perfect secrecy and zero error at the receivers. In the problems considered here, there is always one transmitter, but there can be one or two receivers. Two different scenarios are considered, depending on whether or not the receivers' side information is present at the transmitter. By deriving bounds on the probability masses of the cipher-text and the key, the minimum transmission rate and key rate are characterized. Although zero-error capacities are typically difficult to characterize, the perfect secrecy constraint turns out to be precisely what simplifies the problems considered in this paper and makes them analytically tractable.
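For orientation, the perfect-secrecy and zero-error requirements can be written in the standard Shannon cipher-system form (our notation, not necessarily the paper's): with source $X^n$, key $K$, and transmitted cipher-text $M$,
\[
I\bigl(M;\, X^n\bigr) = 0
\qquad \text{and} \qquad
\Pr\bigl[\hat X^n \neq X^n\bigr] = 0,
\]
i.e., the cipher-text must be statistically independent of the source, while the legitimate receiver, using $M$, $K$, and its side information, reconstructs the source with zero error.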
ISBN (print): 9781479904440
The recently proposed set-up of source coding with a side information “vending machine” allows the decoder to select actions in order to control the quality of the side information. The actions can depend on the message received from the encoder and on the previously measured samples of the side information, and are cost constrained. Moreover, the final estimate of the source by the decoder is a function of the encoder's message and depends causally on the side information sequence. Previous work by Permuter and Weissman has characterized the rate-distortion-cost function in the special case in which the source and the “vending machine” are memoryless. In this work, motivated by the related channel coding model introduced by Kramer, the rate-distortion-cost function characterization is extended to a model with in-block memory. Various special cases are studied including block-feedforward and side information repeat request models.
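A minimal formalization of the operational model described above (notation ours, following the usual vending-machine setup rather than quoted from the paper): the encoder sends a message $M = f(X^n)$; the decoder selects actions, observes side information, and reconstructs causally,
\[
A_i = a_i\bigl(M,\, Y^{i-1}\bigr), \qquad
Y_i \sim P_{Y\mid X,A}\bigl(\cdot \mid X_i, A_i\bigr), \qquad
\hat X_i = g_i\bigl(M,\, Y^{i}\bigr),
\]
subject to an average action cost constraint $\tfrac{1}{n}\sum_{i=1}^{n}\mathbb{E}\,\Lambda(A_i) \le C$ and an average distortion constraint $\tfrac{1}{n}\sum_{i=1}^{n}\mathbb{E}\,d(X_i,\hat X_i) \le D$. The memoryless case of this model is the one characterized by Permuter and Weissman; the paper extends the characterization to in-block memory.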
ISBN (print): 0780382803
This paper studies source coding with feedforward for Gaussian sources. For a stationary memoryless Gaussian source with zero mean and fixed variance, under the mean squared error distortion measure, a deterministic scheme built from simple uniform scalar quantizers achieves the optimal Shannon rate-distortion bound. The construction is related to channel coding with feedback, which the decoder exploits in reconstructing the source.
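The "optimal rate-distortion bound" referenced here is the classical Gaussian rate-distortion function, stated for orientation (not quoted from the paper):
\[
R(D) = \frac{1}{2}\,\log\frac{\sigma^2}{D}, \qquad 0 < D \le \sigma^2 ,
\]
for a memoryless Gaussian source of variance $\sigma^2$ under mean squared error; the claim in the abstract is that, with feedforward, this bound is met by a deterministic scheme built from simple uniform scalar quantizers rather than by random vector quantization.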
We consider lossy source coding when side information affecting the distortion measure may be available at the encoder, decoder, both, or neither. For example, such distortion side information can model reliabilities for noisy measurements, sensor calibration information, or perceptual effects like masking and sensitivity to context. When the distortion side information is statistically independent of the source, we show that in many cases (e.g., for additive or multiplicative distortion side information) there is no penalty for knowing the side information only at the encoder, and there is no advantage to knowing it at the decoder. Furthermore, for quadratic distortion measures scaled by the distortion side information, we evaluate the penalty for lack of encoder knowledge and show that it can be arbitrarily large. In this scenario, we also sketch transform-based quantizer constructions which efficiently exploit encoder side information in the high-resolution limit.
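The "quadratic distortion measure scaled by the distortion side information" can be written explicitly (our notation): with source $X$, reconstruction $\hat X$, and distortion side information $Q$,
\[
d\bigl(x, \hat x;\, q\bigr) = q\,(x - \hat x)^2 ,
\]
so that $Q$ acts as a per-sample weight, modelling, e.g., measurement reliability or perceptual sensitivity; the results quoted above compare the rate-distortion trade-off when $Q$ is known at the encoder, the decoder, both, or neither.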