Author affiliations: Univ Buenos Aires, Fac Ingn, C1063ACV Buenos Aires, DF, Argentina; CSC CONICET, C1425FQD Buenos Aires, DF, Argentina; Univ Paris Sud, Cent Supelec, French Natl Ctr Sci Res, F-91190 Gif Sur Yvette, France; Univ Montreal, Montreal Inst Learning Algorithms, Montreal, PQ H3T 1N8, Canada
Publication: IEEE TRANSACTIONS ON INFORMATION THEORY
Year/Volume/Issue: 2019, Vol. 65, No. 2
Pages: 787-815
Core indexing:
Subject classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0701 [Science - Mathematics]; 0812 [Engineering - Computer Science and Technology (degrees awardable in Engineering or Science)]
Funding: CONICET [PIP11220150100578CO]; University of Buenos Aires [UBACyT 20020170100470BA]; CNRS under Grant PICS MoSiME; Universidad de Buenos Aires through the Peruilh Ph.D. Scholarship
Keywords: Multi-terminal source coding; logarithmic loss; distributed source coding; noisy rate-distortion; side information; interactive lossy source coding; information bottleneck; Shannon theory
Abstract: This paper investigates a multi-terminal source coding problem under a logarithmic-loss fidelity that does not necessarily lead to an additive distortion measure. The problem is motivated by an extension of the information bottleneck method to a multi-source scenario in which several encoders must cooperatively build rate-limited descriptions of their sources so as to maximize information about other unobserved (hidden) sources. More precisely, we study the fundamental information-theoretic limits of 1) the two-way collaborative information bottleneck (TW-CIB) problem and 2) the collaborative distributed information bottleneck (CDIB) problem. In TW-CIB, two distant encoders separately observe dependent components X_1 and X_2 and can cooperate through multiple exchanges of rate-limited information, with the aim of extracting information about hidden variables (Y_1, Y_2) that can be arbitrarily dependent on (X_1, X_2). In CDIB, two cooperating encoders separately observe X_1 and X_2, and a third node listens to the exchanges between the two encoders in order to obtain information about a hidden variable Y. Relevance (the figure of merit) is measured by a normalized (per-sample) multi-letter mutual information metric (log-loss fidelity), and an interesting tradeoff arises when the complexity of the descriptions is constrained, measured by the rates needed for the exchanges between the encoders and decoders involved. Inner and outer bounds on the complexity-relevance region of these problems are derived, from which optimality is characterized for several cases of interest. The resulting complexity-relevance regions are evaluated for binary symmetric and Gaussian statistical models, exhibiting the tradeoff between the complexity-constrained descriptions and their relevance with respect to the hidden variables.
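For orientation, the relevance-complexity tradeoff described in the abstract reduces, in the single-encoder case with no cooperation, to the classical information bottleneck of Tishby, Pereira, and Bialek, which the paper generalizes to the TW-CIB and CDIB settings. The Python sketch below iterates the standard single-encoder IB self-consistent equations on a binary symmetric pair (X, Y), similar to the binary model used in the paper's evaluation; it is only an illustration of that baseline tradeoff, not the paper's multi-terminal coding schemes, and all function and variable names are illustrative assumptions.

import numpy as np

def information_bottleneck(p_xy, n_t, beta, n_iter=200, seed=0):
    """Iterate the classical IB equations for a joint pmf p(x, y).

    p_xy : (|X|, |Y|) joint distribution of observed X and hidden Y
    n_t  : cardinality of the compressed description T
    beta : tradeoff parameter (larger beta favors relevance I(T;Y))
    Returns (I(X;T), I(T;Y)), i.e., complexity and relevance in nats.
    """
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)                      # marginal p(x)
    p_y = p_xy.sum(axis=0)                      # marginal p(y)
    p_y_given_x = p_xy / p_x[:, None]           # conditional p(y|x)

    # random soft assignment p(t|x); rows sum to 1
    p_t_given_x = rng.random((p_xy.shape[0], n_t))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        p_t = p_x @ p_t_given_x                 # p(t)
        p_xt = p_t_given_x * p_x[:, None]       # joint p(x, t)
        p_ty = p_xt.T @ p_y_given_x             # joint p(t, y)
        p_y_given_t = p_ty / p_t[:, None]       # conditional p(y|t)
        # KL(p(y|x) || p(y|t)) for every (x, t) pair
        kl = np.array([[np.sum(p_y_given_x[x] *
                               np.log((p_y_given_x[x] + 1e-32) /
                                      (p_y_given_t[t] + 1e-32)))
                        for t in range(n_t)]
                       for x in range(p_xy.shape[0])])
        # IB update: p(t|x) proportional to p(t) * exp(-beta * KL)
        p_t_given_x = p_t[None, :] * np.exp(-beta * kl)
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    # complexity I(X;T) and relevance I(T;Y) of the final description
    p_t = p_x @ p_t_given_x
    p_xt = p_t_given_x * p_x[:, None]
    p_ty = p_xt.T @ p_y_given_x
    i_xt = np.sum(p_xt * np.log((p_xt + 1e-32) /
                                (p_x[:, None] * p_t[None, :] + 1e-32)))
    i_ty = np.sum(p_ty * np.log((p_ty + 1e-32) /
                                (p_t[:, None] * p_y[None, :] + 1e-32)))
    return i_xt, i_ty

# Example: binary symmetric pair with crossover probability eps = 0.1.
eps = 0.1
p_xy = np.array([[(1 - eps) / 2, eps / 2],
                 [eps / 2, (1 - eps) / 2]])
complexity, relevance = information_bottleneck(p_xy, n_t=2, beta=5.0)
print(f"I(X;T) = {complexity:.3f} nats, I(T;Y) = {relevance:.3f} nats")

Sweeping beta traces out the single-encoder relevance-complexity curve; the paper's regions characterize the analogous tradeoff when the descriptions are built interactively by two rate-limited encoders.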