publication . Preprint . Article . 2010

On Marton’s Inner Bound for the General Broadcast Channel

Amin Gohari; Abbas El Gamal; Venkat Anantharam;
Open Access English
  • Published: 26 Jun 2010
Abstract
Comment: 14 pages, Submitted to IEEE Transactions on Information Theory
Subjects
arXiv: Computer Science::Information Theory
free text keywords: Computer Science - Information Theory, Library and Information Sciences, Information Systems, Computer Science Applications, Coding (social sciences), Encoding (memory), Random variable, Mathematics, Combinatorics, Broadcast channels, Gaussian process, symbols.namesake, symbols, Gaussian, Binary number, Channel capacity, Discrete mathematics
Funded by
NSF| NetSE: Large: Collaborative Research: Improving Internet Incentives
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 0910702
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computer and Network Systems
NSF| Theory and Methodology for the Design and Evaluation of High-Performance LDPC Codes
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 0635372
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computing and Communication Foundations
NSF| Collaborative Research: NeTS-FIND: Market-Enabling Network Architecture
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 0627161
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computer and Network Systems
NSF| Emerging Frontiers of Science of Information
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 0939370
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computing and Communication Foundations
NSF| New Techniques for the Control of Multi-Agent Systems in Uncertain Environments
Project
  • Funder: National Science Foundation (NSF)
  • Project Code: 0500234
  • Funding stream: Directorate for Computer & Information Science & Engineering | Division of Computing and Communication Foundations

[1] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley and Sons, 1991.

[2] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Budapest, Hungary: Akadémiai Kiadó, 1981.

[3] K. Marton, “A coding theorem for the discrete memoryless broadcast channel,” IEEE Trans. IT, 25 (3): 306-311 (1979).

[4] S. I. Gelfand and M. S. Pinsker, “Capacity of a broadcast channel with one deterministic component,” Probl. Inf. Transm., 16 (1): 17-25 (1980).

[5] C. Nair and Z. V. Wang, “On the inner and outer bounds for 2-receiver discrete memoryless broadcast channels,” Proceedings of the ITA workshop, San Diego, 226-229 (2008).

[6] Y. Liang, G. Kramer, and H.V. Poor, “Equivalence of two inner bounds on the capacity region of the broadcast channel,” 46th Annual Allerton Conf. on Commun., Control and Comp., 1417-1421 (2008).

[7] J. Körner and K. Marton, “General broadcast channels with degraded message sets,” IEEE Trans. IT, 23 (1): 60-64 (1977).

[8] A. A. Gohari and V. Anantharam, “Evaluation of Marton's Inner Bound for the General Broadcast Channel,” Submitted to IEEE Trans. IT. Available at http://arxiv.org/abs/0904.4541

[9] F. M. J. Willems, “The maximal-error and average-error capacity region of the broadcast channel are identical,” Problems of Control and Information Theory, 19 (4): 339-347 (1990).

[10] V. Jog and C. Nair, “An information inequality for the BSSC channel,” Proceedings of the ITA workshop, San Diego, 1-8 (2010).

[11] Y. Geng, A. A. Gohari, C. Nair, Y. Yu, “On Marton's inner bound for two receiver broadcast channels,” Proceedings of the ITA workshop, San Diego (2011).

[12] Y. Geng, A. A. Gohari, C. Nair, Y. Yu, “The capacity region for two classes of product broadcast channels,” To appear in the Proceedings of the 2011 IEEE International Symposium on Information Theory, Saint Petersburg, Russia, 2011.

[13] C. Nair, Z. V. Wang, and Y. Geng, “An information inequality and evaluation of Marton's inner bound for binary input broadcast channels,” Proceedings of the 2010 IEEE International Symposium on Information Theory, Austin, Texas, 550-554 (2010).

[14] H. Weingarten, Y. Steinberg, and S. Shamai (Shitz), “The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel,” IEEE Trans. IT, 52 (9): 3936-3964 (2006).

Let us consider the following two cases.

λ < 1: In this case, equation (22) implies that I(U;Y) ≤ I(U;VZ) + (λ/(λ−1)) I(V;U|Z). This inequality implies the desired inequality I(U;Y) ≤ I(U;VZ).

Now, assume that case 2 holds. Following the above proof for case 1, one can get λ = 1. In this case, equation (22) implies that I(V;U|Z) = 0. Furthermore, equation (21) holds with equality. Since t(x) ∈ T, we must have I(U;Y) = I(U;Z) = 0. The fact that I(V;U|Z) = I(U;Y) = I(U;Z) = 0 implies that I(U;Y) = I(U;ZV) = 0. Therefore the inequality I(U;Y) ≤ I(U;ZV) also holds in this case.
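As a reading aid, the chain-rule step in the λ = 1 case above can be expanded as follows; this is only a sketch using the standard identities I(U;ZV) = I(U;Z) + I(U;V|Z) and I(U;V|Z) = I(V;U|Z), with U, V, Y, Z as in the fragment above.

\begin{align*}
I(U;ZV) &= I(U;Z) + I(U;V \mid Z) && \text{(chain rule for mutual information)}\\
        &= 0 + I(V;U \mid Z)      && \text{(since } I(U;Z) = 0\text{)}\\
        &= 0                      && \text{(since } I(V;U \mid Z) = 0\text{)}.
\end{align*}

Together with I(U;Y) = 0, this gives I(U;Y) = I(U;ZV) = 0, so the inequality I(U;Y) ≤ I(U;ZV) holds trivially in this case.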
