Broadcast Channels

We introduce the problem of a single source attempting to communicate information simultaneously to several receivers. The intent is to model the situation of a broadcaster with multiple receivers or a lecturer with many listeners. Thus several different channels with a common input alphabet are specified. We shall determine the families of simultaneously achievable transmission rates for many extreme classes of channels. Upper and lower bounds on the capacity region will be found, and it will be shown that the family of theoretically achievable rates dominates the family of rates achievable by previously known time-sharing and maximin procedures. This improvement is gained by superimposing high-rate information on low-rate information. All of these results lead to a new approach to the compound channels problem.
IEEE TRANSACTIONS ON INFORMATION THEORY, JANUARY 1972

Fig. 5. Set of achievable rates for BSC. Fig. 6. Encoder and decoder for broadcast channel.

This argument suggests that (R1, R2) is jointly achievable; the Appendix contains the proof. Letting α range from 0 to 1 generates the new achievable set of rates shown in Fig. 5. We strongly believe that Fig. 5 exhibits the optimal region of achievable rates. This curve dominates the time-sharing curve. We note also that near the minimax point the slope is zero; thus an infinitesimal degradation in the rate for the poor channel allows a disproportionately large increase in the rate for the good channel. Consequently, at least for two BSC's, superposition of information dominates time sharing.

This example naturally leads to a conjecture concerning the evaluation of the capacity region for a special class of broadcast channels in which one channel is a degraded version of the other.

Definition: Let P and Q be channel matrices of sizes |X| × |Y1| and |X| × |Y2|, respectively. Q will be said to be a degraded version of P if there exists a stochastic matrix M such that Q = PM. Shannon [6] has shown that the capacity of channel Q is not greater than that of channel P.

Conjecture 1: Let S be an arbitrary |X| × |X| channel matrix corresponding to the channel density p(x|s), and let p(s) be an arbitrary probability distribution on X. Let p(s) induce the joint distribution

p(y1, y2, s, x) = p(s) p(x|s) p(y1, y2|x)

on (Y1, Y2, S, X). Let P2 be a degraded version of P1. Then the set of achievable (R1, R2) pairs for the broadcast channel (X, p(y1, y2|x), Y1 × Y2) is given by

{(I(S; Y2) + I(X; Y1|S), I(S; Y2))},

generated by all channels S and probability distributions p(s). The two-BSC example in this section is a special case of this conjecture. The code that achieves (R1, R2) is constructed in an analogous manner.

At the time of this writing, P. Bergmans at Stanford has made some progress on the proof of this conjecture. In fact, Bergmans' considerations have allowed me to modify the conjecture from an initially more ambitious version involving a larger class of channels mentioned in [6]. I no longer have any basis for belief in the more ambitious conjecture.

II. DEFINITIONS AND NOTATION

We shall define a two-receiver memoryless broadcast channel, denoted by (X, p(y1, y2|x), Y1 × Y2), to consist of three finite sets X, Y1, Y2 and a collection of probability distributions p(y1, y2|x) on Y1 × Y2, one for each x ∈ X. The interpretation is that x is an input to the channel and y1 and y2 are the respective outputs at receiver terminals 1 and 2, as shown in Fig. 6. The problem is to communicate simultaneously with receivers 1 and 2 as efficiently as possible.

For the development of this paper we shall need knowledge only of the marginal distributions

p1(y1|x) = Σ_{y2 ∈ Y2} p(y1, y2|x)
p2(y2|x) = Σ_{y1 ∈ Y1} p(y1, y2|x),

which we have designated in the examples by channel matrices P1 and P2 of sizes |X| × |Y1| and |X| × |Y2|, respectively. The possible dependence or independence of Y1 and Y2 given X is irrelevant, given the constraint that the decoding at the two receivers must be done independently.

The nth extension of a broadcast channel is the broadcast channel

(X^n, p(y1, y2|x), Y1^n × Y2^n),

where p(y1|x) = Π_{i=1}^n p1(y1i|xi), for x ∈ X^n, y1 ∈ Y1^n, y2 ∈ Y2^n.

An ((M1, M2, M12), n) code for a broadcast channel consists of three sets of integers

R = {1, 2, ..., M12}
S1 = {1, 2, ..., M1}
S2 = {1, 2, ..., M2},

an encoding function

x: R × S1 × S2 → X^n,

and two decoding functions

g1: Y1^n → R × S1, g1(y1) = (r̂, ŝ1)
g2: Y2^n → R × S2, g2(y2) = (r̂, ŝ2).

The set {x(r, s1, s2) | (r, s1, s2) ∈ R × S1 × S2} is called the set of codewords. As illustrated in Fig. 6, we think of the integers s1 and s2 as being arbitrarily chosen by the transmitter to be sent to receivers 1 and 2, respectively.
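The marginal and degraded-version definitions above are easy to exercise numerically. The following sketch (our illustration, not part of the paper; NumPy, with arbitrary BSC crossover probabilities 0.1 and 0.2) forms a joint channel p(y1, y2|x) with conditionally independent outputs, recovers the marginals, and exhibits a stochastic matrix M with P2 = P1 M, verifying that P2 is a degraded version of P1:

```python
import numpy as np

def bsc(p):
    """Channel matrix of a binary symmetric channel with crossover probability p."""
    return np.array([[1 - p, p], [p, 1 - p]])

P1, P2 = bsc(0.1), bsc(0.2)
# Joint channel p(y1, y2 | x) with Y1, Y2 conditionally independent given x.
joint = P1[:, :, None] * P2[:, None, :]

# Marginals p1(y1|x) = sum_{y2} p(y1,y2|x) and p2(y2|x) = sum_{y1} p(y1,y2|x).
assert np.allclose(joint.sum(axis=2), P1)
assert np.allclose(joint.sum(axis=1), P2)

# Degradation: cascading BSC(0.1) with BSC(q) gives BSC(0.1(1-q) + 0.9q);
# choosing q so that this equals 0.2 exhibits M with P2 = P1 M.
q = (0.2 - 0.1) / (1 - 2 * 0.1)     # q = 0.125
M = bsc(q)
assert np.allclose(P1 @ M, P2)
print("P2 = P1 M with M = BSC(%.4f): P2 is a degraded version of P1" % q)
```

Cascading two binary symmetric channels is the classical way such a degradation arises; Shannon's result cited above then guarantees that channel 2 can have no greater capacity than channel 1.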
The integer r is also chosen by the transmitter and is intended to be received by both receivers. Thus r is the "common" part of the message, and s1 and s2 are the "independent" parts of the message.

An error is made by the ith receiver if gi(yi) ≠ (r, si). If the message (r, s1, s2) is sent, let

λi(r, s1, s2) = Pr{gi(yi) ≠ (r, si)}, i = 1, 2,   (6)

denote the probabilities of error for the two channels, where we note that y1, y2 are the only chance variables in the above expression.

We denote the (arithmetic average) probability of error in decoding (r, s1), averaged over all choices of s2, by

λ1(r, s1) = (1/M2) Σ_{s2=1}^{M2} λ1(r, s1, s2).   (7)

Similarly, for channel 2 we define

λ2(r, s2) = (1/M1) Σ_{s1=1}^{M1} λ2(r, s1, s2).   (8)

Finally, we define the overall arithmetic average probabilities of error of the code for channels 1 and 2 as

p̄1(e) = (1/M) Σ_{r,s1,s2} λ1(r, s1, s2) = (1/(M12 M1)) Σ_{r,s1} λ1(r, s1)   (9)

p̄2(e) = (1/M) Σ_{r,s1,s2} λ2(r, s1, s2) = (1/(M12 M2)) Σ_{r,s2} λ2(r, s2),   (10)

where

M = M1 M2 M12.   (11)

The overbar on p̄i(e) will serve as a reminder that this probability of error is calculated under a special distribution, namely the uniform distribution over the codewords.

We shall also be interested in the maximal probabilities of error

λi = max_{(r,s1,s2)} Pr{gi(yi) ≠ (r, si) | (r, s1, s2)}, i = 1, 2,   (12)

corresponding to the worst codeword with respect to each channel. Note that λi ≥ p̄i(e).

We shall define the rate (R1, R2, R12) of an ((M1, M2, M12), n) code by

R1 = (1/n) log M1 M12
R2 = (1/n) log M2 M12
R12 = (1/n) log M12,   (13)

all defined in bits/transmission. Thus Ri is the total rate of transmission of information to receiver i, i = 1, 2, and R12 is the portion of the information common to both receivers.

Comment: When λi and p̄i(e) refer to the nth extension of a broadcast channel, we will often designate this explicitly by λi^(n), p̄i^(n)(e).

Definition: The rate (R1, R2, R12) is said to be achievable by a broadcast channel if, for any ε > 0 and for all n sufficiently large, there exists an ((M1, M2, M12), n) code with

M1 M12 ≥ 2^{nR1}
M2 M12 ≥ 2^{nR2}
M12 ≤ 2^{nR12}   (14)

such that p̄1^(n)(e) < ε, p̄2^(n)(e) < ε.

Comment: Note that the total number M = M1 M2 M12 of codewords for a code satisfying (14) must exceed 2^{n(R1+R2−R12)}.

Definition: The capacity region ℛ* for a broadcast channel is the set of all achievable rates (R1, R2, R12). The goal of this paper is to determine ℛ* for as large a class of channels as possible.

Comment: We shall sometimes let ℛ* also denote the set of achievable (R1, R2) pairs. However, at this stage in our understanding, it seems that sole concern with (R1, R2), with the exclusion of concern with R12, would result in a coarsened and cumbersome theoretical development.

Comment: The extension of the definition of the broadcast channel from two receivers to k receivers is notationally cumbersome but straightforward, given the following comment. The index sets R, S1, S2 should be replaced by index sets I(θ), θ ∈ {0, 1}^k, θ ≠ 0, with the interpretation that the integer i(θ) selected in index set I(θ) = {1, 2, ..., M(θ)} is intended (by the proper code selection) to be received correctly by every receiver j for which θj = 1 in θ = (θ1, θ2, ..., θk). Then, for example, the rate of transmission over the nth extension of a broadcast channel to the ith receiver will be given by

Ri = (1/n) Σ_{θ: θi = 1} log M(θ).   (15)

In the two-receiver broadcast channel, the corresponding sets in the new notation are R = I(1,1), S1 = I(1,0), S2 = I(0,1).

Section IV treats the best two-channel situation and Section V treats the worst.

IV. ORTHOGONAL CHANNELS

In this section we shall investigate a broadcast channel in which efficient communication to one receiver in no way interferes with communication to the other. A movie designed to be shown simultaneously to a blind person and a deaf person would be such an example.

Consider the broadcast channel with X = {1, 2, 3, 4}, Y1 = {1, 2}, Y2 = {1, 2}, with

P1 = [1 0; 1 0; 0 1; 0 1], P2 = [1 0; 0 1; 1 0; 0 1],

(Pk)ij = Pr{yk = j | x = i}, k = 1, 2; j = 1, 2; i = 1, 2, 3, 4, as depicted in Fig. 7.
Fig. 7. Orthogonal channel. Fig. 8. Achievable rates for the orthogonal channel.

We easily calculate C1 = C2 = 1 bit/transmission. Clearly, from the standpoint of receiver Y1, inputs x = 1 and x = 2 both result in y1 = 1 with probability 1 and can therefore be merged. Proceeding with this analysis, we find that Y1 can determine only x ∈ {1, 2} versus x ∈ {3, 4}, while Y2 can determine only x ∈ {1, 3} versus x ∈ {2, 4}.

For this example, C1 = 1 and C2 = 1 are, respectively, attained for Pr{x = 1} + Pr{x = 2} = 1/2 and Pr{x = 1} + Pr{x = 3} = 1/2. Solving these simultaneous equations, we find that (I(X; Y1), I(X; Y2)) = (1, 1) can be achieved by pi = 1/4, i = 1, 2, 3, 4. This in itself does not guarantee that (C1, C2) can be simultaneously achieved. However, there does exist a coding scheme for this channel. Let u1 ∈ {1, 2}, u2 ∈ {1, 2} denote the message bits that we wish to transmit to Y1 and Y2, respectively. Make the association from pairs of u's to input symbols x:

(u1, u2) = (1, 1) ↔ 1
(u1, u2) = (1, 2) ↔ 2
(u1, u2) = (2, 1) ↔ 3
(u1, u2) = (2, 2) ↔ 4

and send the appropriate input symbol x. Then y1 = u1 and y2 = u2, and capacities C1 and C2 are simultaneously achieved. Since u1 and u2 may be chosen independently, we may also achieve any R12 ≤ 1 by this scheme. Fig. 8 shows the set of achievable rates. The upper bound theorem of Section VIII establishes this region as optimal.

The noiselessness of the channels is not crucial: this broadcast channel remains orthogonal in the sense that (R1, R2) = (C1, C2) may be achieved even if we define the new channels as in (19), with crossover probabilities r1 and r2. In this case, C1 = 1 − H(r1) and C2 = 1 − H(r2). C1 and C2 may be simultaneously achieved by selecting sequences of codes with words in {0, 1}^n such that λ1^(n) → 0, λ2^(n) → 0 as n → ∞, selecting xi ∈ {1, 2} or xi ∈ {3, 4} according to the value of the ith bit of the codeword chosen to be sent from the first code, and selecting xi ∈ {1, 3} or xi ∈ {2, 4} according to the value of the ith bit of the codeword selected from the second code. Here, any R12 such that 0 ≤ R12 ≤ min{C1, C2} may also be achieved. Nothing more could be expected, and each channel performs no worse in the presence of the other than it would alone.

V. INCOMPATIBLE BROADCAST CHANNELS

In a search to find the worst case of incompatibility in simultaneous communication we turn to the following practical example which, for obvious reasons, we term the switch-to-talk channel.

Example 1 - Switch-to-Talk: Let

X = X1 ∪ X2
Y1 = Ỹ1 ∪ {φ1}
Y2 = Ỹ2 ∪ {φ2}.

Each receiver has an indicator that lights when the sender is communicating with the other receiver. The idea is that when the sender wishes to communicate with Y1, he uses x ∈ X1, resulting in y2 = φ2, indicating to Y2 that the sender is communicating with Y1. Similarly, to communicate with Y2, the sender uses x ∈ X2, resulting in y1 = φ1. This might correspond to the situation, for example, where a speaker fluent in Spanish and Dutch must speak simultaneously to two listeners, one of whom understands only Dutch and the other only Spanish.

Fig. 9. The switch-to-talk channel.

Let channel 1 have capacity C1(0) and channel 2 have capacity C2(0). Using the known result for sum channels (see Shannon [3]) we find

C1 = log(1 + 2^{C1(0)})
C2 = log(1 + 2^{C2(0)}).

We shall discuss this example informally. Certainly (R1, R2) = (C1, 0) is achievable and (R1, R2) = (0, C2) is achievable, and hence, by time sharing, any pair of rates (R1, R2) = (λC1, λ̄C2), 0 ≤ λ ≤ 1, is achievable. However, additional information is contained in the knowledge of which channel is in use, and proper encoding of the transmission times to Y1 and Y2 can be used to send extra information to both channels. If channel 1 is used a proportion α of the time, αC1(0) bits/transmission are received by Y1. However, H(α) additional bits/transmission are achieved by choosing which channel to send through independently at each instant by flipping a coin with bias α. In other words, modulation of the switch-to-talk button, subject to the time-proportion constraint α, allows the perfect transmission of one of 2^{nH(α)} additional messages to both receivers Y1 and Y2.

Thus all (R1, R2) of the form (R1, R2) = (αC1(0) + H(α), ᾱC2(0) + H(α)) can be achieved by choosing the subset of n transmissions devoted to the use of channel 1 in one of the 2^{nH(α)} possible ways. This bound cannot be achieved unless the information rate R12 common to both channels satisfies R12 ≥ H(α). The results are summarized in Fig. 10.

Fig. 10. Achievable rates for switch-to-talk channels.

It is an easy consequence of Section VIII that Fig. 10 corresponds to the capacity region for this channel, and therefore that this encoding scheme is optimal for the switch-to-talk channel.

The following example illustrates the worst case that can arise in simultaneous communications.

Example 2 - Incompatible Case: Let

X = {1, 2, 3, 4}, Y1 = {1, 2}, Y2 = {1, 2},

with

P1 = [1 0; 0 1; 1/2 1/2; 1/2 1/2], P2 = [1/2 1/2; 1/2 1/2; 1 0; 0 1],

as shown in Fig. 11. Thus if X wishes to communicate with Y1 over the perfect channel x ∈ {1, 2} ↔ Y1, he must send pure noise to Y2, i.e., Pr{y2 = 1 | x ∈ {1, 2}} = 1/2. A similar statement holds for X communicating with Y2.

Fig. 11. Incompatible broadcast channels.

In Section VIII we shall establish an upper bound on the capacity region by finding the set of all achievable (I(X; Y1), I(X; Y2)) pairs. Anticipating these results, we shall make this calculation for this example. Let Pr{x = i} = pi, i = 1, 2, 3, 4. Define α = p1 + p2, ᾱ = p3 + p4. Then H(Y1) = H(p1 + ᾱ/2) and H(Y1|X) = ᾱ, yielding I(X; Y1) = H(p1 + ᾱ/2) − ᾱ. Similarly, I(X; Y2) = H(p4 + α/2) − α.

First, fixing α and maximizing over 0 ≤ p1 ≤ α, 0 ≤ p4 ≤ ᾱ, we find the maximum values

I(X; Y1) = 1 − ᾱ = α
I(X; Y2) = 1 − α = ᾱ,

achieved by p1 = p2 = α/2 and p3 = p4 = ᾱ/2. This is the upper boundary of achievable (I(X; Y1), I(X; Y2)) pairs. It may also be verified that, for any α ∈ [0, 1], there exist p1, p2, p3, p4 achieving any (I(X; Y1), I(X; Y2)) pair dominated by (α, 1 − α).
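The switch-to-talk rate pair (αC1(0) + H(α), ᾱC2(0) + H(α)) is easy to tabulate against naive time sharing (αC1(0), ᾱC2(0)). A sketch (ours, not from the paper; the unit sub-channel capacities are illustrative values):

```python
import numpy as np

def H(a):
    """Binary entropy in bits."""
    if a in (0.0, 1.0):
        return 0.0
    return -a * np.log2(a) - (1 - a) * np.log2(1 - a)

C10, C20 = 1.0, 1.0   # sub-channel capacities C1(0), C2(0); illustrative values

for alpha in (0.25, 0.5, 0.75):
    R1 = alpha * C10 + H(alpha)            # switch modulation adds H(alpha) to each
    R2 = (1 - alpha) * C20 + H(alpha)
    naive = (alpha * C10, (1 - alpha) * C20)   # naive time sharing at the same split
    print(f"alpha={alpha}: modulated ({R1:.3f}, {R2:.3f}) vs naive {naive}")
```

At α = 1/2 the modulated pair is (1.5, 1.5) against the naive (0.5, 0.5): the H(α) bonus from encoding the transmission times is received by both terminals.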
Thus we have the set I of achievable (I(X; Y1), I(X; Y2)) pairs depicted in Fig. 12. In Section VIII it will be shown that this region of jointly achievable (I(X; Y1), I(X; Y2)) pairs is an upper bound on the capacity region. However, we can trivially achieve any pair of rates (R1, R2) on the upper boundary of this region by simply time-sharing the two noiseless channels x ∈ {1, 2} ↔ Y1 and x ∈ {3, 4} ↔ Y2. If x ∈ {1, 2} is used a proportion α of the time, then rates R1 = α and R2 = ᾱ = 1 − α may be achieved without any additional coding. Thus the upper bound can be achieved with trivial coding procedures, and Fig. 12 therefore corresponds to the capacity region.

Fig. 12. Capacity region for incompatible channels.

Here, then, is an example in which the two channels are so incompatible that one can do no better than time sharing, i.e., using one channel efficiently part of the time and the other channel the remainder. Fortunately, for those wishing to get something for nothing, this is the exception rather than the rule.

VI. THE BOTTLENECK CHANNEL

Consider the broadcast channel in which the two channels have the same structure, i.e.,

p1(y1|x) = p2(y2|x), x ∈ X, y1, y2 ∈ Y1 = Y2 = Y,

as shown in Fig. 13. We shall term this the bottleneck channel.

Fig. 13. The bottleneck channel. Fig. 14. Achievable rates for bottleneck channel.

Here, we note that any code for receiver Y1 is also a code with the same error properties for receiver Y2. Thus Y1 and Y2 both perceive correctly the transmitted sequence x with low probability of error. Let the capacity of channel P be denoted by C1 = C2 = C bits per transmission. Now, since both receivers receive the same information about X, it follows that both receivers 1 and 2 will be able to correctly recover r, s1, and s2 if and only if (R1, R2, R12) is an achievable rate. Counting the number of messages per unit time necessary to transmit (r, s1, s2) correctly yields the following proposition [see comment following (14)].

Proposition: (R1, R2, R12) is an achievable rate for the broadcast bottleneck channel of capacity C if and only if

R1 + R2 − R12 ≤ C
0 ≤ R1 ≤ C
0 ≤ R2 ≤ C
0 ≤ R12 ≤ C.

As an important application of these ideas, suppose that we wish to send a random process U = {Un: n = 1, 2, ...} to receiver 1 and a random process V = {Vn: n = 1, 2, ...} to receiver 2 through the bottleneck channel P with arbitrarily small probability of error. (See Fig. 15.)

Assume that U = {Un} and V = {Vn} are jointly ergodic processes taking values in finite alphabets. By jointly ergodic, we mean that the process Zn = (Un, Vn) is ergodic. We recall that the entropy of an ergodic process {Zn} is defined by

H(Z) = lim_{n→∞} n^{-1} H(Z1, Z2, ..., Zn).   (23)

We assert the following.

Fact: Asymptotically error-free transmission of {U1, U2, ..., Un} to receiver 1 and {V1, V2, ..., Vn} to receiver 2 over the bottleneck channel of capacity C can be accomplished if and only if

H(U, V) < C.   (24)

Proof: The well-known idea of the encoding is to enumerate the 2^{n(H(U,V)+ε)} ε-typical sequences and send the index of the actually occurring sequence (z1, z2, ..., zn) over the channel. If H(U, V) + ε < C, then this index will be correctly transmitted with probability of error ≤ ε/2 for sufficiently large n. Since the probability that a random (z1, z2, ..., zn) will be typical can be made ≥ 1 − ε/2 for sufficiently large n, the overall probability of error can be made less than ε. The converse follows the standard argument for a single channel.

The generalization of this result to arbitrary broadcast channels is unknown.

Let us now compare the orthogonal channel with the bottleneck channel. The orthogonal channel of Section IV achieves (R1, R2) = (1, 1) with arbitrary joint rate 0 ≤ R12 ≤ 1. Thus fully independent messages (R12 = 0) or maximally dependent messages (R12 = 1) can be sent simultaneously to receivers 1 and 2.
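Both halves of this section reduce to simple numerical checks: the Proposition is a membership test on (R1, R2, R12), and for an i.i.d. pair process the entropy rate in (23) is just the joint entropy of one pair. A sketch (ours, not from the paper; the joint pmf is an arbitrary illustration):

```python
import numpy as np

def bottleneck_achievable(R1, R2, R12, C):
    """Membership test for the rate region of the Proposition."""
    return (R1 + R2 - R12 <= C and 0 <= R1 <= C
            and 0 <= R2 <= C and 0 <= R12 <= C)

# With C = 1: (1, 1, 1) is achievable (maximally dependent messages),
# but (1, 1, 0) is not, since independent messages would need R1 + R2 <= C.
print(bottleneck_achievable(1, 1, 1, C=1))   # True
print(bottleneck_achievable(1, 1, 0, C=1))   # False

# For an i.i.d. pair process (U_n, V_n) with joint pmf p on {0,1} x {0,1},
# the entropy rate (23) equals the single-letter joint entropy H(U, V); by
# the Fact, transmission over a bottleneck channel of capacity C = 1 works
# iff H(U, V) < 1.
p = np.array([0.85, 0.05, 0.05, 0.05])   # hypothetical, highly dependent pair
HUV = -np.sum(p * np.log2(p))
print(HUV)          # about 0.85 bits
print(HUV < 1.0)    # True
```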
At the other extreme, in the case of the bottleneck channel with capacity C = 1, we can simultaneously achieve R1 = 1, R2 = 1. Here, however, it may be seen that achieving (R1, R2) = (1, 1) implies R12 = 1. Thus the messages sent to receivers 1 and 2 must be maximally dependent, and in fact equal.

Fig. 15. Sending two random processes over the same channel.

VII. GAUSSIAN CHANNELS

Consider the time-discrete Gaussian broadcast channel with two receivers depicted in Fig. 16. Let z1 = (z11, z12, ..., z1n, ...) be a sequence of independent identically distributed (i.i.d.) normal random variables (RV's) with mean zero and variance N1, and let z2 = (z21, z22, ..., z2n, ...) be i.i.d. normal RV's with mean zero and variance N2. Let N1 < N2. At the ith transmission the real number xi is sent, and y1i = xi + z1i and y2i = xi + z2i are received. In our analysis it is irrelevant whether z1i and z2i are correlated or not (although in the feedback case it may make a difference). Let there be a power constraint on the transmitted power, given for any n by

(1/n) Σ_{i=1}^n xi² ≤ S   (25)

for any signal x = (x1, x2, ..., xn) of block length n.

Fig. 16. Gaussian broadcast channel.

It is well known that the individual capacities are C1 = (1/2) log(1 + S/N1) and C2 = (1/2) log(1 + S/N2) bits/transmission, where all logarithms are to the base 2. Time sharing will achieve any convex combination of (C2, C2) and (C1, 0), as shown in Fig. 17.

Fig. 17. Time-sharing rates for the Gaussian broadcast channel.

Now let us see how we can improve on this performance. Think of the signal s2 (intended for the high-noise receiver Y2) as a sequence of i.i.d. N(0, ᾱS) RV's, where 0 ≤ α ≤ 1 and ᾱ = 1 − α. Superimposed on this sequence will be a sequence s1 that may be considered as a sequence of i.i.d. N(0, αS) RV's. Thus the sequence s = s1 + s2 will be a sequence of i.i.d. N(0, S) RV's. The received sequences are y1 = s1 + s2 + z1 and y2 = s1 + s2 + z2, as shown in Fig. 18.

Fig. 18. Decomposition of the signal.

Now s1 and z2 are considered to be noise by receiver 2. We see that the s1i + z2i are i.i.d. N(0, αS + N2) RV's. Therefore messages may be sent at rates less than

C2(α) = (1/2) log (1 + ᾱS/(αS + N2))

to receiver Y2 with probability of error near zero for sufficiently large block length n. That is, there exists a sequence of (2^{n(C2(α)−ε)}, n) codes with average power constraint ᾱS and probability of error p̄2^(n)(e) → 0.

Now, since N1 < N2, receiver Y1 may also correctly determine the transmitted sequence s2 with arbitrarily low probability of error. Upon decoding s2, given y1, receiver Y1 then subtracts s2 from y1, yielding ỹ1 = s1 + z1. At this stage channel 1 may be considered to be a Gaussian channel with input power constraint αS and additive zero-mean Gaussian noise with variance N1. The capacity of this channel is (1/2) log(1 + αS/N1) = C1(α) bits/transmission, and is achieved, roughly speaking, by choosing 2^{nC1(α)} independent n-sequences of i.i.d. N(0, αS) RV's as the code set for the possible sequences s1. Thus receiver Y1 correctly receives both s1 and s2.

This informal argument indicates that the rates

R1 = (1/2) log(1 + αS/N1)
R2 = (1/2) log(1 + ᾱS/(αS + N2))   (26)

may simultaneously be α-achieved, for any 0 ≤ α ≤ 1. These rate pairs, shown in Fig. 19, dominate the time-sharing rates.

Fig. 19. Set of achievable rates for the Gaussian broadcast channel.

Summarizing the argument, we select a set of 2^{n(C2(α)−ε)} random n-sequences of i.i.d. N(0, ᾱS) RV's, and a set of 2^{n(C1(α)−ε)} random n-sequences of i.i.d. N(0, αS) RV's. Now 2^{n(C1(α)+C2(α)−2ε)} n-sequences are formed by adding together pairs of sequences, in which the first sequence is chosen from the first set, the second sequence is chosen from the second set, and the pairs are chosen in all possible ways. A message

(r, s1), r ∈ {1, 2, ..., 2^{n(C2(α)−ε)}}, s1 ∈ {1, 2, ..., 2^{n(C1(α)−ε)}},

is transmitted by selecting the n-sequence corresponding to the sum of the rth sequence in the first set and the s1th sequence in the second set. Receiver 1 is intended to decode (r, s1) correctly and receiver 2 is intended to decode r correctly, thus simultaneously achieving rates

R1 = C1(α) + C2(α) − 2ε
R2 = C2(α) − ε.   (27)

A full discussion of the Gaussian channel would lead far afield. A direct simple proof of the achievability of the rates given in (27) has been found but will not be presented here.

We shall conclude this section with one observation. If N1 = 0, and channel 1 is therefore perfect, we have C1 = ∞ and C2 = (1/2) log(1 + S/N2). The compound channel or maximin approach would have us send at rates (R1, R2) = (C2, C2). However, an arbitrarily small decrement in the rate for channel 2, corresponding to 0 < α < 1 in (26), yields

(R1, R2) = (∞, C2 − ε)

as a pair of achievable rates. Although this rate pair does not dominate (C2, C2), it seems vastly preferable.
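The superposition scheme of this section can be compared numerically with the time-sharing segment from (C2, C2) to (C1, 0). A sketch (ours, not from the paper; S, N1, N2 are illustrative values, and receiver 1's rate is taken as C1(α) + C2(α) in accordance with (27), since receiver 1 decodes both messages):

```python
import numpy as np

def C(snr):
    """Gaussian channel capacity (1/2) log2(1 + SNR), bits/transmission."""
    return 0.5 * np.log2(1 + snr)

S, N1, N2 = 10.0, 1.0, 5.0   # illustrative power and noise variances (N1 < N2)

def superposition(alpha):
    """Superposition rates: receiver 2 gets C2(alpha); receiver 1 decodes both
    messages, so its total rate is C1(alpha) + C2(alpha), as in (27)."""
    C1a = C(alpha * S / N1)                       # private part, eq. (26)
    C2a = C((1 - alpha) * S / (alpha * S + N2))   # common part, eq. (26)
    return C1a + C2a, C2a

C1, C2 = C(S / N1), C(S / N2)
for alpha in (0.2, 0.5, 0.8):
    R1, R2 = superposition(alpha)
    lam = R2 / C2                        # time-sharing mix with the same R2
    R1_ts = lam * C2 + (1 - lam) * C1    # point on the (C2,C2)-(C1,0) segment
    assert R1 > R1_ts                    # superposition dominates time sharing
    print(f"alpha={alpha}: R1={R1:.3f} vs time-sharing {R1_ts:.3f}  (R2={R2:.3f})")
```

At the endpoints α = 0 and α = 1 the two curves meet at (C2, C2) and (C1, 0); in between, superposition gives strictly more.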
VIII. AN UPPER BOUND ON ACHIEVABLE RATES (R1, R2)

Suppose that p(x), a probability distribution on X, generates the pair of mutual informations (I(X; Y1), I(X; Y2)), where, for i = 1, 2,

I(X; Yi) = Σ_{x∈X} Σ_{y∈Yi} p(x) pi(y|x) log (pi(y|x)/p(y)).   (28)

Given the intuitive properties of mutual information, it is natural to assume that rates R1 = I(X; Y1), R2 = I(X; Y2) are therefore simultaneously achievable. This turns out not to be the case. Close inspection of the example of two BSC's in Section II, with Pr{x = 1} = 1/2, I(X; Y1) = 1, and I(X; Y2) = C2, will yield a counterexample. However, the set of jointly achievable mutual-information pairs, properly modified to take into account the possibility of time-sharing and throwing information away, does yield an upper bound on the capacity region ℛ*. This upper bound is actually achieved by the orthogonal-channel, switch-to-talk-channel, and incompatible-channel examples.

Thus we proceed to define ℛ̄ and establish it as an upper bound. Let

I = {(I(X; Y1), I(X; Y2)) | p(x) ≥ 0, Σ p(x) = 1}   (29)

denote the set of all pairs (I(X; Y1), I(X; Y2)) generated by p(x) as p(·) ranges over the simplex of possible probability distributions on X. Define Ī to be the convex hull of I. Thus Ī may be interpreted as the average joint mutual information achievable by varying p(·) with time. Let

ℛ̄ = {(R1, R2) ∈ E² | R1 ≤ I1, R2 ≤ I2, for some (I1, I2) ∈ Ī}.   (30)

Thus ℛ̄ intuitively corresponds to the joint mutual information achievable from Ī by throwing information away. These sets are depicted in Fig. 20. We now show ℛ* ⊆ ℛ̄.

Fig. 20. Upper bound on capacity region.

Lemma 1: Given an arbitrary ((M1, M2, M12), n) code for the nth extension of a broadcast channel, consisting of codewords x(r, s1, s2) ∈ X^n, r ∈ R, s1 ∈ S1, s2 ∈ S2, |R| = M12, |S1| = M1, |S2| = M2, M = M12 M1 M2, let (r, s1, s2) be a random variable with range R × S1 × S2. Let (ȳ1, ȳ2) ∈ Y1^n × Y2^n be the corresponding random output n-sequences received by receivers 1 and 2, generated by sending x(r, s1, s2) over the channel. If p1(e) = Pr{(r̂, ŝ1) ≠ (r, s1)} and p2(e) = Pr{(r̂, ŝ2) ≠ (r, s2)} are the receiver probabilities of error of the code, then

H(X | Ȳ1) ≤ 1 + log M2 + p1(e) log M12 M1   (31)
H(X | Ȳ2) ≤ 1 + log M1 + p2(e) log M12 M2.   (32)

Proof: Let the decoding rules corresponding to the code be

g1: Y1^n → R × S1
g2: Y2^n → R × S2,   (33)

written

gk(yk) = (gk1(yk), gk2(yk)), k = 1, 2.

Thus, given the random message (r, s1, s2) and received sequence yk ∈ Yk^n, receiver k will make an error if and only if

g1(y1) ≠ (r, s1), k = 1
g2(y2) ≠ (r, s2), k = 2,   (34)

so that

p1(e) = Pr{g1(ȳ1) ≠ (r, s1)}
p2(e) = Pr{g2(ȳ2) ≠ (r, s2)}.   (35)

We note that

H(X | y1) ≤ H(p1(e|y1), 1 − p1(e|y1)) + (1 − p1(e|y1)) log M2 + p1(e|y1) log(M − M2),   (36)

where we have used the inequality

H(α1, α2, ..., αm) ≤ log m   (37)

and a basic composition relation (see Ash [1, p. 8]). We have, of course, conditioned on the events g1(y1) = (r, s1) and g1(y1) ≠ (r, s1). Taking the expectation over Ȳ1 and using the convexity of H(p, 1 − p) in p, we have

H(X | Ȳ1) ≤ H(p̄1(e), 1 − p̄1(e)) + (1 − p̄1(e)) log M2 + p̄1(e) log(M − M2).   (38)

Finally, since H(p, 1 − p) ≤ 1 and M = M12 M1 M2, we have

H(X | Ȳ1) ≤ 1 + log M2 + p̄1(e) log (M − M2)/M2 ≤ 1 + log M2 + p̄1(e) log M12 M1.   (39)

The corresponding argument for H(X | Ȳ2) completes the proof.

We now wish to show that p̄1^(n)(e), p̄2^(n)(e) cannot simultaneously tend to zero for rates (R1, R2) ∉ ℛ̄. This will establish ℛ̄ as an upper bound on the capacity region for a broadcast channel. Let R1 = (1/n) log M1 M12 and R2 = (1/n) log M2 M12 be the rates of communication in bits/transmission for receivers Y1 and Y2, respectively. We recall that R12 = (1/n) log M12 is the transmission rate for information common to both channels. The proof closely resembles that used by Shannon [4] for the two-way channel.

Theorem: For any sequence of ((2^{nR1}, 2^{nR2}, 2^{nR12}), n) codes, (R1, R2) ∉ ℛ̄ implies that (p̄1^(n)(e), p̄2^(n)(e)) does not tend to (0, 0). Thus ℛ̄ is an upper bound on the capacity region for the broadcast channel.

Proof: Given an arbitrary ((M1, M2, M12), n) code for the nth extension of the broadcast channel, choose a codeword x(r, s1, s2) at random according to a uniform distribution Pr{r, s1, s2} = 1/M, (r, s1, s2) ∈ R × S1 × S2, where M = |R| |S1| |S2|. If the codewords x(r, s1, s2) ∈ X^n are not distinct, a simple modification of the proof below will prove the theorem. Thus, treating the case where the x(r, s1, s2) are distinct, we have H(X) = log M and I(X; Ȳ1) = log M − H(X | Ȳ1), under the given uniform distribution on the codewords. As in Section III, let p̄1^(n)(e) and p̄2^(n)(e) designate the probabilities of error of the code under this distribution.

We shall need the following lemma (Ash [1, p. 81]).

Lemma 2: Let X1, ..., Xn be a sequence of input random variables to the (discrete memoryless) broadcast channel, and Yk1, ..., Ykn, for k = 1, 2, the corresponding received output random variables at receivers 1 and 2. Then

I(X1, ..., Xn; Yk1, ..., Ykn) ≤ Σ_{i=1}^n I(Xi; Yki), k = 1, 2,

with equality iff Yk1, Yk2, ..., Ykn are independent.

Proof:

H(Yk1, Yk2, ..., Ykn | X) = −Σ_{x,yk} p(x, yk) log pk(yk | x),

but, because the channel is memoryless, pk(yk | x) factors into the product Π_i pk(yki | xi), yielding

H(Yk1, ..., Ykn | X) = −Σ_i Σ_{x,yk} p(x, yk) log pk(yki | xi) = Σ_i H(Yki | Xi).

Also, by a basic inequality,

H(Yk1, ..., Ykn) ≤ Σ_i H(Yki),

with equality iff the Yki are independent, i = 1, 2, ..., n. Since I(X; Yk) = H(Yk) − H(Yk | X), the lemma follows.

By Lemma 2,

I(X; Ȳ1) ≤ Σ_{i=1}^n I(Xi; Y1i),   (41)

so

I(X; Ȳ1) = log M − H(X | Ȳ1) ≤ Σ_{i=1}^n I(Xi; Y1i).   (42)

Finally, since (31) in Lemma 1 holds for any distribution on the codewords, substitution in (42) yields

log M − 1 − log M2 − p̄1^(n)(e) log M12 M1 ≤ Σ_{i=1}^n I(Xi; Y1i),   (43)

which becomes the basic inequality

R1 ≜ (1/n) log M12 M1 ≤ [(1/n) + (1/n) Σ_{i=1}^n I(Xi; Y1i)] / (1 − p̄1^(n)(e)).   (44a)

Similarly, we find

R2 ≜ (1/n) log M12 M2 ≤ [(1/n) + (1/n) Σ_{i=1}^n I(Xi; Y2i)] / (1 − p̄2^(n)(e)).   (44b)

Summarizing, an arbitrary code for the nth extension of a broadcast channel must have rates (R1, R2) satisfying (44), where

p̄i^(n)(e) = (1/M) Σ_{(r,s1,s2)} λi(r, s1, s2), i = 1, 2.   (45)

Now suppose (R1, R2) ∉ ℛ̄, R1 ≥ 0, R2 ≥ 0, as in Fig. 21.
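The construction of the set I in (29) can be illustrated on the incompatible-channel example of Section V, for which the upper boundary was computed to be (α, 1 − α). A sketch (ours, not from the paper; random points of the simplex are drawn from a Dirichlet distribution):

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(px, P):
    """I(X;Y) in bits: sum of p(x) p(y|x) log2(p(y|x)/p(y)) over nonzero terms."""
    py = px @ P
    mask = (px[:, None] * P) > 0
    terms = np.where(mask, px[:, None] * P * np.log2(np.where(mask, P / py, 1.0)), 0.0)
    return terms.sum()

# Incompatible-channel example of Section V: x in {1,2} is perfect for Y1 and
# pure noise for Y2, and vice versa for x in {3,4}.
P1 = np.array([[1, 0], [0, 1], [0.5, 0.5], [0.5, 0.5]])
P2 = np.array([[0.5, 0.5], [0.5, 0.5], [1, 0], [0, 1]])

# Sweep random input distributions: every sampled pair should lie on or below
# the boundary line I(X;Y1) + I(X;Y2) = 1 derived in Section V.
for _ in range(2000):
    px = rng.dirichlet(np.ones(4))
    I1, I2 = mutual_information(px, P1), mutual_information(px, P2)
    assert I1 + I2 <= 1 + 1e-9

# Boundary point (alpha, 1-alpha) at p = (alpha/2, alpha/2, (1-alpha)/2, (1-alpha)/2):
alpha = 0.3
px = np.array([alpha/2, alpha/2, (1-alpha)/2, (1-alpha)/2])
print(mutual_information(px, P1), mutual_information(px, P2))  # ~ 0.3 0.7
```

Taking the convex hull of such sampled pairs and then the dominated set below it gives a numerical picture of Ī and ℛ̄ in (30).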
