Broadcast Channels

We introduce the problem of a single source attempting to communicate information simultaneously to several receivers. The intent is to model the situation of a broadcaster with multiple receivers or a lecturer with many listeners. Thus several different channels with a common input alphabet are specified. We shall determine the families of simultaneously achievable transmission rates for several extreme classes of channels. Upper and lower bounds on the capacity region will be found, and it will be shown that the family of theoretically achievable rates dominates the family of rates achievable by previously known time-sharing and maximin procedures. This improvement is gained by superimposing high-rate information on low-rate information. All of these results lead to a new approach to the compound channel problem.
IEEE TRANSACTIONS ON INFORMATION THEORY, JANUARY 1972

COVER: BROADCAST CHANNELS

This argument suggests that (R1, R2) is jointly achievable; the Appendix contains the proof. Letting α range from 0 to 1 generates the new achievable set of rates shown in Fig. 5. We strongly believe that Fig. 5 exhibits the optimal region of achievable rates.

Fig. 5. Set of achievable rates for BSC's.
Fig. 6. Encoder and decoder for broadcast channel.

This curve dominates the time-sharing curve. We note also that near the minimax point the slope is zero. Thus an infinitesimal degradation in the rate for the poor channel allows an increase of larger order in the rate for the good channel. Consequently, at least for two BSC's, superposition of information dominates time sharing.

This example naturally leads to a conjecture concerning the evaluation of the capacity region for a special class of broadcast channels in which one channel is a degraded version of the other.

Definition: Let P and Q be channel matrices of sizes |X| × |Y1| and |X| × |Y2|, respectively. Q will be said to be a degraded version of P if there exists a stochastic matrix M such that Q = PM. Shannon [6] has shown that the capacity of channel Q is not greater than that of channel P.

Conjecture 1: Let S be an arbitrary |S| × |X| channel matrix corresponding to the channel density p(x|s), and let p(s) be an arbitrary probability distribution on S. p(s) induces the joint distribution

p(y1, y2, s, x) = p(s)p(x|s)p(y1, y2|x)

on (Y1, Y2, S, X). Let P2 be a degraded version of P1. Then the set of achievable (R1, R2) pairs for the broadcast channel (X, p(y1, y2|x), Y1 × Y2) is given by

(R1, R2) = (I(S; Y2) + I(X; Y1|S), I(S; Y2)),

generated by all channels S and probability distributions p(s).

The two-BSC example in this section is a special case of this conjecture. The code that achieves (R1, R2) is constructed in an analogous manner. At the time of this writing, P. Bergmans at Stanford has made some progress on the proof of this conjecture. In fact, Bergmans' considerations have allowed me to modify the conjecture from an initially more ambitious version, involving a larger class of channels, mentioned in [6]. I no longer have any basis for belief in the more ambitious conjecture.

III. DEFINITIONS AND NOTATION

We shall define a two-receiver memoryless broadcast channel, denoted by (X, p(y1, y2|x), Y1 × Y2), to consist of three finite sets X, Y1, Y2 and a collection of probability distributions p(y1, y2|x) on Y1 × Y2, one for each x ∈ X. The interpretation is that x is an input to the channel and y1 and y2 are the respective outputs at receiver terminals 1 and 2, as shown in Fig. 6. The problem is to communicate simultaneously with receivers 1 and 2 as efficiently as possible.

For the development of this paper we shall need knowledge only of the marginal distributions

p1(y1|x) = Σ_{y2∈Y2} p(y1, y2|x)
p2(y2|x) = Σ_{y1∈Y1} p(y1, y2|x),

which we have designated in the examples by channel matrices P1 and P2 of sizes |X| × |Y1| and |X| × |Y2|, respectively. The possible dependence or independence of Y1 and Y2 given X is irrelevant, given the constraint that the decoding at the two receivers must be done independently.

The nth extension of a broadcast channel is the broadcast channel

(X^n, p(y1, y2|x), Y1^n × Y2^n),

where p(y1, y2|x) = Π_{i=1}^{n} p(y1i, y2i|xi), for x ∈ X^n, y1 ∈ Y1^n, y2 ∈ Y2^n.

An ((M1, M2, M12), n) code for a broadcast channel consists of three sets of integers

R = {1, 2, ..., M12}
S1 = {1, 2, ..., M1}
S2 = {1, 2, ..., M2},

an encoding function

x: R × S1 × S2 → X^n,

and two decoding functions

g1: Y1^n → R × S1;  g1(y1) = (r̂1, ŝ1)
g2: Y2^n → R × S2;  g2(y2) = (r̂2, ŝ2).

The set {x(r, s1, s2): (r, s1, s2) ∈ R × S1 × S2} is called the set of codewords. As illustrated in Fig. 6, we think of the integers s1 and s2 as being arbitrarily chosen by the transmitter to be sent to receivers 1 and 2, respectively. The integer r is also chosen by the transmitter and is intended to be received by both receivers. Thus r is the "common" part of the message, and s1 and s2 are the "independent" parts of the message.

An error is made by the ith receiver if gi(yi) ≠ (r, si). If the message (r, s1, s2) is sent, let

λi(r, s1, s2) = Pr{gi(yi) ≠ (r, si)},  i = 1, 2,  (6)

denote the probabilities of error for the two channels, where we note that y1, y2 are the only chance variables in the above expression.

We denote the (arithmetic average) probability of error in decoding (r, s1), averaged over all choices of s2, by

λ̄1(r, s1) = (1/M2) Σ_{s2=1}^{M2} λ1(r, s1, s2).  (7)

Similarly, for channel 2 we define

λ̄2(r, s2) = (1/M1) Σ_{s1=1}^{M1} λ2(r, s1, s2).  (8)

Finally, we define the overall arithmetic average probabilities of error of the code for channels 1 and 2 as

P̄1(e) = (1/(M1M12)) Σ_{r,s1} λ̄1(r, s1) = (1/M) Σ_{r,s1,s2} λ1(r, s1, s2)  (9)

P̄2(e) = (1/(M2M12)) Σ_{r,s2} λ̄2(r, s2) = (1/M) Σ_{r,s1,s2} λ2(r, s1, s2),  (10)

where

M = M1M2M12.  (11)

The overbar on P̄i(e) will serve as a reminder that this probability of error is calculated under a special distribution, namely the uniform distribution over the codewords.

We shall also be interested in the maximal probabilities of error

λi = max_{(r,s1,s2)} Pr{gi(yi) ≠ (r, si)},  i = 1, 2,  (12)

corresponding to the worst codeword with respect to each channel. Note that λi ≥ P̄i(e).

We shall define the rate (R1, R2, R12) of an ((M1, M2, M12), n) code by

R1 = (1/n) log M1M12
R2 = (1/n) log M2M12
R12 = (1/n) log M12,  (13)

defined in bits/transmission. Thus Ri is the total rate of transmission of information to receiver i, i = 1, 2, and R12 is the portion of the information common to both receivers.

Comment: When λi and P̄i(e) refer to the nth extension of a broadcast channel, we will often designate this explicitly by λi(n), P̄i(n)(e).

Definition: The rate (R1, R2, R12) is said to be achievable by a broadcast channel if, for any ε > 0 and for all n sufficiently large, there exists an ((M1, M2, M12), n) code with

M1M12 ≥ 2^{nR1}
M2M12 ≥ 2^{nR2}
M12 ≥ 2^{nR12}  (14)

such that P̄1(n)(e) < ε, P̄2(n)(e) < ε.

Comment: Note that the total number M = M1M2M12 of codewords for a code satisfying (14) must exceed 2^{n(R1+R2−R12)}.

Definition: The capacity region R* for a broadcast channel is the set of all achievable rates (R1, R2, R12). The goal of this paper is to determine R* for as large a class of channels as possible.

Comment: We shall sometimes let R* also denote the set of achievable (R1, R2) pairs. However, at this stage in our understanding, it seems that sole concern with (R1, R2), to the exclusion of concern with R12, would result in a coarsened and cumbersome theoretical development.

Comment: The extension of the definition of the broadcast channel from two receivers to k receivers is notationally cumbersome but straightforward, given the following comment. The index sets R, S1, S2 should be replaced by index sets I(θ), θ ∈ {0, 1}^k, θ ≠ 0, with the interpretation that the integer i(θ) selected in index set I(θ) = {1, 2, ..., M(θ)} is intended (by the proper code selection) to be received correctly by every receiver j for which θj = 1 in θ = (θ1, θ2, ..., θk). Then, for example, the rate of transmission over the nth extension of a broadcast channel to the ith receiver will be given by

Ri = (1/n) Σ_{θ: θi=1} log M(θ).  (15)

In the two-receiver broadcast channel, the corresponding sets in the new notation are R = I(1,1), S1 = I(1,0), S2 = I(0,1).

Section IV treats the best two-channel situation, and Section V treats the worst.
IV. ORTHOGONAL CHANNELS

In this section we shall investigate a broadcast channel in which efficient communication to one receiver in no way interferes with communication to the other. A movie designed to be shown simultaneously to a blind person and a deaf person would be such an example.

Consider the broadcast channel with X = {1, 2, 3, 4}, Y1 = {1, 2}, Y2 = {1, 2}, and channel matrices

P1 = | 1 0 |        P2 = | 1 0 |
     | 1 0 |             | 0 1 |
     | 0 1 |             | 1 0 |
     | 0 1 |             | 0 1 |

as depicted in Fig. 7, where

(Pk)ij = Pr{yk = j | x = i},  k = 1, 2;  j = 1, 2;  i = 1, 2, 3, 4.

Fig. 7. Orthogonal channel.

We easily calculate C1 = C2 = 1 bit/transmission. Clearly, from the standpoint of receiver Y1, inputs x = 1 and x = 2 both result in y1 = 1 with probability 1 and can therefore be merged. Proceeding with this analysis, we find that Y1 can determine only x ∈ {1, 2} versus x ∈ {3, 4}, while Y2 can determine only x ∈ {1, 3} versus x ∈ {2, 4}.

For this example, C1 = 1 and C2 = 1 are, respectively, attained for Pr{x = 1} + Pr{x = 2} = 1/2 and Pr{x = 1} + Pr{x = 3} = 1/2. Solving these simultaneous equations, we find that (I(X; Y1), I(X; Y2)) = (1, 1) can be achieved by pi = 1/4, i = 1, 2, 3, 4. This in itself does not guarantee that (C1, C2) can be simultaneously achieved. However, there does exist a coding scheme for this channel. Let u1 ∈ {1, 2}, u2 ∈ {1, 2} denote the message bits that we wish to transmit to Y1 and Y2, respectively. Make the association from pairs (u1, u2) to input symbols x:

(u1, u2) = (1, 1) ↔ 1
(u1, u2) = (1, 2) ↔ 2
(u1, u2) = (2, 1) ↔ 3
(u1, u2) = (2, 2) ↔ 4

and send the appropriate input symbol x. Then y1 = u1 and y2 = u2, and the capacities C1 and C2 are simultaneously achieved. Since u1 and u2 may be chosen independently, we may also achieve any R12 ≤ 1 by this scheme. Fig. 8 shows the set of achievable rates. The upper bound theorem of Section VIII establishes this region as optimal.

Fig. 8. Achievable rates for the orthogonal channel: (R1, R2) = (1, 1) for any 0 ≤ R12 ≤ 1.

The noiselessness of the channels is not crucial; this broadcast channel remains orthogonal in the sense that (R1, R2) = (C1, C2) may be achieved even if we define the new channels as shown in Fig. 9. In this case, C1 = 1 − H(r1) and C2 = 1 − H(r2). C1 and C2 may be simultaneously achieved by selecting sequences of codes with words in {0, 1}^n such that λ1(n) → 0, λ2(n) → 0 as n → ∞, selecting xi ∈ {1, 2} or xi ∈ {3, 4} according to the value of the ith bit of the codeword chosen to be sent from the first code, and selecting xi ∈ {1, 3} or xi ∈ {2, 4} according to the value of the ith bit of the codeword selected from the second code. Here, any R12 such that 0 ≤ R12 ≤ min{C1, C2} may also be achieved. Nothing more could be expected, and each channel performs no worse in the presence of the other than it would alone.

V. INCOMPATIBLE BROADCAST CHANNELS

In a search to find the worst case of incompatibility in simultaneous communication, we turn to the following practical example which, for obvious reasons, we term the switch-to-talk channel.

Example 1 — Switch-to-Talk: Let

X = X1 ∪ X2
Y1 = Ỹ1 ∪ {φ1}
Y2 = Ỹ2 ∪ {φ2},

with composite channel matrices of the block form

P1 = | P̃1  0 |        P2 = | 0   1 |
     | 0   1 |              | P̃2  0 |  (19)

where P̃1 and P̃2 are the component channel matrices from X1 to Ỹ1 and from X2 to Ỹ2, and the final columns of P1 and P2 correspond to the indicator outputs φ1 and φ2, respectively. Each receiver has an indicator that lights when the sender is communicating with the other receiver. The idea is that when the sender wishes to communicate with Y1 he uses x ∈ X1, resulting in y2 = φ2, indicating to Y2 that the sender is communicating with Y1. Similarly, to communicate with Y2, the sender uses x ∈ X2, resulting in y1 = φ1. This might correspond to the situation, for example, where a speaker fluent in Spanish and Dutch must speak simultaneously to two listeners, one of whom understands only Dutch and the other only Spanish.

Fig. 9. The switch-to-talk channel.

Let channel 1 (the component channel P̃1) have capacity C1(0) and channel 2 have capacity C2(0).
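A sanity check on the timing bonus discussed below: for a component channel of capacity C, the one-receiver rate αC + H(α) is maximized at α* = 2^C/(1 + 2^C), where it equals log(1 + 2^C) — the sum-channel capacity of that receiver's composite channel. The sketch below (the value C = 1 is an arbitrary illustrative choice) confirms this numerically.

```python
import math

def h2(a: float) -> float:
    """Binary entropy in bits."""
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a*math.log2(a) - (1-a)*math.log2(1-a)

def switch_rate(alpha: float, C: float) -> float:
    """Rate to one receiver: alpha*C data bits plus H(alpha) timing bits."""
    return alpha*C + h2(alpha)

C = 1.0                                    # component capacity (arbitrary)
best = max(switch_rate(a/10000, C) for a in range(10001))
sum_channel = math.log2(1 + 2**C)          # Shannon's sum-channel capacity
alpha_star = 2**C / (1 + 2**C)             # claimed maximizer

assert abs(best - sum_channel) < 1e-6
assert abs(switch_rate(alpha_star, C) - sum_channel) < 1e-12
```

So the H(α) bits contributed by modulating the switch exactly make up the difference between the time-shared rate αC and the full sum-channel capacity.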
Using the known result for sum channels (see Shannon [3]), we find that the individual capacities of the two composite channels are

C1 = log(1 + 2^{C1(0)})
C2 = log(1 + 2^{C2(0)}).

We shall discuss this example informally. Certainly (R1, R2) = (C1, 0) is achievable and (R1, R2) = (0, C2) is achievable, and hence, by time sharing, any pair of rates (R1, R2) = (λC1, λ̄C2), 0 ≤ λ ≤ 1, is achievable. However, additional information is contained in the knowledge of the transmission times to Y1 and Y2, and proper encoding of these times can be used to send extra information to both receivers. If channel 1 is used a proportion α of the time, αC1(0) bits/transmission are received by Y1. However, H(α) additional bits/transmission are achieved by choosing which channel to send through independently at each instant by flipping a coin with bias α. In other words, modulation of the switch-to-talk button, subject to the time-proportion constraint α, allows the perfect transmission of one of 2^{nH(α)} additional messages to both receivers Y1 and Y2.

Thus all (R1, R2) of the form

(R1, R2) = (αC1(0) + H(α), ᾱC2(0) + H(α))

can be achieved by choosing the pattern of the n transmissions devoted to the use of channel 1 in one of the 2^{nH(α)} possible ways. This bound cannot be achieved unless the information rate R12 common to both channels satisfies R12 ≥ H(α). The results are summarized in Fig. 10. It is an easy consequence of Section VIII that Fig. 10 corresponds to the capacity region for this channel, and therefore that this encoding scheme is optimal for the switch-to-talk channel.

Fig. 10. Achievable rates for switch-to-talk channels.

The following example illustrates the worst case that can arise in simultaneous communication.

Example 2 — Incompatible Case: Let

X = {1, 2, 3, 4},  Y1 = {1, 2},  Y2 = {1, 2},

with channel matrices

P1 = | 1    0   |        P2 = | 1/2  1/2 |
     | 0    1   |             | 1/2  1/2 |
     | 1/2  1/2 |             | 1    0   |
     | 1/2  1/2 |             | 0    1   |

as shown in Fig. 11. Thus if X wishes to communicate with Y1 over the perfect channel x ∈ {1, 2} ↔ Y1, he must send pure noise to Y2, i.e., Pr{y2 = 1 | x ∈ {1, 2}} = 1/2. A similar statement holds for X communicating with Y2.

Fig. 11. Incompatible broadcast channels.

In Section VIII we shall establish an upper bound on the capacity region by finding the set of all achievable (I(X; Y1), I(X; Y2)) pairs. Anticipating these results, we shall make this calculation for this example. Let Pr{x = i} = pi, i = 1, 2, 3, 4. Define α = p1 + p2, ᾱ = p3 + p4. Then H(Y1) = H(p1 + ᾱ/2) and H(Y1|X) = ᾱ, yielding

I(X; Y1) = H(p1 + ᾱ/2) − ᾱ.

Similarly,

I(X; Y2) = H(p4 + α/2) − α.

First, fixing α and maximizing over 0 ≤ p1 ≤ α and 0 ≤ p4 ≤ ᾱ, we find the maximum values

I(X; Y1) = 1 − ᾱ = α
I(X; Y2) = 1 − α = ᾱ,

achieved by p1 = p2 = α/2 and p3 = p4 = ᾱ/2. This gives the upper boundary of achievable (I(X; Y1), I(X; Y2)) pairs. It may also be verified that, for any α ∈ [0, 1], there exist p1, p2, p3, p4 achieving any (I(X; Y1), I(X; Y2)) dominated by (α, 1 − α). Thus we have the set I of achievable (I(X; Y1), I(X; Y2)) pairs. In Section VIII it will be shown that this region of jointly achievable (I(X; Y1), I(X; Y2)) pairs is an upper bound on the capacity region.
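The claimed maximizing distribution p1 = p2 = α/2, p3 = p4 = ᾱ/2 can be confirmed by brute force. A small sketch (the value α = 0.6 and the grid resolution are arbitrary choices of mine):

```python
import math

def H(probs):
    """Entropy in bits of a finite distribution."""
    return -sum(p*math.log2(p) for p in probs if p > 0)

def mutual_infos(p1, p2, p3, p4):
    """(I(X;Y1), I(X;Y2)) for the incompatible channels of Example 2."""
    a  = p1 + p2          # alpha: weight on the inputs perfect for Y1
    ab = p3 + p4          # alpha-bar
    I1 = H([p1 + ab/2, p2 + ab/2]) - ab   # H(Y1) - H(Y1|X)
    I2 = H([p3 + a/2,  p4 + a/2])  - a    # H(Y2) - H(Y2|X)
    return I1, I2

alpha = 0.6
# Claimed maximizer: split each half of the mass evenly.
I1_star, I2_star = mutual_infos(alpha/2, alpha/2, (1-alpha)/2, (1-alpha)/2)

# Brute-force check over a grid of splits with p1+p2 = alpha fixed.
best = 0.0
for i in range(101):
    p1 = alpha * i/100
    for j in range(101):
        p3 = (1-alpha) * j/100
        I1, _ = mutual_infos(p1, alpha - p1, p3, (1-alpha) - p3)
        best = max(best, I1)
```

The grid search never beats (α, ᾱ), in agreement with the boundary computed above.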
However, we can trivially achieve any pair of rates (R1, R2) on the upper boundary of I by simply time-sharing the two noiseless channels x ∈ {1, 2} ↔ Y1 and x ∈ {3, 4} ↔ Y2. If x ∈ {1, 2} is used a proportion α of the time, then the rates R1 = α and R2 = ᾱ = 1 − α may be achieved without any additional coding. Thus the upper bound can be achieved with trivial coding procedures, and Fig. 12 therefore corresponds to the capacity region.

Fig. 12. Capacity region for incompatible channels.

Here, then, is an example in which the two channels are so incompatible that one can do no better than time sharing, i.e., using one channel efficiently part of the time and the other channel the remainder. Fortunately, for those wishing to get something for nothing, this is the exception rather than the rule.

VI. THE BOTTLENECK CHANNEL

Consider the broadcast channel in which the two channels have the same structure, i.e.,

p1(y1|x) = p2(y2|x),  x ∈ X,  y1, y2 ∈ Y1 = Y2 = Y,

as shown in Fig. 13. We shall term this the bottleneck channel.

Fig. 13. The bottleneck channel.

Here, we note that any code for receiver Y1 is also a code with the same error properties for receiver Y2. Thus Y1 and Y2 both perceive correctly the transmitted sequence x with low probability of error. Let the capacity of channel P be denoted by C1 = C2 = C bits per transmission. Now, since both receivers receive the same information about X, it follows that both receivers 1 and 2 will be able to correctly recover r, s1, and s2 if and only if (R1, R2, R12) is an achievable rate. Counting the number of messages per unit time necessary to transmit (r, s1, s2) correctly yields the following proposition [see the comment following (14)].

Proposition: (R1, R2, R12) is an achievable rate for the broadcast bottleneck channel of capacity C if and only if

R1 + R2 − R12 ≤ C
0 ≤ R1 ≤ C
0 ≤ R2 ≤ C
0 ≤ R12 ≤ C.

Fig. 14. Achievable rates for the bottleneck channel.

As an important application of these ideas, suppose that we wish to send a random process U = {Un: n = 1, 2, ...} to receiver 1 and a random process V = {Vn: n = 1, 2, ...} to receiver 2 through the bottleneck channel P with arbitrarily small probability of error. (See Fig. 15.) Assume that U = {Un} and V = {Vn} are jointly ergodic processes taking values in finite alphabets. By jointly ergodic, we mean that the process Zn = (Un, Vn) is ergodic. We recall that the entropy of an ergodic process {Zn} is defined by

H(Z) = lim_{n→∞} (1/n) H(Z1, Z2, ..., Zn).  (23)

We assert the following.

Fact: Asymptotically error-free transmission of {U1, U2, ..., Un} to receiver 1 and {V1, V2, ..., Vn} to receiver 2 over the bottleneck channel of capacity C can be accomplished if and only if

H(U, V) < C.  (24)

Proof: The well-known idea of the encoding is to enumerate the 2^{n(H(U,V)+ε)} ε-typical sequences and send the index of the actually occurring sequence (z1, z2, ..., zn) over the channel. If H(U, V) + ε < C, then this index will be correctly transmitted with probability of error ≤ ε/2 for sufficiently large n. Since the probability that a random (z1, z2, ..., zn) will be typical can be made ≥ 1 − ε/2 for sufficiently large n, the overall probability of error can be made less than ε. The converse follows the standard argument for a single channel.

The generalization of this result to arbitrary broadcast channels is unknown.

Let us now compare the orthogonal channel with the bottleneck channel. The orthogonal channel of Section IV achieves (R1, R2) = (1, 1) with arbitrary joint rate 0 ≤ R12 ≤ 1. Thus fully independent messages (R12 = 0) or maximally dependent messages (R12 = 1) can be sent simultaneously to receivers 1 and 2. At the other extreme, in the case of the bottleneck channel with capacity C = 1, we can simultaneously achieve R1 = 1, R2 = 1. Here, however, it may be seen that achieving (R1, R2) = (1, 1) implies R12 = 1.
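The region of the Proposition is simple enough to test mechanically, and doing so makes the observation about the bottleneck corner concrete: with C = 1, the point R1 = R2 = 1 is achievable only with R12 = 1. A minimal sketch of the membership test:

```python
def bottleneck_achievable(R1, R2, R12, C, tol=1e-12):
    """Membership test for the bottleneck-channel region of the Proposition."""
    return (R1 + R2 - R12 <= C + tol
            and -tol <= R1 <= C + tol
            and -tol <= R2 <= C + tol
            and -tol <= R12 <= C + tol)

C = 1.0
# (1, 1) is achievable only with a fully common message:
assert bottleneck_achievable(1, 1, 1, C)
assert not bottleneck_achievable(1, 1, 0.9, C)   # R1 + R2 - R12 = 1.1 > C
# Independent messages (R12 = 0) must split the capacity:
assert bottleneck_achievable(0.5, 0.5, 0.0, C)
```

The constraint R1 + R2 − R12 ≤ C forces R12 ≥ 1 at the corner, which combined with R12 ≤ C gives R12 = 1 exactly.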
Thus the messages sent to receivers 1 and 2 must be maximally dependent, and in fact equal.

Fig. 15. Sending two random processes over the same channel.

VII. GAUSSIAN CHANNELS

Consider the time-discrete Gaussian broadcast channel with two receivers depicted in Fig. 16. Let z1 = (z11, z12, ..., z1n, ...) be a sequence of independent identically distributed (i.i.d.) normal random variables (RV's) with mean zero and variance N1, and let z2 = (z21, z22, ..., z2n, ...) be i.i.d. normal RV's with mean zero and variance N2. Let N1 < N2. At the ith transmission the real number xi is sent, and y1i = xi + z1i and y2i = xi + z2i are received. In our analysis it is irrelevant whether z1i and z2i are correlated or not (although in the feedback case it may make a difference). Let there be a power constraint on the transmitted power, given for any n by

(1/n) Σ_{i=1}^{n} xi² ≤ S  (25)

for any signal x = (x1, x2, ..., xn) of block length n.

Fig. 16. Gaussian broadcast channel.

It is well known that the individual capacities are C1 = (1/2) log(1 + S/N1) and C2 = (1/2) log(1 + S/N2) bits/transmission, where all logarithms are to the base 2. Time sharing will achieve any convex combination of (C2, C2) and (C1, 0), as shown in Fig. 17.

Fig. 17. Time-sharing rates for the Gaussian broadcast channel.

Now let us see how we can improve on this performance. Think of the signal s2 (intended for the high-noise receiver Y2) as a sequence of i.i.d. N(0, ᾱS) RV's. Superimposed on this sequence will be a sequence s1 that may be considered as a sequence of i.i.d. N(0, αS) RV's, where 0 ≤ α ≤ 1 and ᾱ = 1 − α. Thus the sequence s = s1 + s2 will be a sequence of i.i.d. N(0, S) RV's. The received sequences are y1 = s1 + s2 + z1 and y2 = s1 + s2 + z2, as shown in Fig. 18.

Fig. 18. Decomposition of the signal.

Now s1 and z2 are considered to be noise by receiver 2. We see that the s1i + z2i are i.i.d. N(0, αS + N2) RV's. Therefore messages may be sent at rates less than

C2(α) = (1/2) log (1 + ᾱS/(αS + N2))

to receiver Y2 with probability of error near zero for sufficiently large block length n. That is, there exists a sequence of (2^{n(C2(α)−ε)}, n) codes with average power constraint ᾱS and probability of error P̄2(n)(e) → 0.

Now, since N1 < N2, receiver Y1 may also correctly determine the transmitted sequence s2 with arbitrarily low probability of error. Upon decoding of s2, given y1, receiver Y1 then subtracts s2 from y1, yielding ỹ1 = s1 + z1. At this stage channel 1 may be considered to be a Gaussian channel with input power constraint αS and additive zero-mean Gaussian noise with variance N1. The capacity of this channel is (1/2) log [1 + (αS/N1)] = C1(α) bits/transmission, and it is achieved, roughly speaking, by choosing 2^{nC1(α)} independent n-sequences of i.i.d. N(0, αS) RV's as the code set for the possible sequences s1. Thus receiver Y1 correctly receives both s1 and s2.

This informal argument indicates that the rates

R1 = (1/2) log (1 + αS/N1)
R2 = (1/2) log (1 + ᾱS/(αS + N2))  (26)

may be simultaneously achieved, for any 0 ≤ α ≤ 1. These rate pairs, shown in Fig. 19, dominate the time-sharing rates.

Fig. 19. Set of achievable rates for the Gaussian broadcast channel.
Fig. 20. Upper bound on capacity region.
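The dominance of the rates (26) over time sharing can be checked numerically: fix a target R2, solve for the α that attains it by superposition, and compare the resulting R1 with the time-sharing line through (C2, C2) and (C1, 0). A sketch, with the arbitrary illustrative values S = 10, N1 = 1, N2 = 10:

```python
import math

def C(snr):
    """Gaussian channel capacity in bits per transmission."""
    return 0.5 * math.log2(1 + snr)

S, N1, N2 = 10.0, 1.0, 10.0          # power and noise variances (N1 < N2)
C1, C2 = C(S/N1), C(S/N2)

def superposition(alpha):
    """Rate pair (R1, R2) of eq. (26) for power split alpha."""
    R1 = C(alpha*S / N1)
    R2 = C((1-alpha)*S / (alpha*S + N2))
    return R1, R2

# Target R2 = C2/2: time sharing reaches it midway between (C2, C2)
# and (C1, 0), i.e., with R1 = (C1 + C2)/2.
target_R2 = C2 / 2
lo, hi = 0.0, 1.0
for _ in range(100):                  # bisection: R2 decreases in alpha
    mid = (lo + hi) / 2
    if superposition(mid)[1] > target_R2:
        lo = mid
    else:
        hi = mid
alpha = (lo + hi) / 2
R1_sup, R2_sup = superposition(alpha)
R1_ts = (C1 + C2) / 2                 # time-sharing R1 at the same R2
```

At equal R2, the superposition R1 strictly exceeds the time-sharing R1, as Fig. 19 indicates.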
Let Thus we proceed to define R and establish as an upper pairs of scqucnccs, in which the first sequence is chosen from the first set and the second sequence is chosen from I=((X Y),I(XI Y2)lp(x)20, 2p(x)=1(29) the second set, and the pairs are chosen in all possible ways. A message denote the sct of all pairs(I(XI Y1), I(XI Y2)) gcncrated by P(x)as p( )ranges over the simplex of possible probability (r,s1),r∈{1,2,…,2mC2())},s1∈{1,2 distributions on X. Define to be the convex hull of I Thus I may be interpreted as the average joint mutual is transmitted by selecting the nsequence corresponding to information achievable by varying p( )with timc. Let the sum of the rth sequence in the first set and the s,th sequence in the second set Receiver 1 is intended to decode R=[(RI, R2EE2IRIsI, R2 I2, (r, s corrcctly and rcccivcr 2 is intended to decode r correctly, thus simultaneously achieving rates for some(I1,I2)∈Ⅰ}.(30) R1=C1(x)+C2(a)2E Thus intuitively corresponds to the joint mutual infor mation achievable from i by throwing information away. R2=C2(x)E (27) These sets are depicted in Fig. 20. We now show *s R as given in (26) Lemma 1: Given an arbitrary ((M1,M2, M12); n) code for A full discussion of the Gaussian channel would lead far the nth extension of a broadcast channel, consisting of afield. A direct simple proof of the achievability of the rates words x(r, s1 S2EX, ER,S1 ES1,S2ES2,IR=Miz given in(27)has been found but will not be presented here 1Mi,s2I=M2,MM12M1M2ilet (r, 15 52)be a We shall conclude this section with one observation. If random variable with range R X S x S2. Let (J1,J2) NI =0, and channel l is therefore perfect, we have Cl Yi"x Y2" be the corresponding random output nsequences and C2 =i log(1 S/N compound channel or received by I and 2, generated by sending x(, S12)over the maximin approach would have us send at rates(R, R))=channel. If Pi(e)=Pr((1 51)*(1,51)y and p2(e) (C2, C2). 
However, an arbitrarily small decrement in the rate Pr ((1 52)*(1, s2)) are the receiver probabilities of error for channel 2, corresponding to 0 a<1 in(26), yields of the code, then R1,R2)=(a, C2  as a pair of achievable rates Although this rate pair docs not dominate(C2, c2), it sccms H(X Y<I+log M2 pi(e) log M12M (31) vastly preferable H(r Y2)s 1 +log Mi+P2(e) log M,2M2.(32) VIIl. AN UPPER BOUND ON ACHIEVABLE RATES(R1, R2 Proof: Let the decoding rules corresponding to the Suppose that plx), a probability distribution on X, code be generates the pair of mutual informations (I(KI), 91:Y1→R×S1 I XI), where, for i= 1, 2 g2:￥2n→R×S 3) XY)=∑∑x)pyx)log(y!x) (28)written xEXyEYi p y 9k(yx)=(9k1(yk)k2(y),k=1,2 Given the intuitive properties of mutual information, it is natural w assume Lha rates r=l(X Y) ), R2=l(X Y2) Thus, given random message (r,1, 2) and sequence are therefore simultaneo usly achievable. This turns out not Dk E yn. receiver k will make an error if and only if to be the case. i Close inspection of the example of two bsc 91(y1)≠(r,1),k=1 in Section II, with Pr &x=1=, and I(XIYi=1, I(XIY2)=C2, will yield a counterexample. However, the 92(y2)≠(,2),k=2 (34) COVER BROADCAST CHANNELS We now wish to show that p, e),p,(nl(e)cannot P(e)=Pr{91(y1)≠(,1) simultaneously tend to zero for rates(rir2)R. This will establish iR as an upper bound on the capacity region for a p2(e)=Pr{g2(y2)≠(r,2) (5) broadcast channel We note that Let R1=1/n log M,M12 and r,=1/n log M2Mi2 be the rates of communication in bits/transmission for receivers H(r lyis H(pe), ip, l yu)) Y, and Y2, respectively. 
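For small alphabets the set I of (29) can be tabulated directly by sweeping p(x). The sketch below does so for a pair of BSC's with crossovers 0 and 0.1 — arbitrary stand-in values for the two-BSC example, with channel 1 noiseless so that the uniform input generates the extreme pair (1, C2) discussed above.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p*math.log2(p) - (1-p)*math.log2(1-p)

def I_bsc(q, p):
    """I(X;Y) for a BSC with crossover p and input Pr{x=1} = q."""
    r = q*(1-p) + (1-q)*p        # output probability of a 1
    return h2(r) - h2(p)

p1, p2 = 0.0, 0.1                # crossovers (illustrative choices)
C2 = 1 - h2(p2)

# Sweep p(x) to trace the curve I of eq. (29).
curve = [(I_bsc(q/1000, p1), I_bsc(q/1000, p2)) for q in range(1001)]

# The uniform input generates the pair (1, C2).
I1_half, I2_half = I_bsc(0.5, p1), I_bsc(0.5, p2)
```

Note that (1, C2) lies in I, yet — as stated above — the corresponding rate pair is not achievable; this is precisely why R̄ is only an upper bound argument and needs Lemma 1 to be made rigorous.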
Proof: Let the decoding rules corresponding to the code be

g1: Y1^n → R × S1
g2: Y2^n → R × S2,  (33)

written

gk(yk) = (gk1(yk), gk2(yk)),  k = 1, 2.

Thus, given the random message (r, s1, s2) and received sequence yk ∈ Yk^n, receiver k will make an error if and only if

g1(y1) ≠ (r, s1),  k = 1
g2(y2) ≠ (r, s2),  k = 2.  (34)

We note that

P̄1(e) = Pr{g1(y1) ≠ (r, s1)}
P̄2(e) = Pr{g2(y2) ≠ (r, s2)}.  (35)

We have

H(x|y1) ≤ H(p1(e|y1), 1 − p1(e|y1)) + (1 − p1(e|y1)) log M2 + p1(e|y1) log M,  (36)

where p1(e|y1) denotes the conditional probability of error given y1, and where we have used the inequality

H(α1, α2, ..., αm) ≤ log m  (37)

and a basic composition relation (see Ash [1]). We have, of course, conditioned on the events g1(y1) = (r, s1) and g1(y1) ≠ (r, s1). Taking the expectation over Y1^n and using the convexity of H(p, 1 − p) in p, we have

H(X|Y1) ≤ H(P̄1(e), 1 − P̄1(e)) + (1 − P̄1(e)) log M2 + P̄1(e) log M.  (38)

Finally, since H(p, 1 − p) ≤ 1 and M = M12M1M2, we have

H(X|Y1) ≤ 1 + log M2 + P̄1(e) log (M/M2) = 1 + log M2 + P̄1(e) log M12M1.  (39)

The corresponding argument for H(X|Y2) completes the proof.

We shall need the following lemma (see Ash [1, p. 81]).

Lemma 2: Let X1, ..., Xn be a sequence of input random variables to the (discrete memoryless) broadcast channel, and let Yk1, ..., Ykn be the corresponding received output random variables for k = 1, 2. Then

I(X1, ..., Xn; Yk1, ..., Ykn) ≤ Σ_{i=1}^{n} I(Xi; Yki),  k = 1, 2,

with equality iff Yk1, Yk2, ..., Ykn are independent.

Proof:

H(Yk1, Yk2, ..., Ykn | X1, ..., Xn) = −Σ_{x,yk} p(x, yk) log pk(yk|x),

but, because the channel is memoryless, pk(yk|x) factors into the product Π_{i=1}^{n} pk(yki|xi), yielding

H(Yk1, ..., Ykn | X1, ..., Xn) = −Σ_{x,yk} p(x, yk) Σ_{i=1}^{n} log pk(yki|xi) = Σ_{i=1}^{n} H(Yki|Xi).

Also, by a basic inequality,

H(Yk1, ..., Ykn) ≤ Σ_{i=1}^{n} H(Yki),

with equality iff the Yki, i = 1, 2, ..., n, are independent. Since I(X; Yk) = H(Yk) − H(Yk|X), the lemma follows.

We now wish to show that P̄1(n)(e), P̄2(n)(e) cannot simultaneously tend to zero for rates (R1, R2) ∉ R̄. This will establish R̄ as an upper bound on the capacity region for a broadcast channel. Let R1 = (1/n) log M1M12 and R2 = (1/n) log M2M12 be the rates of communication in bits/transmission for receivers Y1 and Y2, respectively. We recall that R12 = (1/n) log M12 is the transmission rate for information common to both channels. The proof closely resembles that used by Shannon [4] for the two-way channel.

Theorem: For any sequence of ((2^{nR1}, 2^{nR2}, 2^{nR12}), n) codes, (R1, R2) ∉ R̄ implies that

(P̄1(n)(e), P̄2(n)(e)) does not tend to (0, 0).

Thus R̄ is an upper bound on the capacity region for the broadcast channel.

Proof: Given an arbitrary ((M1, M2, M12), n) code for the nth extension of the broadcast channel, choose a codeword x(r, s1, s2) at random according to the uniform distribution Pr{r, s1, s2} = 1/M, (r, s1, s2) ∈ R × S1 × S2, where M = |R||S1||S2|. If the codewords x(r, s1, s2) ∈ X^n are not distinct, a simple modification of the proof below will prove the theorem. Thus, treating the case where the x(r, s1, s2) are distinct, we have H(X) = log M and I(X; Y1) = log M − H(X|Y1) under the given uniform distribution on the codewords. As in Section III, let P̄1(n)(e) and P̄2(n)(e) designate the probabilities of error of the code under this distribution.

By Lemma 2,

I(X; Y1) ≤ Σ_{i=1}^{n} I(Xi; Y1i).  (41)

Hence

I(X; Y1) = log M − H(X|Y1) ≤ Σ_{i=1}^{n} I(Xi; Y1i).  (42)

Finally, since (31) in Lemma 1 holds for any distribution on the codewords, substitution in (42) yields

log M − 1 − log M2 − P̄1(n)(e) log M12M1 ≤ Σ_{i=1}^{n} I(Xi; Y1i),  (43)

which becomes the basic inequality

R1 ≜ (1/n) log M12M1 ≤ [(1/n) + (1/n) Σ_{i=1}^{n} I(Xi; Y1i)] / (1 − P̄1(n)(e)).  (44a)

Similarly, we find

R2 ≜ (1/n) log M12M2 ≤ [(1/n) + (1/n) Σ_{i=1}^{n} I(Xi; Y2i)] / (1 − P̄2(n)(e)).  (44b)

Summarizing, an arbitrary code for the nth extension of a broadcast channel must have rates (R1, R2) satisfying (44a) and (44b), where

P̄i(n)(e) = (1/M) Σ_{r,s1,s2} λi(r, s1, s2),  i = 1, 2.  (45)

Now suppose (R1, R2) ∉ R̄, R1 ≥ 0, R2 ≥ 0, as in Fig. 21.
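Lemma 2 can be verified exhaustively for a tiny case: n = 2 uses of a memoryless BSC with correlated inputs, where the inequality is strict because the outputs are dependent. A sketch (the crossover and the input correlation are arbitrary choices of mine):

```python
import itertools, math

def bsc(p):
    return [[1-p, p], [p, 1-p]]

def mutual_information(pxy):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), v in pxy.items():
        px[x] = px.get(x, 0) + v
        py[y] = py.get(y, 0) + v
    I = 0.0
    for (x, y), v in pxy.items():
        if v > 0:
            I += v * math.log2(v / (px[x] * py[y]))
    return I

p = 0.1
W = bsc(p)
# Correlated input pair (X1, X2): an arbitrary illustrative joint pmf.
pX = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Joint pmf of ((X1,X2), (Y1,Y2)) under the memoryless channel.
pXY = {}
for (x1, x2), v in pX.items():
    for y1, y2 in itertools.product(range(2), repeat=2):
        pXY[((x1, x2), (y1, y2))] = v * W[x1][y1] * W[x2][y2]

# Block mutual information vs the single-letter sum of Lemma 2.
I_block = mutual_information(pXY)
pX1Y1, pX2Y2 = {}, {}
for ((x1, x2), (y1, y2)), v in pXY.items():
    pX1Y1[(x1, y1)] = pX1Y1.get((x1, y1), 0) + v
    pX2Y2[(x2, y2)] = pX2Y2.get((x2, y2), 0) + v
I_sum = mutual_information(pX1Y1) + mutual_information(pX2Y2)
```

Here I_block < I_sum strictly, since the correlated inputs make Y1 and Y2 dependent — exactly the equality condition identified in the lemma.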