Chapter 4. Channel and Channel Capacity
Information Theory and Coding

Review – properties of the mutual information function: Property 1
• Relationship between average mutual information and the channel input probability distribution.
Property 1: I(X;Y) is an upper convex (i.e., concave, ∩-shaped) function of the channel input probability distribution p(x).
[Figure: sketch of I(X;Y) as a ∩-shaped curve plotted against p(x).]

Review – properties of the mutual information function: Property 2
• Relationship between average mutual information and the channel transition probability distribution.
Property 2: I(X;Y) is a convex (∪-shaped) function of the channel transition probability distribution p(y|x).
[Figure: sketch of I(X;Y) as a ∪-shaped curve plotted against p(y|x).]

• What is a channel?
– The channel is a carrier transmitting messages, a passage through which the signal passes.
– The information is abstract, but the channel is concrete. For instance: if two people converse, the air is the channel; if they call each other, the telephone line is the channel; if we watch television or listen to the radio, the space between the receiver and the transmitter is the channel.

4.1 The model and classification of the channel
• In this part we mainly introduce two topics:
– Channel models
– Channel classifications

4.1.1 Channel models
• We can treat the channel as a converter that transforms events; the channel model can be depicted as in the following figure.
[Figure: input x → noisy channel P(y|x) → output y.]
• The Binary Symmetric Channel (BSC) is the simplest channel model.
• A BSC is shown below:
[Figure: inputs 0, 1 and outputs 0, 1; transitions 0→0 and 1→1 with probability 1−p, transitions 0→1 and 1→0 with probability p.]

DMC
• We assume that the channel and the modulation are memoryless. The inputs and outputs can then be related by a set of conditional probabilities
$$P(Y = y_j \mid X = x_i) = P(y_j \mid x_i), \qquad i = 0, 1, \dots, q-1, \quad j = 0, 1, \dots, Q-1.$$
• This channel is known as a Discrete Memoryless Channel (DMC) and is depicted as
[Figure: inputs $x_0, x_1, \dots, x_{q-1}$ connected to outputs $y_0, y_1, \dots, y_{Q-1}$ by the transition probabilities $P(y_j \mid x_i)$.]

4.1.2 Channel classifications
• Channels can be classified into several types:
1) Parameter type: constant-parameter channel / changeable-parameter channel.
2) User type: two-user channel (point to point) / multi-user channel (communication network).
3) Type of media:
– Solid media: open wire, symmetrical balanced cable, fine coaxial cable, coaxial cable.
– Air media: long-wave AM, short-wave FM, mobile, horizon relay, microwave, tropospheric scattering, ionospheric, satellite, light.
– Mixed media: waveguide, cable.
4) Signal/interference type: discrete / continuous / semi-discrete–semi-continuous; memoryless / with memory; no interference / with interference (thermal noise, impulse noise, linear superposition, intermodulation, multiplicative interference, fading, inter-symbol interference).

4.2 Channel doubt degree and average mutual information
• 4.2.1 Channel doubt degree
• 4.2.2 Average mutual information
• 4.2.3 Properties of the mutual information function
• 4.2.4 Relationship between entropy, channel doubt degree and average mutual information

4.2.1 Channel doubt degree
• Assume the random variable X denotes the input set of the channel and the random variable Y denotes the output set of the channel. The channel doubt degree (equivocation) is
$$H(X \mid Y) = E_Y\big[H(X \mid b_j)\big] = -\sum_{j}\sum_{i} p(a_i b_j) \log p(a_i \mid b_j),$$
where
$$H(X \mid b_j) = \sum_{i=1}^{n} p(a_i \mid b_j)\, I(a_i \mid b_j) = -\sum_{i=1}^{n} p(a_i \mid b_j) \log p(a_i \mid b_j).$$
The meaning of the channel doubt degree H(X|Y) is the average uncertainty that still remains about the source X after the receiving terminal gets the message Y. In fact, this residual uncertainty comes from the noise in the channel.
• Since $H(X \mid Y) \le H(X)$, if the average uncertainty of the source X is H(X), then on receiving the output message Y we gain some amount of information that eliminates part of the uncertainty about X. This leads to the following concept of average mutual information.

4.2.2 Average mutual information
• The average mutual information is the entropy of the source X minus the channel doubt degree:
$$I(X;Y) \triangleq H(X) - H(X \mid Y).$$
• Its meaning is the average information about X that the receiver obtains from every symbol it receives once it gets the message Y.

4.2.3 Properties of the mutual information function
• Property 1: Relationship between mutual information and the channel input probability distribution.
• I(X;Y) is an upper convex (concave) function of the channel input probability distribution P(X). This is shown in Fig. 4.5 and Fig. 4.6.
Fig. 4.5. I(X;Y) is an upper convex function of P(X).
Fig. 4.6. A message passing through the channel: X → channel P(Y|X) → Y.
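To make the definitions of 4.2.1–4.2.2 and Property 1 concrete, here is a minimal numerical sketch in Python. It is an added illustration, not part of the lecture: the 2×3 channel matrix W, the input distributions, and the helper name mutual_information are arbitrary choices. It computes the channel doubt degree H(X|Y) and I(X;Y) = H(X) − H(X|Y), then checks the upper-convexity (concavity) inequality on one convex combination of input distributions.

```python
import numpy as np

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) = H(X) - H(X|Y) in bits for a discrete memoryless channel."""
    p_x = np.asarray(p_x, dtype=float)
    W = np.asarray(p_y_given_x, dtype=float)      # rows: inputs x, columns: outputs y
    p_xy = p_x[:, None] * W                       # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)                        # output distribution p(y)
    h_x = -np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0]))             # H(X)
    p_x_given_y = np.divide(p_xy, p_y, out=np.zeros_like(p_xy), where=p_y > 0)
    mask = p_xy > 0
    h_x_given_y = -np.sum(p_xy[mask] * np.log2(p_x_given_y[mask]))  # channel doubt degree H(X|Y)
    return h_x - h_x_given_y

# Illustrative 2-input / 3-output channel matrix (values chosen only for this demo).
W = [[0.90, 0.05, 0.05],
     [0.10, 0.60, 0.30]]

# Property 1: I(X;Y) is upper convex (concave) in the input distribution p(x), i.e.
# I(lam*p1 + (1-lam)*p2) >= lam*I(p1) + (1-lam)*I(p2) for the fixed channel W.
p1, p2, lam = np.array([0.2, 0.8]), np.array([0.7, 0.3]), 0.4
lhs = mutual_information(lam * p1 + (1 - lam) * p2, W)
rhs = lam * mutual_information(p1, W) + (1 - lam) * mutual_information(p2, W)
print(lhs, rhs, lhs >= rhs)   # the inequality holds (prints True)
```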
• E.g. 4.1 Consider a binary (two-element) channel. The input probability distribution is
$$\begin{pmatrix} X \\ P(X) \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ \omega & 1-\omega \end{pmatrix}$$
and the channel matrix is
$$P = \begin{pmatrix} p_{11} & p_{12} \\ p_{21} & p_{22} \end{pmatrix} = \begin{pmatrix} \bar p & p \\ p & \bar p \end{pmatrix}, \qquad \bar p = 1 - p,$$
where p is the probability of transmission error.
Then the mutual information is
$$\begin{aligned}
I(X;Y) &= H(Y) - H(Y \mid X) \\
&= H(Y) - \sum_{X}\sum_{Y} p(x)\, p(y \mid x) \log \frac{1}{p(y \mid x)} \\
&= H(Y) - \sum_{X} p(x)\Big(p \log \frac{1}{p} + \bar p \log \frac{1}{\bar p}\Big) \\
&= H(Y) - \Big(p \log \frac{1}{p} + \bar p \log \frac{1}{\bar p}\Big) \\
&= H(Y) - H(p).
\end{aligned}$$
And we can get the following results:
$$P(Y=0) = P(X=0)P(Y=0 \mid X=0) + P(X=1)P(Y=0 \mid X=1) = \omega \bar p + \bar\omega p,$$
$$P(Y=1) = P(X=0)P(Y=1 \mid X=0) + P(X=1)P(Y=1 \mid X=1) = \omega p + \bar\omega \bar p,$$
where $\bar\omega = 1 - \omega$, so
$$H(Y) = (\omega \bar p + \bar\omega p)\log\frac{1}{\omega \bar p + \bar\omega p} + (\omega p + \bar\omega \bar p)\log\frac{1}{\omega p + \bar\omega \bar p} = H(\omega \bar p + \bar\omega p).$$
So
$$I(X;Y) = H(\omega \bar p + \bar\omega p) - H(p).$$
The average mutual information diagram is shown in the following Fig. 4.7.
[Figure: I(X;Y) plotted against ω on [0, 1]; the curve peaks at the value 1 − H(p) at ω = 1/2.]
Fig. 4.7. Mutual information of the binary symmetric channel.
From the diagram we can see that when the input symbols are equiprobable (ω = 1/2), the average mutual information I(X;Y) reaches its maximum value 1 − H(p), and only in this case does the receiver get the largest amount of information from every symbol it receives.

• Property 2: Relationship between mutual information and the channel transition probability distribution.
• I(X;Y) is a convex (∪-shaped) function of the channel transition probability distribution p(Y|X).
[Figure: sketch of I(X;Y) as a ∪-shaped curve plotted against P(Y|X).]
Fig. 4.8. I(X;Y) is a convex function of P(Y|X).
• E.g. 4.2 (This is a follow-up of E.g. 4.1.) Consider the same binary channel. From E.g. 4.1 we know that, when the source distribution (ω, 1−ω) is fixed, the average mutual information is
$$I(X;Y) = H(\omega \bar p + \bar\omega p) - H(p),$$
which is a convex (∪-shaped) function of p, as can be seen from the following diagram.
[Figure: I(X;Y) plotted against p on [0, 1] for fixed ω; the curve equals H(ω) at p = 0 and p = 1 and falls to 0 at p = 1/2. Caption: Mutual information of the fixed binary source.]
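The closed form I(X;Y) = H(ωp̄ + ω̄p) − H(p) from E.g. 4.1 and the behaviour described in E.g. 4.2 can be checked numerically. The Python sketch below is an added illustration, not part of the lecture; the function names and the parameter choices p = 0.1 and ω = 0.3 are arbitrary. It confirms that the maximum 1 − H(p) is reached at ω = 1/2 (Fig. 4.7), and that for fixed ω the curve over p equals H(ω) at the endpoints and vanishes at p = 1/2.

```python
import numpy as np

def H(q):
    """Binary entropy in bits: H(q) = -q*log2(q) - (1-q)*log2(1-q), with H(0) = H(1) = 0."""
    q = np.atleast_1d(np.asarray(q, dtype=float))
    out = np.zeros_like(q)
    m = (q > 0) & (q < 1)
    out[m] = -q[m] * np.log2(q[m]) - (1 - q[m]) * np.log2(1 - q[m])
    return out if out.size > 1 else float(out[0])

def bsc_mutual_information(w, p):
    """I(X;Y) = H(w*(1-p) + (1-w)*p) - H(p) for input distribution (w, 1-w)
    on a BSC with crossover (transmission error) probability p."""
    return H(w * (1 - p) + (1 - w) * p) - H(p)

p = 0.1                              # example crossover probability (arbitrary choice)
w = np.linspace(0.0, 1.0, 1001)      # sweep the input probability P(X = 0) = w

I = bsc_mutual_information(w, p)
# E.g. 4.1 / Fig. 4.7: the maximum value 1 - H(p) is reached at w = 1/2.
print(w[np.argmax(I)], np.max(I), 1 - H(p))      # ~0.5  ~0.531  ~0.531

# E.g. 4.2 / Property 2: for fixed w, I(X;Y) as a function of p equals H(w)
# at p = 0 and p = 1, and drops to 0 at p = 1/2.
w0 = 0.3
for p_test in (0.0, 0.5, 1.0):
    print(p_test, bsc_mutual_information(w0, p_test), H(w0))
```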