Simple message-handling flow:

1. DESCRIBE:
handleCmd_DESCRIBE
-|sdpLines
--|createNewStreamSource (see the factory-override sketch after these call trees)
---|ByteStreamFileSource
---|H264VideoStreamFramer
----|H264or5VideoStreamFramer
-----|H264or5VideoStreamParser
------|MPEGVideoStreamParser::StreamParser(inputSource, FramedSource::handleClosure, usingSource, &MPEGVideoStreamFramer::continueReadProcessing, usingSource)
-------|continueReadProcessing
--------|parse
--|createGroupsock
--|createNewRTPSink(dummyGroupsock, rtpPayloadType, inputSource)
---|H264or5VideoRTPSink
----|VideoRTPSink
-----|MultiFramedRTPSink
--|setSDPLinesFromRTPSink(dummyRTPSink, inputSource, estBitrate)
---|getAuxSDPLine
----|checkForAuxSDPLine
-----|checkForAuxSDPLine1 { nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay, (TaskFunc*)checkForAuxSDPLine, this); /* added to the task scheduler, so checkForAuxSDPLine runs again after a fixed delay to look for the aux SDP line; see the polling sketch below */ }

Playback flow:

3. PLAY:
handleCmd_PLAY
-|startStream
--|StreamState::startPlaying { fRTPSink->startPlaying(*fMediaSource, afterPlayingStreamState, this); } // afterPlayingStreamState is called once playback completes, to shut the stream down
---|MediaSink::startPlaying (a step missing from the earlier notes; note the startPlaying parameters: the func argument is the function to be called after playback completes, to close playback)
----|H264or5VideoRTPSink::continuePlaying()
-----|MultiFramedRTPSink::continuePlaying()
------|buildAndSendPacket (the big loop; it keeps pulling data continuously)
-------|packFrame { fSource->getNextFrame(fOutBuf->curPtr(), fOutBuf->totalBytesAvailable(), afterGettingFrame, this, ourHandleClosure, this); }
--------|H264or5Fragmenter::doGetNextFrame() { if (fNumValidDataBytes == 1) { fInputSource->getNextFrame(&fInputBuffer[1], fInputBufferSize - 1, afterGettingFrame, this, FramedSource::handleClosure, this); } else { /* case 1; case 2; case 3: compare the frame size with the maximum sendable length and fragment accordingly; see the FU-A sketch below */ FramedSource::afterGetting(this); /* deliver the data */ } }
--------|MPEGVideoStreamFramer::doGetNextFrame()
---------|continueReadProcessing { unsigned acquiredFrameSize = fParser->parse(); if (acquiredFrameSize > 0) { afterGetting(this); /* doGetNextFrame already delivers data, so what is this call for?? see the pull-model sketch below */ } }
----------|H264or5VideoStreamParser::parse() { test4Bytes(); /* used to fetch byte data */ }
-----------|u_int32_t test4Bytes()
------------|StreamParser::ensureValidBytes1(unsigned numBytesNeeded) { fInputSource->getNextFrame(&curBank()[fTotNumValidBytes], maxNumBytesToRead, afterGettingBytes, this, onInputClosure, this); /* this fInputSource should be the ByteStreamFileSource created earlier */ }
-------------|ByteStreamFileSource::doGetNextFrame()
--------------|ByteStreamFileSource::doReadFromFile() { fFrameSize = fread(fTo, 1, fMaxSize, fFid); /* no definition in live555, because it is the C library function: size_t fread(void* ptr, size_t size, size_t nmemb, FILE* stream); */ }
---------|FramedSource::afterGetting(FramedSource* source) // the data has been acquired; deliver it onward
----------|MultiFramedRTPSink::afterGettingFrame1
-----------|sendPacketIfNecessary { (1) fRTPInterface.sendPacket(fOutBuf->packet(), fOutBuf->curPacketSize()); /* send the packet */ (2) nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecondsToGo, (TaskFunc*)sendNext, this); /* then go get the next frame's data */ }
------------|(1) RTPInterface::sendPacket
-------------|sendRTPorRTCPPacketOverTCP(packet, packetSize, stream->fStreamSocketNum, stream->fStreamChannelId)
--------------|sendDataOverTCP(socketNum, packet, packetSize, True)
---------------|send(socketNum, (char const*)data, dataSize, 0/*flags*/) // the system send() call
------------|(2) void MultiFramedRTPSink::sendNext(void* firstArg) { sink->buildAndSendPacket(False); /* so buildAndSendPacket really is the big loop; see the pacing sketch below */ }
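The DESCRIBE chain above runs through two virtual factory methods that H264VideoFileServerMediaSubsession overrides, and those overrides are why the chain comes out as a ByteStreamFileSource wrapped in an H264VideoStreamFramer on the source side, and an H264VideoRTPSink (a subclass of H264or5VideoRTPSink) on the sink side. The following is a paraphrased reading aid, written from memory rather than copied from the library, so details such as the bitrate estimate may differ from your live555 version:

    #include "H264VideoFileServerMediaSubsession.hh"
    #include "ByteStreamFileSource.hh"
    #include "H264VideoStreamFramer.hh"
    #include "H264VideoRTPSink.hh"

    // Source factory: open the file named by fFileName and wrap it in a framer
    // that parses the Annex-B elementary stream into NAL units.
    FramedSource* H264VideoFileServerMediaSubsession
    ::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
      estBitrate = 500; // kbps; a rough estimate

      ByteStreamFileSource* fileSource = ByteStreamFileSource::createNew(envir(), fFileName);
      if (fileSource == NULL) return NULL;
      fFileSize = fileSource->fileSize();

      return H264VideoStreamFramer::createNew(envir(), fileSource);
    }

    // Sink factory: the RTP sink that will packetize the NAL units.
    RTPSink* H264VideoFileServerMediaSubsession
    ::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
                       FramedSource* /*inputSource*/) {
      return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }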
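What ends the waiting that checkForAuxSDPLine1 keeps rescheduling: getAuxSDPLine starts the dummy RTP sink playing so that the framer actually reads the file and sees the SPS/PPS NAL units, then polls the sink until auxSDPLine() can report the sprop-parameter-sets line, and only then sets the done flag that lets the blocked doEventLoop return. This too is a paraphrased reading aid for H264VideoFileServerMediaSubsession, not a verbatim copy, so the delay value and minor details may differ:

    // getAuxSDPLine: start 'playing' into the dummy sink, then wait (by running the
    // event loop) until the polling below has captured the aux SDP line.
    char const* H264VideoFileServerMediaSubsession
    ::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
      if (fAuxSDPLine != NULL) return fAuxSDPLine;     // already known from an earlier client

      if (fDummyRTPSink == NULL) {
        fDummyRTPSink = rtpSink;
        fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);
        checkForAuxSDPLine(this);                      // kick off the polling
      }
      envir().taskScheduler().doEventLoop(&fDoneFlag); // returns once setDoneFlag() is called
      return fAuxSDPLine;
    }

    void H264VideoFileServerMediaSubsession::checkForAuxSDPLine1() {
      nextTask() = NULL;

      char const* dasl;
      if (fAuxSDPLine != NULL) {
        setDoneFlag();                                 // nothing left to do
      } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
        fAuxSDPLine = strDup(dasl);                    // SPS/PPS have been parsed; keep the line
        fDummyRTPSink = NULL;
        setDoneFlag();
      } else if (!fDoneFlag) {
        // Not ready yet: re-schedule this same check a short while later.
        int uSecsToDelay = 100000; // 100 ms
        nextTask() = envir().taskScheduler()
          .scheduleDelayedTask(uSecsToDelay, (TaskFunc*)checkForAuxSDPLine, this);
      }
    }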
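The "case 1 / case 2 / case 3" inside H264or5Fragmenter::doGetNextFrame are: the whole NAL unit fits in one RTP payload, the first fragment of an oversized NAL unit, and the subsequent fragments of that NAL unit. The self-contained sketch below is not the live555 fragmenter; it just spells out the same three cases for H.264 FU-A fragmentation (RFC 6184) so the comparison against the maximum sendable length is concrete. The function name fragmentNal and the 1400-byte payload limit are made up for the example:

    #include <algorithm>
    #include <cstdint>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    // Split one NAL unit (without its start code) into RTP payloads of at most
    // maxPayload bytes each. Assumes maxPayload > 2.
    std::vector<std::vector<uint8_t>> fragmentNal(const uint8_t* nal, size_t nalSize,
                                                  size_t maxPayload) {
      std::vector<std::vector<uint8_t>> packets;

      if (nalSize <= maxPayload) {
        // Case 1: the whole NAL unit fits in one packet; send it unmodified.
        packets.emplace_back(nal, nal + nalSize);
        return packets;
      }

      uint8_t nalHdr = nal[0];
      uint8_t fuIndicator = (nalHdr & 0xE0) | 28;   // keep F + NRI bits, type 28 = FU-A
      size_t pos = 1;                               // fragment payload starts after the NAL header
      bool first = true;

      while (pos < nalSize) {
        size_t chunk = std::min(maxPayload - 2, nalSize - pos);
        bool last = (pos + chunk == nalSize);

        uint8_t fuHeader = nalHdr & 0x1F;           // original NAL unit type
        if (first)     fuHeader |= 0x80;            // Case 2: first fragment, S (start) bit
        else if (last) fuHeader |= 0x40;            // Case 3: subsequent fragments; the final one gets the E (end) bit
        // (middle fragments, also case 3, carry neither bit)

        std::vector<uint8_t> pkt = {fuIndicator, fuHeader};
        pkt.insert(pkt.end(), nal + pos, nal + pos + chunk);
        packets.push_back(std::move(pkt));

        pos += chunk;
        first = false;
      }
      return packets;
    }

    int main() {
      uint8_t nal[3000];
      std::memset(nal, 0xAA, sizeof nal);
      nal[0] = 0x65;                                // pretend this is an IDR slice
      std::vector<std::vector<uint8_t>> pkts = fragmentNal(nal, sizeof nal, 1400);
      std::printf("%zu packets\n", pkts.size());    // one FU-A start, middles, one FU-A end
      return 0;
    }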
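About the question marked "??" at continueReadProcessing: getNextFrame never sends anything; it only records who asked for a frame, where to put it, and which callback to invoke, and afterGetting(this) then fires that recorded callback. So every afterGetting call hands the completed frame one level up the chain (framer to fragmenter, fragmenter to MultiFramedRTPSink), and only the sink's own handler, sendPacketIfNecessary, actually touches the network. The stripped-down, self-contained program below (MiniSource and FileLikeSource are invented names, not live555 classes) shows that pull-model contract:

    #include <cstdio>
    #include <functional>

    class MiniSource {
    public:
      using AfterGetting = std::function<void(unsigned)>;

      // The consumer asks for one frame and registers the callback that will receive it.
      void getNextFrame(char* to, unsigned maxSize, AfterGetting cb) {
        fTo = to; fMaxSize = maxSize; fAfterGetting = cb;
        doGetNextFrame(); // the subclass fills fTo, then calls afterGetting()
      }

    protected:
      virtual void doGetNextFrame() = 0;
      // 'afterGetting' sends nothing itself; it hands the frame to whoever asked for it.
      void afterGetting(unsigned frameSize) { fAfterGetting(frameSize); }
      char* fTo = nullptr;
      unsigned fMaxSize = 0;

    private:
      AfterGetting fAfterGetting;
    };

    class FileLikeSource : public MiniSource {
    protected:
      void doGetNextFrame() override {
        unsigned n = (unsigned)std::snprintf(fTo, fMaxSize, "frame"); // pretend we read one frame
        afterGetting(n); // delivery to the caller, NOT a network send
      }
    };

    int main() {
      char buf[64];
      FileLikeSource src;
      // The 'sink' layer: this lambda is what afterGetting() ends up invoking.
      src.getNextFrame(buf, sizeof buf, [&](unsigned n) {
        std::printf("sink got %u bytes: %.*s\n", n, (int)n, buf);
      });
      return 0;
    }

In the real chain this is exactly why both H264or5Fragmenter::doGetNextFrame and MPEGVideoStreamFramer's continueReadProcessing end with an afterGetting call: each one is delivering to its own downstream consumer, not duplicating a send.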
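Why the notes call buildAndSendPacket "the big loop": sendPacketIfNecessary never calls it again directly; it schedules sendNext as a delayed task with the computed uSecondsToGo, and sendNext simply calls buildAndSendPacket(False), so the cycle is driven by the task scheduler and packets go out paced in time. The tiny stand-in below replaces the event loop with a blocking sleep just to show the shape of the cycle; MiniSink and the 40 ms figure are made up:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    struct MiniSink {
      int packetsLeft = 5;

      void buildAndSendPacket() {
        std::printf("send packet, %d left\n", --packetsLeft); // stands in for sendPacketIfNecessary
        if (packetsLeft > 0) scheduleDelayedTask(40000);      // e.g. ~40 ms between packets
      }

      // Stand-in for envir().taskScheduler().scheduleDelayedTask(uSeconds, (TaskFunc*)sendNext, this);
      // the real scheduler returns to the event loop instead of blocking like this.
      void scheduleDelayedTask(int uSeconds) {
        std::this_thread::sleep_for(std::chrono::microseconds(uSeconds));
        sendNext(this);
      }

      static void sendNext(void* firstArg) {
        ((MiniSink*)firstArg)->buildAndSendPacket();          // ...and the cycle repeats
      }
    };

    int main() {
      MiniSink s;
      s.buildAndSendPacket(); // continuePlaying() kicks the cycle off in the real code
      return 0;
    }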
Source file analysis:

RTSPServer::RTSPClientConnection::handleCmd_DESCRIBE(char const* urlPreSuffix, char const* urlSuffix, char const* fullRequestStr) {
  (1) session = fOurServer.lookupServerMediaSession(urlTotalSuffix); // look up the session, creating it if it does not exist yet
  (2) sdpDescription = session->generateSDPDescription(); // generate the file's SDP description; along the way createNewStreamSource and createNewRTPSink are called, which is where the ByteStreamFileSource gets instantiated
}
--|(1) ServerMediaSession* DynamicRTSPServer::lookupServerMediaSession(char const* streamName, Boolean isFirstLookupInSession) {
      FILE* fid = fopen(streamName, "rb");
      ServerMediaSession* sms = RTSPServer::lookupServerMediaSession(streamName); // check whether the session already exists
      if (sms == NULL) {
        sms = createNewSMS(envir(), streamName, fid); // it does not exist yet, so create it
        addServerMediaSession(sms);
      }
    }
---|ServerMediaSession* createNewSMS(UsageEnvironment& env, char const* fileName, FILE* /*fid*/) {
      else if (strcmp(extension, ".264") == 0) {
        // Assumed to be a H.264 Video Elementary Stream file:
        NEW_SMS("H.264 Video");
        OutPacketBuffer::maxSize = 100000; // allow for some possibly large H.264 frames
        sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(env, fileName, reuseSource));
      }
    }
----|H264VideoFileServerMediaSubsession::H264VideoFileServerMediaSubsession(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource)
      : FileServerMediaSubsession(env, fileName, reuseFirstSource), fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {}
-----|FileServerMediaSubsession::FileServerMediaSubsession(UsageEnvironment& env, char const* fileName, Boolean reuseFirstSource)
      : OnDemandServerMediaSubsession(env, reuseFirstSource), fFileSize(0) {
        fFileName = strDup(fileName); // fFileName is set here; later it is used to instantiate the ByteStreamFileSource
      }

Up to this point the session has been created, and its fFileName already holds the path of the file the client requested; in effect, the session is bound to that file. So later, when the ByteStreamFileSource is instantiated, the file path it opens is exactly the contents of fFileName.

--|ServerMediaSession::generateSDPDescription()
---|OnDemandServerMediaSubsession::sdpLines() {
      FramedSource* inputSource = createNewStreamSource(0, estBitrate); // instantiate the source
      Groupsock* dummyGroupsock = createGroupsock(dummyAddr, 0);
      RTPSink* dummyRTPSink =