Why does AVSampleBufferDisplayLayer stop displaying CMSampleBuffers taken from AVCaptureVideoDataOutput's delegate?

I want to display some CMSampleBuffers with an AVSampleBufferDisplayLayer, but it freezes after showing the first sample.

I get the sample buffers from the AVCaptureVideoDataOutputSampleBufferDelegate:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CFRetain(sampleBuffer);
        [self imageToBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }

and turn them into a vector:

    - (void)imageToBuffer:(CMSampleBufferRef)source {
        // buffers is defined as: std::vector<CMSampleBufferRef> buffers;
        CMSampleBufferRef newRef;
        CMSampleBufferCreateCopy(kCFAllocatorDefault, source, &newRef);
        buffers.push_back(newRef);
    }

Then I try to display them via an AVSampleBufferDisplayLayer (in another view controller):

    AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
    displayLayer.bounds = self.view.bounds;
    displayLayer.position = CGPointMake(CGRectGetMidX(self.displayOnMe.bounds),
                                        CGRectGetMidY(self.displayOnMe.bounds));
    displayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    displayLayer.backgroundColor = [[UIColor greenColor] CGColor];
    [self.view.layer addSublayer:displayLayer];
    self.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

    dispatch_queue_t queue = dispatch_queue_create("My queue", DISPATCH_QUEUE_SERIAL);

    [displayLayer setNeedsDisplay];

    [displayLayer requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while ([displayLayer isReadyForMoreMediaData]) {
            if (samplesKey < buffers.size()) {
                CMSampleBufferRef buf = buffers[samplesKey];
                [displayLayer enqueueSampleBuffer:buffers[samplesKey]];
                samplesKey++;
            } else {
                [displayLayer stopRequestingMediaData];
                break;
            }
        }
    }];

But it displays the first sample, then freezes and does nothing.

My video data output is set up as follows:

    // set up our output
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t queue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
    [_videoDataOutput setSampleBufferDelegate:self queue:queue];
    [_videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
        (id)kCVPixelBufferPixelFormatTypeKey,
        nil]];

I ran into this problem in the same context, trying to take the output from AVCaptureVideoDataOutput and display it in an AVSampleBufferDisplayLayer.

If your frames come out in display order, then the fix is very easy: just set the display-immediately flag on the CMSampleBufferRef.

Take the sample buffer returned by the delegate, then…

    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
    CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

If the frames come out in encoder order (not display order), then the timestamps on the CMSampleBuffer need to be zero-biased and restamped so that the first frame's timestamp equals time 0.

    double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
    // ptsStart is equal to the first frame's presentationTimeStamp, so playback starts from time 0.
    CMTime presentationTimeStamp = CMTimeMake((pts - ptsStart) * 1000000, 1000000);
    CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);

Update:

I ran into a situation where some video still didn't play back smoothly when I used the zero-bias approach, so I investigated further. The correct answer seems to be to use the PTS of the first frame you intend to play.

My answer is there, but I will also post it here.

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

The timebase needs to be set to the presentation timestamp (pts) of the first frame you intend to decode. I was indexing the pts of the first frame to 0 by subtracting the initial pts from all subsequent pts and setting the timebase to 0. For whatever reason, that didn't work with certain videos.

You want something like this (called before decoding):

    CMTimebaseRef controlTimebase;
    CMTimebaseCreateWithMasterClock(CFAllocatorGetDefault(),
                                    CMClockGetHostTimeClock(),
                                    &controlTimebase);

    displayLayer.controlTimebase = controlTimebase;

    // Set the timebase to the initial pts here
    CMTimebaseSetTime(displayLayer.controlTimebase, CMTimeMake(ptsInitial, 1));
    CMTimebaseSetRate(displayLayer.controlTimebase, 1.0);

Set the PTS on the CMSampleBuffer…

 CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp); 

And maybe make sure display-immediately isn't set…

 CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanFalse); 

This was covered very briefly in WWDC 2014 Session 513.