Performance issues when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

When I record audio + video using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, I run into lag problems. Sometimes the video blocks for a few milliseconds, and sometimes the audio is out of sync with the video.

I inserted some logging and observed that I first get a lot of video buffers in the captureOutput callback, and only after a while do the audio buffers start arriving (sometimes no audio buffers arrive at all, and the result is a video with no sound). If I comment out the code that processes the video buffers, the audio buffers arrive without any problem.

This is the code I am using:

    -(void)initMovieOutput:(AVCaptureSession *)captureSessionLocal {
        AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        self._videoOutput = dataOutput;
        [dataOutput release];
        self._videoOutput.alwaysDiscardsLateVideoFrames = NO;
        self._videoOutput.videoSettings =
            [NSDictionary dictionaryWithObject:
                [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                        forKey:(id)kCVPixelBufferPixelFormatTypeKey];

        AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        self._audioOutput = audioOutput;
        [audioOutput release];

        [captureSessionLocal addOutput:self._videoOutput];
        [captureSessionLocal addOutput:self._audioOutput];

        // Setup the queue
        dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
        [self._videoOutput setSampleBufferDelegate:self queue:queue];
        [self._audioOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);
    }

Here is where I set up the writer:

    -(BOOL)setupWriter:(NSURL *)videoURL session:(AVCaptureSession *)captureSessionLocal {
        NSError *error = nil;
        self._videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
        NSParameterAssert(self._videoWriter);

        // Add video input
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:640], AVVideoWidthKey,
                                       [NSNumber numberWithInt:480], AVVideoHeightKey,
                                       nil];
        self._videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                    outputSettings:videoSettings];
        NSParameterAssert(self._videoWriterInput);
        self._videoWriterInput.expectsMediaDataInRealTime = YES;
        self._videoWriterInput.transform = [self returnOrientation];

        // Add the audio input
        AudioChannelLayout acl;
        bzero(&acl, sizeof(acl));
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

        NSDictionary *audioOutputSettings = nil;
        // Both types of audio input cause the output video file to be corrupted.
        // Should work on any device, requires more space
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                               [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
        self._audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                    outputSettings:audioOutputSettings];
        self._audioWriterInput.expectsMediaDataInRealTime = YES;

        // add inputs
        [self._videoWriter addInput:_videoWriterInput];
        [self._videoWriter addInput:_audioWriterInput];

        return YES;
    }
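As an aside on the audio settings above: Apple Lossless produces comparatively large output. A commonly used alternative for movie recording is AAC via `kAudioFormatMPEG4AAC` with `AVEncoderBitRateKey`. A sketch of such a settings dictionary follows (the 64 kbps bit rate is an illustrative choice, not from the original post):

```objc
// Sketch: AAC audio settings as an alternative to Apple Lossless.
// Assumes the same AudioChannelLayout `acl` declared above.
audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                       [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                       [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                       [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                       [NSNumber numberWithInt:64000], AVEncoderBitRateKey, // illustrative
                       [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                       nil];
```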

And here is the callback:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        if (!CMSampleBufferDataIsReady(sampleBuffer)) {
            NSLog(@"sample buffer is not ready. Skipping sample");
            return;
        }
        if (_videoWriter.status != AVAssetWriterStatusCompleted) {
            if (_videoWriter.status != AVAssetWriterStatusWriting) {
                CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                [_videoWriter startWriting];
                [_videoWriter startSessionAtSourceTime:lastSampleTime];
            }
            if (captureOutput == _videoOutput) {
                if ([self._videoWriterInput isReadyForMoreMediaData]) {
                    [self newVideoSample:sampleBuffer];
                }
            } else if (captureOutput == _audioOutput) {
                if ([self._audioWriterInput isReadyForMoreMediaData]) {
                    [self newAudioSample:sampleBuffer];
                }
            }
        }
    }

    -(void)newAudioSample:(CMSampleBufferRef)sampleBuffer {
        if (_videoWriter.status > AVAssetWriterStatusWriting) {
            [self NSLogPrint:[NSString stringWithFormat:@"Audio:Warning: writer status is %d", _videoWriter.status]];
            if (_videoWriter.status == AVAssetWriterStatusFailed)
                [self NSLogPrint:[NSString stringWithFormat:@"Audio:Error: %@", _videoWriter.error]];
            return;
        }
        if (![_audioWriterInput appendSampleBuffer:sampleBuffer])
            [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to audio input"]];
    }

    -(void)newVideoSample:(CMSampleBufferRef)sampleBuffer {
        if (_videoWriter.status > AVAssetWriterStatusWriting) {
            [self NSLogPrint:[NSString stringWithFormat:@"Video:Warning: writer status is %d", _videoWriter.status]];
            if (_videoWriter.status == AVAssetWriterStatusFailed)
                [self NSLogPrint:[NSString stringWithFormat:@"Video:Error: %@", _videoWriter.error]];
            return;
        }
        if (![_videoWriterInput appendSampleBuffer:sampleBuffer])
            [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to video input"]];
    }

What is wrong with my code, and why does the video lag? (I am testing on an iPhone 4, iOS 4.2.1.)

It looks like you are using a single serial queue for both outputs, so the audio output's callbacks get queued up behind the video output's. Consider giving each output its own queue (note that the queue passed to `setSampleBufferDelegate:queue:` must be a serial queue, so use two separate serial queues rather than one shared one).
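A minimal sketch of that change, applied to the queue-setup part of `initMovieOutput:` from the question (the queue labels are illustrative):

```objc
// Give each data output its own serial queue so audio callbacks are not
// stuck waiting behind video-buffer processing on a single shared queue.
dispatch_queue_t videoQueue = dispatch_queue_create("VideoQueue", NULL); // serial
dispatch_queue_t audioQueue = dispatch_queue_create("AudioQueue", NULL); // serial

[self._videoOutput setSampleBufferDelegate:self queue:videoQueue];
[self._audioOutput setSampleBufferDelegate:self queue:audioQueue];

dispatch_release(videoQueue);
dispatch_release(audioQueue);
```

With separate queues, the `captureOutput:didOutputSampleBuffer:fromConnection:` callback can now run concurrently for the two outputs, so any shared state it touches (such as starting the writer session) needs to be safe to access from both queues.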