iOS: captureOutput:didOutputSampleBuffer:fromConnection is not being called

I want to extract frames from an AVCaptureSession's live feed, and I am using Apple's AVCam as a test case. Here is the link to AVCam:

https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html

I am finding that captureOutput:didOutputSampleBuffer:fromConnection is never called, and I would like to know why, or what I am doing wrong.

Here is what I have done:

(1) I made AVCamViewController a delegate:

    @interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

(2) I created an AVCaptureVideoDataOutput object and added it to the session:

    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    if ([session canAddOutput:videoDataOutput])
    {
        [session addOutput:videoDataOutput];
    }
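
(For frames to actually be delivered, the output also needs a sample buffer delegate and queue; I do set that in the full listing further down:)

    [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];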

(3) I added the delegate method and tested it by logging a random string:

    - (void)captureOutput:(CMSampleBufferRef)sampleBuffer fromConnection is declared as:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        NSLog(@"I am called");
    }

The test app works, but captureOutput:didOutputSampleBuffer:fromConnection is never called.

(4) I read on SO that the session variable in AVCaptureSession *session = [[AVCaptureSession alloc] init]; being local to viewDidLoad is a possible reason for the delegate not being called, so I made it an instance variable of the AVCamViewController class, but it is still not called.
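
(AVCam already declares a session property, which is what I use now in place of the local variable:)

    @property (nonatomic) AVCaptureSession *session;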

Here are the viewDidLoad and viewWillAppear methods I am testing with (taken from AVCam); I added the AVCaptureVideoDataOutput at the end of viewDidLoad:

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        // Create the AVCaptureSession
        session = [[AVCaptureSession alloc] init];
        [self setSession:session];

        // Setup the preview view
        [[self previewView] setSession:session];

        // Check for device authorization
        [self checkDeviceAuthorizationStatus];

        // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
        // Why not do all of this on the main queue?
        // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
        dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
        [self setSessionQueue:sessionQueue];

        dispatch_async(sessionQueue, ^{
            [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

            NSError *error = nil;

            AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
            AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

            if (error)
            {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:videoDeviceInput])
            {
                [session addInput:videoDeviceInput];
                [self setVideoDeviceInput:videoDeviceInput];

                dispatch_async(dispatch_get_main_queue(), ^{
                    // Why are we dispatching this to the main queue?
                    // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                    // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer's connection with other session manipulation.
                    [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
                });
            }

            AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
            AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

            if (error)
            {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:audioDeviceInput])
            {
                [session addInput:audioDeviceInput];
            }

            AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
            if ([session canAddOutput:movieFileOutput])
            {
                [session addOutput:movieFileOutput];
                AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
                if ([connection isVideoStabilizationSupported])
                    [connection setEnablesVideoStabilizationWhenAvailable:YES];
                [self setMovieFileOutput:movieFileOutput];
            }

            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            if ([session canAddOutput:stillImageOutput])
            {
                [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
                [session addOutput:stillImageOutput];
                [self setStillImageOutput:stillImageOutput];
            }

            AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
            [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];

            if ([session canAddOutput:videoDataOutput])
            {
                NSLog(@"Yes I can add it");
                [session addOutput:videoDataOutput];
            }
        });
    }

    - (void)viewWillAppear:(BOOL)animated
    {
        dispatch_async([self sessionQueue], ^{
            [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
            [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
            [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

            __weak AVCamViewController *weakSelf = self;
            [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
                AVCamViewController *strongSelf = weakSelf;
                dispatch_async([strongSelf sessionQueue], ^{
                    // Manually restarting the session since it must have been stopped due to an error.
                    [[strongSelf session] startRunning];
                    [[strongSelf recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
                });
            }]];

            [[self session] startRunning];
        });
    }

Can someone tell me why, and suggest how to fix it?

I've done a lot of experimenting, and I think I may have the answer. I have similar-but-different code, written from scratch rather than copied from Apple's sample (which is getting a bit old now).

I believe the culprit is this section:

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:movieFileOutput])
    {
        [session addOutput:movieFileOutput];
        AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([connection isVideoStabilizationSupported])
            [connection setEnablesVideoStabilizationWhenAvailable:YES];
        [self setMovieFileOutput:movieFileOutput];
    }

From my experiments, this is what is causing your problem. In my code, captureOutput:didOutputSampleBuffer:fromConnection is not called whenever this is present. I think the video system either gives you a series of sample buffers, or records a compressed, optimized movie file to disk, but not both at the same time. (At least on iOS.) I suppose this makes sense / is not surprising, but I have not seen it documented anywhere!
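
As a minimal sketch of what I mean (the wantsFrames flag is hypothetical, just to make the either/or explicit):

    // Hypothetical sketch: attach EITHER the movie file output OR the video
    // data output, never both. With only the video data output attached, the
    // sample buffer delegate fires; with the movie file output attached
    // instead, it stays silent (on iOS, in my experiments).
    BOOL wantsFrames = YES; // assumption: a flag you control

    if (wantsFrames)
    {
        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
        if ([session canAddOutput:videoDataOutput])
        {
            [session addOutput:videoDataOutput];
        }
    }
    else
    {
        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
        }
    }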

Also, at one point I seemed to be getting errors and/or losing the buffer callback when I turned the microphone on. Again undocumented; the errors were -11800 (unknown error). But I could not always reproduce that.

Your code looks fine to me, and I can think of ten guess-and-check things you could try, so I'll take a different approach and hopefully solve the problem indirectly. Aside from thinking AVCam is poorly written, I think you would be better off looking at an example that is concerned only with live video, rather than one that also records video and takes still images. I have provided one below that does just that and nothing more.

    - (void)startSession
    {
        self.session = [AVCaptureSession new];
        self.session.sessionPreset = AVCaptureSessionPresetMedium;

        // Serial queue for session work and sample buffer callbacks.
        // (Created here; assumes a dispatch_queue_t `queue` property.)
        self.queue = dispatch_queue_create("camera queue", DISPATCH_QUEUE_SERIAL);

        // Find the back camera.
        AVCaptureDevice *backCamera;
        for (AVCaptureDevice *device in [AVCaptureDevice devices])
        {
            if ([device hasMediaType:AVMediaTypeVideo] && device.position == AVCaptureDevicePositionBack)
            {
                backCamera = device;
                break;
            }
        }

        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
        if (error)
        {
            // handle error
        }
        if ([self.session canAddInput:input])
        {
            [self.session addInput:input];
        }

        AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
        [output setSampleBufferDelegate:self queue:self.queue];
        output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
        if ([self.session canAddOutput:output])
        {
            [self.session addOutput:output];
        }

        dispatch_async(self.queue, ^{
            [self.session startRunning];
        });
    }
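
For completeness, a minimal sketch of the declarations and delegate callback this example assumes (the class name CameraController is hypothetical; session and queue are the properties used above):

    #import <AVFoundation/AVFoundation.h>

    @interface CameraController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic) AVCaptureSession *session;
    @property (nonatomic) dispatch_queue_t queue;
    @end

    @implementation CameraController

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Called on self.queue for every frame; grab the pixel buffer here.
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        NSLog(@"got a frame: %@", imageBuffer);
    }

    @end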