Crash on video capture with AVFoundation

I'm trying to implement video capture in my app using AVFoundation. I have the following code in viewDidLoad:

    session = [[AVCaptureSession alloc] init];
    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    videoInputDevice = [[AVCaptureDeviceInput alloc] init];

    AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
    if (videoDevice) {
        NSError *error;
        videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([session canAddInput:videoInputDevice])
                [session addInput:videoInputDevice];
            else
                NSLog(@"Couldn't add input.");
        }
    }

    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *audioError = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&audioError];
    if (audioInput) {
        [session addInput:audioInput];
    }

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    Float64 TotalSeconds = 35;          // Total seconds
    int32_t preferredTimeScale = 30;    // Frames per second
    CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);
    movieFileOutput.maxRecordedDuration = maxDuration;
    movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;

    if ([session canAddOutput:movieFileOutput])
        [session addOutput:movieFileOutput];

    [session setSessionPreset:AVCaptureSessionPresetMedium];
    if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) // Check size-based configs are supported before setting them
        [session setSessionPreset:AVCaptureSessionPreset640x480];

    [self cameraSetOutputProperties];

    [session startRunning];

The capture itself is started by a button whose implementation includes:

    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO) {
            // Error - handle if required
        }
    }
    [outputPath release];

    // Start recording
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    [outputURL release];
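(Aside: `startRecordingToOutputFileURL:recordingDelegate:` requires the delegate passed as `self` here to adopt `AVCaptureFileOutputRecordingDelegate` and implement its one required callback. A minimal sketch of that callback — the method name is from the SDK, but the error handling shown is only an illustration:)

```objc
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    // A recording that stops because it hit maxRecordedDuration still "finishes"
    // with an error; its userInfo carries AVErrorRecordingSuccessfullyFinishedKey,
    // so check that before treating the error as fatal.
    BOOL recordedSuccessfully = YES;
    if (error) {
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value) {
            recordedSuccessfully = [value boolValue];
        }
    }
    if (recordedSuccessfully) {
        NSLog(@"Recording finished: %@", outputFileURL);
    } else {
        NSLog(@"Recording failed: %@", error);
    }
}
```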

When I run this on a device, it crashes as soon as I load the view where all of this is supposed to happen. Xcode gives me a "Thread 1: EXC_BAD_ACCESS (code=1, address=0x4)" at:

    AVFoundation`-[AVCaptureDeviceInput _setDevice:]:
    (stuff)
    0x3793f608:  ldr r0, [r1, r0]

The error is flagged on that last line. I assume it has something to do with the AVCaptureDeviceInput somewhere, but I don't know what it could be. Does anyone know what I'm missing here? Thanks.

Edit: After fiddling with breakpoints, I've found that the crash happens at this line:

 AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable]; 

So what's up with that method? Here's my implementation; maybe something is wrong there.

    - (AVCaptureDevice *)frontFacingCameraIfAvailable
    {
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        AVCaptureDevice *captureDevice = nil;
        for (AVCaptureDevice *device in videoDevices) {
            if (device.position == AVCaptureDevicePositionFront) {
                captureDevice = device;
                break;
            }
        }
        // couldn't find one on the front, so just get the default video device.
        if (!captureDevice) {
            captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        }
        return captureDevice;
    }

Edit 2: Could it be that my use of

 AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable]; 

is at fault, and that 'self' is somehow throwing a wrench into it? I know that when creating a CALayer you can do

 CALayer *aLayer = [CALayer layer]; 

but I don't know what the AVCaptureDevice equivalent of that would be, if there is one. I have no idea what else it could be; by all accounts my code looks fine, and I've already tried cleaning the project, restarting Xcode, restarting the computer, and so on.

I'm fairly sure the problem is that you're running the program in the simulator. The simulator has no access to these resources.

Managed to solve this (maybe).

I added

 @property (nonatomic, retain) AVCaptureDeviceInput *videoInputDevice; 

to the interface, synthesized it in the implementation, and used these methods to get the front-facing camera:

    - (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
    {
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in devices) {
            if ([device position] == position) {
                return device;
            }
        }
        return nil;
    }

    - (AVCaptureDevice *)frontFacingCamera
    {
        return [self cameraWithPosition:AVCaptureDevicePositionFront];
    }

and then assigned it:

 videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCamera] error:&error]; 

Not 100% sure it works yet, since saving is failing at the moment, but it doesn't crash anymore. I'll come back and update once I know whether it works.
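For what it's worth, that fix is consistent with a memory-management explanation: under manual reference counting, `deviceInputWithDevice:error:` returns an autoreleased object, so assigning it to a bare ivar leaves nothing retaining it, and the session can later message a deallocated input — which would match the `-[AVCaptureDeviceInput _setDevice:]` crash. Going through the retained property makes the ownership explicit. A sketch, assuming the `videoInputDevice` property declared above:

```objc
NSError *error = nil;
// Assign through the property setter so the retain declared on the
// videoInputDevice property actually happens; a plain ivar assignment
// (videoInputDevice = ...) would not retain the autoreleased object.
self.videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCamera]
                                                              error:&error];
if (self.videoInputDevice && [session canAddInput:self.videoInputDevice]) {
    [session addInput:self.videoInputDevice];
}
```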