Taking a photo on iOS without UIImagePicker and without a preview

Do you know of any way/method to take a photo on iOS and save it to the camera roll with a single button press, without showing any preview?

I already know how to show the camera view, but it displays a preview of the image and the user has to tap the take-picture button to actually take the photo.

In a few words: the user taps the button, the photo is taken, with no preview and no double-check before taking/saving the photo.

I have already found the takePicture method of the UIImagePickerController class: http://developer.apple.com/library/ios/documentation/uikit/reference/UIImagePickerController_Class/UIImagePickerController/UIImagePickerController.html#//apple_ref/occ/instm/UIImagePickerController/takePicture

Set the showsCameraControls property to NO:

    poc = [[UIImagePickerController alloc] init];
    [poc setTitle:@"Take a photo."];
    [poc setDelegate:self];
    [poc setSourceType:UIImagePickerControllerSourceTypeCamera];
    poc.showsCameraControls = NO;

You also have to add your own controls as a custom view on top of poc.view. That is very simple, though, and it lets you style the UI however you like.
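For example, a bare-bones overlay with a single shutter button could be attached through the picker's cameraOverlayView property. This is only a sketch; shootTapped: is an assumed action method of your own (it would simply call takePicture, as shown further down):

    // Build a full-screen overlay that replaces the stock camera controls.
    UIView *overlay = [[UIView alloc] initWithFrame:self.view.bounds];

    UIButton *shootButton = [UIButton buttonWithType:UIButtonTypeSystem];
    shootButton.frame = CGRectMake(20.0, self.view.bounds.size.height - 80.0, 280.0, 60.0);
    [shootButton setTitle:@"Take photo" forState:UIControlStateNormal];
    [shootButton addTarget:self action:@selector(shootTapped:) forControlEvents:UIControlEventTouchUpInside];
    [overlay addSubview:shootButton];

    // Put the overlay on top of the camera view and present the picker.
    poc.cameraOverlayView = overlay;
    [self presentViewController:poc animated:YES completion:nil];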

You will receive the image data as usual in imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:.
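A minimal sketch of that delegate method, assuming you just want to write the shot to the camera roll and dismiss the picker:

    - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
        // The freshly captured photo.
        UIImage *image = info[UIImagePickerControllerOriginalImage];

        // Save it straight to the Camera Roll (pass a target/selector instead of nil/NULL to be told about failures).
        UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);

        [picker dismissViewControllerAnimated:YES completion:nil];
    }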

To take the picture, you call

 [poc takePicture]; 

from your custom button.

Hope this works for you.

Assuming you want a point-and-shoot approach, you can create an AVSession and call UIImageWriteToSavedPhotosAlbum. Here is a link that walks through the exact process: http://www.musicalgeometry.com/?p=1297

It is also worth noting that your users need to grant the app access to the camera roll, or you may run into problems saving the image.
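If you pass a completion target and selector to UIImageWriteToSavedPhotosAlbum, that failure (for example, when the user has denied access) is reported back to you. A minimal sketch, assuming image holds the UIImage you just captured:

    // Somewhere after you have the captured image (a UIImage named image is assumed here):
    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);

    // The callback UIKit invokes when the save finishes; error is non-nil if it failed
    // (for example, because the user denied photo library access).
    - (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
        if (error) {
            NSLog(@"Saving to the camera roll failed: %@", error);
        }
    }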

You need to design your own custom preview at whatever size you want. Pressing your capture button calls the buttonPressed method, where you can do whatever you need:

    - (void)buttonPressed:(UIButton *)sender {
        NSLog(@"Capture Clicked");
        [self.imagePicker takePicture];
        //[NSTimer scheduledTimerWithTimeInterval:3.0f target:self selector:@selector(timerFired:) userInfo:nil repeats:NO];
    }

Below is code that will take a photo without showing a preview screen. When I tried the accepted answer, which uses UIImagePickerController, the preview screen appeared and then automatically disappeared. With the code below, the user taps a "take photo" button and the device takes the picture with zero UI (in my app, I add a green checkmark next to the take-photo button instead). The code is taken from Apple's AVCam sample https://developer.apple.com/LIBRARY/IOS/samplecode/AVCam/Introduction/Intro.html with the "extra features" (the parts not related to taking a still photo) commented out. Thanks to incmiko for suggesting this code in the answer iOS take photo from camera without modalViewController.

Updated code, 26 March 2015:

To trigger taking the photo:

 [self snapStillImage:sender]; 

In the .h file:

    #import <AVFoundation/AVFoundation.h>
    #import <AssetsLibrary/AssetsLibrary.h>

    // include code below in header file, after #import and before @interface

    // avfoundation copy paste code
    static void * CapturingStillImageContext = &CapturingStillImageContext;
    static void * RecordingContext = &RecordingContext;
    static void * SessionRunningAndDeviceAuthorizedContext = &SessionRunningAndDeviceAuthorizedContext;

    // avfoundation, include code below after @interface

    // avf - Session management.
    @property (nonatomic) dispatch_queue_t sessionQueue; // Communicate with the session and other session objects on this queue.
    @property (nonatomic) AVCaptureSession *session;
    @property (nonatomic) AVCaptureDeviceInput *videoDeviceInput;
    @property (nonatomic) AVCaptureMovieFileOutput *movieFileOutput;
    @property (nonatomic) AVCaptureStillImageOutput *stillImageOutput;

    // avf - Utilities.
    @property (nonatomic) UIBackgroundTaskIdentifier backgroundRecordingID;
    @property (nonatomic, getter = isDeviceAuthorized) BOOL deviceAuthorized;
    @property (nonatomic, readonly, getter = isSessionRunningAndDeviceAuthorized) BOOL sessionRunningAndDeviceAuthorized;
    @property (nonatomic) BOOL lockInterfaceRotation;
    @property (nonatomic) id runtimeErrorHandlingObserver;

In the .m file:

    #pragma mark - AV Foundation

    - (BOOL)isSessionRunningAndDeviceAuthorized {
        return [[self session] isRunning] && [self isDeviceAuthorized];
    }

    + (NSSet *)keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized {
        return [NSSet setWithObjects:@"session.running", @"deviceAuthorized", nil];
    }

    // call following method from viewDidLoad
    - (void)CreateAVCaptureSession {
        // Create the AVCaptureSession
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        [self setSession:session];

        // Check for device authorization
        [self checkDeviceAuthorizationStatus];

        // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
        // Why not do all of this on the main queue?
        // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
        dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
        [self setSessionQueue:sessionQueue];

        dispatch_async(sessionQueue, ^{
            [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

            NSError *error = nil;
            AVCaptureDevice *videoDevice = [ViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionFront];
            AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

            if (error) {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:videoDeviceInput]) {
                [session addInput:videoDeviceInput];
                [self setVideoDeviceInput:videoDeviceInput];

                dispatch_async(dispatch_get_main_queue(), ^{
                    // Why are we dispatching this to the main queue?
                    // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                    // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer's connection with other session manipulation.
                });
            }

            /*
            AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
            AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

            if (error) {
                NSLog(@"%@", error);
            }

            if ([session canAddInput:audioDeviceInput]) {
                [session addInput:audioDeviceInput];
            }
            */

            AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
            if ([session canAddOutput:movieFileOutput]) {
                [session addOutput:movieFileOutput];
                AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
                if ([connection isVideoStabilizationSupported])
                    [connection setEnablesVideoStabilizationWhenAvailable:YES];
                [self setMovieFileOutput:movieFileOutput];
            }

            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            if ([session canAddOutput:stillImageOutput]) {
                [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
                [session addOutput:stillImageOutput];
                [self setStillImageOutput:stillImageOutput];
            }
        });
    }

    // call method below from viewWillAppear
    - (void)AVFoundationStartSession {
        dispatch_async([self sessionQueue], ^{
            [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
            [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
            [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

            __weak ViewController *weakSelf = self;
            [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
                ViewController *strongSelf = weakSelf;
                dispatch_async([strongSelf sessionQueue], ^{
                    // Manually restarting the session since it must have been stopped due to an error.
                    [[strongSelf session] startRunning];
                });
            }]];
            [[self session] startRunning];
        });
    }

    // call method below from viewDidDisappear
    - (void)AVFoundationStopSession {
        dispatch_async([self sessionQueue], ^{
            [[self session] stopRunning];

            [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];
            [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];

            [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
            [self removeObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" context:CapturingStillImageContext];
            [self removeObserver:self forKeyPath:@"movieFileOutput.recording" context:RecordingContext];
        });
    }

    - (BOOL)prefersStatusBarHidden {
        return YES;
    }

    - (BOOL)shouldAutorotate {
        // Disable autorotation of the interface when recording is in progress.
        return ![self lockInterfaceRotation];
    }

    - (NSUInteger)supportedInterfaceOrientations {
        return UIInterfaceOrientationMaskAll;
    }

    - (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
        // [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)toInterfaceOrientation];
    }

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
        if (context == CapturingStillImageContext) {
            BOOL isCapturingStillImage = [change[NSKeyValueChangeNewKey] boolValue];
            if (isCapturingStillImage) {
                [self runStillImageCaptureAnimation];
            }
        } else if (context == RecordingContext) {
            BOOL isRecording = [change[NSKeyValueChangeNewKey] boolValue];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (isRecording) {
                    // [[self cameraButton] setEnabled:NO];
                    // [[self recordButton] setTitle:NSLocalizedString(@"Stop", @"Recording button stop title") forState:UIControlStateNormal];
                    // [[self recordButton] setEnabled:YES];
                } else {
                    // [[self cameraButton] setEnabled:YES];
                    // [[self recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
                    // [[self recordButton] setEnabled:YES];
                }
            });
        } else if (context == SessionRunningAndDeviceAuthorizedContext) {
            BOOL isRunning = [change[NSKeyValueChangeNewKey] boolValue];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (isRunning) {
                    // [[self cameraButton] setEnabled:YES];
                    // [[self recordButton] setEnabled:YES];
                    // [[self stillButton] setEnabled:YES];
                } else {
                    // [[self cameraButton] setEnabled:NO];
                    // [[self recordButton] setEnabled:NO];
                    // [[self stillButton] setEnabled:NO];
                }
            });
        } else {
            [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        }
    }

    #pragma mark Actions

    - (IBAction)toggleMovieRecording:(id)sender {
        // [[self recordButton] setEnabled:NO];

        dispatch_async([self sessionQueue], ^{
            if (![[self movieFileOutput] isRecording]) {
                [self setLockInterfaceRotation:YES];

                if ([[UIDevice currentDevice] isMultitaskingSupported]) {
                    // Setup background task. This is needed because the captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received until AVCam returns to the foreground unless you request background execution time. This also ensures that there will be time to write the file to the assets library when AVCam is backgrounded. To conclude this background execution, -endBackgroundTask is called in -recorder:recordingDidFinishToOutputFileURL:error: after the recorded file has been saved.
                    [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
                }

                // Update the orientation on the movie file output video connection before starting recording.
                // [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

                // Turning OFF flash for video recording
                [ViewController setFlashMode:AVCaptureFlashModeOff forDevice:[[self videoDeviceInput] device]];

                // Start recording to a temporary file.
                NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];
                [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
            } else {
                [[self movieFileOutput] stopRecording];
            }
        });
    }

    - (IBAction)changeCamera:(id)sender {
        // [[self cameraButton] setEnabled:NO];
        // [[self recordButton] setEnabled:NO];
        // [[self stillButton] setEnabled:NO];

        dispatch_async([self sessionQueue], ^{
            AVCaptureDevice *currentVideoDevice = [[self videoDeviceInput] device];
            AVCaptureDevicePosition preferredPosition = AVCaptureDevicePositionUnspecified;
            AVCaptureDevicePosition currentPosition = [currentVideoDevice position];

            switch (currentPosition) {
                case AVCaptureDevicePositionUnspecified:
                    preferredPosition = AVCaptureDevicePositionBack;
                    break;
                case AVCaptureDevicePositionBack:
                    preferredPosition = AVCaptureDevicePositionFront;
                    break;
                case AVCaptureDevicePositionFront:
                    preferredPosition = AVCaptureDevicePositionBack;
                    break;
            }

            AVCaptureDevice *videoDevice = [ViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:preferredPosition];
            AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];

            [[self session] beginConfiguration];

            [[self session] removeInput:[self videoDeviceInput]];
            if ([[self session] canAddInput:videoDeviceInput]) {
                [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentVideoDevice];

                [ViewController setFlashMode:AVCaptureFlashModeAuto forDevice:videoDevice];
                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice];

                [[self session] addInput:videoDeviceInput];
                [self setVideoDeviceInput:videoDeviceInput];
            } else {
                [[self session] addInput:[self videoDeviceInput]];
            }

            [[self session] commitConfiguration];

            dispatch_async(dispatch_get_main_queue(), ^{
                // [[self cameraButton] setEnabled:YES];
                // [[self recordButton] setEnabled:YES];
                // [[self stillButton] setEnabled:YES];
            });
        });
    }

    - (IBAction)snapStillImage:(id)sender {
        dispatch_async([self sessionQueue], ^{
            // Update the orientation on the still image output video connection before capturing.
            // [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

            // Flash set to Auto for Still Capture
            [ViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]];

            // Capture a still image.
            [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                if (imageDataSampleBuffer) {
                    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                    UIImage *image = [[UIImage alloc] initWithData:imageData];
                    // do something good with saved image
                    [self saveImageToParse:image];
                }
            }];
        });
    }

    - (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer {
        // CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]];
        // [self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES];
    }

    - (void)subjectAreaDidChange:(NSNotification *)notification {
        CGPoint devicePoint = CGPointMake(.5, .5);
        [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO];
    }

    #pragma mark File Output Delegate

    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
        if (error)
            NSLog(@"%@", error);

        [self setLockInterfaceRotation:NO];

        // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns.
        UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error)
                NSLog(@"%@", error);

            [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];

            if (backgroundRecordingID != UIBackgroundTaskInvalid)
                [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
        }];
    }

    #pragma mark Device Configuration

    - (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange {
        dispatch_async([self sessionQueue], ^{
            AVCaptureDevice *device = [[self videoDeviceInput] device];
            NSError *error = nil;
            if ([device lockForConfiguration:&error]) {
                if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode]) {
                    [device setFocusMode:focusMode];
                    [device setFocusPointOfInterest:point];
                }
                if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode]) {
                    [device setExposureMode:exposureMode];
                    [device setExposurePointOfInterest:point];
                }
                [device setSubjectAreaChangeMonitoringEnabled:monitorSubjectAreaChange];
                [device unlockForConfiguration];
            } else {
                NSLog(@"%@", error);
            }
        });
    }

    + (void)setFlashMode:(AVCaptureFlashMode)flashMode forDevice:(AVCaptureDevice *)device {
        if ([device hasFlash] && [device isFlashModeSupported:flashMode]) {
            NSError *error = nil;
            if ([device lockForConfiguration:&error]) {
                [device setFlashMode:flashMode];
                [device unlockForConfiguration];
            } else {
                NSLog(@"%@", error);
            }
        }
    }

    + (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position {
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType];
        AVCaptureDevice *captureDevice = [devices firstObject];

        for (AVCaptureDevice *device in devices) {
            if ([device position] == position) {
                captureDevice = device;
                break;
            }
        }

        return captureDevice;
    }

    #pragma mark UI

    - (void)runStillImageCaptureAnimation {
        /*
        dispatch_async(dispatch_get_main_queue(), ^{
            [[[self previewView] layer] setOpacity:0.0];
            [UIView animateWithDuration:.25 animations:^{
                [[[self previewView] layer] setOpacity:1.0];
            }];
        });
        */
    }

    - (void)checkDeviceAuthorizationStatus {
        NSString *mediaType = AVMediaTypeVideo;

        [AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) {
            if (granted) {
                // Granted access to mediaType
                [self setDeviceAuthorized:YES];
            } else {
                // Not granted access to mediaType
                dispatch_async(dispatch_get_main_queue(), ^{
                    [[[UIAlertView alloc] initWithTitle:@"AVCam!" message:@"AVCam doesn't have permission to use Camera, please change privacy settings" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil] show];
                    [self setDeviceAuthorized:NO];
                });
            }
        }];
    }
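Note that saveImageToParse: in the snapStillImage: completion handler is specific to my app. If you just want the photo in the Camera Roll, as the question asks, one option is the AssetsLibrary framework already imported in the header. A minimal sketch of what could replace that call, reusing the imageData created in the handler:

    // Inside the captureStillImageAsynchronouslyFromConnection: completion handler,
    // write the JPEG data straight to the Saved Photos album instead of calling saveImageToParse:.
    [[[ALAssetsLibrary alloc] init] writeImageDataToSavedPhotosAlbum:imageData
                                                            metadata:nil
                                                     completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Could not save the still image: %@", error);
        }
    }];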