How can I store the video on the iPhone while publishing it with RTMPStreamPublisher?

Right now I am publishing video with RTMPStreamPublisher. The upload works successfully, but can anyone tell me how I can also store the same video on the iPhone while it is being uploaded to the server?

I am using https://github.com/slavavdovichenko/MediaLibDemos , but there is not much documentation available. If I could just store the data that is being published, my job would be done.

Here is the method they use to upload the stream, but I cannot find a way to store the same video on the iPhone device:

    // ACTIONS
    - (void)doConnect {
    #if 0
        // use ffmpeg rtmp
        NSString *url = [NSString stringWithFormat:@"%@/%@", hostTextField.text, streamTextField.text];
        upstream = [[BroadcastStreamClient alloc] init:url resolution:RESOLUTION_LOW];
        upstream.delegate = self;
        upstream.encoder = [MPMediaEncoder new];
        [upstream start];
        socket = [[RTMPClient alloc] init:host];
        btnConnect.title = @"Disconnect";
        return;
    #endif
    #if 0
        // use inside RTMPClient instance
        upstream = [[BroadcastStreamClient alloc] init:hostTextField.text resolution:RESOLUTION_LOW];
        //upstream = [[BroadcastStreamClient alloc] initOnlyAudio:hostTextField.text];
        //upstream = [[BroadcastStreamClient alloc] initOnlyVideo:hostTextField.text resolution:RESOLUTION_LOW];
    #else
        // use outside RTMPClient instance
        if (!socket) {
            socket = [[RTMPClient alloc] init:hostTextField.text];
            if (!socket) {
                [self showAlert:@"Socket has not been created"];
                return;
            }
            [socket spawnSocketThread];
        }
        upstream = [[BroadcastStreamClient alloc] initWithClient:socket resolution:RESOLUTION_LOW];
    #endif
        [upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
        //[upstream setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
        //[upstream setVideoBitrate:512000];
        upstream.delegate = self;
        [upstream stream:streamTextField.text publishType:PUBLISH_LIVE];
        //[upstream stream:streamTextField.text publishType:PUBLISH_RECORD];
        //[upstream stream:streamTextField.text publishType:PUBLISH_APPEND];
        btnConnect.title = @"Disconnect";
    }

I found that with the instance of BroadcastStreamClient named upstream I can get the AVCaptureSession via the following line:

 [upstream getCaptureSession]; 

How can I use this AVCaptureSession to record the video on the iPhone?

Once you have the AVCaptureSession, you can add an instance of AVCaptureMovieFileOutput to it like this:

    AVCaptureMovieFileOutput *movieFileOutput = [AVCaptureMovieFileOutput new];
    if ([captureSession canAddOutput:movieFileOutput]) {
        [captureSession addOutput:movieFileOutput];
    }

    // Start recording
    NSURL *outputURL = …;
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

Source: https://www.objc.io/issues/23-video/capturing-video/

Also take a look at this for a better idea of how to use AVCaptureFileOutput: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureFileOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureFileOutput
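To find out when the file has actually finished writing, the object passed as recordingDelegate should adopt AVCaptureFileOutputRecordingDelegate. A minimal sketch of that callback; saving the finished file to the Saved Photos album here is my assumption, not part of the original answer:

    // Called by AVFoundation when recording to outputFileURL completes.
    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
    didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
          fromConnections:(NSArray *)connections
                    error:(NSError *)error
    {
        if (error) {
            NSLog(@"Recording failed: %@", error);
            return;
        }
        // The movie file is now complete; keep it, or copy it to the album.
        NSString *path = [outputFileURL path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
            UISaveVideoAtPathToSavedPhotosAlbum(path, nil, nil, nil);
        }
    }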

Try this when selecting a video from the photo album, or after recording one, i.e. in the didFinishPickingMediaWithInfo method:

    NSURL __block *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
    NSString *moviePath = [videoUrl path];
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
        UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
    }
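If you also want to know whether the save succeeded, UISaveVideoAtPathToSavedPhotosAlbum accepts a target and a completion selector; the selector signature is fixed by UIKit, while passing self as the target is an assumption in this sketch:

    UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self,
        @selector(video:didFinishSavingWithError:contextInfo:), nil);

    // UIKit invokes this once the save to the Saved Photos album finishes.
    - (void)video:(NSString *)videoPath
    didFinishSavingWithError:(NSError *)error
              contextInfo:(void *)contextInfo
    {
        if (error) {
            NSLog(@"Saving to Saved Photos failed: %@", error);
        }
    }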

As long as your objects conform to the <NSCoding> protocol (many collection types already do), you can cache the video with NSKeyedArchiver. Note that the Caches directory gets cleaned out periodically by the system:

http://khanlou.com/2015/07/cache-me-if-you-can/
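For example, raw video bytes held in an NSData object (which already conforms to NSCoding) could be archived into the Caches directory like this; the variable videoData and the file name cachedVideo.mp4 are placeholders of my own:

    // videoData: an NSData object holding the video (placeholder).
    NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(
        NSCachesDirectory, NSUserDomainMask, YES) firstObject];
    NSString *cachePath =
        [cachesDir stringByAppendingPathComponent:@"cachedVideo.mp4"];
    // Archive into Caches; the system may purge this directory under pressure.
    [NSKeyedArchiver archiveRootObject:videoData toFile:cachePath];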

Alternatively, and perhaps a better option, you could store temporary video files locally using NSTemporaryDirectory() (which is part of NSPathUtilities). AVAssetExportSession should let you put the video file there:

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputFileURL;
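A sketch of how that export might be completed, writing into NSTemporaryDirectory(); the file name temp.mov is a placeholder, and the completion-handler body is my assumption rather than part of the original answer:

    NSString *tempPath =
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"temp.mov"];
    NSURL *outputFileURL = [NSURL fileURLWithPath:tempPath];

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputFileURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Export finished: %@", outputFileURL);
        } else {
            NSLog(@"Export failed: %@", exportSession.error);
        }
    }];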