How can I stream a video while it is being recorded?

OK, so in my app I have a ViewController that handles recording video from the camera, which is then saved to the Documents directory of my app's sandbox. What I want to do now is upload the parts of the file that have already been written while the video is still recording, to a server (I'm new to this, but I'm guessing an HTTP server). The reason I want this is so I can add support for streaming the video to a Chromecast while it records. This should be possible, because the EZCast app already implements similar functionality.

I have already worked out how to upload a video to an HTTP server, how to send a video from the HTTP server to the Chromecast, and how to actually record the video, using the following sources:

Chromecast: https://developers.google.com/cast/

Chromecast: https://github.com/googlecast/CVideosVideos-ios

HTTP server: https://github.com/robbiehanson/CocoaHTTPServer

Recording from the iDevice camera: https://github.com/BradLarson/GPUImage

To cast the video I obviously need a connection, and I have to already be connected before I'm allowed into the recording view. So my code for simply casting a plain .mp4 video looks like this:

    - (void)startCasting
    {
        [self establishServer];
        self.mediaControlChannel = [[GCKMediaControlChannel alloc] init];
        self.mediaControlChannel.delegate = self;
        [self.deviceManager addChannel:self.mediaControlChannel];
        [self.mediaControlChannel requestStatus];

        NSString *path = [NSString stringWithFormat:@"http://%@:%hu/%@",
                          [self getIPAddress], [httpServer listeningPort], @"Movie.mp4"];
        NSString *image = @""; // Image HERE
        NSString *type = @"video/mp4"; // Video type

        self.metadata = [[GCKMediaMetadata alloc] init];
        [self.metadata setString:@"" forKey:kGCKMetadataKeySubtitle]; // Description HERE
        [self.metadata setString:[NSString stringWithFormat:@"Casting %@", @"Movie.mp4"]
                          forKey:kGCKMetadataKeyTitle]; // Title HERE

        // Define the media information
        GCKMediaInformation *mediaInformation =
            [[GCKMediaInformation alloc] initWithContentID:path
                                                streamType:GCKMediaStreamTypeNone
                                               contentType:type
                                                  metadata:self.metadata
                                            streamDuration:0
                                                customData:nil];

        // Cast the video
        [self.mediaControlChannel loadMedia:mediaInformation autoplay:TRUE playPosition:0];
    }

    - (NSString *)getIPAddress
    {
        NSString *address = @"error";
        struct ifaddrs *interfaces = NULL;
        struct ifaddrs *temp_addr = NULL;
        int success = 0;

        // Retrieve the current interfaces - returns 0 on success
        success = getifaddrs(&interfaces);
        if (success == 0) {
            // Loop through the linked list of interfaces
            temp_addr = interfaces;
            while (temp_addr != NULL) {
                if (temp_addr->ifa_addr->sa_family == AF_INET) {
                    // Check if the interface is en0, which is the Wi-Fi connection on the iPhone
                    if ([[NSString stringWithUTF8String:temp_addr->ifa_name] isEqualToString:@"en0"]) {
                        // Get an NSString from the C string
                        address = [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)temp_addr->ifa_addr)->sin_addr)];
                    }
                }
                temp_addr = temp_addr->ifa_next;
            }
        }
        // Free memory
        freeifaddrs(interfaces);
        return address;
    }

Now, before casting, I need to establish my HTTP server. After adding CocoaHTTPServer to your project this is simple and requires very little implementation. My code to start the server looks like this:

    static const int ddLogLevel = LOG_LEVEL_VERBOSE;

    - (void)establishServer
    {
        [httpServer stop];

        // Configure our logging framework.
        // To keep things simple and fast, we're just going to log to the Xcode console.
        [DDLog addLogger:[DDTTYLogger sharedInstance]];

        // Create server using our custom MyHTTPServer class
        httpServer = [[HTTPServer alloc] init];

        // Tell the server to broadcast its presence via Bonjour.
        // This allows browsers such as Safari to automatically discover our service.
        [httpServer setType:@"_http._tcp."];

        // Normally there's no need to run our server on any specific port.
        // Technologies like Bonjour allow clients to dynamically discover the server's port at runtime.
        // However, for easy testing you may want to force a certain port so you can just hit the refresh button.
        // [httpServer setPort:12345];

        // Serve files from our embedded Web folder
        NSString *webPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/"];
        DDLogInfo(@"Setting document root: %@", webPath);
        [httpServer setDocumentRoot:webPath];

        [self startServer];
    }

    - (void)startServer
    {
        // Start the server (and check for problems)
        NSError *error;
        if ([httpServer start:&error]) {
            DDLogInfo(@"Started HTTP Server on port %hu", [httpServer listeningPort]);
        } else {
            DDLogError(@"Error starting HTTP Server: %@", error);
        }
    }

Finally, I use this code to start displaying and recording from the iPhone camera:

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
        // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
        // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
        // videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080 cameraPosition:AVCaptureDevicePositionBack];

        videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
        videoCamera.horizontallyMirrorFrontFacingCamera = NO;
        videoCamera.horizontallyMirrorRearFacingCamera = NO;

        // filter = [[GPUImageSepiaFilter alloc] init];
        // filter = [[GPUImageTiltShiftFilter alloc] init];
        // [(GPUImageTiltShiftFilter *)filter setTopFocusLevel:0.65];
        // [(GPUImageTiltShiftFilter *)filter setBottomFocusLevel:0.85];
        // [(GPUImageTiltShiftFilter *)filter setBlurSize:1.5];
        // [(GPUImageTiltShiftFilter *)filter setFocusFallOffRate:0.2];
        // filter = [[GPUImageSketchFilter alloc] init];
        filter = [[GPUImageFilter alloc] init];
        // filter = [[GPUImageSmoothToonFilter alloc] init];

        [videoCamera addTarget:filter];
        GPUImageView *filterView = (GPUImageView *)self.view;
        // filterView.fillMode = kGPUImageFillModeStretch;
        // filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;

        // Record a movie and store it in /Documents, visible via iTunes file sharing
        NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];
        unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
        NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

        movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
        movieWriter.encodingLiveVideo = YES;
        // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
        // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(720.0, 1280.0)];
        // movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(1080.0, 1920.0)];

        [filter addTarget:movieWriter];
        [filter addTarget:filterView];

        [videoCamera startCameraCapture];
    }

    bool recording;

    - (IBAction)Record:(id)sender
    {
        if (recording == YES) {
            Record.titleLabel.text = @"Record";
            recording = NO;

            double delayInSeconds = 0.1;
            dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
            dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
                [filter removeTarget:movieWriter];
                videoCamera.audioEncodingTarget = nil;
                [movieWriter finishRecording];
                NSLog(@"Movie completed");
                // [videoCamera.inputCamera lockForConfiguration:nil];
                // [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOff];
                // [videoCamera.inputCamera unlockForConfiguration];
            });

            UIAlertView *message = [[UIAlertView alloc] initWithTitle:@"Do You Wish To Store This Footage?"
                                                              message:@"Recording has finished. Do you wish to store this video into your camera roll?"
                                                             delegate:self
                                                    cancelButtonTitle:nil
                                                    otherButtonTitles:@"Yes", @"No", nil];
            [message show];
            [self dismissViewControllerAnimated:YES completion:nil];
        } else {
            double delayToStartRecording = 0.5;
            dispatch_time_t startTime = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
            dispatch_after(startTime, dispatch_get_main_queue(), ^(void){
                NSLog(@"Start recording");
                videoCamera.audioEncodingTarget = movieWriter;
                [movieWriter startRecording];
                // NSError *error = nil;
                // if (![videoCamera.inputCamera lockForConfiguration:&error])
                // {
                //     NSLog(@"Error locking for configuration: %@", error);
                // }
                // [videoCamera.inputCamera setTorchMode:AVCaptureTorchModeOn];
                // [videoCamera.inputCamera unlockForConfiguration];
                recording = YES;
                Record.titleLabel.text = @"Stop";
            });
            [self startCasting];
        }
    }

Now, as you can probably see, I'm trying to cast the video directly after starting the recording, pointing the Chromecast at the server location where the file will be. This doesn't work, and I believe it's because the file isn't usable at that path until the stop button has been pressed. How can I work around this? Can anyone help?
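For what it's worth, the file does appear on disk as soon as recording starts and keeps growing, so as far as I can tell the real problem is the container, not the bytes: the moov atom (the MP4 index) is only written when `finishRecording` finalizes the file, so the partial file isn't playable yet. Here's a rough diagnostic sketch I put together to confirm this (the method name is my own; call it on a timer while recording is in progress):

```objc
// Diagnostic sketch (assumption: invoked periodically while recording).
// The file's size grows as frames are encoded, but AVAsset reports the
// partial MP4 as unplayable until finishRecording writes the moov atom.
- (void)checkPartialMovie
{
    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.mp4"];

    // Bytes written so far - this keeps increasing during recording
    NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfItemAtPath:pathToMovie
                                                                           error:nil];
    NSLog(@"Bytes written so far: %llu", [attrs fileSize]);

    // Whether the container is a valid, playable MP4 yet
    AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:pathToMovie]];
    NSLog(@"Playable as-is: %d", asset.isPlayable);
}
```

So serving `Movie.mp4` over HTTP mid-recording hands the Chromecast an incomplete container, which is presumably why nothing plays until I stop.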

Media types supported by Chromecast: https://developers.google.com/cast/docs/media