Creating a movie from a [UIImage] array in Swift

I have found how to export a [UIImage] array as a movie, but it is all in Objective-C and I can't figure out how to do it in Swift.

I need to create a video from a [UIImage] array.

Working from the link above, I have been studying Zoul's answer, Part 1) wiring up the writer.

Here is what I have so far:

 let paths = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
 let documentsURL = paths[0] as! NSURL

 let videoWriter: AVAssetWriter = AVAssetWriter(URL: documentsURL, fileType: AVFileTypeQuickTimeMovie, error: nil)

 var videoSettings: NSDictionary = NSDictionary(

I can't figure out the correct Swift version of this:

 NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
     AVVideoCodecH264, AVVideoCodecKey,
     [NSNumber numberWithInt:640], AVVideoWidthKey,
     [NSNumber numberWithInt:480], AVVideoHeightKey,
     nil];

The dictionary literal is easy to construct:

 import AVFoundation

 let videoSettings = [
     AVVideoCodecKey: AVVideoCodecH264,
     AVVideoWidthKey: 640,
     AVVideoHeightKey: 480
 ]
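One note if you are building against a newer SDK: the AVVideoCodecH264 string constant was later deprecated in favor of the typed AVVideoCodecType.h264, so the same literal would then look like this (only the codec value changes):

 let videoSettings: [String: Any] = [
     AVVideoCodecKey: AVVideoCodecType.h264,
     AVVideoWidthKey: 640,
     AVVideoHeightKey: 480
 ]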

As for everything else, I'd recommend reading Apple's The Swift Programming Language to learn the basics first, rather than relying on SO or tutorials to cover whatever it is you're trying to do. "Teach a man to fish," as they say.
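That said, for the "wire the writer" part of the question, a minimal sketch in Swift 3 syntax could look like the one below. This is only an outline of Zoul's setup, not a drop-in solution: the movie.mov file name is a placeholder, and note that AVAssetWriter needs a full file URL rather than the bare documents directory used in the question.

 import AVFoundation

 let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
 let outputURL = documentsURL.appendingPathComponent("movie.mov")   // placeholder file name

 do {
     let videoWriter = try AVAssetWriter(url: outputURL, fileType: AVFileTypeQuickTimeMovie)

     let videoSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecH264,
                                         AVVideoWidthKey: 640,
                                         AVVideoHeightKey: 480]

     let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
     let adaptor = AVAssetWriterInputPixelBufferAdaptor(
         assetWriterInput: writerInput,
         sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB)])

     if videoWriter.canAdd(writerInput) {
         videoWriter.add(writerInput)
     }
     // From here you start the session and append pixel buffers frame by frame,
     // as the Swift 3 answer below does in full (the adaptor is what you append to).
     _ = adaptor
 } catch {
     print("could not create AVAssetWriter: \(error)")
 }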

I converted the Objective-C code posted by @Cameron E to Swift 3, and it is working. Link to that answer: CEMovieMaker by @Cameron E.

Here is the CXEImagesToVideo class:

 //
 //  CXEImagesToVideo.swift
 //  VideoAPPTest
 //
 //  Created by Wulei on 16/12/14.
 //  Copyright © 2016 wulei. All rights reserved.
 //

 import Foundation
 import AVFoundation
 import UIKit

 typealias CXEMovieMakerCompletion = (URL) -> Void
 typealias CXEMovieMakerUIImageExtractor = (AnyObject) -> UIImage?

 public class CXEImagesToVideo: NSObject {
     var assetWriter: AVAssetWriter!
     var writeInput: AVAssetWriterInput!
     var bufferAdapter: AVAssetWriterInputPixelBufferAdaptor!
     var videoSettings: [String: Any]!
     var frameTime: CMTime!
     var fileURL: URL!
     var completionBlock: CXEMovieMakerCompletion?
     var movieMakerUIImageExtractor: CXEMovieMakerUIImageExtractor?

     // Builds an H.264 output-settings dictionary for the given frame size.
     public class func videoSettings(codec: String, width: Int, height: Int) -> [String: Any] {
         if Int(width) % 16 != 0 {
             print("warning: video settings width must be divisible by 16")
         }

         let videoSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecH264,
                                             AVVideoWidthKey: width,
                                             AVVideoHeightKey: height]
         return videoSettings
     }

     // Sets up the asset writer, its video input, and the pixel buffer adaptor.
     // The output file is written to Documents/exprotvideo.mp4 at 10 frames per second.
     public init(videoSettings: [String: Any]) {
         super.init()

         let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
         let tempPath = paths[0] + "/exprotvideo.mp4"
         if FileManager.default.fileExists(atPath: tempPath) {
             guard (try? FileManager.default.removeItem(atPath: tempPath)) != nil else {
                 print("remove path failed")
                 return
             }
         }

         self.fileURL = URL(fileURLWithPath: tempPath)
         self.assetWriter = try! AVAssetWriter(url: self.fileURL, fileType: AVFileTypeQuickTimeMovie)

         self.videoSettings = videoSettings
         self.writeInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
         assert(self.assetWriter.canAdd(self.writeInput), "add failed")

         self.assetWriter.add(self.writeInput)
         let bufferAttributes: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB)]
         self.bufferAdapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: self.writeInput, sourcePixelBufferAttributes: bufferAttributes)
         self.frameTime = CMTimeMake(1, 10)
     }

     func createMovieFrom(urls: [URL], withCompletion: @escaping CXEMovieMakerCompletion) {
         self.createMovieFromSource(images: urls as [AnyObject],
                                    extractor: { (inputObject: AnyObject) -> UIImage? in
                                        return UIImage(data: try! Data(contentsOf: inputObject as! URL))
                                    },
                                    withCompletion: withCompletion)
     }

     func createMovieFrom(images: [UIImage], withCompletion: @escaping CXEMovieMakerCompletion) {
         self.createMovieFromSource(images: images,
                                    extractor: { (inputObject: AnyObject) -> UIImage? in
                                        return inputObject as? UIImage
                                    },
                                    withCompletion: withCompletion)
     }

     // Pulls frames from the source array on a background queue, appends them to the
     // writer, and finally calls the completion block on the main queue.
     func createMovieFromSource(images: [AnyObject],
                                extractor: @escaping CXEMovieMakerUIImageExtractor,
                                withCompletion: @escaping CXEMovieMakerCompletion) {
         self.completionBlock = withCompletion

         self.assetWriter.startWriting()
         self.assetWriter.startSession(atSourceTime: kCMTimeZero)

         let mediaInputQueue = DispatchQueue(label: "mediaInputQueue")
         var i = 0
         let frameNumber = images.count

         self.writeInput.requestMediaDataWhenReady(on: mediaInputQueue) {
             while true {
                 if i >= frameNumber {
                     break
                 }

                 if self.writeInput.isReadyForMoreMediaData {
                     var sampleBuffer: CVPixelBuffer?
                     autoreleasepool {
                         // Skip frames that cannot be extracted instead of force-unwrapping nil.
                         if let img = extractor(images[i]), let cgImage = img.cgImage {
                             sampleBuffer = self.newPixelBufferFrom(cgImage: cgImage)
                         } else {
                             print("Warning: could not extract one of the frames")
                         }
                     }

                     if let sampleBuffer = sampleBuffer {
                         if i == 0 {
                             self.bufferAdapter.append(sampleBuffer, withPresentationTime: kCMTimeZero)
                         } else {
                             let value = i - 1
                             let lastTime = CMTimeMake(Int64(value), self.frameTime.timescale)
                             let presentTime = CMTimeAdd(lastTime, self.frameTime)
                             self.bufferAdapter.append(sampleBuffer, withPresentationTime: presentTime)
                         }
                     }
                     i += 1
                 }
             }

             self.writeInput.markAsFinished()
             self.assetWriter.finishWriting {
                 DispatchQueue.main.sync {
                     self.completionBlock!(self.fileURL)
                 }
             }
         }
     }

     // Renders a CGImage into a new 32ARGB pixel buffer sized to the video settings.
     func newPixelBufferFrom(cgImage: CGImage) -> CVPixelBuffer? {
         let options: [String: Any] = [kCVPixelBufferCGImageCompatibilityKey as String: true,
                                       kCVPixelBufferCGBitmapContextCompatibilityKey as String: true]
         var pxbuffer: CVPixelBuffer?
         let frameWidth = self.videoSettings[AVVideoWidthKey] as! Int
         let frameHeight = self.videoSettings[AVVideoHeightKey] as! Int

         let status = CVPixelBufferCreate(kCFAllocatorDefault, frameWidth, frameHeight,
                                          kCVPixelFormatType_32ARGB, options as CFDictionary?, &pxbuffer)
         assert(status == kCVReturnSuccess && pxbuffer != nil, "newPixelBuffer failed")

         CVPixelBufferLockBaseAddress(pxbuffer!, CVPixelBufferLockFlags(rawValue: 0))
         let pxdata = CVPixelBufferGetBaseAddress(pxbuffer!)
         let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
         let context = CGContext(data: pxdata, width: frameWidth, height: frameHeight,
                                 bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pxbuffer!),
                                 space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
         assert(context != nil, "context is nil")

         context!.concatenate(CGAffineTransform.identity)
         context!.draw(cgImage, in: CGRect(x: 0, y: 0, width: cgImage.width, height: cgImage.height))
         CVPixelBufferUnlockBaseAddress(pxbuffer!, CVPixelBufferLockFlags(rawValue: 0))

         return pxbuffer
     }
 }

Usage:

 var uiImages = [UIImage]()
 /** add the images to uiImages */

 let settings = CXEImagesToVideo.videoSettings(codec: AVVideoCodecH264,
                                               width: (uiImages[0].cgImage?.width)!,
                                               height: (uiImages[0].cgImage?.height)!)
 let movieMaker = CXEImagesToVideo(videoSettings: settings)
 movieMaker.createMovieFrom(images: uiImages) { (fileURL: URL) in
     let video = AVAsset(url: fileURL)
     let playerItem = AVPlayerItem(asset: video)
     let avPlayer = AVPlayer(playerItem: playerItem)
     let playerLayer = AVPlayerLayer(player: avPlayer)
     playerLayer.frame = CGRect(x: 0, y: 0,
                                width: UIScreen.main.bounds.width,
                                height: UIScreen.main.bounds.width * 3.0 / 4.0)
     self.view.layer.addSublayer(playerLayer)
     avPlayer.play()
 }
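As written, the class hard-codes a frame duration of CMTimeMake(1, 10), i.e. roughly 10 frames per second. One possible tweak (my own suggestion, not part of the original answer, and assuming the class lives in the same module as the calling code so its internal frameTime property is visible) is to adjust it on the movieMaker instance above before calling createMovieFrom:

 movieMaker.frameTime = CMTimeMake(1, 30)   // one frame every 1/30 s instead of the default 1/10 s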

Use the fileURL to export or play the video. There are both asynchronous and synchronous versions. Gist: https://gist.github.com/Willib/b97b08d8d877ca5d875ff14abb4c3f1a
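If "export" means saving the finished file to the user's photo library, a minimal sketch using the Photos framework could look like this (assuming photo-library authorization has already been requested and granted, and using the fileURL delivered by the completion block above):

 import Photos

 PHPhotoLibrary.shared().performChanges({
     // Create a new photo-library asset from the movie file the writer produced.
     _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
 }, completionHandler: { success, error in
     print("video saved:", success, error?.localizedDescription ?? "no error")
 })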
