iOS AVPlayer won't play 240 fps video in slow motion

I recorded a 240 fps video after changing the AVCaptureDeviceFormat. If I save that video to the photo library, it plays with the slow-motion effect. But if I play the same file from the documents directory with AVPlayer, I don't see the slow-mo effect.

Code used to play the video:

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[AVAsset assetWithURL:[NSURL fileURLWithPath:fullPath]]];
    AVPlayer *feedVideoPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    AVPlayerViewController *playerController = [[AVPlayerViewController alloc] init];
    playerController.view.frame = CGRectMake(0, 0, videoPreviewView.frame.size.width, videoPreviewView.frame.size.height);
    playerController.player = feedVideoPlayer;
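A quick way to confirm that the raw file itself carries no retiming information (a diagnostic sketch, not part of the original question; `fullPath` is the same variable used above):

```swift
// Diagnostic sketch: inspect the raw recording's video track.
// In a plain high-frame-rate file, each segment's source and target durations
// match, so AVPlayer plays it back at real time with no slow-motion ramp.
import AVFoundation

let asset = AVAsset(url: URL(fileURLWithPath: fullPath)) // fullPath as above
if let track = asset.tracks(withMediaType: AVMediaTypeVideo).first {
    print("nominal frame rate:", track.nominalFrameRate) // ~240 for this recording
    for segment in track.segments {
        // equal source/target durations mean no slow motion is baked in
        print(CMTimeGetSeconds(segment.timeMapping.source.duration),
              CMTimeGetSeconds(segment.timeMapping.target.duration))
    }
}
```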

It's a bit annoying, but I believe you'll need to re-create the video in an AVComposition if you don't want to lose quality. I'd love to know if there's another way, but this is what I've come up with. You can technically export the video via AVAssetExportSession, but using the PassThrough quality will result in the same video file, which will not be slow motion; you'd need to transcode it, which loses quality (AFAIK. See the question Playing a slow-mo AVAsset in AVPlayer for that solution).
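To illustrate the pass-through case described above (a sketch; `sourceURL` and `destinationURL` are assumed names, not from the original):

```swift
// Sketch of a pass-through export. Samples are copied verbatim with no
// re-encoding, so the output is still a real-time 240 fps file — the same
// behavior the question observes when playing the raw recording.
import AVFoundation

let asset = AVAsset(url: sourceURL) // assumed: URL of the 240 fps recording
if let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough) {
    export.outputURL = destinationURL // assumed destination
    export.outputFileType = AVFileTypeQuickTimeMovie
    export.exportAsynchronously {
        // export.status / export.error should be checked here; the resulting
        // file plays at normal speed in AVPlayer, just like the source
    }
}
```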


The first thing you'll need to do is grab the original time mapping objects of the source media. You can do that like so:

    let options = PHVideoRequestOptions()
    options.version = PHVideoRequestOptionsVersion.current
    options.deliveryMode = .highQualityFormat

    PHImageManager().requestAVAsset(forVideo: phAsset, options: options, resultHandler: { (avAsset, mix, info) in
        guard let avAsset = avAsset else { return }
        let originalTimeMaps = avAsset.tracks(withMediaType: AVMediaTypeVideo)
            .first?
            .segments
            .flatMap { $0.timeMapping } ?? []
    })
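For intuition, each CMTimeMapping pairs a source time range with a (typically longer) target range. A hand-built mapping with purely illustrative values might look like this:

```swift
// Illustrative values only: 1 second of 240 fps source stretched across
// 8 seconds of playback gives a 1/8-speed slow-motion segment.
import CoreMedia

let source = CMTimeRange(start: kCMTimeZero, duration: CMTimeMake(1, 1))
let target = CMTimeRange(start: kCMTimeZero, duration: CMTimeMake(8, 1))
let mapping = CMTimeMapping(source: source, target: target)
```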

Once you have the timeMappings of the original media (the one sitting in your documents directory), you can pass in the URL of that media along with the original CMTimeMapping objects you'd like to re-create, and create a new AVComposition that is playable inside an AVPlayer. You'll need a class similar to this:

    class CompositionMapper {

        let url: URL
        let timeMappings: [CMTimeMapping]

        init(for url: URL, with timeMappings: [CMTimeMapping]) {
            self.url = url
            self.timeMappings = timeMappings
        }

        init(with asset: AVAsset, and timeMappings: [CMTimeMapping]) {
            guard let asset = asset as? AVURLAsset else {
                print("cannot get a base URL from this asset.")
                fatalError()
            }
            self.timeMappings = timeMappings
            self.url = asset.url
        }

        func compose() -> AVComposition {
            let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

            let emptyTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
            let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

            let asset = AVAsset(url: url)
            guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }

            var segments: [AVCompositionTrackSegment] = []
            for map in timeMappings {
                let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
                segments.append(segment)
            }

            emptyTrack.preferredTransform = videoAssetTrack.preferredTransform
            emptyTrack.segments = segments

            if let _ = asset.tracks(withMediaType: AVMediaTypeAudio).first {
                audioTrack.segments = segments
            }

            return composition.copy() as! AVComposition
        }
    }

You can then use the compose() function of your CompositionMapper class to give you an AVComposition that is ready to play inside an AVPlayer, and it should respect the CMTimeMapping objects you've passed in.

    let compositionMapper = CompositionMapper(for: someAVAssetURL, with: originalTimeMaps)
    let mappedComposition = compositionMapper.compose()

    let playerItem = AVPlayerItem(asset: mappedComposition)
    let player = AVPlayer(playerItem: playerItem)
    playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
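If you need a flat slow-motion file rather than in-app playback, the mapped composition can also be transcoded with AVAssetExportSession. As noted above, this re-encodes and loses some quality; `outputURL` here is an assumed name:

```swift
// Sketch: transcode the mapped composition to bake the slow motion into a
// standalone file (re-encodes, so some quality is lost).
import AVFoundation

if let export = AVAssetExportSession(asset: mappedComposition, presetName: AVAssetExportPresetHighestQuality) {
    export.outputURL = outputURL // assumed destination in the documents directory
    export.outputFileType = AVFileTypeQuickTimeMovie
    export.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
    export.exportAsynchronously {
        // check export.status and export.error before using the output file
    }
}
```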

Let me know if you need help converting this to Objective-C, but it should be relatively straightforward.