AVPlayer streaming progress

I have successfully used AVPlayer to stream audio from a server, and what I want to do now is show a custom UISlider that displays the buffering progress.

Something like this:

[screenshot: a slider whose track shows both the playback position and the buffered portion]

With AVPlayer there seems to be no way to get the total download size of the audio file or how much of it has been downloaded so far, only the current playback time and the total duration.

Is there any way around this?

I was just working on this, and so far I have the following:

    - (NSTimeInterval)availableDuration
    {
        NSArray *loadedTimeRanges = [[self.player currentItem] loadedTimeRanges];
        CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
        Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
        Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);
        NSTimeInterval result = startSeconds + durationSeconds;
        return result;
    }

This should work well:

Objective-C:

    - (CMTime)availableDuration
    {
        NSValue *range = self.player.currentItem.loadedTimeRanges.firstObject;
        if (range != nil) {
            return CMTimeRangeGetEnd(range.CMTimeRangeValue);
        }
        return kCMTimeZero;
    }

Swift version:

    func availableDuration() -> CMTime {
        if let range = self.player?.currentItem?.loadedTimeRanges.first {
            return CMTimeRangeGetEnd(range.timeRangeValue)
        }
        return kCMTimeZero
    }

To inspect the current value, you can use CMTimeShow([self availableDuration]); in Objective-C, or CMTimeShow(availableDuration()) in Swift.
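If the goal is to feed a UISlider, one way to wire this up is a periodic time observer that converts the buffered end time into a 0–1 fraction. This is a minimal sketch in current Swift syntax (the answers in this thread use older Swift); the player URL and the bufferSlider used purely for buffer display are assumptions, not part of any answer above:

    import AVFoundation
    import UIKit

    final class StreamViewController: UIViewController {
        // Assumptions: a remote stream URL and a slider dedicated to buffer display.
        let player = AVPlayer(url: URL(string: "https://example.com/stream.mp3")!)
        let bufferSlider = UISlider()
        private var timeObserver: Any?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Fire twice a second; each tick recomputes the buffered fraction.
            let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
            timeObserver = player.addPeriodicTimeObserver(forInterval: interval,
                                                          queue: .main) { [weak self] _ in
                self?.updateBufferSlider()
            }
        }

        private func updateBufferSlider() {
            guard let item = player.currentItem,
                  let range = item.loadedTimeRanges.first?.timeRangeValue else { return }
            let buffered = CMTimeGetSeconds(CMTimeRangeGetEnd(range))
            let total = CMTimeGetSeconds(item.duration)
            // duration is indefinite (NaN) for live streams, so guard against it.
            guard total.isFinite, total > 0 else { return }
            bufferSlider.value = Float(buffered / total)
        }

        deinit {
            if let observer = timeObserver {
                player.removeTimeObserver(observer)
            }
        }
    }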

This method will return the buffered time interval for your UISlider:

    public var bufferAvail: NSTimeInterval {
        // Check if there is a player instance
        if player.currentItem != nil {
            // Get the current AVPlayerItem
            let item: AVPlayerItem = player.currentItem
            if item.status == AVPlayerItemStatus.ReadyToPlay {
                let timeRangeArray: NSArray = item.loadedTimeRanges
                let aTimeRange: CMTimeRange = timeRangeArray.objectAtIndex(0).CMTimeRangeValue
                let startTime = CMTimeGetSeconds(aTimeRange.start)
                let loadedDuration = CMTimeGetSeconds(aTimeRange.duration)
                return NSTimeInterval(startTime + loadedDuration)
            } else {
                return CMTimeGetSeconds(kCMTimeInvalid)
            }
        } else {
            return CMTimeGetSeconds(kCMTimeInvalid)
        }
    }

The accepted answer may cause problems if the returned array is empty. Here is a fixed function:

    - (NSTimeInterval)availableDuration
    {
        NSArray *loadedTimeRanges = [[_player currentItem] loadedTimeRanges];
        if ([loadedTimeRanges count]) {
            CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
            Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
            Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);
            NSTimeInterval result = startSeconds + durationSeconds;
            return result;
        }
        return 0;
    }
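An alternative to polling, not covered in the answers above, is to observe loadedTimeRanges with KVO so the slider only updates when new data actually arrives. A sketch in current Swift syntax; holding the returned NSKeyValueObservation strongly is what keeps the observation alive:

    import AVFoundation
    import UIKit

    final class BufferObserver {
        private var observation: NSKeyValueObservation?

        // Assumed inputs: the item being played and the slider to update.
        func observe(_ item: AVPlayerItem, updating slider: UISlider) {
            observation = item.observe(\.loadedTimeRanges, options: [.initial, .new]) { item, _ in
                guard let range = item.loadedTimeRanges.first?.timeRangeValue else { return }
                let buffered = CMTimeGetSeconds(CMTimeRangeGetEnd(range))
                let total = CMTimeGetSeconds(item.duration)
                guard total.isFinite, total > 0 else { return }
                // KVO callbacks are not guaranteed to arrive on the main thread.
                DispatchQueue.main.async {
                    slider.value = Float(buffered / total)
                }
            }
        }
    }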

Suresh Kansujiya's code in Objective-C:

    NSTimeInterval bufferAvail;
    if (player.currentItem != nil) {
        AVPlayerItem *item = player.currentItem;
        if (item.status == AVPlayerItemStatusReadyToPlay) {
            NSArray *timeRangeArray = item.loadedTimeRanges;
            CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
            Float64 startTime = CMTimeGetSeconds(aTimeRange.start);
            Float64 loadedDuration = CMTimeGetSeconds(aTimeRange.duration);
            bufferAvail = startTime + loadedDuration;
            NSLog(@"%@ - %f", [self class], bufferAvail);
        } else {
            NSLog(@"%@ - %f", [self class], CMTimeGetSeconds(kCMTimeInvalid));
        }
    } else {
        NSLog(@"%@ - %f", [self class], CMTimeGetSeconds(kCMTimeInvalid));
    }
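Finally, on the display side: UISlider has no built-in buffered track, so a common workaround (sketched below as an assumption, not something from this thread) is to layer the slider over a UIProgressView and make the slider's own maximum track transparent. The progress view then shows the buffered fraction while the slider thumb tracks playback:

    import UIKit

    // Returns a slider for playback position layered over a progress view
    // that shows buffering; names and layout are illustrative only.
    func makeBufferingSlider(in container: UIView) -> (position: UISlider, buffer: UIProgressView) {
        let buffer = UIProgressView(progressViewStyle: .default)
        let position = UISlider()
        position.maximumTrackTintColor = .clear  // let the progress view show through

        for view in [buffer, position] as [UIView] {
            view.translatesAutoresizingMaskIntoConstraints = false
            container.addSubview(view)
            NSLayoutConstraint.activate([
                view.leadingAnchor.constraint(equalTo: container.leadingAnchor),
                view.trailingAnchor.constraint(equalTo: container.trailingAnchor),
                view.centerYAnchor.constraint(equalTo: container.centerYAnchor),
            ])
        }
        return (position, buffer)
    }

Feed buffer.progress the fraction computed by any of the availableDuration variants above, and position.value the current playback time.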