Getting PCM data from an HLS stream with AVPlayer

This question seems to have been asked a few times over the past few years, but nobody has answered it. I am trying to process PCM data from an HLS stream, and I have to use AVPlayer.

This post taps a local file: https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/

And this tap works with a remote file, but not with an .m3u8 HLS file: http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/

I can play the first two tracks in the playlist, but it never fires the callback needed to get the PCM. When the file is local or remote (not a stream) I can still get the PCM, but it's HLS that doesn't work, and I need HLS to work.

Here is my code:

    //avplayer tap try

    - (void)viewDidLoad {
        [super viewDidLoad];

        NSURL *testUrl = [NSURL URLWithString:@"http://playlists.ihrhls.com/c5/1469/playlist.m3u8"];
        AVPlayerItem *item = [AVPlayerItem playerItemWithURL:testUrl];
        self.player = [AVPlayer playerWithPlayerItem:item];

        // Watch the status property - when this is good to go, we can access the
        // underlying AVAssetTrack we need.
        [item addObserver:self forKeyPath:@"status" options:0 context:nil];
    }

    - (void)observeValueForKeyPath:(NSString *)keyPath
                          ofObject:(id)object
                            change:(NSDictionary *)change
                           context:(void *)context
    {
        if (![keyPath isEqualToString:@"status"])
            return;

        AVPlayerItem *item = (AVPlayerItem *)object;
        if (item.status != AVPlayerItemStatusReadyToPlay)
            return;

        NSArray *tracks = [self.player.currentItem tracks];
        for (AVPlayerItemTrack *track in tracks) {
            if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio]) {
                NSLog(@"GOT DAT FUCKER");
                [self beginRecordingAudioFromTrack:track.assetTrack];
                [self.player play];
            }
        }
    }

    - (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
    {
        // Configure an MTAudioProcessingTap to handle things.
        MTAudioProcessingTapRef tap;
        MTAudioProcessingTapCallbacks callbacks;
        callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
        callbacks.clientInfo = (__bridge void *)(self);
        callbacks.init = init;
        callbacks.prepare = prepare;
        callbacks.process = process;
        callbacks.unprepare = unprepare;
        callbacks.finalize = finalize;

        OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault,
                                                  &callbacks,
                                                  kMTAudioProcessingTapCreationFlag_PostEffects,
                                                  &tap);
        if (err) {
            NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
            return;
        }

        // Create an AudioMix and assign it to our currently playing "item", which
        // is just the stream itself.
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        AVMutableAudioMixInputParameters *inputParams =
            [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
        inputParams.audioTapProcessor = tap;
        audioMix.inputParameters = @[inputParams];
        self.player.currentItem.audioMix = audioMix;
        CFRelease(tap); // the input parameters retain the tap; release our reference
    }

    void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                 MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                 CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
    {
        OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                          flagsOut, NULL, numberFramesOut);
        if (err)
            NSLog(@"Error from GetSourceAudio: %d", (int)err);
        NSLog(@"Process");
    }

    void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
    {
        NSLog(@"Initialising the Audio Tap Processor");
        *tapStorageOut = clientInfo;
    }

    void finalize(MTAudioProcessingTapRef tap)
    {
        NSLog(@"Finalizing the Audio Tap Processor");
    }

    void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                 const AudioStreamBasicDescription *processingFormat)
    {
        NSLog(@"Preparing the Audio Tap Processor");
    }

    void unprepare(MTAudioProcessingTapRef tap)
    {
        NSLog(@"Unpreparing the Audio Tap Processor");
    }

`void init` gets called, and `void prepare` and `process` get called as well.

How can I get this to work?

I would suggest using Novocaine.

Really fast audio in iOS and Mac OS X using Audio Units is hard, and will leave you scarred and bloody. What used to take days of work can now be done with just a few lines of code.