Loading an AVAsset with AVPlayer from a file that is concurrently being appended to from an external source (macOS and iOS)

I have a question about AVFoundation's AVPlayer (it probably applies to both iOS and macOS). I am trying to play audio (uncompressed WAV) data that arrives over a channel other than a standard HTTP Live Stream.

The case:
The audio data packets arrive compressed in a channel, together with other data the application needs; for example, video and audio come over the same channel and are separated by headers.
After filtering, I take the audio data and decompress it to WAV format (no header at this stage).
Once the packets are ready (9600 bytes each, for 24 kHz, stereo, 16-bit audio), they are passed to an AVPlayer instance (since, according to Apple, AVAudioPlayer is not suitable for streaming audio).
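As a quick side calculation on those numbers (assuming plain PCM), each 9600-byte packet works out to 100 ms of audio:

    // sampleRate * channels * bytesPerSample = bytes per second of PCM
    NSUInteger bytesPerSecond = 24000 * 2 * 2;                 // 96000 bytes/s
    NSTimeInterval packetDuration = 9600.0 / bytesPerSecond;   // 0.1 s of audio per 9600-byte packet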

Given that AVPlayer (or its item or asset) does not load from memory (there is no initWithData:(NSData *)) and requires either an HTTP Live Streaming URL or a file URL, I create a file on disk (macOS or iOS), write a WAV header to it, and then append the uncompressed data to it.
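For reference, here is a minimal sketch of the 44-byte RIFF/WAVE header that step writes. This is my own illustration, not code from the question: the field values assume 24 kHz / stereo / 16-bit PCM, the size fields are placeholders because the final length is unknown while streaming, and the byte layout relies on a little-endian host (which holds for iOS and macOS):

    // Builds a canonical 44-byte PCM WAV header. RIFF files are little-endian,
    // so appending the raw integer fields works on little-endian hosts.
    static NSData *WAVHeader(uint32_t sampleRate, uint16_t channels,
                             uint16_t bitsPerSample, uint32_t dataSize) {
        NSMutableData *h = [NSMutableData dataWithCapacity:44];
        uint32_t byteRate   = sampleRate * channels * (bitsPerSample / 8);
        uint16_t blockAlign = channels * (bitsPerSample / 8);
        uint32_t riffSize   = 36 + dataSize;   // total file size minus the first 8 bytes
        uint32_t fmtSize    = 16;              // size of the "fmt " chunk for plain PCM
        uint16_t pcmFormat  = 1;               // 1 = uncompressed PCM

        [h appendBytes:"RIFF" length:4]; [h appendBytes:&riffSize length:4];
        [h appendBytes:"WAVE" length:4];
        [h appendBytes:"fmt " length:4]; [h appendBytes:&fmtSize length:4];
        [h appendBytes:&pcmFormat length:2];  [h appendBytes:&channels length:2];
        [h appendBytes:&sampleRate length:4]; [h appendBytes:&byteRate length:4];
        [h appendBytes:&blockAlign length:2]; [h appendBytes:&bitsPerSample length:2];
        [h appendBytes:"data" length:4]; [h appendBytes:&dataSize length:4];
        return h;
    }

    // e.g. WAVHeader(24000, 2, 16, UINT32_MAX - 36) when the final data size is not yet known.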

Back to AVPlayer, I set it up like this:

    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:tempAudioFile] options:nil];
    AVPlayerItem *audioItem = [[AVPlayerItem alloc] initWithAsset:audioAsset];
    AVPlayer *audioPlayer = [[AVPlayer alloc] initWithPlayerItem:audioItem];

I add KVO and then try to start playback:

 [audioPlayer play]; 
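The question does not show which key path is observed; a common pattern (purely an illustration, assuming audioItem and audioPlayer are kept as properties or ivars) is to watch AVPlayerItem.status and only call play once the item is ready:

    // Observe the item's status so play is only called once the asset is ready.
    [audioItem addObserver:self forKeyPath:@"status"
                   options:NSKeyValueObservingOptionNew context:nil];

    - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                            change:(NSDictionary *)change context:(void *)context
    {
        if ([keyPath isEqualToString:@"status"] &&
            [object isKindOfClass:[AVPlayerItem class]] &&
            ((AVPlayerItem *)object).status == AVPlayerItemStatusReadyToPlay) {
            [audioPlayer play];
        }
    }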

The result is that the audio plays for 1-2 seconds and then stops (AVPlayerItemDidPlayToEndTimeNotification fires, to be exact), while data keeps being appended to the file. Since the whole thing runs in a loop, [audioPlayer play] starts and pauses (rate == 0) many times.
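For completeness, the end-of-playback notification mentioned above would typically be observed like this (a sketch, not code from the question):

    // Fired when the player item reaches what AVPlayer believes is the end of the file.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(itemDidPlayToEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:audioItem];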

The whole concept, in simplified form:

    -(void)PlayAudioWithData:(NSData *)data //data in encoded format
    {
        NSData *decodedSound = [AudioDecoder DecodeData:data]; //decodes the data from the compressed format (Opus) to WAV
        [Player CreateTemporaryFiles]; //This creates the temporary file by appending the header and waiting for input.
        [Player SendDataToPlayer:decodedSound]; //this sends the decoded data to the Player to be stored to file. See below for appending.

        Boolean prepared = [Player isPrepared]; //a check if AVPlayer, Item and Asset are initialized
        if (!prepared) [Player Prepare]; //creates the objects like above

        Boolean playing = [Player isAudioPlaying]; //a check done on the AVPlayer if rate == 1
        if (!playing) [Player startPlay]; //this is actually [audioPlayer play]; on the AVPlayer instance
    }

    -(void)SendDataToPlayer:(NSData *)data
    {
        //Two different methods here. First with NSFileHandle - not so sure about this though as it definitely locks the file.
        //Initializations and deallocations happen elsewhere, just condensing code to give you an idea

        NSFileHandle *audioFile = [NSFileHandle fileHandleForWritingAtPath:_tempAudioFile]; //happens elsewhere
        [audioFile seekToEndOfFile];
        [audioFile writeData:data];
        [audioFile closeFile]; //happens elsewhere

        //Second method is
        NSOutputStream *audioFileStream = [NSOutputStream outputStreamWithURL:[NSURL fileURLWithPath:_tempStreamFile] append:YES];
        [audioFileStream open];
        [audioFileStream write:[data bytes] maxLength:data.length];
        [audioFileStream close];
    }

Both NSFileHandle and NSOutputStream produce perfectly working WAV files that play in QuickTime, iTunes, VLC, and so on. Also, if I bypass [Player SendDataToPlayer:decodedSound] and preload the temporary audio file with a standard WAV, it plays fine too.

So two things are established so far: a) the audio data is decompressed and ready to play, and b) the data is being saved correctly.

What I am trying to do is to send, write and read continuously. This makes me think that saving the data to the file takes exclusive access to the file resource and prevents AVPlayer from continuing to play.

Does anybody know how to keep the file available to both NSFileHandle / NSOutputStream and AVPlayer?

Or even better... is there an AVPlayer initWithData? (heh...)

Any help is deeply appreciated! Thanks in advance.

You can use AVAssetResourceLoader to feed your own data and metadata into an AVAsset, which you can then play with an AVPlayer. In effect this lets you create [[AVPlayer alloc] initWithData:...]:

    - (AVPlayer *)playerWithWavData:(NSData *)wavData
    {
        self.strongDelegateReference = [[NSDataAssetResourceLoaderDelegate alloc] initWithData:wavData contentType:AVFileTypeWAVE];

        NSURL *url = [NSURL URLWithString:@"ns-data-scheme://"];
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];

        // or some other queue != main queue
        [asset.resourceLoader setDelegate:self.strongDelegateReference
                                    queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];

        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        return [[AVPlayer alloc] initWithPlayerItem:item];
    }

You can use it like this:

    [self setupAudioSession];

    NSURL *wavUrl = [[NSBundle mainBundle] URLForResource:@"foo" withExtension:@"wav"];
    NSData *wavData = [NSData dataWithContentsOfURL:wavUrl];

    self.player = [self playerWithWavData:wavData];
    [self.player play];

The thing is, AVAssetResourceLoader is very powerful (unless you want to use AirPlay), so you can probably do better than handing the audio data to AVPlayer in one shot - you can stream it into the AVAssetResourceLoader delegate as it becomes available.

Here is a simple "one-shot" AVAssetResourceLoader delegate. To modify it for streaming, it should be enough to set a contentLength that is longer than the amount of data you currently have (see the sketch after the implementation file below).

Header file:

    #import <Foundation/Foundation.h>
    #import <AVFoundation/AVFoundation.h>

    @interface NSDataAssetResourceLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>

    - (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType;

    @end

Implementation file:

    @interface NSDataAssetResourceLoaderDelegate ()
    @property (nonatomic) NSData *data;
    @property (nonatomic) NSString *contentType;
    @end

    @implementation NSDataAssetResourceLoaderDelegate

    - (instancetype)initWithData:(NSData *)data contentType:(NSString *)contentType
    {
        if (self = [super init]) {
            self.data = data;
            self.contentType = contentType;
        }
        return self;
    }

    - (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
    {
        AVAssetResourceLoadingContentInformationRequest *contentRequest = loadingRequest.contentInformationRequest;

        // TODO: check that loadingRequest.request is actually our custom scheme
        if (contentRequest) {
            contentRequest.contentType = self.contentType;
            contentRequest.contentLength = self.data.length;
            contentRequest.byteRangeAccessSupported = YES;
        }

        AVAssetResourceLoadingDataRequest *dataRequest = loadingRequest.dataRequest;
        if (dataRequest) {
            // TODO: handle requestsAllDataToEndOfResource
            NSRange range = NSMakeRange((NSUInteger)dataRequest.requestedOffset, (NSUInteger)dataRequest.requestedLength);
            [dataRequest respondWithData:[self.data subdataWithRange:range]];
            [loadingRequest finishLoading];
        }

        return YES;
    }

    @end
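As a rough sketch of the streaming variant mentioned above (not part of the original answer; the class, the appendData: method and the over-estimated contentLength are my own assumptions), the idea is to keep requests that cannot be satisfied yet and retry them whenever new decoded audio arrives:

    #import <AVFoundation/AVFoundation.h>

    @interface StreamingDataResourceLoaderDelegate : NSObject <AVAssetResourceLoaderDelegate>
    - (void)appendData:(NSData *)data; // called from the decode loop as packets arrive
    @end

    @implementation StreamingDataResourceLoaderDelegate {
        NSMutableData *_buffer;
        NSMutableArray<AVAssetResourceLoadingRequest *> *_pendingRequests;
    }

    - (instancetype)init {
        if (self = [super init]) {
            _buffer = [NSMutableData data];
            _pendingRequests = [NSMutableArray array];
        }
        return self;
    }

    // NOTE: real code must serialize appendData: with the resource loader's delegate
    // queue (e.g. dispatch both onto the same serial queue) - omitted here for brevity.
    - (void)appendData:(NSData *)data {
        [_buffer appendData:data];
        [self servicePendingRequests];
    }

    - (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
    shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
        if (loadingRequest.contentInformationRequest) {
            loadingRequest.contentInformationRequest.contentType = AVFileTypeWAVE;
            // Deliberately larger than what is buffered so playback does not "end" early.
            loadingRequest.contentInformationRequest.contentLength = 100 * 1024 * 1024;
            loadingRequest.contentInformationRequest.byteRangeAccessSupported = YES;
        }
        [_pendingRequests addObject:loadingRequest];
        [self servicePendingRequests];
        return YES; // we will answer (possibly later), so AVFoundation should wait
    }

    - (void)servicePendingRequests {
        for (AVAssetResourceLoadingRequest *request in [_pendingRequests copy]) {
            AVAssetResourceLoadingDataRequest *dataRequest = request.dataRequest;
            NSUInteger offset = (NSUInteger)dataRequest.currentOffset;
            if (offset < _buffer.length) {
                NSUInteger available = _buffer.length - offset;
                NSUInteger remaining = (NSUInteger)(dataRequest.requestedOffset +
                                                    dataRequest.requestedLength -
                                                    dataRequest.currentOffset);
                NSUInteger toSend = MIN(available, remaining);
                [dataRequest respondWithData:[_buffer subdataWithRange:NSMakeRange(offset, toSend)]];
            }
            // Finish only once the request has received everything it asked for.
            if (dataRequest.currentOffset >= dataRequest.requestedOffset + dataRequest.requestedLength) {
                [request finishLoading];
                [_pendingRequests removeObject:request];
            }
        }
    }

    @end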