Scheduling an audio file to play in the future with AVAudioTime

I am trying to figure out how to correctly schedule an audio file to start in the near future. My actual goal is to play multiple tracks in sync.

So how do I correctly configure 'aTime' so that it lies roughly 0.3 seconds from now? I think I may also need hostTime, but I don't know how to use it correctly:

    func createStartTime() -> AVAudioTime? {
        var time: AVAudioTime?

        if let lastPlayer = self.trackPlayerDictionary[lastPlayerKey] {
            if let sampleRate = lastPlayer.file?.processingFormat.sampleRate {
                let sampleTime = AVAudioFramePosition(shortStartDelay * sampleRate)
                time = AVAudioTime(sampleTime: sampleTime, atRate: sampleRate)
            }
        }
        return time
    }

Here is the function I use to start playback:

    func playAtTime(aTime: AVAudioTime?) {
        self.startingFrame = AVAudioFramePosition(self.currentTime * self.file!.processingFormat.sampleRate)
        let frameCount = AVAudioFrameCount(self.file!.length - self.startingFrame!)

        self.player.scheduleSegment(self.file!,
                                    startingFrame: self.startingFrame!,
                                    frameCount: frameCount,
                                    atTime: aTime,
                                    completionHandler: { () -> Void in
                                        NSLog("done playing") // actually done scheduling
                                    })

        self.player.play()
    }

I figured it out!

mach_absolute_time() fills in the hostTime parameter; it is the computer's/iPad's "now". AVAudioTime(hostTime:sampleTime:atRate:) adds sampleTime to hostTime and returns a time in the near future that can be used to schedule multiple audio segments at the same start time:

    func createStartTime() -> AVAudioTime? {
        var time: AVAudioTime?

        if let lastPlayer = self.trackPlayerDictionary[lastPlayerKey] {
            if let sampleRate = lastPlayer.file?.processingFormat.sampleRate {
                let sampleTime = AVAudioFramePosition(shortStartDelay * sampleRate)
                time = AVAudioTime(hostTime: mach_absolute_time(), sampleTime: sampleTime, atRate: sampleRate)
            }
        }
        return time
    }
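As a rough illustration of how such a start time would be used (not part of the original code), here is a minimal Swift sketch: each player schedules its file and then starts against one shared AVAudioTime, e.g. the value returned by createStartTime() above. The helper name and parameters are made up for the example.

    import AVFoundation

    /// Illustrative helper: schedule each file on its player, then start every
    /// player against the same AVAudioTime so they begin together.
    func startInSync(players: [AVAudioPlayerNode],
                     files: [AVAudioFile],
                     at startTime: AVAudioTime) {
        for (player, file) in zip(players, files) {
            player.scheduleFile(file, at: nil, completionHandler: nil)
            player.play(at: startTime)
        }
    }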

Well, this is ObjC, but you will get the idea.

No need for mach_absolute_time() – if your engine is running, you already have the @property lastRenderTime of AVAudioNode, your player's superclass…

    AVAudioFormat *outputFormat = [playerA outputFormatForBus:0];

    const float kStartDelayTime = 0.0; // seconds - in case you wanna delay the start

    AVAudioFramePosition startSampleTime = playerA.lastRenderTime.sampleTime;

    AVAudioTime *startTime = [AVAudioTime timeWithSampleTime:(startSampleTime + (kStartDelayTime * outputFormat.sampleRate)) atRate:outputFormat.sampleRate];

    [playerA playAtTime: startTime];
    [playerB playAtTime: startTime];
    [playerC playAtTime: startTime];
    [playerD playAtTime: startTime];
    [player...
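For reference, a rough Swift equivalent of the same lastRenderTime idea might look like the sketch below; it assumes all players are AVAudioPlayerNodes attached to a running engine, and the helper name is made up:

    import AVFoundation

    /// Illustrative: derive one shared start time from a node's lastRenderTime
    /// and start every player against it. Assumes the engine is already running.
    func startInSyncUsingRenderTime(players: [AVAudioPlayerNode],
                                    delaySeconds: Double = 0.0) {
        guard let first = players.first,
              let renderTime = first.lastRenderTime else { return }

        let outputFormat = first.outputFormat(forBus: 0)
        let startSampleTime = renderTime.sampleTime
            + AVAudioFramePosition(delaySeconds * outputFormat.sampleRate)
        let startTime = AVAudioTime(sampleTime: startSampleTime,
                                    atRate: outputFormat.sampleRate)

        for player in players {
            player.play(at: startTime)
        }
    }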

By the way, you can achieve the same 100% sample-frame-accurate result with the AVAudioPlayer class…

    NSTimeInterval startDelayTime = 0.0; // seconds - in case you wanna delay the start

    NSTimeInterval now = playerA.deviceCurrentTime;

    NSTimeInterval startTime = now + startDelayTime;

    [playerA playAtTime: startTime];
    [playerB playAtTime: startTime];
    [playerC playAtTime: startTime];
    [playerD playAtTime: startTime];
    [player...

Without a startDelayTime, the first 100-200 ms of all players gets clipped off, because the start command actually takes its time to reach the run loop, even though the players have already started 100% in sync at that point. With a startDelayTime of 0.25, though, you are good to go. And never forget to prepareToPlay your players in advance, so that no additional buffering or setup is needed at start time – just start them ;-)
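To spell that out in Swift, a minimal sketch of the AVAudioPlayer variant could look like this (the helper name and the 0.25 s default are illustrative, matching the caveat above):

    import AVFoundation

    /// Illustrative: prepare the players up front, then start them all at the
    /// same device time, leaving headroom for the calls to reach the run loop.
    func startInSync(players: [AVAudioPlayer], startDelay: TimeInterval = 0.25) {
        players.forEach { _ = $0.prepareToPlay() }

        guard let first = players.first else { return }
        let startTime = first.deviceCurrentTime + startDelay

        for player in players {
            _ = player.play(atTime: startTime)
        }
    }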


For a more in-depth explanation, have a look at my answer to:

AVAudioEngine multiple AVAudioInputNodes do not play in perfect sync