Swift: sound output and microphone input | using AudioKit


I am using Xcode Version 9.2
I am using AudioKit version 4.0.4


I have written some code, which you can find below, that does two things:

  • plays a specific sound (frequency: 500.0 Hz)
  • "listens" to the microphone input and calculates its frequency in real time

If I call either playSound() or receiveSound() on its own, everything looks fine and works exactly as I expect. But calling receiveSound() after playSound()? That is where I run into big problems.

This is how I want the code to work:

```swift
SystemClass.playSound() // play sound
DispatchQueue.main.asyncAfter(deadline: (DispatchTime.now() + 3.0)) {
    SystemClass.receiveSound() // get microphone input 3 seconds later
}
```

```swift
let SystemClass: System = System()

class System {
    public init() { }

    func playSound() {
        let sound = AKOscillator()
        AudioKit.output = sound
        AudioKit.start()

        sound.frequency = 500.0
        sound.amplitude = 0.5
        sound.start()

        DispatchQueue.main.asyncAfter(deadline: (DispatchTime.now() + 2.0)) {
            sound.stop()
        }
    }

    var tracker: AKFrequencyTracker!

    func receiveSound() {
        AudioKit.stop()
        AKSettings.audioInputEnabled = true

        let mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        let silence = AKBooster(tracker, gain: 0)

        AudioKit.output = silence
        AudioKit.start()

        Timer.scheduledTimer(
            timeInterval: 0.1,
            target: self,
            selector: #selector(SystemClass.outputFrequency),
            userInfo: nil,
            repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}
```

These are some of the runtime error messages that are thrown every time I run the code (calling playSound() and then receiveSound() 3 seconds later):

```
AVAEInternal.h:103:_AVAE_CheckNoErr: [AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

AVAudioEngine.mm:149:-[AVAudioEngine prepare]: Engine@0x1c401bff0: could not initialize, error = -10875

[MediaRemote] [AVOutputContext] WARNING: AVF context unavailable for sharedSystemAudioContext

[AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

Fatal error: AudioKit: Could not start engine. error: Error Domain=com.apple.coreaudio.avfaudio Code=-10875 "(null)" UserInfo={failed call=err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)}.: file /Users/megastep/src/ak/AudioKit/AudioKit/Common/Internals/AudioKit.swift, line 243
```

I believe the lion's share of your problems is due to declaring the AKNodes locally inside the functions that use them:

```swift
let sound = AKOscillator()
let mic = AKMicrophone()
let silence = AKBooster(tracker, gain: 0)
```

Declare them as instance variables instead, as described here.
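A minimal sketch of that restructuring, assuming the AudioKit 4.x API as used in the question (untested; the node names and the rest of the class are taken from the original code, only the declarations move to the instance level so the nodes are not deallocated when each function returns):

```swift
import AudioKit

class System {
    // Nodes now live as long as the System instance, so the
    // audio graph stays valid across playSound()/receiveSound().
    let sound = AKOscillator()
    let mic = AKMicrophone()
    var tracker: AKFrequencyTracker!
    var silence: AKBooster!

    init() {
        AKSettings.audioInputEnabled = true
        tracker = AKFrequencyTracker(mic)
        silence = AKBooster(tracker, gain: 0)
    }

    func playSound() {
        AudioKit.output = sound
        AudioKit.start()
        sound.frequency = 500.0
        sound.amplitude = 0.5
        sound.start()
        DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
            self.sound.stop()
        }
    }

    func receiveSound() {
        AudioKit.stop()
        AudioKit.output = silence
        AudioKit.start()
        Timer.scheduledTimer(
            timeInterval: 0.1,
            target: self,
            selector: #selector(outputFrequency),
            userInfo: nil,
            repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}
```

With the nodes held by the instance, stopping and restarting the engine no longer tries to rebuild an output chain around objects that have already gone out of scope, which is consistent with the -10875 initialization failure in the log above.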