How do I use AVAudioSessionCategoryMultiRoute on an iPhone?

I would like to use AVAudioSessionCategoryMultiRoute, but unfortunately there are no examples on the Apple Developer Center or on Google. How do I use/implement AVAudioSessionCategoryMultiRoute to define two different routes on an iPhone running iOS 7.0.4? My goal is to route audio to the speaker and the headphones at the same time. (I know this is supposed to be impossible, but I want to try it with iOS 7.)

Thanks for your help,

These helped me: AVAudioEngine and Multiroute, and: Audio Session and Multiroute Audio in iOS.

In my case, I implemented it with two methods.

First, request the MultiRoute category:

[_session setCategory:AVAudioSessionCategoryMultiRoute error:nil]; 
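Note that the category only takes effect once the session is activated, and `setCategory:error:` can fail. A minimal sketch with error checking (the local `session` variable and logging are my own additions, not part of the original code):

```objc
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];

// Request the MultiRoute category, then activate the session
if (![session setCategory:AVAudioSessionCategoryMultiRoute error:&error]) {
    NSLog(@"setCategory failed: %@", error);
}
if (![session setActive:YES error:&error]) {
    NSLog(@"setActive failed: %@", error);
}
```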

Method 1: set the AVAudioPlayer's channelAssignments:

    // My hardware has 4 output channels
    if (_outputPortChannels.count == 4) {
        AVAudioSessionChannelDescription *desiredChannel1 = [_outputPortChannels objectAtIndex:2];
        AVAudioSessionChannelDescription *desiredChannel2 = [_outputPortChannels objectAtIndex:3];

        // Create an array of the desired channels
        NSArray *channelDescriptions = [NSArray arrayWithObjects:desiredChannel1, desiredChannel2, nil];

        // Assign the channels
        _avAudioPlayer1.channelAssignments = channelDescriptions;
        NSLog(@"_player.channelAssignments: %@", _avAudioPlayer1.channelAssignments);

        // Play audio on output channels 3 and 4
        [_avAudioPlayer1 play];
    }

Method 2: a custom channel map:

    // Get channel map indices based on user-specified channel names
    NSMutableArray *channelMapIndices = [self getOutputChannelMapIndices:_inChannelNames];
    NSAssert(channelMapIndices && channelMapIndices.count > 0, @"Error getting indices for user specified channel names!");

    // AVAudioEngine setup
    _engine = [[AVAudioEngine alloc] init];
    _output = _engine.outputNode;
    _mixer = _engine.mainMixerNode;
    _player = [[AVAudioPlayerNode alloc] init];
    [_engine attachNode:_player];

    // Open the file to play
    NSString *path1 = [[NSBundle mainBundle] pathForResource:@"yujian" ofType:@"mp3"];
    NSURL *songURL1 = [NSURL fileURLWithPath:path1];
    _songFile = [[AVAudioFile alloc] initForReading:songURL1 error:nil];

    // Create the output channel map
    SInt32 source1NumChannels = (SInt32)_songFile.processingFormat.channelCount;

    // I use a constant map: play audio on output channels 3 and 4
    SInt32 outputChannelMap[4] = {-1, -1, 0, 1};
    // This would play audio on output channels 1 and 2 instead:
    // SInt32 outputChannelMap[4] = {0, 1, -1, -1};

    // Set the channel map on the outputNode audio unit
    UInt32 propSize = (UInt32)sizeof(outputChannelMap);
    OSStatus err = AudioUnitSetProperty(_output.audioUnit,
                                        kAudioOutputUnitProperty_ChannelMap,
                                        kAudioUnitScope_Global,
                                        1,
                                        outputChannelMap,
                                        propSize);
    NSAssert(noErr == err, @"Error setting channel map! %d", (int)err);

    // Make the connections
    AVAudioChannelLayout *channel1Layout = [[AVAudioChannelLayout alloc] initWithLayoutTag:kAudioChannelLayoutTag_DiscreteInOrder | (UInt32)source1NumChannels];
    AVAudioFormat *format1 = [[AVAudioFormat alloc] initWithStreamDescription:_songFile.processingFormat.streamDescription channelLayout:channel1Layout];
    [_engine connect:_player to:_mixer format:format1];
    [_engine connect:_mixer to:_output format:format1];

    // Schedule the file on the player
    [_player scheduleFile:_songFile atTime:nil completionHandler:nil];

    // Start the engine and the player
    if (!_engine.isRunning) {
        [_engine startAndReturnError:nil];
    }
    [_player play];
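For reference, `getOutputChannelMapIndices:` above is a custom helper, not an AVFoundation API. A minimal sketch of what it could look like (my own assumption, matching names against the active route's output channels):

```objc
// Hypothetical helper: map user-specified channel names to 0-based
// hardware output-channel indices by scanning the current route's outputs.
- (NSMutableArray *)getOutputChannelMapIndices:(NSArray *)channelNames
{
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSMutableArray *indices = [NSMutableArray array];

    for (NSString *name in channelNames) {
        for (AVAudioSessionPortDescription *port in session.currentRoute.outputs) {
            for (AVAudioSessionChannelDescription *channel in port.channels) {
                if ([channel.channelName isEqualToString:name]) {
                    // channelNumber is 1-based; the channel map is 0-based
                    [indices addObject:@(channel.channelNumber - 1)];
                }
            }
        }
    }
    return indices;
}
```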

It works for me.