List the available output audio destinations with AVAudioSession

I need to list the audio outputs available to an iOS application. My question is similar to this one: How to list available audio output routes on iOS

I have tried this code:

    NSError *setCategoryError = nil;
    BOOL success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                                          error:&setCategoryError];

    NSError *activationError = nil;
    [[AVAudioSession sharedInstance] setActive:YES error:&activationError];

    // …

    NSLog(@"session.currentRoute.outputs count %lu",
          (unsigned long)[[[[AVAudioSession sharedInstance] currentRoute] outputs] count]);

    for (AVAudioSessionPortDescription *portDesc in [[[AVAudioSession sharedInstance] currentRoute] outputs]) {
        NSLog(@"-----");
        NSLog(@"portDesc UID %@", portDesc.UID);
        NSLog(@"portDesc portName %@", portDesc.portName);
        NSLog(@"portDesc portType %@", portDesc.portType);
        NSLog(@"portDesc channels %@", portDesc.channels);
    }

But I always see only one output port (the count is 1), even when two are available (AirPlay and the built-in speaker). If I use the Music app I can see both ports and switch between them; in my app I only ever see the one that is currently selected.
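For what it's worth, currentRoute appears to describe only the route that is active at that moment, so even observing route changes (a minimal sketch of my own, assuming the session is configured as above) would just report whichever single output is selected at that time:

    [[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionRouteChangeNotification
                                                      object:[AVAudioSession sharedInstance]
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        // currentRoute still only lists the output(s) in use after the change
        for (AVAudioSessionPortDescription *port in [[AVAudioSession sharedInstance] currentRoute].outputs) {
            NSLog(@"route changed, now playing through %@ (%@)", port.portName, port.portType);
        }
    }];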

Is there anything else I need to do?

Thanks

EDIT:

I also tried this code:

    CFDictionaryRef asCFType = nil;
    UInt32 dataSize = sizeof(asCFType);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &dataSize, &asCFType);
    NSDictionary *audioRoutesDesc = (__bridge NSDictionary *)asCFType;

    NSLog(@"audioRoutesDesc %@", audioRoutesDesc);

But the dictionary only lists one output destination, and the array of input sources is empty (I have an iPhone 4S).

EDIT2:

I got something working using MPVolumeView. This component has a button that lets you choose the output audio route, just like the Music app does.

If you want, you can hide the slider (and keep only the button) with:

 self.myMPVolumeView.showsVolumeSlider = NO; 
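A minimal sketch of that MPVolumeView setup (the frame and the parent view here are my own placeholder choices, not something the API requires):

    #import <MediaPlayer/MediaPlayer.h>

    // Show only the route (AirPlay/Bluetooth) button, like the Music app's picker.
    MPVolumeView *volumeView = [[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
    volumeView.showsVolumeSlider = NO;   // hide the volume slider
    volumeView.showsRouteButton = YES;   // keep the output-route button
    [self.view addSubview:volumeView];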

Try something like this; it does more than you need, but you can trim it down:

    + (NSString *)demonstrateInputSelection
    {
        NSError *theError = nil;
        BOOL result = YES;
        NSMutableString *info = [[NSMutableString alloc] init];
        [info appendString:@" Device Audio Input Hardware\n"];
        NSString *str = nil;

        if( iOSMajorVersion < 7 ){
            str = @"No input device information available";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
            return info;
        }

        AVAudioSession *myAudioSession = [AVAudioSession sharedInstance];
        result = [myAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&theError];
        if (!result) {
            NSLog(@"setCategory failed");
        }

        result = [myAudioSession setActive:YES error:&theError];
        if (!result) {
            NSLog(@"setActive failed");
        }

        // Get the set of available inputs. If there are no audio accessories attached, there will be
        // only one available input -- the built in microphone.
        NSArray *inputs = [myAudioSession availableInputs];
        str = [NSString stringWithFormat:@"\n--- Ports available on %@: %d ---", [UIDevice currentDevice].name, [inputs count]];
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];

        // Locate the Port corresponding to the built-in microphone.
        AVAudioSessionPortDescription *builtInMicPort = nil;
        AVAudioSessionDataSourceDescription *frontDataSource = nil;

        for (AVAudioSessionPortDescription *port in inputs)
        {
            // Print out a description of the data sources for the built-in microphone
            str = @"\n**********";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            str = [NSString stringWithFormat:@"Port :\"%@\": UID:%@", port.portName, port.UID];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            if( [port.dataSources count] ){
                str = [NSString stringWithFormat:@"Port has %d data sources",(unsigned)[port.dataSources count]];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }

            str = [NSString stringWithFormat:@">%@", port.dataSources];
            NSLog(@"%@",str);
            // [info appendFormat:@"%@\n",str];

            if( [port.portType isEqualToString:AVAudioSessionPortLineIn] ){
                str = @"Line Input found";
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }
            else if( [port.portType isEqualToString:AVAudioSessionPortUSBAudio] ){
                str = @"USB Audio found";
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }
            else if ([port.portType isEqualToString:AVAudioSessionPortBuiltInMic]){
                builtInMicPort = port;
                str = @"Built-in Mic found";
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }
            else if ([port.portType isEqualToString:AVAudioSessionPortHeadsetMic]){
                builtInMicPort = port;
                str = @"Headset Mic found";
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }
            else{
                str = @"Other input source found";
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }

            // loop over the built-in mic's data sources and attempt to locate the front microphone
            for (AVAudioSessionDataSourceDescription *source in port.dataSources)
            {
                str = [NSString stringWithFormat:@"\nName:%@ (%d) \nPolar:%@ \nType:%@ \nPatterns:%@", source.dataSourceName, [source.dataSourceID intValue], source.selectedPolarPattern, port.portType, source.supportedPolarPatterns];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];

                // if ([source.orientation isEqual:AVAudioSessionOrientationFront])
                // {
                //     frontDataSource = source;
                //     break;
                // }
            } // end data source iteration
        }

        str = @"\n---- Current Selected Ports ----\n";
        NSLog(@"%@",str);
        [info appendFormat:@"%@",str];

        NSArray *currentInputs = myAudioSession.currentRoute.inputs;
        // str = [NSString stringWithFormat:@"\n%d current input ports", [currentInputs count]];
        // NSLog(@"%@",str);
        // [info appendFormat:@"%@\n",str];
        for( AVAudioSessionPortDescription *port in currentInputs ){
            str = [NSString stringWithFormat:@"\nInput Port :\"%@\":", port.portName];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            if( [port.dataSources count] ){
                str = [NSString stringWithFormat:@"Port has %d data sources",(unsigned)[port.dataSources count]];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];

                str = [NSString stringWithFormat:@"Selected data source:%@", port.selectedDataSource.dataSourceName];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];

                if( [port.selectedDataSource.supportedPolarPatterns count] > 0 ){
                    str = [NSString stringWithFormat:@"Selected polar pattern:%@", port.selectedDataSource.selectedPolarPattern];
                    NSLog(@"%@",str);
                    [info appendFormat:@"%@\n",str];
                }
            }
        }

        NSArray *currentOutputs = myAudioSession.currentRoute.outputs;
        // str = [NSString stringWithFormat:@"\n%d current output ports", [currentOutputs count]];
        // NSLog(@"%@",str);
        // [info appendFormat:@"%@\n",str];
        for( AVAudioSessionPortDescription *port in currentOutputs ){
            str = [NSString stringWithFormat:@"\nOutput Port :\"%@\":", port.portName];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            if( [port.dataSources count] ){
                str = [NSString stringWithFormat:@"Port has %d data sources",(unsigned)[port.dataSources count]];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];

                str = [NSString stringWithFormat:@"Selected data source:%@", port.selectedDataSource.dataSourceName];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }
        }

        // str = [NSString stringWithFormat:@"\Current Route: %@ Source:%@\n", myAudioSession.currentRoute.portName, myAudioSession.preferredInput.selectedDataSource.dataSourceName];
        // NSLog(@"%@",str);
        // [info appendFormat:@"%@\n",str];

        if( myAudioSession.preferredInput.portName ){
            str = [NSString stringWithFormat:@"\nPreferred Port: %@ Source:%@\n", myAudioSession.preferredInput.portName, myAudioSession.preferredInput.selectedDataSource.dataSourceName];
        } else {
            str = @"\nNo Preferred Port set";
        }
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];

        return info;

        if (frontDataSource)
        {
            NSLog(@"Currently selected source is \"%@\" for port \"%@\"", builtInMicPort.selectedDataSource.dataSourceName, builtInMicPort.portName);
            NSLog(@"Attempting to select source \"%@\" on port \"%@\"", frontDataSource, builtInMicPort.portName);

            // Set a preference for the front data source.
            theError = nil;
            result = [builtInMicPort setPreferredDataSource:frontDataSource error:&theError];
            if (!result) {
                // an error occurred. Handle it!
                NSLog(@"setPreferredDataSource failed");
            }
        }

        // Make sure the built-in mic is selected for input. This will be a no-op if the built-in mic is
        // already the current input Port.
        theError = nil;
        result = [myAudioSession setPreferredInput:builtInMicPort error:&theError];
        if (!result) {
            // an error occurred. Handle it!
            NSLog(@"setPreferredInput failed");
        }
        return info;
    }
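A hypothetical call site, assuming the method is added to a class I'm calling AudioSessionInfo here (the class name is just a placeholder):

    // e.g. from viewDidLoad, dump the collected port information to the console
    NSLog(@"%@", [AudioSessionInfo demonstrateInputSelection]);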
    AVAudioSessionRouteDescription *currentRoute = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription *output in currentRoute.outputs) {
    }
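For example, the loop body could compare portType against the port constants you care about; this is just an illustrative sketch, not part of the original answer:

    AVAudioSessionRouteDescription *currentRoute = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription *output in currentRoute.outputs) {
        if ([output.portType isEqualToString:AVAudioSessionPortAirPlay]) {
            NSLog(@"AirPlay output: %@", output.portName);
        } else if ([output.portType isEqualToString:AVAudioSessionPortBuiltInSpeaker]) {
            NSLog(@"Built-in speaker: %@", output.portName);
        } else if ([output.portType isEqualToString:AVAudioSessionPortBluetoothA2DP]) {
            NSLog(@"Bluetooth A2DP output: %@", output.portName);
        } else {
            NSLog(@"Other output: %@ (%@)", output.portName, output.portType);
        }
    }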

It depends on your AVAudioSession category.

On an iPhone you can safely assume that you always have at least the microphone as an input and the speaker as an output. If you want to get a list of Bluetooth/AirPlay outputs, you first have to make sure your session category reports them to you:

    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: .AllowBluetooth)
        try audioSession.setActive(true)
    } catch let e {
        debugPrint("failed to initialize audio session: \(e)")
    }

Then the non-intuitive way to get the available outputs is to look at AVAudioSession.availableInputs, because a Bluetooth HFP device usually also has a microphone... I may be assuming a lot here... but it is the only way to consistently get the available outputs.
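An Objective-C sketch of that idea, assuming the category and options were set as in the snippet above:

    AVAudioSession *session = [AVAudioSession sharedInstance];
    for (AVAudioSessionPortDescription *input in session.availableInputs) {
        // A Bluetooth HFP input is a strong hint that the same device is also usable as an output.
        if ([input.portType isEqualToString:AVAudioSessionPortBluetoothHFP]) {
            NSLog(@"Bluetooth device that should also appear as an output: %@", input.portName);
        }
    }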

A better approach is to use the MultipleRoute category, which will give you more freedom in accessing AVAudioSessionPort.
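A sketch of that alternative (the constant is AVAudioSessionCategoryMultiRoute; error handling trimmed for brevity):

    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryMultiRoute error:&error];
    [session setActive:YES error:&error];

    // With MultiRoute the current route can contain more than one output port at a time.
    for (AVAudioSessionPortDescription *output in session.currentRoute.outputs) {
        NSLog(@"active output: %@ (%@)", output.portName, output.portType);
    }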