iOS Audio Session, ofxPD - AirPods, Audio Routing, Bluetooth etc


I have a project that uses ofxPd for play and record, so adc~ and dac~.
This works as expected, but when plugging in Lightning headphones it throws a complete wobbly, and after unplugging them the adc~ no longer works.
When attempting to use AirPods the issues are very similar.
Also, AirPlay doesn’t work either.
(This is not using the newly refreshed ofxPd, it’s the previous version.)

I expect this is something to do with the AudioSession or AudioStream. As I understand it, when using Bluetooth audio over A2DP (the high-quality stereo profile) you can’t use the mic on the AirPods while also playing back to them; the mic requires the lower-quality HFP headset profile.

I was wondering if anyone had experienced similar and had come up with an approach to dealing with this?
@danomatika @cerupcat does this align with your experience too?


Ok, so I’ve been able to deal with the lightning headphone issue. I realised that by trying to fix the AirPods issue I had broken the lightning headphones.

I just had to make sure that the audio session was set up like so:

[session setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionAllowBluetooth|AVAudioSessionCategoryOptionMixWithOthers|AVAudioSessionCategoryOptionDefaultToSpeaker error:&audioSessionError];

(as it was setup in the ofxPD iOS example)

Also, to be clear, the ofxPD iOS example app doesn’t allow for AirPlay

This is a problem with the OF sound stream, not ofxPd. ofxPd only sends and receives samples.

The fix is likely more complicated and I would suggest opening an issue on the OF Github repo. You can include a link to the libpd repo as the Obj-C audio handling was updated to improve bluetooth etc.


Ok thanks Dan, I’ll have a rummage and report back

Thanks @danomatika. I had a look through the issues brought up on the libpd repo; as far as I got was being able to have the app acknowledge that the audio routing had changed, but that led to constant re-initialisation of pd because the buffer size / sample rate keeps changing, which leaves no audio output apart from glitchy clicks.
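For reference, detecting the routing change itself can be done with AVAudioSessionRouteChangeNotification. A hedged sketch (the reconfiguration step is left as a comment, since that’s the part that was causing the re-init churn); the idea is to react once per actual device change rather than on every callback:

```objc
// Observe audio route changes (e.g. AirPods connecting/disconnecting).
[[NSNotificationCenter defaultCenter]
    addObserverForName:AVAudioSessionRouteChangeNotification
                object:nil
                 queue:[NSOperationQueue mainQueue]
            usingBlock:^(NSNotification *note) {
    NSUInteger reason =
        [note.userInfo[AVAudioSessionRouteChangeReasonKey] unsignedIntegerValue];
    if (reason == AVAudioSessionRouteChangeReasonNewDeviceAvailable ||
        reason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        // Query the new hardware sample rate and reconfigure the
        // stream once here, rather than on every audio callback.
        double sr = [[AVAudioSession sharedInstance] sampleRate];
        NSLog(@"Route changed, hardware sample rate now %f", sr);
    }
}];
```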

I’ve added an issue on the of GitHub - Audio Handling of Bluetooth Routing on iOS · Issue #7163 · openframeworks/openFrameworks · GitHub

I think I have described the issue appropriately

Have you tried the following? ofxPd/ at master · danomatika/ofxPd · GitHub

Yeah, that’s the project I’m working from, and I’ve been using that solve in my app too.

When using those settings the sample rate changes to accommodate the AirPods (16 kHz), but the audio is still routed through the iPad’s speaker.

If I add the line

[session setMode:AVAudioSessionModeVideoChat error:&audioSessionError];

to that function I end up with audio being routed to the AirPods; however, it makes pd reinitialise every frame and the output is just clicks and glitches.

Also, as an aside: having the DefaultToSpeaker option stops Lightning headphones from working, so for iPad I just remove that option, because it’s only really useful on iPhone to make sure audio doesn’t come through the receiver (earpiece).

You should read up on the AVAudioSession modes and choose what works best. VideoChat is basically the mono in / mono out mode for old-school Bluetooth headsets. Getting full stereo output and mono input requires also setting the appropriate AVAudioSessionCategoryOptions to go with it.
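For example (a sketch under the assumption of iOS 10+, not the exact settings from this thread), full stereo Bluetooth output can be requested with the AllowBluetoothA2DP option, which is distinct from the plain AllowBluetooth option:

```objc
NSError *err = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];

// AllowBluetoothA2DP keeps full stereo output over Bluetooth;
// the plain AllowBluetooth option instead negotiates the mono
// HFP headset profile (which is what enables the headset mic).
[session setCategory:AVAudioSessionCategoryPlayAndRecord
                mode:AVAudioSessionModeDefault
             options:AVAudioSessionCategoryOptionAllowBluetoothA2DP
               error:&err];
```

Note the trade-off: with A2DP you get stereo playback but input comes from the device’s built-in mic, not the AirPods.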

In testing and research, I added notes about the options which make sense for libpd to the PdAudioController header: libpd/PdAudioController.h at master · libpd/libpd · GitHub

This is likely due to the buffer size not being constant because of mismatched sample rates, i.e. 16k input but 44.1k used internally. The way to deal with this is to buffer the samples and process them in blocks for libpd… or use a different sample rate. At a minimum, I would try 48k, as this would probably eliminate the mismatch since 16 divides cleanly into 48. This won’t solve the overall problem, but it’s less likely to show up since most Apple devices now use 48k.
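The block-buffering idea above can be sketched like this (in C++, class and callback names hypothetical): incoming audio of whatever size the hardware delivers is queued in a FIFO, and the processor only ever sees fixed-size blocks, which is what libpd-style DSP (64-sample ticks) expects:

```cpp
#include <cstddef>
#include <functional>
#include <utility>
#include <vector>

// Accumulates incoming audio in a FIFO and forwards it to the
// processing callback only in fixed-size blocks, so the processor
// always sees a constant buffer size even when the input callback
// size varies (e.g. after an audio route change).
class BlockBuffer {
public:
    BlockBuffer(std::size_t blockSize,
                std::function<void(const float*, std::size_t)> process)
        : blockSize(blockSize), process(std::move(process)) {}

    // Push n samples; fires the callback once per complete block.
    void push(const float* samples, std::size_t n) {
        fifo.insert(fifo.end(), samples, samples + n);
        std::size_t offset = 0;
        while (fifo.size() - offset >= blockSize) {
            process(fifo.data() + offset, blockSize);
            offset += blockSize;
        }
        // Keep the leftover samples for the next push.
        fifo.erase(fifo.begin(), fifo.begin() + offset);
    }

private:
    std::size_t blockSize;
    std::function<void(const float*, std::size_t)> process;
    std::vector<float> fifo;
};
```

Leftover samples simply wait in the FIFO for the next hardware callback, so nothing is dropped when the incoming buffer size isn’t a multiple of the block size.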

That’s great, thanks! I’ve managed to get the pd-for-ios on the libpd repo working alongside OF, the only thing left is to get the

- (void)receivePrint:(NSString *)message {}

etc. from the AppDelegate into ofApp.

This is all working from within ofApp::setup

    appDelegate = [[AppDelegate alloc] init];
    audioController = [[PdAudioController alloc] init];
    audioController.allowBluetoothA2DP = YES;
    audioController.mode = AVAudioSessionModeVideoRecording;
    [audioController configurePlaybackWithSampleRate:44100 numberChannels:2 inputEnabled:YES mixingEnabled:NO];
    [audioController setActive:YES];
    [audioController print];
    [PdBase setDelegate:appDelegate];
    [PdBase subscribe:@"symTest"];
    [PdBase subscribe:@"floatTest"];
    [PdBase subscribe:@"bangTest"];
    [PdBase openFile:@"test.pd" path:[[NSBundle mainBundle] resourcePath]];

I just can’t find a way to get the receive functions (receivePrint, receiveFloat, etc) to then call functions within ofApp

I only need the below functions in AppDelegate to call functions in ofApp and I think I’m good to go!

// handle [print] messages from pd
- (void)receivePrint:(NSString *)message {
    NSLog(@"Pd Console: %@", message);
}

- (void)receiveBangFromSource:(NSString *)source {
    NSLog(@"Listener %@: bang\n", source);
}

- (void)receiveFloat:(float)val fromSource:(NSString *)source {
    NSLog(@"Listener %@: float %f\n", source, val);
    //NSString *s = [NSString stringWithFormat:@"%f", val];
}

- (void)receiveSymbol:(NSString *)s fromSource:(NSString *)source {
    NSLog(@"Listener %@: symbol %@\n", source, s);
}
Create a dummy receiver class which forwards to your main app instance, either via a property pointer or ofGetAppPtr():

#include "ofApp.h"

- (void)receivePrint:(NSString *)message {
    ofApp *app = (ofApp *)ofGetAppPtr();
    app->print(std::string([message UTF8String]));
}
Oh @danomatika you magnificent human, thank you