I am still trying to figure out how to extend ofxAudioUnit to work with my 24-channel audio interface.
Here is what Apple support provided me with; I thought I would share it.
We don’t really have specific docs or samples that discuss “multi-channel audio” since Core Audio handles any number of channels without any specific setup or magic being required. In other words, using the AUHAL for example, outputting multiple streams “just works” if you have the appropriate hardware installed, have actually set up the hardware (via the Audio MIDI Setup app), and perform the standard AU configuration, which generally amounts to:
A) Set the client stream format for input to the Output Unit (the ASBD; use the CAStreamBasicDescription class from the Core Audio Utility Classes to ensure you’re doing this correctly), providing the number of channels, etc., that you are going to provide to AUHAL. The stream format specifies the number of channels per frame of audio data you will be providing in the AudioBufferList for render. In other words, 4 channels of audio would mean that the AudioBuffer array would be 4 elements long and mNumberBuffers would be 4 (assuming non-interleaved PCM); that could be two stereo pairs or 4 discrete channels, etc. (A rough sketch follows after this list.)
B) Provide the appropriate AudioChannelLayout that specifies the ordering of the channels you are providing. The number of channels described in the channel layout and the stream description *must* match. Yours will probably just be set to discrete.
C) Set up the appropriate channel mapping if required.
Additional comment regarding point B: AUHAL will take whatever the user has specified in Audio MIDI Setup as their speaker placements and match the channels you are providing to those speaker locations if you use, say, a 5.1 or 7.1 channel layout, and so on. The user can specify which channels are their stereo pair as well.
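To make A–C concrete for a 24-channel interface, here is a rough sketch of how I understand the configuration would look. configureClientFormat is just a hypothetical helper name, outputUnit is assumed to be an AUHAL (kAudioUnitSubType_HALOutput) instance already bound to the device, I'm assuming the channel layout goes on the same scope/element as the client format, and I haven't verified the scope/element details for the channel map (point C), so that part is only noted in a comment:

    #include <AudioUnit/AudioUnit.h>

    // Hypothetical helper: configure the client-side format of an AUHAL output
    // unit for numChannels discrete channels of non-interleaved float PCM.
    static OSStatus configureClientFormat(AudioUnit outputUnit,
                                          UInt32 numChannels,
                                          Float64 sampleRate)
    {
        // (A) Client stream format on the input scope of element 0: this is the
        // format of the AudioBufferList you hand to the unit in the render
        // callback. Non-interleaved, so one AudioBuffer per channel
        // (mNumberBuffers == numChannels).
        AudioStreamBasicDescription asbd = {};
        asbd.mSampleRate       = sampleRate;
        asbd.mFormatID         = kAudioFormatLinearPCM;
        asbd.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked |
                                 kAudioFormatFlagIsNonInterleaved;
        asbd.mChannelsPerFrame = numChannels;
        asbd.mFramesPerPacket  = 1;
        asbd.mBitsPerChannel   = 32;
        asbd.mBytesPerFrame    = sizeof(Float32);  // per channel, non-interleaved
        asbd.mBytesPerPacket   = sizeof(Float32);

        OSStatus err = AudioUnitSetProperty(outputUnit,
                                            kAudioUnitProperty_StreamFormat,
                                            kAudioUnitScope_Input, 0,
                                            &asbd, sizeof(asbd));
        if (err != noErr) return err;

        // (B) Matching channel layout: discrete channels, in order, with the
        // same channel count as the stream format above.
        AudioChannelLayout layout = {};
        layout.mChannelLayoutTag = kAudioChannelLayoutTag_DiscreteInOrder | numChannels;

        err = AudioUnitSetProperty(outputUnit,
                                   kAudioUnitProperty_AudioChannelLayout,
                                   kAudioUnitScope_Input, 0,
                                   &layout, sizeof(layout));

        // (C) If client channels need to be routed to specific device channels,
        // kAudioOutputUnitProperty_ChannelMap is the property to look at
        // (see TN2091 for the details).
        return err;
    }

For my case I would call this with 24 channels; for the 10-channel example below it would be 10.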
For sample code, CAPlayThrough is a good sample to mess around with, along with our audio tools like HALLab and AU Lab.
https://developer.apple.com/library/mac/#samplecode/CAPlayThrough/Introduction/Intro.html
For example, I have a 10-channel device, and when I run the above sample code, the following line
AudioUnitGetProperty(mOutputUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &asbd_dev2_out, &propertySize);
returns an ASBD stream format with mChannelsPerFrame set to 10.
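For context, here is what that query might look like in isolation. mOutputUnit is assumed to be an AUHAL instance already bound to the output device (as in CAPlayThrough); I'm using a plain AudioStreamBasicDescription where the sample uses CAStreamBasicDescription from the utility classes:

    #include <AudioUnit/AudioUnit.h>
    #include <cstdio>

    // Ask the AUHAL what stream format the hardware side of the output element
    // (scope Output, element 0) is using; mChannelsPerFrame is the number of
    // channels the device exposes.
    void printDeviceOutputChannels(AudioUnit mOutputUnit)
    {
        AudioStreamBasicDescription asbd_dev2_out = {};
        UInt32 propertySize = sizeof(asbd_dev2_out);

        OSStatus err = AudioUnitGetProperty(mOutputUnit,
                                            kAudioUnitProperty_StreamFormat,
                                            kAudioUnitScope_Output,
                                            0,
                                            &asbd_dev2_out,
                                            &propertySize);
        if (err == noErr) {
            std::printf("device output channels: %u\n",
                        (unsigned)asbd_dev2_out.mChannelsPerFrame);
        }
    }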
The sample then goes on to configure stream formats to match input/output channel counts (commented appropriately), but it can basically be modified to do whatever you want. Once again, this means that support for any number of channels is just built into the API.
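Putting those two pieces together, matching the client format to whatever the device reports might look roughly like this (again assuming mOutputUnit is the CAPlayThrough-style AUHAL instance and configureClientFormat is the hypothetical helper sketched earlier):

    // Query the hardware-side format, then reuse its channel count and sample
    // rate for the client format we feed to the unit.
    AudioStreamBasicDescription devFormat = {};
    UInt32 size = sizeof(devFormat);
    OSStatus err = AudioUnitGetProperty(mOutputUnit,
                                        kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Output, 0,
                                        &devFormat, &size);
    if (err == noErr) {
        configureClientFormat(mOutputUnit,
                              devFormat.mChannelsPerFrame,
                              devFormat.mSampleRate);
    }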
Here’s some other reference material you may want to investigate:
Discusses AUHAL in more depth.
http://developer.apple.com/library/mac/#technotes/tn2091/-index.html
Docs discussing the Audio Object APIs (the AudioHardware APIs mentioned in the above TN are superseded by the AudioObject APIs discussed here).
https://developer.apple.com/library/mac/#technotes/tn2223/-index.html
The Core Audio Utility Classes are found here and will be required to build CAPlayThrough.
https://developer.apple.com/library/mac/#samplecode/CoreAudioUtilityClasses/Introduction/Intro.html
Also (while not specifically a desktop OS discussion), the WWDC 2012 talk, starting at the 27:00 mark, covers how to use IO units and also discusses multi-channel audio as it applies to the RemoteIO unit on iOS. However, the RemoteIO is simply the iOS version of the AUHAL, and therefore all the discussion about getting more than 2 channels out to different devices on iOS applies directly to the desktop. The APIs for talking to the AU are the same, so it's worth the watch.