ofxAudioUnit

Hi oF!

I’ve made an addon which makes it super-easy to use Audio Units in your oF apps. It comes with a handful of examples which walk you through what Audio Units are, how to access them, how to hook them up to each other and how to mess with them. Here’s a demonstration of all of the bundled examples:

http://vimeo.com/41115496

The syntax is quite simple. Here’s how you would create an Audio Unit chain with a sampler hooked up to a distortion unit, a reverb unit, a mixer and an output.

  
  
//	ofxAudioUnits are constructed with a description. These  
//	descriptions are explained in the first example.  
  
	ofxAudioUnit sampler    = ofxAudioUnit(kAudioUnitType_MusicDevice, kAudioUnitSubType_Sampler);  
	ofxAudioUnit distortion = ofxAudioUnit(kAudioUnitType_Effect, kAudioUnitSubType_Distortion);  
	ofxAudioUnit reverb     = ofxAudioUnit(kAudioUnitType_Effect, kAudioUnitSubType_MatrixReverb);  
	ofxAudioUnitMixer mixer;  
	ofxAudioUnitOutput output;  
	  
//	Hook them all up.  
	sampler.connectTo(distortion).connectTo(reverb).connectTo(mixer).connectTo(output);  
	  
//	Start making sound  
	output.start();  
	  
//	Show the sampler's UI in a pop-up window  
	sampler.showUI();  
  

Here it is on github : http://github.com/admsyn/ofxAudioUnit
A direct download link : http://github.com/admsyn/ofxAudioUnit/zipball/master

Hi,

Wowww, greeeeeat!!!
I will try that ASAP!!!

Thank you!!!
Do you plan to make it also for VST?

miguel
www.tangiblex.net


Do you plan to make it also for VST?

Probably not, unfortunately.

ofxAudioUnitMixer.cpp:58: error: ‘kMultiChannelMixerParam_Pan’ was not declared in this scope

That’s in example-hello.
The constant is not part of the enum where, e.g., kMultiChannelMixerParam_Volume is declared.

cool project!

/*j

The constant is not part of the enum where, e.g., kMultiChannelMixerParam_Volume is declared.

Ah yes, I can reproduce this one by changing the SDK to 10.6. It looks like that param constant was added in 10.7. Is switching to the 10.7 SDK an option for you?

BTW, if you want panning on 10.6 the Stereo Mixer can do it.

EDIT: You should be able to compile / run everything on OS X 10.6 now, though ofxAudioUnitMixer’s setPan() function won’t do anything.
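For reference, here’s a rough sketch of what panning through the Stereo Mixer looks like in plain AudioUnit calls. This isn’t part of ofxAudioUnit’s API; mixerUnit and bus are placeholder names, and error checking is omitted.

#include <AudioUnit/AudioUnit.h>

//	Set the pan on one input bus of a kAudioUnitSubType_StereoMixer instance.
//	kStereoMixerParam_Pan ranges from 0 (hard left) to 1 (hard right).
OSStatus setStereoMixerPan(AudioUnit mixerUnit, UInt32 bus, float pan)
{
	return AudioUnitSetParameter(mixerUnit,
	                             kStereoMixerParam_Pan,
	                             kAudioUnitScope_Input,
	                             bus,
	                             pan,
	                             0);
}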

Looks awesome! Thanks a lot for sharing! :smiley:

Hi,

This is some great work! It really opens up a world of sound processing to OF. I’ve been through the examples, but I can’t find any evidence of how to get AudioIn (my voice) into the AU signal chain. Is this possible? I’m imagining something like an ofxAudioUnitInput:

input >> tap >> delay >> reverb >> output

thanks again for the great work!

I’ve been through the examples, but I can’t find any evidence of how to get AudioIn (my voice) into the AU signal chain. Is this possible? I’m imagining something like an ofxAudioUnitInput:

input >> tap >> delay >> reverb >> output

UPDATE (Jun 24 2012): This functionality exists now, in ofxAudioUnitInput.

At the moment that’s not built in to ofxAudioUnit, but it is certainly possible. For now, if you want this functionality, you’ll have to do some additional legwork in the Core Audio world. It’ll be like this:

  1. get the input samples
  2. store them in a buffer
  3. feed the buffer to an output chain with a render callback

For 1 & 2, you have some options. To do it purely in the Core Audio / Audio Unit way, you’ll have to deal with the AUHAL unit. This is pretty verbose and not beginner friendly, unfortunately. Here are Apple’s notes on how to do that : http://developer.apple.com/library/mac/#technotes/tn2091/-index.html . I’m working on a way to simplify this for ofxAudioUnit, but it will involve some abstraction, since you cannot directly feed an input unit into an output chain in the intuitive way that you want to (i.e. you need to do some buffering and a bit of render callback dancing).
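To give a sense of what the AUHAL setup in that tech note boils down to, here’s a sketch (the auhal name is a placeholder and error checking is omitted): enable input on element 1, disable output on element 0, then attach the default input device.

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>

void configureAUHALForInput(AudioUnit auhal)
{
	UInt32 on = 1, off = 0;

	//	input comes in on element 1, output goes out on element 0
	AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_EnableIO,
	                     kAudioUnitScope_Input, 1, &on, sizeof(on));
	AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_EnableIO,
	                     kAudioUnitScope_Output, 0, &off, sizeof(off));

	//	point the AUHAL at the default input device
	AudioDeviceID inputDevice;
	UInt32 size = sizeof(inputDevice);
	AudioObjectPropertyAddress addr = {
		kAudioHardwarePropertyDefaultInputDevice,
		kAudioObjectPropertyScopeGlobal,
		kAudioObjectPropertyElementMaster
	};
	AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, &inputDevice);
	AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_CurrentDevice,
	                     kAudioUnitScope_Global, 0, &inputDevice, sizeof(inputDevice));
}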

As an alternative, you could do something with openFrameworks’ built in ofSoundStream (see the audioInputExample).

The end goal of steps 1 & 2 is to have some raw buffer of samples to feed to your units in a render callback. The samples should be floating point numbers between -1 and 1. If you’re working on iOS the sample format will be different (probably 8.24 fixed point, I believe).

For 3, you’ll have to take the first unit in your output chain (for example, the delay in the chain you’ve outlined) and set its render callback to a function in your program. In your render callback function, you’ll give it samples from a buffer that has already been filled with mic samples from steps 1 & 2. See the render callback example in ofxAudioUnit for a few words about that.
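To make step 3 concrete, here’s a minimal sketch (not the addon’s exact API; micBuffer, renderMicSamples and installCallback are placeholder names) of a render callback that hands pre-buffered mic samples to the first unit in the chain:

#include <AudioUnit/AudioUnit.h>
#include <vector>

static std::vector<float> micBuffer; // filled elsewhere with -1..1 samples (steps 1 & 2)

static OSStatus renderMicSamples(void * inRefCon,
                                 AudioUnitRenderActionFlags * ioActionFlags,
                                 const AudioTimeStamp * inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList * ioData)
{
	//	copy buffered mic samples into each output buffer, zero-padding if we run short
	for(UInt32 b = 0; b < ioData->mNumberBuffers; b++) {
		float * out = (float *)ioData->mBuffers[b].mData;
		for(UInt32 i = 0; i < inNumberFrames; i++) {
			out[i] = (i < micBuffer.size()) ? micBuffer[i] : 0;
		}
	}
	return noErr;
}

//	in setup, attach the callback to the first unit's input bus (e.g. the delay)
void installCallback(AudioUnit firstUnitInChain)
{
	AURenderCallbackStruct callback = {renderMicSamples, NULL};
	AudioUnitSetProperty(firstUnitInChain,
	                     kAudioUnitProperty_SetRenderCallback,
	                     kAudioUnitScope_Input,
	                     0,
	                     &callback,
	                     sizeof(callback));
}

In practice you’d want a proper ring buffer there (consuming samples as you go) rather than reading from the front of a vector, but that’s the general shape of it.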

So…yeah. Not beginner friendly unfortunately, sorry about that. I’ll get an input unit set up once I’ve got time to devote to ofxAudioUnit again (shouldn’t be too long (famous last words)).

PS : if you do get this working, beware of feedback. Also, be wary of passing data into your output chain that isn’t in the sane -1 to 1 range. If you’re wearing headphones at the time, your head might explode :wink:

EDIT : By the way, if you start going down the AUHAL path and need some extra help, feel free to ask here and I’ll see what I can do.

Hi admsyn!

I’ve been studying up on Core Audio and AUs and I’ve made some additions to your addon. Wrapping AU functionality for OF is of great interest to me, so if it’s cool with you I’d like to fork your github project and throw my additions in.

Jason

Awesome! Excited to see what you have.

Very nice addon.
Is there a way to record the sound into a file?
I read something about this here:
http://developer.apple.com/library/mac/#samplecode/AudioDataOutputToAudioUnit/Listings/CaptureSessionController-m.html

Is there a way to record the sound into a file?

That depends on what you mean. It’s not a thing I’ve built into ofxAudioUnit (yet), but it is something Audio Units can do. Strictly speaking, basically anything that you can do with audio on OSX is something that you can do with Audio Units, as just about everything audio-related on OSX is built on top of them (Audio Queues, AVFoundation, etc.).

The sample code you linked to is related to QTKit, which is a few layers of abstraction above Audio Units (I might be wrong, and there are certainly other members of the forum who know more about QTKit than I). Basically though, the QTKit code you linked won’t do much for you in Audio Unit land.
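For the curious, the Audio-Unit-flavoured version of this would look something like the sketch below: attach a render notify callback to the output unit and write each rendered buffer to disk with ExtAudioFile. This isn’t in the addon; recordingFile is a placeholder, assumed to have been created with ExtAudioFileCreateWithURL() and primed with an initial ExtAudioFileWriteAsync(recordingFile, 0, NULL) call before rendering starts.

#include <AudioToolbox/AudioToolbox.h>
#include <AudioUnit/AudioUnit.h>

static ExtAudioFileRef recordingFile; // created & primed elsewhere

static OSStatus writeRenderedAudio(void * inRefCon,
                                   AudioUnitRenderActionFlags * ioActionFlags,
                                   const AudioTimeStamp * inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList * ioData)
{
	//	only write once the unit has actually rendered this buffer
	if(*ioActionFlags & kAudioUnitRenderAction_PostRender) {
		ExtAudioFileWriteAsync(recordingFile, inNumberFrames, ioData);
	}
	return noErr;
}

//	then, on the underlying AudioUnit of the output:
//	AudioUnitAddRenderNotify(outputUnit, writeRenderedAudio, NULL);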

I’m going to take a stab at implementing this though, since it seems like a useful feature. Just as an update to people who are interested, these are the things on my plate at the moment (in order):

  • input (ie. mic, line in)

  • recording to file

  • the things on the github TODO (iOS, sampler, aupresets…)

I would welcome any pull requests or issue filing on the github page, btw! I don’t have as much time to devote to this project as I’d like.

PS: has anyone used ofxAudioUnit on an iThing? Does it work? I’ve only tested it on the simulator so far.

Update : ofxAudioUnitInput now exists, with an API mimicking all of the other Audio Units. You can use it just as xululululuuum asked for:

  
  
input >> tap >> delay >> reverb >> output;  
input.start();  
output.start();  
  

If you do a git pull and can’t compile, make sure you’ve re-added all of the files in ofxAudioUnit/src/ since there are some new ones now.

A quirk : You’ll need to have your hardware sample rate set to 44100 in /Applications/Utilities/Audio MIDI Setup.app. Some applications change this on you (I think? It seems to set itself to 96000 on occasion on my laptop). If you start getting error -10863, this is almost certainly the cause.
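If you’d rather check this from code than open Audio MIDI Setup, something like this sketch should report the default output device’s current nominal sample rate (plain Core Audio, error checking omitted):

#include <CoreAudio/CoreAudio.h>
#include <iostream>

void printOutputSampleRate()
{
	AudioObjectPropertyAddress addr = {
		kAudioHardwarePropertyDefaultOutputDevice,
		kAudioObjectPropertyScopeGlobal,
		kAudioObjectPropertyElementMaster
	};

	//	find the default output device...
	AudioDeviceID device;
	UInt32 size = sizeof(device);
	AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, &device);

	//	...and ask for its nominal sample rate
	addr.mSelector = kAudioDevicePropertyNominalSampleRate;
	Float64 rate;
	size = sizeof(rate);
	AudioObjectGetPropertyData(device, &addr, 0, NULL, &size, &rate);

	std::cout << "hardware sample rate: " << rate << std::endl; // expect 44100
}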

This is a very nice addon.

I was able to play 32 sound files at once. Maybe more, I haven’t tested that yet.

Is there a way to specify the output device? I have multiple M-Audio Fast Tracks attached via USB and would like to tell the app which one to use.

Thanks.

Is there a way to specify the output device? I have multiple M-Audio Fast Tracks attached via USB and would like to tell the app which one to use.

There isn’t one built into ofxAudioUnit right now, but it’s certainly possible. If you look into ofxAudioUnitInput.cpp, in ofxAudioUnitInput::configureInputDevice() you can see where I’m setting up the AUHAL unit to represent the proper input device. Setting up the AUHAL to represent a different output device in ofxAudioUnitOutput would be a similar affair.
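In case it helps in the meantime, the core of it would be something like this sketch: given the underlying AudioUnit of the output and an AudioDeviceID (found via kAudioHardwarePropertyDevices), point the AUHAL at that device. The names here are placeholders, not ofxAudioUnit API.

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>

//	Tell the AUHAL / output unit which hardware device to talk to.
OSStatus setOutputDevice(AudioUnit outputUnit, AudioDeviceID deviceID)
{
	return AudioUnitSetProperty(outputUnit,
	                            kAudioOutputUnitProperty_CurrentDevice,
	                            kAudioUnitScope_Global,
	                            0,
	                            &deviceID,
	                            sizeof(deviceID));
}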

This is actually 2nd on my list of things to do right now (and I am working on it!). I want to include some device management features, both for selecting outputs as you suggest, and for managing the input device (since it has a habit of picking an incompatible input format).

Currently I’m doing a bit of DSP stuff (fft, pitch detection, etc) which should be done soon™.

OK, cool.
I will take a look, but will also keep an eye out for your updates.

Do you know when you might have the record-input and select-output-device features available?

Thanks.

I am still working on getting multichannel audio devices to work with this addon, but in the meantime I expanded this great addon to allow for device selection.
I have not had time to test it thoroughly, but here it is:
https://github.com/antimodular/ofxAudioUnit

What an incredible addon. Thanks so much for this contribution!

A question about usage. I have a 3rd-party AU whose sub-type doesn’t seem to conform to the 4-character code convention. Aside from asking the programmer to change this in the next release, is there anything I can do to be able to use this AU?

Weird! I guess the auval utility is just bugging out. The 4-character codes just boil down to numbers, so maybe 00616770 is just the numeric representation of the real 4-char code.
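Just to illustrate what I mean by “boil down to numbers”: a 4-char code is its four ASCII bytes packed big-endian into a 32-bit integer, e.g.

#include <MacTypes.h>

//	'aumu' (kAudioUnitType_MusicDevice) is just four ASCII bytes packed into a UInt32:
OSType aumu = ('a' << 24) | ('u' << 16) | ('m' << 8) | 'u';
//	aumu == 0x61756D75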

Does it work if you just use it directly as a number, unquoted? Like this:

ofxAudioUnit('aumu', 001616770, 'AudG');

No, same error: EXC_BAD_ACCESS

Notably, it also failed in exactly the same way when I tried another AU I had installed:

ofxAudioUnit('aumu', 'ncut', 'TOGU');
// aumu ncut TOGU  -  TAL-Togu Audio Line: TAL NoiseMaker Plugin

It prints this line just before falling over:

Couldn't locate component for description

I now have it working with another 3rd-party AU that I just installed:

ofxAudioUnit('aumu', 'CaC2', 'CamA');
//aumu CaC2 CamA  -  Camel Audio: Alchemy

There’s something about those other two. They both work just fine in Reaper. Let me know if you have any thoughts; I’m happy to help try to debug this.