getting sound levels (or, manually scheduling events)

I’ve been looking into openFrameworks for an iPhone app I’m planning to build. The visual part of the app will need to respond to the currently playing audio. There may be up to 4 samples playing at the same time. I don’t need fancy FFT info; just the current level would do.

I looked into the OpenAL addon that’s available from Zach’s website, but it doesn’t support fetching the current level. Also, I briefly checked the OpenAL docs, and couldn’t find anything there either.

Now, my question: Is there a way to relatively easily play sounds and get the levels back?

If the answer is no, then I have a second question. For each sound I could create a list of some kind with all the positions I want my app to respond to (in frames perhaps, or relative to the duration of the sample). How would I approach something like this? The scheduling would have to be pretty accurate.

Thanks, and thanks for an awesome framework!

yeah, there is a way to get the volume of the currently playing sound, especially if you have generated it yourself.

take a look at ofSoundStream. you will need to write your own code for loading in samples, if that is how you are generating your sound.

regarding the second question, that is doable as well: store your cue times in an array (at whatever resolution you need, say every 3–4 milliseconds), and on each update check the offset time against ofGetElapsedTimeMillis(). you should get pretty good scheduling with that method.

Thanks! Got the scheduling working as you proposed.
ofSoundStream looks promising, but getting sound (especially CAF samples) in there is a challenge I’m not up to yet :slight_smile: