[ofxMaxim] How to play/analyze stereo music files?

Hi,

I’m currently using ofxMaxim in my app for some audio analysis (FFT) and had a couple of questions.

My current implementation looks like this:

In my header:

  
  
ofxMaxiSample music;  
ofxMaxiMix mixer;  
  

In the Setup():

  
  
music.loadOgg(ofToDataPath("mysong.ogg"));  
  

The audio callback:

  
  
void testApp::audioRequested(float * output, int bufferSize, int nChannels) {  
  
		double sampleMusic;  
		double outputs[2];  
  
		for (int i = 0; i < bufferSize; i++) {  
			sampleMusic = music.playOnce();  
  
			// Do FFT here  
			  
			mixer.stereo(sampleMusic, outputs, 0.5);  
		  
			output[i*nChannels    ] = outputs[0];  
			output[i*nChannels + 1] = outputs[1];  
		}  
}  
  

One thing I noticed while testing is that the example above plays only one channel, specifically the left one. All the mixer seems to do is copy that single left-channel sample to both outputs, so left and right end up identical (the result is effectively mono). Since I was playing a stereo music file, this is quite noticeable.

My questions are: is it possible to play both channels of an ofxMaxiSample object? And can I process the channels either together or separately?

For now I did something like the code below, but that is a bit nasty because it loads and decodes the OGG file twice (which is much slower on mobile devices).

  
  
musicLeft.loadOgg(ofToDataPath("mysong.ogg"), 0);  
musicRight.loadOgg(ofToDataPath("mysong.ogg"), 1);  
  

  
  
void testApp::audioRequested(float * output, int bufferSize, int nChannels) {  
  
		double sampleMusicLeft;  
		double sampleMusicRight;  
  
		for (int i = 0; i < bufferSize; i++) {  
			sampleMusicLeft = musicLeft.playOnce();  
			sampleMusicRight = musicRight.playOnce();  
		  
			// Do FFT here for sampleMusicLeft  
			// Do FFT here for sampleMusicRight  
		  
			output[i*nChannels    ] = sampleMusicLeft;  
			output[i*nChannels + 1] = sampleMusicRight;  
		}  
}  
  

anyone :confused: ?

Is your call to ofSoundStreamSetup like this:

  
ofSoundStreamSetup(2,0,this, sampleRate, initialBufferSize, 4); // note 2 channels  

I’ve actually never used playOnce(), but reading through the code it seems fine. I’m wondering if it’s somehow something in the OGG decoding. Does your code work correctly if you use a WAV or another audio file type?

I don’t think the source format really matters (WAV or OGG); on load you specify which channel you want to use. Looking at the source, this becomes the “read channel”: https://github.com/micknoise/Maximilian/blob/master/ofxMaxim/ofxMaxim/libs/maximilian.cpp#L529

The only benefit I’ve seen to using WAV is that it loads much faster (almost instantly) than OGG, which takes quite a few seconds for a 3 to 5 minute song.

My soundstream setup is as follows:

  
  
sampleRate = maxiSettings::sampleRate; // 44100  
channels = maxiSettings::channels; // 2  
bufferSize = 512;  
  
ofSoundStreamSetup(channels, 0, this, sampleRate, bufferSize, 4);  
  

I’ll admit I’m not super familiar with either the inner workings of Maximilian or with signal processing and generation. Is there any chance you could post one or two of the samples you’re using, so I can play around with them and see what I can figure out?

You could try this file: MyStereoSong.wav

header file

  
  
#pragma once  
  
#include "ofMain.h"  
#include "ofxMaxim.h"  
  
#if defined(TARGET_ANDROID)  
	#include "ofxAndroid.h"  
#endif  
  
#if defined(TARGET_QNX)  
	#include "ofxQNX.h"  
#endif  
  
#if defined(TARGET_ANDROID)  
class testApp : public ofxAndroidApp {  
#elif defined(TARGET_QNX)  
class testApp : public ofxQNXApp {  
#else  
class testApp : public ofBaseApp{  
#endif  
	public:  
		~testApp();  
  
		void setup();  
		void update();  
		void draw();  
  
		void audioRequested(float * output, int bufferSize, int nChannels);  
		void audioReceived(float * input, int bufferSize, int nChannels);  
  
		// Keyboard Input  
		void keyPressed(int key);  
		void keyReleased(int key);  
  
	#if defined(TARGET_ANDROID)	|| defined(TARGET_QNX)  
		// Touch Input  
		void touchDown(ofTouchEventArgs &touch);  
		void touchMoved(ofTouchEventArgs &touch);  
		void touchUp(ofTouchEventArgs &touch);  
		void touchDoubleTap(ofTouchEventArgs &touch);  
		void touchCancelled(ofTouchEventArgs &touch);  
	#elif defined(TARGET_WIN32) || defined(TARGET_OSX) || defined(TARGET_LINUX)  
		// Mouse Input  
		void mouseMoved(int x, int y);  
		void mouseDragged(int x, int y, int button);  
		void mousePressed(int x, int y, int button);  
		void mouseReleased(int x, int y, int button);  
  
		void dragEvent(ofDragInfo dragInfo);  
		void gotMessage(ofMessage msg);  
	#endif  
		  
		void windowResized(int w, int h);  
  
	private:		  
		int sampleRate;  
		int channels;  
		int bufferSize;  
  
		int fftSize;		  
						  
		string songName;  
  
		ofxMaxiSample musicLeft;  
		ofxMaxiFFT fftLeft;  
		ofxMaxiFFTOctaveAnalyzer octLeft;  
  
		ofxMaxiSample musicRight;  
		ofxMaxiFFT fftRight;  
		ofxMaxiFFTOctaveAnalyzer octRight;  
};  
  
static std::string getDataPath()  
{  
#if defined(TARGET_QNX)  
	return "app/native/data/";  
#else  
	return ofToDataPath("");  
#endif  
}  
  

cpp file

  
  
#include "testApp.h"  
  
//--------------------------------------------------------------  
testApp::~testApp() {  
  
}  
  
//--------------------------------------------------------------  
void testApp::setup() {  
	ofSetLogLevel(OF_LOG_VERBOSE);  
  
	// Setup sound streamer  
	sampleRate = maxiSettings::sampleRate;  
	channels = maxiSettings::channels;	  
	bufferSize = 512;	// maxiSettings::bufferSize = 1024;  
		  
	ofxMaxiSettings::setup(sampleRate, channels, bufferSize);  
	ofSoundStreamSetup(channels, 0, this, sampleRate, bufferSize, 4);  
	  
	// Setup FFT	  
	fftSize = 1024;  
  
	fftLeft.setup(fftSize, 512, 256);	  
	octLeft.setup(sampleRate, fftSize, 12);  
	  
	fftRight.setup(fftSize, 512, 256);	  
	octRight.setup(sampleRate, fftSize, 12);  
	  
	// Prepare music  
	  
	// WAV  
	songName = "MyStereoSong.wav";	  
	musicLeft.load(getDataPath() + songName, 0);	// Load left (0) channel  
	musicRight.load(getDataPath() + songName, 1);	// Load right (1) channel  
	  
	// OGG (Pretty slow)  
	/*  
		songName = "MyStereoSong.ogg";	  
		musicLeft.loadOgg(getDataPath() + songName, 0);  
		musicRight.loadOgg(getDataPath() + songName, 1);  
	*/  
}  
  
//--------------------------------------------------------------  
void testApp::update() {  
	  
}  
  
//--------------------------------------------------------------  
void testApp::draw() {  
	  
}  
  
//--------------------------------------------------------------  
void testApp::audioRequested(float * output, int bufferSize, int nChannels) {  
	  
		double sampleLeft;  
		double sampleRight;  
  
		for (int i = 0; i < bufferSize; i++) {  
			// Get audio samples  
			sampleLeft = musicLeft.playOnce();		// Get left channel sample  
			sampleRight = musicRight.playOnce();	// Get right channel sample  
  
			// Process left channel   
			if (fftLeft.process(sampleLeft)) {  
				fftLeft.magsToDB();  
				octLeft.calculate(fftLeft.magnitudes);				  
			}  
			  
			// Process right channel   
			if (fftRight.process(sampleRight)) {  
				fftRight.magsToDB();  
				octRight.calculate(fftRight.magnitudes);  
			}  
				  
			// Write to output buffer  
			output[i*nChannels    ] = sampleLeft;	// Put left channel sample in output  
			output[i*nChannels + 1] = sampleRight;	// Put right channel sample in output  
		}  
}  
  
//--------------------------------------------------------------  
void testApp::audioReceived(float * input, int bufferSize, int nChannels) {  
	// No audio input / mic  
}  
  
//--------------------------------------------------------------  
void testApp::keyPressed(int key) {  
		  
}  
  
//--------------------------------------------------------------  
void testApp::keyReleased(int key) {  
  
}  
  
#if defined(TARGET_ANDROID)	|| defined(TARGET_QNX)  
//--------------------------------------------------------------  
void testApp::touchDown(ofTouchEventArgs &touch) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchMoved(ofTouchEventArgs &touch) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchUp(ofTouchEventArgs &touch) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchDoubleTap(ofTouchEventArgs &touch) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchCancelled(ofTouchEventArgs &touch) {  
  
}  
  
#elif defined(TARGET_WIN32) || defined(TARGET_OSX) || defined(TARGET_LINUX)  
  
//--------------------------------------------------------------  
void testApp::mouseMoved(int x, int y) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::mouseDragged(int x, int y, int button) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::mousePressed(int x, int y, int button) {  
	  
}  
  
//--------------------------------------------------------------  
void testApp::mouseReleased(int x, int y, int button) {  
  
}  
  
//--------------------------------------------------------------  
void testApp::dragEvent(ofDragInfo dragInfo) {  
	// not used  
}  
  
//--------------------------------------------------------------  
void testApp::gotMessage(ofMessage msg) {  
	// not used  
}  
#endif  
  
//--------------------------------------------------------------  
void testApp::windowResized(int w, int h) {  
  
}  
  

If it’s still useful: the comment from the person behind ofxMaxim about stereo support:


@islandrabe: Thanks for letting us know!

Where did he announce this, and are the updates supposed to land on https://github.com/micknoise/Maximilian ?

I found that comment in the long discussion thread about ofxMaxim (the link right next to my previous post leads to the quoted passage); the collection of questions and answers there has been quite useful for me.