I’ve been trying to plot a sound waveform in a fragment shader for a few days now, with no success. I’m assuming the proper way to do this is to convert the sound buffer into a texture and then send that texture to the shader as a uniform, but I’ve tried a number of different approaches with no luck so far. Any ideas?
Hi @fidete, did you try getting the sound buffer data into an ofFloatPixels, and then using that for the ofTexture? I’m thinking something like this would work:
// in ofApp.h
ofSoundBuffer soundBuffer;
ofFloatPixels fPixels; // values 0.0 - 1.0
ofTexture texture;
// in ofApp.setup()
// soundBuffer will have 1 channel with 160000 samples (200 * 200 * 4)
// fPixels will be RGBA 200 x 200
int width = 200;
int height = 200;
int numChannels = 4; // RGBA
// allocate soundBuffer and fill it with noise
soundBuffer.allocate(width * height * numChannels, 1);
soundBuffer.fillWithNoise();
// set fPixels from soundBuffer
fPixels.setFromPixels(soundBuffer.getBuffer().data(), width, height, numChannels);
// set texture from fPixels
texture.loadData(fPixels);
// in ofApp.draw()
// texture should have "rainbow" RGBA noise
texture.draw(0.f, 0.f);
I haven’t used ofSoundBuffer much at all. I’m thinking that when there is more than 1 channel, the samples are interleaved (frame by frame: L R L R …).
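If the samples are frame-interleaved like that, pulling out a single channel is just a strided copy. Here’s a minimal sketch in plain C++ (extractChannel is just an illustrative helper name, not an oF function; I believe ofSoundBuffer also has a getChannel() that does this for you):

```cpp
#include <cstddef>
#include <vector>

// Extract one channel from an interleaved buffer.
// Frame i, channel c lives at index i * numChannels + c.
std::vector<float> extractChannel(const std::vector<float>& interleaved,
                                  std::size_t numChannels,
                                  std::size_t channel) {
    std::vector<float> out;
    out.reserve(interleaved.size() / numChannels);
    for (std::size_t i = channel; i < interleaved.size(); i += numChannels) {
        out.push_back(interleaved[i]);
    }
    return out;
}
```

So for a stereo buffer {L0, R0, L1, R1, …}, extractChannel(buffer, 2, 0) would give you just the left channel to upload as a texture.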
There is also ofBufferObject, which is kinda like a texture for data. The /examples/gl/textureBufferInstancedExample project is a pretty clear example of how to use this class if you’re interested.
Edit: I forgot that sound is typically in the range of -1.0 to 1.0. So multiplying every sample by 0.5 and adding 0.5 should correct the range to 0.0 to 1.0. I don’t think a standard RGBA texture can store values outside of 0.0 to 1.0.
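The remap would be something like this (plain C++ sketch; remapToUnit is just an illustrative name, you could also do it in place on soundBuffer.getBuffer() before setFromPixels()):

```cpp
#include <vector>

// Map audio samples from [-1.0, 1.0] into [0.0, 1.0] so they fit
// a normalized texture: y = x * 0.5 + 0.5.
std::vector<float> remapToUnit(const std::vector<float>& samples) {
    std::vector<float> out;
    out.reserve(samples.size());
    for (float s : samples) {
        out.push_back(s * 0.5f + 0.5f);
    }
    return out;
}
```

In the shader you’d then undo it with sample * 2.0 - 1.0 if you want the original signed values back.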
Also, there are some texture formats that (should) accommodate values outside of 0.0 to 1.0, like GL_RGBA32F and GL_RGBA16F. But these might be better suited to getting data out of a shader (as a texture).