Hey gang, I am getting a pretty strange error on the Kindle Fire Phone. I am testing out the androidSoundPlayerExample and the voice sample (which is the only mp3 – the rest are wav files) seems to trigger the following error:
After this error happens twice (not once! it still works after one of these), the sample no longer plays when I hit the vocals button.
I searched and searched but the only reference to this error was in the following source code:
I have no idea what is going on here, but I am worried that I will not be able to use mp3s in my project and will instead be forced into using gigantic wav files.
Note that I tried this example project with both ofSoundPlayer and ofxAndroidSoundPlayer. I tried with streaming set to true and streaming set to false. I got the same behavior in all cases.
Naming convention difference between Android and openFrameworks
ofxAndroidSoundPlayer uses Android’s SoundPool, which is meant for short .wav sound effects, whereas ofxAndroidVideoPlayer uses Android’s MediaPlayer, which works very well for videos and mp3s.
You can use it like this:
ofxAndroidVideoPlayer music;
music.loadMovie("pathToYourFile.mp3");
music.removeTexture(); // do not forget this line for sound files
music.setLoopState(OF_LOOP_NORMAL);
music.setVolume(1.0f);
music.play();
About your error: I think you need to somehow add the ofAndroidLib project to your project dependencies. Project Properties -> Java Build Path -> Projects -> Add… may do the job, but I am not sure.
EDIT: The last paragraph may not apply here. I have had errors like this in the past, but I don’t remember how I solved them.
if you set the sound player to stream it’ll also use MediaPlayer internally, like: load(sound, true)
the media player, though, is meant to be used with long sound files… if you want to play short sounds like game effects you should be using SoundPool. not sure where the error comes from though
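To make the distinction above concrete, here is a minimal sketch, assuming a recent openFrameworks API where ofSoundPlayer::load() takes an optional streaming flag (older versions used loadSound() with the same flag); the file paths are placeholders, not files from this project:

```cpp
// Sketch only: requires openFrameworks; paths below are hypothetical.
ofSoundPlayer music;
ofSoundPlayer blip;

// streaming = true -> backed by Android's MediaPlayer (suited to long mp3s)
music.load("sounds/music.mp3", true);

// streaming = false (the default) -> backed by SoundPool (suited to short effects)
blip.load("sounds/blip.wav", false);

music.play();
blip.play();
```

The second argument is the "stream" flag the replies refer to: it selects which Android backend the player wraps.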
Ah interesting, so this is similar to the streaming=false parameter in the constructor to ofSoundPlayer? I have read that on Android it toggles between different media backends but I have not looked at the source to see for sure:
What does removeTexture() do for sound files? Does it release one of the channels? I think on my hardware (Kindle Fire Phone) there is a message at startup about 128 channels.
One thing I was hoping to fix today is simply polling the play() method to see when my sound file had finally loaded (and played) but unfortunately that method returns void.
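Since play() returns void, one alternative (assuming your openFrameworks version exposes ofSoundPlayer::isLoaded(), which newer releases do; I have not verified the version used in this thread) is to poll the load state in update() and only call play() once the file is ready. A hedged sketch with a hypothetical file path:

```cpp
// Sketch only: requires openFrameworks and assumes isLoaded() is available.
ofSoundPlayer vocals;
bool started = false;

void ofApp::setup(){
    // stream = true so a long mp3 goes through MediaPlayer
    vocals.load("sounds/voice.mp3", true); // path is a placeholder
}

void ofApp::update(){
    // start playback once the asynchronously loaded file reports ready
    if(!started && vocals.isLoaded()){
        vocals.play();
        started = true;
    }
}
```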
i would use ofSoundPlayer setting streaming to true, using a video player seems super hacky and will allocate a texture, pixels… that are not necessary
Sorry about that error, it actually has nothing to do with sound.
Ok, so I tried both of those methods – setting streaming to true and using the ofxAndroidVideoPlayer. It’s interesting, in both cases I get the following error in my log:
So, even though both of those methods were supposed to use a streaming MediaPlayer on the backend, my error message indicates that the SoundPool is being used. I wonder if it is something specific to the Kindle Fire Phone? I am going to start looking at the source to see how these functions are implemented behind the scenes, because otherwise I am at a complete loss as to how to load and play a sound.