Performance considerations for Raspberry Pi?

An exploration of mine is a simple crossfading slideshow that pulls images down over HTTP via ofLoadURLAsync(), loads the resulting data into an ofImage and then draws via alpha blending. The framerate leaves something to be desired, fluttering in the 7-13 FPS range, and I’m trying to determine what is causing the slowdown and how to boost it. I’ve also tried a simple shader and FBO for comparison; while faster, the FBO stops drawing after a few iterations, possibly as a result of: *ofFbo not working with ofGLProgrammableRenderer*
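
For reference, the draw pass is just plain alpha blending between the outgoing and incoming image, roughly like this (a minimal sketch; currentImage, nextImage and fadeAmount are my own member names):

```cpp
// ofApp::draw() excerpt: currentImage is the slide on screen,
// nextImage is the incoming slide, fadeAmount runs 0 -> 1 over the fade
void ofApp::draw(){
    ofEnableAlphaBlending();

    ofSetColor(255, 255, 255, 255);                      // outgoing slide, fully opaque
    currentImage.draw(0, 0, ofGetWidth(), ofGetHeight());

    ofSetColor(255, 255, 255, (int)(fadeAmount * 255));  // incoming slide fades up on top
    nextImage.draw(0, 0, ofGetWidth(), ofGetHeight());

    ofDisableAlphaBlending();
}
```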

What factors should one consider when diagnosing laggy framerates? Are there any special considerations to make when running on the Raspberry Pi? Some things I’ve thought of:

Factors

  • CPU / GPU memory split – 256/256
  • memory card speed – class 10
  • image size – 1024x768 PNG
  • GL setup – ofSetupOpenGL(1024, 768, OF_WINDOW) or OF_FULLSCREEN

How often are you downloading frames? Every update? Or less often?

Not too often; the slideshow holds an image for 5 seconds, then crossfades over 2, and a replacement image isn’t downloaded until it’s needed.
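
The fade amount is just derived from elapsed time in update(), something like this (sketch only; holdTime, fadeTime and slideStartTime are my own members, set to 5.0, 2.0 and the time the slide appeared):

```cpp
// ofApp::update() excerpt: hold for holdTime seconds, then fade over fadeTime
float elapsed = ofGetElapsedTimef() - slideStartTime;
if(elapsed < holdTime){
    fadeAmount = 0.0f;                                    // still holding the current slide
} else {
    fadeAmount = ofClamp((elapsed - holdTime) / fadeTime, 0.0f, 1.0f);
    if(fadeAmount >= 1.0f){
        currentImage = nextImage;                         // fade finished, promote the new slide
        slideStartTime = ofGetElapsedTimef();             // restart the hold/fade cycle
        // the next download is kicked off here, well before it's needed
    }
}
```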

It’s very strange and I suspect that it has something to do with the renderer. You should easily be able to get framerates approaching 60 for a shader-based cross-fade – especially since you are using an async download … @jvcleave any updates on this https://github.com/openframeworks/openFrameworks/issues/2593?
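
For the shader route, the usual pattern is to bind the second texture as a uniform and let the fragment shader do the mix; the C++ side is roughly this (a sketch, assuming a fragment shader with a tex1 sampler and a fade uniform — getTextureReference() is getTexture() on current OF):

```cpp
// ofApp::draw() excerpt: crossfade done entirely on the GPU
crossfadeShader.begin();
crossfadeShader.setUniformTexture("tex1", nextImage.getTextureReference(), 1);
crossfadeShader.setUniform1f("fade", fadeAmount);
currentImage.draw(0, 0, ofGetWidth(), ofGetHeight());   // supplies tex0 and the texcoords
crossfadeShader.end();
```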

Also try using ofxThreadedImageLoader, which will load images in a background thread instead of locking the GL thread while loading. Even with that, uploading to the texture will still happen in the GL thread, which can be slow; there are ways of loading textures in a background thread on the Pi, but it’s kind of complex.
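
Usage is roughly this (a sketch; the URL and file name are just placeholders, and the method names are from memory of the addon):

```cpp
// ofApp.h excerpt
#include "ofxThreadedImageLoader.h"
ofxThreadedImageLoader loader;
ofImage nextImage;

// ofApp.cpp excerpt: both calls return immediately, the image becomes
// drawable once the background thread has finished decoding it
loader.loadFromURL(nextImage, "http://example.com/slide.png");   // remote
loader.loadFromDisk(nextImage, "images/slide.png");              // local, relative to bin/data
```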

I would also try the crossfading with 2 preloaded images and see what fps you get with that. If the fps is fine there, then it’s surely because of loading the image and the texture in the main thread.

Thanks for the suggestions guys, I’ll try local images first, then take a look at the threaded image loader to see what improvements I can find.

@bakercp no updates beyond what the github issue contains

I would try arturo’s suggestion first as it sounds more in line with my experience.

I just dug up this older SlideShow I had and tried it on the Pi and was getting 60fps

I would probably use something like this with preallocated images and a threaded image loader

Thanks Jason, I will give that project a run and see if I can tease out what is happening on my side. The first thing I notice is that I wasn’t using the programmable renderer in my naive alpha blending version; perhaps that has a deeper impact than I was anticipating.
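
If it helps anyone else, switching it on is just a change in main.cpp before the GL context is created (a sketch, matching the older ofSetupOpenGL style used here):

```cpp
// main.cpp: select the programmable renderer (GLES2 path on the Pi)
// before the window / context is created
#include "ofMain.h"
#include "ofApp.h"
#include "ofGLProgrammableRenderer.h"

int main(){
    ofSetCurrentRenderer(ofGLProgrammableRenderer::TYPE);
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```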

Thanks for all the suggestions guys, I noticed in htop that my memory use was at 47% as opposed to @jvcleave’s sample, which ran around 13%. That was very telling, so I ripped out components until the issue reproduced with a skeleton app. Embarrassingly enough, I was passing an ofImage into a simple perspective fit/fill function by value instead of by reference; oops. :poop: I suppose either my laptop’s hardware masked the issue during development there, or it’s conceivable that Clang recognized my gross error and optimized it away.
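
In other words, the fix was just the signature (fitFill is my own helper, shown here only to illustrate the by-value vs by-reference difference):

```cpp
// before: every call copied the whole ofImage (pixels and all), once per frame
ofRectangle fitFill(ofImage img, float targetW, float targetH);

// after: pass by reference so the existing image is used in place
ofRectangle fitFill(ofImage & img, float targetW, float targetH);
```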

The framerate definitely dipped from 59 to the upper 30s while an image was loading, so I jumped on ofxThreadedImageLoader and now it bottoms out in the mid 40s instead. Either way it isn’t a big deal in my case, because the image load occurs while another image is being presented at full alpha, so visually it isn’t noticeable.


@jvcleave
I am very interested in your SlideShowExampleApp.

I am a television studio engineer, and I have recently discovered a way to turn a Raspberry Pi into an incredibly useful graphics playback device, with awesome resolution and reliability. However, I have only succeeded in making it work with looping video graphics, or video files in general. I have not yet been able to make it work with still images, other than using fbi, which works but jerks like crazy on the transitions because, as you know, it does not use hardware acceleration. I read your post about the slideshow example app, and I want to know how to use it. I have downloaded it, but I haven’t got it working.

What are the usage commands to make it playback a directory full of JPEG images?

What I want to do is have it show a picture until I press a trigger key, like Enter, and then it fades to the next image in the numbered sequence, then waits for Enter, then fades to the next one, etc. This is basically a PowerPoint function. Can this be done with your app?

The first time I installed it I typed ./SlideShow

and then I got a whole slew of verbose text which ultimately ended with

[notice ] ofAppEGLWindow: runAppViaInfinteLoop(): setting up notifications complete
Segmentation fault

Just need a bit of training on how to use it!

It is currently set up to look for a folder called “images” in bin/data that contains png files. You could build on it to respond to key presses, but it would take some modifications. Take a look at the OF examples and you should be able to tie it together.
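
Very roughly, the kind of modification I mean would look like this (a sketch only, with my own member names; it assumes imagePaths was filled from an ofDirectory listing of bin/data/images):

```cpp
// ofApp.cpp excerpt: hold the current slide until Enter, then crossfade
void ofApp::keyPressed(int key){
    if(key == OF_KEY_RETURN && !fading){
        nextIndex = (currentIndex + 1) % imagePaths.size();
        nextImage.load(imagePaths[nextIndex]);   // loadImage() on older OF
        fadeStart = ofGetElapsedTimef();
        fading = true;
    }
}

void ofApp::update(){
    if(fading){
        fadeAmount = ofClamp((ofGetElapsedTimef() - fadeStart) / fadeTime, 0.0f, 1.0f);
        if(fadeAmount >= 1.0f){
            currentImage = nextImage;            // the new slide becomes the held slide
            currentIndex = nextIndex;
            fading = false;
            fadeAmount = 0.0f;
        }
    }
}
```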