Fast screen buffer w/ feedback?

Hello …

I’m trying to do a fast screen-buffer effect that feeds back on itself. Essentially … I set up two objects: a background and a mirror image (which contains an ofTexture). Then, per frame, without automatically clearing the screen, I…

  • draw the background to the screen (with or without alpha blending, so that the previous frame either clears immediately or fades out in a smear)

  • draw the ‘mirror image’ to the screen (with an offset in position/scale/etc. and with or without alpha blending)

  • capture the current screen to the mirror image using ofTexture.loadScreenData(0,0,320,240)

This works, and it blazes in the simulator. On hardware, though, the results vary: on a first-generation iPhone it absolutely crawls (about 2 fps), while on an iPhone 3GS it is not fast, but fast enough (about 20–30 fps).
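
For reference, the loop looks roughly like this. It is only a minimal sketch: background, mirrorTex, the asset name, and the alpha values are placeholders for my actual setup.

```cpp
// testApp.h (excerpt)
ofImage   background;   // whatever gets drawn behind the feedback
ofTexture mirrorTex;    // holds the previous frame's screen capture

// testApp.cpp
void testApp::setup(){
    ofSetBackgroundAuto(false);              // don't auto-clear the screen each frame
    background.loadImage("background.png");  // placeholder asset
    mirrorTex.allocate(320, 240, GL_RGB);    // texture that receives the screen grab
}

void testApp::draw(){
    ofEnableAlphaBlending();

    // 1. background: full alpha wipes the last frame, low alpha leaves a fading smear
    ofSetColor(255, 255, 255, 40);
    background.draw(0, 0, 320, 480);

    // 2. the mirror image, drawn with an offset/scale so it feeds back on itself
    ofSetColor(255, 255, 255, 180);
    mirrorTex.draw(10, 10, 300, 450);

    // 3. capture the result back into the texture for next frame
    mirrorTex.loadScreenData(0, 0, 320, 240);
}
```

Step 3 is the part that seems to be killing the frame rate on hardware.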

Is the iPhone pixel pipeline just really slow, or is there a better way to do this? Bearing in mind - I am not just trying to create a frame blur; I need to draw to the screen (or to a frame buffer), THEN capture that result and draw it onto the screen (or frame buffer) with offsets.

I tried using an ofImage.grabScreen() instead of ofTexture.loadScreenData() and it was much slower (and had weird mirroring problems where the result was inverted vertically).

I’ve tried optimizing by turning off linear interpolation and other sorts of things, but in the end none of it has much impact on performance either way – the slowdown seems to come purely from capturing the screen (or a screen buffer) each frame.

N

Hey, try drawing directly to an offscreen texture. Then you don’t have to capture what you drew; you already have it in a texture, and you can draw it to the screen as many times as you want with different offsets. This should be fastest.

You can look at the fbo addon at addons.openframeworks.cc. It probably won’t work straight up and you may need to make some mods to get it to compile on OpenGL ES. Alternatively, you can look at making an offscreen render buffer from the ground up; the code should be 99% similar to the surface creation code in eaglview.m (in ofxiPhone). The Apple docs have examples of this.
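
If you build it from the ground up, it really is just the OES framebuffer-object calls that eaglview.m already uses for the main surface, pointed at a texture instead. A rough sketch in plain OpenGL ES 1.1; the 256x256 size, the screenFramebuffer variable, and the error handling are stand-ins for whatever your view actually uses:

```cpp
#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

GLuint fbo = 0, fboTexture = 0;

// a texture to render into (ES 1.1 generally wants power-of-two sizes)
glGenTextures(1, &fboTexture);
glBindTexture(GL_TEXTURE_2D, fboTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// a framebuffer with that texture attached as its color buffer
glGenFramebuffersOES(1, &fbo);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);
glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                          GL_TEXTURE_2D, fboTexture, 0);

if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES){
    // incomplete framebuffer - bail out or log
}

// per frame: bind the fbo, set the viewport to the texture size, draw the
// feedback scene, then rebind the screen framebuffer and draw fboTexture
// as many times as you like with different offsets
glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);
glViewport(0, 0, 256, 256);
// ... draw into the texture here ...
glBindFramebufferOES(GL_FRAMEBUFFER_OES, screenFramebuffer); // the id eaglview.m created
```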

The current app came out like this – at a good enough FPS on the 3GS to be quite usable (sorry, the only video cam I have around is on the iPhone itself)… http://www.flickr.com/photos/alinear/se-…-705654263/ … I’m not sure I could get near the same resolution of effect (in some cases needing 320×480 textures at full screen size) using iterated textures, at least not without also choking up the hardware.

For older handsets I could probably do a more limited effect, iterating a frame buffer at a lower resolution…

I was actually hoping to eventually find a way to apply this to imaging the camera input somehow – since from what I can tell the latest beta SDK provides a means to get at it legitimately (finally).

I haven’t yet tried just using a Quartz/CoreSurface implementation of this sort of effect, I am sort of assuming that would be slow…

alinear, would you be willing to share your FBO code?

Hey, sorry for the delay. I am about to release this app based on the code I mentioned, which now includes processing the (live) cam input. I’ll share the code once I release… :)

nice, I can’t wait to see that live camera code…

is that something that is only supported on the 3GS?

this sounds great. by the way there is iPhone-compatible FBO code at http://forum.openframeworks.cc/t/ofxfbotexture/3143/0
I believe this is going to be a core addon soon.
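
I haven’t checked that addon’s exact method names, so here is the usage pattern sketched with the allocate/begin/end/draw interface a core FBO addon would likely expose; the ofFbo class name and arguments below are assumptions, and the addon at the link may differ. Two buffers are ping-ponged because reading from a texture while it is the current render target is undefined:

```cpp
ofFbo fboA, fboB;           // assumed class name; the addon's may differ
ofFbo *src, *dst;
ofImage background;

void testApp::setup(){
    fboA.allocate(320, 480, GL_RGBA);
    fboB.allocate(320, 480, GL_RGBA);
    src = &fboA;
    dst = &fboB;
    background.loadImage("background.png");   // placeholder asset
}

void testApp::draw(){
    ofEnableAlphaBlending();

    // render the background plus last frame's result (offset) into the other buffer
    dst->begin();
    ofSetColor(255, 255, 255, 40);
    background.draw(0, 0);
    ofSetColor(255, 255, 255, 180);
    src->draw(10, 10, 300, 450);
    dst->end();

    // draw it to the screen (no readback needed), then swap for the next frame
    ofSetColor(255);
    dst->draw(0, 0);
    ofFbo *tmp = src; src = dst; dst = tmp;
}
```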

sick!

ha!

http://www.flickr.com/photos/jonbro/sets/72157623270478164/

I am not doing anything with FBOs; I am just using alpha transparency on the last screen cap. I think my favorite thing is that it picks up the iPhone UI elements.