[SOLVED] Passing video between simultaneously running apps, or am I crazy?

So, I’m in a bit of a pickle.

I’ve successfully been able to grab video from a Ximea USB 3 research camera with OF, by grabbing a pointer to the first pixel of the frame through their API and wrapping it in a pixel object, using a 64-bit build of OF. I need to use that video with an Oculus DK2, however the ofxOculusDK2 addon uses a 32-bit build of the LibOVR library, so I can’t use it inside my now-working 64-bit app.

One method I’ve come up with to get around this is to run both apps (the framegrabber and the Oculus visualizer) simultaneously, and somehow give the Oculus app access to the pixel objects generated by the framegrabber app (pass along a pointer to the object? Stream over localhost? Keeping latency low is extremely important).

Does anybody know how I might go about doing this? Is there a better option?


You can pass video textures between apps pretty efficiently using the Syphon addon.


Syphon is great for sharing texture memory. If you need to send raw data, I’ve had good luck streaming images between computers using ZeroMQ - https://github.com/satoruhiga/ofxZmq


Another way to do this is with ‘shared memory’. I learnt about this in another post here, and thought it was pretty useful. You set up a ‘memory-mapped file’ containing an unsigned char array of your video pixels in one app, and read the same memory-mapped file in the other. I made an addon a while ago with a video example https://github.com/trentbrooks/ofxSharedMemory but I think you can do it natively with Poco::SharedMemory - which would be much cleaner.


For Windows apps and NVIDIA-compatible graphics there is “Spout” http://spout.zeal.co. It works fine with openFrameworks and Cinder, and there are sender and receiver examples included in the installation.

Also, for BlackMagic devices there is “BlackSpout” https://magicmusicvisuals.com/forums/viewtopic.php?f=6&t=201

. . .

ofxSyphon worked splendidly. Thank you.