Been looking around the forums all day for some guidance on whether this is possible:
Essentially I have 8-10 PNG sequences of around 300-400 PNGs each, quite large at 1920x1080 - possibly 6-8 of these sequences need to be triggered and overlaid at the same time.
The playback is fine, the problem is loading all those images. So my approach so far has been to loop through all the image folders and load all the images at setup - this has been fine for a few hundred images, but bombs the app with any more.
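For a sense of scale, here's a rough back-of-envelope of the uncompressed RAM a sequence needs once decoded (the PNG size on disk doesn't matter - decoded frames are raw pixels; assuming 8-bit RGBA at 4 bytes per pixel):

```cpp
// Bytes needed to hold a decoded image sequence in RAM,
// assuming 8-bit RGBA (4 bytes per pixel). The compressed PNG
// size on disk is irrelevant once the frames are decoded.
long long sequenceBytes(long long width, long long height, long long frames) {
    return width * height * 4 * frames;
}
// sequenceBytes(1920, 1080, 400) -> ~3.3 GB for one sequence at full HD
// sequenceBytes(1280,  720, 400) -> ~1.5 GB for the same sequence at 720p
```

At those numbers, eight fully pre-loaded HD sequences would want on the order of 26 GB, which is why loading everything at setup falls over.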
The code for my loading is:
numFolders = folderDIR.listDir("images");
for (int f = 0; f < numFolders; f++) {
    // get current folder
    string currentFolder = folderDIR.getPath(f);
    // how many images are in this folder
    int numImagesInFolder = imageDIR.listDir(currentFolder);
    // for this folder, allocate an array of ofImage of length "numImagesInFolder"
    imageFolders[f] = new ofImage[numImagesInFolder];
    // we then loop through each image in this folder and load it
    for (int i = 0; i < numImagesInFolder; i++) {
        imageFolders[f][i].loadImage(imageDIR.getPath(i));
    }
}
I’m guessing I need to pre-load a percentage of each sequence and buffer the rest of the sequence throughout playback, deleting old images as I go so as not to eat up RAM - has anyone done this successfully, and if so, how well did it work?
I’d really appreciate hearing other people’s experiences dealing with this kind of thing, and whether it is even going to be possible to handle this many images at the same time.
Many thanks, looking forward to your replies
Hi there, did you ever get this to work the way you described? I’m trying to do something similar with ofxImageSequence but keep running into limitations of the hardware… I would be very interested in your findings.
Loading that many big images will definitely bomb! What you’re doing really sounds like the task of a video player - unless your sequences don’t run sequentially, in which case video players can have a lot of stutter when jumping around.
if you’re on mac the ofxQTKitVideoPlayer (https://github.com/obviousjim/FlightPhase-ofxAddons/tree/master/ofxQTKitVideoPlayer) can get up to 3 or 4 HD videos playing. 6-8 may be pushing it.
One thing I have been thinking is that ofxImageSequence could be extended to use the new ofxThreadedImageLoader so that it can buffer frames in advance of the playhead without causing a stutter – but this is practically like writing our own video codec! =)
It’s always a good question to ask if you really *need* high resolution. For a lot of content, it’s really hard to tell the difference between a 1280x720 image scaled up vs a full 1920x1080.
I don’t know if this is the right place for this, but I think I’ve found a small bug in your ofxQTKitVideoPlayer.
(What would be the right place to write this?)
In ofxQTKitVideoPlayer.mm, at lines 76 and 77, shouldn’t it be like this:
76 bool useTexture = (mode == OFXQTVIDEOPLAYER_MODE_TEXTURE_ONLY || mode == OFXQTVIDEOPLAYER_MODE_PIXELS_AND_TEXTURE);
77 bool usePixels = (mode == OFXQTVIDEOPLAYER_MODE_PIXELS_ONLY || mode == OFXQTVIDEOPLAYER_MODE_PIXELS_AND_TEXTURE);
instead of what’s there now:
76 bool useTexture = (mode == OFXQTVIDEOPLAYER_MODE_TEXTURE_ONLY || OFXQTVIDEOPLAYER_MODE_PIXELS_AND_TEXTURE);
77 bool usePixels = (mode == OFXQTVIDEOPLAYER_MODE_PIXELS_ONLY || OFXQTVIDEOPLAYER_MODE_PIXELS_AND_TEXTURE);
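For anyone reading along, the reason the original compiles but misbehaves: without the second `mode ==`, the right-hand side of the `||` is just a bare enum constant, and any nonzero constant is truthy, so the whole expression is always true. A minimal illustration (the enum here is a placeholder, not the addon’s actual values):

```cpp
enum Mode { PIXELS_ONLY, TEXTURE_ONLY, PIXELS_AND_TEXTURE };

// Buggy form: PIXELS_AND_TEXTURE is a nonzero constant, so the || is always true
bool useTextureBuggy(Mode mode) { return mode == TEXTURE_ONLY || PIXELS_AND_TEXTURE; }

// Corrected form: the mode is actually compared both times
bool useTextureFixed(Mode mode) { return mode == TEXTURE_ONLY || mode == PIXELS_AND_TEXTURE; }
```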
thanks for making this add-on, I’m using it heavily instead of my ofxAlphaVideoPlayer for displaying videos with alpha on OSX.
oh yeah, wow, that’s a mistake hah!
seems like a pretty big one, good catch!! i’ll update the git
I’m sorry but I think I found another one… :-s
in QTKitMovieRenderer.m on line 85 you had
NSMutableDictionary* movieAttributes = [NSDictionary dictionaryWithObjectsAndKeys:....
and that should be
NSMutableDictionary* movieAttributes = [NSMutableDictionary dictionaryWithObjectsAndKeys:
I’m just trying to help, as I said: I love the work you did on this one!
Hi James or Jim,
I was playing with your ofxQTKitVideoPlayer again, trying to get it to work with threaded loading (almost done) and something caught my eye in ofxQTKitVideoPlayer.mm
You have a lot of autoreleasePools in there and for some of them, I don’t understand why they are there…
Like for example:
void ofxQTKitVideoPlayer::setPosition(float pct) {
    if(moviePlayer == NULL) return;
    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];
    moviePlayer.position = pct;
    [pool release];
}
could you explain why they are there?
hey tim - any luck with threaded loading? are you allocating movies on the background thread? I poked at that a bit but it’s a bit dangerous.
I’ve done some large image sequence loading, though not that many sequences and not that long. I use an imageSequencePlayer class that I wrote, then extended it to subclass ofThread and added a circular queue. This works pretty well because the images are all loaded in a thread, and the circular queue keeps the buffer fresh and full at all times. There is currently no support for any kind of timing/framerate control - it pretty much loads the images as fast as possible - but that could easily be added. If anyone still needs something like this, let me know and I can clean it up a bit and post the code.
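To make the idea concrete, here’s a minimal sketch of that pattern in plain C++ - std::thread and a condition variable standing in for ofThread, and a string standing in for a decoded ofImage. All the names here are illustrative, not from my actual class:

```cpp
#include <condition_variable>
#include <deque>
#include <mutex>
#include <string>
#include <thread>
#include <vector>

// A loader thread keeps a bounded buffer of "decoded" frames ahead of the
// playhead; the consumer pops them in order and the producer refills the gap.
class SequenceBuffer {
public:
    SequenceBuffer(std::vector<std::string> framePaths, size_t capacity)
        : paths(std::move(framePaths)), capacity(capacity),
          loadIndex(0), running(true),
          loader(&SequenceBuffer::loadLoop, this) {}

    ~SequenceBuffer() {
        { std::lock_guard<std::mutex> lk(m); running = false; }
        notFull.notify_all();
        loader.join();
    }

    // Pop the next decoded frame; blocks until the loader has buffered one.
    std::string nextFrame() {
        std::unique_lock<std::mutex> lk(m);
        notEmpty.wait(lk, [&]{ return !queue.empty(); });
        std::string frame = queue.front();
        queue.pop_front();
        notFull.notify_one();
        return frame;
    }

private:
    void loadLoop() {
        while (true) {
            std::unique_lock<std::mutex> lk(m);
            // Sleep while the buffer is full; wake on consumption or shutdown
            notFull.wait(lk, [&]{ return !running || queue.size() < capacity; });
            if (!running) return;
            // Wrap around so the sequence loops like a video
            std::string path = paths[loadIndex++ % paths.size()];
            lk.unlock();
            // Decode outside the lock; stand-in for ofImage::loadImage(path)
            std::string frame = "decoded:" + path;
            lk.lock();
            queue.push_back(frame);
            notEmpty.notify_one();
        }
    }

    std::vector<std::string> paths;
    size_t capacity;
    size_t loadIndex;
    bool running;
    std::deque<std::string> queue;
    std::mutex m;
    std::condition_variable notEmpty, notFull;
    std::thread loader;  // declared last so all other members exist before it starts
};
```

The key design point is that the expensive decode happens with the lock released, so the playback thread only ever waits if it has completely outrun the buffer.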
re: tim K
sorry I missed your question from earlier this month!
the autorelease pools are there because of how Objective-C manages memory. There is no delete or free() when using Objective-C objects because they are reference counted - which means the runtime keeps track of whether anyone is still using the objects and gets rid of them if they aren’t needed.
In order to make that happen, and to avoid leaks in your program, you have to set up autorelease pools. The pattern is that before you use any Objective-C objects you alloc an autorelease pool; then when you are done you release it, and away it goes along with any leftover objects.
Normally in Obj-C there is a “main loop” which has an autorelease pool set up, so you don’t have to worry about doing it yourself. But since we are using Obj-C from the GLUT loop, we need to do it all ourselves.
Hope that makes a little bit of sense!
I know it’s been quite a while - but I (a beginner) have been looking into image sequencers to maximise image rendering efficiency. Wondering if you’ve still got the code?