I have two live video streams going to an Oculus Rift, but my current approach introduces serious lag (about half a second) and too low a framerate (about 25 fps; I need as close to 60 as possible). I'm on OS X 10.10, running oF 0.8.4, with two FireWire cameras for the time being (they'll be USB 3.0 in the future).
Here’s my video pipeline per eye:
In setup():
ofSetVerticalSync(false);
ofSetFrameRate(60);
videoCamWidth = 1280;
videoCamHeight = 1024;
riftEyeWidth = 960;
riftEyeHeight = 1080;
leftEyeVideo.setDeviceID(1);
leftEyeVideo.setDesiredFrameRate(60);
leftEyeVideo.initGrabber(videoCamWidth, videoCamHeight);
leftEyeImage.allocate(riftEyeWidth, riftEyeHeight, OF_IMAGE_COLOR);
In update():
leftEyeVideo.update();
if (leftEyeVideo.isFrameNew()) {
    leftEyePixels = leftEyeVideo.getPixelsRef();
    // cropping to adjust aspect ratio; avoid this if at all possible, it's killing the framerate.
    // Input: (Wi, Hi); Output: (Wo, Ho);
    // note the float casts: with plain ints, 960 / 1080 truncates to 0 and the
    // comparison always takes the wrong branch
    if ((float)riftEyeWidth / riftEyeHeight > (float)videoCamWidth / videoCamHeight) {
        leftEyePixels.crop(0, 0, videoCamWidth * riftEyeHeight / videoCamHeight, riftEyeHeight);
    } else {
        leftEyePixels.crop(0, 0, riftEyeWidth, videoCamHeight * riftEyeWidth / videoCamWidth);
    }
    leftEyePixels.rotate90(-1);
    leftEyePixels.resize(riftEyeWidth, riftEyeHeight);
    leftEyeImage.setFromPixels(leftEyePixels);
}
In draw():
leftEyeImage.draw(0, 0);
So essentially: grab a frame, copy it into an ofPixels object, crop to fix the aspect ratio, rotate, scale, copy it into an ofImage, and draw. All of this happens per eye, on the CPU, every frame.
What would be the best way to optimize this pipeline, keeping in mind I may need to use a shader on the videos as well?