First time diving into OF on iOS… exciting! As a first run I'm trying to port an app I made before into an iOS app. It's a pretty simple Rutt-Etra-like effect on video coming in from the camera. I have it working as a Mac app, but I can't seem to get it displaying properly on my iPhone. The mesh is drawing, but I don't think I'm getting pixel values from my camera into vidPixels in order to change the color of my mesh. I'm basing this off of the videoGrabberExample in OF iOS 0072. I'm on a MacBook Pro, 10.7.5, running Xcode 4.5.2.
Can anyone give this a look and let me know if I’m doing something wrong? Thanks so much in advance.
Code:
testApp.cpp
#include "testApp.h"
#include "ofGLUtils.h"
#include "ofGLRenderer.h"
//--------------------------------------------------------------
void testApp::setup(){
ofxiPhoneSetOrientation(OFXIPHONE_ORIENTATION_LANDSCAPE_RIGHT);
ofSetFrameRate(30);
grabber.initGrabber(480, 360);
yStep = 5;
xStep = 5;
// drawRuttEtra = false;
ofBackground(0, 0, 0);
}
//--------------------------------------------------------------
void testApp::update(){
    //ofBackground(255,255,255);
    grabber.update();
    if(grabber.isFrameNew()){
        vidPixels = grabber.getPixelsRef();
    }
}
//--------------------------------------------------------------
void testApp::draw(){
    glEnable(GL_DEPTH_TEST);

    ofMesh mesh;
    mesh.setMode(OF_PRIMITIVE_LINE_STRIP);
    ofNoFill();

    // walk the frame in horizontal scanlines, alternating direction
    // so the line strip zig-zags instead of jumping back each row
    int rowCount = 0;
    for (int y = 0; y < grabber.height; y += yStep){
        if (rowCount % 2 == 0) {
            for (int x = 0; x < grabber.width; x += xStep){
                ofColor curColor = vidPixels.getColor(x, y);
                mesh.addColor(curColor);
                // displace each vertex in z by the pixel's brightness
                mesh.addVertex(ofVec3f(x, y, curColor.getBrightness() * 0.3));
            }
        } else {
            for (int x = grabber.width - 1; x >= 0; x -= xStep){
                ofColor curColor = vidPixels.getColor(x, y);
                mesh.addColor(curColor);
                mesh.addVertex(ofVec3f(x, y, curColor.getBrightness() * 0.3));
            }
        }
        rowCount++;
    }
    mesh.draw();
    // grabber.draw(0,0);
}
I got it to work! Realizing that getPixelsRef() might not work on iOS, I changed some of the code around to work with getPixels(), basing it off of James George's meshFromCamera example. The camera seems a bit off, though, as if it's not drawing exactly what's in front of it; the image sits more to the left, in a skewed position or weirdly zoomed in. It's a bit dark in my apartment, so I can't tell how well it works or how to optimize it. But it works! Any suggestions on how to make this better would be appreciated. I'll start tightening up the code as well.
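For reference, the getPixels() draw loop presumably ends up along these lines (a reconstruction, not the poster's exact code; the src pointer and pr index are assumed from the replies below). Note that it indexes the pixel array as if there were one byte per pixel, which, as worked out below, is what produces the skewed, zoomed-in image:

unsigned char* src = grabber.getPixels();   // flat RGB byte array from the camera
ofMesh mesh;
mesh.setMode(OF_PRIMITIVE_LINE_STRIP);
for (int y = 0; y < grabber.height; y += yStep){
    for (int x = 0; x < grabber.width; x += xStep){
        int pr = x + (y * grabber.getWidth()); // wrong: ignores the 3 bytes per pixel
        ofColor curColor(src[pr], src[pr+1], src[pr+2]);
        mesh.addColor(curColor);
        mesh.addVertex(ofVec3f(x, y, curColor.getBrightness() * 0.3));
    }
}
mesh.draw();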
I now think the problem is that the mesh is only drawing a zoomed-in portion of the videoGrabber, particularly the left-hand quadrant, I believe. I tried adding ofScale and then ofTranslate in the draw function to solve this, but it doesn't seem to help. Any ideas?
You have to multiply grabber.width by 3 inside your x and y loops, because getPixels() returns a flat list of 3 values (R, G, B) per pixel, so every row of your input is in fact 3 times your width:
unsigned char* src = grabber.getPixels();
// step in multiples of 3 so each index lands on the first (R) byte of a pixel
for (int y = 0; y < grabber.height*3; y += yStep*3){
    for (int x = 0; x < grabber.width*3; x += xStep*3){
        int pr = x + (y * grabber.getWidth());
        ofColor curColor(src[pr], src[pr+1], src[pr+2]);
        mesh.addColor(curColor);
        mesh.addVertex(ofVec3f(x, y, curColor.getBrightness() * 0.3));
    }
}
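One caveat with that indexing scheme: since x and y now run in byte space (three times the pixel dimensions), the vertices also come out three times larger than the camera frame. A minimal alternative sketch that keeps the loop in pixel coordinates and scales only the array index instead:

unsigned char* src = grabber.getPixels();
for (int y = 0; y < grabber.height; y += yStep){
    for (int x = 0; x < grabber.width; x += xStep){
        // convert the (x, y) pixel coordinate to a byte offset: 3 bytes per pixel
        int pr = (y * grabber.width + x) * 3;
        ofColor curColor(src[pr], src[pr+1], src[pr+2]);
        mesh.addColor(curColor);
        // vertices stay in pixel space, so the mesh matches the frame size
        mesh.addVertex(ofVec3f(x, y, curColor.getBrightness() * 0.3));
    }
}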
Hi,
I'm a newbie with OF and I'm trying to use this example on iOS. It works perfectly on iPhone, but when I run it on Apple TV it stays in portrait, even though I use landscape orientation.
Orientation is fine on iPhone but not on Apple TV.
Any thoughts?