Video grabber + ofMesh

Heya-

First time diving into OF on iOS… exciting! As a first run I’m trying to port an app I made before into an iOS app. It’s a pretty simple Rutt-Etra-like effect on video coming in from the camera. I have it working as a Mac app, but I can’t seem to get it displaying properly on my iPhone. The mesh is drawing, but I don’t think I’m getting pixel values from my camera into vidPixels in order to change the color of my mesh. I’m basing this off of the videoGrabberExample in OF iOS 0072. I’m on a MacBook Pro, 10.7.5, running Xcode 4.5.2.

Can anyone give this a look and let me know if I’m doing something wrong? :slight_smile: Thanks so much in advance.

Code:

testApp.cpp

  
  
#include "testApp.h"  
#include "ofGLUtils.h"  
#include "ofGLRenderer.h"  
  
//--------------------------------------------------------------  
void testApp::setup(){	  
	ofxiPhoneSetOrientation(OFXIPHONE_ORIENTATION_LANDSCAPE_RIGHT);  
  
	ofSetFrameRate(30);  
  
	grabber.initGrabber(480, 360);  
	  
    yStep = 5;  
    xStep = 5;  
      
  //  drawRuttEtra = false;  
      
    ofBackground(0, 0, 0);  
  
}  
  
//--------------------------------------------------------------  
void testApp::update(){  
	//ofBackground(255,255,255);  
	  
	grabber.update();  
	  
    if(grabber.isFrameNew()){  
        vidPixels = grabber.getPixelsRef();  
    }  
      
}  
  
//--------------------------------------------------------------  
void testApp::draw(){	  
  
    glEnable(GL_DEPTH_TEST);  
     
        ofMesh mesh;  
          
        int rowCount = 0;  
        for (int y = 0; y<grabber.height; y+=yStep){  
            ofNoFill();  
            mesh.setMode(OF_PRIMITIVE_LINE_STRIP);  
              
            if (rowCount % 2 == 0) {  
                for (int x = 0; x < grabber.width; x += xStep){  
                    ofColor curColor = vidPixels.getColor(x, y);  
                    mesh.addColor(ofColor(curColor));  
                    mesh.addVertex(ofVec3f(x,y, curColor.getBrightness() * 0.3));  
                }  
            } else {  
                for (int x = grabber.width-1; x >= 0; x -= xStep){  
                    ofColor curColor = vidPixels.getColor(x, y);  
                    mesh.addColor(ofColor(curColor));  
                    mesh.addVertex(ofVec3f(x,y, curColor.getBrightness() * 0.3));  
                }  
            }  
            rowCount++;  
        }  
        mesh.draw();  
       // grabber.draw(0,0);  
}  
  

testApp.h

  
  
#pragma once  
  
#include "ofMain.h"  
#include "ofxiPhone.h"  
#include "ofxiPhoneExtras.h"  
  
class testApp : public ofxiPhoneApp{  
	  
	public:  
        void setup();  
        void update();  
        void draw();  
        void exit();  
      
        void touchDown(ofTouchEventArgs & touch);  
        void touchMoved(ofTouchEventArgs & touch);  
        void touchUp(ofTouchEventArgs & touch);  
        void touchDoubleTap(ofTouchEventArgs & touch);  
        void touchCancelled(ofTouchEventArgs & touch);  
	  
        void lostFocus();  
        void gotFocus();  
        void gotMemoryWarning();  
        void deviceOrientationChanged(int newOrientation);  
		  
		ofVideoGrabber grabber;  
		ofTexture tex;  
		unsigned char * pix;  
      
        //rutt etra effect  
      
        int yStep;  
        int xStep;  
        bool drawRuttEtra;  
      
        ofPixels vidPixels;  
  
};  
  
  

After a little debugging, I’m back with some more clues…

I wanted to make sure isFrameNew() works. Trying

  
 if(grabber.isFrameNew()){  
     cout << "i'm grabbing new pixels!" << endl;  
     vidPixels = grabber.getPixelsRef();  
 }  

Prints “i’m grabbing new pixels!”, so that if statement is executing.

In my double for loop, if I cout the value of vidPixels.getColor(x,y)

  
cout<<vidPixels.getColor(x,y) << endl;  

I get all 255s… which makes me think grabber.getPixelsRef() isn’t working how I thought it should.

Any ideas?
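A side note on what the all-255 readings can tell you: getColor() is just index arithmetic over the grabber’s byte buffer, so uniform values usually mean the buffer itself is uniform (e.g. allocated but never filled by the iOS grabber), not that the lookup is broken. A minimal mimic in plain C++, where `colorAt` is a hypothetical stand-in for what ofPixels::getColor does internally on an RGB buffer:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical stand-in for ofPixels::getColor on an interleaved RGB
// buffer: each pixel occupies 3 consecutive bytes (R, G, B).
struct RGB { uint8_t r, g, b; };

RGB colorAt(const std::vector<uint8_t>& buf, int width, int x, int y) {
    int i = (y * width + x) * 3;          // byte offset of pixel (x, y)
    return { buf[i], buf[i + 1], buf[i + 2] };
}
```

If a frame is allocated but never written (say, all bytes 255), `colorAt` returns 255 at every coordinate, which matches the output above; with real data it returns varying values.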

I got it to work! Realizing that getPixelsRef() might not work on iOS, I changed some of the code around to work with getPixels(), basing it off of James George’s meshFromCamera example. The camera seems a bit off, as in it’s not drawing exactly what’s in front of it; it’s shifted to the left, in a skewed position or weirdly zoomed in. It’s a bit dark in my apartment, so I can’t tell how well it works or how to optimize it. But it works! Any suggestions on how to make this better would be appreciated. I’ll start tightening up the code as well :slight_smile:

  
#include "testApp.h"  
#include "ofGLUtils.h"  
#include "ofGLRenderer.h"  
  
//--------------------------------------------------------------  
void testApp::setup(){	  
	ofxiPhoneSetOrientation(OFXIPHONE_ORIENTATION_LANDSCAPE_RIGHT);  
  
	ofSetFrameRate(30);  
  
	grabber.initGrabber(480, 360, OF_PIXELS_RGB);  
	  
    yStep = 3;  
    xStep = 3;  
      
  //  drawRuttEtra = false;  
      
    ofBackground(0, 0, 0);  
  
}  
  
//--------------------------------------------------------------  
void testApp::update(){  
	//ofBackground(255,255,255);  
	  
	grabber.update();  
	  
    if(grabber.isFrameNew()){  
        vidPixels = grabber.getPixelsRef();  
    }  
      
}  
  
//--------------------------------------------------------------  
void testApp::draw(){	  
  
    glEnable(GL_DEPTH_TEST);  
         
        int rowCount = 0;  
        for (int y = 0; y<grabber.height; y+=yStep){  
            ofNoFill();  
            mesh.setMode(OF_PRIMITIVE_LINE_STRIP);  
              
            if (rowCount % 2 == 0) {  
                for (int x = 0; x < grabber.width; x += xStep){  
              //  cout<<vidPixels.getColor(x,y) << endl;  
  
                    int i = x + (y * grabber.width);  
                    ofColor curColor(grabber.getPixels()[i],  
                                     grabber.getPixels()[i + 1],  
                                     grabber.getPixels()[i + 2]);  
                    mesh.addColor(ofColor(curColor));  
                    mesh.addVertex(ofVec3f(x,y, curColor.getBrightness() * 0.3));  
                }  
            } else {  
                for (int x = grabber.width-1; x >= 0; x -= xStep){  
                    int i = x + (y * grabber.width);  
                    ofColor curColor(grabber.getPixels()[i],  
                                     grabber.getPixels()[i + 1],  
                                     grabber.getPixels()[i + 2]);  
                    mesh.addColor(ofColor(curColor));  
                    mesh.addVertex(ofVec3f(x,y, curColor.getBrightness() * 0.3));  
                }  
            }  
            rowCount++;  
        }  
        mesh.draw();  
    mesh.clear();  
 //       grabber.draw(0,0, grabber.getWidth()/4, grabber.getHeight()/4);  
}  
  
//--------------------------------------------------------------  
void testApp::exit(){  
      
}  
  
//--------------------------------------------------------------  
void testApp::touchDown(ofTouchEventArgs & touch){  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchMoved(ofTouchEventArgs & touch){  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchUp(ofTouchEventArgs & touch){  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchDoubleTap(ofTouchEventArgs & touch){  
  
}  
  
//--------------------------------------------------------------  
void testApp::touchCancelled(ofTouchEventArgs & touch){  
  
}  
  
//--------------------------------------------------------------  
void testApp::lostFocus(){  
      
}  
  
//--------------------------------------------------------------  
void testApp::gotFocus(){  
      
}  
  
//--------------------------------------------------------------  
void testApp::gotMemoryWarning(){  
      
}  
  
//--------------------------------------------------------------  
void testApp::deviceOrientationChanged(int newOrientation){  
      
}  
  
  
  

I now think the problem is that the mesh is only drawing a zoomed-in portion of the videoGrabber, particularly the left-hand part, I believe. I tried adding ofScale, then ofTranslate, in the draw function to solve this, but it doesn’t seem to help. Any ideas?

With ofScale I tried:

  
ofScale((ofGetWidth()/grabber.getWidth())/2, (ofGetHeight()/grabber.getHeight())/2);  
  

With ofTranslate I tried:

  
ofTranslate(grabber.getWidth()/2, grabber.getHeight()/2);  
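One thing worth checking: this may not be a transform problem at all. If the pixel index in draw() is computed without the 3-bytes-per-pixel stride (x + y*width instead of (y*width + x)*3), the loops can only ever reach about the first third of the frame’s bytes, which would show up as exactly this kind of zoomed-in, top-left crop that no ofScale/ofTranslate can undo. A quick arithmetic check in plain C++ (`maxByteIndexWithoutStride` is a hypothetical helper, not OF API):

```cpp
// An interleaved RGB frame of width*height pixels holds width*height*3
// bytes. Indexing by x + y*width (no 3-byte stride) caps the largest
// reachable byte index at roughly a third of the buffer, so the mesh
// samples only part of the frame, however it is scaled afterwards.
int maxByteIndexWithoutStride(int width, int height) {
    // largest index the un-strided loops can produce (+2 for the blue byte)
    return (height - 1) * width + (width - 1) + 2;
}
```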

you have to multiply by 3 when computing the byte index, because getPixels() returns an interleaved list with three values (R, G, B) per pixel, so every row of your input is in fact 3 times your width in bytes. The loop bounds stay at grabber.width and grabber.height; only the index gets the ×3:

  
  
for (int y = 0; y < grabber.height; y += yStep){  
          
    for (int x = 0; x < grabber.width; x += xStep){  
              
        // 3 bytes per pixel: byte offset of pixel (x, y)  
        int pr = (y * grabber.getWidth() + x) * 3;  
  
        ofColor curColor(src[pr], src[pr+1], src[pr+2]);  
              
        mesh.addColor(ofColor(curColor));  
        mesh.addVertex(ofVec3f(x, y, curColor.getBrightness() * 0.3));  
    }  
          
}  
  
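The byte-offset math (three bytes per pixel) can be checked in isolation with a tiny synthetic frame; `rgbOffset` below is a hypothetical helper, not OF API:

```cpp
// Hypothetical helper: byte offset of pixel (x, y) in an interleaved
// RGB buffer, where each pixel occupies 3 consecutive bytes.
int rgbOffset(int x, int y, int width) {
    return (y * width + x) * 3;
}
```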
  

Hi,
I’m a newbie with OF and I’m trying to use this example on iOS. It works perfectly on iPhone, but when I deploy it to Apple TV it stays in portrait even if I use landscape orientation.
Orientation is OK on iPhone but not on Apple TV.
Any thoughts?

Thnx
SLip