FreeFrame for OF

Hi,

I came across the FreeFrame project and it looks perfect for a project we’re working on, so I am going to try to make it work with OF. It looks like it could be fairly simple (hopefully): basically just passing a char array in as an input texture, then reading the result back somehow.

So I downloaded the example Xcode project and tried adding the files to an empty OF project, and the first problem I hit is:

  
error: conflicting declaration 'typedef unsigned int GLhandleARB'  

I found the following in “FFGL.h”:

  
  
//on osx, <OpenGl/gl.h> auto-includes gl_ext.h for OpenGL extensions, which will interfere  
//with the FFGL SDK's own FFGLExtensions headers (included below). this #define disables  
//the auto-inclusion of gl_ext.h in OpenGl.h on OSX  
#define GL_GLEXT_LEGACY  
#include <OpenGL/gl.h>  
  

and in “gl.h”:

  
  
#ifndef GL_GLEXT_LEGACY  
#include <OpenGL/glext.h>  
#endif  
  

which makes sense, but it seems that “glext.h” is still being included somewhere, and that’s why I’m getting that conflicting declaration error.

Any ideas what’s going on?

Hopefully once I figure out all this precompiler/linking stuff, the rest will be straightforward :slight_smile:

cool! it’d be awesome to hook up ff.

I’d try two things:

a) throw “#define GL_GLEXT_LEGACY” in ofConstants and see if that helps.
b) alternatively, I think you will want to try to include “FFGL.h” before ofMain…

the problem is likely that (in terms of includes) you are hitting ofMain before ffgl.h. so the #define that is supposed to prevent the legacy stuff from loading isn’t defined yet, the legacy stuff loads, and by the time the compiler hits ffgl.h there are conflicts. does that make sense? do (a) or (b) above help?
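
roughly what i mean, as a sketch of the include order in something like testApp.h or main.cpp (just illustrative):

  
// what i suspect is happening now:  
#include "ofMain.h"   // pulls in the gl headers (and the legacy extension stuff) first  
#include "FFGL.h"     // defines GL_GLEXT_LEGACY, but gl.h has already been seen  
  
// what we want, option (b) (option (a) makes the define global instead):  
#include "FFGL.h"     // GL_GLEXT_LEGACY is defined before any gl header is included  
#include "ofMain.h"  
  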

take care!!
zach

hey zach,

definitely would be awesome, that’s why this is driving me crazy :slight_smile:

i had already tried both:
a) gives the same error
b) gives me a new error:

  
error: #error gl.h included before glee.h  

and the same error right below that

isn’t main.cpp the first file that gets compiled/looked at? i also removed the “-framework OpenGL” linker flag from the project settings, just to see what would happen, and it’s still finding all the GL stuff. this doesn’t really make sense to me…

thanks

oh - glee is super particular about being included in the right order. glee must come before opengl.
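
i.e. in any file that touches gl, the order needs to be roughly (sketch):

  
#include "GLee.h"          // GLee first, always  
#include <OpenGL/gl.h>     // fine after GLee (OF normally takes care of this order for you)  
  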

I think (a) is the solution that seems the most promising. sometimes to trace compile / include errors, you can add warnings or errors to the h file, like:

#warning "this messed up here"

or

#error "how did this ever get included?"

throwing these in will stop / notify during compile and help you get a sense of include order.

http://sunsite.ualberta.ca/Documentatio-…-pp-37.html

I think main.cpp is the first file that gets looked at, but every .cpp file gets compiled, and it’s hard to say in what order (it varies from compile to compile based on what was last touched, etc). every time a .cpp is compiled, it moves through the chain of .h files, ofMain.h -> ofConstants -> etc…

hope that helps! I’d throw an error in the header you don’t want included and try again for solution (a).
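
for example, something like this at the very top of the header you don’t want included (just a sketch, using glext.h as the suspect):

  
// temporary tripwire: stop the build the moment this header gets pulled in.  
// the compiler prints the chain of "In file included from ..." lines above  
// the error, which tells you exactly who included it.  
#error "glext.h got included, who pulled it in?"  
  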

take care!
zach

ok cool, i was actually wondering if there was a way to print traces while compiling.

so i put a #warning in “OpenGL/gl.h” right before “glext.h” gets loaded:

  
  
#ifndef GL_GLEXT_LEGACY  
#warning tricky  
#include <OpenGL/glext.h>  
#else  
#warning tricked me again  
#endif  
  

and it prints “tricked me again”, so i guess that means “glext.h” doesn’t get loaded here.

it’s also included from “OpenGL/gliDispatch.h”, but i put the same trace in there and got nothing, so i guess that file is never included.

so, at this point, it looks like “glext.h” is getting loaded somewhere secret. i did a search in the top OF directory (addons, apps, libs) and couldn’t find anything.

any ideas?

(btw the FFGL XCode project that comes with the download compiles fine. i copied the .cpp and .h files over to an empty openFrameworks project to get to where i am now)

thanks!

cool - glad that helped, warnings and errors are super useful for this stuff. h file stuff is voodoo, especially with order issues (like glee) and legacy issues.

if you put a warning / error in glext.h, does it show you anything about where it’s coming from?

ah good idea :slight_smile:

i just added a #warning at the top of the file and it doesn’t print! i checked the error msg again and it’s now coming from “GLee.h”, not “glext.h”. so i guess (a) worked after all

now GLee defines these two types as

  
  
typedef int GLhandleARB;  
typedef char GLcharARB;  
  

but FFGL defines them as

  
  
typedef char GLcharARB;  
typedef unsigned int GLhandleARB;  
  

i just commented both out and now have this at the top of my “main.cpp”

  
  
#define GL_GLEXT_LEGACY  
typedef char GLcharARB;  
typedef unsigned int GLhandleARB;  
  

that part is finally compiling but for some reason it can’t find “cv.h” now… anyways, i’ll look at it tomorrow morning because i don’t think i’ll get any further tonight.

thanks for the help!

alright, for some reason this is still giving me problems. now that i have the definitions at the top of “main.cpp”, i’m getting the following error a bunch of times in “FFGLExtensions.h”:

  
  
error: 'GLhandleARB' was not declared in this scope  
  

or

  
  
error: 'GLcharARB' has not been declared  
  

what is the scope of these typedefs at the top of “main.cpp”?

and if all else fails, can i just change the name “GLhandleARB” to “GLhandleARBFF” or something for all references in FFGL? Is “GLhandleARB” just an alias for an int?

thanks!

the problem is that defining something in main.cpp doesn’t mean it will get defined across other .cpp or .c files.

when you go to compile, each and every c++ file gets compiled if it’s been touched, and they have no relationship to each other. they might include the same h files (ofMain, ofConstants, etc), and in that way they all know what ofGetElapsedTimef() means, but they have no idea about each other. this is why we use .h files to define commonly used data types, etc.

  • zach

hey,

i put the following code in a file called “FFConstants.h”

  
  
#ifndef FF_CONSTANTS  
#define FF_CONSTANTS  
  
#define GL_GLEXT_LEGACY  
typedef char GLcharARB;  
typedef unsigned int GLhandleARB;  
  
#endif  
  

i added "#include “FFConstants.h” at the top of “GLee.h” and removed the type definitions for “GLhandleARB” and “GLcharARB” (at around line 720).

i did the same thing in “FFGLExtensions.h”, which is where the other definitions of “GLhandleARB” and “GLcharARB” are.

now when i compile, everything is cool, so i’m trying to get to the actual FreeFraming. more on that later :slight_smile:

but for now, i’ve messed with “GLee.h”, and that’s probably not a great idea. does anyone have any suggestions on how i can do this without having to go inside “GLee.h”?

hey!

so i finally got it to work, but there are a few OpenGL glitches that i don’t know how to deal with and could use some help fixing.

I have two versions of the “ofxFreeFrameHost” class; one of them is a standalone and the other extends ofTexture.

the standalone renders the FF plugins properly but draws the result in some funky shape and size:

the subclass version draws the result properly in the window, but doesn’t process the plugins:

there are three functions that differ between the two versions: “allocate(…)”, “loadData(…)”, and “draw(…)”.

it’s still kinda complicated to set up and test on a system (since you need to mess with GLee.h and install FFGLSDK-1.5), but here is the code for ofxFreeFrameHost.cpp

i’m sure it’s a simple bug, but i don’t know enough about opengl/textures/viewports to know where to look.
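
for reference, this is roughly how i’m driving the host from testApp (a trimmed sketch; the member names, sizes and the plugin path are just placeholders):

  
// testApp::setup()  
host.allocate(320, 240, GL_RGB);                      // sets up the input texture and viewport  
host.loadPlugin("plugins/SomeFFGLPlugin.bundle");     // hypothetical plugin path  
  
// testApp::update()  
grabber.grabFrame();  
host.loadData(grabber.getPixels(), 320, 240, GL_RGB); // upload the new frame  
host.process();                                       // run the plugin chain into the FBOs  
  
// testApp::draw()  
host.draw(0, 0, 320, 240);                            // draw the last FBO's texture  
  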

common stuff:

  
  
#include "ofxFreeFrame.h"  
  
ofxFreeFrameHost::ofxFreeFrameHost() {  
    inputImage = NULL;  
      
    // init gl extensions  
    glExtensions.Initialize();  
    if (glExtensions.EXT_framebuffer_object == 0) {  
        printf("ERROR: FBO support not detected, cannot continue!\n");  
        return;  
    }  
}  
  
void ofxFreeFrameHost::loadPlugin(const char* filename) {  
    // load plugin  
    FFGLPluginInstance* aPlugin = FFGLPluginInstance::New();  
    if (aPlugin->Load(filename) == FF_FAIL) {  
        printf("ERROR opening plugin %s\n", filename);          
    }  
      
    // allocate FBO  
    FFGLFBO anFbo;  
    if (!anFbo.Create(width, height, glExtensions)) {  
        printf("ERROR initializing FBO for plugin %s.\n", filename);  
    }  
      
    // instantiate GL for plugin  
    if (aPlugin->InstantiateGL(&fboViewport) != FF_SUCCESS) {  
        printf("ERROR instantiating GL for plugin %s.\n", filename);  
    }  
      
    // add the plugin and FBO to the list  
	plugins.push_back(aPlugin);  
    fbos.push_back(anFbo);  
}  
  
FFGLPluginInstance* ofxFreeFrameHost::getPlugin(int i) {  
    if (plugins.size() > i)  
        return plugins[i];  
    return NULL;  
}  
  
void ofxFreeFrameHost::process() {  
    float elapsedTime = ofGetElapsedTimef();  
  
    // go through all the plugins  
	for (int i=0; i < plugins.size(); i++) {  
		// activate the FBO as our render target  
        if (!fbos[i].BindAsRenderTarget(glExtensions)) {  
            printf("ERROR binding FBO as render target for plugin %d.\n", i);  
            continue;  
        }  
          
        //set the gl viewport to equal the size of the FBO  
        glViewport(fboViewport.x, fboViewport.y, fboViewport.width, fboViewport.height);  
          
        // make sure all the matrices are reset  
        glMatrixMode(GL_TEXTURE);  
        glLoadIdentity();  
        glMatrixMode(GL_PROJECTION);  
        glLoadIdentity();  
        glMatrixMode(GL_MODELVIEW);  
        glLoadIdentity();  
           
        // clear the depth and color buffers  
        glClearColor(0,0,0,0);  
        glClearDepth(1.0);  
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  
          
        // tell plugin about the current time  
        plugins[i]->SetTime(elapsedTime);  
          
        // prepare the structure used to call the plugin's ProcessOpenGL method  
        ProcessOpenGLStructTag processStruct;  
                  
        //create the array of FFGLTextureStruct* to be passed to the plugin  
        //(fboTexture lives outside the else branch so the pointer below stays valid)  
        FFGLTextureStruct fboTexture;  
        FFGLTextureStruct* inputTextures[1];  
        if (i == 0) {  
            // use the input texture  
            inputTextures[0] = &inputTexture;  
        } else {  
            // use the resulting texture of the previous operation  
            fboTexture = fbos[i-1].GetTextureInfo();  
            inputTextures[0] = &fboTexture;  
        }  
          
        // provide the 1 input texture we allocated above  
        processStruct.numInputTextures = 1;  
        processStruct.inputTextures = inputTextures;  
          
        // specify our FBO's handle in the ProcessOpenGLStructTag  
        processStruct.HostFBO = fbos[i].GetFBOHandle();  
          
        // call the plugin's ProcessOpenGL  
        if (plugins[i]->CallProcessOpenGL(processStruct) != FF_SUCCESS) {  
            // the plugin call failed  
            printf("ERROR running ProcessOpenGL for plugin %d.\n");  
            continue;  
        }  
          
        // deactivate rendering to the fbo  
        // (this re-activates rendering to the window)  
        fbos[i].UnbindAsRenderTarget(glExtensions);  
	}  
}  
  

standalone stuff (works but draws bad):

  
  
void ofxFreeFrameHost::allocate(int w, int h, int internalGlDataType) {  
    width = w;  
    height = h;  
      
    tex_t = 1.0f;  
    tex_u = 1.0f;  
          
    tex_w = w;  
	tex_h = h;  
      
    // allocate CPU memory (24-bit RGB)  
    inputImage = new unsigned char[w*h*3];  
    if (inputImage == NULL)  
        return;  
      
    // allocate the GPU texture  
      
    // find the smallest power of two size that can contain the texture    
    int glTextureWidth = 1;  
    while (glTextureWidth < w)   
        glTextureWidth *= 2;  
      
    int glTextureHeight = 1;  
    while (glTextureHeight < h)   
        glTextureHeight *= 2;  
          
    tex_w = glTextureWidth;  
    tex_h = glTextureHeight;  
      
    // create and setup the GL texture  
    GLuint glTextureHandle = 0;  
    glGenTextures(1, &glTextureHandle);  
      
    // bind this new texture so that glTex* calls apply to it  
    glBindTexture(GL_TEXTURE_2D, glTextureHandle);  
      
    // use bilinear interpolation when the texture is scaled larger than its true size  
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);  
      
    // don't use mipmapping when the texture is scaled smaller than its true size  
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);  
      
    // don't wrap when texture coordinates reference outside the bounds of the texture  
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);  
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);  
      
    // allocate room for the GL texture  
    // (this doesn't upload any pixels; the final NULL is where a pointer to pixel data would otherwise go)  
    glTexImage2D(GL_TEXTURE_2D,  
                 0, 3, // we are using a 24-bit image, which has 3 bytes per pixel  
                 glTextureWidth,  
                 glTextureHeight,  
                 0, GL_RGB,  
                 GL_UNSIGNED_BYTE,  
                 NULL);  
      
    // unbind the texture  
    glBindTexture(GL_TEXTURE_2D, 0);  
      
    // init the inputTexture struct  
    FFGLTextureStruct &t = inputTexture;  
    t.Handle = glTextureHandle;  
    t.Width = w;  
    t.Height = h;    
    t.HardwareWidth = glTextureWidth;  
    t.HardwareHeight = glTextureHeight;  
      
    if (inputTexture.Handle==0 || inputImage==NULL) {  
        FFDebugMessage("Texture allocation failed");  
    }  
    
    //instantiate the first plugin passing a viewport that matches  
    //the FBO (plugin1 is rendered into our FBO)  
    fboViewport.x = 0;  
    fboViewport.y = 0;  
    fboViewport.width = w;  
    fboViewport.height = h;  
}  
  
void ofxFreeFrameHost::loadData(unsigned char* data, int w, int h, int glDataType) {  
  inputImage = data;  
    
  //bind the gl texture so we can upload the next video frame  
	glBindTexture(GL_TEXTURE_2D, inputTexture.Handle);  
  
	//upload it to the gl texture. use subimage because  
	//the video frame size is probably smaller than the  
	//size of the texture on the gpu hardware  
	glTexSubImage2D(GL_TEXTURE_2D,  
                    0,  
                    0, 0,  
                    inputTexture.Width,  
                    inputTexture.Height,  
                    GL_RGB,  
                    GL_UNSIGNED_BYTE,  
                    inputImage);  
      
    //glTexImage2D(GL_TEXTURE_2D,  
    //             0, 3, // we are using a 24-bit image, which has 3 bytes per pixel  
    //             inputTexture.Width,  
    //                inputTexture.Height,  
    //             0, GL_RGB,  
    //             GL_UNSIGNED_BYTE,  
    //             NULL);  
  
	//unbind the gl texture  
	glBindTexture(GL_TEXTURE_2D, 0);  
}  
  
void ofxFreeFrameHost::draw(float x, float y, float w, float h) {  
    FFGLTextureStruct fboTexture = fbos[fbos.size()-1].GetTextureInfo();  
  
    //reset all matrices  
	glMatrixMode(GL_TEXTURE);  
	glLoadIdentity();  
	glMatrixMode(GL_PROJECTION);  
	glLoadIdentity();  
	glMatrixMode(GL_MODELVIEW);  
	glLoadIdentity();  
  
	//clear the color and depth buffers  
	glClearColor(0,0,0,0);  
	glClearDepth(1.0);  
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  
      
    //bind the gl texture so we can upload the next video frame  
	//glBindTexture(GL_TEXTURE_2D, fboTexture.Handle);  
  
    glEnable(GL_TEXTURE_2D);  
    glActiveTexture(GL_TEXTURE0);  
    glBindTexture(GL_TEXTURE_2D, fboTexture.Handle);  
  
    GLint px0 = 0;    // up to you to get the aspect ratio right  
    GLint py0 = 0;  
    GLint px1 = (GLint)w;  
    GLint py1 = (GLint)h;  
  
    bool bFlipTexture = false;  
    if (bFlipTexture){  
        GLint temp = py0;  
        py0 = py1;  
        py1 = temp;  
    }  
  
    // for rect mode center, let's do this:  
    bool bRectmodeCenter = false;  
    if (bRectmodeCenter){  
        px0 = (GLint)-w/2;  
        py0 = (GLint)-h/2;  
        px1 = (GLint)+w/2;  
        py1 = (GLint)+h/2;  
    }  
  
    // -------------------------------------------------  
    // complete hack to remove border artifacts.  
    // slightly, slightly alters an image, scaling...  
    // to remove the border.  
    // we need a better solution for this, but  
    // to constantly add a 2 pixel border on all uploaded images  
    // is insane..  
  
    GLfloat offsetw = 0;  
    GLfloat offseth = 0;  
  
    //if (textureTarget == GL_TEXTURE_2D){  
        offsetw = 1.0f/(w);  
        offseth = 1.0f/(h);  
    //}  
    // -------------------------------------------------  
  
    //compute new tex co-ords based on the ratio of data's w, h to texture w,h;  
    int tex_t = w;  
    int tex_u = h;  
  
    GLfloat tx0 = 0+offsetw;  
    GLfloat ty0 = 0+offseth;  
    GLfloat tx1 = tex_t - offsetw;  
    GLfloat ty1 = tex_u - offseth;  
  
    glPushMatrix();  
    glTranslated(x, y, 0);  
    glBegin( GL_QUADS );  
        glTexCoord2f(tx0,ty0);    glVertex3i(px0, -py0,0);  
        glTexCoord2f(tx1,ty0);    glVertex3i(px1, -py0,0);  
        glTexCoord2f(tx1,ty1);    glVertex3i(px1, -py1,0);  
        glTexCoord2f(tx0,ty1);    glVertex3i(px0, -py1,0);  
    glEnd();  
    glPopMatrix();  
  
    glBindTexture(GL_TEXTURE_2D, 0);  
}  
  

subclass stuff (doesn’t work but draws nice):

  
  
//----------------------------------------------------------  
void ofxFreeFrameHost::allocate(int w, int h, int internalGlDataType) {  
    // allocate the GPU texture  
    ofTexture::allocate(w, h, internalGlDataType);  
          
    // allocate CPU memory  
    if (internalGlDataType == GL_LUMINANCE) {  
        // 8-bit A  
        inputImage = new unsigned char[w*h];  
    } else if (internalGlDataType == GL_RGB) {  
        // 24-bit RGB  
        inputImage = new unsigned char[w*h*3];  
    } else if (internalGlDataType == GL_RGBA) {  
        // 32-bit RGBA  
        inputImage = new unsigned char[w*h*4];  
    } else {  
        printf("ERROR: GL data type not recognized.\n");  
        return;  
    }  
      
    // init the inputTexture struct  
    FFGLTextureStruct &t = inputTexture;  
    t.Handle = (GLuint)textureName[0];  
    t.Width = w;  
    t.Height = h;    
    t.HardwareWidth = tex_w;  
    t.HardwareHeight = tex_h;  
      
    // init the fboViewport struct  
    fboViewport.x = 0;  
    fboViewport.y = 0;  
    fboViewport.width = width;  
    fboViewport.height = height;  
}  
  
//----------------------------------------------------------  
void ofxFreeFrameHost::loadData(unsigned char* data, int w, int h, int glDataType) {  
    // upload data to the GPU texture  
    ofTexture::loadData(data, w, h, glDataType);  
      
    // make sure we know the pixel format  
    if (glDataType != GL_LUMINANCE && glDataType != GL_RGB && glDataType != GL_RGBA) {  
        printf("ERROR: GL data type not recognized.\n");  
        return;  
    }  
      
    // keep a pointer to the caller's pixel buffer (no copy is made)  
    inputImage = data;  
}  
  
//----------------------------------------------------------  
void ofxFreeFrameHost::draw(float x, float y, float w, float h) {  
    FFGLTextureStruct fboTexture = fbos[fbos.size()-1].GetTextureInfo();  
	textureName[0] = fboTexture.Handle;  
    ofTexture::draw(x, y, w, h);  
}  
  

thanks for the help!