Creating a buffer on the GPU stops OF from drawing

I’m on OS X, so OpenGL 2.0.
I want to create a buffer on my GPU and use it as a custom shader attribute.
I could use ofShader::setAttribute(), but then I would have to fetch an array stored on the CPU every time I want to draw. I want to directly use the ID of the data already stored on the GPU.

The problem is, when I use:

     glGenBuffers(1, &vertArrayID);
     glBindBuffer(GL_ARRAY_BUFFER, vertArrayID);
     glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*9, vertArray, GL_STATIC_DRAW);

the following lines don’t draw anything, even though I’m not using the created buffer:

    void ofApp::draw() {
        img.draw(0, 0, 500, 500); // nothing happens
    }

And with the following lines, my shape is drawn, but not the image that I want to draw at the end with the OF default shader:

    void ofApp::draw() {
        shader.begin();
        glBindBuffer(GL_ARRAY_BUFFER, vertArrayID);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        shader.end(); // I get a triangle
        img.draw(0, 0, 500, 500); // but my image does not appear
    }

Any help or explanation?

In my case I want to interpolate between the positions of two states in a vertex shader, so I need two position arrays as inputs to my vertex shader. That’s why I’m not using the default position attribute of ofMesh or ofVbo.
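Concretely, the vertex shader I have in mind would look something like this (a rough sketch; the attribute and uniform names are placeholders I made up):

```cpp
// Rough sketch of the interpolating vertex shader (GLSL 120, to match
// the GL2 context); "positionA", "positionB" and "t" are placeholder names.
const std::string vertSrc = R"(
    #version 120
    attribute vec3 positionA; // state 1, from its own VBO
    attribute vec3 positionB; // state 2, from a second VBO
    uniform float t;          // 0..1 blend factor
    void main() {
        vec3 p = mix(positionA, positionB, t);
        gl_Position = gl_ModelViewProjectionMatrix * vec4(p, 1.0);
    }
)";
```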


Draw the image inside shader.begin() and shader.end().

That doesn’t work either, and in my case I want to draw something else outside the custom shader anyway. The problem really comes from the generation of the buffers.

After many tests, I noticed that everything is fine as long as I don’t call

 glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*9, vertArray, GL_STATIC_DRAW);
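One likely explanation (an assumption based on a classic GL2 pitfall, not verified against the OF source here): after this call the VBO is still bound to GL_ARRAY_BUFFER, so OF’s own vertex submission, which passes CPU pointers to calls like glVertexPointer and glTexCoordPointer, has those pointers reinterpreted as offsets into the bound buffer, and nothing sensible gets drawn. Unbinding right after the upload should leave OF’s drawing untouched:

```cpp
// Sketch: upload once (e.g. in setup()), then unbind so OF's own
// drawing is unaffected. Assumes vertArray and vertArrayID are the
// members from the snippets above.
glGenBuffers(1, &vertArrayID);
glBindBuffer(GL_ARRAY_BUFFER, vertArrayID);
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * 9, vertArray, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0); // back to "no buffer bound"
```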

But how can I use an array of values stored on the graphics card to fill the attributes of a shader, knowing that those values are not built-in parameters such as position, color, texCoords, normals, etc.?
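One way to do this (a sketch I haven’t tested in OF, assuming the shader is linked with an attribute named `positionA`): bind the buffer at draw time, point the attribute at it with glVertexAttribPointer, then unbind before handing control back to OF. With a VBO bound, the last argument of glVertexAttribPointer is an offset into the buffer, not a CPU pointer:

```cpp
// Sketch: draw using a custom attribute sourced from the VBO.
// "positionA" is a placeholder attribute name.
shader.begin();
GLint loc = glGetAttribLocation(shader.getProgram(), "positionA");
glBindBuffer(GL_ARRAY_BUFFER, vertArrayID);
glEnableVertexAttribArray(loc);
// With a VBO bound, the pointer argument is a byte offset (0 here).
glVertexAttribPointer(loc, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableVertexAttribArray(loc);
glBindBuffer(GL_ARRAY_BUFFER, 0); // leave OF with no buffer bound
shader.end();
```

Alternatively, ofVbo has a setAttributeData() method that manages custom attributes for you; check the OF documentation for the exact signature in your version.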