Hi, I am trying to have an ofMesh of particles update according to various dynamical systems, such as the chaotic Lorenz system. To do this I need to be able to update a particle’s position based on its current position. Here is my ofApp.cpp:
void ofApp::setup(){
    shader.load("shadersGL3\\shader");
    glEnable(GL_DEPTH_TEST);
    mesh.setMode(OF_PRIMITIVE_POINTS);
    mesh.enableColors();
    for (int y = ofGetHeight()/4; y < ofGetHeight()*.75; y++) {
        for (int x = ofGetWidth()/4; x < ofGetWidth()*.75; x++) {
            mesh.addVertex(ofVec3f(x, y, 0)); // make a new vertex
            mesh.addColor(ofColor(254, 0, 0)); // add a color at that vertex
        }
    }
}
void ofApp::update(){
}

void ofApp::draw(){
    shader.begin();
    mesh.draw();
    shader.end();
}
What I expect here is for the mesh to continuously move to the right. Instead, it just starts out shifted to the right and stays stuck there. So it seems like position just takes the initial position of the mesh, rather than updating it every iteration. Am I supposed to be using the update() function here? Would I do this by loading the vertices to a buffer and then binding that to the GPU? Just want some guidance before I proceed. Thanks!
Did you update main.cpp as well? You need to set the GL version for your app there: settings.setGLVersion(3, 2);
You can check the examples folder. Go to “your_openframeworks_folder/examples/shader/02_simpleVertexDisplacement”.
If you want to update the vertex positions, you need to pass a value that changes over time to your vertex shader, such as ofGetElapsedTimef(). You can do that in your draw() function, so the vertex shader receives a value that changes every frame.
void ofApp::draw(){
    shader.begin();
    shader.setUniform1f("time", ofGetElapsedTimef()); // Here you pass the dynamic variable
    mesh.draw();
    shader.end();
}
Also, you need to update the vertex shader file as follows:
#version 150

// these are for the programmable pipeline system
uniform mat4 modelViewProjectionMatrix;
in vec4 position;

// the time value is passed into the shader by the OF app.
uniform float time;

void main()
{
    vec4 modifiedPosition = modelViewProjectionMatrix * position;
    float xMult = 100.0; // increase the value to make vertices move faster and vice versa...
    modifiedPosition.x += time * xMult;
    gl_Position = modifiedPosition;
}
Thanks for your reply and sorry for the late response. You mentioned that the vertex shader needs a dynamic value, but aren’t the vertex positions themselves dynamic values, since they’re changing each iteration?
ofGetElapsedTimef() increases with each call to draw, so why do the vertices move at a constant speed rather than accelerating?
I think you are misunderstanding how the vertex shader works. If I am reading this correctly, you are expecting the mesh to be continuously/cumulatively modified by the shader on every frame, but unfortunately that is not how it works! The vertex shader simply takes the vertices you supply and transforms them on the GPU; the original data in the ofMesh remains unchanged. In other words, the transformations that the vertex shader performs are only “valid” for the frame you are rendering. That also answers your question about acceleration: every frame the shader starts again from the original vertex position and offsets it by time * xMult, so the displacement grows linearly with elapsed time, which means constant velocity rather than acceleration. The fragment shader works the same way, BTW!
If you want to accumulate vertex changes computed in the shader over time, then you need a different mechanism where the data modified by the GPU persists, preferably by storing the transformed geometry on the GPU itself. You can achieve this in two ways: with OpenGL Transform Feedback, or with a compute shader in concert with a Shader Storage Buffer Object (SSBO). OF has examples that illustrate both mechanisms, but unfortunately compute shaders are not supported on macOS. If your target OS supports them, the preferred route is to use compute shaders.
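For your Lorenz case specifically, the compute-shader route could look roughly like the sketch below. This is only an outline: the binding index, uniform name, and work-group size are illustrative, and you would dispatch it from the OF app each frame before drawing. The key point is that the positions live in an SSBO on the GPU, so each dispatch updates the result of the previous one.

```glsl
#version 440
layout(local_size_x = 64) in;

// Particle positions live in a shader storage buffer, so the result
// of each dispatch persists on the GPU between frames.
layout(std430, binding = 0) buffer Positions {
    vec4 pos[];
};

uniform float dt; // timestep passed in from the OF app

void main() {
    uint i = gl_GlobalInvocationID.x;
    if (i >= pos.length()) return;
    vec3 p = pos[i].xyz;
    // Lorenz system, forward Euler step
    const float sigma = 10.0, rho = 28.0, beta = 8.0 / 3.0;
    vec3 d = vec3(sigma * (p.y - p.x),
                  p.x * (rho - p.z) - p.y,
                  p.x * p.y - beta * p.z);
    pos[i].xyz = p + d * dt;
}
```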
Thanks! Just to make sure I am understanding correctly: vertex and fragment shaders take data from the ofMesh and process it on the GPU, but they don’t change the underlying ofMesh, so each time the shader runs, it starts again from the original, unmodified ofMesh data. Is that correct?
I will look into both transform feedback and compute shaders.
Also, the gpuParticleSystemExample might be helpful to look at. It stores position and velocity data in textures, which are updated by shaders; a geometry shader then emits vertices and calculates texture coordinates for each primitive. There are some other examples in the /examples/gl/ folder that might also help with the approaches cmendoza mentioned.