Question 1: particle system, uniforms, and the frag shader
I wrote a simple program I called glowBoids that used uniforms to supply the fragment shader with the positions of the particles. The fragment shader would then iterate over all the particles (in this case, n = 32) and boost each pixel's brightness based on whatever force expression I was using.
The first limitations I hit are, obviously, the hard cap on the number of uniforms I can supply to the shader, and the number of per-pixel operations that are feasible (surprisingly high, but still limited).
Loading the data into a texture is clearly the method for moving large amounts of it; is there a float texture type I should be using?
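To be concrete, here is roughly what I imagine the shader side looking like: a sketch assuming particle positions are packed one per texel into a GL_RGBA32F float texture that is numParticles texels wide (all names here are mine, nothing built in):

```glsl
// Sketch: fetch particle positions from a float texture instead of a
// uniform array. Assumes one particle per texel in a GL_RGBA32F texture.
uniform sampler2D particleTex;
uniform int numParticles;

void main() {
    float brightness = 0.0;
    for (int i = 0; i < numParticles; i++) {
        // Sample the center of texel i along a 1-texel-tall strip.
        vec2 uv = vec2((float(i) + 0.5) / float(numParticles), 0.5);
        vec3 pos = texture2D(particleTex, uv).xyz;
        float d = distance(gl_FragCoord.xy, pos.xy);
        brightness += 1.0 / (d * d + 1.0);  // placeholder force/falloff term
    }
    gl_FragColor = vec4(vec3(brightness), 1.0);
}
```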
I have been rendering to multiple framebuffers in series to cut down the time each pixel spends in the fragment shader, then combining the resulting textures with yet another shader.
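The combine pass I mean is trivial, something like this (passA/passB are placeholder names for the two FBO color textures):

```glsl
// Sketch: additively blend the partial results from two earlier passes,
// drawn on a fullscreen quad with fixed-function texture coordinates.
uniform sampler2D passA;
uniform sampler2D passB;

void main() {
    vec2 uv = gl_TexCoord[0].st;
    gl_FragColor = texture2D(passA, uv) + texture2D(passB, uv);
}
```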
What do I need to be doing in the fragment shader so I’m not doing 1024 operations per pixel?
Another thing I would love to do is manage the particle system on the GPU: storing velocity and position vectors in textures and letting the fragment shader do all the dirty work of calculating time t+1 for everybody in parallel. I can do this, but how do I go about turning those 1M positions into pixels on my screen? In my dreams the fragment shader can write to a VBO. I just want to park things on the GPU and not have to rely on OpenCL within OF.
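The update step I have in mind would look something like this: position and velocity live in two float textures, and each frame I ping-pong between paired FBOs, with one fragment per particle (names are placeholders, not OF/GL built-ins):

```glsl
// Sketch: one simulation step done entirely in a fragment shader.
uniform sampler2D posTex;   // xyz = position at time t
uniform sampler2D velTex;   // xyz = velocity at time t
uniform float dt;           // timestep

void main() {
    vec2 uv = gl_TexCoord[0].st;          // this fragment = this particle
    vec3 pos = texture2D(posTex, uv).xyz;
    vec3 vel = texture2D(velTex, uv).xyz;
    // Euler integrate; the output lands in the "other" position texture,
    // i.e. whichever FBO attachment we're currently rendering into.
    gl_FragColor = vec4(pos + vel * dt, 1.0);
}
```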
Question 2: what is that line?!?
Anything I do that is at all computationally expensive and full-screen will always introduce a nasty tracking scanline about an inch from the top of my screen. It will dance all around the screen if I don't enable vertical sync, but the line stays put with vertical sync on. This might be my video card, or it might be because I use Linux. I'm getting a nice modern card this weekend, so I will see what happens. It happens on multiple monitors, and on my projector.
Question 3: why?
Besides “morphing” objects (understandably powerful) and doing per-vertex lighting calculations, what is the purpose of the vertex shader? I understand how powerful it is to animate a static model using just a few supplied vectors, but is there a way to use the vertex shader to update individual particles in a simulation? Can the vertex shader read from other positions in the VBO? Can it write to a new VBO, the way the frag shader can ostensibly write to a texture if you use an FBO?
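One thing I've read about but not tried is vertex texture fetch: draw N point vertices, and have the vertex shader pull each one's position out of the simulation's position texture, so nothing round-trips through the CPU. A sketch, assuming the card supports texture sampling in the vertex stage (particleUV is a made-up per-vertex attribute, the lookup coordinate into the texture):

```glsl
// Sketch: position each point vertex from the GPU-resident position texture.
uniform sampler2D posTex;
attribute vec2 particleUV;  // where this particle lives in posTex

void main() {
    vec3 pos = texture2D(posTex, particleUV).xyz;
    gl_Position = gl_ModelViewProjectionMatrix * vec4(pos, 1.0);
    gl_PointSize = 2.0;  // needs GL_VERTEX_PROGRAM_POINT_SIZE enabled
}
```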
Besides “sparklies” and more morphing, what am I not getting about the geometry shader? Is the purpose of the geometry shader to provide highlights and increased detail at vertices?
I imagine drawing a Greek column where just a single two-point line is drawn on the screen, but the vertices that make up the cylinder and the detailed volutes at the top and bottom are generated by the geometry shader from just the single segment supplied to it.
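The simplest version of that amplification idea, as I understand it, would be something like this GLSL 1.50 geometry stage that turns one input point into a quad (halfSize is a made-up uniform):

```glsl
#version 150
// Sketch: amplify one point into a screen-aligned quad — the minimal case
// of "new vertices generated from few inputs".
layout(points) in;
layout(triangle_strip, max_vertices = 4) out;

uniform float halfSize;

void main() {
    vec4 p = gl_in[0].gl_Position;
    gl_Position = p + vec4(-halfSize, -halfSize, 0.0, 0.0); EmitVertex();
    gl_Position = p + vec4( halfSize, -halfSize, 0.0, 0.0); EmitVertex();
    gl_Position = p + vec4(-halfSize,  halfSize, 0.0, 0.0); EmitVertex();
    gl_Position = p + vec4( halfSize,  halfSize, 0.0, 0.0); EmitVertex();
    EndPrimitive();
}
```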
Question 4: crosstalk
Aside from lighting information, what would I use the “varying” type for? I don't see any really interesting use for it, but I know I must be wrong.
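As far as I can tell it's just the pipe from the vertex stage to the fragment stage, interpolated across the primitive, so any per-vertex quantity could ride it. A sketch of what I mean (vertexAge is a made-up per-particle attribute):

```glsl
// --- vertex shader ---
attribute float vertexAge;  // hypothetical per-particle attribute
varying float age;

void main() {
    age = vertexAge;        // interpolated on its way to the fragments
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// --- fragment shader ---
varying float age;

void main() {
    // Fade older particles out; "age" arrives interpolated per fragment.
    gl_FragColor = vec4(1.0, 0.5, 0.1, 1.0 - age);
}
```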