Just figured I’d throw this out there and see if anyone has good resources or suggestions for building a projection mesh using vertex shaders, i.e. creating a way to map all the vertices of a series of objects to a mesh. The goal is to self-synchronize multiple projections from multiple projectors, so that if a projector or screen shifts, we can create a mesh for that particular projection and the images will line up again (hopefully that makes sense). Basically we want to make sure that multiple projections can be fitted to one another.
A vertex shader seems like a fast way to do this, but if anyone has a different/better idea I’d love to hear it. Thanks!
Hmm, do you mean quad warping in the vertex shader? Or do you mean that the position of the projector should work like a camera? So, for instance, if you move the projector, the image changes according to the movement?
Anyway, you can’t “create a mesh” in a vertex shader. You can only reposition the vertices you already sent down the pipeline.
Thanks Moka. Yeah, that came out all wrong. I meant that we’d create a mesh for the deformations and then apply it to all of our vertices in the shader. Sort of like what’s discussed here:
but I’m thinking we’ll be applying it to a lot more vertices than Theo demonstrates there, so I was wondering what possible limitations there are to that approach, as I haven’t worked with vertex shaders a ton. I’ll get some more details together on exactly what we need and post them here in an hour or so.