"Game of Life" shader example

I made a “Game of Life” patch with Pure Data / Ofelia / openFrameworks (Pure Data and the Ofelia library need to be installed; alternatively, it can be run with ofxOfelia and openFrameworks). “gol_shader_v006_mask” is the current version: https://github.com/Jonathhhan/PDGameofLife-shaderVersion-


Would it be difficult to transfer this into 3D space? Transferring the Game of Life rules from 2D to 3D is not a problem, but how can this be solved with a shader? Can I create a 3D FBO with 3D pixels (voxels?) as cell states and pass them to a shader as a sampler3D? And how can I draw a 3D grid with a shader? I would like to create something like this: https://www.youtube.com/watch?v=EW9Q0qMc2Xc (if possible).
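Transferring the rules themselves is indeed straightforward: a cell just counts its 26 neighbors instead of 8. Here is a minimal CPU sketch, assuming a flat array of cell states and a toroidal (wrap-around) volume; the B6/S5–7 rule used here is only one of many possible 3D rule sets, not necessarily the one from the video:

```cpp
#include <vector>

// Minimal CPU sketch of one 3D Game of Life update step.
// Cell states live in a flat std::vector<int> (0 = dead, 1 = alive);
// borders wrap around. The birth/survival rule (B6 / S5,6,7) is an
// illustrative assumption, not a canonical choice.
struct Gol3d {
    int n;                    // grid side length
    std::vector<int> cells;   // n*n*n cell states

    int at(int x, int y, int z) const {
        // wrap coordinates so the volume is toroidal
        x = (x + n) % n; y = (y + n) % n; z = (z + n) % n;
        return cells[x + y * n + z * n * n];
    }

    int neighbors(int x, int y, int z) const {
        int sum = 0;
        for (int dz = -1; dz <= 1; ++dz)
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    if (dx || dy || dz) sum += at(x + dx, y + dy, z + dz);
        return sum; // 0..26
    }

    void step() {
        std::vector<int> next(cells.size(), 0);
        for (int z = 0; z < n; ++z)
            for (int y = 0; y < n; ++y)
                for (int x = 0; x < n; ++x) {
                    int c = at(x, y, z), s = neighbors(x, y, z);
                    bool alive = c ? (s >= 5 && s <= 7) : (s == 6);
                    next[x + y * n + z * n * n] = alive ? 1 : 0;
                }
        cells.swap(next);
    }
};
```

In a shader version, the same neighbor loop would run per voxel, sampling a sampler3D instead of indexing a vector.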

Actually, I think what I need is a 3D texture. But it seems quite complicated to initialize one. I found an addon that could do the job, https://github.com/timscaffidi/ofxVolumetrics, but I get an error with the example…

In aNa I initialize and use an OpenGL 3D texture via glTexStorage3D. The source code can be found here.


function: void ofApp::loadImageSequence()

But it is about a slit-scan effect, not volume rendering. Many images of the same size are loaded; the fragment shader can then access any pixel of any image in constant time.
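The constant-time access falls out of the flat layout: a stack of same-sized images is one contiguous buffer, and any pixel of any layer is reachable by plain index arithmetic. A tiny illustrative sketch (the struct and names are hypothetical, not from the aNa source):

```cpp
#include <vector>

// A stack of same-sized single-channel images flattened into one
// buffer -- the CPU-side analogue of a 3D texture used as a look-up
// table. Any (x, y, layer) pixel is reachable in constant time.
struct ImageStack {
    int w, h, layers;
    std::vector<unsigned char> data; // size w * h * layers

    // constant-time lookup, analogous to texture(tex3d, vec3(x, y, layer))
    unsigned char pixel(int x, int y, int layer) const {
        return data[x + y * w + layer * w * h];
    }
};
```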

Visual output:

Sorry, I cannot post links.


Thank you for the example. I am not there yet, but I have a better understanding now. Is it only possible to manipulate and use the 3D texture with a shader, or can I also access its pixels outside of the shader / draw it into an ofFbo with openFrameworks?
This example does work with the ofxVolumetrics addon: https://github.com/totetmatt/TLCube

The way I use the 3D textures, the shader has read-only access to the pixel data. From the shader’s point of view, it is a gigantic look-up table.

A fragment shader can read from one texture and write to a second texture. Keyword: double buffering / ping-pong buffers. aNa and Hydra are full of ping-pong buffers.

Reading textures back from the GPU to the CPU requires that the GPU calculations are finished before the data is copied into CPU RAM. In the OpenGL world, one tries to use this kind of synchronization as little as possible.

If the image data is already in GPU RAM, it is more performant to copy it on the GPU from one buffer to the other while changing the data.

Thanks again for the explanations. I already use ping-pong buffers (with help from ofFbo) and access the pixels ( a.fboCells:getTexture():readToPixels(a.pixels); ) of 2D textures, which works well. I just was not sure whether I can use a 3D texture the same way.

I added the ofxVolumetrics addon to Ofelia and adapted the example (as a start for using 3D textures with Ofelia): https://github.com/Jonathhhan/volumetrics_example_ofelia

I managed to make a Game of Life with a 3D texture that is calculated in a shader (with ofxVolumetrics).
The only drawback is that I need to transfer the whole texture to the CPU and back to the GPU as a buffer every frame, because I had no luck attaching the 3D texture to an ofFbo (which would make it much faster, I think). It would be great to know if there is a way to stay on the GPU side…

Here is a small example:

Actually, I ported the Game of Life patch from ofxOfelia to OF (it’s now 1 MB instead of 12; I still think there is a lot of redundant code…):


The patterns are saved with ofXml. Of course, with Emscripten it only saves for one session. Maybe it would be nice to import / export .xml presets with JavaScript (that should be possible, I guess)?


Hey @Jona , you may have this worked out a few different ways by now. If you didn’t want to use addons, you could store “seed vertices” in an ofMesh or an ofVboMesh, and then use a geometry shader to create the vertices for rendering. The gpuParticleSystemExample does this if you’re interested.


Thanks. Yes, in the ofxOfelia version I saved the presets with Pure Data as arrays in a text object. I guess it is not possible to edit and save the seed vertices? But it sounds like a nice solution; I will definitely have a look.

Here is a 3D Game of Life shader attempt. A video is mapped to the pixels of the cellular automaton in a slit-scan style: ofGol3d 3 - YouTube
ofEmscriptenExamples/oFgameOfLife3d3 at main · Jonathhhan/ofEmscriptenExamples · GitHub


It was possible to optimize some stuff: ofEmscriptenExamples/ofGameOfLife3d_optimized at main · Jonathhhan/ofEmscriptenExamples · GitHub
The main improvements are using glCopyImageSubData for copying textures (requires OpenGL 4.3) and beginning and ending the shader only once per draw call (not for every texture layer). A 300x300x300 texture can be updated at 60 fps (with a GTX 970). The bottleneck now seems to be the raycast shader from the ofxVolumetrics addon (which is already optimized, I guess - but who knows?). So my question is: is there a way to optimize the ofxVolumetrics raycast shaders (I read, for example, that large for loops in a shader are not a good idea)? And another question: is it possible to use ofLight and ofMaterial with an ofxVolumetrics texture (it did not work for me)?
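One standard trick for those large raycast loops is early ray termination: composite front to back and break out as soon as the ray is effectively opaque, instead of always running the full fixed-length loop. A CPU sketch of the idea (this is an illustration of the technique, not the actual ofxVolumetrics shader; the density array stands in for sampling the 3D texture along the ray):

```cpp
#include <vector>

// Front-to-back compositing with early ray termination: once the
// accumulated opacity is close to 1, further samples cannot change
// the result visibly, so the march stops early.
float marchRay(const std::vector<float>& density, int steps) {
    float alpha = 0.0f;
    for (int i = 0; i < steps; ++i) {
        float sample = density[i];        // sample along the ray
        alpha += (1.0f - alpha) * sample; // front-to-back compositing
        if (alpha > 0.99f) break;         // early ray termination
    }
    return alpha;
}
```

In GLSL the `break` works the same way; the savings depend on how opaque the volume is, since rays through empty space still run the full loop (empty-space skipping would address that separately).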

Hey Jona, did you use an ofMesh (or another source of vertices) when you tried ofLight and ofMaterial? ofLight and ofMaterial need vertices and normals. The textures need texcoords, which can also be associated with an ofMesh but can come from other places too (like an ofFbo).

In the of3dPrimitivesExample, the lights do work with the texture from the webcam. So it seems like these 3d textures would interact similarly with ofLight and ofMaterial.

Unfortunately, I know very little about raycasting and shader optimizations. Ugh! But maybe have a look at the articles on Inigo Quilez’s website. He covers some advanced shader topics in a thoughtful way, often with an emphasis on efficiency. I also often look at shaders archived on Shadertoy; you can search with keywords like “raycast” or “loop optimization”, etc. Plus, it’s just really fun and amazing to see all the shaders and how much you can do with relatively few lines of code.


Hey @TimChi, actually ofxVolumetrics.drawVolume() draws an ofVboMesh, and ofxVolumetrics has vertices and normals: ofxVolumetrics/ofxVolumetrics.cpp at master · timscaffidi/ofxVolumetrics · GitHub. But the ofVboMesh is drawn into an fbo; I think that is why ofLight / ofMaterial do not work (I copied some of the code from the of3dPrimitivesExample). And thanks for the links. I know (and like) Shadertoy, but a good tutorial is what helps me now. It also reminded me of this one: Compute Shaders in Open Frameworks (tutorial)

Also, how about a camera? Rendering 3D into an ofFbo should be fine. You could test that in a quick, ultra-simple project just to make sure.

I wish I could be more help, so I’ll watch the thread. I’m not familiar with some of the calls in ofxVolumetrics::drawVolume() (both oF and openGL).

One approach might be to build a very simple example with all the components (2D textures, ofLight, ofMaterial, the ofVboMesh and the ofFbo, and probably an ofEasyCam), and then slowly test and modify or add some of the unknowns (ofxTexture3d, ofxVolumetrics, etc).

Here is an Emscripten version of the 3D Game of Life: https://volumetrics.handmadeproductions.de/
And here is the Emscripten branch of ofxVolumetrics (it does not seem to work with every GPU): GitHub - Jonathhhan/ofxVolumetrics at emscripten
I have one issue with the ofEasyCam (not sure if it is solvable): if I zoom too close into the texture3d, it disappears (you can see it in the example). It would be very nice to zoom through the texture3d without making it disappear (I already set nearClip to 0 and also tried negative values).
