Raymarching - a way to liberate generative graphics

Recently I was so fascinated by working with raymarching in oF that I decided to write this detailed post about it.
This technique makes it possible to render non-rigid objects such as clouds, to create dynamic and truly generative volumes without meshes, and, additionally, to render very simply in VR or as 360-degree panoramic images.

Collecting all this information took some time, so I'm sure it will be useful for anybody who wants to work with it in modern oF/GLSL.

We (the art duo Endless Attractions Museum) recently released the VR art project “Night sleep was taken from my eyes, V.2”. In the project we ask: how are our memories formed? Is it possible to visualize the process of amnesia? What will memory storage look like in the future? Wearing a VR HMD, the viewer finds themselves flying in a cloudy space. Some of the clouds contain transformed panoramic photos illustrating one day in the life of the artists - us, the authors of the project.

Video: https://vimeo.com/339054266

To build the desired “cloudy space”, we had to use non-classical computer graphics, because clouds are just density in space - they have no edges.
The most convenient technique for this is “raymarching”. It’s a fairly simple approach from the raytracing family. The idea behind raymarching is to obtain, for each screen pixel, the corresponding sight ray, scan along that ray with fixed or increasing steps, and accumulate the colors and opacities of the sampled points of the 3D scene. (Raycasting and “real” raytracing choose their steps more delicately, depending on the scene content, but the simplest raymarching does not.)
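To illustrate the idea, here is a minimal CPU sketch of a fixed-step raymarcher. This is a toy example, not the project’s actual GLSL shader: the density field is a hypothetical soft sphere, and everything runs on the CPU for one ray only.

```cpp
#include <cmath>

// Hypothetical density field: a soft sphere of radius 1 at the origin.
// In a real shader this would be a procedural cloud density function.
float density(float x, float y, float z) {
    float d = std::sqrt(x*x + y*y + z*z);
    return d < 1.0f ? (1.0f - d) : 0.0f;
}

// March a single ray with fixed steps, accumulating opacity front to back.
// Returns the accumulated opacity in [0, 1].
float marchOpacity(float ox, float oy, float oz,   // ray origin
                   float dx, float dy, float dz,   // normalized ray direction
                   int steps, float stepSize) {
    float alpha = 0.0f;
    for (int i = 0; i < steps; i++) {
        float t = i * stepSize;
        float a = density(ox + dx*t, oy + dy*t, oz + dz*t) * stepSize;
        alpha += (1.0f - alpha) * a;   // front-to-back accumulation
        if (alpha > 0.99f) break;      // early exit when nearly opaque
    }
    return alpha;
}
```

In a fragment shader the same loop runs per pixel, with the ray origin and direction derived from the camera.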

Raymarching feels very different from the “graphics of lines and triangles”. Instead of thinking about surfaces in space and their lighting, you can think about mixing generative volumes, dynamic space densities, volumetric lighting and many other “fantastic” things.

To get a taste of raymarching and raytracing, see the Shadertoy site. For example, see Sculpture III by iq: https://www.shadertoy.com/view/XtjSDK (more on his site):

– this is not a dynamic 3D mesh, it’s just a generative volume! On the Shadertoy site you can tweak the code and run it online.

But now, back to the clouds. The first version of the project was made in Unreal Engine 4, but it turned out to be slow and not flexible enough, so we remade it in openFrameworks (the sound synthesis was made in Pure Data).

Here is the list of components for working with raymarching in oF, with optional output to VR and panoramic images:

  1. Using the ofxShadertoy addon by Tiago Rezende, you can get (almost) any shader from the Shadertoy site running in openFrameworks.

For the clouds, we started with the shader https://www.shadertoy.com/view/lss3zr by XT95, which renders clouds using pure formulas.

  2. Working with 3D textures.
    We needed to fill the whole 3D space with dynamic clouds in realtime, for the two HTC Vive views at 1512x1680 each, and pure formula-based computation was slow even on our NVIDIA GeForce GTX 1080. So, we replaced the formulas with a mix of repeated 3D cloud textures.

To use 3D textures in modern oF/GLSL shaders with the programmable pipeline, I developed the class ofxKuGraphicsTexture3D, which lives in my ofxKu addon (inspired by the ofxVolumetrics addon by Timothy Scaffidi).
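The “repeated” part of this is just coordinate wrapping. Here is a CPU sketch of the idea (my own toy stand-in, not the ofxKuGraphicsTexture3D internals; on the GPU the hardware does this for you when the texture wrap mode is GL_REPEAT):

```cpp
#include <cmath>
#include <vector>

// Minimal stand-in for a tiled 3D texture with GL_REPEAT-style wrapping.
// A real implementation uploads the voxels to the GPU and samples them
// in GLSL; this sketch only shows the wrapping (tiling) logic.
struct Volume3D {
    int n;                      // cubic resolution n x n x n
    std::vector<float> voxels;  // density values, size n*n*n

    // Sample with repeated (tiled) coordinates, nearest-neighbor.
    float sampleRepeat(float x, float y, float z) const {
        auto wrap = [this](float v) {
            float f = v - std::floor(v);     // fract(v) -> [0, 1)
            return (int)(f * n) % n;
        };
        return voxels[(wrap(z) * n + wrap(y)) * n + wrap(x)];
    }
};
```

Shifting the sampled coordinates over time, and mixing several wrapped lookups at different scales, gives the dynamic tiled-cloud effect.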

  3. For VR and hybrid graphics (combining classical OpenGL and raymarching), it’s necessary to accurately compute each ray’s origin and direction corresponding to the view-projection matrix. I describe how to do it here:
    Such computation is crucial for VR, where you must obtain the exact view for the HMD, and also for rendering hybrid graphics that combine raytracing and “classical” rasterization-based graphics.
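Since the linked derivation isn’t reproduced here, a common generic approach (which may differ in detail from the article) is to unproject two NDC points through the inverse view-projection matrix:

```cpp
#include <array>
#include <cmath>

using Vec4 = std::array<float, 4>;
using Mat4 = std::array<float, 16>;  // row-major 4x4

// Multiply a row-major 4x4 matrix by a column vector.
Vec4 mul(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; i++)
        for (int j = 0; j < 4; j++)
            r[i] += m[i*4 + j] * v[j];
    return r;
}

// Given the INVERSE of the view-projection matrix and a pixel in
// normalized device coordinates (x, y in [-1, 1]), recover the world-space
// ray by unprojecting points on the near (z = -1) and far (z = 1) planes.
void pixelToRay(const Mat4& invViewProj, float ndcX, float ndcY,
                Vec4& origin, Vec4& dir) {
    Vec4 pNear = mul(invViewProj, {ndcX, ndcY, -1.0f, 1.0f});
    Vec4 pFar  = mul(invViewProj, {ndcX, ndcY,  1.0f, 1.0f});
    for (int i = 0; i < 3; i++) {        // perspective divide
        pNear[i] /= pNear[3];
        pFar[i]  /= pFar[3];
    }
    float dx = pFar[0] - pNear[0], dy = pFar[1] - pNear[1],
          dz = pFar[2] - pNear[2];
    float len = std::sqrt(dx*dx + dy*dy + dz*dz);
    origin = {pNear[0], pNear[1], pNear[2], 1.0f};
    dir    = {dx/len, dy/len, dz/len, 0.0f};
}
```

For VR you take the per-eye view and projection matrices from the HMD API, so each eye gets its own exact ray set.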

  4. For “front to back” alpha blending, the following formulas can be used.
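The formulas themselves didn’t survive in this post, but standard front-to-back compositing (which I assume is what was meant) looks like this:

```cpp
// Front-to-back alpha blending: composite a new sample (srcColor, srcAlpha)
// behind what has already been accumulated along the ray.
// Standard volume-rendering compositing equations.
void blendFrontToBack(float& color, float& alpha,
                      float srcColor, float srcAlpha) {
    color += (1.0f - alpha) * srcAlpha * srcColor;
    alpha += (1.0f - alpha) * srcAlpha;
}
```

You call this once per sample inside the marching loop, and can break out early once alpha gets close to 1 - nothing behind an opaque accumulation can contribute.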

  5. For using VR in oF, I use my fork of the ofxOpenVR addon - it contains several fixes and improvements compared to the original version.

  6. Having implemented the raymarching algorithm, it’s simple to render panoramic images. It’s enough to fix the ray origins at a single point (the center of the panorama) and map each pixel’s coordinates to the corresponding ray direction, as described in my article.
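For an equirectangular panorama, the pixel-to-direction mapping can be sketched like this (my own convention for the axes and u/v ranges, which may differ from the article’s):

```cpp
#include <cmath>

// Map an equirectangular panorama pixel to a ray direction on the unit
// sphere. u in [0, 1) covers longitude -pi..pi, v in [0, 1] covers
// latitude -pi/2..pi/2. The ray origin stays fixed at the panorama center.
void panoPixelToRay(float u, float v, float& x, float& y, float& z) {
    const float PI = 3.14159265358979f;
    float lon = (u - 0.5f) * 2.0f * PI;   // longitude
    float lat = (v - 0.5f) * PI;          // latitude
    x = std::cos(lat) * std::sin(lon);
    y = std::sin(lat);
    z = std::cos(lat) * std::cos(lon);
}
```

Marching these rays and writing the results into a texture of the panorama’s resolution gives a complete 360-degree image in one pass.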

Though raymarching is quite simple, I hope this is enough to get you interested in it and using it in creative projects.
And, of course, I will be happy to help you with this stuff.


Great compilation of resources, thanks for sharing!


Thanks for sharing. Did you use one of the new NVIDIA RTX GPUs for this?
These days I’m wondering whether it’s worth it, or whether I could go with something older like a GTX 1070.

Do you have any examples to try these classes? https://github.com/perevalovds/ofxKu


Thanks for the reply! No, I haven’t touched an RTX GPU yet; I’m currently working with a GTX 1080 and a 1060.

Examples for the ofxKu classes - no, currently I don’t have any. Actually, it’s a collection of independent useful classes used in a wide variety of projects I’ve made over several years.

I will think about making some examples of raymarching and ofxKu :slight_smile:


Thank you for this! I will dive into it as soon as I can. Raymarching is still a mystery to me, so this will probably help me step in the right direction!



I’m currently looking at it, it’s really great.

Thanks for this.


