Extrude geometry of sphere with texture loaded from image

Hello!

I’m working on taking some spectrograms and mapping them onto a 3D shape (a sphere for now), extruding areas of the geometry based on pixel brightness. I have some of this working, but it isn’t exactly what I want. I’m able to put my image texture onto a sphere mesh in the draw loop, and I have some code that displaces the vertices (in a noise-y way, not how I would like). I’m not sure how to get it where I want it to be.

Here’s what I’ve tried:

ofApp.h

#pragma once

#include "ofMain.h"

class ofApp : public ofBaseApp{

	public:
        void setup();
        void update();
        void draw();
        
        void keyPressed  (int key);
        void keyReleased(int key);
        void mouseMoved(int x, int y );
        void mouseDragged(int x, int y, int button);
        void mousePressed(int x, int y, int button);
        void mouseReleased(int x, int y, int button);
        void mouseEntered(int x, int y);
        void mouseExited(int x, int y);
        void windowResized(int w, int h);
        void dragEvent(ofDragInfo dragInfo);
        void gotMessage(ofMessage msg);

        ofTexture texture;
      
        ofMesh mesh;

        ofEasyCam cam;

        ofSpherePrimitive sphere;
        ofMesh planet;
    
};

ofApp.cpp

#include "ofApp.h"

//--------------------------------------------------------------
void ofApp::setup(){
    sphere.setRadius(200);
    sphere.setResolution(100);
    sphere.setPosition(ofGetWidth()/2, ofGetHeight()/2, 0);

    planet = sphere.getMesh();
    ofDisableArbTex();
    ofLoadImage(texture,"bird.png");
    ofEnableDepthTest();
    cam.enableMouseInput();
}

//--------------------------------------------------------------
void ofApp::update() {

}

//--------------------------------------------------------------
void ofApp::draw() {
        
    ofBackground(180, 180, 180);

    cam.begin();
    texture.bind();
    vector<glm::vec3>& verts = planet.getVertices();
    
    // this runs every frame, so the noise offsets keep accumulating on the vertices
    for(unsigned int i = 0; i < verts.size(); i++){
            verts[i].x += ofSignedNoise(verts[i].x*ofRandom(-1.,1), verts[i].y/ofRandom(-1.,1),verts[i].z/ofRandom(-1.,1), ofGetElapsedTimef());
            verts[i].y += ofSignedNoise(verts[i].z*ofRandom(-1.,1), verts[i].x/ofRandom(-1.,1),verts[i].y/ofRandom(-1.,1), ofGetElapsedTimef());
            verts[i].z += ofSignedNoise(verts[i].y*ofRandom(-1.,1), verts[i].z/ofRandom(-1.,1),verts[i].x/ofRandom(-1.,1), ofGetElapsedTimef());
    }
    
    planet.drawFaces();
    //planet.drawWireframe();
    texture.unbind();
    cam.end();
}

Here is the spectrogram image I’m using, which you can download and put in bin/data for testing:

[bird.png spectrogram image attachment]

A screenshot of what I currently have is in the post below (I wasn’t able to add it here because new users can only include one image per post).

I’m also curious because this is something that I feel only needs to be done once to the mesh vertices, as my intent is to not continually displace them. I feel like I should be able to do all of this in setup() but I’m not sure how…

Any help would be appreciated! Happy to clarify if anything doesn’t make sense.

Here is a screen shot of what it looks like so far:

It’s not half bad, but what I really want is to displace the vertices on the sphere according to the brightness of the corresponding pixels.

Hey @aquietlife, there are likely several ways to do this. One way is to use an ofShader. Have a look at the example project 08_displacementMap in the shader folder. It modifies the position of vertices of an ofPlanePrimitive with a texture that is being generated by ofNoise. You may want to change how the vertices are modified in the .vert shader, maybe with:

        modifiedPosition.xyz += displacementY * scale;

And, you could make a grayscale image from your image and use it in the .vert shader, so you wouldn’t have to average the rgb values of your image to get the brightness. Hope this helps!
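
If it helps, here is a rough sketch of what the CPU side of that could look like. This is only an outline, assuming ofShader and ofImage members named shader and displacement in ofApp.h, a shader pair named displacement.vert / displacement.frag in bin/data, and made-up uniform names (displacementTex, scale); adjust these to match whatever your .vert shader actually samples:

    // sketch only: assumes "ofShader shader;" and "ofImage displacement;" in ofApp.h,
    // plus the sphere and cam members from above
    void ofApp::setup(){
        sphere.setRadius(200);
        sphere.setResolution(100);

        displacement.load("bird.png");
        displacement.setImageType(OF_IMAGE_GRAYSCALE); // single channel, so no RGB averaging in the shader
        shader.load("displacement"); // loads displacement.vert / displacement.frag from bin/data
    }

    void ofApp::draw(){
        ofEnableDepthTest();
        cam.begin();
        shader.begin();
        // the vertex shader can sample this texture and push each vertex out by the sampled brightness
        shader.setUniformTexture("displacementTex", displacement.getTexture(), 1);
        shader.setUniform1f("scale", 50.0); // how far to extrude
        sphere.draw();
        shader.end();
        cam.end();
    }

The actual displacement then happens in the .vert shader, where the sampled brightness offsets modifiedPosition like in the snippet above (see the 08_displacementMap example for the full shader).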


Thank you! That’s a really good solution.

I ended up finding just what I wanted in the openframeworks Essentials book.

My code ended up looking like this:

#include "ofApp.h"

//--------------------------------------------------------------
void ofApp::setup(){
    sphere.setRadius(200);
    sphere.setResolution(100);

    ofLoadImage(texture,"bird.png");
    cam.enableMouseInput();
    
    float w = texture.getWidth();
    float h = texture.getHeight();
    sphere.mapTexCoords(0, h, w, 0);
    sphere.rotate(180, 0, 1, 0);
    
     // getVertices() returns a reference, so the edits below change the sphere's mesh in place
     vector<glm::vec<3, float, glm::packed_highp>>& vertices = sphere.getMesh().getVertices();

     ofPixels pixels;
     texture.readToPixels(pixels);
     for (int i=0; i<vertices.size(); i++) {
         // mapTexCoords() above put the tex coords in pixel space, so they index straight into the image
         ofVec2f t = sphere.getMesh().getTexCoords()[i];
         t.x = ofClamp( t.x, 0, pixels.getWidth()-1 );
         t.y = ofClamp( t.y, 0, pixels.getHeight()-1 );
         // push the vertex outward in proportion to the brightness of the pixel under it
         float br = pixels.getColor(t.x, t.y).getBrightness();
         vertices[i] *= 1 + br / 255.0 ; // * some_extrude_factor
     }
    
}

//--------------------------------------------------------------
void ofApp::update() {
}

//--------------------------------------------------------------
void ofApp::draw() {
        
    ofBackground(180, 180, 180);

    cam.begin();
    draw3d();
    cam.end();
}

// draw3d() is also declared in ofApp.h
void ofApp::draw3d(){
    texture.bind();
    //light.setPosition(ofGetWidth() / 2, ofGetHeight()/2, 600);
    //light.enable();
    //material.begin();
    ofEnableDepthTest();
    
    ofSetColor(ofColor::white);
    sphere.draw();
    sphere.drawWireframe();
 
    ofDisableDepthTest();
    //material.end();
    //light.disable();
    //ofDisableLighting();
    texture.unbind();

}

I tried setting the type of vertices as vector<ofVec3> or vector<ofPoint> (which was in the book) but nothing worked.

Is there something less funky that I can use on line 16 instead of vector<glm::vec<3, float, glm::packed_highp>>?

Here’s how it looks. Pretty satisfying!


Hey that looks fantastic! So, you could try the following, and see if you get the same results:

You could try auto&; it lets the compiler pick the type, though I’m not sure offhand whether auto deduces a vector reference like this correctly. (Also, glm::vec3 is just an alias for glm::vec<3, float, glm::packed_highp>, so vector<glm::vec3>& should work as a shorter spelling.)

     auto& vertices = sphere.getMesh().getVertices();

Or, you can omit the step where you get a reference to the vertices,

     vector<glm::vec<3, float, glm::packed_highp>>& vertices = sphere.getMesh().getVertices();

and then loop through the vertices of the mesh and modify them one at a time:

     ofPixels pixels;
     texture.readToPixels(pixels);
     for (int i=0; i<sphere.getMesh().getVertices().size(); i++) {
         // the following gets one tex coord at index i; a glm::vec2 type also works here
         ofVec2f t = sphere.getMesh().getTexCoord(i);
         t.x = ofClamp( t.x, 0, pixels.getWidth()-1 );
         t.y = ofClamp( t.y, 0, pixels.getHeight()-1 );
         float br = pixels.getColor(t.x, t.y).getBrightness();
         glm::vec3 vertex = sphere.getMesh().getVertex(i); // an ofVec3f also works
         vertex *= 1 + br / 255.0; // brightness is 0-255, so normalize it like before
         sphere.getMesh().setVertex(i, vertex);
     }

Initially I was going to suggest the approach you took, but I was unsure of the details of how the texture from the image was mapped onto the sphere with .mapTexCoords(), especially in the polar regions. The shader approach might speed things up if you need to modify the vertices with other images (say, from a video of spectral recordings).

The more recent release of oF has incorporated the glm::vec types and functions, which can be used in place of the older ofVec types. It’s not a direct substitution though. The glm documentation page looks at some of the nuances, and there is a blog post about it here: https://blog.openframeworks.cc/post/173546759884/glm . You’ll see both of these types out in the wild.
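
For example, here is one such nuance (just an illustration, not from the blog post): ofVec3f exposes common operations as member functions, while glm provides free functions for the same things:

    ofVec3f a(1, 2, 3);
    float lenA = a.length();            // member function
    ofVec3f dirA = a.getNormalized();   // member function

    glm::vec3 b(1, 2, 3);
    float lenB = glm::length(b);        // free function
    glm::vec3 dirB = glm::normalize(b); // free function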


Thanks for that :) I’ll try updating my code and spend some time getting a better understanding of the API for texture/mesh/vertex manipulation.

Another approach, as @TimChi was suggesting, is to do the displacement in the vertex shader. This is a really good tutorial: https://www.clicktorelease.com/blog/vertex-displacement-noise-3d-webgl-glsl-three-js/ . It’s written for three.js, but the explanation and the process are the same in OF.