Line drawing artifacts

I am trying to draw a Gray code pattern on screen using ofDrawLine, but I am experiencing some strange artifacts. What is causing them? (I'm on OS X.)

The artifact looks like this: http://i.imgur.com/l1CmzEX.png

It is supposed to be two solid blocks of blackness, but somehow it goes wrong.

Any input is highly welcomed.

Kind regards

Jesper

Here is my code:

    #include "ofApp.h"

    // 'bit' is an int member, assumed to be declared in ofApp.h
    void ofApp::setup(){
        ofBackground(255,255,255);
        ofSetFrameRate(60);
        ofDisableAntiAliasing();
    }

    void ofApp::draw(){
        //ofSetBackgroundAuto(false);
        ofSetLineWidth(1.0);
        for(int i=0; i<1680; i++)
        {
            int d = 0;
            // test the currently selected bit of the column index
            if( (i & (1 << bit)) != 0)
            {
                d++;
            }
            // test of the next bit, currently disabled
            if( (i & (1 << (bit+1))) != 0)
            {
                //d++;
            }

            if(d == 1)
            {
                ofSetColor(255,255,255);
                ofDrawLine(i,0,i,300);
            }
            else
            {
                ofSetColor(0,0,0);
                ofDrawLine(i,0,i,300);
            }
        }
    }

    void ofApp::keyPressed(int key){
        // step to the next bit on any key press, wrapping after bit 11
        bit += 1;
        if (bit > 11)
        {
            bit = 0;
        }
    }

Hi,
I can confirm that your code works fine with OF 0.8.4 on Windows:

void ofApp::setup()
{
    ofBackground( 255, 255, 255 );
    ofSetFrameRate( 60 );
    ofDisableAntiAliasing();
}

void ofApp::draw()
{
    ofSetLineWidth( 1.0 );
    for( int bit = 1; bit < 11; bit++ )
    {
        for( int i = 0; i < 1680; i++ )
        {
            int d = 0;
            if( ( i & ( 1 << bit ) ) != 0 )
            {
                ofSetColor( 255, 255, 255 );
            }
            else
            {
                ofSetColor( 0, 0, 0 );
            }
            ofLine( i, bit * 40, i, bit * 40 + 39 );
        }
        ofDrawBitmapStringHighlight( "bit=" + ofToString( bit ), 0, bit * 40 + 25, ofColor(255,0,0), ofColor(0) );
    }
}

I get the same result as @lilive, on OSX 10.11.3 with OF 0.9.2.

It is supposed to be two solid blocks of blackness, but somehow it goes wrong.

What do you mean? The code you pasted is clearly meant to generate a pattern.

By the way, you might want to make sure your code shows up properly on the forum: indent every line of code with four spaces (or multiples of four when indenting for-loops and the like) or use the button at the top of the input field.

Thanx for your replies. I should have isolated the problem more before posting.

My overall intention was to display Gray code on the entire screen, so I can identify pixels through a sequence of camera images of the screen. But that will have to wait until my line drawing works.

To narrow down the problem I have simplified my code to draw 1000 parallel vertical lines:

void ofApp::setup(){
    ofDisableAntiAliasing();
    ofDisableSmoothing();
}

void ofApp::draw()
{
    ofBackground(255,0,0);
    ofSetLineWidth(1.0);
    ofSetColor(255,255,255);
    for(int i=0; i < 1000; i++)
    {
        ofDrawLine(i,0,i,500);
    }
}

It's supposed to fill a 1000x500 square with white by drawing 1-pixel lines on a red background. But it looks like some of the lines are not filling a full pixel, forming a moiré-like pattern:

I'm using a plain OF 0.9.2 on OSX Yosemite with a Radeon 6750M.

(http://openframeworks.cc/versions/v0.9.2/of_v0.9.2_osx_release.zip)

Could it be my GPU behaving strangely, or, more likely, am I missing a setup flag?

Kind regards

Jesper

I don't understand your overall intention, nor why you obtain this result. I have no problem with the same code.
But I can make a suggestion while waiting for someone to give a better answer.
What do you get with:

    for(int i=0; i < 1000; i++)
    {
        ofDrawRectangle(i,0,0.9999999f,500);
        // or try this
        // ofDrawLine(i+0.5f,0,i+0.5f,500);
    }

?

The idea behind this try is that perhaps a line (10,0,10,500) is not the right way to fill all the pixels at x=10, because I think this line is half in the x=9 column and half in the x=10 column. Tell me if I'm not clear :wink:
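To illustrate what I mean (this is just how I picture the rasterization, I may be wrong):

    // A 1-pixel-wide vertical line centered at x = 10 spans [9.5, 10.5].
    // Pixel column 9 covers [9.0, 10.0)  -> the line fills only [9.5, 10.0)
    // Pixel column 10 covers [10.0, 11.0) -> the line fills only [10.0, 10.5]
    // Neither column is fully covered, so the card must round or blend.
    // A line centered at x = 10.5 spans [10.0, 11.0) and exactly covers
    // pixel column 10, which is why the +0.5 version should work.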

@jesper

On OS X 10.9 with an NVIDIA GPU I get a white rectangle.
I know AMD can be a little different with half-pixel values; what if you do this?

void ofApp::setup(){
    ofDisableAntiAliasing();
    ofDisableSmoothing();
}

void ofApp::draw()
{
    ofBackground(255,0,0);
    ofSetLineWidth(1.0);
    ofSetColor(255,255,255);
    for(int i=0; i < 1000; i++)
    {
        ofDrawLine(0.5 + (float)i,0, 0.5 + (float)i,500);
    }
}

Also, what does your main.cpp look like?
Are you using OpenGL > 2 or the programmable renderer?

Thanks!
Theo

Thanx for the replies.

The plot thickens. :slight_smile:

If I replace the line drawing with the following, I get what I expected initially: http://i.imgur.com/GDLDQ3s.png

void ofApp::draw(){
    ofBackground(255,0,0);
    //ofSetLineWidth(1.0);
    ofSetColor(255,255,255);
    for(int i=0; i < 1000; i=i+1)
    {
        ofDrawLine(i+0.5,0,i+0.5,500);
    }
}

Also, if I use the rectangle trick, it gives similarly correct output.

void ofApp::draw(){
    ofBackground(255,0,0);
    //ofSetLineWidth(1.0);
    ofSetColor(255,255,255);
    for(int i=0; i < 1000; i=i+1)
    {
        ofDrawRectangle(i,0,0.9999999f,500);
    }
}

@theo's example also gives the correct result.

My main.cpp is the following, so I guess I'm using the default OpenGL renderer.

#include "ofMain.h"
#include "ofApp.h"

//========================================================================
int main( ){
	ofSetupOpenGL(1024,768,OF_WINDOW);			// <-------- setup the GL context

	// this kicks off the running of my app
	// can be OF_WINDOW or OF_FULLSCREEN
	// pass in width and height too:
	ofRunApp(new ofApp());

}

Is the half offset an error or just something I have to live with?

The reason for the trouble is that I want to use Gray code on a screen to identify which camera pixels correspond to which pixels on my screen. It's a method known from the structured-light field, and it's used in the LightLeaks project. It depends on me being able to fill pixels precisely.

I will continue my work with a half-pixel offset unless I find a better solution.
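For reference, here is a rough sketch of where I'm heading: drawing one bit-plane of a binary-reflected Gray code with the half-pixel offset applied. It's untested; the 1680 width and the `bit` member are from my earlier code:

    void ofApp::draw(){
        ofBackground(0, 0, 0);
        ofSetLineWidth(1.0);
        ofSetColor(255, 255, 255);
        for(int i = 0; i < 1680; i++)
        {
            // binary-reflected Gray code of the column index
            int gray = i ^ (i >> 1);
            // draw the column white if the current bit-plane is set
            if( (gray & (1 << bit)) != 0 )
            {
                // +0.5 centers the 1-pixel line on the pixel column
                ofDrawLine(i + 0.5f, 0, i + 0.5f, 300);
            }
        }
    }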

Kind regards and thank you again for your input.

Jesper

Technically, for drawing, pixel coords should be offset by 0.5 in OpenGL.
See: https://anteru.net/2009/06/01/489/

I think the difference is that NVIDIA cards are more forgiving and interpret 0.0 to 0.5 as being on a pixel, whereas AMD only snaps to the pixel at 0.5.

Super annoying for the type of things you're doing.
You could just do an ofTranslate(0.5, 0.5, 0.0) at the beginning of your draw call, which might be easier.
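Something like this (just a sketch based on your test code, untested):

    void ofApp::draw()
    {
        ofBackground(255, 0, 0);
        ofPushMatrix();
        // shift everything by half a pixel so integer coordinates
        // land on pixel centers
        ofTranslate(0.5, 0.5, 0.0);
        ofSetLineWidth(1.0);
        ofSetColor(255, 255, 255);
        for(int i = 0; i < 1000; i++)
        {
            ofDrawLine(i, 0, i, 500);
        }
        ofPopMatrix();
    }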

Theo

Glad to see my suggestion works.

I don't see it as an error. Here's how I explain it to myself; perhaps this can help you:
The x (or y) value is not the number of the pixel column (or row). x and y are the coordinates of a point, in the mathematical meaning, and a point has no dimensions, no size. A point is not a pixel.
What does this mean when the purpose is to set pixel colors? My answer is that pixels are the cells in the grid defined by the mathematical Cartesian coordinate system. At this point I think a drawing is better to explain myself :slight_smile: :

This square is defined by 4 points, or vertices in OpenGL vocabulary: (2,2), (8,2), (8,8) and (2,8).
If I want to draw such a square, I pass the 4 vertex coordinates to OpenGL and ask it to draw it.
This makes perfect sense if we think about x and y as vertex coordinates, and not as pixel rows and columns.
What is the width of my square? I just do 8-2=6. The square is 6 pixels wide, and 6 pixels are drawn from left to right. Good!
The drawback is that if I do the same thing to draw the boundary of the square, with lines of 1 pixel width, I get this:

And this image is not correct, because pixels can't be half filled, so the graphics card must make decisions, like mixing the color with the background color, producing blurry lines, or some other decision which produces artifacts like yours.
This is how I explain the need to offset the coordinates to get the correct result, using 2.5 and 7.5 values for x and y:
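In OF code, the idea would be something like this (my illustration, I haven't tested it):

    // filled square: integer vertex coordinates cover whole pixels
    ofFill();
    ofDrawRectangle(2, 2, 6, 6);        // fills pixel columns/rows 2..7

    // 1-pixel boundary: center the lines on the pixel centers
    ofNoFill();
    ofSetLineWidth(1.0);
    ofDrawRectangle(2.5, 2.5, 5, 5);    // outline lands on pixels 2..7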

Note that the problem only occurs for odd line widths. With even values there is no problem. Example for ofSetLineWidth(2.0) and the previous vertices with 2 and 8 coordinates:
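Working out the numbers (my own arithmetic, as an illustration):

    // A 2-pixel-wide line centered at x = 2 spans [1.0, 3.0],
    // which exactly covers pixel columns 1 and 2: no half pixels.
    // A 1-pixel-wide line centered at x = 2 spans [1.5, 2.5],
    // which covers half of column 1 and half of column 2.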

If I consider the rendering system in this way, it makes sense, and seems consistent and pretty good to me.

And, because I'm in great shape, let me add some other considerations :wink:
If we now think about x and y as the column and row numbers, then we can draw the boundary of the same square using 2 and 7 values for x and y:

OK, but what if I want a 2-pixel-wide boundary? This time I have to use 1.5 and 7.5 for x and y.
And how do I compute the width of the square? I must do 7-2+1 = 6.

In the end, thinking about x and y as the vertex coordinates seems more consistent and convenient to me.

Well, I stop now! I hope I'm not venturing too far into a subject I don't master, and that I'm understandable despite my limited English :slight_smile:


@lilive That makes good sense.

What bothers me is that it seems to behave differently on different GPU systems. I must do some testing.

Thanx for the input, I'm off to do line drawing again. :slight_smile:

Jesper