Touch Table Setup and Calibration

Hey OF,
Not sure where to post this, but I would love some feedback. I asked a question earlier about camera distortion, but that wasn't really the problem (I think), just part of it. Since this is a large question I posted it here (Zach/Theo, I hope it's okay to post it here).

I am asking about touch table setups and camera-to-projection alignment. Zach, it looks like you did this a while back, and Stefanix, you have been working on this too.

Please read the post and see where I am falling apart. THANKS OF!

http://toddvanderlin.com/blog/2008/03/1-…-libration/

Hey Todd,

I do something very similar in the Laser Tag 2.0 code to align the camera quad to the projected quad. It looks from your image like you are getting distortion across your projected texture. This is because OpenGL breaks the quad up into triangles, so if you warp a texture using only its four corner points you get an uneven distortion across the image.

For Laser Tag we solved this by using an ofAdvTexture object that maps an image across a grid of points, so when you move the corners the warp is spread smoothly across the whole image.
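Roughly, the grid idea looks like this (a minimal sketch, not the actual ofAdvTexture code): bilinearly blend the four dragged corners over an N x N grid and draw the texture as lots of small quads, so the warp is spread over the whole image instead of kinking along one diagonal.

```cpp
// Minimal sketch (not the actual ofAdvTexture code): warp a texture by
// bilinearly interpolating four dragged corners over a grid and drawing
// many small textured quads instead of one big one.
#include "ofMain.h"

// corners ordered: 0 = top-left, 1 = top-right, 2 = bottom-right, 3 = bottom-left
void drawWarpedTexture(ofTexture & tex, ofPoint corners[4], int res = 10){
    float w = tex.getWidth();
    float h = tex.getHeight();

    tex.bind();
    for(int j = 0; j < res; j++){
        glBegin(GL_QUAD_STRIP);
        for(int i = 0; i <= res; i++){
            for(int k = 0; k < 2; k++){
                float u = (float)i / res;          // 0..1 across the grid
                float v = (float)(j + k) / res;    // 0..1 down the grid

                // bilinear blend of the four corners
                float x = (1-u)*(1-v)*corners[0].x + u*(1-v)*corners[1].x
                        +     u*v   *corners[2].x + (1-u)*v  *corners[3].x;
                float y = (1-u)*(1-v)*corners[0].y + u*(1-v)*corners[1].y
                        +     u*v   *corners[2].y + (1-u)*v  *corners[3].y;

                glTexCoord2f(u * w, v * h);  // assumes pixel (ARB rect) tex coords
                glVertex2f(x, y);
            }
        }
        glEnd();
    }
    tex.unbind();
}
```

Bilinear blending isn't a true perspective warp, but for corner-dragging alignment it already looks much better than a single quad split into two triangles.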

You can see it a little here - with the second quad on the main display:
http://impssble.com/photos/data/LASERTA-…–guide.jpg

You can download all the laser tag code including the texture stuff here:
http://muonics.net/blog/index.php?postid=26

Or a much simpler implementation:
An app that lets you warp an image texture.
http://impssble.com/GRL/projectionApp/g-…-dXcode.zip

BTW, I am actually working on solving this problem for the entire OpenGL view (not just textures). You can see my progress here:
http://forum.openframeworks.cc/t/quad-warping-an-entire-opengl-view-solved/509/0

I am pretty close to getting it working with glMultMatrix -
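Roughly what I mean (a sketch of the approach, not the final code): use OpenCV's cvGetPerspectiveTransform to get a 3x3 perspective transform from where the four corners should be to where you dragged them, pad it out into a 4x4 matrix and multiply it onto the modelview, so everything drawn afterwards gets warped rather than just a texture.

```cpp
// Sketch of the glMultMatrix idea (not the final code): warp the whole view
// by turning a 3x3 perspective transform into a 4x4 OpenGL matrix.
#include "ofMain.h"
#include "cv.h"   // old-style OpenCV header, as used by the OF CV addons of the time

// src = the four corners of the unwarped view, dst = where you dragged them
void beginWarp(CvPoint2D32f src[4], CvPoint2D32f dst[4]){
    float h[9];
    CvMat H = cvMat(3, 3, CV_32FC1, h);
    cvGetPerspectiveTransform(src, dst, &H);

    // embed the 3x3 homography into a 4x4 column-major GL matrix,
    // leaving z untouched
    GLfloat m[16] = {
        h[0], h[3], 0, h[6],
        h[1], h[4], 0, h[7],
        0,    0,    1, 0,
        h[2], h[5], 0, h[8]
    };

    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glMultMatrixf(m);
}

void endWarp(){
    glPopMatrix();
}
```

Anything drawn between beginWarp() and endWarp() then gets pulled into the dragged quad; the perspective divide also scales depth, which is fine for 2D drawing.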
theo

hey vanderlin.

I don't quite get what your exact problem is, so excuse me if I'm off track, but since it seems we've been trying to solve similar problems this week (and I think we've solved ours), I thought I'd give my two cents.

We are also working on a multitouch table, and we had two main problems concerning blob alignment:

  • the table is huge, so we had to use a wide-angle lens (1.9 mm), which means a lot of fisheye distortion.
  • the table is a circle, so we couldn't see the edges of the projection to do the warping properly.

for the first problem, stefanix's help and code have been a blessing. We got rid of the distortion by using his chessboard calibration app. stefan, we do owe you a beer.
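For anyone else fighting a fisheye lens, the gist (a rough sketch, not stefan's actual code) is that the chessboard calibration produces a camera matrix and distortion coefficients, and after that every grabbed frame just gets run through OpenCV's undistort before any blob tracking happens.

```cpp
// Rough sketch of applying the undistortion (not stefan's actual code):
// the intrinsics and distortion coefficients come out of the chessboard
// calibration step and are assumed to already be loaded.
#include "cv.h"

void undistortFrame(IplImage * grabbedFrame, IplImage * undistortedFrame,
                    CvMat * cameraMatrix,   // 3x3 intrinsics from calibration
                    CvMat * distCoeffs){    // 1x4 distortion coefficients
    // remaps every pixel so straight lines in the world stay straight
    cvUndistort2(grabbedFrame, undistortedFrame, cameraMatrix, distCoeffs);
}
```

If doing this per frame is too slow, cvInitUndistortMap plus cvRemap does the same job with a precomputed lookup.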

for the second one, we've figured out a way of doing the warping using internal points of the projection; that is, dragging points which are not the edges themselves but points at a constant distance from the edges.

Today we've finally gotten the calibration right. You can see a test (on the screen of my computer) here:

http://www.flickr.com/photos/jesusgollonet/2346619588/

as the note says, we drag the blue points instead of the edges…

the code still has a couple of things to fix, but if you feel it could be of any help I can upload it.

best!
jesús

jesús,
This is the problem exactly; your approach sounds great. When you draw to the screen (the things you interact with), are those getting warped and distorted, or are they proportionate to the table top?

I would love to try your solution out.

I will also check out the ofAdvTexture and see what it does. Thanks a ton guys.
T

hey vanderlin!

I'm sorry it took so long. I'm on vacation in a mostly internetless place.

The basic idea is to warp using internal points instead of the edges, maintaining the aspect ratio. So if you have 4 points in your projection which have a known relationship with the edges, you can adjust your warping points to those and fit your warped image to the projected image.
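A simplified sketch of that "known relationship" (assuming the four points sit at the same fractional inset from each corner and the warp is close to affine; the uploaded app works on the dragged points directly rather than exactly like this): each inner point and the one diagonally opposite it span (1 - 2 * inset) of the full diagonal, so the corner can be recovered by pushing the inner point outward by inset / (1 - 2 * inset).

```cpp
// Simplified sketch of recovering the projection corners from four inner
// points that sit at a fixed fractional inset from each corner (assumes the
// warp is roughly affine; the uploaded app works a bit differently).
#include "ofMain.h"

// inner[0..3] = dragged points, ordered TL, TR, BR, BL
// inset       = fraction of the width/height the points are pulled in, e.g. 0.1
void extrapolateCorners(ofPoint inner[4], float inset, ofPoint corners[4]){
    // push each inner point outward along the diagonal to its opposite point
    float k = inset / (1.0f - 2.0f * inset);
    for(int i = 0; i < 4; i++){
        ofPoint opposite = inner[(i + 2) % 4];
        corners[i].x = inner[i].x + (inner[i].x - opposite.x) * k;
        corners[i].y = inner[i].y + (inner[i].y - opposite.y) * k;
    }
}
```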

I've uploaded a couple of apps:

http://jesusgollonet.com/openFrameworks/gridWarp.zip

this is a demo app I did for trying out the concept. It's just for calculating and dragging the points, so there's no image transformation here. I thought it might be useful for understanding the idea.

http://jesusgollonet.com/openFrameworks-…-Points.zip

and this is the real app. It relies on stefan's ofCvMain, the suggested ofPoint core type, and ofXML.

it does several things.

  • loads the warping and undistortion settings from an XML file (AppSettings.xml, included); see the loading sketch after this list.
  • draws the real and warped image.
  • allows you to drag the blue points on the real image. you should align those to the internal points on the big image.
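The loading step looks roughly like this (a sketch only, written against the ofxXmlSettings addon with made-up tag names; the real code and tag layout are in the zip's AppSettings.xml):

```cpp
// Sketch of loading the warp points from XML (example tag names only;
// check AppSettings.xml in the zip for the real structure).
#include "ofMain.h"
#include "ofxXmlSettings.h"

ofxXmlSettings settings;
ofPoint warpPoints[4];

void loadAppSettings(){
    if( settings.loadFile("AppSettings.xml") ){
        settings.pushTag("warp");                       // hypothetical <warp> group
        int n = settings.getNumTags("point");           // one <point> per draggable point
        for(int i = 0; i < n && i < 4; i++){
            warpPoints[i].x = settings.getValue("point:x", 0.0, i);
            warpPoints[i].y = settings.getValue("point:y", 0.0, i);
        }
        settings.popTag();
    }
}
```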

I think the easiest way to test is by pointing a camera at your computer screen and trying to align the points (as seen in the image from the previous post).

Hope it makes sense/is of any use.

Best!
jesús.