Duck hunt like game: laser point recognition/tracking

Hi OF!

We’re working on a simple Duck Hunt-like game that you play with a cheap
laser pointer (the tiny red ones that sell for around $5). The idea is
to detect the laser point on the wall onto which we project the game.

Background differencing + laser dot recognition
We plan to use OpenCV: HSV thresholding combined with background differencing. After
differencing, we check whether the laser colour shows up in the current grabbed webcam
frame. Since we have no experience with laser beam tracking (and some of you do), we’re
wondering: is this a correct approach? We’re in doubt whether we can ‘see’ the red dot
on the wall onto which the game is projected. We could probably use other colours in the
game (so no red), but we’re still not sure about this.
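For what it’s worth, the HSV check itself is simple to prototype. Here’s a minimal stdlib-only sketch (no OpenCV) of the idea: convert a pixel to HSV and test whether it looks like a bright, saturated red. The function names and the threshold values are my own guesses and would need tuning against real webcam frames:

```cpp
#include <algorithm>
#include <cmath>

// Convert an 8-bit RGB pixel to HSV (h in [0,360), s and v in [0,1]).
void rgbToHsv(unsigned char r, unsigned char g, unsigned char b,
              float& h, float& s, float& v) {
    float rf = r / 255.0f, gf = g / 255.0f, bf = b / 255.0f;
    float maxc = std::max({rf, gf, bf});
    float minc = std::min({rf, gf, bf});
    float delta = maxc - minc;
    v = maxc;
    s = (maxc > 0.0f) ? delta / maxc : 0.0f;
    if (delta <= 0.0f)   h = 0.0f;
    else if (maxc == rf) h = 60.0f * std::fmod((gf - bf) / delta + 6.0f, 6.0f);
    else if (maxc == gf) h = 60.0f * ((bf - rf) / delta + 2.0f);
    else                 h = 60.0f * ((rf - gf) / delta + 4.0f);
}

// A laser dot tends to be very bright and saturated; for a red laser the
// hue sits near 0/360. These cutoffs are assumptions to tune on real data.
bool isLaserRed(unsigned char r, unsigned char g, unsigned char b) {
    float h, s, v;
    rgbToHsv(r, g, b, h, s, v);
    bool redHue = (h < 15.0f || h > 345.0f);
    return redHue && s > 0.5f && v > 0.85f;
}
```

Note that a laser dot on camera often saturates to near-white in the centre, so in practice you may need to loosen the saturation test or look at the dot’s rim.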

Someone here who has some info on this?

Blob tracking
Besides recognising the ‘red dot’, we want to track it so you can throw things around in the game. Which of the blob tracking code available here (or in one of the ‘touch table’ libraries) is ‘best’?

Game development
And the last thing… what would be a good programming language for this? Flash is simple, but could the communication between Flash and openFrameworks slow down the interaction? Maybe OF with OpenGL?
Thanks!

A couple of ideas… For the programming language, I’d say stick with OF/OpenGL: you’re not going to beat the performance of C++ very easily with any other high-level language.

As for detecting your laser dot… have you considered trying an infrared camera and laser pointer? There are plenty of tutorials on how to easily make an IR filter for a normal webcam, and I’ve heard you can get IR laser pointers relatively cheap (~$5). Or you could probably do something like this to work with your game and not get lost in the game’s pixels:

  • capture video input of projection (with laser dot)
  • capture a screen image of what’s being rendered
  • difference those to find the laser

Hopefully, even if the laser is shone on a red area of the projection, it will brighten that area enough to appear different… I’m not entirely sure; there are probably some hues where it would be undetectable. You’ll likely need to do some testing with a full red spectrum projected and see if you can still find the laser dot on it.
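The three steps above boil down to a per-pixel absolute difference plus a brightest-spot search, assuming the camera view has already been warped/aligned to the projection. A minimal sketch on grayscale buffers (the names and threshold are mine, not from any library):

```cpp
#include <cstdlib>
#include <vector>

struct Point { int x = -1, y = -1; };

// Difference the camera capture against the rendered frame and return the
// location of the strongest difference, or (-1,-1) if nothing exceeds the
// threshold. Both buffers are grayscale, row-major, width*height bytes,
// and assumed to be geometrically aligned already.
Point findLaserDot(const std::vector<unsigned char>& camera,
                   const std::vector<unsigned char>& rendered,
                   int width, int height, int threshold) {
    Point best;
    int bestDiff = threshold;  // must beat the threshold to count
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int i = y * width + x;
            int diff = std::abs(int(camera[i]) - int(rendered[i]));
            if (diff > bestDiff) {
                bestDiff = diff;
                best.x = x;
                best.y = y;
            }
        }
    }
    return best;
}
```

In a real setup the alignment step (a homography from camera space to screen space) and camera latency are the hard parts; the differencing itself is the easy bit.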

You can probably cobble together a simple blob tracker pretty quickly… I wrote one in less than an hour using the OpenCV libs to do differencing and contour/blob finding. The thing you’ll run into is that blobs swap IDs pretty unpredictably, so you’ll need some basic logic to cache your blob locations and then hook each OpenCV-detected blob up to whichever cached blob is closest (by x, y, size, colour - whatever info you can obtain). Besides that, check out tbeta or Touche.
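That cache-and-match idea can be sketched without any openFrameworks or OpenCV types. This hypothetical version matches on position only; real code would fold size/colour into the distance as well:

```cpp
#include <cmath>
#include <vector>

// Each frame, hook every freshly detected blob up to the closest cached
// blob within maxDist so its id survives; unmatched detections get new ids.
struct Blob {
    int id;
    float x, y;
};

std::vector<Blob> matchBlobs(const std::vector<Blob>& cached,
                             std::vector<Blob> detected,  // input ids ignored
                             float maxDist, int& nextId) {
    std::vector<bool> taken(cached.size(), false);
    for (Blob& d : detected) {
        int bestIdx = -1;
        float bestDist = maxDist;
        for (size_t i = 0; i < cached.size(); ++i) {
            if (taken[i]) continue;
            float dist = std::hypot(d.x - cached[i].x, d.y - cached[i].y);
            if (dist < bestDist) { bestDist = dist; bestIdx = int(i); }
        }
        if (bestIdx >= 0) { d.id = cached[bestIdx].id; taken[bestIdx] = true; }
        else              { d.id = nextId++; }  // brand-new blob
    }
    return detected;
}
```

Greedy nearest-neighbour like this is fine for one or two laser dots; with many crossing blobs you’d want a proper assignment step, but for this game it should be plenty.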

Hi Plong,

I had a look at your website and found this project:
http://tangibleinteraction.com/?page-id=38

Are you using infrared there? I’m wondering, will the infrared reflect enough to
be picked up by the camera?

Could I see the code of that graffiti wall project?

Roxlu

Hmm… I think the version of the graffiti wall that’s shown in that video is actually using a 3D mouse. Our latest version does in fact use infrared… but it’s only an infrared LED in the end of a pen.

Our current setup has a screen with a projector and infrared webcam behind it (a rear-projection setup). Users stand in front of the screen and use the LED pens. The infrared shines through the screen (the pens have to be really close - almost touching - to shine through) and is picked up by the infrared webcam. We use Touche to serve up the pen coordinates as TUIO cursors, which are then read into the openFrameworks Graffiti Wall app to do the drawing.

There was another project (that I think Theo wrote) very similar to our Graffiti Wall that uses a green laser pointer as its cursor. http://graffitiresearchlab.com/?page-id=76

Yeah, we’ve thought about using a rear-projection setup, but the space we have on the rear side is not very large (no more than 50 cm). Therefore Theo’s L.A.S.E.R. project setup seems more applicable.

I’ll think about a solution using your feedback. Thanks a lot!

If someone else has an idea, feel free to post :)

Roxlu

The code for the Laser Tag 2.0 app is here - it includes the code for tracking a laser point based on HSV:

http://www.muonics.net/blog/index.php?postid=26

All the best,
Theo