Xtion + openframeworks + Raspberry Pi


I have an ASUS Xtion Pro Live connected to my Raspberry Pi (OS: Raspbian Wheezy). I have downloaded openFrameworks on both the RPi and my laptop (OS: Ubuntu 12.04) and run the examples (working over SSH).
However, I don’t know how to grab an image (RGB) from the Xtion Pro Live and start working. My project involves face detection and recognition with this stereo camera.

I’ve also downloaded addons for the Xtion and OpenNI, but nothing seems to work properly for grabbing an image.

Could you give me any ideas please? Some help?

It’s a bit old, but here is how I did it before.

I have been playing with this for the last month. It took me a while to get things running.
I’m using ofxOpenNI2Grabber, and I think I slightly modified it.
I can strip some code out of my project and share it if you like.
I need to remove the parts that control some hardware via GPIO; otherwise it won’t work for you.

Thank you for sharing your project. I ran into a problem while compiling, and I would like a little help if you can, because I am new to openFrameworks and I don’t know if I’m following the right procedure.

I copied the files from “OpenNI2AppExample” to “/home/pi/openframeworks/apps/myApps” and then ran make, but I get severe errors. Is this the correct way of compiling your example? Should I download Xcode?

Well, I have not been able to get the OpenNI2AppExample running, but my own project is working.

Some questions first:
Are you using a Raspberry Pi 1 or 2? (the Raspberry Pi 1 will be too slow!! it’s not gonna work)
Are you using openFrameworks v0.8.4?
Did you follow the setup guide? Raspberry Pi 2 Setup Guide (0.8.4)
Did you try compiling the example in the setup guide? (3DPrimitivesExample)

I have a Raspberry Pi 1, Model B (not even a B+ :frowning: ). The RPi’s architecture is armv6l.
I downloaded openFrameworks v0.8.4 on both machines (Ubuntu and Raspberry Pi). I didn’t follow those instructions because they were for the Raspberry Pi 2. Should I redo the procedure following them?
The example runs fine on both machines.

Can this model of Pi handle the Xtion’s feed?

Thank you for your help!

I haven’t tried running the Xtion with an RPi 1 Model B (I only have an RPi 2), but I’m afraid it will be really slow!
For the Pi 1 (armv6l) you don’t have to follow the setup guide link I posted.

But I’m not sure it’s gonna work. I’ll see if I can make a basic project you can test.

Thank you for your help, i appreciate it!!

In our past tests with the RPi 1 and Xtion, it was possible to (sometimes) get a 320x240 depth map every few seconds using @jvcleave’s examples. If you’re interested in using the Xtion with a Raspberry Pi, a small investment in an RPi 2 will be well worth it compared to the hours of frustration with the RPi 1. Though keep in mind that even with an RPi 2 you’ll still be using the Xtion at lower frame rates, resolutions, etc.

I agree Christopher.

@m_tsourma I have packed a project that is running on the rpi2.
You can have a look and see if it runs on your pi1.

Unpack the archive at the proper location (I have mine in /home/pi/of_v0.8.4_linuxarmv7l_release/apps/myApps/xtionSimple),
and make sure to read the readme.md where it says to
copy the file 55-primesense-usb.rules to /etc/udev/rules.d/ and restart.
After that, go to /home/pi/of_v0.8.4_linuxarmv7l_release/apps/myApps/xtionSimple and do:

make

If it compiles, you can do:

make run
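
For reference, PrimeSense devices like the Xtion usually need a udev rule so they can be opened without root. The contents of 55-primesense-usb.rules shipped with the archive will be something along these lines (a sketch, not the exact file; 1d27 is the usual PrimeSense USB vendor ID, so check against the file in the project):

```
# Allow non-root access to PrimeSense / ASUS Xtion USB devices
SUBSYSTEM=="usb", ATTR{idVendor}=="1d27", MODE="0666", GROUP="plugdev"
```

After copying it to /etc/udev/rules.d/, replug the device or restart so the rule takes effect.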


This works!!! I get a “motion detection caption”, but not on an RGB image.
How could I adapt your code to get an RGB image? Is there a way?

Unfortunately, I cannot buy a Raspberry Pi 2 at the moment.

Thank you very much for your support.

Are you referring to my project that I shared on Google Drive?
If so, there is an appSettings.xml in the bin/data/ directory.
In there you will find a setting


if you change that to


you should see the color image,
which you can toggle on or off by pressing the r key while running.
You can also hide the activity indicators by pressing a.
This project probably does more than you are looking for, but it’s the fastest thing I could put together for you.
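
For orientation, the toggle in bin/data/appSettings.xml is just an XML on/off flag; the element name below is made up purely to illustrate the shape, so use the real name you find in the file:

```xml
<!-- hypothetical example; the real element name is in appSettings.xml -->
<showColorStream>1</showColorStream>
```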

One thing to note is that it uses a slightly modified version of ofxOpenNI2Grabber, which adds a property to the DepthSource class that exposes the depth source image as ofPixels without an alpha channel, so that it can easily be used as the source of an ofxCvGrayscaleImage.

Thank you very much for your time!!!
Your work is awesome!!
Could I use and modify your project?
I want to try face detection or recognition (with baby steps, because I’m a newbie).

OpenCV face detection is really slow on the RPi.

Overall, you should use a powered USB hub to get a more reliable stream from the depth camera.

If I purchase the Raspberry Pi 2, will it work better for a real-time application (handling face recognition)?

Is there another board better for real-time applications?

@m_tsourma,
yes, you can use the code if you like, “knowledge is there to be shared” :smile:

I’ll do a quick test to see if face detection on the RPi 2 gives usable results.
Face detection is pretty heavy computation, even for some desktop computers.

But hey, now I’m curious to see what frame rate we can get.

If you can somehow save money on something else, then I recommend you get an RPi 2!
It is much more fun :wink:

I have combined the code from the haarFinderExample
and the project I packed, and it’s running at 2 FPS on the Raspberry Pi 2
(even without drawing the found rects into an FBO).

So clearly, if you want to do face detection plus something visual, or just need a frame rate higher than 2 fps, a Raspberry Pi 1 or 2 is not a good solution.

@gepatto I couldn’t use your shared code.

It seems my Raspberry Pi can’t find “bcm_host.h”. Do you know a solution to this problem?

../../../libs/openFrameworks/utils/ofConstants.h:166:31: fatal error: bcm_host.h:
          #include "bcm_host.h"

I’m using “of_v20150923_linuxarmv7l_release”.

Did you use the correct makefiles and platform variant, as outlined in the RPi armv7 setup guide? bcm_host.h ships with the Raspberry Pi firmware in /opt/vc/include, and the armv7l platform makefiles add that include path, so building with the wrong platform config produces exactly this error.

I apologize in advance for the possibly unintelligent questions I might ask.
I am working on a senior project that requires me to pull RGB-D data from the Xtion. @gepatto seems to have excelled in this respect.
I am working with an overclocked Raspberry Pi 2 and have successfully compiled openFrameworks.
My first question: @gepatto, can you share with me the file you shared with @m_tsourma? Thank you.
Second: where do I find ofxOpenNI2Grabber, and how do I integrate it into openFrameworks?