multiple ps3eye cams on Windows 7 64bits

Hello,

Trying to use two ps3eye cameras on Windows 7 64-bit, I am wondering: does anyone have experience with this? Is it possible to just use the CL-Eye SDK for this, or is paying mandatory anyway?

Beyond that, I am wondering if there is a functioning wrapper around the API for openFrameworks. There is ofxCLeye out there, but I get errors trying to build it. I took some of its code and tried using only the listDevices() method; the app builds, but there is no response whatsoever. I also tried to build the example app from the CL SDK, but the sample C++ code is a bit beyond my current understanding of C++.

Does anyone have experience with this? Or is it way easier to simply switch to Ubuntu? So far I have stuck with Windows because my laptop runs fine with it, and Linux compatibility with nvidia is not so optimal…

Thanks! Bye, menno

i’m guessing your best bet would be to try and get ofxCLeye working if you want a lot of manual control.

but is there a reason the normal ofVideoGrabber won’t work? in my experience it works fine, using setDeviceID() – at least with the older drivers from alexp.
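something along these lines should be enough to try it (untested sketch – the device indices 0 and 1 are placeholders, check the console output of listDevices() for the real ones on your machine):

  
// untested sketch: two grabbers on different device ids
// (the indices 0 and 1 are placeholders, see what listDevices() prints)
#include "ofMain.h"

class testApp : public ofBaseApp {
public:
    ofVideoGrabber camA, camB;

    void setup(){
        camA.listDevices();        // prints the available capture devices
        camA.setDeviceID(0);
        camB.setDeviceID(1);
        camA.initGrabber(640, 480);
        camB.initGrabber(640, 480);
    }

    void update(){
        camA.update();
        camB.update();
    }

    void draw(){
        camA.draw(0, 0);
        camB.draw(640, 0);
    }
};
  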

and yes, i’m pretty sure you really have to pay for each license of alex’s driver. that’s one advantage to linux + osx for ps3eye dev.

edit: i would add, i’ve had good and bad experiences with nvidia on ubuntu. it really depends on your card, and which release you’re talking about.

ofVideoGrabber actually works really well with the CLEye driver, which supports a single camera.

My reason for exploring ways to get two cameras to work is that I want to run two kinds of video analysis on different areas of the image. Doing it with one camera is possible, but it leaves a lower resolution to analyse.
Also, as you say, the manual control would be great and really helpful; the image is too bright now. I am unaware of a way to do this with the ofVideoGrabber class. In Ubuntu it is easier to adjust settings with e.g. guvcview.
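Roughly what I mean by analysing two areas of one image is something like this (just a sketch; the region sizes are placeholders, and it assumes the older OF API where getPixels() returns an unsigned char* of RGB data):

  
// sketch of the one-camera variant: copy two sub-regions out of one 640x480 frame
// and run a separate analysis on each (region positions/sizes are placeholders)
ofVideoGrabber cam;
unsigned char leftRegion[320 * 480 * 3];   // left half of the image, RGB
unsigned char rightRegion[320 * 480 * 3];  // right half of the image, RGB

void update(){
    cam.update();
    if(cam.isFrameNew()){
        unsigned char* pix = cam.getPixels();   // older OF API
        for(int y = 0; y < 480; y++){
            for(int x = 0; x < 320; x++){
                for(int c = 0; c < 3; c++){
                    leftRegion[(y * 320 + x) * 3 + c]  = pix[(y * 640 + x) * 3 + c];
                    rightRegion[(y * 320 + x) * 3 + c] = pix[(y * 640 + x + 320) * 3 + c];
                }
            }
        }
        // ... movement analysis on leftRegion, colour analysis on rightRegion ...
    }
}
  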

My laptop has nvidia GT 540M 2gb. Actually I just checked nvidia website and they seem to have released a driver for linux on the 5th of October! I’m gonna give that a try, it may also ease my adventures into building puredata from source, which is overly difficult on Windows :slight_smile:

well, Ubuntu installs just fine on my laptop. Even the proprietary driver for the Nvidia graphics card installs upon first boot. However, after installing Code::Blocks, downloading OF and running the dependency scripts etc., everything builds, but upon running any OF app I get:

  
Xlib: extension "GLX" missing on display ":0.0".  
freeglut (openframeworks): OpenGL GLX extension not supported by display ':0.0'  

I’m not sure how to go on from here. I spent a long time trying to find a way towards a solution but do not know where to proceed.
I expected the Nvidia drivers to have good support for OpenGL and the like…

Actually this is also quite off topic, sorry for that.

mmh, that’s weird. can you post the first lines of the output of glxinfo (run it in a terminal)? on my computer with the proprietary nvidia drivers it reads like:

  
  
  
name of display: :0  
display: :0  screen: 0  
direct rendering: Yes  
server glx vendor string: NVIDIA Corporation  
server glx version string: 1.4  
server glx extensions:  
  

That doesn’t look so good, I think:

  
name of display: :0  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Error: couldn't find RGB GLX visual or fbconfig  
  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
Xlib:  extension "GLX" missing on display ":0".  
  

Running lsmod does show this (the 0 meaning ‘used by’):

  
nvidia              11713772  0  

From which I would expect the driver to be loaded.
However, running “cat /var/log/Xorg.0.log” gives, among many other things:

  
[  5651.902] (EE) Failed to initialize GLX extension (Compatible NVIDIA X driver not found)  
  

That would explain GLX not being loaded…
In ‘System Settings > Additional Drivers’ it says: NVIDIA accelerated driver (version current) > This driver is activated and currently in use.

i know this is too obvious so sorry, but have you tried restarting? :slight_smile:

Always ok to ask for the obvious :slight_smile:
Have tried that, but to no avail, I get the same errors. I also switched from the “(version current)” NVIDIA additional driver to “(post-release updates)”, but with no effect. I wouldn’t know why the drivers install fine (though Ubuntu gives no feedback on that process) but then do not load…
I wonder if it would help to uninstall the driver and use Nouveau, even though it has no support for Optimus technology. The system settings don’t really give that as an option though.

This is a bit of a dilemma.
Windows does not have any of these issues with openFrameworks, but I cannot use multiple cameras. Also I cannot build puredata (well, I do not know how to solve all the errors), and I use OSC for communication between OF and pd. But Ubuntu won’t run the apps at all at this point.

can you take a look at nvidia-settings and see if you can find something else there?

ok it seems the problem is related to optimus:

http://ubuntuforums.org/showthread.php?t=1699939&page=2

that’s for the previous release of ubuntu but i guess the problem is the same in the new one.

You can try installing the latest nvidia drivers, but do not use the drivers directly from the nvidia page; that usually breaks the ubuntu proprietary driver system. Better to use this:

  
  
sudo add-apt-repository ppa:ubuntu-x-swat/x-updates  
sudo apt-get update  
sudo apt-get install nvidia-current nvidia-current-modaliases nvidia-settings  
  

which adds this ppa with the latest version: https://launchpad.net/~ubuntu-x-swat/+archive/x-updates

That should solve it

Thanks a lot for your help, I seem to have tackled this now!

Here is the story:
The nvidia-settings tool said: nvidia drivers are not loaded.
Looking at xorg.conf, that was not so surprising: there was almost nothing inside.
I ran nvidia-xconfig to make a new xorg.conf.
That kept the system from booting; it hung somewhere halfway, though ctrl-alt-del neatly shuts everything down and reboots.
Removing xorg.conf solves this, but the GLX issue also returns.
Installing the drivers from the ppa went fine, but upon reboot (after running nvidia-xconfig) things were the same: it cannot boot, and there is no feedback as to why the boot process stalls.
Then I again removed xorg.conf, and installed ironhide from this ppa: https://launchpad.net/~mj-casalogic/+archive/ironhide
This rebuilt the kernel, and after a reboot I am able to run the OF emptyExample.

The first lines of glxinfo now show:

  
name of display: :0.0  
display: :0 screen: 0  
direct rendering: Yes  
server glx vendor string: SGI  
server glx version string: 1.4  
server glx extensions: GLX_ARB_MULTISAMPLE, etc etc  

Then there are more vendor strings etc.
I am not sure how performance is, and if switching between intel and nvidia is automatic, but at least things run!

Thanks a lot!

After making a new project and porting the app I built on Windows 7 to Ubuntu (addons.make is great!!), everything works: I can connect two ps3eye cams and grab frames.

However, it seems like a lot of frames are dropped for both streams. This has a real negative influence on the accuracy of the movement and color tracking.
Is it normal for this kind of behaviour to occur with two of these cams at 640x480 at 60 fps? (At least, that’s what I set up in guvcview.)
Running two instances of guvcview shows somewhat different behaviour: one runs fine, the other only grabs a frame every ten or more seconds…

haven’t tried connecting several ps3eyes at the same time, i can try with 1 ps3eye and the laptop camera to see what happens.
btw the sgi driver you managed to install is software emulation so it’s going to be really slow
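in the meantime you could measure how many frames each grabber actually delivers, e.g. by counting how often isFrameNew() returns true per second (rough sketch, camA/camB being your two grabbers):

  
// rough sketch: count how many new frames each grabber really delivers per second
int newFramesA = 0, newFramesB = 0;
float lastReport = 0;

void testApp::update(){
    camA.update();
    camB.update();
    if(camA.isFrameNew()) newFramesA++;
    if(camB.isFrameNew()) newFramesB++;

    float now = ofGetElapsedTimef();
    if(now - lastReport >= 1.0){
        printf("camA: %i new frames/s, camB: %i new frames/s\n", newFramesA, newFramesB);
        newFramesA = newFramesB = 0;
        lastReport = now;
    }
}
  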

For the purpose of testing, I booted my desktop into Ubuntu 11.04 64-bit and built the app there. The result is more or less the same: a lot of latency and many lost frames, especially on the camera initialized second.

Is there a way to control settings for the cameras from within OF? Maybe if I could manually set the framerate to 30 fps and that ran smoothly, I think it would be fine. The difference in responsiveness compared to 60 fps should not be that big, as long as there is no delay and/or errors grabbing the frames.

you can use:

  
  
  
setDesiredFrameRate(30)  
  

before calling initGrabber
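
i.e. something like this in setup (sketch, with camA/camB being your two grabbers):

  
camA.setDesiredFrameRate(30);
camA.initGrabber(640, 480);
camB.setDesiredFrameRate(30);
camB.initGrabber(640, 480);
  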

Setting the desired framerate does not improve the situation. This desktop has an ATI video card with the drivers working etc., so I’m not sure where to look for the cause.
As for the driver on the laptop, it does seem to affect the usage of the nvidia card, though I’m not sure how to really tell which of the two (intel or nvidia) is on/off and used/dormant at any given moment. How could I test whether emulation makes things slow?

perhaps try to run an OF app that does some intensive drawing, like put the polygonExample draw in a for loop and draw it several times per frame and print the framerate to see which one is faster
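something like this would do (quick sketch; the circles are just a stand-in for the polygonExample drawing):

  
// quick stress-test sketch: draw a lot per frame and watch the framerate
void testApp::draw(){
    for(int i = 0; i < 100; i++){
        ofSetColor(ofRandom(255), ofRandom(255), ofRandom(255));
        ofCircle(ofRandom(ofGetWidth()), ofRandom(ofGetHeight()), 30);
    }
    ofSetColor(255);
    ofDrawBitmapString("fps: " + ofToString(ofGetFrameRate(), 1), 20, 20);
}
  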

Running the example on the laptop with Nvidia, in a loop:

  • drawing everything 10 times per frame: fps = 45-60
  • drawing everything 100 times per frame: fps = 11-12

Running the movieGrabber example with two cams (no inverted images):

  • 60-70 fps, but with dropped frames for both streams
  • changing the desired framerate for both streams (setting desired fps to 30) makes things much worse

Maybe this has to do with the way the data gets from the cam to the pc?
I am thinking I will stick to one cam and deal with the lower resolution due to ‘digital zoom’.

Edit: I also tried the movieGrabber example with one cam, but running two instances with different devices set up. This results in both instances running not at all smoothly, just like in the dual-cam app. Closing one of the two makes the other run smoothly again.
Is it possible this has to do with the way memory is managed in the ofVideoGrabber class? For example, a buffer with a certain amount of space for data that overflows with two ps3eye cams running simultaneously?

if you are getting the same behaviour in an external app like guvcview i guess it’s related to something more low level, like the ps3eye driver or something in the kernel. i’m going to try with 2 different cameras and let you know

Not sure if this is relevant, but when using Gem inside Puredata there is no problem using 2 ps3eye cams simultaneously, no stuttering or dropping of frames. I’m a bit puzzled by this: does pd/Gem use a different video driver? It uses V4L2 as far as I know.