I have scoured the forum for a clear answer to this and come up empty-handed.
I’m working with a machine that has 2 graphics cards. The primary display is a single output for a control monitor. The secondary output serves 2x TripleHeads, spanned into a single 6x1 display.
If we run OF with the control monitor set to primary, the window won’t render on the 6-up. If we switch the 6-up to primary it works, but this is not convenient for running our controls.
Does anyone know if there is a GLUT call to switch the OpenGL context, or something like that, to force the window to attach to the second graphics card / display?
As far as I know, if you open a new GLUT window it always creates a new OpenGL context that you can easily switch between. But I have only tried this with a dual monitor output on a single graphics card. The old ofxFenster that uses GLUT has some code included to switch between these contexts: http://forum.openframeworks.cc/t/ofxfenster-addon-to-handle-a-second-window-[outdated]/3916/0
int winRef = glutCreateWindow(name.c_str()); // creates a new window with its own GL context and returns its id
glutSetWindow(winRef); // makes that window current for subsequent GLUT/GL calls
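For what it’s worth, here is a minimal plain-GLUT sketch of that idea: two windows, each with its own context, with glutSetWindow() used to switch between them. The window positions and sizes are placeholders I made up for a control monitor plus a spanned output.

#include <GL/glut.h>

int controlWin, outputWin;

// each display callback runs with its own window's context already current
void drawControl(){
    glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

void drawOutput(){
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

void idle(){
    // glutSetWindow() picks which window (and GL context) subsequent calls target
    glutSetWindow(controlWin);
    glutPostRedisplay();
    glutSetWindow(outputWin);
    glutPostRedisplay();
}

int main(int argc, char** argv){
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);

    glutInitWindowPosition(0, 0);
    glutInitWindowSize(800, 600);
    controlWin = glutCreateWindow("control");
    glutDisplayFunc(drawControl);

    glutInitWindowPosition(1280, 0); // made-up x offset of the second display
    glutInitWindowSize(1024 * 6, 768);
    outputWin = glutCreateWindow("6x1 output");
    glutDisplayFunc(drawOutput);

    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}

Whether the second window actually ends up on the other card is of course exactly the problem in this thread; GLUT only lets you pick desktop coordinates, not a device.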
But GLUT can be a little limited, and it may get difficult to have proper control over windows on different cards.
We had a very similar setup using three graphics cards and ended up using a modified version of GHOST on Linux. So you could set the monitor for each window, and then the system internally created and switched between contexts as needed. Depending on the time you have and the level of control you need, this might be worth checking out. I can share the code, since I plan to include it within ofxFenster at some point.
I don’t really know, however, if there is anything in there to use different screens. If the Windows API is somewhat similar to X11, it’s probably possible to hack it in within half a day or so. On Linux it was mostly replacing some function arguments, like the display on which to create the window, with a custom variable. The GHOST code is pretty understandable, and ofxFenster wraps it for easy use. It’s still a risk, though, if your current solution does somehow work.
Are you trying to do this with one OF app?
If so you will def have issues with one app spanning two cards. If you are trying to do one window for control and one 6x1 fullscreen window for output, then I would suggest:
a) Checking out Memo’s ofxCocoa project or whatever his latest Cocoa code is at.
b) Actually just having two separate OF apps and communicating with the fullscreen app over OSC.
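For option b), a rough sketch of the two-app OSC wiring with ofxOsc; the address, host, and port are made up, and the pointer form of getNextMessage() matches the OF API of this era.

// in the control app
#include "ofxOsc.h"

ofxOscSender sender;

void testApp::setup(){
    sender.setup("127.0.0.1", 9000); // assumed host/port; point this at the output machine
}

void testApp::keyPressed(int key){
    ofxOscMessage m;
    m.setAddress("/scene/next"); // hypothetical address
    m.addIntArg(key);
    sender.sendMessage(m);
}

// in the fullscreen 6x1 output app
ofxOscReceiver receiver;

void testApp::setup(){
    receiver.setup(9000); // same port as the sender
}

void testApp::update(){
    while(receiver.hasWaitingMessages()){
        ofxOscMessage m;
        receiver.getNextMessage(&m);
        if(m.getAddress() == "/scene/next"){
            // react to the control app here
        }
    }
}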
Also check you have extended desktop set to on in GLUT preferences ( OS X )
Also try setting the window position of the fullscreen app to the second display before setting it to go fullscreen. Fullscreen might only work on the primary display though.
Is this a Windows project? If so you can do fake fullscreen really easily ( I just did it for an installation that uses Windows 7, which doesn’t support horizontal span ). Just do ofSetWindowPosition(0,0) and ofSetWindowShape(1024*6, 768). You might need to do this every frame but it doesn’t slow the app down at all.
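To make that concrete, a minimal sketch of the fake-fullscreen trick, using the sizes from this thread:

void testApp::update(){
    // re-assert position and shape every frame; the window then behaves like
    // a fullscreen surface across the whole spanned desktop
    ofSetWindowPosition(0, 0);       // or e.g. 1024*4 to start on the second card
    ofSetWindowShape(1024 * 6, 768); // total 6x1 resolution
}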
[quote]If so you will def have issues with one app spanning two cards.[/quote]
I think that’s only partially true. At least on Linux, and I guess on Windows and Mac as well, it is actually a pretty straightforward process to have an app span multiple cards if you’re willing to abandon GLUT.
@theo this is a Windows project, so the Cocoa stuff is out unfortunately. I’m not trying to span both cards, but just to allow the GLUT window to even work on the second card. As it stands, GLUT seems to only work on whatever is set as the primary monitor, so when the big display is secondary, nothing works on that display.
Even putting in the fullscreen trick and having it move to the second display, nothing renders.
So far @underdoeg’s GHOST thing seems most promising. Is there any way to specify that the context needs to be attached to the secondary monitor?
I’ll list the major changes needed to get the setup working on Linux. I guess most functions are similar and in the same place in GHOST, so maybe it saves you some time.
Because the display argument is NULL, it uses the default display. But you can pass a string instead with the address of the output you want to use, e.g. “1:0” for monitor 1, desktop 0. So I changed m_display to a vector of displays filled with all outputs.
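Outside of GHOST, the underlying X11 call looks roughly like this; the display string is the one from the post above, and the right address depends on how your X server is configured:

#include <X11/Xlib.h>
#include <cstdio>

int main(){
    // XOpenDisplay(NULL) falls back to the DISPLAY environment variable,
    // i.e. the default display; an explicit string targets a specific output
    Display* dpy = XOpenDisplay("1:0");
    if(dpy == NULL){
        std::fprintf(stderr, "could not open display\n");
        return 1;
    }
    std::printf("screens on this display: %d\n", ScreenCount(dpy));
    XCloseDisplay(dpy);
    return 0;
}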
I was looking for the multi-monitor performance setting explanation for NVIDIA cards; the last post hints that multiple displays don’t work when in SLI mode ( though this is an older post ).
PS: you might want to look at GLFW. We had a wrapper for it ( I think arturo might still have it ) and it does almost everything GLUT does. It should be easy with the swappable renderer.
Yes, I also read that SLI does not support using one card as a monitor output when enabled. But those were older posts as well…
I also did some further research, and it looks like GHOST will handle the contexts automatically. A desktop on Windows is always spanned, even across different cards, so to place the window on card two it should be sufficient to just place it at something like x = 1024*4.
[quote=“underdoeg, post:12, topic:7442”]
Yes, I also read that SLI does not support using one card as a monitor output when enabled. But those were older posts as well…[/quote]
As far as I know, SLI is not an issue; there’s no SLI setup in the system in question.
[quote=“underdoeg, post:12, topic:7442”]
A desktop on Windows is always spanned, even across different cards…[/quote]
Even on Windows 7, where there’s no span mode anymore?
This looks really promising. I didn’t get a chance to test any of the suggestions since we had to go live so quickly, but this is something I want to have a solution for going forward, so I’ll follow up when I have a chance to test. At least there is a wealth of resources now for looking into the issue.
I just ran into this with Windows 7. I needed to do 4x 1920x1080 fullscreen over two heads of one card with two DualHead2Gos.
The easy solution was to have the app non-fullscreen and set the window position to 0,0 and the window shape to the fullscreen size every frame. It looked exactly like a fullscreen setup and the fps didn’t seem affected.
With the ofxDisplayManager, you can get an array of all available displays.
Then you can pass a display pointer to the ofxFensterManager as the one to use when the next window is created.
OpenGL contexts on the same screen are shared automatically. On a different screen a new one is created.
Functions like ofGetWidth() or ofGetMouseX() should still work as expected.
That way the handling is pretty easy. It can still become tricky, though, when it comes to managing resources. For each context you have to send the textures, fonts, etc. to the right graphics card. But you can activate the right one with ofxFenster::setActive().
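Pulling those pieces together, the flow would look something like this. Apart from getDisplays() and setActive(), which are mentioned in this thread, the method and type names here are my assumptions, so check them against the addon:

#include "ofxFensterManager.h"
#include "ofxDisplayManager.h"

void testApp::setup(){
    // list all available displays, as described above
    vector<ofxDisplay*> displays = ofxDisplayManager::get()->getDisplays();

    // hypothetical setter: choose the display the next window is created on
    ofxFensterManager::get()->setActiveDisplay(displays[1]);
    ofxFenster* win = ofxFensterManager::get()->createFenster(0, 0, 1024 * 6, 768);

    // activate that window's context before uploading textures, fonts, etc.,
    // so the resources land on the right card
    win->setActive();
}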
My setup is now a desktop PC with a Rampage II Extreme motherboard and two NVIDIA graphics cards (GT440 and GT460) running Ubuntu 11.04. X11 is configured to use every monitor as a separate desktop.
Too bad it’s not working on Windows and Mac yet. Though I guess we should be able to port this somehow? But for me it’s not the highest priority right now. I think this is something you need very rarely, and usually within a larger project. So in case of an emergency you can always install Linux and still use Mac or Windows for developing. (The code should work, but ofxDisplayManager::get()->getDisplays(); will only return the main display.)