External Display support for iOS

hi guys,

I’ve added support for displaying OF apps on external displays from the iPhone/iPad, over VGA and AirPlay.
There are some cool possibilities for gaming, and also for using iOS devices for live performance visuals.

For those that are GitHub savvy, you can test the feature using this branch.

there is an iosExternalDisplayExample included.
You can test in the Simulator by selecting Hardware => TV Out => and choosing an external display.
To test on a device, connect to a TV/projector using an Apple VGA Adapter, or connect to your Apple TV using AirPlay. Here is how => http://support.apple.com/kb/HT5209?viewlocale=en-US

please post if you come across any issues.


Awesome! I’m looking forward to trying this in the next day or so. I can report back how it works. Do I need to check out the whole OF or can I just take the ofxiPhone addon?

You’ll need to check out the develop OF branch.
Let me know how you go, and if you have any suggestions.


I noticed a couple things so far.

  1. The popup to pick the external display comes up twice. So you select an option and then it pops up again.

  2. I can’t seem to open the xib files in Interface Builder. They’ll only open as text. None of the other xib files I have do this, so it must be something with the way they were saved.

Is there a way to keep touch interaction going while the external display is on so that we can interact with the external (or both) displays?

hi Seth,

thanks for testing.

  1. I had that issue before but have since fixed it. Are you running the latest develop branch?
  2. You can right-click the xib file in Xcode and open it with Interface Builder or as source.

There is a way of routing touch events from the iOS screen to your OF app on the external display.
You will have to create a UIView which you add to the device screen while OF is displaying on the external screen. You can then override the touch event handlers and re-route them to ofxiPhoneGetViewController().glView.

I had to do this recently, and I imagine it will be a common thing to do, so I’ll add it to the example soon.
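To sketch the idea (a minimal version; the class and file names here are made up, and it assumes the `ofxiPhoneGetViewController()` helper and its `glView` property mentioned above):

```objc
// TouchForwardView.mm -- hypothetical overlay view that sits on the
// device screen and forwards its touches to the OF glView, which is
// attached to the external screen
#import <UIKit/UIKit.h>
#import "ofxiPhoneExtras.h"

@interface TouchForwardView : UIView
@end

@implementation TouchForwardView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [ofxiPhoneGetViewController().glView touchesBegan:touches withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [ofxiPhoneGetViewController().glView touchesMoved:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [ofxiPhoneGetViewController().glView touchesEnded:touches withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [ofxiPhoneGetViewController().glView touchesCancelled:touches withEvent:event];
}
@end
```

Add an instance of this view to the window on the device screen while OF is displaying externally. Note the touch coordinates arrive in the overlay view’s space, so you may also need to scale them to the external display’s resolution.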


I checked out the latest develop branch from your OF fork. So if that’s the one you mean, then yeah, I still had the double popup happening with it. I’ve experienced that before in another app of mine, but I don’t remember how I fixed it.

I agree about the touch events being a common thing. I see the following use cases for people:

  1. Static Image/Button on device, OF on External Display (like the current example)
  2. Touch interaction on device, OF on external display (adding touch events to current example for external display)
  3. 2 OF views - 1 touch/OF display on device AND one OF display on external display

I imagine in the majority of cases, having at least touch interaction sent to the external display (or both displays) would be standard.

Can you just double-check for me (excuse the pun):
when pressing down, do you receive 2 touchDown events instead of one?
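A quick way to check is to count the events in the example’s testApp (a sketch; drop it into testApp.mm):

```cpp
// testApp.mm -- count touchDown events to see whether each
// physical press fires the handler once or twice
void testApp::touchDown(ofTouchEventArgs & touch){
    static int count = 0;
    count++;
    ofLogNotice() << "touchDown #" << count << " id: " << touch.id;
}
```

If one press logs two consecutive lines, the event really is being delivered twice.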

What version of Xcode are you running?
I was saving the xib with the latest Xcode (4.3.2), so maybe that’s some kind of backward compatibility issue…

On the 3rd point, you mentioned having two OF apps running, one on the device screen and another on the external display. Unfortunately this is not possible, as OF only supports one testApp at a time, and the OpenGL view can only exist on either the external display or the device screen. It cannot span both.


Well, the double popup occurs even without pressing a button. For example, if the app loads with an external display already mirroring, then it’ll pop up twice when the app loads. I think it’s an issue with the alert view allocation.

I fixed the presentExternalDisplayPopup() double popup by first doing:
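One likely shape for that fix (a sketch, not the poster’s actual code; `externalDisplayAlert` is an assumed member variable, and the alert title/buttons are placeholders) is to guard the allocation so only one alert can be up at a time:

```objc
// sketch: keep a reference to the alert and bail out if one is
// already showing, so the popup cannot be allocated twice
- (void)presentExternalDisplayPopup {
    if (externalDisplayAlert != nil) {
        return; // already showing, don't create a second alert
    }
    externalDisplayAlert = [[UIAlertView alloc] initWithTitle:@"External Display"
                                                      message:@"How would you like to use it?"
                                                     delegate:self
                                            cancelButtonTitle:@"Cancel"
                                            otherButtonTitles:@"External Display", @"Mirror", nil];
    [externalDisplayAlert show];
}

- (void)alertView:(UIAlertView *)alertView clickedButtonAtIndex:(NSInteger)buttonIndex {
    externalDisplayAlert = nil; // allow the popup to be shown again later
}
```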


It does appear that the touchDown event (and touchUp event) is triggered twice, which is probably why it’s calling the other alerts multiple times.

That’s unfortunate about not being able to do two OF apps. I could see a use case where people draw controls on the device and then use the external display for the main content. We could still use UIKit for the device screen, but then we’d miss out on OF. Would we have to have 2 testApps? Couldn’t we just have one with multiple glViews?

Got the example working, but I have a related question -

My app uses microphone input. When I add audio input to your example, I get an error:

Undefined symbols for architecture armv7:  
  "__ZN7testApp7audioInEPfii", referenced from:  
      __ZTV7testApp in testApp.o  
ld: symbol(s) not found for architecture armv7  
clang: error: linker command failed with exit code 1 (use -v to see invocation)  

I guess we lose audio input when we connect the HDMI VGA adapter. Is there a way around this?

hi josh,

How are you adding audio input?
Can you paste some code?


Sure, this is the first line that gives me trouble:

void audioIn( float * input, int bufferSize, int nChannels );  

So to clarify, I haven’t added anything else to your example. When I add the above line to testApp.h, I get the error. When I remove it, the error goes away. Any help is appreciated. Thanks!
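For reference: that mangled symbol demangles to `testApp::audioIn(float*, int, int)`, and "referenced from __ZTV7testApp" means the class’s vtable needs a definition for it. Declaring a method in testApp.h without defining it anywhere produces exactly this linker error, so the error itself isn’t related to the adapter at all. A minimal sketch of the fix:

```cpp
// testApp.mm -- the declaration in testApp.h needs a matching
// definition, otherwise the vtable references an undefined symbol
void testApp::audioIn(float * input, int bufferSize, int nChannels){
    // process the incoming samples here
}
```

You’ll still need to set up sound input (e.g. via ofSoundStreamSetup) for samples to actually arrive; whether the mic remains available with a given adapter attached is a separate question.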



What’s the easiest way to enable mirroring in the app? I’ve looked at the example, but I don’t want an interface where people can select an external display; I just want it to mirror if someone turns mirroring on in iOS.

So far I have:
class testApp : public ofxiPhoneApp, public ofxiPhoneExternalDisplay {

(.cpp testapp setup)


What else do I need to do?
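Assuming the branch’s ofxiPhoneExternalDisplay exposes static calls along these lines (the names `mirrorOn()`, `isExternalScreenConnected()` and the `externalDisplayConnected()` callback are my guesses; check ofxiPhoneExternalDisplay.h in your checkout for the exact API), the setup could look like:

```cpp
// testApp.mm -- sketch only; verify the method names against
// ofxiPhoneExternalDisplay.h in your branch
void testApp::setup(){
    // if a screen is already attached (cable plugged in, or AirPlay
    // already on), start mirroring instead of presenting a picker
    if(ofxiPhoneExternalDisplay::isExternalScreenConnected()){
        ofxiPhoneExternalDisplay::mirrorOn();
    }
}

// callback for when a display appears after the app has launched
void testApp::externalDisplayConnected(){
    ofxiPhoneExternalDisplay::mirrorOn();
}
```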


Also, just testing out the extended screen example running on the iPad 3.

When I start AirPlay in iOS, select my destination, then tick Mirror, the example app pops up asking whether I want to use 1280x720 or the preferred mode.

The app on the device goes white and the app display comes up on the AirPlay device. This isn’t mirroring though.

Also, no touch points show up on the device.

Any ideas? Thanks!

Thanks much for this feature!

One question: must I use a device that supports AirPlay mirroring (iPhone 4S etc…) to show my app content on an external display? It seems older devices only stream audio.

Chris, I believe that’s a bug. The first time you select ‘mirroring’ it’ll do ‘external display’ instead of mirroring. The second time you press it, it’ll do mirroring though.

I’m curious if there’s a method for doing two OF instances, or how we could try to make it possible, since it’d be highly useful to keep drawing on the device screen while also drawing on the external screen. What’s the conflict keeping that from happening?

Hi guys

I realise this thread is a few years old but I wanted to ask a question about this feature.

Are external displays supported in OF for iPad? I notice that the definitions and declarations for the external display features are enclosed in #ifdef __IPHONE_4_3. Does this mean it is only supported for iPhone development?


Hi again

Sorry, I realised that __IPHONE_4_3 is the version number of the iOS SDK for both iPhone and iPad. But I have another question about using external displays.

I’ve run the external displays test example and it mostly works. The display shows up perfectly on the external screen, but the display goes blank on the iPad. When I hacked this example into my own app, I noticed I’m not receiving touches and the orientation is wrong.

Are there any known problems with this part of openFrameworks for iOS?