video with alpha channel

Hi,
Is there a way to play a video with an alpha channel on iOS?
I succeeded on the Mac with AlphaVideoPlayer, but I couldn't find any clues on how to port it to iOS.
Thanks

As far as I know it's not possible.
The ffmpeg library can play alpha video on iOS, but its licensing does not pass Apple's approval process.

There are some codecs with which it's possible, but they are not really realtime, especially not on iOS.

There's a simple workaround I used once: I put the alpha channel, converted to grayscale, next to the video frame. So we had a double-width video that was rendered with a shader, which assembled the final frame with the alpha channel.

How did you do it? Shaders aren't available in openFrameworks on iOS, since it implements OpenGL ES 1 (not ES 2). I've been working on this for a few days and haven't found any solution… Thanks.

There is actually a fork of OF that is GLES2. I have not used it much, so you would be on your own here: https://github.com/andreasmuller/openFrameworks/tree/develop-opengles2

Thanks a lot :slight_smile: Just another question: would it be possible, and a good idea, to do it with OpenCL?

I've never used shaders. Could you share some parts of the code for this workaround with us?

It seems that AVAnimator could also do the job: http://www.modejong.com/AVAnimator

It was not for iOS, but for a desktop app. I'm not experienced with iOS development, so it might not work if shaders are not supported there. The shader is quite simple: it expects a rectangular texture where the left half contains the image and the right half contains the alpha channel converted to grayscale. The texture size is passed to the shader.
This is the fragment shader, which assembles the image and the alpha data:

uniform sampler2DRect tex;   // the double-width frame: color left, gray alpha right
uniform vec2 size;           // full texture size in pixels

void main()
{
    vec2 uv  = gl_TexCoord[0].st;
    // the color comes from the left half of the frame
    vec2 cuv = vec2( uv.s / 2., uv.t );
    // the matching alpha, stored as gray, sits in the right half
    vec2 auv = vec2( size.x / 2. + cuv.s, uv.t );
    vec4 color = texture2DRect( tex, cuv );
    color.a = texture2DRect( tex, auv ).r;
    gl_FragColor = gl_Color * color;
}

The vertex shader is a pass-through:

void main()
{
    gl_FrontColor = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}

Hope this helps.
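
On the application side, a minimal openFrameworks sketch to drive these shaders could look like the one below. This is only a sketch, assuming a desktop GL context with rectangle textures and the current ofVideoPlayer/ofShader API; the movie and shader file names are illustrative. The quad is drawn at half the movie width, since the right half of each frame only carries the alpha data:

// members in ofApp.h: the double-width movie and the shader pair above
ofVideoPlayer video;
ofShader shader;

void ofApp::setup(){
    video.load("sidebyside.mov");            // hypothetical file name
    video.play();
    shader.load("alpha.vert", "alpha.frag"); // the two shaders above
}

void ofApp::update(){
    video.update();
}

void ofApp::draw(){
    ofEnableAlphaBlending();
    shader.begin();
    shader.setUniformTexture("tex", video.getTexture(), 0);
    shader.setUniform2f("size", video.getWidth(), video.getHeight());
    // draw at half the source width: the shader remaps the texture
    // coords into the color half and the alpha half of the frame
    video.getTexture().draw(0, 0, video.getWidth() / 2, video.getHeight());
    shader.end();
}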

Hi Talaron, and all!

I'm doing a project on iOS and I need to use videos with transparency too!
Did you find any workaround for your problem? Is it possible to write the code for that?

As an alternative to using a shader, you could look into an advanced blending mode; see http://stackoverflow.com/questions/5097145/opengl-mask-with-multiple-textures.

Also, you should be able to do the same thing makuz described with a side-by-side video, and to avoid any CPU processing, just adjust the texture coords when you draw each part.

Let me know if this makes sense, and if not I can whip up some pseudo code.

Hi Tim!

Thanks a lot for the answer.
So I already have the video exported from After Effects with no background. When I play it with the openFrameworks player, the areas that are supposed to have no background are black.

How can I use the advanced blending mode to draw the video with no background? Can you post some pseudo code?
Thanks a lot.
J

I tried actually implementing the method I described, but it isn't working. I'm doubtful that method can work at all, though it sounded plausible and apparently someone on Stack Overflow got it working somehow. After looking into the details of that method, I've found that the blend function won't work that way: no matter what you set the source and destination factors to, there seems to be no way it will ever use the source RGB color when computing the destination ALPHA. You can obviously use source alpha to affect the destination color, but not the other way around. I've also tried writing to the alpha channel of an FBO using an RGB texture, but no dice, because the alpha channel of an RGB texture is always 1.

Again, I'm not sure how the Stack Overflow example I linked to earlier is supposed to work, because from everything I've read about glBlendFunc, it shouldn't.

Sorry for the confusion, but the best way would probably be to combine the RGB and alpha on the CPU into a single texture, if you can't get RGBA videos to work natively.
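
To illustrate that, here is a rough sketch of the CPU combining, assuming the same side-by-side layout makuz described (left half RGB, right half grayscale alpha) and the current ofPixels API. It is deliberately simple; the per-pixel getColor/setColor calls are slow, and a raw pointer loop would be much faster in a real app:

// members: the double-width movie, the combined frame, and its texture
ofVideoPlayer video;
ofPixels rgba;
ofTexture tex;

void ofApp::update(){
    video.update();
    if(video.isFrameNew()){
        ofPixels & src = video.getPixels();  // RGB pixels, double width
        int w = src.getWidth() / 2;
        int h = src.getHeight();
        if(!rgba.isAllocated()){
            rgba.allocate(w, h, OF_PIXELS_RGBA);
        }
        for(int y = 0; y < h; y++){
            for(int x = 0; x < w; x++){
                ofColor c = src.getColor(x, y);    // color from the left half
                c.a = src.getColor(x + w, y).r;    // alpha from the right half
                rgba.setColor(x, y, c);
            }
        }
        tex.loadData(rgba);   // upload the combined RGBA frame
    }
}

void ofApp::draw(){
    ofEnableAlphaBlending();
    tex.draw(0, 0);
}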

Hi again,

Still stuck with this issue. I've been using the new ofQTKitPlayer class from the new 0072 release, which can play videos with an alpha channel on the desktop, but it can't be used on iOS. Any way to port it to iOS?

Has anyone managed to get ARGB videos working on iOS?

While searching on Google I found AVAnimator and this tutorial:

http://emilytoop.com/2012/05/16/playing-movies-with-an-alpha-channel-on-the-ipad/

The author uses AVAnimator to play ARGB videos. I've tried copying the code into an emptyExample project, but I get 6 syntax errors; I'm not sure if I can mix Objective-C into an openFrameworks project directly!? Can anyone help me with this? It seems pretty simple, I just find Objective-C extremely confusing.

Thanks in advance
JF

Hi guys

My name is Mo and I created the AVAnimator library that a couple of people mentioned in this thread. I think you will find that the newer 2.0 release compiles without problems and provides the support for video with an alpha channel that you are looking for.

http://www.modejong.com/AVAnimator/

The h.264 plus alpha channel implementation uses the hardware decoder available on iOS devices; it is described here:

http://www.modejong.com/blog/post4-h-264-video-with-an-alpha-channel/index.html

This library does not depend on a specific OpenGL ES version or any openframeworks code. It is just plain C and Objective-C code built on top of iOS APIs. Lots of example projects can be found at the main URL above.
