Hi!!
Does anyone know if it’s possible yet to play video on the iPhone??
Thanks to all!!
Hi, ofVideoPlayer is currently not functional in the iPhone version of OF; however, you can play video using the Cocoa iPhone SDK. You can use the MPMoviePlayerController Objective-C class to load and play movies. However, you do not get pixel-level access to the data as you would with ofVideoPlayer.
Thanks for your fast reply memo!!
Memo, I’ll need to be using video for my iPhone app. I’m not sure, however, how to implement the code from the reference page you provided using openFrameworks. Can I just start mixing obj-C into my testApp.cpp? Do I need to prototype anything in the header first? I saw on the Wiki that I need to change my files to .mm extensions to get obj-C to work.
Can you just explain the steps of how I could use the MPMoviePlayerController class in a normal testApp.h and testApp.cpp? I’m just not sure where to start…
Thanks!
…aaaand I just found this-thread. I think that pretty much answers my question, but I’d still like to see a simple example of getting MPMoviePlayerController to work inside some oF code.
It looks like MPMoviePlayerController only plays video in fullscreen mode
(the documentation starts with: “An MPMoviePlayerController object defines a full-screen movie player.”)
Is there any option to play a video (without access to pixels) in an app without it being fullscreen?
I’ve been trying to use the iPhone sdk examples to work some video into my openFrameworks iPhone app, but I’m having trouble figuring out how to declare the right things in my header.
Can someone explain how to work Objective-C into oF code? The Objective-C stuff is pretty straightforward, but I’ve been unsuccessful in getting it to work inside my oF code. For instance, how can I get MPMoviePlayerController to even work in my oF code? Any help would be greatly appreciated…
just rename your .cpp to .mm, or go to your source file properties and select Obj-C++ as the file type. Now the compiler will compile it as an Obj-C++ file - i.e. a source file containing both Obj-C code and C++ code. (I prefer the former method so it’s visible at a glance which files are C++ and which are Obj-C++.) Then you can include all the Cocoa includes you need.
Yes, thanks, memo. I did read that in the other thread as well, and I changed my file extension as you mentioned, but I was still running into problems. I guess I’m not sure what files to even include, as I’ve never used the iphone sdk before. Like I said, I was checking out a simple video example from the sdk, but the problem for me was that there were other GUI elements involved, so I wasn’t sure what I would need just for the video to work. In other words, the sdk example is a full iphone app, so I wasn’t sure which files I could leave out, and which ones I need just to run the video code.
Again, if someone is using video in their ofxiphone app, I would love to just have some code to go off of. If I see it done once I’ll get it, but I’m a windows developer who’s never touched objective-c. I’m finally starting to get the hang of XCode, but it’s a slow process. That’s why I think if I can just see it done, I’ll get it…
Nochin did you get it to work yet? I managed to get a simple version of it going by using the code from the link that memo posted. This is what I used:
NSURL *myMovieURL = [NSURL URLWithString:@"http://webapp-net.com/Demo/Media/sample_iPod.m4v"];
MPMoviePlayerController* theMovie = [[MPMoviePlayerController alloc] initWithContentURL:myMovieURL];
theMovie.scalingMode = MPMovieScalingModeAspectFill;
theMovie.movieControlMode = MPMovieControlModeHidden;
[theMovie play];
Just make sure to add the following include to your testApp.h file:
#include <MediaPlayer/MediaPlayer.h>
and add the MediaPlayer.framework to the “libs/core/core frameworks” folder in the Xcode sidebar.
I’m still a total n00b (both with openFrameworks and Objective-C) so I haven’t figured out how to add the playback-finished callback, so if anyone has any advice on how to get that working it would be greatly appreciated.
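For what it’s worth, MPMoviePlayerController reports the end of playback via an NSNotification rather than a delegate method, so you need a small Obj-C object to receive it. A minimal sketch follows - MoviePlayerDelegate is a made-up helper class name (not part of the SDK or oF), and the surrounding retain/release handling is just one plausible arrangement:

```objc
// Hypothetical helper class whose only job is to catch the "playback finished" notification.
@interface MoviePlayerDelegate : NSObject
- (void)moviePlaybackDidFinish:(NSNotification *)notification;
@end

@implementation MoviePlayerDelegate
- (void)moviePlaybackDidFinish:(NSNotification *)notification {
    MPMoviePlayerController *player = [notification object];
    // stop listening, then balance the alloc from wherever the movie was created
    [[NSNotificationCenter defaultCenter] removeObserver:self
        name:MPMoviePlayerPlaybackDidFinishNotification object:player];
    [player release];
}
@end

// ...then, after creating theMovie and before calling [theMovie play]:
[[NSNotificationCenter defaultCenter] addObserver:delegate
    selector:@selector(moviePlaybackDidFinish:)
    name:MPMoviePlayerPlaybackDidFinishNotification
    object:theMovie];
```

Here `delegate` would be an instance of the helper class that you keep alive (e.g. as a member of testApp) for as long as the movie is playing.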
Hey! Great post! I’ll be looking forward to trying some of that code out. I’ve been busily getting knee-deep in the iPhone SDK. I figured if I’m going to take the trouble to learn enough objective-c to use the media stuff from the SDK, I might as well learn how to use the rest of the SDK. I’ll post some code too if I get any video working in oF.
I tried putting something together for a local file, but I can’t seem to get it to work. The MoviePlayerController launches the movie, and the simulator switches to landscape, but instead of playing the movie, I’m just getting a static Quicktime logo. There are no errors and the console is clean as well. I think I might just be missing something. Any ideas? Here’s the code I put in testApp::setup():
bundle = [NSBundle mainBundle];
NSString *path = [bundle pathForResource:@"theMovie" ofType:@"mov"];
myMovieURL = [NSURL fileURLWithPath:path];
theMovie = [[MPMoviePlayerController alloc] initWithContentURL:myMovieURL];
theMovie.scalingMode = MPMovieScalingModeAspectFill;
theMovie.movieControlMode = MPMovieControlModeHidden;
[theMovie play];
OK I was just googling around to find an answer to my problem, and apparently .mov files don’t play correctly on the simulator, but they’ll work directly on the iPhone itself. So I went ahead and loaded the .m4v file from the MPMoviePlayerController example and it worked great. So, problem solved. Just be aware…
okay so I have ffmpeg running on the iphone.
it took a lot of tinkering, but before I post a cleaned-up example project I thought I would get all my notes up on the forum so they might benefit others trying to get this working.
A lot of the info came from this thread ( though it is split across a lot of pages )
http://lists.mplayerhq.hu/pipermail/ffm-…-76618.html
I grabbed ffmpeg from the latest svn.
svn checkout svn://svn.ffmpeg.org/ffmpeg/trunk ffmpeg
Edit: also forgot you need to grab this perl file and throw it in /usr/local/bin ( only to compile ffmpeg not to use it ) http://github.com/yuvi/gas-preprocessor
the configure settings I ended up going with were as follows:
edit - update:
removed these non-needed flags from the configure commands:
--enable-neon --enable-pic --disable-shared --enable-static --disable-mmx --disable-iwmmxt
**armv7 ( iphone 3gs ) **
./configure --cc=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc --as='gas-preprocessor.pl /Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc' --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.1.sdk --enable-cross-compile --target-os=darwin --arch=arm --cpu=cortex-a8 --enable-gpl --enable-postproc --disable-debug --disable-stripping --enable-avfilter --enable-avfilter-lavf --extra-cflags='-arch armv7' --extra-ldflags='-arch armv7'
armv6 ( older iphone / ipod )
./configure --cc=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc --as='gas-preprocessor.pl /Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc' --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.1.sdk --enable-cross-compile --target-os=darwin --arch=arm --cpu=arm1176jzf-s --disable-pic --enable-gpl --enable-postproc --disable-debug --disable-stripping --enable-avfilter --enable-avfilter-lavf --extra-cflags='-arch armv6' --extra-ldflags='-arch armv6'
**i386 - worked after editing config.mak and config.h to remove the jack stuff **
./configure --enable-cross-compile --target-os=darwin --disable-neon --disable-pic --enable-postproc --disable-debug --disable-stripping --enable-avfilter --enable-avfilter-lavf --enable-gpl
So basically you now do this process three times - once for each architecture.
The reason we need the i386 build ( I believe, but I might be wrong ) is that it is needed for the simulator. So after building ffmpeg 3 times with these settings ( and copying the .a files to separate folders armv7/, armv6/, i386/ ) we now want to join them into a universal library that contains all the architectures. To do this we use lipo. This is how I used it for the folder structure I had:
lipo -create armv7/libavcodec.a armv6/libavcodec.a i386/libavcodec.a -output universal/libavcodec.a
lipo -create armv7/libavdevice.a armv6/libavdevice.a i386/libavdevice.a -output universal/libavdevice.a
lipo -create armv7/libavfilter.a armv6/libavfilter.a i386/libavfilter.a -output universal/libavfilter.a
lipo -create armv7/libavformat.a armv6/libavformat.a i386/libavformat.a -output universal/libavformat.a
lipo -create armv7/libavutil.a armv6/libavutil.a i386/libavutil.a -output universal/libavutil.a
lipo -create armv7/libpostproc.a armv6/libpostproc.a i386/libpostproc.a -output universal/libpostproc.a
lipo -create armv7/libswscale.a armv6/libswscale.a i386/libswscale.a -output universal/libswscale.a
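To sanity-check the result, you can ask lipo which architectures actually ended up in one of the fat libraries (assuming the universal/ layout above):

```shell
# lists the architectures contained in the fat library;
# you should see armv7, armv6 and i386 in the output
lipo -info universal/libavcodec.a
```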
The reason these use the 3.1 SDK in the configure scripts is that supposedly there are huge speed improvements. This means, though, that the lib will have linking errors if built against a 2.* Xcode project. The minimum deployment target you can use with the 3.1 SDK is 3.0.
To then use ffmpeg in your project you need to add all the libs to your project and change a couple of things in the project settings.
Then to use the library you need to include the header files:
in testApp.h
#ifdef __cplusplus
extern "C" {
#endif
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#ifdef __cplusplus
}
#endif
If you don’t do the extern “C” wrapping you will get linker errors - ffmpeg is a C library, so without it the C++ compiler looks for mangled C++ symbol names that aren’t in the libs.
Then, to test that you managed to set up the ffmpeg stuff correctly, call:
in your testApp.cpp ( or testApp.mm ) setup() function
avcodec_init();
If you don’t get an error - you are good to go!
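As a next step beyond avcodec_init(), here is a rough sketch of opening a movie and pulling decoded frames out of it. It uses the ffmpeg API as it was in the svn tree of the time (av_open_input_file, avcodec_open, CODEC_TYPE_VIDEO and friends have since been renamed in newer ffmpeg), error handling is minimal, and the file name is just an example - treat it as a starting point, not a working player:

```c
// sketch: open a file and decode video frames (2009-era ffmpeg API)
AVFormatContext *fmt = NULL;
av_register_all();                                        // register all formats/codecs once
if (av_open_input_file(&fmt, "fingers.avi", NULL, 0, NULL) != 0) return;
if (av_find_stream_info(fmt) < 0) return;                 // read the stream headers

int videoStream = -1;                                     // find the first video stream
for (unsigned int i = 0; i < fmt->nb_streams; i++) {
    if (fmt->streams[i]->codec->codec_type == CODEC_TYPE_VIDEO) { videoStream = i; break; }
}
if (videoStream == -1) return;

AVCodecContext *ctx = fmt->streams[videoStream]->codec;
AVCodec *codec = avcodec_find_decoder(ctx->codec_id);
if (!codec || avcodec_open(ctx, codec) < 0) return;       // open the decoder

AVFrame *frame = avcodec_alloc_frame();
AVPacket pkt;
while (av_read_frame(fmt, &pkt) >= 0) {                   // one compressed packet at a time
    if (pkt.stream_index == videoStream) {
        int gotFrame = 0;
        avcodec_decode_video2(ctx, frame, &gotFrame, &pkt);
        if (gotFrame) {
            // frame->data now holds the decoded picture (usually YUV);
            // run it through sws_scale to get RGB pixels for an OpenGL texture
        }
    }
    av_free_packet(&pkt);
}
```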
Grab the compiled lib and headers here:
http://www.openframeworks.cc/files/ffmpegLib.zip
Here are some other useful links I found in the process of hooking this up.
A nice video player class needed minimal tweaking to work in OF
Some early ffmpeg to iphone stuff
Example code for decoding frames
OpenCV on iphone
so I have it hooked up into the basis for an ofVideoPlayer
it is a little picky with file formats, but QuickTime movies work well when using Motion JPEG, and QuickTime’s default “export as AVI” works perfectly.
currently these calls are supported:
loadMovie
play
stop
setPause
getPixels
update
draw
currently on an iPhone 3GS I get 50 fps playing fingers.avi, and 30 fps playing fingers.avi while also drawing the circle below.
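Going by the call list above, usage from a testApp would presumably look something like the sketch below. The class name ofxiPhoneVideoPlayer is a guess on my part - check the example project in the zip for the real name:

```cpp
// testApp.mm - hypothetical usage of the experimental ffmpeg-backed player,
// driving it through the calls listed above (loadMovie/play/update/getPixels/draw).
ofxiPhoneVideoPlayer player;   // guessed class name, see the example project

void testApp::setup(){
    player.loadMovie("fingers.avi");   // movie file in the app's data folder
    player.play();
}

void testApp::update(){
    player.update();                               // decode the next frame
    unsigned char * pixels = player.getPixels();   // pixel access, unlike MPMoviePlayerController
}

void testApp::draw(){
    player.draw(0, 0);
}
```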
[attachment=0:2n9f3cbk]IMG_0007.PNG[/attachment:2n9f3cbk]
Grab it here:
http://www.openframeworks.cc/files/006–…-Player.zip
[quote author=“theo”]so I have it hooked up into the basis for an ofVideoPlayer […][/quote]
Hi,
I have grabbed the code you posted here. I am getting a bunch of errors (around 120) while compiling it on Xcode 3.1.4. Is there any restriction on the Xcode version?
Errors - Command /Developer/Platform/iPhoneSimulator.platform/Developer/usr/bin/gcc-4.2 failed with exit code 1
Waiting for Reply
Thanks
Pratik
[quote author=“theo”]okay so I have ffmpeg running on the iphone. […][/quote]
I am a complete Newbie when it comes to this type of stuff, and I have a few questions about your instructions:
Thank you!
Hi Theo,
I want to thank you for your post. I have successfully compiled and linked ffmpeg on the iPhone device thanks to you, and am now learning the API.
I’m curious, were you able to compile libfaac/libfaad/libmp3lame as well? These libraries have totally different configure scripts that don’t take the same arguments.
Any advice is welcome,
Marcus
[quote="I have grab the code you have posted here. I am getting bunch of errors (Around 120) while compiling it on Xcode 3.1.4. Is there any restriction with Xcode version?
[/quote]
I can’t compile it under 3.1, 3.0 or even 2.21.
FFmpeg compiles correctly and works with other code, e.g. ffplay (kind of).
But I get compilation errors, they are mostly in the openframework classes.
Has anyone successfully compiled this? Can you share your Xcode project?
Also, does this player work? So far my efforts with ffmpeg have been dismal; using ffplay from the ffmpeg4iphone project seems to work, but the SDL layer causes the video (if you can call it that) to play as a green, distorted mess.