I’m looking forward to deploying a video installation that blends together many high-resolution video sources.
My dream is to get something like the HAP codec working with the native video player class on Linux.
I use Arch Linux, openFrameworks 0.9.8, and gst-libav 1.10.2-1.
Maybe I am a little naive, but looking at the gst-libav documentation, I thought HAP was supported and would therefore be enabled by default, since the video player is GStreamer based.
But when I try to play such a file on the Linux platform I get:
checking pkg-config libraries: cairo zlib gstreamer-app-1.0 gstreamer-1.0 gstreamer-video-1.0 gstreamer-base-1.0 libudev freetype2 fontconfig sndfile openal openssl rtaudio gl glu glew gtk+-2.0 libmpg123
[ error ] ofGstUtils: startPipeline(): unable to pause pipeline after 5s
[ error ] ofGstUtils: gstHandleMessage(): embedded video playback halted for plugin, module uridecodebin0 reported: Your GStreamer installation is missing a plug-in.
ofGstVideoUtils: update(): ofGstVideoUtils not loaded
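As a sanity check, I tried asking GStreamer directly which element is missing. This assumes gst-libav exposes FFmpeg's HAP decoder under the name avdec_hap, which I have not confirmed:

```shell
# Ask GStreamer whether the HAP decoder element exists
# (element name avdec_hap is an assumption based on gst-libav's avdec_* naming)
gst-inspect-1.0 avdec_hap

# List everything the libav plugin provides, in case the element name differs
gst-inspect-1.0 libav | grep -i hap

# Try decoding the file with a manual pipeline, outside of openFrameworks
gst-launch-1.0 filesrc location=/pathToFile/movieInHap.mov ! qtdemux ! avdec_hap ! videoconvert ! autovideosink
```

If gst-inspect-1.0 reports "No such element or plugin", that would explain the uridecodebin error above.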
Using ffmpeg on the Linux box, I am able to convert a video to the HAP format in a .mov container (compiled ffmpeg-git and added to the pkg):
ffmpeg -i /pathToInputFile/movie.mp4 -map 0:0 -an -c:v hap -b:v 9354.24k -maxrate 18708.48k -bufsize 18708.48k -r 30000/1001 -s 1280x720 -aspect 16:9 -pix_fmt rgba -coder ac -trellis 0 -subq 6 -me_range 16 -b_strategy 1 -refs 3 -sc_threshold 40 -keyint_min 30 -g 60 -qmin 3 -qmax 51 -metadata creation_time=now -sn -y /pathToFile/movieInHap.mov
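To confirm the conversion really produced a HAP stream (ffprobe ships with ffmpeg; the file path is the output of the command above):

```shell
# Inspect the first video stream of the converted file;
# for a HAP encode, codec_name should report "hap".
ffprobe -v error -select_streams v:0 \
        -show_entries stream=codec_name,codec_tag_string \
        -of default=noprint_wrappers=1 /pathToFile/movieInHap.mov
```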
Does anybody have a success story with HAP on Linux, or a workaround to get hardware-decoded textures from a video file?
Thanks a lot!