I am currently working with the Jetson Nano and the Intel RealSense D415 depth camera. I am simply masking out a video with the camera's depth data.
Unfortunately, the video that is supposed to be masked lags badly. With the following code,
the output reports between 15 and 20 FPS, but the actual frame rate is definitely lower. When I use a simple VideoPlayer sketch, the output reports 25 FPS (as defined) but is visibly lagging as well…
Vertical sync is off; with vertical sync on, the frame rate drops by another 5 frames or so…
When I play the video with Totem, it runs smoothly though.
The video is a 1920x1080 H.264 .mov file. The bitrate is quite low.
On my Win10 PC the video lags in Debug mode, but in Release mode it seems to run just fine.
Could it have anything to do with GStreamer?
What is the best way to encode the videos for playback?
Some additional information:
I have encoded several variants of a 30-second video, ranging from 1 kbit/s to 100 kbit/s in H.264 (with different keyframe intervals), plus 2 MPEG videos.
All of them lag, even the very pixelated 1 kbit/s video, and the MPEG videos especially. All of the Jetson Nano's CPU cores are at 100%.
And this is what I don't understand: the Jetson Nano is built for much more demanding computations (and there are hardly any computations happening here).
I would be very grateful if one of you could point me in the right direction.
Bumping was not intended.
After my holidays and some more research, I finally found out what the problem was:
The issue was the YUV-to-RGB conversion, which by default is done on the CPU rather than on the GPU, as @arturo pointed out in some of his posts.
Here is the shader.frag I used to convert the colors; it works like a charm now.
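Roughly along these lines (a sketch rather than the exact file: the uniform names, plane layout, and full-range BT.601 coefficients are assumptions, so adapt them to the planes and range your player actually delivers; it pairs with a standard passthrough vertex shader that forwards texCoordVarying):

```glsl
#version 150

// Placeholder uniforms: bind the Y, U and V planes of the decoded frame here.
uniform sampler2DRect texY;
uniform sampler2DRect texU;
uniform sampler2DRect texV;

in vec2 texCoordVarying;
out vec4 outputColor;

void main(){
    // In I420, the chroma planes are half the luma resolution.
    float y = texture(texY, texCoordVarying).r;
    float u = texture(texU, texCoordVarying * 0.5).r - 0.5;
    float v = texture(texV, texCoordVarying * 0.5).r - 0.5;

    // BT.601 YUV -> RGB, done entirely on the GPU. These coefficients
    // assume full-range YUV; many video files are limited range (16-235)
    // and would need the usual 1.164 * (y - 0.0625) scaling on top.
    vec3 rgb;
    rgb.r = y + 1.402 * v;
    rgb.g = y - 0.344 * u - 0.714 * v;
    rgb.b = y + 1.772 * u;

    outputColor = vec4(rgb, 1.0);
}
```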
Hi @numu, can you tell me which part of the code you replaced with the shader? I am also trying to optimize video performance, and your approach sounds quite promising…
I also tried ofxHapPlayer, which works very well on the desktop, but it fails to run with Emscripten…
But take a look at the ofShader class. Basically, you take the video image, which is YUV, and send it through an ofShader. The shader converts the YUV image into RGB, on the GPU. If you don't do it that way, the colorspace conversion happens on the CPU, and that is painfully slow.
As soon as I am at my computer, I can send you the C++ file.
Hey numu,
thanks.
It would be great if you could show me the code.
I know how to use basic shaders with OF, but I am not sure which part of ofVideoPlayer I have to replace with the shader (i.e. where the YUV-to-RGB conversion happens…).
Hey @Jona!
Unfortunately I can't find the files on my desktop PC. But anyway, it's much simpler than that.
I didn't change anything in the source class. Just use the shader in your sketch together with an FBO, roughly like the sketch below.
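Something like this (untested and from memory: OF_PIXELS_NATIVE support and getTexturePlanes() depend on your OF version and video backend, and the file and uniform names are placeholders matching the shader sketch above):

```cpp
#include "ofMain.h"

// Sketch: the decode stays in YUV, the shader does the YUV -> RGB step
// on the GPU, and the result is cached in an FBO that is cheap to draw.
class ofApp : public ofBaseApp {
    ofVideoPlayer player;
    ofShader yuvToRgb;
    ofFbo fbo;

public:
    void setup(){
        // Ask the backend to hand over the native (YUV) frames instead of
        // converting to RGB on the CPU; support depends on the backend.
        player.setPixelFormat(OF_PIXELS_NATIVE);
        player.load("movie.mov");
        player.play();
        yuvToRgb.load("shader");   // shader.vert + the shader.frag above
        fbo.allocate(player.getWidth(), player.getHeight(), GL_RGB);
    }

    void update(){
        player.update();
        if(player.isFrameNew()){
            auto & planes = player.getTexturePlanes(); // Y, U, V planes
            fbo.begin();
            yuvToRgb.begin();
            yuvToRgb.setUniformTexture("texY", planes[0], 1);
            yuvToRgb.setUniformTexture("texU", planes[1], 2);
            yuvToRgb.setUniformTexture("texV", planes[2], 3);
            planes[0].draw(0, 0);  // full-res quad with matching texcoords
            yuvToRgb.end();
            fbo.end();
        }
    }

    void draw(){
        fbo.draw(0, 0);            // already RGB
    }
};

int main(){
    ofGLWindowSettings settings;
    settings.setGLVersion(3, 2);   // programmable renderer for GLSL 150
    settings.setSize(1920, 1080);
    ofCreateWindow(settings);
    ofRunApp(new ofApp());
}
```

The FBO is only re-rendered when a new frame arrives, so draw() just blits an RGB texture and stays cheap.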
And on the desktop it works really great. Even accurate frame-by-frame rendering seems possible. Sadly not for the web, with ofxEmscriptenVideoPlayer…
Is something similar possible for ofxEmscriptenVideoPlayer?
Edit: I guess the ofxEmscriptenVideoPlayer “problem” is that it transfers the whole frame data from the GPU to the CPU and back on every frame?