The RPi HiQ camera, based on the Sony IMX477 sensor, was recently released; it allows C/CS-mount lenses to be attached. Using Buster and a Pi 4, I've been trying out different approaches with film and CCTV lenses for an upcoming project.
Here are some notes I made along the way, in particular toward finding a solution for frame grabbing on the Pi 4 under Buster, plus a legacy MMAL addon, ofxRaspicam.
Raspberry Pi / Libcamera / OS Buster / IMX477 HiQ Camera
MMAL (Multi-Media Abstraction Layer) is an API layer on top of OpenMAX which interfaces with Broadcom camera chips. On Raspberry Pi OS, the default command-line tools use MMAL, which is available in userland:
When using raspistill or raspivid, two channels are created: one for processing the frame, and another for preview.
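For reference, typical invocations look something like this (standard raspistill/raspivid flags; adjust resolution and timing to taste):

```shell
# Capture a single still to JPEG
raspistill -o still.jpg

# Record 10 seconds of 1080p30 H.264
raspivid -w 1920 -h 1080 -fps 30 -t 10000 -o clip.h264
```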
With raspiraw, only a single channel is made and no processing is done to the raw Bayer pixels, which can be dumped to file or RAM. Afterwards this can be converted into a video, which is how 660FPS videos are achieved.
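A raspiraw capture along these lines dumps Bayer frames straight to RAM (/dev/shm) so writes keep up with the sensor; exact flags depend on your raspiraw build and sensor mode, so treat this as a sketch:

```shell
# Dump raw Bayer frames to RAM for ~1 second (mode 7, numbered output files),
# to be converted into a video afterwards
raspiraw -md 7 -t 1000 -o /dev/shm/out.%06d.raw
```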
This is why it’s also helpful to think of the camera sensor less as a camera, and more as a sensor - where it might run straight into OpenCV for robotics, without touching the GPU.
The Linux kernel also includes drivers for various camera chips, which are loaded via boot/config.txt:
```
dtoverlay=ov5647   # v1 camera
dtoverlay=imx219   # v2 camera
dtoverlay=imx477   # HiQ camera
dtoverlay=adv7282m # for the heads ;)
```
Once a driver is loaded, the MMAL tools (raspivid/-still/-raw) cease to work, but the camera is recognised as a connected webcam of sorts - for example by the default ofVideoGrabber and ofVideoPlayer, which use GStreamer.
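With the kernel driver loaded, the sensor shows up as a V4L2 device, so anything that speaks V4L2/GStreamer can grab from it. A quick sanity check (device numbers may differ on your setup):

```shell
# List V4L2 devices - the sensor should appear as /dev/videoN
v4l2-ctl --list-devices

# Preview via GStreamer (the same path ofVideoGrabber goes through)
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! autovideosink
```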
Camera settings can be directly controlled via the driver:
NB: IMX477 driver is only supported on kernel >= 5.4 (see below)
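For example, with v4l2-ctl (control names vary per driver, so list them first):

```shell
# Show the controls the loaded driver exposes
v4l2-ctl -d /dev/video0 --list-ctrls

# Set exposure and gain - these control names are a guess,
# check them against the list above for your driver
v4l2-ctl -d /dev/video0 -c exposure=1000
v4l2-ctl -d /dev/video0 -c analogue_gain=100
```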
OpenMAX + GPU-acceleration
@jvcleaves did amazing work with ofxOMXCamera and ofxOMXPlayer addons which go deeper into OpenMAX to do HW-accelerated playback and grabbing (textures).
For a time things were good: but since Buster, this path is gone under GLES (though it still works on a Pi 3B+ with Stretch):
As a temporary fix, I made an addon, ofxRaspicam.
It wraps raspicam, itself a wrapper of MMAL, with added modes for the IMX477. On the Pi 4, 1080p at 30FPS is possible, with individual control of framerate, shutter speed, etc. Parameters are broken out into an ofParameterGroup for testing different settings (e.g. via ofxGUI).
The layers of abstraction, though, are less than ideal: via MMAL the preview channel is used as the pixel buffer, where preview is really a second channel meant for metering auto-exposure or displaying on-screen.
On the Pi 4 it's a good way to plug the gap (documentation and examples will be forthcoming), but I will park it once done (due to libcamera, see below).
Kernel >= 5.4 + Libcamera
With the release of the HiQ camera, the RPi Foundation also announced libcamera:
A complex camera support library for Linux, Android, and ChromeOS
Cameras are complex devices that need heavy hardware image processing operations. Control of the processing is based on advanced algorithms that must run on a programmable processor. This has traditionally been implemented in a dedicated MCU in the camera, but in embedded devices algorithms have been moved to the main CPU to save cost. Blurring the boundary between camera devices and Linux often left the user with no other option than a vendor-specific closed-source solution.
To address this problem the Linux media community is collaborating with the industry to develop a camera stack that is open-source-friendly while still protecting vendor core IP. libcamera was born out of that collaboration and offers modern camera support to Linux-based systems, including traditional Linux distributions, ChromeOS and Android.
It’s exciting stuff! A quick look at the library shows it is MMAL-less, communicating directly with the chip registers (plus V4L2 drivers and GStreamer implementations):
Raspberry Pi OS is not yet on the 5.4 kernel, but you can test it out by updating Buster via the guide here (and experimenting with the Qcam app, which is a GUI equivalent to raspistill and raspivid):
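If you'd rather build libcamera yourself, the upstream meson build is straightforward (the binary paths below are from the current tree and may move between versions):

```shell
# Fetch and build libcamera from upstream
git clone git://linuxtv.org/libcamera.git
cd libcamera
meson build
ninja -C build

# List detected cameras, then try the Qt preview app
./build/src/cam/cam -l
./build/src/qcam/qcam
```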
Before seeing the libcamera library I’d planned to start a separate implementation based on raspiraw. This was for a couple of reasons:
a) I want to record CRT screens, which requires syncing to 50/60 Hz via GENLOCK (an old standard that sends 300 mV blips on each scanline refresh)
b) varying contexts of needing pixels, cv Mats, GPU textures or encoding to a codec, and finding the fastest combination of those
c) OF seems to have some Bayer pixel handling already
With libcamera, this seems like the future, and it’s also reassuring to see config files like this:
So I’m planning something along the lines of ofxLibcameraGrabber. If anyone would like to help out, especially with GPU-side: firstname.lastname@example.org
Raspberry Pi Camera - Stamm-Wilbrandt - very thorough notes on everything pi-camera
Raspberry Pi/Camera Streaming – Wiki - streaming guides (nb: netcat approach is FAST)
UV4L for ARM (Raspberry Pi) - UV4L install guide (for WebRTC and localhost GUI)
A Guide to Recording 660FPS Video On A $6 Raspberry Pi Camera - 660FPS guide