Getting Started
To fetch the sources, build and install:
git clone https://git.libcamera.org/libcamera/libcamera.git
cd libcamera
meson setup build
ninja -C build install
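By default Meson installs to /usr/local, so the install step may need elevated privileges. As an alternative to the plain meson setup invocation above, a minimal sketch using the standard Meson --prefix option to install into a user-writable location (the path below is only an example):
meson setup build --prefix=$HOME/.local
ninja -C build install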
Dependencies
The following Debian/Ubuntu packages are needed to build libcamera and its optional components; other distributions may use different package names. See the configuration sketch after the list for enabling or disabling the optional components:
- A C++ toolchain: [required]
- Either {g++, clang}
- Meson Build system: [required]
- meson (>= 0.60) ninja-build pkg-config
- for the libcamera core: [required]
- libyaml-dev python3-yaml python3-ply python3-jinja2
- for IPA module signing: [recommended]
- Either libgnutls28-dev or libssl-dev, openssl
Without IPA module signing, all IPA modules will be isolated in a separate process. This adds unnecessary overhead at runtime.
- for improved debugging: [optional]
- libdw-dev libunwind-dev
libdw and libunwind provide backtraces to help debug assertion failures. Their functionality overlaps: libdw provides the most detailed information, and libunwind is not needed if both libdw and the glibc backtrace() function are available.
- for device hotplug enumeration: [optional]
- libudev-dev
- for documentation: [optional]
- python3-sphinx doxygen graphviz texlive-latex-extra
- for gstreamer: [optional]
- libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
- for Python bindings: [optional]
- libpython3-dev pybind11-dev
- for cam: [optional]
libevent-dev is required to support cam; the following optional dependencies bring more functionality to the cam test tool:
- libdrm-dev: Enables the KMS sink
- libjpeg-dev: Enables MJPEG on the SDL sink
- libsdl2-dev: Enables the SDL sink
- for qcam: [optional]
- libtiff-dev qt6-base-dev qt6-tools-dev-tools
- for tracing with lttng: [optional]
- liblttng-ust-dev python3-jinja2 lttng-tools
- for android: [optional]
- libexif-dev libjpeg-dev
- for lc-compliance: [optional]
- libevent-dev libgtest-dev
- for abi-compat.sh: [optional]
- abi-compliance-checker
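Most of the optional components above map to Meson build options. To inspect which options your tree offers and how the current build is configured, run meson configure on the build directory. The specific option names below (gstreamer, qcam) are assumptions drawn from the package groups above, so check the printed list before relying on them:
meson configure build
meson configure build -Dgstreamer=enabled -Dqcam=disabled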
Basic testing with cam utility
The cam utility can be used for basic testing. You can list the cameras detected on the system with cam -l, and capture ten frames from the first camera and save them to disk with cam -c 1 --capture=10 --file. See cam -h for more information about the cam tool.
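For reference, the commands mentioned above, exactly as they would be typed:
cam -l
cam -c 1 --capture=10 --file
cam -h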
In case of problems, a detailed debug log can be obtained from libcamera by setting the LIBCAMERA_LOG_LEVELS environment variable:
:~$ LIBCAMERA_LOG_LEVELS=*:DEBUG cam -l
Using GStreamer plugin
To use the GStreamer plugin from the source tree, use the meson devenv command. This will create a new shell instance with the GST_PLUGIN_PATH environment variable set accordingly.
meson devenv -C build
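Inside the devenv shell, gst-inspect-1.0 (the standard GStreamer introspection tool) can be used to confirm that the libcamerasrc element from the build tree is found before constructing pipelines:
gst-inspect-1.0 libcamerasrc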
The debugging tool gst-launch-1.0 can be used to construct a pipeline and test it. The following pipeline will stream from the camera named “Camera 1” onto the OpenGL accelerated display element on your system.
gst-launch-1.0 libcamerasrc camera-name="Camera 1" ! queue ! glimagesink
To show the first camera found you can omit the camera-name property, or you can list the cameras and their capabilities using:
gst-device-monitor-1.0 Video
This will also show the supported stream sizes which can be manually selected if desired with a pipeline such as:
gst-launch-1.0 libcamerasrc ! 'video/x-raw,width=1280,height=720' ! \
queue ! glimagesink
The libcamerasrc element has two log categories, named libcamera-provider (for the video device provider) and libcamerasrc (for the operation of the camera). All corresponding debug messages can be enabled by setting the GST_DEBUG environment variable to libcamera*:7.
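For example, the simple display pipeline from above (with the camera-name property omitted to use the first camera found) can be rerun with debug output enabled; only the environment variable is new, the pipeline itself is unchanged:
GST_DEBUG="libcamera*:7" gst-launch-1.0 libcamerasrc ! queue ! glimagesink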
At present, the colorimetry and framerate must be specified as part of the pipeline construction to prevent element negotiation failures. For instance, to capture and encode a JPEG stream and receive it on another device, the following example could be used as a starting point:
gst-launch-1.0 libcamerasrc ! \
video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
queue ! jpegenc ! multipartmux ! \
tcpserversink host=0.0.0.0 port=5000
This stream can be received on another device over the network with:
gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
multipartdemux ! jpegdec ! autovideosink
The GStreamer element also supports multiple streams. This is achieved by requesting additional source pads. Downstream caps filters can be used to choose specific parameters like resolution and pixel format. The pad property stream-role can be used to select a role.
The following example displays a 640x480 viewfinder while streaming JPEG-encoded 800x600 video. You can use the receiver pipeline above to view the remote stream from another device.
gst-launch-1.0 libcamerasrc name=cs src::stream-role=view-finder src_0::stream-role=video-recording \
cs.src ! queue ! video/x-raw,width=640,height=480 ! videoconvert ! autovideosink \
cs.src_0 ! queue ! video/x-raw,width=800,height=600 ! videoconvert ! \
jpegenc ! multipartmux ! tcpserversink host=0.0.0.0 port=5000