Open Projects

Google Summer of Code 2021 Project Ideas

For information about the application process, see the libcamera GSoC application guide.

Multistream support in GStreamer element

Description of the project:

The GStreamer libcamera element allows libcamera to be used in GStreamer pipelines. libcamera supports simultaneous streaming, for example one lower-quality stream for preview while another, higher-quality stream is used for recording. We would like to extend the GStreamer libcamera element to support this multistream capability.
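As a reference point, the element already works for a single stream; a typical pipeline looks something like this (assuming GStreamer and the libcamera element are installed, and a camera is present on the system):

```
# Stream frames from the default camera to a window via the
# libcamera GStreamer element.
gst-launch-1.0 libcamerasrc ! videoconvert ! autovideosink
```

The multistream work would extend the element so that, for example, a preview branch and a recording branch can be fed from the same camera.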

This project requires a Raspberry Pi or Rockchip RK3399, such as a ROCK Pi 4, until vimc supports multiple streams.

Expected results:
A working GStreamer libcamera element that supports multiple simultaneous streams
Confirmed mentor:
Paul Elder
Desirable skills:
Good knowledge of C and C++.

vimc multistream support

Description of the project:
vimc is a driver that emulates complex video hardware, and is useful for testing libcamera without needing access to a physical camera. We would like to add support for multiple simultaneous streams to the libcamera vimc pipeline handler, to ease testing of this mechanism. This also requires adding multistream support to the vimc driver in the Linux kernel.
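If you haven't used vimc before, it can be loaded and inspected on any recent kernel (commands illustrative; the media device number may differ on your system):

```
# Load the virtual media controller driver and print its topology.
sudo modprobe vimc
media-ctl -d /dev/media0 -p
```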
Expected results:
A working vimc driver and pipeline handler that support multiple simultaneous streams
Confirmed mentor:
Paul Elder
Desirable skills:
Good knowledge of C and C++. Some knowledge of V4L2 would also be beneficial.

OpenGL/OpenCL software ISP

Description of the project:

Image Signal Processors (ISPs) implement the functions necessary in an image pipeline to transform a raw image from a sensor into a meaningful image that can be displayed. Examples of such functions are demosaicing (debayering), noise reduction, and color correction. Most ISPs are currently implemented in hardware, so a software-based ISP would be useful for testing and experimentation in the absence of a hardware ISP, and would also serve as an open source ISP implementation.
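To make this concrete, here is a minimal CPU sketch (illustrative only, not libcamera code) of one such function: nearest-neighbor demosaicing of an RGGB Bayer image. A real software ISP would use bilinear or edge-aware interpolation, and the point of this project is to run such kernels on the GPU.

```cpp
#include <cstdint>
#include <vector>

// Nearest-neighbor demosaic of an RGGB Bayer image (illustrative helper).
// Each 2x2 tile [R G; G B] produces four identical RGB pixels built from
// that tile's samples.
std::vector<uint8_t> demosaicRGGB(const std::vector<uint8_t> &bayer,
                                  unsigned width, unsigned height)
{
    std::vector<uint8_t> rgb(width * height * 3);
    for (unsigned y = 0; y < height; y++) {
        for (unsigned x = 0; x < width; x++) {
            unsigned ty = y & ~1u, tx = x & ~1u; /* top-left of the 2x2 tile */
            uint8_t r = bayer[ty * width + tx];
            uint8_t g = bayer[ty * width + tx + 1]; /* one of the two greens */
            uint8_t b = bayer[(ty + 1) * width + tx + 1];
            uint8_t *out = &rgb[(y * width + x) * 3];
            out[0] = r;
            out[1] = g;
            out[2] = b;
        }
    }
    return rgb;
}
```

This is deliberately the crudest possible approach; replacing it with better interpolation, and moving it to OpenCL or an OpenGL compute shader, is exactly the kind of work this project involves.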

This project requires interfacing with a GPU for computation. There are a few ways to do this, such as with OpenCL or OpenGL compute shaders. Thus choosing an API is a required task for this project (it’s best to explore/experiment and choose before the project begins). In addition to being able to use the GPU for computation, the platform that this project will be developed on requires the ability to capture raw Bayer images. This can be done either via vimc (which needs to be fixed first), or with a CSI camera on one of the platforms that libcamera currently supports.

The easiest option is to use the vimc virtual driver on a regular computer. The raw Bayer capture implementation in the vimc driver is however broken at the moment, and isn’t supported yet in libcamera. These issues would need to be addressed first.

The options for OpenCL may be more limited. These include a Raspberry Pi 3B+, which has an unofficial OpenCL implementation; a Rockchip RK3399 device, which might require the closed-source Mali drivers; an i.MX device that has a raw Bayer sensor and OpenCL support; or feeding images manually to a software ISP implementation.

There is also the option of using OpenGL compute shaders instead of OpenCL, which may be supported on a wide range of platforms.

Expected results:
A software ISP that implements a useful subset of ISP functions
Confirmed mentor:
Paul Elder
Desirable skills:
Good knowledge of C and C++. Some knowledge of OpenGL or OpenCL would also be beneficial.

Warmup tasks for Google Summer of Code 2021

These are some ideas for tasks for warming up for Google Summer of Code 2021. Look at the section(s) relevant to the project ideas that you’re interested in.

These warmup tasks are designed to guide your exploration of libcamera. You don’t have to do all of them; they are just guidance for what you might want to look at to help shape your proposal and your project.

Remember that for all of these steps, you are not alone! You can talk to us anytime on IRC or email (although keep in mind that there may be timezone differences).

Communications and logistics

Multistream support in GStreamer element

Intro

  • Build the GStreamer element
  • Use the libcamera GStreamer element in some GStreamer pipeline. Play with it! Have fun!
  • If you see any problems, try to debug and fix them. Don’t worry about completing the fix; just log what you wanted to do, what you tried, and what the results were.

Learning the interface of libcamera and GStreamer

  • Go learn about how GStreamer elements work, read their tutorials, maybe make a mini GStreamer element.
  • Learn how the libcamera public API works, by making your own mini libcamera app (just streaming frames from a webcam)
    • Implement setting controls (doesn’t need to have a fancy UI)
  • Go study the libcamera GStreamer element (if you haven’t already in the intro)
  • Write a unit test (in test/) for the existing GStreamer element (with a single stream) and submit it to libcamera in a patch.
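For the mini-app bullet above, the capture path against the libcamera public API follows roughly this shape (an outline only: it needs libcamera installed to build, error handling is omitted, and the request/completion plumbing is elided):

```
#include <libcamera/libcamera.h>

using namespace libcamera;

int main()
{
    CameraManager cm;
    cm.start();

    std::shared_ptr<Camera> camera = cm.cameras()[0];
    camera->acquire();

    /* One stream is enough for a first app. */
    std::unique_ptr<CameraConfiguration> config =
        camera->generateConfiguration({ StreamRole::Viewfinder });
    camera->configure(config.get());

    FrameBufferAllocator allocator(camera);
    allocator.allocate(config->at(0).stream());

    /* Create a request per buffer, connect camera->requestCompleted to a
     * handler that processes and requeues requests, then: */
    camera->start();
    /* ... queue requests, process completed frames ... */
    camera->stop();

    camera->release();
    cm.stop();
}
```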

Designing the proposal

Now that you know how GStreamer and libcamera work, how would you add support for multistream in the libcamera GStreamer element? Note that this is the proposal that you will submit during the student application phase.

vimc multistream support

Intro

  • Build libcamera with the vimc pipeline enabled
  • Stream frames from vimc with the qcam application and confirm that it works
  • There was previous work on this: https://patchwork.libcamera.org/project/libcamera/list/?series=1127
    • Apply these patches and see what happens. Try to figure out what’s not working and why (you will probably need to modify the patches, as the codebase has changed).
  • This was the previous work on the vimc kernel driver: https://lore.kernel.org/linux-media/20200819180442.11630-1-kgupta@es.iitr.ac.in/
    • Apply these patches to the kernel, and run with the pipeline again. See what happens, and try to figure out what’s not working and why (you will probably need to modify the patches, as the codebase has changed)
    • We highly recommend that you run this in a VM, as there is a risk of crashing your kernel during development.
    • Also study the patches of course. What do they do?

Designing the proposal

Now that you know the previous work, and the goal, how would you add support for multistream in the vimc pipeline handler in libcamera?

OpenGL/OpenCL software ISP

Intro

  • Build libcamera with the raspberrypi, rkisp1, simple, or vimc pipeline enabled
  • Read the project description detailing the hardware dependencies
    • If you have a Raspberry Pi 3, an RK3399 device, or an i.MX device, run cam/qcam on it, and see if you can run OpenCL on it
    • If you have a device that won’t run OpenCL, you can try to run OpenGL compute shaders instead.
    • If you don’t have any of the above devices, then run with vimc
  • If you’re thinking about going the vimc route, fix the vimc raw capture functionality in the vimc kernel driver. This might be a sizable piece of work on its own, so studying the vimc driver and planning out how you would do the fix may be sufficient. See this bug report for more information.
  • Write a standalone OpenCL or OpenGL application which takes an image and applies color gains (or some other ISP function of your choice)
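For the last task above, the color-gain math itself is tiny; the exercise is in the GPU plumbing around it. As a CPU reference (an illustrative helper, not libcamera code), the stage is just a clamped per-channel multiply:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Apply per-channel color gains to an interleaved RGB888 buffer
// (illustrative helper). A GPU version would perform the same multiply
// per fragment (OpenGL) or per work-item (OpenCL).
void applyColorGains(std::vector<uint8_t> &rgb,
                     float gainR, float gainG, float gainB)
{
    for (std::size_t i = 0; i + 2 < rgb.size(); i += 3) {
        rgb[i]     = static_cast<uint8_t>(std::min(255.0f, rgb[i]     * gainR));
        rgb[i + 1] = static_cast<uint8_t>(std::min(255.0f, rgb[i + 1] * gainG));
        rgb[i + 2] = static_cast<uint8_t>(std::min(255.0f, rgb[i + 2] * gainB));
    }
}
```

Once the standalone GPU application produces the same results as a CPU reference like this, you have a solid base for building further ISP stages.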

These exploration tasks should give you a good idea of which platform and route you would want to take.

Learning how an ISP is used

Designing the proposal

Now that you know how an ISP is meant to be used, how would you implement a GPU-based software ISP?

Other training tasks

  • The vimc scaler is currently hardcoded in the kernel driver to multiples of 3. Turn this into a variable-ratio scaler in the driver, and adapt the libcamera vimc pipeline handler accordingly.
  • Implement V4L2 controls and selection rectangles in the vimc driver that libcamera wants in the vimc sensor entity.
  • Another medium-sized task is to support the UVC XU API in the UVC pipeline handler. It requires a Logitech webcam as these are the only ones for which we have XU documentation. The goal would be to expose libcamera controls for XU controls, without going through creating mappings between XU controls and V4L2 controls in the kernel. Here are some resources:
  • Another related task is parsing UVC metadata to generate better timestamps. There’s an implementation of this in the kernel driver, but it’s broken, and it would be better handled in userspace.