workflow: update README and RELEASE

This commit is contained in:
Kamil Trzcinski
2023-05-30 23:58:20 +02:00
parent 6062a1214d
commit 9da4e89403
10 changed files with 398 additions and 278 deletions


@ -49,6 +49,8 @@ jobs:
run: docker create --name deb_make deb_make
- name: Copy files
run: 'docker cp deb_make:/deb/. deb/'
- name: Fix RELEASE.md version
run: sed -i 's|#{GIT_VERSION}|${{ env.GIT_VERSION }}|g' RELEASE.md
- name: 'Release debian files'
uses: ncipollo/release-action@v1
with:
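The `Fix RELEASE.md version` step above can be checked locally; a minimal sketch, using a hypothetical `v0.2.5` tag in place of the workflow's `GIT_VERSION`:

```shell
# stand-in for the checkout: a RELEASE.md containing the template placeholder
printf '# Release #{GIT_VERSION}\n' > RELEASE.md
# hypothetical tag; in CI this comes from the workflow environment
GIT_VERSION=v0.2.5
# the same substitution the workflow step performs
sed -i "s|#{GIT_VERSION}|${GIT_VERSION}|g" RELEASE.md
cat RELEASE.md   # → # Release v0.2.5
```

In the workflow itself, `${{ env.GIT_VERSION }}` is expanded by the runner before the shell sees the command; the effect is the same.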

README.md

@ -1,9 +1,9 @@
# camera-streamer
There's a number of great projects doing UVC/CSI camera streaming
on SBCs (like Raspberry PIs).
This is yet another **camera-streamer** project that is primarily focused
on supporting fully hardware-accelerated streaming of MJPEG
and H264 video streams for minimal latency.
@ -11,290 +11,28 @@ This supports well CSI cameras that provide 10-bit Bayer packed format
from sensor, by using a dedicated ISP of Raspberry PIs.
Take into account that this is a draft project, and is nowhere as complete
and well supported as [awesome ustreamer](https://github.com/pikvm/ustreamer).
This project was inspired by the mentioned ustreamer.
## Install
1. [Use precompiled debian package](https://github.com/ayufan/camera-streamer/releases/latest) (recommended)
2. [Compile manually](docs/install-manual.md) (advanced)
## Configure
1. [Configure resolution, brightness or image quality](docs/configure.md)
1. [See different streaming options](docs/streaming.md)
1. [See example configurations](service/)
## Advanced
This section contains some advanced explanations that are not complete and might be outdated:
1. [High-performance mode via ISP for CSI](docs/v4l2-isp-mode.md)
1. [High-performance mode via direct decoding for USB](docs/v4l2-usb-mode.md)
1. [High-compatibility via `libcamera` on Raspberry PI](docs/raspi-libcamera.md)
1. [Performance analysis](docs/performance-analysis.md)
## License

RELEASE.md

@ -0,0 +1,43 @@
# Release #{GIT_VERSION}
## Variants
Download the correct version for your platform:
- Variant: **raspi**: Raspberry PI compatible build with USB, CSI, WebRTC, RTSP support
- Variant: **generic**: All other platforms, with USB and MJPEG support only for the time being
- System: **bullseye**: Debian Bullseye (11) compatible build
- Platform: **amd64**: x86/64 compatible build
- Platform: **arm32**: ARM 32-bit kernel: PIs 0.2W, 2B, and higher, Orange PIs, Rock64, etc. No support for RPI0.
- Platform: **arm64**: ARM 64-bit kernel: PIs 0.2W, 3B, and higher, Orange PIs, Rock64, etc. No support for RPI0 and RPI2B.
## Install on Raspberry PI or any other platform
Copy the below and paste into a terminal:
```bash
if [[ -e /etc/default/raspberrypi-kernel ]]; then
PACKAGE=camera-streamer-raspi_#{GIT_VERSION}.bullseye_$(dpkg --print-architecture).deb
else
PACKAGE=camera-streamer-generic_#{GIT_VERSION}.bullseye_$(dpkg --print-architecture).deb
fi
wget "https://github.com/ayufan/camera-streamer/releases/download/#{GIT_VERSION}/$PACKAGE"
sudo apt install "$PWD/$PACKAGE"
```
Enable one of the provided systemd configurations:
```bash
ls -al /usr/share/camera-streamer/examples/
systemctl enable /usr/share/camera-streamer/examples/camera-streamer-raspi-v3-12MP.service
systemctl start camera-streamer-raspi-v3-12MP
```
You can also copy an existing service and fine tune it:
```bash
cp /usr/share/camera-streamer/examples/camera-streamer-raspi-v3-12MP.service /etc/systemd/system/camera-streamer.service
edit /etc/systemd/system/camera-streamer.service
systemctl enable camera-streamer
systemctl start camera-streamer
```

docs/configure.md

@ -0,0 +1,102 @@
# Configure
## Resolution
Camera capture and the exposed resolutions are controlled by the following parameters:
- `--camera-width` and `--camera-height` define the camera capture resolution
- `--camera-snapshot.height` - defines the height of an aspect-ratio-scaled resolution for the `/snapshot` (JPEG) output - this might require a rescaler and might not always work
- `--camera-stream.height` - defines the height of an aspect-ratio-scaled resolution for the `/stream` (MJPEG) output - this might require a rescaler and might not always work
- `--camera-video.height` - defines the height of an aspect-ratio-scaled resolution for the `/video` and `/webrtc` (H264) output - this might require a rescaler and might not always work
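The height-only options imply an aspect-ratio scale; the width that results can be sketched with shell arithmetic (an illustration, not the streamer's actual code):

```shell
# capture resolution and a desired output height
cam_w=2304 cam_h=1296 out_h=720
# scale the width by the same factor to keep the aspect ratio
out_w=$(( cam_w * out_h / cam_h ))
echo "${out_w}x${out_h}"   # → 1280x720
```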
### Example: Raspberry PI v3 Camera (best)
```text
libcamera-still --list-cameras
Available cameras
-----------------
0 : imx708_wide [4608x2592] (/base/soc/i2c0mux/i2c@1/imx708@1a)
Modes: 'SRGGB10_CSI2P' : 1536x864 [120.13 fps - (0, 0)/4608x2592 crop]
2304x1296 [56.03 fps - (0, 0)/4608x2592 crop]
4608x2592 [14.35 fps - (0, 0)/4608x2592 crop]
```
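Choosing a mode can also be scripted; a sketch that filters the modes hand-copied from the listing above down to the largest one still running at ≥30 fps:

```shell
# sizes and fps copied from the imx708_wide listing above (smallest first)
best=""
for mode in "1536x864 120.13" "2304x1296 56.03" "4608x2592 14.35"; do
  set -- $mode                  # $1 = size, $2 = fps
  [ "${2%%.*}" -ge 30 ] && best="$1"
done
echo "$best"   # → 2304x1296
```

Since the modes are listed smallest first, the last match wins.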
To get the best resolution from the above camera, `--camera-width` and `--camera-height`
need to be configured with the native resolution of the camera sensor; only then can the
resolution be downscaled.
```text
--camera-width=2304 --camera-height=1296 --camera-snapshot.height=1080 --camera-video.height=720 --camera-stream.height=480
```
This will result in:
- the camera capturing `2304x1296`, the full sensor
- `snapshot` being ~1920x1080
- `video` being ~1280x720
- `stream` being ~640x480
### Example: Raspberry PI v3 Camera with cropped output (bad)
If the camera capture is not configured properly, the resulting image will be cropped.
```text
--camera-width=1920 --camera-height=1080 --camera-snapshot.height=1080 --camera-video.height=720 --camera-stream.height=480
```
This will result in:
- the camera capturing the middle `1920x1080` out of the full `2304x1296` sensor
- `snapshot` being ~1920x1080
- `video` being ~1280x720
- `stream` being ~640x480
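The crop in this bad example can be derived: capturing 1920x1080 out of the 2304x1296 mode keeps only a centered window of the sensor. A quick check of the offsets:

```shell
# centered 1920x1080 crop out of the 2304x1296 sensor mode
sensor_w=2304 sensor_h=1296 cap_w=1920 cap_h=1080
off_x=$(( (sensor_w - cap_w) / 2 ))
off_y=$(( (sensor_h - cap_h) / 2 ))
echo "cropped ${cap_w}x${cap_h} at offset +${off_x}+${off_y}"   # → cropped 1920x1080 at offset +192+108
```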
## List all available controls
You can view all available configuration parameters by adding `--camera-list_options`
to one of the above commands, like:
```bash
tools/libcamera_camera.sh --camera-list_options ...
```
Or navigate to `http://<ip>:8080/option` while running.
Depending on the control, options have to be applied to the camera, ISP, JPEG, or H264 codec:
```bash
# specify camera option
--camera-options=brightness=1000
# specify ISP option
--camera-isp.options=digital_gain=1000
# specify H264 option
--camera-video.options=bitrate=10000000
# specify snapshot option
--camera-snapshot.options=compression_quality=60
--camera-stream.options=compression_quality=60
```
## List all available formats and use the proper one
You can list all available capture formats for your camera:
```bash
v4l2-ctl -d /dev/video0 --list-formats-ext
```
Some of them can be passed to the streamer:
```bash
tools/*_camera.sh --camera-format=RG10 # 10-bit packed Bayer
tools/*_camera.sh --camera-format=YUYV
tools/*_camera.sh --camera-format=MJPEG
tools/*_camera.sh --camera-format=H264 # This is unstable due to H264 key frame support
```
It is advised to always use:
- for `libcamera`: `--camera-type=libcamera --camera-format=YUYV` (better image quality) or `--camera-format=YUV420` (better performance)
- for USB cameras: `--camera-type=v4l2 --camera-format=MJPEG`

docs/install-manual.md

@ -0,0 +1,77 @@
# Install
This describes a set of manual instructions to build and run camera-streamer.
## Validate your system
**This streamer only uses hardware acceleration, and does not support any software decoding or encoding.**
It requires the system to provide:
1. ISP (`/dev/video13`, `/dev/video14`, `/dev/video15`)
2. JPEG encoder (`/dev/video31`)
3. H264 encoder (`/dev/video11`)
4. JPEG/H264 decoder (for UVC cameras, `/dev/video10`)
5. At least LTS kernel (5.15, 6.1) for Raspberry PIs
You can validate the presence of all those devices with:
```bash
uname -a
v4l2-ctl --list-devices
```
The `5.15` or `6.1` kernel is easy to get since these are the LTS kernels for Raspberry PI OS:
```bash
apt-get update
apt-get dist-upgrade
reboot
```
Ensure that your `/boot/config.txt` has enough GPU memory (required for JPEG re-encoding):
```text
# Example for IMX519
dtoverlay=vc4-kms-v3d,cma-128
gpu_mem=128 # preferred 160 or 256MB
dtoverlay=imx519
# Example for Arducam 64MP
gpu_mem=128
dtoverlay=arducam_64mp,media-controller=1
# Example for USB cam
gpu_mem=128
```
## Compile
```bash
git clone https://github.com/ayufan-research/camera-streamer.git --recursive
apt-get -y install libavformat-dev libavutil-dev libavcodec-dev libcamera-dev liblivemedia-dev v4l-utils pkg-config xxd build-essential cmake libssl-dev
cd camera-streamer/
make
sudo make install
```
## Use it
There are three modes of operation implemented, offering different
trade-offs between compatibility and performance.
### Use a preconfigured systemd service
The simplest way is to use one of the preconfigured `service/camera-streamer*.service` units.
Those can be used as examples and edited to fine-tune parameters.
Example:
```bash
systemctl enable $PWD/service/camera-streamer-arducam-16MP.service
systemctl start camera-streamer-arducam-16MP
```
If everything went OK, a web server will be available at `http://<IP>:8080/`.
Error messages can be read with `journalctl -xef -u camera-streamer-arducam-16MP`.

docs/performance-analysis.md

@ -0,0 +1,72 @@
# Performance analysis
## Arducam 16MP
The 16MP sensor is supported by default in Raspberry PI OS after adding it to `/boot/config.txt`.
However, it will support neither auto-focus nor manual focus, due to `ak7535` not being compiled
and enabled in `imx519`. Focus can still be controlled manually via `i2c-tools`:
```shell
# /boot/config.txt
dtoverlay=imx519,media-controller=0
gpu_mem=160 # at least 128
# /etc/modules-load.d/modules.conf
i2c-dev
# after starting the camera, execute the following to control focus with `0xXX`, any value between `0x00` and `0xff`
# RPI02W (and possible 2+, 3+):
i2ctransfer -y 22 w4@0x0c 0x0 0x85 0x00 0x00
# RPI4:
i2ctransfer -y 11 w4@0x0c 0x0 0xXX 0x00 0x00
```
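The two `i2ctransfer` calls above differ only in the bus number; a tiny wrapper (hypothetical, echoing the command instead of touching the bus) makes the pattern explicit:

```shell
BUS=22   # 22 on RPI02W; 11 on RPI4
focus_cmd() {
  # $1 is the focus value, 0x00..0xff
  echo "i2ctransfer -y $BUS w4@0x0c 0x0 $1 0x00 0x00"
}
focus_cmd 0x85   # pipe to sh (or drop the echo) to actually set the focus
```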
According to my tests, latency comes down to the way buffers are enqueued plus processing delay; this is for triple buffering in a case where the sensor is able to deliver frames quickly enough. Depending on how you queue (on which slope) and when you enqueue a buffer, you might achieve significantly better latency, as shown in the ISP mode. `libcamera` can still achieve 120fps, it is just slightly slower :)
### 2328x1748@30fps
```shell
# libcamera
$ ./camera_streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=63/0, processing_ms=101.1, frame_ms=33.1
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=64/0, processing_ms=99.2, frame_ms=31.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=65/0, processing_ms=99.6, frame_ms=34.8
# direct ISP-mode
$ ./camera_streamer -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=30 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=32/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=33/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=34/0, processing_ms=49.7, frame_ms=33.4
```
### 2328x1748@10fps
```shell
# libcamera
$ ./camera_streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=585/0, processing_ms=155.3, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=586/0, processing_ms=155.5, frame_ms=100.2
# direct ISP-mode
$ ./camera_streamer -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=260/0, processing_ms=57.5, frame_ms=99.7
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=261/0, processing_ms=57.6, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=262/0, processing_ms=58.0, frame_ms=100.4
```
### 1280x720@120fps for Arducam 16MP
```shell
# libcamera
$ ./camera_streamer -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=139/0, processing_ms=20.1, frame_ms=7.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=140/0, processing_ms=20.6, frame_ms=8.8
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=141/0, processing_ms=19.8, frame_ms=8.1
# direct ISP-mode
$ ./camera_streamer -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=157/0, processing_ms=18.5, frame_ms=8.4
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=158/0, processing_ms=18.5, frame_ms=8.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=159/0, processing_ms=18.5, frame_ms=8.3
```
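The `frame_ms` column in the logs above converts directly to frame rate (`fps = 1000 / frame_ms`); for example:

```shell
# ~8.3 ms per frame in the 1280x720 ISP-mode logs above
awk 'BEGIN { printf "%.0f fps\n", 1000 / 8.3 }'   # → 120 fps
```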

docs/raspi-libcamera.md

@ -0,0 +1,13 @@
# High-compatibility via `libcamera` on Raspberry PI
This script uses `libcamera` to access the camera and provides
manual and automatic brightness and exposure controls.
The settings for those are configurable via controls (see the configuration docs).
Due to its extra overhead, this has worse latency than direct decoding
via the ISP.
```bash
tools/libcamera_camera.sh -help
tools/libcamera_camera.sh -camera-format=YUYV
```

docs/streaming.md

@ -0,0 +1,34 @@
# Streaming
Camera Streamer exposes a number of streaming capabilities.
## HTTP web server
All streams are exposed over a very simple HTTP server, providing different streams for different purposes:
- `http://<ip>:8080/` - index page
- `http://<ip>:8080/snapshot` - provide JPEG snapshot (works well everywhere)
- `http://<ip>:8080/stream` - provide MJPEG stream (works well everywhere)
- `http://<ip>:8080/video` - provide an automated `video.mp4` or `video.hls` stream depending on the browser used
- `http://<ip>:8080/video.mp4` or `http://<ip>:8080/video.mkv` - provide remuxed `mkv` or `mp4` stream (uses `ffmpeg` to remux, works as of now only in Desktop Chrome and Safari)
- `http://<ip>:8080/webrtc` - provide WebRTC feed
## WebRTC support
WebRTC is accessible via `http://<ip>:8080/webrtc` by default and is available whenever H264 output is generated.
WebRTC support is implemented using awesome [libdatachannel](https://github.com/paullouisageneau/libdatachannel/) library.
The support will be compiled by default when doing `make`.
## RTSP server
The camera-streamer implements an RTSP server via `live555`. Enable it with:
- `--rtsp-port`: enables the RTSP server on port 8554
- `--rtsp-port=1111`: enables the RTSP server on a custom port
The camera-streamer will expose two streams (if low-res mode is enabled):
- `rtsp://<ip>:8554/stream.h264` - high resolution stream (always enabled if an H264 stream is available directly or via encoding)
- `rtsp://<ip>:8554/stream_low_res.h264` - low resolution stream, if low-res mode is configured via `--camera-low_res_factor`

docs/v4l2-isp-mode.md

@ -0,0 +1,24 @@
# High-performance mode via ISP for CSI
This script uses a high-performance implementation that directly
accesses sensor feeds and passes them via DMA into the bcm2835 ISP.
As such, unlike `libcamera`, it does not implement brightness control.
```bash
# This script uses dumped IMX519 parameters that are fed into the bcm2835 ISP module
# This does not provide automatic brightness control
# Other sensors can be supported the same way, as long as the ISP parameters are adapted
tools/csi_camera.sh -help
tools/csi_camera.sh -camera-format=RG10 ...
```
This mode provides significantly better performance for camera sensors
than described in their specs. For example, for the Arducam 16MP (using IMX519) it is possible to achieve:
1. 120fps at 1280x720, with latency of about 50ms, whereas the documentation says 1280x720x60p
1. 60fps at 1920x1080, with latency of about 90ms, whereas the documentation says 1920x1080x30p
1. 30fps at 2328x1748, with latency of about 140ms, whereas the documentation says 2328x1748x15p
The above was tested with the camera connected to a Raspberry PI Zero 2W, streaming `MJPEG` over WiFi,
with the camera recording https://www.youtube.com/watch?v=e8ZtPSIfWPc from a desktop monitor. So the
measurement also includes desktop rendering latency.

docs/v4l2-usb-mode.md

@ -0,0 +1,15 @@
# High-performance mode via direct decoding for USB
This script uses direct decoding or passthrough of MJPEG or H264 streams from a UVC camera into
a re-encoded stream provided over the web interface.
```bash
tools/usb_camera.sh -help
tools/usb_camera.sh -camera-format=MJPEG ...
```
Currently, H264 is considered rather broken:
```bash
tools/usb_camera.sh -camera-format=H264 ...
```