Many Raspberry PI projects require "eyes", in other words, a camera, to control something like a robot or a remote-controlled car.
In this article, I will test the latency of several ways to live-stream video. Ultimately, we will end up with a solution that has a delay of roughly 200 ms.
It's better to watch the live streams in the video version of this article:
Why is it so important to have a zero-latency video stream?
Of course, it really depends on the task. The latency doesn’t matter if it’s about recording birds outside your window. Or watching a stream of a beautiful waterfall.
However, latency is crucial when piloting an FPV drone, a racing car, or a remote-controlled boxer robot.
This article is not about how each particular protocol works or why these delays exist.
Instead, it reviews the available options and shows how to implement them with minimal friction, so that even someone without deep programming knowledge can set them up and use them.
Hardware
There's not too much to say about the hardware. It's as simple as possible.
As the operating system, I run the latest official port of Debian Bookworm. There is zero custom setup; everything comes from the official Raspberry PI sources and the Raspberry PI Imager.
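If you want to double-check which release the board is running, a quick and entirely optional check is to print the OS information; it should mention bookworm:
cat /etc/os-release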
Let’s start with the official Raspberry Camera Documentation and try some network stream recommendations.
https://www.raspberrypi.com/documentation/computers/camera_software.html
The left menu has an rpicam-vid item. Click it, and the page will scroll down to the relevant section. From there, scroll a bit further to Network Streaming, and let's try the UDP stream first.
I have two terminals. One will be connected to the Raspberry PI via SSH, and the other will run commands on a local machine.
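In case you are setting this up from scratch, the SSH connection is just the standard command; the username and address below are placeholders, not values from my setup:
ssh YOUR_USERNAME@RASPBERRY_PI_IP_HERE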
Raspberry PI 5 UDP video stream with a native codec
https://www.raspberrypi.com/documentation/computers/camera_software.html#udp
The command to start a stream on a Raspberry PI is:
rpicam-vid -t 0 --width 1280 --height 720 --framerate 30 --inline -o udp://LAPTOP_IP_HERE:5555
We must replace LAPTOP_IP_HERE with the laptop's IP address (not the Raspberry PI's).
The port can be arbitrary; I used 5555.
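By the way, if you are not sure what the laptop's IP address is, you can print it directly on the laptop; the exact command depends on the operating system (en0 is only a common macOS default for the Wi-Fi interface):
hostname -I                # most Linux distributions
ipconfig getifaddr en0     # macOS, where en0 is usually the Wi-Fi interface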
As we can see, the Raspberry PI sends some frames to the laptop.
To play the video, we can run the ffplay command on the laptop:
ffplay udp://RASPBERRY_PI_IP_HERE:5555 -fflags nobuffer -flags low_delay -framedrop
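For reference, the three ffplay options are all there to keep player-side buffering to a minimum; the summary below is descriptive, not something you need to run:
# -fflags nobuffer   reduce input buffering before playback starts
# -flags low_delay   ask the decoder to minimize its internal delay
# -framedrop         drop frames that arrive too late instead of falling behind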
Now, it’s time to see the performance.
To measure the delay, I point the camera at a running timer, record it in real time, and send the stream through the local network, so I can compare the timer on the receiving screen with the real one.
We can see that the UDP video delivery method has a delay of about 3,400 milliseconds (3.4 seconds).
It's a huge delay. For example, a car traveling 40 miles per hour (approximately 64 kilometers per hour, or about 17.8 meters per second) covers roughly 66 yards (60 meters) in 3.4 seconds.
Here is what 66 yards looks like on a football field. Pretty impressive.
It’s not something that you can rely on when controlling a high-speed device like a drone.
Raspberry PI 5 TCP video stream with a native codec
https://www.raspberrypi.com/documentation/computers/camera_software.html#tcp
The command to stream on Raspberry PI:
rpicam-vid -t 0 --width 1280 --height 720 --framerate 30 --inline --listen -o tcp://0.0.0.0:5556
The Raspberry PI serves the stream on its own port (I chose 5556), and any client can connect to it and receive the stream.
As you can see, it outputs a few frames, then stops and waits for a listener to connect and receive the stream.
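Before starting the player, you can optionally check on the Raspberry PI that something is actually listening on the chosen port (the ss tool ships with Raspberry PI OS; adjust the port number if you picked a different one):
ss -tln | grep 5556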
To show a stream on a laptop, we can use:
ffplay tcp://RASPBERRY_PI_IP_HERE:5556 -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
This TCP video streaming method has only a half-second delay, which is much better than UDP.
Let’s apply this delay to a car traveling 40 miles per hour.
In 0.5 seconds, the car would travel only about 10 yards (9 meters).
Raspberry PI 5 RTSP video stream with a native codec
Next, I will test the RTSP stream. The command suggested in the official documentation is:
rpicam-vid -t 0 --inline -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/stream1}' :demux=h264
And…
…yes. It doesn't work.
I could probably tune it, but I hate it when a command straight from the official documentation doesn't work out of the box and you have to go on a scavenger hunt to make it run.
So, for now, I will skip this part. I will test another RTSP stream later.
I still have several options left in this documentation, like the libav integration in rpicam-vid.
Raspberry PI 5 TCP video stream with a libav codec (mpegts)
This uses a different codec path (the libav backend with an MPEG-TS container), which could affect the delays.
The command to run on a Raspberry PI:
rpicam-vid -t 0 --width 1280 --height 720 --framerate 30 --codec libav --libav-format mpegts --libav-audio -o "tcp://0.0.0.0:1234?listen=1"
The command to play it on a laptop:
ffplay tcp://RASPBERRY_PI_IP_HERE:1234 -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
Surprisingly, libav over TCP has a huge delay: approximately 10.5 seconds. We had half a second with the default codec over TCP, but now it is ten and a half.
If we apply this delay to the same car, it would end up far outside the field: about 200 yards, or 190 meters. It's not a joke.
Raspberry PI 5 UDP video stream with a libav codec (mpegts)
Now, the same libav codec over UDP. The command to run on a Raspberry PI:
rpicam-vid -t 0 --width 1280 --height 720 --framerate 30 --codec libav --libav-format mpegts --libav-audio -o "udp://REPLACE_WITH_LAPTOP_IP:5555"
And the player command for the laptop:
ffplay udp://RASPBERRY_PI_IP_HERE:5555 -vf "setpts=N/30" -fflags nobuffer -flags low_delay -framedrop
And now we have only a half-second delay over UDP.
Let’s add all the info to the football field.
These results are pretty strange. With the native codec, we had a huge delay over UDP and a small delay over TCP; with libav, it is the opposite.
But we can still do better than half a second.
Raspberry PI 5 MediaMTX setup
Finally, we are getting closer to the latency winner for the Raspberry PI: MediaMTX. This software can serve a stream over all the protocols mentioned earlier, plus a few more. Let's start with the setup.
Installing it takes a few really simple steps:
- First, open the releases page on GitHub.
- Second, copy the link to the ARM64 archive. It's essential to choose the proper architecture here.
- Third, create a folder for it on the Raspberry PI.
mkdir mediamtx && cd mediamtx
- Fourth, download it using the WGET command (find the latest link on GitHub).
wget https://github.com/bluenviron/mediamtx/releases/download/v1.7.0/mediamtx_v1.7.0_linux_arm64v8.tar.gz
- Fifth, unarchive it to the same folder.
tar -xvzf mediamtx_v1.7.0_linux_arm64v8.tar.gz
- Sixth, open the YML configuration file for editing.
nano mediamtx.yml
- Seventh, scroll down to the paths: section near the bottom of the file and add this entry under it (indentation matters in YAML).
  cam1:
    runOnInit: bash -c 'rpicam-vid -t 0 --camera 0 --nopreview --codec yuv420 --width 1280 --height 720 --inline --listen -o - | ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 1280x720 -i /dev/stdin -c:v libx264 -preset ultrafast -tune zerolatency -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH'
    runOnInitRestart: yes
In a nutshell, MediaMTX will run this command in bash.
The rpicam-vid command (the same one we dealt with before) pipes its output into ffmpeg.
And ffmpeg sends the encoded video back to MediaMTX over the RTSP protocol, but locally.
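To make the pipeline easier to read, here is the same command split across several lines, with the $RTSP_PORT and $MTX_PATH variables that MediaMTX normally substitutes replaced by the concrete values 8554 and cam1. This is only an illustration of what the config line does, not something you have to run by hand:
# rpicam-vid captures raw YUV frames and writes them to stdout;
# ffmpeg reads them from stdin, encodes H.264 with low-latency x264 settings,
# and publishes the result to the local MediaMTX instance over RTSP.
rpicam-vid -t 0 --camera 0 --nopreview --codec yuv420 \
  --width 1280 --height 720 --inline --listen -o - |
ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 1280x720 -i /dev/stdin \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f rtsp rtsp://localhost:8554/cam1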
We can save the configuration and run MediaMTX.
./mediamtx
Its very first output lines contain helpful information about the protocols and ports we can use.
Let’s try it with a VLC player and RTSP stream.
We can run the VLC player on the laptop with this command:
vlc rtsp://RASPBERRY_PI_IP_ADDRESS_HERE:8554/cam1
and in a moment, it will show a stream.
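As an alternative player, ffplay can open the same RTSP URL with the low-latency flags we used earlier (the delay figure below was measured with VLC):
ffplay rtsp://RASPBERRY_PI_IP_ADDRESS_HERE:8554/cam1 -fflags nobuffer -flags low_delay -framedrop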
This type of stream has a 1.3-second delay. Let’s add this number to the football field.
Raspberry PI 5 WebRTC video stream via MediaMTX
Finally, we are getting close to our winner. Let's go back to the console and grab the WebRTC port. In my case, it's 8889.
To play it, I'll open this stream in the browser:
http://RASPBERRY_PI_IP_HERE:8889/cam1
In a moment, we have a stream.
Zoom out, and let’s review the delay.
It has an awesome 0.2-second delay, which makes it the winner across all the tests.
You can see it in action in the video:
On the football field, the same car at the same speed would cover approximately 3.9 yards (3.5 meters), the smallest distance of all the methods tested.
Here are all the results in one table:

Method                          Delay
UDP, native codec               ~3.4 s
TCP, native codec               ~0.5 s
TCP, libav codec (mpegts)       ~10.5 s
UDP, libav codec (mpegts)       ~0.5 s
MediaMTX, RTSP (VLC)            ~1.3 s
MediaMTX, WebRTC (browser)      ~0.2 s
If you are going to use the stream with an FPV drone, you will need latency roughly ten times lower than 200 milliseconds, somewhere around 20-30 milliseconds.
For my needs, a 200-millisecond delay is not a problem, and I can implement my plans with MediaMTX.
If you want a lower delay, you can try tuning the WebRTC stream, reducing the framerate, or playing with other parameters.
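For example, here is an untested variation of the MediaMTX runOnInit entry with a lower resolution and framerate; treat the exact numbers as placeholders to experiment with rather than recommended values:
  cam1:
    runOnInit: bash -c 'rpicam-vid -t 0 --camera 0 --nopreview --codec yuv420 --width 640 --height 480 --framerate 20 --inline --listen -o - | ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 640x480 -framerate 20 -i /dev/stdin -c:v libx264 -preset ultrafast -tune zerolatency -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH'
    runOnInitRestart: yes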
I hope this article will help you build an exciting project with a low-latency live stream.
Let me know what you think about it in the comments on the YouTube channel. Also, please let me know if you want more info like this!
Thanks for reading!