The vast majority of modern IP cameras share a serious drawback: video frames are buffered before being transmitted, which adds roughly a 1/25-second delay (assuming the camera operates at 25 FPS). This delay becomes critical when the camera is used as part of a complex real-time automation system, e.g. machine vision, robotics, or manufacturing process control.
Our solution relies on hardware-accelerated video streaming: instead of acquiring and buffering the whole video frame before transmission, it buffers only parts of the frame, allowing the delay to be reduced to as little as 1/ncols seconds (assuming a rolling-shutter camera is used, where ncols is the number of vertical camera pixels, i.e. pixel rows). For a 1080x720 resolution this gives a delay of about 1.4 ms, instead of the 40 ms of the regular approach.
Real-time video transmission requires an appropriate transport technology. The GigE Vision standard is based on Ethernet as the camera link and is widely adopted in industry. We use the Aravis library (built on the results of reverse engineering the GigE Vision protocols) to make our device act as an ordinary GigE Vision camera.
GigE Vision is an industrial camera interface standard developed for high-speed, real-time video streaming over Ethernet networks. It supports numerous Ethernet speed grades, but is optimized for 1 Gigabit and (since version 2.0) 10 Gigabit link speeds.
Our device acts as a GigE Vision camera bridge, based on the Zybo Z7-10 board with a Digilent Pcam 5C camera connected to it. The Pcam 5C is used for prototyping and can later be replaced with a full-featured rolling-shutter camera.
Implementation

The main problem with providing real-time video streams using regular MPUs and MCUs is that their camera interfaces perform frame buffering in hardware, which limits what the device as a whole can do. In our device, we decided to implement a MIPI interface whose behavior differs from the one described above. To achieve this goal, we used the MIPI IP core from Xilinx.
For our device to act as a GigE Vision camera, the Aravis library should be used, which is designed to work as a user-space library in a GNU/Linux environment. This brought us to the need for Embedded Linux running on top of the ARM core part of the Xilinx SoC. We decided to build a custom Linux distribution based on the Xilinx PetaLinux project. The distribution must provide a connection to the camera; Ethernet protocol support is also required.
As a starting point, we used the Zybo Z7-20 base-Linux project from Digilent, which already has a MIPI CSI-2 subsystem and Ethernet interfaces (https://github.com/Digilent/Zybo-Z7-20-base-linux). The project was ported to the Zybo Z7-10 board, and the components not present on the Zybo Z7-10 were removed.
Software implementation
The original Aravis distribution does not allow direct video streaming from a camera, but it includes a video viewer and the ArvFake camera simulator. ArvFake imitates a GigE Vision camera, but it can only emit a test video sequence in the form of a moving gradient pattern. To overcome that limitation, we substantially modified the original source code of the ArvFake component, replacing the gradient pattern with the actual video stream acquired from the camera.
Project design stages
1) Port and customize the Zybo Z7-20 Base-Linux FPGA design for our development board. Completed
The output of this stage is the bitstream for the FPGA part, which will later be loaded onto the development board along with the Linux image.
The whole design is presented in the figure below:
2) Build a custom Linux distribution based on the Xilinx PetaLinux project and test the connection between the Linux distribution and the camera. Completed
The distribution is adapted to our needs and includes many packages that will be useful in the future. For example, Yavta and FFmpeg gave us a way to check the Linux support for the interconnection between the camera and the ARM core, which is routed through the FPGA part. This software is also used to capture frames from the Pcam camera module. The first frame we acquired (using FFmpeg) is presented in the figure below:
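A single test frame can be grabbed with a one-liner along the lines of ffmpeg -f v4l2 -i /dev/video0 -frames:v 1 frame.png; the device node and options here are assumptions that depend on the actual pipeline configuration.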
We used Yocto (along with meta layers and BSPs provided by Xilinx and Digilent) to build our custom Embedded Linux distro. The Aravis library was built using Yocto and added to our distribution with the help of the corresponding meta-aravis meta layer. This layer is based on an open-source GitHub project (https://github.com/astarasikov/meta-aravis). The layer can also be used to build Aravis for the OpenEmbedded/Angstrom Linux distribution, along with the required GStreamer plugin.
A BitBake recipe and patches were added to build the Aravis library.
As a result of this stage, we built a Linux distro image in which Aravis is present as a shared library (libaravis.so) with all its dependencies.
3) Modify the fakecam module and test it on a PC. Completed
Part of the source code:
/* fd, n_buffers, CLEAR(), xioctl() and errno_exit() come from the
 * standard V4L2 capture example this code is based on. */

/* Queue all mmap-ed V4L2 buffers and start the capture stream. */
static void start_capturing(void)
{
    unsigned int i;
    enum v4l2_buf_type type;

    /* Hand every mmap-ed buffer back to the driver so it can be filled. */
    for (i = 0; i < n_buffers; ++i) {
        struct v4l2_buffer buf;

        CLEAR(buf);
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;

        if (-1 == xioctl(fd, VIDIOC_QBUF, &buf))
            errno_exit("VIDIOC_QBUF");
    }

    /* Start streaming. */
    type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (-1 == xioctl(fd, VIDIOC_STREAMON, &type))
        errno_exit("VIDIOC_STREAMON");
}

/* Fill-pattern callback invoked by the fake camera for every outgoing
 * buffer; it replaces the stock gradient generator with real video data. */
void
fill_v4l_callback (ArvBuffer *buffer, void *fill_pattern_data,
                   guint32 exposure_time_us,
                   guint32 gain,
                   ArvPixelFormat pixel_format)
{
    guint32 width;
    guint32 height;

    if (buffer == NULL)
        return;

    /* Only 8-bit monochrome frames are handled at this stage. */
    if (pixel_format != ARV_PIXEL_FORMAT_MONO_8)
        return;

    /* buffer->priv is accessible because this code lives inside the
     * Aravis source tree. */
    width = buffer->priv->width;
    height = buffer->priv->height;
    /* ... a dequeued V4L2 frame of width * height bytes is then copied
     * into the buffer (omitted in this excerpt). */
}
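For reference, a minimal sketch of how such a callback can be plugged into the fake camera, using Aravis' stock arv_fake_camera_set_fill_pattern() API (the serial number and function name here are illustrative):

#include <arv.h>
#include <stddef.h>

/* Create a fake camera instance and replace its built-in gradient
 * generator with the V4L2-backed callback shown above. */
static void
install_v4l_fill_callback (void)
{
    ArvFakeCamera *camera;

    camera = arv_fake_camera_new ("GV01");
    arv_fake_camera_set_fill_pattern (camera, fill_v4l_callback, NULL);
}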
The result, showing successful acquisition of the video stream, its transmission over a GigE Vision-compatible transport, and reception of the stream on the other side, is shown in the figure below:
As an output, we obtained a modified ArvFake module that emits a real video stream from the camera over the Ethernet link using GigE Vision-compatible protocols.
4) Change the fakecam module to fit the development board hardware (including FPGA components). (80% ready)
5) Build and test the modified fakecam module on the development board.
We plan to launch the modified fakecam module on the Zybo Z7-10 board. The output video stream from the camera will be transmitted to the PC via the Ethernet link. For visualization, the Aravis viewer component can be used.
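As a cross-check on the PC side, a minimal Aravis client can also receive the stream programmatically. The sketch below assumes the Aravis 0.8 API; device selection and the timeout value are illustrative:

#include <arv.h>
#include <stdio.h>

int
main (void)
{
    GError *error = NULL;
    ArvCamera *camera;
    ArvBuffer *buffer;

    /* NULL selects the first GigE Vision device found on the network --
     * our bridge, if it is the only one. */
    camera = arv_camera_new (NULL, &error);
    if (error != NULL) {
        fprintf (stderr, "No camera found: %s\n", error->message);
        return 1;
    }

    /* Grab a single frame with a 2-second timeout (in microseconds). */
    buffer = arv_camera_acquisition (camera, 2000000, &error);
    if (ARV_IS_BUFFER (buffer))
        printf ("Received a %dx%d frame\n",
                arv_buffer_get_image_width (buffer),
                arv_buffer_get_image_height (buffer));

    g_clear_object (&buffer);
    g_clear_object (&camera);
    return 0;
}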
6) Final optimization and debugging of the whole project design.
For our project to become truly real-time, a series of steps will be taken: applying the real-time patches to the Linux kernel; building a custom kernel module to allow for more robust video acquisition (instead of the V4L2 layer); and moving some parts of the software from user space to kernel space, to comply with even stricter real-time requirements.
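As an illustration of the last point, the planned acquisition module would start from the usual kernel-module scaffolding. The sketch below is a placeholder only: the module and function names are ours, and the actual MIPI/DMA handling is not implemented here.

#include <linux/module.h>
#include <linux/init.h>

/* Load hook: the low-latency acquisition logic would be set up here. */
static int __init video_acq_init(void)
{
    pr_info("video-acq: low-latency acquisition module loaded\n");
    return 0;
}

/* Unload hook: release any acquisition resources. */
static void __exit video_acq_exit(void)
{
    pr_info("video-acq: module unloaded\n");
}

module_init(video_acq_init);
module_exit(video_acq_exit);

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Placeholder for the planned low-latency video acquisition module");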