The UV4L installation instructions for Bullseye have been updated. Please read the additional steps listed at the top of the page, which you must follow to install the software properly.
As a good example of edge computing, check out this new tutorial on live object detection with depth estimation via stereo vision on a Raspberry Pi Compute Module.
As announced in the previous post, a new raspicam driver that adds some AI to UV4L has been released!
Here is a tutorial on how to quickly configure UV4L so that a robot can simultaneously perform real-time object detection, track targets by controlling pan/tilt servos, and stream over the web with WebRTC.
Below is another example of live object detection, where bounding boxes, labels, and confidence scores are overlaid in real time on the high-resolution video (tracking has been disabled in this case).
UV4L packages for Raspberry Pi OS (Buster) are out! To learn how to install them, please read the installation instructions.
Upcoming releases will support TensorFlow Lite models through the Google Edge TPU, which will allow your Raspberry Pi to do things like real-time video tracking out of the box. See the example below. Tutorial.
This is a tutorial, and below is a demo showing a new way to play RetroPie games with a headless Raspberry Pi 3B+ and an Android “terminal” on the same LAN. Traditionally, the Raspberry Pi is wired directly to a display, speakers (where the game is rendered), and a gamepad controller. In this case, however, the Raspberry Pi is not connected to any display (except in the first part of the demo, for clarity), speakers, or controller. Instead, the game is played from within Chrome on an Android smartphone, with a gamepad controller connected to the smartphone itself. In other words, the game's video and audio are rendered on the smartphone's screen and speakers. The great thing is that no special third-party application is required on the smartphone other than the stock Chrome browser.
In this example we will see how to configure the Raspberry Pi to serve a customizable web app that lets the Raspberry Pi share its screen and speakers (“what you hear”) with the PC browser. Additionally, thanks to a virtual keyboard running on the Raspberry Pi, it is possible to emulate the keypresses captured and sent from within the web page in the browser.
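The browser side of this keypress-forwarding idea can be sketched as follows. This is a hypothetical illustration, not UV4L's actual protocol: the page captures keydown/keyup events, serializes each one into a small JSON message, and sends it over a channel (in practice, an open WebRTC data channel) to the Pi, where the virtual keyboard replays it. The message format and function names here are assumptions made for the example.

```javascript
// Sketch: serialize browser key events into messages suitable for a
// WebRTC data channel. The JSON message format is hypothetical, not
// the one UV4L actually uses.
function encodeKeyEvent(type, event) {
  return JSON.stringify({
    type,             // "keydown" or "keyup"
    code: event.code, // physical key, e.g. "ArrowLeft"
    key: event.key,   // logical key value
  });
}

// Attach handlers to any event target and forward events to any
// channel-like object exposing send(). Returns a function that
// stops the forwarding.
function forwardKeys(target, channel) {
  const onDown = (e) => channel.send(encodeKeyEvent("keydown", e));
  const onUp = (e) => channel.send(encodeKeyEvent("keyup", e));
  target.addEventListener("keydown", onDown);
  target.addEventListener("keyup", onUp);
  return () => {
    target.removeEventListener("keydown", onDown);
    target.removeEventListener("keyup", onUp);
  };
}
```

In a real page, `target` would be `document` and `channel` an open `RTCDataChannel`; keeping the channel behind a plain `send()` interface makes the capture logic easy to test in isolation.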
It can be used to fully play RetroPie games in the browser as shown in this DEMO.
WebRTC data channels can be used for countless purposes. They are ideal for sharing data between two peers securely and efficiently. In a past demo we showed how they could be used to control a robot in real time. In this demo we will see how to use a WebRTC data channel to transfer a file from a browser to a Raspberry Pi. The demo can also work as a simple remote media player.
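A minimal sketch of the sending side of such a file transfer is shown below. Data channel messages have practical size limits (around 16 KiB is a commonly safe choice), so a large file has to be split into chunks. The chunk size, the `"EOF"` end marker, and the function names are illustrative assumptions for this sketch, not part of the demo's actual wire format.

```javascript
// Sketch: send a file's bytes over a WebRTC data channel in fixed-size
// chunks. ~16 KiB per message is a conservative, widely compatible size.
const CHUNK_SIZE = 16 * 1024;

// Yield successive views into `bytes` (a Uint8Array), each at most
// `chunkSize` bytes long; subarray() avoids copying the data.
function* chunkBytes(bytes, chunkSize = CHUNK_SIZE) {
  for (let offset = 0; offset < bytes.length; offset += chunkSize) {
    yield bytes.subarray(offset, offset + chunkSize);
  }
}

// `channel` is anything with a send() method (e.g. an open
// RTCDataChannel). Returns the number of binary chunks sent.
function sendFile(bytes, channel) {
  let count = 0;
  for (const chunk of chunkBytes(bytes)) {
    channel.send(chunk);
    count += 1;
  }
  channel.send("EOF"); // string marker telling the receiver we are done
  return count;
}
```

On the receiving side (the Raspberry Pi in the demo), the peer would simply append binary messages to a buffer until the end marker arrives.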
The UV4L packages for Ubuntu Bionic Beaver and Ubuntu Xenial (x86-64bit) have been updated and a new x11-renderer module has been added!
See the quick installation instructions.
webrtcH4cKS has recently posted a very interesting article about a computer vision project that uses UV4L for real-time streaming of video and data over WebRTC from a Raspberry Pi Zero connected to an AIY Vision board. The board embeds a Vision Processing Unit (VPU) chip that runs TensorFlow image-processing graphs very efficiently.
The article is divided into two parts:
The UV4L packages for Ubuntu Xenial 16.04 (x86-64bit) have been updated! See the quick installation instructions.