Raspberry Pi 5 and AI

Install the Raspberry Pi 5 AI Kit

I have a Raspberry Pi AI Kit, which includes:
– an M.2 HAT+
– a pre-installed Hailo-8L AI module

With the kit attached, install the Hailo software:

$ sudo apt install hailo-all

This installs the following dependencies:
– Hailo kernel device driver and firmware
– HailoRT middleware software
– Hailo Tappas core post-processing libraries
– The rpicam-apps Hailo post-processing software demo stages

Finally, reboot your Raspberry Pi with sudo reboot for these settings to take effect.

To ensure everything is running correctly, run the following command:

$ hailortcli fw-control identify
---------------------------------------------------------------------
Executing on device: 0000:01:00.0
Identifying board
Control Protocol Version: 2
Firmware Version: 4.20.0 (release,app,extended context switch buffer)
Logger Version: 0
Board Name: Hailo-8
Device Architecture: HAILO8L
Serial Number: HLDDLBB243300869
Part Number: HM21LB1C2LAE
Product Name: HAILO-8L AI ACC M.2 B+M KEY MODULE EXT TMP

To ensure the camera is operating correctly, run the following command:

$ rpicam-hello -t 10s

This starts the camera and shows a preview window for ten seconds. Once you have verified everything is installed correctly, it’s time to run some demos.
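The rpicam-apps Hailo post-processing stages installed above include ready-made demo configurations. As a sketch, an object-detection demo can be run by pointing rpicam-hello at one of the bundled post-processing JSON files (the path and file names below are as shipped on Raspberry Pi OS Bookworm and may vary between releases):

```shell
# List the Hailo demo configurations bundled with rpicam-apps
# (location as shipped on Raspberry Pi OS Bookworm; may vary by release)
ls /usr/share/rpi-camera-assets/hailo_*.json

# Run object detection on the live camera feed until interrupted (-t 0);
# hailo_yolov6_inference.json is one of the bundled demo stages
rpicam-hello -t 0 --post-process-file /usr/share/rpi-camera-assets/hailo_yolov6_inference.json
```

Detected objects are drawn as labelled boxes over the preview window.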

Install Ollama on the Raspberry Pi 5

Install Ollama using the official install script:

$ curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
>>> Downloading Linux arm64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA/AMD GPU detected. Ollama will run in CPU-only mode.

If Ollama installed correctly, the following command will display the installed version in the terminal:

$ ollama --version
ollama version is 0.5.11
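With the Ollama service running, you can pull a model and query it either interactively or over the local API that the installer exposed on 127.0.0.1:11434. The model tag below is just an example; smaller models suit the Pi's RAM better:

```shell
# Pull a small model (example tag; pick a model that fits the Pi's RAM)
ollama pull llama3.2:1b

# Chat interactively in the terminal (exit with /bye)
ollama run llama3.2:1b

# Or query the local REST API directly
curl http://127.0.0.1:11434/api/generate -d '{
  "model": "llama3.2:1b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Remember the warning from the installer: without a supported GPU, inference runs on the CPU, so responses will be slow.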

Install Docker

$ sudo apt install docker.io docker-compose
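After installing, you can optionally verify that the Docker daemon works and allow your user to run Docker without sudo (the group change takes effect after logging out and back in):

```shell
# Verify the Docker daemon by running the hello-world test image
sudo docker run --rm hello-world

# Optional: let the current user run docker without sudo (re-login required)
sudo usermod -aG docker $USER
```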

Install Open WebUI

$ sudo docker pull ghcr.io/open-webui/open-webui:main
$ sudo docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main
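Since docker-compose was installed above, the same container can also be described declaratively. A minimal docker-compose.yml sketch equivalent to the docker run command (the file itself and the restart policy are my additions):

```yaml
# docker-compose.yml — sketch equivalent to the docker run command above
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"        # host port 3000 -> container port 8080
    volumes:
      - open-webui:/app/backend/data
    restart: unless-stopped

volumes:
  open-webui:
```

Start it with sudo docker-compose up -d from the directory containing the file.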

Then open the Open WebUI interface in a browser using the Pi's IP address (the container serves plain HTTP on the mapped port):
http://192.168.1.xxx:3000

Install Portainer

$ sudo docker pull portainer/portainer-ce:latest
$ sudo docker volume create portainer_data
$ sudo docker run -d -p 8000:8000 -p 9443:9443 --name=portainer --restart=always -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer-ce:latest

Then access the Portainer interface in a browser using the Pi's IP address. Portainer serves HTTPS on port 9443 with a self-signed certificate, so expect a browser warning on first visit:
https://192.168.1.xxx:9443/