Enable DeepSeek on your Raspberry Pi
1. Introduction
DeepSeek R1 is a powerful open-weight reasoning model that handles tasks such as question answering, summarization, and code generation. In this guide, I'll show you how to run a small DeepSeek R1 model on a Raspberry Pi, making it accessible for low-cost, portable projects. Whether you're a beginner or an experienced developer, this step-by-step tutorial will help you get started.
2. Prerequisites
Before we begin, make sure you have the following:
A Raspberry Pi (preferably Raspberry Pi 4 with at least 4GB RAM).
A MicroSD card (16GB or larger) with the 64-bit Raspberry Pi OS installed (Ollama ships an arm64 build, so a 32-bit OS will not work).
A stable internet connection.
Basic peripherals like a keyboard, mouse, and monitor.
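Before installing anything, it's worth confirming your Pi actually meets these requirements. A quick check, assuming a standard Raspberry Pi OS setup:

```shell
# Show total and available RAM (the small models need a couple of GB free)
free -h

# Confirm the OS is 64-bit: this should print "aarch64".
# If it prints "armv7l", you are on a 32-bit OS and Ollama's arm64 bundle won't run.
uname -m

# Check free space on the SD card; the model download alone is over a gigabyte
df -h /
```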
3. Step 1: Update Your Raspberry Pi
sudo apt update
sudo apt upgrade
4. Step 2: Install Ollama
pi@mypi4:~ $ curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
>>> Downloading Linux arm64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA/AMD GPU detected. Ollama will run in CPU-only mode.
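The CPU-only warning is expected on a Raspberry Pi and is nothing to worry about. Before pulling any models, you can sanity-check that the install succeeded. A quick sketch, assuming the default install and service created above:

```shell
# Print the installed Ollama version
ollama --version

# The installer registered a systemd service; confirm it is active
systemctl is-active ollama

# The API listens on port 11434 by default; Ollama answers
# a plain GET on the root URL with "Ollama is running"
curl http://127.0.0.1:11434
```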
5. Step 3: Run Ollama with the DeepSeek R1 1.5B model
pi@mypi4:~ $ ollama run deepseek-r1:1.5b

The first run downloads the 1.5B model (a download of over a gigabyte, so expect it to take a while on a Pi) and then drops you into an interactive prompt where you can chat with the model. Type /bye to exit. Generation is CPU-only on the Pi, so responses will be noticeably slower than on a desktop with a GPU.
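Beyond the interactive prompt, the Ollama service also exposes an HTTP API on port 11434, which is handy for scripting your own projects. A minimal sketch using the `/api/generate` endpoint (the prompt text here is just an example):

```shell
# Send a single non-streaming request to the local Ollama API.
# "stream": false returns one JSON object instead of a token stream.
curl http://127.0.0.1:11434/api/generate -d '{
  "model": "deepseek-r1:1.5b",
  "prompt": "In one sentence, what is a Raspberry Pi?",
  "stream": false
}'
```

The JSON response contains the generated text in its `response` field, so you can call the model from any language with an HTTP client.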