
    How to Install DeepSeek R1 Locally on Linux

    January 31, 2025


    DeepSeek has taken the AI world by storm. While it’s convenient to use DeepSeek on their hosted website, we know that there’s no place like 127.0.0.1. 😉

    (Image source: The Hacker News)

    However, recent events, such as a cyberattack on DeepSeek AI that halted new user registrations and an exposed DeepSeek AI database, make me wonder why more people don’t choose to run LLMs locally.

    Not only does running your AI locally give you full control and better privacy, but it also keeps your data out of someone else’s hands.

    In this guide, we’ll walk you through setting up DeepSeek R1 on your Linux machine using Ollama as the backend and Open WebUI as the frontend.

    Let’s dive in!

    📋
    The DeepSeek version you will be running on your local system is a stripped-down version of the actual DeepSeek model that ‘outperformed’ ChatGPT. You’ll need an Nvidia or AMD GPU on your system to run it smoothly.

    Step 1: Install Ollama

    Before we get to DeepSeek itself, we need a way to run Large Language Models (LLMs) efficiently. This is where Ollama comes in.

    What is Ollama?

    Ollama is a lightweight and powerful platform for running LLMs locally. It simplifies model management, allowing you to download, run, and interact with models with minimal hassle.

    The best part? It abstracts away all the complexity: there’s no need to manually configure dependencies or set up virtual environments.

    Installing Ollama

    The easiest way to install Ollama is by running the following command in your terminal:

    curl -fsSL https://ollama.com/install.sh | sh

    Once installed, verify the installation:

    ollama --version
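
    On Linux, the official install script also sets Ollama up as a background systemd service. If the ollama command works but the steps below can’t reach the model, it’s worth checking that the service is actually running (this assumes the service created by that install script):

    # Check that the Ollama background service is running
    systemctl status ollama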

    Now, let’s move on to getting DeepSeek running with Ollama.

    Step 2: Install and run the DeepSeek model

    With Ollama installed, pulling and running the DeepSeek model is as simple as running this command:

    ollama run deepseek-r1:1.5b

    This command downloads the DeepSeek-R1 1.5B model, which is a small yet powerful AI model for text generation, answering questions, and more.

    The download may take some time depending on your internet speed, as these models can be quite large.
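
    The 1.5b tag is just the smallest distilled variant. If your GPU has memory to spare, the Ollama library also lists larger DeepSeek-R1 tags such as 7b, 8b, and 14b; swapping the tag is all it takes, though download size and memory requirements grow quickly:

    # Pull and run a larger distilled variant instead (needs considerably more VRAM/RAM)
    ollama run deepseek-r1:7b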


    Once the download is complete, you can interact with the model right away in the terminal.
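
    If you’d rather script against the model than chat interactively, Ollama also exposes a local HTTP API on port 11434. Here’s a quick sanity check with curl (the prompt text is just an example):

    # Send a single prompt to the local Ollama API and get the full response as one JSON object
    curl http://localhost:11434/api/generate -d '{
      "model": "deepseek-r1:1.5b",
      "prompt": "Summarize what a large language model is in one sentence.",
      "stream": false
    }'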

    But let’s be honest: while the terminal is great for quick tests, it’s not the most polished experience. It’s better to use a web UI with Ollama, and while there are many such tools, I prefer Open WebUI.

    Related read: 12 Tools to Provide a Web UI for Ollama (It’s FOSS, Ankush Das)

    Step 3: Set up Open WebUI

    Open WebUI provides a beautiful and user-friendly interface for chatting with DeepSeek. There are two ways to install Open WebUI:

    • Direct Installation (for those who prefer a traditional setup)
    • Docker Installation (my personal go-to method)

    Don’t worry, we’ll be covering both.

    Method 1: Direct installation

    If you prefer a traditional installation without Docker, follow these steps to set up Open WebUI manually.

    Step 1: Install Python & the venv package

    First, ensure you have Python installed along with the venv module for creating an isolated environment.

    On Debian/Ubuntu-based distributions, run the following command:

    sudo apt install python3-venv -y
    

    This installs the required package for managing virtual environments.
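
    It’s also worth confirming which Python you have, since Open WebUI’s documentation currently targets Python 3.11 and an older interpreter may cause the pip install step below to fail:

    # Check the Python version before creating the virtual environment
    python3 --version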

    Step 2: Create a virtual environment

    Next, create a virtual environment inside your home directory:

    python3 -m venv ~/open-webui-venv
    

    and then activate the virtual environment we just created:

    source ~/open-webui-venv/bin/activate

    You’ll notice your terminal prompt changes, indicating that you’re inside the virtual environment.
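
    Keep in mind that the activation only applies to the current shell session. If you come back later in a new terminal, re-activate the environment before running any of the open-webui commands below:

    # Re-activate the virtual environment in a new terminal session
    source ~/open-webui-venv/bin/activate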


    Step 3: Install Open WebUI

    With the virtual environment activated, install Open WebUI by running:

    pip install open-webui
    

    This downloads and installs Open WebUI along with its dependencies.
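
    Upgrading later works the same way from inside the activated environment:

    # Upgrade Open WebUI to the latest release inside the same virtual environment
    pip install --upgrade open-webui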

    Step 4: Run Open WebUI

    To start the Open WebUI server, use the following command:

    open-webui serve
    

    Once the server starts, you should see output confirming that Open WebUI is running.
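
    If port 8080 is already taken on your machine, the serve command accepts host and port options; run open-webui serve --help to confirm the exact flags on your version. Something like this should do it:

    # Bind Open WebUI to a different port if 8080 is already in use
    open-webui serve --port 8081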

    Step 5: Access Open WebUI in your browser

    Open your web browser and go to: http://localhost:8080

    You’ll now see the Open WebUI interface, where you can start chatting with DeepSeek AI!
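
    Optionally, if you’d rather not keep a terminal open, you can let systemd start Open WebUI for you. Here’s a minimal sketch of a systemd user service, assuming the virtual environment lives at ~/open-webui-venv as created above; adjust the paths to your setup:

# Create a systemd user service for Open WebUI (sketch; assumes the venv at ~/open-webui-venv)
mkdir -p ~/.config/systemd/user
cat > ~/.config/systemd/user/open-webui.service <<'EOF'
[Unit]
Description=Open WebUI (local)
After=network-online.target

[Service]
ExecStart=%h/open-webui-venv/bin/open-webui serve
Restart=on-failure

[Install]
WantedBy=default.target
EOF
systemctl --user daemon-reload
systemctl --user enable --now open-webui.service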

    Method 2: Docker installation (Personal favorite)

    If you haven’t installed Docker yet, no worries! Check out our step-by-step guide on how to install Docker on Linux before proceeding.

    Once that’s out of the way, let’s get Open WebUI up and running with Docker.

    Step 1: Pull the Open WebUI Docker image

    First, download the latest Open WebUI image from the GitHub Container Registry (GHCR):

    docker pull ghcr.io/open-webui/open-webui:main
    

    This command ensures you have the most up-to-date version of Open WebUI.

    Step 2: Run Open WebUI in a Docker container

    Now, spin up the Open WebUI container:

    docker run -d \
      -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:main
    

    Don’t be intimidated by that big command. Here’s what each part of it actually does:

    • docker run -d: Runs the container in the background (detached mode).
    • -p 3000:8080: Maps port 8080 inside the container to port 3000 on the host, so you’ll access Open WebUI at http://localhost:3000.
    • --add-host=host.docker.internal:host-gateway: Allows the container to talk to the host system, which is useful when running other services (such as Ollama) alongside Open WebUI.
    • -v open-webui:/app/backend/data: Creates a persistent storage volume named open-webui to save chat history and settings.
    • --name open-webui: Assigns a custom name to the container for easy reference.
    • --restart always: Ensures the container automatically restarts if your system reboots or Open WebUI crashes.
    • ghcr.io/open-webui/open-webui:main: The Docker image for Open WebUI, pulled from GitHub’s Container Registry.
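
    A couple of standard Docker commands come in handy afterwards, for watching the logs and for updating later (the named volume keeps your chats and settings when the container is recreated):

    # Follow the container logs if the UI doesn't come up as expected
    docker logs -f open-webui

    # Update later: pull the newer image, remove the old container, then re-run the docker run command above
    docker pull ghcr.io/open-webui/open-webui:main
    docker stop open-webui && docker rm open-webui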

    Step 3: Access Open WebUI in your browser

    Now, open your web browser and navigate to: http://localhost:3000. You should see Open WebUI’s interface, ready to use with DeepSeek!


    Once you click on “Create Admin Account,” you’ll be welcomed by the Open WebUI interface.

    Since we haven’t added any other models yet, the DeepSeek model we downloaded earlier is already loaded and ready to go.
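
    As you experiment with more models over time, the disk usage adds up; Ollama’s CLI makes it easy to see what you have and prune what you no longer need:

    # List locally downloaded models
    ollama list

    # Remove a model to free up disk space
    ollama rm deepseek-r1:1.5b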


    Just for fun, I decided to test DeepSeek AI with a little challenge. I asked it to: “Write a rhyming poem under 20 words using the words: computer, AI, human, evolution, doom, boom.”

    And let’s just say… the response was a bit scary. 😅

    Here’s the full poem written by DeepSeek R1:

    (Screenshot: the poem DeepSeek R1 came up with)

    Conclusion

    And there you have it! In just a few simple steps, you’ve got DeepSeek R1 running locally on your Linux machine with Ollama and Open WebUI.

    Whether you’ve chosen the Docker route or the traditional installation, the setup process is straightforward and should work on most Linux distributions.

    So, go ahead, challenge DeepSeek to write another quirky poem, or maybe put it to work on something more practical. It’s yours to play with, and the possibilities are endless.

    For instance, I recently ran DeepSeek R1 on my Raspberry Pi 5; while it was a bit slow, it still got the job done.

    Who knows, maybe your next challenge will be more creative than mine (though, I’ll admit, that poem about “doom” and “boom” was a bit eerie! 😅).

    Enjoy your new local AI assistant, and happy experimenting! 🤖
