AndyMelton.net
How to Set Up Ollama and Open WebUI on Linux
Purpose

This tutorial will guide you through setting up Ollama and Open WebUI on a Debian-based Linux distribution (e.g., Debian or Ubuntu). Ollama will be used to download and run freely available large language models (LLMs). Open WebUI will provide a ChatGPT-like web interface for interacting with those models. These instructions do not use Docker.

Prerequisites

Required

  • A Debian-based Linux distribution should already be installed.
  • Other Linux distributions will work, of course, but the steps in this guide focus on Debian-based distributions.
  • Windows Subsystem for Linux (WSL) can be used on a Windows computer.

Recommended

  • While identifying the best GPU to use for LLM work is out-of-scope for this tutorial, the author recommends an NVIDIA GPU with at least 8 GB of VRAM. The author used the following GPUs to develop this tutorial:
    • NVIDIA Quadro RTX 4000 w/ 8 GB of VRAM
    • NVIDIA RTX A2000 w/ 8 GB of VRAM
  • If you are making use of a GPU:
    • You must ensure that the latest proprietary (non-free) NVIDIA drivers are installed.
    • Use nvidia-smi to monitor GPU usage (see the example command after this list).
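
For example, assuming the NVIDIA drivers are installed and nvidia-smi is on your PATH, the following command refreshes its output every two seconds so you can watch VRAM usage and GPU utilization while a model is loaded:

watch -n 2 nvidia-smi
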
Steps to Install Ollama

1.) Install curl and pipx. curl will be used to download the Ollama install script, and pipx will allow us to easily install Open WebUI later on.

sudo apt-get install curl pipx

2.) Install Ollama using the command provided on the Ollama GitHub page.

curl -fsSL https://ollama.com/install.sh | sh
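
To confirm the installation succeeded, you can check the installed version. On systemd-based systems the install script also typically registers an ollama service, which you can inspect as well:

ollama --version
systemctl status ollama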

3.) Pull (download) one of the freely available models from Ollama’s repositories. For the purposes of this tutorial, we will be using gemma2 (from Google) with 9 billion (9b) parameters.

ollama pull gemma2:9b
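
Once the download finishes, you can confirm the model is available locally by listing the models Ollama has stored:

ollama list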

4.) Run gemma2:9b.

ollama run gemma2:9b

5.) Enter a general question. Example: “Why is the sky blue?”

    • Depending on your compute power, it may take some time for a response to be generated.
    • If you are using a GPU and it stays maxed out at 100% for several minutes, try a smaller model (see the example commands after this list).
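
For example, the gemma2 family is also published in a smaller 2-billion-parameter variant, which places far less demand on VRAM. Pulling and running it works exactly like the 9b model:

ollama pull gemma2:2b
ollama run gemma2:2b

When you are finished experimenting, type /bye at the interactive prompt to exit the session.
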
Steps to Install Open WebUI

1.) Install pipx, if it was not already installed in the previous section. This will allow us to more easily install Open WebUI.

sudo apt install pipx

2.) Install Open WebUI via pipx.

pipx install open-webui
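
To confirm the installation, you can ask pipx to list the applications it manages; open-webui should appear in the output along with the location of its virtual environment:

pipx list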

3.) Run the “ensurepath” command so that the directory where pipx installs applications is included in your PATH environment variable.

pipx ensurepath

4.) Restart the terminal or SSH session you are using to issue commands so that the updated PATH takes effect.
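
After reopening the terminal, you can verify that the open-webui command now resolves; it should print a path such as /home/username/.local/bin/open-webui (the exact path depends on your pipx configuration):

command -v open-webui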

5.) Start Open WebUI.

open-webui serve
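
Once the server reports that it is running, you can confirm from a second terminal that it is listening on its default port (8080):

ss -tln | grep 8080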

6.) From a web browser, navigate to the Open WebUI interface. It should be available at http://YourServerIPAddress:8080 (Open WebUI listens on port 8080 by default).
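
If the page does not load and the server runs a firewall such as ufw, you may need to allow the port first (adjust the command to whatever firewall you actually use):

sudo ufw allow 8080/tcp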

7.) Once at the Open WebUI login screen, create an account and log in. The first account that you create will be the administrator account for your instance.

Steps to Ensure Ollama and Open WebUI Start During Boot

1.) Start by creating a systemd service file. This will be used to start Open WebUI as a service when the system boots. (The Ollama install script normally registers and enables its own ollama service, so only Open WebUI needs a service file here.)

sudo nano /etc/systemd/system/open-webui.service

2.) Add the following content to the service file. Once entered, save and close the file.

[Unit]
Description=Open WebUI Service
After=network.target

[Service]
ExecStart=/home/username/.local/pipx/venvs/open-webui/bin/open-webui serve
Restart=always
User=username
WorkingDirectory=/home/username/
Environment="PATH=/home/username/.local/pipx/venvs/open-webui/bin:/usr/bin"

[Install]
WantedBy=multi-user.target

NOTES:
  • Replace username with your username.
  • From the terminal, you can issue the command "open-webui --help" to view the list of available commands for open-webui. This is especially useful if you need to troubleshoot.
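
The exact location of the open-webui executable can vary with your pipx version (newer releases may place virtual environments under ~/.local/share/pipx/venvs instead). If you are unsure which path to use in ExecStart, one way to find it is to resolve the symlink that pipx places on your PATH and use whatever path it prints:

readlink -f "$(command -v open-webui)"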

3.) Set permissions so that the open-webui executable and its virtual environment are readable and executable.

sudo chmod +x /home/username/.local/pipx/venvs/open-webui/bin/open-webui
sudo chmod -R 755 /home/username/.local/pipx/venvs/open-webui

4.) Reload systemd's internal database of unit files so that it picks up the new service file.

sudo systemctl daemon-reload

5.) Enable the open-webui service so that it starts automatically at boot.

sudo systemctl enable open-webui.service

6.) Start the open-webui service.

sudo systemctl start open-webui.service

7.) Check the status of the open-webui service.

sudo systemctl status open-webui.service
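
If the service fails to start, its log output can be followed with journalctl (press Ctrl+C to stop following):

journalctl -u open-webui.service -f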

8.) Reboot the computer to verify that the open-webui service starts automatically at boot.

sudo reboot

9.) Once the system has restarted, confirm that you can access the Open WebUI interface from a remote system at http://YourServerIPAddress:8080
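
If the page does not load in a browser, a quick check from another machine is to request it with curl and print only the HTTP status code; any response at all confirms the service is reachable over the network:

curl -s -o /dev/null -w "%{http_code}\n" http://YourServerIPAddress:8080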