Debian 13 trixie

Open WebUI : Install (2025/09/30)

 

Install Open WebUI, which allows you to run LLMs from a web UI.

Open WebUI can be installed easily with pip3, but as of September 2025 it does not yet fully support Python 3.13, the default Python version on Debian 13, or the versions of some related modules that ship with it. Therefore, this example runs Open WebUI in a container instead.
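For reference, you can confirm the Python version on the host like this (the exact patch release will vary on your system):

root@dlp:~# python3 -V

Python 3.13.x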

[1] Install Podman, refer to here.
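If you want to confirm that Podman is installed before proceeding, checking its version is enough:

root@dlp:~# podman --version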

[2] Install Ollama, refer to here.
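Before starting the container, it is worth confirming that Ollama is listening on the host; a request to its API on the default port, 11434, answers with a short status string:

root@dlp:~# curl http://127.0.0.1:11434

Ollama is running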

[3] Pull and start the Open WebUI container image.
root@dlp:~# podman pull ghcr.io/open-webui/open-webui:main

root@dlp:~# podman images

REPOSITORY                     TAG         IMAGE ID      CREATED       SIZE
ghcr.io/open-webui/open-webui  main        34835e210222  17 hours ago  4.89 GB

root@dlp:~# podman run -d -p 8080:8080 --add-host=host.containers.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

root@dlp:~# podman ps

CONTAINER ID  IMAGE                               COMMAND        CREATED        STATUS        PORTS                   NAMES
b74e06cef03d  ghcr.io/open-webui/open-webui:main  bash start.sh  5 seconds ago  Up 4 seconds  0.0.0.0:8080->8080/tcp  open-webui
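The [--add-host] option maps [host.containers.internal] to the host's gateway address, which is how the containerized Open WebUI reaches the Ollama service running on the host. If Open WebUI does not detect Ollama automatically, you can point it there explicitly with its [OLLAMA_BASE_URL] environment variable; this is a sketch of the same run command with that variable set, assuming Ollama listens on its default port 11434:

root@dlp:~# podman run -d -p 8080:8080 --add-host=host.containers.internal:host-gateway -e OLLAMA_BASE_URL=http://host.containers.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

If the container does not stay up, its logs usually show the reason:

root@dlp:~# podman logs open-webui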
[4] To verify it works, launch a web browser on any client computer and access [http://(server's hostname or IP address):8080/]. The initial screen appears; click [Get started].
[5] You will need to create an admin account the first time you access the site.
Enter the required information and click [Create Admin Account].
[6] Once your account is created, you will be taken to the default page of Open WebUI.
[7] On subsequent visits, you can log in with your registered email address and password.
[8] Additional users can be registered from the admin account's Admin Panel.
[9] To chat, select a model you have loaded into Ollama from the menu at the top, enter a message in the input box at the bottom, and you will receive a reply. If no models are listed, load one into Ollama first, as shown below.
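A model must be loaded into Ollama before it appears in the menu. For example, to pull one and confirm it is available (the model name here is only an example; any model from the Ollama library works):

root@dlp:~# ollama pull llama3.2

root@dlp:~# ollama list

The pulled model should then be selectable from the model menu in Open WebUI.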