
Ollama + Open-WebUI + NVIDIA GPU

A Docker Compose stack for running local LLMs on your own hardware with NVIDIA GPU acceleration, a web UI, and one-command model setup via profiles.
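For orientation, the core of such a stack looks roughly like the sketch below. This is a minimal illustration, not the actual compose file in this gist; the service names, host port, and volume name are assumptions, while the images (`ollama/ollama`, `ghcr.io/open-webui/open-webui:main`), Ollama's default port 11434, and the Compose GPU reservation syntax are standard.

```yaml
# Minimal sketch: Ollama serves models with NVIDIA GPU access,
# Open-WebUI reaches it over the compose network.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama          # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all              # expose all GPUs to the container
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # web UI on http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```

One-command model setup via profiles then comes down to attaching model-pull services to named profiles and starting them with `docker compose --profile <name> up -d`; the profile names themselves are defined by the stack.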

[Screenshot: the Open-WebUI interface]

What's Included