Installing dependencies with `uv sync` system-wide in a Docker image
Getting uv right inside Docker is a bit tricky, and even the official recommendations are not optimal. It is better to use a two-step (multi-stage) build so that uv itself is excluded from the final image. A two-step build not only saves disk space but also reduces the attack surface for security vulnerabilities.
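As a sketch of that two-step idea (image tags, paths, and the `main.py` entrypoint are illustrative assumptions, not a verified recipe): build the environment in a stage that has uv, then copy only the app and its `.venv/` into a plain Python image of the same version:

```dockerfile
# --- Stage 1: resolve and install dependencies with uv available ---
FROM ghcr.io/astral-sh/uv:python3.13-alpine AS builder
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --locked --no-dev
COPY . .

# --- Stage 2: runtime image without uv ---
FROM python:3.13-alpine
WORKDIR /app
COPY --from=builder /app /app
# Put the venv's interpreter first on PATH; uv itself never enters this image
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "main.py"]
```

Note that copying a venv between images only works if both stages use the same Python minor version and platform, which is why the runtime stage pins `python:3.13-alpine` to match.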
EDIT: See my solution in the comments!
I've loved uv's local DevEx until now, but I can't seem to find the "recommended"/"best" way to install all my dependencies system-wide (without a venv) in a Docker image.
I've tried with a `pyproject.toml` (and associated `uv.lock`):

```toml
[project]
name = "api"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
    "fastapi>=0.115.6",
    # etc...
]

[dependency-groups]
dev = [
    "pytest>=8.3.4",
    # etc...
]
```

And a Dockerfile:
```dockerfile
FROM ghcr.io/astral-sh/uv:python3.13-alpine
#ENV UV_PROJECT_ENVIRONMENT="/usr/local/bin/" TODO ???
ENV PYTHONUNBUFFERED True
WORKDIR /app
COPY . .
RUN uv sync --locked --python-preference system --no-dev
```
BUT exec'ing into the image after build, I see no `.venv/` and:

```
/app # python main.py
Traceback (most recent call last):
  File "/app/main.py", line 1, in <module>
    from fastapi import FastAPI
ModuleNotFoundError: No module named 'fastapi'
```

- Should I be using `uv install --system ...` instead, or is that deprecated?
- Should I use `UV_PROJECT_ENVIRONMENT`?
- Should I simply run everything within a venv inside containers somehow?
Any help or suggestions are greatly appreciated!!
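For what it's worth, one approach consistent with the `UV_PROJECT_ENVIRONMENT` idea: uv treats that variable as the path of the project environment, so pointing it at the environment *root* (e.g. `/usr/local`, rather than `/usr/local/bin/` as in the commented-out line) should make `uv sync` install into the system interpreter instead of creating `.venv/`. A minimal sketch, assuming the same base image and a `main.py` entrypoint (treat this as a hypothesis to test, not a verified recipe):

```dockerfile
FROM ghcr.io/astral-sh/uv:python3.13-alpine

# Point uv's project environment at the system prefix (the env root, not bin/)
ENV UV_PROJECT_ENVIRONMENT=/usr/local
ENV PYTHONUNBUFFERED=1

WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --locked --no-dev
COPY . .

CMD ["python", "main.py"]
```

Alternatively, keeping the default `.venv/` and either launching with `uv run main.py` or prepending `/app/.venv/bin` to `PATH` sidesteps the system-wide install question entirely.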