---
title: "Containerization Operational Patterns"
tags: ["kb"]
---

# Containerization Operational Patterns
This section describes the common operational instructions and patterns used to manage containers in this project.
## 1. Image Building and Reproducibility

- Command: `docker build` is used with the respective `Dockerfile`.
- Version Control: Application code repositories are cloned with `git clone` and then explicitly pinned to specific commit hashes via `git reset --hard` commands directly within the Dockerfiles. This ensures build reproducibility.
- Syntax Directive: Dockerfiles include the `# syntax=docker/dockerfile:1` directive.
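The pinning pattern above can be sketched as a minimal Dockerfile; the base image, repository URL, and commit hash are illustrative placeholders, not values from this project:

```dockerfile
# syntax=docker/dockerfile:1
FROM python:3.10-slim

RUN apt-get update && apt-get install -y --no-install-recommends git

# Clone the application, then hard-reset to an exact commit so the
# build is reproducible regardless of upstream branch movement.
# URL and hash are hypothetical examples.
RUN git clone https://example.com/example/app.git /app \
    && cd /app \
    && git reset --hard 0123456789abcdef0123456789abcdef01234567

WORKDIR /app
```

Pinning to a commit hash (rather than a branch or tag) is what makes the build deterministic: branches move and tags can be re-pointed, but a hash identifies one exact tree.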
## 2. Running Containers

Containers are designed to be run with exposed ports (`-p` argument) to enable access to their web user interfaces or APIs.

- `hlky/stable-diffusion`:
  - Exposes port: `7860`
  - Entrypoint: `python3 -u scripts/webui.py`
- `naifu`:
  - Exposes port: `6969`
  - Entrypoint: `./run.sh`
- `AUTOMATIC1111/stable-diffusion-webui`:
  - Exposes port: `7860`
  - Entrypoint: `python3 -u ../../webui.py`
  - Requires mounting `config.json` and potentially other helper scripts.
- `linuxserver/jellyfin`:
  - Exposes ports: `8096` (HTTP), optionally `8920` (HTTPS), `7359/udp`, `1900/udp`
  - Entrypoint: `/init` (due to `s6-overlay` initialization)
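As a sketch, the port mappings above translate into `docker run` invocations like the following; local image tags and host-side port choices are assumptions, not values documented by this project:

```shell
# Stable Diffusion web UI: publish the UI port on the host.
docker run --rm -p 7860:7860 hlky/stable-diffusion

# Jellyfin: HTTP port plus the optional HTTPS and discovery ports.
docker run -d \
  -p 8096:8096 \
  -p 8920:8920 \
  -p 7359:7359/udp \
  -p 1900:1900/udp \
  linuxserver/jellyfin
```

The UDP ports (`7359`, `1900`) are only needed for client auto-discovery on the local network; the web UI itself is reachable via `8096` alone.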
## 3. Configuration

- Environment Variables: `ENV` and `ARG` instructions within Dockerfiles are widely used for configuration. Common variables include:
  - `PATH`
  - `CLI_ARGS`
  - `TOKEN`
  - `PUID` (process user ID)
  - `PGID` (process group ID)
  - `TZ` (timezone)
  - Model paths
- Configuration Files: For `AUTOMATIC1111/stable-diffusion-webui`, `config.json` is copied into the image to define output directories, image generation parameters, and post-processing settings.
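A minimal Dockerfile fragment showing how these variables are typically declared; the default values here are illustrative assumptions, not this project's actual settings:

```dockerfile
# syntax=docker/dockerfile:1
FROM python:3.10-slim

# ARG values exist only at build time and can be overridden
# with `docker build --build-arg TOKEN=...`.
ARG TOKEN

# ENV values persist into the running container and can be
# overridden at runtime with `docker run -e ...`.
ENV CLI_ARGS="" \
    TZ="Etc/UTC" \
    PUID=1000 \
    PGID=1000

# Example of copying a configuration file into the image,
# as done for AUTOMATIC1111/stable-diffusion-webui.
COPY config.json /app/config.json
```

The `ARG`/`ENV` split matters: secrets like `TOKEN` passed via `ARG` do not persist as environment variables in the final container, while `ENV` values do (though build args can still be recovered from image history, so true secrets warrant other mechanisms).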
## 4. GPU Acceleration

- Crucial for AI Workloads: GPU access is essential for high-performance AI image generation.
- Runtime Requirement: Requires the `nvidia-docker` runtime on the host system.
- Environment Variable: The `NVIDIA_VISIBLE_DEVICES=all` environment variable is necessary to ensure GPU access from within the container.
- General Purpose: Jellyfin also documents hardware acceleration support for Intel, Nvidia, and Raspberry Pi in its `README.md`.
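Combining the runtime and environment-variable requirements above, a GPU-enabled run looks roughly like this; the image tag and port are carried over from the examples earlier and are assumptions:

```shell
# Use the NVIDIA container runtime and expose all host GPUs
# to the container.
docker run --rm \
  --runtime=nvidia \
  -e NVIDIA_VISIBLE_DEVICES=all \
  -p 7860:7860 \
  hlky/stable-diffusion
```

On hosts with a recent Docker and the NVIDIA Container Toolkit installed, `--gpus all` is an alternative to the `--runtime=nvidia` flag.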
## 5. Volume Mounting

- `/config`: A common volume for persistent application data, notably used by Jellyfin.
- `/output`: Used by the Stable Diffusion UIs for storing generated images.
- `/models`: Used by the AI applications for storing and accessing pre-downloaded AI models.
- Jellyfin Media Libraries: Expected at `/data/tvshows` and `/data/movies` for media content.
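Putting the volume, port, and environment-variable patterns together, a Jellyfin invocation might look like the following; the host-side paths under `/srv` are placeholders to be replaced with the actual config and media locations:

```shell
docker run -d \
  -e PUID=1000 -e PGID=1000 -e TZ=Etc/UTC \
  -v /srv/jellyfin/config:/config \
  -v /srv/media/tvshows:/data/tvshows \
  -v /srv/media/movies:/data/movies \
  -p 8096:8096 \
  linuxserver/jellyfin
```

Because application state lives on the mounted `/config` volume, the container itself stays disposable: it can be removed and recreated (e.g. for image upgrades) without losing library metadata or settings.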