
---
title: "Containerization Strategy Overview"
tags: ["kb"]
---

Containerization Strategy Overview

1. Executive Summary

This codebase employs a hybrid containerization strategy designed to support both GPU-accelerated AI applications and general-purpose Linux services. The core approach involves building monolithic, self-contained Docker images for each application, embedding all necessary dependencies. A clear distinction in base images is maintained: nvidia/cuda for compute-intensive AI workloads, and a custom scratch-based Ubuntu image (leveraging s6-overlay) for other applications. Reproducibility is a key focus, achieved through explicit version pinning of Git repositories and consistent package management.

2. System Architecture and Logic

The strategy integrates two primary containerization lineages:

  • AI/ML Applications: These build on nvidia/cuda:11.7.1-runtime-ubuntu20.04, which provides a GPU-enabled Ubuntu 20.04 environment out of the box. Dependency management varies by application, using either Miniconda or pip with virtualenv. Application code is cloned from Git repositories at specific commit hashes, pinning each build to an exact, reproducible version (see the first sketch after this list).
  • General-Purpose Applications: These build on a custom, minimal Ubuntu base image (linuxserver/ubuntu-baseimage) that is itself assembled through a multi-stage Docker build. This custom base integrates s6-overlay for robust process supervision (see the second sketch after this list).
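As a rough illustration, a Dockerfile for an AI/ML application along these lines might look like the sketch below. The repository URL, commit hash, port, and entrypoint are placeholders rather than values taken from the codebase:

```dockerfile
# Hypothetical sketch of an AI/ML application image on the CUDA runtime base.
FROM nvidia/cuda:11.7.1-runtime-ubuntu20.04

# System packages needed to fetch and run the application.
RUN apt-get update && \
    apt-get install -y --no-install-recommends git python3 python3-venv python3-pip && \
    rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Pin the application to an exact commit for reproducibility
# (repository URL and hash are placeholders).
RUN git clone https://github.com/example/ai-webui.git . && \
    git checkout 0123456789abcdef0123456789abcdef01234567

# Isolate Python dependencies in a virtualenv baked into the image.
RUN python3 -m venv /opt/venv && \
    /opt/venv/bin/pip install --no-cache-dir -r requirements.txt

# Dedicated model and output locations, populated at runtime.
VOLUME ["/models", "/output"]

EXPOSE 7860
CMD ["/opt/venv/bin/python", "launch.py", "--listen", "--port", "7860"]
```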
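A general-purpose application image would follow the second lineage. The sketch below assumes a linuxserver/ubuntu-baseimage tag, package name, port, and root/ overlay tree purely for illustration; the base image's s6-overlay /init acts as PID 1 and supervises whatever service definitions the image copies in:

```dockerfile
# Hypothetical sketch of a general-purpose service on the custom s6-overlay base.
# The tag, package, port, and root/ contents are illustrative assumptions.
FROM linuxserver/ubuntu-baseimage:focal

# Install the application itself (placeholder package name).
RUN apt-get update && \
    apt-get install -y --no-install-recommends example-service && \
    rm -rf /var/lib/apt/lists/*

# s6-overlay picks up service definitions copied into the image;
# /etc/services.d/<name>/run (or the newer s6-rc layout) starts the daemon
# and restarts it if it exits. The base image already sets /init as the entrypoint.
COPY root/ /

EXPOSE 8096
```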

Key Characteristics:

  • Monolithic Builds: Each application-specific Dockerfile (e.g., Stable Diffusion UIs, Naifu, Jellyfin) produces a single, comprehensive image that bundles all of its runtime dependencies and application code. While the custom base image is itself built in multiple stages (see the multi-stage sketch after this list), the final application images are designed to be self-contained.
  • Web Interfaces: All primary applications expose web interfaces on distinct ports for accessibility.
  • Model/Data Handling: AI applications are configured to use dedicated paths for models (/models) and outputs (/output), typically wired up at runtime via symlinks or volume mounts (see the runtime example after this list).
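The multi-stage construction of the custom base image could look roughly like the following sketch, in which the rootfs and s6-overlay download URLs and versions are placeholders: a preparation stage assembles an Ubuntu root filesystem plus s6-overlay, and the final stage starts FROM scratch so only that prepared filesystem ends up in the base.

```dockerfile
# Hypothetical multi-stage build for the custom, scratch-built Ubuntu base image.
# Stage 1: stage the root filesystem and s6-overlay (URLs and versions are placeholders).
FROM alpine:3.18 AS rootfs-stage
RUN apk add --no-cache curl tar xz
RUN mkdir /root-out && \
    curl -fsSL https://example.com/ubuntu-base-20.04-amd64.tar.gz | tar -xzf - -C /root-out && \
    curl -fsSL https://example.com/s6-overlay-amd64.tar.xz | tar -xJf - -C /root-out

# Stage 2: start FROM scratch so the final base contains only the prepared rootfs.
FROM scratch
COPY --from=rootfs-stage /root-out/ /

# s6-overlay's init becomes PID 1 and supervises services defined by downstream images.
ENTRYPOINT ["/init"]
```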
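For the model and output convention, a hedged sketch of how an image might standardize on /models and /output is shown below; the internal application path, host paths, and image name are illustrative assumptions:

```dockerfile
# Hypothetical fragment showing the shared model/output convention
# (base image and application paths are illustrative).
FROM nvidia/cuda:11.7.1-runtime-ubuntu20.04

# Replace the application's internal model directory with a symlink into /models,
# so a single host model store can be bind-mounted into every AI image.
RUN mkdir -p /models /output /app && \
    rm -rf /app/models && \
    ln -s /models /app/models

VOLUME ["/models", "/output"]

# Illustrative invocation (host paths and image name are placeholders):
#   docker run --gpus all -p 7860:7860 \
#     -v /srv/ai/models:/models -v /srv/ai/output:/output example/ai-app
```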