
AI-powered CMS

A monorepo where LLMs build, containerize, and deploy web applications just by committing code

AI-powered · DevOps · Self-Hosted

The Idea

What if an AI coding assistant could go from idea to deployed web application in a single conversation? Not just write the code — but containerize it, configure the deployment, and ship it to production automatically?

That's what this project does. It's a monorepo with a unified CI/CD pipeline where every folder with a Dockerfile is a deployable application. An LLM like Claude Code creates a new folder, writes the app, adds a Dockerfile and a small config file, and pushes to main. Minutes later, the app is live on Docker Swarm.

How It Works

  1. Create a folder at the repo root with your application code, a Dockerfile, and a small app.yaml config
  2. Push to main. The CI pipeline automatically discovers which apps changed by diffing the commit
  3. Docker images are built on a staging runner and pushed to a private container registry
  4. Deploy jobs fan out in parallel to the correct Docker Swarm cluster based on each app's deploy_target
  5. The app is live. Docker Swarm performs a rolling update with zero downtime
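The discovery step (2) can be sketched in a few lines. This is a hedged illustration, not the repo's actual pipeline code: it assumes the pipeline takes the paths reported by `git diff --name-only`, keeps the first path component of each, and treats it as a deployable app if that top-level folder is known to contain a Dockerfile.

```python
# Hypothetical sketch of the app-discovery step. The input lists and the
# folder names (gloop, tunerweb, destroyers) are illustrative assumptions.

def discover_apps(changed_paths, app_folders):
    """Return the set of app folders touched by a commit.

    changed_paths: paths reported by `git diff --name-only`
    app_folders:   top-level folders known to contain a Dockerfile
    """
    touched = set()
    for path in changed_paths:
        top = path.split("/", 1)[0]  # first path component = candidate app
        if top in app_folders:
            touched.add(top)
    return touched


changed = ["gloop/src/App.tsx", "gloop/Dockerfile", "README.md", "tunerweb/app.yaml"]
apps = {"gloop", "tunerweb", "destroyers"}
print(sorted(discover_apps(changed, apps)))  # → ['gloop', 'tunerweb']
```

Only the two touched apps get rebuilt; edits to repo-level files like README.md trigger nothing.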

What Makes It AI-Powered

The pipeline itself is conventional CI/CD. What makes this an AI-powered CMS is the workflow it enables. The entire monorepo is designed so that an LLM can be a first-class contributor:

Convention Over Configuration

A CLAUDE.md file at the repo root teaches any LLM the project conventions. Add a folder, a Dockerfile, and a three-line YAML — that's all it takes to ship.

Learn by Example

With 19 apps already in the repo, LLMs have plenty of working examples to reference. Need a React game? Look at Gloop. Need a Go backend? Look at Destroyers.

Full-Stack Autonomy

LLMs don't just write frontend code. They create multi-stage Dockerfiles, configure base paths for reverse proxies, set up backends with databases, and wire up deployment targets.
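A multi-stage Dockerfile of the kind an LLM produces here might look like the following sketch for a React app. The base images, paths, and build commands are illustrative assumptions, not a file from this repo:

```dockerfile
# Hypothetical multi-stage Dockerfile for a React app; names are illustrative.
# Stage 1: build the static bundle with Node
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the bundle with a tiny nginx image
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
EXPOSE 80
```

The multi-stage split keeps the Node toolchain out of the final image, so the deployed container ships only nginx and the built assets.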

Human in the Loop

The developer reviews and approves every commit. The AI proposes, the human disposes. Git history provides a complete audit trail of every change and who authored it.

What's Been Built With It

This monorepo currently hosts 19 containerized applications — the majority written primarily by AI coding assistants. The range of projects demonstrates the versatility of the approach:

Web Games

Productivity & Tools

Self-Hosted

  • TunerWeb — Live TV streaming
  • ZimContext — Offline knowledge
  • This website itself

Architecture

CI/CD Pipeline

  • Gitea Actions — self-hosted, GitHub Actions-compatible
  • Automatic app discovery via git diff
  • Base64-encoded matrix for multi-app builds
  • Parallel deploy jobs per target cluster
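The base64-encoded matrix deserves a note: when several apps change in one commit, the build job must hand the list to downstream jobs as a single output string, and base64 keeps the JSON safe from shell quoting and newline mangling. A minimal round-trip sketch (key names are assumptions, not the pipeline's real schema):

```python
# Sketch of passing a multi-app build matrix between CI jobs as one
# base64-encoded JSON token. App entries are illustrative.
import base64
import json

apps = [
    {"name": "gloop", "deploy_target": "smallprod"},
    {"name": "tunerweb", "deploy_target": "smallprod"},
]

# Producer job: serialize the matrix into one opaque, shell-safe token.
token = base64.b64encode(json.dumps(apps).encode()).decode()

# Consumer job: decode the token and fan out one deploy job per entry.
matrix = json.loads(base64.b64decode(token))
assert matrix == apps
```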

Infrastructure

  • Docker Swarm for orchestration
  • Private container registry
  • Multiple deploy targets across servers
  • Rolling updates with zero downtime
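Zero-downtime rolling updates come from Swarm's per-service update policy. A hypothetical stack-file fragment (service and image names are illustrative, not from this repo) might configure it like this:

```yaml
# Hypothetical Swarm stack fragment; names and values are assumptions.
services:
  myapp:
    image: registry.example.com/my-app:latest
    deploy:
      replicas: 2
      update_config:
        parallelism: 1        # replace one replica at a time
        order: start-first    # start the new task before stopping the old
        failure_action: rollback
```

With `start-first` ordering and more than one replica, there is always a healthy task serving traffic while its sibling is being replaced.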

The Three-Line Deploy Config

Every app needs just one config file to go from code to production. The pipeline reads this file to determine what to name the image, which service to update, and where to deploy it:

# app.yaml
name: my-app
service_name: cdprod_myapp
deploy_target: smallprod
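To make the flow concrete, here is a hedged sketch of how a deploy job might consume that file. Since app.yaml is just `key: value` lines, a stdlib-only parse suffices for illustration; the registry host and image tag are assumptions, not the project's real values:

```python
# Sketch of a deploy job reading app.yaml; registry host and tag format
# are illustrative assumptions.

def parse_app_yaml(text):
    """Parse simple `key: value` lines, ignoring comments and blanks."""
    config = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" in line:
            key, value = line.split(":", 1)
            config[key.strip()] = value.strip()
    return config


app_yaml = """\
# app.yaml
name: my-app
service_name: cdprod_myapp
deploy_target: smallprod
"""

cfg = parse_app_yaml(app_yaml)
image = f"registry.example.com/{cfg['name']}:latest"
cmd = f"docker service update --image {image} {cfg['service_name']}"
print(cmd)
# → docker service update --image registry.example.com/my-app:latest cdprod_myapp
```

The `deploy_target` key tells the pipeline which Swarm cluster's runner executes that command.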

Why Not a Traditional CMS?

Traditional content management systems like WordPress separate content from code. That made sense when humans were the bottleneck — editing a blog post shouldn't require a developer.

But AI coding assistants blur that line. An LLM can write a complete React application as easily as it can write a paragraph of text. When your "content creator" can produce full-stack applications, the CMS doesn't need to abstract away the code — it needs to make shipping code frictionless.

That's what this monorepo does. Every project page on this website, every game, every tool — they're all just folders with code, Dockerfiles, and a tiny config file. The AI writes it, the human reviews it, and the pipeline ships it.

Perfect For

🤖 AI-First Teams — teams using LLMs as primary developers who need fast deploy cycles

🚀 Rapid Prototypers — go from idea to deployed app in a single conversation

📦 Multi-App Shops — manage many small apps without per-app CI/CD overhead

Requirements

  • Gitea instance with Actions enabled (or adaptable to GitHub Actions)
  • Docker Swarm cluster (one or more nodes)
  • Self-hosted runners with Docker build capability
  • An AI coding assistant like Claude Code, Cursor, or similar

Interested in This Approach?

We'd love to talk about how AI-powered development workflows can accelerate your team.

Get in Touch