Icon: 💻 | Color: Green | Status: Production Ready
Containerized development environments with distrobox, Claude Code integration, MCP servers, and local AI.
```bash
# Setup development environment
cd ~/kenl/KENL3-dev
./setup-devenv.sh ubuntu   # or fedora, debian

# Switch to dev context
cd ~/kenl/KENL5-facades
./switch-kenl.sh dev
```

What changes:
```mermaid
stateDiagram-v2
    [*] --> HostBazzite
    HostBazzite --> DevContainer: distrobox create
    DevContainer --> ClaudeCode: Claude Code setup
    ClaudeCode --> MCPServers: MCP integration
    MCPServers --> LocalAI: Ollama + Qwen
```
| Before | After |
|---|---|
| Host-only development | Isolated dev containers |
| No AI assistance | Claude Code + Local AI |
| Manual configuration | ATOM-tracked setups |
- 🐧 Distrobox Environments: Ubuntu 24.04, Fedora 41, Debian stable
- 🤖 Claude Code Integration: MCP servers, custom prompts
- 🧠 Local AI: Ollama with Qwen 2.5 models
- 📦 Devcontainers: VS Code/Claude Code `.devcontainer` configs
- 🔌 MCP Servers: Cloudflare, filesystem, GitHub integrations
- 📚 System Documentation: Project-specific docs and context
```text
KENL3-dev/
├── distrobox-envs/              # Distrobox environment configs
│   ├── ubuntu24-dev.ini         # Ubuntu 24.04 LTS
│   ├── fedora41-dev.ini         # Fedora 41
│   └── debian-stable.ini        # Debian stable
├── devcontainer/                # Devcontainer configurations
│   └── bazzite-example/         # Example Bazzite devcontainer
├── claude-code-setup/           # Claude Code configuration
│   └── claude-configuration-guide.md
├── mcp-configs/                 # MCP server configurations
│   ├── cloudflare.json          # Cloudflare MCP
│   ├── filesystem.json          # Local filesystem MCP
│   └── github.json              # GitHub MCP
├── local-ai/                    # Local AI setup (Ollama + Qwen)
│   ├── install-ollama.sh
│   ├── models/                  # Qwen model configs
│   └── prompts/                 # Custom prompts
├── system-docs/                 # System-specific documentation
│   └── .kenl/                   # KENL system context docs
└── setup-devenv.sh              # Main setup script
```
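The `.ini` files under `distrobox-envs/` follow the `distrobox assemble` manifest format. A minimal sketch of what `ubuntu24-dev.ini` could look like (contents are illustrative, not the repo's actual file):

```ini
# Illustrative manifest, consumed by: distrobox assemble create --file ubuntu24-dev.ini
[ubuntu24-dev]
image=docker.io/library/ubuntu:24.04
init=true
pull=true
additional_packages="git curl wget build-essential"
```

Running `distrobox assemble create --file distrobox-envs/ubuntu24-dev.ini` materializes the container from the manifest, which keeps environment definitions reproducible and version-controlled.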
```bash
# Create Ubuntu dev environment
./setup-devenv.sh ubuntu

# What it installs:
# - Python 3.12 + pip + venv
# - Node.js 20 LTS + npm
# - Git, curl, wget, build-essential
# - Docker CLI (for docker-compose)
# - Claude Code CLI
# - ATOM framework integration
```

Use for:
- Python projects (Django, Flask, FastAPI)
- Node.js/TypeScript development
- General-purpose development
- Claude Code primary environment
```bash
# Create Fedora dev environment
./setup-devenv.sh fedora

# What it installs:
# - Latest Podman, buildah, skopeo
# - Rust toolchain (rustc, cargo)
# - Go 1.21+
# - Development tools (gcc, clang, cmake)
```

Use for:
- Container development
- Rust/Go projects
- System programming
- Testing latest tools
```bash
# Create Debian dev environment
./setup-devenv.sh debian

# What it installs:
# - Long-term stable packages
# - Conservative toolchain
# - Minimal dependencies
```

Use for:
- Production-like environments
- Long-term projects
- Security-sensitive work
```bash
cd ~/kenl/KENL3-dev/claude-code-setup

# Install Claude Code CLI
curl -fsSL https://claude.ai/install.sh | sh

# Configure MCP servers
cp mcp-configs/*.json ~/.config/claude/
```

What changes:
```diff
  ~/.config/claude/
+ ├── mcp-servers.json    # MCP server registry
+ ├── prompts/            # Custom prompts
+ └── settings.json       # Claude Code settings
```

Cloudflare MCP (`mcp-configs/cloudflare.json`):
```json
{
  "mcpServers": {
    "cloudflare": {
      "command": "npx",
      "args": ["-y", "@cloudflare/mcp-server-cloudflare"],
      "env": {
        "CLOUDFLARE_API_TOKEN": "${CLOUDFLARE_API_TOKEN}",
        "CLOUDFLARE_ACCOUNT_ID": "${CLOUDFLARE_ACCOUNT_ID}"
      }
    }
  }
}
```

GitHub MCP (`mcp-configs/github.json`):
```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "${GITHUB_TOKEN}"
      }
    }
  }
}
```

Filesystem MCP (`mcp-configs/filesystem.json`):
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user"],
      "env": {}
    }
  }
}
```

```bash
cd ~/kenl/KENL3-dev/local-ai
./install-ollama.sh

# Download Qwen 2.5 models
ollama pull qwen2.5:7b          # 7B model (4GB RAM)
ollama pull qwen2.5:14b         # 14B model (8GB RAM)
ollama pull qwen2.5-coder:7b    # Code-specialized
```

Storage location: `/mnt/claude-ai/ollama` (external drive, KENL9)
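To actually keep models on the external drive, Ollama honors the documented `OLLAMA_MODELS` environment variable. If Ollama runs as a systemd service, a drop-in like this redirects model storage (the path comes from the storage location above; the drop-in filename is illustrative):

```ini
# /etc/systemd/system/ollama.service.d/storage.conf (illustrative drop-in)
[Service]
Environment="OLLAMA_MODELS=/mnt/claude-ai/ollama"
```

After adding the drop-in, reload and restart: `sudo systemctl daemon-reload && sudo systemctl restart ollama`.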
What changes:
```mermaid
flowchart LR
    A[Install Ollama] --> B[Download Qwen models]
    B --> C[Configure API endpoint]
    C --> D[Integrate with Claude Code]
    D --> E[✅ Local AI ready]
```
| Model | Size | RAM | Use Case |
|---|---|---|---|
| qwen2.5:7b | 4.7GB | 8GB | General assistance |
| qwen2.5:14b | 8.6GB | 16GB | Complex reasoning |
| qwen2.5-coder:7b | 4.7GB | 8GB | Code generation |
| qwen2.5:32b | 19GB | 32GB | Maximum capability |
```bash
# Direct Ollama usage
ollama run qwen2.5-coder:7b

# Via API (for Claude Code integration)
curl http://localhost:11434/api/generate -d '{
  "model": "qwen2.5-coder:7b",
  "prompt": "Write a Python function to parse YAML",
  "stream": false
}'
```
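The same call can be scripted. A minimal Python sketch using only the standard library; the endpoint and payload match the curl example, and the non-streaming reply is assumed to carry the generated text in Ollama's documented `response` field:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Build the non-streaming /api/generate request body."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """POST to the local Ollama API and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Example (requires a running Ollama daemon):
# print(generate("qwen2.5-coder:7b", "Write a Python function to parse YAML"))
```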
```bash
# Create project-specific environment
distrobox create \
  --name bazza-dev \
  --image docker.io/library/ubuntu:24.04 \
  --home ~/distrobox/bazza-dev \
  --init

# Enter environment
distrobox enter bazza-dev

# Inside container, install the KENL framework
cd ~/kenl/KENL1-framework/atom-sage-framework
./install.sh
```

```bash
# Export VS Code from container to host
distrobox-export --app code

# Now "code" command works from host, runs in container
# Icon appears in application menu
```

```bash
# Host directories auto-mounted in container:
# - $HOME     → /home/user
# - /mnt      → /mnt (game libraries, AI data)
# - /var/home → /var/home
# Containers can access all KENL modules directly!
```

See `devcontainer/bazzite-example/.devcontainer.json`:
```json
{
  "name": "Bazzite Development",
  "image": "ghcr.io/ublue-os/bazzite-arch:latest",
  "features": {
    "ghcr.io/devcontainers/features/common-utils:2": {},
    "ghcr.io/devcontainers/features/git:1": {}
  },
  "postCreateCommand": "bash .devcontainer/post-create.sh",
  "customizations": {
    "vscode": {
      "extensions": [
        "anthropics.claude-code",
        "ms-python.python",
        "ms-vscode.cpptools"
      ]
    }
  }
}
```

Post-create script (`post-create.sh`):
- Installs the KENL1 framework
- Configures ATOM trail
- Sets up MCP servers
- Installs local AI (optional)
The system-docs/.kenl/ directory contains host-specific documentation:
- `host-system-issues.md`: Known hardware/software issues
- `gpu-hang-warning.md`: NVIDIA GPU hang workarounds
- `REBASE_EXPECTATIONS.md`: rpm-ostree rebase guide
- `CURRENT_VS_POST_REBASE.md`: Version comparison
These docs travel with backups (KENL10) for disaster recovery context!
(See also: AI-INTEGRATION-GUIDE.md)
What AI helps with:
- Code generation and refactoring
- Debugging and error resolution
- Documentation writing
- Test creation
- Dependency management
- Architecture design
Recommended AI stack:
- Claude Code (primary): Real-time coding assistance
- Qwen 2.5 Coder (local): Privacy-sensitive code, offline work
- Perplexity (research): API documentation, library selection
Example workflow:
```bash
# Start Claude Code in dev environment
distrobox enter bazza-dev
cd ~/projects/my-app
claude code .

# Claude can now:
# - Access project via filesystem MCP
# - Query GitHub repos via GitHub MCP
# - Deploy to Cloudflare via Cloudflare MCP
# - Use local Qwen for code completion
```

Requires:
- KENL0 (system): Podman/distrobox installation
- KENL1 (framework): ATOM trail for environment tracking
Used by:
- KENL2 (gaming): Play Card creation (research + AI)
- KENL4 (monitoring): Dashboard development
- KENL11 (media): Docker compose configuration
```bash
# Create and enter environment
./setup-devenv.sh ubuntu my-project

# Install the KENL framework
cd ~/kenl/KENL1-framework/atom-sage-framework && ./install.sh

# Track environment creation
atom CONFIG "Created dev environment for my-project" "distrobox create my-project"
```

```bash
# Copy MCP configs
cp mcp-configs/*.json ~/.config/claude/

# Set environment variables
echo 'export CLOUDFLARE_API_TOKEN="..."' >> ~/.bashrc
echo 'export GITHUB_TOKEN="..."' >> ~/.bashrc
source ~/.bashrc

# Test MCP servers
claude mcp list
```

```bash
# Install Ollama
cd local-ai && ./install-ollama.sh

# Download code model
ollama pull qwen2.5-coder:7b

# Test
ollama run qwen2.5-coder:7b "Write a hello world in Rust"
```

```bash
# Inside container, install app
sudo apt install code

# Export to host
distrobox-export --app code

# Now "code" works from host launcher!
```

```bash
# Install on Bazzite
rpm-ostree install distrobox
systemctl reboot

# Or use toolbox (pre-installed)
toolbox create bazza-dev
toolbox enter bazza-dev
```

```bash
# Ensure Node.js available
node --version   # Should be 18+

# Install MCP server manually
npm install -g @cloudflare/mcp-server-cloudflare

# Verify
which mcp-server-cloudflare
```

```bash
# Check GPU usage
nvidia-smi    # For NVIDIA
radeontop     # For AMD

# Enable GPU passthrough in container
distrobox create --nvidia                                 # For NVIDIA
# or
distrobox create --additional-flags "--device=/dev/dri"   # For AMD
```

```bash
# Ensure mounts configured
distrobox create \
  --additional-flags "--volume=/mnt:/mnt:rslave"
```

- ← Root README: Overview of all KENL modules
- → KENL1 Framework: Core ATOM/SAGE framework
- → KENL4 Monitoring: Dashboard development
- → AI Integration Guide: Complete AI setup guide
Status: Production Ready | Icon: 💻 | AI Level: 🟩 Maximum Help