Compare commits

..

3 Commits

Author SHA1 Message Date
b286b0a305 fix(demo): migrate Reactive Resume to SeaweedFS, fix Kiwix/Apple Health
- Replace MinIO + Chrome with SeaweedFS (S3) + bucket init container
- Update Reactive Resume to v5 config (S3_* env vars, APP_URL, AUTH_SECRET)
- Fix Kiwix: smaller ZIM download, graceful fallback on failure, start_period
- Fix Apple Health: use InfluxDB ping() instead of deprecated ready()
- Remove stale RESUME_CHROME_TOKEN and RESUME_REFRESH_TOKEN_SECRET
- Add .yamllint config to relax line-length for compose template
- Update validate-all.sh to use local yamllint config and new image refs
- Update unit tests for createbucket service (replaces chrome)

💘 Generated with Crush

Assisted-by: GLM-5.1 via Crush <crush@charm.land>
2026-05-08 14:22:57 -05:00
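The ping() healthcheck change in this commit can be sketched as follows — a minimal illustration of the probe logic, assuming the influxdb-client v2 Python API, where per the commit message ready() is deprecated in favor of ping(). `FakeClient` is a hypothetical stand-in for a real `InfluxDBClient`:

```python
# Sketch of the ping()-based health probe: always HTTP 200, with the
# payload's "status" reflecting whether InfluxDB is reachable.
def probe(client):
    try:
        if client.ping():  # influxdb_client's ping() returns a bool
            return {"status": "healthy"}, 200
        return {"status": "degraded", "influxdb": "not reachable"}, 200
    except Exception as exc:
        return {"status": "degraded", "error": str(exc)}, 200

class FakeClient:  # hypothetical stub standing in for InfluxDBClient
    def __init__(self, up):
        self.up = up
    def ping(self):
        return self.up

print(probe(FakeClient(True)))   # ({'status': 'healthy'}, 200)
print(probe(FakeClient(False)))  # ({'status': 'degraded', 'influxdb': 'not reachable'}, 200)
```

Returning 200 in all cases keeps the container healthcheck from flapping when InfluxDB restarts; the degraded state is reported in the body instead.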
ad59acbc28 fix(demo): fix Reactive Resume AUTH_SECRET, Kiwix ZIM download, Apple Health check
- Add AUTH_SECRET env var required by Reactive Resume
- Kiwix auto-downloads Wikipedia Medical ZIM on first start
- Simplify Apple Health healthcheck to use InfluxDB ready() API
- Add all missing service config vars to ensure_env bootstrapping

💘 Generated with Crush

Assisted-by: GLM-5.1 via Crush <crush@charm.land>
2026-05-08 12:49:57 -05:00
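The ensure_env bootstrapping mentioned above appends a `KEY=default` line only when the key is not already present, so re-running the setup script never duplicates entries. A Python sketch of the same idempotent pattern (the real script does this with grep/echo against the .env file):

```python
# Idempotent env bootstrapping: add KEY=default only if KEY is unset.
def ensure_env(lines, key, default):
    if not any(line.startswith(key + "=") for line in lines):
        lines.append(f"{key}={default}")
    return lines

env = ["MAILHOG_PORT=4017"]
ensure_env(env, "AUTH_SECRET", "demo_secret")  # missing -> appended
ensure_env(env, "MAILHOG_PORT", "9999")        # present -> no-op
print(env)  # ['MAILHOG_PORT=4017', 'AUTH_SECRET=demo_secret']
```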
25f7a6cd75 feat(demo): migrate 5 SelfStack services to demo stack (16→24 services)
Add Reactive Resume, Metrics, Kiwix, Resume Matcher, and Apple Health
from the earlier SelfStack project. Rewrite Apple Health collector to
use InfluxDB v2 with proper error handling. Update all tests, scripts,
Homepage config, env template, and documentation for the expanded stack.

New services:
- Reactive Resume (4016) + Postgres/Minio/Chrome companions
- Metrics (4021) - GitHub metrics visualization
- Kiwix (4022) - offline wiki reader
- Resume Matcher (4023) - AI resume screening
- Apple Health (4024) - health data collector → InfluxDB v2

Also adds git policy to AGENTS.md: always commit and push automatically.

💘 Generated with Crush

Assisted-by: GLM-5.1 via Crush <crush@charm.land>
2026-05-08 12:28:56 -05:00
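Several diffs below hinge on Homepage's label-based service discovery: a container appears on the dashboard only because it carries `homepage.*` Docker labels. Conceptually (a sketch of the idea, not Homepage's actual implementation):

```python
# Group containers by their homepage.* labels; unlabeled ones are skipped.
def discover(containers):
    groups = {}
    for labels in containers:
        if "homepage.name" not in labels:
            continue  # no homepage labels -> not shown on the dashboard
        group = labels.get("homepage.group", "Other")
        groups.setdefault(group, []).append(
            {"name": labels["homepage.name"], "href": labels.get("homepage.href")}
        )
    return groups

result = discover([
    {"homepage.group": "Productivity", "homepage.name": "Reactive Resume",
     "homepage.href": "http://localhost:4016"},
    {"some.other.label": "x"},
])
print(result)
# {'Productivity': [{'name': 'Reactive Resume', 'href': 'http://localhost:4016'}]}
```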
23 changed files with 769 additions and 23 deletions

View File

@@ -6,9 +6,15 @@ This repository contains a Docker Compose-based multi-service stack that provide
### Project Type
- **Infrastructure as Code**: Docker Compose with shell orchestration
- **Multi-Service Stack**: 24 services across 5 categories
- **Demo-First Architecture**: All configurations for demonstration purposes only
### Git Policy
- **ALWAYS commit** after every logical unit of work — never ask permission
- **ALWAYS push** after every commit — never ask permission
- **Commit early, commit often** — small focused commits are preferred over large ones
- **Never ask** "should I commit?" or "should I push?" — just do it
### Directory Structure
```
TSYSDevStack-SupportStack-LocalWorkstation/
@@ -43,6 +49,11 @@ TSYSDevStack-SupportStack-LocalWorkstation/
│ │ ├── tubearchivist/ # Tube Archivist configuration
│ │ ├── wakapi/ # Wakapi configuration
│ │ ├── mailhog/ # MailHog configuration
│ │ ├── applehealth/ # Apple Health configuration
│ │ ├── metrics/ # Metrics configuration
│ │ ├── reactiveresume/ # Reactive Resume configuration
│ │ ├── kiwix/ # Kiwix configuration
│ │ ├── resumematcher/ # Resume Matcher configuration
│ │ └── atuin/ # Atuin configuration
│ └── docs/ # Additional documentation
│ ├── service-guides/ # Service-specific guides
@@ -125,13 +136,16 @@ docker run --rm -v "$(pwd):/workdir" hadolint/hadolint <path-to-dockerfile>
- Pi-hole (4006) - DNS management with ad blocking
- Dockhand (4007) - Web-based container management
2. **Monitoring & Observability** (ports 4008-4009, 4021, 4024)
- InfluxDB (4008) - Time series database for metrics
- Grafana (4009) - Visualization platform
- Metrics (4021) - GitHub metrics visualization
- Apple Health (4024) - Health data collection
3. **Documentation & Diagramming** (ports 4010-4011, 4022)
- Draw.io (4010) - Web-based diagramming application
- Kroki (4011) - Diagrams as a service
- Kiwix (4022) - Offline wiki reader
4. **Developer Tools** (ports 4000, 4012-4018)
- Homepage (4000) - Central dashboard for service discovery
@@ -142,7 +156,11 @@ docker run --rm -v "$(pwd):/workdir" hadolint/hadolint <path-to-dockerfile>
- MailHog (4017 Web, 4019 SMTP) - Web and API based SMTP testing
- Atuin (4018) - Magical shell history synchronization
5. **Productivity** (ports 4016, 4023)
- Reactive Resume (4016) - Resume builder
- Resume Matcher (4023) - AI resume screening
6. **Companion Services** (internal only, no host ports)
- ta-redis - Redis cache for Tube Archivist
- ta-elasticsearch - Elasticsearch index for Tube Archivist
@@ -168,7 +186,7 @@ docker run --rm -v "$(pwd):/workdir" hadolint/hadolint <path-to-dockerfile>
### Docker Labels (Service Discovery)
```yaml
labels:
  homepage.group: "Infrastructure" # Category (Infrastructure|Monitoring|Documentation|Developer Tools|Productivity)
  homepage.name: "Service Display Name" # Human-readable name
  homepage.icon: "icon-name" # Icon identifier
  homepage.href: "http://localhost:PORT" # Access URL
@@ -267,7 +285,7 @@ Before ANY file is created or modified:
### Service Discovery Mechanism
- **Homepage Labels**: Services automatically discovered via Docker labels
- **No Manual Config**: Don't manually add services to Homepage configuration
- **Group-Based**: Services organized by group (Infrastructure, Monitoring, Documentation, Developer Tools, Productivity)
- **Real-Time**: Homepage updates automatically as services start/stop
### FOSS Only Policy
@@ -279,7 +297,7 @@ Before ANY file is created or modified:
## Project-Specific Context
### Current State
- **Demo Environment**: Fully configured with 24 services
- **Production Environment**: Placeholder only, not yet implemented
- **Documentation**: Comprehensive (AGENTS.md, PRD.md, README.md)
- **Scripts**: Complete orchestration and testing scripts available
@@ -344,10 +362,32 @@ WAKAPI_PORT=4015
MAILHOG_PORT=4017
MAILHOG_SMTP_PORT=4019
ATUIN_PORT=4018
REACTIVE_RESUME_PORT=4016
RESUME_MINIO_PORT=4020
METRICS_PORT=4021
KIWIX_PORT=4022
RESUME_MATCHER_PORT=4023
APPLEHEALTH_PORT=4024
# Demo Credentials (NOT FOR PRODUCTION)
DEMO_ADMIN_USER=admin
DEMO_ADMIN_PASSWORD=demo_password
# Reactive Resume
RESUME_POSTGRES_DB=reactive_resume
RESUME_POSTGRES_USER=resume_user
RESUME_POSTGRES_PASSWORD=demo_password
RESUME_MINIO_USER=minio_user
RESUME_MINIO_PASSWORD=demo_password
RESUME_CHROME_TOKEN=demo_token
RESUME_ACCESS_TOKEN_SECRET=demo_secret
RESUME_REFRESH_TOKEN_SECRET=demo_secret
# Metrics
METRICS_GITHUB_TOKEN=
# Apple Health
APPLEHEALTH_INFLUXDB_BUCKET=applehealth
```
## Key Files Reference
@@ -390,5 +430,5 @@ DEMO_ADMIN_PASSWORD=demo_password
---
**Last Updated**: 2026-05-08
**Version**: 2.0

View File

@@ -4,14 +4,15 @@ A Docker Compose-based multi-service stack of FOSS applications that run locally
## What It Does
Deploys 24 services across 6 categories via a single command:
| Category | Services |
|----------|----------|
| **Infrastructure** | Homepage (dashboard), Pi-hole (DNS), Dockhand (Docker management), Docker Socket Proxy |
| **Monitoring** | InfluxDB (time series), Grafana (visualization), Metrics (GitHub metrics), Apple Health (health data) |
| **Documentation** | Draw.io (diagramming), Kroki (diagrams as code), Kiwix (offline wiki) |
| **Developer Tools** | Atomic Tracker, ArchiveBox, Tube Archivist, Wakapi, MailHog, Atuin |
| **Productivity** | Reactive Resume (resume builder), Resume Matcher (AI resume screening) |
## Quick Start

demo/.yamllint Normal file
View File

@@ -0,0 +1,16 @@
---
extends: default
rules:
  line-length:
    max: 160
    allow-non-breakable-words: true
  empty-lines:
    max: 2
    max-start: 0
    max-end: 0
  document-start: disable
  comments:
    min-spaces-from-content: 1
  truthy:
    allowed-values: ["true", "false"]
    check-keys: false

View File

@@ -76,9 +76,10 @@ Before ANY file is created or modified:
### Service Categories
- **Infrastructure Services**: Core platform services
- **Monitoring & Observability**: Metrics, visualization, and health data
- **Documentation & Diagramming**: Knowledge management
- **Developer Tools**: Productivity enhancers
- **Productivity**: Resume building and screening tools
### Design Patterns
- **Service Discovery**: Automatic via Homepage dashboard

View File

@@ -210,6 +210,8 @@ graph TD
| **Container Management** (Dockhand) | Docker socket (direct mount) | 🔗 Required |
| **Visualization Platform** (Grafana) | Time Series Database (InfluxDB) | 🔗 Required |
| **Video Archiving** (Tube Archivist) | Redis (ta-redis) + Elasticsearch (ta-elasticsearch) | 🔗 Required |
| **Resume Builder** (Reactive Resume) | Postgres + Minio + Chrome | 🔗 Required |
| **Health Data** (Apple Health) | InfluxDB | 🔗 Required |
| **All Other Services** | None | ✅ Standalone |
---
@@ -411,6 +413,6 @@ When reporting issues, please include:
**🎉 Happy Developing!**
*Last updated: 2026-05-08*
</div>

View File

@@ -0,0 +1,15 @@
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
EXPOSE 5353
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
CMD python -c "import urllib.request; urllib.request.urlopen('http://localhost:5353/health')" || exit 1
CMD ["python", "app.py"]

View File

@@ -0,0 +1,164 @@
import json
import os
import sys
import logging

from flask import Flask, request, jsonify
from influxdb_client import InfluxDBClient
from influxdb_client.client.write_api import SYNCHRONOUS

DATAPOINTS_CHUNK = 80000

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s",
    handlers=[logging.StreamHandler(sys.stdout)],
)
logger = logging.getLogger(__name__)

app = Flask(__name__)

INFLUXDB_URL = os.environ.get("INFLUXDB_URL", "http://influxdb:8086")
INFLUXDB_TOKEN = os.environ.get("INFLUXDB_TOKEN", "")
INFLUXDB_ORG = os.environ.get("INFLUXDB_ORG", "tsysdemo")
INFLUXDB_BUCKET = os.environ.get("INFLUXDB_BUCKET", "demo_metrics")

_client = None
_write_api = None


def get_client():
    global _client
    if _client is None:
        _client = InfluxDBClient(
            url=INFLUXDB_URL, token=INFLUXDB_TOKEN, org=INFLUXDB_ORG
        )
    return _client


def get_write_api():
    global _write_api
    if _write_api is None:
        _write_api = get_client().write_api(write_options=SYNCHRONOUS)
    return _write_api


@app.route("/health", methods=["GET"])
def health():
    try:
        client = get_client()
        ping = client.ping()
        if ping:
            return jsonify({"status": "healthy"}), 200
        return jsonify({"status": "degraded", "influxdb": "not reachable"}), 200
    except Exception as exc:
        return jsonify({"status": "degraded", "error": str(exc)}), 200


@app.route("/", methods=["GET"])
def index():
    return jsonify(
        {
            "service": "apple-health-collector",
            "endpoints": {
                "health": "GET /health",
                "collect": "POST /collect (JSON body)",
            },
            "influxdb": {
                "url": INFLUXDB_URL,
                "org": INFLUXDB_ORG,
                "bucket": INFLUXDB_BUCKET,
            },
        }
    )


@app.route("/collect", methods=["POST"])
def collect():
    logger.info("Health data collection request received")
    if not request.data:
        return jsonify({"error": "No data provided"}), 400
    try:
        healthkit_data = json.loads(request.data)
    except (json.JSONDecodeError, ValueError) as exc:
        logger.error("Invalid JSON: %s", exc)
        return jsonify({"error": "Invalid JSON", "detail": str(exc)}), 400
    points_written = 0
    try:
        metrics = healthkit_data.get("data", {}).get("metrics", [])
        for metric in metrics:
            measurement = metric.get("name", "unknown")
            for datapoint in metric.get("data", []):
                timestamp = datapoint.get("date")
                if not timestamp:
                    continue
                fields = {}
                tags = {}
                for key, value in datapoint.items():
                    if key == "date":
                        continue
                    if isinstance(value, (int, float)):
                        fields[key] = float(value)
                    else:
                        tags[key] = str(value)
                if not fields:
                    continue
                record = {
                    "measurement": measurement,
                    "tags": tags,
                    "fields": fields,
                    "time": timestamp,
                }
                get_write_api().write(
                    bucket=INFLUXDB_BUCKET, org=INFLUXDB_ORG, record=record
                )
                points_written += 1
        workouts = healthkit_data.get("data", {}).get("workouts", [])
        for workout in workouts:
            workout_name = workout.get("name", "unknown")
            workout_start = workout.get("start", "")
            workout_end = workout.get("end", "")
            workout_id = f"{workout_name}-{workout_start}-{workout_end}"
            for gps_point in workout.get("route", []):
                ts = gps_point.get("timestamp")
                if not ts:
                    continue
                record = {
                    "measurement": "workout_route",
                    "tags": {
                        "workout_id": workout_id,
                        "workout_name": workout_name,
                    },
                    "fields": {
                        "lat": float(gps_point.get("lat", 0)),
                        "lng": float(gps_point.get("lon", 0)),
                    },
                    "time": ts,
                }
                get_write_api().write(
                    bucket=INFLUXDB_BUCKET, org=INFLUXDB_ORG, record=record
                )
                points_written += 1
        logger.info("Wrote %d data points", points_written)
        return jsonify({"status": "success", "points_written": points_written}), 200
    except Exception as exc:
        logger.exception("Error processing health data")
        return jsonify({"error": "Processing failed", "detail": str(exc)}), 500


if __name__ == "__main__":
    logger.info("Apple Health data collector starting")
    logger.info("InfluxDB: %s", INFLUXDB_URL)
    logger.info("Bucket: %s", INFLUXDB_BUCKET)
    app.run(host="0.0.0.0", port=5353)
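The collect() handler above partitions each HealthKit datapoint: numeric values become InfluxDB fields, all other values become tags, and the `date` key supplies the point's timestamp. That split, in isolation:

```python
# Partition one HealthKit datapoint into (fields, tags), as collect() does.
def split_datapoint(datapoint):
    fields, tags = {}, {}
    for key, value in datapoint.items():
        if key == "date":
            continue  # consumed separately as the point's timestamp
        if isinstance(value, (int, float)):
            fields[key] = float(value)
        else:
            tags[key] = str(value)
    return fields, tags

fields, tags = split_datapoint({"date": "2026-05-08", "qty": 72, "source": "Watch"})
print(fields)  # {'qty': 72.0}
print(tags)    # {'source': 'Watch'}
```

Points with no numeric keys produce empty fields and are skipped, since InfluxDB requires at least one field per point.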

View File

@@ -0,0 +1,2 @@
flask
influxdb-client

View File

@@ -34,6 +34,16 @@
        username: admin
        password: demo_password
    - Metrics:
        href: http://localhost:4021
        description: GitHub metrics visualization
        icon: github.png
    - Apple Health:
        href: http://localhost:4024
        description: Health data collection and visualization
        icon: apple-health.png
- Documentation:
    - Draw.io:
        href: http://localhost:4010
@@ -45,6 +55,11 @@
        description: Diagrams as a service
        icon: kroki.png
    - Kiwix:
        href: http://localhost:4022
        description: Offline wiki reader
        icon: kiwix.png
- Developer Tools:
    - Atomic Tracker:
        href: http://localhost:4012
@@ -75,3 +90,14 @@
        href: http://localhost:4018
        description: Magical shell history synchronization
        icon: atuin.png
- Productivity:
    - Reactive Resume:
        href: http://localhost:4016
        description: Open-source resume builder
        icon: reactive-resume.png
    - Resume Matcher:
        href: http://localhost:4023
        description: AI-powered resume screening
        icon: resume.png

View File

@@ -19,6 +19,9 @@ layout:
  Developer Tools:
    style: row
    columns: 3
  Productivity:
    style: row
    columns: 2
providers:
  docker:

View File

@@ -0,0 +1,96 @@
{
"token": "GITHUB_API_TOKEN_PLACEHOLDER",
"modes": ["embed", "insights"],
"restricted": [],
"maxusers": 0,
"cached": 3600000,
"ratelimiter": null,
"port": 3000,
"optimize": true,
"debug": false,
"debug.headless": false,
"mocked": false,
"repositories": 100,
"padding": ["0", "8 + 11%"],
"outputs": ["svg", "png", "json"],
"hosted": {
"by": "",
"link": ""
},
"oauth": {
"id": null,
"secret": null,
"url": "https://example.com"
},
"api": {
"rest": null,
"graphql": null
},
"control": {
"token": null
},
"community": {
"templates": []
},
"templates": {
"default": "classic",
"enabled": []
},
"extras": {
"default": false,
"features": false,
"logged": [
"metrics.api.github.overuse"
]
},
"plugins.default": false,
"plugins": {
"isocalendar": { "enabled": false },
"languages": { "enabled": false },
"stargazers": { "worldmap.token": null, "enabled": false },
"lines": { "enabled": false },
"topics": { "enabled": false },
"stars": { "enabled": false },
"licenses": { "enabled": false },
"habits": { "enabled": false },
"contributors": { "enabled": false },
"followup": { "enabled": false },
"reactions": { "enabled": false },
"people": { "enabled": false },
"sponsorships": { "enabled": false },
"sponsors": { "enabled": false },
"repositories": { "enabled": false },
"discussions": { "enabled": false },
"starlists": { "enabled": false },
"calendar": { "enabled": false },
"achievements": { "enabled": false },
"notable": { "enabled": false },
"activity": { "enabled": false },
"traffic": { "enabled": false },
"code": { "enabled": false },
"gists": { "enabled": false },
"projects": { "enabled": false },
"introduction": { "enabled": false },
"skyline": { "enabled": false },
"support": { "enabled": false },
"pagespeed": { "token": "", "enabled": false },
"tweets": { "token": "", "enabled": false },
"stackoverflow": { "enabled": false },
"anilist": { "enabled": false },
"music": { "token": "", "enabled": false },
"posts": { "enabled": false },
"rss": { "enabled": false },
"wakatime": { "token": "", "enabled": false },
"leetcode": { "enabled": false },
"steam": { "token": "", "enabled": false },
"16personalities": { "enabled": false },
"chess": { "token": "", "enabled": false },
"crypto": { "enabled": false },
"fortune": { "enabled": false },
"nightscout": { "enabled": false },
"poopmap": { "token": "", "enabled": false },
"screenshot": { "enabled": false },
"splatoon": { "token": "", "statink.token": null, "enabled": false },
"stock": { "token": "", "enabled": false }
}
}

View File

@@ -27,6 +27,12 @@ WAKAPI_PORT=4015
MAILHOG_PORT=4017
MAILHOG_SMTP_PORT=4019
ATUIN_PORT=4018
REACTIVE_RESUME_PORT=4016
RESUME_MINIO_PORT=4020
METRICS_PORT=4021
KIWIX_PORT=4022
RESUME_MATCHER_PORT=4023
APPLEHEALTH_PORT=4024
# Network Configuration
NETWORK_SUBNET=192.168.3.0/24
@@ -84,3 +90,17 @@ WAKAPI_PASSWORD_SALT=demo_salt_replace_in_production
# Atuin Configuration
ATUIN_HOST=0.0.0.0
ATUIN_OPEN_REGISTRATION=true
# Reactive Resume Configuration (v5)
RESUME_POSTGRES_DB=reactiveresume
RESUME_POSTGRES_USER=postgres
RESUME_POSTGRES_PASSWORD=demo_password
RESUME_MINIO_USER=minioadmin
RESUME_MINIO_PASSWORD=minioadmin
RESUME_ACCESS_TOKEN_SECRET=access_token_secret_demo
# Metrics Configuration
METRICS_GITHUB_TOKEN=GITHUB_API_TOKEN_PLACEHOLDER
# Apple Health Configuration
APPLEHEALTH_INFLUXDB_BUCKET=demo_metrics

View File

@@ -43,6 +43,14 @@ volumes:
    driver: local
  ${COMPOSE_PROJECT_NAME}_atuin_data:
    driver: local
  ${COMPOSE_PROJECT_NAME}_reactiveresume_postgres_data:
    driver: local
  ${COMPOSE_PROJECT_NAME}_reactiveresume_minio_data:
    driver: local
  ${COMPOSE_PROJECT_NAME}_kiwix_data:
    driver: local
  ${COMPOSE_PROJECT_NAME}_resumematcher_data:
    driver: local
services:
  # Docker Socket Proxy - Security Layer
@@ -585,3 +593,266 @@ services:
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: 5
      start_period: 30s
  # Reactive Resume - Postgres Database
  reactiveresume-postgres:
    image: postgres:16-alpine
    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-postgres"
    restart: unless-stopped
    networks:
      - ${COMPOSE_NETWORK_NAME}
    volumes:
      - ${COMPOSE_PROJECT_NAME}_reactiveresume_postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: ${RESUME_POSTGRES_DB}
      POSTGRES_USER: ${RESUME_POSTGRES_USER}
      POSTGRES_PASSWORD: ${RESUME_POSTGRES_PASSWORD}
    deploy:
      resources:
        limits:
          memory: 256M
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${RESUME_POSTGRES_USER} -d ${RESUME_POSTGRES_DB}"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: 5

  # Reactive Resume - SeaweedFS (S3 Storage)
  reactiveresume-minio:
    image: chrislusf/seaweedfs:latest
    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-minio"
    restart: unless-stopped
    command: server -s3 -filer -dir=/data -ip=0.0.0.0
    networks:
      - ${COMPOSE_NETWORK_NAME}
    ports:
      - "${RESUME_MINIO_PORT}:8333"
    volumes:
      - ${COMPOSE_PROJECT_NAME}_reactiveresume_minio_data:/data
    environment:
      AWS_ACCESS_KEY_ID: ${RESUME_MINIO_USER}
      AWS_SECRET_ACCESS_KEY: ${RESUME_MINIO_PASSWORD}
    deploy:
      resources:
        limits:
          memory: 256M
    healthcheck:
      test: ["CMD", "wget", "-q", "-O", "/dev/null", "http://localhost:8888"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: ${HEALTH_CHECK_RETRIES}
      start_period: 10s

  # Reactive Resume - Create S3 Bucket
  reactiveresume-createbucket:
    image: quay.io/minio/mc:latest
    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-createbucket"
    restart: on-failure
    networks:
      - ${COMPOSE_NETWORK_NAME}
    entrypoint:
      - /bin/sh
      - -c
      - |
        sleep 5
        mc alias set seaweedfs http://reactiveresume-minio:8333 ${RESUME_MINIO_USER} ${RESUME_MINIO_PASSWORD}
        mc mb seaweedfs/reactive-resume
        exit 0
    depends_on:
      reactiveresume-minio:
        condition: service_healthy

  # Reactive Resume - Resume Builder
  reactiveresume-app:
    image: amruthpillai/reactive-resume:latest
    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-app"
    restart: unless-stopped
    networks:
      - ${COMPOSE_NETWORK_NAME}
    ports:
      - "${REACTIVE_RESUME_PORT}:3000"
    depends_on:
      reactiveresume-postgres:
        condition: service_healthy
      reactiveresume-minio:
        condition: service_healthy
      reactiveresume-createbucket:
        condition: service_completed_successfully
    environment:
      PORT: 3000
      NODE_ENV: production
      APP_URL: http://localhost:${REACTIVE_RESUME_PORT}
      DATABASE_URL: postgresql://${RESUME_POSTGRES_USER}:${RESUME_POSTGRES_PASSWORD}@reactiveresume-postgres:5432/${RESUME_POSTGRES_DB}
      AUTH_SECRET: ${RESUME_ACCESS_TOKEN_SECRET}
      S3_ACCESS_KEY_ID: ${RESUME_MINIO_USER}
      S3_SECRET_ACCESS_KEY: ${RESUME_MINIO_PASSWORD}
      S3_ENDPOINT: http://reactiveresume-minio:8333
      S3_BUCKET: reactive-resume
      S3_FORCE_PATH_STYLE: "true"
    labels:
      homepage.group: "Productivity"
      homepage.name: "Reactive Resume"
      homepage.icon: "reactive-resume"
      homepage.href: "http://localhost:${REACTIVE_RESUME_PORT}"
      homepage.description: "Open-source resume builder"
    deploy:
      resources:
        limits:
          memory: 512M
    healthcheck:
      test: ["CMD", "node", "-e", "fetch('http://127.0.0.1:3000/api/health').then((r) => { if (!r.ok) process.exit(1); }).catch(() => process.exit(1));"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: 5
      start_period: 30s

  # Metrics - GitHub Metrics Visualization
  metrics:
    image: ghcr.io/lowlighter/metrics:latest
    container_name: "${COMPOSE_PROJECT_NAME}-metrics"
    restart: unless-stopped
    entrypoint: [""]
    command: ["npm", "start"]
    networks:
      - ${COMPOSE_NETWORK_NAME}
    ports:
      - "${METRICS_PORT}:3000"
    volumes:
      - ./config/metrics/settings.json:/metrics/settings.json:ro
    environment:
      - PUID=${DEMO_UID}
      - PGID=${DEMO_GID}
    labels:
      homepage.group: "Monitoring"
      homepage.name: "Metrics"
      homepage.icon: "github"
      homepage.href: "http://localhost:${METRICS_PORT}"
      homepage.description: "GitHub metrics visualization"
    deploy:
      resources:
        limits:
          memory: 256M
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:3000"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: ${HEALTH_CHECK_RETRIES}
      start_period: 30s

  # Kiwix - Offline Wiki
  kiwix:
    image: ghcr.io/kiwix/kiwix-serve:latest
    container_name: "${COMPOSE_PROJECT_NAME}-kiwix"
    restart: unless-stopped
    networks:
      - ${COMPOSE_NETWORK_NAME}
    ports:
      - "${KIWIX_PORT}:8080"
    volumes:
      - ${COMPOSE_PROJECT_NAME}_kiwix_data:/data
    entrypoint: []
    command:
      - /bin/sh
      - -c
      - |
        if ! ls /data/*.zim 1>/dev/null 2>&1; then
          echo 'No ZIM files found. Downloading sample ZIM...';
          wget -q -O /data/demo.zim 'https://download.kiwix.org/zim/other/bleedingedge_climate-change_en.zim' || echo 'Download failed';
        fi
        if ls /data/*.zim 1>/dev/null 2>&1; then
          exec kiwix-serve /data/*.zim
        else
          echo 'No ZIM files available, sleeping indefinitely'
          exec sleep infinity
        fi
    environment:
      - PUID=${DEMO_UID}
      - PGID=${DEMO_GID}
    labels:
      homepage.group: "Documentation"
      homepage.name: "Kiwix"
      homepage.icon: "kiwix"
      homepage.href: "http://localhost:${KIWIX_PORT}"
      homepage.description: "Offline wiki reader"
    deploy:
      resources:
        limits:
          memory: 256M
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: 5
      start_period: 120s

  # Resume Matcher - AI Resume Screening
  resumematcher:
    image: ghcr.io/srbhr/resume-matcher:latest
    container_name: "${COMPOSE_PROJECT_NAME}-resumematcher"
    restart: unless-stopped
    networks:
      - ${COMPOSE_NETWORK_NAME}
    ports:
      - "${RESUME_MATCHER_PORT}:3000"
    volumes:
      - ${COMPOSE_PROJECT_NAME}_resumematcher_data:/app/backend/data
    environment:
      - PUID=${DEMO_UID}
      - PGID=${DEMO_GID}
    labels:
      homepage.group: "Productivity"
      homepage.name: "Resume Matcher"
      homepage.icon: "resume"
      homepage.href: "http://localhost:${RESUME_MATCHER_PORT}"
      homepage.description: "AI-powered resume screening"
    deploy:
      resources:
        limits:
          memory: 512M
    healthcheck:
      test: ["CMD", "curl", "-f", "--silent", "http://localhost:3000/api/v1/health"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: 5
      start_period: 60s

  # Apple Health - Health Data Collector
  applehealth:
    build:
      context: ./config/applehealth
      dockerfile: Dockerfile
    image: tsys-applehealth:latest
    container_name: "${COMPOSE_PROJECT_NAME}-applehealth"
    restart: unless-stopped
    networks:
      - ${COMPOSE_NETWORK_NAME}
    ports:
      - "${APPLEHEALTH_PORT}:5353"
    environment:
      - INFLUXDB_URL=http://influxdb:8086
      - INFLUXDB_TOKEN=${INFLUXDB_AUTH_TOKEN}
      - INFLUXDB_ORG=${INFLUXDB_ORG}
      - INFLUXDB_BUCKET=${INFLUXDB_BUCKET}
      - PUID=${DEMO_UID}
      - PGID=${DEMO_GID}
    depends_on:
      influxdb:
        condition: service_healthy
    labels:
      homepage.group: "Monitoring"
      homepage.name: "Apple Health"
      homepage.icon: "apple-health"
      homepage.href: "http://localhost:${APPLEHEALTH_PORT}"
      homepage.description: "Health data collection and visualization"
    deploy:
      resources:
        limits:
          memory: 256M
    healthcheck:
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:5353/health')"]
      interval: ${HEALTH_CHECK_INTERVAL}
      timeout: ${HEALTH_CHECK_TIMEOUT}
      retries: ${HEALTH_CHECK_RETRIES}
      start_period: 15s

View File

@@ -31,6 +31,20 @@ ensure_env() {
# Ensure new variables exist in older env files
grep -q '^MAILHOG_SMTP_PORT=' "$ENV_FILE" || echo "MAILHOG_SMTP_PORT=4019" >> "$ENV_FILE"
grep -q '^HOMEPAGE_ALLOWED_HOSTS=' "$ENV_FILE" || echo "HOMEPAGE_ALLOWED_HOSTS=*" >> "$ENV_FILE"
grep -q '^REACTIVE_RESUME_PORT=' "$ENV_FILE" || echo "REACTIVE_RESUME_PORT=4016" >> "$ENV_FILE"
grep -q '^RESUME_MINIO_PORT=' "$ENV_FILE" || echo "RESUME_MINIO_PORT=4020" >> "$ENV_FILE"
grep -q '^METRICS_PORT=' "$ENV_FILE" || echo "METRICS_PORT=4021" >> "$ENV_FILE"
grep -q '^KIWIX_PORT=' "$ENV_FILE" || echo "KIWIX_PORT=4022" >> "$ENV_FILE"
grep -q '^RESUME_MATCHER_PORT=' "$ENV_FILE" || echo "RESUME_MATCHER_PORT=4023" >> "$ENV_FILE"
grep -q '^APPLEHEALTH_PORT=' "$ENV_FILE" || echo "APPLEHEALTH_PORT=4024" >> "$ENV_FILE"
grep -q '^RESUME_POSTGRES_DB=' "$ENV_FILE" || echo "RESUME_POSTGRES_DB=reactiveresume" >> "$ENV_FILE"
grep -q '^RESUME_POSTGRES_USER=' "$ENV_FILE" || echo "RESUME_POSTGRES_USER=postgres" >> "$ENV_FILE"
grep -q '^RESUME_POSTGRES_PASSWORD=' "$ENV_FILE" || echo "RESUME_POSTGRES_PASSWORD=demo_password" >> "$ENV_FILE"
grep -q '^RESUME_MINIO_USER=' "$ENV_FILE" || echo "RESUME_MINIO_USER=minioadmin" >> "$ENV_FILE"
grep -q '^RESUME_MINIO_PASSWORD=' "$ENV_FILE" || echo "RESUME_MINIO_PASSWORD=minioadmin" >> "$ENV_FILE"
grep -q '^RESUME_ACCESS_TOKEN_SECRET=' "$ENV_FILE" || echo "RESUME_ACCESS_TOKEN_SECRET=access_token_secret_demo" >> "$ENV_FILE"
grep -q '^METRICS_GITHUB_TOKEN=' "$ENV_FILE" || echo "METRICS_GITHUB_TOKEN=" >> "$ENV_FILE"
grep -q '^APPLEHEALTH_INFLUXDB_BUCKET=' "$ENV_FILE" || echo "APPLEHEALTH_INFLUXDB_BUCKET=demo_metrics" >> "$ENV_FILE"
}
detect_user() {
@@ -122,10 +136,13 @@ display_summary() {
echo " Monitoring:"
echo " InfluxDB http://localhost:${INFLUXDB_PORT}"
echo " Grafana http://localhost:${GRAFANA_PORT}"
echo " Metrics http://localhost:${METRICS_PORT}"
echo " Apple Health http://localhost:${APPLEHEALTH_PORT}"
echo ""
echo " Documentation:"
echo " Draw.io http://localhost:${DRAWIO_PORT}"
echo " Kroki http://localhost:${KROKI_PORT}"
+echo " Kiwix http://localhost:${KIWIX_PORT}"
 echo ""
 echo " Developer Tools:"
 echo " Atomic Tracker http://localhost:${ATOMIC_TRACKER_PORT}"
@@ -136,6 +153,10 @@ display_summary() {
 echo " MailHog (SMTP) localhost:${MAILHOG_SMTP_PORT}"
 echo " Atuin http://localhost:${ATUIN_PORT}"
 echo ""
+echo " Productivity:"
+echo " Reactive Resume http://localhost:${REACTIVE_RESUME_PORT}"
+echo " Resume Matcher http://localhost:${RESUME_MATCHER_PORT}"
+echo ""
 echo " Credentials: admin / demo_password"
 echo " FOR DEMONSTRATION PURPOSES ONLY"
 echo "========================================================"
@@ -158,6 +179,11 @@ smoke_test() {
 "${WAKAPI_PORT}:Wakapi"
 "${MAILHOG_PORT}:MailHog"
 "${ATUIN_PORT}:Atuin"
+"${REACTIVE_RESUME_PORT}:ReactiveResume"
+"${METRICS_PORT}:Metrics"
+"${KIWIX_PORT}:Kiwix"
+"${RESUME_MATCHER_PORT}:ResumeMatcher"
+"${APPLEHEALTH_PORT}:AppleHealth"
 )
 local pass=0 fail=0
 for pt in "${ports[@]}"; do
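Each smoke-test entry above packs port and label into one `"port:Name"` string; the loop body (which continues past the lines shown) can split them with bash parameter expansion. A standalone sketch of that split:

```shell
# Split a "port:Name" pair the way a smoke-test loop body would:
# ${pt%%:*} keeps everything before the first ":", ${pt#*:} everything after it.
pt="4024:AppleHealth"
port="${pt%%:*}"
name="${pt#*:}"
echo "checking ${name} on port ${port}"
```

This avoids a subshell per entry (no `cut`/`awk`), which matters when the list grows from 16 to 24 services.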


@@ -110,6 +110,11 @@ test_port_accessibility() {
 "$WAKAPI_PORT:Wakapi"
 "$MAILHOG_PORT:MailHog"
 "$ATUIN_PORT:Atuin"
+"$REACTIVE_RESUME_PORT:ReactiveResume"
+"$METRICS_PORT:Metrics"
+"$KIWIX_PORT:Kiwix"
+"$RESUME_MATCHER_PORT:ResumeMatcher"
+"$APPLEHEALTH_PORT:AppleHealth"
 )
 local failed=0
@@ -150,7 +155,7 @@ test_volume_permissions() {
 source "$DEMO_ENV_FILE"
 local vol_count
 vol_count=$(docker volume ls --filter "name=${COMPOSE_PROJECT_NAME}" -q 2>/dev/null | wc -l)
-if [[ $vol_count -ge 15 ]]; then
+if [[ $vol_count -ge 19 ]]; then
 log_success "$vol_count volumes created"
 else
 log_error "Only $vol_count volumes found"


@@ -30,7 +30,7 @@ validate_yaml_files() {
 )
 for yaml_file in "${yaml_files[@]}"; do
 if [[ -f "$DEMO_DIR/$yaml_file" ]]; then
-if docker run --rm -v "$DEMO_DIR:/data" cytopia/yamllint /data/"$yaml_file" 2>&1; then
+if docker run --rm -v "$DEMO_DIR:/data" cytopia/yamllint -c /data/.yamllint /data/"$yaml_file" 2>&1; then
 log_pass "YAML validation: $yaml_file"
 else
 log_fail "YAML validation: $yaml_file"
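The new `-c /data/.yamllint` flag only helps if the config actually relaxes the rule the generated compose template trips over — per the commit message, line length. A minimal `.yamllint` along those lines might look like this; the exact limit and level are assumptions, not taken from the repo:

```yaml
# .yamllint — relax line-length for the long generated compose template
extends: default
rules:
  line-length:
    max: 200
    level: warning
```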
@@ -83,6 +83,13 @@ validate_docker_images() {
 "ghcr.io/muety/wakapi:latest"
 "mailhog/mailhog:latest"
 "ghcr.io/atuinsh/atuin:v18.10.0"
+"amruthpillai/reactive-resume:latest"
+"postgres:16-alpine"
+"chrislusf/seaweedfs:latest"
+"quay.io/minio/mc:latest"
+"ghcr.io/lowlighter/metrics:latest"
+"ghcr.io/kiwix/kiwix-serve:latest"
+"ghcr.io/srbhr/resume-matcher:latest"
 )
 for image in "${images[@]}"; do
 if docker image inspect "$image" >/dev/null 2>&1; then
@@ -110,6 +117,11 @@ validate_port_availability() {
 "$WAKAPI_PORT"
 "$MAILHOG_PORT"
 "$ATUIN_PORT"
+"$REACTIVE_RESUME_PORT"
+"$METRICS_PORT"
+"$KIWIX_PORT"
+"$RESUME_MATCHER_PORT"
+"$APPLEHEALTH_PORT"
 )
 for port in "${ports[@]}"; do
 if [[ -n "$port" && "$port" != " " ]]; then
@@ -143,6 +155,10 @@ validate_environment() {
 "ATOMIC_TRACKER_PORT" "ARCHIVEBOX_PORT"
 "TUBE_ARCHIVIST_PORT" "WAKAPI_PORT"
 "MAILHOG_PORT" "MAILHOG_SMTP_PORT" "ATUIN_PORT"
+"REACTIVE_RESUME_PORT" "RESUME_MINIO_PORT"
+"METRICS_PORT" "KIWIX_PORT"
+"RESUME_MATCHER_PORT" "APPLEHEALTH_PORT"
+"RESUME_POSTGRES_PASSWORD"
 "TA_USERNAME" "TA_PASSWORD" "ELASTIC_PASSWORD"
 "GF_SECURITY_ADMIN_USER" "GF_SECURITY_ADMIN_PASSWORD"
 "PIHOLE_WEBPASSWORD"
@@ -177,6 +193,14 @@ validate_health_endpoints() {
 "atuin:8888:/healthz"
 "ta-redis:6379:redis-cli_ping"
 "ta-elasticsearch:9200:/_cluster/health"
+"reactiveresume-app:3000:/api/health"
+"reactiveresume-postgres:5432:pg_isready"
+"reactiveresume-minio:8888:/"
+"reactiveresume-createbucket:N/A:mc"
+"metrics:3000:/"
+"kiwix:8080:/"
+"resumematcher:3000:/api/v1/health"
+"applehealth:5353:/health"
 )
 for check in "${checks[@]}"; do
 local svc="${check%%:*}"
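`local svc="${check%%:*}"` peels the service name off a `"service:port:path"` triple; the remaining two fields can be peeled the same way. A standalone sketch (the variable names beyond `svc` are assumptions, not the script's):

```shell
# Split "service:port:path" with two rounds of parameter expansion.
check="applehealth:5353:/health"
svc="${check%%:*}"    # before the first ":"  -> applehealth
rest="${check#*:}"    # after the first ":"   -> 5353:/health
port="${rest%%:*}"    #                       -> 5353
path="${rest#*:}"     #                       -> /health
echo "$svc $port $path"
```

Note the `%%` (longest suffix) vs `#` (shortest prefix) pairing: it keeps a `:` inside the path field from corrupting the split of the first two fields.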
@@ -190,6 +214,8 @@ validate_dependencies() {
 log_pass "Dependency: Grafana -> InfluxDB"
 log_pass "Dependency: Dockhand -> Docker Socket"
 log_pass "Dependency: TubeArchivist -> Redis + Elasticsearch"
+log_pass "Dependency: ReactiveResume -> Postgres + SeaweedFS"
+log_pass "Dependency: AppleHealth -> InfluxDB"
 log_pass "Dependency: All other services -> Standalone"
 }


@@ -67,6 +67,31 @@ const services = [
 url: 'http://localhost:4018',
 contentCheck: 'version',
 },
+{
+name: 'Reactive Resume',
+url: 'http://localhost:4016',
+contentCheck: 'reactive',
+},
+{
+name: 'Metrics',
+url: 'http://localhost:4021',
+contentCheck: 'metrics',
+},
+{
+name: 'Kiwix',
+url: 'http://localhost:4022',
+contentCheck: 'kiwix',
+},
+{
+name: 'Resume Matcher',
+url: 'http://localhost:4023',
+contentCheck: 'resume',
+},
+{
+name: 'Apple Health',
+url: 'http://localhost:4024',
+contentCheck: 'apple-health-collector',
+},
 ];
 for (const svc of services) {
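Each entry above pairs a URL with a `contentCheck` marker that the response body must contain. The same check can be sketched in shell (the `body_has_marker` helper name is illustrative; the live-URL usage assumes the service is up):

```shell
# Succeed only if the body on stdin contains the marker, case-insensitively.
body_has_marker() {
  grep -qi -- "$1"
}

# Intended use against a running stack:
#   curl -fsS http://localhost:4022 | body_has_marker kiwix
echo "Welcome to the Kiwix reader" | body_has_marker kiwix && echo "marker found"
```

A content check like this catches the case where the port is bound but the wrong service (or an error page) is answering, which a bare port probe misses.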


@@ -54,6 +54,11 @@ test_complete_deployment() {
 "$WAKAPI_PORT"
 "$MAILHOG_PORT"
 "$ATUIN_PORT"
+"$REACTIVE_RESUME_PORT"
+"$METRICS_PORT"
+"$KIWIX_PORT"
+"$RESUME_MATCHER_PORT"
+"$APPLEHEALTH_PORT"
 )
 local failed_ports=0


@@ -89,10 +89,10 @@ test_network_isolation() {
 check "Services are on the correct network"
 local net_count
 net_count=$(docker network inspect "${COMPOSE_NETWORK_NAME}" --format '{{range .Containers}}{{.Name}} {{end}}' 2>/dev/null | wc -w || echo "0")
-if [[ "$net_count" -ge 14 ]]; then
+if [[ "$net_count" -ge 22 ]]; then
 pass "$net_count containers on ${COMPOSE_NETWORK_NAME}"
 else
-fail "Only $net_count containers on network (expected >= 14)"
+fail "Only $net_count containers on network (expected >= 22)"
 fi
 }


@@ -47,12 +47,13 @@ test_template_has_required_sections() {
 }
 test_template_has_all_services() {
-check "Template defines all 16 services"
+check "Template defines all 24 services"
 local services=(
 "docker-socket-proxy:" "homepage:" "pihole:" "dockhand:"
 "influxdb:" "grafana:" "drawio:" "kroki:" "atomictracker:"
 "archivebox:" "ta-redis:" "ta-elasticsearch:" "tubearchivist:"
 "wakapi:" "mailhog:" "atuin:"
+"reactiveresume-postgres:" "reactiveresume-minio:" "reactiveresume-createbucket:" "reactiveresume-app:" "metrics:" "kiwix:" "resumematcher:" "applehealth:"
 )
 local found=0
 for svc in "${services[@]}"; do
@@ -69,7 +70,7 @@ test_template_has_all_services() {
 test_all_services_have_healthchecks() {
 check "All exposed services have healthcheck blocks"
-local exposed_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin")
+local exposed_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin" "reactiveresume-app" "metrics" "kiwix" "resumematcher" "applehealth")
 local missing=()
 for svc in "${exposed_services[@]}"; do
 local svc_block
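The loop above pulls each service's block out of the compose template before grepping it for `healthcheck:`. One way to extract such a block, assuming two-space service indentation in the compose file (the `svc_block` helper here is a sketch, not the repo's implementation):

```shell
# Print one service's block from a compose file: start at "  name:",
# stop when the next two-space-indented key (i.e. the next service) begins.
svc_block() {
  local svc="$1" file="$2"
  awk -v svc="$svc" '
    $0 ~ "^  " svc ":$" { inblk = 1; print; next }
    inblk && /^  [^ ]/  { exit }
    inblk               { print }
  ' "$file"
}
```

Then `svc_block kiwix compose.yml | grep -q "healthcheck:"` answers the per-service question without false matches leaking in from neighbouring services.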
@@ -91,16 +92,16 @@ test_all_services_have_restart_policy() {
 check "All services have restart policy"
 local restart_count
 restart_count=$(grep -c "restart:" "$TEMPLATE_FILE" || true)
-if [[ $restart_count -ge 16 ]]; then
+if [[ $restart_count -ge 24 ]]; then
 pass "$restart_count services have restart policies"
 else
-fail "Only $restart_count services have restart policies (expected >= 16)"
+fail "Only $restart_count services have restart policies (expected >= 24)"
 fi
 }
 test_all_services_have_labels() {
 check "All user-facing services have Homepage labels"
-local label_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin")
+local label_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin" "reactiveresume-app" "metrics" "kiwix" "resumematcher" "applehealth")
 local missing=()
 for svc in "${label_services[@]}"; do
 local svc_block
@@ -163,6 +164,7 @@ test_env_template_completeness() {
 "TA_USERNAME" "TA_PASSWORD" "ELASTIC_PASSWORD"
 "GF_SECURITY_ADMIN_USER" "GF_SECURITY_ADMIN_PASSWORD"
 "PIHOLE_WEBPASSWORD"
+"REACTIVE_RESUME_PORT" "RESUME_MINIO_PORT" "METRICS_PORT" "KIWIX_PORT" "RESUME_MATCHER_PORT" "APPLEHEALTH_PORT" "RESUME_POSTGRES_PASSWORD"
 )
 for var in "${required_vars[@]}"; do
 if grep_exists "^${var}=" "$ENV_TEMPLATE"; then