feat(demo): migrate 5 SelfStack services to demo stack (16→24 services)
Add Reactive Resume, Metrics, Kiwix, Resume Matcher, and Apple Health from the
earlier SelfStack project. Rewrite the Apple Health collector to use InfluxDB v2
with proper error handling. Update all tests, scripts, Homepage config, env
template, and documentation for the expanded stack.

New services:
- Reactive Resume (4016) + Postgres/Minio/Chrome companions
- Metrics (4021) - GitHub metrics visualization
- Kiwix (4022) - offline wiki reader
- Resume Matcher (4023) - AI resume screening
- Apple Health (4024) - health data collector → InfluxDB v2

Also adds git policy to AGENTS.md: always commit and push automatically.

💘 Generated with Crush

Assisted-by: GLM-5.1 via Crush <crush@charm.land>
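The rewritten collector accepts JSON health exports on its `/collect` endpoint. A minimal sketch of a payload in the shape the collector parses — `data.metrics[].data[]` rows keyed by `date` — with illustrative values (the field names are inferred from the collector code below; the port is the demo mapping 4024):

```python
import json

# Hypothetical minimal export in the shape /collect parses. All values
# here are made up for illustration.
payload = {
    "data": {
        "metrics": [
            {
                "name": "heart_rate",
                "data": [
                    {"date": "2026-05-08T10:00:00Z", "qty": 62.0, "units": "bpm"}
                ],
            }
        ],
        "workouts": [],
    }
}

body = json.dumps(payload).encode()
# Then send it to the running stack, for example:
#   curl -X POST -H 'Content-Type: application/json' \
#        --data-binary @export.json http://localhost:4024/collect
```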
58
AGENTS.md
@@ -6,9 +6,15 @@ This repository contains a Docker Compose-based multi-service stack that provide
 
 ### Project Type
 - **Infrastructure as Code**: Docker Compose with shell orchestration
-- **Multi-Service Stack**: 16 services across 4 categories
+- **Multi-Service Stack**: 24 services across 5 categories
 - **Demo-First Architecture**: All configurations for demonstration purposes only
 
+### Git Policy
+- **ALWAYS commit** after every logical unit of work — never ask permission
+- **ALWAYS push** after every commit — never ask permission
+- **Commit early, commit often** — small focused commits are preferred over large ones
+- **Never ask** "should I commit?" or "should I push?" — just do it
+
 ### Directory Structure
 ```
 TSYSDevStack-SupportStack-LocalWorkstation/
@@ -43,6 +49,11 @@ TSYSDevStack-SupportStack-LocalWorkstation/
 │   │   ├── tubearchivist/    # Tube Archivist configuration
 │   │   ├── wakapi/           # Wakapi configuration
 │   │   ├── mailhog/          # MailHog configuration
+│   │   ├── applehealth/      # Apple Health configuration
+│   │   ├── metrics/          # Metrics configuration
+│   │   ├── reactiveresume/   # Reactive Resume configuration
+│   │   ├── kiwix/            # Kiwix configuration
+│   │   ├── resumematcher/    # Resume Matcher configuration
 │   │   └── atuin/            # Atuin configuration
 │   └── docs/                 # Additional documentation
 │       ├── service-guides/   # Service-specific guides
@@ -125,13 +136,16 @@ docker run --rm -v "$(pwd):/workdir" hadolint/hadolint <path-to-dockerfile>
    - Pi-hole (4006) - DNS management with ad blocking
    - Dockhand (4007) - Web-based container management
 
-2. **Monitoring & Observability** (ports 4008-4009)
+2. **Monitoring & Observability** (ports 4008-4009, 4021, 4024)
    - InfluxDB (4008) - Time series database for metrics
    - Grafana (4009) - Visualization platform
+   - Metrics (4021) - GitHub metrics visualization
+   - Apple Health (4024) - Health data collection
 
-3. **Documentation & Diagramming** (ports 4010-4011)
+3. **Documentation & Diagramming** (ports 4010-4011, 4022)
    - Draw.io (4010) - Web-based diagramming application
    - Kroki (4011) - Diagrams as a service
+   - Kiwix (4022) - Offline wiki reader
 
 4. **Developer Tools** (ports 4000, 4012-4018)
    - Homepage (4000) - Central dashboard for service discovery
@@ -142,7 +156,11 @@ docker run --rm -v "$(pwd):/workdir" hadolint/hadolint <path-to-dockerfile>
    - MailHog (4017 Web, 4019 SMTP) - Web and API based SMTP testing
    - Atuin (4018) - Magical shell history synchronization
 
-5. **Companion Services** (internal only, no host ports)
+5. **Productivity** (ports 4016, 4023)
+   - Reactive Resume (4016) - Resume builder
+   - Resume Matcher (4023) - AI resume screening
+
+6. **Companion Services** (internal only, no host ports)
    - ta-redis - Redis cache for Tube Archivist
    - ta-elasticsearch - Elasticsearch index for Tube Archivist
 
@@ -168,7 +186,7 @@ docker run --rm -v "$(pwd):/workdir" hadolint/hadolint <path-to-dockerfile>
 ### Docker Labels (Service Discovery)
 ```yaml
 labels:
-  homepage.group: "Infrastructure"        # Category
+  homepage.group: "Infrastructure"        # Category (Infrastructure|Monitoring|Documentation|Developer Tools|Productivity)
   homepage.name: "Service Display Name"   # Human-readable name
   homepage.icon: "icon-name"              # Icon identifier
   homepage.href: "http://localhost:PORT"  # Access URL
@@ -267,7 +285,7 @@ Before ANY file is created or modified:
 ### Service Discovery Mechanism
 - **Homepage Labels**: Services automatically discovered via Docker labels
 - **No Manual Config**: Don't manually add services to Homepage configuration
-- **Group-Based**: Services organized by group (Infrastructure, Monitoring, Documentation, Developer Tools)
+- **Group-Based**: Services organized by group (Infrastructure, Monitoring, Documentation, Developer Tools, Productivity)
 - **Real-Time**: Homepage updates automatically as services start/stop
 
 ### FOSS Only Policy
@@ -279,7 +297,7 @@ Before ANY file is created or modified:
 ## Project-Specific Context
 
 ### Current State
-- **Demo Environment**: Fully configured with 16 services
+- **Demo Environment**: Fully configured with 24 services
 - **Production Environment**: Placeholder only, not yet implemented
 - **Documentation**: Comprehensive (AGENTS.md, PRD.md, README.md)
 - **Scripts**: Complete orchestration and testing scripts available
@@ -344,10 +362,32 @@ WAKAPI_PORT=4015
 MAILHOG_PORT=4017
 MAILHOG_SMTP_PORT=4019
 ATUIN_PORT=4018
+REACTIVE_RESUME_PORT=4016
+RESUME_MINIO_PORT=4020
+METRICS_PORT=4021
+KIWIX_PORT=4022
+RESUME_MATCHER_PORT=4023
+APPLEHEALTH_PORT=4024
 
 # Demo Credentials (NOT FOR PRODUCTION)
 DEMO_ADMIN_USER=admin
 DEMO_ADMIN_PASSWORD=demo_password
+
+# Reactive Resume
+RESUME_POSTGRES_DB=reactive_resume
+RESUME_POSTGRES_USER=resume_user
+RESUME_POSTGRES_PASSWORD=demo_password
+RESUME_MINIO_USER=minio_user
+RESUME_MINIO_PASSWORD=demo_password
+RESUME_CHROME_TOKEN=demo_token
+RESUME_ACCESS_TOKEN_SECRET=demo_secret
+RESUME_REFRESH_TOKEN_SECRET=demo_secret
+
+# Metrics
+METRICS_GITHUB_TOKEN=
+
+# Apple Health
+APPLEHEALTH_INFLUXDB_BUCKET=applehealth
 ```
 
 ## Key Files Reference
@@ -390,5 +430,5 @@ DEMO_ADMIN_PASSWORD=demo_password
 
 ---
 
-**Last Updated**: 2025-01-24
-**Version**: 1.0
+**Last Updated**: 2026-05-08
+**Version**: 2.0
@@ -4,14 +4,15 @@ A Docker Compose-based multi-service stack of FOSS applications that run locally
 
 ## What It Does
 
-Deploys 16 services across 4 categories via a single command:
+Deploys 24 services across 6 categories via a single command:
 
 | Category | Services |
 |----------|----------|
 | **Infrastructure** | Homepage (dashboard), Pi-hole (DNS), Dockhand (Docker management), Docker Socket Proxy |
-| **Monitoring** | InfluxDB (time series), Grafana (visualization) |
-| **Documentation** | Draw.io (diagramming), Kroki (diagrams as code) |
+| **Monitoring** | InfluxDB (time series), Grafana (visualization), Metrics (GitHub metrics), Apple Health (health data) |
+| **Documentation** | Draw.io (diagramming), Kroki (diagrams as code), Kiwix (offline wiki) |
 | **Developer Tools** | Atomic Tracker, ArchiveBox, Tube Archivist, Wakapi, MailHog, Atuin |
+| **Productivity** | Reactive Resume (resume builder), Resume Matcher (AI resume screening) |
 
 ## Quick Start
 

@@ -76,9 +76,10 @@ Before ANY file is created or modified:
 
 ### Service Categories
 - **Infrastructure Services**: Core platform services
-- **Monitoring & Observability**: Metrics and visualization
+- **Monitoring & Observability**: Metrics, visualization, and health data
 - **Documentation & Diagramming**: Knowledge management
 - **Developer Tools**: Productivity enhancers
+- **Productivity**: Resume building and screening tools
 
 ### Design Patterns
 - **Service Discovery**: Automatic via Homepage dashboard

@@ -210,6 +210,8 @@ graph TD
 | **Container Management** (Dockhand) | Docker socket (direct mount) | 🔗 Required |
 | **Visualization Platform** (Grafana) | Time Series Database (InfluxDB) | 🔗 Required |
 | **Video Archiving** (Tube Archivist) | Redis (ta-redis) + Elasticsearch (ta-elasticsearch) | 🔗 Required |
+| **Resume Builder** (Reactive Resume) | Postgres + Minio + Chrome | 🔗 Required |
+| **Health Data** (Apple Health) | InfluxDB | 🔗 Required |
 | **All Other Services** | None | ✅ Standalone |
 
 ---

@@ -411,6 +413,6 @@ When reporting issues, please include:
 
 **🎉 Happy Developing!**
 
-*Last updated: 2025-11-13*
+*Last updated: 2026-05-08*
 
 </div>
15
demo/config/applehealth/Dockerfile
Normal file
@@ -0,0 +1,15 @@
+FROM python:3.12-slim
+
+WORKDIR /app
+
+COPY requirements.txt .
+RUN pip install --no-cache-dir -r requirements.txt
+
+COPY app.py .
+
+EXPOSE 5353
+
+HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
+    CMD python -c "import urllib.request; urllib.request.urlopen('http://localhost:5353/health')" || exit 1
+
+CMD ["python", "app.py"]
171
demo/config/applehealth/app.py
Normal file
@@ -0,0 +1,171 @@
+import json
+import os
+import sys
+import logging
+from flask import Flask, request, jsonify
+from influxdb_client import InfluxDBClient
+from influxdb_client.client.write_api import SYNCHRONOUS
+
+DATAPOINTS_CHUNK = 80000
+
+logging.basicConfig(
+    level=logging.INFO,
+    format="%(asctime)s - %(levelname)s - %(message)s",
+    handlers=[logging.StreamHandler(sys.stdout)],
+)
+logger = logging.getLogger(__name__)
+
+app = Flask(__name__)
+
+INFLUXDB_URL = os.environ.get("INFLUXDB_URL", "http://influxdb:8086")
+INFLUXDB_TOKEN = os.environ.get("INFLUXDB_TOKEN", "")
+INFLUXDB_ORG = os.environ.get("INFLUXDB_ORG", "tsysdemo")
+INFLUXDB_BUCKET = os.environ.get("INFLUXDB_BUCKET", "demo_metrics")
+
+_client = None
+_write_api = None
+
+
+def get_client():
+    global _client
+    if _client is None:
+        _client = InfluxDBClient(
+            url=INFLUXDB_URL, token=INFLUXDB_TOKEN, org=INFLUXDB_ORG
+        )
+    return _client
+
+
+def get_write_api():
+    global _write_api
+    if _write_api is None:
+        _write_api = get_client().write_api(write_options=SYNCHRONOUS)
+    return _write_api
+
+
+@app.route("/health", methods=["GET"])
+def health():
+    try:
+        ready = get_client().health_api().get_health()
+        influxdb_status = ready.status if hasattr(ready, "status") else "unknown"
+        return (
+            jsonify(
+                {
+                    "status": "healthy",
+                    "influxdb": influxdb_status,
+                    "version": getattr(ready, "version", "unknown"),
+                }
+            ),
+            200,
+        )
+    except Exception as exc:
+        return jsonify({"status": "degraded", "error": str(exc)}), 503
+
+
+@app.route("/", methods=["GET"])
+def index():
+    return jsonify(
+        {
+            "service": "apple-health-collector",
+            "endpoints": {
+                "health": "GET /health",
+                "collect": "POST /collect (JSON body)",
+            },
+            "influxdb": {
+                "url": INFLUXDB_URL,
+                "org": INFLUXDB_ORG,
+                "bucket": INFLUXDB_BUCKET,
+            },
+        }
+    )
+
+
+@app.route("/collect", methods=["POST"])
+def collect():
+    logger.info("Health data collection request received")
+
+    if not request.data:
+        return jsonify({"error": "No data provided"}), 400
+
+    try:
+        healthkit_data = json.loads(request.data)
+    except (json.JSONDecodeError, ValueError) as exc:
+        logger.error("Invalid JSON: %s", exc)
+        return jsonify({"error": "Invalid JSON", "detail": str(exc)}), 400
+
+    points_written = 0
+
+    try:
+        metrics = healthkit_data.get("data", {}).get("metrics", [])
+        for metric in metrics:
+            measurement = metric.get("name", "unknown")
+            for datapoint in metric.get("data", []):
+                timestamp = datapoint.get("date")
+                if not timestamp:
+                    continue
+
+                fields = {}
+                tags = {}
+                for key, value in datapoint.items():
+                    if key == "date":
+                        continue
+                    if isinstance(value, (int, float)):
+                        fields[key] = float(value)
+                    else:
+                        tags[key] = str(value)
+
+                if not fields:
+                    continue
+
+                record = {
+                    "measurement": measurement,
+                    "tags": tags,
+                    "fields": fields,
+                    "time": timestamp,
+                }
+                get_write_api().write(
+                    bucket=INFLUXDB_BUCKET, org=INFLUXDB_ORG, record=record
+                )
+                points_written += 1
+
+        workouts = healthkit_data.get("data", {}).get("workouts", [])
+        for workout in workouts:
+            workout_name = workout.get("name", "unknown")
+            workout_start = workout.get("start", "")
+            workout_end = workout.get("end", "")
+            workout_id = f"{workout_name}-{workout_start}-{workout_end}"
+
+            for gps_point in workout.get("route", []):
+                ts = gps_point.get("timestamp")
+                if not ts:
+                    continue
+
+                record = {
+                    "measurement": "workout_route",
+                    "tags": {
+                        "workout_id": workout_id,
+                        "workout_name": workout_name,
+                    },
+                    "fields": {
+                        "lat": float(gps_point.get("lat", 0)),
+                        "lng": float(gps_point.get("lon", 0)),
+                    },
+                    "time": ts,
+                }
+                get_write_api().write(
+                    bucket=INFLUXDB_BUCKET, org=INFLUXDB_ORG, record=record
+                )
+                points_written += 1
+
+        logger.info("Wrote %d data points", points_written)
+        return jsonify({"status": "success", "points_written": points_written}), 200
+
+    except Exception as exc:
+        logger.exception("Error processing health data")
+        return jsonify({"error": "Processing failed", "detail": str(exc)}), 500
+
+
+if __name__ == "__main__":
+    logger.info("Apple Health data collector starting")
+    logger.info("InfluxDB: %s", INFLUXDB_URL)
+    logger.info("Bucket: %s", INFLUXDB_BUCKET)
+    app.run(host="0.0.0.0", port=5353)
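The datapoint handling in `collect()` above can be restated as a pure function: numeric values become InfluxDB fields, everything else becomes tags, and the `date` key is reserved for the point timestamp. A standalone sketch of that split (the helper name is illustrative, not part of the service):

```python
# Mirrors the per-datapoint loop in /collect: split one HealthKit row into
# (fields, tags). Numeric values -> fields (coerced to float), other values
# -> tags (coerced to str), "date" is skipped because it becomes the
# point's timestamp.
def split_datapoint(datapoint: dict) -> tuple[dict, dict]:
    fields, tags = {}, {}
    for key, value in datapoint.items():
        if key == "date":
            continue
        if isinstance(value, (int, float)):
            fields[key] = float(value)
        else:
            tags[key] = str(value)
    return fields, tags
```

Points that end up with no numeric fields are skipped by the service (`if not fields: continue`), which this split makes easy to check before building a record.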
2
demo/config/applehealth/requirements.txt
Normal file
@@ -0,0 +1,2 @@
+flask
+influxdb-client
@@ -34,6 +34,16 @@
         username: admin
         password: demo_password
 
+  - Metrics:
+      href: http://localhost:4021
+      description: GitHub metrics visualization
+      icon: github.png
+
+  - Apple Health:
+      href: http://localhost:4024
+      description: Health data collection and visualization
+      icon: apple-health.png
+
 - Documentation:
   - Draw.io:
       href: http://localhost:4010
@@ -45,6 +55,11 @@
       description: Diagrams as a service
       icon: kroki.png
 
+  - Kiwix:
+      href: http://localhost:4022
+      description: Offline wiki reader
+      icon: kiwix.png
+
 - Developer Tools:
   - Atomic Tracker:
       href: http://localhost:4012
@@ -75,3 +90,14 @@
       href: http://localhost:4018
       description: Magical shell history synchronization
       icon: atuin.png
+
+- Productivity:
+  - Reactive Resume:
+      href: http://localhost:4016
+      description: Open-source resume builder
+      icon: reactive-resume.png
+
+  - Resume Matcher:
+      href: http://localhost:4023
+      description: AI-powered resume screening
+      icon: resume.png
@@ -19,6 +19,9 @@ layout:
   Developer Tools:
     style: row
     columns: 3
+  Productivity:
+    style: row
+    columns: 2
 
 providers:
   docker:
0
demo/config/kiwix/.gitkeep
Normal file
96
demo/config/metrics/settings.json
Normal file
@@ -0,0 +1,96 @@
+{
+  "token": "GITHUB_API_TOKEN_PLACEHOLDER",
+  "modes": ["embed", "insights"],
+  "restricted": [],
+  "maxusers": 0,
+  "cached": 3600000,
+  "ratelimiter": null,
+  "port": 3000,
+  "optimize": true,
+  "debug": false,
+  "debug.headless": false,
+  "mocked": false,
+  "repositories": 100,
+  "padding": ["0", "8 + 11%"],
+  "outputs": ["svg", "png", "json"],
+  "hosted": {
+    "by": "",
+    "link": ""
+  },
+  "oauth": {
+    "id": null,
+    "secret": null,
+    "url": "https://example.com"
+  },
+  "api": {
+    "rest": null,
+    "graphql": null
+  },
+  "control": {
+    "token": null
+  },
+  "community": {
+    "templates": []
+  },
+  "templates": {
+    "default": "classic",
+    "enabled": []
+  },
+  "extras": {
+    "default": false,
+    "features": false,
+    "logged": [
+      "metrics.api.github.overuse"
+    ]
+  },
+  "plugins.default": false,
+  "plugins": {
+    "isocalendar": { "enabled": false },
+    "languages": { "enabled": false },
+    "stargazers": { "worldmap.token": null, "enabled": false },
+    "lines": { "enabled": false },
+    "topics": { "enabled": false },
+    "stars": { "enabled": false },
+    "licenses": { "enabled": false },
+    "habits": { "enabled": false },
+    "contributors": { "enabled": false },
+    "followup": { "enabled": false },
+    "reactions": { "enabled": false },
+    "people": { "enabled": false },
+    "sponsorships": { "enabled": false },
+    "sponsors": { "enabled": false },
+    "repositories": { "enabled": false },
+    "discussions": { "enabled": false },
+    "starlists": { "enabled": false },
+    "calendar": { "enabled": false },
+    "achievements": { "enabled": false },
+    "notable": { "enabled": false },
+    "activity": { "enabled": false },
+    "traffic": { "enabled": false },
+    "code": { "enabled": false },
+    "gists": { "enabled": false },
+    "projects": { "enabled": false },
+    "introduction": { "enabled": false },
+    "skyline": { "enabled": false },
+    "support": { "enabled": false },
+    "pagespeed": { "token": "", "enabled": false },
+    "tweets": { "token": "", "enabled": false },
+    "stackoverflow": { "enabled": false },
+    "anilist": { "enabled": false },
+    "music": { "token": "", "enabled": false },
+    "posts": { "enabled": false },
+    "rss": { "enabled": false },
+    "wakatime": { "token": "", "enabled": false },
+    "leetcode": { "enabled": false },
+    "steam": { "token": "", "enabled": false },
+    "16personalities": { "enabled": false },
+    "chess": { "token": "", "enabled": false },
+    "crypto": { "enabled": false },
+    "fortune": { "enabled": false },
+    "nightscout": { "enabled": false },
+    "poopmap": { "token": "", "enabled": false },
+    "screenshot": { "enabled": false },
+    "splatoon": { "token": "", "statink.token": null, "enabled": false },
+    "stock": { "token": "", "enabled": false }
+  }
+}
0
demo/config/reactiveresume/.gitkeep
Normal file
0
demo/config/resumematcher/.gitkeep
Normal file
@@ -27,6 +27,12 @@ WAKAPI_PORT=4015
 MAILHOG_PORT=4017
 MAILHOG_SMTP_PORT=4019
 ATUIN_PORT=4018
+REACTIVE_RESUME_PORT=4016
+RESUME_MINIO_PORT=4020
+METRICS_PORT=4021
+KIWIX_PORT=4022
+RESUME_MATCHER_PORT=4023
+APPLEHEALTH_PORT=4024
 
 # Network Configuration
 NETWORK_SUBNET=192.168.3.0/24
@@ -84,3 +90,19 @@ WAKAPI_PASSWORD_SALT=demo_salt_replace_in_production
 # Atuin Configuration
 ATUIN_HOST=0.0.0.0
 ATUIN_OPEN_REGISTRATION=true
+
+# Reactive Resume Configuration
+RESUME_POSTGRES_DB=reactiveresume
+RESUME_POSTGRES_USER=postgres
+RESUME_POSTGRES_PASSWORD=demo_password
+RESUME_MINIO_USER=minioadmin
+RESUME_MINIO_PASSWORD=minioadmin
+RESUME_CHROME_TOKEN=chrome_token_demo
+RESUME_ACCESS_TOKEN_SECRET=access_token_secret_demo
+RESUME_REFRESH_TOKEN_SECRET=refresh_token_secret_demo
+
+# Metrics Configuration
+METRICS_GITHUB_TOKEN=GITHUB_API_TOKEN_PLACEHOLDER
+
+# Apple Health Configuration
+APPLEHEALTH_INFLUXDB_BUCKET=demo_metrics
@@ -43,6 +43,14 @@ volumes:
     driver: local
   ${COMPOSE_PROJECT_NAME}_atuin_data:
     driver: local
+  ${COMPOSE_PROJECT_NAME}_reactiveresume_postgres_data:
+    driver: local
+  ${COMPOSE_PROJECT_NAME}_reactiveresume_minio_data:
+    driver: local
+  ${COMPOSE_PROJECT_NAME}_kiwix_data:
+    driver: local
+  ${COMPOSE_PROJECT_NAME}_resumematcher_data:
+    driver: local
 
 services:
   # Docker Socket Proxy - Security Layer
@@ -585,3 +593,260 @@ services:
       timeout: ${HEALTH_CHECK_TIMEOUT}
       retries: 5
       start_period: 30s
+
+  # Reactive Resume - Postgres Database
+  reactiveresume-postgres:
+    image: postgres:16-alpine
+    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-postgres"
+    restart: unless-stopped
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    volumes:
+      - ${COMPOSE_PROJECT_NAME}_reactiveresume_postgres_data:/var/lib/postgresql/data
+    environment:
+      POSTGRES_DB: ${RESUME_POSTGRES_DB}
+      POSTGRES_USER: ${RESUME_POSTGRES_USER}
+      POSTGRES_PASSWORD: ${RESUME_POSTGRES_PASSWORD}
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U ${RESUME_POSTGRES_USER} -d ${RESUME_POSTGRES_DB}"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: 5
+
+  # Reactive Resume - Minio Storage
+  reactiveresume-minio:
+    image: minio/minio
+    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-minio"
+    restart: unless-stopped
+    command: server /data
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    ports:
+      - "${RESUME_MINIO_PORT}:9000"
+    volumes:
+      - ${COMPOSE_PROJECT_NAME}_reactiveresume_minio_data:/data
+    environment:
+      MINIO_ROOT_USER: ${RESUME_MINIO_USER}
+      MINIO_ROOT_PASSWORD: ${RESUME_MINIO_PASSWORD}
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+    healthcheck:
+      test: ["CMD", "curl", "-f", "--silent", "http://localhost:9000/minio/health/live"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: ${HEALTH_CHECK_RETRIES}
+
+  # Reactive Resume - Chrome (PDF Generation)
+  reactiveresume-chrome:
+    image: ghcr.io/browserless/chromium:latest
+    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-chrome"
+    restart: unless-stopped
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    environment:
+      TIMEOUT: 10000
+      CONCURRENT: 10
+      TOKEN: ${RESUME_CHROME_TOKEN}
+      EXIT_ON_HEALTH_FAILURE: true
+      PRE_REQUEST_HEALTH_CHECK: true
+    deploy:
+      resources:
+        limits:
+          memory: 512M
+    healthcheck:
+      test: ["CMD", "curl", "-f", "--silent", "http://localhost:3000/health"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: ${HEALTH_CHECK_RETRIES}
+      start_period: 30s
+
+  # Reactive Resume - Resume Builder
+  reactiveresume-app:
+    image: amruthpillai/reactive-resume:latest
+    container_name: "${COMPOSE_PROJECT_NAME}-reactiveresume-app"
+    restart: unless-stopped
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    ports:
+      - "${REACTIVE_RESUME_PORT}:3000"
+    depends_on:
+      reactiveresume-postgres:
+        condition: service_healthy
+      reactiveresume-minio:
+        condition: service_started
+      reactiveresume-chrome:
+        condition: service_started
+    environment:
+      PORT: 3000
+      NODE_ENV: production
+      PUBLIC_URL: http://localhost:${REACTIVE_RESUME_PORT}
+      STORAGE_URL: http://localhost:${RESUME_MINIO_PORT}/default
+      CHROME_TOKEN: ${RESUME_CHROME_TOKEN}
+      CHROME_URL: ws://reactiveresume-chrome:3000
+      DATABASE_URL: postgresql://${RESUME_POSTGRES_USER}:${RESUME_POSTGRES_PASSWORD}@reactiveresume-postgres:5432/${RESUME_POSTGRES_DB}
+      ACCESS_TOKEN_SECRET: ${RESUME_ACCESS_TOKEN_SECRET}
+      REFRESH_TOKEN_SECRET: ${RESUME_REFRESH_TOKEN_SECRET}
+      MAIL_FROM: noreply@localhost
+      STORAGE_ENDPOINT: reactiveresume-minio
+      STORAGE_PORT: 9000
+      STORAGE_REGION: us-east-1
+      STORAGE_BUCKET: default
+      STORAGE_ACCESS_KEY: ${RESUME_MINIO_USER}
+      STORAGE_SECRET_KEY: ${RESUME_MINIO_PASSWORD}
+      STORAGE_USE_SSL: "false"
+      STORAGE_SKIP_BUCKET_CHECK: "false"
+    labels:
+      homepage.group: "Productivity"
+      homepage.name: "Reactive Resume"
+      homepage.icon: "reactive-resume"
+      homepage.href: "http://localhost:${REACTIVE_RESUME_PORT}"
+      homepage.description: "Open-source resume builder"
+    deploy:
+      resources:
+        limits:
+          memory: 512M
+    healthcheck:
+      test: ["CMD", "curl", "-f", "--silent", "http://localhost:3000/api/health"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: 5
+      start_period: 30s
+
+  # Metrics - GitHub Metrics Visualization
+  metrics:
+    image: ghcr.io/lowlighter/metrics:latest
+    container_name: "${COMPOSE_PROJECT_NAME}-metrics"
+    restart: unless-stopped
+    entrypoint: [""]
+    command: ["npm", "start"]
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    ports:
+      - "${METRICS_PORT}:3000"
+    volumes:
+      - ./config/metrics/settings.json:/metrics/settings.json:ro
+    environment:
+      - PUID=${DEMO_UID}
+      - PGID=${DEMO_GID}
+    labels:
+      homepage.group: "Monitoring"
+      homepage.name: "Metrics"
+      homepage.icon: "github"
+      homepage.href: "http://localhost:${METRICS_PORT}"
+      homepage.description: "GitHub metrics visualization"
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:3000"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: ${HEALTH_CHECK_RETRIES}
+      start_period: 30s
+
+  # Kiwix - Offline Wiki
+  kiwix:
+    image: ghcr.io/kiwix/kiwix-serve:latest
+    container_name: "${COMPOSE_PROJECT_NAME}-kiwix"
+    restart: unless-stopped
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    ports:
+      - "${KIWIX_PORT}:8080"
+    volumes:
+      - ${COMPOSE_PROJECT_NAME}_kiwix_data:/data
+    environment:
+      - PUID=${DEMO_UID}
+      - PGID=${DEMO_GID}
+    labels:
+      homepage.group: "Documentation"
+      homepage.name: "Kiwix"
+      homepage.icon: "kiwix"
+      homepage.href: "http://localhost:${KIWIX_PORT}"
+      homepage.description: "Offline wiki reader"
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+    healthcheck:
+      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: ${HEALTH_CHECK_RETRIES}
+
+  # Resume Matcher - AI Resume Screening
+  resumematcher:
+    image: ghcr.io/srbhr/resume-matcher:latest
+    container_name: "${COMPOSE_PROJECT_NAME}-resumematcher"
+    restart: unless-stopped
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    ports:
+      - "${RESUME_MATCHER_PORT}:3000"
+    volumes:
+      - ${COMPOSE_PROJECT_NAME}_resumematcher_data:/app/backend/data
+    environment:
+      - PUID=${DEMO_UID}
+      - PGID=${DEMO_GID}
+    labels:
+      homepage.group: "Productivity"
+      homepage.name: "Resume Matcher"
+      homepage.icon: "resume"
+      homepage.href: "http://localhost:${RESUME_MATCHER_PORT}"
+      homepage.description: "AI-powered resume screening"
+    deploy:
+      resources:
+        limits:
+          memory: 512M
+    healthcheck:
+      test: ["CMD", "curl", "-f", "--silent", "http://localhost:3000/api/v1/health"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: 5
+      start_period: 60s
+
+  # Apple Health - Health Data Collector
+  applehealth:
+    build:
+      context: ./config/applehealth
+      dockerfile: Dockerfile
+    image: tsys-applehealth:latest
+    container_name: "${COMPOSE_PROJECT_NAME}-applehealth"
+    restart: unless-stopped
+    networks:
+      - ${COMPOSE_NETWORK_NAME}
+    ports:
+      - "${APPLEHEALTH_PORT}:5353"
+    environment:
+      - INFLUXDB_URL=http://influxdb:8086
+      - INFLUXDB_TOKEN=${INFLUXDB_AUTH_TOKEN}
+      - INFLUXDB_ORG=${INFLUXDB_ORG}
+      - INFLUXDB_BUCKET=${INFLUXDB_BUCKET}
+      - PUID=${DEMO_UID}
+      - PGID=${DEMO_GID}
+    depends_on:
+      influxdb:
+        condition: service_healthy
+    labels:
+      homepage.group: "Monitoring"
+      homepage.name: "Apple Health"
+      homepage.icon: "apple-health"
+      homepage.href: "http://localhost:${APPLEHEALTH_PORT}"
+      homepage.description: "Health data collection and visualization"
+    deploy:
+      resources:
+        limits:
+          memory: 256M
+    healthcheck:
+      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:5353/health')"]
+      interval: ${HEALTH_CHECK_INTERVAL}
+      timeout: ${HEALTH_CHECK_TIMEOUT}
+      retries: ${HEALTH_CHECK_RETRIES}
+      start_period: 15s
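The applehealth healthcheck above squeezes its probe into a one-line `python -c` call inside the container. Expanded into a readable sketch (the `probe` helper and the throwaway local server are illustrative, not part of the stack; the real check targets `http://localhost:5353/health`):

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


def probe(url: str, timeout: float = 5.0) -> bool:
    """Return True when `url` answers with a 2xx status, else False."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        return False


class _OkHandler(BaseHTTPRequestHandler):
    def do_GET(self):  # always report healthy, for illustration only
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep demo output quiet
        pass


# Spin up a throwaway local server on a random port and probe it.
server = HTTPServer(("127.0.0.1", 0), _OkHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
alive = probe(f"http://127.0.0.1:{server.server_address[1]}/health")
dead = probe("http://127.0.0.1:1/health", timeout=1.0)  # nothing listening
server.shutdown()
```

The one-liner in the compose file relies on `urlopen` raising on connection failure, which Docker turns into an unhealthy status via the non-zero exit code.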
@@ -31,6 +31,12 @@ ensure_env() {
|
||||
# Ensure new variables exist in older env files
|
||||
grep -q '^MAILHOG_SMTP_PORT=' "$ENV_FILE" || echo "MAILHOG_SMTP_PORT=4019" >> "$ENV_FILE"
|
||||
grep -q '^HOMEPAGE_ALLOWED_HOSTS=' "$ENV_FILE" || echo "HOMEPAGE_ALLOWED_HOSTS=*" >> "$ENV_FILE"
|
||||
grep -q '^REACTIVE_RESUME_PORT=' "$ENV_FILE" || echo "REACTIVE_RESUME_PORT=4016" >> "$ENV_FILE"
|
||||
grep -q '^RESUME_MINIO_PORT=' "$ENV_FILE" || echo "RESUME_MINIO_PORT=4020" >> "$ENV_FILE"
|
||||
grep -q '^METRICS_PORT=' "$ENV_FILE" || echo "METRICS_PORT=4021" >> "$ENV_FILE"
|
||||
grep -q '^KIWIX_PORT=' "$ENV_FILE" || echo "KIWIX_PORT=4022" >> "$ENV_FILE"
|
||||
grep -q '^RESUME_MATCHER_PORT=' "$ENV_FILE" || echo "RESUME_MATCHER_PORT=4023" >> "$ENV_FILE"
|
||||
grep -q '^APPLEHEALTH_PORT=' "$ENV_FILE" || echo "APPLEHEALTH_PORT=4024" >> "$ENV_FILE"
|
||||
}
|
||||
|
||||
detect_user() {
|
||||
@@ -122,10 +128,13 @@ display_summary() {
    echo "  Monitoring:"
    echo "    InfluxDB         http://localhost:${INFLUXDB_PORT}"
    echo "    Grafana          http://localhost:${GRAFANA_PORT}"
    echo "    Metrics          http://localhost:${METRICS_PORT}"
    echo "    Apple Health     http://localhost:${APPLEHEALTH_PORT}"
    echo ""
    echo "  Documentation:"
    echo "    Draw.io          http://localhost:${DRAWIO_PORT}"
    echo "    Kroki            http://localhost:${KROKI_PORT}"
    echo "    Kiwix            http://localhost:${KIWIX_PORT}"
    echo ""
    echo "  Developer Tools:"
    echo "    Atomic Tracker   http://localhost:${ATOMIC_TRACKER_PORT}"

@@ -136,6 +145,10 @@ display_summary() {
    echo "    MailHog (SMTP)   localhost:${MAILHOG_SMTP_PORT}"
    echo "    Atuin            http://localhost:${ATUIN_PORT}"
    echo ""
    echo "  Productivity:"
    echo "    Reactive Resume  http://localhost:${REACTIVE_RESUME_PORT}"
    echo "    Resume Matcher   http://localhost:${RESUME_MATCHER_PORT}"
    echo ""
    echo "  Credentials: admin / demo_password"
    echo "  FOR DEMONSTRATION PURPOSES ONLY"
    echo "========================================================"
@@ -158,6 +171,11 @@ smoke_test() {
        "${WAKAPI_PORT}:Wakapi"
        "${MAILHOG_PORT}:MailHog"
        "${ATUIN_PORT}:Atuin"
        "${REACTIVE_RESUME_PORT}:ReactiveResume"
        "${METRICS_PORT}:Metrics"
        "${KIWIX_PORT}:Kiwix"
        "${RESUME_MATCHER_PORT}:ResumeMatcher"
        "${APPLEHEALTH_PORT}:AppleHealth"
    )
    local pass=0 fail=0
    for pt in "${ports[@]}"; do
@@ -110,6 +110,11 @@ test_port_accessibility() {
        "$WAKAPI_PORT:Wakapi"
        "$MAILHOG_PORT:MailHog"
        "$ATUIN_PORT:Atuin"
        "$REACTIVE_RESUME_PORT:ReactiveResume"
        "$METRICS_PORT:Metrics"
        "$KIWIX_PORT:Kiwix"
        "$RESUME_MATCHER_PORT:ResumeMatcher"
        "$APPLEHEALTH_PORT:AppleHealth"
    )

    local failed=0

@@ -150,7 +155,7 @@ test_volume_permissions() {
    source "$DEMO_ENV_FILE"
    local vol_count
    vol_count=$(docker volume ls --filter "name=${COMPOSE_PROJECT_NAME}" -q 2>/dev/null | wc -l)
-   if [[ $vol_count -ge 15 ]]; then
+   if [[ $vol_count -ge 19 ]]; then
        log_success "$vol_count volumes created"
    else
        log_error "Only $vol_count volumes found"
@@ -83,6 +83,13 @@ validate_docker_images() {
        "ghcr.io/muety/wakapi:latest"
        "mailhog/mailhog:latest"
        "ghcr.io/atuinsh/atuin:v18.10.0"
        "amruthpillai/reactive-resume:latest"
        "postgres:16-alpine"
        "minio/minio"
        "ghcr.io/browserless/chromium:latest"
        "ghcr.io/lowlighter/metrics:latest"
        "ghcr.io/kiwix/kiwix-serve:latest"
        "ghcr.io/srbhr/resume-matcher:latest"
    )
    for image in "${images[@]}"; do
        if docker image inspect "$image" >/dev/null 2>&1; then
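`docker image inspect` exits non-zero when an image is not cached locally, which is what the loop above relies on; a standalone sketch of the check (any failure, including the docker CLI or daemon being unavailable, lands in the "missing" branch):

```shell
# Probe for a locally cached image; exit status drives the branch.
image="postgres:16-alpine"
if docker image inspect "$image" >/dev/null 2>&1; then
  present="yes"
else
  present="no"
fi
echo "image ${image} cached: ${present}"
```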
@@ -110,6 +117,11 @@ validate_port_availability() {
        "$WAKAPI_PORT"
        "$MAILHOG_PORT"
        "$ATUIN_PORT"
        "$REACTIVE_RESUME_PORT"
        "$METRICS_PORT"
        "$KIWIX_PORT"
        "$RESUME_MATCHER_PORT"
        "$APPLEHEALTH_PORT"
    )
    for port in "${ports[@]}"; do
        if [[ -n "$port" && "$port" != " " ]]; then
@@ -143,6 +155,10 @@ validate_environment() {
        "ATOMIC_TRACKER_PORT" "ARCHIVEBOX_PORT"
        "TUBE_ARCHIVIST_PORT" "WAKAPI_PORT"
        "MAILHOG_PORT" "MAILHOG_SMTP_PORT" "ATUIN_PORT"
        "REACTIVE_RESUME_PORT" "RESUME_MINIO_PORT"
        "METRICS_PORT" "KIWIX_PORT"
        "RESUME_MATCHER_PORT" "APPLEHEALTH_PORT"
        "RESUME_POSTGRES_PASSWORD"
        "TA_USERNAME" "TA_PASSWORD" "ELASTIC_PASSWORD"
        "GF_SECURITY_ADMIN_USER" "GF_SECURITY_ADMIN_PASSWORD"
        "PIHOLE_WEBPASSWORD"
@@ -177,6 +193,14 @@ validate_health_endpoints() {
        "atuin:8888:/healthz"
        "ta-redis:6379:redis-cli_ping"
        "ta-elasticsearch:9200:/_cluster/health"
        "reactiveresume-app:3000:/api/health"
        "reactiveresume-postgres:5432:pg_isready"
        "reactiveresume-minio:9000:/minio/health/live"
        "reactiveresume-chrome:3000:/health"
        "metrics:3000:/"
        "kiwix:8080:/"
        "resumematcher:3000:/api/v1/health"
        "applehealth:5353:/health"
    )
    for check in "${checks[@]}"; do
        local svc="${check%%:*}"
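Each entry in `checks` packs three fields into one colon-delimited string; the hunk shows only the first extraction. A sketch of the full split using shell parameter expansion (the `rest`/`port`/`endpoint` names are illustrative, not from the script):

```shell
# Split "service:port:path" into its three fields with parameter expansion.
check="applehealth:5353:/health"
svc="${check%%:*}"        # drop everything from the first ':' -> "applehealth"
rest="${check#*:}"        # drop through the first ':'         -> "5353:/health"
port="${rest%%:*}"        #                                    -> "5353"
endpoint="${rest#*:}"     #                                    -> "/health"
echo "$svc $port $endpoint"
```

No external commands are needed, which matters inside a loop that runs once per service.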
@@ -190,6 +214,8 @@ validate_dependencies() {
    log_pass "Dependency: Grafana -> InfluxDB"
    log_pass "Dependency: Dockhand -> Docker Socket"
    log_pass "Dependency: TubeArchivist -> Redis + Elasticsearch"
    log_pass "Dependency: ReactiveResume -> Postgres + Minio + Chrome"
    log_pass "Dependency: AppleHealth -> InfluxDB"
    log_pass "Dependency: All other services -> Standalone"
}

@@ -67,6 +67,31 @@ const services = [
    url: 'http://localhost:4018',
    contentCheck: 'version',
  },
  {
    name: 'Reactive Resume',
    url: 'http://localhost:4016',
    contentCheck: 'reactive',
  },
  {
    name: 'Metrics',
    url: 'http://localhost:4021',
    contentCheck: 'metrics',
  },
  {
    name: 'Kiwix',
    url: 'http://localhost:4022',
    contentCheck: 'kiwix',
  },
  {
    name: 'Resume Matcher',
    url: 'http://localhost:4023',
    contentCheck: 'resume',
  },
  {
    name: 'Apple Health',
    url: 'http://localhost:4024',
    contentCheck: 'apple-health-collector',
  },
];

for (const svc of services) {
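The Node array above drives a fetch-and-grep style check: request each service's URL and look for an expected substring. The same idea in shell, assuming `curl` is available (a sketch, not part of this test suite):

```shell
# Fetch a page and look for an expected substring, case-insensitively.
# Any failure (service down, curl missing) lands in the "fail" branch.
url="http://localhost:4022"; expect="kiwix"
if curl -fsS --max-time 5 "$url" 2>/dev/null | grep -qi "$expect"; then
  result="pass"
else
  result="fail"
fi
echo "content check ($expect): $result"
```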
@@ -54,6 +54,11 @@ test_complete_deployment() {
        "$WAKAPI_PORT"
        "$MAILHOG_PORT"
        "$ATUIN_PORT"
        "$REACTIVE_RESUME_PORT"
        "$METRICS_PORT"
        "$KIWIX_PORT"
        "$RESUME_MATCHER_PORT"
        "$APPLEHEALTH_PORT"
    )

    local failed_ports=0
@@ -89,10 +89,10 @@ test_network_isolation() {
    check "Services are on the correct network"
    local net_count
    net_count=$(docker network inspect "${COMPOSE_NETWORK_NAME}" --format '{{range .Containers}}{{.Name}} {{end}}' 2>/dev/null | wc -w || echo "0")
-   if [[ "$net_count" -ge 14 ]]; then
+   if [[ "$net_count" -ge 22 ]]; then
        pass "$net_count containers on ${COMPOSE_NETWORK_NAME}"
    else
-       fail "Only $net_count containers on network (expected >= 14)"
+       fail "Only $net_count containers on network (expected >= 22)"
    fi
}

@@ -47,12 +47,13 @@ test_template_has_required_sections() {
}

test_template_has_all_services() {
-   check "Template defines all 16 services"
+   check "Template defines all 24 services"
    local services=(
        "docker-socket-proxy:" "homepage:" "pihole:" "dockhand:"
        "influxdb:" "grafana:" "drawio:" "kroki:" "atomictracker:"
        "archivebox:" "ta-redis:" "ta-elasticsearch:" "tubearchivist:"
        "wakapi:" "mailhog:" "atuin:"
        "reactiveresume-postgres:" "reactiveresume-minio:" "reactiveresume-chrome:" "reactiveresume-app:" "metrics:" "kiwix:" "resumematcher:" "applehealth:"
    )
    local found=0
    for svc in "${services[@]}"; do
@@ -69,7 +70,7 @@ test_template_has_all_services() {

test_all_services_have_healthchecks() {
    check "All exposed services have healthcheck blocks"
-   local exposed_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin")
+   local exposed_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin" "reactiveresume-app" "metrics" "kiwix" "resumematcher" "applehealth")
    local missing=()
    for svc in "${exposed_services[@]}"; do
        local svc_block
@@ -91,16 +92,16 @@ test_all_services_have_restart_policy() {
    check "All services have restart policy"
    local restart_count
    restart_count=$(grep -c "restart:" "$TEMPLATE_FILE" || true)
-   if [[ $restart_count -ge 16 ]]; then
+   if [[ $restart_count -ge 24 ]]; then
        pass "$restart_count services have restart policies"
    else
-       fail "Only $restart_count services have restart policies (expected >= 16)"
+       fail "Only $restart_count services have restart policies (expected >= 24)"
    fi
}

test_all_services_have_labels() {
    check "All user-facing services have Homepage labels"
-   local label_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin")
+   local label_services=("homepage" "pihole" "dockhand" "influxdb" "grafana" "drawio" "kroki" "atomictracker" "archivebox" "tubearchivist" "wakapi" "mailhog" "atuin" "reactiveresume-app" "metrics" "kiwix" "resumematcher" "applehealth")
    local missing=()
    for svc in "${label_services[@]}"; do
        local svc_block
@@ -163,6 +164,7 @@ test_env_template_completeness() {
        "TA_USERNAME" "TA_PASSWORD" "ELASTIC_PASSWORD"
        "GF_SECURITY_ADMIN_USER" "GF_SECURITY_ADMIN_PASSWORD"
        "PIHOLE_WEBPASSWORD"
        "REACTIVE_RESUME_PORT" "RESUME_MINIO_PORT" "METRICS_PORT" "KIWIX_PORT" "RESUME_MATCHER_PORT" "APPLEHEALTH_PORT" "RESUME_POSTGRES_PASSWORD"
    )
    for var in "${required_vars[@]}"; do
        if grep_exists "^${var}=" "$ENV_TEMPLATE"; then