Commit: actually run the 2fa script.

CODE-REVIEW-FINDINGS.md (new file, 279 lines)
@@ -0,0 +1,279 @@
# TSYS FetchApply Code Review Findings

**Review Date:** July 14, 2025
**Reviewer:** Claude (Anthropic)
**Repository:** TSYS Group Infrastructure Provisioning Scripts

## Executive Summary

The repository has a sound architecture built around centralized framework components, but several performance, security, and maintainability issues need attention. The codebase is functional; it requires optimization to be reliable in production.

## Critical Issues (High Priority)

### 1. Package Installation Performance ⚠️

**Location:** `ProjectCode/SetupNewSystem.sh:27` and lines 117-183

**Issue:** Multiple separate package installation commands cause performance bottlenecks.

```bash
# Current inefficient pattern
apt-get -y install git sudo dmidecode curl
# ... later in script ...
DEBIAN_FRONTEND="noninteractive" apt-get -qq --yes install virt-what auditd ...
```

**Impact:** Significantly slower deployment; repeated package cache updates

**Fix:** Combine all package installations into a single command

### 2. Network Operations Lack Error Handling 🔴

**Location:** `ProjectCode/SetupNewSystem.sh:61-63`, multiple modules

**Issue:** curl commands run without timeouts or error handling.

```bash
# Vulnerable pattern
curl --silent ${DL_ROOT}/path/file >/etc/config
```

**Impact:** Deployment failures in poor network conditions

**Fix:** Add timeout, error handling, and retry logic
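
The fix need not be elaborate: curl already ships timeout and retry flags. A minimal sketch follows, using a local `file://` URL and `/tmp` paths so it runs without network access; the real calls would target `${DL_ROOT}` paths instead.

```shell
# Sketch: timeout + retry flags for any download in the scripts.
# file:// source used here so the example runs offline.
src="/tmp/fetchapply-retry-demo-src"
dest="/tmp/fetchapply-retry-demo-dest"
printf 'config-data\n' > "$src"

curl --silent --show-error --fail \
     --connect-timeout 30 --max-time 60 \
     --retry 3 --retry-delay 5 \
     "file://$src" > "$dest"

echo "fetched: $(cat "$dest")"
```

Note that `--retry` only retries errors curl considers transient, so a wrapper with explicit retry logic (as shown later in this document) is still worthwhile for full control.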

### 3. Unquoted Variable Expansions 🔴

**Location:** Multiple files, including `ProjectCode/SetupNewSystem.sh:244`

**Issue:** Variables are expanded without quoting, creating security risks.

```bash
# Risky pattern
chsh -s $(which zsh) root
```

**Impact:** Potential command injection, script failures

**Fix:** Quote all variable expansions consistently
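
A quick illustration of the failure mode, using a hypothetical path (not one from the repository):

```shell
# Hypothetical illustration: an unquoted expansion word-splits on spaces.
set -u
target="/tmp/fetchapply demo dir"   # deliberately contains spaces

mkdir -p "$target"                  # quoted: creates the one intended directory
[ -d "$target" ] && echo "quoted expansion: OK"

# The unquoted form `mkdir -p $target` would instead try to create
# /tmp/fetchapply, ./demo, and ./dir -- three wrong paths.
rmdir "$target"
```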

## Security Concerns

### 4. No Download Integrity Verification 🔴

**Issue:** No remote download is verified against a checksum.

**Impact:** Supply chain attack vulnerability

**Recommendation:** Implement SHA256 checksum validation
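
A minimal sketch of what that validation looks like, using a local stand-in file and its known hash rather than a real project artifact:

```shell
# Sketch: refuse to use a fetched file unless its SHA256 matches.
# File content and hash are a local stand-in, not a project value.
file="/tmp/fetchapply-checksum-demo"
printf 'hello\n' > "$file"

expected="5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"
actual="$(sha256sum "$file" | cut -d' ' -f1)"

if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH: refusing to install $file" >&2
    rm -f "$file"
fi
```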

### 5. Excessive Root Privilege Usage ⚠️

**Issue:** All operations run as root with no privilege separation.

**Impact:** Unnecessary security exposure

**Recommendation:** Delegate non-privileged operations to an unprivileged user where possible

## Performance Optimization Opportunities

### 6. Individual File Downloads 🟡

**Location:** `ProjectCode/Modules/Security/secharden-scap-stig.sh:66-77`

**Issue:** 12+ individual curl commands fetch config files one at a time.

```bash
curl --silent ${DL_ROOT}/path1 > /etc/file1
curl --silent ${DL_ROOT}/path2 > /etc/file2
# ... repeated 12+ times
```

**Impact:** Network overhead, slower deployment

**Fix:** Batch download operations

### 7. Missing Connection Pooling ⚠️

**Issue:** Connections are not reused across multiple downloads from the same host.

**Impact:** Unnecessary connection overhead

**Fix:** Use curl with connection reuse or wget with keep-alive
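
One low-effort way to get reuse with curl: pass several URL/`-o` pairs to a single invocation, which transfers them over one connection. Sketched here with `file://` URLs and `/tmp` paths so it runs offline; the real calls would use `${DL_ROOT}` paths and `/etc` destinations.

```shell
# One curl process fetches both files, reusing the connection between them.
printf 'one\n' > /tmp/fetchapply-reuse-a
printf 'two\n' > /tmp/fetchapply-reuse-b

curl --silent \
     "file:///tmp/fetchapply-reuse-a" -o /tmp/fetchapply-reuse-a.out \
     "file:///tmp/fetchapply-reuse-b" -o /tmp/fetchapply-reuse-b.out

cat /tmp/fetchapply-reuse-a.out /tmp/fetchapply-reuse-b.out
```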

## Code Quality Issues

### 8. Inconsistent Framework Usage 🟡

**Issue:** Not all modules use the established error handling framework.

**Impact:** Inconsistent error reporting, harder debugging

**Fix:** Standardize framework usage across all modules

### 9. Incomplete Function Implementations 🟡

**Location:** `Framework-Includes/LookupKv.sh`

**Issue:** Stubbed functions with no implementation

**Impact:** Technical debt, confusion

**Fix:** Implement or remove unused functions

### 10. Missing Input Validation 🟡

**Location:** `Project-Includes/pi-detect.sh`

**Issue:** Functions lack input validation and proper quoting.

**Impact:** Potential script failures

**Fix:** Add comprehensive input validation
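
Since `pi-detect.sh`'s internals are not reproduced in this review, the following is a hypothetical shape for such validation, not the module's actual code:

```shell
# Hypothetical validated detector: checks its input before using it.
function detect_pi_model() {
    local model_file="${1:-/proc/device-tree/model}"

    # Validate: argument must be non-empty and readable before any use
    if [[ -z "$model_file" || ! -r "$model_file" ]]; then
        echo "unknown"
        return 0
    fi

    if grep -qi "raspberry pi" "$model_file"; then
        echo "raspberry-pi"
    else
        echo "not-a-pi"
    fi
}

detect_pi_model "/nonexistent/path"   # prints "unknown" instead of crashing
```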

## Recommended Immediate Actions

### Phase 1: Critical Fixes (Week 1)

1. **Fix variable quoting** throughout the codebase
2. **Add error handling** to all network operations
3. **Combine package installations** for performance
4. **Implement download integrity verification**

### Phase 2: Performance Optimization (Week 2)

1. **Batch file download operations**
2. **Add connection timeouts and retries**
3. **Implement bulk configuration deployment**
4. **Optimize service restart procedures**

### Phase 3: Code Quality (Weeks 3-4)

1. **Standardize framework usage**
2. **Add comprehensive input validation**
3. **Implement proper logging with timestamps**
4. **Remove or complete stubbed functions**

## Specific Code Improvements

### Enhanced Error Handling Pattern

```bash
function safe_download() {
    local url="$1"
    local dest="$2"
    local max_attempts=3
    local attempt=1

    while [[ $attempt -le $max_attempts ]]; do
        if curl --silent --connect-timeout 30 --max-time 60 --fail "$url" > "$dest"; then
            print_success "Downloaded: $(basename "$dest")"
            return 0
        else
            print_warning "Download attempt $attempt failed: $url"
            ((attempt++))
            sleep 5
        fi
    done

    print_error "Failed to download after $max_attempts attempts: $url"
    return 1
}
```

### Bulk Package Installation Pattern

```bash
function install_all_packages() {
    print_info "Installing all required packages..."

    local packages=(
        # Core system packages
        git sudo dmidecode curl wget

        # Security packages
        auditd fail2ban aide

        # Monitoring packages
        snmpd snmp-mibs-downloader

        # Additional packages
        virt-what net-tools htop
    )

    if DEBIAN_FRONTEND="noninteractive" apt-get -qq --yes -o Dpkg::Options::="--force-confold" install "${packages[@]}"; then
        print_success "All packages installed successfully"
    else
        print_error "Package installation failed"
        return 1
    fi
}
```

### Batch Configuration Download

```bash
function download_configurations() {
    print_info "Downloading configuration files..."

    local -A configs=(
        ["${DL_ROOT}/ProjectCode/ConfigFiles/ZSH/tsys-zshrc"]="/etc/zshrc"
        ["${DL_ROOT}/ProjectCode/ConfigFiles/SMTP/aliases"]="/etc/aliases"
        ["${DL_ROOT}/ProjectCode/ConfigFiles/Syslog/rsyslog.conf"]="/etc/rsyslog.conf"
    )

    for url in "${!configs[@]}"; do
        local dest="${configs[$url]}"
        if ! safe_download "$url" "$dest"; then
            return 1
        fi
    done

    print_success "All configurations downloaded"
}
```

## Testing Recommendations

### Add Performance Tests

```bash
function test_package_installation_performance() {
    local start_time=$(date +%s)
    install_all_packages
    local end_time=$(date +%s)
    local duration=$((end_time - start_time))

    echo "✅ Package installation completed in ${duration}s"

    if [[ $duration -gt 300 ]]; then
        echo "⚠️ Installation took longer than expected (>5 minutes)"
    fi
}
```

### Add Network Resilience Tests

```bash
function test_network_error_handling() {
    # Test with invalid URL
    if safe_download "https://invalid.example.com/file" "/tmp/test"; then
        echo "❌ Error handling test failed - should have failed"
        return 1
    else
        echo "✅ Error handling test passed"
        return 0
    fi
}
```

## Monitoring and Metrics

### Deployment Performance Metrics

- **Package installation time:** should complete in <5 minutes
- **Configuration download time:** should complete in <2 minutes
- **Service restart time:** should complete in <30 seconds
- **Total deployment time:** should complete in <15 minutes

### Error Rate Monitoring

- **Network operation failures:** should be <1%
- **Package installation failures:** should be <0.1%
- **Service restart failures:** should be <0.1%

## Compliance Assessment

### Development Guidelines Adherence

✅ **Good:** Single package commands in newer modules
✅ **Good:** Framework integration patterns
✅ **Good:** Function documentation in recent code

❌ **Needs Work:** Variable quoting consistency
❌ **Needs Work:** Error handling standardization
❌ **Needs Work:** Input validation coverage

## Risk Assessment

**Current Risk Level:** Medium

**Key Risks:**

1. **Deployment failures** due to network issues
2. **Security vulnerabilities** from unvalidated downloads
3. **Performance issues** in production deployments
4. **Maintenance challenges** from code inconsistencies

**Mitigation Priority:**

1. Network error handling (High)
2. Download integrity verification (High)
3. Performance optimization (Medium)
4. Code standardization (Medium)

## Conclusion

The TSYS FetchApply repository has a solid foundation but needs systematic improvements to meet production reliability standards. The recommended fixes will significantly enhance:

- **Deployment reliability** through better error handling
- **Security posture** through integrity verification
- **Performance** through optimized operations
- **Maintainability** through code standardization

Implementing these improvements in the suggested phases will produce a robust, production-ready infrastructure provisioning system.

---

**Next Steps:**

1. Review and prioritize findings with the development team
2. Create an implementation plan for the critical fixes
3. Establish testing procedures for the improvements
4. Set up monitoring for deployment metrics

Framework-Includes/SafeDownload.sh (new executable file, 273 lines)
@@ -0,0 +1,273 @@

#!/bin/bash

# Safe Download Framework
# Provides robust download operations with error handling, retries, and validation
# Author: TSYS Development Team

set -euo pipefail

# Source framework dependencies
source "$(dirname "${BASH_SOURCE[0]}")/PrettyPrint.sh" 2>/dev/null || echo "Warning: PrettyPrint.sh not found"
source "$(dirname "${BASH_SOURCE[0]}")/Logging.sh" 2>/dev/null || echo "Warning: Logging.sh not found"

# Download configuration
declare -g DOWNLOAD_TIMEOUT=60
declare -g DOWNLOAD_CONNECT_TIMEOUT=30
declare -g DOWNLOAD_MAX_ATTEMPTS=3
declare -g DOWNLOAD_RETRY_DELAY=5

# Safe download with retry logic and error handling
function safe_download() {
    local url="$1"
    local dest="$2"
    local expected_checksum="${3:-}"
    local max_attempts="${4:-$DOWNLOAD_MAX_ATTEMPTS}"

    local attempt=1
    local temp_file="${dest}.tmp.$$"

    # Validate inputs
    if [[ -z "$url" || -z "$dest" ]]; then
        print_error "safe_download: URL and destination are required"
        return 1
    fi

    # Create destination directory if needed
    local dest_dir
    dest_dir="$(dirname "$dest")"
    if [[ ! -d "$dest_dir" ]]; then
        if ! mkdir -p "$dest_dir"; then
            print_error "Failed to create directory: $dest_dir"
            return 1
        fi
    fi

    print_info "Downloading: $(basename "$dest") from $url"

    while [[ $attempt -le $max_attempts ]]; do
        if curl --silent --show-error --fail \
            --connect-timeout "$DOWNLOAD_CONNECT_TIMEOUT" \
            --max-time "$DOWNLOAD_TIMEOUT" \
            --location \
            --user-agent "TSYS-FetchApply/1.0" \
            "$url" > "$temp_file"; then

            # Verify checksum if provided
            if [[ -n "$expected_checksum" ]]; then
                if verify_checksum "$temp_file" "$expected_checksum"; then
                    mv "$temp_file" "$dest"
                    print_success "Downloaded and verified: $(basename "$dest")"
                    return 0
                else
                    print_error "Checksum verification failed for: $(basename "$dest")"
                    rm -f "$temp_file"
                    return 1
                fi
            else
                mv "$temp_file" "$dest"
                print_success "Downloaded: $(basename "$dest")"
                return 0
            fi
        else
            print_warning "Download attempt $attempt failed: $(basename "$dest")"
            rm -f "$temp_file"

            if [[ $attempt -lt $max_attempts ]]; then
                print_info "Retrying in ${DOWNLOAD_RETRY_DELAY}s..."
                sleep "$DOWNLOAD_RETRY_DELAY"
            fi

            ((attempt++))
        fi
    done

    print_error "Failed to download after $max_attempts attempts: $(basename "$dest")"
    return 1
}

# Verify file checksum
function verify_checksum() {
    local file="$1"
    local expected_checksum="$2"

    if [[ ! -f "$file" ]]; then
        print_error "File not found for checksum verification: $file"
        return 1
    fi

    local actual_checksum
    actual_checksum=$(sha256sum "$file" | cut -d' ' -f1)

    if [[ "$actual_checksum" == "$expected_checksum" ]]; then
        print_success "Checksum verified: $(basename "$file")"
        return 0
    else
        print_error "Checksum mismatch for $(basename "$file")"
        print_error "Expected: $expected_checksum"
        print_error "Actual: $actual_checksum"
        return 1
    fi
}

# Batch download multiple files
function batch_download() {
    local -n download_map=$1
    local failed_downloads=0

    print_info "Starting batch download of ${#download_map[@]} files..."

    for url in "${!download_map[@]}"; do
        local dest="${download_map[$url]}"
        if ! safe_download "$url" "$dest"; then
            # $((...)) assignment avoids the set -e exit that ((var++)) triggers when var is 0
            failed_downloads=$((failed_downloads + 1))
        fi
    done

    if [[ $failed_downloads -eq 0 ]]; then
        print_success "All batch downloads completed successfully"
        return 0
    else
        print_error "$failed_downloads batch downloads failed"
        return 1
    fi
}

# Download with progress indication for large files
function safe_download_with_progress() {
    local url="$1"
    local dest="$2"
    local expected_checksum="${3:-}"

    print_info "Downloading with progress: $(basename "$dest")"

    if curl --progress-bar --show-error --fail \
        --connect-timeout "$DOWNLOAD_CONNECT_TIMEOUT" \
        --max-time "$DOWNLOAD_TIMEOUT" \
        --location \
        --user-agent "TSYS-FetchApply/1.0" \
        "$url" -o "$dest"; then

        # Verify checksum if provided
        if [[ -n "$expected_checksum" ]]; then
            if verify_checksum "$dest" "$expected_checksum"; then
                print_success "Downloaded and verified: $(basename "$dest")"
                return 0
            else
                rm -f "$dest"
                return 1
            fi
        else
            print_success "Downloaded: $(basename "$dest")"
            return 0
        fi
    else
        print_error "Failed to download: $(basename "$dest")"
        rm -f "$dest"
        return 1
    fi
}

# Check if URL is accessible
function check_url_accessibility() {
    local url="$1"

    if curl --silent --head --fail \
        --connect-timeout "$DOWNLOAD_CONNECT_TIMEOUT" \
        --max-time 10 \
        "$url" >/dev/null 2>&1; then
        return 0
    else
        return 1
    fi
}

# Validate all required URLs before starting deployment
function validate_required_urls() {
    local -a urls=("$@")
    local failed_urls=0

    print_info "Validating accessibility of ${#urls[@]} URLs..."

    for url in "${urls[@]}"; do
        if check_url_accessibility "$url"; then
            print_success "✓ $url"
        else
            print_error "✗ $url"
            failed_urls=$((failed_urls + 1))
        fi
    done

    if [[ $failed_urls -eq 0 ]]; then
        print_success "All URLs are accessible"
        return 0
    else
        print_error "$failed_urls URLs are not accessible"
        return 1
    fi
}

# Download configuration files with backup
function safe_config_download() {
    local url="$1"
    local dest="$2"
    local backup_suffix="${3:-.bak.$(date +%Y%m%d-%H%M%S)}"

    # Backup existing file if it exists
    if [[ -f "$dest" ]]; then
        local backup_file="${dest}${backup_suffix}"
        if cp "$dest" "$backup_file"; then
            print_info "Backed up existing config: $backup_file"
        else
            print_error "Failed to backup existing config: $dest"
            return 1
        fi
    fi

    # Download new configuration
    if safe_download "$url" "$dest"; then
        print_success "Configuration updated: $(basename "$dest")"
        return 0
    else
        # Restore backup if download failed and backup exists
        local backup_file="${dest}${backup_suffix}"
        if [[ -f "$backup_file" ]]; then
            if mv "$backup_file" "$dest"; then
                print_warning "Restored backup after failed download: $(basename "$dest")"
            else
                print_error "Failed to restore backup: $(basename "$dest")"
            fi
        fi
        return 1
    fi
}

# Test network connectivity to common endpoints
function test_network_connectivity() {
    local test_urls=(
        "https://archive.ubuntu.com"
        "https://github.com"
        "https://curl.haxx.se"
    )

    print_info "Testing network connectivity..."

    for url in "${test_urls[@]}"; do
        if check_url_accessibility "$url"; then
            print_success "Network connectivity confirmed: $url"
            return 0
        fi
    done

    print_error "No network connectivity detected"
    return 1
}

# Export functions for use in other scripts
export -f safe_download
export -f verify_checksum
export -f batch_download
export -f safe_download_with_progress
export -f check_url_accessibility
export -f validate_required_urls
export -f safe_config_download
export -f test_network_connectivity

Project-Tests/unit/safe-download.sh (new executable file, 286 lines)
@@ -0,0 +1,286 @@

#!/bin/bash

# Safe Download Framework Unit Tests
# Tests the SafeDownload.sh framework functionality

set -euo pipefail

PROJECT_ROOT="$(dirname "$(realpath "${BASH_SOURCE[0]}")")/../.."

# Source framework functions
source "$PROJECT_ROOT/Framework-Includes/SafeDownload.sh"

# Named test_framework_connectivity rather than test_network_connectivity so it
# does not shadow the sourced framework function and recurse into itself.
function test_framework_connectivity() {
    echo "🔍 Testing network connectivity..."

    if test_network_connectivity; then
        echo "✅ Network connectivity test passed"
        return 0
    else
        echo "❌ Network connectivity test failed"
        return 1
    fi
}

function test_url_accessibility() {
    echo "🔍 Testing URL accessibility..."

    local test_urls=(
        "https://archive.ubuntu.com"
        "https://github.com"
    )

    local failed=0

    for url in "${test_urls[@]}"; do
        if check_url_accessibility "$url"; then
            echo "✅ URL accessible: $url"
        else
            echo "❌ URL not accessible: $url"
            failed=$((failed + 1))
        fi
    done

    return $failed
}

function test_safe_download() {
    echo "🔍 Testing safe download functionality..."

    local test_url="https://raw.githubusercontent.com/torvalds/linux/master/README"
    local test_dest="/tmp/test-download-$$"
    local failed=0

    # Test successful download
    if safe_download "$test_url" "$test_dest"; then
        echo "✅ Safe download successful"

        # Verify file exists and has content
        if [[ -f "$test_dest" && -s "$test_dest" ]]; then
            echo "✅ Downloaded file exists and has content"
        else
            echo "❌ Downloaded file is missing or empty"
            failed=$((failed + 1))
        fi

        # Cleanup
        rm -f "$test_dest"
    else
        echo "❌ Safe download failed"
        failed=$((failed + 1))
    fi

    # Test download with invalid URL
    if safe_download "https://invalid.example.com/nonexistent" "/tmp/test-invalid-$$" 2>/dev/null; then
        echo "❌ Invalid URL download should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Invalid URL download failed as expected"
    fi

    return $failed
}

function test_checksum_verification() {
    echo "🔍 Testing checksum verification..."

    local test_file="/tmp/test-checksum-$$"
    local test_content="Hello, World!"
    local expected_checksum="dffd6021bb2bd5b0af676290809ec3a53191dd81c7f70a4b28688a362182986f"
    local failed=0

    # Create test file with known content
    echo -n "$test_content" > "$test_file"

    # Test correct checksum
    if verify_checksum "$test_file" "$expected_checksum"; then
        echo "✅ Correct checksum verification passed"
    else
        echo "❌ Correct checksum verification failed"
        failed=$((failed + 1))
    fi

    # Test incorrect checksum
    if verify_checksum "$test_file" "invalid_checksum" 2>/dev/null; then
        echo "❌ Incorrect checksum should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Incorrect checksum verification failed as expected"
    fi

    # Test missing file
    if verify_checksum "/tmp/nonexistent-file-$$" "$expected_checksum" 2>/dev/null; then
        echo "❌ Missing file checksum should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Missing file checksum verification failed as expected"
    fi

    # Cleanup
    rm -f "$test_file"

    return $failed
}

function test_batch_download() {
    echo "🔍 Testing batch download functionality..."

    # Create test download map
    declare -A test_downloads=(
        ["https://raw.githubusercontent.com/torvalds/linux/master/README"]="/tmp/batch-test-1-$$"
        ["https://raw.githubusercontent.com/torvalds/linux/master/COPYING"]="/tmp/batch-test-2-$$"
    )

    local failed=0

    # Test batch download
    if batch_download test_downloads; then
        echo "✅ Batch download successful"

        # Verify all files were downloaded
        for file in "${test_downloads[@]}"; do
            if [[ -f "$file" && -s "$file" ]]; then
                echo "✅ Batch file downloaded: $(basename "$file")"
            else
                echo "❌ Batch file missing: $(basename "$file")"
                failed=$((failed + 1))
            fi
        done

        # Cleanup
        for file in "${test_downloads[@]}"; do
            rm -f "$file"
        done
    else
        echo "❌ Batch download failed"
        failed=$((failed + 1))
    fi

    return $failed
}

function test_config_backup_and_restore() {
    echo "🔍 Testing config backup and restore..."

    local test_config="/tmp/test-config-$$"
    local original_content="Original configuration"
    local failed=0

    # Create original config file
    echo "$original_content" > "$test_config"

    # Test safe config download (fails against the invalid URL, triggering a restore)
    if safe_config_download "https://invalid.example.com/config" "$test_config" ".test-backup" 2>/dev/null; then
        echo "❌ Invalid config download should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Invalid config download failed as expected"

        # Verify original file was restored
        if [[ -f "$test_config" ]] && grep -q "$original_content" "$test_config"; then
            echo "✅ Original config was restored after failed download"
        else
            echo "❌ Original config was not restored properly"
            failed=$((failed + 1))
        fi
    fi

    # Cleanup
    rm -f "$test_config" "$test_config.test-backup"

    return $failed
}

function test_download_error_handling() {
    echo "🔍 Testing download error handling..."

    local failed=0

    # Test download with missing parameters
    if safe_download "" "/tmp/test" 2>/dev/null; then
        echo "❌ Download with empty URL should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Download with empty URL failed as expected"
    fi

    if safe_download "https://example.com" "" 2>/dev/null; then
        echo "❌ Download with empty destination should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Download with empty destination failed as expected"
    fi

    # Test download to an unwritable location (/proc is read-only even for root,
    # unlike /, which root could write to)
    if safe_download "https://github.com" "/proc/test-readonly-$$" 2>/dev/null; then
        echo "❌ Download to read-only location should have failed"
        failed=$((failed + 1))
    else
        echo "✅ Download to read-only location failed as expected"
    fi

    return $failed
}

function test_download_performance() {
    echo "🔍 Testing download performance..."

    local test_url="https://raw.githubusercontent.com/torvalds/linux/master/README"
    local test_dest="/tmp/perf-test-$$"
    local start_time end_time duration

    start_time=$(date +%s)

    if safe_download "$test_url" "$test_dest"; then
        end_time=$(date +%s)
        duration=$((end_time - start_time))

        echo "✅ Download completed in ${duration}s"

        if [[ $duration -gt 30 ]]; then
            echo "⚠️ Download took longer than expected (>30s)"
        else
            echo "✅ Download performance acceptable"
        fi

        # Cleanup
        rm -f "$test_dest"
        return 0
    else
        echo "❌ Performance test download failed"
        return 1
    fi
}

# Main test execution
function main() {
    echo "🧪 Running Safe Download Framework Unit Tests"
    echo "==========================================="

    local total_failures=0

    # Run all tests ($((...)) assignments avoid set -e exits on the first failure)
    test_framework_connectivity || total_failures=$((total_failures + 1))
    test_url_accessibility || total_failures=$((total_failures + 1))
    test_safe_download || total_failures=$((total_failures + 1))
    test_checksum_verification || total_failures=$((total_failures + 1))
    test_batch_download || total_failures=$((total_failures + 1))
    test_config_backup_and_restore || total_failures=$((total_failures + 1))
    test_download_error_handling || total_failures=$((total_failures + 1))
    test_download_performance || total_failures=$((total_failures + 1))

    echo "==========================================="

    if [[ $total_failures -eq 0 ]]; then
        echo "✅ All safe download framework tests passed"
        exit 0
    else
        echo "❌ $total_failures safe download framework tests failed"
        exit 1
    fi
}

# Run main if executed directly
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
    main "$@"
fi

@@ -358,7 +358,7 @@ function secharden-auto-upgrades() {

 function secharden-2fa() {
     print_info "Now running $FUNCNAME"
-    #curl --silent ${DL_ROOT}/Modules/Security/secharden-2fa.sh|$(which bash)
+    bash ./Modules/Security/secharden-2fa.sh
     print_info "Completed running $FUNCNAME"
 }