# Scripts Directory
This directory contains utility scripts for the Datacenter Documentation project.
---
## 🔍 test-ci-pipeline.sh
**Local CI/CD Pipeline Validation Script**
### Description
Simulates the complete CI/CD pipeline locally before pushing code to the repository. The script runs the same checks that would run in GitHub Actions, GitLab CI, or Gitea Actions.
### Usage
```bash
# Run from project root
bash scripts/test-ci-pipeline.sh
# Or make it executable and run directly
chmod +x scripts/test-ci-pipeline.sh
./scripts/test-ci-pipeline.sh
```
### Pipeline Stages
The script executes the following stages in order:
#### 1. **LINT** Stage
- **Black**: Code formatting check
- **Ruff**: Linting and code quality
- **MyPy**: Type checking (strict mode)
#### 2. **TEST** Stage
- **Unit Tests**: Runs pytest with coverage
- **Security Scan**: Bandit (if installed)
#### 3. **BUILD** Stage
- **Poetry Check**: Validates `pyproject.toml` configuration
- **Dependency Resolution**: Tests if all dependencies can be installed
- **Docker Validation**: Checks Dockerfile syntax
#### 4. **INTEGRATION** Stage (Optional)
- **API Health Check**: Tests if local API is running
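The stage/job structure above can be sketched in a few lines of Python. This is illustrative only: the job names and commands mirror the Example Output section, while `run_job` and `run_stage` are hypothetical helpers, not the script's actual implementation.

```python
import subprocess

# Illustrative sketch of the LINT stage. The job list mirrors the
# commands shown in the Example Output section; run_job/run_stage are
# hypothetical helpers, not the script's actual implementation.
LINT_JOBS = [
    ("lint:black", ["poetry", "run", "black", "--check", "src/", "tests/"]),
    ("lint:ruff", ["poetry", "run", "ruff", "check", "src/", "tests/"]),
    ("lint:mypy", ["poetry", "run", "mypy", "src/"]),
]

def run_job(name, cmd):
    """Run a single job; return True if it exits with code 0."""
    result = subprocess.run(cmd, capture_output=True)
    return result.returncode == 0

def run_stage(jobs):
    """Run every job in a stage; return the number of failed jobs."""
    return sum(1 for name, cmd in jobs if not run_job(name, cmd))
```

A non-zero count from `run_stage` maps to the script's exit code 1.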
### Output
The script provides:
- **Color-coded output** for easy readability
- 📊 **Real-time progress** for each job
- 📄 **Summary report** at the end
- 📝 **Written report** saved to `ci-pipeline-report-TIMESTAMP.txt`
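The `TIMESTAMP` naming can be sketched as follows; this is a hypothetical illustration of the file-name convention, and the real script's report format and contents may differ.

```python
from datetime import datetime
from pathlib import Path

# Hypothetical sketch of the ci-pipeline-report-TIMESTAMP.txt naming;
# the real script's report format and fields may differ.
def write_report(lines, directory="."):
    """Write a plain-text summary with a timestamped file name."""
    timestamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    path = Path(directory) / f"ci-pipeline-report-{timestamp}.txt"
    path.write_text("\n".join(lines) + "\n")
    return path
```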
### Example Output
```
╔═══════════════════════════════════════════════════════╗
║ LOCAL CI/CD PIPELINE SIMULATION ║
║ GitLab CI Pipeline ║
╚═══════════════════════════════════════════════════════╝
=====================================
STAGE: LINT
=====================================
>>> JOB: lint:black
Running: poetry run black --check src/ tests/
✅ PASSED: Black code formatting
>>> JOB: lint:ruff
Running: poetry run ruff check src/ tests/
✅ PASSED: Ruff linting
>>> JOB: lint:mypy
Running: poetry run mypy src/
✅ PASSED: MyPy type checking
...
╔═══════════════════════════════════════════════════════╗
║ ✅ PIPELINE PASSED SUCCESSFULLY ✅ ║
╚═══════════════════════════════════════════════════════╝
Total Tests: 8
Passed: 8
Failed: 0
Duration: 6s
```
### Exit Codes
- **0**: All checks passed ✅
- **1**: One or more checks failed ❌
### Requirements
- **Poetry**: For dependency management
- **Python 3.12+**: As specified in `pyproject.toml`
- **Docker/Podman** (optional): For Docker validation stage
- **MongoDB** (optional): For integration tests
### When to Run
Run this script:
- **Before every commit** to ensure code quality
- **Before creating a pull request**
- **After making significant changes**
- **To verify CI/CD pipeline compatibility**
### Integration with Git
You can add this as a Git pre-push hook:
```bash
#!/bin/bash
# .git/hooks/pre-push
echo "Running CI pipeline validation..."
bash scripts/test-ci-pipeline.sh
if [ $? -ne 0 ]; then
echo "❌ CI pipeline validation failed. Push aborted."
exit 1
fi
echo "✅ CI pipeline validation passed. Proceeding with push..."
exit 0
```
### Continuous Integration Compatibility
This script simulates:
- **GitHub Actions** (`.github/workflows/build-deploy.yml`)
- **GitLab CI** (`.gitlab-ci.yml`)
- **Gitea Actions** (`.gitea/workflows/ci.yml`)
Checks that pass locally should also pass on the actual CI/CD platforms.
---
## 📝 Report Files
After running the validation script, you'll find:
- **`ci-pipeline-report-TIMESTAMP.txt`**: Plain text summary
- **`CI_VALIDATION_REPORT.md`**: Comprehensive markdown report with details
---
## 🔄 convert_config.py
**Configuration Format Converter**
### Description
Converts between `.env` and `values.yaml` configuration formats, making it easy to switch between Docker Compose and Helm deployments.
### Usage
#### Prerequisites
```bash
pip install pyyaml
```
#### Convert .env to values.yaml
```bash
./scripts/convert_config.py env-to-yaml .env values.yaml
```
#### Convert values.yaml to .env
```bash
./scripts/convert_config.py yaml-to-env values.yaml .env
```
### Examples
**Example 1: Create values.yaml from existing .env**
```bash
# You have an existing .env file from Docker development
./scripts/convert_config.py env-to-yaml .env my-values.yaml
# Use the generated values.yaml with Helm
helm install my-release deploy/helm/datacenter-docs -f my-values.yaml
```
**Example 2: Generate .env from values.yaml**
```bash
# You have a values.yaml from Kubernetes deployment
./scripts/convert_config.py yaml-to-env values.yaml .env
# Use the generated .env with Docker Compose
cd deploy/docker
docker-compose -f docker-compose.dev.yml up -d
```
**Example 3: Environment migration**
```bash
# Convert development .env to staging values.yaml
./scripts/convert_config.py env-to-yaml .env.development values-staging.yaml
# Manually adjust staging-specific settings
nano values-staging.yaml
# Deploy to staging Kubernetes cluster
helm install staging deploy/helm/datacenter-docs -f values-staging.yaml
```
### Supported Configuration
The script converts:
- **MongoDB**: Connection settings and authentication
- **Redis**: Connection and authentication
- **MCP Server**: URL and API key
- **Proxmox**: Host, authentication, SSL settings
- **LLM**: Provider settings (OpenAI, Anthropic, Ollama, etc.)
- **API**: Server configuration and workers
- **CORS**: Allowed origins
- **Application**: Logging and debug settings
- **Celery**: Broker and result backend
- **Vector Store**: ChromaDB and embedding model
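To illustrate the kind of mapping the converter performs, here is a minimal sketch that groups flat `.env` keys into nested `values.yaml` sections. The splitting rule shown (first underscore delimits the section) is an assumption for illustration; the real `convert_config.py` uses its own mapping table.

```python
def env_to_nested(env):
    """Group flat KEY_NAME=value pairs into nested sections.

    Splits on the first underscore, so MONGODB_HOST becomes
    {"mongodb": {"host": ...}}. Illustrative only: the real
    convert_config.py applies its own mapping rules.
    """
    nested = {}
    for key, value in env.items():
        section, _, rest = key.lower().partition("_")
        nested.setdefault(section, {})[rest] = value
    return nested

# Serializing the result (e.g. with yaml.safe_dump from PyYAML)
# would then produce the values.yaml text.
```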
### Output
```
Reading .env file: .env
Converting to values.yaml format...
Writing values.yaml: my-values.yaml
✓ Conversion completed successfully!
Output written to: my-values.yaml
```
### Limitations
- Converts common configuration options only
- Complex nested structures may require manual adjustment
- Helm-specific values (resource limits, replicas) are not included in the `.env` conversion
- Always review and test converted configuration
### Tips
1. **Review output**: Always check converted files for accuracy
2. **Test first**: Validate in development before production
3. **Keep secrets secure**: Use proper secret management tools
4. **Version control**: Track configuration changes
### See Also
- [CONFIGURATION.md](../CONFIGURATION.md) - Complete configuration guide
- [.env.example](../.env.example) - Environment variable template
- [values.yaml](../values.yaml) - YAML configuration template
---
## 🚀 Quick Start
```bash
# First time setup
poetry install
# Run validation
bash scripts/test-ci-pipeline.sh
# If all passes, commit and push
git add .
git commit -m "your commit message"
git push
```
---
## 🔧 Troubleshooting
### "poetry: command not found"
Install Poetry: https://python-poetry.org/docs/#installation
### "Black would reformat X files"
Run: `poetry run black src/ tests/`
### "Ruff found X errors"
Run: `poetry run ruff check --fix src/ tests/`
### "MyPy found X errors"
Fix type errors or add type ignores where appropriate.
### Docker validation fails
Ensure Docker or Podman is installed:
- **Ubuntu/Debian**: `sudo apt install docker.io`
- **Fedora**: `sudo dnf install podman podman-compose`
---
## 📚 Additional Resources
- [CLAUDE.md](../CLAUDE.md) - Project documentation for AI assistants
- [README.md](../README.md) - Project overview
- [TODO.md](../TODO.md) - Development roadmap
- [CI_VALIDATION_REPORT.md](../CI_VALIDATION_REPORT.md) - Latest validation report