# CI/CD Pipeline
Audience: Dev, Ops
You will learn:
- Continuous integration setup for the icon tool
- Automated testing and quality checks
- A deployment pipeline for multiple environments
- Release management and versioning

Prerequisites:

- Development environment set up
- Git repository with push access
- Basic familiarity with GitHub Actions/GitLab CI
## Pipeline Overview

```mermaid
graph TB
A[Git Push] --> B[CI Trigger]
B --> C[Code Quality]
B --> D[Tests]
B --> E[Icon Validation]
C --> F[Lint Check]
C --> G[Format Check]
D --> H[Unit Tests]
D --> I[Integration Tests]
E --> J[Icon Extraction]
E --> K[Metadata Validation]
F --> L[Build]
G --> L
H --> L
I --> L
J --> L
K --> L
L --> M[Deploy Dev]
M --> N[Deploy Staging]
    N --> O[Deploy Production]
```
Evidence: CI/CD best practices, automated workflows
## GitHub Actions Workflow

### 1. Main Workflow

```yaml
# .github/workflows/ci.yml
name: CI/CD Pipeline
on:
push:
branches: [ main, develop ]
pull_request:
branches: [ main ]
env:
NODE_VERSION: '20'
PYTHON_VERSION: '3.11'
jobs:
quality-checks:
name: Code Quality
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
- name: Install Dependencies
run: |
npm ci
pip install flask flake8 black pytest
- name: Python Lint
run: flake8 app.py --max-line-length=88
- name: Python Format Check
run: black --check app.py
- name: JavaScript Lint
run: npx eslint extract-icons.js create-selected-icons.js
test-suite:
name: Test Suite
runs-on: ubuntu-latest
needs: quality-checks
steps:
- name: Checkout Code
uses: actions/checkout@v4
- name: Setup Environment
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
- name: Install Dependencies
run: |
npm ci
pip install flask pytest requests
- name: Run Python Tests
run: pytest tests/ -v
- name: Run JavaScript Tests
run: npm test
- name: Integration Tests
run: |
python app.py &
sleep 5
curl -f http://localhost:5000/api/icons
curl -f http://localhost:5000/health
icon-validation:
name: Icon Validation
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Install Dependencies
run: npm ci
- name: Extract Icons
run: node extract-icons.js
- name: Validate Icon Count
run: |
icon_count=$(ls static/icons/*.svg | wc -l)
echo "Icon count: $icon_count"
if [ $icon_count -lt 150 ]; then
echo "Error: Too few icons extracted ($icon_count < 150)"
exit 1
fi
- name: Validate ZIP Size
run: |
zip_size=$(stat -c%s "turkiye-atlasi-icons.zip")
echo "ZIP size: $zip_size bytes"
if [ $zip_size -gt 100000 ]; then
echo "Warning: ZIP size is large ($zip_size > 100KB)"
fi
- name: Validate Metadata
run: |
python3 -c "
import json
with open('icons.json') as f:
categories = json.load(f)
total_categorized = sum(len(icons) for icons in categories.values())
print(f'Categorized icons: {total_categorized}')
import os
actual_count = len([f for f in os.listdir('static/icons') if f.endswith('.svg')])
print(f'Actual icons: {actual_count}')
if abs(total_categorized - actual_count) > 5:
print('Error: Significant mismatch between categorized and actual icons')
exit(1)
"
- name: Upload Icon Artifacts
        uses: actions/upload-artifact@v4
with:
name: extracted-icons
path: |
static/icons/
*.zip
retention-days: 30
build:
name: Build Application
runs-on: ubuntu-latest
needs: [quality-checks, test-suite, icon-validation]
steps:
- name: Checkout Code
uses: actions/checkout@v4
- name: Setup Environment
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
- name: Install Dependencies
run: |
npm ci
pip install flask
- name: Extract Icons
run: node extract-icons.js
- name: Create Build Artifact
run: |
tar -czf build-artifact.tar.gz \
app.py \
static/ \
templates/ \
icons.json \
pyproject.toml \
requirements.txt
- name: Upload Build Artifact
        uses: actions/upload-artifact@v4
with:
name: build-artifact
path: build-artifact.tar.gz
retention-days: 90
deploy-staging:
name: Deploy to Staging
runs-on: ubuntu-latest
needs: build
if: github.ref == 'refs/heads/develop'
environment: staging
steps:
- name: Download Build Artifact
        uses: actions/download-artifact@v4
with:
name: build-artifact
- name: Deploy to Staging
run: |
echo "Deploying to staging environment..."
          # Replit deployment or another staging platform
# curl -X POST "$STAGING_DEPLOY_WEBHOOK" \
# -H "Authorization: Bearer $STAGING_TOKEN" \
# -F "artifact=@build-artifact.tar.gz"
deploy-production:
name: Deploy to Production
runs-on: ubuntu-latest
needs: build
if: github.ref == 'refs/heads/main'
environment: production
steps:
- name: Download Build Artifact
        uses: actions/download-artifact@v4
with:
name: build-artifact
- name: Deploy to Production
run: |
echo "Deploying to production environment..."
          # Production deployment logic
```
Evidence: GitHub Actions best practices, project requirements
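The `sleep 5` in the Integration Tests step is fragile on slow runners. A polling helper is more robust; the sketch below (a hypothetical `scripts/smoke_test.py`, exercising only the endpoints already used above) waits for the server and then runs the same two checks:

```python
# scripts/smoke_test.py -- hypothetical helper that polls instead of sleeping
import sys
import time

import requests

BASE_URL = "http://localhost:5000"


def wait_for_server(timeout: float = 30.0) -> None:
    """Poll /health until the app answers or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            if requests.get(f"{BASE_URL}/health", timeout=2).status_code == 200:
                return
        except requests.ConnectionError:
            pass  # server not up yet, retry
        time.sleep(0.5)
    sys.exit("Server did not become ready in time")


def main() -> None:
    wait_for_server()
    for endpoint in ("/api/icons", "/health"):
        response = requests.get(f"{BASE_URL}{endpoint}", timeout=5)
        response.raise_for_status()
        print(f"OK {endpoint} ({response.status_code})")


if __name__ == "__main__":
    main()
```

The Integration Tests step then becomes `python app.py & python scripts/smoke_test.py`, with no guessing about startup time.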
### 2. Release Workflow

```yaml
# .github/workflows/release.yml
name: Release
on:
push:
tags:
- 'v*'
jobs:
release:
name: Create Release
runs-on: ubuntu-latest
steps:
- name: Checkout Code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
cache: 'npm'
- name: Install Dependencies
run: npm ci
- name: Extract Icons
run: node extract-icons.js
- name: Generate Changelog
id: changelog
run: |
# Generate changelog since last tag
          PREVIOUS_TAG=$(git describe --tags --abbrev=0 HEAD^ 2>/dev/null || git rev-list --max-parents=0 HEAD)
echo "## Changes since $PREVIOUS_TAG" > CHANGELOG.md
git log $PREVIOUS_TAG..HEAD --pretty=format:"- %s" >> CHANGELOG.md
- name: Create Release Archives
run: |
# Full icon collection
zip -r "icon-collection-${{ github.ref_name }}.zip" static/icons/
# Selected icons
node create-selected-icons.js
mv selected-icons.zip "selected-icons-${{ github.ref_name }}.zip"
# Source code archive
tar -czf "source-${{ github.ref_name }}.tar.gz" \
--exclude=".git" \
--exclude="node_modules" \
--exclude="*.zip" \
.
- name: Create GitHub Release
        id: create_release
        uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
          tag_name: ${{ github.ref_name }}
          release_name: Release ${{ github.ref_name }}
body_path: CHANGELOG.md
draft: false
prerelease: false
- name: Upload Release Assets
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./icon-collection-${{ github.ref_name }}.zip
asset_name: icon-collection-${{ github.ref_name }}.zip
          asset_content_type: application/zip
```
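Note that `actions/create-release` and `actions/upload-release-asset` are archived and no longer maintained; `softprops/action-gh-release` or the `gh release create` CLI are the usual replacements. If you would rather keep release logic in Python, a minimal sketch against the GitHub REST API (the repository slug is a placeholder) looks like this:

```python
# Hypothetical release helper using the GitHub REST API via requests
import os

import requests

OWNER_REPO = "your-org/icon-tool"  # placeholder -- substitute the real repository
API = f"https://api.github.com/repos/{OWNER_REPO}/releases"
HEADERS = {
    "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    "Accept": "application/vnd.github+json",
}


def create_release(tag: str, body: str) -> dict:
    """Create a release for an existing tag and return the API payload."""
    response = requests.post(
        API,
        headers=HEADERS,
        json={"tag_name": tag, "name": f"Release {tag}", "body": body},
    )
    response.raise_for_status()
    return response.json()


def upload_asset(release: dict, path: str) -> None:
    """Attach a ZIP archive to the release."""
    upload_url = release["upload_url"].split("{")[0]  # strip the URI template
    with open(path, "rb") as f:
        requests.post(
            upload_url,
            headers={**HEADERS, "Content-Type": "application/zip"},
            params={"name": os.path.basename(path)},
            data=f.read(),
        ).raise_for_status()
```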
## Testing Strategies

### 1. Unit Tests

```python
# tests/test_app.py
import pytest
import json
from app import app, get_icon_list, load_icon_categories
@pytest.fixture
def client():
app.config['TESTING'] = True
with app.test_client() as client:
yield client
def test_health_endpoint(client):
"""Test basic health check"""
response = client.get('/health')
assert response.status_code == 200
def test_api_icons(client):
"""Test icons API endpoint"""
response = client.get('/api/icons')
assert response.status_code == 200
data = json.loads(response.data)
assert 'icons' in data
assert 'categories' in data
assert len(data['icons']) > 0
def test_icon_list_structure():
"""Test icon list data structure"""
icons, categories = get_icon_list()
# Test icon structure
if icons:
icon = icons[0]
required_fields = ['name', 'filename', 'path', 'display_name', 'category']
for field in required_fields:
assert field in icon
# Test categories structure
assert isinstance(categories, dict)
def test_individual_icon(client):
"""Test individual icon endpoint"""
# Test existing icon
response = client.get('/api/icon/home')
if response.status_code == 200:
data = json.loads(response.data)
assert data['name'] == 'home'
assert data['filename'] == 'home.svg'
# Test non-existing icon
response = client.get('/api/icon/nonexistent')
assert response.status_code == 404
def test_metadata_loading():
"""Test metadata file loading"""
categories = load_icon_categories()
assert isinstance(categories, dict)
# Test category structure
for category, icons in categories.items():
assert isinstance(category, str)
assert isinstance(icons, list)
for icon in icons:
            assert icon.endswith('.svg')
```
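The quality gates below call for 80% test coverage. With `pytest-cov` installed, `pytest tests/ --cov=app --cov-fail-under=80` enforces that threshold directly in the test run.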
### 2. Integration Tests

```python
# tests/test_integration.py
import subprocess
import time
import requests
import pytest
class TestIconExtraction:
"""Test icon extraction process"""
def test_extract_icons_script(self):
"""Test that extract-icons.js runs successfully"""
result = subprocess.run(['node', 'extract-icons.js'],
capture_output=True, text=True)
assert result.returncode == 0
assert "Icons erfolgreich extrahiert" in result.stdout
def test_generated_files_exist(self):
"""Test that extraction creates expected files"""
import os
# Check icons directory
assert os.path.exists('static/icons')
icons = [f for f in os.listdir('static/icons') if f.endswith('.svg')]
assert len(icons) >= 150 # Minimum expected icons
# Check ZIP archive
assert os.path.exists('turkiye-atlasi-icons.zip')
# Check file sizes
zip_size = os.path.getsize('turkiye-atlasi-icons.zip')
assert zip_size > 10000 # At least 10KB
assert zip_size < 200000 # Less than 200KB
class TestAPIIntegration:
"""Test full API integration"""
@pytest.fixture(scope="class")
def running_app(self):
"""Start Flask app for testing"""
import threading
from app import app
# Start app in background thread
server = threading.Thread(
target=lambda: app.run(port=5001, debug=False)
)
server.daemon = True
server.start()
# Wait for server to start
time.sleep(2)
yield "http://localhost:5001"
# Cleanup happens automatically when daemon thread ends
def test_full_api_workflow(self, running_app):
"""Test complete API workflow"""
base_url = running_app
# Test main page
response = requests.get(f"{base_url}/")
assert response.status_code == 200
assert "Icon Browser" in response.text
# Test API endpoint
response = requests.get(f"{base_url}/api/icons")
assert response.status_code == 200
data = response.json()
assert len(data['icons']) > 0
# Test individual icon
first_icon = data['icons'][0]
response = requests.get(f"{base_url}/api/icon/{first_icon['name']}")
assert response.status_code == 200
# Test static file serving
response = requests.get(f"{base_url}{first_icon['path']}")
assert response.status_code == 200
        assert response.headers['content-type'] == 'image/svg+xml'
```
### 3. Performance Tests

```python
# tests/test_performance.py
import time
import pytest
from app import app
class TestPerformance:
"""Performance benchmarking tests"""
def test_api_response_time(self):
"""Test API response times"""
with app.test_client() as client:
start_time = time.time()
response = client.get('/api/icons')
end_time = time.time()
assert response.status_code == 200
response_time = end_time - start_time
# API should respond within 200ms
assert response_time < 0.2, f"API too slow: {response_time:.3f}s"
def test_concurrent_requests(self):
"""Test handling of concurrent requests"""
import threading
import queue
results = queue.Queue()
def make_request():
with app.test_client() as client:
start = time.time()
response = client.get('/api/icons')
end = time.time()
results.put((response.status_code, end - start))
# Create 10 concurrent requests
threads = []
for _ in range(10):
thread = threading.Thread(target=make_request)
threads.append(thread)
thread.start()
# Wait for all threads
for thread in threads:
thread.join()
# Check results
response_times = []
while not results.empty():
status_code, response_time = results.get()
assert status_code == 200
response_times.append(response_time)
# Average response time should be reasonable
avg_time = sum(response_times) / len(response_times)
        assert avg_time < 0.5, f"Average response time too high: {avg_time:.3f}s"
```
## Quality Gates

### 1. Pre-commit Hooks

```yaml
# .pre-commit-config.yaml
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.4.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-json
- id: check-merge-conflict
- repo: https://github.com/psf/black
rev: 23.3.0
hooks:
- id: black
language_version: python3
- repo: https://github.com/pycqa/flake8
rev: 6.0.0
hooks:
- id: flake8
- repo: local
hooks:
- id: icon-extraction-test
name: Test Icon Extraction
entry: node extract-icons.js
language: system
        pass_filenames: false
```
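Register the hooks once per clone with `pip install pre-commit` followed by `pre-commit install`; `pre-commit run --all-files` checks the entire repository on demand, which is also the usual way to run the hooks in CI.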
### 2. Quality Metrics

```yaml
# Quality thresholds in CI
quality_gates:
test_coverage: 80% # Minimum test coverage
max_response_time: 200ms # API response time limit
max_zip_size: 100KB # Icon archive size limit
min_icon_count: 150 # Minimum number of icons
  max_bundle_size: 500KB # Maximum application bundle
```
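These thresholds are declarative, so CI still needs a step that enforces them. A sketch of such a gate checker (a hypothetical `scripts/check_gates.py`, with the relevant thresholds hard-coded to match the values above):

```python
# scripts/check_gates.py -- hypothetical enforcement of the quality gates above
import sys
from pathlib import Path

MIN_ICON_COUNT = 150    # min_icon_count
MAX_ZIP_SIZE = 100_000  # max_zip_size in bytes (100KB)


def main() -> None:
    failures = []

    # Gate 1: minimum number of extracted icons
    icon_count = len(list(Path("static/icons").glob("*.svg")))
    if icon_count < MIN_ICON_COUNT:
        failures.append(f"too few icons: {icon_count} < {MIN_ICON_COUNT}")

    # Gate 2: archive size limit
    zip_path = Path("turkiye-atlasi-icons.zip")
    if not zip_path.exists():
        failures.append("icon archive missing")
    elif zip_path.stat().st_size > MAX_ZIP_SIZE:
        failures.append(f"archive too large: {zip_path.stat().st_size} bytes")

    if failures:
        sys.exit("Quality gates failed: " + "; ".join(failures))
    print(f"Quality gates passed ({icon_count} icons)")


if __name__ == "__main__":
    main()
```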
## Deployment Strategies

### 1. Replit Deployment

```bash
#!/bin/bash
# .replit deployment script
echo "Starting Replit deployment..."
# Install dependencies
npm ci
pip install -r requirements.txt
# Extract icons
node extract-icons.js
# Validate deployment
python -c "
import app
with app.app.test_client() as client:
response = client.get('/api/icons')
assert response.status_code == 200
print('✓ Deployment validation passed')
"
echo "✓ Replit deployment complete"
### 2. Docker Deployment

```dockerfile
# Dockerfile
FROM python:3.11-slim AS python-base
# Install Node.js
RUN apt-get update && apt-get install -y \
curl \
&& curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
&& apt-get install -y nodejs \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /app
# Copy and install dependencies
COPY package*.json ./
RUN npm ci --omit=dev
COPY pyproject.toml ./
RUN pip install flask
# Copy application code
COPY . .
# Extract icons
RUN node extract-icons.js
# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
CMD curl -f http://localhost:5000/health || exit 1
EXPOSE 5000
CMD ["python", "app.py"]
### 3. Vercel Deployment

```json
{
"version": 2,
"builds": [
{
"src": "app.py",
"use": "@vercel/python"
},
{
"src": "extract-icons.js",
"use": "@vercel/node"
}
],
"routes": [
{
"src": "/api/(.*)",
"dest": "app.py"
},
{
"src": "/static/(.*)",
"dest": "static/$1"
},
{
"src": "/(.*)",
"dest": "app.py"
}
],
"env": {
"FLASK_ENV": "production"
}
}
```
## Monitoring & Alerting

### 1. Health Checks

```python
# Enhanced health check endpoint (in app.py; `app` and `get_icon_list` already
# exist at module level there; the imports below make the snippet self-contained)
import json
import time
from pathlib import Path

from flask import jsonify

@app.route('/health')
def health_check():
"""Comprehensive health check"""
checks = {
'timestamp': time.time(),
'status': 'healthy',
'checks': {}
}
# Check icons directory
icons_dir = Path('static/icons')
checks['checks']['icons_directory'] = {
'exists': icons_dir.exists(),
'icon_count': len(list(icons_dir.glob('*.svg'))) if icons_dir.exists() else 0
}
# Check metadata file
metadata_file = Path('icons.json')
checks['checks']['metadata'] = {
'exists': metadata_file.exists(),
'valid_json': False
}
if metadata_file.exists():
try:
with open(metadata_file) as f:
json.load(f)
checks['checks']['metadata']['valid_json'] = True
except json.JSONDecodeError:
pass
# Check API functionality
try:
icons, categories = get_icon_list()
checks['checks']['api'] = {
'functional': True,
'icon_count': len(icons),
'category_count': len(categories)
}
except Exception as e:
checks['checks']['api'] = {
'functional': False,
'error': str(e)
}
checks['status'] = 'unhealthy'
# Overall health assessment
    if not all([
        checks['checks']['icons_directory']['exists'],
        checks['checks']['metadata']['valid_json'],
        checks['checks']['api']['functional']
    ]):
checks['status'] = 'unhealthy'
return jsonify(checks), 503
    return jsonify(checks)
```
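A small unit test keeps the endpoint honest as it grows. A possible addition to `tests/test_app.py`, reusing its `client` fixture and `json` import:

```python
# Possible addition to tests/test_app.py
def test_health_payload_structure(client):
    """The health endpoint reports an overall status plus per-subsystem checks."""
    response = client.get('/health')
    data = json.loads(response.data)
    assert response.status_code in (200, 503)
    assert data['status'] in ('healthy', 'unhealthy')
    for key in ('icons_directory', 'metadata', 'api'):
        assert key in data['checks']
```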
### 2. Performance Monitoring

```yaml
# GitHub Actions monitoring step
- name: Performance Monitor
run: |
# Response time check
start_time=$(date +%s%N)
curl -f http://localhost:5000/api/icons > /dev/null
end_time=$(date +%s%N)
response_time=$(( (end_time - start_time) / 1000000 ))
echo "Response time: ${response_time}ms"
if [ $response_time -gt 500 ]; then
echo "::warning::API response time is high: ${response_time}ms"
fi
    # Memory usage check ($! from an earlier step is not available here,
    # so look up the app's PID explicitly; ps reports RSS in KB)
    app_pid=$(pgrep -f "python app.py" | head -1)
    memory_usage=$(ps -o rss= -p "$app_pid")
echo "Memory usage: ${memory_usage}KB"
if [ $memory_usage -gt 100000 ]; then
echo "::warning::Memory usage is high: ${memory_usage}KB"
    fi
```
## Versioning & Releases

### 1. Semantic Versioning

```bash
# Version bump scripts
# The package.json version is the single source of truth
# Patch release (bug fixes)
npm version patch
git push origin main --tags
# Minor release (new features)
npm version minor
git push origin main --tags
# Major release (breaking changes)
npm version major
git push origin main --tags
```
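Because `package.json` is the single source of truth, the Flask app can derive its own version from it instead of duplicating the number. A minimal sketch, assuming `app.py` sits next to `package.json` (the helper name is hypothetical):

```python
# Possible helper in app.py: read the version from package.json
import json
from pathlib import Path


def get_app_version() -> str:
    """Return the version field from package.json (single source of truth)."""
    package_json = Path(__file__).parent / "package.json"
    return json.loads(package_json.read_text())["version"]
```

Surfacing this value in the `/health` payload makes it easy to verify which version a deployment is actually running.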
### 2. Changelog Automation

```bash
# Generate the changelog automatically
npx auto-changelog --package --template keepachangelog
# Or manual changelog format
echo "## [1.2.0] - $(date +%Y-%m-%d)
### Added
- New healthcare icon category (8 icons)
- Performance improvements for icon loading
### Changed
- Updated FontAwesome to v7.1.0
- Improved error handling in API
### Fixed
- Fixed metadata inconsistency issues
- Resolved ZIP archive size optimization
### Technical
- Total icons: 162 → 170
- ZIP size: 69KB → 72KB
- New CI/CD pipeline with quality gates" >> CHANGELOG.md
```
Evidence: release-management best practices, versioning strategies
CI/CD setup checklist:

- [ ] GitHub Actions workflows configured
- [ ] Quality gates and tests implemented
- [ ] Deployment pipeline for all environments
- [ ] Monitoring and health checks enabled
- [ ] Release automation in place
- [ ] Documentation generated automatically