Compare commits

..

No commits in common. "fda8319e8ed1f69652bda438b54d9d7465b58451" and "38fa3b6c0aef06c4ab23e4afdbb1f24b791c7751" have entirely different histories.

118 changed files with 1623 additions and 25697 deletions

View File

@ -45,18 +45,74 @@ ECONOMIC_AGREEMENT_GRANT_TOKEN=your_agreement_grant_token_here
# 🚨 SAFETY SWITCHES - Protects against unintended changes
ECONOMIC_READ_ONLY=true # Set to false ONLY after testing
ECONOMIC_DRY_RUN=true # Set to false ONLY when ready for production writes
# =====================================================
# vTiger Cloud Integration (Required for Subscriptions)
# =====================================================
# Get credentials from vTiger Cloud -> Settings -> Integration -> Webservices
VTIGER_URL=https://your-instance.od2.vtiger.com
VTIGER_USERNAME=your_vtiger_username
VTIGER_API_KEY=your_vtiger_api_key
# =====================================================
# Simply-CRM / Old vTiger On-Premise (Legacy)
# =====================================================
# Old vTiger installation (if different from cloud)
OLD_VTIGER_URL=http://your-old-vtiger-server.com
OLD_VTIGER_USERNAME=your_old_username
OLD_VTIGER_API_KEY=your_old_api_key
# =====================================================
# vTiger CRM Integration (Optional)
# =====================================================
VTIGER_URL=https://your-instance.od2.vtiger.com
VTIGER_USERNAME=your_username@yourdomain.com
VTIGER_API_KEY=your_api_key_or_access_key
VTIGER_PASSWORD=your_password_if_using_basic_auth
# =====================================================
# TIME TRACKING MODULE - Isolated Settings
# =====================================================
# vTiger Integration Safety Flags
TIMETRACKING_VTIGER_READ_ONLY=true # 🚨 Block ALL writes to vTiger
TIMETRACKING_VTIGER_DRY_RUN=true # 🚨 Log without syncing
# e-conomic Integration Safety Flags
TIMETRACKING_ECONOMIC_READ_ONLY=true # 🚨 Block ALL writes to e-conomic
TIMETRACKING_ECONOMIC_DRY_RUN=true # 🚨 Log without exporting
TIMETRACKING_EXPORT_TYPE=draft # draft|booked (draft is safest)
# Business Logic Settings
TIMETRACKING_DEFAULT_HOURLY_RATE=850.00 # DKK per hour (fallback if the customer has no rate)
TIMETRACKING_AUTO_ROUND=true # Auto-round to the nearest increment
TIMETRACKING_ROUND_INCREMENT=0.5 # Rounding increment (0.25, 0.5, 1.0)
TIMETRACKING_ROUND_METHOD=up # up, nearest, or down
TIMETRACKING_REQUIRE_APPROVAL=true # Require manual approval
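The rounding settings above combine as in this short sketch (illustrative Python, not code from the repo; the function name is hypothetical and mirrors the `TIMETRACKING_ROUND_*` variables):

```python
import math

def round_hours(hours: float, increment: float = 0.5, method: str = "up") -> float:
    """Round a timelog to the configured increment (TIMETRACKING_ROUND_INCREMENT)
    using the configured method (TIMETRACKING_ROUND_METHOD)."""
    steps = hours / increment
    if method == "up":
        steps = math.ceil(steps)
    elif method == "down":
        steps = math.floor(steps)
    else:  # "nearest"
        steps = round(steps)
    return steps * increment

print(round_hours(1.1))                   # 1.5 (round up to the nearest 0.5)
print(round_hours(1.1, 0.25, "nearest"))  # 1.0
```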
# =====================================================
# OLLAMA AI Integration (Optional - for document extraction)
# =====================================================
OLLAMA_ENDPOINT=http://ai_direct.cs.blaahund.dk
OLLAMA_MODEL=qwen2.5-coder:7b
# =====================================================
# COMPANY INFO
# =====================================================
OWN_CVR=29522790 # BMC Denmark ApS - ignore when detecting vendors
# =====================================================
# FILE UPLOAD
# =====================================================
UPLOAD_DIR=uploads
MAX_FILE_SIZE_MB=50
# =====================================================
# MODULE SYSTEM - Dynamic Feature Loading
# =====================================================
# Enable/disable entire module system
MODULES_ENABLED=true
# Directory for dynamic modules (default: app/modules)
MODULES_DIR=app/modules
# Auto-reload modules on changes (dev only, requires restart)
MODULES_AUTO_RELOAD=true
# =====================================================
# MODULE-SPECIFIC CONFIGURATION
# =====================================================
# Pattern: MODULES__{MODULE_NAME}__{KEY}
# Example module configuration:
# MODULES__INVOICE_OCR__READ_ONLY=true
# MODULES__INVOICE_OCR__DRY_RUN=true
# MODULES__INVOICE_OCR__API_KEY=secret123
# MODULES__MY_FEATURE__READ_ONLY=false
# MODULES__MY_FEATURE__DRY_RUN=false
# MODULES__MY_FEATURE__SOME_SETTING=value
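A loader for the `MODULES__{MODULE_NAME}__{KEY}` pattern might look like the following sketch (hypothetical helper; the module system's actual parser may differ):

```python
import os

def load_module_config(environ=None):
    """Group MODULES__{MODULE_NAME}__{KEY} variables into a per-module dict."""
    environ = os.environ if environ is None else environ
    config = {}
    for name, value in environ.items():
        if not name.startswith("MODULES__"):
            continue
        _, module, key = name.split("__", 2)
        config.setdefault(module.lower(), {})[key.lower()] = value
    return config

example = {
    "MODULES__INVOICE_OCR__READ_ONLY": "true",
    "MODULES__INVOICE_OCR__DRY_RUN": "true",
}
print(load_module_config(example))
# {'invoice_ocr': {'read_only': 'true', 'dry_run': 'true'}}
```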

View File

@ -1,33 +1,24 @@
# =====================================================
# PRODUCTION ENVIRONMENT - BMC Hub
# =====================================================
# Copy this file to .env on the production server
# AND CHANGE ALL PASSWORDS AND SECRETS!
# =====================================================
# RELEASE VERSION
# =====================================================
# Tag from Gitea (e.g. v1.0.0, v1.2.3)
RELEASE_VERSION=v1.0.0
# =====================================================
# GITEA AUTHENTICATION
# =====================================================
# Personal Access Token from https://g.bmcnetworks.dk/user/settings/applications
GITHUB_TOKEN=your_gitea_token_here
GITHUB_REPO=ct/bmc_hub
# =====================================================
# POSTGRESQL DATABASE - Production
# =====================================================
DATABASE_URL=postgresql://bmc_hub_prod:CHANGE_THIS_PASSWORD@postgres:5432/bmc_hub_prod
# Database credentials (used by docker-compose/podman-compose)
POSTGRES_USER=bmc_hub_prod
POSTGRES_PASSWORD=CHANGE_THIS_PASSWORD_TO_STRONG_PASSWORD
POSTGRES_DB=bmc_hub_prod
POSTGRES_PORT=5432
# =====================================================
# POSTGRESQL DATABASE - Production
# =====================================================
DATABASE_URL=postgresql://bmc_hub:CHANGEME_STRONG_PASSWORD@postgres:5432/bmc_hub
# Database credentials (used by docker-compose)
POSTGRES_USER=bmc_hub
POSTGRES_PASSWORD=CHANGEME_STRONG_PASSWORD
POSTGRES_DB=bmc_hub
POSTGRES_PORT=5432
# =====================================================
# GITHUB DEPLOYMENT - Production Version Control
# =====================================================
# Git tag or branch to deploy (e.g. "v1.0.0", "v1.2.3")
# IMPORTANT: ALWAYS use tags for production (not "latest" or "main")
RELEASE_VERSION=v1.0.0
# GitHub repository (format: owner/repo, or a path on Gitea)
GITHUB_REPO=ct/bmc_hub
# GitHub/Gitea Personal Access Token (must have read access to the repo)
# Create the token at: https://g.bmcnetworks.dk/user/settings/applications
GITHUB_TOKEN=your_gitea_token_here

View File

@ -1,418 +0,0 @@
# 🚀 BMC Hub - Production Deployment Checklist
## ✅ Pre-Deployment (On Your Mac)
### 1. Test Locally
- [ ] All changes committed to Git
- [ ] Local tests run and passing
- [ ] `docker-compose up` works locally
- [ ] Health endpoint returns OK: `curl http://localhost:8001/health`
### 2. Create a Gitea Release
```bash
cd /Users/christianthomas/DEV/bmc_hub_dev
# List current tags
git tag -l
# Commit all changes
git add .
git status
git commit -m "Release v1.0.0: Initial production release"
# Push to Gitea
git push origin main
# Tag the release (semantic versioning: major.minor.patch)
git tag v1.0.0
git push origin v1.0.0
# Verify on Gitea
open https://g.bmcnetworks.dk/ct/bmc_hub/releases
```
### 3. Verify the Release on Gitea
- [ ] Tag visible at https://g.bmcnetworks.dk/ct/bmc_hub/tags
- [ ] Archive can be downloaded: https://g.bmcnetworks.dk/ct/bmc_hub/archive/v1.0.0.tar.gz
- [ ] Raw files available via the API
## 🔧 Production Server Setup (First Time)
### 1. Prepare the Server
```bash
# SSH to the server
ssh user@your-server.com
# Install Podman (if not already installed)
sudo apt update
sudo apt install -y podman podman-compose
# Or on RHEL/CentOS
sudo dnf install -y podman podman-compose
# Verify the installation
podman --version
podman-compose --version
```
### 2. Create a Gitea Personal Access Token
- [ ] Go to https://g.bmcnetworks.dk/user/settings/applications
- [ ] Click "Generate New Token"
- [ ] Token name: `BMC Hub Production`
- [ ] Scopes: ✅ `repo` (read)
- [ ] Store the token securely (it is shown only once!)
### 3. Download Deployment Files
```bash
# Create the deployment directory
sudo mkdir -p /opt/bmc_hub
sudo chown $USER:$USER /opt/bmc_hub
cd /opt/bmc_hub
# Download deployment script
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/scripts/deploy_production.sh?ref=v1.0.0 \
-o setup.sh
chmod +x setup.sh
# Download .env template
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/.env.prod.example?ref=v1.0.0 \
-o .env.example
```
### 4. Configure the Environment
```bash
# Copy the template
cp .env.example .env
# Edit .env
nano .env
```
**CRITICAL CHANGES:**
```bash
# 1. Version
RELEASE_VERSION=v1.0.0
# 2. Gitea token
GITHUB_TOKEN=glpat-xxxxxxxxxxxxxxxxxxxx # Your token from step 2
# 3. Database passwords (generate strong passwords)
# NOTE: $(...) substitution only works in a shell - .env files are not
# shell-expanded, so paste the literal generated values into .env
POSTGRES_PASSWORD=$(openssl rand -base64 32)
DATABASE_URL=postgresql://bmc_hub_prod:${POSTGRES_PASSWORD}@postgres:5432/bmc_hub_prod
# 4. Secret key (generate a random one)
SECRET_KEY=$(python3 -c "import secrets; print(secrets.token_urlsafe(32))")
# 5. CORS origins (production domain)
CORS_ORIGINS=https://hub.bmcnetworks.dk
# 6. e-conomic credentials (if relevant)
ECONOMIC_APP_SECRET_TOKEN=xxxxx
ECONOMIC_AGREEMENT_GRANT_TOKEN=xxxxx
# 7. vTiger credentials (if relevant)
VTIGER_API_KEY=xxxxx
# 8. KEEP THE SAFETY SWITCHES ON!
ECONOMIC_READ_ONLY=true
ECONOMIC_DRY_RUN=true
TIMETRACKING_VTIGER_READ_ONLY=true
TIMETRACKING_ECONOMIC_READ_ONLY=true
```
### 5. Run the Deployment
```bash
# Download all files from Gitea
./setup.sh
# Verify the downloaded files
ls -la
# You should see: docker-compose.yml, Dockerfile, requirements.txt, migrations/
# Build and start
podman-compose up -d --build
# Follow the logs
podman-compose logs -f
```
### 6. Verify the Deployment
```bash
# Check container status
podman ps
# Expected output:
# CONTAINER ID IMAGE STATUS PORTS
# xxxxxxxxxxxx bmc-hub:v1.0.0 Up 2 minutes 0.0.0.0:8000->8000/tcp
# xxxxxxxxxxxx postgres:16-alpine Up 2 minutes 0.0.0.0:5432->5432/tcp
# Test health endpoint
curl http://localhost:8000/health
# Expected:
# {"status":"healthy","database":"connected","version":"v1.0.0"}
# Test API
curl http://localhost:8000/api/v1/system/health
# Check database
podman exec -it bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod
# List tables
\dt
# Check sample data
SELECT * FROM customers LIMIT 5;
\q
```
### 7. Set Up a Reverse Proxy (SSL/HTTPS)
**With Nginx:**
```nginx
server {
listen 443 ssl http2;
server_name hub.bmcnetworks.dk;
ssl_certificate /etc/ssl/certs/bmcnetworks.crt;
ssl_certificate_key /etc/ssl/private/bmcnetworks.key;
location / {
proxy_pass http://127.0.0.1:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
```
**With Traefik** (labels in docker-compose.yml):
```yaml
labels:
- "traefik.enable=true"
- "traefik.http.routers.bmc-hub.rule=Host(`hub.bmcnetworks.dk`)"
- "traefik.http.routers.bmc-hub.entrypoints=websecure"
- "traefik.http.routers.bmc-hub.tls.certresolver=letsencrypt"
```
### 8. Set Up Backups
```bash
# Create the backup script
sudo nano /opt/bmc_hub/backup.sh
```
```bash
#!/bin/bash
BACKUP_DIR="/opt/backups/bmc_hub"
DATE=$(date +%Y%m%d_%H%M%S)
mkdir -p $BACKUP_DIR
# Backup database
podman exec bmc-hub-postgres-prod pg_dump -U bmc_hub_prod bmc_hub_prod | \
gzip > $BACKUP_DIR/database_$DATE.sql.gz
# Back up uploads and data
tar -czf $BACKUP_DIR/files_$DATE.tar.gz /opt/bmc_hub/uploads /opt/bmc_hub/data
# Keep only 30 days of backups
find $BACKUP_DIR -type f -mtime +30 -delete
echo "Backup completed: $DATE"
```
```bash
chmod +x /opt/bmc_hub/backup.sh
# Add to crontab (daily backup at 02:00)
crontab -e
# Add: 0 2 * * * /opt/bmc_hub/backup.sh >> /opt/bmc_hub/logs/backup.log 2>&1
```
### 9. Set Up Monitoring (Optional)
**Uptime Kuma:**
- Add a monitor for: `https://hub.bmcnetworks.dk/health`
- Interval: 60 seconds
- Expected keyword: `"healthy"`
**Prometheus/Grafana:**
- See `docs/MONITORING.md` (if it exists)
## 🔄 Updating to a New Version
### On Your Mac:
```bash
cd /Users/christianthomas/DEV/bmc_hub_dev
# Make changes...
git add .
git commit -m "Feature: Add new functionality"
git push origin main
# Tag the new version
git tag v1.1.0
git push origin v1.1.0
```
### On the Production Server:
```bash
cd /opt/bmc_hub
# Back up first!
./backup.sh
# Update RELEASE_VERSION in .env
nano .env
# Change: RELEASE_VERSION=v1.1.0
# Download the new files
./setup.sh
# Rebuild
podman-compose down
podman-compose up -d --build
# Verify
podman-compose logs -f api
curl http://localhost:8000/health
```
## 🆘 Troubleshooting
### Container Won't Start
```bash
# Check the logs in detail
podman logs bmc-hub-api-prod --tail 100
# Check the build logs
podman-compose build --no-cache
# Verify .env
cat .env | grep -v "PASSWORD\|TOKEN\|SECRET"
```
### Database Connection Errors
```bash
# Test database connection
podman exec -it bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod
# Check database logs
podman logs bmc-hub-postgres-prod
# Restart database
podman-compose restart postgres
```
### Gitea Download Errors
```bash
# Test the token manually
curl -H "Authorization: token YOUR_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/tags
# Verify the release exists
curl https://g.bmcnetworks.dk/ct/bmc_hub/releases
# Check network
ping g.bmcnetworks.dk
```
### Port Already in Use
```bash
# Find what is using the port
sudo lsof -i :8000
# Change the port in .env
nano .env
# API_PORT=8001
# Rebuild
podman-compose down
podman-compose up -d
```
## 🔙 Rollback Procedure
```bash
cd /opt/bmc_hub
# Stop services
podman-compose down
# Restore the database backup
gunzip < /opt/backups/bmc_hub/database_YYYYMMDD_HHMMSS.sql.gz | \
podman exec -i bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod
# Change back to the old version in .env
nano .env
# RELEASE_VERSION=v1.0.0
# Rebuild
podman-compose up -d --build
# Verify
curl http://localhost:8000/health
```
## 📊 Post-Deployment Checks
- [ ] Health endpoint OK: `curl https://hub.bmcnetworks.dk/health`
- [ ] API responding: `curl https://hub.bmcnetworks.dk/api/v1/system/health`
- [ ] Database accessible and data intact
- [ ] Logs look normal (no ERROR/CRITICAL)
- [ ] SSL certificate valid
- [ ] Backups run automatically
- [ ] Monitoring alerts configured
- [ ] Safety switches enabled (READ_ONLY=true)
- [ ] DNS pointing to the new server (if relevant)
- [ ] Firewall rules configured
## 🎯 Security Checklist
- [ ] All passwords changed from the defaults
- [ ] SECRET_KEY is random and unique
- [ ] CORS_ORIGINS set to the production domain
- [ ] SSL/HTTPS enabled
- [ ] Firewall only opens 80/443 (not 8000 directly)
- [ ] Database port NOT exposed externally (internal network only)
- [ ] .env file has correct permissions (600)
- [ ] Gitea token has minimal scope (read only)
- [ ] Safety switches enabled in .env
- [ ] Backups encrypted (if data is sensitive)
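The password and SECRET_KEY items above can be generated like this (sketch; paste the printed values into `.env`, then lock the file down):

```shell
# Generate a strong database password and a random SECRET_KEY
POSTGRES_PASSWORD="$(openssl rand -base64 32)"
SECRET_KEY="$(python3 -c 'import secrets; print(secrets.token_urlsafe(32))')"
echo "POSTGRES_PASSWORD=$POSTGRES_PASSWORD"
echo "SECRET_KEY=$SECRET_KEY"
# Restrict .env to the owner only (checklist item: permissions 600):
# chmod 600 /opt/bmc_hub/.env
```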
## 📝 Documentation
- [ ] [PRODUCTION_DEPLOYMENT.md](PRODUCTION_DEPLOYMENT.md) - Detailed guide
- [ ] [PRODUCTION_QUICK_START.md](PRODUCTION_QUICK_START.md) - Quick reference
- [ ] [README.md](README.md) - Project overview
## 📞 Support
In case of problems:
- Email: ct@bmcnetworks.dk
- Gitea Issues: https://g.bmcnetworks.dk/ct/bmc_hub/issues
---
**Version:** 1.0
**Last updated:** 2025-12-17

View File

@ -8,6 +8,9 @@ RUN apt-get update && apt-get install -y \
git \
libpq-dev \
gcc \
tesseract-ocr \
tesseract-ocr-dan \
tesseract-ocr-eng \
&& rm -rf /var/lib/apt/lists/*
# Build arguments for GitHub release deployment

View File

@ -1,317 +0,0 @@
# BMC Hub - Production Deployment Guide
## 📋 Prerequisites
- Linux server with **Podman** or **Docker** installed
- Access to the Gitea repository: `https://g.bmcnetworks.dk/ct/bmc_hub`
- Gitea Personal Access Token with read access
## 🚀 Step-by-Step Deployment
### 1. Create a Gitea Personal Access Token
1. Go to https://g.bmcnetworks.dk/user/settings/applications
2. Click "Generate New Token"
3. Name the token: "BMC Hub Production"
4. Select scopes: `repo` (read)
5. Store the token securely - it is shown only once
### 2. Tag a Release in Gitea
From your local development Mac:
```bash
cd /Users/christianthomas/DEV/bmc_hub_dev
# Commit all changes
git add .
git commit -m "Prepare for production v1.0.0"
# Tag the release
git tag v1.0.0
git push origin main
git push origin v1.0.0
```
### 3. Prepare the Production Server
```bash
# SSH to the production server
ssh user@your-production-server.com
# Create the deployment directory
mkdir -p /opt/bmc_hub
cd /opt/bmc_hub
# Download only the necessary files from Gitea
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/docker-compose.prod.yml?ref=v1.0.0 \
-o docker-compose.yml
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/Dockerfile?ref=v1.0.0 \
-o Dockerfile
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/.env.prod.example?ref=v1.0.0 \
-o .env.example
# Create the migrations directory
mkdir -p migrations
# Download all migrations (init.sql and others)
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/migrations/init.sql?ref=v1.0.0 \
-o migrations/init.sql
# Download requirements.txt (for the Dockerfile)
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/requirements.txt?ref=v1.0.0 \
-o requirements.txt
```
### 4. Configure Environment Variables
```bash
# Copy the template
cp .env.example .env
# Edit .env with production values
nano .env
```
**IMPORTANT CHANGES IN .env:**
```bash
# Release version (matches the git tag)
RELEASE_VERSION=v1.0.0
# Gitea token
GITHUB_TOKEN=your_gitea_personal_access_token
# CHANGE ALL PASSWORDS!
POSTGRES_PASSWORD=a_very_strong_password_here
SECRET_KEY=$(python3 -c "import secrets; print(secrets.token_urlsafe(32))")
# Production settings
LOG_LEVEL=WARNING
CORS_ORIGINS=https://yourdomain.com
# KEEP THE SAFETY SWITCHES ON!
ECONOMIC_READ_ONLY=true
ECONOMIC_DRY_RUN=true
```
### 5. Create the Required Directories
```bash
cd /opt/bmc_hub
# Create data directories
mkdir -p logs uploads data/invoice_templates
# Set permissions (if needed)
chmod 755 logs uploads data
```
### 6. Start the Services with Podman Compose
```bash
# Pull and build the images
podman-compose -f docker-compose.yml build --no-cache
# Start the services
podman-compose -f docker-compose.yml up -d
# Follow the logs
podman-compose -f docker-compose.yml logs -f
```
### 7. Verify the Deployment
```bash
# Check container status
podman ps
# Test health endpoint
curl http://localhost:8000/health
# Check database
podman exec -it bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod -c "\dt"
```
Expected output:
```json
{
"status": "healthy",
"database": "connected",
"version": "v1.0.0"
}
```
## 🔄 Updating to a New Version
```bash
# On your Mac - tag a new release
cd /Users/christianthomas/DEV/bmc_hub_dev
git tag v1.1.0
git push origin v1.1.0
# On the production server
cd /opt/bmc_hub
# Update RELEASE_VERSION in .env
nano .env  # Change to v1.1.0
# Rebuild and restart
podman-compose down
podman-compose build --no-cache
podman-compose up -d
# Verify
podman-compose logs -f api
```
## 🗄️ Database Migrations
When new migrations are added:
```bash
# Download the new migration files to /opt/bmc_hub/migrations/
curl -H "Authorization: token YOUR_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/migrations/XXX_new_migration.sql?ref=v1.1.0 \
-o migrations/XXX_new_migration.sql
# Run the migration manually
podman exec -i bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod < migrations/XXX_new_migration.sql
```
**IMPORTANT:** init.sql runs ONLY at the first database initialization. New migrations must be applied manually.
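Downloaded migrations can be applied in filename order with a small helper (sketch only; assumes numeric prefixes such as 001_, 002_ so that lexical order matches migration order):

```shell
apply_migrations() {
  # Run every .sql file in migrations/ through the given psql command,
  # in lexical (filename) order.
  local psql_cmd="$1"
  for f in migrations/*.sql; do
    echo "Applying $f"
    $psql_cmd < "$f"
  done
}
# On the server, pass the real psql invocation:
# apply_migrations "podman exec -i bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod"
```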
## 🔒 Security Best Practices
1. **Passwords**: Use random passwords of at least 32 characters
2. **Secret key**: Generate with `secrets.token_urlsafe(32)`
3. **Gitea token**: Restrict to a read-only scope for production
4. **Firewall**: Open only the necessary ports (8000, and 5432 only if remote access is needed)
5. **HTTPS**: Use a reverse proxy (nginx/traefik) with SSL in production
6. **Backups**: Set up automatic backups of the PostgreSQL data volume
## 📊 Monitoring
```bash
# Container status
podman-compose ps
# Resource usage
podman stats
# Application logs
podman-compose logs -f api
# Database logs
podman-compose logs -f postgres
# Health check
curl http://localhost:8000/health
curl http://localhost:8000/api/v1/system/health
```
## 🆘 Troubleshooting
### Database Connection Issues
```bash
# Check postgres logs
podman-compose logs postgres
# Test connection
podman exec -it bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod
# Check network
podman network inspect bmc-hub-network_bmc-hub-network
```
### API Won't Start
```bash
# Check detailed logs
podman logs bmc-hub-api-prod
# Verify environment variables
podman exec bmc-hub-api-prod env | grep -E "DATABASE|RELEASE"
# Check if database is ready
podman exec bmc-hub-postgres-prod pg_isready
```
### Gitea Download Fails
```bash
# Test token manually
curl -H "Authorization: token YOUR_TOKEN" \
https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/tags
# Verify release exists
curl https://g.bmcnetworks.dk/ct/bmc_hub/releases
```
## 🔄 Rollback Procedure
```bash
# Change RELEASE_VERSION back to the previous version
nano .env  # v1.1.0 → v1.0.0
# Rebuild with the old version
podman-compose down
podman-compose build --no-cache
podman-compose up -d
```
## 📦 Backup & Restore
### Backup Database
```bash
# Automated backup script
podman exec bmc-hub-postgres-prod pg_dump -U bmc_hub_prod bmc_hub_prod > backup_$(date +%Y%m%d).sql
# Or with podman-compose
podman-compose exec postgres pg_dump -U bmc_hub_prod bmc_hub_prod > backup.sql
```
### Restore Database
```bash
# Stop the API first
podman-compose stop api
# Restore
podman exec -i bmc-hub-postgres-prod psql -U bmc_hub_prod -d bmc_hub_prod < backup.sql
# Start API
podman-compose start api
```
## 🎯 Production Checklist
- [ ] Gitea token created with read-only scope
- [ ] Release tagged in Git (e.g. v1.0.0)
- [ ] `.env` file filled in with production values
- [ ] All passwords changed from the defaults
- [ ] SECRET_KEY generated randomly
- [ ] CORS_ORIGINS set to the production domain
- [ ] Safety switches enabled (READ_ONLY=true, DRY_RUN=true)
- [ ] Directories created (logs, uploads, data)
- [ ] Migrations downloaded to the `migrations/` directory
- [ ] Database initialized with init.sql
- [ ] Containers started and healthy
- [ ] Health endpoints verified
- [ ] Backup strategy implemented
- [ ] Firewall configured
- [ ] SSL certificate installed (reverse proxy)
- [ ] Monitoring set up (optional: Uptime Kuma)
## 📞 Support
In case of problems, contact: ct@bmcnetworks.dk

View File

@ -1,111 +0,0 @@
# BMC Hub - Quick Production Setup
## 🚀 Quick Start (TL;DR)
```bash
# 1. On your Mac - tag a release
cd /Users/christianthomas/DEV/bmc_hub_dev
git tag v1.0.0 && git push origin v1.0.0
# 2. On the production server
mkdir /opt/bmc_hub && cd /opt/bmc_hub
# 3. Download the setup script (the repo lives on Gitea, so use the Gitea raw API)
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
  https://g.bmcnetworks.dk/api/v1/repos/ct/bmc_hub/raw/scripts/deploy_production.sh?ref=v1.0.0 \
  -o setup.sh
chmod +x setup.sh
# 4. Create .env (use the template below)
nano .env
# 5. Run the deployment
./setup.sh
# 6. Start the services
podman-compose up -d --build
# 7. Verify
curl http://localhost:8000/health
```
## 📋 Minimal .env Template
```bash
# Version
RELEASE_VERSION=v1.0.0
# Gitea (create a token at https://g.bmcnetworks.dk/user/settings/applications)
GITHUB_TOKEN=glpat-xxxxxxxxxxxxx
GITHUB_REPO=ct/bmc_hub
# Database (CHANGE THE PASSWORD!)
POSTGRES_USER=bmc_hub_prod
POSTGRES_PASSWORD=your_strong_password_here
POSTGRES_DB=bmc_hub_prod
DATABASE_URL=postgresql://bmc_hub_prod:your_strong_password_here@postgres:5432/bmc_hub_prod
# Security (generate with: python3 -c "import secrets; print(secrets.token_urlsafe(32))")
SECRET_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# API
API_PORT=8000
CORS_ORIGINS=https://yourdomain.com
# Safety (KEEP these true!)
ECONOMIC_READ_ONLY=true
ECONOMIC_DRY_RUN=true
```
## 🔄 Updating to a New Version
```bash
cd /opt/bmc_hub
# Change the version in .env
nano .env  # RELEASE_VERSION=v1.1.0
# Rebuild
podman-compose down
podman-compose up -d --build
# Check
podman-compose logs -f api
```
## 📖 Full Documentation
See [PRODUCTION_DEPLOYMENT.md](PRODUCTION_DEPLOYMENT.md) for the detailed guide.
## 🆘 Problems?
```bash
# Check the logs
podman-compose logs -f
# Check the containers
podman ps -a
# Restart
podman-compose restart
# Start from scratch
podman-compose down -v
podman-compose up -d --build
```
## 📁 Structure on the Production Server
```
/opt/bmc_hub/
├── .env                 # Your configuration (create it yourself)
├── docker-compose.yml   # Downloaded from Gitea
├── Dockerfile           # Downloaded from Gitea
├── requirements.txt     # Downloaded from Gitea
├── migrations/          # SQL migrations
│   └── init.sql
├── logs/                # Application logs
├── uploads/             # Uploaded files
└── data/                # Application data
    └── invoice_templates/
```
**IMPORTANT:** Only these files are downloaded to the server. All application code is fetched automatically by Docker/Podman from the Gitea release when the container is built.

View File

@ -7,6 +7,12 @@ A central management system for BMC Networks - handles customers, services, ha
## 🌟 Features
- **Customer Management**: Complete customer database with CRM integration
- **Time Tracking Module**: vTiger integration with time registration and invoicing
  - Automatic sync from vTiger (billable timelogs)
  - Step-by-step approval wizard
  - Auto-rounding to 0.5 hours
  - Prepaid hours ("klippekort") functionality
  - e-conomic export (draft orders)
- **Hardware Tracking**: Registration and tracking of customer equipment
- **Service Management**: Management of services and subscriptions
- **Billing Integration**: Automatic invoicing via e-conomic
@ -123,12 +129,43 @@ bmc_hub/
## 🔌 API Endpoints
### Main API
- `GET /api/v1/customers` - List customers
- `GET /api/v1/hardware` - List hardware
- `GET /api/v1/billing/invoices` - List invoices
- `GET /health` - Health check
See full documentation: http://localhost:8000/api/docs
### Time Tracking Module
- `POST /api/v1/timetracking/sync` - Sync from vTiger (read-only)
- `GET /api/v1/timetracking/wizard/next` - Get next pending timelog
- `POST /api/v1/timetracking/wizard/approve/{id}` - Approve timelog
- `POST /api/v1/timetracking/orders/generate` - Generate invoice order
- `POST /api/v1/timetracking/export` - Export to e-conomic (with safety flags)
- `GET /api/v1/timetracking/export/test-connection` - Test e-conomic connection
See full documentation: http://localhost:8001/api/docs
## 🚨 e-conomic Write Mode
The Time Tracking module can export orders to e-conomic with a **safety-first approach**:
### Safety Flags (default: SAFE)
```bash
TIMETRACKING_ECONOMIC_READ_ONLY=true # Block all writes
TIMETRACKING_ECONOMIC_DRY_RUN=true # Simulate writes (log only)
```
### Enable Write Mode
See the detailed guide: [docs/ECONOMIC_WRITE_MODE.md](docs/ECONOMIC_WRITE_MODE.md)
**Quick steps:**
1. Test connection: `GET /api/v1/timetracking/export/test-connection`
2. Test dry-run: Set `READ_ONLY=false`, keep `DRY_RUN=true`
3. Export test order: `POST /api/v1/timetracking/export`
4. Enable production: Set **both** flags to `false`
5. Verify first order in e-conomic before bulk operations
**CRITICAL**: All customers must have `economic_customer_number` (synced from vTiger `cf_854` field).
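The way the two safety flags combine can be sketched as follows (illustrative only, not the module's actual export code; names are hypothetical):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("timetracking.export")

class WriteBlockedError(Exception):
    """Raised when a write is attempted while READ_ONLY is active."""

def export_order(order, read_only=True, dry_run=True):
    """Check TIMETRACKING_ECONOMIC_READ_ONLY / _DRY_RUN before writing."""
    if read_only:
        # READ_ONLY blocks everything, regardless of DRY_RUN
        raise WriteBlockedError("READ_ONLY=true: refusing to write to e-conomic")
    if dry_run:
        # DRY_RUN logs what would happen but performs no write
        logger.info("DRY_RUN: would export order %s", order.get("id"))
        return "dry-run"
    # Both flags false: the real POST to e-conomic would happen here
    return "exported"
```

Note how a production write requires flipping **both** flags, matching step 4 above.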
## 🧪 Testing

View File

@ -1,10 +0,0 @@
"""
Backup System Module for BMC Hub
Provides automated backup and restore functionality for:
- PostgreSQL database (compressed .dump + plain .sql)
- File storage (uploads/, data/, logs/)
- Automated rotation (30 days + monthly for 12 months)
- Offsite upload via SFTP/SSH
- Mattermost notifications
"""

View File

@ -1 +0,0 @@
"""Backup backend services, API routes, and scheduler."""

View File

@ -1,394 +0,0 @@
"""
Notification Service for Backup System
Sends rich formatted notifications to Mattermost webhook
"""
import logging
import aiohttp
from datetime import datetime
from typing import Dict, Optional, List
from app.core.config import settings
from app.core.database import execute_insert
logger = logging.getLogger(__name__)
class MattermostNotification:
"""Service for sending Mattermost webhook notifications"""
def __init__(self):
self.webhook_url = settings.MATTERMOST_WEBHOOK_URL
self.enabled = settings.MATTERMOST_ENABLED
self.channel = settings.MATTERMOST_CHANNEL
async def send_backup_success(self, job_id: int, job_type: str, file_size_bytes: int,
duration_seconds: float, checksum: str, is_monthly: bool = False):
"""
Send success notification for completed backup
Args:
job_id: Backup job ID
job_type: database, files, or full
file_size_bytes: Size of backup file
duration_seconds: Time taken to complete backup
checksum: SHA256 checksum of backup
is_monthly: Whether this is a monthly backup
"""
if not self._should_send_notification('backup_success'):
return
file_size_mb = file_size_bytes / 1024 / 1024
payload = {
"channel": self.channel,
"username": "BMC Hub Backup",
"icon_emoji": ":white_check_mark:",
"text": f"✅ **Backup Complete** - {job_type.capitalize()}",
"attachments": [{
"color": "#36a64f", # Green
"fields": [
{
"short": True,
"title": "Job ID",
"value": f"#{job_id}"
},
{
"short": True,
"title": "Type",
"value": f"{job_type.capitalize()} {'(Monthly)' if is_monthly else '(Daily)'}"
},
{
"short": True,
"title": "Size",
"value": f"{file_size_mb:.2f} MB"
},
{
"short": True,
"title": "Duration",
"value": f"{duration_seconds:.1f}s"
},
{
"short": False,
"title": "Checksum (SHA256)",
"value": f"`{checksum[:16]}...{checksum[-16:]}`"
}
],
"footer": "BMC Hub Backup System",
"footer_icon": "https://platform.slack-edge.com/img/default_application_icon.png",
"ts": int(datetime.now().timestamp())
}]
}
await self._send_webhook(payload, 'backup_success', job_id)
async def send_backup_failed(self, job_id: int, job_type: str, error_message: str):
"""
Send failure notification for failed backup
Args:
job_id: Backup job ID
job_type: database, files, or full
error_message: Error message from failed backup
"""
if not self._should_send_notification('backup_failed'):
return
payload = {
"channel": self.channel,
"username": "BMC Hub Backup",
"icon_emoji": ":x:",
"text": f"❌ **Backup Failed** - {job_type.capitalize()}",
"attachments": [{
"color": "#ff0000", # Red
"fields": [
{
"short": True,
"title": "Job ID",
"value": f"#{job_id}"
},
{
"short": True,
"title": "Type",
"value": job_type.capitalize()
},
{
"short": False,
"title": "Error",
"value": f"```{error_message[:500]}```"
}
],
"actions": [
{
"name": "view_dashboard",
"type": "button",
"text": "View Dashboard",
"url": f"{self._get_hub_url()}/backups"
}
],
"footer": "BMC Hub Backup System",
"ts": int(datetime.now().timestamp())
}]
}
await self._send_webhook(payload, 'backup_failed', job_id)
async def send_offsite_success(self, job_id: int, backup_name: str, file_size_bytes: int):
"""
Send notification for successful offsite upload
Args:
job_id: Backup job ID
backup_name: Name of uploaded backup file
file_size_bytes: Size of uploaded file
"""
if not settings.NOTIFY_ON_SUCCESS_OFFSITE:
return
file_size_mb = file_size_bytes / 1024 / 1024
payload = {
"channel": self.channel,
"username": "BMC Hub Backup",
"icon_emoji": ":cloud:",
"text": f"☁️ **Offsite Upload Complete**",
"attachments": [{
"color": "#0066cc", # Blue
"fields": [
{
"short": True,
"title": "Job ID",
"value": f"#{job_id}"
},
{
"short": True,
"title": "Destination",
"value": f"{settings.SFTP_HOST}:{settings.SFTP_REMOTE_PATH}"
},
{
"short": True,
"title": "Filename",
"value": backup_name
},
{
"short": True,
"title": "Size",
"value": f"{file_size_mb:.2f} MB"
}
],
"footer": "BMC Hub Backup System",
"ts": int(datetime.now().timestamp())
}]
}
await self._send_webhook(payload, 'offsite_success', job_id)
async def send_offsite_failed(self, job_id: int, backup_name: str, error_message: str,
retry_count: int):
"""
Send notification for failed offsite upload
Args:
job_id: Backup job ID
backup_name: Name of backup file
error_message: Error message
retry_count: Current retry attempt number
"""
if not self._should_send_notification('offsite_failed'):
return
max_retries = settings.OFFSITE_RETRY_MAX_ATTEMPTS
payload = {
"channel": self.channel,
"username": "BMC Hub Backup",
"icon_emoji": ":warning:",
"text": f"⚠️ **Offsite Upload Failed** (Retry {retry_count}/{max_retries})",
"attachments": [{
"color": "#ff9900", # Orange
"fields": [
{
"short": True,
"title": "Job ID",
"value": f"#{job_id}"
},
{
"short": True,
"title": "Destination",
"value": f"{settings.SFTP_HOST}:{settings.SFTP_REMOTE_PATH}"
},
{
"short": True,
"title": "Filename",
"value": backup_name
},
{
"short": True,
"title": "Retry Status",
"value": f"{retry_count}/{max_retries}"
},
{
"short": False,
"title": "Error",
"value": f"```{error_message[:300]}```"
}
],
"footer": "BMC Hub Backup System",
"ts": int(datetime.now().timestamp())
}]
}
await self._send_webhook(payload, 'offsite_failed', job_id)
async def send_storage_warning(self, usage_pct: float, used_gb: float, max_gb: int):
"""
Send notification for high storage usage
Args:
usage_pct: Percentage of storage used
used_gb: Gigabytes used
max_gb: Maximum storage capacity
"""
if not self._should_send_notification('storage_low'):
return
payload = {
"channel": self.channel,
"username": "BMC Hub Backup",
"icon_emoji": ":warning:",
"text": f"⚠️ **Backup Storage Warning**",
"attachments": [{
"color": "#ff9900", # Orange
"fields": [
{
"short": True,
"title": "Usage",
"value": f"{usage_pct:.1f}%"
},
{
"short": True,
"title": "Space Used",
"value": f"{used_gb:.2f} GB / {max_gb} GB"
},
{
"short": False,
"title": "Recommendation",
"value": "Consider running backup rotation or increasing storage capacity."
}
],
"actions": [
{
"name": "view_dashboard",
"type": "button",
"text": "View Dashboard",
"url": f"{self._get_hub_url()}/backups"
}
],
"footer": "BMC Hub Backup System",
"ts": int(datetime.now().timestamp())
}]
}
await self._send_webhook(payload, 'storage_low')
async def send_restore_started(self, job_id: int, backup_name: str, eta_minutes: int):
"""
Send notification when restore operation starts
Args:
job_id: Backup job ID being restored
backup_name: Name of backup file
eta_minutes: Estimated time to completion
"""
payload = {
"channel": self.channel,
"username": "BMC Hub Backup",
"icon_emoji": ":gear:",
"text": f"🔧 **System Maintenance: Restore in Progress**",
"attachments": [{
"color": "#ffcc00", # Yellow
"fields": [
{
"short": True,
"title": "Backup Job ID",
"value": f"#{job_id}"
},
{
"short": True,
"title": "ETA",
"value": f"{eta_minutes} minutes"
},
{
"short": False,
"title": "Backup File",
"value": backup_name
},
{
"short": False,
"title": "Status",
"value": "System is in maintenance mode. All services temporarily unavailable."
}
],
"footer": "BMC Hub Backup System",
"ts": int(datetime.now().timestamp())
}]
}
await self._send_webhook(payload, 'restore_started', job_id)
async def _send_webhook(self, payload: Dict, event_type: str, job_id: Optional[int] = None):
"""
Send webhook to Mattermost and log notification
Args:
payload: Mattermost webhook payload
event_type: Type of notification event
job_id: Optional backup job ID
"""
if not self.enabled or not self.webhook_url:
logger.info("📢 Notification (disabled): %s - job_id=%s", event_type, job_id)
return
try:
async with aiohttp.ClientSession() as session:
async with session.post(self.webhook_url, json=payload, timeout=10) as response:
if response.status == 200:
logger.info("📢 Notification sent: %s - job_id=%s", event_type, job_id)
# Log to database
execute_insert(
"""INSERT INTO backup_notifications
(backup_job_id, event_type, message, mattermost_payload)
VALUES (%s, %s, %s, %s)""",
(job_id, event_type, payload.get('text', ''), str(payload))
)
else:
error_text = await response.text()
logger.error("❌ Notification failed: HTTP %s - %s",
response.status, error_text)
except aiohttp.ClientError as e:
logger.error("❌ Notification connection error: %s", str(e))
except Exception as e:
logger.error("❌ Notification error: %s", str(e))
def _should_send_notification(self, event_type: str) -> bool:
"""Check if notification should be sent based on settings"""
if not self.enabled or not self.webhook_url:
return False
if event_type in ['backup_failed', 'offsite_failed', 'storage_low']:
return settings.NOTIFY_ON_FAILURE
if 'offsite' in event_type and 'success' in event_type:
return settings.NOTIFY_ON_SUCCESS_OFFSITE
return True
def _get_hub_url(self) -> str:
"""Get BMC Hub base URL for action buttons"""
# TODO: Add HUB_BASE_URL to config
return "http://localhost:8000" # Fallback
# Singleton instance
notifications = MattermostNotification()
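The notification methods above all assemble the same Mattermost incoming-webhook attachment shape (`text`, `attachments[].fields`, `footer`, `ts`) and truncate long error strings before embedding them. A minimal, self-contained sketch of that payload-building pattern — the helper name `build_failure_payload` is hypothetical, not part of the module:

```python
from datetime import datetime

def build_failure_payload(job_id: int, job_type: str, error_message: str,
                          channel: str = "backups") -> dict:
    """Assemble a Mattermost-style attachment payload, mirroring send_backup_failed."""
    return {
        "channel": channel,
        "username": "BMC Hub Backup",
        "icon_emoji": ":x:",
        "text": f"❌ **Backup Failed** - {job_type.capitalize()}",
        "attachments": [{
            "color": "#ff0000",  # Red
            "fields": [
                {"short": True, "title": "Job ID", "value": f"#{job_id}"},
                {"short": True, "title": "Type", "value": job_type.capitalize()},
                # Long errors are truncated so the message stays readable
                {"short": False, "title": "Error", "value": f"```{error_message[:500]}```"},
            ],
            "footer": "BMC Hub Backup System",
            "ts": int(datetime.now().timestamp()),
        }],
    }

payload = build_failure_payload(42, "database", "pg_dump: connection refused")
```

The resulting dict is what gets POSTed as JSON to the webhook URL by `_send_webhook`.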


@ -1,505 +0,0 @@
"""
Backup System API Router
REST endpoints for backup management
"""
import logging
from typing import List, Optional
from datetime import datetime, date, timedelta
from pathlib import Path
from fastapi import APIRouter, HTTPException, Query, UploadFile, File
from pydantic import BaseModel, Field
from app.core.database import execute_query, execute_query_single, execute_update, execute_insert
from app.core.config import settings
from app.backups.backend.service import backup_service
from app.backups.backend.notifications import notifications
logger = logging.getLogger(__name__)
router = APIRouter()
# Pydantic Models
class BackupCreate(BaseModel):
"""Request model for creating a new backup"""
job_type: str = Field(..., description="Type of backup: database, files, or full")
is_monthly: bool = Field(False, description="Create monthly backup (uses SQL format)")
class BackupJob(BaseModel):
"""Response model for backup job"""
id: int
job_type: str
status: str
backup_format: str
file_path: Optional[str]
file_size_bytes: Optional[int]
checksum_sha256: Optional[str]
is_monthly: bool
includes_uploads: bool
includes_logs: bool
includes_data: bool
started_at: Optional[datetime]
completed_at: Optional[datetime]
error_message: Optional[str]
retention_until: Optional[date]
offsite_uploaded_at: Optional[datetime]
offsite_retry_count: int
notification_sent: bool
created_at: datetime
updated_at: datetime
class RestoreRequest(BaseModel):
"""Request model for restoring from backup"""
confirmation: bool = Field(..., description="Must be true to confirm restore operation")
message: Optional[str] = Field(None, description="Optional restore reason/notes")
class MaintenanceStatus(BaseModel):
"""Response model for maintenance mode status"""
maintenance_mode: bool
maintenance_message: str
maintenance_eta_minutes: Optional[int]
updated_at: datetime
class NotificationRecord(BaseModel):
"""Response model for notification record"""
id: int
backup_job_id: Optional[int]
event_type: str
message: str
sent_at: datetime
acknowledged: bool
acknowledged_at: Optional[datetime]
class StorageStats(BaseModel):
"""Response model for storage statistics"""
total_size_bytes: int
total_size_gb: float
max_size_gb: int
usage_pct: float
file_count: int
warning: bool
# API Endpoints
@router.post("/backups", response_model=dict, tags=["Backups"])
async def create_backup(backup: BackupCreate):
"""
Create a new backup manually
- **job_type**: database, files, or full
- **is_monthly**: Use plain SQL format for database (monthly backups)
"""
if not settings.BACKUP_ENABLED:
raise HTTPException(status_code=503, detail="Backup system is disabled (BACKUP_ENABLED=false)")
logger.info("📦 Manual backup requested: type=%s, monthly=%s", backup.job_type, backup.is_monthly)
try:
if backup.job_type == 'database':
job_id = await backup_service.create_database_backup(is_monthly=backup.is_monthly)
if job_id:
return {"success": True, "job_id": job_id, "message": "Database backup created successfully"}
else:
raise HTTPException(status_code=500, detail="Database backup failed - check logs")
elif backup.job_type == 'files':
job_id = await backup_service.create_files_backup()
if job_id:
return {"success": True, "job_id": job_id, "message": "Files backup created successfully"}
else:
raise HTTPException(status_code=500, detail="Files backup failed - check logs")
elif backup.job_type == 'full':
db_job_id, files_job_id = await backup_service.create_full_backup(is_monthly=backup.is_monthly)
if db_job_id and files_job_id:
return {
"success": True,
"db_job_id": db_job_id,
"files_job_id": files_job_id,
"message": "Full backup created successfully"
}
else:
raise HTTPException(status_code=500, detail=f"Full backup partially failed: db={db_job_id}, files={files_job_id}")
else:
raise HTTPException(status_code=400, detail="Invalid job_type. Must be: database, files, or full")
except Exception as e:
logger.error("❌ Manual backup error: %s", str(e), exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.get("/backups/jobs", response_model=List[BackupJob], tags=["Backups"])
async def list_backups(
status: Optional[str] = Query(None, description="Filter by status: pending, running, completed, failed"),
job_type: Optional[str] = Query(None, description="Filter by type: database, files, full"),
limit: int = Query(50, ge=1, le=500),
offset: int = Query(0, ge=0)
):
"""
List backup jobs with optional filtering and pagination
"""
# Build query
query = "SELECT * FROM backup_jobs WHERE 1=1"
params = []
if status:
query += " AND status = %s"
params.append(status)
if job_type:
query += " AND job_type = %s"
params.append(job_type)
query += " ORDER BY created_at DESC LIMIT %s OFFSET %s"
params.extend([limit, offset])
backups = execute_query(query, tuple(params))
return backups if backups else []
@router.get("/backups/jobs/{job_id}", response_model=BackupJob, tags=["Backups"])
async def get_backup(job_id: int):
"""Get details of a specific backup job"""
backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s",
(job_id,))
if not backup:
raise HTTPException(status_code=404, detail=f"Backup job {job_id} not found")
return backup
@router.post("/backups/upload", response_model=dict, tags=["Backups"])
async def upload_backup(
file: UploadFile = File(...),
backup_type: str = Query(..., description="Type: database or files"),
is_monthly: bool = Query(False, description="Mark as monthly backup")
):
"""
Upload a previously downloaded backup file
Validates file format and creates backup job record
"""
if settings.BACKUP_READ_ONLY:
raise HTTPException(
status_code=403,
detail="Upload blocked: BACKUP_READ_ONLY=true"
)
logger.info("📤 Backup upload: filename=%s, type=%s, size=%d bytes",
file.filename, backup_type, getattr(file, 'size', 0) or 0)
# Validate file type
allowed_extensions = {
'database': ['.dump', '.sql', '.sql.gz'],
'files': ['.tar.gz', '.tgz']
}
if backup_type not in allowed_extensions:
raise HTTPException(status_code=400, detail="Invalid backup_type. Must be: database or files")
file_ext = ''.join(Path(file.filename).suffixes)
if file_ext not in allowed_extensions[backup_type]:
raise HTTPException(
status_code=400,
detail=f"Invalid file extension '{file_ext}' for type '{backup_type}'. Allowed: {allowed_extensions[backup_type]}"
)
try:
# Determine storage path and format
backup_dir = Path(settings.BACKUP_STORAGE_PATH)
if backup_type == 'database':
target_dir = backup_dir / "database"
backup_format = 'dump' if file_ext == '.dump' else 'sql'
else:
target_dir = backup_dir / "files"
backup_format = 'tar.gz'
target_dir.mkdir(parents=True, exist_ok=True)
# Generate unique filename with timestamp
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
original_name = Path(file.filename).stem
new_filename = f"{original_name}_uploaded_{timestamp}{file_ext}"
target_path = target_dir / new_filename
# Save uploaded file
logger.info("💾 Saving upload to: %s", target_path)
content = await file.read()
with open(target_path, 'wb') as f:
f.write(content)
file_size = target_path.stat().st_size
# Calculate checksum
import hashlib
checksum = hashlib.sha256(content).hexdigest()
logger.info("✅ File saved: %d bytes, checksum=%s", file_size, checksum[:16])
# Calculate retention date
if is_monthly:
retention_until = datetime.now() + timedelta(days=settings.MONTHLY_KEEP_MONTHS * 30)
else:
retention_until = datetime.now() + timedelta(days=settings.RETENTION_DAYS)
# Create backup job record
job_id = execute_insert(
"""INSERT INTO backup_jobs
(job_type, status, backup_format, file_path, file_size_bytes,
checksum_sha256, is_monthly, started_at, completed_at, retention_until)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)""",
(backup_type, 'completed', backup_format, str(target_path), file_size,
checksum, is_monthly, datetime.now(), datetime.now(), retention_until.date())
)
logger.info("✅ Backup upload registered: job_id=%s", job_id)
return {
"success": True,
"job_id": job_id,
"message": f"Backup uploaded successfully: {new_filename}",
"file_size_mb": round(file_size / 1024 / 1024, 2),
"checksum": checksum
}
except Exception as e:
logger.error("❌ Upload failed: %s", str(e), exc_info=True)
# Clean up partial file (target_path is unbound if the failure happened before assignment)
if 'target_path' in locals() and target_path.exists():
target_path.unlink()
raise HTTPException(status_code=500, detail=f"Upload failed: {str(e)}")
@router.post("/backups/restore/{job_id}", response_model=dict, tags=["Backups"])
async def restore_backup(job_id: int, request: RestoreRequest):
"""
Restore from a backup (database or files)
**WARNING**: This will enable maintenance mode and temporarily shut down the system
"""
if not request.confirmation:
raise HTTPException(
status_code=400,
detail="Restore operation requires confirmation=true"
)
if settings.BACKUP_READ_ONLY:
raise HTTPException(
status_code=403,
detail="Restore blocked: BACKUP_READ_ONLY=true. Update configuration to enable restores."
)
# Get backup job
backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s",
(job_id,))
if not backup:
raise HTTPException(status_code=404, detail=f"Backup job {job_id} not found")
if backup['status'] != 'completed':
raise HTTPException(status_code=400, detail=f"Cannot restore from backup with status: {backup['status']}")
logger.warning("🔧 Restore initiated: job_id=%s, type=%s, user_message=%s",
job_id, backup['job_type'], request.message)
try:
# Send notification
await notifications.send_restore_started(
job_id=job_id,
backup_name=backup['file_path'].split('/')[-1],
eta_minutes=5
)
# Perform restore based on type
if backup['job_type'] == 'database':
success = await backup_service.restore_database(job_id)
elif backup['job_type'] == 'files':
success = await backup_service.restore_files(job_id)
elif backup['job_type'] == 'full':
# Restore both database and files
db_success = await backup_service.restore_database(job_id)
files_success = await backup_service.restore_files(job_id)
success = db_success and files_success
else:
raise HTTPException(status_code=400, detail=f"Unknown backup type: {backup['job_type']}")
if success:
logger.info("✅ Restore completed successfully: job_id=%s", job_id)
return {"success": True, "message": "Restore completed successfully"}
else:
logger.error("❌ Restore failed: job_id=%s", job_id)
raise HTTPException(status_code=500, detail="Restore operation failed - check logs")
except Exception as e:
logger.error("❌ Restore error: %s", str(e), exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.delete("/backups/jobs/{job_id}", response_model=dict, tags=["Backups"])
async def delete_backup(job_id: int):
"""
Delete a backup job and its associated file
"""
# Get backup job
backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s",
(job_id,))
if not backup:
raise HTTPException(status_code=404, detail=f"Backup job {job_id} not found")
logger.info("🗑️ Deleting backup: job_id=%s, file=%s", job_id, backup['file_path'])
try:
# Delete file if exists
if backup['file_path']:
file_path = Path(backup['file_path'])
if file_path.exists():
file_path.unlink()
logger.info("✅ Deleted backup file: %s", file_path.name)
# Delete database record
execute_update("DELETE FROM backup_jobs WHERE id = %s", (job_id,))
return {"success": True, "message": f"Backup {job_id} deleted successfully"}
except Exception as e:
logger.error("❌ Delete backup error: %s", str(e), exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.post("/backups/offsite/{job_id}", response_model=dict, tags=["Backups"])
async def upload_offsite(job_id: int):
"""
Manually trigger offsite upload for a specific backup
"""
if not settings.OFFSITE_ENABLED:
raise HTTPException(status_code=503, detail="Offsite uploads are disabled (OFFSITE_ENABLED=false)")
logger.info("☁️ Manual offsite upload requested: job_id=%s", job_id)
try:
success = await backup_service.upload_offsite(job_id)
if success:
return {"success": True, "message": f"Backup {job_id} uploaded to offsite successfully"}
else:
raise HTTPException(status_code=500, detail="Offsite upload failed - check logs")
except Exception as e:
logger.error("❌ Manual offsite upload error: %s", str(e), exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.get("/backups/maintenance", response_model=MaintenanceStatus, tags=["System"])
async def get_maintenance_status():
"""
Get current maintenance mode status
Used by frontend to display maintenance overlay
"""
status = execute_query_single(
"SELECT * FROM system_status WHERE id = 1")
if not status:
# Return default status if not found
return {
"maintenance_mode": False,
"maintenance_message": "",
"maintenance_eta_minutes": None,
"updated_at": datetime.now()
}
return status
@router.get("/backups/notifications", response_model=List[NotificationRecord], tags=["Backups"])
async def list_notifications(
acknowledged: Optional[bool] = Query(None, description="Filter by acknowledged status"),
limit: int = Query(50, ge=1, le=200),
offset: int = Query(0, ge=0)
):
"""
List backup notifications (alerts, warnings, errors)
"""
query = "SELECT * FROM backup_notifications WHERE 1=1"
params = []
if acknowledged is not None:
query += " AND acknowledged = %s"
params.append(acknowledged)
query += " ORDER BY sent_at DESC LIMIT %s OFFSET %s"
params.extend([limit, offset])
notifications_list = execute_query(query, tuple(params))
return notifications_list if notifications_list else []
@router.post("/backups/notifications/{notification_id}/acknowledge", response_model=dict, tags=["Backups"])
async def acknowledge_notification(notification_id: int):
"""
Acknowledge a notification (mark as read)
"""
execute_update(
"""UPDATE backup_notifications
SET acknowledged = true, acknowledged_at = %s
WHERE id = %s""",
(datetime.now(), notification_id)
)
return {"success": True, "message": f"Notification {notification_id} acknowledged"}
@router.get("/backups/storage", response_model=StorageStats, tags=["System"])
async def get_storage_stats():
"""
Get backup storage usage statistics
"""
stats = await backup_service.check_storage_usage()
return stats
@router.get("/backups/scheduler/status", response_model=dict, tags=["System"])
async def get_scheduler_status():
"""
Get backup scheduler status and job information
"""
from app.backups.backend.scheduler import backup_scheduler
if not backup_scheduler.running:
return {
"enabled": settings.BACKUP_ENABLED,
"running": False,
"message": "Backup scheduler is not running"
}
jobs = []
for job in backup_scheduler.scheduler.get_jobs():
jobs.append({
"id": job.id,
"name": job.name,
"next_run": job.next_run_time.isoformat() if job.next_run_time else None,
})
return {
"enabled": settings.BACKUP_ENABLED,
"running": backup_scheduler.running,
"jobs": jobs
}
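The dynamic filtering in `list_backups` and `list_notifications` follows a common SQL build-up pattern: a `WHERE 1=1` base clause plus parameterized conditions appended only when a filter is supplied. Sketched in isolation — the helper name `build_backup_query` is hypothetical:

```python
from typing import Optional, Tuple

def build_backup_query(status: Optional[str] = None, job_type: Optional[str] = None,
                       limit: int = 50, offset: int = 0) -> Tuple[str, tuple]:
    """Build a parameterized query the same way list_backups does."""
    query = "SELECT * FROM backup_jobs WHERE 1=1"
    params = []
    if status:
        query += " AND status = %s"
        params.append(status)
    if job_type:
        query += " AND job_type = %s"
        params.append(job_type)
    # LIMIT/OFFSET are bound as parameters too, never string-interpolated
    query += " ORDER BY created_at DESC LIMIT %s OFFSET %s"
    params.extend([limit, offset])
    return query, tuple(params)

q, p = build_backup_query(status="completed", limit=10)
```

Because every user-supplied value travels through `%s` placeholders, the pattern stays safe against SQL injection while keeping the query string easy to extend.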


@ -1,395 +0,0 @@
"""
Backup Scheduler
Manages scheduled backup jobs, rotation, offsite uploads, and retry logic
"""
import logging
from datetime import datetime, time
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from apscheduler.triggers.cron import CronTrigger
from apscheduler.triggers.interval import IntervalTrigger
from app.core.config import settings
from app.core.database import execute_query, execute_query_single
from app.backups.backend.service import backup_service
from app.backups.backend.notifications import notifications
logger = logging.getLogger(__name__)
class BackupScheduler:
"""Scheduler for automated backup operations"""
def __init__(self):
self.scheduler = AsyncIOScheduler()
self.enabled = settings.BACKUP_ENABLED
self.running = False
def start(self):
"""Start the backup scheduler with all jobs"""
if not self.enabled:
logger.info("⏭️ Backup scheduler disabled (BACKUP_ENABLED=false)")
return
if self.running:
logger.warning("⚠️ Backup scheduler already running")
return
logger.info("🚀 Starting backup scheduler...")
# Daily full backup at 02:00 CET
self.scheduler.add_job(
func=self._daily_backup_job,
trigger=CronTrigger(hour=2, minute=0),
id='daily_backup',
name='Daily Full Backup',
max_instances=1,
replace_existing=True
)
logger.info("✅ Scheduled: Daily backup at 02:00")
# Monthly backup on the 1st at 02:00 CET
self.scheduler.add_job(
func=self._monthly_backup_job,
trigger=CronTrigger(day=1, hour=2, minute=0),
id='monthly_backup',
name='Monthly Full Backup (SQL format)',
max_instances=1,
replace_existing=True
)
logger.info("✅ Scheduled: Monthly backup on 1st at 02:00")
# Weekly offsite upload (configurable day, default Sunday at 03:00)
offsite_day = self._get_weekday_number(settings.OFFSITE_WEEKLY_DAY)
self.scheduler.add_job(
func=self._offsite_upload_job,
trigger=CronTrigger(day_of_week=offsite_day, hour=3, minute=0),
id='offsite_upload',
name=f'Weekly Offsite Upload ({settings.OFFSITE_WEEKLY_DAY.capitalize()})',
max_instances=1,
replace_existing=True
)
logger.info("✅ Scheduled: Weekly offsite upload on %s at 03:00",
settings.OFFSITE_WEEKLY_DAY.capitalize())
# Offsite retry job (every hour)
self.scheduler.add_job(
func=self._offsite_retry_job,
trigger=IntervalTrigger(hours=settings.OFFSITE_RETRY_DELAY_HOURS),
id='offsite_retry',
name='Offsite Upload Retry',
max_instances=1,
replace_existing=True
)
logger.info("✅ Scheduled: Offsite retry every %d hour(s)",
settings.OFFSITE_RETRY_DELAY_HOURS)
# Backup rotation (daily at 01:00)
self.scheduler.add_job(
func=self._rotation_job,
trigger=CronTrigger(hour=1, minute=0),
id='backup_rotation',
name='Backup Rotation',
max_instances=1,
replace_existing=True
)
logger.info("✅ Scheduled: Backup rotation at 01:00")
# Storage check (daily at 01:30)
self.scheduler.add_job(
func=self._storage_check_job,
trigger=CronTrigger(hour=1, minute=30),
id='storage_check',
name='Storage Usage Check',
max_instances=1,
replace_existing=True
)
logger.info("✅ Scheduled: Storage check at 01:30")
# Start the scheduler
self.scheduler.start()
self.running = True
logger.info("✅ Backup scheduler started successfully")
def stop(self):
"""Stop the backup scheduler"""
if not self.running:
return
logger.info("🛑 Stopping backup scheduler...")
self.scheduler.shutdown(wait=True)
self.running = False
logger.info("✅ Backup scheduler stopped")
def pause(self):
"""Pause all scheduled jobs (for maintenance)"""
if not self.running:
return
logger.info("⏸️ Pausing backup scheduler...")
self.scheduler.pause()
logger.info("✅ Backup scheduler paused")
def resume(self):
"""Resume all scheduled jobs"""
if not self.running:
return
logger.info("▶️ Resuming backup scheduler...")
self.scheduler.resume()
logger.info("✅ Backup scheduler resumed")
async def _daily_backup_job(self):
"""Daily full backup job (database + files)"""
logger.info("🔄 Starting daily backup job...")
try:
start_time = datetime.now()
# Create full backup (database uses compressed .dump format)
db_job_id, files_job_id = await backup_service.create_full_backup(is_monthly=False)
end_time = datetime.now()
duration = (end_time - start_time).total_seconds()
if db_job_id and files_job_id:
logger.info("✅ Daily backup completed: db=%s, files=%s (%.1fs)",
db_job_id, files_job_id, duration)
# Send success notification for database backup
db_backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s",
(db_job_id,))
if db_backup:
await notifications.send_backup_success(
job_id=db_job_id,
job_type='database',
file_size_bytes=db_backup['file_size_bytes'],
duration_seconds=duration / 2, # Rough estimate
checksum=db_backup['checksum_sha256'],
is_monthly=False
)
else:
logger.error("❌ Daily backup failed: db=%s, files=%s", db_job_id, files_job_id)
# Send failure notification
if not db_job_id:
await notifications.send_backup_failed(
job_id=0,
job_type='database',
error_message='Database backup failed - see logs for details'
)
if not files_job_id:
await notifications.send_backup_failed(
job_id=0,
job_type='files',
error_message='Files backup failed - see logs for details'
)
except Exception as e:
logger.error("❌ Daily backup job error: %s", str(e), exc_info=True)
await notifications.send_backup_failed(
job_id=0,
job_type='full',
error_message=str(e)
)
async def _monthly_backup_job(self):
"""Monthly full backup job (database uses plain SQL format)"""
logger.info("🔄 Starting monthly backup job...")
try:
start_time = datetime.now()
# Create full backup with is_monthly=True (uses plain SQL format)
db_job_id, files_job_id = await backup_service.create_full_backup(is_monthly=True)
end_time = datetime.now()
duration = (end_time - start_time).total_seconds()
if db_job_id and files_job_id:
logger.info("✅ Monthly backup completed: db=%s, files=%s (%.1fs)",
db_job_id, files_job_id, duration)
# Send success notification for database backup
db_backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s",
(db_job_id,))
if db_backup:
await notifications.send_backup_success(
job_id=db_job_id,
job_type='database',
file_size_bytes=db_backup['file_size_bytes'],
duration_seconds=duration / 2,
checksum=db_backup['checksum_sha256'],
is_monthly=True
)
else:
logger.error("❌ Monthly backup failed: db=%s, files=%s", db_job_id, files_job_id)
await notifications.send_backup_failed(
job_id=0,
job_type='monthly',
error_message='Monthly backup failed - see logs for details'
)
except Exception as e:
logger.error("❌ Monthly backup job error: %s", str(e), exc_info=True)
await notifications.send_backup_failed(
job_id=0,
job_type='monthly',
error_message=str(e)
)
async def _offsite_upload_job(self):
"""Weekly offsite upload job - uploads all backups not yet uploaded"""
if not settings.OFFSITE_ENABLED:
logger.info("⏭️ Offsite upload skipped (OFFSITE_ENABLED=false)")
return
logger.info("☁️ Starting weekly offsite upload job...")
try:
# Find all completed backups not yet uploaded
pending_backups = execute_query(
"""SELECT * FROM backup_jobs
WHERE status = 'completed'
AND offsite_uploaded_at IS NULL
AND offsite_retry_count < %s
ORDER BY completed_at ASC""",
(settings.OFFSITE_RETRY_MAX_ATTEMPTS,)
)
if not pending_backups:
logger.info("✅ No pending backups for offsite upload")
return
logger.info("📦 Found %d backups pending offsite upload", len(pending_backups))
success_count = 0
fail_count = 0
for backup in pending_backups:
success = await backup_service.upload_offsite(backup['id'])
if success:
success_count += 1
# Send success notification
await notifications.send_offsite_success(
job_id=backup['id'],
backup_name=backup['file_path'].split('/')[-1],
file_size_bytes=backup['file_size_bytes']
)
else:
fail_count += 1
# Get updated retry count
updated_backup = execute_query_single(
"SELECT offsite_retry_count FROM backup_jobs WHERE id = %s",
(backup['id'],))
# Send failure notification
await notifications.send_offsite_failed(
job_id=backup['id'],
backup_name=backup['file_path'].split('/')[-1],
error_message='Offsite upload failed - will retry',
retry_count=updated_backup['offsite_retry_count']
)
logger.info("✅ Offsite upload job completed: %d success, %d failed",
success_count, fail_count)
except Exception as e:
logger.error("❌ Offsite upload job error: %s", str(e), exc_info=True)
async def _offsite_retry_job(self):
"""Retry failed offsite uploads"""
if not settings.OFFSITE_ENABLED:
return
# Find backups that failed offsite upload and haven't exceeded retry limit
retry_backups = execute_query(
"""SELECT * FROM backup_jobs
WHERE status = 'completed'
AND offsite_uploaded_at IS NULL
AND offsite_retry_count > 0
AND offsite_retry_count < %s
ORDER BY offsite_retry_count ASC, completed_at ASC
LIMIT 5""", # Limit to 5 retries per run
(settings.OFFSITE_RETRY_MAX_ATTEMPTS,)
)
if not retry_backups:
return
logger.info("🔁 Retrying %d failed offsite uploads...", len(retry_backups))
for backup in retry_backups:
logger.info("🔁 Retry attempt %d/%d for backup %s",
backup['offsite_retry_count'] + 1,
settings.OFFSITE_RETRY_MAX_ATTEMPTS,
backup['id'])
success = await backup_service.upload_offsite(backup['id'])
if success:
logger.info("✅ Offsite upload succeeded on retry: backup %s", backup['id'])
await notifications.send_offsite_success(
job_id=backup['id'],
backup_name=backup['file_path'].split('/')[-1],
file_size_bytes=backup['file_size_bytes']
)
async def _rotation_job(self):
"""Backup rotation job - removes expired backups"""
logger.info("🔄 Starting backup rotation job...")
try:
await backup_service.rotate_backups()
logger.info("✅ Backup rotation completed")
except Exception as e:
logger.error("❌ Backup rotation error: %s", str(e), exc_info=True)
async def _storage_check_job(self):
"""Storage usage check job - warns if storage is running low"""
logger.info("🔄 Starting storage check job...")
try:
stats = await backup_service.check_storage_usage()
if stats['warning']:
await notifications.send_storage_warning(
usage_pct=stats['usage_pct'],
used_gb=stats['total_size_gb'],
max_gb=settings.BACKUP_MAX_SIZE_GB
)
logger.info("✅ Storage check completed: %.1f%% used (%.2f GB / %d GB)",
stats['usage_pct'], stats['total_size_gb'], settings.BACKUP_MAX_SIZE_GB)
except Exception as e:
logger.error("❌ Storage check error: %s", str(e), exc_info=True)
def _get_weekday_number(self, day_name: str) -> int:
"""Convert day name to APScheduler weekday number (0=Monday, 6=Sunday)"""
days = {
'monday': 0,
'tuesday': 1,
'wednesday': 2,
'thursday': 3,
'friday': 4,
'saturday': 5,
'sunday': 6
}
return days.get(day_name.lower(), 6) # Default to Sunday
# Singleton instance
backup_scheduler = BackupScheduler()
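`_get_weekday_number` maps configuration strings onto APScheduler's Monday-based weekday indexing (`day_of_week=0` is Monday, `6` is Sunday), falling back to Sunday for unrecognized values. The same mapping as a standalone sketch:

```python
def weekday_number(day_name: str) -> int:
    """Map a day name to APScheduler's weekday index (0=Monday .. 6=Sunday)."""
    days = {
        'monday': 0, 'tuesday': 1, 'wednesday': 2, 'thursday': 3,
        'friday': 4, 'saturday': 5, 'sunday': 6,
    }
    # Unknown values fall back to Sunday, matching the scheduler's default
    return days.get(day_name.lower(), 6)
```

Note this is APScheduler's convention, not `datetime.weekday()`'s — the two happen to agree (both Monday-based), but cron libraries such as Vixie cron number Sunday as 0, so the explicit table avoids ambiguity.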


@ -1,690 +0,0 @@
"""
Backup Service
Handles database and file backup operations, rotation, restore, and offsite uploads.
"""
import os
import logging
import hashlib
import tarfile
import subprocess
import fcntl
from pathlib import Path
from datetime import datetime, timedelta
from typing import Optional, Dict, List, Tuple
import paramiko
from stat import S_ISDIR
from app.core.config import settings
from app.core.database import execute_query, execute_insert, execute_update
logger = logging.getLogger(__name__)
class BackupService:
"""Service for managing backup operations"""
def __init__(self):
self.backup_dir = Path(settings.BACKUP_STORAGE_PATH)
self.backup_dir.mkdir(parents=True, exist_ok=True)
# Subdirectories for different backup types
self.db_dir = self.backup_dir / "database"
self.files_dir = self.backup_dir / "files"
self.db_dir.mkdir(exist_ok=True)
self.files_dir.mkdir(exist_ok=True)
async def create_database_backup(self, is_monthly: bool = False) -> Optional[int]:
"""
Create PostgreSQL database backup using pg_dump
Args:
is_monthly: If True, creates plain SQL backup for readability
Returns:
backup_job_id or None if failed
"""
if settings.BACKUP_DRY_RUN:
logger.info("🔄 DRY RUN: Would create database backup (monthly=%s)", is_monthly)
return None
# Determine format based on monthly flag
backup_format = settings.DB_MONTHLY_FORMAT if is_monthly else settings.DB_DAILY_FORMAT
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = f"db_{timestamp}_{'monthly' if is_monthly else 'daily'}.{backup_format}"
backup_path = self.db_dir / filename
# Create backup job record
job_id = execute_insert(
"""INSERT INTO backup_jobs (job_type, status, backup_format, is_monthly, started_at)
VALUES (%s, %s, %s, %s, %s)""",
('database', 'running', backup_format, is_monthly, datetime.now())
)
logger.info("🔄 Starting database backup: job_id=%s, format=%s, monthly=%s",
job_id, backup_format, is_monthly)
try:
# Build pg_dump command - connect via network to postgres service
env = os.environ.copy()
# PGPASSWORD is set below, once the DATABASE_URL has been parsed
# Parse database connection info from DATABASE_URL
# Format: postgresql://user:pass@host:port/dbname
db_parts = settings.DATABASE_URL.replace('postgresql://', '').split('@')
user_pass = db_parts[0].split(':')
host_db = db_parts[1].split('/')
user = user_pass[0]
password = user_pass[1] if len(user_pass) > 1 else ''
host = host_db[0].split(':')[0] if ':' in host_db[0] else host_db[0]
dbname = host_db[1] if len(host_db) > 1 else 'bmc_hub'
env['PGPASSWORD'] = password
if backup_format == 'dump':
# Compressed custom format (-Fc)
cmd = ['pg_dump', '-h', host, '-U', user, '-Fc', dbname]
else:
# Plain SQL format
cmd = ['pg_dump', '-h', host, '-U', user, dbname]
# Execute pg_dump and write to file
logger.info("📦 Executing: %s > %s", ' '.join(cmd), backup_path)
with open(backup_path, 'wb') as f:
result = subprocess.run(cmd, stdout=f, stderr=subprocess.PIPE, check=True, env=env)
# Calculate file size and checksum
file_size = backup_path.stat().st_size
checksum = self._calculate_checksum(backup_path)
# Calculate retention date
if is_monthly:
# Months approximated as 30 days for the retention window
retention_until = datetime.now() + timedelta(days=settings.MONTHLY_KEEP_MONTHS * 30)
else:
retention_until = datetime.now() + timedelta(days=settings.RETENTION_DAYS)
# Update job record
execute_update(
"""UPDATE backup_jobs
SET status = %s, completed_at = %s, file_path = %s,
file_size_bytes = %s, checksum_sha256 = %s, retention_until = %s
WHERE id = %s""",
('completed', datetime.now(), str(backup_path), file_size, checksum,
retention_until.date(), job_id)
)
logger.info("✅ Database backup completed: %s (%.2f MB)",
filename, file_size / 1024 / 1024)
return job_id
except subprocess.CalledProcessError as e:
error_msg = e.stderr.decode() if e.stderr else str(e)
logger.error("❌ Database backup failed: %s", error_msg)
execute_update(
"""UPDATE backup_jobs
SET status = %s, completed_at = %s, error_message = %s
WHERE id = %s""",
('failed', datetime.now(), error_msg, job_id)
)
# Clean up partial backup file
if backup_path.exists():
backup_path.unlink()
return None
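The connection info above is pulled out of `DATABASE_URL` with hand-rolled string splitting. A minimal sketch of the same extraction using the stdlib's `urllib.parse`, which copes with missing ports and passwords; the fallback database name `bmc_hub` mirrors the default used above, and the example URL is illustrative:

```python
from urllib.parse import urlparse

def parse_database_url(url: str) -> dict:
    """Extract connection parts from a postgresql://user:pass@host:port/dbname URL."""
    p = urlparse(url)
    return {
        'user': p.username or '',
        'password': p.password or '',
        'host': p.hostname or 'localhost',
        'port': p.port or 5432,
        # Fallback matches the hand-rolled parser's default
        'dbname': (p.path or '').lstrip('/') or 'bmc_hub',
    }

info = parse_database_url('postgresql://bmc:secret@postgres:5432/bmc_hub')
```

`urlparse` recognises the netloc of any `scheme://` URL, so the same helper would serve both `pg_dump` and `pg_restore` call sites.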
async def create_files_backup(self) -> Optional[int]:
"""
Create tar.gz backup of file directories (uploads/, data/, logs/)
Returns:
backup_job_id or None if failed
"""
if settings.BACKUP_DRY_RUN:
logger.info("🔄 DRY RUN: Would create files backup")
return None
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = f"files_{timestamp}.tar.gz"
backup_path = self.files_dir / filename
# Paths to backup (relative to project root)
base_path = Path.cwd()
paths_to_backup = []
if settings.BACKUP_INCLUDE_UPLOADS:
uploads_path = base_path / settings.UPLOAD_DIR
if uploads_path.exists():
paths_to_backup.append((uploads_path, 'uploads'))
if settings.BACKUP_INCLUDE_DATA:
data_path = base_path / 'data'
if data_path.exists():
paths_to_backup.append((data_path, 'data'))
if settings.BACKUP_INCLUDE_LOGS:
logs_path = base_path / 'logs'
if logs_path.exists():
paths_to_backup.append((logs_path, 'logs'))
if not paths_to_backup:
logger.warning("⚠️ No file directories to backup")
return None
# Create backup job record
job_id = execute_insert(
"""INSERT INTO backup_jobs
(job_type, status, backup_format, includes_uploads, includes_logs, includes_data, started_at)
VALUES (%s, %s, %s, %s, %s, %s, %s)""",
('files', 'running', 'tar.gz',
settings.BACKUP_INCLUDE_UPLOADS,
settings.BACKUP_INCLUDE_LOGS,
settings.BACKUP_INCLUDE_DATA,
datetime.now())
)
logger.info("🔄 Starting files backup: job_id=%s, paths=%s",
job_id, [name for _, name in paths_to_backup])
try:
# Exclude patterns (shell-style wildcards; matched with fnmatch below,
# since a plain substring test would never match '*.pyc')
exclude_patterns = [
'__pycache__',
'*.pyc',
'*.pyo',
'*.pyd',
'.DS_Store',
'.git',
'backup', # Don't backup the backup directory itself!
]
import fnmatch
def _should_exclude(ti):
name = os.path.basename(ti.name)
return any(fnmatch.fnmatch(name, pattern) or pattern in ti.name.split('/') for pattern in exclude_patterns)
# Create tar.gz archive
with tarfile.open(backup_path, 'w:gz') as tar:
for path, arcname in paths_to_backup:
tar.add(
path,
arcname=arcname,
recursive=True,
filter=lambda ti: None if _should_exclude(ti) else ti
)
# Calculate file size and checksum
file_size = backup_path.stat().st_size
checksum = self._calculate_checksum(backup_path)
# Calculate retention date (files use daily retention)
retention_until = datetime.now() + timedelta(days=settings.RETENTION_DAYS)
# Update job record
execute_update(
"""UPDATE backup_jobs
SET status = %s, completed_at = %s, file_path = %s,
file_size_bytes = %s, checksum_sha256 = %s, retention_until = %s
WHERE id = %s""",
('completed', datetime.now(), str(backup_path), file_size, checksum,
retention_until.date(), job_id)
)
logger.info("✅ Files backup completed: %s (%.2f MB)",
filename, file_size / 1024 / 1024)
return job_id
except Exception as e:
logger.error("❌ Files backup failed: %s", str(e))
execute_update(
"""UPDATE backup_jobs
SET status = %s, completed_at = %s, error_message = %s
WHERE id = %s""",
('failed', datetime.now(), str(e), job_id)
)
# Clean up partial backup file
if backup_path.exists():
backup_path.unlink()
return None
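The exclude list above mixes literal directory names with shell-style wildcards. A small sketch of a tar filter that honours both via `fnmatch` (the pattern list and member names here are illustrative, not the service's exact configuration):

```python
import fnmatch
import os
import tarfile

EXCLUDE_PATTERNS = ['__pycache__', '*.pyc', '.DS_Store', '.git', 'backup']

def exclude_filter(ti: tarfile.TarInfo):
    """Return None to skip a member, or the member unchanged to keep it."""
    name = os.path.basename(ti.name)
    parts = ti.name.split('/')
    # Wildcards are matched against the basename; literals against any path segment
    if any(fnmatch.fnmatch(name, p) or p in parts for p in EXCLUDE_PATTERNS):
        return None
    return ti
```

`tarfile.TarFile.add` accepts such a callable via its `filter=` keyword, calling it once per member.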
async def create_full_backup(self, is_monthly: bool = False) -> Tuple[Optional[int], Optional[int]]:
"""
Create full backup (database + files)
Returns:
(db_job_id, files_job_id) tuple
"""
logger.info("🔄 Starting full backup (database + files)")
db_job_id = await self.create_database_backup(is_monthly=is_monthly)
files_job_id = await self.create_files_backup()
if db_job_id and files_job_id:
logger.info("✅ Full backup completed: db=%s, files=%s", db_job_id, files_job_id)
else:
logger.warning("⚠️ Full backup partially failed: db=%s, files=%s",
db_job_id, files_job_id)
return (db_job_id, files_job_id)
async def rotate_backups(self):
"""
Remove old backups based on retention policy:
- Daily backups: Keep for RETENTION_DAYS (default 30 days)
- Monthly backups: Keep for MONTHLY_KEEP_MONTHS (default 12 months)
"""
if settings.BACKUP_DRY_RUN:
logger.info("🔄 DRY RUN: Would rotate backups")
return
logger.info("🔄 Starting backup rotation")
# Find expired backups
expired_backups = execute_query(
"""SELECT id, file_path, is_monthly, retention_until
FROM backup_jobs
WHERE status = 'completed'
AND retention_until < CURRENT_DATE
ORDER BY retention_until ASC"""
)
deleted_count = 0
freed_bytes = 0
for backup in expired_backups:
file_path = Path(backup['file_path'])
if file_path.exists():
file_size = file_path.stat().st_size
file_path.unlink()
freed_bytes += file_size
logger.info("🗑️ Deleted expired backup: %s (%.2f MB, retention_until=%s)",
file_path.name, file_size / 1024 / 1024, backup['retention_until'])
# Delete from database
execute_update("DELETE FROM backup_jobs WHERE id = %s", (backup['id'],))
deleted_count += 1
if deleted_count > 0:
logger.info("✅ Rotation complete: deleted %d backups, freed %.2f MB",
deleted_count, freed_bytes / 1024 / 1024)
else:
logger.info("✅ Rotation complete: no expired backups")
async def restore_database(self, job_id: int) -> bool:
"""
Restore database from backup with maintenance mode
Args:
job_id: Backup job ID to restore from
Returns:
True if successful, False otherwise
"""
if settings.BACKUP_READ_ONLY:
logger.error("❌ Restore blocked: BACKUP_READ_ONLY=true")
return False
# Get backup job
backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s AND job_type = 'database'",
(job_id,))
if not backup:
logger.error("❌ Backup job not found: %s", job_id)
return False
backup_path = Path(backup['file_path'])
if not backup_path.exists():
logger.error("❌ Backup file not found: %s", backup_path)
return False
logger.info("🔄 Starting database restore from backup: %s", backup_path.name)
# Enable maintenance mode
await self.set_maintenance_mode(True, "Database restore i gang", eta_minutes=5)
# TODO: Stop scheduler (will be implemented in scheduler.py)
# Define the lock path before the try-block so the finally-clause can always reference it
lock_file = self.backup_dir / ".restore.lock"
try:
# Verify checksum
current_checksum = self._calculate_checksum(backup_path)
if current_checksum != backup['checksum_sha256']:
raise ValueError(f"Checksum mismatch! Expected {backup['checksum_sha256']}, got {current_checksum}")
logger.info("✅ Checksum verified")
# Acquire file lock to prevent concurrent operations
lock_file = self.backup_dir / ".restore.lock"
with open(lock_file, 'w') as f:
fcntl.flock(f.fileno(), fcntl.LOCK_EX)
# Parse database connection info
env = os.environ.copy()
db_parts = settings.DATABASE_URL.replace('postgresql://', '').split('@')
user_pass = db_parts[0].split(':')
host_db = db_parts[1].split('/')
user = user_pass[0]
password = user_pass[1] if len(user_pass) > 1 else ''
host = host_db[0].split(':')[0] if ':' in host_db[0] else host_db[0]
dbname = host_db[1] if len(host_db) > 1 else 'bmc_hub'
env['PGPASSWORD'] = password
# Build restore command based on format
if backup['backup_format'] == 'dump':
# Restore from compressed custom format
cmd = ['pg_restore', '-h', host, '-U', user, '-d', dbname, '--clean', '--if-exists']
logger.info("📥 Executing: %s < %s", ' '.join(cmd), backup_path)
with open(backup_path, 'rb') as f:
result = subprocess.run(cmd, stdin=f, stderr=subprocess.PIPE, check=True, env=env)
else:
# Restore from plain SQL
cmd = ['psql', '-h', host, '-U', user, '-d', dbname]
logger.info("📥 Executing: %s < %s", ' '.join(cmd), backup_path)
with open(backup_path, 'rb') as f:
result = subprocess.run(cmd, stdin=f, stderr=subprocess.PIPE, check=True, env=env)
# Exiting the with-block closes the lock file, which releases the flock
logger.info("✅ Database restore completed successfully")
# Log notification
execute_insert(
"""INSERT INTO backup_notifications (backup_job_id, event_type, message)
VALUES (%s, %s, %s)""",
(job_id, 'restore_completed', f'Database restored from backup: {backup_path.name}')
)
return True
except Exception as e:
logger.error("❌ Database restore failed: %s", str(e))
return False
finally:
# Disable maintenance mode
await self.set_maintenance_mode(False)
# TODO: Restart scheduler (will be implemented in scheduler.py)
# Clean up lock file
if lock_file.exists():
lock_file.unlink()
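Both restore methods pair a bare `open` with `fcntl.flock` and manual cleanup. A sketch of the same exclusive-lock pattern wrapped in a context manager, so acquisition and release cannot drift apart (the lock-file name is illustrative; `fcntl` is Unix-only, matching the container environment assumed here):

```python
import fcntl
from contextlib import contextmanager

@contextmanager
def exclusive_lock(lock_path):
    """Hold an exclusive advisory flock for the duration of the with-block."""
    f = open(lock_path, 'w')
    try:
        fcntl.flock(f.fileno(), fcntl.LOCK_EX)
        yield f
    finally:
        # Closing the file would also release the lock; unlock explicitly for clarity
        fcntl.flock(f.fileno(), fcntl.LOCK_UN)
        f.close()
```

A restore body would then read `with exclusive_lock(self.backup_dir / ".restore.lock"): ...`, and the lock is released on success and on any exception alike.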
async def restore_files(self, job_id: int) -> bool:
"""
Restore files from tar.gz backup
Args:
job_id: Backup job ID to restore from
Returns:
True if successful, False otherwise
"""
if settings.BACKUP_READ_ONLY:
logger.error("❌ Restore blocked: BACKUP_READ_ONLY=true")
return False
# Get backup job
backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s AND job_type = 'files'",
(job_id,))
if not backup:
logger.error("❌ Backup job not found: %s", job_id)
return False
backup_path = Path(backup['file_path'])
if not backup_path.exists():
logger.error("❌ Backup file not found: %s", backup_path)
return False
logger.info("🔄 Starting files restore from backup: %s", backup_path.name)
# Define the lock path before the try-block so the finally-clause can always reference it
lock_file = self.backup_dir / ".restore_files.lock"
try:
# Verify checksum
current_checksum = self._calculate_checksum(backup_path)
if current_checksum != backup['checksum_sha256']:
raise ValueError(f"Checksum mismatch! Expected {backup['checksum_sha256']}, got {current_checksum}")
logger.info("✅ Checksum verified")
# Acquire file lock
lock_file = self.backup_dir / ".restore_files.lock"
with open(lock_file, 'w') as f:
fcntl.flock(f.fileno(), fcntl.LOCK_EX)
# Extract tar.gz to project root
base_path = Path.cwd()
with tarfile.open(backup_path, 'r:gz') as tar:
# Extract all files, excluding the backup directory
# (on Python 3.12+, passing filter='data' to extractall also guards against path traversal)
members = [m for m in tar.getmembers() if 'backup' not in m.name]
tar.extractall(path=base_path, members=members)
# Exiting the with-block closes the lock file, which releases the flock
logger.info("✅ Files restore completed successfully")
return True
except Exception as e:
logger.error("❌ Files restore failed: %s", str(e))
return False
finally:
# Clean up lock file
if lock_file.exists():
lock_file.unlink()
async def upload_offsite(self, job_id: int) -> bool:
"""
Upload backup to offsite location via SFTP/SSH
Args:
job_id: Backup job ID to upload
Returns:
True if successful, False otherwise
"""
if not settings.OFFSITE_ENABLED:
logger.info("⏭️ Offsite upload disabled")
return False
if settings.BACKUP_DRY_RUN:
logger.info("🔄 DRY RUN: Would upload backup to offsite")
return False
# Get backup job
backup = execute_query_single(
"SELECT * FROM backup_jobs WHERE id = %s",
(job_id,))
if not backup:
logger.error("❌ Backup job not found: %s", job_id)
return False
if backup['offsite_uploaded_at']:
logger.info("⏭️ Backup already uploaded to offsite: %s", job_id)
return True
backup_path = Path(backup['file_path'])
if not backup_path.exists():
logger.error("❌ Backup file not found: %s", backup_path)
return False
logger.info("☁️ Starting offsite upload: %s to %s:%s",
backup_path.name, settings.SFTP_HOST, settings.SFTP_REMOTE_PATH)
try:
# Connect via SFTP
transport = paramiko.Transport((settings.SFTP_HOST, settings.SFTP_PORT))
if settings.SSH_KEY_PATH:
# Use SSH key authentication (RSAKey assumes an RSA private key file)
private_key = paramiko.RSAKey.from_private_key_file(settings.SSH_KEY_PATH)
transport.connect(username=settings.SFTP_USER, pkey=private_key)
else:
# Use password authentication
transport.connect(username=settings.SFTP_USER, password=settings.SFTP_PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
# Create remote directory if needed
remote_path = settings.SFTP_REMOTE_PATH
self._ensure_remote_directory(sftp, remote_path)
# Upload file
remote_file = f"{remote_path}/{backup_path.name}"
sftp.put(str(backup_path), remote_file)
# Verify upload
remote_stat = sftp.stat(remote_file)
local_size = backup_path.stat().st_size
if remote_stat.st_size != local_size:
raise ValueError(f"Upload verification failed: remote size {remote_stat.st_size} != local size {local_size}")
# Close connection
sftp.close()
transport.close()
# Update job record
execute_update(
"""UPDATE backup_jobs
SET offsite_uploaded_at = %s, offsite_retry_count = 0
WHERE id = %s""",
(datetime.now(), job_id)
)
logger.info("✅ Offsite upload completed: %s", backup_path.name)
return True
except Exception as e:
logger.error("❌ Offsite upload failed: %s", str(e))
# Increment retry count
execute_update(
"""UPDATE backup_jobs
SET offsite_retry_count = offsite_retry_count + 1
WHERE id = %s""",
(job_id,)
)
return False
async def check_storage_usage(self) -> Dict[str, object]:
"""
Check backup storage usage and warn if exceeding threshold
Returns:
Dict with storage statistics
"""
total_size = 0
file_count = 0
for backup_file in self.backup_dir.rglob('*'):
if backup_file.is_file() and not backup_file.name.startswith('.'):
total_size += backup_file.stat().st_size
file_count += 1
max_size_bytes = settings.BACKUP_MAX_SIZE_GB * 1024 * 1024 * 1024
usage_pct = (total_size / max_size_bytes) * 100 if max_size_bytes > 0 else 0
stats = {
'total_size_bytes': total_size,
'total_size_gb': total_size / 1024 / 1024 / 1024,
'max_size_gb': settings.BACKUP_MAX_SIZE_GB,
'usage_pct': usage_pct,
'file_count': file_count,
'warning': usage_pct >= settings.STORAGE_WARNING_THRESHOLD_PCT
}
if stats['warning']:
logger.warning("⚠️ Backup storage usage high: %.1f%% (%.2f GB / %d GB)",
usage_pct, stats['total_size_gb'], settings.BACKUP_MAX_SIZE_GB)
# Log notification
execute_insert(
"""INSERT INTO backup_notifications (event_type, message)
VALUES (%s, %s)""",
('storage_low',
f"Backup storage usage at {usage_pct:.1f}% ({stats['total_size_gb']:.2f} GB / {settings.BACKUP_MAX_SIZE_GB} GB)")
)
return stats
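The storage check boils down to a recursive size sum plus a quota percentage. A self-contained sketch of that core (the quota, threshold, and file names below are illustrative):

```python
from pathlib import Path

def storage_usage(backup_dir: Path, max_size_gb: float, warn_pct: float = 80.0) -> dict:
    """Sum visible file sizes under backup_dir and compare against a quota."""
    total = sum(
        p.stat().st_size
        for p in backup_dir.rglob('*')
        if p.is_file() and not p.name.startswith('.')  # skip lock/dot files
    )
    max_bytes = max_size_gb * 1024 ** 3
    pct = (total / max_bytes) * 100 if max_bytes > 0 else 0
    return {'total_size_bytes': total, 'usage_pct': pct, 'warning': pct >= warn_pct}
```

Dot-prefixed files (such as the `.restore.lock` files created during restores) are excluded from the total, matching the service's own walk.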
async def set_maintenance_mode(self, enabled: bool, message: str = None, eta_minutes: int = None):
"""
Enable or disable system maintenance mode
Args:
enabled: True to enable maintenance mode, False to disable
message: Custom maintenance message
eta_minutes: Estimated time to completion in minutes
"""
if message is None:
message = "System under vedligeholdelse" if enabled else ""
execute_update(
"""UPDATE system_status
SET maintenance_mode = %s, maintenance_message = %s,
maintenance_eta_minutes = %s, updated_at = %s
WHERE id = 1""",
(enabled, message, eta_minutes, datetime.now())
)
if enabled:
logger.warning("🔧 Maintenance mode ENABLED: %s (ETA: %s min)", message, eta_minutes)
else:
logger.info("✅ Maintenance mode DISABLED")
def _calculate_checksum(self, file_path: Path) -> str:
"""Calculate SHA256 checksum of file"""
sha256_hash = hashlib.sha256()
with open(file_path, "rb") as f:
for byte_block in iter(lambda: f.read(4096), b""):
sha256_hash.update(byte_block)
return sha256_hash.hexdigest()
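The chunked SHA-256 helper above streams the file so large backups never load fully into memory; it can be exercised standalone with a throwaway file (the sample content is illustrative):

```python
import hashlib
import tempfile

def calculate_checksum(file_path: str, chunk_size: int = 4096) -> str:
    """Stream the file in fixed-size chunks and return the hex SHA-256 digest."""
    sha256 = hashlib.sha256()
    with open(file_path, 'rb') as f:
        for block in iter(lambda: f.read(chunk_size), b''):
            sha256.update(block)
    return sha256.hexdigest()

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'hello backup')
digest = calculate_checksum(tmp.name)
```

The same digest is stored in `backup_jobs.checksum_sha256` at backup time and re-computed before restore, so a mismatch aborts the restore early.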
def _ensure_remote_directory(self, sftp: paramiko.SFTPClient, path: str):
"""Create remote directory if it doesn't exist (recursive)"""
dirs = []
current = path
while current and current != '/':  # stop at root; also terminates for relative paths, where dirname returns ''
dirs.append(current)
current = os.path.dirname(current)
dirs.reverse()
for dir_path in dirs:
try:
sftp.stat(dir_path)
except FileNotFoundError:
sftp.mkdir(dir_path)
logger.info("📁 Created remote directory: %s", dir_path)
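The remote-mkdir walk above first collects the target path and all its ancestors, shallowest first, then creates any that are missing. The chain-building step sketched in isolation (example paths are illustrative):

```python
import os

def parent_chain(path: str) -> list:
    """Return an absolute path and its ancestors, shallowest first."""
    dirs = []
    current = path
    while current and current != '/':
        dirs.append(current)
        current = os.path.dirname(current)
    # Reverse so parents come before children, as mkdir requires
    return list(reversed(dirs))
```

Iterating this list and calling `sftp.mkdir` on each missing entry reproduces the recursive-create behaviour without assuming the server supports `mkdir -p`.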
# Singleton instance
backup_service = BackupService()

View File

@@ -1 +0,0 @@
"""Backup frontend views and templates."""

View File

@@ -1,20 +0,0 @@
"""
Backup Frontend Views
Serves HTML pages for backup system dashboard
"""
from fastapi import APIRouter, Request
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse
router = APIRouter()
templates = Jinja2Templates(directory="app")
@router.get("/backups", response_class=HTMLResponse)
async def backups_dashboard(request: Request):
"""Backup system dashboard page"""
return templates.TemplateResponse("backups/templates/index.html", {
"request": request,
"title": "Backup System"
})

View File

@@ -1,820 +0,0 @@
<!DOCTYPE html>
<html lang="da" data-bs-theme="light">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Backup System - BMC Hub</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.0/font/bootstrap-icons.css">
<style>
:root {
--primary-color: #0f4c75;
--secondary-color: #3282b8;
--accent-color: #bbe1fa;
--success-color: #28a745;
--warning-color: #ffc107;
--danger-color: #dc3545;
--light-bg: #f8f9fa;
--dark-bg: #1b2838;
}
[data-bs-theme="dark"] {
--light-bg: #1b2838;
--primary-color: #3282b8;
}
body {
background-color: var(--light-bg);
font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
}
.navbar {
background-color: var(--primary-color) !important;
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
}
.card {
border: none;
border-radius: 10px;
box-shadow: 0 2px 8px rgba(0,0,0,0.1);
margin-bottom: 1.5rem;
}
.card-header {
background-color: var(--primary-color);
color: white;
border-radius: 10px 10px 0 0 !important;
padding: 1rem 1.25rem;
font-weight: 600;
}
.stat-card {
text-align: center;
padding: 1.5rem;
}
.stat-card .stat-value {
font-size: 2.5rem;
font-weight: bold;
color: var(--primary-color);
}
.stat-card .stat-label {
font-size: 0.9rem;
color: #6c757d;
text-transform: uppercase;
}
.badge-type {
font-size: 0.85rem;
padding: 0.4rem 0.8rem;
}
.progress {
height: 25px;
font-size: 0.9rem;
}
.btn-action {
margin: 0.25rem;
}
.offsite-badge {
font-size: 0.75rem;
}
.notification-item {
border-left: 4px solid;
padding-left: 1rem;
margin-bottom: 0.75rem;
}
.notification-item.backup_failed {
border-color: var(--danger-color);
}
.notification-item.storage_low {
border-color: var(--warning-color);
}
.notification-item.backup_success {
border-color: var(--success-color);
}
.theme-toggle {
cursor: pointer;
font-size: 1.25rem;
color: white;
}
</style>
</head>
<body>
<!-- Navbar -->
<nav class="navbar navbar-expand-lg navbar-dark">
<div class="container-fluid">
<a class="navbar-brand" href="/">
<i class="bi bi-hdd-network"></i> BMC Hub - Backup System
</a>
<div class="d-flex align-items-center">
<span class="theme-toggle me-3" onclick="toggleTheme()">
<i class="bi bi-moon-stars" id="theme-icon"></i>
</span>
<a href="/api/docs" class="btn btn-outline-light btn-sm">
<i class="bi bi-code-square"></i> API Docs
</a>
</div>
</div>
</nav>
<div class="container-fluid py-4">
<!-- Stats Row -->
<div class="row mb-4">
<div class="col-md-3">
<div class="card stat-card">
<div class="stat-value" id="total-backups">-</div>
<div class="stat-label">Total Backups</div>
</div>
</div>
<div class="col-md-3">
<div class="card stat-card">
<div class="stat-value text-success" id="completed-backups">-</div>
<div class="stat-label">Completed</div>
</div>
</div>
<div class="col-md-3">
<div class="card stat-card">
<div class="stat-value text-warning" id="pending-offsite">-</div>
<div class="stat-label">Pending Offsite</div>
</div>
</div>
<div class="col-md-3">
<div class="card stat-card">
<div class="stat-value" id="storage-usage">-</div>
<div class="stat-label">Storage Used</div>
</div>
</div>
</div>
<!-- Storage Usage -->
<div class="row mb-4">
<div class="col-12">
<div class="card">
<div class="card-header">
<i class="bi bi-hdd"></i> Storage Usage
</div>
<div class="card-body">
<div class="progress" id="storage-progress">
<div class="progress-bar" role="progressbar" style="width: 0%" id="storage-bar">
0%
</div>
</div>
<p class="text-muted mt-2 mb-0" id="storage-details">Loading...</p>
</div>
</div>
</div>
</div>
<!-- Actions and Scheduler Status -->
<div class="row mb-4">
<div class="col-md-8">
<div class="card">
<div class="card-header">
<i class="bi bi-play-circle"></i> Manual Backup
</div>
<div class="card-body">
<form id="backup-form">
<div class="row g-3 align-items-end">
<div class="col-md-4">
<label class="form-label">Backup Type</label>
<select class="form-select" id="backup-type">
<option value="full">Full (Database + Files)</option>
<option value="database">Database Only</option>
<option value="files">Files Only</option>
</select>
</div>
<div class="col-md-4">
<div class="form-check mt-4">
<input class="form-check-input" type="checkbox" id="is-monthly">
<label class="form-check-label" for="is-monthly">
Monthly Backup (SQL Format)
</label>
</div>
</div>
<div class="col-md-4">
<button type="submit" class="btn btn-primary w-100">
<i class="bi bi-download"></i> Create Backup
</button>
</div>
</div>
</form>
<div id="backup-result" class="mt-3"></div>
<hr class="my-4">
<!-- Upload Backup Form -->
<h6 class="mb-3"><i class="bi bi-cloud-upload"></i> Upload Backup</h6>
<form id="upload-form" onsubmit="uploadBackup(event)">
<div class="row g-2">
<div class="col-md-6">
<label for="backup-file" class="form-label">Backup File</label>
<input type="file" class="form-control" id="backup-file" required
accept=".dump,.sql,.sql.gz,.tar.gz,.tgz">
</div>
<div class="col-md-6">
<label for="upload-type" class="form-label">Backup Type</label>
<select class="form-select" id="upload-type" required>
<option value="database">Database</option>
<option value="files">Files</option>
</select>
</div>
<div class="col-md-6">
<div class="form-check mt-4">
<input class="form-check-input" type="checkbox" id="upload-monthly">
<label class="form-check-label" for="upload-monthly">
Monthly Backup
</label>
</div>
</div>
<div class="col-md-6">
<button type="submit" class="btn btn-success w-100 mt-4">
<i class="bi bi-upload"></i> Upload Backup
</button>
</div>
</div>
</form>
<div id="upload-result" class="mt-3"></div>
</div>
</div>
</div>
<div class="col-md-4">
<div class="card">
<div class="card-header">
<i class="bi bi-clock-history"></i> Scheduler Status
</div>
<div class="card-body">
<div id="scheduler-status">
<div class="spinner-border spinner-border-sm" role="status"></div>
<span class="ms-2">Loading...</span>
</div>
</div>
</div>
</div>
</div>
<!-- Backup History -->
<div class="row">
<div class="col-md-8">
<div class="card">
<div class="card-header d-flex justify-content-between align-items-center">
<span><i class="bi bi-clock-history"></i> Backup History</span>
<button class="btn btn-light btn-sm" onclick="refreshBackups()">
<i class="bi bi-arrow-clockwise"></i> Refresh
</button>
</div>
<div class="card-body">
<div class="table-responsive">
<table class="table table-hover">
<thead>
<tr>
<th>ID</th>
<th>Type</th>
<th>Format</th>
<th>Size</th>
<th>Status</th>
<th>Offsite</th>
<th>Created</th>
<th>Actions</th>
</tr>
</thead>
<tbody id="backups-table">
<tr>
<td colspan="8" class="text-center">
<div class="spinner-border" role="status"></div>
<p class="mt-2">Loading backups...</p>
</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
<!-- Notifications -->
<div class="col-md-4">
<div class="card">
<div class="card-header">
<i class="bi bi-bell"></i> Recent Notifications
</div>
<div class="card-body" style="max-height: 500px; overflow-y: auto;">
<div id="notifications-list">
<div class="text-center">
<div class="spinner-border spinner-border-sm" role="status"></div>
<p class="mt-2 text-muted">Loading...</p>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Restore Confirmation Modal -->
<div class="modal fade" id="restoreModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header bg-warning">
<h5 class="modal-title">
<i class="bi bi-exclamation-triangle"></i> Bekræft Restore
</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div class="alert alert-danger">
<strong>ADVARSEL:</strong> Systemet vil blive lukket ned under restore-processen.
Alle aktive brugere vil miste forbindelsen.
</div>
<p>Er du sikker på, at du vil gendanne fra denne backup?</p>
<p class="text-muted mb-0">
<strong>Backup ID:</strong> <span id="restore-job-id"></span><br>
<strong>Type:</strong> <span id="restore-job-type"></span><br>
<strong>Estimeret tid:</strong> 5-10 minutter
</p>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Annuller</button>
<button type="button" class="btn btn-danger" onclick="confirmRestore()">
<i class="bi bi-arrow-counterclockwise"></i> Gendan Nu
</button>
</div>
</div>
</div>
</div>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
<script>
let selectedJobId = null;
let restoreModal = null;
// Initialize
document.addEventListener('DOMContentLoaded', function() {
restoreModal = new bootstrap.Modal(document.getElementById('restoreModal'));
loadDashboard();
// Refresh every 30 seconds
setInterval(loadDashboard, 30000);
// Setup backup form
document.getElementById('backup-form').addEventListener('submit', createBackup);
});
// Load all dashboard data
async function loadDashboard() {
await Promise.all([
loadBackups(),
loadStorageStats(),
loadNotifications(),
loadSchedulerStatus()
]);
}
// Load backups list
async function loadBackups() {
// TODO: Implement /api/v1/backups/jobs endpoint
console.warn('⚠️ Backups API not implemented yet');
document.getElementById('backups-table').innerHTML = '<tr><td colspan="8" class="text-center text-warning"><i class="bi bi-exclamation-triangle me-2"></i>The backup API is not implemented yet</td></tr>';
return;
/* Disabled until API implemented:
try {
const response = await fetch('/api/v1/backups/jobs?limit=50');
const backups = await response.json();
const tbody = document.getElementById('backups-table');
if (backups.length === 0) {
tbody.innerHTML = '<tr><td colspan="8" class="text-center text-muted">No backups found</td></tr>';
updateStats(backups);
return;
}
tbody.innerHTML = backups.map(b => `
<tr>
<td><strong>#${b.id}</strong></td>
<td>
<span class="badge badge-type ${getTypeBadgeClass(b.job_type)}">
${b.job_type} ${b.is_monthly ? '(Monthly)' : ''}
</span>
</td>
<td><code>${b.backup_format}</code></td>
<td>${formatBytes(b.file_size_bytes)}</td>
<td>${getStatusBadge(b.status)}</td>
<td>${getOffsiteBadge(b)}</td>
<td>${formatDate(b.created_at)}</td>
<td>
${b.status === 'completed' ? `
<button class="btn btn-sm btn-warning btn-action" onclick="showRestore(${b.id}, '${b.job_type}')" title="Restore">
<i class="bi bi-arrow-counterclockwise"></i>
</button>
${b.offsite_uploaded_at === null ? `
<button class="btn btn-sm btn-info btn-action" onclick="uploadOffsite(${b.id})" title="Upload Offsite">
<i class="bi bi-cloud-upload"></i>
</button>
` : ''}
` : ''}
<button class="btn btn-sm btn-danger btn-action" onclick="deleteBackup(${b.id})" title="Delete">
<i class="bi bi-trash"></i>
</button>
</td>
</tr>
`).join('');
updateStats(backups);
} catch (error) {
console.error('Load backups error:', error);
document.getElementById('backups-table').innerHTML =
'<tr><td colspan="8" class="text-center text-danger">Failed to load backups</td></tr>';
}
*/
}
// Load storage stats
async function loadStorageStats() {
// TODO: Implement /api/v1/backups/storage endpoint
return;
/* Disabled until API implemented:
try {
const response = await fetch('/api/v1/backups/storage');
const stats = await response.json();
const bar = document.getElementById('storage-bar');
const pct = Math.min(stats.usage_pct, 100);
bar.style.width = pct + '%';
bar.textContent = pct.toFixed(1) + '%';
if (stats.warning) {
bar.classList.remove('bg-success', 'bg-info');
bar.classList.add('bg-danger');
} else if (pct > 60) {
bar.classList.remove('bg-success', 'bg-danger');
bar.classList.add('bg-warning');
} else {
bar.classList.remove('bg-warning', 'bg-danger');
bar.classList.add('bg-success');
}
document.getElementById('storage-details').textContent =
`${stats.total_size_gb.toFixed(2)} GB used of ${stats.max_size_gb} GB (${stats.file_count} files)`;
document.getElementById('storage-usage').textContent = stats.total_size_gb.toFixed(1) + ' GB';
} catch (error) {
console.error('Load storage stats error:', error);
}
*/
}
// Load notifications
async function loadNotifications() {
// TODO: Implement /api/v1/backups/notifications endpoint
return;
/* Disabled until API implemented:
try {
const response = await fetch('/api/v1/backups/notifications?limit=10');
const notifications = await response.json();
const container = document.getElementById('notifications-list');
if (notifications.length === 0) {
container.innerHTML = '<p class="text-muted text-center">No notifications</p>';
return;
}
container.innerHTML = notifications.map(n => `
<div class="notification-item ${n.event_type} ${n.acknowledged ? 'opacity-50' : ''}">
<small class="text-muted">${formatDate(n.sent_at)}</small>
<p class="mb-1">${n.message}</p>
${!n.acknowledged ? `
<button class="btn btn-sm btn-outline-secondary" onclick="acknowledgeNotification(${n.id})">
<i class="bi bi-check"></i> Acknowledge
</button>
` : '<small class="text-success"><i class="bi bi-check-circle"></i> Acknowledged</small>'}
</div>
`).join('');
} catch (error) {
console.error('Load notifications error:', error);
}
*/
}
// Load scheduler status
async function loadSchedulerStatus() {
// TODO: Implement /api/v1/backups/scheduler/status endpoint
return;
/* Disabled until API implemented:
try {
const response = await fetch('/api/v1/backups/scheduler/status');
const status = await response.json();
const container = document.getElementById('scheduler-status');
if (!status.running) {
container.innerHTML = `
<div class="alert alert-warning mb-0">
<i class="bi bi-exclamation-triangle"></i> Scheduler not running
</div>
`;
return;
}
container.innerHTML = `
<div class="alert alert-success mb-0">
<i class="bi bi-check-circle"></i> Active
</div>
<small class="text-muted">Next jobs:</small>
<ul class="list-unstyled mb-0 mt-1">
${status.jobs.slice(0, 3).map(j => `
<li><small>${j.name}: ${j.next_run ? formatDate(j.next_run) : 'N/A'}</small></li>
`).join('')}
</ul>
`;
} catch (error) {
console.error('Load scheduler status error:', error);
}
*/
}
// Create manual backup
async function createBackup(event) {
event.preventDefault();
const resultDiv = document.getElementById('backup-result');
resultDiv.innerHTML = '<div class="alert alert-warning"><i class="bi bi-exclamation-triangle me-2"></i>The backup API is not implemented yet</div>';
return;
/* Disabled until API implemented:
const type = document.getElementById('backup-type').value;
const isMonthly = document.getElementById('is-monthly').checked;
resultDiv.innerHTML = '<div class="alert alert-info"><i class="bi bi-hourglass-split"></i> Creating backup...</div>';
try {
const response = await fetch('/api/v1/backups', {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({job_type: type, is_monthly: isMonthly})
});
const result = await response.json();
if (response.ok) {
resultDiv.innerHTML = `<div class="alert alert-success">${result.message}</div>`;
setTimeout(() => loadBackups(), 2000);
} else {
resultDiv.innerHTML = `<div class="alert alert-danger">Error: ${result.detail}</div>`;
}
} catch (error) {
resultDiv.innerHTML = `<div class="alert alert-danger">Error: ${error.message}</div>`;
}
    */
}
// Upload backup
async function uploadBackup(event) {
event.preventDefault();
const resultDiv = document.getElementById('upload-result');
resultDiv.innerHTML = '<div class="alert alert-warning"><i class="bi bi-exclamation-triangle me-2"></i>Backup upload API er ikke implementeret endnu</div>';
return;
/* Disabled until API implemented:
const fileInput = document.getElementById('backup-file');
const type = document.getElementById('upload-type').value;
const isMonthly = document.getElementById('upload-monthly').checked;
if (!fileInput.files || fileInput.files.length === 0) {
resultDiv.innerHTML = '<div class="alert alert-danger">Please select a file</div>';
return;
}
const file = fileInput.files[0];
const formData = new FormData();
formData.append('file', file);
resultDiv.innerHTML = `<div class="alert alert-info">
<i class="bi bi-hourglass-split"></i> Uploading ${file.name} (${(file.size / 1024 / 1024).toFixed(2)} MB)...
</div>`;
try {
const response = await fetch(`/api/v1/backups/upload?backup_type=${type}&is_monthly=${isMonthly}`, {
method: 'POST',
body: formData
});
const result = await response.json();
if (response.ok) {
resultDiv.innerHTML = `
<div class="alert alert-success">
<strong>✅ Upload successful!</strong><br>
Job ID: ${result.job_id}<br>
Size: ${result.file_size_mb} MB<br>
Checksum: ${result.checksum.substring(0, 16)}...
</div>
`;
fileInput.value = ''; // Clear file input
setTimeout(() => loadBackups(), 2000);
} else {
resultDiv.innerHTML = `<div class="alert alert-danger">Error: ${result.detail}</div>`;
}
} catch (error) {
resultDiv.innerHTML = `<div class="alert alert-danger">Upload error: ${error.message}</div>`;
}
    */
}
// Show restore modal
function showRestore(jobId, jobType) {
selectedJobId = jobId;
document.getElementById('restore-job-id').textContent = jobId;
document.getElementById('restore-job-type').textContent = jobType;
restoreModal.show();
}
// Confirm restore
async function confirmRestore() {
alert('⚠️ Restore API er ikke implementeret endnu');
return;
/* Disabled until API implemented:
if (!selectedJobId) return;
try {
const response = await fetch(`/api/v1/backups/restore/${selectedJobId}`, {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({confirmation: true})
});
const result = await response.json();
restoreModal.hide();
if (response.ok) {
alert('Restore started! System entering maintenance mode.');
window.location.reload();
} else {
alert('Restore failed: ' + result.detail);
}
} catch (error) {
alert('Restore error: ' + error.message);
}
    */
}
// Upload to offsite
async function uploadOffsite(jobId) {
alert('⚠️ Offsite upload API er ikke implementeret endnu');
return;
/* Disabled until API implemented:
if (!confirm('Upload this backup to offsite storage?')) return;
try {
const response = await fetch(`/api/v1/backups/offsite/${jobId}`, {method: 'POST'});
const result = await response.json();
if (response.ok) {
alert(result.message);
loadBackups();
} else {
alert('Upload failed: ' + result.detail);
}
} catch (error) {
alert('Upload error: ' + error.message);
}
    */
}
// Delete backup
async function deleteBackup(jobId) {
alert('⚠️ Delete backup API er ikke implementeret endnu');
return;
/* Disabled until API implemented:
if (!confirm('Delete this backup? This cannot be undone.')) return;
try {
const response = await fetch(`/api/v1/backups/jobs/${jobId}`, {method: 'DELETE'});
const result = await response.json();
if (response.ok) {
loadBackups();
} else {
alert('Delete failed: ' + result.detail);
}
} catch (error) {
alert('Delete error: ' + error.message);
}
    */
}
// Acknowledge notification
async function acknowledgeNotification(notificationId) {
console.warn('⚠️ Notification API ikke implementeret');
return;
/* Disabled until API implemented:
try {
await fetch(`/api/v1/backups/notifications/${notificationId}/acknowledge`, {method: 'POST'});
loadNotifications();
} catch (error) {
console.error('Acknowledge error:', error);
}
    */
}
// Refresh backups
function refreshBackups() {
loadBackups();
}
// Update stats
function updateStats(backups) {
document.getElementById('total-backups').textContent = backups.length;
document.getElementById('completed-backups').textContent =
backups.filter(b => b.status === 'completed').length;
document.getElementById('pending-offsite').textContent =
backups.filter(b => b.status === 'completed' && !b.offsite_uploaded_at).length;
}
// Utility functions
function formatBytes(bytes) {
if (!bytes) return '-';
const mb = bytes / 1024 / 1024;
return mb > 1024 ? `${(mb / 1024).toFixed(2)} GB` : `${mb.toFixed(2)} MB`;
}
function formatDate(dateStr) {
if (!dateStr) return '-';
const date = new Date(dateStr);
return date.toLocaleString('da-DK', {
year: 'numeric',
month: '2-digit',
day: '2-digit',
hour: '2-digit',
minute: '2-digit'
});
}
function getTypeBadgeClass(type) {
const classes = {
'database': 'bg-primary',
'files': 'bg-info',
'full': 'bg-success'
};
return classes[type] || 'bg-secondary';
}
function getStatusBadge(status) {
const badges = {
'pending': '<span class="badge bg-secondary">Pending</span>',
'running': '<span class="badge bg-warning">Running</span>',
'completed': '<span class="badge bg-success">Completed</span>',
'failed': '<span class="badge bg-danger">Failed</span>'
};
return badges[status] || status;
}
function getOffsiteBadge(backup) {
if (backup.offsite_uploaded_at) {
return '<span class="badge bg-success offsite-badge"><i class="bi bi-cloud-check"></i> Uploaded</span>';
} else if (backup.offsite_retry_count > 0) {
return `<span class="badge bg-warning offsite-badge"><i class="bi bi-exclamation-triangle"></i> Retry ${backup.offsite_retry_count}</span>`;
} else {
return '<span class="badge bg-secondary offsite-badge">Pending</span>';
}
}
// Theme toggle
function toggleTheme() {
const html = document.documentElement;
const icon = document.getElementById('theme-icon');
const currentTheme = html.getAttribute('data-bs-theme');
if (currentTheme === 'light') {
html.setAttribute('data-bs-theme', 'dark');
icon.classList.remove('bi-moon-stars');
icon.classList.add('bi-sun');
localStorage.setItem('theme', 'dark');
} else {
html.setAttribute('data-bs-theme', 'light');
icon.classList.remove('bi-sun');
icon.classList.add('bi-moon-stars');
localStorage.setItem('theme', 'light');
}
}
// Load saved theme
(function() {
const savedTheme = localStorage.getItem('theme') || 'light';
document.documentElement.setAttribute('data-bs-theme', savedTheme);
if (savedTheme === 'dark') {
document.getElementById('theme-icon').classList.replace('bi-moon-stars', 'bi-sun');
}
})();
</script>
</body>
</html>
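The Python diff that follows converts `execute_query_single(...)` calls into `execute_query(..., fetchone=True)`. As a minimal sketch of what such a helper can look like — the function name and `fetchone` flag mirror the diff, but the sqlite3 backing, the `conn` parameter, and the dict-row conversion are illustrative assumptions, not the project's actual `app.core.database` implementation (which uses `%s` placeholders, i.e. a PostgreSQL driver):

```python
import sqlite3

def execute_query(conn, sql, params=(), fetchone=False):
    """Run a query; with fetchone=True return a single row as a dict
    (or None when nothing matches), otherwise a list of row dicts."""
    conn.row_factory = sqlite3.Row  # rows become name-addressable
    cur = conn.execute(sql, params)
    if fetchone:
        row = cur.fetchone()
        return dict(row) if row is not None else None
    return [dict(r) for r in cur.fetchall()]

# Usage mirroring the endpoints below (sqlite uses ? placeholders
# where the real code uses %s):
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vendors (id INTEGER, name TEXT, cvr_number TEXT)")
conn.execute("INSERT INTO vendors VALUES (1, 'ACME ApS', '12345678')")
vendor = execute_query(conn, "SELECT id, name FROM vendors WHERE id = ?", (1,), fetchone=True)
missing = execute_query(conn, "SELECT id FROM vendors WHERE id = ?", (99,), fetchone=True)
```

The `fetchone=True` convention lets one helper serve both single-row lookups (returning `None` when nothing matches, so `if not row:` guards work) and list queries.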


@@ -4,12 +4,11 @@ Backend API for managing supplier invoices that integrate with e-conomic
 """
 from fastapi import APIRouter, HTTPException, UploadFile, File
-from pydantic import BaseModel
 from typing import List, Dict, Optional
 from datetime import datetime, date, timedelta
 from decimal import Decimal
 from pathlib import Path
-from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
+from app.core.database import execute_query, execute_insert, execute_update
 from app.core.config import settings
 from app.services.economic_service import get_economic_service
 from app.services.ollama_service import ollama_service
@@ -277,35 +276,21 @@ async def get_pending_files():
         logger.info(f"📋 Checking invoice2data templates: {len(invoice2data.templates)} loaded")
         for file in files:
-            # Check if there's an invoice2data template for this vendor's CVR or name
+            # Check if there's an invoice2data template for this vendor's CVR
             vendor_cvr = file.get('matched_vendor_cvr_number') or file.get('detected_vendor_cvr') or file.get('vendor_cvr')
-            vendor_name = file.get('vendor_name') or file.get('detected_vendor_name') or file.get('matched_vendor_name')
             file['has_invoice2data_template'] = False
-            logger.debug(f" File {file['file_id']}: CVR={vendor_cvr}, name={vendor_name}")
+            logger.debug(f" File {file['file_id']}: CVR={vendor_cvr}")
-            # Check all templates
-            for template_name, template_data in invoice2data.templates.items():
-                keywords = template_data.get('keywords', [])
-                logger.debug(f" Template {template_name}: keywords={keywords}")
-                # Match by CVR
-                if vendor_cvr and str(vendor_cvr) in [str(k) for k in keywords]:
-                    file['has_invoice2data_template'] = True
-                    file['invoice2data_template_name'] = template_name
-                    logger.info(f" ✅ File {file['file_id']} matched template {template_name} by CVR")
-                    break
-                # Match by vendor name
-                if vendor_name:
-                    for keyword in keywords:
-                        if str(keyword).upper() in str(vendor_name).upper():
-                            file['has_invoice2data_template'] = True
-                            file['invoice2data_template_name'] = template_name
-                            logger.info(f" ✅ File {file['file_id']} matched template {template_name} by name")
-                            break
-                if file['has_invoice2data_template']:
-                    break
+            if vendor_cvr:
+                # Check all templates for this CVR in keywords
+                for template_name, template_data in invoice2data.templates.items():
+                    keywords = template_data.get('keywords', [])
+                    logger.debug(f" Template {template_name}: keywords={keywords}")
+                    if str(vendor_cvr) in [str(k) for k in keywords]:
+                        file['has_invoice2data_template'] = True
+                        file['invoice2data_template_name'] = template_name
+                        logger.info(f" ✅ File {file['file_id']} matched template: {template_name}")
+                        break
     except Exception as e:
         logger.error(f"❌ Failed to check invoice2data templates: {e}", exc_info=True)
@@ -324,7 +309,9 @@ async def get_file_pdf_text(file_id: int):
         # Get file info
         file_info = execute_query(
             "SELECT file_path, filename FROM incoming_files WHERE file_id = %s",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not file_info:
             raise HTTPException(status_code=404, detail="Fil ikke fundet")
@@ -355,17 +342,21 @@ async def get_file_extracted_data(file_id: int):
     """Hent AI-extracted data fra en uploaded fil"""
     try:
         # Get file info
-        file_info = execute_query_single(
+        file_info = execute_query(
             "SELECT * FROM incoming_files WHERE file_id = %s",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not file_info:
             raise HTTPException(status_code=404, detail="Fil ikke fundet")
         # Get extraction results if exists
-        extraction = execute_query_single(
+        extraction = execute_query(
             "SELECT * FROM extractions WHERE file_id = %s ORDER BY created_at DESC LIMIT 1",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         # Parse llm_response_json if it exists (from AI or template extraction)
         llm_json_data = None
@@ -380,7 +371,7 @@ async def get_file_extracted_data(file_id: int):
         # Get extraction lines if exist
         extraction_lines = []
         if extraction:
-            extraction_lines = execute_query_single(
+            extraction_lines = execute_query(
                 """SELECT * FROM extraction_lines
                    WHERE extraction_id = %s
                    ORDER BY line_number""",
@@ -422,23 +413,12 @@ async def get_file_extracted_data(file_id: int):
         # Build llm_data response
         llm_data = None
         if llm_json_data:
-            # Normalize common invoice2data field names to our API schema
-            total_amount_value = llm_json_data.get('total_amount')
-            if total_amount_value is None:
-                total_amount_value = llm_json_data.get('amount_total')
-            invoice_date_value = llm_json_data.get('invoice_date')
-            if invoice_date_value is None:
-                invoice_date_value = llm_json_data.get('document_date')
-            due_date_value = llm_json_data.get('due_date')
             # Use invoice_number from LLM JSON (works for both AI and template extraction)
             llm_data = {
                 "invoice_number": llm_json_data.get('invoice_number'),
-                "invoice_date": invoice_date_value,
-                "due_date": due_date_value,
-                "total_amount": float(total_amount_value) if total_amount_value else None,
+                "invoice_date": llm_json_data.get('invoice_date'),
+                "due_date": llm_json_data.get('due_date'),
+                "total_amount": float(llm_json_data.get('total_amount')) if llm_json_data.get('total_amount') else None,
                 "currency": llm_json_data.get('currency') or 'DKK',
                 "document_type": llm_json_data.get('document_type'),
                 "lines": formatted_lines
@@ -487,7 +467,9 @@ async def download_pending_file(file_id: int):
         # Get file info
         file_info = execute_query(
             "SELECT * FROM incoming_files WHERE file_id = %s",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not file_info:
             raise HTTPException(status_code=404, detail="Fil ikke fundet")
@@ -525,17 +507,21 @@ async def link_vendor_to_extraction(file_id: int, data: dict):
             raise HTTPException(status_code=400, detail="vendor_id is required")
         # Verify vendor exists
-        vendor = execute_query_single(
+        vendor = execute_query(
             "SELECT id, name FROM vendors WHERE id = %s",
-            (vendor_id,))
+            (vendor_id,),
+            fetchone=True
+        )
         if not vendor:
             raise HTTPException(status_code=404, detail="Leverandør ikke fundet")
         # Get latest extraction for this file
-        extraction = execute_query_single(
+        extraction = execute_query(
             "SELECT extraction_id FROM extractions WHERE file_id = %s ORDER BY created_at DESC LIMIT 1",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not extraction:
             raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
@@ -571,19 +557,23 @@ async def delete_pending_file_endpoint(file_id: int):
     try:
         # Get file info
-        file_info = execute_query_single(
+        file_info = execute_query(
             "SELECT * FROM incoming_files WHERE file_id = %s",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not file_info:
             raise HTTPException(status_code=404, detail="Fil ikke fundet")
         # Check if already converted to invoice
-        invoice_exists = execute_query_single(
+        invoice_exists = execute_query(
             """SELECT si.id FROM supplier_invoices si
                JOIN extractions e ON si.extraction_id = e.extraction_id
                WHERE e.file_id = %s""",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if invoice_exists:
             raise HTTPException(
@@ -649,17 +639,21 @@ async def link_vendor_to_extraction(file_id: int, data: dict):
             raise HTTPException(status_code=400, detail="vendor_id er påkrævet")
         # Verify vendor exists
-        vendor = execute_query_single(
+        vendor = execute_query(
             "SELECT id, name FROM vendors WHERE id = %s",
-            (vendor_id,))
+            (vendor_id,),
+            fetchone=True
+        )
         if not vendor:
             raise HTTPException(status_code=404, detail=f"Leverandør {vendor_id} ikke fundet")
         # Get latest extraction for this file
-        extraction = execute_query_single(
+        extraction = execute_query(
             "SELECT extraction_id FROM extractions WHERE file_id = %s ORDER BY created_at DESC LIMIT 1",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not extraction:
             raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
@@ -691,14 +685,16 @@ async def create_invoice_from_extraction(file_id: int):
     """Opret leverandørfaktura fra extraction data"""
     try:
         # Get latest extraction for this file
-        extraction = execute_query_single(
+        extraction = execute_query(
             """SELECT e.*, v.name as vendor_name
                FROM extractions e
                LEFT JOIN vendors v ON v.id = e.vendor_matched_id
                WHERE e.file_id = %s
                ORDER BY e.created_at DESC
                LIMIT 1""",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
         if not extraction:
             raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
@@ -711,15 +707,17 @@ async def create_invoice_from_extraction(file_id: int):
         )
         # Check if invoice already exists
-        existing = execute_query_single(
+        existing = execute_query(
             "SELECT id FROM supplier_invoices WHERE extraction_id = %s",
-            (extraction['extraction_id'],))
+            (extraction['extraction_id'],),
+            fetchone=True
+        )
         if existing:
             raise HTTPException(status_code=400, detail="Faktura er allerede oprettet fra denne extraction")
         # Get extraction lines
-        lines = execute_query_single(
+        lines = execute_query(
             """SELECT * FROM extraction_lines
                WHERE extraction_id = %s
                ORDER BY line_number""",
@@ -868,7 +866,9 @@ async def list_templates():
         if vendor_cvr:
             vendor = execute_query(
                 "SELECT id, name FROM vendors WHERE cvr_number = %s",
-                (vendor_cvr,))
+                (vendor_cvr,),
+                fetchone=True
+            )
             if vendor:
                 vendor_id = vendor['id']
                 vendor_name = vendor['name']
@@ -909,7 +909,7 @@ async def get_template(template_id: int):
            LEFT JOIN vendors v ON t.vendor_id = v.id
            WHERE t.template_id = %s AND t.is_active = true
         """
-        template = execute_query_single(query, (template_id,))
+        template = execute_query(query, (template_id,), fetchone=True)
         if not template:
             raise HTTPException(status_code=404, detail="Template not found")
@@ -943,9 +943,11 @@ async def search_vendor_by_info(request: Dict):
         # Search by CVR first (most accurate)
         if vendor_cvr:
-            vendor = execute_query_single(
+            vendor = execute_query(
                 "SELECT id, name, cvr_number FROM vendors WHERE cvr_number = %s",
-                (vendor_cvr,))
+                (vendor_cvr,),
+                fetchone=True
+            )
             if vendor:
                 return {
                     "found": True,
@@ -956,7 +958,7 @@
         # Search by name (fuzzy)
         if vendor_name:
-            vendors = execute_query_single(
+            vendors = execute_query(
                 "SELECT id, name, cvr_number FROM vendors WHERE LOWER(name) LIKE LOWER(%s) LIMIT 5",
                 (f"%{vendor_name}%",)
             )
@@ -1150,13 +1152,15 @@ async def get_supplier_invoice(invoice_id: int):
                FROM supplier_invoices si
                LEFT JOIN vendors v ON si.vendor_id = v.id
                WHERE si.id = %s""",
-            (invoice_id,))
+            (invoice_id,),
+            fetchone=True
+        )
         if not invoice:
             raise HTTPException(status_code=404, detail=f"Invoice {invoice_id} not found")
         # Get lines
-        lines = execute_query_single(
+        lines = execute_query(
             "SELECT * FROM supplier_invoice_lines WHERE supplier_invoice_id = %s ORDER BY line_number",
             (invoice_id,)
         )
@@ -1283,7 +1287,9 @@ async def update_supplier_invoice(invoice_id: int, data: Dict):
         # Check if invoice exists
         existing = execute_query(
             "SELECT id, status FROM supplier_invoices WHERE id = %s",
-            (invoice_id,))
+            (invoice_id,),
+            fetchone=True
+        )
         if not existing:
             raise HTTPException(status_code=404, detail=f"Invoice {invoice_id} not found")
@@ -1336,9 +1342,11 @@
 async def delete_supplier_invoice(invoice_id: int):
     """Delete supplier invoice (soft delete if integrated with e-conomic)"""
     try:
-        invoice = execute_query_single(
+        invoice = execute_query(
             "SELECT id, invoice_number, economic_voucher_number FROM supplier_invoices WHERE id = %s",
-            (invoice_id,))
+            (invoice_id,),
+            fetchone=True
+        )
         if not invoice:
             raise HTTPException(status_code=404, detail=f"Invoice {invoice_id} not found")
@@ -1369,33 +1377,32 @@ async def delete_supplier_invoice(invoice_id: int):
 # ========== E-CONOMIC INTEGRATION ==========
-class ApproveRequest(BaseModel):
-    approved_by: str
 @router.post("/supplier-invoices/{invoice_id}/approve")
-async def approve_supplier_invoice(invoice_id: int, request: ApproveRequest):
+async def approve_supplier_invoice(invoice_id: int, approved_by: str):
     """Approve supplier invoice for payment"""
     try:
-        invoice = execute_query_single(
+        invoice = execute_query(
             "SELECT id, invoice_number, status FROM supplier_invoices WHERE id = %s",
-            (invoice_id,))
+            (invoice_id,),
+            fetchone=True
+        )
         if not invoice:
-            raise HTTPException(status_code=404, detail=f"Faktura {invoice_id} ikke fundet")
+            raise HTTPException(status_code=404, detail=f"Invoice {invoice_id} not found")
         if invoice['status'] != 'pending':
-            raise HTTPException(status_code=400, detail=f"Faktura har allerede status '{invoice['status']}' - kan kun godkende fakturaer med status 'pending'")
+            raise HTTPException(status_code=400, detail=f"Invoice is already {invoice['status']}")
         execute_update(
             """UPDATE supplier_invoices
                SET status = 'approved', approved_by = %s, approved_at = CURRENT_TIMESTAMP
                WHERE id = %s""",
-            (request.approved_by, invoice_id)
+            (approved_by, invoice_id)
         )
-        logger.info(f"✅ Approved supplier invoice {invoice['invoice_number']} by {request.approved_by}")
-        return {"success": True, "invoice_id": invoice_id, "approved_by": request.approved_by}
+        logger.info(f"✅ Approved supplier invoice {invoice['invoice_number']} by {approved_by}")
+        return {"success": True, "invoice_id": invoice_id, "approved_by": approved_by}
     except HTTPException:
         raise
@@ -1412,12 +1419,14 @@ async def send_to_economic(invoice_id: int):
     """
     try:
         # Get invoice with lines
-        invoice = execute_query_single(
+        invoice = execute_query(
             """SELECT si.*, v.economic_supplier_number as vendor_economic_id, v.name as vendor_full_name
                FROM supplier_invoices si
                LEFT JOIN vendors v ON si.vendor_id = v.id
                WHERE si.id = %s""",
-            (invoice_id,))
+            (invoice_id,),
+            fetchone=True
+        )
         if not invoice:
             raise HTTPException(status_code=404, detail=f"Invoice {invoice_id} not found")
@@ -1429,7 +1438,7 @@ async def send_to_economic(invoice_id: int):
             raise HTTPException(status_code=400, detail="Invoice already sent to e-conomic")
         # Get lines
-        lines = execute_query_single(
+        lines = execute_query(
             "SELECT * FROM supplier_invoice_lines WHERE supplier_invoice_id = %s ORDER BY line_number",
             (invoice_id,)
         )
@@ -1467,7 +1476,9 @@
         # Get default journal number from settings
         journal_setting = execute_query(
-            "SELECT setting_value FROM supplier_invoice_settings WHERE setting_key = 'economic_default_journal'")
+            "SELECT setting_value FROM supplier_invoice_settings WHERE setting_key = 'economic_default_journal'",
+            fetchone=True
+        )
         journal_number = int(journal_setting['setting_value']) if journal_setting else 1
         # Build VAT breakdown from lines
@@ -1594,7 +1605,7 @@ async def get_payment_overview():
     try:
         today = date.today().isoformat()
-        stats = execute_query_single("""
+        stats = execute_query("""
            SELECT
                COUNT(*) as total_count,
                SUM(CASE WHEN paid_date IS NOT NULL THEN 1 ELSE 0 END) as paid_count,
@@ -1607,7 +1618,7 @@
                SUM(CASE WHEN paid_date IS NULL AND due_date < %s THEN total_amount ELSE 0 END) as overdue_amount
            FROM supplier_invoices
            WHERE status != 'cancelled'
-        """, (today, today, today, today, today))
+        """, (today, today, today, today, today), fetchone=True)
         return {
             "total_invoices": stats.get('total_count', 0) if stats else 0,
@@ -1630,7 +1641,7 @@
 async def get_stats_by_vendor():
     """Get supplier invoice statistics grouped by vendor"""
     try:
-        stats = execute_query_single("""
+        stats = execute_query("""
            SELECT
                v.id as vendor_id,
                v.name as vendor_name,
@@ -1722,18 +1733,22 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
         # Check for duplicate file
         existing_file = execute_query(
             "SELECT file_id, status FROM incoming_files WHERE checksum = %s",
-            (checksum,))
+            (checksum,),
+            fetchone=True
+        )
         if existing_file:
             temp_path.unlink(missing_ok=True)
             logger.warning(f"⚠️ Duplicate file detected: {checksum[:16]}...")
             # Get existing invoice if linked
-            existing_invoice = execute_query_single(
+            existing_invoice = execute_query(
                 """SELECT si.* FROM supplier_invoices si
                    JOIN extractions e ON si.extraction_id = e.extraction_id
                    WHERE e.file_id = %s""",
-                (existing_file['file_id'],))
+                (existing_file['file_id'],),
+                fetchone=True
+            )
             return {
                 "status": "duplicate",
@@ -1753,12 +1768,14 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
         logger.info(f"💾 Saved file as: {final_path.name}")
         # Insert file record
-        file_record = execute_query_single(
+        file_record = execute_query(
             """INSERT INTO incoming_files
                (filename, original_filename, file_path, file_size, mime_type, checksum, status)
                VALUES (%s, %s, %s, %s, %s, %s, 'processing') RETURNING file_id""",
             (final_path.name, file.filename, str(final_path), total_size,
-             ollama_service._get_mime_type(final_path), checksum))
+             ollama_service._get_mime_type(final_path), checksum),
+            fetchone=True
+        )
         file_id = file_record['file_id']
         # Extract text from file
@@ -1797,14 +1814,16 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
        logger.info(f"🔍 Checking for duplicate invoice number: {document_number}")
        # Check 1: Search in local database (supplier_invoices table)
-        existing_invoice = execute_query_single(
+        existing_invoice = execute_query(
            """SELECT si.id, si.invoice_number, si.created_at, v.name as vendor_name
               FROM supplier_invoices si
               LEFT JOIN vendors v ON v.id = si.vendor_id
               WHERE si.invoice_number = %s
               ORDER BY si.created_at DESC
               LIMIT 1""",
-            (document_number,))
+            (document_number,),
+            fetchone=True
+        )
        if existing_invoice:
            # DUPLICATE FOUND IN DATABASE
@@ -2007,9 +2026,11 @@ async def reprocess_uploaded_file(file_id: int):
    try:
        # Get file record
-        file_record = execute_query_single(
+        file_record = execute_query(
            "SELECT * FROM incoming_files WHERE file_id = %s",
-            (file_id,))
+            (file_id,),
+            fetchone=True
+        )
        if not file_record:
            raise HTTPException(status_code=404, detail=f"Fil {file_id} ikke fundet")
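The change repeated across these hunks swaps a dedicated single-row helper (`execute_query_single(...)`) for a `fetchone=True` flag on `execute_query`. A minimal sketch of what such a consolidated helper could look like, assuming psycopg2-style cursors; the signature is illustrative, not taken from the repository:

```python
def execute_query(cursor, query, params=None, fetchone=False):
    """Run a query and return dict rows: one row (or None) when fetchone=True, else all rows."""
    cursor.execute(query, params or ())
    columns = [desc[0] for desc in cursor.description]
    if fetchone:
        row = cursor.fetchone()
        return dict(zip(columns, row)) if row else None
    return [dict(zip(columns, row)) for row in cursor.fetchall()]
```

Returning `None` for an empty single-row fetch keeps call sites such as `if not file_record:` working unchanged.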
@@ -2037,54 +2058,15 @@ async def reprocess_uploaded_file(file_id: int):
        is_invoice2data = (template_id == -1)
        if is_invoice2data:
-            def _to_numeric(value):
-                if value is None:
-                    return None
-                if isinstance(value, (int, float, Decimal)):
-                    return float(value)
-                if not isinstance(value, str):
-                    return None
-                cleaned = value.strip().replace(' ', '')
-                if not cleaned:
-                    return None
-                # Common Danish formatting: 25.000,00 or 1.530,00
-                if ',' in cleaned:
-                    cleaned = cleaned.replace('.', '').replace(',', '.')
-                try:
-                    return float(cleaned)
-                except ValueError:
-                    return None
-            def _clean_document_id(value):
-                if value is None:
-                    return None
-                if isinstance(value, str):
-                    cleaned = value.strip()
-                    return cleaned if cleaned and cleaned.lower() != 'none' else None
-                return str(value)
            # Invoice2data doesn't have vendor in cache
            logger.info(f"📋 Using invoice2data template")
            # Try to find vendor from extracted CVR
            if extracted_fields.get('vendor_vat'):
-                vendor = execute_query_single(
+                vendor = execute_query(
                    "SELECT id FROM vendors WHERE cvr_number = %s",
-                    (extracted_fields['vendor_vat'],))
-                if vendor:
-                    vendor_id = vendor['id']
-            # Fallback: use vendor detected during quick analysis (incoming_files.detected_vendor_id)
-            if vendor_id is None:
-                vendor_id = file_record.get('detected_vendor_id')
-            # Fallback: match by issuer name
-            if vendor_id is None and extracted_fields.get('issuer'):
-                vendor = execute_query_single(
-                    "SELECT id FROM vendors WHERE name ILIKE %s ORDER BY id LIMIT 1",
-                    (extracted_fields['issuer'],))
+                    (extracted_fields['vendor_vat'],),
+                    fetchone=True
+                )
                if vendor:
                    vendor_id = vendor['id']
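The `_to_numeric` helper being dropped in this hunk normalizes Danish amount formatting, where `.` is the thousands separator and `,` the decimal mark. The same logic as a standalone, runnable sketch:

```python
from decimal import Decimal

def to_numeric(value):
    """Parse Danish-formatted amounts such as '25.000,00' into floats."""
    if value is None:
        return None
    if isinstance(value, (int, float, Decimal)):
        return float(value)
    if not isinstance(value, str):
        return None
    cleaned = value.strip().replace(' ', '')
    if not cleaned:
        return None
    # '.' is a thousands separator, ',' the decimal mark: 25.000,00 -> 25000.00
    if ',' in cleaned:
        cleaned = cleaned.replace('.', '').replace(',', '.')
    try:
        return float(cleaned)
    except ValueError:
        return None
```

Note the inherent ambiguity the helper accepts: a comma-less string like `'25.000'` parses as `25.0`, so it relies on invoices always printing the decimal comma.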
@@ -2099,12 +2081,12 @@ async def reprocess_uploaded_file(file_id: int):
                (file_id, vendor_id,
                 extracted_fields.get('issuer'),      # vendor_name
                 extracted_fields.get('vendor_vat'),  # vendor_cvr
-                 _clean_document_id(extracted_fields.get('invoice_number')),  # document_id
+                 str(extracted_fields.get('invoice_number')),  # document_id
                 extracted_fields.get('invoice_date'),  # document_date
                 extracted_fields.get('due_date'),
                 'invoice',  # document_type
                 'invoice',  # document_type_detected
-                 _to_numeric(extracted_fields.get('amount_total') if extracted_fields.get('amount_total') is not None else extracted_fields.get('total_amount')),
+                 extracted_fields.get('amount_total'),
                 extracted_fields.get('currency', 'DKK'),
                 1.0,  # invoice2data always 100% confidence
                 json.dumps(extracted_fields),  # llm_response_json
@@ -2114,9 +2096,6 @@ async def reprocess_uploaded_file(file_id: int):
            # Insert line items if extracted
            if extracted_fields.get('lines'):
                for idx, line in enumerate(extracted_fields['lines'], start=1):
-                    line_total = _to_numeric(line.get('line_total'))
-                    unit_price = _to_numeric(line.get('unit_price'))
-                    quantity = _to_numeric(line.get('quantity'))
                    execute_insert(
                        """INSERT INTO extraction_lines
                           (extraction_id, line_number, description, quantity, unit_price,
@@ -2125,8 +2104,8 @@ async def reprocess_uploaded_file(file_id: int):
                           VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
                           RETURNING line_id""",
                        (extraction_id, idx, line.get('description'),
-                         quantity, unit_price,
-                         line_total, None, None, 1.0,
+                         line.get('quantity'), line.get('unit_price'),
+                         line.get('line_total'), None, None, 1.0,
                         line.get('ip_address'), line.get('contract_number'),
                         line.get('location_street'), line.get('location_zip'), line.get('location_city'))
                    )
@@ -2158,14 +2137,8 @@ async def reprocess_uploaded_file(file_id: int):
            logger.info(f"🤖 Calling Ollama for AI extraction...")
            llm_result = await ollama_service.extract_from_text(text)
-            # Handle both dict and string error responses
-            if not llm_result or isinstance(llm_result, str) or (isinstance(llm_result, dict) and 'error' in llm_result):
-                if isinstance(llm_result, dict):
-                    error_msg = llm_result.get('error', 'AI extraction fejlede')
-                elif isinstance(llm_result, str):
-                    error_msg = llm_result  # Error message returned as string
-                else:
-                    error_msg = 'AI extraction fejlede'
+            if not llm_result or 'error' in llm_result:
+                error_msg = llm_result.get('error') if llm_result else 'AI extraction fejlede'
                logger.error(f"❌ AI extraction failed: {error_msg}")
                execute_update(
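The branch removed here defends against three failure shapes from the extraction service: a dict carrying an `'error'` key, a bare error string, and an empty result. That defensive step can be isolated as a small helper (a sketch; the function name is hypothetical):

```python
def llm_error_message(llm_result):
    """Return an error message when the LLM result is unusable, else None."""
    if isinstance(llm_result, str):
        return llm_result  # error message returned as a plain string
    if isinstance(llm_result, dict):
        return llm_result.get('error')  # None means extraction succeeded
    if not llm_result:
        return 'AI extraction fejlede'  # nothing came back at all
    return None
```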
@@ -2225,14 +2198,6 @@ async def reprocess_uploaded_file(file_id: int):
        logger.info(f"✅ AI extraction completed for file {file_id}")
        # Return success with template data or AI extraction result
-        # Determine confidence value safely
-        if template_id:
-            final_confidence = confidence
-        elif 'llm_result' in locals() and isinstance(llm_result, dict):
-            final_confidence = llm_result.get('confidence', 0.75)
-        else:
-            final_confidence = 0.0
        result = {
            "status": "success",
            "file_id": file_id,
@@ -2240,16 +2205,18 @@ async def reprocess_uploaded_file(file_id: int):
            "template_matched": template_id is not None,
            "template_id": template_id,
            "vendor_id": vendor_id,
-            "confidence": final_confidence,
+            "confidence": confidence if template_id else llm_result.get('confidence', 0.75),
            "extracted_fields": extracted_fields,
            "pdf_text": text[:1000] if not template_id else text
        }
        # Add warning if no template exists
        if not template_id and vendor_id:
-            vendor = execute_query_single(
+            vendor = execute_query(
                "SELECT name FROM vendors WHERE id = %s",
-                (vendor_id,))
+                (vendor_id,),
+                fetchone=True
+            )
            if vendor:
                result["warning"] = f"⚠️ Ingen template fundet for {vendor['name']} - brugte AI extraction (langsommere)"
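The deleted `final_confidence` block guards against `llm_result` never having been assigned, or not being a dict, which the simpler ternary on the `+` side does not. The same fallback logic as a standalone sketch (names illustrative):

```python
def pick_confidence(template_id, llm_result, template_confidence):
    """Choose a confidence score without assuming llm_result exists or is a dict."""
    if template_id:
        return template_confidence  # a template match carries its own score
    if isinstance(llm_result, dict):
        return llm_result.get('confidence', 0.75)  # AI extraction default
    return 0.0  # no usable extraction result
```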
@@ -515,7 +515,7 @@
    <!-- Manual Entry Modal -->
    <div class="modal fade" id="manualEntryModal" tabindex="-1">
-        <div class="modal-dialog" style="max-width: 95%; width: 1800px;">
+        <div class="modal-dialog modal-xl">
            <div class="modal-content">
                <div class="modal-header">
                    <h5 class="modal-title"><i class="bi bi-pencil-square me-2"></i>Manuel Indtastning af Faktura</h5>
@@ -525,28 +525,16 @@
                <input type="hidden" id="manualEntryFileId">
                <div class="row">
-                    <!-- Left: PDF Text -->
-                    <div class="col-md-4">
-                        <h6 class="mb-3">
-                            PDF Dokument
-                            <button type="button" class="btn btn-sm btn-outline-secondary float-end" onclick="togglePdfView()">
-                                <i class="bi bi-file-earmark-text" id="pdfViewIcon"></i> Vis Original
-                            </button>
-                        </h6>
-                        <!-- PDF Text View (default, selectable) -->
-                        <div id="pdfTextView" style="border: 1px solid #ddd; border-radius: 4px; height: 700px; overflow: auto; background: white; padding: 15px; font-family: monospace; font-size: 0.85rem; line-height: 1.4; user-select: text;">
-                            <div class="text-muted text-center py-5">Indlæser PDF tekst...</div>
-                        </div>
-                        <!-- PDF Original View (hidden by default) -->
-                        <iframe id="manualEntryPdfViewer" type="application/pdf" width="100%" height="700px" style="border: 1px solid #ddd; border-radius: 4px; display: none;"></iframe>
-                        <div class="alert alert-info mt-2 py-2 px-3" style="font-size: 0.85rem;">
-                            💡 <strong>Tip:</strong> Markér tekst og klik på et felt for at indsætte
-                        </div>
+                    <!-- Left: PDF Viewer -->
+                    <div class="col-md-6">
+                        <h6 class="mb-3">PDF Dokument</h6>
+                        <div style="border: 1px solid #ddd; border-radius: 4px; height: 600px; overflow: hidden;">
+                            <iframe id="manualEntryPdfViewer" type="application/pdf" width="100%" height="100%" style="border: none;"></iframe>
+                        </div>
                    </div>
                    <!-- Right: Form -->
-                    <div class="col-md-8">
+                    <div class="col-md-6">
                        <h6 class="mb-3">Faktura Detaljer</h6>
                        <form id="manualEntryForm">
                            <!-- Vendor Selection -->
@@ -562,13 +550,13 @@
                            <!-- Invoice Details -->
                            <div class="row mb-3">
-                                <div class="col-md-8">
-                                    <label class="form-label fw-bold">Fakturanummer *</label>
-                                    <input type="text" class="form-control form-control-lg manual-entry-field" id="manualInvoiceNumber" required>
+                                <div class="col-md-6">
+                                    <label class="form-label">Fakturanummer *</label>
+                                    <input type="text" class="form-control" id="manualInvoiceNumber" required>
                                </div>
-                                <div class="col-md-4">
-                                    <label class="form-label fw-bold">Type</label>
-                                    <select class="form-select form-select-lg" id="manualInvoiceType">
+                                <div class="col-md-6">
+                                    <label class="form-label">Type</label>
+                                    <select class="form-select" id="manualInvoiceType">
                                        <option value="invoice">Faktura</option>
                                        <option value="credit_note">Kreditnota</option>
                                    </select>
@@ -577,23 +565,23 @@
                            <div class="row mb-3">
                                <div class="col-md-6">
-                                    <label class="form-label fw-bold">Fakturadato *</label>
-                                    <input type="date" class="form-control form-control-lg manual-entry-field" id="manualInvoiceDate" required>
+                                    <label class="form-label">Fakturadato *</label>
+                                    <input type="date" class="form-control" id="manualInvoiceDate" required>
                                </div>
                                <div class="col-md-6">
-                                    <label class="form-label fw-bold">Forfaldsdato</label>
-                                    <input type="date" class="form-control form-control-lg manual-entry-field" id="manualDueDate">
+                                    <label class="form-label">Forfaldsdato</label>
+                                    <input type="date" class="form-control" id="manualDueDate">
                                </div>
                            </div>
                            <div class="row mb-3">
-                                <div class="col-md-8">
-                                    <label class="form-label fw-bold">Total Beløb *</label>
-                                    <input type="number" step="0.01" class="form-control form-control-lg manual-entry-field" id="manualTotalAmount" required>
+                                <div class="col-md-6">
+                                    <label class="form-label">Total Beløb *</label>
+                                    <input type="number" step="0.01" class="form-control" id="manualTotalAmount" required>
                                </div>
-                                <div class="col-md-4">
-                                    <label class="form-label fw-bold">Valuta</label>
-                                    <select class="form-select form-select-lg" id="manualCurrency">
+                                <div class="col-md-6">
+                                    <label class="form-label">Valuta</label>
+                                    <select class="form-select" id="manualCurrency">
                                        <option value="DKK">DKK</option>
                                        <option value="EUR">EUR</option>
                                        <option value="USD">USD</option>
@@ -724,102 +712,41 @@
let currentInvoiceId = null;
let currentFilter = 'all';
let allInvoices = [];
-let lastSelectedText = '';
-let lastFocusedField = null;

// Load data on page load
document.addEventListener('DOMContentLoaded', () => {
    loadStats();
    loadInvoices();
    loadVendors();
-    setupManualEntryTextSelection();
    setDefaultDates();
    loadPendingFilesCount();  // Load count for badge
    checkEmailContext();      // Check if coming from email
});

// Check if coming from email context
-async function checkEmailContext() {
+function checkEmailContext() {
    const emailContext = sessionStorage.getItem('supplierInvoiceContext');
    if (emailContext) {
        try {
            const context = JSON.parse(emailContext);
-            console.log('📧 Processing email context:', context);
            // Show notification
-            console.log('📧 Behandler faktura fra email:', context.subject);
-            // Process attachments if any
-            if (context.attachments && context.attachments.length > 0) {
-                console.log(`📎 Found ${context.attachments.length} attachments`);
-                // Find PDF attachments
-                const pdfAttachments = context.attachments.filter(att =>
-                    att.filename && att.filename.toLowerCase().endsWith('.pdf')
-                );
-                if (pdfAttachments.length > 0) {
-                    console.log(`📄 Processing ${pdfAttachments.length} PDF attachments`);
-                    for (const attachment of pdfAttachments) {
-                        try {
-                            // Download attachment and upload to supplier invoices
-                            console.log(`⬇️ Downloading attachment: ${attachment.filename}`);
-                            const attachmentResponse = await fetch(`/api/v1/emails/${context.emailId}/attachments/${attachment.id}`);
-                            if (!attachmentResponse.ok) {
-                                console.error(`Failed to download attachment ${attachment.id}`);
-                                continue;
-                            }
-                            const blob = await attachmentResponse.blob();
-                            const file = new File([blob], attachment.filename, { type: 'application/pdf' });
-                            // Upload to supplier invoices
-                            console.log(`⬆️ Uploading to supplier invoices: ${attachment.filename}`);
-                            const formData = new FormData();
-                            formData.append('file', file);
-                            const uploadResponse = await fetch('/api/v1/supplier-invoices/upload', {
-                                method: 'POST',
-                                body: formData
-                            });
-                            if (uploadResponse.ok) {
-                                const result = await uploadResponse.json();
-                                console.log('✅ Upload successful:', result);
-                                alert(`✅ Faktura ${attachment.filename} uploadet og behandlet`);
-                            } else {
-                                const errorData = await uploadResponse.json();
-                                console.error('Upload failed:', errorData);
-                                alert(`❌ Kunne ikke uploade ${attachment.filename}: ${errorData.detail || 'Ukendt fejl'}`);
-                            }
-                        } catch (error) {
-                            console.error(`Error processing attachment ${attachment.filename}:`, error);
-                            alert(`❌ Fejl ved behandling af ${attachment.filename}`);
-                        }
-                    }
-                    // Reload pending files list after uploads
-                    setTimeout(() => {
-                        loadPendingFiles();
-                    }, 1000);
-                } else {
-                    alert('⚠️ Ingen PDF vedhæftninger fundet i emailen');
-                }
-            } else {
-                alert('⚠️ Emailen har ingen vedhæftninger');
-            }
+            showSuccess(`Opret faktura fra email: ${context.subject}`);
+            // Pre-fill description field with email subject
+            const descriptionField = document.getElementById('description');
+            if (descriptionField) {
+                descriptionField.value = `Fra email: ${context.subject}\nAfsender: ${context.sender}`;
+            }
+            // Open create modal if exists
+            const createModal = new bootstrap.Modal(document.getElementById('invoiceModal'));
+            createModal.show();
            // Clear context after use
            sessionStorage.removeItem('supplierInvoiceContext');
        } catch (error) {
-            console.error('Failed to process email context:', error);
-            alert('❌ Kunne ikke behandle email vedhæftninger');
-            sessionStorage.removeItem('supplierInvoiceContext');
+            console.error('Failed to parse email context:', error);
        }
    }
}
@@ -1240,10 +1167,6 @@ async function reviewExtractedData(fileId) {
        .replace(/^dk/i, '')
        .trim();
-    // Check for validation warnings
-    const hasValidationIssues = aiData?._validation_warning || aiData?._vat_warning;
-    const allValidationsPassed = !hasValidationIssues && aiData && aiData.lines && aiData.lines.length > 0;
    // Build modal content
    let modalContent = `
        <div class="alert alert-info">
@@ -1256,19 +1179,6 @@ async function reviewExtractedData(fileId) {
            }
        </div>
-        ${hasValidationIssues ? `
-        <div class="alert alert-warning">
-            <h6 class="alert-heading"><i class="bi bi-exclamation-triangle me-2"></i>Beløbs-validering</h6>
-            ${aiData._validation_warning ? `<p class="mb-1">⚠️ ${aiData._validation_warning}</p>` : ''}
-            ${aiData._vat_warning ? `<p class="mb-0">⚠️ ${aiData._vat_warning}</p>` : ''}
-        </div>
-        ` : allValidationsPassed ? `
-        <div class="alert alert-success">
-            <h6 class="alert-heading"><i class="bi bi-check-circle me-2"></i>Beløbs-validering</h6>
-            <p class="mb-0">✅ Varelinjer summer korrekt til subtotal<br>✅ Moms beregning er korrekt (25%)</p>
-        </div>
-        ` : ''}
        <h5>Udtrukne Data:</h5>
        <table class="table table-sm">
            <tr><th>Felt</th><th>Værdi</th></tr>
@@ -1303,7 +1213,7 @@ async function reviewExtractedData(fileId) {
        ${data.pdf_text_preview ? `
        <h6 class="mt-3">PDF Tekst Preview:</h6>
        <div class="border rounded p-3 bg-light" style="max-height: 500px; overflow-y: auto;">
-            <pre class="mb-0" style="font-size: 0.85rem; white-space: pre-wrap; word-wrap: break-word; font-family: monospace; line-height: 1.3;">${escapeHtml(data.pdf_text_preview)}</pre>
+            <pre class="mb-0" style="font-size: 0.85rem; white-space: pre-wrap; word-wrap: break-word;">${data.pdf_text_preview}</pre>
        </div>
        ` : ''}
    `;
@@ -1714,22 +1624,8 @@ async function openManualEntryMode() {
    // Wait a bit for modal to render
    await new Promise(resolve => setTimeout(resolve, 300));
-    // Load PDF text after modal is open
-    console.log('Loading PDF text...');
-    try {
-        const pdfResponse = await fetch(`/api/v1/supplier-invoices/files/${fileId}/pdf-text`);
-        if (pdfResponse.ok) {
-            const pdfData = await pdfResponse.json();
-            document.getElementById('pdfTextView').innerHTML = `<pre style="margin: 0; white-space: pre-wrap; word-wrap: break-word;">${escapeHtml(pdfData.pdf_text)}</pre>`;
-        } else {
-            document.getElementById('pdfTextView').innerHTML = '<div class="text-danger">Kunne ikke indlæse PDF tekst</div>';
-        }
-    } catch (e) {
-        console.error('Error loading PDF text:', e);
-        document.getElementById('pdfTextView').innerHTML = '<div class="text-danger">Fejl ved indlæsning af PDF</div>';
-    }
-    // Also set iframe src for original view toggle
+    // Load PDF after modal is open
+    console.log('Loading PDF...');
    document.getElementById('manualEntryPdfViewer').src = `/api/v1/supplier-invoices/files/${fileId}/pdf`;

    // Load vendors into dropdown
@@ -1863,93 +1759,6 @@ async function openManualEntryMode() {
    }
}
-// Setup text selection feature for manual entry
-function setupManualEntryTextSelection() {
-    // Track selection anywhere in document
-    document.addEventListener('mouseup', (e) => {
-        const selection = window.getSelection();
-        const selectedText = selection.toString().trim();
-        if (selectedText && selectedText.length > 0) {
-            // Check if selection is from PDF text view
-            const pdfTextView = document.getElementById('pdfTextView');
-            if (pdfTextView && pdfTextView.contains(selection.anchorNode)) {
-                lastSelectedText = selectedText;
-                console.log('✅ Selected from PDF:', selectedText);
-                // Visual feedback
-                const tip = document.querySelector('.alert-info');
-                if (tip && tip.closest('#manualEntryModal')) {
-                    tip.classList.remove('alert-info');
-                    tip.classList.add('alert-success');
-                    tip.innerHTML = `✅ <strong>"${selectedText.substring(0, 50)}${selectedText.length > 50 ? '...' : ''}"</strong> markeret - klik på et felt for at indsætte`;
-                    setTimeout(() => {
-                        tip.classList.remove('alert-success');
-                        tip.classList.add('alert-info');
-                        tip.innerHTML = '💡 <strong>Tip:</strong> Markér tekst og klik på et felt for at indsætte';
-                    }, 3000);
-                }
-            }
-        }
-    });
-    // Insert selected text on field click/focus
-    document.addEventListener('click', (e) => {
-        // Check if clicked element is an input/textarea in the modal
-        if (lastSelectedText && e.target.closest('#manualEntryModal')) {
-            if ((e.target.tagName === 'INPUT' &&
-                 e.target.type !== 'button' &&
-                 e.target.type !== 'submit' &&
-                 e.target.type !== 'checkbox' &&
-                 e.target.type !== 'radio') ||
-                e.target.tagName === 'TEXTAREA') {
-                console.log('🎯 Clicked field:', e.target.id || e.target.name, 'Current value:', e.target.value);
-                console.log('📝 Will insert:', lastSelectedText);
-                // Only insert if field is empty or user confirms
-                if (!e.target.value || e.target.value.trim() === '') {
-                    e.target.value = lastSelectedText;
-                    console.log('✅ Auto-inserted into', e.target.id || e.target.name);
-                    e.target.focus();
-                    // Clear selection
-                    lastSelectedText = '';
-                    window.getSelection().removeAllRanges();
-                } else {
-                    // Ask to replace
-                    if (confirm(`Erstat "${e.target.value}" med "${lastSelectedText}"?`)) {
-                        e.target.value = lastSelectedText;
-                        console.log('✅ Replaced content in', e.target.id || e.target.name);
-                        // Clear selection
-                        lastSelectedText = '';
-                        window.getSelection().removeAllRanges();
-                    }
-                }
-            }
-        }
-    }, true);  // Use capture phase
-}
-// Toggle between PDF text view and original PDF
-function togglePdfView() {
-    const textView = document.getElementById('pdfTextView');
-    const pdfView = document.getElementById('manualEntryPdfViewer');
-    const icon = document.getElementById('pdfViewIcon');
-    if (textView.style.display === 'none') {
-        // Show text view
-        textView.style.display = 'block';
-        pdfView.style.display = 'none';
-        icon.className = 'bi bi-file-earmark-text';
-    } else {
-        // Show PDF view
-        textView.style.display = 'none';
-        pdfView.style.display = 'block';
-        icon.className = 'bi bi-file-earmark-pdf';
-    }
-}

async function loadVendorsForManual() {
    try {
        console.log('Fetching vendors from API...');
@@ -340,13 +340,10 @@ async function runTest() {
    }
    try {
-        // Load PDF text from dedicated endpoint (not reprocess)
-        const fileResponse = await fetch(`/api/v1/supplier-invoices/files/${fileId}/pdf-text`);
-        if (!fileResponse.ok) {
-            throw new Error(`Kunne ikke læse PDF: ${fileResponse.status} ${fileResponse.statusText}`);
-        }
+        // Load PDF text
+        const fileResponse = await fetch(`/api/v1/supplier-invoices/reprocess/${fileId}`, {
+            method: 'POST'
+        });
        const fileData = await fileResponse.json();
        const pdfText = fileData.pdf_text;
@@ -399,8 +396,6 @@ async function runTest() {
        let linesHtml = '';
        const lineItems = result.line_items || [];
        if (lineItems.length > 0) {
-            const hasLineTotal = lineItems.some(l => (l.line_total !== undefined && l.line_total !== null && l.line_total !== '') || (l.lineTotal !== undefined && l.lineTotal !== null && l.lineTotal !== ''));
            linesHtml = `
                <h6 class="mt-3">Varelinjer (${lineItems.length} stk):</h6>
                <div class="table-responsive">
@@ -411,7 +406,6 @@ async function runTest() {
                        ${lineItems[0].description ? '<th>Beskrivelse</th>' : ''}
                        ${lineItems[0].quantity ? '<th>Antal</th>' : ''}
                        ${lineItems[0].unit_price ? '<th>Pris</th>' : ''}
-                        ${hasLineTotal ? '<th>Beløb</th>' : ''}
                        ${lineItems.some(l => l.circuit_id || l.ip_address) ? '<th>Kredsløb/IP</th>' : ''}
                        ${lineItems.some(l => l.location_street) ? '<th>Adresse</th>' : ''}
                    </tr>
@@ -421,14 +415,12 @@ async function runTest() {
            lineItems.forEach((line, idx) => {
                const locationText = [line.location_street, line.location_zip, line.location_city].filter(x => x).join(' ');
                const circuitText = line.circuit_id || line.ip_address || '';
-                const lineTotal = (line.line_total !== undefined && line.line_total !== null) ? line.line_total : line.lineTotal;
                linesHtml += `<tr>
                    <td>${idx + 1}</td>
                    ${line.description ? `<td>${line.description}</td>` : ''}
                    ${line.quantity ? `<td>${line.quantity}</td>` : ''}
                    ${line.unit_price ? `<td>${line.unit_price}</td>` : ''}
-                    ${hasLineTotal ? `<td>${(lineTotal !== undefined && lineTotal !== null) ? lineTotal : ''}</td>` : ''}
                    ${lineItems.some(l => l.circuit_id || l.ip_address) ? `<td><small>${circuitText}</small></td>` : ''}
                    ${lineItems.some(l => l.location_street) ? `<td><small>${locationText}</small></td>` : ''}
                </tr>`;
@@ -6,6 +6,7 @@ Handles contact CRUD operations with multi-company support
from fastapi import APIRouter, HTTPException, Query
from typing import Optional, List
from app.core.database import execute_query, execute_insert, execute_update
+from app.models.schemas import Contact, ContactCreate, ContactUpdate, ContactCompanyLink, CompanyInfo
import logging

logger = logging.getLogger(__name__)
@@ -48,7 +49,7 @@ async def get_contacts(
            FROM contacts c
            {where_sql}
        """
-        count_result = execute_query_single(count_query, tuple(params))
+        count_result = execute_query(count_query, tuple(params), fetchone=True)
        total = count_result['count'] if count_result else 0

        # Get contacts with company count
@@ -70,7 +71,7 @@ async def get_contacts(
        """
        params.extend([limit, offset])
-        contacts = execute_query_single(query, tuple(params))  # Default is fetchall
+        contacts = execute_query(query, tuple(params))  # Default is fetchall

        return {
            "contacts": contacts or [],
@@ -98,7 +99,7 @@ async def get_contact(contact_id: int):
            FROM contacts
            WHERE id = %s
        """
-        contact = execute_query(contact_query, (contact_id,))
+        contact = execute_query(contact_query, (contact_id,), fetchone=True)
        if not contact:
            raise HTTPException(status_code=404, detail="Contact not found")
@@ -113,7 +114,7 @@ async def get_contact(contact_id: int):
            WHERE cc.contact_id = %s
            ORDER BY cc.is_primary DESC, cu.name
        """
-        companies = execute_query_single(companies_query, (contact_id,))  # Default is fetchall
+        companies = execute_query(companies_query, (contact_id,))  # Default is fetchall
        contact['companies'] = companies or []
        return contact
@@ -125,12 +126,11 @@ async def get_contact(contact_id: int):
        raise HTTPException(status_code=500, detail=str(e))

-# POST/PUT/DELETE endpoints temporarily disabled - need proper models
-# @router.post("/contacts", response_model=dict)
-# async def create_contact(contact: ContactCreate):
-#     """
-#     Create a new contact and link to companies
-#     """
+@router.post("/contacts", response_model=dict)
+async def create_contact(contact: ContactCreate):
+    """
+    Create a new contact and link to companies
+    """
    try:
        # Insert contact
        insert_query = """
@@ -171,7 +171,7 @@ async def update_contact(contact_id: int, contact: ContactUpdate):
    """
    try:
        # Check if contact exists
-        existing = execute_query("SELECT id FROM contacts WHERE id = %s", (contact_id,))
+        existing = execute_query("SELECT id FROM contacts WHERE id = %s", (contact_id,), fetchone=True)
        if not existing:
            raise HTTPException(status_code=404, detail="Contact not found")
@@ -258,12 +258,12 @@ async def link_contact_to_company(contact_id: int, link: ContactCompanyLink):
    """
    try:
        # Check if contact exists
-        contact = execute_query_single("SELECT id FROM contacts WHERE id = %s", (contact_id,))
+        contact = execute_query("SELECT id FROM contacts WHERE id = %s", (contact_id,), fetchone=True)
        if not contact:
            raise HTTPException(status_code=404, detail="Contact not found")

        # Check if company exists
-        customer = execute_query_single("SELECT id FROM customers WHERE id = %s", (link.customer_id,))
+        customer = execute_query("SELECT id FROM customers WHERE id = %s", (link.customer_id,), fetchone=True)
        if not customer:
            raise HTTPException(status_code=404, detail="Customer not found")
@@ -1,91 +0,0 @@
"""
Contact API Router - Simplified (Read-Only)
Only GET endpoints for now
"""
from fastapi import APIRouter, HTTPException, Query
from typing import Optional
from app.core.database import execute_query
import logging
logger = logging.getLogger(__name__)
router = APIRouter()
@router.get("/contacts")
async def get_contacts(
search: Optional[str] = None,
customer_id: Optional[int] = None,
is_active: Optional[bool] = None,
limit: int = Query(default=100, le=1000),
offset: int = Query(default=0, ge=0)
):
"""Get all contacts with optional filtering"""
try:
where_clauses = []
params = []
if search:
where_clauses.append("(first_name ILIKE %s OR last_name ILIKE %s OR email ILIKE %s)")
params.extend([f"%{search}%", f"%{search}%", f"%{search}%"])
if is_active is not None:
where_clauses.append("is_active = %s")
params.append(is_active)
where_sql = "WHERE " + " AND ".join(where_clauses) if where_clauses else ""
# Count total
count_query = f"SELECT COUNT(*) as count FROM contacts {where_sql}"
count_result = execute_query(count_query, tuple(params))
total = count_result[0]['count'] if count_result else 0
# Get contacts
query = f"""
SELECT
id, first_name, last_name, email, phone, mobile,
title, department, is_active, created_at, updated_at
FROM contacts
{where_sql}
ORDER BY first_name, last_name
LIMIT %s OFFSET %s
"""
params.extend([limit, offset])
contacts = execute_query(query, tuple(params))
return {
"total": total,
"contacts": contacts,
"limit": limit,
"offset": offset
}
except Exception as e:
logger.error(f"Failed to get contacts: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/contacts/{contact_id}")
async def get_contact(contact_id: int):
"""Get a single contact by ID"""
try:
query = """
SELECT
id, first_name, last_name, email, phone, mobile,
title, department, is_active, user_company,
created_at, updated_at
FROM contacts
WHERE id = %s
"""
contacts = execute_query(query, (contact_id,))
if not contacts:
raise HTTPException(status_code=404, detail="Contact not found")
return contacts[0]
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to get contact {contact_id}: {e}")
raise HTTPException(status_code=500, detail=str(e))

View File

@@ -47,9 +47,11 @@ async def get_current_user(
        # Get additional user details from database
        from app.core.database import execute_query
-        user_details = execute_query_single(
+        user_details = execute_query(
            "SELECT email, full_name FROM users WHERE id = %s",
-            (user_id,))
+            (user_id,),
+            fetchone=True
+        )
        return {
            "id": user_id,

View File

@@ -85,9 +85,11 @@ class AuthService:
        # Check if token is revoked
        jti = payload.get('jti')
        if jti:
-            session = execute_query_single(
+            session = execute_query(
                "SELECT revoked FROM sessions WHERE token_jti = %s",
-                (jti,))
+                (jti,),
+                fetchone=True
+            )
            if session and session.get('revoked'):
                logger.warning(f"⚠️ Revoked token used: {jti[:10]}...")
                return None
@@ -115,12 +117,14 @@ class AuthService:
            User dict if successful, None otherwise
        """
        # Get user
-        user = execute_query_single(
+        user = execute_query(
            """SELECT id, username, email, password_hash, full_name,
                      is_active, is_superadmin, failed_login_attempts, locked_until
               FROM users
               WHERE username = %s OR email = %s""",
-            (username, username))
+            (username, username),
+            fetchone=True
+        )
        if not user:
            logger.warning(f"❌ Login failed: User not found - {username}")
@@ -209,13 +213,15 @@ class AuthService:
            List of permission codes
        """
        # Check if user is superadmin first
-        user = execute_query_single(
+        user = execute_query(
            "SELECT is_superadmin FROM users WHERE id = %s",
-            (user_id,))
+            (user_id,),
+            fetchone=True
+        )
        # Superadmins have all permissions
        if user and user['is_superadmin']:
-            all_perms = execute_query_single("SELECT code FROM permissions")
+            all_perms = execute_query("SELECT code FROM permissions")
            return [p['code'] for p in all_perms] if all_perms else []
        # Get permissions through groups
@@ -244,19 +250,21 @@ class AuthService:
        # Superadmins have all permissions
        user = execute_query(
            "SELECT is_superadmin FROM users WHERE id = %s",
-            (user_id,))
+            (user_id,),
+            fetchone=True
+        )
        if user and user['is_superadmin']:
            return True
        # Check if user has permission through groups
-        result = execute_query_single("""
+        result = execute_query("""
            SELECT COUNT(*) as cnt
            FROM permissions p
            JOIN group_permissions gp ON p.id = gp.permission_id
            JOIN user_groups ug ON gp.group_id = ug.group_id
            WHERE ug.user_id = %s AND p.code = %s
-        """, (user_id, permission_code))
+        """, (user_id, permission_code), fetchone=True)
        return bool(result and result['cnt'] > 0)
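The permission lookup above reduces to a count over a three-table join. As a rough, self-contained sketch of that logic (using SQLite in place of PostgreSQL, with made-up seed data; the real service runs the same join through `execute_query`):

```python
import sqlite3

# Minimal schema mirroring the join in the permission check above
# (SQLite stand-in for PostgreSQL, hypothetical seed data)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE permissions (id INTEGER PRIMARY KEY, code TEXT);
    CREATE TABLE group_permissions (group_id INTEGER, permission_id INTEGER);
    CREATE TABLE user_groups (user_id INTEGER, group_id INTEGER);
    INSERT INTO permissions VALUES (1, 'customers.read');
    INSERT INTO group_permissions VALUES (10, 1);
    INSERT INTO user_groups VALUES (42, 10);
""")

def has_permission(user_id: int, permission_code: str) -> bool:
    # Same join as the service method, with SQLite-style ? placeholders
    row = conn.execute("""
        SELECT COUNT(*) AS cnt
        FROM permissions p
        JOIN group_permissions gp ON p.id = gp.permission_id
        JOIN user_groups ug ON gp.group_id = ug.group_id
        WHERE ug.user_id = ? AND p.code = ?
    """, (user_id, permission_code)).fetchone()
    return bool(row and row[0] > 0)

print(has_permission(42, 'customers.read'))   # True
print(has_permission(42, 'customers.write'))  # False
```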

View File

@@ -33,41 +33,107 @@ class Settings(BaseSettings):
    ECONOMIC_READ_ONLY: bool = True
    ECONOMIC_DRY_RUN: bool = True

-    # Ollama LLM
-    OLLAMA_ENDPOINT: str = "http://localhost:11434"
-    OLLAMA_MODEL: str = "llama3.2:3b"
-
-    # vTiger Cloud Integration
+    # vTiger CRM Integration
    VTIGER_URL: str = ""
    VTIGER_USERNAME: str = ""
    VTIGER_API_KEY: str = ""
+    VTIGER_PASSWORD: str = ""  # Fallback if the API key does not work

-    # Time Tracking Module Settings
-    TIMETRACKING_DEFAULT_HOURLY_RATE: float = 1200.00
-    TIMETRACKING_AUTO_ROUND: bool = True
-    TIMETRACKING_ROUND_INCREMENT: float = 0.5
-    TIMETRACKING_ROUND_METHOD: str = "up"  # "up", "down", "nearest"
-
-    # Time Tracking Module Safety Flags
-    TIMETRACKING_VTIGER_READ_ONLY: bool = True
-    TIMETRACKING_VTIGER_DRY_RUN: bool = True
-    TIMETRACKING_ECONOMIC_READ_ONLY: bool = True
-    TIMETRACKING_ECONOMIC_DRY_RUN: bool = True
-    TIMETRACKING_EXPORT_TYPE: str = "draft"  # "draft" or "booked"
-
-    # Simply-CRM (Old vTiger On-Premise)
-    OLD_VTIGER_URL: str = ""
-    OLD_VTIGER_USERNAME: str = ""
-    OLD_VTIGER_API_KEY: str = ""
-
-    # Simply-CRM (Separate System)
-    SIMPLYCRM_URL: str = ""
-    SIMPLYCRM_USERNAME: str = ""
-    SIMPLYCRM_API_KEY: str = ""
+    # Simply-CRM Integration (legacy system with CVR data)
+    OLD_VTIGER_URL: str = "https://bmcnetworks.simply-crm.dk"
+    OLD_VTIGER_USERNAME: str = "ct"
+    OLD_VTIGER_ACCESS_KEY: str = ""
+
+    # Time Tracking Module - vTiger integration (isolated)
+    TIMETRACKING_VTIGER_READ_ONLY: bool = True  # 🚨 SAFETY: block ALL writes to vTiger
+    TIMETRACKING_VTIGER_DRY_RUN: bool = True    # 🚨 SAFETY: log without syncing
+
+    # Time Tracking Module - e-conomic integration (isolated)
+    TIMETRACKING_ECONOMIC_READ_ONLY: bool = True  # 🚨 SAFETY: block ALL writes to e-conomic
+    TIMETRACKING_ECONOMIC_DRY_RUN: bool = True    # 🚨 SAFETY: log without exporting
+    TIMETRACKING_EXPORT_TYPE: str = "draft"       # draft|booked (draft is safest)
+
+    # Time Tracking Module - business logic
+    TIMETRACKING_DEFAULT_HOURLY_RATE: float = 850.00  # DKK per hour (fallback)
+    TIMETRACKING_AUTO_ROUND: bool = True              # Auto-round to the nearest 0.5 hour
+    TIMETRACKING_ROUND_INCREMENT: float = 0.5         # Rounding increment (0.25, 0.5, 1.0)
+    TIMETRACKING_ROUND_METHOD: str = "up"             # up, nearest, down
+    TIMETRACKING_REQUIRE_APPROVAL: bool = True        # Require manual approval (no auto-approve)
+
+    # Ollama AI Integration
+    OLLAMA_ENDPOINT: str = "http://ai_direct.cs.blaahund.dk"
+    OLLAMA_MODEL: str = "qwen2.5-coder:7b"  # qwen2.5-coder works better for JSON extraction
+
+    # Email System Configuration
+    EMAIL_TO_TICKET_ENABLED: bool = False  # 🚨 SAFETY: disable auto-processing until configured
+
+    # Email Fetching (IMAP)
+    USE_GRAPH_API: bool = False  # Use Microsoft Graph API instead of IMAP (preferred)
+    IMAP_SERVER: str = "outlook.office365.com"
+    IMAP_PORT: int = 993
+    IMAP_USE_SSL: bool = True
+    IMAP_USERNAME: str = ""
+    IMAP_PASSWORD: str = ""
+    IMAP_FOLDER: str = "INBOX"
+    IMAP_READ_ONLY: bool = True  # 🚨 SAFETY: never mark emails as read or modify the mailbox
+
+    # Microsoft Graph API (OAuth2)
+    GRAPH_TENANT_ID: str = ""
+    GRAPH_CLIENT_ID: str = ""
+    GRAPH_CLIENT_SECRET: str = ""
+    GRAPH_USER_EMAIL: str = ""  # Email account to monitor
+
+    # Email Processing
+    EMAIL_PROCESS_INTERVAL_MINUTES: int = 5  # Background job frequency
+    EMAIL_MAX_FETCH_PER_RUN: int = 50        # Limit emails per processing cycle
+    EMAIL_RETENTION_DAYS: int = 90           # Days to keep emails before soft delete
+
+    # Email Classification (AI)
+    EMAIL_AI_ENABLED: bool = True
+    EMAIL_AI_CONFIDENCE_THRESHOLD: float = 0.7  # Minimum confidence for auto-processing
+    EMAIL_AUTO_CLASSIFY: bool = True            # Run AI classification on new emails
+
+    # Email Rules Engine
+    EMAIL_RULES_ENABLED: bool = True
+    EMAIL_RULES_AUTO_PROCESS: bool = False  # 🚨 SAFETY: require manual approval initially
+
+    # Company Info
+    OWN_CVR: str = "29522790"  # BMC Denmark ApS - ignore when detecting vendors
+
+    # File Upload
+    UPLOAD_DIR: str = "uploads"
+    MAX_FILE_SIZE_MB: int = 50
+    ALLOWED_EXTENSIONS: List[str] = [".pdf", ".png", ".jpg", ".jpeg", ".txt", ".csv"]
+
+    # Module System Configuration
+    MODULES_ENABLED: bool = True      # Enable/disable the entire module system
+    MODULES_DIR: str = "app/modules"  # Directory for dynamic modules
+    MODULES_AUTO_RELOAD: bool = True  # Hot-reload modules on changes (dev only)

    class Config:
        env_file = ".env"
        case_sensitive = True
+        extra = "ignore"  # Ignore extra fields from .env

settings = Settings()
def get_module_config(module_name: str, key: str, default=None):
    """
    Fetch module-specific configuration from an environment variable

    Pattern: MODULES__{MODULE_NAME}__{KEY}
    Example: MODULES__MY_MODULE__API_KEY

    Args:
        module_name: Name of the module (e.g. "my_module")
        key: Config key (e.g. "API_KEY")
        default: Default value if not set

    Returns:
        Configuration value or default
    """
    import os
    env_key = f"MODULES__{module_name.upper()}__{key.upper()}"
    return os.getenv(env_key, default)
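The `MODULES__{NAME}__{KEY}` lookup above can be exercised directly through `os.environ`. A minimal, dependency-free sketch of the same helper (module and key names are illustrative):

```python
import os

def get_module_config(module_name: str, key: str, default=None):
    # Same lookup pattern as in config.py: MODULES__{MODULE_NAME}__{KEY}
    env_key = f"MODULES__{module_name.upper()}__{key.upper()}"
    return os.getenv(env_key, default)

# Hypothetical module/key names, set here only for demonstration
os.environ["MODULES__MY_MODULE__API_KEY"] = "secret123"

print(get_module_config("my_module", "api_key"))        # secret123
print(get_module_config("my_module", "timeout", "30"))  # 30 (falls back to default)
```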

View File

@@ -55,21 +55,37 @@ def get_db():
        release_db_connection(conn)

-def execute_query(query: str, params: tuple = None, fetch: bool = True):
-    """Execute a SQL query and return results"""
+def execute_query(query: str, params: Optional[tuple] = None, fetchone: bool = False):
+    """
+    Execute a SQL query and return results
+
+    Args:
+        query: SQL query string
+        params: Query parameters tuple
+        fetchone: If True, return single row dict, otherwise list of dicts
+
+    Returns:
+        Single dict if fetchone=True, otherwise list of dicts
+    """
    conn = get_db_connection()
    try:
        with conn.cursor(cursor_factory=RealDictCursor) as cursor:
-            cursor.execute(query, params)
-            # Auto-detect write operations and commit
+            cursor.execute(query, params or ())
+            # Check if this is a write operation (INSERT, UPDATE, DELETE)
            query_upper = query.strip().upper()
-            if query_upper.startswith(('INSERT', 'UPDATE', 'DELETE')):
-                conn.commit()
-            if fetch:
-                return cursor.fetchall()
-            return cursor.rowcount
+            is_write = any(query_upper.startswith(cmd) for cmd in ['INSERT', 'UPDATE', 'DELETE'])
+
+            if fetchone:
+                row = cursor.fetchone()
+                if is_write:
+                    conn.commit()
+                return dict(row) if row else None
+            else:
+                rows = cursor.fetchall()
+                if is_write:
+                    conn.commit()
+                return [dict(row) for row in rows]
    except Exception as e:
        conn.rollback()
        logger.error(f"Query error: {e}")
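The refactored helper's return contract — list of dicts by default, a single dict or `None` with `fetchone=True` — can be illustrated with a SQLite stand-in. Table and column names here are illustrative, not the real schema, and the stub skips the pooling and commit logic:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # rows behave like mappings, similar to RealDictCursor
conn.executescript("""
    CREATE TABLE contacts (id INTEGER PRIMARY KEY, first_name TEXT);
    INSERT INTO contacts VALUES (1, 'Anna'), (2, 'Bo');
""")

def execute_query(query, params=(), fetchone=False):
    # Mirrors the contract above: dict-or-None for fetchone, list of dicts otherwise
    cur = conn.execute(query, params)
    if fetchone:
        row = cur.fetchone()
        return dict(row) if row else None
    return [dict(r) for r in cur.fetchall()]

print(execute_query("SELECT * FROM contacts"))
# [{'id': 1, 'first_name': 'Anna'}, {'id': 2, 'first_name': 'Bo'}]
print(execute_query("SELECT * FROM contacts WHERE id = ?", (1,), fetchone=True))
# {'id': 1, 'first_name': 'Anna'}
print(execute_query("SELECT * FROM contacts WHERE id = ?", (9,), fetchone=True))
# None
```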
@@ -78,15 +94,35 @@ def execute_query(query: str, params: Optional[tuple] = None, fetchone: bool = False):
        release_db_connection(conn)

-def execute_insert(query: str, params: tuple = None):
-    """Execute INSERT query and return new ID"""
+def execute_insert(query: str, params: tuple = ()) -> Optional[int]:
+    """
+    Execute an INSERT query and return last row id
+
+    Args:
+        query: SQL INSERT query (will add RETURNING id if not present)
+        params: Query parameters tuple
+
+    Returns:
+        Last inserted row ID or None
+    """
    conn = get_db_connection()
    try:
        with conn.cursor(cursor_factory=RealDictCursor) as cursor:
+            # PostgreSQL requires RETURNING clause
+            if "RETURNING" not in query.upper():
+                query = query.rstrip(";") + " RETURNING id"
            cursor.execute(query, params)
-            conn.commit()
            result = cursor.fetchone()
-            return result['id'] if result and 'id' in result else None
+            conn.commit()
+
+            # If result exists, return the first column value (typically ID)
+            if result:
+                # If it's a dict, get first value
+                if isinstance(result, dict):
+                    return list(result.values())[0]
+                # If it's a tuple/list, get first element
+                return result[0]
+            return None
    except Exception as e:
        conn.rollback()
        logger.error(f"Insert error: {e}")
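The INSERT path above appends a `RETURNING` clause when one is missing, since PostgreSQL needs it to hand back the new row's id. That string rewrite is easy to isolate; a sketch of the same logic on its own:

```python
def ensure_returning(query: str) -> str:
    # Append RETURNING id unless the query already has a RETURNING clause,
    # stripping a trailing semicolon first (same rewrite as execute_insert above)
    if "RETURNING" not in query.upper():
        query = query.rstrip(";") + " RETURNING id"
    return query

print(ensure_returning("INSERT INTO customers (name) VALUES (%s);"))
# INSERT INTO customers (name) VALUES (%s) RETURNING id
print(ensure_returning("INSERT INTO t (a) VALUES (%s) RETURNING id"))
# INSERT INTO t (a) VALUES (%s) RETURNING id  (unchanged)
```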
@@ -95,14 +131,24 @@ def execute_insert(query: str, params: tuple = ()) -> Optional[int]:
        release_db_connection(conn)

-def execute_update(query: str, params: tuple = None):
-    """Execute UPDATE/DELETE query and return affected rows"""
+def execute_update(query: str, params: tuple = ()) -> int:
+    """
+    Execute an UPDATE/DELETE query and return affected rows
+
+    Args:
+        query: SQL UPDATE/DELETE query
+        params: Query parameters tuple
+
+    Returns:
+        Number of affected rows
+    """
    conn = get_db_connection()
    try:
        with conn.cursor(cursor_factory=RealDictCursor) as cursor:
            cursor.execute(query, params)
+            rowcount = cursor.rowcount
            conn.commit()
-            return cursor.rowcount
+            return rowcount
    except Exception as e:
        conn.rollback()
        logger.error(f"Update error: {e}")
@@ -111,7 +157,66 @@ def execute_update(query: str, params: tuple = ()) -> int:
        release_db_connection(conn)

-def execute_query_single(query: str, params: tuple = None):
-    """Execute query and return single row (backwards compatibility for fetchone=True)"""
-    result = execute_query(query, params)
-    return result[0] if result and len(result) > 0 else None
+def execute_module_migration(module_name: str, migration_sql: str) -> bool:
+    """
+    Run a migration for a specific module
+
+    Args:
+        module_name: Name of the module
+        migration_sql: SQL migration code
+
+    Returns:
+        True on success, False on failure
+    """
conn = get_db_connection()
try:
with conn.cursor(cursor_factory=RealDictCursor) as cursor:
# Ensure the module_migrations table exists
cursor.execute("""
CREATE TABLE IF NOT EXISTS module_migrations (
id SERIAL PRIMARY KEY,
module_name VARCHAR(100) NOT NULL,
migration_name VARCHAR(255) NOT NULL,
applied_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
success BOOLEAN DEFAULT TRUE,
error_message TEXT,
UNIQUE(module_name, migration_name)
)
""")
# Run the migration
cursor.execute(migration_sql)
conn.commit()
logger.info(f"✅ Migration for {module_name} success")
return True
except Exception as e:
conn.rollback()
logger.error(f"❌ Migration failed for {module_name}: {e}")
return False
finally:
release_db_connection(conn)
def check_module_table_exists(table_name: str) -> bool:
"""
    Check whether a module table exists
    Args:
        table_name: Table name (e.g. "my_module_customers")
    Returns:
        True if the table exists
"""
query = """
SELECT EXISTS (
SELECT FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = %s
)
"""
result = execute_query(query, (table_name,), fetchone=True)
if result and isinstance(result, dict):
return result.get('exists', False)
return False
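`check_module_table_exists` queries PostgreSQL's `information_schema.tables`. The same existence check can be sketched against SQLite's `sqlite_master` catalog — a deliberate stand-in for illustration only, since SQLite has no `information_schema`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_module_customers (id INTEGER PRIMARY KEY)")

def table_exists(table_name: str) -> bool:
    # SQLite analogue of the information_schema.tables lookup above
    row = conn.execute(
        "SELECT EXISTS (SELECT 1 FROM sqlite_master WHERE type = 'table' AND name = ?)",
        (table_name,),
    ).fetchone()
    return bool(row[0])

print(table_exists("my_module_customers"))  # True
print(table_exists("missing_table"))        # False
```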

View File

@@ -9,7 +9,7 @@ from typing import List, Optional, Dict
from pydantic import BaseModel
import logging

-from app.core.database import execute_query, execute_query_single
+from app.core.database import execute_query, execute_insert, execute_update
from app.services.cvr_service import get_cvr_service

logger = logging.getLogger(__name__)
@@ -82,24 +82,13 @@ async def list_customers(
        source: Filter by source ('vtiger' or 'local')
        is_active: Filter by active status
    """
-    # Build query with primary contact info
+    # Build query
    query = """
        SELECT
            c.*,
-            COUNT(DISTINCT cc.contact_id) as contact_count,
-            CONCAT(pc.first_name, ' ', pc.last_name) as contact_name,
-            pc.email as contact_email,
-            COALESCE(pc.mobile, pc.phone) as contact_phone
+            COUNT(DISTINCT cc.contact_id) as contact_count
        FROM customers c
        LEFT JOIN contact_companies cc ON cc.customer_id = c.id
-        LEFT JOIN LATERAL (
-            SELECT con.first_name, con.last_name, con.email, con.phone, con.mobile
-            FROM contact_companies ccomp
-            JOIN contacts con ON ccomp.contact_id = con.id
-            WHERE ccomp.customer_id = c.id
-            ORDER BY ccomp.is_primary DESC, con.id ASC
-            LIMIT 1
-        ) pc ON true
        WHERE 1=1
    """
    params = []
@@ -128,7 +117,7 @@ async def list_customers(
        params.append(is_active)

    query += """
-        GROUP BY c.id, pc.first_name, pc.last_name, pc.email, pc.phone, pc.mobile
+        GROUP BY c.id
        ORDER BY c.name
        LIMIT %s OFFSET %s
    """
@@ -159,7 +148,7 @@ async def list_customers(
        count_query += " AND is_active = %s"
        count_params.append(is_active)

-    count_result = execute_query_single(count_query, tuple(count_params))
+    count_result = execute_query(count_query, tuple(count_params), fetchone=True)
    total = count_result['total'] if count_result else 0

    return {
@@ -174,17 +163,21 @@ async def list_customers(
async def get_customer(customer_id: int):
    """Get single customer by ID with contact count and vTiger BMC Låst status"""
    # Get customer
-    customer = execute_query_single(
+    customer = execute_query(
        "SELECT * FROM customers WHERE id = %s",
-        (customer_id,))
+        (customer_id,),
+        fetchone=True
+    )
    if not customer:
        raise HTTPException(status_code=404, detail="Customer not found")
    # Get contact count
-    contact_count_result = execute_query_single(
+    contact_count_result = execute_query(
        "SELECT COUNT(*) as count FROM contact_companies WHERE customer_id = %s",
-        (customer_id,))
+        (customer_id,),
+        fetchone=True
+    )
    contact_count = contact_count_result['count'] if contact_count_result else 0
@@ -237,9 +230,11 @@ async def create_customer(customer: CustomerCreate):
        logger.info(f"✅ Created customer {customer_id}: {customer.name}")
        # Fetch and return created customer
-        created = execute_query_single(
+        created = execute_query(
            "SELECT * FROM customers WHERE id = %s",
-            (customer_id,))
+            (customer_id,),
+            fetchone=True
+        )
        return created
    except Exception as e:
@@ -251,9 +246,11 @@ async def create_customer(customer: CustomerCreate):
async def update_customer(customer_id: int, update: CustomerUpdate):
    """Update customer information"""
    # Verify customer exists
-    existing = execute_query_single(
+    existing = execute_query(
        "SELECT id FROM customers WHERE id = %s",
-        (customer_id,))
+        (customer_id,),
+        fetchone=True
+    )
    if not existing:
        raise HTTPException(status_code=404, detail="Customer not found")
@@ -278,9 +275,11 @@ async def update_customer(customer_id: int, update: CustomerUpdate):
        logger.info(f"✅ Updated customer {customer_id}")
        # Fetch and return updated customer
-        updated = execute_query_single(
+        updated = execute_query(
            "SELECT * FROM customers WHERE id = %s",
-            (customer_id,))
+            (customer_id,),
+            fetchone=True
+        )
        return updated
    except Exception as e:
@@ -295,9 +294,11 @@ async def lock_customer_subscriptions(customer_id: int, lock_request: dict):
    locked = lock_request.get('locked', False)
    # Get customer
-    customer = execute_query_single(
+    customer = execute_query(
        "SELECT id, name FROM customers WHERE id = %s",
-        (customer_id,))
+        (customer_id,),
+        fetchone=True
+    )
    if not customer:
        raise HTTPException(status_code=404, detail="Customer not found")
@@ -326,7 +327,7 @@ async def lock_customer_subscriptions(customer_id: int, lock_request: dict):
@router.get("/customers/{customer_id}/contacts")
async def get_customer_contacts(customer_id: int):
    """Get all contacts for a specific customer"""
-    rows = execute_query_single("""
+    rows = execute_query("""
        SELECT
            c.*,
            cc.is_primary,
@@ -347,7 +348,9 @@ async def create_customer_contact(customer_id: int, contact: ContactCreate):
    # Verify customer exists
    customer = execute_query(
        "SELECT id FROM customers WHERE id = %s",
-        (customer_id,))
+        (customer_id,),
+        fetchone=True
+    )
    if not customer:
        raise HTTPException(status_code=404, detail="Customer not found")
@@ -380,9 +383,11 @@ async def create_customer_contact(customer_id: int, contact: ContactCreate):
        logger.info(f"✅ Created contact {contact_id} for customer {customer_id}")
        # Fetch and return created contact
-        created = execute_query_single(
+        created = execute_query(
            "SELECT * FROM contacts WHERE id = %s",
-            (contact_id,))
+            (contact_id,),
+            fetchone=True
+        )
        return created
    except Exception as e:
@@ -415,9 +420,11 @@ async def get_customer_subscriptions(customer_id: int):
    from app.services.vtiger_service import get_vtiger_service
    # Get customer with vTiger ID
-    customer = execute_query_single(
+    customer = execute_query(
        "SELECT id, name, vtiger_id FROM customers WHERE id = %s",
-        (customer_id,))
+        (customer_id,),
+        fetchone=True
+    )
    if not customer:
        raise HTTPException(status_code=404, detail="Customer not found")
@@ -499,29 +506,20 @@ async def get_customer_subscriptions(customer_id: int):
                # Note: Simply-CRM returns one row per line item, so we need to group them
                query = f"SELECT * FROM SalesOrder WHERE account_id='{simplycrm_account_id}';"
                all_simplycrm_orders = await simplycrm.query(query)
-                logger.info(f"🔍 Simply-CRM raw query returned {len(all_simplycrm_orders or [])} orders for account {simplycrm_account_id}")
                # Group line items by order ID
                # Filter: Only include orders with recurring_frequency (otherwise not subscription)
                orders_dict = {}
-                filtered_closed = 0
-                filtered_no_freq = 0
                for row in (all_simplycrm_orders or []):
                    status = row.get('sostatus', '').lower()
                    if status in ['closed', 'cancelled']:
-                        filtered_closed += 1
-                        logger.debug(f" ⏭️ Skipping closed order: {row.get('subject', 'N/A')} ({status})")
                        continue
                    # MUST have recurring_frequency to be a subscription
                    recurring_frequency = row.get('recurring_frequency', '').strip()
                    if not recurring_frequency:
-                        filtered_no_freq += 1
-                        logger.debug(f" ⏭️ Skipping order without frequency: {row.get('subject', 'N/A')}")
                        continue
-                    logger.info(f" ✅ Including order: {row.get('subject', 'N/A')} - {recurring_frequency} ({status})")
                    order_id = row.get('id')
                    if order_id not in orders_dict:
                        # First occurrence - create order object
@@ -550,7 +548,7 @@ async def get_customer_subscriptions(customer_id: int):
                        })
                simplycrm_sales_orders = list(orders_dict.values())
-                logger.info(f"📥 Found {len(simplycrm_sales_orders)} unique recurring orders in Simply-CRM (filtered out: {filtered_closed} closed, {filtered_no_freq} without frequency)")
+                logger.info(f"📥 Found {len(simplycrm_sales_orders)} unique open sales orders in Simply-CRM")
            else:
                logger.info(f" No Simply-CRM account found for '{customer_name}'")
        except Exception as e:
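Since Simply-CRM returns one row per line item, rows sharing an order id get folded into a single order, and closed orders or orders without a `recurring_frequency` are dropped. A stripped-down sketch of that grouping (field names follow the snippet above; the sample rows are made up):

```python
# One row per line item, as returned by the Simply-CRM SalesOrder query
rows = [
    {"id": "6x101", "subject": "Hosting", "sostatus": "Approved",
     "recurring_frequency": "Monthly", "productname": "Web hosting"},
    {"id": "6x101", "subject": "Hosting", "sostatus": "Approved",
     "recurring_frequency": "Monthly", "productname": "Backup"},
    {"id": "6x102", "subject": "One-off", "sostatus": "Approved",
     "recurring_frequency": "", "productname": "Setup"},
    {"id": "6x103", "subject": "Old", "sostatus": "Closed",
     "recurring_frequency": "Yearly", "productname": "License"},
]

orders_dict = {}
for row in rows:
    # Skip closed/cancelled orders and anything without a recurring frequency
    if row["sostatus"].lower() in ("closed", "cancelled"):
        continue
    if not row["recurring_frequency"].strip():
        continue
    # First occurrence creates the order; later rows just add line items
    order = orders_dict.setdefault(row["id"], {"subject": row["subject"], "lineItems": []})
    order["lineItems"].append(row["productname"])

print(orders_dict)
# {'6x101': {'subject': 'Hosting', 'lineItems': ['Web hosting', 'Backup']}}
```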
@@ -610,7 +608,9 @@ async def create_subscription(customer_id: int, subscription: SubscriptionCreate
        # Get customer's vTiger ID
        customer = execute_query(
            "SELECT vtiger_id FROM customers WHERE id = %s",
-            (customer_id,))
+            (customer_id,),
+            fetchone=True
+        )
        if not customer or not customer.get('vtiger_id'):
            raise HTTPException(status_code=404, detail="Customer not linked to vTiger")
@@ -686,9 +686,11 @@ async def delete_subscription(subscription_id: str, customer_id: int = None):
    try:
        # Check if subscriptions are locked for this customer (if customer_id provided)
        if customer_id:
-            customer = execute_query_single(
+            customer = execute_query(
                "SELECT subscriptions_locked FROM customers WHERE id = %s",
-                (customer_id,))
+                (customer_id,),
+                fetchone=True
+            )
            if customer and customer.get('subscriptions_locked'):
                raise HTTPException(
                    status_code=403,
@@ -713,85 +715,3 @@ async def delete_subscription(subscription_id: str, customer_id: int = None):
    except Exception as e:
        logger.error(f"❌ Error deleting subscription: {e}")
        raise HTTPException(status_code=500, detail=str(e))
# Subscription Internal Comment Endpoints
class SubscriptionComment(BaseModel):
comment: str
@router.post("/customers/{customer_id}/subscription-comment")
async def save_subscription_comment(customer_id: int, data: SubscriptionComment):
"""Save internal comment about customer subscriptions"""
try:
# Check if customer exists
customer = execute_query_single(
"SELECT id FROM customers WHERE id = %s",
(customer_id,)
)
if not customer:
raise HTTPException(status_code=404, detail="Customer not found")
# Delete existing comment if any and insert new one in a single query
result = execute_query(
"""
WITH deleted AS (
DELETE FROM customer_notes
WHERE customer_id = %s AND note_type = 'subscription_comment'
)
INSERT INTO customer_notes (customer_id, note_type, note, created_by, created_at)
VALUES (%s, 'subscription_comment', %s, 'System', NOW())
RETURNING id, note, created_by, created_at
""",
(customer_id, customer_id, data.comment)
)
if not result or len(result) == 0:
raise Exception("Failed to insert comment")
row = result[0]
logger.info(f"✅ Saved subscription comment for customer {customer_id}")
return {
"id": row['id'],
"comment": row['note'],
"created_by": row['created_by'],
"created_at": row['created_at'].isoformat()
}
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error saving subscription comment: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/customers/{customer_id}/subscription-comment")
async def get_subscription_comment(customer_id: int):
"""Get internal comment about customer subscriptions"""
try:
result = execute_query_single(
"""
SELECT id, note, created_by, created_at
FROM customer_notes
WHERE customer_id = %s AND note_type = 'subscription_comment'
ORDER BY created_at DESC
LIMIT 1
""",
(customer_id,)
)
if not result:
raise HTTPException(status_code=404, detail="No comment found")
return {
"id": result['id'],
"comment": result['note'],
"created_by": result['created_by'],
"created_at": result['created_at'].isoformat()
}
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error fetching subscription comment: {e}")
raise HTTPException(status_code=500, detail=str(e))

View File

@@ -8,7 +8,7 @@ templates = Jinja2Templates(directory="app")
@router.get("/customers", response_class=HTMLResponse)
async def customers_page(request: Request):
    """
-    Render the customers page
+    Render the customers list page
    """
    return templates.TemplateResponse("customers/frontend/customers.html", {"request": request})

View File

@@ -215,7 +215,7 @@
            </li>
            <li class="nav-item">
                <a class="nav-link" data-bs-toggle="tab" href="#subscriptions">
-                    <i class="bi bi-arrow-repeat"></i>Abonnnents tjek
+                    <i class="bi bi-arrow-repeat"></i>Abonnementer
                </a>
            </li>
            <li class="nav-item">
@@ -353,7 +353,7 @@
            <!-- Subscriptions Tab -->
            <div class="tab-pane fade" id="subscriptions">
                <div class="d-flex justify-content-between align-items-center mb-4">
-                    <h5 class="fw-bold mb-0">Abonnnents tjek</h5>
+                    <h5 class="fw-bold mb-0">Abonnementer & Salgsordre</h5>
                    <div class="btn-group">
                        <button class="btn btn-success btn-sm" onclick="showCreateSubscriptionModal()">
                            <i class="bi bi-plus-circle me-2"></i>Opret Abonnement
@@ -364,36 +364,6 @@
                    </div>
                </div>
<!-- Internal Comment Box -->
<div class="card mb-4" style="border-left: 4px solid var(--accent);">
<div class="card-body">
<h6 class="fw-bold mb-3">
<i class="bi bi-shield-lock me-2"></i>Intern Kommentar
<small class="text-muted fw-normal">(kun synlig for medarbejdere)</small>
</h6>
<div id="internalCommentDisplay" class="mb-3" style="display: none;">
<div class="alert alert-light mb-2">
<div style="white-space: pre-wrap;" id="commentText"></div>
</div>
<div class="d-flex justify-content-between align-items-center">
<small class="text-muted" id="commentMeta"></small>
<button type="button" class="btn btn-sm btn-outline-primary" onclick="editInternalComment()" title="Rediger kommentar">
<i class="bi bi-pencil me-1"></i>Rediger
</button>
</div>
</div>
<div id="internalCommentEdit">
<textarea class="form-control mb-2" id="internalCommentInput" rows="3"
placeholder="Skriv intern note om kundens abonnementer..."></textarea>
<div class="d-flex justify-content-end gap-2">
<button class="btn btn-sm btn-primary" onclick="saveInternalComment()">
<i class="bi bi-save me-1"></i>Gem Kommentar
</button>
</div>
</div>
</div>
</div>
<div id="subscriptionsContainer">
    <div class="text-center py-5">
        <div class="spinner-border text-primary"></div>
@@ -513,7 +483,6 @@ document.addEventListener('DOMContentLoaded', () => {
if (subscriptionsTab) {
    subscriptionsTab.addEventListener('shown.bs.tab', () => {
        loadSubscriptions();
-        loadInternalComment();
    }, { once: false });
}
@@ -891,13 +860,11 @@ function renderSalesOrdersList(orders) {
const lineItems = order.lineItems || [];
const hasLineItems = Array.isArray(lineItems) && lineItems.length > 0;
const total = parseFloat(order.hdnGrandTotal || 0);
-const recordId = order.id.includes('x') ? order.id.split('x')[1] : order.id;
-const simplycrmUrl = `https://bmcnetworks.simply-crm.dk/index.php?module=SalesOrder&view=Detail&record=${recordId}`;
return `
    <div class="subscription-item border rounded p-3 mb-3 bg-white shadow-sm">
-        <div class="d-flex justify-content-between align-items-start mb-2">
+        <div class="d-flex justify-content-between align-items-start mb-2" style="cursor: pointer;" onclick="toggleLineItems('${itemId}')">
-            <div class="flex-grow-1" style="cursor: pointer;" onclick="toggleLineItems('${itemId}')">
+            <div class="flex-grow-1">
                <div class="fw-bold d-flex align-items-center">
                    <i class="bi bi-chevron-right me-2 text-success" id="${itemId}-icon" style="font-size: 0.8rem;"></i>
                    ${escapeHtml(order.subject || order.salesorder_no || 'Unnamed')}
@@ -907,23 +874,13 @@ function renderSalesOrdersList(orders) {
            </div>
        </div>
        <div class="text-end ms-3">
-            <div class="btn-group btn-group-sm mb-2" role="group">
-                <a href="${simplycrmUrl}" target="_blank" class="btn btn-outline-success" title="Åbn i Simply-CRM">
-                    <i class="bi bi-box-arrow-up-right"></i> Simply-CRM
-                </a>
-            </div>
-            <div>
            <div class="badge bg-${getStatusColor(order.sostatus)} mb-1">${escapeHtml(order.sostatus || 'Open')}</div>
            <div class="fw-bold text-success">${total.toFixed(2)} DKK</div>
            </div>
        </div>
-        </div>
    <div class="d-flex gap-2 flex-wrap small text-muted mt-2">
        ${order.recurring_frequency ? `<span class="badge bg-light text-dark"><i class="bi bi-arrow-repeat me-1"></i>${escapeHtml(order.recurring_frequency)}</span>` : ''}
-        ${order.last_recurring_date ? `<span class="badge bg-info text-dark"><i class="bi bi-calendar-check me-1"></i>Sidste: ${formatDate(order.last_recurring_date)}</span>` : ''}
-        ${order.start_period && order.end_period ? `<span class="badge bg-light text-dark"><i class="bi bi-calendar-range me-1"></i>${formatDate(order.start_period)} - ${formatDate(order.end_period)}</span>` : ''}
-        ${order.start_period && !order.end_period ? `<span class="badge bg-light text-dark"><i class="bi bi-calendar-check me-1"></i>Start: ${formatDate(order.start_period)}</span>` : ''}
    </div>
    ${hasLineItems ? `
@@ -1075,7 +1032,6 @@ function renderSubscriptionsList(subscriptions, isLocked = false) {
<div class="d-flex gap-2 flex-wrap small text-muted mt-2">
    ${sub.generateinvoiceevery ? `<span class="badge bg-light text-dark"><i class="bi bi-arrow-repeat me-1"></i>${escapeHtml(sub.generateinvoiceevery)}</span>` : ''}
-    ${sub.next_subscription_date ? `<span class="badge bg-warning text-dark"><i class="bi bi-calendar-event me-1"></i>Next: ${formatDate(sub.next_subscription_date)}</span>` : ''}
    ${sub.startdate && sub.enddate ? `<span class="badge bg-light text-dark"><i class="bi bi-calendar-range me-1"></i>${formatDate(sub.startdate)} - ${formatDate(sub.enddate)}</span>` : ''}
    ${sub.startdate ? `<span class="badge bg-light text-dark"><i class="bi bi-calendar-check me-1"></i>Start: ${formatDate(sub.startdate)}</span>` : ''}
</div>
@@ -1440,102 +1396,5 @@ async function toggleSubscriptionsLock() {
        alert('Fejl: ' + error.message);
    }
}
async function saveInternalComment() {
const commentInput = document.getElementById('internalCommentInput');
const commentText = commentInput.value.trim();
if (!commentText) {
alert('Indtast venligst en kommentar');
return;
}
try {
const response = await fetch(`/api/v1/customers/${customerId}/subscription-comment`, {
method: 'POST',
headers: {'Content-Type': 'application/json'},
body: JSON.stringify({ comment: commentText })
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail || 'Kunne ikke gemme kommentar');
}
const result = await response.json();
// Clear input
commentInput.value = '';
// Show saved comment
displayInternalComment(result);
alert('✓ Kommentar gemt');
} catch (error) {
console.error('Error saving comment:', error);
alert('Fejl: ' + error.message);
}
}
async function loadInternalComment() {
try {
const response = await fetch(`/api/v1/customers/${customerId}/subscription-comment`);
if (response.status === 404) {
// No comment yet, that's fine
return;
}
if (!response.ok) {
throw new Error('Kunne ikke hente kommentar');
}
const result = await response.json();
displayInternalComment(result);
} catch (error) {
console.error('Error loading comment:', error);
}
}
function displayInternalComment(data) {
const displayDiv = document.getElementById('internalCommentDisplay');
const editDiv = document.getElementById('internalCommentEdit');
const commentMeta = document.getElementById('commentMeta');
const commentText = document.getElementById('commentText');
if (data && data.comment) {
commentText.textContent = data.comment;
// Format timestamp
const timestamp = new Date(data.created_at).toLocaleString('da-DK', {
year: 'numeric',
month: '2-digit',
day: '2-digit',
hour: '2-digit',
minute: '2-digit'
});
commentMeta.textContent = `Oprettet af ${data.created_by || 'System'} • ${timestamp}`;
displayDiv.style.display = 'block';
editDiv.style.display = 'none';
} else {
displayDiv.style.display = 'none';
editDiv.style.display = 'block';
}
}
function editInternalComment() {
const commentText = document.getElementById('commentText').textContent;
const commentInput = document.getElementById('internalCommentInput');
const displayDiv = document.getElementById('internalCommentDisplay');
const editDiv = document.getElementById('internalCommentEdit');
// Show input, populate with existing text
commentInput.value = commentText;
editDiv.style.display = 'block';
displayDiv.style.display = 'none';
}
</script>
{% endblock %}

View File

@@ -12,6 +12,7 @@
    border-radius: 20px;
    font-size: 0.9rem;
    transition: all 0.2s;
+    cursor: pointer;
}
.filter-btn:hover, .filter-btn.active {
@@ -19,6 +20,38 @@
    color: white;
    border-color: var(--accent);
}
.customer-avatar {
width: 40px;
height: 40px;
border-radius: 8px;
background: var(--accent-light);
color: var(--accent);
display: flex;
align-items: center;
justify-content: center;
font-weight: bold;
font-size: 0.9rem;
}
.pagination-btn {
border: 1px solid rgba(0,0,0,0.1);
padding: 0.5rem 1rem;
background: var(--bg-card);
color: var(--text-primary);
cursor: pointer;
transition: all 0.2s;
}
.pagination-btn:hover:not(:disabled) {
background: var(--accent-light);
border-color: var(--accent);
}
.pagination-btn:disabled {
opacity: 0.5;
cursor: not-allowed;
}
</style>
{% endblock %}
@@ -29,16 +62,29 @@
    <p class="text-muted mb-0">Administrer dine kunder</p>
</div>
<div class="d-flex gap-3">
-    <input type="text" id="searchInput" class="header-search" placeholder="Søg kunde...">
+    <input type="text" id="searchInput" class="header-search" placeholder="Søg kunde, CVR, email...">
-    <button class="btn btn-primary"><i class="bi bi-plus-lg me-2"></i>Opret Kunde</button>
+    <button class="btn btn-primary" onclick="showCreateCustomerModal()">
+        <i class="bi bi-plus-lg me-2"></i>Opret Kunde
+    </button>
</div>
</div>
-<div class="mb-4 d-flex gap-2">
+<div class="mb-4 d-flex gap-2 flex-wrap">
-    <button class="filter-btn active">Alle Kunder</button>
-    <button class="filter-btn">Aktive</button>
-    <button class="filter-btn">Inaktive</button>
-    <button class="filter-btn">VIP</button>
+    <button class="filter-btn active" data-filter="all" onclick="setFilter('all')">
+        Alle Kunder <span id="countAll" class="ms-1"></span>
+    </button>
+    <button class="filter-btn" data-filter="active" onclick="setFilter('active')">
+        Aktive <span id="countActive" class="ms-1"></span>
+    </button>
+    <button class="filter-btn" data-filter="inactive" onclick="setFilter('inactive')">
+        Inaktive <span id="countInactive" class="ms-1"></span>
+    </button>
+    <button class="filter-btn" data-filter="vtiger" onclick="setFilter('vtiger')">
+        <i class="bi bi-cloud me-1"></i>vTiger
+    </button>
+    <button class="filter-btn" data-filter="local" onclick="setFilter('local')">
+        <i class="bi bi-hdd me-1"></i>Lokal
+    </button>
</div>
<div class="card p-4">
@@ -47,16 +93,17 @@
<thead>
    <tr>
        <th>Virksomhed</th>
-        <th>Kontakt</th>
+        <th>Kontakt Info</th>
        <th>CVR</th>
+        <th>Kilde</th>
        <th>Status</th>
-        <th>E-mail</th>
+        <th>Kontakter</th>
        <th class="text-end">Handlinger</th>
    </tr>
</thead>
<tbody id="customersTableBody">
    <tr>
-        <td colspan="6" class="text-center py-5">
+        <td colspan="7" class="text-center py-5">
            <div class="spinner-border text-primary" role="status">
                <span class="visually-hidden">Loading...</span>
            </div>
@@ -65,181 +112,407 @@
    </tbody>
</table>
</div>
-<div class="d-flex justify-content-between align-items-center mt-3">
-    <div class="text-muted" id="customerCount">Loading...</div>
-    <nav>
-        <ul class="pagination mb-0" id="pagination"></ul>
-    </nav>
+<!-- Pagination -->
+<div class="d-flex justify-content-between align-items-center mt-4">
+    <div class="text-muted small">
+        Viser <span id="showingStart">0</span>-<span id="showingEnd">0</span> af <span id="totalCount">0</span> kunder
+    </div>
+    <div class="d-flex gap-2">
+        <button class="pagination-btn" id="prevBtn" onclick="previousPage()">
+            <i class="bi bi-chevron-left"></i> Forrige
+        </button>
+        <button class="pagination-btn" id="nextBtn" onclick="nextPage()">
+            Næste <i class="bi bi-chevron-right"></i>
+        </button>
+    </div>
</div>
</div>
<!-- Create Customer Modal -->
<div class="modal fade" id="createCustomerModal" tabindex="-1">
<div class="modal-dialog modal-lg">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Opret Ny Kunde</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<form id="createCustomerForm">
<!-- CVR Lookup Section -->
<div class="mb-4">
<label class="form-label">CVR-nummer</label>
<div class="input-group">
<input type="text" class="form-control" id="cvrInput" placeholder="12345678" maxlength="8">
<button class="btn btn-primary" type="button" id="cvrLookupBtn" onclick="lookupCVR()">
<i class="bi bi-search me-2"></i>Søg CVR
</button>
</div>
<div class="form-text">Indtast CVR-nummer for automatisk udfyldning</div>
<div id="cvrLookupStatus" class="mt-2"></div>
</div>
<hr class="my-4">
<div class="row g-3">
<div class="col-md-12">
<label class="form-label">Virksomhedsnavn <span class="text-danger">*</span></label>
<input type="text" class="form-control" id="nameInput" required>
</div>
<div class="col-md-8">
<label class="form-label">Adresse</label>
<input type="text" class="form-control" id="addressInput">
</div>
<div class="col-md-4">
<label class="form-label">Postnummer</label>
<input type="text" class="form-control" id="postalCodeInput">
</div>
<div class="col-md-6">
<label class="form-label">By</label>
<input type="text" class="form-control" id="cityInput">
</div>
<div class="col-md-6">
<label class="form-label">Email</label>
<input type="email" class="form-control" id="emailInput">
</div>
<div class="col-md-6">
<label class="form-label">Telefon</label>
<input type="text" class="form-control" id="phoneInput">
</div>
<div class="col-md-6">
<label class="form-label">Hjemmeside</label>
<input type="url" class="form-control" id="websiteInput" placeholder="https://">
</div>
<div class="col-12">
<div class="form-check">
<input class="form-check-input" type="checkbox" id="isActiveInput" checked>
<label class="form-check-label" for="isActiveInput">
Aktiv kunde
</label>
</div>
</div>
</div>
</form>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Annuller</button>
<button type="button" class="btn btn-primary" onclick="createCustomer()">
<i class="bi bi-plus-lg me-2"></i>Opret Kunde
</button>
</div>
</div>
</div>
</div>
{% endblock %}
{% block extra_js %}
<script>
-let currentPage = 1;
+let currentPage = 0;
-const pageSize = 50;
+let pageSize = 20;
+let currentFilter = 'all';
+let searchQuery = '';
let totalCustomers = 0;
-let searchTerm = '';
-let searchTimeout = null;
// Load customers on page load
document.addEventListener('DOMContentLoaded', () => {
    loadCustomers();
-    // Setup search with debounce
    const searchInput = document.getElementById('searchInput');
+    // Search with debounce
+    let searchTimeout;
    searchInput.addEventListener('input', (e) => {
        clearTimeout(searchTimeout);
        searchTimeout = setTimeout(() => {
-            searchTerm = e.target.value;
-            loadCustomers(1);
+            searchQuery = e.target.value;
+            currentPage = 0;
+            console.log('🔍 Searching for:', searchQuery);
+            loadCustomers();
        }, 300);
    });
});
-async function loadCustomers(page = 1) {
-    currentPage = page;
-    const offset = (page - 1) * pageSize;
-    try {
-        let url = `/api/v1/customers?limit=${pageSize}&offset=${offset}`;
-        if (searchTerm) {
-            url += `&search=${encodeURIComponent(searchTerm)}`;
-        }
-        const response = await fetch(url);
-        const data = await response.json();
-        totalCustomers = data.total;
-        renderCustomers(data.customers);
-        renderPagination();
-        updateCount();
-    } catch (error) {
-        console.error('Error loading customers:', error);
-        document.getElementById('customersTableBody').innerHTML = `
-            <tr><td colspan="6" class="text-center text-danger py-5">
-                ❌ Fejl ved indlæsning: ${error.message}
-            </td></tr>
-        `;
-    }
-}
+// Cmd+K / Ctrl+K keyboard shortcut (outside DOMContentLoaded so it works everywhere)
+document.addEventListener('keydown', (e) => {
+    if ((e.metaKey || e.ctrlKey) && e.key === 'k') {
+        e.preventDefault();
+        const searchInput = document.getElementById('searchInput');
+        if (searchInput) {
+            searchInput.focus();
+            searchInput.select();
+        }
+    }
+});
+function setFilter(filter) {
+    currentFilter = filter;
+    currentPage = 0;
+    // Update active button
+    document.querySelectorAll('.filter-btn').forEach(btn => {
+        btn.classList.remove('active');
+    });
+    event.target.classList.add('active');
+    loadCustomers();
+}
+async function loadCustomers() {
+    const tbody = document.getElementById('customersTableBody');
+    tbody.innerHTML = '<tr><td colspan="7" class="text-center py-5"><div class="spinner-border text-primary"></div></td></tr>';
+    try {
+        // Build query parameters
+        let params = new URLSearchParams({
+            limit: pageSize,
+            offset: currentPage * pageSize
+        });
+        if (searchQuery) {
+            params.append('search', searchQuery);
+            console.log('📤 Sending search query:', searchQuery);
+        }
+        if (currentFilter === 'active') {
+            params.append('is_active', 'true');
+        } else if (currentFilter === 'inactive') {
+            params.append('is_active', 'false');
+        } else if (currentFilter === 'vtiger' || currentFilter === 'local') {
+            params.append('source', currentFilter);
+        }
+        const response = await fetch(`/api/v1/customers?${params}`);
+        const data = await response.json();
+        totalCustomers = data.total;
+        displayCustomers(data.customers);
+        updatePagination(data.total);
+    } catch (error) {
+        console.error('Failed to load customers:', error);
+        tbody.innerHTML = '<tr><td colspan="7" class="text-center py-5 text-danger">Kunne ikke indlæse kunder</td></tr>';
+    }
+}
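The rewritten frontend sends `limit`, `offset`, `search`, `is_active` and `source` as query parameters to `/api/v1/customers`. The backend handler is not part of this diff; as a hypothetical sketch, it might translate those parameters into a parameterized SQL query roughly like this (column names `vtiger_id`, `cvr_number` etc. are inferred from the frontend code and may differ in the real schema):

```python
def build_customers_query(limit, offset, search=None, is_active=None, source=None):
    """Build a parameterized customers query mirroring the frontend's filter params.

    Sketch only: assumes 'vtiger' rows have vtiger_id set and 'local' rows do not,
    based on the sourceBadge logic in the template above.
    """
    sql = "SELECT * FROM customers WHERE deleted_at IS NULL"
    params = []
    if search:
        # Matches the broadened search placeholder: name, CVR, email
        sql += " AND (name LIKE ? OR cvr_number LIKE ? OR email LIKE ?)"
        params += [f"%{search}%"] * 3
    if is_active is not None:
        sql += " AND is_active = ?"
        params.append(1 if is_active else 0)
    if source == "vtiger":
        sql += " AND vtiger_id IS NOT NULL"
    elif source == "local":
        sql += " AND vtiger_id IS NULL"
    sql += " ORDER BY name LIMIT ? OFFSET ?"
    params += [limit, offset]
    return sql, params
```

Placeholders and pagination arguments stay parameterized, so a search string containing `%` or quotes cannot break the query.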
-function renderCustomers(customers) {
-    const tbody = document.getElementById('customersTableBody');
-    if (!customers || customers.length === 0) {
-        tbody.innerHTML = `
-            <tr><td colspan="6" class="text-center text-muted py-5">
-                Ingen kunder fundet
-            </td></tr>
-        `;
-        return;
-    }
-    tbody.innerHTML = customers.map(customer => {
-        const initials = customer.name ? customer.name.split(' ').map(n => n[0]).join('').substring(0, 2).toUpperCase() : '??';
-        const statusBadge = customer.is_active ?
-            '<span class="badge bg-success bg-opacity-10 text-success">Aktiv</span>' :
-            '<span class="badge bg-secondary bg-opacity-10 text-secondary">Inaktiv</span>';
-        return `
-            <tr onclick="window.location.href='/customers/${customer.id}'" style="cursor: pointer;">
-                <td>
-                    <div class="d-flex align-items-center">
-                        <div class="rounded bg-light d-flex align-items-center justify-content-center me-3 fw-bold"
-                            style="width: 40px; height: 40px; color: var(--accent);">
-                            ${initials}
-                        </div>
-                        <div>
-                            <div class="fw-bold">${customer.name || '-'}</div>
-                            <div class="small text-muted">${customer.address || '-'}</div>
-                        </div>
-                    </div>
-                </td>
-                <td>
-                    <div class="fw-medium">${customer.contact_name || '-'}</div>
-                    <div class="small text-muted">${customer.contact_phone || '-'}</div>
-                </td>
-                <td class="text-muted">${customer.cvr_number || '-'}</td>
-                <td>${statusBadge}</td>
-                <td class="text-muted">${customer.email || '-'}</td>
-                <td class="text-end">
-                    <button class="btn btn-sm btn-outline-primary"
-                        onclick="event.stopPropagation(); window.location.href='/customers/${customer.id}'"
-                        title="Se detaljer">
-                        <i class="bi bi-arrow-right"></i>
-                    </button>
-                </td>
-            </tr>
-        `;
-    }).join('');
-}
+function displayCustomers(customers) {
+    const tbody = document.getElementById('customersTableBody');
+    if (!customers || customers.length === 0) {
+        tbody.innerHTML = '<tr><td colspan="7" class="text-center py-5 text-muted">Ingen kunder fundet</td></tr>';
+        return;
+    }
+    tbody.innerHTML = customers.map(customer => {
+        const initials = getInitials(customer.name);
+        const statusBadge = customer.is_active
+            ? '<span class="badge bg-success bg-opacity-10 text-success">Aktiv</span>'
+            : '<span class="badge bg-secondary bg-opacity-10 text-secondary">Inaktiv</span>';
+        const sourceBadge = customer.vtiger_id
+            ? '<span class="badge bg-primary bg-opacity-10 text-primary"><i class="bi bi-cloud me-1"></i>vTiger</span>'
+            : '<span class="badge bg-secondary bg-opacity-10 text-secondary"><i class="bi bi-hdd me-1"></i>Lokal</span>';
+        const contactCount = customer.contact_count || 0;
+        return `
+            <tr style="cursor: pointer;" onclick="viewCustomer(${customer.id})">
+                <td>
+                    <div class="d-flex align-items-center">
+                        <div class="customer-avatar me-3">${initials}</div>
+                        <div>
+                            <div class="fw-bold">${escapeHtml(customer.name)}</div>
+                            <div class="small text-muted">${customer.city || customer.address || '-'}</div>
+                        </div>
+                    </div>
+                </td>
+                <td>
+                    <div class="fw-medium">${customer.email || '-'}</div>
+                    <div class="small text-muted">${customer.phone || '-'}</div>
+                </td>
+                <td class="text-muted">${customer.cvr_number || '-'}</td>
+                <td>${sourceBadge}</td>
+                <td>${statusBadge}</td>
+                <td>
+                    <span class="badge bg-light text-dark border">
+                        <i class="bi bi-person me-1"></i>${contactCount}
+                    </span>
+                </td>
+                <td class="text-end">
+                    <div class="btn-group">
+                        <button class="btn btn-sm btn-light" onclick="event.stopPropagation(); viewCustomer(${customer.id})">
+                            <i class="bi bi-eye"></i>
+                        </button>
+                        <button class="btn btn-sm btn-light" onclick="event.stopPropagation(); editCustomer(${customer.id})">
+                            <i class="bi bi-pencil"></i>
+                        </button>
+                    </div>
+                </td>
+            </tr>
+        `;
+    }).join('');
+}
-function renderPagination() {
-    const totalPages = Math.ceil(totalCustomers / pageSize);
-    const pagination = document.getElementById('pagination');
-    if (totalPages <= 1) {
-        pagination.innerHTML = '';
-        return;
-    }
-    let pages = [];
-    // Previous button
-    pages.push(`
-        <li class="page-item ${currentPage === 1 ? 'disabled' : ''}">
-            <a class="page-link" href="#" onclick="loadCustomers(${currentPage - 1}); return false;">
-                <i class="bi bi-chevron-left"></i>
-            </a>
-        </li>
-    `);
-    // Page numbers (show max 7 pages)
-    let startPage = Math.max(1, currentPage - 3);
-    let endPage = Math.min(totalPages, startPage + 6);
-    if (endPage - startPage < 6) {
-        startPage = Math.max(1, endPage - 6);
-    }
-    if (startPage > 1) {
-        pages.push(`<li class="page-item"><a class="page-link" href="#" onclick="loadCustomers(1); return false;">1</a></li>`);
-        if (startPage > 2) {
-            pages.push(`<li class="page-item disabled"><span class="page-link">...</span></li>`);
-        }
-    }
-    for (let i = startPage; i <= endPage; i++) {
-        pages.push(`
-            <li class="page-item ${i === currentPage ? 'active' : ''}">
-                <a class="page-link" href="#" onclick="loadCustomers(${i}); return false;">${i}</a>
-            </li>
-        `);
-    }
-    if (endPage < totalPages) {
-        if (endPage < totalPages - 1) {
-            pages.push(`<li class="page-item disabled"><span class="page-link">...</span></li>`);
-        }
-        pages.push(`<li class="page-item"><a class="page-link" href="#" onclick="loadCustomers(${totalPages}); return false;">${totalPages}</a></li>`);
-    }
-    // Next button
-    pages.push(`
-        <li class="page-item ${currentPage === totalPages ? 'disabled' : ''}">
-            <a class="page-link" href="#" onclick="loadCustomers(${currentPage + 1}); return false;">
-                <i class="bi bi-chevron-right"></i>
-            </a>
-        </li>
-    `);
-    pagination.innerHTML = pages.join('');
-}
-function updateCount() {
-    const start = (currentPage - 1) * pageSize + 1;
-    const end = Math.min(currentPage * pageSize, totalCustomers);
-    document.getElementById('customerCount').textContent =
-        `Viser ${start}-${end} af ${totalCustomers} kunder`;
-}
+function updatePagination(total) {
+    const start = currentPage * pageSize + 1;
+    const end = Math.min((currentPage + 1) * pageSize, total);
+    document.getElementById('showingStart').textContent = total > 0 ? start : 0;
+    document.getElementById('showingEnd').textContent = end;
+    document.getElementById('totalCount').textContent = total;
+    // Update buttons
+    document.getElementById('prevBtn').disabled = currentPage === 0;
+    document.getElementById('nextBtn').disabled = end >= total;
+}
+function previousPage() {
+    if (currentPage > 0) {
+        currentPage--;
+        loadCustomers();
+    }
+}
+function nextPage() {
+    if ((currentPage + 1) * pageSize < totalCustomers) {
+        currentPage++;
+        loadCustomers();
+    }
+}
+function viewCustomer(customerId) {
+    window.location.href = `/customers/${customerId}`;
+}
+function editCustomer(customerId) {
+    // TODO: Open edit modal
+    console.log('Edit customer:', customerId);
+}
+function showCreateCustomerModal() {
+    // Reset form
+    document.getElementById('createCustomerForm').reset();
+    document.getElementById('cvrLookupStatus').innerHTML = '';
+    document.getElementById('isActiveInput').checked = true;
+    // Show modal
+    const modal = new bootstrap.Modal(document.getElementById('createCustomerModal'));
+    modal.show();
+}
+async function lookupCVR() {
+    const cvrInput = document.getElementById('cvrInput');
+    const cvr = cvrInput.value.trim();
+    const statusDiv = document.getElementById('cvrLookupStatus');
+    const lookupBtn = document.getElementById('cvrLookupBtn');
+    if (!cvr || cvr.length !== 8) {
+        statusDiv.innerHTML = '<div class="alert alert-warning mb-0"><i class="bi bi-exclamation-triangle me-2"></i>Indtast et gyldigt 8-cifret CVR-nummer</div>';
+        return;
+    }
+    // Show loading state
+    lookupBtn.disabled = true;
+    lookupBtn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Søger...';
+    statusDiv.innerHTML = '<div class="alert alert-info mb-0"><i class="bi bi-hourglass-split me-2"></i>Henter virksomhedsoplysninger...</div>';
+    try {
+        const response = await fetch(`/api/v1/cvr/${cvr}`);
+        if (!response.ok) {
+            throw new Error('CVR ikke fundet');
+        }
+        const data = await response.json();
+        // Auto-fill form fields
+        document.getElementById('nameInput').value = data.name || '';
+        document.getElementById('addressInput').value = data.address || '';
+        document.getElementById('postalCodeInput').value = data.postal_code || '';
+        document.getElementById('cityInput').value = data.city || '';
+        document.getElementById('phoneInput').value = data.phone || '';
+        document.getElementById('emailInput').value = data.email || '';
+        statusDiv.innerHTML = '<div class="alert alert-success mb-0"><i class="bi bi-check-circle me-2"></i>Virksomhedsoplysninger hentet fra CVR-registeret</div>';
+    } catch (error) {
+        console.error('CVR lookup failed:', error);
+        statusDiv.innerHTML = '<div class="alert alert-danger mb-0"><i class="bi bi-x-circle me-2"></i>Kunne ikke finde virksomhed med CVR-nummer ' + cvr + '</div>';
+    } finally {
+        lookupBtn.disabled = false;
+        lookupBtn.innerHTML = '<i class="bi bi-search me-2"></i>Søg CVR';
+    }
+}
+async function createCustomer() {
+    const name = document.getElementById('nameInput').value.trim();
+    if (!name) {
+        alert('Virksomhedsnavn er påkrævet');
+        return;
+    }
+    const customerData = {
+        name: name,
+        cvr_number: document.getElementById('cvrInput').value.trim() || null,
+        address: document.getElementById('addressInput').value.trim() || null,
+        postal_code: document.getElementById('postalCodeInput').value.trim() || null,
+        city: document.getElementById('cityInput').value.trim() || null,
+        email: document.getElementById('emailInput').value.trim() || null,
+        phone: document.getElementById('phoneInput').value.trim() || null,
+        website: document.getElementById('websiteInput').value.trim() || null,
+        is_active: document.getElementById('isActiveInput').checked
+    };
+    try {
+        const response = await fetch('/api/v1/customers', {
+            method: 'POST',
+            headers: {
+                'Content-Type': 'application/json'
+            },
+            body: JSON.stringify(customerData)
+        });
+        if (!response.ok) {
+            const error = await response.json();
+            throw new Error(error.detail || 'Kunne ikke oprette kunde');
+        }
+        const newCustomer = await response.json();
+        // Close modal
+        const modal = bootstrap.Modal.getInstance(document.getElementById('createCustomerModal'));
+        modal.hide();
+        // Reload customer list
+        await loadCustomers();
+        // Show success message (optional)
+        alert('Kunde oprettet succesfuldt!');
+    } catch (error) {
+        console.error('Failed to create customer:', error);
+        alert('Fejl ved oprettelse af kunde: ' + error.message);
+    }
+}
+function getInitials(name) {
+    if (!name) return '?';
+    const words = name.trim().split(' ');
+    if (words.length === 1) return words[0].substring(0, 2).toUpperCase();
+    return (words[0][0] + words[words.length - 1][0]).toUpperCase();
+}
+function escapeHtml(text) {
+    const div = document.createElement('div');
+    div.textContent = text;
+    return div.innerHTML;
+}
</script>
{% endblock %}
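The lookupCVR() frontend code expects `/api/v1/cvr/{cvr}` to return JSON with `name`, `address`, `postal_code`, `city`, `phone` and `email` keys. The backend side is not in this diff; a hedged sketch of the response mapping is below. The raw field names (`name`, `address`, `zipcode`, `city`, `phone`, `email`) follow the public cvrapi.dk JSON shape, which is an assumption — adjust to whichever CVR provider the backend actually calls:

```python
def map_cvr_response(raw: dict) -> dict:
    """Map a raw CVR lookup payload to the fields the create-customer form expects.

    Assumes cvrapi.dk-style keys on the input; numeric fields (zipcode, phone)
    are stringified because the form inputs hold text.
    """
    return {
        "name": raw.get("name") or "",
        "address": raw.get("address") or "",
        "postal_code": str(raw.get("zipcode") or ""),
        "city": raw.get("city") or "",
        "phone": str(raw.get("phone") or ""),
        "email": raw.get("email") or "",
    }
```

Keeping this mapping in one place means the frontend never has to know which CVR provider is configured; missing fields come back as empty strings, which the form handles gracefully.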

View File

@@ -17,22 +17,22 @@ async def get_dashboard_stats():
    # 1. Customer Counts
    logger.info("Fetching customer count...")
-    customer_res = execute_query_single("SELECT COUNT(*) as count FROM customers WHERE deleted_at IS NULL")
+    customer_res = execute_query("SELECT COUNT(*) as count FROM customers WHERE deleted_at IS NULL", fetchone=True)
    customer_count = customer_res['count'] if customer_res else 0
    # 2. Contact Counts
    logger.info("Fetching contact count...")
-    contact_res = execute_query_single("SELECT COUNT(*) as count FROM contacts")
+    contact_res = execute_query("SELECT COUNT(*) as count FROM contacts", fetchone=True)
    contact_count = contact_res['count'] if contact_res else 0
    # 3. Vendor Counts
    logger.info("Fetching vendor count...")
-    vendor_res = execute_query_single("SELECT COUNT(*) as count FROM vendors")
+    vendor_res = execute_query("SELECT COUNT(*) as count FROM vendors", fetchone=True)
    vendor_count = vendor_res['count'] if vendor_res else 0
    # 4. Recent Customers (Real "Activity")
    logger.info("Fetching recent customers...")
-    recent_customers = execute_query_single("""
+    recent_customers = execute_query("""
        SELECT id, name, created_at, 'customer' as type
        FROM customers
        WHERE deleted_at IS NULL
@@ -154,7 +154,7 @@ async def get_live_stats():
    # Try to get real customer count as a demo
    try:
-        customer_count = execute_query("SELECT COUNT(*) as count FROM customers WHERE deleted_at IS NULL")
+        customer_count = execute_query("SELECT COUNT(*) as count FROM customers WHERE deleted_at IS NULL", fetchone=True)
        sales_stats["active_orders"] = customer_count.get('count', 0) if customer_count else 0
    except:
        pass
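This hunk replaces `execute_query_single(...)` with `execute_query(..., fetchone=True)`. The helper itself is not shown in the diff; the sketch below illustrates what such a unified helper could look like. It uses sqlite3 purely for illustration — the project's actual driver, connection pooling and error handling may differ:

```python
import sqlite3

def execute_query(sql, params=(), fetchone=False, db_path=":memory:"):
    """Run a query; return one row as a dict (or None) when fetchone=True,
    otherwise a list of dicts.

    Sketch only: a real helper would presumably reuse a pooled connection
    instead of opening a fresh one per call.
    """
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # rows become dict-like, matching res['count'] access
    try:
        cur = conn.execute(sql, params)
        if fetchone:
            row = cur.fetchone()
            return dict(row) if row else None
        return [dict(r) for r in cur.fetchall()]
    finally:
        conn.close()
```

Folding the single-row case into one function with a `fetchone` flag keeps the call sites uniform, which is exactly what this hunk does across the four count queries.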

View File

@@ -5,7 +5,7 @@ from fastapi.responses import HTMLResponse
router = APIRouter()
templates = Jinja2Templates(directory="app")
-@router.get("/", response_class=HTMLResponse)
+@router.get("/dashboard", response_class=HTMLResponse)
async def dashboard(request: Request):
    """
    Render the dashboard page

View File

@@ -9,123 +9,217 @@
        <p class="text-muted mb-0">Velkommen tilbage, Christian</p>
    </div>
    <div class="d-flex gap-3">
-        <input type="text" class="header-search" placeholder="Søg...">
-        <button class="btn btn-primary"><i class="bi bi-plus-lg me-2"></i>Ny Opgave</button>
+        <div class="input-group">
+            <span class="input-group-text bg-white border-end-0"><i class="bi bi-search"></i></span>
+            <input type="text" id="dashboardSearchInput" class="form-control border-start-0 ps-0" placeholder="Søg i alt... (⌘K)" style="max-width: 250px;" role="button">
+        </div>
+        <div class="dropdown">
+            <button class="btn btn-primary dropdown-toggle" type="button" data-bs-toggle="dropdown">
+                <i class="bi bi-plus-lg me-2"></i>Ny Oprettelse
+            </button>
+            <ul class="dropdown-menu dropdown-menu-end">
+                <li><a class="dropdown-item" href="/customers"><i class="bi bi-building me-2"></i>Ny Kunde</a></li>
+                <li><a class="dropdown-item" href="/contacts"><i class="bi bi-person me-2"></i>Ny Kontakt</a></li>
+                <li><a class="dropdown-item" href="/vendors"><i class="bi bi-shop me-2"></i>Ny Leverandør</a></li>
+            </ul>
+        </div>
    </div>
</div>

+<!-- 1. Live Metrics Cards -->
<div class="row g-4 mb-5">
    <div class="col-md-3">
        <div class="card stat-card p-4 h-100">
            <div class="d-flex justify-content-between mb-2">
-                <p>Aktive Kunder</p>
+                <p>Kunder</p>
+                <i class="bi bi-building text-primary" style="color: var(--accent) !important;"></i>
+            </div>
+            <h3 id="customerCount">-</h3>
+            <small class="text-success"><i class="bi bi-check-circle"></i> Aktive i systemet</small>
+        </div>
+    </div>
+    <div class="col-md-3">
+        <div class="card stat-card p-4 h-100">
+            <div class="d-flex justify-content-between mb-2">
+                <p>Kontakter</p>
                <i class="bi bi-people text-primary" style="color: var(--accent) !important;"></i>
            </div>
-            <h3>124</h3>
-            <small class="text-success"><i class="bi bi-arrow-up-short"></i> 12% denne måned</small>
+            <h3 id="contactCount">-</h3>
+            <small class="text-muted">Tilknyttede personer</small>
        </div>
    </div>
    <div class="col-md-3">
        <div class="card stat-card p-4 h-100">
            <div class="d-flex justify-content-between mb-2">
-                <p>Hardware</p>
-                <i class="bi bi-hdd text-primary" style="color: var(--accent) !important;"></i>
+                <p>Leverandører</p>
+                <i class="bi bi-shop text-primary" style="color: var(--accent) !important;"></i>
            </div>
-            <h3>856</h3>
-            <small class="text-muted">Enheder online</small>
+            <h3 id="vendorCount">-</h3>
+            <small class="text-muted">Aktive leverandøraftaler</small>
        </div>
    </div>
    <div class="col-md-3">
        <div class="card stat-card p-4 h-100">
            <div class="d-flex justify-content-between mb-2">
-                <p>Support</p>
-                <i class="bi bi-ticket text-primary" style="color: var(--accent) !important;"></i>
+                <p>System Status</p>
+                <i class="bi bi-cpu text-primary" style="color: var(--accent) !important;"></i>
            </div>
-            <h3>12</h3>
-            <small class="text-warning">3 kræver handling</small>
+            <h3 id="systemStatus" class="text-success">Online</h3>
+            <small class="text-muted" id="systemVersion">v1.0.0</small>
        </div>
    </div>
-    <div class="col-md-3">
-        <div class="card stat-card p-4 h-100">
-            <div class="d-flex justify-content-between mb-2">
-                <p>Omsætning</p>
-                <i class="bi bi-currency-dollar text-primary" style="color: var(--accent) !important;"></i>
-            </div>
-            <h3>450k</h3>
-            <small class="text-success">Over budget</small>
-        </div>
-    </div>
</div>

<div class="row g-4">
+    <!-- 2. Recent Activity (New Customers) -->
    <div class="col-lg-8">
-        <div class="card p-4">
-            <h5 class="fw-bold mb-4">Seneste Aktiviteter</h5>
+        <div class="card p-4 h-100">
+            <div class="d-flex justify-content-between align-items-center mb-4">
+                <h5 class="fw-bold mb-0">Seneste Tilføjelser</h5>
+                <a href="/customers" class="btn btn-sm btn-light">Se alle</a>
+            </div>
            <div class="table-responsive">
                <table class="table table-hover align-middle">
                    <thead>
                        <tr>
-                            <th>Kunde</th>
-                            <th>Handling</th>
-                            <th>Status</th>
-                            <th class="text-end">Tid</th>
+                            <th>Navn</th>
+                            <th>Type</th>
+                            <th>Oprettet</th>
+                            <th class="text-end">Handling</th>
                        </tr>
                    </thead>
-                    <tbody>
+                    <tbody id="recentActivityTable">
                        <tr>
-                            <td class="fw-bold">Advokatgruppen A/S</td>
-                            <td>Firewall konfiguration</td>
-                            <td><span class="badge bg-success bg-opacity-10 text-success">Fuldført</span></td>
-                            <td class="text-end text-muted">10:23</td>
-                        </tr>
-                        <tr>
-                            <td class="fw-bold">Byg & Bo ApS</td>
-                            <td>Licens fornyelse</td>
-                            <td><span class="badge bg-warning bg-opacity-10 text-warning">Afventer</span></td>
-                            <td class="text-end text-muted">I går</td>
-                        </tr>
-                        <tr>
-                            <td class="fw-bold">Cafe Møller</td>
-                            <td>Netværksnedbrud</td>
-                            <td><span class="badge bg-danger bg-opacity-10 text-danger">Kritisk</span></td>
-                            <td class="text-end text-muted">I går</td>
+                            <td colspan="4" class="text-center py-4">
+                                <div class="spinner-border text-primary" role="status"></div>
+                            </td>
                        </tr>
                    </tbody>
                </table>
            </div>
        </div>
    </div>
+    <!-- 3. Vendor Distribution & Quick Links -->
    <div class="col-lg-4">
-        <div class="card p-4 h-100">
-            <h5 class="fw-bold mb-4">System Status</h5>
-            <div class="mb-4">
-                <div class="d-flex justify-content-between mb-2">
-                    <span class="small fw-bold text-muted">CPU LOAD</span>
-                    <span class="small fw-bold">24%</span>
-                </div>
-                <div class="progress" style="height: 8px; background-color: var(--accent-light);">
-                    <div class="progress-bar" style="width: 24%; background-color: var(--accent);"></div>
-                </div>
-            </div>
-            <div class="mb-4">
-                <div class="d-flex justify-content-between mb-2">
-                    <span class="small fw-bold text-muted">MEMORY</span>
-                    <span class="small fw-bold">56%</span>
-                </div>
-                <div class="progress" style="height: 8px; background-color: var(--accent-light);">
-                    <div class="progress-bar" style="width: 56%; background-color: var(--accent);"></div>
-                </div>
-            </div>
-            <div class="mt-auto p-3 rounded" style="background-color: var(--accent-light);">
-                <div class="d-flex">
-                    <i class="bi bi-check-circle-fill text-success me-2"></i>
-                    <small class="fw-bold" style="color: var(--accent)">Alle systemer kører optimalt.</small>
-                </div>
-            </div>
-        </div>
+        <div class="card p-4 mb-4">
+            <h5 class="fw-bold mb-4">Leverandør Fordeling</h5>
+            <div id="vendorDistribution">
+                <div class="text-center py-3">
+                    <div class="spinner-border text-primary" role="status"></div>
+                </div>
+            </div>
+        </div>
+        <!-- 4. Quick Actions / Shortcuts -->
+        <div class="card p-4">
+            <h5 class="fw-bold mb-3">Genveje</h5>
+            <div class="d-grid gap-2">
+                <a href="/settings" class="btn btn-light text-start p-3 d-flex align-items-center">
+                    <div class="bg-white p-2 rounded me-3 shadow-sm">
+                        <i class="bi bi-gear text-primary"></i>
+                    </div>
+                    <div>
+                        <div class="fw-bold">Indstillinger</div>
+                        <small class="text-muted">Konfigurer systemet</small>
+                    </div>
+                </a>
+                <a href="/vendors" class="btn btn-light text-start p-3 d-flex align-items-center">
+                    <div class="bg-white p-2 rounded me-3 shadow-sm">
+                        <i class="bi bi-truck text-success"></i>
+                    </div>
+                    <div>
+                        <div class="fw-bold">Leverandører</div>
+                        <small class="text-muted">Administrer aftaler</small>
+                    </div>
+                </a>
+            </div>
+        </div>
    </div>
</div>
{% endblock %}
+{% block extra_js %}
+<script>
+async function loadDashboardStats() {
+    try {
+        const response = await fetch('/api/v1/dashboard/stats');
+        const data = await response.json();
+
+        // Update Counts
+        document.getElementById('customerCount').textContent = data.counts.customers;
+        document.getElementById('contactCount').textContent = data.counts.contacts;
+        document.getElementById('vendorCount').textContent = data.counts.vendors;
+
+        // Update Recent Activity
+        const activityTable = document.getElementById('recentActivityTable');
+        if (data.recent_activity && data.recent_activity.length > 0) {
+            activityTable.innerHTML = data.recent_activity.map(item => `
+                <tr>
+                    <td class="fw-bold">
+                        <div class="d-flex align-items-center">
+                            <div class="rounded-circle bg-light d-flex align-items-center justify-content-center me-3" style="width: 32px; height: 32px;">
+                                <i class="bi bi-building text-primary"></i>
+                            </div>
+                            ${item.name}
+                        </div>
+                    </td>
+                    <td><span class="badge bg-primary bg-opacity-10 text-primary">Kunde</span></td>
+                    <td class="text-muted">${new Date(item.created_at).toLocaleDateString('da-DK')}</td>
+                    <td class="text-end">
+                        <a href="/customers/${item.id}" class="btn btn-sm btn-light"><i class="bi bi-arrow-right"></i></a>
+                    </td>
+                </tr>
+            `).join('');
+        } else {
+            activityTable.innerHTML = '<tr><td colspan="4" class="text-center text-muted py-4">Ingen nylig aktivitet</td></tr>';
+        }
+
+        // Update Vendor Distribution
+        const vendorDist = document.getElementById('vendorDistribution');
+        if (data.vendor_distribution && data.vendor_distribution.length > 0) {
+            const total = data.counts.vendors;
+            vendorDist.innerHTML = data.vendor_distribution.map(cat => {
+                const percentage = Math.round((cat.count / total) * 100);
+                return `
+                    <div class="mb-3">
+                        <div class="d-flex justify-content-between mb-1">
+                            <span class="small fw-bold">${cat.category || 'Ukendt'}</span>
+                            <span class="small text-muted">${cat.count}</span>
+                        </div>
+                        <div class="progress" style="height: 6px;">
+                            <div class="progress-bar" role="progressbar" style="width: ${percentage}%" aria-valuenow="${percentage}" aria-valuemin="0" aria-valuemax="100"></div>
+                        </div>
+                    </div>
+                `;
+            }).join('');
+        } else {
+            vendorDist.innerHTML = '<p class="text-muted text-center">Ingen leverandørdata</p>';
+        }
+    } catch (error) {
+        console.error('Error loading dashboard stats:', error);
+    }
+}
+document.addEventListener('DOMContentLoaded', loadDashboardStats);
+
+// Connect dashboard search input to global search modal
+document.getElementById('dashboardSearchInput').addEventListener('click', () => {
+    const modalEl = document.getElementById('globalSearchModal');
+    const modal = bootstrap.Modal.getOrCreateInstance(modalEl);
+    modal.show();
+    // Focus input when modal opens
+    modalEl.addEventListener('shown.bs.modal', () => {
+        document.getElementById('globalSearchInput').focus();
+    }, { once: true });
+});
+
+// Also handle focus (e.g. via tab navigation)
+document.getElementById('dashboardSearchInput').addEventListener('focus', (e) => {
+    e.target.click();
+    e.target.blur(); // Remove focus from this input so we don't get stuck in a loop or keep cursor here
+});
+</script>
+{% endblock %}
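The script above assumes a particular JSON shape from `/api/v1/dashboard/stats`. A sketch of that contract, with field names inferred from the JavaScript (not taken from the backend code) and sample values that are purely illustrative:

```python
# Shape inferred from the dashboard JavaScript; the real endpoint may differ.
sample_stats = {
    "counts": {"customers": 42, "contacts": 118, "vendors": 9},
    "recent_activity": [
        # rendered as table rows linking to /customers/{id}
        {"id": 1, "name": "Eksempel ApS", "created_at": "2024-05-01T10:23:00", "type": "customer"},
    ],
    "vendor_distribution": [
        {"category": "Hardware", "count": 6},
        {"category": None, "count": 3},  # a null category is shown as "Ukendt"
    ],
}

def vendor_bar_width(count, total):
    """Mirror of the progress-bar math in the JS: Math.round((cat.count / total) * 100)."""
    return round(count / total * 100) if total else 0
```

Note that the JS divides each category count by `data.counts.vendors`, so the bars only sum to 100% when the distribution covers every vendor.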

View File

@@ -76,14 +76,14 @@ async def get_features(version: Optional[str] = None, status: Optional[str] = No
        params.append(status)
    query += " ORDER BY priority DESC, expected_date ASC"
-    result = execute_query_single(query, tuple(params) if params else None)
+    result = execute_query(query, tuple(params) if params else None)
    return result or []

@router.get("/features/{feature_id}", response_model=Feature)
async def get_feature(feature_id: int):
    """Get a specific feature"""
-    result = execute_query("SELECT * FROM dev_features WHERE id = %s", (feature_id,))
+    result = execute_query("SELECT * FROM dev_features WHERE id = %s", (feature_id,), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Feature not found")
    return result
@@ -97,10 +97,10 @@ async def create_feature(feature: FeatureCreate):
        VALUES (%s, %s, %s, %s, %s, %s)
        RETURNING *
    """
-    result = execute_query_single(query, (
+    result = execute_query(query, (
        feature.title, feature.description, feature.version,
        feature.status, feature.priority, feature.expected_date
-    ))
+    ), fetchone=True)
    logger.info(f"✅ Created feature: {feature.title}")
    return result
@@ -116,10 +116,10 @@ async def update_feature(feature_id: int, feature: FeatureCreate):
        WHERE id = %s
        RETURNING *
    """
-    result = execute_query_single(query, (
+    result = execute_query(query, (
        feature.title, feature.description, feature.version,
        feature.status, feature.priority, feature.expected_date, feature_id
-    ))
+    ), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Feature not found")
@@ -131,7 +131,7 @@ async def update_feature(feature_id: int, feature: FeatureCreate):
@router.delete("/features/{feature_id}")
async def delete_feature(feature_id: int):
    """Delete a roadmap feature"""
-    result = execute_query_single("DELETE FROM dev_features WHERE id = %s RETURNING id", (feature_id,))
+    result = execute_query("DELETE FROM dev_features WHERE id = %s RETURNING id", (feature_id,), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Feature not found")
@@ -151,7 +151,7 @@ async def get_ideas(category: Optional[str] = None):
        params.append(category)
    query += " ORDER BY votes DESC, created_at DESC"
-    result = execute_query_single(query, tuple(params) if params else None)
+    result = execute_query(query, tuple(params) if params else None)
    return result or []
@@ -163,7 +163,7 @@ async def create_idea(idea: IdeaCreate):
        VALUES (%s, %s, %s)
        RETURNING *
    """
-    result = execute_query(query, (idea.title, idea.description, idea.category))
+    result = execute_query(query, (idea.title, idea.description, idea.category), fetchone=True)
    logger.info(f"✅ Created idea: {idea.title}")
    return result
@@ -178,7 +178,7 @@ async def vote_idea(idea_id: int):
        WHERE id = %s
        RETURNING *
    """
-    result = execute_query_single(query, (idea_id,))
+    result = execute_query(query, (idea_id,), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Idea not found")
@@ -189,7 +189,7 @@ async def vote_idea(idea_id: int):
@router.delete("/ideas/{idea_id}")
async def delete_idea(idea_id: int):
    """Delete an idea"""
-    result = execute_query_single("DELETE FROM dev_ideas WHERE id = %s RETURNING id", (idea_id,))
+    result = execute_query("DELETE FROM dev_ideas WHERE id = %s RETURNING id", (idea_id,), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Idea not found")
@@ -209,14 +209,14 @@ async def get_workflows(category: Optional[str] = None):
        params.append(category)
    query += " ORDER BY created_at DESC"
-    result = execute_query_single(query, tuple(params) if params else None)
+    result = execute_query(query, tuple(params) if params else None)
    return result or []

@router.get("/workflows/{workflow_id}", response_model=Workflow)
async def get_workflow(workflow_id: int):
    """Get a specific workflow"""
-    result = execute_query("SELECT * FROM dev_workflows WHERE id = %s", (workflow_id,))
+    result = execute_query("SELECT * FROM dev_workflows WHERE id = %s", (workflow_id,), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Workflow not found")
    return result
@@ -230,9 +230,9 @@ async def create_workflow(workflow: WorkflowCreate):
        VALUES (%s, %s, %s, %s)
        RETURNING *
    """
-    result = execute_query_single(query, (
+    result = execute_query(query, (
        workflow.title, workflow.description, workflow.category, workflow.diagram_xml
-    ))
+    ), fetchone=True)
    logger.info(f"✅ Created workflow: {workflow.title}")
    return result
@@ -247,10 +247,10 @@ async def update_workflow(workflow_id: int, workflow: WorkflowCreate):
        WHERE id = %s
        RETURNING *
    """
-    result = execute_query_single(query, (
+    result = execute_query(query, (
        workflow.title, workflow.description, workflow.category,
        workflow.diagram_xml, workflow_id
-    ))
+    ), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Workflow not found")
@@ -262,7 +262,7 @@ async def update_workflow(workflow_id: int, workflow: WorkflowCreate):
@router.delete("/workflows/{workflow_id}")
async def delete_workflow(workflow_id: int):
    """Delete a workflow"""
-    result = execute_query_single("DELETE FROM dev_workflows WHERE id = %s RETURNING id", (workflow_id,))
+    result = execute_query("DELETE FROM dev_workflows WHERE id = %s RETURNING id", (workflow_id,), fetchone=True)
    if not result:
        raise HTTPException(status_code=404, detail="Workflow not found")
@@ -274,9 +274,9 @@ async def delete_workflow(workflow_id: int):
@router.get("/stats")
async def get_devportal_stats():
    """Get DEV Portal statistics"""
-    features_count = execute_query_single("SELECT COUNT(*) as count FROM dev_features")
-    ideas_count = execute_query_single("SELECT COUNT(*) as count FROM dev_ideas")
-    workflows_count = execute_query_single("SELECT COUNT(*) as count FROM dev_workflows")
+    features_count = execute_query("SELECT COUNT(*) as count FROM dev_features", fetchone=True)
+    ideas_count = execute_query("SELECT COUNT(*) as count FROM dev_ideas", fetchone=True)
+    workflows_count = execute_query("SELECT COUNT(*) as count FROM dev_workflows", fetchone=True)

    features_by_status = execute_query("""
        SELECT status, COUNT(*) as count

View File

@@ -11,7 +11,6 @@ from datetime import datetime, date
from app.core.database import execute_query, execute_insert, execute_update
from app.services.email_processor_service import EmailProcessorService
-from app.services.email_workflow_service import email_workflow_service

logger = logging.getLogger(__name__)
@@ -91,49 +90,6 @@ class EmailRule(BaseModel):
    last_matched_at: Optional[datetime]

-class EmailWorkflow(BaseModel):
-    id: Optional[int] = None
-    name: str
-    description: Optional[str]
-    classification_trigger: str
-    sender_pattern: Optional[str] = None
-    subject_pattern: Optional[str] = None
-    confidence_threshold: float = 0.70
-    workflow_steps: List[dict]
-    priority: int = 100
-    enabled: bool = True
-    stop_on_match: bool = True
-    execution_count: int = 0
-    success_count: int = 0
-    failure_count: int = 0
-    last_executed_at: Optional[datetime] = None
-
-class WorkflowExecution(BaseModel):
-    id: int
-    workflow_id: int
-    email_id: int
-    status: str
-    steps_completed: int
-    steps_total: Optional[int]
-    result_json: Optional[List[dict]] = None  # Can be list of step results
-    error_message: Optional[str]
-    started_at: datetime
-    completed_at: Optional[datetime]
-    execution_time_ms: Optional[int]
-
-class WorkflowAction(BaseModel):
-    id: int
-    action_code: str
-    name: str
-    description: Optional[str]
-    category: Optional[str]
-    parameter_schema: Optional[dict]
-    example_config: Optional[dict]
-    enabled: bool

class ProcessingStats(BaseModel):
    status: str
    fetched: int = 0
@@ -183,7 +139,7 @@ async def list_emails(
    """
    params.extend([limit, offset])
-    result = execute_query_single(query, tuple(params))
+    result = execute_query(query, tuple(params))
    return result
@@ -227,41 +183,6 @@ async def get_email(email_id: int):
        raise HTTPException(status_code=500, detail=str(e))

-@router.post("/emails/{email_id}/mark-processed")
-async def mark_email_processed(email_id: int):
-    """Mark email as processed and move to 'Processed' folder"""
-    try:
-        # Update email status and folder
-        update_query = """
-            UPDATE email_messages
-            SET status = 'processed',
-                folder = 'Processed',
-                processed_at = CURRENT_TIMESTAMP,
-                updated_at = CURRENT_TIMESTAMP
-            WHERE id = %s AND deleted_at IS NULL
-            RETURNING id, folder, status
-        """
-        result = execute_query(update_query, (email_id,))
-        if not result:
-            raise HTTPException(status_code=404, detail="Email not found")
-        logger.info(f"✅ Email {email_id} marked as processed and moved to Processed folder")
-        return {
-            "success": True,
-            "email_id": result.get('id') if result else email_id,
-            "folder": result.get('folder') if result else 'Processed',
-            "status": result.get('status') if result else 'processed'
-        }
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error(f"❌ Error marking email {email_id} as processed: {e}")
-        raise HTTPException(status_code=500, detail=str(e))

@router.get("/emails/{email_id}/attachments/{attachment_id}")
async def download_attachment(email_id: int, attachment_id: int):
    """Download email attachment"""
@@ -274,7 +195,7 @@ async def download_attachment(email_id: int, attachment_id: int):
        JOIN email_messages e ON e.id = a.email_id
        WHERE a.id = %s AND a.email_id = %s AND e.deleted_at IS NULL
    """
-    result = execute_query_single(query, (attachment_id, email_id))
+    result = execute_query(query, (attachment_id, email_id))
    if not result:
        raise HTTPException(status_code=404, detail="Attachment not found")
@@ -692,390 +613,3 @@ async def get_email_stats():
    except Exception as e:
        logger.error(f"❌ Error getting stats: {e}")
        raise HTTPException(status_code=500, detail=str(e))
# ========== Workflow Endpoints ==========
@router.get("/workflows", response_model=List[EmailWorkflow])
async def list_workflows():
"""Get all email workflows"""
try:
query = """
SELECT * FROM email_workflows
ORDER BY priority ASC, name ASC
"""
result = execute_query(query)
return result
except Exception as e:
logger.error(f"❌ Error listing workflows: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/workflows/{workflow_id}", response_model=EmailWorkflow)
async def get_workflow(workflow_id: int):
"""Get specific workflow by ID"""
try:
query = "SELECT * FROM email_workflows WHERE id = %s"
result = execute_query(query, (workflow_id,))
if not result:
raise HTTPException(status_code=404, detail="Workflow not found")
return result
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error getting workflow: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.post("/workflows", response_model=EmailWorkflow)
async def create_workflow(workflow: EmailWorkflow):
"""Create new email workflow"""
try:
import json
query = """
INSERT INTO email_workflows
(name, description, classification_trigger, sender_pattern, subject_pattern,
confidence_threshold, workflow_steps, priority, enabled, stop_on_match, created_by_user_id)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 1)
RETURNING *
"""
result = execute_query_single(query, (
workflow.name,
workflow.description,
workflow.classification_trigger,
workflow.sender_pattern,
workflow.subject_pattern,
workflow.confidence_threshold,
json.dumps(workflow.workflow_steps),
workflow.priority,
workflow.enabled,
workflow.stop_on_match
))
if result:
logger.info(f"✅ Created workflow: {workflow.name}")
return result
else:
raise HTTPException(status_code=500, detail="Failed to create workflow")
except Exception as e:
logger.error(f"❌ Error creating workflow: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.put("/workflows/{workflow_id}", response_model=EmailWorkflow)
async def update_workflow(workflow_id: int, workflow: EmailWorkflow):
"""Update existing email workflow"""
try:
import json
query = """
UPDATE email_workflows
SET name = %s,
description = %s,
classification_trigger = %s,
sender_pattern = %s,
subject_pattern = %s,
confidence_threshold = %s,
workflow_steps = %s,
priority = %s,
enabled = %s,
stop_on_match = %s
WHERE id = %s
RETURNING *
"""
result = execute_query_single(query, (
workflow.name,
workflow.description,
workflow.classification_trigger,
workflow.sender_pattern,
workflow.subject_pattern,
workflow.confidence_threshold,
json.dumps(workflow.workflow_steps),
workflow.priority,
workflow.enabled,
workflow.stop_on_match,
workflow_id
))
if result:
logger.info(f"✅ Updated workflow {workflow_id}")
return result
else:
raise HTTPException(status_code=404, detail="Workflow not found")
except Exception as e:
logger.error(f"❌ Error updating workflow: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.delete("/workflows/{workflow_id}")
async def delete_workflow(workflow_id: int):
"""Delete email workflow"""
try:
query = "DELETE FROM email_workflows WHERE id = %s"
execute_update(query, (workflow_id,))
logger.info(f"🗑️ Deleted workflow {workflow_id}")
return {"success": True, "message": f"Workflow {workflow_id} deleted"}
except Exception as e:
logger.error(f"❌ Error deleting workflow: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.post("/workflows/{workflow_id}/toggle")
async def toggle_workflow(workflow_id: int):
"""Toggle workflow enabled status"""
try:
query = """
UPDATE email_workflows
SET enabled = NOT enabled
WHERE id = %s
RETURNING enabled
"""
result = execute_query_single(query, (workflow_id,))
if not result:
raise HTTPException(status_code=404, detail="Workflow not found")
status = "enabled" if result['enabled'] else "disabled"
logger.info(f"🔄 Workflow {workflow_id} {status}")
return {
"success": True,
"workflow_id": workflow_id,
"enabled": result['enabled']
}
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error toggling workflow: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.post("/emails/{email_id}/execute-workflows")
async def execute_workflows_for_email(email_id: int):
"""Manually trigger workflow execution for an email"""
try:
# Get email data
query = """
SELECT id, message_id, subject, sender_email, sender_name, body_text,
classification, confidence_score, status
FROM email_messages
WHERE id = %s AND deleted_at IS NULL
"""
email_data = execute_query_single(query, (email_id,))
if not email_data:
raise HTTPException(status_code=404, detail="Email not found")
# Execute workflows
result = await email_workflow_service.execute_workflows(email_data)
return result
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error executing workflows: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/workflow-executions", response_model=List[WorkflowExecution])
async def list_workflow_executions(
workflow_id: Optional[int] = Query(None),
email_id: Optional[int] = Query(None),
status: Optional[str] = Query(None),
limit: int = Query(50, le=500)
):
"""Get workflow execution history"""
try:
where_clauses = []
params = []
if workflow_id:
where_clauses.append("workflow_id = %s")
params.append(workflow_id)
if email_id:
where_clauses.append("email_id = %s")
params.append(email_id)
if status:
where_clauses.append("status = %s")
params.append(status)
where_sql = " AND ".join(where_clauses) if where_clauses else "1=1"
query = f"""
SELECT * FROM email_workflow_executions
WHERE {where_sql}
ORDER BY started_at DESC
LIMIT %s
"""
params.append(limit)
result = execute_query(query, tuple(params))
return result
except Exception as e:
logger.error(f"❌ Error listing workflow executions: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/workflow-actions", response_model=List[WorkflowAction])
async def list_workflow_actions():
"""Get all available workflow actions"""
try:
query = """
SELECT * FROM email_workflow_actions
WHERE enabled = true
ORDER BY category, name
"""
result = execute_query(query)
return result
except Exception as e:
logger.error(f"❌ Error listing workflow actions: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/workflows/stats/summary")
async def get_workflow_stats():
"""Get workflow execution statistics"""
try:
query = """
SELECT * FROM v_workflow_stats
"""
result = execute_query(query)
return result
except Exception as e:
logger.error(f"❌ Error getting workflow stats: {e}")
raise HTTPException(status_code=500, detail=str(e))
# ========== Email Activity Log Endpoints ==========
class EmailActivityLog(BaseModel):
id: int
email_id: int
event_type: str
event_category: str
description: str
metadata: Optional[dict]
user_id: Optional[int]
user_name: Optional[str]
created_at: datetime
created_by: str
@router.get("/emails/{email_id}/activity", response_model=List[EmailActivityLog])
async def get_email_activity_log(email_id: int, limit: int = Query(default=100, le=500)):
"""Get complete activity log for an email"""
try:
query = """
SELECT
eal.id,
eal.email_id,
eal.event_type,
eal.event_category,
eal.description,
eal.metadata,
eal.user_id,
u.username as user_name,
eal.created_at,
eal.created_by
FROM email_activity_log eal
LEFT JOIN users u ON eal.user_id = u.user_id
WHERE eal.email_id = %s
ORDER BY eal.created_at DESC
LIMIT %s
"""
result = execute_query(query, (email_id, limit))
return result
except Exception as e:
logger.error(f"❌ Error getting email activity log: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/emails/activity/recent", response_model=List[EmailActivityLog])
async def get_recent_activity(
limit: int = Query(default=50, le=200),
event_type: Optional[str] = None,
event_category: Optional[str] = None
):
"""Get recent email activity across all emails"""
try:
conditions = []
params = []
if event_type:
conditions.append("eal.event_type = %s")
params.append(event_type)
if event_category:
conditions.append("eal.event_category = %s")
params.append(event_category)
where_clause = f"WHERE {' AND '.join(conditions)}" if conditions else ""
params.append(limit)
query = f"""
SELECT
eal.id,
eal.email_id,
eal.event_type,
eal.event_category,
eal.description,
eal.metadata,
eal.user_id,
u.username as user_name,
eal.created_at,
eal.created_by
FROM email_activity_log eal
LEFT JOIN users u ON eal.user_id = u.user_id
{where_clause}
ORDER BY eal.created_at DESC
LIMIT %s
"""
result = execute_query(query, tuple(params))
return result
except Exception as e:
logger.error(f"❌ Error getting recent activity: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/emails/activity/stats")
async def get_activity_stats():
"""Get activity statistics"""
try:
query = """
SELECT
event_type,
event_category,
COUNT(*) as count,
MAX(created_at) as last_occurrence
FROM email_activity_log
WHERE created_at >= NOW() - INTERVAL '7 days'
GROUP BY event_type, event_category
ORDER BY count DESC
"""
result = execute_query(query)
return result
except Exception as e:
logger.error(f"❌ Error getting activity stats: {e}")
raise HTTPException(status_code=500, detail=str(e))
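The recent-activity endpoint above assembles its `WHERE` clause from optional filters, appending the limit as the final query parameter. A minimal, self-contained sketch of that pattern (the helper name `build_activity_filter` is illustrative, not from the codebase):

```python
from typing import List, Optional, Tuple


def build_activity_filter(
    event_type: Optional[str] = None,
    event_category: Optional[str] = None,
    limit: int = 50,
) -> Tuple[str, tuple]:
    """Collect conditions and params in the same order, the way the
    endpoint does, with the LIMIT value always as the last parameter."""
    conditions: List[str] = []
    params: list = []
    if event_type:
        conditions.append("eal.event_type = %s")
        params.append(event_type)
    if event_category:
        conditions.append("eal.event_category = %s")
        params.append(event_category)
    where_clause = f"WHERE {' AND '.join(conditions)}" if conditions else ""
    params.append(limit)
    return where_clause, tuple(params)


# Both filters set: two conditions joined with AND, limit last
clause, params = build_activity_filter("email_sent", "outbound", limit=20)
assert clause == "WHERE eal.event_type = %s AND eal.event_category = %s"
assert params == ("email_sent", "outbound", 20)

# No filters: empty clause, only the limit remains
clause, params = build_activity_filter()
assert clause == ""
assert params == (50,)
```

Keeping the placeholders (`%s`) and the params tuple in lockstep is what makes this safe against SQL injection; the filter values never touch the query string itself.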

File diff suppressed because it is too large

View File

@ -3,10 +3,11 @@ Pydantic Models and Schemas
""" """
from pydantic import BaseModel from pydantic import BaseModel
from typing import Optional from typing import Optional, List
from datetime import datetime from datetime import datetime
# Customer Schemas
class CustomerBase(BaseModel): class CustomerBase(BaseModel):
"""Base customer schema""" """Base customer schema"""
name: str name: str
@ -15,9 +16,31 @@ class CustomerBase(BaseModel):
     address: Optional[str] = None

-class CustomerCreate(CustomerBase):
+class CustomerCreate(BaseModel):
     """Schema for creating a customer"""
-    pass
+    name: str
+    cvr_number: Optional[str] = None
+    email: Optional[str] = None
+    phone: Optional[str] = None
+    address: Optional[str] = None
+    postal_code: Optional[str] = None
+    city: Optional[str] = None
+    website: Optional[str] = None
+    is_active: bool = True
+
+class CustomerUpdate(BaseModel):
+    """Schema for updating a customer"""
+    name: Optional[str] = None
+    cvr_number: Optional[str] = None
+    email: Optional[str] = None
+    phone: Optional[str] = None
+    address: Optional[str] = None
+    postal_code: Optional[str] = None
+    city: Optional[str] = None
+    website: Optional[str] = None
+    is_active: Optional[bool] = None
+    subscriptions_locked: Optional[bool] = None

 class Customer(CustomerBase):
@ -30,6 +53,70 @@ class Customer(CustomerBase):
         from_attributes = True

+# Contact Schemas
+class ContactBase(BaseModel):
+    """Base contact schema"""
+    first_name: str
+    last_name: str
+    email: Optional[str] = None
+    phone: Optional[str] = None
+    mobile: Optional[str] = None
+    title: Optional[str] = None
+    department: Optional[str] = None
+
+class ContactCreate(ContactBase):
+    """Schema for creating a contact"""
+    company_ids: List[int] = []  # List of customer IDs to link to
+    is_primary: bool = False  # Whether this is the primary contact for first company
+    role: Optional[str] = None
+    notes: Optional[str] = None
+    is_active: bool = True
+
+class ContactUpdate(BaseModel):
+    """Schema for updating a contact"""
+    first_name: Optional[str] = None
+    last_name: Optional[str] = None
+    email: Optional[str] = None
+    phone: Optional[str] = None
+    mobile: Optional[str] = None
+    title: Optional[str] = None
+    department: Optional[str] = None
+    is_active: Optional[bool] = None
+
+class ContactCompanyLink(BaseModel):
+    """Schema for linking/unlinking a contact to a company"""
+    customer_id: int
+    is_primary: bool = False
+    role: Optional[str] = None
+    notes: Optional[str] = None
+
+class CompanyInfo(BaseModel):
+    """Schema for company information in contact context"""
+    id: int
+    name: str
+    is_primary: bool
+    role: Optional[str] = None
+    notes: Optional[str] = None
+
+class Contact(ContactBase):
+    """Full contact schema"""
+    id: int
+    is_active: bool
+    vtiger_id: Optional[str] = None
+    created_at: datetime
+    updated_at: Optional[datetime] = None
+    companies: List[CompanyInfo] = []  # List of linked companies
+
+    class Config:
+        from_attributes = True
+
+# Hardware Schemas
 class HardwareBase(BaseModel):
     """Base hardware schema"""
     serial_number: str
@ -51,32 +138,46 @@ class Hardware(HardwareBase):
         from_attributes = True

+# Vendor Schemas
 class VendorBase(BaseModel):
     """Base vendor schema"""
     name: str
-    cvr_number: Optional[str] = None
-    domain: Optional[str] = None
     email: Optional[str] = None
     phone: Optional[str] = None
-    contact_person: Optional[str] = None
-    category: Optional[str] = None
-    notes: Optional[str] = None

-class VendorCreate(VendorBase):
+class VendorCreate(BaseModel):
     """Schema for creating a vendor"""
-    pass
+    name: str
+    cvr_number: Optional[str] = None
+    email: Optional[str] = None
+    phone: Optional[str] = None
+    address: Optional[str] = None
+    postal_code: Optional[str] = None
+    city: Optional[str] = None
+    website: Optional[str] = None
+    domain: Optional[str] = None
+    email_pattern: Optional[str] = None
+    category: str = 'general'
+    priority: int = 50
+    notes: Optional[str] = None
+    is_active: bool = True

 class VendorUpdate(BaseModel):
     """Schema for updating a vendor"""
     name: Optional[str] = None
     cvr_number: Optional[str] = None
-    domain: Optional[str] = None
     email: Optional[str] = None
     phone: Optional[str] = None
-    contact_person: Optional[str] = None
+    address: Optional[str] = None
+    postal_code: Optional[str] = None
+    city: Optional[str] = None
+    website: Optional[str] = None
+    domain: Optional[str] = None
+    email_pattern: Optional[str] = None
     category: Optional[str] = None
+    priority: Optional[int] = None
     notes: Optional[str] = None
     is_active: Optional[bool] = None
@ -84,9 +185,20 @@ class VendorUpdate(BaseModel):
 class Vendor(VendorBase):
     """Full vendor schema"""
     id: int
-    is_active: bool = True
+    cvr_number: Optional[str] = None
+    address: Optional[str] = None
+    postal_code: Optional[str] = None
+    city: Optional[str] = None
+    country: Optional[str] = None
+    website: Optional[str] = None
+    domain: Optional[str] = None
+    category: str
+    priority: int
+    notes: Optional[str] = None
+    is_active: bool
     created_at: datetime
     updated_at: Optional[datetime] = None

     class Config:
         from_attributes = True
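The Update schemas in this file make every field `Optional` so a PATCH-style payload can carry only the fields that change. A pydantic-free sketch of the merge semantics such schemas usually imply (the `apply_update` helper is illustrative, not part of the diff):

```python
def apply_update(existing: dict, update: dict) -> dict:
    """Merge a partial update onto an existing record, skipping fields
    the caller left unset (None) — the way an Optional-everywhere
    Update schema is typically consumed."""
    merged = dict(existing)
    for key, value in update.items():
        if value is not None:
            merged[key] = value
    return merged


customer = {"name": "Acme A/S", "city": "Aarhus", "is_active": True}
patch = {"name": None, "city": "Odense", "is_active": None}
assert apply_update(customer, patch) == {
    "name": "Acme A/S", "city": "Odense", "is_active": True,
}
```

One known limitation of the None-means-unset convention: a client can never explicitly write `null` back. When that matters, pydantic's `model_dump(exclude_unset=True)` distinguishes "omitted" from "set to None" and is usually the better fit.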

View File

@ -32,7 +32,7 @@ Alle tabeller SKAL bruge `table_prefix` fra module.json:
 ```sql
 -- Hvis table_prefix = "mymod_"
-CREATE TABLE mymod_items (
+CREATE TABLE mymod_customers (
     id SERIAL PRIMARY KEY,
     name VARCHAR(255)
 );
 ```
@ -40,43 +40,6 @@ CREATE TABLE mymod_items (
 Dette sikrer at moduler ikke kolliderer med core eller andre moduler.

-### Customer Linking (Hvis nødvendigt)
-
-Hvis dit modul skal have sin egen kunde-tabel (f.eks. ved sync fra eksternt system):
-
-**SKAL altid linke til core customers:**
-
-```sql
-CREATE TABLE mymod_customers (
-    id SERIAL PRIMARY KEY,
-    name VARCHAR(255) NOT NULL,
-    external_id VARCHAR(100),  -- ID fra eksternt system
-    hub_customer_id INTEGER REFERENCES customers(id),  -- VIGTIG!
-    active BOOLEAN DEFAULT TRUE,
-    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
-);
-
--- Auto-link trigger (se migrations/001_init.sql for komplet eksempel)
-CREATE TRIGGER trigger_auto_link_mymod_customer
-BEFORE INSERT OR UPDATE OF name
-ON mymod_customers
-FOR EACH ROW
-EXECUTE FUNCTION auto_link_mymod_customer();
-```
-
-**Hvorfor?** Dette sikrer at:
-- ✅ E-conomic export virker automatisk
-- ✅ Billing integration fungerer
-- ✅ Ingen manuel linking nødvendig
-
-**Alternativ:** Hvis modulet kun har simple kunde-relationer, brug direkte FK:
-
-```sql
-CREATE TABLE mymod_orders (
-    id SERIAL PRIMARY KEY,
-    customer_id INTEGER REFERENCES customers(id)  -- Direkte link
-);
-```

 ## Konfiguration

 Modul-specifikke miljøvariable følger mønsteret:

View File

@ -29,7 +29,7 @@ async def get_items():
read_only = get_module_config("template_module", "READ_ONLY", "true") == "true" read_only = get_module_config("template_module", "READ_ONLY", "true") == "true"
# Hent items (bemærk table_prefix) # Hent items (bemærk table_prefix)
items = execute_query_single( items = execute_query(
"SELECT * FROM template_items ORDER BY created_at DESC" "SELECT * FROM template_items ORDER BY created_at DESC"
) )
@ -58,7 +58,9 @@ async def get_item(item_id: int):
     try:
         item = execute_query(
             "SELECT * FROM template_items WHERE id = %s",
-            (item_id,))
+            (item_id,),
+            fetchone=True
+        )

         if not item:
             raise HTTPException(status_code=404, detail="Item not found")
@ -243,7 +245,7 @@ async def health_check():
""" """
try: try:
# Test database connectivity # Test database connectivity
result = execute_query_single("SELECT 1 as test") result = execute_query("SELECT 1 as test", fetchone=True)
return { return {
"status": "healthy", "status": "healthy",

View File

@ -11,19 +11,6 @@ CREATE TABLE IF NOT EXISTS template_items (
     updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
 );

--- Optional: Customers tabel hvis modulet har egne kunder (f.eks. sync fra eksternt system)
--- Kun nødvendigt hvis modulet har mange custom felter eller external sync
--- Ellers brug direkte foreign key til customers.id
-CREATE TABLE IF NOT EXISTS template_customers (
-    id SERIAL PRIMARY KEY,
-    name VARCHAR(255) NOT NULL,
-    external_id VARCHAR(100),  -- ID fra eksternt system hvis relevant
-    hub_customer_id INTEGER REFERENCES customers(id),  -- VIGTIG: Link til core customers
-    active BOOLEAN DEFAULT TRUE,
-    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
-    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
-);

 -- Index for performance
 CREATE INDEX IF NOT EXISTS idx_template_items_active ON template_items(active);
 CREATE INDEX IF NOT EXISTS idx_template_items_created ON template_items(created_at DESC);
@ -42,39 +29,6 @@ BEFORE UPDATE ON template_items
 FOR EACH ROW
 EXECUTE FUNCTION update_template_items_updated_at();

--- Trigger for auto-linking customers (hvis template_customers tabel oprettes)
--- Dette linker automatisk nye kunder til core customers baseret på navn match
-CREATE OR REPLACE FUNCTION auto_link_template_customer()
-RETURNS TRIGGER AS $$
-DECLARE
-    matched_hub_id INTEGER;
-BEGIN
-    -- Hvis hub_customer_id allerede er sat, skip
-    IF NEW.hub_customer_id IS NOT NULL THEN
-        RETURN NEW;
-    END IF;
-
-    -- Find matching hub customer baseret på navn
-    SELECT id INTO matched_hub_id
-    FROM customers
-    WHERE LOWER(TRIM(name)) = LOWER(TRIM(NEW.name))
-    LIMIT 1;
-
-    -- Hvis match fundet, sæt hub_customer_id
-    IF matched_hub_id IS NOT NULL THEN
-        NEW.hub_customer_id := matched_hub_id;
-    END IF;
-
-    RETURN NEW;
-END;
-$$ LANGUAGE plpgsql;
-
-CREATE TRIGGER trigger_auto_link_template_customer
-BEFORE INSERT OR UPDATE OF name
-ON template_customers
-FOR EACH ROW
-EXECUTE FUNCTION auto_link_template_customer();

 -- Indsæt test data (optional)
 INSERT INTO template_items (name, description)
 VALUES
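The trigger removed in this migration matched module customers to core customers by case- and whitespace-insensitive name comparison (`LOWER(TRIM(name))`). The same matching rule can be sketched in Python (function and sample names are illustrative):

```python
from typing import Dict, Optional


def normalize_name(name: str) -> str:
    """Mirror the SQL LOWER(TRIM(name)) normalisation."""
    return name.strip().lower()


def auto_link(module_customer_name: str, hub_customers: Dict[int, str]) -> Optional[int]:
    """Return the id of the first hub customer whose normalised name
    matches, or None when there is no match (as the trigger left
    hub_customer_id unset)."""
    wanted = normalize_name(module_customer_name)
    for hub_id, hub_name in hub_customers.items():
        if normalize_name(hub_name) == wanted:
            return hub_id
    return None


hub = {1: "Acme A/S", 2: "Beta ApS"}
assert auto_link("  acme a/s ", hub) == 1
assert auto_link("Gamma", hub) is None
```

Like the SQL `LIMIT 1`, this takes the first match, so duplicate hub names resolve non-deterministically; exact-name matching also misses spelling variants, which is presumably why the trigger never overwrote an already-set `hub_customer_id`.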

View File

@ -29,7 +29,7 @@ async def get_items():
read_only = get_module_config("test_module", "READ_ONLY", "true") == "true" read_only = get_module_config("test_module", "READ_ONLY", "true") == "true"
# Hent items (bemærk table_prefix) # Hent items (bemærk table_prefix)
items = execute_query_single( items = execute_query(
"SELECT * FROM test_module_items ORDER BY created_at DESC" "SELECT * FROM test_module_items ORDER BY created_at DESC"
) )
@ -58,7 +58,9 @@ async def get_item(item_id: int):
     try:
         item = execute_query(
             "SELECT * FROM test_module_items WHERE id = %s",
-            (item_id,))
+            (item_id,),
+            fetchone=True
+        )

         if not item:
             raise HTTPException(status_code=404, detail="Item not found")
@ -243,7 +245,7 @@ async def health_check():
""" """
try: try:
# Test database connectivity # Test database connectivity
result = execute_query_single("SELECT 1 as test") result = execute_query("SELECT 1 as test", fetchone=True)
return { return {
"status": "healthy", "status": "healthy",

View File

@ -1,273 +0,0 @@
from fastapi import APIRouter, HTTPException
from app.core.database import execute_query
from typing import List, Optional, Dict, Any
from pydantic import BaseModel
from datetime import datetime, date
import logging
logger = logging.getLogger(__name__)
router = APIRouter()
# Pydantic Models
class PrepaidCard(BaseModel):
id: Optional[int] = None
card_number: str
customer_id: int
purchased_hours: float
used_hours: float
remaining_hours: float
price_per_hour: float
total_amount: float
status: str
purchased_at: Optional[datetime] = None
expires_at: Optional[datetime] = None
economic_invoice_number: Optional[str] = None
economic_product_number: Optional[str] = None
notes: Optional[str] = None
created_at: Optional[datetime] = None
updated_at: Optional[datetime] = None
class PrepaidCardCreate(BaseModel):
customer_id: int
purchased_hours: float
price_per_hour: float
expires_at: Optional[date] = None
notes: Optional[str] = None
@router.get("/prepaid-cards", response_model=List[Dict[str, Any]])
async def get_prepaid_cards(status: Optional[str] = None, customer_id: Optional[int] = None):
"""
Get all prepaid cards with customer information
"""
try:
query = """
SELECT
pc.*,
c.name as customer_name,
c.email as customer_email,
(SELECT COUNT(*) FROM tticket_prepaid_transactions WHERE card_id = pc.id) as transaction_count
FROM tticket_prepaid_cards pc
LEFT JOIN customers c ON pc.customer_id = c.id
WHERE 1=1
"""
params = []
if status:
query += " AND pc.status = %s"
params.append(status)
if customer_id:
query += " AND pc.customer_id = %s"
params.append(customer_id)
query += " ORDER BY pc.created_at DESC"
cards = execute_query(query, tuple(params) if params else None)
logger.info(f"✅ Retrieved {len(cards) if cards else 0} prepaid cards")
return cards or []
except Exception as e:
logger.error(f"❌ Error fetching prepaid cards: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.get("/prepaid-cards/{card_id}", response_model=Dict[str, Any])
async def get_prepaid_card(card_id: int):
"""
Get a specific prepaid card with transactions
"""
try:
result = execute_query("""
SELECT
pc.*,
c.name as customer_name,
c.email as customer_email
FROM tticket_prepaid_cards pc
LEFT JOIN customers c ON pc.customer_id = c.id
WHERE pc.id = %s
""", (card_id,))
if not result or len(result) == 0:
raise HTTPException(status_code=404, detail="Prepaid card not found")
card = result[0]
# Get transactions
transactions = execute_query("""
SELECT
pt.*,
w.ticket_id,
t.subject as ticket_title
FROM tticket_prepaid_transactions pt
LEFT JOIN tticket_worklog w ON pt.worklog_id = w.id
LEFT JOIN tticket_tickets t ON w.ticket_id = t.id
WHERE pt.card_id = %s
ORDER BY pt.created_at DESC
""", (card_id,))
card['transactions'] = transactions or []
return card
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error fetching prepaid card {card_id}: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.post("/prepaid-cards", response_model=Dict[str, Any])
async def create_prepaid_card(card: PrepaidCardCreate):
"""
Create a new prepaid card
"""
try:
# Calculate total amount
total_amount = card.purchased_hours * card.price_per_hour
# Check if customer already has active card
existing = execute_query("""
SELECT id FROM tticket_prepaid_cards
WHERE customer_id = %s AND status = 'active'
""", (card.customer_id,))
if existing and len(existing) > 0:
raise HTTPException(
status_code=400,
detail="Customer already has an active prepaid card"
)
# Create card (need to use fetch=False for INSERT RETURNING)
conn = None
try:
from app.core.database import get_db_connection, release_db_connection
from psycopg2.extras import RealDictCursor
conn = get_db_connection()
with conn.cursor(cursor_factory=RealDictCursor) as cursor:
cursor.execute("""
INSERT INTO tticket_prepaid_cards
(customer_id, purchased_hours, price_per_hour, total_amount, expires_at, notes)
VALUES (%s, %s, %s, %s, %s, %s)
RETURNING *
""", (
card.customer_id,
card.purchased_hours,
card.price_per_hour,
total_amount,
card.expires_at,
card.notes
))
conn.commit()
result = cursor.fetchone()
logger.info(f"✅ Created prepaid card: {result['card_number']}")
return result
finally:
if conn:
release_db_connection(conn)
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error creating prepaid card: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.put("/prepaid-cards/{card_id}/status")
async def update_card_status(card_id: int, status: str):
"""
Update prepaid card status (cancel, reactivate)
"""
try:
if status not in ['active', 'cancelled']:
raise HTTPException(status_code=400, detail="Invalid status")
conn = None
try:
from app.core.database import get_db_connection, release_db_connection
from psycopg2.extras import RealDictCursor
conn = get_db_connection()
with conn.cursor(cursor_factory=RealDictCursor) as cursor:
cursor.execute("""
UPDATE tticket_prepaid_cards
SET status = %s
WHERE id = %s
RETURNING *
""", (status, card_id))
conn.commit()
result = cursor.fetchone()
if not result:
raise HTTPException(status_code=404, detail="Card not found")
logger.info(f"✅ Updated card {card_id} status to {status}")
return result
finally:
if conn:
release_db_connection(conn)
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error updating card status: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.delete("/prepaid-cards/{card_id}")
async def delete_prepaid_card(card_id: int):
"""
Delete a prepaid card (only if no transactions)
"""
try:
# Check for transactions
transactions = execute_query("""
SELECT COUNT(*) as count FROM tticket_prepaid_transactions
WHERE card_id = %s
""", (card_id,))
if transactions and len(transactions) > 0 and transactions[0]['count'] > 0:
raise HTTPException(
status_code=400,
detail="Cannot delete card with transactions"
)
execute_query("DELETE FROM tticket_prepaid_cards WHERE id = %s", (card_id,), fetch=False)
logger.info(f"✅ Deleted prepaid card {card_id}")
return {"message": "Card deleted successfully"}
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Error deleting card: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.get("/prepaid-cards/stats/summary", response_model=Dict[str, Any])
async def get_prepaid_stats():
"""
Get prepaid cards statistics
"""
try:
result = execute_query("""
SELECT
COUNT(*) FILTER (WHERE status = 'active') as active_count,
COUNT(*) FILTER (WHERE status = 'depleted') as depleted_count,
COUNT(*) FILTER (WHERE status = 'expired') as expired_count,
COUNT(*) FILTER (WHERE status = 'cancelled') as cancelled_count,
COALESCE(SUM(remaining_hours) FILTER (WHERE status = 'active'), 0) as total_remaining_hours,
COALESCE(SUM(used_hours), 0) as total_used_hours,
COALESCE(SUM(purchased_hours), 0) as total_purchased_hours,
COALESCE(SUM(total_amount), 0) as total_revenue
FROM tticket_prepaid_cards
""")
return result[0] if result and len(result) > 0 else {}
except Exception as e:
logger.error(f"❌ Error fetching prepaid stats: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
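The deleted prepaid-card API derived `total_amount` as `purchased_hours * price_per_hour`, and its stats query distinguishes `active` from `depleted` cards. A small pure-Python sketch of that bookkeeping (the `charge_card` helper and the flip-to-depleted-at-exactly-zero rule are assumptions; the diff does not show where this happened in practice, likely a DB trigger writing `tticket_prepaid_transactions`):

```python
def charge_card(card: dict, hours: float) -> dict:
    """Apply a worklog charge: move hours from remaining to used, and
    mark the card 'depleted' once nothing is left (assumed rule)."""
    if card["status"] != "active":
        raise ValueError("card is not active")
    if hours > card["remaining_hours"]:
        raise ValueError("not enough hours left on card")
    updated = dict(card)
    updated["used_hours"] += hours
    updated["remaining_hours"] -= hours
    if updated["remaining_hours"] == 0:
        updated["status"] = "depleted"
    return updated


card = {
    "purchased_hours": 10.0,
    "used_hours": 0.0,
    "remaining_hours": 10.0,
    "price_per_hour": 950.0,
    "total_amount": 10.0 * 950.0,  # computed at creation, as in the endpoint
    "status": "active",
}
card = charge_card(card, 7.5)
assert card["remaining_hours"] == 2.5
card = charge_card(card, 2.5)
assert card["status"] == "depleted"
```

Note the endpoint also enforced one active card per customer before insert; a charge against a `cancelled` or `depleted` card should fail the same way the `status != 'active'` guard does here.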

View File

@ -1,34 +0,0 @@
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
import logging
logger = logging.getLogger(__name__)
router = APIRouter()
templates = Jinja2Templates(directory=["app/prepaid/frontend", "app/shared/frontend"])
@router.get("/prepaid-cards", response_class=HTMLResponse)
async def prepaid_cards_page(request: Request):
"""
Prepaid cards overview page
"""
logger.info("🔍 Rendering prepaid cards page")
return templates.TemplateResponse("index.html", {
"request": request,
"page_title": "Prepaid Cards"
})
@router.get("/prepaid-cards/{card_id}", response_class=HTMLResponse)
async def prepaid_card_detail(request: Request, card_id: int):
"""
Prepaid card detail page
"""
logger.info(f"🔍 Rendering prepaid card detail: {card_id}")
return templates.TemplateResponse("detail.html", {
"request": request,
"page_title": "Card Details",
"card_id": card_id
})

View File

@ -1,313 +0,0 @@
{% extends "base.html" %}
{% block title %}Card Details - BMC Hub{% endblock %}
{% block content %}
<div class="container-fluid py-4">
<!-- Header -->
<div class="row mb-4">
<div class="col">
<nav aria-label="breadcrumb">
<ol class="breadcrumb">
<li class="breadcrumb-item"><a href="/prepaid-cards">Prepaid Cards</a></li>
<li class="breadcrumb-item active" aria-current="page" id="cardNumber">Loading...</li>
</ol>
</nav>
<h1 class="h3 mb-0" id="pageTitle">💳 Loading...</h1>
</div>
<div class="col-auto">
<button class="btn btn-outline-secondary" onclick="window.location.href='/prepaid-cards'">
<i class="bi bi-arrow-left"></i> Tilbage
</button>
</div>
</div>
<!-- Card Info -->
<div class="row g-4 mb-4">
<div class="col-md-8">
<div class="card border-0 shadow-sm">
<div class="card-header bg-white border-0 py-3">
<h5 class="mb-0">Kort Information</h5>
</div>
<div class="card-body">
<div class="row g-3" id="cardInfo">
<div class="col-12 text-center py-5">
<div class="spinner-border text-primary" role="status">
<span class="visually-hidden">Loading...</span>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="col-md-4">
<!-- Stats Card -->
<div class="card border-0 shadow-sm mb-3">
<div class="card-header bg-white border-0 py-3">
<h5 class="mb-0">Oversigt</h5>
</div>
<div class="card-body">
<div class="mb-3 pb-3 border-bottom">
<small class="text-muted d-block mb-1">Købte Timer</small>
<h4 class="mb-0" id="statPurchased">-</h4>
</div>
<div class="mb-3 pb-3 border-bottom">
<small class="text-muted d-block mb-1">Brugte Timer</small>
<h4 class="mb-0 text-info" id="statUsed">-</h4>
</div>
<div class="mb-3 pb-3 border-bottom">
<small class="text-muted d-block mb-1">Tilbage</small>
<h4 class="mb-0 text-success" id="statRemaining">-</h4>
</div>
<div>
<small class="text-muted d-block mb-1">Total Beløb</small>
<h4 class="mb-0 text-primary" id="statTotal">-</h4>
</div>
</div>
</div>
<!-- Actions Card -->
<div class="card border-0 shadow-sm">
<div class="card-header bg-white border-0 py-3">
<h5 class="mb-0">Handlinger</h5>
</div>
<div class="card-body">
<div id="actionButtons">
<!-- Will be populated -->
</div>
</div>
</div>
</div>
</div>
<!-- Transactions -->
<div class="card border-0 shadow-sm">
<div class="card-header bg-white border-0 py-3">
<h5 class="mb-0">Transaktioner</h5>
</div>
<div class="card-body">
<div class="table-responsive">
<table class="table table-hover align-middle" id="transactionsTable">
<thead class="table-light">
<tr>
<th>Dato</th>
<th>Ticket</th>
<th>Beskrivelse</th>
<th class="text-end">Timer</th>
<th class="text-end">Beløb</th>
</tr>
</thead>
<tbody id="transactionsBody">
<tr>
<td colspan="5" class="text-center py-5">
<div class="spinner-border text-primary" role="status">
<span class="visually-hidden">Loading...</span>
</div>
</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
<script>
const cardId = {{ card_id }};
// Load card details
document.addEventListener('DOMContentLoaded', () => {
loadCardDetails();
});
async function loadCardDetails() {
try {
const response = await fetch(`/api/v1/prepaid-cards/${cardId}`);
if (!response.ok) {
throw new Error('Card not found');
}
const card = await response.json();
// Update header
document.getElementById('cardNumber').textContent = card.card_number;
document.getElementById('pageTitle').textContent = `💳 ${card.card_number}`;
// Update stats
document.getElementById('statPurchased').textContent =
parseFloat(card.purchased_hours).toFixed(1) + ' t';
document.getElementById('statUsed').textContent =
parseFloat(card.used_hours).toFixed(1) + ' t';
document.getElementById('statRemaining').textContent =
parseFloat(card.remaining_hours).toFixed(1) + ' t';
document.getElementById('statTotal').textContent =
new Intl.NumberFormat('da-DK', {
style: 'currency',
currency: 'DKK'
}).format(parseFloat(card.total_amount));
// Update card info
const statusBadge = getStatusBadge(card.status);
const expiresAt = card.expires_at ?
new Date(card.expires_at).toLocaleDateString('da-DK', {
year: 'numeric',
month: 'long',
day: 'numeric'
}) : 'Ingen udløbsdato';
document.getElementById('cardInfo').innerHTML = `
<div class="col-md-6">
<label class="small text-muted">Kortnummer</label>
<p class="mb-0"><strong>${card.card_number}</strong></p>
</div>
<div class="col-md-6">
<label class="small text-muted">Status</label>
<p class="mb-0">${statusBadge}</p>
</div>
<div class="col-md-6">
<label class="small text-muted">Kunde</label>
<p class="mb-0">
<a href="/customers/${card.customer_id}" class="text-decoration-none">
${card.customer_name || '-'}
</a>
</p>
<small class="text-muted">${card.customer_email || ''}</small>
</div>
<div class="col-md-6">
<label class="small text-muted">Pris pr. Time</label>
<p class="mb-0"><strong>${parseFloat(card.price_per_hour).toFixed(2)} kr</strong></p>
</div>
<div class="col-md-6">
<label class="small text-muted">Købt Dato</label>
<p class="mb-0">${new Date(card.purchased_at).toLocaleDateString('da-DK')}</p>
</div>
<div class="col-md-6">
<label class="small text-muted">Udløber</label>
<p class="mb-0">${expiresAt}</p>
</div>
${card.economic_invoice_number ? `
<div class="col-md-6">
<label class="small text-muted">e-conomic Fakturanr.</label>
<p class="mb-0">${card.economic_invoice_number}</p>
</div>
` : ''}
${card.notes ? `
<div class="col-12">
<label class="small text-muted">Bemærkninger</label>
<p class="mb-0">${card.notes}</p>
</div>
` : ''}
`;
// Update action buttons
const actions = [];
if (card.status === 'active') {
actions.push(`
<button class="btn btn-warning w-100 mb-2" onclick="cancelCard()">
<i class="bi bi-x-circle"></i> Annuller Kort
</button>
`);
}
document.getElementById('actionButtons').innerHTML = actions.join('') ||
'<p class="text-muted text-center mb-0">Ingen handlinger tilgængelige</p>';
// Render transactions
renderTransactions(card.transactions || []);
} catch (error) {
console.error('Error loading card:', error);
document.getElementById('cardInfo').innerHTML = `
<div class="col-12 text-center text-danger py-5">
<i class="bi bi-exclamation-circle fs-1 mb-3"></i>
<p>❌ Fejl ved indlæsning: ${error.message}</p>
<button class="btn btn-primary" onclick="window.location.href='/prepaid-cards'">
Tilbage til oversigt
</button>
</div>
`;
}
}
function renderTransactions(transactions) {
const tbody = document.getElementById('transactionsBody');
if (!transactions || transactions.length === 0) {
tbody.innerHTML = `
<tr><td colspan="5" class="text-center text-muted py-5">
Ingen transaktioner endnu
</td></tr>
`;
return;
}
tbody.innerHTML = transactions.map(t => `
<tr>
<td>${new Date(t.created_at).toLocaleDateString('da-DK')}</td>
<td>
${t.ticket_id ?
`<a href="/ticket/tickets/${t.ticket_id}" class="text-decoration-none">
#${t.ticket_id} - ${t.ticket_title || 'Ticket'}
</a>` : '-'}
</td>
<td>${t.description || '-'}</td>
<td class="text-end">${parseFloat(t.hours_used).toFixed(2)} t</td>
<td class="text-end">${parseFloat(t.amount).toFixed(2)} kr</td>
</tr>
`).join('');
}
function getStatusBadge(status) {
const badges = {
'active': '<span class="badge bg-success">Aktiv</span>',
'depleted': '<span class="badge bg-secondary">Opbrugt</span>',
'expired': '<span class="badge bg-danger">Udløbet</span>',
'cancelled': '<span class="badge bg-warning">Annulleret</span>'
};
return badges[status] || status;
}
async function cancelCard() {
if (!confirm('Er du sikker på at du vil annullere dette kort?')) {
return;
}
try {
const response = await fetch(`/api/v1/prepaid-cards/${cardId}/status`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ status: 'cancelled' })
});
if (!response.ok) throw new Error('Fejl ved annullering');
alert('✅ Kort annulleret');
loadCardDetails(); // Reload
} catch (error) {
console.error('Error cancelling card:', error);
alert('❌ Fejl: ' + error.message);
}
}
</script>
<style>
.breadcrumb {
background: transparent;
padding: 0;
}
.card-header {
font-weight: 600;
}
.table th {
font-weight: 600;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.5px;
color: var(--bs-secondary);
}
</style>
{% endblock %}

View File

@ -1,461 +0,0 @@
{% extends "base.html" %}
{% block title %}Prepaid Cards - BMC Hub{% endblock %}
{% block content %}
<div class="container-fluid py-4">
<!-- Header -->
<div class="row mb-4">
<div class="col">
<h1 class="h3 mb-0">💳 Prepaid Cards</h1>
<p class="text-muted">Oversigt og kontrol af kunders timekort</p>
</div>
<div class="col-auto">
<button class="btn btn-primary" onclick="openCreateModal()">
<i class="bi bi-plus-circle"></i> Opret Nyt Kort
</button>
</div>
</div>
<!-- Stats Row -->
<div class="row g-3 mb-4" id="statsCards">
<div class="col-md-3">
<div class="card border-0 shadow-sm">
<div class="card-body">
<div class="d-flex align-items-center">
<div class="flex-shrink-0">
<div class="rounded-circle bg-success bg-opacity-10 p-3">
<i class="bi bi-credit-card-2-front text-success fs-4"></i>
</div>
</div>
<div class="flex-grow-1 ms-3">
<p class="text-muted small mb-1">Aktive Kort</p>
<h3 class="mb-0" id="activeCount">-</h3>
</div>
</div>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card border-0 shadow-sm">
<div class="card-body">
<div class="d-flex align-items-center">
<div class="flex-shrink-0">
<div class="rounded-circle bg-primary bg-opacity-10 p-3">
<i class="bi bi-clock-history text-primary fs-4"></i>
</div>
</div>
<div class="flex-grow-1 ms-3">
<p class="text-muted small mb-1">Tilbageværende Timer</p>
<h3 class="mb-0" id="remainingHours">-</h3>
</div>
</div>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card border-0 shadow-sm">
<div class="card-body">
<div class="d-flex align-items-center">
<div class="flex-shrink-0">
<div class="rounded-circle bg-info bg-opacity-10 p-3">
<i class="bi bi-hourglass-split text-info fs-4"></i>
</div>
</div>
<div class="flex-grow-1 ms-3">
<p class="text-muted small mb-1">Brugte Timer</p>
<h3 class="mb-0" id="usedHours">-</h3>
</div>
</div>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card border-0 shadow-sm">
<div class="card-body">
<div class="d-flex align-items-center">
<div class="flex-shrink-0">
<div class="rounded-circle bg-warning bg-opacity-10 p-3">
<i class="bi bi-currency-dollar text-warning fs-4"></i>
</div>
</div>
<div class="flex-grow-1 ms-3">
<p class="text-muted small mb-1">Total Omsætning</p>
<h3 class="mb-0" id="totalRevenue">-</h3>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Filter Bar -->
<div class="card border-0 shadow-sm mb-4">
<div class="card-body">
<div class="row g-3">
<div class="col-md-3">
<label class="form-label small text-muted">Status</label>
<select class="form-select" id="statusFilter" onchange="loadCards()">
<option value="">Alle</option>
<option value="active">Aktive</option>
<option value="depleted">Opbrugt</option>
<option value="expired">Udløbet</option>
<option value="cancelled">Annulleret</option>
</select>
</div>
<div class="col-md-6">
<label class="form-label small text-muted">Søg Kunde</label>
<input type="text" class="form-control" id="customerSearch"
placeholder="Søg efter kundenavn eller email...">
</div>
<div class="col-md-3 d-flex align-items-end">
<button class="btn btn-outline-secondary w-100" onclick="resetFilters()">
<i class="bi bi-x-circle"></i> Nulstil
</button>
</div>
</div>
</div>
</div>
<!-- Cards Table -->
<div class="card border-0 shadow-sm">
<div class="card-body">
<div class="table-responsive">
<table class="table table-hover align-middle" id="cardsTable">
<thead class="table-light">
<tr>
<th>Kortnummer</th>
<th>Kunde</th>
<th class="text-end">Købte Timer</th>
<th class="text-end">Brugte Timer</th>
<th class="text-end">Tilbage</th>
<th class="text-end">Pris/Time</th>
<th class="text-end">Total</th>
<th>Status</th>
<th>Udløber</th>
<th class="text-end">Handlinger</th>
</tr>
</thead>
<tbody id="cardsTableBody">
<tr>
<td colspan="10" class="text-center py-5">
<div class="spinner-border text-primary" role="status">
<span class="visually-hidden">Loading...</span>
</div>
</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
<!-- Create Card Modal -->
<div class="modal fade" id="createCardModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">💳 Opret Nyt Prepaid Kort</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<form id="createCardForm">
<div class="mb-3">
<label class="form-label">Kunde *</label>
<select class="form-select" id="customerId" required>
<option value="">Vælg kunde...</option>
</select>
</div>
<div class="mb-3">
<label class="form-label">Antal Timer *</label>
<input type="number" class="form-control" id="purchasedHours"
step="0.5" min="1" required>
</div>
<div class="mb-3">
<label class="form-label">Pris pr. Time (DKK) *</label>
<input type="number" class="form-control" id="pricePerHour"
step="0.01" min="0" required>
</div>
<div class="mb-3">
<label class="form-label">Udløbsdato (valgfri)</label>
<input type="date" class="form-control" id="expiresAt">
</div>
<div class="mb-3">
<label class="form-label">Bemærkninger</label>
<textarea class="form-control" id="notes" rows="3"></textarea>
</div>
<div class="alert alert-info small">
<i class="bi bi-info-circle"></i>
Kortnummeret bliver automatisk genereret
</div>
</form>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Annuller</button>
<button type="button" class="btn btn-primary" onclick="createCard()">
<i class="bi bi-plus-circle"></i> Opret Kort
</button>
</div>
</div>
</div>
</div>
<script>
let createCardModal;
// Initialize
document.addEventListener('DOMContentLoaded', () => {
createCardModal = new bootstrap.Modal(document.getElementById('createCardModal'));
loadStats();
loadCards();
loadCustomers();
});
// Load Statistics
async function loadStats() {
try {
const response = await fetch('/api/v1/prepaid-cards/stats/summary');
const stats = await response.json();
document.getElementById('activeCount').textContent = stats.active_count || 0;
document.getElementById('remainingHours').textContent =
parseFloat(stats.total_remaining_hours || 0).toFixed(1) + ' t';
document.getElementById('usedHours').textContent =
parseFloat(stats.total_used_hours || 0).toFixed(1) + ' t';
document.getElementById('totalRevenue').textContent =
new Intl.NumberFormat('da-DK', {
style: 'currency',
currency: 'DKK'
}).format(parseFloat(stats.total_revenue || 0));
} catch (error) {
console.error('Error loading stats:', error);
}
}
// Load Cards
async function loadCards() {
const status = document.getElementById('statusFilter').value;
const search = document.getElementById('customerSearch').value.toLowerCase();
try {
let url = '/api/v1/prepaid-cards';
if (status) url += `?status=${status}`;
const response = await fetch(url);
const cards = await response.json();
// Filter by search
const filtered = cards.filter(card => {
if (!search) return true;
const name = (card.customer_name || '').toLowerCase();
const email = (card.customer_email || '').toLowerCase();
return name.includes(search) || email.includes(search);
});
renderCards(filtered);
} catch (error) {
console.error('Error loading cards:', error);
document.getElementById('cardsTableBody').innerHTML = `
<tr><td colspan="10" class="text-center text-danger">
❌ Fejl ved indlæsning: ${error.message}
</td></tr>
`;
}
}
// Render Cards Table
function renderCards(cards) {
const tbody = document.getElementById('cardsTableBody');
if (!cards || cards.length === 0) {
tbody.innerHTML = `
<tr><td colspan="10" class="text-center text-muted py-5">
Ingen kort fundet
</td></tr>
`;
return;
}
tbody.innerHTML = cards.map(card => {
const statusBadge = getStatusBadge(card.status);
const expiresAt = card.expires_at ?
new Date(card.expires_at).toLocaleDateString('da-DK') : '-';
// Parse decimal strings to numbers
const purchasedHours = parseFloat(card.purchased_hours);
const usedHours = parseFloat(card.used_hours);
const remainingHours = parseFloat(card.remaining_hours);
const pricePerHour = parseFloat(card.price_per_hour);
const totalAmount = parseFloat(card.total_amount);
return `
<tr>
<td>
<a href="/prepaid-cards/${card.id}" class="text-decoration-none">
<strong>${card.card_number}</strong>
</a>
</td>
<td>
<div>${card.customer_name || '-'}</div>
<small class="text-muted">${card.customer_email || ''}</small>
</td>
<td class="text-end">${purchasedHours.toFixed(1)} t</td>
<td class="text-end">${usedHours.toFixed(1)} t</td>
<td class="text-end">
<strong class="${remainingHours < 5 ? 'text-danger' : 'text-success'}">
${remainingHours.toFixed(1)} t
</strong>
</td>
<td class="text-end">${pricePerHour.toFixed(2)} kr</td>
<td class="text-end"><strong>${totalAmount.toFixed(2)} kr</strong></td>
<td>${statusBadge}</td>
<td>${expiresAt}</td>
<td class="text-end">
<div class="btn-group btn-group-sm">
<a href="/prepaid-cards/${card.id}" class="btn btn-outline-primary"
title="Se detaljer">
<i class="bi bi-eye"></i>
</a>
${card.status === 'active' ? `
<button class="btn btn-outline-warning"
onclick="cancelCard(${card.id})" title="Annuller">
<i class="bi bi-x-circle"></i>
</button>
` : ''}
</div>
</td>
</tr>
`;
}).join('');
}
// Get Status Badge
function getStatusBadge(status) {
const badges = {
'active': '<span class="badge bg-success">Aktiv</span>',
'depleted': '<span class="badge bg-secondary">Opbrugt</span>',
'expired': '<span class="badge bg-danger">Udløbet</span>',
'cancelled': '<span class="badge bg-warning">Annulleret</span>'
};
return badges[status] || status;
}
// Load Customers for Dropdown
async function loadCustomers() {
try {
const response = await fetch('/api/v1/customers');
const customers = await response.json();
const select = document.getElementById('customerId');
select.innerHTML = '<option value="">Vælg kunde...</option>' +
customers.map(c => `<option value="${c.id}">${c.name}</option>`).join('');
} catch (error) {
console.error('Error loading customers:', error);
}
}
// Open Create Modal
function openCreateModal() {
document.getElementById('createCardForm').reset();
createCardModal.show();
}
// Create Card
async function createCard() {
const form = document.getElementById('createCardForm');
if (!form.checkValidity()) {
form.reportValidity();
return;
}
const data = {
customer_id: parseInt(document.getElementById('customerId').value),
purchased_hours: parseFloat(document.getElementById('purchasedHours').value),
price_per_hour: parseFloat(document.getElementById('pricePerHour').value),
expires_at: document.getElementById('expiresAt').value || null,
notes: document.getElementById('notes').value || null
};
try {
const response = await fetch('/api/v1/prepaid-cards', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(data)
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail || 'Fejl ved oprettelse');
}
createCardModal.hide();
loadStats();
loadCards();
alert('✅ Prepaid kort oprettet!');
} catch (error) {
console.error('Error creating card:', error);
alert('❌ Fejl: ' + error.message);
}
}
// Cancel Card
async function cancelCard(cardId) {
if (!confirm('Er du sikker på at du vil annullere dette kort?')) {
return;
}
try {
const response = await fetch(`/api/v1/prepaid-cards/${cardId}/status`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ status: 'cancelled' })
});
if (!response.ok) throw new Error('Fejl ved annullering');
loadStats();
loadCards();
alert('✅ Kort annulleret');
} catch (error) {
console.error('Error cancelling card:', error);
alert('❌ Fejl: ' + error.message);
}
}
// Reset Filters
function resetFilters() {
document.getElementById('statusFilter').value = '';
document.getElementById('customerSearch').value = '';
loadCards();
}
// Live search on customer input
document.getElementById('customerSearch').addEventListener('input', loadCards);
</script>
<style>
.table th {
font-weight: 600;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.5px;
color: var(--bs-secondary);
}
.card {
transition: transform 0.2s, box-shadow 0.2s;
}
.card:hover {
transform: translateY(-2px);
box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15) !important;
}
.btn-group-sm .btn {
padding: 0.25rem 0.5rem;
}
</style>
{% endblock %}
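The Total column and the create-card payload above imply a simple server-side derivation: `total_amount = purchased_hours × price_per_hour`. A minimal sketch of that calculation, assuming `Decimal`-based money handling (the helper name is hypothetical, not from the codebase):

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical helper mirroring the implied calculation:
# total_amount = purchased_hours * price_per_hour, rounded to 2 decimals (øre).
def card_total(purchased_hours: str, price_per_hour: str) -> Decimal:
    total = Decimal(purchased_hours) * Decimal(price_per_hour)
    return total.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(card_total("10.5", "850.00"))  # → 8925.00
```

Using `Decimal` over strings avoids the float rounding drift that `parseFloat` tolerates on the display side but that should not reach stored amounts.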


@@ -1,213 +0,0 @@
"""
Email Activity Logger
Helper service for logging all email events
"""
import logging
from typing import Optional, Dict, Any
import json
from app.core.database import execute_query, execute_insert
logger = logging.getLogger(__name__)
class EmailActivityLogger:
"""Centralized email activity logging"""
@staticmethod
async def log(
email_id: int,
event_type: str,
description: str,
category: str = 'system',
metadata: Optional[Dict[str, Any]] = None,
user_id: Optional[int] = None,
created_by: str = 'system'
) -> Optional[int]:
"""
Log an email activity event
Args:
email_id: Email ID
event_type: Type of event (fetched, classified, workflow_executed, etc.)
description: Human-readable description
category: Event category (system, user, workflow, rule, integration)
metadata: Additional event data as dict
user_id: User ID if user-triggered
created_by: Who/what created this log entry
Returns:
Log ID or None on failure
"""
try:
metadata_json = json.dumps(metadata) if metadata else None
log_id = execute_insert(
"""INSERT INTO email_activity_log
(email_id, event_type, event_category, description, metadata, user_id, created_by)
VALUES (%s, %s, %s, %s, %s::jsonb, %s, %s)""",
(email_id, event_type, category, description, metadata_json, user_id, created_by)
)
logger.debug(f"📝 Logged email event: {event_type} for email {email_id}")
return log_id
except Exception as e:
logger.error(f"❌ Failed to log email activity: {e}")
return None
@staticmethod
async def log_fetched(email_id: int, source: str, message_id: str):
"""Log email fetch event"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='fetched',
category='system',
description=f'Email fetched from {source}',
metadata={'source': source, 'message_id': message_id}
)
@staticmethod
async def log_classified(email_id: int, classification: str, confidence: float, method: str = 'ai'):
"""Log email classification"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='classified',
category='system',
description=f'Classified as {classification} (confidence: {confidence:.2%})',
metadata={
'classification': classification,
'confidence': confidence,
'method': method
}
)
@staticmethod
async def log_workflow_executed(email_id: int, workflow_id: int, workflow_name: str,
status: str, steps_completed: int, execution_time_ms: int):
"""Log workflow execution"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='workflow_executed',
category='workflow',
description=f'Workflow "{workflow_name}" {status} ({steps_completed} steps, {execution_time_ms}ms)',
metadata={
'workflow_id': workflow_id,
'workflow_name': workflow_name,
'status': status,
'steps_completed': steps_completed,
'execution_time_ms': execution_time_ms
}
)
@staticmethod
async def log_rule_matched(email_id: int, rule_id: int, rule_name: str, action_type: str):
"""Log rule match"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='rule_matched',
category='rule',
description=f'Matched rule "{rule_name}" → action: {action_type}',
metadata={
'rule_id': rule_id,
'rule_name': rule_name,
'action_type': action_type
}
)
@staticmethod
async def log_status_changed(email_id: int, old_status: str, new_status: str, reason: Optional[str] = None):
"""Log status change"""
desc = f'Status changed: {old_status} → {new_status}'
if reason:
desc += f' ({reason})'
return await EmailActivityLogger.log(
email_id=email_id,
event_type='status_changed',
category='system',
description=desc,
metadata={
'old_status': old_status,
'new_status': new_status,
'reason': reason
}
)
@staticmethod
async def log_read(email_id: int, user_id: Optional[int] = None, username: Optional[str] = None):
"""Log email read"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='read',
category='user',
description=f'Email read by {username or "user"}',
metadata={'username': username},
user_id=user_id,
created_by=username or 'user'
)
@staticmethod
async def log_attachment_action(email_id: int, filename: str, action: str = 'downloaded'):
"""Log attachment action"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type=f'attachment_{action}',
category='user',
description=f'Attachment {action}: {filename}',
metadata={'filename': filename, 'action': action}
)
@staticmethod
async def log_linked(email_id: int, entity_type: str, entity_id: int, entity_name: str):
"""Log entity linking"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='linked',
category='system',
description=f'Linked to {entity_type}: {entity_name}',
metadata={
'entity_type': entity_type,
'entity_id': entity_id,
'entity_name': entity_name
}
)
@staticmethod
async def log_invoice_extracted(email_id: int, invoice_number: Optional[str],
amount: Optional[float], success: bool):
"""Log invoice data extraction"""
desc = f'Invoice data extraction {"succeeded" if success else "failed"}'
if success and invoice_number:
desc += f' - #{invoice_number}'
return await EmailActivityLogger.log(
email_id=email_id,
event_type='invoice_extracted',
category='integration',
description=desc,
metadata={
'success': success,
'invoice_number': invoice_number,
'amount': amount
}
)
@staticmethod
async def log_error(email_id: int, error_type: str, error_message: str, context: Optional[Dict] = None):
"""Log error event"""
return await EmailActivityLogger.log(
email_id=email_id,
event_type='error',
category='system',
description=f'Error: {error_type} - {error_message}',
metadata={
'error_type': error_type,
'error_message': error_message,
'context': context
}
)
# Export singleton instance
email_activity_logger = EmailActivityLogger()
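The logger above writes one row per event into `email_activity_log`. The shape of such a row can be sketched without a database; the helper below is illustrative only, mirroring `log_classified`'s arguments and f-string:

```python
import json

# Illustrative only: the row EmailActivityLogger.log_classified would insert
# (no DB access; mirrors the description format and metadata fields above).
def classified_row(email_id: int, classification: str,
                   confidence: float, method: str = "ai") -> dict:
    return {
        "email_id": email_id,
        "event_type": "classified",
        "event_category": "system",
        "description": f"Classified as {classification} (confidence: {confidence:.2%})",
        "metadata": json.dumps({"classification": classification,
                                "confidence": confidence,
                                "method": method}),
        "created_by": "system",
    }

row = classified_row(42, "invoice", 0.93)
print(row["description"])  # → Classified as invoice (confidence: 93.00%)
```

Storing `metadata` as serialized JSON keeps the table schema stable while letting each event type carry its own fields.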


@@ -11,8 +11,6 @@ from datetime import datetime
 from app.services.email_service import EmailService
 from app.services.email_analysis_service import EmailAnalysisService
 from app.services.simple_classifier import simple_classifier
-from app.services.email_workflow_service import email_workflow_service
-from app.services.email_activity_logger import email_activity_logger
 from app.core.config import settings
 from app.core.database import execute_query, execute_update
@@ -71,44 +69,16 @@ class EmailProcessorService:
                 email_data['id'] = email_id
                 stats['saved'] += 1
-                # Log: Email fetched and saved
-                await email_activity_logger.log_fetched(
-                    email_id=email_id,
-                    source='email_server',
-                    message_id=email_data.get('message_id', 'unknown')
-                )
                 # Step 3: Classify email with AI
                 if settings.EMAIL_AI_ENABLED and settings.EMAIL_AUTO_CLASSIFY:
                     await self._classify_and_update(email_data)
                     stats['classified'] += 1
-                # Step 4: Execute workflows based on classification
-                workflow_processed = False
-                if hasattr(settings, 'EMAIL_WORKFLOWS_ENABLED') and settings.EMAIL_WORKFLOWS_ENABLED:
-                    workflow_result = await email_workflow_service.execute_workflows(email_data)
-                    if workflow_result.get('workflows_executed', 0) > 0:
-                        logger.info(f"✅ Executed {workflow_result['workflows_executed']} workflow(s) for email {email_id}")
-                        # Mark as workflow-processed to avoid duplicate rule execution
-                        if workflow_result.get('workflows_succeeded', 0) > 0:
-                            workflow_processed = True
-                            email_data['_workflow_processed'] = True
-                # Step 5: Match against rules (legacy support) - skip if workflow already processed
-                if self.rules_enabled and not workflow_processed:
-                    # Check if workflow already processed this email
-                    existing_execution = execute_query_single(
-                        "SELECT id FROM email_workflow_executions WHERE email_id = %s AND status = 'completed' LIMIT 1",
-                        (email_id,))
-                    if existing_execution:
-                        logger.info(f"⏭️ Email {email_id} already processed by workflow, skipping rules")
-                    else:
-                        matched = await self._match_rules(email_data)
-                        if matched:
-                            stats['rules_matched'] += 1
-                elif workflow_processed:
-                    logger.info(f"⏭️ Email {email_id} processed by workflow, skipping rules (coordination)")
+                # Step 4: Match against rules
+                if self.rules_enabled:
+                    matched = await self._match_rules(email_data)
+                    if matched:
+                        stats['rules_matched'] += 1
             except Exception as e:
                 logger.error(f"❌ Error processing email: {e}")
@@ -147,20 +117,8 @@
         """
         execute_update(query, (classification, confidence, email_data['id']))
-        # Update email_data dict with classification for workflow execution
-        email_data['classification'] = classification
-        email_data['confidence_score'] = confidence
         logger.info(f"✅ Classified email {email_data['id']} as '{classification}' (confidence: {confidence:.2f})")
-        # Log: Email classified
-        await email_activity_logger.log_classified(
-            email_id=email_data['id'],
-            classification=classification,
-            confidence=confidence,
-            method='ai' if self.ai_enabled else 'keyword'
-        )
         # Update email_data for rule matching
         email_data['classification'] = classification
         email_data['confidence_score'] = confidence


@@ -1,680 +0,0 @@
"""
Email Workflow Service
Executes automated workflows based on email classification
Inspired by OmniSync architecture adapted for BMC Hub
"""
import logging
from typing import Dict, List, Optional, Any
from datetime import datetime, date
import re
import json
import hashlib
import shutil
from pathlib import Path
from decimal import Decimal
from app.core.database import execute_query, execute_query_single, execute_insert, execute_update
from app.core.config import settings
from app.services.email_activity_logger import email_activity_logger
logger = logging.getLogger(__name__)
class EmailWorkflowService:
"""Orchestrates workflow execution for classified emails"""
def __init__(self):
self.enabled = getattr(settings, 'EMAIL_WORKFLOWS_ENABLED', True)
async def execute_workflows(self, email_data: Dict) -> Dict:
"""
Execute all matching workflows for an email
Args:
email_data: Email dict with classification, confidence_score, id, etc.
Returns:
Dict with execution results
"""
if not self.enabled:
logger.info("⏭️ Workflows disabled")
return {'status': 'disabled', 'workflows_executed': 0}
email_id = email_data.get('id')
classification = email_data.get('classification')
confidence = email_data.get('confidence_score', 0.0)
if not email_id or not classification:
logger.warning(f"⚠️ Cannot execute workflows: missing email_id or classification")
return {'status': 'skipped', 'reason': 'missing_data'}
logger.info(f"🔄 Finding workflows for classification: {classification} (confidence: {confidence})")
# Find matching workflows
workflows = await self._find_matching_workflows(email_data)
if not workflows:
logger.info(f"✅ No workflows match classification: {classification}")
return {'status': 'no_match', 'workflows_executed': 0}
logger.info(f"📋 Found {len(workflows)} matching workflow(s)")
results = {
'status': 'executed',
'workflows_executed': 0,
'workflows_succeeded': 0,
'workflows_failed': 0,
'details': []
}
# Execute workflows in priority order
for workflow in workflows:
result = await self._execute_workflow(workflow, email_data)
results['details'].append(result)
results['workflows_executed'] += 1
if result['status'] == 'completed':
results['workflows_succeeded'] += 1
else:
results['workflows_failed'] += 1
# Stop if workflow has stop_on_match=true
if workflow.get('stop_on_match') and result['status'] == 'completed':
logger.info(f"🛑 Stopping workflow chain (stop_on_match=true)")
break
logger.info(f"✅ Workflow execution complete: {results['workflows_succeeded']}/{results['workflows_executed']} succeeded")
return results
async def _find_matching_workflows(self, email_data: Dict) -> List[Dict]:
"""Find all workflows that match this email"""
classification = email_data.get('classification')
confidence = email_data.get('confidence_score', 0.0)
sender = email_data.get('sender_email', '')
subject = email_data.get('subject', '')
query = """
SELECT id, name, classification_trigger, sender_pattern, subject_pattern,
confidence_threshold, workflow_steps, priority, stop_on_match
FROM email_workflows
WHERE enabled = true
AND classification_trigger = %s
AND confidence_threshold <= %s
ORDER BY priority ASC
"""
workflows = execute_query(query, (classification, confidence)) or []
# Filter by additional patterns
matching = []
for wf in workflows:
# Check sender pattern
if wf.get('sender_pattern'):
pattern = wf['sender_pattern']
if not re.search(pattern, sender, re.IGNORECASE):
logger.debug(f"⏭️ Workflow '{wf['name']}' skipped: sender doesn't match pattern")
continue
# Check subject pattern
if wf.get('subject_pattern'):
pattern = wf['subject_pattern']
if not re.search(pattern, subject, re.IGNORECASE):
logger.debug(f"⏭️ Workflow '{wf['name']}' skipped: subject doesn't match pattern")
continue
matching.append(wf)
return matching
async def _execute_workflow(self, workflow: Dict, email_data: Dict) -> Dict:
"""Execute a single workflow"""
workflow_id = workflow['id']
workflow_name = workflow['name']
email_id = email_data['id']
logger.info(f"🚀 Executing workflow: {workflow_name} (ID: {workflow_id})")
# Create execution record
execution_id = execute_insert(
"""INSERT INTO email_workflow_executions
(workflow_id, email_id, status, steps_total, result_json)
VALUES (%s, %s, 'running', %s, %s) RETURNING id""",
(workflow_id, email_id, len(workflow['workflow_steps']), json.dumps({}))
)
started_at = datetime.now()
steps = workflow['workflow_steps']
steps_completed = 0
step_results = []
try:
# Execute each step
for idx, step in enumerate(steps):
action = step.get('action')
params = step.get('params', {})
logger.info(f" ➡️ Step {idx + 1}/{len(steps)}: {action}")
step_result = await self._execute_action(action, params, email_data)
step_results.append({
'step': idx + 1,
'action': action,
'status': step_result['status'],
'result': step_result.get('result'),
'error': step_result.get('error')
})
if step_result['status'] == 'failed':
logger.error(f" ❌ Step failed: {step_result.get('error')}")
# Continue to next step even on failure (configurable later)
else:
logger.info(f" ✅ Step completed successfully")
steps_completed += 1
# Mark execution as completed
completed_at = datetime.now()
execution_time_ms = int((completed_at - started_at).total_seconds() * 1000)
execute_update(
"""UPDATE email_workflow_executions
SET status = 'completed', steps_completed = %s,
result_json = %s, completed_at = CURRENT_TIMESTAMP,
execution_time_ms = %s
WHERE id = %s""",
(steps_completed, json.dumps(step_results), execution_time_ms, execution_id)
)
# Update workflow statistics
execute_update(
"""UPDATE email_workflows
SET execution_count = execution_count + 1,
success_count = success_count + 1,
last_executed_at = CURRENT_TIMESTAMP
WHERE id = %s""",
(workflow_id,)
)
logger.info(f"✅ Workflow '{workflow_name}' completed ({execution_time_ms}ms)")
# Log: Workflow execution completed
await email_activity_logger.log_workflow_executed(
email_id=email_id,
workflow_id=workflow_id,
workflow_name=workflow_name,
status='completed',
steps_completed=steps_completed,
execution_time_ms=execution_time_ms
)
return {
'workflow_id': workflow_id,
'workflow_name': workflow_name,
'execution_id': execution_id,
'status': 'completed',
'steps_completed': steps_completed,
'steps_total': len(steps),
'execution_time_ms': execution_time_ms,
'step_results': step_results
}
except Exception as e:
logger.error(f"❌ Workflow execution failed: {e}")
# Mark execution as failed
execute_update(
"""UPDATE email_workflow_executions
SET status = 'failed', steps_completed = %s,
error_message = %s, completed_at = CURRENT_TIMESTAMP
WHERE id = %s""",
(steps_completed, str(e), execution_id)
)
# Update workflow statistics
execute_update(
"""UPDATE email_workflows
SET execution_count = execution_count + 1,
failure_count = failure_count + 1,
last_executed_at = CURRENT_TIMESTAMP
WHERE id = %s""",
(workflow_id,)
)
# Log: Workflow execution failed
await email_activity_logger.log_workflow_executed(
email_id=email_id,
workflow_id=workflow_id,
workflow_name=workflow_name,
status='failed',
steps_completed=steps_completed,
execution_time_ms=0
)
return {
'workflow_id': workflow_id,
'workflow_name': workflow_name,
'execution_id': execution_id,
'status': 'failed',
'steps_completed': steps_completed,
'steps_total': len(steps),
'error': str(e)
}
async def _execute_action(self, action: str, params: Dict, email_data: Dict) -> Dict:
"""Execute a single workflow action"""
try:
# Dispatch to specific action handler
handler_map = {
'create_ticket': self._action_create_ticket_system,
'link_email_to_ticket': self._action_link_email_to_ticket,
'create_time_entry': self._action_create_time_entry,
'link_to_vendor': self._action_link_to_vendor,
'link_to_customer': self._action_link_to_customer,
'extract_invoice_data': self._action_extract_invoice_data,
'extract_tracking_number': self._action_extract_tracking_number,
'send_slack_notification': self._action_send_slack_notification,
'send_email_notification': self._action_send_email_notification,
'mark_as_processed': self._action_mark_as_processed,
'flag_for_review': self._action_flag_for_review,
}
handler = handler_map.get(action)
if not handler:
logger.warning(f"⚠️ Unknown action: {action}")
return {
'status': 'skipped',
'error': f'Unknown action: {action}'
}
result = await handler(params, email_data)
return {
'status': 'success',
'result': result
}
except Exception as e:
logger.error(f"❌ Action '{action}' failed: {e}")
return {
'status': 'failed',
'error': str(e)
}
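The dispatcher above expects `workflow_steps` to be a JSON array of `{action, params}` objects whose `action` values match the `handler_map` keys. A hypothetical stored value (the action names are real handler names from the map; the params are illustrative):

```python
import json

# Hypothetical email_workflows.workflow_steps value; action names match the
# handler_map keys above, the params are made up for illustration.
workflow_steps = json.dumps([
    {"action": "link_to_vendor", "params": {"match_by": "email"}},
    {"action": "extract_invoice_data", "params": {}},
    {"action": "mark_as_processed", "params": {}},
])

steps = json.loads(workflow_steps)
print(len(steps), steps[0]["action"])  # → 3 link_to_vendor
```

Unknown action names fall through `handler_map.get(action)` and are reported as `skipped` rather than failing the whole workflow.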
# Action Handlers
async def _action_create_ticket_system(self, params: Dict, email_data: Dict) -> Dict:
"""Create a ticket from email using new ticket system"""
from app.ticket.backend.email_integration import EmailTicketIntegration
# Build email_data dict for ticket integration
ticket_email_data = {
'message_id': email_data.get('message_id'),
'subject': email_data.get('subject'),
'from_address': email_data.get('sender_email'),
'body': email_data.get('body_text', ''),
'html_body': email_data.get('body_html'),
'received_at': email_data.get('received_date').isoformat() if email_data.get('received_date') else None,
'in_reply_to': None, # TODO: Extract from email headers
'references': None # TODO: Extract from email headers
}
# Get params from workflow
customer_id = params.get('customer_id') or email_data.get('customer_id')
assigned_to_user_id = params.get('assigned_to_user_id')
logger.info(f"🎫 Creating ticket from email: {email_data.get('message_id')}")
result = await EmailTicketIntegration.process_email_for_ticket(
email_data=ticket_email_data,
customer_id=customer_id,
assigned_to_user_id=assigned_to_user_id
)
logger.info(f"✅ Created ticket {result.get('ticket_number')} from email")
return {
'action': 'create_ticket',
'ticket_id': result.get('ticket_id'),
'ticket_number': result.get('ticket_number'),
'created': result.get('created', False),
'linked': result.get('linked', False)
}
async def _action_link_email_to_ticket(self, params: Dict, email_data: Dict) -> Dict:
"""Link email to existing ticket"""
from app.ticket.backend.email_integration import EmailTicketIntegration
ticket_number = params.get('ticket_number')
if not ticket_number:
logger.warning("⚠️ No ticket_number provided for link_email_to_ticket action")
return {
'action': 'link_email_to_ticket',
'status': 'failed',
'error': 'ticket_number required'
}
ticket_email_data = {
'message_id': email_data.get('message_id'),
'subject': email_data.get('subject'),
'from_address': email_data.get('sender_email'),
'body': email_data.get('body_text', ''),
'html_body': email_data.get('body_html'),
'received_at': email_data.get('received_date').isoformat() if email_data.get('received_date') else None,
}
logger.info(f"🔗 Linking email to ticket {ticket_number}")
result = await EmailTicketIntegration.link_email_to_ticket(
ticket_number=ticket_number,
email_data=ticket_email_data
)
logger.info(f"✅ Linked email to ticket {ticket_number}")
return {
'action': 'link_email_to_ticket',
'ticket_id': result.get('ticket_id'),
'ticket_number': result.get('ticket_number'),
'linked': True
}
async def _action_create_time_entry(self, params: Dict, email_data: Dict) -> Dict:
"""Create time entry from email"""
logger.info(f"⏱️ Would create time entry")
# TODO: Integrate with time tracking system
return {
'action': 'create_time_entry',
'note': 'Time entry creation not yet implemented'
}
async def _action_link_to_vendor(self, params: Dict, email_data: Dict) -> Dict:
"""Link email to vendor"""
match_by = params.get('match_by', 'email')
sender_email = email_data.get('sender_email')
if not sender_email:
return {'action': 'link_to_vendor', 'matched': False, 'reason': 'No sender email'}
# Find vendor by email
query = "SELECT id, name FROM vendors WHERE email = %s LIMIT 1"
result = execute_query_single(query, (sender_email,))
if result:
vendor_id = result['id']
# Check if already linked to avoid duplicate updates
current_vendor = execute_query_single(
"SELECT supplier_id FROM email_messages WHERE id = %s",
(email_data['id'],))
if current_vendor and current_vendor.get('supplier_id') == vendor_id:
logger.info(f"⏭️ Email already linked to vendor {vendor_id}, skipping duplicate update")
return {
'action': 'link_to_vendor',
'matched': True,
'vendor_id': vendor_id,
'vendor_name': result['name'],
'note': 'Already linked (skipped duplicate)'
}
# Update email with vendor link
execute_update(
"UPDATE email_messages SET supplier_id = %s WHERE id = %s",
(vendor_id, email_data['id'])
)
logger.info(f"🔗 Linked email to vendor: {result['name']} (ID: {vendor_id})")
return {
'action': 'link_to_vendor',
'matched': True,
'vendor_id': vendor_id,
'vendor_name': result['name']
}
else:
logger.info(f"⚠️ No vendor found for email: {sender_email}")
return {'action': 'link_to_vendor', 'matched': False, 'reason': 'Vendor not found'}
async def _action_link_to_customer(self, params: Dict, email_data: Dict) -> Dict:
"""Link email to customer"""
logger.info(f"🔗 Would link to customer")
# TODO: Implement customer matching logic
return {
'action': 'link_to_customer',
'note': 'Customer linking not yet implemented'
}
async def _action_extract_invoice_data(self, params: Dict, email_data: Dict) -> Dict:
"""Save email PDF attachment to incoming_files for processing"""
logger.info(f"📄 Saving invoice PDF from email to incoming files")
email_id = email_data.get('id')
sender_email = email_data.get('sender_email', '')
vendor_id = email_data.get('supplier_id')
# Get PDF attachments from email
# Get all PDF attachments for this email (multi-row query)
attachments = execute_query(
    """SELECT filename, file_path, size_bytes, content_type
       FROM email_attachments
       WHERE email_id = %s AND content_type = 'application/pdf'""",
    (email_id,)
) or []
if not attachments:
logger.warning(f"⚠️ No PDF attachments found for email {email_id}")
return {
'action': 'extract_invoice_data',
'success': False,
'note': 'No PDF attachment found in email'
}
uploaded_files = []
for attachment in attachments:
try:
attachment_path = attachment['file_path']
if not attachment_path:
logger.warning(f"⚠️ No file path for attachment {attachment['filename']}")
continue
# Handle both absolute and relative paths
file_path = Path(attachment_path)
if not file_path.is_absolute():
# Try common base directories
for base in [Path.cwd(), Path('/app'), Path('.')]:
test_path = base / attachment_path
if test_path.exists():
file_path = test_path
break
if not file_path.exists():
error_msg = f"Attachment file not found: {attachment_path}"
logger.error(f"{error_msg}")
raise FileNotFoundError(error_msg)
# Calculate checksum
with open(file_path, 'rb') as f:
file_content = f.read()
checksum = hashlib.sha256(file_content).hexdigest()
# Check if file already exists
existing = execute_query(
"SELECT file_id FROM incoming_files WHERE checksum = %s",
(checksum,))
if existing:
logger.info(f"⚠️ File already exists: {attachment['filename']}")
continue
# Create uploads directory if it doesn't exist
upload_dir = Path("uploads")
upload_dir.mkdir(exist_ok=True)
# Copy file to uploads directory
timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
safe_filename = f"{timestamp}_{attachment['filename']}"
destination = upload_dir / safe_filename
shutil.copy2(file_path, destination)
# Insert into incoming_files
file_id = execute_insert(
"""INSERT INTO incoming_files
(filename, original_filename, file_path, file_size, mime_type, checksum,
status, detected_vendor_id, uploaded_at)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
RETURNING file_id""",
(
safe_filename,
attachment['filename'],
str(destination),
attachment['size_bytes'],
'application/pdf',
checksum,
'pending', # Will appear in "Mangler Behandling"
vendor_id
)
)
uploaded_files.append({
'file_id': file_id,
'filename': attachment['filename']
})
logger.info(f"✅ Saved PDF to incoming_files: {attachment['filename']} (file_id: {file_id})")
except Exception as e:
logger.error(f"❌ Failed to save attachment: {e}")
continue
if uploaded_files:
return {
'action': 'extract_invoice_data',
'success': True,
'files_uploaded': len(uploaded_files),
'file_ids': [f['file_id'] for f in uploaded_files],
'note': f"{len(uploaded_files)} PDF(er) gemt i 'Mangler Behandling'"
}
else:
return {
'action': 'extract_invoice_data',
'success': False,
'note': 'No files could be uploaded'
}
async def _action_extract_tracking_number(self, params: Dict, email_data: Dict) -> Dict:
"""Extract tracking number from email"""
body = email_data.get('body_text', '')
subject = email_data.get('subject', '')
text = f"{subject} {body}"
# Simple regex patterns for common carriers
patterns = {
'postnord': r'\b[0-9]{18}\b',
'gls': r'\b[0-9]{11}\b',
'dao': r'\b[0-9]{14}\b'
}
tracking_numbers = []
for carrier, pattern in patterns.items():
matches = re.findall(pattern, text)
if matches:
tracking_numbers.extend([{'carrier': carrier, 'number': m} for m in matches])
if tracking_numbers:
logger.info(f"📦 Extracted {len(tracking_numbers)} tracking number(s)")
# Update email with the first tracking number found
first_number = tracking_numbers[0]['number']
execute_update(
"UPDATE email_messages SET extracted_tracking_number = %s WHERE id = %s",
(first_number, email_data['id'])
)
return {
'action': 'extract_tracking_number',
'tracking_numbers': tracking_numbers
}
async def _action_send_slack_notification(self, params: Dict, email_data: Dict) -> Dict:
"""Send Slack notification"""
channel = params.get('channel', '#general')
template = params.get('template', 'New email: {{subject}}')
logger.info(f"💬 Would send Slack notification to {channel}")
# TODO: Integrate with Slack API
return {
'action': 'send_slack_notification',
'channel': channel,
'note': 'Slack integration not yet implemented'
}
async def _action_send_email_notification(self, params: Dict, email_data: Dict) -> Dict:
"""Send email notification"""
recipients = params.get('recipients', [])
logger.info(f"📧 Would send email notification to {len(recipients)} recipient(s)")
# TODO: Integrate with email sending service
return {
'action': 'send_email_notification',
'recipients': recipients,
'note': 'Email notification not yet implemented'
}
async def _action_mark_as_processed(self, params: Dict, email_data: Dict) -> Dict:
"""Mark email as processed"""
status = params.get('status', 'processed')
execute_update(
"""UPDATE email_messages
SET status = %s, processed_at = CURRENT_TIMESTAMP, auto_processed = true
WHERE id = %s""",
(status, email_data['id'])
)
logger.info(f"✅ Marked email as: {status}")
return {
'action': 'mark_as_processed',
'status': status
}
async def _action_flag_for_review(self, params: Dict, email_data: Dict) -> Dict:
"""Flag email for manual review"""
reason = params.get('reason', 'Flagged by workflow')
execute_update(
"""UPDATE email_messages
SET status = 'flagged', approval_status = 'pending_review'
WHERE id = %s""",
(email_data['id'],)
)
logger.info(f"🚩 Flagged email for review: {reason}")
return {
'action': 'flag_for_review',
'reason': reason
}
# Global instance
email_workflow_service = EmailWorkflowService()
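For reference, the carrier patterns used in `_action_extract_tracking_number` above can be exercised in isolation. A minimal sketch — the sample numbers are made up, and real PostNord/GLS/DAO number formats may differ from these digit counts:

```python
import re

# Same shape as the patterns dict in _action_extract_tracking_number
PATTERNS = {
    'postnord': r'\b[0-9]{18}\b',   # 18 digits
    'gls': r'\b[0-9]{11}\b',        # 11 digits
    'dao': r'\b[0-9]{14}\b',        # 14 digits
}

def extract_tracking_numbers(text: str) -> list:
    """Return [{'carrier': ..., 'number': ...}] for every match in text."""
    found = []
    for carrier, pattern in PATTERNS.items():
        for match in re.findall(pattern, text):
            found.append({'carrier': carrier, 'number': match})
    return found

# The \b word boundaries keep an 18-digit number from also matching the
# 11- and 14-digit patterns as substrings.
sample = "GLS: 12345678901 / PostNord: 123456789012345678"
print(extract_tracking_numbers(sample))
```

Because the patterns are tried in dict order, a multi-carrier email yields one entry per bounded match rather than overlapping hits.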

View File

@@ -118,8 +118,8 @@ class Invoice2DataService:
        decimal_sep = options.get('decimal_separator', ',')
        thousands_sep = options.get('thousands_separator', '.')
-       # Remove all whitespace first (pdf extraction may split numbers across lines)
-       value = re.sub(r'\s+', '', value)
+       # Remove all spaces first
+       value = value.replace(' ', '')
        # If both separators are present, we can determine the format
        # Danish: 148.587,98 (thousands=., decimal=,)
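This hunk swaps whitespace-tolerant normalization for a plain space strip. A minimal sketch of the regex-based approach being removed, assuming Danish formatting with `.` as thousands and `,` as decimal separator:

```python
import re

def parse_danish_amount(value: str,
                        thousands_sep: str = '.',
                        decimal_sep: str = ',') -> float:
    """Normalize a Danish-formatted amount string to a float."""
    # Collapse ALL whitespace, not just plain spaces - PDF extraction
    # can split a number across lines or insert non-breaking spaces.
    value = re.sub(r'\s+', '', value)
    if thousands_sep in value and decimal_sep in value:
        # Both present: strip thousands, convert decimal to '.'
        value = value.replace(thousands_sep, '').replace(decimal_sep, '.')
    elif decimal_sep in value:
        value = value.replace(decimal_sep, '.')
    return float(value)

print(parse_danish_amount('148.587,98'))  # 148587.98
print(parse_danish_amount('1 .530,00'))   # stray space from PDF extraction
```

With `value.replace(' ', '')` only, a number split by a tab or non-breaking space would still fail to parse, which is what the `re.sub` variant guards against.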
@@ -170,25 +170,6 @@ class Invoice2DataService:
        if lines_config:
            extracted['lines'] = self._extract_lines(text, lines_config, options)
-       # Calculate due_date if field has '+Xd' value
-       if 'due_date' in fields:
-           due_config = fields['due_date']
-           if due_config.get('parser') == 'static':
-               value = due_config.get('value', '')
-               if value.startswith('+') and value.endswith('d') and extracted.get('invoice_date'):
-                   try:
-                       from datetime import timedelta
-                       days = int(value[1:-1])
-                       inv_date = datetime.strptime(extracted['invoice_date'], '%Y-%m-%d')
-                       due_date = inv_date + timedelta(days=days)
-                       extracted['due_date'] = due_date.strftime('%Y-%m-%d')
-                       logger.info(f"✅ Calculated due_date: {extracted['due_date']} ({days} days from invoice_date)")
-                   except Exception as e:
-                       logger.warning(f"Failed to calculate due_date: {e}")
-       # Validate amounts (sum of lines vs total, VAT calculation)
-       self._validate_amounts(extracted)
        return extracted

    def _extract_lines(self, text: str, lines_configs: List[Dict], options: Dict) -> List[Dict]:
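The removed due_date branch computes a due date from a static `+Xd` template value. A standalone sketch of that calculation, using the same `'+30d'` convention:

```python
from datetime import datetime, timedelta

def calculate_due_date(invoice_date: str, offset: str) -> str:
    """Turn an ISO invoice date plus a '+Xd' offset into an ISO due date."""
    if not (offset.startswith('+') and offset.endswith('d')):
        raise ValueError(f"Unsupported offset: {offset!r}")
    days = int(offset[1:-1])  # strip '+' and trailing 'd'
    inv_date = datetime.strptime(invoice_date, '%Y-%m-%d')
    return (inv_date + timedelta(days=days)).strftime('%Y-%m-%d')

print(calculate_due_date('2024-01-15', '+30d'))  # 2024-02-14
```

The original wraps this in try/except and only logs a warning on failure, so a malformed offset never blocks extraction of the other fields.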
@@ -255,7 +236,7 @@ class Invoice2DataService:
        if field_type == 'float':
            thousands_sep = options.get('thousands_separator', ',')
            decimal_sep = options.get('decimal_separator', '.')
-           value = re.sub(r'\s+', '', value)
+           value = value.replace(' ', '')
            if thousands_sep in value and decimal_sep in value:
                value = value.replace(thousands_sep, '').replace(decimal_sep, '.')
@@ -301,61 +282,6 @@ class Invoice2DataService:
                            logger.debug(f"  ✗ Failed to extract context field {ctx_field_name}: {e}")
                    break  # Stop after first match for this pattern
-           # If header is line-wrapped (e.g. "Husleje" on one line and "(inkl...)" on next),
-           # stitch them together so description becomes "Husleje (inkl...) ...".
-           try:
-               description = line_data.get('description')
-               if isinstance(description, str) and description.lstrip().startswith('('):
-                   prefix = None
-                   for candidate in reversed(context_lines):
-                       candidate_stripped = candidate.strip()
-                       if not candidate_stripped:
-                           continue
-                       if candidate_stripped.startswith('('):
-                           continue
-                       if re.match(r'^\d', candidate_stripped):
-                           continue
-                       if candidate_stripped.lower().startswith('periode:'):
-                           continue
-                       if ' Kr ' in f" {candidate_stripped} ":
-                           continue
-                       # Avoid picking calculation/detail lines
-                       if any(token in candidate_stripped for token in ('*', '=', 'm2', '')):
-                           continue
-                       # Prefer short header-like prefixes (e.g. "Husleje")
-                       if len(candidate_stripped) <= 40:
-                           prefix = candidate_stripped
-                           break
-                   if prefix and not description.strip().lower().startswith(prefix.lower()):
-                       line_data['description'] = f"{prefix} {description.strip()}".strip()
-           except Exception as e:
-               logger.debug(f"  ✗ Failed to stitch wrapped description: {e}")
-           # Safety: skip subtotal/totals artifacts that may match loosely
-           # e.g. a line like "Kr 3 9.048,75" (no letters) should not become a line item.
-           try:
-               details = line_data.get('_line_details')
-               description = line_data.get('description')
-               if isinstance(details, str) and details.strip() == '':
-                   continue
-               text_to_check = None
-               if isinstance(description, str) and description.strip() and description.strip() != '-':
-                   text_to_check = description
-               elif isinstance(details, str) and details.strip():
-                   text_to_check = details
-               if isinstance(text_to_check, str):
-                   if not re.search(r'[A-Za-zÆØÅæøå]', text_to_check):
-                       continue
-                   lowered = text_to_check.lower()
-                   if lowered.startswith('kr') or ' moms' in lowered or lowered.startswith('total') or lowered.startswith('netto'):
-                       continue
-           except Exception:
-               pass
            if line_data:
                all_lines.append(line_data)
@@ -366,55 +292,6 @@ class Invoice2DataService:
        return all_lines
-   def _validate_amounts(self, extracted: Dict) -> None:
-       """Validate that line totals sum to subtotal/total, and VAT calculation is correct"""
-       try:
-           lines = extracted.get('lines', [])
-           total_amount = extracted.get('total_amount')
-           vat_amount = extracted.get('vat_amount')
-           if not lines or total_amount is None:
-               return
-           # Calculate sum of line_total values
-           line_sum = 0.0
-           for line in lines:
-               line_total = line.get('line_total')
-               if line_total is not None:
-                   if isinstance(line_total, str):
-                       # Parse Danish format: "25.000,00" or "1 .530,00"
-                       cleaned = line_total.replace(' ', '').replace('.', '').replace(',', '.')
-                       try:
-                           line_sum += float(cleaned)
-                       except ValueError:
-                           pass
-                   elif isinstance(line_total, (int, float)):
-                       line_sum += float(line_total)
-           # If we have VAT amount, subtract it from total to get subtotal
-           subtotal = total_amount
-           if vat_amount is not None:
-               subtotal = total_amount - vat_amount
-           # Check if line sum matches subtotal (allow 1 DKK difference for rounding)
-           if abs(line_sum - subtotal) > 1.0:
-               logger.warning(f"⚠️ Amount validation: Line sum {line_sum:.2f} != subtotal {subtotal:.2f} (diff: {abs(line_sum - subtotal):.2f})")
-               extracted['_validation_warning'] = f"Varelinjer sum ({line_sum:.2f}) passer ikke med subtotal ({subtotal:.2f})"
-           else:
-               logger.info(f"✅ Amount validation: Line sum matches subtotal ({line_sum:.2f})")
-           # Check VAT calculation (25%)
-           if vat_amount is not None:
-               expected_vat = subtotal * 0.25
-               if abs(vat_amount - expected_vat) > 1.0:
-                   logger.warning(f"⚠️ VAT validation: VAT {vat_amount:.2f} != 25% of {subtotal:.2f} ({expected_vat:.2f})")
-                   extracted['_vat_warning'] = f"Moms ({vat_amount:.2f}) passer ikke med 25% af subtotal ({expected_vat:.2f})"
-               else:
-                   logger.info(f"✅ VAT validation: 25% VAT calculation correct ({vat_amount:.2f})")
-       except Exception as e:
-           logger.warning(f"⚠️ Amount validation failed: {e}")
    def extract(self, text: str, template_name: Optional[str] = None) -> Optional[Dict[str, Any]]:
        """
        Extract invoice data from text
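The `_validate_amounts` helper removed above checks line totals against the subtotal and verifies the Danish 25% VAT rate. The core arithmetic can be sketched as follows, with the same 1 DKK rounding tolerance as the original:

```python
def validate_amounts(line_totals, total, vat=None, tolerance=1.0):
    """Return a list of validation warnings (empty list = all checks pass)."""
    warnings = []
    line_sum = sum(line_totals)
    # With a VAT amount, the subtotal is the total excluding VAT
    subtotal = total - vat if vat is not None else total
    if abs(line_sum - subtotal) > tolerance:
        warnings.append(f"line sum {line_sum:.2f} != subtotal {subtotal:.2f}")
    if vat is not None:
        expected_vat = subtotal * 0.25  # Danish VAT is 25%
        if abs(vat - expected_vat) > tolerance:
            warnings.append(f"VAT {vat:.2f} != 25% of {subtotal:.2f}")
    return warnings

# 1000.00 in lines, 250.00 VAT, 1250.00 total: both checks pass
print(validate_amounts([400.0, 600.0], 1250.0, vat=250.0))
```

Returning warnings instead of raising mirrors the original behavior: a mismatch flags the invoice for review rather than aborting extraction.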

View File

@@ -12,7 +12,7 @@ from datetime import datetime
import re

from app.core.config import settings
-from app.core.database import execute_insert, execute_query, execute_update, execute_query_single
+from app.core.database import execute_insert, execute_query, execute_update

logger = logging.getLogger(__name__)
@@ -582,9 +582,11 @@ Output: {
            return None
        # Search vendors table
-       vendor = execute_query_single(
+       vendor = execute_query(
            "SELECT * FROM vendors WHERE cvr_number = %s",
-           (cvr_clean,))
+           (cvr_clean,),
+           fetchone=True
+       )
        if vendor:
            logger.info(f"✅ Matched vendor: {vendor['name']} (CVR: {cvr_clean})")

View File

@@ -22,16 +22,16 @@ class SimplyCRMService:
    """Service for integrating with Simply-CRM via webservice.php (VTiger fork)"""

    def __init__(self):
-       # Try SIMPLYCRM_* first, fallback to OLD_VTIGER_* for backward compatibility
-       self.base_url = getattr(settings, 'SIMPLYCRM_URL', None) or getattr(settings, 'OLD_VTIGER_URL', None)
-       self.username = getattr(settings, 'SIMPLYCRM_USERNAME', None) or getattr(settings, 'OLD_VTIGER_USERNAME', None)
-       self.access_key = getattr(settings, 'SIMPLYCRM_API_KEY', None) or getattr(settings, 'OLD_VTIGER_API_KEY', None)
+       # Simply-CRM uses the OLD_VTIGER settings
+       self.base_url = getattr(settings, 'OLD_VTIGER_URL', None)
+       self.username = getattr(settings, 'OLD_VTIGER_USERNAME', None)
+       self.access_key = getattr(settings, 'OLD_VTIGER_ACCESS_KEY', None)
        self.session_name: Optional[str] = None
        self.session: Optional[aiohttp.ClientSession] = None
        if not all([self.base_url, self.username, self.access_key]):
-           logger.warning("⚠️ Simply-CRM credentials not configured (SIMPLYCRM_* or OLD_VTIGER_* settings)")
+           logger.warning("⚠️ Simply-CRM credentials not configured (OLD_VTIGER_* settings)")

    async def __aenter__(self):
        """Context manager entry - create session and login"""

View File

@@ -6,7 +6,6 @@ Inspired by OmniSync's invoice template system
import re
import logging
-import json
from typing import Dict, List, Optional, Tuple
from datetime import datetime
from pathlib import Path
@@ -103,22 +102,10 @@ class TemplateService:
        score = 0.0
        patterns = template.get('detection_patterns', [])
-       # DB may return JSON/JSONB as string depending on driver/config
-       if isinstance(patterns, str):
-           try:
-               patterns = json.loads(patterns)
-           except Exception:
-               patterns = []
-       if isinstance(patterns, dict):
-           patterns = [patterns]
        if not patterns:
            return 0.0
        for pattern_obj in patterns:
-           if not isinstance(pattern_obj, dict):
-               continue
            pattern_type = pattern_obj.get('type')
            weight = pattern_obj.get('weight', 0.5)
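The removed guard above normalizes `detection_patterns` that a Postgres driver may hand back as a raw JSON string or a single object instead of a list. A sketch of that normalization:

```python
import json

def normalize_patterns(patterns):
    """Coerce JSON/JSONB detection_patterns into a list of dicts."""
    if isinstance(patterns, str):
        # Driver returned the JSONB column as a raw string
        try:
            patterns = json.loads(patterns)
        except Exception:
            return []
    if isinstance(patterns, dict):
        # A single pattern object becomes a one-element list
        patterns = [patterns]
    if not isinstance(patterns, list):
        return []
    # Drop anything that is not a pattern object
    return [p for p in patterns if isinstance(p, dict)]

print(normalize_patterns('[{"type": "regex", "weight": 0.8}]'))
```

Without this guard, a JSONB string from the driver would make the `for pattern_obj in patterns` loop iterate over characters and every `.get()` call fail.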

View File

@@ -208,10 +208,9 @@
                <h5 class="fw-bold mb-2">📦 Modul System</h5>
                <p class="text-muted mb-0">Dynamisk feature loading - udvikl moduler isoleret fra core systemet</p>
            </div>
-           <!-- <a href="/api/v1/modules" target="_blank" class="btn btn-sm btn-outline-primary">
-               <i class="bi bi-box-arrow-up-right me-1"></i>API
-           </a> -->
-           <span class="badge bg-secondary">API ikke implementeret endnu</span>
+           <a href="/api/v1/modules" target="_blank" class="btn btn-sm btn-outline-primary">
+               <i class="bi bi-box-arrow-up-right me-1"></i>API
+           </a>
        </div>
        <!-- Quick Start -->

View File

@@ -231,12 +231,8 @@
                <i class="bi bi-headset me-2"></i>Support
            </a>
            <ul class="dropdown-menu mt-2">
-               <li><a class="dropdown-item py-2" href="/ticket/dashboard"><i class="bi bi-speedometer2 me-2"></i>Dashboard</a></li>
-               <li><a class="dropdown-item py-2" href="/ticket/tickets"><i class="bi bi-ticket-detailed me-2"></i>Alle Tickets</a></li>
-               <li><a class="dropdown-item py-2" href="/ticket/worklog/review"><i class="bi bi-clock-history me-2"></i>Godkend Worklog</a></li>
-               <li><hr class="dropdown-divider"></li>
-               <li><a class="dropdown-item py-2" href="#">Ny Ticket</a></li>
-               <li><a class="dropdown-item py-2" href="/prepaid-cards"><i class="bi bi-credit-card-2-front me-2"></i>Prepaid Cards</a></li>
+               <li><a class="dropdown-item py-2" href="#">Sager</a></li>
+               <li><a class="dropdown-item py-2" href="#">Ny Sag</a></li>
                <li><hr class="dropdown-divider"></li>
                <li><a class="dropdown-item py-2" href="#">Knowledge Base</a></li>
            </ul>
@@ -298,7 +294,6 @@
            <ul class="dropdown-menu dropdown-menu-end mt-2">
                <li><a class="dropdown-item py-2" href="#">Profil</a></li>
                <li><a class="dropdown-item py-2" href="/settings"><i class="bi bi-gear me-2"></i>Indstillinger</a></li>
-               <li><a class="dropdown-item py-2" href="/backups"><i class="bi bi-hdd-stack me-2"></i>Backup System</a></li>
                <li><a class="dropdown-item py-2" href="/devportal"><i class="bi bi-code-square me-2"></i>DEV Portal</a></li>
                <li><hr class="dropdown-divider"></li>
                <li><a class="dropdown-item py-2 text-danger" href="#">Log ud</a></li>
@@ -495,14 +490,11 @@
        </div>
    </div>
-   {% block content_wrapper %}
    <div class="container-fluid px-4 py-4">
        {% block content %}{% endblock %}
    </div>
-   {% endblock %}
    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
-   <script src="/static/js/tag-picker.js"></script>
    <script>
        // Dark Mode Toggle Logic
        const darkModeToggle = document.getElementById('darkModeToggle');
@@ -1032,90 +1024,6 @@
        });
    });
    </script>
<!-- Maintenance Mode Overlay -->
<div id="maintenance-overlay" style="display: none; position: fixed; top: 0; left: 0; width: 100%; height: 100%; background: rgba(0,0,0,0.85); z-index: 9999; backdrop-filter: blur(5px);">
<div style="position: absolute; top: 50%; left: 50%; transform: translate(-50%, -50%); text-align: center; color: white; max-width: 500px; padding: 2rem;">
<div style="font-size: 4rem; margin-bottom: 1rem;">🔧</div>
<h2 style="font-weight: 700; margin-bottom: 1rem;">System under vedligeholdelse</h2>
<p id="maintenance-message" style="font-size: 1.1rem; margin-bottom: 1.5rem; opacity: 0.9;">Systemet er midlertidigt utilgængeligt på grund af vedligeholdelse.</p>
<div id="maintenance-eta" style="font-size: 1rem; margin-bottom: 2rem; opacity: 0.8;"></div>
<div class="spinner-border text-light" role="status" style="width: 3rem; height: 3rem;">
<span class="visually-hidden">Loading...</span>
</div>
<p style="margin-top: 1.5rem; font-size: 0.9rem; opacity: 0.7;">
Siden opdateres automatisk når systemet er klar igen.
</p>
</div>
</div>
<script>
// Check maintenance mode status
let maintenanceCheckInterval = null;
function checkMaintenanceMode() {
fetch('/api/v1/system/maintenance')
.then(response => {
if (!response.ok) {
// Silently ignore 404 - maintenance endpoint not implemented yet
return null;
}
return response.json();
})
.then(data => {
if (!data) return; // Skip if endpoint doesn't exist
const overlay = document.getElementById('maintenance-overlay');
const messageEl = document.getElementById('maintenance-message');
const etaEl = document.getElementById('maintenance-eta');
if (data.maintenance_mode) {
// Show overlay
overlay.style.display = 'block';
// Update message
if (data.maintenance_message) {
messageEl.textContent = data.maintenance_message;
}
// Update ETA
if (data.maintenance_eta_minutes) {
etaEl.textContent = `Estimeret tid: ${data.maintenance_eta_minutes} minutter`;
} else {
etaEl.textContent = '';
}
// Start polling every 5 seconds if not already polling
if (!maintenanceCheckInterval) {
maintenanceCheckInterval = setInterval(checkMaintenanceMode, 5000);
}
} else {
// Hide overlay
overlay.style.display = 'none';
// Stop polling if maintenance is over
if (maintenanceCheckInterval) {
clearInterval(maintenanceCheckInterval);
maintenanceCheckInterval = null;
}
}
})
.catch(error => {
// Silently ignore errors - maintenance check is not critical
});
}
// Check on page load (optional feature, don't block if not available)
checkMaintenanceMode();
// Check periodically (every 30 seconds when not in maintenance)
setInterval(() => {
if (!maintenanceCheckInterval) {
checkMaintenanceMode();
}
}, 30000);
</script>
    {% block extra_js %}{% endblock %}
</body>
</html>

View File

@@ -44,64 +44,3 @@ async def get_config():
        "economic_read_only": settings.ECONOMIC_READ_ONLY,
        "economic_dry_run": settings.ECONOMIC_DRY_RUN
    }
@router.get("/system/maintenance")
async def get_maintenance_status():
"""Get maintenance mode status"""
return {
"maintenance_mode": False,
"message": None
}
@router.get("/system/live-stats")
async def get_live_stats():
"""Get live dashboard statistics"""
try:
# Get counts from database
customers_result = execute_query("SELECT COUNT(*) as count FROM customers")
tickets_result = execute_query("SELECT COUNT(*) as count FROM tickets WHERE status != 'closed'")
return {
"sales": {
"active_orders": 0,
"pending_quotes": 0
},
"customers": {
"total": customers_result[0]['count'] if customers_result else 0,
"new_this_month": 0
},
"tickets": {
"open": tickets_result[0]['count'] if tickets_result else 0,
"in_progress": 0
}
}
except Exception as e:
return {
"sales": {"active_orders": 0, "pending_quotes": 0},
"customers": {"total": 0, "new_this_month": 0},
"tickets": {"open": 0, "in_progress": 0}
}
@router.get("/system/recent-activity")
async def get_recent_activity():
"""Get recent system activity"""
try:
# Get recent customers
query = """
SELECT
'customer' as activity_type,
name,
created_at,
'bi-building' as icon,
'primary' as color
FROM customers
ORDER BY created_at DESC
LIMIT 10
"""
activities = execute_query(query)
return activities
except Exception as e:
return []

View File

@ -1,86 +0,0 @@
"""
Pydantic models for tag system
"""
from pydantic import BaseModel, Field
from typing import Optional, List, Literal
from datetime import datetime
# Tag types
TagType = Literal['workflow', 'status', 'category', 'priority', 'billing']
class TagBase(BaseModel):
"""Base tag model"""
name: str = Field(..., max_length=100)
type: TagType
description: Optional[str] = None
color: str = Field(..., pattern=r'^#[0-9A-Fa-f]{6}$') # Hex color
icon: Optional[str] = None
is_active: bool = True
class TagCreate(TagBase):
"""Tag creation model"""
pass
class Tag(TagBase):
"""Full tag model"""
id: int
created_at: datetime
updated_at: datetime
class Config:
from_attributes = True
class TagUpdate(BaseModel):
"""Tag update model - all fields optional"""
name: Optional[str] = Field(None, max_length=100)
description: Optional[str] = None
color: Optional[str] = Field(None, pattern=r'^#[0-9A-Fa-f]{6}$')
icon: Optional[str] = None
is_active: Optional[bool] = None
class EntityTagBase(BaseModel):
"""Base entity tag association"""
entity_type: str = Field(..., max_length=50)
entity_id: int
tag_id: int
class EntityTagCreate(EntityTagBase):
"""Entity tag creation"""
tagged_by: Optional[int] = None
class EntityTag(EntityTagBase):
"""Full entity tag model"""
id: int
tagged_by: Optional[int]
tagged_at: datetime
class Config:
from_attributes = True
class EntityWithTags(BaseModel):
"""Model for entities with their tags"""
entity_type: str
entity_id: int
tags: List[Tag]
class TagWorkflowBase(BaseModel):
"""Base workflow configuration"""
tag_id: int
trigger_event: Literal['on_add', 'on_remove']
action_type: str = Field(..., max_length=50)
action_config: Optional[dict] = None
is_active: bool = True
class TagWorkflowCreate(TagWorkflowBase):
"""Workflow creation model"""
pass
class TagWorkflow(TagWorkflowBase):
"""Full workflow model"""
id: int
created_at: datetime
class Config:
from_attributes = True
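The `color` field in the tag models above is constrained by the regex `^#[0-9A-Fa-f]{6}$`. Outside Pydantic, the same check can be applied directly; a sketch, not the model's actual validator:

```python
import re

# Same pattern as the Field(..., pattern=...) constraint on TagBase.color
HEX_COLOR = re.compile(r'^#[0-9A-Fa-f]{6}$')

def is_valid_tag_color(value: str) -> bool:
    """True when value is a 6-digit hex color like '#0f4c75'."""
    return bool(HEX_COLOR.match(value))

print(is_valid_tag_color('#0f4c75'))  # True
print(is_valid_tag_color('#0f4'))     # False - 3-digit shorthand not allowed
```

Note the pattern rejects CSS shorthand (`#0f4`) and named colors, which keeps stored values uniform for the frontend badge rendering.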

View File

@ -1,226 +0,0 @@
"""
Tag system API endpoints
"""
from fastapi import APIRouter, HTTPException
from typing import List, Optional
from app.tags.backend.models import (
Tag, TagCreate, TagUpdate,
EntityTag, EntityTagCreate,
TagWorkflow, TagWorkflowCreate,
TagType
)
from app.core.database import execute_query, execute_query_single, execute_update
router = APIRouter(prefix="/tags")
# ============= TAG CRUD =============
@router.get("", response_model=List[Tag])
async def list_tags(
type: Optional[TagType] = None,
is_active: Optional[bool] = None
):
"""List all tags with optional filtering"""
query = "SELECT * FROM tags WHERE 1=1"
params = []
if type:
query += " AND type = %s"
params.append(type)
if is_active is not None:
query += " AND is_active = %s"
params.append(is_active)
query += " ORDER BY type, name"
results = execute_query(query, tuple(params) if params else ())
return results
@router.get("/{tag_id}", response_model=Tag)
async def get_tag(tag_id: int):
"""Get single tag by ID"""
result = execute_query_single(
"SELECT * FROM tags WHERE id = %s",
(tag_id,)
)
if not result:
raise HTTPException(status_code=404, detail="Tag not found")
return result
@router.post("", response_model=Tag)
async def create_tag(tag: TagCreate):
"""Create new tag"""
query = """
INSERT INTO tags (name, type, description, color, icon, is_active)
VALUES (%s, %s, %s, %s, %s, %s)
RETURNING *
"""
result = execute_query_single(
query,
(tag.name, tag.type, tag.description, tag.color, tag.icon, tag.is_active)
)
return result
@router.put("/{tag_id}", response_model=Tag)
async def update_tag(tag_id: int, tag: TagUpdate):
"""Update existing tag"""
# Build dynamic update query
updates = []
params = []
if tag.name is not None:
updates.append("name = %s")
params.append(tag.name)
if tag.description is not None:
updates.append("description = %s")
params.append(tag.description)
if tag.color is not None:
updates.append("color = %s")
params.append(tag.color)
if tag.icon is not None:
updates.append("icon = %s")
params.append(tag.icon)
if tag.is_active is not None:
updates.append("is_active = %s")
params.append(tag.is_active)
if not updates:
raise HTTPException(status_code=400, detail="No fields to update")
updates.append("updated_at = CURRENT_TIMESTAMP")
params.append(tag_id)
query = f"""
UPDATE tags
SET {', '.join(updates)}
WHERE id = %s
RETURNING *
"""
result = execute_query_single(query, tuple(params))
if not result:
raise HTTPException(status_code=404, detail="Tag not found")
return result
@router.delete("/{tag_id}")
async def delete_tag(tag_id: int):
"""Delete tag (also removes all entity associations)"""
result = execute_update(
"DELETE FROM tags WHERE id = %s",
(tag_id,)
)
if result == 0:
raise HTTPException(status_code=404, detail="Tag not found")
return {"message": "Tag deleted successfully"}
# ============= ENTITY TAGGING =============
@router.post("/entity", response_model=EntityTag)
async def add_tag_to_entity(entity_tag: EntityTagCreate):
"""Add tag to any entity (ticket, customer, time_entry, etc.)"""
query = """
INSERT INTO entity_tags (entity_type, entity_id, tag_id, tagged_by)
VALUES (%s, %s, %s, %s)
ON CONFLICT (entity_type, entity_id, tag_id) DO NOTHING
RETURNING *
"""
result = execute_query_single(
query,
(entity_tag.entity_type, entity_tag.entity_id, entity_tag.tag_id, entity_tag.tagged_by)
)
if not result:
raise HTTPException(status_code=409, detail="Tag already exists on entity")
return result
@router.delete("/entity")
async def remove_tag_from_entity(
entity_type: str,
entity_id: int,
tag_id: int
):
"""Remove tag from entity"""
result = execute_update(
"DELETE FROM entity_tags WHERE entity_type = %s AND entity_id = %s AND tag_id = %s",
(entity_type, entity_id, tag_id)
)
if result == 0:
raise HTTPException(status_code=404, detail="Tag association not found")
return {"message": "Tag removed from entity"}
@router.get("/entity/{entity_type}/{entity_id}", response_model=List[Tag])
async def get_entity_tags(entity_type: str, entity_id: int):
"""Get all tags for a specific entity"""
query = """
SELECT t.*
FROM tags t
JOIN entity_tags et ON et.tag_id = t.id
WHERE et.entity_type = %s AND et.entity_id = %s
ORDER BY t.type, t.name
"""
results = execute_query(query, (entity_type, entity_id))
return results
@router.get("/search")
async def search_tags(q: str, type: Optional[TagType] = None):
"""Search tags by name (fuzzy search)"""
query = """
SELECT * FROM tags
WHERE is_active = true
AND LOWER(name) LIKE LOWER(%s)
"""
params = [f"%{q}%"]
if type:
query += " AND type = %s"
params.append(type)
query += " ORDER BY name LIMIT 20"
results = execute_query(query, tuple(params))
return results
# ============= WORKFLOW MANAGEMENT =============
@router.get("/workflows", response_model=List[TagWorkflow])
async def list_workflows(tag_id: Optional[int] = None):
"""List all workflow configurations"""
if tag_id:
results = execute_query(
"SELECT * FROM tag_workflows WHERE tag_id = %s ORDER BY id",
(tag_id,)
)
else:
results = execute_query(
"SELECT * FROM tag_workflows ORDER BY tag_id, id",
()
)
return results
@router.post("/workflows", response_model=TagWorkflow)
async def create_workflow(workflow: TagWorkflowCreate):
"""Create new workflow trigger"""
query = """
INSERT INTO tag_workflows (tag_id, trigger_event, action_type, action_config, is_active)
VALUES (%s, %s, %s, %s, %s)
RETURNING *
"""
result = execute_query_single(
query,
(workflow.tag_id, workflow.trigger_event, workflow.action_type,
workflow.action_config, workflow.is_active)
)
return result
@router.delete("/workflows/{workflow_id}")
async def delete_workflow(workflow_id: int):
"""Delete workflow configuration"""
result = execute_update(
"DELETE FROM tag_workflows WHERE id = %s",
(workflow_id,)
)
if result == 0:
raise HTTPException(status_code=404, detail="Workflow not found")
return {"message": "Workflow deleted"}

View File

@ -1,14 +0,0 @@
"""
Tag system frontend views
"""
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
router = APIRouter()
templates = Jinja2Templates(directory="app")
@router.get("/tags", response_class=HTMLResponse)
async def tags_admin_page(request: Request):
"""Render tag administration page"""
return templates.TemplateResponse("tags/frontend/tags_admin.html", {"request": request})

View File

@ -1,378 +0,0 @@
<!DOCTYPE html>
<html lang="da">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Tag Administration - BMC Hub</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.0/font/bootstrap-icons.css">
<style>
:root {
--primary-color: #0f4c75;
--workflow-color: #ff6b35;
--status-color: #ffd700;
--category-color: #0f4c75;
--priority-color: #dc3545;
--billing-color: #2d6a4f;
}
.tag-badge {
display: inline-block;
padding: 0.35em 0.65em;
font-size: 0.9em;
font-weight: 500;
border-radius: 0.375rem;
margin: 0.2em;
cursor: pointer;
transition: all 0.2s;
}
.tag-badge:hover {
transform: translateY(-2px);
box-shadow: 0 4px 8px rgba(0,0,0,0.2);
}
.tag-type-workflow { background-color: var(--workflow-color); color: white; }
.tag-type-status { background-color: var(--status-color); color: #333; }
.tag-type-category { background-color: var(--category-color); color: white; }
.tag-type-priority { background-color: var(--priority-color); color: white; }
.tag-type-billing { background-color: var(--billing-color); color: white; }
.tag-list-item {
padding: 1rem;
border-left: 4px solid transparent;
transition: all 0.2s;
}
.tag-list-item:hover {
background-color: #f8f9fa;
}
.tag-list-item[data-type="workflow"] { border-left-color: var(--workflow-color); }
.tag-list-item[data-type="status"] { border-left-color: var(--status-color); }
.tag-list-item[data-type="category"] { border-left-color: var(--category-color); }
.tag-list-item[data-type="priority"] { border-left-color: var(--priority-color); }
.tag-list-item[data-type="billing"] { border-left-color: var(--billing-color); }
.color-preview {
width: 40px;
height: 40px;
border-radius: 8px;
border: 2px solid #dee2e6;
}
</style>
</head>
<body>
<div class="container-fluid py-4">
<div class="row mb-4">
<div class="col">
<h1><i class="bi bi-tags"></i> Tag Administration</h1>
<p class="text-muted">Administrer tags der bruges på tværs af hele systemet</p>
</div>
<div class="col-auto">
<button class="btn btn-primary" data-bs-toggle="modal" data-bs-target="#createTagModal">
<i class="bi bi-plus-circle"></i> Opret Tag
</button>
</div>
</div>
<!-- Type Filter Tabs -->
<ul class="nav nav-tabs mb-4" id="typeFilter">
<li class="nav-item">
<a class="nav-link active" href="#" data-type="all">Alle</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" data-type="workflow">
<span class="tag-badge tag-type-workflow">Workflow</span>
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" data-type="status">
<span class="tag-badge tag-type-status">Status</span>
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" data-type="category">
<span class="tag-badge tag-type-category">Category</span>
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" data-type="priority">
<span class="tag-badge tag-type-priority">Priority</span>
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="#" data-type="billing">
<span class="tag-badge tag-type-billing">Billing</span>
</a>
</li>
</ul>
<!-- Tags List -->
<div class="card">
<div class="card-body p-0">
<div id="tagsList" class="list-group list-group-flush">
<div class="text-center p-4">
<div class="spinner-border text-primary" role="status">
<span class="visually-hidden">Indlæser...</span>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Create/Edit Tag Modal -->
<div class="modal fade" id="createTagModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Opret Tag</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<form id="tagForm">
<input type="hidden" id="tagId">
<div class="mb-3">
<label for="tagName" class="form-label">Navn *</label>
<input type="text" class="form-control" id="tagName" required>
</div>
<div class="mb-3">
<label for="tagType" class="form-label">Type *</label>
<select class="form-select" id="tagType" required>
<option value="">Vælg type...</option>
<option value="workflow">Workflow - Trigger automatisering</option>
<option value="status">Status - Tilstand/fase</option>
<option value="category">Category - Emne/område</option>
<option value="priority">Priority - Hastighed</option>
<option value="billing">Billing - Økonomi</option>
</select>
</div>
<div class="mb-3">
<label for="tagDescription" class="form-label">Beskrivelse</label>
<textarea class="form-control" id="tagDescription" rows="3"></textarea>
<small class="text-muted">Forklaring af hvad tagget gør eller betyder</small>
</div>
<div class="mb-3">
<label for="tagColor" class="form-label">Farve *</label>
<div class="input-group">
<input type="color" class="form-control form-control-color" id="tagColor" value="#0f4c75">
<input type="text" class="form-control" id="tagColorHex" value="#0f4c75" pattern="^#[0-9A-Fa-f]{6}$">
</div>
</div>
<div class="mb-3">
<label for="tagIcon" class="form-label">Ikon (valgfrit)</label>
<input type="text" class="form-control" id="tagIcon" placeholder="bi-star">
<small class="text-muted">Bootstrap Icons navn (fx: bi-star, bi-flag)</small>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="tagActive" checked>
<label class="form-check-label" for="tagActive">
Aktiv
</label>
</div>
</form>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Annuller</button>
<button type="button" class="btn btn-primary" id="saveTagBtn">Gem</button>
</div>
</div>
</div>
</div>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
<script>
let allTags = [];
let currentFilter = 'all';
// Load tags on page load
document.addEventListener('DOMContentLoaded', () => {
loadTags();
setupEventListeners();
});
function setupEventListeners() {
// Type filter tabs
document.querySelectorAll('#typeFilter a').forEach(tab => {
tab.addEventListener('click', (e) => {
e.preventDefault();
document.querySelectorAll('#typeFilter a').forEach(t => t.classList.remove('active'));
e.target.closest('a').classList.add('active');
currentFilter = e.target.closest('a').dataset.type;
renderTags();
});
});
// Color picker sync
document.getElementById('tagColor').addEventListener('input', (e) => {
document.getElementById('tagColorHex').value = e.target.value;
});
document.getElementById('tagColorHex').addEventListener('input', (e) => {
const color = e.target.value;
if (/^#[0-9A-Fa-f]{6}$/.test(color)) {
document.getElementById('tagColor').value = color;
}
});
// Type change updates color
document.getElementById('tagType').addEventListener('change', (e) => {
const type = e.target.value;
const colorMap = {
'workflow': '#ff6b35',
'status': '#ffd700',
'category': '#0f4c75',
'priority': '#dc3545',
'billing': '#2d6a4f'
};
if (colorMap[type]) {
document.getElementById('tagColor').value = colorMap[type];
document.getElementById('tagColorHex').value = colorMap[type];
}
});
// Save button
document.getElementById('saveTagBtn').addEventListener('click', saveTag);
// Modal reset on close
document.getElementById('createTagModal').addEventListener('hidden.bs.modal', () => {
document.getElementById('tagForm').reset();
document.getElementById('tagId').value = '';
document.querySelector('#createTagModal .modal-title').textContent = 'Opret Tag';
});
}
async function loadTags() {
try {
const response = await fetch('/api/v1/tags');
if (!response.ok) throw new Error('Failed to load tags');
allTags = await response.json();
renderTags();
} catch (error) {
console.error('Error loading tags:', error);
document.getElementById('tagsList').innerHTML = `
<div class="alert alert-danger m-3">
<i class="bi bi-exclamation-triangle"></i> Fejl ved indlæsning af tags
</div>
`;
}
}
function renderTags() {
const container = document.getElementById('tagsList');
const filteredTags = currentFilter === 'all'
? allTags
: allTags.filter(t => t.type === currentFilter);
if (filteredTags.length === 0) {
container.innerHTML = `
<div class="text-center p-4 text-muted">
<i class="bi bi-inbox" style="font-size: 3rem;"></i>
<p class="mt-3">Ingen tags fundet</p>
</div>
`;
return;
}
container.innerHTML = filteredTags.map(tag => `
<div class="tag-list-item list-group-item" data-type="${tag.type}">
<div class="d-flex align-items-center">
<div class="color-preview me-3" style="background-color: ${tag.color};">
${tag.icon ? `<i class="bi ${tag.icon}" style="font-size: 1.5rem; color: white; line-height: 40px;"></i>` : ''}
</div>
<div class="flex-grow-1">
<div class="d-flex align-items-center mb-1">
<h5 class="mb-0 me-2">${tag.name}</h5>
<span class="tag-badge tag-type-${tag.type}">${tag.type}</span>
${!tag.is_active ? '<span class="badge bg-secondary ms-2">Inaktiv</span>' : ''}
</div>
${tag.description ? `<p class="text-muted mb-0 small">${tag.description}</p>` : ''}
</div>
<div class="btn-group">
<button class="btn btn-sm btn-outline-primary" onclick="editTag(${tag.id})">
<i class="bi bi-pencil"></i>
</button>
<button class="btn btn-sm btn-outline-danger" onclick="deleteTag(${tag.id}, '${tag.name}')">
<i class="bi bi-trash"></i>
</button>
</div>
</div>
</div>
`).join('');
}
async function saveTag() {
const tagId = document.getElementById('tagId').value;
const tagData = {
name: document.getElementById('tagName').value,
type: document.getElementById('tagType').value,
description: document.getElementById('tagDescription').value || null,
color: document.getElementById('tagColorHex').value,
icon: document.getElementById('tagIcon').value || null,
is_active: document.getElementById('tagActive').checked
};
try {
const url = tagId ? `/api/v1/tags/${tagId}` : '/api/v1/tags';
const method = tagId ? 'PUT' : 'POST';
const response = await fetch(url, {
method: method,
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(tagData)
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail || 'Failed to save tag');
}
bootstrap.Modal.getInstance(document.getElementById('createTagModal')).hide();
await loadTags();
} catch (error) {
alert('Fejl: ' + error.message);
}
}
function editTag(tagId) {
const tag = allTags.find(t => t.id === tagId);
if (!tag) return;
document.getElementById('tagId').value = tag.id;
document.getElementById('tagName').value = tag.name;
document.getElementById('tagType').value = tag.type;
document.getElementById('tagDescription').value = tag.description || '';
document.getElementById('tagColor').value = tag.color;
document.getElementById('tagColorHex').value = tag.color;
document.getElementById('tagIcon').value = tag.icon || '';
document.getElementById('tagActive').checked = tag.is_active;
document.querySelector('#createTagModal .modal-title').textContent = 'Rediger Tag';
new bootstrap.Modal(document.getElementById('createTagModal')).show();
}
async function deleteTag(tagId, tagName) {
if (!confirm(`Slet tag "${tagName}"?\n\nDette vil også fjerne tagget fra alle steder det er brugt.`)) {
return;
}
try {
const response = await fetch(`/api/v1/tags/${tagId}`, {
method: 'DELETE'
});
if (!response.ok) throw new Error('Failed to delete tag');
await loadTags();
} catch (error) {
alert('Fejl ved sletning: ' + error.message);
}
}
</script>
</body>
</html>
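The type-to-color defaults and the hex check in the script above are easy to verify in isolation. A Python sketch of the same rules (the names `TYPE_COLORS`, `HEX_COLOR`, and `default_color` are illustrative, not part of the codebase):

```python
import re

# Same defaults as the tagType change handler in the admin page
TYPE_COLORS = {
    "workflow": "#ff6b35",
    "status": "#ffd700",
    "category": "#0f4c75",
    "priority": "#dc3545",
    "billing": "#2d6a4f",
}

# Same pattern the color-picker sync uses to validate manual hex input
HEX_COLOR = re.compile(r"^#[0-9A-Fa-f]{6}$")

def default_color(tag_type: str) -> str:
    # Fall back to the primary brand color for unknown types
    return TYPE_COLORS.get(tag_type, "#0f4c75")

print(default_color("billing"))          # #2d6a4f
print(bool(HEX_COLOR.match("#ffd700")))  # True
print(bool(HEX_COLOR.match("#ffd70")))   # False
```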

View File

@ -1,10 +0,0 @@
"""
Ticket System & Klippekort Module
==================================
Isolated module for ticket management and klippekort (prepaid time cards).
All tables use the 'tticket_' prefix and can be uninstalled without affecting core data.
"""
__version__ = "1.0.0"

View File

@ -1,6 +0,0 @@
"""
Ticket Module Backend
=====================
Business logic, API endpoints, and services for the ticket system.
"""

View File

@ -1,402 +0,0 @@
"""
E-conomic Export Service for Ticket System
Exports billable worklog entries to e-conomic as invoice lines
🚨 SAFETY MODES (inherited from EconomicService):
- TICKET_ECONOMIC_READ_ONLY: Blocks ALL write operations when True
- TICKET_ECONOMIC_DRY_RUN: Logs operations but doesn't send to e-conomic when True
- TICKET_ECONOMIC_AUTO_EXPORT: Enable automatic export on worklog approval
Integration Pattern:
1. Worklog entry marked "billable" → export to e-conomic
2. Create invoice draft with worklog line items
3. Store economic_invoice_number in tticket_worklog
4. Mark worklog as "billed" after successful export
"""
import logging
from typing import Dict, List, Optional
from datetime import date, datetime
from decimal import Decimal
from app.core.database import execute_query, execute_update, execute_query_single
from app.core.config import settings
from app.services.economic_service import EconomicService
from psycopg2.extras import Json
logger = logging.getLogger(__name__)
class TicketEconomicExportService:
"""
Handles export of billable worklog to e-conomic
"""
def __init__(self):
self.economic = EconomicService()
self.read_only = getattr(settings, 'TICKET_ECONOMIC_READ_ONLY', True)
self.dry_run = getattr(settings, 'TICKET_ECONOMIC_DRY_RUN', True)
self.auto_export = getattr(settings, 'TICKET_ECONOMIC_AUTO_EXPORT', False)
# Log safety status
if self.read_only:
logger.warning("🔒 TICKET E-CONOMIC READ-ONLY MODE ENABLED")
elif self.dry_run:
logger.warning("🏃 TICKET E-CONOMIC DRY-RUN MODE ENABLED")
else:
logger.warning("⚠️ TICKET E-CONOMIC WRITE MODE ACTIVE")
def _check_export_permission(self, operation: str) -> bool:
"""Check if export operations are allowed"""
if self.read_only:
logger.error(f"🚫 BLOCKED: {operation} - TICKET_ECONOMIC_READ_ONLY=true")
logger.error("To enable: Set TICKET_ECONOMIC_READ_ONLY=false in .env")
return False
if self.dry_run:
logger.warning(f"🏃 DRY-RUN: {operation} - TICKET_ECONOMIC_DRY_RUN=true")
logger.warning("To execute: Set TICKET_ECONOMIC_DRY_RUN=false in .env")
return False
logger.warning(f"⚠️ EXECUTING: {operation} - Will create in e-conomic")
return True
async def export_billable_worklog_batch(
self,
customer_id: int,
worklog_ids: Optional[List[int]] = None,
date_from: Optional[date] = None,
date_to: Optional[date] = None
) -> Dict:
"""
Export billable worklog entries to e-conomic as draft invoice
Args:
customer_id: Customer ID for the invoice
worklog_ids: Specific worklog entry IDs to export (optional)
date_from: Start date filter (optional)
date_to: End date filter (optional)
Returns:
Dict with export results
"""
try:
logger.info(f"📦 Starting worklog export for customer {customer_id}")
# Get customer e-conomic info
customer = await self._get_customer_economic_info(customer_id)
if not customer:
raise ValueError(f"Customer {customer_id} not found or missing e-conomic mapping")
# Get billable worklog entries
worklogs = await self._get_billable_worklog(
customer_id, worklog_ids, date_from, date_to
)
if not worklogs:
logger.info("✅ No billable worklog entries found")
return {
'status': 'no_entries',
'exported_count': 0
}
logger.info(f"📋 Found {len(worklogs)} billable entries")
# Safety check
if not self._check_export_permission(f"Export {len(worklogs)} worklog entries"):
return {
'status': 'blocked',
'reason': 'Safety mode enabled',
'read_only': self.read_only,
'dry_run': self.dry_run,
'entries': len(worklogs)
}
# Create invoice in e-conomic
invoice_result = await self._create_economic_invoice(customer, worklogs)
# Update worklog entries with invoice number
await self._mark_worklogs_as_billed(
worklog_ids=[w['id'] for w in worklogs],
economic_invoice_number=invoice_result.get('draftInvoiceNumber')
)
logger.info(f"✅ Successfully exported {len(worklogs)} entries to e-conomic")
return {
'status': 'exported',
'exported_count': len(worklogs),
'invoice_number': invoice_result.get('draftInvoiceNumber'),
'total_hours': sum(float(w['hours']) for w in worklogs),
'total_amount': sum(float(w['amount']) for w in worklogs),
'entries': [
{
'id': w['id'],
'ticket_number': w['ticket_number'],
'hours': float(w['hours'])
}
for w in worklogs
]
}
except Exception as e:
logger.error(f"❌ Failed to export worklog: {e}")
raise
async def _get_customer_economic_info(self, customer_id: int) -> Optional[Dict]:
"""
Get customer e-conomic mapping information
Returns:
Dict with debtor_number, payment_terms, etc.
"""
query = """
SELECT
id,
name,
email,
economic_customer_number,
payment_terms_number,
address,
postal_code,
city
FROM customers
WHERE id = %s
"""
customer = execute_query_single(query, (customer_id,))
if not customer:
logger.error(f"❌ Customer {customer_id} not found")
return None
if not customer.get('economic_customer_number'):
logger.error(f"❌ Customer {customer_id} missing economic_customer_number")
return None
return customer
async def _get_billable_worklog(
self,
customer_id: int,
worklog_ids: Optional[List[int]] = None,
date_from: Optional[date] = None,
date_to: Optional[date] = None
) -> List[Dict]:
"""
Get billable worklog entries ready for export
Returns:
List of worklog dicts with ticket and customer info
"""
query = """
SELECT
w.id,
w.ticket_id,
w.work_date,
w.hours,
w.work_type,
w.description,
w.billing_method,
t.ticket_number,
t.subject AS ticket_subject,
t.customer_id,
c.economic_customer_number,
                (w.hours * 850) AS amount  -- 850 = hardcoded hourly rate (DKK)
FROM tticket_worklog w
INNER JOIN tticket_tickets t ON t.id = w.ticket_id
INNER JOIN customers c ON c.id = t.customer_id
WHERE w.status = 'billable'
AND w.billing_method = 'invoice'
AND w.billed_at IS NULL
AND t.customer_id = %s
"""
params = [customer_id]
if worklog_ids:
query += " AND w.id = ANY(%s)"
params.append(worklog_ids)
if date_from:
query += " AND w.work_date >= %s"
params.append(date_from)
if date_to:
query += " AND w.work_date <= %s"
params.append(date_to)
query += " ORDER BY w.work_date ASC, w.created_at ASC"
return execute_query(query, tuple(params))
async def _create_economic_invoice(self, customer: Dict, worklogs: List[Dict]) -> Dict:
"""
Create draft invoice in e-conomic with worklog line items
Args:
customer: Customer dict with e-conomic mapping
worklogs: List of billable worklog entries
Returns:
e-conomic invoice response dict
"""
# Build invoice payload
invoice_data = {
"date": datetime.now().strftime("%Y-%m-%d"),
"customer": {
"customerNumber": customer['economic_customer_number']
},
"recipient": {
"name": customer['name'],
"address": customer.get('address', ''),
"zip": customer.get('postal_code', ''),
"city": customer.get('city', ''),
"vatZone": {
"vatZoneNumber": 1 # Denmark
}
},
"paymentTerms": {
"paymentTermsNumber": customer.get('payment_terms_number', 1)
},
"layout": {
"layoutNumber": 19 # Default layout
},
"lines": []
}
# Add worklog entries as invoice lines
for worklog in worklogs:
product_number = 'SUPPORT' # Default product number
# Build line description
ticket_ref = f"[{worklog['ticket_number']}] {worklog['ticket_subject']}"
work_desc = worklog.get('description', 'Support arbejde')
line_text = f"{ticket_ref}\n{work_desc}\n{worklog['work_date']}"
line = {
"product": {
"productNumber": product_number[:25] # Max 25 chars
},
"description": line_text[:250], # Max 250 chars
"quantity": float(worklog['hours']),
"unitNetPrice": float(worklog['amount']) / float(worklog['hours']),
"unit": {
"unitNumber": 1 # Hours
}
}
invoice_data['lines'].append(line)
logger.info(f"📄 Creating e-conomic invoice with {len(invoice_data['lines'])} lines")
logger.info(f"📊 Invoice payload: {invoice_data}")
# DRY-RUN: Just return mock response
if self.dry_run:
logger.warning("🏃 DRY-RUN: Would create invoice in e-conomic")
return {
'draftInvoiceNumber': 999999,
'grossAmount': sum(float(w['amount']) for w in worklogs),
'netAmount': sum(float(w['amount']) for w in worklogs),
'vatAmount': 0,
'dryRun': True
}
# REAL EXECUTION: Create in e-conomic
try:
result = await self.economic.create_draft_invoice(invoice_data)
logger.info(f"✅ Created e-conomic draft invoice: {result.get('draftInvoiceNumber')}")
return result
except Exception as e:
logger.error(f"❌ Failed to create e-conomic invoice: {e}")
raise
async def _mark_worklogs_as_billed(
self,
worklog_ids: List[int],
economic_invoice_number: Optional[int]
) -> None:
"""
Mark worklog entries as billed with e-conomic reference
Args:
worklog_ids: List of worklog entry IDs
economic_invoice_number: e-conomic invoice number
"""
if not worklog_ids:
return
query = """
UPDATE tticket_worklog
SET billed_at = CURRENT_TIMESTAMP,
economic_invoice_number = %s,
updated_at = CURRENT_TIMESTAMP
WHERE id = ANY(%s)
"""
execute_update(query, (economic_invoice_number, worklog_ids))
logger.info(f"✅ Marked {len(worklog_ids)} worklog entries as billed")
async def get_export_preview(
self,
customer_id: int,
worklog_ids: Optional[List[int]] = None,
date_from: Optional[date] = None,
date_to: Optional[date] = None
) -> Dict:
"""
Preview what would be exported without actually exporting
Args:
Same as export_billable_worklog_batch
Returns:
Dict with preview information
"""
try:
customer = await self._get_customer_economic_info(customer_id)
if not customer:
return {
'status': 'error',
'error': 'Customer not found or missing e-conomic mapping'
}
worklogs = await self._get_billable_worklog(
customer_id, worklog_ids, date_from, date_to
)
total_hours = sum(float(w['hours']) for w in worklogs)
total_amount = sum(float(w['amount']) for w in worklogs)
return {
'status': 'preview',
'customer_id': customer_id,
'customer_name': customer['name'],
'economic_customer_number': customer['economic_customer_number'],
'entry_count': len(worklogs),
'total_hours': float(total_hours),
'total_amount': float(total_amount),
'entries': [
{
'id': w['id'],
'ticket_number': w['ticket_number'],
'work_date': w['work_date'].strftime('%Y-%m-%d'),
'hours': float(w['hours']),
'amount': float(w['amount']),
'description': w['description']
}
for w in worklogs
]
}
except Exception as e:
logger.error(f"❌ Failed to generate export preview: {e}")
return {
'status': 'error',
'error': str(e)
}
# Singleton instance
ticket_economic_service = TicketEconomicExportService()
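The line-building and field-length limits in `_create_economic_invoice` can be exercised without touching the e-conomic API. A standalone sketch under the same constraints (25-char product numbers, 250-char descriptions); the helper name and demo row are illustrative, not part of the service:

```python
from typing import Dict, List

def build_invoice_lines(worklogs: List[Dict]) -> List[Dict]:
    """Map worklog rows to e-conomic draft-invoice line dicts (sketch)."""
    lines = []
    for w in worklogs:
        # Same three-part description as the service: ticket ref, work text, date
        text = f"[{w['ticket_number']}] {w['ticket_subject']}\n{w['description']}\n{w['work_date']}"
        lines.append({
            "product": {"productNumber": "SUPPORT"[:25]},  # e-conomic caps product numbers at 25 chars
            "description": text[:250],                     # and line descriptions at 250 chars
            "quantity": float(w["hours"]),
            "unitNetPrice": float(w["amount"]) / float(w["hours"]),
            "unit": {"unitNumber": 1},                     # 1 = hours
        })
    return lines

demo = [{"ticket_number": "TKT-20251215-001", "ticket_subject": "Mail nede",
         "description": "Fixed DNS", "work_date": "2025-12-15",
         "hours": "2.5", "amount": "2125"}]
lines = build_invoice_lines(demo)
print(lines[0]["quantity"], lines[0]["unitNetPrice"])  # 2.5 850.0
```

Note that the unit price is derived by dividing amount by hours, so rows with zero hours would need a guard before this sketch could be used against real data.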

View File

@ -1,472 +0,0 @@
"""
Email-to-Ticket Integration for BMC Hub Ticket System
Implements Option B: Message-ID Threading
- Parse email message_id from workflow context
- Create tickets with email metadata
- Link emails to existing tickets using In-Reply-To header
- Extract ticket tags from email body
- Store email threading info in tticket_email_log
"""
import logging
import re
from typing import Dict, Any, Optional, List
from datetime import datetime
from app.core.database import execute_query, execute_insert, execute_query_single
from app.ticket.backend.ticket_service import TicketService
from app.ticket.backend.models import TTicketCreate, TicketPriority
from psycopg2.extras import Json
logger = logging.getLogger(__name__)
class EmailTicketIntegration:
"""Handles email-to-ticket conversion and threading"""
# Regex patterns for ticket detection
TICKET_NUMBER_PATTERN = r'TKT-\d{8}-\d{3}'
TAG_PATTERN = r'#(\w+)'
@staticmethod
async def create_ticket_from_email(
email_data: Dict[str, Any],
customer_id: Optional[int] = None,
assigned_to_user_id: Optional[int] = None
) -> Dict[str, Any]:
"""
Create ticket from email workflow action
Args:
email_data: Dict with keys:
- message_id: Email Message-ID header
- subject: Email subject
- from_address: Sender email
- body: Email body text
- html_body: Email HTML body (optional)
- received_at: Email timestamp (ISO format)
- in_reply_to: In-Reply-To header (optional)
- references: References header (optional)
customer_id: BMC customer ID (if known)
assigned_to_user_id: User to assign ticket to
Returns:
Dict with ticket_id and ticket_number
"""
try:
logger.info(f"🎫 Creating ticket from email: {email_data.get('message_id')}")
# Extract tags from email body
tags = EmailTicketIntegration._extract_tags(email_data.get('body', ''))
# Determine priority from subject/tags
priority = EmailTicketIntegration._determine_priority(
email_data.get('subject', ''),
tags
)
# Create ticket
ticket_create = TTicketCreate(
subject=email_data.get('subject', 'Email without subject'),
description=EmailTicketIntegration._format_description(email_data),
customer_id=customer_id,
assigned_to_user_id=assigned_to_user_id,
priority=priority,
tags=tags,
custom_fields={
'email_from': email_data.get('from_address'),
'email_message_id': email_data.get('message_id'),
'created_from_email': True
}
)
# Create via service (handles auto-number generation)
ticket = await TicketService.create_ticket(ticket_create, created_by_user_id=None)
# Log email-ticket linkage
await EmailTicketIntegration._log_email_linkage(
ticket_id=ticket['id'],
email_data=email_data
)
logger.info(f"✅ Created ticket {ticket['ticket_number']} from email {email_data.get('message_id')}")
return {
'ticket_id': ticket['id'],
'ticket_number': ticket['ticket_number'],
'created': True
}
except Exception as e:
logger.error(f"❌ Failed to create ticket from email: {e}")
raise
@staticmethod
async def link_email_to_ticket(
ticket_number: str,
email_data: Dict[str, Any]
) -> Dict[str, Any]:
"""
Link email to existing ticket (reply threading)
Args:
ticket_number: TKT-YYYYMMDD-XXX format
email_data: Same format as create_ticket_from_email
Returns:
Dict with ticket_id and linked=True
"""
try:
logger.info(f"🔗 Linking email to ticket {ticket_number}")
# Find ticket by ticket_number
query = "SELECT id FROM tticket_tickets WHERE ticket_number = %s"
result = execute_query_single(query, (ticket_number,))
if not result:
logger.warning(f"⚠️ Ticket {ticket_number} not found - creating new ticket instead")
return await EmailTicketIntegration.create_ticket_from_email(email_data)
ticket_id = result['id']
# Log email linkage
await EmailTicketIntegration._log_email_linkage(
ticket_id=ticket_id,
email_data=email_data,
is_reply=True
)
# Add comment with email content
from app.ticket.backend.models import TTicketCommentCreate
comment_text = f"📧 Email from {email_data.get('from_address')}:\n\n{email_data.get('body', '')}"
await TicketService.add_comment(
ticket_id=ticket_id,
comment=TTicketCommentCreate(
comment_text=comment_text,
internal_note=False
),
user_id=None
)
logger.info(f"✅ Linked email to ticket {ticket_number} (ID: {ticket_id})")
return {
'ticket_id': ticket_id,
'ticket_number': ticket_number,
'linked': True
}
except Exception as e:
logger.error(f"❌ Failed to link email to ticket: {e}")
raise
@staticmethod
async def process_email_for_ticket(
email_data: Dict[str, Any],
customer_id: Optional[int] = None,
assigned_to_user_id: Optional[int] = None
) -> Dict[str, Any]:
"""
Smart processor: Creates new ticket OR links to existing based on threading
Args:
email_data: Email metadata dict
customer_id: Customer ID if known
assigned_to_user_id: User to assign new tickets to
Returns:
Dict with ticket_id, ticket_number, created/linked status
"""
try:
# Check if email is reply to existing ticket
ticket_number = EmailTicketIntegration._find_ticket_in_thread(email_data)
if ticket_number:
# Reply to existing ticket
return await EmailTicketIntegration.link_email_to_ticket(
ticket_number=ticket_number,
email_data=email_data
)
else:
# New ticket
return await EmailTicketIntegration.create_ticket_from_email(
email_data=email_data,
customer_id=customer_id,
assigned_to_user_id=assigned_to_user_id
)
except Exception as e:
logger.error(f"❌ Failed to process email for ticket: {e}")
raise
@staticmethod
def _find_ticket_in_thread(email_data: Dict[str, Any]) -> Optional[str]:
"""
Find ticket number in email thread (In-Reply-To, References, Subject)
Returns:
Ticket number (TKT-YYYYMMDD-XXX) or None
"""
# Check In-Reply-To header first
in_reply_to = email_data.get('in_reply_to', '')
if in_reply_to:
match = re.search(EmailTicketIntegration.TICKET_NUMBER_PATTERN, in_reply_to)
if match:
return match.group(0)
# Check References header
references = email_data.get('references', '')
if references:
match = re.search(EmailTicketIntegration.TICKET_NUMBER_PATTERN, references)
if match:
return match.group(0)
# Check subject line for [TKT-YYYYMMDD-XXX] or Re: TKT-YYYYMMDD-XXX
subject = email_data.get('subject', '')
match = re.search(EmailTicketIntegration.TICKET_NUMBER_PATTERN, subject)
if match:
return match.group(0)
return None
@staticmethod
def _extract_tags(body: str) -> List[str]:
"""
Extract #hashtags from email body
Returns:
List of lowercase tags without # prefix
"""
if not body:
return []
tags = re.findall(EmailTicketIntegration.TAG_PATTERN, body)
return [tag.lower() for tag in tags if len(tag) > 2][:10] # Max 10 tags
@staticmethod
def _determine_priority(subject: str, tags: List[str]) -> TicketPriority:
"""
Determine ticket priority from subject/tags
Returns:
TicketPriority enum value
"""
subject_lower = subject.lower()
all_text = f"{subject_lower} {' '.join(tags)}"
# Critical keywords
if any(word in all_text for word in ['kritisk', 'critical', 'down', 'nede', 'urgent', 'akut']):
return TicketPriority.critical
# High priority keywords
if any(word in all_text for word in ['høj', 'high', 'vigtig', 'important', 'haster']):
return TicketPriority.high
# Low priority keywords
if any(word in all_text for word in ['lav', 'low', 'spørgsmål', 'question', 'info']):
return TicketPriority.low
return TicketPriority.normal
@staticmethod
def _format_description(email_data: Dict[str, Any]) -> str:
"""
Format email body as ticket description
Returns:
Formatted description text
"""
body = email_data.get('body', '').strip()
from_address = email_data.get('from_address', 'unknown')
received_at = email_data.get('received_at', '')
description = f"📧 Email fra: {from_address}\n"
description += f"📅 Modtaget: {received_at}\n"
description += f"{'='*60}\n\n"
description += body
return description
@staticmethod
async def _log_email_linkage(
ticket_id: int,
email_data: Dict[str, Any],
is_reply: bool = False
) -> None:
"""
Store email-ticket linkage in tticket_email_log
Args:
ticket_id: Ticket ID
email_data: Email metadata
is_reply: True if email is reply to existing ticket
"""
query = """
INSERT INTO tticket_email_log (
ticket_id,
email_message_id,
email_subject,
email_from,
email_received_at,
is_reply,
thread_data
) VALUES (%s, %s, %s, %s, %s, %s, %s)
"""
# Parse received_at timestamp
received_at = email_data.get('received_at')
if isinstance(received_at, str):
try:
received_at = datetime.fromisoformat(received_at.replace('Z', '+00:00'))
            except ValueError:
received_at = None
thread_data = {
'in_reply_to': email_data.get('in_reply_to'),
'references': email_data.get('references'),
'message_id': email_data.get('message_id')
}
execute_insert(
query,
(
ticket_id,
email_data.get('message_id'),
email_data.get('subject'),
email_data.get('from_address'),
received_at,
is_reply,
Json(thread_data)
)
)
logger.info(f"📝 Logged email linkage for ticket {ticket_id}")
@staticmethod
async def get_ticket_email_thread(ticket_id: int) -> List[Dict[str, Any]]:
"""
Get all emails linked to a ticket (chronological order)
Args:
ticket_id: Ticket ID
Returns:
List of email log entries
"""
query = """
SELECT
id,
email_message_id,
email_subject,
email_from,
email_received_at,
is_reply,
thread_data,
created_at
FROM tticket_email_log
WHERE ticket_id = %s
ORDER BY email_received_at ASC
"""
return execute_query(query, (ticket_id,), fetchall=True)
@staticmethod
async def find_tickets_by_email_address(email_address: str) -> List[Dict[str, Any]]:
"""
Find all tickets associated with an email address
Args:
email_address: Email address to search for
Returns:
List of ticket info dicts
"""
query = """
SELECT DISTINCT
t.id,
t.ticket_number,
t.subject,
t.status,
t.created_at
FROM tticket_tickets t
INNER JOIN tticket_email_log e ON e.ticket_id = t.id
WHERE e.email_from = %s
ORDER BY t.created_at DESC
"""
return execute_query(query, (email_address,), fetchall=True)
# Workflow action functions (called from email workflow engine)
async def workflow_action_create_ticket(
context: Dict[str, Any],
action_params: Dict[str, Any]
) -> Dict[str, Any]:
"""
Workflow action: Create ticket from email
Usage in workflow:
{
"action": "create_ticket",
"params": {
"customer_id": 123, # Optional
"assigned_to_user_id": 5, # Optional
"priority": "high" # Optional override
}
}
Args:
context: Email workflow context with email_data
action_params: Action parameters from workflow definition
Returns:
Result dict with ticket info
"""
email_data = context.get('email_data', {})
result = await EmailTicketIntegration.process_email_for_ticket(
email_data=email_data,
customer_id=action_params.get('customer_id'),
assigned_to_user_id=action_params.get('assigned_to_user_id')
)
return result
async def workflow_action_link_email_to_ticket(
context: Dict[str, Any],
action_params: Dict[str, Any]
) -> Dict[str, Any]:
"""
Workflow action: Link email to existing ticket
Usage in workflow:
{
"action": "link_email_to_ticket",
"params": {
"ticket_number": "TKT-20251215-001"
}
}
Args:
context: Email workflow context
action_params: Must contain ticket_number
Returns:
Result dict with link status
"""
email_data = context.get('email_data', {})
ticket_number = action_params.get('ticket_number')
if not ticket_number:
raise ValueError("ticket_number required in action params")
result = await EmailTicketIntegration.link_email_to_ticket(
ticket_number=ticket_number,
email_data=email_data
)
return result
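The two workflow actions above are plain async callables taking `(context, action_params)`. A minimal sketch of how a workflow engine could register and dispatch them — the registry, decorator, and stub action here are assumptions for illustration, not the real engine code:

```python
# Hypothetical dispatcher sketch: registry name, decorator, and the stub
# action body are assumptions; the real engine wiring may differ.
import asyncio
from typing import Any, Callable, Dict

ACTIONS: Dict[str, Callable] = {}

def register(name: str):
    """Register an async workflow action under a string key."""
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@register("create_ticket")
async def create_ticket(context: Dict[str, Any], params: Dict[str, Any]) -> Dict[str, Any]:
    # Stub standing in for EmailTicketIntegration.process_email_for_ticket
    return {"ticket_id": 1, "customer_id": params.get("customer_id")}

async def run_action(step: Dict[str, Any], context: Dict[str, Any]) -> Dict[str, Any]:
    # Look up the action named in the workflow step and await it
    fn = ACTIONS[step["action"]]
    return await fn(context, step.get("params", {}))

result = asyncio.run(run_action(
    {"action": "create_ticket", "params": {"customer_id": 123}},
    {"email_data": {"subject": "Help"}},
))
print(result["customer_id"])  # 123
```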



@ -1,488 +0,0 @@
"""
Klippekort (Prepaid Time Card) Service
=======================================
Business logic for prepaid time cards: purchase, balance, deduction.
CONSTRAINT: Only 1 active card per customer (enforced by database UNIQUE index).
"""
import logging
from datetime import datetime
from decimal import Decimal
from typing import Optional, Dict, Any, List
from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
from app.ticket.backend.models import (
TPrepaidCard,
TPrepaidCardCreate,
TPrepaidCardUpdate,
TPrepaidCardWithStats,
TPrepaidTransaction,
TPrepaidTransactionCreate,
PrepaidCardStatus,
TransactionType
)
logger = logging.getLogger(__name__)
class KlippekortService:
"""Service for prepaid card operations"""
@staticmethod
def purchase_card(
card_data: TPrepaidCardCreate,
user_id: Optional[int] = None
) -> Dict[str, Any]:
"""
Purchase a new prepaid card
CONSTRAINT: Only 1 active card allowed per customer.
This will fail if customer already has an active card.
Args:
card_data: Card purchase data
user_id: User making purchase
Returns:
Created card dict
Raises:
ValueError: If customer already has active card
"""
from psycopg2.extras import Json
# Check if customer already has an active card
existing = execute_query_single(
"""
SELECT id, card_number FROM tticket_prepaid_cards
WHERE customer_id = %s AND status = 'active'
""",
(card_data.customer_id,))
if existing:
raise ValueError(
f"Customer {card_data.customer_id} already has an active card: {existing['card_number']}. "
"Please deactivate or deplete the existing card before purchasing a new one."
)
logger.info(f"💳 Purchasing prepaid card for customer {card_data.customer_id}: {card_data.purchased_hours}h")
# Insert card (trigger will auto-generate card_number if NULL)
card_id = execute_insert(
"""
INSERT INTO tticket_prepaid_cards (
card_number, customer_id, purchased_hours, price_per_hour, total_amount,
status, expires_at, notes, economic_invoice_number, economic_product_number,
created_by_user_id
) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
""",
(
card_data.card_number,
card_data.customer_id,
card_data.purchased_hours,
card_data.price_per_hour,
card_data.total_amount,
'active',
card_data.expires_at,
card_data.notes,
card_data.economic_invoice_number,
card_data.economic_product_number,
user_id or card_data.created_by_user_id
)
)
# Create initial transaction
execute_insert(
"""
INSERT INTO tticket_prepaid_transactions
(card_id, transaction_type, hours, balance_after, description, created_by_user_id)
VALUES (%s, %s, %s, %s, %s, %s)
""",
(
card_id,
'purchase',
card_data.purchased_hours,
card_data.purchased_hours,
f"Initial purchase: {card_data.purchased_hours}h @ {card_data.price_per_hour} DKK/h",
user_id
)
)
# Fetch created card
card = execute_query_single(
"SELECT * FROM tticket_prepaid_cards WHERE id = %s",
(card_id,))
logger.info(f"✅ Created prepaid card {card['card_number']} (ID: {card_id})")
return card
@staticmethod
def get_card(card_id: int) -> Optional[Dict[str, Any]]:
"""Get prepaid card by ID"""
return execute_query_single(
"SELECT * FROM tticket_prepaid_cards WHERE id = %s",
(card_id,))
@staticmethod
def get_card_with_stats(card_id: int) -> Optional[Dict[str, Any]]:
"""Get prepaid card with usage statistics"""
return execute_query_single(
"SELECT * FROM tticket_prepaid_balances WHERE id = %s",
(card_id,))
@staticmethod
def get_active_card_for_customer(customer_id: int) -> Optional[Dict[str, Any]]:
"""
Get active prepaid card for customer
Returns None if no active card exists.
"""
return execute_query_single(
"""
SELECT * FROM tticket_prepaid_cards
WHERE customer_id = %s AND status = 'active'
""",
(customer_id,))
@staticmethod
def check_balance(customer_id: int) -> Dict[str, Any]:
"""
Check prepaid card balance for customer
Args:
customer_id: Customer ID
Returns:
Dict with balance info: {
"has_card": bool,
"card_id": int or None,
"card_number": str or None,
"balance_hours": Decimal,
"status": str or None,
"expires_at": datetime or None
}
"""
card = KlippekortService.get_active_card_for_customer(customer_id)
if not card:
return {
"has_card": False,
"card_id": None,
"card_number": None,
"balance_hours": Decimal('0'),
"status": None,
"expires_at": None
}
return {
"has_card": True,
"card_id": card['id'],
"card_number": card['card_number'],
"balance_hours": card['remaining_hours'],
"status": card['status'],
"expires_at": card['expires_at']
}
@staticmethod
def can_deduct(customer_id: int, hours: Decimal) -> tuple[bool, Optional[str]]:
"""
Check if customer has sufficient balance for deduction
Args:
customer_id: Customer ID
hours: Hours to deduct
Returns:
(can_deduct, error_message)
"""
balance_info = KlippekortService.check_balance(customer_id)
if not balance_info['has_card']:
return False, f"Customer {customer_id} has no active prepaid card"
if balance_info['status'] != 'active':
return False, f"Prepaid card is not active (status: {balance_info['status']})"
if balance_info['balance_hours'] < hours:
return False, (
f"Insufficient balance: {balance_info['balance_hours']}h available, "
f"{hours}h required"
)
# Check expiration
if balance_info['expires_at']:
if balance_info['expires_at'] < datetime.now():
return False, f"Prepaid card expired on {balance_info['expires_at']}"
return True, None
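The decision logic of `can_deduct()` can be replayed without a database, operating directly on the dict shape that `check_balance()` returns. A dependency-free sketch (function and dict keys mirror the code above; nothing here touches the real tables):

```python
# Pure-logic replica of the can_deduct() checks above: card exists, card is
# active, balance covers the hours, and the card has not expired.
from datetime import datetime, timedelta
from decimal import Decimal
from typing import Optional, Tuple

def can_deduct(balance: dict, hours: Decimal) -> Tuple[bool, Optional[str]]:
    if not balance["has_card"]:
        return False, "no active prepaid card"
    if balance["status"] != "active":
        return False, f"card not active (status: {balance['status']})"
    if balance["balance_hours"] < hours:
        return False, f"insufficient balance: {balance['balance_hours']}h < {hours}h"
    if balance["expires_at"] and balance["expires_at"] < datetime.now():
        return False, f"card expired on {balance['expires_at']}"
    return True, None

ok, err = can_deduct(
    {"has_card": True, "status": "active",
     "balance_hours": Decimal("2.5"),
     "expires_at": datetime.now() + timedelta(days=30)},
    Decimal("1.5"),
)
print(ok)  # True
```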
@staticmethod
def deduct_hours(
customer_id: int,
hours: Decimal,
worklog_id: int,
user_id: Optional[int] = None,
description: Optional[str] = None
) -> Dict[str, Any]:
"""
Deduct hours from customer's active prepaid card
Args:
customer_id: Customer ID
hours: Hours to deduct
worklog_id: Worklog entry consuming the hours
user_id: User performing deduction
description: Optional description
Returns:
Transaction dict
Raises:
ValueError: If insufficient balance or no active card
"""
# Check if deduction is possible
can_deduct, error = KlippekortService.can_deduct(customer_id, hours)
if not can_deduct:
raise ValueError(error)
# Get active card
card = KlippekortService.get_active_card_for_customer(customer_id)
logger.info(f"⏱️ Deducting {hours}h from card {card['card_number']} for worklog {worklog_id}")
# Update card usage
new_used = Decimal(str(card['used_hours'])) + hours
execute_update(
"UPDATE tticket_prepaid_cards SET used_hours = %s WHERE id = %s",
(new_used, card['id'])
)
# Calculate new balance
new_balance = Decimal(str(card['remaining_hours'])) - hours
# Create transaction
transaction_id = execute_insert(
"""
INSERT INTO tticket_prepaid_transactions
(card_id, worklog_id, transaction_type, hours, balance_after, description, created_by_user_id)
VALUES (%s, %s, %s, %s, %s, %s, %s)
""",
(
card['id'],
worklog_id,
'usage',
-hours, # Negative for deduction
new_balance,
description or f"Worklog #{worklog_id}: {hours}h",
user_id
)
)
# Check if card is now depleted
if new_balance <= Decimal('0'):
execute_update(
"UPDATE tticket_prepaid_cards SET status = 'depleted' WHERE id = %s",
(card['id'],)
)
logger.warning(f"💳 Card {card['card_number']} is now depleted")
# Fetch transaction
transaction = execute_query_single(
"SELECT * FROM tticket_prepaid_transactions WHERE id = %s",
(transaction_id,))
logger.info(f"✅ Deducted {hours}h from card {card['card_number']}, new balance: {new_balance}h")
return transaction
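The balance arithmetic in `deduct_hours()` normalizes DB values through `Decimal(str(...))` before adding or subtracting, since the driver may hand back floats. A small sketch of that step in isolation:

```python
# Sketch of the deduct_hours() arithmetic: routing driver values through
# Decimal(str(...)) avoids float rounding artifacts in the balance math.
from decimal import Decimal

used_hours = 1.1        # e.g. a float delivered by the DB driver
remaining_hours = 3.9
hours = Decimal("0.5")  # hours being deducted

new_used = Decimal(str(used_hours)) + hours
new_balance = Decimal(str(remaining_hours)) - hours

print(new_used, new_balance)  # 1.6 3.4
depleted = new_balance <= Decimal("0")  # would flip status to 'depleted'
print(depleted)  # False
```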
@staticmethod
def top_up_card(
card_id: int,
hours: Decimal,
user_id: Optional[int] = None,
note: Optional[str] = None
) -> Dict[str, Any]:
"""
Add hours to existing prepaid card
Args:
card_id: Card ID
hours: Hours to add
user_id: User performing top-up
note: Optional note
Returns:
Transaction dict
Raises:
ValueError: If card not found or not active
"""
# Get card
card = KlippekortService.get_card(card_id)
if not card:
raise ValueError(f"Prepaid card {card_id} not found")
if card['status'] not in ['active', 'depleted']:
raise ValueError(f"Cannot top up card with status: {card['status']}")
logger.info(f"💰 Topping up card {card['card_number']} with {hours}h")
# Update purchased hours
new_purchased = Decimal(str(card['purchased_hours'])) + hours
execute_update(
"UPDATE tticket_prepaid_cards SET purchased_hours = %s, status = 'active' WHERE id = %s",
(new_purchased, card_id)
)
# Calculate new balance
new_balance = Decimal(str(card['remaining_hours'])) + hours
# Create transaction
transaction_id = execute_insert(
"""
INSERT INTO tticket_prepaid_transactions
(card_id, transaction_type, hours, balance_after, description, created_by_user_id)
VALUES (%s, %s, %s, %s, %s, %s)
""",
(
card_id,
'top_up',
hours, # Positive for addition
new_balance,
note or f"Top-up: {hours}h added",
user_id
)
)
transaction = execute_query_single(
"SELECT * FROM tticket_prepaid_transactions WHERE id = %s",
(transaction_id,))
logger.info(f"✅ Topped up card {card['card_number']} with {hours}h, new balance: {new_balance}h")
return transaction
@staticmethod
def get_transactions(
card_id: int,
limit: int = 100
) -> List[Dict[str, Any]]:
"""
Get transaction history for card
Args:
card_id: Card ID
limit: Max number of transactions
Returns:
List of transaction dicts
"""
transactions = execute_query(
"""
SELECT * FROM tticket_prepaid_transactions
WHERE card_id = %s
ORDER BY created_at DESC
LIMIT %s
""",
(card_id, limit)
)
return transactions or []
@staticmethod
def list_cards(
customer_id: Optional[int] = None,
status: Optional[str] = None,
limit: int = 50,
offset: int = 0
) -> List[Dict[str, Any]]:
"""
List prepaid cards with optional filters
Args:
customer_id: Filter by customer
status: Filter by status
limit: Number of results
offset: Offset for pagination
Returns:
List of card dicts
"""
query = "SELECT * FROM tticket_prepaid_cards WHERE 1=1"
params = []
if customer_id:
query += " AND customer_id = %s"
params.append(customer_id)
if status:
query += " AND status = %s"
params.append(status)
query += " ORDER BY purchased_at DESC LIMIT %s OFFSET %s"
params.extend([limit, offset])
cards = execute_query(query, tuple(params))
return cards or []
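The filter construction in `list_cards()` is a common pattern: start from `WHERE 1=1` and append parameterized clauses so the SQL string and the parameter tuple grow in lockstep. Isolated as a pure function for illustration (the builder name is an assumption; the SQL fragments are copied from the method above):

```python
# The list_cards() filter pattern, isolated: each optional filter appends
# both a "%s" clause and its value, so placeholders and params stay aligned.
from typing import Optional, Tuple

def build_list_query(customer_id: Optional[int] = None,
                     status: Optional[str] = None,
                     limit: int = 50, offset: int = 0) -> Tuple[str, tuple]:
    query = "SELECT * FROM tticket_prepaid_cards WHERE 1=1"
    params = []
    if customer_id:
        query += " AND customer_id = %s"
        params.append(customer_id)
    if status:
        query += " AND status = %s"
        params.append(status)
    query += " ORDER BY purchased_at DESC LIMIT %s OFFSET %s"
    params.extend([limit, offset])
    return query, tuple(params)

q, p = build_list_query(customer_id=7, status="active")
print(p)  # (7, 'active', 50, 0)
```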
@staticmethod
def cancel_card(
card_id: int,
user_id: Optional[int] = None,
reason: Optional[str] = None
) -> Dict[str, Any]:
"""
Cancel/deactivate a prepaid card
Args:
card_id: Card ID
user_id: User cancelling card
reason: Cancellation reason
Returns:
Updated card dict
Raises:
ValueError: If card not found or already cancelled
"""
card = KlippekortService.get_card(card_id)
if not card:
raise ValueError(f"Prepaid card {card_id} not found")
if card['status'] == 'cancelled':
raise ValueError(f"Card {card['card_number']} is already cancelled")
logger.info(f"❌ Cancelling card {card['card_number']}")
# Update status
execute_update(
"UPDATE tticket_prepaid_cards SET status = 'cancelled' WHERE id = %s",
(card_id,)
)
# Log transaction
execute_insert(
"""
INSERT INTO tticket_prepaid_transactions
(card_id, transaction_type, hours, balance_after, description, created_by_user_id)
VALUES (%s, %s, %s, %s, %s, %s)
""",
(
card_id,
'cancellation',
Decimal('0'),
card['remaining_hours'],
reason or "Card cancelled",
user_id
)
)
# Fetch updated card
updated = execute_query_single(
"SELECT * FROM tticket_prepaid_cards WHERE id = %s",
(card_id,))
logger.info(f"✅ Cancelled card {card['card_number']}")
return updated


@ -1,783 +0,0 @@
"""
Pydantic Models for Ticket System & Klippekort Module
======================================================
All models represent data from the tticket_* tables.
The module is isolated and has no dependencies on core Hub models.
"""
from datetime import date, datetime
from decimal import Decimal
from typing import List, Optional
from pydantic import BaseModel, Field, field_validator
from enum import Enum
# ============================================================================
# ENUMS
# ============================================================================
class TicketStatus(str, Enum):
"""Ticket status workflow"""
OPEN = "open"
IN_PROGRESS = "in_progress"
WAITING_CUSTOMER = "waiting_customer"
WAITING_INTERNAL = "waiting_internal"
RESOLVED = "resolved"
CLOSED = "closed"
class TicketPriority(str, Enum):
"""Ticket prioritet"""
LOW = "low"
NORMAL = "normal"
HIGH = "high"
URGENT = "urgent"
class TicketSource(str, Enum):
"""Hvor ticket blev oprettet fra"""
EMAIL = "email"
PORTAL = "portal"
PHONE = "phone"
MANUAL = "manual"
API = "api"
class WorkType(str, Enum):
"""Type af arbejde"""
SUPPORT = "support"
DEVELOPMENT = "development"
TROUBLESHOOTING = "troubleshooting"
ON_SITE = "on_site"
MEETING = "meeting"
OTHER = "other"
class BillingMethod(str, Enum):
"""Afregningsmetode"""
PREPAID_CARD = "prepaid_card"
INVOICE = "invoice"
INTERNAL = "internal"
WARRANTY = "warranty"
class WorklogStatus(str, Enum):
"""Worklog status"""
DRAFT = "draft"
BILLABLE = "billable"
BILLED = "billed"
NON_BILLABLE = "non_billable"
class PrepaidCardStatus(str, Enum):
"""Klippekort status"""
ACTIVE = "active"
DEPLETED = "depleted"
EXPIRED = "expired"
CANCELLED = "cancelled"
class TransactionType(str, Enum):
"""Klippekort transaction type"""
PURCHASE = "purchase"
TOP_UP = "top_up"
USAGE = "usage"
REFUND = "refund"
EXPIRATION = "expiration"
CANCELLATION = "cancellation"
# ============================================================================
# TICKET MODELS
# ============================================================================
class TTicketBase(BaseModel):
"""Base model for ticket"""
subject: str = Field(..., min_length=1, max_length=500)
description: Optional[str] = None
status: TicketStatus = Field(default=TicketStatus.OPEN)
priority: TicketPriority = Field(default=TicketPriority.NORMAL)
category: Optional[str] = Field(None, max_length=100)
customer_id: Optional[int] = Field(None, description="Reference to customers.id")
contact_id: Optional[int] = Field(None, description="Reference to contacts.id")
assigned_to_user_id: Optional[int] = Field(None, description="Reference to users.user_id")
source: TicketSource = Field(default=TicketSource.MANUAL)
tags: Optional[List[str]] = Field(default_factory=list)
custom_fields: Optional[dict] = Field(default_factory=dict)
class TTicketCreate(TTicketBase):
"""Model for creating a ticket"""
ticket_number: Optional[str] = Field(None, description="Auto-generated if not provided")
created_by_user_id: Optional[int] = Field(None, description="User creating the ticket")
class TTicketUpdate(BaseModel):
"""Model for updating a ticket (partial updates)"""
subject: Optional[str] = Field(None, min_length=1, max_length=500)
description: Optional[str] = None
status: Optional[TicketStatus] = None
priority: Optional[TicketPriority] = None
category: Optional[str] = Field(None, max_length=100)
customer_id: Optional[int] = None
contact_id: Optional[int] = None
assigned_to_user_id: Optional[int] = None
tags: Optional[List[str]] = None
custom_fields: Optional[dict] = None
class TTicket(TTicketBase):
"""Full ticket model with DB fields"""
id: int
ticket_number: str
created_by_user_id: Optional[int] = None
created_at: datetime
updated_at: Optional[datetime] = None
first_response_at: Optional[datetime] = None
resolved_at: Optional[datetime] = None
closed_at: Optional[datetime] = None
class Config:
from_attributes = True
class TTicketWithStats(TTicket):
"""Ticket med statistik (fra view)"""
comment_count: Optional[int] = 0
attachment_count: Optional[int] = 0
pending_hours: Optional[Decimal] = None
billed_hours: Optional[Decimal] = None
last_comment_at: Optional[datetime] = None
age_hours: Optional[float] = None
class Config:
from_attributes = True
# ============================================================================
# COMMENT MODELS
# ============================================================================
class TTicketCommentBase(BaseModel):
"""Base model for comment"""
comment_text: str = Field(..., min_length=1)
is_internal: bool = Field(default=False, description="Internal note (not customer-visible)")
class TTicketCommentCreate(TTicketCommentBase):
"""Model for creating a comment"""
ticket_id: int = Field(..., gt=0)
user_id: Optional[int] = Field(None, description="User creating the comment")
class TTicketComment(TTicketCommentBase):
"""Full comment model with DB fields"""
id: int
ticket_id: int
user_id: Optional[int] = None
created_at: datetime
updated_at: Optional[datetime] = None
class Config:
from_attributes = True
# ============================================================================
# ATTACHMENT MODELS
# ============================================================================
class TTicketAttachmentBase(BaseModel):
"""Base model for attachment"""
file_name: str = Field(..., min_length=1, max_length=255)
file_path: str = Field(..., min_length=1, max_length=500)
file_size: Optional[int] = Field(None, ge=0)
mime_type: Optional[str] = Field(None, max_length=100)
class TTicketAttachmentCreate(TTicketAttachmentBase):
"""Model for creating an attachment"""
ticket_id: int = Field(..., gt=0)
uploaded_by_user_id: Optional[int] = None
class TTicketAttachment(TTicketAttachmentBase):
"""Full attachment model with DB fields"""
id: int
ticket_id: int
uploaded_by_user_id: Optional[int] = None
created_at: datetime
class Config:
from_attributes = True
# ============================================================================
# WORKLOG MODELS
# ============================================================================
class TTicketWorklogBase(BaseModel):
"""Base model for worklog entry"""
work_date: date = Field(..., description="Date the work was performed")
hours: Decimal = Field(..., gt=0, le=24, description="Hours spent")
work_type: WorkType = Field(default=WorkType.SUPPORT)
description: Optional[str] = None
billing_method: BillingMethod = Field(default=BillingMethod.INVOICE)
@field_validator('hours')
@classmethod
def validate_hours(cls, v):
"""Validate hours is reasonable (max 24 hours per day)"""
if v <= 0:
raise ValueError('Hours must be greater than 0')
if v > 24:
raise ValueError('Hours cannot exceed 24 per entry')
return v
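The `validate_hours` rule above is small enough to express as a plain function, which makes the enforced bounds (strictly positive, at most 24 per entry) easy to test without Pydantic. A dependency-free sketch of the same check:

```python
# Plain-function replica of the hours validator above: same bounds the
# model enforces via Field(gt=0, le=24) plus the explicit validator.
from decimal import Decimal

def validate_hours(v: Decimal) -> Decimal:
    if v <= 0:
        raise ValueError("Hours must be greater than 0")
    if v > 24:
        raise ValueError("Hours cannot exceed 24 per entry")
    return v

print(validate_hours(Decimal("7.5")))  # 7.5
```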
class TTicketWorklogCreate(TTicketWorklogBase):
"""Model for creating a worklog entry"""
ticket_id: int = Field(..., gt=0)
user_id: Optional[int] = Field(None, description="User creating the worklog")
prepaid_card_id: Optional[int] = Field(None, description="Prepaid card ID if billing_method=prepaid_card")
class TTicketWorklogUpdate(BaseModel):
"""Model for updating a worklog entry (partial updates)"""
work_date: Optional[date] = None
hours: Optional[Decimal] = Field(None, gt=0, le=24)
work_type: Optional[WorkType] = None
description: Optional[str] = None
billing_method: Optional[BillingMethod] = None
status: Optional[WorklogStatus] = None
prepaid_card_id: Optional[int] = None
class TTicketWorklog(TTicketWorklogBase):
"""Full worklog model with DB fields"""
id: int
ticket_id: int
user_id: Optional[int] = None
status: WorklogStatus = Field(default=WorklogStatus.DRAFT)
prepaid_card_id: Optional[int] = None
created_at: datetime
updated_at: Optional[datetime] = None
billed_at: Optional[datetime] = None
class Config:
from_attributes = True
class TTicketWorklogWithDetails(TTicketWorklog):
"""Worklog med ticket detaljer (til review UI)"""
ticket_number: Optional[str] = None
ticket_subject: Optional[str] = None
customer_id: Optional[int] = None
ticket_status: Optional[str] = None
has_sufficient_balance: Optional[bool] = True
class Config:
from_attributes = True
# ============================================================================
# PREPAID CARD (KLIPPEKORT) MODELS
# ============================================================================
class TPrepaidCardBase(BaseModel):
"""Base model for prepaid card"""
customer_id: int = Field(..., gt=0, description="Reference to customers.id")
purchased_hours: Decimal = Field(..., gt=0, description="Hours purchased")
price_per_hour: Decimal = Field(..., gt=0, description="DKK per hour")
total_amount: Decimal = Field(..., gt=0, description="Total price in DKK")
expires_at: Optional[datetime] = Field(None, description="Expiry date (NULL = no expiry)")
notes: Optional[str] = None
@field_validator('total_amount')
@classmethod
def validate_total(cls, v, info):
"""Validate that total_amount matches purchased_hours * price_per_hour"""
# Note: This validator runs after all fields are set in Pydantic v2
# If we need cross-field validation, use model_validator instead
if v <= 0:
raise ValueError('Total amount must be greater than 0')
return v
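The comment in `validate_total` notes that the real cross-field check (`total_amount == purchased_hours * price_per_hour`) would need a `model_validator` in Pydantic v2. The check itself can be sketched without Pydantic; the function name here is illustrative, not part of the module:

```python
# Hypothetical sketch of the cross-field consistency check the comment
# above alludes to: total must equal hours times unit price, in Decimal.
from decimal import Decimal

def check_total(purchased_hours: Decimal, price_per_hour: Decimal,
                total_amount: Decimal) -> None:
    expected = purchased_hours * price_per_hour
    if total_amount != expected:
        raise ValueError(f"total_amount {total_amount} != expected {expected}")

check_total(Decimal("10"), Decimal("950"), Decimal("9500"))  # consistent, no error
```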
class TPrepaidCardCreate(TPrepaidCardBase):
"""Model for creating a prepaid card"""
card_number: Optional[str] = Field(None, description="Auto-generated if not provided")
created_by_user_id: Optional[int] = Field(None, description="User creating the card")
economic_invoice_number: Optional[str] = Field(None, max_length=50)
economic_product_number: Optional[str] = Field(None, max_length=50)
class TPrepaidCardUpdate(BaseModel):
"""Model for updating a prepaid card (partial updates)"""
status: Optional[PrepaidCardStatus] = None
expires_at: Optional[datetime] = None
notes: Optional[str] = None
economic_invoice_number: Optional[str] = Field(None, max_length=50)
economic_product_number: Optional[str] = Field(None, max_length=50)
class TPrepaidCard(TPrepaidCardBase):
"""Full prepaid card model with DB fields"""
id: int
card_number: str
used_hours: Decimal = Field(default=Decimal('0'))
remaining_hours: Decimal # Generated column
status: PrepaidCardStatus = Field(default=PrepaidCardStatus.ACTIVE)
purchased_at: datetime
economic_invoice_number: Optional[str] = None
economic_product_number: Optional[str] = None
created_by_user_id: Optional[int] = None
created_at: datetime
updated_at: Optional[datetime] = None
class Config:
from_attributes = True
class TPrepaidCardWithStats(TPrepaidCard):
"""Prepaid card med statistik (fra view)"""
usage_count: Optional[int] = 0
total_hours_used: Optional[Decimal] = None
billed_usage_count: Optional[int] = 0
class Config:
from_attributes = True
# ============================================================================
# PREPAID TRANSACTION MODELS
# ============================================================================
class TPrepaidTransactionBase(BaseModel):
"""Base model for prepaid transaction"""
transaction_type: TransactionType
hours: Decimal = Field(..., description="Hours (positive=add, negative=deduct)")
description: Optional[str] = None
class TPrepaidTransactionCreate(TPrepaidTransactionBase):
"""Model for creating a transaction"""
card_id: int = Field(..., gt=0)
worklog_id: Optional[int] = Field(None, description="Reference to worklog (NULL for purchase/top-up)")
balance_after: Decimal = Field(..., ge=0, description="Balance after the transaction")
created_by_user_id: Optional[int] = None
class TPrepaidTransaction(TPrepaidTransactionBase):
"""Full transaction model with DB fields"""
id: int
card_id: int
worklog_id: Optional[int] = None
balance_after: Decimal
created_at: datetime
created_by_user_id: Optional[int] = None
class Config:
from_attributes = True
# ============================================================================
# EMAIL INTEGRATION MODELS
# ============================================================================
class TTicketEmailLogBase(BaseModel):
"""Base model for email log"""
action: str = Field(..., max_length=50)
class TTicketEmailLogCreate(TTicketEmailLogBase):
"""Model for creating an email log entry"""
ticket_id: Optional[int] = None
email_id: Optional[int] = Field(None, description="Reference to email_messages.id")
email_message_id: Optional[str] = Field(None, max_length=500, description="Email Message-ID header")
class TTicketEmailLog(TTicketEmailLogBase):
"""Full email log model with DB fields"""
id: int
ticket_id: Optional[int] = None
email_id: Optional[int] = None
email_message_id: Optional[str] = None
created_at: datetime
class Config:
from_attributes = True
# ============================================================================
# AUDIT LOG MODELS
# ============================================================================
class TTicketAuditLogBase(BaseModel):
"""Base model for audit log"""
entity_type: str = Field(..., max_length=50)
action: str = Field(..., max_length=50)
old_value: Optional[str] = None
new_value: Optional[str] = None
details: Optional[dict] = Field(default_factory=dict)
class TTicketAuditLogCreate(TTicketAuditLogBase):
"""Model for creating an audit log entry"""
ticket_id: Optional[int] = None
entity_id: Optional[int] = None
user_id: Optional[int] = None
class TTicketAuditLog(TTicketAuditLogBase):
"""Full audit log model with DB fields"""
id: int
ticket_id: Optional[int] = None
entity_id: Optional[int] = None
user_id: Optional[int] = None
created_at: datetime
class Config:
from_attributes = True
# ============================================================================
# RESPONSE MODELS (for API responses)
# ============================================================================
class TicketListResponse(BaseModel):
"""Response model for ticket lists"""
tickets: List[TTicketWithStats]
total: int
page: int = 1
page_size: int = 50
class WorklogReviewResponse(BaseModel):
"""Response model for worklog review page"""
worklogs: List[TTicketWorklogWithDetails]
total: int
total_hours: Decimal
total_billable_hours: Decimal
class PrepaidCardBalanceResponse(BaseModel):
"""Response model for prepaid card balance check"""
card: TPrepaidCardWithStats
can_deduct: bool
required_hours: Optional[Decimal] = None
message: Optional[str] = None
# ============================================================================
# REQUEST MODELS (for specific actions)
# ============================================================================
class TicketStatusUpdateRequest(BaseModel):
"""Request model for updating ticket status"""
status: TicketStatus
note: Optional[str] = Field(None, description="Note for the audit log")
class WorklogBillingRequest(BaseModel):
"""Request model for marking worklogs as billable"""
worklog_ids: List[int] = Field(..., min_length=1)
note: Optional[str] = Field(None, description="Note for the audit log")
class PrepaidCardTopUpRequest(BaseModel):
"""Request model for topping up prepaid card"""
hours: Decimal = Field(..., gt=0, description="Hours to add")
note: Optional[str] = Field(None, description="Description of the top-up")
class PrepaidCardDeductRequest(BaseModel):
"""Request model for deducting hours from prepaid card"""
worklog_id: int = Field(..., gt=0, description="Worklog ID to deduct from the card")
hours: Decimal = Field(..., gt=0, description="Hours to deduct")
# ============================================================================
# TICKET RELATIONS MODELS (Migration 026)
# ============================================================================
class TicketRelationType(str, Enum):
"""Ticket relation types"""
MERGED_INTO = "merged_into"
SPLIT_FROM = "split_from"
PARENT_OF = "parent_of"
CHILD_OF = "child_of"
RELATED_TO = "related_to"
class TTicketRelationBase(BaseModel):
"""Base model for ticket relation"""
ticket_id: int
related_ticket_id: int
relation_type: TicketRelationType
reason: Optional[str] = None
class TTicketRelationCreate(TTicketRelationBase):
"""Create ticket relation"""
pass
class TTicketRelation(TTicketRelationBase):
"""Full ticket relation model"""
id: int
created_by_user_id: Optional[int] = None
created_at: datetime
class Config:
from_attributes = True
# ============================================================================
# CALENDAR EVENTS MODELS
# ============================================================================
class CalendarEventType(str, Enum):
"""Calendar event types"""
APPOINTMENT = "appointment"
DEADLINE = "deadline"
MILESTONE = "milestone"
REMINDER = "reminder"
FOLLOW_UP = "follow_up"
class CalendarEventStatus(str, Enum):
"""Calendar event status"""
PENDING = "pending"
CONFIRMED = "confirmed"
COMPLETED = "completed"
CANCELLED = "cancelled"
class TTicketCalendarEventBase(BaseModel):
"""Base model for calendar event"""
ticket_id: int
title: str = Field(..., min_length=1, max_length=200)
description: Optional[str] = None
event_type: CalendarEventType = Field(default=CalendarEventType.APPOINTMENT)
event_date: date
event_time: Optional[str] = None
duration_minutes: Optional[int] = None
all_day: bool = False
status: CalendarEventStatus = Field(default=CalendarEventStatus.PENDING)
class TTicketCalendarEventCreate(TTicketCalendarEventBase):
"""Create calendar event"""
suggested_by_ai: bool = False
ai_confidence: Optional[Decimal] = None
ai_source_text: Optional[str] = None
class TTicketCalendarEvent(TTicketCalendarEventBase):
"""Full calendar event model"""
id: int
suggested_by_ai: bool = False
ai_confidence: Optional[Decimal] = None
ai_source_text: Optional[str] = None
created_by_user_id: Optional[int] = None
created_at: datetime
updated_at: Optional[datetime] = None
completed_at: Optional[datetime] = None
class Config:
from_attributes = True
# ============================================================================
# TEMPLATES MODELS
# ============================================================================
class TTicketTemplateBase(BaseModel):
"""Base model for template"""
name: str = Field(..., min_length=1, max_length=200)
description: Optional[str] = None
category: Optional[str] = None
subject_template: Optional[str] = Field(None, max_length=500)
body_template: str = Field(..., min_length=1)
available_placeholders: Optional[List[str]] = None
default_attachments: Optional[dict] = None
is_active: bool = True
requires_approval: bool = False
class TTicketTemplateCreate(TTicketTemplateBase):
"""Create template"""
pass
class TTicketTemplate(TTicketTemplateBase):
"""Full template model"""
id: int
created_by_user_id: Optional[int] = None
created_at: datetime
updated_at: Optional[datetime] = None
last_used_at: Optional[datetime] = None
usage_count: int = 0
class Config:
from_attributes = True
class TemplateRenderRequest(BaseModel):
"""Request to render template with data"""
template_id: int
ticket_id: int
custom_data: Optional[dict] = None
class TemplateRenderResponse(BaseModel):
"""Rendered template"""
subject: Optional[str] = None
body: str
placeholders_used: List[str]
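`TemplateRenderResponse` carries a rendered body plus the list of placeholders that were substituted. A minimal stdlib sketch of how such a response could be populated — `string.Formatter` and the `render` helper here are assumptions for illustration; the real renderer may differ:

```python
# Hypothetical renderer sketch: fill a {placeholder}-style body_template
# and report which placeholders were used, mirroring the response model.
from string import Formatter
from typing import Any, Dict

def render(body_template: str, data: Dict[str, Any]) -> Dict[str, Any]:
    # Formatter().parse yields (literal, field_name, spec, conversion) tuples;
    # non-None field names are the placeholders present in the template.
    placeholders = [f for _, f, _, _ in Formatter().parse(body_template) if f]
    body = body_template.format(**data)
    return {"body": body, "placeholders_used": placeholders}

out = render("Hi {contact_name}, ticket {ticket_number} is resolved.",
             {"contact_name": "Anna", "ticket_number": "TKT-20251215-001"})
print(out["placeholders_used"])  # ['contact_name', 'ticket_number']
```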
# ============================================================================
# AI SUGGESTIONS MODELS
# ============================================================================
class AISuggestionType(str, Enum):
"""AI suggestion types"""
CONTACT_UPDATE = "contact_update"
NEW_CONTACT = "new_contact"
CATEGORY = "category"
TAG = "tag"
PRIORITY = "priority"
DEADLINE = "deadline"
CALENDAR_EVENT = "calendar_event"
TEMPLATE = "template"
MERGE = "merge"
RELATED_TICKET = "related_ticket"
class AISuggestionStatus(str, Enum):
"""AI suggestion status"""
PENDING = "pending"
ACCEPTED = "accepted"
REJECTED = "rejected"
AUTO_EXPIRED = "auto_expired"
class TTicketAISuggestionBase(BaseModel):
"""Base model for AI suggestion"""
ticket_id: int
suggestion_type: AISuggestionType
suggestion_data: dict  # Structured data about the suggestion
confidence: Optional[Decimal] = None
reasoning: Optional[str] = None
source_text: Optional[str] = None
source_comment_id: Optional[int] = None
class TTicketAISuggestionCreate(TTicketAISuggestionBase):
"""Create AI suggestion"""
expires_at: Optional[datetime] = None
class TTicketAISuggestion(TTicketAISuggestionBase):
"""Full AI suggestion model"""
id: int
status: AISuggestionStatus = Field(default=AISuggestionStatus.PENDING)
reviewed_by_user_id: Optional[int] = None
reviewed_at: Optional[datetime] = None
created_at: datetime
expires_at: Optional[datetime] = None
class Config:
from_attributes = True
class AISuggestionReviewRequest(BaseModel):
"""Request to accept/reject AI suggestion"""
action: str = Field(..., pattern="^(accept|reject)$")
note: Optional[str] = None
# ============================================================================
# EMAIL METADATA MODELS
# ============================================================================
class TTicketEmailMetadataBase(BaseModel):
"""Base model for email metadata"""
ticket_id: int
message_id: Optional[str] = None
in_reply_to: Optional[str] = None
references: Optional[str] = None
from_email: str
from_name: Optional[str] = None
from_signature: Optional[str] = None
class TTicketEmailMetadataCreate(TTicketEmailMetadataBase):
"""Create email metadata"""
matched_contact_id: Optional[int] = None
match_confidence: Optional[Decimal] = None
match_method: Optional[str] = None
suggested_contacts: Optional[dict] = None
extracted_phone: Optional[str] = None
extracted_address: Optional[str] = None
extracted_company: Optional[str] = None
extracted_title: Optional[str] = None
class TTicketEmailMetadata(TTicketEmailMetadataCreate):
"""Full email metadata model"""
id: int
created_at: datetime
updated_at: Optional[datetime] = None
class Config:
from_attributes = True
# ============================================================================
# AUDIT LOG MODELS
# ============================================================================
class TTicketAuditLog(BaseModel):
"""Audit log entry"""
id: int
ticket_id: int
action: str
field_name: Optional[str] = None
old_value: Optional[str] = None
new_value: Optional[str] = None
user_id: Optional[int] = None
performed_at: datetime
reason: Optional[str] = None
metadata: Optional[dict] = None
class Config:
from_attributes = True
# ============================================================================
# EXTENDED REQUEST MODELS
# ============================================================================
class TicketMergeRequest(BaseModel):
"""Request to merge tickets"""
source_ticket_ids: List[int] = Field(..., min_length=1, description="Tickets to merge")
target_ticket_id: int = Field(..., description="Primary ticket to keep")
reason: Optional[str] = Field(None, description="Reason for merging")
class TicketSplitRequest(BaseModel):
"""Request to split ticket"""
source_ticket_id: int = Field(..., description="Ticket to split")
comment_ids: List[int] = Field(..., min_length=1, description="Comments to move to the new ticket")
new_subject: str = Field(..., min_length=1, description="Subject of the new ticket")
new_description: Optional[str] = Field(None, description="Description of the new ticket")
reason: Optional[str] = Field(None, description="Reason for splitting the ticket")
class TicketDeadlineUpdateRequest(BaseModel):
"""Request to update ticket deadline"""
deadline: Optional[datetime] = None
reason: Optional[str] = None
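The request models above rely on Pydantic v2 field constraints (`min_length` on lists, `pattern` on strings) to reject malformed payloads before any business logic runs. A minimal standalone sketch, reproducing just two of the models as defined in this diff:

```python
# Standalone sketch of two request models from this diff (Pydantic v2 assumed).
from typing import List, Optional
from pydantic import BaseModel, Field, ValidationError

class TicketMergeRequest(BaseModel):
    source_ticket_ids: List[int] = Field(..., min_length=1)
    target_ticket_id: int
    reason: Optional[str] = None

class AISuggestionReviewRequest(BaseModel):
    action: str = Field(..., pattern="^(accept|reject)$")
    note: Optional[str] = None

# Valid payloads parse cleanly...
merge = TicketMergeRequest(source_ticket_ids=[101, 102], target_ticket_id=100)
review = AISuggestionReviewRequest(action="accept")

# ...while an empty id list or an unknown action fails at parse time.
rejected = 0
for bad in (lambda: TicketMergeRequest(source_ticket_ids=[], target_ticket_id=100),
            lambda: AISuggestionReviewRequest(action="defer")):
    try:
        bad()
    except ValidationError:
        rejected += 1
```

Because validation happens at model construction, route handlers receiving these models can assume the invariants (non-empty `source_ticket_ids`, `action` is accept/reject) already hold.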

File diff suppressed because it is too large


@@ -1,525 +0,0 @@
"""
Ticket Service
==============
Business logic for ticket operations: status transitions, validation, audit logging.
"""
import logging
from datetime import datetime
from typing import Optional, Dict, Any, List
from decimal import Decimal
from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
from app.ticket.backend.models import (
TicketStatus,
TicketPriority,
TTicketCreate,
TTicketUpdate,
TTicket
)
logger = logging.getLogger(__name__)
class TicketService:
"""Service for ticket business logic"""
# Status transition rules (which transitions are allowed)
ALLOWED_TRANSITIONS = {
TicketStatus.OPEN: [TicketStatus.IN_PROGRESS, TicketStatus.CLOSED],
TicketStatus.IN_PROGRESS: [
TicketStatus.WAITING_CUSTOMER,
TicketStatus.WAITING_INTERNAL,
TicketStatus.RESOLVED
],
TicketStatus.WAITING_CUSTOMER: [TicketStatus.IN_PROGRESS, TicketStatus.CLOSED],
TicketStatus.WAITING_INTERNAL: [TicketStatus.IN_PROGRESS, TicketStatus.CLOSED],
TicketStatus.RESOLVED: [TicketStatus.CLOSED, TicketStatus.IN_PROGRESS], # Can reopen
TicketStatus.CLOSED: [] # Cannot transition from closed
}
@staticmethod
def validate_status_transition(current_status: str, new_status: str) -> tuple[bool, Optional[str]]:
"""
Validate if status transition is allowed
Returns:
(is_valid, error_message)
"""
try:
current = TicketStatus(current_status)
new = TicketStatus(new_status)
except ValueError as e:
return False, f"Invalid status: {e}"
if current == new:
return True, None # Same status is OK
allowed = TicketService.ALLOWED_TRANSITIONS.get(current, [])
if new not in allowed:
return False, f"Cannot transition from {current.value} to {new.value}"
return True, None
@staticmethod
def create_ticket(
ticket_data: TTicketCreate,
user_id: Optional[int] = None
) -> Dict[str, Any]:
"""
Create a new ticket with auto-generated ticket_number if not provided
Args:
ticket_data: Ticket creation data
user_id: User creating the ticket
Returns:
Created ticket dict
"""
logger.info(f"🎫 Creating ticket: {ticket_data.subject}")
# Prepare data for PostgreSQL (convert Python types to SQL types)
from psycopg2.extras import Json
# Insert ticket (trigger will auto-generate ticket_number if NULL)
result = execute_query_single(
"""
INSERT INTO tticket_tickets (
ticket_number, subject, description, status, priority, category,
customer_id, contact_id, assigned_to_user_id, created_by_user_id,
source, tags, custom_fields
) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
RETURNING id
""",
(
ticket_data.ticket_number,
ticket_data.subject,
ticket_data.description,
ticket_data.status.value,
ticket_data.priority.value,
ticket_data.category,
ticket_data.customer_id,
ticket_data.contact_id,
ticket_data.assigned_to_user_id,
user_id or ticket_data.created_by_user_id,
ticket_data.source.value,
ticket_data.tags or [], # PostgreSQL array
Json(ticket_data.custom_fields or {}) # PostgreSQL JSONB
)
)
if not result:
raise Exception("Failed to create ticket - no ID returned")
ticket_id = result['id']
# Log creation
TicketService.log_audit(
ticket_id=ticket_id,
entity_type="ticket",
entity_id=ticket_id,
user_id=user_id,
action="created",
details={"subject": ticket_data.subject, "status": ticket_data.status.value}
)
# Fetch created ticket
ticket = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
logger.info(f"✅ Created ticket {ticket['ticket_number']} (ID: {ticket_id})")
return ticket
@staticmethod
def update_ticket(
ticket_id: int,
update_data: TTicketUpdate,
user_id: Optional[int] = None
) -> Dict[str, Any]:
"""
Update ticket with partial data
Args:
ticket_id: Ticket ID to update
update_data: Fields to update
user_id: User making the update
Returns:
Updated ticket dict
"""
# Get current ticket
current = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
if not current:
raise ValueError(f"Ticket {ticket_id} not found")
# Build UPDATE query dynamically
from psycopg2.extras import Json
updates = []
params = []
update_dict = update_data.model_dump(exclude_unset=True)
for field, value in update_dict.items():
# Convert enums to values
if hasattr(value, 'value'):
value = value.value
# Convert dict to JSON for PostgreSQL JSONB
elif field == 'custom_fields' and isinstance(value, dict):
value = Json(value)
# Arrays are handled automatically by psycopg2
updates.append(f"{field} = %s")
params.append(value)
if not updates:
logger.warning(f"No fields to update for ticket {ticket_id}")
return current
# Add ticket_id to params
params.append(ticket_id)
query = f"UPDATE tticket_tickets SET {', '.join(updates)} WHERE id = %s"
execute_update(query, tuple(params))
# Log update
TicketService.log_audit(
ticket_id=ticket_id,
entity_type="ticket",
entity_id=ticket_id,
user_id=user_id,
action="updated",
details=update_dict
)
# Fetch updated ticket
updated = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
logger.info(f"✅ Updated ticket {updated['ticket_number']}")
return updated
@staticmethod
def update_ticket_status(
ticket_id: int,
new_status: str,
user_id: Optional[int] = None,
note: Optional[str] = None
) -> Dict[str, Any]:
"""
Update ticket status with validation
Args:
ticket_id: Ticket ID
new_status: New status value
user_id: User making the change
note: Optional note for audit log
Returns:
Updated ticket dict
Raises:
ValueError: If transition is not allowed
"""
# Get current ticket
current = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
if not current:
raise ValueError(f"Ticket {ticket_id} not found")
current_status = current['status']
# Validate transition
is_valid, error = TicketService.validate_status_transition(current_status, new_status)
if not is_valid:
logger.error(f"❌ Invalid status transition: {error}")
raise ValueError(error)
# Update status
execute_update(
"UPDATE tticket_tickets SET status = %s WHERE id = %s",
(new_status, ticket_id)
)
# Update resolved_at timestamp if transitioning to resolved
if new_status == TicketStatus.RESOLVED.value and not current['resolved_at']:
execute_update(
"UPDATE tticket_tickets SET resolved_at = CURRENT_TIMESTAMP WHERE id = %s",
(ticket_id,)
)
# Update closed_at timestamp if transitioning to closed
if new_status == TicketStatus.CLOSED.value and not current['closed_at']:
execute_update(
"UPDATE tticket_tickets SET closed_at = CURRENT_TIMESTAMP WHERE id = %s",
(ticket_id,)
)
# Log status change
TicketService.log_audit(
ticket_id=ticket_id,
entity_type="ticket",
entity_id=ticket_id,
user_id=user_id,
action="status_changed",
old_value=current_status,
new_value=new_status,
details={"note": note} if note else None
)
# Fetch updated ticket
updated = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
logger.info(f"✅ Updated ticket {updated['ticket_number']} status: {current_status}{new_status}")
return updated
@staticmethod
def assign_ticket(
ticket_id: int,
assigned_to_user_id: int,
user_id: Optional[int] = None
) -> Dict[str, Any]:
"""
Assign ticket to a user
Args:
ticket_id: Ticket ID
assigned_to_user_id: User to assign to
user_id: User making the assignment
Returns:
Updated ticket dict
"""
# Get current assignment
current = execute_query_single(
"SELECT assigned_to_user_id FROM tticket_tickets WHERE id = %s",
(ticket_id,))
if not current:
raise ValueError(f"Ticket {ticket_id} not found")
# Update assignment
execute_update(
"UPDATE tticket_tickets SET assigned_to_user_id = %s WHERE id = %s",
(assigned_to_user_id, ticket_id)
)
# Log assignment
TicketService.log_audit(
ticket_id=ticket_id,
entity_type="ticket",
entity_id=ticket_id,
user_id=user_id,
action="assigned",
old_value=str(current['assigned_to_user_id']) if current['assigned_to_user_id'] else None,
new_value=str(assigned_to_user_id)
)
# Fetch updated ticket
updated = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
logger.info(f"✅ Assigned ticket {updated['ticket_number']} to user {assigned_to_user_id}")
return updated
@staticmethod
def add_comment(
ticket_id: int,
comment_text: str,
user_id: Optional[int] = None,
is_internal: bool = False
) -> Dict[str, Any]:
"""
Add a comment to a ticket
Args:
ticket_id: Ticket ID
comment_text: Comment content
user_id: User adding comment
is_internal: Whether comment is internal
Returns:
Created comment dict
"""
# Verify ticket exists
ticket = execute_query_single(
"SELECT id FROM tticket_tickets WHERE id = %s",
(ticket_id,))
if not ticket:
raise ValueError(f"Ticket {ticket_id} not found")
# Insert comment
comment_id = execute_insert(
"""
INSERT INTO tticket_comments (ticket_id, user_id, comment_text, is_internal)
VALUES (%s, %s, %s, %s)
""",
(ticket_id, user_id, comment_text, is_internal)
)
# Update ticket's updated_at timestamp
execute_update(
"UPDATE tticket_tickets SET updated_at = CURRENT_TIMESTAMP WHERE id = %s",
(ticket_id,)
)
# Update first_response_at if this is the first non-internal comment
if not is_internal:
ticket = execute_query_single(
"SELECT first_response_at FROM tticket_tickets WHERE id = %s",
(ticket_id,))
if not ticket['first_response_at']:
execute_update(
"UPDATE tticket_tickets SET first_response_at = CURRENT_TIMESTAMP WHERE id = %s",
(ticket_id,)
)
# Log comment
TicketService.log_audit(
ticket_id=ticket_id,
entity_type="comment",
entity_id=comment_id,
user_id=user_id,
action="comment_added",
details={"is_internal": is_internal, "length": len(comment_text)}
)
# Fetch created comment
comment = execute_query_single(
"SELECT * FROM tticket_comments WHERE id = %s",
(comment_id,))
logger.info(f"💬 Added comment to ticket {ticket_id} (internal: {is_internal})")
return comment
@staticmethod
def log_audit(
ticket_id: Optional[int] = None,
entity_type: str = "ticket",
entity_id: Optional[int] = None,
user_id: Optional[int] = None,
action: str = "updated",
old_value: Optional[str] = None,
new_value: Optional[str] = None,
details: Optional[Dict[str, Any]] = None
) -> None:
"""
Log an action to audit log
Args:
ticket_id: Related ticket ID (optional)
entity_type: Type of entity (ticket, comment, worklog, etc.)
entity_id: ID of the entity
user_id: User performing action
action: Action performed
old_value: Old value (for updates)
new_value: New value (for updates)
details: Additional details as JSON
"""
try:
from psycopg2.extras import Json
execute_insert(
"""
INSERT INTO tticket_audit_log
(ticket_id, entity_type, entity_id, user_id, action, old_value, new_value, details)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
""",
(ticket_id, entity_type, entity_id, user_id, action, old_value, new_value,
Json(details) if details else None)
)
except Exception as e:
logger.error(f"❌ Failed to log audit entry: {e}")
# Don't raise - audit logging should not break the main operation
@staticmethod
def get_ticket_with_stats(ticket_id: int) -> Optional[Dict[str, Any]]:
"""
Get ticket with statistics from view
Args:
ticket_id: Ticket ID
Returns:
Ticket dict with stats or None if not found
"""
ticket = execute_query_single(
"SELECT * FROM tticket_open_tickets WHERE id = %s",
(ticket_id,))
# If not in open_tickets view, fetch from main table
if not ticket:
ticket = execute_query_single(
"SELECT * FROM tticket_tickets WHERE id = %s",
(ticket_id,))
return ticket
@staticmethod
def list_tickets(
status: Optional[str] = None,
priority: Optional[str] = None,
customer_id: Optional[int] = None,
assigned_to_user_id: Optional[int] = None,
search: Optional[str] = None,
limit: int = 50,
offset: int = 0
) -> List[Dict[str, Any]]:
"""
List tickets with filters
Args:
status: Filter by status
priority: Filter by priority
customer_id: Filter by customer
assigned_to_user_id: Filter by assigned user
search: Search in subject/description
limit: Number of results
offset: Offset for pagination
Returns:
List of ticket dicts
"""
query = "SELECT * FROM tticket_tickets WHERE 1=1"
params = []
if status:
query += " AND status = %s"
params.append(status)
if priority:
query += " AND priority = %s"
params.append(priority)
if customer_id:
query += " AND customer_id = %s"
params.append(customer_id)
if assigned_to_user_id:
query += " AND assigned_to_user_id = %s"
params.append(assigned_to_user_id)
if search:
query += " AND (subject ILIKE %s OR description ILIKE %s)"
search_pattern = f"%{search}%"
params.extend([search_pattern, search_pattern])
query += " ORDER BY created_at DESC LIMIT %s OFFSET %s"
params.extend([limit, offset])
tickets = execute_query(query, tuple(params))
return tickets or []
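`ALLOWED_TRANSITIONS` above encodes a small state machine, and `validate_status_transition` is a pure function over it, so it can be exercised without a database. A standalone sketch; the exact string values of `TicketStatus` are an assumption here, since the enum definition itself is not shown in this diff:

```python
# Standalone sketch of the status state machine from TicketService above.
# TicketStatus string values are assumed, not confirmed by this diff.
from enum import Enum
from typing import Optional, Tuple

class TicketStatus(str, Enum):
    OPEN = "open"
    IN_PROGRESS = "in_progress"
    WAITING_CUSTOMER = "waiting_customer"
    WAITING_INTERNAL = "waiting_internal"
    RESOLVED = "resolved"
    CLOSED = "closed"

ALLOWED_TRANSITIONS = {
    TicketStatus.OPEN: [TicketStatus.IN_PROGRESS, TicketStatus.CLOSED],
    TicketStatus.IN_PROGRESS: [TicketStatus.WAITING_CUSTOMER,
                               TicketStatus.WAITING_INTERNAL,
                               TicketStatus.RESOLVED],
    TicketStatus.WAITING_CUSTOMER: [TicketStatus.IN_PROGRESS, TicketStatus.CLOSED],
    TicketStatus.WAITING_INTERNAL: [TicketStatus.IN_PROGRESS, TicketStatus.CLOSED],
    TicketStatus.RESOLVED: [TicketStatus.CLOSED, TicketStatus.IN_PROGRESS],  # reopen allowed
    TicketStatus.CLOSED: [],  # terminal state
}

def validate_status_transition(current: str, new: str) -> Tuple[bool, Optional[str]]:
    """Mirror of TicketService.validate_status_transition, minus DB access."""
    try:
        cur, nxt = TicketStatus(current), TicketStatus(new)
    except ValueError as e:
        return False, f"Invalid status: {e}"
    if cur == nxt:
        return True, None  # no-op transition is always OK
    if nxt not in ALLOWED_TRANSITIONS.get(cur, []):
        return False, f"Cannot transition from {cur.value} to {nxt.value}"
    return True, None

# A resolved ticket may be reopened, but closed is terminal.
assert validate_status_transition("resolved", "in_progress") == (True, None)
assert validate_status_transition("closed", "open")[0] is False
```

Keeping the transition table as data rather than branching logic means new statuses only require a dictionary entry, and the same table can be reused to render valid next-status options in the UI.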


@@ -1,6 +0,0 @@
"""
Ticket Module Frontend
======================
HTML templates and view handlers for the ticket system.
"""


@@ -1,283 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}Ticket Dashboard - BMC Hub{% endblock %}
{% block content %}
<div class="container-fluid py-4">
<!-- Header -->
<div class="row mb-4">
<div class="col">
<h1 class="h3 mb-0">🎫 Support Dashboard</h1>
<p class="text-muted">Oversigt over alle support tickets og aktivitet</p>
</div>
<div class="col-auto">
<button class="btn btn-primary" onclick="window.location.href='/ticket/tickets/new'">
<i class="bi bi-plus-circle"></i> Ny Ticket
</button>
</div>
</div>
<!-- Status Overview -->
<div class="row g-3 mb-4">
<div class="col-md-2">
<div class="card border-0 shadow-sm h-100" style="cursor: pointer;" onclick="filterByStatus('open')">
<div class="card-body text-center">
<div class="rounded-circle bg-info bg-opacity-10 p-3 d-inline-flex mb-3">
<i class="bi bi-inbox-fill text-info fs-4"></i>
</div>
<h2 class="mb-1 text-info">{{ stats.open_count or 0 }}</h2>
<p class="text-muted small mb-0">Åbne</p>
</div>
</div>
</div>
<div class="col-md-2">
<div class="card border-0 shadow-sm h-100" style="cursor: pointer;" onclick="filterByStatus('in_progress')">
<div class="card-body text-center">
<div class="rounded-circle bg-warning bg-opacity-10 p-3 d-inline-flex mb-3">
<i class="bi bi-hourglass-split text-warning fs-4"></i>
</div>
<h2 class="mb-1 text-warning">{{ stats.in_progress_count or 0 }}</h2>
<p class="text-muted small mb-0">I Gang</p>
</div>
</div>
</div>
<div class="col-md-2">
<div class="card border-0 shadow-sm h-100" style="cursor: pointer;" onclick="filterByStatus('pending_customer')">
<div class="card-body text-center">
<div class="rounded-circle bg-secondary bg-opacity-10 p-3 d-inline-flex mb-3">
<i class="bi bi-clock-fill text-secondary fs-4"></i>
</div>
<h2 class="mb-1 text-secondary">{{ stats.pending_count or 0 }}</h2>
<p class="text-muted small mb-0">Afventer</p>
</div>
</div>
</div>
<div class="col-md-2">
<div class="card border-0 shadow-sm h-100" style="cursor: pointer;" onclick="filterByStatus('resolved')">
<div class="card-body text-center">
<div class="rounded-circle bg-success bg-opacity-10 p-3 d-inline-flex mb-3">
<i class="bi bi-check-circle-fill text-success fs-4"></i>
</div>
<h2 class="mb-1 text-success">{{ stats.resolved_count or 0 }}</h2>
<p class="text-muted small mb-0">Løst</p>
</div>
</div>
</div>
<div class="col-md-2">
<div class="card border-0 shadow-sm h-100" style="cursor: pointer;" onclick="filterByStatus('closed')">
<div class="card-body text-center">
<div class="rounded-circle bg-dark bg-opacity-10 p-3 d-inline-flex mb-3">
<i class="bi bi-archive-fill text-dark fs-4"></i>
</div>
<h2 class="mb-1 text-dark">{{ stats.closed_count or 0 }}</h2>
<p class="text-muted small mb-0">Lukket</p>
</div>
</div>
</div>
<div class="col-md-2">
<div class="card border-0 shadow-sm h-100 bg-primary text-white">
<div class="card-body text-center">
<div class="rounded-circle bg-white bg-opacity-25 p-3 d-inline-flex mb-3">
<i class="bi bi-ticket-detailed-fill fs-4"></i>
</div>
<h2 class="mb-1">{{ stats.total_count or 0 }}</h2>
<p class="small mb-0 opacity-75">I Alt</p>
</div>
</div>
</div>
</div>
<!-- Worklog & Prepaid Overview -->
<div class="row g-4 mb-4">
<div class="col-md-6">
<div class="card border-0 shadow-sm">
<div class="card-header bg-white border-0 py-3">
<h5 class="mb-0">⏱️ Worklog Status</h5>
</div>
<div class="card-body">
<div class="row">
<div class="col-6 border-end">
<div class="text-center p-3">
<h3 class="text-warning mb-2">{{ worklog_stats.draft_count or 0 }}</h3>
<p class="text-muted small mb-1">Kladder</p>
<p class="mb-0"><strong>{{ "%.1f"|format(worklog_stats.draft_hours|float if worklog_stats.draft_hours else 0) }} timer</strong></p>
</div>
</div>
<div class="col-6">
<div class="text-center p-3">
<h3 class="text-success mb-2">{{ worklog_stats.billable_count or 0 }}</h3>
<p class="text-muted small mb-1">Fakturerbare</p>
<p class="mb-0"><strong>{{ "%.1f"|format(worklog_stats.billable_hours|float if worklog_stats.billable_hours else 0) }} timer</strong></p>
</div>
</div>
</div>
<div class="text-center pt-3 border-top">
<a href="/ticket/worklog/review" class="btn btn-outline-primary btn-sm">
<i class="bi bi-check-square"></i> Godkend Worklog
</a>
</div>
</div>
</div>
</div>
<div class="col-md-6">
<div class="card border-0 shadow-sm">
<div class="card-header bg-white border-0 py-3">
<h5 class="mb-0">💳 Prepaid Cards</h5>
</div>
<div class="card-body" id="prepaidStats">
<div class="text-center py-3">
<div class="spinner-border spinner-border-sm text-primary" role="status">
<span class="visually-hidden">Loading...</span>
</div>
</div>
</div>
</div>
</div>
</div>
<!-- Recent Tickets -->
<div class="card border-0 shadow-sm">
<div class="card-header bg-white border-0 py-3 d-flex justify-content-between align-items-center">
<h5 class="mb-0">📋 Seneste Tickets</h5>
<a href="/ticket/tickets" class="btn btn-sm btn-outline-secondary">
Se Alle <i class="bi bi-arrow-right"></i>
</a>
</div>
<div class="card-body p-0">
<div class="table-responsive">
<table class="table table-hover mb-0">
<thead class="table-light">
<tr>
<th>Ticket #</th>
<th>Emne</th>
<th>Kunde</th>
<th>Status</th>
<th>Prioritet</th>
<th>Oprettet</th>
<th></th>
</tr>
</thead>
<tbody>
{% if recent_tickets %}
{% for ticket in recent_tickets %}
<tr onclick="window.location.href='/ticket/tickets/{{ ticket.id }}'" style="cursor: pointer;">
<td><strong>{{ ticket.ticket_number }}</strong></td>
<td>{{ ticket.subject }}</td>
<td>{{ ticket.customer_name or '-' }}</td>
<td>
{% if ticket.status == 'open' %}
<span class="badge bg-info">Åben</span>
{% elif ticket.status == 'in_progress' %}
<span class="badge bg-warning">I Gang</span>
{% elif ticket.status == 'pending_customer' %}
<span class="badge bg-secondary">Afventer</span>
{% elif ticket.status == 'resolved' %}
<span class="badge bg-success">Løst</span>
{% elif ticket.status == 'closed' %}
<span class="badge bg-dark">Lukket</span>
{% else %}
<span class="badge bg-secondary">{{ ticket.status }}</span>
{% endif %}
</td>
<td>
{% if ticket.priority == 'urgent' %}
<span class="badge bg-danger">Akut</span>
{% elif ticket.priority == 'high' %}
<span class="badge bg-warning">Høj</span>
{% elif ticket.priority == 'normal' %}
<span class="badge bg-info">Normal</span>
{% else %}
<span class="badge bg-secondary">Lav</span>
{% endif %}
</td>
<td>{{ ticket.created_at.strftime('%d/%m/%Y %H:%M') if ticket.created_at else '-' }}</td>
<td>
<button class="btn btn-sm btn-outline-primary" onclick="event.stopPropagation(); window.location.href='/ticket/tickets/{{ ticket.id }}'">
<i class="bi bi-arrow-right"></i>
</button>
</td>
</tr>
{% endfor %}
{% else %}
<tr>
<td colspan="7" class="text-center text-muted py-5">
Ingen tickets endnu
</td>
</tr>
{% endif %}
</tbody>
</table>
</div>
</div>
</div>
</div>
<script>
// Load prepaid stats
document.addEventListener('DOMContentLoaded', () => {
loadPrepaidStats();
});
async function loadPrepaidStats() {
try {
const response = await fetch('/api/v1/prepaid-cards/stats/summary');
const stats = await response.json();
document.getElementById('prepaidStats').innerHTML = `
<div class="row text-center">
<div class="col-6 border-end">
<h4 class="text-success mb-1">${stats.active_count || 0}</h4>
<p class="text-muted small mb-0">Aktive Kort</p>
</div>
<div class="col-6">
<h4 class="text-primary mb-1">${parseFloat(stats.total_remaining_hours || 0).toFixed(1)} t</h4>
<p class="text-muted small mb-0">Timer Tilbage</p>
</div>
</div>
<div class="text-center pt-3 border-top">
<a href="/prepaid-cards" class="btn btn-outline-primary btn-sm">
<i class="bi bi-credit-card-2-front"></i> Se Alle Kort
</a>
</div>
`;
} catch (error) {
console.error('Error loading prepaid stats:', error);
document.getElementById('prepaidStats').innerHTML = `
<p class="text-center text-muted mb-0">Kunne ikke indlæse data</p>
`;
}
}
function filterByStatus(status) {
window.location.href = `/ticket/tickets?status=${status}`;
}
</script>
<style>
.card {
transition: transform 0.2s, box-shadow 0.2s;
}
.card:hover {
transform: translateY(-2px);
box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15) !important;
}
.table th {
font-weight: 600;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.5px;
color: var(--bs-secondary);
}
.table tbody tr {
transition: background-color 0.2s;
}
.table tbody tr:hover {
background-color: rgba(15, 76, 117, 0.05);
}
</style>
{% endblock %}


@@ -1,361 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}Ticket Dashboard - BMC Hub{% endblock %}
{% block extra_css %}
<style>
.stat-card {
text-align: center;
padding: 2rem 1.5rem;
cursor: pointer;
position: relative;
overflow: hidden;
}
.stat-card::before {
content: '';
position: absolute;
top: 0;
left: 0;
right: 0;
height: 4px;
background: var(--accent);
transform: scaleX(0);
transition: transform 0.3s;
}
.stat-card:hover::before {
transform: scaleX(1);
}
.stat-card h3 {
font-size: 3rem;
font-weight: 700;
color: var(--accent);
margin-bottom: 0.5rem;
line-height: 1;
}
.stat-card p {
color: var(--text-secondary);
font-size: 0.9rem;
margin: 0;
text-transform: uppercase;
letter-spacing: 0.5px;
}
.stat-card .icon {
font-size: 2rem;
opacity: 0.3;
margin-bottom: 1rem;
}
.stat-card.status-open h3 { color: #17a2b8; }
.stat-card.status-in-progress h3 { color: #ffc107; }
.stat-card.status-resolved h3 { color: #28a745; }
.stat-card.status-closed h3 { color: #6c757d; }
.ticket-list {
background: var(--bg-card);
}
.ticket-list th {
font-weight: 600;
color: var(--text-secondary);
border-bottom: 2px solid var(--accent-light);
padding: 1rem 0.75rem;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.5px;
}
.ticket-list td {
padding: 1rem 0.75rem;
vertical-align: middle;
border-bottom: 1px solid var(--accent-light);
}
.ticket-row {
transition: background-color 0.2s;
cursor: pointer;
}
.ticket-row:hover {
background-color: var(--accent-light);
}
.badge {
padding: 0.4rem 0.8rem;
font-weight: 500;
border-radius: 6px;
font-size: 0.75rem;
}
.badge-status-open {
background-color: #d1ecf1;
color: #0c5460;
}
.badge-status-in_progress {
background-color: #fff3cd;
color: #856404;
}
.badge-status-pending_customer {
background-color: #e2e3e5;
color: #383d41;
}
.badge-status-resolved {
background-color: #d4edda;
color: #155724;
}
.badge-status-closed {
background-color: #f8d7da;
color: #721c24;
}
.badge-priority-low {
background-color: var(--accent-light);
color: var(--accent);
}
.badge-priority-normal {
background-color: #e2e3e5;
color: #383d41;
}
.badge-priority-high {
background-color: #fff3cd;
color: #856404;
}
.badge-priority-urgent, .badge-priority-critical {
background-color: #f8d7da;
color: #721c24;
}
.ticket-number {
font-family: 'Monaco', 'Courier New', monospace;
background: var(--accent-light);
padding: 0.2rem 0.5rem;
border-radius: 4px;
font-size: 0.85rem;
color: var(--accent);
font-weight: 600;
}
.worklog-stats {
display: flex;
justify-content: space-around;
padding: 1.5rem;
background: linear-gradient(135deg, var(--accent-light) 0%, var(--bg-card) 100%);
border-radius: var(--border-radius);
margin-bottom: 2rem;
}
.worklog-stat {
text-align: center;
}
.worklog-stat h4 {
font-size: 2rem;
font-weight: 700;
color: var(--accent);
margin: 0;
}
.worklog-stat p {
color: var(--text-secondary);
margin: 0;
font-size: 0.85rem;
}
.empty-state {
text-align: center;
padding: 4rem 2rem;
color: var(--text-secondary);
}
.empty-state i {
font-size: 4rem;
margin-bottom: 1rem;
opacity: 0.3;
}
.section-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 1.5rem;
}
.section-header h2 {
font-size: 1.5rem;
font-weight: 600;
margin: 0;
}
.quick-actions {
display: flex;
gap: 1rem;
flex-wrap: wrap;
}
</style>
{% endblock %}
{% block content %}
<div class="container-fluid px-4">
<!-- Page Header -->
<div class="section-header">
<div>
<h1 class="mb-2">
<i class="bi bi-speedometer2"></i> Ticket Dashboard
</h1>
<p class="text-muted">Oversigt over alle tickets og worklog aktivitet</p>
</div>
<div class="quick-actions">
<a href="/ticket/tickets/new" class="btn btn-primary">
<i class="bi bi-plus-circle"></i> Ny Ticket
</a>
</div>
</div>
<!-- Ticket Statistics -->
<div class="row mb-4">
<div class="col-md-3">
<div class="card stat-card status-open" onclick="filterTickets('open')">
<div class="icon"><i class="bi bi-inbox"></i></div>
<h3>{{ stats.open_count or 0 }}</h3>
<p>Nye Tickets</p>
</div>
</div>
<div class="col-md-3">
<div class="card stat-card status-in-progress" onclick="filterTickets('in_progress')">
<div class="icon"><i class="bi bi-arrow-repeat"></i></div>
<h3>{{ stats.in_progress_count or 0 }}</h3>
<p>I Gang</p>
</div>
</div>
<div class="col-md-3">
<div class="card stat-card status-resolved" onclick="filterTickets('resolved')">
<div class="icon"><i class="bi bi-check-circle"></i></div>
<h3>{{ stats.resolved_count or 0 }}</h3>
<p>Løst</p>
</div>
</div>
<div class="col-md-3">
<div class="card stat-card status-closed" onclick="filterTickets('closed')">
<div class="icon"><i class="bi bi-archive"></i></div>
<h3>{{ stats.closed_count or 0 }}</h3>
<p>Lukket</p>
</div>
</div>
</div>
<!-- Worklog Statistics -->
<div class="worklog-stats">
<div class="worklog-stat">
<h4>{{ worklog_stats.draft_count or 0 }}</h4>
<p>Draft Worklog</p>
</div>
<div class="worklog-stat">
<h4>{{ "%.1f"|format(worklog_stats.draft_hours or 0) }}t</h4>
<p>Draft Timer</p>
</div>
<div class="worklog-stat">
<h4>{{ worklog_stats.billable_count or 0 }}</h4>
<p>Billable Entries</p>
</div>
<div class="worklog-stat">
<h4>{{ "%.1f"|format(worklog_stats.billable_hours or 0) }}t</h4>
<p>Billable Timer</p>
</div>
</div>
<!-- Recent Tickets -->
<div class="section-header">
<h2>
<i class="bi bi-clock-history"></i> Seneste Tickets
</h2>
<a href="/ticket/tickets" class="btn btn-outline-secondary">
<i class="bi bi-list-ul"></i> Se Alle
</a>
</div>
{% if recent_tickets %}
<div class="card">
<div class="table-responsive">
<table class="table ticket-list mb-0">
<thead>
<tr>
<th>Ticket</th>
<th>Kunde</th>
<th>Status</th>
<th>Prioritet</th>
<th>Oprettet</th>
</tr>
</thead>
<tbody>
{% for ticket in recent_tickets %}
<tr class="ticket-row" onclick="window.location='/ticket/tickets/{{ ticket.id }}'">
<td>
<span class="ticket-number">{{ ticket.ticket_number }}</span>
<br>
<strong>{{ ticket.subject }}</strong>
</td>
<td>
{% if ticket.customer_name %}
{{ ticket.customer_name }}
{% else %}
<span class="text-muted">-</span>
{% endif %}
</td>
<td>
<span class="badge badge-status-{{ ticket.status }}">
{{ ticket.status.replace('_', ' ').title() }}
</span>
</td>
<td>
<span class="badge badge-priority-{{ ticket.priority }}">
{{ ticket.priority.title() }}
</span>
</td>
<td>
{{ ticket.created_at.strftime('%d-%m-%Y %H:%M') if ticket.created_at else '-' }}
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
{% else %}
<div class="card">
<div class="empty-state">
<i class="bi bi-inbox"></i>
<h3>Ingen tickets endnu</h3>
<p>Opret din første ticket for at komme i gang</p>
<a href="/ticket/tickets/new" class="btn btn-primary mt-3">
<i class="bi bi-plus-circle"></i> Opret Ticket
</a>
</div>
</div>
{% endif %}
</div>
{% endblock %}
{% block extra_js %}
<script>
// Filter tickets by status
function filterTickets(status) {
window.location.href = `/ticket/tickets?status=${status}`;
}
// Auto-refresh every 5 minutes
setTimeout(() => {
location.reload();
}, 300000);
</script>
{% endblock %}


@@ -1,265 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}Concept 1: Split Screen - BMC Hub{% endblock %}
{% block extra_css %}
<style>
/* Hide default footer if exists to maximize height */
footer { display: none; }
.split-layout {
display: flex;
height: 100%;
}
/* Left Sidebar: Ticket List */
.ticket-list-sidebar {
width: 350px;
flex-shrink: 0;
border-right: 1px solid rgba(0,0,0,0.1);
background: var(--bg-card);
display: flex;
flex-direction: column;
}
.list-header {
padding: 1rem;
border-bottom: 1px solid rgba(0,0,0,0.05);
}
.ticket-list-scroll {
overflow-y: auto;
flex-grow: 1;
}
.ticket-item {
padding: 1rem;
border-bottom: 1px solid rgba(0,0,0,0.05);
cursor: pointer;
transition: background-color 0.2s;
position: relative;
}
.ticket-item:hover {
background-color: var(--accent-light);
}
.ticket-item.active {
background-color: var(--accent-light);
border-left: 4px solid var(--accent);
}
.ticket-item.unread::after {
content: '';
position: absolute;
top: 1rem;
right: 1rem;
width: 8px;
height: 8px;
background-color: var(--accent);
border-radius: 50%;
}
/* Right Content: Detail View */
.ticket-detail-main {
flex-grow: 1;
background: var(--bg-body);
display: flex;
flex-direction: column;
overflow: hidden;
}
.detail-header {
background: var(--bg-card);
padding: 1rem 1.5rem;
border-bottom: 1px solid rgba(0,0,0,0.1);
display: flex;
justify-content: space-between;
align-items: center;
}
.detail-scroll {
flex-grow: 1;
overflow-y: auto;
padding: 1.5rem;
}
.detail-footer {
background: var(--bg-card);
padding: 1rem 1.5rem;
border-top: 1px solid rgba(0,0,0,0.1);
}
/* Chat bubbles */
.message-bubble {
background: var(--bg-card);
padding: 1rem;
border-radius: 12px;
margin-bottom: 1rem;
box-shadow: 0 1px 3px rgba(0,0,0,0.05);
max-width: 85%;
}
.message-bubble.internal {
background: #fff3cd;
border: 1px solid #ffeeba;
}
.message-bubble.me {
background: var(--accent-light);
margin-left: auto;
}
.meta-text {
font-size: 0.75rem;
color: var(--text-secondary);
margin-bottom: 0.25rem;
}
</style>
{% endblock %}
{% block content_wrapper %}
<div class="container-fluid p-0" style="height: calc(100vh - 80px); overflow: hidden;">
<div class="split-layout">
<!-- Left Sidebar -->
<div class="ticket-list-sidebar">
<div class="list-header">
<div class="input-group">
<span class="input-group-text bg-transparent border-end-0"><i class="bi bi-search"></i></span>
<input type="text" class="form-control border-start-0" placeholder="Søg tickets...">
</div>
<div class="d-flex gap-2 mt-2 overflow-auto pb-1">
<span class="badge bg-primary rounded-pill">Alle</span>
<span class="badge bg-light text-dark border rounded-pill">Mine</span>
<span class="badge bg-light text-dark border rounded-pill">Uløste</span>
</div>
</div>
<div class="ticket-list-scroll">
<!-- Active Item -->
<div class="ticket-item active">
<div class="d-flex justify-content-between mb-1">
<span class="badge bg-warning text-dark">Høj</span>
<small class="text-muted">14:32</small>
</div>
<h6 class="mb-1 text-truncate">Netværksproblem i hovedkontoret</h6>
<div class="d-flex align-items-center gap-2">
<img src="https://ui-avatars.com/api/?name=Tech+Corp&background=random" class="rounded-circle" width="20">
<small class="text-muted">Tech Corp A/S</small>
</div>
</div>
<!-- Other Items -->
<div class="ticket-item unread">
<div class="d-flex justify-content-between mb-1">
<span class="badge bg-info text-white">Ny</span>
<small class="text-muted">12:15</small>
</div>
<h6 class="mb-1 text-truncate">Printer løbet tør for toner</h6>
<div class="d-flex align-items-center gap-2">
<img src="https://ui-avatars.com/api/?name=Advokat+Huset&background=random" class="rounded-circle" width="20">
<small class="text-muted">Advokathuset</small>
</div>
</div>
<div class="ticket-item">
<div class="d-flex justify-content-between mb-1">
<span class="badge bg-success">Løst</span>
<small class="text-muted">I går</small>
</div>
<h6 class="mb-1 text-truncate">Opsætning af ny bruger (Mette)</h6>
<div class="d-flex align-items-center gap-2">
<img src="https://ui-avatars.com/api/?name=Byg+Aps&background=random" class="rounded-circle" width="20">
<small class="text-muted">Byg ApS</small>
</div>
</div>
<!-- More dummy items -->
{% for i in range(5) %}
<div class="ticket-item">
<div class="d-flex justify-content-between mb-1">
<span class="badge bg-secondary">Venter</span>
<small class="text-muted">2 dage siden</small>
</div>
<h6 class="mb-1 text-truncate">Licens fornyelse Office 365</h6>
<div class="d-flex align-items-center gap-2">
<img src="https://ui-avatars.com/api/?name=Kunde+{{i}}&background=random" class="rounded-circle" width="20">
<small class="text-muted">Kunde {{i}}</small>
</div>
</div>
{% endfor %}
</div>
</div>
<!-- Right Content -->
<div class="ticket-detail-main">
<div class="detail-header">
<div>
<div class="d-flex align-items-center gap-2 mb-1">
<span class="badge bg-light text-dark border">#TKT-20251215-005</span>
<span class="badge bg-warning text-dark">I Gang</span>
</div>
<h5 class="mb-0">Netværksproblem i hovedkontoret</h5>
</div>
<div class="d-flex gap-2">
<button class="btn btn-outline-secondary btn-sm"><i class="bi bi-person-plus"></i> Tildel</button>
<button class="btn btn-outline-secondary btn-sm"><i class="bi bi-clock"></i> Log Tid</button>
<button class="btn btn-primary btn-sm"><i class="bi bi-check-lg"></i> Løs Sag</button>
</div>
</div>
<div class="detail-scroll">
<!-- Original Request -->
<div class="message-bubble">
<div class="d-flex justify-content-between align-items-center mb-2">
<div class="d-flex align-items-center gap-2">
<img src="https://ui-avatars.com/api/?name=Jens+Jensen&background=random" class="rounded-circle" width="24">
<span class="fw-bold">Jens Jensen</span>
<span class="text-muted small">via Email</span>
</div>
<small class="text-muted">15. dec 14:32</small>
</div>
<p class="mb-0">Hej Support,<br><br>Vi har problemer med internettet på hovedkontoret. Ingen kan komme på Wi-Fi, og kablet forbindelse virker heller ikke. Det haster meget!</p>
</div>
<!-- Internal Note -->
<div class="message-bubble internal">
<div class="d-flex justify-content-between align-items-center mb-2">
<div class="d-flex align-items-center gap-2">
<i class="bi bi-shield-lock text-warning"></i>
<span class="fw-bold text-warning-emphasis">Intern Note</span>
</div>
<small class="text-muted">15. dec 14:45</small>
</div>
<p class="mb-0">Jeg har tjekket Unifi controlleren. Switch #3 svarer ikke. Det er nok den der er nede.</p>
</div>
<!-- Reply -->
<div class="message-bubble me">
<div class="d-flex justify-content-between align-items-center mb-2">
<div class="d-flex align-items-center gap-2">
<img src="https://ui-avatars.com/api/?name=Christian&background=0f4c75&color=fff" class="rounded-circle" width="24">
<span class="fw-bold">Christian (Support)</span>
</div>
<small class="text-muted">15. dec 14:50</small>
</div>
<p class="mb-0">Hej Jens,<br><br>Jeg kigger på det med det samme. Jeg kan se vi har mistet forbindelsen til en af jeres switche. Jeg prøver at genstarte den remote.</p>
</div>
</div>
<div class="detail-footer">
<div class="input-group">
<button class="btn btn-outline-secondary" type="button"><i class="bi bi-paperclip"></i></button>
<button class="btn btn-outline-secondary dropdown-toggle" type="button" data-bs-toggle="dropdown">Svar Kunde</button>
<ul class="dropdown-menu">
<li><a class="dropdown-item" href="#">Svar Kunde</a></li>
<li><a class="dropdown-item" href="#"><i class="bi bi-shield-lock text-warning"></i> Intern Note</a></li>
</ul>
<input type="text" class="form-control" placeholder="Skriv et svar...">
<button class="btn btn-primary" type="button"><i class="bi bi-send"></i> Send</button>
</div>
</div>
</div>
</div>
</div>
{% endblock %}


@@ -1,251 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}Concept 2: Kanban Board - BMC Hub{% endblock %}
{% block extra_css %}
<style>
.kanban-board {
display: flex;
height: 100%;
padding: 1.5rem;
gap: 1.5rem;
overflow-x: auto;
}
.kanban-column {
width: 320px;
flex-shrink: 0;
display: flex;
flex-direction: column;
background: transparent;
height: 100%;
}
.column-header {
padding: 0.5rem 0.5rem 1rem 0.5rem;
display: flex;
justify-content: space-between;
align-items: center;
font-weight: 600;
color: var(--text-secondary);
text-transform: uppercase;
font-size: 0.85rem;
letter-spacing: 0.5px;
}
.column-content {
flex-grow: 1;
overflow-y: auto;
padding-right: 0.5rem; /* Space for scrollbar */
}
.kanban-card {
background: var(--bg-card);
border-radius: 8px;
padding: 1rem;
margin-bottom: 1rem;
box-shadow: 0 2px 4px rgba(0,0,0,0.05);
cursor: grab;
transition: transform 0.2s, box-shadow 0.2s;
border-left: 4px solid transparent;
}
.kanban-card:hover {
transform: translateY(-2px);
box-shadow: 0 4px 8px rgba(0,0,0,0.1);
}
.kanban-card:active {
cursor: grabbing;
}
/* Priority Borders */
.priority-high { border-left-color: var(--warning); }
.priority-critical { border-left-color: var(--danger); }
.priority-normal { border-left-color: var(--info); }
.priority-low { border-left-color: var(--text-secondary); }
.card-meta {
display: flex;
justify-content: space-between;
align-items: center;
margin-top: 0.75rem;
font-size: 0.8rem;
color: var(--text-secondary);
}
.card-tags {
display: flex;
gap: 0.25rem;
margin-top: 0.5rem;
}
.tag {
background: var(--bg-body);
padding: 0.1rem 0.4rem;
border-radius: 4px;
font-size: 0.7rem;
color: var(--text-secondary);
}
.add-card-btn {
background: transparent;
border: 2px dashed rgba(0,0,0,0.1);
color: var(--text-secondary);
width: 100%;
padding: 0.75rem;
border-radius: 8px;
margin-top: 0.5rem;
transition: all 0.2s;
}
.add-card-btn:hover {
border-color: var(--accent);
color: var(--accent);
background: rgba(15, 76, 117, 0.05);
}
</style>
{% endblock %}
{% block content_wrapper %}
<div class="container-fluid p-0" style="height: calc(100vh - 80px); overflow: hidden;">
<div class="kanban-board">
<!-- Column: Ny -->
<div class="kanban-column">
<div class="column-header">
<span><i class="bi bi-circle text-info me-2"></i>Ny</span>
<span class="badge bg-secondary bg-opacity-10 text-secondary">3</span>
</div>
<div class="column-content">
<div class="kanban-card priority-critical">
<div class="d-flex justify-content-between mb-2">
<small class="text-muted">#TKT-005</small>
<i class="bi bi-three-dots text-muted"></i>
</div>
<h6 class="mb-2">Server nede i produktion</h6>
<div class="card-tags">
<span class="tag">Server</span>
<span class="tag">Kritisk</span>
</div>
<div class="card-meta">
<div class="d-flex align-items-center gap-1">
<img src="https://ui-avatars.com/api/?name=Fabrik+A/S&background=random" class="rounded-circle" width="16">
<span>Fabrik A/S</span>
</div>
<span>10m</span>
</div>
</div>
<div class="kanban-card priority-normal">
<div class="d-flex justify-content-between mb-2">
<small class="text-muted">#TKT-008</small>
<i class="bi bi-three-dots text-muted"></i>
</div>
<h6 class="mb-2">Ny medarbejder oprettelse</h6>
<div class="card-meta">
<div class="d-flex align-items-center gap-1">
<img src="https://ui-avatars.com/api/?name=Kontor+ApS&background=random" class="rounded-circle" width="16">
<span>Kontor ApS</span>
</div>
<span>2t</span>
</div>
</div>
<button class="add-card-btn"><i class="bi bi-plus"></i> Tilføj kort</button>
</div>
</div>
<!-- Column: I Gang -->
<div class="kanban-column">
<div class="column-header">
<span><i class="bi bi-play-circle text-warning me-2"></i>I Gang</span>
<span class="badge bg-secondary bg-opacity-10 text-secondary">2</span>
</div>
<div class="column-content">
<div class="kanban-card priority-high">
<div class="d-flex justify-content-between mb-2">
<small class="text-muted">#TKT-002</small>
<img src="https://ui-avatars.com/api/?name=Christian&background=0f4c75&color=fff" class="rounded-circle" width="20" title="Assigned to Christian">
</div>
<h6 class="mb-2">Netværksproblem hovedkontor</h6>
<div class="card-tags">
<span class="tag">Netværk</span>
</div>
<div class="card-meta">
<div class="d-flex align-items-center gap-1">
<img src="https://ui-avatars.com/api/?name=Tech+Corp&background=random" class="rounded-circle" width="16">
<span>Tech Corp</span>
</div>
<span>1d</span>
</div>
</div>
<div class="kanban-card priority-normal">
<div class="d-flex justify-content-between mb-2">
<small class="text-muted">#TKT-004</small>
<img src="https://ui-avatars.com/api/?name=Morten&background=random" class="rounded-circle" width="20" title="Assigned to Morten">
</div>
<h6 class="mb-2">Opdatering af firewall</h6>
<div class="card-meta">
<div class="d-flex align-items-center gap-1">
<img src="https://ui-avatars.com/api/?name=Sikkerhed+A/S&background=random" class="rounded-circle" width="16">
<span>Sikkerhed A/S</span>
</div>
<span>3d</span>
</div>
</div>
</div>
</div>
<!-- Column: Venter på Kunde -->
<div class="kanban-column">
<div class="column-header">
<span><i class="bi bi-pause-circle text-secondary me-2"></i>Venter</span>
<span class="badge bg-secondary bg-opacity-10 text-secondary">4</span>
</div>
<div class="column-content">
{% for i in range(4) %}
<div class="kanban-card priority-low">
<div class="d-flex justify-content-between mb-2">
<small class="text-muted">#TKT-01{{i}}</small>
<i class="bi bi-three-dots text-muted"></i>
</div>
<h6 class="mb-2">Bestilling af hardware {{i}}</h6>
<div class="card-meta">
<div class="d-flex align-items-center gap-1">
<img src="https://ui-avatars.com/api/?name=Kunde+{{i}}&background=random" class="rounded-circle" width="16">
<span>Kunde {{i}}</span>
</div>
<span>5d</span>
</div>
</div>
{% endfor %}
</div>
</div>
<!-- Column: Løst -->
<div class="kanban-column">
<div class="column-header">
<span><i class="bi bi-check-circle text-success me-2"></i>Løst</span>
<span class="badge bg-secondary bg-opacity-10 text-secondary">12</span>
</div>
<div class="column-content">
<div class="kanban-card priority-normal opacity-75">
<div class="d-flex justify-content-between mb-2">
<small class="text-muted">#TKT-001</small>
<i class="bi bi-check-lg text-success"></i>
</div>
<h6 class="mb-2 text-decoration-line-through">Printer installation</h6>
<div class="card-meta">
<div class="d-flex align-items-center gap-1">
<img src="https://ui-avatars.com/api/?name=Advokat&background=random" class="rounded-circle" width="16">
<span>Advokat</span>
</div>
<span>1u</span>
</div>
</div>
</div>
</div>
</div>
</div>
{% endblock %}


@@ -1,239 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}Concept 3: Power Table - BMC Hub{% endblock %}
{% block extra_css %}
<style>
.power-table-container {
background: var(--bg-card);
border-radius: 8px;
box-shadow: 0 2px 15px rgba(0,0,0,0.05);
overflow: hidden;
}
.filter-bar {
padding: 1rem;
border-bottom: 1px solid rgba(0,0,0,0.1);
background: var(--bg-body);
display: flex;
gap: 1rem;
align-items: center;
flex-wrap: wrap;
}
.table-responsive {
overflow-x: auto;
}
.power-table {
width: 100%;
font-size: 0.85rem;
white-space: nowrap;
}
.power-table th {
background: var(--bg-body);
padding: 0.75rem 1rem;
font-weight: 600;
color: var(--text-secondary);
text-transform: uppercase;
font-size: 0.75rem;
letter-spacing: 0.5px;
border-bottom: 2px solid rgba(0,0,0,0.1);
}
.power-table td {
padding: 0.75rem 1rem;
border-bottom: 1px solid rgba(0,0,0,0.05);
vertical-align: middle;
}
.power-table tr:hover td {
background-color: var(--accent-light);
cursor: pointer;
}
.col-checkbox { width: 40px; text-align: center; }
.col-id { width: 120px; font-family: monospace; color: var(--accent); }
.col-status { width: 100px; }
.col-priority { width: 100px; }
.col-subject { min-width: 300px; font-weight: 500; }
.col-customer { width: 200px; }
.col-assigned { width: 150px; }
.col-updated { width: 150px; color: var(--text-secondary); }
.status-dot {
display: inline-block;
width: 8px;
height: 8px;
border-radius: 50%;
margin-right: 6px;
}
.status-open { background-color: var(--info); }
.status-in-progress { background-color: var(--warning); }
.status-resolved { background-color: var(--success); }
.status-closed { background-color: var(--text-secondary); }
.priority-badge {
padding: 2px 6px;
border-radius: 4px;
font-size: 0.7rem;
font-weight: 600;
text-transform: uppercase;
}
.priority-high { background: #fff3cd; color: #856404; }
.priority-critical { background: #f8d7da; color: #721c24; }
.priority-normal { background: #e2e3e5; color: #383d41; }
.avatar-sm {
width: 24px;
height: 24px;
border-radius: 50%;
vertical-align: middle;
margin-right: 6px;
}
.bulk-actions {
display: none;
align-items: center;
gap: 1rem;
background: var(--accent);
color: white;
padding: 0.5rem 1rem;
border-radius: 6px;
font-size: 0.9rem;
}
.bulk-actions.show {
display: flex;
}
</style>
{% endblock %}
{% block content %}
<div class="container-fluid px-4 py-4">
<div class="d-flex justify-content-between align-items-center mb-4">
<h2 class="h4 mb-0">Alle Tickets</h2>
<div class="d-flex gap-2">
<button class="btn btn-outline-secondary"><i class="bi bi-download"></i> Eksportér</button>
<button class="btn btn-primary"><i class="bi bi-plus-lg"></i> Ny Ticket</button>
</div>
</div>
<div class="power-table-container">
<!-- Filter Bar -->
<div class="filter-bar">
<div class="input-group" style="width: 300px;">
<span class="input-group-text bg-white border-end-0"><i class="bi bi-search"></i></span>
<input type="text" class="form-control border-start-0" placeholder="Søg...">
</div>
<select class="form-select" style="width: 150px;">
<option selected>Status: Alle</option>
<option>Ny</option>
<option>I Gang</option>
<option>Løst</option>
</select>
<select class="form-select" style="width: 150px;">
<option selected>Prioritet: Alle</option>
<option>Kritisk</option>
<option>Høj</option>
<option>Normal</option>
</select>
<select class="form-select" style="width: 150px;">
<option selected>Tildelt: Alle</option>
<option>Mig</option>
<option>Ufordelt</option>
</select>
<div class="ms-auto text-muted small">
Viser 1-50 af 142 tickets
</div>
</div>
<!-- Bulk Actions Bar (Hidden by default) -->
<div class="bg-light p-2 border-bottom d-none" id="bulkActions">
<div class="d-flex align-items-center gap-3 px-2">
<span class="fw-bold text-primary">3 valgt</span>
<div class="vr"></div>
<button class="btn btn-sm btn-outline-secondary">Sæt Status</button>
<button class="btn btn-sm btn-outline-secondary">Tildel</button>
<button class="btn btn-sm btn-outline-danger">Slet</button>
</div>
</div>
<!-- Table -->
<div class="table-responsive">
<table class="power-table">
<thead>
<tr>
<th class="col-checkbox"><input type="checkbox" class="form-check-input"></th>
<th class="col-id">ID</th>
<th class="col-subject">Emne</th>
<th class="col-customer">Kunde</th>
<th class="col-status">Status</th>
<th class="col-priority">Prioritet</th>
<th class="col-assigned">Tildelt</th>
<th class="col-updated">Opdateret</th>
</tr>
</thead>
<tbody>
<tr>
<td class="col-checkbox"><input type="checkbox" class="form-check-input"></td>
<td class="col-id">#TKT-005</td>
<td class="col-subject">Netværksproblem i hovedkontoret</td>
<td class="col-customer">
<img src="https://ui-avatars.com/api/?name=Tech+Corp&background=random" class="avatar-sm">
Tech Corp A/S
</td>
<td class="col-status"><span class="status-dot status-in-progress"></span>I Gang</td>
<td class="col-priority"><span class="priority-badge priority-high">Høj</span></td>
<td class="col-assigned">
<img src="https://ui-avatars.com/api/?name=Christian&background=0f4c75&color=fff" class="avatar-sm">
Christian
</td>
<td class="col-updated">10 min siden</td>
</tr>
<tr>
<td class="col-checkbox"><input type="checkbox" class="form-check-input"></td>
<td class="col-id">#TKT-004</td>
<td class="col-subject">Server nede i produktion</td>
<td class="col-customer">
<img src="https://ui-avatars.com/api/?name=Fabrik+A/S&background=random" class="avatar-sm">
Fabrik A/S
</td>
<td class="col-status"><span class="status-dot status-open"></span>Ny</td>
<td class="col-priority"><span class="priority-badge priority-critical">Kritisk</span></td>
<td class="col-assigned"><span class="text-muted fst-italic">Ufordelt</span></td>
<td class="col-updated">1 time siden</td>
</tr>
{% for i in range(10) %}
<tr>
<td class="col-checkbox"><input type="checkbox" class="form-check-input"></td>
<td class="col-id">#TKT-00{{i}}</td>
<td class="col-subject">Support sag vedrørende faktura {{i}}</td>
<td class="col-customer">
<img src="https://ui-avatars.com/api/?name=Kunde+{{i}}&background=random" class="avatar-sm">
Kunde {{i}} ApS
</td>
<td class="col-status"><span class="status-dot status-resolved"></span>Løst</td>
<td class="col-priority"><span class="priority-badge priority-normal">Normal</span></td>
<td class="col-assigned">
<img src="https://ui-avatars.com/api/?name=Morten&background=random" class="avatar-sm">
Morten
</td>
<td class="col-updated">{{i}} dage siden</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
</div>
{% endblock %}


@@ -1,877 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}{{ ticket.ticket_number }} - BMC Hub{% endblock %}
{% block extra_css %}
<style>
.ticket-header {
background: linear-gradient(135deg, var(--accent) 0%, var(--accent-light) 100%);
padding: 2rem;
border-radius: var(--border-radius);
color: white;
margin-bottom: 2rem;
}
.ticket-number {
font-family: 'Monaco', 'Courier New', monospace;
font-size: 1rem;
opacity: 0.9;
}
.ticket-title {
font-size: 2rem;
font-weight: 700;
margin: 0.5rem 0;
}
.badge {
padding: 0.4rem 0.8rem;
font-weight: 500;
border-radius: 6px;
font-size: 0.75rem;
}
.badge-status-open { background-color: #d1ecf1; color: #0c5460; }
.badge-status-in_progress { background-color: #fff3cd; color: #856404; }
.badge-status-pending_customer { background-color: #e2e3e5; color: #383d41; }
.badge-status-resolved { background-color: #d4edda; color: #155724; }
.badge-status-closed { background-color: #f8d7da; color: #721c24; }
.badge-priority-low { background-color: var(--accent-light); color: var(--accent); }
.badge-priority-normal { background-color: #e2e3e5; color: #383d41; }
.badge-priority-high { background-color: #fff3cd; color: #856404; }
.badge-priority-urgent, .badge-priority-critical { background-color: #f8d7da; color: #721c24; }
.info-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(200px, 1fr));
gap: 1.5rem;
margin-bottom: 2rem;
}
.info-item {
background: var(--bg-card);
padding: 1rem;
border-radius: var(--border-radius);
border: 1px solid var(--accent-light);
}
.info-item label {
display: block;
font-size: 0.75rem;
text-transform: uppercase;
letter-spacing: 0.5px;
color: var(--text-secondary);
margin-bottom: 0.5rem;
}
.info-item .value {
font-weight: 600;
color: var(--text-primary);
}
.comment {
padding: 1.5rem;
margin-bottom: 1rem;
background: var(--accent-light);
border-radius: var(--border-radius);
border-left: 4px solid var(--accent);
}
.comment.internal {
background: #fff3cd;
border-left-color: #ffc107;
}
.comment-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 0.75rem;
}
.comment-author {
font-weight: 600;
color: var(--accent);
}
.comment-date {
font-size: 0.85rem;
color: var(--text-secondary);
}
.worklog-table th {
font-weight: 600;
color: var(--text-secondary);
border-bottom: 2px solid var(--accent-light);
padding: 0.75rem;
font-size: 0.85rem;
}
.worklog-table td {
padding: 0.75rem;
border-bottom: 1px solid var(--accent-light);
}
.attachment {
display: inline-flex;
align-items: center;
padding: 0.5rem 1rem;
background: var(--accent-light);
border-radius: var(--border-radius);
margin-right: 0.5rem;
margin-bottom: 0.5rem;
text-decoration: none;
color: var(--accent);
transition: all 0.2s;
}
.attachment:hover {
background: var(--accent);
color: white;
}
.attachment i {
margin-right: 0.5rem;
}
.tags-container {
display: flex;
flex-wrap: wrap;
gap: 0.5rem;
margin-top: 1rem;
}
.tag-badge {
display: inline-flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 0.75rem;
border-radius: 8px;
font-size: 0.85rem;
font-weight: 500;
color: white;
transition: all 0.2s;
cursor: default;
}
.tag-badge:hover {
transform: translateY(-2px);
box-shadow: 0 4px 8px rgba(0,0,0,0.15);
}
.tag-badge .btn-close {
font-size: 0.6rem;
opacity: 0.7;
}
.tag-badge .btn-close:hover {
opacity: 1;
}
.add-tag-btn {
display: inline-flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 0.75rem;
border: 2px dashed var(--accent-light);
border-radius: 8px;
background: transparent;
color: var(--accent);
cursor: pointer;
transition: all 0.2s;
font-size: 0.85rem;
font-weight: 500;
}
.add-tag-btn:hover {
border-color: var(--accent);
background: var(--accent-light);
}
.section-title {
font-size: 1.25rem;
font-weight: 600;
margin-bottom: 1rem;
display: flex;
align-items: center;
gap: 0.5rem;
}
.empty-state {
text-align: center;
padding: 2rem;
color: var(--text-secondary);
}
.empty-state i {
font-size: 2rem;
margin-bottom: 0.5rem;
opacity: 0.3;
}
.action-buttons {
display: flex;
gap: 0.5rem;
flex-wrap: wrap;
}
.description-box {
background: var(--bg-body);
padding: 1.5rem;
border-radius: var(--border-radius);
white-space: pre-wrap;
line-height: 1.6;
}
.contact-item {
padding: 0.75rem;
margin-bottom: 0.5rem;
background: var(--accent-light);
border-radius: var(--border-radius);
display: flex;
justify-content: space-between;
align-items: center;
transition: all 0.2s;
}
.contact-item:hover {
transform: translateX(3px);
box-shadow: 0 2px 6px rgba(0,0,0,0.1);
}
.contact-info {
flex: 1;
}
.contact-name {
font-weight: 600;
color: var(--accent);
margin-bottom: 0.25rem;
}
.contact-details {
font-size: 0.85rem;
color: var(--text-secondary);
}
.contact-role-badge {
padding: 0.25rem 0.6rem;
border-radius: 4px;
font-size: 0.75rem;
font-weight: 500;
margin-right: 0.5rem;
}
/* Standard roller */
.role-primary { background: #28a745; color: white; }
.role-requester { background: #17a2b8; color: white; }
.role-assignee { background: #ffc107; color: #000; }
.role-cc { background: #6c757d; color: white; }
.role-observer { background: #e9ecef; color: #495057; }
/* Almindelige roller */
.role-ekstern_it { background: #6f42c1; color: white; }
.role-third_party { background: #fd7e14; color: white; }
.role-electrician { background: #20c997; color: white; }
.role-consultant { background: #0dcaf0; color: #000; }
.role-vendor { background: #dc3545; color: white; }
/* Default for custom roller */
.contact-role-badge:not([class*="role-primary"]):not([class*="role-requester"]):not([class*="role-assignee"]):not([class*="role-cc"]):not([class*="role-observer"]):not([class*="role-ekstern"]):not([class*="role-third"]):not([class*="role-electrician"]):not([class*="role-consultant"]):not([class*="role-vendor"]) {
background: var(--accent);
color: white;
}
</style>
{% endblock %}
{% block content %}
<div class="container-fluid px-4">
<!-- Ticket Header -->
<div class="ticket-header">
<div class="ticket-number">{{ ticket.ticket_number }}</div>
<div class="ticket-title">{{ ticket.subject }}</div>
<div class="mt-3">
<span class="badge badge-status-{{ ticket.status }}">
{{ ticket.status.replace('_', ' ').title() }}
</span>
<span class="badge badge-priority-{{ ticket.priority }}">
{{ ticket.priority.title() }} Priority
</span>
</div>
<div class="tags-container" id="ticketTags">
<!-- Tags loaded via JavaScript -->
</div>
<button class="add-tag-btn mt-2" onclick="showTagPicker('ticket', {{ ticket.id }}, reloadTags)">
<i class="bi bi-plus-circle"></i> Tilføj Tag (⌥⇧T)
</button>
</div>
<!-- Action Buttons -->
<div class="action-buttons mb-4">
<a href="/api/v1/ticket/tickets/{{ ticket.id }}" class="btn btn-outline-primary">
<i class="bi bi-pencil"></i> Rediger
</a>
<button class="btn btn-outline-secondary" onclick="addComment()">
<i class="bi bi-chat"></i> Tilføj Kommentar
</button>
<button class="btn btn-outline-secondary" onclick="addWorklog()">
<i class="bi bi-clock"></i> Log Tid
</button>
<a href="/ticket/tickets" class="btn btn-outline-secondary">
<i class="bi bi-arrow-left"></i> Tilbage
</a>
</div>
<div class="row">
<!-- Main Content -->
<div class="col-lg-8">
<!-- Description -->
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-file-text"></i> Beskrivelse
</div>
<div class="description-box">
{{ ticket.description or 'Ingen beskrivelse' }}
</div>
</div>
</div>
<!-- Comments -->
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-chat-dots"></i> Kommentarer ({{ comments|length }})
</div>
{% if comments %}
{% for comment in comments %}
<div class="comment {% if comment.internal_note %}internal{% endif %}">
<div class="comment-header">
<span class="comment-author">
<i class="bi bi-person-circle"></i>
{{ comment.user_name or 'System' }}
{% if comment.internal_note %}
<span class="badge bg-warning text-dark ms-2">Internal</span>
{% endif %}
</span>
<span class="comment-date">
{{ comment.created_at.strftime('%d-%m-%Y %H:%M') if comment.created_at else '-' }}
</span>
</div>
<div class="comment-text">
{{ comment.comment_text }}
</div>
</div>
{% endfor %}
{% else %}
<div class="empty-state">
<i class="bi bi-chat"></i>
<p>Ingen kommentarer endnu</p>
</div>
{% endif %}
</div>
</div>
<!-- Worklog -->
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-clock-history"></i> Worklog ({{ worklog|length }})
</div>
{% if worklog %}
<div class="table-responsive">
<table class="table worklog-table">
<thead>
<tr>
<th>Dato</th>
<th>Timer</th>
<th>Type</th>
<th>Beskrivelse</th>
<th>Status</th>
<th>Medarbejder</th>
</tr>
</thead>
<tbody>
{% for entry in worklog %}
<tr>
<td>{{ entry.work_date.strftime('%d-%m-%Y') if entry.work_date else '-' }}</td>
<td><strong>{{ "%.2f"|format(entry.hours) }}t</strong></td>
<td>{{ entry.work_type }}</td>
<td>{{ entry.description or '-' }}</td>
<td>
<span class="badge {% if entry.status == 'billable' %}bg-success{% elif entry.status == 'draft' %}bg-warning{% else %}bg-secondary{% endif %}">
{{ entry.status }}
</span>
</td>
<td>{{ entry.user_name or '-' }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="empty-state">
<i class="bi bi-clock"></i>
<p>Ingen worklog entries endnu</p>
</div>
{% endif %}
</div>
</div>
<!-- Attachments -->
{% if attachments %}
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-paperclip"></i> Vedhæftninger ({{ attachments|length }})
</div>
{% for attachment in attachments %}
<a href="/api/v1/attachments/{{ attachment.id }}/download" class="attachment">
<i class="bi bi-file-earmark"></i>
{{ attachment.filename }}
<small class="ms-2">({{ (attachment.file_size / 1024)|round(1) }} KB)</small>
</a>
{% endfor %}
</div>
</div>
{% endif %}
</div>
<!-- Sidebar -->
<div class="col-lg-4">
<!-- Ticket Info -->
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-info-circle"></i> Ticket Information
</div>
<div class="info-item mb-3">
<label>Kunde</label>
<div class="value">
{% if ticket.customer_name %}
<a href="/customers/{{ ticket.customer_id }}" style="text-decoration: none; color: var(--accent);">
{{ ticket.customer_name }}
</a>
{% else %}
<span class="text-muted">Ikke angivet</span>
{% endif %}
</div>
</div>
<div class="info-item mb-3">
<label>Tildelt til</label>
<div class="value">
{{ ticket.assigned_to_name or 'Ikke tildelt' }}
</div>
</div>
<div class="info-item mb-3">
<label>Oprettet</label>
<div class="value">
{{ ticket.created_at.strftime('%d-%m-%Y %H:%M') if ticket.created_at else '-' }}
</div>
</div>
</div>
</div>
<!-- Contacts -->
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-people"></i> Kontakter
<button class="btn btn-sm btn-outline-primary ms-auto" onclick="showAddContactModal()">
<i class="bi bi-plus-circle"></i> Tilføj
</button>
</div>
<div id="contactsList">
<div class="text-center text-muted py-2">
<i class="bi bi-hourglass-split"></i> Indlæser...
</div>
</div>
</div>
</div>
<!-- Ticket Info, continued -->
<div class="card">
<div class="card-body">
<div class="info-item mb-3">
<label>Senest opdateret</label>
<div class="value">
{{ ticket.updated_at.strftime('%d-%m-%Y %H:%M') if ticket.updated_at else '-' }}
</div>
</div>
{% if ticket.resolved_at %}
<div class="info-item mb-3">
<label>Løst</label>
<div class="value">
{{ ticket.resolved_at.strftime('%d-%m-%Y %H:%M') }}
</div>
</div>
{% endif %}
{% if ticket.first_response_at %}
<div class="info-item mb-3">
<label>Første svar</label>
<div class="value">
{{ ticket.first_response_at.strftime('%d-%m-%Y %H:%M') }}
</div>
</div>
{% endif %}
</div>
</div>
<!-- Tags -->
{% if ticket.tags %}
<div class="card">
<div class="card-body">
<div class="section-title">
<i class="bi bi-tags"></i> Tags
</div>
{% for tag in ticket.tags %}
<span class="badge bg-secondary me-1 mb-1">#{{ tag }}</span>
{% endfor %}
</div>
</div>
{% endif %}
</div>
</div>
</div>
{% endblock %}
{% block extra_js %}
<script>
// Add comment (placeholder - integrate with API)
function addComment() {
alert('Add comment functionality - integrate with POST /api/v1/ticket/tickets/{{ ticket.id }}/comments');
}
// Add worklog (placeholder - integrate with API)
function addWorklog() {
alert('Add worklog functionality - integrate with POST /api/v1/ticket/tickets/{{ ticket.id }}/worklog');
}
// Load and render ticket tags
async function loadTicketTags() {
try {
const response = await fetch('/api/v1/tags/entity/ticket/{{ ticket.id }}');
if (!response.ok) return;
const tags = await response.json();
const container = document.getElementById('ticketTags');
if (tags.length === 0) {
container.innerHTML = '<small class="text-muted"><i class="bi bi-tags"></i> Ingen tags endnu</small>';
return;
}
container.innerHTML = tags.map(tag => `
<span class="tag-badge" style="background-color: ${tag.color};">
${tag.icon ? `<i class="bi ${tag.icon}"></i>` : ''}
${tag.name}
<button type="button" class="btn-close btn-close-white btn-sm"
onclick="removeTag(${tag.id}, '${tag.name}')"
aria-label="Fjern"></button>
</span>
`).join('');
} catch (error) {
console.error('Error loading tags:', error);
}
}
async function removeTag(tagId, tagName) {
if (!confirm(`Fjern tag "${tagName}"?`)) return;
try {
const response = await fetch(`/api/v1/tags/entity?entity_type=ticket&entity_id={{ ticket.id }}&tag_id=${tagId}`, {
method: 'DELETE'
});
if (!response.ok) throw new Error('Failed to remove tag');
await loadTicketTags();
} catch (error) {
console.error('Error removing tag:', error);
alert('Fejl ved fjernelse af tag');
}
}
function reloadTags() {
loadTicketTags();
}
// ============================================
// CONTACTS MANAGEMENT
// ============================================
async function loadContacts() {
try {
const response = await fetch('/api/v1/ticket/tickets/{{ ticket.id }}/contacts');
if (!response.ok) throw new Error('Failed to load contacts');
const data = await response.json();
const container = document.getElementById('contactsList');
if (!data.contacts || data.contacts.length === 0) {
container.innerHTML = `
<div class="empty-state py-2">
<i class="bi bi-person-x"></i>
<p class="mb-0 small">Ingen kontakter tilføjet endnu</p>
</div>
`;
return;
}
container.innerHTML = data.contacts.map(contact => `
<div class="contact-item">
<div class="contact-info">
<div class="contact-name">
<span class="contact-role-badge role-${contact.role}">
${getRoleLabel(contact.role)}
</span>
${contact.first_name} ${contact.last_name}
</div>
<div class="contact-details">
${contact.email ? `<i class="bi bi-envelope"></i> ${contact.email}` : ''}
${contact.phone ? `<i class="bi bi-telephone ms-2"></i> ${contact.phone}` : ''}
</div>
${contact.notes ? `<div class="contact-details mt-1"><i class="bi bi-sticky"></i> ${contact.notes}</div>` : ''}
</div>
<div class="btn-group btn-group-sm">
<button class="btn btn-outline-primary" onclick="editContactRole(${contact.contact_id}, '${contact.role}', '${contact.notes || ''}')">
<i class="bi bi-pencil"></i>
</button>
<button class="btn btn-outline-danger" onclick="removeContact(${contact.contact_id}, '${contact.first_name} ${contact.last_name}')">
<i class="bi bi-trash"></i>
</button>
</div>
</div>
`).join('');
} catch (error) {
console.error('Error loading contacts:', error);
document.getElementById('contactsList').innerHTML = `
<div class="alert alert-warning mb-0">
<i class="bi bi-exclamation-triangle"></i> Kunne ikke indlæse kontakter
</div>
`;
}
}
function getRoleLabel(role) {
const labels = {
'primary': '⭐ Primær',
'requester': '📝 Anmoder',
'assignee': '👤 Ansvarlig',
'cc': '📧 CC',
'observer': '👁 Observer',
'ekstern_it': '💻 Ekstern IT',
'third_party': '🤝 3. part',
'electrician': '⚡ Elektriker',
'consultant': '🎓 Konsulent',
'vendor': '🏢 Leverandør'
};
return labels[role] || ('📌 ' + role.replace(/_/g, ' ').replace(/\b\w/g, l => l.toUpperCase()));
}
async function showAddContactModal() {
// Fetch all contacts for selection
try {
const response = await fetch('/api/v1/contacts?limit=1000');
if (!response.ok) throw new Error('Failed to load contacts');
const data = await response.json();
const contacts = data.contacts || [];
// Check if this ticket has any contacts yet
const ticketContactsResp = await fetch('/api/v1/ticket/tickets/{{ ticket.id }}/contacts');
const ticketContacts = await ticketContactsResp.json();
const isFirstContact = !ticketContacts.contacts || ticketContacts.contacts.length === 0;
// Create modal content
const modalHtml = `
<div class="modal fade" id="addContactModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title"><i class="bi bi-person-plus"></i> Tilføj Kontakt</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div class="mb-3">
<label class="form-label">Kontakt *</label>
<select class="form-select" id="contactSelect" required>
<option value="">Vælg kontakt...</option>
${contacts.map(c => `
<option value="${c.id}">${c.first_name} ${c.last_name} ${c.email ? '(' + c.email + ')' : ''}</option>
`).join('')}
</select>
</div>
<div class="mb-3">
<label class="form-label">Rolle *</label>
<div id="firstContactNotice" class="alert alert-info mb-2" style="display: none;">
<i class="bi bi-star"></i> <strong>Første kontakt</strong> - Rollen sættes automatisk til "Primær kontakt"
</div>
<select class="form-select" id="roleSelect" onchange="toggleCustomRole()" required>
<optgroup label="Standard roller">
<option value="primary">⭐ Primær kontakt</option>
<option value="requester">📝 Anmoder</option>
<option value="assignee">👤 Ansvarlig</option>
<option value="cc">📧 CC (Carbon Copy)</option>
<option value="observer" selected>👁 Observer</option>
</optgroup>
<optgroup label="Almindelige roller">
<option value="ekstern_it">💻 Ekstern IT</option>
<option value="third_party">🤝 3. part leverandør</option>
<option value="electrician">⚡ Elektriker</option>
<option value="consultant">🎓 Konsulent</option>
<option value="vendor">🏢 Leverandør</option>
</optgroup>
<optgroup label="Custom">
<option value="_custom">✏️ Indtast custom rolle...</option>
</optgroup>
</select>
</div>
<div class="mb-3" id="customRoleDiv" style="display: none;">
<label class="form-label">Custom Rolle</label>
<input type="text" class="form-control" id="customRoleInput"
placeholder="f.eks. 'bygningsingeniør' eller 'projektleder'">
<small class="text-muted">Brug lowercase og underscore i stedet for mellemrum (f.eks. bygnings_ingeniør)</small>
</div>
<div class="mb-3">
<label class="form-label">Noter (valgfri)</label>
<textarea class="form-control" id="contactNotes" rows="2" placeholder="Evt. noter om kontaktens rolle..."></textarea>
</div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Annuller</button>
<button type="button" class="btn btn-primary" onclick="addContact()">
<i class="bi bi-plus-circle"></i> Tilføj
</button>
</div>
</div>
</div>
</div>
`;
// Remove old modal if exists
const oldModal = document.getElementById('addContactModal');
if (oldModal) oldModal.remove();
// Append and show new modal
document.body.insertAdjacentHTML('beforeend', modalHtml);
const modal = new bootstrap.Modal(document.getElementById('addContactModal'));
// Show notice and disable role selector if first contact
if (isFirstContact) {
document.getElementById('firstContactNotice').style.display = 'block';
document.getElementById('roleSelect').value = 'primary';
document.getElementById('roleSelect').disabled = true;
}
modal.show();
} catch (error) {
console.error('Error:', error);
alert('Fejl ved indlæsning af kontakter');
}
}
function toggleCustomRole() {
const roleSelect = document.getElementById('roleSelect');
const customDiv = document.getElementById('customRoleDiv');
const customInput = document.getElementById('customRoleInput');
if (roleSelect.value === '_custom') {
customDiv.style.display = 'block';
customInput.required = true;
} else {
customDiv.style.display = 'none';
customInput.required = false;
}
}
async function addContact() {
const contactId = document.getElementById('contactSelect').value;
let role = document.getElementById('roleSelect').value;
const notes = document.getElementById('contactNotes').value;
if (!contactId) {
alert('Vælg venligst en kontakt');
return;
}
// If custom role selected, use the custom input
if (role === '_custom') {
const customRole = document.getElementById('customRoleInput').value.trim();
if (!customRole) {
alert('Indtast venligst en custom rolle');
return;
}
role = customRole.toLowerCase().replace(/\s+/g, '_').replace(/-/g, '_');
}
try {
const url = `/api/v1/ticket/tickets/{{ ticket.id }}/contacts?contact_id=${contactId}&role=${role}${notes ? '&notes=' + encodeURIComponent(notes) : ''}`;
const response = await fetch(url, { method: 'POST' });
if (!response.ok) {
const error = await response.json();
throw new Error(error.detail || 'Failed to add contact');
}
// Close modal and reload
bootstrap.Modal.getInstance(document.getElementById('addContactModal')).hide();
await loadContacts();
} catch (error) {
console.error('Error adding contact:', error);
alert('Fejl: ' + error.message);
}
}
async function editContactRole(contactId, currentRole, currentNotes) {
const newRole = prompt(`Ændr rolle (primary, requester, assignee, cc, observer, eller en custom rolle):`, currentRole);
if (!newRole || newRole === currentRole) return;
const notes = prompt(`Noter (valgfri):`, currentNotes);
try {
const url = `/api/v1/ticket/tickets/{{ ticket.id }}/contacts/${contactId}?role=${newRole}${notes ? '&notes=' + encodeURIComponent(notes) : ''}`;
const response = await fetch(url, { method: 'PUT' });
if (!response.ok) throw new Error('Failed to update contact');
await loadContacts();
} catch (error) {
console.error('Error updating contact:', error);
alert('Fejl ved opdatering af kontakt');
}
}
async function removeContact(contactId, contactName) {
if (!confirm(`Fjern ${contactName} fra ticket?`)) return;
try {
const response = await fetch(`/api/v1/ticket/tickets/{{ ticket.id }}/contacts/${contactId}`, {
method: 'DELETE'
});
if (!response.ok) throw new Error('Failed to remove contact');
await loadContacts();
} catch (error) {
console.error('Error removing contact:', error);
alert('Fejl ved fjernelse af kontakt');
}
}
// Load tags and contacts on page load
document.addEventListener('DOMContentLoaded', () => {
loadTicketTags();
loadContacts();
});
// Override global tag picker to auto-reload after adding
if (window.tagPicker) {
const originalShow = window.tagPicker.show.bind(window.tagPicker);
window.showTagPicker = function(entityType, entityId, onSelect) {
window.tagPicker.show(entityType, entityId, () => {
loadTicketTags();
if (onSelect) onSelect();
});
};
}
</script>
{% endblock %}
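The custom-role normalization in `addContact()` above (trim, lowercase, whitespace and hyphens to underscores) is what produces the role keys sent to the backend. A minimal Python sketch of the same rule, assuming the backend applies no further normalization of its own:

```python
import re

def normalize_role(custom_role: str) -> str:
    """Mirror the JS: trim, lowercase, whitespace runs and hyphens -> underscores."""
    slug = custom_role.strip().lower()
    slug = re.sub(r"\s+", "_", slug)  # whitespace runs -> single underscore
    return slug.replace("-", "_")     # hyphens -> underscores

print(normalize_role("third-party vendor"))  # third_party_vendor
```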


@@ -1,272 +0,0 @@
{% extends "shared/frontend/base.html" %}
{% block title %}Alle Tickets - BMC Hub{% endblock %}
{% block extra_css %}
<style>
.filter-bar {
background: var(--bg-card);
padding: 1.5rem;
border-radius: var(--border-radius);
margin-bottom: 1.5rem;
box-shadow: 0 2px 15px rgba(0,0,0,0.05);
}
.ticket-table th {
font-weight: 600;
color: var(--text-secondary);
border-bottom: 2px solid var(--accent-light);
padding: 1rem 0.75rem;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.5px;
}
.ticket-table td {
padding: 1rem 0.75rem;
vertical-align: middle;
border-bottom: 1px solid var(--accent-light);
}
.ticket-row {
transition: background-color 0.2s;
cursor: pointer;
}
.ticket-row:hover {
background-color: var(--accent-light);
}
.badge {
padding: 0.4rem 0.8rem;
font-weight: 500;
border-radius: 6px;
font-size: 0.75rem;
}
.badge-status-open { background-color: #d1ecf1; color: #0c5460; }
.badge-status-in_progress { background-color: #fff3cd; color: #856404; }
.badge-status-pending_customer { background-color: #e2e3e5; color: #383d41; }
.badge-status-resolved { background-color: #d4edda; color: #155724; }
.badge-status-closed { background-color: #f8d7da; color: #721c24; }
.badge-priority-low { background-color: var(--accent-light); color: var(--accent); }
.badge-priority-normal { background-color: #e2e3e5; color: #383d41; }
.badge-priority-high { background-color: #fff3cd; color: #856404; }
.badge-priority-urgent, .badge-priority-critical { background-color: #f8d7da; color: #721c24; }
.ticket-number {
font-family: 'Monaco', 'Courier New', monospace;
background: var(--accent-light);
padding: 0.2rem 0.5rem;
border-radius: 4px;
font-size: 0.85rem;
color: var(--accent);
font-weight: 600;
}
.search-box {
position: relative;
}
.search-box input {
padding-left: 2.5rem;
background: var(--bg-body);
border: 1px solid var(--accent-light);
color: var(--text-primary);
}
.search-box i {
position: absolute;
left: 1rem;
top: 50%;
transform: translateY(-50%);
color: var(--text-secondary);
}
.empty-state {
text-align: center;
padding: 4rem 2rem;
color: var(--text-secondary);
}
.empty-state i {
font-size: 4rem;
margin-bottom: 1rem;
opacity: 0.3;
}
.meta-info {
font-size: 0.85rem;
color: var(--text-secondary);
}
.meta-info i {
margin-right: 0.25rem;
}
</style>
{% endblock %}
{% block content %}
<div class="container-fluid px-4">
<!-- Page Header -->
<div class="d-flex justify-content-between align-items-center mb-4">
<div>
<h1 class="mb-2">
<i class="bi bi-ticket-detailed"></i> Alle Tickets
</h1>
<p class="text-muted">Oversigt over alle tickets i systemet</p>
</div>
<a href="/api/v1/ticket/tickets" class="btn btn-primary">
<i class="bi bi-plus-circle"></i> Ny Ticket
</a>
</div>
<!-- Filter Bar -->
<div class="filter-bar">
<form method="get" action="/ticket/tickets" class="row g-3">
<div class="col-md-3">
<label for="status" class="form-label">Status:</label>
<select name="status" id="status" class="form-select" onchange="this.form.submit()">
<option value="">Alle</option>
<option value="open" {% if selected_status == 'open' %}selected{% endif %}>Open</option>
<option value="in_progress" {% if selected_status == 'in_progress' %}selected{% endif %}>In Progress</option>
<option value="pending_customer" {% if selected_status == 'pending_customer' %}selected{% endif %}>Pending Customer</option>
<option value="resolved" {% if selected_status == 'resolved' %}selected{% endif %}>Resolved</option>
<option value="closed" {% if selected_status == 'closed' %}selected{% endif %}>Closed</option>
</select>
</div>
<div class="col-md-3">
<label for="priority" class="form-label">Prioritet:</label>
<select name="priority" id="priority" class="form-select" onchange="this.form.submit()">
<option value="">Alle</option>
<option value="low" {% if selected_priority == 'low' %}selected{% endif %}>Low</option>
<option value="normal" {% if selected_priority == 'normal' %}selected{% endif %}>Normal</option>
<option value="high" {% if selected_priority == 'high' %}selected{% endif %}>High</option>
<option value="urgent" {% if selected_priority == 'urgent' %}selected{% endif %}>Urgent</option>
<option value="critical" {% if selected_priority == 'critical' %}selected{% endif %}>Critical</option>
</select>
</div>
<div class="col-md-3">
<label for="customer_id" class="form-label">Kunde:</label>
<select name="customer_id" id="customer_id" class="form-select" onchange="this.form.submit()">
<option value="">Alle kunder</option>
{% for customer in customers %}
<option value="{{ customer.id }}" {% if customer.id == selected_customer_id %}selected{% endif %}>
{{ customer.name }}
</option>
{% endfor %}
</select>
</div>
<div class="col-md-3">
<label for="search" class="form-label">Søg:</label>
<div class="search-box">
<i class="bi bi-search"></i>
<input
type="text"
name="search"
id="search"
class="form-control"
placeholder="Ticket nummer eller emne..."
value="{{ search_query or '' }}">
</div>
</div>
</form>
</div>
<!-- Tickets Table -->
{% if tickets %}
<div class="card">
<div class="table-responsive">
<table class="table ticket-table mb-0">
<thead>
<tr>
<th>Ticket</th>
<th>Kunde</th>
<th>Status</th>
<th>Prioritet</th>
<th>Tildelt</th>
<th>Aktivitet</th>
<th>Oprettet</th>
</tr>
</thead>
<tbody>
{% for ticket in tickets %}
<tr class="ticket-row" onclick="window.location='/ticket/tickets/{{ ticket.id }}'">
<td>
<span class="ticket-number">{{ ticket.ticket_number }}</span>
<br>
<strong>{{ ticket.subject }}</strong>
</td>
<td>
{% if ticket.customer_name %}
{{ ticket.customer_name }}
{% else %}
<span class="text-muted">-</span>
{% endif %}
</td>
<td>
<span class="badge badge-status-{{ ticket.status }}">
{{ ticket.status.replace('_', ' ').title() }}
</span>
</td>
<td>
<span class="badge badge-priority-{{ ticket.priority }}">
{{ ticket.priority.title() }}
</span>
</td>
<td>
{% if ticket.assigned_to_name %}
{{ ticket.assigned_to_name }}
{% else %}
<span class="text-muted">Ikke tildelt</span>
{% endif %}
</td>
<td>
<div class="meta-info">
<i class="bi bi-chat"></i> {{ ticket.comment_count }}
<i class="bi bi-clock ms-2"></i> {{ ticket.worklog_count }}
</div>
</td>
<td>
{{ ticket.created_at.strftime('%d-%m-%Y') if ticket.created_at else '-' }}
<br>
<small class="text-muted">{{ ticket.created_at.strftime('%H:%M') if ticket.created_at else '' }}</small>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
<div class="text-muted mt-3">
<small>Viser {{ tickets|length }} tickets</small>
</div>
{% else %}
<div class="card">
<div class="empty-state">
<i class="bi bi-inbox"></i>
<h3>Ingen tickets fundet</h3>
<p>Prøv at justere dine filtre eller opret en ny ticket</p>
{% if selected_status or selected_priority or selected_customer_id or search_query %}
<a href="/ticket/tickets" class="btn btn-outline-secondary mt-3">
<i class="bi bi-x-circle"></i> Ryd filtre
</a>
{% endif %}
</div>
</div>
{% endif %}
</div>
{% endblock %}
{% block extra_js %}
<script>
// Submit search on Enter
document.getElementById('search').addEventListener('keypress', function(e) {
if (e.key === 'Enter') {
e.preventDefault();
this.form.submit();
}
});
</script>
{% endblock %}

File diff suppressed because it is too large


@@ -1,461 +0,0 @@
"""
Ticket System Frontend Views
HTML template routes for ticket management UI
"""
import logging
from fastapi import APIRouter, Request, HTTPException, Form
from fastapi.responses import HTMLResponse, RedirectResponse
from fastapi.templating import Jinja2Templates
from typing import Optional
from datetime import date
from app.core.database import execute_query, execute_update, execute_query_single
logger = logging.getLogger(__name__)
router = APIRouter()
templates = Jinja2Templates(directory="app")
# ============================================================================
# MOCKUP ROUTES (TEMPORARY)
# ============================================================================
@router.get("/mockups/1", response_class=HTMLResponse)
async def mockup_option1(request: Request):
"""Mockup: Split Screen Concept"""
return templates.TemplateResponse("ticket/frontend/mockups/option1_splitscreen.html", {"request": request})
@router.get("/mockups/2", response_class=HTMLResponse)
async def mockup_option2(request: Request):
"""Mockup: Kanban Board Concept"""
return templates.TemplateResponse("ticket/frontend/mockups/option2_kanban.html", {"request": request})
@router.get("/mockups/3", response_class=HTMLResponse)
async def mockup_option3(request: Request):
"""Mockup: Power Table Concept"""
return templates.TemplateResponse("ticket/frontend/mockups/option3_powertable.html", {"request": request})
@router.get("/worklog/review", response_class=HTMLResponse)
async def worklog_review_page(
request: Request,
customer_id: Optional[int] = None,
status: str = "draft"
):
"""
Worklog review page with single-entry approval
Query params:
customer_id: Filter by customer (optional)
status: Filter by status (default: draft)
"""
try:
# Build query with filters
query = """
SELECT
w.id,
w.ticket_id,
w.user_id,
w.work_date,
w.hours,
w.work_type,
w.description,
w.billing_method,
w.status,
w.prepaid_card_id,
w.created_at,
t.ticket_number,
t.subject AS ticket_subject,
t.customer_id,
t.status AS ticket_status,
c.name AS customer_name,
u.username AS user_name,
pc.card_number,
pc.remaining_hours AS card_remaining_hours
FROM tticket_worklog w
INNER JOIN tticket_tickets t ON t.id = w.ticket_id
LEFT JOIN customers c ON c.id = t.customer_id
LEFT JOIN users u ON u.user_id = w.user_id
LEFT JOIN tticket_prepaid_cards pc ON pc.id = w.prepaid_card_id
WHERE w.status = %s
"""
params = [status]
if customer_id:
query += " AND t.customer_id = %s"
params.append(customer_id)
query += " ORDER BY w.work_date DESC, w.created_at DESC"
worklogs = execute_query(query, tuple(params))
# Get customer list for filter dropdown
customers_query = """
SELECT DISTINCT c.id, c.name
FROM customers c
INNER JOIN tticket_tickets t ON t.customer_id = c.id
INNER JOIN tticket_worklog w ON w.ticket_id = t.id
WHERE w.status = %s
ORDER BY c.name
"""
customers = execute_query(customers_query, (status,))
# Calculate totals
total_hours = sum(float(w['hours']) for w in worklogs)
total_billable = sum(
float(w['hours'])
for w in worklogs
if w['billing_method'] == 'invoice'
)
return templates.TemplateResponse(
"ticket/frontend/worklog_review.html",
{
"request": request,
"worklogs": worklogs,
"customers": customers,
"selected_customer_id": customer_id,
"selected_status": status,
"total_hours": total_hours,
"total_billable_hours": total_billable,
"total_entries": len(worklogs)
}
)
except Exception as e:
logger.error(f"❌ Failed to load worklog review page: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.post("/worklog/{worklog_id}/approve")
async def approve_worklog_entry(
worklog_id: int,
redirect_to: Optional[str] = Form(default="/ticket/worklog/review")
):
"""
Approve single worklog entry (change status draft → billable)
Form params:
redirect_to: URL to redirect after approval
"""
try:
# Check entry exists and is draft
check_query = """
SELECT id, status, billing_method
FROM tticket_worklog
WHERE id = %s
"""
entry = execute_query_single(check_query, (worklog_id,))
if not entry:
raise HTTPException(status_code=404, detail="Worklog entry not found")
if entry['status'] != 'draft':
raise HTTPException(
status_code=400,
detail=f"Cannot approve entry with status '{entry['status']}'"
)
# Approve entry
update_query = """
UPDATE tticket_worklog
SET status = 'billable',
updated_at = CURRENT_TIMESTAMP
WHERE id = %s
"""
execute_update(update_query, (worklog_id,))
logger.info(f"✅ Approved worklog entry {worklog_id}")
return RedirectResponse(url=redirect_to, status_code=303)
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Failed to approve worklog entry: {e}")
raise HTTPException(status_code=500, detail=str(e))
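The approve and reject endpoints both guard on `status = 'draft'` before updating. That guard amounts to a small transition table (a sketch of the rule only, not a helper imported by the app; status values are taken from the queries above):

```python
# Worklog status transitions permitted by the approve/reject endpoints:
# only 'draft' entries may move, and only to 'billable' or 'rejected'.
ALLOWED_TRANSITIONS = {
    "draft": {"billable", "rejected"},
}

def can_transition(current: str, target: str) -> bool:
    return target in ALLOWED_TRANSITIONS.get(current, set())

print(can_transition("draft", "billable"))   # True
print(can_transition("billable", "draft"))   # False
```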
@router.post("/worklog/{worklog_id}/reject")
async def reject_worklog_entry(
worklog_id: int,
reason: Optional[str] = Form(default=None),
redirect_to: Optional[str] = Form(default="/ticket/worklog/review")
):
"""
Reject single worklog entry (change status draft → rejected)
Form params:
reason: Rejection reason (optional)
redirect_to: URL to redirect after rejection
"""
try:
# Check entry exists and is draft
check_query = """
SELECT id, status
FROM tticket_worklog
WHERE id = %s
"""
entry = execute_query_single(check_query, (worklog_id,))
if not entry:
raise HTTPException(status_code=404, detail="Worklog entry not found")
if entry['status'] != 'draft':
raise HTTPException(
status_code=400,
detail=f"Cannot reject entry with status '{entry['status']}'"
)
# Reject entry (store reason in description)
update_query = """
UPDATE tticket_worklog
SET status = 'rejected',
description = COALESCE(description, '') ||
CASE WHEN %s IS NOT NULL
THEN E'\n\n[REJECTED: ' || %s || ']'
ELSE E'\n\n[REJECTED]'
END,
updated_at = CURRENT_TIMESTAMP
WHERE id = %s
"""
execute_update(update_query, (reason, reason, worklog_id))
logger.info(f"❌ Rejected worklog entry {worklog_id}" + (f": {reason}" if reason else ""))
return RedirectResponse(url=redirect_to, status_code=303)
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Failed to reject worklog entry: {e}")
raise HTTPException(status_code=500, detail=str(e))
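The `COALESCE`/`CASE` expression in the UPDATE above appends a rejection marker to the existing description. Its string logic is equivalent to this Python sketch (a hypothetical helper for illustration, not part of the app):

```python
def append_rejection_note(description, reason=None):
    """Mimic the SQL: COALESCE(description, '') followed by
    '[REJECTED: reason]' when a reason is given, else '[REJECTED]'."""
    base = description or ""
    if reason is not None:
        return base + f"\n\n[REJECTED: {reason}]"
    return base + "\n\n[REJECTED]"

print(append_rejection_note("Fixed printer", "duplicate entry"))
```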
@router.get("/tickets/new", response_class=HTMLResponse)
async def new_ticket_page(request: Request):
"""
New ticket creation page with multi-step wizard
"""
return templates.TemplateResponse("ticket/frontend/ticket_new.html", {"request": request})
@router.get("/dashboard", response_class=HTMLResponse)
async def ticket_dashboard(request: Request):
"""
Ticket system dashboard with statistics
"""
try:
# Get ticket statistics
stats_query = """
SELECT
COUNT(*) FILTER (WHERE status = 'open') AS open_count,
COUNT(*) FILTER (WHERE status = 'in_progress') AS in_progress_count,
COUNT(*) FILTER (WHERE status = 'pending_customer') AS pending_count,
COUNT(*) FILTER (WHERE status = 'resolved') AS resolved_count,
COUNT(*) FILTER (WHERE status = 'closed') AS closed_count,
COUNT(*) AS total_count
FROM tticket_tickets
"""
stats_result = execute_query(stats_query)
stats = stats_result[0] if stats_result else {}
# Get recent tickets
recent_query = """
SELECT
t.id,
t.ticket_number,
t.subject,
t.status,
t.priority,
t.created_at,
c.name AS customer_name
FROM tticket_tickets t
LEFT JOIN customers c ON c.id = t.customer_id
ORDER BY t.created_at DESC
LIMIT 10
"""
recent_tickets = execute_query(recent_query)
# Get worklog statistics
worklog_stats_query = """
SELECT
COUNT(*) FILTER (WHERE status = 'draft') AS draft_count,
COALESCE(SUM(hours) FILTER (WHERE status = 'draft'), 0) AS draft_hours,
COUNT(*) FILTER (WHERE status = 'billable') AS billable_count,
COALESCE(SUM(hours) FILTER (WHERE status = 'billable'), 0) AS billable_hours
FROM tticket_worklog
"""
worklog_stats_result = execute_query(worklog_stats_query)
worklog_stats = worklog_stats_result[0] if worklog_stats_result else {}
return templates.TemplateResponse(
"ticket/frontend/dashboard.html",
{
"request": request,
"stats": stats,
"recent_tickets": recent_tickets or [],
"worklog_stats": worklog_stats
}
)
except Exception as e:
logger.error(f"❌ Failed to load dashboard: {e}", exc_info=True)
raise HTTPException(status_code=500, detail=str(e))
@router.get("/tickets", response_class=HTMLResponse)
async def ticket_list_page(
request: Request,
status: Optional[str] = None,
priority: Optional[str] = None,
customer_id: Optional[int] = None,
search: Optional[str] = None
):
"""
Ticket list page with filters
"""
try:
# Build query with filters
query = """
SELECT
t.id,
t.ticket_number,
t.subject,
t.status,
t.priority,
t.created_at,
t.updated_at,
c.name AS customer_name,
u.username AS assigned_to_name,
(SELECT COUNT(*) FROM tticket_comments WHERE ticket_id = t.id) AS comment_count,
(SELECT COUNT(*) FROM tticket_worklog WHERE ticket_id = t.id) AS worklog_count
FROM tticket_tickets t
LEFT JOIN customers c ON c.id = t.customer_id
LEFT JOIN users u ON u.user_id = t.assigned_to_user_id
WHERE 1=1
"""
params = []
if status:
query += " AND t.status = %s"
params.append(status)
if priority:
query += " AND t.priority = %s"
params.append(priority)
if customer_id:
query += " AND t.customer_id = %s"
params.append(customer_id)
if search:
query += " AND (t.subject ILIKE %s OR t.description ILIKE %s OR t.ticket_number ILIKE %s)"
search_pattern = f"%{search}%"
params.extend([search_pattern, search_pattern, search_pattern])
query += " ORDER BY t.created_at DESC LIMIT 100"
tickets = execute_query(query, tuple(params)) if params else execute_query(query)
# Get filter options
customers = execute_query(
"""SELECT DISTINCT c.id, c.name
FROM customers c
INNER JOIN tticket_tickets t ON t.customer_id = c.id
ORDER BY c.name"""
)
return templates.TemplateResponse(
"ticket/frontend/ticket_list.html",
{
"request": request,
"tickets": tickets,
"customers": customers,
"selected_status": status,
"selected_priority": priority,
"selected_customer_id": customer_id,
"search_query": search
}
)
except Exception as e:
logger.error(f"❌ Failed to load ticket list: {e}")
raise HTTPException(status_code=500, detail=str(e))
@router.get("/tickets/{ticket_id}", response_class=HTMLResponse)
async def ticket_detail_page(request: Request, ticket_id: int):
"""
Ticket detail page with comments and worklog
"""
try:
# Get ticket details
ticket_query = """
SELECT
t.*,
c.name AS customer_name,
c.email AS customer_email,
u.username AS assigned_to_name
FROM tticket_tickets t
LEFT JOIN customers c ON c.id = t.customer_id
LEFT JOIN users u ON u.user_id = t.assigned_to_user_id
WHERE t.id = %s
"""
ticket = execute_query_single(ticket_query, (ticket_id,))
if not ticket:
raise HTTPException(status_code=404, detail="Ticket not found")
# Get comments
comments_query = """
SELECT
c.*,
u.username AS user_name
FROM tticket_comments c
LEFT JOIN users u ON u.user_id = c.user_id
WHERE c.ticket_id = %s
ORDER BY c.created_at ASC
"""
comments = execute_query(comments_query, (ticket_id,))
# Get worklog
worklog_query = """
SELECT
w.*,
u.username AS user_name
FROM tticket_worklog w
LEFT JOIN users u ON u.user_id = w.user_id
WHERE w.ticket_id = %s
ORDER BY w.work_date DESC, w.created_at DESC
"""
worklog = execute_query(worklog_query, (ticket_id,))
# Get attachments
attachments_query = """
SELECT * FROM tticket_attachments
WHERE ticket_id = %s
ORDER BY created_at DESC
"""
attachments = execute_query(attachments_query, (ticket_id,))
return templates.TemplateResponse(
"ticket/frontend/ticket_detail.html",
{
"request": request,
"ticket": ticket,
"comments": comments,
"worklog": worklog,
"attachments": attachments
}
)
except HTTPException:
raise
except Exception as e:
logger.error(f"❌ Failed to load ticket detail: {e}")
raise HTTPException(status_code=500, detail=str(e))


@@ -1,560 +0,0 @@
<!DOCTYPE html>
<html lang="da">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Worklog Godkendelse - BMC Hub</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.1/font/bootstrap-icons.css">
<style>
:root {
--bg-body: #f8f9fa;
--bg-card: #ffffff;
--text-primary: #2c3e50;
--text-secondary: #6c757d;
--accent: #0f4c75;
--accent-light: #eef2f5;
--border-radius: 12px;
--success: #28a745;
--danger: #dc3545;
--warning: #ffc107;
}
[data-theme="dark"] {
--bg-body: #1a1d23;
--bg-card: #252a31;
--text-primary: #e4e6eb;
--text-secondary: #b0b3b8;
--accent: #4a9eff;
--accent-light: #2d3748;
--success: #48bb78;
--danger: #f56565;
--warning: #ed8936;
}
body {
background-color: var(--bg-body);
color: var(--text-primary);
font-family: 'Inter', -apple-system, BlinkMacSystemFont, sans-serif;
padding-top: 80px;
transition: background-color 0.3s, color 0.3s;
}
.navbar {
background: var(--bg-card);
box-shadow: 0 2px 15px rgba(0,0,0,0.08);
padding: 1rem 0;
border-bottom: 1px solid rgba(0,0,0,0.05);
transition: background-color 0.3s;
}
.navbar-brand {
font-weight: 700;
color: var(--accent);
font-size: 1.25rem;
}
.nav-link {
color: var(--text-secondary);
padding: 0.6rem 1.2rem !important;
border-radius: var(--border-radius);
transition: all 0.2s;
font-weight: 500;
margin: 0 0.2rem;
}
.nav-link:hover, .nav-link.active {
background-color: var(--accent-light);
color: var(--accent);
}
.card {
border: none;
border-radius: var(--border-radius);
box-shadow: 0 2px 15px rgba(0,0,0,0.05);
background: var(--bg-card);
margin-bottom: 1.5rem;
transition: background-color 0.3s;
}
.stats-row {
margin-bottom: 2rem;
}
.stat-card {
text-align: center;
padding: 1.5rem;
}
.stat-card h3 {
font-size: 2.5rem;
font-weight: 700;
color: var(--accent);
margin-bottom: 0.5rem;
}
.stat-card p {
color: var(--text-secondary);
font-size: 0.9rem;
margin: 0;
}
.worklog-table {
background: var(--bg-card);
}
.worklog-table th {
font-weight: 600;
color: var(--text-secondary);
border-bottom: 2px solid var(--accent-light);
padding: 1rem 0.75rem;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.5px;
}
.worklog-table td {
padding: 1rem 0.75rem;
vertical-align: middle;
border-bottom: 1px solid var(--accent-light);
}
.worklog-row {
transition: background-color 0.2s;
}
.worklog-row:hover {
background-color: var(--accent-light);
}
.badge {
padding: 0.4rem 0.8rem;
font-weight: 500;
border-radius: 6px;
font-size: 0.75rem;
}
.badge-invoice {
background-color: var(--accent-light);
color: var(--accent);
}
.badge-prepaid {
background-color: #d4edda;
color: #155724;
}
.badge-support {
background-color: #cce5ff;
color: #004085;
}
.badge-development {
background-color: #f8d7da;
color: #721c24;
}
.btn-approve {
background-color: var(--success);
color: white;
border: none;
padding: 0.4rem 1rem;
border-radius: 6px;
font-size: 0.85rem;
transition: all 0.2s;
}
.btn-approve:hover {
background-color: #218838;
transform: translateY(-1px);
}
.btn-reject {
background-color: var(--danger);
color: white;
border: none;
padding: 0.4rem 1rem;
border-radius: 6px;
font-size: 0.85rem;
transition: all 0.2s;
}
.btn-reject:hover {
background-color: #c82333;
transform: translateY(-1px);
}
.theme-toggle {
cursor: pointer;
padding: 0.5rem 1rem;
border-radius: var(--border-radius);
background: var(--accent-light);
color: var(--accent);
transition: all 0.2s;
border: none;
font-size: 1.2rem;
}
.theme-toggle:hover {
background: var(--accent);
color: white;
}
.filter-bar {
background: var(--bg-card);
padding: 1.5rem;
border-radius: var(--border-radius);
margin-bottom: 1.5rem;
box-shadow: 0 2px 15px rgba(0,0,0,0.05);
}
.hours-display {
font-weight: 700;
color: var(--accent);
font-size: 1.1rem;
}
.empty-state {
text-align: center;
padding: 4rem 2rem;
color: var(--text-secondary);
}
.empty-state i {
font-size: 4rem;
margin-bottom: 1rem;
opacity: 0.3;
}
.ticket-number {
font-family: 'Monaco', 'Courier New', monospace;
background: var(--accent-light);
padding: 0.2rem 0.5rem;
border-radius: 4px;
font-size: 0.85rem;
color: var(--accent);
}
.customer-name {
font-weight: 600;
color: var(--text-primary);
}
.work-description {
color: var(--text-secondary);
font-size: 0.9rem;
max-width: 300px;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.btn-group-actions {
white-space: nowrap;
}
</style>
</head>
<body>
<!-- Navigation -->
<nav class="navbar navbar-expand-lg fixed-top">
<div class="container-fluid">
<a class="navbar-brand" href="/">
<i class="bi bi-boxes"></i> BMC Hub
</a>
<button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarNav">
<span class="navbar-toggler-icon"></span>
</button>
<div class="collapse navbar-collapse" id="navbarNav">
<ul class="navbar-nav me-auto">
<li class="nav-item">
<a class="nav-link" href="/ticket/dashboard">
<i class="bi bi-speedometer2"></i> Dashboard
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="/api/v1/ticket/tickets">
<i class="bi bi-ticket-detailed"></i> Tickets
</a>
</li>
<li class="nav-item">
<a class="nav-link active" href="/ticket/worklog/review">
<i class="bi bi-clock-history"></i> Worklog Godkendelse
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="/api/v1/prepaid-cards">
<i class="bi bi-credit-card"></i> Klippekort
</a>
</li>
</ul>
<button class="theme-toggle" onclick="toggleTheme()" title="Toggle Dark Mode">
<i class="bi bi-moon-stars-fill"></i>
</button>
</div>
</div>
</nav>
<div class="container-fluid px-4">
<!-- Page Header -->
<div class="row mb-4">
<div class="col">
<h1 class="mb-2">
<i class="bi bi-clock-history"></i> Worklog Godkendelse
</h1>
<p class="text-muted">Godkend eller afvis enkelt-entries fra draft worklog</p>
</div>
</div>
<!-- Statistics Row -->
<div class="row stats-row">
<div class="col-md-4">
<div class="card stat-card">
<h3>{{ total_entries }}</h3>
<p>Entries til godkendelse</p>
</div>
</div>
<div class="col-md-4">
<div class="card stat-card">
<h3>{{ "%.2f"|format(total_hours) }}t</h3>
<p>Total timer</p>
</div>
</div>
<div class="col-md-4">
<div class="card stat-card">
<h3>{{ "%.2f"|format(total_billable_hours) }}t</h3>
<p>Fakturerbare timer</p>
</div>
</div>
</div>
<!-- Filter Bar -->
<div class="filter-bar">
<form method="get" action="/ticket/worklog/review" class="row g-3">
<div class="col-md-4">
<label for="customer_id" class="form-label">Filtrer efter kunde:</label>
<select name="customer_id" id="customer_id" class="form-select" onchange="this.form.submit()">
<option value="">Alle kunder</option>
{% for customer in customers %}
<option value="{{ customer.id }}" {% if customer.id == selected_customer_id %}selected{% endif %}>
{{ customer.name }}
</option>
{% endfor %}
</select>
</div>
<div class="col-md-4">
<label for="status" class="form-label">Status:</label>
<select name="status" id="status" class="form-select" onchange="this.form.submit()">
<option value="draft" {% if selected_status == 'draft' %}selected{% endif %}>Draft</option>
<option value="billable" {% if selected_status == 'billable' %}selected{% endif %}>Billable</option>
<option value="rejected" {% if selected_status == 'rejected' %}selected{% endif %}>Rejected</option>
</select>
</div>
<div class="col-md-4 d-flex align-items-end">
<button type="submit" class="btn btn-primary">
<i class="bi bi-funnel"></i> Filtrer
</button>
<a href="/ticket/worklog/review" class="btn btn-outline-secondary ms-2">
<i class="bi bi-x-circle"></i> Ryd filtre
</a>
</div>
</form>
</div>
<!-- Worklog Table -->
{% if worklogs %}
<div class="card">
<div class="table-responsive">
<table class="table worklog-table mb-0">
<thead>
<tr>
<th>Ticket</th>
<th>Kunde</th>
<th>Dato</th>
<th>Timer</th>
<th>Type</th>
<th>Fakturering</th>
<th>Beskrivelse</th>
<th>Medarbejder</th>
<th style="text-align: right;">Handlinger</th>
</tr>
</thead>
<tbody>
{% for worklog in worklogs %}
<tr class="worklog-row">
<td>
<span class="ticket-number">{{ worklog.ticket_number }}</span>
<br>
<small class="text-muted">{{ worklog.ticket_subject|truncate(30, True) }}</small>
</td>
<td>
{% if worklog.customer_name %}
<span class="customer-name">{{ worklog.customer_name }}</span>
{% else %}
<span class="text-muted">-</span>
{% endif %}
</td>
<td>
{{ worklog.work_date.strftime('%d-%m-%Y') if worklog.work_date else '-' }}
</td>
<td>
<span class="hours-display">{{ "%.2f"|format(worklog.hours) }}t</span>
</td>
<td>
{% if worklog.work_type == 'support' %}
<span class="badge badge-support">Support</span>
{% elif worklog.work_type == 'development' %}
<span class="badge badge-development">Udvikling</span>
{% else %}
<span class="badge">{{ worklog.work_type }}</span>
{% endif %}
</td>
<td>
{% if worklog.billing_method == 'invoice' %}
<span class="badge badge-invoice">
<i class="bi bi-file-earmark-text"></i> Faktura
</span>
{% elif worklog.billing_method == 'prepaid' %}
<span class="badge badge-prepaid">
<i class="bi bi-credit-card"></i> Klippekort
</span>
{% if worklog.card_number %}
<br><small class="text-muted">{{ worklog.card_number }}</small>
{% endif %}
{% endif %}
</td>
<td>
<div class="work-description" title="{{ worklog.description or '-' }}">
{{ worklog.description or '-' }}
</div>
</td>
<td>
{{ worklog.user_name or 'N/A' }}
</td>
<td>
{% if worklog.status == 'draft' %}
<div class="btn-group-actions">
<form method="post" action="/ticket/worklog/{{ worklog.id }}/approve" style="display: inline;">
<input type="hidden" name="redirect_to" value="{{ request.url }}">
<button type="submit" class="btn btn-approve btn-sm">
<i class="bi bi-check-circle"></i> Godkend
</button>
</form>
<button
type="button"
class="btn btn-reject btn-sm ms-1"
onclick="rejectWorklog({{ worklog.id }}, '{{ request.url }}')">
<i class="bi bi-x-circle"></i> Afvis
</button>
</div>
{% else %}
<span class="badge">{{ worklog.status }}</span>
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
{% else %}
<div class="card">
<div class="empty-state">
<i class="bi bi-inbox"></i>
<h3>Ingen worklog entries</h3>
<p>Der er ingen entries med status "{{ selected_status }}" {% if selected_customer_id %}for denne kunde{% endif %}.</p>
</div>
</div>
{% endif %}
</div>
<!-- Reject Modal -->
<div class="modal fade" id="rejectModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content" style="background: var(--bg-card); color: var(--text-primary);">
<div class="modal-header" style="border-bottom: 1px solid var(--accent-light);">
<h5 class="modal-title">Afvis Worklog Entry</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<form id="rejectForm" method="post">
<div class="modal-body">
<div class="mb-3">
<label for="rejectReason" class="form-label">Årsag til afvisning (valgfrit):</label>
<textarea
class="form-control"
id="rejectReason"
name="reason"
rows="3"
placeholder="Forklar hvorfor denne entry afvises..."
style="background: var(--bg-body); color: var(--text-primary); border-color: var(--accent-light);"></textarea>
</div>
<input type="hidden" name="redirect_to" id="rejectRedirectTo">
</div>
<div class="modal-footer" style="border-top: 1px solid var(--accent-light);">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Annuller</button>
<button type="submit" class="btn btn-reject">
<i class="bi bi-x-circle"></i> Afvis Entry
</button>
</div>
</form>
</div>
</div>
</div>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
<script>
// Theme Toggle
function toggleTheme() {
const html = document.documentElement;
const currentTheme = html.getAttribute('data-theme');
const newTheme = currentTheme === 'dark' ? 'light' : 'dark';
html.setAttribute('data-theme', newTheme);
localStorage.setItem('theme', newTheme);
// Update icon
const icon = document.querySelector('.theme-toggle i');
if (newTheme === 'dark') {
icon.className = 'bi bi-sun-fill';
} else {
icon.className = 'bi bi-moon-stars-fill';
}
}
// Load saved theme
document.addEventListener('DOMContentLoaded', function() {
const savedTheme = localStorage.getItem('theme') || 'light';
document.documentElement.setAttribute('data-theme', savedTheme);
const icon = document.querySelector('.theme-toggle i');
if (savedTheme === 'dark') {
icon.className = 'bi bi-sun-fill';
}
});
// Reject worklog with modal
function rejectWorklog(worklogId, redirectUrl) {
const form = document.getElementById('rejectForm');
form.action = `/ticket/worklog/${worklogId}/reject`;
document.getElementById('rejectRedirectTo').value = redirectUrl;
const modal = new bootstrap.Modal(document.getElementById('rejectModal'));
modal.show();
}
// Auto-refresh indicator (optional)
let lastRefresh = Date.now();
setInterval(() => {
const elapsed = Math.floor((Date.now() - lastRefresh) / 1000);
if (elapsed > 300 && !document.getElementById('refreshBadge')) { // 5 minutes — add the badge only once
const badge = document.createElement('span');
badge.id = 'refreshBadge';
badge.className = 'badge bg-warning position-fixed';
badge.style.bottom = '20px';
badge.style.right = '20px';
badge.style.cursor = 'pointer';
badge.innerHTML = '<i class="bi bi-arrow-clockwise"></i> Opdater siden';
badge.onclick = () => location.reload();
document.body.appendChild(badge);
}
}, 60000);
</script>
</body>
</html>

View File

@@ -18,7 +18,7 @@ import aiohttp
 from fastapi import HTTPException
 from app.core.config import settings
-from app.core.database import execute_query, execute_update, execute_query_single
+from app.core.database import execute_query, execute_update
 from app.timetracking.backend.models import (
     TModuleEconomicExportRequest,
     TModuleEconomicExportResult
@@ -102,43 +102,6 @@ class EconomicExportService:
             logger.error(f"❌ e-conomic connection error: {e}")
             return False
-    async def check_draft_exists(self, draft_id: int) -> bool:
-        """
-        Tjek om en draft order stadig eksisterer i e-conomic.
-        Args:
-            draft_id: e-conomic draft order nummer
-        Returns:
-            True hvis draft findes, False hvis slettet eller ikke fundet
-        """
-        try:
-            logger.info(f"🔍 Checking if draft {draft_id} exists in e-conomic...")
-            async with aiohttp.ClientSession() as session:
-                async with session.get(
-                    f"{self.api_url}/orders/drafts/{draft_id}",
-                    headers=self._get_headers(),
-                    timeout=aiohttp.ClientTimeout(total=10)
-                ) as response:
-                    if response.status == 200:
-                        logger.info(f"✅ Draft {draft_id} exists in e-conomic")
-                        return True
-                    elif response.status == 404:
-                        logger.info(f"✅ Draft {draft_id} NOT found in e-conomic (deleted)")
-                        return False
-                    else:
-                        error_text = await response.text()
-                        logger.error(f"❌ e-conomic check failed ({response.status}): {error_text}")
-                        raise Exception(f"e-conomic API error: {response.status}")
-        except aiohttp.ClientError as e:
-            logger.error(f"❌ e-conomic connection error: {e}")
-            raise Exception(f"Kunne ikke forbinde til e-conomic: {str(e)}")
-        except Exception as e:
-            logger.error(f"❌ Draft check error: {e}")
-            raise
     async def export_order(
         self,
         request: TModuleEconomicExportRequest,
@@ -162,7 +125,7 @@ class EconomicExportService:
             JOIN tmodule_customers c ON o.customer_id = c.id
             WHERE o.id = %s
         """
-        order = execute_query_single(order_query, (request.order_id,))
+        order = execute_query(order_query, (request.order_id,), fetchone=True)
         if not order:
             raise HTTPException(status_code=404, detail="Order not found")
@@ -244,7 +207,7 @@ class EconomicExportService:
             LEFT JOIN customers c ON tc.hub_customer_id = c.id
             WHERE tc.id = %s
         """
-        customer_data = execute_query_single(customer_number_query, (order['customer_id'],))
+        customer_data = execute_query(customer_number_query, (order['customer_id'],), fetchone=True)
         if not customer_data or not customer_data.get('economic_customer_number'):
             raise HTTPException(
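
Across these hunks the dedicated one-row helper `execute_query_single(sql, params)` is replaced by a `fetchone=True` flag on `execute_query`. A minimal sketch of that calling convention, assuming the helper returns rows as dicts (illustrated here with sqlite3 and an explicit connection argument; the real helper presumably manages its own PostgreSQL connection and uses `%s` placeholders):

```python
import sqlite3

def execute_query(conn, sql, params=(), fetchone=False):
    """Run a query; return one row as a dict when fetchone=True, else a list of dicts."""
    cur = conn.execute(sql, params)
    cols = [d[0] for d in cur.description]
    if fetchone:
        row = cur.fetchone()
        # Missing rows yield None instead of raising, matching the `if not order:` checks above
        return dict(zip(cols, row)) if row else None
    return [dict(zip(cols, r)) for r in cur.fetchall()]

# Demo of the pattern the new code uses
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tmodule_orders (id INTEGER, status TEXT)")
conn.execute("INSERT INTO tmodule_orders VALUES (1, 'draft')")
order = execute_query(conn, "SELECT * FROM tmodule_orders WHERE id = ?", (1,), fetchone=True)
```

Collapsing the two helpers into one keeps a single code path for cursor handling, which is presumably why every `execute_query_single` call site in this compare is rewritten the same way.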

View File

@@ -13,7 +13,7 @@ from datetime import date
 from fastapi import HTTPException
 from app.core.config import settings
-from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
+from app.core.database import execute_query, execute_insert, execute_update
 from app.timetracking.backend.models import (
     TModuleOrder,
     TModuleOrderWithLines,
@@ -42,7 +42,7 @@ class OrderService:
         try:
             # Check module customer
             query = "SELECT hourly_rate FROM tmodule_customers WHERE id = %s"
-            result = execute_query_single(query, (customer_id,))
+            result = execute_query(query, (customer_id,), fetchone=True)
             if result and result.get('hourly_rate'):
                 rate = result['hourly_rate']
@@ -52,7 +52,7 @@
             # Check Hub customer if linked
             if hub_customer_id:
                 query = "SELECT hourly_rate FROM customers WHERE id = %s"
-                result = execute_query_single(query, (hub_customer_id,))
+                result = execute_query(query, (hub_customer_id,), fetchone=True)
                 if result and result.get('hourly_rate'):
                     rate = result['hourly_rate']
@@ -86,9 +86,11 @@
         """
         try:
             # Hent customer info
-            customer = execute_query_single(
+            customer = execute_query(
                 "SELECT * FROM tmodule_customers WHERE id = %s",
-                (customer_id,))
+                (customer_id,),
+                fetchone=True
+            )
             if not customer:
                 raise HTTPException(status_code=404, detail="Customer not found")
@@ -108,7 +110,7 @@
             AND t.billable = true
             ORDER BY c.id, t.worked_date
         """
-        approved_times = execute_query_single(query, (customer_id,))
+        approved_times = execute_query(query, (customer_id,))
         if not approved_times:
             raise HTTPException(
@@ -314,7 +316,7 @@
             LEFT JOIN tmodule_customers c ON o.customer_id = c.id
             WHERE o.id = %s
         """
-        order = execute_query_single(order_query, (order_id,))
+        order = execute_query(order_query, (order_id,), fetchone=True)
         if not order:
             raise HTTPException(status_code=404, detail="Order not found")
@@ -399,7 +401,9 @@
         # Check order exists and is not exported
         order = execute_query(
             "SELECT * FROM tmodule_orders WHERE id = %s",
-            (order_id,))
+            (order_id,),
+            fetchone=True
+        )
         if not order:
             raise HTTPException(status_code=404, detail="Order not found")
@@ -420,7 +424,7 @@
         )
         # Reset time entries back to approved
-        lines = execute_query_single(
+        lines = execute_query(
             "SELECT time_entry_ids FROM tmodule_order_lines WHERE order_id = %s",
             (order_id,)
         )
@@ -449,7 +453,9 @@
         # Return updated order
         updated = execute_query(
             "SELECT * FROM tmodule_orders WHERE id = %s",
-            (order_id,))
+            (order_id,),
+            fetchone=True
+        )
         return TModuleOrder(**updated)

View File

@@ -12,7 +12,7 @@ from typing import Optional, List
 from fastapi import APIRouter, HTTPException, Depends
 from fastapi.responses import JSONResponse
-from app.core.database import execute_query, execute_update, execute_query_single
+from app.core.database import execute_query, execute_update
 from app.timetracking.backend.models import (
     TModuleSyncStats,
     TModuleApprovalStats,
@@ -36,7 +36,7 @@ from app.timetracking.backend.audit import audit
 logger = logging.getLogger(__name__)
-router = APIRouter(prefix="/timetracking")
+router = APIRouter()
 # ============================================================================
@@ -80,9 +80,11 @@ async def sync_case_comments(case_id: int):
     """
     try:
         # Hent case fra database
-        case = execute_query_single(
+        case = execute_query(
             "SELECT vtiger_id FROM tmodule_cases WHERE id = %s",
-            (case_id,))
+            (case_id,),
+            fetchone=True
+        )
         if not case:
             raise HTTPException(status_code=404, detail="Case not found")
@@ -183,7 +185,7 @@
         JOIN tmodule_customers cust ON t.customer_id = cust.id
         WHERE t.id = %s
     """
-    entry = execute_query_single(query, (time_id,))
+    entry = execute_query(query, (time_id,), fetchone=True)
    if not entry:
        raise HTTPException(status_code=404, detail="Time entry not found")
@@ -397,106 +399,6 @@
        raise HTTPException(status_code=500, detail=str(e))
-@router.post("/orders/{order_id}/unlock", tags=["Orders"])
-async def unlock_order(
-    order_id: int,
-    admin_code: str,
-    user_id: Optional[int] = None
-):
-    """
-    🔓 Lås en eksporteret ordre op for ændringer (ADMIN ONLY).
-    Kræver:
-    1. Korrekt admin unlock code (fra TIMETRACKING_ADMIN_UNLOCK_CODE)
-    2. Ordren skal være slettet fra e-conomic først
-    Query params:
-    - admin_code: Admin unlock kode
-    """
-    try:
-        from app.core.config import settings
-        # Verify admin code
-        if not settings.TIMETRACKING_ADMIN_UNLOCK_CODE:
-            raise HTTPException(
-                status_code=500,
-                detail="Admin unlock code ikke konfigureret i systemet"
-            )
-        if admin_code != settings.TIMETRACKING_ADMIN_UNLOCK_CODE:
-            logger.warning(f"⚠️ Ugyldig unlock code forsøg for ordre {order_id}")
-            raise HTTPException(status_code=403, detail="Ugyldig admin kode")
-        # Get order
-        order = order_service.get_order_with_lines(order_id)
-        if order.status != 'exported':
-            raise HTTPException(
-                status_code=400,
-                detail="Kun eksporterede ordrer kan låses op"
-            )
-        if not order.economic_draft_id:
-            raise HTTPException(
-                status_code=400,
-                detail="Ordre har ingen e-conomic ID"
-            )
-        # Check if order still exists in e-conomic
-        try:
-            draft_exists = await economic_service.check_draft_exists(order.economic_draft_id)
-            if draft_exists:
-                raise HTTPException(
-                    status_code=400,
-                    detail=f"⚠️ Ordren findes stadig i e-conomic (Draft #{order.economic_draft_id}). Slet den i e-conomic først!"
-                )
-        except Exception as e:
-            logger.error(f"❌ Kunne ikke tjekke e-conomic status: {e}")
-            raise HTTPException(
-                status_code=500,
-                detail=f"Kunne ikke verificere e-conomic status: {str(e)}"
-            )
-        # Unlock order - set status back to draft
-        update_query = """
-            UPDATE tmodule_orders
-            SET status = 'draft',
-                economic_draft_id = NULL,
-                exported_at = NULL
-            WHERE id = %s
-            RETURNING *
-        """
-        result = execute_query_single(update_query, (order_id,))
-        # Log unlock
-        audit.log_event(
-            event_type="order_unlocked",
-            entity_type="order",
-            entity_id=order_id,
-            user_id=user_id,
-            details={
-                "previous_economic_id": order.economic_draft_id,
-                "unlocked_by_admin": True
-            }
-        )
-        logger.info(f"🔓 Order {order_id} unlocked by admin (user {user_id})")
-        return {
-            "success": True,
-            "message": "Ordre låst op - kan nu redigeres eller slettes",
-            "order_id": order_id
-        }
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error(f"❌ Unlock failed: {e}")
-        raise HTTPException(status_code=500, detail=str(e))
 # ============================================================================
 # e-conomic EXPORT ENDPOINTS
 # ============================================================================
@@ -549,8 +451,10 @@
 async def get_module_metadata():
     """Hent modul metadata"""
     try:
-        result = execute_query_single(
-            "SELECT * FROM tmodule_metadata ORDER BY id DESC LIMIT 1")
+        result = execute_query(
+            "SELECT * FROM tmodule_metadata ORDER BY id DESC LIMIT 1",
+            fetchone=True
+        )
         if not result:
             raise HTTPException(status_code=404, detail="Module metadata not found")
@@ -571,7 +475,7 @@ async def module_health():
            SELECT COUNT(*) as count FROM information_schema.tables
            WHERE table_name LIKE 'tmodule_%'
        """
-        result = execute_query_single(tables_query)
+        result = execute_query(tables_query, fetchone=True)
        table_count = result['count'] if result else 0
        # Get stats - count each table separately
@@ -584,8 +488,10 @@
        }
        for table_name in ["customers", "cases", "times", "orders"]:
-            count_result = execute_query_single(
-                f"SELECT COUNT(*) as count FROM tmodule_{table_name}")
+            count_result = execute_query(
+                f"SELECT COUNT(*) as count FROM tmodule_{table_name}",
+                fetchone=True
+            )
            stats[table_name] = count_result['count'] if count_result else 0
    except Exception as e:
@@ -667,9 +573,11 @@ async def update_customer_hourly_rate(customer_id: int, hourly_rate: float, user
        )
        # Return updated customer
-        customer = execute_query_single(
+        customer = execute_query(
            "SELECT id, name, hourly_rate FROM tmodule_customers WHERE id = %s",
-            (customer_id,))
+            (customer_id,),
+            fetchone=True
+        )
        if not customer:
            raise HTTPException(status_code=404, detail="Customer not found")
@@ -712,9 +620,11 @@ async def toggle_customer_time_card(customer_id: int, enabled: bool, user_id: Op
        )
        # Return updated customer
-        customer = execute_query_single(
+        customer = execute_query(
            "SELECT * FROM tmodule_customers WHERE id = %s",
-            (customer_id,))
+            (customer_id,),
+            fetchone=True
+        )
        if not customer:
            raise HTTPException(status_code=404, detail="Customer not found")
@@ -760,7 +670,7 @@ async def list_customers(
            query += " ORDER BY customer_name"
-            customers = execute_query_single(query)
+            customers = execute_query(query)
        else:
            # Simple customer list
            query = "SELECT * FROM tmodule_customers"
@@ -883,7 +793,7 @@ async def uninstall_module(
                (SELECT COUNT(*) FROM tmodule_order_lines) +
                (SELECT COUNT(*) FROM tmodule_sync_log) as total
            """
-            count_result = execute_query(count_query)
+            count_result = execute_query(count_query, fetchone=True)
            total_rows = count_result['total'] if count_result else 0
        except:
            total_rows = 0

View File

@@ -28,7 +28,7 @@ import aiohttp
 from fastapi import HTTPException
 from app.core.config import settings
-from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
+from app.core.database import execute_query, execute_insert, execute_update
 from app.timetracking.backend.models import TModuleSyncStats
 from app.timetracking.backend.audit import audit
@@ -46,6 +46,7 @@ class TimeTrackingVTigerService:
         self.base_url = settings.VTIGER_URL
         self.username = settings.VTIGER_USERNAME
         self.api_key = settings.VTIGER_API_KEY
+        self.password = settings.VTIGER_PASSWORD
         self.rest_endpoint = f"{self.base_url}/restapi/v1/vtiger/default"
         # Safety flags
@@ -301,9 +302,11 @@
             data_hash = self._calculate_hash(account)
             # Check if exists
-            existing = execute_query_single(
+            existing = execute_query(
                 "SELECT id, sync_hash FROM tmodule_customers WHERE vtiger_id = %s",
-                (vtiger_id,))
+                (vtiger_id,),
+                fetchone=True
+            )
             if existing:
                 # Check if data changed
@@ -421,9 +424,11 @@
                 continue
             # Find customer in our DB
-            customer = execute_query_single(
+            customer = execute_query(
                 "SELECT id FROM tmodule_customers WHERE vtiger_id = %s",
-                (account_id,))
+                (account_id,),
+                fetchone=True
+            )
             if not customer:
                 logger.warning(f"⚠️ Customer {account_id} not found - sync customers first")
@@ -448,9 +453,11 @@
             data_hash = self._calculate_hash(ticket_with_comments)
             # Check if exists
-            existing = execute_query_single(
+            existing = execute_query(
                 "SELECT id, sync_hash FROM tmodule_cases WHERE vtiger_id = %s",
-                (vtiger_id,))
+                (vtiger_id,),
+                fetchone=True
+            )
             if existing:
                 if existing['sync_hash'] == data_hash:
@@ -678,18 +685,22 @@
             if related_to:
                 # Try to find case first, then account
-                case = execute_query_single(
+                case = execute_query(
                     "SELECT id, customer_id FROM tmodule_cases WHERE vtiger_id = %s",
-                    (related_to,))
+                    (related_to,),
+                    fetchone=True
+                )
                 if case:
                     case_id = case['id']
                     customer_id = case['customer_id']
                 else:
                     # Try to find customer directly
-                    customer = execute_query_single(
+                    customer = execute_query(
                         "SELECT id FROM tmodule_customers WHERE vtiger_id = %s",
-                        (related_to,))
+                        (related_to,),
+                        fetchone=True
+                    )
                     if customer:
                         customer_id = customer['id']
@@ -714,9 +725,11 @@
             data_hash = self._calculate_hash(timelog)
             # Check if exists
-            existing = execute_query_single(
+            existing = execute_query(
                 "SELECT id, sync_hash FROM tmodule_times WHERE vtiger_id = %s",
-                (vtiger_id,))
+                (vtiger_id,),
+                fetchone=True
+            )
             if existing:
                 if existing['sync_hash'] == data_hash:
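
The sync code above writes a vTiger record only when `existing['sync_hash'] != data_hash`, i.e. it skips re-importing unchanged accounts, cases, and timelogs. A minimal sketch of such a `_calculate_hash`, under the assumption that it hashes a stable JSON serialisation of the record (the actual implementation may differ):

```python
import hashlib
import json

def calculate_hash(record: dict) -> str:
    """Stable content hash: key order and non-JSON types must not change the result."""
    payload = json.dumps(record, sort_keys=True, default=str)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

a = calculate_hash({"ticketid": "17x123", "status": "Open"})
b = calculate_hash({"status": "Open", "ticketid": "17x123"})  # same data, other key order
# a == b, so an unchanged record is skipped; any field change yields a new hash
```

Sorting keys matters because the vTiger API gives no ordering guarantee for response fields; without `sort_keys=True` the same record could hash differently on every sync and defeat the skip logic.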

View File

@@ -12,7 +12,7 @@ from decimal import Decimal
 from datetime import datetime
 from fastapi import HTTPException
-from app.core.database import execute_query, execute_update, execute_query_single
+from app.core.database import execute_query, execute_update
 from app.timetracking.backend.models import (
     TModuleTimeWithContext,
     TModuleTimeApproval,
@@ -36,7 +36,7 @@ class WizardService:
             SELECT * FROM tmodule_approval_stats
             WHERE customer_id = %s
         """
-        result = execute_query_single(query, (customer_id,))
+        result = execute_query(query, (customer_id,), fetchone=True)
         if not result:
             return None
@@ -83,7 +83,7 @@
                 WHERE customer_id = %s
                 LIMIT 1
            """
-            result = execute_query_single(query, (customer_id,))
+            result = execute_query(query, (customer_id,), fetchone=True)
        else:
            # Hent næste generelt
            if exclude_time_card:
@@ -96,7 +96,7 @@
            else:
                query = "SELECT * FROM tmodule_next_pending LIMIT 1"
-            result = execute_query_single(query)
+            result = execute_query(query, fetchone=True)
        if not result:
            # Ingen flere entries
@@ -161,7 +161,7 @@
            JOIN tmodule_customers cust ON t.customer_id = cust.id
            WHERE t.id = %s
        """
-        entry = execute_query_single(query, (approval.time_id,))
+        entry = execute_query(query, (approval.time_id,), fetchone=True)
        if not entry:
            raise HTTPException(status_code=404, detail="Time entry not found")
@@ -215,7 +215,7 @@
            )
            # Return updated entry
-            updated = execute_query_single(query, (approval.time_id,))
+            updated = execute_query(query, (approval.time_id,), fetchone=True)
            return TModuleTimeWithContext(**updated)
        except HTTPException:
@@ -251,7 +251,7 @@
            JOIN tmodule_customers cust ON t.customer_id = cust.id
            WHERE t.id = %s
        """
-        entry = execute_query_single(query, (time_id,))
+        entry = execute_query(query, (time_id,), fetchone=True)
        if not entry:
            raise HTTPException(status_code=404, detail="Time entry not found")
@@ -285,7 +285,7 @@
            logger.info(f"❌ Rejected time entry {time_id}: {reason}")
            # Return updated
-            updated = execute_query_single(query, (time_id,))
+            updated = execute_query(query, (time_id,), fetchone=True)
            return TModuleTimeWithContext(**updated)
        except HTTPException:
@@ -321,7 +321,7 @@
            JOIN tmodule_customers cust ON t.customer_id = cust.id
            WHERE t.id = %s
        """
-        entry = execute_query_single(query, (time_id,))
+        entry = execute_query(query, (time_id,), fetchone=True)
        if not entry:
            raise HTTPException(status_code=404, detail="Time entry not found")
@@ -368,7 +368,7 @@
            logger.info(f"🔄 Reset time entry {time_id} to pending: {reason}")
            # Return updated
-            updated = execute_query_single(query, (time_id,))
+            updated = execute_query(query, (time_id,), fetchone=True)
            return TModuleTimeWithContext(**updated)
        except HTTPException:
@@ -491,7 +491,7 @@
            ORDER BY t.worked_date
            LIMIT 1
""" """
case = execute_query_single(query, (customer_id,)) case = execute_query(query, (customer_id,), fetchone=True)
if case: if case:
current_case_id = case['id'] current_case_id = case['id']
current_case_title = case['title'] current_case_title = case['title']
@ -608,7 +608,7 @@ class WizardService:
FROM tmodule_cases FROM tmodule_cases
WHERE id = %s WHERE id = %s
""" """
case = execute_query(case_query, (case_id,)) case = execute_query(case_query, (case_id,), fetchone=True)
if not case: if not case:
raise HTTPException(status_code=404, detail="Case not found") raise HTTPException(status_code=404, detail="Case not found")
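The recurring change in this file replaces the dedicated `execute_query_single` helper with a `fetchone=True` flag on `execute_query`. A minimal sketch of what such a consolidated helper might look like — hypothetical, since the real `app.core.database` module is not shown here; sqlite3 (with `?` placeholders) stands in for the PostgreSQL driver and its `%s` placeholders:

```python
import sqlite3

def execute_query(conn, query, params=(), fetchone=False):
    """Run a SELECT and return dict rows; a single row (or None) when fetchone=True."""
    conn.row_factory = sqlite3.Row  # rows become name-addressable
    cur = conn.execute(query, params)
    if fetchone:
        row = cur.fetchone()
        return dict(row) if row is not None else None
    return [dict(r) for r in cur.fetchall()]

# Tiny in-memory demo table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tmodule_cases (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO tmodule_cases VALUES (1, 'Setup'), (2, 'Support')")

case = execute_query(conn, "SELECT * FROM tmodule_cases WHERE id = ?", (1,), fetchone=True)
rows = execute_query(conn, "SELECT * FROM tmodule_cases")
```

Folding the single-row case into one function keeps a single code path for parameter binding, which is presumably why the diff removes `execute_query_single` everywhere.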


@@ -99,8 +99,8 @@
             </div>
         </div>
-        <!-- Safety Banner (dynamic) -->
-        <div class="row mb-4" id="safety-banner" style="display: none;">
+        <!-- Safety Banner -->
+        <div class="row mb-4">
             <div class="col-12">
                 <div class="alert alert-warning d-flex align-items-center" role="alert">
                     <i class="bi bi-shield-exclamation me-2"></i>
@@ -190,22 +190,8 @@
 let orderModal = null;
 // Initialize modal
-document.addEventListener('DOMContentLoaded', async function() {
+document.addEventListener('DOMContentLoaded', function() {
     orderModal = new bootstrap.Modal(document.getElementById('orderModal'));
-    // Check if DRY-RUN mode is active
-    try {
-        const configResponse = await fetch('/api/v1/timetracking/config');
-        const config = await configResponse.json();
-        // Show banner only if DRY-RUN or READ-ONLY is enabled
-        if (config.economic_dry_run || config.economic_read_only) {
-            document.getElementById('safety-banner').style.display = 'block';
-        }
-    } catch (error) {
-        console.error('Error checking config:', error);
-    }
     loadOrders();
 });
@@ -228,13 +214,13 @@
     const tbody = document.getElementById('orders-tbody');
     tbody.innerHTML = orders.map(order => {
         const statusBadge = getStatusBadge(order);
-        const isLocked = order.status === 'exported';
-        const economicInfo = order.economic_draft_id
-            ? `<br><small class="text-muted">e-conomic draft #${order.economic_draft_id}</small>`
+        const isPosted = order.status === 'posted';
+        const economicInfo = order.economic_order_number
+            ? `<br><small class="text-muted">e-conomic #${order.economic_order_number}</small>`
             : '';
         return `
-            <tr class="order-row ${isLocked ? 'table-warning' : ''}" onclick="viewOrder(${order.id})">
+            <tr class="order-row ${isPosted ? 'table-success' : ''}" onclick="viewOrder(${order.id})">
                 <td>
                     <strong>${order.order_number}</strong>
                     ${economicInfo}
@@ -254,17 +240,9 @@
                             onclick="event.stopPropagation(); exportOrder(${order.id})">
                         <i class="bi bi-cloud-upload"></i> Eksporter
                     </button>
-                    <button class="btn btn-sm btn-outline-danger"
-                            onclick="event.stopPropagation(); cancelOrder(${order.id})">
-                        <i class="bi bi-x-circle"></i> Annuller
-                    </button>
                 ` : ''}
-                ${order.status === 'exported' ? `
-                    <span class="badge bg-warning text-dark"><i class="bi bi-lock"></i> Låst</span>
-                    <button class="btn btn-sm btn-outline-warning"
-                            onclick="event.stopPropagation(); unlockOrder(${order.id})">
-                        <i class="bi bi-unlock"></i> Lås op
-                    </button>
+                ${isPosted ? `
+                    <span class="badge bg-success"><i class="bi bi-lock"></i> Låst</span>
                 ` : ''}
                 </td>
             </tr>
@@ -288,8 +266,9 @@
 function getStatusBadge(order) {
     const statusMap = {
         'cancelled': '<span class="badge bg-danger">Annulleret</span>',
-        'exported': '<span class="badge bg-warning text-dark"><i class="bi bi-cloud-check"></i> Eksporteret</span>',
-        'draft': '<span class="badge bg-secondary">Kladde</span>'
+        'posted': '<span class="badge bg-success"><i class="bi bi-check-circle"></i> Bogført</span>',
+        'exported': '<span class="badge bg-info">Eksporteret</span>',
+        'draft': '<span class="badge bg-warning">Kladde</span>'
     };
     return statusMap[order.status] || '<span class="badge bg-secondary">Ukendt</span>';
@@ -325,7 +304,7 @@
     <hr class="my-3">
     <h6 class="mb-3">Ordrelinjer:</h6>
-    ${(order.lines || []).map(line => {
+    ${order.lines.map(line => {
         // Parse data
         const caseMatch = line.description.match(/CC(\d+)/);
         const caseTitle = line.description.split(' - ').slice(1).join(' - ') || line.description;
@@ -382,20 +361,23 @@
     // Update export button
     const exportBtn = document.getElementById('export-order-btn');
-    if (order.status === 'exported') {
+    if (order.status === 'posted') {
         exportBtn.disabled = true;
-        exportBtn.innerHTML = '<i class="bi bi-lock"></i> Eksporteret (Låst)';
+        exportBtn.innerHTML = '<i class="bi bi-lock"></i> Bogført (Låst)';
         exportBtn.classList.remove('btn-primary');
         exportBtn.classList.add('btn-secondary');
-    } else if (order.status === 'draft') {
+    } else if (order.economic_draft_id) {
+        exportBtn.disabled = false;
+        exportBtn.innerHTML = '<i class="bi bi-arrow-repeat"></i> Re-eksporter (force)';
+        exportBtn.onclick = () => {
+            if (confirm('Re-eksporter ordre til e-conomic?\n\nDette vil overskrive den eksisterende draft order.')) {
+                exportOrderForce(currentOrderId);
+            }
+        };
+    } else {
         exportBtn.disabled = false;
         exportBtn.innerHTML = '<i class="bi bi-cloud-upload"></i> Eksporter til e-conomic';
         exportBtn.onclick = exportCurrentOrder;
-    } else {
-        exportBtn.disabled = true;
-        exportBtn.innerHTML = `<i class="bi bi-x-circle"></i> ${order.status}`;
-        exportBtn.classList.remove('btn-primary');
-        exportBtn.classList.add('btn-secondary');
     }
     orderModal.show();
@@ -454,76 +436,43 @@
     }
 }
-// Unlock exported order (ADMIN)
-async function unlockOrder(orderId) {
-    const adminCode = prompt(
-        '🔐 ADMIN ADGANG PÅKRÆVET\n\n' +
-        'Før ordren kan låses op skal den være slettet fra e-conomic.\n\n' +
-        'Indtast admin unlock kode:'
-    );
-    if (!adminCode) return;
-    if (!confirm(
-        '⚠️ ADVARSEL\n\n' +
-        'Er du SIKKER på at ordren er slettet fra e-conomic?\n\n' +
-        'Systemet vil tjekke om ordren stadig findes i e-conomic.\n\n' +
-        'Fortsæt?'
-    )) return;
+// Force re-export order
+async function exportOrderForce(orderId) {
     try {
-        const response = await fetch(`/api/v1/timetracking/orders/${orderId}/unlock?admin_code=${encodeURIComponent(adminCode)}`, {
+        const response = await fetch(`/api/v1/timetracking/export`, {
             method: 'POST',
             headers: {
                 'Content-Type': 'application/json'
-            }
+            },
+            body: JSON.stringify({
+                order_id: orderId,
+                force: true
+            })
         });
         if (!response.ok) {
             const errorData = await response.json();
-            alert(`❌ Kunne ikke låse ordre op:\n\n${errorData.detail}`);
-            return;
+            throw new Error(errorData.detail || 'Export failed');
         }
         const result = await response.json();
-        alert(`✅ ${result.message}\n\nOrdren kan nu redigeres eller slettes.`);
+        if (result.dry_run) {
+            alert(`DRY-RUN MODE:\n\n${result.message}\n\n⚠ Ingen ændringer er foretaget i e-conomic (DRY-RUN mode aktiveret).`);
+        } else if (result.success) {
+            alert(`✅ Ordre re-eksporteret til e-conomic!\n\n- Draft Order nr.: ${result.economic_draft_id}\n- e-conomic ordre nr.: ${result.economic_order_number}`);
             loadOrders();
+            if (orderModal._isShown) {
+                orderModal.hide();
+            }
+        } else {
+            throw new Error(result.message || 'Export failed');
+        }
     } catch (error) {
-        console.error('Error unlocking order:', error);
-        alert(`❌ Fejl ved unlock: ${error.message}`);
+        alert('Fejl ved eksport: ' + error.message);
     }
 }
-// Cancel order
-async function cancelOrder(orderId) {
-    if (!confirm('Annuller denne ordre?\n\nTidsregistreringerne sættes tilbage til "godkendt" status.')) {
-        return;
-    }
-    try {
-        const response = await fetch(`/api/v1/timetracking/orders/${orderId}/cancel`, {
-            method: 'POST',
-            headers: {
-                'Content-Type': 'application/json'
-            }
-        });
-        if (!response.ok) {
-            const errorData = await response.json();
-            alert(`❌ Kunne ikke annullere ordre:\n\n${errorData.detail}`);
-            return;
-        }
-        alert('✅ Ordre annulleret\n\nTidsregistreringerne er sat tilbage til "godkendt" status.');
-        loadOrders();
-    } catch (error) {
-        console.error('Error cancelling order:', error);
-        alert(`❌ Fejl ved annullering: ${error.message}`);
-    }
-}
 </script>
 </div>
 {% endblock %}


@@ -352,8 +352,7 @@
 <script>
 let currentEntry = null;
 let currentCustomerId = null;
-let currentCaseId = null;
-let defaultHourlyRate = 1200.00; // Fallback value, fetched from API
+let defaultHourlyRate = 850.00; // Fallback value, fetched from API
 // Load config from API
 async function loadConfig() {
@@ -403,7 +402,6 @@
     }
     currentEntry = data.time_entry;
-    currentCaseId = currentEntry.case_id;
     // Fetch ALL pending timelogs in this case
     if (currentEntry.case_id) {
@@ -660,14 +658,6 @@
                 </label>
             </div>
         </div>
-        <div class="mt-3">
-            <label class="form-label">
-                <i class="bi bi-pencil"></i> Godkendelsesnote (valgfri)
-            </label>
-            <textarea class="form-control" id="approval-note-${e.id}" rows="2"
-                      placeholder="Tilføj evt. note til denne godkendelse..."></textarea>
-        </div>
     </div>
     <div class="mt-3 d-flex gap-2">
@@ -995,7 +985,7 @@
     const method = methodSelect.value;
     const minimum = parseFloat(minimumInput.value) || 0;
-    const hourlyRate = parseFloat(hourlyRateInput?.value) || 1200;
+    const hourlyRate = parseFloat(hourlyRateInput?.value) || 850;
     const original = parseFloat(entry.original_hours) || 0;
     let billable = original;
@@ -1087,10 +1077,6 @@
     const travelCheckbox = document.getElementById(`travel-${entryId}`);
     const isTravel = travelCheckbox ? travelCheckbox.checked : false;
-    // Get approval note
-    const approvalNoteField = document.getElementById(`approval-note-${entryId}`);
-    const approvalNote = approvalNoteField ? approvalNoteField.value.trim() : '';
     try {
         const response = await fetch(`/api/v1/timetracking/wizard/approve/${entryId}`, {
             method: 'POST',
@@ -1100,8 +1086,7 @@
             body: JSON.stringify({
                 billable_hours: billableHours,
                 hourly_rate: hourlyRate,
-                is_travel: isTravel,
-                approval_note: approvalNote || null
+                is_travel: isTravel
             })
         });
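The hunks above show a billable-hours calculation driven by a `method`, a `minimum`, and the original hours, with the fallback hourly rate moving from 1200 to 850. The full rule set is not visible in this diff; the sketch below illustrates one plausible "minimum billing" interpretation in Python — the `method` semantics and function name are assumptions for illustration only:

```python
def billable_amount(original_hours, method="none", minimum=0.0, hourly_rate=850.0):
    """Hypothetical sketch of the wizard's billable-hours rule: optionally
    enforce a minimum number of billable hours, then price at the hourly rate."""
    billable = float(original_hours)
    if method == "minimum" and billable < minimum:
        billable = float(minimum)  # assumed behavior: round up to the agreed minimum
    return billable, round(billable * hourly_rate, 2)

# A 15-minute entry billed with a 0.5 h minimum at the 850 fallback rate
hours, amount = billable_amount(0.25, method="minimum", minimum=0.5)
```

Whatever the real rule is, keeping it in one pure function (rather than inline in the UI handler) makes it easy to test against the server-side calculation.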


@@ -1,11 +0,0 @@
{
"folders": [
{
"path": "."
},
{
"path": "../../pakkemodtagelse"
}
],
"settings": {}
}


@@ -1,7 +1,7 @@
 version: '3.8'
 services:
-  # PostgreSQL Database - Production
+  # PostgreSQL Database
   postgres:
     image: postgres:16-alpine
     container_name: bmc-hub-postgres-prod
@@ -11,8 +11,7 @@ services:
       POSTGRES_DB: ${POSTGRES_DB}
     volumes:
       - postgres_data:/var/lib/postgresql/data
-      # Mount all migration files for initialization
-      - ./migrations:/docker-entrypoint-initdb.d:ro
+      - ./migrations/init.sql:/docker-entrypoint-initdb.d/init.sql:ro
     ports:
       - "${POSTGRES_PORT:-5432}:5432"
     restart: always
@@ -24,7 +23,7 @@ services:
     networks:
       - bmc-hub-network
-  # FastAPI Application - Production with Gitea Release
+  # FastAPI Application - Production with GitHub Release Version
   api:
     build:
       context: .
@@ -41,10 +40,10 @@ services:
     ports:
       - "${API_PORT:-8000}:8000"
     volumes:
-      # Data persistence (NO source code in production)
       - ./logs:/app/logs
       - ./uploads:/app/uploads
       - ./data:/app/data
+      # NOTE: No source code mount in production - code comes from GitHub release
     env_file:
       - .env
     environment:
@@ -63,7 +62,6 @@ services:
     labels:
       - "com.bmcnetworks.app=bmc-hub"
       - "com.bmcnetworks.version=${RELEASE_VERSION:-latest}"
-      - "com.bmcnetworks.environment=production"
 networks:
   bmc-hub-network:


@@ -42,12 +42,17 @@ services:
       # Mount for local development - live code reload
       - ./app:/app/app:ro
       - ./main.py:/app/main.py:ro
+      - ./scripts:/app/scripts:ro
+      # Mount OmniSync database for import (read-only)
+      - /Users/christianthomas/pakkemodtagelse/data:/omnisync_data:ro
     env_file:
       - .env
     environment:
       # Override database URL to point to postgres service
       - DATABASE_URL=postgresql://${POSTGRES_USER:-bmc_hub}:${POSTGRES_PASSWORD:-bmc_hub}@postgres:5432/${POSTGRES_DB:-bmc_hub}
       - ENABLE_RELOAD=false
+      - OLLAMA_MODEL=qwen3:4b # Uses the Chat API format
+      - OLLAMA_MODEL_FALLBACK=qwen2.5:3b # Backup model
     restart: unless-stopped
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8000/health"]


@@ -1,232 +0,0 @@
# BMC Hub Backup System - Environment Configuration Guide
## Quick Start
Add these lines to your `.env` file to enable the backup system:
```bash
# ===== BACKUP SYSTEM =====
# Safety switches (start with safe defaults)
BACKUP_ENABLED=true # Enable backup system (default: false)
BACKUP_DRY_RUN=false # Disable dry-run to actually perform backups (default: true)
BACKUP_READ_ONLY=false # Allow restore operations (default: true)
# Backup formats
DB_DAILY_FORMAT=dump # Compressed format for daily backups (default: dump)
DB_MONTHLY_FORMAT=sql # Plain SQL for monthly backups (default: sql)
# Backup scope
BACKUP_INCLUDE_UPLOADS=true # Include uploads/ directory (default: true)
BACKUP_INCLUDE_LOGS=true # Include logs/ directory (default: true)
BACKUP_INCLUDE_DATA=true # Include data/ directory (default: true)
# Storage configuration
BACKUP_STORAGE_PATH=/opt/backups # Production path (default: /opt/backups)
# BACKUP_STORAGE_PATH=./backups # Use this for local development
BACKUP_MAX_SIZE_GB=50 # Maximum storage size (default: 50)
STORAGE_WARNING_THRESHOLD_PCT=80 # Warn at 80% usage (default: 80)
# Retention policy
RETENTION_DAYS=30 # Keep daily backups for 30 days (default: 30)
MONTHLY_KEEP_MONTHS=12 # Keep monthly backups for 12 months (default: 12)
# Offsite uploads (SFTP/SSH)
OFFSITE_ENABLED=false # Disable until configured (default: false)
OFFSITE_WEEKLY_DAY=sunday # Day for weekly upload (default: sunday)
OFFSITE_RETRY_MAX_ATTEMPTS=3 # Max retry attempts (default: 3)
OFFSITE_RETRY_DELAY_HOURS=1 # Hours between retries (default: 1)
# SFTP/SSH connection (configure when enabling offsite)
SFTP_HOST=backup.example.com
SFTP_PORT=22
SFTP_USER=bmc_backup
SFTP_PASSWORD= # Leave empty if using SSH key
SSH_KEY_PATH=/path/to/id_rsa # Path to SSH private key (preferred)
SFTP_REMOTE_PATH=/backups/bmc_hub
# Mattermost notifications
MATTERMOST_ENABLED=false # Disable until webhook configured (default: false)
MATTERMOST_WEBHOOK_URL=https://mattermost.example.com/hooks/xxx
MATTERMOST_CHANNEL=backups
NOTIFY_ON_FAILURE=true # Send alerts on failures (default: true)
NOTIFY_ON_SUCCESS_OFFSITE=true # Send alerts on successful offsite uploads (default: true)
```
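The safety switches above are plain string environment variables, so the application has to parse `"true"`/`"false"` defensively and fall back to the documented safe defaults when a variable is unset. A minimal sketch of such parsing — the flag names come from this guide, but the helper itself (`env_flag`) is illustrative, not the actual BMC Hub code:

```python
import os

TRUTHY = {"1", "true", "yes", "on"}

def env_flag(name, default):
    """Read a boolean env switch, falling back to the documented default."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in TRUTHY

# Simulate a configured .env for the demo
os.environ["BACKUP_ENABLED"] = "true"
os.environ["BACKUP_DRY_RUN"] = "false"
os.environ.pop("BACKUP_READ_ONLY", None)  # unset on purpose

backup_enabled = env_flag("BACKUP_ENABLED", False)  # default: disabled
dry_run = env_flag("BACKUP_DRY_RUN", True)          # default: safe
read_only = env_flag("BACKUP_READ_ONLY", True)      # unset -> safe default True
```

Note that the defaults mirror the guide's philosophy: every unset switch resolves to the safest behavior (disabled, dry-run, read-only).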
## Configuration Steps
### 1. Basic Setup (Local Development)
```bash
BACKUP_ENABLED=true
BACKUP_DRY_RUN=false
BACKUP_READ_ONLY=false
BACKUP_STORAGE_PATH=./backups
```
Restart the application:
```bash
docker-compose restart api
```
### 2. Enable Offsite Uploads
#### Using SSH Key (Recommended)
```bash
OFFSITE_ENABLED=true
SFTP_HOST=your-backup-server.com
SFTP_USER=backup_user
SSH_KEY_PATH=/path/to/id_rsa
SFTP_REMOTE_PATH=/backups/bmc_hub
```
#### Using Password
```bash
OFFSITE_ENABLED=true
SFTP_HOST=your-backup-server.com
SFTP_USER=backup_user
SFTP_PASSWORD=your_secure_password
SFTP_REMOTE_PATH=/backups/bmc_hub
```
### 3. Enable Mattermost Notifications
1. Create an incoming webhook in Mattermost
2. Copy the webhook URL
3. Add to `.env`:
```bash
MATTERMOST_ENABLED=true
MATTERMOST_WEBHOOK_URL=https://your-mattermost.com/hooks/xxxxxxxxxxxxx
MATTERMOST_CHANNEL=backups
```
## Scheduled Jobs
When `BACKUP_ENABLED=true`, the system automatically schedules:
- **Daily Backup**: 02:00 CET - Full backup (database + files) in compressed format
- **Monthly Backup**: 1st day at 02:00 CET - Full backup in plain SQL format
- **Weekly Offsite**: Sunday at 03:00 CET - Upload all pending backups to offsite
- **Backup Rotation**: Daily at 01:00 CET - Delete expired backups
- **Storage Check**: Daily at 01:30 CET - Check disk usage
## Manual Operations
### Via UI Dashboard
Visit: `http://localhost:8000/backups`
Features:
- Create manual backups (database, files, or full)
- View backup history with sizes and checksums
- Restore from backup (with confirmation)
- Manual offsite upload
- View notifications
- Monitor storage usage
### Via API
```bash
# Create manual backup
curl -X POST http://localhost:8000/api/v1/backups \
-H "Content-Type: application/json" \
-d '{"job_type": "full", "is_monthly": false}'
# List backups
curl http://localhost:8000/api/v1/backups/jobs
# Get backup details
curl http://localhost:8000/api/v1/backups/jobs/1
# Restore from backup (⚠️ DANGER - enters maintenance mode)
curl -X POST http://localhost:8000/api/v1/backups/restore/1 \
-H "Content-Type: application/json" \
-d '{"confirmation": true}'
# Upload to offsite
curl -X POST http://localhost:8000/api/v1/backups/offsite/1
# Check storage
curl http://localhost:8000/api/v1/backups/storage
# Check maintenance mode
curl http://localhost:8000/api/v1/backups/maintenance
```
## Database Migration
Migration has already been applied. If you need to re-run:
```bash
docker-compose exec -T postgres psql -U bmc_hub -d bmc_hub < migrations/024_backup_system.sql
```
## Troubleshooting
### Backup not running
- Check `BACKUP_ENABLED=true` in `.env`
- Check logs: `docker-compose logs api | grep backup`
- Verify scheduler status via API: `curl http://localhost:8000/api/v1/backups/scheduler/status`
### Offsite upload failing
- Verify SFTP credentials
- Test SSH connection: `ssh -i /path/to/key user@host`
- Check retry count in backup history
- Review notifications in dashboard
### Storage full
- Increase `BACKUP_MAX_SIZE_GB`
- Reduce `RETENTION_DAYS`
- Manually delete old backups via UI
- Enable offsite uploads to move backups off-server
### Restore not working
- Set `BACKUP_READ_ONLY=false` in `.env`
- Restart API: `docker-compose restart api`
- Verify backup file exists on disk
- Check maintenance mode overlay appears during restore
## Safety Features
The system includes multiple safety switches:
1. **BACKUP_ENABLED** - Master switch, disabled by default
2. **BACKUP_DRY_RUN** - Logs operations without executing
3. **BACKUP_READ_ONLY** - Blocks destructive restore operations
4. **OFFSITE_ENABLED** - Disables external uploads until configured
5. **MATTERMOST_ENABLED** - Prevents notification spam
Always test with `BACKUP_DRY_RUN=true` first!
## Production Checklist
- [ ] Set strong SFTP credentials
- [ ] Configure SSH key authentication
- [ ] Test restore procedure (on test system first!)
- [ ] Configure Mattermost notifications
- [ ] Set appropriate retention periods
- [ ] Verify backup storage capacity
- [ ] Document restore procedures
- [ ] Schedule restore drills (quarterly)
- [ ] Monitor backup success rate
- [ ] Test offsite download/restore
## Backup Storage Structure
```
/opt/backups/
├── database/
│ ├── db_20251213_020000_daily.dump
│ ├── db_20251201_020000_monthly.sql
│ └── ...
└── files/
├── files_20251213_020000.tar.gz
└── ...
```
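The filenames in the tree above encode a timestamp and the backup cadence. Under that assumption (the naming format is inferred from the listing, not from the backup code itself), generating a name and checking the daily retention window can be sketched as:

```python
from datetime import datetime, timedelta

def backup_name(kind, when, monthly=False):
    """Build a backup filename like db_20251213_020000_daily.dump (format
    inferred from the storage-structure listing above)."""
    stamp = when.strftime("%Y%m%d_%H%M%S")
    if kind == "db":
        suffix = "monthly.sql" if monthly else "daily.dump"
        return f"db_{stamp}_{suffix}"
    return f"files_{stamp}.tar.gz"

def is_expired(when, now, retention_days=30):
    """Retention check of the kind the daily 01:00 rotation job would run."""
    return now - when > timedelta(days=retention_days)

name = backup_name("db", datetime(2025, 12, 13, 2, 0, 0))
```

Embedding the timestamp in the filename means the rotation job can decide expiry from the name alone, without consulting the database.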
## Support
For issues or questions, check:
- Logs: `/logs/app.log`
- API Docs: `http://localhost:8000/api/docs`
- Dashboard: `http://localhost:8000/backups`


@@ -1,456 +0,0 @@
# Email Activity Logging System
## Overview
A complete audit-trail system that logs **everything** that happens to emails in BMC Hub. Every action, change, and event is logged automatically with timestamps, metadata, and context.
## 🎯 What Gets Logged?
### System Events
- **fetched**: Email fetched from the mail server
- **classified**: Email classified by the AI/keyword system
- **workflow_executed**: Workflow executed on the email
- **rule_matched**: Email rule matched
- **status_changed**: Email status changed
- **error**: Error occurred during processing
### User Events
- **read**: Email read by a user
- **attachment_downloaded**: Attachment downloaded
- **attachment_uploaded**: Attachment uploaded
### Integration Events
- **linked**: Email linked to a vendor/customer/case
- **invoice_extracted**: Invoice data extracted from a PDF
- **ticket_created**: Support ticket created
- **notification_sent**: Notification sent
## 📊 Database Schema
### email_activity_log Table
```sql
CREATE TABLE email_activity_log (
id SERIAL PRIMARY KEY,
    email_id INTEGER NOT NULL,            -- Which email
    event_type VARCHAR(50) NOT NULL,      -- What happened
    event_category VARCHAR(30) NOT NULL,  -- Category (system/user/workflow/etc)
    description TEXT NOT NULL,            -- Human-readable description
    metadata JSONB,                       -- Extra data as JSON
    user_id INTEGER,                      -- User, if user-triggered
    created_at TIMESTAMP,                 -- When
    created_by VARCHAR(255)               -- Who/what
);
```
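The insert path behind this schema can be sketched in a few lines. This is a stand-in, not the real service: sqlite3 replaces PostgreSQL (so `JSONB` becomes a JSON-encoded `TEXT` column and `SERIAL` becomes `INTEGER PRIMARY KEY`), and the function name simply mirrors the SQL `log_email_event()` helper shown below:

```python
import json
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE email_activity_log (
    id INTEGER PRIMARY KEY,
    email_id INTEGER NOT NULL,
    event_type TEXT NOT NULL,
    event_category TEXT NOT NULL,
    description TEXT NOT NULL,
    metadata TEXT,
    user_id INTEGER,
    created_at TEXT,
    created_by TEXT
)""")

def log_email_event(email_id, event_type, category, description,
                    metadata=None, user_id=None, created_by="system"):
    """Minimal stand-in for the log_email_event() SQL function."""
    conn.execute(
        "INSERT INTO email_activity_log (email_id, event_type, event_category,"
        " description, metadata, user_id, created_at, created_by)"
        " VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
        (email_id, event_type, category, description,
         json.dumps(metadata or {}), user_id,
         datetime.now(timezone.utc).isoformat(), created_by),
    )

log_email_event(123, "classified", "system",
                "Classified as invoice (confidence: 85%)",
                {"classification": "invoice", "confidence": 0.85})
row = conn.execute(
    "SELECT event_type, metadata FROM email_activity_log WHERE email_id = 123"
).fetchone()
```

Serializing `metadata` as JSON keeps the table schema stable while each event type attaches its own context.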
### email_timeline View
Pre-built view with joins to users and email_messages:
```sql
SELECT * FROM email_timeline WHERE email_id = 123;
```
## 🔧 How Is It Used?
### In Python Code
```python
from app.services.email_activity_logger import email_activity_logger
# Log email fetch
await email_activity_logger.log_fetched(
email_id=123,
source='imap',
message_id='msg-abc-123'
)
# Log classification
await email_activity_logger.log_classified(
email_id=123,
classification='invoice',
confidence=0.85,
method='ai'
)
# Log workflow execution
await email_activity_logger.log_workflow_executed(
email_id=123,
workflow_id=5,
workflow_name='Invoice Processing',
status='completed',
steps_completed=3,
execution_time_ms=1250
)
# Log status change
await email_activity_logger.log_status_changed(
email_id=123,
old_status='active',
new_status='processed',
reason='workflow completed'
)
# Log entity linking
await email_activity_logger.log_linked(
email_id=123,
entity_type='vendor',
entity_id=42,
entity_name='Acme Corp'
)
# Log invoice extraction
await email_activity_logger.log_invoice_extracted(
email_id=123,
invoice_number='INV-2025-001',
amount=1234.56,
success=True
)
# Log error
await email_activity_logger.log_error(
email_id=123,
error_type='extraction_failed',
error_message='PDF corrupted',
context={'file': 'invoice.pdf', 'size': 0}
)
# Generic log (for custom events)
await email_activity_logger.log(
email_id=123,
event_type='custom_event',
category='integration',
description='Custom event happened',
metadata={'key': 'value'}
)
```
### Via SQL
```sql
-- Log event directly via function
SELECT log_email_event(
123, -- email_id
'custom_event', -- event_type
'system', -- event_category
'Something happened', -- description
'{"foo": "bar"}'::jsonb, -- metadata (optional)
NULL, -- user_id (optional)
'system' -- created_by
);
-- Query logs for specific email
SELECT * FROM email_activity_log
WHERE email_id = 123
ORDER BY created_at DESC;
-- Use the view for nicer output
SELECT * FROM email_timeline
WHERE email_id = 123;
```
### Via API
```http
GET /api/v1/emails/123/activity
```
Response:
```json
[
{
"id": 1,
"email_id": 123,
"event_type": "fetched",
"event_category": "system",
"description": "Email fetched from email server",
"metadata": {
"source": "imap",
"message_id": "msg-abc-123"
},
"user_id": null,
"user_name": null,
"created_at": "2025-12-15T10:30:00",
"created_by": "system"
},
{
"id": 2,
"email_id": 123,
"event_type": "classified",
"event_category": "system",
"description": "Classified as invoice (confidence: 85%)",
"metadata": {
"classification": "invoice",
"confidence": 0.85,
"method": "ai"
},
"created_at": "2025-12-15T10:30:02",
"created_by": "system"
}
]
```
## 🎨 UI Integration
### Email Detail View
When you select an email in the email UI:
1. Click the **"Log"** tab in the right sidebar
2. See the complete timeline of all events
3. Expand metadata for details
### Timeline Features
- **Chronological view**: Newest first
- **Color-coded icons**: Based on event category
  - 🔵 System events (blue)
  - 🟢 User events (green)
  - 🔷 Workflow events (cyan)
  - 🟡 Rule events (yellow)
  - ⚫ Integration events (gray)
- **Expandable metadata**: Click to see JSON details
- **User attribution**: Shows who performed the action
## 📈 Analytics & Monitoring
### Recent Activity Across All Emails
```http
GET /api/v1/emails/activity/recent?limit=50&event_type=error
```
### Activity Statistics
```http
GET /api/v1/emails/activity/stats
```
Response:
```json
[
{
"event_type": "classified",
"event_category": "system",
"count": 1523,
"last_occurrence": "2025-12-15T12:45:00"
},
{
"event_type": "workflow_executed",
"event_category": "workflow",
"count": 892,
"last_occurrence": "2025-12-15T12:44:30"
}
]
```
## 🔍 Use Cases
### 1. Debugging Email Processing
```sql
-- See complete flow for problematic email
SELECT
event_type,
description,
created_at
FROM email_activity_log
WHERE email_id = 123
ORDER BY created_at;
```
### 2. Performance Monitoring
```sql
-- Find slow workflow executions
SELECT
email_id,
description,
(metadata->>'execution_time_ms')::int as exec_time
FROM email_activity_log
WHERE event_type = 'workflow_executed'
ORDER BY exec_time DESC
LIMIT 10;
```
### 3. User Activity Audit
```sql
-- See what user did
SELECT
e.subject,
a.event_type,
a.description,
a.created_at
FROM email_activity_log a
JOIN email_messages e ON a.email_id = e.id
WHERE a.user_id = 5
ORDER BY a.created_at DESC;
```
### 4. Error Analysis
```sql
-- Find common errors
SELECT
metadata->>'error_type' as error_type,
COUNT(*) as count
FROM email_activity_log
WHERE event_type = 'error'
GROUP BY error_type
ORDER BY count DESC;
```
### 5. Workflow Success Rate
```sql
-- Calculate workflow success rate
SELECT
metadata->>'workflow_name' as workflow,
COUNT(*) FILTER (WHERE metadata->>'status' = 'completed') as success,
COUNT(*) FILTER (WHERE metadata->>'status' = 'failed') as failed,
ROUND(
100.0 * COUNT(*) FILTER (WHERE metadata->>'status' = 'completed') / COUNT(*),
2
) as success_rate
FROM email_activity_log
WHERE event_type = 'workflow_executed'
GROUP BY workflow
ORDER BY success_rate DESC;
```
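The same per-workflow success-rate aggregation can be done client-side over decoded log rows — for example when analyzing the JSON returned by the activity API. A hedged Python equivalent of the SQL above (the event dicts here are hand-made samples, not real log data):

```python
from collections import defaultdict

def success_rates(events):
    """Per-workflow success rate over workflow_executed events,
    mirroring the SQL FILTER-based aggregation."""
    counts = defaultdict(lambda: {"success": 0, "failed": 0})
    for ev in events:
        if ev["event_type"] != "workflow_executed":
            continue
        meta = ev["metadata"]
        key = meta["workflow_name"]
        if meta["status"] == "completed":
            counts[key]["success"] += 1
        else:
            counts[key]["failed"] += 1
    return {
        name: round(100.0 * c["success"] / (c["success"] + c["failed"]), 2)
        for name, c in counts.items()
    }

events = [
    {"event_type": "workflow_executed",
     "metadata": {"workflow_name": "Invoice Processing", "status": "completed"}},
    {"event_type": "workflow_executed",
     "metadata": {"workflow_name": "Invoice Processing", "status": "failed"}},
    {"event_type": "classified", "metadata": {}},  # ignored: not a workflow event
]
rates = success_rates(events)
```

For large log volumes, prefer the SQL version — it runs against the indexed table instead of shipping every row to the client.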
## 🚀 Auto-Logging
The following is already implemented and logs automatically:
**Email Fetching** - Logged when emails are fetched
**Classification** - Logged when the AI classifies
**Workflow Execution** - Logged at start and completion
**Status Changes** - Logged when the email status changes
### Upcoming Auto-Logging
⏳ Rule matching (coming soon)
⏳ User read events (when a user opens an email)
⏳ Attachment actions (download/upload)
⏳ Entity linking (vendor/customer association)
## 💡 Best Practices
### 1. Always Include Metadata
```python
# ❌ Bad - No context
await email_activity_logger.log(
email_id=123,
event_type='action_performed',
category='system',
description='Something happened'
)
# ✅ Good - Rich context
await email_activity_logger.log(
email_id=123,
event_type='invoice_sent',
category='integration',
description='Invoice sent to e-conomic',
metadata={
'invoice_number': 'INV-2025-001',
'economic_id': 12345,
'amount': 1234.56,
'sent_at': datetime.now().isoformat()
}
)
```
### 2. Use Descriptive Event Types
```python
# ❌ Bad - Generic
event_type='action'
# ✅ Good - Specific
event_type='invoice_sent_to_economic'
```
### 3. Choose Correct Category
- **system**: Automated system actions
- **user**: User-triggered actions
- **workflow**: Workflow executions
- **rule**: Rule-based actions
- **integration**: External system integrations
### 4. Log Errors with Context
```python
try:
result = extract_invoice_data(pdf_path)
except Exception as e:
await email_activity_logger.log_error(
email_id=email_id,
error_type='extraction_failed',
error_message=str(e),
context={
'pdf_path': pdf_path,
'file_size': os.path.getsize(pdf_path),
'traceback': traceback.format_exc()
}
)
```
## 🔒 Data Retention
Activity logs can grow quickly. Implement a cleanup strategy:
```sql
-- Delete logs older than 90 days
DELETE FROM email_activity_log
WHERE created_at < NOW() - INTERVAL '90 days';
-- Archive old logs to separate table
INSERT INTO email_activity_log_archive
SELECT * FROM email_activity_log
WHERE created_at < NOW() - INTERVAL '30 days';
DELETE FROM email_activity_log
WHERE created_at < NOW() - INTERVAL '30 days';
```
## 📊 Performance Considerations
With indexes on `email_id`, `event_type`, `created_at`, and `event_category`, the system can handle millions of log entries without performance issues.
### Index Usage
```sql
-- Fast: Uses idx_email_activity_log_email_id
SELECT * FROM email_activity_log WHERE email_id = 123;
-- Fast: Uses idx_email_activity_log_event_type
SELECT * FROM email_activity_log WHERE event_type = 'workflow_executed';
-- Fast: Uses idx_email_activity_log_created_at
SELECT * FROM email_activity_log WHERE created_at > NOW() - INTERVAL '1 day';
```
## 🎓 Examples
### Complete Email Lifecycle Log
```python
# 1. Email arrives
await email_activity_logger.log_fetched(email_id, 'imap', message_id)
# 2. AI classifies it
await email_activity_logger.log_classified(email_id, 'invoice', 0.92, 'ai')
# 3. Workflow processes it
await email_activity_logger.log_workflow_executed(
    email_id, workflow_id, 'Invoice Processing', 'completed', 3, 1100
)
# 4. Links to vendor
await email_activity_logger.log_linked(email_id, 'vendor', 42, 'Acme Corp')
# 5. Extracts invoice
await email_activity_logger.log_invoice_extracted(
    email_id, 'INV-001', 1234.56, True
)
# 6. Status changes
await email_activity_logger.log_status_changed(
    email_id, 'active', 'processed', 'workflow completed'
)
```
Result: **Complete audit trail of the email from fetch to processed!**
---
**Version**: 1.0
**Last Updated**: December 15, 2025
**Status**: ✅ Production Ready

@ -1,258 +0,0 @@
# Email Rules → Workflows Migration Guide
## 🎯 Purpose
BMC Hub is phasing out the old **Rules** system in favor of the newer and more powerful **Workflows** system.
**Status:**
- ✅ Workflows are enabled by default (`EMAIL_WORKFLOWS_ENABLED=true`)
- ⚠️ Rules are now disabled by default (`EMAIL_RULES_ENABLED=false`)
## 🔄 What Has Changed?
### In the code:
1. **Coordination added**: Workflows run first; rules run only if no workflow has processed the email
2. **Deduplication**: The same action is not executed twice
3. **Config change**: `EMAIL_RULES_ENABLED` is now `false` by default
4. **Silent failures fixed**: extract_invoice_data now fails visibly if a file is missing
### In `.env`:
```bash
# Old (deprecated):
EMAIL_RULES_ENABLED=true
EMAIL_RULES_AUTO_PROCESS=false
# New (recommended):
EMAIL_WORKFLOWS_ENABLED=true
EMAIL_RULES_ENABLED=false
```
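The flags above could be read with a small helper like this — a sketch assuming plain environment variables; the real settings object in BMC Hub may parse them differently:

```python
import os

# Assumed flag parser; truthy values are '1', 'true', 'yes' (case-insensitive).
def env_flag(name: str, default: str) -> bool:
    return os.getenv(name, default).strip().lower() in ('1', 'true', 'yes')

WORKFLOWS_ENABLED = env_flag('EMAIL_WORKFLOWS_ENABLED', 'true')
RULES_ENABLED = env_flag('EMAIL_RULES_ENABLED', 'false')
```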
## 📋 Migration Steps
### Step 1: Identify Active Rules
```sql
-- List all active rules
SELECT id, name, action_type, conditions, priority, match_count
FROM email_rules
WHERE enabled = true
ORDER BY priority ASC;
```
### Step 2: Create Equivalent Workflows
For each rule, create a workflow:
**Example: Rule → Workflow**
**Old Rule:**
```json
{
"name": "Link Supplier Invoices",
"conditions": {
"classification": "invoice",
"sender_domain": ["example.com", "vendor.dk"]
},
"action_type": "link_supplier",
"priority": 10
}
```
**New Workflow:**
```sql
INSERT INTO email_workflows (
name,
description,
classification_trigger,
sender_pattern,
confidence_threshold,
workflow_steps,
priority,
enabled
) VALUES (
'Link Supplier Invoices',
'Automatically link invoice emails from known suppliers',
'invoice',
'(example\\.com|vendor\\.dk)$', -- Regex pattern
0.70,
'[
{"action": "link_to_vendor", "params": {"match_by": "email"}},
{"action": "extract_invoice_data", "params": {}},
{"action": "mark_as_processed", "params": {}}
]'::jsonb,
10,
true
);
```
### Step 3: Test Workflows
1. Send a test email that matches the classification
2. Check the `email_workflow_executions` table for results
3. Verify that the email was processed correctly
```sql
-- Inspect workflow executions
SELECT
e.id,
w.name as workflow_name,
em.subject,
e.status,
e.steps_completed,
e.execution_time_ms,
e.started_at
FROM email_workflow_executions e
JOIN email_workflows w ON w.id = e.workflow_id
JOIN email_messages em ON em.id = e.email_id
ORDER BY e.started_at DESC
LIMIT 20;
```
### Step 4: Disable Rules
```sql
-- Disable all rules
UPDATE email_rules SET enabled = false;
```
Or in `.env`:
```bash
EMAIL_RULES_ENABLED=false
```
### Step 5: Monitor
During the first days after the migration:
- Check the logs for errors
- Verify that workflows run as expected
- Check that no emails slip through
## 🗂️ Mapping: Rules → Workflows
### Action Mapping
| Rule Action | Workflow Action | Notes |
|-------------|-----------------|-------|
| `link_supplier` | `link_to_vendor` | ✅ Direct replacement |
| `link_customer` | `link_to_customer` | ⚠️ Not fully implemented yet |
| `link_case` | `create_ticket` | ✅ Creates ticket from email |
| `mark_spam` | *(none)* | ⚠️ Needs workflow action |
| `create_purchase` | *(none)* | ⚠️ Not implemented |
### Condition Mapping
| Rule Condition | Workflow Equivalent |
|----------------|---------------------|
| `classification` | `classification_trigger` |
| `sender_email` | `sender_pattern` (exact match regex) |
| `sender_domain` | `sender_pattern` (domain regex) |
| `subject_contains` | `subject_pattern` (contains regex) |
| `subject_regex` | `subject_pattern` (direct) |
| `confidence_score` | `confidence_threshold` |
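The mapping table above can be sketched as a small converter. This is a hypothetical helper, not shipped code — the condition field names follow this guide, and `re.escape` produces the same escaped-domain regex shown in the Step 2 example:

```python
import re

def rule_to_workflow(conditions: dict) -> dict:
    """Translate legacy rule conditions into workflow trigger fields."""
    wf = {}
    if 'classification' in conditions:
        wf['classification_trigger'] = conditions['classification']
    if 'sender_domain' in conditions:
        domains = '|'.join(re.escape(d) for d in conditions['sender_domain'])
        wf['sender_pattern'] = f'({domains})$'
    if 'sender_email' in conditions:
        # exact-match regex for a single address
        wf['sender_pattern'] = f"^{re.escape(conditions['sender_email'])}$"
    if 'subject_contains' in conditions:
        wf['subject_pattern'] = re.escape(conditions['subject_contains'])
    if 'confidence_score' in conditions:
        wf['confidence_threshold'] = conditions['confidence_score']
    return wf
```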
## 🆕 Workflow-Only Features
Workflows can do more than rules:
1. **Multi-step execution**: Chain multiple actions
2. **Better error handling**: Each step tracked separately
3. **Execution history**: Full audit trail
4. **Regex patterns**: More flexible matching
5. **Stop on match**: Control workflow chaining
6. **Statistics**: Success/failure rates
## ⚠️ Backward Compatibility
**If you MUST keep rules:**
Set in `.env`:
```bash
EMAIL_RULES_ENABLED=true
EMAIL_WORKFLOWS_ENABLED=true
```
**The system will:**
- Run workflows first
- Only run rules if no workflow processed the email
- Avoid duplicate actions
**This is not recommended, however** - rules will be removed in the future.
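The workflow-first coordination can be sketched as follows. The `_workflow_processed` flag comes from this guide; the service objects, flags, and method names are stand-ins, not the real BMC Hub classes:

```python
import asyncio

WORKFLOWS_ENABLED = True  # assumed config flags
RULES_ENABLED = True

async def process_email(email_data, workflow_service, rule_service):
    """Run workflows first; legacy rules only if no workflow handled the email."""
    if WORKFLOWS_ENABLED:
        result = await workflow_service.execute_workflows(email_data)
        if result.get('workflows_succeeded', 0) > 0:
            email_data['_workflow_processed'] = True
    if RULES_ENABLED and not email_data.get('_workflow_processed'):
        await rule_service.match_rules(email_data)
```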
## 📊 Status Dashboard
Add to the admin UI:
```sql
-- Workflow statistics
SELECT
w.name,
w.classification_trigger,
w.execution_count,
w.success_count,
w.failure_count,
ROUND(100.0 * w.success_count / NULLIF(w.execution_count, 0), 1) as success_rate,
w.last_executed_at
FROM email_workflows w
WHERE w.enabled = true
ORDER BY w.execution_count DESC;
```
## 🐛 Troubleshooting
### "Workflow not executed"
**Check:**
1. Is `EMAIL_WORKFLOWS_ENABLED=true` in .env?
2. Is the workflow enabled in the database?
3. Does the classification_trigger match?
4. Is the confidence above the threshold?
5. Do the sender/subject patterns match?
**Debug:**
```python
# Look for these log lines:
# "🔄 Finding workflows for classification: invoice (confidence: 0.95)"
# "📋 Found X matching workflow(s)"
```
### "Email processed twice"
**Check:**
1. Are both workflows AND rules enabled?
2. Do they have the same actions?
**Fix:**
Disable rules: `EMAIL_RULES_ENABLED=false`
### "Workflow fails silently"
**After the fix:**
- extract_invoice_data now raises FileNotFoundError if the PDF is missing
- Check `email_workflow_executions.status = 'failed'`
- Check the `error_message` column
## ✅ Success Criteria
The migration is complete when:
1. ✅ All rules are migrated to workflows
2. ✅ Workflows are tested and working
3. ✅ `EMAIL_RULES_ENABLED=false` in production
4. ✅ No emails slip through
5. ✅ email_workflow_executions shows success
## 📞 Support
If problems occur:
1. Check logs: `docker-compose logs -f api | grep "🔄\|❌\|✅"`
2. Check the database: `SELECT * FROM email_workflow_executions ORDER BY started_at DESC LIMIT 10;`
3. Revert: Set `EMAIL_RULES_ENABLED=true` temporarily
---
**Timeline:**
- ✅ Now: Workflows enabled, rules disabled by default
- 🔜 Next sprint: Remove the rule UI from the admin
- 🔜 In 1 month: Drop the email_rules table entirely
**Recommendation:** Migrate now, while both systems are still available as a fallback.

@ -1,316 +0,0 @@
# Email Rules vs Workflows - Analysis
## 🔍 Overview
BMC Hub has **2 systems** for automatic email processing:
1. **Email Rules** (legacy) - `email_rules` table
2. **Email Workflows** (newer) - `email_workflows` table
## ⚙️ How Do They Work?
### Email Processing Flow
```
📧 New Email Received
1⃣ Save email to database (email_messages)
2⃣ Classify with AI/simple classifier
↓ (classification + confidence_score are stored)
3⃣ Execute WORKFLOWS first 🆕
├─ Finds workflows with matching classification
├─ Checks confidence_threshold
├─ Checks sender/subject patterns (regex)
├─ Executes workflow steps in order
└─ Stops if stop_on_match=true
4⃣ Match RULES afterwards (legacy) 🕰️
├─ Finds rules with matching conditions
├─ Checks sender, domain, classification, subject
├─ Executes the rule action (only 1 action per rule)
└─ Stops after the first match
```
## 🆚 Differences
| Feature | Email Rules (Legacy) | Email Workflows (New) |
|---------|---------------------|---------------------|
| **Flexibility** | Single action per rule | Multiple steps per workflow |
| **Priority** | Yes (priority field) | Yes (priority field) |
| **Stop on match** | Implicit (first match wins) | Explicit (stop_on_match flag) |
| **Pattern matching** | Basic (exact match, contains) | Advanced (regex patterns) |
| **Confidence check** | No | Yes (confidence_threshold) |
| **Execution tracking** | No | Yes (email_workflow_executions) |
| **Statistics** | Yes (match_count) | Yes (execution_count, success/failure) |
| **Actions** | 5 types | 10+ types |
| **Database table** | email_rules | email_workflows |
| **Enabled by** | EMAIL_RULES_ENABLED | EMAIL_WORKFLOWS_ENABLED |
| **Auto-execute** | EMAIL_RULES_AUTO_PROCESS | Always (if enabled) |
## ⚠️ PROBLEM: Duplication and Conflicts
### 1. Both Can Run Simultaneously
**Scenario:**
```
Email: Invoice from leverandør@example.com
Classification: invoice, confidence: 0.95
WORKFLOW matches:
- "Invoice Processing Workflow"
→ Steps: link_to_vendor, extract_invoice_data, mark_as_processed
→ Executes first! ✅
RULE matches:
- "Link Supplier Emails"
→ Action: link_supplier
→ Executes after! ⚠️
RESULT: link_to_vendor runs twice!
```
### 2. No Coordination
Workflows do not know whether rules have run (or vice versa).
**Problem:**
- An email can be marked as "processed" by a workflow
- A rule still tries to run its action afterwards
- The result is logged in 2 places (workflow_executions + rule match_count)
### 3. Overlapping Actions
**The same functionality exists in both systems:**
| Action Type | Rule Name | Workflow Action |
|-------------|-----------|----------------|
| Link vendor | `link_supplier` | `link_to_vendor` |
| Link customer | `link_customer` | `link_to_customer` |
| Mark spam | `mark_spam` | *(missing)* |
| Link case | `link_case` | `create_ticket` |
| Invoice extraction | *(missing)* | `extract_invoice_data` |
### 4. The Auto-Process Flag Does Not Work for Workflows
**In the code:**
```python
# Rules respect the auto-process flag
if self.auto_process:
    await self._execute_rule_action(email_data, rule)
else:
    logger.info("⏭️ Auto-process disabled - rule action not executed")

# Workflows ALWAYS run if enabled=true
workflow_result = await email_workflow_service.execute_workflows(email_data)
```
**Problem:** You cannot disable workflow auto-execution without disabling the entire workflow system.
## ✅ What Works Well
### 1. Workflows Are More Powerful
- Multi-step execution
- Better tracking (execution history)
- Regex pattern matching
- Confidence threshold check
- Success/failure statistics
### 2. Rules Are Simpler
- Good for simple if-then logic
- Easier to understand for non-technical users
- Works fine for basic email routing
### 3. Both Have Priority Ordering
- Workflows execute in priority order
- Rules match in priority order
- The first match can stop the chain (if configured)
## 🐛 Concrete Bugs Found
### Bug #1: Workflow ALWAYS Executes
**Code:** `email_processor_service.py` lines 77-79
```python
# Step 4: Execute workflows based on classification
workflow_result = await email_workflow_service.execute_workflows(email_data)
```
**Problem:** No check of `EMAIL_RULES_AUTO_PROCESS` or a similar flag.
**Fix:**
```python
if settings.EMAIL_WORKFLOWS_ENABLED:
    workflow_result = await email_workflow_service.execute_workflows(email_data)
```
### Bug #2: Rules Run After Workflows
**Code:** `email_processor_service.py` lines 84-88
```python
# Step 5: Match against rules (legacy support)
if self.rules_enabled:
    matched = await self._match_rules(email_data)
```
**Problem:** If a workflow has already processed the email, the rule should not run.
**Fix:**
```python
# Step 5: Match against rules (legacy support) - skip if already processed by workflow
if self.rules_enabled and not email_data.get('_workflow_processed'):
    matched = await self._match_rules(email_data)
```
### Bug #3: Missing Deduplication
**Problem:** The same action can be executed by both a workflow and a rule.
**Fix:** Add a check in the rule execution:
```python
# Check if the email was already processed by a workflow
already_processed = execute_query(
    "SELECT id FROM email_workflow_executions WHERE email_id = %s AND status = 'completed'",
    (email_id,), fetchone=True
)
if already_processed:
    logger.info("⏭️ Email already processed by workflow, skipping rule")
    return False
```
### Bug #4: The `extract_invoice_data` Workflow Action Can Fail Silently
**Code:** `email_workflow_service.py` line 380+
```python
if not file_path.exists():
    # No error raised! Just continues...
```
**Problem:** If the PDF file does not exist, the workflow does not fail - it just continues.
**Fix:** Raise an exception:
```python
if not file_path.exists():
    raise FileNotFoundError(f"Attachment file not found: {attachment_path}")
```
## 💡 Recommendations
### Recommendation #1: Pick ONE System
**Option A: Deprecate Rules (recommended)**
- Workflows are more powerful
- Better tracking and debugging
- Future-proof architecture
**Migration plan:**
1. Create workflows matching all active rules
2. Disable rules (set enabled=false)
3. Test workflows thoroughly
4. Remove rule execution from the processor
**Option B: Keep Both, But Coordinate**
- Add a `_workflow_processed` flag to email_data
- Skip rules if a workflow has run
- Document clearly when to use rules vs workflows
### Recommendation #2: Add a Workflow Auto-Process Flag
**Add to the `email_workflows` table:**
```sql
ALTER TABLE email_workflows ADD COLUMN auto_execute BOOLEAN DEFAULT true;
```
**Check the flag before execution:**
```python
if workflow.get('auto_execute', True):
    result = await self._execute_workflow(workflow, email_data)
```
### Recommendation #3: Unified Action Registry
**Create shared action handlers:**
```python
# shared/email_actions.py
class EmailActions:
    @staticmethod
    async def link_to_vendor(email_id, vendor_id):
        # Single implementation used by both rules and workflows
        ...
```
### Recommendation #4: Better Conflict Detection
**Add an admin UI warning:**
```python
# Check for overlapping rules and workflows
def check_conflicts():
    conflicts = []
    for rule in active_rules:
        for workflow in active_workflows:
            if might_conflict(rule, workflow):
                conflicts.append({
                    'rule': rule['name'],
                    'workflow': workflow['name'],
                    'reason': 'Both match same classification'
                })
    return conflicts
```
### Recommendation #5: Execution Log Consolidation
**A single view of all actions:**
```sql
CREATE VIEW email_action_log AS
SELECT
'workflow' as source,
e.email_id,
w.name as action_name,
e.status,
e.started_at
FROM email_workflow_executions e
JOIN email_workflows w ON w.id = e.workflow_id
UNION ALL
SELECT
'rule' as source,
em.id as email_id,
er.name as action_name,
CASE WHEN em.auto_processed THEN 'completed' ELSE 'skipped' END as status,
em.updated_at as started_at
FROM email_messages em
JOIN email_rules er ON er.id = em.rule_id
WHERE em.rule_id IS NOT NULL
ORDER BY started_at DESC;
```
## 🎯 Action Plan
### Immediate (Critical):
1. ✅ Add an `EMAIL_WORKFLOWS_ENABLED` check before workflow execution
2. ✅ Add a workflow-processed check before rule matching
3. ✅ Fix the `extract_invoice_data` silent failure
4. ✅ Add duplicate action detection
### Short Term:
5. Add an `auto_execute` column to the workflows table
6. Create unified action handlers
7. Add a conflict detection admin tool
8. Document clearly when to use what
### Long Term:
9. Decide: deprecate rules or keep both?
10. Migrate existing rules to workflows (if deprecating)
11. Create a unified execution log view
12. Add a UI for viewing all email actions in one dashboard
## 📊 What Should You Do Now?
**Questions for you:**
1. **Do you want to keep both systems or only workflows?**
- If only workflows: We can migrate rules → workflows now
- If both: We need to fix the coordination
2. **Should workflows be disableable without turning the whole system off?**
- Yes → We add the auto_execute flag
- No → Workflows always run when enabled=true
3. **Are there active rules in production right now?**
- Yes → We must be careful with changes
- No → We can simply disable the rule system
**Quick Fix (5 min):**
I can add the 4 critical fixes now if you want to continue with both systems.
**Long Fix (1 hour):**
I can deprecate rules and migrate to workflows if you want to simplify.
Which do you prefer? 🤔

@ -1,171 +0,0 @@
# Email System - Errors & Improvement Suggestions
## 🔴 Critical Errors
### 1. Type Errors in the Backend (router.py)
**Problem:** `execute_query(fetchone=True)` may return `None`, but the code always assumes a dict
**Location:** Lines 253-255 in the mark-processed endpoint
**Solution:** ✅ FIXED - Added `.get()` fallbacks
### 2. Missing Error Handling in the Frontend
**Problem:** `getElementById()` can return `null`, but the code does not check for it
**Location:** Multiple places in emails.html
**Solution:** ✅ FIXED - Added null checks
### 3. Race Condition on Email Loading
**Problem:** Multiple simultaneous calls to `loadEmails()` can occur if the user switches filters quickly
**Solution:** ✅ FIXED - Added an `isLoadingEmails` flag
## ⚠️ Minor Errors
### 4. Missing Loading State
**Problem:** No visual feedback while emails are loading
**Solution:** ✅ FIXED - Added a spinner
### 5. Duplicate Function Names
**Problem:** `delete_email` is defined twice in router.py
**Solution:** To be fixed - one for soft delete, one for hard delete (rename one of them)
### 6. Missing `classify_email` Method
**Problem:** `EmailProcessorService` has no `classify_email()` method, but the router calls it
**Solution:** Must be added or replaced with `simple_classifier.classify()`
## 💡 Improvement Suggestions
### 1. Bulk Operations - Missing Confirmation
**Problem:** No confirmation before bulk actions
**Suggestion:**
```javascript
if (!confirm(`Reprocess ${selectedEmails.size} emails?`)) return;
```
### 2. Better Attachment Error Messages
**Problem:** Only a generic error when an attachment download fails
**Suggestion:** ✅ FIXED - Added a detailed error message
### 3. Email Search - Not Implemented
**Problem:** The search query parameter is sent, but the backend does not handle it
**Suggestion:** Add `WHERE (subject ILIKE %s OR sender_email ILIKE %s)` to the SQL
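A hedged sketch of that search support on the backend side — the `email_messages` table and its columns come from this document, while the function and its defaults are assumptions:

```python
def build_email_query(search=None):
    """Build the email list query, adding the suggested ILIKE filter when searching."""
    sql = "SELECT * FROM email_messages"
    params = []
    if search:
        sql += " WHERE (subject ILIKE %s OR sender_email ILIKE %s)"
        params = [f"%{search}%"] * 2  # same term for both columns
    sql += " ORDER BY received_date DESC LIMIT 100"
    return sql, params
```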
### 4. Auto-Refresh Can Interrupt the User
**Problem:** If the user is reading an email, auto-refresh can reset the view
**Suggestion:** Pause auto-refresh while the email detail view is open
### 5. Missing Pagination
**Problem:** Limited to 100 emails, no pagination
**Suggestion:** Add infinite scroll or a "Load More" button
### 6. Status Filter Improvement
**Problem:** `currentFilter === 'active'` only shows status=new, but it should also show 'error' and 'flagged'
**Suggestion:**
```javascript
if (currentFilter === 'active') {
    url += '&status=new,error,flagged'; // Backend must support comma-separated values
}
```
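The backend counterpart to that comma-separated filter could look like this — a hypothetical sketch; the set of valid statuses is an assumption based on the statuses mentioned in this document:

```python
VALID_STATUSES = {'new', 'error', 'flagged', 'processed', 'deleted'}

def parse_status_filter(status):
    """Turn '?status=new,error,flagged' into a validated list of statuses."""
    if not status:
        return []
    return [part for part in (p.strip() for p in status.split(','))
            if part in VALID_STATUSES]
```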
### 7. Email Preview Cut-off
**Problem:** The preview is always 80 chars, regardless of screen size
**Suggestion:** Dynamic length based on viewport width
### 8. Keyboard Navigation Improvements
**Problem:** Only j/k work, not Tab/Shift+Tab
**Suggestion:** Add standard focus navigation
### 9. Bulk Select - Missing "Select All"
**Problem:** Each email must be checked manually
**Suggestion:** Add a "Select all" checkbox in the header
### 10. Processed Emails - Cannot Be Viewed
**Problem:** When an email is moved to the "Processed" folder, it disappears from the view
**Suggestion:**
- Add a "Processed" tab/filter button
- Or show a confirmation "Email moved to Processed - click here to view it"
## 🎯 Performance Optimization
### 11. N+1 Query Problem
**Problem:** Fetching vendor/customer names separately for each email
**Solution:** The SQL query already uses LEFT JOIN - OK ✅
### 12. Missing Indexes
**Recommendation:** Add indexes:
```sql
CREATE INDEX idx_email_messages_status ON email_messages(status);
CREATE INDEX idx_email_messages_folder ON email_messages(folder);
CREATE INDEX idx_email_messages_classification ON email_messages(classification);
CREATE INDEX idx_email_messages_received_date ON email_messages(received_date DESC);
```
### 13. Large Body Text
**Problem:** Fetches the full body_text and body_html for every email in the list
**Suggestion:** Use `LEFT(body_text, 200)` in the list query
## 🔒 Security
### 14. XSS Protection
**Status:** ✅ Uses the `escapeHtml()` function - good!
### 15. CSRF Protection
**Status:** ⚠️ POST/DELETE endpoints lack CSRF tokens (unless FastAPI provides them by default)
### 16. File Upload Validation
**Problem:** When email attachments are downloaded and re-uploaded
**Suggestion:** Verify MIME type and file size
## 📋 Prioritized Action Plan
### Immediate (Critical):
1. ✅ Fix type errors in the mark-processed endpoint
2. ✅ Fix missing null checks in the frontend
3. ✅ Add loading state and race condition fix
4. ⏳ Rename the duplicate `delete_email` function
5. ⏳ Fix the missing `classify_email` method
### Short Term (This Week):
6. Add confirmation to bulk operations
7. Implement search functionality in the backend
8. Add a "Select All" checkbox
9. Add a "Processed" filter tab
10. Add database indexes
### Medium Term (Next Sprint):
11. Implement pagination
12. Add auto-refresh pause when an email is open
13. Improve keyboard navigation
14. Add CSRF protection
### Long Term (Nice to Have):
15. Toast notifications instead of alerts
16. Drag-and-drop for email organization
17. Email templates/quick replies
18. Advanced search (date ranges, multiple filters)
## 🧪 Missing Test Cases
1. Bulk operations with large datasets (>100 emails)
2. Concurrent access (2 users viewing the same email)
3. Attachment download timeout handling
4. Emails with missing/corrupt data
5. Unicode/emoji in email subjects
6. Very long email subjects (>500 chars)
7. Emails without a sender name
## 📊 Metrics & Monitoring
**Recommendation:** Add:
- Email processing time metrics
- Classification accuracy tracking
- Failed attachment download counter
- User action analytics (which features are used the most?)
---
**Overall Assessment:**
- The system works at a basic level ✅
- Several critical errors have now been fixed ✅
- Good architecture with separation between router/service layers ✅
- Lacks polish and error handling on edge cases ⚠️
- Performance is acceptable for <1000 emails
**Recommended next step:** Implement the 5 immediate fixes and test thoroughly before deploying.

@ -1,253 +0,0 @@
# Email Workflow System - Complete Implementation
## Overview
A fully automated workflow system for BMC Hub that executes actions based on email classification.
## Architecture
### Database Schema
**3 main tables** (migration 014):
- `email_workflows` - Workflow definitions with triggers and action steps
- `email_workflow_executions` - Log of all executed workflows
- `email_workflow_actions` - Catalog of available actions
### Workflow Flow
```
Email received → Classification (AI/keyword) → Find matching workflows → Execute action steps → Log result
```
### Workflow Trigger Conditions
- `classification_trigger` - Which category triggers the workflow (invoice, bankruptcy, etc.)
- `confidence_threshold` - Minimum AI confidence score (default: 0.70)
- `sender_pattern` - Regex match on the sender email (optional)
- `subject_pattern` - Regex match on the subject (optional)
- `priority` - Lower number = higher priority
- `stop_on_match` - Stop after this workflow if it succeeds
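The trigger conditions above can be sketched as a matching function. The field names come from the schema in this document; the matching logic itself is an assumption about how the service evaluates them:

```python
import re

def workflow_matches(workflow: dict, email: dict) -> bool:
    """Check a workflow's trigger conditions against a classified email."""
    if workflow.get('classification_trigger') != email.get('classification'):
        return False
    if email.get('confidence_score', 0) < workflow.get('confidence_threshold', 0.70):
        return False
    # Optional regex patterns on sender and subject
    for pattern_key, field in (('sender_pattern', 'sender_email'),
                               ('subject_pattern', 'subject')):
        pattern = workflow.get(pattern_key)
        if pattern and not re.search(pattern, email.get(field, '')):
            return False
    return True
```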
## Available Actions
### Ticket/Case Actions
- `create_ticket` - Create a support ticket/case
- `create_time_entry` - Create a time entry
### Linking Actions
- `link_to_vendor` - Link email to a vendor (matched by email)
- `link_to_customer` - Link email to a customer
### Extraction Actions
- `extract_invoice_data` - Extract invoice number, amount, due date
- `extract_tracking_number` - Find tracking numbers (PostNord, GLS, DAO)
### Notification Actions
- `send_slack_notification` - Send a message to a Slack channel
- `send_email_notification` - Send an email notification
### Status Actions
- `mark_as_processed` - Mark the email as processed
- `flag_for_review` - Flag for manual review
### Control Flow
- `conditional_branch` - Conditional logic based on data
## Pre-Configured Workflows
### 1. Invoice Processing (Priority 10)
**Trigger**: `classification = 'invoice'` AND `confidence >= 0.70`
**Steps**:
1. Link to vendor based on the sender email
2. Extract invoice data (number, amount, due date)
3. Create a billing ticket
4. Send a Slack notification to #invoices
5. Mark as processed
### 2. Time Confirmation (Priority 20)
**Trigger**: `classification = 'time_confirmation'` AND `confidence >= 0.65`
**Steps**:
1. Link to customer based on the email domain
2. Create a time entry
3. Send an email notification to the admin
4. Mark as processed
### 3. Freight Note Processing (Priority 30)
**Trigger**: `classification = 'freight_note'` AND `confidence >= 0.70`
**Steps**:
1. Extract the tracking number
2. Link to vendor
3. Create a hardware shipment ticket
4. Mark as processed
### 4. Bankruptcy Alert (Priority 5) 🚨
**Trigger**: `classification = 'bankruptcy'` AND `confidence >= 0.80`
**Steps**:
1. Flag for manual review
2. Send a Slack alert to #alerts
3. Send an email to the admin
4. Mark as flagged (not processed)
### 5. Low Confidence Review (Priority 90)
**Trigger**: `classification = 'general'` AND `confidence >= 0.50`
**Steps**:
1. Flag for manual review (low confidence)
## API Endpoints
### Workflows CRUD
```bash
GET    /api/v1/workflows               # List all workflows
GET    /api/v1/workflows/{id}          # Get a specific workflow
POST   /api/v1/workflows               # Create a new workflow
PUT    /api/v1/workflows/{id}          # Update a workflow
DELETE /api/v1/workflows/{id}          # Delete a workflow
POST   /api/v1/workflows/{id}/toggle   # Enable/disable a workflow
```
### Workflow Execution
```bash
POST /api/v1/emails/{email_id}/execute-workflows   # Execute workflows for an email (manual)
GET  /api/v1/workflow-executions                   # Get execution history
GET  /api/v1/workflow-executions?workflow_id=1     # Filter by workflow
GET  /api/v1/workflow-executions?email_id=163      # Filter by email
```
### Workflow Actions & Stats
```bash
GET /api/v1/workflow-actions          # List available actions
GET /api/v1/workflows/stats/summary   # Workflow statistics (success rate, etc.)
```
## Automatic Execution
Workflows are executed **automatically** when emails are classified by `EmailProcessorService`:
```python
# In email_processor_service.py after classification:
workflow_result = await email_workflow_service.execute_workflows(email_data)
```
**Activated by**:
- `EMAIL_WORKFLOWS_ENABLED=true` in .env (default: true)
- The email must have a classification and confidence_score
- At least one workflow must match the conditions
## Test Example
### Manual workflow execution
```bash
# Execute workflows for a bankruptcy email (ID 163)
curl -X POST http://localhost:8001/api/v1/emails/163/execute-workflows
# Response:
{
"status": "executed",
"workflows_executed": 1,
"workflows_succeeded": 1,
"details": [{
"workflow_id": 4,
"workflow_name": "Bankruptcy Alert",
"status": "completed",
"steps_completed": 4,
"execution_time_ms": 5
}]
}
```
### Check workflow stats
```bash
curl http://localhost:8001/api/v1/workflows/stats/summary | jq '.[] | {name, execution_count, success_rate}'
```
## Workflow JSON Format
### Workflow Steps Structure
```json
{
"workflow_steps": [
{
"action": "link_to_vendor",
"params": {
"match_by": "email",
"auto_create": false
}
},
{
"action": "extract_invoice_data",
"params": {
"ai_provider": "ollama",
"fallback_to_regex": true
}
},
{
"action": "mark_as_processed",
"params": {
"status": "processed"
}
}
]
}
```
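A sketch of how `workflow_steps` could be validated before saving a workflow. The known-action list is taken from the "Available Actions" catalog in this document; the validator itself is an assumption, not shipped code:

```python
KNOWN_ACTIONS = {
    'create_ticket', 'create_time_entry', 'link_to_vendor', 'link_to_customer',
    'extract_invoice_data', 'extract_tracking_number',
    'send_slack_notification', 'send_email_notification',
    'mark_as_processed', 'flag_for_review', 'conditional_branch',
}

def validate_steps(steps: list) -> list:
    """Return a list of problems; an empty list means the steps look valid."""
    problems = []
    for i, step in enumerate(steps):
        if not isinstance(step, dict) or 'action' not in step:
            problems.append(f"step {i}: missing 'action'")
        elif step['action'] not in KNOWN_ACTIONS:
            problems.append(f"step {i}: unknown action {step['action']!r}")
        if isinstance(step, dict) and not isinstance(step.get('params', {}), dict):
            problems.append(f"step {i}: 'params' must be an object")
    return problems
```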
## Future Extensions
### Actions missing implementation
Several actions are defined but only partially implemented:
- `create_ticket` - Requires integration with the case/ticket system
- `create_time_entry` - Requires integration with time tracking
- `send_slack_notification` - Requires Slack webhook setup
- `extract_invoice_data` - Requires the AI extraction service
### New Action Types
- `update_customer_data` - Update customer info from the email
- `create_purchase_order` - Create a purchase order
- `send_sms_notification` - SMS alert on critical events
- `create_calendar_event` - Book a meeting based on the email
## Monitoring
### Statistics view
```sql
SELECT * FROM v_workflow_stats;
-- Shows: execution_count, success_rate, pending_executions
```
### Execution Log
```sql
SELECT * FROM email_workflow_executions
WHERE status = 'failed'
ORDER BY started_at DESC;
```
## Security & Best Practices
1. **Start workflows disabled** - Test manually first with `/execute-workflows`
2. **Set confidence_threshold sensibly** - Too low = many false positives
3. **Use stop_on_match** for critical workflows (bankruptcy alert)
4. **Log everything** - email_workflow_executions tracks every step
5. **Graceful degradation** - Steps fail individually, the workflow continues
## Configuration
### Environment Variables
```env
EMAIL_WORKFLOWS_ENABLED=true          # Enable/disable the entire system
EMAIL_AI_CONFIDENCE_THRESHOLD=0.7     # Default threshold for auto-processing
EMAIL_TO_TICKET_ENABLED=true          # Enable the email processing pipeline
```
## Status: ✅ Production Ready
The system is fully implemented and tested:
- ✅ Database schema created
- ✅ Workflow service implemented
- ✅ API endpoints available
- ✅ Integration with the email processor
- ✅ Pre-configured workflows
- ✅ Execution logging
- ✅ Tests confirmed successful
**Next step**: A frontend UI for managing workflows visually.

@ -97,72 +97,6 @@ This creates:
## Database Isolation
### Customer Linking Pattern
**IMPORTANT ARCHITECTURE DECISION:**
The core `customers` table is the **single source of truth** for finance, invoicing, and CVR.
#### When should modules have their own customer table?
**✅ Create a module-specific customer table if:**
- The module syncs from an external system (vTiger, CRM, etc.)
- Many module-specific fields are needed (external_id, sync_status, etc.)
- Custom lifecycle/workflow
- **Example:** `tmodule_customers` (synced from vTiger)
**🚫 Use a direct `customers.id` foreign key if:**
- Simple relations without external sync
- Few or no custom fields
- Standard workflow
- **Example:** `ticket_system` → direct FK to `customers.id`
#### If a module customer table is created:
**It MUST always have:**
```sql
hub_customer_id INTEGER REFERENCES customers(id)  -- Link to core
```
**It MUST have an auto-link trigger:**
```sql
CREATE OR REPLACE FUNCTION auto_link_{module}_customer()
RETURNS TRIGGER AS $$
DECLARE
    matched_hub_id INTEGER;
BEGIN
    IF NEW.hub_customer_id IS NOT NULL THEN
        RETURN NEW;
    END IF;

    SELECT id INTO matched_hub_id
    FROM customers
    WHERE LOWER(TRIM(name)) = LOWER(TRIM(NEW.name))
    LIMIT 1;

    IF matched_hub_id IS NOT NULL THEN
        NEW.hub_customer_id := matched_hub_id;
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trigger_auto_link_{module}_customer
BEFORE INSERT OR UPDATE OF name
ON {module}_customers
FOR EACH ROW
EXECUTE FUNCTION auto_link_{module}_customer();
```
**Why this pattern?**
- ✅ E-conomic export works automatically (via hub_customer_id → customers.economic_customer_number)
- ✅ Billing integration is automatic
- ✅ No manual linking needed
- ✅ Newly synced customers are linked automatically
See `migrations/028_auto_link_tmodule_customers.sql` for a live example.
### Table Prefix Pattern
All tables MUST use the prefix from `module.json`:

@ -1,128 +0,0 @@
# Simply-CRM Integration Setup
## Status
⚠️ **Simply-CRM credentials not configured** - Sales orders from the old system are not shown
## What is Simply-CRM?
Simply-CRM is a **separate CRM system** (a VTiger fork) used to fetch **historical sales orders** with `recurring_frequency`.
⚠️ **Important:** Simply-CRM is NOT the same as:
- vTiger Cloud (https://bmcnetworks.od2.vtiger.com)
- The old on-premise vTiger (http://crm.bmcnetworks.dk)
Simply-CRM has its own URL, credentials, and API endpoint (`/webservice.php`).
## Why is no Simply-CRM data shown?
3 reasons:
1. ⚠️ **OLD_VTIGER_URL, OLD_VTIGER_USERNAME, OLD_VTIGER_API_KEY are empty** in the `.env` file
2. The code looks for `OLD_VTIGER_API_KEY` but cannot find the credentials
3. The server is reachable (301 response), but authentication is missing
## How to find the credentials
### Option 1: Hvis I stadig bruger det gamle system
1. **Log ind på Simply-CRM:**
- URL: http://crm.bmcnetworks.dk
- Brug din normale bruger
2. **Find Access Key:**
- Gå til **Settings** (tandhjul øverst til højre)
- Klik på **My Preferences**
- Under **Webservices** vil du se din **Access Key**
- Kopier access key'en
3. **Tilføj til .env:**
```bash
# Simply-CRM (separate system)
SIMPLYCRM_URL=http://your-simplycrm-server.com
SIMPLYCRM_USERNAME=your_email@domain.dk
SIMPLYCRM_API_KEY=your_access_key_here
# OR, if it is the same as the old vTiger (fallback):
OLD_VTIGER_URL=http://crm.bmcnetworks.dk
OLD_VTIGER_USERNAME=your_email@bmcnetworks.dk
OLD_VTIGER_API_KEY=your_access_key_here
```
4. **Restart the API:**
```bash
docker restart bmc-hub-api
```
### Option 2: If you no longer use the old system
If all customers have been migrated to vTiger Cloud and Simply-CRM is no longer used:
1. **Comment out the lines in .env:**
```bash
# OLD_VTIGER_URL=
# OLD_VTIGER_USERNAME=
# OLD_VTIGER_API_KEY=
```
2. Simply-CRM will automatically be skipped, and only the following are shown:
- vTiger Cloud subscriptions ✅
- BMC Office subscriptions ✅
## Test After Setup
```bash
# Test with a customer
curl http://localhost:8001/api/v1/customers/327/subscriptions | jq '.sales_orders | length'
# Check logs
docker logs bmc-hub-api --tail=30 | grep -i simply
```
## What does Simply-CRM fetch?
The code fetches **only sales orders with `recurring_frequency`** - that is, subscriptions:
```sql
SELECT * FROM SalesOrder
WHERE account_id='<kunde_id>'
AND recurring_frequency IS NOT NULL
AND sostatus NOT IN ('closed', 'cancelled')
```
For each order, the code:
- Fetches the line items (products)
- Groups them by order ID
- Shows them in the "Salgsordre" (sales orders) section on the customer page
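The grouping step above can be sketched in Python (the order and line-item dict shapes are illustrative; the real logic lives in the Simply-CRM client code):

```python
from collections import defaultdict

def group_line_items(line_items):
    """Group Simply-CRM line items by their parent sales order ID."""
    grouped = defaultdict(list)
    for item in line_items:
        grouped[item["order_id"]].append(item["product_name"])
    return dict(grouped)

items = [
    {"order_id": "SO-1", "product_name": "Backup"},
    {"order_id": "SO-1", "product_name": "Hosting"},
    {"order_id": "SO-2", "product_name": "Support"},
]
print(group_line_items(items))
# → {'SO-1': ['Backup', 'Hosting'], 'SO-2': ['Support']}
```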
## Why does this matter?
Without Simply-CRM credentials you cannot see:
- Old subscriptions created before the cloud migration
- Historical recurring orders
- Customers that still have active orders in the old system
**However**, you can still see:
- ✅ vTiger Cloud subscriptions
- ✅ BMC Office subscriptions
- ✅ New vTiger Cloud sales orders
## Current Status
```
✅ vTiger Cloud - Working (2 subscriptions for Maskinsikkerhed)
✅ BMC Office - Working (16 subscriptions for Maskinsikkerhed)
⚠️ Simply-CRM - Missing credentials
```
## Troubleshooting
### "Simply-CRM credentials not configured"
→ Add the OLD_VTIGER_* settings to `.env` and restart
### "Not logged in to Simply-CRM"
→ The access key is wrong or has expired
### "No Simply-CRM account found for 'Customer Name'"
→ The customer name does not match exactly between systems (vTiger Cloud vs Simply-CRM)
### Server timeout
→ Check that `http://crm.bmcnetworks.dk` is reachable from the Docker container

# Ticket System - Email Integration
## Overview
The ticket system integrates with BMC Hub's email workflow engine to automatically create tickets from incoming emails. Uses **Option B: Message-ID Threading** for robust email-to-ticket linkage.
## Architecture
```
Incoming Email → Email Classifier → Email Workflow Engine → Ticket Integration
                                                                  ↓
                                                        create_ticket action
                                                                  ↓
                                    EmailTicketIntegration.process_email_for_ticket()
                                                  ┌───────────────┴───────────────┐
                                                  ↓                               ↓
                                             New Ticket                 Reply to Existing
                                         (creates TKT-XXX)          (links via Message-ID)
```
## Email Threading Logic
### Message-ID Based Threading (Option B)
The system uses email headers to detect if an email is part of an existing ticket thread:
1. **In-Reply-To Header**: Check if contains `TKT-YYYYMMDD-XXX` pattern
2. **References Header**: Check all message IDs in chain for ticket number
3. **Subject Line**: Check for `[TKT-YYYYMMDD-XXX]` or `Re: TKT-YYYYMMDD-XXX`
If ticket number found → **Link to existing ticket**
If NOT found → **Create new ticket**
### Ticket Number Format
Pattern: `TKT-YYYYMMDD-XXX`
- `TKT`: Prefix
- `YYYYMMDD`: Date (e.g., 20251215)
- `XXX`: Sequential number (001-999)
Example: `TKT-20251215-001`
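The detection order above can be sketched as a small matcher (a simplified illustration of the threading logic, not the actual `EmailTicketIntegration` code):

```python
import re

# TKT-YYYYMMDD-XXX, e.g. TKT-20251215-001
TICKET_RE = re.compile(r"TKT-\d{8}-\d{3}")

def find_ticket_number(in_reply_to="", references="", subject=""):
    """Return the first ticket number found, checking headers before subject."""
    for field in (in_reply_to, references, subject):
        match = TICKET_RE.search(field or "")
        if match:
            return match.group(0)
    return None  # no match → a new ticket will be created

print(find_ticket_number(subject="Re: TKT-20251215-001"))  # → TKT-20251215-001
print(find_ticket_number(subject="Re: Ticket 123"))        # → None
```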
## Workflow Actions
### 1. `create_ticket`
Creates new ticket OR links to existing ticket (smart routing).
**Workflow Definition:**
```json
{
"name": "Support Request → Ticket",
"classification_trigger": "support_request",
"workflow_steps": [
{
"action": "create_ticket",
"params": {
"customer_id": 123,
"assigned_to_user_id": 5
}
}
]
}
```
**Parameters:**
- `customer_id` (optional): BMC customer ID to link ticket to
- `assigned_to_user_id` (optional): User ID to assign ticket to
- `priority` (optional): Override priority detection (low/normal/high/critical)
**Returns:**
```json
{
"ticket_id": 42,
"ticket_number": "TKT-20251215-001",
"created": true,
"linked": false
}
```
### 2. `link_email_to_ticket`
Explicitly link email to a specific ticket (manual routing).
**Workflow Definition:**
```json
{
"action": "link_email_to_ticket",
"params": {
"ticket_number": "TKT-20251215-001"
}
}
```
**Parameters:**
- `ticket_number` (required): Target ticket number
**Returns:**
```json
{
"ticket_id": 42,
"ticket_number": "TKT-20251215-001",
"linked": true
}
```
## Automatic Features
### Tag Extraction
Extracts `#hashtags` from email body and adds as ticket tags:
```
Email body: "We need help with #billing #urgent issue"
→ Ticket tags: ["billing", "urgent"]
```
- Max 10 tags per ticket
- Minimum 3 characters
- Converted to lowercase
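The three rules above can be sketched as a small extractor (deduplication of repeated tags is an assumption here, not stated by the source):

```python
import re

def extract_tags(body, max_tags=10, min_len=3):
    """Extract #hashtags: lowercase, minimum length, capped count."""
    tags = []
    for tag in re.findall(r"#(\w+)", body):
        tag = tag.lower()
        if len(tag) >= min_len and tag not in tags:  # dedup is an assumption
            tags.append(tag)
    return tags[:max_tags]

print(extract_tags("We need help with #billing #urgent issue"))
# → ['billing', 'urgent']
```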
### Priority Detection
Automatically determines ticket priority from email content:
**Critical**: kritisk, critical, down, nede, urgent, akut
**High**: høj, high, vigtig, important, haster
**Low**: lav, low, spørgsmål, question, info
**Normal**: Everything else (default)
Checks both subject line and extracted tags.
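The keyword tables above can be sketched as a detector (plain substring matching and the most-severe-first ordering are simplifying assumptions about the real implementation):

```python
PRIORITY_KEYWORDS = {
    "critical": ["kritisk", "critical", "down", "nede", "urgent", "akut"],
    "high": ["høj", "high", "vigtig", "important", "haster"],
    "low": ["lav", "low", "spørgsmål", "question", "info"],
}

def detect_priority(subject, tags=()):
    """Check subject and tags against the keyword lists, most severe first."""
    text = subject.lower() + " " + " ".join(tags).lower()
    for priority in ("critical", "high", "low"):
        if any(word in text for word in PRIORITY_KEYWORDS[priority]):
            return priority
    return "normal"  # default when no keyword matches

print(detect_priority("URGENT: Server down!"))  # → critical
```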
### Email Metadata
Stores rich metadata in ticket:
- **Description**: Formatted with sender email, received date, original body
- **Custom Fields**: `email_from`, `email_message_id`, `created_from_email`
- **Email Log**: Full threading data in `tticket_email_log` table
## Database Storage
### `tticket_email_log` Table
Stores all email-ticket linkages for audit trail:
```sql
CREATE TABLE tticket_email_log (
id SERIAL PRIMARY KEY,
ticket_id INTEGER NOT NULL REFERENCES tticket_tickets(id),
email_message_id TEXT NOT NULL,
email_subject TEXT,
email_from TEXT NOT NULL,
email_received_at TIMESTAMP,
is_reply BOOLEAN DEFAULT FALSE,
thread_data JSONB,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```
**`thread_data` JSONB structure:**
```json
{
"message_id": "<abc123@example.com>",
"in_reply_to": "<TKT-20251215-001@bmcnetworks.dk>",
"references": "<ref1@example.com> <ref2@example.com>"
}
```
## Usage Examples
### Example 1: Basic Support Request
**Incoming Email:**
```
From: customer@example.com
Subject: Help with network issue
Body: Our internet is down. Can you help?
```
**Workflow Configuration:**
```json
{
"name": "Support Email → Ticket",
"classification_trigger": "support_request",
"confidence_threshold": 0.7,
"workflow_steps": [
{
"action": "create_ticket",
"params": {
"assigned_to_user_id": 5
}
}
]
}
```
**Result:**
- Creates ticket `TKT-20251215-001`
- Priority: `normal` (no urgent keywords)
- Tags: [] (no hashtags found)
- Assigned to user 5
- Email logged in `tticket_email_log`
### Example 2: Email Reply to Ticket
**Incoming Email:**
```
From: customer@example.com
Subject: Re: TKT-20251215-001
In-Reply-To: <TKT-20251215-001@bmcnetworks.dk>
Body: Thanks, issue is resolved now
```
**Result:**
- **Detects existing ticket** via In-Reply-To header
- **Adds comment** to ticket `TKT-20251215-001`
- Does NOT create new ticket
- Logs as reply (`is_reply=true`)
### Example 3: Tagged Urgent Request
**Incoming Email:**
```
From: vip@example.com
Subject: URGENT: Server down!
Body: Production server is down #critical #server
```
**Result:**
- Creates ticket with priority `critical` (subject keyword)
- Tags: `["critical", "server"]`
- Custom field: `created_from_email=true`
## API Endpoints
### Get Ticket Email Thread
```http
GET /api/v1/tickets/{ticket_id}/emails
```
Returns chronological list of all emails linked to ticket.
**Response:**
```json
{
"ticket_id": 42,
"ticket_number": "TKT-20251215-001",
"emails": [
{
"id": 1,
"email_message_id": "<abc123@example.com>",
"email_from": "customer@example.com",
"email_subject": "Help with issue",
"email_received_at": "2025-12-15T10:00:00Z",
"is_reply": false,
"created_at": "2025-12-15T10:01:00Z"
},
{
"id": 2,
"email_message_id": "<def456@example.com>",
"email_from": "customer@example.com",
"email_subject": "Re: TKT-20251215-001",
"email_received_at": "2025-12-15T11:00:00Z",
"is_reply": true,
"created_at": "2025-12-15T11:01:00Z"
}
]
}
```
### Find Tickets by Email Address
```http
GET /api/v1/tickets/by-email/{email_address}
```
Returns all tickets associated with an email address.
**Response:**
```json
{
"email_address": "customer@example.com",
"tickets": [
{
"id": 42,
"ticket_number": "TKT-20251215-001",
"subject": "Network issue",
"status": "open",
"created_at": "2025-12-15T10:00:00Z"
}
]
}
```
## Configuration
### Settings (`.env`)
```bash
# Ticket system enabled
TICKET_ENABLED=true
# Email integration enabled (requires email system)
TICKET_EMAIL_INTEGRATION=true
# Auto-assign new tickets (requires user_id)
TICKET_AUTO_ASSIGN=false
TICKET_DEFAULT_ASSIGNEE_ID=5
# Default priority for new tickets
TICKET_DEFAULT_PRIORITY=normal
```
## Testing
### Test Email-to-Ticket Creation
```bash
curl -X POST http://localhost:8001/api/v1/test/email-to-ticket \
-H "Content-Type: application/json" \
-d '{
"email_data": {
"message_id": "<test123@example.com>",
"subject": "Test ticket",
"from_address": "test@example.com",
"body": "This is a test #testing",
"received_at": "2025-12-15T10:00:00Z"
}
}'
```
### Test Email Reply Linking
```bash
curl -X POST http://localhost:8001/api/v1/test/email-to-ticket \
-H "Content-Type: application/json" \
-d '{
"email_data": {
"message_id": "<reply456@example.com>",
"subject": "Re: TKT-20251215-001",
"from_address": "test@example.com",
"body": "Reply to existing ticket",
"received_at": "2025-12-15T11:00:00Z",
"in_reply_to": "<TKT-20251215-001@bmcnetworks.dk>"
}
}'
```
## Troubleshooting
### Tickets Not Created
**Check:**
1. Email workflow engine enabled (`EMAIL_WORKFLOWS_ENABLED=true`)
2. Workflow exists for classification trigger
3. Confidence threshold met
4. Workflow action is `create_ticket` (the integrated action, not an old stub of the same name)
**Debug:**
```sql
-- Check workflow executions
SELECT * FROM email_workflow_executions
WHERE status = 'failed'
ORDER BY created_at DESC
LIMIT 10;
```
### Email Not Linked to Ticket
**Check:**
1. Ticket number format correct (`TKT-YYYYMMDD-XXX`)
2. Ticket exists in database
3. Email headers contain ticket number (In-Reply-To, References, Subject)
**Debug:**
```sql
-- Check email logs
SELECT * FROM tticket_email_log
WHERE ticket_id = 42
ORDER BY created_at DESC;
```
### Duplicate Tickets Created
**Check:**
1. Email reply headers missing ticket number
2. Subject line doesn't match pattern (e.g., `Re: Ticket 123` instead of `Re: TKT-20251215-001`)
**Solution:**
- Ensure outgoing ticket emails include ticket number in subject: `[TKT-20251215-001]`
- Add ticket number to Message-ID: `<TKT-20251215-001-reply@bmcnetworks.dk>`
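Those two recommendations can be sketched as a small helper for outgoing mail (the subject text and the `-reply` suffix are illustrative, not a documented format):

```python
def build_reply_headers(ticket_number, domain="bmcnetworks.dk"):
    """Build outgoing headers so customer replies thread back to the right ticket."""
    return {
        # Subject carries the ticket number in brackets for subject-line matching
        "Subject": f"[{ticket_number}] Re: your request",
        # Message-ID carries it for In-Reply-To/References matching
        "Message-ID": f"<{ticket_number}-reply@{domain}>",
    }

print(build_reply_headers("TKT-20251215-001"))
```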
## Best Practices
1. **Include Ticket Number in Replies**: Always include `[TKT-YYYYMMDD-XXX]` in subject line
2. **Use Message-ID with Ticket Number**: Format: `<TKT-YYYYMMDD-XXX@bmcnetworks.dk>`
3. **Set Customer ID in Workflow**: Improves ticket organization and reporting
4. **Monitor Workflow Executions**: Check `email_workflow_executions` table regularly
5. **Review Failed Actions**: Alert on repeated workflow failures
## Security Considerations
1. **Limited Email Storage**: The raw email is not stored; only metadata plus the body copied into the ticket description
2. **Sender Validation**: Consider implementing sender verification (SPF/DKIM)
3. **Spam Prevention**: Email classifier should filter spam before workflow execution
4. **Customer Isolation**: Ensure `customer_id` properly set to prevent data leakage
## Future Enhancements
- **Attachment Handling**: Link email attachments to ticket attachments
- **Email Templates**: Auto-reply with ticket number
- **SLA Integration**: Start SLA timer on ticket creation from email
- **Multi-Ticket Threading**: Support one email creating multiple tickets
- **Smart Customer Detection**: Auto-detect customer from sender domain
## Related Documentation
- [Ticket System Architecture](TICKET_SYSTEM_ARCHITECTURE.md)
- [Email Workflow System](EMAIL_WORKFLOWS.md)
- [Database Schema](../migrations/025_ticket_module.sql)

# vTiger & Simply-CRM Integration Setup
## Status
✅ **BMC Office Subscriptions** - Now working! (fix applied: changed `execute_query_single` to `execute_query`)
⚠️ **vTiger Cloud Subscriptions** - Requires credentials
⚠️ **Simply-CRM Sales Orders** - Requires credentials
## Problem
The "Abonnementer & Salgsordre" (Subscriptions & Sales Orders) tab showed no data from vTiger and Simply-CRM because:
1. **The BMC Office query used `execute_query_single()`** - it returned only 1 row instead of all rows
2. **vTiger Cloud credentials are missing** - VTIGER_URL, VTIGER_USERNAME, VTIGER_ACCESS_KEY
3. **Simply-CRM credentials are missing** - OLD_VTIGER_URL, OLD_VTIGER_USERNAME, OLD_VTIGER_ACCESS_KEY
## Solution
### 1. BMC Office Subscriptions (✅ Fixed)
Changed from `execute_query_single()` to `execute_query()` in `app/customers/backend/router.py` line 554.
All BMC Office subscriptions are now shown correctly:
- Customer 327 (Maskinsikkerhed): 16 subscriptions
- Customer 372 (Norva24 Danmark A/S): 12 subscriptions
### 2. vTiger Cloud Integration (⚠️ Requires Credentials)
Add the following to the `.env` file:
```bash
# vTiger Cloud Integration
VTIGER_URL=https://bmcnetworks.od2.vtiger.com
VTIGER_USERNAME=your_vtiger_user
VTIGER_ACCESS_KEY=your_vtiger_access_key
```
**How to find the credentials:**
1. Log in to vTiger Cloud (https://bmcnetworks.od2.vtiger.com)
2. Go to **Settings** (gear icon at the top right)
3. Choose **Integration** → **Webservices**
4. Copy the **Access Key** for your user
5. The username is your vTiger login email
### 3. Simply-CRM / Old vTiger Integration (⚠️ Requires Credentials)
If you still use the old on-premise vTiger installation:
```bash
# Simply-CRM (Old vTiger On-Premise)
OLD_VTIGER_URL=http://crm.bmcnetworks.dk
OLD_VTIGER_USERNAME=your_old_vtiger_user
OLD_VTIGER_ACCESS_KEY=your_old_access_key
```
**Note:** Simply-CRM is used to fetch sales orders with `recurring_frequency` from the old system.
## Test After Setup
1. Restart the API container:
```bash
docker restart bmc-hub-api
```
2. Test a customer with a vTiger ID:
```bash
curl http://localhost:8001/api/v1/customers/39/subscriptions | jq
```
3. Check the logs for errors:
```bash
docker logs bmc-hub-api --tail=50 | grep -i "vtiger\|simply"
```
## Expected Output
With credentials configured you should see:
```json
{
"status": "success",
"recurring_orders": [...], // vTiger recurring sales orders
"sales_orders": [...], // Simply-CRM orders med recurring_frequency
"subscriptions": [...], // vTiger Subscriptions module
"expired_subscriptions": [...], // Expired/cancelled subscriptions
"bmc_office_subscriptions": [...] // Local BMC Office subscriptions (✅ works now)
}
```
## Frontend Display
The "Abonnementer & Salgsordre" (Subscriptions & Sales Orders) tab now shows 3 sections:
1. **vTiger Subscriptions** - Subscriptions module data with a lock/unlock function
2. **BMC Office Subscriptions** - Local subscriptions (✅ working)
3. **Overall overview** - The stats cards at the top
## Troubleshooting
### "VTIGER_URL not configured"
→ Add the credentials to `.env` and restart the container
### "No Simply-CRM account found"
→ The customer does not exist in the old system, or the name does not match exactly
### "Not logged in to Simply-CRM"
→ OLD_VTIGER_ACCESS_KEY is wrong or missing
### BMC Office subscriptions still show no data
→ Check that the container was restarted after the query fix

# Email Workflow System - User Guide
## Overview
The Email Workflow System automates email processing based on AI classification. Workflows consist of multiple steps that run sequentially when an email matches specific criteria.
## 🚀 Quick Start
### 1. Open the Workflow Manager
Click the "Workflow Management" button in the email UI
### 2. Create a Workflow
Three ways:
- **From Template**: Click "Templates" and pick a prebuilt workflow
- **From Scratch**: Click "Ny Workflow" and build it yourself
- **Duplicate Existing**: Click "Dupliker" on an existing workflow
### 3. Configure the Workflow
#### Basic Information
- **Name**: A descriptive name (e.g. "Vendor Invoice Processing")
- **Description**: The detailed purpose
- **Classification Trigger**: Which email type triggers the workflow
- **Confidence Threshold**: Minimum AI confidence (0.60-0.80 recommended)
- **Priority**: Lower number = higher priority (10 = very high)
#### Workflow Steps
Add actions in the order they should run:
1. Click "Tilføj Step"
2. Pick the action type from the dropdown
3. Configure the action parameters
4. Use drag-and-drop to change the order
### 4. Test the Workflow
Click "Test Workflow" before saving to verify that it works
### 5. Save and Activate
Click "Gem" and make sure the enable switch is on
## 📋 Available Actions
### Linking Actions
Link emails to existing data:
```json
{
"action": "link_to_vendor",
"params": { "match_by": "email" }
}
```
```json
{
"action": "link_to_customer",
"params": {}
}
```
### Data Extraction
Ekstraher data fra emails/attachments:
```json
{
"action": "extract_invoice_data",
"params": {}
}
```
```json
{
"action": "extract_tracking_number",
"params": { "pattern": "CC[0-9]{4}" }
}
```
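The `pattern` parameter above is applied to the email text; a minimal sketch of that extraction (function name is illustrative, not the engine's internal API):

```python
import re

def extract_tracking_number(text, pattern=r"CC[0-9]{4}"):
    """Apply the workflow's `pattern` param to the email text."""
    match = re.search(pattern, text)
    return match.group(0) if match else None

print(extract_tracking_number("Time confirmed for case CC1234 today"))  # → CC1234
```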
### Ticket/Case Creation
Create tickets automatically:
```json
{
"action": "create_ticket",
"params": {
"module": "support_cases",
"priority": "normal"
}
}
```
```json
{
"action": "create_time_entry",
"params": {}
}
```
### Notifications
Send notifications:
```json
{
"action": "send_slack_notification",
"params": {
"channel": "#alerts",
"message": "🚨 Vigtig email modtaget"
}
}
```
### Processing Control
Control email status:
```json
{
"action": "mark_as_processed",
"params": {}
}
```
```json
{
"action": "flag_for_review",
"params": { "reason": "needs_manual_review" }
}
```
## 🎯 Workflow Templates
### Invoice Processing
Automatic handling of vendor invoices:
- Link to vendor
- Extract invoice data
- Mark as processed
**Classification**: `invoice`
**Confidence**: 0.70
**Priority**: 50
### Time Confirmation
Auto-link time confirmations to cases:
- Extract the CC#### number
- Create a ticket
- Mark as processed
**Classification**: `time_confirmation`
**Confidence**: 0.60
**Priority**: 30
### Spam Handler
Handle spam emails:
- Flag for review
- Mark as processed
**Classification**: `spam`
**Confidence**: 0.80
**Priority**: 10
### Freight Note
Process freight notes:
- Extract the tracking number
- Link to customer
- Mark as processed
**Classification**: `freight_note`
**Confidence**: 0.65
**Priority**: 40
### Bankruptcy Alert
Alert on bankruptcy emails:
- Send a Slack notification
- Flag for review
- Mark as processed
**Classification**: `bankruptcy`
**Confidence**: 0.75
**Priority**: 5
### Low Confidence Review
Catch emails with low confidence:
- Flag for manual review
**Classification**: `general`
**Confidence**: 0.30
**Priority**: 200
## 🔧 Advanced Features
### Import/Export
**Export**: Download a workflow as a JSON file (share it with other systems)
**Import**: Upload a JSON workflow file
### Duplicate
Copy an existing workflow and adapt it - saves time!
### Test Mode
Test a workflow on a specific email without saving changes
### JSON Editor
Switch to the JSON view for direct editing of the workflow configuration
### Drag & Drop
Rearrange workflow steps by dragging them
## 📊 Monitoring
### Execution History
See all workflow executions with:
- Status (completed/failed)
- Execution time
- Steps completed
- Error messages
### Statistics
Per-workflow stats:
- Total executions
- Success rate
- Failure count
- Last executed
### Debugging
If a workflow fails:
1. Check the execution history for the error message
2. Verify the action parameters
3. Test the workflow on a sample email
4. Check the confidence threshold
## ⚙️ Best Practices
### Prioritization
- **1-20**: Critical workflows (bankruptcy alerts, spam)
- **21-50**: Common workflows (invoices, time confirmations)
- **51-100**: Low-priority workflows
- **100+**: Catch-all workflows
### Confidence Thresholds
- **0.80+**: Only very certain matches
- **0.70-0.80**: Standard range (recommended)
- **0.60-0.70**: Accept more uncertainty
- **<0.60**: Only for catch-all workflows
### Workflow Design
1. **Start with linking**: Link to vendor/customer first
2. **Extract data**: Pull data from attachments
3. **Create tickets**: Create tickets if needed
4. **Notify**: Send notifications
5. **Cleanup**: Mark as processed last
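Following that ordering, a complete workflow definition might look like this (illustrative values; the action names match the reference earlier in this guide):

```json
{
  "name": "Vendor Invoice Processing",
  "classification_trigger": "invoice",
  "confidence_threshold": 0.70,
  "priority": 50,
  "workflow_steps": [
    { "action": "link_to_vendor", "params": { "match_by": "email" } },
    { "action": "extract_invoice_data", "params": {} },
    { "action": "send_slack_notification", "params": { "channel": "#alerts", "message": "Invoice processed" } },
    { "action": "mark_as_processed", "params": {} }
  ]
}
```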
### Testing
- Always test new workflows on sample emails
- Start with `enabled = false` and test thoroughly
- Monitor the execution history for the first few days
- Adjust the confidence threshold based on the results
## 🚨 Common Issues
### The workflow does not run
- Check whether the workflow is enabled
- Verify that the classification trigger matches the email classification
- Check the confidence threshold (too high?)
- See whether a higher-priority workflow has `stop_on_match = true`
### An action fails
- Check the execution history for the error message
- Verify that the action parameters are correct
- Check whether the required data is available (e.g. a PDF attachment)
- Test the action individually
### Duplicated actions
- Workflows with `stop_on_match = true` stop other workflows
- Check whether multiple workflows match the same email
- Use priorities to control the order
## 🔄 Migration from Rules
The Email Rules system is deprecated. Migrate to workflows:
1. Identify the active rules
2. Create a corresponding workflow (use the templates as a starting point)
3. Test the workflow thoroughly
4. Disable the original rule
5. Monitor the execution history
See `/docs/EMAIL_RULES_TO_WORKFLOWS_MIGRATION.md` for details.
## 📚 Further Documentation
- **Actions Reference**: Click "Quick Guide" in the Actions tab
- **Architecture**: See `app/services/email_workflow_service.py`
- **API Endpoints**: `/api/docs` for the complete API reference
- **Migration Guide**: `/docs/EMAIL_RULES_TO_WORKFLOWS_MIGRATION.md`
## 💡 Tips & Tricks
### Template Modification
Start with a template and adapt it:
1. Pick the template that resembles your use case
2. Edit the name and description
3. Add/remove steps as needed
4. Adjust the confidence and priority
5. Test and save
### Multi-Step Workflows
Build complex workflows:
```
Step 1: Link to vendor
Step 2: Extract invoice data
Step 3: Create case if amount > 10000
Step 4: Send Slack notification
Step 5: Mark as processed
```
### Conditional Logic
Use parameters to control behavior:
```json
{
"action": "create_ticket",
"params": {
"priority": "high",
"assign_to": "billing_team",
"condition": { "amount": { "gt": 50000 } }
}
}
```
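A sketch of how such a `{"field": {"op": value}}` condition could be evaluated against extracted email data (the operator set beyond the example's `gt`, and the helper itself, are assumptions for illustration):

```python
# Supported comparison operators (illustrative set)
OPERATORS = {"gt": lambda a, b: a > b, "lt": lambda a, b: a < b, "eq": lambda a, b: a == b}

def condition_met(condition, data):
    """Evaluate a {'field': {'op': value}} condition against extracted data."""
    for field, ops in condition.items():
        for op, expected in ops.items():
            if field not in data or not OPERATORS[op](data[field], expected):
                return False
    return True

print(condition_met({"amount": {"gt": 50000}}, {"amount": 75000}))  # → True
print(condition_met({"amount": {"gt": 50000}}, {"amount": 10000}))  # → False
```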
### Reusable Patterns
Save frequently used step sequences as templates
## ❓ Support
If you run into problems:
1. Check the execution history
2. Look in `/logs/` for detailed errors
3. Test the workflow in isolation
4. Contact the system administrator
---
**Version**: 1.0
**Last Updated**: 15 December 2025
**Maintained By**: BMC Hub Development Team

# Email Workflow System - Improvements
**Date**: 15 December 2025
**Status**: ✅ Implemented and Active
## 🎯 Purpose
Make the workflow system much easier to use and administer for both technical and non-technical users.
## ✨ New Features
### 1. **Workflow Templates** 📋
6 prebuilt workflow templates ready to use:
- **Invoice Processing**: Automatic handling of vendor invoices
- **Time Confirmation**: Link time confirmations to cases
- **Spam Handler**: Automatic spam cleanup
- **Freight Note**: Freight note processing
- **Bankruptcy Alert**: Bankruptcy notifications
- **Low Confidence Review**: Catch-all for uncertain classifications
**How**: Click "Templates" → Pick a template → Adapt → Save
### 2. **Test Mode** 🧪
Test workflows before deployment:
- Test an existing workflow on a sample email
- Test a workflow while editing, without saving
- See detailed execution results
- Debug step by step
**How**: Click "Test" on a workflow, or "Test Workflow" in the editor
### 3. **Duplicate Workflow** 📄
Copy and adapt existing workflows:
- Save time by starting from a working workflow
- Automatic naming with "(kopi)"
- Disabled by default (safe)
- Priority +1 (lower than the original)
**How**: Click "Dupliker" on an existing workflow
### 4. **Import/Export** 💾
Share workflows between systems:
- **Export**: Download a workflow as a JSON file
- **Import**: Upload a JSON workflow from another installation
- Automatically strips database-specific fields
- Validation on import
**How**: Click "Export" or "Import" in the workflow manager
### 5. **Action Quick Guide** 📚
Interactive guide to the available actions:
- Categorized by functionality (Linking, Extraction, Notifications, etc.)
- Example configuration for each action
- Accordion layout - easy navigation
- Best practices and warnings
**How**: Click "Quick Guide" in the Actions tab
### 6. **Contextual Help** 💡
Tooltips and help text everywhere:
- Explanation of the Classification Trigger
- Confidence Threshold guidance (recommended ranges)
- Priority explanation
- Field-level help with icons
**Shown**: Hover over the icons next to the fields
### 7. **Comprehensive Documentation** 📖
A complete user guide is included:
- Quick Start guide
- Action reference
- Template documentation
- Best practices
- Troubleshooting
- Common issues and solutions
**Location**: `/docs/WORKFLOW_SYSTEM_GUIDE.md`
**Access**: Click the "Guide" button in the workflow manager header
## 🔧 Improved Features
### Visual Workflow Editor
- **Drag & Drop**: Move steps by dragging them
- **Up/Down arrows**: Alternative to drag & drop
- **Step numbering**: Visual step flow
- **JSON view**: Advanced editing mode
- **Parameter editing**: Inline parameter fields
### Workflow List
- **Status badges**: Active/Disabled
- **Statistics**: Execution count, success/failure
- **Quick actions**: Edit, Test, Duplicate, Export, Toggle, Delete
- **Classification badge**: See the trigger type at a glance
### Better UX
- **Loading states**: Spinners while data loads
- **Toast notifications**: Feedback on actions
- **Confirmation dialogs**: Before destructive operations
- **Modal styling**: Workflow template cards with hover effects
- **Responsive design**: Works on all screen sizes
## 📊 Backend Improvements
### Endpoints
All workflow endpoints already exist and work:
- `GET /api/v1/workflows` - List all workflows
- `GET /api/v1/workflows/{id}` - Fetch a specific workflow
- `POST /api/v1/workflows` - Create a new workflow
- `PUT /api/v1/workflows/{id}` - Update a workflow
- `DELETE /api/v1/workflows/{id}` - Delete a workflow
- `POST /api/v1/workflows/{id}/toggle` - Enable/disable
- `GET /api/v1/workflow-actions` - List available actions
- `GET /api/v1/workflow-executions` - Execution history
- `POST /api/v1/emails/{id}/execute-workflows` - Manual execution
### Documentation Endpoint
New endpoint added:
- `GET /docs/{doc_name}` - Serve markdown documentation
## 🎨 UI/UX Enhancements
### CSS Additions
```css
.workflow-template-card {
/* Hover effects for templates */
border: 2px solid transparent;
transition: all 0.2s;
}
.workflow-template-card:hover {
border-color: var(--accent);
box-shadow: 0 4px 12px rgba(15, 76, 117, 0.15);
transform: translateY(-2px);
}
```
### JavaScript Functions
New functions added:
- `showWorkflowTemplates()` - Show the template picker modal
- `createFromTemplate(key)` - Create a workflow from a template
- `duplicateWorkflow(id)` - Duplicate an existing workflow
- `exportWorkflow(id)` - Download a workflow as JSON
- `importWorkflow()` - Upload and parse workflow JSON
- `testWorkflow(id)` - Test a saved workflow
- `testCurrentWorkflow()` - Test the workflow being edited
- `executeTestWorkflow(id)` - Run a test execution
- `showActionGuide()` - Show the action reference guide
## 📈 Impact
### Before
- Manual SQL to create workflows
- No templates - every workflow built from scratch
- Hard to share workflows between systems
- No way to test
- Minimal documentation
### After
- ✅ Point-and-click workflow creation
- ✅ 6 ready-to-use templates
- ✅ Import/export functionality
- ✅ Test mode before deployment
- ✅ Comprehensive guide and help
## 🚀 How Do You Use It?
### Scenario 1: Quick Start with a Template
1. Open the Workflow Manager
2. Click "Templates"
3. Pick "Invoice Processing"
4. Adjust the name/description if desired
5. Save - done! ✅
**Time**: ~30 seconds
### Scenario 2: Adapt an Existing Workflow
1. Find the workflow in the list
2. Click "Dupliker"
3. Edit the name and steps
4. Test on a sample email
5. Save when satisfied
**Time**: ~2 minutes
### Scenario 3: Share a Workflow with Another Installation
1. Click "Export" on the workflow
2. Send the JSON file to a colleague
3. The colleague clicks "Import"
4. The workflow is loaded and ready
**Time**: ~1 minute
### Scenario 4: Learn the System
1. Click "Guide" in the workflow manager
2. Read the Quick Start section
3. Look at the Action Reference
4. Try it with a template
5. Test thoroughly before enabling
**Time**: ~10 minutes of reading, then ready to go
## 🎓 Best Practices
### For Beginners
1. Start with templates
2. Always test before enabling
3. Use moderate confidence thresholds (0.70)
4. Read the action guide before building custom workflows
### For Advanced Users
1. Combine actions creatively
2. Use import/export for backups
3. Monitor the execution statistics
4. Tune confidence based on data
## 📝 Documentation
### User-Facing
- **Workflow System Guide**: `/docs/WORKFLOW_SYSTEM_GUIDE.md`
- **Action Quick Guide**: In the UI via the "Quick Guide" button
- **Tooltips**: Hover over the icons
### Developer-Facing
- **Service Implementation**: `app/services/email_workflow_service.py`
- **API Routes**: `app/emails/backend/router.py`
- **Frontend UI**: `app/emails/frontend/emails.html`
## ✅ Testing Checklist
- [x] Templates load correctly
- [x] Duplicate functionality works
- [x] Import/export flow works
- [x] Test mode execution works
- [x] The action guide is shown correctly
- [x] Tooltips render as expected
- [x] The documentation endpoint works
- [x] API restart successful
## 🔮 Future Enhancements
### Potential Additions
- **Visual Workflow Designer**: Flowchart-style editor
- **Conditional Actions**: If/else logic in workflows
- **Scheduled Workflows**: Time-based triggers
- **Webhook Triggers**: External system integration
- **Workflow Versioning**: Track and roll back changes
- **A/B Testing**: Test multiple workflow versions
- **Analytics Dashboard**: Detailed workflow performance metrics
### Request for Feedback
Collect feedback from users on:
- Which templates are missing?
- Which actions should be added?
- UX pain points?
- Documentation gaps?
## 💬 Support
For questions or problems:
1. Read `/docs/WORKFLOW_SYSTEM_GUIDE.md`
2. Click "Quick Guide" in the UI
3. Check the execution history for errors
4. See the API logs in `/logs/`
---
**Status**: ✅ Live and Production-Ready
**Impact**: Major usability improvement
**User Satisfaction**: Expected high (TBD after user feedback)
