Compare commits
No commits in common. "main" and "v2.2.0" have entirely different histories.
@@ -38,8 +38,6 @@ GITHUB_TOKEN=your_gitea_token_here
 # =====================================================
 # API CONFIGURATION - Production
 # =====================================================
-# Stack name used by deployment scripts to name containers
-STACK_NAME=prod
 API_HOST=0.0.0.0
 API_PORT=8000
 API_RELOAD=false

.github/skills/gui-starter/SKILL.md (vendored, 38 deleted lines)
@@ -1,38 +0,0 @@

---
name: gui-starter
description: "Use when building or updating BMC Hub GUI pages, templates, layout, styling, dark mode toggle, responsive Bootstrap 5 UI, or Nordic Top themed frontend components."
---

# BMC Hub GUI Starter

## Purpose
Use this skill when implementing or refining frontend UI in BMC Hub.

## Project UI Rules
- Follow the Nordic Top style from `docs/design_reference/`.
- Keep a minimalist, clean layout with card-based sections.
- Use Deep Blue as the default primary accent: `#0f4c75`.
- Support dark mode with a visible toggle.
- Use CSS variables so accent colors can be changed dynamically.
- Build mobile-first with Bootstrap 5 grid utilities.

## Preferred Workflow
1. Identify the existing template/page and preserve established structure when present.
2. Define or update theme tokens as CSS variables (light + dark).
3. Implement the responsive layout first, then enhance desktop spacing/typography.
4. Add or maintain dark mode toggle logic (persist the preference in localStorage when relevant).
5. Reuse patterns from `docs/design_reference/components.html`, `docs/design_reference/index.html`, `docs/design_reference/customers.html`, and `docs/design_reference/form.html`.
6. Validate visual consistency and avoid introducing one-off styles unless necessary.

## Implementation Guardrails
- Do not hardcode colors repeatedly; map them to CSS variables.
- Do not remove dark mode support from existing pages.
- Do not break existing navigation/topbar behavior.
- Avoid large framework changes unless explicitly requested.
- Keep accessibility basics in place: color contrast, visible focus states, semantic HTML.

## Deliverables
When using this skill, provide:
- Updated frontend files (HTML/CSS/JS) with concise, intentional styling.
- A short summary of what changed and why.
- Notes about any remaining UI tradeoffs or follow-up refinements.

.gitignore (vendored, 1 deleted line)
@@ -28,4 +28,3 @@ htmlcov/
 .coverage
 .pytest_cache/
 .mypy_cache/
-RELEASE_NOTES_v2.2.38.md
@@ -114,9 +114,6 @@ SECRET_KEY=$(python3 -c "import secrets; print(secrets.token_urlsafe(32))")
 # 5. CORS Origins (production domain)
 CORS_ORIGINS=https://hub.bmcnetworks.dk
 
-# 5b. Stack name (used by deployment scripts for container names)
-STACK_NAME=prod
-
 # 6. e-conomic Credentials (if relevant)
 ECONOMIC_APP_SECRET_TOKEN=xxxxx
 ECONOMIC_AGREEMENT_GRANT_TOKEN=xxxxx
Dockerfile (14 changed lines)
@@ -38,18 +38,8 @@ RUN if [ "$RELEASE_VERSION" != "latest" ] && [ -n "$GITHUB_TOKEN" ]; then \
 pip install --no-cache-dir -r /tmp/requirements.txt; \
 fi
 
-# Copy local source to temp location.
-# In release builds we keep downloaded source in /app.
-# In latest/local builds we copy from /app_local to /app.
-COPY . /app_local
-
-RUN if [ "$RELEASE_VERSION" = "latest" ] || [ -z "$GITHUB_TOKEN" ]; then \
-echo "Using local source files..." && \
-cp -a /app_local/. /app/; \
-else \
-echo "Keeping downloaded release source in /app (no local override)"; \
-fi && \
-rm -rf /app_local
+# Copy application code
+COPY . .
 
 # Create necessary directories
 RUN mkdir -p /app/logs /app/uploads /app/static /app/data
@@ -50,7 +50,6 @@ DATABASE_URL=postgresql://bmc_hub_prod:din_stærke_password_her@postgres:5432/bm
 SECRET_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
 
 # API
-STACK_NAME=prod
 API_PORT=8000
 CORS_ORIGINS=http://172.16.31.183:8001
 
@@ -1,38 +0,0 @@

# BMC Hub v2.2.2 - Sync Safety Release

**Release Date:** 22 February 2026

## 🛡️ Critical Fixes

### e-conomic Customer Sync Mapping
- **Fixed ambiguous matching**: e-conomic sync now matches customers only by `economic_customer_number`
- **Removed unsafe fallback in this flow**: the CVR/name fallback is no longer used in `/api/v1/system/sync/economic`
- **Added conflict-safe behavior**: if multiple local rows share the same `economic_customer_number`, the record is skipped and logged as a conflict (no overwrite)
- **Improved traceability**: sync logs now include the actual local customer id that was updated/created

### Settings Sync UX
- **Aligned frontend with backend response fields** for vTiger/e-conomic sync summaries
- **Improved 2FA error feedback** in the Settings sync UI when the API returns `403: 2FA required`
- **Fixed the sync stats request limit** to avoid API validation issues
- **Temporarily disabled the CVR→e-conomic action** in the Settings UI to prevent misleading behavior
- **Clarified the runtime config source**: sync reads environment variables (`.env`) at runtime

## 🗄️ Database Safety

### New Migration
- Added migration: `migrations/138_customers_economic_unique_constraint.sql`
- Normalizes empty/whitespace `economic_customer_number` values
- Adds a partial unique index on non-null `economic_customer_number`
- The migration aborts with a clear error if duplicates already exist (manual dedupe required before rerunning)

## ⚠️ Deployment Notes

- Run migration `138_customers_economic_unique_constraint.sql` before enabling broad sync operations in production
- If the migration fails due to duplicates, deduplicate `customers.economic_customer_number` first, then rerun the migration
- Existing 2FA API protection remains enabled

## ✅ Expected Outcome

- The sync payload and the DB target row are now consistent in the e-conomic flow
- Incorrect overwrites caused by the weak matching strategy are prevented
- Future duplicate `economic_customer_number` values are blocked at the database level
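The conflict-safe matching described in v2.2.2 can be sketched in Python. This is a minimal illustration with hypothetical names (`sync_economic_customer`, plain dict rows); the actual service code is not shown in these notes:

```python
def sync_economic_customer(local_rows, payload):
    """Match an e-conomic payload to local customers strictly by
    economic_customer_number; skip (never overwrite) on conflicts."""
    number = payload["economic_customer_number"]
    matches = [r for r in local_rows
               if r.get("economic_customer_number") == number]
    if len(matches) > 1:
        # Multiple local rows share the key: log a conflict, touch nothing.
        return {"action": "skipped_conflict", "customer_id": None}
    if matches:
        matches[0].update(payload)
        return {"action": "updated", "customer_id": matches[0]["id"]}
    # No match: create a new local row.
    new_row = {"id": max((r["id"] for r in local_rows), default=0) + 1, **payload}
    local_rows.append(new_row)
    return {"action": "created", "customer_id": new_row["id"]}
```

The returned `customer_id` mirrors the traceability bullet above: the sync log can record exactly which local row was touched.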
@@ -1,15 +0,0 @@

# BMC Hub v2.2.3 - Migration Hotfix

**Release Date:** 22 February 2026

## 🛠️ Hotfix

### Migration 138 compatibility fix
- Fixed `migrations/138_customers_economic_unique_constraint.sql` for environments where `customers.economic_customer_number` is numeric (`integer`).
- Removed the unconditional `btrim(...)` usage on non-text columns.
- Added type-aware normalization logic that only applies trimming to text-like column types.

## ✅ Impact

- Migration `138_customers_economic_unique_constraint.sql` now runs on both numeric and text column variants without `function btrim(integer) does not exist` errors.
- The unique-index safety behavior and duplicate detection are unchanged.
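The type-aware normalization from the v2.2.3 hotfix could be implemented along these lines: a small helper that only wraps text-like columns in `btrim`/`NULLIF` and passes numeric columns through untouched. The function name and exact expression are assumptions for illustration, not the migration's actual SQL:

```python
def normalization_expr(column, data_type):
    """Return a SQL expression normalizing a column: btrim() is applied
    only to text-like types, so integer columns avoid the
    'function btrim(integer) does not exist' error."""
    text_like = {"text", "character varying", "varchar", "character"}
    if data_type.lower() in text_like:
        # NULLIF turns empty/whitespace-only values into NULL.
        return f"NULLIF(btrim({column}), '')"
    return column
```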
@@ -1,30 +0,0 @@

# BMC Hub v2.2.36 - Helpdesk SAG Routing

**Release Date:** 2 March 2026

## ✨ New Features

### Helpdesk email → SAG automation
- Incoming emails from known customer domains now auto-create a new SAG when no `SAG-<id>` reference is present.
- Incoming emails with `SAG-<id>` in the subject or threading headers now auto-update the related SAG.
- Emails from unknown domains remain in `/emails` for manual handling.

### Email threading support for routing
- Added migration `141_email_threading_headers.sql`.
- `email_messages` now stores `in_reply_to` and `email_references` to support robust SAG threading lookups.

### /emails quick customer creation improvements
- The quick-create customer modal now includes `email_domain`.
- The customer create API now accepts and persists `email_domain`.

## 🔧 Technical Changes

- Updated `app/services/email_service.py` to parse and persist `In-Reply-To` and `References` from IMAP/EML uploads.
- Updated `app/services/email_workflow_service.py` with system-level helpdesk SAG routing logic.
- Updated `app/emails/backend/router.py` to include `customer_name` in email list responses.
- Updated `app/customers/backend/router.py` and `app/emails/frontend/emails.html` for `email_domain` support.

## 📋 Deployment Notes

- Run database migration 141 before processing new inbound emails to get the full header-based routing behavior.
- Existing workflow/rule behavior is preserved; the new routing runs as a system workflow.
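The `SAG-<id>` lookup that drives the routing above can be sketched as a regex scan over the subject and the stored threading headers. The helper name and argument names are hypothetical; only the `SAG-<id>` token format comes from the notes:

```python
import re

SAG_REF = re.compile(r"\bSAG-(\d+)\b")

def find_sag_id(subject, in_reply_to=None, references=None):
    """Extract a SAG id from the subject line or threading headers.
    Returns None when no reference exists (the auto-create case)."""
    for text in (subject, in_reply_to, references):
        if not text:
            continue
        m = SAG_REF.search(text)
        if m:
            return int(m.group(1))
    return None
```

A `None` result from a known customer domain would fall through to the auto-create branch; an unknown domain stays in `/emails`.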
@@ -1,45 +0,0 @@

# Release Notes v2.2.39

Date: 3 March 2026

## New: Mission Control (MVP)
- New dedicated fullscreen dashboard for an operations overview on a big screen.
- Realtime updates via WebSocket (`/api/v1/mission/ws`).
- KPI overview for cases:
  - Open cases
  - New cases
  - Cases without an assignee
  - Deadlines today
  - Overdue deadlines
- Active-call overlay with deduplication on `call_id`.
- Uptime alerts (DOWN/UP/DEGRADED) with visible active alarms.
- Live activity feed (latest 20 events).
- Sound system with mute + volume control in the dashboard.

## New endpoints
- `GET /api/v1/mission/state`
- `WS /api/v1/mission/ws`
- `POST /api/v1/mission/webhook/telefoni/ringing`
- `POST /api/v1/mission/webhook/telefoni/answered`
- `POST /api/v1/mission/webhook/telefoni/hangup`
- `POST /api/v1/mission/webhook/uptime`

## New files
- `migrations/142_mission_control.sql`
- `app/dashboard/backend/mission_router.py`
- `app/dashboard/backend/mission_service.py`
- `app/dashboard/backend/mission_ws.py`
- `app/dashboard/frontend/mission_control.html`

## Updated files
- `main.py`
- `app/core/config.py`
- `app/dashboard/backend/views.py`
- `VERSION`

## Operations/configuration
- New setting/env for webhook protection: `MISSION_WEBHOOK_TOKEN`.
- New settings seeds for Mission Control sound, KPI display, queue filter, and customer filter.

## Verification
- Python syntax check run on the changed backend files with `py_compile`.
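The `call_id` deduplication in the active-call overlay could work roughly like this: each webhook event upserts a single entry keyed on `call_id`, so repeated events update state instead of adding duplicates. The helper and field names are assumptions for illustration:

```python
def upsert_active_call(active_calls, event):
    """Keep at most one overlay entry per call_id: a repeated webhook
    event for the same call updates the entry in place."""
    call = active_calls.setdefault(event["call_id"], {})
    call.update(
        caller_number=event.get("caller_number", call.get("caller_number")),
        queue_name=event.get("queue_name", call.get("queue_name")),
        state=event.get("state", call.get("state", "ringing")),
    )
    return active_calls
```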
@@ -1,18 +0,0 @@

# Release Notes v2.2.40

Date: 3 March 2026

## Hotfix: Production build source override
- Fixed the Docker build flow in `Dockerfile` so that release code fetched via `RELEASE_VERSION` is no longer overwritten by the local checkout during the image build.
- This resolves scenarios where production ran the wrong code version (e.g. missing routes such as `/dashboard/mission-control`) even when the correct release tag was specified.

## Technical changes
- Local source code is now copied to a temporary directory (`/app_local`).
- In release builds (`RELEASE_VERSION != latest` and a token set), the downloaded release source in `/app` is preserved.
- In local/latest builds, `/app_local` is copied to `/app` as before.

## Verification
- The build output should show:
  - `Downloading release ... from Gitea...`
  - `Keeping downloaded release source in /app (no local override)`
- After deploy, `/dashboard/mission-control` should no longer return 404 on release v2.2.39+.
@@ -1,20 +0,0 @@

# Release Notes v2.2.41

Date: 3 March 2026

## Fix: Postgres healthcheck noise in logs
- Updated the healthcheck to use the correct database name (`POSTGRES_DB`) instead of the default database.
- Resolves repeated log lines of the form `FATAL: database "bmc_hub" does not exist` on installations where the database has a different name (e.g. `hubdb_v2`).

## Changed files
- `docker-compose.prod.yml`
- `docker-compose.yml`
- `updateto.sh`
- `VERSION`

## Technical notes
- The healthcheck changed from:
  - `pg_isready -U <user>`
- To:
  - `pg_isready -U <user> -d <db>`
- `updateto.sh` now also uses `-d "$POSTGRES_DB"` in its wait loop for postgres.
@@ -1,18 +0,0 @@

# Release Notes v2.2.42

Date: 3 March 2026

## Fix: Yealink webhook compatibility + deploy robustness
- Added `GET` support on the Mission Control telephony webhooks, so Yealink callback URLs no longer return `405 Method Not Allowed`.
- The webhook endpoints now accept query parameters for `call_id`, `caller_number`, `queue_name`, and an optional `timestamp`.
- `updateto.sh` has been hardened with clear fail-fast behavior on port conflicts and failed container startup, so the script no longer reports success on partial failure.

## Changed files
- `app/dashboard/backend/mission_router.py`
- `updateto.sh`
- `VERSION`

## Affected endpoints
- `/api/v1/mission/webhook/telefoni/ringing` (`POST` + `GET`)
- `/api/v1/mission/webhook/telefoni/answered` (`POST` + `GET`)
- `/api/v1/mission/webhook/telefoni/hangup` (`POST` + `GET`)
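Normalizing a Yealink `GET` callback into the same payload the `POST` webhook expects can be sketched with the standard library. The query-parameter names come from the notes above; the helper itself is hypothetical:

```python
from urllib.parse import parse_qs, urlparse

def webhook_params(url):
    """Turn a GET callback URL's query string into a webhook payload
    (call_id / caller_number / queue_name, plus optional timestamp)."""
    qs = parse_qs(urlparse(url).query)
    payload = {k: qs[k][0]
               for k in ("call_id", "caller_number", "queue_name") if k in qs}
    if "timestamp" in qs:
        payload["timestamp"] = qs["timestamp"][0]  # optional field
    return payload
```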
@@ -1,16 +0,0 @@

# Release Notes v2.2.43

Date: 3 March 2026

## Fix: Visible Mission webhook logs
- Added explicit logging for the Mission telephony webhooks (`ringing`, `answered`, `hangup`) with call id, number, queue, and HTTP method.
- Added warning logs for a missing/invalid Mission webhook token.
- Makes it easy to debug Yealink callbacks in `podman logs`.

## Changed files
- `app/dashboard/backend/mission_router.py`
- `VERSION`

## Operations
- Deploy with: `./updateto.sh v2.2.43`
- Watch webhook log events with: `podman logs -f bmc-hub-api-v2 | grep -E "Mission webhook|forbidden|token"`
@@ -1,17 +0,0 @@

# Release Notes v2.2.44

Date: 4 March 2026

## Fixes
- `updateto.sh` now automatically cleans up legacy containers (`bmc-hub-api-v2`, `bmc-hub-postgres-v2`) before deploy.
- Prevents port-lock conflicts, especially on the Postgres host port (`5433`), during compose startup.
- Mission Control: automatic timeout for hanging `ringing` calls, so they no longer stay stuck in Incoming Calls.

## Changed files
- `updateto.sh`
- `app/dashboard/backend/mission_service.py`
- `VERSION`

## Operations
- Deploy: `./updateto.sh v2.2.44`
- Verify: `curl http://localhost:8001/health`
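The ringing-call timeout could be a periodic prune over the active-call map, dropping `ringing` entries older than a threshold. The notes do not state the actual timeout value or function names, so both are assumptions here:

```python
import time

RINGING_TIMEOUT_S = 120  # assumed value; not stated in the release notes

def prune_stale_ringing(active_calls, now=None):
    """Drop ringing calls older than the timeout so they do not stay
    stuck in the Incoming Calls overlay."""
    now = time.time() if now is None else now
    return {
        cid: call for cid, call in active_calls.items()
        if not (call["state"] == "ringing"
                and now - call["started_at"] > RINGING_TIMEOUT_S)
    }
```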
@@ -1,18 +0,0 @@

# Release Notes v2.2.45

Date: 4 March 2026

## Improvements
- Added a direct menu link to Mission Control in the Support dropdown, so the page is quicker to find.
- Added Mission Control as an option under Default Dashboard in Settings.
- Updated the dashboard fallback logic so `/dashboard/mission-control` is treated as a known default choice.

## Changed files
- `app/shared/frontend/base.html`
- `app/settings/frontend/settings.html`
- `VERSION`
- `RELEASE_NOTES_v2.2.45.md`

## Operations
- Deploy: `./updateto.sh v2.2.45`
- Verify: `curl http://localhost:8001/dashboard/mission-control`
@@ -1,19 +0,0 @@

# Release Notes v2.2.46

Date: 4 March 2026

## Fixes and operational hardening
- The Mission Control backend now tolerates missing mission tables without crashing requests, and logs clear warnings.
- Added an idempotent repair migration for the Mission Control schema (`143_mission_control_repair.sql`) for environments with partially created tables.
- Updated `.gitignore` with a release-note exception from earlier operations.

## Changed files
- `app/dashboard/backend/mission_service.py`
- `migrations/143_mission_control_repair.sql`
- `.gitignore`
- `VERSION`
- `RELEASE_NOTES_v2.2.46.md`

## Operations
- Deploy: `./updateto.sh v2.2.46`
- Migration (if needed): `docker compose exec db psql -U bmc_hub -d bmc_hub -f migrations/143_mission_control_repair.sql`
@@ -1,17 +0,0 @@

# Release Notes v2.2.47

Date: 4 March 2026

## Fixes
- The Mission webhook GET for ringing now accepts a token-only ping without `call_id` and returns `200 OK`.
- `updateto.sh` now automatically defaults to port `8001` in the v2 directory (`/srv/podman/bmc_hub_v2`), with continued support for an `API_PORT` override in `.env`.

## Changed files
- `app/dashboard/backend/mission_router.py`
- `updateto.sh`
- `VERSION`
- `RELEASE_NOTES_v2.2.47.md`

## Operations
- Deploy: `./updateto.sh v2.2.47`
- Verify webhook ping: `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/ringing?token=<TOKEN>"`
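The token-only ping behavior above can be sketched as a small handler: validate `MISSION_WEBHOOK_TOKEN` first, then treat a request without `call_id` as a health ping rather than an error. Function and response shapes are illustrative, not the router's actual code:

```python
def handle_webhook_get(params, expected_token):
    """Token-protected GET webhook: a token-only ping (no call_id)
    returns 200 OK; a bad or missing token is rejected."""
    if params.get("token") != expected_token:
        return 403, {"detail": "invalid or missing token"}
    if "call_id" not in params:
        return 200, {"status": "ok", "ping": True}  # token-only health ping
    return 200, {"status": "ok", "call_id": params["call_id"]}
```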
@@ -1,21 +0,0 @@

# Release Notes v2.2.48

Date: 4 March 2026

## Fixes
- `sag` aggregation no longer fails if the `sag_salgsvarer` table is missing; the API still returns time data and an empty sales list instead of `500`.
- Sales-list endpoints in `sag` now return an empty list with a warning in the log if `sag_salgsvarer` does not exist.
- Mission webhooks for `answered` and `hangup` now also accept a token-only `GET` ping without `call_id` (same compatibility as `ringing`).

## Changed files
- `app/modules/sag/backend/router.py`
- `app/dashboard/backend/mission_router.py`
- `VERSION`
- `RELEASE_NOTES_v2.2.48.md`

## Operations
- Deploy: `./updateto.sh v2.2.48`
- Validate webhook pings:
  - `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/ringing?token=<TOKEN>"`
  - `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/answered?token=<TOKEN>"`
  - `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/hangup?token=<TOKEN>"`
@@ -1,40 +0,0 @@

# Release Notes v2.2.49

Date: 5 March 2026

## New functionality

### Sag - Relations
- The relations panel is only shown when there actually are related cases. A single case (no relations) now shows the empty state "Ingen relaterede sager" (no related cases).
- The current case is clearly highlighted in the relation tree: accent-colored left border, faint background, filled badge with the case ID, and a bold title. The link is not clickable (you are already there).

### Sag - Case type dropdown
- The case type in the top bar is now a clickable dropdown instead of a link to the edit page.
- The dropdown shows all 6 types (Ticket, Pipeline, Opgave, Ordre, Projekt, Service) with color icons and marks the active type.
- Selecting a type PATCHes the case directly and reloads the page.
- Fixed a bug where the dropdown opened behind the page (`overflow: hidden` removed from `.case-hero`).

### Sag - Relation quick actions (+)
- The menu now contains 12 modules: Tildel sag, Tidregistrering, Kommentar, Påmindelse, Opgave, Salgspipeline, Filer, Hardware, Løsning, Varekøb & salg, Abonnement, Send email.
- All modules open a mini modal with the relevant fields directly from the relations panel; no page navigation needed.
- Salgspipeline is hidden from the menu if the case already has pipeline data (shown as a grayed-out "Pipeline (se sagen)").
- Tags now use the global TagPicker system (`window.showTagPicker`).

### Email service
- New `app/services/email_service.py` for centralized email sending.

### Telephony
- Updates to the phone log and router.

## Changed files
- `app/modules/sag/templates/detail.html`
- `app/modules/sag/backend/router.py`
- `app/dashboard/backend/mission_router.py`
- `app/dashboard/backend/mission_service.py`
- `app/modules/telefoni/backend/router.py`
- `app/modules/telefoni/templates/log.html`
- `app/services/email_service.py`
- `main.py`

## Operations
- Deploy: `./updateto.sh v2.2.49`
@@ -1,18 +0,0 @@

# Release Notes v2.2.50

Date: 6 March 2026

## Fixes
- Sag: the "Ny email" compose has been restored in the E-mail tab on cases.
- Added a visible compose section with fields for To/Cc/Bcc/Subject/Message plus attachment of case files.
- The `Ny email` button is now wired to sending via `/api/v1/sag/{sag_id}/emails/send`.
- The compose prefills the recipient (primary contact when possible) and the subject (`Sag #<id>:`).
- The attachment list is refreshed from case files, even when the file panel is not visible.

## Changed files
- `app/modules/sag/templates/detail.html`
- `VERSION`
- `RELEASE_NOTES_v2.2.50.md`

## Operations
- Deploy: `./updateto.sh v2.2.50`
@@ -1,21 +0,0 @@

# Release Notes v2.2.51

Date: 7 March 2026

## Fixes
- Settings: user administration in v2 now uses stable admin endpoints for status changes and password resets.
- Settings: improved error messages for user actions (status/password), so 4xx/5xx responses are shown clearly in the UI.
- Ticket Sync: added an Archived Sync monitor in Settings with buttons for Simply/vTiger import and a recurring status check.
- Ticket Sync: new endpoint `/api/v1/ticket/archived/status` returns parity (remote vs. local) and an overall `overall_synced` flag.
- Security: sync/import endpoints are locked to admin/superadmin (`users.manage` or `system.admin`).

## Changed files
- `app/settings/frontend/settings.html`
- `app/ticket/backend/router.py`
- `app/system/backend/sync_router.py`
- `app/auth/backend/admin.py`
- `VERSION`
- `RELEASE_NOTES_v2.2.51.md`

## Operations
- Deploy: `./updateto.sh v2.2.51`
@@ -1,16 +0,0 @@

# Release Notes v2.2.52

Date: 7 March 2026

## Fixes
- Auth Admin: `GET /api/v1/admin/users` has been made extra robust against a partially migrated database schema.
- The endpoint now falls back to a simpler query if the joins/columns for groups or telephony are missing.
- Reduces the risk of the UI error "Kunne ikke indlæse brugere" (could not load users) on v2.

## Changed files
- `app/auth/backend/admin.py`
- `VERSION`
- `RELEASE_NOTES_v2.2.52.md`

## Operations
- Deploy: `./updateto.sh v2.2.52`
@@ -1,42 +0,0 @@

# Release Notes - v2.2.53

Date: 17 March 2026

## Focus

An email-to-SAG flow with manual approval as the default, clear UI actions, and better traceability.

## Added

- Manual approval gate in the email pipeline (`awaiting_user_action` state), so mails are parked for user action before automatic routing.
- New feature flag in config: `EMAIL_REQUIRE_MANUAL_APPROVAL` (default `true`).
- New email API endpoints:
  - `GET /api/v1/emails/sag-options`
  - `GET /api/v1/emails/search-customers`
  - `GET /api/v1/emails/search-sager`
  - `POST /api/v1/emails/{email_id}/create-sag`
  - `POST /api/v1/emails/{email_id}/link-sag`
- Email stats extended with `awaiting_user_action` in the summary/processing stats.
- The email frontend upgraded with a suggestion panel and quick buttons:
  - Confirm suggestion
  - Correct type
  - Create new case
  - Link existing case
  - Mark as spam
- Creating a SAG from an email now supports:
  - type
  - secondary label
  - responsible user
  - group
  - start date
  - priority
- New migration: `145_sag_start_date.sql` (`start_date` on `sag_sager`).

## Operational notes

- Run migration `145_sag_start_date.sql` before using the start-date field in the email→sag flow.
- Manual approval is active by default; auto-creation is therefore disabled in phase 1.

## Backup

- A fallback zip of the current email feature has been created in `backups/email_feature/`.
@@ -1,28 +0,0 @@

# Release Notes - v2.2.54

Date: 17 March 2026

## Focus

Improvements to the email-to-SAG workflow with a deadline field and significantly better company/customer search in the UI.

## Added

- Deadline is now supported in email→sag creation.
  - The backend request model has been extended with `deadline`.
  - `create-sag` now stores the deadline on `sag_sager`.
  - The frontend suggestion panel has a dedicated deadline field.
- Customer selection in the email panel has been upgraded to a "super company search":
  - Live dropdown results instead of a simple datalist.
  - Better ranking of results (exact/prefix/relevance).
  - Quick selection by click, including display of CVR/domain/email metadata.

## Updated files

- `app/emails/backend/router.py`
- `app/emails/frontend/emails.html`

## Notes

- No breaking API changes.
- No extra migration needed for this release.
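The exact/prefix/relevance ranking mentioned above can be sketched as a tiered sort key: exact matches first, then prefix matches, then remaining substring hits, each tier alphabetized. This is an illustrative sketch, not the search code in `router.py`:

```python
def rank_customers(query, customers):
    """Rank customer names: exact match first, then prefix matches,
    then substring relevance; non-matching names are filtered out."""
    q = query.lower()

    def score(name):
        n = name.lower()
        if n == q:
            return (0, n)       # exact
        if n.startswith(q):
            return (1, n)       # prefix
        return (2, n)           # substring relevance

    return sorted((c for c in customers if q in c.lower()), key=score)
```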
@@ -1,22 +0,0 @@

# Release Notes v2.2.56

Date: 2026-03-18

## Focus
Stabilization of the email view and hardening of the supplier-invoices flows.

## Changes
- Fixed a layout overflow in the email detail view, so long subjects, sender addresses, HTML content, and file names no longer push the columns out of the layout.
- Added robust wrapping/truncation in the emails UI for better responsive behavior.
- Added the missing "Klar til Bogføring" (ready for bookkeeping) tab in the supplier-invoices navigation.
- Fixed an endpoint mismatch for the AI template analysis in the supplier-invoices frontend.
- Removed JS function conflicts in supplier-invoices by separating the single/bulk send flows.
- Added a backend endpoint for marking supplier invoices as paid.
- Removed a route conflict for send-to-economic by moving the legacy placeholder to a separate path.
- Improved the approve flow by using a dynamic user lookup instead of a hardcoded value.

## Affected files
- app/emails/frontend/emails.html
- app/billing/frontend/supplier_invoices.html
- app/billing/backend/supplier_invoices.py
- RELEASE_NOTES_v2.2.56.md
@@ -1,18 +0,0 @@

# Release Notes v2.2.57

Date: 2026-03-18

## Focus
Stabilization of the UI in the Email and SAG modules.

## Changes
- Email view: further hardening of HTML tables in the mail body, including normalization of inline styles to avoid layout breaks.
- Email view: improved overflow handling for wide content (tables, cells, and media).
- SAG detail page: improved tab loading, so data is fetched on tab switch for Varekøb & Salg, Abonnement, and Påmindelser.
- SAG detail page: robust fallback for the reminder user id via `/api/v1/auth/me`.
- SAG detail page: fixed API calls for reminders and the calendar to use a stable case-id reference.

## Affected files
- app/emails/frontend/emails.html
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.57.md
@@ -1,15 +0,0 @@

# Release Notes v2.2.58

Date: 2026-03-18

## Focus
Improved UX on the SAG detail page, so a tab's content is shown at the top when switching tabs.

## Changes
- SAG tabs: the active tab pane is moved to the top of the tab content on tab switch.
- SAG tabs: automatic scroll to the tab bar after a tab switch.
- SAG tabs: the same top positioning and scroll on `?tab=` deep-link activation.

## Affected files
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.58.md
@@ -1,16 +0,0 @@

# Release Notes v2.2.59

Date: 2026-03-18

## Focus
Stable scroll/navigation in the SAG tabs, so the user lands on actual content in the selected tab.

## Changes
- Removed the DOM reordering of tab-pane elements on the SAG detail page.
- New scroll logic: on tab switch, scroll to the first meaningful content element in the active tab.
- The scroll offset accounts for the navbar height.
- Deep links (`?tab=...`) now use the same robust scroll behavior.

## Affected files
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.59.md
@@ -1,17 +0,0 @@
# Release Notes v2.2.60

Date: 2026-03-18

## Focus

Correct top display of the active tab on the SAG detail page.

## Changes

- Forced correct tab-pane visibility in `#caseTabsContent`:
  - inactive tabs are hidden (`display: none`)
  - only the active tab is shown (`display: block`)
- Removed the earlier scroll/DOM workaround for tab display.
- Result: the active tab is shown at the top below the tab bar, without an empty top section.

## Affected files

- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.60.md
@@ -1,15 +0,0 @@
# Release Notes v2.2.61

Date: 18 March 2026

## Fixes

- Fixed SAG tab display on the case detail page, so only the active tab is shown at the top.
- Added a direct click fallback on the tab buttons (`onclick`) for robust activation, even if Bootstrap tab events fail.
- Set explicit initial visibility on the tab panes to avoid the "long page" effect with content far down the page.
- Removed two broken CSS blocks at the top of the template that could cause unstable styling/parsing.

## Affected files

- `app/modules/sag/templates/detail.html`
- `RELEASE_NOTES_v2.2.61.md`
@@ -1,14 +0,0 @@
# Release Notes v2.2.62

Date: 18 March 2026

## Fixes

- Fixed grid/nesting in the SAG detail view, so the right column sits in the same row as the left/center columns.
- `Hardware`, `Salgspipeline`, `Opkaldshistorik` and `Todo-opgaver` now appear in the right column as expected.
- Removed a premature closing `</div>` in the detail layout that previously pushed the right module column down below the left content.

## Affected files

- `app/modules/sag/templates/detail.html`
- `RELEASE_NOTES_v2.2.62.md`
@@ -1,14 +0,0 @@
# Release Notes v2.2.63

Date: 18 March 2026

## Fixes

- Fixed the QuickCreate AI analysis request in the frontend.
- `POST /api/v1/sag/analyze-quick-create` now receives a correct payload with both `text` and `user_id` in the body.
- Improved frontend error logging on AI failures (incl. HTTP status), so errors are no longer hidden behind a generic "Analysis failed".

## Affected files

- `app/shared/frontend/quick_create_modal.html`
- `RELEASE_NOTES_v2.2.63.md`
@@ -1,18 +0,0 @@
# Release Notes v2.2.64

Date: 18 March 2026

## Fixes

- Improved QuickCreate robustness when the AI/LLM is unavailable.
- Added a local heuristic fallback in `CaseAnalysisService`, so the user still gets:
  - a suggested title
  - a suggested priority
  - simple tags
  - a customer-match attempt
- Removed the dependency on Ollama always responding, so QuickCreate no longer ends in an empty AI-unavailable flow on transient AI errors.

## Affected files

- `app/services/case_analysis_service.py`
- `RELEASE_NOTES_v2.2.64.md`
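The heuristic fallback described in the v2.2.64 notes is not itself visible in this compare. As a rough illustration of what such a fallback might look like, here is a hypothetical sketch; the function name, keyword lists, and rules are assumptions for illustration, not the actual `CaseAnalysisService` code:

```python
def heuristic_quick_create(text: str) -> dict:
    # Hypothetical LLM-unavailable fallback: derive a title, a priority,
    # and simple tags from the raw case text alone.
    first_line = text.strip().splitlines()[0] if text.strip() else "Ny sag"
    title = first_line[:80]
    lowered = text.lower()
    # Assumed urgency keywords (Danish/English mix, purely illustrative).
    urgent_words = ("haster", "urgent", "asap", "nede", "down")
    priority = "high" if any(w in lowered for w in urgent_words) else "normal"
    # Assumed tag vocabulary; real tagging would be richer.
    tags = sorted({w for w in ("email", "server", "printer", "vpn") if w in lowered})
    return {"suggested_title": title, "suggested_priority": priority, "tags": tags}

print(heuristic_quick_create("Server nede - VPN virker ikke"))
```

The point of such a heuristic is graceful degradation: the QuickCreate flow keeps producing usable suggestions even when the AI backend times out.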
@@ -2,7 +2,6 @@
 Auth Admin API - Users, Groups, Permissions management
 """
 from fastapi import APIRouter, HTTPException, status, Depends
-from pydantic import BaseModel, Field
 from app.core.auth_dependencies import require_permission
 from app.core.auth_service import AuthService
 from app.core.database import execute_query, execute_query_single, execute_insert, execute_update
@@ -14,94 +13,23 @@ logger = logging.getLogger(__name__)
 router = APIRouter()
 
 
-class UserStatusUpdateRequest(BaseModel):
-    is_active: bool
-
-
-class UserPasswordResetRequest(BaseModel):
-    new_password: str = Field(..., min_length=8, max_length=128)
-
-
-def _users_column_exists(column_name: str) -> bool:
-    result = execute_query_single(
-        """
-        SELECT 1
-        FROM information_schema.columns
-        WHERE table_schema = 'public'
-          AND table_name = 'users'
-          AND column_name = %s
-        LIMIT 1
-        """,
-        (column_name,)
-    )
-    return bool(result)
-
-
-def _table_exists(table_name: str) -> bool:
-    result = execute_query_single(
-        """
-        SELECT 1
-        FROM information_schema.tables
-        WHERE table_schema = 'public'
-          AND table_name = %s
-        LIMIT 1
-        """,
-        (table_name,)
-    )
-    return bool(result)
-
-
 @router.get("/admin/users", dependencies=[Depends(require_permission("users.manage"))])
 async def list_users():
-    is_2fa_expr = "u.is_2fa_enabled" if _users_column_exists("is_2fa_enabled") else "FALSE AS is_2fa_enabled"
-    telefoni_extension_expr = "u.telefoni_extension" if _users_column_exists("telefoni_extension") else "NULL::varchar AS telefoni_extension"
-    telefoni_active_expr = "u.telefoni_aktiv" if _users_column_exists("telefoni_aktiv") else "FALSE AS telefoni_aktiv"
-    telefoni_ip_expr = "u.telefoni_phone_ip" if _users_column_exists("telefoni_phone_ip") else "NULL::varchar AS telefoni_phone_ip"
-    telefoni_username_expr = "u.telefoni_phone_username" if _users_column_exists("telefoni_phone_username") else "NULL::varchar AS telefoni_phone_username"
-    last_login_expr = "u.last_login_at" if _users_column_exists("last_login_at") else "NULL::timestamp AS last_login_at"
-    has_user_groups = _table_exists("user_groups")
-    has_groups = _table_exists("groups")
-
-    if has_user_groups and has_groups:
-        groups_join = "LEFT JOIN user_groups ug ON u.user_id = ug.user_id LEFT JOIN groups g ON ug.group_id = g.id"
-        groups_select = "COALESCE(array_remove(array_agg(g.name), NULL), ARRAY[]::varchar[]) AS groups"
-    else:
-        groups_join = ""
-        groups_select = "ARRAY[]::varchar[] AS groups"
-
-    try:
-        users = execute_query(
-            f"""
-            SELECT u.user_id, u.username, u.email, u.full_name,
-                   u.is_active, u.is_superadmin, {is_2fa_expr},
-                   {telefoni_extension_expr}, {telefoni_active_expr}, {telefoni_ip_expr}, {telefoni_username_expr},
-                   u.created_at, {last_login_expr},
-                   {groups_select}
-            FROM users u
-            {groups_join}
-            GROUP BY u.user_id
-            ORDER BY u.user_id
-            """
-        )
-        return users
-    except Exception as exc:
-        logger.warning("⚠️ Admin user query fallback triggered: %s", exc)
-        try:
-            users = execute_query(
-                f"""
-                SELECT u.user_id, u.username, u.email, u.full_name,
-                       u.is_active, u.is_superadmin, {is_2fa_expr},
-                       {telefoni_extension_expr}, {telefoni_active_expr}, {telefoni_ip_expr}, {telefoni_username_expr},
-                       u.created_at, {last_login_expr},
-                       ARRAY[]::varchar[] AS groups
-                FROM users u
-                ORDER BY u.user_id
-                """
-            )
-            return users
-        except Exception as fallback_exc:
-            logger.error("❌ Failed to load admin users (fallback): %s", fallback_exc)
-            raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Could not load users") from fallback_exc
+    users = execute_query(
+        """
+        SELECT u.user_id, u.username, u.email, u.full_name,
+               u.is_active, u.is_superadmin, u.is_2fa_enabled,
+               u.telefoni_extension, u.telefoni_aktiv, u.telefoni_phone_ip, u.telefoni_phone_username,
+               u.created_at, u.last_login_at,
+               COALESCE(array_remove(array_agg(g.name), NULL), ARRAY[]::varchar[]) AS groups
+        FROM users u
+        LEFT JOIN user_groups ug ON u.user_id = ug.user_id
+        LEFT JOIN groups g ON ug.group_id = g.id
+        GROUP BY u.user_id
+        ORDER BY u.user_id
+        """
+    )
+    return users
 
 
 @router.post("/admin/users", status_code=status.HTTP_201_CREATED, dependencies=[Depends(require_permission("users.manage"))])
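The removed `_users_column_exists`/`_table_exists` helpers guard a query against schema drift: they probe `information_schema` and substitute a typed literal for any missing column so the response shape stays stable. The expression-selection half of that pattern can be sketched as a pure function (the `safe_column_expr` name and the `available_columns` set are illustrative stand-ins for the database probe):

```python
def safe_column_expr(available_columns: set[str], column: str, fallback_sql: str) -> str:
    # Mirrors the removed pattern: select the real column when it exists,
    # otherwise a typed literal so the SELECT list keeps the same shape.
    return f"u.{column}" if column in available_columns else fallback_sql

# Columns discovered via an information_schema probe (here: hardcoded for the sketch).
cols = {"user_id", "username", "is_2fa_enabled"}
print(safe_column_expr(cols, "is_2fa_enabled", "FALSE AS is_2fa_enabled"))          # u.is_2fa_enabled
print(safe_column_expr(cols, "last_login_at", "NULL::timestamp AS last_login_at"))  # NULL::timestamp AS last_login_at
```

The trade-off the diff resolves: the defensive version tolerates older schemas at the cost of an extra metadata query per column, while the simplified version assumes migrations have run.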
@@ -166,48 +94,6 @@ async def update_user_groups(user_id: int, payload: UserGroupsUpdate):
     return {"message": "Groups updated"}
 
 
-@router.patch("/admin/users/{user_id}", dependencies=[Depends(require_permission("users.manage"))])
-async def update_user_status(user_id: int, payload: UserStatusUpdateRequest):
-    user = execute_query_single(
-        "SELECT user_id, username FROM users WHERE user_id = %s",
-        (user_id,)
-    )
-    if not user:
-        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
-
-    execute_update(
-        "UPDATE users SET is_active = %s, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
-        (payload.is_active, user_id)
-    )
-
-    logger.info("✅ Updated user status via admin: %s -> active=%s", user.get("username"), payload.is_active)
-    return {"message": "User status updated", "user_id": user_id, "is_active": payload.is_active}
-
-
-@router.post("/admin/users/{user_id}/reset-password", dependencies=[Depends(require_permission("users.manage"))])
-async def admin_reset_user_password(user_id: int, payload: UserPasswordResetRequest):
-    user = execute_query_single(
-        "SELECT user_id, username FROM users WHERE user_id = %s",
-        (user_id,)
-    )
-    if not user:
-        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
-
-    try:
-        password_hash = AuthService.hash_password(payload.new_password)
-    except Exception as exc:
-        logger.error("❌ Password hash failed for user_id=%s: %s", user_id, exc)
-        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Kunne ikke hashe adgangskoden") from exc
-
-    execute_update(
-        "UPDATE users SET password_hash = %s, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
-        (password_hash, user_id)
-    )
-
-    logger.info("✅ Password reset via admin for user: %s", user.get("username"))
-    return {"message": "Password reset", "user_id": user_id}
-
-
 @router.post("/admin/users/{user_id}/2fa/reset")
 async def reset_user_2fa(
     user_id: int,
@@ -74,8 +74,6 @@ async def login(request: Request, credentials: LoginRequest, response: Response)
 
     requires_2fa_setup = (
         not user.get("is_shadow_admin", False)
-        and not settings.AUTH_DISABLE_2FA
-        and AuthService.is_2fa_supported()
         and not user.get("is_2fa_enabled", False)
     )
 
@@ -141,18 +139,10 @@ async def setup_2fa(current_user: dict = Depends(get_current_user)):
             detail="Shadow admin cannot configure 2FA",
         )
 
-    try:
-        result = AuthService.setup_user_2fa(
-            user_id=current_user["id"],
-            username=current_user["username"]
-        )
-    except RuntimeError as exc:
-        if "2FA columns missing" in str(exc):
-            raise HTTPException(
-                status_code=status.HTTP_400_BAD_REQUEST,
-                detail="2FA er ikke tilgaengelig i denne database (mangler kolonner).",
-            )
-        raise
+    result = AuthService.setup_user_2fa(
+        user_id=current_user["id"],
+        username=current_user["username"]
+    )
 
     return result
@@ -25,26 +25,8 @@ class BackupService:
     """Service for managing backup operations"""
 
     def __init__(self):
-        configured_backup_dir = Path(settings.BACKUP_STORAGE_PATH)
-        self.backup_dir = configured_backup_dir
-        try:
-            self.backup_dir.mkdir(parents=True, exist_ok=True)
-        except OSError as exc:
-            # Local development can run outside Docker where /app is not writable.
-            # Fall back to the workspace data path so app startup does not fail.
-            if str(configured_backup_dir).startswith('/app/'):
-                project_root = Path(__file__).resolve().parents[3]
-                fallback_dir = project_root / 'data' / 'backups'
-                logger.warning(
-                    "⚠️ Backup path %s not writable (%s). Using fallback %s",
-                    configured_backup_dir,
-                    exc,
-                    fallback_dir,
-                )
-                fallback_dir.mkdir(parents=True, exist_ok=True)
-                self.backup_dir = fallback_dir
-            else:
-                raise
+        self.backup_dir = Path(settings.BACKUP_STORAGE_PATH)
+        self.backup_dir.mkdir(parents=True, exist_ok=True)
 
         # Subdirectories for different backup types
         self.db_dir = self.backup_dir / "database"
|||||||
@ -3,14 +3,7 @@ Billing Router
|
|||||||
API endpoints for billing operations
|
API endpoints for billing operations
|
||||||
"""
|
"""
|
||||||
|
|
||||||
from fastapi import APIRouter, HTTPException
|
from fastapi import APIRouter
|
||||||
from typing import Any, Dict, List
|
|
||||||
from datetime import datetime, date
|
|
||||||
import json
|
|
||||||
from dateutil.relativedelta import relativedelta
|
|
||||||
from app.core.database import execute_query, get_db_connection, release_db_connection
|
|
||||||
from psycopg2.extras import RealDictCursor
|
|
||||||
from app.jobs.reconcile_ordre_drafts import reconcile_ordre_drafts_sync_status
|
|
||||||
from . import supplier_invoices
|
from . import supplier_invoices
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
@@ -19,83 +12,6 @@ router = APIRouter()
 router.include_router(supplier_invoices.router, prefix="", tags=["Supplier Invoices"])
 
 
-@router.get("/billing/drafts/sync-dashboard")
-async def get_draft_sync_dashboard(limit: int = 20):
-    """Operational dashboard data for ordre draft sync lifecycle."""
-    try:
-        summary = execute_query(
-            """
-            SELECT
-                COUNT(*) FILTER (WHERE sync_status = 'pending') AS pending_count,
-                COUNT(*) FILTER (WHERE sync_status = 'exported') AS exported_count,
-                COUNT(*) FILTER (WHERE sync_status = 'failed') AS failed_count,
-                COUNT(*) FILTER (WHERE sync_status = 'posted') AS posted_count,
-                COUNT(*) FILTER (WHERE sync_status = 'paid') AS paid_count,
-                COUNT(*) AS total_count
-            FROM ordre_drafts
-            """,
-            (),
-        ) or []
-
-        attention = execute_query(
-            """
-            SELECT
-                d.id,
-                d.title,
-                d.customer_id,
-                d.sync_status,
-                d.economic_order_number,
-                d.economic_invoice_number,
-                d.last_sync_at,
-                d.updated_at,
-                ev.event_type AS latest_event_type,
-                ev.created_at AS latest_event_at
-            FROM ordre_drafts d
-            LEFT JOIN LATERAL (
-                SELECT event_type, created_at
-                FROM ordre_draft_sync_events
-                WHERE draft_id = d.id
-                ORDER BY created_at DESC, id DESC
-                LIMIT 1
-            ) ev ON TRUE
-            WHERE d.sync_status IN ('pending', 'failed')
-            ORDER BY d.updated_at DESC
-            LIMIT %s
-            """,
-            (max(1, min(limit, 200)),),
-        ) or []
-
-        recent_events = execute_query(
-            """
-            SELECT
-                ev.id,
-                ev.draft_id,
-                ev.event_type,
-                ev.from_status,
-                ev.to_status,
-                ev.event_payload,
-                ev.created_by_user_id,
-                ev.created_at,
-                d.title AS draft_title,
-                d.customer_id,
-                d.sync_status
-            FROM ordre_draft_sync_events ev
-            JOIN ordre_drafts d ON d.id = ev.draft_id
-            ORDER BY ev.created_at DESC, ev.id DESC
-            LIMIT %s
-            """,
-            (max(1, min(limit, 200)),),
-        ) or []
-
-        return {
-            "summary": summary[0] if summary else {},
-            "attention_items": attention,
-            "recent_events": recent_events,
-        }
-    except Exception as e:
-        raise HTTPException(status_code=500, detail=f"Failed to load sync dashboard: {e}")
-
-
 @router.get("/billing/invoices")
 async def list_invoices():
     """List all invoices"""
@@ -106,390 +22,3 @@ async def list_invoices():
 async def sync_to_economic():
     """Sync data to e-conomic"""
     return {"message": "e-conomic sync coming soon"}
-
-
-def _to_date(value: Any) -> date | None:
-    if value is None:
-        return None
-    if isinstance(value, date):
-        return value
-    if isinstance(value, datetime):
-        return value.date()
-    text = str(value).strip()
-    if not text:
-        return None
-    try:
-        return datetime.fromisoformat(text.replace("Z", "+00:00")).date()
-    except ValueError:
-        return None
-
-
-def _next_period(start: date, interval: str) -> date:
-    normalized = (interval or "monthly").strip().lower()
-    if normalized == "daily":
-        return start + relativedelta(days=1)
-    if normalized == "biweekly":
-        return start + relativedelta(weeks=2)
-    if normalized == "quarterly":
-        return start + relativedelta(months=3)
-    if normalized == "yearly":
-        return start + relativedelta(years=1)
-    return start + relativedelta(months=1)
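The removed `_next_period` helper advances a billing period with `dateutil.relativedelta`, falling back to monthly for unknown intervals. A minimal self-contained sketch of the same logic, using a lookup table instead of the if-chain:

```python
from datetime import date
from dateutil.relativedelta import relativedelta

def next_period(start: date, interval: str) -> date:
    # Mirrors the removed _next_period: unknown intervals fall back to monthly.
    normalized = (interval or "monthly").strip().lower()
    steps = {
        "daily": relativedelta(days=1),
        "biweekly": relativedelta(weeks=2),
        "quarterly": relativedelta(months=3),
        "yearly": relativedelta(years=1),
    }
    return start + steps.get(normalized, relativedelta(months=1))

print(next_period(date(2026, 1, 31), "monthly"))    # 2026-02-28 (relativedelta clamps to month end)
print(next_period(date(2026, 3, 18), "quarterly"))  # 2026-06-18
```

Note the month-end clamping: `relativedelta(months=1)` from 31 January yields 28 February, which matters when subscription periods anchor on the 29th-31st.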
-@router.post("/billing/subscriptions/preview")
-async def preview_subscription_billing(payload: Dict[str, Any]):
-    """
-    Preview aggregated customer billing from due subscriptions.
-    Generates prorata suggestions for approved-but-not-applied price changes.
-    """
-    try:
-        as_of = _to_date(payload.get("as_of")) or date.today()
-        customer_id = payload.get("customer_id")
-
-        where = ["s.status = 'active'", "s.next_invoice_date <= %s", "COALESCE(s.billing_blocked, false) = false"]
-        params: List[Any] = [as_of]
-        if customer_id:
-            where.append("s.customer_id = %s")
-            params.append(customer_id)
-
-        subscriptions = execute_query(
-            f"""
-            SELECT
-                s.id,
-                s.customer_id,
-                c.name AS customer_name,
-                s.product_name,
-                s.billing_interval,
-                s.billing_direction,
-                s.invoice_merge_key,
-                s.next_invoice_date,
-                s.period_start,
-                s.price,
-                COALESCE(
-                    (
-                        SELECT json_agg(
-                            json_build_object(
-                                'id', i.id,
-                                'description', i.description,
-                                'quantity', i.quantity,
-                                'unit_price', i.unit_price,
-                                'line_total', i.line_total,
-                                'asset_id', i.asset_id,
-                                'period_from', i.period_from,
-                                'period_to', i.period_to,
-                                'billing_blocked', i.billing_blocked
-                            ) ORDER BY i.line_no ASC, i.id ASC
-                        )
-                        FROM sag_subscription_items i
-                        WHERE i.subscription_id = s.id
-                    ),
-                    '[]'::json
-                ) AS line_items
-            FROM sag_subscriptions s
-            LEFT JOIN customers c ON c.id = s.customer_id
-            WHERE {' AND '.join(where)}
-            ORDER BY s.customer_id, s.next_invoice_date, s.id
-            """,
-            tuple(params),
-        ) or []
-        groups: Dict[str, Dict[str, Any]] = {}
-        for sub in subscriptions:
-            merge_key = sub.get("invoice_merge_key") or f"cust-{sub['customer_id']}"
-            key = f"{sub['customer_id']}|{merge_key}|{sub.get('billing_direction') or 'forward'}|{sub.get('next_invoice_date')}"
-            grp = groups.setdefault(
-                key,
-                {
-                    "customer_id": sub["customer_id"],
-                    "customer_name": sub.get("customer_name"),
-                    "merge_key": merge_key,
-                    "billing_direction": sub.get("billing_direction") or "forward",
-                    "invoice_date": str(sub.get("next_invoice_date")),
-                    "coverage_start": None,
-                    "coverage_end": None,
-                    "subscription_ids": [],
-                    "line_count": 0,
-                    "amount_total": 0.0,
-                },
-            )
-
-            sub_id = int(sub["id"])
-            grp["subscription_ids"].append(sub_id)
-            start = _to_date(sub.get("period_start") or sub.get("next_invoice_date")) or as_of
-            end = _next_period(start, sub.get("billing_interval") or "monthly")
-            grp["coverage_start"] = str(start) if grp["coverage_start"] is None or str(start) < grp["coverage_start"] else grp["coverage_start"]
-            grp["coverage_end"] = str(end) if grp["coverage_end"] is None or str(end) > grp["coverage_end"] else grp["coverage_end"]
-
-            for item in sub.get("line_items") or []:
-                if item.get("billing_blocked"):
-                    continue
-                grp["line_count"] += 1
-                grp["amount_total"] += float(item.get("line_total") or 0)
-
-        price_changes = execute_query(
-            """
-            SELECT
-                spc.id,
-                spc.subscription_id,
-                spc.subscription_item_id,
-                spc.old_unit_price,
-                spc.new_unit_price,
-                spc.effective_date,
-                spc.approval_status,
-                spc.reason,
-                s.period_start,
-                s.billing_interval
-            FROM subscription_price_changes spc
-            JOIN sag_subscriptions s ON s.id = spc.subscription_id
-            WHERE spc.deleted_at IS NULL
-              AND spc.approval_status IN ('approved', 'pending')
-              AND spc.effective_date <= %s
-            ORDER BY spc.effective_date ASC, spc.id ASC
-            """,
-            (as_of,),
-        ) or []
-
-        prorata_suggestions: List[Dict[str, Any]] = []
-        for change in price_changes:
-            period_start = _to_date(change.get("period_start"))
-            if not period_start:
-                continue
-            period_end = _next_period(period_start, change.get("billing_interval") or "monthly")
-            eff = _to_date(change.get("effective_date"))
-            if not eff:
-                continue
-            if eff <= period_start or eff >= period_end:
-                continue
-
-            total_days = max((period_end - period_start).days, 1)
-            remaining_days = max((period_end - eff).days, 0)
-            old_price = float(change.get("old_unit_price") or 0)
-            new_price = float(change.get("new_unit_price") or 0)
-            delta = new_price - old_price
-            prorata_amount = round(delta * (remaining_days / total_days), 2)
-            if prorata_amount == 0:
-                continue
-
-            prorata_suggestions.append(
-                {
-                    "price_change_id": change.get("id"),
-                    "subscription_id": change.get("subscription_id"),
-                    "subscription_item_id": change.get("subscription_item_id"),
-                    "effective_date": str(eff),
-                    "period_start": str(period_start),
-                    "period_end": str(period_end),
-                    "old_unit_price": old_price,
-                    "new_unit_price": new_price,
-                    "remaining_days": remaining_days,
-                    "total_days": total_days,
-                    "suggested_adjustment": prorata_amount,
-                    "adjustment_type": "debit" if prorata_amount > 0 else "credit",
-                    "reason": change.get("reason"),
-                    "requires_manual_approval": True,
-                }
-            )
-
-        return {
-            "status": "preview",
-            "as_of": str(as_of),
-            "group_count": len(groups),
-            "groups": list(groups.values()),
-            "prorata_suggestions": prorata_suggestions,
-        }
-    except HTTPException:
-        raise
-    except Exception as e:
-        raise HTTPException(status_code=500, detail=f"Failed to preview subscription billing: {e}")
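The prorata suggestion in the removed preview endpoint scales the price delta by the fraction of the billing period remaining after the effective date. That arithmetic, isolated as a standalone function (same formula as above; the function name is just for the sketch):

```python
from datetime import date

def prorata_adjustment(old_price: float, new_price: float,
                       period_start: date, period_end: date,
                       effective: date) -> float:
    # Same formula as the removed endpoint: charge (or credit) the price
    # delta only for the days remaining after the effective date.
    total_days = max((period_end - period_start).days, 1)
    remaining_days = max((period_end - effective).days, 0)
    return round((new_price - old_price) * (remaining_days / total_days), 2)

# Price rises 100 -> 130 halfway through a 30-day period: 15 of 30 days at +30 delta.
print(prorata_adjustment(100.0, 130.0, date(2026, 3, 1), date(2026, 3, 31), date(2026, 3, 16)))  # 15.0
```

A negative result corresponds to the `"credit"` adjustment type in the removed code, a positive one to `"debit"`.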
-@router.post("/billing/prorata-adjustments/draft")
-async def create_prorata_adjustment_draft(payload: Dict[str, Any]):
-    """
-    Create a manual adjustment draft from an approved prorata suggestion.
-    Payload expects customer_id, subscription_id, amount, reason and optional effective dates.
-    """
-    conn = get_db_connection()
-    try:
-        customer_id = payload.get("customer_id")
-        subscription_id = payload.get("subscription_id")
-        amount = float(payload.get("amount") or 0)
-        reason = (payload.get("reason") or "Prorata justering").strip()
-        effective_date = _to_date(payload.get("effective_date")) or date.today()
-        period_start = _to_date(payload.get("period_start"))
-        period_end = _to_date(payload.get("period_end"))
-
-        if not customer_id:
-            raise HTTPException(status_code=400, detail="customer_id is required")
-        if not subscription_id:
-            raise HTTPException(status_code=400, detail="subscription_id is required")
-        if amount == 0:
-            raise HTTPException(status_code=400, detail="amount must be non-zero")
-
-        with conn.cursor(cursor_factory=RealDictCursor) as cursor:
-            cursor.execute(
-                """
-                SELECT id, customer_id, product_name
-                FROM sag_subscriptions
-                WHERE id = %s
-                """,
-                (subscription_id,),
-            )
-            sub = cursor.fetchone()
-            if not sub:
-                raise HTTPException(status_code=404, detail="Subscription not found")
-            if int(sub.get("customer_id") or 0) != int(customer_id):
-                raise HTTPException(status_code=400, detail="customer_id mismatch for subscription")
-
-            adjustment_label = "Prorata tillæg" if amount > 0 else "Prorata kredit"
-            line = {
-                "product": {
-                    "productNumber": "PRORATA",
-                    "description": f"{adjustment_label}: {sub.get('product_name') or 'Abonnement'}"
-                },
-                "quantity": 1,
-                "unitNetPrice": amount,
-                "totalNetAmount": amount,
-                "discountPercentage": 0,
-                "metadata": {
-                    "subscription_id": subscription_id,
-                    "effective_date": str(effective_date),
-                    "period_start": str(period_start) if period_start else None,
-                    "period_end": str(period_end) if period_end else None,
-                    "reason": reason,
-                    "manual_approval": True,
-                }
-            }
-
-            cursor.execute(
-                """
-                INSERT INTO ordre_drafts (
-                    title,
-                    customer_id,
-                    lines_json,
-                    notes,
-                    coverage_start,
-                    coverage_end,
-                    billing_direction,
-                    source_subscription_ids,
-                    invoice_aggregate_key,
-                    layout_number,
-                    created_by_user_id,
-                    sync_status,
-                    export_status_json,
-                    updated_at
-                ) VALUES (
-                    %s, %s, %s::jsonb, %s,
-                    %s, %s, %s, %s, %s,
-                    %s, %s, %s, %s::jsonb, CURRENT_TIMESTAMP
-                )
-                RETURNING id, created_at
-                """,
-                (
-                    f"Manuel {adjustment_label}",
-                    customer_id,
-                    json.dumps([line], ensure_ascii=False),
-                    reason,
-                    period_start,
-                    period_end,
-                    "backward",
-                    [subscription_id],
-                    f"manual-prorata-{customer_id}",
-                    1,
-                    payload.get("created_by_user_id"),
-                    "pending",
-                    json.dumps(
-                        {
-                            "source": "prorata_manual",
-                            "subscription_id": subscription_id,
-                            "effective_date": str(effective_date),
-                        },
-                        ensure_ascii=False,
-                    ),
-                ),
-            )
-            created = cursor.fetchone()
-
-        conn.commit()
-        return {
-            "status": "draft_created",
-            "draft_id": created.get("id") if created else None,
-            "created_at": created.get("created_at") if created else None,
-            "subscription_id": subscription_id,
-            "amount": amount,
-        }
-    except HTTPException:
-        conn.rollback()
-        raise
-    except Exception as e:
-        conn.rollback()
-        raise HTTPException(status_code=500, detail=f"Failed to create prorata adjustment draft: {e}")
-    finally:
-        release_db_connection(conn)
-@router.post("/billing/drafts/reconcile-sync-status")
-async def reconcile_draft_sync_status(payload: Dict[str, Any]):
-    """
-    Reconcile ordre_drafts sync_status from known economic references.
-    Rules:
-    - pending/failed + economic_order_number -> exported
-    - exported + economic_invoice_number -> posted
-    - posted + mark_paid_ids contains draft id -> paid
-    """
-    try:
-        apply_changes = bool(payload.get("apply", False))
-        result = await reconcile_ordre_drafts_sync_status(apply_changes=apply_changes)
-
-        mark_paid_ids = set(int(x) for x in (payload.get("mark_paid_ids") or []) if str(x).isdigit())
-        if apply_changes and mark_paid_ids:
-            conn = get_db_connection()
-            try:
-                with conn.cursor(cursor_factory=RealDictCursor) as cursor:
-                    for draft_id in mark_paid_ids:
-                        cursor.execute("SELECT sync_status FROM ordre_drafts WHERE id = %s", (draft_id,))
-                        before = cursor.fetchone()
-                        from_status = (before or {}).get("sync_status")
-                        cursor.execute(
-                            """
-                            UPDATE ordre_drafts
-                            SET sync_status = 'paid',
-                                last_sync_at = CURRENT_TIMESTAMP,
-                                updated_at = CURRENT_TIMESTAMP,
-                                last_exported_at = CURRENT_TIMESTAMP
-                            WHERE id = %s
-                              AND sync_status = 'posted'
-                            RETURNING id
-                            """,
-                            (draft_id,),
-                        )
-                        updated = cursor.fetchone()
-                        if updated:
-                            cursor.execute(
-                                """
-                                INSERT INTO ordre_draft_sync_events (
-                                    draft_id,
-                                    event_type,
-                                    from_status,
-                                    to_status,
-                                    event_payload,
-                                    created_by_user_id
-                                ) VALUES (%s, %s, %s, %s, %s::jsonb, NULL)
-                                """,
-                                (
-                                    draft_id,
-                                    'sync_status_manual_paid',
-                                    from_status,
-                                    'paid',
-                                    '{"source":"billing_reconcile_endpoint"}',
-                                ),
-                            )
-                conn.commit()
-            finally:
-                release_db_connection(conn)
-
-        if mark_paid_ids:
-            result["mark_paid_ids"] = sorted(mark_paid_ids)
-        return result
-    except Exception as e:
-        raise HTTPException(status_code=500, detail=f"Failed to reconcile draft sync status: {e}")
|
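The status rules listed in the removed endpoint's docstring above form a small state machine. They can be sketched as a pure transition function (a sketch with illustrative names, not the project's actual helper):

```python
def next_sync_status(status, order_number=None, invoice_number=None, marked_paid=False):
    """Return the next sync_status for a draft, or None when no rule applies."""
    if status in ("pending", "failed") and order_number:
        return "exported"   # an e-conomic order number proves the draft was exported
    if status == "exported" and invoice_number:
        return "posted"     # an invoice number proves the order was booked
    if status == "posted" and marked_paid:
        return "paid"       # manual confirmation, e.g. via mark_paid_ids
    return None             # no transition; leave the stored status untouched
```

Keeping the rules in one pure function makes the endpoint's dry-run mode (`apply=false`) trivial to test without touching the database.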
@@ -3,7 +3,7 @@ Supplier Invoices Router - Leverandørfakturaer (Kassekladde)
 Backend API for managing supplier invoices that integrate with e-conomic
 """
 
-from fastapi import APIRouter, HTTPException, UploadFile, File, BackgroundTasks
+from fastapi import APIRouter, HTTPException, UploadFile, File
 from pydantic import BaseModel
 from typing import List, Dict, Optional
 from datetime import datetime, date, timedelta
@@ -339,22 +339,10 @@ async def get_files_by_status(status: Optional[str] = None, limit: int = 100):
             SELECT f.file_id, f.filename, f.file_path, f.file_size, f.mime_type,
                    f.status, f.uploaded_at, f.processed_at, f.detected_cvr,
                    f.detected_vendor_id, v.name as detected_vendor_name,
-                   ext.vendor_name,
-                   ext.vendor_cvr,
-                   ext.vendor_matched_id,
-                   COALESCE(v_ext.name, ext.vendor_name, v.name) as best_vendor_name,
-                   ext.total_amount,
-                   ext.confidence as vendor_match_confidence
+                   e.total_amount as detected_amount
             FROM incoming_files f
             LEFT JOIN vendors v ON f.detected_vendor_id = v.id
-            LEFT JOIN LATERAL (
-                SELECT vendor_name, vendor_cvr, vendor_matched_id, total_amount, confidence
-                FROM extractions
-                WHERE file_id = f.file_id
-                ORDER BY created_at DESC
-                LIMIT 1
-            ) ext ON true
-            LEFT JOIN vendors v_ext ON v_ext.id = ext.vendor_matched_id
+            LEFT JOIN extractions e ON f.file_id = e.file_id
             WHERE f.status IN ({placeholders})
             ORDER BY f.uploaded_at DESC
             LIMIT %s
@@ -365,22 +353,10 @@ async def get_files_by_status(status: Optional[str] = None, limit: int = 100):
             SELECT f.file_id, f.filename, f.file_path, f.file_size, f.mime_type,
                    f.status, f.uploaded_at, f.processed_at, f.detected_cvr,
                    f.detected_vendor_id, v.name as detected_vendor_name,
-                   ext.vendor_name,
-                   ext.vendor_cvr,
-                   ext.vendor_matched_id,
-                   COALESCE(v_ext.name, ext.vendor_name, v.name) as best_vendor_name,
-                   ext.total_amount,
-                   ext.confidence as vendor_match_confidence
+                   e.total_amount as detected_amount
             FROM incoming_files f
             LEFT JOIN vendors v ON f.detected_vendor_id = v.id
-            LEFT JOIN LATERAL (
-                SELECT vendor_name, vendor_cvr, vendor_matched_id, total_amount, confidence
-                FROM extractions
-                WHERE file_id = f.file_id
-                ORDER BY created_at DESC
-                LIMIT 1
-            ) ext ON true
-            LEFT JOIN vendors v_ext ON v_ext.id = ext.vendor_matched_id
+            LEFT JOIN extractions e ON f.file_id = e.file_id
             ORDER BY f.uploaded_at DESC
             LIMIT %s
             """
@@ -527,28 +503,6 @@ async def get_file_extracted_data(file_id: int):
 
         due_date_value = llm_json_data.get('due_date')
 
-        # Vendor name: AI uses 'vendor_name', invoice2data uses 'issuer'
-        vendor_name_val = (
-            llm_json_data.get('vendor_name') or
-            llm_json_data.get('issuer') or
-            (extraction.get('vendor_name') if extraction else None)
-        )
-        # Vendor CVR: AI uses 'vendor_cvr', invoice2data uses 'vendor_vat'
-        vendor_cvr_val = (
-            llm_json_data.get('vendor_cvr') or
-            llm_json_data.get('vendor_vat') or
-            (extraction.get('vendor_cvr') if extraction else None)
-        )
-        # Vendor address: AI uses 'vendor_address', invoice2data may have separate fields
-        vendor_address_val = (
-            llm_json_data.get('vendor_address') or
-            llm_json_data.get('supplier_address') or
-            llm_json_data.get('vendor_street')
-        )
-        vendor_city_val = llm_json_data.get('vendor_city') or llm_json_data.get('city')
-        vendor_postal_val = llm_json_data.get('vendor_postal_code') or llm_json_data.get('postal_code')
-        vendor_email_val = llm_json_data.get('vendor_email') or llm_json_data.get('supplier_email')
-
         # Use invoice_number from LLM JSON (works for both AI and template extraction)
         llm_data = {
             "invoice_number": llm_json_data.get('invoice_number'),
@@ -557,12 +511,6 @@ async def get_file_extracted_data(file_id: int):
             "total_amount": float(total_amount_value) if total_amount_value else None,
             "currency": llm_json_data.get('currency') or 'DKK',
             "document_type": llm_json_data.get('document_type'),
-            "vendor_name": vendor_name_val,
-            "vendor_cvr": vendor_cvr_val,
-            "vendor_address": vendor_address_val,
-            "vendor_city": vendor_city_val,
-            "vendor_postal_code": vendor_postal_val,
-            "vendor_email": vendor_email_val,
             "lines": formatted_lines
         }
     elif extraction:
@@ -574,12 +522,6 @@ async def get_file_extracted_data(file_id: int):
             "total_amount": float(extraction.get('total_amount')) if extraction.get('total_amount') else None,
             "currency": extraction.get('currency') or 'DKK',
             "document_type": extraction.get('document_type'),
-            "vendor_name": extraction.get('vendor_name'),
-            "vendor_cvr": extraction.get('vendor_cvr'),
-            "vendor_address": None,
-            "vendor_city": None,
-            "vendor_postal_code": None,
-            "vendor_email": None,
             "lines": formatted_lines
         }
 
@@ -757,36 +699,17 @@ async def link_vendor_to_extraction(file_id: int, data: dict):
             (file_id,))
 
         if not extraction:
-            # No extraction exists (e.g. custom template match or not yet processed)
-            # Create a minimal placeholder extraction so vendor can be linked
-            logger.info(f"⚠️ No extraction for file {file_id} — creating minimal extraction for vendor link")
-            extraction_id = execute_insert(
-                """INSERT INTO extractions
-                   (file_id, vendor_matched_id, vendor_name, vendor_cvr,
-                    document_id, document_type, document_type_detected,
-                    currency, confidence, status)
-                   VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
-                   RETURNING extraction_id""",
-                (file_id, vendor_id,
-                 vendor['name'], None,
-                 None, 'invoice', 'invoice',
-                 'DKK', 1.0, 'manual')
-            )
-        else:
-            extraction_id = extraction['extraction_id']
-            # Update extraction with vendor match
-            execute_update(
-                "UPDATE extractions SET vendor_matched_id = %s WHERE extraction_id = %s",
-                (vendor_id, extraction_id)
-            )
+            raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
 
-        # Also update incoming_files so table shows vendor immediately
+        # Update extraction with vendor match
         execute_update(
-            "UPDATE incoming_files SET detected_vendor_id = %s, status = 'processed' WHERE file_id = %s",
-            (vendor_id, file_id)
+            """UPDATE extractions
+               SET vendor_matched_id = %s
+               WHERE extraction_id = %s""",
+            (vendor_id, extraction['extraction_id'])
         )
 
-        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to file {file_id}")
+        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to extraction for file {file_id}")
 
         return {
             "status": "success",
@@ -900,37 +823,21 @@ async def link_vendor_to_extraction(file_id: int, data: dict):
             (file_id,))
 
         if not extraction:
-            # Create minimal extraction if none exists
-            logger.info(f"⚠️ No extraction for file {file_id} — creating minimal extraction for vendor link")
-            extraction_id = execute_insert(
-                """INSERT INTO extractions
-                   (file_id, vendor_matched_id, vendor_name, vendor_cvr,
-                    document_id, document_type, document_type_detected,
-                    currency, confidence, status)
-                   VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
-                   RETURNING extraction_id""",
-                (file_id, vendor_id, vendor['name'], None,
-                 None, 'invoice', 'invoice', 'DKK', 1.0, 'manual')
-            )
-        else:
-            extraction_id = extraction['extraction_id']
-            execute_update(
-                "UPDATE extractions SET vendor_matched_id = %s WHERE extraction_id = %s",
-                (vendor_id, extraction_id)
-            )
+            raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
 
+        # Update extraction with vendor match
         execute_update(
-            "UPDATE incoming_files SET detected_vendor_id = %s, status = 'processed' WHERE file_id = %s",
-            (vendor_id, file_id)
+            "UPDATE extractions SET vendor_matched_id = %s WHERE extraction_id = %s",
+            (vendor_id, extraction['extraction_id'])
         )
 
-        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to extraction {extraction_id}")
+        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to extraction {extraction['extraction_id']}")
 
         return {
             "status": "success",
             "vendor_id": vendor_id,
             "vendor_name": vendor['name'],
-            "extraction_id": extraction_id
+            "extraction_id": extraction['extraction_id']
         }
 
     except HTTPException:
@@ -1703,10 +1610,6 @@ async def delete_supplier_invoice(invoice_id: int):
 class ApproveRequest(BaseModel):
     approved_by: str
 
-
-class MarkPaidRequest(BaseModel):
-    paid_date: Optional[date] = None
-
 @router.post("/supplier-invoices/{invoice_id}/approve")
 async def approve_supplier_invoice(invoice_id: int, request: ApproveRequest):
     """Approve supplier invoice for payment"""
@@ -1739,58 +1642,6 @@ async def approve_supplier_invoice(invoice_id: int, request: ApproveRequest):
         raise HTTPException(status_code=500, detail=str(e))
 
 
-@router.post("/supplier-invoices/{invoice_id}/mark-paid")
-async def mark_supplier_invoice_paid(invoice_id: int, request: MarkPaidRequest):
-    """Mark supplier invoice as paid."""
-    try:
-        invoice = execute_query_single(
-            "SELECT id, invoice_number, status FROM supplier_invoices WHERE id = %s",
-            (invoice_id,)
-        )
-
-        if not invoice:
-            raise HTTPException(status_code=404, detail=f"Faktura {invoice_id} ikke fundet")
-
-        if invoice['status'] == 'paid':
-            return {"success": True, "invoice_id": invoice_id, "status": "paid"}
-
-        if invoice['status'] not in ('approved', 'sent_to_economic'):
-            raise HTTPException(
-                status_code=400,
-                detail=(
-                    f"Faktura har status '{invoice['status']}' - "
-                    "kun 'approved' eller 'sent_to_economic' kan markeres som betalt"
-                )
-            )
-
-        execute_update(
-            """UPDATE supplier_invoices
-               SET status = 'paid', updated_at = CURRENT_TIMESTAMP
-               WHERE id = %s""",
-            (invoice_id,)
-        )
-
-        logger.info(
-            "✅ Marked supplier invoice %s (ID: %s) as paid (date: %s)",
-            invoice['invoice_number'],
-            invoice_id,
-            request.paid_date,
-        )
-
-        return {
-            "success": True,
-            "invoice_id": invoice_id,
-            "status": "paid",
-            "paid_date": request.paid_date,
-        }
-
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error(f"❌ Failed to mark invoice {invoice_id} as paid: {e}")
-        raise HTTPException(status_code=500, detail=str(e))
-
-
 @router.post("/supplier-invoices/{invoice_id}/send-to-economic")
 async def send_to_economic(invoice_id: int):
     """
@@ -2131,16 +1982,10 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
     try:
         # Validate file extension
        suffix = Path(file.filename).suffix.lower()
-        suffix_clean = suffix.lstrip('.')
-        # Build allowed set — guard against pydantic parsing CSV as a single element
-        raw = settings.ALLOWED_EXTENSIONS
-        if len(raw) == 1 and ',' in raw[0]:
-            raw = [e.strip() for e in raw[0].split(',')]
-        allowed_clean = {ext.lower().lstrip('.') for ext in raw}
-        if suffix_clean not in allowed_clean:
+        if suffix not in settings.ALLOWED_EXTENSIONS:
             raise HTTPException(
                 status_code=400,
-                detail=f"Filtype {suffix} ikke tilladt. Tilladte: {', '.join(sorted(allowed_clean))}"
+                detail=f"Filtype {suffix} ikke tilladt. Tilladte: {', '.join(settings.ALLOWED_EXTENSIONS)}"
             )
 
         # Create upload directory
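The removed `-` branch above normalizes the allow-list so a comma-separated value that pydantic parsed as a single list element still validates. The normalization in isolation (a sketch; the settings object is assumed to expose a plain list of extensions):

```python
def normalize_allowed_extensions(raw):
    """Split a lone CSV element into parts, then strip dots and lowercase each entry."""
    if len(raw) == 1 and ',' in raw[0]:
        raw = [e.strip() for e in raw[0].split(',')]
    return {ext.lower().lstrip('.') for ext in raw}
```

Comparing dot-free, lowercased suffixes means `.PDF`, `pdf`, and `.pdf` all match the same allow-list entry.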
@@ -2152,7 +1997,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
 
         try:
             # Validate file size while saving
-            max_size = settings.EMAIL_MAX_UPLOAD_SIZE_MB * 1024 * 1024
+            max_size = settings.MAX_FILE_SIZE_MB * 1024 * 1024
             total_size = 0
 
             with open(temp_path, "wb") as buffer:
@@ -2162,7 +2007,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
                     temp_path.unlink(missing_ok=True)
                     raise HTTPException(
                         status_code=413,
-                        detail=f"Fil for stor (max {settings.EMAIL_MAX_UPLOAD_SIZE_MB}MB)"
+                        detail=f"Fil for stor (max {settings.MAX_FILE_SIZE_MB}MB)"
                     )
                 buffer.write(chunk)
 
@@ -2172,7 +2017,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
         checksum = ollama_service.calculate_file_checksum(temp_path)
 
         # Check for duplicate file
-        existing_file = execute_query_single(
+        existing_file = execute_query(
             "SELECT file_id, status FROM incoming_files WHERE checksum = %s",
             (checksum,))
 
@@ -2260,7 +2105,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
 
 
 
-@router.post("/supplier-invoices/{invoice_id}/send-to-economic-legacy-unimplemented")
+@router.post("/supplier-invoices/{invoice_id}/send-to-economic")
 async def send_invoice_to_economic(invoice_id: int):
     """Send supplier invoice to e-conomic - requires separate implementation"""
     raise HTTPException(status_code=501, detail="e-conomic integration kommer senere")
@@ -2452,65 +2297,21 @@ async def reprocess_uploaded_file(file_id: int):
             extracted_fields = llm_result
             confidence = llm_result.get('confidence', 0.75)
 
-            # Post-process: clear own CVR(s) if AI mistakenly returned them
-            extracted_cvr = llm_result.get('vendor_cvr')
-            own_cvr = getattr(settings, 'OWN_CVR', '29522790')
-            OWN_CVRS = {str(own_cvr).strip(), '29522790', '14416285'}  # alle BMC CVR numre
-            extracted_cvr_clean = str(extracted_cvr).replace('DK', '').strip() if extracted_cvr else ''
-            if extracted_cvr_clean and extracted_cvr_clean in OWN_CVRS:
-                logger.warning(f"⚠️ AI returned own CVR ({extracted_cvr_clean}) as vendor_cvr - clearing it")
-                llm_result['vendor_cvr'] = None
-                extracted_cvr = None
-                # Also clear vendor_name if it looks like BMC
-                vendor_name = llm_result.get('vendor_name', '') or ''
-                if 'BMC' in vendor_name.upper() and 'DENMARK' in vendor_name.upper():
-                    logger.warning(f"⚠️ AI returned own company name '{vendor_name}' as vendor_name - clearing it")
-                    llm_result['vendor_name'] = None
-
-            # Try to find vendor in DB by extracted CVR or name (overrides detected_vendor_id)
-            if extracted_cvr:
-                cvr_clean = str(extracted_cvr).replace('DK', '').strip()
-                vendor_row = execute_query_single(
-                    "SELECT id FROM vendors WHERE cvr_number = %s AND is_active = true",
-                    (cvr_clean,))
-                if vendor_row:
-                    vendor_id = vendor_row['id']
-                    logger.info(f"✅ Matched vendor by CVR {cvr_clean}: vendor_id={vendor_id}")
-                    execute_update(
-                        "UPDATE incoming_files SET detected_vendor_id = %s WHERE file_id = %s",
-                        (vendor_id, file_id))
-            if not vendor_id and llm_result.get('vendor_name'):
-                vendor_row = execute_query_single(
-                    "SELECT id FROM vendors WHERE name ILIKE %s AND is_active = true ORDER BY id LIMIT 1",
-                    (f"%{llm_result['vendor_name']}%",))
-                if vendor_row:
-                    vendor_id = vendor_row['id']
-                    logger.info(f"✅ Matched vendor by name '{llm_result['vendor_name']}': vendor_id={vendor_id}")
-                    execute_update(
-                        "UPDATE incoming_files SET detected_vendor_id = %s WHERE file_id = %s",
-                        (vendor_id, file_id))
 
             # Store AI extracted data in extractions table
             extraction_id = execute_insert(
-                """INSERT INTO extractions
-                   (file_id, vendor_matched_id, vendor_name, vendor_cvr,
-                    document_id, document_date, due_date,
-                    total_amount, currency, document_type, document_type_detected,
-                    confidence, llm_response_json, status)
-                   VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s) RETURNING extraction_id""",
+                """INSERT INTO supplier_invoice_extractions
+                   (file_id, vendor_id, invoice_number, invoice_date, due_date,
+                    total_amount, currency, document_type, confidence, llm_data)
+                   VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s) RETURNING extraction_id""",
                 (file_id, vendor_id,
-                 llm_result.get('vendor_name'),
-                 llm_result.get('vendor_cvr'),
                  llm_result.get('invoice_number'),
                  llm_result.get('invoice_date'),
                  llm_result.get('due_date'),
                  llm_result.get('total_amount'),
                  llm_result.get('currency', 'DKK'),
-                 llm_result.get('document_type', 'invoice'),
-                 llm_result.get('document_type', 'invoice'),
+                 llm_result.get('document_type'),
                  confidence,
-                 json.dumps(llm_result),
-                 'extracted')
+                 json.dumps(llm_result))
             )
 
             # Insert line items if extracted
@@ -2519,13 +2320,13 @@ async def reprocess_uploaded_file(file_id: int):
                     execute_insert(
                         """INSERT INTO extraction_lines
                            (extraction_id, line_number, description, quantity, unit_price,
-                            line_total, vat_rate, confidence)
-                           VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
+                            line_total, vat_rate, vat_note, confidence)
+                           VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
                            RETURNING line_id""",
                         (extraction_id, idx, line.get('description'),
                          line.get('quantity'), line.get('unit_price'),
                          line.get('line_total'), line.get('vat_rate'),
-                         confidence)
+                         line.get('vat_note'), confidence)
                     )
 
                     # Update file status to ai_extracted
@@ -2575,47 +2376,6 @@ async def reprocess_uploaded_file(file_id: int):
         raise HTTPException(status_code=500, detail=f"Genbehandling fejlede: {str(e)}")
 
 
-@router.post("/supplier-invoices/files/batch-analyze")
-async def batch_analyze_files(background_tasks: BackgroundTasks):
-    """
-    Kør AI-analyse på alle ubehandlede filer i baggrunden.
-    Returnerer øjeblikkeligt – filer behandles async.
-    """
-    pending = execute_query(
-        """SELECT file_id, filename FROM incoming_files
-           WHERE status IN ('pending', 'requires_vendor_selection', 'uploaded', 'failed')
-           ORDER BY uploaded_at DESC
-           LIMIT 100""",
-        ()
-    )
-    if not pending:
-        return {"started": 0, "message": "Ingen filer at behandle"}
-
-    file_ids = [r['file_id'] for r in pending]
-    logger.info(f"🚀 Batch-analyse startet for {len(file_ids)} filer")
-
-    async def _run_batch(ids):
-        ok = err = 0
-        for fid in ids:
-            try:
-                await reprocess_uploaded_file(fid)
-                ok += 1
-            except Exception as ex:
-                logger.error(f"❌ Batch fejl file {fid}: {ex}")
-                err += 1
-        logger.info(f"✅ Batch færdig: {ok} ok, {err} fejlet")
-
-    background_tasks.add_task(_run_batch, file_ids)
-
-    return {
-        "started": len(file_ids),
-        "message": f"{len(file_ids)} filer sendt til analyse i baggrunden. Opdater siden om lidt.",
-        "analyzed": 0,
-        "requires_vendor_selection": 0,
-        "failed": 0
-    }
-
-
 @router.put("/supplier-invoices/templates/{template_id}")
 async def update_template(
     template_id: int,
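The removed batch endpoint above hands the file list to a background task that processes each file independently and tallies successes and failures instead of aborting on the first error. The tallying core as a sketch (illustrative names; the real loop awaits `reprocess_uploaded_file`):

```python
import asyncio

async def run_batch(ids, process):
    """Await process(fid) for each id, counting successes and failures."""
    ok = err = 0
    for fid in ids:
        try:
            await process(fid)
            ok += 1
        except Exception:
            err += 1  # log-and-continue keeps one bad file from stopping the batch
    return ok, err
```

Returning the counts (rather than raising) lets the endpoint report a summary while the HTTP response itself stays immediate.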
@@ -3489,9 +3249,16 @@ async def retry_extraction(file_id: int):
 
         logger.info(f"🔄 Retrying extraction for file {file_id}: {file_data['filename']}")
 
-        # Run full extraction cascade immediately
-        result = await reprocess_uploaded_file(file_id)
-        return result
+        # Trigger re-analysis by calling the existing upload processing logic
+        # For now, just mark as pending - the user can then run batch-analyze
+        return {
+            "file_id": file_id,
+            "filename": file_data['filename'],
+            "message": "File marked for re-analysis. Run batch-analyze to process.",
+            "previous_status": file_data['status'],
+            "new_status": "pending"
+        }
 
     except HTTPException:
         raise
@@ -110,9 +110,6 @@
                 <p class="text-muted mb-0">Kassekladde - Integration med e-conomic</p>
             </div>
             <div>
-                <a href="/billing/sync-dashboard" class="btn btn-outline-dark me-2">
-                    <i class="bi bi-activity me-2"></i>Sync Dashboard
-                </a>
                 <a href="/billing/templates" class="btn btn-outline-secondary me-2">
                     <i class="bi bi-grid-3x3 me-2"></i>Se Templates
                 </a>
@@ -176,11 +173,6 @@
                 <i class="bi bi-calendar-check me-2"></i>Til Betaling
             </a>
         </li>
-        <li class="nav-item">
-            <a class="nav-link" id="ready-tab" data-bs-toggle="tab" href="#ready-content" onclick="switchToReadyTab()">
-                <i class="bi bi-check-circle me-2"></i>Klar til Bogføring
-            </a>
-        </li>
         <li class="nav-item">
             <a class="nav-link" id="lines-tab" data-bs-toggle="tab" href="#lines-content" onclick="switchToLinesTab()">
                 <i class="bi bi-list-ul me-2"></i>Varelinjer
@@ -256,7 +248,7 @@
                 <strong><span id="selectedKassekladdeCount">0</span> fakturaer valgt</strong>
             </div>
             <div class="btn-group" role="group">
-                <button type="button" class="btn btn-sm btn-success" onclick="bulkSendToEconomicKassekladde()" title="Send til e-conomic kassekladde">
+                <button type="button" class="btn btn-sm btn-success" onclick="bulkSendToEconomic()" title="Send til e-conomic kassekladde">
                     <i class="bi bi-send me-1"></i>Send til e-conomic
                 </button>
             </div>
@@ -875,133 +867,6 @@
         </div>
     </div>
 
-    <!-- =============================================
-         QUICK OPRET LEVERANDØR — Split-view modal
-         Venstre: PDF iframe | Højre: Vendor form
-         ============================================== -->
-    <div class="modal fade" id="quickVendorSplitModal" tabindex="-1" style="--bs-modal-width: 100%;">
-        <div class="modal-dialog modal-fullscreen">
-            <div class="modal-content">
-                <div class="modal-header py-2 border-bottom">
-                    <h5 class="modal-title"><i class="bi bi-person-plus me-2"></i>Opret / Link Leverandør</h5>
-                    <div class="ms-3 d-flex align-items-center gap-2">
-                        <span class="badge bg-secondary" id="qvSplitFilename" style="font-size:.85rem"></span>
-                    </div>
-                    <button type="button" class="btn-close ms-auto" data-bs-dismiss="modal"></button>
-                </div>
-                <div class="modal-body p-0 d-flex" style="height: calc(100vh - 120px); overflow:hidden;">
-
-                    <!-- LEFT: PDF viewer -->
-                    <div class="d-flex flex-column border-end" style="width:58%; min-width:400px; height:100%;">
-                        <div class="px-3 py-2 bg-body-tertiary border-bottom small text-muted flex-shrink-0">
-                            <i class="bi bi-file-pdf text-danger me-1"></i>Faktura PDF
-                        </div>
-                        <iframe id="qvPdfFrame" src="" style="flex:1 1 0; min-height:0; border:none; width:100%;" title="PDF Preview"></iframe>
-                    </div>
-
-                    <!-- RIGHT: Vendor form -->
-                    <div class="d-flex flex-column" style="width:42%; overflow-y:auto;">
-                        <div class="px-4 py-3 bg-body-tertiary border-bottom">
-                            <span class="small text-muted">Udfyld leverandøroplysninger — felter er preudfyldt fra faktura-PDF</span>
-                        </div>
-                        <div class="px-4 py-3">
-                            <input type="hidden" id="qvFileId">
-                            <input type="hidden" id="qvExistingVendorId">
-
-                            <!-- Search existing -->
-                            <div class="card mb-3 border-primary">
-                                <div class="card-header py-2 bg-primary text-white small"><i class="bi bi-search me-1"></i>Link eksisterende leverandør</div>
-                                <div class="card-body py-2">
-                                    <div class="input-group input-group-sm">
-                                        <input type="text" class="form-control" id="qvSearchInput" placeholder="Søg navn eller CVR..." oninput="qvSearchVendors(this.value)">
-                                        <button class="btn btn-outline-secondary" type="button" onclick="qvSearchVendors(document.getElementById('qvSearchInput').value)"><i class="bi bi-search"></i></button>
-                                    </div>
-                                    <div id="qvSearchResults" class="list-group mt-2" style="max-height:160px; overflow-y:auto;">
-                                        <div class="list-group-item text-muted small py-1">Søg for at finde eksisterende leverandør</div>
-                                    </div>
-                                </div>
-                            </div>
-
-                            <div class="text-center text-muted my-2 small">— eller opret ny leverandør nedenfor —</div>
-
-                            <form id="qvVendorForm" autocomplete="off">
-                                <div class="row g-2 mb-2">
-                                    <div class="col-8">
-                                        <label class="form-label small mb-1">Navn *</label>
-                                        <input type="text" class="form-control form-control-sm" id="qvName" required placeholder="Firma navn">
-                                    </div>
-                                    <div class="col-4">
-                                        <label class="form-label small mb-1">CVR</label>
-                                        <input type="text" class="form-control form-control-sm" id="qvCVR" maxlength="8" placeholder="12345678">
-                                    </div>
-                                </div>
-                                <div class="row g-2 mb-2">
-                                    <div class="col-6">
-                                        <label class="form-label small mb-1">Email</label>
-                                        <input type="email" class="form-control form-control-sm" id="qvEmail" placeholder="kontakt@firma.dk">
-                                    </div>
-                                    <div class="col-6">
-                                        <label class="form-label small mb-1">Telefon</label>
-                                        <input type="tel" class="form-control form-control-sm" id="qvPhone" placeholder="+45 12 34 56 78">
-                                    </div>
-                                </div>
-                                <div class="mb-2">
-                                    <label class="form-label small mb-1">Adresse</label>
-                                    <input type="text" class="form-control form-control-sm" id="qvAddress" placeholder="Vejnavn nr.">
-                                </div>
-                                <div class="row g-2 mb-2">
-                                    <div class="col-4">
-                                        <label class="form-label small mb-1">Postnr.</label>
|
|
||||||
<input type="text" class="form-control form-control-sm" id="qvPostal" maxlength="10">
|
|
||||||
</div>
|
|
||||||
<div class="col-8">
|
|
||||||
<label class="form-label small mb-1">By</label>
|
|
||||||
<input type="text" class="form-control form-control-sm" id="qvCity">
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
<div class="mb-2">
|
|
||||||
<label class="form-label small mb-1">Website / domæne</label>
|
|
||||||
<input type="text" class="form-control form-control-sm" id="qvDomain" placeholder="firma.dk">
|
|
||||||
</div>
|
|
||||||
<div class="mb-2">
|
|
||||||
<label class="form-label small mb-1">Kategori</label>
|
|
||||||
<select class="form-select form-select-sm" id="qvCategory">
|
|
||||||
<option value="general">Generel</option>
|
|
||||||
<option value="telecom">Telecom</option>
|
|
||||||
<option value="hardware">Hardware</option>
|
|
||||||
<option value="software">Software</option>
|
|
||||||
<option value="services">Services</option>
|
|
||||||
<option value="payroll">Løn / HR</option>
|
|
||||||
<option value="utilities">Forsyning</option>
|
|
||||||
<option value="insurance">Forsikring</option>
|
|
||||||
<option value="rent">Husleje / Lokaler</option>
|
|
||||||
</select>
|
|
||||||
</div>
|
|
||||||
<div class="mb-3">
|
|
||||||
<label class="form-label small mb-1">Noter (inkl. bank/IBAN info)</label>
|
|
||||||
<textarea class="form-control form-control-sm" id="qvNotes" rows="3" placeholder="IBAN, kontonummer, BIC/SWIFT, betalingsbetingelser..."></textarea>
|
|
||||||
</div>
|
|
||||||
</form>
|
|
||||||
|
|
||||||
<!-- Status alert -->
|
|
||||||
<div id="qvStatusAlert" class="alert d-none py-2 small"></div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
</div><!-- /.modal-body -->
|
|
||||||
|
|
||||||
<div class="modal-footer py-2 border-top justify-content-between">
|
|
||||||
<button type="button" class="btn btn-secondary btn-sm" data-bs-dismiss="modal">Luk</button>
|
|
||||||
<div class="d-flex gap-2">
|
|
||||||
<button type="button" class="btn btn-success btn-sm" onclick="saveQuickVendor()">
|
|
||||||
<i class="bi bi-person-plus me-1"></i>Opret og link leverandør
|
|
||||||
</button>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<!-- Link/Create Vendor Modal -->
<div class="modal fade" id="linkVendorModal" tabindex="-1">
<div class="modal-dialog">
@ -1400,7 +1265,7 @@ async function markSingleAsPaid(invoiceId) {
}

// Helper function to send single invoice to e-conomic
async function sendToEconomicById(invoiceId) {
    if (!confirm('Send denne faktura til e-conomic?')) return;

    try {
@ -1688,7 +1553,7 @@ async function loadReadyForBookingView() {
<button class="btn btn-sm btn-outline-primary" onclick="viewInvoiceDetails(${invoice.id})" title="Se/Rediger detaljer">
    <i class="bi bi-pencil-square"></i>
</button>
<button class="btn btn-sm btn-primary" onclick="sendToEconomicById(${invoice.id})" title="Send til e-conomic">
    <i class="bi bi-send"></i>
</button>
</td>
@ -1947,10 +1812,9 @@ function renderUnhandledFiles(files) {

for (const file of files) {
    const statusBadge = getFileStatusBadge(file.status);
    const vendorName = file.best_vendor_name || file.vendor_name || file.detected_vendor_name || '-';
    const confRaw = file.vendor_match_confidence;
    const confidence = confRaw !== null && confRaw !== undefined ? `${Math.round(confRaw * 100)}%` : '-';
    const amount = file.total_amount ? formatCurrency(file.total_amount) : '-';
    const uploadDate = file.uploaded_at ? new Date(file.uploaded_at).toLocaleDateString('da-DK') : '-';

    html += `
@ -1978,11 +1842,16 @@ function renderUnhandledFiles(files) {
<td>${statusBadge}</td>
<td>
    <div class="btn-group btn-group-sm">
        <button class="btn btn-outline-success" onclick="openQuickVendorCreate(${file.file_id}, '${escapeHtml(file.filename)}')" title="Opret / Link leverandør">
            <i class="bi bi-person-plus"></i>
        </button>
        <button class="btn btn-outline-warning" onclick="rerunSingleFile(${file.file_id})" title="Kør analyse igen">
            <i class="bi bi-arrow-repeat"></i>
        </button>
        <button class="btn btn-outline-danger" onclick="deleteFile(${file.file_id})" title="Slet">
            <i class="bi bi-trash"></i>
@ -2038,12 +1907,12 @@ function getFileStatusBadge(status) {

// NEW: Batch analyze all files
async function batchAnalyzeAllFiles() {
    if (!confirm('Kør automatisk analyse på alle ubehandlede filer?\n\nDette kan tage flere minutter afhængigt af antal filer.\nSiden opdateres automatisk undervejs.')) {
        return;
    }

    try {
        showLoadingOverlay('Starter analyse...');

        const response = await fetch('/api/v1/supplier-invoices/files/batch-analyze', {
            method: 'POST'
@ -2055,27 +1924,19 @@ async function batchAnalyzeAllFiles() {

        hideLoadingOverlay();

        if (result.started === 0) {
            alert('ℹ️ Ingen filer at behandle.');
            return;
        }

        alert(`✅ ${result.message}`);

        // Auto-refresh the table every 10 seconds for 5 minutes
        let refreshes = 0;
        const maxRefreshes = 30;
        const interval = setInterval(() => {
            loadUnhandledFiles();
            refreshes++;
            if (refreshes >= maxRefreshes) clearInterval(interval);
        }, 10000);
        loadUnhandledFiles();

    } catch (error) {
        hideLoadingOverlay();
        console.error('Batch analysis error:', error);
        alert('❌ Fejl ved batch-analyse: ' + error.message);
    }
}

@ -2104,293 +1965,6 @@ async function retryExtraction(fileId) {
    }
}

// ─── Quick Vendor Split-View ─────────────────────────────────────────────
async function openQuickVendorCreate(fileId, filename) {
    // Reset
    document.getElementById('qvFileId').value = fileId;
    document.getElementById('qvExistingVendorId').value = '';
    document.getElementById('qvSplitFilename').textContent = filename;
    document.getElementById('qvName').value = '';
    document.getElementById('qvCVR').value = '';
    document.getElementById('qvEmail').value = '';
    document.getElementById('qvPhone').value = '';
    document.getElementById('qvAddress').value = '';
    document.getElementById('qvPostal').value = '';
    document.getElementById('qvCity').value = '';
    document.getElementById('qvDomain').value = '';
    document.getElementById('qvNotes').value = '';
    document.getElementById('qvSearchInput').value = '';
    document.getElementById('qvSearchResults').innerHTML = '<div class="list-group-item text-muted small py-1">Søg for at finde eksisterende leverandør</div>';
    document.getElementById('qvStatusAlert').className = 'alert d-none py-2 small';

    // Load PDF in iframe
    document.getElementById('qvPdfFrame').src = `/api/v1/supplier-invoices/files/${fileId}/download`;

    // Open modal immediately
    const modal = new bootstrap.Modal(document.getElementById('quickVendorSplitModal'), {backdrop: 'static'});
    modal.show();

    // Async: load extracted data and pre-fill form
    await qvLoadAndPrefill(fileId);
}

async function qvLoadAndPrefill(fileId, isRetry) {
    const statusEl = document.getElementById('qvStatusAlert');
    statusEl.className = 'alert alert-info py-2 small';
    statusEl.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Henter fakturadata…';

    try {
        const resp = await fetch(`/api/v1/supplier-invoices/files/${fileId}/extracted-data`);
        if (!resp.ok) { throw new Error(`HTTP ${resp.status}`); }
        const data = await resp.json();

        console.log('[QV] extracted-data response:', JSON.stringify({
            file_id: data.file_id,
            status: data.status,
            has_extraction: !!data.extraction,
            has_llm_data: !!data.llm_data,
            llm_data: data.llm_data,
            extraction_vendor_name: data.extraction?.vendor_name,
            extraction_vendor_cvr: data.extraction?.vendor_cvr,
        }));

        // Normalized data from the server (the backend builds llm_data with the correct field names)
        const ld = data.llm_data || {};
        const ext = data.extraction || {};

        // llm_response_json: may be a JSONB object or a string
        let rawAi = {};
        const rawLlm = ext.llm_response_json;
        if (rawLlm) {
            rawAi = (typeof rawLlm === 'string') ? (() => { try { return JSON.parse(rawLlm); } catch(e) { return {}; } })() : rawLlm;
        }
        console.log('[QV] rawAi keys:', Object.keys(rawAi).join(', ') || '(tom)');

        // Pull vendor fields from all three sources in priority order
        const name = ld.vendor_name || ext.vendor_name || rawAi.vendor_name || rawAi.issuer || '';
        const cvr = (ld.vendor_cvr || ext.vendor_cvr || rawAi.vendor_cvr || rawAi.vendor_vat || '').toString().replace(/^DK/i, '').trim();
        const email = ld.vendor_email || rawAi.vendor_email || rawAi.supplier_email || '';
        const addr = ld.vendor_address || rawAi.vendor_address || rawAi.supplier_address || rawAi.vendor_street || '';
        const postal = ld.vendor_postal_code || rawAi.vendor_postal_code || rawAi.postal_code || '';
        const city = ld.vendor_city || rawAi.vendor_city || rawAi.city || '';

        console.log('[QV] Parsed fields:', { name, cvr, email, addr, postal, city });

        // No extraction in the DB at all (file never processed) → auto-reprocess
        if (!data.extraction && !isRetry) {
            console.log('[QV] Ingen extraction – starter auto-reprocess');
            await qvAutoReprocess(fileId);
            return;
        }

        // Extraction exists but no vendor data → offer reprocess
        if (!name && !cvr && !isRetry) {
            console.log('[QV] Extraction uden vendor data – starter auto-reprocess');
            await qvAutoReprocess(fileId);
            return;
        }

        // Fill in the form
        if (name) document.getElementById('qvName').value = name;
        if (cvr) document.getElementById('qvCVR').value = cvr;
        if (email) document.getElementById('qvEmail').value = email;
        if (addr) {
            const parts = addr.split(/,|\n/).map(s => s.trim()).filter(Boolean);
            if (parts.length >= 1) document.getElementById('qvAddress').value = parts[0];
            if (!postal && !city && parts.length >= 2) {
                const postalCity = parts[parts.length - 1];
                const m = postalCity.match(/^(\d{4})\s+(.+)$/);
                if (m) { document.getElementById('qvPostal').value = m[1]; document.getElementById('qvCity').value = m[2]; }
                else { document.getElementById('qvCity').value = postalCity; }
            }
        }
        if (postal) document.getElementById('qvPostal').value = postal;
        if (city) document.getElementById('qvCity').value = city;

        if (name || cvr) {
            statusEl.className = 'alert alert-success py-2 small';
            statusEl.textContent = `✅ Data hentet${name ? ': ' + name : ''}${cvr ? ' (' + cvr + ')' : ''}`;
            setTimeout(() => { statusEl.className = 'alert d-none py-2 small'; }, 4000);
        } else {
            // The AI found no vendor – but show what is available (invoice number, amount)
            const inv = ld.invoice_number || rawAi.invoice_number || '';
            const amt = ld.total_amount || rawAi.total_amount || '';
            statusEl.className = 'alert alert-warning py-2 small';
            statusEl.innerHTML = `AI fandt ingen leverandørdata${inv ? ' (Faktura ' + inv + (amt ? ', ' + amt + ' DKK' : '') + ')' : ''}. Udfyld navn manuelt eller søg herover.`;
        }
    } catch(e) {
        console.error('[QV] Fejl:', e);
        statusEl.className = 'alert alert-danger py-2 small';
        statusEl.textContent = 'Fejl ved hentning: ' + e.message;
    }
}

async function qvAutoReprocess(fileId) {
    const statusEl = document.getElementById('qvStatusAlert');
    statusEl.className = 'alert alert-info py-2 small';
    statusEl.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Analyserer faktura med AI – vent venligst…';

    try {
        console.log('[QV] Starter reprocess for file:', fileId);
        const r = await fetch(`/api/v1/supplier-invoices/reprocess/${fileId}`, { method: 'POST' });
        if (!r.ok) {
            const errBody = await r.text();
            console.error('[QV] Reprocess fejlede:', r.status, errBody);
            throw new Error(`Reprocess HTTP ${r.status}: ${errBody}`);
        }
        const reprocessResult = await r.json();
        console.log('[QV] Reprocess result:', JSON.stringify(reprocessResult));

        // Fetch updated data with isRetry=true to avoid an infinite loop
        await qvLoadAndPrefill(fileId, true);
        loadUnhandledFiles();
    } catch(e) {
        console.error('[QV] Auto-reprocess fejl:', e);
        statusEl.className = 'alert alert-warning py-2 small';
        statusEl.innerHTML = `Kunne ikke køre AI-analyse: ${e.message}. <button class="btn btn-sm btn-outline-warning ms-2" onclick="qvAutoReprocess(${fileId})">Prøv igen</button>`;
    }
}

async function qvSearchVendors(query) {
    const results = document.getElementById('qvSearchResults');
    if (!query || query.length < 2) {
        results.innerHTML = '<div class="list-group-item text-muted small py-1">Søg for at finde eksisterende leverandør</div>';
        return;
    }
    try {
        const resp = await fetch(`/api/v1/vendors?search=${encodeURIComponent(query)}&active_only=true`);
        const vendors = await resp.json();
        if (!vendors || vendors.length === 0) {
            results.innerHTML = '<div class="list-group-item text-muted small py-1">Ingen leverandører fundet</div>';
            return;
        }
        results.innerHTML = vendors.slice(0, 10).map(v => `
            <button type="button" class="list-group-item list-group-item-action py-1 small"
                    onclick="qvSelectVendor(${v.id}, '${escapeHtml(v.name)}', '${v.cvr_number || ''}')">
                <strong>${escapeHtml(v.name)}</strong>
                ${v.cvr_number ? `<span class="text-muted ms-2">${v.cvr_number}</span>` : ''}
            </button>
        `).join('');
    } catch(e) {
        results.innerHTML = '<div class="list-group-item text-danger small py-1">Fejl ved søgning</div>';
    }
}

function qvSelectVendor(vendorId, vendorName, vendorCVR) {
    document.getElementById('qvExistingVendorId').value = vendorId;
    document.getElementById('qvName').value = vendorName;
    document.getElementById('qvCVR').value = vendorCVR;
    const alert = document.getElementById('qvStatusAlert');
    alert.className = 'alert alert-success py-2 small';
    alert.textContent = `✅ Valgt: ${vendorName} — klik "Opret og link" for at linke`;
}

async function saveQuickVendor() {
    const fileId = document.getElementById('qvFileId').value;
    const existingId = document.getElementById('qvExistingVendorId').value;
    const name = document.getElementById('qvName').value.trim();
    const cvr = document.getElementById('qvCVR').value.trim();
    const email = document.getElementById('qvEmail').value.trim();
    const phone = document.getElementById('qvPhone').value.trim();
    const address = document.getElementById('qvAddress').value.trim();
    const postal = document.getElementById('qvPostal').value.trim();
    const city = document.getElementById('qvCity').value.trim();
    const domain = document.getElementById('qvDomain').value.trim();
    const category = document.getElementById('qvCategory').value;
    const notes = document.getElementById('qvNotes').value.trim();

    const statusEl = document.getElementById('qvStatusAlert');

    if (!name) {
        statusEl.className = 'alert alert-danger py-2 small';
        statusEl.textContent = 'Navn er påkrævet.';
        return;
    }

    statusEl.className = 'alert alert-info py-2 small';
    statusEl.textContent = 'Gemmer…';

    try {
        let vendorId = existingId ? parseInt(existingId) : null;

        if (!vendorId) {
            // Create new vendor
            const payload = {
                name, cvr_number: cvr || null,
                email: email || null, phone: phone || null,
                address: [address, postal && city ? `${postal} ${city}` : city].filter(Boolean).join('\n') || null,
                postal_code: postal || null, city: city || null,
                domain: domain || null, category,
                notes: notes || null
            };
            const resp = await fetch('/api/v1/vendors', {
                method: 'POST',
                headers: {'Content-Type': 'application/json'},
                body: JSON.stringify(payload)
            });
            if (!resp.ok) {
                const err = await resp.json().catch(() => ({}));
                throw new Error(err.detail || 'Oprettelse fejlede');
            }
            const created = await resp.json();
            vendorId = created.id;
        }

        // Link vendor to file
        const linkResp = await fetch(`/api/v1/supplier-invoices/files/${fileId}/link-vendor`, {
            method: 'POST',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify({vendor_id: vendorId})
        });
        if (!linkResp.ok) {
            const err = await linkResp.json().catch(() => ({}));
            throw new Error(err.detail || 'Link fejlede');
        }

        statusEl.className = 'alert alert-success py-2 small';
        statusEl.textContent = `✅ Leverandør ${existingId ? 'linket' : 'oprettet og linket'}!`;

        setTimeout(() => {
            bootstrap.Modal.getInstance(document.getElementById('quickVendorSplitModal')).hide();
            loadUnhandledFiles();
        }, 900);

    } catch(e) {
        statusEl.className = 'alert alert-danger py-2 small';
        statusEl.textContent = '❌ ' + e.message;
    }
}

// Rerun full extraction for a file in the unhandled tab
async function rerunSingleFile(fileId) {
    try {
        showLoadingOverlay('Kører analyse...');

        const response = await fetch(`/api/v1/supplier-invoices/reprocess/${fileId}`, {
            method: 'POST'
        });

        if (!response.ok) {
            const err = await response.json().catch(() => ({}));
            throw new Error(err.detail || 'Analyse fejlede');
        }

        const result = await response.json();
        hideLoadingOverlay();

        const confPct = result.confidence ? Math.round(result.confidence * 100) + '%' : '?%';
        const vendorInfo = result.vendor_id ? `Leverandør matchet (ID ${result.vendor_id})` : 'Ingen leverandør matchet';
        alert(`✅ Analyse færdig\n${vendorInfo}\nConfidence: ${confPct}`);

        loadUnhandledFiles();

    } catch (error) {
        hideLoadingOverlay();
        console.error('Rerun error:', error);
        alert('❌ Fejl ved analyse: ' + error.message);
    }
}

// NEW: Analyze single file
async function analyzeFile(fileId) {
    try {
@ -4059,11 +3633,12 @@ async function bulkMarkAsPaid() {

for (const invoiceId of invoiceIds) {
    try {
        const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}/mark-paid`, {
            method: 'POST',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify({
                paid_date: new Date().toISOString().split('T')[0]
            })
        });

@ -4094,11 +3669,12 @@ async function markInvoiceAsPaid(invoiceId) {
if (!confirm('Marker denne faktura som betalt?')) return;

try {
    const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}/mark-paid`, {
        method: 'POST',
        headers: {'Content-Type': 'application/json'},
        body: JSON.stringify({
            paid_date: new Date().toISOString().split('T')[0]
        })
    });

@ -4563,7 +4139,7 @@ async function approveInvoice() {
const response = await fetch(`/api/v1/supplier-invoices/${currentInvoiceId}/approve`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ approved_by: getApprovalUser() })
});

if (response.ok) {
@ -4616,7 +4192,7 @@ async function quickApprove(invoiceId) {
const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}/approve`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ approved_by: getApprovalUser() })
});

if (response.ok) {
@ -4961,7 +4537,7 @@ async function createTemplateFromInvoice(invoiceId, vendorId) {
}

// Step 2: AI analyze
const aiResp = await fetch('/api/v1/supplier-invoices/ai/analyze', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({
@ -5123,7 +4699,7 @@ async function sendSingleToEconomic(invoiceId) {
}

// Bulk send selected invoices to e-conomic
async function bulkSendToEconomicKassekladde() {
    const checkboxes = document.querySelectorAll('.kassekladde-checkbox:checked');
    const invoiceIds = Array.from(checkboxes).map(cb => parseInt(cb.dataset.invoiceId));

@ -5171,16 +4747,6 @@ async function bulkSendToEconomicKassekladde() {
    }
}

function getApprovalUser() {
    const bodyUser = document.body?.dataset?.currentUser;
    if (bodyUser && bodyUser.trim()) return bodyUser.trim();

    const metaUser = document.querySelector('meta[name="current-user"]')?.content;
    if (metaUser && metaUser.trim()) return metaUser.trim();

    return 'System';
}

// Select vendor for file (when <100% match)
async function selectVendorForFile(fileId, vendorId) {
    if (!vendorId) return;
@@ -1,408 +0,0 @@
-{% extends "shared/frontend/base.html" %}
-
-{% block title %}Sync Dashboard - BMC Hub{% endblock %}
-
-{% block extra_css %}
-<style>
-:root {
-    --sync-accent: #0f4c75;
-    --sync-accent-soft: rgba(15, 76, 117, 0.1);
-    --sync-ok: #2f855a;
-    --sync-warn: #c05621;
-    --sync-danger: #c53030;
-}
-
-.sync-header {
-    background: linear-gradient(130deg, rgba(15, 76, 117, 0.14), rgba(22, 160, 133, 0.08));
-    border: 1px solid rgba(15, 76, 117, 0.15);
-    border-radius: 16px;
-    padding: 1.25rem;
-    margin-bottom: 1.25rem;
-}
-
-.sync-kpi {
-    border-radius: 14px;
-    border: 1px solid var(--border-color);
-    background: var(--bg-card);
-    padding: 1rem;
-    height: 100%;
-}
-
-.sync-kpi .label {
-    color: var(--text-secondary);
-    font-size: 0.82rem;
-    text-transform: uppercase;
-    letter-spacing: 0.06em;
-}
-
-.sync-kpi .value {
-    font-size: 1.8rem;
-    font-weight: 700;
-    line-height: 1.1;
-}
-
-.sync-kpi.pending .value { color: var(--sync-warn); }
-.sync-kpi.failed .value { color: var(--sync-danger); }
-.sync-kpi.posted .value { color: var(--sync-accent); }
-.sync-kpi.paid .value { color: var(--sync-ok); }
-
-.status-badge {
-    padding: 0.3rem 0.55rem;
-    border-radius: 999px;
-    font-size: 0.78rem;
-    font-weight: 700;
-    text-transform: uppercase;
-    letter-spacing: 0.04em;
-}
-
-.status-pending { background: rgba(192, 86, 33, 0.14); color: var(--sync-warn); }
-.status-exported { background: rgba(15, 76, 117, 0.14); color: var(--sync-accent); }
-.status-failed { background: rgba(197, 48, 48, 0.14); color: var(--sync-danger); }
-.status-posted { background: rgba(22, 101, 52, 0.14); color: #166534; }
-.status-paid { background: rgba(47, 133, 90, 0.14); color: var(--sync-ok); }
-
-.table thead th {
-    font-size: 0.78rem;
-    text-transform: uppercase;
-    letter-spacing: 0.06em;
-    color: var(--text-secondary);
-}
-
-.mono {
-    font-family: ui-monospace, SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace;
-    font-size: 0.85rem;
-}
-
-.event-card {
-    border: 1px solid var(--border-color);
-    border-radius: 12px;
-    padding: 0.75rem;
-    background: var(--bg-card);
-}
-
-[data-bs-theme="dark"] .sync-header {
-    background: linear-gradient(130deg, rgba(61, 139, 253, 0.14), rgba(44, 62, 80, 0.3));
-    border-color: rgba(61, 139, 253, 0.25);
-}
-</style>
-{% endblock %}
-
-{% block content %}
-<div class="sync-header d-flex flex-wrap justify-content-between align-items-start gap-3">
-    <div>
-        <h2 class="mb-1">Draft Sync Dashboard</h2>
-        <p class="text-muted mb-0">Overblik over ordre-draft sync, attention queue og seneste events.</p>
-    </div>
-    <div class="d-flex gap-2">
-        <button class="btn btn-outline-secondary" id="btnPreviewReconcile">
-            <i class="bi bi-search me-1"></i>Preview Reconcile
-        </button>
-        <button class="btn btn-primary" id="btnApplyReconcile">
-            <i class="bi bi-arrow-repeat me-1"></i>Kør Reconcile
-        </button>
-    </div>
-</div>
-
-<div class="row g-3 mb-4" id="kpiRow">
-    <div class="col-6 col-lg-2">
-        <div class="sync-kpi">
-            <div class="label">Total</div>
-            <div class="value" id="kpiTotal">0</div>
-        </div>
-    </div>
-    <div class="col-6 col-lg-2">
-        <div class="sync-kpi pending">
-            <div class="label">Pending</div>
-            <div class="value" id="kpiPending">0</div>
-        </div>
-    </div>
-    <div class="col-6 col-lg-2">
-        <div class="sync-kpi">
-            <div class="label">Exported</div>
-            <div class="value" id="kpiExported">0</div>
-        </div>
-    </div>
-    <div class="col-6 col-lg-2">
-        <div class="sync-kpi failed">
-            <div class="label">Failed</div>
-            <div class="value" id="kpiFailed">0</div>
-        </div>
-    </div>
-    <div class="col-6 col-lg-2">
-        <div class="sync-kpi posted">
-            <div class="label">Posted</div>
-            <div class="value" id="kpiPosted">0</div>
-        </div>
-    </div>
-    <div class="col-6 col-lg-2">
-        <div class="sync-kpi paid">
-            <div class="label">Paid</div>
-            <div class="value" id="kpiPaid">0</div>
-        </div>
-    </div>
-</div>
-
-<div class="row g-3">
-    <div class="col-12 col-xl-7">
-        <div class="card h-100">
-            <div class="card-header d-flex justify-content-between align-items-center">
-                <h5 class="mb-0">Attention Items</h5>
-                <button class="btn btn-sm btn-outline-secondary" id="btnRefreshAttention">Opdater</button>
-            </div>
-            <div class="card-body p-0">
-                <div class="table-responsive">
-                    <table class="table table-hover mb-0">
-                        <thead>
-                            <tr>
-                                <th>Draft</th>
-                                <th>Status</th>
-                                <th>Order</th>
-                                <th>Invoice</th>
-                                <th>Seneste Event</th>
-                                <th></th>
-                            </tr>
-                        </thead>
-                        <tbody id="attentionBody">
-                            <tr><td colspan="6" class="text-center py-4 text-muted">Indlæser...</td></tr>
-                        </tbody>
-                    </table>
-                </div>
-            </div>
-        </div>
-    </div>
-
-    <div class="col-12 col-xl-5">
-        <div class="card h-100">
-            <div class="card-header d-flex justify-content-between align-items-center">
-                <h5 class="mb-0">Recent Events</h5>
-                <button class="btn btn-sm btn-outline-secondary" id="btnRefreshEvents">Opdater</button>
-            </div>
-            <div class="card-body" id="recentEventsList">
-                <div class="text-muted">Indlæser...</div>
-            </div>
-        </div>
-    </div>
-</div>
-
-<div class="modal fade" id="eventsModal" tabindex="-1" aria-hidden="true">
-    <div class="modal-dialog modal-xl modal-dialog-scrollable">
-        <div class="modal-content">
-            <div class="modal-header">
-                <h5 class="modal-title">Draft Events</h5>
-                <button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
-            </div>
-            <div class="modal-body">
-                <div class="row g-2 mb-3">
-                    <div class="col-md-3">
-                        <input class="form-control form-control-sm" id="filterEventType" placeholder="event_type">
-                    </div>
-                    <div class="col-md-3">
-                        <input class="form-control form-control-sm" id="filterFromStatus" placeholder="from_status">
-                    </div>
-                    <div class="col-md-3">
-                        <input class="form-control form-control-sm" id="filterToStatus" placeholder="to_status">
-                    </div>
-                    <div class="col-md-3 d-flex gap-2">
-                        <button class="btn btn-sm btn-outline-primary" id="btnApplyEventFilters">Filtrer</button>
-                        <button class="btn btn-sm btn-outline-secondary" id="btnClearEventFilters">Nulstil</button>
-                    </div>
-                </div>
-                <div class="table-responsive">
-                    <table class="table table-sm align-middle">
-                        <thead>
-                            <tr>
-                                <th>Tid</th>
-                                <th>Event</th>
-                                <th>Fra</th>
-                                <th>Til</th>
-                                <th>Payload</th>
-                            </tr>
-                        </thead>
-                        <tbody id="eventsModalBody"></tbody>
-                    </table>
-                </div>
-                <div class="d-flex justify-content-between align-items-center mt-2">
-                    <small class="text-muted" id="eventsPagerInfo"></small>
-                    <div class="btn-group btn-group-sm">
-                        <button class="btn btn-outline-secondary" id="btnPrevEvents">Forrige</button>
-                        <button class="btn btn-outline-secondary" id="btnNextEvents">Næste</button>
-                    </div>
-                </div>
-            </div>
-        </div>
-    </div>
-</div>
-{% endblock %}
-
-{% block extra_js %}
-<script>
-(() => {
-    const state = {
-        selectedDraftId: null,
-        eventsLimit: 20,
-        eventsOffset: 0,
-        eventsTotal: 0,
-    };
-
-    const el = (id) => document.getElementById(id);
-
-    const statusBadge = (status) => {
-        const s = (status || '').toLowerCase();
-        return `<span class="status-badge status-${s || 'pending'}">${s || 'pending'}</span>`;
-    };
-
-    const fetchJson = async (url, options = {}) => {
-        const res = await fetch(url, options);
-        if (!res.ok) {
-            const text = await res.text();
-            throw new Error(text || `HTTP ${res.status}`);
-        }
-        return res.json();
-    };
-
-    const loadDashboard = async () => {
-        const data = await fetchJson('/api/v1/billing/drafts/sync-dashboard?limit=20');
-        const summary = data.summary || {};
-
-        el('kpiTotal').textContent = summary.total_count || 0;
-        el('kpiPending').textContent = summary.pending_count || 0;
-        el('kpiExported').textContent = summary.exported_count || 0;
-        el('kpiFailed').textContent = summary.failed_count || 0;
-        el('kpiPosted').textContent = summary.posted_count || 0;
-        el('kpiPaid').textContent = summary.paid_count || 0;
-
-        const attention = data.attention_items || [];
-        const tbody = el('attentionBody');
-        if (!attention.length) {
-            tbody.innerHTML = '<tr><td colspan="6" class="text-center py-4 text-muted">Ingen attention items</td></tr>';
-        } else {
-            tbody.innerHTML = attention.map(row => `
-                <tr>
-                    <td>
-                        <div class="fw-semibold">#${row.id} ${row.title || ''}</div>
-                        <div class="text-muted small">Kunde ${row.customer_id || '-'}</div>
-                    </td>
-                    <td>${statusBadge(row.sync_status)}</td>
-                    <td class="mono">${row.economic_order_number || '-'}</td>
-                    <td class="mono">${row.economic_invoice_number || '-'}</td>
-                    <td>
-                        <div class="small">${row.latest_event_type || '-'}</div>
-                        <div class="text-muted small">${row.latest_event_at || ''}</div>
-                    </td>
-                    <td>
-                        <button class="btn btn-sm btn-outline-primary" data-open-events="${row.id}">Events</button>
-                    </td>
-                </tr>
-            `).join('');
-        }
-
-        const recent = data.recent_events || [];
-        const list = el('recentEventsList');
-        if (!recent.length) {
-            list.innerHTML = '<div class="text-muted">Ingen events endnu.</div>';
-        } else {
-            list.innerHTML = recent.map(ev => `
-                <div class="event-card mb-2">
-                    <div class="d-flex justify-content-between align-items-center mb-1">
-                        <strong>#${ev.draft_id} ${ev.event_type}</strong>
-                        ${statusBadge(ev.to_status || ev.sync_status || 'pending')}
-                    </div>
-                    <div class="small text-muted">${ev.created_at || ''}</div>
-                    <div class="small">${ev.draft_title || ''}</div>
-                </div>
-            `).join('');
-        }
-    };
-
-    const runReconcile = async (applyChanges) => {
-        await fetchJson('/api/v1/billing/drafts/reconcile-sync-status', {
-            method: 'POST',
-            headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ apply: applyChanges }),
-        });
-        await loadDashboard();
-    };
-
-    const loadEventsForDraft = async () => {
-        if (!state.selectedDraftId) return;
-        const qs = new URLSearchParams({
-            limit: String(state.eventsLimit),
-            offset: String(state.eventsOffset),
-        });
-
-        const eventType = el('filterEventType').value.trim();
-        const fromStatus = el('filterFromStatus').value.trim();
-        const toStatus = el('filterToStatus').value.trim();
-        if (eventType) qs.set('event_type', eventType);
-        if (fromStatus) qs.set('from_status', fromStatus);
-        if (toStatus) qs.set('to_status', toStatus);
-
-        const data = await fetchJson(`/api/v1/ordre/drafts/${state.selectedDraftId}/sync-events?${qs.toString()}`);
-        const items = data.items || [];
-        state.eventsTotal = data.total || 0;
-
-        const body = el('eventsModalBody');
-        body.innerHTML = items.map(ev => `
-            <tr>
-                <td class="small">${ev.created_at || ''}</td>
-                <td class="mono">${ev.event_type || ''}</td>
-                <td>${ev.from_status || '-'}</td>
-                <td>${ev.to_status || '-'}</td>
-                <td><pre class="small mb-0 mono">${JSON.stringify(ev.event_payload || {}, null, 2)}</pre></td>
-            </tr>
-        `).join('') || '<tr><td colspan="5" class="text-center text-muted py-3">Ingen events</td></tr>';
-
-        const start = state.eventsOffset + 1;
-        const end = Math.min(state.eventsOffset + state.eventsLimit, state.eventsTotal);
-        el('eventsPagerInfo').textContent = state.eventsTotal ? `${start}-${end} af ${state.eventsTotal}` : '0 resultater';
-
-        el('btnPrevEvents').disabled = state.eventsOffset <= 0;
-        el('btnNextEvents').disabled = (state.eventsOffset + state.eventsLimit) >= state.eventsTotal;
-    };
-
-    document.addEventListener('click', async (e) => {
-        const target = e.target;
-        if (target.matches('[data-open-events]')) {
-            state.selectedDraftId = Number(target.getAttribute('data-open-events'));
-            state.eventsOffset = 0;
-            await loadEventsForDraft();
-            const modal = new bootstrap.Modal(el('eventsModal'));
-            modal.show();
-        }
-    });

-    el('btnRefreshAttention').addEventListener('click', loadDashboard);
-    el('btnRefreshEvents').addEventListener('click', loadDashboard);
-    el('btnPreviewReconcile').addEventListener('click', async () => runReconcile(false));
-    el('btnApplyReconcile').addEventListener('click', async () => runReconcile(true));
-
-    el('btnApplyEventFilters').addEventListener('click', async () => {
-        state.eventsOffset = 0;
-        await loadEventsForDraft();
-    });
-
-    el('btnClearEventFilters').addEventListener('click', async () => {
-        el('filterEventType').value = '';
-        el('filterFromStatus').value = '';
-        el('filterToStatus').value = '';
-        state.eventsOffset = 0;
-        await loadEventsForDraft();
-    });
-
-    el('btnPrevEvents').addEventListener('click', async () => {
-        state.eventsOffset = Math.max(0, state.eventsOffset - state.eventsLimit);
-        await loadEventsForDraft();
-    });
-
-    el('btnNextEvents').addEventListener('click', async () => {
-        state.eventsOffset += state.eventsLimit;
-        await loadEventsForDraft();
-    });
-
-    loadDashboard().catch((err) => {
-        console.error(err);
-        alert('Kunne ikke indlæse sync dashboard.');
-    });
-})();
-</script>
-{% endblock %}
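One detail of the deleted template above is worth restating on its own: the pager arithmetic in `loadEventsForDraft`. A minimal Python sketch of the same math follows; the `pager` name is hypothetical, and the Danish label strings are copied verbatim from the removed script.

```python
def pager(offset: int, limit: int, total: int):
    """Mirror the deleted dashboard's pager math: a 1-based range label
    plus the disabled states for the prev/next buttons."""
    start = offset + 1
    end = min(offset + limit, total)
    # Same label format as the JS: `${start}-${end} af ${total}`
    label = f"{start}-{end} af {total}" if total else "0 resultater"
    prev_disabled = offset <= 0
    next_disabled = (offset + limit) >= total
    return label, prev_disabled, next_disabled
```

For `offset=40, limit=20, total=45` this yields the label `41-45 af 45` with only the previous button enabled, matching the JS logic above.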
@@ -1360,7 +1360,7 @@ async function autoGenerateTemplate() {
     try {
         // Call Ollama to analyze the invoice
-        const response = await fetch('/api/v1/supplier-invoices/ai/analyze', {
+        const response = await fetch('/api/v1/supplier-invoices/ai-analyze', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
@@ -45,12 +45,3 @@ async def templates_list_page(request: Request):
         "request": request,
         "title": "Templates"
     })
-
-
-@router.get("/billing/sync-dashboard", response_class=HTMLResponse)
-async def billing_sync_dashboard_page(request: Request):
-    """Operational sync dashboard for ordre_drafts lifecycle."""
-    return templates.TemplateResponse("billing/frontend/sync_dashboard.html", {
-        "request": request,
-        "title": "Billing Sync Dashboard"
-    })
@@ -15,21 +15,6 @@ logger = logging.getLogger(__name__)
 security = HTTPBearer(auto_error=False)
 
 
-def _users_column_exists(column_name: str) -> bool:
-    result = execute_query_single(
-        """
-        SELECT 1
-        FROM information_schema.columns
-        WHERE table_schema = 'public'
-          AND table_name = 'users'
-          AND column_name = %s
-        LIMIT 1
-        """,
-        (column_name,),
-    )
-    return bool(result)
-
-
 async def get_current_user(
     request: Request,
     credentials: Optional[HTTPAuthorizationCredentials] = Depends(security)

@@ -85,11 +70,9 @@ async def get_current_user(
     }
 
     # Get additional user details from database
-    is_2fa_expr = "is_2fa_enabled" if _users_column_exists("is_2fa_enabled") else "FALSE AS is_2fa_enabled"
     user_details = execute_query_single(
-        f"SELECT email, full_name, {is_2fa_expr} FROM users WHERE user_id = %s",
-        (user_id,),
-    )
+        "SELECT email, full_name, is_2fa_enabled FROM users WHERE user_id = %s",
+        (user_id,))
 
     return {
         "id": user_id,
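The hunks above strip a `_users_column_exists()` guard that probed `information_schema.columns` before selecting optional 2FA columns. The defensive pattern itself can be sketched as follows; this is an illustration only, using SQLite's `PRAGMA table_info` as a stand-in for the project's Postgres helper, and `users_column_exists` plus the in-memory table are hypothetical names, not project code.

```python
import sqlite3
from functools import lru_cache

# Stand-in schema: a users table that lacks the optional 2FA column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, email TEXT)")

@lru_cache(maxsize=256)
def users_column_exists(column_name: str) -> bool:
    """Return True when the users table has the given column (result cached)."""
    cols = {row[1] for row in conn.execute("PRAGMA table_info(users)")}
    return column_name in cols

# Build a SELECT that degrades gracefully when the optional column is
# missing, mirroring the removed `is_2fa_expr` trick from the old code.
is_2fa_expr = "is_2fa_enabled" if users_column_exists("is_2fa_enabled") else "0 AS is_2fa_enabled"
query = f"SELECT email, {is_2fa_expr} FROM users WHERE user_id = ?"
```

The v2.2.0 side drops this indirection and selects `is_2fa_enabled` directly, which assumes the column always exists in the deployed schema.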
@@ -15,28 +15,6 @@ import logging
 
 logger = logging.getLogger(__name__)
-
-_users_column_cache: Dict[str, bool] = {}
-
-
-def _users_column_exists(column_name: str) -> bool:
-    if column_name in _users_column_cache:
-        return _users_column_cache[column_name]
-
-    result = execute_query_single(
-        """
-        SELECT 1
-        FROM information_schema.columns
-        WHERE table_schema = 'public'
-          AND table_name = 'users'
-          AND column_name = %s
-        LIMIT 1
-        """,
-        (column_name,),
-    )
-    exists = bool(result)
-    _users_column_cache[column_name] = exists
-    return exists
-
 # JWT Settings
 SECRET_KEY = settings.JWT_SECRET_KEY
 ALGORITHM = "HS256"

@@ -48,11 +26,6 @@ pwd_context = CryptContext(schemes=["pbkdf2_sha256", "bcrypt_sha256", "bcrypt"],
 class AuthService:
     """Service for authentication and authorization"""
 
-    @staticmethod
-    def is_2fa_supported() -> bool:
-        """Return True only when required 2FA columns exist in users table."""
-        return _users_column_exists("is_2fa_enabled") and _users_column_exists("totp_secret")
-
     @staticmethod
     def hash_password(password: str) -> str:
         """

@@ -116,9 +89,6 @@ class AuthService:
     @staticmethod
     def setup_user_2fa(user_id: int, username: str) -> Dict:
         """Create and store a new TOTP secret (not enabled until verified)"""
-        if not AuthService.is_2fa_supported():
-            raise RuntimeError("2FA columns missing in users table")
-
         secret = AuthService.generate_2fa_secret()
         execute_update(
             "UPDATE users SET totp_secret = %s, is_2fa_enabled = FALSE, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",

@@ -133,9 +103,6 @@ class AuthService:
     @staticmethod
     def enable_user_2fa(user_id: int, otp_code: str) -> bool:
         """Enable 2FA after verifying TOTP code"""
-        if not (_users_column_exists("totp_secret") and _users_column_exists("is_2fa_enabled")):
-            return False
-
         user = execute_query_single(
             "SELECT totp_secret FROM users WHERE user_id = %s",
             (user_id,)

@@ -156,9 +123,6 @@ class AuthService:
     @staticmethod
     def disable_user_2fa(user_id: int, otp_code: str) -> bool:
         """Disable 2FA after verifying TOTP code"""
-        if not (_users_column_exists("totp_secret") and _users_column_exists("is_2fa_enabled")):
-            return False
-
         user = execute_query_single(
             "SELECT totp_secret FROM users WHERE user_id = %s",
             (user_id,)

@@ -187,11 +151,10 @@ class AuthService:
         if not user:
             return False
 
-        if _users_column_exists("is_2fa_enabled") and _users_column_exists("totp_secret"):
-            execute_update(
-                "UPDATE users SET is_2fa_enabled = FALSE, totp_secret = NULL, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
-                (user_id,)
-            )
+        execute_update(
+            "UPDATE users SET is_2fa_enabled = FALSE, totp_secret = NULL, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
+            (user_id,)
+        )
         return True
 
     @staticmethod

@@ -293,18 +256,13 @@ class AuthService:
         request_username = (username or "").strip().lower()
 
         # Get user
-        is_2fa_expr = "is_2fa_enabled" if _users_column_exists("is_2fa_enabled") else "FALSE AS is_2fa_enabled"
-        totp_expr = "totp_secret" if _users_column_exists("totp_secret") else "NULL::text AS totp_secret"
-        last_2fa_expr = "last_2fa_at" if _users_column_exists("last_2fa_at") else "NULL::timestamp AS last_2fa_at"
-
         user = execute_query_single(
-            f"""SELECT user_id, username, email, password_hash, full_name,
+            """SELECT user_id, username, email, password_hash, full_name,
                is_active, is_superadmin, failed_login_attempts, locked_until,
-               {is_2fa_expr}, {totp_expr}, {last_2fa_expr}
+               is_2fa_enabled, totp_secret, last_2fa_at
             FROM users
             WHERE username = %s OR email = %s""",
-            (username, username),
-        )
+            (username, username))
 
         if not user:
             # Shadow Admin fallback (only when no regular user matches)

@@ -409,11 +367,10 @@ class AuthService:
             logger.warning(f"❌ Login failed: Invalid 2FA - {username}")
             return None, "Invalid 2FA code"
 
-        if _users_column_exists("last_2fa_at"):
-            execute_update(
-                "UPDATE users SET last_2fa_at = CURRENT_TIMESTAMP WHERE user_id = %s",
-                (user['user_id'],)
-            )
+        execute_update(
+            "UPDATE users SET last_2fa_at = CURRENT_TIMESTAMP WHERE user_id = %s",
+            (user['user_id'],)
+        )
 
         # Success! Reset failed attempts and update last login
         execute_update(

@@ -459,9 +416,6 @@ class AuthService:
     @staticmethod
     def is_user_2fa_enabled(user_id: int) -> bool:
         """Check if user has 2FA enabled"""
-        if not _users_column_exists("is_2fa_enabled"):
-            return False
-
         user = execute_query_single(
             "SELECT is_2fa_enabled FROM users WHERE user_id = %s",
             (user_id,)
@@ -105,27 +105,12 @@ class Settings(BaseSettings):
     EMAIL_AI_ENABLED: bool = False
     EMAIL_AUTO_CLASSIFY: bool = True  # Enable classification by default (uses keywords if AI disabled)
     EMAIL_AI_CONFIDENCE_THRESHOLD: float = 0.7
-    EMAIL_REQUIRE_MANUAL_APPROVAL: bool = True  # Phase 1: human approval before case creation/routing
     EMAIL_MAX_FETCH_PER_RUN: int = 50
     EMAIL_PROCESS_INTERVAL_MINUTES: int = 5
     EMAIL_WORKFLOWS_ENABLED: bool = True
     EMAIL_MAX_UPLOAD_SIZE_MB: int = 50  # Max file size for email uploads
     ALLOWED_EXTENSIONS: List[str] = ["pdf", "jpg", "jpeg", "png", "gif", "doc", "docx", "xls", "xlsx", "zip"]  # Allowed file extensions for uploads
-
-    @field_validator("ALLOWED_EXTENSIONS", mode="before")
-    @classmethod
-    def parse_allowed_extensions(cls, v):
-        """Handle both list and comma-separated string (e.g. from .env: .pdf,.jpg,...)"""
-        if isinstance(v, str):
-            # Split comma-separated and strip whitespace + leading dots
-            return [ext.strip().lstrip('.').lower() for ext in v.split(',') if ext.strip()]
-        if isinstance(v, list):
-            # Fix case where pydantic already wrapped entire CSV as single list element
-            if len(v) == 1 and ',' in str(v[0]):
-                return [ext.strip().lstrip('.').lower() for ext in str(v[0]).split(',') if ext.strip()]
-            return [ext.strip().lstrip('.').lower() for ext in v if ext]
-        return v
-
     # vTiger Cloud Integration
     VTIGER_ENABLED: bool = False
     VTIGER_URL: str = ""

@@ -176,7 +161,7 @@ class Settings(BaseSettings):
 
     # Backup System Configuration
     BACKUP_ENABLED: bool = True
-    BACKUP_STORAGE_PATH: str = "/app/data/backups"
+    BACKUP_STORAGE_PATH: str = "/app/backups"
     BACKUP_DRY_RUN: bool = False
     BACKUP_READ_ONLY: bool = False
     BACKUP_RESTORE_DRY_RUN: bool = True  # SAFETY: Test restore uden at overskrive data

@@ -239,9 +224,6 @@ class Settings(BaseSettings):
     TELEFONI_SHARED_SECRET: str = ""  # If set, required as ?token=...
     TELEFONI_IP_WHITELIST: str = "172.16.31.0/24"  # CSV of IPs/CIDRs, e.g. "192.168.1.0/24,10.0.0.10"
 
-    # Mission Control webhooks
-    MISSION_WEBHOOK_TOKEN: str = ""
-
     # ESET Integration
     ESET_ENABLED: bool = False
     ESET_API_URL: str = "https://eu.device-management.eset.systems"
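The removed `parse_allowed_extensions` validator normalized `ALLOWED_EXTENSIONS` from either a list or a comma-separated `.env` string. Its core logic, restated as a standalone function for clarity (a sketch with no pydantic dependency, not the exact project code):

```python
def parse_allowed_extensions(v):
    """Normalize extension config (list or CSV string such as '.pdf,.jpg')
    into lowercase extension names without leading dots."""
    if isinstance(v, str):
        return [ext.strip().lstrip('.').lower() for ext in v.split(',') if ext.strip()]
    if isinstance(v, list):
        # Handle the case where the whole CSV arrived as a single list element
        if len(v) == 1 and ',' in str(v[0]):
            return [ext.strip().lstrip('.').lower() for ext in str(v[0]).split(',') if ext.strip()]
        return [ext.strip().lstrip('.').lower() for ext in v if ext]
    return v
```

With the validator gone, v2.2.0 relies on the environment always supplying a well-formed list, so a bare CSV string in `.env` would no longer be coerced.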
@@ -6,7 +6,6 @@ PostgreSQL connection and helpers using psycopg2
 import psycopg2
 from psycopg2.extras import RealDictCursor
 from psycopg2.pool import SimpleConnectionPool
-from functools import lru_cache
 from typing import Optional
 import logging
 

@@ -129,34 +128,3 @@ def execute_query_single(query: str, params: tuple = None):
     """Execute query and return single row (backwards compatibility for fetchone=True)"""
     result = execute_query(query, params)
     return result[0] if result and len(result) > 0 else None
-
-
-@lru_cache(maxsize=256)
-def table_has_column(table_name: str, column_name: str, schema: str = "public") -> bool:
-    """Return whether a column exists in the current database schema."""
-    conn = get_db_connection()
-    try:
-        with conn.cursor() as cursor:
-            cursor.execute(
-                """
-                SELECT 1
-                FROM information_schema.columns
-                WHERE table_schema = %s
-                  AND table_name = %s
-                  AND column_name = %s
-                LIMIT 1
-                """,
-                (schema, table_name, column_name),
-            )
-            return cursor.fetchone() is not None
-    except Exception as e:
-        logger.warning(
-            "Schema lookup failed for %s.%s.%s: %s",
-            schema,
-            table_name,
-            column_name,
-            e,
-        )
-        return False
-    finally:
-        release_db_connection(conn)
@@ -28,7 +28,6 @@ class CustomerBase(BaseModel):
     name: str
     cvr_number: Optional[str] = None
     email: Optional[str] = None
-    email_domain: Optional[str] = None
     phone: Optional[str] = None
     address: Optional[str] = None
     city: Optional[str] = None

@@ -49,7 +48,6 @@ class CustomerUpdate(BaseModel):
     name: Optional[str] = None
     cvr_number: Optional[str] = None
     email: Optional[str] = None
-    email_domain: Optional[str] = None
     phone: Optional[str] = None
     address: Optional[str] = None
     city: Optional[str] = None

@@ -497,15 +495,14 @@ async def create_customer(customer: CustomerCreate):
     try:
         customer_id = execute_insert(
             """INSERT INTO customers
-               (name, cvr_number, email, email_domain, phone, address, city, postal_code,
+               (name, cvr_number, email, phone, address, city, postal_code,
                country, website, is_active, invoice_email, mobile_phone)
-               VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+               VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
                RETURNING id""",
             (
                 customer.name,
                 customer.cvr_number,
                 customer.email,
-                customer.email_domain,
                 customer.phone,
                 customer.address,
                 customer.city,
@@ -1,455 +0,0 @@
import json
import logging
from datetime import datetime
from typing import Any, Dict, Optional

from fastapi import APIRouter, HTTPException, Query, Request, WebSocket, WebSocketDisconnect
from pydantic import BaseModel, Field

from app.core.auth_service import AuthService
from app.core.config import settings
from app.core.database import execute_query, execute_query_single

from .mission_service import MissionService
from .mission_ws import mission_ws_manager

logger = logging.getLogger(__name__)
router = APIRouter()


class MissionCallEvent(BaseModel):
    call_id: str = Field(..., min_length=1, max_length=128)
    caller_number: Optional[str] = None
    queue_name: Optional[str] = None
    timestamp: Optional[datetime] = None


class MissionUptimeWebhook(BaseModel):
    status: Optional[str] = None
    service_name: Optional[str] = None
    customer_name: Optional[str] = None
    timestamp: Optional[datetime] = None
    payload: Dict[str, Any] = Field(default_factory=dict)

def _first_query_param(request: Request, *names: str) -> Optional[str]:
    for name in names:
        value = request.query_params.get(name)
        if value and str(value).strip():
            return str(value).strip()
    return None


def _parse_query_timestamp(request: Request) -> Optional[datetime]:
    raw = _first_query_param(request, "timestamp", "time", "event_time")
    if not raw:
        return None
    try:
        return datetime.fromisoformat(raw.replace("Z", "+00:00"))
    except Exception:
        return None


def _event_from_query(request: Request) -> MissionCallEvent:
    call_id = _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid")
    if not call_id:
        logger.warning(
            "⚠️ Mission webhook invalid query path=%s reason=missing_call_id keys=%s",
            request.url.path,
            ",".join(sorted(request.query_params.keys())),
        )
        raise HTTPException(status_code=400, detail="Missing call_id query parameter")

    return MissionCallEvent(
        call_id=call_id,
        caller_number=_first_query_param(request, "caller_number", "caller", "from", "number", "phone"),
        queue_name=_first_query_param(request, "queue_name", "queue", "group", "line"),
        timestamp=_parse_query_timestamp(request),
    )


def _get_webhook_token() -> str:
    db_token = MissionService.get_setting_value("mission_webhook_token", "") or ""
    env_token = (getattr(settings, "MISSION_WEBHOOK_TOKEN", "") or "").strip()
    return db_token.strip() or env_token


def _validate_mission_webhook_token(request: Request, token: Optional[str] = None) -> None:
    configured = _get_webhook_token()
    path = request.url.path
    if not configured:
        logger.warning("❌ Mission webhook rejected path=%s reason=token_not_configured", path)
        raise HTTPException(status_code=403, detail="Mission webhook token not configured")

    candidate = token or request.headers.get("x-mission-token") or request.query_params.get("token")
    if not candidate or candidate.strip() != configured:
        source = "query_or_arg"
        if not token and request.headers.get("x-mission-token"):
            source = "header"

        masked = "<empty>"
        if candidate:
            c = candidate.strip()
            masked = "***" if len(c) <= 8 else f"{c[:4]}...{c[-4:]}"

        logger.warning(
            "❌ Mission webhook forbidden path=%s reason=token_mismatch source=%s token=%s",
            path,
            source,
            masked,
        )
        raise HTTPException(status_code=403, detail="Forbidden")

def _normalize_uptime_payload(payload: MissionUptimeWebhook) -> Dict[str, Any]:
    raw = dict(payload.payload or {})

    status_candidate = payload.status or raw.get("status") or raw.get("event")
    if not status_candidate and isinstance(raw.get("monitor"), dict):
        status_candidate = raw.get("monitor", {}).get("status")

    service_name = payload.service_name or raw.get("service_name") or raw.get("monitor_name")
    if not service_name and isinstance(raw.get("monitor"), dict):
        service_name = raw.get("monitor", {}).get("name")

    customer_name = payload.customer_name or raw.get("customer_name") or raw.get("customer")
    timestamp = payload.timestamp or raw.get("timestamp")

    status = str(status_candidate or "UNKNOWN").upper().strip()
    if status not in {"UP", "DOWN", "DEGRADED"}:
        status = "UNKNOWN"

    return {
        "status": status,
        "service_name": str(service_name or "Unknown Service"),
        "customer_name": str(customer_name or "").strip() or None,
        "timestamp": timestamp,
        "raw": raw,
    }

@router.get("/mission/state")
async def get_mission_state():
    return MissionService.get_state()


@router.websocket("/mission/ws")
async def mission_ws(websocket: WebSocket):
    token = websocket.query_params.get("token")
    auth_header = (websocket.headers.get("authorization") or "").strip()
    if not token and auth_header.lower().startswith("bearer "):
        token = auth_header.split(" ", 1)[1].strip()
    if not token:
        token = (websocket.cookies.get("access_token") or "").strip() or None

    payload = AuthService.verify_token(token) if token else None
    if not payload:
        await websocket.close(code=1008)
        return

    await mission_ws_manager.connect(websocket)
    try:
        await mission_ws_manager.broadcast("mission_state", MissionService.get_state())
        while True:
            await websocket.receive_text()
    except WebSocketDisconnect:
        await mission_ws_manager.disconnect(websocket)
    except Exception:
        await mission_ws_manager.disconnect(websocket)

@router.post("/mission/webhook/telefoni/ringing")
async def mission_telefoni_ringing(event: MissionCallEvent, request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    logger.info(
        "☎️ Mission webhook ringing call_id=%s caller=%s queue=%s method=%s",
        event.call_id,
        event.caller_number,
        event.queue_name,
        request.method,
    )

    timestamp = event.timestamp or datetime.utcnow()
    context = MissionService.resolve_contact_context(event.caller_number)
    queue_name = (event.queue_name or "Ukendt kø").strip()

    execute_query(
        """
        INSERT INTO mission_call_state (
            call_id, queue_name, caller_number, contact_name, company_name, customer_tag,
            state, started_at, answered_at, ended_at, updated_at, last_payload
        )
        VALUES (%s, %s, %s, %s, %s, %s, 'ringing', %s, NULL, NULL, NOW(), %s::jsonb)
        ON CONFLICT (call_id)
        DO UPDATE SET
            queue_name = EXCLUDED.queue_name,
            caller_number = EXCLUDED.caller_number,
            contact_name = EXCLUDED.contact_name,
            company_name = EXCLUDED.company_name,
            customer_tag = EXCLUDED.customer_tag,
            state = 'ringing',
            ended_at = NULL,
            answered_at = NULL,
            started_at = LEAST(mission_call_state.started_at, EXCLUDED.started_at),
            updated_at = NOW(),
            last_payload = EXCLUDED.last_payload
        """,
        (
            event.call_id,
            queue_name,
            event.caller_number,
            context.get("contact_name"),
            context.get("company_name"),
            context.get("customer_tag"),
            timestamp,
            json.dumps(event.model_dump(mode="json")),
        ),
    )

    event_row = MissionService.insert_event(
        event_type="incoming_call",
        title=f"Indgående opkald i {queue_name}",
        severity="warning",
        source="telefoni",
        customer_name=context.get("company_name"),
        payload={
            "call_id": event.call_id,
            "queue_name": queue_name,
            "caller_number": event.caller_number,
            **context,
        },
    )

    call_payload = {
        "call_id": event.call_id,
        "queue_name": queue_name,
        "caller_number": event.caller_number,
        **context,
        "timestamp": timestamp,
    }

    await mission_ws_manager.broadcast("call_ringing", call_payload)
    await mission_ws_manager.broadcast("live_feed_event", event_row)
    await mission_ws_manager.broadcast("kpi_update", MissionService.get_kpis())

    return {"status": "ok"}

@router.get("/mission/webhook/telefoni/ringing")
async def mission_telefoni_ringing_get(request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    # Allow token-only GET calls (no call payload) for phone webhook validation/ping.
    if not _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid"):
        logger.info("☎️ Mission webhook ringing ping method=%s", request.method)
        return {"status": "ok", "mode": "ping"}

    event = _event_from_query(request)
    return await mission_telefoni_ringing(event, request, token)

@router.post("/mission/webhook/telefoni/answered")
async def mission_telefoni_answered(event: MissionCallEvent, request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    logger.info(
        "✅ Mission webhook answered call_id=%s caller=%s queue=%s method=%s",
        event.call_id,
        event.caller_number,
        event.queue_name,
        request.method,
    )

    execute_query(
        """
        UPDATE mission_call_state
        SET state = 'answered',
            answered_at = COALESCE(answered_at, NOW()),
            updated_at = NOW(),
            last_payload = %s::jsonb
        WHERE call_id = %s
        """,
        (json.dumps(event.model_dump(mode="json")), event.call_id),
    )

    event_row = MissionService.insert_event(
        event_type="call_answered",
        title="Opkald besvaret",
        severity="info",
        source="telefoni",
        payload={"call_id": event.call_id, "queue_name": event.queue_name, "caller_number": event.caller_number},
    )

    await mission_ws_manager.broadcast("call_answered", {"call_id": event.call_id})
    await mission_ws_manager.broadcast("live_feed_event", event_row)
    return {"status": "ok"}


@router.get("/mission/webhook/telefoni/answered")
async def mission_telefoni_answered_get(request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    if not _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid"):
        logger.info("✅ Mission webhook answered ping method=%s", request.method)
        return {"status": "ok", "mode": "ping"}

    event = _event_from_query(request)
    return await mission_telefoni_answered(event, request, token)

@router.post("/mission/webhook/telefoni/hangup")
async def mission_telefoni_hangup(event: MissionCallEvent, request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    logger.info(
        "📴 Mission webhook hangup call_id=%s caller=%s queue=%s method=%s",
        event.call_id,
        event.caller_number,
        event.queue_name,
        request.method,
    )

    execute_query(
        """
        UPDATE mission_call_state
        SET state = 'hangup',
            ended_at = NOW(),
            updated_at = NOW(),
            last_payload = %s::jsonb
        WHERE call_id = %s
        """,
        (json.dumps(event.model_dump(mode="json")), event.call_id),
    )

    event_row = MissionService.insert_event(
        event_type="call_ended",
        title="Opkald afsluttet",
        severity="info",
        source="telefoni",
        payload={"call_id": event.call_id, "queue_name": event.queue_name, "caller_number": event.caller_number},
    )

    await mission_ws_manager.broadcast("call_hangup", {"call_id": event.call_id})
    await mission_ws_manager.broadcast("live_feed_event", event_row)
    return {"status": "ok"}


@router.get("/mission/webhook/telefoni/hangup")
async def mission_telefoni_hangup_get(request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    if not _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid"):
        logger.info("📴 Mission webhook hangup ping method=%s", request.method)
        return {"status": "ok", "mode": "ping"}

    event = _event_from_query(request)
    return await mission_telefoni_hangup(event, request, token)

@router.post("/mission/webhook/uptime")
async def mission_uptime_webhook(payload: MissionUptimeWebhook, request: Request, token: Optional[str] = Query(None)):
    _validate_mission_webhook_token(request, token)

    normalized = _normalize_uptime_payload(payload)
    status = normalized["status"]
    service_name = normalized["service_name"]
    customer_name = normalized["customer_name"]
    alert_key = MissionService.build_alert_key(service_name, customer_name)

    current = execute_query_single("SELECT is_active, started_at FROM mission_uptime_alerts WHERE alert_key = %s", (alert_key,))

    if status in {"DOWN", "DEGRADED"}:
        started_at = (current or {}).get("started_at")
        is_active = bool((current or {}).get("is_active"))
        if not started_at or not is_active:
            started_at = datetime.utcnow()

        execute_query(
            """
            INSERT INTO mission_uptime_alerts (
                alert_key, service_name, customer_name, status, is_active, started_at, resolved_at,
                updated_at, raw_payload, normalized_payload
            )
            VALUES (%s, %s, %s, %s, TRUE, %s, NULL, NOW(), %s::jsonb, %s::jsonb)
            ON CONFLICT (alert_key)
            DO UPDATE SET
                status = EXCLUDED.status,
                is_active = TRUE,
                started_at = COALESCE(mission_uptime_alerts.started_at, EXCLUDED.started_at),
                resolved_at = NULL,
                updated_at = NOW(),
                raw_payload = EXCLUDED.raw_payload,
                normalized_payload = EXCLUDED.normalized_payload
            """,
            (
                alert_key,
                service_name,
                customer_name,
                status,
                started_at,
                json.dumps(payload.model_dump(mode="json")),
                json.dumps(normalized, default=str),
            ),
        )

        event_type = "uptime_down" if status == "DOWN" else "uptime_degraded"
        severity = "critical" if status == "DOWN" else "warning"
        title = f"{service_name} er {status}"
    elif status == "UP":
        execute_query(
            """
            INSERT INTO mission_uptime_alerts (
                alert_key, service_name, customer_name, status, is_active, started_at, resolved_at,
                updated_at, raw_payload, normalized_payload
            )
            VALUES (%s, %s, %s, %s, FALSE, NULL, NOW(), NOW(), %s::jsonb, %s::jsonb)
            ON CONFLICT (alert_key)
            DO UPDATE SET
                status = EXCLUDED.status,
                is_active = FALSE,
                resolved_at = NOW(),
                updated_at = NOW(),
                raw_payload = EXCLUDED.raw_payload,
                normalized_payload = EXCLUDED.normalized_payload
            """,
            (
                alert_key,
                service_name,
                customer_name,
                status,
                json.dumps(payload.model_dump(mode="json")),
                json.dumps(normalized, default=str),
            ),
        )

        event_type = "uptime_up"
        severity = "success"
        title = f"{service_name} er UP"
    else:
        event_type = "uptime_unknown"
        severity = "info"
        title = f"{service_name} status ukendt"

    event_row = MissionService.insert_event(
        event_type=event_type,
        title=title,
        severity=severity,
        source="uptime",
        customer_name=customer_name,
        payload={"alert_key": alert_key, **normalized},
    )

    await mission_ws_manager.broadcast(
        "uptime_alert",
        {
            "alert_key": alert_key,
            "status": status,
            "service_name": service_name,
            "customer_name": customer_name,
            "active_alerts": MissionService.get_active_alerts(),
        },
    )
    await mission_ws_manager.broadcast("live_feed_event", event_row)

    return {"status": "ok", "normalized": normalized}
@@ -1,290 +0,0 @@
import json
import logging
from typing import Any, Dict, Optional

from app.core.database import execute_query, execute_query_single


logger = logging.getLogger(__name__)


class MissionService:
    @staticmethod
    def _safe(label: str, func, default):
        try:
            return func()
        except Exception as exc:
            logger.error("❌ Mission state component failed: %s (%s)", label, exc)
            return default

    @staticmethod
    def _table_exists(table_name: str) -> bool:
        row = execute_query_single("SELECT to_regclass(%s) AS table_name", (f"public.{table_name}",))
        return bool(row and row.get("table_name"))

    @staticmethod
    def get_ring_timeout_seconds() -> int:
        raw = MissionService.get_setting_value("mission_call_ring_timeout_seconds", "180") or "180"
        try:
            value = int(raw)
        except (TypeError, ValueError):
            value = 180
        return max(30, min(value, 3600))

    @staticmethod
    def expire_stale_ringing_calls() -> None:
        if not MissionService._table_exists("mission_call_state"):
            return

        timeout_seconds = MissionService.get_ring_timeout_seconds()
        execute_query(
            """
            UPDATE mission_call_state
            SET state = 'hangup',
                ended_at = COALESCE(ended_at, NOW()),
                updated_at = NOW()
            WHERE state = 'ringing'
              AND started_at < (NOW() - make_interval(secs => %s))
            """,
            (timeout_seconds,),
        )

    @staticmethod
    def get_setting_value(key: str, default: Optional[str] = None) -> Optional[str]:
        row = execute_query_single("SELECT value FROM settings WHERE key = %s", (key,))
        if not row:
            return default
        value = row.get("value")
        if value is None or value == "":
            return default
        return str(value)

    @staticmethod
    def parse_json_setting(key: str, default: Any) -> Any:
        raw = MissionService.get_setting_value(key, None)
        if raw is None:
            return default
        try:
            return json.loads(raw)
        except Exception:
            return default

    @staticmethod
    def build_alert_key(service_name: str, customer_name: Optional[str]) -> str:
        customer_part = (customer_name or "").strip().lower() or "global"
        return f"{service_name.strip().lower()}::{customer_part}"

    @staticmethod
    def resolve_contact_context(caller_number: Optional[str]) -> Dict[str, Optional[str]]:
        if not caller_number:
            return {"contact_name": None, "company_name": None, "customer_tag": None}

        query = """
            SELECT
                c.id,
                c.first_name,
                c.last_name,
                (
                    SELECT cu.name
                    FROM contact_companies cc
                    JOIN customers cu ON cu.id = cc.customer_id
                    WHERE cc.contact_id = c.id
                    ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
                    LIMIT 1
                ) AS company_name,
                (
                    SELECT t.name
                    FROM entity_tags et
                    JOIN tags t ON t.id = et.tag_id
                    WHERE et.entity_type = 'contact'
                      AND et.entity_id = c.id
                      AND LOWER(t.name) IN ('vip', 'serviceaftale', 'service agreement')
                    ORDER BY t.name
                    LIMIT 1
                ) AS customer_tag
            FROM contacts c
            WHERE RIGHT(regexp_replace(COALESCE(c.phone, ''), '\\D', '', 'g'), 8) = RIGHT(regexp_replace(%s, '\\D', '', 'g'), 8)
               OR RIGHT(regexp_replace(COALESCE(c.mobile, ''), '\\D', '', 'g'), 8) = RIGHT(regexp_replace(%s, '\\D', '', 'g'), 8)
            LIMIT 1
        """
        row = execute_query_single(query, (caller_number, caller_number))
        if not row:
            return {"contact_name": None, "company_name": None, "customer_tag": None}

        contact_name = f"{(row.get('first_name') or '').strip()} {(row.get('last_name') or '').strip()}".strip() or None
        return {
            "contact_name": contact_name,
            "company_name": row.get("company_name"),
            "customer_tag": row.get("customer_tag"),
        }

    @staticmethod
    def insert_event(
        *,
        event_type: str,
        title: str,
        severity: str = "info",
        source: Optional[str] = None,
        customer_name: Optional[str] = None,
        payload: Optional[Dict[str, Any]] = None,
    ) -> Dict[str, Any]:
        if not MissionService._table_exists("mission_events"):
            logger.warning("Mission table missing: mission_events (event skipped)")
            return {}

        rows = execute_query(
            """
            INSERT INTO mission_events (event_type, severity, title, source, customer_name, payload)
            VALUES (%s, %s, %s, %s, %s, %s::jsonb)
            RETURNING id, event_type, severity, title, source, customer_name, payload, created_at
            """,
            (event_type, severity, title, source, customer_name, json.dumps(payload or {})),
        )
        return rows[0] if rows else {}

    @staticmethod
    def get_kpis() -> Dict[str, int]:
        query = """
            SELECT
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet') AS open_cases,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) = 'åben' AND ansvarlig_bruger_id IS NULL) AS new_cases,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet' AND ansvarlig_bruger_id IS NULL) AS unassigned_cases,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet' AND deadline IS NOT NULL AND deadline::date = CURRENT_DATE) AS deadlines_today,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet' AND deadline IS NOT NULL AND deadline::date < CURRENT_DATE) AS overdue_deadlines
            FROM sag_sager
        """
        row = execute_query_single(query) or {}
        return {
            "open_cases": int(row.get("open_cases") or 0),
            "new_cases": int(row.get("new_cases") or 0),
            "unassigned_cases": int(row.get("unassigned_cases") or 0),
            "deadlines_today": int(row.get("deadlines_today") or 0),
            "overdue_deadlines": int(row.get("overdue_deadlines") or 0),
        }

    @staticmethod
    def get_employee_deadlines() -> list[Dict[str, Any]]:
        rows = execute_query(
            """
            SELECT
                COALESCE(u.full_name, u.username, 'Ukendt') AS employee_name,
                COUNT(*) FILTER (WHERE s.deadline::date = CURRENT_DATE) AS deadlines_today,
                COUNT(*) FILTER (WHERE s.deadline::date < CURRENT_DATE) AS overdue_deadlines
            FROM sag_sager s
            LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
            WHERE s.deleted_at IS NULL
              AND LOWER(s.status) <> 'afsluttet'
              AND s.deadline IS NOT NULL
            GROUP BY COALESCE(u.full_name, u.username, 'Ukendt')
            HAVING COUNT(*) FILTER (WHERE s.deadline::date = CURRENT_DATE) > 0
                OR COUNT(*) FILTER (WHERE s.deadline::date < CURRENT_DATE) > 0
            ORDER BY overdue_deadlines DESC, deadlines_today DESC, employee_name ASC
            """
        ) or []
        return [
            {
                "employee_name": row.get("employee_name"),
                "deadlines_today": int(row.get("deadlines_today") or 0),
                "overdue_deadlines": int(row.get("overdue_deadlines") or 0),
            }
            for row in rows
        ]

    @staticmethod
    def get_active_calls() -> list[Dict[str, Any]]:
        if not MissionService._table_exists("mission_call_state"):
            logger.warning("Mission table missing: mission_call_state (active calls unavailable)")
            return []

        MissionService.expire_stale_ringing_calls()
        rows = execute_query(
            """
            SELECT call_id, queue_name, caller_number, contact_name, company_name, customer_tag, state, started_at, answered_at, ended_at, updated_at
            FROM mission_call_state
            WHERE state = 'ringing'
            ORDER BY started_at DESC
            """
        )
        return rows or []

    @staticmethod
    def get_active_alerts() -> list[Dict[str, Any]]:
        if not MissionService._table_exists("mission_uptime_alerts"):
            logger.warning("Mission table missing: mission_uptime_alerts (active alerts unavailable)")
            return []

        rows = execute_query(
            """
            SELECT alert_key, service_name, customer_name, status, is_active, started_at, resolved_at, updated_at
            FROM mission_uptime_alerts
            WHERE is_active = TRUE
            ORDER BY started_at ASC NULLS LAST
            """
        )
        return rows or []

    @staticmethod
    def get_live_feed(limit: int = 20) -> list[Dict[str, Any]]:
        if not MissionService._table_exists("mission_events"):
            logger.warning("Mission table missing: mission_events (live feed unavailable)")
            return []

        rows = execute_query(
            """
            SELECT id, event_type, severity, title, source, customer_name, payload, created_at
            FROM mission_events
            ORDER BY created_at DESC
            LIMIT %s
            """,
            (limit,),
        )
        return rows or []

    @staticmethod
    def get_state() -> Dict[str, Any]:
        kpis_default = {
            "open_cases": 0,
            "new_cases": 0,
            "unassigned_cases": 0,
            "deadlines_today": 0,
            "overdue_deadlines": 0,
        }

        return {
            "kpis": MissionService._safe("kpis", MissionService.get_kpis, kpis_default),
            "active_calls": MissionService._safe("active_calls", MissionService.get_active_calls, []),
            "employee_deadlines": MissionService._safe("employee_deadlines", MissionService.get_employee_deadlines, []),
            "active_alerts": MissionService._safe("active_alerts", MissionService.get_active_alerts, []),
            "live_feed": MissionService._safe("live_feed", lambda: MissionService.get_live_feed(20), []),
            "config": {
                "display_queues": MissionService._safe("config.display_queues", lambda: MissionService.parse_json_setting("mission_display_queues", []), []),
                "sound_enabled": MissionService._safe(
                    "config.sound_enabled",
                    lambda: str(MissionService.get_setting_value("mission_sound_enabled", "true")).lower() == "true",
                    True,
                ),
                "sound_volume": MissionService._safe(
                    "config.sound_volume",
                    lambda: int(MissionService.get_setting_value("mission_sound_volume", "70") or 70),
                    70,
                ),
                "sound_events": MissionService._safe(
                    "config.sound_events",
                    lambda: MissionService.parse_json_setting("mission_sound_events", ["incoming_call", "uptime_down", "critical_event"]),
                    ["incoming_call", "uptime_down", "critical_event"],
                ),
                "kpi_visible": MissionService._safe(
                    "config.kpi_visible",
                    lambda: MissionService.parse_json_setting(
                        "mission_kpi_visible",
                        ["open_cases", "new_cases", "unassigned_cases", "deadlines_today", "overdue_deadlines"],
                    ),
                    ["open_cases", "new_cases", "unassigned_cases", "deadlines_today", "overdue_deadlines"],
                ),
                "customer_filter": MissionService._safe(
                    "config.customer_filter",
                    lambda: MissionService.get_setting_value("mission_customer_filter", "") or "",
                    "",
                ),
            },
        }
@@ -1,45 +0,0 @@
import asyncio
import json
import logging
from typing import Set

from fastapi import WebSocket

logger = logging.getLogger(__name__)


class MissionConnectionManager:
    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self._connections: Set[WebSocket] = set()

    async def connect(self, websocket: WebSocket) -> None:
        await websocket.accept()
        async with self._lock:
            self._connections.add(websocket)
        logger.info("📡 Mission WS connected (%s active)", len(self._connections))

    async def disconnect(self, websocket: WebSocket) -> None:
        async with self._lock:
            self._connections.discard(websocket)
        logger.info("📡 Mission WS disconnected (%s active)", len(self._connections))

    async def broadcast(self, event: str, payload: dict) -> None:
        message = json.dumps({"event": event, "data": payload}, default=str)
        async with self._lock:
            targets = list(self._connections)

        dead: list[WebSocket] = []
        for websocket in targets:
            try:
                await websocket.send_text(message)
            except Exception:
                dead.append(websocket)

        if dead:
            async with self._lock:
                for websocket in dead:
                    self._connections.discard(websocket)


mission_ws_manager = MissionConnectionManager()
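The broadcast method above snapshots the connection set under the lock, sends outside the lock, and prunes sockets that fail mid-send. That pattern can be exercised standalone with a stub in place of a real FastAPI `WebSocket` (`StubSocket` below is invented for the demo):

```python
import asyncio
import json

class StubSocket:
    """Stand-in for a WebSocket: records sent messages, or fails on demand."""
    def __init__(self, fail: bool = False):
        self.fail = fail
        self.sent: list[str] = []

    async def send_text(self, message: str) -> None:
        if self.fail:
            raise RuntimeError("client gone")
        self.sent.append(message)

async def broadcast(connections: set, event: str, payload: dict) -> None:
    message = json.dumps({"event": event, "data": payload}, default=str)
    dead = []
    for ws in list(connections):  # iterate over a snapshot
        try:
            await ws.send_text(message)
        except Exception:
            dead.append(ws)
    for ws in dead:
        connections.discard(ws)  # prune sockets that failed mid-send

live, broken = StubSocket(), StubSocket(fail=True)
conns = {live, broken}
asyncio.run(broadcast(conns, "kpi_update", {"open_cases": 3}))
# the broken socket is pruned; the live socket received one message
```

Collecting dead sockets and discarding them after the send loop avoids mutating the set while iterating it.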
@@ -125,24 +125,10 @@ async def dashboard(request: Request):

 from app.core.database import execute_query

-    try:
     result = execute_query_single(unknown_query)
     unknown_count = result['count'] if result else 0
-    except Exception as exc:
-        if "tticket_worklog" in str(exc):
-            logger.warning("⚠️ tticket_worklog table not found; defaulting unknown worklog count to 0")
-            unknown_count = 0
-        else:
-            raise

-    try:
     raw_alerts = execute_query(bankruptcy_query) or []
-    except Exception as exc:
-        if "email_messages" in str(exc):
-            logger.warning("⚠️ email_messages table not found; skipping bankruptcy alerts")
-            raw_alerts = []
-        else:
-            raise

     bankruptcy_alerts = []

     for alert in raw_alerts:
@@ -358,13 +344,3 @@ async def clear_default_dashboard(
 async def clear_default_dashboard_get_fallback():
     return RedirectResponse(url="/settings#system", status_code=303)

-
-@router.get("/dashboard/mission-control", response_class=HTMLResponse)
-async def mission_control_dashboard(request: Request):
-    return templates.TemplateResponse(
-        "dashboard/frontend/mission_control.html",
-        {
-            "request": request,
-        }
-    )
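The dashboard hunk above wraps each query in a try/except that degrades to a default value when the error mentions a specific not-yet-deployed table. That pattern can be factored out and tested in isolation; `query_with_fallback` and `fake_execute` below are illustrative stand-ins for the project's `execute_query` helpers, not names from the codebase:

```python
def query_with_fallback(execute, query, table_name, default):
    """Run a query; degrade to default only for errors naming table_name."""
    try:
        return execute(query)
    except Exception as exc:
        if table_name in str(exc):
            return default  # table not deployed in this environment; degrade gracefully
        raise  # any other failure still surfaces

def fake_execute(query):
    # Simulate PostgreSQL's missing-relation error for email_messages.
    raise RuntimeError('relation "email_messages" does not exist')

alerts = query_with_fallback(fake_execute, "SELECT ...", "email_messages", [])
# -> [] because the error message names email_messages
```

Matching on the table name keeps unrelated failures (connection errors, syntax errors) loud instead of silently swallowed.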
@@ -1,576 +0,0 @@
{% extends "shared/frontend/base.html" %}

{% block title %}Mission Control - BMC Hub{% endblock %}

{% block extra_css %}
<style>
    :root {
        --mc-bg: #0b1320;
        --mc-surface: #121d2f;
        --mc-surface-2: #16243a;
        --mc-border: #2c3c58;
        --mc-text: #e9f1ff;
        --mc-text-muted: #9fb3d1;
        --mc-danger: #ef4444;
        --mc-warning: #f59e0b;
        --mc-success: #10b981;
        --mc-info: #3b82f6;
    }

    body {
        background: var(--mc-bg) !important;
        color: var(--mc-text);
    }

    main.container-fluid {
        max-width: 100% !important;
        padding: 0.75rem 1rem 1rem 1rem !important;
    }

    .mc-grid {
        display: grid;
        gap: 0.75rem;
        grid-template-rows: auto 1fr auto;
        min-height: calc(100vh - 90px);
    }

    .mc-card {
        background: linear-gradient(180deg, var(--mc-surface) 0%, var(--mc-surface-2) 100%);
        border: 1px solid var(--mc-border);
        border-radius: 14px;
        padding: 0.75rem 1rem;
    }

    .mc-top {
        display: grid;
        grid-template-columns: 1fr;
        gap: 0.75rem;
    }

    .mc-alert-bar {
        display: flex;
        align-items: center;
        gap: 0.75rem;
        font-size: 1.15rem;
        font-weight: 700;
        padding: 0.9rem 1rem;
        border-radius: 12px;
    }

    .mc-alert-bar.down {
        background: rgba(239, 68, 68, 0.18);
        border: 1px solid rgba(239, 68, 68, 0.55);
        color: #ffd6d6;
    }

    .mc-alert-empty {
        color: var(--mc-text-muted);
        font-size: 0.95rem;
    }

    .mc-middle {
        display: grid;
        grid-template-columns: 2fr 1fr;
        gap: 0.75rem;
        min-height: 0;
    }

    .mc-kpi-grid {
        display: grid;
        grid-template-columns: repeat(5, minmax(0, 1fr));
        gap: 0.65rem;
    }

    .mc-kpi {
        background: rgba(255, 255, 255, 0.03);
        border: 1px solid var(--mc-border);
        border-radius: 12px;
        padding: 0.85rem;
    }

    .mc-kpi .label {
        font-size: 0.8rem;
        text-transform: uppercase;
        letter-spacing: 0.05em;
        color: var(--mc-text-muted);
    }

    .mc-kpi .value {
        font-size: 2rem;
        line-height: 1;
        font-weight: 800;
        margin-top: 0.45rem;
    }

    .mc-kpi.warning { border-color: rgba(245, 158, 11, 0.55); }
    .mc-kpi.danger { border-color: rgba(239, 68, 68, 0.55); }

    .mc-call-overlay {
        display: none;
        margin-top: 0.75rem;
        background: rgba(59, 130, 246, 0.14);
        border: 2px solid rgba(59, 130, 246, 0.65);
        border-radius: 14px;
        padding: 1rem;
    }

    .mc-call-overlay.active {
        display: block;
    }

    .mc-call-title {
        font-size: 1.7rem;
        font-weight: 800;
        margin-bottom: 0.35rem;
    }

    .mc-call-meta {
        display: flex;
        flex-wrap: wrap;
        gap: 0.5rem;
        font-size: 1.05rem;
    }

    .mc-badge {
        border-radius: 999px;
        border: 1px solid var(--mc-border);
        background: rgba(255, 255, 255, 0.05);
        padding: 0.18rem 0.55rem;
        font-size: 0.85rem;
        color: var(--mc-text-muted);
    }

    .mc-bottom {
        display: grid;
        grid-template-columns: 1.2fr 1fr;
        gap: 0.75rem;
        min-height: 0;
    }

    .mc-table,
    .mc-feed {
        max-height: 30vh;
        overflow: auto;
    }

    .mc-row {
        display: grid;
        grid-template-columns: 2fr 1fr 1fr;
        gap: 0.5rem;
        padding: 0.4rem 0;
        border-bottom: 1px solid rgba(159, 179, 209, 0.12);
        font-size: 0.95rem;
    }

    .mc-row:last-child {
        border-bottom: none;
    }

    .mc-feed-item {
        padding: 0.5rem 0;
        border-bottom: 1px solid rgba(159, 179, 209, 0.12);
    }

    .mc-feed-item:last-child {
        border-bottom: none;
    }

    .mc-feed-title {
        font-weight: 600;
    }

    .mc-feed-meta {
        color: var(--mc-text-muted);
        font-size: 0.8rem;
    }

    .mc-controls {
        display: flex;
        align-items: center;
        gap: 0.7rem;
        flex-wrap: wrap;
        margin-top: 0.4rem;
    }

    .mc-controls label {
        color: var(--mc-text-muted);
        font-size: 0.85rem;
    }

    .mc-connection {
        font-size: 0.8rem;
        color: var(--mc-text-muted);
    }

    .mc-hidden {
        display: none !important;
    }

    @media (max-width: 1300px) {
        .mc-middle,
        .mc-bottom {
            grid-template-columns: 1fr;
        }
        .mc-kpi-grid {
            grid-template-columns: repeat(2, minmax(0, 1fr));
        }
    }
</style>
{% endblock %}

{% block content %}
<div class="mc-grid">
    <section class="mc-top">
        <div class="mc-card">
            <div id="alertContainer" class="mc-alert-empty">Ingen aktive driftsalarmer</div>
            <div class="mc-controls">
                <label><input type="checkbox" id="soundEnabledToggle" checked> Lyd aktiv</label>
                <label>Lydniveau <input type="range" id="soundVolume" min="0" max="100" value="70"></label>
                <span id="connectionState" class="mc-connection">Forbinder...</span>
            </div>
        </div>
    </section>

    <section class="mc-middle">
        <div class="mc-card">
            <h4 class="mb-3">Opgave-overblik</h4>
            <div id="kpiGrid" class="mc-kpi-grid"></div>
            <div id="callOverlay" class="mc-call-overlay">
                <div class="mc-call-title">Indgående opkald</div>
                <div id="callPrimary" style="font-size:1.35rem;font-weight:700;"></div>
                <div id="callSecondary" class="mc-call-meta mt-2"></div>
            </div>
        </div>

        <div class="mc-card">
            <h4 class="mb-3">Aktive opkald</h4>
            <div id="activeCallsList" class="mc-feed"></div>
        </div>
    </section>

    <section class="mc-bottom">
        <div class="mc-card">
            <h4 class="mb-3">Deadlines pr. medarbejder</h4>
            <div class="mc-row" style="font-weight:700;color:var(--mc-text-muted);text-transform:uppercase;font-size:0.75rem;">
                <div>Medarbejder</div>
                <div>I dag</div>
                <div>Overskredet</div>
            </div>
            <div id="deadlineTable" class="mc-table"></div>
        </div>

        <div class="mc-card">
            <h4 class="mb-3">Live aktivitetsfeed</h4>
            <div id="liveFeed" class="mc-feed"></div>
        </div>
    </section>
</div>

<script>
(() => {
    const kpiLabels = {
        open_cases: 'Åbne sager',
        new_cases: 'Nye sager',
        unassigned_cases: 'Uden ansvarlig',
        deadlines_today: 'Deadline i dag',
        overdue_deadlines: 'Overskredne'
    };

    const state = {
        ws: null,
        reconnectAttempts: 0,
        reconnectTimer: null,
        failures: 0,
        config: {
            sound_enabled: true,
            sound_volume: 70,
            sound_events: ['incoming_call', 'uptime_down', 'critical_event'],
            kpi_visible: Object.keys(kpiLabels),
            display_queues: []
        },
        activeCalls: [],
        activeAlerts: [],
        liveFeed: []
    };

    function updateConnectionLabel(text) {
        const el = document.getElementById('connectionState');
        if (el) el.textContent = text;
    }

    function playTone(type) {
        const soundEnabledToggle = document.getElementById('soundEnabledToggle');
        if (!soundEnabledToggle || !soundEnabledToggle.checked) return;

        if (!state.config.sound_events.includes(type)) return;

        const volumeSlider = document.getElementById('soundVolume');
        const volumePct = Number(volumeSlider?.value || state.config.sound_volume || 70);
        const gainValue = Math.max(0, Math.min(1, volumePct / 100));

        const AudioCtx = window.AudioContext || window.webkitAudioContext;
        if (!AudioCtx) return;

        const context = new AudioCtx();
        const oscillator = context.createOscillator();
        const gainNode = context.createGain();

        oscillator.type = 'sine';
        oscillator.frequency.value = type === 'uptime_down' ? 260 : 620;
        gainNode.gain.value = gainValue * 0.2;

        oscillator.connect(gainNode);
        gainNode.connect(context.destination);
        oscillator.start();
        oscillator.stop(context.currentTime + (type === 'uptime_down' ? 0.35 : 0.15));
    }
    function escapeHtml(str) {
        return String(str ?? '')
            .replaceAll('&', '&amp;')
            .replaceAll('<', '&lt;')
            .replaceAll('>', '&gt;')
            .replaceAll('"', '&quot;')
            .replaceAll("'", '&#39;');
    }

    function formatDate(value) {
        if (!value) return '-';
        const d = new Date(value);
        if (Number.isNaN(d.getTime())) return '-';
        return d.toLocaleString('da-DK');
    }

    function renderKpis(kpis = {}) {
        const container = document.getElementById('kpiGrid');
        if (!container) return;

        const visible = Array.isArray(state.config.kpi_visible) && state.config.kpi_visible.length
            ? state.config.kpi_visible
            : Object.keys(kpiLabels);

        container.innerHTML = visible.map((key) => {
            const value = Number(kpis[key] ?? 0);
            const variant = key === 'overdue_deadlines' && value > 0
                ? 'danger'
                : key === 'deadlines_today' && value > 0
                    ? 'warning'
                    : '';
            return `
                <div class="mc-kpi ${variant}">
                    <div class="label">${escapeHtml(kpiLabels[key] || key)}</div>
                    <div class="value">${value}</div>
                </div>
            `;
        }).join('');
    }

    function renderActiveCalls() {
        const list = document.getElementById('activeCallsList');
        const overlay = document.getElementById('callOverlay');
        const primary = document.getElementById('callPrimary');
        const secondary = document.getElementById('callSecondary');

        if (!list || !overlay || !primary || !secondary) return;

        const queueFilter = Array.isArray(state.config.display_queues) ? state.config.display_queues : [];
        const calls = state.activeCalls.filter(c => {
            if (!queueFilter.length) return true;
            return queueFilter.includes(c.queue_name);
        });

        if (!calls.length) {
            list.innerHTML = '<div class="mc-feed-meta">Ingen aktive opkald</div>';
            overlay.classList.remove('active');
            return;
        }

        const call = calls[0];
        overlay.classList.add('active');
        primary.textContent = `${call.queue_name || 'Ukendt kø'} • ${call.caller_number || 'Ukendt nummer'}`;
        secondary.innerHTML = [
            call.contact_name ? `<span class="mc-badge">${escapeHtml(call.contact_name)}</span>` : '',
            call.company_name ? `<span class="mc-badge">${escapeHtml(call.company_name)}</span>` : '',
            call.customer_tag ? `<span class="mc-badge">${escapeHtml(call.customer_tag)}</span>` : '',
            call.started_at ? `<span class="mc-badge">${escapeHtml(formatDate(call.started_at))}</span>` : ''
        ].join(' ');

        list.innerHTML = calls.map((item) => `
            <div class="mc-feed-item">
                <div class="mc-feed-title">${escapeHtml(item.queue_name || 'Ukendt kø')} • ${escapeHtml(item.caller_number || '-')}</div>
                <div class="mc-feed-meta">
                    ${escapeHtml(item.contact_name || 'Ukendt kontakt')}
                    ${item.company_name ? ` • ${escapeHtml(item.company_name)}` : ''}
                </div>
            </div>
        `).join('');
    }

    function renderAlerts() {
        const container = document.getElementById('alertContainer');
        if (!container) return;

        if (!state.activeAlerts.length) {
            container.className = 'mc-alert-empty';
            container.textContent = 'Ingen aktive driftsalarmer';
            return;
        }

        container.className = '';
        container.innerHTML = state.activeAlerts.map((alert) => `
            <div class="mc-alert-bar down mb-2">
                <span>🚨</span>
                <span>${escapeHtml(alert.service_name || 'Ukendt service')}</span>
                ${alert.customer_name ? `<span class="mc-badge">${escapeHtml(alert.customer_name)}</span>` : ''}
                <span class="mc-badge">Start: ${escapeHtml(formatDate(alert.started_at))}</span>
            </div>
        `).join('');
    }

    function renderDeadlines(rows = []) {
        const table = document.getElementById('deadlineTable');
        if (!table) return;
        if (!rows.length) {
            table.innerHTML = '<div class="mc-feed-meta py-2">Ingen deadlines i dag eller overskredne</div>';
            return;
        }
        table.innerHTML = rows.map((row) => `
            <div class="mc-row">
                <div>${escapeHtml(row.employee_name || 'Ukendt')}</div>
                <div>${Number(row.deadlines_today || 0)}</div>
                <div style="color:${Number(row.overdue_deadlines || 0) > 0 ? '#ff9d9d' : 'inherit'}">${Number(row.overdue_deadlines || 0)}</div>
            </div>
        `).join('');
    }

    function renderFeed() {
        const feed = document.getElementById('liveFeed');
        if (!feed) return;

        if (!state.liveFeed.length) {
            feed.innerHTML = '<div class="mc-feed-meta">Ingen events endnu</div>';
            return;
        }

        feed.innerHTML = state.liveFeed.slice(0, 20).map((event) => `
            <div class="mc-feed-item">
                <div class="mc-feed-title">${escapeHtml(event.title || event.event_type || 'Event')}</div>
                <div class="mc-feed-meta">${escapeHtml(event.event_type || 'event')} • ${escapeHtml(formatDate(event.created_at))}</div>
            </div>
        `).join('');
    }

    function renderState(payload) {
        if (!payload) return;
        state.config = { ...state.config, ...(payload.config || {}) };
        state.activeCalls = Array.isArray(payload.active_calls) ? payload.active_calls : state.activeCalls;
        state.activeAlerts = Array.isArray(payload.active_alerts) ? payload.active_alerts : state.activeAlerts;
        state.liveFeed = Array.isArray(payload.live_feed) ? payload.live_feed : state.liveFeed;

        const soundToggle = document.getElementById('soundEnabledToggle');
        const volumeSlider = document.getElementById('soundVolume');
        if (soundToggle) soundToggle.checked = !!state.config.sound_enabled;
        if (volumeSlider) volumeSlider.value = String(state.config.sound_volume || 70);

        renderKpis(payload.kpis || {});
        renderActiveCalls();
        renderAlerts();
        renderDeadlines(Array.isArray(payload.employee_deadlines) ? payload.employee_deadlines : []);
        renderFeed();
    }

    async function loadInitialState() {
        const res = await fetch('/api/v1/mission/state', { credentials: 'include' });
        if (!res.ok) throw new Error('Kunne ikke hente mission state');
        const payload = await res.json();
        renderState(payload);
    }

    function scheduleReconnect() {
        if (state.reconnectTimer) return;
        state.reconnectAttempts += 1;
        const delay = Math.min(30000, 1500 * state.reconnectAttempts);
        updateConnectionLabel(`Frakoblet • reconnect om ${Math.round(delay / 1000)}s`);
        state.reconnectTimer = setTimeout(() => {
            state.reconnectTimer = null;
            connectWs();
        }, delay);
    }

    function connectWs() {
        const proto = window.location.protocol === 'https:' ? 'wss' : 'ws';
        const url = `${proto}://${window.location.host}/api/v1/mission/ws`;
        state.ws = new WebSocket(url);

        state.ws.onopen = () => {
            state.reconnectAttempts = 0;
            updateConnectionLabel('Live forbindelse aktiv');
        };

        state.ws.onclose = () => {
            state.failures += 1;
            if (state.failures >= 12) {
                window.location.reload();
                return;
            }
            scheduleReconnect();
        };

        state.ws.onerror = () => {};

        state.ws.onmessage = (evt) => {
            try {
                const msg = JSON.parse(evt.data);
                const event = msg?.event;
                const data = msg?.data || {};

                if (event === 'mission_state') {
                    renderState(data);
                    return;
                }
                if (event === 'kpi_update') {
                    renderKpis(data);
                    return;
                }
                if (event === 'call_ringing') {
                    state.activeCalls = [data, ...state.activeCalls.filter(c => c.call_id !== data.call_id)];
                    renderActiveCalls();
                    playTone('incoming_call');
                    return;
                }
                if (event === 'call_answered' || event === 'call_hangup') {
                    const id = data.call_id;
                    state.activeCalls = state.activeCalls.filter(c => c.call_id !== id);
                    renderActiveCalls();
                    return;
                }
                if (event === 'uptime_alert') {
                    state.activeAlerts = Array.isArray(data.active_alerts) ? data.active_alerts : state.activeAlerts;
                    renderAlerts();
                    if ((data.status || '').toUpperCase() === 'DOWN') {
                        playTone('uptime_down');
                    }
                    return;
                }
                if (event === 'live_feed_event') {
                    state.liveFeed = [data, ...state.liveFeed.filter(item => item.id !== data.id)].slice(0, 20);
                    renderFeed();
                }
            } catch (error) {
                console.error('Mission message parse failed', error);
            }
        };
    }

    document.addEventListener('DOMContentLoaded', async () => {
        try {
            await loadInitialState();
        } catch (error) {
            updateConnectionLabel('Fejl ved initial load');
            console.error(error);
        }
        connectWs();
    });
})();
</script>
{% endblock %}
File diff suppressed because it is too large
@@ -7,9 +7,11 @@ Runs daily at 04:00

 import logging
 from datetime import datetime, date
+from decimal import Decimal
 import json
 from dateutil.relativedelta import relativedelta

+from app.core.config import settings
 from app.core.database import execute_query, get_db_connection

 logger = logging.getLogger(__name__)
@@ -17,11 +19,11 @@ logger = logging.getLogger(__name__)

 async def process_subscriptions():
     """
-    Main job: Process subscriptions due for invoicing.
-    - Find active subscriptions where next_invoice_date <= today
-    - Skip subscriptions blocked for invoicing (missing asset/serial)
-    - Aggregate eligible subscriptions into one ordre_draft per customer + merge key + due date + billing direction
-    - Advance period_start and next_invoice_date for processed subscriptions
+    Main job: Process subscriptions due for invoicing
+    - Find active subscriptions where next_invoice_date <= TODAY
+    - Create ordre draft with line items from subscription
+    - Advance period_start and next_invoice_date based on billing_interval
+    - Log all actions for audit trail
     """

     try:
@@ -37,14 +39,9 @@ async def process_subscriptions():
         c.name AS customer_name,
         s.product_name,
         s.billing_interval,
-        s.billing_direction,
-        s.advance_months,
         s.price,
         s.next_invoice_date,
         s.period_start,
-        s.invoice_merge_key,
-        s.billing_blocked,
-        s.billing_block_reason,
         COALESCE(
             (
                 SELECT json_agg(
@@ -54,12 +51,7 @@ async def process_subscriptions():
                     'quantity', si.quantity,
                     'unit_price', si.unit_price,
                     'line_total', si.line_total,
-                    'product_id', si.product_id,
-                    'asset_id', si.asset_id,
-                    'billing_blocked', si.billing_blocked,
-                    'billing_block_reason', si.billing_block_reason,
-                    'period_from', si.period_from,
-                    'period_to', si.period_to
+                    'product_id', si.product_id
                 ) ORDER BY si.id
             )
         FROM sag_subscription_items si
@@ -83,185 +75,109 @@ async def process_subscriptions():

         logger.info(f"📋 Found {len(subscriptions)} subscription(s) to process")

-        blocked_count = 0
         processed_count = 0
         error_count = 0

-        grouped_subscriptions = {}
         for sub in subscriptions:
-            if sub.get('billing_blocked'):
-                blocked_count += 1
-                logger.warning(
-                    "⚠️ Subscription %s skipped due to billing block: %s",
-                    sub.get('id'),
-                    sub.get('billing_block_reason') or 'unknown reason'
-                )
-                continue
-
-            group_key = (
-                int(sub['customer_id']),
-                str(sub.get('invoice_merge_key') or f"cust-{sub['customer_id']}"),
-                str(sub.get('next_invoice_date')),
-                str(sub.get('billing_direction') or 'forward'),
-            )
-            grouped_subscriptions.setdefault(group_key, []).append(sub)
-
-        for group in grouped_subscriptions.values():
             try:
-                count = await _process_subscription_group(group)
-                processed_count += count
+                await _process_single_subscription(sub)
+                processed_count += 1
             except Exception as e:
-                logger.error("❌ Failed processing subscription group: %s", e, exc_info=True)
+                logger.error(f"❌ Failed to process subscription {sub['id']}: {e}", exc_info=True)
                 error_count += 1

-        logger.info(
-            "✅ Subscription processing complete: %s processed, %s blocked, %s errors",
-            processed_count,
-            blocked_count,
-            error_count,
-        )
+        logger.info(f"✅ Subscription processing complete: {processed_count} processed, {error_count} errors")

     except Exception as e:
         logger.error(f"❌ Subscription processing job failed: {e}", exc_info=True)


-async def _process_subscription_group(subscriptions: list[dict]) -> int:
-    """Create one aggregated ordre draft for a group of subscriptions and advance all periods."""
+async def _process_single_subscription(sub: dict):
+    """Process a single subscription: create ordre draft and advance period"""

-    if not subscriptions:
-        return 0
-
-    first = subscriptions[0]
-    customer_id = first['customer_id']
-    customer_name = first.get('customer_name') or f"Customer #{customer_id}"
-    billing_direction = first.get('billing_direction') or 'forward'
-    invoice_aggregate_key = first.get('invoice_merge_key') or f"cust-{customer_id}"
+    subscription_id = sub['id']
+    logger.info(f"Processing subscription #{subscription_id}: {sub['product_name']} for {sub['customer_name']}")

     conn = get_db_connection()
     cursor = conn.cursor()

     try:
+        # Convert line_items from JSON to list
+        line_items = sub.get('line_items', [])
+        if isinstance(line_items, str):
+            line_items = json.loads(line_items)
+
+        # Build ordre draft lines_json
         ordre_lines = []
-        source_subscription_ids = []
-        coverage_start = None
-        coverage_end = None
-
-        for sub in subscriptions:
-            subscription_id = int(sub['id'])
-            source_subscription_ids.append(subscription_id)
-
-            line_items = sub.get('line_items', [])
-            if isinstance(line_items, str):
-                line_items = json.loads(line_items)
-
-            period_start = sub.get('period_start') or sub.get('next_invoice_date')
-            period_end = _calculate_next_period_start(period_start, sub['billing_interval'])
-            if coverage_start is None or period_start < coverage_start:
-                coverage_start = period_start
-            if coverage_end is None or period_end > coverage_end:
-                coverage_end = period_end
-
-            for item in line_items:
-                if item.get('billing_blocked'):
-                    logger.warning(
-                        "⚠️ Skipping blocked subscription item %s on subscription %s",
-                        item.get('id'),
-                        subscription_id,
-                    )
-                    continue
-
-                product_number = str(item.get('product_id', 'SUB'))
-                ordre_lines.append({
-                    "product": {
-                        "productNumber": product_number,
-                        "description": item.get('description', '')
-                    },
-                    "quantity": float(item.get('quantity', 1)),
-                    "unitNetPrice": float(item.get('unit_price', 0)),
-                    "totalNetAmount": float(item.get('line_total', 0)),
-                    "discountPercentage": 0,
-                    "metadata": {
-                        "subscription_id": subscription_id,
-                        "asset_id": item.get('asset_id'),
-                        "period_from": str(item.get('period_from') or period_start),
-                        "period_to": str(item.get('period_to') or period_end),
-                    }
-                })
-
-        if not ordre_lines:
-            logger.warning("⚠️ No invoiceable lines in subscription group for customer %s", customer_id)
-            return 0
-
-        title = f"Abonnementer: {customer_name}"
-        notes = (
-            f"Aggregated abonnement faktura\n"
-            f"Kunde: {customer_name}\n"
-            f"Coverage: {coverage_start} til {coverage_end}\n"
-            f"Subscription IDs: {', '.join(str(sid) for sid in source_subscription_ids)}"
-        )
+        for item in line_items:
+            product_number = str(item.get('product_id', 'SUB'))
+            ordre_lines.append({
+                "product": {
+                    "productNumber": product_number,
+                    "description": item.get('description', '')
+                },
+                "quantity": float(item.get('quantity', 1)),
+                "unitNetPrice": float(item.get('unit_price', 0)),
+                "totalNetAmount": float(item.get('line_total', 0)),
+                "discountPercentage": 0
+            })
+
+        # Create ordre draft title with period information
+        period_start = sub.get('period_start') or sub.get('next_invoice_date')
+        next_period_start = _calculate_next_period_start(period_start, sub['billing_interval'])
+
+        title = f"Abonnement: {sub['product_name']}"
+        notes = f"Periode: {period_start} til {next_period_start}\nAbonnement ID: {subscription_id}"
+
+        if sub.get('sag_id'):
+            notes += f"\nSag: {sub['sag_name']}"

+        # Insert ordre draft
         insert_query = """
             INSERT INTO ordre_drafts (
                 title,
                 customer_id,
                 lines_json,
                 notes,
-                coverage_start,
-                coverage_end,
-                billing_direction,
-                source_subscription_ids,
-                invoice_aggregate_key,
                 layout_number,
                 created_by_user_id,
-                sync_status,
                 export_status_json,
                 updated_at
-            ) VALUES (%s, %s, %s::jsonb, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s::jsonb, CURRENT_TIMESTAMP)
+            ) VALUES (%s, %s, %s::jsonb, %s, %s, %s, %s::jsonb, CURRENT_TIMESTAMP)
             RETURNING id
         """

         cursor.execute(insert_query, (
             title,
-            customer_id,
+            sub['customer_id'],
             json.dumps(ordre_lines, ensure_ascii=False),
             notes,
-            coverage_start,
-            coverage_end,
-            billing_direction,
-            source_subscription_ids,
-            invoice_aggregate_key,
             1,  # Default layout
|
||||||
None, # System-created
|
None, # System-created
|
||||||
'pending',
|
json.dumps({"source": "subscription", "subscription_id": subscription_id}, ensure_ascii=False)
|
||||||
json.dumps({"source": "subscription", "subscription_ids": source_subscription_ids}, ensure_ascii=False)
|
|
||||||
))
|
))
|
||||||
|
|
||||||
ordre_id = cursor.fetchone()[0]
|
ordre_id = cursor.fetchone()[0]
|
||||||
logger.info(
|
logger.info(f"✅ Created ordre draft #{ordre_id} for subscription #{subscription_id}")
|
||||||
"✅ Created aggregated ordre draft #%s for %s subscription(s)",
|
|
||||||
ordre_id,
|
|
||||||
len(source_subscription_ids),
|
|
||||||
)
|
|
||||||
|
|
||||||
for sub in subscriptions:
|
# Calculate new period dates
|
||||||
subscription_id = int(sub['id'])
|
current_period_start = sub.get('period_start') or sub.get('next_invoice_date')
|
||||||
current_period_start = sub.get('period_start') or sub.get('next_invoice_date')
|
new_period_start = next_period_start
|
||||||
new_period_start = _calculate_next_period_start(current_period_start, sub['billing_interval'])
|
new_next_invoice_date = _calculate_next_period_start(new_period_start, sub['billing_interval'])
|
||||||
new_next_invoice_date = _calculate_next_period_start(new_period_start, sub['billing_interval'])
|
|
||||||
|
|
||||||
cursor.execute(
|
# Update subscription with new period dates
|
||||||
"""
|
update_query = """
|
||||||
UPDATE sag_subscriptions
|
UPDATE sag_subscriptions
|
||||||
SET period_start = %s,
|
SET period_start = %s,
|
||||||
next_invoice_date = %s,
|
next_invoice_date = %s,
|
||||||
updated_at = CURRENT_TIMESTAMP
|
updated_at = CURRENT_TIMESTAMP
|
||||||
WHERE id = %s
|
WHERE id = %s
|
||||||
""",
|
"""
|
||||||
(new_period_start, new_next_invoice_date, subscription_id)
|
|
||||||
)
|
cursor.execute(update_query, (new_period_start, new_next_invoice_date, subscription_id))
|
||||||
|
|
||||||
conn.commit()
|
conn.commit()
|
||||||
return len(source_subscription_ids)
|
logger.info(f"✅ Advanced subscription #{subscription_id}: next invoice {new_next_invoice_date}")
|
||||||
|
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
conn.rollback()
|
conn.rollback()
|
||||||
|
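The billing code above advances each subscription by calling `_calculate_next_period_start`, whose implementation is not included in this diff. A minimal sketch of such a helper, assuming month-based intervals named `monthly`/`quarterly`/`yearly` (the interval names, clamping behavior, and function name are assumptions, not confirmed by the diff):

```python
from datetime import date

# Hypothetical sketch only; the project's real _calculate_next_period_start
# may differ in interval names and day-clamping rules.
INTERVAL_MONTHS = {"monthly": 1, "quarterly": 3, "yearly": 12}

def calculate_next_period_start(period_start: date, billing_interval: str) -> date:
    """Advance a period start by one billing interval (month arithmetic)."""
    months = INTERVAL_MONTHS.get(billing_interval, 1)
    total = period_start.month - 1 + months
    year = period_start.year + total // 12
    month = total % 12 + 1
    # Clamp day 29-31 down to the last day of a shorter target month.
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30,
                     31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(period_start.day, days_in_month))
```

With this sketch, advancing a Jan 31 monthly period lands on the last day of February rather than overflowing into March.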
@@ -1,119 +0,0 @@
-"""
-Reconcile ordre draft sync lifecycle.
-
-Promotes sync_status based on known economic references on ordre_drafts:
-- pending/failed + economic_order_number -> exported
-- exported + economic_invoice_number -> posted
-"""
-
-import logging
-from typing import Any, Dict, List
-
-from app.core.database import execute_query, get_db_connection, release_db_connection
-from app.services.economic_service import get_economic_service
-from psycopg2.extras import RealDictCursor
-
-logger = logging.getLogger(__name__)
-
-
-async def reconcile_ordre_drafts_sync_status(apply_changes: bool = True) -> Dict[str, Any]:
-    """Reconcile ordre_drafts sync statuses and optionally persist changes."""
-
-    drafts = execute_query(
-        """
-        SELECT id, sync_status, economic_order_number, economic_invoice_number
-        FROM ordre_drafts
-        ORDER BY id ASC
-        """,
-        (),
-    ) or []
-
-    changes: List[Dict[str, Any]] = []
-    invoice_status_cache: Dict[str, str] = {}
-    economic_service = get_economic_service()
-
-    for draft in drafts:
-        current = (draft.get("sync_status") or "pending").strip().lower()
-        target = current
-
-        if current in {"pending", "failed"} and draft.get("economic_order_number"):
-            target = "exported"
-        if target == "exported" and draft.get("economic_invoice_number"):
-            target = "posted"
-
-        invoice_number = str(draft.get("economic_invoice_number") or "").strip()
-        if invoice_number:
-            if invoice_number not in invoice_status_cache:
-                invoice_status_cache[invoice_number] = await economic_service.get_invoice_lifecycle_status(invoice_number)
-            lifecycle = invoice_status_cache[invoice_number]
-            if lifecycle == "paid":
-                target = "paid"
-            elif lifecycle in {"booked", "unpaid"} and target in {"pending", "failed", "exported"}:
-                target = "posted"
-
-        if target != current:
-            changes.append(
-                {
-                    "draft_id": draft.get("id"),
-                    "from": current,
-                    "to": target,
-                    "economic_invoice_number": invoice_number or None,
-                }
-            )
-
-    if apply_changes and changes:
-        conn = get_db_connection()
-        try:
-            with conn.cursor(cursor_factory=RealDictCursor) as cursor:
-                for change in changes:
-                    cursor.execute(
-                        """
-                        UPDATE ordre_drafts
-                        SET sync_status = %s,
-                            last_sync_at = CURRENT_TIMESTAMP,
-                            updated_at = CURRENT_TIMESTAMP,
-                            last_exported_at = CASE
-                                WHEN %s IN ('exported', 'posted', 'paid') THEN CURRENT_TIMESTAMP
-                                ELSE last_exported_at
-                            END
-                        WHERE id = %s
-                        """,
-                        (change["to"], change["to"], change["draft_id"]),
-                    )
-                    cursor.execute(
-                        """
-                        INSERT INTO ordre_draft_sync_events (
-                            draft_id,
-                            event_type,
-                            from_status,
-                            to_status,
-                            event_payload,
-                            created_by_user_id
-                        ) VALUES (%s, %s, %s, %s, %s::jsonb, NULL)
-                        """,
-                        (
-                            change["draft_id"],
-                            "sync_status_reconcile",
-                            change["from"],
-                            change["to"],
-                            '{"source":"reconcile_job"}',
-                        ),
-                    )
-            conn.commit()
-        except Exception:
-            conn.rollback()
-            raise
-        finally:
-            release_db_connection(conn)
-
-    logger.info(
-        "✅ Reconciled ordre draft sync status: %s changes (%s)",
-        len(changes),
-        "applied" if apply_changes else "preview",
-    )
-
-    return {
-        "status": "applied" if apply_changes else "preview",
-        "change_count": len(changes),
-        "changes": changes,
-    }
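The removed reconcile module above promotes `sync_status` along a fixed lifecycle (pending/failed → exported → posted → paid). Those promotion rules can be isolated as a pure function; this is a sketch restating the diff's logic outside the database loop, not an API the project exposes:

```python
from typing import Optional

def promote_sync_status(current: str,
                        has_order_number: bool,
                        has_invoice_number: bool,
                        invoice_lifecycle: Optional[str]) -> str:
    """Restate the reconcile job's promotion rules as a pure function."""
    target = current
    # pending/failed + known order number -> exported
    if target in {"pending", "failed"} and has_order_number:
        target = "exported"
    # exported + known invoice number -> posted
    if target == "exported" and has_invoice_number:
        target = "posted"
    # The invoice lifecycle reported by the accounting system wins.
    if invoice_lifecycle == "paid":
        target = "paid"
    elif invoice_lifecycle in {"booked", "unpaid"} and target in {"pending", "failed", "exported"}:
        target = "posted"
    return target
```

Keeping the state machine pure like this makes the promotion rules unit-testable without a database or the e-conomic client.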
@@ -74,16 +74,9 @@ class VendorBase(BaseModel):
     domain: Optional[str] = None
     email: Optional[str] = None
     phone: Optional[str] = None
-    address: Optional[str] = None
-    postal_code: Optional[str] = None
-    city: Optional[str] = None
-    website: Optional[str] = None
-    email_pattern: Optional[str] = None
     contact_person: Optional[str] = None
     category: Optional[str] = None
-    priority: Optional[int] = 100
     notes: Optional[str] = None
-    is_active: bool = True
 
 
 class VendorCreate(VendorBase):
@@ -107,6 +100,7 @@ class VendorUpdate(BaseModel):
 class Vendor(VendorBase):
     """Full vendor schema"""
     id: int
+    is_active: bool = True
     created_at: datetime
     updated_at: Optional[datetime] = None
 
@@ -280,7 +274,6 @@ class TodoStepCreate(TodoStepBase):
 class TodoStepUpdate(BaseModel):
     """Schema for updating a todo step"""
     is_done: Optional[bool] = None
-    is_next: Optional[bool] = None
 
 
 class TodoStep(TodoStepBase):
@@ -288,7 +281,6 @@ class TodoStep(TodoStepBase):
     id: int
     sag_id: int
     is_done: bool
-    is_next: bool = False
     created_by_user_id: Optional[int] = None
     created_by_name: Optional[str] = None
     created_at: datetime
@@ -1,7 +1,6 @@
 import logging
 import json
 from datetime import datetime
-from uuid import uuid4
 from typing import Any, Dict, List, Optional
 
 from fastapi import APIRouter, HTTPException, Query, Request
@@ -12,7 +11,6 @@ from app.modules.orders.backend.service import aggregate_order_lines
 
 logger = logging.getLogger(__name__)
 router = APIRouter()
-ALLOWED_SYNC_STATUSES = {"pending", "exported", "failed", "posted", "paid"}
 
 
 class OrdreLineInput(BaseModel):
@@ -34,7 +32,6 @@ class OrdreExportRequest(BaseModel):
     notes: Optional[str] = None
     layout_number: Optional[int] = None
     draft_id: Optional[int] = None
-    force_export: bool = False
 
 
 class OrdreDraftUpsertRequest(BaseModel):
@@ -68,42 +65,6 @@ def _get_user_id_from_request(http_request: Request) -> Optional[int]:
     return None
 
 
-def _log_sync_event(
-    draft_id: int,
-    event_type: str,
-    from_status: Optional[str],
-    to_status: Optional[str],
-    event_payload: Dict[str, Any],
-    user_id: Optional[int],
-) -> None:
-    """Best-effort logging of sync events for ordre_drafts."""
-    try:
-        from app.core.database import execute_query
-
-        execute_query(
-            """
-            INSERT INTO ordre_draft_sync_events (
-                draft_id,
-                event_type,
-                from_status,
-                to_status,
-                event_payload,
-                created_by_user_id
-            ) VALUES (%s, %s, %s, %s, %s::jsonb, %s)
-            """,
-            (
-                draft_id,
-                event_type,
-                from_status,
-                to_status,
-                json.dumps(event_payload, ensure_ascii=False),
-                user_id,
-            )
-        )
-    except Exception as e:
-        logger.warning("⚠️ Could not log ordre sync event for draft %s: %s", draft_id, e)
 
 
 @router.get("/ordre/aggregate")
 async def get_ordre_aggregate(
     customer_id: Optional[int] = Query(None),
@@ -134,39 +95,6 @@ async def export_ordre(request: OrdreExportRequest, http_request: Request):
     """Export selected ordre lines to e-conomic draft order."""
     try:
         user_id = _get_user_id_from_request(http_request)
-        previous_status = None
-        export_idempotency_key = None
-
-        if request.draft_id:
-            from app.core.database import execute_query_single
-
-            draft_row = execute_query_single(
-                """
-                SELECT id, sync_status, export_idempotency_key, export_status_json
-                FROM ordre_drafts
-                WHERE id = %s
-                """,
-                (request.draft_id,)
-            )
-            if not draft_row:
-                raise HTTPException(status_code=404, detail="Draft not found")
-
-            previous_status = (draft_row.get("sync_status") or "pending").strip().lower()
-            if previous_status in {"exported", "posted", "paid"} and not request.force_export:
-                raise HTTPException(
-                    status_code=409,
-                    detail=f"Draft already exported with status '{previous_status}'. Use force_export=true to retry.",
-                )
-
-            export_idempotency_key = draft_row.get("export_idempotency_key") or str(uuid4())
-            _log_sync_event(
-                request.draft_id,
-                "export_attempt",
-                previous_status,
-                previous_status,
-                {"force_export": request.force_export, "idempotency_key": export_idempotency_key},
-                user_id,
-            )
 
         line_payload = [line.model_dump() for line in request.lines]
         export_result = await ordre_economic_export_service.export_order(
@@ -195,53 +123,15 @@ async def export_ordre(request: OrdreExportRequest, http_request: Request):
             "timestamp": datetime.utcnow().isoformat(),
         }
 
-        economic_order_number = (
-            export_result.get("economic_order_number")
-            or export_result.get("order_number")
-            or export_result.get("orderNumber")
-        )
-        economic_invoice_number = (
-            export_result.get("economic_invoice_number")
-            or export_result.get("invoice_number")
-            or export_result.get("invoiceNumber")
-        )
-        target_sync_status = "pending" if export_result.get("dry_run") else "exported"
-
         execute_query(
             """
             UPDATE ordre_drafts
             SET export_status_json = %s::jsonb,
-                sync_status = %s,
-                export_idempotency_key = %s,
-                economic_order_number = COALESCE(%s, economic_order_number),
-                economic_invoice_number = COALESCE(%s, economic_invoice_number),
-                last_sync_at = CURRENT_TIMESTAMP,
                 last_exported_at = CURRENT_TIMESTAMP,
                 updated_at = CURRENT_TIMESTAMP
             WHERE id = %s
             """,
-            (
-                json.dumps(existing_status, ensure_ascii=False),
-                target_sync_status,
-                export_idempotency_key,
-                str(economic_order_number) if economic_order_number is not None else None,
-                str(economic_invoice_number) if economic_invoice_number is not None else None,
-                request.draft_id,
-            ),
-        )
-
-        _log_sync_event(
-            request.draft_id,
-            "export_success",
-            previous_status,
-            target_sync_status,
-            {
-                "dry_run": bool(export_result.get("dry_run")),
-                "idempotency_key": export_idempotency_key,
-                "economic_order_number": economic_order_number,
-                "economic_invoice_number": economic_invoice_number,
-            },
-            user_id,
-        )
+            (json.dumps(existing_status, ensure_ascii=False), request.draft_id),
+        )
 
         return export_result
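The export endpoint in the main branch guards against double export and reuses a previously stored idempotency key on retries. A simplified sketch of that guard (the function name and `ValueError` are illustrative; the route itself raises an HTTP 409 and reads the draft from the database):

```python
from typing import Optional
from uuid import uuid4

# Sketch of the export guard: refuse re-export of an already-exported draft
# unless force_export is set, and reuse any stored idempotency key so a
# retried export can be deduplicated downstream.
def prepare_export(sync_status: Optional[str],
                   stored_key: Optional[str],
                   force_export: bool) -> str:
    status = (sync_status or "pending").strip().lower()
    if status in {"exported", "posted", "paid"} and not force_export:
        raise ValueError(f"Draft already exported with status '{status}'")
    return stored_key or str(uuid4())
```

Reusing the stored key rather than minting a new one on every attempt is what makes a forced retry safe against duplicate orders on the remote side.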
@@ -260,26 +150,9 @@ async def list_ordre_drafts(
     """List all ordre drafts (no user filtering)."""
     try:
         query = """
             SELECT id, title, customer_id, notes, layout_number, created_by_user_id,
-                   coverage_start, coverage_end, billing_direction, source_subscription_ids,
-                   invoice_aggregate_key, sync_status, export_idempotency_key,
-                   economic_order_number, economic_invoice_number,
-                   ev_latest.event_type AS latest_event_type,
-                   ev_latest.created_at AS latest_event_at,
-                   (
-                       SELECT COUNT(*)
-                       FROM ordre_draft_sync_events ev
-                       WHERE ev.draft_id = ordre_drafts.id
-                   ) AS sync_event_count,
-                   last_sync_at, created_at, updated_at, last_exported_at
+                   created_at, updated_at, last_exported_at
             FROM ordre_drafts
-            LEFT JOIN LATERAL (
-                SELECT event_type, created_at
-                FROM ordre_draft_sync_events
-                WHERE draft_id = ordre_drafts.id
-                ORDER BY created_at DESC, id DESC
-                LIMIT 1
-            ) ev_latest ON TRUE
             ORDER BY updated_at DESC, id DESC
             LIMIT %s
         """
@@ -329,10 +202,9 @@ async def create_ordre_draft(request: OrdreDraftUpsertRequest, http_request: Req
             notes,
             layout_number,
             created_by_user_id,
-            sync_status,
             export_status_json,
             updated_at
-        ) VALUES (%s, %s, %s::jsonb, %s, %s, %s, 'pending', %s::jsonb, CURRENT_TIMESTAMP)
+        ) VALUES (%s, %s, %s::jsonb, %s, %s, %s, %s::jsonb, CURRENT_TIMESTAMP)
         RETURNING *
     """
     params = (
@@ -351,172 +223,6 @@ async def create_ordre_draft(request: OrdreDraftUpsertRequest, http_request: Req
     raise HTTPException(status_code=500, detail="Failed to create ordre draft")
 
 
-@router.get("/ordre/drafts/sync-status/summary")
-async def get_ordre_draft_sync_summary(http_request: Request):
-    """Return sync status counters for ordre drafts."""
-    try:
-        from app.core.database import execute_query_single
-
-        query = """
-            SELECT
-                COUNT(*) FILTER (WHERE sync_status = 'pending') AS pending_count,
-                COUNT(*) FILTER (WHERE sync_status = 'exported') AS exported_count,
-                COUNT(*) FILTER (WHERE sync_status = 'failed') AS failed_count,
-                COUNT(*) FILTER (WHERE sync_status = 'posted') AS posted_count,
-                COUNT(*) FILTER (WHERE sync_status = 'paid') AS paid_count,
-                COUNT(*) AS total_count
-            FROM ordre_drafts
-        """
-        return execute_query_single(query, ()) or {
-            "pending_count": 0,
-            "exported_count": 0,
-            "failed_count": 0,
-            "posted_count": 0,
-            "paid_count": 0,
-            "total_count": 0,
-        }
-    except Exception as e:
-        logger.error("❌ Error loading ordre sync summary: %s", e, exc_info=True)
-        raise HTTPException(status_code=500, detail="Failed to load sync summary")
-
-
-@router.patch("/ordre/drafts/{draft_id}/sync-status")
-async def update_ordre_draft_sync_status(draft_id: int, payload: Dict[str, Any], http_request: Request):
-    """Update sync lifecycle fields for one ordre draft."""
-    try:
-        user_id = _get_user_id_from_request(http_request)
-        sync_status = (payload.get("sync_status") or "").strip().lower()
-        if sync_status not in ALLOWED_SYNC_STATUSES:
-            raise HTTPException(status_code=400, detail="Invalid sync_status")
-
-        economic_order_number = payload.get("economic_order_number")
-        economic_invoice_number = payload.get("economic_invoice_number")
-        export_status_json = payload.get("export_status_json")
-
-        updates = ["sync_status = %s", "last_sync_at = CURRENT_TIMESTAMP", "updated_at = CURRENT_TIMESTAMP"]
-        values: List[Any] = [sync_status]
-
-        if economic_order_number is not None:
-            updates.append("economic_order_number = %s")
-            values.append(str(economic_order_number) if economic_order_number else None)
-
-        if economic_invoice_number is not None:
-            updates.append("economic_invoice_number = %s")
-            values.append(str(economic_invoice_number) if economic_invoice_number else None)
-
-        if export_status_json is not None:
-            updates.append("export_status_json = %s::jsonb")
-            values.append(json.dumps(export_status_json, ensure_ascii=False))
-
-        if sync_status in {"exported", "posted", "paid"}:
-            updates.append("last_exported_at = CURRENT_TIMESTAMP")
-
-        from app.core.database import execute_query_single
-        previous = execute_query_single(
-            "SELECT sync_status FROM ordre_drafts WHERE id = %s",
-            (draft_id,)
-        )
-        if not previous:
-            raise HTTPException(status_code=404, detail="Draft not found")
-        from_status = (previous.get("sync_status") or "pending").strip().lower()
-
-        values.append(draft_id)
-        from app.core.database import execute_query
-        result = execute_query(
-            f"""
-            UPDATE ordre_drafts
-            SET {', '.join(updates)}
-            WHERE id = %s
-            RETURNING *
-            """,
-            tuple(values)
-        )
-        if not result:
-            raise HTTPException(status_code=404, detail="Draft not found")
-
-        _log_sync_event(
-            draft_id,
-            "sync_status_manual_update",
-            from_status,
-            sync_status,
-            {
-                "economic_order_number": economic_order_number,
-                "economic_invoice_number": economic_invoice_number,
-            },
-            user_id,
-        )
-        return result[0]
-    except HTTPException:
-        raise
-    except Exception as e:
-        logger.error("❌ Error updating ordre draft sync status: %s", e, exc_info=True)
-        raise HTTPException(status_code=500, detail="Failed to update draft sync status")
-
-
-@router.get("/ordre/drafts/{draft_id}/sync-events")
-async def list_ordre_draft_sync_events(
-    draft_id: int,
-    http_request: Request,
-    limit: int = Query(100, ge=1, le=500),
-    offset: int = Query(0, ge=0),
-    event_type: Optional[str] = Query(None),
-    from_status: Optional[str] = Query(None),
-    to_status: Optional[str] = Query(None),
-    from_date: Optional[str] = Query(None),
-    to_date: Optional[str] = Query(None),
-):
-    """List audit events for one ordre draft sync lifecycle."""
-    try:
-        from app.core.database import execute_query
-
-        where_clauses = ["draft_id = %s"]
-        params: List[Any] = [draft_id]
-
-        if event_type:
-            where_clauses.append("event_type = %s")
-            params.append(event_type)
-        if from_status:
-            where_clauses.append("from_status = %s")
-            params.append(from_status)
-        if to_status:
-            where_clauses.append("to_status = %s")
-            params.append(to_status)
-        if from_date:
-            where_clauses.append("created_at >= %s::timestamp")
-            params.append(from_date)
-        if to_date:
-            where_clauses.append("created_at <= %s::timestamp")
-            params.append(to_date)
-
-        count_query = f"""
-            SELECT COUNT(*) AS total
-            FROM ordre_draft_sync_events
-            WHERE {' AND '.join(where_clauses)}
-        """
-        total_row = execute_query(count_query, tuple(params)) or [{"total": 0}]
-        total = int(total_row[0].get("total") or 0)
-
-        data_query = f"""
-            SELECT id, draft_id, event_type, from_status, to_status, event_payload, created_by_user_id, created_at
-            FROM ordre_draft_sync_events
-            WHERE {' AND '.join(where_clauses)}
-            ORDER BY created_at DESC, id DESC
-            LIMIT %s OFFSET %s
-        """
-        params.extend([limit, offset])
-        rows = execute_query(data_query, tuple(params)) or []
-
-        return {
-            "items": rows,
-            "total": total,
-            "limit": limit,
-            "offset": offset,
-        }
-    except Exception as e:
-        logger.error("❌ Error listing ordre draft sync events: %s", e, exc_info=True)
-        raise HTTPException(status_code=500, detail="Failed to list sync events")
 
 
 @router.patch("/ordre/drafts/{draft_id}")
 async def update_ordre_draft(draft_id: int, request: OrdreDraftUpsertRequest, http_request: Request):
     """Update existing ordre draft."""
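The removed `PATCH .../sync-status` handler builds its `UPDATE` statement dynamically, appending one `SET` fragment and one bound value per field actually supplied in the payload. A simplified, standalone sketch of that pattern (the helper name is hypothetical and only a subset of fields is shown):

```python
from typing import Any, Dict, List, Tuple

# Sketch of the dynamic SET-clause construction used by the sync-status
# endpoint: fields absent from the payload are left untouched in the row.
def build_sync_update(payload: Dict[str, Any], draft_id: int) -> Tuple[str, Tuple[Any, ...]]:
    updates = ["sync_status = %s",
               "last_sync_at = CURRENT_TIMESTAMP",
               "updated_at = CURRENT_TIMESTAMP"]
    values: List[Any] = [payload["sync_status"]]
    if payload.get("economic_order_number") is not None:
        updates.append("economic_order_number = %s")
        values.append(str(payload["economic_order_number"]))
    if payload["sync_status"] in {"exported", "posted", "paid"}:
        updates.append("last_exported_at = CURRENT_TIMESTAMP")
    values.append(draft_id)
    query = f"UPDATE ordre_drafts SET {', '.join(updates)} WHERE id = %s RETURNING *"
    return query, tuple(values)
```

Only the column list is built with string formatting; every value still travels as a `%s` placeholder, which keeps the statement safe from SQL injection.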
@@ -49,30 +49,6 @@
     color: white;
     background: var(--accent);
 }
-.sync-card {
-    background: var(--bg-card);
-    border-radius: 12px;
-    padding: 1rem 1.25rem;
-    border: 1px solid rgba(0,0,0,0.06);
-    margin-bottom: 1rem;
-}
-.sync-label {
-    font-size: 0.8rem;
-    text-transform: uppercase;
-    color: var(--text-secondary);
-    letter-spacing: 0.5px;
-    margin-bottom: 0.25rem;
-}
-.sync-value {
-    font-weight: 600;
-    color: var(--text-primary);
-}
-.event-payload {
-    max-width: 360px;
-    white-space: pre-wrap;
-    word-break: break-word;
-    font-size: 0.8rem;
-}
 </style>
 {% endblock %}
 
@@ -96,13 +72,6 @@
     <strong>Safety mode aktiv:</strong> e-conomic eksport er read-only eller dry-run.
 </div>
 
-<div class="d-flex justify-content-end mb-3">
-    <div class="form-check form-switch">
-        <input class="form-check-input" type="checkbox" id="forceExportToggle">
-        <label class="form-check-label" for="forceExportToggle">Force export (brug kun ved retry)</label>
-    </div>
-</div>
-
 <div class="ordre-header">
     <div class="row g-3">
         <div class="col-md-3">
@@ -145,120 +114,6 @@
     <div class="col-md-3"><div class="summary-card"><div class="summary-title">Sidst opdateret</div><div id="updatedAt" class="summary-value">-</div></div></div>
 </div>
 
-<div class="sync-card">
-    <div class="d-flex flex-wrap justify-content-between align-items-center gap-2 mb-3">
-        <div>
-            <h5 class="mb-1"><i class="bi bi-arrow-repeat me-2"></i>Sync Lifecycle</h5>
-            <div class="text-muted small">Manuel statusstyring og audit events for denne ordre</div>
-        </div>
-        <div class="d-flex gap-2">
-            <button class="btn btn-outline-secondary btn-sm" onclick="loadSyncEvents(0)">
-                <i class="bi bi-arrow-clockwise me-1"></i>Opdater events
-            </button>
-            <button class="btn btn-outline-success btn-sm" onclick="markDraftPaid()">
-                <i class="bi bi-cash-coin me-1"></i>Markér som betalt
-            </button>
-        </div>
-    </div>
-
-    <div class="row g-3 align-items-end mb-3">
-        <div class="col-md-3">
-            <div class="sync-label">Sync status</div>
-            <select id="syncStatusSelect" class="form-select form-select-sm">
-                <option value="pending">pending</option>
-                <option value="exported">exported</option>
-                <option value="failed">failed</option>
-                <option value="posted">posted</option>
-                <option value="paid">paid</option>
-            </select>
-        </div>
-        <div class="col-md-3">
-            <div class="sync-label">e-conomic ordre nr.</div>
-            <input id="economicOrderNumber" type="text" class="form-control form-control-sm" placeholder="fx 12345">
-        </div>
-        <div class="col-md-3">
-            <div class="sync-label">e-conomic faktura nr.</div>
-            <input id="economicInvoiceNumber" type="text" class="form-control form-control-sm" placeholder="fx 998877">
-        </div>
-        <div class="col-md-3 d-grid">
-            <button class="btn btn-primary btn-sm" onclick="updateSyncStatus()">
-                <i class="bi bi-check2-circle me-1"></i>Gem sync status
-            </button>
-        </div>
-    </div>
-
-    <div class="row g-3 mb-3">
-        <div class="col-md-3">
-            <div class="sync-label">Aktuel status</div>
-            <div id="syncStatusBadge" class="sync-value">-</div>
-        </div>
-        <div class="col-md-3">
-            <div class="sync-label">Sidste sync</div>
-            <div id="lastSyncAt" class="sync-value">-</div>
-        </div>
-        <div class="col-md-3">
-            <div class="sync-label">Ordrenummer</div>
-            <div id="economicOrderNumberView" class="sync-value">-</div>
-        </div>
-        <div class="col-md-3">
-            <div class="sync-label">Fakturanummer</div>
-            <div id="economicInvoiceNumberView" class="sync-value">-</div>
-        </div>
-    </div>
-
-    <div class="row g-2 align-items-end mb-3">
-        <div class="col-md-3">
-            <label class="form-label mb-1">Event type</label>
-            <input id="eventTypeFilter" type="text" class="form-control form-control-sm" placeholder="fx export_success">
-        </div>
-        <div class="col-md-2">
-            <label class="form-label mb-1">Fra status</label>
-            <input id="fromStatusFilter" type="text" class="form-control form-control-sm" placeholder="fx pending">
-        </div>
-        <div class="col-md-2">
-            <label class="form-label mb-1">Til status</label>
-            <input id="toStatusFilter" type="text" class="form-control form-control-sm" placeholder="fx exported">
-        </div>
-        <div class="col-md-2">
-            <label class="form-label mb-1">Fra dato</label>
-            <input id="fromDateFilter" type="date" class="form-control form-control-sm">
-        </div>
-        <div class="col-md-2">
-            <label class="form-label mb-1">Til dato</label>
-            <input id="toDateFilter" type="date" class="form-control form-control-sm">
-        </div>
-        <div class="col-md-1 d-grid">
-            <button class="btn btn-outline-primary btn-sm" onclick="loadSyncEvents(0)">
-                <i class="bi bi-funnel"></i>
-            </button>
-        </div>
-    </div>
-
-    <div class="table-responsive">
-        <table class="table table-sm align-middle mb-2">
-            <thead>
-                <tr>
-                    <th>Tidspunkt</th>
-                    <th>Type</th>
-                    <th>Fra</th>
-                    <th>Til</th>
-                    <th>Payload</th>
-                </tr>
-            </thead>
-            <tbody id="syncEventsBody">
-                <tr><td colspan="5" class="text-muted text-center py-3">Indlæser events...</td></tr>
-            </tbody>
|
|
||||||
</table>
|
|
||||||
</div>
|
|
||||||
<div class="d-flex justify-content-between align-items-center">
|
|
||||||
<div id="syncEventsMeta" class="small text-muted">-</div>
|
|
||||||
<div class="d-flex gap-2">
|
|
||||||
<button id="eventsPrevBtn" class="btn btn-outline-secondary btn-sm" onclick="changeEventsPage(-1)">Forrige</button>
|
|
||||||
<button id="eventsNextBtn" class="btn btn-outline-secondary btn-sm" onclick="changeEventsPage(1)">Næste</button>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<div class="card">
|
<div class="card">
|
||||||
<div class="card-body">
|
<div class="card-body">
|
||||||
<div class="table-responsive">
|
<div class="table-responsive">
|
||||||
@@ -283,15 +138,6 @@
 </div>
 </div>
 </div>
-
-<div class="toast-container position-fixed bottom-0 end-0 p-3" style="z-index: 1080;">
-    <div id="detailToast" class="toast align-items-center text-bg-dark border-0" role="alert" aria-live="assertive" aria-atomic="true">
-        <div class="d-flex">
-            <div class="toast-body" id="detailToastBody">-</div>
-            <button type="button" class="btn-close btn-close-white me-2 m-auto" data-bs-dismiss="toast" aria-label="Close"></button>
-        </div>
-    </div>
-</div>
 </div>
 {% endblock %}

@@ -300,27 +146,6 @@
 const draftId = {{ draft_id }};
 let orderData = null;
 let orderLines = [];
-const syncEventsLimit = 10;
-let syncEventsOffset = 0;
-let syncEventsTotal = 0;
-let detailToast = null;
-
-function showToast(message, variant = 'dark') {
-    const toastEl = document.getElementById('detailToast');
-    const bodyEl = document.getElementById('detailToastBody');
-    if (!toastEl || !bodyEl || typeof bootstrap === 'undefined') {
-        console.log(message);
-        return;
-    }
-
-    toastEl.className = 'toast align-items-center border-0';
-    toastEl.classList.add(`text-bg-${variant}`);
-    bodyEl.textContent = message;
-    if (!detailToast) {
-        detailToast = new bootstrap.Toast(toastEl, { delay: 3200 });
-    }
-    detailToast.show();
-}
-
 function formatCurrency(value) {
     return new Intl.NumberFormat('da-DK', { style: 'currency', currency: 'DKK' }).format(Number(value || 0));
@@ -339,35 +164,6 @@
     return '<span class="badge bg-success">Salg</span>';
 }
-
-function escapeHtml(value) {
-    return String(value || '')
-        .replace(/&/g, '&amp;')
-        .replace(/</g, '&lt;')
-        .replace(/>/g, '&gt;')
-        .replace(/"/g, '&quot;')
-        .replace(/'/g, '&#39;');
-}
-
-function syncStatusBadge(status) {
-    const normalized = String(status || 'pending').toLowerCase();
-    if (normalized === 'paid') return '<span class="badge bg-success">paid</span>';
-    if (normalized === 'posted') return '<span class="badge bg-info text-dark">posted</span>';
-    if (normalized === 'exported') return '<span class="badge bg-primary">exported</span>';
-    if (normalized === 'failed') return '<span class="badge bg-danger">failed</span>';
-    return '<span class="badge bg-warning text-dark">pending</span>';
-}
-
-function refreshSyncPanelFromOrder() {
-    document.getElementById('syncStatusSelect').value = (orderData.sync_status || 'pending').toLowerCase();
-    document.getElementById('economicOrderNumber').value = orderData.economic_order_number || '';
-    document.getElementById('economicInvoiceNumber').value = orderData.economic_invoice_number || '';
-
-    document.getElementById('syncStatusBadge').innerHTML = syncStatusBadge(orderData.sync_status);
-    document.getElementById('lastSyncAt').textContent = formatDate(orderData.last_sync_at);
-    document.getElementById('economicOrderNumberView').textContent = orderData.economic_order_number || '-';
-    document.getElementById('economicInvoiceNumberView').textContent = orderData.economic_invoice_number || '-';
-}
-
 function renderLines() {
     const tbody = document.getElementById('linesTableBody');
     if (!orderLines.length) {
@@ -515,142 +311,10 @@
     document.getElementById('updatedAt').textContent = formatDate(orderData.updated_at);

     renderLines();
-    refreshSyncPanelFromOrder();
     await loadConfig();
-    await loadSyncEvents(syncEventsOffset);
 } catch (error) {
     console.error(error);
-    showToast(`Fejl: ${error.message}`, 'danger');
+    alert(`Fejl: ${error.message}`);
-}
-}
-
-async function updateSyncStatus() {
-    const payload = {
-        sync_status: (document.getElementById('syncStatusSelect').value || 'pending').trim().toLowerCase(),
-        economic_order_number: document.getElementById('economicOrderNumber').value.trim() || null,
-        economic_invoice_number: document.getElementById('economicInvoiceNumber').value.trim() || null,
-    };
-
-    try {
-        const res = await fetch(`/api/v1/ordre/drafts/${draftId}/sync-status`, {
-            method: 'PATCH',
-            headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify(payload),
-        });
-        const data = await res.json();
-        if (!res.ok) throw new Error(data.detail || 'Kunne ikke opdatere sync status');
-
-        orderData = data;
-        refreshSyncPanelFromOrder();
-        await loadSyncEvents(0);
-        showToast('Sync status opdateret', 'success');
-    } catch (error) {
-        showToast(`Fejl: ${error.message}`, 'danger');
-    }
-}
-
-async function markDraftPaid() {
-    if (!confirm('Markér denne ordre som betalt (kun hvis status er posted)?')) return;
-
-    try {
-        const res = await fetch('/api/v1/billing/drafts/reconcile-sync-status', {
-            method: 'POST',
-            headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ apply: true, mark_paid_ids: [draftId] }),
-        });
-        const data = await res.json();
-        if (!res.ok) throw new Error(data.detail || 'Kunne ikke markere som betalt');
-
-        await loadOrder();
-        if ((orderData.sync_status || '').toLowerCase() !== 'paid') {
-            showToast('Ingen statusændring. Ordren skal være i status posted før den kan markeres som paid.', 'warning');
-            return;
-        }
-        showToast('Ordre markeret som betalt', 'success');
-    } catch (error) {
-        showToast(`Fejl: ${error.message}`, 'danger');
-    }
-}
-
-function buildEventsQuery(offset) {
-    const params = new URLSearchParams();
-    params.set('limit', String(syncEventsLimit));
-    params.set('offset', String(Math.max(0, offset || 0)));
-
-    const eventType = document.getElementById('eventTypeFilter').value.trim();
-    const fromStatus = document.getElementById('fromStatusFilter').value.trim();
-    const toStatus = document.getElementById('toStatusFilter').value.trim();
-    const fromDate = document.getElementById('fromDateFilter').value;
-    const toDate = document.getElementById('toDateFilter').value;
-
-    if (eventType) params.set('event_type', eventType);
-    if (fromStatus) params.set('from_status', fromStatus);
-    if (toStatus) params.set('to_status', toStatus);
-    if (fromDate) params.set('from_date', fromDate);
-    if (toDate) params.set('to_date', toDate);
-
-    return params.toString();
-}
-
-function renderSyncEvents(items) {
-    const body = document.getElementById('syncEventsBody');
-    if (!Array.isArray(items) || items.length === 0) {
-        body.innerHTML = '<tr><td colspan="5" class="text-muted text-center py-3">Ingen events fundet</td></tr>';
-        return;
-    }
-
-    body.innerHTML = items.map((event) => {
-        const payload = typeof event.event_payload === 'object'
-            ? JSON.stringify(event.event_payload, null, 2)
-            : String(event.event_payload || '');
-
-        return `
-            <tr>
-                <td>${formatDate(event.created_at)}</td>
-                <td><span class="badge bg-light text-dark border">${escapeHtml(event.event_type || '-')}</span></td>
-                <td>${escapeHtml(event.from_status || '-')}</td>
-                <td>${escapeHtml(event.to_status || '-')}</td>
-                <td><pre class="event-payload mb-0">${escapeHtml(payload)}</pre></td>
-            </tr>
-        `;
-    }).join('');
-}
-
-function updateEventsPager() {
-    const start = syncEventsTotal === 0 ? 0 : syncEventsOffset + 1;
-    const end = Math.min(syncEventsOffset + syncEventsLimit, syncEventsTotal);
-    document.getElementById('syncEventsMeta').textContent = `Viser ${start}-${end} af ${syncEventsTotal}`;
-
-    document.getElementById('eventsPrevBtn').disabled = syncEventsOffset <= 0;
-    document.getElementById('eventsNextBtn').disabled = syncEventsOffset + syncEventsLimit >= syncEventsTotal;
-}
-
-function changeEventsPage(delta) {
-    const nextOffset = syncEventsOffset + (delta * syncEventsLimit);
-    if (nextOffset < 0 || nextOffset >= syncEventsTotal) {
-        return;
-    }
-    loadSyncEvents(nextOffset);
-}
-
-async function loadSyncEvents(offset = 0) {
-    syncEventsOffset = Math.max(0, offset);
-    const body = document.getElementById('syncEventsBody');
-    body.innerHTML = '<tr><td colspan="5" class="text-muted text-center py-3">Indlæser events...</td></tr>';
-
-    try {
-        const query = buildEventsQuery(syncEventsOffset);
-        const res = await fetch(`/api/v1/ordre/drafts/${draftId}/sync-events?${query}`);
-        const data = await res.json();
-        if (!res.ok) throw new Error(data.detail || 'Kunne ikke hente sync events');
-
-        syncEventsTotal = Number(data.total || 0);
-        renderSyncEvents(data.items || []);
-        updateEventsPager();
-    } catch (error) {
-        body.innerHTML = `<tr><td colspan="5" class="text-danger text-center py-3">${escapeHtml(error.message)}</td></tr>`;
-        syncEventsTotal = 0;
-        updateEventsPager();
-    }
 }
 }

@@ -688,22 +352,22 @@
     const data = await res.json();
     if (!res.ok) throw new Error(data.detail || 'Kunne ikke gemme ordre');

-    showToast('Ordre gemt', 'success');
+    alert('Ordre gemt');
     await loadOrder();
 } catch (err) {
-    showToast(`Kunne ikke gemme ordre: ${err.message}`, 'danger');
+    alert(`Kunne ikke gemme ordre: ${err.message}`);
 }
 }

 async function exportOrder() {
     const customerId = Number(document.getElementById('customerId').value || 0);
     if (!customerId) {
-        showToast('Angiv kunde ID før eksport', 'warning');
+        alert('Angiv kunde ID før eksport');
         return;
     }

     if (!orderLines.length) {
-        showToast('Ingen linjer at eksportere', 'warning');
+        alert('Ingen linjer at eksportere');
         return;
     }

@@ -724,7 +388,6 @@
     notes: document.getElementById('orderNotes').value || null,
     layout_number: Number(document.getElementById('layoutNumber').value || 0) || null,
     draft_id: draftId,
-    force_export: document.getElementById('forceExportToggle').checked,
 };

 try {
@@ -738,11 +401,11 @@
     throw new Error(data.detail || 'Eksport fejlede');
 }

-showToast(data.message || 'Eksport udført', data.dry_run ? 'warning' : 'success');
+alert(data.message || 'Eksport udført');
 await loadOrder();
 } catch (err) {
     console.error(err);
-    showToast(`Eksport fejlede: ${err.message}`, 'danger');
+    alert(`Eksport fejlede: ${err.message}`);
 }
 }

@@ -36,31 +36,6 @@
 .order-row:hover {
     background-color: rgba(var(--accent-rgb, 15, 76, 117), 0.05);
 }
-.sync-actions {
-    display: flex;
-    align-items: center;
-    gap: 0.35rem;
-}
-.sync-actions .form-select {
-    min-width: 128px;
-}
-.latest-event {
-    max-width: 210px;
-    white-space: nowrap;
-    overflow: hidden;
-    text-overflow: ellipsis;
-}
-.selected-counter {
-    display: inline-flex;
-    align-items: center;
-    padding: 0.25rem 0.6rem;
-    border-radius: 999px;
-    border: 1px solid var(--accent);
-    color: var(--accent);
-    background: rgba(var(--accent-rgb, 15, 76, 117), 0.08);
-    font-size: 0.85rem;
-    font-weight: 600;
-}
 </style>
 {% endblock %}

@@ -73,16 +48,6 @@
 </div>
 <div class="d-flex gap-2">
     <a href="/ordre/create/new" class="btn btn-success"><i class="bi bi-plus-circle me-1"></i>Opret ny ordre</a>
-    <select id="syncStatusFilter" class="form-select" style="min-width: 170px;" onchange="renderOrders()">
-        <option value="all">Alle sync-status</option>
-        <option value="pending">pending</option>
-        <option value="exported">exported</option>
-        <option value="failed">failed</option>
-        <option value="posted">posted</option>
-        <option value="paid">paid</option>
-    </select>
-    <span id="selectedCountBadge" class="selected-counter">Valgte: 0</span>
-    <button class="btn btn-outline-success" onclick="markSelectedOrdersPaid()"><i class="bi bi-cash-stack me-1"></i>Markér valgte som betalt</button>
     <button class="btn btn-outline-primary" onclick="loadOrders()"><i class="bi bi-arrow-clockwise me-1"></i>Opdater</button>
 </div>
 </div>
@@ -100,7 +65,6 @@
 <table class="table table-hover align-middle">
     <thead>
         <tr>
-            <th style="width: 42px;"><input id="selectAllOrders" type="checkbox" onchange="toggleSelectAll(this.checked)"></th>
             <th>Ordre #</th>
             <th>Titel</th>
             <th>Kunde</th>
@@ -108,53 +72,23 @@
             <th>Oprettet</th>
             <th>Sidst opdateret</th>
             <th>Sidst eksporteret</th>
-            <th>Seneste event</th>
             <th>Status</th>
-            <th>Sync</th>
             <th>Handlinger</th>
         </tr>
     </thead>
     <tbody id="ordersTableBody">
-        <tr><td colspan="12" class="text-muted text-center py-4">Indlæser...</td></tr>
+        <tr><td colspan="9" class="text-muted text-center py-4">Indlæser...</td></tr>
     </tbody>
 </table>
 </div>
 </div>
 </div>

-<div class="toast-container position-fixed bottom-0 end-0 p-3" style="z-index: 1080;">
-    <div id="ordersToast" class="toast align-items-center text-bg-dark border-0" role="alert" aria-live="assertive" aria-atomic="true">
-        <div class="d-flex">
-            <div class="toast-body" id="ordersToastBody">-</div>
-            <button type="button" class="btn-close btn-close-white me-2 m-auto" data-bs-dismiss="toast" aria-label="Close"></button>
-        </div>
-    </div>
-</div>
 </div>
 {% endblock %}

 {% block extra_js %}
 <script>
 let orders = [];
-let ordersToast = null;
-let selectedOrderIds = new Set();
-
-function showToast(message, variant = 'dark') {
-    const toastEl = document.getElementById('ordersToast');
-    const bodyEl = document.getElementById('ordersToastBody');
-    if (!toastEl || !bodyEl || typeof bootstrap === 'undefined') {
-        console.log(message);
-        return;
-    }
-
-    toastEl.className = 'toast align-items-center border-0';
-    toastEl.classList.add(`text-bg-${variant}`);
-    bodyEl.textContent = message;
-    if (!ordersToast) {
-        ordersToast = new bootstrap.Toast(toastEl, { delay: 2800 });
-    }
-    ordersToast.show();
-}
-
 function formatDate(dateStr) {
     if (!dateStr) return '-';
@@ -162,48 +96,23 @@
     return date.toLocaleDateString('da-DK', { day: '2-digit', month: '2-digit', year: 'numeric', hour: '2-digit', minute: '2-digit' });
 }

-function getFilteredOrders() {
-    const filter = (document.getElementById('syncStatusFilter')?.value || 'all').toLowerCase();
-    if (filter === 'all') return orders;
-    return orders.filter(order => String(order.sync_status || 'pending').toLowerCase() === filter);
-}
-
 function renderOrders() {
-    const visibleOrders = getFilteredOrders();
     const tbody = document.getElementById('ordersTableBody');
-    if (!orders.length || !visibleOrders.length) {
-        const message = orders.length
-            ? 'Ingen ordre matcher det valgte filter'
-            : 'Ingen ordre fundet';
-        tbody.innerHTML = `<tr><td colspan="12" class="text-muted text-center py-4">${message}</td></tr>`;
-        updateSummary(visibleOrders);
-        syncSelectAllCheckbox(visibleOrders);
+    if (!orders.length) {
+        tbody.innerHTML = '<tr><td colspan="9" class="text-muted text-center py-4">Ingen ordre fundet</td></tr>';
+        updateSummary();
         return;
     }

-    tbody.innerHTML = visibleOrders.map(order => {
+    tbody.innerHTML = orders.map(order => {
     const lines = Array.isArray(order.lines_json) ? order.lines_json : [];
     const hasExported = order.last_exported_at ? true : false;
     const statusBadge = hasExported
         ? '<span class="badge bg-success">Eksporteret</span>'
         : '<span class="badge bg-warning text-dark">Ikke eksporteret</span>';

-    const syncStatus = String(order.sync_status || 'pending').toLowerCase();
-    let syncBadge = '<span class="badge bg-warning text-dark">pending</span>';
-    if (syncStatus === 'failed') syncBadge = '<span class="badge bg-danger">failed</span>';
-    if (syncStatus === 'exported') syncBadge = '<span class="badge bg-primary">exported</span>';
-    if (syncStatus === 'posted') syncBadge = '<span class="badge bg-info text-dark">posted</span>';
-    if (syncStatus === 'paid') syncBadge = '<span class="badge bg-success">paid</span>';
-
-    const isChecked = selectedOrderIds.has(order.id);
-    const latestEventType = order.latest_event_type || '-';
-    const latestEventAt = order.latest_event_at ? formatDate(order.latest_event_at) : '-';
-
     return `
     <tr class="order-row" onclick="window.location.href='/ordre/${order.id}'">
-        <td onclick="event.stopPropagation();">
-            <input type="checkbox" ${isChecked ? 'checked' : ''} onchange="toggleOrderSelection(${order.id}, this.checked)">
-        </td>
         <td><strong>#${order.id}</strong></td>
         <td>${order.title || '-'}</td>
         <td>${order.customer_id ? `Kunde ${order.customer_id}` : '-'}</td>
@@ -211,31 +120,7 @@
         <td>${formatDate(order.created_at)}</td>
         <td>${formatDate(order.updated_at)}</td>
         <td>${formatDate(order.last_exported_at)}</td>
-        <td class="latest-event" title="${latestEventType} · ${latestEventAt}">
-            <div class="small fw-semibold">${latestEventType}</div>
-            <div class="small text-muted">${latestEventAt}</div>
-        </td>
         <td>${statusBadge}</td>
-        <td>
-            <div class="sync-actions" onclick="event.stopPropagation();">
-                <span>${syncBadge}</span>
-                <select class="form-select form-select-sm" id="syncStatus-${order.id}">
-                    <option value="pending" ${syncStatus === 'pending' ? 'selected' : ''}>pending</option>
-                    <option value="exported" ${syncStatus === 'exported' ? 'selected' : ''}>exported</option>
-                    <option value="failed" ${syncStatus === 'failed' ? 'selected' : ''}>failed</option>
-                    <option value="posted" ${syncStatus === 'posted' ? 'selected' : ''}>posted</option>
-                    <option value="paid" ${syncStatus === 'paid' ? 'selected' : ''}>paid</option>
-                </select>
-                <button class="btn btn-sm btn-outline-primary" title="Gem sync" onclick="saveQuickSyncStatus(${order.id})">
-                    <i class="bi bi-check2"></i>
-                </button>
-                ${syncStatus === 'posted' ? `
-                <button class="btn btn-sm btn-outline-success" title="Markér som betalt" onclick="markOrderPaid(${order.id})">
-                    <i class="bi bi-cash-coin"></i>
-                </button>
-                ` : ''}
-            </div>
-        </td>
         <td>
             <button class="btn btn-sm btn-outline-primary" onclick="event.stopPropagation(); window.location.href='/ordre/${order.id}'">
                 <i class="bi bi-eye"></i>
@@ -248,61 +133,18 @@
     `;
     }).join('');

-    syncSelectAllCheckbox(visibleOrders);
-    updateSelectedCounter();
-
-    updateSummary(visibleOrders);
+    updateSummary();
 }

-function toggleOrderSelection(orderId, checked) {
-    if (checked) {
-        selectedOrderIds.add(orderId);
-    } else {
-        selectedOrderIds.delete(orderId);
-    }
-    syncSelectAllCheckbox();
-    updateSelectedCounter();
-}
-
-function toggleSelectAll(checked) {
-    const visibleOrders = getFilteredOrders();
-    if (checked) {
-        visibleOrders.forEach(order => selectedOrderIds.add(order.id));
-    } else {
-        visibleOrders.forEach(order => selectedOrderIds.delete(order.id));
-    }
-    renderOrders();
-}
-
-function updateSelectedCounter() {
-    const badge = document.getElementById('selectedCountBadge');
-    if (!badge) return;
-    const visibleIds = new Set(getFilteredOrders().map(order => order.id));
-    const selectedVisibleCount = Array.from(selectedOrderIds).filter(id => visibleIds.has(id)).length;
-    badge.textContent = `Valgte: ${selectedVisibleCount}`;
-}
-
-function syncSelectAllCheckbox(visibleOrders = null) {
-    const selectAll = document.getElementById('selectAllOrders');
-    if (!selectAll) return;
-    const rows = Array.isArray(visibleOrders) ? visibleOrders : getFilteredOrders();
-    if (!rows.length) {
-        selectAll.checked = false;
-        return;
-    }
-    selectAll.checked = rows.every(order => selectedOrderIds.has(order.id));
-}
-
-function updateSummary(visibleOrders = null) {
-    const rows = Array.isArray(visibleOrders) ? visibleOrders : getFilteredOrders();
+function updateSummary() {
     const now = new Date();
     const oneMonthAgo = new Date(now.getFullYear(), now.getMonth() - 1, now.getDate());

-    const recentOrders = rows.filter(order => new Date(order.created_at) >= oneMonthAgo);
-    const exportedOrders = rows.filter(order => order.last_exported_at);
-    const notExportedOrders = rows.filter(order => !order.last_exported_at);
+    const recentOrders = orders.filter(order => new Date(order.created_at) >= oneMonthAgo);
+    const exportedOrders = orders.filter(order => order.last_exported_at);
+    const notExportedOrders = orders.filter(order => !order.last_exported_at);

-    document.getElementById('sumOrders').textContent = rows.length;
+    document.getElementById('sumOrders').textContent = orders.length;
     document.getElementById('sumRecent').textContent = recentOrders.length;
     document.getElementById('sumExported').textContent = exportedOrders.length;
     document.getElementById('sumNotExported').textContent = notExportedOrders.length;
@@ -310,7 +152,7 @@

 async function loadOrders() {
     const tbody = document.getElementById('ordersTableBody');
-    tbody.innerHTML = '<tr><td colspan="12" class="text-muted text-center py-4">Indlæser...</td></tr>';
+    tbody.innerHTML = '<tr><td colspan="9" class="text-muted text-center py-4">Indlæser...</td></tr>';

 try {
     const res = await fetch('/api/v1/ordre/drafts?limit=100');
@@ -324,13 +166,10 @@
     console.log('Fetched orders:', data);

     orders = Array.isArray(data) ? data : [];
-    const availableIds = new Set(orders.map(order => order.id));
-    selectedOrderIds = new Set(Array.from(selectedOrderIds).filter(id => availableIds.has(id)));
-    updateSelectedCounter();

     if (orders.length === 0) {
-        tbody.innerHTML = '<tr><td colspan="12" class="text-muted text-center py-4">Ingen ordre fundet. <a href="/ordre/create/new" class="btn btn-sm btn-success ms-2">Opret første ordre</a></td></tr>';
-        updateSummary([]);
+        tbody.innerHTML = '<tr><td colspan="9" class="text-muted text-center py-4">Ingen ordre fundet. <a href="/ordre/create/new" class="btn btn-sm btn-success ms-2">Opret første ordre</a></td></tr>';
+        updateSummary();
         return;
     }

@ -355,86 +194,9 @@
|
|||||||
renderOrders();
|
renderOrders();
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Load orders error:', error);
|
console.error('Load orders error:', error);
|
||||||
tbody.innerHTML = `<tr><td colspan="12" class="text-danger text-center py-4">${error.message || 'Kunne ikke hente ordre'}</td></tr>`;
|
tbody.innerHTML = `<tr><td colspan="9" class="text-danger text-center py-4">${error.message || 'Kunne ikke hente ordre'}</td></tr>`;
|
||||||
orders = [];
|
orders = [];
|
||||||
updateSummary([]);
|
updateSummary();
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
async function markSelectedOrdersPaid() {
|
|
||||||
const ids = Array.from(selectedOrderIds).map(Number).filter(Boolean);
|
|
||||||
if (!ids.length) {
|
|
||||||
showToast('Vælg mindst én ordre først', 'warning');
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
const postedEligibleIds = ids.filter(id => {
|
|
||||||
const order = orders.find(item => item.id === id);
|
|
||||||
return String(order?.sync_status || '').toLowerCase() === 'posted';
|
|
||||||
});
|
|
||||||
|
|
||||||
if (!postedEligibleIds.length) {
|
|
||||||
showToast('Ingen af de valgte ordre er i status posted', 'warning');
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (!confirm(`Markér ${postedEligibleIds.length} posted ordre som betalt?`)) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const res = await fetch('/api/v1/billing/drafts/reconcile-sync-status', {
|
|
||||||
method: 'POST',
|
|
||||||
headers: { 'Content-Type': 'application/json' },
|
|
||||||
body: JSON.stringify({ apply: true, mark_paid_ids: postedEligibleIds }),
|
|
||||||
});
|
|
||||||
const data = await res.json();
|
|
||||||
if (!res.ok) throw new Error(data.detail || 'Kunne ikke opdatere valgte ordre');
|
|
||||||
|
|
||||||
await loadOrders();
|
|
||||||
const paidCount = postedEligibleIds.filter(id => {
|
|
||||||
const order = orders.find(item => item.id === id);
|
|
||||||
return String(order?.sync_status || '').toLowerCase() === 'paid';
|
|
||||||
}).length;
|
|
||||||
showToast(`${paidCount}/${postedEligibleIds.length} posted ordre sat til paid`, paidCount > 0 ? 'success' : 'warning');
|
|
||||||
} catch (error) {
|
|
||||||
showToast(`Fejl: ${error.message}`, 'danger');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
async function saveQuickSyncStatus(orderId) {
|
|
||||||
const select = document.getElementById(`syncStatus-${orderId}`);
|
|
||||||
const syncStatus = (select?.value || '').trim().toLowerCase();
|
|
||||||
if (!syncStatus) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const res = await fetch(`/api/v1/ordre/drafts/${orderId}/sync-status`, {
|
|
||||||
method: 'PATCH',
|
|
||||||
headers: { 'Content-Type': 'application/json' },
|
|
||||||
body: JSON.stringify({ sync_status: syncStatus }),
|
|
||||||
});
|
|
||||||
const data = await res.json();
|
|
||||||
if (!res.ok) throw new Error(data.detail || 'Kunne ikke opdatere sync status');
|
|
||||||
await loadOrders();
|
|
||||||
showToast(`Sync status gemt for ordre #${orderId}`, 'success');
|
|
||||||
} catch (error) {
|
|
||||||
showToast(`Fejl: ${error.message}`, 'danger');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
async function markOrderPaid(orderId) {
|
|
||||||
if (!confirm('Markér ordren som betalt?')) return;
|
|
||||||
|
|
||||||
try {
|
|
||||||
const res = await fetch('/api/v1/billing/drafts/reconcile-sync-status', {
|
|
||||||
method: 'POST',
|
|
||||||
headers: { 'Content-Type': 'application/json' },
|
|
||||||
body: JSON.stringify({ apply: true, mark_paid_ids: [orderId] }),
|
|
||||||
});
|
|
||||||
const data = await res.json();
|
|
||||||
if (!res.ok) throw new Error(data.detail || 'Kunne ikke markere som betalt');
|
|
||||||
await loadOrders();
|
|
||||||
showToast(`Ordre #${orderId} markeret som betalt`, 'success');
|
|
||||||
} catch (error) {
|
|
||||||
showToast(`Fejl: ${error.message}`, 'danger');
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@ -449,9 +211,9 @@
|
|||||||
}
|
}
|
||||||
|
|
||||||
await loadOrders();
|
await loadOrders();
|
||||||
showToast('Ordre slettet', 'success');
|
alert('Ordre slettet');
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
showToast(`Fejl: ${error.message}`, 'danger');
|
alert(`Fejl: ${error.message}`);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|||||||
File diff suppressed because it is too large
Load Diff
@@ -1,5 +1,4 @@
 import logging
-import json
 from datetime import date, datetime
 from typing import Optional
 from fastapi import APIRouter, HTTPException, Query, Request
@@ -57,50 +56,6 @@ def _coerce_optional_int(value: Optional[str]) -> Optional[int]:
 return None


-def _fetch_case_status_options() -> list[str]:
-default_statuses = ["åben", "under behandling", "afventer", "løst", "lukket"]
-values = []
-seen = set()
-
-def _add(value: Optional[str]) -> None:
-candidate = str(value or "").strip()
-if not candidate:
-return
-key = candidate.lower()
-if key in seen:
-return
-seen.add(key)
-values.append(candidate)
-
-setting_row = execute_query(
-"SELECT value FROM settings WHERE key = %s",
-("case_statuses",)
-)
-
-if setting_row and setting_row[0].get("value"):
-try:
-parsed = json.loads(setting_row[0].get("value") or "[]")
-for item in parsed if isinstance(parsed, list) else []:
-value = ""
-if isinstance(item, str):
-value = item.strip()
-elif isinstance(item, dict):
-value = str(item.get("value") or "").strip()
-_add(value)
-except Exception:
-pass
-
-statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ()) or []
-
-for row in statuses:
-_add(row.get("status"))
-
-for default in default_statuses:
-_add(default)
-
-return values
-
-
 @router.get("/sag", response_class=HTMLResponse)
 async def sager_liste(
 request: Request,
@@ -122,9 +77,7 @@ async def sager_liste(
 c.name as customer_name,
 CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) as kontakt_navn,
 COALESCE(u.full_name, u.username) AS ansvarlig_navn,
-g.name AS assigned_group_name,
+g.name AS assigned_group_name
-nt.title AS next_todo_title,
-nt.due_date AS next_todo_due_date
 FROM sag_sager s
 LEFT JOIN customers c ON s.customer_id = c.id
 LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
@@ -137,22 +90,6 @@ async def sager_liste(
 LIMIT 1
 ) cc_first ON true
 LEFT JOIN contacts cont ON cc_first.contact_id = cont.id
-LEFT JOIN LATERAL (
-SELECT t.title, t.due_date
-FROM sag_todo_steps t
-WHERE t.sag_id = s.id
-AND t.deleted_at IS NULL
-AND t.is_done = FALSE
-ORDER BY
-CASE
-WHEN t.is_next THEN 0
-WHEN t.due_date IS NOT NULL THEN 1
-ELSE 2
-END,
-t.due_date ASC NULLS LAST,
-t.created_at ASC
-LIMIT 1
-) nt ON true
 LEFT JOIN sag_sager ds ON ds.id = s.deferred_until_case_id
 WHERE s.deleted_at IS NULL
 """
@@ -225,11 +162,7 @@ async def sager_liste(
 sager = [s for s in sager if s['id'] in tagged_ids]

 # Fetch all distinct statuses and tags for filters
-status_options = _fetch_case_status_options()
+statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ())
-
-current_status = str(status or "").strip()
-if current_status and current_status.lower() not in {s.lower() for s in status_options}:
-status_options.append(current_status)
 all_tags = execute_query("SELECT DISTINCT tag_navn FROM sag_tags WHERE deleted_at IS NULL ORDER BY tag_navn", ())

 toggle_include_deferred_url = str(
@@ -241,7 +174,7 @@ async def sager_liste(
 "sager": sager,
 "relations_map": relations_map,
 "child_ids": list(child_ids),
-"statuses": status_options,
+"statuses": [s['status'] for s in statuses],
 "all_tags": [t['tag_navn'] for t in all_tags],
 "current_status": status,
 "current_tag": tag,
@@ -252,24 +185,9 @@ async def sager_liste(
 "current_ansvarlig_bruger_id": ansvarlig_bruger_id_int,
 "current_assigned_group_id": assigned_group_id_int,
 })
-except Exception:
+except Exception as e:
-logger.exception("❌ Error displaying case list")
+logger.error("❌ Error displaying case list: %s", e)
-return templates.TemplateResponse("modules/sag/templates/index.html", {
+raise HTTPException(status_code=500, detail="Failed to load case list")
-"request": request,
-"sager": [],
-"relations_map": {},
-"child_ids": [],
-"statuses": _fetch_case_status_options(),
-"all_tags": [],
-"current_status": status,
-"current_tag": tag,
-"include_deferred": include_deferred,
-"toggle_include_deferred_url": str(request.url),
-"assignment_users": [],
-"assignment_groups": [],
-"current_ansvarlig_bruger_id": ansvarlig_bruger_id_int,
-"current_assigned_group_id": assigned_group_id_int,
-})

 @router.get("/sag/new", response_class=HTMLResponse)
 async def opret_sag_side(request: Request):
@@ -469,7 +387,7 @@ async def sag_detaljer(request: Request, sag_id: int):
 customers = execute_query(customers_query, (sag_id,))

 # Fetch comments
-comments_query = "SELECT * FROM sag_kommentarer WHERE sag_id = %s AND deleted_at IS NULL ORDER BY created_at DESC"
+comments_query = "SELECT * FROM sag_kommentarer WHERE sag_id = %s AND deleted_at IS NULL ORDER BY created_at ASC"
 comments = execute_query(comments_query, (sag_id,))

 # Fetch Solution
@@ -533,10 +451,7 @@ async def sag_detaljer(request: Request, sag_id: int):
 logger.warning("⚠️ Could not load pipeline stages: %s", e)
 pipeline_stages = []

-status_options = _fetch_case_status_options()
+statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ())
-current_status = str(sag.get("status") or "").strip()
-if current_status and current_status.lower() not in {s.lower() for s in status_options}:
-status_options.append(current_status)
 is_deadline_overdue = _is_deadline_overdue(sag.get("deadline"))

 return templates.TemplateResponse("modules/sag/templates/detail.html", {
@@ -560,7 +475,7 @@ async def sag_detaljer(request: Request, sag_id: int):
 "nextcloud_instance": nextcloud_instance,
 "related_case_options": related_case_options,
 "pipeline_stages": pipeline_stages,
-"status_options": status_options,
+"status_options": [s["status"] for s in statuses],
 "is_deadline_overdue": is_deadline_overdue,
 "assignment_users": _fetch_assignment_users(),
 "assignment_groups": _fetch_assignment_groups(),
@@ -33,7 +33,7 @@ class RelationService:

 # 2. Fetch details for these cases
 placeholders = ','.join(['%s'] * len(tree_ids))
-tree_cases_query = f"SELECT id, titel, status, type, template_key FROM sag_sager WHERE id IN ({placeholders})"
+tree_cases_query = f"SELECT id, titel, status FROM sag_sager WHERE id IN ({placeholders})"
 tree_cases = {c['id']: c for c in execute_query(tree_cases_query, tuple(tree_ids))}

 # 3. Fetch all edges between these cases

File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -17,14 +17,12 @@
 .table-wrapper {
 background: var(--bg-card);
 border-radius: 12px;
-overflow-x: auto;
+overflow: hidden;
-overflow-y: hidden;
 box-shadow: 0 2px 8px rgba(0,0,0,0.05);
 }

 .sag-table {
 width: 100%;
-min-width: 1760px;
 margin: 0;
 }

@@ -34,13 +32,12 @@
 }

 .sag-table thead th {
-padding: 0.6rem 0.75rem;
+padding: 0.8rem 1rem;
 font-weight: 600;
-font-size: 0.78rem;
+font-size: 0.85rem;
 text-transform: uppercase;
-letter-spacing: 0.3px;
+letter-spacing: 0.5px;
 border: none;
-white-space: nowrap;
 }

 .sag-table tbody tr {
@@ -54,30 +51,9 @@
 }

 .sag-table tbody td {
-padding: 0.5rem 0.75rem;
+padding: 0.6rem 1rem;
-vertical-align: top;
+vertical-align: middle;
-font-size: 0.86rem;
+font-size: 0.9rem;
-white-space: nowrap;
-}
-
-.sag-table td.col-company,
-.sag-table td.col-contact,
-.sag-table td.col-owner,
-.sag-table td.col-group,
-.sag-table td.col-desc {
-white-space: normal;
-}
-
-.sag-table td.col-company,
-.sag-table td.col-contact,
-.sag-table td.col-owner,
-.sag-table td.col-group {
-max-width: 180px;
-}
-
-.sag-table td.col-desc {
-min-width: 260px;
-max-width: 360px;
 }

 .sag-id {
@@ -270,7 +246,7 @@
 {% endblock %}

 {% block content %}
-<div class="container-fluid" style="max-width: none; padding-top: 2rem;">
+<div class="container-fluid" style="max-width: 1400px; padding-top: 2rem;">
 <!-- Header -->
 <div class="d-flex justify-content-between align-items-center mb-4">
 <h1 style="margin: 0; color: var(--accent);">
@@ -354,19 +330,17 @@
 <table class="sag-table">
 <thead>
 <tr>
-<th style="width: 90px;">SagsID</th>
+<th style="width: 90px;">ID</th>
-<th style="width: 180px;">Virksom.</th>
+<th>Titel & Beskrivelse</th>
-<th style="width: 150px;">Kontakt</th>
-<th style="width: 300px;">Beskr.</th>
 <th style="width: 120px;">Type</th>
-<th style="width: 110px;">Prioritet</th>
+<th style="width: 180px;">Kunde</th>
-<th style="width: 160px;">Ansvarl.</th>
+<th style="width: 150px;">Hovedkontakt</th>
-<th style="width: 170px;">Gruppe/Level</th>
+<th style="width: 160px;">Ansvarlig</th>
-<th style="width: 240px;">Næste todo</th>
+<th style="width: 160px;">Gruppe</th>
-<th style="width: 120px;">Opret.</th>
+<th style="width: 100px;">Status</th>
-<th style="width: 120px;">Start arbejde</th>
+<th style="width: 120px;">Udsat start</th>
-<th style="width: 140px;">Start inden</th>
+<th style="width: 120px;">Oprettet</th>
-<th style="width: 120px;">Deadline</th>
+<th style="width: 120px;">Opdateret</th>
 </tr>
 </thead>
 <tbody id="sagTableBody">
@@ -383,13 +357,7 @@
 {% endif %}
 <span class="sag-id">#{{ sag.id }}</span>
 </td>
-<td class="col-company" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+<td onclick="window.location.href='/sag/{{ sag.id }}'">
-{{ sag.customer_name if sag.customer_name else '-' }}
-</td>
-<td class="col-contact" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-{{ sag.kontakt_navn if sag.kontakt_navn and sag.kontakt_navn.strip() else '-' }}
-</td>
-<td class="col-desc" onclick="window.location.href='/sag/{{ sag.id }}'">
 <div class="sag-titel">{{ sag.titel }}</div>
 {% if sag.beskrivelse %}
 <div class="sag-beskrivelse">{{ sag.beskrivelse }}</div>
@@ -398,36 +366,29 @@
 <td onclick="window.location.href='/sag/{{ sag.id }}'">
 <span class="badge bg-light text-dark border">{{ sag.template_key or sag.type or 'ticket' }}</span>
 </td>
-<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; text-transform: capitalize;">
+<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-{{ sag.priority if sag.priority else 'normal' }}
+{{ sag.customer_name if sag.customer_name else '-' }}
 </td>
-<td class="col-owner" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+{{ sag.kontakt_navn if sag.kontakt_navn and sag.kontakt_navn.strip() else '-' }}
+</td>
+<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
 {{ sag.ansvarlig_navn if sag.ansvarlig_navn else '-' }}
 </td>
-<td class="col-group" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
 {{ sag.assigned_group_name if sag.assigned_group_name else '-' }}
 </td>
-<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; white-space: normal; max-width: 240px;">
+<td onclick="window.location.href='/sag/{{ sag.id }}'">
-{% if sag.next_todo_title %}
+<span class="status-badge status-{{ sag.status }}">{{ sag.status }}</span>
-<div>{{ sag.next_todo_title }}</div>
-{% if sag.next_todo_due_date %}
-<div class="small text-muted">Forfald: {{ sag.next_todo_due_date.strftime('%d/%m-%Y') }}</div>
-{% endif %}
-{% else %}
--
-{% endif %}
-</td>
-<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
-{{ sag.created_at.strftime('%d/%m-%Y') if sag.created_at else '-' }}
-</td>
-<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
-{{ sag.start_date.strftime('%d/%m-%Y') if sag.start_date else '-' }}
 </td>
 <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
 {{ sag.deferred_until.strftime('%d/%m-%Y') if sag.deferred_until else '-' }}
 </td>
 <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
-{{ sag.deadline.strftime('%d/%m-%Y') if sag.deadline else '-' }}
+{{ sag.created_at.strftime('%d/%m-%Y') if sag.created_at else '-' }}
+</td>
+<td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
+{{ sag.updated_at.strftime('%d/%m-%Y') if sag.updated_at else '-' }}
 </td>
 </tr>
 {% if has_relations %}
@@ -441,13 +402,7 @@
 <td>
 <span class="sag-id">#{{ related_sag.id }}</span>
 </td>
-<td class="col-company" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'">
-{{ related_sag.customer_name if related_sag.customer_name else '-' }}
-</td>
-<td class="col-contact" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-{{ related_sag.kontakt_navn if related_sag.kontakt_navn and related_sag.kontakt_navn.strip() else '-' }}
-</td>
-<td class="col-desc" onclick="window.location.href='/sag/{{ related_sag.id }}'">
 {% for rt in all_rel_types %}
 <span class="relation-badge">{{ rt }}</span>
 {% endfor %}
@@ -459,36 +414,29 @@
 <td onclick="window.location.href='/sag/{{ related_sag.id }}'">
 <span class="badge bg-light text-dark border">{{ related_sag.template_key or related_sag.type or 'ticket' }}</span>
 </td>
-<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; text-transform: capitalize;">
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-{{ related_sag.priority if related_sag.priority else 'normal' }}
+{{ related_sag.customer_name if related_sag.customer_name else '-' }}
 </td>
-<td class="col-owner" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+{{ related_sag.kontakt_navn if related_sag.kontakt_navn and related_sag.kontakt_navn.strip() else '-' }}
+</td>
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
 {{ related_sag.ansvarlig_navn if related_sag.ansvarlig_navn else '-' }}
 </td>
-<td class="col-group" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
 {{ related_sag.assigned_group_name if related_sag.assigned_group_name else '-' }}
 </td>
-<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; white-space: normal; max-width: 240px;">
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'">
-{% if related_sag.next_todo_title %}
+<span class="status-badge status-{{ related_sag.status }}">{{ related_sag.status }}</span>
-<div>{{ related_sag.next_todo_title }}</div>
-{% if related_sag.next_todo_due_date %}
-<div class="small text-muted">Forfald: {{ related_sag.next_todo_due_date.strftime('%d/%m-%Y') }}</div>
-{% endif %}
-{% else %}
--
-{% endif %}
-</td>
-<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
-{{ related_sag.created_at.strftime('%d/%m-%Y') if related_sag.created_at else '-' }}
-</td>
-<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
-{{ related_sag.start_date.strftime('%d/%m-%Y') if related_sag.start_date else '-' }}
 </td>
 <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
 {{ related_sag.deferred_until.strftime('%d/%m-%Y') if related_sag.deferred_until else '-' }}
 </td>
 <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
-{{ related_sag.deadline.strftime('%d/%m-%Y') if related_sag.deadline else '-' }}
+{{ related_sag.created_at.strftime('%d/%m-%Y') if related_sag.created_at else '-' }}
+</td>
+<td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
+{{ related_sag.updated_at.strftime('%d/%m-%Y') if related_sag.updated_at else '-' }}
 </td>
 </tr>
 {% endif %}
@ -112,54 +112,19 @@ def _validate_yealink_request(request: Request, token: Optional[str]) -> None:
|
|||||||
db_secret = (_get_setting_value("telefoni_shared_secret", "") or "").strip()
|
db_secret = (_get_setting_value("telefoni_shared_secret", "") or "").strip()
|
||||||
accepted_tokens = {s for s in (env_secret, db_secret) if s}
|
accepted_tokens = {s for s in (env_secret, db_secret) if s}
|
||||||
whitelist = (getattr(settings, "TELEFONI_IP_WHITELIST", "") or "").strip()
|
whitelist = (getattr(settings, "TELEFONI_IP_WHITELIST", "") or "").strip()
|
||||||
client_ip = _get_client_ip(request)
|
|
||||||
path = request.url.path
|
|
||||||
|
|
||||||
def _mask(value: Optional[str]) -> str:
|
|
||||||
if not value:
|
|
||||||
return "<empty>"
|
|
||||||
stripped = value.strip()
|
|
||||||
-        if len(stripped) <= 8:
-            return "***"
-        return f"{stripped[:4]}...{stripped[-4:]}"
 
     if not accepted_tokens and not whitelist:
-        logger.error(
-            "❌ Telefoni callback rejected path=%s reason=no_security_config ip=%s",
-            path,
-            client_ip,
-        )
+        logger.error("❌ Telefoni callbacks are not secured (no TELEFONI_SHARED_SECRET or TELEFONI_IP_WHITELIST set)")
         raise HTTPException(status_code=403, detail="Telefoni callbacks not configured")
 
     if token and token.strip() in accepted_tokens:
-        logger.debug("✅ Telefoni callback accepted path=%s auth=token ip=%s", path, client_ip)
         return
 
-    if token and accepted_tokens:
-        logger.warning(
-            "⚠️ Telefoni callback token mismatch path=%s ip=%s provided=%s accepted_sources=%s",
-            path,
-            client_ip,
-            _mask(token),
-            "+".join([name for name, value in (("env", env_secret), ("db", db_secret)) if value]) or "none",
-        )
-    elif not token:
-        logger.info("ℹ️ Telefoni callback without token path=%s ip=%s", path, client_ip)
 
     if whitelist:
+        client_ip = _get_client_ip(request)
         if ip_in_whitelist(client_ip, whitelist):
-            logger.debug("✅ Telefoni callback accepted path=%s auth=ip_whitelist ip=%s", path, client_ip)
             return
-        logger.warning(
-            "⚠️ Telefoni callback IP not in whitelist path=%s ip=%s whitelist=%s",
-            path,
-            client_ip,
-            whitelist,
-        )
-    else:
-        logger.info("ℹ️ Telefoni callback whitelist not configured path=%s ip=%s", path, client_ip)
 
-    logger.warning("❌ Telefoni callback forbidden path=%s ip=%s", path, client_ip)
     raise HTTPException(status_code=403, detail="Forbidden")
 
 
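For reference, the token-masking rule implemented by the `_mask` helper removed on the main side can be sketched in isolation (the standalone function name here is illustrative, not from the repo):

```python
def mask_token(token: str) -> str:
    """Mask a secret for log output: short tokens fully, longer ones keep 4+4 chars."""
    stripped = (token or "").strip()
    if len(stripped) <= 8:
        return "***"
    return f"{stripped[:4]}...{stripped[-4:]}"

print(mask_token("abc123"))               # short tokens are fully hidden
print(mask_token("sk-1234567890abcdef"))  # long tokens keep first/last 4 chars
```

This lets mismatched-token warnings identify which credential was presented without ever writing the full secret to the log.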
@@ -358,7 +358,6 @@ async function loadUsers() {
         opt.textContent = `${u.full_name || u.username || ('#' + u.user_id)}${u.telefoni_extension ? ' (' + u.telefoni_extension + ')' : ''}`;
         sel.appendChild(opt);
       });
-      sel.value = '';
     } catch (e) {
       console.error('Failed loading telefoni users', e);
     }
@@ -501,16 +500,6 @@ async function unlinkCase(callId) {
 
 document.addEventListener('DOMContentLoaded', async () => {
   initLinkSagModalEvents();
-  const userFilter = document.getElementById('filterUser');
-  const fromFilter = document.getElementById('filterFrom');
-  const toFilter = document.getElementById('filterTo');
-  const withoutCaseFilter = document.getElementById('filterWithoutCase');
-
-  if (userFilter) userFilter.value = '';
-  if (fromFilter) fromFilter.value = '';
-  if (toFilter) toFilter.value = '';
-  if (withoutCaseFilter) withoutCaseFilter.checked = false;
-
   await loadUsers();
   document.getElementById('btnRefresh').addEventListener('click', loadCalls);
   document.getElementById('filterUser').addEventListener('change', loadCalls);
@@ -12,31 +12,15 @@ import os
 import shutil
 
 from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
-from app.core.config import settings
 
 logger = logging.getLogger(__name__)
 
 # APIRouter instance (module_loader kigger efter denne)
 router = APIRouter()
 
-# Upload directory for logos (works in both Docker and local development)
-_logo_base_dir = os.path.abspath(settings.UPLOAD_DIR)
-LOGO_UPLOAD_DIR = os.path.join(_logo_base_dir, "webshop_logos")
-try:
-    os.makedirs(LOGO_UPLOAD_DIR, exist_ok=True)
-except OSError as exc:
-    if _logo_base_dir.startswith('/app/'):
-        _fallback_base = os.path.abspath('uploads')
-        LOGO_UPLOAD_DIR = os.path.join(_fallback_base, "webshop_logos")
-        os.makedirs(LOGO_UPLOAD_DIR, exist_ok=True)
-        logger.warning(
-            "⚠️ Webshop logo dir %s not writable (%s). Using fallback %s",
-            _logo_base_dir,
-            exc,
-            LOGO_UPLOAD_DIR,
-        )
-    else:
-        raise
+# Upload directory for logos
+LOGO_UPLOAD_DIR = "/app/uploads/webshop_logos"
+os.makedirs(LOGO_UPLOAD_DIR, exist_ok=True)
 
 
 # ============================================================================
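The hunk above replaces a hardcoded Docker path with a configurable base directory plus a writable-fallback. That try/except pattern can be sketched generically as follows (function and path names are illustrative; it assumes the current working directory is writable):

```python
import logging
import os

logger = logging.getLogger(__name__)

def ensure_upload_dir(preferred: str, fallback_base: str = "uploads") -> str:
    """Create the preferred upload dir; fall back to a local path if it is not writable."""
    try:
        os.makedirs(preferred, exist_ok=True)
        return preferred
    except OSError as exc:
        # Keep the leaf directory name, but relocate under a writable base
        fallback = os.path.join(os.path.abspath(fallback_base), os.path.basename(preferred))
        os.makedirs(fallback, exist_ok=True)
        logger.warning("Upload dir %s not writable (%s); using fallback %s", preferred, exc, fallback)
        return fallback

logo_dir = ensure_upload_dir(os.path.join(os.path.abspath("uploads"), "webshop_logos"))
```

This keeps container deployments (where `/app/uploads` exists) and local development (plain `uploads/`) working from the same code path.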
@@ -135,7 +135,7 @@ async def get_prepaid_cards(status: Optional[str] = None, customer_id: Optional[
         raise HTTPException(status_code=500, detail=str(e))
 
 
-@router.get("/prepaid-cards/{card_id:int}", response_model=Dict[str, Any])
+@router.get("/prepaid-cards/{card_id}", response_model=Dict[str, Any])
 async def get_prepaid_card(card_id: int):
     """
     Get a specific prepaid card with transactions
@@ -321,7 +321,7 @@ async def create_prepaid_card(card: PrepaidCardCreate):
         raise HTTPException(status_code=500, detail=str(e))
 
 
-@router.put("/prepaid-cards/{card_id:int}/status")
+@router.put("/prepaid-cards/{card_id}/status")
 async def update_card_status(card_id: int, status: str):
     """
     Update prepaid card status (cancel, reactivate)
@@ -362,7 +362,7 @@ async def update_card_status(card_id: int, status: str):
         raise HTTPException(status_code=500, detail=str(e))
 
 
-@router.put("/prepaid-cards/{card_id:int}/rounding", response_model=Dict[str, Any])
+@router.put("/prepaid-cards/{card_id}/rounding", response_model=Dict[str, Any])
 async def update_card_rounding(card_id: int, payload: PrepaidCardRoundingUpdate):
     """
     Update rounding interval for a prepaid card (minutes)
@@ -394,7 +394,7 @@ async def update_card_rounding(card_id: int, payload: PrepaidCardRoundingUpdate)
         raise HTTPException(status_code=500, detail=str(e))
 
 
-@router.delete("/prepaid-cards/{card_id:int}")
+@router.delete("/prepaid-cards/{card_id}")
 async def delete_prepaid_card(card_id: int):
     """
     Delete a prepaid card (only if no transactions)
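The only change in the four hunks above is dropping the `:int` path converter. In Starlette routing (which FastAPI uses), `{card_id:int}` makes the route match digits only, so `/prepaid-cards/abc` falls through to other routes or a 404; with plain `{card_id}` the route matches and the type annotation then rejects the value with a 422. A rough illustration of that difference, using simplified regexes (not Starlette's actual compiler):

```python
import re

def compile_path(path: str):
    """Very rough sketch of how a `{param:int}` converter tightens route matching."""
    pattern = re.sub(r"\{(\w+):int\}", r"(?P<\1>[0-9]+)", path)
    pattern = re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", pattern)
    return re.compile(f"^{pattern}$")

with_converter = compile_path("/prepaid-cards/{card_id:int}")
plain = compile_path("/prepaid-cards/{card_id}")

print(bool(with_converter.match("/prepaid-cards/abc")))  # False: non-digits never match the route
print(bool(plain.match("/prepaid-cards/abc")))           # True: matched, then rejected on int coercion
```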
@@ -6,7 +6,7 @@ import logging
 logger = logging.getLogger(__name__)
 
 router = APIRouter()
-templates = Jinja2Templates(directory=["app/prepaid/frontend", "app/shared/frontend", "app"])
+templates = Jinja2Templates(directory=["app/prepaid/frontend", "app/shared/frontend"])
 
 
 @router.get("/prepaid-cards", response_class=HTMLResponse)
@@ -113,9 +113,7 @@ def _upsert_product_supplier(product_id: int, payload: Dict[str, Any], source: s
     """
     match_params = (product_id, supplier_name, supplier_sku)
 
-    existing = None
-    if match_query and match_params is not None:
-        existing = execute_query_single(match_query, match_params)
+    existing = execute_query_single(match_query, match_params) if match_query else None
 
     if existing:
         update_query = """
@@ -475,9 +473,6 @@ async def list_products(
                 minimum_term_months,
                 is_bundle,
                 billable,
-                serial_number_required,
-                asset_required,
-                rental_asset_enabled,
                 image_url
             FROM products
             {where_clause}
@@ -531,9 +526,6 @@ async def create_product(payload: Dict[str, Any]):
                 parent_product_id,
                 bundle_pricing_model,
                 billable,
-                serial_number_required,
-                asset_required,
-                rental_asset_enabled,
                 default_case_tag,
                 default_time_rate_id,
                 category_id,
@@ -556,8 +548,7 @@ async def create_product(payload: Dict[str, Any]):
                 %s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
                 %s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
                 %s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
-                %s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
-                %s, %s
+                %s, %s, %s, %s, %s, %s, %s, %s, %s
             )
             RETURNING *
         """
@@ -594,9 +585,6 @@ async def create_product(payload: Dict[str, Any]):
             payload.get("parent_product_id"),
             payload.get("bundle_pricing_model"),
             payload.get("billable", True),
-            payload.get("serial_number_required", False),
-            payload.get("asset_required", False),
-            payload.get("rental_asset_enabled", False),
             payload.get("default_case_tag"),
             payload.get("default_time_rate_id"),
             payload.get("category_id"),
@@ -646,7 +634,7 @@ async def update_product(
     payload: Dict[str, Any],
     current_user: dict = Depends(require_permission("products.update"))
 ):
-    """Update product fields for core metadata and billing validation flags."""
+    """Update product fields like name."""
     try:
         name = payload.get("name")
         if name is not None:
@@ -654,45 +642,21 @@ async def update_product(
             if not name:
                 raise HTTPException(status_code=400, detail="name cannot be empty")
 
-        serial_number_required = payload.get("serial_number_required")
-        asset_required = payload.get("asset_required")
-        rental_asset_enabled = payload.get("rental_asset_enabled")
-
         existing = execute_query_single(
-            """
-            SELECT name, serial_number_required, asset_required, rental_asset_enabled
-            FROM products
-            WHERE id = %s AND deleted_at IS NULL
-            """,
+            "SELECT name FROM products WHERE id = %s AND deleted_at IS NULL",
             (product_id,)
         )
         if not existing:
             raise HTTPException(status_code=404, detail="Product not found")
 
-        updates = ["updated_at = CURRENT_TIMESTAMP"]
-        values: List[Any] = []
-
-        if name is not None:
-            updates.append("name = %s")
-            values.append(name)
-        if serial_number_required is not None:
-            updates.append("serial_number_required = %s")
-            values.append(bool(serial_number_required))
-        if asset_required is not None:
-            updates.append("asset_required = %s")
-            values.append(bool(asset_required))
-        if rental_asset_enabled is not None:
-            updates.append("rental_asset_enabled = %s")
-            values.append(bool(rental_asset_enabled))
-
-        values.append(product_id)
-        query = f"""
+        query = """
             UPDATE products
-            SET {', '.join(updates)}
+            SET name = COALESCE(%s, name),
+                updated_at = CURRENT_TIMESTAMP
             WHERE id = %s AND deleted_at IS NULL
             RETURNING *
         """
-        result = execute_query(query, tuple(values))
+        result = execute_query(query, (name, product_id))
        if not result:
             raise HTTPException(status_code=404, detail="Product not found")
         if name is not None and name != existing.get("name"):
@@ -702,15 +666,6 @@ async def update_product(
                 current_user.get("id") if current_user else None,
                 {"old": existing.get("name"), "new": name}
             )
-
-        for field in ("serial_number_required", "asset_required", "rental_asset_enabled"):
-            if field in payload and payload.get(field) != existing.get(field):
-                _log_product_audit(
-                    product_id,
-                    f"{field}_updated",
-                    current_user.get("id") if current_user else None,
-                    {"old": existing.get(field), "new": bool(payload.get(field))}
-                )
         return result[0]
     except HTTPException:
         raise
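The main-side `update_product` builds its UPDATE statement dynamically from whichever fields the payload actually contains. That approach can be condensed into a standalone sketch (names mirror the diff, the helper itself is illustrative):

```python
from typing import Any, Dict, List, Tuple

def build_product_update(product_id: int, payload: Dict[str, Any]) -> Tuple[str, tuple]:
    """Build an UPDATE from only the fields present in the payload."""
    updates: List[str] = ["updated_at = CURRENT_TIMESTAMP"]
    values: List[Any] = []
    if payload.get("name") is not None:
        updates.append("name = %s")
        values.append(payload["name"])
    # Boolean validation flags: only touch columns the caller actually sent
    for flag in ("serial_number_required", "asset_required", "rental_asset_enabled"):
        if payload.get(flag) is not None:
            updates.append(f"{flag} = %s")
            values.append(bool(payload[flag]))
    values.append(product_id)
    query = (
        "UPDATE products SET " + ", ".join(updates)
        + " WHERE id = %s AND deleted_at IS NULL RETURNING *"
    )
    return query, tuple(values)

query, params = build_product_update(7, {"name": "Router", "asset_required": True})
```

Column names come from the payload-key allowlist, never from raw user input, so the f-string interpolation stays injection-safe while the values still go through placeholders.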
@@ -67,12 +67,12 @@ class CaseAnalysisService:
 
                 return analysis
             else:
-                logger.warning("⚠️ Ollama returned no result, using heuristic fallback analysis")
-                return await self._heuristic_fallback_analysis(text)
+                logger.warning("⚠️ Ollama returned no result, using empty analysis")
+                return self._empty_analysis(text)
 
         except Exception as e:
             logger.error(f"❌ Case analysis failed: {e}", exc_info=True)
-            return await self._heuristic_fallback_analysis(text)
+            return self._empty_analysis(text)
 
     def _build_analysis_prompt(self) -> str:
         """Build Danish system prompt for case analysis"""
@@ -471,73 +471,6 @@ Returner JSON med suggested_title, suggested_description, priority, customer_hin
             ai_reasoning="AI unavailable - fill fields manually"
         )
 
-    async def _heuristic_fallback_analysis(self, text: str) -> QuickCreateAnalysis:
-        """Local fallback when AI service is unavailable."""
-        cleaned_text = (text or "").strip()
-        if not cleaned_text:
-            return self._empty_analysis(text)
-
-        lowered = cleaned_text.lower()
-
-        # Priority heuristics based on urgency wording.
-        urgent_terms = ["nede", "kritisk", "asap", "omgående", "straks", "akut", "haster"]
-        high_terms = ["hurtigt", "vigtigt", "snarest", "prioriter"]
-        low_terms = ["når i får tid", "ikke hastende", "lavprioriteret"]
-
-        if any(term in lowered for term in urgent_terms):
-            priority = SagPriority.URGENT
-        elif any(term in lowered for term in high_terms):
-            priority = SagPriority.HIGH
-        elif any(term in lowered for term in low_terms):
-            priority = SagPriority.LOW
-        else:
-            priority = SagPriority.NORMAL
-
-        # Basic title heuristic: first non-empty line/sentence, clipped to 80 chars.
-        first_line = cleaned_text.splitlines()[0].strip()
-        first_sentence = re.split(r"[.!?]", first_line)[0].strip()
-        title_source = first_sentence or first_line or cleaned_text
-        title = title_source[:80].strip()
-        if not title:
-            title = "Ny sag"
-
-        # Lightweight keyword tags.
-        keyword_tags = {
-            "printer": "printer",
-            "mail": "mail",
-            "email": "mail",
-            "vpn": "vpn",
-            "net": "netværk",
-            "wifi": "wifi",
-            "server": "server",
-            "laptop": "laptop",
-            "adgang": "adgang",
-            "onboarding": "onboarding",
-        }
-        suggested_tags: List[str] = []
-        for key, tag in keyword_tags.items():
-            if key in lowered and tag not in suggested_tags:
-                suggested_tags.append(tag)
-
-        # Try simple customer matching from long words in text.
-        candidate_hints = []
-        for token in re.findall(r"[A-Za-z0-9ÆØÅæøå._-]{3,}", cleaned_text):
-            if token.lower() in {"ring", "kunde", "sag", "skal", "have", "virker", "ikke"}:
-                continue
-            candidate_hints.append(token)
-        customer_id, customer_name = await self._match_customer(candidate_hints[:8])
-
-        return QuickCreateAnalysis(
-            suggested_title=title,
-            suggested_description=cleaned_text,
-            suggested_priority=priority,
-            suggested_customer_id=customer_id,
-            suggested_customer_name=customer_name,
-            suggested_tags=suggested_tags,
-            confidence=0.35,
-            ai_reasoning="AI service unavailable - using local fallback suggestions"
-        )
-
     def _get_cached_analysis(self, text: str) -> Optional[QuickCreateAnalysis]:
         """Get cached analysis if available and not expired"""
         text_hash = hashlib.md5(text.encode()).hexdigest()
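The core of the removed `_heuristic_fallback_analysis`, the keyword-based priority guess, can be distilled to a few lines. This sketch returns plain strings instead of the repo's `SagPriority` enum and uses a trimmed term list:

```python
URGENT_TERMS = ["nede", "kritisk", "asap", "akut", "haster"]
HIGH_TERMS = ["hurtigt", "vigtigt", "snarest"]
LOW_TERMS = ["ikke hastende", "lavprioriteret"]

def classify_priority(text: str) -> str:
    """Guess a case priority from Danish urgency wording; default to normal."""
    lowered = (text or "").lower()
    if any(term in lowered for term in URGENT_TERMS):
        return "urgent"
    if any(term in lowered for term in HIGH_TERMS):
        return "high"
    if any(term in lowered for term in LOW_TERMS):
        return "low"
    return "normal"

print(classify_priority("Serveren er nede - haster!"))  # urgent
```

Such a fallback is cheap and deterministic, which is exactly why it was useful when the AI backend returned nothing; dropping it means callers now get an empty analysis instead of a best-effort guess.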
@@ -784,7 +784,7 @@ class EconomicService:
                                      invoice_date: str,
                                      total_amount: float,
                                      vat_breakdown: Dict[str, float],
-                                     line_items: Optional[List[Dict]] = None,
+                                     line_items: List[Dict] = None,
                                      due_date: Optional[str] = None,
                                      text: Optional[str] = None) -> Dict:
         """
@@ -983,12 +983,10 @@ class EconomicService:
                     data = await response.json() if response_text else {}
 
                     # e-conomic returns array of created vouchers
-                    if isinstance(data, list) and len(data) > 0 and isinstance(data[0], dict):
+                    if isinstance(data, list) and len(data) > 0:
                         voucher_data = data[0]
-                    elif isinstance(data, dict):
-                        voucher_data = data
                     else:
-                        voucher_data = {}
+                        voucher_data = data
 
                     voucher_number = voucher_data.get('voucherNumber')
                     logger.info(f"✅ Supplier invoice posted to kassekladde: voucher #{voucher_number}")
@@ -1047,8 +1045,8 @@ class EconomicService:
             url = f"{self.api_url}/journals/{journal_number}/vouchers/{accounting_year}-{voucher_number}/attachment/file"
 
             headers = {
-                'X-AppSecretToken': str(self.app_secret_token or ''),
-                'X-AgreementGrantToken': str(self.agreement_grant_token or '')
+                'X-AppSecretToken': self.app_secret_token,
+                'X-AgreementGrantToken': self.agreement_grant_token
             }
 
             # Use multipart/form-data as required by e-conomic API
@@ -1072,55 +1070,6 @@ class EconomicService:
             logger.error(f"❌ upload_voucher_attachment error: {e}")
             return {"error": True, "message": str(e)}
 
-    async def get_invoice_lifecycle_status(self, invoice_number: str) -> str:
-        """
-        Resolve lifecycle status for an invoice number from e-conomic.
-
-        Returns one of: draft, booked, unpaid, paid, not_found, error
-        """
-        invoice_number = str(invoice_number or "").strip()
-        if not invoice_number:
-            return "not_found"
-
-        endpoints = [
-            ("paid", f"{self.api_url}/invoices/paid"),
-            ("unpaid", f"{self.api_url}/invoices/unpaid"),
-            ("booked", f"{self.api_url}/invoices/booked"),
-            ("draft", f"{self.api_url}/invoices/drafts"),
-        ]
-
-        try:
-            async with aiohttp.ClientSession() as session:
-                for status_name, endpoint in endpoints:
-                    page = 0
-                    while True:
-                        async with session.get(
-                            endpoint,
-                            params={"pagesize": 1000, "skippages": page},
-                            headers=self._get_headers(),
-                        ) as response:
-                            if response.status != 200:
-                                break
-
-                            data = await response.json()
-                            collection = data.get("collection", [])
-                            if not collection:
-                                break
-
-                            for inv in collection:
-                                inv_no = inv.get("draftInvoiceNumber") or inv.get("bookedInvoiceNumber")
-                                if str(inv_no or "") == invoice_number:
-                                    return status_name
-
-                            if len(collection) < 1000:
-                                break
-                            page += 1
-
-            return "not_found"
-        except Exception as e:
-            logger.error("❌ Error resolving invoice lifecycle status %s: %s", invoice_number, e)
-            return "error"
-
 
 # Singleton instance
 _economic_service_instance = None
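The removed `get_invoice_lifecycle_status` scans each paged e-conomic collection until it finds a match or runs out of pages. The page-scan core can be sketched synchronously with an injected fetcher (the real method uses aiohttp against the `collection`/`pagesize`/`skippages` shape of the e-conomic API; everything else here is illustrative):

```python
from typing import Callable, List, Optional

def scan_pages(fetch_page: Callable[[int], List[dict]], invoice_number: str,
               page_size: int = 1000) -> Optional[dict]:
    """Walk a paged collection until the invoice is found or pages are exhausted."""
    page = 0
    while True:
        collection = fetch_page(page)
        if not collection:
            return None
        for inv in collection:
            inv_no = inv.get("draftInvoiceNumber") or inv.get("bookedInvoiceNumber")
            if str(inv_no or "") == invoice_number:
                return inv
        if len(collection) < page_size:
            return None  # short page means this was the last one
        page += 1

# Fake single-record feed for illustration
pages = [[{"bookedInvoiceNumber": 1}], []]
found = scan_pages(lambda p: pages[p] if p < len(pages) else [], "1", page_size=1)
```

Checking `len(collection) < page_size` avoids one extra empty-page request per collection, which matters when four lifecycle endpoints are scanned per lookup.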
@@ -6,7 +6,6 @@ Adapted from OmniSync for BMC Hub timetracking use cases
 
 import logging
 import json
-import asyncio
 from typing import Dict, Optional, List
 from datetime import datetime
 import aiohttp
@@ -78,7 +77,7 @@ Response format (JSON only, no other text):
 IMPORTANT: Return ONLY the JSON object. Do not include any explanation, thinking, or additional text."""
 
     def _build_email_context(self, email_data: Dict) -> str:
-        """Build email context for AI classification (email body only - fast)"""
+        """Build email context for AI analysis"""
 
         subject = email_data.get('subject', '')
         sender = email_data.get('sender_email', '')
@@ -88,17 +87,9 @@ IMPORTANT: Return ONLY the JSON object. Do not include any explanation, thinking
         if len(body) > 2000:
             body = body[:2000] + "... [truncated]"
 
-        # Also note if PDF attachments exist (helps classification even without reading them)
-        attachments = email_data.get('attachments', [])
-        pdf_filenames = [a.get('filename', '') for a in attachments
-                         if a.get('filename', '').lower().endswith('.pdf')]
-        attachment_note = ''
-        if pdf_filenames:
-            attachment_note = f"\n\nVedhæftede filer: {', '.join(pdf_filenames)}"
-
         context = f"""**Email Information:**
 From: {sender}
-Subject: {subject}{attachment_note}
+Subject: {subject}
 
 **Email Body:**
 {body}
@@ -107,116 +98,6 @@ Klassificer denne email."""
 
         return context
 
-    def _extract_pdf_texts_from_attachments(self, email_data: Dict) -> List[str]:
-        """Extract text from PDF attachments in email_data (in-memory bytes)"""
-        pdf_texts = []
-        attachments = email_data.get('attachments', [])
-        for att in attachments:
-            filename = att.get('filename', '')
-            if not filename.lower().endswith('.pdf'):
-                continue
-            content = att.get('content', b'')
-            if not content:
-                continue
-            try:
-                import pdfplumber
-                import io
-                with pdfplumber.open(io.BytesIO(content)) as pdf:
-                    pages = []
-                    for page in pdf.pages:
-                        text = page.extract_text(layout=True, x_tolerance=2, y_tolerance=2)
-                        if text:
-                            pages.append(text)
-                    if pages:
-                        pdf_texts.append(f"=== PDF: {filename} ===\n" + "\n".join(pages))
-                        logger.info(f"📄 Extracted PDF text from attachment {filename} ({len(pages)} pages)")
-            except Exception as e:
-                logger.warning(f"⚠️ Could not extract PDF text from {filename}: {e}")
-        return pdf_texts
-
-    def _get_attachment_texts_from_db(self, email_id: int) -> List[str]:
-        """Fetch PDF attachment text from DB (content_data column) for already-saved emails"""
-        from pathlib import Path
-        pdf_texts = []
-        try:
-            attachments = execute_query(
-                """SELECT filename, content_data, file_path
-                   FROM email_attachments
-                   WHERE email_id = %s AND filename ILIKE '%.pdf'""",
-                (email_id,)
-            )
-            for att in (attachments or []):
-                filename = att.get('filename', 'unknown.pdf')
-                content = None
-                # Prefer content_data (bytes in DB)
-                if att.get('content_data'):
-                    content = bytes(att['content_data'])
-                # Fallback: read from disk
-                elif att.get('file_path'):
-                    fp = Path(att['file_path'])
-                    if fp.exists():
-                        content = fp.read_bytes()
-                if not content:
-                    continue
-                try:
-                    import pdfplumber
-                    import io
-                    with pdfplumber.open(io.BytesIO(content)) as pdf:
-                        pages = []
-                        for page in pdf.pages:
-                            text = page.extract_text(layout=True, x_tolerance=2, y_tolerance=2)
-                            if text:
-                                pages.append(text)
-                        if pages:
-                            pdf_texts.append(f"=== PDF: {filename} ===\n" + "\n".join(pages))
-                            logger.info(f"📄 Extracted PDF text from DB for {filename} ({len(pages)} pages)")
-                except Exception as e:
-                    logger.warning(f"⚠️ Could not extract PDF text for {filename} from DB: {e}")
-        except Exception as e:
-            logger.error(f"❌ Error fetching attachment texts from DB for email {email_id}: {e}")
-        return pdf_texts
-
-    def _build_invoice_extraction_context(self, email_data: Dict) -> str:
-        """Build extraction context with PDF as PRIMARY data source.
-
-        Email body/sender are ignored for invoice data — only the attached PDF counts.
-        Sender can be a forwarder or external bookkeeper, not the actual vendor.
-        """
-        subject = email_data.get('subject', '')
-        body = email_data.get('body_text', '') or ''
-        # Keep body brief — it's secondary context at best
-        if len(body) > 300:
-            body = body[:300] + "..."
-
-        # 1. Try in-memory attachment bytes first (during initial fetch)
-        pdf_texts = self._extract_pdf_texts_from_attachments(email_data)
-
-        # 2. Fallback: load from DB for already-processed emails
-        if not pdf_texts and email_data.get('id'):
-            pdf_texts = self._get_attachment_texts_from_db(email_data['id'])
-
-        if pdf_texts:
-            pdf_section = "\n\n".join(pdf_texts)
-            return f"""VEDHÆFTET FAKTURA (primær datakilde - analyser grundigt):
-{pdf_section}
-
----
-Email emne: {subject}
-Email tekst (sekundær): {body}
-
-VIGTIGT: Udtrækket SKAL baseres på PDF-indholdet ovenfor.
-Afsenderens email-adresse er IKKE leverandøren — leverandøren fremgår af fakturaen."""
-        else:
-            # No PDF found — fall back to email body
-            logger.warning(f"⚠️ No PDF attachment found for email {email_data.get('id')} — using email body only")
-            body_full = email_data.get('body_text', '') or email_data.get('body_html', '') or ''
-            if len(body_full) > 3000:
-                body_full = body_full[:3000] + "..."
-            return f"""Email emne: {subject}
-Email tekst:
-{body_full}
-
-Ingen PDF vedhæftet — udtræk fakturadata fra email-teksten."""
-
     async def _call_ollama(self, system_prompt: str, user_message: str) -> Optional[Dict]:
         """Call Ollama API for classification"""
 
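The decision logic of the removed `_build_invoice_extraction_context`, PDF text first, email body only as a fallback, can be sketched without the pdfplumber and database plumbing (English section labels stand in for the Danish originals; the function name is illustrative):

```python
def build_extraction_context(subject: str, body: str, pdf_texts: list) -> str:
    """Prefer extracted PDF text as the primary source; fall back to the email body."""
    if pdf_texts:
        # Body is secondary context at best, so keep it brief
        brief = body[:300] + "..." if len(body) > 300 else body
        return ("INVOICE PDF (primary source):\n" + "\n\n".join(pdf_texts)
                + f"\n---\nSubject: {subject}\nBody (secondary): {brief}")
    full = body[:3000] + "..." if len(body) > 3000 else body
    return f"Subject: {subject}\nBody:\n{full}"

ctx = build_extraction_context("Faktura 1001", "se vedhæftet",
                               ["Fakturanr: 1001\nTotal: 5.965,18 kr"])
```

The asymmetric truncation (300 vs 3000 chars) reflects the trust ordering: when a PDF exists, the body is nearly irrelevant; when it does not, the body is all the model gets.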
@@ -398,9 +279,9 @@ Ingen PDF vedhæftet — udtræk fakturadata fra email-teksten."""
             logger.info(f"✅ Using cached extraction for email {email_data['id']}")
             return cached_result
 
-        # Build extraction prompt — use PDF-first context, not email sender
+        # Build extraction prompt
         system_prompt = self._build_extraction_prompt()
-        user_message = self._build_invoice_extraction_context(email_data)
+        user_message = self._build_email_context(email_data)
 
         # Call Ollama
         result = await self._call_ollama_extraction(system_prompt, user_message)
@@ -413,61 +294,39 @@ Ingen PDF vedhæftet — udtræk fakturadata fra email-teksten."""
         return None
 
     def _build_extraction_prompt(self) -> str:
-        """Build comprehensive Danish system prompt for deep invoice data extraction."""
-        from app.core.config import settings as cfg
-        own_cvr = getattr(cfg, 'OWN_CVR', '')
-        return f"""Du er en ekspert i at læse og udtrække strukturerede data fra danske fakturaer og kreditnotaer.
-
-DU SKAL ANALYSERE SELVE FAKTURAEN (PDF-indholdet) - IKKE email-afsenderen.
-Afsender kan være os selv der videresender, eller en ekstern bogholder - IGNORER afsender.
-Leverandørens navn og CVR fremgår ALTID af selve fakturadokumentet.
-
-VIGTIGE REGLER:
-1. Returner KUN gyldig JSON - ingen forklaring eller ekstra tekst
-2. Hvis et felt ikke findes, sæt det til null
-3. Datoer skal være i format YYYY-MM-DD
-4. DANSKE PRISFORMATER:
-   - Tusind-separator: . (punkt) eller mellemrum: "5.965,18" eller "5 965,18"
-   - Decimal-separator: , (komma): "1.234,56 kr"
-   - I JSON: brug . (punkt) som decimal: 1234.56
-   - Eksempel: "5.965,18 kr" → 5965.18
-5. CVR-nummer: 8 cifre uden mellemrum
-   - IGNORER CVR {own_cvr} — det er VORES eget CVR (køber), ikke leverandørens!
-   - Find LEVERANDØRENS CVR i toppen/headeren af fakturaen
-6. DOKUMENTTYPE:
-   - "invoice" = Almindelig faktura
-   - "credit_note" = Kreditnota (Kreditnota, Refusion, Tilbagebetaling, Credit Note)
-7. Varelinjer: udtræk ALLE linjer med beskrivelse, antal, enhedspris, total
-
-JSON STRUKTUR:
-{{
-    "document_type": "invoice" eller "credit_note",
-    "invoice_number": "fakturanummer",
-    "vendor_name": "leverandørens firmanavn",
-    "vendor_cvr": "12345678",
-    "invoice_date": "YYYY-MM-DD",
-    "due_date": "YYYY-MM-DD",
-    "currency": "DKK",
-    "total_amount": 1234.56,
-    "vat_amount": 246.91,
-    "net_amount": 987.65,
-    "order_number": "ordrenummer eller null",
-    "original_invoice_reference": "ref til original faktura (kun kreditnotaer) eller null",
-    "lines": [
-        {{
-            "line_number": 1,
-            "description": "varebeskrivelse",
-            "quantity": 2.0,
-            "unit_price": 500.00,
-            "line_total": 1000.00,
-            "vat_rate": 25.00,
-            "vat_amount": 250.00
-        }}
-    ],
-    "confidence": 0.95
-}}
-
-Returner KUN JSON - ingen anden tekst."""
+        """Build Danish system prompt for invoice data extraction"""
+        return """Du er en ekspert i at udtrække struktureret data fra danske fakturaer.
+
+Din opgave er at finde og udtrække følgende information fra emailen:
+
+**Felter at udtrække:**
+- `invoice_number` (string) - Fakturanummer
+- `amount` (decimal) - Fakturabeløb i DKK (uden valutasymbol)
+- `due_date` (string YYYY-MM-DD) - Forfaldsdato
+- `vendor_name` (string) - Leverandørens navn
+- `order_number` (string) - Ordrenummer (hvis angivet)
+- `cvr_number` (string) - CVR-nummer (hvis angivet)
+
+**Vigtige regler:**
+- Hvis et felt ikke findes, brug `null`
+- Beløb skal være numerisk (uden "kr", "DKK" osv.)
+- Datoer skal være i formatet YYYY-MM-DD
+- Vær præcis - returner kun data du er sikker på
+
+**Output format (JSON):**
+```json
+{
+    "invoice_number": "INV-2024-001",
+    "amount": 5250.00,
|
||||||
|
"due_date": "2025-01-15",
|
||||||
|
"vendor_name": "Acme Leverandør A/S",
|
||||||
|
"order_number": "ORD-123",
|
||||||
|
"cvr_number": "12345678"
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
Returner KUN JSON - ingen anden tekst.
|
||||||
|
"""
|
||||||
|
|
||||||
async def _call_ollama_extraction(self, system_prompt: str, user_message: str) -> Optional[Dict]:
|
async def _call_ollama_extraction(self, system_prompt: str, user_message: str) -> Optional[Dict]:
|
||||||
"""Call Ollama for data extraction"""
|
"""Call Ollama for data extraction"""
|
||||||
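Aside: rule 4 of the main-branch prompt (Danish price formats) amounts to a small normalization step before the value can live as a JSON float. A standalone sketch of that conversion, not code from either branch:

```python
def parse_danish_amount(text: str) -> float:
    """Convert a Danish-formatted amount such as '5.965,18 kr' to 5965.18.

    Danish uses '.' or a space as thousand separator and ',' as decimal
    separator; JSON wants a plain float with '.' as the decimal point.
    """
    cleaned = text.replace("DKK", "").replace("kr", "").strip()
    cleaned = cleaned.replace(".", "").replace(" ", "")  # drop thousand separators
    cleaned = cleaned.replace(",", ".")                  # decimal comma -> decimal point
    return float(cleaned)

print(parse_danish_amount("5.965,18 kr"))  # 5965.18
print(parse_danish_amount("5 965,18"))     # 5965.18
```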
@@ -481,23 +340,20 @@ Returner KUN JSON - ingen anden tekst."""
                 {"role": "user", "content": user_message}
             ],
             "stream": False,
-            "format": "json",
             "options": {
-                "temperature": 0.0,  # Deterministic extraction
-                "num_predict": 3000  # Enough for full invoice with many lines
+                "temperature": 0.0,  # Zero temperature for deterministic extraction
+                "num_predict": 300
             }
         }

         try:
             async with aiohttp.ClientSession() as session:
-                async with session.post(url, json=payload, timeout=aiohttp.ClientTimeout(total=120)) as response:
+                async with session.post(url, json=payload, timeout=aiohttp.ClientTimeout(total=30)) as response:
                     if response.status != 200:
                         return None

                     data = await response.json()
-                    msg = data.get('message', {})
-                    # qwen3 sometimes returns content in 'thinking' field
-                    content = msg.get('content', '') or msg.get('thinking', '')
+                    content = data.get('message', {}).get('content', '')

                     # Parse JSON response
                     result = self._parse_extraction_response(content)
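Aside: the request both branches send to Ollama's `/api/chat` has the same overall shape; a sketch of the payload builder (the model name is illustrative, and `"format": "json"` plus `num_predict=3000` exist only on the main branch):

```python
def build_extraction_payload(model: str, system_prompt: str, user_message: str) -> dict:
    """Build an Ollama /api/chat request body for structured extraction."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
        "format": "json",        # constrain the model to emit valid JSON
        "options": {
            "temperature": 0.0,  # deterministic extraction
            "num_predict": 3000, # room for invoices with many lines
        },
    }

payload = build_extraction_payload("qwen3", "system prompt", "email context")
```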
@@ -49,7 +49,6 @@ class EmailProcessorService:
             'fetched': 0,
             'saved': 0,
             'classified': 0,
-            'awaiting_user_action': 0,
             'rules_matched': 0,
             'errors': 0
         }
@@ -87,8 +86,6 @@ class EmailProcessorService:

                 if result.get('classified'):
                     stats['classified'] += 1
-                if result.get('awaiting_user_action'):
-                    stats['awaiting_user_action'] += 1
                 if result.get('rules_matched'):
                     stats['rules_matched'] += 1

@@ -112,7 +109,6 @@ class EmailProcessorService:
         email_id = email_data.get('id')
         stats = {
             'classified': False,
-            'awaiting_user_action': False,
             'workflows_executed': 0,
             'rules_matched': False
         }
@@ -128,29 +124,6 @@ class EmailProcessorService:
                 await self._classify_and_update(email_data)
                 stats['classified'] = True

-            # Step 3.5: Gate automation by manual-approval policy and confidence
-            # Phase-1 policy: suggestions are generated automatically, actions are user-approved.
-            classification = (email_data.get('classification') or '').strip().lower()
-            confidence = float(email_data.get('confidence_score') or 0.0)
-            require_manual_approval = getattr(settings, 'EMAIL_REQUIRE_MANUAL_APPROVAL', True)
-            has_helpdesk_hint = email_workflow_service.has_helpdesk_routing_hint(email_data)
-
-            if has_helpdesk_hint:
-                logger.info(
-                    "🧵 Email %s has SAG/thread hint; bypassing manual approval gate for auto-routing",
-                    email_id,
-                )
-
-            if require_manual_approval and not has_helpdesk_hint:
-                await self._set_awaiting_user_action(email_id, reason='manual_approval_required')
-                stats['awaiting_user_action'] = True
-                return stats
-
-            if (not classification or confidence < settings.EMAIL_AI_CONFIDENCE_THRESHOLD) and not has_helpdesk_hint:
-                await self._set_awaiting_user_action(email_id, reason='low_confidence')
-                stats['awaiting_user_action'] = True
-                return stats
-
             # Step 4: Execute workflows based on classification
             workflow_processed = False
             if hasattr(settings, 'EMAIL_WORKFLOWS_ENABLED') and settings.EMAIL_WORKFLOWS_ENABLED:
@@ -200,25 +173,6 @@ class EmailProcessorService:
             logger.error(f"❌ Error in process_single_email for {email_id}: {e}")
             raise e

-    async def _set_awaiting_user_action(self, email_id: Optional[int], reason: str):
-        """Park an email for manual review before any automatic routing/action."""
-        if not email_id:
-            return
-
-        execute_update(
-            """
-            UPDATE email_messages
-            SET status = 'awaiting_user_action',
-                folder = COALESCE(folder, 'INBOX'),
-                auto_processed = false,
-                processed_at = NULL,
-                updated_at = CURRENT_TIMESTAMP
-            WHERE id = %s
-            """,
-            (email_id,)
-        )
-        logger.info("🛑 Email %s moved to awaiting_user_action (%s)", email_id, reason)
-
     async def _classify_and_update(self, email_data: Dict):
         """Classify email and update database"""
         try:
@@ -286,39 +240,24 @@ class EmailProcessorService:
             logger.error(f"❌ Classification failed for email {email_data['id']}: {e}")

     async def _update_extracted_fields(self, email_id: int, extraction: Dict):
-        """Update email with extracted invoice fields (from PDF attachment analysis)"""
+        """Update email with extracted invoice fields"""
         try:
-            # Normalize amount field (new extraction uses total_amount, old used amount)
-            amount = extraction.get('total_amount') or extraction.get('amount')
-
             query = """
                 UPDATE email_messages
                 SET extracted_invoice_number = %s,
                     extracted_amount = %s,
-                    extracted_due_date = %s,
-                    extracted_vendor_name = %s,
-                    extracted_vendor_cvr = %s,
-                    extracted_invoice_date = %s
+                    extracted_due_date = %s
                 WHERE id = %s
             """

             execute_query(query, (
                 extraction.get('invoice_number'),
-                amount,
+                extraction.get('amount'),
                 extraction.get('due_date'),
-                extraction.get('vendor_name'),
-                extraction.get('vendor_cvr'),
-                extraction.get('invoice_date'),
                 email_id
             ))

-            logger.info(
-                f"✅ Updated extracted fields for email {email_id}: "
-                f"invoice={extraction.get('invoice_number')}, "
-                f"vendor={extraction.get('vendor_name')}, "
-                f"cvr={extraction.get('vendor_cvr')}, "
-                f"amount={amount}"
-            )
+            logger.info(f"✅ Updated extracted fields for email {email_id}")

         except Exception as e:
             logger.error(f"❌ Error updating extracted fields: {e}")
@@ -11,15 +11,11 @@ import email
 from email.header import decode_header
 from email.mime.text import MIMEText
 from email.mime.multipart import MIMEMultipart
-from email.mime.base import MIMEBase
-from email import encoders
-from typing import List, Dict, Optional, Tuple, Any
+from typing import List, Dict, Optional, Tuple
 from datetime import datetime
 import json
 import asyncio
 import base64
-import re
-from uuid import uuid4

 # Try to import aiosmtplib, but don't fail if not available
 try:
@@ -59,186 +55,6 @@ class EmailService:
             'user_email': settings.GRAPH_USER_EMAIL
         }

-    def _graph_send_available(self) -> bool:
-        return bool(
-            self.use_graph
-            and self.graph_config.get('tenant_id')
-            and self.graph_config.get('client_id')
-            and self.graph_config.get('client_secret')
-            and self.graph_config.get('user_email')
-        )
-
-    async def _send_via_graph(
-        self,
-        to_addresses: List[str],
-        subject: str,
-        body_text: str,
-        body_html: Optional[str] = None,
-        cc: Optional[List[str]] = None,
-        bcc: Optional[List[str]] = None,
-        reply_to: Optional[str] = None,
-        in_reply_to: Optional[str] = None,
-        references: Optional[str] = None,
-        attachments: Optional[List[Dict]] = None,
-    ) -> Tuple[bool, str, Optional[Dict[str, str]]]:
-        """Send email via Microsoft Graph sendMail endpoint."""
-
-        access_token = await self._get_graph_access_token()
-        if not access_token:
-            return False, "Graph token acquisition failed", None
-
-        def _recipient(addr: str) -> Dict:
-            return {"emailAddress": {"address": addr}}
-
-        message: Dict = {
-            "subject": subject,
-            "body": {
-                "contentType": "HTML" if body_html else "Text",
-                "content": body_html or body_text,
-            },
-            "toRecipients": [_recipient(addr) for addr in (to_addresses or [])],
-        }
-
-        if cc:
-            message["ccRecipients"] = [_recipient(addr) for addr in cc]
-        if bcc:
-            message["bccRecipients"] = [_recipient(addr) for addr in bcc]
-        if reply_to:
-            message["replyTo"] = [_recipient(reply_to)]
-
-        # Microsoft Graph only allows custom internet headers prefixed with x-.
-        # Standard headers like In-Reply-To/References are rejected with
-        # InvalidInternetMessageHeader, so only attach safe diagnostic metadata.
-        headers = []
-        if in_reply_to:
-            headers.append({"name": "x-bmc-in-reply-to", "value": in_reply_to[:900]})
-        if references:
-            headers.append({"name": "x-bmc-references", "value": references[:900]})
-        if headers:
-            message["internetMessageHeaders"] = headers
-
-        graph_attachments = []
-        for attachment in (attachments or []):
-            content = attachment.get("content")
-            if not content:
-                continue
-            graph_attachments.append({
-                "@odata.type": "#microsoft.graph.fileAttachment",
-                "name": attachment.get("filename") or "attachment.bin",
-                "contentType": attachment.get("content_type") or "application/octet-stream",
-                "contentBytes": base64.b64encode(content).decode("ascii"),
-            })
-        if graph_attachments:
-            message["attachments"] = graph_attachments
-
-        url = f"https://graph.microsoft.com/v1.0/users/{self.graph_config['user_email']}/sendMail"
-        request_body = {
-            "message": message,
-            "saveToSentItems": True,
-        }
-        request_headers = {
-            "Authorization": f"Bearer {access_token}",
-            "Content-Type": "application/json",
-        }
-
-        try:
-            async with ClientSession() as session:
-                async with session.post(url, headers=request_headers, json=request_body) as response:
-                    if response.status in (200, 202):
-                        metadata = None
-                        try:
-                            metadata = await self._find_recent_sent_graph_message(
-                                access_token=access_token,
-                                subject=subject,
-                                to_addresses=to_addresses,
-                            )
-                        except Exception as metadata_error:
-                            logger.warning(
-                                "⚠️ Graph send succeeded but SentItems metadata lookup failed: %s",
-                                metadata_error,
-                            )
-                        return True, f"Email sent to {len(to_addresses)} recipient(s) via Graph", metadata
-
-                    error_text = await response.text()
-                    logger.error("❌ Graph send failed: status=%s body=%s", response.status, error_text)
-                    return False, f"Graph send failed ({response.status}): {error_text[:300]}", None
-        except Exception as e:
-            return False, f"Graph send exception: {str(e)}", None
-
-    def _recipient_addresses_match(self, graph_recipients: Optional[List[Dict[str, Any]]], to_addresses: List[str]) -> bool:
-        if not to_addresses:
-            return True
-
-        expected = {addr.strip().lower() for addr in (to_addresses or []) if addr}
-        if not expected:
-            return True
-
-        actual = set()
-        for recipient in graph_recipients or []:
-            address = (
-                recipient.get("emailAddress", {}).get("address")
-                if isinstance(recipient, dict)
-                else None
-            )
-            if address:
-                actual.add(str(address).strip().lower())
-
-        return bool(actual) and expected.issubset(actual)
-
-    async def _find_recent_sent_graph_message(
-        self,
-        access_token: str,
-        subject: str,
-        to_addresses: List[str],
-    ) -> Optional[Dict[str, str]]:
-        """Best-effort lookup for the most recent sent Graph message metadata."""
-        try:
-            user_email = self.graph_config['user_email']
-            url = f"https://graph.microsoft.com/v1.0/users/{user_email}/mailFolders/SentItems/messages"
-            params = {
-                '$top': 15,
-                '$orderby': 'sentDateTime desc',
-                '$select': 'id,subject,toRecipients,internetMessageId,conversationId,sentDateTime'
-            }
-            headers = {
-                'Authorization': f'Bearer {access_token}',
-                'Content-Type': 'application/json',
-            }
-
-            async with ClientSession() as session:
-                async with session.get(url, params=params, headers=headers) as response:
-                    if response.status != 200:
-                        logger.warning("⚠️ Could not read SentItems metadata (status=%s)", response.status)
-                        return None
-
-                    payload = await response.json()
-                    messages = payload.get('value') or []
-                    normalized_subject = (subject or '').strip().lower()
-
-                    for msg in messages:
-                        candidate_subject = str(msg.get('subject') or '').strip().lower()
-                        if normalized_subject and candidate_subject != normalized_subject:
-                            continue
-                        if not self._recipient_addresses_match(msg.get('toRecipients'), to_addresses):
-                            continue
-
-                        internet_message_id = self._normalize_message_id_value(msg.get('internetMessageId'))
-                        conversation_id = self._normalize_message_id_value(msg.get('conversationId'))
-                        if internet_message_id or conversation_id:
-                            logger.info(
-                                "🧵 Matched sent Graph metadata (conversationId=%s, messageId=%s)",
-                                conversation_id,
-                                internet_message_id,
-                            )
-                            return {
-                                'internet_message_id': internet_message_id,
-                                'conversation_id': conversation_id,
-                            }
-        except Exception as e:
-            logger.warning("⚠️ Failed to resolve sent Graph metadata: %s", e)
-
-        return None
-
     async def fetch_new_emails(self, limit: int = 50) -> List[Dict]:
         """
         Fetch new emails from configured source (IMAP or Graph API)
@@ -353,7 +169,7 @@ class EmailService:
             params = {
                 '$top': limit,
                 '$orderby': 'receivedDateTime desc',
-                '$select': 'id,subject,from,toRecipients,ccRecipients,receivedDateTime,bodyPreview,body,hasAttachments,internetMessageId,conversationId,internetMessageHeaders'
+                '$select': 'id,subject,from,toRecipients,ccRecipients,receivedDateTime,bodyPreview,body,hasAttachments,internetMessageId'
             }

             headers = {
@@ -396,12 +212,6 @@ class EmailService:
                         logger.info(f"✅ New email: {parsed_email['subject'][:50]}... from {parsed_email['sender_email']}")
                     else:
                         logger.debug(f"⏭️ Email already exists: {parsed_email['message_id']}")
-                        # Re-save attachment bytes for existing emails (fills content_data for old emails)
-                        if parsed_email.get('attachments'):
-                            await self._resave_attachment_content(
-                                parsed_email['message_id'],
-                                parsed_email['attachments']
-                            )

                 except Exception as e:
                     logger.error(f"❌ Error parsing Graph message: {e}")
@@ -460,8 +270,6 @@ class EmailService:

             # Get message ID
             message_id = msg.get('Message-ID', f"imap-{email_id}")
-            in_reply_to = msg.get('In-Reply-To', '')
-            email_references = msg.get('References', '')

             # Get date
             date_str = msg.get('Date', '')
@@ -535,8 +343,6 @@ class EmailService:

         return {
             'message_id': message_id,
-            'in_reply_to': in_reply_to,
-            'email_references': email_references,
             'subject': subject,
             'sender_name': sender_name,
             'sender_email': sender_email,
@@ -579,26 +385,8 @@ class EmailService:
         received_date_str = msg.get('receivedDateTime', '')
         received_date = datetime.fromisoformat(received_date_str.replace('Z', '+00:00')) if received_date_str else datetime.now()

-        headers = msg.get('internetMessageHeaders') or []
-        in_reply_to = None
-        references = None
-        for header in headers:
-            name = str(header.get('name') or '').strip().lower()
-            value = str(header.get('value') or '').strip()
-            if not value:
-                continue
-            if name == 'in-reply-to':
-                in_reply_to = value
-            elif name == 'references':
-                references = value
-
-        conversation_id = self._normalize_message_id_value(msg.get('conversationId'))
-
         return {
             'message_id': msg.get('internetMessageId', msg.get('id', '')),
-            'in_reply_to': in_reply_to,
-            'email_references': references,
-            'thread_key': conversation_id,
             'subject': msg.get('subject', ''),
             'sender_name': sender_name,
             'sender_email': sender_email,
@@ -707,46 +495,6 @@ class EmailService:
             # Just email address
             return ("", header.strip())

-    def _normalize_message_id_value(self, value: Optional[str]) -> Optional[str]:
-        """Normalize message-id like tokens for stable thread matching."""
-        if not value:
-            return None
-        normalized = str(value).strip().strip("<>").lower()
-        normalized = "".join(normalized.split())
-        return normalized or None
-
-    def _extract_reference_ids(self, raw_references: Optional[str]) -> List[str]:
-        if not raw_references:
-            return []
-        refs: List[str] = []
-        for token in re.split(r"[\s,]+", str(raw_references).strip()):
-            normalized = self._normalize_message_id_value(token)
-            if normalized:
-                refs.append(normalized)
-        return list(dict.fromkeys(refs))
-
-    def _derive_thread_key(self, email_data: Dict) -> Optional[str]:
-        """
-        Derive a stable conversation key.
-        Priority:
-        1) First References token (root message id)
-        2) In-Reply-To
-        3) Message-ID
-        """
-        explicit_thread_key = self._normalize_message_id_value(email_data.get("thread_key"))
-        if explicit_thread_key:
-            return explicit_thread_key
-
-        reference_ids = self._extract_reference_ids(email_data.get("email_references"))
-        if reference_ids:
-            return reference_ids[0]
-
-        in_reply_to = self._normalize_message_id_value(email_data.get("in_reply_to"))
-        if in_reply_to:
-            return in_reply_to
-
-        return self._normalize_message_id_value(email_data.get("message_id"))
-
     def _parse_email_date(self, date_str: str) -> datetime:
         """Parse email date header into datetime object"""
         if not date_str:
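Aside: the thread-key logic removed in the hunk above (main branch's `_normalize_message_id_value` / `_derive_thread_key`) can be summarized as a simplified standalone sketch, with the methods recast as free functions for illustration:

```python
import re

def normalize_message_id(value):
    """Lower-case, strip angle brackets, and drop whitespace from a Message-ID token."""
    if not value:
        return None
    normalized = str(value).strip().strip("<>").lower()
    normalized = "".join(normalized.split())
    return normalized or None

def derive_thread_key(email_data: dict):
    """Stable conversation key: References root, else In-Reply-To, else Message-ID."""
    explicit = normalize_message_id(email_data.get("thread_key"))
    if explicit:
        return explicit
    tokens = re.split(r"[\s,]+", str(email_data.get("email_references") or "").strip())
    refs = [r for r in (normalize_message_id(t) for t in tokens) if r]
    if refs:
        return refs[0]  # first References token is the thread root
    return (normalize_message_id(email_data.get("in_reply_to"))
            or normalize_message_id(email_data.get("message_id")))
```

For example, a reply carrying `References: <root@x> <child@x>` keys to `root@x`, so every message in the conversation lands in the same thread.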
@ -769,62 +517,29 @@ class EmailService:
|
|||||||
async def save_email(self, email_data: Dict) -> Optional[int]:
|
async def save_email(self, email_data: Dict) -> Optional[int]:
|
||||||
"""Save email to database"""
|
"""Save email to database"""
|
||||||
try:
|
try:
|
||||||
thread_key = self._derive_thread_key(email_data)
|
query = """
|
||||||
|
INSERT INTO email_messages
|
||||||
|
(message_id, subject, sender_email, sender_name, recipient_email, cc,
|
||||||
|
body_text, body_html, received_date, folder, has_attachments, attachment_count,
|
||||||
|
status, is_read)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 'new', false)
|
||||||
|
RETURNING id
|
||||||
|
"""
|
||||||
|
|
||||||
try:
|
email_id = execute_insert(query, (
|
||||||
query = """
|
email_data['message_id'],
|
||||||
INSERT INTO email_messages
|
email_data['subject'],
|
||||||
(message_id, subject, sender_email, sender_name, recipient_email, cc,
|
email_data['sender_email'],
|
||||||
body_text, body_html, received_date, folder, has_attachments, attachment_count,
|
email_data['sender_name'],
|
||||||
in_reply_to, email_references, thread_key,
|
email_data['recipient_email'],
|
||||||
status, is_read)
|
email_data['cc'],
|
||||||
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 'new', false)
|
email_data['body_text'],
|
||||||
RETURNING id
|
email_data['body_html'],
|
||||||
"""
|
email_data['received_date'],
|
||||||
email_id = execute_insert(query, (
|
email_data['folder'],
|
||||||
email_data['message_id'],
|
email_data['has_attachments'],
|
||||||
email_data['subject'],
|
email_data['attachment_count']
|
||||||
email_data['sender_email'],
|
))
|
||||||
email_data['sender_name'],
|
|
||||||
email_data['recipient_email'],
|
|
||||||
email_data['cc'],
|
|
||||||
email_data['body_text'],
|
|
||||||
email_data['body_html'],
|
|
||||||
email_data['received_date'],
|
|
||||||
email_data['folder'],
|
|
||||||
email_data['has_attachments'],
|
|
||||||
email_data['attachment_count'],
|
|
||||||
email_data.get('in_reply_to'),
|
|
||||||
email_data.get('email_references'),
|
|
||||||
thread_key,
|
|
||||||
))
|
|
||||||
except Exception:
|
|
||||||
query = """
|
|
||||||
INSERT INTO email_messages
|
|
||||||
(message_id, subject, sender_email, sender_name, recipient_email, cc,
|
|
||||||
body_text, body_html, received_date, folder, has_attachments, attachment_count,
|
|
||||||
in_reply_to, email_references,
|
|
||||||
status, is_read)
|
|
||||||
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 'new', false)
|
|
||||||
RETURNING id
|
|
||||||
"""
|
|
||||||
|
|
||||||
email_id = execute_insert(query, (
|
|
||||||
email_data['message_id'],
|
|
||||||
email_data['subject'],
|
|
||||||
email_data['sender_email'],
|
|
||||||
email_data['sender_name'],
|
|
||||||
email_data['recipient_email'],
|
|
||||||
email_data['cc'],
|
|
||||||
email_data['body_text'],
|
|
||||||
email_data['body_html'],
|
|
||||||
email_data['received_date'],
|
|
||||||
email_data['folder'],
|
|
||||||
email_data['has_attachments'],
|
|
||||||
email_data['attachment_count'],
|
|
||||||
email_data.get('in_reply_to'),
|
|
||||||
email_data.get('email_references')
|
|
||||||
))
|
|
||||||
|
|
||||||
logger.info(f"✅ Saved email {email_id}: {email_data['subject'][:50]}...")
|
logger.info(f"✅ Saved email {email_id}: {email_data['subject'][:50]}...")
|
||||||
|
|
||||||
@ -839,85 +554,47 @@ class EmailService:
|
|||||||
return None
|
return None
|
||||||
|
|
||||||
async def _save_attachments(self, email_id: int, attachments: List[Dict]):
|
async def _save_attachments(self, email_id: int, attachments: List[Dict]):
|
||||||
"""Save email attachments to disk and database (also stores bytes as fallback)"""
|
"""Save email attachments to disk and database"""
|
||||||
import os
|
import os
|
||||||
import hashlib
|
import hashlib
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
|
||||||
# Use absolute path based on UPLOAD_DIR setting
|
# Create uploads directory if not exists
|
||||||
from app.core.config import settings
|
upload_dir = Path("uploads/email_attachments")
|
||||||
upload_dir = Path(settings.UPLOAD_DIR) / "email_attachments"
|
upload_dir.mkdir(parents=True, exist_ok=True)
|
||||||
try:
|
|
||||||
upload_dir.mkdir(parents=True, exist_ok=True)
|
|
||||||
except Exception as e:
|
|
||||||
logger.warning(f"⚠️ Could not create upload dir {upload_dir}: {e}")
|
|
||||||
|
|
||||||
for att in attachments:
|
for att in attachments:
|
||||||
try:
|
try:
|
||||||
filename = att['filename']
|
filename = att['filename']
|
||||||
content = att['content'] # bytes
|
content = att['content']
|
||||||
content_type = att.get('content_type', 'application/octet-stream')
|
                 content_type = att.get('content_type', 'application/octet-stream')
-                size_bytes = att.get('size', len(content) if content else 0)
+                size_bytes = att['size']

-                if not content:
-                    continue

                 # Generate MD5 hash for deduplication
                 md5_hash = hashlib.md5(content).hexdigest()

-                # Try to save to disk
-                file_path_str = None
-                try:
-                    file_path = upload_dir / f"{md5_hash}_(unknown)"
-                    file_path.write_bytes(content)
-                    file_path_str = str(file_path)
-                except Exception as e:
-                    logger.warning(f"⚠️ Could not save attachment to disk ((unknown)): {e}")
+                # Save to disk with hash prefix
+                file_path = upload_dir / f"{md5_hash}_(unknown)"
+                file_path.write_bytes(content)

-                # Save to database — always store content_data as fallback
+                # Save to database
                 query = """
                     INSERT INTO email_attachments
-                    (email_id, filename, content_type, size_bytes, file_path, content_data)
-                    VALUES (%s, %s, %s, %s, %s, %s)
-                    ON CONFLICT DO NOTHING
+                    (email_id, filename, content_type, size_bytes, file_path)
+                    VALUES (%s, %s, %s, %s, %s)
                 """
-                from psycopg2 import Binary
-                execute_query(query, (
+                execute_insert(query, (
                     email_id,
                     filename,
                     content_type,
                     size_bytes,
-                    file_path_str,
-                    Binary(content)
+                    str(file_path)
                 ))

-                logger.info(f"📎 Saved attachment: (unknown) ({size_bytes} bytes, disk={file_path_str is not None})")
+                logger.info(f"📎 Saved attachment: (unknown) ({size_bytes} bytes)")

             except Exception as e:
-                logger.error(f"❌ Failed to save attachment {att.get('filename', '?')}: {e}")
+                logger.error(f"❌ Failed to save attachment (unknown): {e}")

-    async def _resave_attachment_content(self, message_id: str, attachments: List[Dict]):
-        """For existing emails, store attachment bytes in content_data if not already saved"""
-        from psycopg2 import Binary
-        for att in attachments:
-            try:
-                filename = att.get('filename')
-                content = att.get('content')
-                if not filename or not content:
-                    continue
-                query = """
-                    UPDATE email_attachments
-                    SET content_data = %s
-                    WHERE email_id = (
-                        SELECT id FROM email_messages WHERE message_id = %s LIMIT 1
-                    )
-                    AND filename = %s
-                    AND content_data IS NULL
-                """
-                execute_query(query, (Binary(content), message_id, filename))
-                logger.debug(f"💾 Re-saved content_data for attachment: (unknown)")
-            except Exception as e:
-                logger.warning(f"⚠️ Could not re-save content_data for {att.get('filename', '?')}: {e}")

     async def get_unprocessed_emails(self, limit: int = 100) -> List[Dict]:
         """Get emails from database that haven't been processed yet"""
@@ -1040,8 +717,6 @@ class EmailService:

         return {
             "message_id": message_id,
-            "in_reply_to": msg.get("In-Reply-To", ""),
-            "email_references": msg.get("References", ""),
             "subject": msg.get("Subject", "No Subject"),
             "sender_name": sender_name,
             "sender_email": sender_email,
@@ -1107,8 +782,6 @@ class EmailService:

         return {
             "message_id": message_id,
-            "in_reply_to": None,
-            "email_references": None,
             "subject": msg.subject or "No Subject",
             "sender_name": msg.sender or "",
             "sender_email": msg.senderEmail or "",
@@ -1146,70 +819,33 @@ class EmailService:
             return None

         # Insert email
-        thread_key = self._derive_thread_key(email_data)
-        try:
-            query = """
-                INSERT INTO email_messages (
-                    message_id, subject, sender_email, sender_name,
-                    recipient_email, cc, body_text, body_html,
-                    received_date, folder, has_attachments, attachment_count,
-                    in_reply_to, email_references, thread_key,
-                    status, import_method, created_at
-                )
-                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
-                RETURNING id
-            """
-
-            result = execute_insert(query, (
-                email_data["message_id"],
-                email_data["subject"],
-                email_data["sender_email"],
-                email_data["sender_name"],
-                email_data.get("recipient_email", ""),
-                email_data.get("cc", ""),
-                email_data["body_text"],
-                email_data["body_html"],
-                email_data["received_date"],
-                email_data["folder"],
-                email_data["has_attachments"],
-                len(email_data.get("attachments", [])),
-                email_data.get("in_reply_to"),
-                email_data.get("email_references"),
-                thread_key,
-                "new",
-                "manual_upload"
-            ))
-        except Exception:
-            query = """
-                INSERT INTO email_messages (
-                    message_id, subject, sender_email, sender_name,
-                    recipient_email, cc, body_text, body_html,
-                    received_date, folder, has_attachments, attachment_count,
-                    in_reply_to, email_references,
-                    status, import_method, created_at
-                )
-                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
-                RETURNING id
-            """
-
-            result = execute_insert(query, (
-                email_data["message_id"],
-                email_data["subject"],
-                email_data["sender_email"],
-                email_data["sender_name"],
-                email_data.get("recipient_email", ""),
-                email_data.get("cc", ""),
-                email_data["body_text"],
-                email_data["body_html"],
-                email_data["received_date"],
-                email_data["folder"],
-                email_data["has_attachments"],
-                len(email_data.get("attachments", [])),
-                email_data.get("in_reply_to"),
-                email_data.get("email_references"),
-                "new",
-                "manual_upload"
-            ))
+        query = """
+            INSERT INTO email_messages (
+                message_id, subject, sender_email, sender_name,
+                recipient_email, cc, body_text, body_html,
+                received_date, folder, has_attachments, attachment_count,
+                status, import_method, created_at
+            )
+            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
+            RETURNING id
+        """
+
+        result = execute_insert(query, (
+            email_data["message_id"],
+            email_data["subject"],
+            email_data["sender_email"],
+            email_data["sender_name"],
+            email_data.get("recipient_email", ""),
+            email_data.get("cc", ""),
+            email_data["body_text"],
+            email_data["body_html"],
+            email_data["received_date"],
+            email_data["folder"],
+            email_data["has_attachments"],
+            len(email_data.get("attachments", [])),
+            "new",
+            "manual_upload"
+        ))

         if not result:
             logger.error("❌ Failed to insert email - no ID returned")
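The `thread_key` column removed in the hunk above is derived on the main branch by `_derive_thread_key` (its definition appears further down in this diff): an explicit key wins, then the root of the `References` header, then `In-Reply-To`, then the message's own `Message-ID`. A self-contained sketch of that precedence, with module-level function names invented here for illustration:

```python
import re
from typing import Optional


def normalize_message_id(value: Optional[str]) -> Optional[str]:
    # Strip angle brackets and whitespace, lowercase (mirrors _normalize_message_id).
    if not value:
        return None
    normalized = re.sub(r'[<>\s]', '', str(value)).lower().strip()
    return normalized or None


def derive_thread_key(email_data: dict) -> Optional[str]:
    # Precedence: explicit thread_key -> root of References -> In-Reply-To -> Message-ID.
    explicit = normalize_message_id(email_data.get('thread_key'))
    if explicit:
        return explicit
    raw_refs = (email_data.get('email_references') or '').strip()
    for ref in re.split(r'[\s,]+', raw_refs):
        normalized_ref = normalize_message_id(ref)
        if normalized_ref:
            return normalized_ref  # first (root) reference
    in_reply_to = normalize_message_id(email_data.get('in_reply_to'))
    if in_reply_to:
        return in_reply_to
    return normalize_message_id(email_data.get('message_id'))
```

Because the key falls back to the email's own Message-ID, every message in a reply chain collapses onto the root message's id, which is what makes it usable as a stable conversation key.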
@@ -1259,37 +895,14 @@ class EmailService:
             logger.warning(f"🔒 DRY RUN MODE: Would send email to {to_addresses} with subject '{subject}'")
             return True, "Dry run mode - email not actually sent"

-        graph_failure_message: Optional[str] = None
-
-        # Prefer Graph send when Graph integration is enabled/configured.
-        if self._graph_send_available():
-            graph_ok, graph_message = await self._send_via_graph(
-                to_addresses=to_addresses,
-                subject=subject,
-                body_text=body_text,
-                body_html=body_html,
-                cc=cc,
-                bcc=bcc,
-                reply_to=reply_to,
-            )
-            if graph_ok:
-                logger.info("✅ Email sent via Graph to %s recipient(s): %s", len(to_addresses), subject)
-                return True, graph_message
-            graph_failure_message = graph_message
-            logger.warning("⚠️ Graph send failed, falling back to SMTP: %s", graph_message)
-
         # Check if aiosmtplib is available
         if not HAS_AIOSMTPLIB:
             logger.error("❌ aiosmtplib not installed - cannot send email. Install with: pip install aiosmtplib")
-            if graph_failure_message:
-                return False, f"Graph failed: {graph_failure_message}; SMTP fallback unavailable: aiosmtplib not installed"
             return False, "aiosmtplib not installed"

         # Validate SMTP configuration
         if not all([settings.EMAIL_SMTP_HOST, settings.EMAIL_SMTP_USER, settings.EMAIL_SMTP_PASSWORD]):
             logger.error("❌ SMTP not configured - cannot send email")
-            if graph_failure_message:
-                return False, f"Graph failed: {graph_failure_message}; SMTP fallback unavailable: SMTP not configured"
             return False, "SMTP not configured"

         try:
@@ -1337,153 +950,6 @@ class EmailService:
             return True, f"Email sent to {len(to_addresses)} recipient(s)"

         except Exception as e:
-            error_msg = f"❌ SMTP send error: {str(e)}"
+            error_msg = f"❌ Failed to send email: {str(e)}"
             logger.error(error_msg)
-            if graph_failure_message:
-                return False, f"Graph failed: {graph_failure_message}; SMTP fallback failed: {str(e)}"
             return False, error_msg

-    async def send_email_with_attachments(
-        self,
-        to_addresses: List[str],
-        subject: str,
-        body_text: str,
-        body_html: Optional[str] = None,
-        cc: Optional[List[str]] = None,
-        bcc: Optional[List[str]] = None,
-        reply_to: Optional[str] = None,
-        in_reply_to: Optional[str] = None,
-        references: Optional[str] = None,
-        attachments: Optional[List[Dict]] = None,
-        respect_dry_run: bool = True,
-    ) -> Tuple[bool, str, str, Optional[str]]:
-        """Send email and return status, message, message-id, and optional provider thread key."""
-
-        generated_message_id = f"<{uuid4().hex}@bmchub.local>"
-        provider_thread_key: Optional[str] = None
-
-        if respect_dry_run and settings.REMINDERS_DRY_RUN:
-            logger.warning(
-                "🔒 DRY RUN MODE: Would send email to %s with subject '%s'",
-                to_addresses,
-                subject,
-            )
-            return True, "Dry run mode - email not actually sent", generated_message_id, provider_thread_key
-
-        graph_failure_message: Optional[str] = None
-
-        # Prefer Graph send when Graph integration is enabled/configured.
-        if self._graph_send_available():
-            graph_ok, graph_message, graph_metadata = await self._send_via_graph(
-                to_addresses=to_addresses,
-                subject=subject,
-                body_text=body_text,
-                body_html=body_html,
-                cc=cc,
-                bcc=bcc,
-                reply_to=reply_to,
-                in_reply_to=in_reply_to,
-                references=references,
-                attachments=attachments,
-            )
-            if graph_ok:
-                if graph_metadata:
-                    graph_message_id = graph_metadata.get('internet_message_id')
-                    graph_thread_key = graph_metadata.get('conversation_id')
-                    if graph_message_id:
-                        generated_message_id = graph_message_id
-                    if graph_thread_key:
-                        provider_thread_key = graph_thread_key
-                logger.info(
-                    "✅ Email with attachments sent via Graph to %s recipient(s): %s (thread_key=%s)",
-                    len(to_addresses),
-                    subject,
-                    provider_thread_key,
-                )
-                return True, graph_message, generated_message_id, provider_thread_key
-            graph_failure_message = graph_message
-            logger.warning("⚠️ Graph send with attachments failed, falling back to SMTP: %s", graph_message)
-
-        if not HAS_AIOSMTPLIB:
-            logger.error("❌ aiosmtplib not installed - cannot send email. Install with: pip install aiosmtplib")
-            if graph_failure_message:
-                return False, f"Graph failed: {graph_failure_message}; SMTP fallback unavailable: aiosmtplib not installed", generated_message_id, provider_thread_key
-            return False, "aiosmtplib not installed", generated_message_id, provider_thread_key
-
-        if not all([settings.EMAIL_SMTP_HOST, settings.EMAIL_SMTP_USER, settings.EMAIL_SMTP_PASSWORD]):
-            logger.error("❌ SMTP not configured - cannot send email")
-            if graph_failure_message:
-                return False, f"Graph failed: {graph_failure_message}; SMTP fallback unavailable: SMTP not configured", generated_message_id, provider_thread_key
-            return False, "SMTP not configured", generated_message_id, provider_thread_key
-
-        try:
-            msg = MIMEMultipart('mixed')
-            msg['Subject'] = subject
-            msg['From'] = f"{settings.EMAIL_SMTP_FROM_NAME} <{settings.EMAIL_SMTP_FROM_ADDRESS}>"
-            msg['To'] = ', '.join(to_addresses)
-            msg['Message-ID'] = generated_message_id
-
-            if cc:
-                msg['Cc'] = ', '.join(cc)
-            if reply_to:
-                msg['Reply-To'] = reply_to
-            if in_reply_to:
-                msg['In-Reply-To'] = in_reply_to
-            if references:
-                msg['References'] = references
-
-            content_part = MIMEMultipart('alternative')
-            content_part.attach(MIMEText(body_text, 'plain'))
-            if body_html:
-                content_part.attach(MIMEText(body_html, 'html'))
-            msg.attach(content_part)
-
-            for attachment in (attachments or []):
-                content = attachment.get("content")
-                if not content:
-                    continue
-
-                filename = attachment.get("filename") or "attachment.bin"
-                content_type = attachment.get("content_type") or "application/octet-stream"
-                maintype, _, subtype = content_type.partition("/")
-                if not maintype or not subtype:
-                    maintype, subtype = "application", "octet-stream"
-
-                mime_attachment = MIMEBase(maintype, subtype)
-                mime_attachment.set_payload(content)
-                encoders.encode_base64(mime_attachment)
-                mime_attachment.add_header('Content-Disposition', f'attachment; filename="(unknown)"')
-                msg.attach(mime_attachment)
-
-            async with aiosmtplib.SMTP(
-                hostname=settings.EMAIL_SMTP_HOST,
-                port=settings.EMAIL_SMTP_PORT,
-                use_tls=settings.EMAIL_SMTP_USE_TLS
-            ) as smtp:
-                await smtp.login(settings.EMAIL_SMTP_USER, settings.EMAIL_SMTP_PASSWORD)
-
-                all_recipients = to_addresses.copy()
-                if cc:
-                    all_recipients.extend(cc)
-                if bcc:
-                    all_recipients.extend(bcc)
-
-                await smtp.sendmail(
-                    settings.EMAIL_SMTP_FROM_ADDRESS,
-                    all_recipients,
-                    msg.as_string()
-                )
-
-            logger.info(
-                "✅ Email with attachments sent successfully to %s recipient(s): %s",
-                len(to_addresses),
-                subject,
-            )
-            return True, f"Email sent to {len(to_addresses)} recipient(s)", generated_message_id, provider_thread_key
-
-        except Exception as e:
-            error_msg = f"❌ SMTP send error (attachments): {str(e)}"
-            logger.error(error_msg)
-            if graph_failure_message:
-                return False, f"Graph failed: {graph_failure_message}; SMTP fallback failed: {str(e)}", generated_message_id, provider_thread_key
-            return False, error_msg, generated_message_id, provider_thread_key
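As an aside, the MD5-prefixed attachment write shown in the first hunk of this diff can be read as a standalone routine; the function name `save_attachment` and the `upload_dir` parameter below are illustrative, not part of the repository:

```python
import hashlib
from pathlib import Path


def save_attachment(upload_dir: Path, filename: str, content: bytes) -> Path:
    """Write attachment bytes under an MD5-prefixed name so identical
    payloads map onto the same file name (simple deduplication)."""
    md5_hash = hashlib.md5(content).hexdigest()
    path = upload_dir / f"{md5_hash}_{filename}"
    path.write_bytes(content)
    return path
```

Saving the same bytes twice simply rewrites the same path, so repeated IMAP fetches of one message do not pile up duplicate files on disk.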
@@ -27,17 +27,6 @@ class EmailWorkflowService:
     def __init__(self):
         self.enabled = settings.EMAIL_WORKFLOWS_ENABLED if hasattr(settings, 'EMAIL_WORKFLOWS_ENABLED') else True

-    HELPDESK_SKIP_CLASSIFICATIONS = {
-        'invoice',
-        'order_confirmation',
-        'freight_note',
-        'time_confirmation',
-        'newsletter',
-        'spam',
-        'bankruptcy',
-        'recording'
-    }
-
     async def execute_workflows(self, email_data: Dict) -> Dict:
         """
         Execute all matching workflows for an email
@@ -53,22 +42,13 @@ class EmailWorkflowService:
             return {'status': 'disabled', 'workflows_executed': 0}

         email_id = email_data.get('id')
-        classification = (email_data.get('classification') or '').strip().lower()
+        classification = email_data.get('classification')
         confidence = email_data.get('confidence_score', 0.0)
-        has_hint = self.has_helpdesk_routing_hint(email_data)

-        if not email_id:
-            logger.warning("⚠️ Cannot execute workflows: missing email_id")
+        if not email_id or not classification:
+            logger.warning(f"⚠️ Cannot execute workflows: missing email_id or classification")
             return {'status': 'skipped', 'reason': 'missing_data'}

-        if not classification:
-            if has_hint:
-                classification = 'general'
-                email_data['classification'] = classification
-            else:
-                logger.warning("⚠️ Cannot execute workflows: missing classification")
-                return {'status': 'skipped', 'reason': 'missing_data'}
-
         logger.info(f"🔄 Finding workflows for classification: {classification} (confidence: {confidence})")

         results = {
@@ -90,23 +70,6 @@ class EmailWorkflowService:
                 results['workflows_succeeded'] += 1
                 logger.info("✅ Bankruptcy system workflow executed successfully")

-        # Special System Workflow: Helpdesk SAG routing
-        # - If a SAG/thread hint is present => always try routing to the existing case
-        # - Without hints: gate on classification as before
-        should_try_helpdesk = (
-            classification not in self.HELPDESK_SKIP_CLASSIFICATIONS
-            or has_hint
-        )
-
-        if should_try_helpdesk:
-            helpdesk_result = await self._handle_helpdesk_sag_routing(email_data)
-            if helpdesk_result:
-                results['details'].append(helpdesk_result)
-                if helpdesk_result.get('status') == 'completed':
-                    results['workflows_executed'] += 1
-                    results['workflows_succeeded'] += 1
-                    logger.info("✅ Helpdesk SAG routing workflow executed")
-
         # Find matching workflows
         workflows = await self._find_matching_workflows(email_data)
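The gating removed in the hunk above combines the skip-list with an override for explicit routing hints; it can be sketched as a small predicate, where `HELPDESK_SKIP_CLASSIFICATIONS` matches the set defined earlier in this diff and `has_hint` stands in for `has_helpdesk_routing_hint(email_data)`:

```python
HELPDESK_SKIP_CLASSIFICATIONS = {
    'invoice', 'order_confirmation', 'freight_note', 'time_confirmation',
    'newsletter', 'spam', 'bankruptcy', 'recording',
}


def should_try_helpdesk(classification: str, has_hint: bool) -> bool:
    # Route to the helpdesk unless the classification is on the skip list,
    # but an explicit SAG/thread hint always wins over the skip list.
    return classification not in HELPDESK_SKIP_CLASSIFICATIONS or has_hint
```

The effect is that a reply carrying a SAG tag or thread header still reaches its existing case even when the classifier labels it as, say, spam.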
@@ -213,391 +176,6 @@ class EmailWorkflowService:
             'customer_name': first_match['name']
         }

-    def _extract_sender_domain(self, email_data: Dict) -> Optional[str]:
-        sender_email = (email_data.get('sender_email') or '').strip().lower()
-        if '@' not in sender_email:
-            return None
-        domain = sender_email.split('@', 1)[1].strip()
-        if domain.startswith('www.'):
-            domain = domain[4:]
-        return domain or None
-
-    def has_helpdesk_routing_hint(self, email_data: Dict) -> bool:
-        """Return True when email has explicit routing hints (SAG or thread headers/key)."""
-        if self._extract_sag_id(email_data):
-            return True
-
-        explicit_thread_key = self._normalize_message_id(email_data.get('thread_key'))
-        if explicit_thread_key:
-            return True
-
-        if self._normalize_message_id(email_data.get('in_reply_to')):
-            return True
-
-        if self._extract_reference_message_ids(email_data.get('email_references')):
-            return True
-
-        return False
-
-    def _extract_sag_id(self, email_data: Dict) -> Optional[int]:
-        candidates = [
-            email_data.get('subject') or '',
-            email_data.get('in_reply_to') or '',
-            email_data.get('email_references') or '',
-            email_data.get('body_text') or '',
-            email_data.get('body_html') or '',
-        ]
-
-        # Accept both strict and human variants used in real subjects, e.g.:
-        # - SAG-53
-        # - SAG #53
-        # - Sag 53
-        sag_patterns = [
-            r'\bSAG-(\d+)\b',
-            r'\bSAG\s*#\s*(\d+)\b',
-            r'\bSAG\s+(\d+)\b',
-            r'\bBMCid\s*:\s*s(\d+)t\d+\b',
-        ]
-
-        for value in candidates:
-            for pattern in sag_patterns:
-                match = re.search(pattern, value, re.IGNORECASE)
-                if match:
-                    return int(match.group(1))
-        return None
-
-    def _normalize_message_id(self, value: Optional[str]) -> Optional[str]:
-        if not value:
-            return None
-        normalized = re.sub(r'[<>\s]', '', str(value)).lower().strip()
-        return normalized or None
-
-    def _extract_thread_message_ids(self, email_data: Dict) -> List[str]:
-        tokens: List[str] = []
-
-        in_reply_to = self._normalize_message_id(email_data.get('in_reply_to'))
-        if in_reply_to:
-            tokens.append(in_reply_to)
-
-        raw_references = (email_data.get('email_references') or '').strip()
-        if raw_references:
-            for ref in re.split(r'[\s,]+', raw_references):
-                normalized_ref = self._normalize_message_id(ref)
-                if normalized_ref:
-                    tokens.append(normalized_ref)
-
-        # De-duplicate while preserving order
-        return list(dict.fromkeys(tokens))
-
-    def _extract_reference_message_ids(self, raw_references: Optional[str]) -> List[str]:
-        tokens: List[str] = []
-        if raw_references:
-            for ref in re.split(r'[\s,]+', str(raw_references).strip()):
-                normalized_ref = self._normalize_message_id(ref)
-                if normalized_ref:
-                    tokens.append(normalized_ref)
-        return list(dict.fromkeys(tokens))
-
-    def _derive_thread_key(self, email_data: Dict) -> Optional[str]:
-        """Derive stable conversation key: root References -> In-Reply-To -> Message-ID."""
-        explicit = self._normalize_message_id(email_data.get('thread_key'))
-        if explicit:
-            return explicit
-
-        ref_ids = self._extract_reference_message_ids(email_data.get('email_references'))
-        if ref_ids:
-            return ref_ids[0]
-
-        in_reply_to = self._normalize_message_id(email_data.get('in_reply_to'))
-        if in_reply_to:
-            return in_reply_to
-
-        return self._normalize_message_id(email_data.get('message_id'))
-
-    def _find_sag_id_from_thread_key(self, thread_key: Optional[str]) -> Optional[int]:
-        if not thread_key:
-            return None
-
-        # Backward compatibility when DB migration is not yet applied.
-        try:
-            rows = execute_query(
-                """
-                SELECT se.sag_id
-                FROM sag_emails se
-                JOIN email_messages em ON em.id = se.email_id
-                WHERE em.deleted_at IS NULL
-                AND LOWER(TRIM(COALESCE(em.thread_key, ''))) = %s
-                ORDER BY se.created_at DESC
-                LIMIT 1
-                """,
-                (thread_key,)
-            )
-            return rows[0]['sag_id'] if rows else None
-        except Exception:
-            return None
-
-    def _find_sag_id_from_thread_headers(self, email_data: Dict) -> Optional[int]:
-        thread_message_ids = self._extract_thread_message_ids(email_data)
-        if not thread_message_ids:
-            return None
-
-        placeholders = ','.join(['%s'] * len(thread_message_ids))
-        rows = execute_query(
-            f"""
-            SELECT se.sag_id
-            FROM sag_emails se
-            JOIN email_messages em ON em.id = se.email_id
-            WHERE em.deleted_at IS NULL
-            AND LOWER(REGEXP_REPLACE(COALESCE(em.message_id, ''), '[<>\\s]', '', 'g')) IN ({placeholders})
-            ORDER BY se.created_at DESC
-            LIMIT 1
-            """,
-            tuple(thread_message_ids)
-        )
-        return rows[0]['sag_id'] if rows else None
-
-    def _find_customer_by_domain(self, domain: str) -> Optional[Dict[str, Any]]:
-        if not domain:
-            return None
-
-        domain = domain.lower().strip()
-        domain_alt = domain[4:] if domain.startswith('www.') else f"www.{domain}"
-
-        query = """
-            SELECT id, name
-            FROM customers
-            WHERE is_active = true
-            AND (
-                LOWER(TRIM(email_domain)) = %s
-                OR LOWER(TRIM(email_domain)) = %s
-            )
-            ORDER BY id ASC
-            LIMIT 1
-        """
-        rows = execute_query(query, (domain, domain_alt))
-        return rows[0] if rows else None
-
-    def _link_email_to_sag(self, sag_id: int, email_id: int) -> None:
-        execute_update(
-            """
-            INSERT INTO sag_emails (sag_id, email_id)
-            SELECT %s, %s
-            WHERE NOT EXISTS (
-                SELECT 1 FROM sag_emails WHERE sag_id = %s AND email_id = %s
-            )
-            """,
-            (sag_id, email_id, sag_id, email_id)
-        )
-
-    def _strip_quoted_email_text(self, body_text: str) -> str:
-        """Return only the newest reply content (remove quoted history/signatures)."""
-        if not body_text:
-            return ""
-
-        text = str(body_text).replace("\r\n", "\n").replace("\r", "\n")
-        lines = text.split("\n")
-        cleaned_lines: List[str] = []
-
-        header_marker_re = re.compile(r'^(fra|from|sent|date|dato|to|til|emne|subject|cc):\s*', re.IGNORECASE)
-        original_message_re = re.compile(r'^(original message|oprindelig besked|videresendt besked)', re.IGNORECASE)
-
-        for idx, line in enumerate(lines):
-            stripped = line.strip()
-            lowered = stripped.lower()
-
-            if stripped.startswith('>'):
-                break
-
-            if original_message_re.match(stripped):
-                break
-
-            # Typical separator before quoted headers (e.g. "---" / "_____" lines)
-            if re.match(r'^[-_]{3,}$', stripped):
-                lookahead = lines[idx + 1: idx + 4]
-                if any(header_marker_re.match(candidate.strip()) for candidate in lookahead):
-                    break
-
-            if idx > 0 and header_marker_re.match(stripped):
-                if lines[idx - 1].strip() == "":
-                    break
-
-            cleaned_lines.append(line)
-
-        while cleaned_lines and cleaned_lines[-1].strip() == "":
-            cleaned_lines.pop()
-
-        return "\n".join(cleaned_lines).strip()
-
-    def _add_helpdesk_comment(self, sag_id: int, email_data: Dict) -> None:
-        email_id = email_data.get('id')
-        sender = email_data.get('sender_email') or 'ukendt'
-        subject = email_data.get('subject') or '(ingen emne)'
-        received = email_data.get('received_date')
-        received_str = received.isoformat() if hasattr(received, 'isoformat') else str(received or '')
-        body_text = self._strip_quoted_email_text((email_data.get('body_text') or '').strip())
-
-        email_meta_line = f"Email-ID: {email_id}\n" if email_id else ""
-
-        comment = (
-            f"📧 Indgående email\n"
-            f"{email_meta_line}"
-            f"Fra: {sender}\n"
-            f"Emne: {subject}\n"
-            f"Modtaget: {received_str}\n\n"
-            f"{body_text}"
-        )
-
-        execute_update(
-            """
-            INSERT INTO sag_kommentarer (sag_id, forfatter, indhold, er_system_besked)
-            VALUES (%s, %s, %s, %s)
-            """,
-            (sag_id, 'Email Bot', comment, True)
-        )
-
-    def _create_sag_from_email(self, email_data: Dict, customer_id: int) -> Dict[str, Any]:
-        sender = email_data.get('sender_email') or 'ukendt'
-        subject = (email_data.get('subject') or '').strip() or f"Email fra {sender}"
-
-        description = (
-            f"Auto-oprettet fra email\n"
-            f"Fra: {sender}\n"
-            f"Message-ID: {email_data.get('message_id') or ''}\n\n"
-            f"{(email_data.get('body_text') or '').strip()}"
-        )
-
-        rows = execute_query(
-            """
-            INSERT INTO sag_sager (
-                titel, beskrivelse, template_key, status, customer_id, created_by_user_id
-            )
-            VALUES (%s, %s, %s, %s, %s, %s)
-            RETURNING id, titel, customer_id
-            """,
-            (subject, description, 'ticket', 'åben', customer_id, 1)
-        )
-
-        if not rows:
-            raise ValueError('Failed to create SAG from email')
-        return rows[0]
-
-    async def _handle_helpdesk_sag_routing(self, email_data: Dict) -> Optional[Dict[str, Any]]:
-        email_id = email_data.get('id')
-        if not email_id:
-            return None
-
-        derived_thread_key = self._derive_thread_key(email_data)
-        sag_id_from_thread_key = self._find_sag_id_from_thread_key(derived_thread_key)
-        sag_id_from_thread = self._find_sag_id_from_thread_headers(email_data)
-        sag_id_from_tag = self._extract_sag_id(email_data)
-
-        routing_source = None
-        sag_id = None
-
-        if sag_id_from_thread_key:
-            sag_id = sag_id_from_thread_key
-            routing_source = 'thread_key'
-            logger.info("🧵 Matched email %s to SAG-%s via thread key", email_id, sag_id)
-
-        if sag_id_from_thread:
-            if sag_id and sag_id != sag_id_from_thread:
-                logger.warning(
-                    "⚠️ Email %s has conflicting thread matches (thread_key: SAG-%s, headers: SAG-%s). Using thread_key.",
-                    email_id,
-                    sag_id,
-                    sag_id_from_thread,
-                )
-            elif not sag_id:
-                sag_id = sag_id_from_thread
-                routing_source = 'thread_headers'
-                logger.info("🔗 Matched email %s to SAG-%s via thread headers", email_id, sag_id)
-
-        if sag_id_from_tag:
-            if sag_id and sag_id != sag_id_from_tag:
-                logger.warning(
-                    "⚠️ Email %s contains conflicting case hints (thread: SAG-%s, tag: SAG-%s). Using thread match.",
-                    email_id,
-                    sag_id,
-                    sag_id_from_tag
-                )
-            elif not sag_id:
-                sag_id = sag_id_from_tag
-                routing_source = 'sag_tag'
-                logger.info("🏷️ Matched email %s to SAG-%s via SAG tag", email_id, sag_id)
-
-        # 1) Existing SAG via subject/headers
-        if sag_id:
-            case_rows = execute_query(
-                "SELECT id, customer_id, titel FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
-                (sag_id,)
-            )
-
-            if not case_rows:
-                logger.warning("⚠️ Email %s referenced SAG-%s but case was not found", email_id, sag_id)
-                return {'status': 'skipped', 'action': 'sag_id_not_found', 'sag_id': sag_id}
-
-            case = case_rows[0]
-            self._add_helpdesk_comment(sag_id, email_data)
-            self._link_email_to_sag(sag_id, email_id)
-
-            execute_update(
-                """
-                UPDATE email_messages
-                SET linked_case_id = %s,
-                    customer_id = COALESCE(customer_id, %s),
-                    status = 'processed',
-                    folder = 'Processed',
-                    processed_at = CURRENT_TIMESTAMP,
-                    auto_processed = true
-                WHERE id = %s
-                """,
-                (sag_id, case.get('customer_id'), email_id)
-            )
-
-            return {
-                'status': 'completed',
-                'action': 'updated_existing_sag',
-                'sag_id': sag_id,
-                'customer_id': case.get('customer_id'),
-                'routing_source': routing_source
-            }
-
-        # 2) No SAG id -> create only if sender domain belongs to known customer
-        sender_domain = self._extract_sender_domain(email_data)
-        customer = self._find_customer_by_domain(sender_domain) if sender_domain else None
-
-        if not customer:
-            logger.info("⏭️ Email %s has no known customer domain (%s) - kept in /emails", email_id, sender_domain)
-            return {'status': 'skipped', 'action': 'unknown_customer_domain', 'domain': sender_domain}
-
-        case = self._create_sag_from_email(email_data, customer['id'])
-        self._add_helpdesk_comment(case['id'], email_data)
-        self._link_email_to_sag(case['id'], email_id)
-
-        execute_update(
-            """
-            UPDATE email_messages
-            SET linked_case_id = %s,
-                customer_id = %s,
-                status = 'processed',
-                folder = 'Processed',
-                processed_at = CURRENT_TIMESTAMP,
-                auto_processed = true
-            WHERE id = %s
-            """,
-            (case['id'], customer['id'], email_id)
-        )
-
-        logger.info("✅ Created SAG-%s from email %s for customer %s", case['id'], email_id, customer['id'])
-        return {
-            'status': 'completed',
-            'action': 'created_new_sag',
|
|
||||||
'sag_id': case['id'],
|
|
||||||
'customer_id': customer['id'],
|
|
||||||
'domain': sender_domain,
|
|
||||||
'routing_source': 'customer_domain'
|
|
||||||
}
|
|
||||||
|
|
||||||
async def _find_matching_workflows(self, email_data: Dict) -> List[Dict]:
|
async def _find_matching_workflows(self, email_data: Dict) -> List[Dict]:
|
||||||
"""Find all workflows that match this email"""
|
"""Find all workflows that match this email"""
|
||||||
classification = email_data.get('classification')
|
classification = email_data.get('classification')
|
||||||
@@ -779,7 +357,6 @@ class EmailWorkflowService:
         handler_map = {
             'create_ticket': self._action_create_ticket_system,
             'link_email_to_ticket': self._action_link_email_to_ticket,
-            'route_helpdesk_sag': self._handle_helpdesk_sag_routing,
             'create_time_entry': self._action_create_time_entry,
             'link_to_vendor': self._action_link_to_vendor,
             'link_to_customer': self._action_link_to_customer,
@@ -892,8 +469,8 @@ class EmailWorkflowService:
             'body': email_data.get('body_text', ''),
             'html_body': email_data.get('body_html'),
             'received_at': email_data.get('received_date').isoformat() if email_data.get('received_date') else None,
-            'in_reply_to': email_data.get('in_reply_to'),
-            'references': email_data.get('email_references')
+            'in_reply_to': None,  # TODO: Extract from email headers
+            'references': None  # TODO: Extract from email headers
         }

         # Get params from workflow
@@ -939,8 +516,6 @@ class EmailWorkflowService:
             'body': email_data.get('body_text', ''),
             'html_body': email_data.get('body_html'),
             'received_at': email_data.get('received_date').isoformat() if email_data.get('received_date') else None,
-            'in_reply_to': email_data.get('in_reply_to'),
-            'references': email_data.get('email_references')
         }

         logger.info(f"🔗 Linking email to ticket {ticket_number}")
@@ -1019,31 +594,13 @@ class EmailWorkflowService:
         return {'action': 'link_to_vendor', 'matched': False, 'reason': 'Vendor not found'}

     async def _action_link_to_customer(self, params: Dict, email_data: Dict) -> Dict:
-        """Link email to customer by sender domain and persist on email_messages"""
-        sender_domain = self._extract_sender_domain(email_data)
-        if not sender_domain:
-            return {'action': 'link_to_customer', 'matched': False, 'reason': 'No sender domain'}
-
-        customer = self._find_customer_by_domain(sender_domain)
-        if not customer:
-            return {
-                'action': 'link_to_customer',
-                'matched': False,
-                'reason': 'Customer not found for domain',
-                'domain': sender_domain
-            }
-
-        execute_update(
-            "UPDATE email_messages SET customer_id = %s WHERE id = %s",
-            (customer['id'], email_data['id'])
-        )
-
+        """Link email to customer"""
+        logger.info(f"🔗 Would link to customer")
+        # TODO: Implement customer matching logic
         return {
             'action': 'link_to_customer',
-            'matched': True,
-            'customer_id': customer['id'],
-            'customer_name': customer['name'],
-            'domain': sender_domain
+            'note': 'Customer linking not yet implemented'
         }

     async def _action_extract_invoice_data(self, params: Dict, email_data: Dict) -> Dict:
@@ -28,26 +28,14 @@ class OllamaService:

     def _build_system_prompt(self) -> str:
         """Build Danish system prompt for invoice extraction with CVR"""
-        own_cvr = getattr(settings, 'OWN_CVR', '29522790')
-        # BMC har to CVR numre – begge er VORES (køber), aldrig leverandør
-        own_cvr_rule = (
-            f"4b. KRITISK - LEVERANDØR vs. MODTAGER:\n"
-            f"   - På en dansk faktura er LEVERANDØREN (vendor) det firma der HAR SENDT fakturaen.\n"
-            f"     De kendes på: firmalogo øverst, bankkonto/IBAN/Gironr. nedad, ingen 'Faktureres til' label.\n"
-            f"   - MODTAGEREN (os, buyer) kendes på: navnes under 'Faktureres til', 'Att.', 'Kundenr.', adresseblok med vores navn.\n"
-            f"   - BMC DENMARK APS og alle varianter af 'BMC' er ALDRIG leverandøren – det er os (modtageren).\n"
-            f"   - CVR {own_cvr} er VORES eget CVR. Sæt ALDRIG vendor_cvr til {own_cvr}.\n"
-            f"   - CVR 14416285 er også VORES CVR. Sæt ALDRIG vendor_cvr til 14416285.\n"
-            f"   - Ignorer 'SE/CVR-nr.' der hører til modtager-blokken – brug KUN afsenderens CVR som vendor_cvr.\n"
-        )
-        return ("""Du er en ekspert i at læse og udtrække strukturerede data fra danske fakturaer, kreditnotaer og leverandørdokumenter.
+        return """Du er en ekspert i at læse og udtrække strukturerede data fra danske fakturaer, kreditnotaer og leverandørdokumenter.

 VIGTIGE REGLER:
 1. Returner KUN gyldig JSON - ingen forklaring eller ekstra tekst
 2. Hvis et felt ikke findes, sæt det til null
 3. Beregn confidence baseret på hvor sikker du er på hvert felt (0.0-1.0)
 4. Datoer skal være i format YYYY-MM-DD
-""" + own_cvr_rule + """5. DANSKE PRISFORMATER:
+5. DANSKE PRISFORMATER:
    - Tusind-separator kan være . (punkt) eller mellemrum: "5.965,18" eller "5 965,18"
    - Decimal-separator er , (komma): "1.234,56 kr"
    - I JSON output skal du bruge . (punkt) som decimal: 1234.56
@@ -138,7 +126,7 @@ Output: {
     "confidence": 0.95
   }],
   "confidence": 0.95
-}""")
+}"""

     async def extract_from_text(self, text: str) -> Dict:
         """
@@ -182,11 +170,10 @@ Output: {
                     ],
                     "stream": False,
                     "format": "json",
-                    "think": False,
                     "options": {
                         "temperature": 0.1,
                         "top_p": 0.9,
-                        "num_predict": 8000
+                        "num_predict": 2000
                     }
                 }
             )
@@ -202,7 +189,7 @@ Output: {
                     "options": {
                         "temperature": 0.1,
                         "top_p": 0.9,
-                        "num_predict": 8000
+                        "num_predict": 2000
                     }
                 }
             )
@@ -314,88 +301,53 @@ Output: {
         }

     def _parse_json_response(self, response: str) -> Dict:
-        """Parse JSON from LLM response with aggressive fallback strategies"""
-        logger.info(f"🔍 Response length: {len(response)}, preview: {response[:200]}")
-
-        # Find outermost JSON object
-        start = response.find('{')
-        end = response.rfind('}') + 1
-        if start < 0 or end <= start:
-            logger.error("❌ No JSON object found in response")
-            return self._extract_fields_with_regex(response)
-
-        json_str = response[start:end]
-
-        # Strategy 1: direct parse
-        try:
-            return json.loads(json_str)
-        except json.JSONDecodeError:
-            pass
-
-        # Strategy 2: remove trailing commas before } or ]
-        fixed = re.sub(r',(\s*[}\]])', r'\1', json_str)
-        try:
-            return json.loads(fixed)
-        except json.JSONDecodeError:
-            pass
-
-        # Strategy 3: remove JS-style comments (// and /* */)
-        fixed = re.sub(r'//[^\n]*', '', fixed)
-        fixed = re.sub(r'/\*.*?\*/', '', fixed, flags=re.DOTALL)
-        try:
-            return json.loads(fixed)
-        except json.JSONDecodeError:
-            pass
-
-        # Strategy 4: truncate at last valid closing brace
-        # Walk backwards to find longest valid JSON prefix
-        for i in range(len(fixed) - 1, start, -1):
-            if fixed[i] == '}':
-                candidate = fixed[start - start:i + 1] if start == 0 else fixed[:i + 1]
-                # rebuild from inner start
-                c2 = fixed[:i + 1] if start == 0 else json_str[:i - start + 1]
-                try:
-                    data = json.loads(c2)
-                    logger.warning(f"⚠️ JSON truncated to position {i} — partial parse OK")
-                    return data
-                except json.JSONDecodeError:
-                    continue
-                break
-
-        # Strategy 5: regex extraction of key fields (always succeeds with partial data)
-        logger.warning("⚠️ All JSON strategies failed — using regex field extraction")
-        return self._extract_fields_with_regex(response)
-
-    def _extract_fields_with_regex(self, text: str) -> Dict:
-        """Extract invoice fields from text using regex when JSON parsing fails"""
-        def _find(pattern, default=None):
-            m = re.search(pattern, text, re.IGNORECASE)
-            return m.group(1).strip() if m else default
-
-        def _find_num(pattern):
-            m = re.search(pattern, text, re.IGNORECASE)
-            if not m: return None
-            val = m.group(1).replace('.', '').replace(',', '.')
-            try: return float(val)
-            except: return None
-
-        result = {
-            "document_type": _find(r'"document_type"\s*:\s*"([^"]+)"', 'invoice'),
-            "invoice_number": _find(r'"invoice_number"\s*:\s*"?([^",\n}]+)"?'),
-            "vendor_name": _find(r'"vendor_name"\s*:\s*"([^"]+)"'),
-            "vendor_cvr": _find(r'"vendor_cvr"\s*:\s*"?(\d{8})"?'),
-            "invoice_date": _find(r'"invoice_date"\s*:\s*"([^"]+)"'),
-            "due_date": _find(r'"due_date"\s*:\s*"([^"]+)"'),
-            "currency": _find(r'"currency"\s*:\s*"([^"]+)"', 'DKK'),
-            "total_amount": _find_num(r'"total_amount"\s*:\s*([\d.,]+)'),
-            "vat_amount": _find_num(r'"vat_amount"\s*:\s*([\d.,]+)'),
-            "confidence": 0.5,
-            "lines": [],
-            "_partial": True,
-        }
-        logger.info(f"🔧 Regex extraction: vendor={result['vendor_name']}, cvr={result['vendor_cvr']}, total={result['total_amount']}")
-        return result
+        """Parse JSON from LLM response with improved error handling"""
+        try:
+            # Log preview of response for debugging
+            logger.info(f"🔍 Response preview (first 500 chars): {response[:500]}")
+
+            # Find JSON in response (between first { and last })
+            start = response.find('{')
+            end = response.rfind('}') + 1
+
+            if start >= 0 and end > start:
+                json_str = response[start:end]
+                logger.info(f"🔍 Extracted JSON string length: {len(json_str)}, starts at position {start}")
+
+                # Try to fix common JSON issues
+                # Remove trailing commas before } or ]
+                json_str = re.sub(r',(\s*[}\]])', r'\1', json_str)
+                # Fix single quotes to double quotes (but not in values)
+                # This is risky, so we only do it if initial parse fails
+
+                try:
+                    data = json.loads(json_str)
+                    return data
+                except json.JSONDecodeError:
+                    # Try to fix common issues
+                    # Replace single quotes with double quotes (simple approach)
+                    fixed_json = json_str.replace("'", '"')
+                    try:
+                        data = json.loads(fixed_json)
+                        logger.warning("⚠️ Fixed JSON with quote replacement")
+                        return data
+                    except:
+                        pass
+
+                    # Last resort: log the problematic JSON
+                    logger.error(f"❌ Problematic JSON: {json_str[:300]}")
+                    raise
+            else:
+                raise ValueError("No JSON found in response")
+
+        except json.JSONDecodeError as e:
+            logger.error(f"❌ JSON parsing failed: {e}")
+            logger.error(f"Raw response preview: {response[:500]}")
+            return {
+                "error": f"JSON parsing failed: {str(e)}",
+                "confidence": 0.0,
+                "raw_response": response[:500]
+            }

     def calculate_file_checksum(self, file_path: Path) -> str:
         """Calculate SHA256 checksum of file for duplicate detection"""
@@ -2,30 +2,15 @@
 Settings and User Management API Router
 """

-from fastapi import APIRouter, HTTPException, Request
+from fastapi import APIRouter, HTTPException
 from typing import List, Optional, Dict
 from pydantic import BaseModel
 from app.core.database import execute_query
-from app.core.config import settings
-import httpx
-import time
 import logging
-import json

 logger = logging.getLogger(__name__)
 router = APIRouter()

-DEFAULT_EMAIL_SIGNATURE_TEMPLATE = (
-    "{full_name}\n"
-    "{title}\n"
-    "{company_name}\n"
-    "Telefon: {company_phone}\n"
-    "Email: {email}\n"
-    "Web: {company_website}\n"
-    "Adresse: {company_address}\n"
-    "BMCid: {bmc_id_tag}"
-)
-

 # Pydantic Models
 class Setting(BaseModel):
@@ -69,30 +54,6 @@ class UserUpdate(BaseModel):
 @router.get("/settings", response_model=List[Setting], tags=["Settings"])
 async def get_settings(category: Optional[str] = None):
     """Get all settings or filter by category"""
-    execute_query(
-        """
-        INSERT INTO settings (key, value, category, description, value_type, is_public)
-        VALUES
-            (%s, %s, %s, %s, %s, %s),
-            (%s, %s, %s, %s, %s, %s)
-        ON CONFLICT (key) DO NOTHING
-        """,
-        (
-            "email_default_signature_template",
-            DEFAULT_EMAIL_SIGNATURE_TEMPLATE,
-            "email",
-            "Standard signatur skabelon til udgående sagsmails",
-            "text",
-            True,
-            "company_website",
-            "https://bmcnetworks.dk",
-            "company",
-            "Firma website",
-            "string",
-            True,
-        ),
-    )
-
     query = "SELECT * FROM settings"
     params = []
@@ -111,7 +72,7 @@ async def get_setting(key: str):
     query = "SELECT * FROM settings WHERE key = %s"
     result = execute_query(query, (key,))

-    if not result and key in {"case_types", "case_type_module_defaults", "case_statuses"}:
+    if not result and key in {"case_types", "case_type_module_defaults"}:
         seed_query = """
             INSERT INTO settings (key, value, category, description, value_type, is_public)
             VALUES (%s, %s, %s, %s, %s, %s)
@@ -144,43 +105,6 @@ async def get_setting(key: str):
             )
         )

-        if key == "case_statuses":
-            execute_query(
-                seed_query,
-                (
-                    "case_statuses",
-                    json.dumps([
-                        {"value": "åben", "is_closed": False},
-                        {"value": "under behandling", "is_closed": False},
-                        {"value": "afventer", "is_closed": False},
-                        {"value": "løst", "is_closed": True},
-                        {"value": "lukket", "is_closed": True},
-                    ], ensure_ascii=False),
-                    "system",
-                    "Sagsstatus værdier og lukkede markeringer",
-                    "json",
-                    True,
-                )
-            )
-
-        result = execute_query(query, (key,))
-
-    if not result and key == "email_default_signature_template":
-        execute_query(
-            """
-            INSERT INTO settings (key, value, category, description, value_type, is_public)
-            VALUES (%s, %s, %s, %s, %s, %s)
-            ON CONFLICT (key) DO NOTHING
-            """,
-            (
-                "email_default_signature_template",
-                DEFAULT_EMAIL_SIGNATURE_TEMPLATE,
-                "email",
-                "Standard signatur skabelon til udgående sagsmails",
-                "text",
-                True,
-            )
-        )
         result = execute_query(query, (key,))

     if not result:
@@ -200,27 +124,6 @@ async def update_setting(key: str, setting: SettingUpdate):
     """
     result = execute_query(query, (setting.value, key))

-    if not result and key == "email_default_signature_template":
-        result = execute_query(
-            """
-            INSERT INTO settings (key, value, category, description, value_type, is_public)
-            VALUES (%s, %s, %s, %s, %s, %s)
-            ON CONFLICT (key)
-            DO UPDATE SET
-                value = EXCLUDED.value,
-                updated_at = CURRENT_TIMESTAMP
-            RETURNING *
-            """,
-            (
-                "email_default_signature_template",
-                setting.value,
-                "email",
-                "Standard signatur skabelon til udgående sagsmails",
-                "text",
-                True,
-            )
-        )
-
     if not result:
         raise HTTPException(status_code=404, detail="Setting not found")
@@ -274,7 +177,7 @@ async def sync_settings_from_env():
 @router.get("/users", response_model=List[User], tags=["Users"])
 async def get_users(is_active: Optional[bool] = None):
     """Get all users"""
-    query = "SELECT user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at FROM users"
+    query = "SELECT user_id as id, username, email, full_name, is_active, last_login, created_at FROM users"
     params = []

     if is_active is not None:
@@ -289,7 +192,7 @@ async def get_users(is_active: Optional[bool] = None):
 @router.get("/users/{user_id}", response_model=User, tags=["Users"])
 async def get_user(user_id: int):
     """Get user by ID"""
-    query = "SELECT user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at FROM users WHERE user_id = %s"
+    query = "SELECT user_id as id, username, email, full_name, is_active, last_login, created_at FROM users WHERE user_id = %s"
     result = execute_query(query, (user_id,))

     if not result:
@@ -313,7 +216,7 @@ async def create_user(user: UserCreate):
     query = """
         INSERT INTO users (username, email, password_hash, full_name, is_active)
        VALUES (%s, %s, %s, %s, true)
-        RETURNING user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at
+        RETURNING user_id as id, username, email, full_name, is_active, last_login, created_at
     """
     result = execute_query(query, (user.username, user.email, password_hash, user.full_name))
@@ -354,7 +257,7 @@ async def update_user(user_id: int, user: UserUpdate):
         UPDATE users
         SET {', '.join(update_fields)}, updated_at = CURRENT_TIMESTAMP
         WHERE user_id = %s
-        RETURNING user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at
+        RETURNING user_id as id, username, email, full_name, is_active, last_login, created_at
     """

     result = execute_query(query, tuple(params))
@@ -571,11 +474,15 @@ Output: [Liste af opgaver]""",
 }


-def _get_prompts_with_overrides() -> Dict:
-    """Get AI prompts with DB overrides applied"""
+@router.get("/ai-prompts", tags=["Settings"])
+async def get_ai_prompts():
+    """Get all AI prompts (defaults merged with custom overrides)"""
     prompts = _get_default_prompts()

     try:
+        # Check for custom overrides in DB
+        # Note: Table ai_prompts must rely on migration 066
         rows = execute_query("SELECT key, prompt_text FROM ai_prompts")
         if rows:
             for row in rows:
@@ -588,40 +495,10 @@ def _get_prompts_with_overrides() -> Dict:
     return prompts


-def _get_test_input_for_prompt(key: str) -> str:
-    """Default test input per prompt type"""
-    examples = {
-        "invoice_extraction": "FAKTURA 2026-1001 fra Demo A/S. CVR 12345678. Total 1.250,00 DKK inkl moms.",
-        "ticket_classification": "Emne: Kan ikke logge på VPN. Beskrivelse: Flere brugere er ramt siden i morges.",
-        "ticket_summary": "Bruger havde netværksfejl. Router genstartet og DNS opdateret. Forbindelse virker nu stabilt.",
-        "kb_generation": "Problem: Outlook åbner ikke. Løsning: Reparer Office installation og nulstil profil.",
-        "troubleshooting_assistant": "Server svarer langsomt efter opdatering. CPU er høj, disk IO er normal.",
-        "sentiment_analysis": "Jeg er meget frustreret, systemet er nede igen og vi mister kunder!",
-        "meeting_action_items": "Peter opdaterer firewall fredag. Anna sender status til kunden mandag.",
-    }
-    return examples.get(key, "Skriv kort: AI test OK")
-
-
-@router.get("/ai-prompts", tags=["Settings"])
-async def get_ai_prompts():
-    """Get all AI prompts (defaults merged with custom overrides)"""
-    return _get_prompts_with_overrides()
-
-
 class PromptUpdate(BaseModel):
     prompt_text: str


-class PromptTestRequest(BaseModel):
-    test_input: Optional[str] = None
-    prompt_text: Optional[str] = None
-    timeout_seconds: Optional[int] = None
-
-
-_prompt_test_last_call: Dict[str, float] = {}
-
-
 @router.put("/ai-prompts/{key}", tags=["Settings"])
 async def update_ai_prompt(key: str, update: PromptUpdate):
     """Override a system prompt with a custom one"""
@ -656,124 +533,3 @@ async def reset_ai_prompt(key: str):
|
|||||||
raise HTTPException(status_code=500, detail="Could not reset prompt")
|
raise HTTPException(status_code=500, detail="Could not reset prompt")
|
||||||
|
|
||||||
|
|
||||||
@router.post("/ai-prompts/{key}/test", tags=["Settings"])
|
|
||||||
async def test_ai_prompt(key: str, payload: PromptTestRequest, http_request: Request):
|
|
||||||
"""Run a quick AI test for a specific system prompt"""
|
|
||||||
prompts = _get_prompts_with_overrides()
|
|
||||||
if key not in prompts:
|
|
||||||
raise HTTPException(status_code=404, detail="Unknown prompt key")
|
|
||||||
|
|
||||||
prompt_cfg = prompts[key]
|
|
||||||
model = prompt_cfg.get("model") or settings.OLLAMA_MODEL
|
|
||||||
endpoint = prompt_cfg.get("endpoint") or settings.OLLAMA_ENDPOINT
|
|
||||||
prompt_text = (payload.prompt_text or prompt_cfg.get("prompt") or "").strip()
|
|
||||||
if not prompt_text:
|
|
||||||
raise HTTPException(status_code=400, detail="Prompt text is empty")
|
|
||||||
|
|
||||||
test_input = (payload.test_input or _get_test_input_for_prompt(key)).strip()
|
|
||||||
if not test_input:
|
|
||||||
raise HTTPException(status_code=400, detail="Test input is empty")
|
|
||||||
|
|
||||||
start = time.perf_counter()
|
|
||||||
client_host = (http_request.client.host if http_request.client else "unknown")
|
|
||||||
|
|
||||||
# Cooldown to prevent hammering external endpoints and getting rate-limited/banned.
|
|
||||||
cooldown_seconds = 2.0
|
|
||||||
now_monotonic = time.monotonic()
|
|
||||||
last_call = _prompt_test_last_call.get(client_host)
|
|
||||||
if last_call and (now_monotonic - last_call) < cooldown_seconds:
|
|
||||||
wait_for = round(cooldown_seconds - (now_monotonic - last_call), 2)
|
|
||||||
raise HTTPException(
|
|
||||||
status_code=429,
|
|
||||||
detail=f"For mange tests for hurtigt. Vent {wait_for} sekunder og prøv igen.",
|
|
||||||
)
|
|
||||||
_prompt_test_last_call[client_host] = now_monotonic
|
|
||||||
|
|
||||||
read_timeout_seconds = payload.timeout_seconds or 90
|
|
||||||
read_timeout_seconds = max(5, min(int(read_timeout_seconds), 300))
|
|
||||||
|
|
||||||
try:
|
|
||||||
model_normalized = (model or "").strip().lower()
|
|
||||||
# qwen models are more reliable with /api/chat than /api/generate.
|
|
||||||
use_chat_api = model_normalized.startswith("qwen")
|
|
-        logger.info(
-            "🧪 AI prompt test start key=%s model=%s timeout=%ss client=%s",
-            key,
-            model,
-            read_timeout_seconds,
-            client_host,
-        )
-
-        timeout = httpx.Timeout(connect=10.0, read=float(read_timeout_seconds), write=30.0, pool=10.0)
-        async with httpx.AsyncClient(timeout=timeout) as client:
-            if use_chat_api:
-                response = await client.post(
-                    f"{endpoint}/api/chat",
-                    json={
-                        "model": model,
-                        "messages": [
-                            {"role": "system", "content": prompt_text},
-                            {"role": "user", "content": test_input},
-                        ],
-                        "stream": False,
-                        "options": {"temperature": 0.2, "num_predict": 600},
-                    },
-                )
-            else:
-                response = await client.post(
-                    f"{endpoint}/api/generate",
-                    json={
-                        "model": model,
-                        "prompt": f"{prompt_text}\n\nBrugerinput:\n{test_input}",
-                        "stream": False,
-                        "options": {"temperature": 0.2, "num_predict": 600},
-                    },
-                )
-
-            if response.status_code != 200:
-                raise HTTPException(
-                    status_code=502,
-                    detail=f"AI endpoint fejl: {response.status_code} - {response.text[:300]}",
-                )
-
-            try:
-                data = response.json()
-            except Exception as parse_error:
-                raise HTTPException(
-                    status_code=502,
-                    detail=f"AI endpoint returnerede ugyldig JSON: {str(parse_error)}",
-                )
-
-            if use_chat_api:
-                message_data = data.get("message", {})
-                ai_response = (message_data.get("content") or message_data.get("thinking") or "").strip()
-            else:
-                ai_response = (data.get("response") or "").strip()
-
-            if not ai_response:
-                raise HTTPException(status_code=502, detail="AI returnerede tomt svar")
-
-            latency_ms = int((time.perf_counter() - start) * 1000)
-            return {
-                "ok": True,
-                "key": key,
-                "model": model,
-                "endpoint": endpoint,
-                "test_input": test_input,
-                "ai_response": ai_response,
-                "timeout_seconds": read_timeout_seconds,
-                "latency_ms": latency_ms,
-            }
-
-    except HTTPException:
-        raise
-    except httpx.TimeoutException as e:
-        logger.error(f"❌ AI prompt test timed out for {key}: {repr(e)}")
-        raise HTTPException(status_code=504, detail="AI test timed out (model svarer for langsomt)")
-    except Exception as e:
-        logger.error(f"❌ AI prompt test failed for {key}: {repr(e)}")
-        err = str(e) or e.__class__.__name__
-        raise HTTPException(status_code=500, detail=f"Kunne ikke teste AI prompt: {err}")
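The response-parsing branch of the removed endpoint (chat API vs. generate API) can be sketched as a small standalone function; the helper name is an assumption for illustration, but the payload shapes mirror the diff above.

```python
def extract_ai_response(data: dict, use_chat_api: bool) -> str:
    # Hypothetical helper mirroring the endpoint logic in the diff:
    # the chat API nests the text under "message" (with "thinking" as a
    # fallback field), while the generate API returns it under "response".
    if use_chat_api:
        message = data.get("message", {})
        return (message.get("content") or message.get("thinking") or "").strip()
    return (data.get("response") or "").strip()
```

Factoring this out keeps the 502 "empty response" check in one place regardless of which Ollama-style API was called.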
@@ -348,23 +348,6 @@
 <!-- Email Templates -->
 <div class="tab-pane fade" id="email-templates">
-    <div class="card border-0 shadow-sm mb-4">
-        <div class="card-body">
-            <h5 class="fw-bold mb-2">Standard Signatur (Sagsmails)</h5>
-            <p class="text-muted mb-3">Bruges automatisk ved afsendelse fra sag. Skabelonen bruger indlogget bruger + firmaoplysninger.</p>
-            <label class="form-label small text-muted">Signatur skabelon</label>
-            <textarea class="form-control font-monospace" id="emailDefaultSignatureTemplate" rows="9"></textarea>
-            <div class="small text-muted mt-2">
-                Variabler: <code>{full_name}</code>, <code>{title}</code>, <code>{email}</code>, <code>{company_name}</code>, <code>{company_phone}</code>, <code>{company_address}</code>, <code>{bmc_id_tag}</code>
-            </div>
-            <div class="d-flex justify-content-end mt-3">
-                <button class="btn btn-primary" onclick="saveDefaultEmailSignatureTemplate()">
-                    <i class="bi bi-save me-2"></i>Gem Standard Signatur
-                </button>
-            </div>
-        </div>
-    </div>
-
     <div class="d-flex justify-content-between align-items-center mb-4">
         <div>
             <h5 class="fw-bold mb-1">Email Skabeloner</h5>
@@ -653,10 +636,6 @@
 <div class="mb-4">
     <h5 class="fw-bold mb-1">Data Synkronisering</h5>
     <p class="text-muted mb-0">Synkroniser firmaer og kontakter fra vTiger og e-conomic</p>
-    <div class="alert alert-info mt-3 mb-0 py-2 px-3 small">
-        <i class="bi bi-info-circle me-2"></i>
-        Sync bruger integration credentials fra <strong>miljøvariabler (.env)</strong> ved runtime.
-    </div>
 </div>

 <!-- Sync Status Cards -->
@@ -754,16 +733,16 @@
     <i class="bi bi-currency-dollar text-success" style="font-size: 2rem;"></i>
 </div>
 <div class="flex-grow-1 ms-3">
-    <h6 class="card-title fw-bold">Sync fra e-conomic</h6>
-    <p class="card-text small text-muted">Hent kunder fra e-conomic. Matcher kun på entydigt e-conomic kundenummer.</p>
+    <h6 class="card-title fw-bold">Sync fra e-conomic</h5>
+    <p class="card-text small text-muted">Hent kundenumre fra e-conomic. Matcher på CVR nummer eller firma navn.</p>
 </div>
 </div>
 <div class="d-grid gap-2">
     <button class="btn btn-success" onclick="syncFromEconomic()" id="btnSyncEconomic">
         <i class="bi bi-download me-2"></i>Sync Firmaer fra e-conomic
     </button>
-    <button class="btn btn-outline-secondary btn-sm" id="btnSyncCvrEconomic" disabled>
-        <i class="bi bi-pause-circle me-2"></i>CVR→e-conomic midlertidigt deaktiveret
+    <button class="btn btn-outline-success btn-sm" onclick="syncCvrToEconomic()" id="btnSyncCvrEconomic">
+        <i class="bi bi-search me-2"></i>Find Manglende CVR i e-conomic
     </button>
 </div>
 <div class="mt-3 small">
@@ -771,72 +750,12 @@
     <i class="bi bi-info-circle me-2"></i>
     <span>Sidst synkroniseret: <span id="lastSyncEconomic">Aldrig</span></span>
 </div>
-<div class="d-flex align-items-center text-muted mt-1">
-    <i class="bi bi-exclamation-circle me-2"></i>
-    <span>CVR-søgning er slået fra midlertidigt for stabil drift.</span>
-</div>
 </div>
 </div>
 </div>
 </div>
 </div>

-<!-- Archived Ticket Sync + Monitor -->
-<div class="card mb-4">
-    <div class="card-header bg-white d-flex justify-content-between align-items-center">
-        <div>
-            <h6 class="mb-0 fw-bold">Archived Tickets Sync</h6>
-            <small class="text-muted">Overvaager om alle archived tickets er synket ned (kildeantal vs lokal DB)</small>
-        </div>
-        <div class="d-flex align-items-center gap-2">
-            <span class="badge bg-secondary" id="archivedOverallBadge">Status ukendt</span>
-            <button class="btn btn-sm btn-outline-secondary" onclick="loadArchivedSyncStatus()" id="btnCheckArchivedSync">
-                <i class="bi bi-arrow-repeat me-1"></i>Tjek nu
-            </button>
-        </div>
-    </div>
-    <div class="card-body">
-        <div class="row g-3">
-            <div class="col-md-6">
-                <div class="border rounded p-3 h-100">
-                    <div class="d-flex justify-content-between align-items-start mb-2">
-                        <h6 class="mb-0">Simply archived</h6>
-                        <span class="badge bg-secondary" id="archivedSimplyBadge">Ukendt</span>
-                    </div>
-                    <div class="small text-muted mb-2">Remote: <span id="archivedSimplyRemoteCount">-</span> | Lokal: <span id="archivedSimplyLocalCount">-</span> | Diff: <span id="archivedSimplyDiff">-</span></div>
-                    <div class="small text-muted mb-3">Beskeder lokalt: <span id="archivedSimplyMessagesCount">-</span></div>
-                    <div class="d-grid">
-                        <button class="btn btn-outline-primary btn-sm" onclick="syncArchivedSimply()" id="btnSyncArchivedSimply">
-                            <i class="bi bi-cloud-download me-2"></i>Sync Simply Archived
-                        </button>
-                    </div>
-                </div>
-            </div>
-
-            <div class="col-md-6">
-                <div class="border rounded p-3 h-100">
-                    <div class="d-flex justify-content-between align-items-start mb-2">
-                        <h6 class="mb-0">vTiger Cases archived</h6>
-                        <span class="badge bg-secondary" id="archivedVtigerBadge">Ukendt</span>
-                    </div>
-                    <div class="small text-muted mb-2">Remote: <span id="archivedVtigerRemoteCount">-</span> | Lokal: <span id="archivedVtigerLocalCount">-</span> | Diff: <span id="archivedVtigerDiff">-</span></div>
-                    <div class="small text-muted mb-3">Beskeder lokalt: <span id="archivedVtigerMessagesCount">-</span></div>
-                    <div class="d-grid">
-                        <button class="btn btn-outline-primary btn-sm" onclick="syncArchivedVtiger()" id="btnSyncArchivedVtiger">
-                            <i class="bi bi-cloud-download me-2"></i>Sync vTiger Archived
-                        </button>
-                    </div>
-                </div>
-            </div>
-        </div>
-
-        <div class="d-flex justify-content-between align-items-center mt-3">
-            <small class="text-muted">Sidst tjekket: <span id="archivedLastChecked">Aldrig</span></small>
-            <small class="text-muted" id="archivedStatusHint">Polling aktiv naar Sync-fanen er aaben.</small>
-        </div>
-    </div>
-</div>
-
 <!-- Sync Log -->
 <div class="card">
     <div class="card-header bg-white">
@@ -1113,8 +1032,7 @@ async def scan_document(file_path: str):
 <option value="/ticket/dashboard/technician/v2" {% if default_dashboard_path == '/ticket/dashboard/technician/v2' %}selected{% endif %}>Tekniker Dashboard V2</option>
 <option value="/ticket/dashboard/technician/v3" {% if default_dashboard_path == '/ticket/dashboard/technician/v3' %}selected{% endif %}>Tekniker Dashboard V3</option>
 <option value="/dashboard/sales" {% if default_dashboard_path == '/dashboard/sales' %}selected{% endif %}>Salg Dashboard</option>
-<option value="/dashboard/mission-control" {% if default_dashboard_path == '/dashboard/mission-control' %}selected{% endif %}>Mission Control</option>
-{% if default_dashboard_path and default_dashboard_path not in ['/ticket/dashboard/technician/v1', '/ticket/dashboard/technician/v2', '/ticket/dashboard/technician/v3', '/dashboard/sales', '/dashboard/mission-control'] %}
+{% if default_dashboard_path and default_dashboard_path not in ['/ticket/dashboard/technician/v1', '/ticket/dashboard/technician/v2', '/ticket/dashboard/technician/v3', '/dashboard/sales'] %}
 <option value="{{ default_dashboard_path }}" selected>Nuværende (tilpasset): {{ default_dashboard_path }}</option>
 {% endif %}
 </select>
@@ -1160,33 +1078,6 @@ async def scan_document(file_path: str):
 </div>
 </div>

-<div class="card p-4 mt-4">
-    <div class="d-flex justify-content-between align-items-center mb-3">
-        <div>
-            <h5 class="mb-1 fw-bold">Sagsstatus</h5>
-            <p class="text-muted mb-0">Styr hvilke status-værdier der kan vælges, og marker hvilke der er lukkede.</p>
-        </div>
-        <div class="d-flex gap-2">
-            <input type="text" class="form-control" id="caseStatusInput" placeholder="F.eks. afventer kunde" style="max-width: 260px;">
-            <button class="btn btn-primary" onclick="addCaseStatus()"><i class="bi bi-plus-lg me-1"></i>Tilføj</button>
-        </div>
-    </div>
-    <div class="table-responsive">
-        <table class="table table-sm align-middle mb-0">
-            <thead>
-                <tr>
-                    <th>Status</th>
-                    <th class="text-center" style="width: 150px;">Lukket værdi</th>
-                    <th class="text-end" style="width: 100px;">Handling</th>
-                </tr>
-            </thead>
-            <tbody id="caseStatusesTableBody">
-                <tr><td colspan="3" class="text-muted">Indlæser...</td></tr>
-            </tbody>
-        </table>
-    </div>
-</div>
-
 <div class="card p-4 mt-4">
     <div class="d-flex justify-content-between align-items-center mb-3">
         <div>
@@ -1706,8 +1597,6 @@ async function loadSettings() {
         displaySettingsByCategory();
         renderTelefoniSettings();
         await loadCaseTypesSetting();
-        await loadCaseStatusesSetting();
-        await loadTagsManagement();
         await loadNextcloudInstances();
     } catch (error) {
         console.error('Error loading settings:', error);
@@ -1716,7 +1605,7 @@ async function loadSettings() {

 function displaySettingsByCategory() {
     const categories = {
-        company: ['company_name', 'company_cvr', 'company_email', 'company_phone', 'company_website', 'company_address'],
+        company: ['company_name', 'company_cvr', 'company_email', 'company_phone', 'company_address'],
         integrations: ['vtiger_enabled', 'vtiger_url', 'vtiger_username', 'economic_enabled', 'economic_app_secret', 'economic_agreement_token'],
         notifications: ['email_notifications'],
         system: ['system_timezone']
@@ -2077,132 +1966,6 @@ const CASE_MODULE_LABELS = {
 };

 let caseTypeModuleDefaultsCache = {};
-let caseStatusesCache = [];
-
-function normalizeCaseStatuses(raw) {
-    const normalized = [];
-    const seen = new Set();
-    const source = Array.isArray(raw) ? raw : [];
-
-    source.forEach((item) => {
-        const row = typeof item === 'string'
-            ? { value: item, is_closed: false }
-            : (item && typeof item === 'object' ? item : null);
-
-        if (!row) return;
-        const value = String(row.value || '').trim();
-        if (!value) return;
-
-        const key = value.toLowerCase();
-        if (seen.has(key)) return;
-        seen.add(key);
-
-        normalized.push({
-            value,
-            is_closed: Boolean(row.is_closed)
-        });
-    });
-
-    const defaults = [
-        { value: 'åben', is_closed: false },
-        { value: 'under behandling', is_closed: false },
-        { value: 'afventer', is_closed: false },
-        { value: 'løst', is_closed: true },
-        { value: 'lukket', is_closed: true }
-    ];
-
-    defaults.forEach((item) => {
-        const key = item.value.toLowerCase();
-        if (!seen.has(key)) {
-            seen.add(key);
-            normalized.push(item);
-        }
-    });
-
-    return normalized;
-}
-
-function renderCaseStatuses(rows) {
-    const tbody = document.getElementById('caseStatusesTableBody');
-    if (!tbody) return;
-
-    if (!Array.isArray(rows) || !rows.length) {
-        tbody.innerHTML = '<tr><td colspan="3" class="text-muted">Ingen statusværdier defineret</td></tr>';
-        return;
-    }
-
-    tbody.innerHTML = rows.map((row, index) => `
-        <tr>
-            <td><span class="fw-semibold">${escapeHtml(row.value)}</span></td>
-            <td class="text-center">
-                <div class="form-check form-switch d-inline-flex">
-                    <input class="form-check-input" type="checkbox" id="caseStatusClosed_${index}" ${row.is_closed ? 'checked' : ''}
-                        onchange="toggleCaseStatusClosed(${index}, this.checked)">
-                </div>
-            </td>
-            <td class="text-end">
-                <button type="button" class="btn btn-sm btn-outline-danger" onclick="removeCaseStatus(${index})" title="Slet status">
-                    <i class="bi bi-trash"></i>
-                </button>
-            </td>
-        </tr>
-    `).join('');
-}
-
-async function loadCaseStatusesSetting() {
-    try {
-        const response = await fetch('/api/v1/settings/case_statuses');
-        if (!response.ok) {
-            caseStatusesCache = normalizeCaseStatuses([]);
-            renderCaseStatuses(caseStatusesCache);
-            return;
-        }
-
-        const setting = await response.json();
-        const parsed = JSON.parse(setting.value || '[]');
-        caseStatusesCache = normalizeCaseStatuses(parsed);
-        renderCaseStatuses(caseStatusesCache);
-    } catch (error) {
-        console.error('Error loading case statuses:', error);
-        caseStatusesCache = normalizeCaseStatuses([]);
-        renderCaseStatuses(caseStatusesCache);
-    }
-}
-
-async function saveCaseStatuses() {
-    await updateSetting('case_statuses', JSON.stringify(caseStatusesCache));
-    renderCaseStatuses(caseStatusesCache);
-}
-
-async function addCaseStatus() {
-    const input = document.getElementById('caseStatusInput');
-    if (!input) return;
-
-    const value = input.value.trim();
-    if (!value) return;
-
-    const exists = caseStatusesCache.some((row) => String(row.value || '').toLowerCase() === value.toLowerCase());
-    if (!exists) {
-        caseStatusesCache.push({ value, is_closed: false });
-        await saveCaseStatuses();
-    }
-
-    input.value = '';
-}
-
-async function removeCaseStatus(index) {
-    caseStatusesCache = caseStatusesCache.filter((_, i) => i !== index);
-    if (!caseStatusesCache.length) {
-        caseStatusesCache = normalizeCaseStatuses([]);
-    }
-    await saveCaseStatuses();
-}
-
-async function toggleCaseStatusClosed(index, checked) {
-    if (!caseStatusesCache[index]) return;
-    caseStatusesCache[index].is_closed = Boolean(checked);
-    await saveCaseStatuses();
-}
-
 function normalizeCaseTypeModuleDefaults(raw, caseTypes) {
     const normalized = {};
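The removed `normalizeCaseStatuses` helper combines two steps: case-insensitive deduplication of user-supplied statuses, then seeding any missing built-in defaults. An illustrative Python port of the same logic (not project code, names assumed):

```python
DEFAULT_STATUSES = [
    {"value": "åben", "is_closed": False},
    {"value": "under behandling", "is_closed": False},
    {"value": "afventer", "is_closed": False},
    {"value": "løst", "is_closed": True},
    {"value": "lukket", "is_closed": True},
]

def normalize_case_statuses(raw):
    # Accept bare strings or {"value", "is_closed"} dicts, dedup
    # case-insensitively, then append missing defaults at the end.
    normalized, seen = [], set()
    for item in raw if isinstance(raw, list) else []:
        row = {"value": item, "is_closed": False} if isinstance(item, str) else item
        if not isinstance(row, dict):
            continue
        value = str(row.get("value") or "").strip()
        if not value or value.lower() in seen:
            continue
        seen.add(value.lower())
        normalized.append({"value": value, "is_closed": bool(row.get("is_closed"))})
    for item in DEFAULT_STATUSES:
        if item["value"].lower() not in seen:
            seen.add(item["value"].lower())
            normalized.append(dict(item))
    return normalized
```

Because the defaults are appended only when absent, a saved list can rename or re-flag a default (e.g. mark "afventer" as closed) without being overwritten on the next load.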
@@ -2413,16 +2176,14 @@ async function loadUsers() {
 async function loadAdminUsers() {
     try {
         const response = await fetch('/api/v1/admin/users');
-        if (!response.ok) {
-            throw new Error(await getErrorMessage(response, 'Kunne ikke indlaese brugere'));
-        }
+        if (!response.ok) throw new Error('Failed to load users');
         usersCache = await response.json();
         displayUsers(usersCache);
         populateTelefoniTestUsers(usersCache);
     } catch (error) {
         console.error('Error loading users:', error);
         const tbody = document.getElementById('usersTableBody');
-        tbody.innerHTML = `<tr><td colspan="11" class="text-center text-muted py-5">${escapeHtml(error.message || 'Kunne ikke indlaese brugere')}</td></tr>`;
+        tbody.innerHTML = '<tr><td colspan="11" class="text-center text-muted py-5">Kunne ikke indlæse brugere</td></tr>';
     }
 }

@@ -2703,21 +2464,17 @@ async function createUser() {

 async function toggleUserActive(userId, isActive) {
     try {
-        const response = await fetch(`/api/v1/admin/users/${userId}`, {
-            method: 'PATCH',
+        const response = await fetch(`/api/v1/users/${userId}`, {
+            method: 'PUT',
             headers: { 'Content-Type': 'application/json' },
             body: JSON.stringify({ is_active: isActive })
         });

-        if (!response.ok) {
-            alert(await getErrorMessage(response, 'Kunne ikke opdatere brugerstatus'));
-            return;
+        if (response.ok) {
+            loadUsers();
         }
-
-        loadUsers();
     } catch (error) {
         console.error('Error toggling user:', error);
-        alert('Kunne ikke opdatere brugerstatus');
     }
 }

@@ -2900,18 +2657,13 @@ async function resetPassword(userId) {
     if (!newPassword) return;

     try {
-        const response = await fetch(`/api/v1/admin/users/${userId}/reset-password`, {
-            method: 'POST',
-            headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ new_password: newPassword })
-        });
+        const response = await fetch(`/api/v1/users/${userId}/reset-password?new_password=${newPassword}`, {
+            method: 'POST'
+        });

-        if (!response.ok) {
-            alert(await getErrorMessage(response, 'Kunne ikke nulstille adgangskode'));
-            return;
+        if (response.ok) {
+            alert('Adgangskode nulstillet!');
         }
-
-        alert('Adgangskode nulstillet!');
     } catch (error) {
         console.error('Error resetting password:', error);
         alert('Kunne ikke nulstille adgangskode');
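The reset-password change above moves the new password out of the URL query string and into a JSON body, which keeps the secret out of access logs, proxies, and browser history. A hedged sketch contrasting the two request shapes (paths and field names taken from the diff; the helper functions themselves are illustrative):

```python
import json
from urllib.parse import urlencode

def old_style_request(user_id: int, new_password: str):
    # v2.2.0 shape: the secret ends up in the URL, so it can leak
    # into server and proxy access logs.
    qs = urlencode({"new_password": new_password})
    return (f"/api/v1/users/{user_id}/reset-password?{qs}", None)

def new_style_request(user_id: int, new_password: str):
    # main shape: the secret travels in the JSON body instead.
    body = json.dumps({"new_password": new_password})
    return (f"/api/v1/admin/users/{user_id}/reset-password", body)
```

The query-string variant also breaks on passwords containing characters like `&` or `#` unless they are URL-encoded, which the original JS did not do.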
@@ -2959,28 +2711,6 @@ async function loadAIPrompts() {
     const container = document.getElementById('aiPromptsContent');

     const accordionHtml = `
-        <div class="card border-0 shadow-sm mb-3">
-            <div class="card-body py-3">
-                <div class="row g-2 align-items-center">
-                    <div class="col-12 col-md-4">
-                        <label class="form-label mb-1 small text-muted">Timeout pr. test (sek)</label>
-                        <input id="aiTestTimeoutSeconds" type="number" class="form-control form-control-sm" min="5" max="300" step="1" value="90">
-                    </div>
-                    <div class="col-12 col-md-8 d-flex justify-content-md-end gap-2 align-self-end">
-                        <button type="button" class="btn btn-sm btn-outline-secondary" onclick="clearAiPromptLog()">
-                            <i class="bi bi-trash me-1"></i>Ryd log
-                        </button>
-                        <button type="button" class="btn btn-sm btn-outline-secondary" onclick="renderAiPromptLog()">
-                            <i class="bi bi-arrow-clockwise me-1"></i>Opdater log
-                        </button>
-                    </div>
-                    <div class="col-12">
-                        <div id="aiPromptLogWindow" class="border rounded p-2 bg-light" style="max-height: 220px; overflow-y: auto; font-family: ui-monospace, SFMono-Regular, Menlo, monospace; font-size: 0.8rem;"></div>
-                    </div>
-                </div>
-            </div>
-        </div>
-
         <div class="accordion" id="aiPromptsAccordion">
             ${Object.entries(prompts).map(([key, prompt], index) => `
                 <div class="accordion-item">
@@ -3037,9 +2767,6 @@ async function loadAIPrompts() {
 <button class="btn btn-outline-danger" onclick="resetPrompt('${key}')" title="Nulstil til standard">
     <i class="bi bi-arrow-counterclockwise"></i> Nulstil
 </button>` : ''}
-<button class="btn btn-outline-success" onclick="testPrompt('${key}')" id="testBtn_${key}" title="Test AI prompt">
-    <i class="bi bi-play-circle"></i> Test
-</button>
 <button class="btn btn-outline-primary" onclick="editPrompt('${key}')" id="editBtn_${key}" title="Rediger Prompt">
     <i class="bi bi-pencil"></i> Rediger
 </button>
@@ -3054,8 +2781,6 @@ async function loadAIPrompts() {
 <textarea id="edit_prompt_${key}" class="form-control d-none p-3 bg-white text-dark rounded-bottom"
     style="height: 300px; font-family: monospace; font-size: 0.85rem; border-radius: 0;">${escapeHtml(prompt.prompt)}</textarea>

-<div id="testResult_${key}" class="alert alert-secondary m-3 py-2 px-3 d-none" style="white-space: pre-wrap; font-size: 0.85rem;"></div>
-
 <div id="editActions_${key}" class="position-absolute bottom-0 end-0 p-3 d-none">
     <button class="btn btn-sm btn-secondary me-1" onclick="cancelEdit('${key}')">Annuller</button>
     <button class="btn btn-sm btn-success" onclick="savePrompt('${key}')"><i class="bi bi-check-lg"></i> Gem</button>
@@ -3070,7 +2795,6 @@ async function loadAIPrompts() {
     `;

     container.innerHTML = accordionHtml;
-    renderAiPromptLog();

 } catch (error) {
     console.error('Error loading AI prompts:', error);
@@ -3150,94 +2874,6 @@ async function resetPrompt(key) {
     }
 }

-async function testPrompt(key) {
-    const btn = document.getElementById(`testBtn_${key}`);
-    const resultElement = document.getElementById(`testResult_${key}`);
-    const editElement = document.getElementById(`edit_prompt_${key}`);
-
-    const promptText = editElement ? editElement.value : '';
-    const timeoutInput = document.getElementById('aiTestTimeoutSeconds');
-    const timeoutSecondsRaw = timeoutInput ? Number(timeoutInput.value) : 90;
-    const timeoutSeconds = Math.min(300, Math.max(5, Number.isFinite(timeoutSecondsRaw) ? timeoutSecondsRaw : 90));
-    const originalHtml = btn.innerHTML;
-
-    btn.disabled = true;
-    btn.innerHTML = '<span class="spinner-border spinner-border-sm me-1" role="status" aria-hidden="true"></span>Tester';
-
-    resultElement.className = 'alert alert-secondary m-3 py-2 px-3';
-    resultElement.classList.remove('d-none');
-    resultElement.textContent = 'Tester AI...';
-    addAiPromptLogEntry('info', key, `Test startet (timeout=${timeoutSeconds}s)`);
-
-    try {
-        const response = await fetch(`/api/v1/ai-prompts/${key}/test`, {
-            method: 'POST',
-            headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ prompt_text: promptText, timeout_seconds: timeoutSeconds })
-        });
-
-        if (!response.ok) {
-            const message = await getErrorMessage(response, 'Kunne ikke teste AI prompt');
-            throw new Error(message);
-        }
-
-        const result = await response.json();
-        const fullResponse = (result.ai_response || '').trim();
-        const preview = fullResponse.length > 1200 ? `${fullResponse.slice(0, 1200)}\n...` : fullResponse;
-
-        resultElement.className = 'alert alert-success m-3 py-2 px-3';
-        resultElement.textContent =
-            `✅ AI svar modtaget (${result.latency_ms} ms)\n` +
-            `Model: ${result.model}\n\n` +
-            `${preview || '[Tomt svar]'}`;
-        addAiPromptLogEntry('success', key, `OK (${result.latency_ms} ms, timeout=${result.timeout_seconds || timeoutSeconds}s)`);
-    } catch (error) {
-        console.error('Error testing AI prompt:', error);
-        resultElement.className = 'alert alert-danger m-3 py-2 px-3';
-        resultElement.textContent = `❌ ${error.message || 'Kunne ikke teste AI prompt'}`;
-        addAiPromptLogEntry('error', key, error.message || 'Kunne ikke teste AI prompt');
-    } finally {
-        btn.disabled = false;
-        btn.innerHTML = originalHtml;
-    }
-}
-
-let aiPromptTestLog = [];
-
-function addAiPromptLogEntry(level, key, message) {
-    aiPromptTestLog.unshift({
-        level,
-        key,
-        message,
-        timestamp: new Date().toISOString(),
-    });
-    if (aiPromptTestLog.length > 200) {
-        aiPromptTestLog = aiPromptTestLog.slice(0, 200);
-    }
-    renderAiPromptLog();
-}
-
-function renderAiPromptLog() {
-    const logWindow = document.getElementById('aiPromptLogWindow');
-    if (!logWindow) return;
-
-    if (!aiPromptTestLog.length) {
-        logWindow.innerHTML = '<div class="text-muted">Ingen test-log endnu.</div>';
-        return;
-    }
-
-    logWindow.innerHTML = aiPromptTestLog.map((row) => {
-        const icon = row.level === 'success' ? 'bi-check-circle text-success' : row.level === 'error' ? 'bi-x-circle text-danger' : 'bi-info-circle text-primary';
-        const ts = new Date(row.timestamp).toLocaleString('da-DK');
-        return `<div class="mb-1"><i class="bi ${icon} me-1"></i><strong>[${ts}]</strong> ${escapeHtml(row.key)}: ${escapeHtml(row.message)}</div>`;
-    }).join('');
-}
-
-function clearAiPromptLog() {
-    aiPromptTestLog = [];
-    renderAiPromptLog();
-}
-
-
-
 function copyPrompt(key) {
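Two small patterns in the removed `testPrompt`/log helpers are worth noting: the per-test timeout is clamped into a [5, 300] second range with a 90 second default, and the in-memory test log is capped at 200 entries with the newest first. A minimal Python sketch of both, under assumed names:

```python
from collections import deque

MAX_LOG_ENTRIES = 200  # same cap the removed JS helper used

def clamp_timeout(value, default=90, lo=5, hi=300):
    # Fall back to the default when the input is not a number,
    # then clamp into the allowed range (mirrors the JS
    # Math.min(300, Math.max(5, ...)) expression).
    try:
        seconds = float(value)
    except (TypeError, ValueError):
        seconds = default
    return min(hi, max(lo, seconds))

# deque(maxlen=...) drops the oldest entry automatically;
# appendleft keeps the newest entry at index 0, like unshift in JS.
test_log = deque(maxlen=MAX_LOG_ENTRIES)

def add_log_entry(level, key, message):
    test_log.appendleft({"level": level, "key": key, "message": message})
```

Using a bounded deque avoids the manual `slice(0, 200)` truncation the JS version performed after every insert.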
@@ -3322,8 +2958,6 @@ document.querySelectorAll('.settings-nav .nav-link').forEach(link => {
     // Load data for tab
     if (tab === 'users') {
         loadUsers();
-    } else if (tab === 'tags') {
-        loadTagsManagement();
     } else if (tab === 'telefoni') {
         renderTelefoniSettings();
     } else if (tab === 'ai-prompts') {
@@ -3398,19 +3032,13 @@ let showInactive = false;
 async function loadTagsManagement() {
     try {
         const response = await fetch('/api/v1/tags');
-        if (!response.ok) {
-            const msg = await getErrorMessage(response, 'Kunne ikke indlæse tags');
-            throw new Error(msg);
-        }
+        if (!response.ok) throw new Error('Failed to load tags');
         allTagsData = await response.json();
         updateTagsStats();
         renderTagsGrid();
     } catch (error) {
         console.error('Error loading tags:', error);
-        allTagsData = [];
-        updateTagsStats();
-        renderTagsGrid();
-        showNotification('Fejl ved indlæsning af tags: ' + (error.message || 'ukendt fejl'), 'error');
+        showNotification('Fejl ved indlæsning af tags', 'error');
     }
 }

@@ -3625,11 +3253,10 @@ if (tagsNavLink) {
 
 // ====== SYNC MANAGEMENT ======
 let syncLog = [];
-let archivedSyncPollInterval = null;
 
 async function loadSyncStats() {
     try {
-        const response = await fetch('/api/v1/customers?limit=1000');
+        const response = await fetch('/api/v1/customers?limit=10000');
         if (!response.ok) throw new Error('Failed to load customers');
         const data = await response.json();
         const customers = data.customers || [];
@@ -3694,211 +3321,6 @@ function addSyncLogEntry(title, message, status = 'info') {
     loadSyncLog();
 }
 
-async function parseApiError(response, fallbackMessage) {
-    let detailMessage = fallbackMessage;
-
-    try {
-        const errorPayload = await response.json();
-        if (errorPayload && errorPayload.detail) {
-            detailMessage = errorPayload.detail;
-        }
-    } catch (parseError) {
-        // Keep fallback message
-    }
-
-    if (response.status === 403 && detailMessage === '2FA required') {
-        return '2FA kræves for sync API. Aktivér 2FA på din bruger og log ind igen.';
-    }
-
-    if (response.status === 403) {
-        if (String(detailMessage).includes('Missing required permission') || String(detailMessage).includes('Superadmin access required')) {
-            return 'Kun admin/superadmin må starte eller overvåge sync.';
-        }
-    }
-
-    return detailMessage;
-}
-
-function updateArchivedSourceBadge(sourceKey, isSynced, hasError) {
-    const badgeId = sourceKey === 'simplycrm' ? 'archivedSimplyBadge' : 'archivedVtigerBadge';
-    const badge = document.getElementById(badgeId);
-    if (!badge) return;
-
-    if (hasError) {
-        badge.className = 'badge bg-danger';
-        badge.textContent = 'Fejl';
-        return;
-    }
-
-    if (isSynced === true) {
-        badge.className = 'badge bg-success';
-        badge.textContent = 'Synket';
-        return;
-    }
-
-    badge.className = 'badge bg-warning text-dark';
-    badge.textContent = 'Mangler';
-}
-
-function startArchivedSyncPolling() {
-    if (archivedSyncPollInterval) return;
-    archivedSyncPollInterval = setInterval(() => {
-        loadArchivedSyncStatus();
-    }, 15000);
-}
-
-function stopArchivedSyncPolling() {
-    if (!archivedSyncPollInterval) return;
-    clearInterval(archivedSyncPollInterval);
-    archivedSyncPollInterval = null;
-}
-
-async function loadArchivedSyncStatus() {
-    const overallBadge = document.getElementById('archivedOverallBadge');
-    const lastChecked = document.getElementById('archivedLastChecked');
-    const hint = document.getElementById('archivedStatusHint');
-
-    try {
-        const response = await fetch('/api/v1/ticket/archived/status');
-        if (!response.ok) {
-            const errorMessage = await parseApiError(response, 'Kunne ikke hente archived status');
-            throw new Error(errorMessage);
-        }
-
-        const status = await response.json();
-        const simply = (status.sources || {}).simplycrm || {};
-        const vtiger = (status.sources || {}).vtiger || {};
-
-        const setText = (id, value) => {
-            const el = document.getElementById(id);
-            if (el) el.textContent = value === null || value === undefined ? '-' : value;
-        };
-
-        setText('archivedSimplyRemoteCount', simply.remote_total_tickets);
-        setText('archivedSimplyLocalCount', simply.local_total_tickets);
-        setText('archivedSimplyDiff', simply.diff);
-        setText('archivedSimplyMessagesCount', simply.local_total_messages);
-
-        setText('archivedVtigerRemoteCount', vtiger.remote_total_tickets);
-        setText('archivedVtigerLocalCount', vtiger.local_total_tickets);
-        setText('archivedVtigerDiff', vtiger.diff);
-        setText('archivedVtigerMessagesCount', vtiger.local_total_messages);
-
-        updateArchivedSourceBadge('simplycrm', simply.is_synced, !!simply.error);
-        updateArchivedSourceBadge('vtiger', vtiger.is_synced, !!vtiger.error);
-
-        if (overallBadge) {
-            if (status.overall_synced === true) {
-                overallBadge.className = 'badge bg-success';
-                overallBadge.textContent = 'Alt synced';
-            } else {
-                overallBadge.className = 'badge bg-warning text-dark';
-                overallBadge.textContent = 'Ikke fuldt synced';
-            }
-        }
-
-        if (lastChecked) {
-            lastChecked.textContent = new Date().toLocaleString('da-DK');
-        }
-
-        if (hint) {
-            const errors = [simply.error, vtiger.error].filter(Boolean);
-            hint.textContent = errors.length > 0
-                ? `Statusfejl: ${errors.join(' | ')}`
-                : 'Polling aktiv mens Sync-fanen er åben.';
-        }
-    } catch (error) {
-        if (overallBadge) {
-            overallBadge.className = 'badge bg-danger';
-            overallBadge.textContent = 'Statusfejl';
-        }
-        if (hint) {
-            hint.textContent = error.message;
-        }
-        console.error('Error loading archived sync status:', error);
-    }
-}
-
-async function syncArchivedSimply() {
-    const btn = document.getElementById('btnSyncArchivedSimply');
-    if (!btn) return;
-
-    btn.disabled = true;
-    btn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Synkroniserer...';
-
-    try {
-        addSyncLogEntry('Simply Archived Sync Startet', 'Importerer archived tickets fra Simply...', 'info');
-
-        const response = await fetch('/api/v1/ticket/archived/simply/import?limit=5000&include_messages=true&force=false', {
-            method: 'POST'
-        });
-
-        if (!response.ok) {
-            const errorMessage = await parseApiError(response, 'Simply archived sync fejlede');
-            throw new Error(errorMessage);
-        }
-
-        const result = await response.json();
-        const details = [
-            `Importeret: ${result.imported || 0}`,
-            `Opdateret: ${result.updated || 0}`,
-            `Sprunget over: ${result.skipped || 0}`,
-            `Fejl: ${result.errors || 0}`,
-            `Beskeder: ${result.messages_imported || 0}`
-        ].join(' | ');
-        addSyncLogEntry('Simply Archived Sync Fuldført', details, 'success');
-
-        await loadArchivedSyncStatus();
-        showNotification('Simply archived sync fuldført!', 'success');
-    } catch (error) {
-        addSyncLogEntry('Simply Archived Sync Fejl', error.message, 'error');
-        showNotification('Fejl: ' + error.message, 'error');
-    } finally {
-        btn.disabled = false;
-        btn.innerHTML = '<i class="bi bi-cloud-download me-2"></i>Sync Simply Archived';
-    }
-}
-
-async function syncArchivedVtiger() {
-    const btn = document.getElementById('btnSyncArchivedVtiger');
-    if (!btn) return;
-
-    btn.disabled = true;
-    btn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Synkroniserer...';
-
-    try {
-        addSyncLogEntry('vTiger Archived Sync Startet', 'Importerer archived tickets fra vTiger Cases...', 'info');
-
-        const response = await fetch('/api/v1/ticket/archived/vtiger/import?limit=5000&include_messages=true&force=false', {
-            method: 'POST'
-        });
-
-        if (!response.ok) {
-            const errorMessage = await parseApiError(response, 'vTiger archived sync fejlede');
-            throw new Error(errorMessage);
-        }
-
-        const result = await response.json();
-        const details = [
-            `Importeret: ${result.imported || 0}`,
-            `Opdateret: ${result.updated || 0}`,
-            `Sprunget over: ${result.skipped || 0}`,
-            `Fejl: ${result.errors || 0}`,
-            `Beskeder: ${result.messages_imported || 0}`
-        ].join(' | ');
-        addSyncLogEntry('vTiger Archived Sync Fuldført', details, 'success');
-
-        await loadArchivedSyncStatus();
-        showNotification('vTiger archived sync fuldført!', 'success');
-    } catch (error) {
-        addSyncLogEntry('vTiger Archived Sync Fejl', error.message, 'error');
-        showNotification('Fejl: ' + error.message, 'error');
-    } finally {
-        btn.disabled = false;
-        btn.innerHTML = '<i class="bi bi-cloud-download me-2"></i>Sync vTiger Archived';
-    }
-}
-
 async function syncFromVtiger() {
     const btn = document.getElementById('btnSyncVtiger');
     btn.disabled = true;
@@ -3912,16 +3334,16 @@ async function syncFromVtiger() {
         });
 
         if (!response.ok) {
-            const errorMessage = await parseApiError(response, 'Sync fejlede');
-            throw new Error(errorMessage);
+            const error = await response.json();
+            throw new Error(error.detail || 'Sync fejlede');
         }
 
         const result = await response.json();
         const details = [
             `Behandlet: ${result.total_processed || 0}`,
-            `Linket: ${result.linked || 0}`,
+            `Oprettet: ${result.created || 0}`,
             `Opdateret: ${result.updated || 0}`,
-            `Ikke fundet/sprunget over: ${result.not_found || 0}`
+            `Sprunget over: ${result.skipped || 0}`
         ].join(' | ');
         addSyncLogEntry(
             'vTiger Sync Fuldført',
@@ -3955,8 +3377,8 @@ async function syncVtigerContacts() {
         });
 
         if (!response.ok) {
-            const errorMessage = await parseApiError(response, 'Sync fejlede');
-            throw new Error(errorMessage);
+            const error = await response.json();
+            throw new Error(error.detail || 'Sync fejlede');
         }
 
         const result = await response.json();
@@ -3996,17 +3418,16 @@ async function syncFromEconomic() {
         });
 
         if (!response.ok) {
-            const errorMessage = await parseApiError(response, 'Sync fejlede');
-            throw new Error(errorMessage);
+            const error = await response.json();
+            throw new Error(error.detail || 'Sync fejlede');
         }
 
         const result = await response.json();
         const details = [
             `Behandlet: ${result.total_processed || 0}`,
-            `Oprettet: ${result.created || 0}`,
-            `Opdateret: ${result.updated || 0}`,
-            `Konflikter: ${result.conflicts || 0}`,
-            `Sprunget over: ${result.skipped || 0}`
+            `Nye matchet: ${result.matched || 0}`,
+            `Verificeret: ${result.verified || 0}`,
+            `Ikke matchet: ${result.not_matched || 0}`
         ].join(' | ');
         addSyncLogEntry(
             'e-conomic Sync Fuldført',
@@ -4028,9 +3449,6 @@ async function syncFromEconomic() {
 }
 
 async function syncCvrToEconomic() {
-    showNotification('CVR→e-conomic sync er midlertidigt deaktiveret.', 'info');
-    return;
-
     const btn = document.getElementById('btnSyncCvrEconomic');
     btn.disabled = true;
     btn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Søger...';
@@ -4077,17 +3495,9 @@ if (syncNavLink) {
     syncNavLink.addEventListener('click', () => {
         loadSyncStats();
         loadSyncLog();
-        loadArchivedSyncStatus();
-        startArchivedSyncPolling();
     });
 }
 
-document.addEventListener('visibilitychange', () => {
-    if (document.hidden) {
-        stopArchivedSyncPolling();
-    }
-});
-
 // Notification helper
 function showNotification(message, type = 'info') {
     // Create toast notification
@@ -4514,45 +3924,6 @@ async function loadEmailTemplateCustomers() {
     }
 }
 
-async function loadDefaultEmailSignatureTemplate() {
-    try {
-        const response = await fetch('/api/v1/settings/email_default_signature_template');
-        if (!response.ok) {
-            throw new Error('Kunne ikke hente signatur-skabelon');
-        }
-        const setting = await response.json();
-        const textarea = document.getElementById('emailDefaultSignatureTemplate');
-        if (textarea) {
-            textarea.value = setting.value || '';
-        }
-    } catch (error) {
-        console.error('Error loading default email signature template:', error);
-    }
-}
-
-async function saveDefaultEmailSignatureTemplate() {
-    const textarea = document.getElementById('emailDefaultSignatureTemplate');
-    if (!textarea) return;
-
-    try {
-        const response = await fetch('/api/v1/settings/email_default_signature_template', {
-            method: 'PUT',
-            headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ value: textarea.value || '' })
-        });
-
-        if (!response.ok) {
-            const err = await response.json().catch(() => ({}));
-            throw new Error(err.detail || 'Kunne ikke gemme signatur-skabelon');
-        }
-
-        showNotification('Standard signatur gemt', 'success');
-    } catch (error) {
-        console.error('Error saving default email signature template:', error);
-        showNotification(error.message || 'Kunne ikke gemme signatur-skabelon', 'error');
-    }
-}
-
 async function openEmailTemplateModal() {
     // Reset form
     document.getElementById('emailTemplateForm').reset();
@@ -4698,7 +4069,6 @@ document.addEventListener('DOMContentLoaded', () => {
     // Other loaders are called at bottom of file in existing script
     loadEmailTemplates();
     loadEmailTemplateCustomers();
-    loadDefaultEmailSignatureTemplate();
 });
 </script>
 
@@ -246,14 +246,12 @@
                     <li><a class="dropdown-item py-2" href="/hardware"><i class="bi bi-laptop me-2"></i>Hardware Assets</a></li>
                     <li><a class="dropdown-item py-2" href="/hardware/eset"><i class="bi bi-shield-check me-2"></i>ESET Oversigt</a></li>
                     <li><a class="dropdown-item py-2" href="/telefoni"><i class="bi bi-telephone me-2"></i>Telefoni</a></li>
-                    <li><a class="dropdown-item py-2" href="/dashboard/mission-control"><i class="bi bi-broadcast-pin me-2"></i>Mission Control</a></li>
                     <li><a class="dropdown-item py-2" href="/app/locations"><i class="bi bi-map-fill me-2"></i>Lokaliteter</a></li>
                     <li><hr class="dropdown-divider"></li>
                     <li><a class="dropdown-item py-2" href="/prepaid-cards"><i class="bi bi-credit-card-2-front me-2"></i>Prepaid Cards</a></li>
                     <li><a class="dropdown-item py-2" href="/fixed-price-agreements"><i class="bi bi-calendar-check me-2"></i>Fastpris Aftaler</a></li>
                     <li><a class="dropdown-item py-2" href="/subscriptions"><i class="bi bi-repeat me-2"></i>Abonnementer</a></li>
                     <li><hr class="dropdown-divider"></li>
-                    <li><a class="dropdown-item py-2" href="/tags#search"><i class="bi bi-tags me-2"></i>Tag søgning</a></li>
                     <li><a class="dropdown-item py-2" href="#">Knowledge Base</a></li>
                 </ul>
             </li>
@@ -282,6 +280,21 @@
                     <li><a class="dropdown-item py-2" href="#">Abonnementer</a></li>
                     <li><a class="dropdown-item py-2" href="#">Betalinger</a></li>
                     <li><hr class="dropdown-divider"></li>
+                    <li class="dropdown-submenu">
+                        <a class="dropdown-item dropdown-toggle py-2" href="#" data-submenu-toggle="timetracking">
+                            <span><i class="bi bi-clock-history me-2"></i>Timetracking</span>
+                            <i class="bi bi-chevron-right small opacity-75"></i>
+                        </a>
+                        <ul class="dropdown-menu" data-submenu="timetracking">
+                            <li><a class="dropdown-item py-2" href="/timetracking"><i class="bi bi-speedometer2 me-2"></i>Dashboard</a></li>
+                            <li><a class="dropdown-item py-2" href="/timetracking/registrations"><i class="bi bi-list-columns-reverse me-2"></i>Registreringer</a></li>
+                            <li><a class="dropdown-item py-2" href="/timetracking/wizard"><i class="bi bi-magic me-2"></i>Godkend Timer</a></li>
+                            <li><a class="dropdown-item py-2" href="/timetracking/service-contract-wizard"><i class="bi bi-diagram-3 me-2"></i>Servicekontrakt Migration</a></li>
+                            <li><a class="dropdown-item py-2" href="/timetracking/orders"><i class="bi bi-receipt me-2"></i>Ordrer</a></li>
+                            <li><a class="dropdown-item py-2" href="/timetracking/customers"><i class="bi bi-people me-2"></i>Kunder</a></li>
+                        </ul>
+                    </li>
+                    <li><hr class="dropdown-divider"></li>
                     <li><a class="dropdown-item py-2" href="#">Rapporter</a></li>
                 </ul>
             </li>
@@ -292,19 +305,6 @@
             </li>
         </ul>
         <div class="d-flex align-items-center gap-3">
-            <div class="dropdown">
-                <a class="nav-link dropdown-toggle" href="#" role="button" data-bs-toggle="dropdown" aria-expanded="false">
-                    <i class="bi bi-clock-history me-2"></i>Data migration
-                </a>
-                <ul class="dropdown-menu dropdown-menu-end mt-2">
-                    <li><a class="dropdown-item py-2" href="/timetracking"><i class="bi bi-speedometer2 me-2"></i>Dashboard</a></li>
-                    <li><a class="dropdown-item py-2" href="/timetracking/registrations"><i class="bi bi-list-columns-reverse me-2"></i>Registreringer</a></li>
-                    <li><a class="dropdown-item py-2" href="/timetracking/wizard"><i class="bi bi-magic me-2"></i>Godkend Timer</a></li>
-                    <li><a class="dropdown-item py-2" href="/timetracking/service-contract-wizard"><i class="bi bi-diagram-3 me-2"></i>Servicekontrakt Migration</a></li>
-                    <li><a class="dropdown-item py-2" href="/timetracking/orders"><i class="bi bi-receipt me-2"></i>Ordrer</a></li>
-                    <li><a class="dropdown-item py-2" href="/timetracking/customers"><i class="bi bi-people me-2"></i>Kunder</a></li>
-                </ul>
-            </div>
             <button class="btn btn-light rounded-circle border-0" id="quickCreateBtn" style="background: var(--accent-light); color: var(--accent);" title="Opret ny sag (+ eller Cmd+Shift+C)">
                 <i class="bi bi-plus-circle-fill fs-5"></i>
             </button>
@@ -320,7 +320,6 @@
             <ul class="dropdown-menu dropdown-menu-end mt-2">
                 <li><a class="dropdown-item py-2" href="#" data-bs-toggle="modal" data-bs-target="#profileModal">Profil</a></li>
                 <li><a class="dropdown-item py-2" href="/settings"><i class="bi bi-gear me-2"></i>Indstillinger</a></li>
-                <li><a class="dropdown-item py-2" href="/tags#search"><i class="bi bi-tags me-2"></i>Tag søgning</a></li>
                 <li><a class="dropdown-item py-2" href="/backups"><i class="bi bi-hdd-stack me-2"></i>Backup System</a></li>
                 <li><a class="dropdown-item py-2" href="/devportal"><i class="bi bi-code-square me-2"></i>DEV Portal</a></li>
                 <li><hr class="dropdown-divider"></li>
@@ -1086,7 +1085,7 @@
 </script>
 
 <!-- QuickCreate Modal (AI-Powered Case Creation) -->
-{% include ["quick_create_modal.html", "shared/frontend/quick_create_modal.html"] ignore missing %}
+{% include "shared/frontend/quick_create_modal.html" %}
 
 <!-- Profile Modal -->
 <div class="modal fade" id="profileModal" tabindex="-1" aria-hidden="true">
@@ -303,19 +303,15 @@
 async function performAnalysis(text) {
     try {
         const userId = getUserId();
-        const response = await fetch('/api/v1/sag/analyze-quick-create', {
+        const response = await fetch(`/api/v1/sag/analyze-quick-create?user_id=${userId}`, {
             method: 'POST',
             headers: {'Content-Type': 'application/json'},
             credentials: 'include',
-            body: JSON.stringify({
-                text,
-                user_id: parseInt(userId, 10)
-            })
+            body: JSON.stringify({text})
         });
 
         if (!response.ok) {
-            const errorText = await response.text();
-            throw new Error(`Analysis failed (${response.status}): ${errorText || 'unknown error'}`);
+            throw new Error('Analysis failed');
         }
 
         const analysis = await response.json();
@@ -21,8 +21,6 @@ router = APIRouter()
 
 ALLOWED_STATUSES = {"draft", "active", "paused", "cancelled"}
 STAGING_KEY_SQL = "COALESCE(source_account_id, 'name:' || LOWER(COALESCE(source_customer_name, 'ukendt')))"
-ALLOWED_BILLING_DIRECTIONS = {"forward", "backward"}
-ALLOWED_PRICE_CHANGE_STATUSES = {"pending", "approved", "rejected", "applied"}
 
 
 def _staging_status_with_mapping(status: str, has_customer: bool) -> str:
@@ -167,15 +165,6 @@ async def create_subscription(payload: Dict[str, Any]):
     billing_interval = payload.get("billing_interval")
     billing_day = payload.get("billing_day")
     start_date = payload.get("start_date")
-    billing_direction = (payload.get("billing_direction") or "forward").strip().lower()
-    advance_months = int(payload.get("advance_months") or 1)
-    first_full_period_start = payload.get("first_full_period_start")
-    binding_months = int(payload.get("binding_months") or 0)
-    binding_start_date_raw = payload.get("binding_start_date") or start_date
-    binding_group_key = payload.get("binding_group_key")
-    invoice_merge_key = payload.get("invoice_merge_key")
-    price_change_case_id = payload.get("price_change_case_id")
-    renewal_case_id = payload.get("renewal_case_id")
     notes = payload.get("notes")
     line_items = payload.get("line_items") or []
 
@@ -189,12 +178,6 @@ async def create_subscription(payload: Dict[str, Any]):
         raise HTTPException(status_code=400, detail="start_date is required")
     if not line_items:
         raise HTTPException(status_code=400, detail="line_items is required")
-    if billing_direction not in ALLOWED_BILLING_DIRECTIONS:
-        raise HTTPException(status_code=400, detail="billing_direction must be forward or backward")
-    if advance_months < 1 or advance_months > 24:
-        raise HTTPException(status_code=400, detail="advance_months must be between 1 and 24")
-    if binding_months < 0:
-        raise HTTPException(status_code=400, detail="binding_months must be >= 0")
 
     sag = execute_query_single(
         "SELECT id, customer_id FROM sag_sager WHERE id = %s",
@@ -219,27 +202,18 @@ async def create_subscription(payload: Dict[str, Any]):
     product_map = {}
     if product_ids:
         rows = execute_query(
-            """
-            SELECT id, name, sales_price, serial_number_required, asset_required
-            FROM products
-            WHERE id = ANY(%s)
-            """,
+            "SELECT id, name, sales_price FROM products WHERE id = ANY(%s)",
            (product_ids,)
         )
         product_map = {row["id"]: row for row in (rows or [])}
 
     cleaned_items = []
     total_price = 0
-    blocked_reasons = []
     for idx, item in enumerate(line_items, start=1):
         product_id = item.get("product_id")
         description = (item.get("description") or "").strip()
         quantity = item.get("quantity")
         unit_price = item.get("unit_price")
-        asset_id = item.get("asset_id")
-        serial_number = (item.get("serial_number") or "").strip() or None
-        period_from = item.get("period_from")
-        period_to = item.get("period_to")
 
         product = product_map.get(product_id)
         if not description and product:
@@ -254,58 +228,21 @@ async def create_subscription(payload: Dict[str, Any]):
 if unit_price is None or float(unit_price) < 0:
 raise HTTPException(status_code=400, detail="line_items unit_price must be >= 0")

-if asset_id is not None:
-asset = execute_query_single(
-"SELECT id FROM hardware_assets WHERE id = %s AND deleted_at IS NULL",
-(asset_id,)
-)
-if not asset:
-raise HTTPException(status_code=400, detail=f"asset_id {asset_id} was not found")
-
-requires_asset = bool(product and product.get("asset_required"))
-requires_serial_number = bool(product and product.get("serial_number_required"))
-item_block_reasons: List[str] = []
-if requires_asset and not asset_id:
-item_block_reasons.append("Asset mangler")
-if requires_serial_number and not serial_number:
-item_block_reasons.append("Serienummer mangler")

 line_total = float(quantity) * float(unit_price)
 total_price += line_total
-billing_blocked = len(item_block_reasons) > 0
-billing_block_reason = "; ".join(item_block_reasons) if billing_blocked else None
-if billing_block_reason:
-blocked_reasons.append(f"{description}: {billing_block_reason}")
 cleaned_items.append({
 "line_no": idx,
 "product_id": product_id,
-"asset_id": asset_id,
 "description": description,
 "quantity": quantity,
 "unit_price": unit_price,
 "line_total": line_total,
-"period_from": period_from,
-"period_to": period_to,
-"requires_serial_number": requires_serial_number,
-"serial_number": serial_number,
-"billing_blocked": billing_blocked,
-"billing_block_reason": billing_block_reason,
 })

 product_name = cleaned_items[0]["description"]
 if len(cleaned_items) > 1:
 product_name = f"{product_name} (+{len(cleaned_items) - 1})"

-billing_blocked = len(blocked_reasons) > 0
-billing_block_reason = " | ".join(blocked_reasons) if billing_blocked else None
-
-binding_start_date = _safe_date(binding_start_date_raw)
-if not binding_start_date:
-raise HTTPException(status_code=400, detail="binding_start_date must be a valid date")
-binding_end_date = None
-if binding_months > 0:
-binding_end_date = binding_start_date + relativedelta(months=binding_months)

 # Calculate next_invoice_date based on billing_interval

 start_dt = datetime.strptime(start_date, "%Y-%m-%d").date()
@@ -335,30 +272,14 @@ async def create_subscription(payload: Dict[str, Any]):
 customer_id,
 product_name,
 billing_interval,
-billing_direction,
-advance_months,
-first_full_period_start,
 billing_day,
 price,
 start_date,
 period_start,
 next_invoice_date,
-binding_months,
-binding_start_date,
-binding_end_date,
-binding_group_key,
-billing_blocked,
-billing_block_reason,
-invoice_merge_key,
-price_change_case_id,
-renewal_case_id,
 status,
 notes
-) VALUES (
-%s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
-%s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
-%s, 'draft', %s
-)
+) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, 'draft', %s)
 RETURNING *
 """,
 (
@@ -366,23 +287,11 @@ async def create_subscription(payload: Dict[str, Any]):
 sag["customer_id"],
 product_name,
 billing_interval,
-billing_direction,
-advance_months,
-first_full_period_start,
 billing_day,
 total_price,
 start_date,
 period_start,
 next_invoice_date,
-binding_months,
-binding_start_date,
-binding_end_date,
-binding_group_key,
-billing_blocked,
-billing_block_reason,
-invoice_merge_key,
-price_change_case_id,
-renewal_case_id,
 notes,
 )
 )
@@ -395,34 +304,20 @@ async def create_subscription(payload: Dict[str, Any]):
 subscription_id,
 line_no,
 product_id,
-asset_id,
 description,
 quantity,
 unit_price,
-line_total,
-period_from,
-period_to,
-requires_serial_number,
-serial_number,
-billing_blocked,
-billing_block_reason
-) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+line_total
+) VALUES (%s, %s, %s, %s, %s, %s, %s)
 """,
 (
 subscription["id"],
 item["line_no"],
 item["product_id"],
-item["asset_id"],
 item["description"],
 item["quantity"],
 item["unit_price"],
 item["line_total"],
-item["period_from"],
-item["period_to"],
-item["requires_serial_number"],
-item["serial_number"],
-item["billing_blocked"],
-item["billing_block_reason"],
 )
 )

@@ -453,25 +348,13 @@ async def get_subscription(subscription_id: int):
 c.name AS customer_name,
 s.product_name,
 s.billing_interval,
-s.billing_direction,
-s.advance_months,
-s.first_full_period_start,
 s.billing_day,
 s.price,
 s.start_date,
 s.end_date,
 s.next_invoice_date,
 s.period_start,
-s.binding_months,
-s.binding_start_date,
-s.binding_end_date,
-s.binding_group_key,
 s.notice_period_days,
-s.billing_blocked,
-s.billing_block_reason,
-s.invoice_merge_key,
-s.price_change_case_id,
-s.renewal_case_id,
 s.status,
 s.notes,
 s.cancelled_at,
@@ -494,18 +377,11 @@ async def get_subscription(subscription_id: int):
 i.id,
 i.line_no,
 i.product_id,
-i.asset_id,
 p.name AS product_name,
 i.description,
 i.quantity,
 i.unit_price,
-i.line_total,
-i.period_from,
-i.period_to,
-i.requires_serial_number,
-i.serial_number,
-i.billing_blocked,
-i.billing_block_reason
+i.line_total
 FROM sag_subscription_items i
 LEFT JOIN products p ON p.id = i.product_id
 WHERE i.subscription_id = %s
@@ -540,11 +416,7 @@ async def update_subscription(subscription_id: int, payload: Dict[str, Any]):
 allowed_fields = {
 "product_name", "billing_interval", "billing_day", "price",
 "start_date", "end_date", "next_invoice_date", "period_start",
-"notice_period_days", "status", "notes",
-"billing_direction", "advance_months", "first_full_period_start",
-"binding_months", "binding_start_date", "binding_end_date", "binding_group_key",
-"billing_blocked", "billing_block_reason", "invoice_merge_key",
-"price_change_case_id", "renewal_case_id"
+"notice_period_days", "status", "notes"
 }

 updates = []
@@ -599,23 +471,13 @@ async def update_subscription(subscription_id: int, payload: Dict[str, Any]):
 """
 INSERT INTO sag_subscription_items (
 subscription_id, line_no, description,
-quantity, unit_price, line_total, product_id,
-asset_id, period_from, period_to,
-requires_serial_number, serial_number,
-billing_blocked, billing_block_reason
-) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+quantity, unit_price, line_total, product_id
+) VALUES (%s, %s, %s, %s, %s, %s, %s)
 """,
 (
 subscription_id, idx, description,
 quantity, unit_price, line_total,
-item.get("product_id"),
-item.get("asset_id"),
-item.get("period_from"),
-item.get("period_to"),
-bool(item.get("requires_serial_number")),
-item.get("serial_number"),
-bool(item.get("billing_blocked")),
-item.get("billing_block_reason"),
+item.get("product_id")
 )
 )

@@ -675,13 +537,10 @@ async def list_subscriptions(status: str = Query("all")):
 c.name AS customer_name,
 s.product_name,
 s.billing_interval,
-s.billing_direction,
 s.billing_day,
 s.price,
 s.start_date,
 s.end_date,
-s.billing_blocked,
-s.invoice_merge_key,
 s.status,
 (SELECT COUNT(*) FROM sag_subscription_items WHERE subscription_id = s.id) as item_count
 FROM sag_subscriptions s
@@ -743,475 +602,6 @@ async def trigger_subscription_processing():
 raise HTTPException(status_code=500, detail=str(e))


-@router.get("/sag-subscriptions/{subscription_id}/price-changes", response_model=List[Dict[str, Any]])
-async def list_subscription_price_changes(subscription_id: int):
-"""List planned price changes for one subscription."""
-try:
-query = """
-SELECT
-spc.id,
-spc.subscription_id,
-spc.subscription_item_id,
-spc.sag_id,
-sg.titel AS sag_title,
-spc.change_scope,
-spc.old_unit_price,
-spc.new_unit_price,
-spc.effective_date,
-spc.approval_status,
-spc.reason,
-spc.approved_by_user_id,
-spc.approved_at,
-spc.created_by_user_id,
-spc.created_at,
-spc.updated_at
-FROM subscription_price_changes spc
-LEFT JOIN sag_sager sg ON sg.id = spc.sag_id
-WHERE spc.subscription_id = %s
-AND spc.deleted_at IS NULL
-ORDER BY spc.effective_date ASC, spc.id ASC
-"""
-return execute_query(query, (subscription_id,)) or []
-except Exception as e:
-logger.error(f"❌ Error listing subscription price changes: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
-@router.post("/sag-subscriptions/{subscription_id}/price-changes", response_model=Dict[str, Any])
-async def create_subscription_price_change(subscription_id: int, payload: Dict[str, Any]):
-"""Create a planned price change (case is mandatory)."""
-try:
-new_unit_price = payload.get("new_unit_price")
-effective_date = payload.get("effective_date")
-sag_id = payload.get("sag_id")
-subscription_item_id = payload.get("subscription_item_id")
-reason = payload.get("reason")
-created_by_user_id = payload.get("created_by_user_id")
-
-if new_unit_price is None:
-raise HTTPException(status_code=400, detail="new_unit_price is required")
-if float(new_unit_price) < 0:
-raise HTTPException(status_code=400, detail="new_unit_price must be >= 0")
-if not effective_date:
-raise HTTPException(status_code=400, detail="effective_date is required")
-if not sag_id:
-raise HTTPException(status_code=400, detail="sag_id is required")
-
-subscription = execute_query_single(
-"SELECT id, customer_id, price FROM sag_subscriptions WHERE id = %s",
-(subscription_id,)
-)
-if not subscription:
-raise HTTPException(status_code=404, detail="Subscription not found")
-
-sag = execute_query_single(
-"SELECT id, customer_id FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
-(sag_id,)
-)
-if not sag:
-raise HTTPException(status_code=400, detail="Sag not found")
-if int(sag.get("customer_id") or 0) != int(subscription.get("customer_id") or 0):
-raise HTTPException(status_code=400, detail="Sag customer mismatch for subscription")
-
-change_scope = "subscription"
-old_unit_price = subscription.get("price")
-if subscription_item_id is not None:
-item = execute_query_single(
-"""
-SELECT id, unit_price
-FROM sag_subscription_items
-WHERE id = %s AND subscription_id = %s
-""",
-(subscription_item_id, subscription_id)
-)
-if not item:
-raise HTTPException(status_code=400, detail="subscription_item_id not found on this subscription")
-change_scope = "item"
-old_unit_price = item.get("unit_price")
-
-result = execute_query(
-"""
-INSERT INTO subscription_price_changes (
-subscription_id,
-subscription_item_id,
-sag_id,
-change_scope,
-old_unit_price,
-new_unit_price,
-effective_date,
-approval_status,
-reason,
-created_by_user_id
-) VALUES (%s, %s, %s, %s, %s, %s, %s, 'pending', %s, %s)
-RETURNING *
-""",
-(
-subscription_id,
-subscription_item_id,
-sag_id,
-change_scope,
-old_unit_price,
-new_unit_price,
-effective_date,
-reason,
-created_by_user_id,
-)
-)
-return result[0] if result else {}
-except HTTPException:
-raise
-except Exception as e:
-logger.error(f"❌ Error creating subscription price change: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
-@router.patch("/sag-subscriptions/price-changes/{change_id}/approve", response_model=Dict[str, Any])
-async def approve_subscription_price_change(change_id: int, payload: Dict[str, Any]):
-"""Approve or reject a planned price change."""
-try:
-approval_status = (payload.get("approval_status") or "approved").strip().lower()
-approved_by_user_id = payload.get("approved_by_user_id")
-if approval_status not in ALLOWED_PRICE_CHANGE_STATUSES:
-raise HTTPException(status_code=400, detail="Invalid approval_status")
-if approval_status == "applied":
-raise HTTPException(status_code=400, detail="Use apply endpoint to set applied status")
-
-result = execute_query(
-"""
-UPDATE subscription_price_changes
-SET approval_status = %s,
-approved_by_user_id = %s,
-approved_at = CURRENT_TIMESTAMP,
-updated_at = CURRENT_TIMESTAMP
-WHERE id = %s
-AND deleted_at IS NULL
-RETURNING *
-""",
-(approval_status, approved_by_user_id, change_id)
-)
-if not result:
-raise HTTPException(status_code=404, detail="Price change not found")
-return result[0]
-except HTTPException:
-raise
-except Exception as e:
-logger.error(f"❌ Error approving subscription price change: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
-@router.patch("/sag-subscriptions/price-changes/{change_id}/apply", response_model=Dict[str, Any])
-async def apply_subscription_price_change(change_id: int):
-"""Apply an approved price change to subscription or item pricing."""
-conn = get_db_connection()
-try:
-with conn.cursor(cursor_factory=RealDictCursor) as cursor:
-cursor.execute(
-"""
-SELECT *
-FROM subscription_price_changes
-WHERE id = %s
-AND deleted_at IS NULL
-""",
-(change_id,)
-)
-change = cursor.fetchone()
-if not change:
-raise HTTPException(status_code=404, detail="Price change not found")
-if change.get("approval_status") not in ("approved", "pending"):
-raise HTTPException(status_code=400, detail="Price change must be approved or pending before apply")
-
-subscription_id = int(change["subscription_id"])
-change_scope = change.get("change_scope")
-new_unit_price = float(change.get("new_unit_price") or 0)
-
-if change_scope == "item" and change.get("subscription_item_id"):
-cursor.execute(
-"""
-UPDATE sag_subscription_items
-SET unit_price = %s,
-line_total = ROUND((quantity * %s)::numeric, 2),
-updated_at = CURRENT_TIMESTAMP
-WHERE id = %s
-""",
-(new_unit_price, new_unit_price, change["subscription_item_id"])
-)
-else:
-cursor.execute(
-"""
-UPDATE sag_subscription_items
-SET unit_price = %s,
-line_total = ROUND((quantity * %s)::numeric, 2),
-updated_at = CURRENT_TIMESTAMP
-WHERE subscription_id = %s
-""",
-(new_unit_price, new_unit_price, subscription_id)
-)
-
-cursor.execute(
-"""
-SELECT COALESCE(SUM(line_total), 0) AS total
-FROM sag_subscription_items
-WHERE subscription_id = %s
-""",
-(subscription_id,)
-)
-row = cursor.fetchone() or {"total": 0}
-cursor.execute(
-"""
-UPDATE sag_subscriptions
-SET price = %s,
-price_change_case_id = %s,
-updated_at = CURRENT_TIMESTAMP
-WHERE id = %s
-""",
-(row.get("total") or 0, change.get("sag_id"), subscription_id)
-)
-
-cursor.execute(
-"""
-UPDATE subscription_price_changes
-SET approval_status = 'applied',
-approved_at = COALESCE(approved_at, CURRENT_TIMESTAMP),
-updated_at = CURRENT_TIMESTAMP
-WHERE id = %s
-RETURNING *
-""",
-(change_id,)
-)
-updated_change = cursor.fetchone()
-
-conn.commit()
-return updated_change or {}
-except HTTPException:
-conn.rollback()
-raise
-except Exception as e:
-conn.rollback()
-logger.error(f"❌ Error applying subscription price change: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-finally:
-release_db_connection(conn)
-
-
-@router.get("/sag-subscriptions/{subscription_id}/asset-bindings", response_model=List[Dict[str, Any]])
-async def list_subscription_asset_bindings(subscription_id: int):
-"""List asset bindings attached to a subscription."""
-try:
-return execute_query(
-"""
-SELECT
-b.id,
-b.subscription_id,
-b.asset_id,
-b.shared_binding_key,
-b.binding_months,
-b.start_date,
-b.end_date,
-b.notice_period_days,
-b.status,
-b.sag_id,
-b.created_by_user_id,
-b.created_at,
-b.updated_at,
-h.brand,
-h.model,
-h.serial_number AS asset_serial_number,
-h.internal_asset_id,
-h.status AS asset_status
-FROM subscription_asset_bindings b
-LEFT JOIN hardware_assets h ON h.id = b.asset_id
-WHERE b.subscription_id = %s
-AND b.deleted_at IS NULL
-ORDER BY b.start_date DESC, b.id DESC
-""",
-(subscription_id,)
-) or []
-except Exception as e:
-logger.error(f"❌ Error listing subscription asset bindings: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
-@router.post("/sag-subscriptions/{subscription_id}/asset-bindings", response_model=Dict[str, Any])
-async def create_subscription_asset_binding(subscription_id: int, payload: Dict[str, Any]):
-"""Create a binding for one asset under a subscription."""
-try:
-asset_id = payload.get("asset_id")
-start_date_raw = payload.get("start_date")
-end_date_raw = payload.get("end_date")
-binding_months = int(payload.get("binding_months") or 0)
-shared_binding_key = payload.get("shared_binding_key")
-notice_period_days = int(payload.get("notice_period_days") or 30)
-sag_id = payload.get("sag_id")
-created_by_user_id = payload.get("created_by_user_id")
-
-if not asset_id:
-raise HTTPException(status_code=400, detail="asset_id is required")
-if notice_period_days < 0:
-raise HTTPException(status_code=400, detail="notice_period_days must be >= 0")
-if binding_months < 0:
-raise HTTPException(status_code=400, detail="binding_months must be >= 0")
-
-subscription = execute_query_single(
-"SELECT id, customer_id, start_date FROM sag_subscriptions WHERE id = %s",
-(subscription_id,)
-)
-if not subscription:
-raise HTTPException(status_code=404, detail="Subscription not found")
-
-asset = execute_query_single(
-"SELECT id FROM hardware_assets WHERE id = %s AND deleted_at IS NULL",
-(asset_id,)
-)
-if not asset:
-raise HTTPException(status_code=400, detail="Asset not found")
-
-if sag_id:
-sag = execute_query_single(
-"SELECT id, customer_id FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
-(sag_id,)
-)
-if not sag:
-raise HTTPException(status_code=400, detail="Sag not found")
-if int(sag.get("customer_id") or 0) != int(subscription.get("customer_id") or 0):
-raise HTTPException(status_code=400, detail="Sag customer mismatch for subscription")
-
-start_date = _safe_date(start_date_raw) or _safe_date(subscription.get("start_date")) or date.today()
-end_date = _safe_date(end_date_raw)
-if not end_date and binding_months > 0:
-end_date = start_date + relativedelta(months=binding_months)
-
-result = execute_query(
-"""
-INSERT INTO subscription_asset_bindings (
-subscription_id,
-asset_id,
-shared_binding_key,
-binding_months,
-start_date,
-end_date,
-notice_period_days,
-status,
-sag_id,
-created_by_user_id
-) VALUES (%s, %s, %s, %s, %s, %s, %s, 'active', %s, %s)
-RETURNING *
-""",
-(
-subscription_id,
-asset_id,
-shared_binding_key,
-binding_months,
-start_date,
-end_date,
-notice_period_days,
-sag_id,
-created_by_user_id,
-)
-)
-if not result:
-raise HTTPException(status_code=500, detail="Could not create binding")
-
-execute_query(
-"""
-UPDATE sag_subscription_items
-SET asset_id = %s,
-updated_at = CURRENT_TIMESTAMP
-WHERE subscription_id = %s
-AND asset_id IS NULL
-""",
-(asset_id, subscription_id)
-)
-
-return result[0]
-except HTTPException:
-raise
-except Exception as e:
-logger.error(f"❌ Error creating subscription asset binding: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
-@router.patch("/sag-subscriptions/asset-bindings/{binding_id}", response_model=Dict[str, Any])
-async def update_subscription_asset_binding(binding_id: int, payload: Dict[str, Any]):
-"""Update status/dates/notice for a subscription asset binding."""
-try:
-allowed_fields = {
-"shared_binding_key",
-"binding_months",
-"start_date",
-"end_date",
-"notice_period_days",
-"status",
-"sag_id",
-}
-updates = []
-values = []
-for field, value in payload.items():
-if field in allowed_fields:
-updates.append(f"{field} = %s")
-values.append(value)
-
-if "status" in payload and payload.get("status") not in {"active", "ended", "cancelled"}:
-raise HTTPException(status_code=400, detail="Invalid binding status")
-
-if "notice_period_days" in payload and int(payload.get("notice_period_days") or 0) < 0:
-raise HTTPException(status_code=400, detail="notice_period_days must be >= 0")
-
-if not updates:
-existing = execute_query_single(
-"SELECT * FROM subscription_asset_bindings WHERE id = %s AND deleted_at IS NULL",
-(binding_id,)
-)
-if not existing:
-raise HTTPException(status_code=404, detail="Binding not found")
-return existing
-
-values.append(binding_id)
-result = execute_query(
-f"""
-UPDATE subscription_asset_bindings
-SET {', '.join(updates)},
-updated_at = CURRENT_TIMESTAMP
-WHERE id = %s
-AND deleted_at IS NULL
-RETURNING *
-""",
-tuple(values)
-)
-if not result:
-raise HTTPException(status_code=404, detail="Binding not found")
-return result[0]
-except HTTPException:
-raise
-except Exception as e:
-logger.error(f"❌ Error updating subscription asset binding: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
-@router.delete("/sag-subscriptions/asset-bindings/{binding_id}", response_model=Dict[str, Any])
-async def delete_subscription_asset_binding(binding_id: int):
-"""Soft-delete a subscription asset binding."""
-try:
-result = execute_query(
-"""
-UPDATE subscription_asset_bindings
-SET deleted_at = CURRENT_TIMESTAMP,
-updated_at = CURRENT_TIMESTAMP
-WHERE id = %s
-AND deleted_at IS NULL
-RETURNING id
-""",
-(binding_id,)
-)
-if not result:
-raise HTTPException(status_code=404, detail="Binding not found")
-return {"status": "deleted", "id": result[0].get("id")}
-except HTTPException:
-raise
-except Exception as e:
-logger.error(f"❌ Error deleting subscription asset binding: {e}", exc_info=True)
-raise HTTPException(status_code=500, detail=str(e))
-
-
 @router.post("/simply-subscription-staging/import", response_model=Dict[str, Any])
 async def import_simply_subscriptions_to_staging():
 """Import recurring Simply CRM SalesOrders into staging (parking area)."""
@@ -24,24 +24,22 @@ HUB owns (manual or first-sync only):
 SYNC RULES:
 ===========
 1. NEVER overwrite source ID if already set (vtiger_id, economic_customer_number)
-2. Matching is source-specific (e-conomic: strict economic_customer_number only)
+2. Match order: CVR → Source ID → Name (normalized)
 3. Re-sync is idempotent - can run multiple times safely
 4. Contact relationships are REPLACED on sync (not added)
 5. Each sync only updates fields it owns
 """

 import logging
-from fastapi import APIRouter, Depends, HTTPException
+from fastapi import APIRouter, HTTPException
 from typing import Dict, Any
 from app.core.database import execute_query
-from app.core.auth_dependencies import require_any_permission
 from app.services.vtiger_service import get_vtiger_service
 import re

 logger = logging.getLogger(__name__)

 router = APIRouter()
-sync_admin_access = require_any_permission("users.manage", "system.admin")


 def normalize_name(name: str) -> str:
@@ -55,7 +53,7 @@ def normalize_name(name: str) -> str:


 @router.post("/sync/vtiger")
-async def sync_from_vtiger(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
+async def sync_from_vtiger() -> Dict[str, Any]:
 """
 Link vTiger accounts to existing Hub customers
 Matches by CVR or normalized name, updates vtiger_id
@@ -188,7 +186,7 @@ async def sync_from_vtiger(current_user: dict = Depends(sync_admin_access)) -> D


 @router.post("/sync/vtiger-contacts")
-async def sync_vtiger_contacts(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
+async def sync_vtiger_contacts() -> Dict[str, Any]:
 """
 SIMPEL TILGANG - Sync contacts from vTiger and link to customers
 Step 1: Fetch all contacts from vTiger
@@ -448,7 +446,7 @@ async def sync_vtiger_contacts(current_user: dict = Depends(sync_admin_access))


 @router.post("/sync/economic")
-async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
+async def sync_from_economic() -> Dict[str, Any]:
 """
 Sync customers from e-conomic (PRIMARY SOURCE)
 Creates/updates Hub customers with e-conomic data
@ -477,11 +475,9 @@ async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) ->
|
|||||||
created_count = 0
|
created_count = 0
|
||||||
updated_count = 0
|
updated_count = 0
|
||||||
skipped_count = 0
|
skipped_count = 0
|
||||||
conflict_count = 0
|
|
||||||
|
|
||||||
for eco_customer in economic_customers:
|
for eco_customer in economic_customers:
|
||||||
customer_number_raw = eco_customer.get('customerNumber')
|
customer_number = eco_customer.get('customerNumber')
|
||||||
customer_number = str(customer_number_raw).strip() if customer_number_raw is not None else None
|
|
||||||
cvr = eco_customer.get('corporateIdentificationNumber')
|
cvr = eco_customer.get('corporateIdentificationNumber')
|
||||||
name = eco_customer.get('name', '').strip()
|
name = eco_customer.get('name', '').strip()
|
||||||
address = eco_customer.get('address', '')
|
address = eco_customer.get('address', '')
|
||||||
@ -512,27 +508,20 @@ async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) ->
|
|||||||
# Extract email domain
|
# Extract email domain
|
||||||
email_domain = email.split('@')[-1] if '@' in email else None
|
email_domain = email.split('@')[-1] if '@' in email else None
|
||||||
|
|
||||||
# Strict matching: ONLY match by economic_customer_number
|
# Check if customer exists by economic_customer_number OR CVR
|
||||||
existing = execute_query(
|
existing = execute_query(
|
||||||
"SELECT id, name FROM customers WHERE economic_customer_number = %s ORDER BY id",
|
"SELECT id FROM customers WHERE economic_customer_number = %s",
|
||||||
(customer_number,)
|
(customer_number,)
|
||||||
)
|
)
|
||||||
|
|
||||||
# Conflict handling: duplicate local rows for same e-conomic number
|
# If not found by customer number, try CVR (to avoid duplicates)
|
||||||
if len(existing) > 1:
|
if not existing and cvr:
|
||||||
conflict_count += 1
|
existing = execute_query(
|
||||||
skipped_count += 1
|
"SELECT id FROM customers WHERE cvr_number = %s",
|
||||||
duplicate_ids = ", ".join(str(row['id']) for row in existing)
|
(cvr,)
|
||||||
logger.error(
|
|
||||||
"❌ Konflikt: e-conomic #%s matcher %s lokale kunder (ids: %s) - springer over",
|
|
||||||
customer_number,
|
|
||||||
len(existing),
|
|
||||||
duplicate_ids
|
|
||||||
)
|
)
|
||||||
continue
|
|
||||||
|
|
||||||
if existing:
|
if existing:
|
||||||
target_customer_id = existing[0]['id']
|
|
||||||
# Update existing customer - ONLY update fields e-conomic owns
|
# Update existing customer - ONLY update fields e-conomic owns
|
||||||
# E-conomic does NOT overwrite: name, cvr_number (set once only)
|
# E-conomic does NOT overwrite: name, cvr_number (set once only)
|
||||||
update_query = """
|
update_query = """
|
||||||
@ -548,16 +537,10 @@ async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) ->
|
|||||||
WHERE id = %s
|
WHERE id = %s
|
||||||
"""
|
"""
|
||||||
execute_query(update_query, (
|
execute_query(update_query, (
|
||||||
customer_number, email_domain, address, city, zip_code, country, website, target_customer_id
|
customer_number, email_domain, address, city, zip_code, country, website, existing[0]['id']
|
||||||
))
|
))
|
||||||
updated_count += 1
|
updated_count += 1
|
||||||
logger.info(
|
logger.info(f"✏️ Opdateret: {name} (e-conomic #{customer_number}, CVR: {cvr or 'ingen'})")
|
||||||
"✏️ Opdateret lokal kunde id=%s: %s (e-conomic #%s, CVR: %s)",
|
|
||||||
target_customer_id,
|
|
||||||
name,
|
|
||||||
customer_number,
|
|
||||||
cvr or 'ingen'
|
|
||||||
)
|
|
||||||
else:
|
else:
|
||||||
# Create new customer from e-conomic
|
# Create new customer from e-conomic
|
||||||
insert_query = """
|
insert_query = """
|
||||||
@ -572,32 +555,17 @@ async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) ->
|
|||||||
))
|
))
|
||||||
|
|
||||||
if result:
|
if result:
|
||||||
new_customer_id = result[0]['id']
|
|
||||||
created_count += 1
|
created_count += 1
|
||||||
logger.info(
|
logger.info(f"✨ Oprettet: {name} (e-conomic #{customer_number}, CVR: {cvr or 'ingen'})")
|
||||||
"✨ Oprettet lokal kunde id=%s: %s (e-conomic #%s, CVR: %s)",
|
|
||||||
new_customer_id,
|
|
||||||
name,
|
|
||||||
customer_number,
|
|
||||||
cvr or 'ingen'
|
|
||||||
)
|
|
||||||
else:
|
else:
|
||||||
skipped_count += 1
|
skipped_count += 1
|
||||||
|
|
||||||
logger.info(
|
logger.info(f"✅ e-conomic sync fuldført: {created_count} oprettet, {updated_count} opdateret, {skipped_count} sprunget over af {len(economic_customers)} totalt")
|
||||||
"✅ e-conomic sync fuldført: %s oprettet, %s opdateret, %s konflikter, %s sprunget over af %s totalt",
|
|
||||||
created_count,
|
|
||||||
updated_count,
|
|
||||||
conflict_count,
|
|
||||||
skipped_count,
|
|
||||||
len(economic_customers)
|
|
||||||
)
|
|
||||||
|
|
||||||
return {
|
return {
|
||||||
"status": "success",
|
"status": "success",
|
||||||
"created": created_count,
|
"created": created_count,
|
||||||
"updated": updated_count,
|
"updated": updated_count,
|
||||||
"conflicts": conflict_count,
|
|
||||||
"skipped": skipped_count,
|
"skipped": skipped_count,
|
||||||
"total_processed": len(economic_customers)
|
"total_processed": len(economic_customers)
|
||||||
}
|
}
|
||||||
@ -608,7 +576,7 @@ async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) ->
|
|||||||
|
|
||||||
|
|
||||||
@router.post("/sync/cvr-to-economic")
|
@router.post("/sync/cvr-to-economic")
|
||||||
async def sync_cvr_to_economic(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
|
async def sync_cvr_to_economic() -> Dict[str, Any]:
|
||||||
"""
|
"""
|
||||||
Find customers in Hub with CVR but without e-conomic customer number
|
Find customers in Hub with CVR but without e-conomic customer number
|
||||||
Search e-conomic for matching CVR and update Hub
|
Search e-conomic for matching CVR and update Hub
|
||||||
@ -670,7 +638,7 @@ async def sync_cvr_to_economic(current_user: dict = Depends(sync_admin_access))
|
|||||||
|
|
||||||
|
|
||||||
@router.get("/sync/diagnostics")
|
@router.get("/sync/diagnostics")
|
||||||
async def sync_diagnostics(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
|
async def sync_diagnostics() -> Dict[str, Any]:
|
||||||
"""
|
"""
|
||||||
Diagnostics: Check contact linking coverage
|
Diagnostics: Check contact linking coverage
|
||||||
Shows why contacts aren't linking to customers
|
Shows why contacts aren't linking to customers
|
||||||
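The `normalize_name` helper that backs the "matches by CVR or normalized name" behavior is referenced only in a hunk header here, so its body is not part of this diff. A hypothetical, database-free sketch of that matching order follows; the normalization rules, `match_account`, and the dict shapes are all assumptions for illustration, not the Hub's actual implementation:

```python
import re
from typing import List, Optional

def normalize_name(name: str) -> str:
    # Hypothetical stand-in for the Hub's normalize_name helper: lowercase,
    # drop common Danish company suffixes, keep only letters and digits.
    cleaned = re.sub(r"\b(aps|a/s|ivs|i/s)\b", "", name.lower())
    return re.sub(r"[^a-z0-9æøå]+", "", cleaned)

def match_account(account: dict, customers: List[dict]) -> Optional[dict]:
    # CVR match wins; otherwise fall back to normalized-name comparison,
    # mirroring the order stated in the sync_from_vtiger docstring.
    cvr = account.get("cvr")
    if cvr:
        for customer in customers:
            if customer.get("cvr_number") == cvr:
                return customer
    key = normalize_name(account.get("name", ""))
    if key:
        for customer in customers:
            if normalize_name(customer.get("name", "")) == key:
                return customer
    return None
```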
@@ -6,7 +6,7 @@ from typing import Optional, List, Literal
 from datetime import datetime
 
 # Tag types
-TagType = Literal['workflow', 'status', 'category', 'priority', 'billing', 'brand', 'type']
+TagType = Literal['workflow', 'status', 'category', 'priority', 'billing']
 TagGroupBehavior = Literal['multi', 'single', 'toggle']
 
 
@@ -37,7 +37,6 @@ class TagBase(BaseModel):
     icon: Optional[str] = None
     is_active: bool = True
     tag_group_id: Optional[int] = None
-    catch_words: Optional[List[str]] = None
 
 class TagCreate(TagBase):
     """Tag creation model"""
@@ -60,7 +59,6 @@ class TagUpdate(BaseModel):
     icon: Optional[str] = None
     is_active: Optional[bool] = None
     tag_group_id: Optional[int] = None
-    catch_words: Optional[List[str]] = None
 
 
 class EntityTagBase(BaseModel):
@@ -1,10 +1,8 @@
 """
 Tag system API endpoints
 """
-from fastapi import APIRouter, HTTPException, Query
+from fastapi import APIRouter, HTTPException
 from typing import List, Optional
-import json
-import re
 from app.tags.backend.models import (
     Tag, TagCreate, TagUpdate,
     EntityTag, EntityTagCreate,
@@ -16,197 +14,6 @@ from app.core.database import execute_query, execute_query_single, execute_updat
 
 router = APIRouter(prefix="/tags")
 
-MODULE_LABELS = {
-    "case": "Sager",
-    "email": "Email",
-    "ticket": "Tickets",
-    "customer": "Kunder",
-    "contact": "Kontakter",
-    "time_entry": "Tid",
-    "order": "Ordrer",
-    "comment": "Ticket kommentarer",
-    "worklog": "Ticket worklog",
-}
-
-
-def _module_label_for_entity_type(entity_type: Optional[str]) -> str:
-    key = str(entity_type or "").strip().lower()
-    if not key:
-        return "Ukendt modul"
-    return MODULE_LABELS.get(key, f"Ukendt modul ({key})")
-
-
-def _entity_reference_payload(entity_type: Optional[str], entity_id: Optional[int]) -> dict:
-    etype = str(entity_type or "").strip().lower()
-    eid = int(entity_id or 0)
-    default_label = f"#{eid}" if eid else "Ukendt"
-
-    if not etype or not eid:
-        return {"entity_title": default_label, "entity_url": None}
-
-    try:
-        if etype == "case":
-            row = execute_query_single(
-                "SELECT id, titel FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
-                (eid,),
-            )
-            if row:
-                title = str(row.get("titel") or "Sag").strip()
-                return {"entity_title": title, "entity_url": f"/sag/{eid}"}
-
-        elif etype == "email":
-            row = execute_query_single(
-                "SELECT id, subject FROM email_messages WHERE id = %s AND deleted_at IS NULL",
-                (eid,),
-            )
-            if row:
-                title = str(row.get("subject") or "Email").strip()
-                return {"entity_title": title, "entity_url": f"/emails?id={eid}"}
-
-        elif etype == "ticket":
-            row = execute_query_single(
-                "SELECT id, ticket_number, subject FROM tticket_tickets WHERE id = %s",
-                (eid,),
-            )
-            if row:
-                ticket_number = str(row.get("ticket_number") or "").strip()
-                subject = str(row.get("subject") or "Ticket").strip()
-                title = f"{ticket_number} - {subject}" if ticket_number else subject
-                return {"entity_title": title, "entity_url": f"/ticket/tickets/{eid}"}
-
-        elif etype == "customer":
-            row = execute_query_single("SELECT id, name FROM customers WHERE id = %s", (eid,))
-            if row:
-                title = str(row.get("name") or "Kunde").strip()
-                return {"entity_title": title, "entity_url": f"/customers/{eid}"}
-
-        elif etype == "contact":
-            row = execute_query_single(
-                "SELECT id, first_name, last_name, email FROM contacts WHERE id = %s",
-                (eid,),
-            )
-            if row:
-                name = " ".join(
-                    [str(row.get("first_name") or "").strip(), str(row.get("last_name") or "").strip()]
-                ).strip()
-                title = name or str(row.get("email") or "Kontakt").strip()
-                return {"entity_title": title, "entity_url": f"/contacts/{eid}"}
-
-        elif etype == "time_entry":
-            row = execute_query_single(
-                "SELECT id, description, worked_date FROM tmodule_times WHERE id = %s",
-                (eid,),
-            )
-            if row:
-                description = str(row.get("description") or "Tidsregistrering").strip()
-                return {"entity_title": description[:90], "entity_url": "/timetracking/registrations"}
-
-        elif etype == "order":
-            row = execute_query_single(
-                "SELECT id, order_number, total_amount FROM tmodule_orders WHERE id = %s",
-                (eid,),
-            )
-            if row:
-                order_number = str(row.get("order_number") or "Ordre").strip()
-                total_amount = row.get("total_amount")
-                suffix = f" ({total_amount} kr.)" if total_amount is not None else ""
-                return {"entity_title": f"{order_number}{suffix}", "entity_url": "/timetracking/orders"}
-
-        elif etype == "worklog":
-            row = execute_query_single(
-                """
-                SELECT w.id, w.description, w.ticket_id, t.ticket_number
-                FROM tticket_worklog w
-                LEFT JOIN tticket_tickets t ON t.id = w.ticket_id
-                WHERE w.id = %s
-                """,
-                (eid,),
-            )
-            if row:
-                ticket_id = row.get("ticket_id")
-                ticket_number = str(row.get("ticket_number") or "Ticket").strip()
-                description = str(row.get("description") or "Worklog").strip()
-                url = f"/ticket/tickets/{ticket_id}" if ticket_id else None
-                return {"entity_title": f"{ticket_number} - {description[:70]}", "entity_url": url}
-
-        elif etype == "comment":
-            row = execute_query_single(
-                """
-                SELECT c.id, c.comment_text, c.ticket_id, t.ticket_number
-                FROM tticket_comments c
-                LEFT JOIN tticket_tickets t ON t.id = c.ticket_id
-                WHERE c.id = %s
-                """,
-                (eid,),
-            )
-            if row:
-                ticket_id = row.get("ticket_id")
-                ticket_number = str(row.get("ticket_number") or "Ticket").strip()
-                comment_text = str(row.get("comment_text") or "Kommentar").strip()
-                url = f"/ticket/tickets/{ticket_id}" if ticket_id else None
-                return {"entity_title": f"{ticket_number} - {comment_text[:70]}", "entity_url": url}
-    except Exception:
-        pass
-
-    return {"entity_title": default_label, "entity_url": None}
-
-
-def _normalize_catch_words(value) -> List[str]:
-    """Normalize catch words from JSON/text/list to a clean lowercase list."""
-    if value is None:
-        return []
-    if isinstance(value, list):
-        words = value
-    elif isinstance(value, str):
-        stripped = value.strip()
-        if not stripped:
-            return []
-        if stripped.startswith("["):
-            try:
-                parsed = json.loads(stripped)
-                words = parsed if isinstance(parsed, list) else []
-            except Exception:
-                words = [w.strip() for w in stripped.replace("\n", ",").split(",")]
-        else:
-            words = [w.strip() for w in stripped.replace("\n", ",").split(",")]
-    else:
-        words = []
-
-    cleaned = []
-    seen = set()
-    for word in words:
-        normalized = str(word or "").strip().lower()
-        if len(normalized) < 2:
-            continue
-        if normalized in seen:
-            continue
-        seen.add(normalized)
-        cleaned.append(normalized)
-    return cleaned
-
-
-def _tag_row_to_response(row: dict) -> dict:
-    """Ensure API response always exposes catch_words as a list."""
-    if not row:
-        return row
-    out = dict(row)
-
-    valid_types = {"workflow", "status", "category", "priority", "billing", "brand", "type"}
-    tag_type = str(out.get("type") or "").strip().lower()
-    if tag_type not in valid_types:
-        tag_type = "category"
-    out["type"] = tag_type
-
-    color = str(out.get("color") or "").strip()
-    if not re.fullmatch(r"#[0-9A-Fa-f]{6}", color):
-        out["color"] = "#0f4c75"
-
-    if not out.get("name"):
-        out["name"] = "Unnamed tag"
-
-    out["catch_words"] = _normalize_catch_words(out.get("catch_words"))
-    return out
 
 # ============= TAG GROUPS =============
 
 @router.get("/groups", response_model=List[TagGroup])
@@ -227,131 +34,13 @@ async def create_tag_group(group: TagGroupCreate):
 
 # ============= TAG CRUD =============
 
-@router.get("/usage")
-async def list_tag_usage(
-    tag_name: Optional[str] = Query(None),
-    tag_type: Optional[TagType] = Query(None),
-    module: Optional[str] = Query(None),
-    page: int = Query(1, ge=1),
-    page_size: int = Query(25, ge=1, le=200),
-    sort_by: str = Query("tagged_at"),
-    sort_dir: str = Query("desc"),
-):
-    """List tag usage across modules with server-side filtering and pagination."""
-    where_parts = ["1=1"]
-    params: List[object] = []
-
-    if tag_name:
-        where_parts.append("LOWER(t.name) LIKE LOWER(%s)")
-        params.append(f"%{tag_name.strip()}%")
-
-    if tag_type:
-        where_parts.append("t.type = %s")
-        params.append(tag_type)
-
-    if module:
-        where_parts.append("LOWER(et.entity_type) = LOWER(%s)")
-        params.append(module.strip())
-
-    where_clause = " AND ".join(where_parts)
-
-    sortable = {
-        "tagged_at": "et.tagged_at",
-        "tag_name": "t.name",
-        "tag_type": "t.type",
-        "module": "et.entity_type",
-        "entity_id": "et.entity_id",
-    }
-    order_column = sortable.get(sort_by, "et.tagged_at")
-    order_direction = "ASC" if str(sort_dir).lower() == "asc" else "DESC"
-
-    count_query = f"""
-        SELECT COUNT(*) AS total
-        FROM entity_tags et
-        JOIN tags t ON t.id = et.tag_id
-        WHERE {where_clause}
-    """
-    count_row = execute_query_single(count_query, tuple(params)) or {"total": 0}
-    total = int(count_row.get("total") or 0)
-
-    offset = (page - 1) * page_size
-    data_query = f"""
-        SELECT
-            et.id AS entity_tag_id,
-            et.entity_type,
-            et.entity_id,
-            et.tagged_at,
-            t.id AS tag_id,
-            t.name AS tag_name,
-            t.type AS tag_type,
-            t.color AS tag_color,
-            t.is_active AS tag_is_active
-        FROM entity_tags et
-        JOIN tags t ON t.id = et.tag_id
-        WHERE {where_clause}
-        ORDER BY {order_column} {order_direction}, et.id DESC
-        LIMIT %s OFFSET %s
-    """
-
-    rows = execute_query(data_query, tuple(params + [page_size, offset])) or []
-    items = []
-    for row in rows:
-        entity_type = row.get("entity_type")
-        entity_ref = _entity_reference_payload(entity_type, row.get("entity_id"))
-        items.append(
-            {
-                "entity_tag_id": row.get("entity_tag_id"),
-                "tag_id": row.get("tag_id"),
-                "tag_name": row.get("tag_name"),
-                "tag_type": row.get("tag_type"),
-                "tag_color": row.get("tag_color"),
-                "tag_is_active": bool(row.get("tag_is_active")),
-                "module": _module_label_for_entity_type(entity_type),
-                "entity_type": entity_type,
-                "entity_id": row.get("entity_id"),
-                "entity_title": entity_ref.get("entity_title"),
-                "entity_url": entity_ref.get("entity_url"),
-                "tagged_at": row.get("tagged_at"),
-            }
-        )
-
-    module_rows = execute_query(
-        "SELECT DISTINCT entity_type FROM entity_tags ORDER BY entity_type",
-        (),
-    ) or []
-    module_options = [
-        {
-            "value": row.get("entity_type"),
-            "label": _module_label_for_entity_type(row.get("entity_type")),
-        }
-        for row in module_rows
-    ]
-
-    return {
-        "items": items,
-        "pagination": {
-            "page": page,
-            "page_size": page_size,
-            "total": total,
-            "total_pages": (total + page_size - 1) // page_size if total else 0,
-        },
-        "sort": {"sort_by": sort_by, "sort_dir": order_direction.lower()},
-        "module_options": module_options,
-    }
 
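The `total_pages` value in the removed usage endpoint's pagination payload is plain ceiling division guarded against an empty result set. Extracted as a tiny helper (the function name is ours, not the endpoint's):

```python
def total_pages(total: int, page_size: int) -> int:
    # Ceiling division: 26 rows at page_size 25 -> 2 pages; 0 rows -> 0 pages.
    return (total + page_size - 1) // page_size if total else 0
```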
 @router.get("", response_model=List[Tag])
 async def list_tags(
     type: Optional[TagType] = None,
     is_active: Optional[bool] = None
 ):
     """List all tags with optional filtering"""
-    query = """
-        SELECT id, name, type, description, color, icon, is_active, tag_group_id,
-               COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
-               created_at, updated_at
-        FROM tags
-        WHERE 1=1
-    """
+    query = "SELECT * FROM tags WHERE 1=1"
     params = []
 
     if type:
@@ -365,52 +54,32 @@ async def list_tags(
     query += " ORDER BY type, name"
 
     results = execute_query(query, tuple(params) if params else ())
-    return [_tag_row_to_response(row) for row in (results or [])]
+    return results
 
 @router.get("/{tag_id}", response_model=Tag)
 async def get_tag(tag_id: int):
     """Get single tag by ID"""
     result = execute_query_single(
-        """
-        SELECT id, name, type, description, color, icon, is_active, tag_group_id,
-               COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
-               created_at, updated_at
-        FROM tags
-        WHERE id = %s
-        """,
+        "SELECT * FROM tags WHERE id = %s",
         (tag_id,)
     )
     if not result:
         raise HTTPException(status_code=404, detail="Tag not found")
-    return _tag_row_to_response(result)
+    return result
 
 @router.post("", response_model=Tag)
 async def create_tag(tag: TagCreate):
     """Create new tag"""
     query = """
-        INSERT INTO tags (name, type, description, color, icon, is_active, tag_group_id, catch_words_json)
-        VALUES (%s, %s, %s, %s, %s, %s, %s, %s::jsonb)
-        RETURNING id, name, type, description, color, icon, is_active, tag_group_id,
-                  COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
-                  created_at, updated_at
+        INSERT INTO tags (name, type, description, color, icon, is_active, tag_group_id)
+        VALUES (%s, %s, %s, %s, %s, %s, %s)
+        RETURNING *
     """
-    catch_words = _normalize_catch_words(tag.catch_words)
     result = execute_query_single(
         query,
-        (
-            tag.name,
-            tag.type,
-            tag.description,
-            tag.color,
-            tag.icon,
-            tag.is_active,
-            tag.tag_group_id,
-            json.dumps(catch_words),
-        )
+        (tag.name, tag.type, tag.description, tag.color, tag.icon, tag.is_active, tag.tag_group_id)
     )
-    if not result:
-        raise HTTPException(status_code=500, detail="Failed to create tag")
-    return _tag_row_to_response(result)
+    return result
 
 @router.put("/{tag_id}", response_model=Tag)
 async def update_tag(tag_id: int, tag: TagUpdate):
@@ -437,9 +106,6 @@ async def update_tag(tag_id: int, tag: TagUpdate):
     if tag.tag_group_id is not None:
         updates.append("tag_group_id = %s")
         params.append(tag.tag_group_id)
-    if tag.catch_words is not None:
-        updates.append("catch_words_json = %s::jsonb")
-        params.append(json.dumps(_normalize_catch_words(tag.catch_words)))
 
     if not updates:
         raise HTTPException(status_code=400, detail="No fields to update")
@@ -451,15 +117,13 @@ async def update_tag(tag_id: int, tag: TagUpdate):
         UPDATE tags
         SET {', '.join(updates)}
         WHERE id = %s
-        RETURNING id, name, type, description, color, icon, is_active, tag_group_id,
-                  COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
-                  created_at, updated_at
+        RETURNING *
     """
 
     result = execute_query_single(query, tuple(params))
     if not result:
         raise HTTPException(status_code=404, detail="Tag not found")
-    return _tag_row_to_response(result)
+    return result
 
 @router.delete("/{tag_id}")
 async def delete_tag(tag_id: int):
@@ -550,91 +214,20 @@ async def remove_tag_from_entity_path(
 async def get_entity_tags(entity_type: str, entity_id: int):
     """Get all tags for a specific entity"""
     query = """
-        SELECT t.id, t.name, t.type, t.description, t.color, t.icon, t.is_active, t.tag_group_id,
-               COALESCE(t.catch_words_json, '[]'::jsonb) AS catch_words,
-               t.created_at, t.updated_at
+        SELECT t.*
         FROM tags t
         JOIN entity_tags et ON et.tag_id = t.id
         WHERE et.entity_type = %s AND et.entity_id = %s
         ORDER BY t.type, t.name
     """
     results = execute_query(query, (entity_type, entity_id))
-    return [_tag_row_to_response(row) for row in (results or [])]
+    return results
 
 
-@router.get("/entity/{entity_type}/{entity_id}/suggestions")
-async def suggest_entity_tags(entity_type: str, entity_id: int):
-    """Suggest tags based on catch words for brand/type tags."""
-    if entity_type != "case":
-        return []
-
-    case_row = execute_query_single(
-        "SELECT id, titel, beskrivelse, template_key FROM sag_sager WHERE id = %s",
-        (entity_id,),
-    )
-    if not case_row:
-        raise HTTPException(status_code=404, detail="Entity not found")
-
-    existing_rows = execute_query(
-        "SELECT tag_id FROM entity_tags WHERE entity_type = %s AND entity_id = %s",
-        (entity_type, entity_id),
-    ) or []
-    existing_tag_ids = {int(row.get("tag_id")) for row in existing_rows if row.get("tag_id") is not None}
-
-    candidate_rows = execute_query(
-        """
-        SELECT id, name, type, description, color, icon, is_active, tag_group_id,
-               COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
-               created_at, updated_at
-        FROM tags
-        WHERE is_active = true
-          AND type IN ('brand', 'type')
-        ORDER BY type, name
-        """,
-        (),
-    ) or []
-
-    haystack = " ".join(
-        [
-            str(case_row.get("titel") or ""),
-            str(case_row.get("beskrivelse") or ""),
-            str(case_row.get("template_key") or ""),
-        ]
-    ).lower()
-
-    suggestions = []
-    for row in candidate_rows:
-        tag_id = int(row.get("id"))
-        if tag_id in existing_tag_ids:
-            continue
-
-        catch_words = _normalize_catch_words(row.get("catch_words"))
-        if not catch_words:
-            continue
-
-        matched_words = [word for word in catch_words if word in haystack]
-        if not matched_words:
-            continue
-
-        suggestions.append(
-            {
-                "tag": _tag_row_to_response(row),
-                "matched_words": matched_words,
-                "score": len(matched_words),
-            }
-        )
-
-    suggestions.sort(key=lambda item: (-item["score"], item["tag"]["type"], item["tag"]["name"]))
-    return suggestions
-
 @router.get("/search")
 async def search_tags(q: str, type: Optional[TagType] = None):
     """Search tags by name (fuzzy search)"""
     query = """
-        SELECT id, name, type, description, color, icon, is_active, tag_group_id,
-               COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
-               created_at, updated_at
-        FROM tags
+        SELECT * FROM tags
         WHERE is_active = true
           AND LOWER(name) LIKE LOWER(%s)
     """
@@ -647,7 +240,7 @@ async def search_tags(q: str, type: Optional[TagType] = None):
     query += " ORDER BY name LIMIT 20"
 
     results = execute_query(query, tuple(params))
-    return [_tag_row_to_response(row) for row in (results or [])]
+    return results
 
 
 # ============= WORKFLOW MANAGEMENT =============
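The removed `suggest_entity_tags` endpoint scores a tag by how many of its catch words occur as substrings of the lowercased case text, then ranks by match count. The core loop, restated as a database-free sketch (the `score_tags` name and the plain-dict shapes are illustrative, not the endpoint's actual signature):

```python
from typing import Dict, List

def score_tags(case_text: str, tags: List[Dict]) -> List[Dict]:
    # A tag is suggested when any of its (already normalized, lowercase)
    # catch words appears in the lowercased case text; more matches rank higher,
    # with ties broken by tag type and name as in the removed endpoint.
    haystack = case_text.lower()
    suggestions = []
    for tag in tags:
        matched = [word for word in tag.get("catch_words", []) if word in haystack]
        if matched:
            suggestions.append({"tag": tag, "matched_words": matched, "score": len(matched)})
    suggestions.sort(key=lambda item: (-item["score"], item["tag"]["type"], item["tag"]["name"]))
    return suggestions
```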
@@ -1,8 +1,11 @@
-{% extends "shared/frontend/base.html" %}
-
-{% block title %}Tag Administration - BMC Hub{% endblock %}
-
-{% block extra_css %}
+<!DOCTYPE html>
+<html lang="da">
+<head>
+    <meta charset="UTF-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+    <title>Tag Administration - BMC Hub</title>
+    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
+    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.0/font/bootstrap-icons.css">
 <style>
     :root {
         --primary-color: #0f4c75;
@@ -11,8 +14,6 @@
         --category-color: #0f4c75;
         --priority-color: #dc3545;
         --billing-color: #2d6a4f;
-        --brand-color: #006d77;
-        --type-color: #5c677d;
     }

     .tag-badge {
@@ -36,8 +37,6 @@
     .tag-type-category { background-color: var(--category-color); color: white; }
     .tag-type-priority { background-color: var(--priority-color); color: white; }
     .tag-type-billing { background-color: var(--billing-color); color: white; }
-    .tag-type-brand { background-color: var(--brand-color); color: white; }
-    .tag-type-type { background-color: var(--type-color); color: white; }

     .tag-list-item {
         padding: 1rem;
@@ -54,8 +53,6 @@
     .tag-list-item[data-type="category"] { border-left-color: var(--category-color); }
     .tag-list-item[data-type="priority"] { border-left-color: var(--priority-color); }
     .tag-list-item[data-type="billing"] { border-left-color: var(--billing-color); }
-    .tag-list-item[data-type="brand"] { border-left-color: var(--brand-color); }
-    .tag-list-item[data-type="type"] { border-left-color: var(--type-color); }

     .color-preview {
         width: 40px;
@@ -63,68 +60,9 @@
         border-radius: 8px;
         border: 2px solid #dee2e6;
     }

-    .section-tabs .nav-link {
-        color: var(--primary-color);
-        font-weight: 600;
-    }
-
-    .section-tabs .nav-link.active {
-        background-color: var(--primary-color);
-        color: #fff;
-        border-color: var(--primary-color);
-    }
-
-    .module-badge {
-        display: inline-flex;
-        align-items: center;
-        gap: 0.3rem;
-        padding: 0.2rem 0.55rem;
-        border-radius: 999px;
-        font-size: 0.8rem;
-        background: #e7f1f8;
-        color: #0b3552;
-        border: 1px solid #c7dceb;
-    }
-
-    .usage-table thead th {
-        position: sticky;
-        top: 0;
-        z-index: 1;
-        background: #fff;
-        white-space: nowrap;
-    }
-
-    .usage-table .filter-cell {
-        min-width: 160px;
-    }
-
-    .usage-sort-btn {
-        border: 0;
-        background: transparent;
-        color: inherit;
-        font-weight: 600;
-        padding: 0;
-    }
-
-    .usage-sort-btn .bi {
-        font-size: 0.75rem;
-        opacity: 0.55;
-    }
-
-    .usage-sort-btn.active .bi {
-        opacity: 1;
-    }
-
-    @media (max-width: 991px) {
-        .usage-table .filter-cell {
-            min-width: 130px;
-        }
-    }
-
 </style>
-{% endblock %}
-{% block content %}
+</head>
+<body>
 <div class="container-fluid py-4">
     <div class="row mb-4">
         <div class="col">
@@ -138,17 +76,6 @@
         </div>
     </div>

-    <ul class="nav nav-pills section-tabs mb-4" id="sectionTabs">
-        <li class="nav-item">
-            <button type="button" class="nav-link active" data-section="admin">Tag administration</button>
-        </li>
-        <li class="nav-item">
-            <button type="button" class="nav-link" data-section="search">Tag søgning</button>
-        </li>
-    </ul>
-
-    <div id="tagAdminSection">
-
     <!-- Type Filter Tabs -->
     <ul class="nav nav-tabs mb-4" id="typeFilter">
         <li class="nav-item">
@@ -179,16 +106,6 @@
                 <span class="tag-badge tag-type-billing">Billing</span>
             </a>
         </li>
-        <li class="nav-item">
-            <a class="nav-link" href="#" data-type="brand">
-                <span class="tag-badge tag-type-brand">Brand</span>
-            </a>
-        </li>
-        <li class="nav-item">
-            <a class="nav-link" href="#" data-type="type">
-                <span class="tag-badge tag-type-type">Type</span>
-            </a>
-        </li>
     </ul>

     <!-- Tags List -->
@@ -203,98 +120,6 @@
             </div>
         </div>
     </div>
-    </div>
-
-    <div id="tagSearchSection" class="d-none">
-        <div class="card mb-3">
-            <div class="card-body">
-                <div class="d-flex flex-wrap justify-content-between align-items-center gap-2 mb-3">
-                    <div>
-                        <h5 class="mb-1">Tag søgning på tværs af moduler</h5>
-                        <p class="text-muted mb-0 small">Filtrer efter tag-navn, type og modul. Hver række viser tydeligt hvilket modul tagningen kommer fra.</p>
-                    </div>
-                    <button type="button" class="btn btn-outline-secondary btn-sm" id="resetUsageFiltersBtn">
-                        <i class="bi bi-arrow-counterclockwise"></i> Nulstil filtre
-                    </button>
-                </div>
-
-                <div class="table-responsive">
-                    <table class="table table-hover align-middle usage-table mb-2">
-                        <thead>
-                            <tr>
-                                <th>
-                                    <button type="button" class="usage-sort-btn" data-sort-by="tag_name">
-                                        Tag <i class="bi bi-chevron-expand"></i>
-                                    </button>
-                                </th>
-                                <th>
-                                    <button type="button" class="usage-sort-btn" data-sort-by="tag_type">
-                                        Type <i class="bi bi-chevron-expand"></i>
-                                    </button>
-                                </th>
-                                <th>
-                                    <button type="button" class="usage-sort-btn" data-sort-by="module">
-                                        Modul <i class="bi bi-chevron-expand"></i>
-                                    </button>
-                                </th>
-                                <th>Objekt</th>
-                                <th>Entity type</th>
-                                <th>
-                                    <button type="button" class="usage-sort-btn" data-sort-by="entity_id">
-                                        Entity ID <i class="bi bi-chevron-expand"></i>
-                                    </button>
-                                </th>
-                                <th>
-                                    <button type="button" class="usage-sort-btn active" data-sort-by="tagged_at">
-                                        Tagget <i class="bi bi-sort-down"></i>
-                                    </button>
-                                </th>
-                            </tr>
-                            <tr>
-                                <th class="filter-cell">
-                                    <input id="usageFilterTagName" type="search" class="form-control form-control-sm" placeholder="Søg tag-navn">
-                                </th>
-                                <th class="filter-cell">
-                                    <select id="usageFilterTagType" class="form-select form-select-sm">
-                                        <option value="">Alle typer</option>
-                                        <option value="workflow">workflow</option>
-                                        <option value="status">status</option>
-                                        <option value="category">category</option>
-                                        <option value="priority">priority</option>
-                                        <option value="billing">billing</option>
-                                        <option value="brand">brand</option>
-                                        <option value="type">type</option>
-                                    </select>
-                                </th>
-                                <th class="filter-cell">
-                                    <select id="usageFilterModule" class="form-select form-select-sm">
-                                        <option value="">Alle moduler</option>
-                                    </select>
-                                </th>
-                                <th></th>
-                                <th></th>
-                                <th></th>
-                                <th></th>
-                            </tr>
-                        </thead>
-                        <tbody id="usageTableBody">
-                            <tr>
-                                <td colspan="7" class="text-center text-muted py-4">Indlæser...</td>
-                            </tr>
-                        </tbody>
-                    </table>
-                </div>
-
-                <div class="d-flex flex-wrap justify-content-between align-items-center gap-2">
-                    <div class="small text-muted" id="usageSummary">-</div>
-                    <div class="btn-group">
-                        <button type="button" class="btn btn-sm btn-outline-primary" id="usagePrevBtn">Forrige</button>
-                        <button type="button" class="btn btn-sm btn-outline-primary" id="usageNextBtn">Næste</button>
-                    </div>
-                </div>
-            </div>
-        </div>
-    </div>
 </div>

 <!-- Create/Edit Tag Modal -->
@@ -323,17 +148,9 @@
                         <option value="category">Category - Emne/område</option>
                         <option value="priority">Priority - Hastighed</option>
                         <option value="billing">Billing - Økonomi</option>
-                        <option value="brand">Brand - Leverandør/produktbrand</option>
-                        <option value="type">Type - Sagstype/arbejdstype</option>
                     </select>
                 </div>

-                <div class="mb-3">
-                    <label for="tagCatchWords" class="form-label">Catch words</label>
-                    <textarea class="form-control" id="tagCatchWords" rows="3" placeholder="fx: office 365, outlook, smtp"></textarea>
-                    <small class="text-muted">Brug komma eller ny linje mellem ord. Bruges til auto-forslag på sager.</small>
-                </div>
-
                 <div class="mb-3">
                     <label for="tagDescription" class="form-label">Beskrivelse</label>
                     <textarea class="form-control" id="tagDescription" rows="3"></textarea>
@@ -369,59 +186,19 @@
         </div>
     </div>
 </div>
-{% endblock %}

-{% block extra_js %}
+<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
 <script>
     let allTags = [];
     let currentFilter = 'all';
-    let usageDebounceTimer = null;
-    const usageState = {
-        filters: {
-            tag_name: '',
-            tag_type: '',
-            module: ''
-        },
-        page: 1,
-        page_size: 25,
-        sort_by: 'tagged_at',
-        sort_dir: 'desc',
-        total: 0,
-        total_pages: 0
-    };

     // Load tags on page load
     document.addEventListener('DOMContentLoaded', () => {
         loadTags();
-        loadTagUsage();
         setupEventListeners();
-        const initialSection = window.location.hash === '#search' ? 'search' : 'admin';
-        switchTagSection(initialSection, false);
     });

-    function switchTagSection(section, updateHash = true) {
-        const normalized = section === 'search' ? 'search' : 'admin';
-        document.querySelectorAll('#sectionTabs .nav-link').forEach(link => {
-            link.classList.toggle('active', link.dataset.section === normalized);
-        });
-
-        document.getElementById('tagAdminSection').classList.toggle('d-none', normalized !== 'admin');
-        document.getElementById('tagSearchSection').classList.toggle('d-none', normalized !== 'search');
-
-        if (updateHash) {
-            const hash = normalized === 'search' ? '#search' : '#admin';
-            window.history.replaceState(null, '', hash);
-        }
-    }

     function setupEventListeners() {
-        // Section tabs
-        document.querySelectorAll('#sectionTabs button').forEach(btn => {
-            btn.addEventListener('click', () => {
-                switchTagSection(btn.dataset.section);
-            });
-        });
-
         // Type filter tabs
         document.querySelectorAll('#typeFilter a').forEach(tab => {
             tab.addEventListener('click', (e) => {
@@ -452,9 +229,7 @@
                 'status': '#ffd700',
                 'category': '#0f4c75',
                 'priority': '#dc3545',
-                'billing': '#2d6a4f',
-                'brand': '#006d77',
-                'type': '#5c677d'
+                'billing': '#2d6a4f'
             };
             if (colorMap[type]) {
                 document.getElementById('tagColor').value = colorMap[type];
@@ -465,61 +240,6 @@
         // Save button
         document.getElementById('saveTagBtn').addEventListener('click', saveTag);

-        // Usage filters
-        document.getElementById('usageFilterTagName').addEventListener('input', () => {
-            usageState.filters.tag_name = document.getElementById('usageFilterTagName').value.trim();
-            usageState.page = 1;
-            debounceUsageLoad();
-        });
-        document.getElementById('usageFilterTagType').addEventListener('change', () => {
-            usageState.filters.tag_type = document.getElementById('usageFilterTagType').value;
-            usageState.page = 1;
-            loadTagUsage();
-        });
-        document.getElementById('usageFilterModule').addEventListener('change', () => {
-            usageState.filters.module = document.getElementById('usageFilterModule').value;
-            usageState.page = 1;
-            loadTagUsage();
-        });
-
-        document.getElementById('resetUsageFiltersBtn').addEventListener('click', () => {
-            usageState.filters = { tag_name: '', tag_type: '', module: '' };
-            usageState.page = 1;
-            document.getElementById('usageFilterTagName').value = '';
-            document.getElementById('usageFilterTagType').value = '';
-            document.getElementById('usageFilterModule').value = '';
-            loadTagUsage();
-        });
-
-        document.getElementById('usagePrevBtn').addEventListener('click', () => {
-            if (usageState.page > 1) {
-                usageState.page -= 1;
-                loadTagUsage();
-            }
-        });
-
-        document.getElementById('usageNextBtn').addEventListener('click', () => {
-            if (usageState.page < usageState.total_pages) {
-                usageState.page += 1;
-                loadTagUsage();
-            }
-        });
-
-        document.querySelectorAll('.usage-sort-btn').forEach(btn => {
-            btn.addEventListener('click', () => {
-                const sortBy = btn.dataset.sortBy;
-                if (usageState.sort_by === sortBy) {
-                    usageState.sort_dir = usageState.sort_dir === 'asc' ? 'desc' : 'asc';
-                } else {
-                    usageState.sort_by = sortBy;
-                    usageState.sort_dir = sortBy === 'tagged_at' ? 'desc' : 'asc';
-                }
-                usageState.page = 1;
-                updateSortIndicators();
-                loadTagUsage();
-            });
-        });
-
         // Modal reset on close
         document.getElementById('createTagModal').addEventListener('hidden.bs.modal', () => {
             document.getElementById('tagForm').reset();
@@ -544,131 +264,6 @@
         }
     }

-    function escapeHtml(value) {
-        return String(value ?? '')
-            .replaceAll('&', '&amp;')
-            .replaceAll('<', '&lt;')
-            .replaceAll('>', '&gt;')
-            .replaceAll('"', '&quot;')
-            .replaceAll("'", '&#39;');
-    }
-
-    function debounceUsageLoad() {
-        if (usageDebounceTimer) {
-            clearTimeout(usageDebounceTimer);
-        }
-        usageDebounceTimer = setTimeout(() => loadTagUsage(), 280);
-    }
-
-    function updateSortIndicators() {
-        document.querySelectorAll('.usage-sort-btn').forEach(btn => {
-            const icon = btn.querySelector('i');
-            if (!icon) return;
-            btn.classList.remove('active');
-            icon.className = 'bi bi-chevron-expand';
-            if (btn.dataset.sortBy === usageState.sort_by) {
-                btn.classList.add('active');
-                icon.className = usageState.sort_dir === 'asc' ? 'bi bi-sort-up' : 'bi bi-sort-down';
-            }
-        });
-    }
-
-    function renderUsageTable(items) {
-        const tbody = document.getElementById('usageTableBody');
-        if (!Array.isArray(items) || !items.length) {
-            tbody.innerHTML = '<tr><td colspan="7" class="text-center text-muted py-4">Ingen taggede rækker matcher filtrene.</td></tr>';
-            return;
-        }
-
-        tbody.innerHTML = items.map(row => {
-            const taggedAt = row.tagged_at ? new Date(row.tagged_at).toLocaleString('da-DK') : '-';
-            const color = /^#[0-9A-Fa-f]{6}$/.test(String(row.tag_color || '')) ? row.tag_color : '#0f4c75';
-            const inactiveBadge = row.tag_is_active ? '' : '<span class="badge bg-secondary ms-2">Inaktiv</span>';
-            const entityTitle = escapeHtml(row.entity_title || `#${row.entity_id || ''}`);
-            const entityCell = row.entity_url
-                ? `<a href="${escapeHtml(row.entity_url)}" class="text-decoration-none fw-semibold">${entityTitle}</a>`
-                : `<span class="fw-semibold">${entityTitle}</span>`;
-
-            return `
-                <tr>
-                    <td>
-                        <span class="tag-badge" style="background:${color}; color:#fff; margin:0;">${escapeHtml(row.tag_name)}</span>
-                        ${inactiveBadge}
-                    </td>
-                    <td><span class="badge bg-light text-dark text-uppercase">${escapeHtml(row.tag_type)}</span></td>
-                    <td><span class="module-badge"><i class="bi bi-box"></i>${escapeHtml(row.module)}</span></td>
-                    <td>${entityCell}</td>
-                    <td><span class="text-muted">${escapeHtml(row.entity_type)}</span></td>
-                    <td><strong>#${escapeHtml(row.entity_id)}</strong></td>
-                    <td class="small text-muted">${escapeHtml(taggedAt)}</td>
-                </tr>
-            `;
-        }).join('');
-    }
-
-    function renderUsageSummary() {
-        const summary = document.getElementById('usageSummary');
-        const prevBtn = document.getElementById('usagePrevBtn');
-        const nextBtn = document.getElementById('usageNextBtn');
-
-        const total = usageState.total;
-        const page = usageState.page;
-        const pageSize = usageState.page_size;
-        const from = total ? ((page - 1) * pageSize + 1) : 0;
-        const to = total ? Math.min(page * pageSize, total) : 0;
-
-        summary.textContent = `Viser ${from}-${to} af ${total} rækker`;
-        prevBtn.disabled = page <= 1;
-        nextBtn.disabled = page >= usageState.total_pages;
-    }
-
-    function fillModuleFilter(options) {
-        const select = document.getElementById('usageFilterModule');
-        const currentValue = usageState.filters.module;
-        const base = '<option value="">Alle moduler</option>';
-        const rows = (options || []).map(option => {
-            return `<option value="${escapeHtml(option.value)}">${escapeHtml(option.label)}</option>`;
-        }).join('');
-        select.innerHTML = `${base}${rows}`;
-        select.value = currentValue || '';
-    }
-
-    async function loadTagUsage() {
-        const tbody = document.getElementById('usageTableBody');
-        tbody.innerHTML = '<tr><td colspan="7" class="text-center text-muted py-4">Indlæser...</td></tr>';
-
-        try {
-            const params = new URLSearchParams({
-                page: String(usageState.page),
-                page_size: String(usageState.page_size),
-                sort_by: usageState.sort_by,
-                sort_dir: usageState.sort_dir
-            });
-
-            if (usageState.filters.tag_name) params.set('tag_name', usageState.filters.tag_name);
-            if (usageState.filters.tag_type) params.set('tag_type', usageState.filters.tag_type);
-            if (usageState.filters.module) params.set('module', usageState.filters.module);
-
-            const response = await fetch(`/api/v1/tags/usage?${params.toString()}`);
-            if (!response.ok) {
-                throw new Error('Kunne ikke hente tag søgning');
-            }
-
-            const payload = await response.json();
-            usageState.total = Number(payload?.pagination?.total || 0);
-            usageState.total_pages = Number(payload?.pagination?.total_pages || 0);
-            usageState.page = Number(payload?.pagination?.page || usageState.page);
-
-            fillModuleFilter(payload.module_options || []);
-            renderUsageTable(payload.items || []);
-            renderUsageSummary();
-            updateSortIndicators();
-        } catch (error) {
-            tbody.innerHTML = `<tr><td colspan="7" class="text-center text-danger py-4">Fejl ved indlæsning af tag søgning: ${escapeHtml(error.message)}</td></tr>`;
-            document.getElementById('usageSummary').textContent = 'Fejl ved datahentning';
-        }
-    }
-
     function renderTags() {
         const container = document.getElementById('tagsList');
         const filteredTags = currentFilter === 'all'
@@ -698,7 +293,6 @@
                         ${!tag.is_active ? '<span class="badge bg-secondary ms-2">Inaktiv</span>' : ''}
                     </div>
                     ${tag.description ? `<p class="text-muted mb-0 small">${tag.description}</p>` : ''}
-                    ${Array.isArray(tag.catch_words) && tag.catch_words.length ? `<p class="mb-0 mt-1"><small class="text-muted">Catch words: ${tag.catch_words.join(', ')}</small></p>` : ''}
                 </div>
                 <div class="btn-group">
                     <button class="btn btn-sm btn-outline-primary" onclick="editTag(${tag.id})">
@@ -721,11 +315,7 @@
             description: document.getElementById('tagDescription').value || null,
             color: document.getElementById('tagColorHex').value,
             icon: document.getElementById('tagIcon').value || null,
-            is_active: document.getElementById('tagActive').checked,
-            catch_words: document.getElementById('tagCatchWords').value
-                .split(/[\n,]+/)
-                .map(v => v.trim().toLowerCase())
-                .filter(v => v.length > 1)
+            is_active: document.getElementById('tagActive').checked
         };

         try {
@@ -762,7 +352,6 @@
         document.getElementById('tagColorHex').value = tag.color;
         document.getElementById('tagIcon').value = tag.icon || '';
         document.getElementById('tagActive').checked = tag.is_active;
-        document.getElementById('tagCatchWords').value = Array.isArray(tag.catch_words) ? tag.catch_words.join(', ') : '';

         document.querySelector('#createTagModal .modal-title').textContent = 'Rediger Tag';
         new bootstrap.Modal(document.getElementById('createTagModal')).show();
@@ -785,4 +374,5 @@
         }
     }
 </script>
-{% endblock %}
+</body>
+</html>
|
|||||||
@ -11,7 +11,7 @@ import json
|
|||||||
import re
|
import re
|
||||||
import asyncio
|
import asyncio
|
||||||
from typing import List, Optional
|
from typing import List, Optional
|
||||||
from fastapi import APIRouter, Depends, HTTPException, Query, status
|
from fastapi import APIRouter, HTTPException, Query, status
|
||||||
from fastapi.responses import JSONResponse
|
from fastapi.responses import JSONResponse
|
||||||
|
|
||||||
from app.ticket.backend.ticket_service import TicketService
|
from app.ticket.backend.ticket_service import TicketService
|
||||||
@ -55,13 +55,11 @@ from app.ticket.backend.models import (
|
|||||||
TicketDeadlineUpdateRequest
|
TicketDeadlineUpdateRequest
|
||||||
)
|
)
|
||||||
from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
|
from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
|
||||||
from app.core.auth_dependencies import require_any_permission
|
|
||||||
from datetime import date, datetime
|
from datetime import date, datetime
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
router = APIRouter()
|
router = APIRouter()
|
||||||
sync_admin_access = require_any_permission("users.manage", "system.admin")
|
|
||||||
|
|
||||||
|
|
||||||
def _get_first_value(data: dict, keys: List[str]) -> Optional[str]:
|
def _get_first_value(data: dict, keys: List[str]) -> Optional[str]:
|
||||||
@ -129,31 +127,6 @@ def _escape_simply_value(value: str) -> str:
|
|||||||
return value.replace("'", "''")
|
return value.replace("'", "''")
|
||||||
|
|
||||||
|
|
||||||
def _extract_count_value(rows: List[dict]) -> Optional[int]:
|
|
||||||
if not rows:
|
|
||||||
return None
|
|
||||||
|
|
||||||
row = rows[0] or {}
|
|
||||||
if not isinstance(row, dict):
|
|
||||||
return None
|
|
||||||
|
|
||||||
for key in ("total_count", "count", "count(*)", "COUNT(*)"):
|
|
||||||
value = row.get(key)
|
|
||||||
if value is not None:
|
|
||||||
try:
|
|
||||||
return int(value)
|
|
||||||
except (TypeError, ValueError):
|
|
||||||
continue
|
|
||||||
|
|
||||||
for value in row.values():
|
|
||||||
try:
|
|
||||||
return int(value)
|
|
||||||
except (TypeError, ValueError):
|
|
||||||
continue
|
|
||||||
|
|
||||||
return None
|
|
||||||
|
|
||||||
|
|
||||||
async def _vtiger_query_with_retry(vtiger, query_string: str, retries: int = 5, base_delay: float = 1.25) -> List[dict]:
|
async def _vtiger_query_with_retry(vtiger, query_string: str, retries: int = 5, base_delay: float = 1.25) -> List[dict]:
|
||||||
"""Run vTiger query with exponential backoff on rate-limit responses."""
|
"""Run vTiger query with exponential backoff on rate-limit responses."""
|
||||||
for attempt in range(retries + 1):
|
for attempt in range(retries + 1):
|
||||||
@ -1852,8 +1825,7 @@ async def import_simply_archived_tickets(
|
|||||||
limit: int = Query(5000, ge=1, le=50000, description="Maximum tickets to import"),
|
limit: int = Query(5000, ge=1, le=50000, description="Maximum tickets to import"),
|
||||||
include_messages: bool = Query(True, description="Include comments and emails"),
|
include_messages: bool = Query(True, description="Include comments and emails"),
|
||||||
ticket_number: Optional[str] = Query(None, description="Import a single ticket by number"),
|
ticket_number: Optional[str] = Query(None, description="Import a single ticket by number"),
|
||||||
force: bool = Query(False, description="Update even if sync hash matches"),
|
force: bool = Query(False, description="Update even if sync hash matches")
|
||||||
current_user: dict = Depends(sync_admin_access)
|
|
||||||
):
|
):
|
||||||
"""
|
"""
|
||||||
One-time import of archived tickets from Simply-CRM.
|
One-time import of archived tickets from Simply-CRM.
|
||||||
@ -2185,8 +2157,7 @@ async def import_vtiger_archived_tickets(
|
|||||||
limit: int = Query(5000, ge=1, le=50000, description="Maximum tickets to import"),
|
limit: int = Query(5000, ge=1, le=50000, description="Maximum tickets to import"),
|
||||||
include_messages: bool = Query(True, description="Include comments and emails"),
|
include_messages: bool = Query(True, description="Include comments and emails"),
|
||||||
ticket_number: Optional[str] = Query(None, description="Import a single ticket by number"),
|
ticket_number: Optional[str] = Query(None, description="Import a single ticket by number"),
|
||||||
force: bool = Query(False, description="Update even if sync hash matches"),
|
force: bool = Query(False, description="Update even if sync hash matches")
|
||||||
current_user: dict = Depends(sync_admin_access)
|
|
||||||
):
|
):
|
||||||
"""
|
"""
|
||||||
One-time import of archived tickets from vTiger (Cases module).
|
One-time import of archived tickets from vTiger (Cases module).
|
||||||
@ -2522,93 +2493,8 @@ async def import_vtiger_archived_tickets(
|
|||||||
         raise HTTPException(status_code=500, detail=str(e))
 
 
-@router.get("/archived/status", tags=["Archived Tickets"])
-async def get_archived_sync_status(current_user: dict = Depends(sync_admin_access)):
-    """
-    Return archived sync parity status for Simply-CRM and vTiger.
-    """
-    source_keys = ("simplycrm", "vtiger")
-    sources: dict[str, dict] = {}
-
-    for source_key in source_keys:
-        local_ticket_row = execute_query_single(
-            """
-            SELECT COUNT(*) AS total_tickets,
-                   MAX(last_synced_at) AS last_synced_at
-            FROM tticket_archived_tickets
-            WHERE source_system = %s
-            """,
-            (source_key,)
-        ) or {}
-
-        local_message_row = execute_query_single(
-            """
-            SELECT COUNT(*) AS total_messages
-            FROM tticket_archived_messages m
-            INNER JOIN tticket_archived_tickets t ON t.id = m.archived_ticket_id
-            WHERE t.source_system = %s
-            """,
-            (source_key,)
-        ) or {}
-
-        local_tickets = int(local_ticket_row.get("total_tickets") or 0)
-        local_messages = int(local_message_row.get("total_messages") or 0)
-        last_synced_value = local_ticket_row.get("last_synced_at")
-        if isinstance(last_synced_value, (datetime, date)):
-            last_synced_at_iso = last_synced_value.isoformat()
-        else:
-            last_synced_at_iso = None
-
-        sources[source_key] = {
-            "remote_total_tickets": None,
-            "local_total_tickets": local_tickets,
-            "local_total_messages": local_messages,
-            "last_synced_at": last_synced_at_iso,
-            "diff": None,
-            "is_synced": False,
-            "error": None,
-        }
-
-    try:
-        async with SimplyCRMService() as service:
-            module_name = getattr(settings, "SIMPLYCRM_TICKET_MODULE", "Tickets")
-            simply_rows = await service.query(f"SELECT count(*) AS total_count FROM {module_name};")
-            simply_remote_count = _extract_count_value(simply_rows)
-            sources["simplycrm"]["remote_total_tickets"] = simply_remote_count
-            if simply_remote_count is not None:
-                sources["simplycrm"]["diff"] = simply_remote_count - sources["simplycrm"]["local_total_tickets"]
-                sources["simplycrm"]["is_synced"] = sources["simplycrm"]["diff"] == 0
-            elif service.last_query_error:
-                sources["simplycrm"]["error"] = service.last_query_error.get("message") or str(service.last_query_error)
-    except Exception as e:
-        logger.warning("⚠️ Simply-CRM archived status check failed: %s", e)
-        sources["simplycrm"]["error"] = str(e)
-
-    try:
-        vtiger = get_vtiger_service()
-        vtiger_rows = await _vtiger_query_with_retry(vtiger, "SELECT count(*) AS total_count FROM Cases;")
-        vtiger_remote_count = _extract_count_value(vtiger_rows)
-        sources["vtiger"]["remote_total_tickets"] = vtiger_remote_count
-        if vtiger_remote_count is not None:
-            sources["vtiger"]["diff"] = vtiger_remote_count - sources["vtiger"]["local_total_tickets"]
-            sources["vtiger"]["is_synced"] = sources["vtiger"]["diff"] == 0
-        elif vtiger.last_query_error:
-            sources["vtiger"]["error"] = vtiger.last_query_error.get("message") or str(vtiger.last_query_error)
-    except Exception as e:
-        logger.warning("⚠️ vTiger archived status check failed: %s", e)
-        sources["vtiger"]["error"] = str(e)
-
-    overall_synced = all(sources[key].get("is_synced") is True for key in source_keys)
-
-    return {
-        "checked_at": datetime.utcnow().isoformat(),
-        "overall_synced": overall_synced,
-        "sources": sources,
-    }
-
-
 @router.get("/archived/simply/modules", tags=["Archived Tickets"])
-async def list_simply_modules(current_user: dict = Depends(sync_admin_access)):
+async def list_simply_modules():
     """
     List available Simply-CRM modules (debug helper).
     """
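The removed `/archived/status` endpoint boils down to one calculation per source: compare a remote row count against the locally imported count and report the difference. A minimal standalone sketch of that calculation, under the assumption that counts arrive as plain integers (the function name `parity_status` is illustrative, not from the codebase):

```python
# Illustrative sketch of the parity fields computed per source by the
# removed /archived/status endpoint: diff = remote - local, and the
# source counts as synced only when that diff is exactly zero.

def parity_status(remote_count, local_count):
    """Return the diff/is_synced fields for one source system."""
    status = {
        "remote_total_tickets": remote_count,
        "local_total_tickets": local_count,
        "diff": None,
        "is_synced": False,
    }
    # A missing remote count (query failed) leaves diff=None and
    # is_synced=False rather than guessing.
    if remote_count is not None:
        status["diff"] = remote_count - local_count
        status["is_synced"] = status["diff"] == 0
    return status

print(parity_status(120, 120)["is_synced"])  # True
print(parity_status(None, 40)["diff"])       # None
```

Note the deliberately pessimistic default: an unreachable remote side never reports as synced, which is what lets the endpoint's `overall_synced = all(...)` stay truthful.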
@@ -2624,8 +2510,7 @@ async def list_simply_modules(current_user: dict = Depends(sync_admin_access)):
 @router.get("/archived/simply/ticket", tags=["Archived Tickets"])
 async def fetch_simply_ticket(
     ticket_number: Optional[str] = Query(None, description="Ticket number, e.g. TT934"),
-    external_id: Optional[str] = Query(None, description="VTiger record ID, e.g. 17x1234"),
-    current_user: dict = Depends(sync_admin_access)
+    external_id: Optional[str] = Query(None, description="VTiger record ID, e.g. 17x1234")
 ):
     """
     Fetch a single HelpDesk ticket from Simply-CRM by ticket number or record id.
@@ -2659,8 +2544,7 @@ async def fetch_simply_ticket(
 @router.get("/archived/simply/record", tags=["Archived Tickets"])
 async def fetch_simply_record(
     record_id: str = Query(..., description="VTiger record ID, e.g. 11x2601"),
-    module: Optional[str] = Query(None, description="Optional module name for context"),
-    current_user: dict = Depends(sync_admin_access)
+    module: Optional[str] = Query(None, description="Optional module name for context")
 ):
     """
     Fetch a single record from Simply-CRM by record id.

@@ -2,21 +2,6 @@
 
 {% block title %}Tekniker Dashboard V1 - Overblik{% endblock %}
 
-{% block extra_css %}
-<style>
-#caseTable thead th {
-    white-space: nowrap;
-    font-size: 0.78rem;
-    letter-spacing: 0.02em;
-}
-
-#caseTable tbody td {
-    font-size: 0.84rem;
-    vertical-align: top;
-}
-</style>
-{% endblock %}
 
 {% block content %}
 <div class="container-fluid py-4">
     <div class="d-flex justify-content-between align-items-start flex-wrap gap-3 mb-4">
@@ -80,22 +65,15 @@
                 <table class="table table-sm table-hover mb-0" id="caseTable">
                     <thead class="table-light" id="tableHead">
                         <tr>
-                            <th>SagsID</th>
-                            <th>Virksom.</th>
-                            <th>Kontakt</th>
-                            <th>Beskr.</th>
-                            <th>Type</th>
-                            <th>Prioritet</th>
-                            <th>Ansvarl.</th>
-                            <th>Gruppe/Level</th>
-                            <th>Opret.</th>
-                            <th>Start arbejde</th>
-                            <th>Start inden</th>
-                            <th>Deadline</th>
+                            <th>ID</th>
+                            <th>Titel</th>
+                            <th>Kunde</th>
+                            <th>Status</th>
+                            <th>Dato</th>
                         </tr>
                     </thead>
                     <tbody id="tableBody">
-                        <tr><td colspan="12" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>
+                        <tr><td colspan="5" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>
                     </tbody>
                 </table>
             </div>
@@ -189,16 +167,8 @@ const allData = {
         {
             id: {{ item.id }},
             titel: {{ item.titel | tojson | safe }},
-            beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
-            priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
             customer_name: {{ item.customer_name | tojson | safe }},
-            kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
-            case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
-            ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
-            assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
             created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
-            start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
-            deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
             status: {{ item.status | tojson | safe if item.status else 'null' }},
             deadline: {{ item.deadline.isoformat() | tojson | safe if item.deadline else 'null' }}
         }{% if not loop.last %},{% endif %}
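The template rows above serialize Python values into JavaScript literals with the `value | tojson | safe` pattern, converting datetimes to ISO strings first and falling back to the literal `'null'` when a field is missing. A standalone sketch of the same pattern using only the standard library (Jinja's `tojson` filter is JSON-dumping under the hood; `to_js_literal` is an illustrative name, not project code):

```python
import json
from datetime import datetime

# Sketch of the serialization used in the template above: datetimes
# become ISO strings before JSON-encoding, None becomes the JS literal
# null, and json.dumps handles quoting/escaping for everything else.

def to_js_literal(value):
    if value is None:
        return "null"              # matches the template's `else 'null'` branches
    if isinstance(value, datetime):
        value = value.isoformat()  # matches `item.created_at.isoformat()`
    return json.dumps(value)       # escapes quotes, backslashes, control chars

print(to_js_literal(None))                          # null
print(to_js_literal(datetime(2024, 5, 1, 12, 30)))  # "2024-05-01T12:30:00"
print(to_js_literal('He said "hi"'))
```

JSON-encoding every string (rather than interpolating it raw) is what keeps quotes and backslashes in ticket titles from breaking the generated `allData` object.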
@@ -209,16 +179,7 @@ const allData = {
         {
             id: {{ item.id }},
             titel: {{ item.titel | tojson | safe }},
-            beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
-            priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
             customer_name: {{ item.customer_name | tojson | safe }},
-            kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
-            case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
-            ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
-            assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
-            created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
-            start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
-            deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
             status: {{ item.status | tojson | safe if item.status else 'null' }},
             deadline: {{ item.deadline.isoformat() | tojson | safe if item.deadline else 'null' }}
         }{% if not loop.last %},{% endif %}
@@ -230,16 +191,9 @@ const allData = {
             item_type: {{ item.item_type | tojson | safe }},
             item_id: {{ item.item_id }},
             title: {{ item.title | tojson | safe }},
-            beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
             customer_name: {{ item.customer_name | tojson | safe }},
-            kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
             task_reason: {{ item.task_reason | tojson | safe if item.task_reason else 'null' }},
             created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
-            start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
-            deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
-            case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
-            ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
-            assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
             priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
             status: {{ item.status | tojson | safe if item.status else 'null' }}
         }{% if not loop.last %},{% endif %}
@@ -251,16 +205,7 @@ const allData = {
             id: {{ item.id }},
             titel: {{ item.titel | tojson | safe }},
             group_name: {{ item.group_name | tojson | safe }},
-            beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
-            priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
             customer_name: {{ item.customer_name | tojson | safe }},
-            kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
-            case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
-            ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
-            assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
-            created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
-            start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
-            deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
             status: {{ item.status | tojson | safe if item.status else 'null' }},
             deadline: {{ item.deadline.isoformat() | tojson | safe if item.deadline else 'null' }}
         }{% if not loop.last %},{% endif %}
@@ -280,32 +225,6 @@ function formatShortDate(dateStr) {
     return d.toLocaleDateString('da-DK', { day: '2-digit', month: '2-digit', year: 'numeric' });
 }
 
-function renderCaseTableRow(item, idField = 'id', typeField = 'case') {
-    const itemId = item[idField];
-    const openType = typeField === 'item_type' ? item.item_type : 'case';
-    const description = item.beskrivelse || item.titel || item.title || '-';
-    const typeValue = item.case_type || item.item_type || '-';
-    const groupLevel = item.assigned_group_name || item.group_name || '-';
-    const priorityValue = item.priority || 'normal';
-
-    return `
-        <tr onclick="showCaseDetails(${itemId}, '${openType}')" style="cursor:pointer;">
-            <td>#${itemId}</td>
-            <td>${item.customer_name || '-'}</td>
-            <td>${item.kontakt_navn || '-'}</td>
-            <td>${description}</td>
-            <td>${typeValue}</td>
-            <td>${priorityValue}</td>
-            <td>${item.ansvarlig_navn || '-'}</td>
-            <td>${groupLevel}</td>
-            <td>${formatShortDate(item.created_at)}</td>
-            <td>${formatShortDate(item.start_date)}</td>
-            <td>${formatShortDate(item.deferred_until)}</td>
-            <td>${formatShortDate(item.deadline)}</td>
-        </tr>
-    `;
-}
-
 function toggleSection(filterName) {
     const kpiCard = document.getElementById('kpi' + filterName.charAt(0).toUpperCase() + filterName.slice(1));
     const listTitle = document.getElementById('listTitle');
@@ -323,7 +242,7 @@ function toggleSection(filterName) {
     if (currentFilter === filterName) {
         currentFilter = null;
         listTitle.textContent = 'Alle sager';
-        tableBody.innerHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>';
+        tableBody.innerHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>';
         return;
     }
 
@@ -347,43 +266,70 @@ function filterAndPopulateTable(filterName) {
         listTitle.innerHTML = '<i class="bi bi-inbox-fill text-primary"></i> Nye sager';
         const data = allData.newCases || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen nye sager</td></tr>';
+            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen nye sager</td></tr>';
         } else {
-            bodyHTML = data.map(item => renderCaseTableRow(item)).join('');
+            bodyHTML = data.map(item => `
+                <tr onclick="showCaseDetails(${item.id}, 'case')" style="cursor:pointer;">
+                    <td>#${item.id}</td>
+                    <td>${item.titel || '-'}</td>
+                    <td>${item.customer_name || '-'}</td>
+                    <td><span class="badge bg-secondary">${item.status || 'Ny'}</span></td>
+                    <td>${formatDate(item.created_at)}</td>
+                </tr>
+            `).join('');
         }
     } else if (filterName === 'myCases') {
         listTitle.innerHTML = '<i class="bi bi-person-check-fill text-success"></i> Mine sager';
         const data = allData.myCases || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen sager tildelt</td></tr>';
+            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen sager tildelt</td></tr>';
         } else {
-            bodyHTML = data.map(item => renderCaseTableRow(item)).join('');
+            bodyHTML = data.map(item => `
+                <tr onclick="showCaseDetails(${item.id}, 'case')" style="cursor:pointer;">
+                    <td>#${item.id}</td>
+                    <td>${item.titel || '-'}</td>
+                    <td>${item.customer_name || '-'}</td>
+                    <td><span class="badge bg-info">${item.status || '-'}</span></td>
+                    <td>${formatShortDate(item.deadline)}</td>
+                </tr>
+            `).join('');
         }
     } else if (filterName === 'todayTasks') {
         listTitle.innerHTML = '<i class="bi bi-calendar-check text-primary"></i> Dagens opgaver';
         const data = allData.todayTasks || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen opgaver i dag</td></tr>';
+            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen opgaver i dag</td></tr>';
         } else {
             bodyHTML = data.map(item => {
-                const normalized = {
-                    ...item,
-                    id: item.item_id,
-                    titel: item.title,
-                    beskrivelse: item.task_reason || item.beskrivelse,
-                    deadline: item.deadline || item.due_at,
-                    case_type: item.case_type || item.item_type
-                };
-                return renderCaseTableRow(normalized, 'id', 'item_type');
+                const badge = item.item_type === 'case'
+                    ? '<span class="badge bg-primary">Sag</span>'
+                    : '<span class="badge bg-info">Ticket</span>';
+                return `
+                    <tr onclick="showCaseDetails(${item.item_id}, '${item.item_type}')" style="cursor:pointer;">
+                        <td>#${item.item_id}</td>
+                        <td>${item.title || '-'}<br><small class="text-muted">${item.task_reason || ''}</small></td>
+                        <td>${item.customer_name || '-'}</td>
+                        <td>${badge}</td>
+                        <td>${formatDate(item.created_at)}</td>
+                    </tr>
+                `;
             }).join('');
         }
     } else if (filterName === 'groupCases') {
         listTitle.innerHTML = '<i class="bi bi-people-fill text-info"></i> Gruppe-sager';
         const data = allData.groupCases || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen gruppe-sager</td></tr>';
+            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen gruppe-sager</td></tr>';
         } else {
-            bodyHTML = data.map(item => renderCaseTableRow(item)).join('');
+            bodyHTML = data.map(item => `
+                <tr onclick="showCaseDetails(${item.id}, 'case')" style="cursor:pointer;">
+                    <td>#${item.id}</td>
+                    <td>${item.titel || '-'}<br><span class="badge bg-secondary">${item.group_name || '-'}</span></td>
+                    <td>${item.customer_name || '-'}</td>
+                    <td><span class="badge bg-info">${item.status || '-'}</span></td>
+                    <td>${formatShortDate(item.deadline)}</td>
+                </tr>
+            `).join('');
         }
     }
 

@@ -86,38 +86,14 @@
         <div class="card-body p-0">
             <div class="table-responsive">
                 <table class="table table-sm table-hover mb-0">
-                    <thead class="table-light">
-                        <tr>
-                            <th>SagsID</th>
-                            <th>Virksom.</th>
-                            <th>Kontakt</th>
-                            <th>Beskr.</th>
-                            <th>Type</th>
-                            <th>Ansvarl.</th>
-                            <th>Gruppe/Level</th>
-                            <th>Opret.</th>
-                            <th>Start arbejde</th>
-                            <th>Start inden</th>
-                            <th>Deadline</th>
-                        </tr>
-                    </thead>
+                    <thead class="table-light"><tr><th>ID</th><th>Titel</th><th>Kunde</th><th>Oprettet</th></tr></thead>
                     <tbody>
                         {% for item in new_cases %}
                         <tr onclick="window.location.href='/sag/{{ item.id }}'" style="cursor:pointer;">
-                            <td>#{{ item.id }}</td>
-                            <td>{{ item.customer_name or '-' }}</td>
-                            <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
-                            <td>{{ item.beskrivelse or item.titel or '-' }}</td>
-                            <td>{{ item.case_type or '-' }}</td>
-                            <td>{{ item.ansvarlig_navn or '-' }}</td>
-                            <td>{{ item.assigned_group_name or '-' }}</td>
-                            <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
-                            <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
-                            <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
-                            <td>{{ item.deadline.strftime('%d/%m/%Y') if item.deadline else '-' }}</td>
+                            <td>#{{ item.id }}</td><td>{{ item.titel }}</td><td>{{ item.customer_name }}</td><td>{{ item.created_at.strftime('%d/%m %H:%M') if item.created_at else '-' }}</td>
                         </tr>
                         {% else %}
-                        <tr><td colspan="11" class="text-center text-muted py-3">Ingen nye sager</td></tr>
+                        <tr><td colspan="4" class="text-center text-muted py-3">Ingen nye sager</td></tr>
                         {% endfor %}
                     </tbody>
                 </table>

@@ -32,71 +32,59 @@
             <table class="table table-hover table-sm mb-0 align-middle">
                 <thead class="table-light">
                     <tr>
-                        <th>SagsID</th>
-                        <th>Virksom.</th>
-                        <th>Kontakt</th>
-                        <th>Beskr.</th>
                         <th>Type</th>
-                        <th>Ansvarl.</th>
-                        <th>Gruppe/Level</th>
-                        <th>Opret.</th>
-                        <th>Start arbejde</th>
-                        <th>Start inden</th>
+                        <th>ID</th>
+                        <th>Titel</th>
+                        <th>Kunde</th>
+                        <th>Status</th>
+                        <th>Prioritet/Reason</th>
                         <th>Deadline</th>
+                        <th>Handling</th>
                     </tr>
                 </thead>
                 <tbody>
                     {% for item in urgent_overdue %}
                     <tr>
+                        <td><span class="badge bg-danger">Haste</span></td>
                         <td>#{{ item.item_id }}</td>
-                        <td>{{ item.customer_name or '-' }}</td>
-                        <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
-                        <td>{{ item.beskrivelse or item.title or '-' }}</td>
-                        <td>{{ item.case_type or item.item_type or '-' }}</td>
-                        <td>{{ item.ansvarlig_navn or '-' }}</td>
-                        <td>{{ item.assigned_group_name or '-' }}</td>
-                        <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
-                        <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
-                        <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
+                        <td>{{ item.title }}</td>
+                        <td>{{ item.customer_name }}</td>
+                        <td>{{ item.status }}</td>
+                        <td>{{ item.attention_reason }}</td>
                         <td>{{ item.due_at.strftime('%d/%m/%Y') if item.due_at else '-' }}</td>
+                        <td><a href="{{ '/sag/' ~ item.item_id if item.item_type == 'case' else '/ticket/tickets/' ~ item.item_id }}" class="btn btn-sm btn-danger">Åbn</a></td>
                     </tr>
                     {% endfor %}
 
                     {% for item in today_tasks %}
                     <tr>
+                        <td><span class="badge bg-primary">I dag</span></td>
                         <td>#{{ item.item_id }}</td>
-                        <td>{{ item.customer_name or '-' }}</td>
-                        <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
-                        <td>{{ item.beskrivelse or item.title or item.task_reason or '-' }}</td>
-                        <td>{{ item.case_type or item.item_type or '-' }}</td>
-                        <td>{{ item.ansvarlig_navn or '-' }}</td>
-                        <td>{{ item.assigned_group_name or '-' }}</td>
-                        <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
-                        <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
-                        <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
+                        <td>{{ item.title }}</td>
+                        <td>{{ item.customer_name }}</td>
+                        <td>{{ item.status }}</td>
+                        <td>{{ item.task_reason }}</td>
                         <td>{{ item.due_at.strftime('%d/%m/%Y') if item.due_at else '-' }}</td>
+                        <td><a href="{{ '/sag/' ~ item.item_id if item.item_type == 'case' else '/ticket/tickets/' ~ item.item_id }}" class="btn btn-sm btn-outline-primary">Åbn</a></td>
                     </tr>
                     {% endfor %}
 
                     {% for item in my_cases %}
                     <tr>
+                        <td><span class="badge bg-secondary">Min sag</span></td>
                         <td>#{{ item.id }}</td>
-                        <td>{{ item.customer_name or '-' }}</td>
-                        <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
-                        <td>{{ item.beskrivelse or item.titel or '-' }}</td>
-                        <td>{{ item.case_type or '-' }}</td>
-                        <td>{{ item.ansvarlig_navn or '-' }}</td>
-                        <td>{{ item.assigned_group_name or '-' }}</td>
-                        <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
-                        <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
-                        <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
+                        <td>{{ item.titel }}</td>
+                        <td>{{ item.customer_name }}</td>
+                        <td>{{ item.status }}</td>
+                        <td>-</td>
                         <td>{{ item.deadline.strftime('%d/%m/%Y') if item.deadline else '-' }}</td>
+                        <td><a href="/sag/{{ item.id }}" class="btn btn-sm btn-outline-secondary">Åbn</a></td>
                     </tr>
                     {% endfor %}
 
                     {% if not urgent_overdue and not today_tasks and not my_cases %}
                     <tr>
-                        <td colspan="11" class="text-center text-muted py-4">Ingen data at vise for denne tekniker.</td>
+                        <td colspan="8" class="text-center text-muted py-4">Ingen data at vise for denne tekniker.</td>
                     </tr>
                     {% endif %}
                 </tbody>

@@ -10,7 +10,7 @@ from fastapi.templating import Jinja2Templates
 from typing import Optional, Dict, Any
 from datetime import date
 
-from app.core.database import execute_query, execute_update, execute_query_single, table_has_column
+from app.core.database import execute_query, execute_update, execute_query_single
 
 logger = logging.getLogger(__name__)
 
@@ -18,20 +18,6 @@ router = APIRouter()
 templates = Jinja2Templates(directory="app")
 
 
-def _case_start_date_sql(alias: str = "s") -> str:
-    """Select start_date only when the live schema actually has it."""
-    if table_has_column("sag_sager", "start_date"):
-        return f"{alias}.start_date"
-    return "NULL::date AS start_date"
-
-
-def _case_type_sql(alias: str = "s") -> str:
-    """Select case type across old/new sag schemas."""
-    if table_has_column("sag_sager", "type"):
-        return f"COALESCE({alias}.template_key, {alias}.type, 'ticket') AS case_type"
-    return f"COALESCE({alias}.template_key, 'ticket') AS case_type"
-
-
 @router.get("/", include_in_schema=False)
 async def ticket_root_redirect():
     return RedirectResponse(url="/sag", status_code=302)
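The removed helpers in the hunk above guard SQL against schema drift: a column is referenced only when the live table actually has it, otherwise a typed `NULL` with the same alias is emitted so callers need no branching. A standalone sketch of that pattern, with the `information_schema` lookup stubbed by an in-memory `SCHEMA` dict (the real code calls `table_has_column` from `app.core.database`):

```python
# Sketch of the schema-guard pattern behind _case_start_date_sql:
# emit a real column reference when the column exists, otherwise a
# typed NULL under the same alias. SCHEMA stands in for a real
# information_schema lookup and is purely illustrative.

SCHEMA = {"sag_sager": {"id", "titel", "status", "deadline"}}  # no start_date here

def has_column(table: str, column: str) -> bool:
    return column in SCHEMA.get(table, set())

def case_start_date_sql(alias: str = "s") -> str:
    if has_column("sag_sager", "start_date"):
        return f"{alias}.start_date"
    # Same output shape either way: callers always get a start_date column.
    return "NULL::date AS start_date"

print(case_start_date_sql())  # NULL::date AS start_date
```

The payoff is that one query string works on both the old and the new schema; the cost is a per-query catalog check, which is why the real helper is worth caching.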
@@ -376,8 +362,6 @@ async def new_ticket_page(request: Request):
 
 def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     """Collect live data slices for technician-focused dashboard variants."""
-    case_start_date_sql = _case_start_date_sql()
-    case_type_sql = _case_type_sql()
     user_query = """
         SELECT user_id, COALESCE(full_name, username, CONCAT('Bruger #', user_id::text)) AS display_name
         FROM users
@@ -387,34 +371,16 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     user_result = execute_query(user_query, (technician_user_id,))
     technician_name = user_result[0]["display_name"] if user_result else f"Bruger #{technician_user_id}"
 
-    new_cases_query = f"""
+    new_cases_query = """
         SELECT
             s.id,
             s.titel,
-            s.beskrivelse,
-            s.priority,
             s.status,
             s.created_at,
-            {case_start_date_sql},
-            s.deferred_until,
             s.deadline,
-            {case_type_sql},
-            COALESCE(c.name, 'Ukendt kunde') AS customer_name,
-            CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
-            COALESCE(u.full_name, u.username) AS ansvarlig_navn,
-            g.name AS assigned_group_name
+            COALESCE(c.name, 'Ukendt kunde') AS customer_name
         FROM sag_sager s
         LEFT JOIN customers c ON c.id = s.customer_id
-        LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
-        LEFT JOIN groups g ON g.id = s.assigned_group_id
-        LEFT JOIN LATERAL (
-            SELECT cc.contact_id
-            FROM contact_companies cc
-            WHERE cc.customer_id = c.id
-            ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
-            LIMIT 1
-        ) cc_first ON true
-        LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
         WHERE s.deleted_at IS NULL
           AND s.status = 'åben'
         ORDER BY s.created_at DESC
@@ -422,34 +388,16 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     """
     new_cases = execute_query(new_cases_query)
 
-    my_cases_query = f"""
+    my_cases_query = """
         SELECT
|
SELECT
|
||||||
s.id,
|
s.id,
|
||||||
s.titel,
|
s.titel,
|
||||||
s.beskrivelse,
|
|
||||||
s.priority,
|
|
||||||
s.status,
|
s.status,
|
||||||
s.created_at,
|
s.created_at,
|
||||||
{case_start_date_sql},
|
|
||||||
s.deferred_until,
|
|
||||||
s.deadline,
|
s.deadline,
|
||||||
{case_type_sql},
|
COALESCE(c.name, 'Ukendt kunde') AS customer_name
|
||||||
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
|
||||||
CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
|
|
||||||
COALESCE(u.full_name, u.username) AS ansvarlig_navn,
|
|
||||||
g.name AS assigned_group_name
|
|
||||||
FROM sag_sager s
|
FROM sag_sager s
|
||||||
LEFT JOIN customers c ON c.id = s.customer_id
|
LEFT JOIN customers c ON c.id = s.customer_id
|
||||||
LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
|
|
||||||
LEFT JOIN groups g ON g.id = s.assigned_group_id
|
|
||||||
LEFT JOIN LATERAL (
|
|
||||||
SELECT cc.contact_id
|
|
||||||
FROM contact_companies cc
|
|
||||||
WHERE cc.customer_id = c.id
|
|
||||||
ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
|
|
||||||
LIMIT 1
|
|
||||||
) cc_first ON true
|
|
||||||
LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
|
|
||||||
WHERE s.deleted_at IS NULL
|
WHERE s.deleted_at IS NULL
|
||||||
AND s.ansvarlig_bruger_id = %s
|
AND s.ansvarlig_bruger_id = %s
|
||||||
AND s.status <> 'lukket'
|
AND s.status <> 'lukket'
|
||||||
@ -458,36 +406,19 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
|
|||||||
"""
|
"""
|
||||||
my_cases = execute_query(my_cases_query, (technician_user_id,))
|
my_cases = execute_query(my_cases_query, (technician_user_id,))
|
||||||
|
|
||||||
today_tasks_query = f"""
|
today_tasks_query = """
|
||||||
SELECT
|
SELECT
|
||||||
'case' AS item_type,
|
'case' AS item_type,
|
||||||
s.id AS item_id,
|
s.id AS item_id,
|
||||||
s.titel AS title,
|
s.titel AS title,
|
||||||
s.beskrivelse,
|
|
||||||
s.status,
|
s.status,
|
||||||
s.deadline AS due_at,
|
s.deadline AS due_at,
|
||||||
s.created_at,
|
s.created_at,
|
||||||
{case_start_date_sql},
|
|
||||||
s.deferred_until,
|
|
||||||
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
||||||
{case_type_sql},
|
NULL::text AS priority,
|
||||||
CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
|
|
||||||
COALESCE(u.full_name, u.username) AS ansvarlig_navn,
|
|
||||||
g.name AS assigned_group_name,
|
|
||||||
COALESCE(s.priority::text, 'normal') AS priority,
|
|
||||||
'Sag deadline i dag' AS task_reason
|
'Sag deadline i dag' AS task_reason
|
||||||
FROM sag_sager s
|
FROM sag_sager s
|
||||||
LEFT JOIN customers c ON c.id = s.customer_id
|
LEFT JOIN customers c ON c.id = s.customer_id
|
||||||
LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
|
|
||||||
LEFT JOIN groups g ON g.id = s.assigned_group_id
|
|
||||||
LEFT JOIN LATERAL (
|
|
||||||
SELECT cc.contact_id
|
|
||||||
FROM contact_companies cc
|
|
||||||
WHERE cc.customer_id = c.id
|
|
||||||
ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
|
|
||||||
LIMIT 1
|
|
||||||
) cc_first ON true
|
|
||||||
LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
|
|
||||||
WHERE s.deleted_at IS NULL
|
WHERE s.deleted_at IS NULL
|
||||||
AND s.ansvarlig_bruger_id = %s
|
AND s.ansvarlig_bruger_id = %s
|
||||||
AND s.status <> 'lukket'
|
AND s.status <> 'lukket'
|
||||||
@ -499,22 +430,14 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
|
|||||||
'ticket' AS item_type,
|
'ticket' AS item_type,
|
||||||
t.id AS item_id,
|
t.id AS item_id,
|
||||||
t.subject AS title,
|
t.subject AS title,
|
||||||
NULL::text AS beskrivelse,
|
|
||||||
t.status,
|
t.status,
|
||||||
NULL::date AS due_at,
|
NULL::date AS due_at,
|
||||||
t.created_at,
|
t.created_at,
|
||||||
NULL::date AS start_date,
|
|
||||||
NULL::date AS deferred_until,
|
|
||||||
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
||||||
'ticket' AS case_type,
|
|
||||||
NULL::text AS kontakt_navn,
|
|
||||||
COALESCE(uu.full_name, uu.username) AS ansvarlig_navn,
|
|
||||||
NULL::text AS assigned_group_name,
|
|
||||||
COALESCE(t.priority, 'normal') AS priority,
|
COALESCE(t.priority, 'normal') AS priority,
|
||||||
'Ticket oprettet i dag' AS task_reason
|
'Ticket oprettet i dag' AS task_reason
|
||||||
FROM tticket_tickets t
|
FROM tticket_tickets t
|
||||||
LEFT JOIN customers c ON c.id = t.customer_id
|
LEFT JOIN customers c ON c.id = t.customer_id
|
||||||
LEFT JOIN users uu ON uu.user_id = t.assigned_to_user_id
|
|
||||||
WHERE t.assigned_to_user_id = %s
|
WHERE t.assigned_to_user_id = %s
|
||||||
AND t.status IN ('open', 'in_progress', 'pending_customer')
|
AND t.status IN ('open', 'in_progress', 'pending_customer')
|
||||||
AND DATE(t.created_at) = CURRENT_DATE
|
AND DATE(t.created_at) = CURRENT_DATE
|
||||||
@ -524,36 +447,19 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
|
|||||||
"""
|
"""
|
||||||
today_tasks = execute_query(today_tasks_query, (technician_user_id, technician_user_id))
|
today_tasks = execute_query(today_tasks_query, (technician_user_id, technician_user_id))
|
||||||
|
|
||||||
urgent_overdue_query = f"""
|
urgent_overdue_query = """
|
||||||
SELECT
|
SELECT
|
||||||
'case' AS item_type,
|
'case' AS item_type,
|
||||||
s.id AS item_id,
|
s.id AS item_id,
|
||||||
s.titel AS title,
|
s.titel AS title,
|
||||||
s.beskrivelse,
|
|
||||||
s.status,
|
s.status,
|
||||||
s.deadline AS due_at,
|
s.deadline AS due_at,
|
||||||
s.created_at,
|
s.created_at,
|
||||||
{case_start_date_sql},
|
|
||||||
s.deferred_until,
|
|
||||||
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
||||||
{case_type_sql},
|
|
||||||
CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
|
|
||||||
COALESCE(u.full_name, u.username) AS ansvarlig_navn,
|
|
||||||
g.name AS assigned_group_name,
|
|
||||||
NULL::text AS priority,
|
NULL::text AS priority,
|
||||||
'Over deadline' AS attention_reason
|
'Over deadline' AS attention_reason
|
||||||
FROM sag_sager s
|
FROM sag_sager s
|
||||||
LEFT JOIN customers c ON c.id = s.customer_id
|
LEFT JOIN customers c ON c.id = s.customer_id
|
||||||
LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
|
|
||||||
LEFT JOIN groups g ON g.id = s.assigned_group_id
|
|
||||||
LEFT JOIN LATERAL (
|
|
||||||
SELECT cc.contact_id
|
|
||||||
FROM contact_companies cc
|
|
||||||
WHERE cc.customer_id = c.id
|
|
||||||
ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
|
|
||||||
LIMIT 1
|
|
||||||
) cc_first ON true
|
|
||||||
LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
|
|
||||||
WHERE s.deleted_at IS NULL
|
WHERE s.deleted_at IS NULL
|
||||||
AND s.status <> 'lukket'
|
AND s.status <> 'lukket'
|
||||||
AND s.deadline IS NOT NULL
|
AND s.deadline IS NOT NULL
|
||||||
@ -565,17 +471,10 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
|
|||||||
'ticket' AS item_type,
|
'ticket' AS item_type,
|
||||||
t.id AS item_id,
|
t.id AS item_id,
|
||||||
t.subject AS title,
|
t.subject AS title,
|
||||||
NULL::text AS beskrivelse,
|
|
||||||
t.status,
|
t.status,
|
||||||
NULL::date AS due_at,
|
NULL::date AS due_at,
|
||||||
t.created_at,
|
t.created_at,
|
||||||
NULL::date AS start_date,
|
|
||||||
NULL::date AS deferred_until,
|
|
||||||
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
||||||
'ticket' AS case_type,
|
|
||||||
NULL::text AS kontakt_navn,
|
|
||||||
COALESCE(uu.full_name, uu.username) AS ansvarlig_navn,
|
|
||||||
NULL::text AS assigned_group_name,
|
|
||||||
COALESCE(t.priority, 'normal') AS priority,
|
COALESCE(t.priority, 'normal') AS priority,
|
||||||
CASE
|
CASE
|
||||||
WHEN t.priority = 'urgent' THEN 'Urgent prioritet'
|
WHEN t.priority = 'urgent' THEN 'Urgent prioritet'
|
||||||
@ -583,7 +482,6 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
|
|||||||
END AS attention_reason
|
END AS attention_reason
|
||||||
FROM tticket_tickets t
|
FROM tticket_tickets t
|
||||||
LEFT JOIN customers c ON c.id = t.customer_id
|
LEFT JOIN customers c ON c.id = t.customer_id
|
||||||
LEFT JOIN users uu ON uu.user_id = t.assigned_to_user_id
|
|
||||||
WHERE t.status IN ('open', 'in_progress', 'pending_customer')
|
WHERE t.status IN ('open', 'in_progress', 'pending_customer')
|
||||||
AND COALESCE(t.priority, '') IN ('urgent', 'high')
|
AND COALESCE(t.priority, '') IN ('urgent', 'high')
|
||||||
AND (t.assigned_to_user_id = %s OR t.assigned_to_user_id IS NULL)
|
AND (t.assigned_to_user_id = %s OR t.assigned_to_user_id IS NULL)
|
||||||
@ -644,36 +542,19 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
|
|||||||
# Get group cases (cases assigned to user's groups)
|
# Get group cases (cases assigned to user's groups)
|
||||||
group_cases = []
|
group_cases = []
|
||||||
if user_group_ids:
|
if user_group_ids:
|
||||||
group_cases_query = f"""
|
group_cases_query = """
|
||||||
SELECT
|
SELECT
|
||||||
s.id,
|
s.id,
|
||||||
s.titel,
|
s.titel,
|
||||||
s.beskrivelse,
|
|
||||||
s.priority,
|
|
||||||
s.status,
|
s.status,
|
||||||
s.created_at,
|
s.created_at,
|
||||||
{case_start_date_sql},
|
|
||||||
s.deferred_until,
|
|
||||||
s.deadline,
|
s.deadline,
|
||||||
{case_type_sql},
|
|
||||||
s.assigned_group_id,
|
s.assigned_group_id,
|
||||||
g.name AS group_name,
|
g.name AS group_name,
|
||||||
COALESCE(c.name, 'Ukendt kunde') AS customer_name,
|
COALESCE(c.name, 'Ukendt kunde') AS customer_name
|
||||||
CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
|
|
||||||
COALESCE(u.full_name, u.username) AS ansvarlig_navn,
|
|
||||||
g.name AS assigned_group_name
|
|
||||||
FROM sag_sager s
|
FROM sag_sager s
|
||||||
LEFT JOIN customers c ON c.id = s.customer_id
|
LEFT JOIN customers c ON c.id = s.customer_id
|
||||||
LEFT JOIN groups g ON g.id = s.assigned_group_id
|
LEFT JOIN groups g ON g.id = s.assigned_group_id
|
||||||
LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
|
|
||||||
LEFT JOIN LATERAL (
|
|
||||||
SELECT cc.contact_id
|
|
||||||
FROM contact_companies cc
|
|
||||||
WHERE cc.customer_id = c.id
|
|
||||||
ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
|
|
||||||
LIMIT 1
|
|
||||||
) cc_first ON true
|
|
||||||
LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
|
|
||||||
WHERE s.deleted_at IS NULL
|
WHERE s.deleted_at IS NULL
|
||||||
AND s.assigned_group_id = ANY(%s)
|
AND s.assigned_group_id = ANY(%s)
|
||||||
AND s.status <> 'lukket'
|
AND s.status <> 'lukket'
|
||||||
|
|||||||
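Several of the removed queries shared the same `LEFT JOIN LATERAL (... ORDER BY cc.is_primary DESC ... LIMIT 1)` block to attach at most one contact per customer. `LATERAL` is PostgreSQL-specific; a runnable sketch of the equivalent idea with a correlated subquery in sqlite3 (table and column names taken from the diff, the rows themselves invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE contact_companies (
        id INTEGER PRIMARY KEY, customer_id INTEGER, contact_id INTEGER, is_primary INTEGER
    );
    CREATE TABLE contacts (id INTEGER PRIMARY KEY, first_name TEXT, last_name TEXT);
    INSERT INTO customers VALUES (1, 'Acme');
    INSERT INTO contacts VALUES (10, 'Jens', 'Jensen'), (11, 'Bo', 'Berg');
    -- contact 11 has the lowest id, but contact 10 carries the primary flag
    INSERT INTO contact_companies VALUES (1, 1, 11, 0), (2, 1, 10, 1);
""")

# Same effect as the Postgres LEFT JOIN LATERAL (...) LIMIT 1 block:
# pick exactly one contact per customer, preferring the primary flag,
# tie-breaking on the lowest contact_companies id.
row = conn.execute("""
    SELECT c.name,
           (SELECT cc.contact_id
              FROM contact_companies cc
             WHERE cc.customer_id = c.id
             ORDER BY cc.is_primary DESC, cc.id ASC
             LIMIT 1) AS contact_id
      FROM customers c
""").fetchone()
print(row)  # → ('Acme', 10)
```

The `NULLS LAST` qualifier from the original is dropped here because this sketch uses a NOT NULL flag; in PostgreSQL it guards against rows where `is_primary` was never set.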
@@ -1,12 +0,0 @@
-import re
-
-with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
-    content = f.read()
-
-# 1. Fjern max-width
-content = content.replace('<div class="container-fluid" style="margin-top: 2rem; margin-bottom: 2rem; max-width: 1400px;">', '<div class="container-fluid" style="margin-top: 2rem; margin-bottom: 2rem;">')
-
-# Find de dele vi vil genbruge (dette kræver præcis regex eller dom parsing)
-# For denne opgave benytter vi en mere generel struktur opdatering ved at finde specifikke markører.
-# Her antager jeg scriptet er et template udkast
-print("Script executed.")
@@ -1,37 +0,0 @@
-# Email Feature Backup
-
-Backup artifact for current email handling implementation.
-
-## Artifact
-
-- `email_feature_backup_20260317_214413.zip`
-
-## Contents
-
-- `app/emails/`
-- `app/services/email_service.py`
-- `app/services/email_processor_service.py`
-- `app/services/email_analysis_service.py`
-- `app/services/email_workflow_service.py`
-- `app/services/email_activity_logger.py`
-- `app/modules/sag/templates/detail.html`
-- `migrations/013_email_system.sql`
-- `migrations/014_email_workflows.sql`
-- `migrations/050_email_activity_log.sql`
-- `migrations/056_email_import_method.sql`
-- `migrations/084_sag_files_and_emails.sql`
-- `migrations/140_email_extracted_vendor_fields.sql`
-- `migrations/141_email_threading_headers.sql`
-
-## Restore (code only)
-
-From repository root:
-
-```bash
-unzip -o backups/email_feature/email_feature_backup_20260317_214413.zip -d .
-```
-
-## Notes
-
-- This restore only replaces code files included in the artifact.
-- Database rollback must be handled separately if schema/data has changed.
Binary file not shown.
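The README's one-line restore unpacks straight over the working tree. A more cautious flow, sketched with Python's `zipfile` against a throwaway archive (the real `email_feature_backup_20260317_214413.zip` is the binary artifact above and is not reproduced here): list the contents first, then extract into a side directory so you can diff before overwriting anything.

```python
import os
import tempfile
import zipfile

# Build a tiny stand-in archive so the sketch is self-contained.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "app", "services")
os.makedirs(src)
with open(os.path.join(src, "email_service.py"), "w") as f:
    f.write("print('email service')\n")

archive = os.path.join(tmp, "backup.zip")
with zipfile.ZipFile(archive, "w") as zf:
    zf.write(os.path.join(src, "email_service.py"), "app/services/email_service.py")

with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()        # review exactly which files would be overwritten
    print(names)                 # → ['app/services/email_service.py']
    dest = os.path.join(tmp, "restore")
    zf.extractall(dest)          # like `unzip -o ... -d restore`, not `-d .`
```

Extracting to a scratch directory and diffing against the repo keeps the "code only" caveat honest: you see what a restore would clobber before it touches the live tree.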
@@ -1,195 +0,0 @@
-<!DOCTYPE html>
-<html lang="da">
-<head>
-    <meta charset="UTF-8">
-    <meta name="viewport" content="width=device-width, initial-scale=1.0">
-    <title>Kompakte Designforslag - Sagsdetaljer</title>
-    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
-    <style>
-        body {
-            background-color: #f8f9fa;
-            font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif;
-            padding-bottom: 50px;
-            font-size: 0.9rem; /* Generelt lidt mindre tekststørrelse for helheden */
-        }
-        .container { max-width: 900px; }
-        .section-title {
-            color: #0f4c75;
-            border-bottom: 2px solid #0f4c75;
-            padding-bottom: 4px;
-            margin-top: 40px;
-            margin-bottom: 15px;
-            font-size: 1.1rem;
-            font-weight: 600;
-        }
-
-        /* Værktøjs-klasser for at fjerne default p-margins */
-        p:last-child { margin-bottom: 0; }
-
-        /* ---------------------------------------------------
-           Forslag 1: Inline Chat / Slack Kompakt
-           Alt på (næsten) én linje, nul spildplads.
-           --------------------------------------------------- */
-        .f1-container { background: white; border: 1px solid #ddd; border-radius: 4px; padding: 10px; }
-        .f1-row { padding: 4px 8px; border-radius: 4px; line-height: 1.4; border-bottom: 1px dashed #f0f0f0; }
-        .f1-row:last-child { border-bottom: none; }
-        .f1-row:hover { background-color: #f8f9fa; }
-        .f1-meta { color: #888; font-size: 0.8rem; margin-right: 8px; font-family: SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace; }
-        .f1-author { font-weight: 600; margin-right: 8px; }
-        .f1-author.tech { color: #0f4c75; }
-        .f1-author.cust { color: #b33939; }
-        .f1-content { color: #222; }
-
-        /* ---------------------------------------------------
-           Forslag 2: Helpdesk Split (Venstre/Højre)
-           Tydelig opdeling mellem forfatter og tekst, men kompakt font.
-           --------------------------------------------------- */
-        .f2-container { border: 1px solid #dee2e6; border-radius: 4px; background: white; }
-        .f2-row { display: flex; border-bottom: 1px solid #e9ecef; }
-        .f2-row:last-child { border-bottom: none; }
-        .f2-left { width: 140px; background: #f8f9fa; padding: 8px 10px; flex-shrink: 0; border-right: 1px solid #e9ecef; }
-        .f2-right { padding: 8px 12px; flex-grow: 1; color: #333; }
-        .f2-name { font-weight: 600; font-size: 0.85rem; margin-bottom: 2px; }
-        .f2-time { font-size: 0.75rem; color: #6c757d; }
-        .f2-badge-tech { color: #0f4c75; font-size: 0.75rem; font-weight: bold; }
-        .f2-badge-cust { color: #6c757d; font-size: 0.75rem; font-weight: bold; }
-
-        /* ---------------------------------------------------
-           Forslag 3: Minimalistisk Logbog
-           Information-dense liste. Små headere, tekst umiddelbart under.
-           --------------------------------------------------- */
-        .f3-container { background: white; border: 1px solid #ccc; border-radius: 4px; }
-        .f3-item { border-bottom: 1px solid #eee; }
-        .f3-item:last-child { border-bottom: none; }
-        .f3-header { display: flex; justify-content: space-between; align-items: center; background: #fdfdfd; padding: 3px 8px; border-left: 3px solid #ccc; font-size: 0.85rem; font-weight: 600; color: #444; }
-        .f3-header.tech { border-left-color: #0f4c75; background: #f0f4f8; }
-        .f3-body { padding: 6px 8px 8px 11px; color: #222; }
-        .f3-small { font-weight: normal; color: #777; font-size: 0.75rem; }
-
-    </style>
-</head>
-<body>
-
-    <div class="container mt-4">
-        <h2 class="mb-4 text-center">Fokuseret på Minimal Plads (Kompakt)</h2>
-        <p class="text-muted text-center mb-5">Her er 3 designs uden store ikoner, uden navnebobler og med minimal whitespace, præcis som i en professionel log eller et tæt ticket-system.</p>
-
-        <!-- ==============================================
-             FORSLAG 1: INLINE CHAT (SLACK KOMPAKT STYLE)
-             ============================================== -->
-        <h3 class="section-title">Forslag 1: Inline Log (Slack Kompakt style)</h3>
-        <p class="text-muted small mb-2">Minder om terminal-output eller kompakt chat. Alt udover teksten står på én linje, og marginer er næsten fjernet.</p>
-
-        <div class="f1-container">
-            <div class="f1-row">
-                <span class="f1-meta">I dag 10:00</span>
-                <span class="f1-author cust">Jens Jensen:</span>
-                <span class="f1-content">Vi har et problem med at vores to printere på kontoret ikke vil forbinde til netværket siden fredag. Skærmene lyser, men de melder offline på printserveren.</span>
-            </div>
-            <div class="f1-row">
-                <span class="f1-meta">I dag 10:15</span>
-                <span class="f1-author tech">Christian Thomas:</span>
-                <span class="f1-content">Hej Jens. Jeg kan se at switchen port 4 & 5 var nede hurtigt i nat. Har I prøvet at genstarte dem, så de fanger ny DHCP IP?</span>
-            </div>
-            <div class="f1-row">
-                <span class="f1-meta">I dag 10:35</span>
-                <span class="f1-author cust">Jens Jensen:</span>
-                <span class="f1-content">Ja, det har vi nu og det løste det mærkeligt nok for den ene, men HP printeren driller stadig.</span>
-            </div>
-            <div class="f1-row">
-                <span class="f1-meta">I dag 10:45</span>
-                <span class="f1-author tech">Christian Thomas:</span>
-                <span class="f1-content">Jeg logger lige på jeres firewall udefra om 5 minutter og tjekker om HP'en er blokeret i MAC-filteret.</span>
-            </div>
-        </div>
-
-
-        <!-- ==============================================
-             FORSLAG 2: HELPDESK SPLIT (2-KOLONNER)
-             ============================================== -->
-        <h3 class="section-title">Forslag 2: Helpdesk Split (ITSM style)</h3>
-        <p class="text-muted small mb-2">Klassisk 2-kolonne layout. Venstre side har fast bredde til metadata, højre side udnytter hele bredden til ren tekst. Ingen tidsspilde vertikalt.</p>
-
-        <div class="f2-container">
-            <!-- Oprindelig sag -->
-            <div class="f2-row">
-                <div class="f2-left">
-                    <div class="f2-name">Jens Jensen</div>
-                    <div class="f2-badge-cust">KUNDE</div>
-                    <div class="f2-time mt-1">I dag, kl. 10:00</div>
-                </div>
-                <div class="f2-right">
-                    Vi har et problem med at vores to printere på kontoret ikke vil forbinde til netværket siden fredag. Skærmene lyser, men de melder offline på printserveren.
-                </div>
-            </div>
-
-            <!-- Svar -->
-            <div class="f2-row">
-                <div class="f2-left">
-                    <div class="f2-name text-primary">Christian Thomas</div>
-                    <div class="f2-badge-tech">BMC NETWORKS</div>
-                    <div class="f2-time mt-1">I dag, kl. 10:15</div>
-                </div>
-                <div class="f2-right">
-                    Hej Jens.<br>Jeg kan se at switchen port 4 & 5 var nede hurtigt i nat. Har I prøvet at genstarte dem, så de fanger ny DHCP IP?
-                </div>
-            </div>
-
-            <!-- Svar -->
-            <div class="f2-row">
-                <div class="f2-left">
-                    <div class="f2-name">Jens Jensen</div>
-                    <div class="f2-badge-cust">KUNDE</div>
-                    <div class="f2-time mt-1">I dag, kl. 10:35</div>
-                </div>
-                <div class="f2-right">
-                    Ja, det har vi nu og det løste det mærkeligt nok for den ene, men HP printeren driller stadig.
-                </div>
-            </div>
-        </div>
-
-
-        <!-- ==============================================
-             FORSLAG 3: MINIMALISTISK LOGBOG
-             ============================================== -->
-        <h3 class="section-title">Forslag 3: Minimalistisk Logbog</h3>
-        <p class="text-muted small mb-2">Hver tråd adskilles af en meget tynd grå overskrift. Ingen kasser rundt om teksten, den flyder frit for maksimal informationsdensitet.</p>
-
-        <div class="f3-container">
-
-            <div class="f3-item">
-                <div class="f3-header">
-                    <span>Jens Jensen <span class="fw-normal text-muted">(Kunde)</span></span>
-                    <span class="f3-small">I dag, kl. 10:00</span>
-                </div>
-                <div class="f3-body">
-                    Vi har et problem med at vores to printere på kontoret ikke vil forbinde til netværket siden fredag. Skærmene lyser, men de melder offline på printserveren.
-                </div>
-            </div>
-
-            <div class="f3-item">
-                <div class="f3-header tech">
-                    <span>Christian Thomas <span class="fw-normal text-muted" style="color: #0f4c75 !important;">(Tekniker)</span></span>
-                    <span class="f3-small">I dag, kl. 10:15</span>
-                </div>
-                <div class="f3-body">
-                    Hej Jens.<br>Jeg kan se at switchen port 4 & 5 var nede hurtigt i nat. Har I prøvet at genstarte dem, så de fanger ny DHCP IP?
-                </div>
-            </div>
-
-            <div class="f3-item">
-                <div class="f3-header">
-                    <span>Jens Jensen <span class="fw-normal text-muted">(Kunde)</span></span>
-                    <span class="f3-small">I dag, kl. 10:35</span>
-                </div>
-                <div class="f3-body">
-                    Ja, det har vi nu og det løste det mærkeligt nok for den ene, men HP printeren driller stadig.
-                </div>
-            </div>
-
-        </div>
-
-    </div>
-
-</body>
-</html>
@ -1,338 +0,0 @@
|
|||||||
<!DOCTYPE html>
|
|
||||||
<html lang="da">
|
|
||||||
<head>
|
|
||||||
<meta charset="UTF-8">
|
|
||||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
|
||||||
<title>Designforslag - Sagsdetaljer & Kommentarer</title>
|
|
||||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
|
|
||||||
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.1/font/bootstrap-icons.css">
|
|
||||||
<style>
|
|
||||||
body {
|
|
||||||
background: #f0f2f5;
|
|
||||||
padding: 3rem 1rem;
|
|
||||||
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif;
|
|
||||||
}
|
|
||||||
.container {
|
|
||||||
max-width: 900px;
|
|
||||||
}
|
|
||||||
.proposal-wrapper {
|
|
||||||
background: white;
|
|
||||||
border-radius: 16px;
|
|
||||||
box-shadow: 0 4px 20px rgba(0,0,0,0.06);
|
|
||||||
padding: 2rem;
|
|
||||||
margin-bottom: 4rem;
|
|
||||||
}
|
|
||||||
.proposal-title {
|
|
||||||
border-bottom: 2px solid #f0f2f5;
|
|
||||||
padding-bottom: 1rem;
|
|
||||||
margin-bottom: 2rem;
|
|
||||||
color: #0f4c75;
|
|
||||||
}
|
|
||||||
|
|
||||||
/* Fælles: Opgavebeskrivelse kort */
|
|
||||||
.desc-card {
|
|
||||||
background: #fff;
|
|
||||||
border: 1px solid #e1e4e8;
|
|
||||||
border-radius: 8px;
|
|
||||||
padding: 1.5rem;
|
|
||||||
margin-bottom: 2rem;
|
|
||||||
}
|
|
||||||
.desc-label {
|
|
||||||
font-size: 0.75rem;
|
|
||||||
text-transform: uppercase;
|
|
||||||
letter-spacing: 1px;
|
|
||||||
color: #6c757d;
|
|
||||||
font-weight: 700;
|
|
||||||
margin-bottom: 1rem;
|
|
||||||
display: flex;
|
|
||||||
justify-content: space-between;
|
|
||||||
}
|
|
||||||
|
|
||||||
/* ---------------------------------------------------
|
|
||||||
FORSLAG 1: CHAT / MESSENGER STYLE
|
|
||||||
--------------------------------------------------- */
|
|
||||||
.chat-container {
|
|
||||||
display: flex;
|
|
||||||
flex-direction: column;
|
|
||||||
gap: 1rem;
|
|
||||||
}
|
|
||||||
.chat-msg {
|
|
||||||
max-width: 85%;
|
|
||||||
padding: 1rem;
|
|
||||||
border-radius: 12px;
|
|
||||||
position: relative;
|
|
||||||
}
|
|
||||||
.chat-internal {
|
|
||||||
align-self: flex-end;
|
|
||||||
background-color: #e3f2fd;
|
|
||||||
border-bottom-right-radius: 2px;
|
|
||||||
color: #084298;
|
|
||||||
}
|
|
||||||
.chat-customer {
|
|
||||||
align-self: flex-start;
|
|
||||||
background-color: #f1f3f5;
|
|
||||||
border-bottom-left-radius: 2px;
|
|
||||||
color: #212529;
|
|
||||||
border: 1px solid #e9ecef;
|
|
||||||
}
|
|
||||||
.chat-header {
|
|
||||||
display: flex;
|
|
||||||
justify-content: space-between;
|
|
||||||
align-items: center;
|
|
||||||
font-size: 0.8rem;
|
|
||||||
margin-bottom: 0.5rem;
|
|
||||||
font-weight: 600;
|
|
||||||
opacity: 0.8;
|
|
||||||
}
|
|
||||||
|
|
||||||
/* ---------------------------------------------------
|
|
||||||
FORSLAG 2: TIMELINE / ACTIVITY FEED
|
|
||||||
--------------------------------------------------- */
|
|
||||||
.timeline {
|
|
||||||
position: relative;
|
|
||||||
padding-left: 3rem;
|
|
||||||
margin-top: 1rem;
|
|
||||||
}
|
|
||||||
.timeline::before {
|
|
||||||
content: '';
|
|
||||||
position: absolute;
|
|
||||||
left: 17px;
|
|
||||||
top: 0;
|
|
||||||
bottom: 0;
|
|
||||||
width: 2px;
|
|
||||||
background: #e9ecef;
|
|
||||||
}
|
|
||||||
.timeline-item {
|
|
||||||
position: relative;
|
|
||||||
margin-bottom: 2rem;
|
|
||||||
}
|
|
||||||
.timeline-icon {
|
|
||||||
position: absolute;
|
|
||||||
left: -3rem;
|
|
||||||
width: 36px;
|
|
||||||
height: 36px;
|
|
||||||
border-radius: 50%;
|
|
||||||
background: white;
|
|
||||||
border: 2px solid #e9ecef;
|
|
||||||
display: flex;
|
|
||||||
align-items: center;
|
|
||||||
justify-content: center;
|
|
||||||
z-index: 1;
|
|
||||||
font-size: 1.1rem;
|
|
||||||
}
|
|
||||||
.icon-internal { border-color: #ffc107; color: #ffc107; }
|
|
||||||
.icon-customer { border-color: #198754; color: #198754; }
|
|
||||||
.icon-system { border-color: #6c757d; color: #6c757d; }
|
|
||||||
|
|
||||||
.timeline-content {
|
|
||||||
background: white;
|
|
||||||
border: 1px solid #e9ecef;
|
|
||||||
border-radius: 8px;
|
|
||||||
box-shadow: 0 2px 4px rgba(0,0,0,0.02);
|
|
||||||
padding: 1.25rem;
|
|
||||||
}
|
|
||||||
.timeline-meta {
|
|
||||||
font-size: 0.85rem;
|
|
||||||
color: #6c757d;
|
|
||||||
margin-bottom: 0.75rem;
|
|
||||||
display: flex;
|
|
||||||
align-items: center;
|
|
||||||
gap: 0.5rem;
|
|
||||||
}
|
|
||||||
|
|
||||||
/* ---------------------------------------------------
|
|
||||||
FORSLAG 3: CLEAN CARDS MED FARVEKODER (Trello/Jira style)
|
|
||||||
--------------------------------------------------- */
|
|
||||||
.comment-card {
|
|
||||||
background: white;
|
|
||||||
border: 1px solid #e9ecef;
|
|
||||||
border-radius: 8px;
|
|
||||||
margin-bottom: 1.5rem;
|
|
||||||
overflow: hidden;
|
|
||||||
box-shadow: 0 1px 3px rgba(0,0,0,0.02);
|
|
||||||
}
|
|
||||||
.comment-card.type-internal {
|
|
||||||
border-left: 4px solid #ffc107; /* Gul venstre kant for intern */
|
|
||||||
}
|
|
||||||
.comment-card.type-customer {
|
|
||||||
border-left: 4px solid #0dcaf0; /* Blå/grøn venstre kant for kunde */
|
|
||||||
}
|
|
||||||
.card-header-clean {
|
|
||||||
background: #f8f9fa;
|
|
||||||
border-bottom: 1px solid #e9ecef;
|
|
||||||
padding: 0.75rem 1.25rem;
|
|
||||||
display: flex;
|
|
||||||
align-items: center;
|
|
||||||
justify-content: space-between;
|
|
||||||
}
|
|
||||||
.card-header-clean .author {
|
|
||||||
font-weight: 600;
|
|
||||||
font-size: 0.9rem;
|
|
||||||
display: flex;
|
|
||||||
align-items: center;
|
|
||||||
gap: 0.5rem;
|
|
||||||
}
|
|
||||||
.badge-type {
|
|
||||||
font-size: 0.7rem;
|
|
||||||
padding: 0.2rem 0.5rem;
|
|
||||||
border-radius: 12px;
|
|
||||||
}
|
|
||||||
</style>
|
|
||||||
</head>
|
|
||||||
<body>
|
|
||||||
|
|
||||||
<div class="container">
|
|
||||||
<div class="mb-5 text-center">
|
|
||||||
<h1 class="fw-bold" style="color: #0f4c75;">UI Forslag: Sagsdetaljer & Kommentarer</h1>
|
|
||||||
<p class="text-muted">3 forskellige måder at redesigne "Opgavebeskrivelse" og "Kommentarer" på, uden at røre live-koden endnu.</p>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
    <!-- PROPOSAL 1 -->
    <div class="proposal-wrapper">
        <h3 class="proposal-title"><i class="bi bi-chat-dots me-2"></i>Proposal 1: Chat / Messenger UI</h3>
        <p class="text-muted mb-4">Makes it easy to tell apart who says what. Internal notes (right, blue), customer replies (left, grey). The description is "pinned" at the top as the task's starting point.</p>

        <div class="desc-card">
            <div class="desc-label">
                <span><i class="bi bi-card-text me-2"></i>Task Description</span>
                <a href="#" class="text-decoration-none text-muted"><i class="bi bi-pencil-square"></i> Edit</a>
            </div>
            <p class="mb-0">awrtqerqerg</p>
        </div>

        <div class="chat-container mt-4">
            <div class="chat-msg chat-internal">
                <div class="chat-header">
                    <span><i class="bi bi-lock-fill me-1"></i> Quick comment (Internal)</span>
                    <span class="small fw-normal">19/03-2026 06:34</span>
                </div>
                <div>tiest</div>
            </div>

            <div class="chat-msg chat-internal">
                <div class="chat-header">
                    <span><i class="bi bi-lock-fill me-1"></i> Quick comment (Internal)</span>
                    <span class="small fw-normal">19/03-2026 07:30</span>
                </div>
                <div>test</div>
            </div>

            <div class="chat-msg chat-customer">
                <div class="chat-header">
                    <span><i class="bi bi-person-circle me-1"></i> User / Customer reply</span>
                    <span class="small fw-normal">19/03-2026 08:03</span>
                </div>
                <div>sdfsdfsdfgsg</div>
            </div>
        </div>
    </div>
    <!-- PROPOSAL 2 -->
    <div class="proposal-wrapper">
        <h3 class="proposal-title"><i class="bi bi-clock-history me-2"></i>Proposal 2: Timeline / Activity Feed</h3>
        <p class="text-muted mb-4">Inspired by GitHub Issues. A vertical history line gathers creation, comments, and changes into one easy-to-read flow.</p>

        <div class="timeline">
            <!-- The description as the first entry in the timeline -->
            <div class="timeline-item">
                <div class="timeline-icon icon-system"><i class="bi bi-flag-fill"></i></div>
                <div class="timeline-content">
                    <div class="timeline-meta">
                        <strong>Case created</strong> • Description added
                        <div class="ms-auto"><a href="#" class="text-muted"><i class="bi bi-pencil"></i></a></div>
                    </div>
                    <div class="p-3 bg-light rounded border">
                        awrtqerqerg
                    </div>
                </div>
            </div>

            <div class="timeline-item">
                <div class="timeline-icon icon-internal"><i class="bi bi-chat-square-text"></i></div>
                <div class="timeline-content">
                    <div class="timeline-meta">
                        <strong>Quick comment</strong> (Internal note)
                        <span class="ms-auto text-muted small">19/03-2026 06:34</span>
                    </div>
                    <div>tiest</div>
                </div>
            </div>

            <div class="timeline-item">
                <div class="timeline-icon icon-customer"><i class="bi bi-envelope"></i></div>
                <div class="timeline-content" style="border-color: #c3e6cb;">
                    <div class="timeline-meta text-success">
                        <strong>User</strong> (Reply from customer)
                        <span class="ms-auto text-muted small">19/03-2026 08:03</span>
                    </div>
                    <div>sdfsdfsdfgsg</div>
                </div>
            </div>
        </div>
    </div>
    <!-- PROPOSAL 3 -->
    <div class="proposal-wrapper">
        <h3 class="proposal-title"><i class="bi bi-card-list me-2"></i>Proposal 3: Clean Cards (Color-coded left border)</h3>
        <p class="text-muted mb-4">A very clean look for CRM / enterprise systems. Keeps full width for long text, but uses a thick color code on the left border to identify the type at a glance.</p>

        <div class="desc-card shadow-sm" style="border-top: 4px solid #0f4c75;">
            <div class="desc-label">
                <span>Task Description</span>
                <button class="btn btn-sm btn-outline-secondary py-0"><i class="bi bi-pencil"></i> Edit</button>
            </div>
            <p class="mb-0 fs-5">awrtqerqerg</p>
        </div>

        <h5 class="mb-3 mt-5" style="color: #6c757d; font-size: 0.9rem; text-transform: uppercase;">Comments &amp; History</h5>

        <div class="comment-card type-internal">
            <div class="card-header-clean">
                <div class="author">
                    <div class="bg-warning text-dark rounded-circle d-flex align-items-center justify-content-center" style="width:24px; height:24px; font-size:0.75rem;"><i class="bi bi-lock-fill"></i></div>
                    Quick comment
                    <span class="badge bg-warning text-dark badge-type ms-2">Internal Note</span>
                </div>
                <div class="text-muted small">19/03-2026 06:34</div>
            </div>
            <div class="card-body p-3">
                tiest
            </div>
        </div>

        <div class="comment-card type-internal">
            <div class="card-header-clean">
                <div class="author">
                    <div class="bg-warning text-dark rounded-circle d-flex align-items-center justify-content-center" style="width:24px; height:24px; font-size:0.75rem;"><i class="bi bi-lock-fill"></i></div>
                    Quick comment
                    <span class="badge bg-warning text-dark badge-type ms-2">Internal Note</span>
                </div>
                <div class="text-muted small">19/03-2026 07:59</div>
            </div>
            <div class="card-body p-3">
                adfgaegea hsrhsh
            </div>
        </div>

        <div class="comment-card type-customer">
            <div class="card-header-clean">
                <div class="author">
                    <div class="bg-info text-white rounded-circle d-flex align-items-center justify-content-center" style="width:24px; height:24px; font-size:0.75rem;"><i class="bi bi-person-fill"></i></div>
                    User
                    <span class="badge bg-info text-dark badge-type ms-2">Customer</span>
                </div>
                <div class="text-muted small">19/03-2026 08:03</div>
            </div>
            <div class="card-body p-3">
                sdfsdfsdfgsg
            </div>
        </div>

    </div>

</div>

</body>
</html>