Compare commits
70 Commits
| SHA1 |
|---|
| a8eaf6e2a9 |
| 92b888b78f |
| dcae962481 |
| 243e4375e0 |
| 153eb728e2 |
| 73803f894b |
| 60d692c085 |
| beaea0288c |
| e07932f2cc |
| 7a95623094 |
| 9a3ada380f |
| eb5e14e2a1 |
| 074ab6a62a |
| 15feb18361 |
| 695854a272 |
| 1d7107bff0 |
| 7678b58cb4 |
| 7e77266d97 |
| ba9622250a |
| e3094d7ed0 |
| 959c9b4401 |
| acdc94cd18 |
| ed01f07f86 |
| 1323320fed |
| 9fc57feda4 |
| 2bd5a3e057 |
| 4760b8b3c4 |
| 701cc63375 |
| 803b45fab4 |
| 45d8f4209b |
| 91f709f4fe |
| dd02701b21 |
| 8b863a3b68 |
| 827463d59e |
| b80f91fae1 |
| 81cc3a4a9e |
| b0a51f1919 |
| 2d2c7aeb9b |
| bf28e94d6e |
| 72acca9e8b |
| 4953c82b93 |
| 4b2b0ea0f3 |
| 8d29302b01 |
| 8a0dbcd1cc |
| d561a063f6 |
| 14ccd5accf |
| bdf76a2a80 |
| 2ed3118c83 |
| aabd9f0069 |
| 5e94fc5e69 |
| de59bc8367 |
| 744b405142 |
| ea4905ef8a |
| 09de3c7373 |
| c6d310e96d |
| 3d24987365 |
| 2fc8a1adce |
| aa7b0894af |
| 3978dae692 |
| c5aa31b825 |
| 84c837f303 |
| eb0dad8a10 |
| 14e1c87a4c |
| 04acdecb91 |
| a8970701ab |
| 07584b1b0c |
| 14b13b8239 |
| a33da15550 |
| 8d7d32571a |
| abd5014eb0 |
**.github/skills/gui-starter/SKILL.md** (vendored, new file, 38 lines)

---
name: gui-starter
description: "Use when building or updating BMC Hub GUI pages, templates, layout, styling, dark mode toggle, responsive Bootstrap 5 UI, or Nordic Top themed frontend components."
---

# BMC Hub GUI Starter

## Purpose

Use this skill when implementing or refining frontend UI in BMC Hub.

## Project UI Rules

- Follow the Nordic Top style from `docs/design_reference/`.
- Keep a minimalist, clean layout with card-based sections.
- Use Deep Blue as the default primary accent: `#0f4c75`.
- Support dark mode with a visible toggle.
- Use CSS variables so accent colors can be changed dynamically.
- Build mobile-first with Bootstrap 5 grid utilities.

## Preferred Workflow

1. Identify the existing template/page and preserve established structure when present.
2. Define or update theme tokens as CSS variables (light + dark).
3. Implement the responsive layout first, then enhance desktop spacing/typography.
4. Add or maintain dark mode toggle logic (persist the preference in localStorage when relevant).
5. Reuse patterns from `docs/design_reference/components.html`, `docs/design_reference/index.html`, `docs/design_reference/customers.html`, and `docs/design_reference/form.html`.
6. Validate visual consistency and avoid introducing one-off styles unless necessary.

## Implementation Guardrails

- Do not hardcode colors repeatedly; map them to CSS variables.
- Do not remove dark mode support from existing pages.
- Do not break existing navigation/topbar behavior.
- Avoid large framework changes unless explicitly requested.
- Keep accessibility basics in place: color contrast, visible focus states, semantic HTML.

## Deliverables

When using this skill, provide:

- Updated frontend files (HTML/CSS/JS) with concise, intentional styling.
- A short summary of what changed and why.
- Notes about any remaining UI tradeoffs or follow-up refinements.
**.gitignore** (vendored, 1 addition)

```diff
@@ -28,3 +28,4 @@ htmlcov/
 .coverage
 .pytest_cache/
 .mypy_cache/
+RELEASE_NOTES_v2.2.38.md
```
**Dockerfile** (14 changes)

```diff
@@ -38,8 +38,18 @@ RUN if [ "$RELEASE_VERSION" != "latest" ] && [ -n "$GITHUB_TOKEN" ]; then \
     pip install --no-cache-dir -r /tmp/requirements.txt; \
     fi
 
-# Copy application code
-COPY . .
+# Copy local source to temp location.
+# In release builds we keep downloaded source in /app.
+# In latest/local builds we copy from /app_local to /app.
+COPY . /app_local
+
+RUN if [ "$RELEASE_VERSION" = "latest" ] || [ -z "$GITHUB_TOKEN" ]; then \
+    echo "Using local source files..." && \
+    cp -a /app_local/. /app/; \
+    else \
+    echo "Keeping downloaded release source in /app (no local override)"; \
+    fi && \
+    rm -rf /app_local
 
 # Create necessary directories
 RUN mkdir -p /app/logs /app/uploads /app/static /app/data
```
**RELEASE_NOTES_v2.2.3.md** (new file, 15 lines)

# BMC Hub v2.2.3 - Migration Hotfix

**Release Date:** 22 February 2026

## 🛠️ Hotfix

### Migration 138 compatibility fix
- Fixed `migrations/138_customers_economic_unique_constraint.sql` for environments where `customers.economic_customer_number` is numeric (`integer`).
- Removed unconditional `btrim(...)` usage on non-text columns.
- Added type-aware normalization logic that only applies trimming for text-like column types.

## ✅ Impact

- Migration `138_customers_economic_unique_constraint.sql` now runs on both numeric and text column variants without `function btrim(integer) does not exist` errors.
- Unique index safety behavior and duplicate detection are unchanged.
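The type-aware normalization idea can be sketched as follows. This is a minimal illustration, not the migration's actual SQL: the `TEXT_LIKE_TYPES` set and the helper function are assumptions based on the notes above.

```python
# Hypothetical sketch of the type-aware normalization: apply btrim() only
# when the column's reported type is text-like, so the generated SQL never
# calls btrim(integer).

TEXT_LIKE_TYPES = {"text", "character varying", "character", "citext"}

def normalized_expr(column: str, data_type: str) -> str:
    """Return the SQL expression used to normalize a column value."""
    if data_type in TEXT_LIKE_TYPES:
        # Trim whitespace and treat empty strings as NULL for text columns.
        return f"NULLIF(btrim({column}), '')"
    # Non-text columns (e.g. integer) are compared as-is.
    return column
```

With `data_type` taken from `information_schema.columns`, the same migration then works on both the `integer` and `character varying` variants of the column.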
**RELEASE_NOTES_v2.2.36.md** (new file, 30 lines)

# BMC Hub v2.2.36 - Helpdesk SAG Routing

**Release Date:** 2 March 2026

## ✨ New Features

### Helpdesk email → SAG automation
- Incoming emails from known customer domains now auto-create a new SAG when no `SAG-<id>` reference is present.
- Incoming emails with `SAG-<id>` in subject or threading headers now auto-update the related SAG.
- Emails from unknown domains remain in `/emails` for manual handling.

### Email threading support for routing
- Added migration `141_email_threading_headers.sql`.
- `email_messages` now stores `in_reply_to` and `email_references` to support robust SAG threading lookup.

### /emails quick customer creation improvements
- Quick create customer modal now includes `email_domain`.
- Customer create API now accepts and persists `email_domain`.

## 🔧 Technical Changes

- Updated `app/services/email_service.py` to parse and persist `In-Reply-To` and `References` from IMAP/EML uploads.
- Updated `app/services/email_workflow_service.py` with system-level helpdesk SAG routing logic.
- Updated `app/emails/backend/router.py` to include `customer_name` in email list responses.
- Updated `app/customers/backend/router.py` and `app/emails/frontend/emails.html` for `email_domain` support.

## 📋 Deployment Notes

- Run database migration 141 before processing new inbound emails for full header-based routing behavior.
- Existing workflow/rule behavior is preserved; new routing runs as a system workflow.
**RELEASE_NOTES_v2.2.39.md** (new file, 45 lines)

# Release Notes v2.2.39

Date: 3 March 2026

## New: Mission Control (MVP)
- New dedicated fullscreen dashboard for an operations overview on large screens.
- Realtime updates via WebSocket (`/api/v1/mission/ws`).
- KPI overview for cases:
  - Open cases
  - New cases
  - Cases without an assignee
  - Deadlines today
  - Overdue deadlines
- Active-call overlay with deduplication on `call_id`.
- Uptime alerts (DOWN/UP/DEGRADED) with visible active alarms.
- Live activity feed (latest 20 events).
- Sound system with mute + volume control in the dashboard.

## New endpoints
- `GET /api/v1/mission/state`
- `WS /api/v1/mission/ws`
- `POST /api/v1/mission/webhook/telefoni/ringing`
- `POST /api/v1/mission/webhook/telefoni/answered`
- `POST /api/v1/mission/webhook/telefoni/hangup`
- `POST /api/v1/mission/webhook/uptime`

## New files
- `migrations/142_mission_control.sql`
- `app/dashboard/backend/mission_router.py`
- `app/dashboard/backend/mission_service.py`
- `app/dashboard/backend/mission_ws.py`
- `app/dashboard/frontend/mission_control.html`

## Updated files
- `main.py`
- `app/core/config.py`
- `app/dashboard/backend/views.py`
- `VERSION`

## Operations/configuration
- New setting/env for webhook protection: `MISSION_WEBHOOK_TOKEN`.
- New settings seeds for Mission Control sound, KPI display, queue filter, and customer filter.

## Verification
- Python syntax check run on changed backend files with `py_compile`.
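The call-overlay deduplication on `call_id` amounts to keying active calls by id, so redelivered webhook events update rather than duplicate. A minimal sketch, assuming an in-memory store (the real `app/dashboard/backend/mission_service.py` implementation is not shown here):

```python
class ActiveCalls:
    """In-memory overlay state, deduplicated on call_id."""

    def __init__(self) -> None:
        self._calls: dict[str, dict] = {}

    def upsert(self, call_id: str, **fields) -> None:
        # Repeated events with the same call_id update the existing
        # entry instead of adding a duplicate overlay card.
        self._calls.setdefault(call_id, {"call_id": call_id}).update(fields)

    def remove(self, call_id: str) -> None:
        # hangup events drop the call from the overlay.
        self._calls.pop(call_id, None)

    def snapshot(self) -> list[dict]:
        return list(self._calls.values())
```

A `ringing` followed by an `answered` event for the same `call_id` therefore produces one overlay entry in its latest state.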
**RELEASE_NOTES_v2.2.40.md** (new file, 18 lines)

# Release Notes v2.2.40

Date: 3 March 2026

## Hotfix: Production build source override
- Fixed the Docker build flow in `Dockerfile` so that release code fetched via `RELEASE_VERSION` is no longer overwritten by the local checkout during the image build.
- This resolves scenarios where production runs the wrong code version (e.g. missing routes such as `/dashboard/mission-control`) even when the correct release tag is specified.

## Technical changes
- Local source code is now copied to a temporary directory (`/app_local`).
- For release builds (`RELEASE_VERSION != latest` and a token set), the downloaded release source in `/app` is preserved.
- For local/latest builds, `/app_local` is copied to `/app` as before.

## Verification
- The build output should show:
  - `Downloading release ... from Gitea...`
  - `Keeping downloaded release source in /app (no local override)`
- After deploy, `/dashboard/mission-control` should no longer return 404 on release v2.2.39+.
**RELEASE_NOTES_v2.2.41.md** (new file, 20 lines)

# Release Notes v2.2.41

Date: 3 March 2026

## Fix: Postgres healthcheck noise in logs
- Updated the healthcheck to use the correct database name (`POSTGRES_DB`) instead of the default database.
- Resolves repeated log lines such as `FATAL: database "bmc_hub" does not exist` on installations where the database has a different name (e.g. `hubdb_v2`).

## Changed files
- `docker-compose.prod.yml`
- `docker-compose.yml`
- `updateto.sh`
- `VERSION`

## Technical notes
- The healthcheck changed from:
  - `pg_isready -U <user>`
- To:
  - `pg_isready -U <user> -d <db>`
- `updateto.sh` now also uses `-d "$POSTGRES_DB"` in its wait loop for postgres.
**RELEASE_NOTES_v2.2.42.md** (new file, 18 lines)

# Release Notes v2.2.42

Date: 3 March 2026

## Fix: Yealink webhook compatibility + deploy robustness
- Added `GET` support on the Mission Control telephony webhooks, so Yealink callback URLs no longer return `405 Method Not Allowed`.
- The webhook endpoints now accept query parameters for `call_id`, `caller_number`, `queue_name`, and an optional `timestamp`.
- `updateto.sh` has been hardened with clear fail-fast behavior on port conflicts and failed container startup, so the script no longer reports success on partial failure.

## Changed files
- `app/dashboard/backend/mission_router.py`
- `updateto.sh`
- `VERSION`

## Affected endpoints
- `/api/v1/mission/webhook/telefoni/ringing` (`POST` + `GET`)
- `/api/v1/mission/webhook/telefoni/answered` (`POST` + `GET`)
- `/api/v1/mission/webhook/telefoni/hangup` (`POST` + `GET`)
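With `GET` support, the Yealink callback delivers the event data as query parameters rather than a JSON body. A hedged sketch of the parsing side, using only the parameter names listed above (the parsing details are illustrative, not the router's actual code):

```python
from urllib.parse import parse_qs, urlparse

def parse_call_event(callback_url: str) -> dict:
    """Extract a call event from a GET-style webhook URL."""
    qs = parse_qs(urlparse(callback_url).query)

    def first(key: str):
        # parse_qs returns lists; the webhook sends each key at most once.
        return qs.get(key, [None])[0]

    return {
        "call_id": first("call_id"),
        "caller_number": first("caller_number"),
        "queue_name": first("queue_name"),
        "timestamp": first("timestamp"),  # optional, may be absent
    }
```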
**RELEASE_NOTES_v2.2.43.md** (new file, 16 lines)

# Release Notes v2.2.43

Date: 3 March 2026

## Fix: Visible Mission webhook logs
- Added explicit logging for the Mission telephony webhooks (`ringing`, `answered`, `hangup`) with call id, number, queue, and HTTP method.
- Added warning logs for a missing/invalid Mission webhook token.
- Makes it easy to debug Yealink callbacks in `podman logs`.

## Changed files
- `app/dashboard/backend/mission_router.py`
- `VERSION`

## Operations
- Deploy with: `./updateto.sh v2.2.43`
- View webhook log events with: `podman logs -f bmc-hub-api-v2 | grep -E "Mission webhook|forbidden|token"`
**RELEASE_NOTES_v2.2.44.md** (new file, 17 lines)

# Release Notes v2.2.44

Date: 4 March 2026

## Fixes
- `updateto.sh` now automatically cleans up legacy containers (`bmc-hub-api-v2`, `bmc-hub-postgres-v2`) before deploy.
- Prevents port-lock conflicts, especially on the Postgres host port (`5433`), during compose startup.
- Mission Control: automatic timeout on stuck `ringing` calls, so they no longer remain in Incoming Calls.

## Changed files
- `updateto.sh`
- `app/dashboard/backend/mission_service.py`
- `VERSION`

## Operations
- Deploy: `./updateto.sh v2.2.44`
- Verify: `curl http://localhost:8001/health`
**RELEASE_NOTES_v2.2.45.md** (new file, 18 lines)

# Release Notes v2.2.45

Date: 4 March 2026

## Improvements
- Added a direct menu link to Mission Control in the Support dropdown, making the page quicker to find.
- Added Mission Control as an option under Default Dashboard in Settings.
- Updated the dashboard fallback logic, so `/dashboard/mission-control` is treated as a known default choice.

## Changed files
- `app/shared/frontend/base.html`
- `app/settings/frontend/settings.html`
- `VERSION`
- `RELEASE_NOTES_v2.2.45.md`

## Operations
- Deploy: `./updateto.sh v2.2.45`
- Verify: `curl http://localhost:8001/dashboard/mission-control`
**RELEASE_NOTES_v2.2.46.md** (new file, 19 lines)

# Release Notes v2.2.46

Date: 4 March 2026

## Fixes and operational hardening
- The Mission Control backend now tolerates missing mission tables without crashing requests, and logs clear warnings.
- Added an idempotent repair migration for the Mission Control schema (`143_mission_control_repair.sql`) for environments with partially created tables.
- Updated `.gitignore` with a release-note exception from earlier operations.

## Changed files
- `app/dashboard/backend/mission_service.py`
- `migrations/143_mission_control_repair.sql`
- `.gitignore`
- `VERSION`
- `RELEASE_NOTES_v2.2.46.md`

## Operations
- Deploy: `./updateto.sh v2.2.46`
- Migration (if needed): `docker compose exec db psql -U bmc_hub -d bmc_hub -f migrations/143_mission_control_repair.sql`
**RELEASE_NOTES_v2.2.47.md** (new file, 17 lines)

# Release Notes v2.2.47

Date: 4 March 2026

## Fixes
- The Mission webhook GET for ringing now accepts a token-only ping without `call_id` and returns `200 OK`.
- `updateto.sh` now defaults to port `8001` in the v2 directory (`/srv/podman/bmc_hub_v2`), with continued support for an `API_PORT` override in `.env`.

## Changed files
- `app/dashboard/backend/mission_router.py`
- `updateto.sh`
- `VERSION`
- `RELEASE_NOTES_v2.2.47.md`

## Operations
- Deploy: `./updateto.sh v2.2.47`
- Verify webhook ping: `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/ringing?token=<TOKEN>"`
**RELEASE_NOTES_v2.2.48.md** (new file, 21 lines)

# Release Notes v2.2.48

Date: 4 March 2026

## Fixes
- `sag` aggregation no longer fails if the `sag_salgsvarer` table is missing; the API still returns time data and an empty sales list instead of a `500`.
- Sales-list endpoints in `sag` now return an empty list with a warning in the log if `sag_salgsvarer` does not exist.
- The Mission webhooks for `answered` and `hangup` now also accept a token-only `GET` ping without `call_id` (same compatibility as `ringing`).

## Changed files
- `app/modules/sag/backend/router.py`
- `app/dashboard/backend/mission_router.py`
- `VERSION`
- `RELEASE_NOTES_v2.2.48.md`

## Operations
- Deploy: `./updateto.sh v2.2.48`
- Validate webhook pings:
  - `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/ringing?token=<TOKEN>"`
  - `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/answered?token=<TOKEN>"`
  - `curl -i "http://localhost:8001/api/v1/mission/webhook/telefoni/hangup?token=<TOKEN>"`
**RELEASE_NOTES_v2.2.49.md** (new file, 40 lines)

# Release Notes v2.2.49

Date: 5 March 2026

## New functionality

### Sag – Relations
- The relations panel is only shown when there actually are related cases. A single case (no relations) now shows the empty state "Ingen relaterede sager".
- The current case is clearly highlighted in the relation tree: accent-colored left border, faint background, filled badge with the case ID, and bold title. The link is not clickable (you are already there).

### Sag – Case type dropdown
- The case type in the top bar is now a clickable dropdown instead of a link to the edit page.
- The dropdown shows all 6 types (Ticket, Pipeline, Opgave, Ordre, Projekt, Service) with color icons and marks the active type.
- Selecting a type PATCHes the case directly and reloads the page.
- Fixed a bug where the dropdown opened behind the page (`overflow: hidden` removed from `.case-hero`).

### Sag – Relation quick actions (+)
- The menu now contains 12 modules: assign case, time registration, comment, reminder, task, sales pipeline, files, hardware, solution, purchases & sales, subscription, send email.
- All modules open a mini modal with the relevant fields directly from the relations panel; no page navigation needed.
- Sales pipeline is hidden from the menu if the case already has pipeline data (shown as a grey "Pipeline (se sagen)").
- Tags now use the global TagPicker system (`window.showTagPicker`).

### Email service
- New `app/services/email_service.py` for centralized email sending.

### Telephony
- Updates to the phone log and router.

## Changed files
- `app/modules/sag/templates/detail.html`
- `app/modules/sag/backend/router.py`
- `app/dashboard/backend/mission_router.py`
- `app/dashboard/backend/mission_service.py`
- `app/modules/telefoni/backend/router.py`
- `app/modules/telefoni/templates/log.html`
- `app/services/email_service.py`
- `main.py`

## Operations
- Deploy: `./updateto.sh v2.2.49`
**RELEASE_NOTES_v2.2.50.md** (new file, 18 lines)

# Release Notes v2.2.50

Date: 6 March 2026

## Fixes
- Sag: the "Ny email" compose has been restored in the Email tab on cases.
- Added a visible compose section with fields for To/Cc/Bcc/Subject/Message, plus attachment of case files.
- The `Ny email` button is now wired to sending via `/api/v1/sag/{sag_id}/emails/send`.
- The compose prefills the recipient (primary contact when possible) and the subject (`Sag #<id>:`).
- The attachment list is refreshed from case files, even when the file panel is not visible.

## Changed files
- `app/modules/sag/templates/detail.html`
- `VERSION`
- `RELEASE_NOTES_v2.2.50.md`

## Operations
- Deploy: `./updateto.sh v2.2.50`
**RELEASE_NOTES_v2.2.51.md** (new file, 21 lines)

# Release Notes v2.2.51

Date: 7 March 2026

## Fixes
- Settings: user administration in v2 now uses stable admin endpoints for status changes and password reset.
- Settings: improved error messages for user actions (status/password), so 4xx/5xx responses are clearly shown in the UI.
- Ticket Sync: added an Archived Sync monitor in Settings with buttons for Simply/vTiger import and a continuous status check.
- Ticket Sync: new endpoint `/api/v1/ticket/archived/status` returns parity (remote vs. local) and an overall `overall_synced` flag.
- Security: sync/import endpoints are restricted to admin/superadmin (`users.manage` or `system.admin`).

## Changed files
- `app/settings/frontend/settings.html`
- `app/ticket/backend/router.py`
- `app/system/backend/sync_router.py`
- `app/auth/backend/admin.py`
- `VERSION`
- `RELEASE_NOTES_v2.2.51.md`

## Operations
- Deploy: `./updateto.sh v2.2.51`
**RELEASE_NOTES_v2.2.52.md** (new file, 16 lines)

# Release Notes v2.2.52

Date: 7 March 2026

## Fixes
- Auth Admin: `GET /api/v1/admin/users` has been made extra robust against a partially migrated database schema.
- The endpoint now falls back to a simpler query if the joins/columns for groups or telephony are missing.
- Reduces the risk of the UI error "Kunne ikke indlæse brugere" on v2.

## Changed files
- `app/auth/backend/admin.py`
- `VERSION`
- `RELEASE_NOTES_v2.2.52.md`

## Operations
- Deploy: `./updateto.sh v2.2.52`
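The fallback pattern described above can be sketched with SQLite standing in for Postgres: try the full query with joins first, and degrade to a simpler one when the schema is only partially migrated. Table and column names here are illustrative, not the actual BMC Hub schema.

```python
import sqlite3

FULL_QUERY = """
    SELECT u.id, u.username, g.name AS group_name
    FROM users u LEFT JOIN user_groups g ON g.id = u.group_id
"""
SIMPLE_QUERY = "SELECT id, username, NULL AS group_name FROM users"

def list_users(conn: sqlite3.Connection) -> list[tuple]:
    """List users, tolerating a schema where the group join is missing."""
    try:
        return conn.execute(FULL_QUERY).fetchall()
    except sqlite3.OperationalError:
        # Missing join table/column: degrade gracefully instead of a 500.
        return conn.execute(SIMPLE_QUERY).fetchall()
```

In a database where `user_groups` was never created, the full query raises and the endpoint still returns a usable user list.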
**RELEASE_NOTES_v2.2.53.md** (new file, 42 lines)

# Release Notes - v2.2.53

Date: 17 March 2026

## Focus

Email-to-SAG flow with manual approval as the default, a clear UI action, and better traceability.

## Added

- Manual approval gate in the email pipeline (`awaiting_user_action` state), so emails are parked for user action before automatic routing.
- New feature flag in config: `EMAIL_REQUIRE_MANUAL_APPROVAL` (default `true`).
- New email API endpoints:
  - `GET /api/v1/emails/sag-options`
  - `GET /api/v1/emails/search-customers`
  - `GET /api/v1/emails/search-sager`
  - `POST /api/v1/emails/{email_id}/create-sag`
  - `POST /api/v1/emails/{email_id}/link-sag`
- Email stats extended with `awaiting_user_action` in the summary/processing stats.
- Email frontend upgraded with a suggestion panel and quick buttons:
  - Confirm suggestion
  - Correct type
  - Create new case
  - Link existing case
  - Mark as spam
- Creating a SAG from an email now supports:
  - type
  - secondary label
  - responsible user
  - group
  - start date
  - priority
- New migration: `145_sag_start_date.sql` (`start_date` on `sag_sager`).

## Operational notes

- Run migration `145_sag_start_date.sql` before using the start-date field in the email->sag flow.
- Manual approval is enabled by default; auto-creation is therefore disabled in phase 1.

## Backup

- A fallback zip of the current email feature has been created in `backups/email_feature/`.
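The approval gate boils down to a small state transition: with the flag enabled, inbound emails stop in `awaiting_user_action` instead of being routed automatically. The state names other than `awaiting_user_action` and the function itself are illustrative, not the pipeline's actual API.

```python
EMAIL_REQUIRE_MANUAL_APPROVAL = True  # config default per the release notes

def next_state(current: str,
               manual_approval: bool = EMAIL_REQUIRE_MANUAL_APPROVAL) -> str:
    """Advance an email through the (sketched) processing pipeline."""
    if current == "received":
        # Phase 1: park for a user decision before any automatic routing.
        return "awaiting_user_action" if manual_approval else "auto_routed"
    if current == "awaiting_user_action":
        # Resolved by one of the UI quick actions (create-sag, link-sag, spam).
        return "resolved"
    return current
```

Flipping `EMAIL_REQUIRE_MANUAL_APPROVAL` to `false` would restore automatic routing, which is what "auto-creation is disabled in phase 1" refers to.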
**RELEASE_NOTES_v2.2.54.md** (new file, 28 lines)

# Release Notes - v2.2.54

Date: 17 March 2026

## Focus

Improvements to the email-to-SAG workflow with a deadline field and markedly better company/customer search in the UI.

## Added

- Deadline is now supported in email->sag creation.
- The backend request model has been extended with `deadline`.
- `create-sag` now stores the deadline on `sag_sager`.
- The frontend suggestion panel has a dedicated deadline field.
- Customer selection in the email panel has been upgraded to a "super company search":
  - Live dropdown results instead of a simple datalist.
  - Better ranking of results (exact/prefix/relevance).
  - Quick selection by click, including display of CVR/domain/email metadata.

## Updated files

- `app/emails/backend/router.py`
- `app/emails/frontend/emails.html`

## Notes

- No breaking API changes.
- No extra migration required for this release.
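The exact/prefix/relevance ranking can be sketched as a tiered sort key. This is a hedged illustration of the idea; the real ranking in `app/emails/backend/router.py` is not reproduced here.

```python
def rank_customers(query: str, names: list[str]) -> list[str]:
    """Order matching customer names: exact, then prefix, then substring."""
    q = query.casefold()

    def score(name: str) -> tuple:
        n = name.casefold()
        if n == q:
            tier = 0          # exact match first
        elif n.startswith(q):
            tier = 1          # then prefix matches
        elif q in n:
            tier = 2          # then substring relevance
        else:
            tier = 3
        # Shorter names rank higher within a tier; name breaks remaining ties.
        return (tier, len(n), n)

    matches = [n for n in names if q in n.casefold() or not q]
    return sorted(matches, key=score)
```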
**RELEASE_NOTES_v2.2.56.md** (new file, 22 lines)

# Release Notes v2.2.56

Date: 2026-03-18

## Focus
Stabilization of the email view and hardening of the supplier-invoices flows.

## Changes
- Fixed a layout overflow in the email detail view, so long subjects, sender addresses, HTML content, and file names no longer push the columns out of the layout.
- Added robust wrapping/truncation in the emails UI for better responsive behavior.
- Added the missing "Klar til Bogføring" tab in the supplier-invoices navigation.
- Fixed an endpoint mismatch for the AI template analysis in the supplier-invoices frontend.
- Removed JS function conflicts in supplier-invoices by separating the single/bulk send flows.
- Added a backend endpoint to mark supplier invoices as paid.
- Removed a route conflict for send-to-economic by moving the legacy placeholder to a separate path.
- Improved the approve flow by using a dynamic user lookup instead of a hardcoded value.

## Affected files
- app/emails/frontend/emails.html
- app/billing/frontend/supplier_invoices.html
- app/billing/backend/supplier_invoices.py
- RELEASE_NOTES_v2.2.56.md
**RELEASE_NOTES_v2.2.57.md** (new file, 18 lines)

# Release Notes v2.2.57

Date: 2026-03-18

## Focus
Stabilization of the UI in the Email and SAG modules.

## Changes
- Email view: further hardening of HTML tables in the mail body, including normalization of inline styles to avoid layout breakage.
- Email view: improved overflow handling for wide content (tables, cells, and media).
- SAG detail page: improved tab loading, so data is fetched on tab switch for Varekøb & Salg, Abonnement, and Påmindelser.
- SAG detail page: robust fallback for the reminder user id via `/api/v1/auth/me`.
- SAG detail page: fixed API calls for reminders and the calendar to use a stable case-id reference.

## Affected files
- app/emails/frontend/emails.html
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.57.md
**RELEASE_NOTES_v2.2.58.md** (new file, 15 lines)

# Release Notes v2.2.58

Date: 2026-03-18

## Focus
Improved UX on the SAG detail page, so tab content is shown at the top on tab switch.

## Changes
- SAG tabs: the active tab pane is moved to the top of the tab content on tab switch.
- SAG tabs: automatic scroll to the tab bar after a tab switch.
- SAG tabs: the same top positioning and scroll on `?tab=` deep-link activation.

## Affected files
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.58.md
**RELEASE_NOTES_v2.2.59.md** (new file, 16 lines)

# Release Notes v2.2.59

Date: 2026-03-18

## Focus
Stable scroll/navigation in the SAG tabs, so the user lands at the actual content of the selected tab.

## Changes
- Removed the DOM reordering of tab-pane elements on the SAG detail page.
- New scroll logic: on tab switch, scroll to the first meaningful content element in the active tab.
- The scroll offset accounts for the navbar height.
- Deep links (`?tab=...`) now use the same robust scroll behavior.

## Affected files
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.59.md
**RELEASE_NOTES_v2.2.60.md** (new file, 17 lines)

# Release Notes v2.2.60

Date: 2026-03-18

## Focus
Correct top placement of the active tab on the SAG detail page.

## Changes
- Enforced correct tab-pane visibility in `#caseTabsContent`:
  - inactive tabs are hidden (`display: none`)
  - only the active tab is shown (`display: block`)
- Removed the earlier scroll/DOM workaround for tab display.
- Result: the active tab is shown at the top below the tab bar, without an empty top section.

## Affected files
- app/modules/sag/templates/detail.html
- RELEASE_NOTES_v2.2.60.md
15  RELEASE_NOTES_v2.2.61.md  Normal file

@@ -0,0 +1,15 @@
# Release Notes v2.2.61

Date: 18 March 2026

## Fixes

- Fixed SAG tab display on the case detail page, so only the active tab is shown at the top.
- Added a direct click fallback on tab buttons (`onclick`) for robust activation, even if Bootstrap tab events fail.
- Set explicit initial visibility on tab panes to avoid the "long page" effect with content far down.
- Removed two broken CSS blocks at the top of the template that could cause unstable styling/parsing.

## Affected files

- `app/modules/sag/templates/detail.html`
- `RELEASE_NOTES_v2.2.61.md`
14  RELEASE_NOTES_v2.2.62.md  Normal file

@@ -0,0 +1,14 @@
# Release Notes v2.2.62

Date: 18 March 2026

## Fixes

- Fixed grid/nesting in the SAG detail view, so the right column sits in the same row as the left/center content.
- `Hardware`, `Salgspipeline`, `Opkaldshistorik`, and `Todo-opgaver` are now shown in the right column as expected.
- Removed a premature closing `</div>` in the detail layout, which previously caused the right module column to drop below the left content.

## Affected files

- `app/modules/sag/templates/detail.html`
- `RELEASE_NOTES_v2.2.62.md`
14  RELEASE_NOTES_v2.2.63.md  Normal file

@@ -0,0 +1,14 @@
# Release Notes v2.2.63

Date: 18 March 2026

## Fixes

- Fixed the QuickCreate AI analysis request in the frontend.
- `POST /api/v1/sag/analyze-quick-create` now receives a correct payload with both `text` and `user_id` in the body.
- Improved frontend error logging on AI failures (incl. HTTP status), so errors are no longer hidden behind a generic "Analysis failed".

## Affected files

- `app/shared/frontend/quick_create_modal.html`
- `RELEASE_NOTES_v2.2.63.md`
18  RELEASE_NOTES_v2.2.64.md  Normal file

@@ -0,0 +1,18 @@
# Release Notes v2.2.64

Date: 18 March 2026

## Fixes

- Improved QuickCreate robustness when the AI/LLM is unavailable.
- Added a local heuristic fallback in `CaseAnalysisService`, so the user still gets:
  - a suggested title
  - a suggested priority
  - simple tags
  - a customer-match attempt
- Removed the dependency on Ollama always responding, so QuickCreate no longer ends in an empty AI-unavailable flow on temporary AI errors.

## Affected files

- `app/services/case_analysis_service.py`
- `RELEASE_NOTES_v2.2.64.md`
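The heuristic fallback described in these notes can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the actual `CaseAnalysisService` code: the function name, keyword lists, and returned fields are all hypothetical.

```python
def heuristic_case_analysis(text: str) -> dict:
    """Rough local fallback when the LLM is unavailable (illustrative only)."""
    lowered = text.lower()
    # Priority: look for urgency keywords (assumed keyword list)
    if any(w in lowered for w in ("haster", "akut", "nede", "critical")):
        priority = "high"
    else:
        priority = "normal"
    # Title: first line of the request, truncated to a sensible length
    title = text.strip().splitlines()[0][:80] if text.strip() else "Ny sag"
    # Tags: simple keyword spotting (assumed tag vocabulary)
    tags = [t for t in ("printer", "netvaerk", "email", "server") if t in lowered]
    return {"title": title, "priority": priority, "tags": tags}
```

The point of such a fallback is that it never raises: whatever the LLM outage looks like, QuickCreate still gets a usable suggestion.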
@@ -2,6 +2,7 @@
 Auth Admin API - Users, Groups, Permissions management
 """
 from fastapi import APIRouter, HTTPException, status, Depends
+from pydantic import BaseModel, Field
 from app.core.auth_dependencies import require_permission
 from app.core.auth_service import AuthService
 from app.core.database import execute_query, execute_query_single, execute_insert, execute_update
@@ -13,23 +14,94 @@ logger = logging.getLogger(__name__)
 router = APIRouter()
 
 
+class UserStatusUpdateRequest(BaseModel):
+    is_active: bool
+
+
+class UserPasswordResetRequest(BaseModel):
+    new_password: str = Field(..., min_length=8, max_length=128)
+
+
+def _users_column_exists(column_name: str) -> bool:
+    result = execute_query_single(
+        """
+        SELECT 1
+        FROM information_schema.columns
+        WHERE table_schema = 'public'
+          AND table_name = 'users'
+          AND column_name = %s
+        LIMIT 1
+        """,
+        (column_name,)
+    )
+    return bool(result)
+
+
+def _table_exists(table_name: str) -> bool:
+    result = execute_query_single(
+        """
+        SELECT 1
+        FROM information_schema.tables
+        WHERE table_schema = 'public'
+          AND table_name = %s
+        LIMIT 1
+        """,
+        (table_name,)
+    )
+    return bool(result)
+
+
 @router.get("/admin/users", dependencies=[Depends(require_permission("users.manage"))])
 async def list_users():
-    users = execute_query(
-        """
-        SELECT u.user_id, u.username, u.email, u.full_name,
-               u.is_active, u.is_superadmin, u.is_2fa_enabled,
-               u.telefoni_extension, u.telefoni_aktiv, u.telefoni_phone_ip, u.telefoni_phone_username,
-               u.created_at, u.last_login_at,
-               COALESCE(array_remove(array_agg(g.name), NULL), ARRAY[]::varchar[]) AS groups
-        FROM users u
-        LEFT JOIN user_groups ug ON u.user_id = ug.user_id
-        LEFT JOIN groups g ON ug.group_id = g.id
-        GROUP BY u.user_id
-        ORDER BY u.user_id
-        """
-    )
-    return users
+    is_2fa_expr = "u.is_2fa_enabled" if _users_column_exists("is_2fa_enabled") else "FALSE AS is_2fa_enabled"
+    telefoni_extension_expr = "u.telefoni_extension" if _users_column_exists("telefoni_extension") else "NULL::varchar AS telefoni_extension"
+    telefoni_active_expr = "u.telefoni_aktiv" if _users_column_exists("telefoni_aktiv") else "FALSE AS telefoni_aktiv"
+    telefoni_ip_expr = "u.telefoni_phone_ip" if _users_column_exists("telefoni_phone_ip") else "NULL::varchar AS telefoni_phone_ip"
+    telefoni_username_expr = "u.telefoni_phone_username" if _users_column_exists("telefoni_phone_username") else "NULL::varchar AS telefoni_phone_username"
+    last_login_expr = "u.last_login_at" if _users_column_exists("last_login_at") else "NULL::timestamp AS last_login_at"
+    has_user_groups = _table_exists("user_groups")
+    has_groups = _table_exists("groups")
+
+    if has_user_groups and has_groups:
+        groups_join = "LEFT JOIN user_groups ug ON u.user_id = ug.user_id LEFT JOIN groups g ON ug.group_id = g.id"
+        groups_select = "COALESCE(array_remove(array_agg(g.name), NULL), ARRAY[]::varchar[]) AS groups"
+    else:
+        groups_join = ""
+        groups_select = "ARRAY[]::varchar[] AS groups"
+
+    try:
+        users = execute_query(
+            f"""
+            SELECT u.user_id, u.username, u.email, u.full_name,
+                   u.is_active, u.is_superadmin, {is_2fa_expr},
+                   {telefoni_extension_expr}, {telefoni_active_expr}, {telefoni_ip_expr}, {telefoni_username_expr},
+                   u.created_at, {last_login_expr},
+                   {groups_select}
+            FROM users u
+            {groups_join}
+            GROUP BY u.user_id
+            ORDER BY u.user_id
+            """
+        )
+        return users
+    except Exception as exc:
+        logger.warning("⚠️ Admin user query fallback triggered: %s", exc)
+        try:
+            users = execute_query(
+                f"""
+                SELECT u.user_id, u.username, u.email, u.full_name,
+                       u.is_active, u.is_superadmin, {is_2fa_expr},
+                       {telefoni_extension_expr}, {telefoni_active_expr}, {telefoni_ip_expr}, {telefoni_username_expr},
+                       u.created_at, {last_login_expr},
+                       ARRAY[]::varchar[] AS groups
+                FROM users u
+                ORDER BY u.user_id
+                """
+            )
+            return users
+        except Exception as fallback_exc:
+            logger.error("❌ Failed to load admin users (fallback): %s", fallback_exc)
+            raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Could not load users") from fallback_exc
+
+
 @router.post("/admin/users", status_code=status.HTTP_201_CREATED, dependencies=[Depends(require_permission("users.manage"))])
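The hunk above builds its SELECT list dynamically so the endpoint degrades gracefully on older database schemas. The pattern can be shown in isolation; `build_user_select` and its parameters are hypothetical names for this sketch, whereas the real code probes `information_schema` at request time.

```python
def build_user_select(available_columns, have_group_tables):
    """Compose a users SELECT whose expressions degrade when schema parts are missing."""
    def col(name, fallback):
        # Use the real column when present, otherwise a typed constant alias
        return f"u.{name}" if name in available_columns else fallback

    is_2fa = col("is_2fa_enabled", "FALSE AS is_2fa_enabled")
    last_login = col("last_login_at", "NULL::timestamp AS last_login_at")
    if have_group_tables:
        groups_select = "COALESCE(array_remove(array_agg(g.name), NULL), ARRAY[]::varchar[]) AS groups"
        join = ("LEFT JOIN user_groups ug ON u.user_id = ug.user_id "
                "LEFT JOIN groups g ON ug.group_id = g.id")
    else:
        groups_select = "ARRAY[]::varchar[] AS groups"
        join = ""
    return (f"SELECT u.user_id, u.username, {is_2fa}, {last_login}, {groups_select} "
            f"FROM users u {join} GROUP BY u.user_id ORDER BY u.user_id")
```

Only trusted, hard-coded expressions are interpolated into the f-string, so the dynamic SQL stays injection-safe.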
@@ -94,6 +166,48 @@ async def update_user_groups(user_id: int, payload: UserGroupsUpdate):
     return {"message": "Groups updated"}
 
 
+@router.patch("/admin/users/{user_id}", dependencies=[Depends(require_permission("users.manage"))])
+async def update_user_status(user_id: int, payload: UserStatusUpdateRequest):
+    user = execute_query_single(
+        "SELECT user_id, username FROM users WHERE user_id = %s",
+        (user_id,)
+    )
+    if not user:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
+
+    execute_update(
+        "UPDATE users SET is_active = %s, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
+        (payload.is_active, user_id)
+    )
+
+    logger.info("✅ Updated user status via admin: %s -> active=%s", user.get("username"), payload.is_active)
+    return {"message": "User status updated", "user_id": user_id, "is_active": payload.is_active}
+
+
+@router.post("/admin/users/{user_id}/reset-password", dependencies=[Depends(require_permission("users.manage"))])
+async def admin_reset_user_password(user_id: int, payload: UserPasswordResetRequest):
+    user = execute_query_single(
+        "SELECT user_id, username FROM users WHERE user_id = %s",
+        (user_id,)
+    )
+    if not user:
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="User not found")
+
+    try:
+        password_hash = AuthService.hash_password(payload.new_password)
+    except Exception as exc:
+        logger.error("❌ Password hash failed for user_id=%s: %s", user_id, exc)
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Kunne ikke hashe adgangskoden") from exc
+
+    execute_update(
+        "UPDATE users SET password_hash = %s, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
+        (password_hash, user_id)
+    )
+
+    logger.info("✅ Password reset via admin for user: %s", user.get("username"))
+    return {"message": "Password reset", "user_id": user_id}
+
+
 @router.post("/admin/users/{user_id}/2fa/reset")
 async def reset_user_2fa(
     user_id: int,
@@ -74,6 +74,8 @@ async def login(request: Request, credentials: LoginRequest, response: Response)
 
     requires_2fa_setup = (
         not user.get("is_shadow_admin", False)
+        and not settings.AUTH_DISABLE_2FA
+        and AuthService.is_2fa_supported()
         and not user.get("is_2fa_enabled", False)
     )
@@ -139,10 +141,18 @@ async def setup_2fa(current_user: dict = Depends(get_current_user)):
             detail="Shadow admin cannot configure 2FA",
         )
 
-    result = AuthService.setup_user_2fa(
-        user_id=current_user["id"],
-        username=current_user["username"]
-    )
+    try:
+        result = AuthService.setup_user_2fa(
+            user_id=current_user["id"],
+            username=current_user["username"]
+        )
+    except RuntimeError as exc:
+        if "2FA columns missing" in str(exc):
+            raise HTTPException(
+                status_code=status.HTTP_400_BAD_REQUEST,
+                detail="2FA er ikke tilgaengelig i denne database (mangler kolonner).",
+            )
+        raise
+
     return result
@@ -25,8 +25,26 @@ class BackupService:
     """Service for managing backup operations"""
 
     def __init__(self):
-        self.backup_dir = Path(settings.BACKUP_STORAGE_PATH)
-        self.backup_dir.mkdir(parents=True, exist_ok=True)
+        configured_backup_dir = Path(settings.BACKUP_STORAGE_PATH)
+        self.backup_dir = configured_backup_dir
+        try:
+            self.backup_dir.mkdir(parents=True, exist_ok=True)
+        except OSError as exc:
+            # Local development can run outside Docker where /app is not writable.
+            # Fall back to the workspace data path so app startup does not fail.
+            if str(configured_backup_dir).startswith('/app/'):
+                project_root = Path(__file__).resolve().parents[3]
+                fallback_dir = project_root / 'data' / 'backups'
+                logger.warning(
+                    "⚠️ Backup path %s not writable (%s). Using fallback %s",
+                    configured_backup_dir,
+                    exc,
+                    fallback_dir,
+                )
+                fallback_dir.mkdir(parents=True, exist_ok=True)
+                self.backup_dir = fallback_dir
+            else:
+                raise
+
         # Subdirectories for different backup types
         self.db_dir = self.backup_dir / "database"
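The writable-directory fallback above is a general pattern: try the configured location, and only when `mkdir` raises fall back to a known-writable path. A standalone sketch, with a hypothetical `resolve_backup_dir` that falls back to the system temp directory rather than the project data path used in the real service:

```python
from pathlib import Path
import tempfile

def resolve_backup_dir(configured: str) -> Path:
    """Try the configured path first; fall back to a writable temp location."""
    primary = Path(configured)
    try:
        primary.mkdir(parents=True, exist_ok=True)
        return primary
    except OSError:
        # Configured path not creatable (read-only mount, missing parent perms, ...)
        fallback = Path(tempfile.gettempdir()) / "bmc_hub_backups"
        fallback.mkdir(parents=True, exist_ok=True)
        return fallback
```

Catching `OSError` (rather than only `PermissionError`) covers both permission failures and non-creatable parents.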
@@ -3,7 +3,7 @@ Supplier Invoices Router - Leverandørfakturaer (Kassekladde)
 Backend API for managing supplier invoices that integrate with e-conomic
 """
 
-from fastapi import APIRouter, HTTPException, UploadFile, File
+from fastapi import APIRouter, HTTPException, UploadFile, File, BackgroundTasks
 from pydantic import BaseModel
 from typing import List, Dict, Optional
 from datetime import datetime, date, timedelta
@@ -339,10 +339,22 @@ async def get_files_by_status(status: Optional[str] = None, limit: int = 100):
         SELECT f.file_id, f.filename, f.file_path, f.file_size, f.mime_type,
                f.status, f.uploaded_at, f.processed_at, f.detected_cvr,
                f.detected_vendor_id, v.name as detected_vendor_name,
-               e.total_amount as detected_amount
+               ext.vendor_name,
+               ext.vendor_cvr,
+               ext.vendor_matched_id,
+               COALESCE(v_ext.name, ext.vendor_name, v.name) as best_vendor_name,
+               ext.total_amount,
+               ext.confidence as vendor_match_confidence
         FROM incoming_files f
         LEFT JOIN vendors v ON f.detected_vendor_id = v.id
-        LEFT JOIN extractions e ON f.file_id = e.file_id
+        LEFT JOIN LATERAL (
+            SELECT vendor_name, vendor_cvr, vendor_matched_id, total_amount, confidence
+            FROM extractions
+            WHERE file_id = f.file_id
+            ORDER BY created_at DESC
+            LIMIT 1
+        ) ext ON true
+        LEFT JOIN vendors v_ext ON v_ext.id = ext.vendor_matched_id
         WHERE f.status IN ({placeholders})
         ORDER BY f.uploaded_at DESC
         LIMIT %s
@@ -353,10 +365,22 @@ async def get_files_by_status(status: Optional[str] = None, limit: int = 100):
         SELECT f.file_id, f.filename, f.file_path, f.file_size, f.mime_type,
                f.status, f.uploaded_at, f.processed_at, f.detected_cvr,
                f.detected_vendor_id, v.name as detected_vendor_name,
-               e.total_amount as detected_amount
+               ext.vendor_name,
+               ext.vendor_cvr,
+               ext.vendor_matched_id,
+               COALESCE(v_ext.name, ext.vendor_name, v.name) as best_vendor_name,
+               ext.total_amount,
+               ext.confidence as vendor_match_confidence
         FROM incoming_files f
         LEFT JOIN vendors v ON f.detected_vendor_id = v.id
-        LEFT JOIN extractions e ON f.file_id = e.file_id
+        LEFT JOIN LATERAL (
+            SELECT vendor_name, vendor_cvr, vendor_matched_id, total_amount, confidence
+            FROM extractions
+            WHERE file_id = f.file_id
+            ORDER BY created_at DESC
+            LIMIT 1
+        ) ext ON true
+        LEFT JOIN vendors v_ext ON v_ext.id = ext.vendor_matched_id
         ORDER BY f.uploaded_at DESC
         LIMIT %s
         """
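The `LEFT JOIN LATERAL ... ORDER BY created_at DESC LIMIT 1` above picks the newest extraction per file, and `COALESCE(v_ext.name, ext.vendor_name, v.name)` ranks the vendor-name sources. The same logic in plain Python, as a reference for what the SQL computes (function and parameter names are this sketch's, not the codebase's):

```python
def best_vendor_name(extractions, detected_vendor_name, vendor_lookup):
    """Emulate the LATERAL join: latest extraction wins, matched vendor name preferred."""
    # LATERAL subquery: newest extraction row for this file, if any
    latest = max(extractions, key=lambda e: e["created_at"], default=None)
    if latest is None:
        return detected_vendor_name
    # v_ext.name: resolve the matched vendor id against the vendors table
    matched = vendor_lookup.get(latest.get("vendor_matched_id"))
    # COALESCE(v_ext.name, ext.vendor_name, v.name)
    return matched or latest.get("vendor_name") or detected_vendor_name
```

A usage note: because the join is `LEFT ... ON true`, files with no extractions still appear in the result, which the `default=None` branch mirrors here.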
@@ -503,6 +527,28 @@ async def get_file_extracted_data(file_id: int):
 
             due_date_value = llm_json_data.get('due_date')
 
+            # Vendor name: AI uses 'vendor_name', invoice2data uses 'issuer'
+            vendor_name_val = (
+                llm_json_data.get('vendor_name') or
+                llm_json_data.get('issuer') or
+                (extraction.get('vendor_name') if extraction else None)
+            )
+            # Vendor CVR: AI uses 'vendor_cvr', invoice2data uses 'vendor_vat'
+            vendor_cvr_val = (
+                llm_json_data.get('vendor_cvr') or
+                llm_json_data.get('vendor_vat') or
+                (extraction.get('vendor_cvr') if extraction else None)
+            )
+            # Vendor address: AI uses 'vendor_address', invoice2data may have separate fields
+            vendor_address_val = (
+                llm_json_data.get('vendor_address') or
+                llm_json_data.get('supplier_address') or
+                llm_json_data.get('vendor_street')
+            )
+            vendor_city_val = llm_json_data.get('vendor_city') or llm_json_data.get('city')
+            vendor_postal_val = llm_json_data.get('vendor_postal_code') or llm_json_data.get('postal_code')
+            vendor_email_val = llm_json_data.get('vendor_email') or llm_json_data.get('supplier_email')
+
             # Use invoice_number from LLM JSON (works for both AI and template extraction)
             llm_data = {
                 "invoice_number": llm_json_data.get('invoice_number'),
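The repeated `a or b or c` chains above exist because the AI extractor and invoice2data spell the same field differently. The pattern generalizes to a small helper; `pick_field` is a hypothetical name used only for this sketch:

```python
def pick_field(data: dict, *keys, default=None):
    """Return the first truthy value among several possible key spellings.

    Mirrors chains like `llm_json_data.get('vendor_name') or llm_json_data.get('issuer')`:
    different extraction backends name the same invoice field differently.
    """
    for key in keys:
        value = data.get(key)
        if value:  # skips None and empty strings, like `or` does
            return value
    return default
```

One caveat the `or` chains share: a legitimate falsy value (e.g. an amount of `0`) would be skipped, so this helper suits name/ID fields, not numeric ones.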
@@ -511,6 +557,12 @@ async def get_file_extracted_data(file_id: int):
                 "total_amount": float(total_amount_value) if total_amount_value else None,
                 "currency": llm_json_data.get('currency') or 'DKK',
                 "document_type": llm_json_data.get('document_type'),
+                "vendor_name": vendor_name_val,
+                "vendor_cvr": vendor_cvr_val,
+                "vendor_address": vendor_address_val,
+                "vendor_city": vendor_city_val,
+                "vendor_postal_code": vendor_postal_val,
+                "vendor_email": vendor_email_val,
                 "lines": formatted_lines
             }
         elif extraction:
@@ -522,6 +574,12 @@ async def get_file_extracted_data(file_id: int):
                 "total_amount": float(extraction.get('total_amount')) if extraction.get('total_amount') else None,
                 "currency": extraction.get('currency') or 'DKK',
                 "document_type": extraction.get('document_type'),
+                "vendor_name": extraction.get('vendor_name'),
+                "vendor_cvr": extraction.get('vendor_cvr'),
+                "vendor_address": None,
+                "vendor_city": None,
+                "vendor_postal_code": None,
+                "vendor_email": None,
                 "lines": formatted_lines
             }
@@ -699,17 +757,36 @@ async def link_vendor_to_extraction(file_id: int, data: dict):
             (file_id,))
 
         if not extraction:
-            raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
-
-        # Update extraction with vendor match
+            # No extraction exists (e.g. custom template match or not yet processed)
+            # Create a minimal placeholder extraction so vendor can be linked
+            logger.info(f"⚠️ No extraction for file {file_id} — creating minimal extraction for vendor link")
+            extraction_id = execute_insert(
+                """INSERT INTO extractions
+                   (file_id, vendor_matched_id, vendor_name, vendor_cvr,
+                    document_id, document_type, document_type_detected,
+                    currency, confidence, status)
+                   VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+                   RETURNING extraction_id""",
+                (file_id, vendor_id,
+                 vendor['name'], None,
+                 None, 'invoice', 'invoice',
+                 'DKK', 1.0, 'manual')
+            )
+        else:
+            extraction_id = extraction['extraction_id']
+            # Update extraction with vendor match
+            execute_update(
+                "UPDATE extractions SET vendor_matched_id = %s WHERE extraction_id = %s",
+                (vendor_id, extraction_id)
+            )
+
+        # Also update incoming_files so table shows vendor immediately
         execute_update(
-            """UPDATE extractions
-               SET vendor_matched_id = %s
-               WHERE extraction_id = %s""",
-            (vendor_id, extraction['extraction_id'])
+            "UPDATE incoming_files SET detected_vendor_id = %s, status = 'processed' WHERE file_id = %s",
+            (vendor_id, file_id)
         )
 
-        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to extraction for file {file_id}")
+        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to file {file_id}")
 
         return {
             "status": "success",
@@ -823,21 +900,37 @@ async def link_vendor_to_extraction(file_id: int, data: dict):
             (file_id,))
 
         if not extraction:
-            raise HTTPException(status_code=404, detail="Ingen extraction fundet for denne fil")
-
-        # Update extraction with vendor match
+            # Create minimal extraction if none exists
+            logger.info(f"⚠️ No extraction for file {file_id} — creating minimal extraction for vendor link")
+            extraction_id = execute_insert(
+                """INSERT INTO extractions
+                   (file_id, vendor_matched_id, vendor_name, vendor_cvr,
+                    document_id, document_type, document_type_detected,
+                    currency, confidence, status)
+                   VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+                   RETURNING extraction_id""",
+                (file_id, vendor_id, vendor['name'], None,
+                 None, 'invoice', 'invoice', 'DKK', 1.0, 'manual')
+            )
+        else:
+            extraction_id = extraction['extraction_id']
+            execute_update(
+                "UPDATE extractions SET vendor_matched_id = %s WHERE extraction_id = %s",
+                (vendor_id, extraction_id)
+            )
+
         execute_update(
-            "UPDATE extractions SET vendor_matched_id = %s WHERE extraction_id = %s",
-            (vendor_id, extraction['extraction_id'])
+            "UPDATE incoming_files SET detected_vendor_id = %s, status = 'processed' WHERE file_id = %s",
+            (vendor_id, file_id)
         )
 
-        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to extraction {extraction['extraction_id']}")
+        logger.info(f"✅ Linked vendor {vendor['name']} (ID: {vendor_id}) to extraction {extraction_id}")
 
         return {
             "status": "success",
             "vendor_id": vendor_id,
             "vendor_name": vendor['name'],
-            "extraction_id": extraction['extraction_id']
+            "extraction_id": extraction_id
         }
 
     except HTTPException:
@@ -1610,6 +1703,10 @@ async def delete_supplier_invoice(invoice_id: int):
 class ApproveRequest(BaseModel):
     approved_by: str
 
 
+class MarkPaidRequest(BaseModel):
+    paid_date: Optional[date] = None
+
+
 @router.post("/supplier-invoices/{invoice_id}/approve")
 async def approve_supplier_invoice(invoice_id: int, request: ApproveRequest):
     """Approve supplier invoice for payment"""
@@ -1642,6 +1739,58 @@ async def approve_supplier_invoice(invoice_id: int, request: ApproveRequest):
         raise HTTPException(status_code=500, detail=str(e))
 
 
+@router.post("/supplier-invoices/{invoice_id}/mark-paid")
+async def mark_supplier_invoice_paid(invoice_id: int, request: MarkPaidRequest):
+    """Mark supplier invoice as paid."""
+    try:
+        invoice = execute_query_single(
+            "SELECT id, invoice_number, status FROM supplier_invoices WHERE id = %s",
+            (invoice_id,)
+        )
+
+        if not invoice:
+            raise HTTPException(status_code=404, detail=f"Faktura {invoice_id} ikke fundet")
+
+        if invoice['status'] == 'paid':
+            return {"success": True, "invoice_id": invoice_id, "status": "paid"}
+
+        if invoice['status'] not in ('approved', 'sent_to_economic'):
+            raise HTTPException(
+                status_code=400,
+                detail=(
+                    f"Faktura har status '{invoice['status']}' - "
+                    "kun 'approved' eller 'sent_to_economic' kan markeres som betalt"
+                )
+            )
+
+        execute_update(
+            """UPDATE supplier_invoices
+               SET status = 'paid', updated_at = CURRENT_TIMESTAMP
+               WHERE id = %s""",
+            (invoice_id,)
+        )
+
+        logger.info(
+            "✅ Marked supplier invoice %s (ID: %s) as paid (date: %s)",
+            invoice['invoice_number'],
+            invoice_id,
+            request.paid_date,
+        )
+
+        return {
+            "success": True,
+            "invoice_id": invoice_id,
+            "status": "paid",
+            "paid_date": request.paid_date,
+        }
+
+    except HTTPException:
+        raise
+    except Exception as e:
+        logger.error(f"❌ Failed to mark invoice {invoice_id} as paid: {e}")
+        raise HTTPException(status_code=500, detail=str(e))
+
+
 @router.post("/supplier-invoices/{invoice_id}/send-to-economic")
 async def send_to_economic(invoice_id: int):
     """
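The mark-paid endpoint implements a small state machine: already-paid invoices are an idempotent no-op, only `approved` or `sent_to_economic` may transition, and everything else is rejected with a 400. That guard can be isolated as a pure function (the name `mark_paid_action` and the string outcomes are this sketch's convention, not the endpoint's):

```python
def mark_paid_action(status: str) -> str:
    """Decide what the mark-paid endpoint does for a given invoice status."""
    if status == "paid":
        return "noop"      # idempotent: repeating the call changes nothing
    if status in ("approved", "sent_to_economic"):
        return "update"    # legal transition to 'paid'
    return "reject"        # draft/pending/etc. yield an HTTP 400
```

Keeping the decision pure like this makes the transition rules trivially testable without a database.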
@@ -1982,10 +2131,16 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
     try:
         # Validate file extension
         suffix = Path(file.filename).suffix.lower()
-        if suffix not in settings.ALLOWED_EXTENSIONS:
+        suffix_clean = suffix.lstrip('.')
+        # Build allowed set — guard against pydantic parsing CSV as a single element
+        raw = settings.ALLOWED_EXTENSIONS
+        if len(raw) == 1 and ',' in raw[0]:
+            raw = [e.strip() for e in raw[0].split(',')]
+        allowed_clean = {ext.lower().lstrip('.') for ext in raw}
+        if suffix_clean not in allowed_clean:
             raise HTTPException(
                 status_code=400,
-                detail=f"Filtype {suffix} ikke tilladt. Tilladte: {', '.join(settings.ALLOWED_EXTENSIONS)}"
+                detail=f"Filtype {suffix} ikke tilladt. Tilladte: {', '.join(sorted(allowed_clean))}"
             )
 
         # Create upload directory
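The extension check above normalizes `ALLOWED_EXTENSIONS` regardless of how the setting was parsed (a real list, or a one-element list holding a CSV string). The normalization step alone, as a small sketch with an assumed function name:

```python
def normalize_allowed_extensions(raw):
    """Normalize ALLOWED_EXTENSIONS whether it is a list or a single CSV string."""
    # Guard: a CSV value like ["pdf,png,jpg"] arrives as one element
    if len(raw) == 1 and ',' in raw[0]:
        raw = [e.strip() for e in raw[0].split(',')]
    # Compare case-insensitively and without leading dots
    return {ext.lower().lstrip('.') for ext in raw}
```

Comparing a dot-stripped, lowercased suffix against this set means `.PDF`, `pdf`, and `PDF` all pass the same check.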
@@ -1997,7 +2152,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
 
         try:
             # Validate file size while saving
-            max_size = settings.MAX_FILE_SIZE_MB * 1024 * 1024
+            max_size = settings.EMAIL_MAX_UPLOAD_SIZE_MB * 1024 * 1024
             total_size = 0
 
             with open(temp_path, "wb") as buffer:
@@ -2007,7 +2162,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
                         temp_path.unlink(missing_ok=True)
                         raise HTTPException(
                             status_code=413,
-                            detail=f"Fil for stor (max {settings.MAX_FILE_SIZE_MB}MB)"
+                            detail=f"Fil for stor (max {settings.EMAIL_MAX_UPLOAD_SIZE_MB}MB)"
                         )
                     buffer.write(chunk)
@@ -2017,7 +2172,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
             checksum = ollama_service.calculate_file_checksum(temp_path)
 
             # Check for duplicate file
-            existing_file = execute_query(
+            existing_file = execute_query_single(
                 "SELECT file_id, status FROM incoming_files WHERE checksum = %s",
                 (checksum,))
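Duplicate uploads are detected here by comparing a content checksum against `incoming_files`. The implementation of `ollama_service.calculate_file_checksum` is not shown in this diff; a plausible chunked sketch, assuming SHA-256 as the digest (an assumption, not confirmed by the source):

```python
import hashlib

def file_checksum(path, chunk_size=8192):
    """Hex digest of a file, read in chunks so large uploads stay memory-safe."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        # iter(callable, sentinel): read until an empty bytes object is returned
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Hashing in chunks matters because the same request path already streams the upload to disk to enforce the size limit; loading the whole file again to hash it would defeat that.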
@@ -2105,7 +2260,7 @@ async def upload_supplier_invoice(file: UploadFile = File(...)):
 
 
-@router.post("/supplier-invoices/{invoice_id}/send-to-economic")
+@router.post("/supplier-invoices/{invoice_id}/send-to-economic-legacy-unimplemented")
 async def send_invoice_to_economic(invoice_id: int):
     """Send supplier invoice to e-conomic - requires separate implementation"""
     raise HTTPException(status_code=501, detail="e-conomic integration kommer senere")
@@ -2296,22 +2451,66 @@ async def reprocess_uploaded_file(file_id: int):

         extracted_fields = llm_result
         confidence = llm_result.get('confidence', 0.75)

+        # Post-process: clear own CVR(s) if AI mistakenly returned them
+        extracted_cvr = llm_result.get('vendor_cvr')
+        own_cvr = getattr(settings, 'OWN_CVR', '29522790')
+        OWN_CVRS = {str(own_cvr).strip(), '29522790', '14416285'}  # all BMC CVR numbers
+        extracted_cvr_clean = str(extracted_cvr).replace('DK', '').strip() if extracted_cvr else ''
+        if extracted_cvr_clean and extracted_cvr_clean in OWN_CVRS:
+            logger.warning(f"⚠️ AI returned own CVR ({extracted_cvr_clean}) as vendor_cvr - clearing it")
+            llm_result['vendor_cvr'] = None
+            extracted_cvr = None
+        # Also clear vendor_name if it looks like BMC
+        vendor_name = llm_result.get('vendor_name', '') or ''
+        if 'BMC' in vendor_name.upper() and 'DENMARK' in vendor_name.upper():
+            logger.warning(f"⚠️ AI returned own company name '{vendor_name}' as vendor_name - clearing it")
+            llm_result['vendor_name'] = None
+
+        # Try to find vendor in DB by extracted CVR or name (overrides detected_vendor_id)
+        if extracted_cvr:
+            cvr_clean = str(extracted_cvr).replace('DK', '').strip()
+            vendor_row = execute_query_single(
+                "SELECT id FROM vendors WHERE cvr_number = %s AND is_active = true",
+                (cvr_clean,))
+            if vendor_row:
+                vendor_id = vendor_row['id']
+                logger.info(f"✅ Matched vendor by CVR {cvr_clean}: vendor_id={vendor_id}")
+                execute_update(
+                    "UPDATE incoming_files SET detected_vendor_id = %s WHERE file_id = %s",
+                    (vendor_id, file_id))
+        if not vendor_id and llm_result.get('vendor_name'):
+            vendor_row = execute_query_single(
+                "SELECT id FROM vendors WHERE name ILIKE %s AND is_active = true ORDER BY id LIMIT 1",
+                (f"%{llm_result['vendor_name']}%",))
+            if vendor_row:
+                vendor_id = vendor_row['id']
+                logger.info(f"✅ Matched vendor by name '{llm_result['vendor_name']}': vendor_id={vendor_id}")
+                execute_update(
+                    "UPDATE incoming_files SET detected_vendor_id = %s WHERE file_id = %s",
+                    (vendor_id, file_id))
+
         # Store AI extracted data in extractions table
         extraction_id = execute_insert(
-            """INSERT INTO supplier_invoice_extractions
-            (file_id, vendor_id, invoice_number, invoice_date, due_date,
-             total_amount, currency, document_type, confidence, llm_data)
-            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s) RETURNING extraction_id""",
-            (file_id, vendor_id,
+            """INSERT INTO extractions
+            (file_id, vendor_matched_id, vendor_name, vendor_cvr,
+             document_id, document_date, due_date,
+             total_amount, currency, document_type, document_type_detected,
+             confidence, llm_response_json, status)
+            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s) RETURNING extraction_id""",
+            (file_id, vendor_id,
+             llm_result.get('vendor_name'),
+             llm_result.get('vendor_cvr'),
             llm_result.get('invoice_number'),
             llm_result.get('invoice_date'),
             llm_result.get('due_date'),
             llm_result.get('total_amount'),
             llm_result.get('currency', 'DKK'),
-            llm_result.get('document_type'),
+            llm_result.get('document_type', 'invoice'),
+            llm_result.get('document_type', 'invoice'),
             confidence,
-            json.dumps(llm_result))
+            json.dumps(llm_result),
+            'extracted')
         )

         # Insert line items if extracted
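The own-CVR cleanup added in this hunk is easy to test in isolation. A sketch of it as a pure function; the function name is illustrative, and the default CVR set is taken from the constants in the diff:

```python
def clean_vendor_fields(llm_result, own_cvrs=frozenset({'29522790', '14416285'})):
    """Drop vendor_cvr/vendor_name when the AI echoed the receiving company itself."""
    cvr = llm_result.get('vendor_cvr')
    # Normalize 'DK29522790' -> '29522790' before comparing against the block list
    cvr_clean = str(cvr).replace('DK', '').strip() if cvr else ''
    if cvr_clean and cvr_clean in own_cvrs:
        llm_result['vendor_cvr'] = None
    name = (llm_result.get('vendor_name') or '').upper()
    if 'BMC' in name and 'DENMARK' in name:
        llm_result['vendor_name'] = None
    return llm_result
```

Clearing the echoed fields before the DB lookup matters: otherwise the CVR match below would happily link every invoice back to the company's own vendor record.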
@@ -2320,13 +2519,13 @@ async def reprocess_uploaded_file(file_id: int):
                 execute_insert(
                     """INSERT INTO extraction_lines
                     (extraction_id, line_number, description, quantity, unit_price,
-                     line_total, vat_rate, vat_note, confidence)
-                    VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
+                     line_total, vat_rate, confidence)
+                    VALUES (%s, %s, %s, %s, %s, %s, %s, %s)
                     RETURNING line_id""",
                     (extraction_id, idx, line.get('description'),
                      line.get('quantity'), line.get('unit_price'),
                      line.get('line_total'), line.get('vat_rate'),
-                     line.get('vat_note'), confidence)
+                     confidence)
                 )

         # Update file status to ai_extracted
@@ -2376,6 +2575,47 @@ async def reprocess_uploaded_file(file_id: int):
         raise HTTPException(status_code=500, detail=f"Genbehandling fejlede: {str(e)}")


+@router.post("/supplier-invoices/files/batch-analyze")
+async def batch_analyze_files(background_tasks: BackgroundTasks):
+    """
+    Run AI analysis on all unprocessed files in the background.
+    Returns immediately - files are processed async.
+    """
+    pending = execute_query(
+        """SELECT file_id, filename FROM incoming_files
+        WHERE status IN ('pending', 'requires_vendor_selection', 'uploaded', 'failed')
+        ORDER BY uploaded_at DESC
+        LIMIT 100""",
+        ()
+    )
+    if not pending:
+        return {"started": 0, "message": "Ingen filer at behandle"}
+
+    file_ids = [r['file_id'] for r in pending]
+    logger.info(f"🚀 Batch-analyse startet for {len(file_ids)} filer")
+
+    async def _run_batch(ids):
+        ok = err = 0
+        for fid in ids:
+            try:
+                await reprocess_uploaded_file(fid)
+                ok += 1
+            except Exception as ex:
+                logger.error(f"❌ Batch fejl file {fid}: {ex}")
+                err += 1
+        logger.info(f"✅ Batch færdig: {ok} ok, {err} fejlet")
+
+    background_tasks.add_task(_run_batch, file_ids)
+
+    return {
+        "started": len(file_ids),
+        "message": f"{len(file_ids)} filer sendt til analyse i baggrunden. Opdater siden om lidt.",
+        "analyzed": 0,
+        "requires_vendor_selection": 0,
+        "failed": 0
+    }
+
+
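The batch endpoint above queues `_run_batch` via `BackgroundTasks` and returns before any file is processed. The success/failure counting loop can be sketched on its own; `fake_process` below is a stand-in for `reprocess_uploaded_file`:

```python
import asyncio

async def run_batch(ids, process):
    """Process each id, counting successes and failures like _run_batch above."""
    ok = err = 0
    for fid in ids:
        try:
            await process(fid)
            ok += 1
        except Exception:
            # One failing file must not abort the rest of the batch
            err += 1
    return ok, err

async def fake_process(fid):
    # Simulated worker: id 3 fails, everything else succeeds
    if fid == 3:
        raise RuntimeError("extraction failed")

print(asyncio.run(run_batch([1, 2, 3, 4], fake_process)))  # (3, 1)
```

Catching `Exception` per file is the design choice that lets the endpoint report `{ok} ok, {err} fejlet` instead of dying on the first bad PDF.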
 @router.put("/supplier-invoices/templates/{template_id}")
 async def update_template(
     template_id: int,
@@ -3248,17 +3488,10 @@ async def retry_extraction(file_id: int):
         )

         logger.info(f"🔄 Retrying extraction for file {file_id}: {file_data['filename']}")

-        # Trigger re-analysis by calling the existing upload processing logic
-        # For now, just mark as pending - the user can then run batch-analyze
-        return {
-            "file_id": file_id,
-            "filename": file_data['filename'],
-            "message": "File marked for re-analysis. Run batch-analyze to process.",
-            "previous_status": file_data['status'],
-            "new_status": "pending"
-        }
+        # Run full extraction cascade immediately
+        result = await reprocess_uploaded_file(file_id)
+        return result

     except HTTPException:
         raise
@@ -173,6 +173,11 @@
                 <i class="bi bi-calendar-check me-2"></i>Til Betaling
             </a>
         </li>
+        <li class="nav-item">
+            <a class="nav-link" id="ready-tab" data-bs-toggle="tab" href="#ready-content" onclick="switchToReadyTab()">
+                <i class="bi bi-check-circle me-2"></i>Klar til Bogføring
+            </a>
+        </li>
         <li class="nav-item">
             <a class="nav-link" id="lines-tab" data-bs-toggle="tab" href="#lines-content" onclick="switchToLinesTab()">
                 <i class="bi bi-list-ul me-2"></i>Varelinjer
@@ -248,7 +253,7 @@
                 <strong><span id="selectedKassekladdeCount">0</span> fakturaer valgt</strong>
             </div>
             <div class="btn-group" role="group">
-                <button type="button" class="btn btn-sm btn-success" onclick="bulkSendToEconomic()" title="Send til e-conomic kassekladde">
+                <button type="button" class="btn btn-sm btn-success" onclick="bulkSendToEconomicKassekladde()" title="Send til e-conomic kassekladde">
                     <i class="bi bi-send me-1"></i>Send til e-conomic
                 </button>
             </div>
@@ -867,6 +872,133 @@
         </div>
     </div>

+    <!-- =============================================
+         QUICK CREATE VENDOR - split-view modal
+         Left: PDF iframe | Right: vendor form
+    ============================================== -->
+    <div class="modal fade" id="quickVendorSplitModal" tabindex="-1" style="--bs-modal-width: 100%;">
+        <div class="modal-dialog modal-fullscreen">
+            <div class="modal-content">
+                <div class="modal-header py-2 border-bottom">
+                    <h5 class="modal-title"><i class="bi bi-person-plus me-2"></i>Opret / Link Leverandør</h5>
+                    <div class="ms-3 d-flex align-items-center gap-2">
+                        <span class="badge bg-secondary" id="qvSplitFilename" style="font-size:.85rem"></span>
+                    </div>
+                    <button type="button" class="btn-close ms-auto" data-bs-dismiss="modal"></button>
+                </div>
+                <div class="modal-body p-0 d-flex" style="height: calc(100vh - 120px); overflow:hidden;">
+
+                    <!-- LEFT: PDF viewer -->
+                    <div class="d-flex flex-column border-end" style="width:58%; min-width:400px; height:100%;">
+                        <div class="px-3 py-2 bg-body-tertiary border-bottom small text-muted flex-shrink-0">
+                            <i class="bi bi-file-pdf text-danger me-1"></i>Faktura PDF
+                        </div>
+                        <iframe id="qvPdfFrame" src="" style="flex:1 1 0; min-height:0; border:none; width:100%;" title="PDF Preview"></iframe>
+                    </div>
+
+                    <!-- RIGHT: Vendor form -->
+                    <div class="d-flex flex-column" style="width:42%; overflow-y:auto;">
+                        <div class="px-4 py-3 bg-body-tertiary border-bottom">
+                            <span class="small text-muted">Udfyld leverandøroplysninger — felter er preudfyldt fra faktura-PDF</span>
+                        </div>
+                        <div class="px-4 py-3">
+                            <input type="hidden" id="qvFileId">
+                            <input type="hidden" id="qvExistingVendorId">
+
+                            <!-- Search existing -->
+                            <div class="card mb-3 border-primary">
+                                <div class="card-header py-2 bg-primary text-white small"><i class="bi bi-search me-1"></i>Link eksisterende leverandør</div>
+                                <div class="card-body py-2">
+                                    <div class="input-group input-group-sm">
+                                        <input type="text" class="form-control" id="qvSearchInput" placeholder="Søg navn eller CVR..." oninput="qvSearchVendors(this.value)">
+                                        <button class="btn btn-outline-secondary" type="button" onclick="qvSearchVendors(document.getElementById('qvSearchInput').value)"><i class="bi bi-search"></i></button>
+                                    </div>
+                                    <div id="qvSearchResults" class="list-group mt-2" style="max-height:160px; overflow-y:auto;">
+                                        <div class="list-group-item text-muted small py-1">Søg for at finde eksisterende leverandør</div>
+                                    </div>
+                                </div>
+                            </div>
+
+                            <div class="text-center text-muted my-2 small">— eller opret ny leverandør nedenfor —</div>
+
+                            <form id="qvVendorForm" autocomplete="off">
+                                <div class="row g-2 mb-2">
+                                    <div class="col-8">
+                                        <label class="form-label small mb-1">Navn *</label>
+                                        <input type="text" class="form-control form-control-sm" id="qvName" required placeholder="Firma navn">
+                                    </div>
+                                    <div class="col-4">
+                                        <label class="form-label small mb-1">CVR</label>
+                                        <input type="text" class="form-control form-control-sm" id="qvCVR" maxlength="8" placeholder="12345678">
+                                    </div>
+                                </div>
+                                <div class="row g-2 mb-2">
+                                    <div class="col-6">
+                                        <label class="form-label small mb-1">Email</label>
+                                        <input type="email" class="form-control form-control-sm" id="qvEmail" placeholder="kontakt@firma.dk">
+                                    </div>
+                                    <div class="col-6">
+                                        <label class="form-label small mb-1">Telefon</label>
+                                        <input type="tel" class="form-control form-control-sm" id="qvPhone" placeholder="+45 12 34 56 78">
+                                    </div>
+                                </div>
+                                <div class="mb-2">
+                                    <label class="form-label small mb-1">Adresse</label>
+                                    <input type="text" class="form-control form-control-sm" id="qvAddress" placeholder="Vejnavn nr.">
+                                </div>
+                                <div class="row g-2 mb-2">
+                                    <div class="col-4">
+                                        <label class="form-label small mb-1">Postnr.</label>
+                                        <input type="text" class="form-control form-control-sm" id="qvPostal" maxlength="10">
+                                    </div>
+                                    <div class="col-8">
+                                        <label class="form-label small mb-1">By</label>
+                                        <input type="text" class="form-control form-control-sm" id="qvCity">
+                                    </div>
+                                </div>
+                                <div class="mb-2">
+                                    <label class="form-label small mb-1">Website / domæne</label>
+                                    <input type="text" class="form-control form-control-sm" id="qvDomain" placeholder="firma.dk">
+                                </div>
+                                <div class="mb-2">
+                                    <label class="form-label small mb-1">Kategori</label>
+                                    <select class="form-select form-select-sm" id="qvCategory">
+                                        <option value="general">Generel</option>
+                                        <option value="telecom">Telecom</option>
+                                        <option value="hardware">Hardware</option>
+                                        <option value="software">Software</option>
+                                        <option value="services">Services</option>
+                                        <option value="payroll">Løn / HR</option>
+                                        <option value="utilities">Forsyning</option>
+                                        <option value="insurance">Forsikring</option>
+                                        <option value="rent">Husleje / Lokaler</option>
+                                    </select>
+                                </div>
+                                <div class="mb-3">
+                                    <label class="form-label small mb-1">Noter (inkl. bank/IBAN info)</label>
+                                    <textarea class="form-control form-control-sm" id="qvNotes" rows="3" placeholder="IBAN, kontonummer, BIC/SWIFT, betalingsbetingelser..."></textarea>
+                                </div>
+                            </form>
+
+                            <!-- Status alert -->
+                            <div id="qvStatusAlert" class="alert d-none py-2 small"></div>
+                        </div>
+                    </div>
+
+                </div><!-- /.modal-body -->
+
+                <div class="modal-footer py-2 border-top justify-content-between">
+                    <button type="button" class="btn btn-secondary btn-sm" data-bs-dismiss="modal">Luk</button>
+                    <div class="d-flex gap-2">
+                        <button type="button" class="btn btn-success btn-sm" onclick="saveQuickVendor()">
+                            <i class="bi bi-person-plus me-1"></i>Opret og link leverandør
+                        </button>
+                    </div>
+                </div>
+            </div>
+        </div>
+    </div>
+
     <!-- Link/Create Vendor Modal -->
     <div class="modal fade" id="linkVendorModal" tabindex="-1">
         <div class="modal-dialog">
@@ -1265,7 +1397,7 @@ async function markSingleAsPaid(invoiceId) {
 }

 // Helper function to send single invoice to e-conomic
-async function sendToEconomic(invoiceId) {
+async function sendToEconomicById(invoiceId) {
     if (!confirm('Send denne faktura til e-conomic?')) return;

     try {
@@ -1553,7 +1685,7 @@ async function loadReadyForBookingView() {
                 <button class="btn btn-sm btn-outline-primary" onclick="viewInvoiceDetails(${invoice.id})" title="Se/Rediger detaljer">
                     <i class="bi bi-pencil-square"></i>
                 </button>
-                <button class="btn btn-sm btn-primary" onclick="sendToEconomic(${invoice.id})" title="Send til e-conomic">
+                <button class="btn btn-sm btn-primary" onclick="sendToEconomicById(${invoice.id})" title="Send til e-conomic">
                     <i class="bi bi-send"></i>
                 </button>
             </td>
@@ -1812,9 +1944,10 @@ function renderUnhandledFiles(files) {

     for (const file of files) {
         const statusBadge = getFileStatusBadge(file.status);
-        const vendorName = file.detected_vendor_name || '-';
-        const confidence = file.vendor_match_confidence ? `${file.vendor_match_confidence}%` : '-';
-        const amount = file.detected_amount ? formatCurrency(file.detected_amount) : '-';
+        const vendorName = file.best_vendor_name || file.vendor_name || file.detected_vendor_name || '-';
+        const confRaw = file.vendor_match_confidence;
+        const confidence = confRaw !== null && confRaw !== undefined ? `${Math.round(confRaw * 100)}%` : '-';
+        const amount = file.total_amount ? formatCurrency(file.total_amount) : '-';
         const uploadDate = file.uploaded_at ? new Date(file.uploaded_at).toLocaleDateString('da-DK') : '-';

         html += `
@@ -1842,16 +1975,11 @@ function renderUnhandledFiles(files) {
                 <td>${statusBadge}</td>
                 <td>
                     <div class="btn-group btn-group-sm">
-                        ${file.status === 'extraction_failed' ?
-                            `<button class="btn btn-outline-warning" onclick="retryExtraction(${file.file_id})" title="Prøv igen">
-                                <i class="bi bi-arrow-clockwise"></i>
-                            </button>` :
-                            `<button class="btn btn-outline-primary" onclick="analyzeFile(${file.file_id})" title="Analyser">
-                                <i class="bi bi-search"></i>
-                            </button>`
-                        }
-                        <button class="btn btn-outline-secondary" onclick="viewFilePDF(${file.file_id})" title="Vis PDF">
-                            <i class="bi bi-file-pdf"></i>
+                        <button class="btn btn-outline-success" onclick="openQuickVendorCreate(${file.file_id}, '${escapeHtml(file.filename)}')" title="Opret / Link leverandør">
+                            <i class="bi bi-person-plus"></i>
+                        </button>
+                        <button class="btn btn-outline-warning" onclick="rerunSingleFile(${file.file_id})" title="Kør analyse igen">
+                            <i class="bi bi-arrow-repeat"></i>
                         </button>
                         <button class="btn btn-outline-danger" onclick="deleteFile(${file.file_id})" title="Slet">
                             <i class="bi bi-trash"></i>
@@ -1907,12 +2035,12 @@ function getFileStatusBadge(status) {

 // NEW: Batch analyze all files
 async function batchAnalyzeAllFiles() {
-    if (!confirm('Kør automatisk analyse på alle ubehandlede filer?\n\nDette vil:\n- Matche leverandører via CVR\n- Ekstrahere fakturadata\n- Oprette fakturaer i kassekladde ved 100% match')) {
+    if (!confirm('Kør automatisk analyse på alle ubehandlede filer?\n\nDette kan tage flere minutter afhængigt af antal filer.\nSiden opdateres automatisk undervejs.')) {
         return;
     }

     try {
-        showLoadingOverlay('Analyserer filer...');
+        showLoadingOverlay('Starter analyse...');

         const response = await fetch('/api/v1/supplier-invoices/files/batch-analyze', {
             method: 'POST'
@@ -1924,19 +2052,27 @@ async function batchAnalyzeAllFiles() {

         hideLoadingOverlay();

-        alert(`✅ Batch-analyse fuldført!\n\n` +
-              `Analyseret: ${result.analyzed}\n` +
-              `Kræver manuel leverandør-valg: ${result.requires_vendor_selection}\n` +
-              `Fejlet: ${result.failed}`);
+        if (result.started === 0) {
+            alert('ℹ️ Ingen filer at behandle.');
+            return;
+        }

-        // Reload tables
+        alert(`✅ ${result.message}`);
+
+        // Auto-refresh the table every 10 seconds for 5 minutes
+        let refreshes = 0;
+        const maxRefreshes = 30;
+        const interval = setInterval(() => {
+            loadUnhandledFiles();
+            refreshes++;
+            if (refreshes >= maxRefreshes) clearInterval(interval);
+        }, 10000);
         loadUnhandledFiles();
-        loadKassekladdeView();

     } catch (error) {
         hideLoadingOverlay();
         console.error('Batch analysis error:', error);
-        alert('❌ Fejl ved batch-analyse');
+        alert('❌ Fejl ved batch-analyse: ' + error.message);
     }
 }

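The confidence fix in `renderUnhandledFiles` treats `vendor_match_confidence` as a 0–1 ratio and renders it as a whole percent (the old code printed the raw fraction and dropped falsy zero values). The same logic as a small Python helper for illustration; the function name is mine:

```python
def format_confidence(conf_raw):
    """Render a 0-1 match confidence as a percent string, '-' when absent."""
    if conf_raw is None:
        # Explicit None check, mirroring the JS `!== null && !== undefined` guard:
        # a confidence of 0.0 is real data and must still render as '0%'
        return '-'
    return f"{round(conf_raw * 100)}%"
```

The explicit `is None` test (rather than plain truthiness) is the point of the fix: a zero confidence should show as `0%`, not vanish.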
@ -1965,6 +2101,293 @@ async function retryExtraction(fileId) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ─── Quick Vendor Split-View ─────────────────────────────────────────────
|
||||||
|
async function openQuickVendorCreate(fileId, filename) {
|
||||||
|
// Reset
|
||||||
|
document.getElementById('qvFileId').value = fileId;
|
||||||
|
document.getElementById('qvExistingVendorId').value = '';
|
||||||
|
document.getElementById('qvSplitFilename').textContent = filename;
|
||||||
|
document.getElementById('qvName').value = '';
|
||||||
|
document.getElementById('qvCVR').value = '';
|
||||||
|
document.getElementById('qvEmail').value = '';
|
||||||
|
document.getElementById('qvPhone').value = '';
|
||||||
|
document.getElementById('qvAddress').value = '';
|
||||||
|
document.getElementById('qvPostal').value = '';
|
||||||
|
document.getElementById('qvCity').value = '';
|
||||||
|
document.getElementById('qvDomain').value = '';
|
||||||
|
document.getElementById('qvNotes').value = '';
|
||||||
|
document.getElementById('qvSearchInput').value = '';
|
||||||
|
document.getElementById('qvSearchResults').innerHTML = '<div class="list-group-item text-muted small py-1">Søg for at finde eksisterende leverandør</div>';
|
||||||
|
document.getElementById('qvStatusAlert').className = 'alert d-none py-2 small';
|
||||||
|
|
||||||
|
// Load PDF in iframe
|
||||||
|
document.getElementById('qvPdfFrame').src = `/api/v1/supplier-invoices/files/${fileId}/download`;
|
||||||
|
|
||||||
|
// Open modal immediately
|
||||||
|
const modal = new bootstrap.Modal(document.getElementById('quickVendorSplitModal'), {backdrop: 'static'});
|
||||||
|
modal.show();
|
||||||
|
|
||||||
|
// Async: load extracted data and pre-fill form
|
||||||
|
await qvLoadAndPrefill(fileId);
|
||||||
|
}
|
||||||
|
|
||||||
|
async function qvLoadAndPrefill(fileId, isRetry) {
|
||||||
|
const statusEl = document.getElementById('qvStatusAlert');
|
||||||
|
statusEl.className = 'alert alert-info py-2 small';
|
||||||
|
statusEl.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Henter fakturadata…';
|
||||||
|
|
||||||
|
try {
|
||||||
|
const resp = await fetch(`/api/v1/supplier-invoices/files/${fileId}/extracted-data`);
|
||||||
|
if (!resp.ok) { throw new Error(`HTTP ${resp.status}`); }
|
||||||
|
const data = await resp.json();
|
||||||
|
|
||||||
|
console.log('[QV] extracted-data response:', JSON.stringify({
|
||||||
|
file_id: data.file_id,
|
||||||
|
status: data.status,
|
||||||
|
has_extraction: !!data.extraction,
|
||||||
|
has_llm_data: !!data.llm_data,
|
||||||
|
llm_data: data.llm_data,
|
||||||
|
extraction_vendor_name: data.extraction?.vendor_name,
|
||||||
|
extraction_vendor_cvr: data.extraction?.vendor_cvr,
|
||||||
|
}));
|
||||||
|
|
||||||
|
// Normaliseret data fra server (backend bygger llm_data med rigtige feltnavne)
|
||||||
|
const ld = data.llm_data || {};
|
||||||
|
const ext = data.extraction || {};
|
||||||
|
|
||||||
|
// llm_response_json: kan være JSONB-objekt eller string
|
||||||
|
let rawAi = {};
|
||||||
|
const rawLlm = ext.llm_response_json;
|
||||||
|
if (rawLlm) {
|
||||||
|
rawAi = (typeof rawLlm === 'string') ? (() => { try { return JSON.parse(rawLlm); } catch(e) { return {}; } })() : rawLlm;
|
||||||
|
}
|
||||||
|
console.log('[QV] rawAi keys:', Object.keys(rawAi).join(', ') || '(tom)');
|
||||||
|
|
||||||
|
// Hent vendor-felter fra alle 3 kilder i prioriteret rækkefølge
|
||||||
|
const name = ld.vendor_name || ext.vendor_name || rawAi.vendor_name || rawAi.issuer || '';
|
||||||
|
const cvr = (ld.vendor_cvr || ext.vendor_cvr || rawAi.vendor_cvr || rawAi.vendor_vat || '').toString().replace(/^DK/i, '').trim();
|
||||||
|
const email = ld.vendor_email || rawAi.vendor_email || rawAi.supplier_email || '';
|
||||||
|
const addr = ld.vendor_address || rawAi.vendor_address || rawAi.supplier_address || rawAi.vendor_street || '';
|
||||||
|
const postal = ld.vendor_postal_code || rawAi.vendor_postal_code || rawAi.postal_code || '';
|
||||||
|
const city = ld.vendor_city || rawAi.vendor_city || rawAi.city || '';
|
||||||
|
|
||||||
|
console.log('[QV] Parsed fields:', { name, cvr, email, addr, postal, city });
|
||||||
|
|
||||||
|
// Ingen extraction i DB overhovedet (fil aldrig kørt) → auto-reprocess
|
||||||
|
if (!data.extraction && !isRetry) {
|
||||||
|
console.log('[QV] Ingen extraction – starter auto-reprocess');
|
||||||
|
await qvAutoReprocess(fileId);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Extraction findes men ingen vendor data → tilbyd reprocess
|
||||||
|
if (!name && !cvr && !isRetry) {
|
||||||
|
console.log('[QV] Extraction uden vendor data – starter auto-reprocess');
|
||||||
|
await qvAutoReprocess(fileId);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Udfyld form
|
||||||
|
if (name) document.getElementById('qvName').value = name;
|
||||||
|
if (cvr) document.getElementById('qvCVR').value = cvr;
|
||||||
|
if (email) document.getElementById('qvEmail').value = email;
|
||||||
|
if (addr) {
|
||||||
|
const parts = addr.split(/,|\n/).map(s => s.trim()).filter(Boolean);
|
||||||
|
if (parts.length >= 1) document.getElementById('qvAddress').value = parts[0];
|
||||||
|
if (!postal && !city && parts.length >= 2) {
|
||||||
|
const postalCity = parts[parts.length - 1];
|
||||||
|
const m = postalCity.match(/^(\d{4})\s+(.+)$/);
|
||||||
|
if (m) { document.getElementById('qvPostal').value = m[1]; document.getElementById('qvCity').value = m[2]; }
|
||||||
|
else { document.getElementById('qvCity').value = postalCity; }
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (postal) document.getElementById('qvPostal').value = postal;
|
||||||
|
if (city) document.getElementById('qvCity').value = city;
|
||||||
|
|
||||||
|
if (name || cvr) {
|
||||||
|
statusEl.className = 'alert alert-success py-2 small';
|
||||||
|
statusEl.textContent = `✅ Data hentet${name ? ': ' + name : ''}${cvr ? ' (' + cvr + ')' : ''}`;
|
||||||
|
setTimeout(() => { statusEl.className = 'alert d-none py-2 small'; }, 4000);
|
||||||
|
} else {
|
||||||
|
// AI fandt ingen vendor – men vis hvad der er (fakturanr, beløb)
|
||||||
|
const inv = ld.invoice_number || rawAi.invoice_number || '';
|
||||||
|
const amt = ld.total_amount || rawAi.total_amount || '';
|
||||||
|
statusEl.className = 'alert alert-warning py-2 small';
|
||||||
|
statusEl.innerHTML = `AI fandt ingen leverandørdata${inv ? ' (Faktura ' + inv + (amt ? ', ' + amt + ' DKK' : '') + ')' : ''}. Udfyld navn manuelt eller søg herover.`;
|
||||||
|
}
|
||||||
|
} catch(e) {
|
||||||
|
console.error('[QV] Fejl:', e);
|
||||||
|
statusEl.className = 'alert alert-danger py-2 small';
|
||||||
|
statusEl.textContent = 'Fejl ved hentning: ' + e.message;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function qvAutoReprocess(fileId) {
|
||||||
|
const statusEl = document.getElementById('qvStatusAlert');
|
||||||
|
statusEl.className = 'alert alert-info py-2 small';
|
||||||
|
statusEl.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Analyserer faktura med AI – vent venligst…';
|
||||||
|
|
||||||
|
try {
|
||||||
|
console.log('[QV] Starter reprocess for file:', fileId);
|
||||||
|
const r = await fetch(`/api/v1/supplier-invoices/reprocess/${fileId}`, { method: 'POST' });
|
||||||
|
if (!r.ok) {
|
||||||
|
const errBody = await r.text();
|
||||||
|
console.error('[QV] Reprocess fejlede:', r.status, errBody);
|
||||||
|
throw new Error(`Reprocess HTTP ${r.status}: ${errBody}`);
|
||||||
|
}
|
||||||
|
const reprocessResult = await r.json();
|
||||||
|
console.log('[QV] Reprocess result:', JSON.stringify(reprocessResult));
|
||||||
|
|
||||||
|
// Hent opdateret data med isRetry=true for at undgå uendelig løkke
|
||||||
|
await qvLoadAndPrefill(fileId, true);
|
||||||
|
loadUnhandledFiles();
|
||||||
|
} catch(e) {
|
||||||
|
console.error('[QV] Auto-reprocess fejl:', e);
|
||||||
|
statusEl.className = 'alert alert-warning py-2 small';
|
||||||
|
statusEl.innerHTML = `Kunne ikke køre AI-analyse: ${e.message}. <button class="btn btn-sm btn-outline-warning ms-2" onclick="qvAutoReprocess(${fileId})">Prøv igen</button>`;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function qvSearchVendors(query) {
    const results = document.getElementById('qvSearchResults');
    if (!query || query.length < 2) {
        results.innerHTML = '<div class="list-group-item text-muted small py-1">Søg for at finde eksisterende leverandør</div>';
        return;
    }
    try {
        const resp = await fetch(`/api/v1/vendors?search=${encodeURIComponent(query)}&active_only=true`);
        const vendors = await resp.json();
        if (!vendors || vendors.length === 0) {
            results.innerHTML = '<div class="list-group-item text-muted small py-1">Ingen leverandører fundet</div>';
            return;
        }
        results.innerHTML = vendors.slice(0, 10).map(v => `
            <button type="button" class="list-group-item list-group-item-action py-1 small"
                    onclick="qvSelectVendor(${v.id}, '${escapeHtml(v.name)}', '${v.cvr_number || ''}')">
                <strong>${escapeHtml(v.name)}</strong>
                ${v.cvr_number ? `<span class="text-muted ms-2">${v.cvr_number}</span>` : ''}
            </button>
        `).join('');
    } catch(e) {
        results.innerHTML = '<div class="list-group-item text-danger small py-1">Fejl ved søgning</div>';
    }
}

function qvSelectVendor(vendorId, vendorName, vendorCVR) {
    document.getElementById('qvExistingVendorId').value = vendorId;
    document.getElementById('qvName').value = vendorName;
    document.getElementById('qvCVR').value = vendorCVR;
    const alert = document.getElementById('qvStatusAlert');
    alert.className = 'alert alert-success py-2 small';
    alert.textContent = `✅ Valgt: ${vendorName} — klik "Opret og link" for at linke`;
}

async function saveQuickVendor() {
    const fileId = document.getElementById('qvFileId').value;
    const existingId = document.getElementById('qvExistingVendorId').value;
    const name = document.getElementById('qvName').value.trim();
    const cvr = document.getElementById('qvCVR').value.trim();
    const email = document.getElementById('qvEmail').value.trim();
    const phone = document.getElementById('qvPhone').value.trim();
    const address = document.getElementById('qvAddress').value.trim();
    const postal = document.getElementById('qvPostal').value.trim();
    const city = document.getElementById('qvCity').value.trim();
    const domain = document.getElementById('qvDomain').value.trim();
    const category = document.getElementById('qvCategory').value;
    const notes = document.getElementById('qvNotes').value.trim();

    const statusEl = document.getElementById('qvStatusAlert');

    if (!name) {
        statusEl.className = 'alert alert-danger py-2 small';
        statusEl.textContent = 'Navn er påkrævet.';
        return;
    }

    statusEl.className = 'alert alert-info py-2 small';
    statusEl.textContent = 'Gemmer…';

    try {
        let vendorId = existingId ? parseInt(existingId) : null;

        if (!vendorId) {
            // Create new vendor
            const payload = {
                name, cvr_number: cvr || null,
                email: email || null, phone: phone || null,
                address: [address, postal && city ? `${postal} ${city}` : city].filter(Boolean).join('\n') || null,
                postal_code: postal || null, city: city || null,
                domain: domain || null, category,
                notes: notes || null
            };
            const resp = await fetch('/api/v1/vendors', {
                method: 'POST',
                headers: {'Content-Type': 'application/json'},
                body: JSON.stringify(payload)
            });
            if (!resp.ok) {
                const err = await resp.json().catch(() => ({}));
                throw new Error(err.detail || 'Oprettelse fejlede');
            }
            const created = await resp.json();
            vendorId = created.id;
        }

        // Link vendor to file
        const linkResp = await fetch(`/api/v1/supplier-invoices/files/${fileId}/link-vendor`, {
            method: 'POST',
            headers: {'Content-Type': 'application/json'},
            body: JSON.stringify({vendor_id: vendorId})
        });
        if (!linkResp.ok) {
            const err = await linkResp.json().catch(() => ({}));
            throw new Error(err.detail || 'Link fejlede');
        }

        statusEl.className = 'alert alert-success py-2 small';
        statusEl.textContent = `✅ Leverandør ${existingId ? 'linket' : 'oprettet og linket'}!`;

        setTimeout(() => {
            bootstrap.Modal.getInstance(document.getElementById('quickVendorSplitModal')).hide();
            loadUnhandledFiles();
        }, 900);

    } catch(e) {
        statusEl.className = 'alert alert-danger py-2 small';
        statusEl.textContent = '❌ ' + e.message;
    }
}

// Rerun full extraction for a file in the unhandled tab
async function rerunSingleFile(fileId) {
    try {
        showLoadingOverlay('Kører analyse...');

        const response = await fetch(`/api/v1/supplier-invoices/reprocess/${fileId}`, {
            method: 'POST'
        });

        if (!response.ok) {
            const err = await response.json().catch(() => ({}));
            throw new Error(err.detail || 'Analyse fejlede');
        }

        const result = await response.json();
        hideLoadingOverlay();

        const confPct = result.confidence ? Math.round(result.confidence * 100) + '%' : '?%';
        const vendorInfo = result.vendor_id ? `Leverandør matchet (ID ${result.vendor_id})` : 'Ingen leverandør matchet';
        alert(`✅ Analyse færdig\n${vendorInfo}\nConfidence: ${confPct}`);

        loadUnhandledFiles();

    } catch (error) {
        hideLoadingOverlay();
        console.error('Rerun error:', error);
        alert('❌ Fejl ved analyse: ' + error.message);
    }
}

 // NEW: Analyze single file
 async function analyzeFile(fileId) {
     try {
@@ -3633,12 +4056,11 @@ async function bulkMarkAsPaid() {
     for (const invoiceId of invoiceIds) {
         try {
-            const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}`, {
-                method: 'PATCH',
+            const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}/mark-paid`, {
+                method: 'POST',
                 headers: {'Content-Type': 'application/json'},
                 body: JSON.stringify({
-                    status: 'paid',
-                    paid_date: new Date().toISOString().split('T')[0]
+                    payment_date: new Date().toISOString().split('T')[0]
                 })
             });
@@ -3669,12 +4091,11 @@ async function markInvoiceAsPaid(invoiceId) {
     if (!confirm('Marker denne faktura som betalt?')) return;

     try {
-        const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}`, {
-            method: 'PATCH',
+        const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}/mark-paid`, {
+            method: 'POST',
             headers: {'Content-Type': 'application/json'},
             body: JSON.stringify({
-                status: 'paid',
-                paid_date: new Date().toISOString().split('T')[0]
+                payment_date: new Date().toISOString().split('T')[0]
             })
         });
@@ -4139,7 +4560,7 @@ async function approveInvoice() {
         const response = await fetch(`/api/v1/supplier-invoices/${currentInvoiceId}/approve`, {
             method: 'POST',
             headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ approved_by: 'CurrentUser' }) // TODO: Get from auth
+            body: JSON.stringify({ approved_by: getApprovalUser() })
         });

         if (response.ok) {
@@ -4192,7 +4613,7 @@ async function quickApprove(invoiceId) {
         const response = await fetch(`/api/v1/supplier-invoices/${invoiceId}/approve`, {
             method: 'POST',
             headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({ approved_by: 'CurrentUser' })
+            body: JSON.stringify({ approved_by: getApprovalUser() })
         });

         if (response.ok) {
@@ -4537,7 +4958,7 @@ async function createTemplateFromInvoice(invoiceId, vendorId) {
         }

         // Step 2: AI analyze
-        const aiResp = await fetch('/api/v1/supplier-invoices/ai-analyze', {
+        const aiResp = await fetch('/api/v1/supplier-invoices/ai/analyze', {
             method: 'POST',
             headers: {'Content-Type': 'application/json'},
             body: JSON.stringify({
@@ -4699,7 +5120,7 @@ async function sendSingleToEconomic(invoiceId) {
 }

 // Bulk send selected invoices to e-conomic
-async function bulkSendToEconomic() {
+async function bulkSendToEconomicKassekladde() {
     const checkboxes = document.querySelectorAll('.kassekladde-checkbox:checked');
     const invoiceIds = Array.from(checkboxes).map(cb => parseInt(cb.dataset.invoiceId));

@@ -4747,6 +5168,16 @@ async function bulkSendToEconomic() {
     }
 }

+function getApprovalUser() {
+    const bodyUser = document.body?.dataset?.currentUser;
+    if (bodyUser && bodyUser.trim()) return bodyUser.trim();
+
+    const metaUser = document.querySelector('meta[name="current-user"]')?.content;
+    if (metaUser && metaUser.trim()) return metaUser.trim();
+
+    return 'System';
+}
+
 // Select vendor for file (when <100% match)
 async function selectVendorForFile(fileId, vendorId) {
     if (!vendorId) return;
@@ -1360,7 +1360,7 @@ async function autoGenerateTemplate() {

     try {
         // Call Ollama to analyze the invoice
-        const response = await fetch('/api/v1/supplier-invoices/ai-analyze', {
+        const response = await fetch('/api/v1/supplier-invoices/ai/analyze', {
             method: 'POST',
             headers: { 'Content-Type': 'application/json' },
             body: JSON.stringify({
@@ -15,6 +15,21 @@ logger = logging.getLogger(__name__)
 security = HTTPBearer(auto_error=False)


+def _users_column_exists(column_name: str) -> bool:
+    result = execute_query_single(
+        """
+        SELECT 1
+        FROM information_schema.columns
+        WHERE table_schema = 'public'
+          AND table_name = 'users'
+          AND column_name = %s
+        LIMIT 1
+        """,
+        (column_name,),
+    )
+    return bool(result)
+
+
 async def get_current_user(
     request: Request,
     credentials: Optional[HTTPAuthorizationCredentials] = Depends(security)
@@ -70,9 +85,11 @@ async def get_current_user(
     }

     # Get additional user details from database
+    is_2fa_expr = "is_2fa_enabled" if _users_column_exists("is_2fa_enabled") else "FALSE AS is_2fa_enabled"
     user_details = execute_query_single(
-        "SELECT email, full_name, is_2fa_enabled FROM users WHERE user_id = %s",
-        (user_id,))
+        f"SELECT email, full_name, {is_2fa_expr} FROM users WHERE user_id = %s",
+        (user_id,),
+    )

     return {
         "id": user_id,
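The hunk above builds the SELECT list conditionally so the query still runs against databases that predate the 2FA migration. A minimal standalone sketch of that pattern — `column_exists` is a stand-in for the real `information_schema` lookup, and the function name is illustrative:

```python
def select_user_columns(column_exists) -> str:
    """Build a SELECT that degrades gracefully when a column is missing.

    `column_exists` is any callable mapping a column name to a bool
    (in the real code it queries information_schema.columns).
    """
    is_2fa_expr = (
        "is_2fa_enabled" if column_exists("is_2fa_enabled")
        else "FALSE AS is_2fa_enabled"
    )
    return f"SELECT email, full_name, {is_2fa_expr} FROM users WHERE user_id = %s"

# With the column present the real column is selected; without it a
# constant keeps the result shape identical for callers.
print(select_user_columns(lambda c: True))
print(select_user_columns(lambda c: False))
```

Selecting `FALSE AS is_2fa_enabled` means downstream code can read the key unconditionally, which is the point of the fallback.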
@@ -15,6 +15,28 @@ import logging

 logger = logging.getLogger(__name__)

+_users_column_cache: Dict[str, bool] = {}
+
+
+def _users_column_exists(column_name: str) -> bool:
+    if column_name in _users_column_cache:
+        return _users_column_cache[column_name]
+
+    result = execute_query_single(
+        """
+        SELECT 1
+        FROM information_schema.columns
+        WHERE table_schema = 'public'
+          AND table_name = 'users'
+          AND column_name = %s
+        LIMIT 1
+        """,
+        (column_name,),
+    )
+    exists = bool(result)
+    _users_column_cache[column_name] = exists
+    return exists
+
 # JWT Settings
 SECRET_KEY = settings.JWT_SECRET_KEY
 ALGORITHM = "HS256"
@@ -25,6 +47,11 @@ pwd_context = CryptContext(schemes=["pbkdf2_sha256", "bcrypt_sha256", "bcrypt"],

 class AuthService:
     """Service for authentication and authorization"""

+    @staticmethod
+    def is_2fa_supported() -> bool:
+        """Return True only when required 2FA columns exist in users table."""
+        return _users_column_exists("is_2fa_enabled") and _users_column_exists("totp_secret")
+
     @staticmethod
     def hash_password(password: str) -> str:
@@ -89,6 +116,9 @@ class AuthService:
     @staticmethod
     def setup_user_2fa(user_id: int, username: str) -> Dict:
         """Create and store a new TOTP secret (not enabled until verified)"""
+        if not AuthService.is_2fa_supported():
+            raise RuntimeError("2FA columns missing in users table")
+
         secret = AuthService.generate_2fa_secret()
         execute_update(
             "UPDATE users SET totp_secret = %s, is_2fa_enabled = FALSE, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
@@ -103,6 +133,9 @@ class AuthService:
     @staticmethod
     def enable_user_2fa(user_id: int, otp_code: str) -> bool:
         """Enable 2FA after verifying TOTP code"""
+        if not (_users_column_exists("totp_secret") and _users_column_exists("is_2fa_enabled")):
+            return False
+
         user = execute_query_single(
             "SELECT totp_secret FROM users WHERE user_id = %s",
             (user_id,)
@@ -123,6 +156,9 @@ class AuthService:
     @staticmethod
     def disable_user_2fa(user_id: int, otp_code: str) -> bool:
         """Disable 2FA after verifying TOTP code"""
+        if not (_users_column_exists("totp_secret") and _users_column_exists("is_2fa_enabled")):
+            return False
+
         user = execute_query_single(
             "SELECT totp_secret FROM users WHERE user_id = %s",
             (user_id,)
@@ -151,10 +187,11 @@ class AuthService:
         if not user:
             return False

-        execute_update(
-            "UPDATE users SET is_2fa_enabled = FALSE, totp_secret = NULL, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
-            (user_id,)
-        )
+        if _users_column_exists("is_2fa_enabled") and _users_column_exists("totp_secret"):
+            execute_update(
+                "UPDATE users SET is_2fa_enabled = FALSE, totp_secret = NULL, updated_at = CURRENT_TIMESTAMP WHERE user_id = %s",
+                (user_id,)
+            )
         return True

     @staticmethod
@@ -256,13 +293,18 @@ class AuthService:
         request_username = (username or "").strip().lower()

         # Get user
+        is_2fa_expr = "is_2fa_enabled" if _users_column_exists("is_2fa_enabled") else "FALSE AS is_2fa_enabled"
+        totp_expr = "totp_secret" if _users_column_exists("totp_secret") else "NULL::text AS totp_secret"
+        last_2fa_expr = "last_2fa_at" if _users_column_exists("last_2fa_at") else "NULL::timestamp AS last_2fa_at"
+
         user = execute_query_single(
-            """SELECT user_id, username, email, password_hash, full_name,
+            f"""SELECT user_id, username, email, password_hash, full_name,
                       is_active, is_superadmin, failed_login_attempts, locked_until,
-                      is_2fa_enabled, totp_secret, last_2fa_at
+                      {is_2fa_expr}, {totp_expr}, {last_2fa_expr}
                FROM users
                WHERE username = %s OR email = %s""",
-            (username, username))
+            (username, username),
+        )

         if not user:
             # Shadow Admin fallback (only when no regular user matches)
@@ -367,10 +409,11 @@ class AuthService:
             logger.warning(f"❌ Login failed: Invalid 2FA - {username}")
             return None, "Invalid 2FA code"

-        execute_update(
-            "UPDATE users SET last_2fa_at = CURRENT_TIMESTAMP WHERE user_id = %s",
-            (user['user_id'],)
-        )
+        if _users_column_exists("last_2fa_at"):
+            execute_update(
+                "UPDATE users SET last_2fa_at = CURRENT_TIMESTAMP WHERE user_id = %s",
+                (user['user_id'],)
+            )

         # Success! Reset failed attempts and update last login
         execute_update(
@@ -416,6 +459,9 @@ class AuthService:
     @staticmethod
     def is_user_2fa_enabled(user_id: int) -> bool:
         """Check if user has 2FA enabled"""
+        if not _users_column_exists("is_2fa_enabled"):
+            return False
+
         user = execute_query_single(
             "SELECT is_2fa_enabled FROM users WHERE user_id = %s",
             (user_id,)
@@ -105,11 +105,26 @@ class Settings(BaseSettings):
     EMAIL_AI_ENABLED: bool = False
     EMAIL_AUTO_CLASSIFY: bool = True  # Enable classification by default (uses keywords if AI disabled)
     EMAIL_AI_CONFIDENCE_THRESHOLD: float = 0.7
+    EMAIL_REQUIRE_MANUAL_APPROVAL: bool = True  # Phase 1: human approval before case creation/routing
     EMAIL_MAX_FETCH_PER_RUN: int = 50
     EMAIL_PROCESS_INTERVAL_MINUTES: int = 5
     EMAIL_WORKFLOWS_ENABLED: bool = True
     EMAIL_MAX_UPLOAD_SIZE_MB: int = 50  # Max file size for email uploads
     ALLOWED_EXTENSIONS: List[str] = ["pdf", "jpg", "jpeg", "png", "gif", "doc", "docx", "xls", "xlsx", "zip"]  # Allowed file extensions for uploads

+    @field_validator("ALLOWED_EXTENSIONS", mode="before")
+    @classmethod
+    def parse_allowed_extensions(cls, v):
+        """Handle both list and comma-separated string (e.g. from .env: .pdf,.jpg,...)"""
+        if isinstance(v, str):
+            # Split comma-separated and strip whitespace + leading dots
+            return [ext.strip().lstrip('.').lower() for ext in v.split(',') if ext.strip()]
+        if isinstance(v, list):
+            # Fix case where pydantic already wrapped entire CSV as single list element
+            if len(v) == 1 and ',' in str(v[0]):
+                return [ext.strip().lstrip('.').lower() for ext in str(v[0]).split(',') if ext.strip()]
+            return [ext.strip().lstrip('.').lower() for ext in v if ext]
+        return v
+
     # vTiger Cloud Integration
     VTIGER_ENABLED: bool = False
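The validator's normalization can be exercised in isolation. This sketch reimplements only the string/list handling from the hunk above as a plain function (the name is illustrative, not part of the Settings class):

```python
def normalize_extensions(v):
    """Normalize ALLOWED_EXTENSIONS input: a CSV string, a list, or a
    list whose single element is itself a CSV (pydantic pre-wrapping)."""
    if isinstance(v, str):
        return [ext.strip().lstrip('.').lower() for ext in v.split(',') if ext.strip()]
    if isinstance(v, list):
        if len(v) == 1 and ',' in str(v[0]):
            return [ext.strip().lstrip('.').lower() for ext in str(v[0]).split(',') if ext.strip()]
        return [ext.strip().lstrip('.').lower() for ext in v if ext]
    return v

print(normalize_extensions(".pdf, .JPG,png"))  # ['pdf', 'jpg', 'png']
print(normalize_extensions(["pdf,jpg"]))       # ['pdf', 'jpg']
```

Lowercasing and stripping the leading dot means `.env` values like `.PDF,.Jpg` and code defaults like `["pdf", "jpg"]` end up in the same canonical form.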
@@ -161,7 +176,7 @@ class Settings(BaseSettings):

     # Backup System Configuration
     BACKUP_ENABLED: bool = True
-    BACKUP_STORAGE_PATH: str = "/app/backups"
+    BACKUP_STORAGE_PATH: str = "/app/data/backups"
     BACKUP_DRY_RUN: bool = False
     BACKUP_READ_ONLY: bool = False
     BACKUP_RESTORE_DRY_RUN: bool = True  # SAFETY: Test restore without overwriting data
@@ -223,6 +238,9 @@ class Settings(BaseSettings):
     # Telefoni (Yealink) Integration
     TELEFONI_SHARED_SECRET: str = ""  # If set, required as ?token=...
     TELEFONI_IP_WHITELIST: str = "172.16.31.0/24"  # CSV of IPs/CIDRs, e.g. "192.168.1.0/24,10.0.0.10"

+    # Mission Control webhooks
+    MISSION_WEBHOOK_TOKEN: str = ""
+
     # ESET Integration
     ESET_ENABLED: bool = False
@@ -6,6 +6,7 @@ PostgreSQL connection and helpers using psycopg2
 import psycopg2
 from psycopg2.extras import RealDictCursor
 from psycopg2.pool import SimpleConnectionPool
+from functools import lru_cache
 from typing import Optional
 import logging

@@ -128,3 +129,34 @@ def execute_query_single(query: str, params: tuple = None):
     """Execute query and return single row (backwards compatibility for fetchone=True)"""
     result = execute_query(query, params)
     return result[0] if result and len(result) > 0 else None
+
+
+@lru_cache(maxsize=256)
+def table_has_column(table_name: str, column_name: str, schema: str = "public") -> bool:
+    """Return whether a column exists in the current database schema."""
+    conn = get_db_connection()
+    try:
+        with conn.cursor() as cursor:
+            cursor.execute(
+                """
+                SELECT 1
+                FROM information_schema.columns
+                WHERE table_schema = %s
+                  AND table_name = %s
+                  AND column_name = %s
+                LIMIT 1
+                """,
+                (schema, table_name, column_name),
+            )
+            return cursor.fetchone() is not None
+    except Exception as e:
+        logger.warning(
+            "Schema lookup failed for %s.%s.%s: %s",
+            schema,
+            table_name,
+            column_name,
+            e,
+        )
+        return False
+    finally:
+        release_db_connection(conn)
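Because `table_has_column` is wrapped in `@lru_cache`, each distinct `(table, column, schema)` tuple hits the database once and is then served from memory; after a migration the process must call `cache_clear()` (or restart) before new columns become visible. A toy illustration of that behavior, with the database call replaced by a counter (names here are demo stand-ins, not the real helper):

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=256)
def table_has_column_demo(table: str, column: str) -> bool:
    """Stand-in for the real helper: counts how often the 'database' is hit."""
    global calls
    calls += 1
    return column in {"email", "full_name"}  # pretend schema

table_has_column_demo("users", "email")
table_has_column_demo("users", "email")        # served from cache, no new call
table_has_column_demo("users", "totp_secret")  # new tuple, one more call
print(calls)  # 2

table_has_column_demo.cache_clear()            # needed after a migration
table_has_column_demo("users", "email")
print(calls)  # 3
```

Note also that the real function caches a `False` returned on a connection error; whether that is acceptable depends on how transient those errors are.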
@@ -28,6 +28,7 @@ class CustomerBase(BaseModel):
     name: str
     cvr_number: Optional[str] = None
     email: Optional[str] = None
+    email_domain: Optional[str] = None
     phone: Optional[str] = None
     address: Optional[str] = None
     city: Optional[str] = None
@@ -48,6 +49,7 @@ class CustomerUpdate(BaseModel):
     name: Optional[str] = None
     cvr_number: Optional[str] = None
     email: Optional[str] = None
+    email_domain: Optional[str] = None
     phone: Optional[str] = None
     address: Optional[str] = None
     city: Optional[str] = None
@@ -495,14 +497,15 @@ async def create_customer(customer: CustomerCreate):
     try:
         customer_id = execute_insert(
             """INSERT INTO customers
-               (name, cvr_number, email, phone, address, city, postal_code,
+               (name, cvr_number, email, email_domain, phone, address, city, postal_code,
                 country, website, is_active, invoice_email, mobile_phone)
-               VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+               VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
                RETURNING id""",
             (
                 customer.name,
                 customer.cvr_number,
                 customer.email,
+                customer.email_domain,
                 customer.phone,
                 customer.address,
                 customer.city,
app/dashboard/backend/mission_router.py (new file, 455 lines)
@@ -0,0 +1,455 @@
import json
import logging
from datetime import datetime
from typing import Any, Dict, Optional

from fastapi import APIRouter, HTTPException, Query, Request, WebSocket, WebSocketDisconnect
from pydantic import BaseModel, Field

from app.core.auth_service import AuthService
from app.core.config import settings
from app.core.database import execute_query, execute_query_single

from .mission_service import MissionService
from .mission_ws import mission_ws_manager

logger = logging.getLogger(__name__)
router = APIRouter()


class MissionCallEvent(BaseModel):
    call_id: str = Field(..., min_length=1, max_length=128)
    caller_number: Optional[str] = None
    queue_name: Optional[str] = None
    timestamp: Optional[datetime] = None


class MissionUptimeWebhook(BaseModel):
    status: Optional[str] = None
    service_name: Optional[str] = None
    customer_name: Optional[str] = None
    timestamp: Optional[datetime] = None
    payload: Dict[str, Any] = Field(default_factory=dict)

def _first_query_param(request: Request, *names: str) -> Optional[str]:
    for name in names:
        value = request.query_params.get(name)
        if value and str(value).strip():
            return str(value).strip()
    return None


def _parse_query_timestamp(request: Request) -> Optional[datetime]:
    raw = _first_query_param(request, "timestamp", "time", "event_time")
    if not raw:
        return None
    try:
        return datetime.fromisoformat(raw.replace("Z", "+00:00"))
    except Exception:
        return None


def _event_from_query(request: Request) -> MissionCallEvent:
    call_id = _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid")
    if not call_id:
        logger.warning(
            "⚠️ Mission webhook invalid query path=%s reason=missing_call_id keys=%s",
            request.url.path,
            ",".join(sorted(request.query_params.keys())),
        )
        raise HTTPException(status_code=400, detail="Missing call_id query parameter")

    return MissionCallEvent(
        call_id=call_id,
        caller_number=_first_query_param(request, "caller_number", "caller", "from", "number", "phone"),
        queue_name=_first_query_param(request, "queue_name", "queue", "group", "line"),
        timestamp=_parse_query_timestamp(request),
    )


def _get_webhook_token() -> str:
    db_token = MissionService.get_setting_value("mission_webhook_token", "") or ""
    env_token = (getattr(settings, "MISSION_WEBHOOK_TOKEN", "") or "").strip()
    return db_token.strip() or env_token


def _validate_mission_webhook_token(request: Request, token: Optional[str] = None) -> None:
    configured = _get_webhook_token()
    path = request.url.path
    if not configured:
        logger.warning("❌ Mission webhook rejected path=%s reason=token_not_configured", path)
        raise HTTPException(status_code=403, detail="Mission webhook token not configured")

    candidate = token or request.headers.get("x-mission-token") or request.query_params.get("token")
    if not candidate or candidate.strip() != configured:
        source = "query_or_arg"
        if not token and request.headers.get("x-mission-token"):
            source = "header"

        masked = "<empty>"
        if candidate:
            c = candidate.strip()
            masked = "***" if len(c) <= 8 else f"{c[:4]}...{c[-4:]}"

        logger.warning(
            "❌ Mission webhook forbidden path=%s reason=token_mismatch source=%s token=%s",
            path,
            source,
            masked,
        )
        raise HTTPException(status_code=403, detail="Forbidden")

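The masking in `_validate_mission_webhook_token` keeps rejected tokens out of the logs. Extracted into a standalone helper (hypothetical name, same logic) it behaves like this:

```python
from typing import Optional

def mask_token(candidate: Optional[str]) -> str:
    """Log-safe rendering of a secret: at most 8 characters are revealed."""
    if not candidate:
        return "<empty>"
    c = candidate.strip()
    return "***" if len(c) <= 8 else f"{c[:4]}...{c[-4:]}"

print(mask_token(None))                # <empty>
print(mask_token("short"))             # ***
print(mask_token("0123456789abcdef"))  # 0123...cdef
```

Short tokens are fully redacted rather than partially shown, since revealing 8 of, say, 8 characters would leak the whole secret.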
def _normalize_uptime_payload(payload: MissionUptimeWebhook) -> Dict[str, Any]:
    raw = dict(payload.payload or {})

    status_candidate = payload.status or raw.get("status") or raw.get("event")
    if not status_candidate and isinstance(raw.get("monitor"), dict):
        status_candidate = raw.get("monitor", {}).get("status")

    service_name = payload.service_name or raw.get("service_name") or raw.get("monitor_name")
    if not service_name and isinstance(raw.get("monitor"), dict):
        service_name = raw.get("monitor", {}).get("name")

    customer_name = payload.customer_name or raw.get("customer_name") or raw.get("customer")
    timestamp = payload.timestamp or raw.get("timestamp")

    status = str(status_candidate or "UNKNOWN").upper().strip()
    if status not in {"UP", "DOWN", "DEGRADED"}:
        status = "UNKNOWN"

    return {
        "status": status,
        "service_name": str(service_name or "Unknown Service"),
        "customer_name": str(customer_name or "").strip() or None,
        "timestamp": timestamp,
        "raw": raw,
    }

@router.get("/mission/state")
|
||||||
|
async def get_mission_state():
|
||||||
|
return MissionService.get_state()
|
||||||
|
|
||||||
|
|
||||||
|
@router.websocket("/mission/ws")
|
||||||
|
async def mission_ws(websocket: WebSocket):
|
||||||
|
token = websocket.query_params.get("token")
|
||||||
|
auth_header = (websocket.headers.get("authorization") or "").strip()
|
||||||
|
if not token and auth_header.lower().startswith("bearer "):
|
||||||
|
token = auth_header.split(" ", 1)[1].strip()
|
||||||
|
if not token:
|
||||||
|
token = (websocket.cookies.get("access_token") or "").strip() or None
|
||||||
|
|
||||||
|
payload = AuthService.verify_token(token) if token else None
|
||||||
|
if not payload:
|
||||||
|
await websocket.close(code=1008)
|
||||||
|
return
|
||||||
|
|
||||||
|
await mission_ws_manager.connect(websocket)
|
||||||
|
try:
|
||||||
|
await mission_ws_manager.broadcast("mission_state", MissionService.get_state())
|
||||||
|
while True:
|
||||||
|
await websocket.receive_text()
|
||||||
|
except WebSocketDisconnect:
|
||||||
|
await mission_ws_manager.disconnect(websocket)
|
||||||
|
except Exception:
|
||||||
|
await mission_ws_manager.disconnect(websocket)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/mission/webhook/telefoni/ringing")
|
||||||
|
async def mission_telefoni_ringing(event: MissionCallEvent, request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
"☎️ Mission webhook ringing call_id=%s caller=%s queue=%s method=%s",
|
||||||
|
event.call_id,
|
||||||
|
event.caller_number,
|
||||||
|
event.queue_name,
|
||||||
|
request.method,
|
||||||
|
)
|
||||||
|
|
||||||
|
timestamp = event.timestamp or datetime.utcnow()
|
||||||
|
context = MissionService.resolve_contact_context(event.caller_number)
|
||||||
|
queue_name = (event.queue_name or "Ukendt kø").strip()
|
||||||
|
|
||||||
|
execute_query(
|
||||||
|
"""
|
||||||
|
INSERT INTO mission_call_state (
|
||||||
|
call_id, queue_name, caller_number, contact_name, company_name, customer_tag,
|
||||||
|
state, started_at, answered_at, ended_at, updated_at, last_payload
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, 'ringing', %s, NULL, NULL, NOW(), %s::jsonb)
|
||||||
|
ON CONFLICT (call_id)
|
||||||
|
DO UPDATE SET
|
||||||
|
queue_name = EXCLUDED.queue_name,
|
||||||
|
caller_number = EXCLUDED.caller_number,
|
||||||
|
contact_name = EXCLUDED.contact_name,
|
||||||
|
company_name = EXCLUDED.company_name,
|
||||||
|
customer_tag = EXCLUDED.customer_tag,
|
||||||
|
state = 'ringing',
|
||||||
|
ended_at = NULL,
|
||||||
|
answered_at = NULL,
|
||||||
|
started_at = LEAST(mission_call_state.started_at, EXCLUDED.started_at),
|
||||||
|
updated_at = NOW(),
|
||||||
|
last_payload = EXCLUDED.last_payload
|
||||||
|
""",
|
||||||
|
(
|
||||||
|
event.call_id,
|
||||||
|
queue_name,
|
||||||
|
event.caller_number,
|
||||||
|
context.get("contact_name"),
|
||||||
|
context.get("company_name"),
|
||||||
|
context.get("customer_tag"),
|
||||||
|
timestamp,
|
||||||
|
json.dumps(event.model_dump(mode="json")),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
event_row = MissionService.insert_event(
|
||||||
|
event_type="incoming_call",
|
||||||
|
title=f"Indgående opkald i {queue_name}",
|
||||||
|
severity="warning",
|
||||||
|
source="telefoni",
|
||||||
|
customer_name=context.get("company_name"),
|
||||||
|
payload={
|
||||||
|
"call_id": event.call_id,
|
||||||
|
"queue_name": queue_name,
|
||||||
|
"caller_number": event.caller_number,
|
||||||
|
**context,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
call_payload = {
|
||||||
|
"call_id": event.call_id,
|
||||||
|
"queue_name": queue_name,
|
||||||
|
"caller_number": event.caller_number,
|
||||||
|
**context,
|
||||||
|
"timestamp": timestamp,
|
||||||
|
}
|
||||||
|
|
||||||
|
await mission_ws_manager.broadcast("call_ringing", call_payload)
|
||||||
|
await mission_ws_manager.broadcast("live_feed_event", event_row)
|
||||||
|
await mission_ws_manager.broadcast("kpi_update", MissionService.get_kpis())
|
||||||
|
|
||||||
|
return {"status": "ok"}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/mission/webhook/telefoni/ringing")
|
||||||
|
async def mission_telefoni_ringing_get(request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
# Allow token-only GET calls (no call payload) for phone webhook validation/ping.
|
||||||
|
if not _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid"):
|
||||||
|
logger.info("☎️ Mission webhook ringing ping method=%s", request.method)
|
||||||
|
return {"status": "ok", "mode": "ping"}
|
||||||
|
|
||||||
|
event = _event_from_query(request)
|
||||||
|
return await mission_telefoni_ringing(event, request, token)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/mission/webhook/telefoni/answered")
|
||||||
|
async def mission_telefoni_answered(event: MissionCallEvent, request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
"✅ Mission webhook answered call_id=%s caller=%s queue=%s method=%s",
|
||||||
|
event.call_id,
|
||||||
|
event.caller_number,
|
||||||
|
event.queue_name,
|
||||||
|
request.method,
|
||||||
|
)
|
||||||
|
|
||||||
|
execute_query(
|
||||||
|
"""
|
||||||
|
UPDATE mission_call_state
|
||||||
|
SET state = 'answered',
|
||||||
|
answered_at = COALESCE(answered_at, NOW()),
|
||||||
|
updated_at = NOW(),
|
||||||
|
last_payload = %s::jsonb
|
||||||
|
WHERE call_id = %s
|
||||||
|
""",
|
||||||
|
(json.dumps(event.model_dump(mode="json")), event.call_id),
|
||||||
|
)
|
||||||
|
|
||||||
|
event_row = MissionService.insert_event(
|
||||||
|
event_type="call_answered",
|
||||||
|
title="Opkald besvaret",
|
||||||
|
severity="info",
|
||||||
|
source="telefoni",
|
||||||
|
payload={"call_id": event.call_id, "queue_name": event.queue_name, "caller_number": event.caller_number},
|
||||||
|
)
|
||||||
|
|
||||||
|
await mission_ws_manager.broadcast("call_answered", {"call_id": event.call_id})
|
||||||
|
await mission_ws_manager.broadcast("live_feed_event", event_row)
|
||||||
|
return {"status": "ok"}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/mission/webhook/telefoni/answered")
|
||||||
|
async def mission_telefoni_answered_get(request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
if not _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid"):
|
||||||
|
logger.info("✅ Mission webhook answered ping method=%s", request.method)
|
||||||
|
return {"status": "ok", "mode": "ping"}
|
||||||
|
|
||||||
|
event = _event_from_query(request)
|
||||||
|
return await mission_telefoni_answered(event, request, token)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/mission/webhook/telefoni/hangup")
|
||||||
|
async def mission_telefoni_hangup(event: MissionCallEvent, request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
"📴 Mission webhook hangup call_id=%s caller=%s queue=%s method=%s",
|
||||||
|
event.call_id,
|
||||||
|
event.caller_number,
|
||||||
|
event.queue_name,
|
||||||
|
request.method,
|
||||||
|
)
|
||||||
|
|
||||||
|
execute_query(
|
||||||
|
"""
|
||||||
|
UPDATE mission_call_state
|
||||||
|
SET state = 'hangup',
|
||||||
|
ended_at = NOW(),
|
||||||
|
updated_at = NOW(),
|
||||||
|
last_payload = %s::jsonb
|
||||||
|
WHERE call_id = %s
|
||||||
|
""",
|
||||||
|
(json.dumps(event.model_dump(mode="json")), event.call_id),
|
||||||
|
)
|
||||||
|
|
||||||
|
event_row = MissionService.insert_event(
|
||||||
|
event_type="call_ended",
|
||||||
|
title="Opkald afsluttet",
|
||||||
|
severity="info",
|
||||||
|
source="telefoni",
|
||||||
|
payload={"call_id": event.call_id, "queue_name": event.queue_name, "caller_number": event.caller_number},
|
||||||
|
)
|
||||||
|
|
||||||
|
await mission_ws_manager.broadcast("call_hangup", {"call_id": event.call_id})
|
||||||
|
await mission_ws_manager.broadcast("live_feed_event", event_row)
|
||||||
|
return {"status": "ok"}
|
||||||
|
|
||||||
|
|
||||||
|
@router.get("/mission/webhook/telefoni/hangup")
|
||||||
|
async def mission_telefoni_hangup_get(request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
if not _first_query_param(request, "call_id", "callid", "id", "session_id", "uuid"):
|
||||||
|
logger.info("📴 Mission webhook hangup ping method=%s", request.method)
|
||||||
|
return {"status": "ok", "mode": "ping"}
|
||||||
|
|
||||||
|
event = _event_from_query(request)
|
||||||
|
return await mission_telefoni_hangup(event, request, token)
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/mission/webhook/uptime")
|
||||||
|
async def mission_uptime_webhook(payload: MissionUptimeWebhook, request: Request, token: Optional[str] = Query(None)):
|
||||||
|
_validate_mission_webhook_token(request, token)
|
||||||
|
|
||||||
|
normalized = _normalize_uptime_payload(payload)
|
||||||
|
status = normalized["status"]
|
||||||
|
service_name = normalized["service_name"]
|
||||||
|
customer_name = normalized["customer_name"]
|
||||||
|
alert_key = MissionService.build_alert_key(service_name, customer_name)
|
||||||
|
|
||||||
|
current = execute_query_single("SELECT is_active, started_at FROM mission_uptime_alerts WHERE alert_key = %s", (alert_key,))
|
||||||
|
|
||||||
|
if status in {"DOWN", "DEGRADED"}:
|
||||||
|
started_at = (current or {}).get("started_at")
|
||||||
|
is_active = bool((current or {}).get("is_active"))
|
||||||
|
if not started_at or not is_active:
|
||||||
|
started_at = datetime.utcnow()
|
||||||
|
|
||||||
|
execute_query(
|
||||||
|
"""
|
||||||
|
INSERT INTO mission_uptime_alerts (
|
||||||
|
alert_key, service_name, customer_name, status, is_active, started_at, resolved_at,
|
||||||
|
updated_at, raw_payload, normalized_payload
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, TRUE, %s, NULL, NOW(), %s::jsonb, %s::jsonb)
|
||||||
|
ON CONFLICT (alert_key)
|
||||||
|
DO UPDATE SET
|
||||||
|
status = EXCLUDED.status,
|
||||||
|
is_active = TRUE,
|
||||||
|
started_at = COALESCE(mission_uptime_alerts.started_at, EXCLUDED.started_at),
|
||||||
|
resolved_at = NULL,
|
||||||
|
updated_at = NOW(),
|
||||||
|
raw_payload = EXCLUDED.raw_payload,
|
||||||
|
normalized_payload = EXCLUDED.normalized_payload
|
||||||
|
""",
|
||||||
|
(
|
||||||
|
alert_key,
|
||||||
|
service_name,
|
||||||
|
customer_name,
|
||||||
|
status,
|
||||||
|
started_at,
|
||||||
|
json.dumps(payload.model_dump(mode="json")),
|
||||||
|
json.dumps(normalized, default=str),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
event_type = "uptime_down" if status == "DOWN" else "uptime_degraded"
|
||||||
|
severity = "critical" if status == "DOWN" else "warning"
|
||||||
|
title = f"{service_name} er {status}"
|
||||||
|
elif status == "UP":
|
||||||
|
execute_query(
|
||||||
|
"""
|
||||||
|
INSERT INTO mission_uptime_alerts (
|
||||||
|
alert_key, service_name, customer_name, status, is_active, started_at, resolved_at,
|
||||||
|
updated_at, raw_payload, normalized_payload
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, FALSE, NULL, NOW(), NOW(), %s::jsonb, %s::jsonb)
|
||||||
|
ON CONFLICT (alert_key)
|
||||||
|
DO UPDATE SET
|
||||||
|
status = EXCLUDED.status,
|
||||||
|
is_active = FALSE,
|
||||||
|
resolved_at = NOW(),
|
||||||
|
updated_at = NOW(),
|
||||||
|
raw_payload = EXCLUDED.raw_payload,
|
||||||
|
normalized_payload = EXCLUDED.normalized_payload
|
||||||
|
""",
|
||||||
|
(
|
||||||
|
alert_key,
|
||||||
|
service_name,
|
||||||
|
customer_name,
|
||||||
|
status,
|
||||||
|
json.dumps(payload.model_dump(mode="json")),
|
||||||
|
json.dumps(normalized, default=str),
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
event_type = "uptime_up"
|
||||||
|
severity = "success"
|
||||||
|
title = f"{service_name} er UP"
|
||||||
|
else:
|
||||||
|
event_type = "uptime_unknown"
|
||||||
|
severity = "info"
|
||||||
|
title = f"{service_name} status ukendt"
|
||||||
|
|
||||||
|
event_row = MissionService.insert_event(
|
||||||
|
event_type=event_type,
|
||||||
|
title=title,
|
||||||
|
severity=severity,
|
||||||
|
source="uptime",
|
||||||
|
customer_name=customer_name,
|
||||||
|
payload={"alert_key": alert_key, **normalized},
|
||||||
|
)
|
||||||
|
|
||||||
|
await mission_ws_manager.broadcast(
|
||||||
|
"uptime_alert",
|
||||||
|
{
|
||||||
|
"alert_key": alert_key,
|
||||||
|
"status": status,
|
||||||
|
"service_name": service_name,
|
||||||
|
"customer_name": customer_name,
|
||||||
|
"active_alerts": MissionService.get_active_alerts(),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
await mission_ws_manager.broadcast("live_feed_event", event_row)
|
||||||
|
|
||||||
|
return {"status": "ok", "normalized": normalized}
|
||||||
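The fallback chain in `_normalize_uptime_payload` (explicit field, then top-level payload keys, then the nested `monitor` object, then a clamp to the known statuses) can be sketched as a standalone function. This is a minimal illustration, assuming a plain dict in place of the `MissionUptimeWebhook` model:

```python
from typing import Any, Dict


def normalize_uptime(raw: Dict[str, Any]) -> Dict[str, Any]:
    """Mirror the webhook's fallback order on a bare payload dict."""
    status = raw.get("status") or raw.get("event")
    if not status and isinstance(raw.get("monitor"), dict):
        status = raw["monitor"].get("status")

    service = raw.get("service_name") or raw.get("monitor_name")
    if not service and isinstance(raw.get("monitor"), dict):
        service = raw["monitor"].get("name")

    # Anything outside the known set collapses to UNKNOWN.
    status = str(status or "UNKNOWN").upper().strip()
    if status not in {"UP", "DOWN", "DEGRADED"}:
        status = "UNKNOWN"

    return {"status": status, "service_name": str(service or "Unknown Service")}


# Uptime-Kuma-style payload with a nested monitor object
print(normalize_uptime({"monitor": {"name": "Mail", "status": "down"}}))
# Unrecognised statuses are clamped
print(normalize_uptime({"status": "paused", "service_name": "VPN"}))
```

This keeps the webhook tolerant of several monitoring tools posting slightly different JSON shapes to the same endpoint.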
app/dashboard/backend/mission_service.py (new file, 290 lines)
@@ -0,0 +1,290 @@
import json
import logging
from typing import Any, Dict, Optional

from app.core.database import execute_query, execute_query_single


logger = logging.getLogger(__name__)


class MissionService:
    @staticmethod
    def _safe(label: str, func, default):
        try:
            return func()
        except Exception as exc:
            logger.error("❌ Mission state component failed: %s (%s)", label, exc)
            return default

    @staticmethod
    def _table_exists(table_name: str) -> bool:
        row = execute_query_single("SELECT to_regclass(%s) AS table_name", (f"public.{table_name}",))
        return bool(row and row.get("table_name"))

    @staticmethod
    def get_ring_timeout_seconds() -> int:
        raw = MissionService.get_setting_value("mission_call_ring_timeout_seconds", "180") or "180"
        try:
            value = int(raw)
        except (TypeError, ValueError):
            value = 180
        return max(30, min(value, 3600))

    @staticmethod
    def expire_stale_ringing_calls() -> None:
        if not MissionService._table_exists("mission_call_state"):
            return

        timeout_seconds = MissionService.get_ring_timeout_seconds()
        execute_query(
            """
            UPDATE mission_call_state
            SET state = 'hangup',
                ended_at = COALESCE(ended_at, NOW()),
                updated_at = NOW()
            WHERE state = 'ringing'
              AND started_at < (NOW() - make_interval(secs => %s))
            """,
            (timeout_seconds,),
        )

    @staticmethod
    def get_setting_value(key: str, default: Optional[str] = None) -> Optional[str]:
        row = execute_query_single("SELECT value FROM settings WHERE key = %s", (key,))
        if not row:
            return default
        value = row.get("value")
        if value is None or value == "":
            return default
        return str(value)

    @staticmethod
    def parse_json_setting(key: str, default: Any) -> Any:
        raw = MissionService.get_setting_value(key, None)
        if raw is None:
            return default
        try:
            return json.loads(raw)
        except Exception:
            return default

    @staticmethod
    def build_alert_key(service_name: str, customer_name: Optional[str]) -> str:
        customer_part = (customer_name or "").strip().lower() or "global"
        return f"{service_name.strip().lower()}::{customer_part}"

    @staticmethod
    def resolve_contact_context(caller_number: Optional[str]) -> Dict[str, Optional[str]]:
        if not caller_number:
            return {"contact_name": None, "company_name": None, "customer_tag": None}

        query = """
            SELECT
                c.id,
                c.first_name,
                c.last_name,
                (
                    SELECT cu.name
                    FROM contact_companies cc
                    JOIN customers cu ON cu.id = cc.customer_id
                    WHERE cc.contact_id = c.id
                    ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
                    LIMIT 1
                ) AS company_name,
                (
                    SELECT t.name
                    FROM entity_tags et
                    JOIN tags t ON t.id = et.tag_id
                    WHERE et.entity_type = 'contact'
                      AND et.entity_id = c.id
                      AND LOWER(t.name) IN ('vip', 'serviceaftale', 'service agreement')
                    ORDER BY t.name
                    LIMIT 1
                ) AS customer_tag
            FROM contacts c
            WHERE RIGHT(regexp_replace(COALESCE(c.phone, ''), '\\D', '', 'g'), 8) = RIGHT(regexp_replace(%s, '\\D', '', 'g'), 8)
               OR RIGHT(regexp_replace(COALESCE(c.mobile, ''), '\\D', '', 'g'), 8) = RIGHT(regexp_replace(%s, '\\D', '', 'g'), 8)
            LIMIT 1
        """
        row = execute_query_single(query, (caller_number, caller_number))
        if not row:
            return {"contact_name": None, "company_name": None, "customer_tag": None}

        contact_name = f"{(row.get('first_name') or '').strip()} {(row.get('last_name') or '').strip()}".strip() or None
        return {
            "contact_name": contact_name,
            "company_name": row.get("company_name"),
            "customer_tag": row.get("customer_tag"),
        }

    @staticmethod
    def insert_event(
        *,
        event_type: str,
        title: str,
        severity: str = "info",
        source: Optional[str] = None,
        customer_name: Optional[str] = None,
        payload: Optional[Dict[str, Any]] = None,
    ) -> Dict[str, Any]:
        if not MissionService._table_exists("mission_events"):
            logger.warning("Mission table missing: mission_events (event skipped)")
            return {}

        rows = execute_query(
            """
            INSERT INTO mission_events (event_type, severity, title, source, customer_name, payload)
            VALUES (%s, %s, %s, %s, %s, %s::jsonb)
            RETURNING id, event_type, severity, title, source, customer_name, payload, created_at
            """,
            (event_type, severity, title, source, customer_name, json.dumps(payload or {})),
        )
        return rows[0] if rows else {}

    @staticmethod
    def get_kpis() -> Dict[str, int]:
        query = """
            SELECT
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet') AS open_cases,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) = 'åben' AND ansvarlig_bruger_id IS NULL) AS new_cases,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet' AND ansvarlig_bruger_id IS NULL) AS unassigned_cases,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet' AND deadline IS NOT NULL AND deadline::date = CURRENT_DATE) AS deadlines_today,
                COUNT(*) FILTER (WHERE deleted_at IS NULL AND LOWER(status) <> 'afsluttet' AND deadline IS NOT NULL AND deadline::date < CURRENT_DATE) AS overdue_deadlines
            FROM sag_sager
        """
        row = execute_query_single(query) or {}
        return {
            "open_cases": int(row.get("open_cases") or 0),
            "new_cases": int(row.get("new_cases") or 0),
            "unassigned_cases": int(row.get("unassigned_cases") or 0),
            "deadlines_today": int(row.get("deadlines_today") or 0),
            "overdue_deadlines": int(row.get("overdue_deadlines") or 0),
        }

    @staticmethod
    def get_employee_deadlines() -> list[Dict[str, Any]]:
        rows = execute_query(
            """
            SELECT
                COALESCE(u.full_name, u.username, 'Ukendt') AS employee_name,
                COUNT(*) FILTER (WHERE s.deadline::date = CURRENT_DATE) AS deadlines_today,
                COUNT(*) FILTER (WHERE s.deadline::date < CURRENT_DATE) AS overdue_deadlines
            FROM sag_sager s
            LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
            WHERE s.deleted_at IS NULL
              AND LOWER(s.status) <> 'afsluttet'
              AND s.deadline IS NOT NULL
            GROUP BY COALESCE(u.full_name, u.username, 'Ukendt')
            HAVING COUNT(*) FILTER (WHERE s.deadline::date = CURRENT_DATE) > 0
                OR COUNT(*) FILTER (WHERE s.deadline::date < CURRENT_DATE) > 0
            ORDER BY overdue_deadlines DESC, deadlines_today DESC, employee_name ASC
            """
        ) or []
        return [
            {
                "employee_name": row.get("employee_name"),
                "deadlines_today": int(row.get("deadlines_today") or 0),
                "overdue_deadlines": int(row.get("overdue_deadlines") or 0),
            }
            for row in rows
        ]

    @staticmethod
    def get_active_calls() -> list[Dict[str, Any]]:
        if not MissionService._table_exists("mission_call_state"):
            logger.warning("Mission table missing: mission_call_state (active calls unavailable)")
            return []

        MissionService.expire_stale_ringing_calls()
        rows = execute_query(
            """
            SELECT call_id, queue_name, caller_number, contact_name, company_name, customer_tag, state, started_at, answered_at, ended_at, updated_at
            FROM mission_call_state
            WHERE state = 'ringing'
            ORDER BY started_at DESC
            """
        )
        return rows or []

    @staticmethod
    def get_active_alerts() -> list[Dict[str, Any]]:
        if not MissionService._table_exists("mission_uptime_alerts"):
            logger.warning("Mission table missing: mission_uptime_alerts (active alerts unavailable)")
            return []

        rows = execute_query(
            """
            SELECT alert_key, service_name, customer_name, status, is_active, started_at, resolved_at, updated_at
            FROM mission_uptime_alerts
            WHERE is_active = TRUE
            ORDER BY started_at ASC NULLS LAST
            """
        )
        return rows or []

    @staticmethod
    def get_live_feed(limit: int = 20) -> list[Dict[str, Any]]:
        if not MissionService._table_exists("mission_events"):
            logger.warning("Mission table missing: mission_events (live feed unavailable)")
            return []

        rows = execute_query(
            """
            SELECT id, event_type, severity, title, source, customer_name, payload, created_at
            FROM mission_events
            ORDER BY created_at DESC
            LIMIT %s
            """,
            (limit,),
        )
        return rows or []

    @staticmethod
    def get_state() -> Dict[str, Any]:
        kpis_default = {
            "open_cases": 0,
            "new_cases": 0,
            "unassigned_cases": 0,
            "deadlines_today": 0,
            "overdue_deadlines": 0,
        }

        return {
            "kpis": MissionService._safe("kpis", MissionService.get_kpis, kpis_default),
            "active_calls": MissionService._safe("active_calls", MissionService.get_active_calls, []),
            "employee_deadlines": MissionService._safe("employee_deadlines", MissionService.get_employee_deadlines, []),
            "active_alerts": MissionService._safe("active_alerts", MissionService.get_active_alerts, []),
            "live_feed": MissionService._safe("live_feed", lambda: MissionService.get_live_feed(20), []),
            "config": {
                "display_queues": MissionService._safe("config.display_queues", lambda: MissionService.parse_json_setting("mission_display_queues", []), []),
                "sound_enabled": MissionService._safe(
                    "config.sound_enabled",
                    lambda: str(MissionService.get_setting_value("mission_sound_enabled", "true")).lower() == "true",
                    True,
                ),
                "sound_volume": MissionService._safe(
                    "config.sound_volume",
                    lambda: int(MissionService.get_setting_value("mission_sound_volume", "70") or 70),
                    70,
                ),
                "sound_events": MissionService._safe(
                    "config.sound_events",
                    lambda: MissionService.parse_json_setting("mission_sound_events", ["incoming_call", "uptime_down", "critical_event"]),
                    ["incoming_call", "uptime_down", "critical_event"],
                ),
                "kpi_visible": MissionService._safe(
                    "config.kpi_visible",
                    lambda: MissionService.parse_json_setting(
                        "mission_kpi_visible",
                        ["open_cases", "new_cases", "unassigned_cases", "deadlines_today", "overdue_deadlines"],
                    ),
                    ["open_cases", "new_cases", "unassigned_cases", "deadlines_today", "overdue_deadlines"],
                ),
                "customer_filter": MissionService._safe(
                    "config.customer_filter",
                    lambda: MissionService.get_setting_value("mission_customer_filter", "") or "",
                    "",
                ),
            },
        }
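Two of the small pure helpers above, `build_alert_key` and the ring-timeout clamp in `get_ring_timeout_seconds`, can be exercised in isolation. A sketch that restates their logic without the settings-table lookup:

```python
from typing import Optional


def build_alert_key(service_name: str, customer_name: Optional[str]) -> str:
    # Same shape as MissionService.build_alert_key: lowercase, '::'-joined,
    # with 'global' standing in for alerts that are not customer-specific.
    customer_part = (customer_name or "").strip().lower() or "global"
    return f"{service_name.strip().lower()}::{customer_part}"


def clamp_ring_timeout(raw: str) -> int:
    # Invalid settings fall back to 180s; valid ones are clamped to 30..3600.
    try:
        value = int(raw)
    except (TypeError, ValueError):
        value = 180
    return max(30, min(value, 3600))


print(build_alert_key("Mail Server", "ACME A/S"))  # mail server::acme a/s
print(clamp_ring_timeout("5"), clamp_ring_timeout("abc"))  # 30 180
```

Deduplicating on this key is what lets repeated DOWN notifications for the same service/customer pair update one row in `mission_uptime_alerts` instead of piling up.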
app/dashboard/backend/mission_ws.py (new file, 45 lines)
@@ -0,0 +1,45 @@
import asyncio
import json
import logging
from typing import Set

from fastapi import WebSocket

logger = logging.getLogger(__name__)


class MissionConnectionManager:
    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self._connections: Set[WebSocket] = set()

    async def connect(self, websocket: WebSocket) -> None:
        await websocket.accept()
        async with self._lock:
            self._connections.add(websocket)
        logger.info("📡 Mission WS connected (%s active)", len(self._connections))

    async def disconnect(self, websocket: WebSocket) -> None:
        async with self._lock:
            self._connections.discard(websocket)
        logger.info("📡 Mission WS disconnected (%s active)", len(self._connections))

    async def broadcast(self, event: str, payload: dict) -> None:
        message = json.dumps({"event": event, "data": payload}, default=str)
        async with self._lock:
            targets = list(self._connections)

        dead: list[WebSocket] = []
        for websocket in targets:
            try:
                await websocket.send_text(message)
            except Exception:
                dead.append(websocket)

        if dead:
            async with self._lock:
                for websocket in dead:
                    self._connections.discard(websocket)


mission_ws_manager = MissionConnectionManager()
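Every broadcast leaves the manager in the same JSON envelope, serialized with `default=str` so non-JSON values such as `datetime` columns survive. A minimal sketch of the wire format a frontend client should expect (the helper name is illustrative, not part of the module):

```python
import json
from datetime import datetime


def envelope(event: str, payload: dict) -> str:
    # Same shape as MissionConnectionManager.broadcast:
    # {"event": ..., "data": ...}, with default=str so datetimes
    # and other non-JSON types become plain strings.
    return json.dumps({"event": event, "data": payload}, default=str)


msg = envelope("call_ringing", {"call_id": "abc", "timestamp": datetime(2024, 1, 2, 3, 4, 5)})
print(msg)
```

A browser client can then switch on `JSON.parse(msg).event` (`mission_state`, `call_ringing`, `uptime_alert`, `live_feed_event`, `kpi_update`) and read the payload from `.data`.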
@@ -125,10 +125,24 @@ async def dashboard(request: Request):

     from app.core.database import execute_query

-    result = execute_query_single(unknown_query)
-    unknown_count = result['count'] if result else 0
-    raw_alerts = execute_query(bankruptcy_query) or []
+    try:
+        result = execute_query_single(unknown_query)
+        unknown_count = result['count'] if result else 0
+    except Exception as exc:
+        if "tticket_worklog" in str(exc):
+            logger.warning("⚠️ tticket_worklog table not found; defaulting unknown worklog count to 0")
+            unknown_count = 0
+        else:
+            raise
+
+    try:
+        raw_alerts = execute_query(bankruptcy_query) or []
+    except Exception as exc:
+        if "email_messages" in str(exc):
+            logger.warning("⚠️ email_messages table not found; skipping bankruptcy alerts")
+            raw_alerts = []
+        else:
+            raise

     bankruptcy_alerts = []

     for alert in raw_alerts:

@@ -344,3 +358,13 @@ async def clear_default_dashboard(
 async def clear_default_dashboard_get_fallback():
     return RedirectResponse(url="/settings#system", status_code=303)
+
+
+@router.get("/dashboard/mission-control", response_class=HTMLResponse)
+async def mission_control_dashboard(request: Request):
+    return templates.TemplateResponse(
+        "dashboard/frontend/mission_control.html",
+        {
+            "request": request,
+        },
+    )
app/dashboard/frontend/mission_control.html (new file, 576 lines)
@@ -0,0 +1,576 @@
{% extends "shared/frontend/base.html" %}

{% block title %}Mission Control - BMC Hub{% endblock %}

{% block extra_css %}
<style>
    :root {
        --mc-bg: #0b1320;
        --mc-surface: #121d2f;
        --mc-surface-2: #16243a;
        --mc-border: #2c3c58;
        --mc-text: #e9f1ff;
        --mc-text-muted: #9fb3d1;
        --mc-danger: #ef4444;
        --mc-warning: #f59e0b;
        --mc-success: #10b981;
        --mc-info: #3b82f6;
    }

    body {
        background: var(--mc-bg) !important;
        color: var(--mc-text);
    }

    main.container-fluid {
        max-width: 100% !important;
        padding: 0.75rem 1rem 1rem 1rem !important;
    }

    .mc-grid {
        display: grid;
        gap: 0.75rem;
        grid-template-rows: auto 1fr auto;
        min-height: calc(100vh - 90px);
    }

    .mc-card {
        background: linear-gradient(180deg, var(--mc-surface) 0%, var(--mc-surface-2) 100%);
        border: 1px solid var(--mc-border);
        border-radius: 14px;
        padding: 0.75rem 1rem;
    }

    .mc-top {
        display: grid;
        grid-template-columns: 1fr;
        gap: 0.75rem;
    }

    .mc-alert-bar {
        display: flex;
        align-items: center;
        gap: 0.75rem;
        font-size: 1.15rem;
        font-weight: 700;
        padding: 0.9rem 1rem;
        border-radius: 12px;
    }

    .mc-alert-bar.down {
        background: rgba(239, 68, 68, 0.18);
        border: 1px solid rgba(239, 68, 68, 0.55);
        color: #ffd6d6;
    }

    .mc-alert-empty {
        color: var(--mc-text-muted);
        font-size: 0.95rem;
    }

    .mc-middle {
        display: grid;
        grid-template-columns: 2fr 1fr;
        gap: 0.75rem;
        min-height: 0;
    }

    .mc-kpi-grid {
        display: grid;
        grid-template-columns: repeat(5, minmax(0, 1fr));
        gap: 0.65rem;
    }

    .mc-kpi {
        background: rgba(255, 255, 255, 0.03);
        border: 1px solid var(--mc-border);
        border-radius: 12px;
        padding: 0.85rem;
    }

    .mc-kpi .label {
        font-size: 0.8rem;
        text-transform: uppercase;
        letter-spacing: 0.05em;
        color: var(--mc-text-muted);
    }

    .mc-kpi .value {
        font-size: 2rem;
        line-height: 1;
        font-weight: 800;
        margin-top: 0.45rem;
    }

    .mc-kpi.warning { border-color: rgba(245, 158, 11, 0.55); }
    .mc-kpi.danger { border-color: rgba(239, 68, 68, 0.55); }

    .mc-call-overlay {
        display: none;
        margin-top: 0.75rem;
        background: rgba(59, 130, 246, 0.14);
        border: 2px solid rgba(59, 130, 246, 0.65);
        border-radius: 14px;
        padding: 1rem;
    }

    .mc-call-overlay.active {
        display: block;
    }

    .mc-call-title {
        font-size: 1.7rem;
        font-weight: 800;
        margin-bottom: 0.35rem;
    }

    .mc-call-meta {
        display: flex;
        flex-wrap: wrap;
        gap: 0.5rem;
        font-size: 1.05rem;
    }

    .mc-badge {
        border-radius: 999px;
        border: 1px solid var(--mc-border);
|
background: rgba(255, 255, 255, 0.05);
|
||||||
|
padding: 0.18rem 0.55rem;
|
||||||
|
font-size: 0.85rem;
|
||||||
|
color: var(--mc-text-muted);
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-bottom {
|
||||||
|
display: grid;
|
||||||
|
grid-template-columns: 1.2fr 1fr;
|
||||||
|
gap: 0.75rem;
|
||||||
|
min-height: 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-table,
|
||||||
|
.mc-feed {
|
||||||
|
max-height: 30vh;
|
||||||
|
overflow: auto;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-row {
|
||||||
|
display: grid;
|
||||||
|
grid-template-columns: 2fr 1fr 1fr;
|
||||||
|
gap: 0.5rem;
|
||||||
|
padding: 0.4rem 0;
|
||||||
|
border-bottom: 1px solid rgba(159, 179, 209, 0.12);
|
||||||
|
font-size: 0.95rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-row:last-child {
|
||||||
|
border-bottom: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-feed-item {
|
||||||
|
padding: 0.5rem 0;
|
||||||
|
border-bottom: 1px solid rgba(159, 179, 209, 0.12);
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-feed-item:last-child {
|
||||||
|
border-bottom: none;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-feed-title {
|
||||||
|
font-weight: 600;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-feed-meta {
|
||||||
|
color: var(--mc-text-muted);
|
||||||
|
font-size: 0.8rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-controls {
|
||||||
|
display: flex;
|
||||||
|
align-items: center;
|
||||||
|
gap: 0.7rem;
|
||||||
|
flex-wrap: wrap;
|
||||||
|
margin-top: 0.4rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-controls label {
|
||||||
|
color: var(--mc-text-muted);
|
||||||
|
font-size: 0.85rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-connection {
|
||||||
|
font-size: 0.8rem;
|
||||||
|
color: var(--mc-text-muted);
|
||||||
|
}
|
||||||
|
|
||||||
|
.mc-hidden {
|
||||||
|
display: none !important;
|
||||||
|
}
|
||||||
|
|
||||||
|
@media (max-width: 1300px) {
|
||||||
|
.mc-middle,
|
||||||
|
.mc-bottom {
|
||||||
|
grid-template-columns: 1fr;
|
||||||
|
}
|
||||||
|
.mc-kpi-grid {
|
||||||
|
grid-template-columns: repeat(2, minmax(0, 1fr));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
</style>
|
||||||
|
{% endblock %}
|
||||||

{% block content %}
<div class="mc-grid">
    <section class="mc-top">
        <div class="mc-card">
            <div id="alertContainer" class="mc-alert-empty">Ingen aktive driftsalarmer</div>
            <div class="mc-controls">
                <label><input type="checkbox" id="soundEnabledToggle" checked> Lyd aktiv</label>
                <label>Lydniveau <input type="range" id="soundVolume" min="0" max="100" value="70"></label>
                <span id="connectionState" class="mc-connection">Forbinder...</span>
            </div>
        </div>
    </section>

    <section class="mc-middle">
        <div class="mc-card">
            <h4 class="mb-3">Opgave-overblik</h4>
            <div id="kpiGrid" class="mc-kpi-grid"></div>
            <div id="callOverlay" class="mc-call-overlay">
                <div class="mc-call-title">Indgående opkald</div>
                <div id="callPrimary" style="font-size:1.35rem;font-weight:700;"></div>
                <div id="callSecondary" class="mc-call-meta mt-2"></div>
            </div>
        </div>

        <div class="mc-card">
            <h4 class="mb-3">Aktive opkald</h4>
            <div id="activeCallsList" class="mc-feed"></div>
        </div>
    </section>

    <section class="mc-bottom">
        <div class="mc-card">
            <h4 class="mb-3">Deadlines pr. medarbejder</h4>
            <div class="mc-row" style="font-weight:700;color:var(--mc-text-muted);text-transform:uppercase;font-size:0.75rem;">
                <div>Medarbejder</div>
                <div>I dag</div>
                <div>Overskredet</div>
            </div>
            <div id="deadlineTable" class="mc-table"></div>
        </div>

        <div class="mc-card">
            <h4 class="mb-3">Live aktivitetsfeed</h4>
            <div id="liveFeed" class="mc-feed"></div>
        </div>
    </section>
</div>

<script>
(() => {
    const kpiLabels = {
        open_cases: 'Åbne sager',
        new_cases: 'Nye sager',
        unassigned_cases: 'Uden ansvarlig',
        deadlines_today: 'Deadline i dag',
        overdue_deadlines: 'Overskredne'
    };

    const state = {
        ws: null,
        reconnectAttempts: 0,
        reconnectTimer: null,
        failures: 0,
        config: {
            sound_enabled: true,
            sound_volume: 70,
            sound_events: ['incoming_call', 'uptime_down', 'critical_event'],
            kpi_visible: Object.keys(kpiLabels),
            display_queues: []
        },
        activeCalls: [],
        activeAlerts: [],
        liveFeed: []
    };

    function updateConnectionLabel(text) {
        const el = document.getElementById('connectionState');
        if (el) el.textContent = text;
    }

    function playTone(type) {
        const soundEnabledToggle = document.getElementById('soundEnabledToggle');
        if (!soundEnabledToggle || !soundEnabledToggle.checked) return;

        if (!state.config.sound_events.includes(type)) return;

        const volumeSlider = document.getElementById('soundVolume');
        const volumePct = Number(volumeSlider?.value || state.config.sound_volume || 70);
        const gainValue = Math.max(0, Math.min(1, volumePct / 100));

        const AudioCtx = window.AudioContext || window.webkitAudioContext;
        if (!AudioCtx) return;

        const context = new AudioCtx();
        const oscillator = context.createOscillator();
        const gainNode = context.createGain();

        oscillator.type = 'sine';
        oscillator.frequency.value = type === 'uptime_down' ? 260 : 620;
        gainNode.gain.value = gainValue * 0.2;

        oscillator.connect(gainNode);
        gainNode.connect(context.destination);
        oscillator.start();
        oscillator.stop(context.currentTime + (type === 'uptime_down' ? 0.35 : 0.15));
    }

    function escapeHtml(str) {
        return String(str ?? '')
            .replaceAll('&', '&amp;')
            .replaceAll('<', '&lt;')
            .replaceAll('>', '&gt;')
            .replaceAll('"', '&quot;')
            .replaceAll("'", '&#39;');
    }

    function formatDate(value) {
        if (!value) return '-';
        const d = new Date(value);
        if (Number.isNaN(d.getTime())) return '-';
        return d.toLocaleString('da-DK');
    }

    function renderKpis(kpis = {}) {
        const container = document.getElementById('kpiGrid');
        if (!container) return;

        const visible = Array.isArray(state.config.kpi_visible) && state.config.kpi_visible.length
            ? state.config.kpi_visible
            : Object.keys(kpiLabels);

        container.innerHTML = visible.map((key) => {
            const value = Number(kpis[key] ?? 0);
            const variant = key === 'overdue_deadlines' && value > 0
                ? 'danger'
                : key === 'deadlines_today' && value > 0
                    ? 'warning'
                    : '';
            return `
                <div class="mc-kpi ${variant}">
                    <div class="label">${escapeHtml(kpiLabels[key] || key)}</div>
                    <div class="value">${value}</div>
                </div>
            `;
        }).join('');
    }

    function renderActiveCalls() {
        const list = document.getElementById('activeCallsList');
        const overlay = document.getElementById('callOverlay');
        const primary = document.getElementById('callPrimary');
        const secondary = document.getElementById('callSecondary');

        if (!list || !overlay || !primary || !secondary) return;

        const queueFilter = Array.isArray(state.config.display_queues) ? state.config.display_queues : [];
        const calls = state.activeCalls.filter(c => {
            if (!queueFilter.length) return true;
            return queueFilter.includes(c.queue_name);
        });

        if (!calls.length) {
            list.innerHTML = '<div class="mc-feed-meta">Ingen aktive opkald</div>';
            overlay.classList.remove('active');
            return;
        }

        const call = calls[0];
        overlay.classList.add('active');
        primary.textContent = `${call.queue_name || 'Ukendt kø'} • ${call.caller_number || 'Ukendt nummer'}`;
        secondary.innerHTML = [
            call.contact_name ? `<span class="mc-badge">${escapeHtml(call.contact_name)}</span>` : '',
            call.company_name ? `<span class="mc-badge">${escapeHtml(call.company_name)}</span>` : '',
            call.customer_tag ? `<span class="mc-badge">${escapeHtml(call.customer_tag)}</span>` : '',
            call.started_at ? `<span class="mc-badge">${escapeHtml(formatDate(call.started_at))}</span>` : ''
        ].join(' ');

        list.innerHTML = calls.map((item) => `
            <div class="mc-feed-item">
                <div class="mc-feed-title">${escapeHtml(item.queue_name || 'Ukendt kø')} • ${escapeHtml(item.caller_number || '-')}</div>
                <div class="mc-feed-meta">
                    ${escapeHtml(item.contact_name || 'Ukendt kontakt')}
                    ${item.company_name ? ` • ${escapeHtml(item.company_name)}` : ''}
                </div>
            </div>
        `).join('');
    }

    function renderAlerts() {
        const container = document.getElementById('alertContainer');
        if (!container) return;

        if (!state.activeAlerts.length) {
            container.className = 'mc-alert-empty';
            container.textContent = 'Ingen aktive driftsalarmer';
            return;
        }

        container.className = '';
        container.innerHTML = state.activeAlerts.map((alert) => `
            <div class="mc-alert-bar down mb-2">
                <span>🚨</span>
                <span>${escapeHtml(alert.service_name || 'Ukendt service')}</span>
                ${alert.customer_name ? `<span class="mc-badge">${escapeHtml(alert.customer_name)}</span>` : ''}
                <span class="mc-badge">Start: ${escapeHtml(formatDate(alert.started_at))}</span>
            </div>
        `).join('');
    }

    function renderDeadlines(rows = []) {
        const table = document.getElementById('deadlineTable');
        if (!table) return;
        if (!rows.length) {
            table.innerHTML = '<div class="mc-feed-meta py-2">Ingen deadlines i dag eller overskredne</div>';
            return;
        }
        table.innerHTML = rows.map((row) => `
            <div class="mc-row">
                <div>${escapeHtml(row.employee_name || 'Ukendt')}</div>
                <div>${Number(row.deadlines_today || 0)}</div>
                <div style="color:${Number(row.overdue_deadlines || 0) > 0 ? '#ff9d9d' : 'inherit'}">${Number(row.overdue_deadlines || 0)}</div>
            </div>
        `).join('');
    }

    function renderFeed() {
        const feed = document.getElementById('liveFeed');
        if (!feed) return;

        if (!state.liveFeed.length) {
            feed.innerHTML = '<div class="mc-feed-meta">Ingen events endnu</div>';
            return;
        }

        feed.innerHTML = state.liveFeed.slice(0, 20).map((event) => `
            <div class="mc-feed-item">
                <div class="mc-feed-title">${escapeHtml(event.title || event.event_type || 'Event')}</div>
                <div class="mc-feed-meta">${escapeHtml(event.event_type || 'event')} • ${escapeHtml(formatDate(event.created_at))}</div>
            </div>
        `).join('');
    }

    function renderState(payload) {
        if (!payload) return;
        state.config = { ...state.config, ...(payload.config || {}) };
        state.activeCalls = Array.isArray(payload.active_calls) ? payload.active_calls : state.activeCalls;
        state.activeAlerts = Array.isArray(payload.active_alerts) ? payload.active_alerts : state.activeAlerts;
        state.liveFeed = Array.isArray(payload.live_feed) ? payload.live_feed : state.liveFeed;

        const soundToggle = document.getElementById('soundEnabledToggle');
        const volumeSlider = document.getElementById('soundVolume');
        if (soundToggle) soundToggle.checked = !!state.config.sound_enabled;
        if (volumeSlider) volumeSlider.value = String(state.config.sound_volume || 70);

        renderKpis(payload.kpis || {});
        renderActiveCalls();
        renderAlerts();
        renderDeadlines(Array.isArray(payload.employee_deadlines) ? payload.employee_deadlines : []);
        renderFeed();
    }

    async function loadInitialState() {
        const res = await fetch('/api/v1/mission/state', { credentials: 'include' });
        if (!res.ok) throw new Error('Kunne ikke hente mission state');
        const payload = await res.json();
        renderState(payload);
    }

    function scheduleReconnect() {
        if (state.reconnectTimer) return;
        state.reconnectAttempts += 1;
        const delay = Math.min(30000, 1500 * state.reconnectAttempts);
        updateConnectionLabel(`Frakoblet • reconnect om ${Math.round(delay / 1000)}s`);
        state.reconnectTimer = setTimeout(() => {
            state.reconnectTimer = null;
            connectWs();
        }, delay);
    }

    function connectWs() {
        const proto = window.location.protocol === 'https:' ? 'wss' : 'ws';
        const url = `${proto}://${window.location.host}/api/v1/mission/ws`;
        state.ws = new WebSocket(url);

        state.ws.onopen = () => {
            state.reconnectAttempts = 0;
            updateConnectionLabel('Live forbindelse aktiv');
        };

        state.ws.onclose = () => {
            state.failures += 1;
            if (state.failures >= 12) {
                window.location.reload();
                return;
            }
            scheduleReconnect();
        };

        state.ws.onerror = () => {};

        state.ws.onmessage = (evt) => {
            try {
                const msg = JSON.parse(evt.data);
                const event = msg?.event;
                const data = msg?.data || {};

                if (event === 'mission_state') {
                    renderState(data);
                    return;
                }
                if (event === 'kpi_update') {
                    renderKpis(data);
                    return;
                }
                if (event === 'call_ringing') {
                    state.activeCalls = [data, ...state.activeCalls.filter(c => c.call_id !== data.call_id)];
                    renderActiveCalls();
                    playTone('incoming_call');
                    return;
                }
                if (event === 'call_answered' || event === 'call_hangup') {
                    const id = data.call_id;
                    state.activeCalls = state.activeCalls.filter(c => c.call_id !== id);
                    renderActiveCalls();
                    return;
                }
                if (event === 'uptime_alert') {
                    state.activeAlerts = Array.isArray(data.active_alerts) ? data.active_alerts : state.activeAlerts;
                    renderAlerts();
                    if ((data.status || '').toUpperCase() === 'DOWN') {
                        playTone('uptime_down');
                    }
                    return;
                }
                if (event === 'live_feed_event') {
                    state.liveFeed = [data, ...state.liveFeed.filter(item => item.id !== data.id)].slice(0, 20);
                    renderFeed();
                }
            } catch (error) {
                console.error('Mission message parse failed', error);
            }
        };
    }

    document.addEventListener('DOMContentLoaded', async () => {
        try {
            await loadInitialState();
        } catch (error) {
            updateConnectionLabel('Fejl ved initial load');
            console.error(error);
        }
        connectWs();
    });
})();
</script>
{% endblock %}
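The template's `scheduleReconnect` uses a capped linear backoff: `Math.min(30000, 1500 * state.reconnectAttempts)`, so delays grow by 1.5 s per failed attempt and plateau at 30 s (until the 12-failure hard reload). A minimal Python sketch of that delay schedule, using a hypothetical helper name not present in the repo:

```python
def reconnect_delay_ms(attempt: int, step_ms: int = 1500, cap_ms: int = 30000) -> int:
    """Linear backoff capped at cap_ms, mirroring the template's
    Math.min(30000, 1500 * state.reconnectAttempts)."""
    return min(cap_ms, step_ms * attempt)

# Delays climb linearly, then plateau once step_ms * attempt exceeds the cap.
schedule = [reconnect_delay_ms(n) for n in range(1, 25)]
```

Attempt 20 (1500 × 20 = 30000) is the first attempt to hit the cap; everything after stays at 30 s.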
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@ -74,9 +74,16 @@ class VendorBase(BaseModel):
     domain: Optional[str] = None
     email: Optional[str] = None
     phone: Optional[str] = None
+    address: Optional[str] = None
+    postal_code: Optional[str] = None
+    city: Optional[str] = None
+    website: Optional[str] = None
+    email_pattern: Optional[str] = None
     contact_person: Optional[str] = None
     category: Optional[str] = None
+    priority: Optional[int] = 100
     notes: Optional[str] = None
+    is_active: bool = True


 class VendorCreate(VendorBase):
@ -100,10 +107,9 @@ class VendorUpdate(BaseModel):
 class Vendor(VendorBase):
     """Full vendor schema"""
     id: int
-    is_active: bool = True
     created_at: datetime
     updated_at: Optional[datetime] = None

     class Config:
         from_attributes = True

@ -274,6 +280,7 @@ class TodoStepCreate(TodoStepBase):
 class TodoStepUpdate(BaseModel):
     """Schema for updating a todo step"""
     is_done: Optional[bool] = None
+    is_next: Optional[bool] = None


 class TodoStep(TodoStepBase):
@ -281,6 +288,7 @@ class TodoStep(TodoStepBase):
     id: int
     sag_id: int
     is_done: bool
+    is_next: bool = False
     created_by_user_id: Optional[int] = None
     created_by_name: Optional[str] = None
     created_at: datetime
@ -1,6 +1,7 @@
 import logging
 import os
 import shutil
+import json
 from pathlib import Path
 from datetime import datetime
 from typing import List, Optional
@ -26,6 +27,11 @@ logger = logging.getLogger(__name__)
 router = APIRouter()


+def _table_exists(table_name: str) -> bool:
+    row = execute_query_single("SELECT to_regclass(%s) AS table_name", (f"public.{table_name}",))
+    return bool(row and row.get("table_name"))
+
+
 def _get_user_id_from_request(request: Request) -> int:
     user_id = getattr(request.state, "user_id", None)
     if user_id is not None:
@ -45,15 +51,64 @@ def _get_user_id_from_request(request: Request) -> int:


 def _normalize_case_status(status_value: Optional[str]) -> str:
+    allowed_statuses = []
+    seen = set()
+
+    def _add_status(value: Optional[str]) -> None:
+        candidate = str(value or "").strip()
+        if not candidate:
+            return
+        key = candidate.lower()
+        if key in seen:
+            return
+        seen.add(key)
+        allowed_statuses.append(candidate)
+
+    try:
+        setting_row = execute_query_single("SELECT value FROM settings WHERE key = %s", ("case_statuses",))
+        if setting_row and setting_row.get("value"):
+            parsed = json.loads(setting_row.get("value") or "[]")
+            for item in parsed if isinstance(parsed, list) else []:
+                if isinstance(item, str):
+                    value = item.strip()
+                elif isinstance(item, dict):
+                    value = str(item.get("value") or "").strip()
+                else:
+                    value = ""
+                _add_status(value)
+    except Exception:
+        pass
+
+    # Include historical/current DB statuses so legacy values remain valid
+    try:
+        rows = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ()) or []
+        for row in rows:
+            _add_status(row.get("status"))
+    except Exception:
+        pass
+
+    if not allowed_statuses:
+        allowed_statuses = ["åben", "under behandling", "afventer", "løst", "lukket"]
+
+    allowed_map = {s.lower(): s for s in allowed_statuses}
+
     if not status_value:
-        return "åben"
+        return allowed_map.get("åben", allowed_statuses[0])

     normalized = str(status_value).strip().lower()
-    if normalized == "afventer":
-        return "åben"
-    if normalized in {"åben", "lukket"}:
-        return normalized
-    return "åben"
+    if normalized in allowed_map:
+        return allowed_map[normalized]
+
+    # Backward compatibility for legacy mapping
+    if normalized == "afventer" and "åben" in allowed_map:
+        return allowed_map["åben"]
+
+    # Do not force unknown values back to default; preserve user-entered/custom DB values
+    raw_value = str(status_value).strip()
+    if raw_value:
+        return raw_value
+
+    return allowed_map.get("åben", allowed_statuses[0])


 def _normalize_optional_timestamp(value: Optional[str], field_name: str) -> Optional[str]:
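The reworked `_normalize_case_status` resolves a status in a fixed order: empty input falls back to "åben" (or the first allowed status), a case-insensitive hit returns the canonical casing, the legacy "afventer" value maps to "åben", and unknown non-empty values are now preserved instead of being forced to the default. A dependency-free sketch of that lookup order, with the settings/DB queries replaced by a static `allowed` list (an assumption for illustration):

```python
def normalize_status(status_value, allowed):
    """Sketch of the lookup order in the reworked _normalize_case_status,
    with the settings/DB-backed allowed list passed in as a plain list."""
    allowed_map = {s.lower(): s for s in allowed}
    if not status_value:
        return allowed_map.get("åben", allowed[0])
    normalized = str(status_value).strip().lower()
    if normalized in allowed_map:
        return allowed_map[normalized]          # canonical casing wins
    if normalized == "afventer" and "åben" in allowed_map:
        return allowed_map["åben"]              # legacy mapping
    # Unknown but non-empty values are preserved, not reset to the default.
    return str(status_value).strip() or allowed_map.get("åben", allowed[0])
```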
@ -109,6 +164,30 @@ class QuickCreateRequest(BaseModel):
     user_id: int


+class SagSendEmailRequest(BaseModel):
+    to: List[str]
+    subject: str = Field(..., min_length=1, max_length=998)
+    body_text: str = Field(..., min_length=1)
+    cc: List[str] = Field(default_factory=list)
+    bcc: List[str] = Field(default_factory=list)
+    body_html: Optional[str] = None
+    attachment_file_ids: List[int] = Field(default_factory=list)
+    thread_email_id: Optional[int] = None
+    thread_key: Optional[str] = None
+
+
+def _normalize_email_list(values: List[str], field_name: str) -> List[str]:
+    cleaned: List[str] = []
+    for value in values or []:
+        candidate = str(value or "").strip()
+        if not candidate:
+            continue
+        if "@" not in candidate or "." not in candidate.split("@")[-1]:
+            raise HTTPException(status_code=400, detail=f"Invalid email in {field_name}: {candidate}")
+        cleaned.append(candidate)
+    return list(dict.fromkeys(cleaned))
+
+
 @router.post("/sag/analyze-quick-create", response_model=QuickCreateAnalysis)
 async def analyze_quick_create(request: QuickCreateRequest):
     """
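The new `_normalize_email_list` trims each address, applies a cheap shape check ('@' plus a dot in the domain part), rejects bad entries, and de-duplicates while keeping first-seen order via `dict.fromkeys`. A standalone sketch of that behavior, with `HTTPException` swapped for `ValueError` so it runs outside FastAPI (that swap is an assumption for illustration):

```python
def normalize_email_list(values, field_name="to"):
    """Trim, validate, and de-duplicate recipients, preserving first-seen order."""
    cleaned = []
    for value in values or []:
        candidate = str(value or "").strip()
        if not candidate:
            continue  # silently drop empty entries
        # Same cheap shape check as the endpoint: '@' plus a dot in the domain.
        if "@" not in candidate or "." not in candidate.split("@")[-1]:
            raise ValueError(f"Invalid email in {field_name}: {candidate}")
        cleaned.append(candidate)
    # dict.fromkeys keeps insertion order, dropping later duplicates.
    return list(dict.fromkeys(cleaned))
```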
@ -213,6 +292,10 @@ async def list_all_sale_items(
 ):
     """List all sale items across cases (orders overview)."""
     try:
+        if not _table_exists("sag_salgsvarer"):
+            logger.warning("⚠️ sag_salgsvarer table missing - returning empty sale items list")
+            return []
+
         query = """
             SELECT si.*, s.titel AS sag_titel, s.customer_id, c.name AS customer_name
             FROM sag_salgsvarer si
@ -378,7 +461,7 @@ async def list_todo_steps(sag_id: int):
             LEFT JOIN users u_created ON u_created.user_id = t.created_by_user_id
             LEFT JOIN users u_completed ON u_completed.user_id = t.completed_by_user_id
             WHERE t.sag_id = %s AND t.deleted_at IS NULL
-            ORDER BY t.is_done ASC, t.due_date NULLS LAST, t.created_at DESC
+            ORDER BY t.is_done ASC, t.is_next DESC, t.due_date NULLS LAST, t.created_at DESC
         """
         return execute_query(query, (sag_id,)) or []
     except Exception as e:
@ -433,34 +516,63 @@ async def create_todo_step(sag_id: int, request: Request, data: TodoStepCreate):
 @router.patch("/sag/todo-steps/{step_id}", response_model=TodoStep)
 async def update_todo_step(step_id: int, request: Request, data: TodoStepUpdate):
     try:
-        if data.is_done is None:
-            raise HTTPException(status_code=400, detail="is_done is required")
-        user_id = _get_user_id_from_request(request)
-        if data.is_done:
-            update_query = """
-                UPDATE sag_todo_steps
-                SET is_done = TRUE,
-                    completed_by_user_id = %s,
-                    completed_at = CURRENT_TIMESTAMP
-                WHERE id = %s AND deleted_at IS NULL
-                RETURNING id
-            """
-            result = execute_query(update_query, (user_id, step_id))
-        else:
-            update_query = """
-                UPDATE sag_todo_steps
-                SET is_done = FALSE,
-                    completed_by_user_id = NULL,
-                    completed_at = NULL
-                WHERE id = %s AND deleted_at IS NULL
-                RETURNING id
-            """
-            result = execute_query(update_query, (step_id,))
-
-        if not result:
+        if data.is_done is None and data.is_next is None:
+            raise HTTPException(status_code=400, detail="Provide is_done or is_next")
+
+        step_row = execute_query_single(
+            "SELECT id, sag_id, is_done FROM sag_todo_steps WHERE id = %s AND deleted_at IS NULL",
+            (step_id,)
+        )
+        if not step_row:
             raise HTTPException(status_code=404, detail="Todo step not found")
+
+        if data.is_done is not None:
+            user_id = _get_user_id_from_request(request)
+            if data.is_done:
+                update_query = """
+                    UPDATE sag_todo_steps
+                    SET is_done = TRUE,
+                        is_next = FALSE,
+                        completed_by_user_id = %s,
+                        completed_at = CURRENT_TIMESTAMP
+                    WHERE id = %s AND deleted_at IS NULL
+                    RETURNING id
+                """
+                execute_query(update_query, (user_id, step_id))
+            else:
+                update_query = """
+                    UPDATE sag_todo_steps
+                    SET is_done = FALSE,
+                        completed_by_user_id = NULL,
+                        completed_at = NULL
+                    WHERE id = %s AND deleted_at IS NULL
+                    RETURNING id
+                """
+                execute_query(update_query, (step_id,))
+
+        if data.is_next is not None:
+            if step_row.get("is_done") and data.is_next:
+                raise HTTPException(status_code=400, detail="Completed todo cannot be marked as next")
+
+            if data.is_next:
+                execute_query(
+                    """
+                    UPDATE sag_todo_steps
+                    SET is_next = FALSE
+                    WHERE sag_id = %s AND deleted_at IS NULL
+                    """,
+                    (step_row["sag_id"],)
+                )
+
+            execute_query(
+                """
+                UPDATE sag_todo_steps
+                SET is_next = %s
+                WHERE id = %s AND deleted_at IS NULL
+                """,
+                (bool(data.is_next), step_id)
+            )
+
         return execute_query(
             """
             SELECT
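The PATCH handler keeps at most one `is_next` step per case: before setting the flag it clears `is_next` for every step in the case, and a completed step refuses the flag. An in-memory sketch of that invariant, with hypothetical dict records standing in for `sag_todo_steps` rows:

```python
def mark_next(steps, step_id):
    """Clear is_next across the case, then set it on the chosen step,
    mirroring the two UPDATEs in update_todo_step."""
    target = next(s for s in steps if s["id"] == step_id)
    if target["is_done"]:
        # Matches the endpoint's 400: a completed todo cannot be the next step.
        raise ValueError("Completed todo cannot be marked as next")
    for s in steps:
        s["is_next"] = False   # first UPDATE: clear the flag case-wide
    target["is_next"] = True   # second UPDATE: set it on the target row
    return steps
```

Doing the clear-then-set in two statements keeps the invariant even if the previous "next" step is a different row.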
@ -519,8 +631,12 @@ async def update_sag(sag_id: int, updates: dict):
         updates["status"] = _normalize_case_status(updates.get("status"))
     if "deadline" in updates:
         updates["deadline"] = _normalize_optional_timestamp(updates.get("deadline"), "deadline")
+    if "start_date" in updates:
+        updates["start_date"] = _normalize_optional_timestamp(updates.get("start_date"), "start_date")
     if "deferred_until" in updates:
         updates["deferred_until"] = _normalize_optional_timestamp(updates.get("deferred_until"), "deferred_until")
+    if "priority" in updates:
+        updates["priority"] = (str(updates.get("priority") or "").strip().lower() or "normal")
     if "ansvarlig_bruger_id" in updates:
         updates["ansvarlig_bruger_id"] = _coerce_optional_int(updates.get("ansvarlig_bruger_id"), "ansvarlig_bruger_id")
         _validate_user_id(updates["ansvarlig_bruger_id"])
@@ -536,6 +652,8 @@ async def update_sag(sag_id: int, updates: dict):
         "status",
         "ansvarlig_bruger_id",
         "assigned_group_id",
+        "priority",
+        "start_date",
         "deadline",
         "deferred_until",
         "deferred_until_case_id",
@@ -568,6 +686,86 @@ async def update_sag(sag_id: int, updates: dict):
         raise HTTPException(status_code=500, detail="Failed to update case")
+
+
+# ---------------------------------------------------------------------------
+# Beskrivelse inline editing with history
+# ---------------------------------------------------------------------------
+
+class BeskrivelsePatch(BaseModel):
+    beskrivelse: str
+
+
+@router.patch("/sag/{sag_id}/beskrivelse")
+async def update_sag_beskrivelse(sag_id: int, body: BeskrivelsePatch, request: Request):
+    """Update case description and store a change history entry."""
+    try:
+        row = execute_query_single(
+            "SELECT id, beskrivelse FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
+            (sag_id,)
+        )
+        if not row:
+            raise HTTPException(status_code=404, detail="Case not found")
+
+        old_beskrivelse = row.get("beskrivelse")
+        new_beskrivelse = body.beskrivelse
+
+        # Resolve acting user (may be None for anonymous)
+        user_id = _get_user_id_from_request(request)
+        changed_by_name = None
+        if user_id:
+            u = execute_query_single(
+                "SELECT COALESCE(full_name, username, CONCAT('Bruger #', user_id::text)) AS name FROM users WHERE user_id = %s",
+                (user_id,)
+            )
+            if u:
+                changed_by_name = u["name"]
+
+        # Write history entry
+        execute_query(
+            """INSERT INTO sag_beskrivelse_history
+               (sag_id, beskrivelse_before, beskrivelse_after, changed_by_user_id, changed_by_name)
+               VALUES (%s, %s, %s, %s, %s)""",
+            (sag_id, old_beskrivelse, new_beskrivelse, user_id, changed_by_name)
+        )
+
+        # Update the case
+        execute_query(
+            "UPDATE sag_sager SET beskrivelse = %s, updated_at = NOW() WHERE id = %s",
+            (new_beskrivelse, sag_id)
+        )
+
+        logger.info("✅ Beskrivelse updated for sag %s by user %s", sag_id, user_id)
+        return {"ok": True, "beskrivelse": new_beskrivelse}
+
+    except HTTPException:
+        raise
+    except Exception as e:
+        logger.error("❌ Error updating beskrivelse for sag %s: %s", sag_id, e)
+        raise HTTPException(status_code=500, detail="Failed to update description")
+
+
+@router.get("/sag/{sag_id}/beskrivelse/history")
+async def get_sag_beskrivelse_history(sag_id: int):
+    """Return the change history for a case's description, newest first."""
+    exists = execute_query_single(
+        "SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
+        (sag_id,)
+    )
+    if not exists:
+        raise HTTPException(status_code=404, detail="Case not found")
+
+    rows = execute_query(
+        """SELECT id, beskrivelse_before, beskrivelse_after,
+                  changed_by_name, changed_at
+           FROM sag_beskrivelse_history
+           WHERE sag_id = %s
+           ORDER BY changed_at DESC
+           LIMIT 50""",
+        (sag_id,)
+    ) or []
+
+    return rows
+
+
 class PipelineUpdate(BaseModel):
     amount: Optional[float] = None
     probability: Optional[int] = Field(default=None, ge=0, le=100)
@@ -748,6 +946,15 @@ async def delete_relation(sag_id: int, relation_id: int):
 # TAGS - Case Tags
 # ============================================================================
+
+@router.get("/sag/tags/all")
+async def get_all_tags():
+    """Return all distinct tag names across all cases (for autocomplete)."""
+    rows = execute_query(
+        "SELECT DISTINCT tag_navn FROM sag_tags WHERE deleted_at IS NULL ORDER BY tag_navn ASC LIMIT 200"
+    ) or []
+    return rows
+
+
 @router.get("/sag/{sag_id}/tags")
 async def get_tags(sag_id: int):
     """Get all tags for a case."""
@@ -1205,6 +1412,10 @@ async def get_varekob_salg(sag_id: int, include_subcases: bool = True):
     if not check:
         raise HTTPException(status_code=404, detail="Case not found")
+
+    has_sale_items_table = _table_exists("sag_salgsvarer")
+    if not has_sale_items_table:
+        logger.warning("⚠️ sag_salgsvarer table missing - sale item aggregation skipped for sag_id=%s", sag_id)
+
     if include_subcases:
         case_tree_query = """
             WITH RECURSIVE normalized_relations AS (
@@ -1268,36 +1479,39 @@ async def get_varekob_salg(sag_id: int, include_subcases: bool = True):
         """
         time_entries = execute_query(time_query, (sag_id,))

-        sale_items_query = """
-            WITH RECURSIVE normalized_relations AS (
-                SELECT
-                    CASE
-                        WHEN LOWER(relationstype) IN ('afledt af', 'afledt_af') THEN målsag_id
-                        WHEN LOWER(relationstype) IN ('årsag til', 'årsag_til') THEN kilde_sag_id
-                        ELSE kilde_sag_id
-                    END AS parent_id,
-                    CASE
-                        WHEN LOWER(relationstype) IN ('afledt af', 'afledt_af') THEN kilde_sag_id
-                        WHEN LOWER(relationstype) IN ('årsag til', 'årsag_til') THEN målsag_id
-                        ELSE målsag_id
-                    END AS child_id
-                FROM sag_relationer
-                WHERE deleted_at IS NULL
-            ),
-            case_tree AS (
-                SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL
-                UNION
-                SELECT nr.child_id
-                FROM normalized_relations nr
-                JOIN case_tree ct ON nr.parent_id = ct.id
-            )
-            SELECT si.*, s.titel AS source_sag_titel
-            FROM sag_salgsvarer si
-            JOIN case_tree ct ON si.sag_id = ct.id
-            LEFT JOIN sag_sager s ON s.id = si.sag_id
-            ORDER BY si.line_date DESC NULLS LAST, si.id DESC
-        """
-        sale_items = execute_query(sale_items_query, (sag_id,))
+        if has_sale_items_table:
+            sale_items_query = """
+                WITH RECURSIVE normalized_relations AS (
+                    SELECT
+                        CASE
+                            WHEN LOWER(relationstype) IN ('afledt af', 'afledt_af') THEN målsag_id
+                            WHEN LOWER(relationstype) IN ('årsag til', 'årsag_til') THEN kilde_sag_id
+                            ELSE kilde_sag_id
+                        END AS parent_id,
+                        CASE
+                            WHEN LOWER(relationstype) IN ('afledt af', 'afledt_af') THEN kilde_sag_id
+                            WHEN LOWER(relationstype) IN ('årsag til', 'årsag_til') THEN målsag_id
+                            ELSE målsag_id
+                        END AS child_id
+                    FROM sag_relationer
+                    WHERE deleted_at IS NULL
+                ),
+                case_tree AS (
+                    SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL
+                    UNION
+                    SELECT nr.child_id
+                    FROM normalized_relations nr
+                    JOIN case_tree ct ON nr.parent_id = ct.id
+                )
+                SELECT si.*, s.titel AS source_sag_titel
+                FROM sag_salgsvarer si
+                JOIN case_tree ct ON si.sag_id = ct.id
+                LEFT JOIN sag_sager s ON s.id = si.sag_id
+                ORDER BY si.line_date DESC NULLS LAST, si.id DESC
+            """
+            sale_items = execute_query(sale_items_query, (sag_id,))
+        else:
+            sale_items = []
     else:
         case_tree = execute_query(
             "SELECT id, titel FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
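The `normalized_relations` / `case_tree` recursive CTE in the hunk above can be approximated in plain Python to see how edges are normalized and the subtree is collected. This is an illustrative sketch under the same naming assumptions (relationstype, kilde_sag_id, målsag_id), not code from the repository:

```python
def collect_case_tree(root_id, relations):
    """relations: list of (relationstype, kilde_sag_id, målsag_id) tuples.

    Normalize each relation to a (parent, child) edge the way the CTE
    does, then walk from root_id collecting every reachable child."""
    edges = []
    for rel_type, kilde, maal in relations:
        t = rel_type.lower()
        if t in ("afledt af", "afledt_af"):
            edges.append((maal, kilde))   # "derived from": the target is the parent
        else:
            edges.append((kilde, maal))   # "årsag til" and the default direction

    tree, frontier = {root_id}, [root_id]
    while frontier:
        parent = frontier.pop()
        for p, c in edges:
            if p == parent and c not in tree:
                tree.add(c)
                frontier.append(c)
    return tree

print(sorted(collect_case_tree(1, [("afledt af", 2, 1), ("årsag til", 2, 3)])))  # [1, 2, 3]
```

Note that, like `UNION` in the CTE, the visited set guards against cycles in the relation graph.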
@@ -1312,14 +1526,17 @@ async def get_varekob_salg(sag_id: int, include_subcases: bool = True):
         """
         time_entries = execute_query(time_query, (sag_id,))

-        sale_items_query = """
-            SELECT si.*, s.titel AS source_sag_titel
-            FROM sag_salgsvarer si
-            LEFT JOIN sag_sager s ON s.id = si.sag_id
-            WHERE si.sag_id = %s
-            ORDER BY si.line_date DESC NULLS LAST, si.id DESC
-        """
-        sale_items = execute_query(sale_items_query, (sag_id,))
+        if has_sale_items_table:
+            sale_items_query = """
+                SELECT si.*, s.titel AS source_sag_titel
+                FROM sag_salgsvarer si
+                LEFT JOIN sag_sager s ON s.id = si.sag_id
+                WHERE si.sag_id = %s
+                ORDER BY si.line_date DESC NULLS LAST, si.id DESC
+            """
+            sale_items = execute_query(sale_items_query, (sag_id,))
+        else:
+            sale_items = []

     total_entries = len(time_entries or [])
     total_hours = 0
@@ -1497,6 +1714,10 @@ async def list_sale_items(sag_id: int):
     if not check:
         raise HTTPException(status_code=404, detail="Case not found")
+
+    if not _table_exists("sag_salgsvarer"):
+        logger.warning("⚠️ sag_salgsvarer table missing - returning empty sale items list for sag_id=%s", sag_id)
+        return []
+
     query = """
         SELECT si.*, s.titel AS source_sag_titel
         FROM sag_salgsvarer si
@@ -1893,11 +2114,25 @@ async def add_sag_email_link(sag_id: int, payload: dict):
 async def get_sag_emails(sag_id: int):
     """Get emails linked to a case."""
     query = """
-        SELECT e.*
-        FROM email_messages e
-        JOIN sag_emails se ON e.id = se.email_id
-        WHERE se.sag_id = %s
-        ORDER BY e.received_date DESC
+        WITH linked_emails AS (
+            SELECT
+                e.*,
+                COALESCE(
+                    NULLIF(REGEXP_REPLACE(TRIM(COALESCE(e.in_reply_to, '')), '[<>\\s]', '', 'g'), ''),
+                    NULLIF(REGEXP_REPLACE((REGEXP_SPLIT_TO_ARRAY(COALESCE(e.email_references, ''), E'[\\s,]+'))[1], '[<>\\s]', '', 'g'), ''),
+                    NULLIF(REGEXP_REPLACE(TRIM(COALESCE(e.message_id, '')), '[<>\\s]', '', 'g'), ''),
+                    CONCAT('email-', e.id::text)
+                ) AS thread_key
+            FROM email_messages e
+            JOIN sag_emails se ON e.id = se.email_id
+            WHERE se.sag_id = %s
+        )
+        SELECT
+            linked_emails.*,
+            COUNT(*) OVER (PARTITION BY linked_emails.thread_key) AS thread_message_count,
+            MAX(linked_emails.received_date) OVER (PARTITION BY linked_emails.thread_key) AS thread_last_received_date
+        FROM linked_emails
+        ORDER BY thread_last_received_date DESC NULLS LAST, received_date DESC
     """
     return execute_query(query, (sag_id,)) or []
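The `thread_key` COALESCE chain in the hunk above can be sketched in Python: prefer In-Reply-To, then the first References entry, then the message's own Message-ID, and finally a synthetic per-row key. Field names follow the query; this is an illustrative approximation, not repository code:

```python
import re

def derive_thread_key(email: dict) -> str:
    """Pick the first non-empty candidate, stripping angle brackets
    and whitespace the way the SQL REGEXP_REPLACE does."""
    def clean(raw):
        value = re.sub(r"[<>\s]", "", (raw or "").strip())
        return value or None

    refs = re.split(r"[\s,]+", email.get("email_references") or "")
    return (
        clean(email.get("in_reply_to"))
        or clean(refs[0] if refs else "")
        or clean(email.get("message_id"))
        or f"email-{email['id']}"
    )

reply = {"id": 7, "in_reply_to": "<abc@mail>", "email_references": "", "message_id": "<def@mail>"}
orphan = {"id": 8, "in_reply_to": None, "email_references": None, "message_id": None}
print(derive_thread_key(reply))   # abc@mail
print(derive_thread_key(orphan))  # email-8
```

Partitioning the window functions by this key groups replies with their root message even when only one side of the thread is stored.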
@@ -1969,6 +2204,8 @@ async def upload_sag_email(sag_id: int, file: UploadFile = File(...)):

     email_data = {
         'message_id': msg.get('Message-ID', f"eml-{temp_id}"),
+        'in_reply_to': _decode_header_str(msg.get('In-Reply-To', '')),
+        'email_references': _decode_header_str(msg.get('References', '')),
         'subject': _decode_header_str(msg.get('Subject', 'No Subject')),
         'sender_email': _decode_header_str(msg.get('From', '')),
         'sender_name': _decode_header_str(msg.get('From', '')),
@@ -1999,6 +2236,230 @@ async def upload_sag_email(sag_id: int, file: UploadFile = File(...)):
     await add_sag_email_link(sag_id, {"email_id": email_id})
     return {"status": "imported", "email_id": email_id}
+
+
+@router.post("/sag/{sag_id}/emails/send")
+async def send_sag_email(sag_id: int, payload: SagSendEmailRequest):
+    """Send outbound email directly from case email tab and link it to case."""
+    case_exists = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
+    if not case_exists:
+        raise HTTPException(status_code=404, detail="Case not found")
+
+    to_addresses = _normalize_email_list(payload.to, "to")
+    cc_addresses = _normalize_email_list(payload.cc, "cc")
+    bcc_addresses = _normalize_email_list(payload.bcc, "bcc")
+
+    if not to_addresses:
+        raise HTTPException(status_code=400, detail="At least one recipient in 'to' is required")
+
+    subject = (payload.subject or "").strip()
+    body_text = (payload.body_text or "").strip()
+    if not subject:
+        raise HTTPException(status_code=400, detail="subject is required")
+    if not body_text:
+        raise HTTPException(status_code=400, detail="body_text is required")
+
+    attachment_rows = []
+    attachment_ids = list(dict.fromkeys(payload.attachment_file_ids or []))
+    if attachment_ids:
+        placeholders = ",".join(["%s"] * len(attachment_ids))
+        attachment_query = f"""
+            SELECT id, filename, content_type, size_bytes, stored_name
+            FROM sag_files
+            WHERE sag_id = %s AND id IN ({placeholders})
+        """
+        attachment_rows = execute_query(attachment_query, (sag_id, *attachment_ids))
+
+        if len(attachment_rows) != len(attachment_ids):
+            raise HTTPException(status_code=400, detail="One or more selected attachments were not found on this case")
+
+    smtp_attachments = []
+    for row in attachment_rows:
+        path = _resolve_attachment_path(row["stored_name"])
+        if not path.exists():
+            raise HTTPException(status_code=404, detail=f"Attachment file is missing on server: {row['filename']}")
+
+        smtp_attachments.append({
+            "filename": row["filename"],
+            "content_type": row.get("content_type") or "application/octet-stream",
+            "content": path.read_bytes(),
+            "size": row.get("size_bytes") or 0,
+            "file_path": str(path),
+        })
+
+    in_reply_to_header = None
+    references_header = None
+    if payload.thread_email_id:
+        thread_row = None
+        try:
+            thread_row = execute_query_single(
+                """
+                SELECT id, message_id, in_reply_to, email_references
+                FROM email_messages
+                WHERE id = %s
+                """,
+                (payload.thread_email_id,),
+            )
+        except Exception:
+            # Backward compatibility for DBs without in_reply_to/email_references columns.
+            thread_row = execute_query_single(
+                """
+                SELECT id, message_id
+                FROM email_messages
+                WHERE id = %s
+                """,
+                (payload.thread_email_id,),
+            )
+        if thread_row:
+            base_message_id = str(thread_row.get("message_id") or "").strip()
+            if base_message_id and not base_message_id.startswith("<"):
+                base_message_id = f"<{base_message_id}>"
+            if base_message_id:
+                in_reply_to_header = base_message_id
+
+                existing_refs = str(thread_row.get("email_references") or "").strip()
+                if existing_refs:
+                    references_header = f"{existing_refs} {base_message_id}".strip()
+                else:
+                    references_header = base_message_id
+
+    email_service = EmailService()
+    success, send_message, generated_message_id = await email_service.send_email_with_attachments(
+        to_addresses=to_addresses,
+        subject=subject,
+        body_text=body_text,
+        body_html=payload.body_html,
+        cc=cc_addresses,
+        bcc=bcc_addresses,
+        in_reply_to=in_reply_to_header,
+        references=references_header,
+        attachments=smtp_attachments,
+        respect_dry_run=False,
+    )
+
+    if not success:
+        logger.error("❌ Failed to send case email for case %s: %s", sag_id, send_message)
+        raise HTTPException(status_code=500, detail="Failed to send email")
+
+    sender_name = settings.EMAIL_SMTP_FROM_NAME or "BMC Hub"
+    sender_email = settings.EMAIL_SMTP_FROM_ADDRESS or ""
+
+    insert_result = None
+    try:
+        insert_email_query = """
+            INSERT INTO email_messages (
+                message_id, subject, sender_email, sender_name,
+                recipient_email, cc, body_text, body_html,
+                in_reply_to, email_references,
+                received_date, folder, has_attachments, attachment_count,
+                status, import_method, linked_case_id
+            )
+            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+            RETURNING id
+        """
+        insert_result = execute_query(
+            insert_email_query,
+            (
+                generated_message_id,
+                subject,
+                sender_email,
+                sender_name,
+                ", ".join(to_addresses),
+                ", ".join(cc_addresses),
+                body_text,
+                payload.body_html,
+                in_reply_to_header,
+                references_header,
+                datetime.now(),
+                "Sent",
+                bool(smtp_attachments),
+                len(smtp_attachments),
+                "sent",
+                "sag_outbound",
+                sag_id,
+            ),
+        )
+    except Exception:
+        insert_email_query = """
+            INSERT INTO email_messages (
+                message_id, subject, sender_email, sender_name,
+                recipient_email, cc, body_text, body_html,
+                received_date, folder, has_attachments, attachment_count,
+                status, import_method, linked_case_id
+            )
+            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
+            RETURNING id
+        """
+        insert_result = execute_query(
+            insert_email_query,
+            (
+                generated_message_id,
+                subject,
+                sender_email,
+                sender_name,
+                ", ".join(to_addresses),
+                ", ".join(cc_addresses),
+                body_text,
+                payload.body_html,
+                datetime.now(),
+                "Sent",
+                bool(smtp_attachments),
+                len(smtp_attachments),
+                "sent",
+                "sag_outbound",
+                sag_id,
+            ),
+        )
+
+    if not insert_result:
+        logger.error("❌ Email sent but outbound log insert failed for case %s", sag_id)
+        raise HTTPException(status_code=500, detail="Email sent but logging failed")
+
+    email_id = insert_result[0]["id"]
+
+    if smtp_attachments:
+        from psycopg2 import Binary
+
+        for attachment in smtp_attachments:
+            execute_query(
+                """
+                INSERT INTO email_attachments (
+                    email_id, filename, content_type, size_bytes, file_path, content_data
+                )
+                VALUES (%s, %s, %s, %s, %s, %s)
+                """,
+                (
+                    email_id,
+                    attachment["filename"],
+                    attachment["content_type"],
+                    attachment.get("size") or len(attachment["content"]),
+                    attachment.get("file_path"),
+                    Binary(attachment["content"]),
+                ),
+            )
+
+    execute_query(
+        """
+        INSERT INTO sag_emails (sag_id, email_id)
+        VALUES (%s, %s)
+        ON CONFLICT DO NOTHING
+        """,
+        (sag_id, email_id),
+    )
+
+    logger.info(
+        "✅ Outbound case email sent and linked (case=%s, email_id=%s, thread_email_id=%s, thread_key=%s, recipients=%s)",
+        sag_id,
+        email_id,
+        payload.thread_email_id,
+        payload.thread_key,
+        ", ".join(to_addresses),
+    )
+    return {
+        "status": "sent",
+        "email_id": email_id,
+        "message": send_message,
+    }
+
+
 # ============================================================================
 # SOLUTIONS
 # ============================================================================
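The threading-header construction in the send endpoint above (wrap the parent's Message-ID in angle brackets if needed, then append it to any existing References chain) can be sketched in isolation. This is an illustrative helper under the same field-name assumptions, not repository code:

```python
def build_reply_headers(thread_row: dict):
    """Return (in_reply_to, references) for a reply to thread_row,
    or (None, None) when the parent has no usable Message-ID."""
    base = str(thread_row.get("message_id") or "").strip()
    if base and not base.startswith("<"):
        base = f"<{base}>"           # RFC 5322 msg-ids are angle-bracketed
    if not base:
        return None, None
    existing = str(thread_row.get("email_references") or "").strip()
    references = f"{existing} {base}".strip() if existing else base
    return base, references

print(build_reply_headers({"message_id": "abc@mail", "email_references": "<root@mail>"}))
# ('<abc@mail>', '<root@mail> <abc@mail>')
```

Appending to References rather than replacing it keeps the full ancestor chain, which is what most mail clients use to rebuild the conversation view.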
@@ -1,4 +1,5 @@
 import logging
+import json
 from datetime import date, datetime
 from typing import Optional
 from fastapi import APIRouter, HTTPException, Query, Request
@@ -56,6 +57,50 @@ def _coerce_optional_int(value: Optional[str]) -> Optional[int]:
     return None
+
+
+def _fetch_case_status_options() -> list[str]:
+    default_statuses = ["åben", "under behandling", "afventer", "løst", "lukket"]
+    values = []
+    seen = set()
+
+    def _add(value: Optional[str]) -> None:
+        candidate = str(value or "").strip()
+        if not candidate:
+            return
+        key = candidate.lower()
+        if key in seen:
+            return
+        seen.add(key)
+        values.append(candidate)
+
+    setting_row = execute_query(
+        "SELECT value FROM settings WHERE key = %s",
+        ("case_statuses",)
+    )
+
+    if setting_row and setting_row[0].get("value"):
+        try:
+            parsed = json.loads(setting_row[0].get("value") or "[]")
+            for item in parsed if isinstance(parsed, list) else []:
+                value = ""
+                if isinstance(item, str):
+                    value = item.strip()
+                elif isinstance(item, dict):
+                    value = str(item.get("value") or "").strip()
+                _add(value)
+        except Exception:
+            pass
+
+    statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ()) or []
+
+    for row in statuses:
+        _add(row.get("status"))
+
+    for default in default_statuses:
+        _add(default)
+
+    return values
+
+
 @router.get("/sag", response_class=HTMLResponse)
 async def sager_liste(
     request: Request,
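The merge order in `_fetch_case_status_options` above (configured values first, then statuses seen in the database, then hard-coded defaults, with case-insensitive first-wins dedup) can be isolated as a pure function. A sketch under assumed inputs, not repository code:

```python
import json

def merge_status_options(setting_json, db_statuses, defaults):
    """Case-insensitive, order-preserving merge of status sources."""
    values, seen = [], set()

    def add(value):
        candidate = str(value or "").strip()
        if candidate and candidate.lower() not in seen:
            seen.add(candidate.lower())
            values.append(candidate)

    # Configured values win on casing because they are added first.
    try:
        for item in json.loads(setting_json or "[]"):
            if isinstance(item, str):
                add(item)
            elif isinstance(item, dict):
                add(item.get("value"))
    except Exception:
        pass
    for status in db_statuses:
        add(status)
    for default in defaults:
        add(default)
    return values

print(merge_status_options('["Åben", {"value": "Lukket"}]', ["åben", "afventer"], ["løst"]))
# ['Åben', 'Lukket', 'afventer', 'løst']
```

Because the configured spelling is added first, a database row that differs only in case ("åben" vs "Åben") is treated as a duplicate rather than a second option.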
@@ -77,7 +122,9 @@ async def sager_liste(
             c.name as customer_name,
             CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) as kontakt_navn,
             COALESCE(u.full_name, u.username) AS ansvarlig_navn,
-            g.name AS assigned_group_name
+            g.name AS assigned_group_name,
+            nt.title AS next_todo_title,
+            nt.due_date AS next_todo_due_date
         FROM sag_sager s
         LEFT JOIN customers c ON s.customer_id = c.id
         LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
@@ -90,6 +137,22 @@ async def sager_liste(
             LIMIT 1
         ) cc_first ON true
         LEFT JOIN contacts cont ON cc_first.contact_id = cont.id
+        LEFT JOIN LATERAL (
+            SELECT t.title, t.due_date
+            FROM sag_todo_steps t
+            WHERE t.sag_id = s.id
+              AND t.deleted_at IS NULL
+              AND t.is_done = FALSE
+            ORDER BY
+                CASE
+                    WHEN t.is_next THEN 0
+                    WHEN t.due_date IS NOT NULL THEN 1
+                    ELSE 2
+                END,
+                t.due_date ASC NULLS LAST,
+                t.created_at ASC
+            LIMIT 1
+        ) nt ON true
         LEFT JOIN sag_sager ds ON ds.id = s.deferred_until_case_id
         WHERE s.deleted_at IS NULL
     """
@@ -162,7 +225,11 @@ async def sager_liste(
         sager = [s for s in sager if s['id'] in tagged_ids]

     # Fetch all distinct statuses and tags for filters
-    statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ())
+    status_options = _fetch_case_status_options()
+
+    current_status = str(status or "").strip()
+    if current_status and current_status.lower() not in {s.lower() for s in status_options}:
+        status_options.append(current_status)
     all_tags = execute_query("SELECT DISTINCT tag_navn FROM sag_tags WHERE deleted_at IS NULL ORDER BY tag_navn", ())

     toggle_include_deferred_url = str(
@@ -174,7 +241,7 @@ async def sager_liste(
         "sager": sager,
         "relations_map": relations_map,
         "child_ids": list(child_ids),
-        "statuses": [s['status'] for s in statuses],
+        "statuses": status_options,
         "all_tags": [t['tag_navn'] for t in all_tags],
         "current_status": status,
         "current_tag": tag,
@@ -451,7 +518,10 @@ async def sag_detaljer(request: Request, sag_id: int):
         logger.warning("⚠️ Could not load pipeline stages: %s", e)
         pipeline_stages = []

-    statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ())
+    status_options = _fetch_case_status_options()
+    current_status = str(sag.get("status") or "").strip()
+    if current_status and current_status.lower() not in {s.lower() for s in status_options}:
+        status_options.append(current_status)
     is_deadline_overdue = _is_deadline_overdue(sag.get("deadline"))

     return templates.TemplateResponse("modules/sag/templates/detail.html", {
@@ -475,7 +545,7 @@ async def sag_detaljer(request: Request, sag_id: int):
         "nextcloud_instance": nextcloud_instance,
         "related_case_options": related_case_options,
         "pipeline_stages": pipeline_stages,
-        "status_options": [s["status"] for s in statuses],
+        "status_options": status_options,
         "is_deadline_overdue": is_deadline_overdue,
         "assignment_users": _fetch_assignment_users(),
         "assignment_groups": _fetch_assignment_groups(),
|||||||
@ -33,7 +33,7 @@ class RelationService:
|
|||||||
|
|
||||||
# 2. Fetch details for these cases
|
# 2. Fetch details for these cases
|
||||||
placeholders = ','.join(['%s'] * len(tree_ids))
|
placeholders = ','.join(['%s'] * len(tree_ids))
|
||||||
tree_cases_query = f"SELECT id, titel, status FROM sag_sager WHERE id IN ({placeholders})"
|
tree_cases_query = f"SELECT id, titel, status, type, template_key FROM sag_sager WHERE id IN ({placeholders})"
|
||||||
tree_cases = {c['id']: c for c in execute_query(tree_cases_query, tuple(tree_ids))}
|
tree_cases = {c['id']: c for c in execute_query(tree_cases_query, tuple(tree_ids))}
|
||||||
|
|
||||||
# 3. Fetch all edges between these cases
|
# 3. Fetch all edges between these cases
|
||||||
|
|||||||
app/modules/sag/templates/detail.html.bak (new file, 7009 lines)
File diff suppressed because it is too large.
@@ -17,12 +17,14 @@
 .table-wrapper {
     background: var(--bg-card);
     border-radius: 12px;
-    overflow: hidden;
+    overflow-x: auto;
+    overflow-y: hidden;
     box-shadow: 0 2px 8px rgba(0,0,0,0.05);
 }

 .sag-table {
     width: 100%;
+    min-width: 1760px;
     margin: 0;
 }
@@ -32,12 +34,13 @@
 }

 .sag-table thead th {
-    padding: 0.8rem 1rem;
+    padding: 0.6rem 0.75rem;
     font-weight: 600;
-    font-size: 0.85rem;
+    font-size: 0.78rem;
     text-transform: uppercase;
-    letter-spacing: 0.5px;
+    letter-spacing: 0.3px;
     border: none;
+    white-space: nowrap;
 }

 .sag-table tbody tr {
@@ -51,9 +54,30 @@
 }

 .sag-table tbody td {
-    padding: 0.6rem 1rem;
-    vertical-align: middle;
-    font-size: 0.9rem;
+    padding: 0.5rem 0.75rem;
+    vertical-align: top;
+    font-size: 0.86rem;
+    white-space: nowrap;
+}
+
+.sag-table td.col-company,
+.sag-table td.col-contact,
+.sag-table td.col-owner,
+.sag-table td.col-group,
+.sag-table td.col-desc {
+    white-space: normal;
+}
+
+.sag-table td.col-company,
+.sag-table td.col-contact,
+.sag-table td.col-owner,
+.sag-table td.col-group {
+    max-width: 180px;
+}
+
+.sag-table td.col-desc {
+    min-width: 260px;
+    max-width: 360px;
 }

 .sag-id {
@@ -246,7 +270,7 @@
 {% endblock %}

 {% block content %}
-<div class="container-fluid" style="max-width: 1400px; padding-top: 2rem;">
+<div class="container-fluid" style="max-width: none; padding-top: 2rem;">
     <!-- Header -->
     <div class="d-flex justify-content-between align-items-center mb-4">
         <h1 style="margin: 0; color: var(--accent);">
@@ -330,17 +354,19 @@
 <table class="sag-table">
     <thead>
         <tr>
-            <th style="width: 90px;">ID</th>
-            <th>Titel & Beskrivelse</th>
+            <th style="width: 90px;">SagsID</th>
+            <th style="width: 180px;">Virksom.</th>
+            <th style="width: 150px;">Kontakt</th>
+            <th style="width: 300px;">Beskr.</th>
             <th style="width: 120px;">Type</th>
-            <th style="width: 180px;">Kunde</th>
-            <th style="width: 150px;">Hovedkontakt</th>
-            <th style="width: 160px;">Ansvarlig</th>
-            <th style="width: 160px;">Gruppe</th>
-            <th style="width: 100px;">Status</th>
-            <th style="width: 120px;">Udsat start</th>
-            <th style="width: 120px;">Oprettet</th>
-            <th style="width: 120px;">Opdateret</th>
+            <th style="width: 110px;">Prioritet</th>
+            <th style="width: 160px;">Ansvarl.</th>
+            <th style="width: 170px;">Gruppe/Level</th>
+            <th style="width: 240px;">Næste todo</th>
+            <th style="width: 120px;">Opret.</th>
+            <th style="width: 120px;">Start arbejde</th>
+            <th style="width: 140px;">Start inden</th>
+            <th style="width: 120px;">Deadline</th>
         </tr>
     </thead>
     <tbody id="sagTableBody">
@@ -357,7 +383,13 @@
         {% endif %}
         <span class="sag-id">#{{ sag.id }}</span>
     </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'">
+    <td class="col-company" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+        {{ sag.customer_name if sag.customer_name else '-' }}
+    </td>
+    <td class="col-contact" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+        {{ sag.kontakt_navn if sag.kontakt_navn and sag.kontakt_navn.strip() else '-' }}
+    </td>
+    <td class="col-desc" onclick="window.location.href='/sag/{{ sag.id }}'">
         <div class="sag-titel">{{ sag.titel }}</div>
         {% if sag.beskrivelse %}
         <div class="sag-beskrivelse">{{ sag.beskrivelse }}</div>
@@ -366,29 +398,36 @@
     <td onclick="window.location.href='/sag/{{ sag.id }}'">
         <span class="badge bg-light text-dark border">{{ sag.template_key or sag.type or 'ticket' }}</span>
     </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-        {{ sag.customer_name if sag.customer_name else '-' }}
-    </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-        {{ sag.kontakt_navn if sag.kontakt_navn and sag.kontakt_navn.strip() else '-' }}
-    </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; text-transform: capitalize;">
+        {{ sag.priority if sag.priority else 'normal' }}
+    </td>
+    <td class="col-owner" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
         {{ sag.ansvarlig_navn if sag.ansvarlig_navn else '-' }}
     </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+    <td class="col-group" onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
         {{ sag.assigned_group_name if sag.assigned_group_name else '-' }}
     </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'">
-        <span class="status-badge status-{{ sag.status }}">{{ sag.status }}</span>
-    </td>
-    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
-        {{ sag.deferred_until.strftime('%d/%m-%Y') if sag.deferred_until else '-' }}
+    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; white-space: normal; max-width: 240px;">
+        {% if sag.next_todo_title %}
+        <div>{{ sag.next_todo_title }}</div>
+        {% if sag.next_todo_due_date %}
+        <div class="small text-muted">Forfald: {{ sag.next_todo_due_date.strftime('%d/%m-%Y') }}</div>
+        {% endif %}
+        {% else %}
+        -
+        {% endif %}
     </td>
     <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
         {{ sag.created_at.strftime('%d/%m-%Y') if sag.created_at else '-' }}
     </td>
     <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
-        {{ sag.updated_at.strftime('%d/%m-%Y') if sag.updated_at else '-' }}
+        {{ sag.start_date.strftime('%d/%m-%Y') if sag.start_date else '-' }}
+    </td>
+    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
+        {{ sag.deferred_until.strftime('%d/%m-%Y') if sag.deferred_until else '-' }}
+    </td>
+    <td onclick="window.location.href='/sag/{{ sag.id }}'" style="color: var(--text-secondary);">
+        {{ sag.deadline.strftime('%d/%m-%Y') if sag.deadline else '-' }}
     </td>
 </tr>
 {% if has_relations %}
@@ -402,7 +441,13 @@
     <td>
         <span class="sag-id">#{{ related_sag.id }}</span>
     </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'">
+    <td class="col-company" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+        {{ related_sag.customer_name if related_sag.customer_name else '-' }}
+    </td>
+    <td class="col-contact" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+        {{ related_sag.kontakt_navn if related_sag.kontakt_navn and related_sag.kontakt_navn.strip() else '-' }}
+    </td>
+    <td class="col-desc" onclick="window.location.href='/sag/{{ related_sag.id }}'">
         {% for rt in all_rel_types %}
         <span class="relation-badge">{{ rt }}</span>
         {% endfor %}
@@ -414,29 +459,36 @@
     <td onclick="window.location.href='/sag/{{ related_sag.id }}'">
        <span class="badge bg-light text-dark border">{{ related_sag.template_key or related_sag.type or 'ticket' }}</span>
     </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-        {{ related_sag.customer_name if related_sag.customer_name else '-' }}
-    </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
-        {{ related_sag.kontakt_navn if related_sag.kontakt_navn and related_sag.kontakt_navn.strip() else '-' }}
-    </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; text-transform: capitalize;">
+        {{ related_sag.priority if related_sag.priority else 'normal' }}
+    </td>
+    <td class="col-owner" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
         {{ related_sag.ansvarlig_navn if related_sag.ansvarlig_navn else '-' }}
     </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
+    <td class="col-group" onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem;">
         {{ related_sag.assigned_group_name if related_sag.assigned_group_name else '-' }}
     </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'">
-        <span class="status-badge status-{{ related_sag.status }}">{{ related_sag.status }}</span>
-    </td>
-    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
-        {{ related_sag.deferred_until.strftime('%d/%m-%Y') if related_sag.deferred_until else '-' }}
+    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary); font-size: 0.85rem; white-space: normal; max-width: 240px;">
+        {% if related_sag.next_todo_title %}
+        <div>{{ related_sag.next_todo_title }}</div>
+        {% if related_sag.next_todo_due_date %}
+        <div class="small text-muted">Forfald: {{ related_sag.next_todo_due_date.strftime('%d/%m-%Y') }}</div>
+        {% endif %}
+        {% else %}
+        -
+        {% endif %}
     </td>
     <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
         {{ related_sag.created_at.strftime('%d/%m-%Y') if related_sag.created_at else '-' }}
     </td>
     <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
-        {{ related_sag.updated_at.strftime('%d/%m-%Y') if related_sag.updated_at else '-' }}
+        {{ related_sag.start_date.strftime('%d/%m-%Y') if related_sag.start_date else '-' }}
+    </td>
+    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
+        {{ related_sag.deferred_until.strftime('%d/%m-%Y') if related_sag.deferred_until else '-' }}
+    </td>
+    <td onclick="window.location.href='/sag/{{ related_sag.id }}'" style="color: var(--text-secondary);">
+        {{ related_sag.deadline.strftime('%d/%m-%Y') if related_sag.deadline else '-' }}
     </td>
 </tr>
 {% endif %}
@@ -112,19 +112,54 @@ def _validate_yealink_request(request: Request, token: Optional[str]) -> None:
     db_secret = (_get_setting_value("telefoni_shared_secret", "") or "").strip()
     accepted_tokens = {s for s in (env_secret, db_secret) if s}
     whitelist = (getattr(settings, "TELEFONI_IP_WHITELIST", "") or "").strip()
+    client_ip = _get_client_ip(request)
+    path = request.url.path
+
+    def _mask(value: Optional[str]) -> str:
+        if not value:
+            return "<empty>"
+        stripped = value.strip()
+        if len(stripped) <= 8:
+            return "***"
+        return f"{stripped[:4]}...{stripped[-4:]}"
+
     if not accepted_tokens and not whitelist:
-        logger.error("❌ Telefoni callbacks are not secured (no TELEFONI_SHARED_SECRET or TELEFONI_IP_WHITELIST set)")
+        logger.error(
+            "❌ Telefoni callback rejected path=%s reason=no_security_config ip=%s",
+            path,
+            client_ip,
+        )
         raise HTTPException(status_code=403, detail="Telefoni callbacks not configured")

     if token and token.strip() in accepted_tokens:
+        logger.debug("✅ Telefoni callback accepted path=%s auth=token ip=%s", path, client_ip)
         return

-    if whitelist:
-        client_ip = _get_client_ip(request)
-        if ip_in_whitelist(client_ip, whitelist):
-            return
+    if token and accepted_tokens:
+        logger.warning(
+            "⚠️ Telefoni callback token mismatch path=%s ip=%s provided=%s accepted_sources=%s",
+            path,
+            client_ip,
+            _mask(token),
+            "+".join([name for name, value in (("env", env_secret), ("db", db_secret)) if value]) or "none",
+        )
+    elif not token:
+        logger.info("ℹ️ Telefoni callback without token path=%s ip=%s", path, client_ip)
+
+    if whitelist:
+        if ip_in_whitelist(client_ip, whitelist):
+            logger.debug("✅ Telefoni callback accepted path=%s auth=ip_whitelist ip=%s", path, client_ip)
+            return
+        logger.warning(
+            "⚠️ Telefoni callback IP not in whitelist path=%s ip=%s whitelist=%s",
+            path,
+            client_ip,
+            whitelist,
+        )
+    else:
+        logger.info("ℹ️ Telefoni callback whitelist not configured path=%s ip=%s", path, client_ip)
+
+    logger.warning("❌ Telefoni callback forbidden path=%s ip=%s", path, client_ip)
     raise HTTPException(status_code=403, detail="Forbidden")
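The `_mask` helper added in the hunk above keeps shared secrets out of log lines: empty or short tokens are hidden entirely, longer ones keep only their first and last four characters. Lifted out as a standalone function (same body as the diff), its behavior can be sanity-checked directly:

```python
from typing import Optional


def mask_token(value: Optional[str]) -> str:
    """Standalone copy of the _mask helper from the hunk above.

    Empty values report "<empty>", tokens of 8 characters or fewer are
    fully redacted, and longer tokens keep first/last 4 characters so a
    mismatch can be traced in logs without leaking the secret.
    """
    if not value:
        return "<empty>"
    stripped = value.strip()
    if len(stripped) <= 8:
        return "***"
    return f"{stripped[:4]}...{stripped[-4:]}"
```

Note the length check runs on the stripped value, so padding a short token with whitespace does not change what gets logged.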
@@ -358,6 +358,7 @@ async function loadUsers() {
             opt.textContent = `${u.full_name || u.username || ('#' + u.user_id)}${u.telefoni_extension ? ' (' + u.telefoni_extension + ')' : ''}`;
             sel.appendChild(opt);
         });
+        sel.value = '';
     } catch (e) {
         console.error('Failed loading telefoni users', e);
     }
@@ -500,6 +501,16 @@ async function unlinkCase(callId) {

 document.addEventListener('DOMContentLoaded', async () => {
     initLinkSagModalEvents();
+    const userFilter = document.getElementById('filterUser');
+    const fromFilter = document.getElementById('filterFrom');
+    const toFilter = document.getElementById('filterTo');
+    const withoutCaseFilter = document.getElementById('filterWithoutCase');
+
+    if (userFilter) userFilter.value = '';
+    if (fromFilter) fromFilter.value = '';
+    if (toFilter) toFilter.value = '';
+    if (withoutCaseFilter) withoutCaseFilter.checked = false;
+
     await loadUsers();
     document.getElementById('btnRefresh').addEventListener('click', loadCalls);
     document.getElementById('filterUser').addEventListener('change', loadCalls);
@@ -12,15 +12,31 @@ import os
 import shutil

 from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
+from app.core.config import settings

 logger = logging.getLogger(__name__)

 # APIRouter instance (module_loader looks for this)
 router = APIRouter()

-# Upload directory for logos
-LOGO_UPLOAD_DIR = "/app/uploads/webshop_logos"
-os.makedirs(LOGO_UPLOAD_DIR, exist_ok=True)
+# Upload directory for logos (works in both Docker and local development)
+_logo_base_dir = os.path.abspath(settings.UPLOAD_DIR)
+LOGO_UPLOAD_DIR = os.path.join(_logo_base_dir, "webshop_logos")
+try:
+    os.makedirs(LOGO_UPLOAD_DIR, exist_ok=True)
+except OSError as exc:
+    if _logo_base_dir.startswith('/app/'):
+        _fallback_base = os.path.abspath('uploads')
+        LOGO_UPLOAD_DIR = os.path.join(_fallback_base, "webshop_logos")
+        os.makedirs(LOGO_UPLOAD_DIR, exist_ok=True)
+        logger.warning(
+            "⚠️ Webshop logo dir %s not writable (%s). Using fallback %s",
+            _logo_base_dir,
+            exc,
+            LOGO_UPLOAD_DIR,
+        )
+    else:
+        raise

 # ============================================================================
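The try/except around `os.makedirs` above is a "preferred path, else local fallback" pattern: attempt the configured directory, and if it is not creatable (typical when a Docker-only path like `/app/...` is used on a developer machine), fall back to a directory relative to the working tree. A generic sketch of that logic (function name and default fallback are illustrative, not from the repo):

```python
import os


def ensure_upload_dir(preferred: str, fallback: str = "uploads") -> str:
    """Create `preferred` if possible; otherwise create and return a
    fallback directory, mirroring the Docker-vs-local behavior in the
    hunk above. Returns the directory that was actually created."""
    try:
        os.makedirs(preferred, exist_ok=True)
        return preferred
    except OSError:
        # Preferred base is not writable/creatable here; use a local dir.
        fallback_abs = os.path.abspath(fallback)
        os.makedirs(fallback_abs, exist_ok=True)
        return fallback_abs
```

Returning the resolved directory (instead of mutating a module global) makes the fallback easier to test, though the diff keeps the module-level `LOGO_UPLOAD_DIR` for compatibility with existing callers.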
@@ -67,12 +67,12 @@ class CaseAnalysisService:

                 return analysis
             else:
-                logger.warning("⚠️ Ollama returned no result, using empty analysis")
-                return self._empty_analysis(text)
+                logger.warning("⚠️ Ollama returned no result, using heuristic fallback analysis")
+                return await self._heuristic_fallback_analysis(text)

         except Exception as e:
             logger.error(f"❌ Case analysis failed: {e}", exc_info=True)
-            return self._empty_analysis(text)
+            return await self._heuristic_fallback_analysis(text)

     def _build_analysis_prompt(self) -> str:
         """Build Danish system prompt for case analysis"""
@@ -470,6 +470,73 @@ Returner JSON med suggested_title, suggested_description, priority, customer_hin
             confidence=0.0,
             ai_reasoning="AI unavailable - fill fields manually"
         )
+
+    async def _heuristic_fallback_analysis(self, text: str) -> QuickCreateAnalysis:
+        """Local fallback when AI service is unavailable."""
+        cleaned_text = (text or "").strip()
+        if not cleaned_text:
+            return self._empty_analysis(text)
+
+        lowered = cleaned_text.lower()
+
+        # Priority heuristics based on urgency wording.
+        urgent_terms = ["nede", "kritisk", "asap", "omgående", "straks", "akut", "haster"]
+        high_terms = ["hurtigt", "vigtigt", "snarest", "prioriter"]
+        low_terms = ["når i får tid", "ikke hastende", "lavprioriteret"]
+
+        if any(term in lowered for term in urgent_terms):
+            priority = SagPriority.URGENT
+        elif any(term in lowered for term in high_terms):
+            priority = SagPriority.HIGH
+        elif any(term in lowered for term in low_terms):
+            priority = SagPriority.LOW
+        else:
+            priority = SagPriority.NORMAL
+
+        # Basic title heuristic: first non-empty line/sentence, clipped to 80 chars.
+        first_line = cleaned_text.splitlines()[0].strip()
+        first_sentence = re.split(r"[.!?]", first_line)[0].strip()
+        title_source = first_sentence or first_line or cleaned_text
+        title = title_source[:80].strip()
+        if not title:
+            title = "Ny sag"
+
+        # Lightweight keyword tags.
+        keyword_tags = {
+            "printer": "printer",
+            "mail": "mail",
+            "email": "mail",
+            "vpn": "vpn",
+            "net": "netværk",
+            "wifi": "wifi",
+            "server": "server",
+            "laptop": "laptop",
+            "adgang": "adgang",
+            "onboarding": "onboarding",
+        }
+        suggested_tags: List[str] = []
+        for key, tag in keyword_tags.items():
+            if key in lowered and tag not in suggested_tags:
+                suggested_tags.append(tag)
+
+        # Try simple customer matching from long words in text.
+        candidate_hints = []
+        for token in re.findall(r"[A-Za-z0-9ÆØÅæøå._-]{3,}", cleaned_text):
+            if token.lower() in {"ring", "kunde", "sag", "skal", "have", "virker", "ikke"}:
+                continue
+            candidate_hints.append(token)
+        customer_id, customer_name = await self._match_customer(candidate_hints[:8])
+
+        return QuickCreateAnalysis(
+            suggested_title=title,
+            suggested_description=cleaned_text,
+            suggested_priority=priority,
+            suggested_customer_id=customer_id,
+            suggested_customer_name=customer_name,
+            suggested_tags=suggested_tags,
+            confidence=0.35,
+            ai_reasoning="AI service unavailable - using local fallback suggestions"
+        )
+
     def _get_cached_analysis(self, text: str) -> Optional[QuickCreateAnalysis]:
         """Get cached analysis if available and not expired"""
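The urgency-term matching in `_heuristic_fallback_analysis` reduces to a small pure function. Isolated here with plain strings in place of the `SagPriority` enum (an assumption for testability; the enum lives elsewhere in the module), using the same Danish keyword lists as the hunk:

```python
def guess_priority(text: str) -> str:
    """Keyword-based priority guess mirroring the fallback hunk above.

    Checks urgent terms first, then high, then low, defaulting to
    'normal'. Returns plain strings standing in for SagPriority members.
    """
    lowered = (text or "").lower()
    urgent_terms = ["nede", "kritisk", "asap", "omgående", "straks", "akut", "haster"]
    high_terms = ["hurtigt", "vigtigt", "snarest", "prioriter"]
    low_terms = ["når i får tid", "ikke hastende", "lavprioriteret"]

    if any(term in lowered for term in urgent_terms):
        return "urgent"
    if any(term in lowered for term in high_terms):
        return "high"
    if any(term in lowered for term in low_terms):
        return "low"
    return "normal"
```

Because the checks use substring containment, "prioriter" also matches "prioriteres"; the ordering makes urgent wording win when a message mixes several cues.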
@@ -6,6 +6,7 @@ Adapted from OmniSync for BMC Hub timetracking use cases

 import logging
 import json
+import asyncio
 from typing import Dict, Optional, List
 from datetime import datetime
 import aiohttp
@@ -77,7 +78,7 @@ Response format (JSON only, no other text):
 IMPORTANT: Return ONLY the JSON object. Do not include any explanation, thinking, or additional text."""

     def _build_email_context(self, email_data: Dict) -> str:
-        """Build email context for AI analysis"""
+        """Build email context for AI classification (email body only - fast)"""

         subject = email_data.get('subject', '')
         sender = email_data.get('sender_email', '')
@@ -87,9 +88,17 @@ IMPORTANT: Return ONLY the JSON object. Do not include any explanation, thinking
         if len(body) > 2000:
             body = body[:2000] + "... [truncated]"

+        # Also note if PDF attachments exist (helps classification even without reading them)
+        attachments = email_data.get('attachments', [])
+        pdf_filenames = [a.get('filename', '') for a in attachments
+                         if a.get('filename', '').lower().endswith('.pdf')]
+        attachment_note = ''
+        if pdf_filenames:
+            attachment_note = f"\n\nVedhæftede filer: {', '.join(pdf_filenames)}"
+
         context = f"""**Email Information:**
 From: {sender}
-Subject: {subject}
+Subject: {subject}{attachment_note}

 **Email Body:**
 {body}
@@ -97,6 +106,116 @@ Subject: {subject}
 Klassificer denne email."""

         return context
+
+    def _extract_pdf_texts_from_attachments(self, email_data: Dict) -> List[str]:
+        """Extract text from PDF attachments in email_data (in-memory bytes)"""
+        pdf_texts = []
+        attachments = email_data.get('attachments', [])
+        for att in attachments:
+            filename = att.get('filename', '')
+            if not filename.lower().endswith('.pdf'):
+                continue
+            content = att.get('content', b'')
+            if not content:
+                continue
+            try:
+                import pdfplumber
+                import io
+                with pdfplumber.open(io.BytesIO(content)) as pdf:
+                    pages = []
+                    for page in pdf.pages:
+                        text = page.extract_text(layout=True, x_tolerance=2, y_tolerance=2)
+                        if text:
+                            pages.append(text)
+                if pages:
+                    pdf_texts.append(f"=== PDF: (unknown) ===\n" + "\n".join(pages))
+                    logger.info(f"📄 Extracted PDF text from attachment (unknown) ({len(pages)} pages)")
+            except Exception as e:
+                logger.warning(f"⚠️ Could not extract PDF text from (unknown): {e}")
+        return pdf_texts
+
+    def _get_attachment_texts_from_db(self, email_id: int) -> List[str]:
+        """Fetch PDF attachment text from DB (content_data column) for already-saved emails"""
+        from pathlib import Path
+        pdf_texts = []
+        try:
+            attachments = execute_query(
+                """SELECT filename, content_data, file_path
+                   FROM email_attachments
+                   WHERE email_id = %s AND filename ILIKE '%.pdf'""",
+                (email_id,)
+            )
+            for att in (attachments or []):
+                filename = att.get('filename', 'unknown.pdf')
+                content = None
+                # Prefer content_data (bytes in DB)
+                if att.get('content_data'):
+                    content = bytes(att['content_data'])
+                # Fallback: read from disk
+                elif att.get('file_path'):
+                    fp = Path(att['file_path'])
+                    if fp.exists():
+                        content = fp.read_bytes()
+                if not content:
+                    continue
+                try:
+                    import pdfplumber
+                    import io
+                    with pdfplumber.open(io.BytesIO(content)) as pdf:
+                        pages = []
+                        for page in pdf.pages:
+                            text = page.extract_text(layout=True, x_tolerance=2, y_tolerance=2)
+                            if text:
+                                pages.append(text)
+                    if pages:
+                        pdf_texts.append(f"=== PDF: (unknown) ===\n" + "\n".join(pages))
+                        logger.info(f"📄 Extracted PDF text from DB for (unknown) ({len(pages)} pages)")
+                except Exception as e:
+                    logger.warning(f"⚠️ Could not extract PDF text for (unknown) from DB: {e}")
+        except Exception as e:
+            logger.error(f"❌ Error fetching attachment texts from DB for email {email_id}: {e}")
+        return pdf_texts
+
+    def _build_invoice_extraction_context(self, email_data: Dict) -> str:
+        """Build extraction context with PDF as PRIMARY data source.
+
+        Email body/sender are ignored for invoice data — only the attached PDF counts.
+        Sender can be a forwarder or external bookkeeper, not the actual vendor.
+        """
+        subject = email_data.get('subject', '')
+        body = email_data.get('body_text', '') or ''
+        # Keep body brief — it's secondary context at best
+        if len(body) > 300:
+            body = body[:300] + "..."
+
+        # 1. Try in-memory attachment bytes first (during initial fetch)
+        pdf_texts = self._extract_pdf_texts_from_attachments(email_data)
+
+        # 2. Fallback: load from DB for already-processed emails
+        if not pdf_texts and email_data.get('id'):
+            pdf_texts = self._get_attachment_texts_from_db(email_data['id'])
+
+        if pdf_texts:
+            pdf_section = "\n\n".join(pdf_texts)
+            return f"""VEDHÆFTET FAKTURA (primær datakilde - analyser grundigt):
+{pdf_section}
+
+---
+Email emne: {subject}
+Email tekst (sekundær): {body}
+
+VIGTIGT: Udtrækket SKAL baseres på PDF-indholdet ovenfor.
+Afsenderens email-adresse er IKKE leverandøren — leverandøren fremgår af fakturaen."""
+        else:
+            # No PDF found — fall back to email body
+            logger.warning(f"⚠️ No PDF attachment found for email {email_data.get('id')} — using email body only")
+            body_full = email_data.get('body_text', '') or email_data.get('body_html', '') or ''
+            if len(body_full) > 3000:
+                body_full = body_full[:3000] + "..."
+            return f"""Email emne: {subject}
+Email tekst:
+{body_full}
+
+Ingen PDF vedhæftet — udtræk fakturadata fra email-teksten."""

     async def _call_ollama(self, system_prompt: str, user_message: str) -> Optional[Dict]:
         """Call Ollama API for classification"""
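Both PDF helpers above run the same attachment-selection step before handing bytes to pdfplumber: keep only attachments whose filename ends in `.pdf` and that actually carry content. That filter on its own is pure Python (function name is illustrative, not from the repo) and easy to verify:

```python
from typing import Dict, List


def select_pdf_attachments(email_data: Dict) -> List[Dict]:
    """Return attachments that are PDFs with non-empty content bytes.

    This is the precondition both extraction helpers in the hunk above
    check before attempting pdfplumber parsing; everything else is
    skipped silently.
    """
    selected = []
    for att in email_data.get('attachments', []):
        filename = att.get('filename', '')
        if not filename.lower().endswith('.pdf'):
            continue  # not a PDF (extension match is case-insensitive)
        if not att.get('content'):
            continue  # no bytes to parse
        selected.append(att)
    return selected
```

Filtering up front keeps the pdfplumber try/except blocks from ever seeing non-PDF or empty payloads, so any exception raised there really is a parse failure.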
@@ -279,9 +398,9 @@ Klassificer denne email."""
             logger.info(f"✅ Using cached extraction for email {email_data['id']}")
             return cached_result

-        # Build extraction prompt
+        # Build extraction prompt — use PDF-first context, not email sender
         system_prompt = self._build_extraction_prompt()
-        user_message = self._build_email_context(email_data)
+        user_message = self._build_invoice_extraction_context(email_data)

         # Call Ollama
         result = await self._call_ollama_extraction(system_prompt, user_message)
@ -294,39 +413,61 @@ Klassificer denne email."""
|
|||||||
return None
|
return None
|
||||||
|
|
||||||
def _build_extraction_prompt(self) -> str:
|
def _build_extraction_prompt(self) -> str:
|
||||||
"""Build Danish system prompt for invoice data extraction"""
|
"""Build comprehensive Danish system prompt for deep invoice data extraction."""
|
||||||
return """Du er en ekspert i at udtrække struktureret data fra danske fakturaer.
|
from app.core.config import settings as cfg
|
||||||
|
own_cvr = getattr(cfg, 'OWN_CVR', '')
|
||||||
|
return f"""Du er en ekspert i at læse og udtrække strukturerede data fra danske fakturaer og kreditnotaer.
|
||||||
|
|
||||||
Din opgave er at finde og udtrække følgende information fra emailen:
|
DU SKAL ANALYSERE SELVE FAKTURAEN (PDF-indholdet) - IKKE email-afsenderen.
|
||||||
|
Afsender kan være os selv der videresender, eller en ekstern bogholder - IGNORER afsender.
|
||||||
|
Leverandørens navn og CVR fremgår ALTID af selve fakturadokumentet.
|
||||||
|
|
||||||
**Felter at udtrække:**
|
VIGTIGE REGLER:
|
||||||
- `invoice_number` (string) - Fakturanummer
|
1. Returner KUN gyldig JSON - ingen forklaring eller ekstra tekst
|
||||||
- `amount` (decimal) - Fakturabeløb i DKK (uden valutasymbol)
|
2. Hvis et felt ikke findes, sæt det til null
|
||||||
- `due_date` (string YYYY-MM-DD) - Forfaldsdato
|
3. Datoer skal være i format YYYY-MM-DD
|
||||||
- `vendor_name` (string) - Leverandørens navn
|
4. DANSKE PRISFORMATER:
|
||||||
- `order_number` (string) - Ordrenummer (hvis angivet)
|
- Tusind-separator: . (punkt) eller mellemrum: "5.965,18" eller "5 965,18"
|
||||||
- `cvr_number` (string) - CVR-nummer (hvis angivet)
|
- Decimal-separator: , (komma): "1.234,56 kr"
|
||||||
|
- I JSON: brug . (punkt) som decimal: 1234.56
|
||||||
|
- Eksempel: "5.965,18 kr" → 5965.18
|
||||||
|
5. CVR-nummer: 8 cifre uden mellemrum
|
||||||
|
- IGNORER CVR {own_cvr} — det er VORES eget CVR (køber), ikke leverandørens!
|
||||||
|
- Find LEVERANDØRENS CVR i toppen/headeren af fakturaen
|
||||||
|
6. DOKUMENTTYPE:
|
||||||
|
- "invoice" = Almindelig faktura
|
||||||
|
- "credit_note" = Kreditnota (Kreditnota, Refusion, Tilbagebetaling, Credit Note)
|
||||||
|
7. Varelinjer: udtræk ALLE linjer med beskrivelse, antal, enhedspris, total
|
||||||
|
|
||||||
**Vigtige regler:**
|
JSON STRUKTUR:
|
||||||
- Hvis et felt ikke findes, brug `null`
|
{{
|
||||||
- Beløb skal være numerisk (uden "kr", "DKK" osv.)
|
"document_type": "invoice" eller "credit_note",
|
||||||
- Datoer skal være i formatet YYYY-MM-DD
|
"invoice_number": "fakturanummer",
|
||||||
- Vær præcis - returner kun data du er sikker på
|
"vendor_name": "leverandørens firmanavn",
|
||||||
|
"vendor_cvr": "12345678",
|
||||||
|
"invoice_date": "YYYY-MM-DD",
|
||||||
|
"due_date": "YYYY-MM-DD",
|
||||||
|
"currency": "DKK",
|
||||||
|
"total_amount": 1234.56,
|
||||||
|
"vat_amount": 246.91,
|
||||||
|
"net_amount": 987.65,
|
||||||
|
"order_number": "ordrenummer eller null",
|
||||||
|
"original_invoice_reference": "ref til original faktura (kun kreditnotaer) eller null",
|
||||||
|
"lines": [
|
||||||
|
{{
|
||||||
|
"line_number": 1,
|
||||||
|
"description": "varebeskrivelse",
|
||||||
|
"quantity": 2.0,
|
||||||
|
"unit_price": 500.00,
|
||||||
|
"line_total": 1000.00,
|
||||||
|
"vat_rate": 25.00,
|
||||||
|
"vat_amount": 250.00
|
||||||
|
}}
|
||||||
|
],
|
||||||
|
"confidence": 0.95
|
||||||
|
}}
|
||||||
|
|
||||||
**Output format (JSON):**
|
Returner KUN JSON - ingen anden tekst."""
|
||||||
```json
|
|
||||||
{
|
|
||||||
"invoice_number": "INV-2024-001",
|
|
||||||
"amount": 5250.00,
|
|
||||||
"due_date": "2025-01-15",
|
|
||||||
"vendor_name": "Acme Leverandør A/S",
|
|
||||||
"order_number": "ORD-123",
|
|
||||||
"cvr_number": "12345678"
|
|
||||||
}
|
|
||||||
```
|
|
||||||
|
|
||||||
Returner KUN JSON - ingen anden tekst.
|
|
||||||
"""
|
|
||||||
|
|
||||||
async def _call_ollama_extraction(self, system_prompt: str, user_message: str) -> Optional[Dict]:
|
async def _call_ollama_extraction(self, system_prompt: str, user_message: str) -> Optional[Dict]:
|
||||||
"""Call Ollama for data extraction"""
|
"""Call Ollama for data extraction"""
|
||||||
@ -340,20 +481,23 @@ Returner KUN JSON - ingen anden tekst.
|
|||||||
{"role": "user", "content": user_message}
|
{"role": "user", "content": user_message}
|
||||||
],
|
],
|
||||||
"stream": False,
|
"stream": False,
|
||||||
|
"format": "json",
|
||||||
"options": {
|
"options": {
|
||||||
"temperature": 0.0, # Zero temperature for deterministic extraction
|
"temperature": 0.0, # Deterministic extraction
|
||||||
"num_predict": 300
|
"num_predict": 3000 # Enough for full invoice with many lines
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
try:
|
try:
|
||||||
async with aiohttp.ClientSession() as session:
|
async with aiohttp.ClientSession() as session:
|
||||||
async with session.post(url, json=payload, timeout=aiohttp.ClientTimeout(total=30)) as response:
|
async with session.post(url, json=payload, timeout=aiohttp.ClientTimeout(total=120)) as response:
|
||||||
if response.status != 200:
|
if response.status != 200:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
data = await response.json()
|
data = await response.json()
|
||||||
content = data.get('message', {}).get('content', '')
|
msg = data.get('message', {})
|
||||||
|
# qwen3 sometimes returns content in 'thinking' field
|
||||||
|
content = msg.get('content', '') or msg.get('thinking', '')
|
||||||
|
|
||||||
# Parse JSON response
|
# Parse JSON response
|
||||||
result = self._parse_extraction_response(content)
|
result = self._parse_extraction_response(content)
|
||||||
|
|||||||
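The Danish amount rules in the extraction prompt above ("5.965,18 kr" → 5965.18) can be sketched as a small standalone parser. This is a hypothetical illustration, not code from the diff; it assumes `.`/space thousand separators and `,` as the decimal separator, as the prompt describes.

```python
def parse_danish_amount(raw: str) -> float:
    """Convert a Danish-formatted amount string to a float.

    Thousand separators ('.' or space) are stripped, the decimal
    comma becomes a decimal point, and currency markers are removed.
    """
    cleaned = raw.lower().replace("dkk", "").replace("kr", "").strip()
    cleaned = cleaned.replace(".", "").replace(" ", "").replace("\u00a0", "")
    cleaned = cleaned.replace(",", ".")
    return float(cleaned)

print(parse_danish_amount("5.965,18 kr"))  # 5965.18
print(parse_danish_amount("5 965,18"))     # 5965.18
print(parse_danish_amount("1.234,56 kr"))  # 1234.56
```

Note the sketch assumes the input always uses the comma-decimal convention; an amount already in machine format ("1234.56") would be misread, which is why the prompt pins the format explicitly.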
@@ -49,6 +49,7 @@ class EmailProcessorService:
             'fetched': 0,
             'saved': 0,
             'classified': 0,
+            'awaiting_user_action': 0,
             'rules_matched': 0,
             'errors': 0
         }
@@ -86,6 +87,8 @@ class EmailProcessorService:

                 if result.get('classified'):
                     stats['classified'] += 1
+                if result.get('awaiting_user_action'):
+                    stats['awaiting_user_action'] += 1
                 if result.get('rules_matched'):
                     stats['rules_matched'] += 1

@@ -109,6 +112,7 @@ class EmailProcessorService:
         email_id = email_data.get('id')
         stats = {
             'classified': False,
+            'awaiting_user_action': False,
             'workflows_executed': 0,
             'rules_matched': False
         }
@@ -123,6 +127,22 @@ class EmailProcessorService:
             if settings.EMAIL_AUTO_CLASSIFY:
                 await self._classify_and_update(email_data)
                 stats['classified'] = True

+            # Step 3.5: Gate automation by manual-approval policy and confidence
+            # Phase-1 policy: suggestions are generated automatically, actions are user-approved.
+            classification = (email_data.get('classification') or '').strip().lower()
+            confidence = float(email_data.get('confidence_score') or 0.0)
+            require_manual_approval = getattr(settings, 'EMAIL_REQUIRE_MANUAL_APPROVAL', True)
+
+            if require_manual_approval:
+                await self._set_awaiting_user_action(email_id, reason='manual_approval_required')
+                stats['awaiting_user_action'] = True
+                return stats
+
+            if not classification or confidence < settings.EMAIL_AI_CONFIDENCE_THRESHOLD:
+                await self._set_awaiting_user_action(email_id, reason='low_confidence')
+                stats['awaiting_user_action'] = True
+                return stats
+
             # Step 4: Execute workflows based on classification
             workflow_processed = False
@@ -172,6 +192,25 @@ class EmailProcessorService:
         except Exception as e:
             logger.error(f"❌ Error in process_single_email for {email_id}: {e}")
             raise e

+    async def _set_awaiting_user_action(self, email_id: Optional[int], reason: str):
+        """Park an email for manual review before any automatic routing/action."""
+        if not email_id:
+            return
+
+        execute_update(
+            """
+            UPDATE email_messages
+            SET status = 'awaiting_user_action',
+                folder = COALESCE(folder, 'INBOX'),
+                auto_processed = false,
+                processed_at = NULL,
+                updated_at = CURRENT_TIMESTAMP
+            WHERE id = %s
+            """,
+            (email_id,)
+        )
+        logger.info("🛑 Email %s moved to awaiting_user_action (%s)", email_id, reason)
+
     async def _classify_and_update(self, email_data: Dict):
         """Classify email and update database"""
@@ -240,25 +279,40 @@ class EmailProcessorService:
             logger.error(f"❌ Classification failed for email {email_data['id']}: {e}")

     async def _update_extracted_fields(self, email_id: int, extraction: Dict):
-        """Update email with extracted invoice fields"""
+        """Update email with extracted invoice fields (from PDF attachment analysis)"""
         try:
+            # Normalize amount field (new extraction uses total_amount, old used amount)
+            amount = extraction.get('total_amount') or extraction.get('amount')
+
             query = """
                 UPDATE email_messages
                 SET extracted_invoice_number = %s,
                     extracted_amount = %s,
-                    extracted_due_date = %s
+                    extracted_due_date = %s,
+                    extracted_vendor_name = %s,
+                    extracted_vendor_cvr = %s,
+                    extracted_invoice_date = %s
                 WHERE id = %s
             """

             execute_query(query, (
                 extraction.get('invoice_number'),
-                extraction.get('amount'),
+                amount,
                 extraction.get('due_date'),
+                extraction.get('vendor_name'),
+                extraction.get('vendor_cvr'),
+                extraction.get('invoice_date'),
                 email_id
             ))

-            logger.info(f"✅ Updated extracted fields for email {email_id}")
+            logger.info(
+                f"✅ Updated extracted fields for email {email_id}: "
+                f"invoice={extraction.get('invoice_number')}, "
+                f"vendor={extraction.get('vendor_name')}, "
+                f"cvr={extraction.get('vendor_cvr')}, "
+                f"amount={amount}"
+            )

         except Exception as e:
             logger.error(f"❌ Error updating extracted fields: {e}")

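The Step 3.5 gate added in the processor hunk above can be reduced to a pure decision function. A minimal sketch with hypothetical names (it only mirrors the control flow; the real code also updates the database and stats):

```python
from typing import Optional

def gate_reason(classification: str, confidence: float,
                require_manual_approval: bool, threshold: float) -> Optional[str]:
    """Return the parking reason for an email, or None to let workflows run.

    Manual-approval policy wins unconditionally; otherwise a missing
    classification or sub-threshold confidence parks the email.
    """
    if require_manual_approval:
        return "manual_approval_required"
    if not classification or confidence < threshold:
        return "low_confidence"
    return None  # proceed to workflow execution

print(gate_reason("invoice", 0.92, True, 0.8))   # manual_approval_required
print(gate_reason("", 0.92, False, 0.8))         # low_confidence
print(gate_reason("invoice", 0.92, False, 0.8))  # None
```

The design point mirrored here is that in Phase 1 every email is parked regardless of confidence, so flipping `EMAIL_REQUIRE_MANUAL_APPROVAL` later enables the confidence path without further code changes.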
@@ -11,11 +11,14 @@ import email
 from email.header import decode_header
 from email.mime.text import MIMEText
 from email.mime.multipart import MIMEMultipart
+from email.mime.base import MIMEBase
+from email import encoders
 from typing import List, Dict, Optional, Tuple
 from datetime import datetime
 import json
 import asyncio
 import base64
+from uuid import uuid4

 # Try to import aiosmtplib, but don't fail if not available
 try:
@@ -212,6 +215,12 @@ class EmailService:
                     logger.info(f"✅ New email: {parsed_email['subject'][:50]}... from {parsed_email['sender_email']}")
                 else:
                     logger.debug(f"⏭️ Email already exists: {parsed_email['message_id']}")
+                    # Re-save attachment bytes for existing emails (fills content_data for old emails)
+                    if parsed_email.get('attachments'):
+                        await self._resave_attachment_content(
+                            parsed_email['message_id'],
+                            parsed_email['attachments']
+                        )

             except Exception as e:
                 logger.error(f"❌ Error parsing Graph message: {e}")
@@ -270,6 +279,8 @@ class EmailService:

             # Get message ID
             message_id = msg.get('Message-ID', f"imap-{email_id}")
+            in_reply_to = msg.get('In-Reply-To', '')
+            email_references = msg.get('References', '')

             # Get date
             date_str = msg.get('Date', '')
@@ -343,6 +354,8 @@ class EmailService:

             return {
                 'message_id': message_id,
+                'in_reply_to': in_reply_to,
+                'email_references': email_references,
                 'subject': subject,
                 'sender_name': sender_name,
                 'sender_email': sender_email,
@@ -387,6 +400,8 @@ class EmailService:

         return {
             'message_id': msg.get('internetMessageId', msg.get('id', '')),
+            'in_reply_to': None,
+            'email_references': None,
             'subject': msg.get('subject', ''),
             'sender_name': sender_name,
             'sender_email': sender_email,
@@ -521,8 +536,9 @@ class EmailService:
                 INSERT INTO email_messages
                 (message_id, subject, sender_email, sender_name, recipient_email, cc,
                  body_text, body_html, received_date, folder, has_attachments, attachment_count,
+                 in_reply_to, email_references,
                  status, is_read)
-                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 'new', false)
+                VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, 'new', false)
                 RETURNING id
             """

@@ -538,7 +554,9 @@ class EmailService:
                 email_data['received_date'],
                 email_data['folder'],
                 email_data['has_attachments'],
-                email_data['attachment_count']
+                email_data['attachment_count'],
+                email_data.get('in_reply_to'),
+                email_data.get('email_references')
             ))

             logger.info(f"✅ Saved email {email_id}: {email_data['subject'][:50]}...")
@@ -554,48 +572,86 @@ class EmailService:
             return None

     async def _save_attachments(self, email_id: int, attachments: List[Dict]):
-        """Save email attachments to disk and database"""
+        """Save email attachments to disk and database (also stores bytes as fallback)"""
         import os
         import hashlib
         from pathlib import Path

-        # Create uploads directory if not exists
+        # Use absolute path based on UPLOAD_DIR setting
-        upload_dir = Path("uploads/email_attachments")
+        from app.core.config import settings
-        upload_dir.mkdir(parents=True, exist_ok=True)
+        upload_dir = Path(settings.UPLOAD_DIR) / "email_attachments"
+        try:
+            upload_dir.mkdir(parents=True, exist_ok=True)
+        except Exception as e:
+            logger.warning(f"⚠️ Could not create upload dir {upload_dir}: {e}")

         for att in attachments:
             try:
                 filename = att['filename']
-                content = att['content']
+                content = att['content']  # bytes
                 content_type = att.get('content_type', 'application/octet-stream')
-                size_bytes = att['size']
+                size_bytes = att.get('size', len(content) if content else 0)

+                if not content:
+                    continue

                 # Generate MD5 hash for deduplication
                 md5_hash = hashlib.md5(content).hexdigest()

-                # Save to disk with hash prefix
+                # Try to save to disk
-                file_path = upload_dir / f"{md5_hash}_{filename}"
+                file_path_str = None
-                file_path.write_bytes(content)
+                try:
+                    file_path = upload_dir / f"{md5_hash}_{filename}"
+                    file_path.write_bytes(content)
+                    file_path_str = str(file_path)
+                except Exception as e:
+                    logger.warning(f"⚠️ Could not save attachment to disk ({filename}): {e}")

-                # Save to database
+                # Save to database — always store content_data as fallback
                 query = """
                     INSERT INTO email_attachments
-                    (email_id, filename, content_type, size_bytes, file_path)
+                    (email_id, filename, content_type, size_bytes, file_path, content_data)
-                    VALUES (%s, %s, %s, %s, %s)
+                    VALUES (%s, %s, %s, %s, %s, %s)
+                    ON CONFLICT DO NOTHING
                 """
-                execute_insert(query, (
+                from psycopg2 import Binary
+                execute_query(query, (
                     email_id,
                     filename,
                     content_type,
                     size_bytes,
-                    str(file_path)
+                    file_path_str,
+                    Binary(content)
                 ))

-                logger.info(f"📎 Saved attachment: {filename} ({size_bytes} bytes)")
+                logger.info(f"📎 Saved attachment: {filename} ({size_bytes} bytes, disk={file_path_str is not None})")

             except Exception as e:
-                logger.error(f"❌ Failed to save attachment {filename}: {e}")
+                logger.error(f"❌ Failed to save attachment {att.get('filename', '?')}: {e}")

+    async def _resave_attachment_content(self, message_id: str, attachments: List[Dict]):
+        """For existing emails, store attachment bytes in content_data if not already saved"""
+        from psycopg2 import Binary
+        for att in attachments:
+            try:
+                filename = att.get('filename')
+                content = att.get('content')
+                if not filename or not content:
+                    continue
+                query = """
+                    UPDATE email_attachments
+                    SET content_data = %s
+                    WHERE email_id = (
+                        SELECT id FROM email_messages WHERE message_id = %s LIMIT 1
+                    )
+                    AND filename = %s
+                    AND content_data IS NULL
+                """
+                execute_query(query, (Binary(content), message_id, filename))
+                logger.debug(f"💾 Re-saved content_data for attachment: {filename}")
+            except Exception as e:
+                logger.warning(f"⚠️ Could not re-save content_data for {att.get('filename', '?')}: {e}")

     async def get_unprocessed_emails(self, limit: int = 100) -> List[Dict]:
         """Get emails from database that haven't been processed yet"""
         query = """
@@ -717,6 +773,8 @@ class EmailService:

         return {
             "message_id": message_id,
+            "in_reply_to": msg.get("In-Reply-To", ""),
+            "email_references": msg.get("References", ""),
             "subject": msg.get("Subject", "No Subject"),
             "sender_name": sender_name,
             "sender_email": sender_email,
@@ -782,6 +840,8 @@ class EmailService:

         return {
             "message_id": message_id,
+            "in_reply_to": None,
+            "email_references": None,
             "subject": msg.subject or "No Subject",
             "sender_name": msg.sender or "",
             "sender_email": msg.senderEmail or "",
@@ -824,9 +884,10 @@ class EmailService:
                 message_id, subject, sender_email, sender_name,
                 recipient_email, cc, body_text, body_html,
                 received_date, folder, has_attachments, attachment_count,
+                in_reply_to, email_references,
                 status, import_method, created_at
             )
-            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
+            VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, CURRENT_TIMESTAMP)
             RETURNING id
         """

@@ -843,6 +904,8 @@ class EmailService:
             email_data["folder"],
             email_data["has_attachments"],
             len(email_data.get("attachments", [])),
+            email_data.get("in_reply_to"),
+            email_data.get("email_references"),
             "new",
             "manual_upload"
         ))
@@ -953,3 +1016,107 @@ class EmailService:
             error_msg = f"❌ Failed to send email: {str(e)}"
             logger.error(error_msg)
             return False, error_msg

+    async def send_email_with_attachments(
+        self,
+        to_addresses: List[str],
+        subject: str,
+        body_text: str,
+        body_html: Optional[str] = None,
+        cc: Optional[List[str]] = None,
+        bcc: Optional[List[str]] = None,
+        reply_to: Optional[str] = None,
+        in_reply_to: Optional[str] = None,
+        references: Optional[str] = None,
+        attachments: Optional[List[Dict]] = None,
+        respect_dry_run: bool = True,
+    ) -> Tuple[bool, str, str]:
+        """Send email via SMTP with optional attachments and return generated Message-ID."""
+
+        generated_message_id = f"<{uuid4().hex}@bmchub.local>"
+
+        if respect_dry_run and settings.REMINDERS_DRY_RUN:
+            logger.warning(
+                "🔒 DRY RUN MODE: Would send email to %s with subject '%s'",
+                to_addresses,
+                subject,
+            )
+            return True, "Dry run mode - email not actually sent", generated_message_id
+
+        if not HAS_AIOSMTPLIB:
+            logger.error("❌ aiosmtplib not installed - cannot send email. Install with: pip install aiosmtplib")
+            return False, "aiosmtplib not installed", generated_message_id
+
+        if not all([settings.EMAIL_SMTP_HOST, settings.EMAIL_SMTP_USER, settings.EMAIL_SMTP_PASSWORD]):
+            logger.error("❌ SMTP not configured - cannot send email")
+            return False, "SMTP not configured", generated_message_id
+
+        try:
+            msg = MIMEMultipart('mixed')
+            msg['Subject'] = subject
+            msg['From'] = f"{settings.EMAIL_SMTP_FROM_NAME} <{settings.EMAIL_SMTP_FROM_ADDRESS}>"
+            msg['To'] = ', '.join(to_addresses)
+            msg['Message-ID'] = generated_message_id
+
+            if cc:
+                msg['Cc'] = ', '.join(cc)
+            if reply_to:
+                msg['Reply-To'] = reply_to
+            if in_reply_to:
+                msg['In-Reply-To'] = in_reply_to
+            if references:
+                msg['References'] = references
+
+            content_part = MIMEMultipart('alternative')
+            content_part.attach(MIMEText(body_text, 'plain'))
+            if body_html:
+                content_part.attach(MIMEText(body_html, 'html'))
+            msg.attach(content_part)
+
+            for attachment in (attachments or []):
+                content = attachment.get("content")
+                if not content:
+                    continue
+
+                filename = attachment.get("filename") or "attachment.bin"
+                content_type = attachment.get("content_type") or "application/octet-stream"
+                maintype, _, subtype = content_type.partition("/")
+                if not maintype or not subtype:
+                    maintype, subtype = "application", "octet-stream"
+
+                mime_attachment = MIMEBase(maintype, subtype)
+                mime_attachment.set_payload(content)
+                encoders.encode_base64(mime_attachment)
+                mime_attachment.add_header('Content-Disposition', f'attachment; filename="{filename}"')
+                msg.attach(mime_attachment)
+
+            async with aiosmtplib.SMTP(
+                hostname=settings.EMAIL_SMTP_HOST,
+                port=settings.EMAIL_SMTP_PORT,
+                use_tls=settings.EMAIL_SMTP_USE_TLS
+            ) as smtp:
+                await smtp.login(settings.EMAIL_SMTP_USER, settings.EMAIL_SMTP_PASSWORD)
+
+                all_recipients = to_addresses.copy()
+                if cc:
+                    all_recipients.extend(cc)
+                if bcc:
+                    all_recipients.extend(bcc)
+
+                await smtp.sendmail(
+                    settings.EMAIL_SMTP_FROM_ADDRESS,
+                    all_recipients,
+                    msg.as_string()
+                )
+
+            logger.info(
+                "✅ Email with attachments sent successfully to %s recipient(s): %s",
+                len(to_addresses),
+                subject,
+            )
+            return True, f"Email sent to {len(to_addresses)} recipient(s)", generated_message_id
+
+        except Exception as e:
+            error_msg = f"❌ Failed to send email with attachments: {str(e)}"
+            logger.error(error_msg)
+            return False, error_msg, generated_message_id

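The attachment loop in `send_email_with_attachments` above splits the MIME content type with `str.partition("/")` and falls back to `application/octet-stream` when the value is malformed. A standalone sketch of just that split (hypothetical helper name, not part of the diff):

```python
from typing import Tuple

def split_content_type(content_type: str) -> Tuple[str, str]:
    """Split 'maintype/subtype', defaulting to application/octet-stream.

    str.partition('/') returns ('weird', '', '') when no slash is present,
    so checking both halves catches empty and slash-less inputs alike.
    """
    maintype, _, subtype = (content_type or "").partition("/")
    if not maintype or not subtype:
        return "application", "octet-stream"
    return maintype, subtype

print(split_content_type("application/pdf"))  # ('application', 'pdf')
print(split_content_type("weird"))            # ('application', 'octet-stream')
print(split_content_type(""))                 # ('application', 'octet-stream')
```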
@ -26,6 +26,17 @@ class EmailWorkflowService:
|
|||||||
|
|
||||||
def __init__(self):
|
def __init__(self):
|
||||||
self.enabled = settings.EMAIL_WORKFLOWS_ENABLED if hasattr(settings, 'EMAIL_WORKFLOWS_ENABLED') else True
|
self.enabled = settings.EMAIL_WORKFLOWS_ENABLED if hasattr(settings, 'EMAIL_WORKFLOWS_ENABLED') else True
|
||||||
|
|
||||||
|
HELPDESK_SKIP_CLASSIFICATIONS = {
|
||||||
|
'invoice',
|
||||||
|
'order_confirmation',
|
||||||
|
'freight_note',
|
||||||
|
'time_confirmation',
|
||||||
|
'newsletter',
|
||||||
|
'spam',
|
||||||
|
'bankruptcy',
|
||||||
|
'recording'
|
||||||
|
}
|
||||||
|
|
||||||
async def execute_workflows(self, email_data: Dict) -> Dict:
|
async def execute_workflows(self, email_data: Dict) -> Dict:
|
||||||
"""
|
"""
|
||||||
@ -69,6 +80,18 @@ class EmailWorkflowService:
|
|||||||
results['workflows_executed'] += 1
|
results['workflows_executed'] += 1
|
||||||
results['workflows_succeeded'] += 1
|
results['workflows_succeeded'] += 1
|
||||||
logger.info("✅ Bankruptcy system workflow executed successfully")
|
logger.info("✅ Bankruptcy system workflow executed successfully")
|
||||||
|
|
||||||
|
# Special System Workflow: Helpdesk SAG routing
|
||||||
|
# - If SAG-<id> is present in subject/header => update existing case
|
||||||
|
# - If no SAG id and sender domain matches customer => create new case
|
||||||
|
if classification not in self.HELPDESK_SKIP_CLASSIFICATIONS:
|
||||||
|
helpdesk_result = await self._handle_helpdesk_sag_routing(email_data)
|
||||||
|
if helpdesk_result:
|
||||||
|
results['details'].append(helpdesk_result)
|
||||||
|
if helpdesk_result.get('status') == 'completed':
|
||||||
|
results['workflows_executed'] += 1
|
||||||
|
results['workflows_succeeded'] += 1
|
||||||
|
logger.info("✅ Helpdesk SAG routing workflow executed")
|
||||||
|
|
||||||
# Find matching workflows
|
# Find matching workflows
|
||||||
workflows = await self._find_matching_workflows(email_data)
|
workflows = await self._find_matching_workflows(email_data)
|
||||||
@ -176,6 +199,235 @@ class EmailWorkflowService:
|
|||||||
'customer_name': first_match['name']
|
'customer_name': first_match['name']
|
||||||
}
|
}
|
||||||
|
|
||||||
|
def _extract_sender_domain(self, email_data: Dict) -> Optional[str]:
|
||||||
|
sender_email = (email_data.get('sender_email') or '').strip().lower()
|
||||||
|
if '@' not in sender_email:
|
||||||
|
return None
|
||||||
|
domain = sender_email.split('@', 1)[1].strip()
|
||||||
|
if domain.startswith('www.'):
|
||||||
|
domain = domain[4:]
|
||||||
|
return domain or None
|
||||||
|
|
||||||
|
def _extract_sag_id(self, email_data: Dict) -> Optional[int]:
|
||||||
|
candidates = [
|
||||||
|
email_data.get('subject') or '',
|
||||||
|
email_data.get('in_reply_to') or '',
|
||||||
|
email_data.get('email_references') or ''
|
||||||
|
]
|
||||||
|
|
||||||
|
for value in candidates:
|
||||||
|
match = re.search(r'\bSAG-(\d+)\b', value, re.IGNORECASE)
|
||||||
|
if match:
|
||||||
|
return int(match.group(1))
|
||||||
|
return None
|
||||||
|
|
||||||
|
def _normalize_message_id(self, value: Optional[str]) -> Optional[str]:
|
||||||
|
if not value:
|
||||||
|
return None
|
||||||
|
normalized = re.sub(r'[<>\s]', '', str(value)).lower().strip()
|
||||||
|
return normalized or None
|
||||||
|
|
||||||
|
def _extract_thread_message_ids(self, email_data: Dict) -> List[str]:
|
||||||
|
tokens: List[str] = []
|
||||||
|
|
||||||
|
in_reply_to = self._normalize_message_id(email_data.get('in_reply_to'))
|
||||||
|
if in_reply_to:
|
||||||
|
tokens.append(in_reply_to)
|
||||||
|
|
||||||
|
raw_references = (email_data.get('email_references') or '').strip()
|
||||||
|
if raw_references:
|
||||||
|
for ref in re.split(r'[\s,]+', raw_references):
|
||||||
|
normalized_ref = self._normalize_message_id(ref)
|
||||||
|
if normalized_ref:
|
||||||
|
tokens.append(normalized_ref)
|
||||||
|
|
||||||
|
# De-duplicate while preserving order
|
||||||
|
return list(dict.fromkeys(tokens))
|
||||||
|
|
||||||
|
+    def _find_sag_id_from_thread_headers(self, email_data: Dict) -> Optional[int]:
+        thread_message_ids = self._extract_thread_message_ids(email_data)
+        if not thread_message_ids:
+            return None
+
+        placeholders = ','.join(['%s'] * len(thread_message_ids))
+        rows = execute_query(
+            f"""
+            SELECT se.sag_id
+            FROM sag_emails se
+            JOIN email_messages em ON em.id = se.email_id
+            WHERE em.deleted_at IS NULL
+              AND LOWER(REGEXP_REPLACE(COALESCE(em.message_id, ''), '[<>\\s]', '', 'g')) IN ({placeholders})
+            ORDER BY se.created_at DESC
+            LIMIT 1
+            """,
+            tuple(thread_message_ids)
+        )
+        return rows[0]['sag_id'] if rows else None
+
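The query above binds a variable-length `IN` list by generating one `%s` placeholder per message id; the ids themselves travel separately as a parameter tuple, so nothing is interpolated into the SQL string. A sketch of just the placeholder construction:

```python
thread_message_ids = ["a@x", "b@y", "c@z"]
# One psycopg-style placeholder per value in the IN list
placeholders = ','.join(['%s'] * len(thread_message_ids))
print(placeholders)  # %s,%s,%s

# The values are then passed as parameters, e.g.:
# execute_query(f"... IN ({placeholders})", tuple(thread_message_ids))
```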
+    def _find_customer_by_domain(self, domain: str) -> Optional[Dict[str, Any]]:
+        if not domain:
+            return None
+
+        domain = domain.lower().strip()
+        domain_alt = domain[4:] if domain.startswith('www.') else f"www.{domain}"
+
+        query = """
+            SELECT id, name
+            FROM customers
+            WHERE is_active = true
+              AND (
+                LOWER(TRIM(email_domain)) = %s
+                OR LOWER(TRIM(email_domain)) = %s
+              )
+            ORDER BY id ASC
+            LIMIT 1
+        """
+        rows = execute_query(query, (domain, domain_alt))
+        return rows[0] if rows else None
+
+    def _link_email_to_sag(self, sag_id: int, email_id: int) -> None:
+        execute_update(
+            """
+            INSERT INTO sag_emails (sag_id, email_id)
+            SELECT %s, %s
+            WHERE NOT EXISTS (
+                SELECT 1 FROM sag_emails WHERE sag_id = %s AND email_id = %s
+            )
+            """,
+            (sag_id, email_id, sag_id, email_id)
+        )
+
+    def _add_helpdesk_comment(self, sag_id: int, email_data: Dict) -> None:
+        sender = email_data.get('sender_email') or 'ukendt'
+        subject = email_data.get('subject') or '(ingen emne)'
+        received = email_data.get('received_date')
+        received_str = received.isoformat() if hasattr(received, 'isoformat') else str(received or '')
+        body_text = (email_data.get('body_text') or '').strip()
+
+        comment = (
+            f"📧 Indgående email\n"
+            f"Fra: {sender}\n"
+            f"Emne: {subject}\n"
+            f"Modtaget: {received_str}\n\n"
+            f"{body_text}"
+        )
+
+        execute_update(
+            """
+            INSERT INTO sag_kommentarer (sag_id, forfatter, indhold, er_system_besked)
+            VALUES (%s, %s, %s, %s)
+            """,
+            (sag_id, 'Email Bot', comment, True)
+        )
+
+    def _create_sag_from_email(self, email_data: Dict, customer_id: int) -> Dict[str, Any]:
+        sender = email_data.get('sender_email') or 'ukendt'
+        subject = (email_data.get('subject') or '').strip() or f"Email fra {sender}"
+
+        description = (
+            f"Auto-oprettet fra email\n"
+            f"Fra: {sender}\n"
+            f"Message-ID: {email_data.get('message_id') or ''}\n\n"
+            f"{(email_data.get('body_text') or '').strip()}"
+        )
+
+        rows = execute_query(
+            """
+            INSERT INTO sag_sager (
+                titel, beskrivelse, template_key, status, customer_id, created_by_user_id
+            )
+            VALUES (%s, %s, %s, %s, %s, %s)
+            RETURNING id, titel, customer_id
+            """,
+            (subject, description, 'ticket', 'åben', customer_id, 1)
+        )
+
+        if not rows:
+            raise ValueError('Failed to create SAG from email')
+        return rows[0]
+
+    async def _handle_helpdesk_sag_routing(self, email_data: Dict) -> Optional[Dict[str, Any]]:
+        email_id = email_data.get('id')
+        if not email_id:
+            return None
+
+        sag_id = self._extract_sag_id(email_data)
+        if not sag_id:
+            sag_id = self._find_sag_id_from_thread_headers(email_data)
+            if sag_id:
+                logger.info("🔗 Matched email %s to SAG-%s via thread headers", email_id, sag_id)
+
+        # 1) Existing SAG via subject/headers
+        if sag_id:
+            case_rows = execute_query(
+                "SELECT id, customer_id, titel FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
+                (sag_id,)
+            )
+
+            if not case_rows:
+                logger.warning("⚠️ Email %s referenced SAG-%s but case was not found", email_id, sag_id)
+                return {'status': 'skipped', 'action': 'sag_id_not_found', 'sag_id': sag_id}
+
+            case = case_rows[0]
+            self._add_helpdesk_comment(sag_id, email_data)
+            self._link_email_to_sag(sag_id, email_id)
+
+            execute_update(
+                """
+                UPDATE email_messages
+                SET linked_case_id = %s,
+                    customer_id = COALESCE(customer_id, %s),
+                    status = 'processed',
+                    folder = 'Processed',
+                    processed_at = CURRENT_TIMESTAMP,
+                    auto_processed = true
+                WHERE id = %s
+                """,
+                (sag_id, case.get('customer_id'), email_id)
+            )
+
+            return {
+                'status': 'completed',
+                'action': 'updated_existing_sag',
+                'sag_id': sag_id,
+                'customer_id': case.get('customer_id')
+            }
+
+        # 2) No SAG id -> create only if sender domain belongs to known customer
+        sender_domain = self._extract_sender_domain(email_data)
+        customer = self._find_customer_by_domain(sender_domain) if sender_domain else None
+
+        if not customer:
+            logger.info("⏭️ Email %s has no known customer domain (%s) - kept in /emails", email_id, sender_domain)
+            return {'status': 'skipped', 'action': 'unknown_customer_domain', 'domain': sender_domain}
+
+        case = self._create_sag_from_email(email_data, customer['id'])
+        self._add_helpdesk_comment(case['id'], email_data)
+        self._link_email_to_sag(case['id'], email_id)
+
+        execute_update(
+            """
+            UPDATE email_messages
+            SET linked_case_id = %s,
+                customer_id = %s,
+                status = 'processed',
+                folder = 'Processed',
+                processed_at = CURRENT_TIMESTAMP,
+                auto_processed = true
+            WHERE id = %s
+            """,
+            (case['id'], customer['id'], email_id)
+        )
+
+        logger.info("✅ Created SAG-%s from email %s for customer %s", case['id'], email_id, customer['id'])
+        return {
+            'status': 'completed',
+            'action': 'created_new_sag',
+            'sag_id': case['id'],
+            'customer_id': customer['id'],
+            'domain': sender_domain
+        }
+
     async def _find_matching_workflows(self, email_data: Dict) -> List[Dict]:
         """Find all workflows that match this email"""
         classification = email_data.get('classification')
@@ -357,6 +609,7 @@ class EmailWorkflowService:
         handler_map = {
             'create_ticket': self._action_create_ticket_system,
             'link_email_to_ticket': self._action_link_email_to_ticket,
+            'route_helpdesk_sag': self._handle_helpdesk_sag_routing,
             'create_time_entry': self._action_create_time_entry,
             'link_to_vendor': self._action_link_to_vendor,
             'link_to_customer': self._action_link_to_customer,
@@ -469,8 +722,8 @@ class EmailWorkflowService:
             'body': email_data.get('body_text', ''),
             'html_body': email_data.get('body_html'),
             'received_at': email_data.get('received_date').isoformat() if email_data.get('received_date') else None,
-            'in_reply_to': None,  # TODO: Extract from email headers
-            'references': None  # TODO: Extract from email headers
+            'in_reply_to': email_data.get('in_reply_to'),
+            'references': email_data.get('email_references')
         }

         # Get params from workflow
@@ -516,6 +769,8 @@ class EmailWorkflowService:
             'body': email_data.get('body_text', ''),
             'html_body': email_data.get('body_html'),
             'received_at': email_data.get('received_date').isoformat() if email_data.get('received_date') else None,
+            'in_reply_to': email_data.get('in_reply_to'),
+            'references': email_data.get('email_references')
         }

         logger.info(f"🔗 Linking email to ticket {ticket_number}")
@@ -594,13 +849,31 @@ class EmailWorkflowService:
         return {'action': 'link_to_vendor', 'matched': False, 'reason': 'Vendor not found'}

     async def _action_link_to_customer(self, params: Dict, email_data: Dict) -> Dict:
-        """Link email to customer"""
-        logger.info(f"🔗 Would link to customer")
-
-        # TODO: Implement customer matching logic
+        """Link email to customer by sender domain and persist on email_messages"""
+        sender_domain = self._extract_sender_domain(email_data)
+        if not sender_domain:
+            return {'action': 'link_to_customer', 'matched': False, 'reason': 'No sender domain'}
+
+        customer = self._find_customer_by_domain(sender_domain)
+        if not customer:
+            return {
+                'action': 'link_to_customer',
+                'matched': False,
+                'reason': 'Customer not found for domain',
+                'domain': sender_domain
+            }
+
+        execute_update(
+            "UPDATE email_messages SET customer_id = %s WHERE id = %s",
+            (customer['id'], email_data['id'])
+        )
+
         return {
             'action': 'link_to_customer',
-            'note': 'Customer linking not yet implemented'
+            'matched': True,
+            'customer_id': customer['id'],
+            'customer_name': customer['name'],
+            'domain': sender_domain
         }

     async def _action_extract_invoice_data(self, params: Dict, email_data: Dict) -> Dict:
@@ -28,14 +28,26 @@ class OllamaService:

     def _build_system_prompt(self) -> str:
         """Build Danish system prompt for invoice extraction with CVR"""
-        return """Du er en ekspert i at læse og udtrække strukturerede data fra danske fakturaer, kreditnotaer og leverandørdokumenter.
+        own_cvr = getattr(settings, 'OWN_CVR', '29522790')
+        # BMC har to CVR numre – begge er VORES (køber), aldrig leverandør
+        own_cvr_rule = (
+            f"4b. KRITISK - LEVERANDØR vs. MODTAGER:\n"
+            f"   - På en dansk faktura er LEVERANDØREN (vendor) det firma der HAR SENDT fakturaen.\n"
+            f"     De kendes på: firmalogo øverst, bankkonto/IBAN/Gironr. nedad, ingen 'Faktureres til' label.\n"
+            f"   - MODTAGEREN (os, buyer) kendes på: navnes under 'Faktureres til', 'Att.', 'Kundenr.', adresseblok med vores navn.\n"
+            f"   - BMC DENMARK APS og alle varianter af 'BMC' er ALDRIG leverandøren – det er os (modtageren).\n"
+            f"   - CVR {own_cvr} er VORES eget CVR. Sæt ALDRIG vendor_cvr til {own_cvr}.\n"
+            f"   - CVR 14416285 er også VORES CVR. Sæt ALDRIG vendor_cvr til 14416285.\n"
+            f"   - Ignorer 'SE/CVR-nr.' der hører til modtager-blokken – brug KUN afsenderens CVR som vendor_cvr.\n"
+        )
+        return ("""Du er en ekspert i at læse og udtrække strukturerede data fra danske fakturaer, kreditnotaer og leverandørdokumenter.

 VIGTIGE REGLER:
 1. Returner KUN gyldig JSON - ingen forklaring eller ekstra tekst
 2. Hvis et felt ikke findes, sæt det til null
 3. Beregn confidence baseret på hvor sikker du er på hvert felt (0.0-1.0)
 4. Datoer skal være i format YYYY-MM-DD
-5. DANSKE PRISFORMATER:
+""" + own_cvr_rule + """5. DANSKE PRISFORMATER:
 - Tusind-separator kan være . (punkt) eller mellemrum: "5.965,18" eller "5 965,18"
 - Decimal-separator er , (komma): "1.234,56 kr"
 - I JSON output skal du bruge . (punkt) som decimal: 1234.56
@@ -126,7 +138,7 @@ Output: {
     "confidence": 0.95
   }],
   "confidence": 0.95
-}"""
+}""")

     async def extract_from_text(self, text: str) -> Dict:
         """
@@ -170,10 +182,11 @@ Output: {
             ],
             "stream": False,
             "format": "json",
+            "think": False,
             "options": {
                 "temperature": 0.1,
                 "top_p": 0.9,
-                "num_predict": 2000
+                "num_predict": 8000
             }
         }
     )
@@ -189,7 +202,7 @@ Output: {
             "options": {
                 "temperature": 0.1,
                 "top_p": 0.9,
-                "num_predict": 2000
+                "num_predict": 8000
             }
         }
     )
@@ -301,53 +314,88 @@
         }

     def _parse_json_response(self, response: str) -> Dict:
-        """Parse JSON from LLM response with improved error handling"""
-        try:
-            # Log preview of response for debugging
-            logger.info(f"🔍 Response preview (first 500 chars): {response[:500]}")
-
-            # Find JSON in response (between first { and last })
-            start = response.find('{')
-            end = response.rfind('}') + 1
-
-            if start >= 0 and end > start:
-                json_str = response[start:end]
-                logger.info(f"🔍 Extracted JSON string length: {len(json_str)}, starts at position {start}")
-
-                # Try to fix common JSON issues
-                # Remove trailing commas before } or ]
-                json_str = re.sub(r',(\s*[}\]])', r'\1', json_str)
-                # Fix single quotes to double quotes (but not in values)
-                # This is risky, so we only do it if initial parse fails
-                try:
-                    data = json.loads(json_str)
-                    return data
-                except json.JSONDecodeError:
-                    # Try to fix common issues
-                    # Replace single quotes with double quotes (simple approach)
-                    fixed_json = json_str.replace("'", '"')
-                    try:
-                        data = json.loads(fixed_json)
-                        logger.warning("⚠️ Fixed JSON with quote replacement")
-                        return data
-                    except:
-                        pass
-
-                    # Last resort: log the problematic JSON
-                    logger.error(f"❌ Problematic JSON: {json_str[:300]}")
-                    raise
-            else:
-                raise ValueError("No JSON found in response")
-
-        except json.JSONDecodeError as e:
-            logger.error(f"❌ JSON parsing failed: {e}")
-            logger.error(f"Raw response preview: {response[:500]}")
-            return {
-                "error": f"JSON parsing failed: {str(e)}",
-                "confidence": 0.0,
-                "raw_response": response[:500]
-            }
+        """Parse JSON from LLM response with aggressive fallback strategies"""
+        logger.info(f"🔍 Response length: {len(response)}, preview: {response[:200]}")
+
+        # Find outermost JSON object
+        start = response.find('{')
+        end = response.rfind('}') + 1
+        if start < 0 or end <= start:
+            logger.error("❌ No JSON object found in response")
+            return self._extract_fields_with_regex(response)
+
+        json_str = response[start:end]
+
+        # Strategy 1: direct parse
+        try:
+            return json.loads(json_str)
+        except json.JSONDecodeError:
+            pass
+
+        # Strategy 2: remove trailing commas before } or ]
+        fixed = re.sub(r',(\s*[}\]])', r'\1', json_str)
+        try:
+            return json.loads(fixed)
+        except json.JSONDecodeError:
+            pass
+
+        # Strategy 3: remove JS-style comments (// and /* */)
+        fixed = re.sub(r'//[^\n]*', '', fixed)
+        fixed = re.sub(r'/\*.*?\*/', '', fixed, flags=re.DOTALL)
+        try:
+            return json.loads(fixed)
+        except json.JSONDecodeError:
+            pass
+
+        # Strategy 4: truncate at last valid closing brace
+        # Walk backwards to find the longest valid JSON prefix; fixed was
+        # already sliced from the first '{', so each candidate starts at index 0
+        for i in range(len(fixed) - 1, 0, -1):
+            if fixed[i] != '}':
+                continue
+            try:
+                data = json.loads(fixed[:i + 1])
+                logger.warning(f"⚠️ JSON truncated to position {i} — partial parse OK")
+                return data
+            except json.JSONDecodeError:
+                continue
+
+        # Strategy 5: regex extraction of key fields (always succeeds with partial data)
+        logger.warning("⚠️ All JSON strategies failed — using regex field extraction")
+        return self._extract_fields_with_regex(response)
+
+    def _extract_fields_with_regex(self, text: str) -> Dict:
+        """Extract invoice fields from text using regex when JSON parsing fails"""
+        def _find(pattern, default=None):
+            m = re.search(pattern, text, re.IGNORECASE)
+            return m.group(1).strip() if m else default
+
+        def _find_num(pattern):
+            m = re.search(pattern, text, re.IGNORECASE)
+            if not m:
+                return None
+            val = m.group(1).replace('.', '').replace(',', '.')
+            try:
+                return float(val)
+            except ValueError:
+                return None
+
+        result = {
+            "document_type": _find(r'"document_type"\s*:\s*"([^"]+)"', 'invoice'),
+            "invoice_number": _find(r'"invoice_number"\s*:\s*"?([^",\n}]+)"?'),
+            "vendor_name": _find(r'"vendor_name"\s*:\s*"([^"]+)"'),
+            "vendor_cvr": _find(r'"vendor_cvr"\s*:\s*"?(\d{8})"?'),
+            "invoice_date": _find(r'"invoice_date"\s*:\s*"([^"]+)"'),
+            "due_date": _find(r'"due_date"\s*:\s*"([^"]+)"'),
+            "currency": _find(r'"currency"\s*:\s*"([^"]+)"', 'DKK'),
+            "total_amount": _find_num(r'"total_amount"\s*:\s*([\d.,]+)'),
+            "vat_amount": _find_num(r'"vat_amount"\s*:\s*([\d.,]+)'),
+            "confidence": 0.5,
+            "lines": [],
+            "_partial": True,
+        }
+        logger.info(f"🔧 Regex extraction: vendor={result['vendor_name']}, cvr={result['vendor_cvr']}, total={result['total_amount']}")
+        return result
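Two of the fallback tricks above are easy to verify in isolation: the strategy-2 trailing-comma repair and the Danish amount normalisation used by `_find_num`. A small standalone sketch with made-up values:

```python
import json
import re

# Strategy 2: drop a comma that directly precedes a closing } or ]
raw = '{"vendor_name": "Demo A/S", "lines": [1, 2, 3,], "total_amount": "1.234,56",}'
fixed = re.sub(r',(\s*[}\]])', r'\1', raw)
data = json.loads(fixed)  # now valid JSON

# Danish amount: strip thousand-separator dots, turn the decimal comma into a point
amount = float(data["total_amount"].replace('.', '').replace(',', '.'))
print(amount)  # 1234.56
```

The substitution is safe for this purpose because a comma immediately before `}` or `]` is never valid JSON, so removing it cannot change the meaning of an already-valid document.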

     def calculate_file_checksum(self, file_path: Path) -> str:
         """Calculate SHA256 checksum of file for duplicate detection"""
@@ -6,7 +6,11 @@ from fastapi import APIRouter, HTTPException
 from typing import List, Optional, Dict
 from pydantic import BaseModel
 from app.core.database import execute_query
+from app.core.config import settings
+import httpx
+import time
 import logging
+import json

 logger = logging.getLogger(__name__)
 router = APIRouter()
@@ -72,7 +76,7 @@ async def get_setting(key: str):
     query = "SELECT * FROM settings WHERE key = %s"
     result = execute_query(query, (key,))

-    if not result and key in {"case_types", "case_type_module_defaults"}:
+    if not result and key in {"case_types", "case_type_module_defaults", "case_statuses"}:
         seed_query = """
             INSERT INTO settings (key, value, category, description, value_type, is_public)
             VALUES (%s, %s, %s, %s, %s, %s)
@@ -105,6 +109,25 @@
             )
         )

+        if key == "case_statuses":
+            execute_query(
+                seed_query,
+                (
+                    "case_statuses",
+                    json.dumps([
+                        {"value": "åben", "is_closed": False},
+                        {"value": "under behandling", "is_closed": False},
+                        {"value": "afventer", "is_closed": False},
+                        {"value": "løst", "is_closed": True},
+                        {"value": "lukket", "is_closed": True},
+                    ], ensure_ascii=False),
+                    "system",
+                    "Sagsstatus værdier og lukkede markeringer",
+                    "json",
+                    True,
+                )
+            )
+
     result = execute_query(query, (key,))

     if not result:
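The seed above passes `ensure_ascii=False` so Danish letters such as å and ø are stored verbatim in the settings table instead of as `\uXXXX` escapes:

```python
import json

statuses = [{"value": "åben", "is_closed": False}]
# Default: non-ASCII characters are escaped
print(json.dumps(statuses))                      # [{"value": "\u00e5ben", "is_closed": false}]
# ensure_ascii=False: UTF-8 characters pass through unchanged
print(json.dumps(statuses, ensure_ascii=False))  # [{"value": "åben", "is_closed": false}]
```

Both forms decode to the same Python object; the difference only matters for readability of the stored value.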
@@ -177,7 +200,7 @@ async def sync_settings_from_env():
 @router.get("/users", response_model=List[User], tags=["Users"])
 async def get_users(is_active: Optional[bool] = None):
     """Get all users"""
-    query = "SELECT user_id as id, username, email, full_name, is_active, last_login, created_at FROM users"
+    query = "SELECT user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at FROM users"
     params = []

     if is_active is not None:
@@ -192,7 +215,7 @@ async def get_users(is_active: Optional[bool] = None):
 @router.get("/users/{user_id}", response_model=User, tags=["Users"])
 async def get_user(user_id: int):
     """Get user by ID"""
-    query = "SELECT user_id as id, username, email, full_name, is_active, last_login, created_at FROM users WHERE user_id = %s"
+    query = "SELECT user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at FROM users WHERE user_id = %s"
     result = execute_query(query, (user_id,))

     if not result:
@@ -216,7 +239,7 @@ async def create_user(user: UserCreate):
     query = """
         INSERT INTO users (username, email, password_hash, full_name, is_active)
         VALUES (%s, %s, %s, %s, true)
-        RETURNING user_id as id, username, email, full_name, is_active, last_login, created_at
+        RETURNING user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at
     """
     result = execute_query(query, (user.username, user.email, password_hash, user.full_name))

@@ -257,7 +280,7 @@ async def update_user(user_id: int, user: UserUpdate):
         UPDATE users
         SET {', '.join(update_fields)}, updated_at = CURRENT_TIMESTAMP
         WHERE user_id = %s
-        RETURNING user_id as id, username, email, full_name, is_active, last_login, created_at
+        RETURNING user_id as id, username, email, full_name, is_active, last_login_at as last_login, created_at
     """

     result = execute_query(query, tuple(params))
@@ -474,15 +497,11 @@ Output: [Liste af opgaver]""",
 }


-@router.get("/ai-prompts", tags=["Settings"])
-async def get_ai_prompts():
-    """Get all AI prompts (defaults merged with custom overrides)"""
+def _get_prompts_with_overrides() -> Dict:
+    """Get AI prompts with DB overrides applied"""
     prompts = _get_default_prompts()

     try:
-        # Check for custom overrides in DB
-        # Note: Table ai_prompts must rely on migration 066
         rows = execute_query("SELECT key, prompt_text FROM ai_prompts")
         if rows:
             for row in rows:
@@ -491,14 +510,40 @@ async def get_ai_prompts():
                 prompts[row['key']]['is_custom'] = True
     except Exception as e:
         logger.warning(f"Could not load custom ai prompts: {e}")

     return prompts


+def _get_test_input_for_prompt(key: str) -> str:
+    """Default test input per prompt type"""
+    examples = {
+        "invoice_extraction": "FAKTURA 2026-1001 fra Demo A/S. CVR 12345678. Total 1.250,00 DKK inkl moms.",
+        "ticket_classification": "Emne: Kan ikke logge på VPN. Beskrivelse: Flere brugere er ramt siden i morges.",
+        "ticket_summary": "Bruger havde netværksfejl. Router genstartet og DNS opdateret. Forbindelse virker nu stabilt.",
+        "kb_generation": "Problem: Outlook åbner ikke. Løsning: Reparer Office installation og nulstil profil.",
+        "troubleshooting_assistant": "Server svarer langsomt efter opdatering. CPU er høj, disk IO er normal.",
+        "sentiment_analysis": "Jeg er meget frustreret, systemet er nede igen og vi mister kunder!",
+        "meeting_action_items": "Peter opdaterer firewall fredag. Anna sender status til kunden mandag.",
+    }
+    return examples.get(key, "Skriv kort: AI test OK")
+
+
+@router.get("/ai-prompts", tags=["Settings"])
+async def get_ai_prompts():
+    """Get all AI prompts (defaults merged with custom overrides)"""
+    return _get_prompts_with_overrides()
+
+
 class PromptUpdate(BaseModel):
     prompt_text: str


+class PromptTestRequest(BaseModel):
+    test_input: Optional[str] = None
+    prompt_text: Optional[str] = None
+
+
 @router.put("/ai-prompts/{key}", tags=["Settings"])
 async def update_ai_prompt(key: str, update: PromptUpdate):
     """Override a system prompt with a custom one"""
@ -533,3 +578,98 @@ async def reset_ai_prompt(key: str):
|
|||||||
raise HTTPException(status_code=500, detail="Could not reset prompt")
|
raise HTTPException(status_code=500, detail="Could not reset prompt")
|
||||||
|
|
||||||
|
|
||||||
|
@router.post("/ai-prompts/{key}/test", tags=["Settings"])
|
||||||
|
async def test_ai_prompt(key: str, payload: PromptTestRequest):
|
||||||
|
"""Run a quick AI test for a specific system prompt"""
|
||||||
|
prompts = _get_prompts_with_overrides()
|
||||||
|
if key not in prompts:
|
||||||
|
raise HTTPException(status_code=404, detail="Unknown prompt key")
|
||||||
|
|
||||||
|
prompt_cfg = prompts[key]
|
||||||
|
model = prompt_cfg.get("model") or settings.OLLAMA_MODEL
|
||||||
|
endpoint = prompt_cfg.get("endpoint") or settings.OLLAMA_ENDPOINT
|
||||||
|
prompt_text = (payload.prompt_text or prompt_cfg.get("prompt") or "").strip()
|
||||||
|
if not prompt_text:
|
||||||
|
raise HTTPException(status_code=400, detail="Prompt text is empty")
|
||||||
|
|
||||||
|
test_input = (payload.test_input or _get_test_input_for_prompt(key)).strip()
|
||||||
|
if not test_input:
|
||||||
|
raise HTTPException(status_code=400, detail="Test input is empty")
|
||||||
|
|
||||||
|
start = time.perf_counter()
|
||||||
|
try:
|
||||||
|
+        model_normalized = (model or "").strip().lower()
+        # qwen models are more reliable with /api/chat than /api/generate.
+        use_chat_api = model_normalized.startswith("qwen")
+
+        timeout = httpx.Timeout(connect=10.0, read=180.0, write=30.0, pool=10.0)
+        async with httpx.AsyncClient(timeout=timeout) as client:
+            if use_chat_api:
+                response = await client.post(
+                    f"{endpoint}/api/chat",
+                    json={
+                        "model": model,
+                        "messages": [
+                            {"role": "system", "content": prompt_text},
+                            {"role": "user", "content": test_input},
+                        ],
+                        "stream": False,
+                        "options": {"temperature": 0.2, "num_predict": 600},
+                    },
+                )
+            else:
+                response = await client.post(
+                    f"{endpoint}/api/generate",
+                    json={
+                        "model": model,
+                        "prompt": f"{prompt_text}\n\nBrugerinput:\n{test_input}",
+                        "stream": False,
+                        "options": {"temperature": 0.2, "num_predict": 600},
+                    },
+                )
+
+        if response.status_code != 200:
+            raise HTTPException(
+                status_code=502,
+                detail=f"AI endpoint fejl: {response.status_code} - {response.text[:300]}",
+            )
+
+        try:
+            data = response.json()
+        except Exception as parse_error:
+            raise HTTPException(
+                status_code=502,
+                detail=f"AI endpoint returnerede ugyldig JSON: {str(parse_error)}",
+            )
+
+        if use_chat_api:
+            message_data = data.get("message", {})
+            ai_response = (message_data.get("content") or message_data.get("thinking") or "").strip()
+        else:
+            ai_response = (data.get("response") or "").strip()
+
+        if not ai_response:
+            raise HTTPException(status_code=502, detail="AI returnerede tomt svar")
+
+        latency_ms = int((time.perf_counter() - start) * 1000)
+        return {
+            "ok": True,
+            "key": key,
+            "model": model,
+            "endpoint": endpoint,
+            "test_input": test_input,
+            "ai_response": ai_response,
+            "latency_ms": latency_ms,
+        }
+
+    except HTTPException:
+        raise
+    except httpx.TimeoutException as e:
+        logger.error(f"❌ AI prompt test timed out for {key}: {repr(e)}")
+        raise HTTPException(status_code=504, detail="AI test timed out (model svarer for langsomt)")
+    except Exception as e:
+        logger.error(f"❌ AI prompt test failed for {key}: {repr(e)}")
+        err = str(e) or e.__class__.__name__
+        raise HTTPException(status_code=500, detail=f"Kunne ikke teste AI prompt: {err}")
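Review note: the qwen routing is the load-bearing branch of the new test endpoint. It can be sanity-checked in isolation; a minimal standalone sketch of the same check (the helper name is mine, not part of the PR):

```python
# Hypothetical helper mirroring the diff's model routing:
# qwen models are sent to Ollama's /api/chat, everything else to /api/generate.
def use_chat_api(model):
    return (model or "").strip().lower().startswith("qwen")

print(use_chat_api("Qwen2.5:7b"))  # True
print(use_chat_api("llama3:8b"))   # False
```

Note the `(model or "")` guard, which keeps the check safe when no model is configured.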
@@ -764,6 +764,62 @@
 </div>
 </div>
+
+<!-- Archived Ticket Sync + Monitor -->
+<div class="card mb-4">
+    <div class="card-header bg-white d-flex justify-content-between align-items-center">
+        <div>
+            <h6 class="mb-0 fw-bold">Archived Tickets Sync</h6>
+            <small class="text-muted">Overvaager om alle archived tickets er synket ned (kildeantal vs lokal DB)</small>
+        </div>
+        <div class="d-flex align-items-center gap-2">
+            <span class="badge bg-secondary" id="archivedOverallBadge">Status ukendt</span>
+            <button class="btn btn-sm btn-outline-secondary" onclick="loadArchivedSyncStatus()" id="btnCheckArchivedSync">
+                <i class="bi bi-arrow-repeat me-1"></i>Tjek nu
+            </button>
+        </div>
+    </div>
+    <div class="card-body">
+        <div class="row g-3">
+            <div class="col-md-6">
+                <div class="border rounded p-3 h-100">
+                    <div class="d-flex justify-content-between align-items-start mb-2">
+                        <h6 class="mb-0">Simply archived</h6>
+                        <span class="badge bg-secondary" id="archivedSimplyBadge">Ukendt</span>
+                    </div>
+                    <div class="small text-muted mb-2">Remote: <span id="archivedSimplyRemoteCount">-</span> | Lokal: <span id="archivedSimplyLocalCount">-</span> | Diff: <span id="archivedSimplyDiff">-</span></div>
+                    <div class="small text-muted mb-3">Beskeder lokalt: <span id="archivedSimplyMessagesCount">-</span></div>
+                    <div class="d-grid">
+                        <button class="btn btn-outline-primary btn-sm" onclick="syncArchivedSimply()" id="btnSyncArchivedSimply">
+                            <i class="bi bi-cloud-download me-2"></i>Sync Simply Archived
+                        </button>
+                    </div>
+                </div>
+            </div>
+
+            <div class="col-md-6">
+                <div class="border rounded p-3 h-100">
+                    <div class="d-flex justify-content-between align-items-start mb-2">
+                        <h6 class="mb-0">vTiger Cases archived</h6>
+                        <span class="badge bg-secondary" id="archivedVtigerBadge">Ukendt</span>
+                    </div>
+                    <div class="small text-muted mb-2">Remote: <span id="archivedVtigerRemoteCount">-</span> | Lokal: <span id="archivedVtigerLocalCount">-</span> | Diff: <span id="archivedVtigerDiff">-</span></div>
+                    <div class="small text-muted mb-3">Beskeder lokalt: <span id="archivedVtigerMessagesCount">-</span></div>
+                    <div class="d-grid">
+                        <button class="btn btn-outline-primary btn-sm" onclick="syncArchivedVtiger()" id="btnSyncArchivedVtiger">
+                            <i class="bi bi-cloud-download me-2"></i>Sync vTiger Archived
+                        </button>
+                    </div>
+                </div>
+            </div>
+        </div>
+
+        <div class="d-flex justify-content-between align-items-center mt-3">
+            <small class="text-muted">Sidst tjekket: <span id="archivedLastChecked">Aldrig</span></small>
+            <small class="text-muted" id="archivedStatusHint">Polling aktiv naar Sync-fanen er aaben.</small>
+        </div>
+    </div>
+</div>
+
 <!-- Sync Log -->
 <div class="card">
 <div class="card-header bg-white">
@@ -1040,7 +1096,8 @@ async def scan_document(file_path: str):
 <option value="/ticket/dashboard/technician/v2" {% if default_dashboard_path == '/ticket/dashboard/technician/v2' %}selected{% endif %}>Tekniker Dashboard V2</option>
 <option value="/ticket/dashboard/technician/v3" {% if default_dashboard_path == '/ticket/dashboard/technician/v3' %}selected{% endif %}>Tekniker Dashboard V3</option>
 <option value="/dashboard/sales" {% if default_dashboard_path == '/dashboard/sales' %}selected{% endif %}>Salg Dashboard</option>
-{% if default_dashboard_path and default_dashboard_path not in ['/ticket/dashboard/technician/v1', '/ticket/dashboard/technician/v2', '/ticket/dashboard/technician/v3', '/dashboard/sales'] %}
+<option value="/dashboard/mission-control" {% if default_dashboard_path == '/dashboard/mission-control' %}selected{% endif %}>Mission Control</option>
+{% if default_dashboard_path and default_dashboard_path not in ['/ticket/dashboard/technician/v1', '/ticket/dashboard/technician/v2', '/ticket/dashboard/technician/v3', '/dashboard/sales', '/dashboard/mission-control'] %}
 <option value="{{ default_dashboard_path }}" selected>Nuværende (tilpasset): {{ default_dashboard_path }}</option>
 {% endif %}
 </select>
@@ -1086,6 +1143,33 @@ async def scan_document(file_path: str):
 </div>
 </div>
+
+<div class="card p-4 mt-4">
+    <div class="d-flex justify-content-between align-items-center mb-3">
+        <div>
+            <h5 class="mb-1 fw-bold">Sagsstatus</h5>
+            <p class="text-muted mb-0">Styr hvilke status-værdier der kan vælges, og marker hvilke der er lukkede.</p>
+        </div>
+        <div class="d-flex gap-2">
+            <input type="text" class="form-control" id="caseStatusInput" placeholder="F.eks. afventer kunde" style="max-width: 260px;">
+            <button class="btn btn-primary" onclick="addCaseStatus()"><i class="bi bi-plus-lg me-1"></i>Tilføj</button>
+        </div>
+    </div>
+    <div class="table-responsive">
+        <table class="table table-sm align-middle mb-0">
+            <thead>
+                <tr>
+                    <th>Status</th>
+                    <th class="text-center" style="width: 150px;">Lukket værdi</th>
+                    <th class="text-end" style="width: 100px;">Handling</th>
+                </tr>
+            </thead>
+            <tbody id="caseStatusesTableBody">
+                <tr><td colspan="3" class="text-muted">Indlæser...</td></tr>
+            </tbody>
+        </table>
+    </div>
+</div>
+
 <div class="card p-4 mt-4">
 <div class="d-flex justify-content-between align-items-center mb-3">
 <div>
@@ -1605,6 +1689,8 @@ async function loadSettings() {
 displaySettingsByCategory();
 renderTelefoniSettings();
 await loadCaseTypesSetting();
+await loadCaseStatusesSetting();
+await loadTagsManagement();
 await loadNextcloudInstances();
 } catch (error) {
 console.error('Error loading settings:', error);
@@ -1974,6 +2060,132 @@ const CASE_MODULE_LABELS = {
 };
 
 let caseTypeModuleDefaultsCache = {};
+let caseStatusesCache = [];
+
+function normalizeCaseStatuses(raw) {
+    const normalized = [];
+    const seen = new Set();
+    const source = Array.isArray(raw) ? raw : [];
+
+    source.forEach((item) => {
+        const row = typeof item === 'string'
+            ? { value: item, is_closed: false }
+            : (item && typeof item === 'object' ? item : null);
+
+        if (!row) return;
+        const value = String(row.value || '').trim();
+        if (!value) return;
+
+        const key = value.toLowerCase();
+        if (seen.has(key)) return;
+        seen.add(key);
+
+        normalized.push({
+            value,
+            is_closed: Boolean(row.is_closed)
+        });
+    });
+
+    const defaults = [
+        { value: 'åben', is_closed: false },
+        { value: 'under behandling', is_closed: false },
+        { value: 'afventer', is_closed: false },
+        { value: 'løst', is_closed: true },
+        { value: 'lukket', is_closed: true }
+    ];
+
+    defaults.forEach((item) => {
+        const key = item.value.toLowerCase();
+        if (!seen.has(key)) {
+            seen.add(key);
+            normalized.push(item);
+        }
+    });
+
+    return normalized;
+}
+
+function renderCaseStatuses(rows) {
+    const tbody = document.getElementById('caseStatusesTableBody');
+    if (!tbody) return;
+
+    if (!Array.isArray(rows) || !rows.length) {
+        tbody.innerHTML = '<tr><td colspan="3" class="text-muted">Ingen statusværdier defineret</td></tr>';
+        return;
+    }
+
+    tbody.innerHTML = rows.map((row, index) => `
+        <tr>
+            <td><span class="fw-semibold">${escapeHtml(row.value)}</span></td>
+            <td class="text-center">
+                <div class="form-check form-switch d-inline-flex">
+                    <input class="form-check-input" type="checkbox" id="caseStatusClosed_${index}" ${row.is_closed ? 'checked' : ''}
+                        onchange="toggleCaseStatusClosed(${index}, this.checked)">
+                </div>
+            </td>
+            <td class="text-end">
+                <button type="button" class="btn btn-sm btn-outline-danger" onclick="removeCaseStatus(${index})" title="Slet status">
+                    <i class="bi bi-trash"></i>
+                </button>
+            </td>
+        </tr>
+    `).join('');
+}
+
+async function loadCaseStatusesSetting() {
+    try {
+        const response = await fetch('/api/v1/settings/case_statuses');
+        if (!response.ok) {
+            caseStatusesCache = normalizeCaseStatuses([]);
+            renderCaseStatuses(caseStatusesCache);
+            return;
+        }
+
+        const setting = await response.json();
+        const parsed = JSON.parse(setting.value || '[]');
+        caseStatusesCache = normalizeCaseStatuses(parsed);
+        renderCaseStatuses(caseStatusesCache);
+    } catch (error) {
+        console.error('Error loading case statuses:', error);
+        caseStatusesCache = normalizeCaseStatuses([]);
+        renderCaseStatuses(caseStatusesCache);
+    }
+}
+
+async function saveCaseStatuses() {
+    await updateSetting('case_statuses', JSON.stringify(caseStatusesCache));
+    renderCaseStatuses(caseStatusesCache);
+}
+
+async function addCaseStatus() {
+    const input = document.getElementById('caseStatusInput');
+    if (!input) return;
+
+    const value = input.value.trim();
+    if (!value) return;
+
+    const exists = caseStatusesCache.some((row) => String(row.value || '').toLowerCase() === value.toLowerCase());
+    if (!exists) {
+        caseStatusesCache.push({ value, is_closed: false });
+        await saveCaseStatuses();
+    }
+
+    input.value = '';
+}
+
+async function removeCaseStatus(index) {
+    caseStatusesCache = caseStatusesCache.filter((_, i) => i !== index);
+    if (!caseStatusesCache.length) {
+        caseStatusesCache = normalizeCaseStatuses([]);
+    }
+    await saveCaseStatuses();
+}
+
+async function toggleCaseStatusClosed(index, checked) {
+    if (!caseStatusesCache[index]) return;
+    caseStatusesCache[index].is_closed = Boolean(checked);
+    await saveCaseStatuses();
+}
+
 function normalizeCaseTypeModuleDefaults(raw, caseTypes) {
     const normalized = {};
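Review note: `normalizeCaseStatuses` does two things that are easy to regress, case-insensitive dedupe of user-supplied values and back-filling the five Danish defaults only when they are missing. A Python mirror of that same logic for reference (mine, for illustration; not part of the PR):

```python
# Defaults mirror the JS list: (value, is_closed)
DEFAULTS = [("åben", False), ("under behandling", False), ("afventer", False),
            ("løst", True), ("lukket", True)]

def normalize_case_statuses(raw):
    normalized, seen = [], set()
    for item in (raw if isinstance(raw, list) else []):
        # Bare strings become open statuses, like the JS string branch.
        row = {"value": item, "is_closed": False} if isinstance(item, str) else item
        if not isinstance(row, dict):
            continue
        value = str(row.get("value") or "").strip()
        if not value or value.lower() in seen:
            continue
        seen.add(value.lower())
        normalized.append({"value": value, "is_closed": bool(row.get("is_closed"))})
    # Back-fill defaults that are not already present (case-insensitive),
    # so a user-supplied row always wins over the default.
    for value, is_closed in DEFAULTS:
        if value.lower() not in seen:
            seen.add(value.lower())
            normalized.append({"value": value, "is_closed": is_closed})
    return normalized
```

As in the JS, a user-defined "Lukket" marked open suppresses the closed default rather than duplicating it.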
@@ -2184,14 +2396,16 @@ async function loadUsers() {
 async function loadAdminUsers() {
     try {
         const response = await fetch('/api/v1/admin/users');
-        if (!response.ok) throw new Error('Failed to load users');
+        if (!response.ok) {
+            throw new Error(await getErrorMessage(response, 'Kunne ikke indlaese brugere'));
+        }
         usersCache = await response.json();
         displayUsers(usersCache);
         populateTelefoniTestUsers(usersCache);
     } catch (error) {
         console.error('Error loading users:', error);
         const tbody = document.getElementById('usersTableBody');
-        tbody.innerHTML = '<tr><td colspan="11" class="text-center text-muted py-5">Kunne ikke indlæse brugere</td></tr>';
+        tbody.innerHTML = `<tr><td colspan="11" class="text-center text-muted py-5">${escapeHtml(error.message || 'Kunne ikke indlaese brugere')}</td></tr>`;
     }
 }
@@ -2472,17 +2686,21 @@ async function createUser() {
 
 async function toggleUserActive(userId, isActive) {
     try {
-        const response = await fetch(`/api/v1/users/${userId}`, {
-            method: 'PUT',
+        const response = await fetch(`/api/v1/admin/users/${userId}`, {
+            method: 'PATCH',
             headers: { 'Content-Type': 'application/json' },
             body: JSON.stringify({ is_active: isActive })
         });
-        if (response.ok) {
-            loadUsers();
+
+        if (!response.ok) {
+            alert(await getErrorMessage(response, 'Kunne ikke opdatere brugerstatus'));
+            return;
         }
+
+        loadUsers();
     } catch (error) {
         console.error('Error toggling user:', error);
+        alert('Kunne ikke opdatere brugerstatus');
     }
 }
@@ -2665,13 +2883,18 @@ async function resetPassword(userId) {
     if (!newPassword) return;
 
     try {
-        const response = await fetch(`/api/v1/users/${userId}/reset-password?new_password=${newPassword}`, {
-            method: 'POST'
+        const response = await fetch(`/api/v1/admin/users/${userId}/reset-password`, {
+            method: 'POST',
+            headers: { 'Content-Type': 'application/json' },
+            body: JSON.stringify({ new_password: newPassword })
         });
-        if (response.ok) {
-            alert('Adgangskode nulstillet!');
+
+        if (!response.ok) {
+            alert(await getErrorMessage(response, 'Kunne ikke nulstille adgangskode'));
+            return;
         }
+
+        alert('Adgangskode nulstillet!');
     } catch (error) {
         console.error('Error resetting password:', error);
         alert('Kunne ikke nulstille adgangskode');
@@ -2775,6 +2998,9 @@ async function loadAIPrompts() {
 <button class="btn btn-outline-danger" onclick="resetPrompt('${key}')" title="Nulstil til standard">
     <i class="bi bi-arrow-counterclockwise"></i> Nulstil
 </button>` : ''}
+<button class="btn btn-outline-success" onclick="testPrompt('${key}')" id="testBtn_${key}" title="Test AI prompt">
+    <i class="bi bi-play-circle"></i> Test
+</button>
 <button class="btn btn-outline-primary" onclick="editPrompt('${key}')" id="editBtn_${key}" title="Rediger Prompt">
     <i class="bi bi-pencil"></i> Rediger
 </button>
@@ -2788,6 +3014,8 @@ async function loadAIPrompts() {
 style="max-height: 400px; overflow-y: auto; font-size: 0.85rem; white-space: pre-wrap; border-radius: 0;">${escapeHtml(prompt.prompt)}</pre>
 <textarea id="edit_prompt_${key}" class="form-control d-none p-3 bg-white text-dark rounded-bottom"
 style="height: 300px; font-family: monospace; font-size: 0.85rem; border-radius: 0;">${escapeHtml(prompt.prompt)}</textarea>
+
+<div id="testResult_${key}" class="alert alert-secondary m-3 py-2 px-3 d-none" style="white-space: pre-wrap; font-size: 0.85rem;"></div>
 
 <div id="editActions_${key}" class="position-absolute bottom-0 end-0 p-3 d-none">
     <button class="btn btn-sm btn-secondary me-1" onclick="cancelEdit('${key}')">Annuller</button>
@@ -2882,6 +3110,52 @@ async function resetPrompt(key) {
 }
 }
+
+async function testPrompt(key) {
+    const btn = document.getElementById(`testBtn_${key}`);
+    const resultElement = document.getElementById(`testResult_${key}`);
+    const editElement = document.getElementById(`edit_prompt_${key}`);
+
+    const promptText = editElement ? editElement.value : '';
+    const originalHtml = btn.innerHTML;
+
+    btn.disabled = true;
+    btn.innerHTML = '<span class="spinner-border spinner-border-sm me-1" role="status" aria-hidden="true"></span>Tester';
+
+    resultElement.className = 'alert alert-secondary m-3 py-2 px-3';
+    resultElement.classList.remove('d-none');
+    resultElement.textContent = 'Tester AI...';
+
+    try {
+        const response = await fetch(`/api/v1/ai-prompts/${key}/test`, {
+            method: 'POST',
+            headers: { 'Content-Type': 'application/json' },
+            body: JSON.stringify({ prompt_text: promptText })
+        });
+
+        if (!response.ok) {
+            const message = await getErrorMessage(response, 'Kunne ikke teste AI prompt');
+            throw new Error(message);
+        }
+
+        const result = await response.json();
+        const fullResponse = (result.ai_response || '').trim();
+        const preview = fullResponse.length > 1200 ? `${fullResponse.slice(0, 1200)}\n...` : fullResponse;
+
+        resultElement.className = 'alert alert-success m-3 py-2 px-3';
+        resultElement.textContent =
+            `✅ AI svar modtaget (${result.latency_ms} ms)\n` +
+            `Model: ${result.model}\n\n` +
+            `${preview || '[Tomt svar]'}`;
+    } catch (error) {
+        console.error('Error testing AI prompt:', error);
+        resultElement.className = 'alert alert-danger m-3 py-2 px-3';
+        resultElement.textContent = `❌ ${error.message || 'Kunne ikke teste AI prompt'}`;
+    } finally {
+        btn.disabled = false;
+        btn.innerHTML = originalHtml;
+    }
+}
+
 
 function copyPrompt(key) {
@@ -2966,6 +3240,8 @@ document.querySelectorAll('.settings-nav .nav-link').forEach(link => {
 // Load data for tab
 if (tab === 'users') {
     loadUsers();
+} else if (tab === 'tags') {
+    loadTagsManagement();
 } else if (tab === 'telefoni') {
     renderTelefoniSettings();
 } else if (tab === 'ai-prompts') {
@@ -3040,13 +3316,19 @@ let showInactive = false;
 async function loadTagsManagement() {
     try {
         const response = await fetch('/api/v1/tags');
-        if (!response.ok) throw new Error('Failed to load tags');
+        if (!response.ok) {
+            const msg = await getErrorMessage(response, 'Kunne ikke indlæse tags');
+            throw new Error(msg);
+        }
         allTagsData = await response.json();
         updateTagsStats();
         renderTagsGrid();
     } catch (error) {
         console.error('Error loading tags:', error);
-        showNotification('Fejl ved indlæsning af tags', 'error');
+        allTagsData = [];
+        updateTagsStats();
+        renderTagsGrid();
+        showNotification('Fejl ved indlæsning af tags: ' + (error.message || 'ukendt fejl'), 'error');
     }
 }
@@ -3261,6 +3543,7 @@ if (tagsNavLink) {
 
 // ====== SYNC MANAGEMENT ======
 let syncLog = [];
+let archivedSyncPollInterval = null;
 
 async function loadSyncStats() {
     try {
@@ -3345,9 +3628,195 @@ async function parseApiError(response, fallbackMessage) {
 return '2FA kræves for sync API. Aktivér 2FA på din bruger og log ind igen.';
 }
+
+if (response.status === 403) {
+    if (String(detailMessage).includes('Missing required permission') || String(detailMessage).includes('Superadmin access required')) {
+        return 'Kun admin/superadmin må starte eller overvåge sync.';
+    }
+}
+
 return detailMessage;
 }
+
+function updateArchivedSourceBadge(sourceKey, isSynced, hasError) {
+    const badgeId = sourceKey === 'simplycrm' ? 'archivedSimplyBadge' : 'archivedVtigerBadge';
+    const badge = document.getElementById(badgeId);
+    if (!badge) return;
+
+    if (hasError) {
+        badge.className = 'badge bg-danger';
+        badge.textContent = 'Fejl';
+        return;
+    }
+
+    if (isSynced === true) {
+        badge.className = 'badge bg-success';
+        badge.textContent = 'Synket';
+        return;
+    }
+
+    badge.className = 'badge bg-warning text-dark';
+    badge.textContent = 'Mangler';
+}
+
+function startArchivedSyncPolling() {
+    if (archivedSyncPollInterval) return;
+    archivedSyncPollInterval = setInterval(() => {
+        loadArchivedSyncStatus();
+    }, 15000);
+}
+
+function stopArchivedSyncPolling() {
+    if (!archivedSyncPollInterval) return;
+    clearInterval(archivedSyncPollInterval);
+    archivedSyncPollInterval = null;
+}
+
+async function loadArchivedSyncStatus() {
+    const overallBadge = document.getElementById('archivedOverallBadge');
+    const lastChecked = document.getElementById('archivedLastChecked');
+    const hint = document.getElementById('archivedStatusHint');
+
+    try {
+        const response = await fetch('/api/v1/ticket/archived/status');
+        if (!response.ok) {
+            const errorMessage = await parseApiError(response, 'Kunne ikke hente archived status');
+            throw new Error(errorMessage);
+        }
+
+        const status = await response.json();
+        const simply = (status.sources || {}).simplycrm || {};
+        const vtiger = (status.sources || {}).vtiger || {};
+
+        const setText = (id, value) => {
+            const el = document.getElementById(id);
+            if (el) el.textContent = value === null || value === undefined ? '-' : value;
+        };
+
+        setText('archivedSimplyRemoteCount', simply.remote_total_tickets);
+        setText('archivedSimplyLocalCount', simply.local_total_tickets);
+        setText('archivedSimplyDiff', simply.diff);
+        setText('archivedSimplyMessagesCount', simply.local_total_messages);
+
+        setText('archivedVtigerRemoteCount', vtiger.remote_total_tickets);
+        setText('archivedVtigerLocalCount', vtiger.local_total_tickets);
+        setText('archivedVtigerDiff', vtiger.diff);
+        setText('archivedVtigerMessagesCount', vtiger.local_total_messages);
+
+        updateArchivedSourceBadge('simplycrm', simply.is_synced, !!simply.error);
+        updateArchivedSourceBadge('vtiger', vtiger.is_synced, !!vtiger.error);
+
+        if (overallBadge) {
+            if (status.overall_synced === true) {
+                overallBadge.className = 'badge bg-success';
+                overallBadge.textContent = 'Alt synced';
+            } else {
+                overallBadge.className = 'badge bg-warning text-dark';
+                overallBadge.textContent = 'Ikke fuldt synced';
+            }
+        }
+
+        if (lastChecked) {
+            lastChecked.textContent = new Date().toLocaleString('da-DK');
+        }
+
+        if (hint) {
+            const errors = [simply.error, vtiger.error].filter(Boolean);
+            hint.textContent = errors.length > 0
+                ? `Statusfejl: ${errors.join(' | ')}`
+                : 'Polling aktiv mens Sync-fanen er åben.';
+        }
+    } catch (error) {
+        if (overallBadge) {
+            overallBadge.className = 'badge bg-danger';
+            overallBadge.textContent = 'Statusfejl';
+        }
+        if (hint) {
+            hint.textContent = error.message;
+        }
+        console.error('Error loading archived sync status:', error);
+    }
+}
+
+async function syncArchivedSimply() {
+    const btn = document.getElementById('btnSyncArchivedSimply');
+    if (!btn) return;
+
+    btn.disabled = true;
+    btn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Synkroniserer...';
+
+    try {
+        addSyncLogEntry('Simply Archived Sync Startet', 'Importerer archived tickets fra Simply...', 'info');
+
+        const response = await fetch('/api/v1/ticket/archived/simply/import?limit=5000&include_messages=true&force=false', {
+            method: 'POST'
+        });
+
+        if (!response.ok) {
+            const errorMessage = await parseApiError(response, 'Simply archived sync fejlede');
+            throw new Error(errorMessage);
+        }
+
+        const result = await response.json();
+        const details = [
+            `Importeret: ${result.imported || 0}`,
+            `Opdateret: ${result.updated || 0}`,
+            `Sprunget over: ${result.skipped || 0}`,
+            `Fejl: ${result.errors || 0}`,
+            `Beskeder: ${result.messages_imported || 0}`
+        ].join(' | ');
+        addSyncLogEntry('Simply Archived Sync Fuldført', details, 'success');
+
+        await loadArchivedSyncStatus();
+        showNotification('Simply archived sync fuldført!', 'success');
+    } catch (error) {
+        addSyncLogEntry('Simply Archived Sync Fejl', error.message, 'error');
+        showNotification('Fejl: ' + error.message, 'error');
+    } finally {
+        btn.disabled = false;
+        btn.innerHTML = '<i class="bi bi-cloud-download me-2"></i>Sync Simply Archived';
+    }
+}
+
+async function syncArchivedVtiger() {
+    const btn = document.getElementById('btnSyncArchivedVtiger');
+    if (!btn) return;
+
+    btn.disabled = true;
+    btn.innerHTML = '<span class="spinner-border spinner-border-sm me-2"></span>Synkroniserer...';
+
+    try {
+        addSyncLogEntry('vTiger Archived Sync Startet', 'Importerer archived tickets fra vTiger Cases...', 'info');
+
+        const response = await fetch('/api/v1/ticket/archived/vtiger/import?limit=5000&include_messages=true&force=false', {
+            method: 'POST'
+        });
+
+        if (!response.ok) {
+            const errorMessage = await parseApiError(response, 'vTiger archived sync fejlede');
+            throw new Error(errorMessage);
+        }
+
+        const result = await response.json();
+        const details = [
+            `Importeret: ${result.imported || 0}`,
+            `Opdateret: ${result.updated || 0}`,
+            `Sprunget over: ${result.skipped || 0}`,
+            `Fejl: ${result.errors || 0}`,
+            `Beskeder: ${result.messages_imported || 0}`
+        ].join(' | ');
+        addSyncLogEntry('vTiger Archived Sync Fuldført', details, 'success');
+
+        await loadArchivedSyncStatus();
+        showNotification('vTiger archived sync fuldført!', 'success');
+    } catch (error) {
+        addSyncLogEntry('vTiger Archived Sync Fejl', error.message, 'error');
+        showNotification('Fejl: ' + error.message, 'error');
+    } finally {
+        btn.disabled = false;
+        btn.innerHTML = '<i class="bi bi-cloud-download me-2"></i>Sync vTiger Archived';
+    }
+}
+
 async function syncFromVtiger() {
     const btn = document.getElementById('btnSyncVtiger');
     btn.disabled = true;
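Review note: the Remote/Lokal/Diff fields and the Synket/Mangler badges imply a simple per-source rule, but the backend computation behind `/api/v1/ticket/archived/status` is not part of this diff. The following Python sketch is therefore an assumption, not the PR's implementation: diff is remote minus local, and a source counts as synced when nothing is missing locally.

```python
def source_sync_status(remote_total, local_total):
    # Assumed rule (backend code not shown in this diff): the source is
    # considered synced when the local DB holds at least every remote ticket.
    diff = remote_total - local_total
    return {
        "remote_total_tickets": remote_total,
        "local_total_tickets": local_total,
        "diff": diff,
        "is_synced": diff <= 0,
    }

print(source_sync_status(120, 115))  # diff 5, is_synced False
```

If the real endpoint also compares message counts or per-ticket checksums, the frontend above still works unchanged, since it only renders the fields it receives.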
@@ -3526,9 +3995,17 @@ if (syncNavLink) {
 syncNavLink.addEventListener('click', () => {
     loadSyncStats();
     loadSyncLog();
+    loadArchivedSyncStatus();
+    startArchivedSyncPolling();
 });
 }
+
+document.addEventListener('visibilitychange', () => {
+    if (document.hidden) {
+        stopArchivedSyncPolling();
+    }
+});
 
 // Notification helper
 function showNotification(message, type = 'info') {
 // Create toast notification
@@ -246,12 +246,14 @@
 <li><a class="dropdown-item py-2" href="/hardware"><i class="bi bi-laptop me-2"></i>Hardware Assets</a></li>
 <li><a class="dropdown-item py-2" href="/hardware/eset"><i class="bi bi-shield-check me-2"></i>ESET Oversigt</a></li>
 <li><a class="dropdown-item py-2" href="/telefoni"><i class="bi bi-telephone me-2"></i>Telefoni</a></li>
+<li><a class="dropdown-item py-2" href="/dashboard/mission-control"><i class="bi bi-broadcast-pin me-2"></i>Mission Control</a></li>
 <li><a class="dropdown-item py-2" href="/app/locations"><i class="bi bi-map-fill me-2"></i>Lokaliteter</a></li>
 <li><hr class="dropdown-divider"></li>
 <li><a class="dropdown-item py-2" href="/prepaid-cards"><i class="bi bi-credit-card-2-front me-2"></i>Prepaid Cards</a></li>
 <li><a class="dropdown-item py-2" href="/fixed-price-agreements"><i class="bi bi-calendar-check me-2"></i>Fastpris Aftaler</a></li>
 <li><a class="dropdown-item py-2" href="/subscriptions"><i class="bi bi-repeat me-2"></i>Abonnementer</a></li>
 <li><hr class="dropdown-divider"></li>
+<li><a class="dropdown-item py-2" href="/tags#search"><i class="bi bi-tags me-2"></i>Tag søgning</a></li>
 <li><a class="dropdown-item py-2" href="#">Knowledge Base</a></li>
 </ul>
 </li>
@@ -280,21 +282,6 @@
 <li><a class="dropdown-item py-2" href="#">Abonnementer</a></li>
 <li><a class="dropdown-item py-2" href="#">Betalinger</a></li>
 <li><hr class="dropdown-divider"></li>
-<li class="dropdown-submenu">
-    <a class="dropdown-item dropdown-toggle py-2" href="#" data-submenu-toggle="timetracking">
-        <span><i class="bi bi-clock-history me-2"></i>Timetracking</span>
-        <i class="bi bi-chevron-right small opacity-75"></i>
-    </a>
-    <ul class="dropdown-menu" data-submenu="timetracking">
-        <li><a class="dropdown-item py-2" href="/timetracking"><i class="bi bi-speedometer2 me-2"></i>Dashboard</a></li>
-        <li><a class="dropdown-item py-2" href="/timetracking/registrations"><i class="bi bi-list-columns-reverse me-2"></i>Registreringer</a></li>
-        <li><a class="dropdown-item py-2" href="/timetracking/wizard"><i class="bi bi-magic me-2"></i>Godkend Timer</a></li>
-        <li><a class="dropdown-item py-2" href="/timetracking/service-contract-wizard"><i class="bi bi-diagram-3 me-2"></i>Servicekontrakt Migration</a></li>
-        <li><a class="dropdown-item py-2" href="/timetracking/orders"><i class="bi bi-receipt me-2"></i>Ordrer</a></li>
-        <li><a class="dropdown-item py-2" href="/timetracking/customers"><i class="bi bi-people me-2"></i>Kunder</a></li>
-    </ul>
-</li>
-<li><hr class="dropdown-divider"></li>
 <li><a class="dropdown-item py-2" href="#">Rapporter</a></li>
 </ul>
 </li>
@@ -305,6 +292,19 @@
 </li>
 </ul>
 <div class="d-flex align-items-center gap-3">
+    <div class="dropdown">
+        <a class="nav-link dropdown-toggle" href="#" role="button" data-bs-toggle="dropdown" aria-expanded="false">
+            <i class="bi bi-clock-history me-2"></i>Data migration
+        </a>
+        <ul class="dropdown-menu dropdown-menu-end mt-2">
+            <li><a class="dropdown-item py-2" href="/timetracking"><i class="bi bi-speedometer2 me-2"></i>Dashboard</a></li>
+            <li><a class="dropdown-item py-2" href="/timetracking/registrations"><i class="bi bi-list-columns-reverse me-2"></i>Registreringer</a></li>
+            <li><a class="dropdown-item py-2" href="/timetracking/wizard"><i class="bi bi-magic me-2"></i>Godkend Timer</a></li>
+            <li><a class="dropdown-item py-2" href="/timetracking/service-contract-wizard"><i class="bi bi-diagram-3 me-2"></i>Servicekontrakt Migration</a></li>
+            <li><a class="dropdown-item py-2" href="/timetracking/orders"><i class="bi bi-receipt me-2"></i>Ordrer</a></li>
+            <li><a class="dropdown-item py-2" href="/timetracking/customers"><i class="bi bi-people me-2"></i>Kunder</a></li>
+        </ul>
+    </div>
 <button class="btn btn-light rounded-circle border-0" id="quickCreateBtn" style="background: var(--accent-light); color: var(--accent);" title="Opret ny sag (+ eller Cmd+Shift+C)">
 <i class="bi bi-plus-circle-fill fs-5"></i>
 </button>
@@ -320,6 +320,7 @@
 <ul class="dropdown-menu dropdown-menu-end mt-2">
 <li><a class="dropdown-item py-2" href="#" data-bs-toggle="modal" data-bs-target="#profileModal">Profil</a></li>
 <li><a class="dropdown-item py-2" href="/settings"><i class="bi bi-gear me-2"></i>Indstillinger</a></li>
+<li><a class="dropdown-item py-2" href="/tags#search"><i class="bi bi-tags me-2"></i>Tag søgning</a></li>
 <li><a class="dropdown-item py-2" href="/backups"><i class="bi bi-hdd-stack me-2"></i>Backup System</a></li>
 <li><a class="dropdown-item py-2" href="/devportal"><i class="bi bi-code-square me-2"></i>DEV Portal</a></li>
 <li><hr class="dropdown-divider"></li>
@@ -303,15 +303,19 @@
 async function performAnalysis(text) {
     try {
         const userId = getUserId();
-        const response = await fetch(`/api/v1/sag/analyze-quick-create?user_id=${userId}`, {
+        const response = await fetch('/api/v1/sag/analyze-quick-create', {
             method: 'POST',
             headers: {'Content-Type': 'application/json'},
             credentials: 'include',
-            body: JSON.stringify({text})
+            body: JSON.stringify({
+                text,
+                user_id: parseInt(userId, 10)
+            })
         });
+
         if (!response.ok) {
-            throw new Error('Analysis failed');
+            const errorText = await response.text();
+            throw new Error(`Analysis failed (${response.status}): ${errorText || 'unknown error'}`);
         }
+
         const analysis = await response.json();
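The hunk above moves `user_id` out of the query string and into the JSON request body, parsed with `parseInt(userId, 10)`. A dependency-free sketch of the payload the updated fetch call sends (the `build_body` helper and dataclass name are illustrative, not taken from the repo):

```python
import json
from dataclasses import dataclass


@dataclass
class QuickCreateAnalyzePayload:
    # Mirrors the JSON body built by the updated fetch call:
    # JSON.stringify({ text, user_id: parseInt(userId, 10) })
    text: str
    user_id: int


def build_body(text: str, user_id_raw: str) -> str:
    """Build the request body the way the frontend hunk does."""
    payload = QuickCreateAnalyzePayload(text=text, user_id=int(user_id_raw))
    return json.dumps({"text": payload.text, "user_id": payload.user_id})


body = build_body("Printer virker ikke", "42")
assert json.loads(body) == {"text": "Printer virker ikke", "user_id": 42}
```

Sending the id in the body keeps it out of access logs and matches the `Content-Type: application/json` header the call already sets.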
@@ -31,15 +31,17 @@ SYNC RULES:
 """

 import logging
-from fastapi import APIRouter, HTTPException
+from fastapi import APIRouter, Depends, HTTPException
 from typing import Dict, Any
 from app.core.database import execute_query
+from app.core.auth_dependencies import require_any_permission
 from app.services.vtiger_service import get_vtiger_service
 import re

 logger = logging.getLogger(__name__)

 router = APIRouter()
+sync_admin_access = require_any_permission("users.manage", "system.admin")


 def normalize_name(name: str) -> str:
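The new `sync_admin_access` dependency gates every sync endpoint behind either `users.manage` or `system.admin`. The real factory lives in `app.core.auth_dependencies` and its code is not shown here; the following is only a sketch of the any-of semantics such a factory presumably implements:

```python
def require_any_permission(*required: str):
    """Hypothetical stand-in: return a checker that passes if the
    user holds at least one of the required permissions."""
    def check(user_permissions: set[str]) -> bool:
        return any(p in user_permissions for p in required)
    return check


sync_admin_access = require_any_permission("users.manage", "system.admin")

assert sync_admin_access({"system.admin"}) is True
assert sync_admin_access({"users.manage", "tickets.view"}) is True
assert sync_admin_access({"tickets.view"}) is False
```

Binding the factory result to a module-level name, as the diff does, means every endpoint shares one dependency object instead of re-declaring the permission list per route.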
@@ -53,7 +55,7 @@ def normalize_name(name: str) -> str:


 @router.post("/sync/vtiger")
-async def sync_from_vtiger() -> Dict[str, Any]:
+async def sync_from_vtiger(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
     """
     Link vTiger accounts to existing Hub customers
     Matches by CVR or normalized name, updates vtiger_id
@@ -186,7 +188,7 @@ async def sync_from_vtiger() -> Dict[str, Any]:


 @router.post("/sync/vtiger-contacts")
-async def sync_vtiger_contacts() -> Dict[str, Any]:
+async def sync_vtiger_contacts(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
     """
     SIMPEL TILGANG - Sync contacts from vTiger and link to customers
     Step 1: Fetch all contacts from vTiger
@@ -446,7 +448,7 @@ async def sync_vtiger_contacts() -> Dict[str, Any]:


 @router.post("/sync/economic")
-async def sync_from_economic() -> Dict[str, Any]:
+async def sync_from_economic(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
     """
     Sync customers from e-conomic (PRIMARY SOURCE)
     Creates/updates Hub customers with e-conomic data
@@ -606,7 +608,7 @@ async def sync_from_economic() -> Dict[str, Any]:


 @router.post("/sync/cvr-to-economic")
-async def sync_cvr_to_economic() -> Dict[str, Any]:
+async def sync_cvr_to_economic(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
     """
     Find customers in Hub with CVR but without e-conomic customer number
     Search e-conomic for matching CVR and update Hub
@@ -668,7 +670,7 @@ async def sync_cvr_to_economic() -> Dict[str, Any]:


 @router.get("/sync/diagnostics")
-async def sync_diagnostics() -> Dict[str, Any]:
+async def sync_diagnostics(current_user: dict = Depends(sync_admin_access)) -> Dict[str, Any]:
     """
     Diagnostics: Check contact linking coverage
     Shows why contacts aren't linking to customers
|||||||
@ -6,7 +6,7 @@ from typing import Optional, List, Literal
|
|||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
|
|
||||||
# Tag types
|
# Tag types
|
||||||
TagType = Literal['workflow', 'status', 'category', 'priority', 'billing']
|
TagType = Literal['workflow', 'status', 'category', 'priority', 'billing', 'brand', 'type']
|
||||||
TagGroupBehavior = Literal['multi', 'single', 'toggle']
|
TagGroupBehavior = Literal['multi', 'single', 'toggle']
|
||||||
|
|
||||||
|
|
||||||
@ -37,6 +37,7 @@ class TagBase(BaseModel):
|
|||||||
icon: Optional[str] = None
|
icon: Optional[str] = None
|
||||||
is_active: bool = True
|
is_active: bool = True
|
||||||
tag_group_id: Optional[int] = None
|
tag_group_id: Optional[int] = None
|
||||||
|
catch_words: Optional[List[str]] = None
|
||||||
|
|
||||||
class TagCreate(TagBase):
|
class TagCreate(TagBase):
|
||||||
"""Tag creation model"""
|
"""Tag creation model"""
|
||||||
@ -59,6 +60,7 @@ class TagUpdate(BaseModel):
|
|||||||
icon: Optional[str] = None
|
icon: Optional[str] = None
|
||||||
is_active: Optional[bool] = None
|
is_active: Optional[bool] = None
|
||||||
tag_group_id: Optional[int] = None
|
tag_group_id: Optional[int] = None
|
||||||
|
catch_words: Optional[List[str]] = None
|
||||||
|
|
||||||
|
|
||||||
class EntityTagBase(BaseModel):
|
class EntityTagBase(BaseModel):
|
||||||
|
|||||||
@@ -1,8 +1,10 @@
 """
 Tag system API endpoints
 """
-from fastapi import APIRouter, HTTPException
+from fastapi import APIRouter, HTTPException, Query
 from typing import List, Optional
+import json
+import re
 from app.tags.backend.models import (
     Tag, TagCreate, TagUpdate,
     EntityTag, EntityTagCreate,
@@ -14,6 +16,197 @@ from app.core.database import execute_query, execute_query_single, execute_updat

 router = APIRouter(prefix="/tags")

+MODULE_LABELS = {
+    "case": "Sager",
+    "email": "Email",
+    "ticket": "Tickets",
+    "customer": "Kunder",
+    "contact": "Kontakter",
+    "time_entry": "Tid",
+    "order": "Ordrer",
+    "comment": "Ticket kommentarer",
+    "worklog": "Ticket worklog",
+}
+
+
+def _module_label_for_entity_type(entity_type: Optional[str]) -> str:
+    key = str(entity_type or "").strip().lower()
+    if not key:
+        return "Ukendt modul"
+    return MODULE_LABELS.get(key, f"Ukendt modul ({key})")
+
+
+def _entity_reference_payload(entity_type: Optional[str], entity_id: Optional[int]) -> dict:
+    etype = str(entity_type or "").strip().lower()
+    eid = int(entity_id or 0)
+    default_label = f"#{eid}" if eid else "Ukendt"
+
+    if not etype or not eid:
+        return {"entity_title": default_label, "entity_url": None}
+
+    try:
+        if etype == "case":
+            row = execute_query_single(
+                "SELECT id, titel FROM sag_sager WHERE id = %s AND deleted_at IS NULL",
+                (eid,),
+            )
+            if row:
+                title = str(row.get("titel") or "Sag").strip()
+                return {"entity_title": title, "entity_url": f"/sag/{eid}"}
+
+        elif etype == "email":
+            row = execute_query_single(
+                "SELECT id, subject FROM email_messages WHERE id = %s AND deleted_at IS NULL",
+                (eid,),
+            )
+            if row:
+                title = str(row.get("subject") or "Email").strip()
+                return {"entity_title": title, "entity_url": f"/emails?id={eid}"}
+
+        elif etype == "ticket":
+            row = execute_query_single(
+                "SELECT id, ticket_number, subject FROM tticket_tickets WHERE id = %s",
+                (eid,),
+            )
+            if row:
+                ticket_number = str(row.get("ticket_number") or "").strip()
+                subject = str(row.get("subject") or "Ticket").strip()
+                title = f"{ticket_number} - {subject}" if ticket_number else subject
+                return {"entity_title": title, "entity_url": f"/ticket/tickets/{eid}"}
+
+        elif etype == "customer":
+            row = execute_query_single("SELECT id, name FROM customers WHERE id = %s", (eid,))
+            if row:
+                title = str(row.get("name") or "Kunde").strip()
+                return {"entity_title": title, "entity_url": f"/customers/{eid}"}
+
+        elif etype == "contact":
+            row = execute_query_single(
+                "SELECT id, first_name, last_name, email FROM contacts WHERE id = %s",
+                (eid,),
+            )
+            if row:
+                name = " ".join(
+                    [str(row.get("first_name") or "").strip(), str(row.get("last_name") or "").strip()]
+                ).strip()
+                title = name or str(row.get("email") or "Kontakt").strip()
+                return {"entity_title": title, "entity_url": f"/contacts/{eid}"}
+
+        elif etype == "time_entry":
+            row = execute_query_single(
+                "SELECT id, description, worked_date FROM tmodule_times WHERE id = %s",
+                (eid,),
+            )
+            if row:
+                description = str(row.get("description") or "Tidsregistrering").strip()
+                return {"entity_title": description[:90], "entity_url": "/timetracking/registrations"}
+
+        elif etype == "order":
+            row = execute_query_single(
+                "SELECT id, order_number, total_amount FROM tmodule_orders WHERE id = %s",
+                (eid,),
+            )
+            if row:
+                order_number = str(row.get("order_number") or "Ordre").strip()
+                total_amount = row.get("total_amount")
+                suffix = f" ({total_amount} kr.)" if total_amount is not None else ""
+                return {"entity_title": f"{order_number}{suffix}", "entity_url": "/timetracking/orders"}
+
+        elif etype == "worklog":
+            row = execute_query_single(
+                """
+                SELECT w.id, w.description, w.ticket_id, t.ticket_number
+                FROM tticket_worklog w
+                LEFT JOIN tticket_tickets t ON t.id = w.ticket_id
+                WHERE w.id = %s
+                """,
+                (eid,),
+            )
+            if row:
+                ticket_id = row.get("ticket_id")
+                ticket_number = str(row.get("ticket_number") or "Ticket").strip()
+                description = str(row.get("description") or "Worklog").strip()
+                url = f"/ticket/tickets/{ticket_id}" if ticket_id else None
+                return {"entity_title": f"{ticket_number} - {description[:70]}", "entity_url": url}
+
+        elif etype == "comment":
+            row = execute_query_single(
+                """
+                SELECT c.id, c.comment_text, c.ticket_id, t.ticket_number
+                FROM tticket_comments c
+                LEFT JOIN tticket_tickets t ON t.id = c.ticket_id
+                WHERE c.id = %s
+                """,
+                (eid,),
+            )
+            if row:
+                ticket_id = row.get("ticket_id")
+                ticket_number = str(row.get("ticket_number") or "Ticket").strip()
+                comment_text = str(row.get("comment_text") or "Kommentar").strip()
+                url = f"/ticket/tickets/{ticket_id}" if ticket_id else None
+                return {"entity_title": f"{ticket_number} - {comment_text[:70]}", "entity_url": url}
+    except Exception:
+        pass
+
+    return {"entity_title": default_label, "entity_url": None}
+
+
+def _normalize_catch_words(value) -> List[str]:
+    """Normalize catch words from JSON/text/list to a clean lowercase list."""
+    if value is None:
+        return []
+    if isinstance(value, list):
+        words = value
+    elif isinstance(value, str):
+        stripped = value.strip()
+        if not stripped:
+            return []
+        if stripped.startswith("["):
+            try:
+                parsed = json.loads(stripped)
+                words = parsed if isinstance(parsed, list) else []
+            except Exception:
+                words = [w.strip() for w in stripped.replace("\n", ",").split(",")]
+        else:
+            words = [w.strip() for w in stripped.replace("\n", ",").split(",")]
+    else:
+        words = []
+
+    cleaned = []
+    seen = set()
+    for word in words:
+        normalized = str(word or "").strip().lower()
+        if len(normalized) < 2:
+            continue
+        if normalized in seen:
+            continue
+        seen.add(normalized)
+        cleaned.append(normalized)
+    return cleaned
+
+
+def _tag_row_to_response(row: dict) -> dict:
+    """Ensure API response always exposes catch_words as a list."""
+    if not row:
+        return row
+    out = dict(row)
+
+    valid_types = {"workflow", "status", "category", "priority", "billing", "brand", "type"}
+    tag_type = str(out.get("type") or "").strip().lower()
+    if tag_type not in valid_types:
+        tag_type = "category"
+    out["type"] = tag_type
+
+    color = str(out.get("color") or "").strip()
+    if not re.fullmatch(r"#[0-9A-Fa-f]{6}", color):
+        out["color"] = "#0f4c75"
+
+    if not out.get("name"):
+        out["name"] = "Unnamed tag"
+
+    out["catch_words"] = _normalize_catch_words(out.get("catch_words"))
+    return out
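The row sanitizer above falls back to the default color `#0f4c75` unless the stored value is exactly a `#` plus six hex digits. Using `re.fullmatch` rather than `re.match` matters here, since `match` only anchors at the start and would accept trailing garbage:

```python
import re


def valid_hex_color(color: str) -> bool:
    # Same pattern as the sanitizer: exactly '#' followed by 6 hex digits.
    return re.fullmatch(r"#[0-9A-Fa-f]{6}", color) is not None


assert valid_hex_color("#0f4c75") is True
assert valid_hex_color("#0F4C75") is True
assert valid_hex_color("#0f4c7") is False        # too short
assert valid_hex_color("#0f4c75zz") is False     # fullmatch rejects trailing chars
assert re.match(r"#[0-9A-Fa-f]{6}", "#0f4c75zz") is not None  # match would accept it
```

Normalizing type, color, name, and catch words in one place keeps every endpoint's response consistent no matter what the database row contains.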
+
 # ============= TAG GROUPS =============

 @router.get("/groups", response_model=List[TagGroup])
@@ -34,13 +227,131 @@ async def create_tag_group(group: TagGroupCreate):

 # ============= TAG CRUD =============

+@router.get("/usage")
+async def list_tag_usage(
+    tag_name: Optional[str] = Query(None),
+    tag_type: Optional[TagType] = Query(None),
+    module: Optional[str] = Query(None),
+    page: int = Query(1, ge=1),
+    page_size: int = Query(25, ge=1, le=200),
+    sort_by: str = Query("tagged_at"),
+    sort_dir: str = Query("desc"),
+):
+    """List tag usage across modules with server-side filtering and pagination."""
+    where_parts = ["1=1"]
+    params: List[object] = []
+
+    if tag_name:
+        where_parts.append("LOWER(t.name) LIKE LOWER(%s)")
+        params.append(f"%{tag_name.strip()}%")
+
+    if tag_type:
+        where_parts.append("t.type = %s")
+        params.append(tag_type)
+
+    if module:
+        where_parts.append("LOWER(et.entity_type) = LOWER(%s)")
+        params.append(module.strip())
+
+    where_clause = " AND ".join(where_parts)
+
+    sortable = {
+        "tagged_at": "et.tagged_at",
+        "tag_name": "t.name",
+        "tag_type": "t.type",
+        "module": "et.entity_type",
+        "entity_id": "et.entity_id",
+    }
+    order_column = sortable.get(sort_by, "et.tagged_at")
+    order_direction = "ASC" if str(sort_dir).lower() == "asc" else "DESC"
+
+    count_query = f"""
+        SELECT COUNT(*) AS total
+        FROM entity_tags et
+        JOIN tags t ON t.id = et.tag_id
+        WHERE {where_clause}
+    """
+    count_row = execute_query_single(count_query, tuple(params)) or {"total": 0}
+    total = int(count_row.get("total") or 0)
+
+    offset = (page - 1) * page_size
+    data_query = f"""
+        SELECT
+            et.id AS entity_tag_id,
+            et.entity_type,
+            et.entity_id,
+            et.tagged_at,
+            t.id AS tag_id,
+            t.name AS tag_name,
+            t.type AS tag_type,
+            t.color AS tag_color,
+            t.is_active AS tag_is_active
+        FROM entity_tags et
+        JOIN tags t ON t.id = et.tag_id
+        WHERE {where_clause}
+        ORDER BY {order_column} {order_direction}, et.id DESC
+        LIMIT %s OFFSET %s
+    """
+
+    rows = execute_query(data_query, tuple(params + [page_size, offset])) or []
+    items = []
+    for row in rows:
+        entity_type = row.get("entity_type")
+        entity_ref = _entity_reference_payload(entity_type, row.get("entity_id"))
+        items.append(
+            {
+                "entity_tag_id": row.get("entity_tag_id"),
+                "tag_id": row.get("tag_id"),
+                "tag_name": row.get("tag_name"),
+                "tag_type": row.get("tag_type"),
+                "tag_color": row.get("tag_color"),
+                "tag_is_active": bool(row.get("tag_is_active")),
+                "module": _module_label_for_entity_type(entity_type),
+                "entity_type": entity_type,
+                "entity_id": row.get("entity_id"),
+                "entity_title": entity_ref.get("entity_title"),
+                "entity_url": entity_ref.get("entity_url"),
+                "tagged_at": row.get("tagged_at"),
+            }
+        )
+
+    module_rows = execute_query(
+        "SELECT DISTINCT entity_type FROM entity_tags ORDER BY entity_type",
+        (),
+    ) or []
+    module_options = [
+        {
+            "value": row.get("entity_type"),
+            "label": _module_label_for_entity_type(row.get("entity_type")),
+        }
+        for row in module_rows
+    ]
+
+    return {
+        "items": items,
+        "pagination": {
+            "page": page,
+            "page_size": page_size,
+            "total": total,
+            "total_pages": (total + page_size - 1) // page_size if total else 0,
+        },
+        "sort": {"sort_by": sort_by, "sort_dir": order_direction.lower()},
+        "module_options": module_options,
+    }
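The `/usage` endpoint above derives its pagination metadata with plain integer arithmetic: the SQL `OFFSET` is `(page - 1) * page_size`, and the page count uses the ceiling-division trick `(total + page_size - 1) // page_size`. Isolated from the endpoint:

```python
def pagination_meta(total: int, page: int, page_size: int) -> dict:
    """Same arithmetic as the /usage endpoint: OFFSET plus a
    ceiling-divided total page count (0 pages when there are no rows)."""
    return {
        "offset": (page - 1) * page_size,
        "total_pages": (total + page_size - 1) // page_size if total else 0,
    }


assert pagination_meta(0, 1, 25) == {"offset": 0, "total_pages": 0}
assert pagination_meta(25, 1, 25) == {"offset": 0, "total_pages": 1}
assert pagination_meta(26, 2, 25) == {"offset": 25, "total_pages": 2}
```

Whitelisting sortable columns through the `sortable` dict (rather than interpolating `sort_by` directly) is what keeps the f-string `ORDER BY` safe from injection.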
 @router.get("", response_model=List[Tag])
 async def list_tags(
     type: Optional[TagType] = None,
     is_active: Optional[bool] = None
 ):
     """List all tags with optional filtering"""
-    query = "SELECT * FROM tags WHERE 1=1"
+    query = """
+        SELECT id, name, type, description, color, icon, is_active, tag_group_id,
+               COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
+               created_at, updated_at
+        FROM tags
+        WHERE 1=1
+    """
     params = []

     if type:
@@ -54,32 +365,52 @@ async def list_tags(
     query += " ORDER BY type, name"

     results = execute_query(query, tuple(params) if params else ())
-    return results
+    return [_tag_row_to_response(row) for row in (results or [])]

 @router.get("/{tag_id}", response_model=Tag)
 async def get_tag(tag_id: int):
     """Get single tag by ID"""
     result = execute_query_single(
-        "SELECT * FROM tags WHERE id = %s",
+        """
+        SELECT id, name, type, description, color, icon, is_active, tag_group_id,
+               COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
+               created_at, updated_at
+        FROM tags
+        WHERE id = %s
+        """,
         (tag_id,)
     )
     if not result:
         raise HTTPException(status_code=404, detail="Tag not found")
-    return result
+    return _tag_row_to_response(result)

 @router.post("", response_model=Tag)
 async def create_tag(tag: TagCreate):
     """Create new tag"""
     query = """
-    INSERT INTO tags (name, type, description, color, icon, is_active, tag_group_id)
-    VALUES (%s, %s, %s, %s, %s, %s, %s)
-    RETURNING *
+    INSERT INTO tags (name, type, description, color, icon, is_active, tag_group_id, catch_words_json)
+    VALUES (%s, %s, %s, %s, %s, %s, %s, %s::jsonb)
+    RETURNING id, name, type, description, color, icon, is_active, tag_group_id,
+              COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
+              created_at, updated_at
     """
+    catch_words = _normalize_catch_words(tag.catch_words)
     result = execute_query_single(
         query,
-        (tag.name, tag.type, tag.description, tag.color, tag.icon, tag.is_active, tag.tag_group_id)
+        (
+            tag.name,
+            tag.type,
+            tag.description,
+            tag.color,
+            tag.icon,
+            tag.is_active,
+            tag.tag_group_id,
+            json.dumps(catch_words),
+        )
     )
-    return result
+    if not result:
+        raise HTTPException(status_code=500, detail="Failed to create tag")
+    return _tag_row_to_response(result)

 @router.put("/{tag_id}", response_model=Tag)
 async def update_tag(tag_id: int, tag: TagUpdate):
@@ -106,6 +437,9 @@ async def update_tag(tag_id: int, tag: TagUpdate):
     if tag.tag_group_id is not None:
         updates.append("tag_group_id = %s")
         params.append(tag.tag_group_id)
+    if tag.catch_words is not None:
+        updates.append("catch_words_json = %s::jsonb")
+        params.append(json.dumps(_normalize_catch_words(tag.catch_words)))

     if not updates:
         raise HTTPException(status_code=400, detail="No fields to update")
@@ -117,13 +451,15 @@ async def update_tag(tag_id: int, tag: TagUpdate):
         UPDATE tags
         SET {', '.join(updates)}
         WHERE id = %s
-        RETURNING *
+        RETURNING id, name, type, description, color, icon, is_active, tag_group_id,
+                  COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
+                  created_at, updated_at
     """

     result = execute_query_single(query, tuple(params))
     if not result:
         raise HTTPException(status_code=404, detail="Tag not found")
-    return result
+    return _tag_row_to_response(result)

 @router.delete("/{tag_id}")
 async def delete_tag(tag_id: int):
@@ -214,20 +550,91 @@ async def remove_tag_from_entity_path(
 async def get_entity_tags(entity_type: str, entity_id: int):
     """Get all tags for a specific entity"""
     query = """
-    SELECT t.*
+    SELECT t.id, t.name, t.type, t.description, t.color, t.icon, t.is_active, t.tag_group_id,
+           COALESCE(t.catch_words_json, '[]'::jsonb) AS catch_words,
+           t.created_at, t.updated_at
     FROM tags t
     JOIN entity_tags et ON et.tag_id = t.id
     WHERE et.entity_type = %s AND et.entity_id = %s
     ORDER BY t.type, t.name
     """
     results = execute_query(query, (entity_type, entity_id))
-    return results
+    return [_tag_row_to_response(row) for row in (results or [])]
+
+
+@router.get("/entity/{entity_type}/{entity_id}/suggestions")
+async def suggest_entity_tags(entity_type: str, entity_id: int):
+    """Suggest tags based on catch words for brand/type tags."""
+    if entity_type != "case":
|
return []
|
||||||
|
|
||||||
|
case_row = execute_query_single(
|
||||||
|
"SELECT id, titel, beskrivelse, template_key FROM sag_sager WHERE id = %s",
|
||||||
|
(entity_id,),
|
||||||
|
)
|
||||||
|
if not case_row:
|
||||||
|
raise HTTPException(status_code=404, detail="Entity not found")
|
||||||
|
|
||||||
|
existing_rows = execute_query(
|
||||||
|
"SELECT tag_id FROM entity_tags WHERE entity_type = %s AND entity_id = %s",
|
||||||
|
(entity_type, entity_id),
|
||||||
|
) or []
|
||||||
|
existing_tag_ids = {int(row.get("tag_id")) for row in existing_rows if row.get("tag_id") is not None}
|
||||||
|
|
||||||
|
candidate_rows = execute_query(
|
||||||
|
"""
|
||||||
|
SELECT id, name, type, description, color, icon, is_active, tag_group_id,
|
||||||
|
COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
|
||||||
|
created_at, updated_at
|
||||||
|
FROM tags
|
||||||
|
WHERE is_active = true
|
||||||
|
AND type IN ('brand', 'type')
|
||||||
|
ORDER BY type, name
|
||||||
|
""",
|
||||||
|
(),
|
||||||
|
) or []
|
||||||
|
|
||||||
|
haystack = " ".join(
|
||||||
|
[
|
||||||
|
str(case_row.get("titel") or ""),
|
||||||
|
str(case_row.get("beskrivelse") or ""),
|
||||||
|
str(case_row.get("template_key") or ""),
|
||||||
|
]
|
||||||
|
).lower()
|
||||||
|
|
||||||
|
suggestions = []
|
||||||
|
for row in candidate_rows:
|
||||||
|
tag_id = int(row.get("id"))
|
||||||
|
if tag_id in existing_tag_ids:
|
||||||
|
continue
|
||||||
|
|
||||||
|
catch_words = _normalize_catch_words(row.get("catch_words"))
|
||||||
|
if not catch_words:
|
||||||
|
continue
|
||||||
|
|
||||||
|
matched_words = [word for word in catch_words if word in haystack]
|
||||||
|
if not matched_words:
|
||||||
|
continue
|
||||||
|
|
||||||
|
suggestions.append(
|
||||||
|
{
|
||||||
|
"tag": _tag_row_to_response(row),
|
||||||
|
"matched_words": matched_words,
|
||||||
|
"score": len(matched_words),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
suggestions.sort(key=lambda item: (-item["score"], item["tag"]["type"], item["tag"]["name"]))
|
||||||
|
return suggestions
|
||||||
|
|
||||||
@router.get("/search")
|
@router.get("/search")
|
||||||
async def search_tags(q: str, type: Optional[TagType] = None):
|
async def search_tags(q: str, type: Optional[TagType] = None):
|
||||||
"""Search tags by name (fuzzy search)"""
|
"""Search tags by name (fuzzy search)"""
|
||||||
query = """
|
query = """
|
||||||
SELECT * FROM tags
|
SELECT id, name, type, description, color, icon, is_active, tag_group_id,
|
||||||
|
COALESCE(catch_words_json, '[]'::jsonb) AS catch_words,
|
||||||
|
created_at, updated_at
|
||||||
|
FROM tags
|
||||||
WHERE is_active = true
|
WHERE is_active = true
|
||||||
AND LOWER(name) LIKE LOWER(%s)
|
AND LOWER(name) LIKE LOWER(%s)
|
||||||
"""
|
"""
|
||||||
@ -240,7 +647,7 @@ async def search_tags(q: str, type: Optional[TagType] = None):
|
|||||||
query += " ORDER BY name LIMIT 20"
|
query += " ORDER BY name LIMIT 20"
|
||||||
|
|
||||||
results = execute_query(query, tuple(params))
|
results = execute_query(query, tuple(params))
|
||||||
return results
|
return [_tag_row_to_response(row) for row in (results or [])]
|
||||||
|
|
||||||
|
|
||||||
# ============= WORKFLOW MANAGEMENT =============
|
# ============= WORKFLOW MANAGEMENT =============
|
||||||
@@ -1,11 +1,8 @@
-<!DOCTYPE html>
-<html lang="da">
-<head>
-    <meta charset="UTF-8">
-    <meta name="viewport" content="width=device-width, initial-scale=1.0">
-    <title>Tag Administration - BMC Hub</title>
-    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
-    <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.0/font/bootstrap-icons.css">
+{% extends "shared/frontend/base.html" %}
+
+{% block title %}Tag Administration - BMC Hub{% endblock %}
+
+{% block extra_css %}
     <style>
         :root {
             --primary-color: #0f4c75;
@@ -14,6 +11,8 @@
             --category-color: #0f4c75;
             --priority-color: #dc3545;
             --billing-color: #2d6a4f;
+            --brand-color: #006d77;
+            --type-color: #5c677d;
         }

         .tag-badge {
@@ -37,6 +36,8 @@
         .tag-type-category { background-color: var(--category-color); color: white; }
         .tag-type-priority { background-color: var(--priority-color); color: white; }
         .tag-type-billing { background-color: var(--billing-color); color: white; }
+        .tag-type-brand { background-color: var(--brand-color); color: white; }
+        .tag-type-type { background-color: var(--type-color); color: white; }

         .tag-list-item {
             padding: 1rem;
@@ -53,6 +54,8 @@
         .tag-list-item[data-type="category"] { border-left-color: var(--category-color); }
         .tag-list-item[data-type="priority"] { border-left-color: var(--priority-color); }
         .tag-list-item[data-type="billing"] { border-left-color: var(--billing-color); }
+        .tag-list-item[data-type="brand"] { border-left-color: var(--brand-color); }
+        .tag-list-item[data-type="type"] { border-left-color: var(--type-color); }

         .color-preview {
             width: 40px;
@@ -60,9 +63,68 @@
             border-radius: 8px;
             border: 2px solid #dee2e6;
         }
+
+        .section-tabs .nav-link {
+            color: var(--primary-color);
+            font-weight: 600;
+        }
+
+        .section-tabs .nav-link.active {
+            background-color: var(--primary-color);
+            color: #fff;
+            border-color: var(--primary-color);
+        }
+
+        .module-badge {
+            display: inline-flex;
+            align-items: center;
+            gap: 0.3rem;
+            padding: 0.2rem 0.55rem;
+            border-radius: 999px;
+            font-size: 0.8rem;
+            background: #e7f1f8;
+            color: #0b3552;
+            border: 1px solid #c7dceb;
+        }
+
+        .usage-table thead th {
+            position: sticky;
+            top: 0;
+            z-index: 1;
+            background: #fff;
+            white-space: nowrap;
+        }
+
+        .usage-table .filter-cell {
+            min-width: 160px;
+        }
+
+        .usage-sort-btn {
+            border: 0;
+            background: transparent;
+            color: inherit;
+            font-weight: 600;
+            padding: 0;
+        }
+
+        .usage-sort-btn .bi {
+            font-size: 0.75rem;
+            opacity: 0.55;
+        }
+
+        .usage-sort-btn.active .bi {
+            opacity: 1;
+        }
+
+        @media (max-width: 991px) {
+            .usage-table .filter-cell {
+                min-width: 130px;
+            }
+        }
     </style>
-</head>
-<body>
+{% endblock %}
+
+{% block content %}
 <div class="container-fluid py-4">
     <div class="row mb-4">
         <div class="col">
@@ -76,6 +138,17 @@
         </div>
     </div>

+    <ul class="nav nav-pills section-tabs mb-4" id="sectionTabs">
+        <li class="nav-item">
+            <button type="button" class="nav-link active" data-section="admin">Tag administration</button>
+        </li>
+        <li class="nav-item">
+            <button type="button" class="nav-link" data-section="search">Tag søgning</button>
+        </li>
+    </ul>
+
+    <div id="tagAdminSection">
+
     <!-- Type Filter Tabs -->
     <ul class="nav nav-tabs mb-4" id="typeFilter">
         <li class="nav-item">
@@ -106,6 +179,16 @@
                 <span class="tag-badge tag-type-billing">Billing</span>
             </a>
         </li>
+        <li class="nav-item">
+            <a class="nav-link" href="#" data-type="brand">
+                <span class="tag-badge tag-type-brand">Brand</span>
+            </a>
+        </li>
+        <li class="nav-item">
+            <a class="nav-link" href="#" data-type="type">
+                <span class="tag-badge tag-type-type">Type</span>
+            </a>
+        </li>
     </ul>

     <!-- Tags List -->
@@ -120,6 +203,98 @@
             </div>
         </div>
     </div>
+    </div>
+
+    <div id="tagSearchSection" class="d-none">
+        <div class="card mb-3">
+            <div class="card-body">
+                <div class="d-flex flex-wrap justify-content-between align-items-center gap-2 mb-3">
+                    <div>
+                        <h5 class="mb-1">Tag søgning på tværs af moduler</h5>
+                        <p class="text-muted mb-0 small">Filtrer efter tag-navn, type og modul. Hver række viser tydeligt hvilket modul tagningen kommer fra.</p>
+                    </div>
+                    <button type="button" class="btn btn-outline-secondary btn-sm" id="resetUsageFiltersBtn">
+                        <i class="bi bi-arrow-counterclockwise"></i> Nulstil filtre
+                    </button>
+                </div>
+
+                <div class="table-responsive">
+                    <table class="table table-hover align-middle usage-table mb-2">
+                        <thead>
+                            <tr>
+                                <th>
+                                    <button type="button" class="usage-sort-btn" data-sort-by="tag_name">
+                                        Tag <i class="bi bi-chevron-expand"></i>
+                                    </button>
+                                </th>
+                                <th>
+                                    <button type="button" class="usage-sort-btn" data-sort-by="tag_type">
+                                        Type <i class="bi bi-chevron-expand"></i>
+                                    </button>
+                                </th>
+                                <th>
+                                    <button type="button" class="usage-sort-btn" data-sort-by="module">
+                                        Modul <i class="bi bi-chevron-expand"></i>
+                                    </button>
+                                </th>
+                                <th>Objekt</th>
+                                <th>Entity type</th>
+                                <th>
+                                    <button type="button" class="usage-sort-btn" data-sort-by="entity_id">
+                                        Entity ID <i class="bi bi-chevron-expand"></i>
+                                    </button>
+                                </th>
+                                <th>
+                                    <button type="button" class="usage-sort-btn active" data-sort-by="tagged_at">
+                                        Tagget <i class="bi bi-sort-down"></i>
+                                    </button>
+                                </th>
+                            </tr>
+                            <tr>
+                                <th class="filter-cell">
+                                    <input id="usageFilterTagName" type="search" class="form-control form-control-sm" placeholder="Søg tag-navn">
+                                </th>
+                                <th class="filter-cell">
+                                    <select id="usageFilterTagType" class="form-select form-select-sm">
+                                        <option value="">Alle typer</option>
+                                        <option value="workflow">workflow</option>
+                                        <option value="status">status</option>
+                                        <option value="category">category</option>
+                                        <option value="priority">priority</option>
+                                        <option value="billing">billing</option>
+                                        <option value="brand">brand</option>
+                                        <option value="type">type</option>
+                                    </select>
+                                </th>
+                                <th class="filter-cell">
+                                    <select id="usageFilterModule" class="form-select form-select-sm">
+                                        <option value="">Alle moduler</option>
+                                    </select>
+                                </th>
+                                <th></th>
+                                <th></th>
+                                <th></th>
+                                <th></th>
+                            </tr>
+                        </thead>
+                        <tbody id="usageTableBody">
+                            <tr>
+                                <td colspan="7" class="text-center text-muted py-4">Indlæser...</td>
+                            </tr>
+                        </tbody>
+                    </table>
+                </div>

+                <div class="d-flex flex-wrap justify-content-between align-items-center gap-2">
+                    <div class="small text-muted" id="usageSummary">-</div>
+                    <div class="btn-group">
+                        <button type="button" class="btn btn-sm btn-outline-primary" id="usagePrevBtn">Forrige</button>
+                        <button type="button" class="btn btn-sm btn-outline-primary" id="usageNextBtn">Næste</button>
+                    </div>
+                </div>
+            </div>
+        </div>
+    </div>
 </div>

 <!-- Create/Edit Tag Modal -->
@@ -148,9 +323,17 @@
                             <option value="category">Category - Emne/område</option>
                             <option value="priority">Priority - Hastighed</option>
                             <option value="billing">Billing - Økonomi</option>
+                            <option value="brand">Brand - Leverandør/produktbrand</option>
+                            <option value="type">Type - Sagstype/arbejdstype</option>
                         </select>
                     </div>
+
+                    <div class="mb-3">
+                        <label for="tagCatchWords" class="form-label">Catch words</label>
+                        <textarea class="form-control" id="tagCatchWords" rows="3" placeholder="fx: office 365, outlook, smtp"></textarea>
+                        <small class="text-muted">Brug komma eller ny linje mellem ord. Bruges til auto-forslag på sager.</small>
+                    </div>

                     <div class="mb-3">
                         <label for="tagDescription" class="form-label">Beskrivelse</label>
                         <textarea class="form-control" id="tagDescription" rows="3"></textarea>
@@ -186,19 +369,59 @@
             </div>
         </div>
     </div>
+{% endblock %}

-<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
+{% block extra_js %}
 <script>
 let allTags = [];
 let currentFilter = 'all';
+let usageDebounceTimer = null;
+const usageState = {
+    filters: {
+        tag_name: '',
+        tag_type: '',
+        module: ''
+    },
+    page: 1,
+    page_size: 25,
+    sort_by: 'tagged_at',
+    sort_dir: 'desc',
+    total: 0,
+    total_pages: 0
+};

 // Load tags on page load
 document.addEventListener('DOMContentLoaded', () => {
     loadTags();
+    loadTagUsage();
     setupEventListeners();
+    const initialSection = window.location.hash === '#search' ? 'search' : 'admin';
+    switchTagSection(initialSection, false);
 });

+function switchTagSection(section, updateHash = true) {
+    const normalized = section === 'search' ? 'search' : 'admin';
+    document.querySelectorAll('#sectionTabs .nav-link').forEach(link => {
+        link.classList.toggle('active', link.dataset.section === normalized);
+    });
+
+    document.getElementById('tagAdminSection').classList.toggle('d-none', normalized !== 'admin');
+    document.getElementById('tagSearchSection').classList.toggle('d-none', normalized !== 'search');
+
+    if (updateHash) {
+        const hash = normalized === 'search' ? '#search' : '#admin';
+        window.history.replaceState(null, '', hash);
+    }
+}

 function setupEventListeners() {
+    // Section tabs
+    document.querySelectorAll('#sectionTabs button').forEach(btn => {
+        btn.addEventListener('click', () => {
+            switchTagSection(btn.dataset.section);
+        });
+    });
+
     // Type filter tabs
     document.querySelectorAll('#typeFilter a').forEach(tab => {
         tab.addEventListener('click', (e) => {
@@ -229,7 +452,9 @@
             'status': '#ffd700',
             'category': '#0f4c75',
             'priority': '#dc3545',
-            'billing': '#2d6a4f'
+            'billing': '#2d6a4f',
+            'brand': '#006d77',
+            'type': '#5c677d'
         };
         if (colorMap[type]) {
             document.getElementById('tagColor').value = colorMap[type];
@@ -240,6 +465,61 @@
     // Save button
     document.getElementById('saveTagBtn').addEventListener('click', saveTag);
+
+    // Usage filters
+    document.getElementById('usageFilterTagName').addEventListener('input', () => {
+        usageState.filters.tag_name = document.getElementById('usageFilterTagName').value.trim();
+        usageState.page = 1;
+        debounceUsageLoad();
+    });
+    document.getElementById('usageFilterTagType').addEventListener('change', () => {
+        usageState.filters.tag_type = document.getElementById('usageFilterTagType').value;
+        usageState.page = 1;
+        loadTagUsage();
+    });
+    document.getElementById('usageFilterModule').addEventListener('change', () => {
+        usageState.filters.module = document.getElementById('usageFilterModule').value;
+        usageState.page = 1;
+        loadTagUsage();
+    });
+
+    document.getElementById('resetUsageFiltersBtn').addEventListener('click', () => {
+        usageState.filters = { tag_name: '', tag_type: '', module: '' };
+        usageState.page = 1;
+        document.getElementById('usageFilterTagName').value = '';
+        document.getElementById('usageFilterTagType').value = '';
+        document.getElementById('usageFilterModule').value = '';
+        loadTagUsage();
+    });
+
+    document.getElementById('usagePrevBtn').addEventListener('click', () => {
+        if (usageState.page > 1) {
+            usageState.page -= 1;
+            loadTagUsage();
+        }
+    });
+
+    document.getElementById('usageNextBtn').addEventListener('click', () => {
+        if (usageState.page < usageState.total_pages) {
+            usageState.page += 1;
+            loadTagUsage();
+        }
+    });
+
+    document.querySelectorAll('.usage-sort-btn').forEach(btn => {
+        btn.addEventListener('click', () => {
+            const sortBy = btn.dataset.sortBy;
+            if (usageState.sort_by === sortBy) {
+                usageState.sort_dir = usageState.sort_dir === 'asc' ? 'desc' : 'asc';
+            } else {
+                usageState.sort_by = sortBy;
+                usageState.sort_dir = sortBy === 'tagged_at' ? 'desc' : 'asc';
+            }
+            usageState.page = 1;
+            updateSortIndicators();
+            loadTagUsage();
+        });
+    });

     // Modal reset on close
     document.getElementById('createTagModal').addEventListener('hidden.bs.modal', () => {
         document.getElementById('tagForm').reset();
@@ -264,6 +544,131 @@
     }
 }

+function escapeHtml(value) {
+    return String(value ?? '')
+        .replaceAll('&', '&amp;')
+        .replaceAll('<', '&lt;')
+        .replaceAll('>', '&gt;')
+        .replaceAll('"', '&quot;')
+        .replaceAll("'", '&#39;');
+}
+
+function debounceUsageLoad() {
+    if (usageDebounceTimer) {
+        clearTimeout(usageDebounceTimer);
+    }
+    usageDebounceTimer = setTimeout(() => loadTagUsage(), 280);
+}
+
+function updateSortIndicators() {
+    document.querySelectorAll('.usage-sort-btn').forEach(btn => {
+        const icon = btn.querySelector('i');
+        if (!icon) return;
+        btn.classList.remove('active');
+        icon.className = 'bi bi-chevron-expand';
+        if (btn.dataset.sortBy === usageState.sort_by) {
+            btn.classList.add('active');
+            icon.className = usageState.sort_dir === 'asc' ? 'bi bi-sort-up' : 'bi bi-sort-down';
+        }
+    });
+}
+
+function renderUsageTable(items) {
+    const tbody = document.getElementById('usageTableBody');
+    if (!Array.isArray(items) || !items.length) {
+        tbody.innerHTML = '<tr><td colspan="7" class="text-center text-muted py-4">Ingen taggede rækker matcher filtrene.</td></tr>';
+        return;
+    }
+
+    tbody.innerHTML = items.map(row => {
+        const taggedAt = row.tagged_at ? new Date(row.tagged_at).toLocaleString('da-DK') : '-';
+        const color = /^#[0-9A-Fa-f]{6}$/.test(String(row.tag_color || '')) ? row.tag_color : '#0f4c75';
+        const inactiveBadge = row.tag_is_active ? '' : '<span class="badge bg-secondary ms-2">Inaktiv</span>';
+        const entityTitle = escapeHtml(row.entity_title || `#${row.entity_id || ''}`);
+        const entityCell = row.entity_url
+            ? `<a href="${escapeHtml(row.entity_url)}" class="text-decoration-none fw-semibold">${entityTitle}</a>`
+            : `<span class="fw-semibold">${entityTitle}</span>`;

+        return `
+            <tr>
+                <td>
+                    <span class="tag-badge" style="background:${color}; color:#fff; margin:0;">${escapeHtml(row.tag_name)}</span>
+                    ${inactiveBadge}
+                </td>
+                <td><span class="badge bg-light text-dark text-uppercase">${escapeHtml(row.tag_type)}</span></td>
+                <td><span class="module-badge"><i class="bi bi-box"></i>${escapeHtml(row.module)}</span></td>
+                <td>${entityCell}</td>
+                <td><span class="text-muted">${escapeHtml(row.entity_type)}</span></td>
+                <td><strong>#${escapeHtml(row.entity_id)}</strong></td>
+                <td class="small text-muted">${escapeHtml(taggedAt)}</td>
+            </tr>
+        `;
+    }).join('');
+}
+
+function renderUsageSummary() {
+    const summary = document.getElementById('usageSummary');
+    const prevBtn = document.getElementById('usagePrevBtn');
+    const nextBtn = document.getElementById('usageNextBtn');
+
+    const total = usageState.total;
+    const page = usageState.page;
+    const pageSize = usageState.page_size;
+    const from = total ? ((page - 1) * pageSize + 1) : 0;
+    const to = total ? Math.min(page * pageSize, total) : 0;
+
+    summary.textContent = `Viser ${from}-${to} af ${total} rækker`;
+    prevBtn.disabled = page <= 1;
+    nextBtn.disabled = page >= usageState.total_pages;
+}
+
+function fillModuleFilter(options) {
+    const select = document.getElementById('usageFilterModule');
+    const currentValue = usageState.filters.module;
+    const base = '<option value="">Alle moduler</option>';
+    const rows = (options || []).map(option => {
+        return `<option value="${escapeHtml(option.value)}">${escapeHtml(option.label)}</option>`;
+    }).join('');
+    select.innerHTML = `${base}${rows}`;
+    select.value = currentValue || '';
+}
+
+async function loadTagUsage() {
+    const tbody = document.getElementById('usageTableBody');
+    tbody.innerHTML = '<tr><td colspan="7" class="text-center text-muted py-4">Indlæser...</td></tr>';
+
+    try {
+        const params = new URLSearchParams({
+            page: String(usageState.page),
+            page_size: String(usageState.page_size),
+            sort_by: usageState.sort_by,
+            sort_dir: usageState.sort_dir
+        });
+
+        if (usageState.filters.tag_name) params.set('tag_name', usageState.filters.tag_name);
+        if (usageState.filters.tag_type) params.set('tag_type', usageState.filters.tag_type);
+        if (usageState.filters.module) params.set('module', usageState.filters.module);
+
+        const response = await fetch(`/api/v1/tags/usage?${params.toString()}`);
+        if (!response.ok) {
+            throw new Error('Kunne ikke hente tag søgning');
+        }
+
+        const payload = await response.json();
+        usageState.total = Number(payload?.pagination?.total || 0);
+        usageState.total_pages = Number(payload?.pagination?.total_pages || 0);
+        usageState.page = Number(payload?.pagination?.page || usageState.page);
+
+        fillModuleFilter(payload.module_options || []);
+        renderUsageTable(payload.items || []);
+        renderUsageSummary();
+        updateSortIndicators();
+    } catch (error) {
+        tbody.innerHTML = `<tr><td colspan="7" class="text-center text-danger py-4">Fejl ved indlæsning af tag søgning: ${escapeHtml(error.message)}</td></tr>`;
+        document.getElementById('usageSummary').textContent = 'Fejl ved datahentning';
+    }
+}

 function renderTags() {
     const container = document.getElementById('tagsList');
     const filteredTags = currentFilter === 'all'
@@ -293,6 +698,7 @@
                 ${!tag.is_active ? '<span class="badge bg-secondary ms-2">Inaktiv</span>' : ''}
             </div>
             ${tag.description ? `<p class="text-muted mb-0 small">${tag.description}</p>` : ''}
+            ${Array.isArray(tag.catch_words) && tag.catch_words.length ? `<p class="mb-0 mt-1"><small class="text-muted">Catch words: ${tag.catch_words.join(', ')}</small></p>` : ''}
         </div>
         <div class="btn-group">
             <button class="btn btn-sm btn-outline-primary" onclick="editTag(${tag.id})">
@@ -315,7 +721,11 @@
         description: document.getElementById('tagDescription').value || null,
         color: document.getElementById('tagColorHex').value,
         icon: document.getElementById('tagIcon').value || null,
-        is_active: document.getElementById('tagActive').checked
+        is_active: document.getElementById('tagActive').checked,
+        catch_words: document.getElementById('tagCatchWords').value
+            .split(/[\n,]+/)
+            .map(v => v.trim().toLowerCase())
+            .filter(v => v.length > 1)
     };

     try {
@@ -352,6 +762,7 @@
     document.getElementById('tagColorHex').value = tag.color;
     document.getElementById('tagIcon').value = tag.icon || '';
     document.getElementById('tagActive').checked = tag.is_active;
+    document.getElementById('tagCatchWords').value = Array.isArray(tag.catch_words) ? tag.catch_words.join(', ') : '';

     document.querySelector('#createTagModal .modal-title').textContent = 'Rediger Tag';
     new bootstrap.Modal(document.getElementById('createTagModal')).show();
@@ -374,5 +785,4 @@
     }
 }
 </script>
-</body>
-</html>
+{% endblock %}

@@ -11,7 +11,7 @@ import json
 import re
 import asyncio
 from typing import List, Optional
-from fastapi import APIRouter, HTTPException, Query, status
+from fastapi import APIRouter, Depends, HTTPException, Query, status
 from fastapi.responses import JSONResponse
 
 from app.ticket.backend.ticket_service import TicketService
@@ -55,11 +55,13 @@ from app.ticket.backend.models import (
     TicketDeadlineUpdateRequest
 )
 from app.core.database import execute_query, execute_insert, execute_update, execute_query_single
+from app.core.auth_dependencies import require_any_permission
 from datetime import date, datetime
 
 logger = logging.getLogger(__name__)
 
 router = APIRouter()
+sync_admin_access = require_any_permission("users.manage", "system.admin")
 
 
 def _get_first_value(data: dict, keys: List[str]) -> Optional[str]:
@@ -127,6 +129,31 @@ def _escape_simply_value(value: str) -> str:
     return value.replace("'", "''")
 
 
+def _extract_count_value(rows: List[dict]) -> Optional[int]:
+    if not rows:
+        return None
+
+    row = rows[0] or {}
+    if not isinstance(row, dict):
+        return None
+
+    for key in ("total_count", "count", "count(*)", "COUNT(*)"):
+        value = row.get(key)
+        if value is not None:
+            try:
+                return int(value)
+            except (TypeError, ValueError):
+                continue
+
+    for value in row.values():
+        try:
+            return int(value)
+        except (TypeError, ValueError):
+            continue
+
+    return None
+
+
 async def _vtiger_query_with_retry(vtiger, query_string: str, retries: int = 5, base_delay: float = 1.25) -> List[dict]:
     """Run vTiger query with exponential backoff on rate-limit responses."""
     for attempt in range(retries + 1):
@@ -1825,7 +1852,8 @@ async def import_simply_archived_tickets(
     limit: int = Query(5000, ge=1, le=50000, description="Maximum tickets to import"),
     include_messages: bool = Query(True, description="Include comments and emails"),
     ticket_number: Optional[str] = Query(None, description="Import a single ticket by number"),
-    force: bool = Query(False, description="Update even if sync hash matches")
+    force: bool = Query(False, description="Update even if sync hash matches"),
+    current_user: dict = Depends(sync_admin_access)
 ):
     """
    One-time import of archived tickets from Simply-CRM.
@@ -2157,7 +2185,8 @@ async def import_vtiger_archived_tickets(
     limit: int = Query(5000, ge=1, le=50000, description="Maximum tickets to import"),
     include_messages: bool = Query(True, description="Include comments and emails"),
     ticket_number: Optional[str] = Query(None, description="Import a single ticket by number"),
-    force: bool = Query(False, description="Update even if sync hash matches")
+    force: bool = Query(False, description="Update even if sync hash matches"),
+    current_user: dict = Depends(sync_admin_access)
 ):
     """
    One-time import of archived tickets from vTiger (Cases module).
@@ -2493,8 +2522,93 @@ async def import_vtiger_archived_tickets(
         raise HTTPException(status_code=500, detail=str(e))
 
 
+@router.get("/archived/status", tags=["Archived Tickets"])
+async def get_archived_sync_status(current_user: dict = Depends(sync_admin_access)):
+    """
+    Return archived sync parity status for Simply-CRM and vTiger.
+    """
+    source_keys = ("simplycrm", "vtiger")
+    sources: dict[str, dict] = {}
+
+    for source_key in source_keys:
+        local_ticket_row = execute_query_single(
+            """
+            SELECT COUNT(*) AS total_tickets,
+                   MAX(last_synced_at) AS last_synced_at
+            FROM tticket_archived_tickets
+            WHERE source_system = %s
+            """,
+            (source_key,)
+        ) or {}
+
+        local_message_row = execute_query_single(
+            """
+            SELECT COUNT(*) AS total_messages
+            FROM tticket_archived_messages m
+            INNER JOIN tticket_archived_tickets t ON t.id = m.archived_ticket_id
+            WHERE t.source_system = %s
+            """,
+            (source_key,)
+        ) or {}
+
+        local_tickets = int(local_ticket_row.get("total_tickets") or 0)
+        local_messages = int(local_message_row.get("total_messages") or 0)
+        last_synced_value = local_ticket_row.get("last_synced_at")
+        if isinstance(last_synced_value, (datetime, date)):
+            last_synced_at_iso = last_synced_value.isoformat()
+        else:
+            last_synced_at_iso = None
+
+        sources[source_key] = {
+            "remote_total_tickets": None,
+            "local_total_tickets": local_tickets,
+            "local_total_messages": local_messages,
+            "last_synced_at": last_synced_at_iso,
+            "diff": None,
+            "is_synced": False,
+            "error": None,
+        }
+
+    try:
+        async with SimplyCRMService() as service:
+            module_name = getattr(settings, "SIMPLYCRM_TICKET_MODULE", "Tickets")
+            simply_rows = await service.query(f"SELECT count(*) AS total_count FROM {module_name};")
+            simply_remote_count = _extract_count_value(simply_rows)
+            sources["simplycrm"]["remote_total_tickets"] = simply_remote_count
+            if simply_remote_count is not None:
+                sources["simplycrm"]["diff"] = simply_remote_count - sources["simplycrm"]["local_total_tickets"]
+                sources["simplycrm"]["is_synced"] = sources["simplycrm"]["diff"] == 0
+            elif service.last_query_error:
+                sources["simplycrm"]["error"] = service.last_query_error.get("message") or str(service.last_query_error)
+    except Exception as e:
+        logger.warning("⚠️ Simply-CRM archived status check failed: %s", e)
+        sources["simplycrm"]["error"] = str(e)
+
+    try:
+        vtiger = get_vtiger_service()
+        vtiger_rows = await _vtiger_query_with_retry(vtiger, "SELECT count(*) AS total_count FROM Cases;")
+        vtiger_remote_count = _extract_count_value(vtiger_rows)
+        sources["vtiger"]["remote_total_tickets"] = vtiger_remote_count
+        if vtiger_remote_count is not None:
+            sources["vtiger"]["diff"] = vtiger_remote_count - sources["vtiger"]["local_total_tickets"]
+            sources["vtiger"]["is_synced"] = sources["vtiger"]["diff"] == 0
+        elif vtiger.last_query_error:
+            sources["vtiger"]["error"] = vtiger.last_query_error.get("message") or str(vtiger.last_query_error)
+    except Exception as e:
+        logger.warning("⚠️ vTiger archived status check failed: %s", e)
+        sources["vtiger"]["error"] = str(e)
+
+    overall_synced = all(sources[key].get("is_synced") is True for key in source_keys)
+
+    return {
+        "checked_at": datetime.utcnow().isoformat(),
+        "overall_synced": overall_synced,
+        "sources": sources,
+    }
+
+
 @router.get("/archived/simply/modules", tags=["Archived Tickets"])
-async def list_simply_modules():
+async def list_simply_modules(current_user: dict = Depends(sync_admin_access)):
     """
     List available Simply-CRM modules (debug helper).
     """
@@ -2510,7 +2624,8 @@ async def list_simply_modules():
 @router.get("/archived/simply/ticket", tags=["Archived Tickets"])
 async def fetch_simply_ticket(
     ticket_number: Optional[str] = Query(None, description="Ticket number, e.g. TT934"),
-    external_id: Optional[str] = Query(None, description="VTiger record ID, e.g. 17x1234")
+    external_id: Optional[str] = Query(None, description="VTiger record ID, e.g. 17x1234"),
+    current_user: dict = Depends(sync_admin_access)
 ):
     """
     Fetch a single HelpDesk ticket from Simply-CRM by ticket number or record id.
@@ -2544,7 +2659,8 @@ async def fetch_simply_ticket(
 @router.get("/archived/simply/record", tags=["Archived Tickets"])
 async def fetch_simply_record(
     record_id: str = Query(..., description="VTiger record ID, e.g. 11x2601"),
-    module: Optional[str] = Query(None, description="Optional module name for context")
+    module: Optional[str] = Query(None, description="Optional module name for context"),
+    current_user: dict = Depends(sync_admin_access)
 ):
     """
     Fetch a single record from Simply-CRM by record id.

@@ -2,6 +2,21 @@
 
 {% block title %}Tekniker Dashboard V1 - Overblik{% endblock %}
 
+{% block extra_css %}
+<style>
+    #caseTable thead th {
+        white-space: nowrap;
+        font-size: 0.78rem;
+        letter-spacing: 0.02em;
+    }
+
+    #caseTable tbody td {
+        font-size: 0.84rem;
+        vertical-align: top;
+    }
+</style>
+{% endblock %}
+
 {% block content %}
 <div class="container-fluid py-4">
     <div class="d-flex justify-content-between align-items-start flex-wrap gap-3 mb-4">
@@ -65,15 +80,22 @@
             <table class="table table-sm table-hover mb-0" id="caseTable">
                 <thead class="table-light" id="tableHead">
                     <tr>
-                        <th>ID</th>
-                        <th>Titel</th>
-                        <th>Kunde</th>
-                        <th>Status</th>
-                        <th>Dato</th>
+                        <th>SagsID</th>
+                        <th>Virksom.</th>
+                        <th>Kontakt</th>
+                        <th>Beskr.</th>
+                        <th>Type</th>
+                        <th>Prioritet</th>
+                        <th>Ansvarl.</th>
+                        <th>Gruppe/Level</th>
+                        <th>Opret.</th>
+                        <th>Start arbejde</th>
+                        <th>Start inden</th>
+                        <th>Deadline</th>
                     </tr>
                 </thead>
                 <tbody id="tableBody">
-                    <tr><td colspan="5" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>
+                    <tr><td colspan="12" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>
                 </tbody>
             </table>
         </div>
@@ -167,8 +189,16 @@ const allData = {
     {
         id: {{ item.id }},
         titel: {{ item.titel | tojson | safe }},
+        beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
+        priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
         customer_name: {{ item.customer_name | tojson | safe }},
+        kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
+        case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
+        ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
+        assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
         created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
+        start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
+        deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
        status: {{ item.status | tojson | safe if item.status else 'null' }},
        deadline: {{ item.deadline.isoformat() | tojson | safe if item.deadline else 'null' }}
     }{% if not loop.last %},{% endif %}
@@ -179,7 +209,16 @@ const allData = {
     {
         id: {{ item.id }},
         titel: {{ item.titel | tojson | safe }},
+        beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
+        priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
         customer_name: {{ item.customer_name | tojson | safe }},
+        kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
+        case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
+        ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
+        assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
+        created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
        status: {{ item.status | tojson | safe if item.status else 'null' }},
        deadline: {{ item.deadline.isoformat() | tojson | safe if item.deadline else 'null' }}
     }{% if not loop.last %},{% endif %}
@@ -191,9 +230,16 @@ const allData = {
         item_type: {{ item.item_type | tojson | safe }},
         item_id: {{ item.item_id }},
         title: {{ item.title | tojson | safe }},
+        beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
         customer_name: {{ item.customer_name | tojson | safe }},
+        kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
         task_reason: {{ item.task_reason | tojson | safe if item.task_reason else 'null' }},
         created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
+        start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
+        deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
+        case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
+        ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
+        assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
        priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
        status: {{ item.status | tojson | safe if item.status else 'null' }}
     }{% if not loop.last %},{% endif %}
@@ -205,7 +251,16 @@ const allData = {
         id: {{ item.id }},
         titel: {{ item.titel | tojson | safe }},
         group_name: {{ item.group_name | tojson | safe }},
+        beskrivelse: {{ item.beskrivelse | tojson | safe if item.beskrivelse else 'null' }},
+        priority: {{ item.priority | tojson | safe if item.priority else 'null' }},
         customer_name: {{ item.customer_name | tojson | safe }},
+        kontakt_navn: {{ item.kontakt_navn | tojson | safe if item.kontakt_navn else 'null' }},
+        case_type: {{ item.case_type | tojson | safe if item.case_type else 'null' }},
+        ansvarlig_navn: {{ item.ansvarlig_navn | tojson | safe if item.ansvarlig_navn else 'null' }},
+        assigned_group_name: {{ item.assigned_group_name | tojson | safe if item.assigned_group_name else 'null' }},
+        created_at: {{ item.created_at.isoformat() | tojson | safe if item.created_at else 'null' }},
+        start_date: {{ item.start_date.isoformat() | tojson | safe if item.start_date else 'null' }},
+        deferred_until: {{ item.deferred_until.isoformat() | tojson | safe if item.deferred_until else 'null' }},
        status: {{ item.status | tojson | safe if item.status else 'null' }},
        deadline: {{ item.deadline.isoformat() | tojson | safe if item.deadline else 'null' }}
     }{% if not loop.last %},{% endif %}
@@ -225,6 +280,32 @@ function formatShortDate(dateStr) {
    return d.toLocaleDateString('da-DK', { day: '2-digit', month: '2-digit', year: 'numeric' });
 }
 
+function renderCaseTableRow(item, idField = 'id', typeField = 'case') {
+    const itemId = item[idField];
+    const openType = typeField === 'item_type' ? item.item_type : 'case';
+    const description = item.beskrivelse || item.titel || item.title || '-';
+    const typeValue = item.case_type || item.item_type || '-';
+    const groupLevel = item.assigned_group_name || item.group_name || '-';
+    const priorityValue = item.priority || 'normal';
+
+    return `
+        <tr onclick="showCaseDetails(${itemId}, '${openType}')" style="cursor:pointer;">
+            <td>#${itemId}</td>
+            <td>${item.customer_name || '-'}</td>
+            <td>${item.kontakt_navn || '-'}</td>
+            <td>${description}</td>
+            <td>${typeValue}</td>
+            <td>${priorityValue}</td>
+            <td>${item.ansvarlig_navn || '-'}</td>
+            <td>${groupLevel}</td>
+            <td>${formatShortDate(item.created_at)}</td>
+            <td>${formatShortDate(item.start_date)}</td>
+            <td>${formatShortDate(item.deferred_until)}</td>
+            <td>${formatShortDate(item.deadline)}</td>
+        </tr>
+    `;
+}
+
 function toggleSection(filterName) {
     const kpiCard = document.getElementById('kpi' + filterName.charAt(0).toUpperCase() + filterName.slice(1));
     const listTitle = document.getElementById('listTitle');
@@ -242,7 +323,7 @@ function toggleSection(filterName) {
     if (currentFilter === filterName) {
         currentFilter = null;
         listTitle.textContent = 'Alle sager';
-        tableBody.innerHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>';
+        tableBody.innerHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Vælg et filter ovenfor</td></tr>';
         return;
     }
 
@@ -266,70 +347,43 @@ function filterAndPopulateTable(filterName) {
         listTitle.innerHTML = '<i class="bi bi-inbox-fill text-primary"></i> Nye sager';
         const data = allData.newCases || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen nye sager</td></tr>';
+            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen nye sager</td></tr>';
         } else {
-            bodyHTML = data.map(item => `
-                <tr onclick="showCaseDetails(${item.id}, 'case')" style="cursor:pointer;">
-                    <td>#${item.id}</td>
-                    <td>${item.titel || '-'}</td>
-                    <td>${item.customer_name || '-'}</td>
-                    <td><span class="badge bg-secondary">${item.status || 'Ny'}</span></td>
-                    <td>${formatDate(item.created_at)}</td>
-                </tr>
-            `).join('');
+            bodyHTML = data.map(item => renderCaseTableRow(item)).join('');
         }
     } else if (filterName === 'myCases') {
         listTitle.innerHTML = '<i class="bi bi-person-check-fill text-success"></i> Mine sager';
         const data = allData.myCases || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen sager tildelt</td></tr>';
+            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen sager tildelt</td></tr>';
         } else {
-            bodyHTML = data.map(item => `
-                <tr onclick="showCaseDetails(${item.id}, 'case')" style="cursor:pointer;">
-                    <td>#${item.id}</td>
-                    <td>${item.titel || '-'}</td>
-                    <td>${item.customer_name || '-'}</td>
-                    <td><span class="badge bg-info">${item.status || '-'}</span></td>
-                    <td>${formatShortDate(item.deadline)}</td>
-                </tr>
-            `).join('');
+            bodyHTML = data.map(item => renderCaseTableRow(item)).join('');
         }
     } else if (filterName === 'todayTasks') {
         listTitle.innerHTML = '<i class="bi bi-calendar-check text-primary"></i> Dagens opgaver';
         const data = allData.todayTasks || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen opgaver i dag</td></tr>';
+            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen opgaver i dag</td></tr>';
         } else {
             bodyHTML = data.map(item => {
-                const badge = item.item_type === 'case'
-                    ? '<span class="badge bg-primary">Sag</span>'
-                    : '<span class="badge bg-info">Ticket</span>';
-                return `
-                    <tr onclick="showCaseDetails(${item.item_id}, '${item.item_type}')" style="cursor:pointer;">
-                        <td>#${item.item_id}</td>
-                        <td>${item.title || '-'}<br><small class="text-muted">${item.task_reason || ''}</small></td>
-                        <td>${item.customer_name || '-'}</td>
-                        <td>${badge}</td>
-                        <td>${formatDate(item.created_at)}</td>
-                    </tr>
-                `;
+                const normalized = {
+                    ...item,
+                    id: item.item_id,
+                    titel: item.title,
+                    beskrivelse: item.task_reason || item.beskrivelse,
+                    deadline: item.deadline || item.due_at,
+                    case_type: item.case_type || item.item_type
+                };
+                return renderCaseTableRow(normalized, 'id', 'item_type');
             }).join('');
         }
     } else if (filterName === 'groupCases') {
         listTitle.innerHTML = '<i class="bi bi-people-fill text-info"></i> Gruppe-sager';
         const data = allData.groupCases || [];
         if (data.length === 0) {
-            bodyHTML = '<tr><td colspan="5" class="text-center text-muted py-3">Ingen gruppe-sager</td></tr>';
+            bodyHTML = '<tr><td colspan="12" class="text-center text-muted py-3">Ingen gruppe-sager</td></tr>';
         } else {
-            bodyHTML = data.map(item => `
-                <tr onclick="showCaseDetails(${item.id}, 'case')" style="cursor:pointer;">
-                    <td>#${item.id}</td>
-                    <td>${item.titel || '-'}<br><span class="badge bg-secondary">${item.group_name || '-'}</span></td>
-                    <td>${item.customer_name || '-'}</td>
-                    <td><span class="badge bg-info">${item.status || '-'}</span></td>
-                    <td>${formatShortDate(item.deadline)}</td>
-                </tr>
-            `).join('');
+            bodyHTML = data.map(item => renderCaseTableRow(item)).join('');
         }
     }
 

@@ -86,14 +86,38 @@
         <div class="card-body p-0">
             <div class="table-responsive">
                 <table class="table table-sm table-hover mb-0">
-                    <thead class="table-light"><tr><th>ID</th><th>Titel</th><th>Kunde</th><th>Oprettet</th></tr></thead>
+                    <thead class="table-light">
+                        <tr>
+                            <th>SagsID</th>
+                            <th>Virksom.</th>
+                            <th>Kontakt</th>
+                            <th>Beskr.</th>
+                            <th>Type</th>
+                            <th>Ansvarl.</th>
+                            <th>Gruppe/Level</th>
+                            <th>Opret.</th>
+                            <th>Start arbejde</th>
+                            <th>Start inden</th>
+                            <th>Deadline</th>
+                        </tr>
+                    </thead>
                     <tbody>
                         {% for item in new_cases %}
                         <tr onclick="window.location.href='/sag/{{ item.id }}'" style="cursor:pointer;">
-                            <td>#{{ item.id }}</td><td>{{ item.titel }}</td><td>{{ item.customer_name }}</td><td>{{ item.created_at.strftime('%d/%m %H:%M') if item.created_at else '-' }}</td>
+                            <td>#{{ item.id }}</td>
+                            <td>{{ item.customer_name or '-' }}</td>
+                            <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
+                            <td>{{ item.beskrivelse or item.titel or '-' }}</td>
+                            <td>{{ item.case_type or '-' }}</td>
+                            <td>{{ item.ansvarlig_navn or '-' }}</td>
+                            <td>{{ item.assigned_group_name or '-' }}</td>
+                            <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
+                            <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
+                            <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
+                            <td>{{ item.deadline.strftime('%d/%m/%Y') if item.deadline else '-' }}</td>
                         </tr>
                         {% else %}
-                        <tr><td colspan="4" class="text-center text-muted py-3">Ingen nye sager</td></tr>
+                        <tr><td colspan="11" class="text-center text-muted py-3">Ingen nye sager</td></tr>
                         {% endfor %}
                     </tbody>
                 </table>

@@ -32,59 +32,71 @@
     <table class="table table-hover table-sm mb-0 align-middle">
         <thead class="table-light">
             <tr>
+                <th>SagsID</th>
+                <th>Virksom.</th>
+                <th>Kontakt</th>
+                <th>Beskr.</th>
                 <th>Type</th>
-                <th>ID</th>
-                <th>Titel</th>
-                <th>Kunde</th>
-                <th>Status</th>
-                <th>Prioritet/Reason</th>
+                <th>Ansvarl.</th>
+                <th>Gruppe/Level</th>
+                <th>Opret.</th>
+                <th>Start arbejde</th>
+                <th>Start inden</th>
                 <th>Deadline</th>
-                <th>Handling</th>
             </tr>
         </thead>
         <tbody>
             {% for item in urgent_overdue %}
             <tr>
-                <td><span class="badge bg-danger">Haste</span></td>
                 <td>#{{ item.item_id }}</td>
-                <td>{{ item.title }}</td>
-                <td>{{ item.customer_name }}</td>
-                <td>{{ item.status }}</td>
-                <td>{{ item.attention_reason }}</td>
+                <td>{{ item.customer_name or '-' }}</td>
+                <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
+                <td>{{ item.beskrivelse or item.title or '-' }}</td>
+                <td>{{ item.case_type or item.item_type or '-' }}</td>
+                <td>{{ item.ansvarlig_navn or '-' }}</td>
+                <td>{{ item.assigned_group_name or '-' }}</td>
+                <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
+                <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
+                <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
                 <td>{{ item.due_at.strftime('%d/%m/%Y') if item.due_at else '-' }}</td>
-                <td><a href="{{ '/sag/' ~ item.item_id if item.item_type == 'case' else '/ticket/tickets/' ~ item.item_id }}" class="btn btn-sm btn-danger">Åbn</a></td>
             </tr>
             {% endfor %}
 
             {% for item in today_tasks %}
             <tr>
-                <td><span class="badge bg-primary">I dag</span></td>
                 <td>#{{ item.item_id }}</td>
-                <td>{{ item.title }}</td>
-                <td>{{ item.customer_name }}</td>
-                <td>{{ item.status }}</td>
-                <td>{{ item.task_reason }}</td>
+                <td>{{ item.customer_name or '-' }}</td>
+                <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
+                <td>{{ item.beskrivelse or item.title or item.task_reason or '-' }}</td>
+                <td>{{ item.case_type or item.item_type or '-' }}</td>
+                <td>{{ item.ansvarlig_navn or '-' }}</td>
+                <td>{{ item.assigned_group_name or '-' }}</td>
+                <td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
+                <td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
+                <td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
                 <td>{{ item.due_at.strftime('%d/%m/%Y') if item.due_at else '-' }}</td>
-                <td><a href="{{ '/sag/' ~ item.item_id if item.item_type == 'case' else '/ticket/tickets/' ~ item.item_id }}" class="btn btn-sm btn-outline-primary">Åbn</a></td>
             </tr>
             {% endfor %}
 
             {% for item in my_cases %}
             <tr>
-                <td><span class="badge bg-secondary">Min sag</span></td>
                 <td>#{{ item.id }}</td>
-                <td>{{ item.titel }}</td>
-                <td>{{ item.customer_name }}</td>
-                <td>{{ item.status }}</td>
+                <td>{{ item.customer_name or '-' }}</td>
+                <td>{{ item.kontakt_navn if item.kontakt_navn and item.kontakt_navn.strip() else '-' }}</td>
+                <td>{{ item.beskrivelse or item.titel or '-' }}</td>
|
||||||
<td>-</td>
|
<td>{{ item.case_type or '-' }}</td>
|
||||||
|
<td>{{ item.ansvarlig_navn or '-' }}</td>
|
||||||
|
<td>{{ item.assigned_group_name or '-' }}</td>
|
||||||
|
<td>{{ item.created_at.strftime('%d/%m/%Y') if item.created_at else '-' }}</td>
|
||||||
|
<td>{{ item.start_date.strftime('%d/%m/%Y') if item.start_date else '-' }}</td>
|
||||||
|
<td>{{ item.deferred_until.strftime('%d/%m/%Y') if item.deferred_until else '-' }}</td>
|
||||||
<td>{{ item.deadline.strftime('%d/%m/%Y') if item.deadline else '-' }}</td>
|
<td>{{ item.deadline.strftime('%d/%m/%Y') if item.deadline else '-' }}</td>
|
||||||
<td><a href="/sag/{{ item.id }}" class="btn btn-sm btn-outline-secondary">Åbn</a></td>
|
|
||||||
</tr>
|
</tr>
|
||||||
{% endfor %}
|
{% endfor %}
|
||||||
|
|
||||||
{% if not urgent_overdue and not today_tasks and not my_cases %}
|
{% if not urgent_overdue and not today_tasks and not my_cases %}
|
||||||
<tr>
|
<tr>
|
||||||
<td colspan="8" class="text-center text-muted py-4">Ingen data at vise for denne tekniker.</td>
|
<td colspan="11" class="text-center text-muted py-4">Ingen data at vise for denne tekniker.</td>
|
||||||
</tr>
|
</tr>
|
||||||
{% endif %}
|
{% endif %}
|
||||||
</tbody>
|
</tbody>
|
||||||
|
|||||||
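A note on the fallbacks in the new template columns: `kontakt_navn` is produced by SQL `CONCAT(first_name, ' ', last_name)`, so when both name parts are empty the value is a single space rather than NULL, which is why the template guards with `.strip()` and not plain truthiness. The date cells use the same `value if value else '-'` shape. A small standalone sketch of both guards (plain Python mirroring the Jinja expressions, not the app's actual code):

```python
from datetime import date


def display_contact(kontakt_navn):
    """SQL CONCAT(first, ' ', last) yields ' ' when both parts are empty,
    so a plain truthiness check is not enough -- strip first."""
    if kontakt_navn and kontakt_navn.strip():
        return kontakt_navn
    return "-"


def display_date(value):
    """Render a date as dd/mm/YYYY, or '-' when the column is NULL."""
    return value.strftime("%d/%m/%Y") if value else "-"


print(display_contact(" "))             # CONCAT of two empty name parts
print(display_contact("Janne Vinter"))
print(display_date(None))
print(display_date(date(2026, 3, 3)))
```

Without the `.strip()` guard, the single-space value would be truthy and the cell would render as visually empty instead of the explicit dash.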
@@ -10,7 +10,7 @@ from fastapi.templating import Jinja2Templates
 from typing import Optional, Dict, Any
 from datetime import date
 
-from app.core.database import execute_query, execute_update, execute_query_single
+from app.core.database import execute_query, execute_update, execute_query_single, table_has_column
 
 logger = logging.getLogger(__name__)
 
@@ -18,6 +18,20 @@ router = APIRouter()
 templates = Jinja2Templates(directory="app")
 
 
+def _case_start_date_sql(alias: str = "s") -> str:
+    """Select start_date only when the live schema actually has it."""
+    if table_has_column("sag_sager", "start_date"):
+        return f"{alias}.start_date"
+    return "NULL::date AS start_date"
+
+
+def _case_type_sql(alias: str = "s") -> str:
+    """Select case type across old/new sag schemas."""
+    if table_has_column("sag_sager", "type"):
+        return f"COALESCE({alias}.template_key, {alias}.type, 'ticket') AS case_type"
+    return f"COALESCE({alias}.template_key, 'ticket') AS case_type"
+
+
 @router.get("/", include_in_schema=False)
 async def ticket_root_redirect():
     return RedirectResponse(url="/sag", status_code=302)
@@ -362,6 +376,8 @@ async def new_ticket_page(request: Request):
 
 def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     """Collect live data slices for technician-focused dashboard variants."""
+    case_start_date_sql = _case_start_date_sql()
+    case_type_sql = _case_type_sql()
     user_query = """
         SELECT user_id, COALESCE(full_name, username, CONCAT('Bruger #', user_id::text)) AS display_name
         FROM users
@@ -371,16 +387,34 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     user_result = execute_query(user_query, (technician_user_id,))
     technician_name = user_result[0]["display_name"] if user_result else f"Bruger #{technician_user_id}"
 
-    new_cases_query = """
+    new_cases_query = f"""
         SELECT
             s.id,
             s.titel,
+            s.beskrivelse,
+            s.priority,
             s.status,
             s.created_at,
+            {case_start_date_sql},
+            s.deferred_until,
             s.deadline,
-            COALESCE(c.name, 'Ukendt kunde') AS customer_name
+            {case_type_sql},
+            COALESCE(c.name, 'Ukendt kunde') AS customer_name,
+            CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
+            COALESCE(u.full_name, u.username) AS ansvarlig_navn,
+            g.name AS assigned_group_name
         FROM sag_sager s
         LEFT JOIN customers c ON c.id = s.customer_id
+        LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
+        LEFT JOIN groups g ON g.id = s.assigned_group_id
+        LEFT JOIN LATERAL (
+            SELECT cc.contact_id
+            FROM contact_companies cc
+            WHERE cc.customer_id = c.id
+            ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
+            LIMIT 1
+        ) cc_first ON true
+        LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
        WHERE s.deleted_at IS NULL
           AND s.status = 'åben'
        ORDER BY s.created_at DESC
@@ -388,16 +422,34 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     """
     new_cases = execute_query(new_cases_query)
 
-    my_cases_query = """
+    my_cases_query = f"""
         SELECT
             s.id,
             s.titel,
+            s.beskrivelse,
+            s.priority,
             s.status,
             s.created_at,
+            {case_start_date_sql},
+            s.deferred_until,
             s.deadline,
-            COALESCE(c.name, 'Ukendt kunde') AS customer_name
+            {case_type_sql},
+            COALESCE(c.name, 'Ukendt kunde') AS customer_name,
+            CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
+            COALESCE(u.full_name, u.username) AS ansvarlig_navn,
+            g.name AS assigned_group_name
         FROM sag_sager s
         LEFT JOIN customers c ON c.id = s.customer_id
+        LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
+        LEFT JOIN groups g ON g.id = s.assigned_group_id
+        LEFT JOIN LATERAL (
+            SELECT cc.contact_id
+            FROM contact_companies cc
+            WHERE cc.customer_id = c.id
+            ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
+            LIMIT 1
+        ) cc_first ON true
+        LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
        WHERE s.deleted_at IS NULL
           AND s.ansvarlig_bruger_id = %s
           AND s.status <> 'lukket'
@@ -406,19 +458,36 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     """
     my_cases = execute_query(my_cases_query, (technician_user_id,))
 
-    today_tasks_query = """
+    today_tasks_query = f"""
         SELECT
             'case' AS item_type,
             s.id AS item_id,
             s.titel AS title,
+            s.beskrivelse,
             s.status,
             s.deadline AS due_at,
             s.created_at,
+            {case_start_date_sql},
+            s.deferred_until,
             COALESCE(c.name, 'Ukendt kunde') AS customer_name,
-            NULL::text AS priority,
+            {case_type_sql},
+            CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
+            COALESCE(u.full_name, u.username) AS ansvarlig_navn,
+            g.name AS assigned_group_name,
+            COALESCE(s.priority::text, 'normal') AS priority,
             'Sag deadline i dag' AS task_reason
        FROM sag_sager s
        LEFT JOIN customers c ON c.id = s.customer_id
+        LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
+        LEFT JOIN groups g ON g.id = s.assigned_group_id
+        LEFT JOIN LATERAL (
+            SELECT cc.contact_id
+            FROM contact_companies cc
+            WHERE cc.customer_id = c.id
+            ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
+            LIMIT 1
+        ) cc_first ON true
+        LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
        WHERE s.deleted_at IS NULL
           AND s.ansvarlig_bruger_id = %s
           AND s.status <> 'lukket'
@@ -430,14 +499,22 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
             'ticket' AS item_type,
             t.id AS item_id,
             t.subject AS title,
+            NULL::text AS beskrivelse,
             t.status,
             NULL::date AS due_at,
             t.created_at,
+            NULL::date AS start_date,
+            NULL::date AS deferred_until,
             COALESCE(c.name, 'Ukendt kunde') AS customer_name,
+            'ticket' AS case_type,
+            NULL::text AS kontakt_navn,
+            COALESCE(uu.full_name, uu.username) AS ansvarlig_navn,
+            NULL::text AS assigned_group_name,
             COALESCE(t.priority, 'normal') AS priority,
             'Ticket oprettet i dag' AS task_reason
        FROM tticket_tickets t
        LEFT JOIN customers c ON c.id = t.customer_id
+        LEFT JOIN users uu ON uu.user_id = t.assigned_to_user_id
        WHERE t.assigned_to_user_id = %s
           AND t.status IN ('open', 'in_progress', 'pending_customer')
           AND DATE(t.created_at) = CURRENT_DATE
@@ -447,19 +524,36 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     """
     today_tasks = execute_query(today_tasks_query, (technician_user_id, technician_user_id))
 
-    urgent_overdue_query = """
+    urgent_overdue_query = f"""
         SELECT
             'case' AS item_type,
             s.id AS item_id,
             s.titel AS title,
+            s.beskrivelse,
             s.status,
             s.deadline AS due_at,
             s.created_at,
+            {case_start_date_sql},
+            s.deferred_until,
             COALESCE(c.name, 'Ukendt kunde') AS customer_name,
+            {case_type_sql},
+            CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
+            COALESCE(u.full_name, u.username) AS ansvarlig_navn,
+            g.name AS assigned_group_name,
             NULL::text AS priority,
             'Over deadline' AS attention_reason
        FROM sag_sager s
        LEFT JOIN customers c ON c.id = s.customer_id
+        LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
+        LEFT JOIN groups g ON g.id = s.assigned_group_id
+        LEFT JOIN LATERAL (
+            SELECT cc.contact_id
+            FROM contact_companies cc
+            WHERE cc.customer_id = c.id
+            ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
+            LIMIT 1
+        ) cc_first ON true
+        LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
        WHERE s.deleted_at IS NULL
           AND s.status <> 'lukket'
           AND s.deadline IS NOT NULL
@@ -471,10 +565,17 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
             'ticket' AS item_type,
             t.id AS item_id,
             t.subject AS title,
+            NULL::text AS beskrivelse,
             t.status,
             NULL::date AS due_at,
             t.created_at,
+            NULL::date AS start_date,
+            NULL::date AS deferred_until,
             COALESCE(c.name, 'Ukendt kunde') AS customer_name,
+            'ticket' AS case_type,
+            NULL::text AS kontakt_navn,
+            COALESCE(uu.full_name, uu.username) AS ansvarlig_navn,
+            NULL::text AS assigned_group_name,
             COALESCE(t.priority, 'normal') AS priority,
             CASE
                 WHEN t.priority = 'urgent' THEN 'Urgent prioritet'
@@ -482,6 +583,7 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
             END AS attention_reason
        FROM tticket_tickets t
        LEFT JOIN customers c ON c.id = t.customer_id
+        LEFT JOIN users uu ON uu.user_id = t.assigned_to_user_id
        WHERE t.status IN ('open', 'in_progress', 'pending_customer')
           AND COALESCE(t.priority, '') IN ('urgent', 'high')
           AND (t.assigned_to_user_id = %s OR t.assigned_to_user_id IS NULL)
@@ -542,19 +644,36 @@ def _get_technician_dashboard_data(technician_user_id: int) -> Dict[str, Any]:
     # Get group cases (cases assigned to user's groups)
     group_cases = []
     if user_group_ids:
-        group_cases_query = """
+        group_cases_query = f"""
            SELECT
                s.id,
                s.titel,
+                s.beskrivelse,
+                s.priority,
                s.status,
                s.created_at,
+                {case_start_date_sql},
+                s.deferred_until,
                s.deadline,
+                {case_type_sql},
                s.assigned_group_id,
                g.name AS group_name,
-                COALESCE(c.name, 'Ukendt kunde') AS customer_name
+                COALESCE(c.name, 'Ukendt kunde') AS customer_name,
+                CONCAT(COALESCE(cont.first_name, ''), ' ', COALESCE(cont.last_name, '')) AS kontakt_navn,
+                COALESCE(u.full_name, u.username) AS ansvarlig_navn,
+                g.name AS assigned_group_name
            FROM sag_sager s
            LEFT JOIN customers c ON c.id = s.customer_id
            LEFT JOIN groups g ON g.id = s.assigned_group_id
+            LEFT JOIN users u ON u.user_id = s.ansvarlig_bruger_id
+            LEFT JOIN LATERAL (
+                SELECT cc.contact_id
+                FROM contact_companies cc
+                WHERE cc.customer_id = c.id
+                ORDER BY cc.is_primary DESC NULLS LAST, cc.id ASC
+                LIMIT 1
+            ) cc_first ON true
+            LEFT JOIN contacts cont ON cont.id = cc_first.contact_id
            WHERE s.deleted_at IS NULL
              AND s.assigned_group_id = ANY(%s)
              AND s.status <> 'lukket'
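The `_case_start_date_sql` / `_case_type_sql` helpers above let one query serve both the old and new `sag_sager` schemas: they probe the live catalog via `table_has_column` and splice either the real column or a typed `NULL` placeholder into the f-string SQL. A standalone sketch of the pattern with `table_has_column` stubbed out (the real implementation lives in `app.core.database` and is not shown in this diff):

```python
# Stub standing in for app.core.database.table_has_column, which would
# normally consult the live database's information_schema.columns.
KNOWN_COLUMNS = {"sag_sager": {"id", "titel", "status", "deadline"}}  # no start_date yet


def table_has_column(table: str, column: str) -> bool:
    return column in KNOWN_COLUMNS.get(table, set())


def case_start_date_sql(alias: str = "s") -> str:
    """Select start_date only when the live schema actually has it."""
    if table_has_column("sag_sager", "start_date"):
        return f"{alias}.start_date"
    # A typed NULL keeps the column list stable for UNIONs and row mappers.
    return "NULL::date AS start_date"


query = f"SELECT s.id, {case_start_date_sql()} FROM sag_sager s"
print(query)
```

Because the fragment always yields a column named `start_date`, the template can render `item.start_date` unconditionally, and the `UNION` with the ticket branch (which selects `NULL::date AS start_date`) stays column-compatible.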
apply_layout.py (new file, 12 lines)
@@ -0,0 +1,12 @@
+import re
+
+with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
+    content = f.read()
+
+# 1. Remove max-width
+content = content.replace('<div class="container-fluid" style="margin-top: 2rem; margin-bottom: 2rem; max-width: 1400px;">', '<div class="container-fluid" style="margin-top: 2rem; margin-bottom: 2rem;">')
+
+# Find the parts we want to reuse (this requires precise regex or DOM parsing)
+# For this task we use a more general structural update by locating specific markers.
+# Here I assume the script is a template draft
+print("Script executed.")
backups/email_feature/README.md (new file, 37 lines)
@@ -0,0 +1,37 @@
+# Email Feature Backup
+
+Backup artifact for the current email handling implementation.
+
+## Artifact
+
+- `email_feature_backup_20260317_214413.zip`
+
+## Contents
+
+- `app/emails/`
+- `app/services/email_service.py`
+- `app/services/email_processor_service.py`
+- `app/services/email_analysis_service.py`
+- `app/services/email_workflow_service.py`
+- `app/services/email_activity_logger.py`
+- `app/modules/sag/templates/detail.html`
+- `migrations/013_email_system.sql`
+- `migrations/014_email_workflows.sql`
+- `migrations/050_email_activity_log.sql`
+- `migrations/056_email_import_method.sql`
+- `migrations/084_sag_files_and_emails.sql`
+- `migrations/140_email_extracted_vendor_fields.sql`
+- `migrations/141_email_threading_headers.sql`
+
+## Restore (code only)
+
+From repository root:
+
+```bash
+unzip -o backups/email_feature/email_feature_backup_20260317_214413.zip -d .
+```
+
+## Notes
+
+- This restore only replaces code files included in the artifact.
+- Database rollback must be handled separately if schema/data has changed.
backups/email_feature/email_feature_backup_20260317_214413.zip (new binary file)
Binary file not shown.
@@ -4,7 +4,7 @@ services:
   # PostgreSQL Database - Production
   postgres:
     image: postgres:16-alpine
-    container_name: bmc-hub-postgres-prod
+    container_name: bmc-hub-postgres-${STACK_NAME:-prod}
     environment:
       POSTGRES_USER: ${POSTGRES_USER}
       POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
@@ -20,7 +20,7 @@ services:
       - "${POSTGRES_BIND_ADDR:-127.0.0.1}:${POSTGRES_PORT:-5432}:5432"
     restart: always
     healthcheck:
-      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
+      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
       interval: 10s
       timeout: 5s
       retries: 5
@@ -36,11 +36,9 @@ services:
       RELEASE_VERSION: ${RELEASE_VERSION:-latest}
       GITHUB_TOKEN: ${GITHUB_TOKEN}
       GITHUB_REPO: ${GITHUB_REPO:-ct/bmc_hub}
-    image: localhost/bmc-hub:${RELEASE_VERSION:-latest}
-    container_name: bmc-hub-api-prod
+    container_name: bmc-hub-api-${STACK_NAME:-prod}
     depends_on:
-      postgres:
-        condition: service_healthy
+      - postgres
     ports:
       - "${API_PORT:-8000}:8000"
     volumes:
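The compose change parameterises container names as `bmc-hub-postgres-${STACK_NAME:-prod}`, so several stacks (say prod and staging) can coexist on one host without container-name collisions, while leaving the old names intact when `STACK_NAME` is unset. Compose's `${VAR:-default}` form also falls back when the variable is set but empty. A rough Python emulation of that interpolation (simplified, `:-` form only, not Compose's full grammar):

```python
import re


def resolve(value: str, env: dict) -> str:
    """Emulate docker-compose's ${VAR:-default} interpolation (':-' form only)."""
    def sub(match):
        name, default = match.group(1), match.group(2) or ""
        # ':-' falls back when the variable is unset OR empty.
        return env.get(name) or default
    return re.sub(r"\$\{(\w+)(?::-([^}]*))?\}", sub, value)


print(resolve("bmc-hub-postgres-${STACK_NAME:-prod}", {}))
print(resolve("bmc-hub-api-${STACK_NAME:-prod}", {"STACK_NAME": "staging"}))
```

With no environment set this resolves to the legacy `bmc-hub-postgres-prod`, which is what keeps the change backward compatible for the existing production stack.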
@@ -16,7 +16,7 @@ services:
       - "${POSTGRES_PORT:-5433}:5432"
     restart: unless-stopped
     healthcheck:
-      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-bmc_hub}"]
+      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-bmc_hub} -d ${POSTGRES_DB:-bmc_hub}"]
       interval: 10s
       timeout: 5s
       retries: 5
docs/sagsvisning_mockups.html (new file, 299 lines)
@@ -0,0 +1,299 @@
+<!DOCTYPE html>
+<html lang="da">
+<head>
+    <meta charset="UTF-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+    <title>Mockups - Sagsvisning</title>
+    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
+    <link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.4.0/css/all.min.css">
+    <style>
+        :root {
+            --primary-color: #0f4c75; /* Nordic Top Deep Blue */
+            --bg-body: #f4f6f8;
+            --card-border: #e2e8f0;
+        }
+        body { background-color: var(--bg-body); font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif; }
+        .top-nav { background-color: #fff; padding: 15px; border-bottom: 2px solid var(--primary-color); margin-bottom: 20px; }
+        .mockup-container { display: none; }
+        .mockup-container.active { display: block; }
+        .card { border: 1px solid var(--card-border); box-shadow: 0 1px 3px rgba(0,0,0,0.05); margin-bottom: 1rem; border-radius: 8px; }
+        .card-header { background-color: #fff; border-bottom: 1px solid var(--card-border); font-weight: 600; color: var(--primary-color); }
+        .section-title { font-size: 0.85rem; text-transform: uppercase; color: #6c757d; font-weight: 700; margin-bottom: 10px; letter-spacing: 0.5px; }
+        .btn-primary { background-color: var(--primary-color); border-color: var(--primary-color); }
+        .badge-status { background-color: var(--primary-color); color: white; }
+        .timeline-item { border-left: 2px solid var(--card-border); padding-left: 15px; margin-bottom: 15px; position: relative; }
+        .timeline-item::before { content: ''; position: absolute; left: -6px; top: 0; width: 10px; height: 10px; border-radius: 50%; background: var(--primary-color); }
+    </style>
+</head>
+<body>
+
+<div class="top-nav text-center">
+    <h4 class="mb-3" style="color: var(--primary-color);">Vælg Layout Mockup</h4>
+    <div class="btn-group" role="group">
+        <button type="button" class="btn btn-outline-primary active" onclick="showMockup('mockup1', this)">Forslag 1: Arbejdsstationen (3 Kolonner)</button>
+        <button type="button" class="btn btn-outline-primary" onclick="showMockup('mockup2', this)">Forslag 2: Tidslinjen (Inbox Flow)</button>
+        <button type="button" class="btn btn-outline-primary" onclick="showMockup('mockup3', this)">Forslag 3: Det Fokuserede Workspace (Store Faner)</button>
+    </div>
+</div>
+
+<div class="container-fluid px-4">
+
+    <!-- PROPOSAL 1: THREE COLUMNS -->
+    <div id="mockup1" class="mockup-container active">
+        <h5 class="text-muted"><i class="fas fa-columns"></i> Forslag 1: Arbejdsstationen (Kontekst -> Arbejde -> Styring)</h5>
+        <hr>
+
+        <!-- Header status -->
+        <div class="card mb-3">
+            <div class="card-body py-2 d-flex justify-content-between align-items-center flex-wrap">
+                <div><strong>ID: 1</strong> <span class="badge badge-status">åben</span> | <strong>Kunde:</strong> Blåhund Import (TEST) | <strong>Kontakt:</strong> Janne Vinter</div>
+                <div><strong>Datoer:</strong> Opr: 01/03-26 | <strong>Deadline:</strong> <span class="text-danger border border-danger p-1 rounded"><i class="far fa-clock"></i> 03/03-26</span></div>
+            </div>
+        </div>
+
+        <div class="row">
+            <!-- Col 1: Context (left) -->
+            <div class="col-md-3">
+                <div class="section-title">Kontekst & Stamdata</div>
+                <div class="card"><div class="card-header"><i class="fas fa-building"></i> Kunder</div><div class="card-body py-2"><small>Blåhund Import (TEST)</small></div></div>
+                <div class="card"><div class="card-header"><i class="fas fa-users"></i> Kontakter</div><div class="card-body py-2"><small>Janne Vinter</small></div></div>
+                <div class="card"><div class="card-header"><i class="fas fa-laptop"></i> Hardware</div><div class="card-body py-2"><span class="text-muted small">Ingen valgt</span></div></div>
+                <div class="card"><div class="card-header"><i class="fas fa-map-marker-alt"></i> Lokationer</div><div class="card-body py-2"><span class="text-muted small">Ingen valgt</span></div></div>
+            </div>
+
+            <!-- Col 2: Work (middle) -->
+            <div class="col-md-6">
+                <div class="section-title">Arbejdsflade</div>
+
+                <!-- Description always visible -->
+                <div class="card border-primary mb-3">
+                    <div class="card-body">
+                        <h5 class="card-title">dette er en test sag</h5>
+                        <p class="card-text text-muted mb-0">Ingen beskrivelse tilføjet.</p>
+                    </div>
+                </div>
+
+                <!-- Tabs handle the rest -->
+                <ul class="nav nav-tabs mb-3">
+                    <li class="nav-item"><a class="nav-link active" href="#">Sagsdetaljer</a></li>
+                    <li class="nav-item"><a class="nav-link" href="#"><i class="fas fa-wrench"></i> Løsning</a></li>
+                    <li class="nav-item"><a class="nav-link" href="#"><i class="fas fa-envelope"></i> E-mail</a></li>
+                    <li class="nav-item"><a class="nav-link" href="#"><i class="fas fa-shopping-basket"></i> Varekøb & Salg</a></li>
+                </ul>
+
+                <div class="card">
+                    <div class="card-body">
+                        <h6><i class="fas fa-link"></i> Relationer</h6>
+                        <div class="border rounded p-2 mb-3 bg-light"><small>#2 Undersag 1 -> Afledt af #1 dette er en test sag</small></div>
+
+                        <h6><i class="fas fa-phone"></i> Opkaldshistorik</h6>
+                        <div class="border rounded p-2 mb-3 text-center text-muted"><small>Ingen opkald registreret</small></div>
+
+                        <h6><i class="fas fa-paperclip"></i> Filer & Dokumenter</h6>
+                        <div class="border rounded p-3 text-center bg-light border-dashed"><small><i class="fas fa-cloud-upload-alt fs-4 d-block mb-1"></i> Træk filer hertil for at uploade</small></div>
+                    </div>
+                </div>
+            </div>
+
+            <!-- Col 3: Management (right) -->
+            <div class="col-md-3">
+                <div class="section-title">Sagstyring</div>
+
+                <div class="card">
+                    <div class="card-header">Ansvar & Tildeling</div>
+                    <div class="card-body">
+                        <label class="form-label small">Ansvarlig medarbejder</label>
+                        <select class="form-select form-select-sm mb-2"><option>Ingen</option></select>
+                        <label class="form-label small">Ansvarlig gruppe</label>
+                        <select class="form-select form-select-sm mb-3"><option>Technicians</option></select>
+                        <button class="btn btn-primary btn-sm w-100">Gem Tildeling</button>
+                    </div>
+                </div>
+
+                <div class="card">
+                    <div class="card-header"><i class="fas fa-check-square text-success"></i> Todo-opgaver</div>
+                    <div class="card-body text-center py-4 text-muted"><small>Ingen opgaver endnu</small><br><button class="btn btn-outline-secondary btn-sm mt-2"><i class="fas fa-plus"></i> Opret</button></div>
+                </div>
+            </div>
+        </div>
+    </div>
+
+
+    <!-- PROPOSAL 2: THE TIMELINE -->
+    <div id="mockup2" class="mockup-container">
+        <h5 class="text-muted"><i class="fas fa-stream"></i> Forslag 2: Tidslinjen (Fokus på flow og kommunikation)</h5>
+        <hr>
+
+        <!-- Sticky compact header -->
+        <div class="card shadow-sm border-0 mb-4 sticky-top" style="z-index: 1000; top: 0;">
+            <div class="card-body py-2 d-flex justify-content-between align-items-center fs-6 bg-white">
+                <div>
+                    <span class="badge badge-status me-2">ID: 1</span>
+                    <strong>Blåhund Import</strong> <span class="text-muted">/ Janne Vinter</span>
+                </div>
+                <div class="d-flex align-items-center gap-3">
+                    <select class="form-select form-select-sm" style="width: auto;"><option>Ingen (Technicians)</option></select>
+                    <span class="badge bg-danger">Frist: 03/03-26</span>
+                </div>
+            </div>
+        </div>
+
+        <div class="row">
+            <!-- Main feed (left) -->
+            <div class="col-md-8">
+
+                <!-- Description - hero box -->
+                <div class="p-4 rounded mb-4" style="background-color: #e3f2fd; border-left: 4px solid var(--primary-color);">
+                    <h4 class="mb-1">dette er en test sag</h4>
+                    <p class="mb-0 text-muted">Ingen beskrivelse angivet.</p>
+                </div>
+
+                <!-- Action modules - inline tabs for inputs -->
+                <div class="card mb-4 bg-light">
+                    <div class="card-body py-2">
+                        <button class="btn btn-sm btn-outline-primary"><i class="fas fa-comment"></i> Nyt Svar/Notat</button>
+                        <button class="btn btn-sm btn-outline-secondary"><i class="fas fa-wrench"></i> Registrer Løsning/Tid</button>
+                        <button class="btn btn-sm btn-outline-secondary"><i class="fas fa-shopping-basket"></i> Tilføj Vare</button>
+                        <button class="btn btn-sm btn-outline-secondary"><i class="fas fa-paperclip"></i> Vedhæft fil</button>
+                    </div>
+                </div>
+
+                <!-- Timeline / log -->
+                <h6 class="text-muted"><i class="fas fa-history"></i> Aktivitet & Historik</h6>
+                <div class="bg-white p-3 rounded border">
+                    <div class="timeline-item">
+                        <div class="small fw-bold">System <span class="text-muted fw-normal float-end">01/03/2026 14:00</span></div>
+                        <div>Sagen blev oprettet.</div>
+                        <div class="mt-2 p-2 bg-light border rounded"><small>Relation: #2 Undersag 1 tilknyttet.</small></div>
+                    </div>
+                    <div class="text-center text-muted small mt-4"><i class="fas fa-check"></i> Slut på historik</div>
+                </div>
+
+            </div>
+
+            <!-- Sidebar (right) -->
+            <div class="col-md-4">
+                <div class="card mb-3">
|
||||||
|
<div class="card-header">Sagsfakta & Stamdata</div>
|
||||||
|
<div class="accordion accordion-flush" id="accordionFakta">
|
||||||
|
<div class="accordion-item">
|
||||||
|
<h2 class="accordion-header"><button class="accordion-button collapsed py-2" type="button" data-bs-toggle="collapse" data-bs-target="#fakta1"><i class="fas fa-building me-2"></i> Kunde & Kontakt</button></h2>
|
||||||
|
<div id="fakta1" class="accordion-collapse collapse"><div class="accordion-body small">Blåhund Import (TEST)<br>Janne Vinter</div></div>
|
||||||
|
</div>
|
||||||
|
<div class="accordion-item">
|
||||||
|
<h2 class="accordion-header"><button class="accordion-button collapsed py-2" type="button" data-bs-toggle="collapse" data-bs-target="#fakta2"><i class="fas fa-laptop me-2"></i> Hardware & Lokation</button></h2>
|
||||||
|
<div id="fakta2" class="accordion-collapse collapse"><div class="accordion-body small text-muted">Intet valgt</div></div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="card">
|
||||||
|
<div class="card-header"><i class="fas fa-check-square"></i> Todo-opgaver & Wiki</div>
|
||||||
|
<div class="card-body">
|
||||||
|
<input type="text" class="form-control form-control-sm mb-2" placeholder="Søg i Wiki...">
|
||||||
|
<hr>
|
||||||
|
<div class="text-center text-muted small"><small>Ingen Todo-opgaver</small></div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
|
||||||
|
<!-- FORSLAG 3: DET FOKUSEREDE WORKSPACE -->
|
||||||
|
<div id="mockup3" class="mockup-container">
|
||||||
|
<h5 class="text-muted"><i class="fas fa-window-maximize"></i> Forslag 3: Fokuseret Workspace (Store kategoriserede faner)</h5>
|
||||||
|
<hr>
|
||||||
|
|
||||||
|
<div class="row">
|
||||||
|
<!-- Sidebar venstre (Lille) -->
|
||||||
|
<div class="col-md-2 border-end" style="min-height: 70vh;">
|
||||||
|
<div class="mb-4">
|
||||||
|
<div class="small fw-bold text-muted mb-2">Sags Info</div>
|
||||||
|
<div class="fs-5 text-primary fw-bold">#1 åben</div>
|
||||||
|
<div class="small mt-1 text-danger"><i class="far fa-clock"></i> 03/03-26</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="mb-4">
|
||||||
|
<div class="small fw-bold text-muted mb-2">Tildeling</div>
|
||||||
|
<select class="form-select form-select-sm mb-1"><option>Ingen</option></select>
|
||||||
|
<select class="form-select form-select-sm"><option>Technicians</option></select>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="mb-4">
|
||||||
|
<div class="small fw-bold text-muted mb-2">Hurtige links</div>
|
||||||
|
<ul class="nav flex-column small">
|
||||||
|
<li class="nav-item"><a class="nav-link px-0 text-dark" href="#"><i class="fas fa-link me-1"></i> Relationer (1)</a></li>
|
||||||
|
<li class="nav-item"><a class="nav-link px-0 text-dark" href="#"><i class="fas fa-check-square me-1 text-success"></i> Todo (0)</a></li>
|
||||||
|
<li class="nav-item"><a class="nav-link px-0 text-dark" href="#"><i class="fas fa-book me-1"></i> Wiki søgning</a></li>
|
||||||
|
</ul>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Hovedarbejdsflade -->
|
||||||
|
<div class="col-md-10">
|
||||||
|
<div class="d-flex justify-content-between align-items-center mb-3">
|
||||||
|
<h3 class="m-0">dette er en test sag</h3>
|
||||||
|
<button class="btn btn-outline-primary btn-sm"><i class="fas fa-edit"></i> Rediger</button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- STORE arbejdsfaner -->
|
||||||
|
<ul class="nav nav-pills nav-fill mb-4 border rounded bg-white shadow-sm p-1">
|
||||||
|
<li class="nav-item">
|
||||||
|
<a class="nav-link active fw-bold" href="#"><i class="fas fa-eye"></i> 1. Overblik & Stamdata</a>
|
||||||
|
</li>
|
||||||
|
<li class="nav-item">
|
||||||
|
<a class="nav-link text-dark fw-bold" href="#"><i class="fas fa-wrench"></i> 2. Løsning & Salg</a>
|
||||||
|
</li>
|
||||||
|
<li class="nav-item">
|
||||||
|
<a class="nav-link text-dark fw-bold" href="#"><i class="fas fa-comments"></i> 3. Kommunikation (Mail/Log)</a>
|
||||||
|
</li>
|
||||||
|
</ul>
|
||||||
|
|
||||||
|
<!-- Indhold for aktiv fane (Overblik) -->
|
||||||
|
<div class="card border-0 shadow-sm">
|
||||||
|
<div class="card-body p-4">
|
||||||
|
<h5 class="text-primary border-bottom pb-2 mb-3">Beskrivelse</h5>
|
||||||
|
<p class="text-muted">Ingen beskrivelse tilføjet for denne sag. Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p>
|
||||||
|
|
||||||
|
<div class="row mt-5">
|
||||||
|
<div class="col-md-6">
|
||||||
|
<h6 class="text-muted text-uppercase small fw-bold">Personer & Steder</h6>
|
||||||
|
<table class="table table-sm table-borderless">
|
||||||
|
<tr><td class="text-muted w-25">Kunde</td><td><strong>Blåhund Import (TEST)</strong></td></tr>
|
||||||
|
<tr><td class="text-muted">Kontakt</td><td>Janne Vinter</td></tr>
|
||||||
|
<tr><td class="text-muted">Lokation</td><td>-</td></tr>
|
||||||
|
</table>
|
||||||
|
</div>
|
||||||
|
<div class="col-md-6">
|
||||||
|
<h6 class="text-muted text-uppercase small fw-bold">Udstyr</h6>
|
||||||
|
<table class="table table-sm table-borderless">
|
||||||
|
<tr><td class="text-muted w-25">Hardware</td><td>-</td></tr>
|
||||||
|
</table>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<script>
|
||||||
|
function showMockup(id, btnClicked) {
|
||||||
|
// Skjul alle
|
||||||
|
document.querySelectorAll('.mockup-container').forEach(el => el.classList.remove('active'));
|
||||||
|
// Fjern active state fra knapper
|
||||||
|
document.querySelectorAll('.btn-group .btn').forEach(btn => btn.classList.remove('active'));
|
||||||
|
|
||||||
|
// Vis valgte
|
||||||
|
document.getElementById(id).classList.add('active');
|
||||||
|
btnClicked.classList.add('active');
|
||||||
|
}
|
||||||
|
</script>
|
||||||
|
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
|
||||||
|
</body>
|
||||||
|
</html>
|
||||||
141  fix_cols.py  Normal file
@@ -0,0 +1,141 @@
import re

def fix_columns():
    with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
        html = f.read()

    # Replace the start container itself!
    # The goal is to transform:
    # <div class="col-lg-8" id="case-left-column"> ... (Hero card, etc.)
    # <div class="col-lg-4" id="case-right-column"> ... (Tidsreg, etc.)

    # Because it is complex to extract every single data-module from a large file without losing the layout,
    # we could approach it by changing the CSS classes at container level IF we only wanted flex,
    # but for a real 3-column layout we move the grid definitions of `case-left-column`.

    # We can build 3 columns inside "case-left-column"; "case-right-column" is the 3rd column.
    # So left -> 2 columns, right -> 1 column. 3 in total.
    # Currently left = col-lg-8. We turn it into col-xl-9 col-lg-8.
    # Right = col-lg-4. Becomes col-xl-3 col-lg-4.

    # INSIDE left:
    # Put a grid: <div class="row"><div class="col-xl-4"> (left) </div> <div class="col-xl-8"> (center with the task description) </div></div>

    # Step 1: Let's find "id="case-left-column""
    html = html.replace('<div class="col-lg-8" id="case-left-column">', '<div class="col-xl-9 col-lg-8" id="case-left-column">\n<div class="row g-4">\n<!-- TREDELT-1: Relations, History, etc. -->\n<div class="col-xl-4 order-2 order-xl-1" id="inner-left-col">\n</div>\n<!-- TREDELT-2: Hero, Info -->\n<div class="col-xl-8 order-1 order-xl-2" id="inner-center-col">\n')

    html = html.replace('<div class="col-lg-4" id="case-right-column">', '</div></div><!-- slut inner cols -->\n</div>\n<div class="col-xl-3 col-lg-4" id="case-right-column">')

    # Now we need to MOVE widgets from "inner-center-col" (where everything currently is) to "inner-left-col".
    # The widgets we want to move are:
    # 'relations'
    # 'call-history'
    # 'pipeline'

    def move_widget(widget_name, dest_id, current_html):
        pattern = f'data-module="{widget_name}"'
        match = current_html.find(pattern)
        if match == -1:
            return current_html

        div_start = current_html.rfind('<div class="row mb-3"', max(0, match - 200), match)
        if div_start == -1:
            div_start = current_html.rfind('<div class="card', max(0, match - 200), match)

        if div_start == -1:
            return current_html

        # Find balanced end
        count = 0
        i = div_start
        end_idx = -1
        while i < len(current_html):
            if current_html.startswith('<div', i):
                count += 1
                i += 4
            elif current_html.startswith('</div', i):
                count -= 1
                if count <= 0:
                    i = current_html.find('>', i) + 1
                    end_idx = i
                    break
                else:
                    i += 5
            else:
                i += 1

        if end_idx != -1:
            widget = current_html[div_start:end_idx]
            # Remove it from its original place
            current_html = current_html[:div_start] + current_html[end_idx:]

            # Insert it at the new place (right after the dest_id div)
            dest_pattern = f'id="{dest_id}">\n'
            dest_pos = current_html.find(dest_pattern)
            if dest_pos != -1:
                insert_pos = dest_pos + len(dest_pattern)
                current_html = current_html[:insert_pos] + widget + "\n" + current_html[insert_pos:]

        return current_html

    html = move_widget('relations', 'inner-left-col', html)
    html = move_widget('call-history', 'inner-left-col', html)
    html = move_widget('pipeline', 'inner-left-col', html)

    # Some widgets live in right-col that we now want in the left col:
    # Contacts, Customers, Locations
    # They are not inside a <div class="row mb-3">; they are just directly `<div class="card h-100...`
    # Let's extract them correctly
    def move_card(widget_name, dest_id, current_html):
        pattern = f'data-module="{widget_name}"'
        match = current_html.find(pattern)
        if match == -1:
            return current_html

        div_start = current_html.rfind('<div class="card', max(0, match - 200), match)
        if div_start == -1:
            return current_html

        count = 0
        i = div_start
        end_idx = -1
        while i < len(current_html):
            if current_html.startswith('<div', i):
                count += 1
                i += 4
            elif current_html.startswith('</div', i):
                count -= 1
                if count <= 0:
                    i = current_html.find('>', i) + 1
                    end_idx = i
                    break
                else:
                    i += 5
            else:
                i += 1

        if end_idx != -1:
            widget = current_html[div_start:end_idx]
            # They are often wrapped in an mb-3 class in col-right. If not, we add an mb-3 wrapper
            widget = f'<div class="mb-3">{widget}</div>'
            current_html = current_html[:div_start] + current_html[end_idx:]

            dest_pattern = f'id="{dest_id}">\n'
            dest_pos = current_html.find(dest_pattern)
            if dest_pos != -1:
                insert_pos = dest_pos + len(dest_pattern)
                current_html = current_html[:insert_pos] + widget + "\n" + current_html[insert_pos:]

        return current_html

    html = move_card('contacts', 'inner-left-col', html)
    html = move_card('customers', 'inner-left-col', html)
    html = move_card('locations', 'inner-left-col', html)

    with open('app/modules/sag/templates/detail.html', 'w', encoding='utf-8') as f:
        f.write(html)

    print("Drejede kolonnerne på plads!")

if __name__ == '__main__':
    fix_columns()
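The balanced-`</div>` scan that `move_widget` and `move_card` share above can be read in isolation. A minimal standalone sketch of that technique (the helper name `find_balanced_div_end` is mine, not from the repo; like the original, it is a naive scan that would miscount `<div` occurrences inside comments or attribute values):

```python
def find_balanced_div_end(html: str, start: int) -> int:
    """Return the index just past the </div> that closes the <div> at `start`, or -1."""
    depth = 0
    i = start
    while i < len(html):
        if html.startswith('<div', i):
            depth += 1
            i += 4
        elif html.startswith('</div', i):
            depth -= 1
            if depth <= 0:
                # Step past the closing tag's '>'
                return html.find('>', i) + 1
            i += 5
        else:
            i += 1
    return -1

snippet = '<div class="card"><div>inner</div></div><p>after</p>'
end = find_balanced_div_end(snippet, 0)
print(snippet[:end])  # the whole outer card div, nested div included
print(snippet[end:])  # what remains after the extracted widget
```

This is the same "count opens, count closes, stop at depth 0" walk the patch scripts use to lift a whole widget card out of the template without breaking nesting.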
63  fix_desc2.py  Normal file
@@ -0,0 +1,63 @@
def replace_desc():
    with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
        html = f.read()

    start_str = "<!-- ROW 1: Main Info -->"
    end_str = "<!-- ROW 1B: Pipeline -->"

    start_idx = html.find(start_str)
    end_idx = html.find(end_str)

    if start_idx == -1 or end_idx == -1:
        print("COULD NOT FIND ROWS")
        return

    new_desc = """<!-- ROW 1: Main Info -->
<div class="row mb-3">
<!-- MAIN HERO CARD: Titel & Beskrivelse -->
<div class="col-12 mb-4 mt-2">
<div class="card shadow-sm border-0 border-start border-4 border-primary" style="background-color: var(--bg-card); border-radius: 8px;">
<div class="card-body p-4 pt-4 pb-5 position-relative">
<div class="d-flex justify-content-between align-items-start mb-4">
<div class="w-100 pe-3">
<h2 class="mb-2 fw-bolder" style="color: var(--accent); font-size: 1.8rem; letter-spacing: -0.5px;">
{{ case.titel }}
</h2>
<div class="d-flex align-items-center gap-2 mb-1 mt-2">
<span class="badge {{ 'bg-success' if case.status == 'åben' else 'bg-secondary' }} px-2 py-1 shadow-sm">{{ case.status }}</span>
<span class="badge bg-light text-dark border px-2 py-1">{{ case.template_key or case.type or 'ticket' }}</span>
</div>
</div>

<div class="d-flex gap-2 flex-shrink-0 mt-1">
<a href="/sag/{{ case.id }}/edit" class="btn btn-outline-primary shadow-sm" style="border-radius: 6px;">
<i class="bi bi-pencil me-1"></i>Rediger sag
</a>
<button onclick="confirmDeleteCase()" class="btn btn-outline-danger shadow-sm" style="border-radius: 6px;" title="Slet sag">
<i class="bi bi-trash"></i>
</button>
</div>
</div>

<div class="mt-4 pt-3 border-top border-light">
<div class="d-flex align-items-center mb-3">
<i class="bi bi-card-text fs-5 text-muted me-2"></i>
<h6 class="text-muted text-uppercase small mb-0 fw-bold" style="letter-spacing: 0.05em;">Opgavebeskrivelse</h6>
</div>

<div class="description-section rounded bg-white p-4 shadow-sm border" style="min-height: 120px;">
<div class="prose text-dark" style="font-size: 1.05rem; line-height: 1.7; white-space: pre-wrap;">{{ case.beskrivelse or '<div class="text-center p-3"><p class="text-muted fst-italic mb-2">Ingen opgavebeskrivelse tilføjet endnu.</p></div>' | safe }}</div>
</div>
</div>
</div>
</div>
</div>
"""

    html = html[:start_idx] + new_desc + "\n " + html[end_idx:]
    with open('app/modules/sag/templates/detail.html', 'w', encoding='utf-8') as f:
        f.write(html)
    print("Done description")

replace_desc()
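The core move in `fix_desc2.py` is a slice-splice between two HTML comment markers: locate the start and end markers, then rebuild the string as head + replacement + tail. A minimal standalone sketch of that pattern (the helper name `splice_between` and the sample markers are mine, for illustration only, not from `detail.html`):

```python
def splice_between(html: str, start_marker: str, end_marker: str, replacement: str) -> str:
    """Replace everything from start_marker up to (but not including) end_marker."""
    start_idx = html.find(start_marker)
    end_idx = html.find(end_marker)
    if start_idx == -1 or end_idx == -1 or end_idx < start_idx:
        # A missing marker leaves the input untouched, mirroring the script's early return
        return html
    return html[:start_idx] + replacement + html[end_idx:]

doc = "<body><!-- ROW 1 -->old block<!-- ROW 2 -->rest</body>"
out = splice_between(doc, "<!-- ROW 1 -->", "<!-- ROW 2 -->", "<!-- ROW 1 -->new block\n")
print(out)
```

Keeping the start marker inside the replacement string, as the script does, means the splice stays idempotent enough to be found again on a re-run.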
82  fix_hero_desc.py  Normal file
@@ -0,0 +1,82 @@
import re

with open('app/modules/sag/templates/detail.html', 'r') as f:
    content = f.read()

old_block = """ <!-- Main Case Info -->
<div class="col-12 mb-3">
<div class="card h-100 d-flex flex-column case-summary-card">
<div class="card-header case-summary-header">
<div>
<div class="case-summary-title">{{ case.titel }}</div>
<div class="case-summary-meta">
<span class="case-pill">#{{ case.id }}</span>
<span class="case-pill">{{ case.status }}</span>
<span class="case-pill case-pill-muted">{{ case.template_key or case.type or 'ticket' }}</span>
</div>
</div>
<div class="d-flex align-items-center gap-2">
<a href="/sag/{{ case.id }}/edit" class="btn btn-sm btn-outline-primary">
<i class="bi bi-pencil">
# [The remainder of fix_hero_desc.py is corrupted in this diff view: what survives is
# interleaved repetitions of `old_block = """ ...` fragments, pieces of the hero-card
# markup, and a truncated closing `print(...)` call. The original text is not recoverable.]
85  fix_hide_logic2.py  Normal file
@@ -0,0 +1,85 @@
import re

with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
    html = f.read()

pattern = re.compile(r"document\.querySelectorAll\('\[data-module\]'\)\.forEach\(\(el\) => \{.*?updateRightColumnVisibility\(\);", re.DOTALL)

new_code = """document.querySelectorAll('[data-module]').forEach((el) => {
    const moduleName = el.getAttribute('data-module');
    const hasContent = moduleHasContent(el);
    const isTimeModule = moduleName === 'time';
    const shouldCompactWhenEmpty = moduleName !== 'wiki' && moduleName !== 'pipeline' && !isTimeModule;
    const pref = modulePrefs[moduleName];
    const tabButton = document.querySelector(`[data-module-tab="${moduleName}"]`);

    // Helper to hide or show the module and its mb-3 wrapper
    const setVisibility = (visible) => {
        let wrapper = null;
        if (el.parentElement) {
            const isMB3 = el.parentElement.classList.contains('mb-3');
            const isRowCol12 = el.parentElement.classList.contains('col-12') && el.parentElement.parentElement && el.parentElement.parentElement.classList.contains('row');
            if (isMB3) wrapper = el.parentElement;
            else if (isRowCol12) wrapper = el.parentElement.parentElement;
        }

        if (visible) {
            el.classList.remove('d-none');
            if (wrapper && wrapper.classList.contains('d-none')) {
                wrapper.classList.remove('d-none');
            }
            if (tabButton && tabButton.classList.contains('d-none')) {
                tabButton.classList.remove('d-none');
            }
        } else {
            el.classList.add('d-none');
            if (wrapper && !wrapper.classList.contains('d-none')) wrapper.classList.add('d-none');
            if (tabButton && !tabButton.classList.contains('d-none')) tabButton.classList.add('d-none');
        }
    };

    // Always show the time module
    if (isTimeModule) {
        setVisibility(true);
        el.classList.remove('module-empty-compact');
        return;
    }

    // IF a specific preference disables it - hide it! Regardless of content.
    if (pref === false) {
        setVisibility(false);
        el.classList.remove('module-empty-compact');
        return;
    }

    // IF a specific preference enables it (user choice)
    if (pref === true) {
        setVisibility(true);
        el.classList.toggle('module-empty-compact', shouldCompactWhenEmpty && !hasContent);
        return;
    }

    // Default logic (no user choice) - if it has content, show it
    if (hasContent) {
        setVisibility(true);
        el.classList.remove('module-empty-compact');
        return;
    }

    // Default logic - no content: fall back to the layout defaults
    if (standardModuleSet.has(moduleName)) {
        setVisibility(true);
        el.classList.toggle('module-empty-compact', shouldCompactWhenEmpty);
    } else {
        setVisibility(false);
        el.classList.remove('module-empty-compact');
    }
});

updateRightColumnVisibility();"""

html, count = pattern.subn(new_code, html)
print(f"Replaced {count} instances.")

with open('app/modules/sag/templates/detail.html', 'w', encoding='utf-8') as f:
    f.write(html)
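These patch scripts share one workflow: compile a `re.DOTALL` pattern that spans the old block from its first anchor to its last, substitute the new code with `subn`, and report the replacement count as a sanity check. A minimal self-contained sketch of that workflow (the marker strings here are illustrative, not from the repo):

```python
import re

html = "before\n// start marker\nold body line 1\nold body line 2\n// end marker\nafter"

# DOTALL lets '.' cross newlines; the non-greedy '.*?' stops at the first end marker
pattern = re.compile(r"// start marker.*?// end marker", re.DOTALL)
new_block = "// start marker\nnew body\n// end marker"

html, count = pattern.subn(new_block, html)
print(count)  # 1
```

Checking `count` (as `fix_hide_logic2.py` does with its "Replaced N instances" print) is the cheap guard against a marker that silently drifted out of the target file.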
128  fix_relations_center.py  Normal file
@@ -0,0 +1,128 @@
#!/usr/bin/env python3
"""
Move Relationer to center column + add dynamic column distribution JS.
"""

with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
    content = f.read()

# ── 1. Extract the relations block from inner-left-col ───────────────────────
start_marker = '<div class="row mb-3">\n <div class="col-12 mb-3">\n <div class="card h-100 d-flex flex-column" data-module="relations"'
end_marker_after = ' </div>\n<div class="mb-3"></div>\n<div class="mb-3"></div>\n<div class="mb-3"></div>'

start_idx = content.find(start_marker)
if start_idx == -1:
    print("ERROR: Could not find relations block start")
    exit(1)

end_marker_idx = content.find(end_marker_after, start_idx)
if end_marker_idx == -1:
    print("ERROR: Could not find relations block end / empty spacers")
    exit(1)

end_idx = end_marker_idx + len(end_marker_after)

relations_block = content[start_idx:end_idx - len('\n<div class="mb-3"></div>\n<div class="mb-3"></div>\n<div class="mb-3"></div>')]
print(f"Extracted relations block: chars {start_idx} - {end_idx}")
print(f"Relations block starts with: {relations_block[:80]!r}")
print(f"Relations block ends with: {relations_block[-60:]!r}")

# ── 2. Remove the relations block + spacers from inner-left-col ──────────────
content = content[:start_idx] + content[end_idx:]
print("Removed relations + spacers from inner-left-col")

# ── 3. Insert relations into inner-center-col (before ROW 3: Files) ──────────
insert_before = ' <!-- ROW 3: Files + Linked Emails -->'
insert_idx = content.find(insert_before)
if insert_idx == -1:
    print("ERROR: Could not find ROW 3 insertion point")
    exit(1)

relations_in_center = '\n <!-- Relationer (center) -->\n' + relations_block + '\n\n'
content = content[:insert_idx] + relations_in_center + content[insert_idx:]
print(f"Inserted relations before ROW 3 at char {insert_idx}")

# ── 4. Add updateInnerColumnVisibility() after updateRightColumnVisibility() ─
old_js = """ function updateRightColumnVisibility() {
        const rightColumn = document.getElementById('case-right-column');
        const leftColumn = document.getElementById('case-left-column');
        if (!rightColumn || !leftColumn) return;

        const visibleRightModules = rightColumn.querySelectorAll('.right-module-card:not(.d-none)');
        if (visibleRightModules.length === 0) {
            rightColumn.classList.add('d-none');
            rightColumn.classList.remove('col-lg-4');
            leftColumn.classList.remove('col-lg-8');
            leftColumn.classList.add('col-12');
        } else {
            rightColumn.classList.remove('d-none');
            rightColumn.classList.add('col-lg-4');
            leftColumn.classList.add('col-lg-8');
            leftColumn.classList.remove('col-12');
        }
    }"""

new_js = """ function updateRightColumnVisibility() {
        const rightColumn = document.getElementById('case-right-column');
        const leftColumn = document.getElementById('case-left-column');
        if (!rightColumn || !leftColumn) return;

        const visibleRightModules = rightColumn.querySelectorAll('.right-module-card:not(.d-none)');
        if (visibleRightModules.length === 0) {
            rightColumn.classList.add('d-none');
            rightColumn.classList.remove('col-lg-4');
            leftColumn.classList.remove('col-lg-8');
            leftColumn.classList.add('col-12');
        } else {
            rightColumn.classList.remove('d-none');
            rightColumn.classList.add('col-lg-4');
            leftColumn.classList.add('col-lg-8');
            leftColumn.classList.remove('col-12');
        }
    }

    function updateInnerColumnVisibility() {
        const leftCol = document.getElementById('inner-left-col');
        const centerCol = document.getElementById('inner-center-col');
        if (!leftCol || !centerCol) return;

        // Count visible modules in the left column (mb-3 wrappers that are not hidden)
        const visibleLeftModules = leftCol.querySelectorAll('.mb-3:not(.d-none) [data-module]');
        const hasVisibleLeft = visibleLeftModules.length > 0;

        if (!hasVisibleLeft) {
            // No visible modules on the left - expand the center to full width
            leftCol.classList.add('d-none');
            centerCol.classList.remove('col-xl-8');
            centerCol.classList.add('col-xl-12');
        } else {
            // Restore the 4/8 split
            leftCol.classList.remove('d-none');
            centerCol.classList.remove('col-xl-12');
            centerCol.classList.add('col-xl-8');
        }
    }"""

if old_js in content:
    content = content.replace(old_js, new_js)
    print("Added updateInnerColumnVisibility() function")
else:
    print("ERROR: Could not find updateRightColumnVisibility() for JS patch")
    exit(1)

# ── 5. Call updateInnerColumnVisibility() from applyViewLayout ───────────────
old_call = ' updateRightColumnVisibility();\n }'
new_call = ' updateRightColumnVisibility();\n updateInnerColumnVisibility();\n }'

if old_call in content:
    content = content.replace(old_call, new_call, 1)
    print("Added updateInnerColumnVisibility() call in applyViewLayout")
else:
    print("ERROR: Could not find updateRightColumnVisibility() call")
    exit(1)

# ── Write file ────────────────────────────────────────────────────────────────
with open('app/modules/sag/templates/detail.html', 'w', encoding='utf-8') as f:
    f.write(content)

print("\n✅ Done! Lines written:", content.count('\n'))
174  fix_top.py  Normal file
@@ -0,0 +1,174 @@
import re

def main():
    with open('app/modules/sag/templates/detail.html', 'r', encoding='utf-8') as f:
        html = f.read()

    # --- 1. Topbar fix ---
    topbar_pattern = re.compile(r'<!-- Quick Info Bar \(Redesigned\) -->.*?<!-- Tabs Navigation -->', re.DOTALL)

    new_topbar = """<!-- Hero Header (Redesigned) -->
<div class="card mb-4 border-0 shadow-sm hero-header" style="border-radius: 8px;">
<div class="card-body p-3 px-4">
<div class="d-flex flex-wrap justify-content-between align-items-center gap-3">

<!-- Left: Who & What -->
<div class="d-flex flex-wrap align-items-center gap-4">
<div class="d-flex align-items-center">
<span class="badge" style="background: var(--accent); font-size: 1.1rem; padding: 0.5em 0.8em; margin-right: 0.5rem; box-shadow: 0 2px 4px rgba(0,0,0,0.1);">#{{ case.id }}</span>
<span class="badge {{ 'bg-success' if case.status == 'åben' else 'bg-secondary' }}" style="font-size: 0.9rem; padding: 0.5em 0.8em;">{{ case.status }}</span>
</div>

<div class="d-flex flex-column">
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Kunde</span>
{% if customer %}
<a href="/customers/{{ customer.id }}" class="fw-bold fs-5 text-dark text-decoration-none hover-primary">
{{ customer.name }}
</a>
{% else %}
<span class="fs-5 text-muted">Ingen kunde</span>
{% endif %}
</div>

<div class="d-flex flex-column">
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Kontakt</span>
{% if hovedkontakt %}
<span class="fw-bold fs-6 text-dark hover-primary" style="cursor: pointer; text-decoration: underline; text-decoration-style: dotted; text-underline-offset: 4px;" onclick="showKontaktModal()" title="Se kontaktinfo">
{{ hovedkontakt.first_name ~ ' ' ~ hovedkontakt.last_name }}
</span>
{% else %}
<span class="fs-6 text-muted fst-italic">Ingen</span>
{% endif %}
</div>

<div class="d-flex flex-column border-end pe-4">
|
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Afdeling</span>
|
||||||
|
<span class="fs-6 hover-primary" style="cursor: pointer;" onclick="showAfdelingModal()" title="Ændre afdeling">
|
||||||
|
{{ customer.department if customer and customer.department else 'N/A' }}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="d-flex flex-column pe-4">
|
||||||
|
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Ansvarlig</span>
|
||||||
|
<div class="d-flex gap-2">
|
||||||
|
<select id="assignmentUserSelect" class="form-select form-select-sm shadow-none" style="border: none; background-color: #f8f9fa; font-weight: bold; width: auto; font-size: 0.9rem;" onchange="saveAssignment()">
|
||||||
|
<option value="">Ingen (Bruger)</option>
|
||||||
|
{% for user in assignment_users or [] %}
|
||||||
|
<option value="{{ user.user_id }}" {% if case.ansvarlig_bruger_id == user.user_id %}selected{% endif %}>{{ user.display_name }}</option>
|
||||||
|
{% endfor %}
|
||||||
|
</select>
|
||||||
|
<select id="assignmentGroupSelect" class="form-select form-select-sm shadow-none" style="border: none; background-color: #f8f9fa; font-weight: bold; width: auto; font-size: 0.9rem;" onchange="saveAssignment()">
|
||||||
|
<option value="">Ingen (Gruppe)</option>
|
||||||
|
{% for group in assignment_groups or [] %}
|
||||||
|
<option value="{{ group.id }}" {% if case.assigned_group_id == group.id %}selected{% endif %}>{{ group.name }}</option>
|
||||||
|
{% endfor %}
|
||||||
|
</select>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Right: Time & Dates -->
|
||||||
|
<div class="d-flex flex-wrap align-items-center gap-4">
|
||||||
|
<div class="d-flex flex-column text-end border-end pe-4">
|
||||||
|
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Datoer <i class="bi bi-info-circle text-muted" title="Oprettet / Opdateret"></i></span>
|
||||||
|
<div class="small mt-1">
|
||||||
|
<span class="text-muted fw-bold" style="font-size: 0.8rem;">Opr:</span> {{ case.created_at.strftime('%d/%m-%y') if case.created_at else '-' }}
|
||||||
|
<span class="text-muted mx-1">|</span>
|
||||||
|
<span class="text-muted" style="font-size: 0.8rem;">Opd:</span> {{ case.updated_at.strftime('%d/%m-%y') if case.updated_at else '-' }}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="d-flex flex-column text-end">
|
||||||
|
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Deadline</span>
|
||||||
|
<div class="d-flex align-items-center justify-content-end mt-1">
|
||||||
|
{% if case.deadline %}
|
||||||
|
<span class="badge bg-light text-dark border {{ 'text-danger border-danger' if is_deadline_overdue else '' }}" style="font-size: 0.85rem; font-weight: 500;">
|
||||||
|
<i class="bi bi-clock me-1"></i>{{ case.deadline.strftime('%d/%m-%y') }}
|
||||||
|
</span>
|
||||||
|
{% else %}
|
||||||
|
<span class="text-muted small fst-italic">Ingen</span>
|
||||||
|
{% endif %}
|
||||||
|
<button class="btn btn-link btn-sm p-0 ms-1 text-muted" onclick="openDeadlineModal()" title="Rediger deadline"><i class="bi bi-pencil-square"></i></button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="d-flex flex-column text-end">
|
||||||
|
<span class="text-muted text-uppercase fw-bold" style="font-size: 0.70rem; letter-spacing: 0.5px;">Udsat</span>
|
||||||
|
<div class="d-flex align-items-center justify-content-end mt-1">
|
||||||
|
{% if case.deferred_until %}
|
||||||
|
<span class="badge bg-light text-dark border" style="font-size: 0.85rem; font-weight: 500;">
|
||||||
|
<i class="bi bi-calendar-event me-1"></i>{{ case.deferred_until.strftime('%d/%m-%y') }}
|
||||||
|
</span>
|
||||||
|
{% else %}
|
||||||
|
<span class="text-muted small fst-italic">Nej</span>
|
||||||
|
{% endif %}
|
||||||
|
<button class="btn btn-link btn-sm p-0 ms-1 text-muted" onclick="openDeferredModal()" title="Rediger udsættelse"><i class="bi bi-pencil-square"></i></button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- Tabs Navigation -->"""
|
||||||
|
html, n = topbar_pattern.subn(new_topbar, html)
|
||||||
|
print(f"Topbar replaced: {n}")
|
||||||
|
|
||||||
|
# --- 2. Hovedbeskrivelsen! ---
|
||||||
|
desc_pattern = re.compile(r'<!-- Main Case Info -->.*?<div class="row mb-3">\s*<div class="col-12 mb-3">\s*<div class="card h-100 d-flex flex-column" data-module="pipeline"', re.DOTALL)
|
||||||
|
|
||||||
|
new_desc = """<!-- MAIN HERO CARD: Titel & Beskrivelse -->
|
||||||
|
<div class="col-12 mb-4 mt-2">
|
||||||
|
<div class="card shadow-sm border-0 border-start border-4 border-primary" style="background-color: var(--bg-card); border-radius: 8px;">
|
||||||
|
<div class="card-body p-4 pt-4 pb-5 position-relative">
|
||||||
|
<div class="d-flex justify-content-between align-items-start mb-4">
|
||||||
|
<div class="w-100 pe-3">
|
||||||
|
<h2 class="mb-2 fw-bolder" style="color: var(--accent); font-size: 1.8rem; letter-spacing: -0.5px;">
|
||||||
|
{{ case.titel }}
|
||||||
|
</h2>
|
||||||
|
<div class="d-flex align-items-center gap-2 mb-1 mt-2">
|
||||||
|
<span class="badge {{ 'bg-success' if case.status == 'åben' else 'bg-secondary' }} px-2 py-1 shadow-sm">{{ case.status }}</span>
|
||||||
|
<span class="badge bg-light text-dark border px-2 py-1">{{ case.template_key or case.type or 'ticket' }}</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="d-flex gap-2 flex-shrink-0 mt-1">
|
||||||
|
<a href="/sag/{{ case.id }}/edit" class="btn btn-outline-primary shadow-sm" style="border-radius: 6px;">
|
||||||
|
<i class="bi bi-pencil me-1"></i>Rediger sag
|
||||||
|
</a>
|
||||||
|
<button onclick="confirmDeleteCase()" class="btn btn-outline-danger shadow-sm" style="border-radius: 6px;" title="Slet sag">
|
||||||
|
<i class="bi bi-trash"></i>
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="mt-4 pt-3 border-top border-light">
|
||||||
|
<div class="d-flex align-items-center mb-3">
|
||||||
|
<i class="bi bi-card-text fs-5 text-muted me-2"></i>
|
||||||
|
<h6 class="text-muted text-uppercase small mb-0 fw-bold" style="letter-spacing: 0.05em;">Opgavebeskrivelse</h6>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="description-section rounded bg-white p-4 shadow-sm border" style="min-height: 120px;">
|
||||||
|
<div class="prose text-dark" style="font-size: 1.05rem; line-height: 1.7; white-space: pre-wrap;">{{ case.beskrivelse or '<div class="text-center p-3"><p class="text-muted fst-italic mb-2">Ingen opgavebeskrivelse tilføjet endnu.</p></div>' | safe }}</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<!-- ROW 1B: Pipeline -->
|
||||||
|
<div class="row mb-3">
|
||||||
|
<div class="col-12 mb-3">
|
||||||
|
<div class="card h-100 d-flex flex-column" data-module="pipeline" """
|
||||||
|
|
||||||
|
html, n2 = desc_pattern.subn(new_desc, html)
|
||||||
|
print(f"Desc replaced: {n2}")
|
||||||
|
|
||||||
|
with open('app/modules/sag/templates/detail.html', 'w', encoding='utf-8') as f:
|
||||||
|
f.write(html)
|
||||||
|
|
||||||
|
if __name__ == '__main__':
|
||||||
|
main()
|
||||||
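The patch scripts above all follow the same shape: read the template, apply a literal or regex substitution, verify the match count, and only write the file back when the count is what was expected. That guard is worth keeping in one place; a minimal reusable sketch of the pattern (the function name is illustrative, not from the repo):

```python
import re

def patch_once(text: str, pattern: str, replacement: str) -> str:
    """Apply a DOTALL regex substitution, failing loudly unless exactly one match."""
    compiled = re.compile(pattern, re.DOTALL)
    patched, count = compiled.subn(replacement, text)
    if count != 1:
        raise RuntimeError(f"expected exactly 1 match, got {count}")
    return patched

# Example: the same subn-and-check flow used by fix_top.py, on a tiny input.
print(patch_once("<a>OLD</a>", r"OLD", "NEW"))  # <a>NEW</a>
```

Failing when the count is 0 (anchor drifted) or greater than 1 (pattern too loose) is what keeps these one-shot patch scripts from silently corrupting the template.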
62  main.py
@ -16,6 +16,29 @@ from app.core.database import init_db
 from app.core.auth_service import AuthService
 from app.core.database import execute_query_single
+
+
+_users_column_cache: dict[str, bool] = {}
+
+
+def _users_column_exists(column_name: str) -> bool:
+    if column_name in _users_column_cache:
+        return _users_column_cache[column_name]
+
+    result = execute_query_single(
+        """
+        SELECT 1
+        FROM information_schema.columns
+        WHERE table_schema = 'public'
+          AND table_name = 'users'
+          AND column_name = %s
+        LIMIT 1
+        """,
+        (column_name,),
+    )
+    exists = bool(result)
+    _users_column_cache[column_name] = exists
+    return exists
+
 def get_version():
     """Read version from VERSION file"""
     try:
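The new `_users_column_exists` helper memoises the `information_schema` lookup in a module-level dict, so each column is queried at most once per process. The caching shape can be shown standalone, with a stub standing in for `execute_query_single` (the stub and its return values are illustrative only):

```python
calls = []

def fake_query(column_name):
    # Stand-in for the real information_schema lookup in main.py.
    calls.append(column_name)
    return {"exists": 1} if column_name == "is_2fa_enabled" else None

_cache: dict[str, bool] = {}

def column_exists(column_name: str) -> bool:
    if column_name in _cache:
        return _cache[column_name]          # cache hit: no query issued
    exists = bool(fake_query(column_name))  # cache miss: query once, remember
    _cache[column_name] = exists
    return exists

print(column_exists("is_2fa_enabled"))  # True
print(column_exists("is_2fa_enabled"))  # True (served from cache)
print(len(calls))                       # 1
```

Since the cache is per-process and never invalidated, adding the column later requires a restart before the middleware notices it, which is an acceptable trade-off for a migration-compatibility shim.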
@ -36,6 +59,7 @@ from app.system.backend import router as system_api
 from app.system.backend import sync_router
 from app.dashboard.backend import views as dashboard_views
 from app.dashboard.backend import router as dashboard_api
+from app.dashboard.backend import mission_router as mission_api
 from app.prepaid.backend import router as prepaid_api
 from app.prepaid.backend import views as prepaid_views
 from app.fixed_price.backend import router as fixed_price_api
@ -205,9 +229,18 @@ async def auth_middleware(request: Request, call_next):
         "/api/v1/auth/login"
     }
+
+    public_prefixes = {
+        "/api/v1/mission/webhook/telefoni/",
+        "/api/v1/mission/webhook/uptime",
+    }
+
     # Yealink Action URL callbacks (secured inside telefoni module by token/IP)
     public_paths.add("/api/v1/telefoni/established")
     public_paths.add("/api/v1/telefoni/terminated")
+    public_paths.add("/api/v1/mission/webhook/telefoni/ringing")
+    public_paths.add("/api/v1/mission/webhook/telefoni/answered")
+    public_paths.add("/api/v1/mission/webhook/telefoni/hangup")
+    public_paths.add("/api/v1/mission/webhook/uptime")
+
     if settings.DEV_ALLOW_ARCHIVED_IMPORT:
         public_paths.add("/api/v1/ticket/archived/simply/import")
@ -215,7 +248,12 @@ async def auth_middleware(request: Request, call_next):
         public_paths.add("/api/v1/ticket/archived/simply/ticket")
         public_paths.add("/api/v1/ticket/archived/simply/record")

-    if path in public_paths or path.startswith("/static") or path.startswith("/docs"):
+    if (
+        path in public_paths
+        or any(path.startswith(prefix) for prefix in public_prefixes)
+        or path.startswith("/static")
+        or path.startswith("/docs")
+    ):
         return await call_next(request)

     token = None
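With this change a request bypasses auth when its path is an exact member of `public_paths`, starts with one of the webhook prefixes, or falls under `/static` or `/docs`. The decision logic, extracted on its own (path values copied from the diff; the set contents are abbreviated):

```python
public_paths = {"/api/v1/auth/login", "/api/v1/telefoni/established"}
public_prefixes = {
    "/api/v1/mission/webhook/telefoni/",
    "/api/v1/mission/webhook/uptime",
}

def is_public(path: str) -> bool:
    # Exact match, prefix match, or the static/docs subtrees.
    return (
        path in public_paths
        or any(path.startswith(prefix) for prefix in public_prefixes)
        or path.startswith("/static")
        or path.startswith("/docs")
    )

print(is_public("/api/v1/mission/webhook/telefoni/ringing"))  # True (prefix)
print(is_public("/api/v1/auth/login"))                        # True (exact)
print(is_public("/api/v1/ticket/1"))                          # False
```

The trailing slash on `/api/v1/mission/webhook/telefoni/` matters: it keeps the prefix from accidentally matching unrelated sibling routes.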
@ -250,11 +288,16 @@ async def auth_middleware(request: Request, call_next):
             content={"detail": "Invalid token"}
         )
     user_id = int(payload.get("sub"))
-    user = execute_query_single(
-        "SELECT is_2fa_enabled FROM users WHERE user_id = %s",
-        (user_id,)
-    )
-    is_2fa_enabled = bool(user and user.get("is_2fa_enabled"))
+    if _users_column_exists("is_2fa_enabled"):
+        user = execute_query_single(
+            "SELECT COALESCE(is_2fa_enabled, FALSE) AS is_2fa_enabled FROM users WHERE user_id = %s",
+            (user_id,),
+        )
+        is_2fa_enabled = bool(user and user.get("is_2fa_enabled"))
+    else:
+        # Older schemas without 2FA columns should not block authenticated requests.
+        is_2fa_enabled = False

     if not is_2fa_enabled:
         allowed_2fa_paths = (
@ -279,6 +322,7 @@ app.include_router(alert_notes_api, prefix="/api/v1", tags=["Alert Notes"])
 app.include_router(billing_api.router, prefix="/api/v1", tags=["Billing"])
 app.include_router(system_api.router, prefix="/api/v1", tags=["System"])
 app.include_router(dashboard_api.router, prefix="/api/v1", tags=["Dashboard"])
+app.include_router(mission_api.router, prefix="/api/v1", tags=["Mission"])
 app.include_router(sync_router.router, prefix="/api/v1/system", tags=["System Sync"])
 app.include_router(prepaid_api.router, prefix="/api/v1", tags=["Prepaid Cards"])
 app.include_router(fixed_price_api.router, prefix="/api/v1", tags=["Fixed-Price Agreements"])
@ -380,5 +424,9 @@ if __name__ == "__main__":
         "main:app",
         host="0.0.0.0",
         port=8000,
-        reload=False
+        reload=False,
+        workers=2,
+        timeout_keep_alive=65,
+        access_log=True,
+        log_level="info"
     )
@ -37,7 +37,7 @@ CREATE TABLE email_rules (
     updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
     created_by_user_id INTEGER,

-    FOREIGN KEY (created_by_user_id) REFERENCES users(id) ON DELETE SET NULL
+    FOREIGN KEY (created_by_user_id) REFERENCES users(user_id) ON DELETE SET NULL
 );

 -- Email Messages Table (main storage)
@ -183,7 +183,7 @@ SELECT
     COUNT(ea.id) as attachment_count_actual,
     er.name as rule_name,
     v.name as supplier_name,
-    tc.customer_name,
+    tc.name as customer_name,
     tcase.title as case_title
 FROM email_messages em
 LEFT JOIN email_attachments ea ON em.id = ea.email_id
@ -193,7 +193,7 @@ LEFT JOIN tmodule_customers tc ON em.customer_id = tc.id
 LEFT JOIN tmodule_cases tcase ON em.linked_case_id = tcase.id
 WHERE em.deleted_at IS NULL
   AND em.status IN ('new', 'error')
-GROUP BY em.id, er.name, v.name, tc.customer_name, tcase.title
+GROUP BY em.id, er.name, v.name, tc.name, tcase.title
 ORDER BY em.received_date DESC;

 -- View for recent email activity
@ -27,9 +27,9 @@ CREATE TABLE IF NOT EXISTS tticket_relations (
     CONSTRAINT no_self_reference CHECK (ticket_id != related_ticket_id)
 );

-CREATE INDEX idx_tticket_relations_ticket ON tticket_relations(ticket_id);
-CREATE INDEX idx_tticket_relations_related ON tticket_relations(related_ticket_id);
-CREATE INDEX idx_tticket_relations_type ON tticket_relations(relation_type);
+CREATE INDEX IF NOT EXISTS idx_tticket_relations_ticket ON tticket_relations(ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_relations_related ON tticket_relations(related_ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_relations_type ON tticket_relations(relation_type);

 -- View for at finde alle relationer for en ticket (begge retninger)
 CREATE OR REPLACE VIEW tticket_all_relations AS
@ -90,10 +90,10 @@ CREATE TABLE IF NOT EXISTS tticket_calendar_events (
     completed_at TIMESTAMP
 );

-CREATE INDEX idx_tticket_calendar_ticket ON tticket_calendar_events(ticket_id);
-CREATE INDEX idx_tticket_calendar_date ON tticket_calendar_events(event_date);
-CREATE INDEX idx_tticket_calendar_type ON tticket_calendar_events(event_type);
-CREATE INDEX idx_tticket_calendar_status ON tticket_calendar_events(status);
+CREATE INDEX IF NOT EXISTS idx_tticket_calendar_ticket ON tticket_calendar_events(ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_calendar_date ON tticket_calendar_events(event_date);
+CREATE INDEX IF NOT EXISTS idx_tticket_calendar_type ON tticket_calendar_events(event_type);
+CREATE INDEX IF NOT EXISTS idx_tticket_calendar_status ON tticket_calendar_events(status);

 -- ============================================================================
 -- TEMPLATES (svarskabeloner, guides, standardbreve)
@ -128,8 +128,8 @@ CREATE TABLE IF NOT EXISTS tticket_templates (
     usage_count INTEGER DEFAULT 0
 );

-CREATE INDEX idx_tticket_templates_category ON tticket_templates(category);
-CREATE INDEX idx_tticket_templates_active ON tticket_templates(is_active);
+CREATE INDEX IF NOT EXISTS idx_tticket_templates_category ON tticket_templates(category);
+CREATE INDEX IF NOT EXISTS idx_tticket_templates_active ON tticket_templates(is_active);

 -- ============================================================================
 -- TEMPLATE USAGE LOG (hvornår blev skabeloner brugt)
@ -143,8 +143,8 @@ CREATE TABLE IF NOT EXISTS tticket_template_usage (
     was_modified BOOLEAN DEFAULT false -- Blev template redigeret før afsendelse?
 );

-CREATE INDEX idx_tticket_template_usage_template ON tticket_template_usage(template_id);
-CREATE INDEX idx_tticket_template_usage_ticket ON tticket_template_usage(ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_template_usage_template ON tticket_template_usage(template_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_template_usage_ticket ON tticket_template_usage(ticket_id);

 -- ============================================================================
 -- AI SUGGESTIONS (forslag til actions - aldrig automatisk)
@ -186,10 +186,10 @@ CREATE TABLE IF NOT EXISTS tticket_ai_suggestions (
     expires_at TIMESTAMP -- Forslag udløber efter X dage
 );

-CREATE INDEX idx_tticket_ai_suggestions_ticket ON tticket_ai_suggestions(ticket_id);
-CREATE INDEX idx_tticket_ai_suggestions_type ON tticket_ai_suggestions(suggestion_type);
-CREATE INDEX idx_tticket_ai_suggestions_status ON tticket_ai_suggestions(status);
-CREATE INDEX idx_tticket_ai_suggestions_created ON tticket_ai_suggestions(created_at);
+CREATE INDEX IF NOT EXISTS idx_tticket_ai_suggestions_ticket ON tticket_ai_suggestions(ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_ai_suggestions_type ON tticket_ai_suggestions(suggestion_type);
+CREATE INDEX IF NOT EXISTS idx_tticket_ai_suggestions_status ON tticket_ai_suggestions(status);
+CREATE INDEX IF NOT EXISTS idx_tticket_ai_suggestions_created ON tticket_ai_suggestions(created_at);

 -- ============================================================================
 -- EMAIL METADATA (udvidet til contact identification)
@ -227,9 +227,9 @@ CREATE TABLE IF NOT EXISTS tticket_email_metadata (
     updated_at TIMESTAMP
 );

-CREATE INDEX idx_tticket_email_ticket ON tticket_email_metadata(ticket_id);
-CREATE INDEX idx_tticket_email_message_id ON tticket_email_metadata(message_id);
-CREATE INDEX idx_tticket_email_from ON tticket_email_metadata(from_email);
+CREATE INDEX IF NOT EXISTS idx_tticket_email_ticket ON tticket_email_metadata(ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_email_message_id ON tticket_email_metadata(message_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_email_from ON tticket_email_metadata(from_email);

 -- ============================================================================
 -- Tilføj manglende kolonner til existing tticket_tickets
@ -265,9 +265,15 @@ CREATE TABLE IF NOT EXISTS tticket_audit_log (
     metadata JSONB -- Additional context
 );

-CREATE INDEX idx_tticket_audit_ticket ON tticket_audit_log(ticket_id);
-CREATE INDEX idx_tticket_audit_action ON tticket_audit_log(action);
-CREATE INDEX idx_tticket_audit_performed ON tticket_audit_log(performed_at DESC);
+ALTER TABLE tticket_audit_log
+    ADD COLUMN IF NOT EXISTS field_name VARCHAR(100),
+    ADD COLUMN IF NOT EXISTS performed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
+    ADD COLUMN IF NOT EXISTS reason TEXT,
+    ADD COLUMN IF NOT EXISTS metadata JSONB;
+
+CREATE INDEX IF NOT EXISTS idx_tticket_audit_ticket ON tticket_audit_log(ticket_id);
+CREATE INDEX IF NOT EXISTS idx_tticket_audit_action ON tticket_audit_log(action);
+CREATE INDEX IF NOT EXISTS idx_tticket_audit_performed ON tticket_audit_log(performed_at DESC);

 -- ============================================================================
 -- TRIGGERS for audit logging
@ -24,7 +24,17 @@ ADD COLUMN IF NOT EXISTS time_date DATE;
 ALTER TABLE tmodule_order_lines
 ADD COLUMN IF NOT EXISTS is_travel BOOLEAN DEFAULT false;

--- Log migration
-INSERT INTO migration_log (migration_name, applied_at)
-VALUES ('031_add_is_travel_column', CURRENT_TIMESTAMP)
-ON CONFLICT DO NOTHING;
+-- Log migration when the legacy tracking table exists
+DO $$
+BEGIN
+    IF EXISTS (
+        SELECT 1
+        FROM information_schema.tables
+        WHERE table_schema = 'public'
+          AND table_name = 'migration_log'
+    ) THEN
+        INSERT INTO migration_log (migration_name, applied_at)
+        VALUES ('031_add_is_travel_column', CURRENT_TIMESTAMP)
+        ON CONFLICT DO NOTHING;
+    END IF;
+END $$;
@ -4,13 +4,13 @@

 -- Add import_method column
 ALTER TABLE email_messages
-ADD COLUMN import_method VARCHAR(50) DEFAULT 'imap';
+ADD COLUMN IF NOT EXISTS import_method VARCHAR(50) DEFAULT 'imap';

 -- Add comment
 COMMENT ON COLUMN email_messages.import_method IS 'How the email was imported: imap, graph_api, or manual_upload';

 -- Create index for filtering by import method
-CREATE INDEX idx_email_messages_import_method ON email_messages(import_method);
+CREATE INDEX IF NOT EXISTS idx_email_messages_import_method ON email_messages(import_method);

 -- Update existing records to reflect their actual source
 -- (all existing emails were fetched via IMAP or Graph API)
@ -20,5 +20,8 @@ WHERE import_method IS NULL;

 -- Add constraint to ensure valid values
 ALTER TABLE email_messages
-ADD CONSTRAINT chk_email_import_method
+DROP CONSTRAINT IF EXISTS chk_email_import_method;
+
+ALTER TABLE email_messages
+ADD CONSTRAINT chk_email_import_method
 CHECK (import_method IN ('imap', 'graph_api', 'manual_upload'));
@ -1,5 +1,5 @@
 -- 069_conversation_category.sql
 -- Add category column for conversation classification

-ALTER TABLE conversations ADD COLUMN category VARCHAR(50) DEFAULT 'General';
+ALTER TABLE conversations ADD COLUMN IF NOT EXISTS category VARCHAR(50) DEFAULT 'General';
 COMMENT ON COLUMN conversations.category IS 'Conversation Category: General, Support, Sales, Internal, Meeting';

@ -1,4 +1,4 @@
 -- 072_add_category_to_conversations.sql

-ALTER TABLE conversations ADD COLUMN category VARCHAR(50) DEFAULT 'General';
+ALTER TABLE conversations ADD COLUMN IF NOT EXISTS category VARCHAR(50) DEFAULT 'General';
 COMMENT ON COLUMN conversations.category IS 'Category of the conversation (e.g. Sales, Support, General)';

@ -11,4 +11,4 @@ CREATE TABLE IF NOT EXISTS sag_kommentarer (
     deleted_at TIMESTAMP WITH TIME ZONE DEFAULT NULL
 );

-CREATE INDEX idx_sag_kommentarer_sag_id ON sag_comments(sag_id);
+CREATE INDEX IF NOT EXISTS idx_sag_kommentarer_sag_id ON sag_kommentarer(sag_id);
@ -51,7 +51,7 @@ SELECT
     s.customer_id,
     cust.name as customer_name,
     s.sag_id,
-    sag.title as sag_title,
+    sag.titel as sag_title,
     s.session_link,
     s.started_at,
     s.ended_at,

@ -1,15 +1,31 @@
 -- Migration: Enforce unique economic customer number on customers
 -- Prevents ambiguous mapping during e-conomic sync

--- Normalize values before uniqueness check
-UPDATE customers
-SET economic_customer_number = NULL
-WHERE economic_customer_number IS NOT NULL
-  AND btrim(economic_customer_number) = '';
-
-UPDATE customers
-SET economic_customer_number = btrim(economic_customer_number)
-WHERE economic_customer_number IS NOT NULL;
+-- Normalize values before uniqueness check (only for text-like columns)
+-- Uses string-concatenated EXECUTE to avoid nested dollar-quoting issues
+DO $$
+DECLARE
+    column_data_type TEXT;
+BEGIN
+    SELECT data_type
+    INTO column_data_type
+    FROM information_schema.columns
+    WHERE table_schema = 'public'
+      AND table_name = 'customers'
+      AND column_name = 'economic_customer_number';
+
+    -- Only normalize whitespace for text-type columns (integer columns are skipped)
+    IF column_data_type IN ('character varying', 'character', 'text') THEN
+        EXECUTE 'UPDATE customers'
+            || ' SET economic_customer_number = NULL'
+            || ' WHERE economic_customer_number IS NOT NULL'
+            || ' AND btrim(economic_customer_number) = ''''';
+
+        EXECUTE 'UPDATE customers'
+            || ' SET economic_customer_number = btrim(economic_customer_number)'
+            || ' WHERE economic_customer_number IS NOT NULL';
+    END IF;
+END $$;

 -- Abort migration if duplicates exist (must be manually resolved first)
 DO $$
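The economic_customer_number migration only rewrites the column when `information_schema` reports a text-like data type, so the same file is safe on schemas where the column was created as an integer. The type gate reduces to a small predicate, sketched here standalone (the function name is illustrative, not from the repo):

```python
# The three information_schema data_type values the DO block treats as text-like.
TEXT_TYPES = {"character varying", "character", "text"}

def should_normalize(column_data_type):
    # Mirrors the IF ... IN (...) gate: text-like columns get btrim()-normalised,
    # integer or missing (None) columns are skipped entirely.
    return column_data_type in TEXT_TYPES

print(should_normalize("character varying"))  # True
print(should_normalize("integer"))            # False
```

Checking `information_schema` before acting is the same defensive pattern the migration_log block uses; both keep the migration idempotent across divergent database states.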
Some files were not shown because too many files have changed in this diff.