Compare commits: f059cb6c95...25168108d6 (3 commits)

| SHA1 |
|---|
| 25168108d6 |
| ef171c7573 |
| 4b467aeeec |
RELEASE_NOTES_v2.1.0.md (new file, 24 lines)
@@ -0,0 +1,24 @@
# Release Notes v2.1.0

## New Features

- **Email Drag-and-Drop Upload**: Upload .msg and .eml files directly to opportunities by dragging them onto the email drop zone
- **Multiple Email Linking**: Link multiple emails to a single opportunity with search and persistent storage
- **Contact Persons Management**: Add, link, and manage contact persons for opportunities with roles and search functionality
- **File Uploads**: Upload files to opportunity comments and contract sections with drag-and-drop support
- **Utility Company Lookup**: Automatically look up electricity suppliers for customer addresses via the Elnet API
- **UI Reorganization**: Moved pipeline status to the top left for better visibility in the opportunity detail view
- **Email HTML Rendering**: Display HTML email bodies in the email viewer

## Technical Changes

- Added many-to-many relationships for opportunity emails and contacts
- New database tables: pipeline_opportunity_emails, pipeline_opportunity_contacts, pipeline_opportunity_comment_attachments, pipeline_opportunity_contract_files
- Enhanced email processing to support .msg and .eml file uploads
- Improved file handling with size limits and type validation
- Updated customer detail page with utility company information

## Fixes

- Fixed API_BASE path issues in the opportunity detail page
- Improved email attachment handling and display

---

Release Date: January 29, 2026
RELEASE_NOTES_v2.1.1.md (new file, 26 lines)
@@ -0,0 +1,26 @@
# BMC Hub v2.1.1 - Bug Fix Release

**Release Date:** January 29, 2026

## 🐛 Bug Fixes

### Migrations Interface

- **Fixed container runtime detection**: Production servers using Podman now show correct commands instead of Docker commands
- **Updated migration command display**: The frontend now correctly shows `podman exec` commands for production environments
- **Improved user experience**: Added container runtime information to the standard setup section

## 🔧 Technical Changes

- Updated `app/settings/frontend/migrations.html` to detect the production environment and use the appropriate container runtime
- Modified `app/settings/backend/views.py` to pass a production-environment flag to the template
- Container runtime detection is based on hostname (production vs. localhost/127.0.0.1)

## 📋 Deployment Notes

This is a frontend-only change that fixes the migration interface display. No database changes are required.

## ✅ Verification

- The migration page now shows correct Podman commands on production servers
- Local development still uses Docker commands
- Migration execution via the web interface continues to work as before
SAG_MODULE_PLAN.md (new file, 492 lines)
@@ -0,0 +1,492 @@
# Implementation Plan: The Sag Module (Case Module)

## Overview - What is a "Sag"?

The **Sag module** is the heart of BMC Hub's new relation and process management system. Instead of having separate systems for "tickets", "tasks", and "orders", we have one universal entity: **a Sag (case)**.

### The core idea (very important!)

> **There is only one thing: a Sag. Tickets, tasks, and orders are simply cases with different relations, tags, and modules.**

**Examples:**

1. **A customer calls and needs a new monitor**
   - This is a *Sag* (ticket type, tag: `support`)
   - It gets the tag `urgent` because it is express

2. **Purchasing the monitor from a supplier**
   - This is also a *Sag* (order type, tag: `indkøb`)
   - It is *related to* the first case as "derived_from"
   - Responsible: the purchasing manager

3. **Repacking and shipping the monitor**
   - This is a *Sag* (task type, tag: `ompakning`)
   - It is related to the purchase case as "executes_for"
   - Responsible: a warehouse employee
   - Deadline: today

**All three are the same data type in the database.** The difference is:
- Which *tags* they have
- Which *customer/contact* they are linked to
- Which *relations* they have to other cases
- Who is *responsible*
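The "one record shape for every kind of work" idea can be sketched in a few lines of Python. This is illustrative only: the field names mirror the `sag_sager`/`sag_tags` schema described later in this plan, and the `make_sag` helper is hypothetical, not part of the codebase.

```python
# Illustrative only: three very different pieces of work, one uniform record shape.
# Field names mirror the sag_sager/sag_tags schema described in this plan.

def make_sag(titel, tags, relationer=None, ansvarlig=None):
    """Build a case record; only metadata differs between ticket, order, and task."""
    return {
        "titel": titel,
        "status": "åben",
        "tags": tags,
        "relationer": relationer or [],
        "ansvarlig": ansvarlig,
    }

ticket = make_sag("Kunde skal have ny skærm", ["support", "urgent"])
ordre = make_sag("Indkøb af skærm", ["indkøb"],
                 relationer=[("derived_from", "ticket")], ansvarlig="indkøbschef")
opgave = make_sag("Ompakning og afsendelse", ["ompakning"],
                  relationer=[("executes_for", "ordre")], ansvarlig="lager")

# Same shape everywhere: filtering by tag is all it takes to get a "ticket view".
tickets = [s for s in (ticket, ordre, opgave) if "support" in s["tags"]]
```

There is no `type` column doing the work here; the tag list and the relation list carry all the meaning.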
### What does this mean for the system?

**Without the Sag module:**
- You need a "Ticket section" for support
- You need a "Task section" for tasks
- You need an "Order section" for orders
- They do not talk to each other naturally
- Data duplication
- Complex logic

**With the Sag module:**
- One API endpoint: `/api/v1/cases`
- One UI area: "Sager" with intelligent filters
- Relations are first-class citizens (see what belongs together)
- Tags drive the flow (e.g. "support" + "urgent" = prioritized)
- Cases can "grow": start as a ticket → become an order → become an installation
- Everything is searchable and filterable across domains

---
## Technical Architecture

### Database Structure

The Sag module uses three main tables (with the `sag_` prefix):

#### **sag_sager** (main table for cases)
```
id (primary key)
titel (VARCHAR) - short name of the case
beskrivelse (TEXT) - detailed description
template_key (VARCHAR) - structure template (e.g. "ticket", "opgave", "ordre") - default NULL
status (VARCHAR) - "åben" or "lukket"
customer_id (foreign key) - which customer the case concerns - NULLABLE
ansvarlig_bruger_id (foreign key) - who should handle it
created_by_user_id (foreign key) - who created the case
deadline (TIMESTAMP) - when it must be done
created_at (TIMESTAMP)
updated_at (TIMESTAMP)
deleted_at (TIMESTAMP) - soft delete: set when the case is "deleted"
```

**Soft delete:** When you delete a case, `deleted_at` is set to the current time. The case is not removed from the database. This means:
- You can recover data if the module is deactivated
- History is preserved (audit trail)
- Relations stay intact if you restore the case
#### **sag_relationer** (how cases connect)
```
id (primary key)
kilde_sag_id (foreign key) - the case the relation STARTS from (direction: from this one)
målsag_id (foreign key) - the case the relation POINTS TO (direction: to this one)
relationstype (VARCHAR) - e.g. "parent_of", "child_of", "derived_from", "blocks", "executes_for"
created_at (TIMESTAMP)
deleted_at (TIMESTAMP) - soft delete
```

**Example (directional):**
- Sag 1 (customer call) → Sag 5 (monitor purchase)
- kilde_sag_id: 1, målsag_id: 5
- relationstype: "derived_from" or "parent_of"
- Meaning: "Sag 1 is the parent of / generates Sag 5"

**Note:** Relations are one-directional. For bidirectional links, two relation rows are created (1→5 and 5→1).
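The directional model can be demonstrated end to end with an in-memory database. This is a sketch only: SQLite stands in for the production Postgres, and the real schema lives in `migrations/001_init.sql`.

```python
import sqlite3

# SQLite stands in for Postgres here; the real DDL lives in migrations/001_init.sql.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sag_relationer (
    id INTEGER PRIMARY KEY,
    kilde_sag_id INTEGER NOT NULL,
    målsag_id INTEGER NOT NULL,
    relationstype TEXT NOT NULL,
    deleted_at TEXT
)""")

def relate(kilde, mål, relationstype):
    """Insert one directional relation row (kilde -> mål)."""
    con.execute(
        "INSERT INTO sag_relationer (kilde_sag_id, målsag_id, relationstype) "
        "VALUES (?, ?, ?)",
        (kilde, mål, relationstype),
    )

# Sag 1 (customer call) generates Sag 5 (monitor purchase).
# A bidirectional link is simply two rows:
relate(1, 5, "parent_of")
relate(5, 1, "child_of")

# "What does Sag 1 point to?" only needs to look at kilde_sag_id,
# and always filters out soft-deleted rows:
related = con.execute(
    "SELECT målsag_id, relationstype FROM sag_relationer "
    "WHERE kilde_sag_id = ? AND deleted_at IS NULL", (1,)
).fetchall()
```

The same query with `kilde_sag_id = 5` would return the inverse `child_of` row, which is what the detail view's "Parent case" section would read.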
#### **sag_tags** (how we categorize cases)
```
id (primary key)
sag_id (foreign key) - which case the tag belongs to
tag_navn (VARCHAR) - e.g. "support", "urgent", "vip", "ompakning"
state (VARCHAR) - "aktiv" or "inaktiv" - default "aktiv"
closed_at (TIMESTAMP) - when the tag was closed/deactivated - NULLABLE
created_at (TIMESTAMP)
deleted_at (TIMESTAMP) - soft delete
```

**Tags are used for:**
- Filtering: "Show all cases with tag = support"
- Workflow: "Cases with tag = urgent must be resolved today"
- Categorization: "All cases with tag = ompakning"
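The `state`/`closed_at` pair on a tag can be sketched as follows. This is illustrative: the dicts mirror `sag_tags` columns, and the two helper functions are hypothetical, not existing code.

```python
from datetime import datetime, timezone

# Tag rows as plain dicts, mirroring the sag_tags columns; illustrative only.
def add_tag(sag_id, tag_navn):
    return {"sag_id": sag_id, "tag_navn": tag_navn,
            "state": "aktiv", "closed_at": None, "deleted_at": None}

def close_tag(tag):
    """Deactivate a tag: flip state and stamp closed_at (the schema's state/closed_at pair)."""
    tag["state"] = "inaktiv"
    tag["closed_at"] = datetime.now(timezone.utc)
    return tag

tags = [add_tag(1, "support"), add_tag(1, "urgent")]
close_tag(tags[1])  # the urgency has passed, but the history is kept

# Filtering: "show all cases that currently have an active support tag"
active_support = [t["sag_id"] for t in tags
                  if t["tag_navn"] == "support"
                  and t["state"] == "aktiv"
                  and t["deleted_at"] is None]
```

Note the distinction: closing a tag (`state` + `closed_at`) records that it once applied, while soft delete (`deleted_at`) means it should be treated as if it never existed.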
### API Endpoints

**Case CRUD:**
- `GET /api/v1/cases` - List all cases (filter by tags, status, responsible user)
- `POST /api/v1/cases` - Create a new case
- `GET /api/v1/cases/{id}` - Show details for a case
- `PATCH /api/v1/cases/{id}` - Update a case
- `DELETE /api/v1/cases/{id}` - Delete a case (soft delete, sets deleted_at)

**Relations:**
- `GET /api/v1/cases/{id}/relations` - Show all related cases
- `POST /api/v1/cases/{id}/relations` - Add a relation to another case
- `DELETE /api/v1/cases/{id}/relations/{relation_id}` - Remove a relation

**Tags:**
- `GET /api/v1/cases/{id}/tags` - Show all tags on the case
- `POST /api/v1/cases/{id}/tags` - Add a tag
- `DELETE /api/v1/cases/{id}/tags/{tag_id}` - Remove a tag
### UI Concept

**Case list** (`/cases`):
- All your cases in one place
- Filters: "My cases", "Open cases", "Cases with tag=support", "Cases with tag=urgent"
- Search bar
- Sorting by deadline, creation time, status

**Case details** (`/cases/{id}`):
- Main info: title, description, status, deadline
- **Related cases**: sections such as:
  - "Parent case" (if this case is part of something bigger)
  - "Child cases" (cases derived from this one)
  - "Blocked by" (cases holding this one up)
  - "Executes for" (if this is an execution case for something else)
- **Tags**: displayed tags, with the option to add more
- **Responsible**: who should handle it
- **History**: if the module gets an activity log later

**Design:**
- Nordic Top minimalist design
- Dark mode support
- Responsive (mobile friendly)
- Intuitive navigation between related cases

---
## Implementation Plan - Step by Step

### Phase 1: Module structure (preparation)

#### Step 1.1: Create the module folder
```
app/modules/sag/
├── module.json          # Module metadata
├── README.md            # Documentation
├── backend/
│   ├── __init__.py
│   └── router.py        # FastAPI endpoints
├── frontend/
│   ├── __init__.py
│   └── views.py         # HTML views
├── templates/
│   ├── index.html       # Case list
│   └── detail.html      # Case details
└── migrations/
    └── 001_init.sql     # Database schema
```

#### Step 1.2: Create module.json
```json
{
  "name": "sag",
  "version": "1.0.0",
  "description": "Universal case handling - tickets, tasks, and orders as cases with relations",
  "author": "BMC Networks",
  "enabled": true,
  "dependencies": [],
  "table_prefix": "sag_",
  "api_prefix": "/api/v1/cases",
  "tags": ["Sag", "Case Management"],
  "config": {
    "safety_switches": {
      "read_only": false,
      "dry_run": false
    }
  }
}
```
### Phase 2: Database setup

#### Step 2.1: Create migrations/001_init.sql

The SQL migration defines the tables for cases, relations, and tags. See `migrations/001_init.sql` for details.

**Key points:**
- All table names start with `sag_`
- Soft delete: a `deleted_at` column checked with `WHERE deleted_at IS NULL`
- Foreign keys to `customers` to link to customer data
- Indexes for performance
- Triggers to auto-update `updated_at`

**Example query (queries filter out soft-deleted rows):**
```sql
SELECT * FROM sag_sager
WHERE customer_id = %s
  AND deleted_at IS NULL
ORDER BY created_at DESC;
```
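The soft-delete pattern from Phase 2 can be exercised in miniature. A sketch, with SQLite standing in for Postgres and a trimmed-down `sag_sager` table; the production schema in `migrations/001_init.sql` has more columns.

```python
import sqlite3

# Minimal in-memory demo of the soft-delete pattern.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sag_sager (
    id INTEGER PRIMARY KEY,
    titel TEXT,
    customer_id INTEGER,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP,
    deleted_at TEXT
)""")
con.execute("INSERT INTO sag_sager (titel, customer_id) VALUES (?, ?)",
            ("Ny skærm", 1))
con.execute("INSERT INTO sag_sager (titel, customer_id) VALUES (?, ?)",
            ("Gammel sag", 1))

# "Delete" case 2: set deleted_at instead of removing the row.
con.execute("UPDATE sag_sager SET deleted_at = CURRENT_TIMESTAMP WHERE id = ?", (2,))

# Every list query filters on deleted_at IS NULL:
rows = con.execute(
    "SELECT titel FROM sag_sager WHERE customer_id = ? AND deleted_at IS NULL "
    "ORDER BY created_at DESC", (1,)
).fetchall()

# The soft-deleted row is still physically present for audit/restore:
total = con.execute("SELECT COUNT(*) FROM sag_sager").fetchone()[0]
```

Restoring a case is then just `UPDATE sag_sager SET deleted_at = NULL WHERE id = ?`, which is exactly why deactivating the module loses no data.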
### Phase 3: Backend API

#### Step 3.1: Create backend/router.py

Implement all 9 API endpoints using these patterns:

**GET /cases (list):**
```python
@router.get("/cases")
async def list_sager(
    status: str = None,
    tag: str = None,
    customer_id: int = None,
    ansvarlig_bruger_id: int = None
):
    # Build the query with WHERE deleted_at IS NULL
    # Filter by the given parameters
    # Return the list
```

**POST /cases (create):**
```python
@router.post("/cases")
async def create_sag(sag_data: dict):
    # Validate input
    # INSERT INTO sag_sager
    # RETURNING *
    # Return the new case
```

**GET /cases/{id}:**
```python
@router.get("/cases/{id}")
async def get_sag(id: int):
    # SELECT * FROM sag_sager WHERE id = %s AND deleted_at IS NULL
    # If not found: HTTPException(404)
    # Return the case details
```

**PATCH /cases/{id} (update):**
```python
@router.patch("/cases/{id}")
async def update_sag(id: int, updates: dict):
    # UPDATE sag_sager SET ... WHERE id = %s
    # updated_at is set automatically via trigger
    # Return the updated case
```

**DELETE /cases/{id} (soft delete):**
```python
@router.delete("/cases/{id}")
async def delete_sag(id: int):
    # UPDATE sag_sager SET deleted_at = NOW() WHERE id = %s
    # Return success
```

**Relation endpoints:** same pattern for `/cases/{id}/relations`

**Tag endpoints:** same pattern for `/cases/{id}/tags`

**Important patterns:**
- Always use `execute_query()` from `app.core.database`
- Parameterized queries (`%s` placeholders)
- `RealDictCursor` for dict-like row access
- Filter `WHERE deleted_at IS NULL` in every SELECT query
- Export the router as `router` (the module loader looks for this name)
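The list-endpoint pseudocode above hides the one fiddly part: building the filtered query while keeping it parameterized and keeping the `deleted_at IS NULL` guard in every branch. A self-contained sketch of that query builder (the table/column names follow this plan; the function itself is hypothetical):

```python
def build_list_query(status=None, tag=None, customer_id=None,
                     ansvarlig_bruger_id=None):
    """Build the parameterized list query for GET /cases.

    Every branch keeps the deleted_at IS NULL guard, and all user input
    goes into the params tuple, never into the SQL string itself.
    """
    sql = ["SELECT s.* FROM sag_sager s"]
    where = ["s.deleted_at IS NULL"]
    params = []
    if tag is not None:
        # Tag filtering needs a join; soft-deleted tags must not match either.
        sql.append("JOIN sag_tags t ON t.sag_id = s.id AND t.deleted_at IS NULL")
        where.append("t.tag_navn = %s")
        params.append(tag)
    for col, val in (("s.status", status),
                     ("s.customer_id", customer_id),
                     ("s.ansvarlig_bruger_id", ansvarlig_bruger_id)):
        if val is not None:
            where.append(col + " = %s")
            params.append(val)
    sql.append("WHERE " + " AND ".join(where))
    sql.append("ORDER BY s.created_at DESC")
    return " ".join(sql), tuple(params)
```

The router would then hand the pair straight to `execute_query(query, params)` from `app.core.database`.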
### Phase 4: Frontend views

#### Step 4.1: Create frontend/views.py

```python
from fastapi import APIRouter, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

router = APIRouter()
templates = Jinja2Templates(directory="app/modules/sag/templates")

@router.get("/cases", response_class=HTMLResponse)
async def cases_liste(request: Request):
    # Fetch cases from the API
    return templates.TemplateResponse("index.html", {"request": request, "cases": ...})

@router.get("/cases/{id}", response_class=HTMLResponse)
async def sag_detaljer(request: Request, id: int):
    # Fetch the case plus its relations and tags
    return templates.TemplateResponse("detail.html", {"request": request, "sag": ..., "relationer": ...})
```
### Phase 5: Frontend templates

#### Step 5.1: Create templates/index.html

The case list, with:
- Search bar
- Filter buttons (status, tags, responsible user)
- Table/card view of all cases
- Clickable cases that navigate to `/cases/{id}`
- Nordic Top design with dark mode

#### Step 5.2: Create templates/detail.html

The case details, with:
- Main info: title, description, status, deadline, responsible user
- Sections: "Related cases", "Tags", "Activity" (if implemented later)
- Button to edit the case
- Button to add a relation
- Button to add a tag
- The ability to view and delete relations/tags
### Phase 6: Testing and activation

#### Step 6.1: Test the database
```bash
docker compose exec db psql -U bmc_admin -d bmc_hub -c "SELECT * FROM sag_sager;"
```

#### Step 6.2: Test the API endpoints
```bash
# Create a case
curl -X POST http://localhost:8001/api/v1/cases \
  -H "Content-Type: application/json" \
  -d '{"titel": "Test sag", "customer_id": 1}'

# Fetch a case
curl http://localhost:8001/api/v1/cases/1

# Fetch the case list
curl http://localhost:8001/api/v1/cases
```

#### Step 6.3: Test the frontend
- Visit http://localhost:8001/cases
- View the case list
- Click a case → see the details
- Add a tag and a relation

#### Step 6.4: Test soft delete
- Delete a case via `DELETE /cases/{id}`
- Check the database: `deleted_at` must be set
- Verify it no longer appears in the list endpoints

#### Step 6.5: Test module deactivation
- Edit `module.json`: set `"enabled": false`
- Restart Docker: `docker compose restart api`
- Visit http://localhost:8001/cases → 404
- Visit http://localhost:8001/api/v1/cases → 404
- Revert: `"enabled": true`, restart, and verify everything works again
### Phase 7: Documentation

#### Step 7.1: Create README.md in the module
Document:
- What the module does
- API endpoints with examples
- The database schema
- How to use relations and tags
- Example workflows

---
## Key Principles During Implementation

### 1. **Soft delete first**
All `DELETE` operations set `deleted_at` to `NOW()` instead of deleting physically. This means:
- Data is preserved if the module is deactivated
- The audit trail is preserved
- Relations stay intact

### 2. **Always filter deleted_at**
Every SELECT query must include:
```sql
WHERE deleted_at IS NULL
```

Exception: admin pages that need to see "deleted history" (implemented later).

### 3. **Foreign keys to customers**
Cases should be linked to a `customer_id` whenever they concern a customer (the column is nullable in the schema). This makes it possible to:
- Build customer-specific views later
- Ensure data isolation
- Track who the cases are about

### 4. **Relations are data**
Relations are not just links - they are database records of their own, with a type and soft delete. This means:
- You can see the full history of relations
- You can "restore" relations if they are deleted
- Relation types are configurable

### 5. **Tags drive visibility**
Tags are used for:
- UI filters: "Show only cases with tag=urgent"
- Workflow: "Cases with tag=support must have an SLA"
- Categorization: "Everything with tag=ompakning"

---
## What Comes Next?

Once the Sag module is live, you can:

1. **Convert tickets to cases** - A migration script that turns old tickets into cases
2. **Convert tasks to cases** - Same pattern
3. **Add an activity log** - "Who changed what, and when" on every case
4. **Integrate with e-conomic** - When a case gets tag=faktura, it is created as an order in e-conomic
5. **Add workflow configuration** - "If status=i_gang and tag=urgent, send a reminder every day"
6. **Add dependencies** - "Case B cannot start before Case A is done"
7. **Add SLA tracking** - "Support cases must be resolved within 24 hours"

But first: **get the foundation in place with this module implementation.**

---
## Commands to Get Started

```bash
# Go to the workspace
cd /Users/christianthomas/DEV/bmc_hub_dev

# See where we are
docker compose ps -a

# Start the dev environment if it is not running
docker compose up -d

# Tail the logs
docker compose logs -f api

# After writing the code: restart the API
docker compose restart api

# Check that the module loaded
docker compose logs api | grep -i "sag"

# Manually inspect the database migration
docker compose exec db psql -U bmc_admin -d bmc_hub -c "\dt sag_*"
```

---
## Time Estimate

- **Phases 1-2 (module + database)**: 30 min
- **Phase 3 (backend API)**: 1-2 hours
- **Phases 4-5 (frontend)**: 1-2 hours
- **Phase 6 (testing)**: 30 min
- **Phase 7 (documentation)**: 30 min

**Total: 4-6 hours**

---
## TL;DR - For the Implementer

1. Create `app/modules/sag/` with the standard structure
2. Create `module.json` with `"enabled": true`
3. Create `migrations/001_init.sql` with 3 tables (`sag_sager`, `sag_relationer`, `sag_tags`)
4. Implement 9 API endpoints in `backend/router.py` (all queries filter on `deleted_at IS NULL`)
5. Implement 2 HTML views in `frontend/views.py` (list + details)
6. Create 2 templates in `templates/` (index.html + detail.html)
7. Test the endpoints and the UI
8. Verify that soft delete works
9. Verify that the module can be deactivated with data preserved
10. Write README.md

**The module is loaded automatically by the system - no manual registration is needed.**
@@ -20,6 +20,10 @@ class Settings(BaseSettings):
     API_PORT: int = 8000
     API_RELOAD: bool = False
     ENABLE_RELOAD: bool = False  # Added to match docker-compose.yml

     # Elnet supplier lookup
     ELNET_API_BASE_URL: str = "https://api.elnet.greenpowerdenmark.dk/api"
     ELNET_TIMEOUT_SECONDS: int = 12

     # Security
     SECRET_KEY: str = "dev-secret-key-change-in-production"

@@ -70,7 +74,7 @@ class Settings(BaseSettings):
     EMAIL_PROCESS_INTERVAL_MINUTES: int = 5
     EMAIL_WORKFLOWS_ENABLED: bool = True
     EMAIL_MAX_UPLOAD_SIZE_MB: int = 50  # Max file size for email uploads
-    ALLOWED_EXTENSIONS: List[str] = [".pdf", ".jpg", ".jpeg", ".png", ".gif", ".doc", ".docx", ".xls", ".xlsx", ".zip"]  # Allowed file extensions for uploads
+    ALLOWED_EXTENSIONS: List[str] = ["pdf", "jpg", "jpeg", "png", "gif", "doc", "docx", "xls", "xlsx", "zip"]  # Allowed file extensions for uploads

     # vTiger Cloud Integration
     VTIGER_ENABLED: bool = False
@@ -8,8 +8,12 @@ from fastapi import APIRouter, HTTPException, Query
from typing import List, Optional, Dict
from pydantic import BaseModel
import logging
import asyncio
import aiohttp
from urllib.parse import quote

from app.core.database import execute_query, execute_query_single, execute_update
from app.core.config import settings
from app.services.cvr_service import get_cvr_service
from app.services.customer_activity_logger import CustomerActivityLogger
from app.services.customer_consistency import CustomerConsistencyService
@@ -403,6 +407,85 @@ async def get_customer(customer_id: int):


@router.get("/customers/{customer_id}/utility-company")
async def get_customer_utility_company(customer_id: int):
    """Look up the utility company (netselskab) for a customer's address via the Elnet API"""
    customer = execute_query_single(
        "SELECT address, city, postal_code FROM customers WHERE id = %s",
        (customer_id,)
    )

    if not customer:
        raise HTTPException(status_code=404, detail="Customer not found")

    address = customer.get('address')
    if not address:
        raise HTTPException(status_code=400, detail="Kunde har ikke en adresse")

    components = [address.strip()]
    city_parts = []
    if customer.get('postal_code'):
        city_parts.append(customer['postal_code'].strip())
    if customer.get('city'):
        city_parts.append(customer['city'].strip())
    if city_parts:
        components.append(' '.join(city_parts))

    search_address = ", ".join([comp for comp in components if comp])
    if not search_address:
        raise HTTPException(status_code=400, detail="Ugyldig adressedata")

    base_url = settings.ELNET_API_BASE_URL.rstrip('/')
    lookup_url = f"{base_url}/supplierlookup/{quote(search_address, safe='')}"
    timeout_seconds = settings.ELNET_TIMEOUT_SECONDS

    try:
        timeout = aiohttp.ClientTimeout(total=timeout_seconds)
        async with aiohttp.ClientSession(timeout=timeout) as session:
            async with session.get(lookup_url) as response:
                if response.status == 404:
                    return {
                        "customer_id": customer_id,
                        "address": search_address,
                        "found": False,
                        "message": "Ingen netselskab matchede adressen"
                    }

                if response.status != 200:
                    detail = await response.text()
                    logger.warning(
                        "⚠️ Elnet returned %s for %s (%s)",
                        response.status,
                        customer_id,
                        detail[:200]
                    )
                    raise HTTPException(status_code=502, detail="Fejl fra netselskabs-API")

                payload = await response.json()
    except asyncio.TimeoutError as exc:
        logger.error("❌ Elnet request timed out for customer %s: %s", customer_id, exc)
        raise HTTPException(status_code=504, detail="Timeout ved forespørgsel til netselskabet")
    except aiohttp.ClientError as exc:
        logger.error("❌ Elnet request failed for customer %s: %s", customer_id, exc)
        raise HTTPException(status_code=502, detail="Kunne ikke kontakte netselskabets API")
    except HTTPException:
        # Re-raise the 502 from the status check above instead of masking it as a 500
        raise
    except Exception as exc:
        logger.error("❌ Unexpected error fetching netselskab for %s: %s", customer_id, exc)
        raise HTTPException(status_code=500, detail="Fejl ved forespørgsel til netselskabet")

    supplier = {
        "def": payload.get("def"),
        "name": payload.get("name"),
        "phone": payload.get("phone"),
        "website": payload.get("website")
    }

    return {
        "customer_id": customer_id,
        "address": search_address,
        "found": bool(payload and payload.get("name")),
        "supplier": supplier
    }


@router.post("/customers")
async def create_customer(customer: CustomerCreate):
    """Create a new customer"""
@@ -398,6 +398,14 @@
                <div class="col-12">
                    <div class="info-card">
                        <h5 class="fw-bold mb-3">Integration</h5>
                        <div class="info-row">
                            <span class="info-label">El-selskab</span>
                            <span class="info-value" id="utilityCompanyName">-</span>
                        </div>
                        <div class="info-row">
                            <span class="info-label">Kontakt</span>
                            <span class="info-value text-muted small" id="utilityCompanyContact">-</span>
                        </div>
                        <div class="row">
                            <div class="col-md-6">
                                <div class="info-row">
@@ -924,6 +932,8 @@ async function loadCustomer() {
        customerData = await response.json();
        displayCustomer(customerData);

        await loadUtilityCompany();

        // Check data consistency
        await checkDataConsistency();
    } catch (error) {
@@ -1015,6 +1025,78 @@ function displayCustomer(customer) {
    document.getElementById('createdAt').textContent = new Date(customer.created_at).toLocaleString('da-DK');
}

async function loadUtilityCompany() {
    const nameEl = document.getElementById('utilityCompanyName');
    const contactEl = document.getElementById('utilityCompanyContact');
    if (!nameEl || !contactEl) return;

    if (!customerData?.address) {
        nameEl.textContent = 'Ingen adresse angivet';
        contactEl.textContent = 'Tilføj adresse for at hente netselskab';
        return;
    }

    nameEl.innerHTML = '<span class="spinner-border spinner-border-sm text-primary me-2" role="status" aria-hidden="true"></span>Henter netselskab...';
    contactEl.textContent = '';

    try {
        const response = await fetch(`/api/v1/customers/${customerId}/utility-company`);
        let payload = {};
        try {
            payload = await response.json();
        } catch (err) {
            console.warn('Utility payload could not be parsed', err);
        }

        if (!response.ok) {
            throw new Error(payload.detail || 'Fejl ved netselskabsopslag');
        }

        if (!payload.found) {
            nameEl.textContent = 'Netselskab ikke fundet';
            contactEl.textContent = payload.message || 'Kontroller adressen og prøv igen';
            return;
        }

        displayUtilityCompany(payload);
    } catch (error) {
        console.error('Error fetching utility company:', error);
        nameEl.textContent = 'Kunne ikke hente netselskab';
        contactEl.textContent = 'Prøv igen senere';
    }
}

function displayUtilityCompany(payload) {
    const nameEl = document.getElementById('utilityCompanyName');
    const contactEl = document.getElementById('utilityCompanyContact');
    if (!nameEl || !contactEl) return;

    const supplier = payload?.supplier;
    if (!supplier) {
        nameEl.textContent = 'Netselskab ikke fundet';
        contactEl.textContent = payload?.message || 'Ingen data fra API';
        return;
    }

    nameEl.textContent = supplier.name || 'Ukendt netselskab';

    const contactPieces = [];
    if (supplier.phone) {
        contactPieces.push(`Tlf. ${escapeHtml(supplier.phone)}`);
    }

    if (supplier.website) {
        const normalized = supplier.website.toLowerCase().startsWith('http')
            ? supplier.website
            : `https://${supplier.website}`;
        const href = escapeHtml(normalized);
        const label = escapeHtml(supplier.website);
        contactPieces.push(`<a href="${href}" target="_blank" rel="noreferrer noopener">${label}</a>`);
    }

    contactEl.innerHTML = contactPieces.length > 0 ? contactPieces.join(' • ') : 'Ingen kontaktinfo';
}

async function loadContacts() {
    const container = document.getElementById('contactsContainer');
    container.innerHTML = '<div class="col-12 text-center py-5"><div class="spinner-border text-primary"></div></div>';
@@ -177,6 +177,7 @@ async def list_emails(
            em.id, em.message_id, em.subject, em.sender_email, em.sender_name,
            em.received_date, em.classification, em.confidence_score, em.status,
            em.is_read, em.has_attachments, em.attachment_count,
            em.body_text, em.body_html,
            er.name as rule_name,
            v.name as supplier_name,
            NULL as customer_name
@@ -1740,10 +1740,15 @@ function renderEmailDetail(email) {
        </div>

        <div class="email-body">
-            ${email.body_html ? `<iframe srcdoc="${email.body_html.replace(/"/g, '&quot;')}"></iframe>` :
+            ${email.body_html ? `<div class="email-html-body"></div>` :
            `<pre style="white-space: pre-wrap; font-family: inherit;">${escapeHtml(email.body_text || 'Ingen indhold')}</pre>`}
        </div>
    `;
+    // If HTML, inject it as innerHTML after rendering
+    if (email.body_html) {
+        const htmlDiv = pane.querySelector('.email-html-body');
+        if (htmlDiv) htmlDiv.innerHTML = email.body_html;
+    }
}

function renderEmailAnalysis(email) {
app/modules/sag/README.md (new file, 177 lines)
@@ -0,0 +1,177 @@
|
||||
# Sag Module - Case Management

## Overview

The Sag module implements a universal case-management system in which tickets, tasks, and orders are simply cases with different tags and relations.

**Core idea:** There is only one thing: a Sag (case). Everything else is metadata, tags, and relations.

## Database Schema

### sag_sager (Main table)
- `id` - Primary key
- `titel` - Case title
- `beskrivelse` - Detailed description
- `type` - Case type (ticket, opgave, ordre, etc.)
- `status` - Status (åben, i_gang, afsluttet, on_hold)
- `customer_id` - Foreign key to the customers table
- `ansvarlig_bruger_id` - Assigned user
- `deadline` - Due date
- `created_at` - Creation timestamp
- `updated_at` - Last update (auto-updated via trigger)
- `deleted_at` - Soft-delete timestamp (NULL = active)

### sag_relationer (Relations)
- `id` - Primary key
- `kilde_sag_id` - Source case
- `målsag_id` - Target case
- `relationstype` - Relation type (forælder, barn, afledt_af, blokkerer, udfører_for)
- `created_at` - Creation timestamp
- `deleted_at` - Soft-delete timestamp

### sag_tags (Tags)
- `id` - Primary key
- `sag_id` - Case reference
- `tag_navn` - Tag name (support, urgent, vip, ompakning, etc.)
- `created_at` - Creation timestamp
- `deleted_at` - Soft-delete timestamp

## API Endpoints

### Cases CRUD

**List cases**
```
GET /api/v1/sag?status=åben&tag=support&customer_id=1
```

**Create case**
```
POST /api/v1/sag
Content-Type: application/json

{
    "titel": "Skærm mangler",
    "beskrivelse": "Kunde har brug for ny skærm",
    "type": "ticket",
    "customer_id": 1,
    "status": "åben"
}
```

**Get case**
```
GET /api/v1/sag/1
```

**Update case**
```
PATCH /api/v1/sag/1
Content-Type: application/json

{
    "status": "i_gang",
    "ansvarlig_bruger_id": 5
}
```

**Delete case (soft)**
```
DELETE /api/v1/sag/1
```

### Relations

**Get relations**
```
GET /api/v1/sag/1/relationer
```

**Add relation**
```
POST /api/v1/sag/1/relationer
Content-Type: application/json

{
    "målsag_id": 2,
    "relationstype": "afledt_af"
}
```

**Delete relation**
```
DELETE /api/v1/sag/1/relationer/5
```

### Tags

**Get tags**
```
GET /api/v1/sag/1/tags
```

**Add tag**
```
POST /api/v1/sag/1/tags
Content-Type: application/json

{
    "tag_navn": "urgent"
}
```

**Delete tag**
```
DELETE /api/v1/sag/1/tags/3
```

## Frontend Routes

- `GET /sag` - List all cases with filters
- `GET /sag/{id}` - View case details
- `GET /sag/new` - Create new case (future)
- `GET /sag/{id}/edit` - Edit case (future)

## Features

✅ Soft-delete with data preservation
✅ Nordic Top design with dark mode support
✅ Responsive mobile-friendly UI
✅ Case relations (parent/child)
✅ Dynamic tagging system
✅ Full-text search
✅ Status filtering
✅ Customer tracking

## Example Workflows

### Support Ticket
1. Customer calls → Create a Sag with type="ticket", tag="support"
2. Urgency is high → Add tag="urgent"
3. New hardware is needed → Create a related Sag with type="ordre", relation="afledt_af"
4. Pack and ship → Create a related Sag with type="opgave", tag="ompakning"
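The four workflow steps above can be sketched as request bodies against the endpoints documented earlier. This is a minimal sketch: the payload fields mirror the endpoint examples in this README, while the concrete IDs (customer 1, cases 1 and 2) are illustrative assumptions.

```python
# Sketch of the support-ticket workflow as (method, path, body) steps.
# Field names follow the endpoint examples above; IDs are illustrative.

def sag_payload(titel, customer_id, type="ticket", status="åben", beskrivelse=""):
    """Body for POST /api/v1/sag."""
    return {"titel": titel, "beskrivelse": beskrivelse, "type": type,
            "customer_id": customer_id, "status": status}

def relation_payload(målsag_id, relationstype):
    """Body for POST /api/v1/sag/{id}/relationer."""
    return {"målsag_id": målsag_id, "relationstype": relationstype}

def tag_payload(tag_navn):
    """Body for POST /api/v1/sag/{id}/tags."""
    return {"tag_navn": tag_navn}

# The workflow, assuming the ticket gets id 1 and the order gets id 2:
ticket_flow = [
    ("POST", "/api/v1/sag", sag_payload("Skærm mangler", customer_id=1)),
    ("POST", "/api/v1/sag/1/tags", tag_payload("urgent")),
    ("POST", "/api/v1/sag", sag_payload("Ny skærm", customer_id=1, type="ordre")),
    ("POST", "/api/v1/sag/2/relationer", relation_payload(1, "afledt_af")),
]
```

Each tuple maps directly onto one HTTP call; any client (curl, `requests`, a frontend fetch) can replay the list in order.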
### Future Integrations

- Activity logging (who changed what, and when)
- e-conomic integration (auto-create orders)
- SLA tracking (response/resolution times)
- Workflow automation (auto-tags based on conditions)
- Dependency management (case B cannot start until case A is done)

## Soft-Delete Safety

All DELETE operations use soft-delete:
- Data is preserved in the database
- `deleted_at` is set to the current timestamp
- All queries filter on `WHERE deleted_at IS NULL`
- Data can be recovered if the module is disabled
- An audit trail is maintained
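The pattern reduces to two query templates. The helpers below are purely illustrative (the module writes these queries inline rather than via helpers), but they show the invariant: writes set `deleted_at`, reads exclude it.

```python
# Illustrative sketch of the soft-delete pattern used throughout the module.
# The module itself writes these queries inline; the helpers exist only to
# make the invariant explicit.

def soft_delete_query(table: str) -> str:
    """Soft-delete: set deleted_at instead of removing the row."""
    return f"UPDATE {table} SET deleted_at = NOW() WHERE id = %s RETURNING id"

def active_filter(base_query: str) -> str:
    """Every read must exclude soft-deleted rows."""
    return base_query + " WHERE deleted_at IS NULL"
```

Because the row survives, disabling the module loses nothing: clearing `deleted_at` restores the record.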
## Development Notes

- All queries use `execute_query()` from `app.core.database`
- Parameterized queries with `%s` placeholders (SQL injection prevention)
- `RealDictCursor` for dict-like row access
- Triggers maintain `updated_at` automatically
- Relations are first-class citizens (not just links)
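The parameterization note deserves one concrete illustration. `PATCH /sag/{id}` builds its UPDATE dynamically, but only column *names* from a hard-coded whitelist ever reach the SQL string; all *values* travel as `%s` parameters. A self-contained sketch of that builder (mirroring `update_sag` in `backend/router.py`):

```python
# Sketch of the whitelist-driven dynamic UPDATE used by update_sag().
# Column names come only from ALLOWED_FIELDS; values stay in params.
ALLOWED_FIELDS = ['titel', 'beskrivelse', 'type', 'status',
                  'ansvarlig_bruger_id', 'deadline']

def build_update(sag_id: int, updates: dict):
    """Return (query, params) for the PATCH, or None if nothing valid."""
    set_clauses, params = [], []
    for field in ALLOWED_FIELDS:          # whitelist blocks arbitrary columns
        if field in updates:
            set_clauses.append(f"{field} = %s")
            params.append(updates[field])
    if not set_clauses:
        return None
    params.append(sag_id)
    query = f"UPDATE sag_sager SET {', '.join(set_clauses)} WHERE id = %s RETURNING *"
    return query, tuple(params)
```

Unknown keys in the request body are silently dropped, so a client cannot update columns the module never intended to expose.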
1  app/modules/sag/backend/__init__.py  Normal file
@@ -0,0 +1 @@
"""Sag module backend."""

321  app/modules/sag/backend/router.py  Normal file
@@ -0,0 +1,321 @@
import logging
from typing import Optional

from fastapi import APIRouter, HTTPException, Query

from app.core.database import execute_query

logger = logging.getLogger(__name__)
router = APIRouter()

# ============================================================================
# SAGER - CRUD Operations
# ============================================================================

@router.get("/sag")
async def list_sager(
    status: Optional[str] = Query(None),
    tag: Optional[str] = Query(None),
    customer_id: Optional[int] = Query(None),
    ansvarlig_bruger_id: Optional[int] = Query(None),
):
    """List all cases with optional filtering."""
    try:
        query = "SELECT * FROM sag_sager WHERE deleted_at IS NULL"
        params = []

        if status:
            query += " AND status = %s"
            params.append(status)
        if customer_id:
            query += " AND customer_id = %s"
            params.append(customer_id)
        if ansvarlig_bruger_id:
            query += " AND ansvarlig_bruger_id = %s"
            params.append(ansvarlig_bruger_id)

        query += " ORDER BY created_at DESC"

        cases = execute_query(query, tuple(params))

        # If a tag filter is given, filter in Python after the fetch
        if tag:
            case_ids = [case['id'] for case in cases]
            if case_ids:
                tag_query = "SELECT sag_id FROM sag_tags WHERE tag_navn = %s AND deleted_at IS NULL"
                tagged_cases = execute_query(tag_query, (tag,))
                tagged_ids = set(t['sag_id'] for t in tagged_cases)
                cases = [c for c in cases if c['id'] in tagged_ids]

        return cases
    except Exception as e:
        logger.error("❌ Error listing cases: %s", e)
        raise HTTPException(status_code=500, detail="Failed to list cases")


@router.post("/sag")
async def create_sag(data: dict):
    """Create a new case."""
    try:
        if not data.get('titel'):
            raise HTTPException(status_code=400, detail="titel is required")
        if not data.get('customer_id'):
            raise HTTPException(status_code=400, detail="customer_id is required")

        query = """
            INSERT INTO sag_sager
                (titel, beskrivelse, type, status, customer_id, ansvarlig_bruger_id, deadline)
            VALUES (%s, %s, %s, %s, %s, %s, %s)
            RETURNING *
        """
        params = (
            data.get('titel'),
            data.get('beskrivelse', ''),
            data.get('type', 'ticket'),
            data.get('status', 'åben'),
            data.get('customer_id'),
            data.get('ansvarlig_bruger_id'),
            data.get('deadline'),
        )

        result = execute_query(query, params)
        if result:
            logger.info("✅ Case created: %s", result[0]['id'])
            return result[0]
        raise HTTPException(status_code=500, detail="Failed to create case")
    except HTTPException:
        # Re-raise the 400 validation errors instead of converting them to 500s
        raise
    except Exception as e:
        logger.error("❌ Error creating case: %s", e)
        raise HTTPException(status_code=500, detail="Failed to create case")


@router.get("/sag/{sag_id}")
async def get_sag(sag_id: int):
    """Get a specific case."""
    try:
        query = "SELECT * FROM sag_sager WHERE id = %s AND deleted_at IS NULL"
        result = execute_query(query, (sag_id,))
        if not result:
            raise HTTPException(status_code=404, detail="Case not found")
        return result[0]
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error getting case: %s", e)
        raise HTTPException(status_code=500, detail="Failed to get case")


@router.patch("/sag/{sag_id}")
async def update_sag(sag_id: int, updates: dict):
    """Update a case."""
    try:
        # Check that the case exists
        check = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
        if not check:
            raise HTTPException(status_code=404, detail="Case not found")

        # Build the dynamic update query from a field whitelist
        allowed_fields = ['titel', 'beskrivelse', 'type', 'status', 'ansvarlig_bruger_id', 'deadline']
        set_clauses = []
        params = []

        for field in allowed_fields:
            if field in updates:
                set_clauses.append(f"{field} = %s")
                params.append(updates[field])

        if not set_clauses:
            raise HTTPException(status_code=400, detail="No valid fields to update")

        params.append(sag_id)
        query = f"UPDATE sag_sager SET {', '.join(set_clauses)} WHERE id = %s RETURNING *"

        result = execute_query(query, tuple(params))
        if result:
            logger.info("✅ Case updated: %s", sag_id)
            return result[0]
        raise HTTPException(status_code=500, detail="Failed to update case")
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error updating case: %s", e)
        raise HTTPException(status_code=500, detail="Failed to update case")


@router.delete("/sag/{sag_id}")
async def delete_sag(sag_id: int):
    """Soft-delete a case."""
    try:
        check = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
        if not check:
            raise HTTPException(status_code=404, detail="Case not found")

        query = "UPDATE sag_sager SET deleted_at = NOW() WHERE id = %s RETURNING id"
        result = execute_query(query, (sag_id,))

        if result:
            logger.info("✅ Case soft-deleted: %s", sag_id)
            return {"status": "deleted", "id": sag_id}
        raise HTTPException(status_code=500, detail="Failed to delete case")
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error deleting case: %s", e)
        raise HTTPException(status_code=500, detail="Failed to delete case")


# ============================================================================
# RELATIONER - Case Relations
# ============================================================================

@router.get("/sag/{sag_id}/relationer")
async def get_relationer(sag_id: int):
    """Get all relations for a case."""
    try:
        # Check that the case exists
        check = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
        if not check:
            raise HTTPException(status_code=404, detail="Case not found")

        query = """
            SELECT sr.*,
                   ss_kilde.titel as kilde_titel,
                   ss_mål.titel as mål_titel
            FROM sag_relationer sr
            JOIN sag_sager ss_kilde ON sr.kilde_sag_id = ss_kilde.id
            JOIN sag_sager ss_mål ON sr.målsag_id = ss_mål.id
            WHERE (sr.kilde_sag_id = %s OR sr.målsag_id = %s)
              AND sr.deleted_at IS NULL
            ORDER BY sr.created_at DESC
        """
        result = execute_query(query, (sag_id, sag_id))
        return result
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error getting relations: %s", e)
        raise HTTPException(status_code=500, detail="Failed to get relations")


@router.post("/sag/{sag_id}/relationer")
async def create_relation(sag_id: int, data: dict):
    """Add a relation to another case."""
    try:
        if not data.get('målsag_id') or not data.get('relationstype'):
            raise HTTPException(status_code=400, detail="målsag_id and relationstype required")

        målsag_id = data.get('målsag_id')
        relationstype = data.get('relationstype')

        # Validate that both cases exist
        check1 = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
        check2 = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (målsag_id,))

        if not check1 or not check2:
            raise HTTPException(status_code=404, detail="One or both cases not found")

        query = """
            INSERT INTO sag_relationer (kilde_sag_id, målsag_id, relationstype)
            VALUES (%s, %s, %s)
            RETURNING *
        """
        result = execute_query(query, (sag_id, målsag_id, relationstype))

        if result:
            logger.info("✅ Relation created: %s -> %s (%s)", sag_id, målsag_id, relationstype)
            return result[0]
        raise HTTPException(status_code=500, detail="Failed to create relation")
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error creating relation: %s", e)
        raise HTTPException(status_code=500, detail="Failed to create relation")


@router.delete("/sag/{sag_id}/relationer/{relation_id}")
async def delete_relation(sag_id: int, relation_id: int):
    """Soft-delete a relation."""
    try:
        check = execute_query(
            "SELECT id FROM sag_relationer WHERE id = %s AND deleted_at IS NULL AND (kilde_sag_id = %s OR målsag_id = %s)",
            (relation_id, sag_id, sag_id)
        )
        if not check:
            raise HTTPException(status_code=404, detail="Relation not found")

        query = "UPDATE sag_relationer SET deleted_at = NOW() WHERE id = %s RETURNING id"
        result = execute_query(query, (relation_id,))

        if result:
            logger.info("✅ Relation soft-deleted: %s", relation_id)
            return {"status": "deleted", "id": relation_id}
        raise HTTPException(status_code=500, detail="Failed to delete relation")
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error deleting relation: %s", e)
        raise HTTPException(status_code=500, detail="Failed to delete relation")


# ============================================================================
# TAGS - Case Tags
# ============================================================================

@router.get("/sag/{sag_id}/tags")
async def get_tags(sag_id: int):
    """Get all tags for a case."""
    try:
        check = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
        if not check:
            raise HTTPException(status_code=404, detail="Case not found")

        query = "SELECT * FROM sag_tags WHERE sag_id = %s AND deleted_at IS NULL ORDER BY created_at DESC"
        result = execute_query(query, (sag_id,))
        return result
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error getting tags: %s", e)
        raise HTTPException(status_code=500, detail="Failed to get tags")


@router.post("/sag/{sag_id}/tags")
async def add_tag(sag_id: int, data: dict):
    """Add a tag to a case."""
    try:
        if not data.get('tag_navn'):
            raise HTTPException(status_code=400, detail="tag_navn is required")

        check = execute_query("SELECT id FROM sag_sager WHERE id = %s AND deleted_at IS NULL", (sag_id,))
        if not check:
            raise HTTPException(status_code=404, detail="Case not found")

        query = """
            INSERT INTO sag_tags (sag_id, tag_navn)
            VALUES (%s, %s)
            RETURNING *
        """
        result = execute_query(query, (sag_id, data.get('tag_navn')))

        if result:
            logger.info("✅ Tag added: %s -> %s", sag_id, data.get('tag_navn'))
            return result[0]
        raise HTTPException(status_code=500, detail="Failed to add tag")
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error adding tag: %s", e)
        raise HTTPException(status_code=500, detail="Failed to add tag")


@router.delete("/sag/{sag_id}/tags/{tag_id}")
async def delete_tag(sag_id: int, tag_id: int):
    """Soft-delete a tag."""
    try:
        check = execute_query(
            "SELECT id FROM sag_tags WHERE id = %s AND sag_id = %s AND deleted_at IS NULL",
            (tag_id, sag_id)
        )
        if not check:
            raise HTTPException(status_code=404, detail="Tag not found")

        query = "UPDATE sag_tags SET deleted_at = NOW() WHERE id = %s RETURNING id"
        result = execute_query(query, (tag_id,))

        if result:
            logger.info("✅ Tag soft-deleted: %s", tag_id)
            return {"status": "deleted", "id": tag_id}
        raise HTTPException(status_code=500, detail="Failed to delete tag")
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error deleting tag: %s", e)
        raise HTTPException(status_code=500, detail="Failed to delete tag")
1  app/modules/sag/frontend/__init__.py  Normal file
@@ -0,0 +1 @@
"""Sag module frontend."""

111  app/modules/sag/frontend/views.py  Normal file
@@ -0,0 +1,111 @@
import logging
from pathlib import Path

from fastapi import APIRouter, HTTPException, Query, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates

from app.core.database import execute_query

logger = logging.getLogger(__name__)
router = APIRouter()

# Set up the template directory
template_dir = Path(__file__).parent.parent / "templates"
templates = Jinja2Templates(directory=str(template_dir))


@router.get("/sag", response_class=HTMLResponse)
async def sager_liste(
    request: Request,
    status: str = Query(None),
    tag: str = Query(None),
    customer_id: int = Query(None),
):
    """Display the list of all cases."""
    try:
        query = "SELECT * FROM sag_sager WHERE deleted_at IS NULL"
        params = []

        if status:
            query += " AND status = %s"
            params.append(status)
        if customer_id:
            query += " AND customer_id = %s"
            params.append(customer_id)

        query += " ORDER BY created_at DESC"
        sager = execute_query(query, tuple(params))

        # Filter by tag if provided
        if tag and sager:
            tag_query = "SELECT sag_id FROM sag_tags WHERE tag_navn = %s AND deleted_at IS NULL"
            tagged = execute_query(tag_query, (tag,))
            tagged_ids = set(t['sag_id'] for t in tagged)
            sager = [s for s in sager if s['id'] in tagged_ids]

        # Fetch all distinct statuses and tags for the filter dropdowns
        statuses = execute_query("SELECT DISTINCT status FROM sag_sager WHERE deleted_at IS NULL ORDER BY status", ())
        all_tags = execute_query("SELECT DISTINCT tag_navn FROM sag_tags WHERE deleted_at IS NULL ORDER BY tag_navn", ())

        return templates.TemplateResponse("index.html", {
            "request": request,
            "sager": sager,
            "statuses": [s['status'] for s in statuses],
            "all_tags": [t['tag_navn'] for t in all_tags],
            "current_status": status,
            "current_tag": tag,
        })
    except Exception as e:
        logger.error("❌ Error displaying case list: %s", e)
        raise HTTPException(status_code=500, detail="Failed to load case list")


@router.get("/sag/{sag_id}", response_class=HTMLResponse)
async def sag_detaljer(request: Request, sag_id: int):
    """Display case details."""
    try:
        # Fetch the main case
        sag_query = "SELECT * FROM sag_sager WHERE id = %s AND deleted_at IS NULL"
        sag_result = execute_query(sag_query, (sag_id,))

        if not sag_result:
            raise HTTPException(status_code=404, detail="Case not found")

        sag = sag_result[0]

        # Fetch tags
        tags_query = "SELECT * FROM sag_tags WHERE sag_id = %s AND deleted_at IS NULL ORDER BY created_at DESC"
        tags = execute_query(tags_query, (sag_id,))

        # Fetch relations
        relationer_query = """
            SELECT sr.*,
                   ss_kilde.titel as kilde_titel,
                   ss_mål.titel as mål_titel
            FROM sag_relationer sr
            JOIN sag_sager ss_kilde ON sr.kilde_sag_id = ss_kilde.id
            JOIN sag_sager ss_mål ON sr.målsag_id = ss_mål.id
            WHERE (sr.kilde_sag_id = %s OR sr.målsag_id = %s)
              AND sr.deleted_at IS NULL
            ORDER BY sr.created_at DESC
        """
        relationer = execute_query(relationer_query, (sag_id, sag_id))

        # Fetch customer info if a customer_id is set
        customer = None
        if sag.get('customer_id'):
            customer_query = "SELECT * FROM customers WHERE id = %s"
            customer_result = execute_query(customer_query, (sag['customer_id'],))
            if customer_result:
                customer = customer_result[0]

        return templates.TemplateResponse("detail.html", {
            "request": request,
            "sag": sag,
            "customer": customer,
            "tags": tags,
            "relationer": relationer,
        })
    except HTTPException:
        raise
    except Exception as e:
        logger.error("❌ Error displaying case details: %s", e)
        raise HTTPException(status_code=500, detail="Failed to load case details")
62  app/modules/sag/migrations/001_init.sql  Normal file
@@ -0,0 +1,62 @@
-- Sag Module: Initialize case management tables
-- Supports universal case handling with relations and tags

-- Main cases table
CREATE TABLE IF NOT EXISTS sag_sager (
    id SERIAL PRIMARY KEY,
    titel VARCHAR(255) NOT NULL,
    beskrivelse TEXT,
    type VARCHAR(50) NOT NULL DEFAULT 'ticket',   -- ticket, opgave, ordre, etc.
    status VARCHAR(50) NOT NULL DEFAULT 'åben',   -- åben, i_gang, afsluttet, on_hold
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    ansvarlig_bruger_id INTEGER,
    deadline TIMESTAMP,
    created_at TIMESTAMP NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMP NOT NULL DEFAULT NOW(),
    deleted_at TIMESTAMP,
    CONSTRAINT valid_status CHECK (status IN ('åben', 'i_gang', 'afsluttet', 'on_hold'))
);

-- Relations between cases
CREATE TABLE IF NOT EXISTS sag_relationer (
    id SERIAL PRIMARY KEY,
    kilde_sag_id INTEGER NOT NULL REFERENCES sag_sager(id) ON DELETE CASCADE,
    målsag_id INTEGER NOT NULL REFERENCES sag_sager(id) ON DELETE CASCADE,
    relationstype VARCHAR(50) NOT NULL,  -- forælder, barn, afledt_af, blokkerer, udfører_for
    created_at TIMESTAMP NOT NULL DEFAULT NOW(),
    deleted_at TIMESTAMP,
    CONSTRAINT different_cases CHECK (kilde_sag_id != målsag_id)
);

-- Tags for categorization
CREATE TABLE IF NOT EXISTS sag_tags (
    id SERIAL PRIMARY KEY,
    sag_id INTEGER NOT NULL REFERENCES sag_sager(id) ON DELETE CASCADE,
    tag_navn VARCHAR(100) NOT NULL,
    created_at TIMESTAMP NOT NULL DEFAULT NOW(),
    deleted_at TIMESTAMP
);

-- Indexes for performance
CREATE INDEX IF NOT EXISTS idx_sag_sager_customer_id ON sag_sager(customer_id) WHERE deleted_at IS NULL;
CREATE INDEX IF NOT EXISTS idx_sag_sager_status ON sag_sager(status) WHERE deleted_at IS NULL;
CREATE INDEX IF NOT EXISTS idx_sag_sager_ansvarlig ON sag_sager(ansvarlig_bruger_id) WHERE deleted_at IS NULL;
CREATE INDEX IF NOT EXISTS idx_sag_relationer_kilde ON sag_relationer(kilde_sag_id) WHERE deleted_at IS NULL;
CREATE INDEX IF NOT EXISTS idx_sag_relationer_mål ON sag_relationer(målsag_id) WHERE deleted_at IS NULL;
CREATE INDEX IF NOT EXISTS idx_sag_tags_sag_id ON sag_tags(sag_id) WHERE deleted_at IS NULL;
CREATE INDEX IF NOT EXISTS idx_sag_tags_tag_navn ON sag_tags(tag_navn) WHERE deleted_at IS NULL;

-- Trigger to auto-update updated_at
CREATE OR REPLACE FUNCTION update_sag_sager_updated_at()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS trigger_sag_sager_updated_at ON sag_sager;
CREATE TRIGGER trigger_sag_sager_updated_at
    BEFORE UPDATE ON sag_sager
    FOR EACH ROW
    EXECUTE FUNCTION update_sag_sager_updated_at();
17  app/modules/sag/module.json  Normal file
@@ -0,0 +1,17 @@
{
    "name": "sag",
    "version": "1.0.0",
    "description": "Universel sag-håndtering - tickets, opgaver og ordrer som sager med relationer",
    "author": "BMC Networks",
    "enabled": true,
    "dependencies": [],
    "table_prefix": "sag_",
    "api_prefix": "/api/v1/sag",
    "tags": ["Sag", "Case Management"],
    "config": {
        "safety_switches": {
            "read_only": false,
            "dry_run": false
        }
    }
}
236  app/modules/sag/templates/detail.html  Normal file
@@ -0,0 +1,236 @@
<!DOCTYPE html>
<html lang="da">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>{{ sag.titel }} - BMC Hub</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
    <style>
        :root {
            --primary-color: #0f4c75;
            --secondary-color: #3282b8;
            --accent-color: #00a8e8;
            --bg-light: #f7f9fc;
            --bg-dark: #1a1a2e;
        }

        body {
            background-color: var(--bg-light);
            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
        }

        body.dark-mode {
            background-color: var(--bg-dark);
            color: #f0f0f0;
        }

        .navbar {
            background: linear-gradient(135deg, var(--primary-color), var(--secondary-color));
        }

        .content-wrapper {
            padding: 2rem 0;
        }

        .card {
            border: none;
            border-radius: 8px;
            box-shadow: 0 2px 4px rgba(0,0,0,0.05);
            margin-bottom: 2rem;
        }

        body.dark-mode .card {
            background-color: #2a2a3e;
            box-shadow: 0 2px 4px rgba(0,0,0,0.3);
        }

        .card-header {
            background-color: var(--primary-color);
            color: white;
            border-radius: 8px 8px 0 0;
        }

        body.dark-mode .card-header {
            background-color: var(--secondary-color);
        }

        .card-body label {
            font-weight: 600;
            color: var(--primary-color);
            margin-top: 1rem;
        }

        body.dark-mode .card-body label {
            color: var(--accent-color);
        }

        .btn-back {
            background-color: transparent;
            color: var(--primary-color);
            border: 1px solid var(--primary-color);
            padding: 0.5rem 1rem;
            border-radius: 4px;
            text-decoration: none;
            display: inline-block;
            margin-bottom: 2rem;
        }

        body.dark-mode .btn-back {
            color: var(--accent-color);
            border-color: var(--accent-color);
        }

        .btn-back:hover {
            background-color: var(--primary-color);
            color: white;
        }

        .tag {
            display: inline-block;
            background-color: var(--primary-color);
            color: white;
            padding: 0.4rem 0.8rem;
            border-radius: 4px;
            font-size: 0.85rem;
            margin-right: 0.5rem;
            margin-bottom: 0.5rem;
        }

        body.dark-mode .tag {
            background-color: var(--secondary-color);
        }

        .relation-card {
            background-color: #f9f9f9;
            padding: 1rem;
            border-radius: 4px;
            margin-bottom: 0.5rem;
            border-left: 3px solid var(--accent-color);
        }

        body.dark-mode .relation-card {
            background-color: #3a3a4e;
            border-left-color: var(--secondary-color);
        }

        .relation-type {
            font-weight: 600;
            color: var(--secondary-color);
            font-size: 0.85rem;
        }

        .relation-link {
            color: var(--accent-color);
            text-decoration: none;
        }

        .relation-link:hover {
            text-decoration: underline;
        }
    </style>
</head>
<body>
    <!-- Navigation -->
    <nav class="navbar navbar-expand-lg navbar-dark">
        <div class="container">
            <a class="navbar-brand" href="/">🛠️ BMC Hub</a>
        </div>
    </nav>

    <!-- Main Content -->
    <div class="content-wrapper">
        <div class="container">
            <a href="/sag" class="btn-back">← Tilbage til sager</a>

            <!-- Main Card -->
            <div class="card">
                <div class="card-header">
                    <h1 style="margin: 0; font-size: 1.5rem;">{{ sag.titel }}</h1>
                </div>
                <div class="card-body">
                    <div class="row mb-4">
                        <div class="col-md-6">
                            <label>Beskrivelse</label>
                            <p>{{ sag.beskrivelse or 'Ingen beskrivelse' }}</p>
                        </div>
                        <div class="col-md-6">
                            <label>Status</label>
                            <p><span style="background-color: var(--accent-color); color: white; padding: 0.3rem 0.7rem; border-radius: 4px;">{{ sag.status }}</span></p>
                        </div>
                    </div>

                    <div class="row mb-4">
                        <div class="col-md-6">
                            <label>Type</label>
                            <p>{{ sag.type }}</p>
                        </div>
                        <div class="col-md-6">
                            <label>Deadline</label>
                            <p>{{ sag.deadline.strftime('%Y-%m-%d') if sag.deadline else 'Ikke sat' }}</p>
                        </div>
                    </div>

                    {% if customer %}
                    <div class="row mb-4">
                        <div class="col-md-6">
                            <label>Kunde</label>
                            <p><a href="/customers/{{ customer.id }}" class="relation-link">{{ customer.name }}</a></p>
                        </div>
                    </div>
                    {% endif %}

                    <div class="row">
                        <div class="col-12">
                            <label>Oprettet</label>
                            <p>{{ sag.created_at }}</p>
                        </div>
                    </div>
                </div>
            </div>

            <!-- Tags Section -->
            {% if tags %}
            <div class="card">
                <div class="card-header">
                    <h5 style="margin: 0;">📌 Tags</h5>
                </div>
                <div class="card-body">
                    {% for tag in tags %}
                    <span class="tag">{{ tag.tag_navn }}</span>
                    {% endfor %}
                </div>
            </div>
            {% endif %}

            <!-- Relations Section -->
            {% if relationer %}
            <div class="card">
                <div class="card-header">
                    <h5 style="margin: 0;">🔗 Relaterede sager</h5>
                </div>
                <div class="card-body">
                    {% for rel in relationer %}
                    <div class="relation-card">
                        <div class="relation-type">{{ rel.relationstype }}</div>
                        {% if rel.kilde_sag_id == sag.id %}
                        → <a href="/sag/{{ rel.målsag_id }}" class="relation-link">{{ rel.mål_titel }}</a>
                        {% else %}
                        ← <a href="/sag/{{ rel.kilde_sag_id }}" class="relation-link">{{ rel.kilde_titel }}</a>
                        {% endif %}
                    </div>
                    {% endfor %}
                </div>
            </div>
            {% endif %}

            <!-- Action Buttons -->
            <div style="margin-top: 2rem; display: flex; gap: 1rem; flex-wrap: wrap;">
                <a href="/sag/{{ sag.id }}/edit" class="btn" style="background-color: var(--accent-color); color: white;">✏️ Rediger</a>
                <button class="btn" style="background-color: #e74c3c; color: white;" onclick="if(confirm('Slet denne sag?')) { fetch('/api/v1/sag/{{ sag.id }}', {method: 'DELETE'}).then(() => window.location='/sag'); }">🗑️ Slet</button>
            </div>
        </div>
    </div>

    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
</body>
</html>
|
||||
350
app/modules/sag/templates/index.html
Normal file
@@ -0,0 +1,350 @@
<!DOCTYPE html>
<html lang="da">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Sager - BMC Hub</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css" rel="stylesheet">
    <style>
        :root {
            --primary-color: #0f4c75;
            --secondary-color: #3282b8;
            --accent-color: #00a8e8;
            --bg-light: #f7f9fc;
            --bg-dark: #1a1a2e;
            --text-light: #333;
            --text-dark: #f0f0f0;
            --border-color: #ddd;
        }

        body {
            background-color: var(--bg-light);
            color: var(--text-light);
            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
        }

        body.dark-mode {
            background-color: var(--bg-dark);
            color: var(--text-dark);
        }

        .navbar {
            background: linear-gradient(135deg, var(--primary-color), var(--secondary-color));
            box-shadow: 0 2px 8px rgba(0,0,0,0.1);
        }

        .navbar-brand {
            font-weight: 600;
            font-size: 1.4rem;
        }

        .content-wrapper {
            padding: 2rem 0;
            min-height: 100vh;
        }

        .page-header {
            margin-bottom: 2rem;
            display: flex;
            justify-content: space-between;
            align-items: center;
            flex-wrap: wrap;
            gap: 1rem;
        }

        .page-header h1 {
            font-size: 2rem;
            font-weight: 700;
            color: var(--primary-color);
            margin: 0;
        }

        body.dark-mode .page-header h1 {
            color: var(--accent-color);
        }

        .btn-new {
            background-color: var(--accent-color);
            color: white;
            border: none;
            padding: 0.6rem 1.5rem;
            border-radius: 6px;
            font-weight: 500;
            transition: all 0.3s ease;
        }

        .btn-new:hover {
            background-color: var(--secondary-color);
            color: white;
            transform: translateY(-2px);
            box-shadow: 0 4px 12px rgba(0,168,232,0.3);
        }

        .filter-section {
            background: white;
            padding: 1.5rem;
            border-radius: 8px;
            margin-bottom: 2rem;
            box-shadow: 0 2px 4px rgba(0,0,0,0.05);
        }

        body.dark-mode .filter-section {
            background-color: #2a2a3e;
            box-shadow: 0 2px 4px rgba(0,0,0,0.3);
        }

        .filter-section label {
            font-weight: 600;
            color: var(--primary-color);
            margin-bottom: 0.5rem;
            display: block;
            font-size: 0.9rem;
        }

        body.dark-mode .filter-section label {
            color: var(--accent-color);
        }

        .filter-section select,
        .filter-section input {
            border: 1px solid var(--border-color);
            border-radius: 4px;
            padding: 0.5rem;
            font-size: 0.95rem;
        }

        body.dark-mode .filter-section select,
        body.dark-mode .filter-section input {
            background-color: #3a3a4e;
            color: var(--text-dark);
            border-color: #555;
        }

        .sag-card {
            background: white;
            border-radius: 8px;
            padding: 1.5rem;
            margin-bottom: 1rem;
            border-left: 4px solid var(--primary-color);
            cursor: pointer;
            transition: all 0.3s ease;
            box-shadow: 0 2px 4px rgba(0,0,0,0.05);
            text-decoration: none;
            color: inherit;
            display: block;
        }

        body.dark-mode .sag-card {
            background-color: #2a2a3e;
            box-shadow: 0 2px 4px rgba(0,0,0,0.3);
        }

        .sag-card:hover {
            box-shadow: 0 4px 12px rgba(0,0,0,0.1);
            transform: translateY(-2px);
            border-left-color: var(--accent-color);
        }

        body.dark-mode .sag-card:hover {
            box-shadow: 0 4px 12px rgba(0,168,232,0.2);
        }

        .sag-title {
            font-size: 1.1rem;
            font-weight: 600;
            color: var(--primary-color);
            margin-bottom: 0.5rem;
        }

        body.dark-mode .sag-title {
            color: var(--accent-color);
        }

        .sag-meta {
            display: flex;
            justify-content: space-between;
            align-items: center;
            flex-wrap: wrap;
            gap: 1rem;
            font-size: 0.9rem;
            color: #666;
            margin-top: 1rem;
        }

        body.dark-mode .sag-meta {
            color: #aaa;
        }

        .status-badge {
            display: inline-block;
            padding: 0.3rem 0.7rem;
            border-radius: 20px;
            font-size: 0.8rem;
            font-weight: 500;
        }

        .status-åben {
            background-color: #ffeaa7;
            color: #d63031;
        }

        .status-i_gang {
            background-color: #a29bfe;
            color: #2d3436;
        }

        .status-afsluttet {
            background-color: #55efc4;
            color: #00b894;
        }

        .status-on_hold {
            background-color: #fab1a0;
            color: #e17055;
        }

        .tag {
            display: inline-block;
            background-color: var(--primary-color);
            color: white;
            padding: 0.3rem 0.6rem;
            border-radius: 4px;
            font-size: 0.8rem;
            margin-right: 0.5rem;
            margin-top: 0.5rem;
        }

        body.dark-mode .tag {
            background-color: var(--secondary-color);
        }

        .dark-mode-toggle {
            background: none;
            border: none;
            color: white;
            font-size: 1.2rem;
            cursor: pointer;
            padding: 0.5rem;
        }

        .empty-state {
            text-align: center;
            padding: 3rem 1rem;
            color: #999;
        }

        body.dark-mode .empty-state {
            color: #666;
        }
    </style>
</head>
<body>
    <!-- Navigation -->
    <nav class="navbar navbar-expand-lg navbar-dark">
        <div class="container">
            <a class="navbar-brand" href="/">🛠️ BMC Hub</a>
            <button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarNav">
                <span class="navbar-toggler-icon"></span>
            </button>
            <div class="collapse navbar-collapse" id="navbarNav">
                <ul class="navbar-nav ms-auto">
                    <li class="nav-item">
                        <a class="nav-link active" href="/sag">Sager</a>
                    </li>
                    <li class="nav-item">
                        <button class="dark-mode-toggle" onclick="toggleDarkMode()">🌙</button>
                    </li>
                </ul>
            </div>
        </div>
    </nav>

    <!-- Main Content -->
    <div class="content-wrapper">
        <div class="container">
            <!-- Page Header -->
            <div class="page-header">
                <h1>📋 Sager</h1>
                <a href="/sag/new" class="btn-new">+ Ny sag</a>
            </div>

            <!-- Filters -->
            <div class="filter-section">
                <div class="row g-3">
                    <div class="col-md-4">
                        <label>Status</label>
                        <form method="get" style="display: flex; gap: 0.5rem;">
                            <select name="status" onchange="this.form.submit()" style="flex: 1;">
                                <option value="">Alle statuser</option>
                                {% for s in statuses %}
                                <option value="{{ s }}" {% if s == current_status %}selected{% endif %}>{{ s }}</option>
                                {% endfor %}
                            </select>
                        </form>
                    </div>
                    <div class="col-md-4">
                        <label>Tag</label>
                        <form method="get" style="display: flex; gap: 0.5rem;">
                            <select name="tag" onchange="this.form.submit()" style="flex: 1;">
                                <option value="">Alle tags</option>
                                {% for t in all_tags %}
                                <option value="{{ t }}" {% if t == current_tag %}selected{% endif %}>{{ t }}</option>
                                {% endfor %}
                            </select>
                        </form>
                    </div>
                    <div class="col-md-4">
                        <label>Søg</label>
                        <input type="text" placeholder="Søg efter sager..." class="form-control" id="searchInput">
                    </div>
                </div>
            </div>

            <!-- Cases List -->
            <div id="casesList">
                {% if sager %}
                {% for sag in sager %}
                <a href="/sag/{{ sag.id }}" class="sag-card">
                    <div class="sag-title">{{ sag.titel }}</div>
                    {% if sag.beskrivelse %}
                    <div style="color: #666; font-size: 0.9rem; margin-bottom: 0.5rem;">{{ sag.beskrivelse[:100] }}{% if sag.beskrivelse|length > 100 %}...{% endif %}</div>
                    {% endif %}
                    <div class="sag-meta">
                        <span class="status-badge status-{{ sag.status }}">{{ sag.status }}</span>
                        <span>{{ sag.type }}</span>
                        <span style="color: #999;">{{ sag.created_at[:10] }}</span>
                    </div>
                </a>
                {% endfor %}
                {% else %}
                <div class="empty-state">
                    <p>Ingen sager fundet</p>
                </div>
                {% endif %}
            </div>
        </div>
    </div>

    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/js/bootstrap.bundle.min.js"></script>
    <script>
        function toggleDarkMode() {
            document.body.classList.toggle('dark-mode');
            localStorage.setItem('darkMode', document.body.classList.contains('dark-mode'));
        }

        // Load dark mode preference
        if (localStorage.getItem('darkMode') === 'true') {
            document.body.classList.add('dark-mode');
        }

        // Search functionality
        document.getElementById('searchInput').addEventListener('keyup', function(e) {
            const search = e.target.value.toLowerCase();
            document.querySelectorAll('.sag-card').forEach(card => {
                const text = card.textContent.toLowerCase();
                card.style.display = text.includes(search) ? 'block' : 'none';
            });
        });
    </script>
</body>
</html>
@@ -1,22 +1,368 @@
"""
Opportunities (Pipeline) Router
Hub-local sales pipeline
"""

from pathlib import Path
from uuid import uuid4

from fastapi import APIRouter, HTTPException, Query, UploadFile, File, Form, Request
from fastapi.responses import FileResponse
from pydantic import BaseModel
from typing import Optional, List, Dict, Any, Tuple
from datetime import date, datetime
import json
import logging
import os
import shutil

import psycopg2

from app.core.config import settings
from app.core.database import execute_query, execute_query_single, execute_update
from app.services.opportunity_service import handle_stage_change
from app.services.email_service import EmailService
import email
from email.header import decode_header

try:
    import extract_msg
except ImportError:
    extract_msg = None

logger = logging.getLogger(__name__)
router = APIRouter()


def _is_undefined_table_error(exc: Exception) -> bool:
    # Postgres undefined_table SQLSTATE
    if getattr(exc, "pgcode", None) == "42P01":
        return True
    # psycopg2.errors (and its UndefinedTable class) may be absent on old versions
    errors_module = getattr(psycopg2, "errors", None)
    if errors_module is not None:
        try:
            return isinstance(exc, psycopg2.errors.UndefinedTable)
        except Exception:
            return False
    return False
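The helper above keys off SQLSTATE `42P01`, Postgres's code for "relation does not exist". A minimal, standalone sketch of the same check, using a stand-in exception class (not a live psycopg2 error) to show the `pgcode` convention:

```python
class FakeDbError(Exception):
    """Stand-in for a driver exception carrying a Postgres SQLSTATE code."""
    def __init__(self, pgcode=None):
        super().__init__("database error")
        self.pgcode = pgcode


def is_undefined_table_error(exc: Exception) -> bool:
    # 42P01 is the Postgres SQLSTATE for undefined_table
    return getattr(exc, "pgcode", None) == "42P01"


print(is_undefined_table_error(FakeDbError("42P01")))  # True
print(is_undefined_table_error(FakeDbError("23505")))  # False (unique_violation)
print(is_undefined_table_error(ValueError("boom")))    # False (no pgcode attribute)
```

Because `getattr` defaults to `None`, the check is safe to call on any exception, which is why the router can use it inside a broad `except Exception` block.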
@router.post("/opportunities/{opportunity_id}/email-links", tags=["Opportunities"])
async def add_opportunity_email_link(opportunity_id: int, payload: dict):
    """Add a linked email to an opportunity"""
    email_id = payload.get("email_id")
    if not email_id or not isinstance(email_id, int):
        raise HTTPException(status_code=400, detail="Invalid email_id")

    try:
        _get_opportunity(opportunity_id)
    except HTTPException:
        raise HTTPException(status_code=404, detail="Opportunity not found")

    try:
        execute_query(
            "INSERT INTO pipeline_opportunity_emails (opportunity_id, email_id) VALUES (%s, %s) ON CONFLICT DO NOTHING",
            (opportunity_id, email_id)
        )
    except Exception as e:
        if _is_undefined_table_error(e):
            raise HTTPException(
                status_code=409,
                detail="Database migration required: run 032_opportunity_emails_m2m.sql",
            )
        logger.error(f"Failed to add email link: {e}")
        raise HTTPException(status_code=500, detail="Kunne ikke tilføje email-link")

    return _get_opportunity(opportunity_id)


@router.delete("/opportunities/{opportunity_id}/email-links/{email_id}", tags=["Opportunities"])
async def remove_opportunity_email_link(opportunity_id: int, email_id: int):
    """Remove a linked email from an opportunity"""
    try:
        execute_query(
            "DELETE FROM pipeline_opportunity_emails WHERE opportunity_id = %s AND email_id = %s",
            (opportunity_id, email_id)
        )
    except Exception as e:
        if _is_undefined_table_error(e):
            raise HTTPException(
                status_code=409,
                detail="Database migration required: run 032_opportunity_emails_m2m.sql",
            )
        logger.error(f"Failed to remove email link: {e}")
        raise HTTPException(status_code=500, detail="Kunne ikke fjerne email-link")

    return {"success": True}


@router.patch("/opportunities/{opportunity_id}/email-link", tags=["Opportunities"])
async def update_opportunity_email_link(opportunity_id: int, payload: dict):
    """Legacy endpoint: update the (single) linked email -> delegates to add link"""
    # For backward compatibility, treat this as "add link"
    return await add_opportunity_email_link(opportunity_id, payload)

def _decode_header_str(header_val):
    if not header_val:
        return ""
    try:
        decoded_list = decode_header(header_val)
        result = ""
        for content, encoding in decoded_list:
            if isinstance(content, bytes):
                if encoding:
                    try:
                        result += content.decode(encoding)
                    except LookupError:
                        result += content.decode('utf-8', errors='ignore')
                    except Exception:
                        result += content.decode('utf-8', errors='ignore')
                else:
                    result += content.decode('utf-8', errors='ignore')
            else:
                result += str(content)
        return result
    except Exception:
        return str(header_val)

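The helper above unwraps RFC 2047 "encoded-word" headers. A self-contained sketch of the same idea, using only the stdlib (a simplified variant of the router's `_decode_header_str`, without its fallback branches):

```python
from email.header import decode_header


def decode_header_str(value: str) -> str:
    # decode_header yields (content, charset) pairs; bytes parts carry a charset,
    # plain str parts come back with charset None.
    parts = []
    for content, encoding in decode_header(value):
        if isinstance(content, bytes):
            parts.append(content.decode(encoding or "utf-8", errors="ignore"))
        else:
            parts.append(content)
    return "".join(parts)


print(decode_header_str("=?utf-8?q?K=C3=B8benhavn?="))  # København
print(decode_header_str("plain subject"))               # plain subject
```

This matters for .eml imports because `Subject`, `From`, and attachment filenames are frequently MIME-encoded when they contain Danish characters.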
async def _process_uploaded_email(file: UploadFile, opportunity_id: int) -> dict:
    content = await file.read()
    filename = file.filename.lower()
    email_data = {}

    # Generate a unique message ID if one doesn't exist to prevent collisions/logic errors
    temp_id = str(uuid4())

    if filename.endswith('.msg'):
        if not extract_msg:
            raise HTTPException(status_code=500, detail="Library 'extract-msg' not installed")

        # extract-msg needs a file-like object or path. BytesIO works.
        import io
        msg = extract_msg.Message(io.BytesIO(content))

        # Map fields
        email_data = {
            'message_id': msg.messageId or f"msg-{temp_id}",
            'subject': msg.subject or "No Subject",
            'sender_email': msg.sender or "",
            'sender_name': msg.sender or "",  # msg.sender is often "Name <email>" or just email
            'recipient_email': msg.to or "",
            'cc': msg.cc or "",
            'body_text': msg.body,
            'body_html': msg.htmlBody,  # might be None
            'received_date': msg.date or datetime.now(),
            'folder': 'Imported',
            'attachment_count': len(msg.attachments),
            'has_attachments': len(msg.attachments) > 0,
            'attachments': []
        }

        # Handle msg attachments (simplified, might need more work for full fidelity)
        for att in msg.attachments:
            # Binary attachments in msg
            if hasattr(att, 'data'):
                email_data['attachments'].append({
                    'filename': att.longFilename or att.shortFilename or 'attachment',
                    'content': att.data,
                    'size': len(att.data),
                    'content_type': 'application/octet-stream'
                })

    elif filename.endswith('.eml'):
        msg = email.message_from_bytes(content)

        # Extract the first text/plain and text/html bodies
        body_text = ""
        body_html = ""
        if msg.is_multipart():
            for part in msg.walk():
                ctype = part.get_content_type()
                if ctype == "text/plain" and not body_text:
                    body_text = part.get_payload(decode=True).decode('utf-8', errors='ignore')
                elif ctype == "text/html" and not body_html:
                    body_html = part.get_payload(decode=True).decode('utf-8', errors='ignore')
        else:
            body_text = msg.get_payload(decode=True).decode('utf-8', errors='ignore')

        # Attachments
        attachments = []
        for part in msg.walk():
            if part.get_content_maintype() == 'multipart':
                continue
            if part.get_content_type() in ['text/plain', 'text/html']:
                continue
            fname = part.get_filename()
            if fname:
                payload = part.get_payload(decode=True)
                if payload:
                    attachments.append({
                        'filename': _decode_header_str(fname),
                        'content': payload,
                        'size': len(payload),
                        'content_type': part.get_content_type()
                    })

        email_data = {
            'message_id': msg.get('Message-ID', f"eml-{temp_id}"),
            'subject': _decode_header_str(msg.get('Subject', 'No Subject')),
            'sender_email': _decode_header_str(msg.get('From', '')),
            'sender_name': _decode_header_str(msg.get('From', '')),
            'recipient_email': _decode_header_str(msg.get('To', '')),
            'cc': _decode_header_str(msg.get('Cc', '')),
            'body_text': body_text,
            'body_html': body_html,
            'received_date': datetime.now(),  # fallback; replaced below if the Date header parses
            'folder': 'Imported',
            'has_attachments': len(attachments) > 0,
            'attachment_count': len(attachments),
            'attachments': attachments
        }

        # Try to parse the Date header
        if msg.get('Date'):
            try:
                from email.utils import parsedate_to_datetime
                email_data['received_date'] = parsedate_to_datetime(msg.get('Date'))
            except Exception:
                pass

    else:
        raise HTTPException(status_code=400, detail="Unsupported file format. Use .eml or .msg")

    # Save via EmailService. The insert does not guarantee message_id uniqueness,
    # so look up the message first and reuse the existing row instead of creating
    # a duplicate when the same email is uploaded twice.
    svc = EmailService()
    existing = execute_query_single("SELECT id FROM email_messages WHERE message_id = %s", (email_data['message_id'],))
    if existing:
        email_id = existing['id']
    else:
        email_id = await svc.save_email(email_data)
        if not email_id:
            raise HTTPException(status_code=500, detail="Failed to save imported email")

    # Link to opportunity
    try:
        execute_query(
            "INSERT INTO pipeline_opportunity_emails (opportunity_id, email_id) VALUES (%s, %s) ON CONFLICT DO NOTHING",
            (opportunity_id, email_id)
        )
    except Exception as e:
        logger.error(f"Failed to link imported email: {e}")
        raise HTTPException(status_code=500, detail="Failed to link email")

    return _get_opportunity(opportunity_id)


@router.post("/opportunities/{opportunity_id}/upload-email", tags=["Opportunities"])
async def upload_opportunity_email(opportunity_id: int, file: UploadFile = File(...)):
    """Upload an .eml or .msg file and link it to the opportunity"""
    return await _process_uploaded_email(file, opportunity_id)

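The .eml branch above leans entirely on the stdlib `email` package. A self-contained sketch of the core calls it uses (`message_from_bytes`, header access, `get_payload(decode=True)`, and `parsedate_to_datetime`), on a hypothetical raw message:

```python
import email
from email.utils import parsedate_to_datetime

# A minimal single-part message, as the bytes of an uploaded .eml would look
raw = (
    b"From: Anna <anna@example.com>\r\n"
    b"To: sales@example.com\r\n"
    b"Subject: Tilbud\r\n"
    b"Date: Thu, 29 Jan 2026 10:00:00 +0100\r\n"
    b"Content-Type: text/plain; charset=utf-8\r\n"
    b"\r\n"
    b"Hej, vi vil gerne have et tilbud.\r\n"
)

msg = email.message_from_bytes(raw)
subject = msg.get("Subject", "No Subject")
received = parsedate_to_datetime(msg.get("Date"))
# decode=True undoes any transfer encoding and returns bytes
body = msg.get_payload(decode=True).decode("utf-8", errors="ignore")

print(subject)        # Tilbud
print(received.year)  # 2026
```

Multipart messages additionally need the `msg.walk()` loop shown in the router to pick out the text and attachment parts.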
@router.post("/opportunities/{opportunity_id}/contacts", tags=["Opportunities"])
async def add_opportunity_contact_link(opportunity_id: int, payload: dict):
    """Link a contact to an opportunity"""
    contact_id = payload.get("contact_id")
    role = payload.get("role")

    if not contact_id:
        raise HTTPException(status_code=400, detail="Invalid contact_id")

    try:
        execute_query(
            "INSERT INTO pipeline_opportunity_contacts (opportunity_id, contact_id, role) VALUES (%s, %s, %s) ON CONFLICT (opportunity_id, contact_id) DO UPDATE SET role = EXCLUDED.role",
            (opportunity_id, contact_id, role)
        )
    except Exception as e:
        logger.error(f"Failed to add contact link: {e}")
        raise HTTPException(status_code=500, detail="Kunne ikke tilføje kontaktperson")

    return _get_opportunity(opportunity_id)


@router.delete("/opportunities/{opportunity_id}/contacts/{contact_id}", tags=["Opportunities"])
async def remove_opportunity_contact_link(opportunity_id: int, contact_id: int):
    """Remove a linked contact from an opportunity"""
    try:
        execute_query(
            "DELETE FROM pipeline_opportunity_contacts WHERE opportunity_id = %s AND contact_id = %s",
            (opportunity_id, contact_id)
        )
    except Exception as e:
        logger.error(f"Failed to remove contact link: {e}")
        raise HTTPException(status_code=500, detail="Kunne ikke fjerne kontaktperson")

    return _get_opportunity(opportunity_id)

UPLOAD_BASE_PATH = Path(settings.UPLOAD_DIR).resolve()
COMMENT_ATTACHMENT_SUBDIR = "opportunity_comments"
CONTRACT_ATTACHMENT_SUBDIR = "opportunity_contract_files"
for subdir in (COMMENT_ATTACHMENT_SUBDIR, CONTRACT_ATTACHMENT_SUBDIR):
    (UPLOAD_BASE_PATH / subdir).mkdir(parents=True, exist_ok=True)
ALLOWED_EXTENSIONS = {ext.lower() for ext in settings.ALLOWED_EXTENSIONS}
MAX_ATTACHMENT_SIZE = settings.EMAIL_MAX_UPLOAD_SIZE_MB * 1024 * 1024


def _is_attachment_allowed(filename: str) -> bool:
    extension = Path(filename).suffix.lower().lstrip(".")
    return extension in ALLOWED_EXTENSIONS


def _validate_attachment(upload_file: UploadFile) -> None:
    if not _is_attachment_allowed(upload_file.filename):
        raise HTTPException(400, detail="Unsupported attachment type")

    upload_file.file.seek(0, os.SEEK_END)
    size = upload_file.file.tell()
    upload_file.file.seek(0)
    if size > MAX_ATTACHMENT_SIZE:
        raise HTTPException(
            400,
            detail=f"Attachment exceeds size limit of {settings.EMAIL_MAX_UPLOAD_SIZE_MB} MB",
        )


def _generate_stored_name(filename: str, subdir: str) -> str:
    cleaned = Path(filename).name
    unique = f"{uuid4().hex}_{cleaned}"
    return f"{subdir}/{unique}"


def _resolve_attachment_path(stored_name: str) -> Path:
    return UPLOAD_BASE_PATH / stored_name


def _store_upload_file(upload_file: UploadFile, subdir: str) -> Tuple[str, int]:
    _validate_attachment(upload_file)
    stored_name = _generate_stored_name(upload_file.filename, subdir)
    destination = _resolve_attachment_path(stored_name)
    destination.parent.mkdir(parents=True, exist_ok=True)
    upload_file.file.seek(0)
    try:
        with destination.open("wb") as buffer:
            shutil.copyfileobj(upload_file.file, buffer)
    except PermissionError as e:
        logger.error(
            "❌ Upload permission denied: %s (base=%s, subdir=%s)",
            str(destination),
            str(UPLOAD_BASE_PATH),
            subdir,
        )
        raise HTTPException(
            status_code=500,
            detail=(
                "Upload directory is not writable. Fix permissions for the host-mounted 'uploads' folder "
                "(e.g. /srv/podman/bmc_hub_v1.0/uploads) and restart the API container."
            ),
        ) from e
    return stored_name, destination.stat().st_size

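The validators above rest on two stdlib idioms: `Path.suffix` for a case-insensitive extension whitelist, and `seek`/`tell` to size a stream without reading it into memory. A standalone sketch (the whitelist and limit here are illustrative, not the app's configured `ALLOWED_EXTENSIONS` / `EMAIL_MAX_UPLOAD_SIZE_MB`):

```python
import io
from pathlib import Path

ALLOWED = {"pdf", "docx", "png"}   # illustrative whitelist
MAX_SIZE = 5 * 1024 * 1024         # illustrative 5 MB limit


def is_allowed(filename: str) -> bool:
    # ".PDF" -> "pdf": suffix check normalizes case and strips the dot
    return Path(filename).suffix.lower().lstrip(".") in ALLOWED


def stream_size(stream) -> int:
    # Measure without consuming: seek to end, read the offset, rewind
    stream.seek(0, io.SEEK_END)
    size = stream.tell()
    stream.seek(0)
    return size


print(is_allowed("Tilbud.PDF"))              # True
print(is_allowed("malware.exe"))             # False
print(stream_size(io.BytesIO(b"x" * 1024)))  # 1024
```

Rewinding with `seek(0)` after measuring is what lets `_store_upload_file` later copy the full stream with `shutil.copyfileobj`.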
class PipelineStageBase(BaseModel):
    name: str
    description: Optional[str] = None
@@ -102,6 +448,33 @@ class OpportunityCommentCreate(OpportunityCommentBase):
    pass


class OpportunityCommentAttachment(BaseModel):
    id: int
    filename: str
    content_type: Optional[str] = None
    size_bytes: Optional[int] = None
    created_at: datetime
    download_url: str


class OpportunityEmailAttachment(BaseModel):
    id: int
    filename: str
    content_type: Optional[str] = None
    size_bytes: Optional[int] = None
    created_at: datetime
    download_url: str


class OpportunityContractFile(BaseModel):
    id: int
    filename: str
    content_type: Optional[str] = None
    size_bytes: Optional[int] = None
    created_at: datetime
    download_url: str


class OpportunityCommentResponse(BaseModel):
    id: int
    opportunity_id: int
@@ -119,6 +492,8 @@ class OpportunityCommentResponse(BaseModel):
    metadata: Optional[Dict] = None
    created_at: datetime
    updated_at: datetime
    attachments: List[OpportunityCommentAttachment] = []
    email_attachments: List[OpportunityEmailAttachment] = []


def _get_stage(stage_id: int):
@@ -152,6 +527,49 @@ def _get_opportunity(opportunity_id: int):
    opportunity = execute_query_single(query, (opportunity_id,))
    if not opportunity:
        raise HTTPException(status_code=404, detail="Opportunity not found")

    # Fetch linked emails
    email_query = """
        SELECT e.id, e.subject, e.sender_email, e.received_date, e.body_text, e.body_html
        FROM email_messages e
        JOIN pipeline_opportunity_emails poe ON e.id = poe.email_id
        WHERE poe.opportunity_id = %s
        ORDER BY e.received_date DESC
    """
    try:
        linked_emails = execute_query(email_query, (opportunity_id,))
    except Exception as e:
        if _is_undefined_table_error(e):
            logger.warning(
                "⚠️ Missing table pipeline_opportunity_emails; linked_emails disabled until migration 032_opportunity_emails_m2m.sql is applied"
            )
            linked_emails = []
        else:
            raise
    opportunity["linked_emails"] = linked_emails or []

    # Fetch linked contacts
    contacts_query = """
        SELECT c.id, c.first_name, c.last_name, c.email, c.phone, c.mobile, poc.role
        FROM contacts c
        JOIN pipeline_opportunity_contacts poc ON c.id = poc.contact_id
        WHERE poc.opportunity_id = %s
        ORDER BY c.first_name, c.last_name
    """
    try:
        linked_contacts = execute_query(contacts_query, (opportunity_id,))
    except Exception as e:
        if _is_undefined_table_error(e):
            logger.warning(
                "⚠️ Missing table pipeline_opportunity_contacts; linked_contacts disabled until migration 033_opportunity_contacts.sql is applied"
            )
            linked_contacts = []
        else:
            raise
    opportunity["linked_contacts"] = linked_contacts or []

    return opportunity

@@ -176,7 +594,23 @@ def _fetch_opportunity_comments(opportunity_id: int):
        WHERE c.opportunity_id = %s
        ORDER BY c.created_at DESC
    """
    comments = execute_query(query, (opportunity_id,)) or []
    if not comments:
        return []

    comment_ids = [comment["id"] for comment in comments]
    attachments_map = _fetch_comment_attachments_map(comment_ids)
    email_ids = list({comment["email_id"] for comment in comments if comment.get("email_id")})
    email_attachment_map = _fetch_email_attachments_map(email_ids)

    for comment in comments:
        comment["attachments"] = attachments_map.get(comment["id"], [])
        if comment.get("email_id"):
            comment["email_attachments"] = email_attachment_map.get(comment["email_id"], [])
        else:
            comment["email_attachments"] = []

    return comments


def _fetch_comment(comment_id: int):
@@ -189,7 +623,169 @@ def _fetch_comment(comment_id: int):
WHERE c.id = %s
|
||||
"""
|
||||
result = execute_query(query, (comment_id,))
|
||||
return result[0] if result else None
|
||||
if not result:
|
||||
return None
|
||||
|
||||
comment = result[0]
|
||||
attachments = _fetch_comment_attachments_map([comment_id])
|
||||
comment["attachments"] = attachments.get(comment_id, [])
|
||||
if comment.get("email_id"):
|
||||
email_attachments = _fetch_email_attachments_map([comment["email_id"]])
|
||||
comment["email_attachments"] = email_attachments.get(comment["email_id"], [])
|
||||
else:
|
||||
comment["email_attachments"] = []
|
||||
|
||||
return comment
|
||||
|
||||
|
||||
def _comment_attachment_download_url(opportunity_id: int, attachment_id: int) -> str:
|
||||
return f"/api/v1/opportunities/{opportunity_id}/comment-attachments/{attachment_id}"
|
||||
|
||||
|
||||
def _email_attachment_download_url(email_id: int, attachment_id: int) -> str:
|
||||
return f"/api/v1/emails/{email_id}/attachments/{attachment_id}"
|
||||
|
||||
|
||||
def _fetch_comment_attachments_map(comment_ids: List[int]) -> Dict[int, List[Dict[str, Any]]]:
|
||||
if not comment_ids:
|
||||
return {}
|
||||
|
||||
query = """
|
||||
SELECT a.id, a.comment_id, a.opportunity_id, a.filename, a.content_type, a.size_bytes, a.created_at
|
||||
FROM pipeline_opportunity_comment_attachments a
|
||||
WHERE a.comment_id = ANY(%s)
|
||||
ORDER BY a.created_at DESC
|
||||
"""
|
||||
rows = execute_query(query, (comment_ids,)) or []
|
||||
attachments_by_comment: Dict[int, List[Dict[str, Any]]] = {}
|
||||
for row in rows:
|
||||
attachments_by_comment.setdefault(row["comment_id"], []).append({
|
||||
"id": row["id"],
|
||||
"filename": row["filename"],
|
||||
"content_type": row.get("content_type"),
|
||||
"size_bytes": row.get("size_bytes"),
|
||||
"created_at": row.get("created_at"),
|
||||
"download_url": _comment_attachment_download_url(row["opportunity_id"], row["id"])
|
||||
})
|
||||
return attachments_by_comment
|
||||
|
||||
|
||||
def _fetch_email_attachments_map(email_ids: List[int]) -> Dict[int, List[Dict[str, Any]]]:
|
||||
if not email_ids:
|
||||
return {}
|
||||
|
||||
query = """
|
||||
SELECT id, email_id, filename, content_type, size_bytes, created_at
|
||||
FROM email_attachments
|
||||
WHERE email_id = ANY(%s)
|
||||
ORDER BY id ASC
|
||||
"""
|
||||
rows = execute_query(query, (email_ids,)) or []
|
||||
email_map: Dict[int, List[Dict[str, Any]]] = {}
|
||||
for row in rows:
|
||||
email_map.setdefault(row["email_id"], []).append({
|
||||
"id": row["id"],
|
||||
"filename": row["filename"],
|
||||
"content_type": row.get("content_type"),
|
||||
"size_bytes": row.get("size_bytes"),
|
||||
"created_at": row.get("created_at"),
|
||||
"download_url": _email_attachment_download_url(row["email_id"], row["id"])
|
||||
})
|
||||
return email_map
|
||||
|
||||
|
||||
def _contract_file_download_url(opportunity_id: int, file_id: int) -> str:
|
||||
return f"/api/v1/opportunities/{opportunity_id}/contract-files/{file_id}"
|
||||
|
||||
|
||||
def _fetch_contract_files(opportunity_id: int) -> List[Dict[str, Any]]:
    query = """
        SELECT id, filename, content_type, size_bytes, stored_name, created_at
        FROM pipeline_opportunity_contract_files
        WHERE opportunity_id = %s
        ORDER BY created_at DESC
    """
    rows = execute_query(query, (opportunity_id,)) or []
    return [
        {
            "id": row["id"],
            "filename": row["filename"],
            "content_type": row.get("content_type"),
            "size_bytes": row.get("size_bytes"),
            "created_at": row.get("created_at"),
            "download_url": _contract_file_download_url(opportunity_id, row["id"]),
        }
        for row in rows
    ]

def _save_contract_files(opportunity_id: int, files: List[UploadFile], uploaded_by_user_id: Optional[int] = None) -> List[Dict[str, Any]]:
    if not files:
        return []

    insert_query = """
        INSERT INTO pipeline_opportunity_contract_files
            (opportunity_id, filename, content_type, size_bytes, stored_name, uploaded_by_user_id)
        VALUES (%s, %s, %s, %s, %s, %s)
        RETURNING id, filename, content_type, size_bytes, created_at
    """

    saved_files = []
    for upload_file in files:
        if not upload_file or not upload_file.filename:
            continue

        stored_name, size_bytes = _store_upload_file(upload_file, CONTRACT_ATTACHMENT_SUBDIR)
        result = execute_query(
            insert_query,
            (
                opportunity_id,
                upload_file.filename,
                upload_file.content_type,
                size_bytes,
                stored_name,
                uploaded_by_user_id,
            )
        )
        if result:
            saved = result[0]
            saved_files.append({
                "id": saved["id"],
                "filename": saved["filename"],
                "content_type": saved.get("content_type"),
                "size_bytes": saved.get("size_bytes"),
                "created_at": saved.get("created_at"),
                "download_url": _contract_file_download_url(opportunity_id, saved["id"]),
            })

    return saved_files

def _save_comment_attachments(opportunity_id: int, comment_id: int, files: List[UploadFile], uploaded_by_user_id: Optional[int] = None) -> None:
    if not files:
        return

    insert_query = """
        INSERT INTO pipeline_opportunity_comment_attachments
            (opportunity_id, comment_id, filename, content_type, size_bytes, stored_name, uploaded_by_user_id)
        VALUES (%s, %s, %s, %s, %s, %s, %s)
    """

    for upload_file in files:
        if not upload_file or not upload_file.filename:
            continue

        stored_name, size_bytes = _store_upload_file(upload_file, COMMENT_ATTACHMENT_SUBDIR)
        execute_query(
            insert_query,
            (
                opportunity_id,
                comment_id,
                upload_file.filename,
                upload_file.content_type,
                size_bytes,
                stored_name,
                uploaded_by_user_id,
            )
        )

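Both upload helpers above delegate to `_store_upload_file`, whose body is not part of this diff. As a hypothetical sketch only (the project's actual implementation may differ), such a helper typically writes the bytes under a collision-free name and returns `(stored_name, size_bytes)`:

```python
import uuid
from pathlib import Path


def store_upload_file_sketch(data: bytes, original_name: str, base_dir: Path, subdir: str):
    """Hypothetical stand-in for _store_upload_file (not the project's code):
    write bytes under a collision-free name, return (stored_name, size_bytes)."""
    suffix = Path(original_name).suffix  # keep the original extension
    stored_name = f"{subdir}/{uuid.uuid4().hex}{suffix}"
    target = base_dir / stored_name
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(data)
    return stored_name, len(data)
```

Storing under a generated name while keeping the user's filename only in the database row avoids path-traversal and name-collision issues on the upload directory.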
# ============================

@@ -445,11 +1041,64 @@ async def list_opportunity_comments(opportunity_id: int):
     response_model=OpportunityCommentResponse,
     tags=["Opportunities"]
 )
-async def add_opportunity_comment(opportunity_id: int, comment: OpportunityCommentCreate):
+async def add_opportunity_comment(
+    opportunity_id: int,
+    request: Request,
+    content: Optional[str] = Form(None),
+    author_name: Optional[str] = Form(None),
+    user_id: Optional[int] = Form(None),
+    email_id: Optional[int] = Form(None),
+    contract_number: Optional[str] = Form(None),
+    contract_context: Optional[str] = Form(None),
+    contract_link: Optional[str] = Form(None),
+    metadata: Optional[str] = Form(None),
+    files: Optional[List[UploadFile]] = File(None),
+):
     _get_opportunity(opportunity_id)
 
-    author_name = comment.author_name or 'Hub Bruger'
-    metadata_json = json.dumps(comment.metadata) if comment.metadata else None
+    if request.headers.get("content-type", "").startswith("application/json"):
+        payload: Dict[str, Any] = await request.json()
+    else:
+        payload = {
+            "content": content,
+            "author_name": author_name,
+            "user_id": user_id,
+            "email_id": email_id,
+            "contract_number": contract_number,
+            "contract_context": contract_context,
+            "contract_link": contract_link,
+            "metadata": metadata,
+        }
+
+    content_value = payload.get("content")
+    if not content_value:
+        raise HTTPException(status_code=400, detail="Kommentar er påkrævet")
+
+    resolved_author = payload.get("author_name") or 'Hub Bruger'
+    resolved_user_id = payload.get("user_id")
+    if isinstance(resolved_user_id, str):
+        try:
+            resolved_user_id = int(resolved_user_id)
+        except ValueError:
+            resolved_user_id = None
+
+    resolved_email_id = payload.get("email_id")
+    if isinstance(resolved_email_id, str):
+        try:
+            resolved_email_id = int(resolved_email_id)
+        except ValueError:
+            resolved_email_id = None
+
+    metadata_payload = payload.get("metadata")
+    metadata_obj = None
+    if metadata_payload:
+        if isinstance(metadata_payload, str):
+            try:
+                metadata_obj = json.loads(metadata_payload)
+            except json.JSONDecodeError:
+                metadata_obj = None
+        elif isinstance(metadata_payload, dict):
+            metadata_obj = metadata_payload
+    metadata_json = json.dumps(metadata_obj) if metadata_obj else None
 
     query = """
         INSERT INTO pipeline_opportunity_comments
@@ -463,13 +1112,13 @@ async def add_opportunity_comment(opportunity_id: int, comment: OpportunityComme
         query,
         (
             opportunity_id,
-            comment.user_id,
-            author_name,
-            comment.content,
-            comment.email_id,
-            comment.contract_number,
-            comment.contract_context,
-            comment.contract_link,
+            resolved_user_id,
+            resolved_author,
+            content_value,
+            resolved_email_id,
+            payload.get("contract_number"),
+            payload.get("contract_context"),
+            payload.get("contract_link"),
             metadata_json
         )
     )
@@ -478,9 +1127,39 @@ async def add_opportunity_comment(opportunity_id: int, comment: OpportunityComme
         raise HTTPException(status_code=500, detail="Kunne ikke oprette kommentar")
 
     comment_id = result[0]["id"]
+    attachment_files = files or []
+    if attachment_files:
+        _save_comment_attachments(opportunity_id, comment_id, attachment_files, resolved_user_id)
+
     return _fetch_comment(comment_id)
 
 
+@router.get("/opportunities/{opportunity_id}/comment-attachments/{attachment_id}", tags=["Opportunities"])
+async def download_comment_attachment(opportunity_id: int, attachment_id: int):
+    query = """
+        SELECT * FROM pipeline_opportunity_comment_attachments
+        WHERE id = %s AND opportunity_id = %s
+    """
+    result = execute_query(query, (attachment_id, opportunity_id))
+    if not result:
+        raise HTTPException(status_code=404, detail="Vedhæftet fil ikke fundet")
+
+    attachment = result[0]
+    stored_name = attachment.get("stored_name")
+    if not stored_name:
+        raise HTTPException(status_code=500, detail="Vedhæftet fil mangler sti")
+
+    file_path = _resolve_attachment_path(stored_name)
+    if not file_path.exists():
+        raise HTTPException(status_code=404, detail="Filen findes ikke på serveren")
+
+    return FileResponse(
+        path=file_path,
+        filename=attachment.get("filename"),
+        media_type=attachment.get("content_type") or "application/octet-stream"
+    )
+
+
 @router.get(
     "/contracts/search",
     tags=["Opportunities"],
@@ -502,3 +1181,59 @@ async def search_contracts(query: str = Query(..., min_length=2), limit: int = Q
     params = (f"%{query}%", limit)
     results = execute_query(sql, params)
     return results or []
+
+
+@router.get(
+    "/opportunities/{opportunity_id}/contract-files",
+    tags=["Opportunities"],
+    response_model=List[OpportunityContractFile]
+)
+async def list_contract_files(opportunity_id: int):
+    _get_opportunity(opportunity_id)
+    return _fetch_contract_files(opportunity_id)
+
+
+@router.post(
+    "/opportunities/{opportunity_id}/contract-files",
+    tags=["Opportunities"],
+    response_model=List[OpportunityContractFile]
+)
+async def upload_contract_files(opportunity_id: int, files: List[UploadFile] = File(...)):
+    _get_opportunity(opportunity_id)
+    if not files:
+        raise HTTPException(status_code=400, detail="Ingen filer at uploade")
+
+    saved = _save_contract_files(opportunity_id, files)
+    if not saved:
+        raise HTTPException(status_code=500, detail="Kunne ikke gemme filer")
+    return saved
+
+
+@router.get(
+    "/opportunities/{opportunity_id}/contract-files/{file_id}",
+    tags=["Opportunities"]
+)
+async def download_contract_file(opportunity_id: int, file_id: int):
+    query = """
+        SELECT * FROM pipeline_opportunity_contract_files
+        WHERE id = %s AND opportunity_id = %s
+    """
+    result = execute_query(query, (file_id, opportunity_id))
+    if not result:
+        raise HTTPException(status_code=404, detail="Filen ikke fundet")
+
+    row = result[0]
+    stored_name = row.get("stored_name")
+    if not stored_name:
+        raise HTTPException(status_code=500, detail="Filen mangler lagring")
+
+    path = _resolve_attachment_path(stored_name)
+    if not path.exists():
+        raise HTTPException(status_code=404, detail="Filen findes ikke på serveren")
+
+    return FileResponse(
+        path=path,
+        filename=row.get("filename"),
+        media_type=row.get("content_type") or "application/octet-stream",
+        headers={"Content-Disposition": f"inline; filename=\"{row.get('filename')}\""}
+    )

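The endpoint above accepts both JSON bodies and multipart forms, so string-typed form fields and typed JSON fields must converge on the same shapes. That coercion logic can be isolated as a small standalone sketch (the function name is illustrative, not part of the codebase):

```python
import json
from typing import Any, Dict, Optional


def normalize_comment_payload(payload: Dict[str, Any]) -> Dict[str, Any]:
    """Mirror the endpoint's coercion rules: form fields arrive as strings,
    JSON bodies arrive typed; both must normalize to the same values."""
    def to_int(value: Any) -> Optional[int]:
        if isinstance(value, str):
            try:
                return int(value)
            except ValueError:
                return None
        return value

    metadata = payload.get("metadata")
    metadata_obj = None
    if metadata:
        if isinstance(metadata, str):
            try:
                metadata_obj = json.loads(metadata)
            except json.JSONDecodeError:
                metadata_obj = None
        elif isinstance(metadata, dict):
            metadata_obj = metadata

    return {
        "content": payload.get("content"),
        "author_name": payload.get("author_name") or "Hub Bruger",
        "user_id": to_int(payload.get("user_id")),
        "email_id": to_int(payload.get("email_id")),
        "metadata_json": json.dumps(metadata_obj) if metadata_obj else None,
    }
```

With this rule, `{"user_id": "7"}` from a form and `{"user_id": 7}` from JSON produce the same insert parameters, and malformed values degrade to `None` instead of raising.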
@@ -222,12 +222,11 @@ function renderOpportunities() {
     }
 
     tbody.innerHTML = filtered.map(o => `
-        <tr>
+        <tr class="opportunity-row" style="cursor:pointer" onclick="goToDetail(${o.id})">
             <td class="fw-semibold">${escapeHtml(o.title)}</td>
-            <td>${escapeHtml(o.customer_name || '-')}
-            </td>
+            <td>${escapeHtml(o.customer_name || '-')}</td>
             <td>${formatCurrency(o.amount, o.currency)}</td>
-            <td>${o.expected_close_date ? formatDate(o.expected_close_date) : '<span class="text-muted">-</span>'}</td>
+            <td>${o.expected_close_date ? formatDate(o.expected_close_date) : '<span class=\"text-muted\">-</span>'}</td>
             <td>
                 <span class="stage-pill">
                     <span class="stage-dot" style="background:${o.stage_color || '#0f4c75'}"></span>
@@ -236,9 +235,7 @@ function renderOpportunities() {
             </td>
             <td>${o.probability || 0}%</td>
             <td class="text-end">
-                <button class="btn btn-sm btn-outline-primary" onclick="goToDetail(${o.id})">
-                    <i class="bi bi-arrow-right"></i>
-                </button>
+                <i class="bi bi-arrow-right"></i>
             </td>
         </tr>
     `).join('');

File diff suppressed because it is too large
@@ -46,7 +46,8 @@ async def migrations_page(request: Request):
         "migrations": migrations,
         "db_user": settings.POSTGRES_USER,
         "db_name": settings.POSTGRES_DB,
-        "db_container": "bmc-hub-postgres"
+        "db_container": "bmc-hub-postgres",
+        "is_production": request.url.hostname not in ['localhost', '127.0.0.1', '0.0.0.0']
     })

@@ -118,7 +118,8 @@
     <ul class="list-unstyled mb-0">
         <li class="mb-2"><strong>DB bruger:</strong> {{ db_user }}</li>
         <li class="mb-2"><strong>DB navn:</strong> {{ db_name }}</li>
-        <li class="mb-0"><strong>DB container:</strong> {{ db_container }}</li>
+        <li class="mb-2"><strong>DB container:</strong> {{ db_container }}</li>
+        <li class="mb-0"><strong>Container runtime:</strong> {{ 'Podman' if is_production else 'Docker' }}</li>
     </ul>
 </div>
 </div>
@@ -130,11 +131,14 @@
     let currentAltCommand = "";
 
     function buildCommand(migrationName) {
-        const dockerCmd = `docker exec -i {{ db_container }} psql -U {{ db_user }} -d {{ db_name }} < migrations/${migrationName}`;
+        // Use podman for production, docker for local development
+        const runtime = {{ 'true' if is_production else 'false' }} ? 'podman' : 'docker';
+
+        const containerCmd = `${runtime} exec -i {{ db_container }} psql -U {{ db_user }} -d {{ db_name }} < migrations/${migrationName}`;
         const localCmd = `psql \"$DATABASE_URL\" -f migrations/${migrationName}`;
-        currentCommand = dockerCmd;
-        currentAltCommand = `${dockerCmd}\n${localCmd}`;
-        document.getElementById('commandBox').textContent = dockerCmd;
+        currentCommand = containerCmd;
+        currentAltCommand = `${containerCmd}\n${localCmd}`;
+        document.getElementById('commandBox').textContent = containerCmd;
         document.getElementById('copyCommandBtn').disabled = false;
         document.getElementById('copyAltBtn').disabled = false;
     }

@@ -13,9 +13,11 @@ services:
       - postgres_data:/var/lib/postgresql/data
       # Mount all migration files for initialization
       - ./migrations:/docker-entrypoint-initdb.d:ro
-    # NO external port mapping - only accessible via Docker network
-    # ports:
-    #   - "${POSTGRES_PORT:-5432}:5432"
+    # Optional: publish Postgres to the host.
+    # Default binds to localhost for safety; set POSTGRES_BIND_ADDR=0.0.0.0 (or host IP)
+    # if the API container can't reach Postgres via the bridge network (Podman netavark issue).
+    ports:
+      - "${POSTGRES_BIND_ADDR:-127.0.0.1}:${POSTGRES_PORT:-5432}:5432"
     restart: always
     healthcheck:
       test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
@@ -49,10 +51,14 @@ services:
     env_file:
       - .env
     environment:
-      # Override database URL to point to postgres service
-      - DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
+      # Override database URL to point to postgres service (or host fallback).
+      # Set POSTGRES_HOST=host.containers.internal if bridge networking is broken.
+      - DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST:-postgres}:5432/${POSTGRES_DB}
       - ENABLE_RELOAD=false
     restart: always
+    # Podman rootless: map container user namespace to the host user.
+    # This avoids permission issues on bind-mounted folders like ./uploads and ./logs.
+    userns_mode: "keep-id"
     healthcheck:
       test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
       interval: 30s

migrations/019_opportunity_comment_attachments.sql (new file, 20 lines)
@@ -0,0 +1,20 @@
|
||||
-- Migration 019: Opportunity comment attachments
|
||||
-- Captures uploaded files that belong to a pipeline comment thread
|
||||
|
||||
CREATE TABLE IF NOT EXISTS pipeline_opportunity_comment_attachments (
|
||||
id SERIAL PRIMARY KEY,
|
||||
opportunity_id INTEGER NOT NULL REFERENCES pipeline_opportunities(id) ON DELETE CASCADE,
|
||||
comment_id INTEGER NOT NULL REFERENCES pipeline_opportunity_comments(id) ON DELETE CASCADE,
|
||||
filename VARCHAR(255) NOT NULL,
|
||||
content_type VARCHAR(100),
|
||||
size_bytes INTEGER,
|
||||
stored_name TEXT NOT NULL,
|
||||
uploaded_by_user_id INTEGER,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
FOREIGN KEY (uploaded_by_user_id) REFERENCES users(user_id) ON DELETE SET NULL
|
||||
);
|
||||
|
||||
CREATE INDEX IF NOT EXISTS idx_pipeline_comment_attachments_comment_id ON pipeline_opportunity_comment_attachments(comment_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_pipeline_comment_attachments_opportunity_id ON pipeline_opportunity_comment_attachments(opportunity_id);
|
||||
|
||||
COMMENT ON TABLE pipeline_opportunity_comment_attachments IS 'User uploaded files stored alongside pipeline opportunity comments';
|
||||
migrations/020_opportunity_contract_files.sql (new file, 18 lines)
@@ -0,0 +1,18 @@
|
||||
-- Migration 020: Opportunity contract files
|
||||
-- Stores documents that belong to the offer/contract card and can be downloaded independently of comments
|
||||
|
||||
CREATE TABLE IF NOT EXISTS pipeline_opportunity_contract_files (
|
||||
id SERIAL PRIMARY KEY,
|
||||
opportunity_id INTEGER NOT NULL REFERENCES pipeline_opportunities(id) ON DELETE CASCADE,
|
||||
filename VARCHAR(255) NOT NULL,
|
||||
content_type VARCHAR(100),
|
||||
size_bytes INTEGER,
|
||||
stored_name TEXT NOT NULL,
|
||||
uploaded_by_user_id INTEGER,
|
||||
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
FOREIGN KEY (uploaded_by_user_id) REFERENCES users(user_id) ON DELETE SET NULL
|
||||
);
|
||||
|
||||
CREATE INDEX IF NOT EXISTS idx_pipeline_opportunity_contract_files_opportunity_id ON pipeline_opportunity_contract_files(opportunity_id);
|
||||
|
||||
COMMENT ON TABLE pipeline_opportunity_contract_files IS 'Files uploaded directly to the opportunity (Tilbud & Kontrakt) card.';
|
||||
migrations/021_opportunity_email_link.sql (new file, 4 lines)
@@ -0,0 +1,4 @@
|
||||
-- Migration 021: Add email_id to pipeline_opportunities for direct email linking
|
||||
ALTER TABLE pipeline_opportunities
|
||||
ADD COLUMN IF NOT EXISTS email_id INTEGER REFERENCES email_messages(id) ON DELETE SET NULL;
|
||||
CREATE INDEX IF NOT EXISTS idx_pipeline_opportunities_email_id ON pipeline_opportunities(email_id);
|
||||
migrations/032_opportunity_emails_m2m.sql (new file, 10 lines)
@@ -0,0 +1,10 @@
|
||||
CREATE TABLE IF NOT EXISTS pipeline_opportunity_emails (
|
||||
opportunity_id INTEGER REFERENCES pipeline_opportunities(id) ON DELETE CASCADE,
|
||||
email_id INTEGER REFERENCES email_messages(id) ON DELETE CASCADE,
|
||||
created_at TIMESTAMPTZ DEFAULT NOW(),
|
||||
PRIMARY KEY (opportunity_id, email_id)
|
||||
);
|
||||
|
||||
INSERT INTO pipeline_opportunity_emails (opportunity_id, email_id)
|
||||
SELECT id, email_id FROM pipeline_opportunities WHERE email_id IS NOT NULL
|
||||
ON CONFLICT DO NOTHING;
|
||||
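The backfill in migration 032 is safe to re-run: the composite primary key plus `ON CONFLICT DO NOTHING` makes the copy idempotent. A minimal sketch of that behavior, using SQLite's equivalent `INSERT OR IGNORE` so it runs without a Postgres instance:

```python
import sqlite3

# In-memory stand-in for the two tables touched by migration 032.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pipeline_opportunities (id INTEGER PRIMARY KEY, email_id INTEGER);
    CREATE TABLE pipeline_opportunity_emails (
        opportunity_id INTEGER,
        email_id INTEGER,
        PRIMARY KEY (opportunity_id, email_id)
    );
    INSERT INTO pipeline_opportunities VALUES (1, 10), (2, NULL), (3, 12);
""")

# INSERT OR IGNORE is SQLite's spelling of Postgres' ON CONFLICT DO NOTHING.
backfill = """
    INSERT OR IGNORE INTO pipeline_opportunity_emails (opportunity_id, email_id)
    SELECT id, email_id FROM pipeline_opportunities WHERE email_id IS NOT NULL
"""
conn.execute(backfill)
conn.execute(backfill)  # second run is a no-op thanks to the composite PK
count = conn.execute("SELECT COUNT(*) FROM pipeline_opportunity_emails").fetchone()[0]
print(count)  # 2 rows: opportunities 1 and 3 (opportunity 2 has no email_id)
```

This idempotency matters because the deploy script may replay all migration files.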
migrations/033_opportunity_contacts.sql (new file, 11 lines)
@@ -0,0 +1,11 @@
|
||||
-- Create link table for Opportunity <-> Contacts (Many-to-Many)
|
||||
CREATE TABLE IF NOT EXISTS pipeline_opportunity_contacts (
|
||||
opportunity_id INTEGER NOT NULL REFERENCES pipeline_opportunities(id) ON DELETE CASCADE,
|
||||
contact_id INTEGER NOT NULL REFERENCES contacts(id) ON DELETE CASCADE,
|
||||
role VARCHAR(100), -- optional role in this opportunity (e.g. "Decision Maker", "Influencer")
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
|
||||
PRIMARY KEY (opportunity_id, contact_id)
|
||||
);
|
||||
|
||||
-- Index for faster lookups
|
||||
CREATE INDEX IF NOT EXISTS idx_pipeline_opportunity_contacts_opp ON pipeline_opportunity_contacts(opportunity_id);
|
||||
updateto.sh (65 lines changed)
@@ -7,6 +7,17 @@ set -e  # Exit on any error
 
 VERSION=$1
 
+# This production deployment is designed for ROOTLESS Podman.
+# Running with sudo will use a different Podman storage (rootful) and can make it look
+# like "data disappeared" because volumes/networks are separate.
+if [ "${EUID:-$(id -u)}" -eq 0 ]; then
+    echo "❌ Fejl: Kør ikke dette script som root (sudo)."
+    echo "   Brug i stedet: sudo -iu bmcadmin && cd /srv/podman/bmc_hub_v1.0 && ./updateto.sh $VERSION"
+    exit 1
+fi
+
+PODMAN_COMPOSE_FILE="docker-compose.prod.yml"
+
 if [ -z "$VERSION" ]; then
     echo "❌ Fejl: Ingen version angivet"
     echo "Usage: ./updateto.sh v1.3.15"
@@ -40,6 +51,12 @@ if [ ! -f ".env" ]; then
     exit 1
 fi
 
+if [ ! -f "$PODMAN_COMPOSE_FILE" ]; then
+    echo "❌ Fejl: $PODMAN_COMPOSE_FILE ikke fundet i $(pwd)"
+    echo "   Kør fra /srv/podman/bmc_hub_v1.0"
+    exit 1
+fi
+
 # Load environment variables (DB credentials)
 set -a
 source .env
@@ -60,12 +77,12 @@ echo "✅ .env opdateret"
 # Stop containers
 echo ""
 echo "⏹️ Stopper containere..."
-podman-compose -f docker-compose.prod.yml down
+podman-compose -f "$PODMAN_COMPOSE_FILE" down
 
 # Pull/rebuild with new version
 echo ""
 echo "🔨 Bygger nyt image med version $VERSION..."
-podman-compose -f docker-compose.prod.yml up -d --build
+podman-compose -f "$PODMAN_COMPOSE_FILE" up -d --build
 
 # Sync migrations from container to host
 echo ""
@@ -84,24 +101,38 @@ echo ""
 echo "⏳ Venter på container startup..."
 sleep 5
 
-# Run database migration
+# Database migrations
 echo ""
-echo "🧱 Kører database migrationer..."
-if [ -z "$POSTGRES_USER" ] || [ -z "$POSTGRES_DB" ]; then
-    echo "❌ Fejl: POSTGRES_USER/POSTGRES_DB mangler i .env"
-    exit 1
-fi
+echo "🧱 Database migrationer"
+echo "   NOTE: Scriptet kører ikke længere en hardcoded enkelt-migration automatisk."
+echo "   Brug migrations-UI'en i BMC Hub, eller sæt RUN_MIGRATIONS=true for at køre alle .sql i /docker-entrypoint-initdb.d/ i sorteret rækkefølge."
 
-for i in {1..10}; do
-    if podman exec bmc-hub-postgres-prod pg_isready -U "$POSTGRES_USER" >/dev/null 2>&1; then
-        break
+if [ "${RUN_MIGRATIONS:-false}" = "true" ]; then
+    if [ -z "$POSTGRES_USER" ] || [ -z "$POSTGRES_DB" ]; then
+        echo "❌ Fejl: POSTGRES_USER/POSTGRES_DB mangler i .env"
+        exit 1
     fi
-    echo "⏳ Venter på postgres... ($i/10)"
-    sleep 2
-done
-
-podman exec -i bmc-hub-postgres-prod psql -U "$POSTGRES_USER" -d "$POSTGRES_DB" -f /docker-entrypoint-initdb.d/016_opportunities.sql
-echo "✅ Migration 016_opportunities.sql kørt"
+    for i in {1..30}; do
+        if podman exec bmc-hub-postgres-prod pg_isready -U "$POSTGRES_USER" >/dev/null 2>&1; then
+            break
+        fi
+        echo "⏳ Venter på postgres... ($i/30)"
+        sleep 2
+    done
+
+    echo "📄 Kører alle migrations fra /docker-entrypoint-initdb.d (sorteret)..."
+    podman exec bmc-hub-postgres-prod sh -lc "ls -1 /docker-entrypoint-initdb.d/*.sql 2>/dev/null | sort" \
+        | while read -r file; do
+            [ -z "$file" ] && continue
+            echo "➡️  $file"
+            podman exec -i bmc-hub-postgres-prod psql -v ON_ERROR_STOP=1 -U "$POSTGRES_USER" -d "$POSTGRES_DB" -f "$file"
+        done
+
+    echo "✅ Migrations kørt"
+else
+    echo "ℹ️  RUN_MIGRATIONS=false (default)"
+fi
 
 # Show logs
 echo ""
@@ -113,7 +144,7 @@ echo ""
 echo "✅ Deployment fuldført!"
 echo ""
 echo "🔍 Tjek status med:"
-echo "   podman-compose -f docker-compose.prod.yml ps"
+echo "   podman-compose -f $PODMAN_COMPOSE_FILE ps"
 echo "   podman logs -f bmc-hub-api-prod"
 echo ""
 echo "🌐 Test health endpoint:"
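The migration loop added above applies files in `sort` order, which is lexical rather than numeric. That only coincides with the intended order because the migration filenames use zero-padded numeric prefixes, as a quick sanity check shows:

```python
# Lexical sort of zero-padded migration names equals their numeric order,
# which is what the `ls ... | sort` pipeline in updateto.sh relies on.
names = [
    "033_opportunity_contacts.sql",
    "016_opportunities.sql",
    "032_opportunity_emails_m2m.sql",
    "019_opportunity_comment_attachments.sql",
    "020_opportunity_contract_files.sql",
    "021_opportunity_email_link.sql",
]
print(sorted(names)[0])   # 016_opportunities.sql runs first
print(sorted(names)[-1])  # 033_opportunity_contacts.sql runs last
```

An unpadded name such as `2_foo.sql` would sort after `033_...` and break this assumption, so the padding convention must be kept.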