# Compare commits

`27af22d241...feat/multi` (22 commits)

| Author | SHA1 | Date |
|---|---|---|
|  | c534a972a9 |  |
|  | 6fc2f34ba9 |  |
|  | c1d8357956 |  |
|  | 695dafacd5 |  |
|  | 69a3088579 |  |
|  | 3d212979d9 |  |
|  | 7dd39f9712 |  |
|  | f74322beab |  |
|  | f5ef9e0811 |  |
|  | 06f8fa5842 |  |
|  | 7a2408e310 |  |
|  | 09a5403f83 |  |
|  | 3d73d9e422 |  |
|  | dafc2df0d4 |  |
|  | 5e01fefd4c |  |
|  | 8020b2d14b |  |
|  | 172debdbdb |  |
|  | ecb4777a35 |  |
|  | cc872cfdad |  |
|  | 8d58e97ac6 |  |
|  | b930b2bc85 |  |
|  | 5dfd795908 |  |
### .gitignore (vendored, 2 changes)

```diff
@@ -29,6 +29,7 @@ output/
 vfp/*.json
 *.~pck
 .claude/HANDOFF.md
+scripts/work/
 
 # Virtual environments
 venv/
@@ -45,3 +46,4 @@ api/api/
 
 # Logs directory
 logs/
+.gstack/
```
### CLAUDE.md (154 changes)

````diff
@@ -1,140 +1,60 @@
 # CLAUDE.md
 
-## REGULI OBLIGATORII
-
-**Pentru task-uri paralele foloseste INTOTDEAUNA TeamCreate + TaskCreate, NU Agent tool cu subagenti paraleli.**
-Skill-ul `superpowers:dispatching-parallel-agents` NU se aplica in acest proiect. In loc de dispatch cu Agent tool, creeaza o echipa cu TeamCreate, defineste task-uri cu TaskCreate, si spawneaza teammates cu Agent tool + `team_name`.
-
 ## Project Overview
 
 **System:** Import Comenzi Web GoMag → Sistem ROA Oracle
+Stack: FastAPI + Jinja2 + Bootstrap 5.3 + Oracle PL/SQL + SQLite
 
-Importa automat comenzi din GoMag in sistemul ERP ROA Oracle. Stack complet Python/FastAPI.
+Documentatie completa: [README.md](README.md)
 
-### Tech Stack
-- **API + Admin:** FastAPI + Jinja2 + Bootstrap 5.3
-- **GoMag Integration:** Python (`gomag_client.py` — API download with pagination)
-- **Sync Orchestrator:** Python (`sync_service.py` — download → parse → validate → import)
-- **Database:** Oracle PL/SQL packages (IMPORT_PARTENERI, IMPORT_COMENZI) + SQLite (tracking)
+## Implementare cu TeamCreate
+
+**OBLIGATORIU:** Folosim TeamCreate + TaskCreate, NU Agent tool cu subagenti paraleli. Skill-ul `superpowers:dispatching-parallel-agents` NU se aplica in acest proiect.
+
+- Team lead citeste TOATE fisierele implicate, creeaza planul
+- **ASTEAPTA aprobare explicita** de la user inainte de implementare
+- Task-uri pe fisiere non-overlapping (evita conflicte)
+- Cache-bust static assets (`?v=N`) la fiecare schimbare UI
 
 ## Development Commands
 
 ```bash
-# Run FastAPI server — INTOTDEAUNA via start.sh (seteaza Oracle env vars)
+# INTOTDEAUNA via start.sh (seteaza Oracle env vars)
 ./start.sh
-# NU folosi uvicorn direct — lipsesc LD_LIBRARY_PATH si TNS_ADMIN pentru Oracle
+# NU folosi uvicorn direct — lipsesc LD_LIBRARY_PATH si TNS_ADMIN
 
 # Tests
-python api/test_app_basic.py    # Test A - fara Oracle
+python api/test_app_basic.py    # fara Oracle
-python api/test_integration.py  # Test C - cu Oracle
+python api/test_integration.py  # cu Oracle
 ```
 
-## UI Development Workflow: Preview → Implement → Verify
+## Reguli critice (nu le incalca)
 
-**OBLIGATORIU**: Respecta ordinea exacta. NU treci la pasul urmator fara aprobare explicita.
+### Flux import comenzi
+1. Download GoMag API → JSON → parse → validate SKU-uri → import Oracle
+2. Ordinea: **parteneri** (cauta/creeaza) → **adrese** → **comanda** → **factura cache**
+3. SKU lookup: ARTICOLE_TERTI (mapped) are prioritate fata de NOM_ARTICOLE (direct)
+4. Complex sets: un SKU → multiple CODMAT-uri cu `procent_pret` (trebuie sa fie sum=100%)
+5. Comenzi anulate (GoMag statusId=7): verifica daca au factura inainte de stergere din Oracle
 
-### 1. Plan & Preview — ASTEAPTA APROBARE
-1. Citeste TOATE fisierele implicate
-2. Scrie planul de implementare cu decizii de design
-3. Genereaza **mockup-uri Markdown** care descriu rezultatul asteptat (tabele, liste, cod pseudo-CSS) — NU HTML static
-4. **Prezinta mockup-urile userului si ASTEAPTA aprobare explicita**
-5. Rafineaza planul daca userul cere modificari
-6. **NU trece la implementare pana userul nu spune explicit "ok", "aprob", "executa" sau similar**
+### Statusuri comenzi
+`IMPORTED` / `ALREADY_IMPORTED` / `SKIPPED` / `ERROR` / `CANCELLED` / `DELETED_IN_ROA`
+- Upsert: `IMPORTED` existent NU se suprascrie cu `ALREADY_IMPORTED`
+- Recovery: la fiecare sync, comenzile ERROR sunt reverificate in Oracle
 
-### 2. Implementation cu TeamCreate (Agent Teams)
+### Parteneri
+- Prioritate: **companie** (PJ, cod_fiscal + registru) daca exista in GoMag, altfel persoana fizica cu **shipping name**
+- Adresa livrare: intotdeauna GoMag shipping
+- Adresa facturare: daca shipping ≠ billing person → shipping pt ambele; altfel → billing din GoMag
 
-Folosim **TeamCreate** (team agents), NU superpowers subagents. Diferenta:
-- **TeamCreate**: agenti independenti cu task list partajat, comunicare directa intre ei, context propriu
-- **Subagents (Agent tool)**: agenti care raporteaza doar la main — NU se folosesc
+### Preturi
+- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
+- Daca pretul lipseste, se insereaza automat pret=0
 
-#### Workflow TeamCreate:
+### Invoice cache
+- Coloanele `factura_*` pe `orders` (SQLite), populate lazy din Oracle (`vanzari WHERE sters=0`)
+- Refresh complet: verifica facturi noi + facturi sterse + comenzi sterse din ROA
 
-1. **Main agent** (team lead) citeste TOATE fisierele implicate, creeaza planul
-2. **TeamCreate** creeaza echipa (ex: `ui-polish`)
-3. **TaskCreate** creeaza task-uri independente, pe fisiere non-overlapping:
-   - Task 1: Templates + CSS (HTML templates, style.css, cache-bust)
-   - Task 2: JavaScript (shared.js, dashboard.js, logs.js, mappings.js)
-   - Task 3: Verificare Playwright (depinde de Task 1 + Task 2)
-4. **Agent tool** cu `team_name` spawneaza teammates folosind agentii predefiniti din `.claude/agents/`:
-   - `subagent_type: ui-templates` → pentru Task 1 (templates + CSS)
-   - `subagent_type: ui-js` → pentru Task 2 (JavaScript)
-   - `subagent_type: ui-verify` → pentru Task 3 (Playwright verification)
-   - `subagent_type: backend-api` → pentru modificari backend/API (routers, services, Oracle/SQLite)
-   - `subagent_type: qa-tester` → pentru teste de integrare
-5. Teammates lucreaza in paralel, comunica intre ei, marcheaza task-uri completate
-6. Cand Task 1 + Task 2 sunt complete, teammate-ul de verificare preia Task 3
-
-#### Teammate-ul de verificare (Task 3):
-1. Navigheaza la fiecare pagina cu Playwright MCP la 375x812 (mobile) si 1440x900 (desktop)
-2. **Foloseste browser_snapshot** (NU screenshot-uri) pentru a inspecta structura DOM
-3. Verifica ca implementarea respecta fiecare punct din preview-ul aprobat (structura coloane, bold, dots, filtre etc.)
-4. Raporteaza discrepante concrete la team lead (ce e diferit fata de preview)
-5. NU salveaza screenshot-uri after/
-
-#### Bucla de corectie (responsabilitatea team lead-ului):
-1. Dupa ce verify-agent raporteaza, **team lead-ul analizeaza discrepantele**
-2. Pentru fiecare discrepanta, creeaza un nou task de fix si spawneaza un agent sa-l rezolve
-3. Dupa fix, spawneaza din nou verify-agent pentru re-verificare
-4. **Repeta bucla** pana cand toate verificarile trec (implementare ≈ preview)
-5. Abia atunci declara task-ul complet
-
-```
-screenshots/
-└── preview/   # Mockup-uri Markdown aprobate de user (referinta pentru verificare)
-```
-
-### Principii
-- Team lead citeste TOATE fisierele inainte sa creeze task-uri
-- Task-uri pe fisiere non-overlapping (evita conflicte)
-- Fiecare task contine prompt detaliat, self-contained
-- Desktop-ul nu trebuie sa se schimbe cand se adauga imbunatatiri mobile
-- Cache-bust static assets (increment `?v=N`) la fiecare schimbare UI
-- Teammates comunica intre ei cu SendMessage, nu doar cu team lead-ul
-
-## Architecture
-
-```
-[GoMag API] → [Python Sync Service] → [Oracle PL/SQL] → [FastAPI Admin]
-     ↓                 ↓                     ↑                 ↑
- JSON Orders   Download/Parse/Import    Store/Update    Dashboard + Config
-```
-
-### FastAPI App Structure
-- **Routers:** health, dashboard, mappings, articles, validation, sync
-- **Services:** gomag_client, sync, order_reader, import, mapping, article, validation, invoice, sqlite, scheduler
-- **Templates:** Jinja2 (dashboard, mappings, missing_skus, logs)
-- **Static:** CSS (`style.css`), JS (`shared.js`, `dashboard.js`, `logs.js`, `mappings.js`)
-- **Databases:** Oracle (ERP data) + SQLite (order tracking, sync runs)
-
-## Business Rules
-
-### Partners
-- Search priority: cod_fiscal → denumire → create new
-- Individuals (CUI 13 digits): separate nume/prenume
-- Default address: Bucuresti Sectorul 1
-- All new partners: ID_UTIL = -3
-
-### Articles & Mappings
-- Simple SKUs: found directly in nom_articole (not stored in ARTICOLE_TERTI)
-- Repackaging: SKU → CODMAT with different quantities
-- Complex sets: One SKU → multiple CODMATs with percentage pricing (must sum to 100%)
-- Inactive articles: activ=0 (soft delete)
-
-### Orders
-- Default: ID_GESTIUNE=1, ID_SECTIE=1, ID_POL=0
-- Delivery date = order date + 1 day
-- All orders: INTERNA=0 (external)
-
-## Configuration
-
-```bash
-# .env
-ORACLE_USER=CONTAFIN_ORACLE
-ORACLE_PASSWORD=********
-ORACLE_DSN=ROA_ROMFAST
-TNS_ADMIN=/app
-```
-
-## Deploy & Depanare Windows
-
-Vezi [README.md](README.md#deploy-windows) pentru instructiuni complete de deploy si depanare pe Windows Server.
+## Deploy Windows
+
+Vezi [README.md](README.md#deploy-windows)
````
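The complex-set rule above (one SKU mapped to several CODMATs whose `procent_pret` values must sum to 100%) can be sketched as a standalone check. The function name and dict shape are illustrative assumptions, not the project's real API:

```python
# Hypothetical validator for the complex-set rule from CLAUDE.md:
# the procent_pret split across all CODMAT rows of one SKU must total 100.
def validate_complex_set(rows):
    """Raise ValueError if the percentage split does not sum to 100."""
    total = sum(r["procent_pret"] for r in rows)
    if abs(total - 100) > 1e-9:
        raise ValueError(f"procent_pret sums to {total}, expected 100")

# One SKU mapped to two CODMATs, 60% + 40%: passes silently.
validate_complex_set([
    {"codmat": "SET-A", "procent_pret": 60},
    {"codmat": "SET-B", "procent_pret": 40},
])
```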
### README.md (187 changes)

```diff
@@ -101,11 +101,11 @@ gomag-vending/
 │   │   ├── database.py          # Oracle pool + SQLite schema + migrari
 │   │   ├── routers/             # Endpoint-uri HTTP
 │   │   │   ├── health.py        # GET /health
-│   │   │   ├── dashboard.py     # GET / (HTML)
+│   │   │   ├── dashboard.py     # GET / (HTML) + /settings (HTML)
 │   │   │   ├── mappings.py      # /mappings, /api/mappings
 │   │   │   ├── articles.py      # /api/articles/search
 │   │   │   ├── validation.py    # /api/validate/*
-│   │   │   └── sync.py          # /api/sync/* + /api/dashboard/orders
+│   │   │   └── sync.py          # /api/sync/* + /api/dashboard/* + /api/settings
 │   │   ├── services/
 │   │   │   ├── gomag_client.py  # Download comenzi GoMag API
 │   │   │   ├── sync_service.py  # Orchestrare: download→validate→import
@@ -117,8 +117,8 @@ gomag-vending/
 │   │   │   ├── article_service.py
 │   │   │   ├── invoice_service.py    # Verificare facturi ROA
 │   │   │   └── scheduler_service.py  # APScheduler timer
-│   │   ├── templates/           # Jinja2 HTML
+│   │   ├── templates/           # Jinja2 (dashboard, mappings, missing_skus, logs, settings)
-│   │   └── static/              # CSS + JS
+│   │   └── static/              # CSS (style.css) + JS (dashboard, logs, mappings, settings, shared)
 │   ├── database-scripts/        # Oracle SQL (ARTICOLE_TERTI, packages)
 │   ├── data/                    # SQLite DB (import.db) + JSON orders
 │   ├── .env                     # Configurare locala (nu in git)
```
````diff
@@ -165,31 +165,130 @@ gomag-vending/
 ## Fluxul de Import
 
 ```
-1. gomag_client.py descarca comenzi GoMag API → JSON files
-2. order_reader.py parseaza JSON-urile
-3. validation_service.py valideaza SKU-uri contra ARTICOLE_TERTI + NOM_ARTICOLE
-4. import_service.py creeaza/cauta partener in Oracle (shipping person = facturare)
-5. PACK_IMPORT_COMENZI.importa_comanda_web() insereaza comanda in ROA
-6. Rezultate salvate in SQLite (orders, sync_run_orders, order_items)
+1. gomag_client.py descarca comenzi GoMag API → JSON files (paginat)
+2. order_reader.py parseaza JSON-urile, sorteaza cronologic (cele mai vechi primele)
+3. Comenzi anulate (GoMag statusId=7) → separate, sterse din Oracle daca nu au factura
+4. validation_service.py valideaza SKU-uri: ARTICOLE_TERTI (mapped) → NOM_ARTICOLE (direct) → missing
+5. Verificare existenta in Oracle (COMENZI by date range) → deja importate se sar
+6. Stale error recovery: comenzi ERROR reverificate in Oracle (crash recovery)
+7. Validare preturi + dual policy: articole rutate la id_pol_vanzare sau id_pol_productie
+8. import_service.py: cauta/creeaza partener → adrese → importa comanda in Oracle
+9. Invoice cache: verifica facturi + comenzi sterse din ROA
+10. Rezultate salvate in SQLite (orders, sync_run_orders, order_items)
 ```
 
+### Statusuri Comenzi
+
+| Status | Descriere |
+|--------|-----------|
+| `IMPORTED` | Importata nou in ROA in acest run |
+| `ALREADY_IMPORTED` | Existenta deja in Oracle, contorizata |
+| `SKIPPED` | SKU-uri lipsa → neimportata |
+| `ERROR` | Eroare la import (reverificate automat la urmatorul sync) |
+| `CANCELLED` | Comanda anulata in GoMag (statusId=7) |
+| `DELETED_IN_ROA` | A fost importata dar comanda a fost stearsa din ROA |
+
+**Regula upsert:** daca statusul existent este `IMPORTED`, nu se suprascrie cu `ALREADY_IMPORTED`.
+
 ### Reguli Business
-- **Persoana**: shipping name = persoana pe eticheta = beneficiarul facturii
-- **Adresa**: cand billing ≠ shipping → adresa shipping pentru ambele (facturare + livrare)
-- **SKU simplu**: gasit direct in NOM_ARTICOLE → nu se stocheaza in ARTICOLE_TERTI
-- **SKU cu repackaging**: un SKU → CODMAT cu cantitate diferita
-- **SKU set complex**: un SKU → multiple CODMAT-uri cu procente de pret
+
+**Parteneri & Adrese:**
+- Prioritate partener: daca exista **companie** in GoMag (billing.company_name) → firma (PJ, cod_fiscal + registru). Altfel → persoana fizica, cu **shipping name** ca nume partener
+- Adresa livrare: intotdeauna din GoMag shipping
+- Adresa facturare: daca shipping name ≠ billing name → adresa shipping pt ambele; daca aceeasi persoana → adresa billing din GoMag
+- Cautare partener in Oracle: cod_fiscal → denumire → create new (ID_UTIL = -3)
+
+**Articole & Mapari:**
+- SKU lookup: ARTICOLE_TERTI (mapped, activ=1) are prioritate fata de NOM_ARTICOLE (direct)
+- SKU simplu: gasit direct in NOM_ARTICOLE → nu se stocheaza in ARTICOLE_TERTI
+- SKU cu repackaging: un SKU → CODMAT cu cantitate diferita (`cantitate_roa`)
+- SKU set complex: un SKU → multiple CODMAT-uri cu `procent_pret` (trebuie sum = 100%)
+
+**Preturi & Discounturi:**
+- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
+- Daca pretul lipseste in politica, se insereaza automat pret=0
+- Discount VAT splitting: daca `split_discount_vat=1`, discountul se repartizeaza proportional pe cotele TVA din comanda
 
 ---
 
-## Status Implementare
+## Facturi & Cache
 
-| Faza | Status | Descriere |
-|------|--------|-----------|
-| Phase 1: Database Foundation | Complet | ARTICOLE_TERTI, PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI |
-| Phase 2: Python Integration | Complet | gomag_client.py, sync_service.py |
-| Phase 3-4: FastAPI Dashboard | Complet | UI responsive, smart polling, filter bar, paginare |
-| Phase 5: Production | In Progress | Logging done, Auth + SMTP pending |
+Facturile sunt verificate live din Oracle si cache-uite in SQLite (`factura_*` pe tabelul `orders`).
+
+### Sursa Oracle
+```sql
+SELECT id_comanda, numar_act, serie_act,
+       total_fara_tva, total_tva, total_cu_tva,
+       TO_CHAR(data_act, 'YYYY-MM-DD')
+FROM vanzari
+WHERE id_comanda IN (...) AND sters = 0
+```
+
+### Populare Cache
+1. **Dashboard** (`GET /api/dashboard/orders`) — comenzile fara cache sunt verificate live si cache-uite automat la fiecare request
+2. **Detaliu comanda** (`GET /api/sync/order/{order_number}`) — verifica Oracle live daca nu e in cache
+3. **Refresh manual** (`POST /api/dashboard/refresh-invoices`) — refresh complet pentru toate comenzile
+
+### Refresh Complet — `/api/dashboard/refresh-invoices`
+
+Face trei verificari in Oracle si actualizeaza SQLite:
+
+| Verificare | Actiune |
+|------------|---------|
+| Comenzi nefacturate → au primit factura? | Cacheaza datele facturii |
+| Comenzi facturate → factura a fost stearsa? | Sterge cache factura |
+| Toate comenzile importate → comanda stearsa din ROA? | Seteaza status `DELETED_IN_ROA` |
+
+Returneaza: `{ checked, invoices_added, invoices_cleared, orders_deleted }`
+
+---
+
+## API Reference — Sync & Comenzi
+
+### Sync
+| Method | Path | Descriere |
+|--------|------|-----------|
+| POST | `/api/sync/start` | Porneste sync in background |
+| POST | `/api/sync/stop` | Trimite semnal de stop |
+| GET | `/api/sync/status` | Status curent + progres + last_run |
+| GET | `/api/sync/history` | Istoric run-uri (paginat) |
+| GET | `/api/sync/run/{id}` | Detalii run specific |
+| GET | `/api/sync/run/{id}/log` | Log per comanda (JSON) |
+| GET | `/api/sync/run/{id}/text-log` | Log text (live din memorie sau reconstruit din SQLite) |
+| GET | `/api/sync/run/{id}/orders` | Comenzi run filtrate/paginate |
+| GET | `/api/sync/order/{number}` | Detaliu comanda + items + ARTICOLE_TERTI + factura |
+
+### Dashboard Comenzi
+| Method | Path | Descriere |
+|--------|------|-----------|
+| GET | `/api/dashboard/orders` | Comenzi cu enrichment factura |
+| POST | `/api/dashboard/refresh-invoices` | Force-refresh stare facturi + deleted orders |
+
+**Parametri `/api/dashboard/orders`:**
+- `period_days`: 3/7/30/90 sau 0 (toate sau interval custom)
+- `period_start`, `period_end`: interval custom (cand `period_days=0`)
+- `status`: `all` / `IMPORTED` / `SKIPPED` / `ERROR` / `UNINVOICED` / `INVOICED`
+- `search`, `sort_by`, `sort_dir`, `page`, `per_page`
+
+Filtrele `UNINVOICED` si `INVOICED` fac fetch din toate comenzile IMPORTED si filtreaza server-side dupa prezenta/absenta cache-ului de factura.
+
+### Scheduler
+| Method | Path | Descriere |
+|--------|------|-----------|
+| PUT | `/api/sync/schedule` | Configureaza (enabled, interval_minutes: 5/10/30) |
+| GET | `/api/sync/schedule` | Status curent |
+
+Configuratia este persistata in SQLite (`scheduler_config`).
+
+### Settings
+| Method | Path | Descriere |
+|--------|------|-----------|
+| GET | `/api/settings` | Citeste setari aplicatie |
+| PUT | `/api/settings` | Salveaza setari |
+| GET | `/api/settings/sectii` | Lista sectii Oracle |
+| GET | `/api/settings/politici` | Lista politici preturi Oracle |
+
+**Setari disponibile:** `transport_codmat`, `transport_vat`, `discount_codmat`, `discount_vat`, `transport_id_pol`, `discount_id_pol`, `id_pol`, `id_pol_productie`, `id_sectie`, `split_discount_vat`, `gomag_api_key`, `gomag_api_shop`, `gomag_order_days_back`, `gomag_limit`
 
 ---
 
````
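The upsert rule stated above, that an existing `IMPORTED` status is never overwritten by `ALREADY_IMPORTED`, boils down to a one-branch status merge. This sketch is illustrative only; the real logic lives in the sync/SQLite services:

```python
# Hypothetical status-merge helper mirroring the documented upsert rule.
def next_status(existing, incoming):
    """Keep IMPORTED rather than downgrading it to ALREADY_IMPORTED."""
    if existing == "IMPORTED" and incoming == "ALREADY_IMPORTED":
        return existing
    return incoming

print(next_status("IMPORTED", "ALREADY_IMPORTED"))  # IMPORTED
print(next_status("ERROR", "IMPORTED"))             # IMPORTED
```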
````diff
@@ -204,13 +303,20 @@ gomag-vending/
 
 Scriptul `deploy.ps1` face automat: git clone, venv, dependinte, detectare Oracle, `start.bat`, serviciu NSSM, configurare IIS reverse proxy.
 
-### Update
+### Update cod (pull + restart)
 
 ```powershell
 # Ca Administrator
 .\update.ps1
 ```
 
+Sau manual:
+```powershell
+cd C:\gomag-vending
+git pull origin main
+nssm restart GoMagVending
+```
+
 ### Configurare `.env` pe Windows
 
 ```ini
@@ -220,24 +326,39 @@ ORACLE_PASSWORD=****
 ORACLE_DSN=ROA
 TNS_ADMIN=C:\roa\instantclient_11_2_0_2
 INSTANTCLIENTPATH=C:\app\Server\product\18.0.0\dbhomeXE\bin
+SQLITE_DB_PATH=api/data/import.db
+JSON_OUTPUT_DIR=api/data/orders
+APP_PORT=5003
+ID_POL=39
+ID_GESTIUNE=0
+ID_SECTIE=6
+GOMAG_API_KEY=...
+GOMAG_API_SHOP=...
+GOMAG_ORDER_DAYS_BACK=7
+GOMAG_LIMIT=100
 ```
 
 **Important:**
 - `TNS_ADMIN` = folderul care contine `tnsnames.ora` (NU fisierul in sine)
 - `ORACLE_DSN` = alias-ul exact din `tnsnames.ora`
-- `INSTANTCLIENTPATH` = calea catre Oracle bin (thick mode)
+- `INSTANTCLIENTPATH` = calea catre Oracle bin (thick mode, Oracle 10g/11g)
+- `FORCE_THIN_MODE=true` = elimina necesitatea Instant Client (Oracle 12.1+)
+- Setarile din `.env` pot fi suprascrise din UI → `Setari` → salvate in SQLite
 
 ### Serviciu Windows (NSSM)
 
 ```powershell
-nssm restart GoMagVending   # restart
+nssm restart GoMagVending   # restart serviciu
-nssm status GoMagVending    # status
+nssm status GoMagVending    # status serviciu
-nssm stop GoMagVending      # stop
+nssm stop GoMagVending      # stop serviciu
+nssm start GoMagVending     # start serviciu
 ```
 
 Loguri serviciu: `logs/service_stdout.log`, `logs/service_stderr.log`
 Loguri aplicatie: `logs/sync_comenzi_*.log`
 
+**Nota:** Userul `gomag` nu are drepturi de admin — `nssm restart` necesita PowerShell Administrator direct pe server.
+
 ### Depanare SSH
 
 ```bash
@@ -258,9 +379,14 @@ Get-Process *python* | Select-Object Id,ProcessName,Path
 
 # Verificare loguri recente
 Get-ChildItem C:\gomag-vending\logs\*.log | Sort-Object LastWriteTime -Descending | Select-Object -First 3
-```
 
-**Nota SSH:** Userul `gomag` nu are drepturi de admin — `nssm restart` si `net stop/start` necesita PowerShell Administrator direct pe server.
+# Test sync manual (verifica ca Oracle pool porneste)
+curl http://localhost:5003/health
+curl -X POST http://localhost:5003/api/sync/start
+
+# Refresh facturi manual
+curl -X POST http://localhost:5003/api/dashboard/refresh-invoices
+```
 
 ### Probleme frecvente
 
@@ -269,6 +395,9 @@ Get-ChildItem C:\gomag-vending\logs\*.log | Sort-Object LastWriteTime -Descendin
 | `ORA-12154: TNS:could not resolve` | `TNS_ADMIN` gresit sau `tnsnames.ora` nu contine alias-ul DSN | Verifica `TNS_ADMIN` in `.env` + alias in `tnsnames.ora` |
 | `ORA-04088: LOGON_AUDIT_TRIGGER` + `Nu aveti licenta pentru PYTHON` | Trigger ROA blocheaza executabile nelicențiate | Adauga `python.exe` (calea completa) in ROASUPORT |
 | `503 Service Unavailable` pe `/api/articles/search` | Oracle pool nu s-a initializat | Verifica logul `sync_comenzi_*.log` pentru eroarea exacta |
+| Facturile nu apar in dashboard | Cache SQLite gol — invoice_service nu a putut interoga Oracle | Apasa butonul Refresh Facturi din dashboard sau `POST /api/dashboard/refresh-invoices` |
+| Comanda apare ca `DELETED_IN_ROA` | Comanda a fost stearsa manual din ROA | Normal — marcat automat la refresh |
+| Scheduler nu porneste dupa restart | Config pierduta | Verifica SQLite `scheduler_config` sau reconfigureaza din UI |
 
 ---
 
````
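The discount VAT splitting rule mentioned in the README changes (when `split_discount_vat=1`, the order discount is distributed proportionally across the VAT rates present in the order) can be sketched as a pure function. The function name and dict shape are assumptions for illustration, not the project's actual implementation:

```python
# Sketch: split a total discount across VAT rates in proportion to each
# rate's share of the order value (the split_discount_vat=1 behaviour).
def split_discount_by_vat(discount, totals_by_vat):
    """Return {vat_rate: discount_share}, rounded to 2 decimals."""
    grand_total = sum(totals_by_vat.values())
    if grand_total == 0:
        return {rate: 0.0 for rate in totals_by_vat}
    return {rate: round(discount * amount / grand_total, 2)
            for rate, amount in totals_by_vat.items()}

# 10 RON discount on an order that is 150 RON at 21% VAT and 50 RON at 9%:
print(split_discount_by_vat(10.0, {"21": 150.0, "9": 50.0}))
# {'21': 7.5, '9': 2.5}
```

Note that with rounding, the shares may not sum exactly to the discount; a real implementation would need a remainder-assignment rule.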
```diff
@@ -109,7 +109,9 @@ CREATE TABLE IF NOT EXISTS orders (
     invoice_checked_at TEXT,
     order_total REAL,
     delivery_cost REAL,
-    discount_total REAL
+    discount_total REAL,
+    web_status TEXT,
+    discount_split TEXT
 );
 CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status);
 CREATE INDEX IF NOT EXISTS idx_orders_date ON orders(order_date);
@@ -316,6 +318,8 @@ def init_sqlite():
         ("order_total", "REAL"),
         ("delivery_cost", "REAL"),
         ("discount_total", "REAL"),
+        ("web_status", "TEXT"),
+        ("discount_split", "TEXT"),
     ]:
         if col not in order_cols:
             conn.execute(f"ALTER TABLE orders ADD COLUMN {col} {typedef}")
```
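The migration loop in the hunk above only issues `ALTER TABLE ... ADD COLUMN` when the column is missing, which makes the migration idempotent. A minimal self-contained sketch of the pattern, using the stdlib `sqlite3` module and `PRAGMA table_info` to discover existing columns (the project itself may obtain `order_cols` differently):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")

# PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk);
# index 1 is the column name.
order_cols = {row[1] for row in conn.execute("PRAGMA table_info(orders)")}

for col, typedef in [("web_status", "TEXT"), ("discount_split", "TEXT")]:
    if col not in order_cols:  # skip columns that already exist
        conn.execute(f"ALTER TABLE orders ADD COLUMN {col} {typedef}")

order_cols = {row[1] for row in conn.execute("PRAGMA table_info(orders)")}
print(sorted(order_cols))  # ['discount_split', 'id', 'status', 'web_status']
```

Running the loop a second time is a no-op, which is exactly what an on-startup migration needs.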
|||||||
@@ -32,11 +32,15 @@ class AppSettingsUpdate(BaseModel):
|
|||||||
discount_vat: str = "21"
|
discount_vat: str = "21"
|
||||||
discount_id_pol: str = ""
|
discount_id_pol: str = ""
|
||||||
id_pol: str = ""
|
id_pol: str = ""
|
||||||
|
id_pol_productie: str = ""
|
||||||
id_sectie: str = ""
|
id_sectie: str = ""
|
||||||
|
id_gestiune: str = ""
|
||||||
|
split_discount_vat: str = ""
|
||||||
gomag_api_key: str = ""
|
gomag_api_key: str = ""
|
||||||
gomag_api_shop: str = ""
|
gomag_api_shop: str = ""
|
||||||
gomag_order_days_back: str = "7"
|
gomag_order_days_back: str = "7"
|
||||||
gomag_limit: str = "100"
|
gomag_limit: str = "100"
|
||||||
|
dashboard_poll_seconds: str = "5"
|
||||||
|
|
||||||
|
|
||||||
# API endpoints
|
# API endpoints
|
||||||
@@ -66,12 +70,14 @@ async def sync_status():
|
|||||||
|
|
||||||
# Build last_run from most recent completed/failed sync_runs row
|
# Build last_run from most recent completed/failed sync_runs row
|
||||||
current_run_id = status.get("run_id")
|
current_run_id = status.get("run_id")
|
||||||
|
is_running = status.get("status") == "running"
|
||||||
last_run = None
|
last_run = None
|
||||||
try:
|
try:
|
||||||
from ..database import get_sqlite
|
from ..database import get_sqlite
|
||||||
db = await get_sqlite()
|
db = await get_sqlite()
|
||||||
try:
|
try:
|
||||||
if current_run_id:
|
if current_run_id and is_running:
|
||||||
|
# Only exclude current run while it's actively running
|
||||||
cursor = await db.execute("""
|
cursor = await db.execute("""
|
||||||
SELECT * FROM sync_runs
|
SELECT * FROM sync_runs
|
||||||
WHERE status IN ('completed', 'failed') AND run_id != ?
|
WHERE status IN ('completed', 'failed') AND run_id != ?
|
||||||
@@ -313,6 +319,29 @@ def _get_articole_terti_for_skus(skus: set) -> dict:
     return result


+def _get_nom_articole_for_direct_skus(skus: set) -> dict:
+    """Query NOM_ARTICOLE for SKUs that exist directly as CODMAT (direct mapping)."""
+    from .. import database
+    result = {}
+    sku_list = list(skus)
+    conn = database.get_oracle_connection()
+    try:
+        with conn.cursor() as cur:
+            for i in range(0, len(sku_list), 500):
+                batch = sku_list[i:i+500]
+                placeholders = ",".join([f":s{j}" for j in range(len(batch))])
+                params = {f"s{j}": sku for j, sku in enumerate(batch)}
+                cur.execute(f"""
+                    SELECT codmat, denumire FROM NOM_ARTICOLE
+                    WHERE codmat IN ({placeholders}) AND sters = 0 AND inactiv = 0
+                """, params)
+                for row in cur:
+                    result[row[0]] = row[1] or ""
+    finally:
+        database.pool.release(conn)
+    return result
+
+
 @router.get("/api/sync/order/{order_number}")
 async def order_detail(order_number: str):
     """Get order detail with line items (R9), enriched with ARTICOLE_TERTI data."""
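The batched `IN`-list pattern used by `_get_nom_articole_for_direct_skus` above is worth isolating: Oracle caps an `IN` list at 1000 elements, so bind variables are emitted in fixed-size chunks. A minimal standalone sketch; the helper name and the small batch size are illustrative, not part of the codebase:

```python
def build_in_batches(values, prefix="s", batch_size=500):
    """Yield (placeholders, params) pairs for an Oracle IN (...) clause.

    Mirrors the loop above: each batch gets its own :s0..:sN bind
    variables, staying well under Oracle's 1000-element IN-list limit.
    """
    for i in range(0, len(values), batch_size):
        batch = values[i:i + batch_size]
        placeholders = ",".join(f":{prefix}{j}" for j in range(len(batch)))
        params = {f"{prefix}{j}": v for j, v in enumerate(batch)}
        yield placeholders, params

# Example: three SKUs with batch size 2 produce two batches
batches = list(build_in_batches(["A1", "B2", "C3"], batch_size=2))
# batches[0] == (":s0,:s1", {"s0": "A1", "s1": "B2"})
# batches[1] == (":s0", {"s0": "C3"})
```

Each `(placeholders, params)` pair plugs directly into `cur.execute(f"... IN ({placeholders}) ...", params)`.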
@@ -330,6 +359,23 @@ async def order_detail(order_number: str):
         if sku and sku in codmat_map:
             item["codmat_details"] = codmat_map[sku]

+    # Enrich direct SKUs (SKU=CODMAT in NOM_ARTICOLE, no ARTICOLE_TERTI entry)
+    direct_skus = {item["sku"] for item in items
+                   if item.get("sku") and item.get("mapping_status") == "direct"
+                   and not item.get("codmat_details")}
+    if direct_skus:
+        nom_map = await asyncio.to_thread(_get_nom_articole_for_direct_skus, direct_skus)
+        for item in items:
+            sku = item.get("sku")
+            if sku and sku in nom_map and not item.get("codmat_details"):
+                item["codmat_details"] = [{
+                    "codmat": sku,
+                    "cantitate_roa": 1,
+                    "procent_pret": 100,
+                    "denumire": nom_map[sku],
+                    "direct": True
+                }]
+
     # Enrich with invoice data
     order = detail.get("order", {})
     if order.get("factura_numar") and order.get("factura_data"):
@@ -363,6 +409,13 @@ async def order_detail(order_number: str):
         except Exception:
             pass

+    # Parse discount_split JSON string
+    if order.get("discount_split"):
+        try:
+            order["discount_split"] = json.loads(order["discount_split"])
+        except (json.JSONDecodeError, TypeError):
+            pass
+
     return detail


@@ -492,34 +545,73 @@ async def dashboard_orders(page: int = 1, per_page: int = 50,

 @router.post("/api/dashboard/refresh-invoices")
 async def refresh_invoices():
-    """Force-refresh invoice status from Oracle for all uninvoiced imported orders."""
-    try:
-        uninvoiced = await sqlite_service.get_uninvoiced_imported_orders()
-        if not uninvoiced:
-            return {"updated": 0, "message": "Nicio comanda de verificat"}
-
-        id_comanda_list = [o["id_comanda"] for o in uninvoiced]
-        invoice_data = await asyncio.to_thread(
-            invoice_service.check_invoices_for_orders, id_comanda_list
-        )
-        id_to_order = {o["id_comanda"]: o["order_number"] for o in uninvoiced}
-        updated = 0
-        for idc, inv in invoice_data.items():
-            order_num = id_to_order.get(idc)
-            if order_num and inv.get("facturat"):
-                await sqlite_service.update_order_invoice(
-                    order_num,
-                    serie=inv.get("serie_act"),
-                    numar=str(inv.get("numar_act", "")),
-                    total_fara_tva=inv.get("total_fara_tva"),
-                    total_tva=inv.get("total_tva"),
-                    total_cu_tva=inv.get("total_cu_tva"),
-                    data_act=inv.get("data_act"),
-                )
-                updated += 1
-        return {"updated": updated, "checked": len(uninvoiced)}
+    """Force-refresh invoice/order status from Oracle.
+
+    Checks:
+    1. Uninvoiced orders → did they get invoiced?
+    2. Invoiced orders → was the invoice deleted?
+    3. All imported orders → was the order deleted from ROA?
+    """
+    try:
+        invoices_added = 0
+        invoices_cleared = 0
+        orders_deleted = 0
+
+        # 1. Check uninvoiced → new invoices
+        uninvoiced = await sqlite_service.get_uninvoiced_imported_orders()
+        if uninvoiced:
+            id_comanda_list = [o["id_comanda"] for o in uninvoiced]
+            invoice_data = await asyncio.to_thread(
+                invoice_service.check_invoices_for_orders, id_comanda_list
+            )
+            id_to_order = {o["id_comanda"]: o["order_number"] for o in uninvoiced}
+            for idc, inv in invoice_data.items():
+                order_num = id_to_order.get(idc)
+                if order_num and inv.get("facturat"):
+                    await sqlite_service.update_order_invoice(
+                        order_num,
+                        serie=inv.get("serie_act"),
+                        numar=str(inv.get("numar_act", "")),
+                        total_fara_tva=inv.get("total_fara_tva"),
+                        total_tva=inv.get("total_tva"),
+                        total_cu_tva=inv.get("total_cu_tva"),
+                        data_act=inv.get("data_act"),
+                    )
+                    invoices_added += 1
+
+        # 2. Check invoiced → deleted invoices
+        invoiced = await sqlite_service.get_invoiced_imported_orders()
+        if invoiced:
+            id_comanda_list = [o["id_comanda"] for o in invoiced]
+            invoice_data = await asyncio.to_thread(
+                invoice_service.check_invoices_for_orders, id_comanda_list
+            )
+            for o in invoiced:
+                if o["id_comanda"] not in invoice_data:
+                    await sqlite_service.clear_order_invoice(o["order_number"])
+                    invoices_cleared += 1
+
+        # 3. Check all imported → deleted orders in ROA
+        all_imported = await sqlite_service.get_all_imported_orders()
+        if all_imported:
+            id_comanda_list = [o["id_comanda"] for o in all_imported]
+            existing_ids = await asyncio.to_thread(
+                invoice_service.check_orders_exist, id_comanda_list
+            )
+            for o in all_imported:
+                if o["id_comanda"] not in existing_ids:
+                    await sqlite_service.mark_order_deleted_in_roa(o["order_number"])
+                    orders_deleted += 1
+
+        checked = len(uninvoiced) + len(invoiced) + len(all_imported)
+        return {
+            "checked": checked,
+            "invoices_added": invoices_added,
+            "invoices_cleared": invoices_cleared,
+            "orders_deleted": orders_deleted,
+        }
     except Exception as e:
-        return {"error": str(e), "updated": 0}
+        return {"error": str(e), "invoices_added": 0}


 @router.put("/api/sync/schedule")
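The three passes in `refresh_invoices` reduce to set-membership checks between the local SQLite cache and what Oracle currently reports. A minimal in-memory sketch with invented sample data, no real service calls:

```python
# "local" rows stand in for the SQLite cache; "remote" sets stand in for
# what the Oracle checks return. All values below are made up.
local = [
    {"order_number": "W1", "id_comanda": 10, "invoiced": False},
    {"order_number": "W2", "id_comanda": 11, "invoiced": True},
    {"order_number": "W3", "id_comanda": 12, "invoiced": True},
]
remote_invoiced = {10}        # Oracle now reports an invoice for order 10
remote_existing = {10, 11}    # order 12 was deleted in ROA

# Pass 1: uninvoiced locally, invoiced remotely → record the invoice
invoices_added = [o["order_number"] for o in local
                  if not o["invoiced"] and o["id_comanda"] in remote_invoiced]
# Pass 2: invoiced locally, no longer invoiced remotely → clear it
invoices_cleared = [o["order_number"] for o in local
                    if o["invoiced"] and o["id_comanda"] not in remote_invoiced]
# Pass 3: imported locally, gone remotely → mark deleted in ROA
orders_deleted = [o["order_number"] for o in local
                  if o["id_comanda"] not in remote_existing]
```

Note that `W3` lands in both the clear and delete passes, just as it would in the endpoint: its invoice is cleared and the order is marked deleted.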
@@ -553,14 +645,18 @@ async def get_app_settings():
         "transport_vat": s.get("transport_vat", "21"),
         "discount_codmat": s.get("discount_codmat", ""),
         "transport_id_pol": s.get("transport_id_pol", ""),
-        "discount_vat": s.get("discount_vat", "19"),
+        "discount_vat": s.get("discount_vat", "21"),
         "discount_id_pol": s.get("discount_id_pol", ""),
         "id_pol": s.get("id_pol", ""),
+        "id_pol_productie": s.get("id_pol_productie", ""),
         "id_sectie": s.get("id_sectie", ""),
+        "id_gestiune": s.get("id_gestiune", ""),
+        "split_discount_vat": s.get("split_discount_vat", ""),
         "gomag_api_key": s.get("gomag_api_key", "") or config_settings.GOMAG_API_KEY,
         "gomag_api_shop": s.get("gomag_api_shop", "") or config_settings.GOMAG_API_SHOP,
         "gomag_order_days_back": s.get("gomag_order_days_back", "") or str(config_settings.GOMAG_ORDER_DAYS_BACK),
         "gomag_limit": s.get("gomag_limit", "") or str(config_settings.GOMAG_LIMIT),
+        "dashboard_poll_seconds": s.get("dashboard_poll_seconds", "5"),
     }

@@ -574,14 +670,38 @@ async def update_app_settings(config: AppSettingsUpdate):
     await sqlite_service.set_app_setting("discount_vat", config.discount_vat)
     await sqlite_service.set_app_setting("discount_id_pol", config.discount_id_pol)
     await sqlite_service.set_app_setting("id_pol", config.id_pol)
+    await sqlite_service.set_app_setting("id_pol_productie", config.id_pol_productie)
     await sqlite_service.set_app_setting("id_sectie", config.id_sectie)
+    await sqlite_service.set_app_setting("id_gestiune", config.id_gestiune)
+    await sqlite_service.set_app_setting("split_discount_vat", config.split_discount_vat)
     await sqlite_service.set_app_setting("gomag_api_key", config.gomag_api_key)
     await sqlite_service.set_app_setting("gomag_api_shop", config.gomag_api_shop)
    await sqlite_service.set_app_setting("gomag_order_days_back", config.gomag_order_days_back)
     await sqlite_service.set_app_setting("gomag_limit", config.gomag_limit)
+    await sqlite_service.set_app_setting("dashboard_poll_seconds", config.dashboard_poll_seconds)
     return {"success": True}


+@router.get("/api/settings/gestiuni")
+async def get_gestiuni():
+    """Get list of warehouses from Oracle for dropdown."""
+    def _query():
+        conn = database.get_oracle_connection()
+        try:
+            with conn.cursor() as cur:
+                cur.execute(
+                    "SELECT id_gestiune, nume_gestiune FROM nom_gestiuni WHERE sters=0 AND inactiv=0 ORDER BY id_gestiune"
+                )
+                return [{"id": str(row[0]), "label": f"{row[0]} - {row[1]}"} for row in cur]
+        finally:
+            database.pool.release(conn)
+    try:
+        return await asyncio.to_thread(_query)
+    except Exception as e:
+        logger.error(f"get_gestiuni error: {e}")
+        return []
+
+
 @router.get("/api/settings/sectii")
 async def get_sectii():
     """Get list of sections from Oracle for dropdown."""
@@ -47,6 +47,13 @@ async def download_orders(
     out_dir = Path(json_dir)
     out_dir.mkdir(parents=True, exist_ok=True)

+    # Clean old JSON files before downloading new ones
+    old_files = list(out_dir.glob("gomag_orders*.json"))
+    if old_files:
+        for f in old_files:
+            f.unlink()
+        _log(f"Șterse {len(old_files)} fișiere JSON vechi")
+
     headers = {
         "Apikey": effective_key,
         "ApiShop": effective_shop,
@@ -60,18 +60,81 @@ def format_address_for_oracle(address: str, city: str, region: str) -> str:
     return f"JUD:{region_clean};{city_clean};{address_clean}"


+def compute_discount_split(order, settings: dict) -> dict | None:
+    """Compute proportional discount split by VAT rate from order items.
+
+    Returns: {"11": 3.98, "21": 1.43} or None if split not applicable.
+    Only splits when split_discount_vat is enabled AND multiple VAT rates exist.
+    When single VAT rate: returns {actual_rate: total} (smarter than GoMag's fixed 21%).
+    """
+    if not order or order.discount_total <= 0:
+        return None
+
+    split_enabled = settings.get("split_discount_vat") == "1"
+
+    # Calculate VAT distribution from order items (exclude zero-value)
+    vat_totals = {}
+    for item in order.items:
+        item_value = abs(item.price * item.quantity)
+        if item_value > 0:
+            vat_key = str(int(item.vat)) if item.vat == int(item.vat) else str(item.vat)
+            vat_totals[vat_key] = vat_totals.get(vat_key, 0) + item_value
+
+    if not vat_totals:
+        return None
+
+    grand_total = sum(vat_totals.values())
+    if grand_total <= 0:
+        return None
+
+    if len(vat_totals) == 1:
+        # Single VAT rate — use that rate (smarter than GoMag's fixed 21%)
+        actual_vat = list(vat_totals.keys())[0]
+        return {actual_vat: round(order.discount_total, 2)}
+
+    if not split_enabled:
+        return None
+
+    # Multiple VAT rates — split proportionally
+    result = {}
+    discount_remaining = order.discount_total
+    sorted_rates = sorted(vat_totals.keys(), key=lambda x: float(x))
+
+    for i, vat_rate in enumerate(sorted_rates):
+        if i == len(sorted_rates) - 1:
+            split_amount = round(discount_remaining, 2)  # last gets remainder
+        else:
+            proportion = vat_totals[vat_rate] / grand_total
+            split_amount = round(order.discount_total * proportion, 2)
+            discount_remaining -= split_amount
+
+        if split_amount > 0:
+            result[vat_rate] = split_amount
+
+    return result if result else None
+
+
 def build_articles_json(items, order=None, settings=None) -> str:
     """Build JSON string for Oracle PACK_IMPORT_COMENZI.importa_comanda.
-    Includes transport and discount as extra articles if configured."""
+    Includes transport and discount as extra articles if configured.
+    Supports per-article id_pol from codmat_policy_map and discount VAT splitting."""
     articles = []
+    codmat_policy_map = settings.get("_codmat_policy_map", {}) if settings else {}
+    default_id_pol = settings.get("id_pol", "") if settings else ""
+
     for item in items:
-        articles.append({
+        article_dict = {
             "sku": item.sku,
             "quantity": str(item.quantity),
             "price": str(item.price),
             "vat": str(item.vat),
             "name": clean_web_text(item.name)
-        })
+        }
+        # Per-article id_pol from dual-policy validation
+        item_pol = codmat_policy_map.get(item.sku)
+        if item_pol and str(item_pol) != str(default_id_pol):
+            article_dict["id_pol"] = str(item_pol)
+        articles.append(article_dict)
+
     if order and settings:
         transport_codmat = settings.get("transport_codmat", "")
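The proportional split in `compute_discount_split` rounds each share to two decimals and hands the rounding remainder to the last VAT rate, so the parts always sum back to the original discount. A worked standalone example; `split_discount` is a minimal re-implementation for illustration, not the project function:

```python
def split_discount(discount_total, vat_totals):
    """Proportional split with the rounding remainder on the last rate.

    vat_totals maps a VAT-rate string to the item value carried at that
    rate, e.g. {"11": 100.0, "21": 50.0}; inputs are assumed pre-filtered.
    """
    grand_total = sum(vat_totals.values())
    result, remaining = {}, discount_total
    rates = sorted(vat_totals, key=float)
    for i, rate in enumerate(rates):
        if i == len(rates) - 1:
            amount = round(remaining, 2)  # last rate absorbs rounding drift
        else:
            amount = round(discount_total * vat_totals[rate] / grand_total, 2)
            remaining -= amount
        if amount > 0:
            result[rate] = amount
    return result

# 100 RON of items at 11% VAT, 50 RON at 21%, 10 RON discount:
print(split_discount(10.0, {"11": 100.0, "21": 50.0}))
# {'11': 6.67, '21': 3.33}
```

The naive alternative, rounding every share independently, can drift by a cent; giving the remainder to the last rate keeps the lines reconcilable against the order total.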
@@ -90,25 +153,55 @@ def build_articles_json(items, order=None, settings=None) -> str:
         if settings.get("transport_id_pol"):
             article_dict["id_pol"] = settings["transport_id_pol"]
         articles.append(article_dict)
-        # Discount total with quantity -1 (positive price)
+
+        # Discount — smart VAT splitting
         if order.discount_total > 0 and discount_codmat:
-            # Use GoMag JSON discount VAT if available, fallback to settings
-            discount_vat = getattr(order, 'discount_vat', None) or settings.get("discount_vat", "19")
-            article_dict = {
-                "sku": discount_codmat,
-                "quantity": "-1",
-                "price": str(order.discount_total),
-                "vat": discount_vat,
-                "name": "Discount"
-            }
-            if settings.get("discount_id_pol"):
-                article_dict["id_pol"] = settings["discount_id_pol"]
-            articles.append(article_dict)
+            discount_split = compute_discount_split(order, settings)
+
+            if discount_split and len(discount_split) > 1:
+                # Multiple VAT rates — multiple discount lines
+                for vat_rate, split_amount in sorted(discount_split.items(), key=lambda x: float(x[0])):
+                    article_dict = {
+                        "sku": discount_codmat,
+                        "quantity": "-1",
+                        "price": str(split_amount),
+                        "vat": vat_rate,
+                        "name": f"Discount (TVA {vat_rate}%)"
+                    }
+                    if settings.get("discount_id_pol"):
+                        article_dict["id_pol"] = settings["discount_id_pol"]
+                    articles.append(article_dict)
+            elif discount_split and len(discount_split) == 1:
+                # Single VAT rate — use detected rate
+                actual_vat = list(discount_split.keys())[0]
+                article_dict = {
+                    "sku": discount_codmat,
+                    "quantity": "-1",
+                    "price": str(order.discount_total),
+                    "vat": actual_vat,
+                    "name": "Discount"
+                }
+                if settings.get("discount_id_pol"):
+                    article_dict["id_pol"] = settings["discount_id_pol"]
+                articles.append(article_dict)
+            else:
+                # Fallback — original behavior with GoMag VAT or settings default
+                discount_vat = getattr(order, 'discount_vat', None) or settings.get("discount_vat", "21")
+                article_dict = {
+                    "sku": discount_codmat,
+                    "quantity": "-1",
+                    "price": str(order.discount_total),
+                    "vat": discount_vat,
+                    "name": "Discount"
+                }
+                if settings.get("discount_id_pol"):
+                    article_dict["id_pol"] = settings["discount_id_pol"]
+                articles.append(article_dict)
+
     return json.dumps(articles)


-def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_settings: dict = None) -> dict:
+def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_settings: dict = None, id_gestiuni: list[int] = None) -> dict:
     """Import a single order into Oracle ROA.

     Returns dict with:
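In the multi-rate branch above, each VAT rate becomes its own negative-quantity article. The resulting JSON shape can be sketched as follows; the `DISC` CODMAT and the amounts are invented, and only the string-typed keys mirror what `build_articles_json` emits:

```python
import json

# Illustrative output shape: one discount line per VAT rate, quantity -1,
# price kept positive, every value serialized as a string.
discount_split = {"11": 6.67, "21": 3.33}
articles = [
    {
        "sku": "DISC",            # assumed discount CODMAT from settings
        "quantity": "-1",
        "price": str(amount),
        "vat": rate,
        "name": f"Discount (TVA {rate}%)",
    }
    for rate, amount in sorted(discount_split.items(), key=lambda x: float(x[0]))
]
payload = json.dumps(articles)
```

Keeping everything as strings matches the per-item serialization earlier in the function, where `quantity`, `price`, and `vat` are all passed through `str()` before the JSON is handed to the PL/SQL package.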
@@ -246,6 +339,9 @@ def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_se

         id_comanda = cur.var(oracledb.DB_TYPE_NUMBER)

+        # Convert list[int] to CSV string for Oracle VARCHAR2 param
+        id_gestiune_csv = ",".join(str(g) for g in id_gestiuni) if id_gestiuni else None
+
         cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
             order_number,       # p_nr_comanda_ext
             order_date,         # p_data_comanda
@@ -255,6 +351,7 @@ def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_se
             addr_fact_id,       # p_id_adresa_facturare
             id_pol,             # p_id_pol
             id_sectie,          # p_id_sectie
+            id_gestiune_csv,    # p_id_gestiune (CSV string)
             id_comanda          # v_id_comanda (OUT)
         ])

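The new `p_id_gestiune` parameter is a VARCHAR2 on the Oracle side, so the Python list of warehouse IDs is flattened to CSV, with `None` signalling no warehouse filter. A standalone sketch of that conversion; the helper name is illustrative:

```python
def to_csv_param(ids):
    """Flatten list[int] to the CSV string a VARCHAR2 parameter expects.

    Empty or missing input maps to None, which python-oracledb binds as
    SQL NULL, letting the stored procedure treat it as "no filter".
    """
    return ",".join(str(g) for g in ids) if ids else None
```

Usage matches the one-liner added above: `to_csv_param([3, 7, 12])` gives `"3,7,12"`, while `to_csv_param([])` and `to_csv_param(None)` both give `None`.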
@@ -294,3 +391,51 @@ def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_se
             pass

     return result
+
+
+def soft_delete_order_in_roa(id_comanda: int) -> dict:
+    """Soft-delete an order in Oracle ROA (set sters=1 on comenzi + comenzi_detalii).
+    Returns {"success": bool, "error": str|None, "details_deleted": int}
+    """
+    result = {"success": False, "error": None, "details_deleted": 0}
+
+    if database.pool is None:
+        result["error"] = "Oracle pool not initialized"
+        return result
+
+    conn = None
+    try:
+        conn = database.pool.acquire()
+        with conn.cursor() as cur:
+            # Soft-delete order details
+            cur.execute(
+                "UPDATE comenzi_detalii SET sters = 1 WHERE id_comanda = :1 AND sters = 0",
+                [id_comanda]
+            )
+            result["details_deleted"] = cur.rowcount
+
+            # Soft-delete the order itself
+            cur.execute(
+                "UPDATE comenzi SET sters = 1 WHERE id_comanda = :1 AND sters = 0",
+                [id_comanda]
+            )
+
+        conn.commit()
+        result["success"] = True
+        logger.info(f"Soft-deleted order ID={id_comanda} in Oracle ROA ({result['details_deleted']} details)")
+    except Exception as e:
+        result["error"] = str(e)
+        logger.error(f"Error soft-deleting order ID={id_comanda}: {e}")
+        if conn:
+            try:
+                conn.rollback()
+            except Exception:
+                pass
+    finally:
+        if conn:
+            try:
+                database.pool.release(conn)
+            except Exception:
+                pass
+
+    return result
@@ -43,3 +43,33 @@ def check_invoices_for_orders(id_comanda_list: list) -> dict:
         database.pool.release(conn)

     return result
+
+
+def check_orders_exist(id_comanda_list: list) -> set:
+    """Check which id_comanda values still exist in Oracle COMENZI (sters=0).
+    Returns set of id_comanda that exist.
+    """
+    if not id_comanda_list or database.pool is None:
+        return set()
+
+    existing = set()
+    conn = database.get_oracle_connection()
+    try:
+        with conn.cursor() as cur:
+            for i in range(0, len(id_comanda_list), 500):
+                batch = id_comanda_list[i:i+500]
+                placeholders = ",".join([f":c{j}" for j in range(len(batch))])
+                params = {f"c{j}": cid for j, cid in enumerate(batch)}
+
+                cur.execute(f"""
+                    SELECT id_comanda FROM COMENZI
+                    WHERE id_comanda IN ({placeholders}) AND sters = 0
+                """, params)
+                for row in cur:
+                    existing.add(row[0])
+    except Exception as e:
+        logger.warning(f"Order existence check failed: {e}")
+    finally:
+        database.pool.release(conn)
+
+    return existing
@@ -159,6 +159,24 @@ def create_mapping(sku: str, codmat: str, cantitate_roa: float = 1, procent_pret

     with database.pool.acquire() as conn:
         with conn.cursor() as cur:
+            # Validate CODMAT exists in NOM_ARTICOLE
+            cur.execute("""
+                SELECT COUNT(*) FROM NOM_ARTICOLE
+                WHERE codmat = :codmat AND sters = 0 AND inactiv = 0
+            """, {"codmat": codmat})
+            if cur.fetchone()[0] == 0:
+                raise HTTPException(status_code=400, detail="CODMAT-ul nu exista in nomenclator")
+
+            # Warn if SKU is already a direct CODMAT in NOM_ARTICOLE
+            if sku == codmat:
+                cur.execute("""
+                    SELECT COUNT(*) FROM NOM_ARTICOLE
+                    WHERE codmat = :sku AND sters = 0 AND inactiv = 0
+                """, {"sku": sku})
+                if cur.fetchone()[0] > 0:
+                    raise HTTPException(status_code=409,
+                        detail="SKU-ul exista direct in nomenclator ca CODMAT, nu necesita mapare")
+
             # Check for active duplicate
             cur.execute("""
                 SELECT COUNT(*) FROM ARTICOLE_TERTI
@@ -1,8 +1,16 @@
 import json
 import logging
 from datetime import datetime
+from zoneinfo import ZoneInfo
 from ..database import get_sqlite, get_sqlite_sync

+_tz_bucharest = ZoneInfo("Europe/Bucharest")
+
+
+def _now_str():
+    """Return current Bucharest time as ISO string."""
+    return datetime.now(_tz_bucharest).replace(tzinfo=None).isoformat()
+
 logger = logging.getLogger(__name__)


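The move away from SQLite's `datetime('now')`, which returns UTC, to an application-side Bucharest wall-clock timestamp can be exercised in isolation. This mirrors `_now_str` above; only the sample output format comment is illustrative:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

_tz_bucharest = ZoneInfo("Europe/Bucharest")

def now_str():
    """Bucharest wall-clock time as a naive ISO string.

    Stripping tzinfo keeps the stored text free of a +02:00/+03:00 offset
    suffix, so it sorts and displays consistently in the SQLite tables.
    """
    return datetime.now(_tz_bucharest).replace(tzinfo=None).isoformat()

stamp = now_str()
# e.g. '2025-06-01T14:32:07.123456' (date separator 'T', no offset suffix)
```

The trade-off of this design choice: timestamps become local wall-clock values, readable for users in Romania, but no longer directly comparable with any UTC rows written earlier by `datetime('now')`.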
@@ -12,8 +20,8 @@ async def create_sync_run(run_id: str, json_files: int = 0):
     try:
         await db.execute("""
             INSERT INTO sync_runs (run_id, started_at, status, json_files)
-            VALUES (?, datetime('now'), 'running', ?)
-        """, (run_id, json_files))
+            VALUES (?, ?, 'running', ?)
+        """, (run_id, _now_str(), json_files))
         await db.commit()
     finally:
         await db.close()
@@ -28,7 +36,7 @@ async def update_sync_run(run_id: str, status: str, total_orders: int = 0,
     try:
         await db.execute("""
             UPDATE sync_runs SET
-                finished_at = datetime('now'),
+                finished_at = ?,
                 status = ?,
                 total_orders = ?,
                 imported = ?,
@@ -38,7 +46,7 @@ async def update_sync_run(run_id: str, status: str, total_orders: int = 0,
                 already_imported = ?,
                 new_imported = ?
             WHERE run_id = ?
-        """, (status, total_orders, imported, skipped, errors, error_message,
+        """, (_now_str(), status, total_orders, imported, skipped, errors, error_message,
               already_imported, new_imported, run_id))
         await db.commit()
     finally:
@@ -52,7 +60,8 @@ async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
                        shipping_name: str = None, billing_name: str = None,
                        payment_method: str = None, delivery_method: str = None,
                        order_total: float = None,
-                       delivery_cost: float = None, discount_total: float = None):
+                       delivery_cost: float = None, discount_total: float = None,
+                       web_status: str = None, discount_split: str = None):
     """Upsert a single order — one row per order_number, status updated in place."""
     db = await get_sqlite()
     try:
@@ -62,9 +71,10 @@ async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
                 id_comanda, id_partener, error_message, missing_skus, items_count,
                 last_sync_run_id, shipping_name, billing_name,
                 payment_method, delivery_method, order_total,
-                delivery_cost, discount_total)
-            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+                delivery_cost, discount_total, web_status, discount_split)
+            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
             ON CONFLICT(order_number) DO UPDATE SET
+                customer_name = excluded.customer_name,
                 status = CASE
                     WHEN orders.status = 'IMPORTED' AND excluded.status = 'ALREADY_IMPORTED'
                     THEN orders.status
@@ -86,13 +96,15 @@ async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
                 order_total = COALESCE(excluded.order_total, orders.order_total),
                 delivery_cost = COALESCE(excluded.delivery_cost, orders.delivery_cost),
                 discount_total = COALESCE(excluded.discount_total, orders.discount_total),
+                web_status = COALESCE(excluded.web_status, orders.web_status),
+                discount_split = COALESCE(excluded.discount_split, orders.discount_split),
                 updated_at = datetime('now')
         """, (order_number, order_date, customer_name, status,
               id_comanda, id_partener, error_message,
               json.dumps(missing_skus) if missing_skus else None,
               items_count, sync_run_id, shipping_name, billing_name,
               payment_method, delivery_method, order_total,
-              delivery_cost, discount_total))
+              delivery_cost, discount_total, web_status, discount_split))
         await db.commit()
     finally:
         await db.close()
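The upsert above relies on two SQL idioms: `COALESCE(excluded.col, orders.col)` keeps the stored value whenever the incoming one is NULL, and the `status` CASE refuses to downgrade `IMPORTED` to `ALREADY_IMPORTED`. A trimmed in-memory demonstration, reduced to two columns for clarity:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_number TEXT PRIMARY KEY, status TEXT, discount_split TEXT)")
upsert = """
INSERT INTO orders (order_number, status, discount_split) VALUES (?, ?, ?)
ON CONFLICT(order_number) DO UPDATE SET
    status = CASE
        WHEN orders.status = 'IMPORTED' AND excluded.status = 'ALREADY_IMPORTED'
        THEN orders.status ELSE excluded.status END,
    discount_split = COALESCE(excluded.discount_split, orders.discount_split)
"""
# First sync imports the order and records a split
db.execute(upsert, ("W100", "IMPORTED", '{"21": 5.0}'))
# A later re-sync reports ALREADY_IMPORTED and carries no split
db.execute(upsert, ("W100", "ALREADY_IMPORTED", None))
row = db.execute("SELECT status, discount_split FROM orders").fetchone()
print(row)  # ('IMPORTED', '{"21": 5.0}')
```

Both protections fire on the second execute: the NULL split does not wipe the stored JSON, and the status stays `IMPORTED`. Note this requires SQLite 3.24+ for `ON CONFLICT ... DO UPDATE`.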
@@ -117,7 +129,8 @@ async def save_orders_batch(orders_data: list[dict]):
     Each dict must have: sync_run_id, order_number, order_date, customer_name, status,
     id_comanda, id_partener, error_message, missing_skus (list|None), items_count,
     shipping_name, billing_name, payment_method, delivery_method, status_at_run,
-    items (list of item dicts), delivery_cost (optional), discount_total (optional).
+    items (list of item dicts), delivery_cost (optional), discount_total (optional),
+    web_status (optional).
     """
     if not orders_data:
         return
@@ -130,9 +143,10 @@ async def save_orders_batch(orders_data: list[dict]):
                 id_comanda, id_partener, error_message, missing_skus, items_count,
                 last_sync_run_id, shipping_name, billing_name,
                 payment_method, delivery_method, order_total,
-                delivery_cost, discount_total)
-            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+                delivery_cost, discount_total, web_status, discount_split)
+            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
             ON CONFLICT(order_number) DO UPDATE SET
+                customer_name = excluded.customer_name,
                 status = CASE
                     WHEN orders.status = 'IMPORTED' AND excluded.status = 'ALREADY_IMPORTED'
                     THEN orders.status
@@ -154,6 +168,8 @@ async def save_orders_batch(orders_data: list[dict]):
                 order_total = COALESCE(excluded.order_total, orders.order_total),
                 delivery_cost = COALESCE(excluded.delivery_cost, orders.delivery_cost),
                 discount_total = COALESCE(excluded.discount_total, orders.discount_total),
+                web_status = COALESCE(excluded.web_status, orders.web_status),
+                discount_split = COALESCE(excluded.discount_split, orders.discount_split),
                 updated_at = datetime('now')
         """, [
             (d["order_number"], d["order_date"], d["customer_name"], d["status"],
@@ -163,7 +179,8 @@ async def save_orders_batch(orders_data: list[dict]):
              d.get("shipping_name"), d.get("billing_name"),
             d.get("payment_method"), d.get("delivery_method"),
             d.get("order_total"),
-             d.get("delivery_cost"), d.get("discount_total"))
+             d.get("delivery_cost"), d.get("discount_total"),
+             d.get("web_status"), d.get("discount_split"))
            for d in orders_data
        ])

@@ -617,6 +634,7 @@ async def get_run_orders_filtered(run_id: str, status_filter: str = "all",
|
|||||||
"skipped": status_counts.get("SKIPPED", 0),
|
"skipped": status_counts.get("SKIPPED", 0),
|
||||||
"error": status_counts.get("ERROR", 0),
|
"error": status_counts.get("ERROR", 0),
|
||||||
"already_imported": status_counts.get("ALREADY_IMPORTED", 0),
|
"already_imported": status_counts.get("ALREADY_IMPORTED", 0),
|
||||||
|
"cancelled": status_counts.get("CANCELLED", 0),
|
||||||
"total": sum(status_counts.values())
|
"total": sum(status_counts.values())
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -713,6 +731,7 @@ async def get_orders(page: int = 1, per_page: int = 50,
|
|||||||
"imported_all": status_counts.get("IMPORTED", 0) + status_counts.get("ALREADY_IMPORTED", 0),
|
"imported_all": status_counts.get("IMPORTED", 0) + status_counts.get("ALREADY_IMPORTED", 0),
|
||||||
"skipped": status_counts.get("SKIPPED", 0),
|
"skipped": status_counts.get("SKIPPED", 0),
|
||||||
"error": status_counts.get("ERROR", 0),
|
"error": status_counts.get("ERROR", 0),
|
||||||
|
"cancelled": status_counts.get("CANCELLED", 0),
|
||||||
"total": sum(status_counts.values()),
|
"total": sum(status_counts.values()),
|
||||||
"uninvoiced_sqlite": uninvoiced_sqlite,
|
"uninvoiced_sqlite": uninvoiced_sqlite,
|
||||||
}
|
}
|
||||||
@@ -781,6 +800,109 @@ async def update_order_invoice(order_number: str, serie: str = None,
|
|||||||
await db.close()
|
await db.close()
|
||||||
|
|
||||||
|
|
||||||
|
async def get_invoiced_imported_orders() -> list:
|
||||||
|
"""Get imported orders that HAVE cached invoice data (for re-verification)."""
|
||||||
|
db = await get_sqlite()
|
||||||
|
try:
|
||||||
|
cursor = await db.execute("""
|
||||||
|
SELECT order_number, id_comanda FROM orders
|
||||||
|
WHERE status IN ('IMPORTED', 'ALREADY_IMPORTED')
|
||||||
|
AND id_comanda IS NOT NULL
|
||||||
|
AND factura_numar IS NOT NULL AND factura_numar != ''
|
||||||
|
""")
|
||||||
|
rows = await cursor.fetchall()
|
||||||
|
return [dict(r) for r in rows]
|
||||||
|
finally:
|
||||||
|
await db.close()
|
||||||
|
|
||||||
|
|
||||||
|
async def get_all_imported_orders() -> list:
|
||||||
|
"""Get ALL imported orders with id_comanda (for checking if deleted in ROA)."""
|
||||||
|
db = await get_sqlite()
|
||||||
|
try:
|
||||||
|
cursor = await db.execute("""
|
||||||
|
SELECT order_number, id_comanda FROM orders
|
||||||
|
WHERE status IN ('IMPORTED', 'ALREADY_IMPORTED')
|
||||||
|
AND id_comanda IS NOT NULL
|
||||||
|
""")
|
||||||
|
rows = await cursor.fetchall()
|
||||||
|
return [dict(r) for r in rows]
|
||||||
|
finally:
|
||||||
|
await db.close()
|
||||||
|
|
||||||
|
|
||||||
|
async def clear_order_invoice(order_number: str):
|
||||||
|
"""Clear cached invoice data when invoice was deleted in ROA."""
|
||||||
|
db = await get_sqlite()
|
||||||
|
try:
|
||||||
|
await db.execute("""
|
||||||
|
UPDATE orders SET
|
||||||
|
factura_serie = NULL,
|
||||||
|
factura_numar = NULL,
|
||||||
|
factura_total_fara_tva = NULL,
|
||||||
|
factura_total_tva = NULL,
|
||||||
|
factura_total_cu_tva = NULL,
|
||||||
|
factura_data = NULL,
|
||||||
|
invoice_checked_at = datetime('now'),
|
||||||
|
updated_at = datetime('now')
|
||||||
|
WHERE order_number = ?
|
||||||
|
""", (order_number,))
|
||||||
|
await db.commit()
|
||||||
|
finally:
|
||||||
|
await db.close()
|
||||||
|
|
||||||
|
|
||||||
|
async def mark_order_deleted_in_roa(order_number: str):
|
||||||
|
"""Mark an order as deleted in ROA — clears id_comanda and invoice cache."""
|
||||||
|
db = await get_sqlite()
|
||||||
|
try:
|
||||||
|
await db.execute("""
|
||||||
|
UPDATE orders SET
|
||||||
|
status = 'DELETED_IN_ROA',
|
||||||
|
id_comanda = NULL,
|
||||||
|
id_partener = NULL,
|
||||||
|
factura_serie = NULL,
|
||||||
|
factura_numar = NULL,
|
||||||
|
factura_total_fara_tva = NULL,
|
||||||
|
factura_total_tva = NULL,
|
||||||
|
factura_total_cu_tva = NULL,
|
||||||
|
factura_data = NULL,
|
||||||
|
invoice_checked_at = NULL,
|
||||||
|
error_message = 'Comanda stearsa din ROA',
|
||||||
|
updated_at = datetime('now')
|
||||||
|
WHERE order_number = ?
|
||||||
|
""", (order_number,))
|
||||||
|
await db.commit()
|
||||||
|
finally:
|
||||||
|
await db.close()
|
||||||
|
|
||||||
|
|
||||||
|
async def mark_order_cancelled(order_number: str, web_status: str = "Anulata"):
|
||||||
|
"""Mark an order as cancelled from GoMag. Clears id_comanda and invoice cache."""
|
||||||
|
db = await get_sqlite()
|
||||||
|
try:
|
||||||
|
await db.execute("""
|
||||||
|
UPDATE orders SET
|
||||||
|
status = 'CANCELLED',
|
||||||
|
id_comanda = NULL,
|
||||||
|
id_partener = NULL,
|
||||||
|
factura_serie = NULL,
|
||||||
|
factura_numar = NULL,
|
||||||
|
factura_total_fara_tva = NULL,
|
||||||
|
factura_total_tva = NULL,
|
||||||
|
factura_total_cu_tva = NULL,
|
||||||
|
factura_data = NULL,
|
||||||
|
invoice_checked_at = NULL,
|
||||||
|
web_status = ?,
|
||||||
|
error_message = 'Comanda anulata in GoMag',
|
||||||
|
updated_at = datetime('now')
|
||||||
|
WHERE order_number = ?
|
||||||
|
""", (web_status, order_number))
|
||||||
|
await db.commit()
|
||||||
|
finally:
|
||||||
|
await db.close()
|
||||||
|
|
||||||
|
|
||||||
# ── App Settings ─────────────────────────────────
|
# ── App Settings ─────────────────────────────────
|
||||||
|
|
||||||
async def get_app_settings() -> dict:
|
async def get_app_settings() -> dict:
|
||||||
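The `ON CONFLICT ... DO UPDATE` hunks above use SQLite's upsert with `COALESCE(excluded.col, orders.col)` so that a re-sync carrying `NULL` for a column (e.g. `web_status`) never wipes a previously cached value. A minimal standalone sketch of that pattern, with a simplified table that is not the project's real schema:

```python
import sqlite3

# In-memory table with just the columns needed to show the pattern.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_number TEXT PRIMARY KEY,
        status TEXT,
        web_status TEXT
    )
""")

def upsert(order_number, status, web_status):
    # excluded.* refers to the row that failed to insert;
    # COALESCE keeps the stored value when the incoming one is NULL.
    conn.execute("""
        INSERT INTO orders (order_number, status, web_status)
        VALUES (?, ?, ?)
        ON CONFLICT(order_number) DO UPDATE SET
            status = excluded.status,
            web_status = COALESCE(excluded.web_status, orders.web_status)
    """, (order_number, status, web_status))

upsert("1001", "IMPORTED", "Finalizata")
upsert("1001", "IMPORTED", None)  # NULL must not erase the cached web_status
row = conn.execute(
    "SELECT status, web_status FROM orders WHERE order_number = '1001'"
).fetchone()
```

Here the second upsert passes `None`, and the stored `web_status` survives; note that `ON CONFLICT ... DO UPDATE` requires SQLite 3.24+.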
@@ -3,6 +3,14 @@ import json
 import logging
 import uuid
 from datetime import datetime, timedelta
+from zoneinfo import ZoneInfo
+
+_tz_bucharest = ZoneInfo("Europe/Bucharest")
+
+
+def _now():
+    """Return current time in Bucharest timezone (naive, for display/storage)."""
+    return datetime.now(_tz_bucharest).replace(tzinfo=None)
+
 from . import order_reader, validation_service, import_service, sqlite_service, invoice_service, gomag_client
 from ..config import settings
@@ -22,7 +30,7 @@ def _log_line(run_id: str, message: str):
     """Append a timestamped line to the in-memory log buffer."""
     if run_id not in _run_logs:
         _run_logs[run_id] = []
-    ts = datetime.now().strftime("%H:%M:%S")
+    ts = _now().strftime("%H:%M:%S")
     _run_logs[run_id].append(f"[{ts}] {message}")


@@ -62,30 +70,37 @@ async def prepare_sync(id_pol: int = None, id_sectie: int = None) -> dict:
     if _sync_lock.locked():
         return {"error": "Sync already running", "run_id": _current_sync.get("run_id") if _current_sync else None}

-    run_id = datetime.now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
+    run_id = _now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
     _current_sync = {
         "run_id": run_id,
         "status": "running",
-        "started_at": datetime.now().isoformat(),
+        "started_at": _now().isoformat(),
         "finished_at": None,
         "phase": "starting",
         "phase_text": "Starting...",
         "progress_current": 0,
         "progress_total": 0,
-        "counts": {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0},
+        "counts": {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0, "cancelled": 0},
     }
     return {"run_id": run_id, "status": "starting"}


 def _derive_customer_info(order):
-    """Extract shipping/billing names and customer from an order."""
+    """Extract shipping/billing names and customer from an order.
+
+    customer = who appears on the invoice (partner in ROA):
+    - company name if billing is on a company
+    - shipping person name otherwise (consistent with import_service partner logic)
+    """
     shipping_name = ""
     if order.shipping:
         shipping_name = f"{getattr(order.shipping, 'firstname', '') or ''} {getattr(order.shipping, 'lastname', '') or ''}".strip()
     billing_name = f"{getattr(order.billing, 'firstname', '') or ''} {getattr(order.billing, 'lastname', '') or ''}".strip()
     if not shipping_name:
         shipping_name = billing_name
-    customer = shipping_name or order.billing.company_name or billing_name
+    if order.billing.is_company and order.billing.company_name:
+        customer = order.billing.company_name
+    else:
+        customer = shipping_name or billing_name
     payment_method = getattr(order, 'payment_name', None) or None
     delivery_method = getattr(order, 'delivery_name', None) or None
     return shipping_name, billing_name, customer, payment_method, delivery_method
@@ -135,22 +150,22 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
     async with _sync_lock:
         # Use provided run_id or generate one
         if not run_id:
-            run_id = datetime.now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
+            run_id = _now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
         _current_sync = {
             "run_id": run_id,
             "status": "running",
-            "started_at": datetime.now().isoformat(),
+            "started_at": _now().isoformat(),
             "finished_at": None,
             "phase": "reading",
             "phase_text": "Reading JSON files...",
             "progress_current": 0,
             "progress_total": 0,
-            "counts": {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0},
+            "counts": {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0, "cancelled": 0},
         }

         _update_progress("reading", "Reading JSON files...")

-        started_dt = datetime.now()
+        started_dt = _now()
         _run_logs[run_id] = [
             f"=== Sync Run {run_id} ===",
             f"Inceput: {started_dt.strftime('%d.%m.%Y %H:%M:%S')}",
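The `_now()` helper introduced above pins run IDs and log timestamps to the Europe/Bucharest wall clock instead of the server's local time, then strips `tzinfo` so the result compares cleanly with the naive datetimes used elsewhere. A self-contained sketch of the same idea:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

_tz_bucharest = ZoneInfo("Europe/Bucharest")

def _now() -> datetime:
    # Current Bucharest time, made naive for storage/display,
    # matching the helper added in the diff.
    return datetime.now(_tz_bucharest).replace(tzinfo=None)

stamp = _now()
# Same format the sync uses to build run IDs, e.g. "20250101_120000".
run_id_prefix = stamp.strftime("%Y%m%d_%H%M%S")
```

The trade-off of storing naive local time is that values are unambiguous only as long as everything is interpreted in the same zone; around DST transitions, naive timestamps can repeat.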
@@ -205,6 +220,104 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
             summary = {"run_id": run_id, "status": "completed", "message": "No orders found", "json_files": json_count}
             return summary

+        # ── Separate cancelled orders (GoMag status "Anulata" / statusId "7") ──
+        cancelled_orders = [o for o in orders if o.status_id == "7" or (o.status and o.status.lower() == "anulata")]
+        active_orders = [o for o in orders if o not in cancelled_orders]
+        cancelled_count = len(cancelled_orders)
+
+        if cancelled_orders:
+            _log_line(run_id, f"Comenzi anulate in GoMag: {cancelled_count}")
+
+            # Record cancelled orders in SQLite
+            cancelled_batch = []
+            for order in cancelled_orders:
+                shipping_name, billing_name, customer, payment_method, delivery_method = _derive_customer_info(order)
+                order_items_data = [
+                    {"sku": item.sku, "product_name": item.name,
+                     "quantity": item.quantity, "price": item.price, "vat": item.vat,
+                     "mapping_status": "unknown", "codmat": None,
+                     "id_articol": None, "cantitate_roa": None}
+                    for item in order.items
+                ]
+                cancelled_batch.append({
+                    "sync_run_id": run_id, "order_number": order.number,
+                    "order_date": order.date, "customer_name": customer,
+                    "status": "CANCELLED", "status_at_run": "CANCELLED",
+                    "id_comanda": None, "id_partener": None,
+                    "error_message": "Comanda anulata in GoMag",
+                    "missing_skus": None,
+                    "items_count": len(order.items),
+                    "shipping_name": shipping_name, "billing_name": billing_name,
+                    "payment_method": payment_method, "delivery_method": delivery_method,
+                    "order_total": order.total or None,
+                    "delivery_cost": order.delivery_cost or None,
+                    "discount_total": order.discount_total or None,
+                    "web_status": order.status or "Anulata",
+                    "items": order_items_data,
+                })
+                _log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → ANULAT in GoMag")
+
+            await sqlite_service.save_orders_batch(cancelled_batch)
+
+            # Check if any cancelled orders were previously imported
+            from ..database import get_sqlite as _get_sqlite
+            db_check = await _get_sqlite()
+            try:
+                cancelled_numbers = [o.number for o in cancelled_orders]
+                placeholders = ",".join("?" for _ in cancelled_numbers)
+                cursor = await db_check.execute(f"""
+                    SELECT order_number, id_comanda FROM orders
+                    WHERE order_number IN ({placeholders})
+                      AND id_comanda IS NOT NULL
+                      AND status = 'CANCELLED'
+                """, cancelled_numbers)
+                previously_imported = [dict(r) for r in await cursor.fetchall()]
+            finally:
+                await db_check.close()
+
+            if previously_imported:
+                _log_line(run_id, f"Verificare {len(previously_imported)} comenzi anulate care erau importate in Oracle...")
+                # Check which have invoices
+                id_comanda_list = [o["id_comanda"] for o in previously_imported]
+                invoice_data = await asyncio.to_thread(
+                    invoice_service.check_invoices_for_orders, id_comanda_list
+                )
+
+                for o in previously_imported:
+                    idc = o["id_comanda"]
+                    order_num = o["order_number"]
+                    if idc in invoice_data:
+                        # Invoiced — keep in Oracle, just log warning
+                        _log_line(run_id,
+                            f"#{order_num} → ANULAT dar FACTURAT (factura {invoice_data[idc].get('serie_act', '')}"
+                            f"{invoice_data[idc].get('numar_act', '')}) — NU se sterge din Oracle")
+                        # Update web_status but keep CANCELLED status (already set by batch above)
+                    else:
+                        # Not invoiced — soft-delete in Oracle
+                        del_result = await asyncio.to_thread(
+                            import_service.soft_delete_order_in_roa, idc
+                        )
+                        if del_result["success"]:
+                            # Clear id_comanda via mark_order_cancelled
+                            await sqlite_service.mark_order_cancelled(order_num, "Anulata")
+                            _log_line(run_id,
+                                f"#{order_num} → ANULAT + STERS din Oracle (ID: {idc}, "
+                                f"{del_result['details_deleted']} detalii)")
+                        else:
+                            _log_line(run_id,
+                                f"#{order_num} → ANULAT dar EROARE la stergere Oracle: {del_result['error']}")
+
+        orders = active_orders
+
+        if not orders:
+            _log_line(run_id, "Nicio comanda activa dupa filtrare anulate.")
+            await sqlite_service.update_sync_run(run_id, "completed", cancelled_count, 0, 0, 0)
+            _update_progress("completed", f"No active orders ({cancelled_count} cancelled)")
+            summary = {"run_id": run_id, "status": "completed",
+                       "message": f"No active orders ({cancelled_count} cancelled)",
+                       "json_files": json_count, "cancelled": cancelled_count}
+            return summary
+
         _update_progress("validation", f"Validating {len(orders)} orders...", 0, len(orders))

         # ── Single Oracle connection for entire validation phase ──
@@ -217,9 +330,9 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
             try:
                 min_date = datetime.strptime(min_date_str[:10], "%Y-%m-%d") - timedelta(days=1)
             except (ValueError, TypeError):
-                min_date = datetime.now() - timedelta(days=90)
+                min_date = _now() - timedelta(days=90)
         else:
-            min_date = datetime.now() - timedelta(days=90)
+            min_date = _now() - timedelta(days=90)

         existing_map = await asyncio.to_thread(
             validation_service.check_orders_in_roa, min_date, conn
@@ -229,9 +342,22 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
         # (can happen if previous import committed partially without rollback)
         await _fix_stale_error_orders(existing_map, run_id)

+        # Load app settings early (needed for id_gestiune in SKU validation)
+        app_settings = await sqlite_service.get_app_settings()
+        id_pol = id_pol or int(app_settings.get("id_pol") or 0) or settings.ID_POL
+        id_sectie = id_sectie or int(app_settings.get("id_sectie") or 0) or settings.ID_SECTIE
+        # Parse multi-gestiune CSV: "1,3" → [1, 3], "" → None
+        id_gestiune_raw = (app_settings.get("id_gestiune") or "").strip()
+        if id_gestiune_raw and id_gestiune_raw != "0":
+            id_gestiuni = [int(g) for g in id_gestiune_raw.split(",") if g.strip()]
+        else:
+            id_gestiuni = None  # None = orice gestiune
+        logger.info(f"Sync params: ID_POL={id_pol}, ID_SECTIE={id_sectie}, ID_GESTIUNI={id_gestiuni}")
+        _log_line(run_id, f"Parametri import: ID_POL={id_pol}, ID_SECTIE={id_sectie}, ID_GESTIUNI={id_gestiuni}")
+
         # Step 2b: Validate SKUs (reuse same connection)
         all_skus = order_reader.get_all_skus(orders)
-        validation = await asyncio.to_thread(validation_service.validate_skus, all_skus, conn)
+        validation = await asyncio.to_thread(validation_service.validate_skus, all_skus, conn, id_gestiuni)
         importable, skipped = validation_service.classify_orders(orders, validation)

         # ── Split importable into truly_importable vs already_in_roa ──
@@ -251,8 +377,13 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
         # Step 2c: Build SKU context from skipped orders
         sku_context = {}
         for order, missing_skus_list in skipped:
-            customer = order.billing.company_name or \
-                       f"{order.billing.firstname} {order.billing.lastname}"
+            if order.billing.is_company and order.billing.company_name:
+                customer = order.billing.company_name
+            else:
+                ship_name = ""
+                if order.shipping:
+                    ship_name = f"{order.shipping.firstname} {order.shipping.lastname}".strip()
+                customer = ship_name or f"{order.billing.firstname} {order.billing.lastname}"
             for sku in missing_skus_list:
                 if sku not in sku_context:
                     sku_context[sku] = {"orders": [], "customers": []}
@@ -280,12 +411,6 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
         )

         # Step 2d: Pre-validate prices for importable articles
-        # Load app settings (for transport/discount CODMAT config AND id_pol/id_sectie override)
-        app_settings = await sqlite_service.get_app_settings()
-        id_pol = id_pol or int(app_settings.get("id_pol") or 0) or settings.ID_POL
-        id_sectie = id_sectie or int(app_settings.get("id_sectie") or 0) or settings.ID_SECTIE
-        logger.info(f"Sync params: ID_POL={id_pol}, ID_SECTIE={id_sectie}")
-        _log_line(run_id, f"Parametri import: ID_POL={id_pol}, ID_SECTIE={id_sectie}")
         if id_pol and (truly_importable or already_in_roa):
             _update_progress("validation", "Validating prices...", 0, len(truly_importable))
             _log_line(run_id, "Validare preturi...")
@@ -296,21 +421,86 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                         pass
                     elif item.sku in validation["direct"]:
                         all_codmats.add(item.sku)
+            # Get standard VAT rate from settings for PROC_TVAV metadata
+            cota_tva = float(app_settings.get("discount_vat") or 21)
+
+            # Dual pricing policy support
+            id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
+            codmat_policy_map = {}
+
             if all_codmats:
-                price_result = await asyncio.to_thread(
-                    validation_service.validate_prices, all_codmats, id_pol,
-                    conn, validation.get("direct_id_map")
-                )
-                if price_result["missing_price"]:
-                    logger.info(
-                        f"Auto-adding price 0 for {len(price_result['missing_price'])} "
-                        f"direct articles in policy {id_pol}"
-                    )
-                    await asyncio.to_thread(
-                        validation_service.ensure_prices,
-                        price_result["missing_price"], id_pol,
-                        conn, validation.get("direct_id_map")
-                    )
+                if id_pol_productie:
+                    # Dual-policy: classify articles by cont (sales vs production)
+                    codmat_policy_map = await asyncio.to_thread(
+                        validation_service.validate_and_ensure_prices_dual,
+                        all_codmats, id_pol, id_pol_productie,
+                        conn, validation.get("direct_id_map"),
+                        cota_tva=cota_tva
+                    )
+                    _log_line(run_id,
+                        f"Politici duale: {sum(1 for v in codmat_policy_map.values() if v == id_pol)} vanzare, "
+                        f"{sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)} productie")
+                else:
+                    # Single-policy (backward compatible)
+                    price_result = await asyncio.to_thread(
+                        validation_service.validate_prices, all_codmats, id_pol,
+                        conn, validation.get("direct_id_map")
+                    )
+                    if price_result["missing_price"]:
+                        logger.info(
+                            f"Auto-adding price 0 for {len(price_result['missing_price'])} "
+                            f"direct articles in policy {id_pol}"
+                        )
+                        await asyncio.to_thread(
+                            validation_service.ensure_prices,
+                            price_result["missing_price"], id_pol,
+                            conn, validation.get("direct_id_map"),
+                            cota_tva=cota_tva
+                        )
+
+            # Also validate mapped SKU prices (cherry-pick 1)
+            mapped_skus_in_orders = set()
+            for order in (truly_importable + already_in_roa):
+                for item in order.items:
+                    if item.sku in validation["mapped"]:
+                        mapped_skus_in_orders.add(item.sku)
+
+            if mapped_skus_in_orders:
+                mapped_codmat_data = await asyncio.to_thread(
+                    validation_service.resolve_mapped_codmats, mapped_skus_in_orders, conn
+                )
+                # Build id_map for mapped codmats and validate/ensure their prices
+                mapped_id_map = {}
+                for sku, entries in mapped_codmat_data.items():
+                    for entry in entries:
+                        mapped_id_map[entry["codmat"]] = {
+                            "id_articol": entry["id_articol"],
+                            "cont": entry.get("cont")
+                        }
+                mapped_codmats = set(mapped_id_map.keys())
+                if mapped_codmats:
+                    if id_pol_productie:
+                        mapped_policy_map = await asyncio.to_thread(
+                            validation_service.validate_and_ensure_prices_dual,
+                            mapped_codmats, id_pol, id_pol_productie,
+                            conn, mapped_id_map, cota_tva=cota_tva
+                        )
+                        codmat_policy_map.update(mapped_policy_map)
+                    else:
+                        mp_result = await asyncio.to_thread(
+                            validation_service.validate_prices,
+                            mapped_codmats, id_pol, conn, mapped_id_map
+                        )
+                        if mp_result["missing_price"]:
+                            await asyncio.to_thread(
+                                validation_service.ensure_prices,
+                                mp_result["missing_price"], id_pol,
+                                conn, mapped_id_map, cota_tva=cota_tva
+                            )
+
+            # Pass codmat_policy_map to import via app_settings
+            if codmat_policy_map:
+                app_settings["_codmat_policy_map"] = codmat_policy_map
         finally:
             await asyncio.to_thread(database.pool.release, conn)
@@ -339,6 +529,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                 "order_total": order.total or None,
                 "delivery_cost": order.delivery_cost or None,
                 "discount_total": order.discount_total or None,
+                "web_status": order.status or None,
                 "items": order_items_data,
             })
             _log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → DEJA IMPORTAT (ID: {id_comanda_roa})")
@@ -369,6 +560,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                 "order_total": order.total or None,
                 "delivery_cost": order.delivery_cost or None,
                 "discount_total": order.discount_total or None,
+                "web_status": order.status or None,
                 "items": order_items_data,
             })
             _log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → OMIS (lipsa: {', '.join(missing_skus)})")
@@ -393,7 +585,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
             result = await asyncio.to_thread(
                 import_service.import_single_order,
                 order, id_pol=id_pol, id_sectie=id_sectie,
-                app_settings=app_settings
+                app_settings=app_settings, id_gestiuni=id_gestiuni
             )

             # Build order items data for storage (R9)
@@ -407,6 +599,10 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                     "cantitate_roa": None
                 })

+            # Compute discount split for SQLite storage
+            ds = import_service.compute_discount_split(order, app_settings)
+            discount_split_json = json.dumps(ds) if ds else None
+
             if result["success"]:
                 imported_count += 1
                 await sqlite_service.upsert_order(
@@ -425,6 +621,8 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                     order_total=order.total or None,
                     delivery_cost=order.delivery_cost or None,
                     discount_total=order.discount_total or None,
+                    web_status=order.status or None,
+                    discount_split=discount_split_json,
                 )
                 await sqlite_service.add_sync_run_order(run_id, order.number, "IMPORTED")
                 # Store ROA address IDs (R9)
@@ -453,6 +651,8 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                     order_total=order.total or None,
                     delivery_cost=order.delivery_cost or None,
                     discount_total=order.discount_total or None,
+                    web_status=order.status or None,
+                    discount_split=discount_split_json,
                 )
                 await sqlite_service.add_sync_run_order(run_id, order.number, "ERROR")
                 await sqlite_service.add_order_items(order.number, order_items_data)
@@ -463,17 +663,19 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                     logger.warning("Too many errors, stopping sync")
                     break

-        # Step 4b: Invoice check — update cached invoice data
-        _update_progress("invoices", "Checking invoices...", 0, 0)
+        # Step 4b: Invoice & order status check — sync with Oracle
+        _update_progress("invoices", "Checking invoices & order status...", 0, 0)
         invoices_updated = 0
+        invoices_cleared = 0
+        orders_deleted = 0
         try:
+            # 4b-1: Uninvoiced → check for new invoices
             uninvoiced = await sqlite_service.get_uninvoiced_imported_orders()
             if uninvoiced:
                 id_comanda_list = [o["id_comanda"] for o in uninvoiced]
                 invoice_data = await asyncio.to_thread(
                     invoice_service.check_invoices_for_orders, id_comanda_list
                 )
-                # Build reverse map: id_comanda → order_number
                 id_to_order = {o["id_comanda"]: o["order_number"] for o in uninvoiced}
|
id_to_order = {o["id_comanda"]: o["order_number"] for o in uninvoiced}
|
||||||
for idc, inv in invoice_data.items():
|
for idc, inv in invoice_data.items():
|
||||||
order_num = id_to_order.get(idc)
|
order_num = id_to_order.get(idc)
|
||||||
@@ -488,10 +690,39 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
|
|||||||
data_act=inv.get("data_act"),
|
data_act=inv.get("data_act"),
|
||||||
)
|
)
|
||||||
invoices_updated += 1
|
invoices_updated += 1
|
||||||
if invoices_updated:
|
|
||||||
_log_line(run_id, f"Facturi actualizate: {invoices_updated} comenzi facturate")
|
# 4b-2: Invoiced → check for deleted invoices
|
||||||
|
invoiced = await sqlite_service.get_invoiced_imported_orders()
|
||||||
|
if invoiced:
|
||||||
|
id_comanda_list = [o["id_comanda"] for o in invoiced]
|
||||||
|
invoice_data = await asyncio.to_thread(
|
||||||
|
invoice_service.check_invoices_for_orders, id_comanda_list
|
||||||
|
)
|
||||||
|
for o in invoiced:
|
||||||
|
if o["id_comanda"] not in invoice_data:
|
||||||
|
await sqlite_service.clear_order_invoice(o["order_number"])
|
||||||
|
invoices_cleared += 1
|
||||||
|
|
||||||
|
# 4b-3: All imported → check for deleted orders in ROA
|
||||||
|
all_imported = await sqlite_service.get_all_imported_orders()
|
||||||
|
if all_imported:
|
||||||
|
id_comanda_list = [o["id_comanda"] for o in all_imported]
|
||||||
|
existing_ids = await asyncio.to_thread(
|
||||||
|
invoice_service.check_orders_exist, id_comanda_list
|
||||||
|
)
|
||||||
|
for o in all_imported:
|
||||||
|
if o["id_comanda"] not in existing_ids:
|
||||||
|
await sqlite_service.mark_order_deleted_in_roa(o["order_number"])
|
||||||
|
orders_deleted += 1
|
||||||
|
|
||||||
|
if invoices_updated:
|
||||||
|
_log_line(run_id, f"Facturi noi: {invoices_updated} comenzi facturate")
|
||||||
|
if invoices_cleared:
|
||||||
|
_log_line(run_id, f"Facturi sterse: {invoices_cleared} facturi eliminate din cache")
|
||||||
|
if orders_deleted:
|
||||||
|
_log_line(run_id, f"Comenzi sterse din ROA: {orders_deleted}")
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.warning(f"Invoice check failed: {e}")
|
logger.warning(f"Invoice/order status check failed: {e}")
|
||||||
|
|
||||||
# Step 5: Update sync run
|
# Step 5: Update sync run
|
||||||
total_imported = imported_count + already_imported_count # backward-compat
|
total_imported = imported_count + already_imported_count # backward-compat
|
||||||
@@ -505,36 +736,40 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
|
|||||||
"run_id": run_id,
|
"run_id": run_id,
|
||||||
"status": status,
|
"status": status,
|
||||||
"json_files": json_count,
|
"json_files": json_count,
|
||||||
"total_orders": len(orders),
|
"total_orders": len(orders) + cancelled_count,
|
||||||
"new_orders": len(truly_importable),
|
"new_orders": len(truly_importable),
|
||||||
"imported": total_imported,
|
"imported": total_imported,
|
||||||
"new_imported": imported_count,
|
"new_imported": imported_count,
|
||||||
"already_imported": already_imported_count,
|
"already_imported": already_imported_count,
|
||||||
"skipped": len(skipped),
|
"skipped": len(skipped),
|
||||||
"errors": error_count,
|
"errors": error_count,
|
||||||
|
"cancelled": cancelled_count,
|
||||||
"missing_skus": len(validation["missing"]),
|
"missing_skus": len(validation["missing"]),
|
||||||
"invoices_updated": invoices_updated,
|
"invoices_updated": invoices_updated,
|
||||||
|
"invoices_cleared": invoices_cleared,
|
||||||
|
"orders_deleted_in_roa": orders_deleted,
|
||||||
}
|
}
|
||||||
|
|
||||||
_update_progress("completed",
|
_update_progress("completed",
|
||||||
f"Completed: {imported_count} new, {already_imported_count} already, {len(skipped)} skipped, {error_count} errors",
|
f"Completed: {imported_count} new, {already_imported_count} already, {len(skipped)} skipped, {error_count} errors, {cancelled_count} cancelled",
|
||||||
len(truly_importable), len(truly_importable),
|
len(truly_importable), len(truly_importable),
|
||||||
{"imported": imported_count, "skipped": len(skipped), "errors": error_count,
|
{"imported": imported_count, "skipped": len(skipped), "errors": error_count,
|
||||||
"already_imported": already_imported_count})
|
"already_imported": already_imported_count, "cancelled": cancelled_count})
|
||||||
if _current_sync:
|
if _current_sync:
|
||||||
_current_sync["status"] = status
|
_current_sync["status"] = status
|
||||||
_current_sync["finished_at"] = datetime.now().isoformat()
|
_current_sync["finished_at"] = _now().isoformat()
|
||||||
|
|
||||||
logger.info(
|
logger.info(
|
||||||
f"Sync {run_id} completed: {imported_count} new, {already_imported_count} already imported, "
|
f"Sync {run_id} completed: {imported_count} new, {already_imported_count} already imported, "
|
||||||
f"{len(skipped)} skipped, {error_count} errors"
|
f"{len(skipped)} skipped, {error_count} errors, {cancelled_count} cancelled"
|
||||||
)
|
)
|
||||||
|
|
||||||
duration = (datetime.now() - started_dt).total_seconds()
|
duration = (_now() - started_dt).total_seconds()
|
||||||
_log_line(run_id, "")
|
_log_line(run_id, "")
|
||||||
|
cancelled_text = f", {cancelled_count} anulate" if cancelled_count else ""
|
||||||
_run_logs[run_id].append(
|
_run_logs[run_id].append(
|
||||||
f"Finalizat: {imported_count} importate, {already_imported_count} deja importate, "
|
f"Finalizat: {imported_count} importate, {already_imported_count} deja importate, "
|
||||||
f"{len(skipped)} nemapate, {error_count} erori din {len(orders)} comenzi | Durata: {int(duration)}s"
|
f"{len(skipped)} nemapate, {error_count} erori{cancelled_text} din {len(orders) + cancelled_count} comenzi | Durata: {int(duration)}s"
|
||||||
)
|
)
|
||||||
|
|
||||||
return summary
|
return summary
|
||||||
@@ -545,7 +780,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
|
|||||||
await sqlite_service.update_sync_run(run_id, "failed", 0, 0, 0, 1, error_message=str(e))
|
await sqlite_service.update_sync_run(run_id, "failed", 0, 0, 0, 1, error_message=str(e))
|
||||||
if _current_sync:
|
if _current_sync:
|
||||||
_current_sync["status"] = "failed"
|
_current_sync["status"] = "failed"
|
||||||
_current_sync["finished_at"] = datetime.now().isoformat()
|
_current_sync["finished_at"] = _now().isoformat()
|
||||||
_current_sync["error"] = str(e)
|
_current_sync["error"] = str(e)
|
||||||
return {"run_id": run_id, "status": "failed", "error": str(e)}
|
return {"run_id": run_id, "status": "failed", "error": str(e)}
|
||||||
finally:
|
finally:
|
||||||
|
|||||||
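The three reconciliation passes in step 4b (new invoices, deleted invoices, orders deleted in ROA) are, at their core, membership checks of the local cache against two Oracle result sets. A minimal sketch of that logic as a pure function, with hypothetical plain-dict stand-ins for what `sqlite_service` and `invoice_service` return in the real code:

```python
def reconcile_invoices(cached_orders, oracle_invoices, oracle_order_ids):
    """Sketch of the step-4b reconciliation (data shapes are assumptions).

    cached_orders: {order_number: {"id_comanda": int, "invoiced": bool}}
    oracle_invoices: {id_comanda: invoice_info} for orders that currently have an invoice
    oracle_order_ids: set of id_comanda values still present in ROA
    Returns (newly_invoiced, invoice_cleared, deleted_in_roa) as order-number lists.
    """
    newly_invoiced, invoice_cleared, deleted_in_roa = [], [], []
    for num, o in cached_orders.items():
        idc = o["id_comanda"]
        if not o["invoiced"] and idc in oracle_invoices:
            newly_invoiced.append(num)    # 4b-1: uninvoiced order now has an invoice
        if o["invoiced"] and idc not in oracle_invoices:
            invoice_cleared.append(num)   # 4b-2: cached invoice was deleted in Oracle
        if idc not in oracle_order_ids:
            deleted_in_roa.append(num)    # 4b-3: order itself was removed from ROA
    return newly_invoiced, invoice_cleared, deleted_in_roa
```

The actual sync keeps the three passes separate (each queries only the subset it needs), but the set semantics are the same.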
@@ -1,5 +1,4 @@
 import logging
-from datetime import datetime, timedelta
 from .. import database

 logger = logging.getLogger(__name__)
@@ -29,20 +28,81 @@ def check_orders_in_roa(min_date, conn) -> dict:
     return existing


-def validate_skus(skus: set[str], conn=None) -> dict:
+def resolve_codmat_ids(codmats: set[str], id_gestiuni: list[int] = None, conn=None) -> dict[str, dict]:
+    """Resolve CODMATs to best id_articol + cont: prefers article with stock, then MAX(id_articol).
+    Filters: sters=0 AND inactiv=0.
+    id_gestiuni: list of warehouse IDs to check stock in, or None for all.
+    Returns: {codmat: {"id_articol": int, "cont": str|None}}
+    """
+    if not codmats:
+        return {}
+
+    result = {}
+    codmat_list = list(codmats)
+
+    # Build stoc subquery dynamically for index optimization
+    if id_gestiuni:
+        gest_placeholders = ",".join([f":g{k}" for k in range(len(id_gestiuni))])
+        stoc_filter = f"AND s.id_gestiune IN ({gest_placeholders})"
+    else:
+        stoc_filter = ""
+
+    own_conn = conn is None
+    if own_conn:
+        conn = database.get_oracle_connection()
+    try:
+        with conn.cursor() as cur:
+            for i in range(0, len(codmat_list), 500):
+                batch = codmat_list[i:i+500]
+                placeholders = ",".join([f":c{j}" for j in range(len(batch))])
+                params = {f"c{j}": cm for j, cm in enumerate(batch)}
+                if id_gestiuni:
+                    for k, gid in enumerate(id_gestiuni):
+                        params[f"g{k}"] = gid
+
+                cur.execute(f"""
+                    SELECT codmat, id_articol, cont FROM (
+                        SELECT na.codmat, na.id_articol, na.cont,
+                               ROW_NUMBER() OVER (
+                                   PARTITION BY na.codmat
+                                   ORDER BY
+                                       CASE WHEN EXISTS (
+                                           SELECT 1 FROM stoc s
+                                           WHERE s.id_articol = na.id_articol
+                                           {stoc_filter}
+                                           AND s.an = EXTRACT(YEAR FROM SYSDATE)
+                                           AND s.luna = EXTRACT(MONTH FROM SYSDATE)
+                                           AND s.cants + s.cant - s.cante > 0
+                                       ) THEN 0 ELSE 1 END,
+                                       na.id_articol DESC
+                               ) AS rn
+                        FROM nom_articole na
+                        WHERE na.codmat IN ({placeholders})
+                        AND na.sters = 0 AND na.inactiv = 0
+                    ) WHERE rn = 1
+                """, params)
+                for row in cur:
+                    result[row[0]] = {"id_articol": row[1], "cont": row[2]}
+    finally:
+        if own_conn:
+            database.pool.release(conn)
+
+    logger.info(f"resolve_codmat_ids: {len(result)}/{len(codmats)} resolved (gestiuni={id_gestiuni})")
+    return result
+
+
+def validate_skus(skus: set[str], conn=None, id_gestiuni: list[int] = None) -> dict:
     """Validate a set of SKUs against Oracle.
-    Returns: {mapped: set, direct: set, missing: set, direct_id_map: {codmat: id_articol}}
+    Returns: {mapped: set, direct: set, missing: set, direct_id_map: {codmat: {"id_articol": int, "cont": str|None}}}
     - mapped: found in ARTICOLE_TERTI (active)
     - direct: found in NOM_ARTICOLE by codmat (not in ARTICOLE_TERTI)
     - missing: not found anywhere
-    - direct_id_map: {codmat: id_articol} for direct SKUs (saves a round-trip in validate_prices)
+    - direct_id_map: {codmat: {"id_articol": int, "cont": str|None}} for direct SKUs
     """
     if not skus:
         return {"mapped": set(), "direct": set(), "missing": set(), "direct_id_map": {}}

     mapped = set()
-    direct = set()
-    direct_id_map = {}
     sku_list = list(skus)

     own_conn = conn is None
@@ -64,18 +124,15 @@ def validate_skus(skus: set[str], conn=None) -> dict:
             for row in cur:
                 mapped.add(row[0])

-            # Check NOM_ARTICOLE for remaining — also fetch id_articol
-            remaining = [s for s in batch if s not in mapped]
-            if remaining:
-                placeholders2 = ",".join([f":n{j}" for j in range(len(remaining))])
-                params2 = {f"n{j}": sku for j, sku in enumerate(remaining)}
-                cur.execute(f"""
-                    SELECT codmat, id_articol FROM NOM_ARTICOLE
-                    WHERE codmat IN ({placeholders2}) AND sters = 0 AND inactiv = 0
-                """, params2)
-                for row in cur:
-                    direct.add(row[0])
-                    direct_id_map[row[0]] = row[1]
+        # Resolve remaining SKUs via resolve_codmat_ids (consistent id_articol selection)
+        all_remaining = [s for s in sku_list if s not in mapped]
+        if all_remaining:
+            direct_id_map = resolve_codmat_ids(set(all_remaining), id_gestiuni, conn)
+            direct = set(direct_id_map.keys())
+        else:
+            direct_id_map = {}
+            direct = set()
     finally:
         if own_conn:
             database.pool.release(conn)
@@ -83,7 +140,8 @@ def validate_skus(skus: set[str], conn=None) -> dict:
     missing = skus - mapped - direct

     logger.info(f"SKU validation: {len(mapped)} mapped, {len(direct)} direct, {len(missing)} missing")
-    return {"mapped": mapped, "direct": direct, "missing": missing, "direct_id_map": direct_id_map}
+    return {"mapped": mapped, "direct": direct, "missing": missing,
+            "direct_id_map": direct_id_map}


 def classify_orders(orders, validation_result):
     """Classify orders as importable or skipped based on SKU validation.
@@ -105,6 +163,19 @@ def classify_orders(orders, validation_result):

     return importable, skipped


+def _extract_id_map(direct_id_map: dict) -> dict:
+    """Extract {codmat: id_articol} from either enriched or simple format."""
+    if not direct_id_map:
+        return {}
+    result = {}
+    for cm, val in direct_id_map.items():
+        if isinstance(val, dict):
+            result[cm] = val["id_articol"]
+        else:
+            result[cm] = val
+    return result
+
+
 def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None) -> dict:
     """Check which CODMATs have a price entry in CRM_POLITICI_PRET_ART for the given policy.
     If direct_id_map is provided, skips the NOM_ARTICOLE lookup for those CODMATs.
@@ -113,37 +184,15 @@ def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: di
     if not codmats:
         return {"has_price": set(), "missing_price": set()}

-    codmat_to_id = {}
+    codmat_to_id = _extract_id_map(direct_id_map)
     ids_with_price = set()
-    codmat_list = list(codmats)
-
-    # Pre-populate from direct_id_map if available
-    if direct_id_map:
-        for cm in codmat_list:
-            if cm in direct_id_map:
-                codmat_to_id[cm] = direct_id_map[cm]

     own_conn = conn is None
     if own_conn:
         conn = database.get_oracle_connection()
     try:
         with conn.cursor() as cur:
-            # Step 1: Get ID_ARTICOL for CODMATs not already in direct_id_map
-            remaining = [cm for cm in codmat_list if cm not in codmat_to_id]
-            if remaining:
-                for i in range(0, len(remaining), 500):
-                    batch = remaining[i:i+500]
-                    placeholders = ",".join([f":c{j}" for j in range(len(batch))])
-                    params = {f"c{j}": cm for j, cm in enumerate(batch)}
-
-                    cur.execute(f"""
-                        SELECT id_articol, codmat FROM NOM_ARTICOLE
-                        WHERE codmat IN ({placeholders})
-                    """, params)
-                    for row in cur:
-                        codmat_to_id[row[1]] = row[0]
-
-            # Step 2: Check which ID_ARTICOLs have a price in the policy
+            # Check which ID_ARTICOLs have a price in the policy
             id_list = list(codmat_to_id.values())
             for i in range(0, len(id_list), 500):
                 batch = id_list[i:i+500]
@@ -168,14 +217,18 @@ def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: di
     logger.info(f"Price validation (policy {id_pol}): {len(has_price)} have price, {len(missing_price)} missing price")
     return {"has_price": has_price, "missing_price": missing_price}

-def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None):
+def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None,
+                  cota_tva: float = None):
     """Insert price 0 entries for CODMATs missing from the given price policy.
     Uses batch executemany instead of individual INSERTs.
     Relies on TRG_CRM_POLITICI_PRET_ART trigger for ID_POL_ART sequence.
+    cota_tva: VAT rate from settings (e.g. 21) — used for PROC_TVAV metadata.
     """
     if not codmats:
         return

+    proc_tvav = 1 + (cota_tva / 100) if cota_tva else 1.21
+
     own_conn = conn is None
     if own_conn:
         conn = database.get_oracle_connection()
@@ -191,27 +244,9 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
             return
         id_valuta = row[0]

-        # Build batch params using direct_id_map where available
+        # Build batch params using direct_id_map (already resolved via resolve_codmat_ids)
         batch_params = []
-        need_lookup = []
-        codmat_id_map = dict(direct_id_map) if direct_id_map else {}
-
-        for codmat in codmats:
-            if codmat not in codmat_id_map:
-                need_lookup.append(codmat)
-
-        # Batch lookup remaining CODMATs
-        if need_lookup:
-            for i in range(0, len(need_lookup), 500):
-                batch = need_lookup[i:i+500]
-                placeholders = ",".join([f":c{j}" for j in range(len(batch))])
-                params = {f"c{j}": cm for j, cm in enumerate(batch)}
-                cur.execute(f"""
-                    SELECT codmat, id_articol FROM NOM_ARTICOLE
-                    WHERE codmat IN ({placeholders}) AND sters = 0 AND inactiv = 0
-                """, params)
-                for r in cur:
-                    codmat_id_map[r[0]] = r[1]
+        codmat_id_map = _extract_id_map(direct_id_map)

         for codmat in codmats:
             id_articol = codmat_id_map.get(codmat)
@@ -221,7 +256,8 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
             batch_params.append({
                 "id_pol": id_pol,
                 "id_articol": id_articol,
-                "id_valuta": id_valuta
+                "id_valuta": id_valuta,
+                "proc_tvav": proc_tvav
             })

         if batch_params:
@@ -231,9 +267,9 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
                 ID_UTIL, DATAORA, PROC_TVAV, PRETFTVA, PRETCTVA)
                 VALUES
                 (:id_pol, :id_articol, 0, :id_valuta,
-                 -3, SYSDATE, 1.19, 0, 0)
+                 -3, SYSDATE, :proc_tvav, 0, 0)
             """, batch_params)
-            logger.info(f"Batch inserted {len(batch_params)} price entries for policy {id_pol}")
+            logger.info(f"Batch inserted {len(batch_params)} price entries for policy {id_pol} (PROC_TVAV={proc_tvav})")

         conn.commit()
     finally:
@@ -241,3 +277,125 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
             database.pool.release(conn)

     logger.info(f"Ensure prices done: {len(codmats)} CODMATs processed for policy {id_pol}")
+
+
+def validate_and_ensure_prices_dual(codmats: set[str], id_pol_vanzare: int,
+                                    id_pol_productie: int, conn, direct_id_map: dict,
+                                    cota_tva: float = 21) -> dict[str, int]:
+    """Dual-policy price validation: assign each CODMAT to sales or production policy.
+
+    Logic:
+    1. Check both policies in one SQL
+    2. If article in one policy → use that
+    3. If article in BOTH → prefer id_pol_vanzare
+    4. If article in NEITHER → check cont: 341/345 → production, else → sales; insert price 0
+
+    Returns: codmat_policy_map = {codmat: assigned_id_pol}
+    """
+    if not codmats:
+        return {}
+
+    codmat_policy_map = {}
+    id_map = _extract_id_map(direct_id_map)
+
+    # Collect all id_articol values we need to check
+    id_to_codmats = {}  # {id_articol: [codmat, ...]}
+    for cm in codmats:
+        aid = id_map.get(cm)
+        if aid:
+            id_to_codmats.setdefault(aid, []).append(cm)
+
+    if not id_to_codmats:
+        return {}
+
+    # Query both policies in one SQL
+    existing = {}  # {id_articol: set of id_pol}
+    id_list = list(id_to_codmats.keys())
+    with conn.cursor() as cur:
+        for i in range(0, len(id_list), 500):
+            batch = id_list[i:i+500]
+            placeholders = ",".join([f":a{j}" for j in range(len(batch))])
+            params = {f"a{j}": aid for j, aid in enumerate(batch)}
+            params["id_pol_v"] = id_pol_vanzare
+            params["id_pol_p"] = id_pol_productie
+
+            cur.execute(f"""
+                SELECT pa.ID_ARTICOL, pa.ID_POL FROM CRM_POLITICI_PRET_ART pa
+                WHERE pa.ID_POL IN (:id_pol_v, :id_pol_p) AND pa.ID_ARTICOL IN ({placeholders})
+            """, params)
+            for row in cur:
+                existing.setdefault(row[0], set()).add(row[1])

+    # Classify each codmat
+    missing_vanzare = set()    # CODMATs needing price 0 in sales policy
+    missing_productie = set()  # CODMATs needing price 0 in production policy
+
+    for aid, cms in id_to_codmats.items():
+        pols = existing.get(aid, set())
+        for cm in cms:
+            if pols:
+                if id_pol_vanzare in pols:
+                    codmat_policy_map[cm] = id_pol_vanzare
+                elif id_pol_productie in pols:
+                    codmat_policy_map[cm] = id_pol_productie
+            else:
+                # Not in any policy — classify by cont
+                info = direct_id_map.get(cm, {})
+                cont = info.get("cont", "") if isinstance(info, dict) else ""
+                cont_str = str(cont or "").strip()
+                if cont_str in ("341", "345"):
+                    codmat_policy_map[cm] = id_pol_productie
+                    missing_productie.add(cm)
+                else:
+                    codmat_policy_map[cm] = id_pol_vanzare
+                    missing_vanzare.add(cm)
+
+    # Ensure prices for missing articles in each policy
+    if missing_vanzare:
+        ensure_prices(missing_vanzare, id_pol_vanzare, conn, direct_id_map, cota_tva=cota_tva)
+    if missing_productie:
+        ensure_prices(missing_productie, id_pol_productie, conn, direct_id_map, cota_tva=cota_tva)
+
+    logger.info(
+        f"Dual-policy: {len(codmat_policy_map)} CODMATs assigned "
+        f"(vanzare={sum(1 for v in codmat_policy_map.values() if v == id_pol_vanzare)}, "
+        f"productie={sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)})"
+    )
+    return codmat_policy_map
+
+
+def resolve_mapped_codmats(mapped_skus: set[str], conn) -> dict[str, list[dict]]:
+    """For mapped SKUs, get their underlying CODMATs from ARTICOLE_TERTI + nom_articole.
+
+    Returns: {sku: [{"codmat": str, "id_articol": int, "cont": str|None}]}
+    """
+    if not mapped_skus:
+        return {}
+
+    result = {}
+    sku_list = list(mapped_skus)
+
+    with conn.cursor() as cur:
+        for i in range(0, len(sku_list), 500):
+            batch = sku_list[i:i+500]
+            placeholders = ",".join([f":s{j}" for j in range(len(batch))])
+            params = {f"s{j}": sku for j, sku in enumerate(batch)}
+
+            cur.execute(f"""
+                SELECT at.sku, at.codmat, na.id_articol, na.cont
+                FROM ARTICOLE_TERTI at
+                JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
+                WHERE at.sku IN ({placeholders}) AND at.activ = 1 AND at.sters = 0
+            """, params)
+            for row in cur:
+                sku = row[0]
+                if sku not in result:
+                    result[sku] = []
+                result[sku].append({
+                    "codmat": row[1],
+                    "id_articol": row[2],
+                    "cont": row[3]
+                })
+
+    logger.info(f"resolve_mapped_codmats: {len(result)} SKUs → {sum(len(v) for v in result.values())} CODMATs")
+    return result
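The decision table behind `validate_and_ensure_prices_dual` (steps 1 to 4 in its docstring) can be isolated as a pure per-CODMAT function. A sketch under the same assumptions: one `{"id_articol", "cont"}` dict per CODMAT, and a precomputed set of policy IDs the article already has a price in:

```python
def assign_policy(codmat_info, existing_pols, id_pol_vanzare, id_pol_productie):
    """Sketch of the dual-policy assignment for one CODMAT (names hypothetical).

    codmat_info: {"id_articol": int, "cont": str|None} for the article.
    existing_pols: set of id_pol values the article already has a price entry in.
    Returns (assigned_id_pol, needs_price_insert).
    """
    if id_pol_vanzare in existing_pols:       # in sales policy, or in both: sales wins
        return id_pol_vanzare, False
    if id_pol_productie in existing_pols:     # only in production policy
        return id_pol_productie, False
    # In neither policy: route by accounting account (cont 341/345 means production)
    cont = str(codmat_info.get("cont") or "").strip()
    if cont in ("341", "345"):
        return id_pol_productie, True
    return id_pol_vanzare, True
```

When `needs_price_insert` is true, the real code batches the CODMAT into `ensure_prices` for the chosen policy, so a price-0 row exists before the order import runs.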
@@ -289,6 +289,7 @@ body {
 .fc-red { color: #dc2626; }
 .fc-neutral { color: #6b7280; }
 .fc-blue { color: #2563eb; }
+.fc-dark { color: #374151; }

 /* ── Log viewer (dark theme — keep as-is) ────────── */
 .log-viewer {
@@ -424,13 +425,12 @@ tr.mapping-deleted td {

 /* ── Search input ────────────────────────────────── */
 .search-input {
-  margin-left: auto;
   padding: 0.375rem 0.75rem;
   border: 1px solid #d1d5db;
   border-radius: 0.375rem;
   font-size: 0.9375rem;
   outline: none;
-  min-width: 180px;
+  width: 160px;
 }
 .search-input:focus { border-color: var(--blue-600); }

@@ -705,7 +705,7 @@ tr.mapping-deleted td {
   gap: 0.375rem;
 }
 .filter-pill { padding: 0.25rem 0.5rem; font-size: 0.8125rem; }
-.search-input { min-width: 0; width: 100%; order: 99; }
+.search-input { min-width: 0; width: auto; flex: 1; }
 .page-btn.page-number { display: none; }
 .page-btn.page-ellipsis { display: none; }
 .table-responsive { display: none; }
@@ -13,21 +13,32 @@ let _pollInterval = null;
|
|||||||
let _lastSyncStatus = null;
|
let _lastSyncStatus = null;
|
||||||
let _lastRunId = null;
|
let _lastRunId = null;
|
||||||
let _currentRunId = null;
|
let _currentRunId = null;
|
||||||
|
let _pollIntervalMs = 5000; // default, overridden from settings
|
||||||
|
let _knownLastRunId = null; // track last_run.run_id to detect missed syncs
|
||||||
|
|
||||||
// ── Init ──────────────────────────────────────────
|
// ── Init ──────────────────────────────────────────
|
||||||
|
|
||||||
document.addEventListener('DOMContentLoaded', () => {
|
document.addEventListener('DOMContentLoaded', async () => {
|
||||||
|
await initPollInterval();
|
||||||
loadSchedulerStatus();
|
loadSchedulerStatus();
|
||||||
loadDashOrders();
|
loadDashOrders();
|
||||||
startSyncPolling();
|
startSyncPolling();
|
||||||
wireFilterBar();
|
wireFilterBar();
|
||||||
});
|
});
|
||||||
|
|
||||||
|
async function initPollInterval() {
|
||||||
|
try {
|
||||||
|
const data = await fetchJSON('/api/settings');
|
||||||
|
const sec = parseInt(data.dashboard_poll_seconds) || 5;
|
||||||
|
_pollIntervalMs = sec * 1000;
|
||||||
|
} catch(e) {}
|
||||||
|
}
|
||||||
|
|
||||||
// ── Smart Sync Polling ────────────────────────────
|
// ── Smart Sync Polling ────────────────────────────
|
||||||
|
|
||||||
function startSyncPolling() {
|
function startSyncPolling() {
|
||||||
if (_pollInterval) clearInterval(_pollInterval);
|
if (_pollInterval) clearInterval(_pollInterval);
|
||||||
_pollInterval = setInterval(pollSyncStatus, 30000);
|
_pollInterval = setInterval(pollSyncStatus, _pollIntervalMs);
|
||||||
pollSyncStatus(); // immediate first call
|
pollSyncStatus(); // immediate first call
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -37,6 +48,12 @@ async function pollSyncStatus() {
   updateSyncPanel(data);
   const isRunning = data.status === 'running';
   const wasRunning = _lastSyncStatus === 'running';
+
+  // Detect missed sync completions via last_run.run_id change
+  const newLastRunId = data.last_run?.run_id || null;
+  const missedSync = !isRunning && !wasRunning && _knownLastRunId && newLastRunId && newLastRunId !== _knownLastRunId;
+  _knownLastRunId = newLastRunId;
+
   if (isRunning && !wasRunning) {
     // Switched to running — speed up polling
     clearInterval(_pollInterval);
@@ -44,7 +61,10 @@ async function pollSyncStatus() {
   } else if (!isRunning && wasRunning) {
     // Sync just completed — slow down and refresh orders
     clearInterval(_pollInterval);
-    _pollInterval = setInterval(pollSyncStatus, 30000);
+    _pollInterval = setInterval(pollSyncStatus, _pollIntervalMs);
+    loadDashOrders();
+  } else if (missedSync) {
+    // Sync completed while we weren't watching (e.g. auto-sync) — refresh orders
     loadDashOrders();
   }
   _lastSyncStatus = data.status;
@@ -283,6 +303,7 @@ async function loadDashOrders() {
   if (el('cntErr')) el('cntErr').textContent = c.error || c.errors || 0;
   if (el('cntFact')) el('cntFact').textContent = c.facturate || 0;
   if (el('cntNef')) el('cntNef').textContent = c.nefacturate || c.uninvoiced || 0;
+  if (el('cntCanc')) el('cntCanc').textContent = c.cancelled || 0;
 
   const tbody = document.getElementById('dashOrdersBody');
   const orders = data.orders || [];
@@ -321,7 +342,7 @@ async function loadDashOrders() {
       dateFmt = d.slice(8, 10) + '.' + d.slice(5, 7) + '.' + d.slice(2, 4);
       if (d.length >= 16) dateFmt += ' ' + d.slice(11, 16);
     }
-    const name = o.shipping_name || o.customer_name || o.billing_name || '\u2014';
+    const name = o.customer_name || o.shipping_name || o.billing_name || '\u2014';
     const totalStr = o.order_total ? Number(o.order_total).toFixed(2) : '';
     return `<div class="flat-row" onclick="openDashOrderDetail('${esc(o.order_number)}')" style="font-size:0.875rem">
       ${statusDot(o.status)}
@@ -340,7 +361,8 @@ async function loadDashOrders() {
     { label: 'Omise', count: c.skipped || 0, value: 'SKIPPED', active: activeStatus === 'SKIPPED', colorClass: 'fc-yellow' },
     { label: 'Erori', count: c.error || c.errors || 0, value: 'ERROR', active: activeStatus === 'ERROR', colorClass: 'fc-red' },
     { label: 'Fact.', count: c.facturate || 0, value: 'INVOICED', active: activeStatus === 'INVOICED', colorClass: 'fc-green' },
-    { label: 'Nefact.', count: c.nefacturate || c.uninvoiced || 0, value: 'UNINVOICED', active: activeStatus === 'UNINVOICED', colorClass: 'fc-red' }
+    { label: 'Nefact.', count: c.nefacturate || c.uninvoiced || 0, value: 'UNINVOICED', active: activeStatus === 'UNINVOICED', colorClass: 'fc-red' },
+    { label: 'Anulate', count: c.cancelled || 0, value: 'CANCELLED', active: activeStatus === 'CANCELLED', colorClass: 'fc-dark' }
   ], (val) => {
     document.querySelectorAll('.filter-pill[data-status]').forEach(b => b.classList.remove('active'));
     const pill = document.querySelector(`.filter-pill[data-status="${val}"]`);
@@ -386,13 +408,14 @@ function dashChangePerPage(val) {
 // ── Client cell with Cont tooltip (Task F4) ───────
 
 function renderClientCell(order) {
-  const shipping = (order.shipping_name || order.customer_name || '').trim();
+  const display = (order.customer_name || order.shipping_name || '').trim();
   const billing = (order.billing_name || '').trim();
-  const isDiff = order.is_different_person && billing && shipping !== billing;
+  const shipping = (order.shipping_name || '').trim();
+  const isDiff = display !== shipping && shipping;
   if (isDiff) {
-    return `<td class="tooltip-cont fw-bold" data-tooltip="Cont: ${escHtml(billing)}">${escHtml(shipping)} <sup style="color:#6b7280;font-size:0.65rem">▲</sup></td>`;
+    return `<td class="tooltip-cont fw-bold" data-tooltip="Livrare: ${escHtml(shipping)}">${escHtml(display)} <sup style="color:#6b7280;font-size:0.65rem">▲</sup></td>`;
   }
-  return `<td class="fw-bold">${escHtml(shipping || billing || '\u2014')}</td>`;
+  return `<td class="fw-bold">${escHtml(display || billing || '\u2014')}</td>`;
 }
 
 // ── Helper functions ──────────────────────────────
@@ -437,6 +460,8 @@ function orderStatusBadge(status) {
     case 'ALREADY_IMPORTED': return '<span class="badge bg-info">Deja importat</span>';
     case 'SKIPPED': return '<span class="badge bg-warning">Omis</span>';
     case 'ERROR': return '<span class="badge bg-danger">Eroare</span>';
+    case 'CANCELLED': return '<span class="badge bg-secondary">Anulat</span>';
+    case 'DELETED_IN_ROA': return '<span class="badge bg-dark">Sters din ROA</span>';
     default: return `<span class="badge bg-secondary">${esc(status)}</span>`;
   }
 }
@@ -453,6 +478,9 @@ function renderCodmatCell(item) {
   }
   if (item.codmat_details.length === 1) {
     const d = item.codmat_details[0];
+    if (d.direct) {
+      return `<code>${esc(d.codmat)}</code> <span class="badge bg-secondary" style="font-size:0.6rem;vertical-align:middle">direct</span>`;
+    }
     return `<code>${esc(d.codmat)}</code>`;
   }
   return item.codmat_details.map(d =>
@@ -550,7 +578,19 @@ async function openDashOrderDetail(orderNumber) {
   if (dlvEl) dlvEl.textContent = order.delivery_cost > 0 ? Number(order.delivery_cost).toFixed(2) + ' lei' : '–';
 
   const dscEl = document.getElementById('detailDiscount');
-  if (dscEl) dscEl.textContent = order.discount_total > 0 ? '–' + Number(order.discount_total).toFixed(2) + ' lei' : '–';
+  if (dscEl) {
+    if (order.discount_total > 0 && order.discount_split && typeof order.discount_split === 'object') {
+      const entries = Object.entries(order.discount_split);
+      if (entries.length > 1) {
+        const parts = entries.map(([vat, amt]) => `–${Number(amt).toFixed(2)} (TVA ${vat}%)`);
+        dscEl.innerHTML = parts.join('<br>');
+      } else {
+        dscEl.textContent = '–' + Number(order.discount_total).toFixed(2) + ' lei';
+      }
+    } else {
+      dscEl.textContent = order.discount_total > 0 ? '–' + Number(order.discount_total).toFixed(2) + ' lei' : '–';
+    }
+  }
 
   const items = data.items || [];
   if (items.length === 0) {
@@ -571,7 +611,7 @@ async function openDashOrderDetail(orderNumber) {
   if (mobileContainer) {
     mobileContainer.innerHTML = '<div class="detail-item-flat">' + items.map((item, idx) => {
       const codmatText = item.codmat_details?.length
-        ? item.codmat_details.map(d => `<code>${esc(d.codmat)}</code>`).join(' ')
+        ? item.codmat_details.map(d => `<code>${esc(d.codmat)}</code>${d.direct ? ' <span class="badge bg-secondary" style="font-size:0.55rem">direct</span>' : ''}`).join(' ')
         : `<code>${esc(item.codmat || '–')}</code>`;
       const valoare = (Number(item.price || 0) * Number(item.quantity || 0)).toFixed(2);
       return `<div class="dif-item">
@@ -617,15 +657,34 @@ function openQuickMap(sku, productName, orderNumber, itemIdx) {
   const container = document.getElementById('qmCodmatLines');
   container.innerHTML = '';
 
-  // Pre-populate with existing codmat_details if available
+  // Check if this is a direct SKU (SKU=CODMAT in NOM_ARTICOLE)
   const item = (window._detailItems || [])[itemIdx];
   const details = item?.codmat_details;
-  if (details && details.length > 0) {
-    details.forEach(d => {
-      addQmCodmatLine({ codmat: d.codmat, cantitate: d.cantitate_roa, procent: d.procent_pret, denumire: d.denumire });
-    });
-  } else {
+  const isDirect = details?.length === 1 && details[0].direct === true;
+  const directInfo = document.getElementById('qmDirectInfo');
+  const saveBtn = document.getElementById('qmSaveBtn');
+
+  if (isDirect) {
+    if (directInfo) {
+      directInfo.innerHTML = `<i class="bi bi-info-circle"></i> SKU = CODMAT direct in nomenclator (<code>${escHtml(details[0].codmat)}</code> — ${escHtml(details[0].denumire || '')}).<br><small class="text-muted">Poti suprascrie cu un alt CODMAT daca e necesar (ex: reambalare).</small>`;
+      directInfo.style.display = '';
+    }
+    if (saveBtn) {
+      saveBtn.textContent = 'Suprascrie mapare';
+    }
     addQmCodmatLine();
+  } else {
+    if (directInfo) directInfo.style.display = 'none';
+    if (saveBtn) saveBtn.textContent = 'Salveaza';
+
+    // Pre-populate with existing codmat_details if available
+    if (details && details.length > 0) {
+      details.forEach(d => {
+        addQmCodmatLine({ codmat: d.codmat, cantitate: d.cantitate_roa, procent: d.procent_pret, denumire: d.denumire });
+      });
+    } else {
+      addQmCodmatLine();
+    }
   }
 
   new bootstrap.Modal(document.getElementById('quickMapModal')).show();
@@ -737,7 +796,9 @@ async function saveQuickMapping() {
       if (currentQmOrderNumber) openDashOrderDetail(currentQmOrderNumber);
       loadDashOrders();
     } else {
-      alert('Eroare: ' + (data.error || 'Unknown'));
+      const msg = data.detail || data.error || 'Unknown';
+      document.getElementById('qmPctWarning').textContent = msg;
+      document.getElementById('qmPctWarning').style.display = '';
     }
   } catch (err) {
     alert('Eroare: ' + err.message);
@@ -38,6 +38,7 @@ function orderStatusBadge(status) {
     case 'ALREADY_IMPORTED': return '<span class="badge bg-info">Deja importat</span>';
     case 'SKIPPED': return '<span class="badge bg-warning">Omis</span>';
     case 'ERROR': return '<span class="badge bg-danger">Eroare</span>';
+    case 'DELETED_IN_ROA': return '<span class="badge bg-dark">Sters din ROA</span>';
     default: return `<span class="badge bg-secondary">${esc(status)}</span>`;
   }
 }
@@ -9,12 +9,23 @@ document.addEventListener('DOMContentLoaded', async () => {
 
 async function loadDropdowns() {
   try {
-    const [sectiiRes, politiciRes] = await Promise.all([
+    const [sectiiRes, politiciRes, gestiuniRes] = await Promise.all([
       fetch('/api/settings/sectii'),
-      fetch('/api/settings/politici')
+      fetch('/api/settings/politici'),
+      fetch('/api/settings/gestiuni')
     ]);
     const sectii = await sectiiRes.json();
     const politici = await politiciRes.json();
+    const gestiuni = await gestiuniRes.json();
+
+    const gestContainer = document.getElementById('settGestiuniContainer');
+    if (gestContainer) {
+      gestContainer.innerHTML = '';
+      gestiuni.forEach(g => {
+        gestContainer.innerHTML += `<div class="form-check mb-0"><input class="form-check-input" type="checkbox" value="${escHtml(g.id)}" id="gestChk_${escHtml(g.id)}"><label class="form-check-label" for="gestChk_${escHtml(g.id)}">${escHtml(g.label)}</label></div>`;
+      });
+      if (gestiuni.length === 0) gestContainer.innerHTML = '<span class="text-muted small">Nicio gestiune disponibilă</span>';
+    }
+
     const sectieEl = document.getElementById('settIdSectie');
     if (sectieEl) {
@@ -47,6 +58,14 @@ async function loadDropdowns() {
         dPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
       });
     }
+
+    const pPolEl = document.getElementById('settIdPolProductie');
+    if (pPolEl) {
+      pPolEl.innerHTML = '<option value="">— fără politică producție —</option>';
+      politici.forEach(p => {
+        pPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
+      });
+    }
   } catch (err) {
     console.error('loadDropdowns error:', err);
   }
@@ -61,14 +80,26 @@ async function loadSettings() {
     if (el('settTransportVat')) el('settTransportVat').value = data.transport_vat || '21';
     if (el('settTransportIdPol')) el('settTransportIdPol').value = data.transport_id_pol || '';
     if (el('settDiscountCodmat')) el('settDiscountCodmat').value = data.discount_codmat || '';
-    if (el('settDiscountVat')) el('settDiscountVat').value = data.discount_vat || '19';
+    if (el('settDiscountVat')) el('settDiscountVat').value = data.discount_vat || '21';
     if (el('settDiscountIdPol')) el('settDiscountIdPol').value = data.discount_id_pol || '';
+    if (el('settSplitDiscountVat')) el('settSplitDiscountVat').checked = data.split_discount_vat === "1";
     if (el('settIdPol')) el('settIdPol').value = data.id_pol || '';
+    if (el('settIdPolProductie')) el('settIdPolProductie').value = data.id_pol_productie || '';
     if (el('settIdSectie')) el('settIdSectie').value = data.id_sectie || '';
+    // Multi-gestiune checkboxes
+    const gestVal = data.id_gestiune || '';
+    if (gestVal) {
+      const selectedIds = gestVal.split(',').map(s => s.trim());
+      selectedIds.forEach(id => {
+        const chk = document.getElementById('gestChk_' + id);
+        if (chk) chk.checked = true;
+      });
+    }
     if (el('settGomagApiKey')) el('settGomagApiKey').value = data.gomag_api_key || '';
     if (el('settGomagApiShop')) el('settGomagApiShop').value = data.gomag_api_shop || '';
     if (el('settGomagDaysBack')) el('settGomagDaysBack').value = data.gomag_order_days_back || '7';
     if (el('settGomagLimit')) el('settGomagLimit').value = data.gomag_limit || '100';
+    if (el('settDashPollSeconds')) el('settDashPollSeconds').value = data.dashboard_poll_seconds || '5';
   } catch (err) {
     console.error('loadSettings error:', err);
   }
@@ -81,14 +112,18 @@ async function saveSettings() {
     transport_vat: el('settTransportVat')?.value || '21',
     transport_id_pol: el('settTransportIdPol')?.value?.trim() || '',
     discount_codmat: el('settDiscountCodmat')?.value?.trim() || '',
-    discount_vat: el('settDiscountVat')?.value || '19',
+    discount_vat: el('settDiscountVat')?.value || '21',
     discount_id_pol: el('settDiscountIdPol')?.value?.trim() || '',
+    split_discount_vat: el('settSplitDiscountVat')?.checked ? "1" : "",
     id_pol: el('settIdPol')?.value?.trim() || '',
+    id_pol_productie: el('settIdPolProductie')?.value?.trim() || '',
     id_sectie: el('settIdSectie')?.value?.trim() || '',
+    id_gestiune: Array.from(document.querySelectorAll('#settGestiuniContainer input:checked')).map(c => c.value).join(','),
     gomag_api_key: el('settGomagApiKey')?.value?.trim() || '',
     gomag_api_shop: el('settGomagApiShop')?.value?.trim() || '',
     gomag_order_days_back: el('settGomagDaysBack')?.value?.trim() || '7',
     gomag_limit: el('settGomagLimit')?.value?.trim() || '100',
+    dashboard_poll_seconds: el('settDashPollSeconds')?.value?.trim() || '5',
   };
   try {
     const res = await fetch('/api/settings', {
@@ -219,6 +219,9 @@ function statusDot(status) {
     case 'ERROR':
     case 'FAILED':
       return '<span class="dot dot-red"></span>';
+    case 'CANCELLED':
+    case 'DELETED_IN_ROA':
+      return '<span class="dot dot-gray"></span>';
     default:
       return '<span class="dot dot-gray"></span>';
   }
@@ -7,7 +7,7 @@
   <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
   <link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.2/font/bootstrap-icons.css" rel="stylesheet">
   {% set rp = request.scope.get('root_path', '') %}
-  <link href="{{ rp }}/static/css/style.css?v=10" rel="stylesheet">
+  <link href="{{ rp }}/static/css/style.css?v=14" rel="stylesheet">
 </head>
 <body>
   <!-- Top Navbar -->
@@ -29,7 +29,7 @@
 
   <script>window.ROOT_PATH = "{{ rp }}";</script>
   <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
-  <script src="{{ rp }}/static/js/shared.js?v=10"></script>
+  <script src="{{ rp }}/static/js/shared.js?v=11"></script>
   {% block scripts %}{% endblock %}
 </body>
 </html>
@@ -17,6 +17,8 @@
       <input type="checkbox" id="schedulerToggle" class="cursor-pointer" onchange="toggleScheduler()">
     </label>
     <select id="schedulerInterval" class="select-compact" onchange="updateSchedulerInterval()">
+      <option value="1">1 min</option>
+      <option value="3">3 min</option>
       <option value="5">5 min</option>
       <option value="10" selected>10 min</option>
       <option value="30">30 min</option>
@@ -49,6 +51,8 @@
   <div class="filter-bar" id="ordersFilterBar">
     <!-- Period dropdown -->
     <select id="periodSelect" class="select-compact">
+      <option value="1">1 zi</option>
+      <option value="2">2 zile</option>
       <option value="3">3 zile</option>
      <option value="7" selected>7 zile</option>
       <option value="30">30 zile</option>
@@ -62,6 +66,7 @@
       <span>—</span>
       <input type="date" id="periodEnd" class="select-compact">
     </div>
+    <input type="search" id="orderSearch" placeholder="Cauta comanda, client..." class="search-input">
     <!-- Status pills -->
     <button class="filter-pill active d-none d-md-inline-flex" data-status="all">Toate <span class="filter-count fc-neutral" id="cntAll">0</span></button>
     <button class="filter-pill d-none d-md-inline-flex" data-status="IMPORTED">Importat <span class="filter-count fc-green" id="cntImp">0</span></button>
@@ -69,9 +74,8 @@
     <button class="filter-pill d-none d-md-inline-flex" data-status="ERROR">Erori <span class="filter-count fc-red" id="cntErr">0</span></button>
     <button class="filter-pill d-none d-md-inline-flex" data-status="INVOICED">Facturate <span class="filter-count fc-green" id="cntFact">0</span></button>
     <button class="filter-pill d-none d-md-inline-flex" data-status="UNINVOICED">Nefacturate <span class="filter-count fc-red" id="cntNef">0</span></button>
-    <button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex align-items-center gap-1" id="btnRefreshInvoices" onclick="refreshInvoices()" title="Actualizeaza status facturi din Oracle">↻ Facturi</button>
-    <!-- Search (integrated, end of row) -->
-    <input type="search" id="orderSearch" placeholder="Cauta..." class="search-input">
+    <button class="filter-pill d-none d-md-inline-flex" data-status="CANCELLED">Anulate <span class="filter-count fc-dark" id="cntCanc">0</span></button>
+    <button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" id="btnRefreshInvoices" onclick="refreshInvoices()" title="Actualizeaza status facturi din Oracle">↻</button>
   </div>
   <div class="d-md-none mb-2 d-flex align-items-center gap-2">
     <div class="flex-grow-1" id="dashMobileSeg"></div>
@@ -187,11 +191,12 @@
         <button type="button" class="btn btn-sm btn-outline-secondary mt-1" onclick="addQmCodmatLine()" style="font-size:0.8rem; padding:2px 10px">
           + CODMAT
         </button>
+        <div id="qmDirectInfo" class="alert alert-info mt-2" style="display:none; font-size:0.85rem; padding:8px 12px;"></div>
         <div id="qmPctWarning" class="text-danger mt-2" style="display:none;"></div>
       </div>
       <div class="modal-footer">
         <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
-        <button type="button" class="btn btn-primary" onclick="saveQuickMapping()">Salveaza</button>
+        <button type="button" class="btn btn-primary" id="qmSaveBtn" onclick="saveQuickMapping()">Salveaza</button>
       </div>
     </div>
   </div>
@@ -199,5 +204,5 @@
 {% endblock %}
 
 {% block scripts %}
-<script src="{{ request.scope.get('root_path', '') }}/static/js/dashboard.js?v=11"></script>
+<script src="{{ request.scope.get('root_path', '') }}/static/js/dashboard.js?v=17"></script>
 {% endblock %}
@@ -38,6 +38,13 @@
   <div class="card h-100">
     <div class="card-header py-2 px-3 fw-semibold">Import ROA</div>
     <div class="card-body py-2 px-3">
+      <div class="mb-2">
+        <label class="form-label mb-0 small">Gestiuni pentru verificare stoc</label>
+        <div id="settGestiuniContainer" class="border rounded p-2" style="max-height:120px;overflow-y:auto;font-size:0.85rem">
+          <span class="text-muted small">Se încarcă...</span>
+        </div>
+        <div class="form-text" style="font-size:0.75rem">Nicio selecție = orice gestiune</div>
+      </div>
       <div class="mb-2">
         <label class="form-label mb-0 small">Secție (ID_SECTIE)</label>
         <select class="form-select form-select-sm" id="settIdSectie">
@@ -45,11 +52,18 @@
         </select>
       </div>
       <div class="mb-2">
-        <label class="form-label mb-0 small">Politică de Preț (ID_POL)</label>
+        <label class="form-label mb-0 small">Politică Preț Vânzare (ID_POL)</label>
         <select class="form-select form-select-sm" id="settIdPol">
           <option value="">— selectează politică —</option>
         </select>
       </div>
+      <div class="mb-2">
+        <label class="form-label mb-0 small">Politică Preț Producție</label>
+        <select class="form-select form-select-sm" id="settIdPolProductie">
+          <option value="">— fără politică producție —</option>
+        </select>
+        <div class="form-text" style="font-size:0.75rem">Pentru articole cu cont 341/345 (producție proprie)</div>
+      </div>
     </div>
   </div>
 </div>
@@ -107,8 +121,9 @@
         <select class="form-select form-select-sm" id="settDiscountVat">
           <option value="5">5%</option>
           <option value="9">9%</option>
-          <option value="19" selected>19%</option>
-          <option value="21">21%</option>
+          <option value="11">11%</option>
+          <option value="19">19%</option>
+          <option value="21" selected>21%</option>
         </select>
       </div>
       <div class="col-6">
@@ -118,6 +133,27 @@
                 </select>
               </div>
             </div>
+            <div class="mt-2 form-check">
+              <input type="checkbox" class="form-check-input" id="settSplitDiscountVat">
+              <label class="form-check-label small" for="settSplitDiscountVat">
+                Împarte discount pe cote TVA (proporțional cu valoarea articolelor)
+              </label>
+            </div>
+          </div>
+        </div>
+      </div>
+    </div>
+
+    <div class="row g-3 mb-3">
+      <div class="col-md-6">
+        <div class="card h-100">
+          <div class="card-header py-2 px-3 fw-semibold">Dashboard</div>
+          <div class="card-body py-2 px-3">
+            <div class="mb-2">
+              <label class="form-label mb-0 small">Interval polling (secunde)</label>
+              <input type="number" class="form-control form-control-sm" id="settDashPollSeconds" value="5" min="1" max="300">
+              <div class="form-text" style="font-size:0.75rem">Cât de des verifică dashboard-ul starea sync-ului (implicit 5s)</div>
+            </div>
           </div>
         </div>
       </div>
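The new `settSplitDiscountVat` checkbox enables splitting an order-level discount across VAT rates in proportion to the value each rate carries. A minimal Python sketch of that proportional split; the function name and data shapes are illustrative, not taken from the codebase:

```python
from collections import defaultdict

def split_discount_by_vat(lines, discount):
    """Split an order-level discount across VAT rates,
    proportionally to the line value carried by each rate.
    lines: iterable of (line_value, vat_rate_percent)."""
    totals = defaultdict(float)
    for value, vat_rate in lines:
        totals[vat_rate] += value
    grand_total = sum(totals.values())
    return {rate: round(discount * value / grand_total, 2)
            for rate, value in totals.items()}

# e.g. 100.0 at 21% and 50.0 at 11%, discount 30.0 -> {21: 20.0, 11: 10.0}
```

Per-rate rounding can leave a cent of drift versus the original discount; a real implementation would assign the remainder to one bucket.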
@@ -131,5 +167,5 @@
 {% endblock %}

 {% block scripts %}
-<script src="{{ request.scope.get('root_path', '') }}/static/js/settings.js?v=2"></script>
+<script src="{{ request.scope.get('root_path', '') }}/static/js/settings.js?v=6"></script>
 {% endblock %}
@@ -61,6 +61,7 @@ CREATE OR REPLACE PACKAGE PACK_IMPORT_COMENZI AS
                             p_id_adresa_facturare IN NUMBER DEFAULT NULL,
                             p_id_pol IN NUMBER DEFAULT NULL,
                             p_id_sectie IN NUMBER DEFAULT NULL,
+                            p_id_gestiune IN VARCHAR2 DEFAULT NULL,
                             v_id_comanda OUT NUMBER);

  -- Functii pentru managementul erorilor (pentru orchestrator VFP)
@@ -88,6 +89,60 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
    g_last_error := NULL;
  END clear_error;

+  -- ================================================================
+  -- Functie helper: selecteaza id_articol corect pentru un CODMAT
+  -- Prioritate: sters=0 AND inactiv=0, preferinta stoc, MAX(id_articol) fallback
+  -- ================================================================
+  FUNCTION resolve_id_articol(p_codmat IN VARCHAR2, p_id_gest IN VARCHAR2) RETURN NUMBER IS
+    v_result NUMBER;
+  BEGIN
+    IF p_id_gest IS NOT NULL THEN
+      -- Cu gestiuni specifice (CSV: "1,3") — split in subquery pentru IN clause
+      BEGIN
+        SELECT id_articol INTO v_result FROM (
+          SELECT na.id_articol
+          FROM nom_articole na
+          WHERE na.codmat = p_codmat AND na.sters = 0 AND na.inactiv = 0
+          ORDER BY
+            CASE WHEN EXISTS (
+              SELECT 1 FROM stoc s
+              WHERE s.id_articol = na.id_articol
+                AND s.id_gestiune IN (
+                  SELECT TO_NUMBER(REGEXP_SUBSTR(p_id_gest, '[^,]+', 1, LEVEL))
+                  FROM DUAL
+                  CONNECT BY LEVEL <= REGEXP_COUNT(p_id_gest, ',') + 1
+                )
+                AND s.an = EXTRACT(YEAR FROM SYSDATE)
+                AND s.luna = EXTRACT(MONTH FROM SYSDATE)
+                AND s.cants + s.cant - s.cante > 0
+            ) THEN 0 ELSE 1 END,
+            na.id_articol DESC
+        ) WHERE ROWNUM = 1;
+      EXCEPTION WHEN NO_DATA_FOUND THEN v_result := NULL;
+      END;
+    ELSE
+      -- Fara gestiune — cauta stoc in orice gestiune
+      BEGIN
+        SELECT id_articol INTO v_result FROM (
+          SELECT na.id_articol
+          FROM nom_articole na
+          WHERE na.codmat = p_codmat AND na.sters = 0 AND na.inactiv = 0
+          ORDER BY
+            CASE WHEN EXISTS (
+              SELECT 1 FROM stoc s
+              WHERE s.id_articol = na.id_articol
+                AND s.an = EXTRACT(YEAR FROM SYSDATE)
+                AND s.luna = EXTRACT(MONTH FROM SYSDATE)
+                AND s.cants + s.cant - s.cante > 0
+            ) THEN 0 ELSE 1 END,
+            na.id_articol DESC
+        ) WHERE ROWNUM = 1;
+      EXCEPTION WHEN NO_DATA_FOUND THEN v_result := NULL;
+      END;
+    END IF;
+    RETURN v_result;
+  END resolve_id_articol;
+
  -- ================================================================
  -- Procedura principala pentru importul unei comenzi
  -- ================================================================
@@ -99,6 +154,7 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
                           p_id_adresa_facturare IN NUMBER DEFAULT NULL,
                           p_id_pol IN NUMBER DEFAULT NULL,
                           p_id_sectie IN NUMBER DEFAULT NULL,
+                          p_id_gestiune IN VARCHAR2 DEFAULT NULL,
                           v_id_comanda OUT NUMBER) IS
    v_data_livrare DATE;
    v_sku VARCHAR2(100);
@@ -203,8 +259,7 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
            -- Cauta mai intai in ARTICOLE_TERTI (mapari speciale / seturi)
            v_found_mapping := FALSE;

-           FOR rec IN (SELECT at.codmat, at.cantitate_roa, at.procent_pret,
-                              (SELECT MAX(na.id_articol) FROM nom_articole na WHERE na.codmat = at.codmat) AS id_articol
+           FOR rec IN (SELECT at.codmat, at.cantitate_roa, at.procent_pret
                        FROM articole_terti at
                        WHERE at.sku = v_sku
                          AND at.activ = 1
@@ -212,6 +267,14 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
                        ORDER BY at.procent_pret DESC) LOOP

              v_found_mapping := TRUE;
+             v_id_articol := resolve_id_articol(rec.codmat, p_id_gestiune);
+             IF v_id_articol IS NULL THEN
+               v_articole_eroare := v_articole_eroare + 1;
+               g_last_error := g_last_error || CHR(10) ||
+                 'Articol activ negasit pentru CODMAT: ' || rec.codmat;
+               CONTINUE;
+             END IF;
+
              v_cantitate_roa := rec.cantitate_roa * v_cantitate_web;
              v_pret_unitar := CASE WHEN v_pret_web IS NOT NULL
                               THEN (v_pret_web * rec.procent_pret / 100) / rec.cantitate_roa
@@ -220,7 +283,7 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS

              BEGIN
                PACK_COMENZI.adauga_articol_comanda(V_ID_COMANDA => v_id_comanda,
-                                                   V_ID_ARTICOL => rec.id_articol,
+                                                   V_ID_ARTICOL => v_id_articol,
                                                    V_ID_POL => NVL(v_id_pol_articol, p_id_pol),
                                                    V_CANTITATE => v_cantitate_roa,
                                                    V_PRET => v_pret_unitar,
@@ -236,40 +299,34 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
              END;
            END LOOP;

-           -- Daca nu s-a gasit mapare, cauta direct in NOM_ARTICOLE
+           -- Daca nu s-a gasit mapare, cauta direct in NOM_ARTICOLE via resolve_id_articol
            IF NOT v_found_mapping THEN
-             BEGIN
-               SELECT id_articol, codmat
-               INTO v_id_articol, v_codmat
-               FROM nom_articole
-               WHERE codmat = v_sku
-               AND id_articol = (SELECT MAX(id_articol) FROM nom_articole WHERE codmat = v_sku);
+             v_id_articol := resolve_id_articol(v_sku, p_id_gestiune);
+             IF v_id_articol IS NULL THEN
+               v_articole_eroare := v_articole_eroare + 1;
+               g_last_error := g_last_error || CHR(10) ||
+                 'SKU negasit in ARTICOLE_TERTI si NOM_ARTICOLE (activ): ' || v_sku;
+             ELSE
+               v_codmat := v_sku;
                v_pret_unitar := NVL(v_pret_web, 0);

-               PACK_COMENZI.adauga_articol_comanda(V_ID_COMANDA => v_id_comanda,
-                                                   V_ID_ARTICOL => v_id_articol,
-                                                   V_ID_POL => NVL(v_id_pol_articol, p_id_pol),
-                                                   V_CANTITATE => v_cantitate_web,
-                                                   V_PRET => v_pret_unitar,
-                                                   V_ID_UTIL => c_id_util,
-                                                   V_ID_SECTIE => p_id_sectie,
-                                                   V_PTVA => v_vat);
-               v_articole_procesate := v_articole_procesate + 1;
-             EXCEPTION
-               WHEN NO_DATA_FOUND THEN
-                 v_articole_eroare := v_articole_eroare + 1;
-                 g_last_error := g_last_error || CHR(10) ||
-                   'SKU negasit in ARTICOLE_TERTI si NOM_ARTICOLE: ' || v_sku;
-               WHEN TOO_MANY_ROWS THEN
-                 v_articole_eroare := v_articole_eroare + 1;
-                 g_last_error := g_last_error || CHR(10) ||
-                   'Multiple articole gasite pentru SKU: ' || v_sku;
-               WHEN OTHERS THEN
-                 v_articole_eroare := v_articole_eroare + 1;
-                 g_last_error := g_last_error || CHR(10) ||
-                   'Eroare adaugare articol ' || v_sku || ' (CODMAT: ' || v_codmat || '): ' || SQLERRM;
-             END;
+               BEGIN
+                 PACK_COMENZI.adauga_articol_comanda(V_ID_COMANDA => v_id_comanda,
+                                                     V_ID_ARTICOL => v_id_articol,
+                                                     V_ID_POL => NVL(v_id_pol_articol, p_id_pol),
+                                                     V_CANTITATE => v_cantitate_web,
+                                                     V_PRET => v_pret_unitar,
+                                                     V_ID_UTIL => c_id_util,
+                                                     V_ID_SECTIE => p_id_sectie,
+                                                     V_PTVA => v_vat);
+                 v_articole_procesate := v_articole_procesate + 1;
+               EXCEPTION
+                 WHEN OTHERS THEN
+                   v_articole_eroare := v_articole_eroare + 1;
+                   g_last_error := g_last_error || CHR(10) ||
+                     'Eroare adaugare articol ' || v_sku || ' (CODMAT: ' || v_codmat || '): ' || SQLERRM;
+               END;
+             END IF;
            END IF;

          END; -- End BEGIN block pentru articol individual
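The selection order implemented by `resolve_id_articol` (only non-deleted, active articles; prefer one with positive stock, optionally restricted to the CSV list of gestiuni; break ties with the highest `id_articol`) can be sketched in Python for reference. The candidate structure below is hypothetical, purely to illustrate the ordering:

```python
def resolve_id_articol(candidates, allowed_gest=None):
    """Mirror of the PL/SQL helper's priority: keep only rows with
    sters=0 and inactiv=0, prefer articles with positive stock
    (optionally only in `allowed_gest`), tie-break by highest id_articol.
    Each candidate: {"id_articol", "sters", "inactiv", "stock": {gest: qty}}."""
    def has_stock(c):
        stock = c.get("stock", {})
        gests = allowed_gest if allowed_gest else stock.keys()
        return any(stock.get(g, 0) > 0 for g in gests)

    active = [c for c in candidates if c["sters"] == 0 and c["inactiv"] == 0]
    if not active:
        return None
    # sort key: in-stock first (0 before 1), then id_articol descending
    best = sorted(active, key=lambda c: (0 if has_stock(c) else 1, -c["id_articol"]))[0]
    return best["id_articol"]
```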
scripts/HANDOFF_MAPPING.md (new file, 94 lines)
@@ -0,0 +1,94 @@
+# Handoff: Matching GoMag SKU → ROA Articole pentru Mapari
+
+## Context
+
+Vending (coffeepoint.ro) are ~429 comenzi GoMag importate in SQLite, din care ~393 SKIPPED (lipsesc mapari SKU).
+Facturile pentru aceste comenzi exista deja in Oracle ROA, create manual independent de import.
+Scopul: descoperim corespondenta SKU GoMag → id_articol ROA din compararea comenzilor cu facturile.
+
+## Ce s-a facut
+
+### 1. Fix customer_name (COMPLETAT - commits pe main)
+- **Problema:** `customer_name` in SQLite era shipping person, nu firma de facturare
+- **Fix:** Cand billing e pe firma, `customer_name = company_name` (nu shipping person)
+- **Fix 2:** `customer_name` nu se actualiza la upsert SQLite (doar la INSERT)
+- **Fix 3:** Dashboard JS afisa `shipping_name` cu prioritate in loc de `customer_name`
+- **Commits:** `cc872cf`, `ecb4777`, `172debd`, `8020b2d`
+
+### 2. Matching comenzi → facturi (FUNCTIONEAZA)
+- **Metoda:** Fuzzy match pe client name + total comanda + data (±3 zile)
+- **Rezultat:** 428/429 comenzi matched cu facturi Oracle (1 nematched)
+- **Script:** `scripts/match_all.py`, `scripts/match_by_price.py`
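The order-to-invoice matching summarized above (fuzzy client name plus identical total within a ±3-day window) can be sketched roughly like this; the dict keys and thresholds are illustrative, not the actual schema or tuning of `scripts/match_all.py`:

```python
from datetime import date
from difflib import SequenceMatcher

def name_score(a, b):
    """Case-insensitive character-level similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_order(order, invoices, days=3, min_score=0.6):
    """Return (best_invoice, score): same total, date within +/- `days`,
    highest client-name similarity above `min_score`."""
    best, best_score = None, 0.0
    for inv in invoices:
        if abs((inv["date"] - order["date"]).days) > days:
            continue
        if abs(inv["total"] - order["total"]) > 0.01:
            continue
        s = name_score(order["customer"], inv["partner"])
        if s >= min_score and s > best_score:
            best, best_score = inv, s
    return best, best_score
```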
+### 3. Matching linii comenzi → linii facturi (ESUAT - REZULTATE NESATISFACATOARE)
+
+#### Ce s-a incercat:
+1. **Match pe CODMAT** (SKU == CODMAT din vanzari_detalii) → Multe articole ROA nu au CODMAT completat
+2. **Match pe valoare linie** (qty × pret) → Functioneaza cand comanda GoMag corespunde exact cu factura
+3. **Match pe pret unitar** (pret fara TVA) → Idem, functioneaza doar cand articolele coincid
+
+#### De ce nu merge:
+- **Articolele din factura ROA sunt COMPLET DIFERITE** fata de comanda GoMag in multe cazuri
+- Exemplu: comanda GoMag are "Lavazza Crema E Aroma" dar factura ROA are "CAFEA FRESSO BLUE"
+- Asta se intampla probabil pentru ca vanzatorul ajusteaza comanda inainte de facturare (inlocuieste produse, adauga altele, modifica cantitati)
+- Matching-ul pe pret gaseste corespondente FALSE (produse diferite care au intamplator acelasi pret)
+- Rezultat: din 37 mapari "simple 1:1", unele sunt corecte, altele sunt nonsens
+- Repackaging si seturi sunt aproape toate false
+
+#### Ce a produs:
+- `scripts/output/update_codmat.sql` — 37 UPDATE-uri nom_articole (TREBUIE VERIFICATE MANUAL, multe sunt gresite)
+- `scripts/output/repack_mappings.csv` — 16 repackaging (majoritatea gresite)
+- `scripts/output/set_mappings.csv` — 52 seturi (aproape toate gresite)
+- `scripts/output/inconsistent_skus.csv` — 11 SKU-uri cu match-uri contradictorii
+
+## Ce NU a mers si de ce
+
+Algoritmul actual face matching "in bulk" pe toate comenzile simultan, ceea ce produce prea mult zgomot.
+Cand o comanda are produse complet diferite fata de factura, algoritmul forteaza match-uri absurde pe baza de pret.
+
+## Strategie propusa pentru sesiunea urmatoare
+
+### Abordare: subset → confirmare → generalizare
+
+**Pas 1: Identificare perechi comanda-factura cu CERTITUDINE**
+- Foloseste perechile unde clientul se potriveste EXACT (score > 0.9) si totalul e identic
+- Aceste perechi au sanse mai mari sa aiba si articole corespunzatoare
+
+**Pas 2: Comparare manuala pe un subset mic (5-10 perechi)**
+- Alege perechi unde numarul de articole GoMag == numarul de articole ROA (fara transport/discount)
+- Afiseaza side-by-side: GoMag SKU+produs+qty vs ROA codmat+produs+qty
+- User-ul confirma manual care corespondente sunt corecte
+
+**Pas 3: Validare incrucisata**
+- Un SKU care apare in mai multe comenzi trebuie sa se mapeze mereu pe acelasi id_articol
+- Daca SKU X → id_articol Y in comanda A dar SKU X → id_articol Z in comanda B → marcheaza ca suspect
+
+**Pas 4: Generalizare doar pe mapari confirmate**
+- Extinde doar maparile validate pe subset la restul comenzilor
+- Nu forta match-uri noi — lasa unresolved ce nu se confirma
+
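The cross-validation step amounts to checking that every SKU resolves to a single `id_articol` across all confirmed pairs; a small sketch of that consistency check (tuple shape is illustrative):

```python
from collections import defaultdict

def cross_validate(mappings):
    """mappings: iterable of (order_nr, sku, id_articol) tuples.
    Returns (confirmed, suspect): SKUs that always map to one article,
    and SKUs seen with conflicting id_articol values."""
    seen = defaultdict(set)
    for _order, sku, id_articol in mappings:
        seen[sku].add(id_articol)
    confirmed = {sku: ids.pop() for sku, ids in seen.items() if len(ids) == 1}
    suspect = {sku: sorted(ids) for sku, ids in seen.items() if len(ids) > 1}
    return confirmed, suspect
```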
+### Alt approach posibil: match pe DENUMIRE (fuzzy name match)
+- In loc de pret, compara denumirea produsului GoMag cu denumirea articolului ROA
+- Exemplu: "Lavazza Crema E Aroma Cafea Boabe 1 Kg" vs "LAVAZZA BBE CREMA E AROMA"
+- Ar putea fi mai precis decat match pe pret, mai ales cand preturile coincid accidental
+
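One way to score such name pairs is token overlap rather than a character-level ratio, since word order and extra words differ between GoMag and ROA names; a sketch under that assumption, not the repo's implementation:

```python
def token_overlap(a, b):
    """Jaccard overlap on uppercase word tokens; tolerant of word
    order and casing differences between product names."""
    ta, tb = set(a.upper().split()), set(b.upper().split())
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

# "Lavazza Crema E Aroma Cafea Boabe 1 Kg" vs "LAVAZZA BBE CREMA E AROMA"
# shares 4 tokens out of 9 distinct -> 4/9
```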
+### Tools utile deja existente:
+- `scripts/compare_order.py <order_nr> <fact_nr>` — comparare detaliata o comanda vs o factura
+- `scripts/fetch_one_order.py <order_nr>` — fetch JSON complet din GoMag API
+- `scripts/match_all.py` — matching bulk (de refacut cu strategie noua)
+
+## Structura Oracle relevanta
+
+- `vanzari` — header factura (id_vanzare, numar_act, serie_act, data_act, total_cu_tva, id_part)
+- `vanzari_detalii` — linii factura (id_vanzare, id_articol, cantitate, pret, pret_cu_tva)
+- `nom_articole` — nomenclator articole (id_articol, codmat, denumire)
+- `comenzi` — header comanda ROA (id_comanda, id_part, nr_comanda)
+- `comenzi_elemente` — linii comanda ROA
+- `nom_parteneri` — parteneri (id_part, denumire, prenume)
+- `ARTICOLE_TERTI` — mapari SKU → CODMAT (sku, codmat, cantitate_roa, procent_pret)
+
+## Server
+- SSH: `ssh -p 22122 gomag@79.119.86.134`
+- App: `C:\gomag-vending`
+- SQLite: `C:\gomag-vending\api\data\import.db`
+- Oracle user: VENDING / ROMFASTSOFT / DSN=ROA
@@ -1,306 +0,0 @@
-#!/usr/bin/env python3
-"""
-Parser pentru log-urile sync_comenzi_web.
-Extrage comenzi esuate, SKU-uri lipsa, si genereaza un sumar.
-Suporta atat formatul vechi (verbose) cat si formatul nou (compact).
-
-Utilizare:
-    python parse_sync_log.py                      # Ultimul log din vfp/log/
-    python parse_sync_log.py <fisier.log>         # Log specific
-    python parse_sync_log.py --skus               # Doar lista SKU-uri lipsa
-    python parse_sync_log.py --dir /path/to/logs  # Director custom
-"""
-
-import os
-import sys
-import re
-import glob
-import argparse
-
-# Regex pentru linii cu timestamp (intrare noua in log)
-RE_TIMESTAMP = re.compile(r'^\[(\d{2}:\d{2}:\d{2})\]\s+\[(\w+\s*)\]\s*(.*)')
-
-# Regex format NOU: [N/Total] OrderNumber P:X A:Y/Z -> OK/ERR details
-RE_COMPACT_OK = re.compile(r'\[(\d+)/(\d+)\]\s+(\S+)\s+.*->\s+OK\s+ID:(\S+)')
-RE_COMPACT_ERR = re.compile(r'\[(\d+)/(\d+)\]\s+(\S+)\s+.*->\s+ERR\s+(.*)')
-
-# Regex format VECHI (backwards compat)
-RE_SKU_NOT_FOUND = re.compile(r'SKU negasit.*?:\s*(\S+)')
-RE_PRICE_POLICY = re.compile(r'Pretul pentru acest articol nu a fost gasit')
-RE_FAILED_ORDER = re.compile(r'Import comanda esuat pentru\s+(\S+)')
-RE_ARTICOL_ERR = re.compile(r'Eroare adaugare articol\s+(\S+)')
-RE_ORDER_PROCESS = re.compile(r'Procesez comanda:\s+(\S+)\s+din\s+(\S+)')
-RE_ORDER_SUCCESS = re.compile(r'SUCCES: Comanda importata.*?ID Oracle:\s+(\S+)')
-
-# Regex comune
-RE_SYNC_END = re.compile(r'SYNC END\s*\|.*?(\d+)\s+processed.*?(\d+)\s+ok.*?(\d+)\s+err')
-RE_STATS_LINE = re.compile(r'Duration:\s*(\S+)\s*\|\s*Orders:\s*(\S+)')
-RE_STOPPED_EARLY = re.compile(r'Peste \d+.*ero|stopped early')
-
-
-def find_latest_log(log_dir):
-    """Gaseste cel mai recent log sync_comenzi din directorul specificat."""
-    pattern = os.path.join(log_dir, 'sync_comenzi_*.log')
-    files = glob.glob(pattern)
-    if not files:
-        return None
-    return max(files, key=os.path.getmtime)
-
-
-def parse_log_entries(lines):
-    """Parseaza liniile log-ului in intrari structurate."""
-    entries = []
-    current = None
-
-    for line in lines:
-        line = line.rstrip('\n\r')
-        m = RE_TIMESTAMP.match(line)
-        if m:
-            if current:
-                entries.append(current)
-            current = {
-                'time': m.group(1),
-                'level': m.group(2).strip(),
-                'text': m.group(3),
-                'full': line,
-                'continuation': []
-            }
-        elif current is not None:
-            current['continuation'].append(line)
-            current['text'] += '\n' + line
-
-    if current:
-        entries.append(current)
-
-    return entries
-
-
-def extract_sku_from_error(err_text):
-    """Extrage SKU din textul erorii (diverse formate)."""
-    # SKU_NOT_FOUND: 8714858424056
-    m = re.search(r'SKU_NOT_FOUND:\s*(\S+)', err_text)
-    if m:
-        return ('SKU_NOT_FOUND', m.group(1))
-
-    # PRICE_POLICY: 8000070028685
-    m = re.search(r'PRICE_POLICY:\s*(\S+)', err_text)
-    if m:
-        return ('PRICE_POLICY', m.group(1))
-
-    # Format vechi: SKU negasit...NOM_ARTICOLE: xxx
-    m = RE_SKU_NOT_FOUND.search(err_text)
-    if m:
-        return ('SKU_NOT_FOUND', m.group(1))
-
-    # Format vechi: Eroare adaugare articol xxx
-    m = RE_ARTICOL_ERR.search(err_text)
-    if m:
-        return ('ARTICOL_ERROR', m.group(1))
-
-    # Format vechi: Pretul...
-    if RE_PRICE_POLICY.search(err_text):
-        return ('PRICE_POLICY', '(SKU necunoscut)')
-
-    return (None, None)
-
-
-def analyze_entries(entries):
-    """Analizeaza intrarile si extrage informatii relevante."""
-    result = {
-        'start_time': None,
-        'end_time': None,
-        'duration': None,
-        'total_orders': 0,
-        'success_orders': 0,
-        'error_orders': 0,
-        'stopped_early': False,
-        'failed': [],
-        'missing_skus': [],
-    }
-
-    seen_skus = set()
-    current_order = None
-
-    for entry in entries:
-        text = entry['text']
-        level = entry['level']
-
-        # Start/end time
-        if entry['time']:
-            if result['start_time'] is None:
-                result['start_time'] = entry['time']
-            result['end_time'] = entry['time']
-
-        # Format NOU: SYNC END line cu statistici
-        m = RE_SYNC_END.search(text)
-        if m:
-            result['total_orders'] = int(m.group(1))
-            result['success_orders'] = int(m.group(2))
-            result['error_orders'] = int(m.group(3))
-
-        # Format NOU: compact OK line
-        m = RE_COMPACT_OK.search(text)
-        if m:
-            continue
-
-        # Format NOU: compact ERR line
-        m = RE_COMPACT_ERR.search(text)
-        if m:
-            order_nr = m.group(3)
-            err_detail = m.group(4).strip()
-            err_type, sku = extract_sku_from_error(err_detail)
-            if err_type and sku:
-                result['failed'].append((order_nr, err_type, sku))
-                if sku not in seen_skus and sku != '(SKU necunoscut)':
-                    seen_skus.add(sku)
-                    result['missing_skus'].append(sku)
-            else:
-                result['failed'].append((order_nr, 'ERROR', err_detail[:60]))
-            continue
-
-        # Stopped early
-        if RE_STOPPED_EARLY.search(text):
-            result['stopped_early'] = True
-
-        # Format VECHI: statistici din sumar
-        if 'Total comenzi procesate:' in text:
-            try:
-                result['total_orders'] = int(text.split(':')[-1].strip())
-            except ValueError:
-                pass
-        if 'Comenzi importate cu succes:' in text:
-            try:
-                result['success_orders'] = int(text.split(':')[-1].strip())
-            except ValueError:
-                pass
-        if 'Comenzi cu erori:' in text:
-            try:
-                result['error_orders'] = int(text.split(':')[-1].strip())
-            except ValueError:
-                pass
-
-        # Format VECHI: Duration line
-        m = RE_STATS_LINE.search(text)
-        if m:
-            result['duration'] = m.group(1)
-
-        # Format VECHI: erori
-        if level == 'ERROR':
-            m_fail = RE_FAILED_ORDER.search(text)
-            if m_fail:
-                current_order = m_fail.group(1)
-
-            m = RE_ORDER_PROCESS.search(text)
-            if m:
-                current_order = m.group(1)
-
-            err_type, sku = extract_sku_from_error(text)
-            if err_type and sku:
-                order_nr = current_order or '?'
-                result['failed'].append((order_nr, err_type, sku))
-                if sku not in seen_skus and sku != '(SKU necunoscut)':
-                    seen_skus.add(sku)
-                    result['missing_skus'].append(sku)
-
-        # Duration din SYNC END
-        m = re.search(r'\|\s*(\d+)s\s*$', text)
-        if m:
-            result['duration'] = m.group(1) + 's'
-
-    return result
-
-
-def format_report(result, log_path):
-    """Formateaza raportul complet."""
-    lines = []
-    lines.append('=== SYNC LOG REPORT ===')
-    lines.append(f'File: {os.path.basename(log_path)}')
-
-    duration = result["duration"] or "?"
-    start = result["start_time"] or "?"
-    end = result["end_time"] or "?"
-    lines.append(f'Run: {start} - {end} ({duration})')
-    lines.append('')
-
-    stopped = 'YES' if result['stopped_early'] else 'NO'
-    lines.append(
-        f'SUMMARY: {result["total_orders"]} processed, '
-        f'{result["success_orders"]} success, '
-        f'{result["error_orders"]} errors '
-        f'(stopped early: {stopped})'
-    )
-    lines.append('')
-
-    if result['failed']:
-        lines.append('FAILED ORDERS:')
-        seen = set()
-        for order_nr, err_type, sku in result['failed']:
-            key = (order_nr, err_type, sku)
-            if key not in seen:
-                seen.add(key)
-                lines.append(f'  {order_nr:<12} {err_type:<18} {sku}')
-        lines.append('')
-
-    if result['missing_skus']:
-        lines.append(f'MISSING SKUs ({len(result["missing_skus"])} unique):')
-        for sku in sorted(result['missing_skus']):
-            lines.append(f'  {sku}')
-        lines.append('')
-
-    return '\n'.join(lines)
-
-
-def main():
-    parser = argparse.ArgumentParser(
-        description='Parser pentru log-urile sync_comenzi_web'
-    )
-    parser.add_argument(
-        'logfile', nargs='?', default=None,
-        help='Fisier log specific (default: ultimul din vfp/log/)'
-    )
-    parser.add_argument(
-        '--skus', action='store_true',
-        help='Afiseaza doar lista SKU-uri lipsa (una pe linie)'
-    )
-    parser.add_argument(
-        '--dir', default=None,
-        help='Director cu log-uri (default: vfp/log/ relativ la script)'
-    )
-
-    args = parser.parse_args()
-
-    if args.logfile:
-        log_path = args.logfile
-    else:
-        if args.dir:
-            log_dir = args.dir
-        else:
-            script_dir = os.path.dirname(os.path.abspath(__file__))
-            project_dir = os.path.dirname(script_dir)
-            log_dir = os.path.join(project_dir, 'vfp', 'log')
-
-        log_path = find_latest_log(log_dir)
-        if not log_path:
-            print(f'Nu am gasit fisiere sync_comenzi_*.log in {log_dir}',
-                  file=sys.stderr)
-            sys.exit(1)
-
-    if not os.path.isfile(log_path):
-        print(f'Fisierul nu exista: {log_path}', file=sys.stderr)
-        sys.exit(1)
-
-    with open(log_path, 'r', encoding='utf-8', errors='replace') as f:
-        lines = f.readlines()
-
-    entries = parse_log_entries(lines)
-    result = analyze_entries(entries)
-
-    if args.skus:
-        for sku in sorted(result['missing_skus']):
-            print(sku)
-    else:
-        print(format_report(result, log_path))
-
-
-if __name__ == '__main__':
-    main()