Compare commits

...

9 Commits

Author SHA1 Message Date
Claude Agent
c534a972a9 feat: multi-gestiune stock verification setting
Replace single-select gestiune dropdown with multi-select checkboxes.
Settings stores comma-separated IDs, Python builds IN clause with bind
variables, Oracle PL/SQL splits CSV via REGEXP_SUBSTR for stock lookup.
Empty selection = all warehouses (unchanged behavior).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 16:15:40 +00:00
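The IN-clause construction described in this commit can be sketched as follows. This is an illustrative reconstruction, not the project's actual code; the function name and SQL fragment are assumptions.

```python
# Hypothetical sketch of the approach the commit describes: the setting stores
# comma-separated gestiune IDs, and Python builds an IN clause from named bind
# variables (never by interpolating values into the SQL string).
def build_gestiune_filter(csv_ids: str) -> tuple[str, dict]:
    """Return (sql_fragment, bind_params) for the stock query.

    An empty selection means no filter (all warehouses), matching the
    unchanged default behavior mentioned in the commit message.
    """
    ids = [part.strip() for part in csv_ids.split(",") if part.strip()]
    if not ids:
        return "", {}  # empty selection = all warehouses
    placeholders = ",".join(f":g{i}" for i in range(len(ids)))
    params = {f"g{i}": gid for i, gid in enumerate(ids)}
    return f"AND id_gestiune IN ({placeholders})", params


frag, params = build_gestiune_filter("1, 3,7")
# frag   -> "AND id_gestiune IN (:g0,:g1,:g2)"
# params -> {"g0": "1", "g1": "3", "g2": "7"}
```

Bind variables keep the query plan cacheable in Oracle and avoid SQL injection; the Oracle side then only has to split its own CSV parameter (via REGEXP_SUBSTR, per the commit) for the PL/SQL lookup.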
Claude Agent
6fc2f34ba9 docs: simplify CLAUDE.md, update README with accurate business rules
CLAUDE.md reduced from 214 to 60 lines — moved architecture, API endpoints,
and detailed docs to README. Kept only AI-critical rules (TeamCreate, import
flow gotchas, partner/pricing logic).

README updated: added CANCELLED status, dual pricing policy, discount VAT
splitting, stale error recovery, accurate partner/address logic, settings
page references. Removed outdated Status Implementare section.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-18 15:48:33 +00:00
Claude Agent
c1d8357956 gitignore 2026-03-18 15:11:09 +00:00
Claude Agent
695dafacd5 feat: dual pricing policies + discount VAT splitting
Add production pricing policy (id_pol_productie) for articles with cont 341/345,
smart discount VAT splitting across multiple rates, per-article id_pol support,
and mapped SKU price validation. Settings UI updated with new controls.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-18 15:10:05 +00:00
Claude Agent
69a3088579 refactor(dashboard): move search box to filter bar after period dropdown
Search was hidden in card header — now inline with filters for better
discoverability. Compact refresh button to icon-only. On mobile, search
and period dropdown share the same row via flex.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-17 16:08:58 +00:00
Claude Agent
3d212979d9 refactor(dashboard): move search box from filter bar to card header
Reduces vertical space by eliminating the second row in the filter bar.
Search input is now next to the "Comenzi" title, hidden on mobile.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-17 14:04:10 +00:00
Claude Agent
7dd39f9712 feat(order-detail): show CODMAT for direct SKUs + mapping validations
- Enrich order detail items with NOM_ARTICOLE data for direct SKUs
  (SKU=CODMAT) that have no ARTICOLE_TERTI entry
- Validate CODMAT exists in nomenclator before saving mapping (400)
- Block redundant self-mapping when SKU is already direct CODMAT (409)
- Show "direct" badge in CODMAT column for direct SKUs
- Show info alert in quick map modal for direct SKUs
- Display backend validation errors inline in modal

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-17 13:10:20 +00:00
Claude Agent
f74322beab fix(dashboard): update sync card after completion + use Bucharest timezone
Sync card was showing previous run data after sync completed because the
last_run query excluded the current run_id even after it finished. Now only
excludes during active running state.

All datetime.now() and SQLite datetime('now') replaced with Europe/Bucharest
timezone to fix times displayed 2 hours behind (was using UTC).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-17 13:02:18 +00:00
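The timezone replacement described in this commit can be sketched with the standard-library `zoneinfo` module (Python 3.9+). The helper name is illustrative, not taken from the codebase.

```python
# Sketch of the fix: replace naive datetime.now() (which on this server
# effectively returned UTC) with an explicit Europe/Bucharest timestamp.
from datetime import datetime
from zoneinfo import ZoneInfo

BUCHAREST = ZoneInfo("Europe/Bucharest")

def now_bucharest() -> datetime:
    """Timezone-aware 'now' in Europe/Bucharest (EET, UTC+2 / EEST, UTC+3)."""
    return datetime.now(BUCHAREST)
```

Because Bucharest is 2-3 hours ahead of UTC depending on DST, naive UTC timestamps rendered as local time appear 2 hours behind in winter — exactly the symptom the commit fixes. The SQLite side needs the same treatment, since `datetime('now')` also returns UTC.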
Claude Agent
f5ef9e0811 chore: move working scripts to scripts/work/ (gitignored)
Prevents untracked file conflicts on git pull on Windows server.
Scripts are development/analysis tools, not part of the deployed app.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-17 12:34:34 +00:00
27 changed files with 678 additions and 1753 deletions

.gitignore

@@ -29,6 +29,7 @@ output/
 vfp/*.json
 *.~pck
 .claude/HANDOFF.md
+scripts/work/

 # Virtual environments
 venv/
@@ -45,3 +46,4 @@ api/api/
 # Logs directory
 logs/
+.gstack/

CLAUDE.md

@@ -1,206 +1,60 @@
 # CLAUDE.md
-## REGULI OBLIGATORII
-**Pentru task-uri paralele foloseste INTOTDEAUNA TeamCreate + TaskCreate, NU Agent tool cu subagenti paraleli.**
-Skill-ul `superpowers:dispatching-parallel-agents` NU se aplica in acest proiect. In loc de dispatch cu Agent tool, creeaza o echipa cu TeamCreate, defineste task-uri cu TaskCreate, si spawneaza teammates cu Agent tool + `team_name`.
 ## Project Overview
 **System:** Import Comenzi Web GoMag → Sistem ROA Oracle
-Importa automat comenzi din GoMag in sistemul ERP ROA Oracle. Stack complet Python/FastAPI.
+Stack: FastAPI + Jinja2 + Bootstrap 5.3 + Oracle PL/SQL + SQLite
+Documentatie completa: [README.md](README.md)
-### Tech Stack
-- **API + Admin:** FastAPI + Jinja2 + Bootstrap 5.3
-- **GoMag Integration:** Python (`gomag_client.py` — API download with pagination)
-- **Sync Orchestrator:** Python (`sync_service.py` — download → parse → validate → import)
-- **Database:** Oracle PL/SQL packages (IMPORT_PARTENERI, IMPORT_COMENZI) + SQLite (tracking)
+## Implementare cu TeamCreate
+**OBLIGATORIU:** Folosim TeamCreate + TaskCreate, NU Agent tool cu subagenti paraleli. Skill-ul `superpowers:dispatching-parallel-agents` NU se aplica in acest proiect.
+- Team lead citeste TOATE fisierele implicate, creeaza planul
+- **ASTEAPTA aprobare explicita** de la user inainte de implementare
+- Task-uri pe fisiere non-overlapping (evita conflicte)
+- Cache-bust static assets (`?v=N`) la fiecare schimbare UI
 ## Development Commands
 ```bash
-# Run FastAPI server — INTOTDEAUNA via start.sh (seteaza Oracle env vars)
+# INTOTDEAUNA via start.sh (seteaza Oracle env vars)
 ./start.sh
-# NU folosi uvicorn direct — lipsesc LD_LIBRARY_PATH si TNS_ADMIN pentru Oracle
+# NU folosi uvicorn direct — lipsesc LD_LIBRARY_PATH si TNS_ADMIN
 # Tests
-python api/test_app_basic.py   # Test A - fara Oracle
+python api/test_app_basic.py   # fara Oracle
-python api/test_integration.py # Test C - cu Oracle
+python api/test_integration.py # cu Oracle
 ```
-## UI Development Workflow: Preview → Implement → Verify
-**OBLIGATORIU**: Respecta ordinea exacta. NU treci la pasul urmator fara aprobare explicita.
+## Reguli critice (nu le incalca)
+### Flux import comenzi
+1. Download GoMag API → JSON → parse → validate SKU-uri → import Oracle
+2. Ordinea: **parteneri** (cauta/creeaza) → **adrese** → **comanda** → **factura cache**
+3. SKU lookup: ARTICOLE_TERTI (mapped) are prioritate fata de NOM_ARTICOLE (direct)
+4. Complex sets: un SKU → multiple CODMAT-uri cu `procent_pret` (trebuie sa fie sum=100%)
+5. Comenzi anulate (GoMag statusId=7): verifica daca au factura inainte de stergere din Oracle
-### 1. Plan & Preview — ASTEAPTA APROBARE
-1. Citeste TOATE fisierele implicate
-2. Scrie planul de implementare cu decizii de design
-3. Genereaza **mockup-uri Markdown** care descriu rezultatul asteptat (tabele, liste, cod pseudo-CSS) — NU HTML static
-4. **Prezinta mockup-urile userului si ASTEAPTA aprobare explicita**
-5. Rafineaza planul daca userul cere modificari
-6. **NU trece la implementare pana userul nu spune explicit "ok", "aprob", "executa" sau similar**
+### Statusuri comenzi
+`IMPORTED` / `ALREADY_IMPORTED` / `SKIPPED` / `ERROR` / `CANCELLED` / `DELETED_IN_ROA`
+- Upsert: `IMPORTED` existent NU se suprascrie cu `ALREADY_IMPORTED`
+- Recovery: la fiecare sync, comenzile ERROR sunt reverificate in Oracle
-### 2. Implementation cu TeamCreate (Agent Teams)
-Folosim **TeamCreate** (team agents), NU superpowers subagents. Diferenta:
-- **TeamCreate**: agenti independenti cu task list partajat, comunicare directa intre ei, context propriu
-- **Subagents (Agent tool)**: agenti care raporteaza doar la main — NU se folosesc
+### Parteneri
+- Prioritate: **companie** (PJ, cod_fiscal + registru) daca exista in GoMag, altfel persoana fizica cu **shipping name**
+- Adresa livrare: intotdeauna GoMag shipping
+- Adresa facturare: daca shipping ≠ billing person → shipping pt ambele; altfel → billing din GoMag
+### Preturi
+- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
+- Daca pretul lipseste, se insereaza automat pret=0
+### Invoice cache
+- Coloanele `factura_*` pe `orders` (SQLite), populate lazy din Oracle (`vanzari WHERE sters=0`)
+- Refresh complet: verifica facturi noi + facturi sterse + comenzi sterse din ROA
+## Deploy Windows
+Vezi [README.md](README.md#deploy-windows)
-#### Workflow TeamCreate:
-1. **Main agent** (team lead) citeste TOATE fisierele implicate, creeaza planul
-2. **TeamCreate** creeaza echipa (ex: `ui-polish`)
-3. **TaskCreate** creeaza task-uri independente, pe fisiere non-overlapping:
-   - Task 1: Templates + CSS (HTML templates, style.css, cache-bust)
-   - Task 2: JavaScript (shared.js, dashboard.js, logs.js, mappings.js)
-   - Task 3: Verificare Playwright (depinde de Task 1 + Task 2)
-4. **Agent tool** cu `team_name` spawneaza teammates folosind agentii predefiniti din `.claude/agents/`:
-   - `subagent_type: ui-templates` → pentru Task 1 (templates + CSS)
-   - `subagent_type: ui-js` → pentru Task 2 (JavaScript)
-   - `subagent_type: ui-verify` → pentru Task 3 (Playwright verification)
-   - `subagent_type: backend-api` → pentru modificari backend/API (routers, services, Oracle/SQLite)
-   - `subagent_type: qa-tester` → pentru teste de integrare
-5. Teammates lucreaza in paralel, comunica intre ei, marcheaza task-uri completate
-6. Cand Task 1 + Task 2 sunt complete, teammate-ul de verificare preia Task 3
-#### Teammate-ul de verificare (Task 3):
-1. Navigheaza la fiecare pagina cu Playwright MCP la 375x812 (mobile) si 1440x900 (desktop)
-2. **Foloseste browser_snapshot** (NU screenshot-uri) pentru a inspecta structura DOM
-3. Verifica ca implementarea respecta fiecare punct din preview-ul aprobat (structura coloane, bold, dots, filtre etc.)
-4. Raporteaza discrepante concrete la team lead (ce e diferit fata de preview)
-5. NU salveaza screenshot-uri after/
-#### Bucla de corectie (responsabilitatea team lead-ului):
-1. Dupa ce verify-agent raporteaza, **team lead-ul analizeaza discrepantele**
-2. Pentru fiecare discrepanta, creeaza un nou task de fix si spawneaza un agent sa-l rezolve
-3. Dupa fix, spawneaza din nou verify-agent pentru re-verificare
-4. **Repeta bucla** pana cand toate verificarile trec (implementare ≈ preview)
-5. Abia atunci declara task-ul complet
-```
-screenshots/
-└── preview/   # Mockup-uri Markdown aprobate de user (referinta pentru verificare)
-```
-### Principii
-- Team lead citeste TOATE fisierele inainte sa creeze task-uri
-- Task-uri pe fisiere non-overlapping (evita conflicte)
-- Fiecare task contine prompt detaliat, self-contained
-- Desktop-ul nu trebuie sa se schimbe cand se adauga imbunatatiri mobile
-- Cache-bust static assets (increment `?v=N`) la fiecare schimbare UI
-- Teammates comunica intre ei cu SendMessage, nu doar cu team lead-ul
-## Architecture
-```
-[GoMag API] → [Python Sync Service] → [Oracle PL/SQL] → [FastAPI Admin]
-     ↓                ↓                     ↑                 ↑
- JSON Orders  Download/Parse/Import   Store/Update   Dashboard + Config
-              [SQLite — tracking DB]
-              orders, sync_runs, missing_skus,
-              order_items, web_products,
-              invoice cache, app_settings
-```
-### FastAPI App Structure
-- **Routers:** health, dashboard, mappings, articles, validation, sync
-- **Services:** gomag_client, sync, order_reader, import, mapping, article, validation, invoice, sqlite, scheduler
-- **Templates:** Jinja2 (dashboard, mappings, missing_skus, logs)
-- **Static:** CSS (`style.css`), JS (`shared.js`, `dashboard.js`, `logs.js`, `mappings.js`)
-- **Databases:** Oracle (ERP data) + SQLite (order tracking, sync runs)
-## API Endpoints — Sync & Comenzi
-### Sync
-| Method | Path | Descriere |
-|--------|------|-----------|
-| POST | `/api/sync/start` | Porneste sync in background |
-| POST | `/api/sync/stop` | Trimite semnal de stop |
-| GET | `/api/sync/status` | Status curent + last_run |
-| GET | `/api/sync/history` | Istoric run-uri (paginat) |
-| GET | `/api/sync/run/{run_id}` | Detalii run |
-| GET | `/api/sync/run/{run_id}/log` | Log per comanda (JSON) |
-| GET | `/api/sync/run/{run_id}/text-log` | Log text (live sau din SQLite) |
-| GET | `/api/sync/run/{run_id}/orders` | Comenzi run filtrate/paginate |
-| GET | `/api/sync/order/{order_number}` | Detaliu comanda + items + factura |
-### Dashboard Comenzi
-| Method | Path | Descriere |
-|--------|------|-----------|
-| GET | `/api/dashboard/orders` | Comenzi cu date factura (cache SQLite → Oracle fallback) |
-| POST | `/api/dashboard/refresh-invoices` | Force-refresh stare facturi + comenzi sterse din ROA |
-**Parametri `/api/dashboard/orders`:**
-- `period_days`: 3/7/30/90 sau 0 (all / custom range)
-- `period_start`, `period_end`: interval custom (cand `period_days=0`)
-- `status`: `all` / `IMPORTED` / `SKIPPED` / `ERROR` / `UNINVOICED` / `INVOICED`
-- `search`, `sort_by`, `sort_dir`, `page`, `per_page`
-**`POST /api/dashboard/refresh-invoices` face:**
-1. Nefacturate → verifica Oracle daca au primit factura
-2. Facturate → verifica Oracle daca factura a fost stearsa
-3. Toate importate → verifica Oracle daca comanda a fost stearsa (→ `DELETED_IN_ROA`)
-### Scheduler
-| Method | Path | Descriere |
-|--------|------|-----------|
-| PUT | `/api/sync/schedule` | Configureaza scheduler (enabled, interval_minutes) |
-| GET | `/api/sync/schedule` | Status curent scheduler |
-### Settings
-| Method | Path | Descriere |
-|--------|------|-----------|
-| GET | `/api/settings` | Citeste setari aplicatie |
-| PUT | `/api/settings` | Salveaza setari |
-| GET | `/api/settings/sectii` | Lista sectii Oracle (dropdown) |
-| GET | `/api/settings/politici` | Lista politici preturi Oracle (dropdown) |
-## Invoice Cache (SQLite)
-Facturile sunt stocate in cache in coloanele `factura_*` pe tabelul `orders`:
-- `factura_serie`, `factura_numar`, `factura_data`
-- `factura_total_fara_tva`, `factura_total_tva`, `factura_total_cu_tva`
-**Sursa Oracle:** `SELECT ... FROM vanzari WHERE id_comanda IN (...) AND sters=0`
-**Populare cache:**
-- La fiecare cerere `/api/dashboard/orders` — comenzile fara cache sunt verificate live si adaugate in cache
-- La deschidere detaliu comanda `/api/sync/order/{order_number}` — verificare live daca nu e in cache
-- La `POST /api/dashboard/refresh-invoices` — refresh complet pentru toate comenzile
-## Business Rules
-### Partners
-- Search priority: cod_fiscal → denumire → create new
-- Individuals (CUI 13 digits): separate nume/prenume
-- Default address: Bucuresti Sectorul 1
-- All new partners: ID_UTIL = -3
-### Articles & Mappings
-- Simple SKUs: found directly in nom_articole (not stored in ARTICOLE_TERTI)
-- Repackaging: SKU → CODMAT with different quantities
-- Complex sets: One SKU → multiple CODMATs with percentage pricing (must sum to 100%)
-- Inactive articles: activ=0 (soft delete)
-### Orders
-- Default: ID_GESTIUNE=1, ID_SECTIE=1, ID_POL=0
-- Delivery date = order date + 1 day
-- All orders: INTERNA=0 (external)
-- **Statuses:** `IMPORTED` / `ALREADY_IMPORTED` / `SKIPPED` / `ERROR` / `DELETED_IN_ROA`
-- Upsert rule: daca status=`IMPORTED` exista, nu se suprascrie cu `ALREADY_IMPORTED`
-## Configuration
-```bash
-# .env
-ORACLE_USER=CONTAFIN_ORACLE
-ORACLE_PASSWORD=********
-ORACLE_DSN=ROA_ROMFAST
-TNS_ADMIN=/app
-```
-## Deploy & Depanare Windows
-Vezi [README.md](README.md#deploy-windows) pentru instructiuni complete de deploy si depanare pe Windows Server.

README.md

@@ -101,11 +101,11 @@ gomag-vending/
 │ │ ├── database.py        # Oracle pool + SQLite schema + migrari
 │ │ ├── routers/           # Endpoint-uri HTTP
 │ │ │ ├── health.py        # GET /health
-│ │ │ ├── dashboard.py     # GET / (HTML)
+│ │ │ ├── dashboard.py     # GET / (HTML) + /settings (HTML)
 │ │ │ ├── mappings.py      # /mappings, /api/mappings
 │ │ │ ├── articles.py      # /api/articles/search
 │ │ │ ├── validation.py    # /api/validate/*
-│ │ │ └── sync.py          # /api/sync/* + /api/dashboard/orders
+│ │ │ └── sync.py          # /api/sync/* + /api/dashboard/* + /api/settings
 │ │ ├── services/
 │ │ │ ├── gomag_client.py  # Download comenzi GoMag API
 │ │ │ ├── sync_service.py  # Orchestrare: download→validate→import
@@ -117,8 +117,8 @@ gomag-vending/
 │ │ │ ├── article_service.py
 │ │ │ ├── invoice_service.py    # Verificare facturi ROA
 │ │ │ └── scheduler_service.py  # APScheduler timer
-│ │ ├── templates/         # Jinja2 HTML
+│ │ ├── templates/         # Jinja2 (dashboard, mappings, missing_skus, logs, settings)
-│ │ └── static/            # CSS + JS
+│ │ └── static/            # CSS (style.css) + JS (dashboard, logs, mappings, settings, shared)
 │ ├── database-scripts/    # Oracle SQL (ARTICOLE_TERTI, packages)
 │ ├── data/                # SQLite DB (import.db) + JSON orders
 │ ├── .env                 # Configurare locala (nu in git)
@@ -165,12 +165,16 @@ gomag-vending/
 ## Fluxul de Import
 ```
-1. gomag_client.py descarca comenzi GoMag API → JSON files
+1. gomag_client.py descarca comenzi GoMag API → JSON files (paginat)
-2. order_reader.py parseaza JSON-urile
+2. order_reader.py parseaza JSON-urile, sorteaza cronologic (cele mai vechi primele)
-3. validation_service.py valideaza SKU-uri contra ARTICOLE_TERTI + NOM_ARTICOLE
+3. Comenzi anulate (GoMag statusId=7) → separate, sterse din Oracle daca nu au factura
-4. import_service.py creeaza/cauta partener in Oracle (shipping person = facturare)
+4. validation_service.py valideaza SKU-uri: ARTICOLE_TERTI (mapped) → NOM_ARTICOLE (direct) → missing
-5. PACK_IMPORT_COMENZI.importa_comanda_web() insereaza comanda in ROA
+5. Verificare existenta in Oracle (COMENZI by date range) → deja importate se sar
-6. Rezultate salvate in SQLite (orders, sync_run_orders, order_items)
+6. Stale error recovery: comenzi ERROR reverificate in Oracle (crash recovery)
+7. Validare preturi + dual policy: articole rutate la id_pol_vanzare sau id_pol_productie
+8. import_service.py: cauta/creeaza partener → adrese → importa comanda in Oracle
+9. Invoice cache: verifica facturi + comenzi sterse din ROA
+10. Rezultate salvate in SQLite (orders, sync_run_orders, order_items)
 ```
 ### Statuses Comenzi
@@ -180,17 +184,30 @@ gomag-vending/
 | `IMPORTED` | Importata nou in ROA in acest run |
 | `ALREADY_IMPORTED` | Existenta deja in Oracle, contorizata |
 | `SKIPPED` | SKU-uri lipsa → neimportata |
-| `ERROR` | Eroare la import |
+| `ERROR` | Eroare la import (reverificate automat la urmatorul sync) |
+| `CANCELLED` | Comanda anulata in GoMag (statusId=7) |
 | `DELETED_IN_ROA` | A fost importata dar comanda a fost stearsa din ROA |
 **Regula upsert:** daca statusul existent este `IMPORTED`, nu se suprascrie cu `ALREADY_IMPORTED`.
 ### Reguli Business
-- **Persoana**: shipping name = persoana pe eticheta = beneficiarul facturii
-- **Adresa**: cand billing ≠ shipping → adresa shipping pentru ambele (facturare + livrare)
-- **SKU simplu**: gasit direct in NOM_ARTICOLE → nu se stocheaza in ARTICOLE_TERTI
-- **SKU cu repackaging**: un SKU → CODMAT cu cantitate diferita
-- **SKU set complex**: un SKU → multiple CODMAT-uri cu procente de pret
+**Parteneri & Adrese:**
+- Prioritate partener: daca exista **companie** in GoMag (billing.company_name) → firma (PJ, cod_fiscal + registru). Altfel → persoana fizica, cu **shipping name** ca nume partener
+- Adresa livrare: intotdeauna din GoMag shipping
+- Adresa facturare: daca shipping name ≠ billing name → adresa shipping pt ambele; daca aceeasi persoana → adresa billing din GoMag
+- Cautare partener in Oracle: cod_fiscal → denumire → create new (ID_UTIL = -3)
+**Articole & Mapari:**
+- SKU lookup: ARTICOLE_TERTI (mapped, activ=1) are prioritate fata de NOM_ARTICOLE (direct)
+- SKU simplu: gasit direct in NOM_ARTICOLE → nu se stocheaza in ARTICOLE_TERTI
+- SKU cu repackaging: un SKU → CODMAT cu cantitate diferita (`cantitate_roa`)
+- SKU set complex: un SKU → multiple CODMAT-uri cu `procent_pret` (trebuie sum = 100%)
+**Preturi & Discounturi:**
+- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
+- Daca pretul lipseste in politica, se insereaza automat pret=0
+- Discount VAT splitting: daca `split_discount_vat=1`, discountul se repartizeaza proportional pe cotele TVA din comanda
 ---
@@ -271,18 +288,7 @@ Configuratia este persistata in SQLite (`scheduler_config`).
 | GET | `/api/settings/sectii` | Lista sectii Oracle |
 | GET | `/api/settings/politici` | Lista politici preturi Oracle |
-**Setari disponibile:** `transport_codmat`, `transport_vat`, `discount_codmat`, `discount_vat`, `transport_id_pol`, `discount_id_pol`, `id_pol`, `id_sectie`, `gomag_api_key`, `gomag_api_shop`, `gomag_order_days_back`, `gomag_limit`
+**Setari disponibile:** `transport_codmat`, `transport_vat`, `discount_codmat`, `discount_vat`, `transport_id_pol`, `discount_id_pol`, `id_pol`, `id_pol_productie`, `id_sectie`, `split_discount_vat`, `gomag_api_key`, `gomag_api_shop`, `gomag_order_days_back`, `gomag_limit`
----
-## Status Implementare
-| Faza | Status | Descriere |
-|------|--------|-----------|
-| Phase 1: Database Foundation | Complet | ARTICOLE_TERTI, PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI |
-| Phase 2: Python Integration | Complet | gomag_client.py, sync_service.py |
-| Phase 3-4: FastAPI Dashboard | Complet | UI responsive, smart polling, filter bar, paginare |
-| Phase 5: Production | In Progress | Logging done, Auth + SMTP pending |
 ---


@@ -110,7 +110,8 @@ CREATE TABLE IF NOT EXISTS orders (
     order_total REAL,
     delivery_cost REAL,
     discount_total REAL,
-    web_status TEXT
+    web_status TEXT,
+    discount_split TEXT
 );
 CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status);
 CREATE INDEX IF NOT EXISTS idx_orders_date ON orders(order_date);
@@ -318,6 +319,7 @@ def init_sqlite():
     ("delivery_cost", "REAL"),
     ("discount_total", "REAL"),
     ("web_status", "TEXT"),
+    ("discount_split", "TEXT"),
 ]:
     if col not in order_cols:
         conn.execute(f"ALTER TABLE orders ADD COLUMN {col} {typedef}")


@@ -32,8 +32,10 @@ class AppSettingsUpdate(BaseModel):
     discount_vat: str = "21"
     discount_id_pol: str = ""
     id_pol: str = ""
+    id_pol_productie: str = ""
     id_sectie: str = ""
     id_gestiune: str = ""
+    split_discount_vat: str = ""
     gomag_api_key: str = ""
     gomag_api_shop: str = ""
     gomag_order_days_back: str = "7"
@@ -68,12 +70,14 @@ async def sync_status():
     # Build last_run from most recent completed/failed sync_runs row
     current_run_id = status.get("run_id")
+    is_running = status.get("status") == "running"
     last_run = None
     try:
         from ..database import get_sqlite
         db = await get_sqlite()
         try:
-            if current_run_id:
+            if current_run_id and is_running:
+                # Only exclude current run while it's actively running
                 cursor = await db.execute("""
                     SELECT * FROM sync_runs
                     WHERE status IN ('completed', 'failed') AND run_id != ?
@@ -315,6 +319,29 @@ def _get_articole_terti_for_skus(skus: set) -> dict:
    return result
def _get_nom_articole_for_direct_skus(skus: set) -> dict:
"""Query NOM_ARTICOLE for SKUs that exist directly as CODMAT (direct mapping)."""
from .. import database
result = {}
sku_list = list(skus)
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for i in range(0, len(sku_list), 500):
batch = sku_list[i:i+500]
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
params = {f"s{j}": sku for j, sku in enumerate(batch)}
cur.execute(f"""
SELECT codmat, denumire FROM NOM_ARTICOLE
WHERE codmat IN ({placeholders}) AND sters = 0 AND inactiv = 0
""", params)
for row in cur:
result[row[0]] = row[1] or ""
finally:
database.pool.release(conn)
return result
 @router.get("/api/sync/order/{order_number}")
 async def order_detail(order_number: str):
     """Get order detail with line items (R9), enriched with ARTICOLE_TERTI data."""
@@ -332,6 +359,23 @@ async def order_detail(order_number: str):
             if sku and sku in codmat_map:
                 item["codmat_details"] = codmat_map[sku]
# Enrich direct SKUs (SKU=CODMAT in NOM_ARTICOLE, no ARTICOLE_TERTI entry)
direct_skus = {item["sku"] for item in items
if item.get("sku") and item.get("mapping_status") == "direct"
and not item.get("codmat_details")}
if direct_skus:
nom_map = await asyncio.to_thread(_get_nom_articole_for_direct_skus, direct_skus)
for item in items:
sku = item.get("sku")
if sku and sku in nom_map and not item.get("codmat_details"):
item["codmat_details"] = [{
"codmat": sku,
"cantitate_roa": 1,
"procent_pret": 100,
"denumire": nom_map[sku],
"direct": True
}]
 
     # Enrich with invoice data
     order = detail.get("order", {})
     if order.get("factura_numar") and order.get("factura_data"):
@@ -365,6 +409,13 @@ async def order_detail(order_number: str):
     except Exception:
         pass
# Parse discount_split JSON string
if order.get("discount_split"):
try:
order["discount_split"] = json.loads(order["discount_split"])
except (json.JSONDecodeError, TypeError):
pass
     return detail
@@ -594,11 +645,13 @@ async def get_app_settings():
         "transport_vat": s.get("transport_vat", "21"),
         "discount_codmat": s.get("discount_codmat", ""),
         "transport_id_pol": s.get("transport_id_pol", ""),
-        "discount_vat": s.get("discount_vat", "19"),
+        "discount_vat": s.get("discount_vat", "21"),
         "discount_id_pol": s.get("discount_id_pol", ""),
         "id_pol": s.get("id_pol", ""),
+        "id_pol_productie": s.get("id_pol_productie", ""),
         "id_sectie": s.get("id_sectie", ""),
         "id_gestiune": s.get("id_gestiune", ""),
+        "split_discount_vat": s.get("split_discount_vat", ""),
         "gomag_api_key": s.get("gomag_api_key", "") or config_settings.GOMAG_API_KEY,
         "gomag_api_shop": s.get("gomag_api_shop", "") or config_settings.GOMAG_API_SHOP,
         "gomag_order_days_back": s.get("gomag_order_days_back", "") or str(config_settings.GOMAG_ORDER_DAYS_BACK),
@@ -617,8 +670,10 @@ async def update_app_settings(config: AppSettingsUpdate):
     await sqlite_service.set_app_setting("discount_vat", config.discount_vat)
     await sqlite_service.set_app_setting("discount_id_pol", config.discount_id_pol)
     await sqlite_service.set_app_setting("id_pol", config.id_pol)
+    await sqlite_service.set_app_setting("id_pol_productie", config.id_pol_productie)
     await sqlite_service.set_app_setting("id_sectie", config.id_sectie)
     await sqlite_service.set_app_setting("id_gestiune", config.id_gestiune)
+    await sqlite_service.set_app_setting("split_discount_vat", config.split_discount_vat)
     await sqlite_service.set_app_setting("gomag_api_key", config.gomag_api_key)
     await sqlite_service.set_app_setting("gomag_api_shop", config.gomag_api_shop)
     await sqlite_service.set_app_setting("gomag_order_days_back", config.gomag_order_days_back)


@@ -60,18 +60,81 @@ def format_address_for_oracle(address: str, city: str, region: str) -> str:
    return f"JUD:{region_clean};{city_clean};{address_clean}"
def compute_discount_split(order, settings: dict) -> dict | None:
"""Compute proportional discount split by VAT rate from order items.
Returns: {"11": 3.98, "21": 1.43} or None if split not applicable.
Only splits when split_discount_vat is enabled AND multiple VAT rates exist.
When single VAT rate: returns {actual_rate: total} (smarter than GoMag's fixed 21%).
"""
if not order or order.discount_total <= 0:
return None
split_enabled = settings.get("split_discount_vat") == "1"
# Calculate VAT distribution from order items (exclude zero-value)
vat_totals = {}
for item in order.items:
item_value = abs(item.price * item.quantity)
if item_value > 0:
vat_key = str(int(item.vat)) if item.vat == int(item.vat) else str(item.vat)
vat_totals[vat_key] = vat_totals.get(vat_key, 0) + item_value
if not vat_totals:
return None
grand_total = sum(vat_totals.values())
if grand_total <= 0:
return None
if len(vat_totals) == 1:
# Single VAT rate — use that rate (smarter than GoMag's fixed 21%)
actual_vat = list(vat_totals.keys())[0]
return {actual_vat: round(order.discount_total, 2)}
if not split_enabled:
return None
# Multiple VAT rates — split proportionally
result = {}
discount_remaining = order.discount_total
sorted_rates = sorted(vat_totals.keys(), key=lambda x: float(x))
for i, vat_rate in enumerate(sorted_rates):
if i == len(sorted_rates) - 1:
split_amount = round(discount_remaining, 2) # last gets remainder
else:
proportion = vat_totals[vat_rate] / grand_total
split_amount = round(order.discount_total * proportion, 2)
discount_remaining -= split_amount
if split_amount > 0:
result[vat_rate] = split_amount
return result if result else None
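The proportional split math in `compute_discount_split` can be checked with a standalone sketch (simplified names, not the app's API): each VAT rate receives a share of the discount proportional to its item value, and the last (highest) rate absorbs the rounding remainder so the shares always re-sum to the original discount.

```python
# Standalone sketch of the proportional discount-split arithmetic used above.
def split_discount(discount: float, vat_totals: dict[str, float]) -> dict[str, float]:
    """Split `discount` across VAT rates in proportion to `vat_totals`.

    Rounds each share to 2 decimals; the last rate gets the remainder,
    so the shares sum exactly to the discount.
    """
    grand = sum(vat_totals.values())
    result = {}
    remaining = discount
    rates = sorted(vat_totals, key=float)
    for i, rate in enumerate(rates):
        if i == len(rates) - 1:
            amount = round(remaining, 2)  # last rate absorbs rounding remainder
        else:
            amount = round(discount * vat_totals[rate] / grand, 2)
            remaining -= amount
        if amount > 0:
            result[rate] = amount
    return result

# Example: a 10.00 discount over 70% of value at 11% VAT and 30% at 21% VAT
# splits into {"11": 7.0, "21": 3.0}.
```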
def build_articles_json(items, order=None, settings=None) -> str:
"""Build JSON string for Oracle PACK_IMPORT_COMENZI.importa_comanda.
Includes transport and discount as extra articles if configured.
Supports per-article id_pol from codmat_policy_map and discount VAT splitting."""
articles = []
codmat_policy_map = settings.get("_codmat_policy_map", {}) if settings else {}
default_id_pol = settings.get("id_pol", "") if settings else ""
for item in items:
article_dict = {
"sku": item.sku,
"quantity": str(item.quantity),
"price": str(item.price),
"vat": str(item.vat),
"name": clean_web_text(item.name)
}
# Per-article id_pol from dual-policy validation
item_pol = codmat_policy_map.get(item.sku)
if item_pol and str(item_pol) != str(default_id_pol):
article_dict["id_pol"] = str(item_pol)
articles.append(article_dict)
if order and settings:
transport_codmat = settings.get("transport_codmat", "")
@@ -90,10 +153,40 @@ def build_articles_json(items, order=None, settings=None) -> str:
if settings.get("transport_id_pol"):
article_dict["id_pol"] = settings["transport_id_pol"]
articles.append(article_dict)
# Discount — smart VAT splitting
if order.discount_total > 0 and discount_codmat:
discount_split = compute_discount_split(order, settings)
if discount_split and len(discount_split) > 1:
# Multiple VAT rates — multiple discount lines
for vat_rate, split_amount in sorted(discount_split.items(), key=lambda x: float(x[0])):
article_dict = {
"sku": discount_codmat,
"quantity": "-1",
"price": str(split_amount),
"vat": vat_rate,
"name": f"Discount (TVA {vat_rate}%)"
}
if settings.get("discount_id_pol"):
article_dict["id_pol"] = settings["discount_id_pol"]
articles.append(article_dict)
elif discount_split and len(discount_split) == 1:
# Single VAT rate — use detected rate
actual_vat = list(discount_split.keys())[0]
article_dict = {
"sku": discount_codmat,
"quantity": "-1",
"price": str(order.discount_total),
"vat": actual_vat,
"name": "Discount"
}
if settings.get("discount_id_pol"):
article_dict["id_pol"] = settings["discount_id_pol"]
articles.append(article_dict)
else:
# Fallback — original behavior with GoMag VAT or settings default
discount_vat = getattr(order, 'discount_vat', None) or settings.get("discount_vat", "21")
article_dict = {
"sku": discount_codmat,
"quantity": "-1",
@@ -108,7 +201,7 @@ def build_articles_json(items, order=None, settings=None) -> str:
return json.dumps(articles)
def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_settings: dict = None, id_gestiuni: list[int] = None) -> dict:
"""Import a single order into Oracle ROA.
Returns dict with:
@@ -246,6 +339,9 @@ def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_se
id_comanda = cur.var(oracledb.DB_TYPE_NUMBER)
# Convert list[int] to CSV string for Oracle VARCHAR2 param
id_gestiune_csv = ",".join(str(g) for g in id_gestiuni) if id_gestiuni else None
cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
order_number, # p_nr_comanda_ext
order_date, # p_data_comanda
@@ -255,7 +351,7 @@ def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_se
addr_fact_id, # p_id_adresa_facturare
id_pol, # p_id_pol
id_sectie, # p_id_sectie
id_gestiune_csv, # p_id_gestiune (CSV string)
id_comanda # v_id_comanda (OUT)
])


@@ -159,6 +159,24 @@ def create_mapping(sku: str, codmat: str, cantitate_roa: float = 1, procent_pret
with database.pool.acquire() as conn:
with conn.cursor() as cur:
# Validate CODMAT exists in NOM_ARTICOLE
cur.execute("""
SELECT COUNT(*) FROM NOM_ARTICOLE
WHERE codmat = :codmat AND sters = 0 AND inactiv = 0
""", {"codmat": codmat})
if cur.fetchone()[0] == 0:
raise HTTPException(status_code=400, detail="CODMAT-ul nu exista in nomenclator")
# Warn if SKU is already a direct CODMAT in NOM_ARTICOLE
if sku == codmat:
cur.execute("""
SELECT COUNT(*) FROM NOM_ARTICOLE
WHERE codmat = :sku AND sters = 0 AND inactiv = 0
""", {"sku": sku})
if cur.fetchone()[0] > 0:
raise HTTPException(status_code=409,
detail="SKU-ul exista direct in nomenclator ca CODMAT, nu necesita mapare")
# Check for active duplicate
cur.execute("""
SELECT COUNT(*) FROM ARTICOLE_TERTI


@@ -1,8 +1,16 @@
import json
import logging
from datetime import datetime
from zoneinfo import ZoneInfo
from ..database import get_sqlite, get_sqlite_sync
_tz_bucharest = ZoneInfo("Europe/Bucharest")
def _now_str():
"""Return current Bucharest time as ISO string."""
return datetime.now(_tz_bucharest).replace(tzinfo=None).isoformat()
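As a quick illustration of why `_now_str` exists: SQLite's `datetime('now')` returns UTC, while this helper stores a naive Bucharest-local ISO string. A minimal sketch of the same conversion:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("Europe/Bucharest")
# Take zone-aware local time, then strip the offset before serializing
stamp = datetime.now(tz).replace(tzinfo=None).isoformat()
parsed = datetime.fromisoformat(stamp)
print(parsed.tzinfo)  # None
```

The stored value is naive local time, so it compares cleanly with the other locally generated timestamps in the table.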
logger = logging.getLogger(__name__)
@@ -12,8 +20,8 @@ async def create_sync_run(run_id: str, json_files: int = 0):
try:
await db.execute("""
INSERT INTO sync_runs (run_id, started_at, status, json_files)
VALUES (?, ?, 'running', ?)
""", (run_id, _now_str(), json_files))
await db.commit()
finally:
await db.close()
@@ -28,7 +36,7 @@ async def update_sync_run(run_id: str, status: str, total_orders: int = 0,
try:
await db.execute("""
UPDATE sync_runs SET
finished_at = ?,
status = ?,
total_orders = ?,
imported = ?,
@@ -38,7 +46,7 @@ async def update_sync_run(run_id: str, status: str, total_orders: int = 0,
already_imported = ?,
new_imported = ?
WHERE run_id = ?
""", (_now_str(), status, total_orders, imported, skipped, errors, error_message,
already_imported, new_imported, run_id))
await db.commit()
finally:
@@ -53,7 +61,7 @@ async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
payment_method: str = None, delivery_method: str = None,
order_total: float = None,
delivery_cost: float = None, discount_total: float = None,
web_status: str = None, discount_split: str = None):
"""Upsert a single order — one row per order_number, status updated in place."""
db = await get_sqlite()
try:
@@ -63,8 +71,8 @@ async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
id_comanda, id_partener, error_message, missing_skus, items_count,
last_sync_run_id, shipping_name, billing_name,
payment_method, delivery_method, order_total,
delivery_cost, discount_total, web_status, discount_split)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(order_number) DO UPDATE SET
customer_name = excluded.customer_name,
status = CASE
@@ -89,13 +97,14 @@ async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
delivery_cost = COALESCE(excluded.delivery_cost, orders.delivery_cost),
discount_total = COALESCE(excluded.discount_total, orders.discount_total),
web_status = COALESCE(excluded.web_status, orders.web_status),
discount_split = COALESCE(excluded.discount_split, orders.discount_split),
updated_at = datetime('now')
""", (order_number, order_date, customer_name, status,
id_comanda, id_partener, error_message,
json.dumps(missing_skus) if missing_skus else None,
items_count, sync_run_id, shipping_name, billing_name,
payment_method, delivery_method, order_total,
delivery_cost, discount_total, web_status, discount_split))
await db.commit()
finally:
await db.close()
@@ -134,8 +143,8 @@ async def save_orders_batch(orders_data: list[dict]):
id_comanda, id_partener, error_message, missing_skus, items_count,
last_sync_run_id, shipping_name, billing_name,
payment_method, delivery_method, order_total,
delivery_cost, discount_total, web_status, discount_split)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(order_number) DO UPDATE SET
customer_name = excluded.customer_name,
status = CASE
@@ -160,6 +169,7 @@ async def save_orders_batch(orders_data: list[dict]):
delivery_cost = COALESCE(excluded.delivery_cost, orders.delivery_cost),
discount_total = COALESCE(excluded.discount_total, orders.discount_total),
web_status = COALESCE(excluded.web_status, orders.web_status),
discount_split = COALESCE(excluded.discount_split, orders.discount_split),
updated_at = datetime('now')
""", [
(d["order_number"], d["order_date"], d["customer_name"], d["status"],
@@ -170,7 +180,7 @@ async def save_orders_batch(orders_data: list[dict]):
d.get("payment_method"), d.get("delivery_method"),
d.get("order_total"),
d.get("delivery_cost"), d.get("discount_total"),
d.get("web_status"), d.get("discount_split"))
for d in orders_data
])
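The `ON CONFLICT ... COALESCE(excluded.col, orders.col)` pattern used in both upserts keeps previously stored values when a later sync passes NULL for a column. A self-contained sketch against an in-memory SQLite table (toy schema, not the real `orders` table):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE orders (
    order_number TEXT PRIMARY KEY,
    web_status TEXT,
    discount_split TEXT)""")

sql = """INSERT INTO orders (order_number, web_status, discount_split)
VALUES (?, ?, ?)
ON CONFLICT(order_number) DO UPDATE SET
    web_status = COALESCE(excluded.web_status, orders.web_status),
    discount_split = COALESCE(excluded.discount_split, orders.discount_split)"""

con.execute(sql, ("W123", "NEW", '{"21": 5.41}'))
con.execute(sql, ("W123", "SHIPPED", None))  # NULL must not clobber the stored split
row = con.execute("SELECT web_status, discount_split FROM orders").fetchone()
print(row)  # ('SHIPPED', '{"21": 5.41}')
```

`excluded` refers to the row that failed to insert, so non-NULL incoming values win while NULLs fall back to what is already stored.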


@@ -3,6 +3,14 @@
import json
import logging
import uuid
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo
_tz_bucharest = ZoneInfo("Europe/Bucharest")
def _now():
"""Return current time in Bucharest timezone (naive, for display/storage)."""
return datetime.now(_tz_bucharest).replace(tzinfo=None)
from . import order_reader, validation_service, import_service, sqlite_service, invoice_service, gomag_client
from ..config import settings
@@ -22,7 +30,7 @@ def _log_line(run_id: str, message: str):
"""Append a timestamped line to the in-memory log buffer."""
if run_id not in _run_logs:
_run_logs[run_id] = []
ts = _now().strftime("%H:%M:%S")
_run_logs[run_id].append(f"[{ts}] {message}")
@@ -62,11 +70,11 @@ async def prepare_sync(id_pol: int = None, id_sectie: int = None) -> dict:
if _sync_lock.locked():
return {"error": "Sync already running", "run_id": _current_sync.get("run_id") if _current_sync else None}
run_id = _now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
_current_sync = {
"run_id": run_id,
"status": "running",
"started_at": _now().isoformat(),
"finished_at": None,
"phase": "starting",
"phase_text": "Starting...",
@@ -142,11 +150,11 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
async with _sync_lock:
# Use provided run_id or generate one
if not run_id:
run_id = _now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
_current_sync = {
"run_id": run_id,
"status": "running",
"started_at": _now().isoformat(),
"finished_at": None,
"phase": "reading",
"phase_text": "Reading JSON files...",
@@ -157,7 +165,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
_update_progress("reading", "Reading JSON files...")
started_dt = _now()
_run_logs[run_id] = [
f"=== Sync Run {run_id} ===",
f"Inceput: {started_dt.strftime('%d.%m.%Y %H:%M:%S')}",
@@ -322,9 +330,9 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
try:
min_date = datetime.strptime(min_date_str[:10], "%Y-%m-%d") - timedelta(days=1)
except (ValueError, TypeError):
min_date = _now() - timedelta(days=90)
else:
min_date = _now() - timedelta(days=90)
existing_map = await asyncio.to_thread(
validation_service.check_orders_in_roa, min_date, conn
@@ -338,13 +346,18 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
app_settings = await sqlite_service.get_app_settings()
id_pol = id_pol or int(app_settings.get("id_pol") or 0) or settings.ID_POL
id_sectie = id_sectie or int(app_settings.get("id_sectie") or 0) or settings.ID_SECTIE
# Parse multi-gestiune CSV: "1,3" → [1, 3], "" → None
id_gestiune_raw = (app_settings.get("id_gestiune") or "").strip()
if id_gestiune_raw and id_gestiune_raw != "0":
id_gestiuni = [int(g) for g in id_gestiune_raw.split(",") if g.strip()]
else:
id_gestiuni = None # None = orice gestiune
logger.info(f"Sync params: ID_POL={id_pol}, ID_SECTIE={id_sectie}, ID_GESTIUNI={id_gestiuni}")
_log_line(run_id, f"Parametri import: ID_POL={id_pol}, ID_SECTIE={id_sectie}, ID_GESTIUNI={id_gestiuni}")
# Step 2b: Validate SKUs (reuse same connection)
all_skus = order_reader.get_all_skus(orders)
validation = await asyncio.to_thread(validation_service.validate_skus, all_skus, conn, id_gestiuni)
importable, skipped = validation_service.classify_orders(orders, validation)
# ── Split importable into truly_importable vs already_in_roa ──
@@ -408,7 +421,27 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
pass
elif item.sku in validation["direct"]:
all_codmats.add(item.sku)
# Get standard VAT rate from settings for PROC_TVAV metadata
cota_tva = float(app_settings.get("discount_vat") or 21)
# Dual pricing policy support
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
codmat_policy_map = {}
if all_codmats:
if id_pol_productie:
# Dual-policy: classify articles by cont (sales vs production)
codmat_policy_map = await asyncio.to_thread(
validation_service.validate_and_ensure_prices_dual,
all_codmats, id_pol, id_pol_productie,
conn, validation.get("direct_id_map"),
cota_tva=cota_tva
)
_log_line(run_id,
f"Politici duale: {sum(1 for v in codmat_policy_map.values() if v == id_pol)} vanzare, "
f"{sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)} productie")
else:
# Single-policy (backward compatible)
price_result = await asyncio.to_thread(
validation_service.validate_prices, all_codmats, id_pol,
conn, validation.get("direct_id_map")
@@ -421,8 +454,53 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
await asyncio.to_thread(
validation_service.ensure_prices,
price_result["missing_price"], id_pol,
conn, validation.get("direct_id_map"),
cota_tva=cota_tva
)
# Also validate mapped SKU prices (cherry-pick 1)
mapped_skus_in_orders = set()
for order in (truly_importable + already_in_roa):
for item in order.items:
if item.sku in validation["mapped"]:
mapped_skus_in_orders.add(item.sku)
if mapped_skus_in_orders:
mapped_codmat_data = await asyncio.to_thread(
validation_service.resolve_mapped_codmats, mapped_skus_in_orders, conn
)
# Build id_map for mapped codmats and validate/ensure their prices
mapped_id_map = {}
for sku, entries in mapped_codmat_data.items():
for entry in entries:
mapped_id_map[entry["codmat"]] = {
"id_articol": entry["id_articol"],
"cont": entry.get("cont")
}
mapped_codmats = set(mapped_id_map.keys())
if mapped_codmats:
if id_pol_productie:
mapped_policy_map = await asyncio.to_thread(
validation_service.validate_and_ensure_prices_dual,
mapped_codmats, id_pol, id_pol_productie,
conn, mapped_id_map, cota_tva=cota_tva
)
codmat_policy_map.update(mapped_policy_map)
else:
mp_result = await asyncio.to_thread(
validation_service.validate_prices,
mapped_codmats, id_pol, conn, mapped_id_map
)
if mp_result["missing_price"]:
await asyncio.to_thread(
validation_service.ensure_prices,
mp_result["missing_price"], id_pol,
conn, mapped_id_map, cota_tva=cota_tva
)
# Pass codmat_policy_map to import via app_settings
if codmat_policy_map:
app_settings["_codmat_policy_map"] = codmat_policy_map
finally:
await asyncio.to_thread(database.pool.release, conn)
@@ -507,7 +585,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
result = await asyncio.to_thread(
import_service.import_single_order,
order, id_pol=id_pol, id_sectie=id_sectie,
app_settings=app_settings, id_gestiuni=id_gestiuni
)
# Build order items data for storage (R9)
@@ -521,6 +599,10 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
"cantitate_roa": None
})
# Compute discount split for SQLite storage
ds = import_service.compute_discount_split(order, app_settings)
discount_split_json = json.dumps(ds) if ds else None
if result["success"]:
imported_count += 1
await sqlite_service.upsert_order(
@@ -540,6 +622,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
delivery_cost=order.delivery_cost or None,
discount_total=order.discount_total or None,
web_status=order.status or None,
discount_split=discount_split_json,
)
await sqlite_service.add_sync_run_order(run_id, order.number, "IMPORTED")
# Store ROA address IDs (R9)
@@ -569,6 +652,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
delivery_cost=order.delivery_cost or None,
discount_total=order.discount_total or None,
web_status=order.status or None,
discount_split=discount_split_json,
)
await sqlite_service.add_sync_run_order(run_id, order.number, "ERROR")
await sqlite_service.add_order_items(order.number, order_items_data)
@@ -673,14 +757,14 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
"already_imported": already_imported_count, "cancelled": cancelled_count})
if _current_sync:
_current_sync["status"] = status
_current_sync["finished_at"] = _now().isoformat()
logger.info(
f"Sync {run_id} completed: {imported_count} new, {already_imported_count} already imported, "
f"{len(skipped)} skipped, {error_count} errors, {cancelled_count} cancelled"
)
duration = (_now() - started_dt).total_seconds()
_log_line(run_id, "")
cancelled_text = f", {cancelled_count} anulate" if cancelled_count else ""
_run_logs[run_id].append(
@@ -696,7 +780,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
await sqlite_service.update_sync_run(run_id, "failed", 0, 0, 0, 1, error_message=str(e))
if _current_sync:
_current_sync["status"] = "failed"
_current_sync["finished_at"] = _now().isoformat()
_current_sync["error"] = str(e)
return {"run_id": run_id, "status": "failed", "error": str(e)}
finally:


@@ -28,10 +28,11 @@ def check_orders_in_roa(min_date, conn) -> dict:
return existing
def resolve_codmat_ids(codmats: set[str], id_gestiuni: list[int] = None, conn=None) -> dict[str, dict]:
"""Resolve CODMATs to best id_articol + cont: prefers article with stock, then MAX(id_articol).
Filters: sters=0 AND inactiv=0.
id_gestiuni: list of warehouse IDs to check stock in, or None for all.
Returns: {codmat: {"id_articol": int, "cont": str|None}}
"""
if not codmats:
return {}
@@ -40,8 +41,9 @@ def resolve_codmat_ids(codmats: set[str], id_gestiune: int = None, conn=None) ->
codmat_list = list(codmats)
# Build stoc subquery dynamically for index optimization
if id_gestiuni:
gest_placeholders = ",".join([f":g{k}" for k in range(len(id_gestiuni))])
stoc_filter = f"AND s.id_gestiune IN ({gest_placeholders})"
else:
stoc_filter = ""
@@ -54,12 +56,13 @@ def resolve_codmat_ids(codmats: set[str], id_gestiune: int = None, conn=None) ->
batch = codmat_list[i:i+500]
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
params = {f"c{j}": cm for j, cm in enumerate(batch)}
if id_gestiuni:
for k, gid in enumerate(id_gestiuni):
params[f"g{k}"] = gid
cur.execute(f"""
SELECT codmat, id_articol, cont FROM (
SELECT na.codmat, na.id_articol, na.cont,
ROW_NUMBER() OVER (
PARTITION BY na.codmat
ORDER BY
@@ -79,22 +82,22 @@ def resolve_codmat_ids(codmats: set[str], id_gestiune: int = None, conn=None) ->
) WHERE rn = 1
""", params)
for row in cur:
result[row[0]] = {"id_articol": row[1], "cont": row[2]}
finally:
if own_conn:
database.pool.release(conn)
logger.info(f"resolve_codmat_ids: {len(result)}/{len(codmats)} resolved (gestiuni={id_gestiuni})")
return result
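The dynamic `IN (:g0,:g1,...)` construction above avoids SQL string interpolation of values by generating one named bind variable per warehouse ID. Demonstrated here with `sqlite3`, which accepts the same `:name` placeholder style as python-oracledb (toy table, not the real `stoc`):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE stoc (id_gestiune INTEGER, qty INTEGER)")
con.executemany("INSERT INTO stoc VALUES (?, ?)", [(1, 5), (2, 0), (3, 7)])

id_gestiuni = [1, 3]
# One named placeholder per ID, plus a matching params dict
placeholders = ",".join(f":g{k}" for k in range(len(id_gestiuni)))
params = {f"g{k}": gid for k, gid in enumerate(id_gestiuni)}
rows = con.execute(
    f"SELECT SUM(qty) FROM stoc WHERE id_gestiune IN ({placeholders})", params
).fetchone()
print(rows[0])  # 12
```

Only the placeholder names are interpolated into the SQL text; the values always travel as binds, so the statement stays cacheable and injection-safe.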
def validate_skus(skus: set[str], conn=None, id_gestiuni: list[int] = None) -> dict:
"""Validate a set of SKUs against Oracle.
Returns: {mapped: set, direct: set, missing: set, direct_id_map: {codmat: {"id_articol": int, "cont": str|None}}}
- mapped: found in ARTICOLE_TERTI (active)
- direct: found in NOM_ARTICOLE by codmat (not in ARTICOLE_TERTI)
- missing: not found anywhere
- direct_id_map: {codmat: {"id_articol": int, "cont": str|None}} for direct SKUs
"""
if not skus:
return {"mapped": set(), "direct": set(), "missing": set(), "direct_id_map": {}}
@@ -124,11 +127,12 @@ def validate_skus(skus: set[str], conn=None, id_gestiune: int = None) -> dict:
# Resolve remaining SKUs via resolve_codmat_ids (consistent id_articol selection)
all_remaining = [s for s in sku_list if s not in mapped]
if all_remaining:
direct_id_map = resolve_codmat_ids(set(all_remaining), id_gestiuni, conn)
direct = set(direct_id_map.keys())
else:
direct_id_map = {}
direct = set()
finally:
if own_conn:
database.pool.release(conn)
@@ -136,7 +140,8 @@ def validate_skus(skus: set[str], conn=None, id_gestiune: int = None) -> dict:
missing = skus - mapped - direct
logger.info(f"SKU validation: {len(mapped)} mapped, {len(direct)} direct, {len(missing)} missing")
return {"mapped": mapped, "direct": direct, "missing": missing,
"direct_id_map": direct_id_map}
def classify_orders(orders, validation_result):
"""Classify orders as importable or skipped based on SKU validation.
@@ -158,6 +163,19 @@ def classify_orders(orders, validation_result):
return importable, skipped
def _extract_id_map(direct_id_map: dict) -> dict:
"""Extract {codmat: id_articol} from either enriched or simple format."""
if not direct_id_map:
return {}
result = {}
for cm, val in direct_id_map.items():
if isinstance(val, dict):
result[cm] = val["id_articol"]
else:
result[cm] = val
return result
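`_extract_id_map` exists because `direct_id_map` now carries the enriched `{codmat: {"id_articol": ..., "cont": ...}}` shape while older callers may still pass the legacy `{codmat: id_articol}` shape. An equivalent one-liner sketch (hypothetical standalone name) shows both collapsing to the same result:

```python
def extract_id_map(direct_id_map: dict) -> dict:
    """Accept both the enriched and the legacy id-map shapes."""
    if not direct_id_map:
        return {}
    return {cm: (val["id_articol"] if isinstance(val, dict) else val)
            for cm, val in direct_id_map.items()}

enriched = {"A100": {"id_articol": 7, "cont": "345"}}
legacy = {"A100": 7}
assert extract_id_map(enriched) == extract_id_map(legacy) == {"A100": 7}
```

This keeps `validate_prices` and `ensure_prices` backward compatible without forcing every call site to migrate at once.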
def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None) -> dict:
"""Check which CODMATs have a price entry in CRM_POLITICI_PRET_ART for the given policy.
If direct_id_map is provided, skips the NOM_ARTICOLE lookup for those CODMATs.
@@ -166,7 +184,7 @@ def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: di
if not codmats:
return {"has_price": set(), "missing_price": set()}
codmat_to_id = _extract_id_map(direct_id_map)
ids_with_price = set()
own_conn = conn is None
@@ -199,14 +217,18 @@ def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: di
     logger.info(f"Price validation (policy {id_pol}): {len(has_price)} have price, {len(missing_price)} missing price")
     return {"has_price": has_price, "missing_price": missing_price}


-def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None):
+def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None,
+                  cota_tva: float = None):
     """Insert price 0 entries for CODMATs missing from the given price policy.

     Uses batch executemany instead of individual INSERTs.
     Relies on TRG_CRM_POLITICI_PRET_ART trigger for ID_POL_ART sequence.
+    cota_tva: VAT rate from settings (e.g. 21) — used for PROC_TVAV metadata.
     """
     if not codmats:
         return
+    proc_tvav = 1 + (cota_tva / 100) if cota_tva else 1.21
     own_conn = conn is None
     if own_conn:
         conn = database.get_oracle_connection()
@@ -224,7 +246,7 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
         # Build batch params using direct_id_map (already resolved via resolve_codmat_ids)
         batch_params = []
-        codmat_id_map = dict(direct_id_map) if direct_id_map else {}
+        codmat_id_map = _extract_id_map(direct_id_map)

         for codmat in codmats:
             id_articol = codmat_id_map.get(codmat)
@@ -234,7 +256,8 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
             batch_params.append({
                 "id_pol": id_pol,
                 "id_articol": id_articol,
-                "id_valuta": id_valuta
+                "id_valuta": id_valuta,
+                "proc_tvav": proc_tvav
             })
         if batch_params:
@@ -244,9 +267,9 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
                     ID_UTIL, DATAORA, PROC_TVAV, PRETFTVA, PRETCTVA)
                 VALUES
                     (:id_pol, :id_articol, 0, :id_valuta,
-                     -3, SYSDATE, 1.19, 0, 0)
+                     -3, SYSDATE, :proc_tvav, 0, 0)
             """, batch_params)
-            logger.info(f"Batch inserted {len(batch_params)} price entries for policy {id_pol}")
+            logger.info(f"Batch inserted {len(batch_params)} price entries for policy {id_pol} (PROC_TVAV={proc_tvav})")
         conn.commit()
     finally:
@@ -254,3 +277,125 @@ def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict
             database.pool.release(conn)
     logger.info(f"Ensure prices done: {len(codmats)} CODMATs processed for policy {id_pol}")
+
+
+def validate_and_ensure_prices_dual(codmats: set[str], id_pol_vanzare: int,
+                                    id_pol_productie: int, conn, direct_id_map: dict,
+                                    cota_tva: float = 21) -> dict[str, int]:
+    """Dual-policy price validation: assign each CODMAT to sales or production policy.
+
+    Logic:
+    1. Check both policies in one SQL
+    2. If article in one policy → use that
+    3. If article in BOTH → prefer id_pol_vanzare
+    4. If article in NEITHER → check cont: 341/345 → production, else → sales; insert price 0
+
+    Returns: codmat_policy_map = {codmat: assigned_id_pol}
+    """
+    if not codmats:
+        return {}
+
+    codmat_policy_map = {}
+    id_map = _extract_id_map(direct_id_map)
+
+    # Collect all id_articol values we need to check
+    id_to_codmats = {}  # {id_articol: [codmat, ...]}
+    for cm in codmats:
+        aid = id_map.get(cm)
+        if aid:
+            id_to_codmats.setdefault(aid, []).append(cm)
+
+    if not id_to_codmats:
+        return {}
+
+    # Query both policies in one SQL
+    existing = {}  # {id_articol: set of id_pol}
+    id_list = list(id_to_codmats.keys())
+    with conn.cursor() as cur:
+        for i in range(0, len(id_list), 500):
+            batch = id_list[i:i+500]
+            placeholders = ",".join([f":a{j}" for j in range(len(batch))])
+            params = {f"a{j}": aid for j, aid in enumerate(batch)}
+            params["id_pol_v"] = id_pol_vanzare
+            params["id_pol_p"] = id_pol_productie
+            cur.execute(f"""
+                SELECT pa.ID_ARTICOL, pa.ID_POL FROM CRM_POLITICI_PRET_ART pa
+                WHERE pa.ID_POL IN (:id_pol_v, :id_pol_p) AND pa.ID_ARTICOL IN ({placeholders})
+            """, params)
+            for row in cur:
+                existing.setdefault(row[0], set()).add(row[1])
+
+    # Classify each codmat
+    missing_vanzare = set()    # CODMATs needing price 0 in sales policy
+    missing_productie = set()  # CODMATs needing price 0 in production policy
+    for aid, cms in id_to_codmats.items():
+        pols = existing.get(aid, set())
+        for cm in cms:
+            if pols:
+                if id_pol_vanzare in pols:
+                    codmat_policy_map[cm] = id_pol_vanzare
+                elif id_pol_productie in pols:
+                    codmat_policy_map[cm] = id_pol_productie
+            else:
+                # Not in any policy — classify by cont
+                info = direct_id_map.get(cm, {})
+                cont = info.get("cont", "") if isinstance(info, dict) else ""
+                cont_str = str(cont or "").strip()
+                if cont_str in ("341", "345"):
+                    codmat_policy_map[cm] = id_pol_productie
+                    missing_productie.add(cm)
+                else:
+                    codmat_policy_map[cm] = id_pol_vanzare
+                    missing_vanzare.add(cm)
+
+    # Ensure prices for missing articles in each policy
+    if missing_vanzare:
+        ensure_prices(missing_vanzare, id_pol_vanzare, conn, direct_id_map, cota_tva=cota_tva)
+    if missing_productie:
+        ensure_prices(missing_productie, id_pol_productie, conn, direct_id_map, cota_tva=cota_tva)
+
+    logger.info(
+        f"Dual-policy: {len(codmat_policy_map)} CODMATs assigned "
+        f"(vanzare={sum(1 for v in codmat_policy_map.values() if v == id_pol_vanzare)}, "
+        f"productie={sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)})"
+    )
+    return codmat_policy_map
+
+
+def resolve_mapped_codmats(mapped_skus: set[str], conn) -> dict[str, list[dict]]:
+    """For mapped SKUs, get their underlying CODMATs from ARTICOLE_TERTI + nom_articole.
+
+    Returns: {sku: [{"codmat": str, "id_articol": int, "cont": str|None}]}
+    """
+    if not mapped_skus:
+        return {}
+
+    result = {}
+    sku_list = list(mapped_skus)
+    with conn.cursor() as cur:
+        for i in range(0, len(sku_list), 500):
+            batch = sku_list[i:i+500]
+            placeholders = ",".join([f":s{j}" for j in range(len(batch))])
+            params = {f"s{j}": sku for j, sku in enumerate(batch)}
+            cur.execute(f"""
+                SELECT at.sku, at.codmat, na.id_articol, na.cont
+                FROM ARTICOLE_TERTI at
+                JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
+                WHERE at.sku IN ({placeholders}) AND at.activ = 1 AND at.sters = 0
+            """, params)
+            for row in cur:
+                sku = row[0]
+                if sku not in result:
+                    result[sku] = []
+                result[sku].append({
+                    "codmat": row[1],
+                    "id_articol": row[2],
+                    "cont": row[3]
+                })
+    logger.info(f"resolve_mapped_codmats: {len(result)} SKUs → {sum(len(v) for v in result.values())} CODMATs")
+    return result

View File

@@ -425,13 +425,12 @@ tr.mapping-deleted td {
 /* ── Search input ────────────────────────────────── */
 .search-input {
-  margin-left: auto;
   padding: 0.375rem 0.75rem;
   border: 1px solid #d1d5db;
   border-radius: 0.375rem;
   font-size: 0.9375rem;
   outline: none;
-  min-width: 180px;
+  width: 160px;
 }
 .search-input:focus { border-color: var(--blue-600); }
@@ -706,7 +705,7 @@ tr.mapping-deleted td {
     gap: 0.375rem;
   }
   .filter-pill { padding: 0.25rem 0.5rem; font-size: 0.8125rem; }
-  .search-input { min-width: 0; width: 100%; order: 99; }
+  .search-input { min-width: 0; width: auto; flex: 1; }
   .page-btn.page-number { display: none; }
   .page-btn.page-ellipsis { display: none; }
   .table-responsive { display: none; }

View File

@@ -478,6 +478,9 @@ function renderCodmatCell(item) {
   }
   if (item.codmat_details.length === 1) {
     const d = item.codmat_details[0];
+    if (d.direct) {
+      return `<code>${esc(d.codmat)}</code> <span class="badge bg-secondary" style="font-size:0.6rem;vertical-align:middle">direct</span>`;
+    }
     return `<code>${esc(d.codmat)}</code>`;
   }
   return item.codmat_details.map(d =>
@@ -575,7 +578,19 @@ async function openDashOrderDetail(orderNumber) {
     if (dlvEl) dlvEl.textContent = order.delivery_cost > 0 ? Number(order.delivery_cost).toFixed(2) + ' lei' : '';
     const dscEl = document.getElementById('detailDiscount');
-    if (dscEl) dscEl.textContent = order.discount_total > 0 ? '' + Number(order.discount_total).toFixed(2) + ' lei' : '';
+    if (dscEl) {
+      if (order.discount_total > 0 && order.discount_split && typeof order.discount_split === 'object') {
+        const entries = Object.entries(order.discount_split);
+        if (entries.length > 1) {
+          const parts = entries.map(([vat, amt]) => `${Number(amt).toFixed(2)} (TVA ${vat}%)`);
+          dscEl.innerHTML = parts.join('<br>');
+        } else {
+          dscEl.textContent = '' + Number(order.discount_total).toFixed(2) + ' lei';
+        }
+      } else {
+        dscEl.textContent = order.discount_total > 0 ? '' + Number(order.discount_total).toFixed(2) + ' lei' : '';
+      }
+    }

     const items = data.items || [];
     if (items.length === 0) {
@@ -596,7 +611,7 @@ async function openDashOrderDetail(orderNumber) {
     if (mobileContainer) {
       mobileContainer.innerHTML = '<div class="detail-item-flat">' + items.map((item, idx) => {
         const codmatText = item.codmat_details?.length
-          ? item.codmat_details.map(d => `<code>${esc(d.codmat)}</code>`).join(' ')
+          ? item.codmat_details.map(d => `<code>${esc(d.codmat)}</code>${d.direct ? ' <span class="badge bg-secondary" style="font-size:0.55rem">direct</span>' : ''}`).join(' ')
           : `<code>${esc(item.codmat || '')}</code>`;
         const valoare = (Number(item.price || 0) * Number(item.quantity || 0)).toFixed(2);
         return `<div class="dif-item">
@@ -642,9 +657,27 @@ function openQuickMap(sku, productName, orderNumber, itemIdx) {
   const container = document.getElementById('qmCodmatLines');
   container.innerHTML = '';

-  // Pre-populate with existing codmat_details if available
+  // Check if this is a direct SKU (SKU=CODMAT in NOM_ARTICOLE)
   const item = (window._detailItems || [])[itemIdx];
   const details = item?.codmat_details;
+  const isDirect = details?.length === 1 && details[0].direct === true;
+  const directInfo = document.getElementById('qmDirectInfo');
+  const saveBtn = document.getElementById('qmSaveBtn');
+  if (isDirect) {
+    if (directInfo) {
+      directInfo.innerHTML = `<i class="bi bi-info-circle"></i> SKU = CODMAT direct in nomenclator (<code>${escHtml(details[0].codmat)}</code> — ${escHtml(details[0].denumire || '')}).<br><small class="text-muted">Poti suprascrie cu un alt CODMAT daca e necesar (ex: reambalare).</small>`;
+      directInfo.style.display = '';
+    }
+    if (saveBtn) {
+      saveBtn.textContent = 'Suprascrie mapare';
+    }
+    addQmCodmatLine();
+  } else {
+    if (directInfo) directInfo.style.display = 'none';
+    if (saveBtn) saveBtn.textContent = 'Salveaza';
+    // Pre-populate with existing codmat_details if available
     if (details && details.length > 0) {
       details.forEach(d => {
         addQmCodmatLine({ codmat: d.codmat, cantitate: d.cantitate_roa, procent: d.procent_pret, denumire: d.denumire });
@@ -652,6 +685,7 @@ function openQuickMap(sku, productName, orderNumber, itemIdx) {
     } else {
       addQmCodmatLine();
     }
+  }

   new bootstrap.Modal(document.getElementById('quickMapModal')).show();
 }
@@ -762,7 +796,9 @@ async function saveQuickMapping() {
       if (currentQmOrderNumber) openDashOrderDetail(currentQmOrderNumber);
       loadDashOrders();
     } else {
-      alert('Eroare: ' + (data.error || 'Unknown'));
+      const msg = data.detail || data.error || 'Unknown';
+      document.getElementById('qmPctWarning').textContent = msg;
+      document.getElementById('qmPctWarning').style.display = '';
     }
   } catch (err) {
     alert('Eroare: ' + err.message);
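The `discount_split` rendering above implies the backend delivers the order-level discount already divided per VAT rate, keyed `{vat_rate: amount}`. The backend code is not part of this diff; a minimal Python sketch of the proportional split the UI assumes (names are illustrative):

```python
def split_discount_by_vat(items, discount_total):
    """Split an order-level discount across VAT rates, proportionally
    to each rate's share of the item value (assumed backend behaviour;
    the real implementation is not shown in this diff)."""
    totals = {}
    for it in items:
        val = it["price"] * it["quantity"]
        totals[it["vat"]] = totals.get(it["vat"], 0.0) + val
    grand = sum(totals.values())
    if grand <= 0:
        return {}
    split = {vat: round(discount_total * val / grand, 2) for vat, val in totals.items()}
    # push any rounding remainder onto the largest bucket so the parts sum exactly
    diff = round(discount_total - sum(split.values()), 2)
    if diff:
        biggest = max(split, key=split.get)
        split[biggest] = round(split[biggest] + diff, 2)
    return split
```

With two rates carrying equal value, a 10 lei discount splits 5/5; the remainder correction keeps the parts summing to the original discount even for uneven thirds.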

View File

@@ -18,12 +18,13 @@ async function loadDropdowns() {
     const politici = await politiciRes.json();
     const gestiuni = await gestiuniRes.json();

-    const gestiuneEl = document.getElementById('settIdGestiune');
-    if (gestiuneEl) {
-      gestiuneEl.innerHTML = '<option value="">— orice gestiune —</option>';
+    const gestContainer = document.getElementById('settGestiuniContainer');
+    if (gestContainer) {
+      gestContainer.innerHTML = '';
       gestiuni.forEach(g => {
-        gestiuneEl.innerHTML += `<option value="${escHtml(g.id)}">${escHtml(g.label)}</option>`;
+        gestContainer.innerHTML += `<div class="form-check mb-0"><input class="form-check-input" type="checkbox" value="${escHtml(g.id)}" id="gestChk_${escHtml(g.id)}"><label class="form-check-label" for="gestChk_${escHtml(g.id)}">${escHtml(g.label)}</label></div>`;
       });
+      if (gestiuni.length === 0) gestContainer.innerHTML = '<span class="text-muted small">Nicio gestiune disponibilă</span>';
     }

     const sectieEl = document.getElementById('settIdSectie');
@@ -57,6 +58,14 @@ async function loadDropdowns() {
         dPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
       });
     }
+
+    const pPolEl = document.getElementById('settIdPolProductie');
+    if (pPolEl) {
+      pPolEl.innerHTML = '<option value="">— fără politică producție —</option>';
+      politici.forEach(p => {
+        pPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
+      });
+    }
   } catch (err) {
     console.error('loadDropdowns error:', err);
   }
@@ -71,11 +80,21 @@ async function loadSettings() {
     if (el('settTransportVat')) el('settTransportVat').value = data.transport_vat || '21';
     if (el('settTransportIdPol')) el('settTransportIdPol').value = data.transport_id_pol || '';
     if (el('settDiscountCodmat')) el('settDiscountCodmat').value = data.discount_codmat || '';
-    if (el('settDiscountVat')) el('settDiscountVat').value = data.discount_vat || '19';
+    if (el('settDiscountVat')) el('settDiscountVat').value = data.discount_vat || '21';
     if (el('settDiscountIdPol')) el('settDiscountIdPol').value = data.discount_id_pol || '';
+    if (el('settSplitDiscountVat')) el('settSplitDiscountVat').checked = data.split_discount_vat === "1";
     if (el('settIdPol')) el('settIdPol').value = data.id_pol || '';
+    if (el('settIdPolProductie')) el('settIdPolProductie').value = data.id_pol_productie || '';
     if (el('settIdSectie')) el('settIdSectie').value = data.id_sectie || '';
-    if (el('settIdGestiune')) el('settIdGestiune').value = data.id_gestiune || '';
+    // Multi-gestiune checkboxes
+    const gestVal = data.id_gestiune || '';
+    if (gestVal) {
+      const selectedIds = gestVal.split(',').map(s => s.trim());
+      selectedIds.forEach(id => {
+        const chk = document.getElementById('gestChk_' + id);
+        if (chk) chk.checked = true;
+      });
+    }
     if (el('settGomagApiKey')) el('settGomagApiKey').value = data.gomag_api_key || '';
     if (el('settGomagApiShop')) el('settGomagApiShop').value = data.gomag_api_shop || '';
     if (el('settGomagDaysBack')) el('settGomagDaysBack').value = data.gomag_order_days_back || '7';
@@ -93,11 +112,13 @@ async function saveSettings() {
       transport_vat: el('settTransportVat')?.value || '21',
       transport_id_pol: el('settTransportIdPol')?.value?.trim() || '',
       discount_codmat: el('settDiscountCodmat')?.value?.trim() || '',
-      discount_vat: el('settDiscountVat')?.value || '19',
+      discount_vat: el('settDiscountVat')?.value || '21',
       discount_id_pol: el('settDiscountIdPol')?.value?.trim() || '',
+      split_discount_vat: el('settSplitDiscountVat')?.checked ? "1" : "",
       id_pol: el('settIdPol')?.value?.trim() || '',
+      id_pol_productie: el('settIdPolProductie')?.value?.trim() || '',
       id_sectie: el('settIdSectie')?.value?.trim() || '',
-      id_gestiune: el('settIdGestiune')?.value?.trim() || '',
+      id_gestiune: Array.from(document.querySelectorAll('#settGestiuniContainer input:checked')).map(c => c.value).join(','),
       gomag_api_key: el('settGomagApiKey')?.value?.trim() || '',
       gomag_api_shop: el('settGomagApiShop')?.value?.trim() || '',
       gomag_order_days_back: el('settGomagDaysBack')?.value?.trim() || '7',
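On the Python side, the comma-separated `id_gestiune` value saved above is turned into an Oracle IN clause built from bind variables (per the commit notes). A minimal sketch of that pattern, with illustrative names; an empty setting means no warehouse filter:

```python
def gestiune_bind_clause(id_gestiune_csv):
    """Build an Oracle IN-clause fragment plus bind dict from the
    comma-separated setting (e.g. "1,3"). Empty string -> no filter.
    Sketch only — clause text and bind names are illustrative."""
    ids = [s.strip() for s in (id_gestiune_csv or "").split(",") if s.strip()]
    if not ids:
        return "", {}
    # one named bind per id, so values never get interpolated into SQL text
    placeholders = ",".join(f":g{i}" for i in range(len(ids)))
    binds = {f"g{i}": int(v) for i, v in enumerate(ids)}
    return f"AND s.id_gestiune IN ({placeholders})", binds
```

The clause fragment is appended to the stock query and the bind dict merged into the cursor's parameters, mirroring how the validators batch bind variables elsewhere.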

View File

@@ -7,7 +7,7 @@
     <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
     <link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.2/font/bootstrap-icons.css" rel="stylesheet">
     {% set rp = request.scope.get('root_path', '') %}
-    <link href="{{ rp }}/static/css/style.css?v=11" rel="stylesheet">
+    <link href="{{ rp }}/static/css/style.css?v=14" rel="stylesheet">
 </head>
 <body>
     <!-- Top Navbar -->

View File

@@ -66,6 +66,7 @@
                 <span>&#8212;</span>
                 <input type="date" id="periodEnd" class="select-compact">
             </div>
+            <input type="search" id="orderSearch" placeholder="Cauta comanda, client..." class="search-input">
             <!-- Status pills -->
             <button class="filter-pill active d-none d-md-inline-flex" data-status="all">Toate <span class="filter-count fc-neutral" id="cntAll">0</span></button>
             <button class="filter-pill d-none d-md-inline-flex" data-status="IMPORTED">Importat <span class="filter-count fc-green" id="cntImp">0</span></button>
@@ -74,9 +75,7 @@
             <button class="filter-pill d-none d-md-inline-flex" data-status="INVOICED">Facturate <span class="filter-count fc-green" id="cntFact">0</span></button>
             <button class="filter-pill d-none d-md-inline-flex" data-status="UNINVOICED">Nefacturate <span class="filter-count fc-red" id="cntNef">0</span></button>
             <button class="filter-pill d-none d-md-inline-flex" data-status="CANCELLED">Anulate <span class="filter-count fc-dark" id="cntCanc">0</span></button>
-            <button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex align-items-center gap-1" id="btnRefreshInvoices" onclick="refreshInvoices()" title="Actualizeaza status facturi din Oracle">&#8635; Facturi</button>
-            <!-- Search (integrated, end of row) -->
-            <input type="search" id="orderSearch" placeholder="Cauta..." class="search-input">
+            <button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" id="btnRefreshInvoices" onclick="refreshInvoices()" title="Actualizeaza status facturi din Oracle">&#8635;</button>
         </div>
         <div class="d-md-none mb-2 d-flex align-items-center gap-2">
             <div class="flex-grow-1" id="dashMobileSeg"></div>
@@ -192,11 +191,12 @@
                     <button type="button" class="btn btn-sm btn-outline-secondary mt-1" onclick="addQmCodmatLine()" style="font-size:0.8rem; padding:2px 10px">
                         + CODMAT
                     </button>
+                    <div id="qmDirectInfo" class="alert alert-info mt-2" style="display:none; font-size:0.85rem; padding:8px 12px;"></div>
                     <div id="qmPctWarning" class="text-danger mt-2" style="display:none;"></div>
                 </div>
                 <div class="modal-footer">
                     <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
-                    <button type="button" class="btn btn-primary" onclick="saveQuickMapping()">Salveaza</button>
+                    <button type="button" class="btn btn-primary" id="qmSaveBtn" onclick="saveQuickMapping()">Salveaza</button>
                 </div>
             </div>
         </div>
@@ -204,5 +204,5 @@
 {% endblock %}

 {% block scripts %}
-<script src="{{ request.scope.get('root_path', '') }}/static/js/dashboard.js?v=15"></script>
+<script src="{{ request.scope.get('root_path', '') }}/static/js/dashboard.js?v=17"></script>
 {% endblock %}

View File

@@ -39,10 +39,11 @@
                     <div class="card-header py-2 px-3 fw-semibold">Import ROA</div>
                     <div class="card-body py-2 px-3">
                         <div class="mb-2">
-                            <label class="form-label mb-0 small">Gestiune (ID_GESTIUNE)</label>
-                            <select class="form-select form-select-sm" id="settIdGestiune">
-                                <option value="">— orice gestiune —</option>
-                            </select>
+                            <label class="form-label mb-0 small">Gestiuni pentru verificare stoc</label>
+                            <div id="settGestiuniContainer" class="border rounded p-2" style="max-height:120px;overflow-y:auto;font-size:0.85rem">
+                                <span class="text-muted small">Se încarcă...</span>
+                            </div>
+                            <div class="form-text" style="font-size:0.75rem">Nicio selecție = orice gestiune</div>
                         </div>
                         <div class="mb-2">
                             <label class="form-label mb-0 small">Secție (ID_SECTIE)</label>
@@ -51,11 +52,18 @@
                             </select>
                         </div>
                         <div class="mb-2">
-                            <label class="form-label mb-0 small">Politică de Preț (ID_POL)</label>
+                            <label class="form-label mb-0 small">Politică Preț Vânzare (ID_POL)</label>
                             <select class="form-select form-select-sm" id="settIdPol">
                                 <option value="">— selectează politică —</option>
                             </select>
                         </div>
+                        <div class="mb-2">
+                            <label class="form-label mb-0 small">Politică Preț Producție</label>
+                            <select class="form-select form-select-sm" id="settIdPolProductie">
+                                <option value="">— fără politică producție —</option>
+                            </select>
+                            <div class="form-text" style="font-size:0.75rem">Pentru articole cu cont 341/345 (producție proprie)</div>
+                        </div>
                     </div>
                 </div>
             </div>
@@ -113,8 +121,9 @@
                                 <select class="form-select form-select-sm" id="settDiscountVat">
                                     <option value="5">5%</option>
                                     <option value="9">9%</option>
-                                    <option value="19" selected>19%</option>
-                                    <option value="21">21%</option>
+                                    <option value="11">11%</option>
+                                    <option value="19">19%</option>
+                                    <option value="21" selected>21%</option>
                                 </select>
                             </div>
                             <div class="col-6">
@@ -124,6 +133,12 @@
                                 </select>
                             </div>
                         </div>
+                        <div class="mt-2 form-check">
+                            <input type="checkbox" class="form-check-input" id="settSplitDiscountVat">
+                            <label class="form-check-label small" for="settSplitDiscountVat">
+                                Împarte discount pe cote TVA (proporțional cu valoarea articolelor)
+                            </label>
+                        </div>
                     </div>
                 </div>
             </div>
@@ -152,5 +167,5 @@
 {% endblock %}

 {% block scripts %}
-<script src="{{ request.scope.get('root_path', '') }}/static/js/settings.js?v=4"></script>
+<script src="{{ request.scope.get('root_path', '') }}/static/js/settings.js?v=6"></script>
 {% endblock %}

View File

@@ -61,7 +61,7 @@ CREATE OR REPLACE PACKAGE PACK_IMPORT_COMENZI AS
         p_id_adresa_facturare IN NUMBER DEFAULT NULL,
         p_id_pol IN NUMBER DEFAULT NULL,
         p_id_sectie IN NUMBER DEFAULT NULL,
-        p_id_gestiune IN NUMBER DEFAULT NULL,
+        p_id_gestiune IN VARCHAR2 DEFAULT NULL,
         v_id_comanda OUT NUMBER);

     -- Functii pentru managementul erorilor (pentru orchestrator VFP)
@@ -93,11 +93,11 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
     -- Functie helper: selecteaza id_articol corect pentru un CODMAT
     -- Prioritate: sters=0 AND inactiv=0, preferinta stoc, MAX(id_articol) fallback
     -- ================================================================
-    FUNCTION resolve_id_articol(p_codmat IN VARCHAR2, p_id_gest IN NUMBER) RETURN NUMBER IS
+    FUNCTION resolve_id_articol(p_codmat IN VARCHAR2, p_id_gest IN VARCHAR2) RETURN NUMBER IS
         v_result NUMBER;
     BEGIN
         IF p_id_gest IS NOT NULL THEN
-            -- Cu gestiune specifica — Oracle poate folosi index pe stoc(id_gestiune, an, luna)
+            -- Cu gestiuni specifice (CSV: "1,3") — split in subquery pentru IN clause
             BEGIN
                 SELECT id_articol INTO v_result FROM (
                     SELECT na.id_articol
@@ -107,7 +107,11 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
                         CASE WHEN EXISTS (
                             SELECT 1 FROM stoc s
                             WHERE s.id_articol = na.id_articol
-                              AND s.id_gestiune = p_id_gest
+                              AND s.id_gestiune IN (
+                                  SELECT TO_NUMBER(REGEXP_SUBSTR(p_id_gest, '[^,]+', 1, LEVEL))
+                                  FROM DUAL
+                                  CONNECT BY LEVEL <= REGEXP_COUNT(p_id_gest, ',') + 1
+                              )
                               AND s.an = EXTRACT(YEAR FROM SYSDATE)
                               AND s.luna = EXTRACT(MONTH FROM SYSDATE)
                               AND s.cants + s.cant - s.cante > 0
@@ -150,7 +154,7 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
         p_id_adresa_facturare IN NUMBER DEFAULT NULL,
         p_id_pol IN NUMBER DEFAULT NULL,
         p_id_sectie IN NUMBER DEFAULT NULL,
-        p_id_gestiune IN NUMBER DEFAULT NULL,
+        p_id_gestiune IN VARCHAR2 DEFAULT NULL,
         v_id_comanda OUT NUMBER) IS
         v_data_livrare DATE;
         v_sku VARCHAR2(100);
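The `REGEXP_SUBSTR`/`CONNECT BY LEVEL` split above can be mirrored in Python to reason about edge cases. Note that `'[^,]+'` matches only non-empty tokens, so for an input like `"1,,3"` the extra `LEVEL` yields NULL, which is harmless inside an IN list. A behavioural sketch (not production code; function name is illustrative):

```python
import re

def split_csv_like_oracle(p_id_gest: str):
    """Mirror of: TO_NUMBER(REGEXP_SUBSTR(p, '[^,]+', 1, LEVEL))
    for LEVEL in 1 .. REGEXP_COUNT(p, ',') + 1.
    Returns None where Oracle would return NULL (LEVELs past the
    last non-empty token, since [^,]+ skips empty elements)."""
    levels = p_id_gest.count(",") + 1          # REGEXP_COUNT(p, ',') + 1 rows
    tokens = re.findall(r"[^,]+", p_id_gest)   # non-empty tokens only
    return [int(tokens[i]) if i < len(tokens) else None for i in range(levels)]
```

This is why the Python side can store the setting as a plain comma-join of checkbox values without worrying about stray empty elements.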

View File

@@ -1,179 +0,0 @@
-"""Analyze billing vs shipping patterns across all GoMag orders"""
-import sys, json, httpx
-
-sys.stdout.reconfigure(encoding='utf-8', errors='replace')
-
-API_KEY = '4c5e46df8f6c4f054fe2787de7a13d4a'
-API_SHOP = 'https://coffeepoint.ro'
-API_URL = 'https://api.gomag.ro/api/v1/order/read/json'
-
-headers = {
-    'Apikey': API_KEY,
-    'ApiShop': API_SHOP,
-    'User-Agent': 'Mozilla/5.0',
-    'Content-Type': 'application/json',
-}
-
-params = {'startDate': '2026-03-08', 'page': 1, 'limit': 250}
-all_orders = []
-for page in range(1, 20):
-    params['page'] = page
-    resp = httpx.get(API_URL, headers=headers, params=params, timeout=60)
-    data = resp.json()
-    pages = data.get('pages', 1)
-    orders_raw = data.get('orders', {})
-    if isinstance(orders_raw, dict):
-        for key, order in orders_raw.items():
-            if isinstance(order, dict):
-                all_orders.append(order)
-    print(f"Page {page}/{pages}: {len(orders_raw)} orders")
-    if page >= pages:
-        break
-
-print(f"\nTotal orders: {len(all_orders)}")
-
-# Analyze patterns
-company_orders = []
-person_orders = []
-diff_person_orders = []  # shipping != billing person
-
-for order in all_orders:
-    billing = order.get('billing', {}) or {}
-    shipping = order.get('shipping', {}) or {}
-    company = billing.get('company', {})
-    is_company = isinstance(company, dict) and bool(company.get('name'))
-    b_first = billing.get('firstname', '') or ''
-    b_last = billing.get('lastname', '') or ''
-    s_first = shipping.get('firstname', '') or ''
-    s_last = shipping.get('lastname', '') or ''
-    billing_person = f"{b_first} {b_last}".strip()
-    shipping_person = f"{s_first} {s_last}".strip()
-    entry = {
-        'number': order.get('number', ''),
-        'total': order.get('total', ''),
-        'billing_person': billing_person,
-        'shipping_person': shipping_person,
-        'company_name': company.get('name', '') if is_company else '',
-        'company_code': company.get('code', '') if is_company else '',
-        'is_company': is_company,
-        'same_person': billing_person.upper() == shipping_person.upper(),
-    }
-    if is_company:
-        company_orders.append(entry)
-    else:
-        person_orders.append(entry)
-        if not entry['same_person'] and shipping_person and billing_person:
-            diff_person_orders.append(entry)
-
-print(f"\n{'='*80}")
-print(f"COMPANY orders (billing has company): {len(company_orders)}")
-print(f"PERSON orders (no company): {len(person_orders)}")
-print(f"  - same billing/shipping person: {len(person_orders) - len(diff_person_orders)}")
-print(f"  - DIFFERENT billing/shipping person: {len(diff_person_orders)}")
-
-# Show company examples
-print(f"\n{'='*80}")
-print(f"COMPANY ORDERS — first 20 examples")
-print(f"{'ORDER':>12s} {'COMPANY':35s} {'CUI':20s} {'BILLING_PERSON':25s} {'SHIPPING_PERSON':25s} {'SAME':5s}")
-for e in company_orders[:20]:
-    same = 'DA' if e['same_person'] else 'NU'
-    print(f"{e['number']:>12s} {e['company_name'][:35]:35s} {e['company_code'][:20]:20s} {e['billing_person'][:25]:25s} {e['shipping_person'][:25]:25s} {same:5s}")
-
-# Show different person examples
-print(f"\n{'='*80}")
-print(f"DIFFERENT PERSON (no company, billing != shipping) — ALL {len(diff_person_orders)} examples")
-print(f"{'ORDER':>12s} {'BILLING_PERSON':30s} {'SHIPPING_PERSON':30s}")
-for e in diff_person_orders:
-    print(f"{e['number']:>12s} {e['billing_person'][:30]:30s} {e['shipping_person'][:30]:30s}")
-
-# Show person orders with same name
-print(f"\n{'='*80}")
-print(f"PERSON ORDERS (same billing/shipping) — first 10 examples")
-print(f"{'ORDER':>12s} {'BILLING_PERSON':30s} {'SHIPPING_PERSON':30s} {'TOTAL':>10s}")
-same_person = [e for e in person_orders if e['same_person']]
-for e in same_person[:10]:
-    print(f"{e['number']:>12s} {e['billing_person'][:30]:30s} {e['shipping_person'][:30]:30s} {e['total']:>10s}")
-
-# Now cross-reference with Oracle to verify import logic
-# For company orders, check what partner name was created in ROA
-import oracledb, os, sqlite3
-os.environ['PATH'] = r'C:\app\Server\product\18.0.0\dbhomeXE\bin' + ';' + os.environ.get('PATH','')
-oracledb.init_oracle_client()
-
-db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
-db.row_factory = sqlite3.Row
-c = db.cursor()
-
-conn = oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA')
-cur = conn.cursor()
-
-print(f"\n{'='*80}")
-print(f"VERIFICATION: Company orders — GoMag vs SQLite vs Oracle ROA")
-print(f"{'ORDER':>12s} | {'GOMAG_COMPANY':30s} | {'SQLITE_CUSTOMER':30s} | {'ROA_PARTNER':30s} | MATCH?")
-
-# Build lookup from GoMag orders
-gomag_lookup = {}
-for order in all_orders:
num = order.get('number', '')
gomag_lookup[num] = order
# Get all imported orders from SQLite with id_partener
c.execute("""
SELECT order_number, customer_name, id_partener, billing_name, shipping_name
FROM orders WHERE id_partener IS NOT NULL
""")
for row in c.fetchall():
on = row['order_number']
gomag = gomag_lookup.get(on)
if not gomag:
continue
billing = gomag.get('billing', {}) or {}
company = billing.get('company', {})
is_company = isinstance(company, dict) and bool(company.get('name'))
company_name = company.get('name', '') if is_company else ''
# Get ROA partner name
roa_partner = ''
if row['id_partener']:
cur.execute("SELECT denumire, prenume FROM nom_parteneri WHERE id_part = :1", [row['id_partener']])
r = cur.fetchone()
if r:
roa_partner = ((r[0] or '') + ' ' + (r[1] or '')).strip()
gomag_label = company_name if is_company else f"{billing.get('firstname','')} {billing.get('lastname','')}"
match = 'OK' if not company_name or company_name.upper()[:15] in roa_partner.upper() else 'DIFF'
print(f"{on:>12s} | {gomag_label[:30]:30s} | {(row['customer_name'] or '')[:30]:30s} | {roa_partner[:30]:30s} | {match}")
# Also show some SKIPPED company orders to see what customer_name we stored
print(f"\n{'='*80}")
print(f"SKIPPED company orders — GoMag company vs SQLite customer_name")
print(f"{'ORDER':>12s} | {'GOMAG_COMPANY':30s} | {'SQLITE_CUSTOMER':30s} | {'BILLING_PERSON':25s} | {'SHIPPING_PERSON':25s}")
c.execute("SELECT order_number, customer_name, billing_name, shipping_name FROM orders WHERE status = 'SKIPPED' LIMIT 200")
for row in c.fetchall():
on = row['order_number']
gomag = gomag_lookup.get(on)
if not gomag:
continue
billing = gomag.get('billing', {}) or {}
company = billing.get('company', {})
is_company = isinstance(company, dict) and bool(company.get('name'))
if not is_company:
continue
company_name = company.get('name', '')
b_person = f"{billing.get('firstname','')} {billing.get('lastname','')}".strip()
shipping = gomag.get('shipping', {}) or {}
s_person = f"{shipping.get('firstname','')} {shipping.get('lastname','')}".strip()
print(f"{on:>12s} | {company_name[:30]:30s} | {(row['customer_name'] or '')[:30]:30s} | {b_person[:25]:25s} | {s_person[:25]:25s}")
db.close()
conn.close()


@@ -1,31 +0,0 @@
import sqlite3, sys, importlib
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
# Check SQLite current state
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
c = db.cursor()
c.execute("SELECT order_number, customer_name, shipping_name, billing_name FROM orders WHERE order_number='480102897'")
r = c.fetchone()
print(f"SQLite: customer={r[1]}, shipping={r[2]}, billing={r[3]}")
db.close()
# Check deployed code version
sys.path.insert(0, r'C:\gomag-vending\api')
from app.services.sync_service import _derive_customer_info
from app.services.order_reader import OrderData, OrderBilling, OrderShipping
# Simulate the order
billing = OrderBilling(firstname='Liviu', lastname='Stoica', is_company=True, company_name='SLM COMERCE SRL')
shipping = OrderShipping(firstname='Liviu', lastname='Stoica')
order = OrderData(id='1', number='480102897', date='2026-03-09', billing=billing, shipping=shipping)
s, b, customer, _, _ = _derive_customer_info(order)
print(f"Code: _derive_customer_info returns customer={customer!r}")
# Check if the sqlite_service has the fix
import inspect
from app.services.sqlite_service import upsert_order
source = inspect.getsource(upsert_order)
if 'customer_name = excluded.customer_name' in source:
print("sqlite_service: upsert has customer_name update ✓")
else:
print("sqlite_service: upsert MISSING customer_name update ✗")


@@ -1,61 +0,0 @@
"""Check imported orders in SQLite and Oracle — what needs to be deleted"""
import sys, sqlite3, oracledb, os
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
os.environ['PATH'] = r'C:\app\Server\product\18.0.0\dbhomeXE\bin' + ';' + os.environ.get('PATH','')
oracledb.init_oracle_client()
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
db.row_factory = sqlite3.Row
c = db.cursor()
# Get imported orders with id_comanda
c.execute("""
SELECT order_number, customer_name, id_comanda, id_partener, order_date, order_total
FROM orders
WHERE status = 'IMPORTED' AND id_comanda IS NOT NULL
ORDER BY order_date
""")
imported = [dict(r) for r in c.fetchall()]
db.close()
print(f"Imported orders in SQLite: {len(imported)}")
# Check Oracle status
conn = oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA')
cur = conn.cursor()
print(f"\n{'ORDER_NR':>12s} {'ID_CMD':>8s} {'SQLITE_CLIENT':30s} {'ROA_PARTNER':30s} {'ROA_STERS':>9s} {'FACTURAT':>8s} {'DATA':>12s}")
for o in imported:
id_cmd = o['id_comanda']
# Check COMENZI
cur.execute("""
SELECT c.sters, p.denumire, p.prenume,
(SELECT COUNT(*) FROM vanzari v WHERE v.id_comanda = c.id_comanda AND v.sters = 0) as nr_facturi
FROM comenzi c
LEFT JOIN nom_parteneri p ON c.id_part = p.id_part
WHERE c.id_comanda = :1
""", [id_cmd])
row = cur.fetchone()
if row:
sters = row[0]
partner = ((row[1] or '') + ' ' + (row[2] or '')).strip()
nr_fact = row[3]
print(f"{o['order_number']:>12s} {id_cmd:>8d} {(o['customer_name'] or '')[:30]:30s} {partner[:30]:30s} {'DA' if sters else 'NU':>9s} {nr_fact:>8d} {str(o['order_date'])[:10]:>12s}")
else:
print(f"{o['order_number']:>12s} {id_cmd:>8d} {(o['customer_name'] or '')[:30]:30s} {'NOT FOUND':30s}")
# Check if there's a delete/sters mechanism
print(f"\n--- COMENZI table columns for delete ---")
cur.execute("SELECT column_name FROM all_tab_columns WHERE table_name='COMENZI' AND owner='VENDING' AND column_name IN ('STERS','ID_UTILS','DATAORAS') ORDER BY column_id")
for r in cur:
print(f" {r[0]}")
# Check comenzi_detalii
cur.execute("SELECT column_name FROM all_tab_columns WHERE table_name='COMENZI_DETALII' AND owner='VENDING' ORDER BY column_id")
print(f"\n--- COMENZI_DETALII columns ---")
for r in cur:
print(f" {r[0]}")
conn.close()


@@ -1,76 +0,0 @@
"""Compare specific GoMag order vs Oracle invoice"""
import sys, sqlite3, oracledb, os
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
os.environ['PATH'] = r'C:\app\Server\product\18.0.0\dbhomeXE\bin' + ';' + os.environ.get('PATH','')
oracledb.init_oracle_client()
ORDER = sys.argv[1] if len(sys.argv) > 1 else '480104185'
FACT_NR = sys.argv[2] if len(sys.argv) > 2 else '4105'
# GoMag order from SQLite
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
db.row_factory = sqlite3.Row
c = db.cursor()
c.execute("SELECT * FROM orders WHERE order_number = ?", (ORDER,))
order = dict(c.fetchone())
c.execute("SELECT * FROM order_items WHERE order_number = ? ORDER BY sku", (ORDER,))
items = [dict(r) for r in c.fetchall()]
db.close()
print(f"=== GoMag Order {ORDER} ===")
print(f" Client: {order['customer_name']}")
print(f" Shipping: {order['shipping_name']}")
print(f" Billing: {order['billing_name']}")
print(f" Date: {order['order_date']}")
print(f" Total: {order['order_total']}")
print(f" Status: {order['status']}")
print(f" Items ({len(items)}):")
go_total = 0
for it in items:
line = it['quantity'] * it['price']
go_total += line
print(f" SKU={it['sku']:25s} qty={it['quantity']:6.1f} x {it['price']:8.2f} = {line:8.2f} {it['product_name']}")
print(f" Sum lines: {go_total:.2f}")
# Oracle invoice
conn = oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA')
cur = conn.cursor()
cur.execute("""
SELECT v.id_vanzare, TO_CHAR(v.data_act, 'YYYY-MM-DD'), v.total_fara_tva, v.total_cu_tva,
p.denumire, p.prenume
FROM vanzari v
LEFT JOIN nom_parteneri p ON v.id_part = p.id_part
WHERE v.numar_act = :1 AND v.serie_act = 'VM' AND v.sters = 0
AND v.data_act >= TO_DATE('2026-03-01','YYYY-MM-DD')
""", [int(FACT_NR)])
rows = cur.fetchall()
for row in rows:
id_vanz = row[0]
print(f"\n=== Oracle Invoice VM{FACT_NR} (id_vanzare={id_vanz}) ===")
print(f" Client: {row[4]} {row[5] or ''}")
print(f" Date: {row[1]}")
print(f" Total fara TVA: {float(row[2]):.2f}")
print(f" Total cu TVA: {float(row[3]):.2f}")
cur.execute("""
SELECT vd.id_articol, a.codmat, a.denumire,
vd.cantitate, vd.pret, vd.pret_cu_tva, vd.proc_tvav
FROM vanzari_detalii vd
LEFT JOIN nom_articole a ON vd.id_articol = a.id_articol
WHERE vd.id_vanzare = :1 AND vd.sters = 0
ORDER BY vd.id_articol
""", [id_vanz])
det = cur.fetchall()
print(f" Items ({len(det)}):")
roa_total = 0
for d in det:
line = float(d[3]) * float(d[4])
roa_total += line
print(f" COD={str(d[1] or ''):25s} qty={float(d[3]):6.1f} x {float(d[4]):8.2f} = {line:8.2f} TVA={float(d[6]):.0f}% {d[2]}")
print(f" Sum lines (fara TVA): {roa_total:.2f}")
conn.close()


@@ -1,30 +0,0 @@
import sqlite3, sys
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
c = db.cursor()
c.execute("SELECT COUNT(*), MIN(order_date), MAX(order_date) FROM orders")
total, min_d, max_d = c.fetchone()
print(f"Total orders: {total} (from {min_d} to {max_d})")
c.execute("SELECT status, COUNT(*) FROM orders GROUP BY status ORDER BY COUNT(*) DESC")
print("\nBy status:")
for r in c:
print(f" {r[0]:20s} {r[1]:5d}")
c.execute("SELECT COUNT(*) FROM orders WHERE status IN ('IMPORTED','ALREADY_IMPORTED')")
print(f"\nImported (matchable): {c.fetchone()[0]}")
c.execute("SELECT COUNT(*) FROM order_items")
print(f"Total order_items: {c.fetchone()[0]}")
c.execute("""
SELECT COUNT(DISTINCT oi.sku)
FROM order_items oi
JOIN orders o ON oi.order_number = o.order_number
WHERE o.status IN ('IMPORTED','ALREADY_IMPORTED')
""")
print(f"Unique SKUs in imported orders: {c.fetchone()[0]}")
db.close()


@@ -1,98 +0,0 @@
"""
Delete all imported orders from Oracle ROA and reset SQLite status.
Soft-delete: SET sters=1 on comenzi + comenzi_detalii.
Reset SQLite: clear id_comanda, id_partener, set status back to allow re-import.
"""
import sys, sqlite3, oracledb, os
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
DRY_RUN = '--execute' not in sys.argv
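# Usage sketch (the script filename is a placeholder, only the flag check
# above matters):
#   python <this_script>.py            dry run, prints what would change
#   python <this_script>.py --execute  performs the soft-delete and reset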
os.environ['PATH'] = r'C:\app\Server\product\18.0.0\dbhomeXE\bin' + ';' + os.environ.get('PATH','')
oracledb.init_oracle_client()
# Get imported orders
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
db.row_factory = sqlite3.Row
c = db.cursor()
c.execute("""
SELECT order_number, id_comanda, id_partener, customer_name
FROM orders
WHERE status = 'IMPORTED' AND id_comanda IS NOT NULL
""")
imported = [dict(r) for r in c.fetchall()]
print(f"Orders to delete: {len(imported)}")
if DRY_RUN:
print("*** DRY RUN — add --execute to actually delete ***\n")
# Step 1: Soft-delete in Oracle
conn = oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA')
cur = conn.cursor()
id_comandas = [o['id_comanda'] for o in imported]
# Verify none are invoiced
for id_cmd in id_comandas:
cur.execute("SELECT COUNT(*) FROM vanzari WHERE id_comanda = :1 AND sters = 0", [id_cmd])
cnt = cur.fetchone()[0]
if cnt > 0:
print(f" ERROR: comanda {id_cmd} has {cnt} active invoices! Aborting.")
sys.exit(1)
print("Oracle: no invoices found on any order — safe to delete")
for id_cmd in id_comandas:
order_num = [o['order_number'] for o in imported if o['id_comanda'] == id_cmd][0]
if DRY_RUN:
# Just show what would happen
cur.execute("SELECT COUNT(*) FROM comenzi_detalii WHERE id_comanda = :1 AND sters = 0", [id_cmd])
det_cnt = cur.fetchone()[0]
print(f" Would delete: comanda {id_cmd} (order {order_num}) + {det_cnt} detail lines")
else:
# Soft-delete detail lines
cur.execute("UPDATE comenzi_detalii SET sters = 1 WHERE id_comanda = :1 AND sters = 0", [id_cmd])
det_deleted = cur.rowcount
# Soft-delete order header
cur.execute("UPDATE comenzi SET sters = 1 WHERE id_comanda = :1 AND sters = 0", [id_cmd])
hdr_deleted = cur.rowcount
print(f" Deleted: comanda {id_cmd} (order {order_num}): header={hdr_deleted}, details={det_deleted}")
if not DRY_RUN:
conn.commit()
print(f"\nOracle: {len(id_comandas)} orders soft-deleted (sters=1)")
else:
print(f"\nOracle: DRY RUN — nothing changed")
conn.close()
# Step 2: Reset SQLite
if not DRY_RUN:
c.execute("""
UPDATE orders SET
status = 'SKIPPED',
id_comanda = NULL,
id_partener = NULL,
id_adresa_facturare = NULL,
id_adresa_livrare = NULL,
error_message = NULL,
factura_serie = NULL,
factura_numar = NULL,
factura_total_fara_tva = NULL,
factura_total_tva = NULL,
factura_total_cu_tva = NULL,
factura_data = NULL,
invoice_checked_at = NULL
WHERE status = 'IMPORTED' AND id_comanda IS NOT NULL
""")
db.commit()
print(f"SQLite: {c.rowcount} orders reset to SKIPPED (id_comanda/id_partener cleared)")
else:
print(f"SQLite: DRY RUN — would reset {len(imported)} orders to SKIPPED")
db.close()
print("\nDone!" if not DRY_RUN else "\nDone (dry run). Run with --execute to apply.")


@@ -1,78 +0,0 @@
"""Explore Oracle structure for invoice matching."""
import oracledb
import os
os.environ['PATH'] = r'C:\app\Server\product\18.0.0\dbhomeXE\bin' + ';' + os.environ.get('PATH','')
oracledb.init_oracle_client()
conn = oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA')
cur = conn.cursor()
# Recent vanzari (last 10 days)
cur.execute("""
SELECT v.id_vanzare, v.numar_act, v.serie_act,
TO_CHAR(v.data_act, 'YYYY-MM-DD') as data_act,
v.total_fara_tva, v.total_cu_tva, v.id_part, v.id_comanda,
p.denumire as partener
FROM vanzari v
LEFT JOIN nom_parteneri p ON v.id_part = p.id_part
WHERE v.sters = 0 AND v.data_act >= SYSDATE - 10
ORDER BY v.data_act DESC
""")
print('=== Recent VANZARI (last 10 days) ===')
print(f'{"ID_VANZ":>8s} {"NR_ACT":>8s} {"SERIE":>6s} {"DATA":>12s} {"TOTAL_FARA":>12s} {"TOTAL_CU":>12s} {"ID_PART":>8s} {"ID_CMD":>8s} PARTENER')
for r in cur:
print(f'{r[0]:8d} {str(r[1] or ""):>8s} {str(r[2] or ""):>6s} {str(r[3]):>12s} {float(r[4] or 0):12.2f} {float(r[5] or 0):12.2f} {r[6] or 0:8d} {str(r[7] or ""):>8s} {r[8] or ""}')
print()
# Vanzari_detalii for those invoices
cur.execute("""
SELECT vd.id_vanzare, vd.id_articol, a.codmat, a.denumire,
vd.cantitate, vd.pret, vd.pret_cu_tva, vd.proc_tvav
FROM vanzari_detalii vd
JOIN vanzari v ON vd.id_vanzare = v.id_vanzare
LEFT JOIN nom_articole a ON vd.id_articol = a.id_articol
WHERE v.sters = 0 AND vd.sters = 0 AND v.data_act >= SYSDATE - 10
ORDER BY vd.id_vanzare, vd.id_articol
""")
print('=== Recent VANZARI_DETALII (last 10 days) ===')
print(f'{"ID_VANZ":>8s} {"ID_ART":>8s} {"CODMAT":>15s} {"DENUMIRE":>40s} {"QTY":>8s} {"PRET":>10s} {"PRET_CU":>10s} {"TVA%":>6s}')
for r in cur:
print(f'{r[0]:8d} {r[1]:8d} {str(r[2] or ""):>15s} {str((r[3] or "")[:40]):>40s} {float(r[4] or 0):8.2f} {float(r[5] or 0):10.4f} {float(r[6] or 0):10.4f} {float(r[7] or 0):6.1f}')
print()
# Also get SQLite orders for comparison
print('=== SQLite orders (imported, last 10 days) ===')
import sqlite3
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
c = db.cursor()
c.execute("""
SELECT o.order_number, o.order_date, o.customer_name, o.status,
o.id_comanda, o.order_total,
o.factura_serie, o.factura_numar, o.factura_data
FROM orders o
WHERE o.order_date >= date('now', '-10 days')
ORDER BY o.order_date DESC
""")
print(f'{"ORDER_NR":>10s} {"DATE":>12s} {"CLIENT":>30s} {"STATUS":>10s} {"ID_CMD":>8s} {"TOTAL":>10s} {"F_SERIE":>8s} {"F_NR":>8s} {"F_DATA":>12s}')
for r in c:
print(f'{str(r[0]):>10s} {str(r[1])[:10]:>12s} {str((r[2] or "")[:30]):>30s} {str(r[3]):>10s} {str(r[4] or ""):>8s} {float(r[5] or 0):10.2f} {str(r[6] or ""):>8s} {str(r[7] or ""):>8s} {str(r[8] or ""):>12s}')
print()
# Order items
c.execute("""
SELECT oi.order_number, oi.sku, oi.product_name, oi.quantity, oi.price, oi.vat, oi.mapping_status
FROM order_items oi
JOIN orders o ON oi.order_number = o.order_number
WHERE o.order_date >= date('now', '-10 days')
ORDER BY oi.order_number, oi.sku
""")
print('=== SQLite order_items (last 10 days) ===')
print(f'{"ORDER_NR":>10s} {"SKU":>20s} {"PRODUCT":>40s} {"QTY":>6s} {"PRICE":>10s} {"VAT":>6s} {"MAP":>8s}')
for r in c:
print(f'{str(r[0]):>10s} {str(r[1] or ""):>20s} {str((r[2] or "")[:40]):>40s} {float(r[3] or 0):6.1f} {float(r[4] or 0):10.2f} {float(r[5] or 0):6.1f} {str(r[6] or ""):>8s}')
db.close()
conn.close()


@@ -1,30 +0,0 @@
import sys, json, httpx
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
API_URL = 'https://api.gomag.ro/api/v1/order/read/json'
headers = {
'Apikey': '4c5e46df8f6c4f054fe2787de7a13d4a',
'ApiShop': 'https://coffeepoint.ro',
'User-Agent': 'Mozilla/5.0',
'Content-Type': 'application/json',
}
target = sys.argv[1] if len(sys.argv) > 1 else '480700091'
for page in range(1, 20):
resp = httpx.get(API_URL, headers=headers, params={'startDate': '2026-03-08', 'page': page, 'limit': 250}, timeout=60)
data = resp.json()
orders = data.get('orders', {})
if isinstance(orders, dict) and target in orders:
print(json.dumps(orders[target], indent=2, ensure_ascii=False))
sys.exit(0)
# Also check by 'number' field
if isinstance(orders, dict):
for key, order in orders.items():
if isinstance(order, dict) and str(order.get('number', '')) == target:
print(json.dumps(order, indent=2, ensure_ascii=False))
sys.exit(0)
if page >= data.get('pages', 1):
break
print(f"Order {target} not found")


@@ -1,533 +0,0 @@
"""
Match ALL GoMag orders (SQLite) with manual invoices (Oracle vanzari)
by date + client name + total value.
Then compare line items to discover SKU → CODMAT mappings.
"""
import oracledb
import os
import sys
import sqlite3
import csv
from difflib import SequenceMatcher
sys.stdout.reconfigure(encoding='utf-8', errors='replace')
os.environ['PATH'] = r'C:\app\Server\product\18.0.0\dbhomeXE\bin' + ';' + os.environ.get('PATH','')
oracledb.init_oracle_client()
# --- Step 1: Get ALL GoMag orders from SQLite ---
print("=" * 80)
print("STEP 1: Loading ALL GoMag orders from SQLite")
print("=" * 80)
db = sqlite3.connect(r'C:\gomag-vending\api\data\import.db')
db.row_factory = sqlite3.Row
c = db.cursor()
# ALL orders, not just IMPORTED
c.execute("""
SELECT order_number, order_date, customer_name, status,
id_comanda, order_total, billing_name, shipping_name
FROM orders
ORDER BY order_date DESC
""")
orders = [dict(r) for r in c.fetchall()]
# Get order items
for order in orders:
c.execute("""
SELECT sku, product_name, quantity, price, vat, mapping_status
FROM order_items
WHERE order_number = ?
ORDER BY sku
""", (order['order_number'],))
order['items'] = [dict(r) for r in c.fetchall()]
db.close()
by_status = {}
for o in orders:
by_status.setdefault(o['status'], 0)
by_status[o['status']] += 1
print(f"Loaded {len(orders)} GoMag orders: {by_status}")
# --- Step 2: Get Oracle invoices with date range matching orders ---
print()
print("=" * 80)
print("STEP 2: Loading Oracle invoices (vanzari + detalii)")
print("=" * 80)
conn = oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA')
cur = conn.cursor()
# Get date range from orders
min_date = min(str(o['order_date'])[:10] for o in orders)
max_date = max(str(o['order_date'])[:10] for o in orders)
print(f"Order date range: {min_date} to {max_date}")
# Get vanzari in that range (with some margin)
cur.execute("""
SELECT v.id_vanzare, v.numar_act, v.serie_act,
TO_CHAR(v.data_act, 'YYYY-MM-DD') as data_act,
v.total_fara_tva, v.total_cu_tva, v.id_part,
p.denumire as partener, p.prenume
FROM vanzari v
LEFT JOIN nom_parteneri p ON v.id_part = p.id_part
WHERE v.sters = 0
AND v.data_act >= TO_DATE(:1, 'YYYY-MM-DD') - 2
AND v.data_act <= TO_DATE(:2, 'YYYY-MM-DD') + 2
AND v.total_cu_tva > 0
ORDER BY v.data_act DESC
""", [min_date, max_date])
invoices = []
for r in cur:
inv = {
'id_vanzare': r[0],
'numar_act': r[1],
'serie_act': r[2] or '',
'data_act': r[3],
'total_fara_tva': float(r[4] or 0),
'total_cu_tva': float(r[5] or 0),
'id_part': r[6],
'partener': ((r[7] or '') + ' ' + (r[8] or '')).strip(),
}
invoices.append(inv)
print(f"Loaded {len(invoices)} Oracle invoices in range {min_date} - {max_date}")
# Get detail lines for ALL invoices in one batch
inv_ids = [inv['id_vanzare'] for inv in invoices]
inv_map = {inv['id_vanzare']: inv for inv in invoices}
for inv in invoices:
inv['items'] = []
# Batch fetch details
for i in range(0, len(inv_ids), 500):
batch = inv_ids[i:i+500]
placeholders = ",".join([f":d{j}" for j in range(len(batch))])
params = {f"d{j}": did for j, did in enumerate(batch)}
cur.execute(f"""
SELECT vd.id_vanzare, vd.id_articol, a.codmat, a.denumire,
vd.cantitate, vd.pret, vd.pret_cu_tva, vd.proc_tvav
FROM vanzari_detalii vd
LEFT JOIN nom_articole a ON vd.id_articol = a.id_articol
WHERE vd.id_vanzare IN ({placeholders}) AND vd.sters = 0
ORDER BY vd.id_vanzare, vd.id_articol
""", params)
for r in cur:
inv_map[r[0]]['items'].append({
'id_articol': r[1],
'codmat': r[2],
'denumire': r[3],
'cantitate': float(r[4] or 0),
'pret': float(r[5] or 0),
'pret_cu_tva': float(r[6] or 0),
'tva_pct': float(r[7] or 0),
})
conn.close()
# --- Step 3: Fuzzy matching ---
print()
print("=" * 80)
print("STEP 3: Matching orders → invoices (date + name + total)")
print("=" * 80)
def normalize_name(name):
if not name:
return ''
n = name.strip().upper()
for old, new in [('S.R.L.', 'SRL'), ('S.R.L', 'SRL'), ('SC ', ''), ('PFA ', ''), ('PF ', '')]:
n = n.replace(old, new)
return n
def name_similarity(n1, n2):
nn1 = normalize_name(n1)
nn2 = normalize_name(n2)
if not nn1 or not nn2:
return 0
# Also try reversed word order (GoMag: "Popescu Ion", ROA: "ION POPESCU")
sim1 = SequenceMatcher(None, nn1, nn2).ratio()
words1 = nn1.split()
if len(words1) >= 2:
reversed1 = ' '.join(reversed(words1))
sim2 = SequenceMatcher(None, reversed1, nn2).ratio()
return max(sim1, sim2)
return sim1
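# Worked example of the reversed-word check above (illustrative values, not
# taken from real data): name_similarity('Popescu Ion', 'ION POPESCU')
# normalizes both sides to 'POPESCU ION' / 'ION POPESCU'; the direct ratio
# is below 1.0, but reversing the first name's words yields 'ION POPESCU',
# an exact match, so the function returns 1.0.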
matches = []
unmatched_orders = []
used_invoices = set()
# Sort orders by total descending (match big orders first - more unique)
orders_sorted = sorted(orders, key=lambda o: -(o['order_total'] or 0))
for order in orders_sorted:
best_match = None
best_score = 0
order_date = str(order['order_date'])[:10]
order_total = order['order_total'] or 0
order_name = order['customer_name'] or ''
for inv in invoices:
if inv['id_vanzare'] in used_invoices:
continue
# Date match: must be within +/- 2 days; compare real dates so month and
# year boundaries work (subtracting YYYYMMDD integers would not)
from datetime import date  # local import so this block stands alone
try:
date_diff = abs((date.fromisoformat(order_date) - date.fromisoformat(str(inv['data_act'])[:10])).days)
except ValueError:
continue
if date_diff > 2:
continue
# Total match: within 15% or 15 lei, lenient to absorb transport/discount lines
total_diff = abs(order_total - inv['total_cu_tva'])
total_pct = total_diff / max(order_total, 0.01) * 100
if total_pct > 15 and total_diff > 15:
continue
# Name similarity
sim = name_similarity(order_name, inv['partener'])
# Also check billing_name/shipping_name
sim2 = name_similarity(order.get('billing_name') or '', inv['partener'])
sim3 = name_similarity(order.get('shipping_name') or '', inv['partener'])
sim = max(sim, sim2, sim3)
# Score
date_score = 1 if date_diff == 0 else (0.7 if date_diff == 1 else 0.3)
total_score = 1 - min(total_pct / 100, 1)
score = sim * 0.45 + total_score * 0.40 + date_score * 0.15
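# Worked example of the blended score (illustrative numbers): sim=0.90,
# total_pct=2.0 -> total_score=0.98, date_diff=0 -> date_score=1, giving
# score = 0.90*0.45 + 0.98*0.40 + 1*0.15 = 0.947, above the 0.45 cutoff.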
if score > best_score:
best_score = score
best_match = inv
if best_match and best_score > 0.45:
matches.append({
'order': order,
'invoice': best_match,
'score': best_score,
})
used_invoices.add(best_match['id_vanzare'])
else:
unmatched_orders.append(order)
print(f"Matched: {len(matches)} | Unmatched orders: {len(unmatched_orders)}")
matched_statuses = {}
for m in matches:
s = m['order']['status']
matched_statuses.setdefault(s, 0)
matched_statuses[s] += 1
print(f"Matched by status: {matched_statuses}")
# --- Step 4: Compare line items ---
print()
print("=" * 80)
print("STEP 4: Line item comparison")
print("=" * 80)
simple_mappings = []
repack_mappings = []
complex_mappings = []
unresolved = []
match_details = []
for m in matches:
o = m['order']
inv = m['invoice']
go_items = o['items']
# Filter out TRANSPORT and DISCOUNT from ROA items
roa_items = [ri for ri in inv['items']
if ri['codmat'] not in ('TRANSPORT', 'DISCOUNT', None, '')
and ri['cantitate'] > 0]
roa_transport = [ri for ri in inv['items']
if ri['codmat'] in ('TRANSPORT', 'DISCOUNT') or ri['cantitate'] < 0]
detail = {
'order_number': o['order_number'],
'customer': o['customer_name'],
'order_total': o['order_total'],
'factura': f"{inv['serie_act']}{inv['numar_act']}",
'inv_total': inv['total_cu_tva'],
'score': m['score'],
'go_items': len(go_items),
'roa_items': len(roa_items),
'matched_items': [],
'unresolved_items': [],
}
go_remaining = list(range(len(go_items)))
roa_remaining = list(range(len(roa_items)))
item_matches = []
# Pass 1: exact match by codmat (SKU == CODMAT)
for gi_idx in list(go_remaining):
gi = go_items[gi_idx]
for ri_idx in list(roa_remaining):
ri = roa_items[ri_idx]
if ri['codmat'] and gi['sku'] == ri['codmat']:
item_matches.append((gi_idx, [ri_idx]))
go_remaining.remove(gi_idx)
roa_remaining.remove(ri_idx)
break
# Pass 2: match by total value (qty * price)
for gi_idx in list(go_remaining):
gi = go_items[gi_idx]
go_total_cu = gi['quantity'] * gi['price']
go_total_fara = go_total_cu / (1 + gi['vat']/100) if gi['vat'] else go_total_cu
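# e.g. (illustrative) a gross line of 119.00 at 19% VAT backs out to
# 119 / 1.19 = 100.00 net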
for ri_idx in list(roa_remaining):
ri = roa_items[ri_idx]
roa_total_fara = ri['cantitate'] * ri['pret']
roa_total_cu = ri['cantitate'] * ri['pret_cu_tva']
if (abs(go_total_fara - roa_total_fara) < 1.0 or
abs(go_total_cu - roa_total_cu) < 1.0 or
abs(go_total_cu - roa_total_fara) < 1.0):
item_matches.append((gi_idx, [ri_idx]))
go_remaining.remove(gi_idx)
roa_remaining.remove(ri_idx)
break
# Pass 3: when exactly one line remains on each side, pair them positionally
if len(go_remaining) == len(roa_remaining) == 1:
item_matches.append((go_remaining[0], [roa_remaining[0]]))
go_remaining = []
roa_remaining = []
# Pass 4: 1:N by combined total
for gi_idx in list(go_remaining):
gi = go_items[gi_idx]
go_total_cu = gi['quantity'] * gi['price']
go_total_fara = go_total_cu / (1 + gi['vat']/100) if gi['vat'] else go_total_cu
if len(roa_remaining) >= 2:
# Try all pairs
found = False
for i_pos, ri_idx1 in enumerate(roa_remaining):
for ri_idx2 in roa_remaining[i_pos+1:]:
ri1 = roa_items[ri_idx1]
ri2 = roa_items[ri_idx2]
combined_fara = ri1['cantitate'] * ri1['pret'] + ri2['cantitate'] * ri2['pret']
combined_cu = ri1['cantitate'] * ri1['pret_cu_tva'] + ri2['cantitate'] * ri2['pret_cu_tva']
if (abs(go_total_fara - combined_fara) < 2.0 or
abs(go_total_cu - combined_cu) < 2.0):
item_matches.append((gi_idx, [ri_idx1, ri_idx2]))
go_remaining.remove(gi_idx)
roa_remaining.remove(ri_idx1)
roa_remaining.remove(ri_idx2)
found = True
break
if found:
break
# Classify matches
for gi_idx, ri_indices in item_matches:
gi = go_items[gi_idx]
ris = [roa_items[i] for i in ri_indices]
if len(ris) == 1:
ri = ris[0]
if gi['sku'] == ri['codmat']:
# Already mapped (SKU == CODMAT)
detail['matched_items'].append(f"ALREADY: {gi['sku']} == {ri['codmat']}")
simple_mappings.append({
'sku': gi['sku'], 'codmat': ri['codmat'],
'id_articol': ri['id_articol'],
'type': 'already_equal',
'product_name': gi['product_name'], 'denumire': ri['denumire'],
'go_qty': gi['quantity'], 'roa_qty': ri['cantitate'],
'go_price': gi['price'], 'roa_pret': ri['pret'],
})
elif abs(gi['quantity'] - ri['cantitate']) < 0.01:
# Simple 1:1 different codmat
detail['matched_items'].append(f"SIMPLE: {gi['sku']} → {ri['codmat']}")
simple_mappings.append({
'sku': gi['sku'], 'codmat': ri['codmat'],
'id_articol': ri['id_articol'],
'type': 'simple',
'product_name': gi['product_name'], 'denumire': ri['denumire'],
'go_qty': gi['quantity'], 'roa_qty': ri['cantitate'],
'go_price': gi['price'], 'roa_pret': ri['pret'],
})
else:
# Repackaging
cantitate_roa = ri['cantitate'] / gi['quantity'] if gi['quantity'] else 1
detail['matched_items'].append(f"REPACK: {gi['sku']} → {ri['codmat']} x{cantitate_roa:.3f}")
repack_mappings.append({
'sku': gi['sku'], 'codmat': ri['codmat'],
'id_articol': ri['id_articol'],
'cantitate_roa': round(cantitate_roa, 3),
'product_name': gi['product_name'], 'denumire': ri['denumire'],
'go_qty': gi['quantity'], 'roa_qty': ri['cantitate'],
})
else:
# Complex set
go_total_cu = gi['quantity'] * gi['price']
go_total_fara = go_total_cu / (1 + gi['vat']/100) if gi['vat'] else go_total_cu
for ri in ris:
ri_total = ri['cantitate'] * ri['pret']
pct = round(ri_total / go_total_fara * 100, 2) if go_total_fara else 0
cantitate_roa = ri['cantitate'] / gi['quantity'] if gi['quantity'] else 1
detail['matched_items'].append(f"SET: {gi['sku']} → {ri['codmat']} {pct}%")
complex_mappings.append({
'sku': gi['sku'], 'codmat': ri['codmat'],
'id_articol': ri['id_articol'],
'cantitate_roa': round(cantitate_roa, 3),
'procent_pret': pct,
'product_name': gi['product_name'], 'denumire': ri['denumire'],
})
for gi_idx in go_remaining:
gi = go_items[gi_idx]
remaining_roa = [roa_items[i] for i in roa_remaining]
detail['unresolved_items'].append(gi['sku'])
unresolved.append({
'sku': gi['sku'],
'product_name': gi['product_name'],
'quantity': gi['quantity'],
'price': gi['price'],
'order': o['order_number'],
'factura': f"{inv['serie_act']}{inv['numar_act']}",
'roa_remaining': '; '.join([f"{r['codmat'] or '?'}({r['cantitate']}x{r['pret']:.2f}={(r['denumire'] or '')[:30]})"
for r in remaining_roa]),
})
match_details.append(detail)
# --- Step 5: Deduplicate and summarize ---
print()
print("=" * 80)
print("STEP 5: SUMMARY")
print("=" * 80)
# Deduplicate simple
seen_simple_equal = {}
seen_simple_new = {}
for m in simple_mappings:
key = (m['sku'], m['codmat'])
if m['type'] == 'already_equal':
seen_simple_equal[key] = m
else:
seen_simple_new[key] = m
seen_repack = {}
for m in repack_mappings:
key = (m['sku'], m['codmat'])
if key not in seen_repack:
seen_repack[key] = m
seen_complex = {}
for m in complex_mappings:
key = (m['sku'], m['codmat'])
if key not in seen_complex:
seen_complex[key] = m
# Deduplicate unresolved SKUs
seen_unresolved_skus = {}
for u in unresolved:
if u['sku'] not in seen_unresolved_skus:
seen_unresolved_skus[u['sku']] = u
print(f"\n--- Already mapped (SKU == CODMAT in nom_articole): {len(seen_simple_equal)} unique ---")
for key, m in sorted(seen_simple_equal.items()):
print(f" {m['sku']:25s} = {m['codmat']:15s} | {(m['product_name'] or '')[:40]}")
print(f"\n--- NEW simple 1:1 (SKU != CODMAT, same qty): {len(seen_simple_new)} unique ---")
for key, m in sorted(seen_simple_new.items()):
print(f" {m['sku']:25s}{m['codmat']:15s} | GoMag: {(m['product_name'] or '')[:30]} → ROA: {(m['denumire'] or '')[:30]}")
print(f"\n--- Repackaging (different qty): {len(seen_repack)} unique ---")
for key, m in sorted(seen_repack.items()):
print(f"  {m['sku']:25s}{m['codmat']:15s} x{m['cantitate_roa']} | {(m['product_name'] or '')[:30]} → {(m['denumire'] or '')[:30]}")
print(f"\n--- Complex sets (1 SKU → N CODMATs): {len(seen_complex)} unique ---")
for key, m in sorted(seen_complex.items()):
print(f"  {m['sku']:25s}{m['codmat']:15s} {m['procent_pret']:6.2f}% | {(m['product_name'] or '')[:30]} → {(m['denumire'] or '')[:30]}")
print(f"\n--- Unresolved (unique SKUs): {len(seen_unresolved_skus)} ---")
for sku, u in sorted(seen_unresolved_skus.items()):
print(f" {sku:25s} | {(u['product_name'] or '')[:40]} | example: order={u['order']}")
print(f"\n--- Unmatched orders (no invoice found): {len(unmatched_orders)} ---")
for o in unmatched_orders[:20]:
print(f" {o['order_number']:>12s} | {str(o['order_date'])[:10]} | {(o['customer_name'] or '')[:30]:30s} | {o['order_total'] or 0:10.2f} | {o['status']}")
if len(unmatched_orders) > 20:
print(f" ... and {len(unmatched_orders) - 20} more")
# --- Write output files ---
out_dir = r'C:\gomag-vending\scripts\output'
os.makedirs(out_dir, exist_ok=True)
# Full match report
with open(os.path.join(out_dir, 'match_report.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['order_number', 'customer', 'order_total', 'factura', 'inv_total', 'score',
'go_items', 'roa_items', 'matched', 'unresolved'])
for d in match_details:
w.writerow([d['order_number'], d['customer'], d['order_total'],
d['factura'], d['inv_total'], f"{d['score']:.2f}",
d['go_items'], d['roa_items'],
'; '.join(d['matched_items']),
'; '.join(d['unresolved_items'])])
# New simple mappings (SKU → CODMAT where SKU != CODMAT)
with open(os.path.join(out_dir, 'simple_new_mappings.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['sku', 'codmat', 'id_articol', 'product_name_gomag', 'denumire_roa', 'go_qty', 'roa_qty', 'go_price', 'roa_pret'])
for m in seen_simple_new.values():
w.writerow([m['sku'], m['codmat'], m['id_articol'], m['product_name'], m['denumire'],
m['go_qty'], m['roa_qty'], m['go_price'], m['roa_pret']])
# Repackaging CSV
with open(os.path.join(out_dir, 'repack_mappings.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['sku', 'codmat', 'cantitate_roa', 'procent_pret', 'product_name_gomag', 'denumire_roa'])
for m in seen_repack.values():
w.writerow([m['sku'], m['codmat'], m['cantitate_roa'], 100, m['product_name'], m['denumire']])
# Complex sets CSV
with open(os.path.join(out_dir, 'complex_mappings.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['sku', 'codmat', 'cantitate_roa', 'procent_pret', 'product_name_gomag', 'denumire_roa'])
for m in seen_complex.values():
w.writerow([m['sku'], m['codmat'], round(m['cantitate_roa'], 3), m['procent_pret'],
m['product_name'], m['denumire']])
# Unresolved
with open(os.path.join(out_dir, 'unresolved.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['sku', 'product_name', 'quantity', 'price', 'order', 'factura', 'roa_remaining_items'])
for u in unresolved:
w.writerow([u['sku'], u['product_name'], u['quantity'], u['price'],
u['order'], u['factura'], u['roa_remaining']])
# Already equal (for reference)
with open(os.path.join(out_dir, 'already_mapped.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['sku', 'codmat', 'id_articol', 'product_name_gomag', 'denumire_roa'])
for m in seen_simple_equal.values():
w.writerow([m['sku'], m['codmat'], m['id_articol'], m['product_name'], m['denumire']])
# Unmatched orders
with open(os.path.join(out_dir, 'unmatched_orders.csv'), 'w', newline='', encoding='utf-8') as f:
w = csv.writer(f)
w.writerow(['order_number', 'order_date', 'customer_name', 'status', 'order_total', 'items_count'])
for o in unmatched_orders:
w.writerow([o['order_number'], str(o['order_date'])[:10], o['customer_name'],
o['status'], o['order_total'], len(o['items'])])
print(f"\nOutput written to {out_dir}:")
print(f" match_report.csv - {len(match_details)} matched order-invoice pairs")
print(f" already_mapped.csv - {len(seen_simple_equal)} SKU==CODMAT (already OK)")
print(f" simple_new_mappings.csv - {len(seen_simple_new)} new SKU→CODMAT (need codmat in nom_articole or ARTICOLE_TERTI)")
print(f" repack_mappings.csv - {len(seen_repack)} repackaging")
print(f" complex_mappings.csv - {len(seen_complex)} complex sets")
print(f" unresolved.csv - {len(unresolved)} unresolved item lines")
print(f" unmatched_orders.csv - {len(unmatched_orders)} orders without invoice match")
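The Step 5 dedup above keys each mapping by `(sku, codmat)` and keeps the first occurrence. A minimal standalone sketch of that pattern (the field names mirror the script; the sample records are invented for illustration):

```python
def dedupe_mappings(mappings):
    """Keep the first mapping seen for each (sku, codmat) pair."""
    seen = {}
    for m in mappings:
        key = (m['sku'], m['codmat'])
        if key not in seen:
            seen[key] = m
    return seen

sample = [
    {'sku': 'A-1', 'codmat': 'C100', 'cantitate_roa': 2},
    {'sku': 'A-1', 'codmat': 'C100', 'cantitate_roa': 3},  # duplicate key, ignored
    {'sku': 'B-2', 'codmat': 'C200', 'cantitate_roa': 1},
]
deduped = dedupe_mappings(sample)
```

First-wins semantics matter here: a later order that happens to re-derive the same mapping with slightly different quantities does not overwrite the earlier one.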


@@ -1,306 +0,0 @@
#!/usr/bin/env python3
"""
Parser for the sync_comenzi_web logs.
Extracts failed orders and missing SKUs, and generates a summary.
Supports both the old (verbose) format and the new (compact) format.
Usage:
python parse_sync_log.py # latest log in vfp/log/
python parse_sync_log.py <file.log> # a specific log
python parse_sync_log.py --skus # only the list of missing SKUs
python parse_sync_log.py --dir /path/to/logs # custom log directory
"""
import os
import sys
import re
import glob
import argparse
# Regex for timestamped lines (start of a new log entry)
RE_TIMESTAMP = re.compile(r'^\[(\d{2}:\d{2}:\d{2})\]\s+\[(\w+\s*)\]\s*(.*)')
# NEW-format regex: [N/Total] OrderNumber P:X A:Y/Z -> OK/ERR details
RE_COMPACT_OK = re.compile(r'\[(\d+)/(\d+)\]\s+(\S+)\s+.*->\s+OK\s+ID:(\S+)')
RE_COMPACT_ERR = re.compile(r'\[(\d+)/(\d+)\]\s+(\S+)\s+.*->\s+ERR\s+(.*)')
# OLD-format regexes (backwards compat)
RE_SKU_NOT_FOUND = re.compile(r'SKU negasit.*?:\s*(\S+)')
RE_PRICE_POLICY = re.compile(r'Pretul pentru acest articol nu a fost gasit')
RE_FAILED_ORDER = re.compile(r'Import comanda esuat pentru\s+(\S+)')
RE_ARTICOL_ERR = re.compile(r'Eroare adaugare articol\s+(\S+)')
RE_ORDER_PROCESS = re.compile(r'Procesez comanda:\s+(\S+)\s+din\s+(\S+)')
RE_ORDER_SUCCESS = re.compile(r'SUCCES: Comanda importata.*?ID Oracle:\s+(\S+)')
# Shared regexes
RE_SYNC_END = re.compile(r'SYNC END\s*\|.*?(\d+)\s+processed.*?(\d+)\s+ok.*?(\d+)\s+err')
RE_STATS_LINE = re.compile(r'Duration:\s*(\S+)\s*\|\s*Orders:\s*(\S+)')
RE_STOPPED_EARLY = re.compile(r'Peste \d+.*ero|stopped early')
def find_latest_log(log_dir):
"""Find the most recent sync_comenzi log in the given directory."""
pattern = os.path.join(log_dir, 'sync_comenzi_*.log')
files = glob.glob(pattern)
if not files:
return None
return max(files, key=os.path.getmtime)
def parse_log_entries(lines):
"""Parse the log lines into structured entries."""
entries = []
current = None
for line in lines:
line = line.rstrip('\n\r')
m = RE_TIMESTAMP.match(line)
if m:
if current:
entries.append(current)
current = {
'time': m.group(1),
'level': m.group(2).strip(),
'text': m.group(3),
'full': line,
'continuation': []
}
elif current is not None:
current['continuation'].append(line)
current['text'] += '\n' + line
if current:
entries.append(current)
return entries
def extract_sku_from_error(err_text):
"""Extract the SKU from the error text (handles several formats)."""
# SKU_NOT_FOUND: 8714858424056
m = re.search(r'SKU_NOT_FOUND:\s*(\S+)', err_text)
if m:
return ('SKU_NOT_FOUND', m.group(1))
# PRICE_POLICY: 8000070028685
m = re.search(r'PRICE_POLICY:\s*(\S+)', err_text)
if m:
return ('PRICE_POLICY', m.group(1))
# Old format: "SKU negasit ... NOM_ARTICOLE: xxx"
m = RE_SKU_NOT_FOUND.search(err_text)
if m:
return ('SKU_NOT_FOUND', m.group(1))
# Old format: "Eroare adaugare articol xxx"
m = RE_ARTICOL_ERR.search(err_text)
if m:
return ('ARTICOL_ERROR', m.group(1))
# Old format: "Pretul ..." (price-policy error)
if RE_PRICE_POLICY.search(err_text):
return ('PRICE_POLICY', '(SKU necunoscut)')
return (None, None)
def analyze_entries(entries):
"""Analyze the entries and extract the relevant information."""
result = {
'start_time': None,
'end_time': None,
'duration': None,
'total_orders': 0,
'success_orders': 0,
'error_orders': 0,
'stopped_early': False,
'failed': [],
'missing_skus': [],
}
seen_skus = set()
current_order = None
for entry in entries:
text = entry['text']
level = entry['level']
# Start/end time
if entry['time']:
if result['start_time'] is None:
result['start_time'] = entry['time']
result['end_time'] = entry['time']
# NEW format: SYNC END line with statistics
m = RE_SYNC_END.search(text)
if m:
result['total_orders'] = int(m.group(1))
result['success_orders'] = int(m.group(2))
result['error_orders'] = int(m.group(3))
# NEW format: compact OK line
m = RE_COMPACT_OK.search(text)
if m:
continue
# NEW format: compact ERR line
m = RE_COMPACT_ERR.search(text)
if m:
order_nr = m.group(3)
err_detail = m.group(4).strip()
err_type, sku = extract_sku_from_error(err_detail)
if err_type and sku:
result['failed'].append((order_nr, err_type, sku))
if sku not in seen_skus and sku != '(SKU necunoscut)':
seen_skus.add(sku)
result['missing_skus'].append(sku)
else:
result['failed'].append((order_nr, 'ERROR', err_detail[:60]))
continue
# Stopped early
if RE_STOPPED_EARLY.search(text):
result['stopped_early'] = True
# OLD format: statistics from the summary block
if 'Total comenzi procesate:' in text:
try:
result['total_orders'] = int(text.split(':')[-1].strip())
except ValueError:
pass
if 'Comenzi importate cu succes:' in text:
try:
result['success_orders'] = int(text.split(':')[-1].strip())
except ValueError:
pass
if 'Comenzi cu erori:' in text:
try:
result['error_orders'] = int(text.split(':')[-1].strip())
except ValueError:
pass
# OLD format: Duration line
m = RE_STATS_LINE.search(text)
if m:
result['duration'] = m.group(1)
# OLD format: errors
if level == 'ERROR':
m_fail = RE_FAILED_ORDER.search(text)
if m_fail:
current_order = m_fail.group(1)
m = RE_ORDER_PROCESS.search(text)
if m:
current_order = m.group(1)
err_type, sku = extract_sku_from_error(text)
if err_type and sku:
order_nr = current_order or '?'
result['failed'].append((order_nr, err_type, sku))
if sku not in seen_skus and sku != '(SKU necunoscut)':
seen_skus.add(sku)
result['missing_skus'].append(sku)
# Duration from the SYNC END line
m = re.search(r'\|\s*(\d+)s\s*$', text)
if m:
result['duration'] = m.group(1) + 's'
return result
def format_report(result, log_path):
"""Format the full report."""
lines = []
lines.append('=== SYNC LOG REPORT ===')
lines.append(f'File: {os.path.basename(log_path)}')
duration = result["duration"] or "?"
start = result["start_time"] or "?"
end = result["end_time"] or "?"
lines.append(f'Run: {start} - {end} ({duration})')
lines.append('')
stopped = 'YES' if result['stopped_early'] else 'NO'
lines.append(
f'SUMMARY: {result["total_orders"]} processed, '
f'{result["success_orders"]} success, '
f'{result["error_orders"]} errors '
f'(stopped early: {stopped})'
)
lines.append('')
if result['failed']:
lines.append('FAILED ORDERS:')
seen = set()
for order_nr, err_type, sku in result['failed']:
key = (order_nr, err_type, sku)
if key not in seen:
seen.add(key)
lines.append(f' {order_nr:<12} {err_type:<18} {sku}')
lines.append('')
if result['missing_skus']:
lines.append(f'MISSING SKUs ({len(result["missing_skus"])} unique):')
for sku in sorted(result['missing_skus']):
lines.append(f' {sku}')
lines.append('')
return '\n'.join(lines)
def main():
parser = argparse.ArgumentParser(
description='Parser for the sync_comenzi_web logs'
)
parser.add_argument(
'logfile', nargs='?', default=None,
help='Specific log file (default: the latest in vfp/log/)'
)
parser.add_argument(
'--skus', action='store_true',
help='Print only the list of missing SKUs (one per line)'
)
parser.add_argument(
'--dir', default=None,
help='Directory with logs (default: vfp/log/ relative to the script)'
)
args = parser.parse_args()
if args.logfile:
log_path = args.logfile
else:
if args.dir:
log_dir = args.dir
else:
script_dir = os.path.dirname(os.path.abspath(__file__))
project_dir = os.path.dirname(script_dir)
log_dir = os.path.join(project_dir, 'vfp', 'log')
log_path = find_latest_log(log_dir)
if not log_path:
print(f'No sync_comenzi_*.log files found in {log_dir}',
file=sys.stderr)
sys.exit(1)
if not os.path.isfile(log_path):
print(f'File does not exist: {log_path}', file=sys.stderr)
sys.exit(1)
with open(log_path, 'r', encoding='utf-8', errors='replace') as f:
lines = f.readlines()
entries = parse_log_entries(lines)
result = analyze_entries(entries)
if args.skus:
for sku in sorted(result['missing_skus']):
print(sku)
else:
print(format_report(result, log_path))
if __name__ == '__main__':
main()
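The compact-format regexes above can be exercised in isolation. A small sketch that classifies one OK line and one ERR line (the sample log lines are invented for illustration; real logs may carry different order numbers and details):

```python
import re

# Same patterns as in parse_sync_log.py
RE_COMPACT_OK = re.compile(r'\[(\d+)/(\d+)\]\s+(\S+)\s+.*->\s+OK\s+ID:(\S+)')
RE_COMPACT_ERR = re.compile(r'\[(\d+)/(\d+)\]\s+(\S+)\s+.*->\s+ERR\s+(.*)')

def classify(line):
    """Return (status, order_number, detail) for a compact-format log line."""
    m = RE_COMPACT_OK.search(line)
    if m:
        return ('OK', m.group(3), m.group(4))
    m = RE_COMPACT_ERR.search(line)
    if m:
        return ('ERR', m.group(3), m.group(4).strip())
    return (None, None, None)

ok = classify('[3/120] WEB-1001 P:1 A:2/2 -> OK ID:55321')
err = classify('[4/120] WEB-1002 P:1 A:0/1 -> ERR SKU_NOT_FOUND: 8714858424056')
```

Trying OK before ERR mirrors the order in `analyze_entries`; since an OK line never contains `ERR` after the arrow, the two patterns cannot both match the same line.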