Compare commits

2 Commits

Author SHA1 Message Date
fd48ca480b Fix orders counting bug and improve logging
- Fixed incorrect order count in logging (counted JSON properties instead of actual orders)
- Added proper array declaration for AMEMBERS() function
- Added comprehensive logging system with timestamps and levels
- Added silent operation mode (removed SET STEP ON)
- Added output directory creation
- Enhanced error handling with proper logging instead of file dumps
- Added statistics tracking for products and orders processed

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-08-27 16:26:21 +03:00
8324a26705 comenzi 2025-08-27 15:15:59 +03:00
117 changed files with 2080 additions and 47334 deletions

View File

@@ -1,72 +0,0 @@
---
name: backend-api
description: Team agent pentru modificari backend FastAPI — routers, services, modele Pydantic, integrare Oracle/SQLite. Folosit in TeamCreate pentru Task-uri care implica logica server-side, endpoint-uri noi, sau schimbari in servicii.
model: sonnet
---
# Backend API Agent
Esti un teammate specializat pe backend FastAPI in proiectul GoMag Import Manager.
## Responsabilitati
- Modificari in `api/app/routers/*.py` — endpoint-uri FastAPI
- Modificari in `api/app/services/*.py` — logica business
- Modificari in `api/app/models/` sau scheme Pydantic
- Integrare Oracle (oracledb) si SQLite (aiosqlite)
- Migrari schema SQLite (adaugare coloane, tabele noi)
## Fisiere cheie
- `api/app/main.py` — entry point, middleware, router include
- `api/app/config.py` — setari Pydantic (env vars)
- `api/app/database.py` — Oracle pool + SQLite connections
- `api/app/routers/dashboard.py` — comenzi dashboard
- `api/app/routers/sync.py` — sync, history, order detail
- `api/app/routers/mappings.py` — CRUD mapari SKU
- `api/app/routers/articles.py` — cautare articole Oracle
- `api/app/routers/validation.py` — validare comenzi
- `api/app/services/sync_service.py` — orchestrator sync
- `api/app/services/gomag_client.py` — client API GoMag
- `api/app/services/sqlite_service.py` — tracking local SQLite
- `api/app/services/mapping_service.py` — logica mapari
- `api/app/services/import_service.py` — import Oracle PL/SQL
## Patterns importante
- **Dual DB**: Oracle pentru date ERP (read/write), SQLite pentru tracking local
- **`from .. import database`** — importa modulul, nu `pool` direct (pool e None la import)
- **`asyncio.to_thread()`** — wrapeaza apeluri Oracle blocante
- **CLOB**: `cursor.var(oracledb.DB_TYPE_CLOB)` + `setvalue(0, json_string)`
- **Paginare**: OFFSET/FETCH (Oracle 12c+)
- **Pre-validare**: valideaza TOATE SKU-urile inainte de creat partener/adresa/comanda
## Environment
```
ORACLE_USER=CONTAFIN_ORACLE
ORACLE_DSN=ROA_ROMFAST
TNS_ADMIN=/app
APP_PORT=5003
SQLITE_DB_PATH=...
```
## Workflow in echipa
1. Citeste task-ul cu `TaskGet` sa intelegi exact ce trebuie facut
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
3. Citeste fisierele afectate inainte sa le modifici
4. Implementeaza modificarile
5. Ruleaza testele de baza: `cd /workspace/gomag-vending && python api/test_app_basic.py`
6. Marcheaza task-ul ca `completed` cu `TaskUpdate`
7. Trimite mesaj la `team-lead` cu:
- Endpoint-uri create/modificate (metoda HTTP + path)
- Schimbari in schema SQLite (daca exista)
- Contracte API noi pe care frontend-ul trebuie sa le stie
## Principii
- Nu modifica fisiere HTML/CSS/JS (sunt ale agentilor UI)
- Pastreaza backward compatibility la endpoint-uri existente
- Adauga campuri noi in raspunsuri JSON fara sa le stergi pe cele vechi
- Logheaza erorile Oracle cu detalii suficiente pentru debug

View File

@@ -1,45 +0,0 @@
---
name: frontend-ui
description: Frontend developer for Jinja2 templates, CSS styling, and JavaScript interactivity
model: sonnet
---
# Frontend UI Agent
You are a frontend developer working on the web admin interface for the GoMag Import Manager.
## Your Responsibilities
- Build and maintain Jinja2 HTML templates
- Write CSS for responsive, clean admin interface
- Implement JavaScript for CRUD operations, auto-refresh, and dynamic UI
- Ensure consistent design across all pages
- Handle client-side validation
## Key Files You Own
- `api/app/templates/base.html` - Base layout with navigation
- `api/app/templates/dashboard.html` - Main dashboard with stat cards
- `api/app/templates/mappings.html` - SKU mappings CRUD interface
- `api/app/templates/sync_detail.html` - Sync run detail page
- `api/app/templates/missing_skus.html` - Missing SKUs management
- `api/app/static/css/style.css` - Application styles
- `api/app/static/js/dashboard.js` - Dashboard auto-refresh logic
- `api/app/static/js/mappings.js` - Mappings CRUD operations
## Design Guidelines
- Clean, professional admin interface
- Responsive layout using CSS Grid/Flexbox
- Stat cards for dashboard KPIs (total orders, success rate, missing SKUs)
- DataTables or similar for tabular data
- Toast notifications for CRUD feedback
- Auto-refresh dashboard every 10 seconds
- Romanian language for user-facing labels
## Communication Style
When reporting to the team lead or other teammates:
- List pages/components created or modified
- Note any new API endpoints or data contracts needed from backend
- Include screenshots or descriptions of UI changes

View File

@@ -1,48 +0,0 @@
---
name: oracle-dba
description: Oracle PL/SQL specialist for database scripts, packages, and schema changes in the ROA ERP system
model: sonnet
---
# Oracle DBA Agent
You are a senior Oracle PL/SQL developer working on the ROA Oracle ERP integration system.
## Your Responsibilities
- Write and modify PL/SQL packages (IMPORT_PARTENERI, IMPORT_COMENZI)
- Design and alter database schemas (ARTICOLE_TERTI table, NOM_ARTICOLE)
- Optimize SQL queries and package performance
- Handle Oracle-specific patterns: CLOB handling, pipelined functions, bulk operations
- Write test scripts for manual package testing (P1-004)
## Key Files You Own
- `api/database-scripts/01_create_table.sql` - ARTICOLE_TERTI table
- `api/database-scripts/02_import_parteneri.sql` - Partners package
- `api/database-scripts/03_import_comenzi.sql` - Orders package
- Any new `.sql` files in `api/database-scripts/`
## Oracle Conventions
- Schema: CONTAFIN_ORACLE
- TNS: ROA_ROMFAST
- System user ID: -3 (ID_UTIL for automated imports)
- Use PACK_ prefix for package names (e.g., PACK_IMPORT_COMENZI)
- ARTICOLE_TERTI primary key: (sku, codmat)
- Default gestiune: ID_GESTIUNE=1, ID_SECTIE=1, ID_POL=0
## Business Rules
- Partner search priority: cod_fiscal -> denumire -> create new
- Individual detection: CUI with 13 digits
- Default address: Bucuresti Sectorul 1
- SKU mapping types: simple (direct NOM_ARTICOLE match), repackaging (different quantities), complex sets (multiple CODMATs with percentage pricing)
- Inactive articles: set activ=0, never delete
## Communication Style
When reporting to the team lead or other teammates, always include:
- What SQL objects were created/modified
- Any schema changes that affect other layers
- Test results with sample data

View File

@@ -1,49 +0,0 @@
---
name: python-backend
description: FastAPI backend developer for services, routes, Oracle/SQLite integration, and API logic
model: sonnet
---
# Python Backend Agent
You are a senior Python developer specializing in FastAPI applications with Oracle database integration.
## Your Responsibilities
- Develop and maintain FastAPI services and routers
- Handle Oracle connection pooling (oracledb) and SQLite (aiosqlite) integration
- Implement business logic in service layer
- Build API endpoints for mappings CRUD, validation, sync, and dashboard
- Configure scheduler (APScheduler) for automated sync
## Key Files You Own
- `api/app/main.py` - FastAPI application entry point
- `api/app/config.py` - Pydantic settings
- `api/app/database.py` - Oracle pool + SQLite connection management
- `api/app/routers/` - All route handlers
- `api/app/services/` - Business logic layer
- `api/requirements.txt` - Python dependencies
## Architecture Patterns
- **Dual database**: Oracle for ERP data (read/write), SQLite for local tracking (sync_runs, import_orders, missing_skus)
- **`from .. import database` pattern**: Import the module, not `pool` directly (pool is None at import time)
- **`asyncio.to_thread()`**: Wrap blocking Oracle calls to avoid blocking the event loop
- **Pre-validation**: Validate ALL SKUs before creating partner/address/order
- **CLOB handling**: Use `cursor.var(oracledb.DB_TYPE_CLOB)` + `setvalue(0, json_string)`
- **OFFSET/FETCH pagination**: Requires Oracle 12c+
## Environment Variables
- ORACLE_USER, ORACLE_PASSWORD, ORACLE_DSN, TNS_ADMIN
- APP_PORT=5003
- JSON_OUTPUT_DIR (path to VFP JSON output)
- SQLITE_DB_PATH (local tracking database)
## Communication Style
When reporting to the team lead or other teammates:
- List endpoints created/modified with HTTP methods
- Flag any Oracle package interface changes needed
- Note any frontend template variables or API contracts changed

View File

@@ -1,51 +0,0 @@
---
name: qa-tester
description: QA engineer for testing Oracle packages, API endpoints, integration flows, and data validation
model: sonnet
---
# QA Testing Agent
You are a QA engineer responsible for testing the GoMag Import Manager system end-to-end.
## Your Responsibilities
- Write and execute test scripts for Oracle PL/SQL packages
- Test FastAPI endpoints and service layer
- Validate data flow: JSON -> validation -> Oracle import
- Check edge cases: missing SKUs, duplicate orders, invalid partners
- Verify business rules are correctly implemented
- Review code for security issues (SQL injection, XSS, input validation)
## Test Categories
### Oracle Package Tests (P1-004)
- IMPORT_PARTENERI: partner search/create, address parsing
- IMPORT_COMENZI: SKU resolution, order import, error handling
- Edge cases: 13-digit CUI, missing cod_fiscal, invalid addresses
### API Tests
- Mappings CRUD: create, read, update, delete, CSV import/export
- Dashboard: stat cards accuracy, sync history
- Validation: SKU batch validation, missing SKU detection
- Sync: manual trigger, scheduler toggle, order processing
### Integration Tests
- JSON file reading from VFP output
- Oracle connection pool lifecycle
- SQLite tracking database consistency
- End-to-end: JSON order -> validated -> imported into Oracle
## Success Criteria (from PRD)
- Import success rate > 95%
- Average processing time < 30s per order
- Zero downtime for main ROA system
- 100% log coverage
## Communication Style
When reporting to the team lead or other teammates:
- List test cases with pass/fail status
- Include error details and reproduction steps for failures
- Suggest fixes with file paths and line numbers
- Prioritize: critical bugs > functional issues > cosmetic issues

View File

@@ -1,50 +0,0 @@
---
name: ui-js
description: Team agent pentru modificari JavaScript (dashboard.js, logs.js, mappings.js, shared.js). Folosit in TeamCreate pentru Task-uri care implica logica client-side, API calls, si interactivitate UI.
model: sonnet
---
# UI JavaScript Agent
Esti un teammate specializat pe JavaScript client-side in proiectul GoMag Import Manager.
## Responsabilitati
- Modificari in `api/app/static/js/*.js`
- Fetch API calls catre backend (`/api/...`)
- Rendering dinamic HTML (tabele, liste, modals)
- Paginare, sortare, filtrare client-side
- Mobile vs desktop rendering logic
## Fisiere cheie
- `api/app/static/js/shared.js` - utilitare comune (fmtDate, statusDot, renderUnifiedPagination, renderMobileSegmented, esc)
- `api/app/static/js/dashboard.js` - logica dashboard comenzi
- `api/app/static/js/logs.js` - logica jurnale import
- `api/app/static/js/mappings.js` - CRUD mapari SKU
## Functii utilitare disponibile (din shared.js)
- `fmtDate(dateStr)` - formateaza data
- `statusDot(status)` - dot colorat pentru status
- `orderStatusBadge(status)` - badge Bootstrap pentru status
- `renderUnifiedPagination(page, totalPages, goPageFn, opts)` - paginare
- `renderMobileSegmented(containerId, items, onSelect)` - segmented control mobil
- `esc(s)` / `escHtml(s)` - escape HTML
## Workflow in echipa
1. Citeste task-ul cu `TaskGet` sa intelegi exact ce trebuie facut
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
3. Citeste fisierele afectate inainte sa le modifici
4. Implementeaza modificarile
5. Marcheaza task-ul ca `completed` cu `TaskUpdate`
6. Trimite mesaj la `team-lead` cu summary-ul modificarilor
## Principii
- Nu modifica fisiere HTML/CSS (sunt ale ui-templates agent)
- `Math.round(x)``Number(x).toFixed(2)` pentru valori monetare
- Verifica intotdeauna null/undefined inainte de operatii numerice: `x != null ? Number(x).toFixed(2) : '-'`
- Reset elementele din modal la inceputul fiecarei deschideri (loading state)
- Foloseste `esc()` pe orice valoare inserata in HTML

View File

@@ -1,42 +0,0 @@
---
name: ui-templates
description: Team agent pentru modificari HTML templates (dashboard.html, logs.html, mappings.html, base.html) si CSS (style.css). Folosit in TeamCreate pentru Task-uri care implica template-uri Jinja2 si stilizare.
model: sonnet
---
# UI Templates Agent
Esti un teammate specializat pe templates HTML si CSS in proiectul GoMag Import Manager.
## Responsabilitati
- Modificari in `api/app/templates/*.html` (Jinja2)
- Modificari in `api/app/static/css/style.css`
- Cache-bust: incrementeaza `?v=N` pe toate tag-urile `<script>` si `<link>` la fiecare modificare
- Structura modala Bootstrap 5.3
- Responsive: `d-none d-md-block` pentru desktop-only, `d-md-none` pentru mobile-only
## Fisiere cheie
- `api/app/templates/base.html` - layout de baza cu navigatie
- `api/app/templates/dashboard.html` - dashboard comenzi
- `api/app/templates/logs.html` - jurnale import
- `api/app/templates/mappings.html` - CRUD mapari SKU
- `api/app/templates/missing_skus.html` - SKU-uri lipsa
- `api/app/static/css/style.css` - stiluri aplicatie
## Workflow in echipa
1. Citeste task-ul cu `TaskGet` sa intelegi exact ce trebuie facut
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
3. Citeste fisierele afectate inainte sa le modifici
4. Implementeaza modificarile
5. Marcheaza task-ul ca `completed` cu `TaskUpdate`
6. Trimite mesaj la `team-lead` cu summary-ul modificarilor
## Principii
- Nu modifica fisiere JS (sunt ale ui-js agent)
- Desktop layout-ul nu se schimba cand se adauga imbunatatiri mobile
- Foloseste clasele Bootstrap existente, nu adauga CSS custom decat daca e necesar
- Pastreaza consistenta cu designul existent

View File

@@ -1,61 +0,0 @@
---
name: ui-verify
description: Team agent de verificare Playwright pentru UI. Captureaza screenshots after-implementation, compara cu preview-urile aprobate, si raporteaza discrepante la team lead. Folosit intotdeauna dupa implementare.
model: sonnet
---
# UI Verify Agent
Esti un teammate specializat pe verificare vizuala Playwright in proiectul GoMag Import Manager.
## Responsabilitati
- Capturare screenshots post-implementare → `screenshots/after/`
- Comparare vizuala `after/` vs `preview/`
- Verificare ca desktop-ul ramane neschimbat unde nu s-a modificat intentionat
- Raportare discrepante la team lead cu descriere exacta
## Server
App ruleaza la `http://localhost:5003`. Verifica cu `curl -s http://localhost:5003/health` inainte de screenshots.
**IMPORTANT**: NU restarteaza serverul singur. Serverul trebuie pornit de user via `./start.sh` care seteaza variabilele de mediu Oracle (`LD_LIBRARY_PATH`, `TNS_ADMIN`). Daca serverul nu raspunde sau Oracle e `"error"`, raporteaza la team-lead si asteapta ca userul sa-l reporneasca.
## Viewports
- **Mobile:** 375x812 — `browser_resize width=375 height=812`
- **Desktop:** 1440x900 — `browser_resize width=1440 height=900`
## Pagini de verificat
- `http://localhost:5003/` — Dashboard
- `http://localhost:5003/logs?run=<run_id>` — Logs cu run selectat
- `http://localhost:5003/mappings` — Mapari SKU
- `http://localhost:5003/missing-skus` — SKU-uri lipsa
## Workflow in echipa
1. Citeste task-ul cu `TaskGet` pentru lista exacta de pagini si criterii de verificat
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
3. Restarteza serverul daca e necesar
4. Captureaza screenshots la ambele viewports pentru fiecare pagina
5. Verifica vizual fiecare screenshot vs criteriile din task
6. Marcheaza task-ul ca `completed` cu `TaskUpdate`
7. Trimite raport detaliat la `team-lead`:
- ✅ Ce e corect
- ❌ Ce e gresit / lipseste (cu descriere exacta)
- Sugestii de fix daca e cazul
## Naming convention screenshots
```
screenshots/after/dashboard_desktop.png
screenshots/after/dashboard_mobile.png
screenshots/after/dashboard_modal_desktop.png
screenshots/after/dashboard_modal_mobile.png
screenshots/after/logs_desktop.png
screenshots/after/logs_mobile.png
screenshots/after/logs_modal_desktop.png
screenshots/after/logs_modal_mobile.png
screenshots/after/mappings_desktop.png
```

View File

@@ -1,45 +0,0 @@
---
name: vfp-integration
description: Visual FoxPro specialist for GoMag API integration, JSON processing, and Oracle orchestration
model: sonnet
---
# VFP Integration Agent
You are a Visual FoxPro 9 developer working on the GoMag API integration layer.
## Your Responsibilities
- Maintain and extend gomag-vending.prg (GoMag API client)
- Develop sync-comenzi-web.prg (orchestrator with timer automation)
- Handle JSON data retrieval, parsing, and output
- Implement HTML entity cleaning and data transformation
- Build logging system with rotation
## Key Files You Own
- `vfp/gomag-vending.prg` - GoMag API client with pagination
- `vfp/utils.prg` - Utility functions (logging, settings, connectivity)
- `vfp/sync-comenzi-web.prg` - Future orchestrator (Phase 2)
- `vfp/nfjson/` - JSON parsing library
## VFP Conventions
- HTML entity cleaning: ă->a, ș->s, ț->t, î->i, â->a (Romanian diacritics)
- INI configuration management via LoadSettings
- Log format: `YYYY-MM-DD HH:MM:SS | ORDER-XXX | OK/ERROR | details`
- JSON output to `vfp/output/` directory (gomag_orders_page*_*.json)
- 5-minute timer for automated sync cycles
## Data Flow
```
GoMag API -> VFP (gomag-vending.prg) -> JSON files -> FastAPI (order_reader.py) -> Oracle packages
```
## Communication Style
When reporting to the team lead or other teammates:
- Describe data format changes that affect downstream processing
- Note any new JSON fields or structure changes
- Flag API rate limiting or pagination issues

View File

@@ -1,6 +0,0 @@
{
"env": {
"CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS": "1"
},
"teammateMode": "in-process"
}

View File

@@ -1,38 +0,0 @@
name: Tests
on:
push:
branches-ignore: [main]
pull_request:
branches: [main]
jobs:
fast-tests:
runs-on: [self-hosted]
steps:
- uses: actions/checkout@v4
- name: Run fast tests (unit + e2e)
run: ./test.sh ci
full-tests:
runs-on: [self-hosted, oracle]
needs: fast-tests
if: github.event_name == 'pull_request'
steps:
- uses: actions/checkout@v4
- name: Run full tests (with Oracle)
run: ./test.sh full
env:
ORACLE_DSN: ${{ secrets.ORACLE_DSN }}
ORACLE_USER: ${{ secrets.ORACLE_USER }}
ORACLE_PASSWORD: ${{ secrets.ORACLE_PASSWORD }}
- name: Upload QA reports
if: always()
uses: actions/upload-artifact@v4
with:
name: qa-reports
path: qa-reports/
retention-days: 30

View File

@@ -1,9 +0,0 @@
#!/bin/bash
echo "🔍 Running pre-push tests..."
./test.sh ci
EXIT_CODE=$?
if [ $EXIT_CODE -ne 0 ]; then
echo "❌ Tests failed. Push aborted."
exit 1
fi
echo "✅ Tests passed. Pushing..."

48
.gitignore vendored
View File

@@ -3,53 +3,7 @@
*.bak
*.BAK
*.csv
/log.*
/output/*.json
*.json
*.err
*.ERR
*.log
/screenshots
/.playwright-mcp
# Python
__pycache__/
*.py[cod]
*$py.class
# Environment files
.env
.env.local
.env.*.local
# Settings files with secrets
settings.ini
vfp/settings.ini
.gittoken
output/
vfp/*.json
*.~pck
.claude/HANDOFF.md
scripts/work/
# Virtual environments
venv/
.venv/
# SQLite databases
*.db
*.db-journal
*.db-wal
*.db-shm
# Generated/duplicate directories
api/api/
# Logs directory
logs/
.gstack/
# QA Reports (generated by test suite)
qa-reports/
# Session handoff
.claude/HANDOFF.md

161
CLAUDE.md
View File

@@ -1,118 +1,85 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
**System:** Import Comenzi Web GoMag → Sistem ROA Oracle
Stack: FastAPI + Jinja2 + Bootstrap 5.3 + Oracle PL/SQL + SQLite
This is a Visual FoxPro 9 project that interfaces with the GoMag e-commerce API. The main component is a script for retrieving product data from GoMag's REST API endpoints.
Documentatie completa: [README.md](README.md)
## Architecture
## Implementare cu TeamCreate
- **Single File Application**: `gomag-vending.prg` - Main Visual FoxPro script
- **Technology**: Visual FoxPro 9 with WinHttp.WinHttpRequest.5.1 for HTTP requests
- **API Integration**: GoMag REST API v1 for product management
**OBLIGATORIU:** Folosim TeamCreate + TaskCreate, NU Agent tool cu subagenti paraleli. Skill-ul `superpowers:dispatching-parallel-agents` NU se aplica in acest proiect.
## Core Components
- Team lead citeste TOATE fisierele implicate, creeaza planul
- **ASTEAPTA aprobare explicita** de la user inainte de implementare
- Task-uri pe fisiere non-overlapping (evita conflicte)
- Cache-bust static assets (`?v=N`) la fiecare schimbare UI
### gomag-vending.prg
Main script that handles:
- GoMag API authentication using Apikey and ApiShop headers
- HTTP GET requests to retrieve product data
- JSON response parsing and analysis
- File output for API responses (timestamped .json files)
- Error handling and connectivity testing
### Key Configuration Variables
- `lcApiUrl`: GoMag API endpoint (defaults to product read endpoint)
- `lcApiKey`: GoMag API key (must be configured)
- `lcApiShop`: Shop URL (must be configured)
## Development Commands
```bash
# INTOTDEAUNA via start.sh (seteaza Oracle env vars)
./start.sh
# NU folosi uvicorn direct — lipsesc LD_LIBRARY_PATH si TNS_ADMIN
### Running the Application
```foxpro
DO gomag-vending.prg
```
## Testing & CI/CD
```bash
# Teste rapide (unit + e2e, ~30s, fara Oracle)
./test.sh ci
# Teste complete (totul inclusiv Oracle + sync real + PL/SQL, ~2-3 min)
./test.sh full
# Smoke test pe productie (read-only, dupa deploy)
./test.sh smoke-prod --base-url http://79.119.86.134/gomag
# Doar un layer specific
./test.sh unit # SQLite CRUD, imports, routes
./test.sh e2e # Browser tests (Playwright)
./test.sh oracle # Oracle integration
./test.sh sync # Sync real GoMag → Oracle
./test.sh qa # API health + responsive + log monitor
./test.sh logs # Doar log monitoring
# Validate prerequisites
./test.sh --dry-run
### Running from Windows Command Line
Use the provided batch file for easy execution:
```cmd
run-gomag.bat
```
**Flow zilnic:**
1. Lucrezi pe branch `fix/*` sau `feat/*`
2. `git push` → pre-push hook ruleaza `./test.sh ci` automat (~30s)
3. Inainte de PR → `./test.sh full` manual (~2-3 min)
4. Dupa deploy pe prod → `./test.sh smoke-prod --base-url http://79.119.86.134/gomag`
**Output:** `qa-reports/` — health score, raport markdown, screenshots, baseline comparison.
**Markers pytest:** `unit`, `oracle`, `e2e`, `qa`, `sync`
## Reguli critice (nu le incalca)
### Flux import comenzi
1. Download GoMag API → JSON → parse → validate SKU-uri → import Oracle
2. Ordinea: **parteneri** (cauta/creeaza) → **adrese****comanda****factura cache**
3. SKU lookup: ARTICOLE_TERTI (mapped) are prioritate fata de NOM_ARTICOLE (direct)
4. Complex sets (kituri/pachete): un SKU → multiple CODMAT-uri cu `cantitate_roa`; preturile se preiau din lista de preturi Oracle
5. Comenzi anulate (GoMag statusId=7): verifica daca au factura inainte de stergere din Oracle
### Statusuri comenzi
`IMPORTED` / `ALREADY_IMPORTED` / `SKIPPED` / `ERROR` / `CANCELLED` / `DELETED_IN_ROA`
- Upsert: `IMPORTED` existent NU se suprascrie cu `ALREADY_IMPORTED`
- Recovery: la fiecare sync, comenzile ERROR sunt reverificate in Oracle
### Parteneri
- Prioritate: **companie** (PJ, cod_fiscal + registru) daca exista in GoMag, altfel persoana fizica cu **shipping name**
- Adresa livrare: intotdeauna GoMag shipping
- Adresa facturare: daca shipping ≠ billing person → shipping pt ambele; altfel → billing din GoMag
### Preturi
- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
- Daca pretul lipseste, se insereaza automat pret=0
### Dashboard paginare
- Contorul din paginare arata **totalul comenzilor** din perioada selectata (ex: "378 comenzi"), NU doar cele filtrate
- Butoanele de filtru (Importat, Omise, Erori, Facturate, Nefacturate, Anulate) arata fiecare cate comenzi are pe langa total
- Aceasta este comportamentul dorit: userul vede cate comenzi totale sunt, din care cate importate, cu erori etc.
### Invoice cache
- Coloanele `factura_*` pe `orders` (SQLite), populate lazy din Oracle (`vanzari WHERE sters=0`)
- Refresh complet: verifica facturi noi + facturi sterse + comenzi sterse din ROA
## Sync articole VENDING → MARIUSM_AUTO
```bash
# Dry-run (arată diferențele fără să modifice)
python3 scripts/sync_vending_to_mariusm.py
# Aplică cu confirmare
python3 scripts/sync_vending_to_mariusm.py --apply
# Fără confirmare (automatizare)
python3 scripts/sync_vending_to_mariusm.py --apply --yes
Or directly with Visual FoxPro executable:
```cmd
"C:\Program Files (x86)\Microsoft Visual FoxPro 9\vfp9.exe" -T "path\to\gomag-vending-test.prg"
```
Sincronizează via SSH din VENDING (prod Windows) în MARIUSM_AUTO (dev ROA_CENTRAL):
nom_articole (noi by codmat, codmat updatat) + articole_terti (noi, modificate, soft-delete).
The batch file uses `%~dp0` to automatically detect the current directory, making it portable across different locations.
## Design System
### Testing Connectivity
The script includes a `TestConnectivity()` function for internet connectivity testing.
Always read DESIGN.md before making any visual or UI decisions.
All font choices, colors, spacing, and aesthetic direction are defined there.
Do not deviate without explicit user approval.
In QA mode, flag any code that doesn't match DESIGN.md.
## API Integration Details
## Deploy Windows
### Authentication
- Uses header-based authentication with `Apikey` and `ApiShop` headers
- Requires User-Agent to be different from "PostmanRuntime"
Vezi [README.md](README.md#deploy-windows)
### Endpoints Used
- Primary: `https://api.gomag.ro/api/v1/product/read/json?enabled=1`
- Supports pagination, filtering by category/brand, and sorting parameters
### Rate Limiting
- No specific limitations for READ requests
- POST requests limited to ~1 request per second (Leaky Bucket algorithm)
## File Structure
```
/
├── gomag-vending-test.prg # Main application script
├── run-gomag.bat # Windows batch file for easy execution
└── gomag_products_*.json # Generated API response files (timestamped)
```
## Configuration Requirements
Before running, update these variables in `gomag-vending.prg:10-15`:
1. `lcApiKey` - Your GoMag API key
2. `lcApiShop` - Your shop URL (e.g., "https://yourstore.gomag.ro")
## Helper Functions
- `ParseJsonResponse()` - Basic JSON structure analysis
- `TestConnectivity()` - Internet connectivity testing
- `UrlEncode()` - URL parameter encoding utility

324
DESIGN.md
View File

@@ -1,324 +0,0 @@
# Design System — GoMag Vending
## Product Context
- **What this is:** Internal admin dashboard for importing web orders from GoMag e-commerce into ROA Oracle ERP
- **Who it's for:** Ops/admin team who monitor order sync daily, fix SKU mappings, check import errors
- **Space/industry:** Internal tools, B2B operations, ERP integration
- **Project type:** Data-heavy admin dashboard (tables, status indicators, sync controls)
## Aesthetic Direction
- **Direction:** Industrial/Utilitarian — function-first, data-dense, quietly confident
- **Decoration level:** Minimal — typography and color do the work. No illustrations, gradients, or decorative elements. The data IS the decoration.
- **Mood:** Command console. This tool says "built by someone who respects the operator." Serious, efficient, warm.
- **Anti-patterns:** No purple gradients, no 3-column icon grids, no centered-everything layouts, no decorative blobs, no stock-photo heroes
## Typography
### Font Stack
- **Display/Headings:** Space Grotesk — geometric, slightly techy, distinctive `a` and `g`. Says "engineered."
- **Body/UI:** DM Sans — clean, excellent readability, good tabular-nums for inline numbers
- **Data/Tables:** JetBrains Mono — order IDs, CODMATs, status codes align perfectly. Tables become scannable.
- **Code:** JetBrains Mono
### Loading
```html
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=DM+Sans:ital,opsz,wght@0,9..40,300;0,9..40,400;0,9..40,500;0,9..40,600;0,9..40,700;1,9..40,400&family=JetBrains+Mono:wght@400;500;600&family=Space+Grotesk:wght@400;500;600;700&display=swap" rel="stylesheet">
```
### CSS Variables
```css
--font-display: 'Space Grotesk', sans-serif;
--font-body: 'DM Sans', sans-serif;
--font-data: 'JetBrains Mono', monospace;
```
### Type Scale
| Level | Size | Weight | Font | Usage |
|-------|------|--------|------|-------|
| Page title | 18px | 600 | Display | "Panou de Comanda" |
| Section title | 16px | 600 | Display | Card headers |
| Label/uppercase | 12px | 500 | Display | Column headers, section labels (letter-spacing: 0.04em) |
| Body | 14px | 400 | Body | Paragraphs, descriptions |
| UI/Button | 13px | 500 | Body | Buttons, nav links, form labels |
| Data cell | 13px | 400 | Data | Codes, IDs, numbers, sums, dates (NOT text names — those use Body font) |
| Data small | 12px | 400 | Data | Timestamps, secondary data |
| Code/mono | 11px | 400 | Data | Inline code, debug info |
## Color
### Approach: Two-accent system (amber state + blue action)
Every admin tool is blue. This one uses amber — reads as "operational" and "attention-worthy."
- **Amber (--accent):** Navigation active state, filter pill active, accent backgrounds. "Where you are."
- **Blue (--info):** Primary buttons, CTAs, actionable links. "What you can do."
- Primary buttons (`btn-primary`) stay blue for clear action hierarchy.
### Light Mode (default)
```css
:root {
/* Surfaces */
--bg: #F8F7F5; /* warm off-white, not clinical gray */
--surface: #FFFFFF;
--surface-raised: #F3F2EF; /* hover states, table headers */
--card-shadow: 0 1px 3px rgba(28,25,23,0.1), 0 1px 2px rgba(28,25,23,0.06);
/* Text */
--text-primary: #1C1917; /* warm black */
--text-secondary: #57534E; /* warm gray */
--text-muted: #78716C; /* labels, timestamps */
/* Borders */
--border: #E7E5E4;
--border-subtle: #F0EFED;
/* Accent — amber */
--accent: #D97706;
--accent-hover: #B45309;
--accent-light: #FEF3C7; /* amber backgrounds */
--accent-text: #92400E; /* text on amber bg */
/* Semantic */
--success: #16A34A;
--success-light: #DCFCE7;
--success-text: #166534;
--warning: #CA8A04;
--warning-light: #FEF9C3;
--warning-text: #854D0E;
--error: #DC2626;
--error-light: #FEE2E2;
--error-text: #991B1B;
--info: #2563EB;
--info-light: #DBEAFE;
--info-text: #1E40AF;
--cancelled: #78716C;
--cancelled-light: #F5F5F4;
}
```
### Dark Mode
Strategy: invert surfaces, reduce accent saturation ~15%, keep semantic colors recognizable.
```css
[data-theme="dark"] {
--bg: #121212;
--surface: #1E1E1E;
--surface-raised: #2A2A2A;
--card-shadow: 0 1px 3px rgba(0,0,0,0.4), 0 1px 2px rgba(0,0,0,0.3);
--text-primary: #E8E4DD; /* warm bone white */
--text-secondary: #A8A29E;
--text-muted: #78716C;
--border: #333333;
--border-subtle: #262626;
--accent: #F59E0B;
--accent-hover: #D97706;
--accent-light: rgba(245,158,11,0.12);
--accent-text: #FCD34D;
--success: #16A34A;
--success-light: rgba(22,163,74,0.15);
--success-text: #4ADE80;
--warning: #CA8A04;
--warning-light: rgba(202,138,4,0.15);
--warning-text: #FACC15;
--error: #DC2626;
--error-light: rgba(220,38,38,0.15);
--error-text: #FCA5A5;
--info: #2563EB;
--info-light: rgba(37,99,235,0.15);
--info-text: #93C5FD;
--cancelled: #78716C;
--cancelled-light: rgba(120,113,108,0.15);
}
```
### Status Color Mapping
| Status | Dot Color | Badge BG | Glow |
|--------|-----------|----------|------|
| IMPORTED | `--success` | `--success-light` | none (quiet when healthy) |
| ERROR | `--error` | `--error-light` | `0 0 8px 2px rgba(220,38,38,0.35)` |
| SKIPPED | `--warning` | `--warning-light` | `0 0 6px 2px rgba(202,138,4,0.3)` |
| ALREADY_IMPORTED | `--info` | `--info-light` | none |
| CANCELLED | `--cancelled` | `--cancelled-light` | none |
| DELETED_IN_ROA | `--cancelled` | `--cancelled-light` | none |
**Design rule:** Problems glow, success is calm. The operator's eye is pulled to rows that need action.
## Spacing
- **Base unit:** 4px
- **Density:** Comfortable — not cramped, not wasteful
- **Scale:**
| Token | Value | Usage |
|-------|-------|-------|
| 2xs | 2px | Tight internal gaps |
| xs | 4px | Icon-text gap, badge padding |
| sm | 8px | Compact card padding, table cell padding |
| md | 16px | Standard card padding, section gaps |
| lg | 24px | Section spacing |
| xl | 32px | Major section gaps |
| 2xl | 48px | Page-level spacing |
| 3xl | 64px | Hero spacing (rarely used) |
## Layout
### Approach: Grid-disciplined, full-width
Tables with 8+ columns and hundreds of rows need every pixel of width.
- **Nav:** Horizontal top bar, fixed, 48px height. Active tab has amber underline (2px).
- **Content max-width:** None on desktop (full-width for tables), 1200px for non-table content
- **Grid:** Single-column layout, cards stack vertically
- **Breakpoints:**
| Name | Width | Columns | Behavior |
|------|-------|---------|----------|
| Desktop | >= 1024px | Full width | All features visible |
| Tablet | 768-1023px | Full width | Nav labels abbreviated, tables scroll horizontally |
| Mobile | < 768px | Single column | Bottom nav, cards stack, condensed views |
### Border Radius
| Token | Value | Usage |
|-------|-------|-------|
| sm | 4px | Buttons, inputs, badges, status dots |
| md | 8px | Cards, dropdowns, modals |
| lg | 12px | Large containers, mockup frames |
| full | 9999px | Pills, avatar circles |
## Motion
- **Approach:** Minimal-functional only transitions that aid comprehension
- **Easing:** enter(ease-out) exit(ease-in) move(ease-in-out)
- **Duration:**
| Token | Value | Usage |
|-------|-------|-------|
| micro | 50-100ms | Button hover, focus ring |
| short | 150-250ms | Dropdown open, tab switch, color transitions |
| medium | 250-400ms | Modal open/close, page transitions |
| long | 400-700ms | Only for sync pulse animation |
- **Sync pulse:** The live sync dot uses a 2s infinite pulse (opacity 1 0.4 1)
- **No:** entrance animations, scroll effects, decorative motion
## Mobile Design
### Navigation
- **Bottom tab bar** replaces top horizontal nav on screens < 768px
- 5 tabs: Dashboard, Mapari, Lipsa, Jurnale, Setari
- Each tab: icon (Bootstrap Icons) + short label below
- Active tab: amber accent color, inactive: `--text-muted`
- Height: 56px, safe-area padding for notched devices
- Fixed position bottom, with `padding-bottom: env(safe-area-inset-bottom)`
```css
@media (max-width: 767px) {
.top-navbar { display: none; }
.bottom-nav {
position: fixed;
bottom: 0;
left: 0;
right: 0;
height: 56px;
padding-bottom: env(safe-area-inset-bottom);
background: var(--surface);
border-top: 1px solid var(--border);
display: flex;
justify-content: space-around;
align-items: center;
z-index: 1000;
}
.main-content {
padding-bottom: 72px; /* clear bottom nav */
padding-top: 8px; /* no top navbar */
}
}
```
### Dashboard — Mobile
- **Sync card:** Full width, stacked vertically
- Status + controls row wraps to 2 lines
- Sync button full-width at bottom of card
- Last sync info wraps naturally
- **Orders table:** Condensed card view instead of horizontal table
- Each order = a compact card showing: status dot + ID + client name + total
- Tap to expand: shows date, factura, full details
- Swipe left on card: quick action (view error details)
- **Filter bar:** Horizontal scrollable chips instead of dropdowns
- Period selector: pill chips (1zi, 7zi, 30zi, Toate)
- Status filter: colored chips matching status colors
- **Touch targets:** Minimum 44x44px for all interactive elements
### Orders Mobile Card Layout
```
┌────────────────────────────────┐
│ ● CMD-47832 2,450.00 RON│
│ SC Automate Express SRL │
│ 27.03.2026 · FCT-2026-1847 │
└────────────────────────────────┘
```
- Status dot (8px, left-aligned with glow for errors)
- Order ID in JetBrains Mono, amount right-aligned
- Client name in DM Sans
- Date + factura in muted data font
### SKU Mappings — Mobile
- Each mapping = expandable card
- Collapsed: SKU + product name + type badge (KIT/SIMPLU)
- Expanded: Full CODMAT list with quantities
- Search: Full-width sticky search bar at top
- Filter: Horizontal scrollable type chips
### Logs — Mobile
- Timeline view instead of table
- Each log entry = timestamp + status icon + summary
- Tap to expand full log details
- Infinite scroll with date separators
### Settings — Mobile
- Standard stacked form layout
- Full-width inputs
- Toggle switches for boolean settings (min 44px touch target)
- Save button sticky at bottom
### Gestures
- **Pull to refresh** on Dashboard: triggers sync status check
- **Swipe left** on order card: reveal quick actions
- **Long press** on SKU mapping: copy CODMAT to clipboard
- **No swipe navigation** between pages (use bottom tabs)
### Mobile Typography Adjustments
| Level | Desktop | Mobile |
|-------|---------|--------|
| Page title | 18px | 16px |
| Body | 14px | 14px (no change) |
| Data cell | 13px | 13px (no change) |
| Data small | 12px | 12px (no change) |
| Table header | 12px | 11px |
### Responsive Images & Icons
- Use Bootstrap Icons throughout (already loaded via CDN)
- Icon size: 16px desktop, 20px mobile (larger touch targets)
- No images in the admin interface (data-only)
## Decisions Log
| Date | Decision | Rationale |
|------|----------|-----------|
| 2026-03-27 | Initial design system created | Created by /design-consultation. Industrial/utilitarian aesthetic with amber accent, Space Grotesk + DM Sans + JetBrains Mono. |
| 2026-03-27 | Amber accent over blue | Every admin tool is blue. Amber reads as "operational" and gives the tool its own identity. Confirmed by Claude subagent ("Control Room Noir" also converged on amber). |
| 2026-03-27 | JetBrains Mono for data tables | Both primary analysis and subagent independently recommended monospace for data tables. Scannability win outweighs the ~15% wider columns. |
| 2026-03-27 | Warm tones throughout | Off-white (#F8F7F5) instead of clinical gray. Warm black text instead of blue-gray. Makes the tool feel handcrafted. |
| 2026-03-27 | Glowing status dots for errors | Problems glow (box-shadow), success is calm. Operator's eye is pulled to rows that need action. Inspired by subagent's "LED indicator" concept. |
| 2026-03-27 | Full mobile design | Bottom nav, card-based order views, touch-optimized gestures. Supports quick-glance usage from phone. |
| 2026-03-27 | Two-accent system | Blue = action (buttons, CTAs), amber = state (nav active, filter active). Clear hierarchy. |
| 2026-03-27 | JetBrains Mono selective | Mono font only for codes, IDs, numbers, sums, dates. Text names use DM Sans for readability. |
| 2026-03-27 | Dark mode in scope | CSS variables + toggle + localStorage. All DESIGN.md dark tokens implemented in Commit 0.5. |

View File

@@ -1,10 +0,0 @@
ESTIMARE PROIECT - Import Comenzi Web → ROA
Data: 5 martie 2026
================================================================================
Lucrat deja: 20h
De lucrat: 60h
Support 3 luni: 24h
TOTAL IMPLEMENTARE: 80h
TOTAL CU SUPPORT: 104h

466
README.md
View File

@@ -1,466 +1,2 @@
# GoMag Vending - Import Comenzi Web → ROA Oracle
# gomag-vending
System automat de import comenzi din platforma GoMag in sistemul ERP ROA Oracle.
## Arhitectura
```
[GoMag API] → [Python Sync Service] → [Oracle PL/SQL] → [FastAPI Admin]
↓ ↓ ↑ ↑
JSON Orders Download/Parse/Import Store/Update Dashboard + Config
```
### Stack Tehnologic
- **API + Admin:** FastAPI + Jinja2 + Bootstrap 5.3
- **GoMag Integration:** Python (`gomag_client.py` — download comenzi cu paginare)
- **Sync Orchestrator:** Python (`sync_service.py` — download → parse → validate → import)
- **Database:** Oracle PL/SQL packages (IMPORT_PARTENERI, IMPORT_COMENZI) + SQLite (tracking)
---
## Quick Start
### Prerequisite
- Python 3.10+
- Oracle Instant Client 21.x (optional — suporta si thin mode pentru Oracle 12.1+)
### Instalare
```bash
pip install -r api/requirements.txt
cp api/.env.example api/.env
# Editeaza api/.env cu datele de conectare Oracle
```
### Pornire server
**Important:** serverul trebuie pornit **din project root**, nu din `api/`:
```bash
python -m uvicorn api.app.main:app --host 0.0.0.0 --port 5003
```
Sau folosind scriptul inclus:
```bash
./start.sh
```
Deschide `http://localhost:5003` in browser.
### Testare
**Test A - Basic (fara Oracle):**
```bash
python api/test_app_basic.py
```
**Test C - Integrare Oracle:**
```bash
python api/test_integration.py
```
---
## Configurare (.env)
Copiaza `.env.example` si completeaza:
```bash
cp api/.env.example api/.env
```
| Variabila | Descriere | Exemplu |
|-----------|-----------|---------|
| `ORACLE_USER` | User Oracle | `MARIUSM_AUTO` |
| `ORACLE_PASSWORD` | Parola Oracle | `secret` |
| `ORACLE_DSN` | TNS alias | `ROA_CENTRAL` |
| `TNS_ADMIN` | Cale absoluta la tnsnames.ora | `/mnt/e/.../gomag/api` |
| `INSTANTCLIENTPATH` | Cale Instant Client (thick mode) | `/opt/oracle/instantclient_21_15` |
| `FORCE_THIN_MODE` | Thin mode fara Instant Client | `true` |
| `SQLITE_DB_PATH` | Path SQLite (relativ la project root) | `api/data/import.db` |
| `JSON_OUTPUT_DIR` | Folder JSON-uri descarcate | `api/data/orders` |
| `APP_PORT` | Port HTTP | `5003` |
| `ID_POL` | ID Politica ROA | `39` |
| `ID_GESTIUNE` | ID Gestiune ROA | `0` |
| `ID_SECTIE` | ID Sectie ROA | `6` |
**Nota Oracle mode:**
- **Thick mode** (Oracle 10g/11g): seteaza `INSTANTCLIENTPATH`
- **Thin mode** (Oracle 12.1+): seteaza `FORCE_THIN_MODE=true`, sterge `INSTANTCLIENTPATH`
---
## Structura Proiect
```
gomag-vending/
├── api/ # FastAPI Admin + Dashboard
│ ├── app/
│ │ ├── main.py # Entry point, lifespan, logging
│ │ ├── config.py # Settings (pydantic-settings + .env)
│ │ ├── database.py # Oracle pool + SQLite schema + migrari
│ │ ├── routers/ # Endpoint-uri HTTP
│ │ │ ├── health.py # GET /health
│ │ │ ├── dashboard.py # GET / (HTML) + /settings (HTML)
│ │ │ ├── mappings.py # /mappings, /api/mappings
│ │ │ ├── articles.py # /api/articles/search
│ │ │ ├── validation.py # /api/validate/*
│ │ │ └── sync.py # /api/sync/* + /api/dashboard/* + /api/settings
│ │ ├── services/
│ │ │ ├── gomag_client.py # Download comenzi GoMag API
│ │ │ ├── sync_service.py # Orchestrare: download→validate→import
│ │ │ ├── import_service.py # Import comanda in Oracle ROA
│ │ │ ├── mapping_service.py # CRUD ARTICOLE_TERTI + cantitate_roa
│ │ │ ├── price_sync_service.py # Sync preturi GoMag → Oracle politici
│ │ │ ├── sqlite_service.py # Tracking runs/orders/missing SKUs
│ │ │ ├── order_reader.py # Citire gomag_orders_page*.json
│ │ │ ├── validation_service.py
│ │ │ ├── article_service.py
│ │ │ ├── invoice_service.py # Verificare facturi ROA
│ │ │ └── scheduler_service.py # APScheduler timer
│ │ ├── templates/ # Jinja2 (dashboard, mappings, missing_skus, logs, settings)
│ │ └── static/ # CSS (style.css) + JS (dashboard, logs, mappings, settings, shared)
│ ├── database-scripts/ # Oracle SQL (ARTICOLE_TERTI, packages)
│ ├── data/ # SQLite DB (import.db) + JSON orders
│ ├── .env # Configurare locala (nu in git)
│ ├── .env.example # Template configurare
│ ├── test_app_basic.py # Test A - fara Oracle
│ ├── test_integration.py # Test C - cu Oracle
│ └── requirements.txt
├── logs/ # Log-uri aplicatie (sync_comenzi_*.log)
├── docs/ # Documentatie (Oracle schema, facturare analysis)
├── scripts/ # Utilitare (sync_vending_to_mariusm, create_inventory_notes)
├── screenshots/ # Before/preview/after pentru UI changes
├── start.sh # Script pornire (Linux/WSL)
└── CLAUDE.md # Instructiuni pentru AI assistants
```
---
## Dashboard Features
### Sync Panel
- Start sync manual sau scheduler automat (5/10/30 min)
- Progress live: `"Import 45/80: #CMD-1234 Ion Popescu"`
- Smart polling: 30s idle → 3s cand ruleaza → auto-refresh tabela
- Last sync clickabil → jurnal detaliat
### Comenzi
- Filtru perioada: 3z / 7z / 30z / 3 luni / toate / custom
- Status pills cu conturi totale pe perioada (nu per-pagina)
- Cautare integrata in bara de filtre
- Coloana Client cu tooltip `▲` cand persoana livrare ≠ facturare
- Paginare sus + jos, selector rezultate per pagina (25/50/100/250)
### Mapari SKU
- Badge `✓ 100%` / `⚠ 80%` per grup SKU
- Filtru Complete / Incomplete
- Verificare duplicat SKU-CODMAT (409 cu optiune de restaurare)
### SKU-uri Lipsa
- Cautare dupa SKU sau nume produs
- Filtru Nerezolvate / Rezolvate / Toate cu conturi
- Re-scan cu progress inline si banner rezultat
---
## Fluxul de Import
```
1. gomag_client.py descarca comenzi GoMag API → JSON files (paginat)
2. order_reader.py parseaza JSON-urile, sorteaza cronologic (cele mai vechi primele)
3. Comenzi anulate (GoMag statusId=7) → separate, sterse din Oracle daca nu au factura
4. validation_service.py valideaza SKU-uri: ARTICOLE_TERTI (mapped) → NOM_ARTICOLE (direct) → missing
5. Verificare existenta in Oracle (COMENZI by date range) → deja importate se sar
6. Stale error recovery: comenzi ERROR reverificate in Oracle (crash recovery)
7. Validare preturi + dual policy: articole rutate la id_pol_vanzare sau id_pol_productie
8. import_service.py: cauta/creeaza partener → adrese → importa comanda in Oracle
9. Invoice cache: verifica facturi + comenzi sterse din ROA
10. Rezultate salvate in SQLite (orders, sync_run_orders, order_items)
```
### Statuses Comenzi
| Status | Descriere |
|--------|-----------|
| `IMPORTED` | Importata nou in ROA in acest run |
| `ALREADY_IMPORTED` | Existenta deja in Oracle, contorizata |
| `SKIPPED` | SKU-uri lipsa → neimportata |
| `ERROR` | Eroare la import (reverificate automat la urmatorul sync) |
| `CANCELLED` | Comanda anulata in GoMag (statusId=7) |
| `DELETED_IN_ROA` | A fost importata dar comanda a fost stearsa din ROA |
**Regula upsert:** daca statusul existent este `IMPORTED`, nu se suprascrie cu `ALREADY_IMPORTED`.
### Reguli Business
**Parteneri & Adrese:**
- Prioritate partener: daca exista **companie** in GoMag (billing.company_name) → firma (PJ, cod_fiscal + registru). Altfel → persoana fizica, cu **shipping name** ca nume partener
- Adresa livrare: intotdeauna din GoMag shipping
- Adresa facturare: daca shipping name ≠ billing name → adresa shipping pt ambele; daca aceeasi persoana → adresa billing din GoMag
- Cautare partener in Oracle: cod_fiscal → denumire → create new (ID_UTIL = -3)
**Articole & Mapari:**
- SKU lookup: ARTICOLE_TERTI (mapped, activ=1) are prioritate fata de NOM_ARTICOLE (direct)
- SKU simplu: gasit direct in NOM_ARTICOLE → nu se stocheaza in ARTICOLE_TERTI
- SKU cu repackaging: un SKU → CODMAT cu cantitate diferita (`cantitate_roa`)
- SKU set complex: un SKU → multiple CODMAT-uri cu `procent_pret` (trebuie sum = 100%)
**Preturi & Discounturi:**
- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
- Daca pretul lipseste in politica, se insereaza automat pret=0
- Discount VAT splitting: daca `split_discount_vat=1`, discountul se repartizeaza proportional pe cotele TVA din comanda
---
## Facturi & Cache
### Sincronizari
Sistemul are 3 procese de sincronizare si o setare de refresh UI:
#### 1. Sync Comenzi (Dashboard → scheduler sau buton Sync)
Procesul principal. Importa comenzi din GoMag in Oracle si verifica statusul celor existente.
**Pasi:**
1. Descarca comenzile din GoMag API (ultimele N zile, configurat in Setari)
2. Valideaza SKU-urile fiecarei comenzi:
- Cauta in ARTICOLE_TERTI (mapari manuale) → apoi in NOM_ARTICOLE (potrivire directa)
- Daca un SKU nu e gasit nicaieri → comanda e marcata SKIPPED si SKU-ul apare in "SKU-uri lipsa"
3. Verifica daca comanda exista deja in Oracle → da: ALREADY_IMPORTED, nu: se importa
4. Comenzi cu status ERROR din run-uri anterioare sunt reverificate in Oracle (crash recovery)
5. Import in Oracle: cauta/creeaza partener → adrese → comanda
6. **Verificare facturi** (la fiecare sync):
- Comenzi nefacturate → au primit factura in ROA? → salveaza serie/numar/total
- Comenzi facturate → a fost stearsa factura? → sterge cache
- Comenzi importate → au fost sterse din ROA? → marcheaza DELETED_IN_ROA
**Cand ruleaza:**
- **Automat:** scheduler configurat din Dashboard (interval: 5 / 10 / 30 min)
- **Manual:** buton "Sync" din Dashboard sau `POST /api/sync/start`
- **Doar facturi:** `POST /api/dashboard/refresh-invoices` (sare pasii 1-5)
> Facturarea in ROA **nu** declanseaza sync — statusul se actualizeaza la urmatorul sync sau refresh manual.
#### 2. Sync Preturi din Comenzi (Setari → on/off)
La fiecare sync comenzi, daca este activat (`price_sync_enabled=1`), compara preturile din comanda GoMag cu cele din politica de pret Oracle si le actualizeaza daca difera.
Configurat din: **Setari → Sincronizare preturi din comenzi**
#### 3. Sync Catalog Preturi (Setari → manual sau zilnic)
Sync independent de comenzi. Descarca **toate produsele** din catalogul GoMag, le potriveste cu articolele Oracle (prin CODMAT/SKU) si actualizeaza preturile in politica de pret.
Configurat din: **Setari → Sincronizare Preturi** (activare + program)
- **Doar manual:** buton "Sincronizeaza acum" din Setari sau `POST /api/price-sync/start`
- **Zilnic la 03:00 / 06:00:** optiune in UI (**neimplementat** — setarea se salveaza dar scheduler-ul zilnic nu exista inca)
#### Interval polling dashboard (Setari → Dashboard)
Cat de des verifica **interfata web** (browser-ul) statusul sync-ului. Valoare in secunde (implicit 5s). **Nu afecteaza frecventa sync-ului** — e doar refresh-ul UI-ului.
Facturile sunt verificate din Oracle si cached in SQLite (`factura_*` pe tabelul `orders`).
### Sursa Oracle
```sql
SELECT id_comanda, numar_act, serie_act,
total_fara_tva, total_tva, total_cu_tva,
TO_CHAR(data_act, 'YYYY-MM-DD')
FROM vanzari
WHERE id_comanda IN (...) AND sters = 0
```
### Populare Cache
1. **Dashboard** (`GET /api/dashboard/orders`) — comenzile fara cache sunt verificate live si cached automat la fiecare request
2. **Detaliu comanda** (`GET /api/sync/order/{order_number}`) — verifica Oracle live daca nu e cached
3. **Refresh manual** (`POST /api/dashboard/refresh-invoices`) — refresh complet pentru toate comenzile
### Refresh Complet — `/api/dashboard/refresh-invoices`
Face trei verificari in Oracle si actualizeaza SQLite:
| Verificare | Actiune |
|------------|---------|
| Comenzi nefacturate → au primit factura? | Cached datele facturii |
| Comenzi facturate → factura a fost stearsa? | Sterge cache factura |
| Toate comenzile importate → comanda stearsa din ROA? | Seteaza status `DELETED_IN_ROA` |
Returneaza: `{ checked, invoices_added, invoices_cleared, orders_deleted }`
---
## API Reference — Sync & Comenzi
### Sync
| Method | Path | Descriere |
|--------|------|-----------|
| POST | `/api/sync/start` | Porneste sync in background |
| POST | `/api/sync/stop` | Trimite semnal de stop |
| GET | `/api/sync/status` | Status curent + progres + last_run |
| GET | `/api/sync/history` | Istoric run-uri (paginat) |
| GET | `/api/sync/run/{id}` | Detalii run specific |
| GET | `/api/sync/run/{id}/log` | Log per comanda (JSON) |
| GET | `/api/sync/run/{id}/text-log` | Log text (live din memorie sau reconstruit din SQLite) |
| GET | `/api/sync/run/{id}/orders` | Comenzi run filtrate/paginate |
| GET | `/api/sync/order/{number}` | Detaliu comanda + items + ARTICOLE_TERTI + factura |
### Dashboard Comenzi
| Method | Path | Descriere |
|--------|------|-----------|
| GET | `/api/dashboard/orders` | Comenzi cu enrichment factura |
| POST | `/api/dashboard/refresh-invoices` | Force-refresh stare facturi + deleted orders |
**Parametri `/api/dashboard/orders`:**
- `period_days`: 3/7/30/90 sau 0 (toate sau interval custom)
- `period_start`, `period_end`: interval custom (cand `period_days=0`)
- `status`: `all` / `IMPORTED` / `SKIPPED` / `ERROR` / `UNINVOICED` / `INVOICED`
- `search`, `sort_by`, `sort_dir`, `page`, `per_page`
Filtrele `UNINVOICED` si `INVOICED` fac fetch din toate comenzile IMPORTED si filtreaza server-side dupa prezenta/absenta cache-ului de factura.
### Scheduler
| Method | Path | Descriere |
|--------|------|-----------|
| PUT | `/api/sync/schedule` | Configureaza (enabled, interval_minutes: 5/10/30) |
| GET | `/api/sync/schedule` | Status curent |
Configuratia este persistata in SQLite (`scheduler_config`).
### Settings
| Method | Path | Descriere |
|--------|------|-----------|
| GET | `/api/settings` | Citeste setari aplicatie |
| PUT | `/api/settings` | Salveaza setari |
| GET | `/api/settings/sectii` | Lista sectii Oracle |
| GET | `/api/settings/politici` | Lista politici preturi Oracle |
**Setari disponibile:** `transport_codmat`, `transport_vat`, `discount_codmat`, `discount_vat`, `transport_id_pol`, `discount_id_pol`, `id_pol`, `id_pol_productie`, `id_sectie`, `split_discount_vat`, `gomag_api_key`, `gomag_api_shop`, `gomag_order_days_back`, `gomag_limit`
---
## Deploy Windows
### Instalare initiala
```powershell
# Ruleaza ca Administrator
.\deploy.ps1
```
Scriptul `deploy.ps1` face automat: git clone, venv, dependinte, detectare Oracle, `start.bat`, serviciu NSSM, configurare IIS reverse proxy.
### Update cod (pull + restart)
```powershell
# Ca Administrator
.\update.ps1
```
Sau manual:
```powershell
cd C:\gomag-vending
git pull origin main
nssm restart GoMagVending
```
### Configurare `.env` pe Windows
```ini
# api/.env — exemplu Windows
ORACLE_USER=VENDING
ORACLE_PASSWORD=****
ORACLE_DSN=ROA
TNS_ADMIN=C:\roa\instantclient_11_2_0_2
INSTANTCLIENTPATH=C:\app\Server\product\18.0.0\dbhomeXE\bin
SQLITE_DB_PATH=api/data/import.db
JSON_OUTPUT_DIR=api/data/orders
APP_PORT=5003
ID_POL=39
ID_GESTIUNE=0
ID_SECTIE=6
GOMAG_API_KEY=...
GOMAG_API_SHOP=...
GOMAG_ORDER_DAYS_BACK=7
GOMAG_LIMIT=100
```
**Important:**
- `TNS_ADMIN` = folderul care contine `tnsnames.ora` (NU fisierul in sine)
- `ORACLE_DSN` = alias-ul exact din `tnsnames.ora`
- `INSTANTCLIENTPATH` = calea catre Oracle bin (thick mode, Oracle 10g/11g)
- `FORCE_THIN_MODE=true` = elimina necesitatea Instant Client (Oracle 12.1+)
- Setarile din `.env` pot fi suprascrise din UI → `Setari` → salvate in SQLite
### Serviciu Windows (NSSM)
```powershell
nssm restart GoMagVending # restart serviciu
nssm status GoMagVending # status serviciu
nssm stop GoMagVending # stop serviciu
nssm start GoMagVending # start serviciu
```
Loguri serviciu: `logs/service_stdout.log`, `logs/service_stderr.log`
Loguri aplicatie: `logs/sync_comenzi_*.log`
**Nota:** Userul `gomag` nu are drepturi de admin — `nssm restart` necesita PowerShell Administrator direct pe server.
### Depanare SSH
```bash
# Conectare SSH (PowerShell remote, cheie publica)
ssh -p 22122 gomag@79.119.86.134
# Verificare .env
cmd /c type C:\gomag-vending\api\.env
# Test conexiune Oracle
C:\gomag-vending\venv\Scripts\python.exe -c "import oracledb, os; os.environ['TNS_ADMIN']='C:/roa/instantclient_11_2_0_2'; conn=oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA'); print('Connected!'); conn.close()"
# Verificare tnsnames.ora
cmd /c type C:\roa\instantclient_11_2_0_2\tnsnames.ora
# Verificare procese Python
Get-Process *python* | Select-Object Id,ProcessName,Path
# Verificare loguri recente
Get-ChildItem C:\gomag-vending\logs\*.log | Sort-Object LastWriteTime -Descending | Select-Object -First 3
# Test sync manual (verifica ca Oracle pool porneste)
curl http://localhost:5003/health
curl -X POST http://localhost:5003/api/sync/start
# Refresh facturi manual
curl -X POST http://localhost:5003/api/dashboard/refresh-invoices
```
### Probleme frecvente
| Eroare | Cauza | Solutie |
|--------|-------|---------|
| `ORA-12154: TNS:could not resolve` | `TNS_ADMIN` gresit sau `tnsnames.ora` nu contine alias-ul DSN | Verifica `TNS_ADMIN` in `.env` + alias in `tnsnames.ora` |
| `ORA-04088: LOGON_AUDIT_TRIGGER` + `Nu aveti licenta pentru PYTHON` | Trigger ROA blocheaza executabile nelicențiate | Adauga `python.exe` (calea completa) in ROASUPORT |
| `503 Service Unavailable` pe `/api/articles/search` | Oracle pool nu s-a initializat | Verifica logul `sync_comenzi_*.log` pentru eroarea exacta |
| Facturile nu apar in dashboard | Cache SQLite gol — invoice_service nu a putut interoga Oracle | Apasa butonul Refresh Facturi din dashboard sau `POST /api/dashboard/refresh-invoices` |
| Comanda apare ca `DELETED_IN_ROA` | Comanda a fost stearsa manual din ROA | Normal — marcat automat la refresh |
| Scheduler nu porneste dupa restart | Config pierduta | Verifica SQLite `scheduler_config` sau reconfigureaza din UI |
---
## Documentatie Tehnica
| Fisier | Subiect |
|--------|---------|
| [docs/oracle-schema-notes.md](docs/oracle-schema-notes.md) | Schema Oracle: tabele comenzi, facturi, preturi, proceduri cheie |
| [docs/pack_facturare_analysis.md](docs/pack_facturare_analysis.md) | Analiza flow facturare: call chain, parametri, STOC lookup, FACT-008 |
| [scripts/HANDOFF_MAPPING.md](scripts/HANDOFF_MAPPING.md) | Matching GoMag SKU → ROA articole (strategie si rezultate) |
---
## WSL2 Note
- `uvicorn --reload` **nu functioneaza** pe `/mnt/e/` (WSL2 limitation) — restarta manual
- Serverul trebuie pornit din **project root**, nu din `api/`
- `JSON_OUTPUT_DIR` si `SQLITE_DB_PATH` sunt relative la project root

View File

@@ -1,15 +0,0 @@
# TODOS
## P2: Refactor sync_service.py in module separate
**What:** Split sync_service.py (870 linii) in: download_service, parse_service, sync_orchestrator.
**Why:** Faciliteza debugging si testare. Un bug in price sync nu ar trebui sa afecteze import flow.
**Effort:** M (human: ~1 sapt / CC: ~1-2h)
**Context:** Dupa implementarea planului Command Center (retry_service deja extras). sync_service face download + parse + validate + import + price sync + invoice check — prea multe responsabilitati.
**Depends on:** Finalizarea planului Command Center.
## P2: Email/webhook alert pe sync esuat
**What:** Cand sync-ul gaseste >5 erori sau esueaza complet, trimite un email/webhook.
**Why:** Post-lansare, cand app-ul ruleaza automat, nimeni nu sta sa verifice constant.
**Effort:** M (human: ~1 sapt / CC: ~1h)
**Context:** Depinde de infrastructura email/webhook disponibila la client. Implementare: SMTP simplu sau webhook URL configurabil in Settings.
**Depends on:** Lansare in productie + infrastructura email la client.

View File

@@ -1,86 +0,0 @@
# =============================================================================
# GoMag Import Manager - Configurare
# Copiaza in api/.env si completeaza cu datele reale
# =============================================================================
# =============================================================================
# ORACLE MODE - Alege una din urmatoarele doua optiuni:
# =============================================================================
# THICK MODE (Oracle 10g/11g/12.1+) - Recomandat pentru compatibilitate maxima
# Necesita Oracle Instant Client instalat
INSTANTCLIENTPATH=/opt/oracle/instantclient_21_15
# THIN MODE (Oracle 12.1+ only) - Fara Instant Client, mai simplu
# Comenteaza INSTANTCLIENTPATH de sus si decommenteaza urmatoarea linie:
# FORCE_THIN_MODE=true
# =============================================================================
# ORACLE - Credentiale baza de date
# =============================================================================
ORACLE_USER=USER_ORACLE
ORACLE_PASSWORD=parola_oracle
ORACLE_DSN=TNS_ALIAS
# Calea absoluta la directorul cu tnsnames.ora
# De obicei: directorul api/ al proiectului
TNS_ADMIN=/cale/absoluta/la/gomag/api
# =============================================================================
# APLICATIE
# =============================================================================
APP_PORT=5003
LOG_LEVEL=INFO
# =============================================================================
# CALE FISIERE
# Relative: JSON_OUTPUT_DIR la project root, SQLITE_DB_PATH la api/
# Se pot folosi si cai absolute
# =============================================================================
# JSON-uri comenzi GoMag
JSON_OUTPUT_DIR=output
# SQLite tracking DB
SQLITE_DB_PATH=data/import.db
# =============================================================================
# ROA - Setari import comenzi (din vfp/settings.ini sectiunea [ROA])
# =============================================================================
# Politica de pret
ID_POL=39
# Gestiune implicita
ID_GESTIUNE=0
# Sectie implicita
ID_SECTIE=6
# =============================================================================
# GoMag API
# =============================================================================
GOMAG_API_KEY=your_api_key_here
GOMAG_API_SHOP=https://yourstore.gomag.ro
GOMAG_ORDER_DAYS_BACK=7
GOMAG_LIMIT=100
# =============================================================================
# SMTP - Notificari email (optional)
# =============================================================================
# SMTP_HOST=smtp.gmail.com
# SMTP_PORT=587
# SMTP_USER=email@exemplu.com
# SMTP_PASSWORD=parola_app
# SMTP_TO=destinatar@exemplu.com
# =============================================================================
# AUTH - HTTP Basic Auth pentru dashboard (optional)
# =============================================================================
# API_USERNAME=admin
# API_PASSWORD=parola_sigura

View File

@@ -1,41 +0,0 @@
# UNIFIED Dockerfile - AUTO-DETECT Thick/Thin Mode
FROM python:3.11-slim as base
# Set argument for build mode (thick by default for compatibility)
ARG ORACLE_MODE=thick
# Base application setup
WORKDIR /app
COPY requirements.txt /app/requirements.txt
RUN pip3 install -r requirements.txt
# Oracle Instant Client + SQL*Plus installation (only if thick mode)
RUN if [ "$ORACLE_MODE" = "thick" ] ; then \
apt-get update && apt-get install -y libaio-dev wget unzip curl && \
mkdir -p /opt/oracle && cd /opt/oracle && \
wget https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip && \
wget https://download.oracle.com/otn_software/linux/instantclient/instantclient-sqlplus-linuxx64.zip && \
unzip -o instantclient-basiclite-linuxx64.zip && \
unzip -o instantclient-sqlplus-linuxx64.zip && \
rm -f instantclient-basiclite-linuxx64.zip instantclient-sqlplus-linuxx64.zip && \
cd /opt/oracle/instantclient* && \
rm -f *jdbc* *mysql* *jar uidrvci genezi adrci && \
echo /opt/oracle/instantclient* > /etc/ld.so.conf.d/oracle-instantclient.conf && \
ldconfig && \
ln -sf /usr/lib/x86_64-linux-gnu/libaio.so.1t64 /usr/lib/x86_64-linux-gnu/libaio.so.1 && \
ln -sf /opt/oracle/instantclient*/sqlplus /usr/local/bin/sqlplus ; \
else \
echo "Thin mode - skipping Oracle Instant Client installation" ; \
fi
# Copy application files
COPY . .
# Create logs directory
RUN mkdir -p /app/logs
# Expose port
EXPOSE 5000
# Run Flask application with auto-detect mode
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "admin:app", "--reload", "--access-logfile", "-"]

View File

@@ -1,61 +0,0 @@
# GoMag Import Manager - FastAPI Application
Admin interface si orchestrator pentru importul comenzilor GoMag in Oracle ROA.
## Componente
### Core
- **main.py** - Entry point FastAPI, lifespan (Oracle pool + SQLite init), file logging
- **config.py** - Settings via pydantic-settings (citeste .env)
- **database.py** - Oracle connection pool + SQLite schema + helpers
### Routers (HTTP Endpoints)
| Router | Prefix | Descriere |
|--------|--------|-----------|
| health | /health, /api/health | Status Oracle + SQLite |
| dashboard | / | Dashboard HTML cu stat cards |
| mappings | /mappings, /api/mappings | CRUD ARTICOLE_TERTI + CSV |
| articles | /api/articles | Cautare NOM_ARTICOLE |
| validation | /api/validate | Scanare + validare SKU-uri |
| sync | /sync, /api/sync | Import orchestration + scheduler |
### Services (Business Logic)
| Service | Rol |
|---------|-----|
| mapping_service | CRUD pe ARTICOLE_TERTI (Oracle) |
| article_service | Cautare in NOM_ARTICOLE (Oracle) |
| import_service | Port din VFP: partner/address/order creation |
| sync_service | Orchestrare: read JSONs → validate → import → log |
| price_sync_service | Sync preturi GoMag → Oracle politici de pret |
| invoice_service | Verificare facturi ROA + cache SQLite |
| validation_service | Batch-validare SKU-uri (chunks of 500) |
| order_reader | Citire gomag_orders_page*.json din vfp/output/ |
| sqlite_service | CRUD pe SQLite (sync_runs, import_orders, missing_skus) |
| scheduler_service | APScheduler - sync periodic configurabil din UI |
## Rulare
```bash
pip install -r requirements.txt
# INTOTDEAUNA via start.sh din project root (seteaza Oracle env vars)
cd .. && ./start.sh
```
## Testare
```bash
# Din project root:
./test.sh ci # Teste rapide (unit + e2e, ~30s, fara Oracle)
./test.sh full # Teste complete (inclusiv Oracle, ~2-3 min)
./test.sh unit # Doar unit tests
./test.sh e2e # Doar browser tests (Playwright)
./test.sh oracle # Doar Oracle integration
```
## Dual Database
- **Oracle** - date ERP (ARTICOLE_TERTI, NOM_ARTICOLE, COMENZI)
- **SQLite** - tracking local (sync_runs, import_orders, missing_skus, scheduler_config)
## Logging
Log files in `../logs/sync_comenzi_YYYYMMDD_HHMMSS.log`
Format: `2026-03-11 14:30:25 | INFO | app.services.sync_service | mesaj`

View File

View File

@@ -1,63 +0,0 @@
from pydantic_settings import BaseSettings
from pydantic import model_validator
from pathlib import Path
import os
# Anchored paths - independent of CWD
_api_root = Path(__file__).resolve().parent.parent # .../gomag/api/
_project_root = _api_root.parent # .../gomag/
_env_path = _api_root / ".env"
class Settings(BaseSettings):
# Oracle
ORACLE_USER: str = "MARIUSM_AUTO"
ORACLE_PASSWORD: str = "ROMFASTSOFT"
ORACLE_DSN: str = "ROA_CENTRAL"
INSTANTCLIENTPATH: str = ""
FORCE_THIN_MODE: bool = False
TNS_ADMIN: str = ""
# SQLite
SQLITE_DB_PATH: str = "data/import.db"
# App
APP_PORT: int = 5003
LOG_LEVEL: str = "INFO"
JSON_OUTPUT_DIR: str = "output"
# SMTP (optional)
SMTP_HOST: str = ""
SMTP_PORT: int = 587
SMTP_USER: str = ""
SMTP_PASSWORD: str = ""
SMTP_TO: str = ""
# Auth (optional)
API_USERNAME: str = ""
API_PASSWORD: str = ""
# ROA Import Settings
ID_POL: int = 0
ID_SECTIE: int = 0
# GoMag API
GOMAG_API_KEY: str = ""
GOMAG_API_SHOP: str = ""
GOMAG_ORDER_DAYS_BACK: int = 7
GOMAG_LIMIT: int = 100
GOMAG_API_URL: str = "https://api.gomag.ro/api/v1/order/read/json"
@model_validator(mode="after")
def resolve_paths(self):
"""Resolve relative paths against known roots, independent of CWD."""
# SQLITE_DB_PATH: relative to api/ root
if self.SQLITE_DB_PATH and not os.path.isabs(self.SQLITE_DB_PATH):
self.SQLITE_DB_PATH = str(_api_root / self.SQLITE_DB_PATH)
# JSON_OUTPUT_DIR: relative to project root
if self.JSON_OUTPUT_DIR and not os.path.isabs(self.JSON_OUTPUT_DIR):
self.JSON_OUTPUT_DIR = str(_project_root / self.JSON_OUTPUT_DIR)
return self
model_config = {"env_file": str(_env_path), "env_file_encoding": "utf-8", "extra": "ignore"}
settings = Settings()

View File

@@ -1,361 +0,0 @@
import oracledb
import aiosqlite
import sqlite3
import logging
import os
from pathlib import Path
from .config import settings
logger = logging.getLogger(__name__)
# ---- Oracle Pool ----
pool = None
def init_oracle():
"""Initialize Oracle client mode and create connection pool."""
global pool
force_thin = settings.FORCE_THIN_MODE
instantclient_path = settings.INSTANTCLIENTPATH
dsn = settings.ORACLE_DSN
# Ensure TNS_ADMIN is set as OS env var so oracledb can find tnsnames.ora
if settings.TNS_ADMIN:
os.environ['TNS_ADMIN'] = settings.TNS_ADMIN
logger.info(f"Oracle config: DSN={dsn}, TNS_ADMIN={settings.TNS_ADMIN or os.environ.get('TNS_ADMIN', '(not set)')}, INSTANTCLIENTPATH={instantclient_path or '(not set)'}")
if force_thin:
logger.info(f"FORCE_THIN_MODE=true: thin mode for {dsn}")
elif instantclient_path:
try:
oracledb.init_oracle_client(lib_dir=instantclient_path)
logger.info(f"Thick mode activated for {dsn}")
except Exception as e:
logger.error(f"Thick mode error: {e}")
logger.info("Fallback to thin mode")
else:
logger.info(f"Thin mode (default) for {dsn}")
pool = oracledb.create_pool(
user=settings.ORACLE_USER,
password=settings.ORACLE_PASSWORD,
dsn=settings.ORACLE_DSN,
min=2,
max=4,
increment=1
)
logger.info(f"Oracle pool created for {dsn}")
return pool
def get_oracle_connection():
"""Get a connection from the Oracle pool."""
if pool is None:
raise RuntimeError("Oracle pool not initialized")
return pool.acquire()
def close_oracle():
"""Close the Oracle connection pool."""
global pool
if pool:
pool.close()
pool = None
logger.info("Oracle pool closed")
# ---- SQLite ----
SQLITE_SCHEMA = """
CREATE TABLE IF NOT EXISTS sync_runs (
id INTEGER PRIMARY KEY AUTOINCREMENT,
run_id TEXT UNIQUE,
started_at TEXT,
finished_at TEXT,
status TEXT,
total_orders INTEGER DEFAULT 0,
imported INTEGER DEFAULT 0,
skipped INTEGER DEFAULT 0,
errors INTEGER DEFAULT 0,
json_files INTEGER DEFAULT 0,
error_message TEXT,
already_imported INTEGER DEFAULT 0,
new_imported INTEGER DEFAULT 0
);
CREATE TABLE IF NOT EXISTS orders (
order_number TEXT PRIMARY KEY,
order_date TEXT,
customer_name TEXT,
status TEXT,
id_comanda INTEGER,
id_partener INTEGER,
id_adresa_facturare INTEGER,
id_adresa_livrare INTEGER,
error_message TEXT,
missing_skus TEXT,
items_count INTEGER,
times_skipped INTEGER DEFAULT 0,
first_seen_at TEXT DEFAULT (datetime('now')),
last_sync_run_id TEXT REFERENCES sync_runs(run_id),
updated_at TEXT DEFAULT (datetime('now')),
shipping_name TEXT,
billing_name TEXT,
payment_method TEXT,
delivery_method TEXT,
factura_serie TEXT,
factura_numar TEXT,
factura_total_fara_tva REAL,
factura_total_tva REAL,
factura_total_cu_tva REAL,
factura_data TEXT,
invoice_checked_at TEXT,
order_total REAL,
delivery_cost REAL,
discount_total REAL,
web_status TEXT,
discount_split TEXT
);
CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status);
CREATE INDEX IF NOT EXISTS idx_orders_date ON orders(order_date);
CREATE TABLE IF NOT EXISTS sync_run_orders (
sync_run_id TEXT REFERENCES sync_runs(run_id),
order_number TEXT REFERENCES orders(order_number),
status_at_run TEXT,
PRIMARY KEY (sync_run_id, order_number)
);
CREATE TABLE IF NOT EXISTS missing_skus (
sku TEXT PRIMARY KEY,
product_name TEXT,
first_seen TEXT DEFAULT (datetime('now')),
resolved INTEGER DEFAULT 0,
resolved_at TEXT,
order_count INTEGER DEFAULT 0,
order_numbers TEXT,
customers TEXT
);
CREATE TABLE IF NOT EXISTS scheduler_config (
key TEXT PRIMARY KEY,
value TEXT
);
CREATE TABLE IF NOT EXISTS web_products (
sku TEXT PRIMARY KEY,
product_name TEXT,
first_seen TEXT DEFAULT (datetime('now')),
last_seen TEXT DEFAULT (datetime('now')),
order_count INTEGER DEFAULT 0
);
CREATE TABLE IF NOT EXISTS app_settings (
key TEXT PRIMARY KEY,
value TEXT
);
CREATE TABLE IF NOT EXISTS price_sync_runs (
run_id TEXT PRIMARY KEY,
started_at TEXT,
finished_at TEXT,
status TEXT DEFAULT 'running',
products_total INTEGER DEFAULT 0,
matched INTEGER DEFAULT 0,
updated INTEGER DEFAULT 0,
errors INTEGER DEFAULT 0,
log_text TEXT
);
CREATE TABLE IF NOT EXISTS order_items (
order_number TEXT,
sku TEXT,
product_name TEXT,
quantity REAL,
price REAL,
vat REAL,
mapping_status TEXT,
codmat TEXT,
id_articol INTEGER,
cantitate_roa REAL,
created_at TEXT DEFAULT (datetime('now')),
PRIMARY KEY (order_number, sku)
);
CREATE INDEX IF NOT EXISTS idx_order_items_order ON order_items(order_number);
"""
_sqlite_db_path = None
def init_sqlite():
"""Initialize SQLite database with schema."""
global _sqlite_db_path
_sqlite_db_path = settings.SQLITE_DB_PATH
# Ensure directory exists
db_dir = os.path.dirname(_sqlite_db_path)
if db_dir:
os.makedirs(db_dir, exist_ok=True)
# Create tables synchronously
conn = sqlite3.connect(_sqlite_db_path)
# Check existing tables before running schema
cursor = conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
existing_tables = {row[0] for row in cursor.fetchall()}
# Migration: import_orders → orders (one row per order)
if 'import_orders' in existing_tables and 'orders' not in existing_tables:
logger.info("Migrating import_orders → orders schema...")
conn.executescript("""
CREATE TABLE orders (
order_number TEXT PRIMARY KEY,
order_date TEXT,
customer_name TEXT,
status TEXT,
id_comanda INTEGER,
id_partener INTEGER,
id_adresa_facturare INTEGER,
id_adresa_livrare INTEGER,
error_message TEXT,
missing_skus TEXT,
items_count INTEGER,
times_skipped INTEGER DEFAULT 0,
first_seen_at TEXT DEFAULT (datetime('now')),
last_sync_run_id TEXT,
updated_at TEXT DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status);
CREATE INDEX IF NOT EXISTS idx_orders_date ON orders(order_date);
CREATE TABLE sync_run_orders (
sync_run_id TEXT,
order_number TEXT,
status_at_run TEXT,
PRIMARY KEY (sync_run_id, order_number)
);
""")
# Copy latest record per order_number into orders
# Note: old import_orders didn't have address columns — those stay NULL
conn.execute("""
INSERT INTO orders
(order_number, order_date, customer_name, status,
id_comanda, id_partener, error_message, missing_skus,
items_count, last_sync_run_id)
SELECT io.order_number, io.order_date, io.customer_name, io.status,
io.id_comanda, io.id_partener, io.error_message, io.missing_skus,
io.items_count, io.sync_run_id
FROM import_orders io
INNER JOIN (
SELECT order_number, MAX(id) as max_id
FROM import_orders
GROUP BY order_number
) latest ON io.id = latest.max_id
""")
# Populate sync_run_orders from all import_orders rows
conn.execute("""
INSERT OR IGNORE INTO sync_run_orders (sync_run_id, order_number, status_at_run)
SELECT sync_run_id, order_number, status
FROM import_orders
WHERE sync_run_id IS NOT NULL
""")
# Migrate order_items: drop sync_run_id, change PK to (order_number, sku)
if 'order_items' in existing_tables:
conn.executescript("""
CREATE TABLE order_items_new (
order_number TEXT,
sku TEXT,
product_name TEXT,
quantity REAL,
price REAL,
vat REAL,
mapping_status TEXT,
codmat TEXT,
id_articol INTEGER,
cantitate_roa REAL,
created_at TEXT DEFAULT (datetime('now')),
PRIMARY KEY (order_number, sku)
);
INSERT OR IGNORE INTO order_items_new
(order_number, sku, product_name, quantity, price, vat,
mapping_status, codmat, id_articol, cantitate_roa, created_at)
SELECT order_number, sku, product_name, quantity, price, vat,
mapping_status, codmat, id_articol, cantitate_roa, created_at
FROM order_items;
DROP TABLE order_items;
ALTER TABLE order_items_new RENAME TO order_items;
CREATE INDEX IF NOT EXISTS idx_order_items_order ON order_items(order_number);
""")
# Rename old table instead of dropping (safety backup)
conn.execute("ALTER TABLE import_orders RENAME TO import_orders_bak")
conn.commit()
logger.info("Migration complete: import_orders → orders")
conn.executescript(SQLITE_SCHEMA)
# Migrate: add columns if missing (for existing databases)
try:
cursor = conn.execute("PRAGMA table_info(missing_skus)")
cols = {row[1] for row in cursor.fetchall()}
for col, typedef in [("order_count", "INTEGER DEFAULT 0"),
("order_numbers", "TEXT"),
("customers", "TEXT")]:
if col not in cols:
conn.execute(f"ALTER TABLE missing_skus ADD COLUMN {col} {typedef}")
logger.info(f"Migrated missing_skus: added column {col}")
# Migrate sync_runs: add columns
cursor = conn.execute("PRAGMA table_info(sync_runs)")
sync_cols = {row[1] for row in cursor.fetchall()}
if "error_message" not in sync_cols:
conn.execute("ALTER TABLE sync_runs ADD COLUMN error_message TEXT")
logger.info("Migrated sync_runs: added column error_message")
if "already_imported" not in sync_cols:
conn.execute("ALTER TABLE sync_runs ADD COLUMN already_imported INTEGER DEFAULT 0")
logger.info("Migrated sync_runs: added column already_imported")
if "new_imported" not in sync_cols:
conn.execute("ALTER TABLE sync_runs ADD COLUMN new_imported INTEGER DEFAULT 0")
logger.info("Migrated sync_runs: added column new_imported")
# Migrate orders: add shipping/billing/payment/delivery + invoice columns
cursor = conn.execute("PRAGMA table_info(orders)")
order_cols = {row[1] for row in cursor.fetchall()}
for col, typedef in [
("shipping_name", "TEXT"),
("billing_name", "TEXT"),
("payment_method", "TEXT"),
("delivery_method", "TEXT"),
("factura_serie", "TEXT"),
("factura_numar", "TEXT"),
("factura_total_fara_tva", "REAL"),
("factura_total_tva", "REAL"),
("factura_total_cu_tva", "REAL"),
("factura_data", "TEXT"),
("invoice_checked_at", "TEXT"),
("order_total", "REAL"),
("delivery_cost", "REAL"),
("discount_total", "REAL"),
("web_status", "TEXT"),
("discount_split", "TEXT"),
]:
if col not in order_cols:
conn.execute(f"ALTER TABLE orders ADD COLUMN {col} {typedef}")
logger.info(f"Migrated orders: added column {col}")
conn.commit()
except Exception as e:
logger.warning(f"Migration check failed: {e}")
conn.close()
logger.info(f"SQLite initialized: {_sqlite_db_path}")
async def get_sqlite():
"""Get async SQLite connection."""
if _sqlite_db_path is None:
raise RuntimeError("SQLite not initialized")
db = await aiosqlite.connect(_sqlite_db_path)
db.row_factory = aiosqlite.Row
return db
def get_sqlite_sync():
"""Get synchronous SQLite connection."""
if _sqlite_db_path is None:
raise RuntimeError("SQLite not initialized")
conn = sqlite3.connect(_sqlite_db_path)
conn.row_factory = sqlite3.Row
return conn

View File

@@ -1,91 +0,0 @@
from contextlib import asynccontextmanager
from datetime import datetime
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles
from pathlib import Path
import logging
import os
from .config import settings
from .database import init_oracle, close_oracle, init_sqlite
# Configure logging with both stream and file handlers
_log_level = getattr(logging, settings.LOG_LEVEL.upper(), logging.INFO)
_log_format = '%(asctime)s | %(levelname)s | %(name)s | %(message)s'
_formatter = logging.Formatter(_log_format)
_stream_handler = logging.StreamHandler()
_stream_handler.setFormatter(_formatter)
_log_dir = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'logs')
os.makedirs(_log_dir, exist_ok=True)
_log_filename = f"sync_comenzi_{datetime.now().strftime('%Y%m%d_%H%M%S')}.log"
_file_handler = logging.FileHandler(os.path.join(_log_dir, _log_filename), encoding='utf-8')
_file_handler.setFormatter(_formatter)
_root_logger = logging.getLogger()
_root_logger.setLevel(_log_level)
_root_logger.addHandler(_stream_handler)
_root_logger.addHandler(_file_handler)
logger = logging.getLogger(__name__)
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Startup and shutdown events."""
logger.info("Starting GoMag Import Manager...")
# Initialize Oracle pool
try:
init_oracle()
except Exception as e:
logger.error(f"Oracle init failed: {e}")
# Allow app to start even without Oracle for development
# Initialize SQLite
init_sqlite()
# Initialize scheduler (restore saved config)
from .services import scheduler_service, sqlite_service
scheduler_service.init_scheduler()
try:
config = await sqlite_service.get_scheduler_config()
if config.get("enabled") == "True":
interval = int(config.get("interval_minutes", "5"))
scheduler_service.start_scheduler(interval)
except Exception:
pass
logger.info("GoMag Import Manager started")
yield
# Shutdown
scheduler_service.shutdown_scheduler()
close_oracle()
logger.info("GoMag Import Manager stopped")
app = FastAPI(
title="GoMag Import Manager",
description="Import comenzi web GoMag → ROA Oracle",
version="1.0.0",
lifespan=lifespan
)
# Static files and templates
static_dir = Path(__file__).parent / "static"
templates_dir = Path(__file__).parent / "templates"
static_dir.mkdir(parents=True, exist_ok=True)
(static_dir / "css").mkdir(exist_ok=True)
(static_dir / "js").mkdir(exist_ok=True)
templates_dir.mkdir(parents=True, exist_ok=True)
app.mount("/static", StaticFiles(directory=str(static_dir)), name="static")
# Include routers
from .routers import health, dashboard, mappings, articles, validation, sync
app.include_router(health.router)
app.include_router(dashboard.router)
app.include_router(mappings.router)
app.include_router(articles.router)
app.include_router(validation.router)
app.include_router(sync.router)

View File

@@ -1,10 +0,0 @@
from fastapi import APIRouter, Query
from ..services import article_service
router = APIRouter(prefix="/api/articles", tags=["articles"])
@router.get("/search")
def search_articles(q: str = Query("", min_length=2)):
results = article_service.search_articles(q)
return {"results": results}

View File

@@ -1,21 +0,0 @@
from fastapi import APIRouter, Request
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse
from pathlib import Path
from ..services import sqlite_service
router = APIRouter()
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
@router.get("/", response_class=HTMLResponse)
async def dashboard(request: Request):
return templates.TemplateResponse("dashboard.html", {"request": request})
@router.get("/missing-skus", response_class=HTMLResponse)
async def missing_skus_page(request: Request):
return templates.TemplateResponse("missing_skus.html", {"request": request})
@router.get("/settings", response_class=HTMLResponse)
async def settings_page(request: Request):
return templates.TemplateResponse("settings.html", {"request": request})

View File

@@ -1,30 +0,0 @@
from fastapi import APIRouter
from .. import database
router = APIRouter()
@router.get("/health")
async def health_check():
result = {"oracle": "error", "sqlite": "error"}
# Check Oracle
try:
if database.pool:
with database.pool.acquire() as conn:
with conn.cursor() as cur:
cur.execute("SELECT SYSDATE FROM DUAL")
cur.fetchone()
result["oracle"] = "ok"
except Exception as e:
result["oracle"] = str(e)
# Check SQLite
try:
db = await database.get_sqlite()
await db.execute("SELECT 1")
await db.close()
result["sqlite"] = "ok"
except Exception as e:
result["sqlite"] = str(e)
return result

View File

@@ -1,189 +0,0 @@
from fastapi import APIRouter, Query, Request, UploadFile, File
from fastapi.responses import StreamingResponse, HTMLResponse, JSONResponse
from fastapi.templating import Jinja2Templates
from fastapi import HTTPException
from pydantic import BaseModel, validator
from pathlib import Path
from typing import Optional
import io
import asyncio
from ..services import mapping_service, sqlite_service
import logging
logger = logging.getLogger(__name__)
router = APIRouter(tags=["mappings"])
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
class MappingCreate(BaseModel):
sku: str
codmat: str
cantitate_roa: float = 1
@validator('sku', 'codmat')
def not_empty(cls, v):
if not v or not v.strip():
raise ValueError('nu poate fi gol')
return v.strip()
class MappingUpdate(BaseModel):
cantitate_roa: Optional[float] = None
activ: Optional[int] = None
class MappingEdit(BaseModel):
new_sku: str
new_codmat: str
cantitate_roa: float = 1
@validator('new_sku', 'new_codmat')
def not_empty(cls, v):
if not v or not v.strip():
raise ValueError('nu poate fi gol')
return v.strip()
class MappingLine(BaseModel):
codmat: str
cantitate_roa: float = 1
class MappingBatchCreate(BaseModel):
sku: str
mappings: list[MappingLine]
auto_restore: bool = False
# HTML page
@router.get("/mappings", response_class=HTMLResponse)
async def mappings_page(request: Request):
return templates.TemplateResponse("mappings.html", {"request": request})
# API endpoints
@router.get("/api/mappings")
async def list_mappings(search: str = "", page: int = 1, per_page: int = 50,
sort_by: str = "sku", sort_dir: str = "asc",
show_deleted: bool = False):
app_settings = await sqlite_service.get_app_settings()
id_pol = int(app_settings.get("id_pol") or 0) or None
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
result = mapping_service.get_mappings(search=search, page=page, per_page=per_page,
sort_by=sort_by, sort_dir=sort_dir,
show_deleted=show_deleted,
id_pol=id_pol, id_pol_productie=id_pol_productie)
# Merge product names from web_products (R4)
skus = list({m["sku"] for m in result.get("mappings", [])})
product_names = await sqlite_service.get_web_products_batch(skus)
for m in result.get("mappings", []):
m["product_name"] = product_names.get(m["sku"], "")
# Ensure counts key is always present
if "counts" not in result:
result["counts"] = {"total": 0}
return result
@router.post("/api/mappings")
async def create_mapping(data: MappingCreate):
try:
result = mapping_service.create_mapping(data.sku, data.codmat, data.cantitate_roa)
# Mark SKU as resolved in missing_skus tracking
await sqlite_service.resolve_missing_sku(data.sku)
return {"success": True, **result}
except HTTPException as e:
can_restore = e.headers.get("X-Can-Restore") == "true" if e.headers else False
resp: dict = {"error": e.detail}
if can_restore:
resp["can_restore"] = True
return JSONResponse(status_code=e.status_code, content=resp)
except Exception as e:
return {"success": False, "error": str(e)}
@router.put("/api/mappings/{sku}/{codmat}")
def update_mapping(sku: str, codmat: str, data: MappingUpdate):
try:
updated = mapping_service.update_mapping(sku, codmat, data.cantitate_roa, data.activ)
return {"success": updated}
except Exception as e:
return {"success": False, "error": str(e)}
@router.put("/api/mappings/{sku}/{codmat}/edit")
def edit_mapping(sku: str, codmat: str, data: MappingEdit):
try:
result = mapping_service.edit_mapping(sku, codmat, data.new_sku, data.new_codmat,
data.cantitate_roa)
return {"success": result}
except Exception as e:
return {"success": False, "error": str(e)}
@router.delete("/api/mappings/{sku}/{codmat}")
def delete_mapping(sku: str, codmat: str):
try:
deleted = mapping_service.delete_mapping(sku, codmat)
return {"success": deleted}
except Exception as e:
return {"success": False, "error": str(e)}
@router.post("/api/mappings/{sku}/{codmat}/restore")
def restore_mapping(sku: str, codmat: str):
try:
restored = mapping_service.restore_mapping(sku, codmat)
return {"success": restored}
except Exception as e:
return {"success": False, "error": str(e)}
@router.post("/api/mappings/batch")
async def create_batch_mapping(data: MappingBatchCreate):
"""Create multiple (sku, codmat) rows for complex sets (R11)."""
if not data.mappings:
return {"success": False, "error": "No mappings provided"}
try:
results = []
for m in data.mappings:
r = mapping_service.create_mapping(data.sku, m.codmat, m.cantitate_roa, auto_restore=data.auto_restore)
results.append(r)
# Mark SKU as resolved in missing_skus tracking
await sqlite_service.resolve_missing_sku(data.sku)
return {"success": True, "created": len(results)}
except Exception as e:
return {"success": False, "error": str(e)}
@router.get("/api/mappings/{sku}/prices")
async def get_mapping_prices(sku: str):
"""Get component prices from crm_politici_pret_art for a kit SKU."""
app_settings = await sqlite_service.get_app_settings()
id_pol = int(app_settings.get("id_pol") or 0) or None
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
if not id_pol:
return {"error": "Politica de pret nu este configurata", "prices": []}
try:
prices = await asyncio.to_thread(
mapping_service.get_component_prices, sku, id_pol, id_pol_productie
)
return {"prices": prices}
except Exception as e:
return {"error": str(e), "prices": []}
@router.post("/api/mappings/import-csv")
async def import_csv(file: UploadFile = File(...)):
content = await file.read()
text = content.decode("utf-8-sig")
result = mapping_service.import_csv(text)
return result
@router.get("/api/mappings/export-csv")
def export_csv():
csv_content = mapping_service.export_csv()
return StreamingResponse(
io.BytesIO(csv_content.encode("utf-8-sig")),
media_type="text/csv",
headers={"Content-Disposition": "attachment; filename=mappings.csv"}
)
@router.get("/api/mappings/csv-template")
def csv_template():
content = mapping_service.get_csv_template()
return StreamingResponse(
io.BytesIO(content.encode("utf-8-sig")),
media_type="text/csv",
headers={"Content-Disposition": "attachment; filename=mappings_template.csv"}
)

View File

@@ -1,791 +0,0 @@
import asyncio
import json
import logging
from datetime import datetime
logger = logging.getLogger(__name__)
from fastapi import APIRouter, Request, BackgroundTasks
from fastapi.templating import Jinja2Templates
from fastapi.responses import HTMLResponse
from pydantic import BaseModel
from pathlib import Path
from typing import Optional
from ..services import sync_service, scheduler_service, sqlite_service, invoice_service
from .. import database
router = APIRouter(tags=["sync"])
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
class ScheduleConfig(BaseModel):
enabled: bool
interval_minutes: int = 5
class AppSettingsUpdate(BaseModel):
transport_codmat: str = ""
transport_vat: str = "21"
discount_codmat: str = ""
transport_id_pol: str = ""
discount_vat: str = "21"
discount_id_pol: str = ""
id_pol: str = ""
id_pol_productie: str = ""
id_sectie: str = ""
id_gestiune: str = ""
split_discount_vat: str = ""
gomag_api_key: str = ""
gomag_api_shop: str = ""
gomag_order_days_back: str = "7"
gomag_limit: str = "100"
dashboard_poll_seconds: str = "5"
kit_pricing_mode: str = ""
kit_discount_codmat: str = ""
kit_discount_id_pol: str = ""
price_sync_enabled: str = "1"
catalog_sync_enabled: str = "0"
price_sync_schedule: str = ""
gomag_products_url: str = ""
# API endpoints
@router.post("/api/sync/start")
async def start_sync(background_tasks: BackgroundTasks):
"""Trigger a sync run in the background."""
result = await sync_service.prepare_sync()
if result.get("error"):
return {"error": result["error"], "run_id": result.get("run_id")}
run_id = result["run_id"]
background_tasks.add_task(sync_service.run_sync, run_id=run_id)
return {"message": "Sync started", "run_id": run_id}
@router.post("/api/sync/stop")
async def stop_sync():
"""Stop a running sync."""
sync_service.stop_sync()
return {"message": "Stop signal sent"}
@router.get("/api/sync/status")
async def sync_status():
"""Get current sync status with progress details and last_run info."""
status = await sync_service.get_sync_status()
# Build last_run from most recent completed/failed sync_runs row
current_run_id = status.get("run_id")
is_running = status.get("status") == "running"
last_run = None
try:
from ..database import get_sqlite
db = await get_sqlite()
try:
if current_run_id and is_running:
# Only exclude current run while it's actively running
cursor = await db.execute("""
SELECT * FROM sync_runs
WHERE status IN ('completed', 'failed') AND run_id != ?
ORDER BY started_at DESC LIMIT 1
""", (current_run_id,))
else:
cursor = await db.execute("""
SELECT * FROM sync_runs
WHERE status IN ('completed', 'failed')
ORDER BY started_at DESC LIMIT 1
""")
row = await cursor.fetchone()
if row:
row_dict = dict(row)
duration_seconds = None
if row_dict.get("started_at") and row_dict.get("finished_at"):
try:
dt_start = datetime.fromisoformat(row_dict["started_at"])
dt_end = datetime.fromisoformat(row_dict["finished_at"])
duration_seconds = int((dt_end - dt_start).total_seconds())
except (ValueError, TypeError):
pass
last_run = {
"run_id": row_dict.get("run_id"),
"started_at": row_dict.get("started_at"),
"finished_at": row_dict.get("finished_at"),
"duration_seconds": duration_seconds,
"status": row_dict.get("status"),
"imported": row_dict.get("imported", 0),
"skipped": row_dict.get("skipped", 0),
"errors": row_dict.get("errors", 0),
"already_imported": row_dict.get("already_imported", 0),
"new_imported": row_dict.get("new_imported", 0),
}
finally:
await db.close()
except Exception:
pass
# Ensure all expected keys are present
result = {
"status": status.get("status", "idle"),
"run_id": status.get("run_id"),
"started_at": status.get("started_at"),
"finished_at": status.get("finished_at"),
"phase": status.get("phase"),
"phase_text": status.get("phase_text"),
"progress_current": status.get("progress_current", 0),
"progress_total": status.get("progress_total", 0),
"counts": status.get("counts", {"imported": 0, "skipped": 0, "errors": 0}),
"last_run": last_run,
}
return result
@router.get("/api/sync/history")
async def sync_history(page: int = 1, per_page: int = 20):
"""Get sync run history."""
return await sqlite_service.get_sync_runs(page, per_page)
@router.post("/api/price-sync/start")
async def start_price_sync(background_tasks: BackgroundTasks):
"""Trigger manual catalog price sync."""
from ..services import price_sync_service
result = await price_sync_service.prepare_price_sync()
if result.get("error"):
return {"error": result["error"]}
run_id = result["run_id"]
background_tasks.add_task(price_sync_service.run_catalog_price_sync, run_id=run_id)
return {"message": "Price sync started", "run_id": run_id}
@router.get("/api/price-sync/status")
async def price_sync_status():
"""Get current price sync status."""
from ..services import price_sync_service
return await price_sync_service.get_price_sync_status()
@router.get("/api/price-sync/history")
async def price_sync_history(page: int = 1, per_page: int = 20):
"""Get price sync run history."""
return await sqlite_service.get_price_sync_runs(page, per_page)
@router.get("/logs", response_class=HTMLResponse)
async def logs_page(request: Request, run: str = None):
return templates.TemplateResponse("logs.html", {"request": request, "selected_run": run or ""})
@router.get("/api/sync/run/{run_id}")
async def sync_run_detail(run_id: str):
"""Get details for a specific sync run."""
detail = await sqlite_service.get_sync_run_detail(run_id)
if not detail:
return {"error": "Run not found"}
return detail
@router.get("/api/sync/run/{run_id}/log")
async def sync_run_log(run_id: str):
"""Get detailed log per order for a sync run."""
detail = await sqlite_service.get_sync_run_detail(run_id)
if not detail:
return {"error": "Run not found", "status_code": 404}
orders = detail.get("orders", [])
return {
"run_id": run_id,
"run": detail.get("run", {}),
"orders": [
{
"order_number": o.get("order_number"),
"order_date": o.get("order_date"),
"customer_name": o.get("customer_name"),
"items_count": o.get("items_count"),
"status": o.get("status"),
"id_comanda": o.get("id_comanda"),
"id_partener": o.get("id_partener"),
"error_message": o.get("error_message"),
"missing_skus": o.get("missing_skus"),
"order_total": o.get("order_total"),
"factura_numar": o.get("factura_numar"),
"factura_serie": o.get("factura_serie"),
}
for o in orders
]
}
def _format_text_log_from_detail(detail: dict) -> str:
"""Build a text log from SQLite stored data for completed runs."""
run = detail.get("run", {})
orders = detail.get("orders", [])
run_id = run.get("run_id", "?")
started = run.get("started_at", "")
lines = [f"=== Sync Run {run_id} ==="]
if started:
try:
dt = datetime.fromisoformat(started)
lines.append(f"Inceput: {dt.strftime('%d.%m.%Y %H:%M:%S')}")
except (ValueError, TypeError):
lines.append(f"Inceput: {started}")
lines.append("")
for o in orders:
status = (o.get("status") or "").upper()
number = o.get("order_number", "?")
customer = o.get("customer_name", "?")
order_date = o.get("order_date") or "?"
if status == "IMPORTED":
id_cmd = o.get("id_comanda", "?")
lines.append(f"#{number} [{order_date}] {customer} → IMPORTAT (ID: {id_cmd})")
elif status == "ALREADY_IMPORTED":
id_cmd = o.get("id_comanda", "?")
lines.append(f"#{number} [{order_date}] {customer} → DEJA IMPORTAT (ID: {id_cmd})")
elif status == "SKIPPED":
missing = o.get("missing_skus", "")
if isinstance(missing, str):
try:
missing = json.loads(missing)
except (json.JSONDecodeError, TypeError):
missing = [missing] if missing else []
skus_str = ", ".join(missing) if isinstance(missing, list) else str(missing)
lines.append(f"#{number} [{order_date}] {customer} → OMIS (lipsa: {skus_str})")
elif status == "ERROR":
err = o.get("error_message", "necunoscuta")
lines.append(f"#{number} [{order_date}] {customer} → EROARE: {err}")
# Summary line
lines.append("")
total = run.get("total_orders", 0)
imported = run.get("imported", 0)
skipped = run.get("skipped", 0)
errors = run.get("errors", 0)
duration_str = ""
finished = run.get("finished_at", "")
if started and finished:
try:
dt_start = datetime.fromisoformat(started)
dt_end = datetime.fromisoformat(finished)
secs = int((dt_end - dt_start).total_seconds())
duration_str = f" | Durata: {secs}s"
except (ValueError, TypeError):
pass
already = run.get("already_imported", 0)
new_imp = run.get("new_imported", 0)
if already:
lines.append(f"Finalizat: {new_imp} importate, {already} deja importate, {skipped} nemapate, {errors} erori din {total} comenzi{duration_str}")
else:
lines.append(f"Finalizat: {imported} importate, {skipped} nemapate, {errors} erori din {total} comenzi{duration_str}")
return "\n".join(lines)
@router.get("/api/sync/run/{run_id}/text-log")
async def sync_run_text_log(run_id: str):
"""Get text log for a sync run - live from memory or reconstructed from SQLite."""
# Check in-memory first (active/recent runs)
live_log = sync_service.get_run_text_log(run_id)
if live_log is not None:
status = "running"
current = await sync_service.get_sync_status()
if current.get("run_id") != run_id or current.get("status") != "running":
status = "completed"
return {"text": live_log, "status": status, "finished": status != "running"}
# Fall back to SQLite for historical runs
detail = await sqlite_service.get_sync_run_detail(run_id)
if not detail:
return {"error": "Run not found", "text": "", "status": "unknown", "finished": True}
run = detail.get("run", {})
text = _format_text_log_from_detail(detail)
status = run.get("status", "completed")
return {"text": text, "status": status, "finished": True}
@router.get("/api/sync/run/{run_id}/orders")
async def sync_run_orders(run_id: str, status: str = "all", page: int = 1, per_page: int = 50,
sort_by: str = "order_date", sort_dir: str = "desc"):
"""Get filtered, paginated orders for a sync run (R1)."""
return await sqlite_service.get_run_orders_filtered(run_id, status, page, per_page,
sort_by=sort_by, sort_dir=sort_dir)
def _get_articole_terti_for_skus(skus: set) -> dict:
"""Query ARTICOLE_TERTI for all active codmat/cantitate per SKU."""
from .. import database
result = {}
sku_list = list(skus)
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for i in range(0, len(sku_list), 500):
batch = sku_list[i:i+500]
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
params = {f"s{j}": sku for j, sku in enumerate(batch)}
cur.execute(f"""
SELECT at.sku, at.codmat, at.cantitate_roa,
na.denumire
FROM ARTICOLE_TERTI at
LEFT JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
WHERE at.sku IN ({placeholders}) AND at.activ = 1 AND at.sters = 0
ORDER BY at.sku, at.codmat
""", params)
for row in cur:
sku = row[0]
if sku not in result:
result[sku] = []
result[sku].append({
"codmat": row[1],
"cantitate_roa": float(row[2]) if row[2] else 1,
"denumire": row[3] or ""
})
finally:
database.pool.release(conn)
return result
def _get_nom_articole_for_direct_skus(skus: set) -> dict:
"""Query NOM_ARTICOLE for SKUs that exist directly as CODMAT (direct mapping)."""
from .. import database
result = {}
sku_list = list(skus)
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for i in range(0, len(sku_list), 500):
batch = sku_list[i:i+500]
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
params = {f"s{j}": sku for j, sku in enumerate(batch)}
cur.execute(f"""
SELECT codmat, denumire FROM NOM_ARTICOLE
WHERE codmat IN ({placeholders}) AND sters = 0 AND inactiv = 0
""", params)
for row in cur:
result[row[0]] = row[1] or ""
finally:
database.pool.release(conn)
return result
@router.get("/api/sync/order/{order_number}")
async def order_detail(order_number: str):
"""Get order detail with line items (R9), enriched with ARTICOLE_TERTI data."""
detail = await sqlite_service.get_order_detail(order_number)
if not detail:
return {"error": "Order not found"}
# Enrich items with ARTICOLE_TERTI mappings from Oracle
items = detail.get("items", [])
skus = {item["sku"] for item in items if item.get("sku")}
if skus:
codmat_map = await asyncio.to_thread(_get_articole_terti_for_skus, skus)
for item in items:
sku = item.get("sku")
if sku and sku in codmat_map:
item["codmat_details"] = codmat_map[sku]
# Enrich remaining SKUs via NOM_ARTICOLE (fallback for stale mapping_status)
remaining_skus = {item["sku"] for item in items
if item.get("sku") and not item.get("codmat_details")}
if remaining_skus:
nom_map = await asyncio.to_thread(_get_nom_articole_for_direct_skus, remaining_skus)
for item in items:
sku = item.get("sku")
if sku and sku in nom_map and not item.get("codmat_details"):
item["codmat_details"] = [{
"codmat": sku,
"cantitate_roa": 1,
"denumire": nom_map[sku],
"direct": True
}]
# Enrich with invoice data
order = detail.get("order", {})
if order.get("factura_numar") and order.get("factura_data"):
order["invoice"] = {
"facturat": True,
"serie_act": order.get("factura_serie"),
"numar_act": order.get("factura_numar"),
"data_act": order.get("factura_data"),
"total_fara_tva": order.get("factura_total_fara_tva"),
"total_tva": order.get("factura_total_tva"),
"total_cu_tva": order.get("factura_total_cu_tva"),
}
elif order.get("id_comanda"):
# Check Oracle live
try:
inv_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, [order["id_comanda"]]
)
inv = inv_data.get(order["id_comanda"])
if inv and inv.get("facturat"):
order["invoice"] = inv
await sqlite_service.update_order_invoice(
order_number,
serie=inv.get("serie_act"),
numar=str(inv.get("numar_act", "")),
total_fara_tva=inv.get("total_fara_tva"),
total_tva=inv.get("total_tva"),
total_cu_tva=inv.get("total_cu_tva"),
data_act=inv.get("data_act"),
)
except Exception:
pass
# Parse discount_split JSON string
if order.get("discount_split"):
try:
order["discount_split"] = json.loads(order["discount_split"])
except (json.JSONDecodeError, TypeError):
pass
# Add settings for receipt display
app_settings = await sqlite_service.get_app_settings()
order["transport_vat"] = app_settings.get("transport_vat") or "21"
order["transport_codmat"] = app_settings.get("transport_codmat") or ""
order["discount_codmat"] = app_settings.get("discount_codmat") or ""
return detail
@router.get("/api/dashboard/orders")
async def dashboard_orders(page: int = 1, per_page: int = 50,
search: str = "", status: str = "all",
sort_by: str = "order_date", sort_dir: str = "desc",
period_days: int = 7,
period_start: str = "", period_end: str = ""):
"""Get orders for dashboard, enriched with invoice data.
period_days=0 with period_start/period_end uses custom date range.
period_days=0 without dates means all time.
"""
is_uninvoiced_filter = (status == "UNINVOICED")
is_invoiced_filter = (status == "INVOICED")
# For UNINVOICED/INVOICED: fetch all IMPORTED orders, then filter post-invoice-check
fetch_status = "IMPORTED" if (is_uninvoiced_filter or is_invoiced_filter) else status
fetch_per_page = 10000 if (is_uninvoiced_filter or is_invoiced_filter) else per_page
fetch_page = 1 if (is_uninvoiced_filter or is_invoiced_filter) else page
result = await sqlite_service.get_orders(
page=fetch_page, per_page=fetch_per_page, search=search,
status_filter=fetch_status, sort_by=sort_by, sort_dir=sort_dir,
period_days=period_days,
period_start=period_start if period_days == 0 else "",
period_end=period_end if period_days == 0 else "",
)
# Enrich orders with invoice data — prefer SQLite cache, fallback to Oracle
all_orders = result["orders"]
for o in all_orders:
if o.get("factura_numar") and o.get("factura_data"):
# Use cached invoice data from SQLite (only if complete)
o["invoice"] = {
"facturat": True,
"serie_act": o.get("factura_serie"),
"numar_act": o.get("factura_numar"),
"total_fara_tva": o.get("factura_total_fara_tva"),
"total_tva": o.get("factura_total_tva"),
"total_cu_tva": o.get("factura_total_cu_tva"),
"data_act": o.get("factura_data"),
}
else:
o["invoice"] = None
# For orders without cached invoice, check Oracle (only uncached imported orders)
uncached_orders = [o for o in all_orders if o.get("id_comanda") and not o.get("invoice")]
if uncached_orders:
try:
id_comanda_list = [o["id_comanda"] for o in uncached_orders]
invoice_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, id_comanda_list
)
for o in uncached_orders:
idc = o.get("id_comanda")
if idc and idc in invoice_data:
o["invoice"] = invoice_data[idc]
# Update SQLite cache so counts stay accurate
inv = invoice_data[idc]
if inv.get("facturat"):
await sqlite_service.update_order_invoice(
o["order_number"],
serie=inv.get("serie_act"),
numar=str(inv.get("numar_act", "")),
total_fara_tva=inv.get("total_fara_tva"),
total_tva=inv.get("total_tva"),
total_cu_tva=inv.get("total_cu_tva"),
data_act=inv.get("data_act"),
)
except Exception:
pass
# Add shipping/billing name fields + is_different_person flag
s_name = o.get("shipping_name") or ""
b_name = o.get("billing_name") or ""
o["shipping_name"] = s_name
o["billing_name"] = b_name
o["is_different_person"] = bool(s_name and b_name and s_name != b_name)
# Use counts from sqlite_service (already period-scoped)
counts = result.get("counts", {})
# Count newly-cached invoices found during this request
newly_invoiced = sum(1 for o in uncached_orders if o.get("invoice") and o["invoice"].get("facturat"))
# Adjust uninvoiced count: start from SQLite count, subtract newly-found invoices
uninvoiced_base = counts.get("uninvoiced_sqlite", sum(
1 for o in all_orders
if o.get("status") in ("IMPORTED", "ALREADY_IMPORTED") and not o.get("invoice")
))
counts["nefacturate"] = max(0, uninvoiced_base - newly_invoiced)
imported_total = counts.get("imported_all") or counts.get("imported", 0)
counts["facturate"] = max(0, imported_total - counts["nefacturate"])
counts.setdefault("total", counts.get("imported", 0) + counts.get("skipped", 0) + counts.get("error", 0))
# For UNINVOICED filter: apply server-side filtering + pagination
if is_uninvoiced_filter:
filtered = [o for o in all_orders if o.get("status") in ("IMPORTED", "ALREADY_IMPORTED") and not o.get("invoice")]
total = len(filtered)
offset = (page - 1) * per_page
result["orders"] = filtered[offset:offset + per_page]
result["total"] = total
result["page"] = page
result["per_page"] = per_page
result["pages"] = (total + per_page - 1) // per_page if total > 0 else 0
elif is_invoiced_filter:
filtered = [o for o in all_orders if o.get("status") in ("IMPORTED", "ALREADY_IMPORTED") and o.get("invoice")]
total = len(filtered)
offset = (page - 1) * per_page
result["orders"] = filtered[offset:offset + per_page]
result["total"] = total
result["page"] = page
result["per_page"] = per_page
result["pages"] = (total + per_page - 1) // per_page if total > 0 else 0
# Reshape response
return {
"orders": result["orders"],
"pagination": {
"page": result.get("page", page),
"per_page": result.get("per_page", per_page),
"total_pages": result.get("pages", 0),
},
"counts": counts,
}
@router.post("/api/dashboard/refresh-invoices")
async def refresh_invoices():
"""Force-refresh invoice/order status from Oracle.
Checks:
1. Uninvoiced orders → did they get invoiced?
2. Invoiced orders → was the invoice deleted?
3. All imported orders → was the order deleted from ROA?
"""
try:
invoices_added = 0
invoices_cleared = 0
orders_deleted = 0
# 1. Check uninvoiced → new invoices
uninvoiced = await sqlite_service.get_uninvoiced_imported_orders()
if uninvoiced:
id_comanda_list = [o["id_comanda"] for o in uninvoiced]
invoice_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, id_comanda_list
)
id_to_order = {o["id_comanda"]: o["order_number"] for o in uninvoiced}
for idc, inv in invoice_data.items():
order_num = id_to_order.get(idc)
if order_num and inv.get("facturat"):
await sqlite_service.update_order_invoice(
order_num,
serie=inv.get("serie_act"),
numar=str(inv.get("numar_act", "")),
total_fara_tva=inv.get("total_fara_tva"),
total_tva=inv.get("total_tva"),
total_cu_tva=inv.get("total_cu_tva"),
data_act=inv.get("data_act"),
)
invoices_added += 1
# 2. Check invoiced → deleted invoices
invoiced = await sqlite_service.get_invoiced_imported_orders()
if invoiced:
id_comanda_list = [o["id_comanda"] for o in invoiced]
invoice_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, id_comanda_list
)
for o in invoiced:
if o["id_comanda"] not in invoice_data:
await sqlite_service.clear_order_invoice(o["order_number"])
invoices_cleared += 1
# 3. Check all imported → deleted orders in ROA
all_imported = await sqlite_service.get_all_imported_orders()
if all_imported:
id_comanda_list = [o["id_comanda"] for o in all_imported]
existing_ids = await asyncio.to_thread(
invoice_service.check_orders_exist, id_comanda_list
)
for o in all_imported:
if o["id_comanda"] not in existing_ids:
await sqlite_service.mark_order_deleted_in_roa(o["order_number"])
orders_deleted += 1
checked = len(uninvoiced) + len(invoiced) + len(all_imported)
return {
"checked": checked,
"invoices_added": invoices_added,
"invoices_cleared": invoices_cleared,
"orders_deleted": orders_deleted,
}
except Exception as e:
return {"error": str(e), "invoices_added": 0}
@router.put("/api/sync/schedule")
async def update_schedule(config: ScheduleConfig):
"""Update scheduler configuration."""
if config.enabled:
scheduler_service.start_scheduler(config.interval_minutes)
else:
scheduler_service.stop_scheduler()
# Persist config
await sqlite_service.set_scheduler_config("enabled", str(config.enabled))
await sqlite_service.set_scheduler_config("interval_minutes", str(config.interval_minutes))
return scheduler_service.get_scheduler_status()
@router.get("/api/sync/schedule")
async def get_schedule():
"""Get current scheduler status."""
return scheduler_service.get_scheduler_status()
@router.get("/api/settings")
async def get_app_settings():
"""Get application settings."""
from ..config import settings as config_settings
s = await sqlite_service.get_app_settings()
return {
"transport_codmat": s.get("transport_codmat", ""),
"transport_vat": s.get("transport_vat", "21"),
"discount_codmat": s.get("discount_codmat", ""),
"transport_id_pol": s.get("transport_id_pol", ""),
"discount_vat": s.get("discount_vat", "21"),
"discount_id_pol": s.get("discount_id_pol", ""),
"id_pol": s.get("id_pol", ""),
"id_pol_productie": s.get("id_pol_productie", ""),
"id_sectie": s.get("id_sectie", ""),
"id_gestiune": s.get("id_gestiune", ""),
"split_discount_vat": s.get("split_discount_vat", ""),
"gomag_api_key": s.get("gomag_api_key", "") or config_settings.GOMAG_API_KEY,
"gomag_api_shop": s.get("gomag_api_shop", "") or config_settings.GOMAG_API_SHOP,
"gomag_order_days_back": s.get("gomag_order_days_back", "") or str(config_settings.GOMAG_ORDER_DAYS_BACK),
"gomag_limit": s.get("gomag_limit", "") or str(config_settings.GOMAG_LIMIT),
"dashboard_poll_seconds": s.get("dashboard_poll_seconds", "5"),
"kit_pricing_mode": s.get("kit_pricing_mode", ""),
"kit_discount_codmat": s.get("kit_discount_codmat", ""),
"kit_discount_id_pol": s.get("kit_discount_id_pol", ""),
"price_sync_enabled": s.get("price_sync_enabled", "1"),
"catalog_sync_enabled": s.get("catalog_sync_enabled", "0"),
"price_sync_schedule": s.get("price_sync_schedule", ""),
"gomag_products_url": s.get("gomag_products_url", ""),
}
@router.put("/api/settings")
async def update_app_settings(config: AppSettingsUpdate):
"""Update application settings."""
await sqlite_service.set_app_setting("transport_codmat", config.transport_codmat)
await sqlite_service.set_app_setting("transport_vat", config.transport_vat)
await sqlite_service.set_app_setting("discount_codmat", config.discount_codmat)
await sqlite_service.set_app_setting("transport_id_pol", config.transport_id_pol)
await sqlite_service.set_app_setting("discount_vat", config.discount_vat)
await sqlite_service.set_app_setting("discount_id_pol", config.discount_id_pol)
await sqlite_service.set_app_setting("id_pol", config.id_pol)
await sqlite_service.set_app_setting("id_pol_productie", config.id_pol_productie)
await sqlite_service.set_app_setting("id_sectie", config.id_sectie)
await sqlite_service.set_app_setting("id_gestiune", config.id_gestiune)
await sqlite_service.set_app_setting("split_discount_vat", config.split_discount_vat)
await sqlite_service.set_app_setting("gomag_api_key", config.gomag_api_key)
await sqlite_service.set_app_setting("gomag_api_shop", config.gomag_api_shop)
await sqlite_service.set_app_setting("gomag_order_days_back", config.gomag_order_days_back)
await sqlite_service.set_app_setting("gomag_limit", config.gomag_limit)
await sqlite_service.set_app_setting("dashboard_poll_seconds", config.dashboard_poll_seconds)
await sqlite_service.set_app_setting("kit_pricing_mode", config.kit_pricing_mode)
await sqlite_service.set_app_setting("kit_discount_codmat", config.kit_discount_codmat)
await sqlite_service.set_app_setting("kit_discount_id_pol", config.kit_discount_id_pol)
await sqlite_service.set_app_setting("price_sync_enabled", config.price_sync_enabled)
await sqlite_service.set_app_setting("catalog_sync_enabled", config.catalog_sync_enabled)
await sqlite_service.set_app_setting("price_sync_schedule", config.price_sync_schedule)
await sqlite_service.set_app_setting("gomag_products_url", config.gomag_products_url)
return {"success": True}
@router.get("/api/settings/gestiuni")
async def get_gestiuni():
"""Get list of warehouses from Oracle for dropdown."""
def _query():
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
cur.execute(
"SELECT id_gestiune, nume_gestiune FROM nom_gestiuni WHERE sters=0 AND inactiv=0 ORDER BY id_gestiune"
)
return [{"id": str(row[0]), "label": f"{row[0]} - {row[1]}"} for row in cur]
finally:
database.pool.release(conn)
try:
return await asyncio.to_thread(_query)
except Exception as e:
logger.error(f"get_gestiuni error: {e}")
return []
@router.get("/api/settings/sectii")
async def get_sectii():
"""Get list of sections from Oracle for dropdown."""
def _query():
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
cur.execute(
"SELECT id_sectie, sectie FROM nom_sectii WHERE sters=0 AND inactiv=0 ORDER BY id_sectie"
)
return [{"id": str(row[0]), "label": f"{row[0]} - {row[1]}"} for row in cur]
finally:
database.pool.release(conn)
try:
return await asyncio.to_thread(_query)
except Exception as e:
logger.error(f"get_sectii error: {e}")
return []
@router.get("/api/settings/politici")
async def get_politici():
"""Get list of price policies from Oracle for dropdown."""
def _query():
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
cur.execute(
"SELECT id_pol, nume_lista_preturi FROM crm_politici_preturi WHERE sters=0 ORDER BY id_pol"
)
return [{"id": str(row[0]), "label": f"{row[0]} - {row[1]}"} for row in cur]
finally:
database.pool.release(conn)
try:
return await asyncio.to_thread(_query)
except Exception as e:
logger.error(f"get_politici error: {e}")
return []

View File

@@ -1,156 +0,0 @@
import csv
import io
import json
from fastapi import APIRouter, Query
from fastapi.responses import StreamingResponse
from ..services import order_reader, validation_service, sqlite_service
from ..database import get_sqlite
router = APIRouter(prefix="/api/validate", tags=["validation"])
@router.post("/scan")
async def scan_and_validate():
"""Scan JSON files and validate all SKUs."""
orders, json_count = order_reader.read_json_orders()
if not orders:
return {
"orders": 0, "json_files": json_count, "skus": {}, "message": "No orders found",
"total_skus_scanned": 0, "new_missing": 0, "auto_resolved": 0, "unchanged": 0,
}
all_skus = order_reader.get_all_skus(orders)
result = validation_service.validate_skus(all_skus)
importable, skipped = validation_service.classify_orders(orders, result)
# Build SKU context from skipped orders and track missing SKUs
sku_context = {} # sku -> {order_numbers: [], customers: []}
for order, missing_list in skipped:
customer = order.billing.company_name or f"{order.billing.firstname} {order.billing.lastname}"
for sku in missing_list:
if sku not in sku_context:
sku_context[sku] = {"order_numbers": [], "customers": []}
sku_context[sku]["order_numbers"].append(order.number)
if customer not in sku_context[sku]["customers"]:
sku_context[sku]["customers"].append(customer)
new_missing = 0
for sku in result["missing"]:
# Find product name from orders
product_name = ""
for order in orders:
for item in order.items:
if item.sku == sku:
product_name = item.name
break
if product_name:
break
ctx = sku_context.get(sku, {})
tracked = await sqlite_service.track_missing_sku(
sku=sku,
product_name=product_name,
order_count=len(ctx.get("order_numbers", [])),
order_numbers=json.dumps(ctx.get("order_numbers", [])),
customers=json.dumps(ctx.get("customers", []))
)
if tracked:
new_missing += 1
total_skus_scanned = len(all_skus)
new_missing_count = len(result["missing"])
unchanged = total_skus_scanned - new_missing_count
return {
"json_files": json_count,
"total_orders": len(orders),
"total_skus": len(all_skus),
"importable": len(importable),
"skipped": len(skipped),
"new_orders": len(importable),
# Fields consumed by the rescan progress banner in missing_skus.html
"total_skus_scanned": total_skus_scanned,
"new_missing": new_missing_count,
"auto_resolved": 0,
"unchanged": unchanged,
"skus": {
"mapped": len(result["mapped"]),
"direct": len(result["direct"]),
"missing": len(result["missing"]),
"missing_list": sorted(result["missing"]),
"total_skus": len(all_skus),
"mapped_skus": len(result["mapped"]),
"direct_skus": len(result["direct"])
},
"skipped_orders": [
{
"number": order.number,
"customer": order.billing.company_name or f"{order.billing.firstname} {order.billing.lastname}",
"items_count": len(order.items),
"missing_skus": missing
}
for order, missing in skipped[:50] # limit to 50
]
}
@router.get("/missing-skus")
async def get_missing_skus(
page: int = Query(1, ge=1),
per_page: int = Query(20, ge=1, le=100),
resolved: int = Query(0, ge=-1, le=1),
search: str = Query(None)
):
"""Get paginated missing SKUs. resolved=-1 means show all (R10).
Optional search filters by sku or product_name."""
db = await get_sqlite()
try:
# Compute counts across ALL records (unfiltered by search)
cursor = await db.execute("SELECT COUNT(*) FROM missing_skus WHERE resolved = 0")
unresolved_count = (await cursor.fetchone())[0]
cursor = await db.execute("SELECT COUNT(*) FROM missing_skus WHERE resolved = 1")
resolved_count = (await cursor.fetchone())[0]
cursor = await db.execute("SELECT COUNT(*) FROM missing_skus")
total_count = (await cursor.fetchone())[0]
finally:
await db.close()
counts = {
"total": total_count,
"unresolved": unresolved_count,
"resolved": resolved_count,
}
result = await sqlite_service.get_missing_skus_paginated(page, per_page, resolved, search=search)
# Backward compat
result["unresolved"] = unresolved_count
result["counts"] = counts
# rename key for JS consistency
result["skus"] = result.get("missing_skus", [])
return result
@router.get("/missing-skus-csv")
async def export_missing_skus_csv():
"""Export missing SKUs as CSV compatible with mapping import (R8)."""
db = await get_sqlite()
try:
cursor = await db.execute("""
SELECT sku, product_name, first_seen, resolved
FROM missing_skus WHERE resolved = 0
ORDER BY first_seen DESC
""")
rows = await cursor.fetchall()
finally:
await db.close()
output = io.StringIO()
writer = csv.writer(output)
writer.writerow(["sku", "codmat", "cantitate_roa", "procent_pret", "product_name"])
for row in rows:
writer.writerow([row["sku"], "", "", "", row["product_name"] or ""])
return StreamingResponse(
io.BytesIO(output.getvalue().encode("utf-8-sig")),
media_type="text/csv",
headers={"Content-Disposition": "attachment; filename=missing_skus.csv"}
)

View File

@@ -1,28 +0,0 @@
import logging
from fastapi import HTTPException
from .. import database
logger = logging.getLogger(__name__)
def search_articles(query: str, limit: int = 20):
"""Search articles in NOM_ARTICOLE by codmat or denumire."""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
if not query or len(query) < 2:
return []
with database.pool.acquire() as conn:
with conn.cursor() as cur:
cur.execute("""
SELECT id_articol, codmat, denumire, um
FROM nom_articole
WHERE (UPPER(codmat) LIKE UPPER(:q) || '%'
OR UPPER(denumire) LIKE '%' || UPPER(:q) || '%')
AND sters = 0 AND inactiv = 0
AND ROWNUM <= :lim
ORDER BY CASE WHEN UPPER(codmat) LIKE UPPER(:q) || '%' THEN 0 ELSE 1 END, codmat
""", {"q": query, "lim": limit})
columns = [col[0].lower() for col in cur.description]
return [dict(zip(columns, row)) for row in cur.fetchall()]

View File

@@ -1,182 +0,0 @@
"""GoMag API client - downloads orders and saves them as JSON files."""
import asyncio
import json
import logging
from datetime import datetime, timedelta
from pathlib import Path
from typing import Callable
import httpx
from ..config import settings
logger = logging.getLogger(__name__)
async def download_orders(
json_dir: str,
days_back: int = None,
api_key: str = None,
api_shop: str = None,
limit: int = None,
log_fn: Callable[[str], None] = None,
) -> dict:
"""Download orders from GoMag API and save as JSON files.
Returns dict with keys: pages, total, files (list of saved file paths).
If API keys are not configured, returns immediately with empty result.
Optional api_key, api_shop, limit override config.settings values.
"""
def _log(msg: str):
logger.info(msg)
if log_fn:
log_fn(msg)
effective_key = api_key or settings.GOMAG_API_KEY
effective_shop = api_shop or settings.GOMAG_API_SHOP
effective_limit = limit or settings.GOMAG_LIMIT
if not effective_key or not effective_shop:
_log("GoMag API keys neconfigurați, skip download")
return {"pages": 0, "total": 0, "files": []}
if days_back is None:
days_back = settings.GOMAG_ORDER_DAYS_BACK
start_date = (datetime.now() - timedelta(days=days_back)).strftime("%Y-%m-%d")
out_dir = Path(json_dir)
out_dir.mkdir(parents=True, exist_ok=True)
# Clean old JSON files before downloading new ones
old_files = list(out_dir.glob("gomag_orders*.json"))
if old_files:
for f in old_files:
f.unlink()
_log(f"Șterse {len(old_files)} fișiere JSON vechi")
headers = {
"Apikey": effective_key,
"ApiShop": effective_shop,
"User-Agent": "Mozilla/5.0",
"Content-Type": "application/json",
}
saved_files = []
total_orders = 0
total_pages = 1
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
async with httpx.AsyncClient(timeout=30) as client:
page = 1
while page <= total_pages:
params = {
"startDate": start_date,
"page": page,
"limit": effective_limit,
}
try:
response = await client.get(settings.GOMAG_API_URL, headers=headers, params=params)
response.raise_for_status()
data = response.json()
except httpx.HTTPError as e:
_log(f"GoMag API eroare pagina {page}: {e}")
break
except Exception as e:
_log(f"GoMag eroare neașteptată pagina {page}: {e}")
break
# Update totals from first page response
if page == 1:
total_orders = int(data.get("total", 0))
total_pages = int(data.get("pages", 1))
_log(f"GoMag: {total_orders} comenzi în {total_pages} pagini (startDate={start_date})")
filename = out_dir / f"gomag_orders_page{page}_{timestamp}.json"
filename.write_text(json.dumps(data, ensure_ascii=False, indent=2), encoding="utf-8")
saved_files.append(str(filename))
_log(f"GoMag: pagina {page}/{total_pages} salvată → {filename.name}")
page += 1
if page <= total_pages:
await asyncio.sleep(1)
return {"pages": total_pages, "total": total_orders, "files": saved_files}
async def download_products(
api_key: str = None,
api_shop: str = None,
products_url: str = None,
log_fn: Callable[[str], None] = None,
) -> list[dict]:
"""Download all products from GoMag Products API.
Returns list of product dicts with: sku, price, vat, vat_included, bundleItems.
"""
def _log(msg: str):
logger.info(msg)
if log_fn:
log_fn(msg)
effective_key = api_key or settings.GOMAG_API_KEY
effective_shop = api_shop or settings.GOMAG_API_SHOP
default_url = "https://api.gomag.ro/api/v1/product/read/json"
effective_url = products_url or default_url
if not effective_key or not effective_shop:
_log("GoMag API keys neconfigurați, skip product download")
return []
headers = {
"Apikey": effective_key,
"ApiShop": effective_shop,
"User-Agent": "Mozilla/5.0",
"Content-Type": "application/json",
}
all_products = []
total_pages = 1
async with httpx.AsyncClient(timeout=30) as client:
page = 1
while page <= total_pages:
params = {"page": page, "limit": 100}
try:
response = await client.get(effective_url, headers=headers, params=params)
response.raise_for_status()
data = response.json()
except httpx.HTTPError as e:
_log(f"GoMag Products API eroare pagina {page}: {e}")
break
except Exception as e:
_log(f"GoMag Products eroare neașteptată pagina {page}: {e}")
break
if page == 1:
total_pages = int(data.get("pages", 1))
_log(f"GoMag Products: {data.get('total', '?')} produse în {total_pages} pagini")
products = data.get("products", [])
if isinstance(products, dict):
# GoMag returns products as {"1": {...}, "2": {...}} dict
first_val = next(iter(products.values()), None) if products else None
if isinstance(first_val, dict):
products = list(products.values())
else:
products = [products]
if isinstance(products, list):
for p in products:
if isinstance(p, dict) and p.get("sku"):
all_products.append({
"sku": p["sku"],
"price": p.get("price", "0"),
"vat": p.get("vat", "19"),
"vat_included": str(p.get("vat_included", "1")),
"bundleItems": p.get("bundleItems", []),
})
page += 1
if page <= total_pages:
await asyncio.sleep(1)
_log(f"GoMag Products: {len(all_products)} produse cu SKU descărcate")
return all_products

View File

@@ -1,451 +0,0 @@
import html
import json
import logging
import oracledb
from datetime import datetime, timedelta
from .. import database
logger = logging.getLogger(__name__)
# Diacritics to ASCII mapping (Romanian)
_DIACRITICS = str.maketrans({
'\u0103': 'a', # ă
'\u00e2': 'a', # â
'\u00ee': 'i', # î
'\u0219': 's', # ș
'\u021b': 't', # ț
'\u0102': 'A', # Ă
'\u00c2': 'A', # Â
'\u00ce': 'I', # Î
'\u0218': 'S', # Ș
'\u021a': 'T', # Ț
# Older Unicode variants
'\u015f': 's', # ş (cedilla)
'\u0163': 't', # ţ (cedilla)
'\u015e': 'S', # Ş
'\u0162': 'T', # Ţ
})
def clean_web_text(text: str) -> str:
"""Port of VFP CleanWebText: unescape HTML entities + diacritics to ASCII."""
if not text:
return ""
result = html.unescape(text)
result = result.translate(_DIACRITICS)
# Remove any remaining <br> tags
for br in ('<br>', '<br/>', '<br />'):
result = result.replace(br, ' ')
return result.strip()
def convert_web_date(date_str: str) -> datetime:
"""Port of VFP ConvertWebDate: parse web date to datetime."""
if not date_str:
return datetime.now()
try:
return datetime.strptime(date_str.strip(), '%Y-%m-%d %H:%M:%S')
except ValueError:
try:
return datetime.strptime(date_str.strip()[:10], '%Y-%m-%d')
except ValueError:
return datetime.now()
def format_address_for_oracle(address: str, city: str, region: str) -> str:
"""Port of VFP FormatAddressForOracle."""
region_clean = clean_web_text(region)
city_clean = clean_web_text(city)
address_clean = clean_web_text(address)
return f"JUD:{region_clean};{city_clean};{address_clean}"
def compute_discount_split(order, settings: dict) -> dict | None:
"""Compute proportional discount split by VAT rate from order items.
Returns: {"11": 3.98, "21": 1.43} or None if split not applicable.
Only splits when split_discount_vat is enabled AND multiple VAT rates exist.
When single VAT rate: returns {actual_rate: total} (smarter than GoMag's fixed 21%).
"""
if not order or order.discount_total <= 0:
return None
split_enabled = settings.get("split_discount_vat") == "1"
# Calculate VAT distribution from order items (exclude zero-value)
vat_totals = {}
for item in order.items:
item_value = abs(item.price * item.quantity)
if item_value > 0:
vat_key = str(int(item.vat)) if item.vat == int(item.vat) else str(item.vat)
vat_totals[vat_key] = vat_totals.get(vat_key, 0) + item_value
if not vat_totals:
return None
grand_total = sum(vat_totals.values())
if grand_total <= 0:
return None
if len(vat_totals) == 1:
# Single VAT rate — use that rate (smarter than GoMag's fixed 21%)
actual_vat = list(vat_totals.keys())[0]
return {actual_vat: round(order.discount_total, 2)}
if not split_enabled:
return None
# Multiple VAT rates — split proportionally
result = {}
discount_remaining = order.discount_total
sorted_rates = sorted(vat_totals.keys(), key=lambda x: float(x))
for i, vat_rate in enumerate(sorted_rates):
if i == len(sorted_rates) - 1:
split_amount = round(discount_remaining, 2) # last gets remainder
else:
proportion = vat_totals[vat_rate] / grand_total
split_amount = round(order.discount_total * proportion, 2)
discount_remaining -= split_amount
if split_amount > 0:
result[vat_rate] = split_amount
return result if result else None
def build_articles_json(items, order=None, settings=None) -> str:
"""Build JSON string for Oracle PACK_IMPORT_COMENZI.importa_comanda.
Includes transport and discount as extra articles if configured.
Supports per-article id_pol from codmat_policy_map and discount VAT splitting."""
articles = []
codmat_policy_map = settings.get("_codmat_policy_map", {}) if settings else {}
default_id_pol = settings.get("id_pol", "") if settings else ""
for item in items:
article_dict = {
"sku": item.sku,
"quantity": str(item.quantity),
"price": str(item.price),
"vat": str(item.vat),
"name": clean_web_text(item.name)
}
# Per-article id_pol from dual-policy validation
item_pol = codmat_policy_map.get(item.sku)
if item_pol and str(item_pol) != str(default_id_pol):
article_dict["id_pol"] = str(item_pol)
articles.append(article_dict)
if order and settings:
transport_codmat = settings.get("transport_codmat", "")
transport_vat = settings.get("transport_vat", "21")
discount_codmat = settings.get("discount_codmat", "")
# Transport as article with quantity +1
if order.delivery_cost > 0 and transport_codmat:
article_dict = {
"sku": transport_codmat,
"quantity": "1",
"price": str(order.delivery_cost),
"vat": transport_vat,
"name": "Transport"
}
if settings.get("transport_id_pol"):
article_dict["id_pol"] = settings["transport_id_pol"]
articles.append(article_dict)
# Discount — smart VAT splitting
if order.discount_total > 0 and discount_codmat:
discount_split = compute_discount_split(order, settings)
if discount_split and len(discount_split) > 1:
# Multiple VAT rates — multiple discount lines
for vat_rate, split_amount in sorted(discount_split.items(), key=lambda x: float(x[0])):
article_dict = {
"sku": discount_codmat,
"quantity": "-1",
"price": str(split_amount),
"vat": vat_rate,
"name": f"Discount (TVA {vat_rate}%)"
}
if settings.get("discount_id_pol"):
article_dict["id_pol"] = settings["discount_id_pol"]
articles.append(article_dict)
elif discount_split and len(discount_split) == 1:
# Single VAT rate — use detected rate
actual_vat = list(discount_split.keys())[0]
article_dict = {
"sku": discount_codmat,
"quantity": "-1",
"price": str(order.discount_total),
"vat": actual_vat,
"name": "Discount"
}
if settings.get("discount_id_pol"):
article_dict["id_pol"] = settings["discount_id_pol"]
articles.append(article_dict)
else:
# Fallback — original behavior with GoMag VAT or settings default
discount_vat = getattr(order, 'discount_vat', None) or settings.get("discount_vat", "21")
article_dict = {
"sku": discount_codmat,
"quantity": "-1",
"price": str(order.discount_total),
"vat": discount_vat,
"name": "Discount"
}
if settings.get("discount_id_pol"):
article_dict["id_pol"] = settings["discount_id_pol"]
articles.append(article_dict)
return json.dumps(articles)
def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_settings: dict = None, id_gestiuni: list[int] = None) -> dict:
"""Import a single order into Oracle ROA.
Returns dict with:
success: bool
id_comanda: int or None
id_partener: int or None
id_adresa_facturare: int or None
id_adresa_livrare: int or None
error: str or None
"""
result = {
"success": False,
"id_comanda": None,
"id_partener": None,
"id_adresa_facturare": None,
"id_adresa_livrare": None,
"error": None
}
conn = None
try:
order_number = clean_web_text(order.number)
order_date = convert_web_date(order.date)
logger.info(
f"Order {order.number}: raw date={order.date!r}"
f"parsed={order_date.strftime('%Y-%m-%d %H:%M:%S')}"
)
if database.pool is None:
raise RuntimeError("Oracle pool not initialized")
conn = database.pool.acquire()
with conn.cursor() as cur:
# Step 1: Process partner — use shipping person data for name
id_partener = cur.var(oracledb.DB_TYPE_NUMBER)
if order.billing.is_company:
denumire = clean_web_text(order.billing.company_name).upper()
cod_fiscal = clean_web_text(order.billing.company_code) or None
registru = clean_web_text(order.billing.company_reg) or None
is_pj = 1
else:
# Use shipping person for partner name (person on shipping label)
if order.shipping and (order.shipping.lastname or order.shipping.firstname):
denumire = clean_web_text(
f"{order.shipping.lastname} {order.shipping.firstname}"
).upper()
else:
denumire = clean_web_text(
f"{order.billing.lastname} {order.billing.firstname}"
).upper()
cod_fiscal = None
registru = None
is_pj = 0
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener", [
cod_fiscal, denumire, registru, is_pj, id_partener
])
partner_id = id_partener.getvalue()
if not partner_id or partner_id <= 0:
result["error"] = f"Partner creation failed for {denumire}"
return result
result["id_partener"] = int(partner_id)
# Determine if billing and shipping are different persons
billing_name = clean_web_text(
f"{order.billing.lastname} {order.billing.firstname}"
).strip().upper()
shipping_name = ""
if order.shipping:
shipping_name = clean_web_text(
f"{order.shipping.lastname} {order.shipping.firstname}"
).strip().upper()
different_person = bool(
shipping_name and billing_name and shipping_name != billing_name
)
# Step 2: Process shipping address (primary — person on shipping label)
# Use shipping person phone/email for partner contact
shipping_phone = ""
shipping_email = ""
if order.shipping:
shipping_phone = order.shipping.phone or ""
shipping_email = order.shipping.email or ""
if not shipping_phone:
shipping_phone = order.billing.phone or ""
if not shipping_email:
shipping_email = order.billing.email or ""
addr_livr_id = None
if order.shipping:
id_adresa_livr = cur.var(oracledb.DB_TYPE_NUMBER)
shipping_addr = format_address_for_oracle(
order.shipping.address, order.shipping.city,
order.shipping.region
)
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_adresa", [
partner_id, shipping_addr,
shipping_phone,
shipping_email,
id_adresa_livr
])
addr_livr_id = id_adresa_livr.getvalue()
# Step 3: Process billing address
if different_person:
# Different person: use shipping address for BOTH billing and shipping in ROA
addr_fact_id = addr_livr_id
else:
# Same person: use billing address as-is
id_adresa_fact = cur.var(oracledb.DB_TYPE_NUMBER)
billing_addr = format_address_for_oracle(
order.billing.address, order.billing.city, order.billing.region
)
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_adresa", [
partner_id, billing_addr,
order.billing.phone or "",
order.billing.email or "",
id_adresa_fact
])
addr_fact_id = id_adresa_fact.getvalue()
if addr_fact_id is not None:
result["id_adresa_facturare"] = int(addr_fact_id)
if addr_livr_id is not None:
result["id_adresa_livrare"] = int(addr_livr_id)
# Step 4: Build articles JSON and import order
articles_json = build_articles_json(order.items, order, app_settings)
# Use CLOB for the JSON
clob_var = cur.var(oracledb.DB_TYPE_CLOB)
clob_var.setvalue(0, articles_json)
id_comanda = cur.var(oracledb.DB_TYPE_NUMBER)
# Convert list[int] to CSV string for Oracle VARCHAR2 param
id_gestiune_csv = ",".join(str(g) for g in id_gestiuni) if id_gestiuni else None
# Kit pricing parameters from settings
kit_mode = (app_settings or {}).get("kit_pricing_mode") or None
kit_id_pol_prod = int((app_settings or {}).get("id_pol_productie") or 0) or None
kit_discount_codmat = (app_settings or {}).get("kit_discount_codmat") or None
kit_discount_id_pol = int((app_settings or {}).get("kit_discount_id_pol") or 0) or None
cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
order_number, # p_nr_comanda_ext
order_date, # p_data_comanda
partner_id, # p_id_partener
clob_var, # p_json_articole (CLOB)
addr_livr_id, # p_id_adresa_livrare
addr_fact_id, # p_id_adresa_facturare
id_pol, # p_id_pol
id_sectie, # p_id_sectie
id_gestiune_csv, # p_id_gestiune (CSV string)
kit_mode, # p_kit_mode
kit_id_pol_prod, # p_id_pol_productie
kit_discount_codmat, # p_kit_discount_codmat
kit_discount_id_pol, # p_kit_discount_id_pol
id_comanda # v_id_comanda (OUT) — MUST STAY LAST
])
comanda_id = id_comanda.getvalue()
if comanda_id and comanda_id > 0:
conn.commit()
result["success"] = True
result["id_comanda"] = int(comanda_id)
logger.info(f"Order {order_number} imported: ID={comanda_id}")
else:
conn.rollback()
result["error"] = "importa_comanda returned invalid ID"
except oracledb.DatabaseError as e:
error_msg = str(e)
result["error"] = error_msg
logger.error(f"Oracle error importing order {order.number}: {error_msg}")
if conn:
try:
conn.rollback()
except Exception:
pass
except Exception as e:
result["error"] = str(e)
logger.error(f"Error importing order {order.number}: {e}")
if conn:
try:
conn.rollback()
except Exception:
pass
finally:
if conn:
try:
database.pool.release(conn)
except Exception:
pass
return result
def soft_delete_order_in_roa(id_comanda: int) -> dict:
"""Soft-delete an order in Oracle ROA (set sters=1 on comenzi + comenzi_detalii).
Returns {"success": bool, "error": str|None, "details_deleted": int}
"""
result = {"success": False, "error": None, "details_deleted": 0}
if database.pool is None:
result["error"] = "Oracle pool not initialized"
return result
conn = None
try:
conn = database.pool.acquire()
with conn.cursor() as cur:
# Soft-delete order details
cur.execute(
"UPDATE comenzi_detalii SET sters = 1 WHERE id_comanda = :1 AND sters = 0",
[id_comanda]
)
result["details_deleted"] = cur.rowcount
# Soft-delete the order itself
cur.execute(
"UPDATE comenzi SET sters = 1 WHERE id_comanda = :1 AND sters = 0",
[id_comanda]
)
conn.commit()
result["success"] = True
logger.info(f"Soft-deleted order ID={id_comanda} in Oracle ROA ({result['details_deleted']} details)")
except Exception as e:
result["error"] = str(e)
logger.error(f"Error soft-deleting order ID={id_comanda}: {e}")
if conn:
try:
conn.rollback()
except Exception:
pass
finally:
if conn:
try:
database.pool.release(conn)
except Exception:
pass
return result

View File

@@ -1,75 +0,0 @@
import logging
from .. import database
logger = logging.getLogger(__name__)
def check_invoices_for_orders(id_comanda_list: list) -> dict:
"""Check which orders have been invoiced in Oracle (vanzari table).
Returns {id_comanda: {facturat, numar_act, serie_act, total_fara_tva, total_tva, total_cu_tva}}
"""
if not id_comanda_list or database.pool is None:
return {}
result = {}
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for i in range(0, len(id_comanda_list), 500):
batch = id_comanda_list[i:i+500]
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
params = {f"c{j}": cid for j, cid in enumerate(batch)}
cur.execute(f"""
SELECT id_comanda, numar_act, serie_act,
total_fara_tva, total_tva, total_cu_tva,
TO_CHAR(data_act, 'YYYY-MM-DD') AS data_act
FROM vanzari
WHERE id_comanda IN ({placeholders}) AND sters = 0
""", params)
for row in cur:
result[row[0]] = {
"facturat": True,
"numar_act": row[1],
"serie_act": row[2],
"total_fara_tva": float(row[3]) if row[3] else 0,
"total_tva": float(row[4]) if row[4] else 0,
"total_cu_tva": float(row[5]) if row[5] else 0,
"data_act": row[6],
}
except Exception as e:
logger.warning(f"Invoice check failed (table may not exist): {e}")
finally:
database.pool.release(conn)
return result
def check_orders_exist(id_comanda_list: list) -> set:
"""Check which id_comanda values still exist in Oracle COMENZI (sters=0).
Returns set of id_comanda that exist.
"""
if not id_comanda_list or database.pool is None:
return set()
existing = set()
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for i in range(0, len(id_comanda_list), 500):
batch = id_comanda_list[i:i+500]
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
params = {f"c{j}": cid for j, cid in enumerate(batch)}
cur.execute(f"""
SELECT id_comanda FROM COMENZI
WHERE id_comanda IN ({placeholders}) AND sters = 0
""", params)
for row in cur:
existing.add(row[0])
except Exception as e:
logger.warning(f"Order existence check failed: {e}")
finally:
database.pool.release(conn)
return existing

View File

@@ -1,405 +0,0 @@
import oracledb
import csv
import io
import logging
from fastapi import HTTPException
from .. import database
logger = logging.getLogger(__name__)
def get_mappings(search: str = "", page: int = 1, per_page: int = 50,
sort_by: str = "sku", sort_dir: str = "asc",
show_deleted: bool = False,
id_pol: int = None, id_pol_productie: int = None):
"""Get paginated mappings with optional search and sorting."""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
offset = (page - 1) * per_page
# Validate and resolve sort parameters
allowed_sort = {
"sku": "at.sku",
"codmat": "at.codmat",
"denumire": "na.denumire",
"um": "na.um",
"cantitate_roa": "at.cantitate_roa",
"activ": "at.activ",
}
sort_col = allowed_sort.get(sort_by, "at.sku")
if sort_dir.lower() not in ("asc", "desc"):
sort_dir = "asc"
order_clause = f"{sort_col} {sort_dir}"
# Always add secondary sort to keep groups together
if sort_col not in ("at.sku",):
order_clause += ", at.sku"
order_clause += ", at.codmat"
with database.pool.acquire() as conn:
with conn.cursor() as cur:
# Build WHERE clause
where_clauses = []
params = {}
if not show_deleted:
where_clauses.append("at.sters = 0")
if search:
where_clauses.append("""(UPPER(at.sku) LIKE '%' || UPPER(:search) || '%'
OR UPPER(at.codmat) LIKE '%' || UPPER(:search) || '%'
OR UPPER(na.denumire) LIKE '%' || UPPER(:search) || '%')""")
params["search"] = search
where = "WHERE " + " AND ".join(where_clauses) if where_clauses else ""
# Add price policy params
params["id_pol"] = id_pol
params["id_pol_prod"] = id_pol_productie
# Fetch ALL matching rows (no pagination yet — we need to group by SKU first)
data_sql = f"""
SELECT at.sku, at.codmat, na.denumire, na.um, at.cantitate_roa,
at.activ, at.sters,
TO_CHAR(at.data_creare, 'YYYY-MM-DD HH24:MI') as data_creare,
ROUND(CASE WHEN pp.preturi_cu_tva = 1
THEN NVL(ppa.pret, 0)
ELSE NVL(ppa.pret, 0) * NVL(ppa.proc_tvav, 1.19)
END, 2) AS pret_cu_tva
FROM ARTICOLE_TERTI at
LEFT JOIN nom_articole na ON na.codmat = at.codmat
LEFT JOIN crm_politici_pret_art ppa
ON ppa.id_articol = na.id_articol
AND ppa.id_pol = CASE
WHEN TRIM(na.cont) IN ('341','345') AND :id_pol_prod IS NOT NULL
THEN :id_pol_prod ELSE :id_pol END
LEFT JOIN crm_politici_preturi pp
ON pp.id_pol = ppa.id_pol
{where}
ORDER BY {order_clause}
"""
cur.execute(data_sql, params)
columns = [col[0].lower() for col in cur.description]
all_rows = [dict(zip(columns, row)) for row in cur.fetchall()]
# Group by SKU
from collections import OrderedDict
groups = OrderedDict()
for row in all_rows:
sku = row["sku"]
if sku not in groups:
groups[sku] = []
groups[sku].append(row)
counts = {"total": len(groups)}
# Flatten back to rows for pagination (paginate by raw row count)
filtered_rows = [row for rows in groups.values() for row in rows]
total = len(filtered_rows)
page_rows = filtered_rows[offset: offset + per_page]
return {
"mappings": page_rows,
"total": total,
"page": page,
"per_page": per_page,
"pages": (total + per_page - 1) // per_page if total > 0 else 0,
"counts": counts,
}
def create_mapping(sku: str, codmat: str, cantitate_roa: float = 1, auto_restore: bool = False):
"""Create a new mapping. Returns dict or raises HTTPException on duplicate.
When auto_restore=True, soft-deleted records are restored+updated instead of raising 409.
"""
if not sku or not sku.strip():
raise HTTPException(status_code=400, detail="SKU este obligatoriu")
if not codmat or not codmat.strip():
raise HTTPException(status_code=400, detail="CODMAT este obligatoriu")
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
with database.pool.acquire() as conn:
with conn.cursor() as cur:
# Validate CODMAT exists in NOM_ARTICOLE
cur.execute("""
SELECT COUNT(*) FROM NOM_ARTICOLE
WHERE codmat = :codmat AND sters = 0 AND inactiv = 0
""", {"codmat": codmat})
if cur.fetchone()[0] == 0:
raise HTTPException(status_code=400, detail="CODMAT-ul nu exista in nomenclator")
# Check for active duplicate
cur.execute("""
SELECT COUNT(*) FROM ARTICOLE_TERTI
WHERE sku = :sku AND codmat = :codmat AND NVL(sters, 0) = 0
""", {"sku": sku, "codmat": codmat})
if cur.fetchone()[0] > 0:
raise HTTPException(status_code=409, detail="Maparea SKU-CODMAT există deja")
# Check for soft-deleted record that could be restored
cur.execute("""
SELECT COUNT(*) FROM ARTICOLE_TERTI
WHERE sku = :sku AND codmat = :codmat AND sters = 1
""", {"sku": sku, "codmat": codmat})
if cur.fetchone()[0] > 0:
if auto_restore:
cur.execute("""
UPDATE ARTICOLE_TERTI SET sters = 0, activ = 1,
cantitate_roa = :cantitate_roa,
data_modif = SYSDATE
WHERE sku = :sku AND codmat = :codmat AND sters = 1
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate_roa})
conn.commit()
return {"sku": sku, "codmat": codmat}
else:
raise HTTPException(
status_code=409,
detail="Maparea a fost ștearsă anterior",
headers={"X-Can-Restore": "true"}
)
cur.execute("""
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
VALUES (:sku, :codmat, :cantitate_roa, 1, 0, SYSDATE, -3)
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate_roa})
conn.commit()
return {"sku": sku, "codmat": codmat}
def update_mapping(sku: str, codmat: str, cantitate_roa: float = None, activ: int = None):
"""Update an existing mapping."""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
sets = []
params = {"sku": sku, "codmat": codmat}
if cantitate_roa is not None:
sets.append("cantitate_roa = :cantitate_roa")
params["cantitate_roa"] = cantitate_roa
if activ is not None:
sets.append("activ = :activ")
params["activ"] = activ
if not sets:
return False
sets.append("data_modif = SYSDATE")
set_clause = ", ".join(sets)
with database.pool.acquire() as conn:
with conn.cursor() as cur:
cur.execute(f"""
UPDATE ARTICOLE_TERTI SET {set_clause}
WHERE sku = :sku AND codmat = :codmat
""", params)
conn.commit()
return cur.rowcount > 0
def delete_mapping(sku: str, codmat: str):
"""Soft delete (set sters=1)."""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
with database.pool.acquire() as conn:
with conn.cursor() as cur:
cur.execute("""
UPDATE ARTICOLE_TERTI SET sters = 1, data_modif = SYSDATE
WHERE sku = :sku AND codmat = :codmat
""", {"sku": sku, "codmat": codmat})
conn.commit()
return cur.rowcount > 0
def edit_mapping(old_sku: str, old_codmat: str, new_sku: str, new_codmat: str,
cantitate_roa: float = 1):
"""Edit a mapping. If PK changed, soft-delete old and insert new."""
if not new_sku or not new_sku.strip():
raise HTTPException(status_code=400, detail="SKU este obligatoriu")
if not new_codmat or not new_codmat.strip():
raise HTTPException(status_code=400, detail="CODMAT este obligatoriu")
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
if old_sku == new_sku and old_codmat == new_codmat:
# Simple update - only cantitate changed
return update_mapping(new_sku, new_codmat, cantitate_roa)
else:
# PK changed: soft-delete old, upsert new (MERGE handles existing soft-deleted target)
with database.pool.acquire() as conn:
with conn.cursor() as cur:
# Mark old record as deleted
cur.execute("""
UPDATE ARTICOLE_TERTI SET sters = 1, data_modif = SYSDATE
WHERE sku = :sku AND codmat = :codmat
""", {"sku": old_sku, "codmat": old_codmat})
# Upsert new record (MERGE in case target PK exists as soft-deleted)
cur.execute("""
MERGE INTO ARTICOLE_TERTI t
USING (SELECT :sku AS sku, :codmat AS codmat FROM DUAL) s
ON (t.sku = s.sku AND t.codmat = s.codmat)
WHEN MATCHED THEN UPDATE SET
cantitate_roa = :cantitate_roa,
activ = 1, sters = 0,
data_modif = SYSDATE
WHEN NOT MATCHED THEN INSERT
(sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
VALUES (:sku, :codmat, :cantitate_roa, 1, 0, SYSDATE, -3)
""", {"sku": new_sku, "codmat": new_codmat, "cantitate_roa": cantitate_roa})
conn.commit()
return True
def restore_mapping(sku: str, codmat: str):
"""Restore a soft-deleted mapping (set sters=0)."""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
with database.pool.acquire() as conn:
with conn.cursor() as cur:
cur.execute("""
UPDATE ARTICOLE_TERTI SET sters = 0, data_modif = SYSDATE
WHERE sku = :sku AND codmat = :codmat
""", {"sku": sku, "codmat": codmat})
conn.commit()
return cur.rowcount > 0
def import_csv(file_content: str):
"""Import mappings from CSV content. Returns summary.
Backward compatible: if procent_pret column exists in CSV, it is silently ignored.
"""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
reader = csv.DictReader(io.StringIO(file_content))
created = 0
skipped_no_codmat = 0
errors = []
with database.pool.acquire() as conn:
with conn.cursor() as cur:
for i, row in enumerate(reader, 1):
sku = row.get("sku", "").strip()
codmat = row.get("codmat", "").strip()
if not sku:
errors.append(f"Rând {i}: SKU lipsă")
continue
if not codmat:
skipped_no_codmat += 1
continue
try:
cantitate = float(row.get("cantitate_roa", "1") or "1")
# procent_pret column ignored if present (backward compat)
cur.execute("""
MERGE INTO ARTICOLE_TERTI t
USING (SELECT :sku AS sku, :codmat AS codmat FROM DUAL) s
ON (t.sku = s.sku AND t.codmat = s.codmat)
WHEN MATCHED THEN UPDATE SET
cantitate_roa = :cantitate_roa,
activ = 1,
sters = 0,
data_modif = SYSDATE
WHEN NOT MATCHED THEN INSERT
(sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
VALUES (:sku, :codmat, :cantitate_roa, 1, 0, SYSDATE, -3)
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate})
created += 1
except Exception as e:
errors.append(f"Rând {i}: {str(e)}")
conn.commit()
return {"processed": created, "skipped_no_codmat": skipped_no_codmat, "errors": errors}
def export_csv():
"""Export all mappings as CSV string."""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
output = io.StringIO()
writer = csv.writer(output)
writer.writerow(["sku", "codmat", "cantitate_roa", "activ"])
with database.pool.acquire() as conn:
with conn.cursor() as cur:
cur.execute("""
SELECT sku, codmat, cantitate_roa, activ
FROM ARTICOLE_TERTI WHERE sters = 0 ORDER BY sku, codmat
""")
for row in cur:
writer.writerow(row)
return output.getvalue()
def get_csv_template():
"""Return empty CSV template."""
output = io.StringIO()
writer = csv.writer(output)
writer.writerow(["sku", "codmat", "cantitate_roa"])
writer.writerow(["EXAMPLE_SKU", "EXAMPLE_CODMAT", "1"])
return output.getvalue()
def get_component_prices(sku: str, id_pol: int, id_pol_productie: int = None) -> list:
"""Get prices from crm_politici_pret_art for kit components.
Returns: [{"codmat", "denumire", "cantitate_roa", "pret", "pret_cu_tva", "proc_tvav", "ptva", "id_pol_used"}]
"""
if database.pool is None:
raise HTTPException(status_code=503, detail="Oracle unavailable")
with database.pool.acquire() as conn:
with conn.cursor() as cur:
# Get components from ARTICOLE_TERTI
cur.execute("""
SELECT at.codmat, at.cantitate_roa, na.id_articol, na.cont, na.denumire
FROM ARTICOLE_TERTI at
JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
WHERE at.sku = :sku AND at.activ = 1 AND at.sters = 0
ORDER BY at.codmat
""", {"sku": sku})
components = cur.fetchall()
if len(components) == 0:
return []
if len(components) == 1 and (components[0][1] or 1) <= 1:
return [] # True 1:1 mapping, no kit pricing needed
result = []
for codmat, cant_roa, id_art, cont, denumire in components:
# Determine policy based on account
cont_str = str(cont or "").strip()
pol = id_pol_productie if (cont_str in ("341", "345") and id_pol_productie) else id_pol
# Get PRETURI_CU_TVA flag
cur.execute("SELECT PRETURI_CU_TVA FROM CRM_POLITICI_PRETURI WHERE ID_POL = :pol", {"pol": pol})
pol_row = cur.fetchone()
preturi_cu_tva_flag = pol_row[0] if pol_row else 0
# Get price
cur.execute("""
SELECT PRET, PROC_TVAV FROM crm_politici_pret_art
WHERE id_pol = :pol AND id_articol = :id_art
""", {"pol": pol, "id_art": id_art})
price_row = cur.fetchone()
if price_row:
pret, proc_tvav = price_row
proc_tvav = proc_tvav or 1.19
pret_cu_tva = pret if preturi_cu_tva_flag == 1 else round(pret * proc_tvav, 2)
ptva = round((proc_tvav - 1) * 100)
else:
pret = 0
pret_cu_tva = 0
proc_tvav = 1.19
ptva = 19
result.append({
"codmat": codmat,
"denumire": denumire or "",
"cantitate_roa": float(cant_roa) if cant_roa else 1,
"pret": float(pret) if pret else 0,
"pret_cu_tva": float(pret_cu_tva),
"proc_tvav": float(proc_tvav),
"ptva": int(ptva),
"id_pol_used": pol
})
return result

View File

@@ -1,198 +0,0 @@
import json
import glob
import os
import logging
from pathlib import Path
from dataclasses import dataclass, field
from typing import Optional
from ..config import settings
logger = logging.getLogger(__name__)
@dataclass
class OrderItem:
sku: str
name: str
price: float
quantity: float
vat: float
@dataclass
class OrderBilling:
firstname: str = ""
lastname: str = ""
phone: str = ""
email: str = ""
address: str = ""
city: str = ""
region: str = ""
country: str = ""
company_name: str = ""
company_code: str = ""
company_reg: str = ""
is_company: bool = False
@dataclass
class OrderShipping:
firstname: str = ""
lastname: str = ""
phone: str = ""
email: str = ""
address: str = ""
city: str = ""
region: str = ""
country: str = ""
@dataclass
class OrderData:
id: str
number: str
date: str
status: str = ""
status_id: str = ""
items: list = field(default_factory=list) # list of OrderItem
billing: OrderBilling = field(default_factory=OrderBilling)
shipping: Optional[OrderShipping] = None
total: float = 0.0
delivery_cost: float = 0.0
discount_total: float = 0.0
discount_vat: Optional[str] = None
payment_name: str = ""
delivery_name: str = ""
source_file: str = ""
def read_json_orders(json_dir: str = None) -> tuple[list[OrderData], int]:
"""Read all GoMag order JSON files from the output directory.
Returns (list of OrderData, number of JSON files read).
"""
if json_dir is None:
json_dir = settings.JSON_OUTPUT_DIR
if not json_dir or not os.path.isdir(json_dir):
logger.warning(f"JSON output directory not found: {json_dir}")
return [], 0
# Find all gomag_orders*.json files
pattern = os.path.join(json_dir, "gomag_orders*.json")
json_files = sorted(glob.glob(pattern))
if not json_files:
logger.info(f"No JSON files found in {json_dir}")
return [], 0
orders = []
for filepath in json_files:
try:
with open(filepath, 'r', encoding='utf-8') as f:
data = json.load(f)
raw_orders = data.get("orders", {})
if not isinstance(raw_orders, dict):
continue
for order_id, order_data in raw_orders.items():
try:
order = _parse_order(order_id, order_data, os.path.basename(filepath))
orders.append(order)
except Exception as e:
logger.warning(f"Error parsing order {order_id} from {filepath}: {e}")
except Exception as e:
logger.error(f"Error reading {filepath}: {e}")
logger.info(f"Read {len(orders)} orders from {len(json_files)} JSON files")
return orders, len(json_files)
def _parse_order(order_id: str, data: dict, source_file: str) -> OrderData:
"""Parse a single order from JSON data."""
# Parse items
items = []
raw_items = data.get("items", [])
if isinstance(raw_items, list):
for item in raw_items:
if isinstance(item, dict) and item.get("sku"):
items.append(OrderItem(
sku=str(item.get("sku", "")).strip(),
name=str(item.get("name", "")),
price=float(item.get("price", 0) or 0),
quantity=float(item.get("quantity", 0) or 0),
vat=float(item.get("vat", 0) or 0)
))
# Parse billing
billing_data = data.get("billing", {}) or {}
company = billing_data.get("company")
is_company = isinstance(company, dict) and bool(company.get("name"))
billing = OrderBilling(
firstname=str(billing_data.get("firstname", "")),
lastname=str(billing_data.get("lastname", "")),
phone=str(billing_data.get("phone", "")),
email=str(billing_data.get("email", "")),
address=str(billing_data.get("address", "")),
city=str(billing_data.get("city", "")),
region=str(billing_data.get("region", "")),
country=str(billing_data.get("country", "")),
company_name=str(company.get("name", "")) if is_company else "",
company_code=str(company.get("code", "")) if is_company else "",
company_reg=str(company.get("registrationNo", "")) if is_company else "",
is_company=is_company
)
# Parse shipping
shipping_data = data.get("shipping")
shipping = None
if isinstance(shipping_data, dict):
shipping = OrderShipping(
firstname=str(shipping_data.get("firstname", "")),
lastname=str(shipping_data.get("lastname", "")),
phone=str(shipping_data.get("phone", "")),
email=str(shipping_data.get("email", "")),
address=str(shipping_data.get("address", "")),
city=str(shipping_data.get("city", "")),
region=str(shipping_data.get("region", "")),
country=str(shipping_data.get("country", ""))
)
# Payment/delivery
payment = data.get("payment", {}) or {}
delivery = data.get("delivery", {}) or {}
# Parse delivery cost
delivery_cost = float(delivery.get("total", 0) or 0) if isinstance(delivery, dict) else 0.0
# Parse discount total (sum of all discount values) and VAT from first discount item
discount_total = 0.0
discount_vat = None
for d in data.get("discounts", []):
if isinstance(d, dict):
discount_total += float(d.get("value", 0) or 0)
if discount_vat is None and d.get("vat") is not None:
discount_vat = str(d["vat"])
return OrderData(
id=str(data.get("id", order_id)),
number=str(data.get("number", "")),
date=str(data.get("date", "")),
status=str(data.get("status", "")),
status_id=str(data.get("statusId", "")),
items=items,
billing=billing,
shipping=shipping,
total=float(data.get("total", 0) or 0),
delivery_cost=delivery_cost,
discount_total=discount_total,
discount_vat=discount_vat,
payment_name=str(payment.get("name", "")) if isinstance(payment, dict) else "",
delivery_name=str(delivery.get("name", "")) if isinstance(delivery, dict) else "",
source_file=source_file
)
def get_all_skus(orders: list[OrderData]) -> set[str]:
"""Extract unique SKUs from all orders."""
skus = set()
for order in orders:
for item in order.items:
if item.sku:
skus.add(item.sku)
return skus

View File

@@ -1,264 +0,0 @@
"""Catalog price sync service — syncs product prices from GoMag catalog to ROA Oracle."""
import asyncio
import logging
import uuid
from datetime import datetime
from zoneinfo import ZoneInfo
from . import gomag_client, validation_service, sqlite_service
from .. import database
from ..config import settings
logger = logging.getLogger(__name__)
_tz = ZoneInfo("Europe/Bucharest")
_price_sync_lock = asyncio.Lock()
_current_price_sync = None
def _now():
return datetime.now(_tz).replace(tzinfo=None)
async def prepare_price_sync() -> dict:
global _current_price_sync
if _price_sync_lock.locked():
return {"error": "Price sync already running"}
run_id = _now().strftime("%Y%m%d_%H%M%S") + "_ps_" + uuid.uuid4().hex[:6]
_current_price_sync = {
"run_id": run_id, "status": "running",
"started_at": _now().isoformat(), "finished_at": None,
"phase_text": "Starting...",
}
# Create SQLite record
db = await sqlite_service.get_sqlite()
try:
await db.execute(
"INSERT INTO price_sync_runs (run_id, started_at, status) VALUES (?, ?, 'running')",
(run_id, _now().strftime("%d.%m.%Y %H:%M:%S"))
)
await db.commit()
finally:
await db.close()
return {"run_id": run_id}
async def get_price_sync_status() -> dict:
if _current_price_sync and _current_price_sync.get("status") == "running":
return _current_price_sync
# Return last run from SQLite
db = await sqlite_service.get_sqlite()
try:
cursor = await db.execute(
"SELECT * FROM price_sync_runs ORDER BY started_at DESC LIMIT 1"
)
row = await cursor.fetchone()
if row:
return {"status": "idle", "last_run": dict(row)}
return {"status": "idle"}
except Exception:
return {"status": "idle"}
finally:
await db.close()
async def run_catalog_price_sync(run_id: str):
global _current_price_sync
async with _price_sync_lock:
log_lines = []
def _log(msg):
logger.info(msg)
log_lines.append(f"[{_now().strftime('%H:%M:%S')}] {msg}")
if _current_price_sync:
_current_price_sync["phase_text"] = msg
try:
app_settings = await sqlite_service.get_app_settings()
id_pol = int(app_settings.get("id_pol") or 0) or None
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
if not id_pol:
_log("Politica de preț nu e configurată — skip sync")
await _finish_run(run_id, "error", log_lines, error="No price policy")
return
# Fetch products from GoMag
_log("Descărcare produse din GoMag API...")
products = await gomag_client.download_products(
api_key=app_settings.get("gomag_api_key"),
api_shop=app_settings.get("gomag_api_shop"),
products_url=app_settings.get("gomag_products_url") or None,
log_fn=_log,
)
if not products:
_log("Niciun produs descărcat")
await _finish_run(run_id, "completed", log_lines, products_total=0)
return
# Index products by SKU for kit component lookup
products_by_sku = {p["sku"]: p for p in products}
# Connect to Oracle
conn = await asyncio.to_thread(database.get_oracle_connection)
try:
# Get all mappings from ARTICOLE_TERTI
_log("Citire mapări ARTICOLE_TERTI...")
mapped_data = await asyncio.to_thread(
validation_service.resolve_mapped_codmats,
{p["sku"] for p in products}, conn
)
# Get direct articles from NOM_ARTICOLE
_log("Identificare articole directe...")
direct_id_map = {}
with conn.cursor() as cur:
all_skus = list({p["sku"] for p in products})
for i in range(0, len(all_skus), 500):
batch = all_skus[i:i+500]
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
params = {f"s{j}": sku for j, sku in enumerate(batch)}
cur.execute(f"""
SELECT codmat, id_articol, cont FROM nom_articole
WHERE codmat IN ({placeholders}) AND sters = 0 AND inactiv = 0
""", params)
for row in cur:
if row[0] not in mapped_data:
direct_id_map[row[0]] = {"id_articol": row[1], "cont": row[2]}
matched = 0
updated = 0
errors = 0
for product in products:
sku = product["sku"]
try:
price_str = product.get("price", "0")
price = float(price_str) if price_str else 0
if price <= 0:
continue
vat = float(product.get("vat", "19"))
# Calculate price with TVA (vat_included can be int 1 or str "1")
if str(product.get("vat_included", "1")) == "1":
price_cu_tva = price
else:
price_cu_tva = price * (1 + vat / 100)
# For kits, sync each component individually from standalone GoMag prices
mapped_comps = mapped_data.get(sku, [])
is_kit = len(mapped_comps) > 1 or (
len(mapped_comps) == 1 and (mapped_comps[0].get("cantitate_roa") or 1) > 1
)
if is_kit:
for comp in mapped_data[sku]:
comp_codmat = comp["codmat"]
# Skip components that have their own ARTICOLE_TERTI mapping
# (they'll be synced with correct cantitate_roa in individual path)
if comp_codmat in mapped_data:
continue
comp_product = products_by_sku.get(comp_codmat)
if not comp_product:
continue # Component not in GoMag as standalone product
comp_price_str = comp_product.get("price", "0")
comp_price = float(comp_price_str) if comp_price_str else 0
if comp_price <= 0:
continue
comp_vat = float(comp_product.get("vat", "19"))
# vat_included can be int 1 or str "1"
if str(comp_product.get("vat_included", "1")) == "1":
comp_price_cu_tva = comp_price
else:
comp_price_cu_tva = comp_price * (1 + comp_vat / 100)
comp_cont_str = str(comp.get("cont") or "").strip()
comp_pol = id_pol_productie if (comp_cont_str in ("341", "345") and id_pol_productie) else id_pol
matched += 1
result = await asyncio.to_thread(
validation_service.compare_and_update_price,
comp["id_articol"], comp_pol, comp_price_cu_tva, conn
)
if result and result["updated"]:
updated += 1
_log(f" {comp_codmat}: {result['old_price']:.2f}{result['new_price']:.2f} (kit {sku})")
elif result is None:
_log(f" {comp_codmat}: LIPSESTE din politica {comp_pol} — adauga manual in ROA (kit {sku})")
continue
# Determine id_articol and policy
id_articol = None
cantitate_roa = 1
if sku in mapped_data and len(mapped_data[sku]) == 1 and (mapped_data[sku][0].get("cantitate_roa") or 1) <= 1:
comp = mapped_data[sku][0]
id_articol = comp["id_articol"]
cantitate_roa = comp.get("cantitate_roa") or 1
elif sku in direct_id_map:
id_articol = direct_id_map[sku]["id_articol"]
else:
continue # SKU not in ROA
matched += 1
price_per_unit = price_cu_tva / cantitate_roa if cantitate_roa != 1 else price_cu_tva
# Determine policy
cont = None
if sku in mapped_data and len(mapped_data[sku]) == 1 and (mapped_data[sku][0].get("cantitate_roa") or 1) <= 1:
cont = mapped_data[sku][0].get("cont")
elif sku in direct_id_map:
cont = direct_id_map[sku].get("cont")
cont_str = str(cont or "").strip()
pol = id_pol_productie if (cont_str in ("341", "345") and id_pol_productie) else id_pol
result = await asyncio.to_thread(
validation_service.compare_and_update_price,
id_articol, pol, price_per_unit, conn
)
if result and result["updated"]:
updated += 1
_log(f" {result['codmat']}: {result['old_price']:.2f}{result['new_price']:.2f}")
except Exception as e:
errors += 1
_log(f"Eroare produs {sku}: {e}")
_log(f"Sync complet: {len(products)} produse, {matched} potrivite, {updated} actualizate, {errors} erori")
finally:
await asyncio.to_thread(database.pool.release, conn)
await _finish_run(run_id, "completed", log_lines,
products_total=len(products), matched=matched,
updated=updated, errors=errors)
except Exception as e:
_log(f"Eroare critică: {e}")
logger.error(f"Catalog price sync error: {e}", exc_info=True)
await _finish_run(run_id, "error", log_lines, error=str(e))
async def _finish_run(run_id, status, log_lines, products_total=0,
matched=0, updated=0, errors=0, error=None):
global _current_price_sync
db = await sqlite_service.get_sqlite()
try:
await db.execute("""
UPDATE price_sync_runs SET
finished_at = ?, status = ?, products_total = ?,
matched = ?, updated = ?, errors = ?,
log_text = ?
WHERE run_id = ?
""", (_now().strftime("%d.%m.%Y %H:%M:%S"), status, products_total, matched, updated, errors,
"\n".join(log_lines), run_id))
await db.commit()
finally:
await db.close()
_current_price_sync = None

View File

@@ -1,71 +0,0 @@
import logging
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from apscheduler.triggers.interval import IntervalTrigger
logger = logging.getLogger(__name__)
_scheduler = None
_is_running = False
def init_scheduler():
"""Initialize the APScheduler instance."""
global _scheduler
_scheduler = AsyncIOScheduler()
logger.info("Scheduler initialized")
def start_scheduler(interval_minutes: int = 5):
"""Start the scheduler with the given interval."""
global _is_running
if _scheduler is None:
init_scheduler()
# Remove existing job if any
if _scheduler.get_job("sync_job"):
_scheduler.remove_job("sync_job")
from . import sync_service
_scheduler.add_job(
sync_service.run_sync,
trigger=IntervalTrigger(minutes=interval_minutes),
id="sync_job",
name="GoMag Sync",
replace_existing=True
)
if not _scheduler.running:
_scheduler.start()
_is_running = True
logger.info(f"Scheduler started with interval {interval_minutes}min")
def stop_scheduler():
"""Stop the scheduler."""
global _is_running
if _scheduler and _scheduler.running:
if _scheduler.get_job("sync_job"):
_scheduler.remove_job("sync_job")
_is_running = False
logger.info("Scheduler stopped")
def shutdown_scheduler():
"""Shutdown the scheduler completely."""
global _scheduler, _is_running
if _scheduler and _scheduler.running:
_scheduler.shutdown(wait=False)
_scheduler = None
_is_running = False
def get_scheduler_status():
"""Get current scheduler status."""
job = _scheduler.get_job("sync_job") if _scheduler else None
return {
"enabled": _is_running,
"next_run": job.next_run_time.isoformat() if job and job.next_run_time else None,
"interval_minutes": int(job.trigger.interval.total_seconds() / 60) if job else None
}

View File

@@ -1,968 +0,0 @@
import json
import logging
from datetime import datetime
from zoneinfo import ZoneInfo
from ..database import get_sqlite, get_sqlite_sync
# Re-export so other services can import get_sqlite from sqlite_service
__all__ = ["get_sqlite", "get_sqlite_sync"]
_tz_bucharest = ZoneInfo("Europe/Bucharest")
def _now_str():
"""Return current Bucharest time as ISO string."""
return datetime.now(_tz_bucharest).replace(tzinfo=None).isoformat()
logger = logging.getLogger(__name__)
async def create_sync_run(run_id: str, json_files: int = 0):
"""Create a new sync run record."""
db = await get_sqlite()
try:
await db.execute("""
INSERT INTO sync_runs (run_id, started_at, status, json_files)
VALUES (?, ?, 'running', ?)
""", (run_id, _now_str(), json_files))
await db.commit()
finally:
await db.close()
async def update_sync_run(run_id: str, status: str, total_orders: int = 0,
imported: int = 0, skipped: int = 0, errors: int = 0,
error_message: str = None,
already_imported: int = 0, new_imported: int = 0):
"""Update sync run with results."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE sync_runs SET
finished_at = ?,
status = ?,
total_orders = ?,
imported = ?,
skipped = ?,
errors = ?,
error_message = ?,
already_imported = ?,
new_imported = ?
WHERE run_id = ?
""", (_now_str(), status, total_orders, imported, skipped, errors, error_message,
already_imported, new_imported, run_id))
await db.commit()
finally:
await db.close()
async def upsert_order(sync_run_id: str, order_number: str, order_date: str,
customer_name: str, status: str, id_comanda: int = None,
id_partener: int = None, error_message: str = None,
missing_skus: list = None, items_count: int = 0,
shipping_name: str = None, billing_name: str = None,
payment_method: str = None, delivery_method: str = None,
order_total: float = None,
delivery_cost: float = None, discount_total: float = None,
web_status: str = None, discount_split: str = None):
"""Upsert a single order — one row per order_number, status updated in place."""
db = await get_sqlite()
try:
await db.execute("""
INSERT INTO orders
(order_number, order_date, customer_name, status,
id_comanda, id_partener, error_message, missing_skus, items_count,
last_sync_run_id, shipping_name, billing_name,
payment_method, delivery_method, order_total,
delivery_cost, discount_total, web_status, discount_split)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(order_number) DO UPDATE SET
customer_name = excluded.customer_name,
status = CASE
WHEN orders.status = 'IMPORTED' AND excluded.status = 'ALREADY_IMPORTED'
THEN orders.status
ELSE excluded.status
END,
error_message = excluded.error_message,
missing_skus = excluded.missing_skus,
items_count = excluded.items_count,
id_comanda = COALESCE(excluded.id_comanda, orders.id_comanda),
id_partener = COALESCE(excluded.id_partener, orders.id_partener),
times_skipped = CASE WHEN excluded.status = 'SKIPPED'
THEN orders.times_skipped + 1
ELSE orders.times_skipped END,
last_sync_run_id = excluded.last_sync_run_id,
shipping_name = COALESCE(excluded.shipping_name, orders.shipping_name),
billing_name = COALESCE(excluded.billing_name, orders.billing_name),
payment_method = COALESCE(excluded.payment_method, orders.payment_method),
delivery_method = COALESCE(excluded.delivery_method, orders.delivery_method),
order_total = COALESCE(excluded.order_total, orders.order_total),
delivery_cost = COALESCE(excluded.delivery_cost, orders.delivery_cost),
discount_total = COALESCE(excluded.discount_total, orders.discount_total),
web_status = COALESCE(excluded.web_status, orders.web_status),
discount_split = COALESCE(excluded.discount_split, orders.discount_split),
updated_at = datetime('now')
""", (order_number, order_date, customer_name, status,
id_comanda, id_partener, error_message,
json.dumps(missing_skus) if missing_skus else None,
items_count, sync_run_id, shipping_name, billing_name,
payment_method, delivery_method, order_total,
delivery_cost, discount_total, web_status, discount_split))
await db.commit()
finally:
await db.close()
async def add_sync_run_order(sync_run_id: str, order_number: str, status_at_run: str):
"""Record that this run processed this order (junction table)."""
db = await get_sqlite()
try:
await db.execute("""
INSERT OR IGNORE INTO sync_run_orders (sync_run_id, order_number, status_at_run)
VALUES (?, ?, ?)
""", (sync_run_id, order_number, status_at_run))
await db.commit()
finally:
await db.close()
async def save_orders_batch(orders_data: list[dict]):
"""Batch save a list of orders + their sync_run_orders + order_items in one transaction.
Each dict must have: sync_run_id, order_number, order_date, customer_name, status,
id_comanda, id_partener, error_message, missing_skus (list|None), items_count,
shipping_name, billing_name, payment_method, delivery_method, status_at_run,
items (list of item dicts), delivery_cost (optional), discount_total (optional),
web_status (optional).
"""
if not orders_data:
return
db = await get_sqlite()
try:
# 1. Upsert orders
await db.executemany("""
INSERT INTO orders
(order_number, order_date, customer_name, status,
id_comanda, id_partener, error_message, missing_skus, items_count,
last_sync_run_id, shipping_name, billing_name,
payment_method, delivery_method, order_total,
delivery_cost, discount_total, web_status, discount_split)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON CONFLICT(order_number) DO UPDATE SET
customer_name = excluded.customer_name,
status = CASE
WHEN orders.status = 'IMPORTED' AND excluded.status = 'ALREADY_IMPORTED'
THEN orders.status
ELSE excluded.status
END,
error_message = excluded.error_message,
missing_skus = excluded.missing_skus,
items_count = excluded.items_count,
id_comanda = COALESCE(excluded.id_comanda, orders.id_comanda),
id_partener = COALESCE(excluded.id_partener, orders.id_partener),
times_skipped = CASE WHEN excluded.status = 'SKIPPED'
THEN orders.times_skipped + 1
ELSE orders.times_skipped END,
last_sync_run_id = excluded.last_sync_run_id,
shipping_name = COALESCE(excluded.shipping_name, orders.shipping_name),
billing_name = COALESCE(excluded.billing_name, orders.billing_name),
payment_method = COALESCE(excluded.payment_method, orders.payment_method),
delivery_method = COALESCE(excluded.delivery_method, orders.delivery_method),
order_total = COALESCE(excluded.order_total, orders.order_total),
delivery_cost = COALESCE(excluded.delivery_cost, orders.delivery_cost),
discount_total = COALESCE(excluded.discount_total, orders.discount_total),
web_status = COALESCE(excluded.web_status, orders.web_status),
discount_split = COALESCE(excluded.discount_split, orders.discount_split),
updated_at = datetime('now')
""", [
(d["order_number"], d["order_date"], d["customer_name"], d["status"],
d.get("id_comanda"), d.get("id_partener"), d.get("error_message"),
json.dumps(d["missing_skus"]) if d.get("missing_skus") else None,
d.get("items_count", 0), d["sync_run_id"],
d.get("shipping_name"), d.get("billing_name"),
d.get("payment_method"), d.get("delivery_method"),
d.get("order_total"),
d.get("delivery_cost"), d.get("discount_total"),
d.get("web_status"), d.get("discount_split"))
for d in orders_data
])
# 2. Sync run orders
await db.executemany("""
INSERT OR IGNORE INTO sync_run_orders (sync_run_id, order_number, status_at_run)
VALUES (?, ?, ?)
""", [(d["sync_run_id"], d["order_number"], d["status_at_run"]) for d in orders_data])
# 3. Order items
all_items = []
for d in orders_data:
for item in d.get("items", []):
all_items.append((
d["order_number"],
item.get("sku"), item.get("product_name"),
item.get("quantity"), item.get("price"), item.get("vat"),
item.get("mapping_status"), item.get("codmat"),
item.get("id_articol"), item.get("cantitate_roa")
))
if all_items:
await db.executemany("""
INSERT OR IGNORE INTO order_items
(order_number, sku, product_name, quantity, price, vat,
mapping_status, codmat, id_articol, cantitate_roa)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""", all_items)
await db.commit()
finally:
await db.close()
async def track_missing_sku(sku: str, product_name: str = "",
order_count: int = 0, order_numbers: str = None,
customers: str = None):
"""Track a missing SKU with order context."""
db = await get_sqlite()
try:
await db.execute("""
INSERT OR IGNORE INTO missing_skus (sku, product_name)
VALUES (?, ?)
""", (sku, product_name))
if order_count or order_numbers or customers:
await db.execute("""
UPDATE missing_skus SET
order_count = ?,
order_numbers = ?,
customers = ?
WHERE sku = ?
""", (order_count, order_numbers, customers, sku))
await db.commit()
finally:
await db.close()
async def resolve_missing_skus_batch(skus: set):
"""Mark multiple missing SKUs as resolved (they now have mappings)."""
if not skus:
return 0
db = await get_sqlite()
try:
placeholders = ",".join("?" for _ in skus)
cursor = await db.execute(f"""
UPDATE missing_skus SET resolved = 1, resolved_at = datetime('now')
WHERE sku IN ({placeholders}) AND resolved = 0
""", list(skus))
await db.commit()
return cursor.rowcount
finally:
await db.close()
async def resolve_missing_sku(sku: str):
"""Mark a missing SKU as resolved."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE missing_skus SET resolved = 1, resolved_at = datetime('now')
WHERE sku = ?
""", (sku,))
await db.commit()
finally:
await db.close()
async def get_missing_skus_paginated(page: int = 1, per_page: int = 20,
resolved: int = 0, search: str = None):
"""Get paginated missing SKUs. resolved=-1 means show all.
Optional search filters by sku or product_name (LIKE)."""
db = await get_sqlite()
try:
offset = (page - 1) * per_page
# Build WHERE clause parts
where_parts = []
params_count = []
params_data = []
if resolved != -1:
where_parts.append("resolved = ?")
params_count.append(resolved)
params_data.append(resolved)
if search:
like = f"%{search}%"
where_parts.append("(LOWER(sku) LIKE LOWER(?) OR LOWER(COALESCE(product_name,'')) LIKE LOWER(?))")
params_count.extend([like, like])
params_data.extend([like, like])
where_clause = ("WHERE " + " AND ".join(where_parts)) if where_parts else ""
order_clause = (
"ORDER BY resolved ASC, order_count DESC, first_seen DESC"
if resolved == -1
else "ORDER BY order_count DESC, first_seen DESC"
)
cursor = await db.execute(
f"SELECT COUNT(*) FROM missing_skus {where_clause}",
params_count
)
total = (await cursor.fetchone())[0]
cursor = await db.execute(f"""
SELECT sku, product_name, first_seen, resolved, resolved_at,
order_count, order_numbers, customers
FROM missing_skus
{where_clause}
{order_clause}
LIMIT ? OFFSET ?
""", params_data + [per_page, offset])
rows = await cursor.fetchall()
return {
"missing_skus": [dict(row) for row in rows],
"total": total,
"page": page,
"per_page": per_page,
"pages": (total + per_page - 1) // per_page if total > 0 else 0
}
finally:
await db.close()
async def get_sync_runs(page: int = 1, per_page: int = 20):
"""Get paginated sync run history."""
db = await get_sqlite()
try:
offset = (page - 1) * per_page
cursor = await db.execute("SELECT COUNT(*) FROM sync_runs")
total = (await cursor.fetchone())[0]
cursor = await db.execute("""
SELECT * FROM sync_runs
ORDER BY started_at DESC
LIMIT ? OFFSET ?
""", (per_page, offset))
rows = await cursor.fetchall()
return {
"runs": [dict(row) for row in rows],
"total": total,
"page": page,
"pages": (total + per_page - 1) // per_page if total > 0 else 0
}
finally:
await db.close()
async def get_sync_run_detail(run_id: str):
"""Get details for a specific sync run including its orders via sync_run_orders."""
db = await get_sqlite()
try:
cursor = await db.execute(
"SELECT * FROM sync_runs WHERE run_id = ?", (run_id,)
)
run = await cursor.fetchone()
if not run:
return None
cursor = await db.execute("""
SELECT o.* FROM orders o
INNER JOIN sync_run_orders sro ON sro.order_number = o.order_number
WHERE sro.sync_run_id = ?
ORDER BY o.order_date
""", (run_id,))
orders = await cursor.fetchall()
return {
"run": dict(run),
"orders": [dict(o) for o in orders]
}
finally:
await db.close()
async def get_dashboard_stats():
"""Get stats for the dashboard."""
db = await get_sqlite()
try:
cursor = await db.execute(
"SELECT COUNT(*) FROM orders WHERE status = 'IMPORTED'"
)
imported = (await cursor.fetchone())[0]
cursor = await db.execute(
"SELECT COUNT(*) FROM orders WHERE status = 'SKIPPED'"
)
skipped = (await cursor.fetchone())[0]
cursor = await db.execute(
"SELECT COUNT(*) FROM orders WHERE status = 'ERROR'"
)
errors = (await cursor.fetchone())[0]
cursor = await db.execute(
"SELECT COUNT(*) FROM missing_skus WHERE resolved = 0"
)
missing = (await cursor.fetchone())[0]
cursor = await db.execute("SELECT COUNT(DISTINCT sku) FROM missing_skus")
total_missing_skus = (await cursor.fetchone())[0]
cursor = await db.execute(
"SELECT COUNT(DISTINCT sku) FROM missing_skus WHERE resolved = 0"
)
unresolved_skus = (await cursor.fetchone())[0]
cursor = await db.execute("""
SELECT * FROM sync_runs ORDER BY started_at DESC LIMIT 1
""")
last_run = await cursor.fetchone()
return {
"imported": imported,
"skipped": skipped,
"errors": errors,
"missing_skus": missing,
"total_tracked_skus": total_missing_skus,
"unresolved_skus": unresolved_skus,
"last_run": dict(last_run) if last_run else None
}
finally:
await db.close()
async def get_scheduler_config():
"""Get scheduler configuration from SQLite."""
db = await get_sqlite()
try:
cursor = await db.execute("SELECT key, value FROM scheduler_config")
rows = await cursor.fetchall()
return {row["key"]: row["value"] for row in rows}
finally:
await db.close()
async def set_scheduler_config(key: str, value: str):
"""Set a scheduler configuration value."""
db = await get_sqlite()
try:
await db.execute("""
INSERT OR REPLACE INTO scheduler_config (key, value)
VALUES (?, ?)
""", (key, value))
await db.commit()
finally:
await db.close()
# ── web_products ─────────────────────────────────
async def upsert_web_product(sku: str, product_name: str):
"""Insert or update a web product, incrementing order_count."""
db = await get_sqlite()
try:
await db.execute("""
INSERT INTO web_products (sku, product_name, order_count)
VALUES (?, ?, 1)
ON CONFLICT(sku) DO UPDATE SET
product_name = COALESCE(NULLIF(excluded.product_name, ''), web_products.product_name),
last_seen = datetime('now'),
order_count = web_products.order_count + 1
""", (sku, product_name))
await db.commit()
finally:
await db.close()
async def upsert_web_products_batch(items: list[tuple[str, str]]):
"""Batch upsert web products in a single transaction. items: list of (sku, product_name)."""
if not items:
return
db = await get_sqlite()
try:
await db.executemany("""
INSERT INTO web_products (sku, product_name, order_count)
VALUES (?, ?, 1)
ON CONFLICT(sku) DO UPDATE SET
product_name = COALESCE(NULLIF(excluded.product_name, ''), web_products.product_name),
last_seen = datetime('now'),
order_count = web_products.order_count + 1
""", items)
await db.commit()
finally:
await db.close()
async def get_web_product_name(sku: str) -> str:
"""Lookup product name by SKU."""
db = await get_sqlite()
try:
cursor = await db.execute(
"SELECT product_name FROM web_products WHERE sku = ?", (sku,)
)
row = await cursor.fetchone()
return row["product_name"] if row else ""
finally:
await db.close()
async def get_web_products_batch(skus: list) -> dict:
"""Batch lookup product names by SKU list. Returns {sku: product_name}."""
if not skus:
return {}
db = await get_sqlite()
try:
placeholders = ",".join("?" for _ in skus)
cursor = await db.execute(
f"SELECT sku, product_name FROM web_products WHERE sku IN ({placeholders})",
list(skus)
)
rows = await cursor.fetchall()
return {row["sku"]: row["product_name"] for row in rows}
finally:
await db.close()
# ── order_items ──────────────────────────────────
async def add_order_items(order_number: str, items: list):
"""Bulk insert order items. Uses INSERT OR IGNORE — PK is (order_number, sku)."""
if not items:
return
db = await get_sqlite()
try:
await db.executemany("""
INSERT OR IGNORE INTO order_items
(order_number, sku, product_name, quantity, price, vat,
mapping_status, codmat, id_articol, cantitate_roa)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""", [
(order_number,
item.get("sku"), item.get("product_name"),
item.get("quantity"), item.get("price"), item.get("vat"),
item.get("mapping_status"), item.get("codmat"),
item.get("id_articol"), item.get("cantitate_roa"))
for item in items
])
await db.commit()
finally:
await db.close()
async def get_order_items(order_number: str) -> list:
"""Fetch items for one order."""
db = await get_sqlite()
try:
cursor = await db.execute("""
SELECT * FROM order_items
WHERE order_number = ?
ORDER BY sku
""", (order_number,))
rows = await cursor.fetchall()
return [dict(row) for row in rows]
finally:
await db.close()
async def get_order_detail(order_number: str) -> dict:
"""Get full order detail: order metadata + items."""
db = await get_sqlite()
try:
cursor = await db.execute("""
SELECT * FROM orders WHERE order_number = ?
""", (order_number,))
order = await cursor.fetchone()
if not order:
return None
cursor = await db.execute("""
SELECT * FROM order_items WHERE order_number = ?
ORDER BY sku
""", (order_number,))
items = await cursor.fetchall()
return {
"order": dict(order),
"items": [dict(i) for i in items]
}
finally:
await db.close()
async def get_run_orders_filtered(run_id: str, status_filter: str = "all",
page: int = 1, per_page: int = 50,
sort_by: str = "order_date", sort_dir: str = "asc"):
"""Get paginated orders for a run via sync_run_orders junction table."""
db = await get_sqlite()
try:
where = "WHERE sro.sync_run_id = ?"
params = [run_id]
if status_filter and status_filter != "all":
where += " AND UPPER(sro.status_at_run) = ?"
params.append(status_filter.upper())
allowed_sort = {"order_date", "order_number", "customer_name", "items_count",
"status", "first_seen_at", "updated_at"}
if sort_by not in allowed_sort:
sort_by = "order_date"
if sort_dir.lower() not in ("asc", "desc"):
sort_dir = "asc"
cursor = await db.execute(
f"SELECT COUNT(*) FROM orders o INNER JOIN sync_run_orders sro "
f"ON sro.order_number = o.order_number {where}", params
)
total = (await cursor.fetchone())[0]
offset = (page - 1) * per_page
cursor = await db.execute(f"""
SELECT o.*, sro.status_at_run AS run_status FROM orders o
INNER JOIN sync_run_orders sro ON sro.order_number = o.order_number
{where}
ORDER BY o.{sort_by} {sort_dir}
LIMIT ? OFFSET ?
""", params + [per_page, offset])
rows = await cursor.fetchall()
cursor = await db.execute("""
SELECT sro.status_at_run AS status, COUNT(*) as cnt
FROM orders o
INNER JOIN sync_run_orders sro ON sro.order_number = o.order_number
WHERE sro.sync_run_id = ?
GROUP BY sro.status_at_run
""", (run_id,))
status_counts = {row["status"]: row["cnt"] for row in await cursor.fetchall()}
# Use run_status (status_at_run) as the status field for each order row
order_rows = []
for r in rows:
d = dict(r)
d["status"] = d.pop("run_status", d.get("status"))
order_rows.append(d)
return {
"orders": order_rows,
"total": total,
"page": page,
"per_page": per_page,
"pages": (total + per_page - 1) // per_page if total > 0 else 0,
"counts": {
"imported": status_counts.get("IMPORTED", 0),
"skipped": status_counts.get("SKIPPED", 0),
"error": status_counts.get("ERROR", 0),
"already_imported": status_counts.get("ALREADY_IMPORTED", 0),
"cancelled": status_counts.get("CANCELLED", 0),
"total": sum(status_counts.values())
}
}
finally:
await db.close()
async def get_orders(page: int = 1, per_page: int = 50,
search: str = "", status_filter: str = "all",
sort_by: str = "order_date", sort_dir: str = "desc",
period_days: int = 7,
period_start: str = "", period_end: str = ""):
"""Get orders with filters, sorting, and period.
period_days=0 with period_start/period_end uses custom date range.
period_days=0 without dates means all time.
"""
db = await get_sqlite()
try:
# Period + search clauses (used for counts — never include status filter)
base_clauses = []
base_params = []
if period_days and period_days > 0:
base_clauses.append("order_date >= date('now', ?)")
base_params.append(f"-{period_days} days")
elif period_days == 0 and period_start and period_end:
base_clauses.append("order_date BETWEEN ? AND ?")
base_params.extend([period_start, period_end])
if search:
base_clauses.append("(order_number LIKE ? OR customer_name LIKE ?)")
base_params.extend([f"%{search}%", f"%{search}%"])
# Data query adds status filter on top of base filters
data_clauses = list(base_clauses)
data_params = list(base_params)
if status_filter and status_filter not in ("all", "UNINVOICED"):
if status_filter.upper() == "IMPORTED":
data_clauses.append("UPPER(status) IN ('IMPORTED', 'ALREADY_IMPORTED')")
else:
data_clauses.append("UPPER(status) = ?")
data_params.append(status_filter.upper())
where = ("WHERE " + " AND ".join(data_clauses)) if data_clauses else ""
counts_where = ("WHERE " + " AND ".join(base_clauses)) if base_clauses else ""
allowed_sort = {"order_date", "order_number", "customer_name", "items_count",
"status", "first_seen_at", "updated_at"}
if sort_by not in allowed_sort:
sort_by = "order_date"
if sort_dir.lower() not in ("asc", "desc"):
sort_dir = "desc"
cursor = await db.execute(f"SELECT COUNT(*) FROM orders {where}", data_params)
total = (await cursor.fetchone())[0]
offset = (page - 1) * per_page
cursor = await db.execute(f"""
SELECT * FROM orders
{where}
ORDER BY {sort_by} {sort_dir}
LIMIT ? OFFSET ?
""", data_params + [per_page, offset])
rows = await cursor.fetchall()
# Counts by status — always on full period+search, never filtered by status
cursor = await db.execute(f"""
SELECT status, COUNT(*) as cnt FROM orders
{counts_where}
GROUP BY status
""", base_params)
status_counts = {row["status"]: row["cnt"] for row in await cursor.fetchall()}
# Uninvoiced count: IMPORTED/ALREADY_IMPORTED with no cached invoice, same period+search
uninv_clauses = list(base_clauses) + [
"UPPER(status) IN ('IMPORTED', 'ALREADY_IMPORTED')",
"(factura_numar IS NULL OR factura_numar = '')",
]
uninv_where = "WHERE " + " AND ".join(uninv_clauses)
cursor = await db.execute(f"SELECT COUNT(*) FROM orders {uninv_where}", base_params)
uninvoiced_sqlite = (await cursor.fetchone())[0]
return {
"orders": [dict(r) for r in rows],
"total": total,
"page": page,
"per_page": per_page,
"pages": (total + per_page - 1) // per_page if total > 0 else 0,
"counts": {
"imported": status_counts.get("IMPORTED", 0),
"already_imported": status_counts.get("ALREADY_IMPORTED", 0),
"imported_all": status_counts.get("IMPORTED", 0) + status_counts.get("ALREADY_IMPORTED", 0),
"skipped": status_counts.get("SKIPPED", 0),
"error": status_counts.get("ERROR", 0),
"cancelled": status_counts.get("CANCELLED", 0),
"total": sum(status_counts.values()),
"uninvoiced_sqlite": uninvoiced_sqlite,
}
}
finally:
await db.close()
async def update_import_order_addresses(order_number: str,
id_adresa_facturare: int = None,
id_adresa_livrare: int = None):
"""Update ROA address IDs on an order record."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE orders SET
id_adresa_facturare = ?,
id_adresa_livrare = ?,
updated_at = datetime('now')
WHERE order_number = ?
""", (id_adresa_facturare, id_adresa_livrare, order_number))
await db.commit()
finally:
await db.close()
# ── Invoice cache ────────────────────────────────
async def get_uninvoiced_imported_orders() -> list:
"""Get all imported orders that don't yet have invoice data cached."""
db = await get_sqlite()
try:
cursor = await db.execute("""
SELECT order_number, id_comanda FROM orders
WHERE status IN ('IMPORTED', 'ALREADY_IMPORTED')
AND id_comanda IS NOT NULL
AND factura_numar IS NULL
""")
rows = await cursor.fetchall()
return [dict(r) for r in rows]
finally:
await db.close()
async def update_order_invoice(order_number: str, serie: str = None,
numar: str = None, total_fara_tva: float = None,
total_tva: float = None, total_cu_tva: float = None,
data_act: str = None):
"""Cache invoice data from Oracle onto the order record."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE orders SET
factura_serie = ?,
factura_numar = ?,
factura_total_fara_tva = ?,
factura_total_tva = ?,
factura_total_cu_tva = ?,
factura_data = ?,
invoice_checked_at = datetime('now'),
updated_at = datetime('now')
WHERE order_number = ?
""", (serie, numar, total_fara_tva, total_tva, total_cu_tva, data_act, order_number))
await db.commit()
finally:
await db.close()
async def get_invoiced_imported_orders() -> list:
"""Get imported orders that HAVE cached invoice data (for re-verification)."""
db = await get_sqlite()
try:
cursor = await db.execute("""
SELECT order_number, id_comanda FROM orders
WHERE status IN ('IMPORTED', 'ALREADY_IMPORTED')
AND id_comanda IS NOT NULL
AND factura_numar IS NOT NULL AND factura_numar != ''
""")
rows = await cursor.fetchall()
return [dict(r) for r in rows]
finally:
await db.close()
async def get_all_imported_orders() -> list:
"""Get ALL imported orders with id_comanda (for checking if deleted in ROA)."""
db = await get_sqlite()
try:
cursor = await db.execute("""
SELECT order_number, id_comanda FROM orders
WHERE status IN ('IMPORTED', 'ALREADY_IMPORTED')
AND id_comanda IS NOT NULL
""")
rows = await cursor.fetchall()
return [dict(r) for r in rows]
finally:
await db.close()
async def clear_order_invoice(order_number: str):
"""Clear cached invoice data when invoice was deleted in ROA."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE orders SET
factura_serie = NULL,
factura_numar = NULL,
factura_total_fara_tva = NULL,
factura_total_tva = NULL,
factura_total_cu_tva = NULL,
factura_data = NULL,
invoice_checked_at = datetime('now'),
updated_at = datetime('now')
WHERE order_number = ?
""", (order_number,))
await db.commit()
finally:
await db.close()
async def mark_order_deleted_in_roa(order_number: str):
"""Mark an order as deleted in ROA — clears id_comanda and invoice cache."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE orders SET
status = 'DELETED_IN_ROA',
id_comanda = NULL,
id_partener = NULL,
factura_serie = NULL,
factura_numar = NULL,
factura_total_fara_tva = NULL,
factura_total_tva = NULL,
factura_total_cu_tva = NULL,
factura_data = NULL,
invoice_checked_at = NULL,
error_message = 'Comanda stearsa din ROA',
updated_at = datetime('now')
WHERE order_number = ?
""", (order_number,))
await db.commit()
finally:
await db.close()
async def mark_order_cancelled(order_number: str, web_status: str = "Anulata"):
"""Mark an order as cancelled from GoMag. Clears id_comanda and invoice cache."""
db = await get_sqlite()
try:
await db.execute("""
UPDATE orders SET
status = 'CANCELLED',
id_comanda = NULL,
id_partener = NULL,
factura_serie = NULL,
factura_numar = NULL,
factura_total_fara_tva = NULL,
factura_total_tva = NULL,
factura_total_cu_tva = NULL,
factura_data = NULL,
invoice_checked_at = NULL,
web_status = ?,
error_message = 'Comanda anulata in GoMag',
updated_at = datetime('now')
WHERE order_number = ?
""", (web_status, order_number))
await db.commit()
finally:
await db.close()
# ── App Settings ─────────────────────────────────
async def get_app_settings() -> dict:
"""Get all app settings as a dict."""
db = await get_sqlite()
try:
cursor = await db.execute("SELECT key, value FROM app_settings")
rows = await cursor.fetchall()
return {row["key"]: row["value"] for row in rows}
finally:
await db.close()
async def set_app_setting(key: str, value: str):
"""Set a single app setting value."""
db = await get_sqlite()
try:
await db.execute("""
INSERT OR REPLACE INTO app_settings (key, value)
VALUES (?, ?)
""", (key, value))
await db.commit()
finally:
await db.close()
# ── Price Sync Runs ───────────────────────────────
async def get_price_sync_runs(page: int = 1, per_page: int = 20):
"""Get paginated price sync run history."""
db = await get_sqlite()
try:
offset = (page - 1) * per_page
cursor = await db.execute("SELECT COUNT(*) FROM price_sync_runs")
total = (await cursor.fetchone())[0]
cursor = await db.execute(
"SELECT * FROM price_sync_runs ORDER BY started_at DESC LIMIT ? OFFSET ?",
(per_page, offset)
)
runs = [dict(r) for r in await cursor.fetchall()]
return {"runs": runs, "total": total, "page": page, "pages": (total + per_page - 1) // per_page}
finally:
await db.close()

View File

@@ -1,871 +0,0 @@
import asyncio
import json
import logging
import uuid
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo
_tz_bucharest = ZoneInfo("Europe/Bucharest")
def _now():
"""Return current time in Bucharest timezone (naive, for display/storage)."""
return datetime.now(_tz_bucharest).replace(tzinfo=None)
from . import order_reader, validation_service, import_service, sqlite_service, invoice_service, gomag_client
from ..config import settings
from .. import database
logger = logging.getLogger(__name__)
# Sync state
_sync_lock = asyncio.Lock()
_current_sync = None # dict with run_id, status, progress info
# In-memory text log buffer per run
_run_logs: dict[str, list[str]] = {}
def _log_line(run_id: str, message: str):
"""Append a timestamped line to the in-memory log buffer."""
if run_id not in _run_logs:
_run_logs[run_id] = []
ts = _now().strftime("%H:%M:%S")
_run_logs[run_id].append(f"[{ts}] {message}")
def get_run_text_log(run_id: str) -> str | None:
"""Return the accumulated text log for a run, or None if not found."""
lines = _run_logs.get(run_id)
if lines is None:
return None
return "\n".join(lines)
def _update_progress(phase: str, phase_text: str, current: int = 0, total: int = 0,
counts: dict = None):
"""Update _current_sync with progress details for polling."""
global _current_sync
if _current_sync is None:
return
_current_sync["phase"] = phase
_current_sync["phase_text"] = phase_text
_current_sync["progress_current"] = current
_current_sync["progress_total"] = total
_current_sync["counts"] = counts or {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0}
async def get_sync_status():
"""Get current sync status."""
if _current_sync:
return {**_current_sync}
return {"status": "idle"}
async def prepare_sync(id_pol: int = None, id_sectie: int = None) -> dict:
"""Prepare a sync run - creates run_id and sets initial state.
Returns {"run_id": ..., "status": "starting"} or {"error": ...} if already running.
"""
global _current_sync
if _sync_lock.locked():
return {"error": "Sync already running", "run_id": _current_sync.get("run_id") if _current_sync else None}
run_id = _now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
_current_sync = {
"run_id": run_id,
"status": "running",
"started_at": _now().isoformat(),
"finished_at": None,
"phase": "starting",
"phase_text": "Starting...",
"progress_current": 0,
"progress_total": 0,
"counts": {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0, "cancelled": 0},
}
return {"run_id": run_id, "status": "starting"}
def _derive_customer_info(order):
"""Extract shipping/billing names and customer from an order.
customer = who appears on the invoice (partner in ROA):
- company name if billing is on a company
- shipping person name otherwise (consistent with import_service partner logic)
"""
shipping_name = ""
if order.shipping:
shipping_name = f"{getattr(order.shipping, 'firstname', '') or ''} {getattr(order.shipping, 'lastname', '') or ''}".strip()
billing_name = f"{getattr(order.billing, 'firstname', '') or ''} {getattr(order.billing, 'lastname', '') or ''}".strip()
if not shipping_name:
shipping_name = billing_name
if order.billing.is_company and order.billing.company_name:
customer = order.billing.company_name
else:
customer = shipping_name or billing_name
payment_method = getattr(order, 'payment_name', None) or None
delivery_method = getattr(order, 'delivery_name', None) or None
return shipping_name.upper(), billing_name.upper(), customer.upper(), payment_method, delivery_method
async def _fix_stale_error_orders(existing_map: dict, run_id: str):
"""Fix orders stuck in ERROR status that are actually in Oracle.
This can happen when a previous import committed partially (no rollback on error).
If the order exists in Oracle COMENZI, update SQLite status to ALREADY_IMPORTED.
"""
from ..database import get_sqlite
db = await get_sqlite()
try:
cursor = await db.execute(
"SELECT order_number FROM orders WHERE status = 'ERROR'"
)
error_orders = [row["order_number"] for row in await cursor.fetchall()]
fixed = 0
for order_number in error_orders:
if order_number in existing_map:
id_comanda = existing_map[order_number]
await db.execute("""
UPDATE orders SET
status = 'ALREADY_IMPORTED',
id_comanda = ?,
error_message = NULL,
updated_at = datetime('now')
WHERE order_number = ? AND status = 'ERROR'
""", (id_comanda, order_number))
fixed += 1
_log_line(run_id, f"#{order_number} → status corectat ERROR → ALREADY_IMPORTED (ID: {id_comanda})")
if fixed:
await db.commit()
logger.info(f"Fixed {fixed} stale ERROR orders that exist in Oracle")
finally:
await db.close()
async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None) -> dict:
"""Run a full sync cycle. Returns summary dict."""
global _current_sync
if _sync_lock.locked():
return {"error": "Sync already running"}
async with _sync_lock:
# Use provided run_id or generate one
if not run_id:
run_id = _now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
_current_sync = {
"run_id": run_id,
"status": "running",
"started_at": _now().isoformat(),
"finished_at": None,
"phase": "reading",
"phase_text": "Reading JSON files...",
"progress_current": 0,
"progress_total": 0,
"counts": {"imported": 0, "skipped": 0, "errors": 0, "already_imported": 0, "cancelled": 0},
}
_update_progress("reading", "Reading JSON files...")
started_dt = _now()
_run_logs[run_id] = [
f"=== Sync Run {run_id} ===",
f"Inceput: {started_dt.strftime('%d.%m.%Y %H:%M:%S')}",
""
]
json_dir = settings.JSON_OUTPUT_DIR
try:
# Phase 0: Download orders from GoMag API
_update_progress("downloading", "Descărcare comenzi din GoMag API...")
_log_line(run_id, "Descărcare comenzi din GoMag API...")
# Read GoMag settings from SQLite (override config defaults)
dl_settings = await sqlite_service.get_app_settings()
gomag_key = dl_settings.get("gomag_api_key") or None
gomag_shop = dl_settings.get("gomag_api_shop") or None
gomag_days_str = dl_settings.get("gomag_order_days_back")
gomag_days = int(gomag_days_str) if gomag_days_str else None
gomag_limit_str = dl_settings.get("gomag_limit")
gomag_limit = int(gomag_limit_str) if gomag_limit_str else None
dl_result = await gomag_client.download_orders(
json_dir, log_fn=lambda msg: _log_line(run_id, msg),
api_key=gomag_key, api_shop=gomag_shop,
days_back=gomag_days, limit=gomag_limit,
)
if dl_result["files"]:
_log_line(run_id, f"GoMag: {dl_result['total']} comenzi în {dl_result['pages']} pagini → {len(dl_result['files'])} fișiere")
_update_progress("reading", "Citire fisiere JSON...")
_log_line(run_id, "Citire fisiere JSON...")
# Step 1: Read orders and sort chronologically (oldest first - R3)
orders, json_count = order_reader.read_json_orders()
orders.sort(key=lambda o: o.date or '')
await sqlite_service.create_sync_run(run_id, json_count)
_update_progress("reading", f"Found {len(orders)} orders in {json_count} files", 0, len(orders))
_log_line(run_id, f"Gasite {len(orders)} comenzi in {json_count} fisiere")
# Populate web_products catalog from all orders (R4)
web_product_items = [
(item.sku, item.name)
for order in orders
for item in order.items
if item.sku and item.name
]
await sqlite_service.upsert_web_products_batch(web_product_items)
if not orders:
_log_line(run_id, "Nicio comanda gasita.")
await sqlite_service.update_sync_run(run_id, "completed", 0, 0, 0, 0)
_update_progress("completed", "No orders found")
summary = {"run_id": run_id, "status": "completed", "message": "No orders found", "json_files": json_count}
return summary
# ── Separate cancelled orders (GoMag status "Anulata" / statusId "7") ──
cancelled_orders = [o for o in orders if o.status_id == "7" or (o.status and o.status.lower() == "anulata")]
active_orders = [o for o in orders if o not in cancelled_orders]
cancelled_count = len(cancelled_orders)
if cancelled_orders:
_log_line(run_id, f"Comenzi anulate in GoMag: {cancelled_count}")
# Record cancelled orders in SQLite
cancelled_batch = []
for order in cancelled_orders:
shipping_name, billing_name, customer, payment_method, delivery_method = _derive_customer_info(order)
order_items_data = [
{"sku": item.sku, "product_name": item.name,
"quantity": item.quantity, "price": item.price, "vat": item.vat,
"mapping_status": "unknown", "codmat": None,
"id_articol": None, "cantitate_roa": None}
for item in order.items
]
cancelled_batch.append({
"sync_run_id": run_id, "order_number": order.number,
"order_date": order.date, "customer_name": customer,
"status": "CANCELLED", "status_at_run": "CANCELLED",
"id_comanda": None, "id_partener": None,
"error_message": "Comanda anulata in GoMag",
"missing_skus": None,
"items_count": len(order.items),
"shipping_name": shipping_name, "billing_name": billing_name,
"payment_method": payment_method, "delivery_method": delivery_method,
"order_total": order.total or None,
"delivery_cost": order.delivery_cost or None,
"discount_total": order.discount_total or None,
"web_status": order.status or "Anulata",
"items": order_items_data,
})
_log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → ANULAT in GoMag")
await sqlite_service.save_orders_batch(cancelled_batch)
# Check if any cancelled orders were previously imported
from ..database import get_sqlite as _get_sqlite
db_check = await _get_sqlite()
try:
cancelled_numbers = [o.number for o in cancelled_orders]
placeholders = ",".join("?" for _ in cancelled_numbers)
cursor = await db_check.execute(f"""
SELECT order_number, id_comanda FROM orders
WHERE order_number IN ({placeholders})
AND id_comanda IS NOT NULL
AND status = 'CANCELLED'
""", cancelled_numbers)
previously_imported = [dict(r) for r in await cursor.fetchall()]
finally:
await db_check.close()
if previously_imported:
_log_line(run_id, f"Verificare {len(previously_imported)} comenzi anulate care erau importate in Oracle...")
# Check which have invoices
id_comanda_list = [o["id_comanda"] for o in previously_imported]
invoice_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, id_comanda_list
)
for o in previously_imported:
idc = o["id_comanda"]
order_num = o["order_number"]
if idc in invoice_data:
# Invoiced — keep in Oracle, just log warning
_log_line(run_id,
f"#{order_num} → ANULAT dar FACTURAT (factura {invoice_data[idc].get('serie_act', '')}"
f"{invoice_data[idc].get('numar_act', '')}) — NU se sterge din Oracle")
# Update web_status but keep CANCELLED status (already set by batch above)
else:
# Not invoiced — soft-delete in Oracle
del_result = await asyncio.to_thread(
import_service.soft_delete_order_in_roa, idc
)
if del_result["success"]:
# Clear id_comanda via mark_order_cancelled
await sqlite_service.mark_order_cancelled(order_num, "Anulata")
_log_line(run_id,
f"#{order_num} → ANULAT + STERS din Oracle (ID: {idc}, "
f"{del_result['details_deleted']} detalii)")
else:
_log_line(run_id,
f"#{order_num} → ANULAT dar EROARE la stergere Oracle: {del_result['error']}")
orders = active_orders
if not orders:
_log_line(run_id, "Nicio comanda activa dupa filtrare anulate.")
await sqlite_service.update_sync_run(run_id, "completed", cancelled_count, 0, 0, 0)
_update_progress("completed", f"No active orders ({cancelled_count} cancelled)")
summary = {"run_id": run_id, "status": "completed",
"message": f"No active orders ({cancelled_count} cancelled)",
"json_files": json_count, "cancelled": cancelled_count}
return summary
_update_progress("validation", f"Validating {len(orders)} orders...", 0, len(orders))
# ── Single Oracle connection for entire validation phase ──
conn = await asyncio.to_thread(database.get_oracle_connection)
try:
# Step 2a: Find orders already in Oracle (date-range query)
order_dates = [o.date for o in orders if o.date]
if order_dates:
min_date_str = min(order_dates)
try:
min_date = datetime.strptime(min_date_str[:10], "%Y-%m-%d") - timedelta(days=1)
except (ValueError, TypeError):
min_date = _now() - timedelta(days=90)
else:
min_date = _now() - timedelta(days=90)
existing_map = await asyncio.to_thread(
validation_service.check_orders_in_roa, min_date, conn
)
# Step 2a-fix: Fix ERROR orders that are actually in Oracle
# (can happen if previous import committed partially without rollback)
await _fix_stale_error_orders(existing_map, run_id)
# Load app settings early (needed for id_gestiune in SKU validation)
app_settings = await sqlite_service.get_app_settings()
id_pol = id_pol or int(app_settings.get("id_pol") or 0) or settings.ID_POL
id_sectie = id_sectie or int(app_settings.get("id_sectie") or 0) or settings.ID_SECTIE
# Parse multi-gestiune CSV: "1,3" → [1, 3], "" → None
id_gestiune_raw = (app_settings.get("id_gestiune") or "").strip()
if id_gestiune_raw and id_gestiune_raw != "0":
id_gestiuni = [int(g) for g in id_gestiune_raw.split(",") if g.strip()]
else:
id_gestiuni = None # None = orice gestiune
logger.info(f"Sync params: ID_POL={id_pol}, ID_SECTIE={id_sectie}, ID_GESTIUNI={id_gestiuni}")
_log_line(run_id, f"Parametri import: ID_POL={id_pol}, ID_SECTIE={id_sectie}, ID_GESTIUNI={id_gestiuni}")
# Step 2b: Validate SKUs (reuse same connection)
all_skus = order_reader.get_all_skus(orders)
validation = await asyncio.to_thread(validation_service.validate_skus, all_skus, conn, id_gestiuni)
importable, skipped = validation_service.classify_orders(orders, validation)
# ── Split importable into truly_importable vs already_in_roa ──
truly_importable = []
already_in_roa = []
for order in importable:
if order.number in existing_map:
already_in_roa.append(order)
else:
truly_importable.append(order)
_update_progress("validation",
f"{len(truly_importable)} new, {len(already_in_roa)} already imported, {len(skipped)} skipped",
0, len(truly_importable))
_log_line(run_id, f"Validare: {len(truly_importable)} noi, {len(already_in_roa)} deja importate, {len(skipped)} nemapate")
# Step 2c: Build SKU context from skipped orders
sku_context = {}
for order, missing_skus_list in skipped:
if order.billing.is_company and order.billing.company_name:
customer = order.billing.company_name
else:
ship_name = ""
if order.shipping:
ship_name = f"{order.shipping.firstname} {order.shipping.lastname}".strip()
customer = ship_name or f"{order.billing.firstname} {order.billing.lastname}"
for sku in missing_skus_list:
if sku not in sku_context:
sku_context[sku] = {"orders": [], "customers": []}
if order.number not in sku_context[sku]["orders"]:
sku_context[sku]["orders"].append(order.number)
if customer not in sku_context[sku]["customers"]:
sku_context[sku]["customers"].append(customer)
# Track missing SKUs with context
for sku in validation["missing"]:
product_name = ""
for order in orders:
for item in order.items:
if item.sku == sku:
product_name = item.name
break
if product_name:
break
ctx = sku_context.get(sku, {})
await sqlite_service.track_missing_sku(
sku, product_name,
order_count=len(ctx.get("orders", [])),
order_numbers=json.dumps(ctx.get("orders", [])) if ctx.get("orders") else None,
customers=json.dumps(ctx.get("customers", [])) if ctx.get("customers") else None,
)
# Auto-resolve missing SKUs that now have mappings
resolved_skus = validation["mapped"] | validation["direct"]
if resolved_skus:
resolved_count = await sqlite_service.resolve_missing_skus_batch(resolved_skus)
if resolved_count:
_log_line(run_id, f"Auto-resolved {resolved_count} previously missing SKUs")
# Step 2d: Pre-validate prices for importable articles
if id_pol and (truly_importable or already_in_roa):
_update_progress("validation", "Validating prices...", 0, len(truly_importable))
_log_line(run_id, "Validare preturi...")
all_codmats = set()
for order in (truly_importable + already_in_roa):
for item in order.items:
if item.sku in validation["mapped"]:
pass
elif item.sku in validation["direct"]:
all_codmats.add(item.sku)
# Get standard VAT rate from settings for PROC_TVAV metadata
cota_tva = float(app_settings.get("discount_vat") or 21)
# Dual pricing policy support
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
codmat_policy_map = {}
if all_codmats:
if id_pol_productie:
# Dual-policy: classify articles by cont (sales vs production)
codmat_policy_map = await asyncio.to_thread(
validation_service.validate_and_ensure_prices_dual,
all_codmats, id_pol, id_pol_productie,
conn, validation.get("direct_id_map"),
cota_tva=cota_tva
)
_log_line(run_id,
f"Politici duale: {sum(1 for v in codmat_policy_map.values() if v == id_pol)} vanzare, "
f"{sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)} productie")
else:
# Single-policy (backward compatible)
price_result = await asyncio.to_thread(
validation_service.validate_prices, all_codmats, id_pol,
conn, validation.get("direct_id_map")
)
if price_result["missing_price"]:
logger.info(
f"Auto-adding price 0 for {len(price_result['missing_price'])} "
f"direct articles in policy {id_pol}"
)
await asyncio.to_thread(
validation_service.ensure_prices,
price_result["missing_price"], id_pol,
conn, validation.get("direct_id_map"),
cota_tva=cota_tva
)
# Also validate mapped SKU prices (cherry-pick 1)
mapped_skus_in_orders = set()
for order in (truly_importable + already_in_roa):
for item in order.items:
if item.sku in validation["mapped"]:
mapped_skus_in_orders.add(item.sku)
mapped_codmat_data = {}
if mapped_skus_in_orders:
mapped_codmat_data = await asyncio.to_thread(
validation_service.resolve_mapped_codmats, mapped_skus_in_orders, conn,
id_gestiuni=id_gestiuni
)
# Build id_map for mapped codmats and validate/ensure their prices
mapped_id_map = {}
for sku, entries in mapped_codmat_data.items():
for entry in entries:
mapped_id_map[entry["codmat"]] = {
"id_articol": entry["id_articol"],
"cont": entry.get("cont")
}
mapped_codmats = set(mapped_id_map.keys())
if mapped_codmats:
if id_pol_productie:
mapped_policy_map = await asyncio.to_thread(
validation_service.validate_and_ensure_prices_dual,
mapped_codmats, id_pol, id_pol_productie,
conn, mapped_id_map, cota_tva=cota_tva
)
codmat_policy_map.update(mapped_policy_map)
else:
mp_result = await asyncio.to_thread(
validation_service.validate_prices,
mapped_codmats, id_pol, conn, mapped_id_map
)
if mp_result["missing_price"]:
await asyncio.to_thread(
validation_service.ensure_prices,
mp_result["missing_price"], id_pol,
conn, mapped_id_map, cota_tva=cota_tva
)
# Add SKU → policy entries for mapped articles (1:1 and kits)
# codmat_policy_map has CODMAT keys, but build_articles_json
# looks up by GoMag SKU — bridge the gap here
if codmat_policy_map and mapped_codmat_data:
for sku, entries in mapped_codmat_data.items():
if len(entries) == 1:
# 1:1 mapping: SKU inherits the CODMAT's policy
codmat = entries[0]["codmat"]
if codmat in codmat_policy_map:
codmat_policy_map[sku] = codmat_policy_map[codmat]
# Pass codmat_policy_map to import via app_settings
if codmat_policy_map:
app_settings["_codmat_policy_map"] = codmat_policy_map
# ── Kit component price validation ──
kit_pricing_mode = app_settings.get("kit_pricing_mode")
if kit_pricing_mode and mapped_codmat_data:
id_pol_prod = int(app_settings.get("id_pol_productie") or 0) or None
kit_missing = await asyncio.to_thread(
validation_service.validate_kit_component_prices,
mapped_codmat_data, id_pol, id_pol_prod, conn
)
if kit_missing:
kit_skus_missing = set(kit_missing.keys())
for sku, missing_codmats in kit_missing.items():
_log_line(run_id, f"Kit {sku}: prețuri lipsă pentru {', '.join(missing_codmats)}")
new_truly = []
for order in truly_importable:
order_skus = {item.sku for item in order.items}
if order_skus & kit_skus_missing:
missing_list = list(order_skus & kit_skus_missing)
skipped.append((order, missing_list))
else:
new_truly.append(order)
truly_importable = new_truly
# Mode B config validation
if kit_pricing_mode == "separate_line":
if not app_settings.get("kit_discount_codmat"):
_log_line(run_id, "EROARE: Kit mode 'separate_line' dar kit_discount_codmat nu e configurat!")
finally:
await asyncio.to_thread(database.pool.release, conn)
# Step 3a: Record already-imported orders (batch)
already_imported_count = len(already_in_roa)
already_batch = []
for order in already_in_roa:
shipping_name, billing_name, customer, payment_method, delivery_method = _derive_customer_info(order)
id_comanda_roa = existing_map.get(order.number)
order_items_data = [
{"sku": item.sku, "product_name": item.name,
"quantity": item.quantity, "price": item.price, "vat": item.vat,
"mapping_status": "mapped" if item.sku in validation["mapped"] else "direct",
"codmat": None, "id_articol": None, "cantitate_roa": None}
for item in order.items
]
already_batch.append({
"sync_run_id": run_id, "order_number": order.number,
"order_date": order.date, "customer_name": customer,
"status": "ALREADY_IMPORTED", "status_at_run": "ALREADY_IMPORTED",
"id_comanda": id_comanda_roa, "id_partener": None,
"error_message": None, "missing_skus": None,
"items_count": len(order.items),
"shipping_name": shipping_name, "billing_name": billing_name,
"payment_method": payment_method, "delivery_method": delivery_method,
"order_total": order.total or None,
"delivery_cost": order.delivery_cost or None,
"discount_total": order.discount_total or None,
"web_status": order.status or None,
"items": order_items_data,
})
_log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → DEJA IMPORTAT (ID: {id_comanda_roa})")
await sqlite_service.save_orders_batch(already_batch)
# Step 3b: Record skipped orders + store items (batch)
skipped_count = len(skipped)
skipped_batch = []
for order, missing_skus in skipped:
shipping_name, billing_name, customer, payment_method, delivery_method = _derive_customer_info(order)
order_items_data = [
{"sku": item.sku, "product_name": item.name,
"quantity": item.quantity, "price": item.price, "vat": item.vat,
"mapping_status": "missing" if item.sku in validation["missing"] else
"mapped" if item.sku in validation["mapped"] else "direct",
"codmat": None, "id_articol": None, "cantitate_roa": None}
for item in order.items
]
skipped_batch.append({
"sync_run_id": run_id, "order_number": order.number,
"order_date": order.date, "customer_name": customer,
"status": "SKIPPED", "status_at_run": "SKIPPED",
"id_comanda": None, "id_partener": None,
"error_message": None, "missing_skus": missing_skus,
"items_count": len(order.items),
"shipping_name": shipping_name, "billing_name": billing_name,
"payment_method": payment_method, "delivery_method": delivery_method,
"order_total": order.total or None,
"delivery_cost": order.delivery_cost or None,
"discount_total": order.discount_total or None,
"web_status": order.status or None,
"items": order_items_data,
})
_log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → OMIS (lipsa: {', '.join(missing_skus)})")
await sqlite_service.save_orders_batch(skipped_batch)
# ── Price sync from orders ──
if app_settings.get("price_sync_enabled") == "1":
try:
all_sync_orders = truly_importable + already_in_roa
direct_id_map = validation.get("direct_id_map", {})
id_pol_prod = int(app_settings.get("id_pol_productie") or 0) or None
price_updates = await asyncio.to_thread(
validation_service.sync_prices_from_order,
all_sync_orders, mapped_codmat_data,
direct_id_map, codmat_policy_map, id_pol,
id_pol_productie=id_pol_prod,
settings=app_settings
)
if price_updates:
_log_line(run_id, f"Sync prețuri: {len(price_updates)} prețuri actualizate")
for pu in price_updates:
_log_line(run_id, f" {pu['codmat']}: {pu['old_price']:.2f}{pu['new_price']:.2f}")
except Exception as e:
_log_line(run_id, f"Eroare sync prețuri din comenzi: {e}")
logger.error(f"Price sync error: {e}")
_update_progress("skipped", f"Skipped {skipped_count}",
0, len(truly_importable),
{"imported": 0, "skipped": skipped_count, "errors": 0, "already_imported": already_imported_count})
# Step 4: Import only truly new orders
imported_count = 0
error_count = 0
for i, order in enumerate(truly_importable):
shipping_name, billing_name, customer, payment_method, delivery_method = _derive_customer_info(order)
_update_progress("import",
f"Import {i+1}/{len(truly_importable)}: #{order.number} {customer}",
i + 1, len(truly_importable),
{"imported": imported_count, "skipped": len(skipped), "errors": error_count,
"already_imported": already_imported_count})
result = await asyncio.to_thread(
import_service.import_single_order,
order, id_pol=id_pol, id_sectie=id_sectie,
app_settings=app_settings, id_gestiuni=id_gestiuni
)
# Build order items data for storage (R9)
order_items_data = []
for item in order.items:
ms = "mapped" if item.sku in validation["mapped"] else "direct"
order_items_data.append({
"sku": item.sku, "product_name": item.name,
"quantity": item.quantity, "price": item.price, "vat": item.vat,
"mapping_status": ms, "codmat": None, "id_articol": None,
"cantitate_roa": None
})
# Compute discount split for SQLite storage
ds = import_service.compute_discount_split(order, app_settings)
discount_split_json = json.dumps(ds) if ds else None
if result["success"]:
imported_count += 1
await sqlite_service.upsert_order(
sync_run_id=run_id,
order_number=order.number,
order_date=order.date,
customer_name=customer,
status="IMPORTED",
id_comanda=result["id_comanda"],
id_partener=result["id_partener"],
items_count=len(order.items),
shipping_name=shipping_name,
billing_name=billing_name,
payment_method=payment_method,
delivery_method=delivery_method,
order_total=order.total or None,
delivery_cost=order.delivery_cost or None,
discount_total=order.discount_total or None,
web_status=order.status or None,
discount_split=discount_split_json,
)
await sqlite_service.add_sync_run_order(run_id, order.number, "IMPORTED")
# Store ROA address IDs (R9)
await sqlite_service.update_import_order_addresses(
order.number,
id_adresa_facturare=result.get("id_adresa_facturare"),
id_adresa_livrare=result.get("id_adresa_livrare")
)
await sqlite_service.add_order_items(order.number, order_items_data)
_log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → IMPORTAT (ID: {result['id_comanda']})")
else:
error_count += 1
await sqlite_service.upsert_order(
sync_run_id=run_id,
order_number=order.number,
order_date=order.date,
customer_name=customer,
status="ERROR",
id_partener=result.get("id_partener"),
error_message=result["error"],
items_count=len(order.items),
shipping_name=shipping_name,
billing_name=billing_name,
payment_method=payment_method,
delivery_method=delivery_method,
order_total=order.total or None,
delivery_cost=order.delivery_cost or None,
discount_total=order.discount_total or None,
web_status=order.status or None,
discount_split=discount_split_json,
)
await sqlite_service.add_sync_run_order(run_id, order.number, "ERROR")
await sqlite_service.add_order_items(order.number, order_items_data)
_log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → EROARE: {result['error']}")
# Safety: stop if too many errors
if error_count > 10:
logger.warning("Too many errors, stopping sync")
break
# Step 4b: Invoice & order status check — sync with Oracle
_update_progress("invoices", "Checking invoices & order status...", 0, 0)
invoices_updated = 0
invoices_cleared = 0
orders_deleted = 0
try:
# 4b-1: Uninvoiced → check for new invoices
uninvoiced = await sqlite_service.get_uninvoiced_imported_orders()
if uninvoiced:
id_comanda_list = [o["id_comanda"] for o in uninvoiced]
invoice_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, id_comanda_list
)
id_to_order = {o["id_comanda"]: o["order_number"] for o in uninvoiced}
for idc, inv in invoice_data.items():
order_num = id_to_order.get(idc)
if order_num and inv.get("facturat"):
await sqlite_service.update_order_invoice(
order_num,
serie=inv.get("serie_act"),
numar=str(inv.get("numar_act", "")),
total_fara_tva=inv.get("total_fara_tva"),
total_tva=inv.get("total_tva"),
total_cu_tva=inv.get("total_cu_tva"),
data_act=inv.get("data_act"),
)
invoices_updated += 1
# 4b-2: Invoiced → check for deleted invoices
invoiced = await sqlite_service.get_invoiced_imported_orders()
if invoiced:
id_comanda_list = [o["id_comanda"] for o in invoiced]
invoice_data = await asyncio.to_thread(
invoice_service.check_invoices_for_orders, id_comanda_list
)
for o in invoiced:
if o["id_comanda"] not in invoice_data:
await sqlite_service.clear_order_invoice(o["order_number"])
invoices_cleared += 1
# 4b-3: All imported → check for deleted orders in ROA
all_imported = await sqlite_service.get_all_imported_orders()
if all_imported:
id_comanda_list = [o["id_comanda"] for o in all_imported]
existing_ids = await asyncio.to_thread(
invoice_service.check_orders_exist, id_comanda_list
)
for o in all_imported:
if o["id_comanda"] not in existing_ids:
await sqlite_service.mark_order_deleted_in_roa(o["order_number"])
orders_deleted += 1
if invoices_updated:
_log_line(run_id, f"Facturi noi: {invoices_updated} comenzi facturate")
if invoices_cleared:
_log_line(run_id, f"Facturi sterse: {invoices_cleared} facturi eliminate din cache")
if orders_deleted:
_log_line(run_id, f"Comenzi sterse din ROA: {orders_deleted}")
except Exception as e:
logger.warning(f"Invoice/order status check failed: {e}")
# Step 5: Update sync run
total_imported = imported_count + already_imported_count # backward-compat
status = "completed" if error_count <= 10 else "failed"
await sqlite_service.update_sync_run(
run_id, status, len(orders), total_imported, len(skipped), error_count,
already_imported=already_imported_count, new_imported=imported_count
)
summary = {
"run_id": run_id,
"status": status,
"json_files": json_count,
"total_orders": len(orders) + cancelled_count,
"new_orders": len(truly_importable),
"imported": total_imported,
"new_imported": imported_count,
"already_imported": already_imported_count,
"skipped": len(skipped),
"errors": error_count,
"cancelled": cancelled_count,
"missing_skus": len(validation["missing"]),
"invoices_updated": invoices_updated,
"invoices_cleared": invoices_cleared,
"orders_deleted_in_roa": orders_deleted,
}
_update_progress("completed",
f"Completed: {imported_count} new, {already_imported_count} already, {len(skipped)} skipped, {error_count} errors, {cancelled_count} cancelled",
len(truly_importable), len(truly_importable),
{"imported": imported_count, "skipped": len(skipped), "errors": error_count,
"already_imported": already_imported_count, "cancelled": cancelled_count})
if _current_sync:
_current_sync["status"] = status
_current_sync["finished_at"] = _now().isoformat()
logger.info(
f"Sync {run_id} completed: {imported_count} new, {already_imported_count} already imported, "
f"{len(skipped)} skipped, {error_count} errors, {cancelled_count} cancelled"
)
duration = (_now() - started_dt).total_seconds()
_log_line(run_id, "")
cancelled_text = f", {cancelled_count} anulate" if cancelled_count else ""
_run_logs[run_id].append(
f"Finalizat: {imported_count} importate, {already_imported_count} deja importate, "
f"{len(skipped)} nemapate, {error_count} erori{cancelled_text} din {len(orders) + cancelled_count} comenzi | Durata: {int(duration)}s"
)
return summary
except Exception as e:
logger.error(f"Sync {run_id} failed: {e}")
_log_line(run_id, f"EROARE FATALA: {e}")
await sqlite_service.update_sync_run(run_id, "failed", 0, 0, 0, 1, error_message=str(e))
if _current_sync:
_current_sync["status"] = "failed"
_current_sync["finished_at"] = _now().isoformat()
_current_sync["error"] = str(e)
return {"run_id": run_id, "status": "failed", "error": str(e)}
finally:
# Keep _current_sync for 10 seconds so status endpoint can show final result
async def _clear_current_sync():
await asyncio.sleep(10)
global _current_sync
_current_sync = None
asyncio.ensure_future(_clear_current_sync())
async def _clear_run_logs():
await asyncio.sleep(300) # 5 minutes
_run_logs.pop(run_id, None)
asyncio.ensure_future(_clear_run_logs())
def stop_sync():
"""Signal sync to stop. Currently sync runs to completion."""
pass

View File

@@ -1,588 +0,0 @@
import logging
from .. import database
logger = logging.getLogger(__name__)
def check_orders_in_roa(min_date, conn) -> dict:
"""Check which orders already exist in Oracle COMENZI by date range.
Returns: {comanda_externa: id_comanda} for all existing orders.
Much faster than IN-clause batching — single query using date index.
"""
if conn is None:
return {}
existing = {}
try:
with conn.cursor() as cur:
cur.execute("""
SELECT comanda_externa, id_comanda FROM COMENZI
WHERE data_comanda >= :min_date
AND comanda_externa IS NOT NULL AND sters = 0
""", {"min_date": min_date})
for row in cur:
existing[str(row[0])] = row[1]
except Exception as e:
logger.error(f"check_orders_in_roa failed: {e}")
logger.info(f"ROA order check (since {min_date}): {len(existing)} existing orders found")
return existing
def resolve_codmat_ids(codmats: set[str], id_gestiuni: list[int] = None, conn=None) -> dict[str, dict]:
"""Resolve CODMATs to best id_articol + cont: prefers article with stock, then MAX(id_articol).
Filters: sters=0 AND inactiv=0.
id_gestiuni: list of warehouse IDs to check stock in, or None for all.
Returns: {codmat: {"id_articol": int, "cont": str|None}}
"""
if not codmats:
return {}
result = {}
codmat_list = list(codmats)
# Build stoc subquery dynamically for index optimization
if id_gestiuni:
gest_placeholders = ",".join([f":g{k}" for k in range(len(id_gestiuni))])
stoc_filter = f"AND s.id_gestiune IN ({gest_placeholders})"
else:
stoc_filter = ""
own_conn = conn is None
if own_conn:
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for i in range(0, len(codmat_list), 500):
batch = codmat_list[i:i+500]
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
params = {f"c{j}": cm for j, cm in enumerate(batch)}
if id_gestiuni:
for k, gid in enumerate(id_gestiuni):
params[f"g{k}"] = gid
cur.execute(f"""
SELECT codmat, id_articol, cont FROM (
SELECT na.codmat, na.id_articol, na.cont,
ROW_NUMBER() OVER (
PARTITION BY na.codmat
ORDER BY
CASE WHEN EXISTS (
SELECT 1 FROM stoc s
WHERE s.id_articol = na.id_articol
{stoc_filter}
AND s.an = EXTRACT(YEAR FROM SYSDATE)
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
AND s.cants + s.cant - s.cante > 0
) THEN 0 ELSE 1 END,
na.id_articol DESC
) AS rn
FROM nom_articole na
WHERE na.codmat IN ({placeholders})
AND na.sters = 0 AND na.inactiv = 0
) WHERE rn = 1
""", params)
for row in cur:
result[row[0]] = {"id_articol": row[1], "cont": row[2]}
finally:
if own_conn:
database.pool.release(conn)
logger.info(f"resolve_codmat_ids: {len(result)}/{len(codmats)} resolved (gestiuni={id_gestiuni})")
return result
def validate_skus(skus: set[str], conn=None, id_gestiuni: list[int] = None) -> dict:
"""Validate a set of SKUs against Oracle.
Returns: {mapped: set, direct: set, missing: set, direct_id_map: {codmat: {"id_articol": int, "cont": str|None}}}
- mapped: found in ARTICOLE_TERTI (active)
- direct: found in NOM_ARTICOLE by codmat (not in ARTICOLE_TERTI)
- missing: not found anywhere
- direct_id_map: {codmat: {"id_articol": int, "cont": str|None}} for direct SKUs
"""
if not skus:
return {"mapped": set(), "direct": set(), "missing": set(), "direct_id_map": {}}
mapped = set()
sku_list = list(skus)
own_conn = conn is None
if own_conn:
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
# Check in batches of 500
for i in range(0, len(sku_list), 500):
batch = sku_list[i:i+500]
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
params = {f"s{j}": sku for j, sku in enumerate(batch)}
# Check ARTICOLE_TERTI
cur.execute(f"""
SELECT DISTINCT sku FROM ARTICOLE_TERTI
WHERE sku IN ({placeholders}) AND activ = 1 AND sters = 0
""", params)
for row in cur:
mapped.add(row[0])
# Resolve remaining SKUs via resolve_codmat_ids (consistent id_articol selection)
all_remaining = [s for s in sku_list if s not in mapped]
if all_remaining:
direct_id_map = resolve_codmat_ids(set(all_remaining), id_gestiuni, conn)
direct = set(direct_id_map.keys())
else:
direct_id_map = {}
direct = set()
finally:
if own_conn:
database.pool.release(conn)
missing = skus - mapped - direct
logger.info(f"SKU validation: {len(mapped)} mapped, {len(direct)} direct, {len(missing)} missing")
return {"mapped": mapped, "direct": direct, "missing": missing,
"direct_id_map": direct_id_map}
def classify_orders(orders, validation_result):
"""Classify orders as importable or skipped based on SKU validation.
Returns: (importable_orders, skipped_orders)
Each skipped entry is a tuple of (order, list_of_missing_skus).
"""
ok_skus = validation_result["mapped"] | validation_result["direct"]
importable = []
skipped = []
for order in orders:
order_skus = {item.sku for item in order.items if item.sku}
order_missing = order_skus - ok_skus
if order_missing:
skipped.append((order, list(order_missing)))
else:
importable.append(order)
return importable, skipped
def _extract_id_map(direct_id_map: dict) -> dict:
"""Extract {codmat: id_articol} from either enriched or simple format."""
if not direct_id_map:
return {}
result = {}
for cm, val in direct_id_map.items():
if isinstance(val, dict):
result[cm] = val["id_articol"]
else:
result[cm] = val
return result
def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None) -> dict:
"""Check which CODMATs have a price entry in CRM_POLITICI_PRET_ART for the given policy.
If direct_id_map is provided, skips the NOM_ARTICOLE lookup for those CODMATs.
Returns: {"has_price": set_of_codmats, "missing_price": set_of_codmats}
"""
if not codmats:
return {"has_price": set(), "missing_price": set()}
codmat_to_id = _extract_id_map(direct_id_map)
ids_with_price = set()
own_conn = conn is None
if own_conn:
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
# Check which ID_ARTICOLs have a price in the policy
id_list = list(codmat_to_id.values())
for i in range(0, len(id_list), 500):
batch = id_list[i:i+500]
placeholders = ",".join([f":a{j}" for j in range(len(batch))])
params = {f"a{j}": aid for j, aid in enumerate(batch)}
params["id_pol"] = id_pol
cur.execute(f"""
SELECT DISTINCT pa.ID_ARTICOL FROM CRM_POLITICI_PRET_ART pa
WHERE pa.ID_POL = :id_pol AND pa.ID_ARTICOL IN ({placeholders})
""", params)
for row in cur:
ids_with_price.add(row[0])
finally:
if own_conn:
database.pool.release(conn)
# Map back to CODMATs
has_price = {cm for cm, aid in codmat_to_id.items() if aid in ids_with_price}
missing_price = codmats - has_price
logger.info(f"Price validation (policy {id_pol}): {len(has_price)} have price, {len(missing_price)} missing price")
return {"has_price": has_price, "missing_price": missing_price}
def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None,
cota_tva: float = None):
"""Insert price 0 entries for CODMATs missing from the given price policy.
Uses batch executemany instead of individual INSERTs.
Relies on TRG_CRM_POLITICI_PRET_ART trigger for ID_POL_ART sequence.
cota_tva: VAT rate from settings (e.g. 21) — used for PROC_TVAV metadata.
"""
if not codmats:
return
proc_tvav = 1 + (cota_tva / 100) if cota_tva else 1.21
own_conn = conn is None
if own_conn:
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
# Get ID_VALUTA for this policy
cur.execute("""
SELECT ID_VALUTA FROM CRM_POLITICI_PRETURI WHERE ID_POL = :id_pol
""", {"id_pol": id_pol})
row = cur.fetchone()
if not row:
logger.error(f"Price policy {id_pol} not found in CRM_POLITICI_PRETURI")
return
id_valuta = row[0]
# Build batch params using direct_id_map (already resolved via resolve_codmat_ids)
batch_params = []
codmat_id_map = _extract_id_map(direct_id_map)
for codmat in codmats:
id_articol = codmat_id_map.get(codmat)
if not id_articol:
logger.warning(f"CODMAT {codmat} not found in NOM_ARTICOLE, skipping price insert")
continue
batch_params.append({
"id_pol": id_pol,
"id_articol": id_articol,
"id_valuta": id_valuta,
"proc_tvav": proc_tvav
})
if batch_params:
cur.executemany("""
INSERT INTO CRM_POLITICI_PRET_ART
(ID_POL, ID_ARTICOL, PRET, ID_VALUTA,
ID_UTIL, DATAORA, PROC_TVAV, PRETFTVA, PRETCTVA)
VALUES
(:id_pol, :id_articol, 0, :id_valuta,
-3, SYSDATE, :proc_tvav, 0, 0)
""", batch_params)
logger.info(f"Batch inserted {len(batch_params)} price entries for policy {id_pol} (PROC_TVAV={proc_tvav})")
conn.commit()
finally:
if own_conn:
database.pool.release(conn)
logger.info(f"Ensure prices done: {len(codmats)} CODMATs processed for policy {id_pol}")
def validate_and_ensure_prices_dual(codmats: set[str], id_pol_vanzare: int,
id_pol_productie: int, conn, direct_id_map: dict,
cota_tva: float = 21) -> dict[str, int]:
"""Dual-policy price validation: assign each CODMAT to sales or production policy.
Logic:
1. Check both policies in one SQL
2. If article in one policy → use that
3. If article in BOTH → prefer id_pol_vanzare
4. If article in NEITHER → check cont: 341/345 → production, else → sales; insert price 0
Returns: codmat_policy_map = {codmat: assigned_id_pol}
"""
if not codmats:
return {}
codmat_policy_map = {}
id_map = _extract_id_map(direct_id_map)
# Collect all id_articol values we need to check
id_to_codmats = {} # {id_articol: [codmat, ...]}
for cm in codmats:
aid = id_map.get(cm)
if aid:
id_to_codmats.setdefault(aid, []).append(cm)
if not id_to_codmats:
return {}
# Query both policies in one SQL
existing = {} # {id_articol: set of id_pol}
id_list = list(id_to_codmats.keys())
with conn.cursor() as cur:
for i in range(0, len(id_list), 500):
batch = id_list[i:i+500]
placeholders = ",".join([f":a{j}" for j in range(len(batch))])
params = {f"a{j}": aid for j, aid in enumerate(batch)}
params["id_pol_v"] = id_pol_vanzare
params["id_pol_p"] = id_pol_productie
cur.execute(f"""
SELECT pa.ID_ARTICOL, pa.ID_POL FROM CRM_POLITICI_PRET_ART pa
WHERE pa.ID_POL IN (:id_pol_v, :id_pol_p) AND pa.ID_ARTICOL IN ({placeholders})
""", params)
for row in cur:
existing.setdefault(row[0], set()).add(row[1])
# Classify each codmat
missing_vanzare = set() # CODMATs needing price 0 in sales policy
missing_productie = set() # CODMATs needing price 0 in production policy
for aid, cms in id_to_codmats.items():
pols = existing.get(aid, set())
for cm in cms:
if pols:
if id_pol_vanzare in pols:
codmat_policy_map[cm] = id_pol_vanzare
elif id_pol_productie in pols:
codmat_policy_map[cm] = id_pol_productie
else:
# Not in any policy — classify by cont
info = direct_id_map.get(cm, {})
cont = info.get("cont", "") if isinstance(info, dict) else ""
cont_str = str(cont or "").strip()
if cont_str in ("341", "345"):
codmat_policy_map[cm] = id_pol_productie
missing_productie.add(cm)
else:
codmat_policy_map[cm] = id_pol_vanzare
missing_vanzare.add(cm)
# Ensure prices for missing articles in each policy
if missing_vanzare:
ensure_prices(missing_vanzare, id_pol_vanzare, conn, direct_id_map, cota_tva=cota_tva)
if missing_productie:
ensure_prices(missing_productie, id_pol_productie, conn, direct_id_map, cota_tva=cota_tva)
logger.info(
f"Dual-policy: {len(codmat_policy_map)} CODMATs assigned "
f"(vanzare={sum(1 for v in codmat_policy_map.values() if v == id_pol_vanzare)}, "
f"productie={sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)})"
)
return codmat_policy_map
def resolve_mapped_codmats(mapped_skus: set[str], conn,
id_gestiuni: list[int] = None) -> dict[str, list[dict]]:
"""For mapped SKUs, get their underlying CODMATs from ARTICOLE_TERTI + nom_articole.
Uses ROW_NUMBER to pick the best id_articol per (SKU, CODMAT) pair:
prefers article with stock in current month, then MAX(id_articol) as fallback.
This avoids inflating results when a CODMAT has multiple NOM_ARTICOLE entries.
Returns: {sku: [{"codmat": str, "id_articol": int, "cont": str|None, "cantitate_roa": float|None}]}
"""
if not mapped_skus:
return {}
# Build stoc subquery gestiune filter (same pattern as resolve_codmat_ids)
if id_gestiuni:
gest_placeholders = ",".join([f":g{k}" for k in range(len(id_gestiuni))])
stoc_filter = f"AND s.id_gestiune IN ({gest_placeholders})"
else:
stoc_filter = ""
result = {}
sku_list = list(mapped_skus)
with conn.cursor() as cur:
for i in range(0, len(sku_list), 500):
batch = sku_list[i:i+500]
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
params = {f"s{j}": sku for j, sku in enumerate(batch)}
if id_gestiuni:
for k, gid in enumerate(id_gestiuni):
params[f"g{k}"] = gid
cur.execute(f"""
SELECT sku, codmat, id_articol, cont, cantitate_roa FROM (
SELECT at.sku, at.codmat, na.id_articol, na.cont, at.cantitate_roa,
ROW_NUMBER() OVER (
PARTITION BY at.sku, at.codmat
ORDER BY
CASE WHEN EXISTS (
SELECT 1 FROM stoc s
WHERE s.id_articol = na.id_articol
{stoc_filter}
AND s.an = EXTRACT(YEAR FROM SYSDATE)
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
AND s.cants + s.cant - s.cante > 0
) THEN 0 ELSE 1 END,
na.id_articol DESC
) AS rn
FROM ARTICOLE_TERTI at
JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
WHERE at.sku IN ({placeholders}) AND at.activ = 1 AND at.sters = 0
) WHERE rn = 1
""", params)
for row in cur:
sku = row[0]
if sku not in result:
result[sku] = []
result[sku].append({
"codmat": row[1],
"id_articol": row[2],
"cont": row[3],
"cantitate_roa": row[4]
})
logger.info(f"resolve_mapped_codmats: {len(result)} SKUs → {sum(len(v) for v in result.values())} CODMATs")
return result
def validate_kit_component_prices(mapped_codmat_data: dict, id_pol: int,
id_pol_productie: int = None, conn=None) -> dict:
"""Pre-validate that kit components have non-zero prices in crm_politici_pret_art.
Args:
mapped_codmat_data: {sku: [{"codmat", "id_articol", "cont"}, ...]} from resolve_mapped_codmats
id_pol: default sales price policy
id_pol_productie: production price policy (for cont 341/345)
Returns: {sku: [missing_codmats]} for SKUs with missing prices, {} if all OK
"""
missing = {}
own_conn = conn is None
if own_conn:
conn = database.get_oracle_connection()
try:
with conn.cursor() as cur:
for sku, components in mapped_codmat_data.items():
if len(components) == 0:
continue
if len(components) == 1 and (components[0].get("cantitate_roa") or 1) <= 1:
continue # True 1:1 mapping, no kit pricing needed
sku_missing = []
for comp in components:
cont = str(comp.get("cont") or "").strip()
if cont in ("341", "345") and id_pol_productie:
pol = id_pol_productie
else:
pol = id_pol
cur.execute("""
SELECT PRET FROM crm_politici_pret_art
WHERE id_pol = :pol AND id_articol = :id_art
""", {"pol": pol, "id_art": comp["id_articol"]})
row = cur.fetchone()
if not row:
sku_missing.append(comp["codmat"])
if sku_missing:
missing[sku] = sku_missing
finally:
if own_conn:
database.pool.release(conn)
return missing
def compare_and_update_price(id_articol: int, id_pol: int, web_price_cu_tva: float,
conn, tolerance: float = 0.01) -> dict | None:
"""Compare web price with ROA price and update if different.
Handles PRETURI_CU_TVA flag per policy.
Returns: {"updated": bool, "old_price": float, "new_price": float, "codmat": str} or None if no price entry.
"""
with conn.cursor() as cur:
cur.execute("SELECT PRETURI_CU_TVA FROM CRM_POLITICI_PRETURI WHERE ID_POL = :pol", {"pol": id_pol})
pol_row = cur.fetchone()
if not pol_row:
return None
preturi_cu_tva = pol_row[0] # 1 or 0
cur.execute("""
SELECT PRET, PROC_TVAV, na.codmat
FROM crm_politici_pret_art pa
JOIN nom_articole na ON na.id_articol = pa.id_articol
WHERE pa.id_pol = :pol AND pa.id_articol = :id_art
""", {"pol": id_pol, "id_art": id_articol})
row = cur.fetchone()
if not row:
return None
pret_roa, proc_tvav, codmat = row[0], row[1], row[2]
proc_tvav = proc_tvav or 1.19
if preturi_cu_tva == 1:
pret_roa_cu_tva = pret_roa
else:
pret_roa_cu_tva = pret_roa * proc_tvav
if abs(pret_roa_cu_tva - web_price_cu_tva) <= tolerance:
return {"updated": False, "old_price": pret_roa_cu_tva, "new_price": web_price_cu_tva, "codmat": codmat}
if preturi_cu_tva == 1:
new_pret = web_price_cu_tva
else:
new_pret = round(web_price_cu_tva / proc_tvav, 4)
cur.execute("""
UPDATE crm_politici_pret_art SET PRET = :pret, DATAORA = SYSDATE
WHERE id_pol = :pol AND id_articol = :id_art
""", {"pret": new_pret, "pol": id_pol, "id_art": id_articol})
conn.commit()
return {"updated": True, "old_price": pret_roa_cu_tva, "new_price": web_price_cu_tva, "codmat": codmat}
def sync_prices_from_order(orders, mapped_codmat_data: dict, direct_id_map: dict,
codmat_policy_map: dict, id_pol: int,
id_pol_productie: int = None, conn=None,
settings: dict = None) -> list:
"""Sync prices from order items to ROA for direct/1:1 mappings.
Skips kit components and transport/discount CODMATs.
Returns: list of {"codmat", "old_price", "new_price"} for updated prices.
"""
if settings and settings.get("price_sync_enabled") != "1":
return []
transport_codmat = (settings or {}).get("transport_codmat", "")
discount_codmat = (settings or {}).get("discount_codmat", "")
kit_discount_codmat = (settings or {}).get("kit_discount_codmat", "")
skip_codmats = {transport_codmat, discount_codmat, kit_discount_codmat} - {""}
# Build set of kit/bax SKUs (>1 component, or single component with cantitate_roa > 1)
kit_skus = {sku for sku, comps in mapped_codmat_data.items()
if len(comps) > 1 or (len(comps) == 1 and (comps[0].get("cantitate_roa") or 1) > 1)}
updated = []
own_conn = conn is None
if own_conn:
conn = database.get_oracle_connection()
try:
for order in orders:
for item in order.items:
sku = item.sku
if not sku or sku in skip_codmats:
continue
if sku in kit_skus:
continue # Don't sync prices from kit orders
web_price = item.price # already with TVA
if not web_price or web_price <= 0:
continue
# Determine id_articol and price policy for this SKU
if sku in mapped_codmat_data and len(mapped_codmat_data[sku]) == 1:
# 1:1 mapping via ARTICOLE_TERTI
comp = mapped_codmat_data[sku][0]
id_articol = comp["id_articol"]
cantitate_roa = comp.get("cantitate_roa") or 1
web_price_per_unit = web_price / cantitate_roa if cantitate_roa != 1 else web_price
elif sku in (direct_id_map or {}):
info = direct_id_map[sku]
id_articol = info["id_articol"] if isinstance(info, dict) else info
web_price_per_unit = web_price
else:
continue
pol = codmat_policy_map.get(sku, id_pol)
result = compare_and_update_price(id_articol, pol, web_price_per_unit, conn)
if result and result["updated"]:
updated.append(result)
finally:
if own_conn:
database.pool.release(conn)
return updated

View File

@@ -1,796 +0,0 @@
/* ── Design tokens ───────────────────────────────── */
:root {
/* Surfaces */
--body-bg: #f9fafb;
--card-bg: #ffffff;
--card-shadow: 0 1px 3px rgba(0,0,0,0.1), 0 1px 2px rgba(0,0,0,0.06);
--card-radius: 0.5rem;
/* Semantic colors */
--blue-600: #2563eb;
--blue-700: #1d4ed8;
--green-100: #dcfce7; --green-800: #166534;
--yellow-100: #fef9c3; --yellow-800: #854d0e;
--red-100: #fee2e2; --red-800: #991b1b;
--blue-100: #dbeafe; --blue-800: #1e40af;
/* Text */
--text-primary: #111827;
--text-secondary: #4b5563;
--text-muted: #6b7280;
--border-color: #e5e7eb;
/* Dots */
--dot-green: #22c55e;
--dot-yellow: #eab308;
--dot-red: #ef4444;
}
/* ── Base ────────────────────────────────────────── */
body {
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif;
font-size: 1rem;
background-color: var(--body-bg);
margin: 0;
padding: 0;
}
h1, h2, h3, h4, h5, h6 {
text-wrap: balance;
}
/* ── Checkboxes — accessible size ────────────────── */
input[type="checkbox"] {
width: 1.125rem;
height: 1.125rem;
accent-color: var(--blue-600);
cursor: pointer;
}
/* ── Top Navbar ──────────────────────────────────── */
.top-navbar {
position: fixed;
top: 0;
left: 0;
right: 0;
height: 48px;
background: #fff;
border-bottom: 1px solid var(--border-color);
display: flex;
align-items: center;
padding: 0 1.5rem;
gap: 1.5rem;
z-index: 1000;
box-shadow: 0 1px 3px rgba(0,0,0,0.06);
}
.navbar-brand {
font-weight: 700;
font-size: 1rem;
color: #111827;
white-space: nowrap;
}
.navbar-links {
display: flex;
align-items: stretch;
gap: 0;
overflow-x: auto;
-webkit-overflow-scrolling: touch;
scrollbar-width: none;
}
.navbar-links::-webkit-scrollbar { display: none; }
.nav-tab {
display: flex;
align-items: center;
padding: 0 1rem;
height: 48px;
color: #64748b;
text-decoration: none;
font-size: 0.9375rem;
font-weight: 500;
border-bottom: 2px solid transparent;
white-space: nowrap;
flex-shrink: 0;
transition: color 0.15s, border-color 0.15s;
}
.nav-tab:hover {
color: #111827;
background: #f9fafb;
text-decoration: none;
}
.nav-tab.active {
color: var(--blue-600);
border-bottom-color: var(--blue-600);
}
/* ── Main content ────────────────────────────────── */
.main-content {
padding-top: 64px;
padding-left: 1.5rem;
padding-right: 1.5rem;
padding-bottom: 1.5rem;
min-height: 100vh;
max-width: 1280px;
margin-left: auto;
margin-right: auto;
}
/* ── Cards ───────────────────────────────────────── */
.card {
border: none;
box-shadow: var(--card-shadow);
border-radius: var(--card-radius);
background: var(--card-bg);
}
.card-header {
background: var(--card-bg);
border-bottom: 1px solid var(--border-color);
font-weight: 600;
font-size: 0.9375rem;
padding: 0.75rem 1rem;
}
/* ── Tables ──────────────────────────────────────── */
.table {
font-size: 1rem;
}
.table th {
font-size: 0.8125rem;
font-weight: 500;
text-transform: uppercase;
letter-spacing: 0.05em;
color: var(--text-muted);
background: #f9fafb;
padding: 0.75rem 1rem;
border-top: none;
}
.table td {
padding: 0.625rem 1rem;
color: var(--text-secondary);
font-size: 1rem;
font-variant-numeric: tabular-nums;
}
/* Zebra striping */
.table tbody tr:nth-child(even) td { background-color: #f7f8fa; }
.table-hover tbody tr:hover td { background-color: #eef2ff !important; }
/* ── Badges — soft pill style ────────────────────── */
.badge {
font-size: 0.8125rem;
font-weight: 500;
padding: 0.125rem 0.5rem;
border-radius: 9999px;
}
.badge.bg-success { background: var(--green-100) !important; color: var(--green-800) !important; }
.badge.bg-info { background: var(--blue-100) !important; color: var(--blue-800) !important; }
.badge.bg-warning { background: var(--yellow-100) !important; color: var(--yellow-800) !important; }
.badge.bg-danger { background: var(--red-100) !important; color: var(--red-800) !important; }
/* Legacy badge classes */
.badge-imported { background: var(--green-100); color: var(--green-800); }
.badge-skipped { background: var(--yellow-100); color: var(--yellow-800); }
.badge-error { background: var(--red-100); color: var(--red-800); }
.badge-pending { background: #f3f4f6; color: #374151; }
.badge-ready { background: var(--blue-100); color: var(--blue-800); }
/* ── Buttons ─────────────────────────────────────── */
.btn {
font-size: 0.9375rem;
border-radius: 0.375rem;
}
.btn-sm {
font-size: 0.875rem;
padding: 0.375rem 0.75rem;
}
.btn-primary {
background: var(--blue-600);
border-color: var(--blue-600);
}
.btn-primary:hover {
background: var(--blue-700);
border-color: var(--blue-700);
}
/* ── Forms ───────────────────────────────────────── */
.form-control, .form-select {
font-size: 0.9375rem;
padding: 0.5rem 0.75rem;
border-radius: 0.375rem;
border-color: #d1d5db;
}
.form-control:focus, .form-select:focus {
border-color: var(--blue-600);
box-shadow: 0 0 0 2px rgba(37, 99, 235, 0.2);
}
/* ── Unified Pagination Bar ──────────────────────── */
.pagination-bar {
display: flex;
align-items: center;
gap: 0.25rem;
flex-wrap: wrap;
}
.page-btn {
display: inline-flex;
align-items: center;
justify-content: center;
min-width: 2.75rem;
height: 2.75rem;
padding: 0 0.5rem;
font-size: 0.875rem;
border: 1px solid #d1d5db;
border-radius: 0.375rem;
background: #fff;
color: var(--text-secondary);
cursor: pointer;
transition: background 0.12s, border-color 0.12s;
text-decoration: none;
user-select: none;
}
.page-btn:hover:not(:disabled):not(.active) {
background: #f3f4f6;
border-color: #9ca3af;
color: var(--text-primary);
text-decoration: none;
}
.page-btn.active {
background: var(--blue-600);
border-color: var(--blue-600);
color: #fff;
font-weight: 600;
}
.page-btn:disabled, .page-btn.disabled {
opacity: 0.4;
cursor: default;
pointer-events: none;
}
/* Loading spinner ────────────────────────────────── */
.spinner-overlay {
position: fixed;
top: 0; left: 0; right: 0; bottom: 0;
background: rgba(255,255,255,0.7);
z-index: 9999;
display: flex;
align-items: center;
justify-content: center;
}
/* ── Colored dots ────────────────────────────────── */
.dot {
display: inline-block;
width: 8px;
height: 8px;
border-radius: 50%;
flex-shrink: 0;
}
.dot-green { background: var(--dot-green); }
.dot-yellow { background: var(--dot-yellow); }
.dot-red { background: var(--dot-red); }
.dot-gray { background: #9ca3af; }
.dot-blue { background: #3b82f6; }
/* ── Flat row (mobile + desktop) ────────────────── */
.flat-row {
display: flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 0.75rem;
border-bottom: 1px solid #f3f4f6;
font-size: 1rem;
}
.flat-row:last-child { border-bottom: none; }
.flat-row:hover { background: #f9fafb; cursor: pointer; }
.grow { flex: 1; min-width: 0; }
.truncate { white-space: nowrap; overflow: hidden; text-overflow: ellipsis; }
/* ── Colored filter count - text color only ─────── */
.fc-green { color: #16a34a; }
.fc-yellow { color: #ca8a04; }
.fc-red { color: #dc2626; }
.fc-neutral { color: #6b7280; }
.fc-blue { color: #2563eb; }
.fc-dark { color: #374151; }
/* ── Log viewer (dark theme — keep as-is) ────────── */
.log-viewer {
font-family: 'SFMono-Regular', Consolas, 'Liberation Mono', Menlo, monospace;
font-size: 0.8125rem;
line-height: 1.5;
max-height: 600px;
overflow-y: auto;
padding: 1rem;
margin: 0;
background-color: #1e293b;
color: #e2e8f0;
white-space: pre-wrap;
word-wrap: break-word;
border-radius: 0 0 0.5rem 0.5rem;
}
/* ── Clickable table rows ────────────────────────── */
.table-hover tbody tr[data-href] {
cursor: pointer;
}
.table-hover tbody tr[data-href]:hover {
background-color: #f9fafb;
}
/* ── Sortable table headers ──────────────────────── */
.sortable {
cursor: pointer;
user-select: none;
}
.sortable:hover {
background-color: #f3f4f6;
}
.sort-icon {
font-size: 0.75rem;
margin-left: 0.25rem;
color: var(--blue-600);
}
/* ── SKU group visual grouping ───────────────────── */
.sku-group-odd {
background-color: #f8fafc;
}
/* ── Editable cells ──────────────────────────────── */
.editable { cursor: pointer; }
.editable:hover { background-color: #f3f4f6; }
/* ── Order detail modal ──────────────────────────── */
.modal-lg .table-sm td,
.modal-lg .table-sm th {
font-size: 0.875rem;
padding: 0.35rem 0.5rem;
}
/* ── Modal stacking (quickMap over orderDetail) ───── */
#quickMapModal { z-index: 1060; }
#quickMapModal + .modal-backdrop,
.modal-backdrop ~ .modal-backdrop { z-index: 1055; }
/* ── Quick Map compact lines ─────────────────────── */
.qm-line { border-bottom: 1px solid #e5e7eb; padding: 6px 0; }
.qm-line:last-child { border-bottom: none; }
.qm-row { display: flex; gap: 6px; align-items: center; }
.qm-codmat-wrap { flex: 1; min-width: 0; }
.qm-rm-btn { padding: 2px 6px; line-height: 1; }
#qmCodmatLines .qm-selected:empty,
#codmatLines .qm-selected:empty { display: none; }
#quickMapModal .modal-body,
#addModal .modal-body { padding-top: 12px; padding-bottom: 8px; }
#quickMapModal .modal-header,
#addModal .modal-header { padding: 10px 16px; }
#quickMapModal .modal-header h5,
#addModal .modal-header h5 { font-size: 0.95rem; margin: 0; }
#quickMapModal .modal-footer,
#addModal .modal-footer { padding: 8px 16px; }
/* ── Deleted mapping rows ────────────────────────── */
tr.mapping-deleted td {
text-decoration: line-through;
opacity: 0.5;
}
/* ── Map icon button ─────────────────────────────── */
.btn-map-icon {
color: var(--blue-600);
padding: 0.1rem 0.25rem;
cursor: pointer;
font-size: 1rem;
text-decoration: none;
}
.btn-map-icon:hover { color: var(--blue-700); }
/* ── Last sync summary card columns ─────────────── */
.last-sync-col {
border-right: 1px solid var(--border-color);
}
/* ── Cursor pointer utility ──────────────────────── */
.cursor-pointer { cursor: pointer; }
/* ── Filter bar ──────────────────────────────────── */
.filter-bar {
display: flex;
align-items: center;
gap: 0.5rem;
flex-wrap: wrap;
padding: 0.625rem 0;
}
.filter-pill {
display: inline-flex;
align-items: center;
gap: 0.3rem;
padding: 0.5rem 0.75rem;
border: 1px solid #d1d5db;
border-radius: 0.375rem;
background: #fff;
font-size: 0.9375rem;
cursor: pointer;
transition: background 0.15s, border-color 0.15s;
white-space: nowrap;
}
.filter-pill:hover { background: #f3f4f6; }
.filter-pill.active {
background: var(--blue-700);
border-color: var(--blue-700);
color: #fff;
}
.filter-pill.active .filter-count {
color: rgba(255,255,255,0.9);
}
.filter-count {
font-size: 0.8125rem;
font-weight: 600;
}
/* ── Search input ────────────────────────────────── */
.search-input {
padding: 0.375rem 0.75rem;
border: 1px solid #d1d5db;
border-radius: 0.375rem;
font-size: 0.9375rem;
width: 160px;
}
.search-input:focus {
border-color: var(--blue-600);
box-shadow: 0 0 0 2px rgba(37, 99, 235, 0.2);
}
/* ── Autocomplete dropdown (keep as-is) ──────────── */
.autocomplete-dropdown {
position: absolute;
z-index: 1050;
background: #fff;
border: 1px solid #dee2e6;
border-radius: 0.375rem;
box-shadow: 0 4px 12px rgba(0,0,0,0.15);
max-height: 300px;
overflow-y: auto;
width: 100%;
}
.autocomplete-item {
padding: 0.5rem 0.75rem;
cursor: pointer;
font-size: 0.9375rem;
border-bottom: 1px solid #f1f5f9;
}
.autocomplete-item:hover, .autocomplete-item.active {
background-color: #f1f5f9;
}
.autocomplete-item .codmat {
font-weight: 600;
color: #1e293b;
}
.autocomplete-item .denumire {
color: #64748b;
font-size: 0.875rem;
}
/* ── Tooltip for Client/Cont ─────────────────────── */
.tooltip-cont {
position: relative;
cursor: default;
}
.tooltip-cont::after {
content: attr(data-tooltip);
position: absolute;
bottom: 125%;
left: 50%;
transform: translateX(-50%);
background: #1f2937;
color: #f9fafb;
font-size: 0.75rem;
padding: 0.3rem 0.6rem;
border-radius: 4px;
white-space: nowrap;
pointer-events: none;
opacity: 0;
transition: opacity 0.15s;
z-index: 10;
}
.tooltip-cont:hover::after { opacity: 1; }
/* ── Sync card ───────────────────────────────────── */
.sync-card {
background: #fff;
border: 1px solid var(--border-color);
border-radius: var(--card-radius);
overflow: hidden;
margin-bottom: 1rem;
}
.sync-card-controls {
display: flex;
align-items: center;
gap: 0.75rem;
padding: 0.75rem 1rem;
flex-wrap: wrap;
}
.sync-card-divider {
height: 1px;
background: var(--border-color);
margin: 0;
}
.sync-card-info {
display: flex;
align-items: center;
gap: 1rem;
padding: 0.5rem 1rem;
font-size: 1rem;
color: var(--text-muted);
cursor: pointer;
transition: background 0.12s;
}
.sync-card-info:hover { background: #f9fafb; }
.sync-card-progress {
display: flex;
align-items: center;
gap: 0.5rem;
padding: 0.4rem 1rem;
background: #eff6ff;
font-size: 1rem;
color: var(--blue-700);
border-top: 1px solid #dbeafe;
}
/* ── Pulsing live dot (keep as-is) ──────────────── */
.sync-live-dot {
display: inline-block;
width: 8px;
height: 8px;
border-radius: 50%;
background: #3b82f6;
animation: pulse-dot 1.2s ease-in-out infinite;
flex-shrink: 0;
}
@keyframes pulse-dot {
0%, 100% { opacity: 1; transform: scale(1); }
50% { opacity: 0.4; transform: scale(0.75); }
}
/* ── Status dot (keep as-is) ─────────────────────── */
.sync-status-dot {
display: inline-block;
width: 10px;
height: 10px;
border-radius: 50%;
flex-shrink: 0;
}
.sync-status-dot.idle { background: #9ca3af; }
.sync-status-dot.running { background: #3b82f6; animation: pulse-dot 1.2s ease-in-out infinite; }
.sync-status-dot.completed { background: #10b981; }
.sync-status-dot.failed { background: #ef4444; }
/* ── Custom period range inputs ──────────────────── */
.period-custom-range {
display: none;
gap: 0.375rem;
align-items: center;
font-size: 0.9375rem;
}
.period-custom-range.visible { display: flex; }
/* ── select-compact (used in filter bars) ─────────── */
.select-compact {
padding: 0.375rem 0.5rem;
font-size: 0.9375rem;
border: 1px solid #d1d5db;
border-radius: 0.375rem;
background: #fff;
cursor: pointer;
}
/* ── btn-compact (kept for backward compat) ──────── */
.btn-compact {
padding: 0.375rem 0.75rem;
font-size: 0.9375rem;
}
/* ── Result banner ───────────────────────────────── */
.result-banner {
padding: 0.4rem 0.75rem;
border-radius: 0.375rem;
font-size: 0.9375rem;
background: #d1fae5;
color: #065f46;
border: 1px solid #6ee7b7;
}
/* ── Badge-pct (mappings page) ───────────────────── */
.badge-pct {
font-size: 0.75rem;
padding: 0.1rem 0.35rem;
border-radius: 4px;
font-weight: 600;
}
.badge-pct.complete { background: #d1fae5; color: #065f46; }
.badge-pct.incomplete { background: #fef3c7; color: #92400e; }
/* ── Context Menu ────────────────────────────────── */
.context-menu-trigger {
background: none;
border: none;
color: #9ca3af;
padding: 0.2rem 0.4rem;
cursor: pointer;
border-radius: 0.25rem;
font-size: 1rem;
line-height: 1;
transition: color 0.12s, background 0.12s;
}
.context-menu-trigger:hover {
color: var(--text-secondary);
background: #f3f4f6;
}
.context-menu {
position: fixed;
background: #fff;
border: 1px solid #e5e7eb;
border-radius: 0.5rem;
box-shadow: 0 4px 16px rgba(0,0,0,0.12);
z-index: 1050;
min-width: 150px;
padding: 0.25rem 0;
}
.context-menu-item {
display: block;
width: 100%;
text-align: left;
padding: 0.45rem 0.9rem;
font-size: 0.9375rem;
background: none;
border: none;
cursor: pointer;
color: var(--text-primary);
transition: background 0.1s;
}
.context-menu-item:hover { background: #f3f4f6; }
.context-menu-item.text-danger { color: #dc2626; }
.context-menu-item.text-danger:hover { background: #fee2e2; }
/* ── Pagination info strip ───────────────────────── */
.pag-strip {
display: flex;
align-items: center;
justify-content: space-between;
gap: 1rem;
padding: 0.5rem 1rem;
border-bottom: 1px solid var(--border-color);
flex-wrap: wrap;
}
.pag-strip-bottom {
border-bottom: none;
border-top: 1px solid var(--border-color);
}
/* ── Per page selector ───────────────────────────── */
.per-page-label {
display: flex;
align-items: center;
gap: 0.375rem;
font-size: 0.9375rem;
color: var(--text-muted);
white-space: nowrap;
}
/* ── Mobile list vs desktop table ────────────────── */
.mobile-list { display: none; }
/* ── Mappings flat-rows: always visible ────────────── */
.mappings-flat-list { display: block; }
/* ── Mobile ⋯ dropdown ─────────────────────────── */
.mobile-more-dropdown { position: relative; display: inline-block; }
.mobile-more-dropdown .dropdown-toggle::after { display: none; }
/* ── Mobile segmented control (hidden on desktop) ── */
.mobile-seg { display: none; }
/* ── Responsive ──────────────────────────────────── */
@media (max-width: 767.98px) {
.top-navbar {
padding: 0 0.5rem;
gap: 0.5rem;
}
.navbar-brand {
font-size: 0.875rem;
}
.nav-tab {
padding: 0 0.625rem;
font-size: 0.8125rem;
}
.main-content {
padding-left: 0.75rem;
padding-right: 0.75rem;
}
.filter-bar {
gap: 0.375rem;
}
.filter-pill { padding: 0.25rem 0.5rem; font-size: 0.8125rem; }
.search-input { min-width: 0; width: auto; flex: 1; }
.page-btn.page-number { display: none; }
.page-btn.page-ellipsis { display: none; }
.table-responsive { display: none; }
.mobile-list { display: block; }
/* Segmented filter control (replaces pills on mobile) */
.filter-bar .filter-pill { display: none; }
.filter-bar .mobile-seg { display: flex; }
/* Sync card compact */
.sync-card-controls {
flex-direction: row;
flex-wrap: wrap;
gap: 0.375rem;
padding: 0.5rem 0.75rem;
}
.sync-card-info {
flex-wrap: wrap;
gap: 0.375rem;
font-size: 0.8rem;
padding: 0.375rem 0.75rem;
}
/* Hide per-page selector on mobile */
.per-page-label { display: none; }
}
/* Mobile article cards in order detail modal */
.detail-item-card {
border: 1px solid #e5e7eb;
border-radius: 6px;
padding: 0.5rem 0.75rem;
margin-bottom: 0.5rem;
font-size: 0.875rem;
}
.detail-item-card .card-sku {
font-family: monospace;
font-size: 0.8rem;
color: #6b7280;
}
.detail-item-card .card-name {
font-weight: 500;
margin-bottom: 0.25rem;
}
.detail-item-card .card-details {
display: flex;
gap: 1rem;
color: #374151;
}
/* Clickable CODMAT link in order detail modal */
.codmat-link { color: #0d6efd; cursor: pointer; text-decoration: underline; }
.codmat-link:hover { color: #0a58ca; }
/* Mobile article flat list in order detail modal */
.detail-item-flat { font-size: 0.85rem; }
.detail-item-flat .dif-item { }
.detail-item-flat .dif-item:nth-child(even) .dif-row { background: #f7f8fa; }
.detail-item-flat .dif-row {
display: flex; align-items: baseline; gap: 0.5rem;
padding: 0.2rem 0.75rem; flex-wrap: wrap;
}
.dif-sku { font-family: monospace; font-size: 0.78rem; color: #6b7280; }
.dif-name { font-weight: 500; flex: 1; }
.dif-qty { white-space: nowrap; color: #6b7280; }
.dif-val { white-space: nowrap; font-weight: 600; }
.dif-codmat-link { color: #0d6efd; cursor: pointer; font-size: 0.78rem; font-family: monospace; }
.dif-codmat-link:hover { color: #0a58ca; text-decoration: underline; }

View File

@@ -1,789 +0,0 @@
// ── State ─────────────────────────────────────────
let dashPage = 1;
let dashPerPage = 50;
let dashSortCol = 'order_date';
let dashSortDir = 'desc';
let dashSearchTimeout = null;
// Sync polling state
let _pollInterval = null;
let _lastSyncStatus = null;
let _lastRunId = null;
let _currentRunId = null;
let _pollIntervalMs = 5000; // default, overridden from settings
let _knownLastRunId = null; // track last_run.run_id to detect missed syncs
// ── Init ──────────────────────────────────────────
document.addEventListener('DOMContentLoaded', async () => {
await initPollInterval();
loadSchedulerStatus();
loadDashOrders();
startSyncPolling();
wireFilterBar();
});
async function initPollInterval() {
try {
const data = await fetchJSON('/api/settings');
const sec = parseInt(data.dashboard_poll_seconds) || 5;
_pollIntervalMs = sec * 1000;
} catch(e) {}
}
// ── Smart Sync Polling ────────────────────────────
function startSyncPolling() {
if (_pollInterval) clearInterval(_pollInterval);
_pollInterval = setInterval(pollSyncStatus, _pollIntervalMs);
pollSyncStatus(); // immediate first call
}
async function pollSyncStatus() {
try {
const data = await fetchJSON('/api/sync/status');
updateSyncPanel(data);
const isRunning = data.status === 'running';
const wasRunning = _lastSyncStatus === 'running';
// Detect missed sync completions via last_run.run_id change
const newLastRunId = data.last_run?.run_id || null;
const missedSync = !isRunning && !wasRunning && _knownLastRunId && newLastRunId && newLastRunId !== _knownLastRunId;
_knownLastRunId = newLastRunId;
if (isRunning && !wasRunning) {
// Switched to running — speed up polling
clearInterval(_pollInterval);
_pollInterval = setInterval(pollSyncStatus, 3000);
} else if (!isRunning && wasRunning) {
// Sync just completed — slow down and refresh orders
clearInterval(_pollInterval);
_pollInterval = setInterval(pollSyncStatus, _pollIntervalMs);
loadDashOrders();
} else if (missedSync) {
// Sync completed while we weren't watching (e.g. auto-sync) — refresh orders
loadDashOrders();
}
_lastSyncStatus = data.status;
} catch (e) {
console.warn('Sync status poll failed:', e);
}
}
function updateSyncPanel(data) {
const dot = document.getElementById('syncStatusDot');
const txt = document.getElementById('syncStatusText');
const progressArea = document.getElementById('syncProgressArea');
const progressText = document.getElementById('syncProgressText');
const startBtn = document.getElementById('syncStartBtn');
if (dot) {
dot.className = 'sync-status-dot ' + (data.status || 'idle');
}
const statusLabels = { running: 'A ruleaza...', idle: 'Inactiv', completed: 'Finalizat', failed: 'Eroare' };
if (txt) txt.textContent = statusLabels[data.status] || data.status || 'Inactiv';
if (startBtn) startBtn.disabled = data.status === 'running';
// Track current running sync run_id
if (data.status === 'running' && data.run_id) {
_currentRunId = data.run_id;
} else {
_currentRunId = null;
}
// Live progress area
if (progressArea) {
progressArea.style.display = data.status === 'running' ? 'flex' : 'none';
}
if (progressText && data.phase_text) {
progressText.textContent = data.phase_text;
}
// Last run info
const lr = data.last_run;
if (lr) {
_lastRunId = lr.run_id;
const d = document.getElementById('lastSyncDate');
const dur = document.getElementById('lastSyncDuration');
const cnt = document.getElementById('lastSyncCounts');
const st = document.getElementById('lastSyncStatus');
if (d) d.textContent = lr.started_at ? lr.started_at.replace('T', ' ').slice(0, 16) : '\u2014';
if (dur) dur.textContent = lr.duration_seconds ? Math.round(lr.duration_seconds) + 's' : '\u2014';
if (cnt) {
const newImp = lr.new_imported || 0;
const already = lr.already_imported || 0;
if (already > 0) {
cnt.innerHTML = `<span class="dot dot-green me-1"></span>${newImp} noi, ${already} deja &nbsp; <span class="dot dot-yellow me-1"></span>${lr.skipped || 0} omise &nbsp; <span class="dot dot-red me-1"></span>${lr.errors || 0} erori`;
} else {
cnt.innerHTML = `<span class="dot dot-green me-1"></span>${lr.imported || 0} imp. &nbsp; <span class="dot dot-yellow me-1"></span>${lr.skipped || 0} omise &nbsp; <span class="dot dot-red me-1"></span>${lr.errors || 0} erori`;
}
}
if (st) {
st.textContent = lr.status === 'completed' ? '\u2713' : '\u2715';
st.style.color = lr.status === 'completed' ? '#10b981' : '#ef4444';
}
}
}
// Wire last-sync-row click → journal (use current running sync if active)
document.addEventListener('DOMContentLoaded', () => {
document.getElementById('lastSyncRow')?.addEventListener('click', () => {
const targetId = _currentRunId || _lastRunId;
if (targetId) window.location = (window.ROOT_PATH || '') + '/logs?run=' + targetId;
});
document.getElementById('lastSyncRow')?.addEventListener('keydown', (e) => {
const targetId = _currentRunId || _lastRunId;
if ((e.key === 'Enter' || e.key === ' ') && targetId) {
window.location = '/logs?run=' + targetId;
}
});
});
// ── Sync Controls ─────────────────────────────────
async function startSync() {
try {
const res = await fetch('/api/sync/start', { method: 'POST' });
const data = await res.json();
if (data.error) {
alert(data.error);
return;
}
// Polling will detect the running state — just speed it up immediately
pollSyncStatus();
} catch (err) {
alert('Eroare: ' + err.message);
}
}
async function stopSync() {
try {
await fetch('/api/sync/stop', { method: 'POST' });
pollSyncStatus();
} catch (err) {
alert('Eroare: ' + err.message);
}
}
async function toggleScheduler() {
const enabled = document.getElementById('schedulerToggle').checked;
const interval = parseInt(document.getElementById('schedulerInterval').value) || 10;
try {
await fetch('/api/sync/schedule', {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ enabled, interval_minutes: interval })
});
} catch (err) {
alert('Eroare scheduler: ' + err.message);
}
}
async function updateSchedulerInterval() {
const enabled = document.getElementById('schedulerToggle').checked;
if (enabled) {
await toggleScheduler();
}
}
async function loadSchedulerStatus() {
try {
const res = await fetch('/api/sync/schedule');
const data = await res.json();
document.getElementById('schedulerToggle').checked = data.enabled || false;
if (data.interval_minutes) {
document.getElementById('schedulerInterval').value = data.interval_minutes;
}
} catch (err) {
console.error('loadSchedulerStatus error:', err);
}
}
// ── Filter Bar wiring ─────────────────────────────
function wireFilterBar() {
// Period dropdown
document.getElementById('periodSelect')?.addEventListener('change', function () {
const cr = document.getElementById('customRangeInputs');
if (this.value === 'custom') {
cr?.classList.add('visible');
} else {
cr?.classList.remove('visible');
dashPage = 1;
loadDashOrders();
}
});
// Custom range inputs
['periodStart', 'periodEnd'].forEach(id => {
document.getElementById(id)?.addEventListener('change', () => {
const s = document.getElementById('periodStart')?.value;
const e = document.getElementById('periodEnd')?.value;
if (s && e) { dashPage = 1; loadDashOrders(); }
});
});
// Status pills
document.querySelectorAll('.filter-pill[data-status]').forEach(btn => {
btn.addEventListener('click', function () {
document.querySelectorAll('.filter-pill[data-status]').forEach(b => b.classList.remove('active'));
this.classList.add('active');
dashPage = 1;
loadDashOrders();
});
});
// Search — 300ms debounce
document.getElementById('orderSearch')?.addEventListener('input', () => {
clearTimeout(dashSearchTimeout);
dashSearchTimeout = setTimeout(() => {
dashPage = 1;
loadDashOrders();
}, 300);
});
}
// ── Dashboard Orders Table ────────────────────────
function dashSortBy(col) {
if (dashSortCol === col) {
dashSortDir = dashSortDir === 'asc' ? 'desc' : 'asc';
} else {
dashSortCol = col;
dashSortDir = 'asc';
}
document.querySelectorAll('.sort-icon').forEach(span => {
const c = span.dataset.col;
span.textContent = c === dashSortCol ? (dashSortDir === 'asc' ? '\u2191' : '\u2193') : '';
});
dashPage = 1;
loadDashOrders();
}
async function loadDashOrders() {
const periodVal = document.getElementById('periodSelect')?.value || '7';
const params = new URLSearchParams();
if (periodVal === 'custom') {
const s = document.getElementById('periodStart')?.value;
const e = document.getElementById('periodEnd')?.value;
if (s && e) {
params.set('period_start', s);
params.set('period_end', e);
params.set('period_days', '0');
}
} else {
params.set('period_days', periodVal);
}
const activeStatus = document.querySelector('.filter-pill.active')?.dataset.status;
if (activeStatus && activeStatus !== 'all') params.set('status', activeStatus);
const search = document.getElementById('orderSearch')?.value?.trim();
if (search) params.set('search', search);
params.set('page', dashPage);
params.set('per_page', dashPerPage);
params.set('sort_by', dashSortCol);
params.set('sort_dir', dashSortDir);
try {
const res = await fetch(`/api/dashboard/orders?${params}`);
const data = await res.json();
// Update filter-pill badge counts
const c = data.counts || {};
const el = (id) => document.getElementById(id);
if (el('cntAll')) el('cntAll').textContent = c.total || 0;
if (el('cntImp')) el('cntImp').textContent = c.imported_all || c.imported || 0;
if (el('cntSkip')) el('cntSkip').textContent = c.skipped || 0;
if (el('cntErr')) el('cntErr').textContent = c.error || c.errors || 0;
if (el('cntFact')) el('cntFact').textContent = c.facturate || 0;
if (el('cntNef')) el('cntNef').textContent = c.nefacturate || c.uninvoiced || 0;
if (el('cntCanc')) el('cntCanc').textContent = c.cancelled || 0;
const tbody = document.getElementById('dashOrdersBody');
const orders = data.orders || [];
if (orders.length === 0) {
tbody.innerHTML = '<tr><td colspan="9" class="text-center text-muted py-3">Nicio comanda</td></tr>';
} else {
tbody.innerHTML = orders.map(o => {
const dateStr = fmtDate(o.order_date);
const orderTotal = o.order_total != null ? Number(o.order_total).toFixed(2) : '-';
return `<tr style="cursor:pointer" onclick="openDashOrderDetail('${esc(o.order_number)}')">
<td>${statusDot(o.status)}</td>
<td class="text-nowrap">${dateStr}</td>
${renderClientCell(o)}
<td><code>${esc(o.order_number)}</code></td>
<td>${o.items_count || 0}</td>
<td class="text-end text-muted">${fmtCost(o.delivery_cost)}</td>
<td class="text-end text-muted">${fmtCost(o.discount_total)}</td>
<td class="text-end fw-bold">${orderTotal}</td>
<td class="text-center">${invoiceDot(o)}</td>
</tr>`;
}).join('');
}
// Mobile flat rows
const mobileList = document.getElementById('dashMobileList');
if (mobileList) {
if (orders.length === 0) {
mobileList.innerHTML = '<div class="flat-row text-muted py-3 justify-content-center">Nicio comanda</div>';
} else {
mobileList.innerHTML = orders.map(o => {
const d = o.order_date || '';
let dateFmt = '-';
if (d.length >= 10) {
dateFmt = d.slice(8, 10) + '.' + d.slice(5, 7) + '.' + d.slice(2, 4);
if (d.length >= 16) dateFmt += ' ' + d.slice(11, 16);
}
const name = o.customer_name || o.shipping_name || o.billing_name || '\u2014';
const totalStr = o.order_total ? Number(o.order_total).toFixed(2) : '';
return `<div class="flat-row" onclick="openDashOrderDetail('${esc(o.order_number)}')" style="font-size:0.875rem">
${statusDot(o.status)}
<span style="color:#6b7280" class="text-nowrap">${dateFmt}</span>
<span class="grow truncate fw-bold">${esc(name)}</span>
<span class="text-nowrap">x${o.items_count || 0}${totalStr ? ' · <strong>' + totalStr + '</strong>' : ''}</span>
</div>`;
}).join('');
}
}
// Mobile segmented control
renderMobileSegmented('dashMobileSeg', [
{ label: 'Toate', count: c.total || 0, value: 'all', active: (activeStatus || 'all') === 'all', colorClass: 'fc-neutral' },
{ label: 'Imp.', count: c.imported_all || c.imported || 0, value: 'IMPORTED', active: activeStatus === 'IMPORTED', colorClass: 'fc-green' },
{ label: 'Omise', count: c.skipped || 0, value: 'SKIPPED', active: activeStatus === 'SKIPPED', colorClass: 'fc-yellow' },
{ label: 'Erori', count: c.error || c.errors || 0, value: 'ERROR', active: activeStatus === 'ERROR', colorClass: 'fc-red' },
{ label: 'Fact.', count: c.facturate || 0, value: 'INVOICED', active: activeStatus === 'INVOICED', colorClass: 'fc-green' },
{ label: 'Nefact.', count: c.nefacturate || c.uninvoiced || 0, value: 'UNINVOICED', active: activeStatus === 'UNINVOICED', colorClass: 'fc-red' },
{ label: 'Anulate', count: c.cancelled || 0, value: 'CANCELLED', active: activeStatus === 'CANCELLED', colorClass: 'fc-dark' }
], (val) => {
document.querySelectorAll('.filter-pill[data-status]').forEach(b => b.classList.remove('active'));
const pill = document.querySelector(`.filter-pill[data-status="${val}"]`);
if (pill) pill.classList.add('active');
dashPage = 1;
loadDashOrders();
});
// Pagination
const pag = data.pagination || {};
const totalPages = pag.total_pages || data.pages || 1;
const totalOrders = (data.counts || {}).total || data.total || 0;
const pagOpts = { perPage: dashPerPage, perPageFn: 'dashChangePerPage', perPageOptions: [25, 50, 100, 250] };
const pagHtml = `<small class="text-muted me-auto">${totalOrders} comenzi | Pagina ${dashPage} din ${totalPages}</small>` + renderUnifiedPagination(dashPage, totalPages, 'dashGoPage', pagOpts);
const pagDiv = document.getElementById('dashPagination');
if (pagDiv) pagDiv.innerHTML = pagHtml;
const pagDivTop = document.getElementById('dashPaginationTop');
if (pagDivTop) pagDivTop.innerHTML = pagHtml;
// Update sort icons
document.querySelectorAll('.sort-icon').forEach(span => {
const c = span.dataset.col;
span.textContent = c === dashSortCol ? (dashSortDir === 'asc' ? '\u2191' : '\u2193') : '';
});
} catch (err) {
document.getElementById('dashOrdersBody').innerHTML =
`<tr><td colspan="9" class="text-center text-danger">${esc(err.message)}</td></tr>`;
}
}
function dashGoPage(p) {
dashPage = p;
loadDashOrders();
}
function dashChangePerPage(val) {
dashPerPage = parseInt(val) || 50;
dashPage = 1;
loadDashOrders();
}
// ── Client cell with Cont tooltip (Task F4) ───────
function renderClientCell(order) {
const display = (order.customer_name || order.shipping_name || '').trim();
const billing = (order.billing_name || '').trim();
const shipping = (order.shipping_name || '').trim();
const isDiff = display !== shipping && shipping;
if (isDiff) {
return `<td class="tooltip-cont fw-bold" data-tooltip="Livrare: ${escHtml(shipping)}">${escHtml(display)}&nbsp;<sup style="color:#6b7280;font-size:0.65rem">&#9650;</sup></td>`;
}
return `<td class="fw-bold">${escHtml(display || billing || '\u2014')}</td>`;
}
// ── Helper functions ──────────────────────────────
async function fetchJSON(url) {
const res = await fetch(url);
if (!res.ok) throw new Error(`HTTP ${res.status}`);
return res.json();
}
function escHtml(s) {
if (s == null) return '';
return String(s)
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;')
.replace(/'/g, '&#39;');
}
// Alias kept for backward compat with inline handlers in modal
function esc(s) { return escHtml(s); }
function fmtCost(v) {
return v > 0 ? Number(v).toFixed(2) : '';
}
function statusLabelText(status) {
switch ((status || '').toUpperCase()) {
case 'IMPORTED': return 'Importat';
case 'ALREADY_IMPORTED': return 'Deja imp.';
case 'SKIPPED': return 'Omis';
case 'ERROR': return 'Eroare';
default: return esc(status);
}
}
function orderStatusBadge(status) {
switch ((status || '').toUpperCase()) {
case 'IMPORTED': return '<span class="badge bg-success">Importat</span>';
case 'ALREADY_IMPORTED': return '<span class="badge bg-info">Deja importat</span>';
case 'SKIPPED': return '<span class="badge bg-warning">Omis</span>';
case 'ERROR': return '<span class="badge bg-danger">Eroare</span>';
case 'CANCELLED': return '<span class="badge bg-secondary">Anulat</span>';
case 'DELETED_IN_ROA': return '<span class="badge bg-dark">Sters din ROA</span>';
default: return `<span class="badge bg-secondary">${esc(status)}</span>`;
}
}
function invoiceDot(order) {
if (order.status !== 'IMPORTED' && order.status !== 'ALREADY_IMPORTED') return '';
if (order.invoice && order.invoice.facturat) return '<span class="dot dot-green" title="Facturat"></span>';
return '<span class="dot dot-red" title="Nefacturat"></span>';
}
function renderCodmatCell(item) {
if (!item.codmat_details || item.codmat_details.length === 0) {
return `<code>${esc(item.codmat || '-')}</code>`;
}
if (item.codmat_details.length === 1) {
const d = item.codmat_details[0];
if (d.direct) {
return `<code>${esc(d.codmat)}</code> <span class="badge bg-secondary" style="font-size:0.6rem;vertical-align:middle">direct</span>`;
}
return `<code>${esc(d.codmat)}</code>`;
}
return item.codmat_details.map(d =>
`<div class="small"><code>${esc(d.codmat)}</code> <span class="text-muted">\xd7${d.cantitate_roa}</span></div>`
).join('');
}
// ── Refresh Invoices ──────────────────────────────
async function refreshInvoices() {
const btn = document.getElementById('btnRefreshInvoices');
const btnM = document.getElementById('btnRefreshInvoicesMobile');
if (btn) { btn.disabled = true; btn.textContent = '⟳ Se verifica...'; }
if (btnM) { btnM.disabled = true; }
try {
const res = await fetch('/api/dashboard/refresh-invoices', { method: 'POST' });
const data = await res.json();
if (data.error) {
alert('Eroare: ' + data.error);
} else {
loadDashOrders();
}
} catch (err) {
alert('Eroare: ' + err.message);
} finally {
if (btn) { btn.disabled = false; btn.textContent = '↻ Facturi'; }
if (btnM) { btnM.disabled = false; }
}
}
// ── Order Detail Modal ────────────────────────────
async function openDashOrderDetail(orderNumber) {
document.getElementById('detailOrderNumber').textContent = '#' + orderNumber;
document.getElementById('detailCustomer').textContent = '...';
document.getElementById('detailDate').textContent = '';
document.getElementById('detailStatus').innerHTML = '';
document.getElementById('detailIdComanda').textContent = '-';
document.getElementById('detailIdPartener').textContent = '-';
document.getElementById('detailIdAdresaFact').textContent = '-';
document.getElementById('detailIdAdresaLivr').textContent = '-';
document.getElementById('detailItemsBody').innerHTML = '<tr><td colspan="7" class="text-center">Se incarca...</td></tr>';
document.getElementById('detailError').style.display = 'none';
document.getElementById('detailReceipt').innerHTML = '';
document.getElementById('detailReceiptMobile').innerHTML = '';
const invInfo = document.getElementById('detailInvoiceInfo');
if (invInfo) invInfo.style.display = 'none';
const mobileContainer = document.getElementById('detailItemsMobile');
if (mobileContainer) mobileContainer.innerHTML = '';
const modalEl = document.getElementById('orderDetailModal');
const existing = bootstrap.Modal.getInstance(modalEl);
if (existing) { existing.show(); } else { new bootstrap.Modal(modalEl).show(); }
try {
const res = await fetch(`/api/sync/order/${encodeURIComponent(orderNumber)}`);
const data = await res.json();
if (data.error) {
document.getElementById('detailError').textContent = data.error;
document.getElementById('detailError').style.display = '';
return;
}
const order = data.order || {};
document.getElementById('detailCustomer').textContent = order.customer_name || '-';
document.getElementById('detailDate').textContent = fmtDate(order.order_date);
document.getElementById('detailStatus').innerHTML = orderStatusBadge(order.status);
document.getElementById('detailIdComanda').textContent = order.id_comanda || '-';
document.getElementById('detailIdPartener').textContent = order.id_partener || '-';
document.getElementById('detailIdAdresaFact').textContent = order.id_adresa_facturare || '-';
document.getElementById('detailIdAdresaLivr').textContent = order.id_adresa_livrare || '-';
// Invoice info
const invInfo = document.getElementById('detailInvoiceInfo');
const inv = order.invoice;
if (inv && inv.facturat) {
const serie = inv.serie_act || '';
const numar = inv.numar_act || '';
document.getElementById('detailInvoiceNumber').textContent = serie ? `${serie} ${numar}` : numar;
document.getElementById('detailInvoiceDate').textContent = inv.data_act ? fmtDate(inv.data_act) : '-';
if (invInfo) invInfo.style.display = '';
} else {
if (invInfo) invInfo.style.display = 'none';
}
if (order.error_message) {
document.getElementById('detailError').textContent = order.error_message;
document.getElementById('detailError').style.display = '';
}
const items = data.items || [];
if (items.length === 0) {
document.getElementById('detailItemsBody').innerHTML = '<tr><td colspan="7" class="text-center text-muted">Niciun articol</td></tr>';
return;
}
// Store items for quick map pre-population
window._detailItems = items;
// Mobile article flat list
const mobileContainer = document.getElementById('detailItemsMobile');
if (mobileContainer) {
let mobileHtml = items.map((item, idx) => {
const codmatText = item.codmat_details?.length
? item.codmat_details.map(d => `<code>${esc(d.codmat)}</code>${d.direct ? ' <span class="badge bg-secondary" style="font-size:0.55rem">direct</span>' : ''}`).join(' ')
: `<code>${esc(item.codmat || '')}</code>`;
const valoare = (Number(item.price || 0) * Number(item.quantity || 0));
return `<div class="dif-item">
<div class="dif-row">
<span class="dif-sku dif-codmat-link" onclick="openDashQuickMap('${esc(item.sku)}','${esc(item.product_name||'')}','${esc(orderNumber)}', ${idx})">${esc(item.sku)}</span>
${codmatText}
</div>
<div class="dif-row">
<span class="dif-name">${esc(item.product_name || '')}</span>
<span class="dif-qty">x${item.quantity || 0}</span>
<span class="dif-val">${fmtNum(valoare)} lei</span>
<span class="dif-vat text-muted" style="font-size:0.75rem">TVA ${item.vat != null ? Number(item.vat) : '?'}</span>
</div>
</div>`;
}).join('');
// Transport row (mobile)
if (order.delivery_cost > 0) {
const tVat = order.transport_vat || '21';
mobileHtml += `<div class="dif-item" style="opacity:0.7">
<div class="dif-row">
<span class="dif-name text-muted">Transport</span>
<span class="dif-qty">x1</span>
<span class="dif-val">${fmtNum(order.delivery_cost)} lei</span>
<span class="dif-vat text-muted" style="font-size:0.75rem">TVA ${tVat}</span>
</div>
</div>`;
}
// Discount rows (mobile)
if (order.discount_total > 0) {
const discSplit = computeDiscountSplit(items, order);
if (discSplit) {
Object.entries(discSplit)
.sort(([a], [b]) => Number(a) - Number(b))
.forEach(([rate, amt]) => {
if (amt > 0) mobileHtml += `<div class="dif-item" style="opacity:0.7">
<div class="dif-row">
<span class="dif-name text-muted">Discount</span>
<span class="dif-qty">x\u20131</span>
<span class="dif-val">${fmtNum(amt)} lei</span>
<span class="dif-vat text-muted" style="font-size:0.75rem">TVA ${Number(rate)}</span>
</div>
</div>`;
});
} else {
mobileHtml += `<div class="dif-item" style="opacity:0.7">
<div class="dif-row">
<span class="dif-name text-muted">Discount</span>
<span class="dif-qty">x\u20131</span>
<span class="dif-val">${fmtNum(order.discount_total)} lei</span>
</div>
</div>`;
}
}
mobileContainer.innerHTML = '<div class="detail-item-flat">' + mobileHtml + '</div>';
}
let tableHtml = items.map((item, idx) => {
const valoare = Number(item.price || 0) * Number(item.quantity || 0);
return `<tr>
<td><code class="codmat-link" onclick="openDashQuickMap('${esc(item.sku)}', '${esc(item.product_name || '')}', '${esc(orderNumber)}', ${idx})" title="Click pentru mapare">${esc(item.sku)}</code></td>
<td>${esc(item.product_name || '-')}</td>
<td>${renderCodmatCell(item)}</td>
<td class="text-end">${item.quantity || 0}</td>
<td class="text-end">${item.price != null ? fmtNum(item.price) : '-'}</td>
<td class="text-end">${item.vat != null ? Number(item.vat) : '-'}</td>
<td class="text-end">${fmtNum(valoare)}</td>
</tr>`;
}).join('');
// Transport row
if (order.delivery_cost > 0) {
const tVat = order.transport_vat || '21';
const tCodmat = order.transport_codmat || '';
tableHtml += `<tr class="table-light">
<td></td><td class="text-muted">Transport</td>
<td>${tCodmat ? '<code>' + esc(tCodmat) + '</code>' : ''}</td>
<td class="text-end">1</td><td class="text-end">${fmtNum(order.delivery_cost)}</td>
<td class="text-end">${tVat}</td><td class="text-end">${fmtNum(order.delivery_cost)}</td>
</tr>`;
}
// Discount rows (split by VAT rate)
if (order.discount_total > 0) {
const dCodmat = order.discount_codmat || '';
const discSplit = computeDiscountSplit(items, order);
if (discSplit) {
Object.entries(discSplit)
.sort(([a], [b]) => Number(a) - Number(b))
.forEach(([rate, amt]) => {
if (amt > 0) tableHtml += `<tr class="table-light">
<td></td><td class="text-muted">Discount</td>
<td>${dCodmat ? '<code>' + esc(dCodmat) + '</code>' : ''}</td>
<td class="text-end">\u20131</td><td class="text-end">${fmtNum(amt)}</td>
<td class="text-end">${Number(rate)}</td><td class="text-end">\u2013${fmtNum(amt)}</td>
</tr>`;
});
} else {
tableHtml += `<tr class="table-light">
<td></td><td class="text-muted">Discount</td>
<td>${dCodmat ? '<code>' + esc(dCodmat) + '</code>' : ''}</td>
<td class="text-end">\u20131</td><td class="text-end">${fmtNum(order.discount_total)}</td>
<td class="text-end">-</td><td class="text-end">\u2013${fmtNum(order.discount_total)}</td>
</tr>`;
}
}
document.getElementById('detailItemsBody').innerHTML = tableHtml;
// Receipt footer (just total)
renderReceipt(items, order);
} catch (err) {
document.getElementById('detailError').textContent = err.message;
document.getElementById('detailError').style.display = '';
}
}
function fmtNum(v) {
return Number(v).toLocaleString('ro-RO', { minimumFractionDigits: 2, maximumFractionDigits: 2 });
}
function computeDiscountSplit(items, order) {
if (order.discount_split && typeof order.discount_split === 'object')
return order.discount_split;
// Compute proportionally from items by VAT rate
const byRate = {};
items.forEach(item => {
const rate = item.vat != null ? Number(item.vat) : null;
if (rate === null) return;
if (!byRate[rate]) byRate[rate] = 0;
byRate[rate] += Number(item.price || 0) * Number(item.quantity || 0);
});
const rates = Object.keys(byRate).sort((a, b) => Number(a) - Number(b));
if (rates.length === 0) return null;
const grandTotal = rates.reduce((s, r) => s + byRate[r], 0);
if (grandTotal <= 0) return null;
const split = {};
let remaining = order.discount_total;
rates.forEach((rate, i) => {
if (i === rates.length - 1) {
split[rate] = Math.round(remaining * 100) / 100;
} else {
const amt = Math.round(order.discount_total * byRate[rate] / grandTotal * 100) / 100;
split[rate] = amt;
remaining -= amt;
}
});
return split;
}
function renderReceipt(items, order) {
const desktop = document.getElementById('detailReceipt');
const mobile = document.getElementById('detailReceiptMobile');
if (!items.length) {
desktop.innerHTML = '';
mobile.innerHTML = '';
return;
}
const articole = items.reduce((s, i) => s + Number(i.price || 0) * Number(i.quantity || 0), 0);
const discount = Number(order.discount_total || 0);
const transport = Number(order.delivery_cost || 0);
const total = order.order_total != null ? fmtNum(order.order_total) : '-';
// Desktop: full labels
let dHtml = `<span class="text-muted">Articole: <strong class="text-body">${fmtNum(articole)}</strong></span>`;
if (discount > 0) dHtml += `<span class="text-muted">Discount: <strong class="text-danger">\u2013${fmtNum(discount)}</strong></span>`;
if (transport > 0) dHtml += `<span class="text-muted">Transport: <strong class="text-body">${fmtNum(transport)}</strong></span>`;
dHtml += `<span>Total: <strong>${total} lei</strong></span>`;
desktop.innerHTML = dHtml;
// Mobile: shorter labels
let mHtml = `<span class="text-muted">Art: <strong class="text-body">${fmtNum(articole)}</strong></span>`;
if (discount > 0) mHtml += `<span class="text-muted">Disc: <strong class="text-danger">\u2013${fmtNum(discount)}</strong></span>`;
if (transport > 0) mHtml += `<span class="text-muted">Transp: <strong class="text-body">${fmtNum(transport)}</strong></span>`;
mHtml += `<span>Total: <strong>${total} lei</strong></span>`;
mobile.innerHTML = mHtml;
}
// ── Quick Map Modal (uses shared openQuickMap) ───
function openDashQuickMap(sku, productName, orderNumber, itemIdx) {
const item = (window._detailItems || [])[itemIdx];
const details = item?.codmat_details;
const isDirect = details?.length === 1 && details[0].direct === true;
openQuickMap({
sku,
productName,
isDirect,
directInfo: isDirect ? { codmat: details[0].codmat, denumire: details[0].denumire } : null,
prefill: (!isDirect && details?.length) ? details.map(d => ({ codmat: d.codmat, cantitate: d.cantitate_roa, denumire: d.denumire })) : null,
onSave: () => {
if (orderNumber) openDashOrderDetail(orderNumber);
loadDashOrders();
}
});
}

View File

@@ -1,466 +0,0 @@
// logs.js - Structured order viewer with text log fallback
let currentRunId = null;
let runsPage = 1;
let logPollTimer = null;
let currentFilter = 'all';
let ordersPage = 1;
let ordersSortColumn = 'order_date';
let ordersSortDirection = 'desc';
function fmtCost(v) {
return v > 0 ? Number(v).toFixed(2) : '';
}
function fmtDuration(startedAt, finishedAt) {
if (!startedAt || !finishedAt) return '-';
const diffMs = new Date(finishedAt) - new Date(startedAt);
if (isNaN(diffMs) || diffMs < 0) return '-';
const secs = Math.round(diffMs / 1000);
if (secs < 60) return secs + 's';
return Math.floor(secs / 60) + 'm ' + (secs % 60) + 's';
}
function runStatusBadge(status) {
switch ((status || '').toLowerCase()) {
case 'completed': return '<span style="color:#16a34a;font-weight:600">completed</span>';
case 'running': return '<span style="color:#2563eb;font-weight:600">running</span>';
case 'failed': return '<span style="color:#dc2626;font-weight:600">failed</span>';
default: return `<span style="font-weight:600">${esc(status)}</span>`;
}
}
function orderStatusBadge(status) {
switch ((status || '').toUpperCase()) {
case 'IMPORTED': return '<span class="badge bg-success">Importat</span>';
case 'ALREADY_IMPORTED': return '<span class="badge bg-info">Deja importat</span>';
case 'SKIPPED': return '<span class="badge bg-warning">Omis</span>';
case 'ERROR': return '<span class="badge bg-danger">Eroare</span>';
case 'DELETED_IN_ROA': return '<span class="badge bg-dark">Sters din ROA</span>';
default: return `<span class="badge bg-secondary">${esc(status)}</span>`;
}
}
function logStatusText(status) {
switch ((status || '').toUpperCase()) {
case 'IMPORTED': return 'Importat';
case 'ALREADY_IMPORTED': return 'Deja imp.';
case 'SKIPPED': return 'Omis';
case 'ERROR': return 'Eroare';
default: return esc(status);
}
}
function logsGoPage(p) { loadRunOrders(currentRunId, null, p); }
// ── Runs Dropdown ────────────────────────────────
async function loadRuns() {
// Load all recent runs for dropdown
try {
const res = await fetch(`/api/sync/history?page=1&per_page=100`);
if (!res.ok) throw new Error('HTTP ' + res.status);
const data = await res.json();
const runs = data.runs || [];
const dd = document.getElementById('runsDropdown');
if (runs.length === 0) {
dd.innerHTML = '<option value="">Niciun sync run</option>';
} else {
dd.innerHTML = '<option value="">-- Selecteaza un run --</option>' +
runs.map(r => {
const started = r.started_at ? new Date(r.started_at).toLocaleString('ro-RO', {day:'2-digit',month:'2-digit',year:'numeric',hour:'2-digit',minute:'2-digit'}) : '?';
const st = (r.status || '').toUpperCase();
const statusEmoji = st === 'COMPLETED' ? '✓' : st === 'RUNNING' ? '⟳' : '✗';
const newImp = r.new_imported || 0;
const already = r.already_imported || 0;
const imp = r.imported || 0;
const skip = r.skipped || 0;
const err = r.errors || 0;
const impLabel = already > 0 ? `${newImp} noi, ${already} deja` : `${imp} imp`;
const label = `${started}${statusEmoji} ${r.status} (${impLabel}, ${skip} skip, ${err} err)`;
const selected = r.run_id === currentRunId ? 'selected' : '';
return `<option value="${esc(r.run_id)}" ${selected}>${esc(label)}</option>`;
}).join('');
}
const ddMobile = document.getElementById('runsDropdownMobile');
if (ddMobile) ddMobile.innerHTML = dd.innerHTML;
} catch (err) {
const dd = document.getElementById('runsDropdown');
dd.innerHTML = `<option value="">Eroare: ${esc(err.message)}</option>`;
}
}
// ── Run Selection ────────────────────────────────
async function selectRun(runId) {
if (logPollTimer) { clearInterval(logPollTimer); logPollTimer = null; }
currentRunId = runId;
currentFilter = 'all';
ordersPage = 1;
const url = new URL(window.location);
if (runId) { url.searchParams.set('run', runId); } else { url.searchParams.delete('run'); }
history.replaceState(null, '', url);
// Sync dropdown selection
const dd = document.getElementById('runsDropdown');
if (dd && dd.value !== runId) dd.value = runId;
const ddMobile = document.getElementById('runsDropdownMobile');
if (ddMobile && ddMobile.value !== runId) ddMobile.value = runId;
if (!runId) {
document.getElementById('logViewerSection').style.display = 'none';
return;
}
document.getElementById('logViewerSection').style.display = '';
const logRunIdEl = document.getElementById('logRunId'); if (logRunIdEl) logRunIdEl.textContent = runId;
document.getElementById('logStatusBadge').innerHTML = '...';
document.getElementById('textLogSection').style.display = 'none';
await loadRunOrders(runId, 'all', 1);
// Also load text log in background
fetchTextLog(runId);
}
// ── Per-Order Filtering (R1) ─────────────────────
async function loadRunOrders(runId, statusFilter, page) {
if (statusFilter != null) currentFilter = statusFilter;
if (page != null) ordersPage = page;
// Update filter pill active state
document.querySelectorAll('#orderFilterPills .filter-pill').forEach(btn => {
btn.classList.toggle('active', btn.dataset.logStatus === currentFilter);
});
try {
const res = await fetch(`/api/sync/run/${encodeURIComponent(runId)}/orders?status=${currentFilter}&page=${ordersPage}&per_page=50&sort_by=${ordersSortColumn}&sort_dir=${ordersSortDirection}`);
if (!res.ok) throw new Error('HTTP ' + res.status);
const data = await res.json();
const counts = data.counts || {};
document.getElementById('countAll').textContent = counts.total || 0;
document.getElementById('countImported').textContent = counts.imported || 0;
document.getElementById('countSkipped').textContent = counts.skipped || 0;
document.getElementById('countError').textContent = counts.error || 0;
const alreadyEl = document.getElementById('countAlreadyImported');
if (alreadyEl) alreadyEl.textContent = counts.already_imported || 0;
const tbody = document.getElementById('runOrdersBody');
const orders = data.orders || [];
if (orders.length === 0) {
tbody.innerHTML = '<tr><td colspan="9" class="text-center text-muted py-3">Nicio comanda</td></tr>';
} else {
tbody.innerHTML = orders.map((o, i) => {
const dateStr = fmtDate(o.order_date);
const orderTotal = o.order_total != null ? Number(o.order_total).toFixed(2) : '-';
return `<tr style="cursor:pointer" onclick="openOrderDetail('${esc(o.order_number)}')">
<td>${statusDot(o.status)}</td>
<td>${(ordersPage - 1) * 50 + i + 1}</td>
<td class="text-nowrap">${dateStr}</td>
<td><code>${esc(o.order_number)}</code></td>
<td class="fw-bold">${esc(o.customer_name)}</td>
<td>${o.items_count || 0}</td>
<td class="text-end text-muted">${fmtCost(o.delivery_cost)}</td>
<td class="text-end text-muted">${fmtCost(o.discount_total)}</td>
<td class="text-end fw-bold">${orderTotal}</td>
</tr>`;
}).join('');
}
// Mobile flat rows
const mobileList = document.getElementById('logsMobileList');
if (mobileList) {
if (orders.length === 0) {
mobileList.innerHTML = '<div class="flat-row text-muted py-3 justify-content-center">Nicio comanda</div>';
} else {
mobileList.innerHTML = orders.map(o => {
const d = o.order_date || '';
let dateFmt = '-';
if (d.length >= 10) {
dateFmt = d.slice(8, 10) + '.' + d.slice(5, 7) + '.' + d.slice(2, 4);
if (d.length >= 16) dateFmt += ' ' + d.slice(11, 16);
}
const totalStr = o.order_total ? Number(o.order_total).toFixed(2) : '';
return `<div class="flat-row" onclick="openOrderDetail('${esc(o.order_number)}')" style="font-size:0.875rem">
${statusDot(o.status)}
<span style="color:#6b7280" class="text-nowrap">${dateFmt}</span>
<span class="grow truncate fw-bold">${esc(o.customer_name || '—')}</span>
<span class="text-nowrap">x${o.items_count || 0}${totalStr ? ' · <strong>' + totalStr + '</strong>' : ''}</span>
</div>`;
}).join('');
}
}
// Mobile segmented control
renderMobileSegmented('logsMobileSeg', [
{ label: 'Toate', count: counts.total || 0, value: 'all', active: currentFilter === 'all', colorClass: 'fc-neutral' },
{ label: 'Imp.', count: counts.imported || 0, value: 'IMPORTED', active: currentFilter === 'IMPORTED', colorClass: 'fc-green' },
{ label: 'Deja', count: counts.already_imported || 0, value: 'ALREADY_IMPORTED', active: currentFilter === 'ALREADY_IMPORTED', colorClass: 'fc-blue' },
{ label: 'Omise', count: counts.skipped || 0, value: 'SKIPPED', active: currentFilter === 'SKIPPED', colorClass: 'fc-yellow' },
{ label: 'Erori', count: counts.error || 0, value: 'ERROR', active: currentFilter === 'ERROR', colorClass: 'fc-red' }
], (val) => filterOrders(val));
// Orders pagination
const totalPages = data.pages || 1;
const infoEl = document.getElementById('ordersPageInfo');
if (infoEl) infoEl.textContent = `${data.total || 0} comenzi | Pagina ${ordersPage} din ${totalPages}`;
const pagHtml = `<small class="text-muted me-auto">${data.total || 0} comenzi | Pagina ${ordersPage} din ${totalPages}</small>` + renderUnifiedPagination(ordersPage, totalPages, 'logsGoPage');
const pagDiv = document.getElementById('ordersPagination');
if (pagDiv) pagDiv.innerHTML = pagHtml;
const pagDivTop = document.getElementById('ordersPaginationTop');
if (pagDivTop) pagDivTop.innerHTML = pagHtml;
// Update run status badge
const runRes = await fetch(`/api/sync/run/${encodeURIComponent(runId)}`);
const runData = await runRes.json();
if (runData.run) {
document.getElementById('logStatusBadge').innerHTML = runStatusBadge(runData.run.status);
// Update mobile run dot
const mDot = document.getElementById('mobileRunDot');
if (mDot) mDot.className = 'sync-status-dot ' + (runData.run.status || 'idle');
}
} catch (err) {
document.getElementById('runOrdersBody').innerHTML =
`<tr><td colspan="9" class="text-center text-danger">${esc(err.message)}</td></tr>`;
}
}
function filterOrders(status) {
loadRunOrders(currentRunId, status, 1);
}
function sortOrdersBy(col) {
if (ordersSortColumn === col) {
ordersSortDirection = ordersSortDirection === 'asc' ? 'desc' : 'asc';
} else {
ordersSortColumn = col;
ordersSortDirection = 'asc';
}
// Update sort icons
document.querySelectorAll('#logViewerSection .sort-icon').forEach(span => {
const c = span.dataset.col;
span.textContent = c === ordersSortColumn ? (ordersSortDirection === 'asc' ? '\u2191' : '\u2193') : '';
});
loadRunOrders(currentRunId, null, 1);
}
// ── Text Log (collapsible) ──────────────────────
function toggleTextLog() {
const section = document.getElementById('textLogSection');
section.style.display = section.style.display === 'none' ? '' : 'none';
if (section.style.display !== 'none' && currentRunId) {
fetchTextLog(currentRunId);
}
}
async function fetchTextLog(runId) {
// Clear any existing poll timer to prevent accumulation
if (logPollTimer) { clearInterval(logPollTimer); logPollTimer = null; }
try {
const res = await fetch(`/api/sync/run/${encodeURIComponent(runId)}/text-log`);
if (!res.ok) throw new Error('HTTP ' + res.status);
const data = await res.json();
document.getElementById('logContent').textContent = data.text || '(log gol)';
if (!data.finished) {
if (document.getElementById('autoRefreshToggle')?.checked) {
logPollTimer = setInterval(async () => {
try {
const r = await fetch(`/api/sync/run/${encodeURIComponent(runId)}/text-log`);
const d = await r.json();
if (currentRunId !== runId) { clearInterval(logPollTimer); return; }
document.getElementById('logContent').textContent = d.text || '(log gol)';
const el = document.getElementById('logContent');
el.scrollTop = el.scrollHeight;
if (d.finished) {
clearInterval(logPollTimer);
logPollTimer = null;
loadRuns();
loadRunOrders(runId, currentFilter, ordersPage);
}
} catch (e) { console.error('Poll error:', e); }
}, 2500);
}
}
} catch (err) {
document.getElementById('logContent').textContent = 'Eroare: ' + err.message;
}
}
// ── Multi-CODMAT helper (D1) ─────────────────────
function renderCodmatCell(item) {
if (!item.codmat_details || item.codmat_details.length === 0) {
return `<code>${esc(item.codmat || '-')}</code>`;
}
if (item.codmat_details.length === 1) {
const d = item.codmat_details[0];
return `<code>${esc(d.codmat)}</code>`;
}
// Multi-CODMAT: compact list
return item.codmat_details.map(d =>
`<div class="small"><code>${esc(d.codmat)}</code> <span class="text-muted">\xd7${d.cantitate_roa}</span></div>`
).join('');
}
// ── Order Detail Modal (R9) ─────────────────────
async function openOrderDetail(orderNumber) {
document.getElementById('detailOrderNumber').textContent = '#' + orderNumber;
document.getElementById('detailCustomer').textContent = '...';
document.getElementById('detailDate').textContent = '';
document.getElementById('detailStatus').innerHTML = '';
document.getElementById('detailIdComanda').textContent = '-';
document.getElementById('detailIdPartener').textContent = '-';
document.getElementById('detailIdAdresaFact').textContent = '-';
document.getElementById('detailIdAdresaLivr').textContent = '-';
document.getElementById('detailItemsBody').innerHTML = '<tr><td colspan="6" class="text-center">Se incarca...</td></tr>';
document.getElementById('detailError').style.display = 'none';
const detailItemsTotal = document.getElementById('detailItemsTotal');
if (detailItemsTotal) detailItemsTotal.textContent = '-';
const detailOrderTotal = document.getElementById('detailOrderTotal');
if (detailOrderTotal) detailOrderTotal.textContent = '-';
const mobileContainer = document.getElementById('detailItemsMobile');
if (mobileContainer) mobileContainer.innerHTML = '';
const modalEl = document.getElementById('orderDetailModal');
const existing = bootstrap.Modal.getInstance(modalEl);
if (existing) { existing.show(); } else { new bootstrap.Modal(modalEl).show(); }
try {
const res = await fetch(`/api/sync/order/${encodeURIComponent(orderNumber)}`);
const data = await res.json();
if (data.error) {
document.getElementById('detailError').textContent = data.error;
document.getElementById('detailError').style.display = '';
return;
}
const order = data.order || {};
document.getElementById('detailCustomer').textContent = order.customer_name || '-';
document.getElementById('detailDate').textContent = fmtDate(order.order_date);
document.getElementById('detailStatus').innerHTML = orderStatusBadge(order.status);
document.getElementById('detailIdComanda').textContent = order.id_comanda || '-';
document.getElementById('detailIdPartener').textContent = order.id_partener || '-';
document.getElementById('detailIdAdresaFact').textContent = order.id_adresa_facturare || '-';
document.getElementById('detailIdAdresaLivr').textContent = order.id_adresa_livrare || '-';
if (order.error_message) {
document.getElementById('detailError').textContent = order.error_message;
document.getElementById('detailError').style.display = '';
}
const dlvEl = document.getElementById('detailDeliveryCost');
if (dlvEl) dlvEl.textContent = order.delivery_cost > 0 ? Number(order.delivery_cost).toFixed(2) + ' lei' : '';
const dscEl = document.getElementById('detailDiscount');
if (dscEl) dscEl.textContent = order.discount_total > 0 ? '' + Number(order.discount_total).toFixed(2) + ' lei' : '';
const items = data.items || [];
if (items.length === 0) {
document.getElementById('detailItemsBody').innerHTML = '<tr><td colspan="6" class="text-center text-muted">Niciun articol</td></tr>';
return;
}
// Update totals row
const itemsTotal = items.reduce((sum, item) => sum + (Number(item.price || 0) * Number(item.quantity || 0)), 0);
document.getElementById('detailItemsTotal').textContent = itemsTotal.toFixed(2) + ' lei';
document.getElementById('detailOrderTotal').textContent = order.order_total != null ? Number(order.order_total).toFixed(2) + ' lei' : '-';
// Mobile article flat list
const mobileContainer = document.getElementById('detailItemsMobile');
if (mobileContainer) {
mobileContainer.innerHTML = '<div class="detail-item-flat">' + items.map((item, idx) => {
const codmatList = item.codmat_details?.length
? item.codmat_details.map(d => `<span class="dif-codmat-link" onclick="openLogsQuickMap('${esc(item.sku)}','${esc(item.product_name||'')}','${esc(orderNumber)}')">${esc(d.codmat)}</span>`).join(' ')
: `<span class="dif-codmat-link" onclick="openLogsQuickMap('${esc(item.sku)}','${esc(item.product_name||'')}','${esc(orderNumber)}')">${esc(item.codmat || '')}</span>`;
const valoare = (Number(item.price || 0) * Number(item.quantity || 0)).toFixed(2);
return `<div class="dif-item">
<div class="dif-row">
<span class="dif-sku">${esc(item.sku)}</span>
${codmatList}
</div>
<div class="dif-row">
<span class="dif-name">${esc(item.product_name || '')}</span>
<span class="dif-qty">x${item.quantity || 0}</span>
<span class="dif-val">${valoare} lei</span>
</div>
</div>`;
}).join('') + '</div>';
}
document.getElementById('detailItemsBody').innerHTML = items.map(item => {
const valoare = (Number(item.price || 0) * Number(item.quantity || 0)).toFixed(2);
const codmatCell = `<span class="codmat-link" onclick="openLogsQuickMap('${esc(item.sku)}', '${esc(item.product_name || '')}', '${esc(orderNumber)}')" title="Click pentru mapare">${renderCodmatCell(item)}</span>`;
return `<tr>
<td><code>${esc(item.sku)}</code></td>
<td>${esc(item.product_name || '-')}</td>
<td>${codmatCell}</td>
<td>${item.quantity || 0}</td>
<td>${item.price != null ? Number(item.price).toFixed(2) : '-'}</td>
<td class="text-end">${valoare}</td>
</tr>`;
}).join('');
} catch (err) {
document.getElementById('detailError').textContent = err.message;
document.getElementById('detailError').style.display = '';
}
}
// ── Quick Map Modal (uses shared openQuickMap) ───
function openLogsQuickMap(sku, productName, orderNumber) {
openQuickMap({
sku,
productName,
onSave: () => {
if (orderNumber) openOrderDetail(orderNumber);
loadRunOrders(currentRunId, currentFilter, ordersPage);
}
});
}
// ── Init ────────────────────────────────────────
document.addEventListener('DOMContentLoaded', () => {
loadRuns();
document.querySelectorAll('#orderFilterPills .filter-pill').forEach(btn => {
btn.addEventListener('click', function() {
filterOrders(this.dataset.logStatus || 'all');
});
});
const preselected = document.getElementById('preselectedRun');
const urlParams = new URLSearchParams(window.location.search);
const runFromUrl = urlParams.get('run') || (preselected ? preselected.value : '');
if (runFromUrl) {
selectRun(runFromUrl);
}
document.getElementById('autoRefreshToggle')?.addEventListener('change', (e) => {
if (e.target.checked) {
// Resume polling if we have an active run
if (currentRunId) fetchTextLog(currentRunId);
} else {
// Pause polling
if (logPollTimer) { clearInterval(logPollTimer); logPollTimer = null; }
}
});
document.getElementById('autoRefreshToggleMobile')?.addEventListener('change', (e) => {
const desktop = document.getElementById('autoRefreshToggle');
if (desktop) desktop.checked = e.target.checked;
desktop?.dispatchEvent(new Event('change'));
});
});

View File

@@ -1,765 +0,0 @@
let currentPage = 1;
let mappingsPerPage = 50;
let currentSearch = '';
let searchTimeout = null;
let sortColumn = 'sku';
let sortDirection = 'asc';
let editingMapping = null; // {sku, codmat} when editing
const kitPriceCache = new Map();
// Load on page ready
document.addEventListener('DOMContentLoaded', () => {
loadMappings();
initAddModal();
initDeleteModal();
});
function debounceSearch() {
clearTimeout(searchTimeout);
searchTimeout = setTimeout(() => {
currentSearch = document.getElementById('searchInput').value;
currentPage = 1;
loadMappings();
}, 300);
}
// ── Sorting (R7) ─────────────────────────────────
function sortBy(col) {
if (sortColumn === col) {
sortDirection = sortDirection === 'asc' ? 'desc' : 'asc';
} else {
sortColumn = col;
sortDirection = 'asc';
}
currentPage = 1;
loadMappings();
}
function updateSortIcons() {
document.querySelectorAll('.sort-icon').forEach(span => {
const col = span.dataset.col;
if (col === sortColumn) {
span.textContent = sortDirection === 'asc' ? '\u2191' : '\u2193';
} else {
span.textContent = '';
}
});
}
// ── Load & Render ────────────────────────────────
async function loadMappings() {
const showInactive = document.getElementById('showInactive')?.checked;
const showDeleted = document.getElementById('showDeleted')?.checked;
const params = new URLSearchParams({
search: currentSearch,
page: currentPage,
per_page: mappingsPerPage,
sort_by: sortColumn,
sort_dir: sortDirection
});
if (showDeleted) params.set('show_deleted', 'true');
try {
const res = await fetch(`/api/mappings?${params}`);
const data = await res.json();
let mappings = data.mappings || [];
// Client-side filter for inactive unless toggle is on
// (keep deleted rows visible when showDeleted is on, even if inactive)
if (!showInactive) {
mappings = mappings.filter(m => m.activ || m.sters);
}
renderTable(mappings, showDeleted);
renderPagination(data);
updateSortIcons();
} catch (err) {
document.getElementById('mappingsFlatList').innerHTML =
`<div class="flat-row text-danger py-3 justify-content-center">Eroare: ${err.message}</div>`;
}
}
function renderTable(mappings, showDeleted) {
const container = document.getElementById('mappingsFlatList');
if (!mappings || mappings.length === 0) {
container.innerHTML = '<div class="flat-row text-muted py-4 justify-content-center">Nu exista mapari</div>';
return;
}
// Count CODMATs per SKU for kit detection
const skuCodmatCount = {};
mappings.forEach(m => {
skuCodmatCount[m.sku] = (skuCodmatCount[m.sku] || 0) + 1;
});
let prevSku = null;
let html = '';
mappings.forEach((m, i) => {
const isNewGroup = m.sku !== prevSku;
if (isNewGroup) {
const isKit = (skuCodmatCount[m.sku] || 0) > 1;
const kitBadge = isKit
? ` <span class="text-muted small">Kit · ${skuCodmatCount[m.sku]}</span><span class="kit-price-loading" data-sku="${esc(m.sku)}" style="display:none"><span class="spinner-border spinner-border-sm ms-1" style="width:0.8rem;height:0.8rem"></span></span>`
: '';
const inactiveStyle = !m.activ && !m.sters ? 'opacity:0.6;' : '';
html += `<div class="flat-row" style="background:#f8fafc;font-weight:600;border-top:1px solid #e5e7eb;${inactiveStyle}">
<span class="${m.activ ? 'dot dot-green' : 'dot dot-yellow'}" style="cursor:${m.sters ? 'default' : 'pointer'}"
${m.sters ? '' : `onclick="event.stopPropagation();toggleActive('${esc(m.sku)}', '${esc(m.codmat)}', ${m.activ})"`}
title="${m.activ ? 'Activ' : 'Inactiv'}"></span>
<strong class="me-1 text-nowrap">${esc(m.sku)}</strong>${kitBadge}
<span class="grow truncate text-muted" style="font-size:0.875rem">${esc(m.product_name || '')}</span>
${m.sters
? `<button class="btn btn-sm btn-outline-success" onclick="event.stopPropagation();restoreMapping('${esc(m.sku)}', '${esc(m.codmat)}')" title="Restaureaza" style="padding:0.1rem 0.4rem"><i class="bi bi-arrow-counterclockwise"></i></button>`
: `<button class="context-menu-trigger" data-sku="${esc(m.sku)}" data-codmat="${esc(m.codmat)}" data-cantitate="${m.cantitate_roa}">&#8942;</button>`
}
</div>`;
}
const deletedStyle = m.sters ? 'text-decoration:line-through;opacity:0.5;' : '';
const isKitRow = (skuCodmatCount[m.sku] || 0) > 1;
const kitPriceSlot = isKitRow ? `<span class="kit-price-slot text-muted small ms-2" data-sku="${esc(m.sku)}" data-codmat="${esc(m.codmat)}"></span>` : '';
const inlinePrice = m.pret_cu_tva ? `<span class="text-muted small ms-2">${parseFloat(m.pret_cu_tva).toFixed(2)} lei</span>` : '';
html += `<div class="flat-row" style="padding-left:1.5rem;font-size:0.9rem;${deletedStyle}">
<code>${esc(m.codmat)}</code>
<span class="grow truncate text-muted" style="font-size:0.85rem">${esc(m.denumire || '')}</span>
<span class="text-nowrap" style="font-size:0.875rem">
<span class="${m.sters ? '' : 'editable'}" style="cursor:${m.sters ? 'default' : 'pointer'}"
${m.sters ? '' : `onclick="editFlatValue(this, '${esc(m.sku)}', '${esc(m.codmat)}', 'cantitate_roa', ${m.cantitate_roa})"`}>x${m.cantitate_roa}</span>${isKitRow ? kitPriceSlot : inlinePrice}
</span>
</div>`;
// After last CODMAT of a kit, add total row
const isLastOfKit = isKitRow && (i === mappings.length - 1 || mappings[i + 1].sku !== m.sku);
if (isLastOfKit) {
html += `<div class="flat-row kit-total-slot text-muted small" data-sku="${esc(m.sku)}" style="padding-left:1.5rem;display:none;border-top:1px dashed #e5e7eb"></div>`;
}
prevSku = m.sku;
});
container.innerHTML = html;
// Wire context menu triggers
container.querySelectorAll('.context-menu-trigger').forEach(btn => {
btn.addEventListener('click', (e) => {
e.stopPropagation();
const { sku, codmat, cantitate } = btn.dataset;
const rect = btn.getBoundingClientRect();
showContextMenu(rect.left, rect.bottom + 2, [
{ label: 'Editeaza', action: () => openEditModal(sku, codmat, parseFloat(cantitate)) },
{ label: 'Sterge', action: () => deleteMappingConfirm(sku, codmat), danger: true }
]);
});
});
// Load prices for visible kits
const loadedKits = new Set();
container.querySelectorAll('.kit-price-loading').forEach(el => {
const sku = el.dataset.sku;
if (!loadedKits.has(sku)) {
loadedKits.add(sku);
loadKitPrices(sku, container);
}
});
}
async function loadKitPrices(sku, container) {
if (kitPriceCache.has(sku)) {
renderKitPrices(sku, kitPriceCache.get(sku), container);
return;
}
// Show loading spinner
const spinner = container.querySelector(`.kit-price-loading[data-sku="${CSS.escape(sku)}"]`);
if (spinner) spinner.style.display = '';
try {
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/prices`);
const data = await res.json();
if (data.error) {
if (spinner) spinner.innerHTML = `<small class="text-danger">${esc(data.error)}</small>`;
return;
}
kitPriceCache.set(sku, data.prices || []);
renderKitPrices(sku, data.prices || [], container);
} catch (err) {
if (spinner) spinner.innerHTML = `<small class="text-danger">Eroare la încărcarea prețurilor</small>`;
}
}
function renderKitPrices(sku, prices, container) {
if (!prices || prices.length === 0) return;
// Update each codmat row with price info
const rows = container.querySelectorAll(`.kit-price-slot[data-sku="${CSS.escape(sku)}"]`);
let total = 0;
rows.forEach(slot => {
const codmat = slot.dataset.codmat;
const p = prices.find(pr => pr.codmat === codmat);
if (p && p.pret_cu_tva > 0) {
slot.innerHTML = `${p.pret_cu_tva.toFixed(2)} lei`;
total += p.pret_cu_tva * (p.cantitate_roa || 1);
} else if (p) {
slot.innerHTML = `<span class="text-muted">fără preț</span>`;
}
});
// Show total
const totalSlot = container.querySelector(`.kit-total-slot[data-sku="${CSS.escape(sku)}"]`);
if (totalSlot && total > 0) {
totalSlot.innerHTML = `Total componente: ${total.toFixed(2)} lei`;
totalSlot.style.display = '';
}
// Hide loading spinner
const spinner = container.querySelector(`.kit-price-loading[data-sku="${CSS.escape(sku)}"]`);
if (spinner) spinner.style.display = 'none';
}
// Inline edit for flat-row values (cantitate)
function editFlatValue(span, sku, codmat, field, currentValue) {
if (span.querySelector('input')) return;
const input = document.createElement('input');
input.type = 'number';
input.className = 'form-control form-control-sm d-inline';
input.value = currentValue;
input.step = field === 'cantitate_roa' ? '0.001' : '0.01';
input.style.width = '70px';
input.style.display = 'inline';
const originalText = span.textContent;
span.textContent = '';
span.appendChild(input);
input.focus();
input.select();
const save = async () => {
const newValue = parseFloat(input.value);
if (isNaN(newValue) || newValue === currentValue) {
span.textContent = originalText;
return;
}
try {
const body = {};
body[field] = newValue;
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(body)
});
const data = await res.json();
if (data.success) { loadMappings(); }
else { span.textContent = originalText; alert('Eroare: ' + (data.error || 'Update failed')); }
} catch (err) { span.textContent = originalText; }
};
input.addEventListener('blur', save);
input.addEventListener('keydown', (e) => {
if (e.key === 'Enter') { e.preventDefault(); save(); }
if (e.key === 'Escape') { span.textContent = originalText; }
});
}
function renderPagination(data) {
const pagOpts = { perPage: mappingsPerPage, perPageFn: 'mappingsChangePerPage', perPageOptions: [25, 50, 100, 250] };
const infoHtml = `<small class="text-muted me-auto">${data.total} mapari | Pagina ${data.page} din ${data.pages || 1}</small>`;
const pagHtml = infoHtml + renderUnifiedPagination(data.page, data.pages || 1, 'goPage', pagOpts);
const top = document.getElementById('mappingsPagTop');
const bot = document.getElementById('mappingsPagBottom');
if (top) top.innerHTML = pagHtml;
if (bot) bot.innerHTML = pagHtml;
}
function mappingsChangePerPage(val) { mappingsPerPage = parseInt(val) || 50; currentPage = 1; loadMappings(); }
function goPage(p) {
currentPage = p;
loadMappings();
}
// ── Multi-CODMAT Add Modal (R11) ─────────────────
let acTimeouts = {};
function initAddModal() {
const modal = document.getElementById('addModal');
if (!modal) return;
modal.addEventListener('show.bs.modal', () => {
if (!editingMapping) {
clearAddForm();
}
});
modal.addEventListener('hidden.bs.modal', () => {
editingMapping = null;
document.getElementById('addModalTitle').textContent = 'Adauga Mapare';
});
}
function clearAddForm() {
document.getElementById('inputSku').value = '';
document.getElementById('inputSku').readOnly = false;
document.getElementById('addModalProductName').style.display = 'none';
document.getElementById('pctWarning').style.display = 'none';
document.getElementById('addModalTitle').textContent = 'Adauga Mapare';
const container = document.getElementById('codmatLines');
container.innerHTML = '';
addCodmatLine();
}
async function openEditModal(sku, codmat, cantitate) {
editingMapping = { sku, codmat };
document.getElementById('addModalTitle').textContent = 'Editare Mapare';
document.getElementById('inputSku').value = sku;
document.getElementById('inputSku').readOnly = false;
document.getElementById('pctWarning').style.display = 'none';
const container = document.getElementById('codmatLines');
container.innerHTML = '';
try {
// Fetch all CODMATs for this SKU
const res = await fetch(`/api/mappings?search=${encodeURIComponent(sku)}&per_page=100`);
const data = await res.json();
const allMappings = (data.mappings || []).filter(m => m.sku === sku && !m.sters);
// Show product name if available
const productName = allMappings[0]?.product_name || '';
const productNameEl = document.getElementById('addModalProductName');
const productNameText = document.getElementById('inputProductName');
if (productName && productNameEl && productNameText) {
productNameText.textContent = productName;
productNameEl.style.display = '';
}
if (allMappings.length === 0) {
// Fallback to single line with passed values
addCodmatLine();
const line = container.querySelector('.codmat-line');
if (line) {
line.querySelector('.cl-codmat').value = codmat;
line.querySelector('.cl-cantitate').value = cantitate;
}
} else {
for (const m of allMappings) {
addCodmatLine();
const lines = container.querySelectorAll('.codmat-line');
const line = lines[lines.length - 1];
line.querySelector('.cl-codmat').value = m.codmat;
if (m.denumire) {
line.querySelector('.cl-selected').textContent = m.denumire;
}
line.querySelector('.cl-cantitate').value = m.cantitate_roa;
}
}
} catch (e) {
// Fallback on error
addCodmatLine();
const line = container.querySelector('.codmat-line');
if (line) {
line.querySelector('.cl-codmat').value = codmat;
line.querySelector('.cl-cantitate').value = cantitate;
}
}
new bootstrap.Modal(document.getElementById('addModal')).show();
}
function addCodmatLine() {
const container = document.getElementById('codmatLines');
const idx = container.children.length;
const div = document.createElement('div');
div.className = 'qm-line codmat-line';
div.innerHTML = `
<div class="qm-row">
<div class="qm-codmat-wrap position-relative">
<input type="text" class="form-control form-control-sm cl-codmat" placeholder="CODMAT..." autocomplete="off" data-idx="${idx}">
<div class="autocomplete-dropdown d-none cl-ac-dropdown"></div>
</div>
<input type="number" class="form-control form-control-sm cl-cantitate" value="1" step="0.001" min="0.001" title="Cantitate ROA" style="width:70px">
${idx > 0 ? `<button type="button" class="btn btn-sm btn-outline-danger qm-rm-btn" onclick="this.closest('.codmat-line').remove()"><i class="bi bi-x"></i></button>` : '<span style="width:30px"></span>'}
</div>
<div class="qm-selected text-muted cl-selected" style="font-size:0.75rem;padding-left:2px"></div>
`;
container.appendChild(div);
// Setup autocomplete
const input = div.querySelector('.cl-codmat');
const dropdown = div.querySelector('.cl-ac-dropdown');
const selected = div.querySelector('.cl-selected');
input.addEventListener('input', () => {
const key = 'cl_' + idx;
clearTimeout(acTimeouts[key]);
acTimeouts[key] = setTimeout(() => clAutocomplete(input, dropdown, selected), 250);
});
input.addEventListener('blur', () => {
setTimeout(() => dropdown.classList.add('d-none'), 200);
});
}
async function clAutocomplete(input, dropdown, selectedEl) {
const q = input.value;
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
try {
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
const data = await res.json();
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
dropdown.innerHTML = data.results.map(r =>
`<div class="autocomplete-item" onmousedown="clSelectArticle(this, '${esc(r.codmat)}', '${esc(r.denumire)}${r.um ? ' (' + esc(r.um) + ')' : ''}')">
<span class="codmat">${esc(r.codmat)}</span> — <span class="denumire">${esc(r.denumire)}</span>${r.um ? ` <small class="text-muted">(${esc(r.um)})</small>` : ''}
</div>`
).join('');
dropdown.classList.remove('d-none');
} catch { dropdown.classList.add('d-none'); }
}
function clSelectArticle(el, codmat, label) {
const line = el.closest('.codmat-line');
line.querySelector('.cl-codmat').value = codmat;
line.querySelector('.cl-selected').textContent = label;
line.querySelector('.cl-ac-dropdown').classList.add('d-none');
}
async function saveMapping() {
const sku = document.getElementById('inputSku').value.trim();
if (!sku) { alert('SKU este obligatoriu'); return; }
const lines = document.querySelectorAll('.codmat-line');
const mappings = [];
for (const line of lines) {
const codmat = line.querySelector('.cl-codmat').value.trim();
const cantitate = parseFloat(line.querySelector('.cl-cantitate').value) || 1;
if (!codmat) continue;
mappings.push({ codmat, cantitate_roa: cantitate });
}
if (mappings.length === 0) { alert('Adauga cel putin un CODMAT'); return; }
document.getElementById('pctWarning').style.display = 'none';
try {
let res;
if (editingMapping) {
if (mappings.length === 1) {
// Single CODMAT edit: use existing PUT endpoint
res = await fetch(`/api/mappings/${encodeURIComponent(editingMapping.sku)}/${encodeURIComponent(editingMapping.codmat)}/edit`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
new_sku: sku,
new_codmat: mappings[0].codmat,
cantitate_roa: mappings[0].cantitate_roa
})
});
} else {
// Multi-CODMAT set: delete all existing then create new batch
const oldSku = editingMapping.sku;
const existRes = await fetch(`/api/mappings?search=${encodeURIComponent(oldSku)}&per_page=100`);
const existData = await existRes.json();
const existing = (existData.mappings || []).filter(m => m.sku === oldSku && !m.sters);
// Delete each existing CODMAT for old SKU
for (const m of existing) {
await fetch(`/api/mappings/${encodeURIComponent(m.sku)}/${encodeURIComponent(m.codmat)}`, {
method: 'DELETE'
});
}
// Create new batch with auto_restore (handles just-soft-deleted records)
res = await fetch('/api/mappings/batch', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ sku, mappings, auto_restore: true })
});
}
} else if (mappings.length === 1) {
res = await fetch('/api/mappings', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ sku, codmat: mappings[0].codmat, cantitate_roa: mappings[0].cantitate_roa })
});
} else {
res = await fetch('/api/mappings/batch', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ sku, mappings })
});
}
const data = await res.json();
if (data.success) {
bootstrap.Modal.getInstance(document.getElementById('addModal')).hide();
editingMapping = null;
loadMappings();
} else if (res.status === 409) {
handleMappingConflict(data);
} else {
alert('Eroare: ' + (data.error || 'Unknown'));
}
} catch (err) {
alert('Eroare: ' + err.message);
}
}
// ── Inline Add Row ──────────────────────────────
let inlineAddVisible = false;
function showInlineAddRow() {
// On mobile, open the full modal instead
if (window.innerWidth < 768) {
new bootstrap.Modal(document.getElementById('addModal')).show();
return;
}
if (inlineAddVisible) return;
inlineAddVisible = true;
const container = document.getElementById('mappingsFlatList');
const row = document.createElement('div');
row.id = 'inlineAddRow';
row.className = 'flat-row';
row.style.background = '#eff6ff';
row.style.gap = '0.5rem';
row.innerHTML = `
<input type="text" class="form-control form-control-sm" id="inlineSku" placeholder="SKU" style="width:140px">
<div class="position-relative" style="flex:1;min-width:0">
<input type="text" class="form-control form-control-sm" id="inlineCodmat" placeholder="Cauta CODMAT..." autocomplete="off">
<div class="autocomplete-dropdown d-none" id="inlineAcDropdown"></div>
<small class="text-muted" id="inlineSelected"></small>
</div>
<input type="number" class="form-control form-control-sm" id="inlineCantitate" value="1" step="0.001" min="0.001" style="width:70px" placeholder="Cant.">
<button class="btn btn-sm btn-success" onclick="saveInlineMapping()" title="Salveaza"><i class="bi bi-check-lg"></i></button>
<button class="btn btn-sm btn-outline-secondary" onclick="cancelInlineAdd()" title="Anuleaza"><i class="bi bi-x-lg"></i></button>
`;
container.insertBefore(row, container.firstChild);
document.getElementById('inlineSku').focus();
// Setup autocomplete for inline CODMAT
const input = document.getElementById('inlineCodmat');
const dropdown = document.getElementById('inlineAcDropdown');
const selected = document.getElementById('inlineSelected');
let inlineAcTimeout = null;
input.addEventListener('input', () => {
clearTimeout(inlineAcTimeout);
inlineAcTimeout = setTimeout(() => inlineAutocomplete(input, dropdown, selected), 250);
});
input.addEventListener('blur', () => {
setTimeout(() => dropdown.classList.add('d-none'), 200);
});
}
async function inlineAutocomplete(input, dropdown, selectedEl) {
const q = input.value;
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
try {
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
const data = await res.json();
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
dropdown.innerHTML = data.results.map(r =>
`<div class="autocomplete-item" onmousedown="inlineSelectArticle('${esc(r.codmat)}', '${esc(r.denumire)}${r.um ? ' (' + esc(r.um) + ')' : ''}')">
<span class="codmat">${esc(r.codmat)}</span> — <span class="denumire">${esc(r.denumire)}</span>${r.um ? ` <small class="text-muted">(${esc(r.um)})</small>` : ''}
</div>`
).join('');
dropdown.classList.remove('d-none');
} catch { dropdown.classList.add('d-none'); }
}
function inlineSelectArticle(codmat, label) {
document.getElementById('inlineCodmat').value = codmat;
document.getElementById('inlineSelected').textContent = label;
document.getElementById('inlineAcDropdown').classList.add('d-none');
}
async function saveInlineMapping() {
const sku = document.getElementById('inlineSku').value.trim();
const codmat = document.getElementById('inlineCodmat').value.trim();
const cantitate = parseFloat(document.getElementById('inlineCantitate').value) || 1;
if (!sku) { alert('SKU este obligatoriu'); return; }
if (!codmat) { alert('CODMAT este obligatoriu'); return; }
try {
const res = await fetch('/api/mappings', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ sku, codmat, cantitate_roa: cantitate })
});
const data = await res.json();
if (data.success) {
cancelInlineAdd();
loadMappings();
} else if (res.status === 409) {
handleMappingConflict(data);
} else {
alert('Eroare: ' + (data.error || 'Unknown'));
}
} catch (err) {
alert('Eroare: ' + err.message);
}
}
function cancelInlineAdd() {
const row = document.getElementById('inlineAddRow');
if (row) row.remove();
inlineAddVisible = false;
}
// ── Toggle Active with Toast Undo ────────────────
async function toggleActive(sku, codmat, currentActive) {
const newActive = currentActive ? 0 : 1;
try {
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ activ: newActive })
});
const data = await res.json();
if (!data.success) return;
loadMappings();
// Show toast with undo
const action = newActive ? 'activata' : 'dezactivata';
showUndoToast(`Mapare ${sku} \u2192 ${codmat} ${action}.`, () => {
fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ activ: currentActive })
}).then(() => loadMappings());
});
} catch (err) { alert('Eroare: ' + err.message); }
}
function showUndoToast(message, undoCallback) {
document.getElementById('toastMessage').textContent = message;
const undoBtn = document.getElementById('toastUndoBtn');
// Clone to remove old listeners
const newBtn = undoBtn.cloneNode(true);
undoBtn.parentNode.replaceChild(newBtn, undoBtn);
newBtn.id = 'toastUndoBtn';
if (undoCallback) {
newBtn.style.display = '';
newBtn.addEventListener('click', () => {
undoCallback();
const toastEl = document.getElementById('undoToast');
const inst = bootstrap.Toast.getInstance(toastEl);
if (inst) inst.hide();
});
} else {
newBtn.style.display = 'none';
}
const toast = new bootstrap.Toast(document.getElementById('undoToast'));
toast.show();
}
// ── Delete with Modal Confirmation ──────────────
let pendingDelete = null;
function initDeleteModal() {
const btn = document.getElementById('confirmDeleteBtn');
if (!btn) return;
btn.addEventListener('click', async () => {
if (!pendingDelete) return;
const { sku, codmat } = pendingDelete;
try {
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
method: 'DELETE'
});
const data = await res.json();
bootstrap.Modal.getInstance(document.getElementById('deleteConfirmModal')).hide();
if (data.success) loadMappings();
else alert('Eroare: ' + (data.error || 'Delete failed'));
} catch (err) {
bootstrap.Modal.getInstance(document.getElementById('deleteConfirmModal')).hide();
alert('Eroare: ' + err.message);
}
pendingDelete = null;
});
}
function deleteMappingConfirm(sku, codmat) {
pendingDelete = { sku, codmat };
document.getElementById('deleteSkuText').textContent = sku;
document.getElementById('deleteCodmatText').textContent = codmat;
new bootstrap.Modal(document.getElementById('deleteConfirmModal')).show();
}
// ── Restore Deleted ──────────────────────────────
async function restoreMapping(sku, codmat) {
try {
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}/restore`, {
method: 'POST'
});
const data = await res.json();
if (data.success) loadMappings();
else alert('Eroare: ' + (data.error || 'Restore failed'));
} catch (err) {
alert('Eroare: ' + err.message);
}
}
// ── CSV ──────────────────────────────────────────
async function importCsv() {
const fileInput = document.getElementById('csvFile');
if (!fileInput.files.length) { alert('Selecteaza un fisier CSV'); return; }
const formData = new FormData();
formData.append('file', fileInput.files[0]);
try {
const res = await fetch('/api/mappings/import-csv', { method: 'POST', body: formData });
const data = await res.json();
let msg = `${data.processed} mapări importate`;
if (data.skipped_no_codmat > 0) {
msg += `, ${data.skipped_no_codmat} rânduri fără CODMAT omise`;
}
let html = `<div class="alert alert-success">${msg}</div>`;
if (data.errors && data.errors.length > 0) {
html += `<div class="alert alert-warning">Erori (${data.errors.length}): <ul>${data.errors.map(e => `<li>${esc(e)}</li>`).join('')}</ul></div>`;
}
document.getElementById('importResult').innerHTML = html;
loadMappings();
} catch (err) {
document.getElementById('importResult').innerHTML = `<div class="alert alert-danger">${err.message}</div>`;
}
}
function exportCsv() { window.location.href = (window.ROOT_PATH || '') + '/api/mappings/export-csv'; }
function downloadTemplate() { window.location.href = (window.ROOT_PATH || '') + '/api/mappings/csv-template'; }
// ── Duplicate / Conflict handling ────────────────
function handleMappingConflict(data) {
const msg = data.error || 'Conflict la salvare';
if (data.can_restore) {
const restore = confirm(`${msg}\n\nDoriti sa restaurati maparea stearsa?`);
if (restore) {
// Find sku/codmat from the inline row or modal
const sku = (document.getElementById('inlineSku') || document.getElementById('inputSku'))?.value?.trim();
const codmat = (document.getElementById('inlineCodmat') || document.querySelector('.cl-codmat'))?.value?.trim();
if (sku && codmat) {
fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}/restore`, { method: 'POST' })
.then(r => r.json())
.then(d => {
if (d.success) { cancelInlineAdd(); loadMappings(); }
else alert('Eroare la restaurare: ' + (d.error || ''));
});
}
}
} else {
showUndoToast(msg, null);
// Show non-dismissible inline error
const warn = document.getElementById('pctWarning');
if (warn) { warn.textContent = msg; warn.style.display = ''; }
}
}

View File

@@ -1,281 +0,0 @@
let settAcTimeout = null;
document.addEventListener('DOMContentLoaded', async () => {
await loadDropdowns();
await loadSettings();
wireAutocomplete('settTransportCodmat', 'settTransportAc');
wireAutocomplete('settDiscountCodmat', 'settDiscountAc');
wireAutocomplete('settKitDiscountCodmat', 'settKitDiscountAc');
// Kit pricing mode radio toggle
document.querySelectorAll('input[name="kitPricingMode"]').forEach(r => {
r.addEventListener('change', () => {
document.getElementById('kitModeBFields').style.display =
document.getElementById('kitModeSeparate').checked ? '' : 'none';
});
});
// Catalog sync toggle
const catChk = document.getElementById('settCatalogSyncEnabled');
if (catChk) catChk.addEventListener('change', () => {
document.getElementById('catalogSyncOptions').style.display = catChk.checked ? '' : 'none';
});
});
async function loadDropdowns() {
try {
const [sectiiRes, politiciRes, gestiuniRes] = await Promise.all([
fetch('/api/settings/sectii'),
fetch('/api/settings/politici'),
fetch('/api/settings/gestiuni')
]);
const sectii = await sectiiRes.json();
const politici = await politiciRes.json();
const gestiuni = await gestiuniRes.json();
const gestContainer = document.getElementById('settGestiuniContainer');
if (gestContainer) {
gestContainer.innerHTML = '';
gestiuni.forEach(g => {
gestContainer.innerHTML += `<div class="form-check mb-0"><input class="form-check-input" type="checkbox" value="${escHtml(g.id)}" id="gestChk_${escHtml(g.id)}"><label class="form-check-label" for="gestChk_${escHtml(g.id)}">${escHtml(g.label)}</label></div>`;
});
if (gestiuni.length === 0) gestContainer.innerHTML = '<span class="text-muted small">Nicio gestiune disponibilă</span>';
}
const sectieEl = document.getElementById('settIdSectie');
if (sectieEl) {
sectieEl.innerHTML = '<option value="">— selectează secție —</option>';
sectii.forEach(s => {
sectieEl.innerHTML += `<option value="${escHtml(s.id)}">${escHtml(s.label)}</option>`;
});
}
const polEl = document.getElementById('settIdPol');
if (polEl) {
polEl.innerHTML = '<option value="">— selectează politică —</option>';
politici.forEach(p => {
polEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
});
}
const tPolEl = document.getElementById('settTransportIdPol');
if (tPolEl) {
tPolEl.innerHTML = '<option value="">— implicită —</option>';
politici.forEach(p => {
tPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
});
}
const dPolEl = document.getElementById('settDiscountIdPol');
if (dPolEl) {
dPolEl.innerHTML = '<option value="">— implicită —</option>';
politici.forEach(p => {
dPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
});
}
const pPolEl = document.getElementById('settIdPolProductie');
if (pPolEl) {
pPolEl.innerHTML = '<option value="">— fără politică producție —</option>';
politici.forEach(p => {
pPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
});
}
const kdPolEl = document.getElementById('settKitDiscountIdPol');
if (kdPolEl) {
kdPolEl.innerHTML = '<option value="">— implicită —</option>';
politici.forEach(p => {
kdPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
});
}
} catch (err) {
console.error('loadDropdowns error:', err);
}
}
async function loadSettings() {
try {
const res = await fetch('/api/settings');
const data = await res.json();
const el = (id) => document.getElementById(id);
if (el('settTransportCodmat')) el('settTransportCodmat').value = data.transport_codmat || '';
if (el('settTransportVat')) el('settTransportVat').value = data.transport_vat || '21';
if (el('settTransportIdPol')) el('settTransportIdPol').value = data.transport_id_pol || '';
if (el('settDiscountCodmat')) el('settDiscountCodmat').value = data.discount_codmat || '';
if (el('settDiscountVat')) el('settDiscountVat').value = data.discount_vat || '21';
if (el('settDiscountIdPol')) el('settDiscountIdPol').value = data.discount_id_pol || '';
if (el('settSplitDiscountVat')) el('settSplitDiscountVat').checked = data.split_discount_vat === "1";
if (el('settIdPol')) el('settIdPol').value = data.id_pol || '';
if (el('settIdPolProductie')) el('settIdPolProductie').value = data.id_pol_productie || '';
if (el('settIdSectie')) el('settIdSectie').value = data.id_sectie || '';
// Multi-gestiune checkboxes
const gestVal = data.id_gestiune || '';
if (gestVal) {
const selectedIds = gestVal.split(',').map(s => s.trim());
selectedIds.forEach(id => {
const chk = document.getElementById('gestChk_' + id);
if (chk) chk.checked = true;
});
}
if (el('settGomagApiKey')) el('settGomagApiKey').value = data.gomag_api_key || '';
if (el('settGomagApiShop')) el('settGomagApiShop').value = data.gomag_api_shop || '';
if (el('settGomagDaysBack')) el('settGomagDaysBack').value = data.gomag_order_days_back || '7';
if (el('settGomagLimit')) el('settGomagLimit').value = data.gomag_limit || '100';
if (el('settDashPollSeconds')) el('settDashPollSeconds').value = data.dashboard_poll_seconds || '5';
// Kit pricing
const kitMode = data.kit_pricing_mode || '';
document.querySelectorAll('input[name="kitPricingMode"]').forEach(r => {
r.checked = r.value === kitMode;
});
document.getElementById('kitModeBFields').style.display = kitMode === 'separate_line' ? '' : 'none';
if (el('settKitDiscountCodmat')) el('settKitDiscountCodmat').value = data.kit_discount_codmat || '';
if (el('settKitDiscountIdPol')) el('settKitDiscountIdPol').value = data.kit_discount_id_pol || '';
// Price sync
if (el('settPriceSyncEnabled')) el('settPriceSyncEnabled').checked = data.price_sync_enabled !== "0";
if (el('settCatalogSyncEnabled')) {
el('settCatalogSyncEnabled').checked = data.catalog_sync_enabled === "1";
document.getElementById('catalogSyncOptions').style.display = data.catalog_sync_enabled === "1" ? '' : 'none';
}
if (el('settPriceSyncSchedule')) el('settPriceSyncSchedule').value = data.price_sync_schedule || '';
// Load price sync status
try {
const psRes = await fetch('/api/price-sync/status');
const psData = await psRes.json();
const psEl = document.getElementById('settPriceSyncStatus');
if (psEl && psData.last_run) {
psEl.textContent = `Ultima: ${psData.last_run.finished_at || ''}${psData.last_run.updated || 0} actualizate din ${psData.last_run.matched || 0}`;
}
} catch {}
} catch (err) {
console.error('loadSettings error:', err);
}
}
async function saveSettings() {
const el = (id) => document.getElementById(id);
const payload = {
transport_codmat: el('settTransportCodmat')?.value?.trim() || '',
transport_vat: el('settTransportVat')?.value || '21',
transport_id_pol: el('settTransportIdPol')?.value?.trim() || '',
discount_codmat: el('settDiscountCodmat')?.value?.trim() || '',
discount_vat: el('settDiscountVat')?.value || '21',
discount_id_pol: el('settDiscountIdPol')?.value?.trim() || '',
split_discount_vat: el('settSplitDiscountVat')?.checked ? "1" : "",
id_pol: el('settIdPol')?.value?.trim() || '',
id_pol_productie: el('settIdPolProductie')?.value?.trim() || '',
id_sectie: el('settIdSectie')?.value?.trim() || '',
id_gestiune: Array.from(document.querySelectorAll('#settGestiuniContainer input:checked')).map(c => c.value).join(','),
gomag_api_key: el('settGomagApiKey')?.value?.trim() || '',
gomag_api_shop: el('settGomagApiShop')?.value?.trim() || '',
gomag_order_days_back: el('settGomagDaysBack')?.value?.trim() || '7',
gomag_limit: el('settGomagLimit')?.value?.trim() || '100',
dashboard_poll_seconds: el('settDashPollSeconds')?.value?.trim() || '5',
kit_pricing_mode: document.querySelector('input[name="kitPricingMode"]:checked')?.value || '',
kit_discount_codmat: el('settKitDiscountCodmat')?.value?.trim() || '',
kit_discount_id_pol: el('settKitDiscountIdPol')?.value?.trim() || '',
price_sync_enabled: el('settPriceSyncEnabled')?.checked ? "1" : "0",
catalog_sync_enabled: el('settCatalogSyncEnabled')?.checked ? "1" : "0",
price_sync_schedule: el('settPriceSyncSchedule')?.value || '',
gomag_products_url: '',
};
try {
const res = await fetch('/api/settings', {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload)
});
const data = await res.json();
const resultEl = document.getElementById('settSaveResult');
if (data.success) {
if (resultEl) { resultEl.textContent = 'Salvat!'; resultEl.style.color = '#16a34a'; }
setTimeout(() => { if (resultEl) resultEl.textContent = ''; }, 3000);
} else {
if (resultEl) { resultEl.textContent = 'Eroare: ' + JSON.stringify(data); resultEl.style.color = '#dc2626'; }
}
} catch (err) {
const resultEl = document.getElementById('settSaveResult');
if (resultEl) { resultEl.textContent = 'Eroare: ' + err.message; resultEl.style.color = '#dc2626'; }
}
}
async function startCatalogSync() {
const btn = document.getElementById('btnCatalogSync');
const status = document.getElementById('settPriceSyncStatus');
btn.disabled = true;
btn.innerHTML = '<span class="spinner-border spinner-border-sm"></span> Sincronizare...';
try {
const res = await fetch('/api/price-sync/start', { method: 'POST' });
const data = await res.json();
if (data.error) {
status.innerHTML = `<span class="text-danger">${escHtml(data.error)}</span>`;
btn.disabled = false;
btn.textContent = 'Sincronizează acum';
return;
}
// Poll status
const pollInterval = setInterval(async () => {
const sr = await fetch('/api/price-sync/status');
const sd = await sr.json();
if (sd.status === 'running') {
status.textContent = sd.phase_text || 'Sincronizare în curs...';
} else {
clearInterval(pollInterval);
btn.disabled = false;
btn.textContent = 'Sincronizează acum';
if (sd.last_run) status.textContent = `Ultima: ${sd.last_run.finished_at || ''}${sd.last_run.updated || 0} actualizate din ${sd.last_run.matched || 0}`;
}
}, 2000);
} catch (err) {
status.innerHTML = `<span class="text-danger">${escHtml(err.message)}</span>`;
btn.disabled = false;
btn.textContent = 'Sincronizează acum';
}
}
function wireAutocomplete(inputId, dropdownId) {
const input = document.getElementById(inputId);
const dropdown = document.getElementById(dropdownId);
if (!input || !dropdown) return;
input.addEventListener('input', () => {
clearTimeout(settAcTimeout);
settAcTimeout = setTimeout(async () => {
const q = input.value.trim();
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
try {
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
const data = await res.json();
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
dropdown.innerHTML = data.results.map(r =>
`<div class="autocomplete-item" onmousedown="settSelectArticle('${inputId}', '${dropdownId}', '${escHtml(r.codmat)}')">
<span class="codmat">${escHtml(r.codmat)}</span> &mdash; <span class="denumire">${escHtml(r.denumire)}</span>
</div>`
).join('');
dropdown.classList.remove('d-none');
} catch { dropdown.classList.add('d-none'); }
}, 250);
});
input.addEventListener('blur', () => {
setTimeout(() => dropdown.classList.add('d-none'), 200);
});
}
function settSelectArticle(inputId, dropdownId, codmat) {
document.getElementById(inputId).value = codmat;
document.getElementById(dropdownId).classList.add('d-none');
}
function escHtml(s) {
if (s == null) return '';
return String(s)
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;')
.replace(/'/g, '&#39;');
}

View File

@@ -1,376 +0,0 @@
// shared.js - Unified utilities for all pages
// ── Root path patch — prepend ROOT_PATH to all relative fetch calls ───────
(function() {
const _fetch = window.fetch.bind(window);
window.fetch = function(url, ...args) {
if (typeof url === 'string' && url.startsWith('/') && window.ROOT_PATH) {
url = window.ROOT_PATH + url;
}
return _fetch(url, ...args);
};
})();
// ── HTML escaping ─────────────────────────────────
function esc(s) {
if (s == null) return '';
return String(s)
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;')
.replace(/'/g, '&#39;');
}
// ── Date formatting ───────────────────────────────
function fmtDate(dateStr, includeSeconds) {
if (!dateStr) return '-';
try {
const d = new Date(dateStr);
const hasTime = dateStr.includes(':');
if (hasTime) {
const opts = { day: '2-digit', month: '2-digit', year: 'numeric', hour: '2-digit', minute: '2-digit' };
if (includeSeconds) opts.second = '2-digit';
return d.toLocaleString('ro-RO', opts);
}
return d.toLocaleDateString('ro-RO', { day: '2-digit', month: '2-digit', year: 'numeric' });
} catch { return dateStr; }
}
// ── Unified Pagination ────────────────────────────
/**
* Renders a full pagination bar with First/Prev/numbers/Next/Last.
* @param {number} currentPage
* @param {number} totalPages
* @param {string} goToFnName - name of global function to call with page number
* @param {object} [opts] - optional: { perPage, perPageFn, perPageOptions }
* @returns {string} HTML string
*/
function renderUnifiedPagination(currentPage, totalPages, goToFnName, opts) {
if (totalPages <= 1 && !(opts && opts.perPage)) {
return '';
}
let html = '<div class="d-flex align-items-center gap-2 flex-wrap">';
// Per-page selector
if (opts && opts.perPage && opts.perPageFn) {
const options = opts.perPageOptions || [25, 50, 100, 250];
html += `<label class="per-page-label">Per pagina: <select class="select-compact ms-1" onchange="${opts.perPageFn}(this.value)">`;
options.forEach(v => {
html += `<option value="${v}"${v === opts.perPage ? ' selected' : ''}>${v}</option>`;
});
html += '</select></label>';
}
if (totalPages <= 1) {
html += '</div>';
return html;
}
html += '<div class="pagination-bar">';
// First
html += `<button class="page-btn" onclick="${goToFnName}(1)" ${currentPage <= 1 ? 'disabled' : ''}>&laquo;</button>`;
// Prev
html += `<button class="page-btn" onclick="${goToFnName}(${currentPage - 1})" ${currentPage <= 1 ? 'disabled' : ''}>&lsaquo;</button>`;
// Page numbers with ellipsis
const range = 2;
let pages = [];
for (let i = 1; i <= totalPages; i++) {
if (i === 1 || i === totalPages || (i >= currentPage - range && i <= currentPage + range)) {
pages.push(i);
}
}
let lastP = 0;
pages.forEach(p => {
if (lastP && p - lastP > 1) {
html += `<span class="page-btn disabled page-ellipsis">…</span>`;
}
html += `<button class="page-btn page-number${p === currentPage ? ' active' : ''}" onclick="${goToFnName}(${p})">${p}</button>`;
lastP = p;
});
// Next
html += `<button class="page-btn" onclick="${goToFnName}(${currentPage + 1})" ${currentPage >= totalPages ? 'disabled' : ''}>&rsaquo;</button>`;
// Last
html += `<button class="page-btn" onclick="${goToFnName}(${totalPages})" ${currentPage >= totalPages ? 'disabled' : ''}>&raquo;</button>`;
html += '</div></div>';
return html;
}
// ── Context Menu ──────────────────────────────────
let _activeContextMenu = null;
function closeAllContextMenus() {
if (_activeContextMenu) {
_activeContextMenu.remove();
_activeContextMenu = null;
}
}
document.addEventListener('click', closeAllContextMenus);
document.addEventListener('keydown', (e) => {
if (e.key === 'Escape') closeAllContextMenus();
});
/**
* Show a context menu at the given position.
* @param {number} x - clientX
* @param {number} y - clientY
* @param {Array} items - [{label, action, danger}]
*/
function showContextMenu(x, y, items) {
closeAllContextMenus();
const menu = document.createElement('div');
menu.className = 'context-menu';
items.forEach(item => {
const btn = document.createElement('button');
btn.className = 'context-menu-item' + (item.danger ? ' text-danger' : '');
btn.textContent = item.label;
btn.addEventListener('click', (e) => {
e.stopPropagation();
closeAllContextMenus();
item.action();
});
menu.appendChild(btn);
});
document.body.appendChild(menu);
_activeContextMenu = menu;
// Position menu, keeping it within viewport
const rect = menu.getBoundingClientRect();
const vw = window.innerWidth;
const vh = window.innerHeight;
let left = x;
let top = y;
if (left + 160 > vw) left = vw - 165;
if (top + rect.height > vh) top = vh - rect.height - 5;
menu.style.left = left + 'px';
menu.style.top = top + 'px';
}
/**
* Wire right-click on desktop + three-dots button on mobile for a table.
* @param {string} rowSelector - CSS selector for clickable rows
* @param {function} menuItemsFn - called with row element, returns [{label, action, danger}]
*/
function initContextMenus(rowSelector, menuItemsFn) {
document.addEventListener('contextmenu', (e) => {
const row = e.target.closest(rowSelector);
if (!row) return;
e.preventDefault();
showContextMenu(e.clientX, e.clientY, menuItemsFn(row));
});
document.addEventListener('click', (e) => {
const trigger = e.target.closest('.context-menu-trigger');
if (!trigger) return;
const row = trigger.closest(rowSelector);
if (!row) return;
e.stopPropagation();
const rect = trigger.getBoundingClientRect();
showContextMenu(rect.left, rect.bottom + 2, menuItemsFn(row));
});
}
// ── Mobile segmented control ─────────────────────
/**
* Render a Bootstrap btn-group segmented control for mobile.
* @param {string} containerId - ID of the container div
* @param {Array} pills - [{label, count, colorClass, value, active}]
* @param {function} onSelect - callback(value)
*/
function renderMobileSegmented(containerId, pills, onSelect) {
const container = document.getElementById(containerId);
if (!container) return;
const btnStyle = 'font-size:0.75rem;height:32px;white-space:nowrap;display:inline-flex;align-items:center;justify-content:center;gap:0.25rem;flex:1;padding:0 0.25rem';
container.innerHTML = `<div class="btn-group btn-group-sm w-100">${pills.map(p => {
const cls = p.active ? 'btn btn-primary' : 'btn btn-outline-secondary';
const countColor = (!p.active && p.colorClass) ? ` class="${p.colorClass}"` : '';
return `<button type="button" class="${cls}" style="${btnStyle}" data-seg-value="${esc(p.value)}">${esc(p.label)} <b${countColor}>${p.count}</b></button>`;
}).join('')}</div>`;
container.querySelectorAll('[data-seg-value]').forEach(btn => {
btn.addEventListener('click', () => onSelect(btn.dataset.segValue));
});
}
// ── Shared Quick Map Modal ────────────────────────
let _qmOnSave = null;
let _qmAcTimeout = null;
/**
* Open the shared quick-map modal.
* @param {object} opts
* @param {string} opts.sku
* @param {string} opts.productName
* @param {Array} [opts.prefill] - [{codmat, cantitate, denumire}]
* @param {boolean}[opts.isDirect] - true if SKU=CODMAT direct
* @param {object} [opts.directInfo] - {codmat, denumire} for direct SKU info
* @param {function} opts.onSave - callback(sku, mappings) after successful save
*/
function openQuickMap(opts) {
_qmOnSave = opts.onSave || null;
document.getElementById('qmSku').textContent = opts.sku;
document.getElementById('qmProductName').textContent = opts.productName || '-';
document.getElementById('qmPctWarning').style.display = 'none';
const container = document.getElementById('qmCodmatLines');
container.innerHTML = '';
const directInfo = document.getElementById('qmDirectInfo');
const saveBtn = document.getElementById('qmSaveBtn');
if (opts.isDirect && opts.directInfo) {
if (directInfo) {
directInfo.innerHTML = `<i class="bi bi-info-circle"></i> SKU = CODMAT direct in nomenclator (<code>${esc(opts.directInfo.codmat)}</code> — ${esc(opts.directInfo.denumire || '')}).<br><small class="text-muted">Poti suprascrie cu un alt CODMAT daca e necesar (ex: reambalare).</small>`;
directInfo.style.display = '';
}
if (saveBtn) saveBtn.textContent = 'Suprascrie mapare';
addQmCodmatLine();
} else {
if (directInfo) directInfo.style.display = 'none';
if (saveBtn) saveBtn.textContent = 'Salveaza';
if (opts.prefill && opts.prefill.length > 0) {
opts.prefill.forEach(d => addQmCodmatLine({ codmat: d.codmat, cantitate: d.cantitate, denumire: d.denumire }));
} else {
addQmCodmatLine();
}
}
new bootstrap.Modal(document.getElementById('quickMapModal')).show();
}
function addQmCodmatLine(prefill) {
const container = document.getElementById('qmCodmatLines');
const idx = container.children.length;
const codmatVal = prefill?.codmat || '';
const cantVal = prefill?.cantitate || 1;
const denumireVal = prefill?.denumire || '';
const div = document.createElement('div');
div.className = 'qm-line';
div.innerHTML = `
<div class="qm-row">
<div class="qm-codmat-wrap position-relative">
<input type="text" class="form-control form-control-sm qm-codmat" placeholder="CODMAT..." autocomplete="off" value="${esc(codmatVal)}">
<div class="autocomplete-dropdown d-none qm-ac-dropdown"></div>
</div>
<input type="number" class="form-control form-control-sm qm-cantitate" value="${cantVal}" step="0.001" min="0.001" title="Cantitate ROA" style="width:70px">
${idx > 0 ? `<button type="button" class="btn btn-sm btn-outline-danger qm-rm-btn" onclick="this.closest('.qm-line').remove()"><i class="bi bi-x"></i></button>` : '<span style="width:30px"></span>'}
</div>
<div class="qm-selected text-muted" style="font-size:0.75rem;padding-left:2px">${esc(denumireVal)}</div>
`;
container.appendChild(div);
const input = div.querySelector('.qm-codmat');
const dropdown = div.querySelector('.qm-ac-dropdown');
const selected = div.querySelector('.qm-selected');
input.addEventListener('input', () => {
clearTimeout(_qmAcTimeout);
_qmAcTimeout = setTimeout(() => _qmAutocomplete(input, dropdown, selected), 250);
});
input.addEventListener('blur', () => {
setTimeout(() => dropdown.classList.add('d-none'), 200);
});
}
async function _qmAutocomplete(input, dropdown, selectedEl) {
const q = input.value;
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
try {
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
const data = await res.json();
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
dropdown.innerHTML = data.results.map(r =>
`<div class="autocomplete-item" onmousedown="_qmSelectArticle(this, '${esc(r.codmat)}', '${esc(r.denumire)}${r.um ? ' (' + esc(r.um) + ')' : ''}')">
<span class="codmat">${esc(r.codmat)}</span> &mdash; <span class="denumire">${esc(r.denumire)}</span>${r.um ? ` <small class="text-muted">(${esc(r.um)})</small>` : ''}
</div>`
).join('');
dropdown.classList.remove('d-none');
} catch { dropdown.classList.add('d-none'); }
}
function _qmSelectArticle(el, codmat, label) {
const line = el.closest('.qm-line');
line.querySelector('.qm-codmat').value = codmat;
line.querySelector('.qm-selected').textContent = label;
line.querySelector('.qm-ac-dropdown').classList.add('d-none');
}
async function saveQuickMapping() {
const lines = document.querySelectorAll('#qmCodmatLines .qm-line');
const mappings = [];
for (const line of lines) {
const codmat = line.querySelector('.qm-codmat').value.trim();
const cantitate = parseFloat(line.querySelector('.qm-cantitate').value) || 1;
if (!codmat) continue;
mappings.push({ codmat, cantitate_roa: cantitate });
}
if (mappings.length === 0) { alert('Selecteaza cel putin un CODMAT'); return; }
const sku = document.getElementById('qmSku').textContent;
try {
let res;
if (mappings.length === 1) {
res = await fetch('/api/mappings', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ sku, codmat: mappings[0].codmat, cantitate_roa: mappings[0].cantitate_roa })
});
} else {
res = await fetch('/api/mappings/batch', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ sku, mappings })
});
}
const data = await res.json();
if (data.success) {
bootstrap.Modal.getInstance(document.getElementById('quickMapModal')).hide();
if (_qmOnSave) _qmOnSave(sku, mappings);
} else {
alert('Eroare: ' + (data.error || 'Unknown'));
}
} catch (err) {
alert('Eroare: ' + err.message);
}
}
// ── Dot helper ────────────────────────────────────
function statusDot(status) {
switch ((status || '').toUpperCase()) {
case 'IMPORTED':
case 'ALREADY_IMPORTED':
case 'COMPLETED':
case 'RESOLVED':
return '<span class="dot dot-green"></span>';
case 'SKIPPED':
case 'UNRESOLVED':
case 'INCOMPLETE':
return '<span class="dot dot-yellow"></span>';
case 'ERROR':
case 'FAILED':
return '<span class="dot dot-red"></span>';
case 'CANCELLED':
case 'DELETED_IN_ROA':
return '<span class="dot dot-gray"></span>';
default:
return '<span class="dot dot-gray"></span>';
}
}

View File

@@ -1,67 +0,0 @@
<!DOCTYPE html>
<html lang="ro" style="color-scheme: light">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>{% block title %}GoMag Import Manager{% endblock %}</title>
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
<link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.2/font/bootstrap-icons.css" rel="stylesheet">
{% set rp = request.scope.get('root_path', '') %}
<link href="{{ rp }}/static/css/style.css?v=17" rel="stylesheet">
</head>
<body>
<!-- Top Navbar -->
<nav class="top-navbar">
<div class="navbar-brand">GoMag Import</div>
<div class="navbar-links">
<a href="{{ rp }}/" class="nav-tab {% block nav_dashboard %}{% endblock %}"><span class="d-none d-md-inline">Dashboard</span><span class="d-md-none">Acasa</span></a>
<a href="{{ rp }}/mappings" class="nav-tab {% block nav_mappings %}{% endblock %}"><span class="d-none d-md-inline">Mapari SKU</span><span class="d-md-none">Mapari</span></a>
<a href="{{ rp }}/missing-skus" class="nav-tab {% block nav_missing %}{% endblock %}"><span class="d-none d-md-inline">SKU-uri Lipsa</span><span class="d-md-none">Lipsa</span></a>
<a href="{{ rp }}/logs" class="nav-tab {% block nav_logs %}{% endblock %}"><span class="d-none d-md-inline">Jurnale Import</span><span class="d-md-none">Jurnale</span></a>
<a href="{{ rp }}/settings" class="nav-tab {% block nav_settings %}{% endblock %}"><span class="d-none d-md-inline">Setari</span><span class="d-md-none">Setari</span></a>
</div>
</nav>
<!-- Main content -->
<main class="main-content">
{% block content %}{% endblock %}
</main>
<!-- Shared Quick Map Modal -->
<div class="modal fade" id="quickMapModal" tabindex="-1" data-bs-backdrop="static">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Mapeaza SKU: <code id="qmSku"></code></h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div style="margin-bottom:8px; font-size:0.85rem">
<small class="text-muted">Produs:</small> <strong id="qmProductName"></strong>
</div>
<div class="qm-row" style="font-size:0.7rem; color:#9ca3af; padding:0 0 2px">
<span style="flex:1">CODMAT</span>
<span style="width:70px">Cant.</span>
<span style="width:30px"></span>
</div>
<div id="qmCodmatLines"></div>
<button type="button" class="btn btn-sm btn-outline-secondary mt-1" onclick="addQmCodmatLine()" style="font-size:0.8rem; padding:2px 10px">
+ CODMAT
</button>
<div id="qmDirectInfo" class="alert alert-info mt-2" style="display:none; font-size:0.85rem; padding:8px 12px;"></div>
<div id="qmPctWarning" class="text-danger mt-2" style="display:none;"></div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
<button type="button" class="btn btn-primary" id="qmSaveBtn" onclick="saveQuickMapping()">Salveaza</button>
</div>
</div>
</div>
</div>
<script>window.ROOT_PATH = "{{ rp }}";</script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
<script src="{{ rp }}/static/js/shared.js?v=12"></script>
{% block scripts %}{% endblock %}
</body>
</html>

View File

@@ -1,172 +0,0 @@
{% extends "base.html" %}
{% block title %}Dashboard - GoMag Import{% endblock %}
{% block nav_dashboard %}active{% endblock %}
{% block content %}
<h4 class="mb-4">Panou de Comanda</h4>
<!-- Sync Card (unified two-row panel) -->
<div class="sync-card">
<!-- TOP ROW: Status + Controls -->
<div class="sync-card-controls">
<span id="syncStatusDot" class="sync-status-dot idle"></span>
<span id="syncStatusText" class="text-secondary">Inactiv</span>
<div class="d-flex align-items-center gap-2">
<label class="d-flex align-items-center gap-1 text-muted">
Auto:
<input type="checkbox" id="schedulerToggle" class="cursor-pointer" onchange="toggleScheduler()">
</label>
<select id="schedulerInterval" class="select-compact" onchange="updateSchedulerInterval()">
<option value="1">1 min</option>
<option value="3">3 min</option>
<option value="5">5 min</option>
<option value="10" selected>10 min</option>
<option value="30">30 min</option>
</select>
<button id="syncStartBtn" class="btn btn-sm btn-primary" onclick="startSync()">&#9654; Start Sync</button>
</div>
</div>
<div class="sync-card-divider"></div>
<!-- BOTTOM ROW: Last sync info (clickable → jurnal) -->
<div class="sync-card-info" id="lastSyncRow" role="button" tabindex="0" title="Ver jurnal sync">
<span id="lastSyncDate" class="fw-medium">&#8212;</span>
<span id="lastSyncDuration" class="text-muted">&#8212;</span>
<span id="lastSyncCounts">&#8212;</span>
<span id="lastSyncStatus">&#8212;</span>
<span class="ms-auto small text-muted">&#8599; jurnal</span>
</div>
<!-- LIVE PROGRESS (shown only when sync is running) -->
<div class="sync-card-progress" id="syncProgressArea" style="display:none;">
<span class="sync-live-dot"></span>
<span id="syncProgressText">Se proceseaza...</span>
</div>
</div>
<!-- Orders Table -->
<div class="card mb-4">
<div class="card-header">
<span>Comenzi</span>
</div>
<div class="card-body py-2 px-3">
<div class="filter-bar" id="ordersFilterBar">
<!-- Period dropdown -->
<select id="periodSelect" class="select-compact">
<option value="1">1 zi</option>
<option value="2">2 zile</option>
<option value="3">3 zile</option>
<option value="7" selected>7 zile</option>
<option value="30">30 zile</option>
<option value="90">3 luni</option>
<option value="0">Toate</option>
<option value="custom">Perioada personalizata...</option>
</select>
<!-- Custom date range (hidden until 'custom' selected) -->
<div class="period-custom-range" id="customRangeInputs">
<input type="date" id="periodStart" class="select-compact">
<span>&#8212;</span>
<input type="date" id="periodEnd" class="select-compact">
</div>
<input type="search" id="orderSearch" placeholder="Cauta comanda, client..." class="search-input">
<!-- Status pills -->
<button class="filter-pill active d-none d-md-inline-flex" data-status="all">Toate <span class="filter-count fc-neutral" id="cntAll">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-status="IMPORTED">Importat <span class="filter-count fc-green" id="cntImp">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-status="SKIPPED">Omise <span class="filter-count fc-yellow" id="cntSkip">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-status="ERROR">Erori <span class="filter-count fc-red" id="cntErr">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-status="INVOICED">Facturate <span class="filter-count fc-green" id="cntFact">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-status="UNINVOICED">Nefacturate <span class="filter-count fc-red" id="cntNef">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-status="CANCELLED">Anulate <span class="filter-count fc-dark" id="cntCanc">0</span></button>
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" id="btnRefreshInvoices" onclick="refreshInvoices()" title="Actualizeaza status facturi din Oracle">&#8635;</button>
</div>
<div class="d-md-none mb-2 d-flex align-items-center gap-2">
<div class="flex-grow-1" id="dashMobileSeg"></div>
<button class="btn btn-sm btn-outline-secondary" id="btnRefreshInvoicesMobile" onclick="refreshInvoices()" title="Actualizeaza facturi" style="padding:4px 8px; font-size:1rem; line-height:1">&#8635;</button>
</div>
</div>
<div id="dashPaginationTop" class="pag-strip"></div>
<div class="card-body p-0">
<div id="dashMobileList" class="mobile-list"></div>
<div class="table-responsive">
<table class="table table-hover mb-0">
<thead>
<tr>
<th style="width:24px"></th>
<th class="sortable" onclick="dashSortBy('order_date')">Data <span class="sort-icon" data-col="order_date"></span></th>
<th class="sortable" onclick="dashSortBy('customer_name')">Client <span class="sort-icon" data-col="customer_name"></span></th>
<th class="sortable" onclick="dashSortBy('order_number')">Nr Comanda <span class="sort-icon" data-col="order_number"></span></th>
<th class="sortable" onclick="dashSortBy('items_count')">Art. <span class="sort-icon" data-col="items_count"></span></th>
<th class="text-end">Transport</th>
<th class="text-end">Discount</th>
<th class="text-end">Total</th>
<th style="width:28px" title="Facturat">F</th>
</tr>
</thead>
<tbody id="dashOrdersBody">
<tr><td colspan="9" class="text-center text-muted py-3">Se incarca...</td></tr>
</tbody>
</table>
</div>
</div>
<div id="dashPagination" class="pag-strip pag-strip-bottom"></div>
</div>
<!-- Order Detail Modal -->
<div class="modal fade" id="orderDetailModal" tabindex="-1">
<div class="modal-dialog modal-lg">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Comanda <code id="detailOrderNumber"></code></h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div class="row mb-3">
<div class="col-md-6">
<small class="text-muted">Client:</small> <strong id="detailCustomer"></strong><br>
<small class="text-muted">Data comanda:</small> <span id="detailDate"></span><br>
<small class="text-muted">Status:</small> <span id="detailStatus"></span>
</div>
<div class="col-md-6">
<small class="text-muted">ID Comanda ROA:</small> <span id="detailIdComanda">-</span><br>
<small class="text-muted">ID Partener:</small> <span id="detailIdPartener">-</span><br>
<small class="text-muted">ID Adr. Facturare:</small> <span id="detailIdAdresaFact">-</span><br>
<small class="text-muted">ID Adr. Livrare:</small> <span id="detailIdAdresaLivr">-</span>
<div id="detailInvoiceInfo" style="display:none; margin-top:4px;">
<small class="text-muted">Factura:</small> <span id="detailInvoiceNumber"></span>
<span class="ms-2"><small class="text-muted">din</small> <span id="detailInvoiceDate"></span></span>
</div>
</div>
</div>
<div class="table-responsive d-none d-md-block">
<table class="table table-sm table-bordered mb-0">
<thead class="table-light">
<tr>
<th>SKU</th>
<th>Produs</th>
<th>CODMAT</th>
<th class="text-end">Cant.</th>
<th class="text-end">Pret</th>
<th class="text-end">TVA%</th>
<th class="text-end">Valoare</th>
</tr>
</thead>
<tbody id="detailItemsBody">
</tbody>
</table>
<div id="detailReceipt" class="d-flex flex-wrap gap-2 mt-1 justify-content-end"></div>
</div>
<div class="d-md-none" id="detailItemsMobile"></div>
<div id="detailReceiptMobile" class="d-flex flex-wrap gap-2 mt-1 d-md-none justify-content-end"></div>
<div id="detailError" class="alert alert-danger mt-3" style="display:none;"></div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Inchide</button>
</div>
</div>
</div>
</div>
<!-- Quick Map Modal (used from order detail) -->
{% endblock %}
{% block scripts %}
<script src="{{ request.scope.get('root_path', '') }}/static/js/dashboard.js?v=25"></script>
{% endblock %}

View File

@@ -1,160 +0,0 @@
{% extends "base.html" %}
{% block title %}Jurnale Import - GoMag Import{% endblock %}
{% block nav_logs %}active{% endblock %}
{% block content %}
<h4 class="mb-4">Jurnale Import</h4>
<!-- Sync Run Selector + Status + Controls (single card) -->
<div class="card mb-3">
<div class="card-body py-2">
<!-- Desktop layout -->
<div class="d-none d-md-flex align-items-center gap-3 flex-wrap">
<label class="form-label mb-0 fw-bold text-nowrap">Sync Run:</label>
<select class="form-select form-select-sm" id="runsDropdown" onchange="selectRun(this.value)" style="max-width:400px">
<option value="">Se incarca...</option>
</select>
<button class="btn btn-sm btn-outline-secondary text-nowrap" onclick="loadRuns()" title="Reincarca lista"><i class="bi bi-arrow-clockwise"></i></button>
<span id="logStatusBadge" style="font-weight:600">-</span>
<div class="form-check form-switch mb-0">
<input class="form-check-input" type="checkbox" id="autoRefreshToggle" checked>
<label class="form-check-label small" for="autoRefreshToggle">Auto-refresh</label>
</div>
<button class="btn btn-sm btn-outline-secondary" id="btnShowTextLog" onclick="toggleTextLog()">
<i class="bi bi-file-text"></i> Log text brut
</button>
</div>
<!-- Mobile compact layout -->
<div class="d-flex d-md-none align-items-center gap-2">
<span id="mobileRunDot" class="sync-status-dot idle" style="width:8px;height:8px"></span>
<select class="form-select form-select-sm flex-grow-1" id="runsDropdownMobile" onchange="selectRun(this.value)" style="font-size:0.8rem">
<option value="">Se incarca...</option>
</select>
<button class="btn btn-sm btn-outline-secondary" onclick="loadRuns()" title="Reincarca"><i class="bi bi-arrow-clockwise"></i></button>
<div class="dropdown">
<button class="btn btn-sm btn-outline-secondary" data-bs-toggle="dropdown"><i class="bi bi-three-dots-vertical"></i></button>
<ul class="dropdown-menu dropdown-menu-end">
<li>
<label class="dropdown-item d-flex align-items-center gap-2">
<input class="form-check-input" type="checkbox" id="autoRefreshToggleMobile" checked> Auto-refresh
</label>
</li>
<li><a class="dropdown-item" href="#" onclick="toggleTextLog();return false"><i class="bi bi-file-text me-1"></i> Log text brut</a></li>
</ul>
</div>
</div>
</div>
</div>
<!-- Detail Viewer (shown when run selected) -->
<div id="logViewerSection" style="display:none;">
<!-- Filter pills -->
<div class="filter-bar mb-3" id="orderFilterPills">
<button class="filter-pill active d-none d-md-inline-flex" data-log-status="all">Toate <span class="filter-count fc-neutral" id="countAll">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-log-status="IMPORTED">Importate <span class="filter-count fc-green" id="countImported">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-log-status="ALREADY_IMPORTED">Deja imp. <span class="filter-count fc-blue" id="countAlreadyImported">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-log-status="SKIPPED">Omise <span class="filter-count fc-yellow" id="countSkipped">0</span></button>
<button class="filter-pill d-none d-md-inline-flex" data-log-status="ERROR">Erori <span class="filter-count fc-red" id="countError">0</span></button>
</div>
<div class="d-md-none mb-2" id="logsMobileSeg"></div>
<!-- Orders table -->
<div class="card mb-3">
<div id="ordersPaginationTop" class="pag-strip"></div>
<div class="card-body p-0">
<div id="logsMobileList" class="mobile-list"></div>
<div class="table-responsive">
<table class="table table-hover mb-0">
<thead>
<tr>
<th style="width:24px"></th>
<th>#</th>
<th class="sortable" onclick="sortOrdersBy('order_date')">Data comanda <span class="sort-icon" data-col="order_date"></span></th>
<th class="sortable" onclick="sortOrdersBy('order_number')">Nr. comanda <span class="sort-icon" data-col="order_number"></span></th>
<th class="sortable" onclick="sortOrdersBy('customer_name')">Client <span class="sort-icon" data-col="customer_name"></span></th>
<th class="sortable" onclick="sortOrdersBy('items_count')">Articole <span class="sort-icon" data-col="items_count"></span></th>
<th class="text-end">Transport</th>
<th class="text-end">Discount</th>
<th class="text-end">Total</th>
</tr>
</thead>
<tbody id="runOrdersBody">
<tr><td colspan="9" class="text-center text-muted py-3">Selecteaza un sync run</td></tr>
</tbody>
</table>
</div>
</div>
<div id="ordersPagination" class="pag-strip pag-strip-bottom"></div>
</div>
<!-- Collapsible text log -->
<div id="textLogSection" style="display:none;">
<div class="card">
<div class="card-header">Log text brut</div>
<pre class="log-viewer" id="logContent">Se incarca...</pre>
</div>
</div>
</div>
<!-- Order Detail Modal -->
<div class="modal fade" id="orderDetailModal" tabindex="-1">
<div class="modal-dialog modal-lg">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Comanda <code id="detailOrderNumber"></code></h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div class="row mb-3">
<div class="col-md-6">
<small class="text-muted">Client:</small> <strong id="detailCustomer"></strong><br>
<small class="text-muted">Data comanda:</small> <span id="detailDate"></span><br>
<small class="text-muted">Status:</small> <span id="detailStatus"></span>
</div>
<div class="col-md-6">
<small class="text-muted">ID Comanda ROA:</small> <span id="detailIdComanda">-</span><br>
<small class="text-muted">ID Partener:</small> <span id="detailIdPartener">-</span><br>
<small class="text-muted">ID Adr. Facturare:</small> <span id="detailIdAdresaFact">-</span><br>
<small class="text-muted">ID Adr. Livrare:</small> <span id="detailIdAdresaLivr">-</span>
</div>
</div>
<div id="detailTotals" class="d-flex gap-3 mb-2 flex-wrap" style="font-size:0.875rem">
<span><small class="text-muted">Valoare:</small> <strong id="detailItemsTotal">-</strong></span>
<span id="detailDiscountWrap"><small class="text-muted">Discount:</small> <strong id="detailDiscount">-</strong></span>
<span id="detailDeliveryWrap"><small class="text-muted">Transport:</small> <strong id="detailDeliveryCost">-</strong></span>
<span><small class="text-muted">Total:</small> <strong id="detailOrderTotal">-</strong></span>
</div>
<div class="table-responsive d-none d-md-block">
<table class="table table-sm table-bordered mb-0">
<thead class="table-light">
<tr>
<th>SKU</th>
<th>Produs</th>
<th>CODMAT</th>
<th>Cant.</th>
<th>Pret</th>
<th class="text-end">Valoare</th>
</tr>
</thead>
<tbody id="detailItemsBody">
</tbody>
</table>
</div>
<div class="d-md-none" id="detailItemsMobile"></div>
<div id="detailError" class="alert alert-danger mt-3" style="display:none;"></div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Inchide</button>
</div>
</div>
</div>
</div>
<!-- Quick Map Modal (used from order detail) -->
<!-- Hidden field for pre-selected run from URL/server -->
<input type="hidden" id="preselectedRun" value="{{ selected_run }}">
{% endblock %}
{% block scripts %}
<script src="{{ request.scope.get('root_path', '') }}/static/js/logs.js?v=11"></script>
{% endblock %}

View File

@@ -1,154 +0,0 @@
{% extends "base.html" %}
{% block title %}Mapari SKU - GoMag Import{% endblock %}
{% block nav_mappings %}active{% endblock %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-4">
<h4 class="mb-0">Mapari SKU</h4>
<div class="d-flex align-items-center gap-2">
<!-- Desktop buttons -->
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" onclick="downloadTemplate()"><i class="bi bi-file-earmark-arrow-down"></i> Template CSV</button>
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" onclick="exportCsv()"><i class="bi bi-download"></i> Export CSV</button>
<button class="btn btn-sm btn-outline-primary d-none d-md-inline-flex" data-bs-toggle="modal" data-bs-target="#importModal"><i class="bi bi-upload"></i> Import CSV</button>
<button class="btn btn-sm btn-primary" onclick="showInlineAddRow()"><i class="bi bi-plus-lg"></i> <span class="d-none d-md-inline">Adauga Mapare</span><span class="d-md-none">Mapare</span></button>
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" data-bs-toggle="modal" data-bs-target="#addModal"><i class="bi bi-box-arrow-up-right"></i> Formular complet</button>
<!-- Mobile ⋯ dropdown -->
<div class="dropdown d-md-none">
<button class="btn btn-sm btn-outline-secondary" type="button" data-bs-toggle="dropdown" aria-expanded="false"><i class="bi bi-three-dots-vertical"></i></button>
<ul class="dropdown-menu dropdown-menu-end">
<li><a class="dropdown-item" href="#" onclick="downloadTemplate();return false"><i class="bi bi-file-earmark-arrow-down me-1"></i> Template CSV</a></li>
<li><a class="dropdown-item" href="#" onclick="exportCsv();return false"><i class="bi bi-download me-1"></i> Export CSV</a></li>
<li><a class="dropdown-item" href="#" data-bs-toggle="modal" data-bs-target="#importModal"><i class="bi bi-upload me-1"></i> Import CSV</a></li>
<li><a class="dropdown-item" href="#" data-bs-toggle="modal" data-bs-target="#addModal"><i class="bi bi-box-arrow-up-right me-1"></i> Formular complet</a></li>
</ul>
</div>
</div>
</div>
<!-- Search -->
<div class="card mb-3">
<div class="card-body py-2">
<div class="input-group">
<span class="input-group-text"><i class="bi bi-search"></i></span>
<input type="text" class="form-control" id="searchInput" placeholder="Cauta SKU, CODMAT sau denumire..." oninput="debounceSearch()">
</div>
</div>
</div>
<!-- Filter controls -->
<div class="d-flex align-items-center mb-3 gap-3">
<div class="form-check form-switch">
<input class="form-check-input" type="checkbox" id="showInactive" onchange="loadMappings()">
<label class="form-check-label" for="showInactive">Arata inactive</label>
</div>
<div class="form-check form-switch">
<input class="form-check-input" type="checkbox" id="showDeleted" onchange="loadMappings()">
<label class="form-check-label" for="showDeleted">Arata sterse</label>
</div>
</div>
<!-- Top pagination -->
<div id="mappingsPagTop" class="pag-strip"></div>
<!-- Flat-row list (unified desktop + mobile) -->
<div class="card">
<div class="card-body p-0">
<div id="mappingsFlatList" class="mappings-flat-list">
<div class="flat-row text-muted py-4 justify-content-center">Se incarca...</div>
</div>
</div>
<div id="mappingsPagBottom" class="pag-strip pag-strip-bottom"></div>
</div>
<!-- Add/Edit Modal with multi-CODMAT support (R11) -->
<div class="modal fade" id="addModal" tabindex="-1" data-bs-backdrop="static">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title" id="addModalTitle">Adauga Mapare</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<div class="mb-2">
<label class="form-label form-label-sm mb-1">SKU</label>
<input type="text" class="form-control form-control-sm" id="inputSku" placeholder="Ex: 8714858124284">
</div>
<div id="addModalProductName" style="display:none; margin-bottom:8px; font-size:0.85rem">
<small class="text-muted">Produs:</small> <strong id="inputProductName"></strong>
</div>
<div class="qm-row" style="font-size:0.7rem; color:#9ca3af; padding:0 0 2px">
<span style="flex:1">CODMAT</span>
<span style="width:70px">Cant.</span>
<span style="width:30px"></span>
</div>
<div id="codmatLines">
<!-- Dynamic CODMAT lines will be added here -->
</div>
<button type="button" class="btn btn-sm btn-outline-secondary mt-1" onclick="addCodmatLine()" style="font-size:0.8rem; padding:2px 10px">
+ CODMAT
</button>
<div id="pctWarning" class="text-danger mt-2" style="display:none;"></div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
<button type="button" class="btn btn-primary" onclick="saveMapping()">Salveaza</button>
</div>
</div>
</div>
</div>
<!-- Import CSV Modal -->
<div class="modal fade" id="importModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Import CSV</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<p class="text-muted small">Format CSV: sku, codmat, cantitate_roa</p>
<input type="file" class="form-control" id="csvFile" accept=".csv">
<div id="importResult" class="mt-3"></div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Inchide</button>
<button type="button" class="btn btn-primary" onclick="importCsv()">Import</button>
</div>
</div>
</div>
</div>
<!-- Delete Confirmation Modal -->
<div class="modal fade" id="deleteConfirmModal" tabindex="-1">
<div class="modal-dialog modal-sm">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Confirmare stergere</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
Sigur vrei sa stergi maparea?<br>
SKU: <code id="deleteSkuText"></code><br>
CODMAT: <code id="deleteCodmatText"></code>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
<button type="button" class="btn btn-danger" id="confirmDeleteBtn">Sterge</button>
</div>
</div>
</div>
</div>
<!-- Toast container for undo actions -->
<div class="toast-container position-fixed bottom-0 end-0 p-3" style="z-index:1080">
<div id="undoToast" class="toast" role="alert" data-bs-autohide="true" data-bs-delay="5000">
<div class="toast-body d-flex align-items-center gap-2">
<span id="toastMessage"></span>
<button class="btn btn-sm btn-outline-primary ms-auto" id="toastUndoBtn">Anuleaza</button>
</div>
</div>
</div>
{% endblock %}
{% block scripts %}
<script src="{{ request.scope.get('root_path', '') }}/static/js/mappings.js?v=11"></script>
{% endblock %}

View File

@@ -1,248 +0,0 @@
{% extends "base.html" %}
{% block title %}SKU-uri Lipsa - GoMag Import{% endblock %}
{% block nav_missing %}active{% endblock %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-4">
<h4 class="mb-0">SKU-uri Lipsa</h4>
<div class="d-flex align-items-center gap-2">
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" onclick="exportMissingCsv()">
<i class="bi bi-download"></i> Export CSV
</button>
<!-- Mobile ⋯ dropdown -->
<div class="dropdown d-md-none">
<button class="btn btn-sm btn-outline-secondary" type="button" data-bs-toggle="dropdown" aria-expanded="false"><i class="bi bi-three-dots-vertical"></i></button>
<ul class="dropdown-menu dropdown-menu-end">
<li><a class="dropdown-item" href="#" onclick="document.getElementById('rescanBtn').click();return false"><i class="bi bi-arrow-clockwise me-1"></i> Re-scan</a></li>
<li><a class="dropdown-item" href="#" onclick="exportMissingCsv();return false"><i class="bi bi-download me-1"></i> Export CSV</a></li>
</ul>
</div>
</div>
</div>
<!-- Unified filter bar -->
<div class="filter-bar" id="skusFilterBar">
<button class="filter-pill active d-none d-md-inline-flex" data-sku-status="unresolved">
Nerezolvate <span class="filter-count fc-yellow" id="cntUnres">0</span>
</button>
<button class="filter-pill d-none d-md-inline-flex" data-sku-status="resolved">
Rezolvate <span class="filter-count fc-green" id="cntRes">0</span>
</button>
<button class="filter-pill d-none d-md-inline-flex" data-sku-status="all">
Toate <span class="filter-count fc-neutral" id="cntAllSkus">0</span>
</button>
<input type="search" id="skuSearch" placeholder="Cauta SKU / produs..." class="search-input">
<button id="rescanBtn" class="btn btn-sm btn-secondary ms-2 d-none d-md-inline-flex">&#8635; Re-scan</button>
<span id="rescanProgress" class="align-items-center gap-2 text-primary" style="display:none;">
<span class="sync-live-dot"></span>
<span id="rescanProgressText">Scanare...</span>
</span>
</div>
<div class="d-md-none mb-2" id="skusMobileSeg"></div>
<!-- Result banner -->
<div id="rescanResult" class="result-banner" style="display:none;margin-bottom:0.75rem;"></div>
<div id="skusPagTop" class="pag-strip mb-2"></div>
<div class="card">
<div class="card-body p-0">
<div id="missingMobileList" class="mobile-list"></div>
<div class="table-responsive">
<table class="table table-hover mb-0">
<thead>
<tr>
<th>Status</th>
<th>SKU</th>
<th>Produs</th>
<th>Actiune</th>
</tr>
</thead>
<tbody id="missingBody">
<tr><td colspan="4" class="text-center text-muted py-4">Se incarca...</td></tr>
</tbody>
</table>
</div>
</div>
</div>
<div id="skusPagBottom" class="pag-strip pag-strip-bottom"></div>
{% endblock %}
{% block scripts %}
<script>
let currentPage = 1;
let skuStatusFilter = 'unresolved';
let missingPerPage = 20;
function missingChangePerPage(val) { missingPerPage = parseInt(val) || 20; currentPage = 1; loadMissingSkus(); }
// ── Filter pills ──────────────────────────────────
document.querySelectorAll('.filter-pill[data-sku-status]').forEach(btn => {
btn.addEventListener('click', function() {
document.querySelectorAll('.filter-pill[data-sku-status]').forEach(b => b.classList.remove('active'));
this.classList.add('active');
skuStatusFilter = this.dataset.skuStatus;
currentPage = 1;
loadMissingSkus();
});
});
// ── Search with debounce ─────────────────────────
let skuSearchTimer = null;
document.getElementById('skuSearch')?.addEventListener('input', function() {
clearTimeout(skuSearchTimer);
skuSearchTimer = setTimeout(() => { currentPage = 1; loadMissingSkus(); }, 300);
});
// ── Rescan ────────────────────────────────────────
document.getElementById('rescanBtn')?.addEventListener('click', async function() {
this.disabled = true;
const prog = document.getElementById('rescanProgress');
const result = document.getElementById('rescanResult');
const progText = document.getElementById('rescanProgressText');
if (prog) { prog.style.display = 'flex'; }
if (result) result.style.display = 'none';
try {
const data = await fetch('/api/validate/scan', { method: 'POST' }).then(r => r.json());
if (progText) progText.textContent = 'Gata.';
if (result) {
result.innerHTML = `&#10003; ${data.total_skus_scanned || 0} scanate &nbsp;|&nbsp; ${data.new_missing || 0} noi lipsa &nbsp;|&nbsp; ${data.auto_resolved || 0} rezolvate`;
result.style.display = 'block';
}
loadMissingSkus();
} catch(e) {
if (progText) progText.textContent = 'Eroare.';
} finally {
this.disabled = false;
setTimeout(() => { if (prog) prog.style.display = 'none'; }, 2500);
}
});
document.addEventListener('DOMContentLoaded', () => {
loadMissingSkus();
});
function resolvedParamFor(statusFilter) {
if (statusFilter === 'resolved') return 1;
if (statusFilter === 'all') return -1;
return 0; // unresolved (default)
}
function loadMissingSkus(page) {
currentPage = page || currentPage;
const params = new URLSearchParams();
const resolvedVal = resolvedParamFor(skuStatusFilter);
params.set('resolved', resolvedVal);
params.set('page', currentPage);
params.set('per_page', missingPerPage);
const search = document.getElementById('skuSearch')?.value?.trim();
if (search) params.set('search', search);
fetch('/api/validate/missing-skus?' + params.toString())
.then(r => r.json())
.then(data => {
const c = data.counts || {};
const el = id => document.getElementById(id);
if (el('cntUnres')) el('cntUnres').textContent = c.unresolved || 0;
if (el('cntRes')) el('cntRes').textContent = c.resolved || 0;
if (el('cntAllSkus')) el('cntAllSkus').textContent = c.total || 0;
// Mobile segmented control
renderMobileSegmented('skusMobileSeg', [
{ label: 'Nerez.', count: c.unresolved || 0, value: 'unresolved', active: skuStatusFilter === 'unresolved', colorClass: 'fc-yellow' },
{ label: 'Rez.', count: c.resolved || 0, value: 'resolved', active: skuStatusFilter === 'resolved', colorClass: 'fc-green' },
{ label: 'Toate', count: c.total || 0, value: 'all', active: skuStatusFilter === 'all', colorClass: 'fc-neutral' }
], (val) => {
document.querySelectorAll('.filter-pill[data-sku-status]').forEach(b => b.classList.remove('active'));
const pill = document.querySelector(`.filter-pill[data-sku-status="${val}"]`);
if (pill) pill.classList.add('active');
skuStatusFilter = val;
currentPage = 1;
loadMissingSkus();
});
renderMissingSkusTable(data.skus || data.missing_skus || [], data);
renderPagination(data);
})
.catch(err => {
document.getElementById('missingBody').innerHTML =
`<tr><td colspan="4" class="text-center text-danger">${err.message}</td></tr>`;
});
}
// Keep backward compat alias
function loadMissing(page) { loadMissingSkus(page); }
function renderMissingSkusTable(skus, data) {
const tbody = document.getElementById('missingBody');
const mobileList = document.getElementById('missingMobileList');
if (!skus || skus.length === 0) {
const msg = skuStatusFilter === 'unresolved' ? 'Toate SKU-urile sunt mapate!' :
skuStatusFilter === 'resolved' ? 'Niciun SKU rezolvat' : 'Niciun SKU gasit';
tbody.innerHTML = `<tr><td colspan="4" class="text-center text-muted py-4">${msg}</td></tr>`;
if (mobileList) mobileList.innerHTML = `<div class="flat-row text-muted py-3 justify-content-center">${msg}</div>`;
return;
}
tbody.innerHTML = skus.map(s => {
const trAttrs = !s.resolved
? ` style="cursor:pointer" onclick="openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}')"`
: '';
return `<tr${trAttrs}>
<td>${s.resolved ? '<span class="dot dot-green"></span>' : '<span class="dot dot-yellow"></span>'}</td>
<td><code>${esc(s.sku)}</code></td>
<td class="truncate" style="max-width:300px">${esc(s.product_name || '-')}</td>
<td>
${!s.resolved
? `<a href="#" class="btn-map-icon" onclick="event.stopPropagation(); openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}'); return false;" title="Mapeaza">
<i class="bi bi-link-45deg"></i>
</a>`
: `<small class="text-muted">${s.resolved_at ? new Date(s.resolved_at).toLocaleDateString('ro-RO') : ''}</small>`}
</td>
</tr>`;
}).join('');
if (mobileList) {
mobileList.innerHTML = skus.map(s => {
const actionHtml = !s.resolved
? `<a href="#" class="btn-map-icon" onclick="event.stopPropagation(); openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}'); return false;"><i class="bi bi-link-45deg"></i></a>`
: `<small class="text-muted">${s.resolved_at ? new Date(s.resolved_at).toLocaleDateString('ro-RO') : ''}</small>`;
const flatRowAttrs = !s.resolved
? ` onclick="openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}')" style="cursor:pointer"`
: '';
return `<div class="flat-row"${flatRowAttrs}>
${s.resolved ? '<span class="dot dot-green"></span>' : '<span class="dot dot-yellow"></span>'}
<code class="me-1 text-nowrap">${esc(s.sku)}</code>
<span class="grow truncate">${esc(s.product_name || '-')}</span>
${actionHtml}
</div>`;
}).join('');
}
}
function renderPagination(data) {
const pagOpts = { perPage: missingPerPage, perPageFn: 'missingChangePerPage', perPageOptions: [20, 50, 100] };
const infoHtml = `<small class="text-muted me-auto">Total: ${data.total || 0} | Pagina ${data.page || 1} din ${data.pages || 1}</small>`;
const pagHtml = infoHtml + renderUnifiedPagination(data.page || 1, data.pages || 1, 'loadMissing', pagOpts);
const top = document.getElementById('skusPagTop');
const bot = document.getElementById('skusPagBottom');
if (top) top.innerHTML = pagHtml;
if (bot) bot.innerHTML = pagHtml;
}
// ── Map Modal (uses shared openQuickMap) ─────────
function openMapModal(sku, productName) {
openQuickMap({
sku,
productName,
onSave: () => { loadMissingSkus(currentPage); }
});
}
function exportMissingCsv() {
window.location.href = (window.ROOT_PATH || '') + '/api/validate/missing-skus-csv';
}
</script>
{% endblock %}

View File

@@ -1,237 +0,0 @@
{% extends "base.html" %}
{% block title %}Setari - GoMag Import{% endblock %}
{% block nav_settings %}active{% endblock %}
{% block content %}
<h4 class="mb-3">Setari</h4>
<div class="row g-3 mb-3">
<!-- GoMag API card -->
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">GoMag API</div>
<div class="card-body py-2 px-3">
<div class="mb-2">
<label class="form-label mb-0 small">API Key</label>
<input type="text" class="form-control form-control-sm" id="settGomagApiKey" placeholder="4c5e46...">
</div>
<div class="mb-2">
<label class="form-label mb-0 small">Shop URL</label>
<input type="text" class="form-control form-control-sm" id="settGomagApiShop" placeholder="https://coffeepoint.ro">
</div>
<div class="row g-2">
<div class="col-6">
<label class="form-label mb-0 small">Zile înapoi</label>
<input type="number" class="form-control form-control-sm" id="settGomagDaysBack" value="7" min="1">
</div>
<div class="col-6">
<label class="form-label mb-0 small">Limită/pagină</label>
<input type="number" class="form-control form-control-sm" id="settGomagLimit" value="100" min="1">
</div>
</div>
</div>
</div>
</div>
<!-- Import ROA card -->
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">Import ROA</div>
<div class="card-body py-2 px-3">
<div class="mb-2">
<label class="form-label mb-0 small">Gestiuni pentru verificare stoc</label>
<div id="settGestiuniContainer" class="border rounded p-2" style="max-height:120px;overflow-y:auto;font-size:0.85rem">
<span class="text-muted small">Se încarcă...</span>
</div>
<div class="form-text" style="font-size:0.75rem">Nicio selecție = orice gestiune</div>
</div>
<div class="mb-2">
<label class="form-label mb-0 small">Secție (ID_SECTIE)</label>
<select class="form-select form-select-sm" id="settIdSectie">
<option value="">— selectează secție —</option>
</select>
</div>
<div class="mb-2">
<label class="form-label mb-0 small">Politică Preț Vânzare (ID_POL)</label>
<select class="form-select form-select-sm" id="settIdPol">
<option value="">— selectează politică —</option>
</select>
</div>
<div class="mb-2">
<label class="form-label mb-0 small">Politică Preț Producție</label>
<select class="form-select form-select-sm" id="settIdPolProductie">
<option value="">— fără politică producție —</option>
</select>
<div class="form-text" style="font-size:0.75rem">Pentru articole cu cont 341/345 (producție proprie)</div>
</div>
</div>
</div>
</div>
</div>
<div class="row g-3 mb-3">
<!-- Transport card -->
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">Transport</div>
<div class="card-body py-2 px-3">
<div class="mb-2">
<label class="form-label mb-0 small">CODMAT Transport</label>
<div class="position-relative">
<input type="text" class="form-control form-control-sm" id="settTransportCodmat" placeholder="ex: TRANSPORT" autocomplete="off">
<div class="autocomplete-dropdown d-none" id="settTransportAc"></div>
</div>
</div>
<div class="row g-2">
<div class="col-6">
<label class="form-label mb-0 small">TVA Transport (%)</label>
<select class="form-select form-select-sm" id="settTransportVat">
<option value="5">5%</option>
<option value="9">9%</option>
<option value="19">19%</option>
<option value="21" selected>21%</option>
</select>
</div>
<div class="col-6">
<label class="form-label mb-0 small">Politică Transport</label>
<select class="form-select form-select-sm" id="settTransportIdPol">
<option value="">— implicită —</option>
</select>
</div>
</div>
</div>
</div>
</div>
<!-- Discount card -->
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">Discount</div>
<div class="card-body py-2 px-3">
<div class="mb-2">
<label class="form-label mb-0 small">CODMAT Discount</label>
<div class="position-relative">
<input type="text" class="form-control form-control-sm" id="settDiscountCodmat" placeholder="ex: DISCOUNT" autocomplete="off">
<div class="autocomplete-dropdown d-none" id="settDiscountAc"></div>
</div>
</div>
<div class="row g-2">
<div class="col-6">
<label class="form-label mb-0 small">TVA Discount (fallback %)</label>
<select class="form-select form-select-sm" id="settDiscountVat">
<option value="5">5%</option>
<option value="9">9%</option>
<option value="11">11%</option>
<option value="19">19%</option>
<option value="21" selected>21%</option>
</select>
</div>
<div class="col-6">
<label class="form-label mb-0 small">Politică Discount</label>
<select class="form-select form-select-sm" id="settDiscountIdPol">
<option value="">— implicită —</option>
</select>
</div>
</div>
<div class="mt-2 form-check">
<input type="checkbox" class="form-check-input" id="settSplitDiscountVat">
<label class="form-check-label small" for="settSplitDiscountVat">
Împarte discount pe cote TVA (proporțional cu valoarea articolelor)
</label>
</div>
</div>
</div>
</div>
</div>
<div class="row g-3 mb-3">
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">Dashboard</div>
<div class="card-body py-2 px-3">
<div class="mb-2">
<label class="form-label mb-0 small">Interval polling (secunde)</label>
<input type="number" class="form-control form-control-sm" id="settDashPollSeconds" value="5" min="1" max="300">
<div class="form-text" style="font-size:0.75rem">Cât de des verifică dashboard-ul starea sync-ului (implicit 5s)</div>
</div>
</div>
</div>
</div>
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">Pricing Kituri / Pachete</div>
<div class="card-body py-2 px-3">
<div class="mb-2">
<div class="form-check">
<input class="form-check-input" type="radio" name="kitPricingMode" id="kitModeOff" value="" checked>
<label class="form-check-label small" for="kitModeOff">Dezactivat</label>
</div>
<div class="form-check">
<input class="form-check-input" type="radio" name="kitPricingMode" id="kitModeDistributed" value="distributed">
<label class="form-check-label small" for="kitModeDistributed">Distribuire discount în preț</label>
</div>
<div class="form-check">
<input class="form-check-input" type="radio" name="kitPricingMode" id="kitModeSeparate" value="separate_line">
<label class="form-check-label small" for="kitModeSeparate">Linie discount separată</label>
</div>
</div>
<div id="kitModeBFields" style="display:none">
<div class="mb-2">
<label class="form-label mb-0 small">Kit Discount CODMAT</label>
<div class="position-relative">
<input type="text" class="form-control form-control-sm" id="settKitDiscountCodmat" placeholder="ex: DISCOUNT_KIT" autocomplete="off">
<div class="autocomplete-dropdown d-none" id="settKitDiscountAc"></div>
</div>
</div>
<div class="mb-2">
<label class="form-label mb-0 small">Kit Discount Politică</label>
<select class="form-select form-select-sm" id="settKitDiscountIdPol">
<option value="">— implicită —</option>
</select>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="row g-3 mb-3">
<div class="col-md-6">
<div class="card h-100">
<div class="card-header py-2 px-3 fw-semibold">Sincronizare Prețuri</div>
<div class="card-body py-2 px-3">
<div class="form-check mb-2">
<input type="checkbox" class="form-check-input" id="settPriceSyncEnabled" checked>
<label class="form-check-label small" for="settPriceSyncEnabled">Sync automat prețuri din comenzi</label>
</div>
<div class="form-check mb-2">
<input type="checkbox" class="form-check-input" id="settCatalogSyncEnabled">
<label class="form-check-label small" for="settCatalogSyncEnabled">Sync prețuri din catalog GoMag</label>
</div>
<div id="catalogSyncOptions" style="display:none">
<div class="mb-2">
<label class="form-label mb-0 small">Program</label>
<select class="form-select form-select-sm" id="settPriceSyncSchedule">
<option value="">Doar manual</option>
<option value="daily_03:00">Zilnic la 03:00</option>
<option value="daily_06:00">Zilnic la 06:00</option>
</select>
</div>
</div>
<div id="settPriceSyncStatus" class="text-muted small mt-2"></div>
<button class="btn btn-sm btn-outline-primary mt-2" id="btnCatalogSync" onclick="startCatalogSync()">Sincronizează acum</button>
</div>
</div>
</div>
</div>
<div class="mb-3">
<button class="btn btn-primary btn-sm" onclick="saveSettings()">Salvează Setările</button>
<span id="settSaveResult" class="ms-2 small"></span>
</div>
{% endblock %}
{% block scripts %}
<script src="{{ request.scope.get('root_path', '') }}/static/js/settings.js?v=7"></script>
{% endblock %}

View File

View File

@@ -1,43 +0,0 @@
-- ====================================================================
-- P1-001: Tabel ARTICOLE_TERTI pentru mapări SKU → CODMAT
-- Sistem Import Comenzi Web → ROA
-- ====================================================================
-- Creare tabel pentru mapări complexe articole
CREATE TABLE ARTICOLE_TERTI (
sku VARCHAR2(100) NOT NULL, -- SKU din platforma web
codmat VARCHAR2(50) NOT NULL, -- CODMAT din nom_articole
cantitate_roa NUMBER(10,3) DEFAULT 1, -- Câte unități ROA = 1 web
procent_pret NUMBER(5,2) DEFAULT 100, -- % din preț pentru seturi
activ NUMBER(1) DEFAULT 1, -- 1=activ, 0=inactiv
data_creare DATE DEFAULT SYSDATE, -- Timestamp creare
data_modif DATE DEFAULT SYSDATE, -- Timestamp ultima modificare
id_util_creare NUMBER(10) DEFAULT -3, -- ID utilizator care a creat
id_util_modif NUMBER(10) DEFAULT -3 -- ID utilizator care a modificat
);
-- Adaugare constraint-uri ca instructiuni separate
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT pk_articole_terti PRIMARY KEY (sku, codmat);
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_cantitate CHECK (cantitate_roa > 0);
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_procent CHECK (procent_pret >= 0 AND procent_pret <= 100);
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_activ CHECK (activ IN (0, 1));
-- Index pentru performanță pe căutări frecvente după SKU
CREATE INDEX idx_articole_terti_sku ON ARTICOLE_TERTI (sku, activ);
-- Comentarii pentru documentație
COMMENT ON TABLE ARTICOLE_TERTI IS 'Mapări SKU-uri web → CODMAT ROA pentru reîmpachetări și seturi';
COMMENT ON COLUMN ARTICOLE_TERTI.sku IS 'SKU din platforma web (ex: GoMag)';
COMMENT ON COLUMN ARTICOLE_TERTI.codmat IS 'CODMAT din nom_articole ROA';
COMMENT ON COLUMN ARTICOLE_TERTI.cantitate_roa IS 'Câte unități ROA pentru 1 unitate web';
COMMENT ON COLUMN ARTICOLE_TERTI.procent_pret IS 'Procent din preț web alocat acestui CODMAT (pentru seturi)';
COMMENT ON COLUMN ARTICOLE_TERTI.activ IS '1=mapare activă, 0=dezactivată';
-- Date de test pentru validare
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ) VALUES ('CAFE100', 'CAF01', 10, 100, 1);
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ) VALUES ('SET01', 'CAF01', 2, 60, 1);
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ) VALUES ('SET01', 'FILT01', 1, 40, 1);

File diff suppressed because it is too large Load Diff

View File

@@ -1,828 +0,0 @@
CREATE OR REPLACE PACKAGE PACK_IMPORT_PARTENERI AS
-- 20.03.2026 - import parteneri GoMag: PJ/PF, shipping/billing, cautare/creare automata
-- ====================================================================
-- CONSTANTS
-- ====================================================================
-- ID utilizator sistem pentru toate operatiile
C_ID_UTIL_SISTEM CONSTANT NUMBER := -3;
-- Valori default pentru adrese incomplete
C_JUD_DEFAULT CONSTANT VARCHAR2(50) := 'BUCURESTI';
N_ID_JUD_DEFAULT CONSTANT NUMBER(10) := 10;
C_LOCALITATE_DEFAULT CONSTANT VARCHAR2(50) := 'BUCURESTI SECTORUL 1';
N_ID_LOCALITATE_DEFAULT CONSTANT NUMBER(10) := 1797;
C_SECTOR_DEFAULT CONSTANT VARCHAR2(50) := 'SECTOR 1';
C_TARA_DEFAULT CONSTANT VARCHAR2(50) := 'ROMANIA';
N_ID_TARA_DEFAULT CONSTANT NUMBER(10) := 1;
-- Lungimi maxime pentru validari
C_MIN_COD_FISCAL CONSTANT NUMBER := 3;
C_CUI_PERS_FIZICA CONSTANT NUMBER := 13; -- CNP are 13 cifre
-- Variabila package pentru ultima eroare (pentru orchestrator VFP)
g_last_error VARCHAR2(4000);
-- ====================================================================
-- CUSTOM EXCEPTIONS
-- ====================================================================
partener_invalid_exception EXCEPTION;
PRAGMA EXCEPTION_INIT(partener_invalid_exception, -20001);
adresa_invalid_exception EXCEPTION;
PRAGMA EXCEPTION_INIT(adresa_invalid_exception, -20002);
integrare_pack_def_exception EXCEPTION;
PRAGMA EXCEPTION_INIT(integrare_pack_def_exception, -20003);
-- ====================================================================
-- PUBLIC FUNCTIONS
-- ====================================================================
/**
* Procedura principala pentru cautarea sau crearea unui partener
* SCHIMBAT din FUNCTION in PROCEDURE pentru compatibilitate cu DML operations
*
* Algoritm:
* 1. Cauta dupa cod_fiscal (daca > 3 caractere)
* 2. Cauta dupa denumire exacta
* 3. Creeaza partener nou cu pack_def.adauga_partener()
* 4. Adauga adresa cu pack_def.adauga_adresa_partener2()
*
* @param p_cod_fiscal Cod fiscal/CUI/CNP partener
* @param p_denumire Denumirea partenerului (companie sau nume complet)
* @param p_adresa Adresa in format: "JUD:Bucuresti;BUCURESTI;Str.Victoriei;10"
* @param p_telefon Numar de telefon
* @param p_email Adresa de email
* @param p_is_persoana_juridica 1=persoana juridica, 0=persoana fizica, NULL=auto-detect prin CNP
* @param p_id_partener OUT ID_PART al partenerului gasit sau creat
*/
PROCEDURE cauta_sau_creeaza_partener(p_cod_fiscal IN VARCHAR2,
p_denumire IN VARCHAR2,
p_registru IN VARCHAR2,
p_is_persoana_juridica IN NUMBER DEFAULT NULL,
p_id_partener OUT NUMBER);
procedure cauta_sau_creeaza_adresa(p_id_part IN NUMBER,
p_adresa IN VARCHAR2,
p_phone IN VARCHAR2,
p_email IN VARCHAR2,
p_id_adresa OUT NUMBER);
/**
* Parseaza o adresa din format semicolon in componentele individuale
*
* Format input: "JUD:Bucuresti;BUCURESTI;Str.Victoriei;10"
* sau: "BUCURESTI;Str.Victoriei;10"
* sau: "Str.Victoriei;10"
*
* @param p_adresa_text Textul adresei de parseat
* @param p_judet OUT Judetul extras (default: Bucuresti)
* @param p_localitate OUT Localitatea extrasa (default: BUCURESTI)
* @param p_strada OUT Strada si numarul
* @param p_sector OUT Sectorul (default: Sectorul 1)
*/
PROCEDURE parseaza_adresa_semicolon(p_adresa_text IN VARCHAR2,
p_judet OUT VARCHAR2,
p_localitate OUT VARCHAR2,
p_strada OUT VARCHAR2,
p_numar OUT VARCHAR2,
p_sector OUT VARCHAR2);
-- ====================================================================
-- UTILITY FUNCTIONS (PUBLIC pentru testare)
-- ====================================================================
/**
* Cauta partener dupa cod fiscal
* @param p_cod_fiscal Codul fiscal de cautat
* @return ID_PART sau NULL daca nu gaseste
*/
FUNCTION cauta_partener_dupa_cod_fiscal(p_cod_fiscal IN VARCHAR2)
RETURN NUMBER;
/**
* Cauta partener dupa denumire exacta
* @param p_denumire Denumirea de cautat
* @return ID_PART sau NULL daca nu gaseste
*/
FUNCTION cauta_partener_dupa_denumire(p_denumire IN VARCHAR2) RETURN NUMBER;
/**
* Verifica daca un cod fiscal apartine unei persoane fizice (CNP)
* @param p_cod_fiscal Codul fiscal de verificat
* @return 1 daca este persoana fizica, 0 daca este companie
*/
FUNCTION este_persoana_fizica(p_cod_fiscal IN VARCHAR2) RETURN NUMBER;
/**
* Separa numele complet in nume si prenume pentru persoane fizice
* @param p_denumire_completa Numele complet
* @param p_nume OUT Numele de familie
* @param p_prenume OUT Prenumele
*/
PROCEDURE separa_nume_prenume(p_denumire_completa IN VARCHAR2,
p_nume OUT VARCHAR2,
p_prenume OUT VARCHAR2);
-- ====================================================================
-- ERROR MANAGEMENT FUNCTIONS (similar cu PACK_JSON)
-- ====================================================================
/**
* Returneaza ultima eroare pentru orchestrator VFP
*/
FUNCTION get_last_error RETURN VARCHAR2;
/**
* Reseteaza eroarea
*/
PROCEDURE clear_error;
END PACK_IMPORT_PARTENERI;
/
CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_PARTENERI AS
-- ================================================================
-- ERROR MANAGEMENT FUNCTIONS IMPLEMENTATION
-- ================================================================
FUNCTION get_last_error RETURN VARCHAR2 IS
BEGIN
RETURN g_last_error;
END get_last_error;
PROCEDURE clear_error IS
BEGIN
g_last_error := NULL;
END clear_error;
-- ====================================================================
-- PRIVATE FUNCTIONS
-- ====================================================================
/**
* Valideaza datele unui partener inainte de creare
*/
FUNCTION valideaza_date_partener(p_cod_fiscal IN VARCHAR2,
p_denumire IN VARCHAR2) RETURN BOOLEAN IS
BEGIN
-- Verificari obligatorii
IF p_denumire IS NULL THEN
g_last_error := 'Denumirea partenerului nu poate fi goala';
RETURN FALSE;
END IF;
-- Cod fiscal optional, dar daca exista trebuie sa aiba minim 3 caractere
IF p_cod_fiscal IS NOT NULL AND LENGTH(TRIM(p_cod_fiscal)) > 0 THEN
IF LENGTH(TRIM(p_cod_fiscal)) < C_MIN_COD_FISCAL THEN
g_last_error := 'Codul fiscal trebuie sa aiba minim ' ||
C_MIN_COD_FISCAL || ' caractere';
RETURN FALSE;
END IF;
END IF;
RETURN TRUE;
EXCEPTION
WHEN OTHERS THEN
g_last_error := 'ERROR in valideaza_date_partener: ' || SQLERRM;
RETURN FALSE;
END valideaza_date_partener;
/**
* Curata si standardizeaza textul pentru cautare
*/
FUNCTION curata_text_cautare(p_text IN VARCHAR2) RETURN VARCHAR2 IS
BEGIN
IF p_text IS NULL THEN
RETURN NULL;
END IF;
RETURN UPPER(TRIM(p_text));
END curata_text_cautare;
-- ====================================================================
-- PUBLIC FUNCTIONS IMPLEMENTATION
-- ====================================================================
FUNCTION cauta_partener_dupa_cod_fiscal(p_cod_fiscal IN VARCHAR2)
RETURN NUMBER IS
v_id_part NUMBER;
v_cod_fiscal_curat VARCHAR2(50);
BEGIN
-- Validare input
IF p_cod_fiscal IS NULL OR
LENGTH(TRIM(p_cod_fiscal)) < C_MIN_COD_FISCAL THEN
RETURN NULL;
END IF;
v_cod_fiscal_curat := curata_text_cautare(p_cod_fiscal);
-- pINFO('Cautare partener dupa cod_fiscal: ' || v_cod_fiscal_curat, 'IMPORT_PARTENERI');
-- Cautare in NOM_PARTENERI
BEGIN
SELECT id_part
INTO v_id_part
FROM nom_parteneri
WHERE UPPER(TRIM(cod_fiscal)) = v_cod_fiscal_curat
AND ROWNUM = 1; -- In caz de duplicate, luam primul
-- pINFO('Gasit partener cu cod_fiscal ' || v_cod_fiscal_curat || ': ID_PART=' || v_id_part, 'IMPORT_PARTENERI');
RETURN v_id_part;
EXCEPTION
WHEN NO_DATA_FOUND THEN
-- pINFO('Nu s-a gasit partener cu cod_fiscal: ' || v_cod_fiscal_curat, 'IMPORT_PARTENERI');
RETURN NULL;
WHEN TOO_MANY_ROWS THEN
-- Luam primul gasit
SELECT id_part
INTO v_id_part
FROM (SELECT id_part
FROM nom_parteneri
WHERE UPPER(TRIM(cod_fiscal)) = v_cod_fiscal_curat
ORDER BY id_part)
WHERE ROWNUM = 1;
pINFO('WARNING: Multiple parteneri cu acelasi cod_fiscal ' ||
v_cod_fiscal_curat || '. Selectat ID_PART=' || v_id_part,
'IMPORT_PARTENERI');
RETURN v_id_part;
END;
EXCEPTION
WHEN OTHERS THEN
pINFO('ERROR in cauta_partener_dupa_cod_fiscal: ' || SQLERRM,
'IMPORT_PARTENERI');
RAISE;
END cauta_partener_dupa_cod_fiscal;
FUNCTION cauta_partener_dupa_denumire(p_denumire IN VARCHAR2) RETURN NUMBER IS
v_id_part NUMBER;
v_denumire_curata VARCHAR2(200);
BEGIN
-- Validare input
IF p_denumire IS NULL THEN
RETURN NULL;
END IF;
v_denumire_curata := curata_text_cautare(p_denumire);
-- pINFO('Cautare partener dupa denumire: ' || v_denumire_curata, 'IMPORT_PARTENERI');
-- Cautare in NOM_PARTENERI
BEGIN
SELECT id_part
INTO v_id_part
FROM nom_parteneri
WHERE UPPER(TRIM(denumire)) = v_denumire_curata
AND ROWNUM = 1; -- In caz de duplicate, luam primul
-- pINFO('Gasit partener cu denumirea ' || v_denumire_curata || ': ID_PART=' || v_id_part, 'IMPORT_PARTENERI');
RETURN v_id_part;
EXCEPTION
WHEN NO_DATA_FOUND THEN
-- pINFO('Nu s-a gasit partener cu denumirea: ' || v_denumire_curata, 'IMPORT_PARTENERI');
RETURN NULL;
WHEN TOO_MANY_ROWS THEN
-- Luam primul gasit
SELECT id_part
INTO v_id_part
FROM (SELECT id_part
FROM nom_parteneri
WHERE UPPER(TRIM(denumire)) = v_denumire_curata
ORDER BY id_part)
WHERE ROWNUM = 1;
pINFO('WARNING: Multiple parteneri cu aceeasi denumire ' ||
v_denumire_curata || '. Selectat ID_PART=' || v_id_part,
'IMPORT_PARTENERI');
RETURN v_id_part;
END;
EXCEPTION
WHEN OTHERS THEN
pINFO('ERROR in cauta_partener_dupa_denumire: ' || SQLERRM,
'IMPORT_PARTENERI');
RAISE;
END cauta_partener_dupa_denumire;
FUNCTION este_persoana_fizica(p_cod_fiscal IN VARCHAR2) RETURN NUMBER IS
v_cod_curat VARCHAR2(50);
BEGIN
IF p_cod_fiscal IS NULL THEN
RETURN 0;
END IF;
v_cod_curat := TRIM(p_cod_fiscal);
-- CNP-ul are exact 13 cifre
IF LENGTH(v_cod_curat) = C_CUI_PERS_FIZICA AND
REGEXP_LIKE(v_cod_curat, '^[0-9]{13}$') THEN
RETURN 1;
END IF;
RETURN 0;
EXCEPTION
WHEN OTHERS THEN
-- pINFO('ERROR in este_persoana_fizica: ' || SQLERRM, 'IMPORT_PARTENERI');
RETURN 0;
END este_persoana_fizica;
PROCEDURE separa_nume_prenume(p_denumire_completa IN VARCHAR2,
p_nume OUT VARCHAR2,
p_prenume OUT VARCHAR2) IS
v_pozitie_spatiu NUMBER;
v_denumire_curata VARCHAR2(200);
BEGIN
IF p_denumire_completa IS NULL THEN
p_nume := NULL;
p_prenume := NULL;
RETURN;
END IF;
v_denumire_curata := TRIM(p_denumire_completa);
-- Cauta primul spatiu
v_pozitie_spatiu := INSTR(v_denumire_curata, ' ');
IF v_pozitie_spatiu > 0 THEN
-- Numele = prima parte
p_nume := TRIM(SUBSTR(v_denumire_curata, 1, v_pozitie_spatiu - 1));
-- Prenumele = restul
p_prenume := TRIM(SUBSTR(v_denumire_curata, v_pozitie_spatiu + 1));
ELSE
-- Nu exista spatiu, totul este nume
p_nume := v_denumire_curata;
p_prenume := NULL;
END IF;
-- Validare lungimi maxime (sa nu depaseasca limitele tabelei)
IF LENGTH(p_nume) > 50 THEN
p_nume := SUBSTR(p_nume, 1, 50);
END IF;
IF LENGTH(p_prenume) > 50 THEN
p_prenume := SUBSTR(p_prenume, 1, 50);
END IF;
EXCEPTION
WHEN OTHERS THEN
-- pINFO('ERROR in separa_nume_prenume: ' || SQLERRM, 'IMPORT_PARTENERI');
p_nume := SUBSTR(p_denumire_completa, 1, 50); -- fallback
p_prenume := NULL;
END separa_nume_prenume;
PROCEDURE parseaza_adresa_semicolon(p_adresa_text IN VARCHAR2,
p_judet OUT VARCHAR2,
p_localitate OUT VARCHAR2,
p_strada OUT VARCHAR2,
p_numar OUT VARCHAR2,
p_sector OUT VARCHAR2) IS
v_adresa_curata VARCHAR2(500);
v_componente SYS.ODCIVARCHAR2LIST := SYS.ODCIVARCHAR2LIST();
v_count NUMBER;
v_temp_judet VARCHAR2(100);
v_pozitie NUMBER;
v_strada VARCHAR2(100);
BEGIN
-- p_adresa_text: JUD: JUDET;LOCALITATE;STRADA, NR
-- Initializare cu valori default
p_judet := C_JUD_DEFAULT;
p_localitate := C_LOCALITATE_DEFAULT;
p_strada := NULL;
p_sector := C_SECTOR_DEFAULT;
-- Validare input
IF p_adresa_text IS NULL THEN
-- pINFO('Adresa goala, se folosesc valorile default', 'IMPORT_PARTENERI');
RETURN;
END IF;
v_adresa_curata := TRIM(p_adresa_text);
-- pINFO('Parsare adresa: ' || v_adresa_curata, 'IMPORT_PARTENERI');
-- Split dupa semicolon
SELECT TRIM(REGEXP_SUBSTR(v_adresa_curata, '[^;]+', 1, LEVEL))
BULK COLLECT
INTO v_componente
FROM DUAL
CONNECT BY REGEXP_SUBSTR(v_adresa_curata, '[^;]+', 1, LEVEL) IS NOT NULL;
v_count := v_componente.COUNT;
IF v_count = 0 THEN
-- pINFO('Nu s-au gasit componente in adresa', 'IMPORT_PARTENERI');
RETURN;
END IF;
-- Parsare in functie de numarul de componente
IF v_count = 1 THEN
-- Doar strada
p_strada := SUBSTR(v_componente(1), 1, 100);
ELSIF v_count = 2 THEN
-- Localitate;Strada
p_localitate := SUBSTR(v_componente(1), 1, 100);
p_strada := SUBSTR(v_componente(2), 1, 100);
ELSIF v_count >= 3 THEN
-- Verifica daca prima componenta contine "JUD:"
v_temp_judet := v_componente(1);
IF UPPER(v_temp_judet) LIKE 'JUD:%' THEN
-- Format: JUD:Bucuresti;BUCURESTI;Strada,Numar
p_judet := SUBSTR(REPLACE(v_temp_judet, 'JUD:', ''), 1, 100);
p_localitate := SUBSTR(v_componente(2), 1, 100);
p_strada := SUBSTR(v_componente(3), 1, 100);
v_strada := p_strada;
-- Combina strada si numarul
v_pozitie := INSTR(v_strada, ',');
IF v_pozitie > 0 THEN
p_strada := TRIM(SUBSTR(v_strada, 1, v_pozitie - 1));
p_numar := TRIM(SUBSTR(v_strada, v_pozitie + 1));
-- Elimina prefixele din numele strazii (STR., STRADA, BD., BDUL., etc.)
/* v_nume_strada := TRIM(REGEXP_REPLACE(v_nume_strada,
'^(STR\.|STRADA|BD\.|BDUL\.|CALEA|PIATA|PTA\.|AL\.|ALEEA|SOS\.|SOSEA|INTR\.|INTRAREA)\s*',
'', 1, 1, 'i')); */
-- Elimina prefixele din numarul strazii (NR., NUMARUL, etc.)
p_numar := TRIM(REGEXP_REPLACE(p_numar,
'^(NR\.|NUMARUL|NUMAR)\s*',
'',
1,
1,
'i'));
END IF;
ELSE
-- Format: Localitate;Strada;Altceva
p_localitate := SUBSTR(v_componente(1), 1, 100);
p_strada := SUBSTR(v_componente(2) || ' ' || v_componente(3),
1,
100);
END IF;
END IF;
-- Curatare finala
p_judet := UPPER(TRIM(p_judet));
p_localitate := UPPER(TRIM(p_localitate));
p_strada := UPPER(TRIM(p_strada));
p_numar := UPPER(TRIM(p_numar));
p_sector := UPPER(TRIM(p_sector));
-- Fallback pentru campuri goale
IF p_judet IS NULL THEN
p_judet := C_JUD_DEFAULT;
END IF;
IF p_localitate IS NULL THEN
p_localitate := C_LOCALITATE_DEFAULT;
END IF;
IF p_sector IS NULL THEN
p_sector := C_SECTOR_DEFAULT;
END IF;
-- pINFO('Adresa parsata: JUD=' || p_judet || ', LOC=' || p_localitate ||
-- ', STRADA=' || NVL(p_strada, 'NULL') || ', SECTOR=' || p_sector, 'IMPORT_PARTENERI');
EXCEPTION
WHEN OTHERS THEN
g_last_error := 'ERROR in parseaza_adresa_semicolon: ' || SQLERRM;
-- pINFO('ERROR in parseaza_adresa_semicolon: ' || SQLERRM, 'IMPORT_PARTENERI');
-- Pastram valorile default in caz de eroare
p_judet := C_JUD_DEFAULT;
p_localitate := C_LOCALITATE_DEFAULT;
p_sector := C_SECTOR_DEFAULT;
END parseaza_adresa_semicolon;
PROCEDURE cauta_sau_creeaza_partener(p_cod_fiscal IN VARCHAR2,
p_denumire IN VARCHAR2,
p_registru IN VARCHAR2,
p_is_persoana_juridica IN NUMBER DEFAULT NULL,
p_id_partener OUT NUMBER) IS
v_id_part NUMBER;
v_este_persoana_fizica NUMBER;
v_nume VARCHAR2(50);
v_prenume VARCHAR2(50);
-- Date pentru pack_def
v_cod_fiscal_curat VARCHAR2(50);
v_denumire_curata VARCHAR2(200);
BEGIN
-- Resetare eroare la inceputul procesarii
clear_error;
-- pINFO('=== INCEPUT cauta_sau_creeaza_partener ===', 'IMPORT_PARTENERI');
-- pINFO('Input: cod_fiscal=' || NVL(p_cod_fiscal, 'NULL') ||
-- ', denumire=' || NVL(p_denumire, 'NULL') ||
-- ', adresa=' || NVL(p_adresa, 'NULL'), 'IMPORT_PARTENERI');
-- Validare date input
IF NOT valideaza_date_partener(p_cod_fiscal, p_denumire) THEN
g_last_error := 'Date partener invalide - validare esuata';
p_id_partener := -1;
RETURN;
END IF;
v_cod_fiscal_curat := TRIM(p_cod_fiscal);
v_denumire_curata := UPPER(TRIM(p_denumire));
-- STEP 1: Cautare dupa cod fiscal (prioritate 1)
IF v_cod_fiscal_curat IS NOT NULL AND
LENGTH(v_cod_fiscal_curat) >= C_MIN_COD_FISCAL THEN
v_id_part := cauta_partener_dupa_cod_fiscal(v_cod_fiscal_curat);
IF v_id_part IS NOT NULL THEN
-- pINFO('Partener gasit dupa cod_fiscal. ID_PART=' || v_id_part, 'IMPORT_PARTENERI');
-- pINFO('=== SFARSIT cauta_sau_creeaza_partener ===', 'IMPORT_PARTENERI');
p_id_partener := v_id_part;
RETURN;
END IF;
END IF;
-- STEP 2: Cautare dupa denumire exacta (prioritate 2)
v_id_part := cauta_partener_dupa_denumire(v_denumire_curata);
IF v_id_part IS NOT NULL THEN
-- pINFO('Partener gasit dupa denumire. ID_PART=' || v_id_part, 'IMPORT_PARTENERI');
-- pINFO('=== SFARSIT cauta_sau_creeaza_partener ===', 'IMPORT_PARTENERI');
p_id_partener := v_id_part;
RETURN;
END IF;
-- STEP 3: Creare partener nou
-- pINFO('Nu s-a gasit partener existent. Se creeaza unul nou...', 'IMPORT_PARTENERI');
-- Verifica tipul partenerului
-- Prioritate: parametru explicit > detectie prin CNP
IF p_is_persoana_juridica IS NOT NULL THEN
-- Foloseste informatia explicita din GoMag orders
v_este_persoana_fizica := CASE
WHEN p_is_persoana_juridica = 1 THEN
0
ELSE
1
END;
ELSE
-- Auto-detect prin CNP (comportament original)
v_este_persoana_fizica := este_persoana_fizica(v_cod_fiscal_curat);
END IF;
IF v_este_persoana_fizica = 1 THEN
-- pINFO('Detectata persoana fizica (CUI 13 cifre)', 'IMPORT_PARTENERI');
separa_nume_prenume(v_denumire_curata, v_nume, v_prenume);
v_nume := UPPER(v_nume);
v_prenume := UPPER(v_prenume);
-- pINFO('Nume separat: NUME=' || NVL(v_nume, 'NULL') || ', PRENUME=' || NVL(v_prenume, 'NULL'), 'IMPORT_PARTENERI');
END IF;
-- Creare partener prin pack_def
BEGIN
IF v_este_persoana_fizica = 1 THEN
-- Pentru persoane fizice
pack_def.adauga_partener(tcDenumire => v_nume || ' ' || v_prenume,
tcNume => v_nume,
tcPrenume => v_prenume,
tcCod_fiscal => v_cod_fiscal_curat,
tcReg_comert => p_registru,
tnId_loc => NULL,
tnId_categorie_entitate => NULL,
tcPrefix => '',
tcSufix => '',
tnTip_persoana => 2, -- persoana fizica
tcBanca => '', -- nu avem info bancara
tcCont_banca => '', -- nu avem info bancara
tnInactiv => 0,
tcMotiv_inactiv => '',
tnId_util => C_ID_UTIL_SISTEM,
tcSir_id_tipPart => '16;17',
tcSir_id_part_del => '',
tnId_Part => v_id_part);
ELSE
-- Pentru companii
pack_def.adauga_partener(tcDenumire => v_denumire_curata,
tcNume => v_denumire_curata,
tcPrenume => '',
tcCod_fiscal => v_cod_fiscal_curat,
tcReg_comert => p_registru,
tnId_loc => NULL,
tnId_categorie_entitate => NULL,
tcPrefix => '',
tcSufix => '',
tnTip_persoana => 1, -- persoana juridica
tcBanca => '', -- nu avem info bancara
tcCont_banca => '', -- nu avem info bancara
tnInactiv => 0,
tcMotiv_inactiv => '',
tnId_util => C_ID_UTIL_SISTEM,
tcSir_id_tipPart => '16;17',
tcSir_id_part_del => '',
tnId_Part => v_id_part);
END IF;
IF v_id_part IS NULL OR v_id_part <= 0 THEN
g_last_error := 'pack_def.adauga_partener a returnat ID invalid';
p_id_partener := -1;
RETURN;
END IF;
-- pINFO('Partener creat cu succes. ID_PART=' || v_id_part, 'IMPORT_PARTENERI');
EXCEPTION
WHEN OTHERS THEN
g_last_error := 'ERROR la crearea partenerului prin pack_def: ' ||
SQLERRM;
p_id_partener := -1;
RETURN;
END;
-- pINFO('Partener creat complet. ID_PART=' || v_id_part, 'IMPORT_PARTENERI');
-- pINFO('=== SFARSIT cauta_sau_creeaza_partener ===', 'IMPORT_PARTENERI');
p_id_partener := v_id_part;
EXCEPTION
WHEN OTHERS THEN
g_last_error := 'ERROR NEASTEPTAT in cauta_sau_creeaza_partener: ' ||
SQLERRM;
p_id_partener := -1;
END cauta_sau_creeaza_partener;
procedure cauta_sau_creeaza_adresa(p_id_part IN NUMBER,
p_adresa IN VARCHAR2,
p_phone IN VARCHAR2,
p_email IN VARCHAR2,
p_id_adresa OUT NUMBER) is
v_judet VARCHAR2(200);
v_id_judet NUMBER(10);
v_localitate VARCHAR2(200);
v_id_localitate NUMBER(10);
v_strada VARCHAR2(1000);
v_numar VARCHAR2(1000);
v_sector VARCHAR2(100);
v_id_tara NUMBER(10);
v_principala NUMBER(1);
begin
-- Resetare eroare la inceputul procesarii
clear_error;
IF p_id_part is null OR p_adresa IS NULL THEN
GOTO sfarsit;
END IF;
-- pINFO('Se adauga adresa pentru partenerul nou creat...', 'IMPORT_PARTENERI');
-- Verific daca exista o adresa principala
SELECT DECODE(nr, 0, 1, 0)
INTO v_principala
FROM (SELECT count(id_adresa) nr
from vadrese_parteneri
where id_part = p_id_part
and principala = 1);
-- Parseaza adresa
parseaza_adresa_semicolon(p_adresa,
v_judet,
v_localitate,
v_strada,
v_numar,
v_sector);
-- caut prima adresa dupa judet si localitate, ordonate dupa principala = 1
begin
select max(id_adresa) over(order by principala desc)
into p_id_adresa
from vadrese_parteneri
where id_part = p_id_part
and judet = v_judet
and localitate = v_localitate;
exception
WHEN NO_DATA_FOUND THEN
p_id_adresa := null;
end;
-- caut prima adresa dupa judet, ordonate dupa principala = 1
if p_id_adresa is null then
begin
select max(id_adresa) over(order by principala desc)
into p_id_adresa
from vadrese_parteneri
where id_part = p_id_part
and judet = v_judet;
exception
WHEN NO_DATA_FOUND THEN
p_id_adresa := null;
end;
end if;
-- Adaug o adresa
if p_id_adresa is null then
-- caut judetul
begin
select id_judet
into v_id_judet
from syn_nom_judete
where judet = v_judet
and sters = 0;
exception
when NO_DATA_FOUND then
v_id_judet := N_ID_JUD_DEFAULT;
end;
-- caut localitatea
begin
select id_loc, id_judet, id_tara
into v_id_localitate, v_id_judet, v_id_tara
from (select id_loc, id_judet, id_tara, rownum rn
from syn_nom_localitati l
where id_judet = v_id_judet
and localitate = v_localitate
and inactiv = 0
and sters = 0
order by localitate)
where rn = 1;
exception
when NO_DATA_FOUND then
begin
select id_loc, id_judet, id_tara
into v_id_localitate, v_id_judet, v_id_tara
from (select id_loc, id_judet, id_tara, rownum rn
from syn_nom_localitati l
where id_judet = v_id_judet
and inactiv = 0
and sters = 0
order by localitate)
where rn = 1;
exception
when NO_DATA_FOUND then
v_id_localitate := N_ID_LOCALITATE_DEFAULT;
v_id_judet := N_ID_JUD_DEFAULT;
v_id_tara := N_ID_TARA_DEFAULT;
end;
end;
BEGIN
pack_def.adauga_adresa_partener2(tnId_part => p_id_part,
tcDenumire_adresa => NULL,
tnDA_apare => 0,
tcStrada => v_strada,
tcNumar => v_numar,
tcBloc => NULL,
tcScara => NULL,
tcApart => NULL,
tnEtaj => NULL,
tnId_loc => v_id_localitate,
tcLocalitate => v_localitate,
tnId_judet => v_id_judet,
tnCodpostal => NULL,
tnId_tara => v_id_tara,
tcTelefon1 => p_phone,
tcTelefon2 => NULL,
tcFax => NULL,
tcEmail => p_email,
tcWeb => NULL,
tnPrincipala => to_char(v_principala),
tnInactiv => 0,
tnId_util => C_ID_UTIL_SISTEM,
tnIdAdresa => p_id_adresa);
IF p_id_adresa IS NOT NULL AND p_id_adresa > 0 THEN
-- pINFO('Adresa adaugata cu succes. ID_ADRESA=' || p_id_adresa, 'IMPORT_PARTENERI');
NULL;
ELSE
g_last_error := 'WARNING: pack_def.adauga_adresa_partener2 a returnat ID invalid: ' ||
NVL(TO_CHAR(p_id_adresa), 'NULL');
-- pINFO('WARNING: pack_def.adauga_adresa_partener2 a returnat ID invalid: ' || NVL(TO_CHAR(p_id_adresa), 'NULL'), 'IMPORT_PARTENERI');
END IF;
EXCEPTION
WHEN OTHERS THEN
g_last_error := 'ERROR la adaugarea adresei prin pack_def: ' ||
SQLERRM;
-- pINFO('ERROR la adaugarea adresei prin pack_def: ' || SQLERRM, 'IMPORT_PARTENERI');
-- Nu raisam exceptia pentru adresa, partenerii pot exista fara adresa
-- pINFO('Partenerul a fost creat, dar adresa nu a putut fi adaugata', 'IMPORT_PARTENERI');
END;
END IF;
<<sfarsit>>
null;
end;
END PACK_IMPORT_PARTENERI;
/

View File

@@ -1,661 +0,0 @@
CREATE OR REPLACE PACKAGE PACK_IMPORT_COMENZI AS
-- Variabila package pentru ultima eroare (pentru orchestrator VFP)
g_last_error VARCHAR2(4000);
-- Procedura pentru importul complet al unei comenzi
PROCEDURE importa_comanda(p_nr_comanda_ext IN VARCHAR2,
p_data_comanda IN DATE,
p_id_partener IN NUMBER,
p_json_articole IN CLOB,
p_id_adresa_livrare IN NUMBER DEFAULT NULL,
p_id_adresa_facturare IN NUMBER DEFAULT NULL,
p_id_pol IN NUMBER DEFAULT NULL,
p_id_sectie IN NUMBER DEFAULT NULL,
p_id_gestiune IN VARCHAR2 DEFAULT NULL,
p_kit_mode IN VARCHAR2 DEFAULT NULL,
p_id_pol_productie IN NUMBER DEFAULT NULL,
p_kit_discount_codmat IN VARCHAR2 DEFAULT NULL,
p_kit_discount_id_pol IN NUMBER DEFAULT NULL,
v_id_comanda OUT NUMBER);
-- Functii pentru managementul erorilor (pentru orchestrator VFP)
FUNCTION get_last_error RETURN VARCHAR2;
PROCEDURE clear_error;
END PACK_IMPORT_COMENZI;
/
CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
-- ====================================================================
-- PACK_IMPORT_COMENZI
-- Package pentru importul comenzilor din platforme web (GoMag, etc.)
-- in sistemul ROA Oracle.
--
-- Dependinte:
-- Packages: PACK_COMENZI (adauga_comanda, adauga_articol_comanda)
-- pljson (pljson_list, pljson) - instalat in CONTAFIN_ORACLE,
-- accesat prin PUBLIC SYNONYM
-- Tabele: ARTICOLE_TERTI (mapari SKU -> CODMAT)
-- NOM_ARTICOLE (nomenclator articole ROA)
-- COMENZI (verificare duplicat comanda_externa)
-- CRM_POLITICI_PRETURI (flag PRETURI_CU_TVA per politica)
-- CRM_POLITICI_PRET_ART (preturi componente kituri)
-- 20.03.2026 - dual policy vanzare/productie, kit pricing distributed/separate_line, SKU→CODMAT via ARTICOLE_TERTI
-- 20.03.2026 - kit discount deferred cross-kit (separate_line, merge-on-collision)
-- 20.03.2026 - merge_or_insert_articol: merge cantitati cand kit+individual au acelasi articol/pret
-- 20.03.2026 - kit pricing extins pt reambalari single-component (cantitate_roa > 1)
-- 21.03.2026 - diagnostic detaliat discount kit (id_pol, id_art, codmat in eroare)
-- 21.03.2026 - fix discount amount: v_disc_amt e per-kit, nu se imparte la v_cantitate_web
-- 25.03.2026 - skip negative kit discount (markup), ROUND prices to nzecimale_pretv
-- 25.03.2026 - kit discount inserat per-kit sub componente (nu deferred cross-kit)
-- ====================================================================
-- Constante pentru configurare
c_id_util CONSTANT NUMBER := -3; -- Sistem
c_interna CONSTANT NUMBER := 2; -- Comenzi de la client (web)
-- Tipuri pentru kit pricing (accesibile in toate procedurile din body)
TYPE t_kit_component IS RECORD (
codmat VARCHAR2(50),
id_articol NUMBER,
cantitate_roa NUMBER,
pret_cu_tva NUMBER,
ptva NUMBER,
id_pol_comp NUMBER,
value_total NUMBER
);
TYPE t_kit_components IS TABLE OF t_kit_component INDEX BY PLS_INTEGER;
-- ================================================================
-- Functii helper pentru managementul erorilor
-- ================================================================
FUNCTION get_last_error RETURN VARCHAR2 IS
BEGIN
RETURN g_last_error;
END get_last_error;
PROCEDURE clear_error IS
BEGIN
g_last_error := NULL;
END clear_error;
-- ================================================================
-- Functie helper: selecteaza id_articol corect pentru un CODMAT
-- Prioritate: sters=0 AND inactiv=0, preferinta stoc, MAX(id_articol) fallback
-- ================================================================
FUNCTION resolve_id_articol(p_codmat IN VARCHAR2, p_id_gest IN VARCHAR2) RETURN NUMBER IS
v_result NUMBER;
BEGIN
IF p_id_gest IS NOT NULL THEN
-- Cu gestiuni specifice (CSV: "1,3") — split in subquery pentru IN clause
BEGIN
SELECT id_articol INTO v_result FROM (
SELECT na.id_articol
FROM nom_articole na
WHERE na.codmat = p_codmat AND na.sters = 0 AND na.inactiv = 0
ORDER BY
CASE WHEN EXISTS (
SELECT 1 FROM stoc s
WHERE s.id_articol = na.id_articol
AND s.id_gestiune IN (
SELECT TO_NUMBER(REGEXP_SUBSTR(p_id_gest, '[^,]+', 1, LEVEL))
FROM DUAL
CONNECT BY LEVEL <= REGEXP_COUNT(p_id_gest, ',') + 1
)
AND s.an = EXTRACT(YEAR FROM SYSDATE)
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
AND s.cants + s.cant - s.cante > 0
) THEN 0 ELSE 1 END,
na.id_articol DESC
) WHERE ROWNUM = 1;
EXCEPTION WHEN NO_DATA_FOUND THEN v_result := NULL;
END;
ELSE
-- Fara gestiune — cauta stoc in orice gestiune
BEGIN
SELECT id_articol INTO v_result FROM (
SELECT na.id_articol
FROM nom_articole na
WHERE na.codmat = p_codmat AND na.sters = 0 AND na.inactiv = 0
ORDER BY
CASE WHEN EXISTS (
SELECT 1 FROM stoc s
WHERE s.id_articol = na.id_articol
AND s.an = EXTRACT(YEAR FROM SYSDATE)
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
AND s.cants + s.cant - s.cante > 0
) THEN 0 ELSE 1 END,
na.id_articol DESC
) WHERE ROWNUM = 1;
EXCEPTION WHEN NO_DATA_FOUND THEN v_result := NULL;
END;
END IF;
RETURN v_result;
END resolve_id_articol;
-- ================================================================
-- Helper: merge-or-insert articol pe comanda
-- Daca aceeasi combinatie (ID_COMANDA, ID_ARTICOL, PTVA, PRET, SIGN(CANTITATE))
-- exista deja, aduna cantitatea; altfel insereaza linie noua.
-- Previne crash la duplicate cand acelasi articol apare din kit + individual.
-- ================================================================
PROCEDURE merge_or_insert_articol(
p_id_comanda IN NUMBER,
p_id_articol IN NUMBER,
p_id_pol IN NUMBER,
p_cantitate IN NUMBER,
p_pret IN NUMBER,
p_id_util IN NUMBER,
p_id_sectie IN NUMBER,
p_ptva IN NUMBER
) IS
v_cnt NUMBER;
BEGIN
SELECT COUNT(*) INTO v_cnt
FROM COMENZI_ELEMENTE
WHERE ID_COMANDA = p_id_comanda
AND ID_ARTICOL = p_id_articol
AND NVL(PTVA, 0) = NVL(p_ptva, 0)
AND PRET = p_pret
AND SIGN(CANTITATE) = SIGN(p_cantitate)
AND STERS = 0;
IF v_cnt > 0 THEN
UPDATE COMENZI_ELEMENTE
SET CANTITATE = CANTITATE + p_cantitate
WHERE ID_COMANDA = p_id_comanda
AND ID_ARTICOL = p_id_articol
AND NVL(PTVA, 0) = NVL(p_ptva, 0)
AND PRET = p_pret
AND SIGN(CANTITATE) = SIGN(p_cantitate)
AND STERS = 0
AND ROWNUM = 1;
ELSE
PACK_COMENZI.adauga_articol_comanda(
V_ID_COMANDA => p_id_comanda,
V_ID_ARTICOL => p_id_articol,
V_ID_POL => p_id_pol,
V_CANTITATE => p_cantitate,
V_PRET => p_pret,
V_ID_UTIL => p_id_util,
V_ID_SECTIE => p_id_sectie,
V_PTVA => p_ptva);
END IF;
END merge_or_insert_articol;
-- ================================================================
-- Procedura principala pentru importul unei comenzi
-- ================================================================
PROCEDURE importa_comanda(p_nr_comanda_ext IN VARCHAR2,
p_data_comanda IN DATE,
p_id_partener IN NUMBER,
p_json_articole IN CLOB,
p_id_adresa_livrare IN NUMBER DEFAULT NULL,
p_id_adresa_facturare IN NUMBER DEFAULT NULL,
p_id_pol IN NUMBER DEFAULT NULL,
p_id_sectie IN NUMBER DEFAULT NULL,
p_id_gestiune IN VARCHAR2 DEFAULT NULL,
p_kit_mode IN VARCHAR2 DEFAULT NULL,
p_id_pol_productie IN NUMBER DEFAULT NULL,
p_kit_discount_codmat IN VARCHAR2 DEFAULT NULL,
p_kit_discount_id_pol IN NUMBER DEFAULT NULL,
v_id_comanda OUT NUMBER) IS
v_data_livrare DATE;
v_sku VARCHAR2(100);
v_cantitate_web NUMBER;
v_pret_web NUMBER;
v_vat NUMBER;
v_articole_procesate NUMBER := 0;
v_articole_eroare NUMBER := 0;
v_articol_count NUMBER := 0;
-- Variabile pentru cautare articol
v_found_mapping BOOLEAN;
v_id_articol NUMBER;
v_codmat VARCHAR2(50);
v_cantitate_roa NUMBER;
v_pret_unitar NUMBER;
v_id_pol_articol NUMBER; -- id_pol per articol (din JSON), prioritar fata de p_id_pol
-- Variabile kit pricing
v_kit_count NUMBER := 0;
v_max_cant_roa NUMBER := 1;
v_kit_comps t_kit_components;
v_sum_list_prices NUMBER;
v_discount_total NUMBER;
v_discount_share NUMBER;
v_pret_ajustat NUMBER;
v_discount_allocated NUMBER;
-- Zecimale pret vanzare (din optiuni firma, default 2)
v_nzec_pretv PLS_INTEGER := NVL(TO_NUMBER(pack_sesiune.getoptiunefirma(USER, 'PPRETV')), 2);
-- pljson
l_json_articole CLOB := p_json_articole;
v_json_arr pljson_list;
v_json_obj pljson;
BEGIN
-- Resetare eroare la inceputul procesarii
clear_error;
-- Validari de baza
IF p_nr_comanda_ext IS NULL OR p_id_partener IS NULL THEN
g_last_error := 'IMPORTA_COMANDA ' || NVL(p_nr_comanda_ext, 'NULL') ||
': Parametri obligatorii lipsa';
GOTO SFARSIT;
END IF;
-- Verifica daca comanda nu exista deja
BEGIN
SELECT id_comanda
INTO v_id_comanda
FROM comenzi
WHERE comanda_externa = p_nr_comanda_ext
AND sters = 0;
IF v_id_comanda IS NOT NULL THEN
GOTO sfarsit;
END IF;
EXCEPTION
WHEN NO_DATA_FOUND THEN
NULL; -- Normal, comanda nu exista
END;
-- Calculeaza data de livrare (comanda + 1 zi)
v_data_livrare := p_data_comanda + 1;
-- STEP 1: Creeaza comanda
PACK_COMENZI.adauga_comanda(V_NR_COMANDA => p_nr_comanda_ext,
V_DATA_COMANDA => p_data_comanda,
V_ID => p_id_partener,
V_DATA_LIVRARE => v_data_livrare,
V_PROC_DISCOUNT => 0,
V_INTERNA => c_interna,
V_ID_UTIL => c_id_util,
V_ID_SECTIE => p_id_sectie,
V_ID_ADRESA_FACTURARE => p_id_adresa_facturare,
V_ID_ADRESA_LIVRARE => p_id_adresa_livrare,
V_ID_CODCLIENT => NULL,
V_COMANDA_EXTERNA => p_nr_comanda_ext,
V_ID_CTR => NULL,
V_ID_COMANDA => v_id_comanda);
IF v_id_comanda IS NULL OR v_id_comanda <= 0 THEN
g_last_error := 'IMPORTA_COMANDA ' || p_nr_comanda_ext ||
': PACK_COMENZI.adauga_comanda a returnat ID invalid';
GOTO sfarsit;
END IF;
-- STEP 2: Proceseaza articolele din JSON folosind pljson
-- Suporta atat array "[{...},{...}]" cat si obiect singular "{...}"
IF LTRIM(l_json_articole) LIKE '[%' THEN
v_json_arr := pljson_list(l_json_articole);
ELSE
v_json_arr := pljson_list('[' || l_json_articole || ']');
END IF;
FOR i IN 1 .. v_json_arr.count LOOP
v_articol_count := v_articol_count + 1;
v_json_obj := pljson(v_json_arr.get(i));
BEGIN
-- Extrage datele folosind pljson (valorile vin ca string din json magazin web)
v_sku := v_json_obj.get_string('sku');
v_cantitate_web := TO_NUMBER(v_json_obj.get_string('quantity'));
v_pret_web := TO_NUMBER(v_json_obj.get_string('price'));
v_vat := TO_NUMBER(v_json_obj.get_string('vat'));
-- id_pol per articol (optional, pentru transport/discount cu politica separata)
BEGIN
v_id_pol_articol := TO_NUMBER(v_json_obj.get_string('id_pol'));
EXCEPTION
WHEN OTHERS THEN v_id_pol_articol := NULL;
END;
-- STEP 3: Gaseste articolele ROA pentru acest SKU
v_found_mapping := FALSE;
-- Numara randurile ARTICOLE_TERTI pentru a detecta kituri (>1 rand = set compus)
SELECT COUNT(*), NVL(MAX(at.cantitate_roa), 1)
INTO v_kit_count, v_max_cant_roa
FROM articole_terti at
WHERE at.sku = v_sku
AND at.activ = 1
AND at.sters = 0;
IF ((v_kit_count > 1) OR (v_kit_count = 1 AND v_max_cant_roa > 1))
AND p_kit_mode IS NOT NULL THEN
-- ============================================================
-- KIT PRICING: set compus (>1 componente) sau reambalare (cantitate_roa>1), mod activ
-- Prima trecere: colecteaza componente + preturi din politici
-- ============================================================
v_found_mapping := TRUE;
v_kit_comps.DELETE;
v_sum_list_prices := 0;
DECLARE
v_comp_idx PLS_INTEGER := 0;
v_cont_vanz VARCHAR2(20);
v_preturi_fl NUMBER;
v_pret_val NUMBER;
v_proc_tva NUMBER;
BEGIN
FOR rec IN (SELECT at.codmat, at.cantitate_roa
FROM articole_terti at
WHERE at.sku = v_sku
AND at.activ = 1
AND at.sters = 0
ORDER BY at.codmat) LOOP
v_comp_idx := v_comp_idx + 1;
v_kit_comps(v_comp_idx).codmat := rec.codmat;
v_kit_comps(v_comp_idx).cantitate_roa := rec.cantitate_roa;
v_kit_comps(v_comp_idx).id_articol :=
resolve_id_articol(rec.codmat, p_id_gestiune);
IF v_kit_comps(v_comp_idx).id_articol IS NULL THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Articol activ negasit pentru CODMAT: ' || rec.codmat;
v_kit_comps(v_comp_idx).pret_cu_tva := 0;
v_kit_comps(v_comp_idx).ptva := ROUND(v_vat);
v_kit_comps(v_comp_idx).id_pol_comp := NVL(v_id_pol_articol, p_id_pol);
v_kit_comps(v_comp_idx).value_total := 0;
CONTINUE;
END IF;
-- Determina id_pol_comp: cont 341/345 → politica productie, altfel vanzare
BEGIN
SELECT NVL(na.cont, '') INTO v_cont_vanz
FROM nom_articole na
WHERE na.id_articol = v_kit_comps(v_comp_idx).id_articol
AND ROWNUM = 1;
EXCEPTION WHEN OTHERS THEN v_cont_vanz := '';
END;
IF v_cont_vanz IN ('341', '345') AND p_id_pol_productie IS NOT NULL THEN
v_kit_comps(v_comp_idx).id_pol_comp := p_id_pol_productie;
ELSE
v_kit_comps(v_comp_idx).id_pol_comp := NVL(v_id_pol_articol, p_id_pol);
END IF;
-- Query flag PRETURI_CU_TVA pentru aceasta politica
BEGIN
SELECT NVL(pp.preturi_cu_tva, 0) INTO v_preturi_fl
FROM crm_politici_preturi pp
WHERE pp.id_pol = v_kit_comps(v_comp_idx).id_pol_comp;
EXCEPTION WHEN OTHERS THEN v_preturi_fl := 0;
END;
-- Citeste PRET si PROC_TVAV din crm_politici_pret_art
BEGIN
SELECT ppa.pret, NVL(ppa.proc_tvav, 1)
INTO v_pret_val, v_proc_tva
FROM crm_politici_pret_art ppa
WHERE ppa.id_pol = v_kit_comps(v_comp_idx).id_pol_comp
AND ppa.id_articol = v_kit_comps(v_comp_idx).id_articol
AND ROWNUM = 1;
-- V_PRET always WITH TVA
IF v_preturi_fl = 1 THEN
v_kit_comps(v_comp_idx).pret_cu_tva := v_pret_val;
ELSE
v_kit_comps(v_comp_idx).pret_cu_tva := v_pret_val * v_proc_tva;
END IF;
v_kit_comps(v_comp_idx).ptva := ROUND((v_proc_tva - 1) * 100);
EXCEPTION WHEN OTHERS THEN
v_kit_comps(v_comp_idx).pret_cu_tva := 0;
v_kit_comps(v_comp_idx).ptva := ROUND(v_vat);
END;
v_kit_comps(v_comp_idx).value_total :=
v_kit_comps(v_comp_idx).pret_cu_tva * v_kit_comps(v_comp_idx).cantitate_roa;
v_sum_list_prices := v_sum_list_prices + v_kit_comps(v_comp_idx).value_total;
END LOOP;
END; -- end prima trecere
-- Discount = suma liste - pret web (poate fi negativ = markup)
v_discount_total := v_sum_list_prices - v_pret_web;
-- ============================================================
-- A doua trecere: inserare in functie de mod
-- ============================================================
IF p_kit_mode = 'distributed' THEN
-- Mode A: distribui discountul proportional in pretul fiecarei componente
v_discount_allocated := 0;
FOR i_comp IN 1 .. v_kit_comps.COUNT LOOP
IF v_kit_comps(i_comp).id_articol IS NOT NULL THEN
-- Ultimul articol valid primeste remainder pentru precizie exacta
IF i_comp = v_kit_comps.LAST THEN
v_discount_share := v_discount_total - v_discount_allocated;
ELSE
IF v_sum_list_prices != 0 THEN
v_discount_share := v_discount_total *
(v_kit_comps(i_comp).value_total / v_sum_list_prices);
ELSE
v_discount_share := 0;
END IF;
v_discount_allocated := v_discount_allocated + v_discount_share;
END IF;
-- pret_ajustat = pret_cu_tva - discount_share / cantitate_roa
v_pret_ajustat := ROUND(
v_kit_comps(i_comp).pret_cu_tva -
(v_discount_share / v_kit_comps(i_comp).cantitate_roa),
v_nzec_pretv);
BEGIN
merge_or_insert_articol(
p_id_comanda => v_id_comanda,
p_id_articol => v_kit_comps(i_comp).id_articol,
p_id_pol => v_kit_comps(i_comp).id_pol_comp,
p_cantitate => v_kit_comps(i_comp).cantitate_roa * v_cantitate_web,
p_pret => v_pret_ajustat,
p_id_util => c_id_util,
p_id_sectie => p_id_sectie,
p_ptva => v_kit_comps(i_comp).ptva);
v_articole_procesate := v_articole_procesate + 1;
EXCEPTION
WHEN OTHERS THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Eroare adaugare kit component (A) ' ||
v_kit_comps(i_comp).codmat || ': ' || SQLERRM;
END;
END IF;
END LOOP;
ELSIF p_kit_mode = 'separate_line' THEN
-- Mode B: componente la pret plin, discount per-kit imediat sub componente
DECLARE
TYPE t_vat_discount IS TABLE OF NUMBER INDEX BY PLS_INTEGER;
v_vat_disc t_vat_discount;
v_vat_key PLS_INTEGER;
v_vat_disc_alloc NUMBER;
v_disc_amt NUMBER;
BEGIN
-- Inserare componente la pret plin + acumulare discount pe cota TVA (per kit)
FOR i_comp IN 1 .. v_kit_comps.COUNT LOOP
IF v_kit_comps(i_comp).id_articol IS NOT NULL THEN
BEGIN
merge_or_insert_articol(
p_id_comanda => v_id_comanda,
p_id_articol => v_kit_comps(i_comp).id_articol,
p_id_pol => v_kit_comps(i_comp).id_pol_comp,
p_cantitate => v_kit_comps(i_comp).cantitate_roa * v_cantitate_web,
p_pret => v_kit_comps(i_comp).pret_cu_tva,
p_id_util => c_id_util,
p_id_sectie => p_id_sectie,
p_ptva => v_kit_comps(i_comp).ptva);
v_articole_procesate := v_articole_procesate + 1;
EXCEPTION
WHEN OTHERS THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Eroare adaugare kit component (B) ' ||
v_kit_comps(i_comp).codmat || ': ' || SQLERRM;
END;
-- Acumuleaza discountul pe cota TVA (per kit, local)
v_vat_key := v_kit_comps(i_comp).ptva;
IF v_sum_list_prices != 0 THEN
IF v_vat_disc.EXISTS(v_vat_key) THEN
v_vat_disc(v_vat_key) := v_vat_disc(v_vat_key) +
v_discount_total * (v_kit_comps(i_comp).value_total / v_sum_list_prices);
ELSE
v_vat_disc(v_vat_key) :=
v_discount_total * (v_kit_comps(i_comp).value_total / v_sum_list_prices);
END IF;
ELSE
IF NOT v_vat_disc.EXISTS(v_vat_key) THEN
v_vat_disc(v_vat_key) := 0;
END IF;
END IF;
END IF;
END LOOP;
-- Inserare imediata discount per kit (sub componentele kitului)
IF v_discount_total > 0 AND p_kit_discount_codmat IS NOT NULL THEN
DECLARE
v_disc_artid NUMBER;
BEGIN
v_disc_artid := resolve_id_articol(p_kit_discount_codmat, p_id_gestiune);
IF v_disc_artid IS NOT NULL THEN
v_vat_disc_alloc := 0;
v_vat_key := v_vat_disc.FIRST;
WHILE v_vat_key IS NOT NULL LOOP
-- Remainder trick per kit
IF v_vat_key = v_vat_disc.LAST THEN
v_disc_amt := v_discount_total - v_vat_disc_alloc;
ELSE
v_disc_amt := v_vat_disc(v_vat_key);
v_vat_disc_alloc := v_vat_disc_alloc + v_disc_amt;
END IF;
IF v_disc_amt > 0 THEN
BEGIN
PACK_COMENZI.adauga_articol_comanda(
V_ID_COMANDA => v_id_comanda,
V_ID_ARTICOL => v_disc_artid,
V_ID_POL => NVL(p_kit_discount_id_pol, p_id_pol),
V_CANTITATE => -1 * v_cantitate_web,
V_PRET => ROUND(v_disc_amt, v_nzec_pretv),
V_ID_UTIL => c_id_util,
V_ID_SECTIE => p_id_sectie,
V_PTVA => v_vat_key);
v_articole_procesate := v_articole_procesate + 1;
EXCEPTION
WHEN OTHERS THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Eroare linie discount kit TVA=' || v_vat_key ||
'% codmat=' || p_kit_discount_codmat || ': ' || SQLERRM;
END;
END IF;
v_vat_key := v_vat_disc.NEXT(v_vat_key);
END LOOP;
END IF;
END;
END IF;
END; -- end mode B per-kit block
END IF; -- end kit mode branching
ELSE
-- ============================================================
-- MAPARE SIMPLA: 1 CODMAT, sau kit fara kit_mode activ
-- Pret = pret web / cantitate_roa (fara procent_pret)
-- ============================================================
FOR rec IN (SELECT at.codmat, at.cantitate_roa
FROM articole_terti at
WHERE at.sku = v_sku
AND at.activ = 1
AND at.sters = 0
ORDER BY at.codmat) LOOP
v_found_mapping := TRUE;
v_id_articol := resolve_id_articol(rec.codmat, p_id_gestiune);
IF v_id_articol IS NULL THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Articol activ negasit pentru CODMAT: ' || rec.codmat;
CONTINUE;
END IF;
v_cantitate_roa := rec.cantitate_roa * v_cantitate_web;
v_pret_unitar := CASE WHEN v_pret_web IS NOT NULL
THEN v_pret_web / rec.cantitate_roa
ELSE 0
END;
BEGIN
merge_or_insert_articol(p_id_comanda => v_id_comanda,
p_id_articol => v_id_articol,
p_id_pol => NVL(v_id_pol_articol, p_id_pol),
p_cantitate => v_cantitate_roa,
p_pret => v_pret_unitar,
p_id_util => c_id_util,
p_id_sectie => p_id_sectie,
p_ptva => v_vat);
v_articole_procesate := v_articole_procesate + 1;
EXCEPTION
WHEN OTHERS THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Eroare adaugare articol ' || rec.codmat || ': ' || SQLERRM;
END;
END LOOP;
-- Daca nu s-a gasit mapare in ARTICOLE_TERTI, cauta direct in NOM_ARTICOLE
IF NOT v_found_mapping THEN
v_id_articol := resolve_id_articol(v_sku, p_id_gestiune);
IF v_id_articol IS NULL THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'SKU negasit in ARTICOLE_TERTI si NOM_ARTICOLE (activ): ' || v_sku;
ELSE
v_codmat := v_sku;
v_pret_unitar := NVL(v_pret_web, 0);
BEGIN
PACK_COMENZI.adauga_articol_comanda(V_ID_COMANDA => v_id_comanda,
V_ID_ARTICOL => v_id_articol,
V_ID_POL => NVL(v_id_pol_articol, p_id_pol),
V_CANTITATE => v_cantitate_web,
V_PRET => v_pret_unitar,
V_ID_UTIL => c_id_util,
V_ID_SECTIE => p_id_sectie,
V_PTVA => v_vat);
v_articole_procesate := v_articole_procesate + 1;
EXCEPTION
WHEN OTHERS THEN
v_articole_eroare := v_articole_eroare + 1;
g_last_error := g_last_error || CHR(10) ||
'Eroare adaugare articol ' || v_sku ||
' (CODMAT: ' || v_codmat || '): ' || SQLERRM;
END;
END IF;
END IF;
END IF; -- end kit vs simplu
END; -- End BEGIN block pentru articol individual
END LOOP;
-- Verifica daca s-au procesat articole cu succes
IF v_articole_procesate = 0 THEN
g_last_error := g_last_error || CHR(10) || 'IMPORTA_COMANDA ' ||
p_nr_comanda_ext ||
': Niciun articol nu a fost procesat cu succes';
END IF;
<<SFARSIT>>
IF g_last_error IS NOT NULL THEN
RAISE_APPLICATION_ERROR(-20001, g_last_error);
END IF;
END importa_comanda;
END PACK_IMPORT_COMENZI;
/

View File

@@ -1,12 +0,0 @@
-- ====================================================================
-- 07_alter_articole_terti_sters.sql
-- Adauga coloana "sters" in ARTICOLE_TERTI pentru soft-delete real
-- (separat de "activ" care e toggle business)
-- ====================================================================
ALTER TABLE ARTICOLE_TERTI ADD sters NUMBER(1) DEFAULT 0;
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_sters CHECK (sters IN (0, 1));
-- Verifica ca toate randurile existente au sters=0
-- SELECT COUNT(*) FROM ARTICOLE_TERTI WHERE sters IS NULL;
-- UPDATE ARTICOLE_TERTI SET sters = 0 WHERE sters IS NULL;

View File

@@ -1,3 +0,0 @@
-- Run AFTER deploying Python code changes and confirming new pricing works
-- Removes the deprecated procent_pret column from ARTICOLE_TERTI
ALTER TABLE ARTICOLE_TERTI DROP COLUMN procent_pret;

File diff suppressed because it is too large Load Diff

View File

@@ -1,54 +0,0 @@
-- ====================================================================
-- 09_articole_terti_050.sql
-- Mapări ARTICOLE_TERTI cu cantitate_roa = 0.5 pentru articole
-- unde unitatea web (50 buc/set) ≠ unitatea ROA (100 buc/set).
--
-- Efect: price sync va calcula pret_crm = pret_web / 0.5,
-- iar kit pricing va folosi prețul corect per set ROA.
--
-- 25.03.2026 - creat pentru fix discount negativ kit pahare
-- ====================================================================
-- Pahar 6oz Coffee Coffee SIBA 50buc (GoMag) → 100buc/set (ROA)
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
SELECT '1708828', '1708828', 0.5, 1, 0, SYSDATE, -3 FROM dual
WHERE NOT EXISTS (
SELECT 1 FROM articole_terti WHERE sku = '1708828' AND codmat = '1708828' AND sters = 0
);
-- Pahar 8oz Coffee Coffee SIBA 50buc → 100buc/set
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
SELECT '528795', '528795', 0.5, 1, 0, SYSDATE, -3 FROM dual
WHERE NOT EXISTS (
SELECT 1 FROM articole_terti WHERE sku = '528795' AND codmat = '528795' AND sters = 0
);
-- Pahar 8oz Tchibo 50buc → 100buc/set
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
SELECT '58', '58', 0.5, 1, 0, SYSDATE, -3 FROM dual
WHERE NOT EXISTS (
SELECT 1 FROM articole_terti WHERE sku = '58' AND codmat = '58' AND sters = 0
);
-- Pahar 7oz Lavazza SIBA 50buc → 100buc/set
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
SELECT '51', '51', 0.5, 1, 0, SYSDATE, -3 FROM dual
WHERE NOT EXISTS (
SELECT 1 FROM articole_terti WHERE sku = '51' AND codmat = '51' AND sters = 0
);
-- Pahar 8oz Albastru JND 50buc → 100buc/set
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
SELECT '105712338826', '105712338826', 0.5, 1, 0, SYSDATE, -3 FROM dual
WHERE NOT EXISTS (
SELECT 1 FROM articole_terti WHERE sku = '105712338826' AND codmat = '105712338826' AND sters = 0
);
-- Pahar 8oz Paris JND 50buc → 100buc/set
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
SELECT '10573080', '10573080', 0.5, 1, 0, SYSDATE, -3 FROM dual
WHERE NOT EXISTS (
SELECT 1 FROM articole_terti WHERE sku = '10573080' AND codmat = '10573080' AND sters = 0
);
COMMIT;

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

View File

@@ -1,12 +0,0 @@
fastapi==0.115.6
uvicorn[standard]==0.34.0
jinja2==3.1.4
python-multipart==0.0.18
oracledb==2.5.1
aiosqlite==0.20.0
apscheduler==3.10.4
python-dotenv==1.0.1
pydantic-settings==2.7.1
httpx==0.28.1
pytest>=8.0.0
pytest-asyncio>=0.23.0

View File

@@ -1,122 +0,0 @@
# Tests Directory - Phase 1 Validation
## Test Files
### ✅ `test_final_success.py`
**Purpose:** Complete end-to-end validation test for P1-004
- Tests PACK_IMPORT_PARTENERI partner creation
- Tests gaseste_articol_roa article mapping
- Tests importa_comanda complete workflow
- **Status:** 85% FUNCTIONAL - Core components validated
### 🔧 `check_packages.py`
**Purpose:** Oracle package status checking utility
- Checks compilation status of all packages
- Lists VALID/INVALID package bodies
- Validates critical packages: PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI, PACK_JSON, PACK_COMENZI
### 🔧 `check_table_structure.py`
**Purpose:** Oracle table structure validation utility
- Shows table columns and constraints
- Validates FK relationships
- Confirms COMENZI table structure and schema MARIUSM_AUTO
---
## 🚀 How to Run Tests
### Method 1: Inside Docker Container (RECOMMENDED)
```bash
# Run all tests inside the gomag-admin container where TNS configuration is correct
docker exec gomag-admin python3 /app/tests/check_packages.py
docker exec gomag-admin python3 /app/tests/check_table_structure.py
docker exec gomag-admin python3 /app/tests/test_final_success.py
```
### Method 2: Local Environment (Advanced)
```bash
# Requires proper Oracle client setup and TNS configuration
cd /mnt/e/proiecte/vending/gomag-vending/api
source .env
python3 tests/check_packages.py
python3 tests/check_table_structure.py
python3 tests/test_final_success.py
```
**Note:** Method 1 is recommended because:
- Oracle Instant Client is properly configured in container
- TNS configuration is available at `/app/tnsnames.ora`
- Environment variables are loaded automatically
- Avoids line ending issues in .env file
---
## 📊 Latest Test Results (10 septembrie 2025, 11:04)
### ✅ CRITICAL COMPONENTS - 100% FUNCTIONAL:
- **PACK_IMPORT_PARTENERI** - ✅ VALID (header + body)
- **PACK_IMPORT_COMENZI** - ✅ VALID (header + body)
- **PACK_JSON** - ✅ VALID (header + body)
- **PACK_COMENZI** - ✅ VALID (header + body) - **FIXED: V_INTERNA=2 issue resolved**
### ✅ COMPREHENSIVE TEST RESULTS (test_complete_import.py):
1. **Article Mapping:** ✅ Found 3 mappings for CAFE100
2. **JSON Parsing:** ✅ Successfully parsed test articles
3. **Partner Management:** ✅ Created partner ID 894
4. **Order Import:** ⚠️ Partial success - order creation works, article processing needs optimization
### 🔧 PACK_COMENZI ISSUES RESOLVED:
- **✅ V_INTERNA Parameter:** Fixed to use value 2 for client orders
- **✅ FK Constraints:** ID_GESTIUNE=NULL, ID_SECTIE=2 for INTERNA=2
- **✅ Partner Validation:** Proper partner ID validation implemented
- **✅ CASE Statement:** No more "CASE not found" errors
### ⚠️ REMAINING MINOR ISSUE:
- `importa_comanda()` creates orders successfully but returns "Niciun articol nu a fost procesat cu succes"
- **Root Cause:** Likely article processing loop optimization needed in package
- **Impact:** Minimal - orders and partners are created correctly
- **Status:** 95% functional, suitable for Phase 2 VFP Integration
### 🎯 PHASE 1 CONCLUSION: 95% FUNCTIONAL
**✅ READY FOR PHASE 2 VFP INTEGRATION** - All critical components validated and operational.
---
## 📁 Current Test Files
### ✅ `test_complete_import.py` - **PRIMARY TEST**
**Purpose:** Complete end-to-end validation for Phase 1 completion
- **Setup:** Automatically runs setup_test_data.sql
- Tests partner creation/retrieval
- Tests article mapping (CAFE100 → CAF01)
- Tests JSON parsing
- Tests complete order import workflow
- **Cleanup:** Automatically runs teardown_test_data.sql
- **Status:** 95% SUCCESSFUL (3/4 components pass)
### 🔧 `check_packages.py`
**Purpose:** Oracle package status validation utility
- Validates PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI, PACK_JSON, PACK_COMENZI compilation
### 🔧 `check_table_structure.py`
**Purpose:** Database structure validation utility
- Validates COMENZI table structure and FK constraints
### 🔧 `setup_test_data.sql`
**Purpose:** Test data initialization (used by test_complete_import.py)
- **Disables** `trg_NOM_ARTICOLE_befoins` trigger to allow specific ID_ARTICOL values
- Creates test articles in NOM_ARTICOLE (CAF01, LAV001, TEST001) with IDs 9999001-9999003
- Creates SKU mappings in ARTICOLE_TERTI (CAFE100→CAF01, 8000070028685→LAV001)
- **Re-enables** trigger after test data creation
### 🔧 `teardown_test_data.sql`
**Purpose:** Test data cleanup (used by test_complete_import.py)
- Removes test articles from NOM_ARTICOLE
- Removes test mappings from ARTICOLE_TERTI
- Removes test orders and partners created during testing
---
**Final Update:** 10 septembrie 2025, 11:20 (Phase 1 completion - 95% functional)
**Removed:** 8 temporary/redundant files
**Kept:** 5 essential files (1 primary test + 4 utilities)

View File

View File

@@ -1,102 +0,0 @@
#!/usr/bin/env python3
"""
Check Oracle packages and database structure
"""
import oracledb
import os
from dotenv import load_dotenv
# Load environment
load_dotenv('.env')
# Oracle configuration
user = os.environ['ORACLE_USER']
password = os.environ['ORACLE_PASSWORD']
dsn = os.environ['ORACLE_DSN']
# Initialize Oracle client (thick mode)
try:
instantclient_path = os.environ.get('INSTANTCLIENTPATH', '/opt/oracle/instantclient_23_9')
oracledb.init_oracle_client(lib_dir=instantclient_path)
print(f"✅ Oracle thick mode initialized: {instantclient_path}")
except Exception as e:
print(f"⚠️ Oracle thick mode failed, using thin mode: {e}")
def check_packages():
"""Check available packages in Oracle"""
print("\n🔍 Checking Oracle Packages...")
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
# Check user packages
cur.execute("""
SELECT object_name, object_type, status
FROM user_objects
WHERE object_type IN ('PACKAGE', 'PACKAGE BODY')
ORDER BY object_name, object_type
""")
packages = cur.fetchall()
if packages:
print(f"Found {len(packages)} package objects:")
for pkg in packages:
print(f" - {pkg[0]} ({pkg[1]}) - {pkg[2]}")
else:
print("❌ No packages found in current schema")
# Check if specific packages exist
print("\n🔍 Checking specific packages...")
for pkg_name in ['IMPORT_PARTENERI', 'IMPORT_COMENZI']:
cur.execute("""
SELECT COUNT(*) FROM user_objects
WHERE object_name = ? AND object_type = 'PACKAGE'
""", [pkg_name])
exists = cur.fetchone()[0] > 0
print(f" - {pkg_name}: {'✅ EXISTS' if exists else '❌ NOT FOUND'}")
except Exception as e:
print(f"❌ Check packages failed: {e}")
def check_tables():
"""Check available tables"""
print("\n🔍 Checking Oracle Tables...")
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
# Check main tables
tables_to_check = ['ARTICOLE_TERTI', 'PARTENERI', 'COMENZI', 'NOM_ARTICOLE']
for table_name in tables_to_check:
cur.execute("""
SELECT COUNT(*) FROM user_tables
WHERE table_name = ?
""", [table_name])
exists = cur.fetchone()[0] > 0
if exists:
cur.execute(f"SELECT COUNT(*) FROM {table_name}")
count = cur.fetchone()[0]
print(f" - {table_name}: ✅ EXISTS ({count} records)")
else:
print(f" - {table_name}: ❌ NOT FOUND")
except Exception as e:
print(f"❌ Check tables failed: {e}")
def main():
"""Run all checks"""
print("🔍 Oracle Database Structure Check")
print("=" * 50)
check_packages()
check_tables()
if __name__ == "__main__":
main()

View File

@@ -1,72 +0,0 @@
#!/usr/bin/env python3
"""
Check COMENZI table structure
"""
import oracledb
import os
from dotenv import load_dotenv
load_dotenv('.env')
user = os.environ['ORACLE_USER']
password = os.environ['ORACLE_PASSWORD']
dsn = os.environ['ORACLE_DSN']
try:
instantclient_path = os.environ.get('INSTANTCLIENTPATH', '/opt/oracle/instantclient_23_9')
oracledb.init_oracle_client(lib_dir=instantclient_path)
except Exception as e:
pass
def check_table_structure():
"""Check COMENZI table columns"""
print("🔍 Checking COMENZI table structure...")
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
# Get table structure
cur.execute("""
SELECT
column_name,
data_type,
nullable,
data_length,
data_precision
FROM user_tab_columns
WHERE table_name = 'COMENZI'
ORDER BY column_id
""")
columns = cur.fetchall()
if columns:
print(f"\nCOMENZI table columns:")
for col in columns:
nullable = "NULL" if col[2] == 'Y' else "NOT NULL"
if col[1] == 'NUMBER' and col[4]:
type_info = f"{col[1]}({col[4]})"
elif col[3]:
type_info = f"{col[1]}({col[3]})"
else:
type_info = col[1]
print(f" {col[0]}: {type_info} - {nullable}")
# Look for partner-related columns
print(f"\nPartner-related columns:")
for col in columns:
if 'PART' in col[0] or 'CLIENT' in col[0]:
print(f" {col[0]}: {col[1]}")
except Exception as e:
print(f"❌ Check failed: {e}")
def main():
print("🔍 COMENZI Table Structure")
print("=" * 40)
check_table_structure()
if __name__ == "__main__":
main()

View File

@@ -1,196 +0,0 @@
"""
Playwright E2E test fixtures.
Starts the FastAPI app on a random port with test SQLite, no Oracle.
Includes console error collector and screenshot capture.
"""
import os
import sys
import tempfile
import pytest
import subprocess
import time
import socket
from pathlib import Path
# --- Screenshots directory ---
QA_REPORTS_DIR = Path(__file__).parents[3] / "qa-reports"
SCREENSHOTS_DIR = QA_REPORTS_DIR / "screenshots"
def _free_port():
with socket.socket() as s:
s.bind(('', 0))
return s.getsockname()[1]
def _app_is_running(url):
"""Check if app is already running at the given URL."""
try:
import urllib.request
urllib.request.urlopen(f"{url}/health", timeout=2)
return True
except Exception:
return False
@pytest.fixture(scope="session")
def app_url(request):
"""Use a running app if available (e.g. started by test.sh), otherwise start a subprocess.
When --base-url is provided or app is already running on :5003, use the live app.
This allows E2E tests to run against the real Oracle-backed app in ./test.sh full.
"""
# Check if --base-url was provided via pytest-playwright
base_url = request.config.getoption("--base-url", default=None)
# Try live app on :5003 first
live_url = base_url or "http://localhost:5003"
if _app_is_running(live_url):
yield live_url
return
# No live app — start subprocess with dummy Oracle (structure-only tests)
port = _free_port()
tmpdir = tempfile.mkdtemp()
sqlite_path = os.path.join(tmpdir, "e2e_test.db")
env = os.environ.copy()
env.update({
"FORCE_THIN_MODE": "true",
"SQLITE_DB_PATH": sqlite_path,
"ORACLE_DSN": "dummy",
"ORACLE_USER": "dummy",
"ORACLE_PASSWORD": "dummy",
"JSON_OUTPUT_DIR": tmpdir,
})
api_dir = os.path.join(os.path.dirname(__file__), "..", "..")
proc = subprocess.Popen(
[sys.executable, "-m", "uvicorn", "app.main:app", "--host", "127.0.0.1", "--port", str(port)],
cwd=api_dir,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
)
# Wait for startup (up to 15 seconds)
url = f"http://127.0.0.1:{port}"
for _ in range(30):
try:
import urllib.request
urllib.request.urlopen(f"{url}/health", timeout=1)
break
except Exception:
time.sleep(0.5)
else:
proc.kill()
stdout, stderr = proc.communicate()
raise RuntimeError(
f"App failed to start on port {port}.\n"
f"STDOUT: {stdout.decode()[-2000:]}\n"
f"STDERR: {stderr.decode()[-2000:]}"
)
yield url
proc.terminate()
try:
proc.wait(timeout=5)
except subprocess.TimeoutExpired:
proc.kill()
@pytest.fixture(scope="session")
def seed_test_data(app_url):
"""
Seed SQLite with test data via API calls.
Oracle is unavailable in E2E tests — only SQLite-backed pages are
fully functional. This fixture exists as a hook for future seeding;
for now E2E tests validate UI structure on empty-state pages.
"""
return app_url
# ---------------------------------------------------------------------------
# Console & Network Error Collectors
# ---------------------------------------------------------------------------
@pytest.fixture(scope="session")
def console_errors():
"""Session-scoped list collecting JS console errors across all tests."""
return []
@pytest.fixture(scope="session")
def network_errors():
"""Session-scoped list collecting HTTP 4xx/5xx responses across all tests."""
return []
@pytest.fixture(autouse=True)
def _attach_collectors(page, console_errors, network_errors, request):
"""Auto-attach console and network listeners to every test's page."""
test_errors = []
test_network = []
def on_console(msg):
if msg.type == "error":
entry = {"test": request.node.name, "text": msg.text, "type": "console.error"}
console_errors.append(entry)
test_errors.append(entry)
def on_pageerror(exc):
entry = {"test": request.node.name, "text": str(exc), "type": "pageerror"}
console_errors.append(entry)
test_errors.append(entry)
def on_response(response):
if response.status >= 400:
entry = {
"test": request.node.name,
"url": response.url,
"status": response.status,
"type": "network_error",
}
network_errors.append(entry)
test_network.append(entry)
page.on("console", on_console)
page.on("pageerror", on_pageerror)
page.on("response", on_response)
yield
# Remove listeners to avoid leaks
page.remove_listener("console", on_console)
page.remove_listener("pageerror", on_pageerror)
page.remove_listener("response", on_response)
# ---------------------------------------------------------------------------
# Screenshot on failure
# ---------------------------------------------------------------------------
@pytest.fixture(autouse=True)
def _screenshot_on_failure(page, request):
"""Take a screenshot when a test fails."""
yield
if request.node.rep_call and request.node.rep_call.failed:
SCREENSHOTS_DIR.mkdir(parents=True, exist_ok=True)
name = request.node.name.replace("/", "_").replace("::", "_")
path = SCREENSHOTS_DIR / f"FAIL-{name}.png"
try:
page.screenshot(path=str(path))
except Exception:
pass # page may be closed
@pytest.hookimpl(tryfirst=True, hookwrapper=True)
def pytest_runtest_makereport(item, call):
"""Store test result on the item for _screenshot_on_failure."""
outcome = yield
rep = outcome.get_result()
setattr(item, f"rep_{rep.when}", rep)

View File

@@ -1,175 +0,0 @@
"""
E2E verification: Dashboard page against the live app (localhost:5003).
pytestmark: e2e
Run with:
python -m pytest api/tests/e2e/test_dashboard_live.py -v --headed
This tests the LIVE app, not a test instance. Requires the app to be running.
"""
import pytest
from playwright.sync_api import sync_playwright, Page, expect
pytestmark = pytest.mark.e2e
BASE_URL = "http://localhost:5003"
@pytest.fixture(scope="module")
def browser_page():
"""Launch browser and yield a page connected to the live app."""
with sync_playwright() as p:
browser = p.chromium.launch(headless=True)
context = browser.new_context(viewport={"width": 1280, "height": 900})
page = context.new_page()
yield page
browser.close()
class TestDashboardPageLoad:
"""Verify dashboard page loads and shows expected structure."""
def test_dashboard_loads(self, browser_page: Page):
browser_page.goto(f"{BASE_URL}/")
browser_page.wait_for_load_state("networkidle")
expect(browser_page.locator("h4")).to_contain_text("Panou de Comanda")
def test_sync_control_visible(self, browser_page: Page):
expect(browser_page.locator("#btnStartSync")).to_be_visible()
expect(browser_page.locator("#syncStatusBadge")).to_be_visible()
def test_last_sync_card_populated(self, browser_page: Page):
"""The lastSyncBody should show data from previous runs."""
last_sync_date = browser_page.locator("#lastSyncDate")
expect(last_sync_date).to_be_visible()
text = last_sync_date.text_content()
assert text and text != "-", f"Expected last sync date to be populated, got: '{text}'"
def test_last_sync_imported_count(self, browser_page: Page):
imported_el = browser_page.locator("#lastSyncImported")
text = imported_el.text_content()
count = int(text) if text and text.isdigit() else 0
assert count >= 0, f"Expected imported count >= 0, got: {text}"
def test_last_sync_status_badge(self, browser_page: Page):
status_el = browser_page.locator("#lastSyncStatus .badge")
expect(status_el).to_be_visible()
text = status_el.text_content()
assert text in ("completed", "running", "failed"), f"Unexpected status: {text}"
class TestDashboardOrdersTable:
"""Verify orders table displays data from SQLite."""
def test_orders_table_has_rows(self, browser_page: Page):
"""Dashboard should show orders from previous sync runs."""
browser_page.goto(f"{BASE_URL}/")
browser_page.wait_for_load_state("networkidle")
# Wait for the orders to load (async fetch)
browser_page.wait_for_timeout(2000)
rows = browser_page.locator("#dashOrdersBody tr")
count = rows.count()
assert count > 0, "Expected at least 1 order row in dashboard table"
def test_orders_count_badges(self, browser_page: Page):
"""Filter badges should show counts."""
all_count = browser_page.locator("#dashCountAll").text_content()
assert all_count and int(all_count) > 0, f"Expected total count > 0, got: {all_count}"
def test_first_order_has_columns(self, browser_page: Page):
"""First row should have order number, date, customer, etc."""
first_row = browser_page.locator("#dashOrdersBody tr").first
cells = first_row.locator("td")
assert cells.count() >= 6, f"Expected at least 6 columns, got: {cells.count()}"
# Order number should be a code element
order_code = first_row.locator("td code").first
expect(order_code).to_be_visible()
def test_filter_imported(self, browser_page: Page):
"""Click 'Importate' filter and verify table updates."""
browser_page.locator("#dashFilterBtns button", has_text="Importate").click()
browser_page.wait_for_timeout(1000)
imported_count = browser_page.locator("#dashCountImported").text_content()
if imported_count and int(imported_count) > 0:
rows = browser_page.locator("#dashOrdersBody tr")
assert rows.count() > 0, "Expected imported orders to show"
# All visible rows should have 'Importat' badge
badges = browser_page.locator("#dashOrdersBody .badge.bg-success")
assert badges.count() > 0, "Expected green 'Importat' badges"
def test_filter_all_reset(self, browser_page: Page):
"""Click 'Toate' to reset filter."""
browser_page.locator("#dashFilterBtns button", has_text="Toate").click()
browser_page.wait_for_timeout(1000)
rows = browser_page.locator("#dashOrdersBody tr")
assert rows.count() > 0, "Expected orders after resetting filter"
class TestDashboardOrderDetail:
"""Verify order detail modal opens and shows data."""
def test_click_order_opens_modal(self, browser_page: Page):
browser_page.goto(f"{BASE_URL}/")
browser_page.wait_for_load_state("networkidle")
browser_page.wait_for_timeout(2000)
# Click the first order row
first_row = browser_page.locator("#dashOrdersBody tr").first
first_row.click()
browser_page.wait_for_timeout(1500)
# Modal should be visible
modal = browser_page.locator("#orderDetailModal")
expect(modal).to_be_visible()
# Order number should be populated
order_num = browser_page.locator("#detailOrderNumber").text_content()
assert order_num and order_num != "#", f"Expected order number in modal, got: {order_num}"
def test_modal_shows_customer(self, browser_page: Page):
customer = browser_page.locator("#detailCustomer").text_content()
assert customer and customer not in ("...", "-"), f"Expected customer name, got: {customer}"
def test_modal_shows_items(self, browser_page: Page):
items_rows = browser_page.locator("#detailItemsBody tr")
assert items_rows.count() > 0, "Expected at least 1 item in order detail"
def test_close_modal(self, browser_page: Page):
browser_page.locator("#orderDetailModal .btn-close").click()
browser_page.wait_for_timeout(500)
class TestDashboardAPIEndpoints:
"""Verify API endpoints return expected data."""
def test_api_dashboard_orders(self, browser_page: Page):
response = browser_page.request.get(f"{BASE_URL}/api/dashboard/orders")
assert response.ok, f"API returned {response.status}"
data = response.json()
assert "orders" in data, "Expected 'orders' key in response"
assert "counts" in data, "Expected 'counts' key in response"
assert len(data["orders"]) > 0, "Expected at least 1 order"
def test_api_sync_status(self, browser_page: Page):
response = browser_page.request.get(f"{BASE_URL}/api/sync/status")
assert response.ok
data = response.json()
assert "status" in data
assert "stats" in data
def test_api_sync_history(self, browser_page: Page):
response = browser_page.request.get(f"{BASE_URL}/api/sync/history?per_page=5")
assert response.ok
data = response.json()
assert "runs" in data
assert len(data["runs"]) > 0, "Expected at least 1 sync run"
def test_api_missing_skus(self, browser_page: Page):
response = browser_page.request.get(f"{BASE_URL}/api/validate/missing-skus")
assert response.ok
data = response.json()
assert "missing_skus" in data

View File

@@ -1,59 +0,0 @@
"""E2E: Logs page with per-order filtering and date display."""
import pytest
from playwright.sync_api import Page, expect
pytestmark = pytest.mark.e2e
@pytest.fixture(autouse=True)
def navigate_to_logs(page: Page, app_url: str):
page.goto(f"{app_url}/logs")
page.wait_for_load_state("networkidle")
def test_logs_page_loads(page: Page):
"""Verify the logs page renders with sync runs dropdown."""
expect(page.locator("h4")).to_contain_text("Jurnale Import")
expect(page.locator("#runsDropdown")).to_be_visible()
def test_sync_runs_dropdown_has_options(page: Page):
"""Verify the runs dropdown is populated (or has placeholder)."""
dropdown = page.locator("#runsDropdown")
expect(dropdown).to_be_visible()
# Dropdown should have at least the default option
options = dropdown.locator("option")
assert options.count() >= 1, "Expected at least one option in runs dropdown"
def test_filter_buttons_exist(page: Page):
"""Verify the log viewer section is initially hidden (no run selected yet)."""
viewer = page.locator("#logViewerSection")
expect(viewer).to_be_hidden()
def test_order_detail_modal_structure(page: Page):
"""Verify the order detail modal exists in DOM with required fields."""
modal = page.locator("#orderDetailModal")
expect(modal).to_be_attached()
expect(page.locator("#detailOrderNumber")).to_be_attached()
expect(page.locator("#detailCustomer")).to_be_attached()
expect(page.locator("#detailDate")).to_be_attached()
expect(page.locator("#detailItemsBody")).to_be_attached()
def test_quick_map_modal_structure(page: Page):
"""Verify quick map modal exists with multi-CODMAT support."""
modal = page.locator("#quickMapModal")
expect(modal).to_be_attached()
expect(page.locator("#qmSku")).to_be_attached()
expect(page.locator("#qmProductName")).to_be_attached()
expect(page.locator("#qmCodmatLines")).to_be_attached()
def test_text_log_toggle(page: Page):
"""Verify text log section is hidden initially and toggle button is in DOM."""
section = page.locator("#textLogSection")
expect(section).to_be_hidden()
# Toggle button lives inside logViewerSection which is also hidden
expect(page.locator("#btnShowTextLog")).to_be_attached()

View File

@@ -1,79 +0,0 @@
"""E2E: Mappings page with flat-row list, sorting, multi-CODMAT modal."""
import pytest
from playwright.sync_api import Page, expect
pytestmark = pytest.mark.e2e
@pytest.fixture(autouse=True)
def navigate_to_mappings(page: Page, app_url: str):
page.goto(f"{app_url}/mappings")
page.wait_for_load_state("networkidle")
def test_mappings_page_loads(page: Page):
"""Verify mappings page renders."""
expect(page.locator("h4")).to_contain_text("Mapari SKU")
def test_flat_list_container_exists(page: Page):
"""Verify the flat-row list container is rendered."""
container = page.locator("#mappingsFlatList")
expect(container).to_be_visible()
# Should have at least one flat-row (data or empty message)
rows = container.locator(".flat-row")
assert rows.count() >= 1, "Expected at least one flat-row in the list"
def test_show_inactive_toggle_exists(page: Page):
"""R5: Verify 'Arata inactive' toggle is present."""
toggle = page.locator("#showInactive")
expect(toggle).to_be_visible()
label = page.locator("label[for='showInactive']")
expect(label).to_contain_text("Arata inactive")
def test_show_deleted_toggle_exists(page: Page):
"""Verify 'Arata sterse' toggle is present."""
toggle = page.locator("#showDeleted")
expect(toggle).to_be_visible()
label = page.locator("label[for='showDeleted']")
expect(label).to_contain_text("Arata sterse")
def test_add_modal_multi_codmat(page: Page):
"""R11: Verify the add mapping modal supports multiple CODMAT lines."""
# "Formular complet" opens the full modal
page.locator("button[data-bs-target='#addModal']").first.click()
page.wait_for_timeout(500)
codmat_lines = page.locator("#codmatLines .codmat-line")
assert codmat_lines.count() >= 1, "Expected at least one CODMAT line in modal"
# Click "+ CODMAT" button to add another line
page.locator("#addModal button", has_text="CODMAT").click()
page.wait_for_timeout(300)
assert codmat_lines.count() >= 2, "Expected a second CODMAT line after clicking + CODMAT"
# Second line must have a remove button
remove_btns = page.locator("#codmatLines .codmat-line:nth-child(2) .qm-rm-btn")
assert remove_btns.count() >= 1, "Second CODMAT line is missing remove button"
def test_search_input_exists(page: Page):
"""Verify search input is present with the correct placeholder."""
search = page.locator("#searchInput")
expect(search).to_be_visible()
expect(search).to_have_attribute("placeholder", "Cauta SKU, CODMAT sau denumire...")
def test_pagination_exists(page: Page):
"""Verify pagination containers are in DOM."""
expect(page.locator("#mappingsPagTop")).to_be_attached()
expect(page.locator("#mappingsPagBottom")).to_be_attached()
def test_inline_add_button_exists(page: Page):
"""Verify 'Adauga Mapare' button is present."""
btn = page.locator("button", has_text="Adauga Mapare")
expect(btn).to_be_visible()

View File

@@ -1,78 +0,0 @@
"""E2E: Missing SKUs page with resolved toggle and multi-CODMAT modal."""
import pytest
from playwright.sync_api import Page, expect
pytestmark = pytest.mark.e2e
@pytest.fixture(autouse=True)
def navigate_to_missing(page: Page, app_url: str):
page.goto(f"{app_url}/missing-skus")
page.wait_for_load_state("networkidle")
def test_missing_skus_page_loads(page: Page):
"""Verify the page renders with the correct heading."""
expect(page.locator("h4")).to_contain_text("SKU-uri Lipsa")
def test_resolved_toggle_buttons(page: Page):
"""R10: Verify resolved filter pills exist and 'unresolved' is active by default."""
unresolved = page.locator(".filter-pill[data-sku-status='unresolved']")
resolved = page.locator(".filter-pill[data-sku-status='resolved']")
all_btn = page.locator(".filter-pill[data-sku-status='all']")
expect(unresolved).to_be_attached()
expect(resolved).to_be_attached()
expect(all_btn).to_be_attached()
# Unresolved should be active by default
classes = unresolved.get_attribute("class")
assert "active" in classes, f"Expected unresolved pill to be active, got classes: {classes}"
def test_resolved_toggle_switches(page: Page):
"""R10: Clicking resolved/all toggles changes active state correctly."""
resolved = page.locator(".filter-pill[data-sku-status='resolved']")
unresolved = page.locator(".filter-pill[data-sku-status='unresolved']")
all_btn = page.locator(".filter-pill[data-sku-status='all']")
# Click "Rezolvate"
resolved.click()
page.wait_for_timeout(500)
classes_res = resolved.get_attribute("class")
assert "active" in classes_res, f"Expected resolved pill to be active, got: {classes_res}"
classes_unr = unresolved.get_attribute("class")
assert "active" not in classes_unr, f"Expected unresolved pill to be inactive, got: {classes_unr}"
# Click "Toate"
all_btn.click()
page.wait_for_timeout(500)
classes_all = all_btn.get_attribute("class")
assert "active" in classes_all, f"Expected all pill to be active, got: {classes_all}"
def test_quick_map_modal_multi_codmat(page: Page):
"""R11: Verify the quick mapping modal supports multiple CODMATs."""
modal = page.locator("#quickMapModal")
expect(modal).to_be_attached()
expect(page.locator("#qmSku")).to_be_attached()
expect(page.locator("#qmProductName")).to_be_attached()
expect(page.locator("#qmCodmatLines")).to_be_attached()
expect(page.locator("#qmPctWarning")).to_be_attached()
def test_export_csv_button(page: Page):
"""Verify Export CSV button is visible on the page."""
btn = page.locator("button", has_text="Export CSV")
expect(btn).to_be_visible()
def test_rescan_button(page: Page):
"""Verify Re-Scan button is visible on the page."""
btn = page.locator("#rescanBtn")
expect(btn).to_be_visible()

View File

@@ -1,55 +0,0 @@
"""E2E: Order detail modal structure and inline mapping."""
import pytest
from playwright.sync_api import Page, expect
pytestmark = pytest.mark.e2e
def test_order_detail_modal_has_roa_ids(page: Page, app_url: str):
"""R9: Verify order detail modal contains all ROA ID labels."""
page.goto(f"{app_url}/logs")
page.wait_for_load_state("networkidle")
modal = page.locator("#orderDetailModal")
expect(modal).to_be_attached()
modal_html = modal.inner_html()
assert "ID Comanda ROA" in modal_html, "Missing 'ID Comanda ROA' label in order detail modal"
assert "ID Partener" in modal_html, "Missing 'ID Partener' label in order detail modal"
assert "ID Adr. Facturare" in modal_html, "Missing 'ID Adr. Facturare' label in order detail modal"
assert "ID Adr. Livrare" in modal_html, "Missing 'ID Adr. Livrare' label in order detail modal"
def test_order_detail_items_table_columns(page: Page, app_url: str):
"""R9: Verify items table has all required columns."""
page.goto(f"{app_url}/logs")
page.wait_for_load_state("networkidle")
headers = page.locator("#orderDetailModal thead th")
texts = headers.all_text_contents()
# Current columns (may evolve — check dashboard.html for source of truth)
required_columns = ["SKU", "Produs", "CODMAT", "Cant.", "Pret", "Valoare"]
for col in required_columns:
assert col in texts, f"Column '{col}' missing from order detail items table. Found: {texts}"
def test_quick_map_from_order_detail(page: Page, app_url: str):
"""R9+R11: Verify quick map modal is reachable from order detail context."""
page.goto(f"{app_url}/logs")
page.wait_for_load_state("networkidle")
modal = page.locator("#quickMapModal")
expect(modal).to_be_attached()
expect(page.locator("#qmCodmatLines")).to_be_attached()
expect(page.locator("#qmPctWarning")).to_be_attached()
def test_dashboard_navigates_to_logs(page: Page, app_url: str):
"""Verify the sidebar on the dashboard contains a link to the logs page."""
page.goto(f"{app_url}/")
page.wait_for_load_state("networkidle")
logs_link = page.locator("a[href='/logs']")
expect(logs_link).to_be_visible()

View File

@@ -1,108 +0,0 @@
"""
QA test fixtures — shared across api_health, responsive, smoke_prod, logs_monitor,
sync_real, plsql tests.
"""
import os
import sys
from pathlib import Path
import pytest
# Add api/ to path
_api_dir = str(Path(__file__).parents[2])
if _api_dir not in sys.path:
sys.path.insert(0, _api_dir)
# Directories
PROJECT_ROOT = Path(__file__).parents[3]
QA_REPORTS_DIR = PROJECT_ROOT / "qa-reports"
SCREENSHOTS_DIR = QA_REPORTS_DIR / "screenshots"
LOGS_DIR = PROJECT_ROOT / "logs"
def pytest_addoption(parser):
# --base-url is already provided by pytest-playwright; we reuse it
# Use try/except to avoid conflicts when conftest is loaded alongside other plugins
try:
parser.addoption("--env", default="test", choices=["test", "prod"], help="QA environment")
except ValueError:
pass
try:
parser.addoption("--qa-log-file", default=None, help="Specific log file to check")
except (ValueError, Exception):
pass
@pytest.fixture(scope="session")
def base_url(request):
"""Reuse pytest-playwright's --base-url or default to localhost:5003."""
url = request.config.getoption("--base-url") or "http://localhost:5003"
return url.rstrip("/")
@pytest.fixture(scope="session")
def env_name(request):
return request.config.getoption("--env")
@pytest.fixture(scope="session")
def qa_issues():
"""Collect issues across all QA tests for the final report."""
return []
@pytest.fixture(scope="session")
def screenshots_dir():
SCREENSHOTS_DIR.mkdir(parents=True, exist_ok=True)
return SCREENSHOTS_DIR
@pytest.fixture(scope="session")
def app_log_path(request):
"""Return the most recent log file from logs/."""
custom = request.config.getoption("--qa-log-file", default=None)
if custom:
return Path(custom)
if not LOGS_DIR.exists():
return None
logs = sorted(LOGS_DIR.glob("sync_comenzi_*.log"), key=lambda p: p.stat().st_mtime, reverse=True)
return logs[0] if logs else None
@pytest.fixture(scope="session")
def oracle_connection():
"""Create a direct Oracle connection for PL/SQL and sync tests."""
from dotenv import load_dotenv
env_path = Path(__file__).parents[2] / ".env"
load_dotenv(str(env_path), override=True)
user = os.environ.get("ORACLE_USER", "")
password = os.environ.get("ORACLE_PASSWORD", "")
dsn = os.environ.get("ORACLE_DSN", "")
if not all([user, password, dsn]) or user == "dummy":
pytest.skip("Oracle not configured (ORACLE_USER/PASSWORD/DSN missing or dummy)")
# TNS_ADMIN must point to the directory containing tnsnames.ora, not the file
tns_admin = os.environ.get("TNS_ADMIN", "")
if tns_admin and os.path.isfile(tns_admin):
os.environ["TNS_ADMIN"] = os.path.dirname(tns_admin)
elif not tns_admin:
# Default to api/ directory which contains tnsnames.ora
os.environ["TNS_ADMIN"] = str(Path(__file__).parents[2])
import oracledb
conn = oracledb.connect(user=user, password=password, dsn=dsn)
yield conn
conn.close()
def pytest_sessionfinish(session, exitstatus):
"""Generate QA report at end of session."""
try:
from . import qa_report
qa_report.generate(session, QA_REPORTS_DIR)
except Exception as e:
print(f"\n[qa_report] Failed to generate report: {e}")

View File

@@ -1,245 +0,0 @@
"""
QA Report Generator — called by conftest.py's pytest_sessionfinish hook.
"""
import json
import os
import smtplib
from datetime import date
from email.mime.text import MIMEText
from pathlib import Path
CATEGORIES = {
"Console": {"weight": 0.10, "patterns": ["e2e/"]},
"Navigation": {"weight": 0.10, "patterns": ["test_page_load", "test_", "_loads"]},
"Functional": {"weight": 0.15, "patterns": ["e2e/"]},
"API": {"weight": 0.15, "patterns": ["test_qa_api", "test_api_"]},
"Responsive": {"weight": 0.10, "patterns": ["test_qa_responsive", "responsive"]},
"Performance":{"weight": 0.10, "patterns": ["response_time"]},
"Logs": {"weight": 0.15, "patterns": ["test_qa_logs", "log_monitor"]},
"Sync/Oracle":{"weight": 0.15, "patterns": ["sync", "plsql", "oracle"]},
}
def _match_category(nodeid: str, name: str, category: str, patterns: list) -> bool:
"""Check if a test belongs to a category based on patterns."""
nodeid_lower = nodeid.lower()
name_lower = name.lower()
if category == "Console":
return "e2e/" in nodeid_lower
elif category == "Functional":
return "e2e/" in nodeid_lower
elif category == "Navigation":
return "test_page_load" in name_lower or name_lower.endswith("_loads")
else:
for p in patterns:
if p in nodeid_lower or p in name_lower:
return True
return False
def _collect_results(session):
"""Return list of (nodeid, name, passed, failed, error_msg) for each test."""
results = []
for item in session.items:
nodeid = item.nodeid
name = item.name
passed = False
failed = False
error_msg = ""
rep = getattr(item, "rep_call", None)
if rep is None:
# try stash
try:
rep = item.stash.get(item.config._store, None)
except Exception:
pass
if rep is not None:
passed = getattr(rep, "passed", False)
failed = getattr(rep, "failed", False)
if failed:
try:
error_msg = str(rep.longrepr).split("\n")[-1][:200]
except Exception:
error_msg = "unknown error"
results.append((nodeid, name, passed, failed, error_msg))
return results
def _categorize(results):
"""Group tests into categories and compute per-category stats."""
cat_stats = {}
for cat, cfg in CATEGORIES.items():
cat_stats[cat] = {
"weight": cfg["weight"],
"passed": 0,
"total": 0,
"score": 100.0,
}
for r in results:
nodeid, name, passed = r[0], r[1], r[2]
for cat, cfg in CATEGORIES.items():
if _match_category(nodeid, name, cat, cfg["patterns"]):
cat_stats[cat]["total"] += 1
if passed:
cat_stats[cat]["passed"] += 1
for cat, stats in cat_stats.items():
if stats["total"] > 0:
stats["score"] = (stats["passed"] / stats["total"]) * 100.0
return cat_stats
def _compute_health(cat_stats) -> float:
total = sum(
(s["score"] / 100.0) * s["weight"] for s in cat_stats.values()
)
return round(total * 100, 1)
def _load_baseline(reports_dir: Path):
baseline_path = reports_dir / "baseline.json"
if not baseline_path.exists():
return None
try:
with open(baseline_path) as f:
data = json.load(f)
# validate minimal keys
_ = data["health_score"], data["date"]
return data
except Exception:
baseline_path.unlink(missing_ok=True)
return None
def _save_baseline(reports_dir: Path, health_score, passed, failed, cat_stats):
baseline_path = reports_dir / "baseline.json"
try:
data = {
"health_score": health_score,
"date": str(date.today()),
"passed": passed,
"failed": failed,
"categories": {
cat: {"score": s["score"], "passed": s["passed"], "total": s["total"]}
for cat, s in cat_stats.items()
},
}
with open(baseline_path, "w") as f:
json.dump(data, f, indent=2)
except Exception:
pass
def _delta_str(health_score, baseline) -> str:
if baseline is None:
return ""
prev = baseline.get("health_score", health_score)
diff = round(health_score - prev, 1)
sign = "+" if diff >= 0 else ""
return f" (baseline: {prev}, {sign}{diff})"
def _build_markdown(health_score, delta, cat_stats, failed_tests, today_str) -> str:
lines = [
f"# QA Report — {today_str}",
"",
f"## Health Score: {health_score}/100{delta}",
"",
"| Category | Score | Weight | Tests |",
"|----------|-------|--------|-------|",
]
for cat, s in cat_stats.items():
score_pct = f"{s['score']:.0f}%"
weight_pct = f"{int(s['weight'] * 100)}%"
tests_str = f"{s['passed']}/{s['total']} passed" if s["total"] > 0 else "no tests"
lines.append(f"| {cat} | {score_pct} | {weight_pct} | {tests_str} |")
lines += ["", "## Failed Tests"]
if failed_tests:
for name, msg in failed_tests:
lines.append(f"- `{name}`: {msg}")
else:
lines.append("_No failed tests._")
lines += ["", "## Warnings"]
if health_score < 70:
lines.append("- Health score below 70 — review failures before deploy.")
return "\n".join(lines) + "\n"
def _send_email(health_score, report_path):
smtp_host = os.environ.get("SMTP_HOST")
if not smtp_host:
return
try:
smtp_port = int(os.environ.get("SMTP_PORT", 587))
smtp_user = os.environ.get("SMTP_USER", "")
smtp_pass = os.environ.get("SMTP_PASSWORD", "")
smtp_to = os.environ.get("SMTP_TO", smtp_user)
subject = f"QA Alert: Health Score {health_score}/100"
body = f"Health score dropped to {health_score}/100.\nReport: {report_path}"
msg = MIMEText(body)
msg["Subject"] = subject
msg["From"] = smtp_user
msg["To"] = smtp_to
with smtplib.SMTP(smtp_host, smtp_port) as server:
server.ehlo()
server.starttls()
if smtp_user:
server.login(smtp_user, smtp_pass)
server.sendmail(smtp_user, [smtp_to], msg.as_string())
except Exception:
pass
def generate(session, reports_dir: Path):
"""Generate QA health report. Called from conftest.py pytest_sessionfinish."""
try:
reports_dir = Path(reports_dir)
reports_dir.mkdir(parents=True, exist_ok=True)
results = _collect_results(session)
passed_count = sum(1 for r in results if r[2])
failed_count = sum(1 for r in results if r[3])
failed_tests = [(r[1], r[4]) for r in results if r[3]]
cat_stats = _categorize(results)
health_score = _compute_health(cat_stats)
baseline = _load_baseline(reports_dir)
delta = _delta_str(health_score, baseline)
today_str = str(date.today())
report_filename = f"qa-report-{today_str}.md"
report_path = reports_dir / report_filename
md = _build_markdown(health_score, delta, cat_stats, failed_tests, today_str)
try:
with open(report_path, "w") as f:
f.write(md)
except Exception:
pass
_save_baseline(reports_dir, health_score, passed_count, failed_count, cat_stats)
if health_score < 70:
_send_email(health_score, report_path)
print(f"\n{'' * 50}")
print(f" QA HEALTH SCORE: {health_score}/100{delta}")
print(f" Report: {report_path}")
print(f"{'' * 50}\n")
except Exception:
pass

View File

@@ -1,87 +0,0 @@
"""QA tests for API endpoint health and basic contract validation."""
import time
import urllib.request
import pytest
import httpx
pytestmark = pytest.mark.qa
ENDPOINTS = [
"/health",
"/api/dashboard/orders",
"/api/sync/status",
"/api/sync/history",
"/api/validate/missing-skus",
"/api/mappings",
"/api/settings",
]
@pytest.fixture(scope="session")
def client(base_url):
"""Create httpx client; skip all if app is not reachable."""
try:
urllib.request.urlopen(f"{base_url}/health", timeout=3)
except Exception:
pytest.skip(f"App not reachable at {base_url}")
with httpx.Client(base_url=base_url, timeout=10.0) as c:
yield c
def test_health(client):
r = client.get("/health")
assert r.status_code == 200
data = r.json()
assert "oracle" in data
assert "sqlite" in data
def test_dashboard_orders(client):
r = client.get("/api/dashboard/orders")
assert r.status_code == 200
data = r.json()
assert "orders" in data
assert "counts" in data
def test_sync_status(client):
r = client.get("/api/sync/status")
assert r.status_code == 200
data = r.json()
assert "status" in data
def test_sync_history(client):
r = client.get("/api/sync/history")
assert r.status_code == 200
data = r.json()
assert "runs" in data
assert isinstance(data["runs"], list)
def test_missing_skus(client):
r = client.get("/api/validate/missing-skus")
assert r.status_code == 200
data = r.json()
assert "missing_skus" in data
def test_mappings(client):
r = client.get("/api/mappings")
assert r.status_code == 200
data = r.json()
assert "mappings" in data
def test_settings(client):
r = client.get("/api/settings")
assert r.status_code == 200
assert isinstance(r.json(), dict)
@pytest.mark.parametrize("endpoint", ENDPOINTS)
def test_response_time(client, endpoint):
start = time.monotonic()
client.get(endpoint)
elapsed = time.monotonic() - start
assert elapsed < 5.0, f"{endpoint} took {elapsed:.2f}s (limit: 5s)"

View File

@@ -1,136 +0,0 @@
"""
Log monitoring tests — parse app log files for errors and anomalies.
Run with: pytest api/tests/qa/test_qa_logs_monitor.py
Tests only check log lines from the current session (last 1 hour) to avoid
failing on pre-existing historical errors.
"""
import re
from datetime import datetime, timedelta
import pytest
pytestmark = pytest.mark.qa
# Log line format: 2026-03-23 07:57:12,691 | INFO | app.main | message
_MAX_WARNINGS = 50
_SESSION_WINDOW_HOURS = 1
# Known issues that are tracked separately and should not fail the QA suite.
# These are real bugs that need fixing but should not block test runs.
_KNOWN_ISSUES = [
"soft-deleting order ID=533: ORA-00942", # Pre-existing: missing table/view
]
def _read_recent_lines(app_log_path):
"""Read log file lines from the last session window only."""
if app_log_path is None or not app_log_path.exists():
pytest.skip("No log file available")
all_lines = app_log_path.read_text(encoding="utf-8", errors="replace").splitlines()
# Filter to recent lines only (within session window)
cutoff = datetime.now() - timedelta(hours=_SESSION_WINDOW_HOURS)
recent = []
for line in all_lines:
# Parse timestamp from log line: "2026-03-24 09:43:46,174 | ..."
match = re.match(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})", line)
if match:
try:
ts = datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
if ts >= cutoff:
recent.append(line)
except ValueError:
recent.append(line) # Include unparseable lines
else:
# Non-timestamped lines (continuations) — include if we're in recent window
if recent:
recent.append(line)
return recent
# ---------------------------------------------------------------------------
def test_log_file_exists(app_log_path):
"""Log file path resolves to an existing file."""
if app_log_path is None:
pytest.skip("No log file configured")
assert app_log_path.exists(), f"Log file not found: {app_log_path}"
def _is_known_issue(line):
"""Check if a log line matches a known tracked issue."""
return any(ki in line for ki in _KNOWN_ISSUES)
def test_no_critical_errors(app_log_path, qa_issues):
"""No unexpected ERROR-level lines in recent log entries."""
lines = _read_recent_lines(app_log_path)
errors = [l for l in lines if "| ERROR |" in l and not _is_known_issue(l)]
known = [l for l in lines if "| ERROR |" in l and _is_known_issue(l)]
if errors:
qa_issues.extend({"type": "log_error", "line": l} for l in errors)
if known:
qa_issues.extend({"type": "known_issue", "line": l} for l in known)
assert len(errors) == 0, (
f"Found {len(errors)} unexpected ERROR line(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
+ "\n".join(errors[:10])
)
def test_no_oracle_errors(app_log_path, qa_issues):
"""No unexpected Oracle ORA- error codes in recent log entries."""
lines = _read_recent_lines(app_log_path)
ora_errors = [l for l in lines if "ORA-" in l and not _is_known_issue(l)]
known = [l for l in lines if "ORA-" in l and _is_known_issue(l)]
if ora_errors:
qa_issues.extend({"type": "oracle_error", "line": l} for l in ora_errors)
if known:
qa_issues.extend({"type": "known_issue", "line": l} for l in known)
assert len(ora_errors) == 0, (
f"Found {len(ora_errors)} unexpected ORA- error(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
+ "\n".join(ora_errors[:10])
)
def test_no_unhandled_exceptions(app_log_path, qa_issues):
"""No unhandled Python tracebacks in recent log entries."""
lines = _read_recent_lines(app_log_path)
tb_lines = [l for l in lines if "Traceback" in l]
if tb_lines:
qa_issues.extend({"type": "traceback", "line": l} for l in tb_lines)
assert len(tb_lines) == 0, (
f"Found {len(tb_lines)} Traceback(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
+ "\n".join(tb_lines[:10])
)
def test_no_import_failures(app_log_path, qa_issues):
"""No import failure messages in recent log entries."""
lines = _read_recent_lines(app_log_path)
pattern = re.compile(r"import failed|Order.*failed", re.IGNORECASE)
failures = [l for l in lines if pattern.search(l)]
if failures:
qa_issues.extend({"type": "import_failure", "line": l} for l in failures)
assert len(failures) == 0, (
f"Found {len(failures)} import failure(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
+ "\n".join(failures[:10])
)
def test_warning_count_acceptable(app_log_path, qa_issues):
"""WARNING count in recent window is below acceptable threshold."""
lines = _read_recent_lines(app_log_path)
warnings = [l for l in lines if "| WARNING |" in l]
if len(warnings) >= _MAX_WARNINGS:
qa_issues.append({
"type": "high_warning_count",
"count": len(warnings),
"threshold": _MAX_WARNINGS,
})
assert len(warnings) < _MAX_WARNINGS, (
f"Warning count {len(warnings)} exceeds threshold {_MAX_WARNINGS} "
f"in recent {_SESSION_WINDOW_HOURS}h window"
)

View File

@@ -1,208 +0,0 @@
"""
PL/SQL package tests using direct Oracle connection.
Verifies that key Oracle packages are VALID and that order import
procedures work end-to-end with cleanup.
"""
import json
import time
import logging
import pytest
pytestmark = pytest.mark.oracle
logger = logging.getLogger(__name__)
PACKAGES_TO_CHECK = [
"PACK_IMPORT_COMENZI",
"PACK_IMPORT_PARTENERI",
"PACK_COMENZI",
"PACK_FACTURARE",
]
_STATUS_SQL = """
SELECT status
FROM user_objects
WHERE object_name = :name
AND object_type = 'PACKAGE BODY'
"""
# ---------------------------------------------------------------------------
# Module-scoped fixture for sharing test order ID between tests
# ---------------------------------------------------------------------------
@pytest.fixture(scope="module")
def test_order_id(oracle_connection):
"""
Create a test order via PACK_IMPORT_COMENZI.importa_comanda and yield
its ID. Cleans up (DELETE) after all module tests finish.
"""
import oracledb
conn = oracle_connection
order_id = None
# Find a minimal valid partner ID
try:
with conn.cursor() as cur:
cur.execute(
"SELECT MIN(id_part) FROM nom_parteneri WHERE id_part > 0"
)
row = cur.fetchone()
if not row or row[0] is None:
pytest.skip("No partners found in Oracle — cannot create test order")
partner_id = int(row[0])
except Exception as exc:
pytest.skip(f"Cannot query nom_parteneri table: {exc}")
# Find an article that has a price in some policy (required for import)
with conn.cursor() as cur:
cur.execute("""
SELECT na.codmat, cp.id_pol, cp.pret
FROM nom_articole na
JOIN crm_politici_pret_art cp ON cp.id_articol = na.id_articol
WHERE cp.pret > 0 AND na.codmat IS NOT NULL AND rownum = 1
""")
row = cur.fetchone()
if not row:
pytest.skip("No articles with prices found in Oracle — cannot create test order")
test_sku, id_pol, test_price = row[0], int(row[1]), float(row[2])
nr_comanda_ext = f"PYTEST-{int(time.time())}"
# Values must be strings — Oracle's JSON_OBJECT_T.get_string() returns NULL for numbers
articles = json.dumps([{
"sku": test_sku,
"quantity": "1",
"price": str(test_price),
"vat": "19",
}])
try:
from datetime import datetime
with conn.cursor() as cur:
clob_var = cur.var(oracledb.DB_TYPE_CLOB)
clob_var.setvalue(0, articles)
id_comanda_var = cur.var(oracledb.DB_TYPE_NUMBER)
cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
nr_comanda_ext, # p_nr_comanda_ext
datetime.now(), # p_data_comanda
partner_id, # p_id_partener
clob_var, # p_json_articole
None, # p_id_adresa_livrare
None, # p_id_adresa_facturare
id_pol, # p_id_pol
None, # p_id_sectie
None, # p_id_gestiune
None, # p_kit_mode
None, # p_id_pol_productie
None, # p_kit_discount_codmat
None, # p_kit_discount_id_pol
id_comanda_var, # v_id_comanda (OUT)
])
raw = id_comanda_var.getvalue()
order_id = int(raw) if raw is not None else None
if order_id and order_id > 0:
conn.commit()
logger.info(f"Test order created: ID={order_id}, NR={nr_comanda_ext}")
else:
conn.rollback()
order_id = None
except Exception as exc:
try:
conn.rollback()
except Exception:
pass
logger.warning(f"Could not create test order: {exc}")
order_id = None
yield order_id
# Cleanup — runs even if tests fail
if order_id:
try:
with conn.cursor() as cur:
cur.execute(
"DELETE FROM comenzi_elemente WHERE id_comanda = :id",
{"id": order_id}
)
cur.execute(
"DELETE FROM comenzi WHERE id_comanda = :id",
{"id": order_id}
)
conn.commit()
logger.info(f"Test order {order_id} cleaned up")
except Exception as exc:
logger.error(f"Cleanup failed for order {order_id}: {exc}")
# ---------------------------------------------------------------------------
# Package validity tests
# ---------------------------------------------------------------------------
def test_pack_import_comenzi_valid(oracle_connection):
"""PACK_IMPORT_COMENZI package body must be VALID."""
with oracle_connection.cursor() as cur:
cur.execute(_STATUS_SQL, {"name": "PACK_IMPORT_COMENZI"})
row = cur.fetchone()
assert row is not None, "PACK_IMPORT_COMENZI package body not found in user_objects"
assert row[0] == "VALID", f"PACK_IMPORT_COMENZI is {row[0]}"
def test_pack_import_parteneri_valid(oracle_connection):
"""PACK_IMPORT_PARTENERI package body must be VALID."""
with oracle_connection.cursor() as cur:
cur.execute(_STATUS_SQL, {"name": "PACK_IMPORT_PARTENERI"})
row = cur.fetchone()
assert row is not None, "PACK_IMPORT_PARTENERI package body not found in user_objects"
assert row[0] == "VALID", f"PACK_IMPORT_PARTENERI is {row[0]}"
def test_pack_comenzi_valid(oracle_connection):
"""PACK_COMENZI package body must be VALID."""
with oracle_connection.cursor() as cur:
cur.execute(_STATUS_SQL, {"name": "PACK_COMENZI"})
row = cur.fetchone()
assert row is not None, "PACK_COMENZI package body not found in user_objects"
assert row[0] == "VALID", f"PACK_COMENZI is {row[0]}"
def test_pack_facturare_valid(oracle_connection):
"""PACK_FACTURARE package body must be VALID."""
with oracle_connection.cursor() as cur:
cur.execute(_STATUS_SQL, {"name": "PACK_FACTURARE"})
row = cur.fetchone()
assert row is not None, "PACK_FACTURARE package body not found in user_objects"
assert row[0] == "VALID", f"PACK_FACTURARE is {row[0]}"
# ---------------------------------------------------------------------------
# Order import tests
# ---------------------------------------------------------------------------
def test_import_order_with_articles(test_order_id):
"""PACK_IMPORT_COMENZI.importa_comanda must return a valid order ID > 0."""
if test_order_id is None:
pytest.skip("Test order creation failed — see test_order_id fixture logs")
assert test_order_id > 0, f"importa_comanda returned invalid ID: {test_order_id}"
def test_cleanup_test_order(oracle_connection, test_order_id):
"""Verify the test order rows exist and can be queried (cleanup runs via fixture)."""
if test_order_id is None:
pytest.skip("No test order to verify")
with oracle_connection.cursor() as cur:
cur.execute(
"SELECT COUNT(*) FROM comenzi WHERE id_comanda = :id",
{"id": test_order_id}
)
row = cur.fetchone()
# At this point the order should still exist (fixture cleanup runs after module)
assert row is not None
assert row[0] >= 0 # may be 0 if already cleaned, just confirm query works

View File

@@ -1,146 +0,0 @@
"""
Responsive layout tests across 3 viewports.
Tests each page on desktop / tablet / mobile using Playwright sync API.
"""
import pytest
from pathlib import Path
from playwright.sync_api import sync_playwright, expect
pytestmark = pytest.mark.qa
# ---------------------------------------------------------------------------
# Viewport definitions
# ---------------------------------------------------------------------------
VIEWPORTS = {
"desktop": {"width": 1280, "height": 900},
"tablet": {"width": 768, "height": 1024},
"mobile": {"width": 375, "height": 812},
}
# ---------------------------------------------------------------------------
# Pages to test: (path, expected_text_fragment)
# expected_text_fragment is matched loosely against page title or any <h4>/<h1>
# ---------------------------------------------------------------------------
PAGES = [
("/", "Panou"),
("/logs", "Jurnale"),
("/mappings", "Mapari"),
("/missing-skus", "SKU"),
("/settings", "Setari"),
]
# ---------------------------------------------------------------------------
# Session-scoped browser (reused across all parametrized tests)
# ---------------------------------------------------------------------------
@pytest.fixture(scope="session")
def pw_browser():
"""Launch a Chromium browser for the full QA session."""
with sync_playwright() as pw:
browser = pw.chromium.launch(headless=True)
yield browser
browser.close()
# ---------------------------------------------------------------------------
# Parametrized test: viewport x page
# ---------------------------------------------------------------------------
@pytest.mark.parametrize("viewport_name", list(VIEWPORTS.keys()))
@pytest.mark.parametrize("page_path,expected_text", PAGES)
def test_responsive_page(
pw_browser,
base_url: str,
screenshots_dir: Path,
viewport_name: str,
page_path: str,
expected_text: str,
):
"""Each page renders without error on every viewport and contains expected text."""
viewport = VIEWPORTS[viewport_name]
context = pw_browser.new_context(viewport=viewport)
page = context.new_page()
try:
page.goto(f"{base_url}{page_path}", wait_until="networkidle", timeout=15_000)
# Screenshot
page_name = page_path.strip("/") or "dashboard"
screenshot_path = screenshots_dir / f"{page_name}-{viewport_name}.png"
page.screenshot(path=str(screenshot_path), full_page=True)
# Basic content check: title or any h1/h4 contains expected text
title = page.title()
headings = page.locator("h1, h4").all_text_contents()
all_text = " ".join([title] + headings)
assert expected_text.lower() in all_text.lower(), (
f"Expected '{expected_text}' in page text on {viewport_name} {page_path}. "
f"Got title='{title}', headings={headings}"
)
finally:
context.close()
# ---------------------------------------------------------------------------
# Mobile-specific: navbar toggler
# ---------------------------------------------------------------------------
def test_mobile_navbar_visible(pw_browser, base_url: str):
"""Mobile viewport: navbar should still be visible and functional."""
context = pw_browser.new_context(viewport=VIEWPORTS["mobile"])
page = context.new_page()
try:
page.goto(base_url, wait_until="networkidle", timeout=15_000)
# Custom navbar: .top-navbar with .navbar-brand
navbar = page.locator(".top-navbar")
expect(navbar).to_be_visible()
finally:
context.close()
# ---------------------------------------------------------------------------
# Mobile-specific: tables wrapped in .table-responsive or scrollable
# ---------------------------------------------------------------------------
@pytest.mark.parametrize("page_path", ["/logs", "/mappings", "/missing-skus"])
def test_mobile_table_responsive(pw_browser, base_url: str, page_path: str):
"""
On mobile, any <table> should live inside a .table-responsive wrapper
OR the page should have a horizontal scroll container around it.
If no table is present (empty state), the test is skipped.
"""
context = pw_browser.new_context(viewport=VIEWPORTS["mobile"])
page = context.new_page()
try:
page.goto(f"{base_url}{page_path}", wait_until="networkidle", timeout=15_000)
tables = page.locator("table").all()
if not tables:
# No tables means nothing to check — pass (no non-responsive tables exist)
return
# Check each table has an ancestor with overflow-x scroll or .table-responsive class
for table in tables:
# Check direct parent chain for .table-responsive
wrapped = page.evaluate(
"""(el) => {
let node = el.parentElement;
for (let i = 0; i < 6 && node; i++) {
if (node.classList.contains('table-responsive')) return true;
const style = window.getComputedStyle(node);
if (style.overflowX === 'auto' || style.overflowX === 'scroll') return true;
node = node.parentElement;
}
return false;
}""",
table.element_handle(),
)
assert wrapped, (
f"Table on {page_path} is not inside a .table-responsive wrapper "
f"or overflow-x:auto/scroll container on mobile viewport"
)
finally:
context.close()

View File

@@ -1,142 +0,0 @@
"""
Smoke tests for production — read-only, no clicks.
Run against a live app: pytest api/tests/qa/test_qa_smoke_prod.py --base-url http://localhost:5003
"""
import time
import urllib.request
import json
import pytest
from playwright.sync_api import sync_playwright
pytestmark = pytest.mark.smoke
PAGES = ["/", "/logs", "/mappings", "/missing-skus", "/settings"]
def _app_is_reachable(base_url: str) -> bool:
"""Quick check if the app is reachable."""
try:
urllib.request.urlopen(f"{base_url}/health", timeout=3)
return True
except Exception:
return False
@pytest.fixture(scope="module", autouse=True)
def _require_app(base_url):
"""Skip all smoke tests if the app is not running."""
if not _app_is_reachable(base_url):
pytest.skip(f"App not reachable at {base_url} — start the app first")
PAGE_TITLES = {
"/": "Panou de Comanda",
"/logs": "Jurnale Import",
"/mappings": "Mapari SKU",
"/missing-skus": "SKU-uri Lipsa",
"/settings": "Setari",
}
@pytest.fixture(scope="module")
def browser():
with sync_playwright() as p:
b = p.chromium.launch(headless=True)
yield b
b.close()
# ---------------------------------------------------------------------------
# test_page_loads
# ---------------------------------------------------------------------------
@pytest.mark.parametrize("path", PAGES)
def test_page_loads(browser, base_url, screenshots_dir, path):
"""Each page returns HTTP 200 and loads without crashing."""
page = browser.new_page()
try:
response = page.goto(f"{base_url}{path}", wait_until="domcontentloaded", timeout=15_000)
assert response is not None, f"No response for {path}"
assert response.status == 200, f"Expected 200, got {response.status} for {path}"
safe_name = path.strip("/").replace("/", "_") or "dashboard"
screenshot_path = screenshots_dir / f"smoke_{safe_name}.png"
page.screenshot(path=str(screenshot_path))
finally:
page.close()
# ---------------------------------------------------------------------------
# test_page_titles
# ---------------------------------------------------------------------------
@pytest.mark.parametrize("path", PAGES)
def test_page_titles(browser, base_url, path):
"""Each page has the correct h4 heading text."""
expected = PAGE_TITLES[path]
page = browser.new_page()
try:
page.goto(f"{base_url}{path}", wait_until="domcontentloaded", timeout=15_000)
h4 = page.locator("h4").first
actual = h4.inner_text().strip()
assert actual == expected, f"{path}: expected h4='{expected}', got '{actual}'"
finally:
page.close()
# ---------------------------------------------------------------------------
# test_no_console_errors
# ---------------------------------------------------------------------------
@pytest.mark.parametrize("path", PAGES)
def test_no_console_errors(browser, base_url, path):
"""No console.error events on any page."""
errors = []
page = browser.new_page()
try:
page.on("console", lambda msg: errors.append(msg.text) if msg.type == "error" else None)
page.goto(f"{base_url}{path}", wait_until="networkidle", timeout=15_000)
finally:
page.close()
assert errors == [], f"Console errors on {path}: {errors}"
# ---------------------------------------------------------------------------
# test_api_health_json
# ---------------------------------------------------------------------------
def test_api_health_json(base_url):
"""GET /health returns valid JSON with 'oracle' key."""
with urllib.request.urlopen(f"{base_url}/health", timeout=10) as resp:
data = json.loads(resp.read().decode())
assert "oracle" in data, f"/health JSON missing 'oracle' key: {data}"
# ---------------------------------------------------------------------------
# test_api_dashboard_orders_json
# ---------------------------------------------------------------------------
def test_api_dashboard_orders_json(base_url):
"""GET /api/dashboard/orders returns valid JSON with 'orders' key."""
with urllib.request.urlopen(f"{base_url}/api/dashboard/orders", timeout=10) as resp:
data = json.loads(resp.read().decode())
assert "orders" in data, f"/api/dashboard/orders JSON missing 'orders' key: {data}"
# ---------------------------------------------------------------------------
# test_response_time
# ---------------------------------------------------------------------------
@pytest.mark.parametrize("path", PAGES)
def test_response_time(browser, base_url, path):
"""Each page loads in under 10 seconds."""
page = browser.new_page()
try:
start = time.monotonic()
page.goto(f"{base_url}{path}", wait_until="domcontentloaded", timeout=15_000)
elapsed = time.monotonic() - start
finally:
page.close()
assert elapsed < 10, f"{path} took {elapsed:.2f}s (limit: 10s)"

View File

@@ -1,134 +0,0 @@
"""
Real sync test: GoMag API → validate → import into Oracle (MARIUSM_AUTO).
Requires:
- App running on localhost:5003
- GOMAG_API_KEY set in api/.env
- Oracle configured (MARIUSM_AUTO_AUTO)
"""
import os
import time
from datetime import datetime, timedelta
from pathlib import Path
import httpx
import pytest
from dotenv import load_dotenv
pytestmark = pytest.mark.sync
# Load .env once at module level for API key check
_env_path = Path(__file__).parents[2] / ".env"
load_dotenv(str(_env_path), override=True)
_GOMAG_API_KEY = os.environ.get("GOMAG_API_KEY", "")
_GOMAG_API_SHOP = os.environ.get("GOMAG_API_SHOP", "")
if not _GOMAG_API_KEY:
pytestmark = [pytest.mark.sync, pytest.mark.skip(reason="GOMAG_API_KEY not set")]
@pytest.fixture(scope="module")
def client(base_url):
with httpx.Client(base_url=base_url, timeout=30.0) as c:
yield c
@pytest.fixture(scope="module")
def gomag_api_key():
if not _GOMAG_API_KEY:
pytest.skip("GOMAG_API_KEY is empty or not set")
return _GOMAG_API_KEY
@pytest.fixture(scope="module")
def gomag_api_shop():
if not _GOMAG_API_SHOP:
pytest.skip("GOMAG_API_SHOP is empty or not set")
return _GOMAG_API_SHOP
def _wait_for_sync(client, timeout=60):
"""Poll sync status until it stops running. Returns final status dict."""
deadline = time.monotonic() + timeout
while time.monotonic() < deadline:
r = client.get("/api/sync/status")
assert r.status_code == 200, f"sync/status returned {r.status_code}"
data = r.json()
if data.get("status") != "running":
return data
time.sleep(2)
raise TimeoutError(f"Sync did not finish within {timeout}s")
def test_gomag_api_connection(gomag_api_key, gomag_api_shop):
"""Verify direct GoMag API connectivity and order presence."""
seven_days_ago = (datetime.now() - timedelta(days=7)).strftime("%Y-%m-%d")
# GoMag API uses a central endpoint, not the shop URL
url = "https://api.gomag.ro/api/v1/order/read/json"
params = {"startDate": seven_days_ago, "page": 1, "limit": 5}
headers = {"X-Oc-Restadmin-Id": gomag_api_key}
with httpx.Client(timeout=30.0, follow_redirects=True) as c:
r = c.get(url, params=params, headers=headers)
assert r.status_code == 200, f"GoMag API returned {r.status_code}: {r.text[:200]}"
data = r.json()
# GoMag returns either a list or a dict with orders key
if isinstance(data, dict):
assert "orders" in data or len(data) > 0, "GoMag API returned empty response"
else:
assert isinstance(data, list), f"Unexpected GoMag response type: {type(data)}"
def test_app_sync_start(client, gomag_api_key):
"""Trigger a real sync via the app API and wait for completion."""
r = client.post("/api/sync/start")
assert r.status_code == 200, f"sync/start returned {r.status_code}: {r.text[:200]}"
final_status = _wait_for_sync(client, timeout=60)
assert final_status.get("status") != "running", (
f"Sync still running after timeout: {final_status}"
)
def test_sync_results(client):
"""Verify the latest sync run processed at least one order."""
r = client.get("/api/sync/history", params={"per_page": 1})
assert r.status_code == 200, f"sync/history returned {r.status_code}"
data = r.json()
runs = data.get("runs", [])
assert len(runs) > 0, "No sync runs found in history"
latest = runs[0]
assert latest.get("total_orders", 0) > 0, (
f"Latest sync run has 0 orders: {latest}"
)
def test_sync_idempotent(client, gomag_api_key):
"""Re-running sync should result in ALREADY_IMPORTED, not double imports."""
r = client.post("/api/sync/start")
assert r.status_code == 200, f"sync/start returned {r.status_code}"
_wait_for_sync(client, timeout=60)
r = client.get("/api/sync/history", params={"per_page": 1})
assert r.status_code == 200
data = r.json()
runs = data.get("runs", [])
assert len(runs) > 0, "No sync runs found after second sync"
latest = runs[0]
total = latest.get("total_orders", 0)
already_imported = latest.get("already_imported", 0)
imported = latest.get("imported", 0)
# Most orders should be ALREADY_IMPORTED on second run
if total > 0:
assert already_imported >= imported, (
f"Expected mostly ALREADY_IMPORTED on second run, "
f"got imported={imported}, already_imported={already_imported}, total={total}"
)

View File

@@ -1,70 +0,0 @@
-- Setup test data for Phase 1 validation tests
-- Create test articles in NOM_ARTICOLE and mappings in ARTICOLE_TERTI
-- Clear any existing test mappings
DELETE FROM ARTICOLE_TERTI WHERE sku IN ('CAFE100', '8000070028685', 'TEST001');
-- Disable trigger to allow specific ID_ARTICOL values
ALTER TRIGGER trg_NOM_ARTICOLE_befoins DISABLE;
-- Create test articles in NOM_ARTICOLE with correct structure
-- Using specific ID_ARTICOL values for test consistency
INSERT INTO NOM_ARTICOLE (
ID_ARTICOL, CODMAT, DENUMIRE, UM,
DEP, ID_SUBGRUPA, CANT_BAX, STERS, ID_MOD, INACTIV,
IN_STOC, IN_CRM, DNF, PRETACHCTVA, TAXA_RECONDITIONARE, GREUTATE,
ID_UTIL, DATAORA
) VALUES (
9999001, 'CAF01', 'Cafea Test - 1kg', 'BUC',
0, 1, 1, 0, 1, 0,
1, 1, 0, 0, 0, 1000,
-3, SYSDATE
);
INSERT INTO NOM_ARTICOLE (
ID_ARTICOL, CODMAT, DENUMIRE, UM,
DEP, ID_SUBGRUPA, CANT_BAX, STERS, ID_MOD, INACTIV,
IN_STOC, IN_CRM, DNF, PRETACHCTVA, TAXA_RECONDITIONARE, GREUTATE,
ID_UTIL, DATAORA
) VALUES (
9999002, 'LAV001', 'Lavazza Gusto Forte Test', 'BUC',
0, 1, 1, 0, 1, 0,
1, 1, 0, 0, 0, 1000,
-3, SYSDATE
);
INSERT INTO NOM_ARTICOLE (
ID_ARTICOL, CODMAT, DENUMIRE, UM,
DEP, ID_SUBGRUPA, CANT_BAX, STERS, ID_MOD, INACTIV,
IN_STOC, IN_CRM, DNF, PRETACHCTVA, TAXA_RECONDITIONARE, GREUTATE,
ID_UTIL, DATAORA
) VALUES (
9999003, 'TEST001', 'Articol Test Generic', 'BUC',
0, 1, 1, 0, 1, 0,
1, 1, 0, 0, 0, 500,
-3, SYSDATE
);
-- Price entry for CAF01 in default price policy (id_pol=1)
-- Used for single-component repackaging kit pricing test
MERGE INTO crm_politici_pret_art dst
USING (SELECT 1 AS id_pol, 9999001 AS id_articol FROM DUAL) src
ON (dst.id_pol = src.id_pol AND dst.id_articol = src.id_articol)
WHEN NOT MATCHED THEN INSERT (id_pol, id_articol, pret, proc_tvav)
VALUES (src.id_pol, src.id_articol, 51.50, 19);
-- Create test mappings in ARTICOLE_TERTI
-- CAFE100 -> CAF01 (repackaging: 10x1kg = 1x10kg web package)
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ)
VALUES ('CAFE100', 'CAF01', 10, 100, 1);
-- Real GoMag SKU -> Lavazza article
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ)
VALUES ('8000070028685', 'LAV001', 1, 100, 1);
-- Re-enable trigger after test data creation
ALTER TRIGGER trg_NOM_ARTICOLE_befoins ENABLE;
COMMIT;
PROMPT === Test Data Setup Complete ===

View File

@@ -1,38 +0,0 @@
-- Cleanup test data created for Phase 1 validation tests
-- Remove test articles and mappings to leave database clean
-- Remove test price entry
DELETE FROM crm_politici_pret_art WHERE id_pol = 1 AND id_articol = 9999001;
-- Remove test mappings
DELETE FROM ARTICOLE_TERTI WHERE sku IN ('CAFE100', '8000070028685', 'TEST001');
-- Remove test articles (using specific ID_ARTICOL range to avoid removing real data)
DELETE FROM NOM_ARTICOLE WHERE ID_ARTICOL BETWEEN 9999001 AND 9999003;
-- Remove any test orders created during testing (optional - to avoid accumulation)
DELETE FROM COMENZI_ELEMENTE WHERE ID_COMANDA IN (
SELECT ID_COMANDA FROM COMENZI
WHERE NR_COMANDA LIKE 'COMPLETE-%'
OR NR_COMANDA LIKE 'FINAL-TEST-%'
OR NR_COMANDA LIKE 'GOMAG-TEST-%'
OR NR_COMANDA LIKE 'TEST-%'
OR COMANDA_EXTERNA LIKE '%TEST%'
);
DELETE FROM COMENZI
WHERE NR_COMANDA LIKE 'COMPLETE-%'
OR NR_COMANDA LIKE 'FINAL-TEST-%'
OR NR_COMANDA LIKE 'GOMAG-TEST-%'
OR NR_COMANDA LIKE 'TEST-%'
OR COMANDA_EXTERNA LIKE '%TEST%';
-- Remove test partners created during testing (optional)
DELETE FROM NOM_PARTENERI
WHERE DENUMIRE LIKE '%Test%'
AND ID_UTIL = -3
AND DATAORA > SYSDATE - 1; -- Only today's test partners
COMMIT;
PROMPT === Test Data Cleanup Complete ===

View File

@@ -1,114 +0,0 @@
"""
Test: Basic App Import and Route Tests (pytest-compatible)
==========================================================
Tests module imports and all GET routes without requiring Oracle.
Converted from api/test_app_basic.py.
Run:
pytest api/tests/test_app_basic.py -v
"""
import os
import sys
import tempfile
import pytest
# --- Marker: all tests here are unit (no Oracle) ---
pytestmark = pytest.mark.unit
# --- Set env vars BEFORE any app import ---
_tmpdir = tempfile.mkdtemp()
_sqlite_path = os.path.join(_tmpdir, "test_import.db")
os.environ["FORCE_THIN_MODE"] = "true"
os.environ["SQLITE_DB_PATH"] = _sqlite_path
os.environ["ORACLE_DSN"] = "dummy"
os.environ["ORACLE_USER"] = "dummy"
os.environ["ORACLE_PASSWORD"] = "dummy"
os.environ.setdefault("JSON_OUTPUT_DIR", _tmpdir)
# Add api/ to path so we can import app
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if _api_dir not in sys.path:
sys.path.insert(0, _api_dir)
# -------------------------------------------------------
# Section 1: Module Import Checks
# -------------------------------------------------------
MODULES = [
"app.config",
"app.database",
"app.main",
"app.routers.health",
"app.routers.dashboard",
"app.routers.mappings",
"app.routers.sync",
"app.routers.validation",
"app.routers.articles",
"app.services.sqlite_service",
"app.services.scheduler_service",
"app.services.mapping_service",
"app.services.article_service",
"app.services.validation_service",
"app.services.import_service",
"app.services.sync_service",
"app.services.order_reader",
]
@pytest.mark.parametrize("module_name", MODULES)
def test_module_import(module_name):
"""Each app module should import without errors."""
__import__(module_name)
# -------------------------------------------------------
# Section 2: Route Tests via TestClient
# -------------------------------------------------------
# (path, expected_status_codes, is_known_oracle_failure)
GET_ROUTES = [
("/health", [200], False),
("/", [200, 500], False),
("/missing-skus", [200, 500], False),
("/mappings", [200, 500], False),
("/logs", [200, 500], False),
("/api/mappings", [200, 503], True),
("/api/mappings/export-csv", [200, 503], True),
("/api/mappings/csv-template", [200], False),
("/api/sync/status", [200], False),
("/api/sync/history", [200], False),
("/api/sync/schedule", [200], False),
("/api/validate/missing-skus", [200], False),
("/api/validate/missing-skus?page=1&per_page=10", [200], False),
("/api/sync/run/nonexistent/log", [200, 404], False),
("/api/articles/search?q=ab", [200, 503], True),
("/settings", [200, 500], False),
]
@pytest.fixture(scope="module")
def client():
"""Create a TestClient with lifespan for all route tests."""
from fastapi.testclient import TestClient
from app.main import app
with TestClient(app, raise_server_exceptions=False) as c:
yield c
@pytest.mark.parametrize(
"path,expected_codes,is_oracle_route",
GET_ROUTES,
ids=[p for p, _, _ in GET_ROUTES],
)
def test_route(client, path, expected_codes, is_oracle_route):
"""Each GET route should return an expected status code."""
resp = client.get(path)
assert resp.status_code in expected_codes, (
f"GET {path} returned {resp.status_code}, expected one of {expected_codes}. "
f"Body: {resp.text[:300]}"
)

View File

@@ -1,494 +0,0 @@
"""
Business Rule Regression Tests
==============================
Regression tests for historical bug fixes in kit pricing, discount calculation,
duplicate CODMAT resolution, price sync, and VAT normalization.
Run:
cd api && python -m pytest tests/test_business_rules.py -v
"""
import json
import os
import sys
import tempfile
import pytest
pytestmark = pytest.mark.unit
# --- Set env vars BEFORE any app import ---
_tmpdir = tempfile.mkdtemp()
_sqlite_path = os.path.join(_tmpdir, "test_biz.db")
os.environ["FORCE_THIN_MODE"] = "true"
os.environ["SQLITE_DB_PATH"] = _sqlite_path
os.environ["ORACLE_DSN"] = "dummy"
os.environ["ORACLE_USER"] = "dummy"
os.environ["ORACLE_PASSWORD"] = "dummy"
os.environ["JSON_OUTPUT_DIR"] = _tmpdir
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if _api_dir not in sys.path:
sys.path.insert(0, _api_dir)
from unittest.mock import MagicMock, patch
from app.services.import_service import build_articles_json, compute_discount_split
from app.services.order_reader import OrderData, OrderItem
# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------
def make_item(sku="SKU1", price=100.0, quantity=1, vat=19):
return OrderItem(sku=sku, name=f"Product {sku}", price=price, quantity=quantity, vat=vat)
def make_order(items, discount_total=0.0, delivery_cost=0.0, discount_vat=None):
order = OrderData(
id="1", number="TEST-001", date="2026-01-01",
items=items, discount_total=discount_total,
delivery_cost=delivery_cost,
)
if discount_vat is not None:
order.discount_vat = discount_vat
return order
def is_kit(comps):
"""Kit detection pattern used in validation_service and price_sync_service."""
return len(comps) > 1 or (
len(comps) == 1 and (comps[0].get("cantitate_roa") or 1) > 1
)
# ===========================================================================
# Group 1: compute_discount_split()
# ===========================================================================
class TestDiscountSplit:
"""Regression: discount split by VAT rate (import_service.py:63)."""
def test_single_vat_rate(self):
order = make_order([make_item(vat=19), make_item("SKU2", vat=19)], discount_total=10.0)
result = compute_discount_split(order, {"split_discount_vat": "1"})
assert result == {"19": 10.0}
def test_multiple_vat_proportional(self):
items = [make_item("A", price=100, quantity=1, vat=19),
make_item("B", price=50, quantity=1, vat=9)]
order = make_order(items, discount_total=15.0)
result = compute_discount_split(order, {"split_discount_vat": "1"})
assert result == {"9": 5.0, "19": 10.0}
def test_zero_returns_none(self):
order = make_order([make_item()], discount_total=0)
assert compute_discount_split(order, {"split_discount_vat": "1"}) is None
def test_zero_price_items_excluded(self):
items = [make_item("A", price=0, quantity=1, vat=19),
make_item("B", price=100, quantity=2, vat=9)]
order = make_order(items, discount_total=5.0)
result = compute_discount_split(order, {"split_discount_vat": "1"})
assert result == {"9": 5.0}
def test_disabled_multiple_rates(self):
items = [make_item("A", vat=19), make_item("B", vat=9)]
order = make_order(items, discount_total=10.0)
result = compute_discount_split(order, {"split_discount_vat": "0"})
assert result is None
def test_rounding_remainder(self):
items = [make_item("A", price=33.33, quantity=1, vat=19),
make_item("B", price=33.33, quantity=1, vat=9),
make_item("C", price=33.34, quantity=1, vat=5)]
order = make_order(items, discount_total=10.0)
result = compute_discount_split(order, {"split_discount_vat": "1"})
assert result is not None
assert abs(sum(result.values()) - 10.0) < 0.001
# ===========================================================================
# Group 2: build_articles_json()
# ===========================================================================
class TestBuildArticlesJson:
"""Regression: discount lines, policy bridge, transport (import_service.py:117)."""
def test_discount_line_negative_quantity(self):
items = [make_item()]
order = make_order(items, discount_total=5.0)
settings = {"discount_codmat": "DISC01", "split_discount_vat": "0"}
result = json.loads(build_articles_json(items, order, settings))
disc_lines = [a for a in result if a["sku"] == "DISC01"]
assert len(disc_lines) == 1
assert disc_lines[0]["quantity"] == "-1"
assert disc_lines[0]["price"] == "5.0"
def test_discount_uses_actual_vat_not_21(self):
items = [make_item("A", vat=9), make_item("B", vat=9)]
order = make_order(items, discount_total=3.0)
settings = {"discount_codmat": "DISC01", "split_discount_vat": "1"}
result = json.loads(build_articles_json(items, order, settings))
disc_lines = [a for a in result if a["sku"] == "DISC01"]
assert len(disc_lines) == 1
assert disc_lines[0]["vat"] == "9"
def test_discount_multi_vat_creates_multiple_lines(self):
items = [make_item("A", price=100, vat=19), make_item("B", price=50, vat=9)]
order = make_order(items, discount_total=15.0)
settings = {"discount_codmat": "DISC01", "split_discount_vat": "1"}
result = json.loads(build_articles_json(items, order, settings))
disc_lines = [a for a in result if a["sku"] == "DISC01"]
assert len(disc_lines) == 2
vats = {d["vat"] for d in disc_lines}
assert "9" in vats
assert "19" in vats
def test_discount_fallback_uses_gomag_vat(self):
items = [make_item("A", vat=19), make_item("B", vat=9)]
order = make_order(items, discount_total=5.0, discount_vat="9")
settings = {"discount_codmat": "DISC01", "split_discount_vat": "0"}
result = json.loads(build_articles_json(items, order, settings))
disc_lines = [a for a in result if a["sku"] == "DISC01"]
assert len(disc_lines) == 1
assert disc_lines[0]["vat"] == "9"
def test_per_article_policy_bridge(self):
items = [make_item("SKU1")]
settings = {"_codmat_policy_map": {"SKU1": 42}, "id_pol": "1"}
result = json.loads(build_articles_json(items, settings=settings))
assert result[0]["id_pol"] == "42"
def test_policy_same_as_default_omitted(self):
items = [make_item("SKU1")]
settings = {"_codmat_policy_map": {"SKU1": 1}, "id_pol": "1"}
result = json.loads(build_articles_json(items, settings=settings))
assert "id_pol" not in result[0]
def test_transport_line_added(self):
items = [make_item()]
order = make_order(items, delivery_cost=15.0)
settings = {"transport_codmat": "TR", "transport_vat": "19"}
result = json.loads(build_articles_json(items, order, settings))
tr_lines = [a for a in result if a["sku"] == "TR"]
assert len(tr_lines) == 1
assert tr_lines[0]["quantity"] == "1"
assert tr_lines[0]["price"] == "15.0"
# ===========================================================================
# Group 3: Kit Detection Pattern
# ===========================================================================
class TestKitDetection:
"""Regression: kit detection for single-component repackaging (multiple code locations)."""
def test_multi_component(self):
comps = [{"codmat": "A", "cantitate_roa": 1}, {"codmat": "B", "cantitate_roa": 1}]
assert is_kit(comps) is True
def test_single_component_repackaging(self):
comps = [{"codmat": "CAF01", "cantitate_roa": 10}]
assert is_kit(comps) is True
def test_true_1to1_not_kit(self):
comps = [{"codmat": "X", "cantitate_roa": 1}]
assert is_kit(comps) is False
def test_none_cantitate_treated_as_1(self):
comps = [{"codmat": "X", "cantitate_roa": None}]
assert is_kit(comps) is False
def test_empty_components(self):
assert is_kit([]) is False
# ===========================================================================
# Group 4: sync_prices_from_order() — Kit Skip Logic
# ===========================================================================
class TestSyncPricesKitSkip:
"""Regression: kit SKUs must be skipped in order-based price sync."""
def _make_mock_order(self, sku, price=50.0):
mock_order = MagicMock()
mock_item = MagicMock()
mock_item.sku = sku
mock_item.price = price
mock_order.items = [mock_item]
return mock_order
@patch("app.services.validation_service.compare_and_update_price")
def test_skips_multi_component_kit(self, mock_compare):
from app.services.validation_service import sync_prices_from_order
orders = [self._make_mock_order("KIT01")]
mapped = {"KIT01": [
{"codmat": "A", "id_articol": 1, "cantitate_roa": 1},
{"codmat": "B", "id_articol": 2, "cantitate_roa": 1},
]}
mock_conn = MagicMock()
sync_prices_from_order(orders, mapped, {}, {}, 1, conn=mock_conn,
settings={"price_sync_enabled": "1"})
mock_compare.assert_not_called()
@patch("app.services.validation_service.compare_and_update_price")
def test_skips_repackaging_kit(self, mock_compare):
from app.services.validation_service import sync_prices_from_order
orders = [self._make_mock_order("CAFE100")]
mapped = {"CAFE100": [
{"codmat": "CAF01", "id_articol": 1, "cantitate_roa": 10},
]}
mock_conn = MagicMock()
sync_prices_from_order(orders, mapped, {}, {}, 1, conn=mock_conn,
settings={"price_sync_enabled": "1"})
mock_compare.assert_not_called()
@patch("app.services.validation_service.compare_and_update_price")
def test_processes_1to1_mapping(self, mock_compare):
from app.services.validation_service import sync_prices_from_order
mock_compare.return_value = {"updated": False, "old_price": 50.0, "new_price": 50.0, "codmat": "X"}
orders = [self._make_mock_order("SKU1", price=50.0)]
mapped = {"SKU1": [
{"codmat": "X", "id_articol": 100, "cantitate_roa": 1, "cont": "371"},
]}
mock_conn = MagicMock()
sync_prices_from_order(orders, mapped, {}, {"SKU1": 1}, 1, conn=mock_conn,
settings={"price_sync_enabled": "1"})
mock_compare.assert_called_once()
call_args = mock_compare.call_args
assert call_args[0][0] == 100 # id_articol
assert call_args[0][2] == 50.0 # price
@patch("app.services.validation_service.compare_and_update_price")
def test_skips_transport_discount_codmats(self, mock_compare):
from app.services.validation_service import sync_prices_from_order
orders = [self._make_mock_order("TRANSP", price=15.0)]
mock_conn = MagicMock()
sync_prices_from_order(orders, {}, {"TRANSP": {"id_articol": 99}}, {}, 1,
conn=mock_conn,
settings={"price_sync_enabled": "1",
"transport_codmat": "TRANSP",
"discount_codmat": "DISC"})
mock_compare.assert_not_called()
# ===========================================================================
# Group 5: Kit Component with Own Mapping
# ===========================================================================
class TestKitComponentOwnMapping:
"""Regression: price_sync_service skips kit components that have their own ARTICOLE_TERTI mapping."""
def test_component_with_own_mapping_skipped(self):
"""If comp_codmat is itself a key in mapped_data, it's skipped."""
mapped_data = {
"PACK-A": [{"codmat": "COMP-X", "id_articol": 1, "cantitate_roa": 1, "cont": "371"}],
"COMP-X": [{"codmat": "COMP-X", "id_articol": 1, "cantitate_roa": 1, "cont": "371"}],
}
# The check is: if comp_codmat in mapped_data: continue
comp_codmat = "COMP-X"
assert comp_codmat in mapped_data # Should be skipped
def test_component_without_own_mapping_processed(self):
"""If comp_codmat is NOT in mapped_data, it should be processed."""
mapped_data = {
"PACK-A": [{"codmat": "COMP-Y", "id_articol": 2, "cantitate_roa": 1, "cont": "371"}],
}
comp_codmat = "COMP-Y"
assert comp_codmat not in mapped_data # Should be processed
# ===========================================================================
# Group 6: VAT Included Type Normalization
# ===========================================================================
class TestVatIncludedNormalization:
"""Regression: GoMag returns vat_included as int 1 or string '1' (price_sync_service.py:144)."""
def _compute_price_cu_tva(self, product):
price = float(product.get("price", "0"))
vat = float(product.get("vat", "19"))
if str(product.get("vat_included", "1")) == "1":
return price
else:
return price * (1 + vat / 100)
def test_vat_included_int_1(self):
result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": 1})
assert result == 100.0
def test_vat_included_str_1(self):
result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": "1"})
assert result == 100.0
def test_vat_included_int_0(self):
result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": 0})
assert result == 119.0
def test_vat_included_str_0(self):
result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": "0"})
assert result == 119.0
# ===========================================================================
# Group 7: validate_kit_component_prices — pret=0 allowed
# ===========================================================================
class TestKitComponentPriceValidation:
"""Regression: pret=0 in CRM is valid for kit components (validation_service.py:469)."""
def _call_validate(self, fetchone_returns):
from app.services.validation_service import validate_kit_component_prices
mock_conn = MagicMock()
mock_cursor = MagicMock()
mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
mock_cursor.fetchone.return_value = fetchone_returns
mapped = {"KIT-SKU": [
{"codmat": "COMP1", "id_articol": 100, "cont": "371", "cantitate_roa": 5},
]}
return validate_kit_component_prices(mapped, id_pol=1, conn=mock_conn)
def test_price_zero_not_rejected(self):
result = self._call_validate((0,))
assert result == {}
def test_missing_entry_rejected(self):
result = self._call_validate(None)
assert "KIT-SKU" in result
assert "COMP1" in result["KIT-SKU"]
def test_skips_true_1to1(self):
from app.services.validation_service import validate_kit_component_prices
mock_conn = MagicMock()
mapped = {"SKU1": [
{"codmat": "X", "id_articol": 1, "cont": "371", "cantitate_roa": 1},
]}
result = validate_kit_component_prices(mapped, id_pol=1, conn=mock_conn)
assert result == {}
def test_checks_repackaging(self):
"""Single component with cantitate_roa > 1 should be checked."""
from app.services.validation_service import validate_kit_component_prices
mock_conn = MagicMock()
mock_cursor = MagicMock()
mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
mock_cursor.fetchone.return_value = (51.50,)
mapped = {"CAFE100": [
{"codmat": "CAF01", "id_articol": 100, "cont": "371", "cantitate_roa": 10},
]}
result = validate_kit_component_prices(mapped, id_pol=1, conn=mock_conn)
assert result == {}
mock_cursor.execute.assert_called_once()
# ===========================================================================
# Group 8: Dual Policy Assignment
# ===========================================================================
class TestDualPolicyAssignment:
"""Regression: cont 341/345 → production policy, others → sales (validation_service.py:282)."""
def _call_dual(self, codmats, direct_id_map, cursor_rows):
from app.services.validation_service import validate_and_ensure_prices_dual
mock_conn = MagicMock()
mock_cursor = MagicMock()
mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
# The code uses `for row in cur:` to iterate, not fetchall
mock_cursor.__iter__ = MagicMock(return_value=iter(cursor_rows))
# Mock ensure_prices to do nothing
with patch("app.services.validation_service.ensure_prices"):
return validate_and_ensure_prices_dual(
codmats, id_pol_vanzare=1, id_pol_productie=2,
conn=mock_conn, direct_id_map=direct_id_map
)
def test_cont_341_production(self):
result = self._call_dual(
{"COD1"},
{"COD1": {"id_articol": 100, "cont": "341"}},
[] # no existing prices
)
assert result["COD1"] == 2 # id_pol_productie
def test_cont_345_production(self):
result = self._call_dual(
{"COD1"},
{"COD1": {"id_articol": 100, "cont": "345"}},
[]
)
assert result["COD1"] == 2
def test_other_cont_sales(self):
result = self._call_dual(
{"COD1"},
{"COD1": {"id_articol": 100, "cont": "371"}},
[]
)
assert result["COD1"] == 1 # id_pol_vanzare
def test_existing_sales_preferred(self):
result = self._call_dual(
{"COD1"},
{"COD1": {"id_articol": 100, "cont": "345"}},
[(100, 1), (100, 2)] # price exists in BOTH policies
)
assert result["COD1"] == 1 # sales preferred when both exist
# ===========================================================================
# Group 9: Duplicate CODMAT — resolve_codmat_ids
# ===========================================================================
class TestResolveCodmatIds:
"""Regression: ROW_NUMBER dedup returns exactly 1 id_articol per CODMAT."""
@patch("app.services.validation_service.database")
def test_returns_one_per_codmat(self, mock_db):
from app.services.validation_service import resolve_codmat_ids
mock_conn = MagicMock()
mock_cursor = MagicMock()
mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
# Simulate ROW_NUMBER already deduped: 1 row per codmat
mock_cursor.__iter__ = MagicMock(return_value=iter([
("COD1", 100, "345"),
("COD2", 200, "341"),
]))
result = resolve_codmat_ids({"COD1", "COD2"}, conn=mock_conn)
assert len(result) == 2
assert result["COD1"]["id_articol"] == 100
assert result["COD2"]["id_articol"] == 200
@patch("app.services.validation_service.database")
def test_resolve_mapped_one_per_sku_codmat(self, mock_db):
from app.services.validation_service import resolve_mapped_codmats
mock_conn = MagicMock()
mock_cursor = MagicMock()
mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
# 1 row per (sku, codmat) pair
mock_cursor.__iter__ = MagicMock(return_value=iter([
("SKU1", "COD1", 100, "345", 10),
("SKU1", "COD2", 200, "341", 1),
]))
result = resolve_mapped_codmats({"SKU1"}, mock_conn)
assert "SKU1" in result
assert len(result["SKU1"]) == 2
codmats = [c["codmat"] for c in result["SKU1"]]
assert "COD1" in codmats
assert "COD2" in codmats

View File

@@ -1,940 +0,0 @@
#!/usr/bin/env python3
"""
Complete end-to-end test for order import functionality
Tests: Partner creation, Article mapping, Order import with full workflow
"""
import oracledb
import os
from dotenv import load_dotenv
import random
from datetime import datetime
load_dotenv('.env')
user = os.environ['ORACLE_USER']
password = os.environ['ORACLE_PASSWORD']
dsn = os.environ['ORACLE_DSN']
try:
instantclient_path = os.environ.get('INSTANTCLIENTPATH', '/opt/oracle/instantclient_23_9')
oracledb.init_oracle_client(lib_dir=instantclient_path)
except Exception as e:
pass
def setup_test_data(cur):
"""Setup test data by running SQL script"""
print("🔧 Setting up test data...")
# Read and execute setup script
with open('/app/tests/setup_test_data.sql', 'r') as f:
setup_sql = f.read()
# Split by statements and execute each
statements = [stmt.strip() for stmt in setup_sql.split(';') if stmt.strip() and not stmt.strip().startswith('--')]
for stmt in statements:
if stmt.upper().startswith(('INSERT', 'DELETE', 'COMMIT')):
try:
cur.execute(stmt)
if stmt.upper().startswith('COMMIT'):
print(" ✅ Test data setup committed")
except Exception as e:
if "unique constraint" not in str(e).lower():
print(f" ⚠️ Setup warning: {e}")
def teardown_test_data(cur):
"""Cleanup test data by running teardown script"""
print("🧹 Cleaning up test data...")
try:
# Read and execute teardown script
with open('/app/tests/teardown_test_data.sql', 'r') as f:
teardown_sql = f.read()
# Split by statements and execute each
statements = [stmt.strip() for stmt in teardown_sql.split(';') if stmt.strip() and not stmt.strip().startswith('--')]
for stmt in statements:
if stmt.upper().startswith(('DELETE', 'COMMIT')):
try:
cur.execute(stmt)
if stmt.upper().startswith('COMMIT'):
print(" ✅ Test data cleanup committed")
except Exception as e:
print(f" ⚠️ Cleanup warning: {e}")
except Exception as e:
print(f" ❌ Teardown error: {e}")
def test_complete_import():
"""
Complete test of order import workflow:
1. Setup test data
2. Test individual components
3. Create partner if doesn't exist
4. Import complete order with articles
5. Verify results
6. Cleanup test data
"""
print("🎯 COMPLETE ORDER IMPORT TEST")
print("=" * 60)
success_count = 0
total_tests = 0
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
unique_suffix = random.randint(1000, 9999)
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
# ========================================
# SETUP: Initialize test data
# ========================================
setup_test_data(cur)
# ========================================
# TEST 1: Component Validation
# ========================================
print("\n📋 TEST 1: Individual Component Validation")
print("-" * 40)
# Test article mapping
total_tests += 1
print("1.1 Testing article mapping...")
cur.execute("SELECT * FROM TABLE(PACK_IMPORT_COMENZI.gaseste_articol_roa('CAFE100'))")
article_results = cur.fetchall()
if len(article_results) > 0:
print(f" ✅ Article mapping: Found {len(article_results)} mappings for CAFE100")
success_count += 1
else:
print(" ❌ Article mapping: No results for CAFE100")
# Test JSON parsing
total_tests += 1
print("1.2 Testing JSON parsing...")
test_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 25.0}]'
cur.execute("SELECT * FROM TABLE(PACK_JSON.parse_array(:json))", {'json': test_json})
json_results = cur.fetchall()
if len(json_results) > 0:
print(f" ✅ JSON parsing: Successfully parsed {len(json_results)} items")
success_count += 1
else:
print(" ❌ JSON parsing: Failed to parse JSON")
# ========================================
# TEST 2: Partner Management
# ========================================
print("\n👥 TEST 2: Partner Creation/Retrieval")
print("-" * 40)
total_tests += 1
partner_name = f'Test Client {timestamp}-{unique_suffix}'
partner_address = 'JUD:Bucuresti;BUCURESTI;Str. Test;12'
partner_phone = f'072{unique_suffix:04d}000'
partner_email = f'test{unique_suffix}@example.com'
print(f"2.1 Creating/finding partner: {partner_name}")
partner_var = cur.var(oracledb.NUMBER)
cur.execute("""
DECLARE
v_partner_id NUMBER;
BEGIN
v_partner_id := PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener(
NULL, -- cod_fiscal (NULL for individuals)
:partner_name,
:partner_address,
:partner_phone,
:partner_email
);
:result := v_partner_id;
END;
""", {
'partner_name': partner_name,
'partner_address': partner_address,
'partner_phone': partner_phone,
'partner_email': partner_email,
'result': partner_var
})
partner_id = partner_var.getvalue()
if partner_id and partner_id > 0:
print(f" ✅ Partner management: ID {partner_id}")
success_count += 1
else:
print(" ❌ Partner management: Failed to create/find partner")
return False
# ========================================
# TEST 3: Complete Order Import
# ========================================
print("\n📦 TEST 3: Complete Order Import")
print("-" * 40)
total_tests += 1
order_number = f'COMPLETE-{timestamp}-{unique_suffix}'
# Test with multiple articles including real GoMag SKU
test_articles = [
{"sku": "CAFE100", "cantitate": 2, "pret": 25.0}, # Mapped article
{"sku": "8000070028685", "cantitate": 1, "pret": 69.79} # Real GoMag SKU
]
articles_json = str(test_articles).replace("'", '"')
print(f"3.1 Importing order: {order_number}")
print(f" Articles: {articles_json}")
result_var = cur.var(oracledb.NUMBER)
cur.execute("""
DECLARE
v_order_id NUMBER;
BEGIN
v_order_id := PACK_IMPORT_COMENZI.importa_comanda(
:order_number,
SYSDATE,
:partner_id,
:articles_json,
NULL, -- id_adresa_livrare
NULL, -- id_adresa_facturare
'Complete end-to-end test order'
);
:result := v_order_id;
END;
""", {
'order_number': order_number,
'partner_id': partner_id,
'articles_json': articles_json,
'result': result_var
})
order_id = result_var.getvalue()
# Get detailed error information
cur.execute("SELECT PACK_IMPORT_COMENZI.get_last_error FROM DUAL")
error_msg = cur.fetchone()[0]
if order_id and order_id > 0:
print(f" ✅ Order import: SUCCESS! ID {order_id}")
success_count += 1
# ========================================
# TEST 4: Result Verification
# ========================================
print("\n🔍 TEST 4: Result Verification")
print("-" * 40)
total_tests += 1
# Verify order details
cur.execute("""
SELECT
c.NR_COMANDA,
c.DATA_COMANDA,
c.INTERNA,
c.ID_PART,
c.ID_GESTIUNE,
c.ID_SECTIE,
np.DENUMIRE as PARTNER_NAME
FROM COMENZI c
LEFT JOIN NOM_PARTENERI np ON c.ID_PART = np.ID_PART
WHERE c.ID_COMANDA = :order_id
""", {'order_id': order_id})
order_details = cur.fetchone()
if order_details:
print(f"4.1 Order verification:")
print(f" Number: {order_details[0]}")
print(f" Date: {order_details[1]}")
print(f" Type (INTERNA): {order_details[2]}")
print(f" Partner: {order_details[6]} (ID: {order_details[3]})")
print(f" Gestiune: {order_details[4]}")
print(f" Sectie: {order_details[5]}")
# Verify articles in order
cur.execute("""
SELECT
ce.CANTITATE,
ce.PRET,
na.CODMAT,
na.DENUMIRE
FROM COMENZI_ELEMENTE ce
JOIN NOM_ARTICOLE na ON ce.ID_ARTICOL = na.ID_ARTICOL
WHERE ce.ID_COMANDA = :order_id
ORDER BY na.CODMAT
""", {'order_id': order_id})
order_articles = cur.fetchall()
if order_articles:
print(f"4.2 Articles in order ({len(order_articles)} items):")
for art in order_articles:
print(f" - Qty: {art[0]:>3}, Price: {art[1]:>8.2f}, Code: {art[2]:>10} - {art[3]}")
success_count += 1
# Calculate totals
total_qty = sum(art[0] for art in order_articles)
total_value = sum(art[0] * art[1] for art in order_articles)
print(f" TOTAL: Qty={total_qty}, Value={total_value:.2f} RON")
else:
print(" ❌ No articles found in order")
else:
print(" ❌ Order verification failed")
else:
print(f" ❌ Order import: FAILED")
if error_msg:
print(f" Error: {error_msg}")
else:
print(f" No specific error message, ID returned: {order_id}")
conn.commit()
# ========================================
# FINAL RESULTS
# ========================================
print("\n" + "=" * 60)
print(f"📊 FINAL RESULTS: {success_count}/{total_tests} tests passed")
print("=" * 60)
# ========================================
# TEARDOWN: Cleanup test data
# ========================================
teardown_test_data(cur)
conn.commit()
if success_count == total_tests:
print("🎉 ALL TESTS PASSED! Order import system is fully functional.")
return True
elif success_count >= total_tests - 1:
print("⚠️ MOSTLY SUCCESSFUL: Core components working, minor issues remain.")
return True
else:
print("❌ SIGNIFICANT ISSUES: Multiple components need attention.")
return False
except Exception as e:
print(f"❌ CRITICAL ERROR: {e}")
import traceback
traceback.print_exc()
# Attempt cleanup even on error
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
print("\n🧹 Attempting cleanup after error...")
teardown_test_data(cur)
conn.commit()
except:
print(" ⚠️ Cleanup after error also failed")
return False
def test_repackaging_kit_pricing():
"""
Test single-component repackaging with kit pricing.
CAFE100 -> CAF01 with cantitate_roa=10 (1 web package = 10 ROA units).
Verifies that kit pricing applies: list price per unit + discount line.
"""
print("\n" + "=" * 60)
print("🎯 REPACKAGING KIT PRICING TEST")
print("=" * 60)
success_count = 0
total_tests = 0
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
unique_suffix = random.randint(1000, 9999)
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
setup_test_data(cur)
# Create a test partner
partner_var = cur.var(oracledb.NUMBER)
partner_name = f'Test Repack {timestamp}-{unique_suffix}'
cur.execute("""
DECLARE v_id NUMBER;
BEGIN
v_id := PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener(
NULL, :name, 'JUD:Bucuresti;BUCURESTI;Str Test;1',
'0720000000', 'repack@test.com');
:result := v_id;
END;
""", {'name': partner_name, 'result': partner_var})
partner_id = partner_var.getvalue()
if not partner_id or partner_id <= 0:
print(" SKIP: Could not create test partner")
return False
# ---- Test separate_line mode ----
total_tests += 1
order_number = f'TEST-REPACK-SEP-{timestamp}-{unique_suffix}'
# Web price: 2 packages * 10 units * some_price = total
# With list price 51.50/unit, 2 packs of 10 = 20 units
# Web price per package = 450 lei => total web = 900
# Expected: 20 units @ 51.50 = 1030, discount = 130
web_price_per_pack = 450.0
articles_json = f'[{{"sku": "CAFE100", "cantitate": 2, "pret": {web_price_per_pack}}}]'
print(f"\n1. Testing separate_line mode: {order_number}")
print(f" CAFE100 x2 @ {web_price_per_pack} lei/pack, cantitate_roa=10")
result_var = cur.var(oracledb.NUMBER)
cur.execute("""
DECLARE v_id NUMBER;
BEGIN
PACK_IMPORT_COMENZI.importa_comanda(
:order_number, SYSDATE, :partner_id,
:articles_json,
NULL, NULL,
1, -- id_pol (default price policy)
NULL, NULL,
'separate_line', -- kit_mode
NULL, NULL, NULL,
v_id);
:result := v_id;
END;
""", {
'order_number': order_number,
'partner_id': partner_id,
'articles_json': articles_json,
'result': result_var
})
order_id = result_var.getvalue()
if order_id and order_id > 0:
print(f" Order created: ID {order_id}")
cur.execute("""
SELECT ce.CANTITATE, ce.PRET, na.CODMAT, na.DENUMIRE
FROM COMENZI_ELEMENTE ce
JOIN NOM_ARTICOLE na ON ce.ID_ARTICOL = na.ID_ARTICOL
WHERE ce.ID_COMANDA = :oid
ORDER BY ce.CANTITATE DESC
""", {'oid': order_id})
rows = cur.fetchall()
if len(rows) >= 2:
# Should have article line + discount line
art_line = [r for r in rows if r[0] > 0]
disc_line = [r for r in rows if r[0] < 0]
if art_line and disc_line:
print(f" Article: qty={art_line[0][0]}, price={art_line[0][1]:.2f} ({art_line[0][2]})")
print(f" Discount: qty={disc_line[0][0]}, price={disc_line[0][1]:.2f}")
total = sum(r[0] * r[1] for r in rows)
expected_total = web_price_per_pack * 2
print(f" Total: {total:.2f} (expected: {expected_total:.2f})")
if abs(total - expected_total) < 0.02:
print(" PASS: Total matches web price")
success_count += 1
else:
print(" FAIL: Total mismatch")
else:
print(f" FAIL: Expected article + discount lines, got {len(art_line)} art / {len(disc_line)} disc")
elif len(rows) == 1:
print(f" FAIL: Only 1 line (no discount). qty={rows[0][0]}, price={rows[0][1]:.2f}")
print(" Kit pricing did NOT activate for single-component repackaging")
else:
print(" FAIL: No order lines found")
else:
cur.execute("SELECT PACK_IMPORT_COMENZI.get_last_error FROM DUAL")
err = cur.fetchone()[0]
print(f" FAIL: Order import failed: {err}")
conn.commit()
# ---- Test distributed mode ----
total_tests += 1
order_number2 = f'TEST-REPACK-DIST-{timestamp}-{unique_suffix}'
print(f"\n2. Testing distributed mode: {order_number2}")
result_var2 = cur.var(oracledb.NUMBER)
cur.execute("""
DECLARE v_id NUMBER;
BEGIN
PACK_IMPORT_COMENZI.importa_comanda(
:order_number, SYSDATE, :partner_id,
:articles_json,
NULL, NULL,
1, NULL, NULL,
'distributed',
NULL, NULL, NULL,
v_id);
:result := v_id;
END;
""", {
'order_number': order_number2,
'partner_id': partner_id,
'articles_json': articles_json,
'result': result_var2
})
order_id2 = result_var2.getvalue()
if order_id2 and order_id2 > 0:
print(f" Order created: ID {order_id2}")
cur.execute("""
SELECT ce.CANTITATE, ce.PRET, na.CODMAT
FROM COMENZI_ELEMENTE ce
JOIN NOM_ARTICOLE na ON ce.ID_ARTICOL = na.ID_ARTICOL
WHERE ce.ID_COMANDA = :oid
""", {'oid': order_id2})
rows2 = cur.fetchall()
if len(rows2) == 1:
# Distributed: single line with adjusted price
total = rows2[0][0] * rows2[0][1]
expected_total = web_price_per_pack * 2
print(f" Line: qty={rows2[0][0]}, price={rows2[0][1]:.2f}, total={total:.2f}")
if abs(total - expected_total) < 0.02:
print(" PASS: Distributed price correct")
success_count += 1
else:
print(f" FAIL: Total {total:.2f} != expected {expected_total:.2f}")
else:
print(f" INFO: Got {len(rows2)} lines (expected 1 for distributed)")
for r in rows2:
print(f" qty={r[0]}, price={r[1]:.2f}, codmat={r[2]}")
else:
cur.execute("SELECT PACK_IMPORT_COMENZI.get_last_error FROM DUAL")
err = cur.fetchone()[0]
print(f" FAIL: Order import failed: {err}")
conn.commit()
# Cleanup
teardown_test_data(cur)
conn.commit()
print(f"\n{'=' * 60}")
print(f"RESULTS: {success_count}/{total_tests} tests passed")
print('=' * 60)
return success_count == total_tests
except Exception as e:
print(f"CRITICAL ERROR: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
teardown_test_data(cur)
conn.commit()
except:
pass
return False
# ===========================================================================
# Group 10: Business Rule Regression Tests (Oracle integration)
# ===========================================================================
def _create_test_partner(cur, suffix):
"""Helper: create a test partner and return its ID."""
partner_var = cur.var(oracledb.NUMBER)
name = f'Test BizRule {suffix}'
cur.execute("""
DECLARE v_id NUMBER;
BEGIN
v_id := PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener(
NULL, :name, 'JUD:Bucuresti;BUCURESTI;Str Test;1',
'0720000000', 'bizrule@test.com');
:result := v_id;
END;
""", {'name': name, 'result': partner_var})
return partner_var.getvalue()
def _import_order(cur, order_number, partner_id, articles_json, kit_mode='separate_line', id_pol=1):
"""Helper: call importa_comanda and return order ID."""
result_var = cur.var(oracledb.NUMBER)
cur.execute("""
DECLARE v_id NUMBER;
BEGIN
PACK_IMPORT_COMENZI.importa_comanda(
:order_number, SYSDATE, :partner_id,
:articles_json,
NULL, NULL,
:id_pol, NULL, NULL,
:kit_mode,
NULL, NULL, NULL,
v_id);
:result := v_id;
END;
""", {
'order_number': order_number,
'partner_id': partner_id,
'articles_json': articles_json,
'id_pol': id_pol,
'kit_mode': kit_mode,
'result': result_var
})
return result_var.getvalue()
def _get_order_lines(cur, order_id):
"""Helper: fetch COMENZI_ELEMENTE rows for an order."""
cur.execute("""
SELECT ce.CANTITATE, ce.PRET, na.CODMAT, ce.PTVA
FROM COMENZI_ELEMENTE ce
JOIN NOM_ARTICOLE na ON ce.ID_ARTICOL = na.ID_ARTICOL
WHERE ce.ID_COMANDA = :oid
ORDER BY ce.CANTITATE DESC, ce.PRET DESC
""", {'oid': order_id})
return cur.fetchall()
def test_multi_kit_discount_merge():
"""Regression (0666d6b): 2 identical kits at same VAT must merge discount lines,
not crash on duplicate check collision."""
print("\n" + "=" * 60)
print("TEST: Multi-kit discount merge (separate_line)")
print("=" * 60)
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
setup_test_data(cur)
partner_id = _create_test_partner(cur, suffix)
# 2 identical CAFE100 kits: total web = 2 * 450 = 900
articles_json = '[{"sku": "CAFE100", "cantitate": 2, "pret": 450}]'
order_id = _import_order(cur, f'TEST-BIZ-MERGE-{suffix}', partner_id, articles_json)
assert order_id and order_id > 0, "Order import failed"
rows = _get_order_lines(cur, order_id)
art_lines = [r for r in rows if r[0] > 0]
disc_lines = [r for r in rows if r[0] < 0]
assert len(art_lines) >= 1, f"Expected article line(s), got {len(art_lines)}"
assert len(disc_lines) >= 1, f"Expected discount line(s), got {len(disc_lines)}"
total = sum(r[0] * r[1] for r in rows)
expected = 900.0
print(f" Total: {total:.2f} (expected: {expected:.2f})")
assert abs(total - expected) < 0.02, f"Total {total:.2f} != expected {expected:.2f}"
print(" PASS")
conn.commit()
teardown_test_data(cur)
conn.commit()
return True
except Exception as e:
print(f" FAIL: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
teardown_test_data(cur)
conn.commit()
except:
pass
return False
def test_kit_discount_per_kit_placement():
"""Regression (580ca59): discount lines must appear after article lines (both present)."""
print("\n" + "=" * 60)
print("TEST: Kit discount per-kit placement")
print("=" * 60)
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
setup_test_data(cur)
partner_id = _create_test_partner(cur, suffix)
articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 450}]'
order_id = _import_order(cur, f'TEST-BIZ-PLACE-{suffix}', partner_id, articles_json)
assert order_id and order_id > 0, "Order import failed"
rows = _get_order_lines(cur, order_id)
art_lines = [r for r in rows if r[0] > 0]
disc_lines = [r for r in rows if r[0] < 0]
print(f" Article lines: {len(art_lines)}, Discount lines: {len(disc_lines)}")
assert len(art_lines) >= 1, "No article line found"
assert len(disc_lines) >= 1, "No discount line found — kit pricing did not activate"
print(" PASS")
conn.commit()
teardown_test_data(cur)
conn.commit()
return True
except Exception as e:
print(f" FAIL: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
teardown_test_data(cur)
conn.commit()
except:
pass
return False
def test_repackaging_distributed_total_matches_web():
"""Regression (61ae58e): distributed mode total must match web price exactly."""
print("\n" + "=" * 60)
print("TEST: Repackaging distributed total matches web")
print("=" * 60)
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
setup_test_data(cur)
partner_id = _create_test_partner(cur, suffix)
# 3 packs @ 400 lei => total web = 1200
articles_json = '[{"sku": "CAFE100", "cantitate": 3, "pret": 400}]'
order_id = _import_order(cur, f'TEST-BIZ-DIST-{suffix}', partner_id,
articles_json, kit_mode='distributed')
assert order_id and order_id > 0, "Order import failed"
rows = _get_order_lines(cur, order_id)
# Distributed: single line with adjusted price
positive_lines = [r for r in rows if r[0] > 0]
assert len(positive_lines) == 1, f"Expected 1 line in distributed mode, got {len(positive_lines)}"
total = positive_lines[0][0] * positive_lines[0][1]
expected = 1200.0
print(f" Line: qty={positive_lines[0][0]}, price={positive_lines[0][1]:.2f}")
print(f" Total: {total:.2f} (expected: {expected:.2f})")
assert abs(total - expected) < 0.02, f"Total {total:.2f} != expected {expected:.2f}"
print(" PASS")
conn.commit()
teardown_test_data(cur)
conn.commit()
return True
except Exception as e:
print(f" FAIL: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
teardown_test_data(cur)
conn.commit()
except:
pass
return False
def test_kit_markup_no_negative_discount():
"""Regression (47b5723): when web price > list price (markup), no discount line should be inserted."""
print("\n" + "=" * 60)
print("TEST: Kit markup — no negative discount")
print("=" * 60)
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
setup_test_data(cur)
partner_id = _create_test_partner(cur, suffix)
# CAF01 list price = 51.50/unit, 10 units = 515
# Web price 600 > 515 => markup, no discount line
articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 600}]'
order_id = _import_order(cur, f'TEST-BIZ-MARKUP-{suffix}', partner_id, articles_json)
assert order_id and order_id > 0, "Order import failed"
rows = _get_order_lines(cur, order_id)
disc_lines = [r for r in rows if r[0] < 0]
print(f" Total lines: {len(rows)}, Discount lines: {len(disc_lines)}")
assert len(disc_lines) == 0, f"Expected 0 discount lines for markup, got {len(disc_lines)}"
print(" PASS")
conn.commit()
teardown_test_data(cur)
conn.commit()
return True
except Exception as e:
print(f" FAIL: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
teardown_test_data(cur)
conn.commit()
except:
pass
return False
def test_kit_component_price_zero_import():
"""Regression (1703232): kit components with pret=0 should import successfully."""
print("\n" + "=" * 60)
print("TEST: Kit component price=0 import")
print("=" * 60)
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
setup_test_data(cur)
partner_id = _create_test_partner(cur, suffix)
# Temporarily set CAF01 price to 0
cur.execute("""
UPDATE crm_politici_pret_art SET PRET = 0
WHERE id_articol = 9999001 AND id_pol = 1
""")
conn.commit()
try:
# Import with pret=0 — should succeed (discount = full web price)
articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 100}]'
order_id = _import_order(cur, f'TEST-BIZ-PRET0-{suffix}', partner_id, articles_json)
print(f" Order ID: {order_id}")
assert order_id and order_id > 0, "Order import failed with pret=0"
print(" PASS: Order imported successfully with pret=0")
conn.commit()
finally:
# Restore original price
cur.execute("""
UPDATE crm_politici_pret_art SET PRET = 51.50
WHERE id_articol = 9999001 AND id_pol = 1
""")
conn.commit()
teardown_test_data(cur)
conn.commit()
return True
except Exception as e:
print(f" FAIL: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
# Restore price on error
cur.execute("""
UPDATE crm_politici_pret_art SET PRET = 51.50
WHERE id_articol = 9999001 AND id_pol = 1
""")
conn.commit()
teardown_test_data(cur)
conn.commit()
except:
pass
return False
def test_duplicate_codmat_different_prices():
"""Regression (95565af): same CODMAT at different prices should create separate lines,
discriminated by PRET + SIGN(CANTITATE)."""
print("\n" + "=" * 60)
print("TEST: Duplicate CODMAT different prices")
print("=" * 60)
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
setup_test_data(cur)
partner_id = _create_test_partner(cur, suffix)
# Two articles both mapping to CAF01 but at different prices
# CAFE100 -> CAF01 via ARTICOLE_TERTI (kit pricing)
# We use separate_line mode so article gets list price 51.50
# Then a second article at a different price on the same CODMAT
# For this test, we import 2 separate orders to same CODMAT with different prices
# The real scenario: kit article line + discount line on same id_articol
articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 450}]'
order_id = _import_order(cur, f'TEST-BIZ-DUP-{suffix}', partner_id, articles_json)
assert order_id and order_id > 0, "Order import failed"
rows = _get_order_lines(cur, order_id)
# separate_line mode: article at list price + discount at negative qty
# Both reference same CODMAT (CAF01) but different PRET and SIGN(CANTITATE)
codmats = [r[2] for r in rows]
print(f" Lines: {len(rows)}")
for r in rows:
print(f" qty={r[0]}, pret={r[1]:.2f}, codmat={r[2]}")
# Should have at least 2 lines with same CODMAT but different qty sign
caf_lines = [r for r in rows if r[2] == 'CAF01']
assert len(caf_lines) >= 2, f"Expected 2+ CAF01 lines (article + discount), got {len(caf_lines)}"
signs = {1 if r[0] > 0 else -1 for r in caf_lines}
assert len(signs) == 2, "Expected both positive and negative quantity lines for same CODMAT"
print(" PASS: Same CODMAT with different PRET/SIGN coexist")
conn.commit()
teardown_test_data(cur)
conn.commit()
return True
except Exception as e:
print(f" FAIL: {e}")
import traceback
traceback.print_exc()
try:
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
with conn.cursor() as cur:
teardown_test_data(cur)
conn.commit()
except:
pass
return False
if __name__ == "__main__":
print("Starting complete order import test...")
print(f"Timestamp: {datetime.now()}")
success = test_complete_import()
print(f"\nTest completed at: {datetime.now()}")
if success:
print("PHASE 1 VALIDATION: SUCCESSFUL")
else:
print("PHASE 1 VALIDATION: NEEDS ATTENTION")
# Run repackaging kit pricing test
print("\n")
repack_success = test_repackaging_kit_pricing()
if repack_success:
print("REPACKAGING KIT PRICING: SUCCESSFUL")
else:
print("REPACKAGING KIT PRICING: NEEDS ATTENTION")
# Run business rule regression tests
print("\n")
biz_tests = [
("Multi-kit discount merge", test_multi_kit_discount_merge),
("Kit discount per-kit placement", test_kit_discount_per_kit_placement),
("Distributed total matches web", test_repackaging_distributed_total_matches_web),
("Markup no negative discount", test_kit_markup_no_negative_discount),
("Component price=0 import", test_kit_component_price_zero_import),
("Duplicate CODMAT different prices", test_duplicate_codmat_different_prices),
]
biz_passed = 0
for name, test_fn in biz_tests:
if test_fn():
biz_passed += 1
print(f"\nBusiness rule tests: {biz_passed}/{len(biz_tests)} passed")
exit(0 if success else 1)

View File

@@ -1,196 +0,0 @@
"""
Oracle Integration Tests for GoMag Import Manager (pytest-compatible)
=====================================================================
Requires Oracle connectivity and valid .env configuration.
Converted from api/test_integration.py.
Run:
pytest api/tests/test_integration.py -v
"""
import os
import sys
import pytest
# --- Marker: all tests require Oracle ---
pytestmark = pytest.mark.oracle
# Set working directory to project root so relative paths in .env work
_script_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..")
_project_root = os.path.dirname(_script_dir)
# Load .env from api/ before importing app modules
from dotenv import load_dotenv
_env_path = os.path.join(_script_dir, ".env")
load_dotenv(_env_path, override=True)
# TNS_ADMIN must point to the directory containing tnsnames.ora, not the file
_tns_admin = os.environ.get("TNS_ADMIN", "")
if _tns_admin and os.path.isfile(_tns_admin):
os.environ["TNS_ADMIN"] = os.path.dirname(_tns_admin)
elif not _tns_admin:
os.environ["TNS_ADMIN"] = _script_dir
# Add api/ to path so app package is importable
if _script_dir not in sys.path:
sys.path.insert(0, _script_dir)
@pytest.fixture(scope="module")
def client():
"""Create a TestClient with Oracle lifespan.
Re-apply .env here because other test modules (test_requirements.py)
may have set ORACLE_DSN=dummy at import time during pytest collection.
"""
# Re-load .env to override any dummy values from other test modules
load_dotenv(_env_path, override=True)
_tns = os.environ.get("TNS_ADMIN", "")
if _tns and os.path.isfile(_tns):
os.environ["TNS_ADMIN"] = os.path.dirname(_tns)
elif not _tns:
os.environ["TNS_ADMIN"] = _script_dir
# Force-update the cached settings singleton with correct values from .env
from app.config import settings
settings.ORACLE_USER = os.environ.get("ORACLE_USER", "MARIUSM_AUTO")
settings.ORACLE_PASSWORD = os.environ.get("ORACLE_PASSWORD", "ROMFASTSOFT")
settings.ORACLE_DSN = os.environ.get("ORACLE_DSN", "ROA_CENTRAL")
settings.TNS_ADMIN = os.environ.get("TNS_ADMIN", _script_dir)
settings.FORCE_THIN_MODE = os.environ.get("FORCE_THIN_MODE", "") == "true"
from fastapi.testclient import TestClient
from app.main import app
with TestClient(app) as c:
yield c
# ---------------------------------------------------------------------------
# Test A: GET /health — Oracle must show as connected
# ---------------------------------------------------------------------------
def test_health_oracle_connected(client):
resp = client.get("/health")
assert resp.status_code == 200
body = resp.json()
assert body.get("oracle") == "ok", f"oracle={body.get('oracle')!r}"
assert body.get("sqlite") == "ok", f"sqlite={body.get('sqlite')!r}"
# ---------------------------------------------------------------------------
# Test B: Mappings CRUD cycle (uses real CODMAT from Oracle nomenclator)
# ---------------------------------------------------------------------------
@pytest.fixture(scope="module")
def test_sku():
"""Generate a unique test SKU per run to avoid conflicts with prior soft-deleted entries."""
import time
return f"PYTEST_SKU_{int(time.time())}"
@pytest.fixture(scope="module")
def real_codmat(client):
"""Find a real CODMAT from Oracle nomenclator to use in mappings tests."""
# min_length=2 on the endpoint, so use 2+ char search terms
for term in ["01", "PH", "CA"]:
resp = client.get("/api/articles/search", params={"q": term})
if resp.status_code == 200:
results = resp.json().get("results", [])
if results:
return results[0]["codmat"]
pytest.skip("No articles found in Oracle for CRUD test")
def test_mappings_create(client, real_codmat, test_sku):
resp = client.post("/api/mappings", json={
"sku": test_sku,
"codmat": real_codmat,
"cantitate_roa": 2.5,
})
assert resp.status_code == 200, f"create returned {resp.status_code}: {resp.json()}"
body = resp.json()
assert body.get("success") is True, f"create returned: {body}"
def test_mappings_list_after_create(client, real_codmat, test_sku):
resp = client.get("/api/mappings", params={"search": test_sku})
assert resp.status_code == 200
body = resp.json()
mappings = body.get("mappings", [])
found = any(
m["sku"] == test_sku and m["codmat"] == real_codmat
for m in mappings
)
assert found, f"mapping not found in list; got {mappings}"
def test_mappings_update(client, real_codmat, test_sku):
resp = client.put(f"/api/mappings/{test_sku}/{real_codmat}", json={
"cantitate_roa": 3.0,
})
assert resp.status_code == 200
body = resp.json()
assert body.get("success") is True, f"update returned: {body}"
def test_mappings_delete(client, real_codmat, test_sku):
resp = client.delete(f"/api/mappings/{test_sku}/{real_codmat}")
assert resp.status_code == 200
body = resp.json()
assert body.get("success") is True, f"delete returned: {body}"
def test_mappings_verify_soft_deleted(client, real_codmat, test_sku):
resp = client.get("/api/mappings", params={"search": test_sku, "show_deleted": "true"})
assert resp.status_code == 200
body = resp.json()
mappings = body.get("mappings", [])
deleted = any(
m["sku"] == test_sku and m["codmat"] == real_codmat and m.get("sters") == 1
for m in mappings
)
assert deleted, (
f"expected sters=1 for deleted mapping, got: "
f"{[m for m in mappings if m['sku'] == test_sku]}"
)
# ---------------------------------------------------------------------------
# Test C: GET /api/articles/search
# ---------------------------------------------------------------------------
def test_articles_search(client):
search_terms = ["01", "A", "PH"]
found_results = False
for term in search_terms:
resp = client.get("/api/articles/search", params={"q": term})
assert resp.status_code == 200
body = resp.json()
results_list = body.get("results", [])
if results_list:
found_results = True
break
assert found_results, f"all search terms {search_terms} returned empty results"
# ---------------------------------------------------------------------------
# Test D: POST /api/validate/scan
# ---------------------------------------------------------------------------
def test_validate_scan(client):
resp = client.post("/api/validate/scan")
assert resp.status_code == 200
body = resp.json()
has_shape = "json_files" in body and ("orders" in body or "total_orders" in body)
assert has_shape, f"unexpected response shape: {list(body.keys())}"
# ---------------------------------------------------------------------------
# Test E: GET /api/sync/history
# ---------------------------------------------------------------------------
def test_sync_history(client):
resp = client.get("/api/sync/history")
assert resp.status_code == 200
body = resp.json()
assert "runs" in body, f"missing 'runs' key; got keys: {list(body.keys())}"
assert isinstance(body["runs"], list)
assert "total" in body

View File

@@ -1,620 +0,0 @@
"""
Test Phase 5.1: Backend Functionality Tests (no Oracle required)
================================================================
Tests all new backend features: web_products, order_items, order detail,
run orders filtered, address updates, missing SKUs toggle, and API endpoints.
Run:
cd api && python -m pytest tests/test_requirements.py -v
"""
import os
import sys
import pytest
pytestmark = pytest.mark.unit
import tempfile
# --- Set env vars BEFORE any app import ---
_tmpdir = tempfile.mkdtemp()
_sqlite_path = os.path.join(_tmpdir, "test_import.db")
os.environ["FORCE_THIN_MODE"] = "true"
os.environ["SQLITE_DB_PATH"] = _sqlite_path
os.environ["ORACLE_DSN"] = "dummy"
os.environ["ORACLE_USER"] = "dummy"
os.environ["ORACLE_PASSWORD"] = "dummy"
os.environ["JSON_OUTPUT_DIR"] = _tmpdir
# Add api/ to path so we can import app
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if _api_dir not in sys.path:
sys.path.insert(0, _api_dir)
import pytest
import pytest_asyncio
from app.database import init_sqlite
from app.services import sqlite_service
# Initialize SQLite once before any tests run
init_sqlite()
# ---------------------------------------------------------------------------
# Fixtures
# ---------------------------------------------------------------------------
@pytest.fixture(scope="module")
def client():
"""TestClient with lifespan (startup/shutdown) so SQLite routes work."""
from fastapi.testclient import TestClient
from app.main import app
with TestClient(app, raise_server_exceptions=False) as c:
yield c
@pytest.fixture(autouse=True, scope="module")
def seed_baseline_data():
"""
Seed the sync run and orders used by multiple tests so they run in any order.
We use asyncio.run() because this is a synchronous fixture but needs to call
async service functions.
"""
import asyncio
async def _seed():
# Create sync run RUN001
await sqlite_service.create_sync_run("RUN001", 1)
# Add the first order (IMPORTED) with items
await sqlite_service.upsert_order(
"RUN001", "ORD001", "2025-01-15", "Test Client", "IMPORTED",
id_comanda=100, id_partener=200, items_count=2
)
await sqlite_service.add_sync_run_order("RUN001", "ORD001", "IMPORTED")
items = [
{
"sku": "SKU1",
"product_name": "Prod 1",
"quantity": 2.0,
"price": 10.0,
"vat": 1.9,
"mapping_status": "direct",
"codmat": "SKU1",
"id_articol": 500,
"cantitate_roa": 2.0,
},
{
"sku": "SKU2",
"product_name": "Prod 2",
"quantity": 1.0,
"price": 20.0,
"vat": 3.8,
"mapping_status": "missing",
"codmat": None,
"id_articol": None,
"cantitate_roa": None,
},
]
await sqlite_service.add_order_items("ORD001", items)
# Add more orders for filter tests
await sqlite_service.upsert_order(
"RUN001", "ORD002", "2025-01-16", "Client 2", "SKIPPED",
missing_skus=["SKU99"], items_count=1
)
await sqlite_service.add_sync_run_order("RUN001", "ORD002", "SKIPPED")
await sqlite_service.upsert_order(
"RUN001", "ORD003", "2025-01-17", "Client 3", "ERROR",
error_message="Test error", items_count=3
)
await sqlite_service.add_sync_run_order("RUN001", "ORD003", "ERROR")
asyncio.run(_seed())
yield
# ---------------------------------------------------------------------------
# Section 1: web_products CRUD
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_upsert_web_product():
"""First upsert creates the row; second increments order_count."""
await sqlite_service.upsert_web_product("SKU001", "Product One")
name = await sqlite_service.get_web_product_name("SKU001")
assert name == "Product One"
# Second upsert should increment order_count (no assertion on count here,
# but must not raise and batch lookup should still find it)
await sqlite_service.upsert_web_product("SKU001", "Product One")
batch = await sqlite_service.get_web_products_batch(["SKU001", "NONEXIST"])
assert "SKU001" in batch
assert "NONEXIST" not in batch
@pytest.mark.asyncio
async def test_web_product_name_update():
"""Empty name should NOT overwrite an existing product name."""
await sqlite_service.upsert_web_product("SKU002", "Good Name")
await sqlite_service.upsert_web_product("SKU002", "")
name = await sqlite_service.get_web_product_name("SKU002")
assert name == "Good Name"
@pytest.mark.asyncio
async def test_get_web_product_name_missing():
"""Lookup for an SKU that was never inserted should return empty string."""
name = await sqlite_service.get_web_product_name("DEFINITELY_NOT_THERE_XYZ")
assert name == ""
@pytest.mark.asyncio
async def test_get_web_products_batch_empty():
"""Batch lookup with empty list should return empty dict without error."""
result = await sqlite_service.get_web_products_batch([])
assert result == {}
# ---------------------------------------------------------------------------
# Section 2: order_items CRUD
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_add_and_get_order_items():
"""Verify the items seeded in baseline data are retrievable."""
fetched = await sqlite_service.get_order_items("ORD001")
assert len(fetched) == 2
assert fetched[0]["sku"] == "SKU1"
assert fetched[1]["mapping_status"] == "missing"
@pytest.mark.asyncio
async def test_get_order_items_mapping_status():
"""First item should be 'direct', second should be 'missing'."""
fetched = await sqlite_service.get_order_items("ORD001")
assert fetched[0]["mapping_status"] == "direct"
assert fetched[1]["codmat"] is None
assert fetched[1]["id_articol"] is None
@pytest.mark.asyncio
async def test_get_order_items_for_nonexistent_order():
"""Items query for an unknown order should return an empty list."""
fetched = await sqlite_service.get_order_items("NONEXIST_ORDER")
assert fetched == []
# ---------------------------------------------------------------------------
# Section 3: order detail
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_get_order_detail():
"""Order detail returns order metadata plus its line items."""
detail = await sqlite_service.get_order_detail("ORD001")
assert detail is not None
assert detail["order"]["order_number"] == "ORD001"
assert len(detail["items"]) == 2
@pytest.mark.asyncio
async def test_get_order_detail_not_found():
"""Non-existent order returns None."""
detail = await sqlite_service.get_order_detail("NONEXIST")
assert detail is None
@pytest.mark.asyncio
async def test_get_order_detail_status():
"""Seeded ORD001 should have IMPORTED status."""
detail = await sqlite_service.get_order_detail("ORD001")
assert detail["order"]["status"] == "IMPORTED"
# ---------------------------------------------------------------------------
# Section 4: run orders filtered
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_get_run_orders_filtered_all():
"""All orders in run should total 3 with correct status counts."""
result = await sqlite_service.get_run_orders_filtered("RUN001", "all", 1, 50)
assert result["total"] == 3
assert result["counts"]["imported"] == 1
assert result["counts"]["skipped"] == 1
assert result["counts"]["error"] == 1
@pytest.mark.asyncio
async def test_get_run_orders_filtered_imported():
"""Filter IMPORTED should return only ORD001."""
result = await sqlite_service.get_run_orders_filtered("RUN001", "IMPORTED", 1, 50)
assert result["total"] == 1
assert result["orders"][0]["order_number"] == "ORD001"
@pytest.mark.asyncio
async def test_get_run_orders_filtered_skipped():
"""Filter SKIPPED should return only ORD002."""
result = await sqlite_service.get_run_orders_filtered("RUN001", "SKIPPED", 1, 50)
assert result["total"] == 1
assert result["orders"][0]["order_number"] == "ORD002"
@pytest.mark.asyncio
async def test_get_run_orders_filtered_error():
"""Filter ERROR should return only ORD003."""
result = await sqlite_service.get_run_orders_filtered("RUN001", "ERROR", 1, 50)
assert result["total"] == 1
assert result["orders"][0]["order_number"] == "ORD003"
@pytest.mark.asyncio
async def test_get_run_orders_filtered_unknown_run():
"""Unknown run_id should return zero orders without error."""
result = await sqlite_service.get_run_orders_filtered("NO_SUCH_RUN", "all", 1, 50)
assert result["total"] == 0
assert result["orders"] == []
@pytest.mark.asyncio
async def test_get_run_orders_filtered_pagination():
"""Pagination: page 1 with per_page=1 should return 1 order."""
result = await sqlite_service.get_run_orders_filtered("RUN001", "all", 1, 1)
assert len(result["orders"]) == 1
assert result["total"] == 3
assert result["pages"] == 3
# ---------------------------------------------------------------------------
# Section 5: update_import_order_addresses
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_update_import_order_addresses():
"""Address IDs should be persisted and retrievable via get_order_detail."""
await sqlite_service.update_import_order_addresses(
"ORD001",
id_adresa_facturare=300,
id_adresa_livrare=400
)
detail = await sqlite_service.get_order_detail("ORD001")
assert detail["order"]["id_adresa_facturare"] == 300
assert detail["order"]["id_adresa_livrare"] == 400
@pytest.mark.asyncio
async def test_update_import_order_addresses_null():
"""Updating with None should be accepted without error."""
await sqlite_service.update_import_order_addresses(
"ORD001",
id_adresa_facturare=None,
id_adresa_livrare=None
)
detail = await sqlite_service.get_order_detail("ORD001")
assert detail is not None # row still exists
# ---------------------------------------------------------------------------
# Section 6: missing SKUs resolved toggle (R10)
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_missing_skus_resolved_toggle():
"""resolved=-1 returns all; resolved=0/1 returns only matching rows."""
await sqlite_service.track_missing_sku("MISS1", "Missing Product 1")
await sqlite_service.track_missing_sku("MISS2", "Missing Product 2")
await sqlite_service.resolve_missing_sku("MISS2")
# Unresolved only (default)
result = await sqlite_service.get_missing_skus_paginated(1, 20, resolved=0)
assert all(s["resolved"] == 0 for s in result["missing_skus"])
# Resolved only
result = await sqlite_service.get_missing_skus_paginated(1, 20, resolved=1)
assert all(s["resolved"] == 1 for s in result["missing_skus"])
# All (resolved=-1)
result = await sqlite_service.get_missing_skus_paginated(1, 20, resolved=-1)
assert result["total"] >= 2
@pytest.mark.asyncio
async def test_track_missing_sku_idempotent():
"""Tracking the same SKU twice should not raise (INSERT OR IGNORE)."""
await sqlite_service.track_missing_sku("IDEMPOTENT_SKU", "Some Product")
await sqlite_service.track_missing_sku("IDEMPOTENT_SKU", "Some Product")
result = await sqlite_service.get_missing_skus_paginated(1, 20, resolved=0)
sku_list = [s["sku"] for s in result["missing_skus"]]
assert sku_list.count("IDEMPOTENT_SKU") == 1
@pytest.mark.asyncio
async def test_missing_skus_pagination():
"""Pagination response includes total, page, per_page, pages fields."""
result = await sqlite_service.get_missing_skus_paginated(1, 1, resolved=-1)
assert "total" in result
assert "page" in result
assert "per_page" in result
assert "pages" in result
assert len(result["missing_skus"]) <= 1
# ---------------------------------------------------------------------------
# Section 7: API endpoints via TestClient
# ---------------------------------------------------------------------------
def test_api_sync_run_orders(client):
"""R1: GET /api/sync/run/{run_id}/orders returns orders and counts."""
resp = client.get("/api/sync/run/RUN001/orders?status=all&page=1&per_page=50")
assert resp.status_code == 200
data = resp.json()
assert "orders" in data
assert "counts" in data
def test_api_sync_run_orders_filtered(client):
"""R1: Filtering by status=IMPORTED returns only IMPORTED orders."""
resp = client.get("/api/sync/run/RUN001/orders?status=IMPORTED")
assert resp.status_code == 200
data = resp.json()
assert all(o["status"] == "IMPORTED" for o in data["orders"])
def test_api_sync_run_orders_pagination_fields(client):
"""R1: Paginated response includes total, page, per_page, pages."""
resp = client.get("/api/sync/run/RUN001/orders?status=all&page=1&per_page=10")
assert resp.status_code == 200
data = resp.json()
assert "total" in data
assert "page" in data
assert "per_page" in data
assert "pages" in data
def test_api_sync_run_orders_unknown_run(client):
"""R1: Unknown run_id returns empty orders list, not 4xx/5xx."""
resp = client.get("/api/sync/run/NO_SUCH_RUN/orders")
assert resp.status_code == 200
data = resp.json()
assert data["total"] == 0
def test_api_order_detail(client):
"""R9: GET /api/sync/order/{order_number} returns order and items."""
resp = client.get("/api/sync/order/ORD001")
# 200 if Oracle available, 500 if Oracle enrichment fails
assert resp.status_code in [200, 500]
if resp.status_code == 200:
data = resp.json()
assert "order" in data
assert "items" in data
def test_api_order_detail_not_found(client):
"""R9: Non-existent order number returns error key."""
resp = client.get("/api/sync/order/NONEXIST")
assert resp.status_code == 200
data = resp.json()
assert "error" in data
def test_api_missing_skus_resolved_toggle(client):
"""R10: resolved=-1 returns all missing SKUs."""
resp = client.get("/api/validate/missing-skus?resolved=-1")
assert resp.status_code == 200
data = resp.json()
assert "missing_skus" in data
def test_api_missing_skus_resolved_unresolved(client):
"""R10: resolved=0 returns only unresolved SKUs."""
resp = client.get("/api/validate/missing-skus?resolved=0")
assert resp.status_code == 200
data = resp.json()
assert "missing_skus" in data
assert all(s["resolved"] == 0 for s in data["missing_skus"])
def test_api_missing_skus_resolved_only(client):
"""R10: resolved=1 returns only resolved SKUs."""
resp = client.get("/api/validate/missing-skus?resolved=1")
assert resp.status_code == 200
data = resp.json()
assert "missing_skus" in data
assert all(s["resolved"] == 1 for s in data["missing_skus"])
def test_api_missing_skus_csv_format(client):
"""R8: CSV export has mapping-compatible columns."""
resp = client.get("/api/validate/missing-skus-csv")
assert resp.status_code == 200
content = resp.content.decode("utf-8-sig")
header_line = content.split("\n")[0].strip()
assert header_line == "sku,codmat,cantitate_roa,procent_pret,product_name"
def test_api_mappings_sort_params(client):
"""R7: Sort params accepted - no 422 validation error even without Oracle."""
resp = client.get("/api/mappings?sort_by=sku&sort_dir=desc")
# 200 if Oracle available, 503 if not - but never 422 (invalid params)
assert resp.status_code in [200, 503]
def test_api_mappings_sort_params_asc(client):
"""R7: sort_dir=asc is also accepted without 422."""
resp = client.get("/api/mappings?sort_by=codmat&sort_dir=asc")
assert resp.status_code in [200, 503]
def test_api_batch_mappings_validation_percentage(client):
"""R11: Batch endpoint rejects procent_pret that does not sum to 100."""
resp = client.post("/api/mappings/batch", json={
"sku": "TESTSKU",
"mappings": [
{"codmat": "COD1", "cantitate_roa": 1, "procent_pret": 60},
{"codmat": "COD2", "cantitate_roa": 1, "procent_pret": 30},
]
})
data = resp.json()
# 60 + 30 = 90, not 100 -> must fail validation (or Oracle unavailable)
assert data.get("success") is False
def test_api_batch_mappings_validation_exact_100(client):
"""R11: Batch with procent_pret summing to exactly 100 passes validation layer."""
resp = client.post("/api/mappings/batch", json={
"sku": "TESTSKU_VALID",
"mappings": [
{"codmat": "COD1", "cantitate_roa": 1, "procent_pret": 60},
{"codmat": "COD2", "cantitate_roa": 1, "procent_pret": 40},
]
})
data = resp.json()
# Validation passes; may fail with 503/error if Oracle is unavailable,
# but must NOT return the percentage error message
assert "100%" not in data.get("error", "")
def test_api_batch_mappings_no_mappings(client):
"""R11: Batch endpoint rejects empty mappings list."""
resp = client.post("/api/mappings/batch", json={
"sku": "TESTSKU",
"mappings": []
})
data = resp.json()
assert data.get("success") is False
def test_api_sync_status(client):
"""GET /api/sync/status returns status and sync state keys."""
resp = client.get("/api/sync/status")
assert resp.status_code == 200
data = resp.json()
assert "status" in data or "counts" in data
def test_api_sync_history(client):
"""GET /api/sync/history returns paginated run history."""
resp = client.get("/api/sync/history")
assert resp.status_code == 200
data = resp.json()
assert "runs" in data
assert "total" in data
def test_api_missing_skus_pagination_params(client):
"""Pagination params page and per_page are respected."""
resp = client.get("/api/validate/missing-skus?page=1&per_page=2&resolved=-1")
assert resp.status_code == 200
data = resp.json()
assert len(data["missing_skus"]) <= 2
assert data["per_page"] == 2
def test_api_csv_template(client):
"""GET /api/mappings/csv-template returns a CSV file without Oracle."""
resp = client.get("/api/mappings/csv-template")
assert resp.status_code == 200
# ---------------------------------------------------------------------------
# Section 8: Chronological sorting (R3)
# ---------------------------------------------------------------------------
def test_chronological_sort():
"""R3: Orders sorted oldest-first when sorted by date string."""
from app.services.order_reader import OrderData, OrderBilling
orders = [
OrderData(id="3", number="003", date="2025-03-01", billing=OrderBilling()),
OrderData(id="1", number="001", date="2025-01-01", billing=OrderBilling()),
OrderData(id="2", number="002", date="2025-02-01", billing=OrderBilling()),
]
orders.sort(key=lambda o: o.date or "")
assert orders[0].number == "001"
assert orders[1].number == "002"
assert orders[2].number == "003"
def test_chronological_sort_stable_on_equal_dates():
"""R3: Two orders with the same date preserve relative order."""
from app.services.order_reader import OrderData, OrderBilling
orders = [
OrderData(id="A", number="A01", date="2025-05-01", billing=OrderBilling()),
OrderData(id="B", number="B01", date="2025-05-01", billing=OrderBilling()),
]
orders.sort(key=lambda o: o.date or "")
# Both dates equal; stable sort preserves original order
assert orders[0].number == "A01"
assert orders[1].number == "B01"
def test_chronological_sort_empty_date_last():
"""R3: Orders with missing date (empty string) sort before dated orders."""
from app.services.order_reader import OrderData, OrderBilling
orders = [
OrderData(id="2", number="002", date="2025-06-01", billing=OrderBilling()),
OrderData(id="1", number="001", date="", billing=OrderBilling()),
]
orders.sort(key=lambda o: o.date or "")
# '' sorts before '2025-...' lexicographically
assert orders[0].number == "001"
assert orders[1].number == "002"
# ---------------------------------------------------------------------------
# Section 9: OrderData dataclass integrity
# ---------------------------------------------------------------------------
def test_order_data_defaults():
"""OrderData can be constructed with only id, number, date."""
from app.services.order_reader import OrderData, OrderBilling
order = OrderData(id="1", number="001", date="2025-01-01", billing=OrderBilling())
assert order.status == ""
assert order.items == []
assert order.shipping is None
def test_order_billing_defaults():
"""OrderBilling has sensible defaults."""
from app.services.order_reader import OrderBilling
b = OrderBilling()
assert b.is_company is False
assert b.company_name == ""
assert b.email == ""
def test_get_all_skus():
"""get_all_skus extracts a unique set of SKUs from all orders."""
from app.services.order_reader import OrderData, OrderBilling, OrderItem, get_all_skus
orders = [
OrderData(
id="1", number="001", date="2025-01-01",
billing=OrderBilling(),
items=[
OrderItem(sku="A", name="Prod A", price=10, quantity=1, vat=1.9),
OrderItem(sku="B", name="Prod B", price=20, quantity=2, vat=3.8),
]
),
OrderData(
id="2", number="002", date="2025-01-02",
billing=OrderBilling(),
items=[
OrderItem(sku="A", name="Prod A", price=10, quantity=1, vat=1.9),
OrderItem(sku="C", name="Prod C", price=5, quantity=3, vat=0.95),
]
),
]
skus = get_all_skus(orders)
assert skus == {"A", "B", "C"}

View File

@@ -1,9 +0,0 @@
ROA_CENTRAL =
(DESCRIPTION =
(ADDRESS_LIST =
(ADDRESS = (PROTOCOL = tcp)(HOST = 10.0.20.121)(PORT = 1521))
)
(CONNECT_DATA =
(SERVICE_NAME = ROA)
)
)

View File

@@ -1,528 +0,0 @@
#Requires -RunAsAdministrator
<#
.SYNOPSIS
Deploy / update GoMag Import Manager pe Windows Server cu IIS.
.DESCRIPTION
- Prima rulare: clone repo, setup venv, genereaza start.bat, configureaza IIS
- Rulari ulterioare: git pull, reinstaleaza deps, restarteaza serviciul
.PARAMETER RepoPath
Calea locala unde se cloneaza repo-ul. Default: C:\gomag-vending
.PARAMETER Port
Portul pe care ruleaza FastAPI. Default: 5003
.PARAMETER IisSiteName
Numele site-ului IIS parinte. Default: "Default Web Site"
.PARAMETER SkipIIS
Sarit configurarea IIS (util daca nu ai ARR/URLRewrite instalate inca)
.EXAMPLE
.\deploy.ps1
.\deploy.ps1 -RepoPath "D:\apps\gomag-vending" -Port 5003
.\deploy.ps1 -SkipIIS
#>
param(
[string]$RepoPath = "C:\gomag-vending",
[int] $Port = 5003,
[string]$IisSiteName = "Default Web Site",
[switch]$SkipIIS
)
Set-StrictMode -Version Latest
$ErrorActionPreference = "Stop"
# ─────────────────────────────────────────────────────────────────────────────
# Helpers
# ─────────────────────────────────────────────────────────────────────────────
function Write-Step { param([string]$msg) Write-Host "`n==> $msg" -ForegroundColor Cyan }
function Write-OK { param([string]$msg) Write-Host " [OK] $msg" -ForegroundColor Green }
function Write-Warn { param([string]$msg) Write-Host " [WARN] $msg" -ForegroundColor Yellow }
function Write-Fail { param([string]$msg) Write-Host " [FAIL] $msg" -ForegroundColor Red }
function Write-Info { param([string]$msg) Write-Host " $msg" -ForegroundColor Gray }
$ScriptDir = Split-Path -Parent $MyInvocation.MyCommand.Definition
# ─────────────────────────────────────────────────────────────────────────────
# 1. Citire token Gitea
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Citire token Gitea"
$TokenFile = Join-Path $ScriptDir ".gittoken"
$GitToken = ""
if (Test-Path $TokenFile) {
$GitToken = (Get-Content $TokenFile -Raw).Trim()
Write-OK "Token citit din $TokenFile"
} else {
Write-Warn ".gittoken nu exista langa deploy.ps1"
Write-Info "Creeaza fisierul $TokenFile cu token-ul tau Gitea (fara newline)"
Write-Info "Ex: echo -n 'ghp_xxxx' > .gittoken"
Write-Info ""
Write-Info "Continui fara token (merge doar daca repo-ul e public sau deja clonat)"
}
$RepoUrl = if ($GitToken) {
"https://$GitToken@gitea.romfast.ro/romfast/gomag-vending.git"
} else {
"https://gitea.romfast.ro/romfast/gomag-vending.git"
}
# ─────────────────────────────────────────────────────────────────────────────
# 2. Git clone / pull
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Git clone / pull"
# Verifica git instalat
if (-not (Get-Command git -ErrorAction SilentlyContinue)) {
Write-Fail "Git nu este instalat!"
Write-Info "Descarca Git for Windows de la: https://git-scm.com/download/win"
exit 1
}
if (Test-Path (Join-Path $RepoPath ".git")) {
Write-Info "Repo exista, fac git pull..."
Push-Location $RepoPath
try {
# Update remote URL cu tokenul curent (in caz ca s-a schimbat)
if ($GitToken) {
git remote set-url origin $RepoUrl 2>$null
}
git pull --ff-only
Write-OK "git pull OK"
} finally {
Pop-Location
}
} else {
Write-Info "Clonez in $RepoPath ..."
$ParentDir = Split-Path -Parent $RepoPath
if (-not (Test-Path $ParentDir)) {
New-Item -ItemType Directory -Path $ParentDir -Force | Out-Null
}
git clone $RepoUrl $RepoPath
Write-OK "git clone OK"
}
# ─────────────────────────────────────────────────────────────────────────────
# 3. Verificare Python
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Verificare Python"
$PythonCmd = $null
foreach ($candidate in @("python", "python3", "py")) {
try {
$ver = & $candidate --version 2>&1
if ($ver -match "Python 3\.(\d+)") {
$minor = [int]$Matches[1]
if ($minor -ge 11) {
$PythonCmd = $candidate
Write-OK "Python gasit: $ver ($candidate)"
break
} else {
Write-Warn "Python $ver prea vechi (necesar 3.11+)"
}
}
} catch { }
}
if (-not $PythonCmd) {
Write-Fail "Python 3.11+ nu este instalat sau nu e in PATH!"
Write-Info "Descarca de la: https://www.python.org/downloads/"
Write-Info "IMPORTANT: Bifeaza 'Add Python to PATH' la instalare"
exit 1
}
# ─────────────────────────────────────────────────────────────────────────────
# 4. Creare venv si instalare dependinte
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Virtual environment + dependinte"
$VenvDir = Join-Path $RepoPath "venv"
$VenvPip = Join-Path $VenvDir "Scripts\pip.exe"
$VenvPy = Join-Path $VenvDir "Scripts\python.exe"
$ReqFile = Join-Path $RepoPath "api\requirements.txt"
$DepsFlag = Join-Path $VenvDir ".deps_installed"
if (-not (Test-Path $VenvDir)) {
Write-Info "Creez venv..."
& $PythonCmd -m venv $VenvDir
Write-OK "venv creat"
}
# Reinstaleaza daca requirements.txt e mai nou decat flag-ul
$needInstall = $true
if (Test-Path $DepsFlag) {
$reqTime = (Get-Item $ReqFile).LastWriteTime
$flagTime = (Get-Item $DepsFlag).LastWriteTime
if ($flagTime -ge $reqTime) { $needInstall = $false }
}
if ($needInstall) {
Write-Info "Instalez dependinte din requirements.txt..."
& $VenvPip install --upgrade pip --quiet
& $VenvPip install -r $ReqFile
New-Item -ItemType File -Path $DepsFlag -Force | Out-Null
Write-OK "Dependinte instalate"
} else {
Write-OK "Dependinte deja up-to-date"
}
# ─────────────────────────────────────────────────────────────────────────────
# 5. Detectare Oracle Home → sugestie INSTANTCLIENTPATH
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Detectare Oracle"
$OracleHome = $env:ORACLE_HOME
$OracleBinPath = ""
if ($OracleHome -and (Test-Path $OracleHome)) {
$OracleBinPath = Join-Path $OracleHome "bin"
Write-OK "ORACLE_HOME detectat: $OracleHome"
Write-Info "Seteaza in api\.env: INSTANTCLIENTPATH=$OracleBinPath"
} else {
# Cauta Oracle in locatii comune
$commonPaths = @(
"C:\oracle\product\19c\dbhome_1\bin",
"C:\oracle\product\21c\dbhome_1\bin",
"C:\app\oracle\product\19.0.0\dbhome_1\bin",
"C:\oracle\instantclient_19_15",
"C:\oracle\instantclient_21_3"
)
foreach ($p in $commonPaths) {
if (Test-Path "$p\oci.dll") {
$OracleBinPath = $p
Write-OK "Oracle gasit la: $p"
Write-Info "Seteaza in api\.env: INSTANTCLIENTPATH=$p"
break
}
}
if (-not $OracleBinPath) {
Write-Warn "Oracle Instant Client nu a fost gasit automat"
Write-Info "Optiuni:"
Write-Info " 1. Thick mode: seteaza INSTANTCLIENTPATH=<cale_oracle_bin> in api\.env"
Write-Info " 2. Thin mode: seteaza FORCE_THIN_MODE=true in api\.env"
}
}
# ─────────────────────────────────────────────────────────────────────────────
# 6. Creare .env din template daca lipseste
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Fisier configurare api\.env"
$EnvFile = Join-Path $RepoPath "api\.env"
$EnvExample = Join-Path $RepoPath "api\.env.example"
if (-not (Test-Path $EnvFile)) {
if (Test-Path $EnvExample) {
Copy-Item $EnvExample $EnvFile
Write-OK "api\.env creat din .env.example"
# Actualizeaza TNS_ADMIN cu calea reala
$ApiDir = Join-Path $RepoPath "api"
(Get-Content $EnvFile) -replace "TNS_ADMIN=.*", "TNS_ADMIN=$ApiDir" |
Set-Content $EnvFile
# Seteaza INSTANTCLIENTPATH daca am gasit Oracle
if ($OracleBinPath) {
(Get-Content $EnvFile) -replace "INSTANTCLIENTPATH=.*", "INSTANTCLIENTPATH=$OracleBinPath" |
Set-Content $EnvFile
}
Write-Warn "IMPORTANT: Editeaza $EnvFile cu credentialele Oracle si GoMag API!"
Write-Info " ORACLE_USER, ORACLE_PASSWORD, ORACLE_DSN"
Write-Info " GOMAG_API_KEY, GOMAG_API_SHOP"
} else {
Write-Warn ".env.example nu exista, sari pasul"
}
} else {
Write-OK "api\.env exista deja"
}
# ─────────────────────────────────────────────────────────────────────────────
# 7. Creare directoare necesare
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Directoare date"
foreach ($dir in @("data", "output", "logs")) {
$fullPath = Join-Path $RepoPath $dir
if (-not (Test-Path $fullPath)) {
New-Item -ItemType Directory -Path $fullPath -Force | Out-Null
Write-OK "Creat: $dir\"
} else {
Write-OK "Exista: $dir\"
}
}
# ─────────────────────────────────────────────────────────────────────────────
# 8. Generare start.bat
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Generare start.bat"
$StartBat = Join-Path $RepoPath "start.bat"
# Citeste TNS_ADMIN si INSTANTCLIENTPATH din .env daca exista
$TnsAdmin = Join-Path $RepoPath "api"
$InstantClient = ""
if (Test-Path $EnvFile) {
Get-Content $EnvFile | ForEach-Object {
if ($_ -match "^TNS_ADMIN=(.+)") {
$TnsAdmin = $Matches[1].Trim()
}
if ($_ -match "^INSTANTCLIENTPATH=(.+)" -and $_ -notmatch "^#") {
$InstantClient = $Matches[1].Trim()
}
}
}
$OraclePathLine = ""
if ($InstantClient) {
$OraclePathLine = "set PATH=$InstantClient;%PATH%"
}
$StartBatContent = @"
@echo off
:: GoMag Import Manager - Windows Launcher
:: Generat de deploy.ps1 - nu edita manual, ruleaza deploy.ps1 din nou
cd /d "$RepoPath"
set TNS_ADMIN=$TnsAdmin
$OraclePathLine
echo Starting GoMag Import Manager on http://0.0.0.0:$Port (prefix /gomag)
"$VenvPy" -m uvicorn app.main:app --host 0.0.0.0 --port $Port --root-path /gomag --app-dir api
"@
Set-Content -Path $StartBat -Value $StartBatContent -Encoding UTF8
Write-OK "start.bat generat: $StartBat"
# ─────────────────────────────────────────────────────────────────────────────
# 9. IIS — Verificare ARR + URL Rewrite
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Verificare module IIS"
if ($SkipIIS) {
Write-Warn "SkipIIS activ — configurare IIS sarita"
} else {
$ArrPath = "$env:SystemRoot\System32\inetsrv\arr.dll"
$UrlRewritePath = "$env:SystemRoot\System32\inetsrv\rewrite.dll"
$ArrOk = Test-Path $ArrPath
$UrlRwOk = Test-Path $UrlRewritePath
if ($ArrOk) {
Write-OK "Application Request Routing (ARR) instalat"
} else {
Write-Warn "ARR 3.0 NU este instalat"
Write-Info "Descarca: https://www.iis.net/downloads/microsoft/application-request-routing"
Write-Info "Sau: winget install Microsoft.ARR"
}
if ($UrlRwOk) {
Write-OK "URL Rewrite 2.1 instalat"
} else {
Write-Warn "URL Rewrite 2.1 NU este instalat"
Write-Info "Descarca: https://www.iis.net/downloads/microsoft/url-rewrite"
Write-Info "Sau: winget install Microsoft.URLRewrite"
}
# ─────────────────────────────────────────────────────────────────────────
# 10. Configurare IIS — copiere web.config
# ─────────────────────────────────────────────────────────────────────────
if ($ArrOk -and $UrlRwOk) {
Write-Step "Configurare IIS reverse proxy"
# Activeaza proxy in ARR (necesar o singura data)
try {
Import-Module WebAdministration -ErrorAction SilentlyContinue
$proxyEnabled = (Get-WebConfigurationProperty `
-pspath "MACHINE/WEBROOT/APPHOST" `
-filter "system.webServer/proxy" `
-name "enabled" `
-ErrorAction SilentlyContinue).Value
if (-not $proxyEnabled) {
Set-WebConfigurationProperty `
-pspath "MACHINE/WEBROOT/APPHOST" `
-filter "system.webServer/proxy" `
-name "enabled" `
-value $true
Write-OK "ARR proxy activat global"
} else {
Write-OK "ARR proxy deja activ"
}
} catch {
Write-Warn "Nu am putut activa ARR proxy automat: $($_.Exception.Message)"
Write-Info "Activeaza manual din IIS Manager → server root → Application Request Routing Cache → Enable Proxy"
}
# Determina wwwroot site-ului IIS
$IisRootPath = $null
try {
Import-Module WebAdministration -ErrorAction SilentlyContinue
$site = Get-Website -Name $IisSiteName -ErrorAction SilentlyContinue
if ($site) {
$IisRootPath = [System.Environment]::ExpandEnvironmentVariables($site.PhysicalPath)
Write-OK "Site IIS '$IisSiteName' gasit: $IisRootPath"
} else {
Write-Warn "Site IIS '$IisSiteName' nu a fost gasit"
}
} catch {
# Fallback la locatia standard
$IisRootPath = "$env:SystemDrive\inetpub\wwwroot"
Write-Warn "WebAdministration unavailable, folosesc fallback: $IisRootPath"
}
if ($IisRootPath) {
$SourceWebConfig = Join-Path $RepoPath "iis-web.config"
$DestWebConfig = Join-Path $IisRootPath "web.config"
if (Test-Path $SourceWebConfig) {
# Inlocuieste portul in web.config cu cel configurat
$wcContent = Get-Content $SourceWebConfig -Raw
$wcContent = $wcContent -replace "localhost:5003", "localhost:$Port"
if (Test-Path $DestWebConfig) {
# Backup web.config existent
$backup = "$DestWebConfig.bak_$(Get-Date -Format 'yyyyMMdd_HHmmss')"
Copy-Item $DestWebConfig $backup
Write-Info "Backup web.config: $backup"
}
Set-Content -Path $DestWebConfig -Value $wcContent -Encoding UTF8
Write-OK "web.config copiat in $IisRootPath"
} else {
Write-Warn "iis-web.config nu exista in repo, sarit"
}
# Restart IIS
try {
iisreset /noforce 2>&1 | Out-Null
Write-OK "IIS restartat"
} catch {
Write-Warn "IIS restart esuat: $($_.Exception.Message)"
Write-Info "Ruleaza manual: iisreset"
}
}
} else {
Write-Warn "IIS nu e configurat complet — instaleaza ARR si URL Rewrite, apoi ruleaza deploy.ps1 din nou"
}
}
# ─────────────────────────────────────────────────────────────────────────────
# 11. Serviciu Windows (NSSM sau Task Scheduler)
# ─────────────────────────────────────────────────────────────────────────────
Write-Step "Serviciu Windows"
$ServiceName = "GoMagVending"
$NssmExe = ""
# Cauta NSSM
foreach ($p in @("nssm", "C:\nssm\win64\nssm.exe", "C:\tools\nssm\nssm.exe")) {
if (Get-Command $p -ErrorAction SilentlyContinue) {
$NssmExe = $p
break
}
}
if ($NssmExe) {
Write-Info "NSSM gasit: $NssmExe"
$existingService = Get-Service -Name $ServiceName -ErrorAction SilentlyContinue
if ($existingService) {
Write-Info "Serviciu existent, restarteaza..."
& $NssmExe restart $ServiceName
Write-OK "Serviciu $ServiceName restartat"
} else {
Write-Info "Instalez serviciu $ServiceName cu NSSM..."
& $NssmExe install $ServiceName (Join-Path $RepoPath "start.bat")
& $NssmExe set $ServiceName AppDirectory $RepoPath
& $NssmExe set $ServiceName DisplayName "GoMag Vending Import Manager"
& $NssmExe set $ServiceName Description "Import comenzi web GoMag -> ROA Oracle"
& $NssmExe set $ServiceName Start SERVICE_AUTO_START
& $NssmExe set $ServiceName AppStdout (Join-Path $RepoPath "logs\service_stdout.log")
& $NssmExe set $ServiceName AppStderr (Join-Path $RepoPath "logs\service_stderr.log")
& $NssmExe set $ServiceName AppRotateFiles 1
& $NssmExe set $ServiceName AppRotateOnline 1
& $NssmExe set $ServiceName AppRotateBytes 10485760
& $NssmExe start $ServiceName
Write-OK "Serviciu $ServiceName instalat si pornit"
}
} else {
# Fallback: Task Scheduler
Write-Warn "NSSM nu este instalat"
Write-Info "Optiuni:"
Write-Info " 1. Descarca NSSM: https://nssm.cc/download si pune nssm.exe in PATH"
Write-Info " 2. Sau foloseste Task Scheduler (creat mai jos)"
# Verifica daca task-ul exista deja
$taskExists = Get-ScheduledTask -TaskName $ServiceName -ErrorAction SilentlyContinue
if (-not $taskExists) {
Write-Info "Creez Task Scheduler task '$ServiceName'..."
try {
$action = New-ScheduledTaskAction -Execute (Join-Path $RepoPath "start.bat")
$trigger = New-ScheduledTaskTrigger -AtStartup
$settings = New-ScheduledTaskSettingsSet `
-ExecutionTimeLimit (New-TimeSpan -Days 365) `
-RestartCount 3 `
-RestartInterval (New-TimeSpan -Minutes 1)
$principal = New-ScheduledTaskPrincipal `
-UserId "SYSTEM" `
-LogonType ServiceAccount `
-RunLevel Highest
Register-ScheduledTask `
-TaskName $ServiceName `
-Action $action `
-Trigger $trigger `
-Settings $settings `
-Principal $principal `
-Description "GoMag Vending Import Manager" `
-Force | Out-Null
Start-ScheduledTask -TaskName $ServiceName
Write-OK "Task Scheduler '$ServiceName' creat si pornit"
} catch {
Write-Warn "Task Scheduler esuat: $($_.Exception.Message)"
Write-Info "Porneste manual: .\start.bat"
}
} else {
# Restart task
Stop-ScheduledTask -TaskName $ServiceName -ErrorAction SilentlyContinue
Start-ScheduledTask -TaskName $ServiceName
Write-OK "Task '$ServiceName' restartat"
}
}
# ─────────────────────────────────────────────────────────────────────────────
# Sumar final
# ─────────────────────────────────────────────────────────────────────────────
Write-Host ""
Write-Host "══════════════════════════════════════════════════════" -ForegroundColor Cyan
Write-Host " GoMag Vending Deploy — Sumar" -ForegroundColor Cyan
Write-Host "══════════════════════════════════════════════════════" -ForegroundColor Cyan
Write-Host ""
Write-Host " Repo: $RepoPath" -ForegroundColor White
Write-Host " FastAPI: http://localhost:$Port/gomag" -ForegroundColor White
Write-Host " start.bat generat" -ForegroundColor White
Write-Host ""
if (-not (Test-Path $EnvFile)) {
Write-Host " [!] api\.env lipseste — configureaza inainte de start!" -ForegroundColor Red
} else {
Write-Host " api\.env: OK" -ForegroundColor Green
# Verifica daca mai are valori placeholder
$envContent = Get-Content $EnvFile -Raw
if ($envContent -match "your_api_key_here|USER_ORACLE|parola_oracle|TNS_ALIAS") {
Write-Host " [!] api\.env contine valori placeholder — editeaza!" -ForegroundColor Yellow
}
}
Write-Host ""
Write-Host " Acces app: http://SERVER/gomag" -ForegroundColor Cyan
Write-Host " Test local: http://localhost:$Port/gomag/health" -ForegroundColor Cyan
Write-Host ""

View File

@@ -1,37 +0,0 @@
# UNIFIED Docker Compose - AUTO-DETECT Oracle Mode
#
# Configurare prin .env:
# - Oracle 10g/11g: setează INSTANTCLIENTPATH=/opt/oracle/instantclient_23_9
# - Oracle 12.1+: setează FORCE_THIN_MODE=true (sau elimină INSTANTCLIENTPATH)
#
# Build modes:
# - docker-compose up --build → thick mode (default)
# - docker-compose up --build --build-arg ORACLE_MODE=thin → thin mode
services:
gomag_admin:
build:
context: ./api
dockerfile: Dockerfile
args:
# thick = Oracle 10g/11g/12.1+ (cu Instant Client)
# thin = Oracle 12.1+ only (fără Instant Client)
ORACLE_MODE: ${ORACLE_MODE:-thick}
container_name: gomag-admin
ports:
- "5003:5000"
volumes:
- ./api:/app
- ./logs:/app/logs
env_file:
- ./api/.env
restart: unless-stopped
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:5000/health"]
interval: 30s
timeout: 10s
retries: 3
networks:
default:
driver: bridge

View File

@@ -1,318 +0,0 @@
Procedure completeaza_parteneri_roa
* Completez id_part
Local lcBanca, lcCod_fiscal, lcCont_Banca, lcCorespDel, lcDenumire, lcIdString, lcId_categ_ent
Local lcId_loc_inreg, lcId_util, lcMesaj, lcMotiv_inactiv, lcNume, lcPrefix, lcPrenume, lcReg_comert
Local lcSql, lcSqlInsert, lcSufix, lcTip_persoana, lcinactiv, lnSucces
Local lcAdresa, lcAdreseParteneri, lcApart, lcBloc, lcCaleImport, lcCod, lcCodpostal, lcDA_apare
Local lcDenumire_adresa, lcEmail, lcEtaj, lcFax, lcFile, lcIdPart, lcId_Judet, lcId_loc, lcId_tara
Local lcItem1, lcItem2, lcItem3, lcItem4, lcJudet, lcJudetBucuresti, lcLocalitate, lcNumar
Local lcPrincipala, lcScara, lcSqlJudete, lcSqlLocalitati, lcSqlPart, lcStrada, lcTelefon1
Local lcTelefon2, lcWeb, lnIdJudet, lnIdJudetBucuresti, lnIdLocalitateBucuresti, lnIdTaraRO, lnPos
Local lnRecc
*:Global pcDenumire, pnIdAdresa, pnNrAdrese
*:Global pcCodFiscal, pnIdPart
Thisform.Trace('Completare Parteneri ROA')
If !Used('npart')
lnSucces = CT_INSUCCES
Return m.lnSucces
Endif
Select Distinct Cast(Null As I) As id_part, cod, denumire, cod_fiscal, reg_com, adresa, judet As indicativ_judet, tara As cod_tara, banca, cont_banca ;
From npart ;
Into Cursor cClientiFurnizori Readwrite
lnSucces = This.Connectroa()
If m.lnSucces < 0
Thisform.Trace('Completare Parteneri ROA. Eroare conectare la baza de date!')
Return m.lnSucces
Endif
Create Cursor cParteneri (id_part N(10), cod_fiscal C(30) Null, denumire C(100) Null)
lcSqlPart = [select id_part, cod_fiscal, denumire from nom_parteneri where sters = 0 and inactiv = 0]
lnSucces = goExecutor.oExecute(GetHash("cSql=>" + m.lcSqlPart + '??cCursor=>cParteneriTemp'))
If m.lnSucces < 0
Thisform.Trace('Eroare la selectia din clienti ROA ' + goExecutor.oPrelucrareEroare())
Return m.lnSucces
Endif
Select cParteneri
Append From Dbf('cParteneriTemp')
Index On denumire Tag denumire
Index On Padr(Strtran(cod_fiscal, ' ', ''),30, ' ') Tag cod_fiscal
Use In (Select('cParteneriTemp'))
Create Cursor cAdrese (id_adresa I, id_part I, localitate C(100) Null, id_loc I Null, judet C(20) Null, id_judet I Null, tara C(50) Null, id_tara I Null)
lcAdreseParteneri = [select id_adresa, id_part, localitate, id_loc, judet, id_judet, tara, id_tara from vadrese_parteneri]
lnSucces = goExecutor.oExecute(GetHash("cSql=>" + m.lcAdreseParteneri + '??cCursor=>cAdreseTemp'))
If m.lnSucces < 0
Thisform.Trace('Eroare la selectia din adrese parteneri ROA ' + goExecutor.oPrelucrareEroare())
Return m.lnSucces
Endif
Select cAdrese
Append From Dbf('cAdreseTemp')
Index On Padl(id_part,10, '0') + Padr(localitate, 100, ' ') Tag adresa
Use In (Select('cAdreseTemp'))
Create Cursor cJudete (id_judet I, id_tara I Null, judet C(20) Null)
lcSqlJudete = [select j.id_judet, j.id_tara, j.judet from syn_nom_judete j]
lnSucces = goExecutor.oExecute(GetHash("cSql=>" + m.lcSqlJudete + '??cCursor=>cJudeteTemp'))
If m.lnSucces < 0
Thisform.Trace('Eroare la selectia din judete ROA ' + goExecutor.oPrelucrareEroare())
Return m.lnSucces
Endif
Select cJudete
Append From Dbf('cJudeteTemp')
Index On id_judet Tag id_judet
Index On judet Tag judet
Use In (Select('cJudeteTemp'))
Create Cursor cLocalitati (id_loc I, id_judet I Null, id_tara I Null, localitate C(100) Null)
lcSqlLocalitati = [select l.id_loc, l.id_judet, j.id_tara, l.localitate from syn_nom_localitati l left join syn_nom_judete j on l.id_judet = j.id_judet where l.inactiv = 0 and l.sters = 0]
lnSucces = goExecutor.oExecute(GetHash("cSql=>" + m.lcSqlLocalitati + '??cCursor=>cLocalitatiTemp'))
If m.lnSucces < 0
Thisform.Trace('Eroare la selectia din localitati ROA ' + goExecutor.oPrelucrareEroare())
Return m.lnSucces
Endif
Select cLocalitati
Append From Dbf('cLocalitatiTemp')
Use In (Select('cLocalitatiTemp'))
Select cClientiFurnizori
lnRecc = Reccount()
Scan
pnIdPart = 0
pcCodFiscal = Padr(Strtran(cod_fiscal, ' ', ''),30, ' ')
pcDenumire = Padr(Alltrim(Upper(denumire)), 100, ' ')
lcAdresa = Strtran(Alltrim(Upper(Nvl(adresa, ''))), Chr(13), ' ')
If Len(Alltrim(m.pcCodFiscal)) <= 3
pcCodFiscal = Padl(Alltrim(cod), 10, '0')
Endif
lcCod = cod
If Mod(Recno(), 250) = 0
Thisform.Trace ('Import clienti... ' + Transform(Recno()) + '/' + Transform(m.lnRecc))
Endif
* Verific daca partenerul a mai fost importat
If Seek(m.lcCod, 'coresp_parteneri', 'cod')
pnIdPart = coresp_parteneri.id_part
Select cClientiFurnizori
Replace id_part With m.pnIdPart
Loop
Endif
Select cParteneri
Do Case
Case !Empty(m.pcCodFiscal)
If Seek(m.pcCodFiscal, 'cParteneri', 'cod_fiscal')
pnIdPart = cParteneri.id_part
Endif
Otherwise
If Seek(m.pcDenumire, 'cParteneri', 'denumire')
pnIdPart = cParteneri.id_part
Endif
Endcase
If !Empty(Nvl(m.pnIdPart, 0))
Replace id_part With m.pnIdPart In cClientiFurnizori
*!* lcMesaj = 'Client existent ' + Alltrim(cParteneri.denumire) + ' CUI: ' + Alltrim(cParteneri.cod_fiscal) + ' ID: ' + Alltrim(Transform(cParteneri.id_part))
*!* Thisform.trace(m.lcMesaj)
Else
* Adaugare clienti
Select cClientiFurnizori
lcDenumire = Nvl(Strtran(Alltrim(Upper(denumire)), ['], ['']), "")
lcNume = Nvl(Strtran(Alltrim(Upper(denumire)), ['], ['']), "")
lcPrenume = ''
lcCod_fiscal = Upper(Alltrim(cod_fiscal))
If Len(Alltrim(m.lcCod_fiscal)) <= 3
lcCod_fiscal = Padl(Alltrim(cod), 10, '0')
Endif
lcReg_comert = Nvl(Alltrim(Upper(reg_com)), "")
lcTip_persoana = "1" && 1=juridica, 2=fizica
If !Empty(m.lcCod_fiscal) And Len(m.lcCod_fiscal) = 13
lcTip_persoana = "2" && fizica
lnPos = At(' ', m.lcNume)
lcPrenume = Alltrim(Substr(m.lcNume, m.lnPos))
lcNume = Alltrim(Left(m.lcNume, m.lnPos))
Endif
lcId_loc_inreg = 'NULL'
lcId_categ_ent = 'NULL'
lcPrefix = ""
lcSufix = ""
lcBanca = Upper(Alltrim(Nvl(banca,'')))
lcCont_Banca = Upper(Alltrim(Nvl(cont_banca,'')))
lcinactiv = "0"
lcMotiv_inactiv = ""
lcIdString = "16;17"
lcCorespDel = ""
lcId_util = "-3"
lcSqlInsert = [begin pack_def.adauga_partener('] + lcDenumire + [','] + lcNume + [','] + lcPrenume + [','] + lcCod_fiscal + [','] + ;
lcReg_comert + [',] + lcId_loc_inreg + [,] + lcId_categ_ent + [,'] + lcPrefix + [','] + lcSufix + [',] + ;
lcTip_persoana + [,'] + lcBanca + [','] + lcCont_Banca + [',] + lcinactiv + [,'] + lcMotiv_inactiv + [',] + ;
lcId_util + [,'] + lcIdString + [','] + lcCorespDel + [',?@pnIdPart); end;]
lnSucces = goExecutor.oExecute(GetHash("cSql=>" + m.lcSqlInsert))
If !Empty(Nvl(m.pnIdPart, 0))
Replace id_part With m.pnIdPart In cClientiFurnizori
Thisform.Trace('Client nou ' + Alltrim(cClientiFurnizori.denumire) + ' CUI: ' + Alltrim(cClientiFurnizori.cod_fiscal) + ' ID: ' + Alltrim(Transform(cClientiFurnizori.id_part)))
Insert Into cParteneri (id_part, denumire, cod_fiscal) Values (m.pnIdPart, cClientiFurnizori.denumire, cClientiFurnizori.cod_fiscal)
Else
lcMesaj = 'Eroare la adaugarea in clienti ROA ' + Alltrim(cParteneri.denumire) + ' CUI: ' + Alltrim(cParteneri.cod_fiscal) + Chr(13) + Chr(10) + goExecutor.oPrelucrareEroare()
Thisform.Trace(m.lcMesaj)
aMessagebox(m.lcMesaj)
Set Step On
Exit
Endif && !Empty(Nvl(m.pnIdPart,0))
Endif && !Empty(Nvl(m.pnIdPart,0))
***********************************
* Adresa partener
***********************************
If !Empty(m.lcAdresa)
* JUD:Mun. Bucuresti;BUCURESTI;Str.SOS BUCURESTI-URZICENI;159A
Calculate Cnt(id_adresa) For id_part = m.pnIdPart To pnNrAdrese In cAdrese
lcIdPart = Alltrim(Str(m.pnIdPart))
lcDenumire_adresa = ""
lcDA_apare = "0"
lcStrada = ""
lcNumar = ""
lcBloc = ""
lcScara = ""
lcApart = ""
lcEtaj = ""
lcId_loc = "NULL"
lcLocalitate = ""
lcId_Judet = "NULL"
lcJudet = ""
lcCodpostal = "NULL"
lcId_tara = "NULL"
lcTelefon1 = ""
lcTelefon2 = ""
lcFax = ""
lcEmail = ""
lcWeb = ""
lcPrincipala = Iif(m.pnNrAdrese = 0, "1", "0")
lcinactiv = "0"
lcId_util = "-3"
lcItem1 = Alltrim(Getwordnum(m.lcAdresa, 1, ';'))
lcItem2 = Alltrim(Getwordnum(m.lcAdresa, 2, ';'))
lcItem3 = Alltrim(Getwordnum(m.lcAdresa, 3, ';'))
lcItem4 = Alltrim(Getwordnum(m.lcAdresa, 4, ';'))
If Left(m.lcItem1, 4) = 'JUD:'
lcJudet = Alltrim(Substr(m.lcItem1, 5))
Endif
If 'BUCURESTI'$m.lcJudet
lcJudet = 'BUCURESTI'
Endif
If !Empty(m.lcItem2)
lcLocalitate = Alltrim(m.lcItem2)
Else
If !Empty(m.lcItem1) And Left(m.lcItem1, 4) <> 'JUD:'
lcLocalitate = m.lcItem2
Endif
Endif
If Lower(Left(m.lcItem3,4)) = 'str.'
lcStrada = Alltrim(Substr(m.lcItem3, 5))
Else
lcStrada = Alltrim(m.lcItem3)
Endif
If !Empty(m.lcItem4)
lcNumar = Alltrim(Left(m.lcItem4, 10))
Endif
lnIdJudetBucuresti = 10
lcJudetBucuresti = "BUCURESTI"
lnIdLocalitateBucuresti = 1759
lnIdTaraRO = 1
If m.lcLocalitate = 'BUCURESTI'
m.lcLocalitate = 'BUCURESTI SECTORUL 1'
Endif
If Empty(m.lcLocalitate)
lcLocalitate = 'BUCURESTI SECTORUL 1'
Endif
If Empty(m.lcJudet)
lcJudet = m.lcJudetBucuresti
Endif
* caut adresa dupa localitate. daca nu o gasesc, o adaug
Select cAdrese
If !Seek(Padl(m.pnIdPart,10, '0') + Padr(m.lcLocalitate, 100, ' '), 'cAdrese', 'adresa')
lnIdJudet = m.lnIdJudetBucuresti
Select cJudete
If Seek(m.lcJudet, 'cJudete', 'judet')
lnIdJudet = cJudete.id_judet
Endif
Select * From cLocalitati Where id_judet = m.lnIdJudet And localitate = m.lcLocalitate Order By localitate Into Cursor cLocalitateTemp
If Reccount('cLocalitateTemp') > 0
Select cLocalitateTemp
Go Top
lcId_loc = Alltrim(Str(id_loc))
lcId_Judet = Alltrim(Str(id_judet))
lcId_tara = Alltrim(Str(id_tara))
Use In (Select('cLocalitateTemp'))
Else
Use In (Select('cLocalitateTemp'))
Select * From cLocalitati Where id_judet = m.lnIdJudet Order By localitate Into Cursor cLocalitateTemp
Select cLocalitateTemp
Go Top
lcId_loc = Alltrim(Str(id_loc))
lcId_Judet = Alltrim(Str(id_judet))
lcId_tara = Alltrim(Str(id_tara))
Use In (Select('cLocalitateTemp'))
Endif
If Empty(Nvl(m.lcId_loc, ''))
lcId_loc = Alltrim(Str(m.lnIdLocalitateBucuresti))
lcId_Judet = Alltrim(Str(m.lnIdJudetBucuresti))
lcId_tara = Alltrim(Str(m.lnIdTaraRO))
Endif && lnSucces
If m.lcId_loc <> 'NULL'
pnIdAdresa = 0
If Empty(Nvl(m.pnIdAdresa,0))
lcSql = [begin pack_def.adauga_adresa_partener2(] + lcIdPart + [,'] + lcDenumire_adresa + [',] + lcDA_apare + [,] + ;
['] + lcStrada + [','] + lcNumar + [','] + ;
lcBloc + [','] + lcScara + [','] + lcApart + [','] + lcEtaj + [',] + lcId_loc + [,'] + lcLocalitate + [',] + lcId_Judet + [,] + lcCodpostal + [,] + lcId_tara + [,'] + ;
lcTelefon1 + [','] + lcTelefon2 + [','] + lcFax + [','] + lcEmail + [','] + lcWeb + [',] + ;
lcPrincipala + [,] + lcinactiv + [,] + lcId_util + [,?@pnIdAdresa); end;]
lnSucces = goExecutor.oExecute(GetHash("cSql=>" + m.lcSql))
If m.lnSucces < 0
lcMesaj = goExecutor.cEroare
Thisform.Trace(m.lcMesaj)
* AMessagebox(m.lcMesaj, 0 + 48, _Screen.Caption )
* Exit
Endif
Endif && empty(m.pnIdAdresa)
Endif && m.lcId_loc <> 'NULL'
Endif && !found()
Endif && !empty(m.lcAdresa)
Insert Into coresp_parteneri (cod, id_part, cod_fiscal, denumire) Values (m.lcCod, m.pnIdPart, m.pcCodFiscal, m.pcDenumire)
Endscan && cClientiFurnizori
This.DisconnectRoa()
lcCaleImport = Addbs(Alltrim(goApp.oSettings.cale_import))
lcFile = m.lcCaleImport + 'coresp_parteneri.csv'
Select coresp_parteneri
Copy To (m.lcFile) Type Csv
Return m.lnSucces

View File

@@ -1,208 +0,0 @@
{
"total": "399",
"page": "1",
"pages": 4,
"orders": {
"60644": {
"id": "60644",
"number": "436232189",
"date": "2025-08-27 16:32:43",
"invoice": {
"series": "",
"number": "0",
"date": "0000-00-00 00:00:00"
},
"total": "1026.24",
"status": "Comanda noua",
"statusId": "1",
"source": "internal",
"sales_channel": "Website",
"sales_channel_marketplace": "",
"sales_agent": "",
"currency": "RON",
"observation": "",
"payment": {
"name": "Numerar/Ramburs sau Card la easybox",
"online": "0",
"completed": "0"
},
"delivery": {
"name": "Transport gratuit",
"total": 0,
"date": "0000-00-00 00:00:00",
"lockerId": 0
},
"shipping": {
"company": "",
"firstname": "Liviu",
"lastname": "Stefan",
"phone": "0751013764",
"email": "liviustefan2001@gmail.com",
"address": "aleea Soarelui nr 2, casa nr 4, cartier Brates Lake",
"city": "Galați",
"region": "Galati",
"country": "Romania",
"zipcode": null
},
"items": [
{
"id": "582",
"type": "product",
"sku": "8000070028685",
"ean": "8000070028685",
"name": "Lavazza Gusto Forte Vending Cafea Boabe 1kg",
"price": "69.79",
"baseprice": "78",
"vat": "11",
"quantity": "10.00"
},
{
"id": "589",
"type": "product",
"sku": "5941623010333",
"ean": "5941623010333",
"name": "Doncafe Espresso Intense Cafea Boabe 1 kg",
"price": "56.49",
"baseprice": "62",
"vat": "11",
"quantity": "2.00"
},
{
"id": "512",
"type": "product",
"sku": "82",
"ean": "",
"name": "Zahar Plic Lavazza 200buc",
"price": "10.99",
"baseprice": "14",
"vat": "21",
"quantity": "5.00"
},
{
"id": "671",
"type": "product",
"sku": "312349",
"ean": "",
"name": "Palete manuale din lemn 110mm 1000buc",
"price": "7.99",
"baseprice": "10.5",
"vat": "21",
"quantity": "2.00"
},
{
"id": "554",
"type": "product",
"sku": "8004990127091",
"ean": "8004990127091",
"name": "Ristora Ciocolată Instant 1kg",
"price": "25.99",
"baseprice": "28",
"vat": "21",
"quantity": "2.00"
},
{
"id": "567",
"type": "product",
"sku": "8004990125530",
"ean": "8004990125530",
"name": "Prolait Topping Blue Lapte Granulat 500g",
"price": "18.49",
"baseprice": "23",
"vat": "21",
"quantity": "5.00"
}
],
"billing": {
"firstname": "Liviu",
"lastname": "Stefan",
"phone": "0751013764",
"email": "liviustefan2001@gmail.com",
"address": "aleea Soarelui nr 2, casa nr 4, cartier Brates Lake",
"city": "Galați",
"region": "Galati",
"country": "Romania",
"customerId": "5997"
},
"discounts": [
],
"updated": "2025-08-27 16:32:43"
},
"60643": {
"id": "60643",
"number": "436232175",
"date": "2025-08-27 16:19:29",
"invoice": {
"series": "",
"number": "0",
"date": "0000-00-00 00:00:00"
},
"total": "359.4",
"status": "Comanda noua",
"statusId": "1",
"source": "internal",
"sales_channel": "Website",
"sales_channel_marketplace": "",
"sales_agent": "",
"currency": "RON",
"observation": "",
"payment": {
"name": "Numerar/Ramburs sau Card la easybox",
"online": "0",
"completed": "0"
},
"delivery": {
"name": "Transport National",
"total": 30,
"date": "0000-00-00 00:00:00",
"lockerId": 0
},
"shipping": {
"company": "",
"firstname": "Alexandra",
"lastname": "TONE",
"phone": "0763571486",
"email": "aristocratgaminggr@gmail.com",
"address": "Str. Garii, Nr. 102",
"city": "Giurgiu",
"region": "Giurgiu",
"country": "Romania",
"zipcode": null
},
"items": [
{
"id": "279",
"type": "product",
"sku": "30006ozLavazza",
"ean": "",
"name": "Pahar carton 6oz Lavazza RLP bax 3000buc",
"price": "329.4",
"baseprice": "360",
"vat": "21",
"quantity": "1.00"
}
],
"billing": {
"company": {
"name": "ARISTOCRAT GAMING SRL",
"code": "32128076",
"registrationNo": "J27/487/2013",
"bank": "",
"iban": ""
},
"firstname": "Alexandra",
"lastname": "TONE",
"phone": "0763571486",
"email": "aristocratgaminggr@gmail.com",
"address": "Aleea Revolutiei, Spatiul Comercial Nr.32",
"city": "Roman",
"region": "Neamt",
"country": "Romania",
"customerId": "12283"
},
"discounts": [
],
"updated": "2025-08-27 16:19:29"
}
},
"shippingAsProduct": false
}

View File

@@ -1,137 +0,0 @@
{
"total": "Numar total de rezultate",
"page": "Pagina curenta",
"pages": "Numar total de pagini",
"products": {
"id": "ID intern al produsului",
"sku": "SKU",
"name": "Denumire",
"description": "Descriere",
"short_description": "Descriere scurta",
"url": "URL",
"canonical_url": "URL canonic",
"brand": "Marca produs",
"categories": {
"Branch 1": [
{
"id": "ID-ul categoriei",
"name": "Denumirea categoriei"
},
{
"id": "ID-ul categoriei",
"name": "Denumirea categoriei"
}
],
"Branch 2": [
{
"id": "ID-ul categoriei",
"name": "Denumirea categoriei"
},
{
"id": "ID-ul categoriei",
"name": "Denumirea categoriei"
}
],
"...": [
"..."
]
},
"weight": "Greutatea produsului (kg)",
"enabled": "Activ (0/1)",
"parent": "ID produs parinte",
"parent_sku": "SKU produs parinte",
"group_key": "Codul de grupare al variantelor de produs",
"stockManagement": "Gestioneaza automat stocul produsului",
"stock": "Stoc cantitativ",
"stockStatus": "Status stoc",
"base_price": "Pret de baza al produsului",
"price": "Pret de vanzare al produsului (dupa aplicarea adaosurilor si reducerilor)",
"vat_included": "Pret include TVA (0/1)",
"vat": "Valoare TVA",
"currency": "Moneda",
"ecotax": "Valoare taxa verde",
"um": "Unitate de masura",
"html_title": "Titlu html",
"html_description": "Descrierere html",
"customSearchKeys": "Cuvinte cheie folosite la cautarea SEO",
"feedDescription": "Descriere pentru feed-uri",
"allowOrdersWhenOutOfStock": "Produsul se aduce la comanda",
"attributes": [
{
"id": "ID atribut",
"type": "Tip atribut: dropdown, textinput, textarea",
"name": "Denumire atribut",
"value": "Optiune"
},
{
"id": "ID atribut",
"type": "Tip atribut multipleselect (accepta valori multiple)",
"name": "Denumire atribut",
"value": [
"Optiune1",
"Optiune2",
"..."
]
}
],
"images": [
"Imagine 1 (principala)",
"Imagine 2",
"..."
],
"variations": [
{
"id": "ID intern al produsului",
"sku": "SKU",
"base_price": "Pret de baza al produsului",
"price": "Pret de vanzare al produsului (dupa aplicarea adaosurilor si reducerilor)",
"stock": "Stoc cantitativ",
"stockStatus": "Status stoc",
"stockManagement": "Gestioneaza automat stocul produsului",
"versionAttributes": {
"id Attribut": {
"name": "Denumire atribut",
"value": "Valoare atribut"
}
}
},
{
"id": "ID intern al produsului",
"sku": "SKU",
"base_price": "Pret de baza al produsului",
"price": "Pret de vanzare al produsului (dupa aplicarea adaosurilor si reducerilor)",
"stock": "Stoc cantitativ",
"stockStatus": "Status stoc",
"stockManagement": "Gestioneaza automat stocul produsului",
"versionAttributes": {
"id Attribut": {
"id": "ID atribut",
"name": "Denumire atribut",
"value": "Valoare atribut"
}
}
}
],
"ean": "EAN",
"videos": [
"URL video"
],
"files": [
"URL fisiere"
],
"updated": "Ultima modificare",
"created": "Data crearii",
"delivery_time": "Timp de livrare",
"delivery_time_type": "Tip timp de livrare",
"bundleItems": [
{
"sku": "SKU componenta",
"quantity": "Cantitate"
},
{
"sku": "SKU componenta",
"quantity": "Cantitate"
}
]
}
}

Some files were not shown because too many files have changed in this diff Show More