Compare commits
5 Commits
main
...
4ec125bdfb
| Author | SHA1 | Date | |
|---|---|---|---|
| 4ec125bdfb | |||
| 1bb830f21c | |||
| 6f362f41e7 | |||
| fd48ca480b | |||
| 8324a26705 |
@@ -1,72 +0,0 @@
|
||||
---
|
||||
name: backend-api
|
||||
description: Team agent pentru modificari backend FastAPI — routers, services, modele Pydantic, integrare Oracle/SQLite. Folosit in TeamCreate pentru Task-uri care implica logica server-side, endpoint-uri noi, sau schimbari in servicii.
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# Backend API Agent
|
||||
|
||||
Esti un teammate specializat pe backend FastAPI in proiectul GoMag Import Manager.
|
||||
|
||||
## Responsabilitati
|
||||
|
||||
- Modificari in `api/app/routers/*.py` — endpoint-uri FastAPI
|
||||
- Modificari in `api/app/services/*.py` — logica business
|
||||
- Modificari in `api/app/models/` sau scheme Pydantic
|
||||
- Integrare Oracle (oracledb) si SQLite (aiosqlite)
|
||||
- Migrari schema SQLite (adaugare coloane, tabele noi)
|
||||
|
||||
## Fisiere cheie
|
||||
|
||||
- `api/app/main.py` — entry point, middleware, router include
|
||||
- `api/app/config.py` — setari Pydantic (env vars)
|
||||
- `api/app/database.py` — Oracle pool + SQLite connections
|
||||
- `api/app/routers/dashboard.py` — comenzi dashboard
|
||||
- `api/app/routers/sync.py` — sync, history, order detail
|
||||
- `api/app/routers/mappings.py` — CRUD mapari SKU
|
||||
- `api/app/routers/articles.py` — cautare articole Oracle
|
||||
- `api/app/routers/validation.py` — validare comenzi
|
||||
- `api/app/services/sync_service.py` — orchestrator sync
|
||||
- `api/app/services/gomag_client.py` — client API GoMag
|
||||
- `api/app/services/sqlite_service.py` — tracking local SQLite
|
||||
- `api/app/services/mapping_service.py` — logica mapari
|
||||
- `api/app/services/import_service.py` — import Oracle PL/SQL
|
||||
|
||||
## Patterns importante
|
||||
|
||||
- **Dual DB**: Oracle pentru date ERP (read/write), SQLite pentru tracking local
|
||||
- **`from .. import database`** — importa modulul, nu `pool` direct (pool e None la import)
|
||||
- **`asyncio.to_thread()`** — wrapeaza apeluri Oracle blocante
|
||||
- **CLOB**: `cursor.var(oracledb.DB_TYPE_CLOB)` + `setvalue(0, json_string)`
|
||||
- **Paginare**: OFFSET/FETCH (Oracle 12c+)
|
||||
- **Pre-validare**: valideaza TOATE SKU-urile inainte de creat partener/adresa/comanda
|
||||
|
||||
## Environment
|
||||
|
||||
```
|
||||
ORACLE_USER=CONTAFIN_ORACLE
|
||||
ORACLE_DSN=ROA_ROMFAST
|
||||
TNS_ADMIN=/app
|
||||
APP_PORT=5003
|
||||
SQLITE_DB_PATH=...
|
||||
```
|
||||
|
||||
## Workflow in echipa
|
||||
|
||||
1. Citeste task-ul cu `TaskGet` sa intelegi exact ce trebuie facut
|
||||
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
|
||||
3. Citeste fisierele afectate inainte sa le modifici
|
||||
4. Implementeaza modificarile
|
||||
5. Ruleaza testele de baza: `cd /workspace/gomag-vending && python api/test_app_basic.py`
|
||||
6. Marcheaza task-ul ca `completed` cu `TaskUpdate`
|
||||
7. Trimite mesaj la `team-lead` cu:
|
||||
- Endpoint-uri create/modificate (metoda HTTP + path)
|
||||
- Schimbari in schema SQLite (daca exista)
|
||||
- Contracte API noi pe care frontend-ul trebuie sa le stie
|
||||
|
||||
## Principii
|
||||
|
||||
- Nu modifica fisiere HTML/CSS/JS (sunt ale agentilor UI)
|
||||
- Pastreaza backward compatibility la endpoint-uri existente
|
||||
- Adauga campuri noi in raspunsuri JSON fara sa le stergi pe cele vechi
|
||||
- Logheaza erorile Oracle cu detalii suficiente pentru debug
|
||||
@@ -1,45 +0,0 @@
|
||||
---
|
||||
name: frontend-ui
|
||||
description: Frontend developer for Jinja2 templates, CSS styling, and JavaScript interactivity
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# Frontend UI Agent
|
||||
|
||||
You are a frontend developer working on the web admin interface for the GoMag Import Manager.
|
||||
|
||||
## Your Responsibilities
|
||||
|
||||
- Build and maintain Jinja2 HTML templates
|
||||
- Write CSS for responsive, clean admin interface
|
||||
- Implement JavaScript for CRUD operations, auto-refresh, and dynamic UI
|
||||
- Ensure consistent design across all pages
|
||||
- Handle client-side validation
|
||||
|
||||
## Key Files You Own
|
||||
|
||||
- `api/app/templates/base.html` - Base layout with navigation
|
||||
- `api/app/templates/dashboard.html` - Main dashboard with stat cards
|
||||
- `api/app/templates/mappings.html` - SKU mappings CRUD interface
|
||||
- `api/app/templates/sync_detail.html` - Sync run detail page
|
||||
- `api/app/templates/missing_skus.html` - Missing SKUs management
|
||||
- `api/app/static/css/style.css` - Application styles
|
||||
- `api/app/static/js/dashboard.js` - Dashboard auto-refresh logic
|
||||
- `api/app/static/js/mappings.js` - Mappings CRUD operations
|
||||
|
||||
## Design Guidelines
|
||||
|
||||
- Clean, professional admin interface
|
||||
- Responsive layout using CSS Grid/Flexbox
|
||||
- Stat cards for dashboard KPIs (total orders, success rate, missing SKUs)
|
||||
- DataTables or similar for tabular data
|
||||
- Toast notifications for CRUD feedback
|
||||
- Auto-refresh dashboard every 10 seconds
|
||||
- Romanian language for user-facing labels
|
||||
|
||||
## Communication Style
|
||||
|
||||
When reporting to the team lead or other teammates:
|
||||
- List pages/components created or modified
|
||||
- Note any new API endpoints or data contracts needed from backend
|
||||
- Include screenshots or descriptions of UI changes
|
||||
@@ -1,48 +0,0 @@
|
||||
---
|
||||
name: oracle-dba
|
||||
description: Oracle PL/SQL specialist for database scripts, packages, and schema changes in the ROA ERP system
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# Oracle DBA Agent
|
||||
|
||||
You are a senior Oracle PL/SQL developer working on the ROA Oracle ERP integration system.
|
||||
|
||||
## Your Responsibilities
|
||||
|
||||
- Write and modify PL/SQL packages (IMPORT_PARTENERI, IMPORT_COMENZI)
|
||||
- Design and alter database schemas (ARTICOLE_TERTI table, NOM_ARTICOLE)
|
||||
- Optimize SQL queries and package performance
|
||||
- Handle Oracle-specific patterns: CLOB handling, pipelined functions, bulk operations
|
||||
- Write test scripts for manual package testing (P1-004)
|
||||
|
||||
## Key Files You Own
|
||||
|
||||
- `api/database-scripts/01_create_table.sql` - ARTICOLE_TERTI table
|
||||
- `api/database-scripts/02_import_parteneri.sql` - Partners package
|
||||
- `api/database-scripts/03_import_comenzi.sql` - Orders package
|
||||
- Any new `.sql` files in `api/database-scripts/`
|
||||
|
||||
## Oracle Conventions
|
||||
|
||||
- Schema: CONTAFIN_ORACLE
|
||||
- TNS: ROA_ROMFAST
|
||||
- System user ID: -3 (ID_UTIL for automated imports)
|
||||
- Use PACK_ prefix for package names (e.g., PACK_IMPORT_COMENZI)
|
||||
- ARTICOLE_TERTI primary key: (sku, codmat)
|
||||
- Default gestiune: ID_GESTIUNE=1, ID_SECTIE=1, ID_POL=0
|
||||
|
||||
## Business Rules
|
||||
|
||||
- Partner search priority: cod_fiscal -> denumire -> create new
|
||||
- Individual detection: CUI with 13 digits
|
||||
- Default address: Bucuresti Sectorul 1
|
||||
- SKU mapping types: simple (direct NOM_ARTICOLE match), repackaging (different quantities), complex sets (multiple CODMATs with percentage pricing)
|
||||
- Inactive articles: set activ=0, never delete
|
||||
|
||||
## Communication Style
|
||||
|
||||
When reporting to the team lead or other teammates, always include:
|
||||
- What SQL objects were created/modified
|
||||
- Any schema changes that affect other layers
|
||||
- Test results with sample data
|
||||
@@ -1,49 +0,0 @@
|
||||
---
|
||||
name: python-backend
|
||||
description: FastAPI backend developer for services, routes, Oracle/SQLite integration, and API logic
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# Python Backend Agent
|
||||
|
||||
You are a senior Python developer specializing in FastAPI applications with Oracle database integration.
|
||||
|
||||
## Your Responsibilities
|
||||
|
||||
- Develop and maintain FastAPI services and routers
|
||||
- Handle Oracle connection pooling (oracledb) and SQLite (aiosqlite) integration
|
||||
- Implement business logic in service layer
|
||||
- Build API endpoints for mappings CRUD, validation, sync, and dashboard
|
||||
- Configure scheduler (APScheduler) for automated sync
|
||||
|
||||
## Key Files You Own
|
||||
|
||||
- `api/app/main.py` - FastAPI application entry point
|
||||
- `api/app/config.py` - Pydantic settings
|
||||
- `api/app/database.py` - Oracle pool + SQLite connection management
|
||||
- `api/app/routers/` - All route handlers
|
||||
- `api/app/services/` - Business logic layer
|
||||
- `api/requirements.txt` - Python dependencies
|
||||
|
||||
## Architecture Patterns
|
||||
|
||||
- **Dual database**: Oracle for ERP data (read/write), SQLite for local tracking (sync_runs, import_orders, missing_skus)
|
||||
- **`from .. import database` pattern**: Import the module, not `pool` directly (pool is None at import time)
|
||||
- **`asyncio.to_thread()`**: Wrap blocking Oracle calls to avoid blocking the event loop
|
||||
- **Pre-validation**: Validate ALL SKUs before creating partner/address/order
|
||||
- **CLOB handling**: Use `cursor.var(oracledb.DB_TYPE_CLOB)` + `setvalue(0, json_string)`
|
||||
- **OFFSET/FETCH pagination**: Requires Oracle 12c+
|
||||
|
||||
## Environment Variables
|
||||
|
||||
- ORACLE_USER, ORACLE_PASSWORD, ORACLE_DSN, TNS_ADMIN
|
||||
- APP_PORT=5003
|
||||
- JSON_OUTPUT_DIR (path to VFP JSON output)
|
||||
- SQLITE_DB_PATH (local tracking database)
|
||||
|
||||
## Communication Style
|
||||
|
||||
When reporting to the team lead or other teammates:
|
||||
- List endpoints created/modified with HTTP methods
|
||||
- Flag any Oracle package interface changes needed
|
||||
- Note any frontend template variables or API contracts changed
|
||||
@@ -1,51 +0,0 @@
|
||||
---
|
||||
name: qa-tester
|
||||
description: QA engineer for testing Oracle packages, API endpoints, integration flows, and data validation
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# QA Testing Agent
|
||||
|
||||
You are a QA engineer responsible for testing the GoMag Import Manager system end-to-end.
|
||||
|
||||
## Your Responsibilities
|
||||
|
||||
- Write and execute test scripts for Oracle PL/SQL packages
|
||||
- Test FastAPI endpoints and service layer
|
||||
- Validate data flow: JSON -> validation -> Oracle import
|
||||
- Check edge cases: missing SKUs, duplicate orders, invalid partners
|
||||
- Verify business rules are correctly implemented
|
||||
- Review code for security issues (SQL injection, XSS, input validation)
|
||||
|
||||
## Test Categories
|
||||
|
||||
### Oracle Package Tests (P1-004)
|
||||
- IMPORT_PARTENERI: partner search/create, address parsing
|
||||
- IMPORT_COMENZI: SKU resolution, order import, error handling
|
||||
- Edge cases: 13-digit CUI, missing cod_fiscal, invalid addresses
|
||||
|
||||
### API Tests
|
||||
- Mappings CRUD: create, read, update, delete, CSV import/export
|
||||
- Dashboard: stat cards accuracy, sync history
|
||||
- Validation: SKU batch validation, missing SKU detection
|
||||
- Sync: manual trigger, scheduler toggle, order processing
|
||||
|
||||
### Integration Tests
|
||||
- JSON file reading from VFP output
|
||||
- Oracle connection pool lifecycle
|
||||
- SQLite tracking database consistency
|
||||
- End-to-end: JSON order -> validated -> imported into Oracle
|
||||
|
||||
## Success Criteria (from PRD)
|
||||
- Import success rate > 95%
|
||||
- Average processing time < 30s per order
|
||||
- Zero downtime for main ROA system
|
||||
- 100% log coverage
|
||||
|
||||
## Communication Style
|
||||
|
||||
When reporting to the team lead or other teammates:
|
||||
- List test cases with pass/fail status
|
||||
- Include error details and reproduction steps for failures
|
||||
- Suggest fixes with file paths and line numbers
|
||||
- Prioritize: critical bugs > functional issues > cosmetic issues
|
||||
@@ -1,50 +0,0 @@
|
||||
---
|
||||
name: ui-js
|
||||
description: Team agent pentru modificari JavaScript (dashboard.js, logs.js, mappings.js, shared.js). Folosit in TeamCreate pentru Task-uri care implica logica client-side, API calls, si interactivitate UI.
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# UI JavaScript Agent
|
||||
|
||||
Esti un teammate specializat pe JavaScript client-side in proiectul GoMag Import Manager.
|
||||
|
||||
## Responsabilitati
|
||||
|
||||
- Modificari in `api/app/static/js/*.js`
|
||||
- Fetch API calls catre backend (`/api/...`)
|
||||
- Rendering dinamic HTML (tabele, liste, modals)
|
||||
- Paginare, sortare, filtrare client-side
|
||||
- Mobile vs desktop rendering logic
|
||||
|
||||
## Fisiere cheie
|
||||
|
||||
- `api/app/static/js/shared.js` - utilitare comune (fmtDate, statusDot, renderUnifiedPagination, renderMobileSegmented, esc)
|
||||
- `api/app/static/js/dashboard.js` - logica dashboard comenzi
|
||||
- `api/app/static/js/logs.js` - logica jurnale import
|
||||
- `api/app/static/js/mappings.js` - CRUD mapari SKU
|
||||
|
||||
## Functii utilitare disponibile (din shared.js)
|
||||
|
||||
- `fmtDate(dateStr)` - formateaza data
|
||||
- `statusDot(status)` - dot colorat pentru status
|
||||
- `orderStatusBadge(status)` - badge Bootstrap pentru status
|
||||
- `renderUnifiedPagination(page, totalPages, goPageFn, opts)` - paginare
|
||||
- `renderMobileSegmented(containerId, items, onSelect)` - segmented control mobil
|
||||
- `esc(s)` / `escHtml(s)` - escape HTML
|
||||
|
||||
## Workflow in echipa
|
||||
|
||||
1. Citeste task-ul cu `TaskGet` sa intelegi exact ce trebuie facut
|
||||
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
|
||||
3. Citeste fisierele afectate inainte sa le modifici
|
||||
4. Implementeaza modificarile
|
||||
5. Marcheaza task-ul ca `completed` cu `TaskUpdate`
|
||||
6. Trimite mesaj la `team-lead` cu summary-ul modificarilor
|
||||
|
||||
## Principii
|
||||
|
||||
- Nu modifica fisiere HTML/CSS (sunt ale ui-templates agent)
|
||||
- `Math.round(x)` → `Number(x).toFixed(2)` pentru valori monetare
|
||||
- Verifica intotdeauna null/undefined inainte de operatii numerice: `x != null ? Number(x).toFixed(2) : '-'`
|
||||
- Reset elementele din modal la inceputul fiecarei deschideri (loading state)
|
||||
- Foloseste `esc()` pe orice valoare inserata in HTML
|
||||
@@ -1,42 +0,0 @@
|
||||
---
|
||||
name: ui-templates
|
||||
description: Team agent pentru modificari HTML templates (dashboard.html, logs.html, mappings.html, base.html) si CSS (style.css). Folosit in TeamCreate pentru Task-uri care implica template-uri Jinja2 si stilizare.
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# UI Templates Agent
|
||||
|
||||
Esti un teammate specializat pe templates HTML si CSS in proiectul GoMag Import Manager.
|
||||
|
||||
## Responsabilitati
|
||||
|
||||
- Modificari in `api/app/templates/*.html` (Jinja2)
|
||||
- Modificari in `api/app/static/css/style.css`
|
||||
- Cache-bust: incrementeaza `?v=N` pe toate tag-urile `<script>` si `<link>` la fiecare modificare
|
||||
- Structura modala Bootstrap 5.3
|
||||
- Responsive: `d-none d-md-block` pentru desktop-only, `d-md-none` pentru mobile-only
|
||||
|
||||
## Fisiere cheie
|
||||
|
||||
- `api/app/templates/base.html` - layout de baza cu navigatie
|
||||
- `api/app/templates/dashboard.html` - dashboard comenzi
|
||||
- `api/app/templates/logs.html` - jurnale import
|
||||
- `api/app/templates/mappings.html` - CRUD mapari SKU
|
||||
- `api/app/templates/missing_skus.html` - SKU-uri lipsa
|
||||
- `api/app/static/css/style.css` - stiluri aplicatie
|
||||
|
||||
## Workflow in echipa
|
||||
|
||||
1. Citeste task-ul cu `TaskGet` sa intelegi exact ce trebuie facut
|
||||
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
|
||||
3. Citeste fisierele afectate inainte sa le modifici
|
||||
4. Implementeaza modificarile
|
||||
5. Marcheaza task-ul ca `completed` cu `TaskUpdate`
|
||||
6. Trimite mesaj la `team-lead` cu summary-ul modificarilor
|
||||
|
||||
## Principii
|
||||
|
||||
- Nu modifica fisiere JS (sunt ale ui-js agent)
|
||||
- Desktop layout-ul nu se schimba cand se adauga imbunatatiri mobile
|
||||
- Foloseste clasele Bootstrap existente, nu adauga CSS custom decat daca e necesar
|
||||
- Pastreaza consistenta cu designul existent
|
||||
@@ -1,61 +0,0 @@
|
||||
---
|
||||
name: ui-verify
|
||||
description: Team agent de verificare Playwright pentru UI. Captureaza screenshots after-implementation, compara cu preview-urile aprobate, si raporteaza discrepante la team lead. Folosit intotdeauna dupa implementare.
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# UI Verify Agent
|
||||
|
||||
Esti un teammate specializat pe verificare vizuala Playwright in proiectul GoMag Import Manager.
|
||||
|
||||
## Responsabilitati
|
||||
|
||||
- Capturare screenshots post-implementare → `screenshots/after/`
|
||||
- Comparare vizuala `after/` vs `preview/`
|
||||
- Verificare ca desktop-ul ramane neschimbat unde nu s-a modificat intentionat
|
||||
- Raportare discrepante la team lead cu descriere exacta
|
||||
|
||||
## Server
|
||||
|
||||
App ruleaza la `http://localhost:5003`. Verifica cu `curl -s http://localhost:5003/health` inainte de screenshots.
|
||||
|
||||
**IMPORTANT**: NU restarteaza serverul singur. Serverul trebuie pornit de user via `./start.sh` care seteaza variabilele de mediu Oracle (`LD_LIBRARY_PATH`, `TNS_ADMIN`). Daca serverul nu raspunde sau Oracle e `"error"`, raporteaza la team-lead si asteapta ca userul sa-l reporneasca.
|
||||
|
||||
## Viewports
|
||||
|
||||
- **Mobile:** 375x812 — `browser_resize width=375 height=812`
|
||||
- **Desktop:** 1440x900 — `browser_resize width=1440 height=900`
|
||||
|
||||
## Pagini de verificat
|
||||
|
||||
- `http://localhost:5003/` — Dashboard
|
||||
- `http://localhost:5003/logs?run=<run_id>` — Logs cu run selectat
|
||||
- `http://localhost:5003/mappings` — Mapari SKU
|
||||
- `http://localhost:5003/missing-skus` — SKU-uri lipsa
|
||||
|
||||
## Workflow in echipa
|
||||
|
||||
1. Citeste task-ul cu `TaskGet` pentru lista exacta de pagini si criterii de verificat
|
||||
2. Marcheaza task-ul ca `in_progress` cu `TaskUpdate`
|
||||
3. Restarteza serverul daca e necesar
|
||||
4. Captureaza screenshots la ambele viewports pentru fiecare pagina
|
||||
5. Verifica vizual fiecare screenshot vs criteriile din task
|
||||
6. Marcheaza task-ul ca `completed` cu `TaskUpdate`
|
||||
7. Trimite raport detaliat la `team-lead`:
|
||||
- ✅ Ce e corect
|
||||
- ❌ Ce e gresit / lipseste (cu descriere exacta)
|
||||
- Sugestii de fix daca e cazul
|
||||
|
||||
## Naming convention screenshots
|
||||
|
||||
```
|
||||
screenshots/after/dashboard_desktop.png
|
||||
screenshots/after/dashboard_mobile.png
|
||||
screenshots/after/dashboard_modal_desktop.png
|
||||
screenshots/after/dashboard_modal_mobile.png
|
||||
screenshots/after/logs_desktop.png
|
||||
screenshots/after/logs_mobile.png
|
||||
screenshots/after/logs_modal_desktop.png
|
||||
screenshots/after/logs_modal_mobile.png
|
||||
screenshots/after/mappings_desktop.png
|
||||
```
|
||||
@@ -1,45 +0,0 @@
|
||||
---
|
||||
name: vfp-integration
|
||||
description: Visual FoxPro specialist for GoMag API integration, JSON processing, and Oracle orchestration
|
||||
model: sonnet
|
||||
---
|
||||
|
||||
# VFP Integration Agent
|
||||
|
||||
You are a Visual FoxPro 9 developer working on the GoMag API integration layer.
|
||||
|
||||
## Your Responsibilities
|
||||
|
||||
- Maintain and extend gomag-vending.prg (GoMag API client)
|
||||
- Develop sync-comenzi-web.prg (orchestrator with timer automation)
|
||||
- Handle JSON data retrieval, parsing, and output
|
||||
- Implement HTML entity cleaning and data transformation
|
||||
- Build logging system with rotation
|
||||
|
||||
## Key Files You Own
|
||||
|
||||
- `vfp/gomag-vending.prg` - GoMag API client with pagination
|
||||
- `vfp/utils.prg` - Utility functions (logging, settings, connectivity)
|
||||
- `vfp/sync-comenzi-web.prg` - Future orchestrator (Phase 2)
|
||||
- `vfp/nfjson/` - JSON parsing library
|
||||
|
||||
## VFP Conventions
|
||||
|
||||
- HTML entity cleaning: ă->a, ș->s, ț->t, î->i, â->a (Romanian diacritics)
|
||||
- INI configuration management via LoadSettings
|
||||
- Log format: `YYYY-MM-DD HH:MM:SS | ORDER-XXX | OK/ERROR | details`
|
||||
- JSON output to `vfp/output/` directory (gomag_orders_page*_*.json)
|
||||
- 5-minute timer for automated sync cycles
|
||||
|
||||
## Data Flow
|
||||
|
||||
```
|
||||
GoMag API -> VFP (gomag-vending.prg) -> JSON files -> FastAPI (order_reader.py) -> Oracle packages
|
||||
```
|
||||
|
||||
## Communication Style
|
||||
|
||||
When reporting to the team lead or other teammates:
|
||||
- Describe data format changes that affect downstream processing
|
||||
- Note any new JSON fields or structure changes
|
||||
- Flag API rate limiting or pagination issues
|
||||
@@ -1,6 +0,0 @@
|
||||
{
|
||||
"env": {
|
||||
"CLAUDE_CODE_EXPERIMENTAL_AGENT_TEAMS": "1"
|
||||
},
|
||||
"teammateMode": "in-process"
|
||||
}
|
||||
@@ -1,38 +0,0 @@
|
||||
name: Tests
|
||||
|
||||
on:
|
||||
push:
|
||||
branches-ignore: [main]
|
||||
pull_request:
|
||||
branches: [main]
|
||||
|
||||
jobs:
|
||||
fast-tests:
|
||||
runs-on: [self-hosted]
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- name: Run fast tests (unit + e2e)
|
||||
run: ./test.sh ci
|
||||
|
||||
full-tests:
|
||||
runs-on: [self-hosted, oracle]
|
||||
needs: fast-tests
|
||||
if: github.event_name == 'pull_request'
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- name: Run full tests (with Oracle)
|
||||
run: ./test.sh full
|
||||
env:
|
||||
ORACLE_DSN: ${{ secrets.ORACLE_DSN }}
|
||||
ORACLE_USER: ${{ secrets.ORACLE_USER }}
|
||||
ORACLE_PASSWORD: ${{ secrets.ORACLE_PASSWORD }}
|
||||
|
||||
- name: Upload QA reports
|
||||
if: always()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: qa-reports
|
||||
path: qa-reports/
|
||||
retention-days: 30
|
||||
@@ -1,9 +0,0 @@
|
||||
#!/bin/bash
|
||||
echo "🔍 Running pre-push tests..."
|
||||
./test.sh ci
|
||||
EXIT_CODE=$?
|
||||
if [ $EXIT_CODE -ne 0 ]; then
|
||||
echo "❌ Tests failed. Push aborted."
|
||||
exit 1
|
||||
fi
|
||||
echo "✅ Tests passed. Pushing..."
|
||||
46
.gitignore
vendored
46
.gitignore
vendored
@@ -8,49 +8,3 @@
|
||||
*.err
|
||||
*.ERR
|
||||
*.log
|
||||
/screenshots
|
||||
/.playwright-mcp
|
||||
|
||||
# Python
|
||||
__pycache__/
|
||||
*.py[cod]
|
||||
*$py.class
|
||||
|
||||
# Environment files
|
||||
.env
|
||||
.env.local
|
||||
.env.*.local
|
||||
|
||||
# Settings files with secrets
|
||||
settings.ini
|
||||
vfp/settings.ini
|
||||
.gittoken
|
||||
output/
|
||||
vfp/*.json
|
||||
*.~pck
|
||||
.claude/HANDOFF.md
|
||||
scripts/work/
|
||||
|
||||
# Virtual environments
|
||||
venv/
|
||||
.venv/
|
||||
|
||||
# SQLite databases
|
||||
*.db
|
||||
*.db-journal
|
||||
*.db-wal
|
||||
*.db-shm
|
||||
|
||||
# Generated/duplicate directories
|
||||
api/api/
|
||||
|
||||
# Logs directory
|
||||
logs/
|
||||
.gstack/
|
||||
.gstack-audit/
|
||||
|
||||
# QA Reports (generated by test suite)
|
||||
qa-reports/
|
||||
|
||||
# Session handoff
|
||||
.claude/HANDOFF.md
|
||||
|
||||
216
CLAUDE.md
216
CLAUDE.md
@@ -1,128 +1,130 @@
|
||||
# CLAUDE.md
|
||||
|
||||
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
|
||||
|
||||
## Project Overview
|
||||
|
||||
**System:** Import Comenzi Web GoMag → Sistem ROA Oracle
|
||||
Stack: FastAPI + Jinja2 + Bootstrap 5.3 + Oracle PL/SQL + SQLite
|
||||
This is a Visual FoxPro 9 project that interfaces with the GoMag e-commerce API. The application retrieves both product and order data from GoMag's REST API endpoints with full pagination support and comprehensive error handling.
|
||||
|
||||
Documentatie completa: [README.md](README.md)
|
||||
## Architecture
|
||||
|
||||
## Implementare cu TeamCreate
|
||||
- **Main Application**: `gomag-vending.prg` - Primary Visual FoxPro script with pagination support
|
||||
- **Utility Module**: `utils.prg` - INI file handling, logging, and helper functions
|
||||
- **JSON Library**: `nfjson/` - Third-party JSON parsing library for VFP
|
||||
- **Technology**: Visual FoxPro 9 with WinHttp.WinHttpRequest.5.1 for HTTP requests
|
||||
- **API Integration**: GoMag REST API v1 for products and orders management
|
||||
|
||||
**OBLIGATORIU:** Folosim TeamCreate + TaskCreate, NU Agent tool cu subagenti paraleli. Skill-ul `superpowers:dispatching-parallel-agents` NU se aplica in acest proiect.
|
||||
## Core Components
|
||||
|
||||
- Team lead citeste TOATE fisierele implicate, creeaza planul
|
||||
- **ASTEAPTA aprobare explicita** de la user inainte de implementare
|
||||
- Task-uri pe fisiere non-overlapping (evita conflicte)
|
||||
- Cache-bust static assets (`?v=N`) la fiecare schimbare UI
|
||||
### gomag-vending.prg
|
||||
Main application script with:
|
||||
- Complete pagination support for products and orders
|
||||
- Configurable API settings via INI file
|
||||
- Comprehensive error handling and logging
|
||||
- Rate limiting compliance (1-second delays between requests)
|
||||
- JSON array output generation for both products and orders
|
||||
|
||||
### utils.prg
|
||||
Utility functions module containing:
|
||||
- INI file operations (`ReadPini`, `WritePini`, `LoadSettings`)
|
||||
- Logging system (`InitLog`, `LogMessage`, `CloseLog`)
|
||||
- Connectivity testing (`TestConnectivity`)
|
||||
- URL encoding utilities (`UrlEncode`)
|
||||
- Default configuration creation (`CreateDefaultIni`)
|
||||
|
||||
### settings.ini
|
||||
Configuration file with sections:
|
||||
- `[API]` - API endpoints, credentials, and headers
|
||||
- `[PAGINATION]` - Page size limits
|
||||
- `[OPTIONS]` - Feature toggles for products/orders retrieval
|
||||
- `[FILTERS]` - Date range filters for orders
|
||||
|
||||
## Development Commands
|
||||
|
||||
```bash
|
||||
# INTOTDEAUNA via start.sh (seteaza Oracle env vars)
|
||||
./start.sh
|
||||
# NU folosi uvicorn direct — lipsesc LD_LIBRARY_PATH si TNS_ADMIN
|
||||
### Running the Application
|
||||
```foxpro
|
||||
DO gomag-vending.prg
|
||||
```
|
||||
|
||||
## Testing & CI/CD
|
||||
|
||||
```bash
|
||||
# Teste rapide (unit + e2e, ~30s, fara Oracle)
|
||||
./test.sh ci
|
||||
|
||||
# Teste complete (totul inclusiv Oracle + sync real + PL/SQL, ~2-3 min)
|
||||
./test.sh full
|
||||
|
||||
# Smoke test pe productie (read-only, dupa deploy)
|
||||
./test.sh smoke-prod --base-url http://79.119.86.134/gomag
|
||||
|
||||
# Doar un layer specific
|
||||
./test.sh unit # SQLite CRUD, imports, routes
|
||||
./test.sh e2e # Browser tests (Playwright)
|
||||
./test.sh oracle # Oracle integration
|
||||
./test.sh sync # Sync real GoMag → Oracle
|
||||
./test.sh qa # API health + responsive + log monitor
|
||||
./test.sh logs # Doar log monitoring
|
||||
|
||||
# Validate prerequisites
|
||||
./test.sh --dry-run
|
||||
### Running from Windows Command Line
|
||||
Use the provided batch file:
|
||||
```cmd
|
||||
run-gomag.bat
|
||||
```
|
||||
|
||||
**Flow zilnic:**
|
||||
1. Lucrezi pe branch `fix/*` sau `feat/*`
|
||||
2. `git push` → pre-push hook ruleaza `./test.sh ci` automat (~30s)
|
||||
3. Inainte de PR → `./test.sh full` manual (~2-3 min)
|
||||
4. Dupa deploy pe prod → `./test.sh smoke-prod --base-url http://79.119.86.134/gomag`
|
||||
|
||||
**Output:** `qa-reports/` — health score, raport markdown, screenshots, baseline comparison.
|
||||
|
||||
**Markers pytest:** `unit`, `oracle`, `e2e`, `qa`, `sync`
|
||||
|
||||
## Reguli critice (nu le incalca)
|
||||
|
||||
### Flux import comenzi
|
||||
1. Download GoMag API → JSON → parse → validate SKU-uri → import Oracle
|
||||
2. Ordinea: **parteneri** (cauta/creeaza) → **adrese** → **comanda** → **factura cache**
|
||||
3. SKU lookup: ARTICOLE_TERTI (mapped) are prioritate fata de NOM_ARTICOLE (direct)
|
||||
4. Complex sets (kituri/pachete): un SKU → multiple CODMAT-uri cu `cantitate_roa`; preturile se preiau din lista de preturi Oracle
|
||||
5. Comenzi anulate (GoMag statusId=7): verifica daca au factura inainte de stergere din Oracle
|
||||
|
||||
### Statusuri comenzi
|
||||
`IMPORTED` / `ALREADY_IMPORTED` / `SKIPPED` / `ERROR` / `CANCELLED` / `DELETED_IN_ROA`
|
||||
- Upsert: `IMPORTED` existent NU se suprascrie cu `ALREADY_IMPORTED`
|
||||
- Recovery: la fiecare sync, comenzile ERROR sunt reverificate in Oracle
|
||||
|
||||
### Parteneri
|
||||
- Prioritate: **companie** (PJ, cod_fiscal + registru) daca exista in GoMag (name SAU code), altfel persoana fizica cu **shipping name**
|
||||
- Adresa livrare: intotdeauna GoMag shipping
|
||||
- Adresa facturare PJ: adresa billing din GoMag (sediul firmei)
|
||||
- Adresa facturare PF: adresa shipping din GoMag (ramburs curier pe numele destinatarului)
|
||||
|
||||
### Cautare partener PJ dupa cod fiscal (ANAF strict mode)
|
||||
Cand avem date ANAF (`anaf_strict=1`), PL/SQL `cauta_partener_dupa_cod_fiscal` diferentiaza intre platitor si neplatitor TVA:
|
||||
- **Platitor TVA** (scpTVA=True) → cauta in `nom_parteneri.cod_fiscal` doar `RO<cifre>` si `RO <cifre>` (cu/fara spatiu)
|
||||
- **Neplatitor TVA** (scpTVA=False) → cauta doar forma bare `<cifre>`
|
||||
- **Nu cross-match** intre platitor si neplatitor — entitati fiscal distincte
|
||||
- Fallback non-strict (`NULL`): toate 3 formele (anti-dedup la ANAF down)
|
||||
|
||||
Python normalizeaza CUI-ul (`re.sub(r'\s+', '', ...)`) inainte de apel Oracle. La creare partener NOU PJ, se foloseste numele oficial ANAF (`denumire_anaf`) in loc de GoMag company_name (poate avea typos); partenerii existenti nu sunt atinsi.
|
||||
|
||||
### Preturi
|
||||
- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
|
||||
- Daca pretul lipseste, se insereaza automat pret=0
|
||||
|
||||
### Dashboard paginare
|
||||
- Contorul din paginare arata **totalul comenzilor** din perioada selectata (ex: "378 comenzi"), NU doar cele filtrate
|
||||
- Butoanele de filtru (Importat, Omise, Erori, Facturate, Nefacturate, Anulate) arata fiecare cate comenzi are pe langa total
|
||||
- Aceasta este comportamentul dorit: userul vede cate comenzi totale sunt, din care cate importate, cu erori etc.
|
||||
|
||||
### Invoice cache
|
||||
- Coloanele `factura_*` pe `orders` (SQLite), populate lazy din Oracle (`vanzari WHERE sters=0`)
|
||||
- Refresh complet: verifica facturi noi + facturi sterse + comenzi sterse din ROA
|
||||
|
||||
## Sync articole VENDING → MARIUSM_AUTO
|
||||
|
||||
```bash
|
||||
# Dry-run (arată diferențele fără să modifice)
|
||||
python3 scripts/sync_vending_to_mariusm.py
|
||||
|
||||
# Aplică cu confirmare
|
||||
python3 scripts/sync_vending_to_mariusm.py --apply
|
||||
|
||||
# Fără confirmare (automatizare)
|
||||
python3 scripts/sync_vending_to_mariusm.py --apply --yes
|
||||
Direct execution with Visual FoxPro:
|
||||
```cmd
|
||||
"C:\Program Files (x86)\Microsoft Visual FoxPro 9\vfp9.exe" -T "gomag-vending.prg"
|
||||
```
|
||||
|
||||
Sincronizează via SSH din VENDING (prod Windows) în MARIUSM_AUTO (dev ROA_CENTRAL):
|
||||
nom_articole (noi by codmat, codmat updatat) + articole_terti (noi, modificate, soft-delete).
|
||||
## Configuration Management
|
||||
|
||||
## Design System
|
||||
The application uses `settings.ini` for all configuration. Key settings:
|
||||
|
||||
Always read DESIGN.md before making any visual or UI decisions.
|
||||
All font choices, colors, spacing, and aesthetic direction are defined there.
|
||||
Do not deviate without explicit user approval.
|
||||
In QA mode, flag any code that doesn't match DESIGN.md.
|
||||
### Required Configuration
|
||||
- `ApiKey` - Your GoMag API key
|
||||
- `ApiShop` - Your shop URL (e.g., "https://yourstore.gomag.ro")
|
||||
|
||||
## Deploy Windows
|
||||
### Feature Control
|
||||
- `GetProducts` - Set to "1" to retrieve products, "0" to skip
|
||||
- `GetOrders` - Set to "1" to retrieve orders, "0" to skip
|
||||
- `OrderDaysBack` - Number of days back to retrieve orders (default: 7)
|
||||
|
||||
Vezi [README.md](README.md#deploy-windows)
|
||||
### Pagination
|
||||
- `Limit` - Records per page (default: 100, max recommended for GoMag API)
|
||||
|
||||
## API Integration Details
|
||||
|
||||
### Authentication
|
||||
- Header-based authentication using `Apikey` and `ApiShop` headers
|
||||
- User-Agent must differ from "PostmanRuntime"
|
||||
|
||||
### Endpoints
|
||||
- Products: `https://api.gomag.ro/api/v1/product/read/json?enabled=1`
|
||||
- Orders: `https://api.gomag.ro/api/v1/order/read/json`
|
||||
|
||||
### Rate Limiting
|
||||
- 1-second pause between paginated requests
|
||||
- No specific READ request limitations
|
||||
- POST requests limited to ~1 request per second (Leaky Bucket)
|
||||
|
||||
## Directory Structure
|
||||
|
||||
```
|
||||
/
|
||||
├── gomag-vending.prg # Main application
|
||||
├── utils.prg # Utility functions
|
||||
├── settings.ini # Configuration file
|
||||
├── run-gomag.bat # Windows launcher
|
||||
├── nfjson/ # JSON parsing library
|
||||
│ ├── nfjsoncreate.prg
|
||||
│ └── nfjsonread.prg
|
||||
├── log/ # Generated log files
|
||||
├── output/ # Generated JSON files
|
||||
└── products-example.json # Sample API response
|
||||
```
|
||||
|
||||
## Output Files
|
||||
|
||||
### Generated Files
|
||||
- `log/gomag_sync_YYYYMMDD_HHMMSS.log` - Detailed execution logs
|
||||
- `output/gomag_all_products_YYYYMMDD_HHMMSS.json` - Complete product array
|
||||
- `output/gomag_orders_last7days_YYYYMMDD_HHMMSS.json` - Orders array
|
||||
|
||||
### Error Files
|
||||
- `gomag_error_pageN_*.json` - Raw API responses for failed parsing
|
||||
- `gomag_error_pageN_*.log` - HTTP error details with status codes
|
||||
|
||||
## Key Functions
|
||||
|
||||
### Main Application (gomag-vending.prg)
|
||||
- `SaveProductsArray` - Converts paginated product data to JSON array
|
||||
- `SaveOrdersArray` - Converts paginated order data to JSON array
|
||||
- `MergeProducts` - Combines products from multiple pages
|
||||
- `MergeOrdersArray` - Combines orders from multiple pages
|
||||
|
||||
### Utilities (utils.prg)
|
||||
- `LoadSettings` - Loads complete INI configuration into object
|
||||
- `InitLog`/`LogMessage`/`CloseLog` - Comprehensive logging system
|
||||
- `TestConnectivity` - Internet connection verification
|
||||
- `CreateDefaultIni` - Generates template configuration file
|
||||
335
DESIGN.md
335
DESIGN.md
@@ -1,335 +0,0 @@
|
||||
# Design System — GoMag Vending
|
||||
|
||||
## Product Context
|
||||
- **What this is:** Internal admin dashboard for importing web orders from GoMag e-commerce into ROA Oracle ERP
|
||||
- **Who it's for:** Ops/admin team who monitor order sync daily, fix SKU mappings, check import errors
|
||||
- **Space/industry:** Internal tools, B2B operations, ERP integration
|
||||
- **Project type:** Data-heavy admin dashboard (tables, status indicators, sync controls)
|
||||
|
||||
## Aesthetic Direction
|
||||
- **Direction:** Industrial/Utilitarian — function-first, data-dense, quietly confident
|
||||
- **Decoration level:** Minimal — typography and color do the work. No illustrations, gradients, or decorative elements. The data IS the decoration.
|
||||
- **Mood:** Command console. This tool says "built by someone who respects the operator." Serious, efficient, warm.
|
||||
- **Anti-patterns:** No purple gradients, no 3-column icon grids, no centered-everything layouts, no decorative blobs, no stock-photo heroes
|
||||
|
||||
## Typography
|
||||
|
||||
### Font Stack
|
||||
- **Display/Headings:** Space Grotesk — geometric, slightly techy, distinctive `a` and `g`. Says "engineered."
|
||||
- **Body/UI:** DM Sans — clean, excellent readability, good tabular-nums for inline numbers
|
||||
- **Data/Tables:** JetBrains Mono — order IDs, CODMATs, status codes align perfectly. Tables become scannable.
|
||||
- **Code:** JetBrains Mono
|
||||
|
||||
### Loading
|
||||
```html
|
||||
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
|
||||
<link href="https://fonts.googleapis.com/css2?family=DM+Sans:ital,opsz,wght@0,9..40,300;0,9..40,400;0,9..40,500;0,9..40,600;0,9..40,700;1,9..40,400&family=JetBrains+Mono:wght@400;500;600&family=Space+Grotesk:wght@400;500;600;700&display=swap" rel="stylesheet">
|
||||
```
|
||||
|
||||
### CSS Variables
|
||||
```css
|
||||
--font-display: 'Space Grotesk', sans-serif;
|
||||
--font-body: 'DM Sans', sans-serif;
|
||||
--font-data: 'JetBrains Mono', monospace;
|
||||
```
|
||||
|
||||
### Type Scale
|
||||
| Level | Size | Weight | Font | Usage |
|
||||
|-------|------|--------|------|-------|
|
||||
| Page title | 18px | 600 | Display | "Panou de Comanda" |
|
||||
| Section title | 16px | 600 | Display | Card headers |
|
||||
| Label/uppercase | 12px | 500 | Display | Column headers, section labels (letter-spacing: 0.04em) |
|
||||
| Body | 14px | 400 | Body | Paragraphs, descriptions |
|
||||
| UI/Button | 13px | 500 | Body | Buttons, nav links, form labels |
|
||||
| Data cell | 13px | 400 | Data | Codes, IDs, numbers, sums, dates (NOT text names — those use Body font) |
|
||||
| Data small | 12px | 400 | Data | Timestamps, secondary data |
|
||||
| Code/mono | 11px | 400 | Data | Inline code, debug info |
|
||||
|
||||
## Color
|
||||
|
||||
### Approach: Two-accent system (amber state + blue action)
|
||||
Every admin tool is blue. This one uses amber — reads as "operational" and "attention-worthy."
|
||||
- **Amber (--accent):** Navigation active state, filter pill active, accent backgrounds. "Where you are."
|
||||
- **Blue (--info):** Primary buttons, CTAs, actionable links. "What you can do."
|
||||
- Primary buttons (`btn-primary`) stay blue for clear action hierarchy.
|
||||
|
||||
### Light Mode (default)
|
||||
```css
|
||||
:root {
|
||||
/* Surfaces */
|
||||
--bg: #F8F7F5; /* warm off-white, not clinical gray */
|
||||
--surface: #FFFFFF;
|
||||
--surface-raised: #F3F2EF; /* hover states, table headers */
|
||||
--card-shadow: 0 1px 3px rgba(28,25,23,0.1), 0 1px 2px rgba(28,25,23,0.06);
|
||||
|
||||
/* Text */
|
||||
--text-primary: #1C1917; /* warm black */
|
||||
--text-secondary: #57534E; /* warm gray */
|
||||
--text-muted: #78716C; /* labels, timestamps */
|
||||
|
||||
/* Borders */
|
||||
--border: #E7E5E4;
|
||||
--border-subtle: #F0EFED;
|
||||
|
||||
/* Accent — amber */
|
||||
--accent: #D97706;
|
||||
--accent-hover: #B45309;
|
||||
--accent-light: #FEF3C7; /* amber backgrounds */
|
||||
--accent-text: #92400E; /* text on amber bg */
|
||||
|
||||
/* Semantic */
|
||||
--success: #16A34A;
|
||||
--success-light: #DCFCE7;
|
||||
--success-text: #166534;
|
||||
|
||||
--warning: #CA8A04;
|
||||
--warning-light: #FEF9C3;
|
||||
--warning-text: #854D0E;
|
||||
|
||||
--error: #DC2626;
|
||||
--error-light: #FEE2E2;
|
||||
--error-text: #991B1B;
|
||||
|
||||
--info: #2563EB;
|
||||
--info-light: #DBEAFE;
|
||||
--info-text: #1E40AF;
|
||||
|
||||
--cancelled: #78716C;
|
||||
--cancelled-light: #F5F5F4;
|
||||
|
||||
--compare: #EA580C;
|
||||
--compare-light: #FFF7ED;
|
||||
--compare-text: #9A3412;
|
||||
}
|
||||
```
|
||||
|
||||
### Dark Mode
|
||||
Strategy: invert surfaces, reduce accent saturation ~15%, keep semantic colors recognizable.
|
||||
|
||||
```css
|
||||
[data-theme="dark"] {
|
||||
--bg: #121212;
|
||||
--surface: #1E1E1E;
|
||||
--surface-raised: #2A2A2A;
|
||||
--card-shadow: 0 1px 3px rgba(0,0,0,0.4), 0 1px 2px rgba(0,0,0,0.3);
|
||||
|
||||
--text-primary: #E8E4DD; /* warm bone white */
|
||||
--text-secondary: #A8A29E;
|
||||
--text-muted: #78716C;
|
||||
|
||||
--border: #333333;
|
||||
--border-subtle: #262626;
|
||||
|
||||
--accent: #F59E0B;
|
||||
--accent-hover: #D97706;
|
||||
--accent-light: rgba(245,158,11,0.12);
|
||||
--accent-text: #FCD34D;
|
||||
|
||||
--success: #16A34A;
|
||||
--success-light: rgba(22,163,74,0.15);
|
||||
--success-text: #4ADE80;
|
||||
|
||||
--warning: #CA8A04;
|
||||
--warning-light: rgba(202,138,4,0.15);
|
||||
--warning-text: #FACC15;
|
||||
|
||||
--error: #DC2626;
|
||||
--error-light: rgba(220,38,38,0.15);
|
||||
--error-text: #FCA5A5;
|
||||
|
||||
--info: #2563EB;
|
||||
--info-light: rgba(37,99,235,0.15);
|
||||
--info-text: #93C5FD;
|
||||
|
||||
--cancelled: #78716C;
|
||||
--cancelled-light: rgba(120,113,108,0.15);
|
||||
|
||||
--compare: #EA580C;
|
||||
--compare-light: rgba(234,88,12,0.15);
|
||||
--compare-text: #FB923C;
|
||||
}
|
||||
```
|
||||
|
||||
### Status Color Mapping
|
||||
| Status | Dot Color | Badge BG | Glow |
|
||||
|--------|-----------|----------|------|
|
||||
| IMPORTED | `--success` | `--success-light` | none (quiet when healthy) |
|
||||
| ERROR | `--error` | `--error-light` | `0 0 8px 2px rgba(220,38,38,0.35)` |
|
||||
| SKIPPED | `--warning` | `--warning-light` | `0 0 6px 2px rgba(202,138,4,0.3)` |
|
||||
| ALREADY_IMPORTED | `--info` | `--info-light` | none |
|
||||
| CANCELLED | `--cancelled` | `--cancelled-light` | none |
|
||||
| DELETED_IN_ROA | `--cancelled` | `--cancelled-light` | none |
|
||||
| MALFORMED | `--compare` | `--compare-light` | `0 0 8px 2px rgba(234,88,12,0.35)` |
|
||||
|
||||
**Design rule:** Problems glow, success is calm. The operator's eye is pulled to rows that need action.
|
||||
|
||||
**ERROR vs MALFORMED:** ERROR red signals a runtime issue operators can fix on our side (Oracle hiccup, network, stale state). MALFORMED orange signals the payload itself is broken at the source — the operator should escalate to GoMag rather than keep retrying. Visually distinct colors make the diagnostic path obvious at a glance.
|
||||
|
||||
## Spacing
|
||||
- **Base unit:** 4px
|
||||
- **Density:** Comfortable — not cramped, not wasteful
|
||||
- **Scale:**
|
||||
|
||||
| Token | Value | Usage |
|
||||
|-------|-------|-------|
|
||||
| 2xs | 2px | Tight internal gaps |
|
||||
| xs | 4px | Icon-text gap, badge padding |
|
||||
| sm | 8px | Compact card padding, table cell padding |
|
||||
| md | 16px | Standard card padding, section gaps |
|
||||
| lg | 24px | Section spacing |
|
||||
| xl | 32px | Major section gaps |
|
||||
| 2xl | 48px | Page-level spacing |
|
||||
| 3xl | 64px | Hero spacing (rarely used) |
|
||||
|
||||
## Layout
|
||||
|
||||
### Approach: Grid-disciplined, full-width
|
||||
Tables with 8+ columns and hundreds of rows need every pixel of width.
|
||||
|
||||
- **Nav:** Horizontal top bar, fixed, 48px height. Active tab has amber underline (2px).
|
||||
- **Content max-width:** None on desktop (full-width for tables), 1200px for non-table content
|
||||
- **Grid:** Single-column layout, cards stack vertically
|
||||
- **Breakpoints:**
|
||||
|
||||
| Name | Width | Columns | Behavior |
|
||||
|------|-------|---------|----------|
|
||||
| Desktop | >= 1024px | Full width | All features visible |
|
||||
| Tablet | 768-1023px | Full width | Nav labels abbreviated, tables scroll horizontally |
|
||||
| Mobile | < 768px | Single column | Bottom nav, cards stack, condensed views |
|
||||
|
||||
### Border Radius
|
||||
| Token | Value | Usage |
|
||||
|-------|-------|-------|
|
||||
| sm | 4px | Buttons, inputs, badges, status dots |
|
||||
| md | 8px | Cards, dropdowns, modals |
|
||||
| lg | 12px | Large containers, mockup frames |
|
||||
| full | 9999px | Pills, avatar circles |
|
||||
|
||||
## Motion
|
||||
- **Approach:** Minimal-functional — only transitions that aid comprehension
|
||||
- **Easing:** enter(ease-out) exit(ease-in) move(ease-in-out)
|
||||
- **Duration:**
|
||||
|
||||
| Token | Value | Usage |
|
||||
|-------|-------|-------|
|
||||
| micro | 50-100ms | Button hover, focus ring |
|
||||
| short | 150-250ms | Dropdown open, tab switch, color transitions |
|
||||
| medium | 250-400ms | Modal open/close, page transitions |
|
||||
| long | 400-700ms | Only for sync pulse animation |
|
||||
|
||||
- **Sync pulse:** The live sync dot uses a 2s infinite pulse (opacity 1 → 0.4 → 1)
|
||||
- **No:** entrance animations, scroll effects, decorative motion
|
||||
|
||||
## Mobile Design
|
||||
|
||||
### Navigation
|
||||
- **Bottom tab bar** replaces top horizontal nav on screens < 768px
|
||||
- 5 tabs: Dashboard, Mapari, Lipsa, Jurnale, Setari
|
||||
- Each tab: icon (Bootstrap Icons) + short label below
|
||||
- Active tab: amber accent color, inactive: `--text-muted`
|
||||
- Height: 56px, safe-area padding for notched devices
|
||||
- Fixed position bottom, with `padding-bottom: env(safe-area-inset-bottom)`
|
||||
|
||||
```css
|
||||
@media (max-width: 767px) {
|
||||
.top-navbar { display: none; }
|
||||
.bottom-nav {
|
||||
position: fixed;
|
||||
bottom: 0;
|
||||
left: 0;
|
||||
right: 0;
|
||||
height: 56px;
|
||||
padding-bottom: env(safe-area-inset-bottom);
|
||||
background: var(--surface);
|
||||
border-top: 1px solid var(--border);
|
||||
display: flex;
|
||||
justify-content: space-around;
|
||||
align-items: center;
|
||||
z-index: 1000;
|
||||
}
|
||||
.main-content {
|
||||
padding-bottom: 72px; /* clear bottom nav */
|
||||
padding-top: 8px; /* no top navbar */
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Dashboard — Mobile
|
||||
- **Sync card:** Full width, stacked vertically
|
||||
- Status + controls row wraps to 2 lines
|
||||
- Sync button full-width at bottom of card
|
||||
- Last sync info wraps naturally
|
||||
- **Orders table:** Condensed card view instead of horizontal table
|
||||
- Each order = a compact card showing: status dot + ID + client name + total
|
||||
- Tap to expand: shows date, factura, full details
|
||||
- Swipe left on card: quick action (view error details)
|
||||
- **Filter bar:** Horizontal scrollable chips instead of dropdowns
|
||||
- Period selector: pill chips (1zi, 7zi, 30zi, Toate)
|
||||
- Status filter: colored chips matching status colors
|
||||
- **Touch targets:** Minimum 44x44px for all interactive elements
|
||||
|
||||
### Orders Mobile Card Layout
|
||||
```
|
||||
┌────────────────────────────────┐
|
||||
│ ● CMD-47832 2,450.00 RON│
|
||||
│ SC Automate Express SRL │
|
||||
│ 27.03.2026 · FCT-2026-1847 │
|
||||
└────────────────────────────────┘
|
||||
```
|
||||
- Status dot (8px, left-aligned with glow for errors)
|
||||
- Order ID in JetBrains Mono, amount right-aligned
|
||||
- Client name in DM Sans
|
||||
- Date + factura in muted data font
|
||||
|
||||
### SKU Mappings — Mobile
|
||||
- Each mapping = expandable card
|
||||
- Collapsed: SKU + product name + type badge (KIT/SIMPLU)
|
||||
- Expanded: Full CODMAT list with quantities
|
||||
- Search: Full-width sticky search bar at top
|
||||
- Filter: Horizontal scrollable type chips
|
||||
|
||||
### Logs — Mobile
|
||||
- Timeline view instead of table
|
||||
- Each log entry = timestamp + status icon + summary
|
||||
- Tap to expand full log details
|
||||
- Infinite scroll with date separators
|
||||
|
||||
### Settings — Mobile
|
||||
- Standard stacked form layout
|
||||
- Full-width inputs
|
||||
- Toggle switches for boolean settings (min 44px touch target)
|
||||
- Save button sticky at bottom
|
||||
|
||||
### Gestures
|
||||
- **Pull to refresh** on Dashboard: triggers sync status check
|
||||
- **Swipe left** on order card: reveal quick actions
|
||||
- **Long press** on SKU mapping: copy CODMAT to clipboard
|
||||
- **No swipe navigation** between pages (use bottom tabs)
|
||||
|
||||
### Mobile Typography Adjustments
|
||||
| Level | Desktop | Mobile |
|
||||
|-------|---------|--------|
|
||||
| Page title | 18px | 16px |
|
||||
| Body | 14px | 14px (no change) |
|
||||
| Data cell | 13px | 13px (no change) |
|
||||
| Data small | 12px | 12px (no change) |
|
||||
| Table header | 12px | 11px |
|
||||
|
||||
### Responsive Images & Icons
|
||||
- Use Bootstrap Icons throughout (already loaded via CDN)
|
||||
- Icon size: 16px desktop, 20px mobile (larger touch targets)
|
||||
- No images in the admin interface (data-only)
|
||||
|
||||
## Decisions Log
|
||||
| Date | Decision | Rationale |
|
||||
|------|----------|-----------|
|
||||
| 2026-03-27 | Initial design system created | Created by /design-consultation. Industrial/utilitarian aesthetic with amber accent, Space Grotesk + DM Sans + JetBrains Mono. |
|
||||
| 2026-03-27 | Amber accent over blue | Every admin tool is blue. Amber reads as "operational" and gives the tool its own identity. Confirmed by Claude subagent ("Control Room Noir" also converged on amber). |
|
||||
| 2026-03-27 | JetBrains Mono for data tables | Both primary analysis and subagent independently recommended monospace for data tables. Scannability win outweighs the ~15% wider columns. |
|
||||
| 2026-03-27 | Warm tones throughout | Off-white (#F8F7F5) instead of clinical gray. Warm black text instead of blue-gray. Makes the tool feel handcrafted. |
|
||||
| 2026-03-27 | Glowing status dots for errors | Problems glow (box-shadow), success is calm. Operator's eye is pulled to rows that need action. Inspired by subagent's "LED indicator" concept. |
|
||||
| 2026-03-27 | Full mobile design | Bottom nav, card-based order views, touch-optimized gestures. Supports quick-glance usage from phone. |
|
||||
| 2026-03-27 | Two-accent system | Blue = action (buttons, CTAs), amber = state (nav active, filter active). Clear hierarchy. |
|
||||
| 2026-03-27 | JetBrains Mono selective | Mono font only for codes, IDs, numbers, sums, dates. Text names use DM Sans for readability. |
|
||||
| 2026-03-27 | Dark mode in scope | CSS variables + toggle + localStorage. All DESIGN.md dark tokens implemented in Commit 0.5. |
|
||||
@@ -1,10 +0,0 @@
|
||||
ESTIMARE PROIECT - Import Comenzi Web → ROA
|
||||
Data: 5 martie 2026
|
||||
================================================================================
|
||||
|
||||
Lucrat deja: 20h
|
||||
De lucrat: 60h
|
||||
Support 3 luni: 24h
|
||||
|
||||
TOTAL IMPLEMENTARE: 80h
|
||||
TOTAL CU SUPPORT: 104h
|
||||
527
README.md
527
README.md
@@ -1,527 +1,2 @@
|
||||
# GoMag Vending - Import Comenzi Web → ROA Oracle
|
||||
# gomag-vending
|
||||
|
||||
System automat de import comenzi din platforma GoMag in sistemul ERP ROA Oracle.
|
||||
|
||||
## Arhitectura
|
||||
|
||||
```
|
||||
[GoMag API] → [Python Sync Service] → [Oracle PL/SQL] → [FastAPI Admin]
|
||||
↓ ↓ ↑ ↑
|
||||
JSON Orders Download/Parse/Import Store/Update Dashboard + Config
|
||||
```
|
||||
|
||||
### Stack Tehnologic
|
||||
- **API + Admin:** FastAPI + Jinja2 + Bootstrap 5.3
|
||||
- **GoMag Integration:** Python (`gomag_client.py` — download comenzi cu paginare)
|
||||
- **Sync Orchestrator:** Python (`sync_service.py` — download → parse → validate → import)
|
||||
- **Database:** Oracle PL/SQL packages (IMPORT_PARTENERI, IMPORT_COMENZI) + SQLite (tracking)
|
||||
|
||||
---
|
||||
|
||||
## Quick Start
|
||||
|
||||
### Prerequisite
|
||||
- Python 3.10+
|
||||
- Oracle Instant Client 21.x (optional — suporta si thin mode pentru Oracle 12.1+)
|
||||
|
||||
### Instalare
|
||||
|
||||
```bash
|
||||
pip install -r api/requirements.txt
|
||||
cp api/.env.example api/.env
|
||||
# Editeaza api/.env cu datele de conectare Oracle
|
||||
```
|
||||
|
||||
### Pornire server
|
||||
|
||||
**Important:** serverul trebuie pornit **din project root**, nu din `api/`:
|
||||
|
||||
```bash
|
||||
python -m uvicorn api.app.main:app --host 0.0.0.0 --port 5003
|
||||
```
|
||||
|
||||
Sau folosind scriptul inclus:
|
||||
```bash
|
||||
./start.sh
|
||||
```
|
||||
|
||||
Deschide `http://localhost:5003` in browser.
|
||||
|
||||
### Testare
|
||||
|
||||
**Test A - Basic (fara Oracle):**
|
||||
```bash
|
||||
python api/test_app_basic.py
|
||||
```
|
||||
|
||||
**Test C - Integrare Oracle:**
|
||||
```bash
|
||||
python api/test_integration.py
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Configurare (.env)
|
||||
|
||||
Copiaza `.env.example` si completeaza:
|
||||
|
||||
```bash
|
||||
cp api/.env.example api/.env
|
||||
```
|
||||
|
||||
| Variabila | Descriere | Exemplu |
|
||||
|-----------|-----------|---------|
|
||||
| `ORACLE_USER` | User Oracle | `MARIUSM_AUTO` |
|
||||
| `ORACLE_PASSWORD` | Parola Oracle | `secret` |
|
||||
| `ORACLE_DSN` | TNS alias | `ROA_CENTRAL` |
|
||||
| `TNS_ADMIN` | Cale absoluta la tnsnames.ora | `/mnt/e/.../gomag/api` |
|
||||
| `INSTANTCLIENTPATH` | Cale Instant Client (thick mode) | `/opt/oracle/instantclient_21_15` |
|
||||
| `FORCE_THIN_MODE` | Thin mode fara Instant Client | `true` |
|
||||
| `SQLITE_DB_PATH` | Path SQLite (relativ la project root) | `api/data/import.db` |
|
||||
| `JSON_OUTPUT_DIR` | Folder JSON-uri descarcate | `api/data/orders` |
|
||||
| `APP_PORT` | Port HTTP | `5003` |
|
||||
| `ID_POL` | ID Politica ROA | `39` |
|
||||
| `ID_GESTIUNE` | ID Gestiune ROA | `0` |
|
||||
| `ID_SECTIE` | ID Sectie ROA | `6` |
|
||||
|
||||
**Nota Oracle mode:**
|
||||
- **Thick mode** (Oracle 10g/11g): seteaza `INSTANTCLIENTPATH`
|
||||
- **Thin mode** (Oracle 12.1+): seteaza `FORCE_THIN_MODE=true`, sterge `INSTANTCLIENTPATH`
|
||||
|
||||
---
|
||||
|
||||
## Structura Proiect
|
||||
|
||||
```
|
||||
gomag-vending/
|
||||
├── api/ # FastAPI Admin + Dashboard
|
||||
│ ├── app/
|
||||
│ │ ├── main.py # Entry point, lifespan, logging
|
||||
│ │ ├── config.py # Settings (pydantic-settings + .env)
|
||||
│ │ ├── database.py # Oracle pool + SQLite schema + migrari
|
||||
│ │ ├── routers/ # Endpoint-uri HTTP
|
||||
│ │ │ ├── health.py # GET /health
|
||||
│ │ │ ├── dashboard.py # GET / (HTML) + /settings (HTML)
|
||||
│ │ │ ├── mappings.py # /mappings, /api/mappings
|
||||
│ │ │ ├── articles.py # /api/articles/search
|
||||
│ │ │ ├── validation.py # /api/validate/*
|
||||
│ │ │ └── sync.py # /api/sync/* + /api/dashboard/* + /api/settings
|
||||
│ │ ├── services/
|
||||
│ │ │ ├── gomag_client.py # Download comenzi GoMag API
|
||||
│ │ │ ├── sync_service.py # Orchestrare: download→validate→import
|
||||
│ │ │ ├── import_service.py # Import comanda in Oracle ROA
|
||||
│ │ │ ├── mapping_service.py # CRUD ARTICOLE_TERTI + cantitate_roa
|
||||
│ │ │ ├── price_sync_service.py # Sync preturi GoMag → Oracle politici
|
||||
│ │ │ ├── sqlite_service.py # Tracking runs/orders/missing SKUs
|
||||
│ │ │ ├── order_reader.py # Citire gomag_orders_page*.json
|
||||
│ │ │ ├── validation_service.py
|
||||
│ │ │ ├── article_service.py
|
||||
│ │ │ ├── invoice_service.py # Verificare facturi ROA
|
||||
│ │ │ └── scheduler_service.py # APScheduler timer
|
||||
│ │ ├── templates/ # Jinja2 (dashboard, mappings, missing_skus, logs, settings)
|
||||
│ │ └── static/ # CSS (style.css) + JS (dashboard, logs, mappings, settings, shared)
|
||||
│ ├── database-scripts/ # Oracle SQL (ARTICOLE_TERTI, packages)
|
||||
│ ├── data/ # SQLite DB (import.db) + JSON orders
|
||||
│ ├── .env # Configurare locala (nu in git)
|
||||
│ ├── .env.example # Template configurare
|
||||
│ ├── test_app_basic.py # Test A - fara Oracle
|
||||
│ ├── test_integration.py # Test C - cu Oracle
|
||||
│ └── requirements.txt
|
||||
├── logs/ # Log-uri aplicatie (sync_comenzi_*.log)
|
||||
├── docs/ # Documentatie (Oracle schema, facturare analysis)
|
||||
├── scripts/ # Utilitare (sync_vending_to_mariusm, create_inventory_notes)
|
||||
├── screenshots/ # Before/preview/after pentru UI changes
|
||||
├── start.sh # Script pornire (Linux/WSL)
|
||||
└── CLAUDE.md # Instructiuni pentru AI assistants
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Dashboard Features
|
||||
|
||||
### Sync Panel
|
||||
- Start sync manual sau scheduler automat (5/10/30 min)
|
||||
- Progress live: `"Import 45/80: #CMD-1234 Ion Popescu"`
|
||||
- Smart polling: 30s idle → 3s cand ruleaza → auto-refresh tabela
|
||||
- Last sync clickabil → jurnal detaliat
|
||||
|
||||
### Comenzi
|
||||
- Filtru perioada: 3z / 7z / 30z / 3 luni / toate / custom
|
||||
- Status pills cu conturi totale pe perioada (nu per-pagina)
|
||||
- Cautare integrata in bara de filtre
|
||||
- Coloana Client cu tooltip `▲` cand persoana livrare ≠ facturare
|
||||
- Paginare sus + jos, selector rezultate per pagina (25/50/100/250)
|
||||
|
||||
### Mapari SKU
|
||||
- Badge `✓ 100%` / `⚠ 80%` per grup SKU
|
||||
- Filtru Complete / Incomplete
|
||||
- Verificare duplicat SKU-CODMAT (409 cu optiune de restaurare)
|
||||
|
||||
### SKU-uri Lipsa
|
||||
- Cautare dupa SKU sau nume produs
|
||||
- Filtru Nerezolvate / Rezolvate / Toate cu conturi
|
||||
- Re-scan cu progress inline si banner rezultat
|
||||
|
||||
---
|
||||
|
||||
## Fluxul de Import
|
||||
|
||||
```
|
||||
1. gomag_client.py descarca comenzi GoMag API → JSON files (paginat)
|
||||
2. order_reader.py parseaza JSON-urile, sorteaza cronologic (cele mai vechi primele)
|
||||
3. Comenzi anulate (GoMag statusId=7) → separate, sterse din Oracle daca nu au factura
|
||||
4. validation_service.py valideaza SKU-uri: ARTICOLE_TERTI (mapped) → NOM_ARTICOLE (direct) → missing
|
||||
5. Verificare existenta in Oracle (COMENZI by date range) → deja importate se sar
|
||||
6. Stale error recovery: comenzi ERROR reverificate in Oracle (crash recovery)
|
||||
7. Validare preturi + dual policy: articole rutate la id_pol_vanzare sau id_pol_productie
|
||||
8. import_service.py: cauta/creeaza partener → adrese → importa comanda in Oracle
|
||||
9. Invoice cache: verifica facturi + comenzi sterse din ROA
|
||||
10. Rezultate salvate in SQLite (orders, sync_run_orders, order_items)
|
||||
```
|
||||
|
||||
### Statuses Comenzi
|
||||
|
||||
| Status | Descriere |
|
||||
|--------|-----------|
|
||||
| `IMPORTED` | Importata nou in ROA in acest run |
|
||||
| `ALREADY_IMPORTED` | Existenta deja in Oracle, contorizata |
|
||||
| `SKIPPED` | SKU-uri lipsa → neimportata |
|
||||
| `ERROR` | Eroare la import (reverificate automat la urmatorul sync) |
|
||||
| `CANCELLED` | Comanda anulata in GoMag (statusId=7) |
|
||||
| `DELETED_IN_ROA` | A fost importata dar comanda a fost stearsa din ROA |
|
||||
|
||||
**Regula upsert:** daca statusul existent este `IMPORTED`, nu se suprascrie cu `ALREADY_IMPORTED`.
|
||||
|
||||
### Reguli Business
|
||||
|
||||
**Parteneri & Adrese:**
|
||||
- Prioritate partener: daca exista **companie** in GoMag (billing.company.name SAU billing.company.code) → firma (PJ, cod_fiscal + registru). Altfel → persoana fizica, cu **shipping name** ca nume partener
|
||||
- Adresa livrare: intotdeauna din GoMag shipping
|
||||
- Adresa facturare **PJ**: adresa billing din GoMag (sediul firmei)
|
||||
- Adresa facturare **PF**: adresa shipping din GoMag (ramburs curier pe numele destinatarului)
|
||||
- Cautare partener in Oracle: cod_fiscal → denumire → create new (ID_UTIL = -3)
|
||||
|
||||
**Articole & Mapari:**
|
||||
- SKU lookup: ARTICOLE_TERTI (mapped, activ=1) are prioritate fata de NOM_ARTICOLE (direct)
|
||||
- SKU simplu: gasit direct in NOM_ARTICOLE → nu se stocheaza in ARTICOLE_TERTI
|
||||
- SKU cu repackaging: un SKU → CODMAT cu cantitate diferita (`cantitate_roa`)
|
||||
- SKU set complex: un SKU → multiple CODMAT-uri cu `procent_pret` (trebuie sum = 100%)
|
||||
|
||||
**Preturi & Discounturi:**
|
||||
- Dual policy: articolele sunt rutate la `id_pol_vanzare` sau `id_pol_productie` pe baza contului contabil (341/345 = productie)
|
||||
- Daca pretul lipseste in politica, se insereaza automat pret=0
|
||||
- Discount VAT splitting: daca `split_discount_vat=1`, discountul se repartizeaza proportional pe cotele TVA din comanda
|
||||
|
||||
---
|
||||
|
||||
## Facturi & Cache
|
||||
|
||||
### Sincronizari
|
||||
|
||||
Sistemul are 3 procese de sincronizare si o setare de refresh UI:
|
||||
|
||||
#### 1. Sync Comenzi (Dashboard → scheduler sau buton Sync)
|
||||
|
||||
Procesul principal. Importa comenzi din GoMag in Oracle si verifica statusul celor existente.
|
||||
|
||||
**Pasi:**
|
||||
1. Descarca comenzile din GoMag API (ultimele N zile, configurat in Setari)
|
||||
2. Valideaza SKU-urile fiecarei comenzi:
|
||||
- Cauta in ARTICOLE_TERTI (mapari manuale) → apoi in NOM_ARTICOLE (potrivire directa)
|
||||
- Daca un SKU nu e gasit nicaieri → comanda e marcata SKIPPED si SKU-ul apare in "SKU-uri lipsa"
|
||||
3. Verifica daca comanda exista deja in Oracle → da: ALREADY_IMPORTED, nu: se importa
|
||||
4. Comenzi cu status ERROR din run-uri anterioare sunt reverificate in Oracle (crash recovery)
|
||||
5. Import in Oracle: cauta/creeaza partener → adrese → comanda
|
||||
6. **Verificare facturi** (la fiecare sync):
|
||||
- Comenzi nefacturate → au primit factura in ROA? → salveaza serie/numar/total
|
||||
- Comenzi facturate → a fost stearsa factura? → sterge cache
|
||||
- Comenzi importate → au fost sterse din ROA? → marcheaza DELETED_IN_ROA
|
||||
|
||||
**Cand ruleaza:**
|
||||
- **Automat:** scheduler configurat din Dashboard (interval: 5 / 10 / 30 min)
|
||||
- **Manual:** buton "Sync" din Dashboard sau `POST /api/sync/start`
|
||||
- **Doar facturi:** `POST /api/dashboard/refresh-invoices` (sare pasii 1-5)
|
||||
|
||||
> Facturarea in ROA **nu** declanseaza sync — statusul se actualizeaza la urmatorul sync sau refresh manual.
|
||||
|
||||
#### 2. Sync Preturi din Comenzi (Setari → on/off)
|
||||
|
||||
La fiecare sync comenzi, daca este activat (`price_sync_enabled=1`), compara preturile din comanda GoMag cu cele din politica de pret Oracle si le actualizeaza daca difera.
|
||||
|
||||
Configurat din: **Setari → Sincronizare preturi din comenzi**
|
||||
|
||||
#### 3. Sync Catalog Preturi (Setari → manual sau zilnic)
|
||||
|
||||
Sync independent de comenzi. Descarca **toate produsele** din catalogul GoMag, le potriveste cu articolele Oracle (prin CODMAT/SKU) si actualizeaza preturile in politica de pret.
|
||||
|
||||
Configurat din: **Setari → Sincronizare Preturi** (activare + program)
|
||||
- **Doar manual:** buton "Sincronizeaza acum" din Setari sau `POST /api/price-sync/start`
|
||||
- **Zilnic la 03:00 / 06:00:** optiune in UI (**neimplementat** — setarea se salveaza dar scheduler-ul zilnic nu exista inca)
|
||||
|
||||
#### Interval polling dashboard (Setari → Dashboard)
|
||||
|
||||
Cat de des verifica **interfata web** (browser-ul) statusul sync-ului. Valoare in secunde (implicit 5s). **Nu afecteaza frecventa sync-ului** — e doar refresh-ul UI-ului.
|
||||
|
||||
Facturile sunt verificate din Oracle si cached in SQLite (`factura_*` pe tabelul `orders`).
|
||||
|
||||
### Sursa Oracle
|
||||
```sql
|
||||
SELECT id_comanda, numar_act, serie_act,
|
||||
total_fara_tva, total_tva, total_cu_tva,
|
||||
TO_CHAR(data_act, 'YYYY-MM-DD')
|
||||
FROM vanzari
|
||||
WHERE id_comanda IN (...) AND sters = 0
|
||||
```
|
||||
|
||||
### Populare Cache
|
||||
1. **Dashboard** (`GET /api/dashboard/orders`) — comenzile fara cache sunt verificate live si cached automat la fiecare request
|
||||
2. **Detaliu comanda** (`GET /api/sync/order/{order_number}`) — verifica Oracle live daca nu e cached
|
||||
3. **Refresh manual** (`POST /api/dashboard/refresh-invoices`) — refresh complet pentru toate comenzile
|
||||
|
||||
### Refresh Complet — `/api/dashboard/refresh-invoices`
|
||||
|
||||
Face trei verificari in Oracle si actualizeaza SQLite:
|
||||
|
||||
| Verificare | Actiune |
|
||||
|------------|---------|
|
||||
| Comenzi nefacturate → au primit factura? | Cached datele facturii |
|
||||
| Comenzi facturate → factura a fost stearsa? | Sterge cache factura |
|
||||
| Toate comenzile importate → comanda stearsa din ROA? | Seteaza status `DELETED_IN_ROA` |
|
||||
|
||||
Returneaza: `{ checked, invoices_added, invoices_cleared, orders_deleted }`
|
||||
|
||||
---
|
||||
|
||||
## API Reference — Sync & Comenzi
|
||||
|
||||
### Sync
|
||||
| Method | Path | Descriere |
|
||||
|--------|------|-----------|
|
||||
| POST | `/api/sync/start` | Porneste sync in background |
|
||||
| POST | `/api/sync/stop` | Trimite semnal de stop |
|
||||
| GET | `/api/sync/status` | Status curent + progres + last_run |
|
||||
| GET | `/api/sync/history` | Istoric run-uri (paginat) |
|
||||
| GET | `/api/sync/run/{id}` | Detalii run specific |
|
||||
| GET | `/api/sync/run/{id}/log` | Log per comanda (JSON) |
|
||||
| GET | `/api/sync/run/{id}/text-log` | Log text (live din memorie sau reconstruit din SQLite) |
|
||||
| GET | `/api/sync/run/{id}/orders` | Comenzi run filtrate/paginate |
|
||||
| GET | `/api/sync/order/{number}` | Detaliu comanda + items + ARTICOLE_TERTI + factura |
|
||||
|
||||
### Dashboard Comenzi
|
||||
| Method | Path | Descriere |
|
||||
|--------|------|-----------|
|
||||
| GET | `/api/dashboard/orders` | Comenzi cu enrichment factura |
|
||||
| POST | `/api/dashboard/refresh-invoices` | Force-refresh stare facturi + deleted orders |
|
||||
|
||||
**Parametri `/api/dashboard/orders`:**
|
||||
- `period_days`: 3/7/30/90 sau 0 (toate sau interval custom)
|
||||
- `period_start`, `period_end`: interval custom (cand `period_days=0`)
|
||||
- `status`: `all` / `IMPORTED` / `SKIPPED` / `ERROR` / `UNINVOICED` / `INVOICED`
|
||||
- `search`, `sort_by`, `sort_dir`, `page`, `per_page`
|
||||
|
||||
Filtrele `UNINVOICED` si `INVOICED` fac fetch din toate comenzile IMPORTED si filtreaza server-side dupa prezenta/absenta cache-ului de factura.
|
||||
|
||||
### Scheduler
|
||||
| Method | Path | Descriere |
|
||||
|--------|------|-----------|
|
||||
| PUT | `/api/sync/schedule` | Configureaza (enabled, interval_minutes: 5/10/30) |
|
||||
| GET | `/api/sync/schedule` | Status curent |
|
||||
|
||||
Configuratia este persistata in SQLite (`scheduler_config`).
|
||||
|
||||
### Settings
|
||||
| Method | Path | Descriere |
|
||||
|--------|------|-----------|
|
||||
| GET | `/api/settings` | Citeste setari aplicatie |
|
||||
| PUT | `/api/settings` | Salveaza setari |
|
||||
| GET | `/api/settings/sectii` | Lista sectii Oracle |
|
||||
| GET | `/api/settings/politici` | Lista politici preturi Oracle |
|
||||
|
||||
**Setari disponibile:** `transport_codmat`, `transport_vat`, `discount_codmat`, `discount_vat`, `transport_id_pol`, `discount_id_pol`, `id_pol`, `id_pol_productie`, `id_sectie`, `split_discount_vat`, `gomag_api_key`, `gomag_api_shop`, `gomag_order_days_back`, `gomag_limit`
|
||||
|
||||
---
|
||||
|
||||
## Deploy Windows
|
||||
|
||||
### Instalare initiala
|
||||
|
||||
```powershell
|
||||
# Ruleaza ca Administrator
|
||||
.\deploy.ps1
|
||||
```
|
||||
|
||||
Scriptul `deploy.ps1` face automat: git clone, venv, dependinte, detectare Oracle, `start.bat`, serviciu NSSM, configurare IIS reverse proxy.
|
||||
|
||||
### Update cod (pull + restart)
|
||||
|
||||
```powershell
|
||||
# Ca Administrator
|
||||
.\update.ps1
|
||||
```
|
||||
|
||||
Sau manual:
|
||||
```powershell
|
||||
cd C:\gomag-vending
|
||||
git pull origin main
|
||||
nssm restart GoMagVending
|
||||
```
|
||||
|
||||
### Configurare `.env` pe Windows
|
||||
|
||||
```ini
|
||||
# api/.env — exemplu Windows
|
||||
ORACLE_USER=VENDING
|
||||
ORACLE_PASSWORD=****
|
||||
ORACLE_DSN=ROA
|
||||
TNS_ADMIN=C:\roa\instantclient_11_2_0_2
|
||||
INSTANTCLIENTPATH=C:\app\Server\product\18.0.0\dbhomeXE\bin
|
||||
SQLITE_DB_PATH=api/data/import.db
|
||||
JSON_OUTPUT_DIR=api/data/orders
|
||||
APP_PORT=5003
|
||||
ID_POL=39
|
||||
ID_GESTIUNE=0
|
||||
ID_SECTIE=6
|
||||
GOMAG_API_KEY=...
|
||||
GOMAG_API_SHOP=...
|
||||
GOMAG_ORDER_DAYS_BACK=7
|
||||
GOMAG_LIMIT=100
|
||||
```
|
||||
|
||||
**Important:**
|
||||
- `TNS_ADMIN` = folderul care contine `tnsnames.ora` (NU fisierul in sine)
|
||||
- `ORACLE_DSN` = alias-ul exact din `tnsnames.ora`
|
||||
- `INSTANTCLIENTPATH` = calea catre Oracle bin (thick mode, Oracle 10g/11g)
|
||||
- `FORCE_THIN_MODE=true` = elimina necesitatea Instant Client (Oracle 12.1+)
|
||||
- Setarile din `.env` pot fi suprascrise din UI → `Setari` → salvate in SQLite
|
||||
|
||||
### Serviciu Windows (NSSM)
|
||||
|
||||
```powershell
|
||||
nssm restart GoMagVending # restart serviciu
|
||||
nssm status GoMagVending # status serviciu
|
||||
nssm stop GoMagVending # stop serviciu
|
||||
nssm start GoMagVending # start serviciu
|
||||
```
|
||||
|
||||
Loguri serviciu: `logs/service_stdout.log`, `logs/service_stderr.log`
|
||||
Loguri aplicatie: `logs/sync_comenzi_*.log`
|
||||
|
||||
**Nota:** Userul `gomag` nu are drepturi de admin — `nssm restart` necesita PowerShell Administrator direct pe server.
|
||||
|
||||
### Depanare SSH
|
||||
|
||||
```bash
|
||||
# Conectare SSH (PowerShell remote, cheie publica)
|
||||
ssh -i ~/.ssh/id_ed25519 -p 22122 -o StrictHostKeyChecking=no gomag@79.119.86.134
|
||||
|
||||
# Verificare .env
|
||||
powershell -Command "Get-Content C:\gomag-vending\api\.env | Select-String 'ORACLE_'"
|
||||
|
||||
# Test conexiune Oracle
|
||||
C:\gomag-vending\venv\Scripts\python.exe -c "import oracledb, os; os.environ['TNS_ADMIN']='C:/roa/instantclient_11_2_0_2'; conn=oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA'); print('Connected!'); conn.close()"
|
||||
|
||||
# Verificare tnsnames.ora
|
||||
cmd /c type C:\roa\instantclient_11_2_0_2\tnsnames.ora
|
||||
|
||||
# Verificare procese Python (ID-uri pentru kill/restart)
|
||||
powershell -Command "Get-Process python -ErrorAction SilentlyContinue | Format-Table Id, CPU -AutoSize"
|
||||
|
||||
# Verificare loguri recente
|
||||
Get-ChildItem C:\gomag-vending\logs\*.log | Sort-Object LastWriteTime -Descending | Select-Object -First 3
|
||||
|
||||
# Test app (prin nginx reverse proxy)
|
||||
powershell -Command "Invoke-WebRequest -Uri 'http://localhost/gomag/' -UseBasicParsing | Select-Object StatusCode"
|
||||
|
||||
# Retry comanda din linie de comanda
|
||||
powershell -Command "Invoke-WebRequest -Uri 'http://localhost/gomag/api/orders/NRCOMANDA/retry' -Method POST -UseBasicParsing | Select-Object -ExpandProperty Content"
|
||||
```
|
||||
|
||||
#### Deploy pachet Oracle PL/SQL via SSH
|
||||
|
||||
```bash
|
||||
# Metoda corecta: sqlplus cu fisier .pck (contine ambele: PACKAGE + PACKAGE BODY)
|
||||
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
|
||||
"powershell -Command \"echo exit | sqlplus -S VENDING/PAROLA@ROA '@C:\\gomag-vending\\api\\database-scripts\\05_pack_import_parteneri.pck'\""
|
||||
# Output asteptat: "Package created." + "Package body created."
|
||||
```
|
||||
|
||||
#### Restart serviciu FastAPI via SSH
|
||||
|
||||
Userul `gomag` nu are acces la `nssm` sau `sc` (necesita Administrator).
|
||||
Metoda disponibila — kill python + relanseaza start.ps1:
|
||||
|
||||
```bash
|
||||
# 1. Gaseste PID-urile Python
|
||||
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
|
||||
"powershell -Command \"Get-Process python -ErrorAction SilentlyContinue | Format-Table Id, CPU -AutoSize\""
|
||||
|
||||
# 2. Kill + restart (inlocuieste PID1,PID2 cu valorile reale)
|
||||
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
|
||||
"powershell -Command \"Stop-Process -Id PID1,PID2 -Force -ErrorAction SilentlyContinue; Start-Sleep 2; cd C:\\gomag-vending; Start-Process powershell -ArgumentList '-NoExit','-File','start.ps1' -WindowStyle Hidden\""
|
||||
|
||||
# 3. Verifica ca a pornit (asteapta ~5s)
|
||||
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
|
||||
"powershell -Command \"Invoke-WebRequest -Uri 'http://localhost/gomag/' -UseBasicParsing | Select-Object StatusCode\""
|
||||
```
|
||||
|
||||
#### Ce NU merge via SSH (userul gomag fara Administrator)
|
||||
|
||||
| Comanda | Eroare | Alternativa |
|
||||
|---------|--------|-------------|
|
||||
| `nssm restart GoMagVending` | `Error opening service manager!` | Kill python + Start-Process start.ps1 (vezi mai sus) |
|
||||
| `sc query` / `sc stop` | `Access is denied` | Nu exista alternativa — necesita acces direct la server |
|
||||
| `Get-WmiObject Win32_Process` | `Access denied` | `Get-Process` simplu fara CommandLine |
|
||||
| Pipe `\|` in -Command cu ghilimele nested | `An empty pipe element is not allowed` | Scrie SQL in fisier temporar, copiaza cu scp, ruleaza `@fisier.sql` |
|
||||
| `&&` (bash syntax) in PowerShell | `The term '&&' is not recognized` | Foloseste `;` (continua indiferent) sau `-Command "cmd1; cmd2"` |
|
||||
| `-m` flag la `curl` in PowerShell | `Ambiguous parameter name` | Foloseste `Invoke-WebRequest` in loc de curl |
|
||||
| Here-doc `<< 'EOF'` in PowerShell | `Missing file specification` | Scrie fisierul local, copiaza cu scp |
|
||||
|
||||
#### Rulare SQL ad-hoc prin SSH (fara interactiv)
|
||||
|
||||
PowerShell nu suporta pipe catre sqlplus cu ghilimele complexe. Metoda corecta:
|
||||
|
||||
```bash
|
||||
# 1. Scrie SQL local
|
||||
cat > /tmp/query.sql << 'EOF'
|
||||
SELECT coloana FROM tabel WHERE conditie;
|
||||
exit
|
||||
EOF
|
||||
|
||||
# 2. Copiaza pe prod
|
||||
scp -i ~/.ssh/id_ed25519 -P 22122 /tmp/query.sql "gomag@79.119.86.134:C:/gomag-vending/query.sql"
|
||||
|
||||
# 3. Ruleaza
|
||||
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
|
||||
"powershell -Command \"sqlplus -S VENDING/PAROLA@ROA '@C:\\gomag-vending\\query.sql'\""
|
||||
```
|
||||
|
||||
**Nu folosi** `echo 'SQL;' | sqlplus` — PowerShell trateaza `|` diferit si poate esua cu "empty pipe element".
|
||||
|
||||
### Probleme frecvente
|
||||
|
||||
| Eroare | Cauza | Solutie |
|
||||
|--------|-------|---------|
|
||||
| `ORA-12154: TNS:could not resolve` | `TNS_ADMIN` gresit sau `tnsnames.ora` nu contine alias-ul DSN | Verifica `TNS_ADMIN` in `.env` + alias in `tnsnames.ora` |
|
||||
| `ORA-04088: LOGON_AUDIT_TRIGGER` + `Nu aveti licenta pentru PYTHON` | Trigger ROA blocheaza executabile nelicențiate | Adauga `python.exe` (calea completa) in ROASUPORT |
|
||||
| `503 Service Unavailable` pe `/api/articles/search` | Oracle pool nu s-a initializat | Verifica logul `sync_comenzi_*.log` pentru eroarea exacta |
|
||||
| Facturile nu apar in dashboard | Cache SQLite gol — invoice_service nu a putut interoga Oracle | Apasa butonul Refresh Facturi din dashboard sau `POST /api/dashboard/refresh-invoices` |
|
||||
| Comanda apare ca `DELETED_IN_ROA` | Comanda a fost stearsa manual din ROA | Normal — marcat automat la refresh |
|
||||
| Scheduler nu porneste dupa restart | Config pierduta | Verifica SQLite `scheduler_config` sau reconfigureaza din UI |
|
||||
|
||||
---
|
||||
|
||||
## Documentatie Tehnica
|
||||
|
||||
| Fisier | Subiect |
|
||||
|--------|---------|
|
||||
| [docs/oracle-schema-notes.md](docs/oracle-schema-notes.md) | Schema Oracle: tabele comenzi, facturi, preturi, proceduri cheie |
|
||||
| [docs/pack_facturare_analysis.md](docs/pack_facturare_analysis.md) | Analiza flow facturare: call chain, parametri, STOC lookup, FACT-008 |
|
||||
| [scripts/HANDOFF_MAPPING.md](scripts/HANDOFF_MAPPING.md) | Matching GoMag SKU → ROA articole (strategie si rezultate) |
|
||||
|
||||
---
|
||||
|
||||
## WSL2 Note
|
||||
|
||||
- `uvicorn --reload` **nu functioneaza** pe `/mnt/e/` (WSL2 limitation) — restarta manual
|
||||
- Serverul trebuie pornit din **project root**, nu din `api/`
|
||||
- `JSON_OUTPUT_DIR` si `SQLITE_DB_PATH` sunt relative la project root
|
||||
|
||||
29
TODOS.md
29
TODOS.md
@@ -1,29 +0,0 @@
|
||||
# TODOS
|
||||
|
||||
## P2: Refactor sync_service.py in module separate
|
||||
**What:** Split sync_service.py (870 linii) in: download_service, parse_service, sync_orchestrator.
|
||||
**Why:** Faciliteza debugging si testare.
|
||||
**Effort:** M (human: ~1 sapt / CC: ~1-2h)
|
||||
**Context:** Dupa implementarea planului Command Center (retry_service deja extras). sync_service face download + parse + validate + import + invoice check — prea multe responsabilitati.
|
||||
**Depends on:** Finalizarea planului Command Center.
|
||||
|
||||
## P2: Email/webhook alert pe sync esuat
|
||||
**What:** Cand sync-ul gaseste >5 erori sau esueaza complet, trimite un email/webhook.
|
||||
**Why:** Post-lansare, cand app-ul ruleaza automat, nimeni nu sta sa verifice constant.
|
||||
**Effort:** M (human: ~1 sapt / CC: ~1h)
|
||||
**Context:** Depinde de infrastructura email/webhook disponibila la client. Implementare: SMTP simplu sau webhook URL configurabil in Settings.
|
||||
**Depends on:** Lansare in productie + infrastructura email la client.
|
||||
|
||||
## P3: Fix script — handle missing orders in GoMag API
|
||||
**What:** Fix script for 17 address-less orders should check if GoMag API returns data for each order, and report which orders couldn't be fixed.
|
||||
**Why:** Old orders may be deleted or expired from GoMag API. Without this check, the fix script fails silently and the operator thinks all 17 were fixed.
|
||||
**Effort:** S (human: ~10min / CC: ~2min)
|
||||
**Context:** Part of the address overflow fix (Pas 5). The fix script re-downloads from GoMag API to get original address text, but doesn't verify the API response. Add empty-response check + report.
|
||||
**Depends on:** Address parser fix (Pas 1-2) deployed.
|
||||
|
||||
## P3: Cleanup orphan VFP-era addresses in Oracle
|
||||
**What:** One-time script to find and soft-delete partner addresses created by VFP that have no linked orders and incorrect street data.
|
||||
**Why:** After TIER 2 removal, old addresses that were incorrectly reused remain attached to partners. They're cosmetic clutter but not harmful — new addresses are created correctly now.
|
||||
**Effort:** S (human: ~2h / CC: ~10min)
|
||||
**Context:** TIER 2 matched county+city without street, reusing VFP-era addresses with wrong streets. After removal (2026-04-06), new imports create correct addresses. Old wrong addresses stay. Could identify them by: address has id_loc but no linked order rows, and was last modified before 2026-04-06.
|
||||
**Depends on:** TIER 2 removal deployed and verified.
|
||||
@@ -1,86 +0,0 @@
|
||||
# =============================================================================
|
||||
# GoMag Import Manager - Configurare
|
||||
# Copiaza in api/.env si completeaza cu datele reale
|
||||
# =============================================================================
|
||||
|
||||
# =============================================================================
|
||||
# ORACLE MODE - Alege una din urmatoarele doua optiuni:
|
||||
# =============================================================================
|
||||
|
||||
# THICK MODE (Oracle 10g/11g/12.1+) - Recomandat pentru compatibilitate maxima
|
||||
# Necesita Oracle Instant Client instalat
|
||||
INSTANTCLIENTPATH=/opt/oracle/instantclient_21_15
|
||||
|
||||
# THIN MODE (Oracle 12.1+ only) - Fara Instant Client, mai simplu
|
||||
# Comenteaza INSTANTCLIENTPATH de sus si decommenteaza urmatoarea linie:
|
||||
# FORCE_THIN_MODE=true
|
||||
|
||||
# =============================================================================
|
||||
# ORACLE - Credentiale baza de date
|
||||
# =============================================================================
|
||||
|
||||
ORACLE_USER=USER_ORACLE
|
||||
ORACLE_PASSWORD=parola_oracle
|
||||
ORACLE_DSN=TNS_ALIAS
|
||||
|
||||
# Calea absoluta la directorul cu tnsnames.ora
|
||||
# De obicei: directorul api/ al proiectului
|
||||
TNS_ADMIN=/cale/absoluta/la/gomag/api
|
||||
|
||||
# =============================================================================
|
||||
# APLICATIE
|
||||
# =============================================================================
|
||||
|
||||
APP_PORT=5003
|
||||
LOG_LEVEL=INFO
|
||||
|
||||
# =============================================================================
|
||||
# CALE FISIERE
|
||||
# Relative: JSON_OUTPUT_DIR la project root, SQLITE_DB_PATH la api/
|
||||
# Se pot folosi si cai absolute
|
||||
# =============================================================================
|
||||
|
||||
# JSON-uri comenzi GoMag
|
||||
JSON_OUTPUT_DIR=output
|
||||
|
||||
# SQLite tracking DB
|
||||
SQLITE_DB_PATH=data/import.db
|
||||
|
||||
# =============================================================================
|
||||
# ROA - Setari import comenzi (din vfp/settings.ini sectiunea [ROA])
|
||||
# =============================================================================
|
||||
|
||||
# Politica de pret
|
||||
ID_POL=39
|
||||
|
||||
# Gestiune implicita
|
||||
ID_GESTIUNE=0
|
||||
|
||||
# Sectie implicita
|
||||
ID_SECTIE=6
|
||||
|
||||
# =============================================================================
|
||||
# GoMag API
|
||||
# =============================================================================
|
||||
|
||||
GOMAG_API_KEY=your_api_key_here
|
||||
GOMAG_API_SHOP=https://yourstore.gomag.ro
|
||||
GOMAG_ORDER_DAYS_BACK=7
|
||||
GOMAG_LIMIT=100
|
||||
|
||||
# =============================================================================
|
||||
# SMTP - Notificari email (optional)
|
||||
# =============================================================================
|
||||
|
||||
# SMTP_HOST=smtp.gmail.com
|
||||
# SMTP_PORT=587
|
||||
# SMTP_USER=email@exemplu.com
|
||||
# SMTP_PASSWORD=parola_app
|
||||
# SMTP_TO=destinatar@exemplu.com
|
||||
|
||||
# =============================================================================
|
||||
# AUTH - HTTP Basic Auth pentru dashboard (optional)
|
||||
# =============================================================================
|
||||
|
||||
# API_USERNAME=admin
|
||||
# API_PASSWORD=parola_sigura
|
||||
@@ -1,41 +0,0 @@
|
||||
# UNIFIED Dockerfile - AUTO-DETECT Thick/Thin Mode
|
||||
FROM python:3.11-slim as base
|
||||
|
||||
# Set argument for build mode (thick by default for compatibility)
|
||||
ARG ORACLE_MODE=thick
|
||||
|
||||
# Base application setup
|
||||
WORKDIR /app
|
||||
COPY requirements.txt /app/requirements.txt
|
||||
RUN pip3 install -r requirements.txt
|
||||
|
||||
# Oracle Instant Client + SQL*Plus installation (only if thick mode)
|
||||
RUN if [ "$ORACLE_MODE" = "thick" ] ; then \
|
||||
apt-get update && apt-get install -y libaio-dev wget unzip curl && \
|
||||
mkdir -p /opt/oracle && cd /opt/oracle && \
|
||||
wget https://download.oracle.com/otn_software/linux/instantclient/instantclient-basiclite-linuxx64.zip && \
|
||||
wget https://download.oracle.com/otn_software/linux/instantclient/instantclient-sqlplus-linuxx64.zip && \
|
||||
unzip -o instantclient-basiclite-linuxx64.zip && \
|
||||
unzip -o instantclient-sqlplus-linuxx64.zip && \
|
||||
rm -f instantclient-basiclite-linuxx64.zip instantclient-sqlplus-linuxx64.zip && \
|
||||
cd /opt/oracle/instantclient* && \
|
||||
rm -f *jdbc* *mysql* *jar uidrvci genezi adrci && \
|
||||
echo /opt/oracle/instantclient* > /etc/ld.so.conf.d/oracle-instantclient.conf && \
|
||||
ldconfig && \
|
||||
ln -sf /usr/lib/x86_64-linux-gnu/libaio.so.1t64 /usr/lib/x86_64-linux-gnu/libaio.so.1 && \
|
||||
ln -sf /opt/oracle/instantclient*/sqlplus /usr/local/bin/sqlplus ; \
|
||||
else \
|
||||
echo "Thin mode - skipping Oracle Instant Client installation" ; \
|
||||
fi
|
||||
|
||||
# Copy application files
|
||||
COPY . .
|
||||
|
||||
# Create logs directory
|
||||
RUN mkdir -p /app/logs
|
||||
|
||||
# Expose port
|
||||
EXPOSE 5000
|
||||
|
||||
# Run Flask application with auto-detect mode
|
||||
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "admin:app", "--reload", "--access-logfile", "-"]
|
||||
@@ -1,61 +0,0 @@
|
||||
# GoMag Import Manager - FastAPI Application
|
||||
|
||||
Admin interface si orchestrator pentru importul comenzilor GoMag in Oracle ROA.
|
||||
|
||||
## Componente
|
||||
|
||||
### Core
|
||||
- **main.py** - Entry point FastAPI, lifespan (Oracle pool + SQLite init), file logging
|
||||
- **config.py** - Settings via pydantic-settings (citeste .env)
|
||||
- **database.py** - Oracle connection pool + SQLite schema + helpers
|
||||
|
||||
### Routers (HTTP Endpoints)
|
||||
| Router | Prefix | Descriere |
|
||||
|--------|--------|-----------|
|
||||
| health | /health, /api/health | Status Oracle + SQLite |
|
||||
| dashboard | / | Dashboard HTML cu stat cards |
|
||||
| mappings | /mappings, /api/mappings | CRUD ARTICOLE_TERTI + CSV |
|
||||
| articles | /api/articles | Cautare NOM_ARTICOLE |
|
||||
| validation | /api/validate | Scanare + validare SKU-uri |
|
||||
| sync | /sync, /api/sync | Import orchestration + scheduler |
|
||||
|
||||
### Services (Business Logic)
|
||||
| Service | Rol |
|
||||
|---------|-----|
|
||||
| mapping_service | CRUD pe ARTICOLE_TERTI (Oracle) |
|
||||
| article_service | Cautare in NOM_ARTICOLE (Oracle) |
|
||||
| import_service | Port din VFP: partner/address/order creation |
|
||||
| sync_service | Orchestrare: read JSONs → validate → import → log |
|
||||
| price_sync_service | Sync preturi GoMag → Oracle politici de pret |
|
||||
| invoice_service | Verificare facturi ROA + cache SQLite |
|
||||
| validation_service | Batch-validare SKU-uri (chunks of 500) |
|
||||
| order_reader | Citire gomag_orders_page*.json din vfp/output/ |
|
||||
| sqlite_service | CRUD pe SQLite (sync_runs, import_orders, missing_skus) |
|
||||
| scheduler_service | APScheduler - sync periodic configurabil din UI |
|
||||
|
||||
## Rulare
|
||||
|
||||
```bash
|
||||
pip install -r requirements.txt
|
||||
# INTOTDEAUNA via start.sh din project root (seteaza Oracle env vars)
|
||||
cd .. && ./start.sh
|
||||
```
|
||||
|
||||
## Testare
|
||||
|
||||
```bash
|
||||
# Din project root:
|
||||
./test.sh ci # Teste rapide (unit + e2e, ~30s, fara Oracle)
|
||||
./test.sh full # Teste complete (inclusiv Oracle, ~2-3 min)
|
||||
./test.sh unit # Doar unit tests
|
||||
./test.sh e2e # Doar browser tests (Playwright)
|
||||
./test.sh oracle # Doar Oracle integration
|
||||
```
|
||||
|
||||
## Dual Database
|
||||
- **Oracle** - date ERP (ARTICOLE_TERTI, NOM_ARTICOLE, COMENZI)
|
||||
- **SQLite** - tracking local (sync_runs, import_orders, missing_skus, scheduler_config)
|
||||
|
||||
## Logging
|
||||
Log files in `../logs/sync_comenzi_YYYYMMDD_HHMMSS.log`
|
||||
Format: `2026-03-11 14:30:25 | INFO | app.services.sync_service | mesaj`
|
||||
@@ -1,63 +0,0 @@
|
||||
from pydantic_settings import BaseSettings
|
||||
from pydantic import model_validator
|
||||
from pathlib import Path
|
||||
import os
|
||||
|
||||
# Anchored paths - independent of CWD
|
||||
_api_root = Path(__file__).resolve().parent.parent # .../gomag/api/
|
||||
_project_root = _api_root.parent # .../gomag/
|
||||
_env_path = _api_root / ".env"
|
||||
|
||||
class Settings(BaseSettings):
|
||||
# Oracle
|
||||
ORACLE_USER: str = "MARIUSM_AUTO"
|
||||
ORACLE_PASSWORD: str = "ROMFASTSOFT"
|
||||
ORACLE_DSN: str = "ROA_CENTRAL"
|
||||
INSTANTCLIENTPATH: str = ""
|
||||
FORCE_THIN_MODE: bool = False
|
||||
TNS_ADMIN: str = ""
|
||||
|
||||
# SQLite
|
||||
SQLITE_DB_PATH: str = "data/import.db"
|
||||
|
||||
# App
|
||||
APP_PORT: int = 5003
|
||||
LOG_LEVEL: str = "INFO"
|
||||
JSON_OUTPUT_DIR: str = "output"
|
||||
|
||||
# SMTP (optional)
|
||||
SMTP_HOST: str = ""
|
||||
SMTP_PORT: int = 587
|
||||
SMTP_USER: str = ""
|
||||
SMTP_PASSWORD: str = ""
|
||||
SMTP_TO: str = ""
|
||||
|
||||
# Auth (optional)
|
||||
API_USERNAME: str = ""
|
||||
API_PASSWORD: str = ""
|
||||
|
||||
# ROA Import Settings
|
||||
ID_POL: int = 0
|
||||
ID_SECTIE: int = 0
|
||||
|
||||
# GoMag API
|
||||
GOMAG_API_KEY: str = ""
|
||||
GOMAG_API_SHOP: str = ""
|
||||
GOMAG_ORDER_DAYS_BACK: int = 7
|
||||
GOMAG_LIMIT: int = 100
|
||||
GOMAG_API_URL: str = "https://api.gomag.ro/api/v1/order/read/json"
|
||||
|
||||
@model_validator(mode="after")
|
||||
def resolve_paths(self):
|
||||
"""Resolve relative paths against known roots, independent of CWD."""
|
||||
# SQLITE_DB_PATH: relative to api/ root
|
||||
if self.SQLITE_DB_PATH and not os.path.isabs(self.SQLITE_DB_PATH):
|
||||
self.SQLITE_DB_PATH = str(_api_root / self.SQLITE_DB_PATH)
|
||||
# JSON_OUTPUT_DIR: relative to project root
|
||||
if self.JSON_OUTPUT_DIR and not os.path.isabs(self.JSON_OUTPUT_DIR):
|
||||
self.JSON_OUTPUT_DIR = str(_project_root / self.JSON_OUTPUT_DIR)
|
||||
return self
|
||||
|
||||
model_config = {"env_file": str(_env_path), "env_file_encoding": "utf-8", "extra": "ignore"}
|
||||
|
||||
settings = Settings()
|
||||
@@ -1,22 +0,0 @@
|
||||
"""Application-wide constants shared across services, routers, and tests."""
|
||||
from enum import Enum
|
||||
|
||||
|
||||
class OrderStatus(str, Enum):
|
||||
"""Order status values stored in SQLite `orders.status` column.
|
||||
|
||||
Inherits from `str` so existing string comparisons (==, in, dict.get)
|
||||
keep working. Always use `.value` when passing to SQL queries or JSON
|
||||
payloads to avoid Python-version-specific str(enum) surprises.
|
||||
"""
|
||||
IMPORTED = "IMPORTED"
|
||||
ALREADY_IMPORTED = "ALREADY_IMPORTED"
|
||||
SKIPPED = "SKIPPED"
|
||||
ERROR = "ERROR"
|
||||
CANCELLED = "CANCELLED"
|
||||
DELETED_IN_ROA = "DELETED_IN_ROA"
|
||||
# Structural-fail: GoMag sent a payload that cannot be inserted as-is
|
||||
# (missing fields, unparseable date, invalid quantity/price, or a runtime
|
||||
# insert crash). Row persists with status=MALFORMED + error_message so
|
||||
# operators can escalate to GoMag without blocking the rest of the batch.
|
||||
MALFORMED = "MALFORMED"
|
||||
@@ -1,431 +0,0 @@
|
||||
import oracledb
|
||||
import aiosqlite
|
||||
import sqlite3
|
||||
import logging
|
||||
import os
|
||||
from .config import settings
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# ---- Oracle Pool ----
|
||||
pool = None
|
||||
|
||||
def init_oracle():
|
||||
"""Initialize Oracle client mode and create connection pool."""
|
||||
global pool
|
||||
|
||||
force_thin = settings.FORCE_THIN_MODE
|
||||
instantclient_path = settings.INSTANTCLIENTPATH
|
||||
dsn = settings.ORACLE_DSN
|
||||
|
||||
# Ensure TNS_ADMIN is set as OS env var so oracledb can find tnsnames.ora
|
||||
if settings.TNS_ADMIN:
|
||||
os.environ['TNS_ADMIN'] = settings.TNS_ADMIN
|
||||
|
||||
logger.info(f"Oracle config: DSN={dsn}, TNS_ADMIN={settings.TNS_ADMIN or os.environ.get('TNS_ADMIN', '(not set)')}, INSTANTCLIENTPATH={instantclient_path or '(not set)'}")
|
||||
|
||||
if force_thin:
|
||||
logger.info(f"FORCE_THIN_MODE=true: thin mode for {dsn}")
|
||||
elif instantclient_path:
|
||||
try:
|
||||
oracledb.init_oracle_client(lib_dir=instantclient_path)
|
||||
logger.info(f"Thick mode activated for {dsn}")
|
||||
except Exception as e:
|
||||
logger.error(f"Thick mode error: {e}")
|
||||
logger.info("Fallback to thin mode")
|
||||
else:
|
||||
logger.info(f"Thin mode (default) for {dsn}")
|
||||
|
||||
pool = oracledb.create_pool(
|
||||
user=settings.ORACLE_USER,
|
||||
password=settings.ORACLE_PASSWORD,
|
||||
dsn=settings.ORACLE_DSN,
|
||||
min=2,
|
||||
max=4,
|
||||
increment=1
|
||||
)
|
||||
logger.info(f"Oracle pool created for {dsn}")
|
||||
return pool
|
||||
|
||||
def get_oracle_connection():
|
||||
"""Get a connection from the Oracle pool."""
|
||||
if pool is None:
|
||||
raise RuntimeError("Oracle pool not initialized")
|
||||
return pool.acquire()
|
||||
|
||||
def close_oracle():
|
||||
"""Close the Oracle connection pool."""
|
||||
global pool
|
||||
if pool:
|
||||
pool.close()
|
||||
pool = None
|
||||
logger.info("Oracle pool closed")
|
||||
|
||||
# ---- SQLite ----
|
||||
SQLITE_SCHEMA = """
|
||||
CREATE TABLE IF NOT EXISTS sync_runs (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
run_id TEXT UNIQUE,
|
||||
started_at TEXT,
|
||||
finished_at TEXT,
|
||||
status TEXT,
|
||||
total_orders INTEGER DEFAULT 0,
|
||||
imported INTEGER DEFAULT 0,
|
||||
skipped INTEGER DEFAULT 0,
|
||||
errors INTEGER DEFAULT 0,
|
||||
json_files INTEGER DEFAULT 0,
|
||||
error_message TEXT,
|
||||
already_imported INTEGER DEFAULT 0,
|
||||
new_imported INTEGER DEFAULT 0
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS orders (
|
||||
order_number TEXT PRIMARY KEY,
|
||||
order_date TEXT,
|
||||
customer_name TEXT,
|
||||
status TEXT,
|
||||
id_comanda INTEGER,
|
||||
id_partener INTEGER,
|
||||
id_adresa_facturare INTEGER,
|
||||
id_adresa_livrare INTEGER,
|
||||
error_message TEXT,
|
||||
missing_skus TEXT,
|
||||
items_count INTEGER,
|
||||
times_skipped INTEGER DEFAULT 0,
|
||||
first_seen_at TEXT DEFAULT (datetime('now')),
|
||||
last_sync_run_id TEXT REFERENCES sync_runs(run_id),
|
||||
updated_at TEXT DEFAULT (datetime('now')),
|
||||
shipping_name TEXT,
|
||||
billing_name TEXT,
|
||||
payment_method TEXT,
|
||||
delivery_method TEXT,
|
||||
factura_serie TEXT,
|
||||
factura_numar TEXT,
|
||||
factura_total_fara_tva REAL,
|
||||
factura_total_tva REAL,
|
||||
factura_total_cu_tva REAL,
|
||||
factura_data TEXT,
|
||||
invoice_checked_at TEXT,
|
||||
order_total REAL,
|
||||
delivery_cost REAL,
|
||||
discount_total REAL,
|
||||
web_status TEXT,
|
||||
discount_split TEXT
|
||||
);
|
||||
CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status);
|
||||
CREATE INDEX IF NOT EXISTS idx_orders_date ON orders(order_date);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS sync_run_orders (
|
||||
sync_run_id TEXT REFERENCES sync_runs(run_id),
|
||||
order_number TEXT REFERENCES orders(order_number),
|
||||
status_at_run TEXT,
|
||||
PRIMARY KEY (sync_run_id, order_number)
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS missing_skus (
|
||||
sku TEXT PRIMARY KEY,
|
||||
product_name TEXT,
|
||||
first_seen TEXT DEFAULT (datetime('now')),
|
||||
resolved INTEGER DEFAULT 0,
|
||||
resolved_at TEXT,
|
||||
order_count INTEGER DEFAULT 0,
|
||||
order_numbers TEXT,
|
||||
customers TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS scheduler_config (
|
||||
key TEXT PRIMARY KEY,
|
||||
value TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS web_products (
|
||||
sku TEXT PRIMARY KEY,
|
||||
product_name TEXT,
|
||||
first_seen TEXT DEFAULT (datetime('now')),
|
||||
last_seen TEXT DEFAULT (datetime('now')),
|
||||
order_count INTEGER DEFAULT 0
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS app_settings (
|
||||
key TEXT PRIMARY KEY,
|
||||
value TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS price_sync_runs (
|
||||
run_id TEXT PRIMARY KEY,
|
||||
started_at TEXT,
|
||||
finished_at TEXT,
|
||||
status TEXT DEFAULT 'running',
|
||||
products_total INTEGER DEFAULT 0,
|
||||
matched INTEGER DEFAULT 0,
|
||||
updated INTEGER DEFAULT 0,
|
||||
errors INTEGER DEFAULT 0,
|
||||
log_text TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS order_items (
|
||||
order_number TEXT,
|
||||
sku TEXT,
|
||||
product_name TEXT,
|
||||
quantity REAL,
|
||||
price REAL,
|
||||
baseprice REAL,
|
||||
vat REAL,
|
||||
mapping_status TEXT,
|
||||
codmat TEXT,
|
||||
id_articol INTEGER,
|
||||
cantitate_roa REAL,
|
||||
created_at TEXT DEFAULT (datetime('now')),
|
||||
PRIMARY KEY (order_number, sku)
|
||||
);
|
||||
CREATE INDEX IF NOT EXISTS idx_order_items_order ON order_items(order_number);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS anaf_cache (
|
||||
cui TEXT PRIMARY KEY,
|
||||
scp_tva INTEGER,
|
||||
denumire_anaf TEXT,
|
||||
checked_at TEXT NOT NULL
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS sync_phase_failures (
|
||||
run_id TEXT NOT NULL REFERENCES sync_runs(run_id),
|
||||
phase TEXT NOT NULL,
|
||||
error_summary TEXT,
|
||||
created_at TEXT DEFAULT (datetime('now')),
|
||||
PRIMARY KEY (run_id, phase)
|
||||
);
|
||||
CREATE INDEX IF NOT EXISTS idx_spf_phase_time ON sync_phase_failures(phase, created_at);
|
||||
"""
|
||||
|
||||
_sqlite_db_path = None
|
||||
|
||||
def init_sqlite():
|
||||
"""Initialize SQLite database with schema."""
|
||||
global _sqlite_db_path
|
||||
_sqlite_db_path = settings.SQLITE_DB_PATH
|
||||
|
||||
# Ensure directory exists
|
||||
db_dir = os.path.dirname(_sqlite_db_path)
|
||||
if db_dir:
|
||||
os.makedirs(db_dir, exist_ok=True)
|
||||
|
||||
# Create tables synchronously
|
||||
conn = sqlite3.connect(_sqlite_db_path)
|
||||
|
||||
# Check existing tables before running schema
|
||||
cursor = conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
|
||||
existing_tables = {row[0] for row in cursor.fetchall()}
|
||||
|
||||
# Migration: import_orders → orders (one row per order)
|
||||
if 'import_orders' in existing_tables and 'orders' not in existing_tables:
|
||||
logger.info("Migrating import_orders → orders schema...")
|
||||
conn.executescript("""
|
||||
CREATE TABLE orders (
|
||||
order_number TEXT PRIMARY KEY,
|
||||
order_date TEXT,
|
||||
customer_name TEXT,
|
||||
status TEXT,
|
||||
id_comanda INTEGER,
|
||||
id_partener INTEGER,
|
||||
id_adresa_facturare INTEGER,
|
||||
id_adresa_livrare INTEGER,
|
||||
error_message TEXT,
|
||||
missing_skus TEXT,
|
||||
items_count INTEGER,
|
||||
times_skipped INTEGER DEFAULT 0,
|
||||
first_seen_at TEXT DEFAULT (datetime('now')),
|
||||
last_sync_run_id TEXT,
|
||||
updated_at TEXT DEFAULT (datetime('now'))
|
||||
);
|
||||
CREATE INDEX IF NOT EXISTS idx_orders_status ON orders(status);
|
||||
CREATE INDEX IF NOT EXISTS idx_orders_date ON orders(order_date);
|
||||
|
||||
CREATE TABLE sync_run_orders (
|
||||
sync_run_id TEXT,
|
||||
order_number TEXT,
|
||||
status_at_run TEXT,
|
||||
PRIMARY KEY (sync_run_id, order_number)
|
||||
);
|
||||
""")
|
||||
# Copy latest record per order_number into orders
|
||||
# Note: old import_orders didn't have address columns — those stay NULL
|
||||
conn.execute("""
|
||||
INSERT INTO orders
|
||||
(order_number, order_date, customer_name, status,
|
||||
id_comanda, id_partener, error_message, missing_skus,
|
||||
items_count, last_sync_run_id)
|
||||
SELECT io.order_number, io.order_date, io.customer_name, io.status,
|
||||
io.id_comanda, io.id_partener, io.error_message, io.missing_skus,
|
||||
io.items_count, io.sync_run_id
|
||||
FROM import_orders io
|
||||
INNER JOIN (
|
||||
SELECT order_number, MAX(id) as max_id
|
||||
FROM import_orders
|
||||
GROUP BY order_number
|
||||
) latest ON io.id = latest.max_id
|
||||
""")
|
||||
# Populate sync_run_orders from all import_orders rows
|
||||
conn.execute("""
|
||||
INSERT OR IGNORE INTO sync_run_orders (sync_run_id, order_number, status_at_run)
|
||||
SELECT sync_run_id, order_number, status
|
||||
FROM import_orders
|
||||
WHERE sync_run_id IS NOT NULL
|
||||
""")
|
||||
# Migrate order_items: drop sync_run_id, change PK to (order_number, sku)
|
||||
if 'order_items' in existing_tables:
|
||||
conn.executescript("""
|
||||
CREATE TABLE order_items_new (
|
||||
order_number TEXT,
|
||||
sku TEXT,
|
||||
product_name TEXT,
|
||||
quantity REAL,
|
||||
price REAL,
|
||||
vat REAL,
|
||||
mapping_status TEXT,
|
||||
codmat TEXT,
|
||||
id_articol INTEGER,
|
||||
cantitate_roa REAL,
|
||||
created_at TEXT DEFAULT (datetime('now')),
|
||||
PRIMARY KEY (order_number, sku)
|
||||
);
|
||||
INSERT OR IGNORE INTO order_items_new
|
||||
(order_number, sku, product_name, quantity, price, vat,
|
||||
mapping_status, codmat, id_articol, cantitate_roa, created_at)
|
||||
SELECT order_number, sku, product_name, quantity, price, vat,
|
||||
mapping_status, codmat, id_articol, cantitate_roa, created_at
|
||||
FROM order_items;
|
||||
DROP TABLE order_items;
|
||||
ALTER TABLE order_items_new RENAME TO order_items;
|
||||
CREATE INDEX IF NOT EXISTS idx_order_items_order ON order_items(order_number);
|
||||
""")
|
||||
# Rename old table instead of dropping (safety backup)
|
||||
conn.execute("ALTER TABLE import_orders RENAME TO import_orders_bak")
|
||||
conn.commit()
|
||||
logger.info("Migration complete: import_orders → orders")
|
||||
|
||||
conn.executescript(SQLITE_SCHEMA)
|
||||
|
||||
# Migrate: add columns if missing (for existing databases)
|
||||
try:
|
||||
cursor = conn.execute("PRAGMA table_info(missing_skus)")
|
||||
cols = {row[1] for row in cursor.fetchall()}
|
||||
for col, typedef in [("order_count", "INTEGER DEFAULT 0"),
|
||||
("order_numbers", "TEXT"),
|
||||
("customers", "TEXT")]:
|
||||
if col not in cols:
|
||||
conn.execute(f"ALTER TABLE missing_skus ADD COLUMN {col} {typedef}")
|
||||
logger.info(f"Migrated missing_skus: added column {col}")
|
||||
# Migrate sync_runs: add columns
|
||||
cursor = conn.execute("PRAGMA table_info(sync_runs)")
|
||||
sync_cols = {row[1] for row in cursor.fetchall()}
|
||||
if "error_message" not in sync_cols:
|
||||
conn.execute("ALTER TABLE sync_runs ADD COLUMN error_message TEXT")
|
||||
logger.info("Migrated sync_runs: added column error_message")
|
||||
if "already_imported" not in sync_cols:
|
||||
conn.execute("ALTER TABLE sync_runs ADD COLUMN already_imported INTEGER DEFAULT 0")
|
||||
logger.info("Migrated sync_runs: added column already_imported")
|
||||
if "new_imported" not in sync_cols:
|
||||
conn.execute("ALTER TABLE sync_runs ADD COLUMN new_imported INTEGER DEFAULT 0")
|
||||
logger.info("Migrated sync_runs: added column new_imported")
|
||||
|
||||
# Migrate orders: add shipping/billing/payment/delivery + invoice columns
|
||||
cursor = conn.execute("PRAGMA table_info(orders)")
|
||||
order_cols = {row[1] for row in cursor.fetchall()}
|
||||
for col, typedef in [
|
||||
("shipping_name", "TEXT"),
|
||||
("billing_name", "TEXT"),
|
||||
("payment_method", "TEXT"),
|
||||
("delivery_method", "TEXT"),
|
||||
("factura_serie", "TEXT"),
|
||||
("factura_numar", "TEXT"),
|
||||
("factura_total_fara_tva", "REAL"),
|
||||
("factura_total_tva", "REAL"),
|
||||
("factura_total_cu_tva", "REAL"),
|
||||
("factura_data", "TEXT"),
|
||||
("invoice_checked_at", "TEXT"),
|
||||
("order_total", "REAL"),
|
||||
("delivery_cost", "REAL"),
|
||||
("discount_total", "REAL"),
|
||||
("web_status", "TEXT"),
|
||||
("discount_split", "TEXT"),
|
||||
("price_match", "INTEGER"),
|
||||
("cod_fiscal_gomag", "TEXT"),
|
||||
("cod_fiscal_roa", "TEXT"),
|
||||
("denumire_roa", "TEXT"),
|
||||
("anaf_platitor_tva", "INTEGER"),
|
||||
("anaf_checked_at", "TEXT"),
|
||||
("anaf_cod_fiscal_adjusted", "INTEGER DEFAULT 0"),
|
||||
("adresa_livrare_gomag", "TEXT"),
|
||||
("adresa_facturare_gomag", "TEXT"),
|
||||
("adresa_livrare_roa", "TEXT"),
|
||||
("adresa_facturare_roa", "TEXT"),
|
||||
("anaf_denumire_mismatch", "INTEGER DEFAULT 0"),
|
||||
("denumire_anaf", "TEXT"),
|
||||
("address_mismatch", "INTEGER DEFAULT 0"),
|
||||
("partner_mismatch", "INTEGER DEFAULT 0"),
|
||||
]:
|
||||
if col not in order_cols:
|
||||
conn.execute(f"ALTER TABLE orders ADD COLUMN {col} {typedef}")
|
||||
logger.info(f"Migrated orders: added column {col}")
|
||||
|
||||
# Migrate order_items: add baseprice column
|
||||
cursor = conn.execute("PRAGMA table_info(order_items)")
|
||||
oi_cols = {row[1] for row in cursor.fetchall()}
|
||||
if "baseprice" not in oi_cols:
|
||||
conn.execute("ALTER TABLE order_items ADD COLUMN baseprice REAL")
|
||||
conn.execute("UPDATE orders SET price_match = NULL WHERE price_match = 0")
|
||||
logger.info("Migrated order_items: added baseprice; reset price_match for re-check")
|
||||
|
||||
conn.commit()
|
||||
|
||||
# Backfill address_mismatch from stored address JSON
|
||||
_backfill_address_mismatch(conn)
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Migration check failed: {e}")
|
||||
|
||||
conn.close()
|
||||
logger.info(f"SQLite initialized: {_sqlite_db_path}")
|
||||
|
||||
|
||||
def _backfill_address_mismatch(conn):
|
||||
"""Recompute address_mismatch from stored address JSON for all orders."""
|
||||
from .services.sync_service import _addr_match
|
||||
try:
|
||||
rows = conn.execute("""
|
||||
SELECT order_number, adresa_livrare_gomag, adresa_livrare_roa,
|
||||
adresa_facturare_gomag, adresa_facturare_roa
|
||||
FROM orders
|
||||
WHERE adresa_livrare_roa IS NOT NULL OR adresa_facturare_roa IS NOT NULL
|
||||
""").fetchall()
|
||||
updated = 0
|
||||
for r in rows:
|
||||
livr_ok = _addr_match(r[1], r[2])
|
||||
fact_ok = _addr_match(r[3], r[4])
|
||||
new_val = 1 if (not livr_ok or not fact_ok) else 0
|
||||
conn.execute(
|
||||
"UPDATE orders SET address_mismatch = ? WHERE order_number = ?",
|
||||
(new_val, r[0])
|
||||
)
|
||||
updated += 1
|
||||
if updated:
|
||||
conn.commit()
|
||||
logger.info(f"Backfill address_mismatch: {updated} orders recomputed")
|
||||
except Exception as e:
|
||||
logger.warning(f"Backfill address_mismatch failed: {e}")
|
||||
|
||||
async def get_sqlite():
|
||||
"""Get async SQLite connection."""
|
||||
if _sqlite_db_path is None:
|
||||
raise RuntimeError("SQLite not initialized")
|
||||
db = await aiosqlite.connect(_sqlite_db_path)
|
||||
db.row_factory = aiosqlite.Row
|
||||
return db
|
||||
|
||||
def get_sqlite_sync():
|
||||
"""Get synchronous SQLite connection."""
|
||||
if _sqlite_db_path is None:
|
||||
raise RuntimeError("SQLite not initialized")
|
||||
conn = sqlite3.connect(_sqlite_db_path)
|
||||
conn.row_factory = sqlite3.Row
|
||||
return conn
|
||||
@@ -1,91 +0,0 @@
|
||||
from contextlib import asynccontextmanager
|
||||
from datetime import datetime
|
||||
from fastapi import FastAPI
|
||||
from fastapi.staticfiles import StaticFiles
|
||||
from pathlib import Path
|
||||
import logging
|
||||
import os
|
||||
|
||||
from .config import settings
|
||||
from .database import init_oracle, close_oracle, init_sqlite
|
||||
|
||||
# Configure logging with both stream and file handlers
|
||||
_log_level = getattr(logging, settings.LOG_LEVEL.upper(), logging.INFO)
|
||||
_log_format = '%(asctime)s | %(levelname)s | %(name)s | %(message)s'
|
||||
_formatter = logging.Formatter(_log_format)
|
||||
|
||||
_stream_handler = logging.StreamHandler()
|
||||
_stream_handler.setFormatter(_formatter)
|
||||
|
||||
_log_dir = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'logs')
|
||||
os.makedirs(_log_dir, exist_ok=True)
|
||||
_log_filename = f"sync_comenzi_{datetime.now().strftime('%Y%m%d_%H%M%S')}.log"
|
||||
_file_handler = logging.FileHandler(os.path.join(_log_dir, _log_filename), encoding='utf-8')
|
||||
_file_handler.setFormatter(_formatter)
|
||||
|
||||
_root_logger = logging.getLogger()
|
||||
_root_logger.setLevel(_log_level)
|
||||
_root_logger.addHandler(_stream_handler)
|
||||
_root_logger.addHandler(_file_handler)
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@asynccontextmanager
|
||||
async def lifespan(app: FastAPI):
|
||||
"""Startup and shutdown events."""
|
||||
logger.info("Starting GoMag Import Manager...")
|
||||
|
||||
# Initialize Oracle pool
|
||||
try:
|
||||
init_oracle()
|
||||
except Exception as e:
|
||||
logger.error(f"Oracle init failed: {e}")
|
||||
# Allow app to start even without Oracle for development
|
||||
|
||||
# Initialize SQLite
|
||||
init_sqlite()
|
||||
|
||||
# Initialize scheduler (restore saved config)
|
||||
from .services import scheduler_service, sqlite_service
|
||||
scheduler_service.init_scheduler()
|
||||
try:
|
||||
config = await sqlite_service.get_scheduler_config()
|
||||
if config.get("enabled") == "True":
|
||||
interval = int(config.get("interval_minutes", "10"))
|
||||
scheduler_service.start_scheduler(interval)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
logger.info("GoMag Import Manager started")
|
||||
yield
|
||||
|
||||
# Shutdown
|
||||
scheduler_service.shutdown_scheduler()
|
||||
close_oracle()
|
||||
logger.info("GoMag Import Manager stopped")
|
||||
|
||||
app = FastAPI(
|
||||
title="GoMag Import Manager",
|
||||
description="Import comenzi web GoMag → ROA Oracle",
|
||||
version="1.0.0",
|
||||
lifespan=lifespan
|
||||
)
|
||||
|
||||
# Static files and templates
|
||||
static_dir = Path(__file__).parent / "static"
|
||||
templates_dir = Path(__file__).parent / "templates"
|
||||
static_dir.mkdir(parents=True, exist_ok=True)
|
||||
(static_dir / "css").mkdir(exist_ok=True)
|
||||
(static_dir / "js").mkdir(exist_ok=True)
|
||||
templates_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
app.mount("/static", StaticFiles(directory=str(static_dir)), name="static")
|
||||
|
||||
# Include routers
|
||||
from .routers import health, dashboard, mappings, articles, validation, sync
|
||||
app.include_router(health.router)
|
||||
app.include_router(dashboard.router)
|
||||
app.include_router(mappings.router)
|
||||
app.include_router(articles.router)
|
||||
app.include_router(validation.router)
|
||||
app.include_router(sync.router)
|
||||
@@ -1,10 +0,0 @@
|
||||
from fastapi import APIRouter, Query
|
||||
|
||||
from ..services import article_service
|
||||
|
||||
router = APIRouter(prefix="/api/articles", tags=["articles"])
|
||||
|
||||
@router.get("/search")
|
||||
def search_articles(q: str = Query("", min_length=2)):
|
||||
results = article_service.search_articles(q)
|
||||
return {"results": results}
|
||||
@@ -1,23 +0,0 @@
|
||||
from fastapi import APIRouter, Request
|
||||
from fastapi.templating import Jinja2Templates
|
||||
from fastapi.responses import HTMLResponse
|
||||
from pathlib import Path
|
||||
|
||||
from ..services import sqlite_service
|
||||
from ..constants import OrderStatus
|
||||
|
||||
router = APIRouter()
|
||||
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
|
||||
templates.env.globals["OrderStatus"] = OrderStatus
|
||||
|
||||
@router.get("/", response_class=HTMLResponse)
|
||||
async def dashboard(request: Request):
|
||||
return templates.TemplateResponse("dashboard.html", {"request": request})
|
||||
|
||||
@router.get("/missing-skus", response_class=HTMLResponse)
|
||||
async def missing_skus_page(request: Request):
|
||||
return templates.TemplateResponse("missing_skus.html", {"request": request})
|
||||
|
||||
@router.get("/settings", response_class=HTMLResponse)
|
||||
async def settings_page(request: Request):
|
||||
return templates.TemplateResponse("settings.html", {"request": request})
|
||||
@@ -1,30 +0,0 @@
|
||||
from fastapi import APIRouter
|
||||
from .. import database
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
@router.get("/health")
|
||||
async def health_check():
|
||||
result = {"oracle": "error", "sqlite": "error"}
|
||||
|
||||
# Check Oracle
|
||||
try:
|
||||
if database.pool:
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("SELECT SYSDATE FROM DUAL")
|
||||
cur.fetchone()
|
||||
result["oracle"] = "ok"
|
||||
except Exception as e:
|
||||
result["oracle"] = str(e)
|
||||
|
||||
# Check SQLite
|
||||
try:
|
||||
db = await database.get_sqlite()
|
||||
await db.execute("SELECT 1")
|
||||
await db.close()
|
||||
result["sqlite"] = "ok"
|
||||
except Exception as e:
|
||||
result["sqlite"] = str(e)
|
||||
|
||||
return result
|
||||
@@ -1,190 +0,0 @@
|
||||
from fastapi import APIRouter, Query, Request, UploadFile, File
|
||||
from fastapi.responses import StreamingResponse, HTMLResponse, JSONResponse
|
||||
from fastapi.templating import Jinja2Templates
|
||||
from fastapi import HTTPException
|
||||
from pydantic import BaseModel, validator
|
||||
from pathlib import Path
|
||||
from typing import Optional
|
||||
import io
|
||||
import asyncio
|
||||
|
||||
from ..services import mapping_service, sqlite_service, validation_service
|
||||
|
||||
import logging
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
router = APIRouter(tags=["mappings"])
|
||||
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
|
||||
|
||||
class MappingCreate(BaseModel):
|
||||
sku: str
|
||||
codmat: str
|
||||
cantitate_roa: float = 1
|
||||
|
||||
@validator('sku', 'codmat')
|
||||
def not_empty(cls, v):
|
||||
if not v or not v.strip():
|
||||
raise ValueError('nu poate fi gol')
|
||||
return v.strip()
|
||||
|
||||
class MappingUpdate(BaseModel):
|
||||
cantitate_roa: Optional[float] = None
|
||||
activ: Optional[int] = None
|
||||
|
||||
class MappingEdit(BaseModel):
|
||||
new_sku: str
|
||||
new_codmat: str
|
||||
cantitate_roa: float = 1
|
||||
|
||||
@validator('new_sku', 'new_codmat')
|
||||
def not_empty(cls, v):
|
||||
if not v or not v.strip():
|
||||
raise ValueError('nu poate fi gol')
|
||||
return v.strip()
|
||||
|
||||
class MappingLine(BaseModel):
|
||||
codmat: str
|
||||
cantitate_roa: float = 1
|
||||
|
||||
class MappingBatchCreate(BaseModel):
|
||||
sku: str
|
||||
mappings: list[MappingLine]
|
||||
auto_restore: bool = False
|
||||
|
||||
# HTML page
|
||||
@router.get("/mappings", response_class=HTMLResponse)
|
||||
async def mappings_page(request: Request):
|
||||
return templates.TemplateResponse("mappings.html", {"request": request})
|
||||
|
||||
# API endpoints
|
||||
@router.get("/api/mappings")
|
||||
async def list_mappings(search: str = "", page: int = 1, per_page: int = 50,
|
||||
sort_by: str = "sku", sort_dir: str = "asc",
|
||||
show_deleted: bool = False):
|
||||
app_settings = await sqlite_service.get_app_settings()
|
||||
id_pol = int(app_settings.get("id_pol") or 0) or None
|
||||
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
|
||||
|
||||
result = mapping_service.get_mappings(search=search, page=page, per_page=per_page,
|
||||
sort_by=sort_by, sort_dir=sort_dir,
|
||||
show_deleted=show_deleted,
|
||||
id_pol=id_pol, id_pol_productie=id_pol_productie)
|
||||
# Merge product names from web_products (R4)
|
||||
skus = list({m["sku"] for m in result.get("mappings", [])})
|
||||
product_names = await sqlite_service.get_web_products_batch(skus)
|
||||
for m in result.get("mappings", []):
|
||||
m["product_name"] = product_names.get(m["sku"], "")
|
||||
# Ensure counts key is always present
|
||||
if "counts" not in result:
|
||||
result["counts"] = {"total": 0}
|
||||
return result
|
||||
|
||||
@router.post("/api/mappings")
|
||||
async def create_mapping(data: MappingCreate):
|
||||
try:
|
||||
result = mapping_service.create_mapping(data.sku, data.codmat, data.cantitate_roa)
|
||||
# Mark SKU as resolved in missing_skus tracking
|
||||
await sqlite_service.resolve_missing_sku(data.sku)
|
||||
return {"success": True, **result}
|
||||
except HTTPException as e:
|
||||
can_restore = e.headers.get("X-Can-Restore") == "true" if e.headers else False
|
||||
resp: dict = {"error": e.detail}
|
||||
if can_restore:
|
||||
resp["can_restore"] = True
|
||||
return JSONResponse(status_code=e.status_code, content=resp)
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.put("/api/mappings/{sku}/{codmat}")
|
||||
def update_mapping(sku: str, codmat: str, data: MappingUpdate):
|
||||
try:
|
||||
updated = mapping_service.update_mapping(sku, codmat, data.cantitate_roa, data.activ)
|
||||
return {"success": updated}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.put("/api/mappings/{sku}/{codmat}/edit")
|
||||
def edit_mapping(sku: str, codmat: str, data: MappingEdit):
|
||||
try:
|
||||
result = mapping_service.edit_mapping(sku, codmat, data.new_sku, data.new_codmat,
|
||||
data.cantitate_roa)
|
||||
return {"success": result}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.delete("/api/mappings/{sku}/{codmat}")
|
||||
def delete_mapping(sku: str, codmat: str):
|
||||
try:
|
||||
deleted = mapping_service.delete_mapping(sku, codmat)
|
||||
return {"success": deleted}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.post("/api/mappings/{sku}/{codmat}/restore")
|
||||
def restore_mapping(sku: str, codmat: str):
|
||||
try:
|
||||
restored = mapping_service.restore_mapping(sku, codmat)
|
||||
return {"success": restored}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.post("/api/mappings/batch")
|
||||
async def create_batch_mapping(data: MappingBatchCreate):
|
||||
"""Create multiple (sku, codmat) rows for complex sets (R11)."""
|
||||
if not data.mappings:
|
||||
return {"success": False, "error": "No mappings provided"}
|
||||
|
||||
try:
|
||||
results = []
|
||||
for m in data.mappings:
|
||||
r = mapping_service.create_mapping(data.sku, m.codmat, m.cantitate_roa, auto_restore=data.auto_restore)
|
||||
results.append(r)
|
||||
# Mark SKU as resolved in missing_skus tracking
|
||||
await sqlite_service.resolve_missing_sku(data.sku)
|
||||
return {"success": True, "created": len(results)}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
|
||||
@router.get("/api/mappings/prices")
|
||||
async def get_mapping_prices(sku: str = Query(...)):
|
||||
"""Get component prices from crm_politici_pret_art for a kit SKU."""
|
||||
app_settings = await sqlite_service.get_app_settings()
|
||||
id_pol = int(app_settings.get("id_pol") or 0) or None
|
||||
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
|
||||
if not id_pol:
|
||||
return {"error": "Politica de pret nu este configurata", "prices": []}
|
||||
try:
|
||||
prices = await asyncio.to_thread(
|
||||
mapping_service.get_component_prices, sku, id_pol, id_pol_productie
|
||||
)
|
||||
return {"prices": prices}
|
||||
except Exception as e:
|
||||
return {"error": str(e), "prices": []}
|
||||
|
||||
|
||||
@router.post("/api/mappings/import-csv")
|
||||
async def import_csv(file: UploadFile = File(...)):
|
||||
content = await file.read()
|
||||
text = content.decode("utf-8-sig")
|
||||
result = mapping_service.import_csv(text)
|
||||
await validation_service.reconcile_unresolved_missing_skus()
|
||||
return result
|
||||
|
||||
@router.get("/api/mappings/export-csv")
|
||||
def export_csv():
|
||||
csv_content = mapping_service.export_csv()
|
||||
return StreamingResponse(
|
||||
io.BytesIO(csv_content.encode("utf-8-sig")),
|
||||
media_type="text/csv",
|
||||
headers={"Content-Disposition": "attachment; filename=mappings.csv"}
|
||||
)
|
||||
|
||||
@router.get("/api/mappings/csv-template")
|
||||
def csv_template():
|
||||
content = mapping_service.get_csv_template()
|
||||
return StreamingResponse(
|
||||
io.BytesIO(content.encode("utf-8-sig")),
|
||||
media_type="text/csv",
|
||||
headers={"Content-Disposition": "attachment; filename=mappings_template.csv"}
|
||||
)
|
||||
File diff suppressed because it is too large
Load Diff
@@ -1,158 +0,0 @@
|
||||
import csv
|
||||
import io
|
||||
import json
|
||||
from fastapi import APIRouter, Query
|
||||
from fastapi.responses import StreamingResponse
|
||||
|
||||
from ..services import order_reader, validation_service, sqlite_service
|
||||
from ..database import get_sqlite
|
||||
|
||||
router = APIRouter(prefix="/api/validate", tags=["validation"])
|
||||
|
||||
@router.post("/scan")
|
||||
async def scan_and_validate():
|
||||
"""Scan JSON files and validate all SKUs."""
|
||||
orders, json_count = order_reader.read_json_orders()
|
||||
|
||||
if not orders:
|
||||
return {
|
||||
"orders": 0, "json_files": json_count, "skus": {}, "message": "No orders found",
|
||||
"total_skus_scanned": 0, "new_missing": 0, "auto_resolved": 0, "unchanged": 0,
|
||||
}
|
||||
|
||||
all_skus = order_reader.get_all_skus(orders)
|
||||
result = validation_service.validate_skus(all_skus)
|
||||
importable, skipped = validation_service.classify_orders(orders, result)
|
||||
|
||||
# Build SKU context from skipped orders and track missing SKUs
|
||||
sku_context = {} # sku -> {order_numbers: [], customers: []}
|
||||
for order, missing_list in skipped:
|
||||
customer = order.billing.company_name or f"{order.billing.lastname} {order.billing.firstname}"
|
||||
for sku in missing_list:
|
||||
if sku not in sku_context:
|
||||
sku_context[sku] = {"order_numbers": [], "customers": []}
|
||||
sku_context[sku]["order_numbers"].append(order.number)
|
||||
if customer not in sku_context[sku]["customers"]:
|
||||
sku_context[sku]["customers"].append(customer)
|
||||
|
||||
new_missing = 0
|
||||
for sku in result["missing"]:
|
||||
# Find product name from orders
|
||||
product_name = ""
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
if item.sku == sku:
|
||||
product_name = item.name
|
||||
break
|
||||
if product_name:
|
||||
break
|
||||
|
||||
ctx = sku_context.get(sku, {})
|
||||
tracked = await sqlite_service.track_missing_sku(
|
||||
sku=sku,
|
||||
product_name=product_name,
|
||||
order_count=len(ctx.get("order_numbers", [])),
|
||||
order_numbers=json.dumps(ctx.get("order_numbers", [])),
|
||||
customers=json.dumps(ctx.get("customers", []))
|
||||
)
|
||||
if tracked:
|
||||
new_missing += 1
|
||||
|
||||
rec = await validation_service.reconcile_unresolved_missing_skus()
|
||||
|
||||
total_skus_scanned = len(all_skus)
|
||||
new_missing_count = len(result["missing"])
|
||||
unchanged = total_skus_scanned - new_missing_count
|
||||
|
||||
return {
|
||||
"json_files": json_count,
|
||||
"total_orders": len(orders),
|
||||
"total_skus": len(all_skus),
|
||||
"importable": len(importable),
|
||||
"skipped": len(skipped),
|
||||
"new_orders": len(importable),
|
||||
# Fields consumed by the rescan progress banner in missing_skus.html
|
||||
"total_skus_scanned": total_skus_scanned,
|
||||
"new_missing": new_missing_count,
|
||||
"auto_resolved": rec["resolved"],
|
||||
"unchanged": unchanged,
|
||||
"skus": {
|
||||
"mapped": len(result["mapped"]),
|
||||
"direct": len(result["direct"]),
|
||||
"missing": len(result["missing"]),
|
||||
"missing_list": sorted(result["missing"]),
|
||||
"total_skus": len(all_skus),
|
||||
"mapped_skus": len(result["mapped"]),
|
||||
"direct_skus": len(result["direct"])
|
||||
},
|
||||
"skipped_orders": [
|
||||
{
|
||||
"number": order.number,
|
||||
"customer": order.billing.company_name or f"{order.billing.lastname} {order.billing.firstname}",
|
||||
"items_count": len(order.items),
|
||||
"missing_skus": missing
|
||||
}
|
||||
for order, missing in skipped[:50] # limit to 50
|
||||
]
|
||||
}
|
||||
|
||||
@router.get("/missing-skus")
|
||||
async def get_missing_skus(
|
||||
page: int = Query(1, ge=1),
|
||||
per_page: int = Query(20, ge=1, le=100),
|
||||
resolved: int = Query(0, ge=-1, le=1),
|
||||
search: str = Query(None)
|
||||
):
|
||||
"""Get paginated missing SKUs. resolved=-1 means show all (R10).
|
||||
Optional search filters by sku or product_name."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
# Compute counts across ALL records (unfiltered by search)
|
||||
cursor = await db.execute("SELECT COUNT(*) FROM missing_skus WHERE resolved = 0")
|
||||
unresolved_count = (await cursor.fetchone())[0]
|
||||
cursor = await db.execute("SELECT COUNT(*) FROM missing_skus WHERE resolved = 1")
|
||||
resolved_count = (await cursor.fetchone())[0]
|
||||
cursor = await db.execute("SELECT COUNT(*) FROM missing_skus")
|
||||
total_count = (await cursor.fetchone())[0]
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
counts = {
|
||||
"total": total_count,
|
||||
"unresolved": unresolved_count,
|
||||
"resolved": resolved_count,
|
||||
}
|
||||
|
||||
result = await sqlite_service.get_missing_skus_paginated(page, per_page, resolved, search=search)
|
||||
# Backward compat
|
||||
result["unresolved"] = unresolved_count
|
||||
result["counts"] = counts
|
||||
# rename key for JS consistency
|
||||
result["skus"] = result.get("missing_skus", [])
|
||||
return result
|
||||
|
||||
@router.get("/missing-skus-csv")
|
||||
async def export_missing_skus_csv():
|
||||
"""Export missing SKUs as CSV compatible with mapping import (R8)."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
cursor = await db.execute("""
|
||||
SELECT sku, product_name, first_seen, resolved
|
||||
FROM missing_skus WHERE resolved = 0
|
||||
ORDER BY first_seen DESC
|
||||
""")
|
||||
rows = await cursor.fetchall()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(["sku", "codmat", "cantitate_roa", "procent_pret", "product_name"])
|
||||
for row in rows:
|
||||
writer.writerow([row["sku"], "", "", "", row["product_name"] or ""])
|
||||
|
||||
return StreamingResponse(
|
||||
io.BytesIO(output.getvalue().encode("utf-8-sig")),
|
||||
media_type="text/csv",
|
||||
headers={"Content-Disposition": "attachment; filename=missing_skus.csv"}
|
||||
)
|
||||
@@ -1,216 +0,0 @@
|
||||
import re
|
||||
import logging
|
||||
import httpx
|
||||
import asyncio
|
||||
from datetime import datetime
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Romanian diacritics to ASCII mapping (same 14 chars as import_service)
|
||||
_DIACRITICS = str.maketrans('ĂăÂâÎîȘșȚțŞşŢţ', 'AAAAIISSTTSSTT')
|
||||
|
||||
|
||||
def strip_ro_prefix(cod_fiscal: str) -> str:
|
||||
"""Normalize CUI: strip whitespace, uppercase, remove 'RO' prefix, fix OCR-like typos."""
|
||||
if not cod_fiscal:
|
||||
return ""
|
||||
cleaned = cod_fiscal.strip().upper()
|
||||
cleaned = re.sub(r'^RO\s*', '', cleaned)
|
||||
# Fix common character confusions in CUI (O→0, I→1, L→1, B→8)
|
||||
cleaned = cleaned.translate(str.maketrans('OIL', '011'))
|
||||
return cleaned
|
||||
|
||||
|
||||
def validate_cui(bare_cui: str) -> bool:
|
||||
"""Validate bare CUI: digits only, length 2-10."""
|
||||
if not bare_cui:
|
||||
return False
|
||||
return bare_cui.isdigit() and 2 <= len(bare_cui) <= 10
|
||||
|
||||
|
||||
# Cheia de testare CUI Romania (9 ponderi, aliniate la dreapta cu cifrele fara cifra de control)
|
||||
_CUI_KEY = [7, 5, 3, 2, 1, 7, 5, 3, 2]
|
||||
|
||||
|
||||
def validate_cui_checksum(bare_cui: str) -> bool:
|
||||
"""Validate CUI check digit using the Romanian algorithm.
|
||||
|
||||
Algorithm: pad to 9 digits (without check digit), multiply by key 753217532,
|
||||
sum products, (sum * 10) % 11 → if 10 then 0, else result == check digit.
|
||||
"""
|
||||
if not validate_cui(bare_cui):
|
||||
return False
|
||||
digits = [int(d) for d in bare_cui]
|
||||
check_digit = digits[-1]
|
||||
body = digits[:-1]
|
||||
padded = [0] * (9 - len(body)) + body
|
||||
total = sum(d * k for d, k in zip(padded, _CUI_KEY))
|
||||
result = (total * 10) % 11
|
||||
if result == 10:
|
||||
result = 0
|
||||
return result == check_digit
|
||||
|
||||
|
||||
def sanitize_cui(raw_cf: str) -> tuple[str, str | None]:
|
||||
"""Sanitize and validate CUI. Returns (clean_cui, warning_or_none).
|
||||
|
||||
Steps: strip RO prefix, fix OCR typos (O→0), validate checksum.
|
||||
If sanitized version passes checksum but original didn't, returns the fixed CUI.
|
||||
If neither passes, returns original with warning.
|
||||
"""
|
||||
bare = strip_ro_prefix(raw_cf)
|
||||
if not bare:
|
||||
return bare, None
|
||||
|
||||
if validate_cui(bare) and validate_cui_checksum(bare):
|
||||
return bare, None
|
||||
|
||||
# Sanitized version passes format but not checksum
|
||||
if validate_cui(bare):
|
||||
return bare, f"CUI {bare} nu trece verificarea cifrei de control"
|
||||
|
||||
# Not even valid format
|
||||
return bare, f"CUI {raw_cf!r} contine caractere invalide dupa sanitizare: {bare!r}"
|
||||
|
||||
|
||||
async def check_vat_status_batch(cui_list: list[str], date: str = None, log_fn=None) -> dict[str, dict]:
|
||||
"""POST to ANAF API to check VAT status for a batch of CUIs.
|
||||
|
||||
Chunks in batches of 500 (ANAF API limit).
|
||||
Returns {cui_str: {"scpTVA": bool|None, "denumire_anaf": str, "checked_at": str}, ...}
|
||||
"""
|
||||
if not cui_list:
|
||||
return {}
|
||||
|
||||
check_date = date or datetime.now().strftime("%Y-%m-%d")
|
||||
results = {}
|
||||
|
||||
for i in range(0, len(cui_list), 500):
|
||||
chunk = cui_list[i:i+500]
|
||||
body = [{"cui": int(cui), "data": check_date} for cui in chunk if cui.isdigit()]
|
||||
if not body:
|
||||
continue
|
||||
|
||||
chunk_results = await _call_anaf_api(body, log_fn=log_fn)
|
||||
results.update(chunk_results)
|
||||
|
||||
return results
|
||||
|
||||
|
||||
async def _call_anaf_api(body: list[dict], retry: int = 0, log_fn=None) -> dict[str, dict]:
|
||||
"""Internal: single ANAF API call with retry logic."""
|
||||
url = "https://webservicesp.anaf.ro/api/PlatitorTvaRest/v9/tva"
|
||||
results = {}
|
||||
|
||||
def _log_error(msg: str):
|
||||
logger.error(msg)
|
||||
if log_fn:
|
||||
log_fn(f"ANAF eroare: {msg}")
|
||||
|
||||
def _log_warning(msg: str):
|
||||
logger.warning(msg)
|
||||
if log_fn:
|
||||
log_fn(f"ANAF warn: {msg}")
|
||||
|
||||
try:
|
||||
async with httpx.AsyncClient(timeout=10.0) as client:
|
||||
response = await client.post(url, json=body)
|
||||
|
||||
if response.status_code == 429:
|
||||
if retry < 1:
|
||||
_log_warning("ANAF API rate limited (429), retrying in 10s...")
|
||||
await asyncio.sleep(10)
|
||||
return await _call_anaf_api(body, retry + 1, log_fn)
|
||||
_log_error("ANAF API rate limited after retry")
|
||||
return {}
|
||||
|
||||
if response.status_code >= 500:
|
||||
if retry < 1:
|
||||
_log_warning(f"ANAF API server error ({response.status_code}), retrying in 3s...")
|
||||
await asyncio.sleep(3)
|
||||
return await _call_anaf_api(body, retry + 1, log_fn)
|
||||
_log_error(f"ANAF API server error after retry: {response.status_code}")
|
||||
return {}
|
||||
|
||||
if 400 <= response.status_code < 500:
|
||||
_log_error(f"ANAF API client error {response.status_code} (nu se reincearca)")
|
||||
return {}
|
||||
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
checked_at = datetime.now().isoformat()
|
||||
|
||||
# CONTRACT (consumed by sync_service.evaluate_cui_gate):
|
||||
# Return {} → transient error (down/429/5xx/timeout)
|
||||
# Return {cui: {scpTVA: None, denumire_anaf: ""}} → ANAF notFound explicit
|
||||
# Return {cui: {scpTVA: bool, denumire_anaf: str}} → ANAF found
|
||||
# If you change this semantics, update the gate in sync_service too.
|
||||
|
||||
# Parse ANAF response
|
||||
found_list = data.get("found", [])
|
||||
for item in found_list:
|
||||
date_generals = item.get("date_generale", {})
|
||||
cui_str = str(date_generals.get("cui", ""))
|
||||
results[cui_str] = {
|
||||
"scpTVA": item.get("inregistrare_scop_Tva", {}).get("scpTVA"),
|
||||
"denumire_anaf": date_generals.get("denumire", ""),
|
||||
"checked_at": checked_at,
|
||||
}
|
||||
|
||||
# Not found CUIs — ANAF returns plain integers (CUI values), not dicts
|
||||
notfound_list = data.get("notFound", [])
|
||||
for item in notfound_list:
|
||||
if isinstance(item, int):
|
||||
cui_str = str(item)
|
||||
else:
|
||||
date_gen = item.get("date_generale", {})
|
||||
cui_str = str(date_gen.get("cui", item.get("cui", "")))
|
||||
results[cui_str] = {
|
||||
"scpTVA": None,
|
||||
"denumire_anaf": "",
|
||||
"checked_at": checked_at,
|
||||
}
|
||||
|
||||
logger.info(f"ANAF batch: {len(body)} CUIs → {len(found_list)} found, {len(notfound_list)} not found")
|
||||
|
||||
except httpx.TimeoutException:
|
||||
if retry < 1:
|
||||
_log_warning("ANAF API timeout, retrying in 3s...")
|
||||
await asyncio.sleep(3)
|
||||
return await _call_anaf_api(body, retry + 1, log_fn)
|
||||
_log_error("ANAF API timeout after retry")
|
||||
except Exception as e:
|
||||
if retry < 1:
|
||||
_log_warning(f"ANAF API error: {e}, retrying in 3s...")
|
||||
await asyncio.sleep(3)
|
||||
return await _call_anaf_api(body, retry + 1, log_fn)
|
||||
_log_error(f"ANAF API error after retry: {e}")
|
||||
|
||||
return results
|
||||
|
||||
|
||||
def determine_correct_cod_fiscal(bare_cui: str, is_vat_payer: bool | None) -> str:
|
||||
"""Determine the correct cod_fiscal format based on ANAF VAT status.
|
||||
True → "RO" + bare, False → bare, None → bare (conservative)
|
||||
"""
|
||||
if is_vat_payer is True:
|
||||
return "RO" + bare_cui
|
||||
return bare_cui
|
||||
|
||||
|
||||
def normalize_company_name(name: str) -> str:
|
||||
"""Normalize company name for comparison: strip SRL/SA suffixes, diacritics, punctuation."""
|
||||
if not name:
|
||||
return ""
|
||||
result = name.strip().upper()
|
||||
# Strip diacritics
|
||||
result = result.translate(_DIACRITICS)
|
||||
# Remove common suffixes and legal forms
|
||||
result = re.sub(r'\b(S\.?R\.?L\.?|S\.?A\.?|S\.?C\.?|S\.?N\.?C\.?|S\.?C\.?S\.?|P\.?F\.?A\.?|INTREPRINDERE\s+INDIVIDUALA)\b', '', result)
|
||||
# Strip II only at start of name (avoid matching Roman numeral II in "TEHNICA II SRL")
|
||||
result = re.sub(r'^I\.?I\.?\s+', '', result)
|
||||
# Remove punctuation and extra spaces
|
||||
result = re.sub(r'[^\w\s]', '', result)
|
||||
result = re.sub(r'\s+', ' ', result).strip()
|
||||
return result
|
||||
@@ -1,28 +0,0 @@
|
||||
import logging
|
||||
from fastapi import HTTPException
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def search_articles(query: str, limit: int = 20):
|
||||
"""Search articles in NOM_ARTICOLE by codmat or denumire."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
if not query or len(query) < 2:
|
||||
return []
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
SELECT id_articol, codmat, denumire, um
|
||||
FROM nom_articole
|
||||
WHERE (UPPER(codmat) LIKE UPPER(:q) || '%'
|
||||
OR UPPER(denumire) LIKE '%' || UPPER(:q) || '%')
|
||||
AND sters = 0 AND inactiv = 0
|
||||
AND ROWNUM <= :lim
|
||||
ORDER BY CASE WHEN UPPER(codmat) LIKE UPPER(:q) || '%' THEN 0 ELSE 1 END, codmat
|
||||
""", {"q": query, "lim": limit})
|
||||
|
||||
columns = [col[0].lower() for col in cur.description]
|
||||
return [dict(zip(columns, row)) for row in cur.fetchall()]
|
||||
@@ -1,105 +0,0 @@
|
||||
"""GoMag API client - downloads orders and saves them as JSON files."""
|
||||
import asyncio
|
||||
import json
|
||||
import logging
|
||||
from datetime import datetime, timedelta
|
||||
from pathlib import Path
|
||||
from typing import Callable
|
||||
|
||||
import httpx
|
||||
|
||||
from ..config import settings
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
async def download_orders(
|
||||
json_dir: str,
|
||||
days_back: int = None,
|
||||
api_key: str = None,
|
||||
api_shop: str = None,
|
||||
limit: int = None,
|
||||
log_fn: Callable[[str], None] = None,
|
||||
) -> dict:
|
||||
"""Download orders from GoMag API and save as JSON files.
|
||||
|
||||
Returns dict with keys: pages, total, files (list of saved file paths).
|
||||
If API keys are not configured, returns immediately with empty result.
|
||||
Optional api_key, api_shop, limit override config.settings values.
|
||||
"""
|
||||
def _log(msg: str):
|
||||
logger.info(msg)
|
||||
if log_fn:
|
||||
log_fn(msg)
|
||||
|
||||
effective_key = api_key or settings.GOMAG_API_KEY
|
||||
effective_shop = api_shop or settings.GOMAG_API_SHOP
|
||||
effective_limit = limit or settings.GOMAG_LIMIT
|
||||
|
||||
if not effective_key or not effective_shop:
|
||||
_log("GoMag API keys neconfigurați, skip download")
|
||||
return {"pages": 0, "total": 0, "files": []}
|
||||
|
||||
if days_back is None:
|
||||
days_back = settings.GOMAG_ORDER_DAYS_BACK
|
||||
|
||||
start_date = (datetime.now() - timedelta(days=days_back)).strftime("%Y-%m-%d")
|
||||
out_dir = Path(json_dir)
|
||||
out_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Clean old JSON files before downloading new ones
|
||||
old_files = list(out_dir.glob("gomag_orders*.json"))
|
||||
if old_files:
|
||||
for f in old_files:
|
||||
f.unlink()
|
||||
_log(f"Șterse {len(old_files)} fișiere JSON vechi")
|
||||
|
||||
headers = {
|
||||
"Apikey": effective_key,
|
||||
"ApiShop": effective_shop,
|
||||
"User-Agent": "Mozilla/5.0",
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
|
||||
saved_files = []
|
||||
total_orders = 0
|
||||
total_pages = 1
|
||||
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
|
||||
|
||||
async with httpx.AsyncClient(timeout=30) as client:
|
||||
page = 1
|
||||
while page <= total_pages:
|
||||
params = {
|
||||
"startDate": start_date,
|
||||
"page": page,
|
||||
"limit": effective_limit,
|
||||
}
|
||||
try:
|
||||
response = await client.get(settings.GOMAG_API_URL, headers=headers, params=params)
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
except httpx.HTTPError as e:
|
||||
_log(f"GoMag API eroare pagina {page}: {e}")
|
||||
break
|
||||
except Exception as e:
|
||||
_log(f"GoMag eroare neașteptată pagina {page}: {e}")
|
||||
break
|
||||
|
||||
# Update totals from first page response
|
||||
if page == 1:
|
||||
total_orders = int(data.get("total", 0))
|
||||
total_pages = int(data.get("pages", 1))
|
||||
_log(f"GoMag: {total_orders} comenzi în {total_pages} pagini (startDate={start_date})")
|
||||
|
||||
filename = out_dir / f"gomag_orders_page{page}_{timestamp}.json"
|
||||
filename.write_text(json.dumps(data, ensure_ascii=False, indent=2), encoding="utf-8")
|
||||
saved_files.append(str(filename))
|
||||
_log(f"GoMag: pagina {page}/{total_pages} salvată → {filename.name}")
|
||||
|
||||
page += 1
|
||||
if page <= total_pages:
|
||||
await asyncio.sleep(1)
|
||||
|
||||
return {"pages": total_pages, "total": total_orders, "files": saved_files}
|
||||
|
||||
|
||||
@@ -1,533 +0,0 @@
|
||||
import html
|
||||
import json
|
||||
import logging
|
||||
import re
|
||||
import unicodedata
|
||||
import oracledb
|
||||
from datetime import datetime, timedelta
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Stroke/ligature letters NFKD does not decompose (structural mod, not a
|
||||
# combining mark). Everything else — RO cedilla ş/ţ, RO comma-below ș/ț,
|
||||
# HU ő/ű, DE umlaut, CZ háček, FR accent, ES tilde — is handled
|
||||
# universally by unicodedata.normalize('NFKD') + Mn-category strip below.
|
||||
_NFKD_OVERRIDES = str.maketrans({
|
||||
'ß': 'ss', # ß
|
||||
'æ': 'ae', 'Æ': 'AE', # æ Æ
|
||||
'œ': 'oe', 'Œ': 'OE', # œ Œ
|
||||
'ł': 'l', 'Ł': 'L', # ł Ł (Polish)
|
||||
'đ': 'd', 'Đ': 'D', # đ Đ (Croatian)
|
||||
'ø': 'o', 'Ø': 'O', # ø Ø (Danish/Norwegian)
|
||||
})
|
||||
|
||||
|
||||
def clean_web_text(text: str) -> str:
|
||||
"""Port of VFP CleanWebText: unescape HTML entities + strip diacritics to ASCII.
|
||||
|
||||
NFKD decomposition + combining-mark filter covers RO/HU/DE/CZ/PL/FR/ES in
|
||||
one pass; _NFKD_OVERRIDES handles stroke letters NFKD leaves alone.
|
||||
"""
|
||||
if not text:
|
||||
return ""
|
||||
result = html.unescape(text)
|
||||
result = result.translate(_NFKD_OVERRIDES)
|
||||
decomposed = unicodedata.normalize('NFKD', result)
|
||||
result = ''.join(ch for ch in decomposed if not unicodedata.combining(ch))
|
||||
# Remove any remaining <br> tags
|
||||
for br in ('<br>', '<br/>', '<br />'):
|
||||
result = result.replace(br, ' ')
|
||||
return result.strip()
|
||||
|
||||
|
||||
def convert_web_date(date_str: str) -> datetime:
|
||||
"""Port of VFP ConvertWebDate: parse web date to datetime."""
|
||||
if not date_str:
|
||||
return datetime.now()
|
||||
try:
|
||||
return datetime.strptime(date_str.strip(), '%Y-%m-%d %H:%M:%S')
|
||||
except ValueError:
|
||||
try:
|
||||
return datetime.strptime(date_str.strip()[:10], '%Y-%m-%d')
|
||||
except ValueError:
|
||||
return datetime.now()
|
||||
|
||||
|
||||
def determine_partner_data(order) -> dict:
|
||||
"""Extract partner identification from a GoMag order (no Oracle calls).
|
||||
|
||||
Returns: {denumire, cod_fiscal, registru, is_pj}
|
||||
Identical logic to import_single_order partner block — reuse to avoid drift.
|
||||
"""
|
||||
if order.billing.is_company:
|
||||
denumire = clean_web_text(order.billing.company_name).upper()
|
||||
if not denumire:
|
||||
# CUI-only fallback: company has code but no name → use billing person name
|
||||
denumire = clean_web_text(
|
||||
f"{order.billing.lastname} {order.billing.firstname}"
|
||||
).upper()
|
||||
raw_cf = clean_web_text(order.billing.company_code) or None
|
||||
# Collapse internal whitespace: "RO 34963277" → "RO34963277"
|
||||
cod_fiscal = re.sub(r'\s+', '', raw_cf) if raw_cf else None
|
||||
registru = clean_web_text(order.billing.company_reg) or None
|
||||
is_pj = 1
|
||||
else:
|
||||
if order.shipping and (order.shipping.lastname or order.shipping.firstname):
|
||||
raw_name = clean_web_text(
|
||||
f"{order.shipping.lastname} {order.shipping.firstname}"
|
||||
).upper()
|
||||
else:
|
||||
raw_name = clean_web_text(
|
||||
f"{order.billing.lastname} {order.billing.firstname}"
|
||||
).upper()
|
||||
denumire = " ".join(sorted(raw_name.split()))
|
||||
cod_fiscal = None
|
||||
registru = None
|
||||
is_pj = 0
|
||||
return {"denumire": denumire, "cod_fiscal": cod_fiscal, "registru": registru, "is_pj": is_pj}
|
||||
|
||||
|
||||
def format_address_for_oracle(address: str, city: str, region: str) -> str:
|
||||
"""Port of VFP FormatAddressForOracle."""
|
||||
region_clean = clean_web_text(region)
|
||||
city_clean = clean_web_text(city)
|
||||
address_clean = clean_web_text(address)
|
||||
address_clean = " ".join(address_clean.replace(",", " ").split())
|
||||
# Strip city/region suffixes users often append to address
|
||||
if city_clean or region_clean:
|
||||
addr_upper = address_clean.upper().rstrip()
|
||||
city_upper = city_clean.upper().strip() if city_clean else ""
|
||||
region_upper = region_clean.upper().strip() if region_clean else ""
|
||||
for pattern in [
|
||||
(city_upper + " " + region_upper).strip(),
|
||||
(region_upper + " " + city_upper).strip(),
|
||||
city_upper,
|
||||
region_upper,
|
||||
]:
|
||||
if pattern and addr_upper.endswith(pattern):
|
||||
stripped = address_clean[:len(address_clean.rstrip()) - len(pattern)].rstrip()
|
||||
if stripped:
|
||||
address_clean = stripped
|
||||
addr_upper = address_clean.upper().rstrip()
|
||||
break
|
||||
return f"JUD:{region_clean};{city_clean};{address_clean}"
|
||||
|
||||
|
||||
def compute_discount_split(order, settings: dict) -> dict | None:
|
||||
"""Compute proportional discount split by VAT rate from order items.
|
||||
|
||||
Returns: {"11": 3.98, "21": 1.43} or None if split not applicable.
|
||||
Only splits when split_discount_vat is enabled AND multiple VAT rates exist.
|
||||
When single VAT rate: returns {actual_rate: total} (smarter than GoMag's fixed 21%).
|
||||
"""
|
||||
if not order or order.discount_total <= 0:
|
||||
return None
|
||||
|
||||
split_enabled = settings.get("split_discount_vat") == "1"
|
||||
|
||||
# Calculate VAT distribution from order items (exclude zero-value)
|
||||
vat_totals = {}
|
||||
for item in order.items:
|
||||
item_value = abs(item.price * item.quantity)
|
||||
if item_value > 0:
|
||||
vat_key = str(int(item.vat)) if item.vat == int(item.vat) else str(item.vat)
|
||||
vat_totals[vat_key] = vat_totals.get(vat_key, 0) + item_value
|
||||
|
||||
if not vat_totals:
|
||||
return None
|
||||
|
||||
grand_total = sum(vat_totals.values())
|
||||
if grand_total <= 0:
|
||||
return None
|
||||
|
||||
if len(vat_totals) == 1:
|
||||
# Single VAT rate — use that rate (smarter than GoMag's fixed 21%)
|
||||
actual_vat = list(vat_totals.keys())[0]
|
||||
return {actual_vat: round(order.discount_total, 2)}
|
||||
|
||||
if not split_enabled:
|
||||
return None
|
||||
|
||||
# Multiple VAT rates — split proportionally
|
||||
result = {}
|
||||
discount_remaining = order.discount_total
|
||||
sorted_rates = sorted(vat_totals.keys(), key=lambda x: float(x))
|
||||
|
||||
for i, vat_rate in enumerate(sorted_rates):
|
||||
if i == len(sorted_rates) - 1:
|
||||
split_amount = round(discount_remaining, 2) # last gets remainder
|
||||
else:
|
||||
proportion = vat_totals[vat_rate] / grand_total
|
||||
split_amount = round(order.discount_total * proportion, 2)
|
||||
discount_remaining -= split_amount
|
||||
|
||||
if split_amount > 0:
|
||||
result[vat_rate] = split_amount
|
||||
|
||||
return result if result else None
|
||||
|
||||
|
||||
def build_articles_json(items, order=None, settings=None) -> str:
|
||||
"""Build JSON string for Oracle PACK_IMPORT_COMENZI.importa_comanda.
|
||||
Includes transport and discount as extra articles if configured.
|
||||
Supports per-article id_pol from codmat_policy_map and discount VAT splitting."""
|
||||
articles = []
|
||||
codmat_policy_map = settings.get("_codmat_policy_map", {}) if settings else {}
|
||||
default_id_pol = settings.get("id_pol", "") if settings else ""
|
||||
|
||||
for item in items:
|
||||
article_dict = {
|
||||
"sku": item.sku,
|
||||
"quantity": str(item.quantity),
|
||||
"price": str(item.price),
|
||||
"vat": str(item.vat),
|
||||
"name": clean_web_text(item.name)
|
||||
}
|
||||
# Per-article id_pol from dual-policy validation
|
||||
item_pol = codmat_policy_map.get(item.sku)
|
||||
if item_pol and str(item_pol) != str(default_id_pol):
|
||||
article_dict["id_pol"] = str(item_pol)
|
||||
articles.append(article_dict)
|
||||
|
||||
if order and settings:
|
||||
transport_codmat = settings.get("transport_codmat", "")
|
||||
transport_vat = settings.get("transport_vat", "21")
|
||||
discount_codmat = settings.get("discount_codmat", "")
|
||||
|
||||
# Transport as article with quantity +1
|
||||
if order.delivery_cost > 0 and transport_codmat:
|
||||
article_dict = {
|
||||
"sku": transport_codmat,
|
||||
"quantity": "1",
|
||||
"price": str(order.delivery_cost),
|
||||
"vat": transport_vat,
|
||||
"name": "Transport"
|
||||
}
|
||||
if settings.get("transport_id_pol"):
|
||||
article_dict["id_pol"] = settings["transport_id_pol"]
|
||||
articles.append(article_dict)
|
||||
|
||||
# Discount — smart VAT splitting
|
||||
if order.discount_total > 0 and discount_codmat:
|
||||
discount_split = compute_discount_split(order, settings)
|
||||
|
||||
if discount_split and len(discount_split) > 1:
|
||||
# Multiple VAT rates — multiple discount lines
|
||||
for vat_rate, split_amount in sorted(discount_split.items(), key=lambda x: float(x[0])):
|
||||
article_dict = {
|
||||
"sku": discount_codmat,
|
||||
"quantity": "-1",
|
||||
"price": str(split_amount),
|
||||
"vat": vat_rate,
|
||||
"name": f"Discount (TVA {vat_rate}%)"
|
||||
}
|
||||
if settings.get("discount_id_pol"):
|
||||
article_dict["id_pol"] = settings["discount_id_pol"]
|
||||
articles.append(article_dict)
|
||||
elif discount_split and len(discount_split) == 1:
|
||||
# Single VAT rate — use detected rate
|
||||
actual_vat = list(discount_split.keys())[0]
|
||||
article_dict = {
|
||||
"sku": discount_codmat,
|
||||
"quantity": "-1",
|
||||
"price": str(order.discount_total),
|
||||
"vat": actual_vat,
|
||||
"name": "Discount"
|
||||
}
|
||||
if settings.get("discount_id_pol"):
|
||||
article_dict["id_pol"] = settings["discount_id_pol"]
|
||||
articles.append(article_dict)
|
||||
else:
|
||||
# Fallback — original behavior with GoMag VAT or settings default
|
||||
discount_vat = getattr(order, 'discount_vat', None) or settings.get("discount_vat", "21")
|
||||
article_dict = {
|
||||
"sku": discount_codmat,
|
||||
"quantity": "-1",
|
||||
"price": str(order.discount_total),
|
||||
"vat": discount_vat,
|
||||
"name": "Discount"
|
||||
}
|
||||
if settings.get("discount_id_pol"):
|
||||
article_dict["id_pol"] = settings["discount_id_pol"]
|
||||
articles.append(article_dict)
|
||||
|
||||
return json.dumps(articles)
|
||||
|
||||
|
||||
def import_single_order(order, id_pol: int = None, id_sectie: int = None, app_settings: dict = None, id_gestiuni: list[int] = None, cod_fiscal_override: str = None, anaf_strict: int = None, denumire_override: str = None) -> dict:
|
||||
"""Import a single order into Oracle ROA.
|
||||
|
||||
Returns dict with:
|
||||
success: bool
|
||||
id_comanda: int or None
|
||||
id_partener: int or None
|
||||
id_adresa_facturare: int or None
|
||||
id_adresa_livrare: int or None
|
||||
error: str or None
|
||||
"""
|
||||
result = {
|
||||
"success": False,
|
||||
"id_comanda": None,
|
||||
"id_partener": None,
|
||||
"id_adresa_facturare": None,
|
||||
"id_adresa_livrare": None,
|
||||
"error": None
|
||||
}
|
||||
|
||||
conn = None
|
||||
try:
|
||||
order_number = clean_web_text(order.number)
|
||||
order_date = convert_web_date(order.date)
|
||||
logger.info(
|
||||
f"Order {order.number}: raw date={order.date!r} → "
|
||||
f"parsed={order_date.strftime('%Y-%m-%d %H:%M:%S')}"
|
||||
)
|
||||
|
||||
if database.pool is None:
|
||||
raise RuntimeError("Oracle pool not initialized")
|
||||
conn = database.pool.acquire()
|
||||
with conn.cursor() as cur:
|
||||
# Step 1: Process partner — use shipping person data for name
|
||||
id_partener = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
|
||||
_pdata = determine_partner_data(order)
|
||||
# PJ: prefer ANAF official name (denumire_override) over GoMag company_name
|
||||
# (for new partner creation; existing partner lookup is CUI-based)
|
||||
denumire = (denumire_override
|
||||
if (_pdata["is_pj"] and denumire_override)
|
||||
else _pdata["denumire"])
|
||||
cod_fiscal = (cod_fiscal_override or _pdata["cod_fiscal"]) if _pdata["is_pj"] else None
|
||||
registru = _pdata["registru"]
|
||||
is_pj = _pdata["is_pj"]
|
||||
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener", [
|
||||
cod_fiscal, denumire, registru, is_pj, anaf_strict, id_partener
|
||||
])
|
||||
|
||||
partner_id = id_partener.getvalue()
|
||||
if not partner_id or partner_id <= 0:
|
||||
result["error"] = f"Partner creation failed for {denumire}"
|
||||
return result
|
||||
|
||||
result["id_partener"] = int(partner_id)
|
||||
|
||||
# Query partner data from Oracle for sync back to SQLite
|
||||
cur.execute("SELECT denumire, cod_fiscal FROM nom_parteneri WHERE id_part = :1", [partner_id])
|
||||
row = cur.fetchone()
|
||||
result["denumire_roa"] = row[0] if row else None
|
||||
result["cod_fiscal_roa"] = row[1] if row else None
|
||||
|
||||
# Step 2: Process shipping address (primary — person on shipping label)
|
||||
# Use shipping person phone/email for partner contact
|
||||
shipping_phone = ""
|
||||
shipping_email = ""
|
||||
if order.shipping:
|
||||
shipping_phone = order.shipping.phone or ""
|
||||
shipping_email = order.shipping.email or ""
|
||||
if not shipping_phone:
|
||||
shipping_phone = order.billing.phone or ""
|
||||
if not shipping_email:
|
||||
shipping_email = order.billing.email or ""
|
||||
|
||||
addr_livr_id = None
|
||||
if order.shipping:
|
||||
id_adresa_livr = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
shipping_addr = format_address_for_oracle(
|
||||
order.shipping.address, order.shipping.city,
|
||||
order.shipping.region
|
||||
)
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_adresa", [
|
||||
partner_id, shipping_addr,
|
||||
shipping_phone,
|
||||
shipping_email,
|
||||
id_adresa_livr
|
||||
])
|
||||
addr_livr_id = id_adresa_livr.getvalue()
|
||||
|
||||
if addr_livr_id is None:
|
||||
cur.execute("SELECT PACK_IMPORT_PARTENERI.get_last_error FROM dual")
|
||||
plsql_err = cur.fetchone()[0]
|
||||
err_msg = f"Shipping address creation failed for partner {partner_id}"
|
||||
if plsql_err:
|
||||
err_msg += f": {plsql_err}"
|
||||
logger.error(f"Order {order_number}: {err_msg}")
|
||||
result["error"] = err_msg
|
||||
return result
|
||||
|
||||
# Step 3: Process billing address — PJ vs PF rule
|
||||
if is_pj:
|
||||
# PJ (company): billing address = GoMag billing (company HQ)
|
||||
billing_addr = format_address_for_oracle(
|
||||
order.billing.address, order.billing.city, order.billing.region
|
||||
)
|
||||
if addr_livr_id and order.shipping and billing_addr == shipping_addr:
|
||||
# billing = shipping: reuse addr_livr_id to avoid duplicate Oracle address
|
||||
addr_fact_id = addr_livr_id
|
||||
else:
|
||||
id_adresa_fact = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_adresa", [
|
||||
partner_id, billing_addr,
|
||||
order.billing.phone or "",
|
||||
order.billing.email or "",
|
||||
id_adresa_fact
|
||||
])
|
||||
addr_fact_id = id_adresa_fact.getvalue()
|
||||
|
||||
if addr_fact_id is None:
|
||||
cur.execute("SELECT PACK_IMPORT_PARTENERI.get_last_error FROM dual")
|
||||
plsql_err = cur.fetchone()[0]
|
||||
err_msg = f"Billing address creation failed for partner {partner_id}"
|
||||
if plsql_err:
|
||||
err_msg += f": {plsql_err}"
|
||||
logger.error(f"Order {order_number}: {err_msg}")
|
||||
result["error"] = err_msg
|
||||
return result
|
||||
else:
|
||||
# PF (individual): billing = shipping (ramburs curier pe numele destinatarului)
|
||||
addr_fact_id = addr_livr_id
|
||||
|
||||
if addr_fact_id is not None:
|
||||
result["id_adresa_facturare"] = int(addr_fact_id)
|
||||
if addr_livr_id is not None:
|
||||
result["id_adresa_livrare"] = int(addr_livr_id)
|
||||
|
||||
# Query address details from Oracle for sync back to SQLite
|
||||
if addr_livr_id:
|
||||
cur.execute("""SELECT strada, numar, bloc, scara, apart, etaj, localitate, judet
|
||||
FROM vadrese_parteneri WHERE id_adresa = :1""", [int(addr_livr_id)])
|
||||
row = cur.fetchone()
|
||||
result["adresa_livrare_roa"] = {
|
||||
"strada": row[0], "numar": row[1], "bloc": row[2], "scara": row[3],
|
||||
"apart": row[4], "etaj": row[5], "localitate": row[6], "judet": row[7]
|
||||
} if row else None
|
||||
if addr_fact_id and addr_fact_id != addr_livr_id:
|
||||
cur.execute("""SELECT strada, numar, bloc, scara, apart, etaj, localitate, judet
|
||||
FROM vadrese_parteneri WHERE id_adresa = :1""", [int(addr_fact_id)])
|
||||
row = cur.fetchone()
|
||||
result["adresa_facturare_roa"] = {
|
||||
"strada": row[0], "numar": row[1], "bloc": row[2], "scara": row[3],
|
||||
"apart": row[4], "etaj": row[5], "localitate": row[6], "judet": row[7]
|
||||
} if row else None
|
||||
elif addr_fact_id and addr_fact_id == addr_livr_id:
|
||||
result["adresa_facturare_roa"] = result.get("adresa_livrare_roa")
|
||||
|
||||
# Step 4: Build articles JSON and import order
|
||||
articles_json = build_articles_json(order.items, order, app_settings)
|
||||
|
||||
# Use CLOB for the JSON
|
||||
clob_var = cur.var(oracledb.DB_TYPE_CLOB)
|
||||
clob_var.setvalue(0, articles_json)
|
||||
|
||||
id_comanda = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
|
||||
# Convert list[int] to CSV string for Oracle VARCHAR2 param
|
||||
id_gestiune_csv = ",".join(str(g) for g in id_gestiuni) if id_gestiuni else None
|
||||
|
||||
# Kit pricing parameters from settings
|
||||
kit_mode = (app_settings or {}).get("kit_pricing_mode") or None
|
||||
kit_id_pol_prod = int((app_settings or {}).get("id_pol_productie") or 0) or None
|
||||
kit_discount_codmat = (app_settings or {}).get("kit_discount_codmat") or None
|
||||
kit_discount_id_pol = int((app_settings or {}).get("kit_discount_id_pol") or 0) or None
|
||||
|
||||
cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
|
||||
order_number, # p_nr_comanda_ext
|
||||
order_date, # p_data_comanda
|
||||
partner_id, # p_id_partener
|
||||
clob_var, # p_json_articole (CLOB)
|
||||
addr_livr_id, # p_id_adresa_livrare
|
||||
addr_fact_id, # p_id_adresa_facturare
|
||||
id_pol, # p_id_pol
|
||||
id_sectie, # p_id_sectie
|
||||
id_gestiune_csv, # p_id_gestiune (CSV string)
|
||||
kit_mode, # p_kit_mode
|
||||
kit_id_pol_prod, # p_id_pol_productie
|
||||
kit_discount_codmat, # p_kit_discount_codmat
|
||||
kit_discount_id_pol, # p_kit_discount_id_pol
|
||||
id_comanda # v_id_comanda (OUT) — MUST STAY LAST
|
||||
])
|
||||
|
||||
comanda_id = id_comanda.getvalue()
|
||||
|
||||
if comanda_id and comanda_id > 0:
|
||||
conn.commit()
|
||||
result["success"] = True
|
||||
result["id_comanda"] = int(comanda_id)
|
||||
logger.info(f"Order {order_number} imported: ID={comanda_id}")
|
||||
else:
|
||||
conn.rollback()
|
||||
result["error"] = "importa_comanda returned invalid ID"
|
||||
|
||||
except oracledb.DatabaseError as e:
|
||||
error_msg = str(e)
|
||||
result["error"] = error_msg
|
||||
logger.error(f"Oracle error importing order {order.number}: {error_msg}")
|
||||
if conn:
|
||||
try:
|
||||
conn.rollback()
|
||||
except Exception:
|
||||
pass
|
||||
except Exception as e:
|
||||
result["error"] = str(e)
|
||||
logger.error(f"Error importing order {order.number}: {e}")
|
||||
if conn:
|
||||
try:
|
||||
conn.rollback()
|
||||
except Exception:
|
||||
pass
|
||||
finally:
|
||||
if conn:
|
||||
try:
|
||||
database.pool.release(conn)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return result
|
||||
|
||||
|
||||
def soft_delete_order_in_roa(id_comanda: int) -> dict:
|
||||
"""Soft-delete an order in Oracle ROA (set sters=1 on comenzi + comenzi_detalii).
|
||||
Returns {"success": bool, "error": str|None, "details_deleted": int}
|
||||
"""
|
||||
result = {"success": False, "error": None, "details_deleted": 0}
|
||||
|
||||
if database.pool is None:
|
||||
result["error"] = "Oracle pool not initialized"
|
||||
return result
|
||||
|
||||
conn = None
|
||||
try:
|
||||
conn = database.pool.acquire()
|
||||
with conn.cursor() as cur:
|
||||
# Soft-delete order details
|
||||
cur.execute(
|
||||
"UPDATE comenzi_elemente SET sters = 1 WHERE id_comanda = :1 AND sters = 0",
|
||||
[id_comanda]
|
||||
)
|
||||
result["details_deleted"] = cur.rowcount
|
||||
|
||||
# Soft-delete the order itself
|
||||
cur.execute(
|
||||
"UPDATE comenzi SET sters = 1 WHERE id_comanda = :1 AND sters = 0",
|
||||
[id_comanda]
|
||||
)
|
||||
|
||||
conn.commit()
|
||||
result["success"] = True
|
||||
logger.info(f"Soft-deleted order ID={id_comanda} in Oracle ROA ({result['details_deleted']} details)")
|
||||
except Exception as e:
|
||||
result["error"] = str(e)
|
||||
logger.error(f"Error soft-deleting order ID={id_comanda}: {e}")
|
||||
if conn:
|
||||
try:
|
||||
conn.rollback()
|
||||
except Exception:
|
||||
pass
|
||||
finally:
|
||||
if conn:
|
||||
try:
|
||||
database.pool.release(conn)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return result
|
||||
@@ -1,75 +0,0 @@
|
||||
import logging
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def check_invoices_for_orders(id_comanda_list: list) -> dict:
|
||||
"""Check which orders have been invoiced in Oracle (vanzari table).
|
||||
Returns {id_comanda: {facturat, numar_act, serie_act, total_fara_tva, total_tva, total_cu_tva}}
|
||||
"""
|
||||
if not id_comanda_list or database.pool is None:
|
||||
return {}
|
||||
|
||||
result = {}
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
for i in range(0, len(id_comanda_list), 500):
|
||||
batch = id_comanda_list[i:i+500]
|
||||
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
|
||||
params = {f"c{j}": cid for j, cid in enumerate(batch)}
|
||||
|
||||
cur.execute(f"""
|
||||
SELECT id_comanda, numar_act, serie_act,
|
||||
total_fara_tva, total_tva, total_cu_tva,
|
||||
TO_CHAR(data_act, 'YYYY-MM-DD') AS data_act
|
||||
FROM vanzari
|
||||
WHERE id_comanda IN ({placeholders}) AND sters = 0
|
||||
""", params)
|
||||
for row in cur:
|
||||
result[row[0]] = {
|
||||
"facturat": True,
|
||||
"numar_act": row[1],
|
||||
"serie_act": row[2],
|
||||
"total_fara_tva": float(row[3]) if row[3] else 0,
|
||||
"total_tva": float(row[4]) if row[4] else 0,
|
||||
"total_cu_tva": float(row[5]) if row[5] else 0,
|
||||
"data_act": row[6],
|
||||
}
|
||||
except Exception as e:
|
||||
logger.warning(f"Invoice check failed (table may not exist): {e}")
|
||||
finally:
|
||||
database.pool.release(conn)
|
||||
|
||||
return result
|
||||
|
||||
|
||||
def check_orders_exist(id_comanda_list: list) -> set:
|
||||
"""Check which id_comanda values still exist in Oracle COMENZI (sters=0).
|
||||
Returns set of id_comanda that exist.
|
||||
"""
|
||||
if not id_comanda_list or database.pool is None:
|
||||
return set()
|
||||
|
||||
existing = set()
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
for i in range(0, len(id_comanda_list), 500):
|
||||
batch = id_comanda_list[i:i+500]
|
||||
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
|
||||
params = {f"c{j}": cid for j, cid in enumerate(batch)}
|
||||
|
||||
cur.execute(f"""
|
||||
SELECT id_comanda FROM COMENZI
|
||||
WHERE id_comanda IN ({placeholders}) AND sters = 0
|
||||
""", params)
|
||||
for row in cur:
|
||||
existing.add(row[0])
|
||||
except Exception as e:
|
||||
logger.warning(f"Order existence check failed: {e}")
|
||||
finally:
|
||||
database.pool.release(conn)
|
||||
|
||||
return existing
|
||||
@@ -1,405 +0,0 @@
|
||||
import oracledb
|
||||
import csv
|
||||
import io
|
||||
import logging
|
||||
from fastapi import HTTPException
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def get_mappings(search: str = "", page: int = 1, per_page: int = 50,
|
||||
sort_by: str = "sku", sort_dir: str = "asc",
|
||||
show_deleted: bool = False,
|
||||
id_pol: int = None, id_pol_productie: int = None):
|
||||
"""Get paginated mappings with optional search and sorting."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
offset = (page - 1) * per_page
|
||||
|
||||
# Validate and resolve sort parameters
|
||||
allowed_sort = {
|
||||
"sku": "at.sku",
|
||||
"codmat": "at.codmat",
|
||||
"denumire": "na.denumire",
|
||||
"um": "na.um",
|
||||
"cantitate_roa": "at.cantitate_roa",
|
||||
"activ": "at.activ",
|
||||
}
|
||||
sort_col = allowed_sort.get(sort_by, "at.sku")
|
||||
if sort_dir.lower() not in ("asc", "desc"):
|
||||
sort_dir = "asc"
|
||||
order_clause = f"{sort_col} {sort_dir}"
|
||||
# Always add secondary sort to keep groups together
|
||||
if sort_col not in ("at.sku",):
|
||||
order_clause += ", at.sku"
|
||||
order_clause += ", at.codmat"
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Build WHERE clause
|
||||
where_clauses = []
|
||||
params = {}
|
||||
if not show_deleted:
|
||||
where_clauses.append("at.sters = 0")
|
||||
if search:
|
||||
where_clauses.append("""(UPPER(at.sku) LIKE '%' || UPPER(:search) || '%'
|
||||
OR UPPER(at.codmat) LIKE '%' || UPPER(:search) || '%'
|
||||
OR UPPER(na.denumire) LIKE '%' || UPPER(:search) || '%')""")
|
||||
params["search"] = search
|
||||
where = "WHERE " + " AND ".join(where_clauses) if where_clauses else ""
|
||||
|
||||
# Add price policy params
|
||||
params["id_pol"] = id_pol
|
||||
params["id_pol_prod"] = id_pol_productie
|
||||
|
||||
# Fetch ALL matching rows (no pagination yet — we need to group by SKU first)
|
||||
data_sql = f"""
|
||||
SELECT at.sku, at.codmat, na.denumire, na.um, at.cantitate_roa,
|
||||
at.activ, at.sters,
|
||||
TO_CHAR(at.data_creare, 'YYYY-MM-DD HH24:MI') as data_creare,
|
||||
ROUND(CASE WHEN pp.preturi_cu_tva = 1
|
||||
THEN NVL(ppa.pret, 0)
|
||||
ELSE NVL(ppa.pret, 0) * NVL(ppa.proc_tvav, 1.19)
|
||||
END, 2) AS pret_cu_tva
|
||||
FROM ARTICOLE_TERTI at
|
||||
LEFT JOIN nom_articole na ON na.codmat = at.codmat
|
||||
LEFT JOIN crm_politici_pret_art ppa
|
||||
ON ppa.id_articol = na.id_articol
|
||||
AND ppa.id_pol = CASE
|
||||
WHEN TRIM(na.cont) IN ('341','345') AND :id_pol_prod IS NOT NULL
|
||||
THEN :id_pol_prod ELSE :id_pol END
|
||||
LEFT JOIN crm_politici_preturi pp
|
||||
ON pp.id_pol = ppa.id_pol
|
||||
{where}
|
||||
ORDER BY {order_clause}
|
||||
"""
|
||||
cur.execute(data_sql, params)
|
||||
columns = [col[0].lower() for col in cur.description]
|
||||
all_rows = [dict(zip(columns, row)) for row in cur.fetchall()]
|
||||
|
||||
# Group by SKU
|
||||
from collections import OrderedDict
|
||||
groups = OrderedDict()
|
||||
for row in all_rows:
|
||||
sku = row["sku"]
|
||||
if sku not in groups:
|
||||
groups[sku] = []
|
||||
groups[sku].append(row)
|
||||
|
||||
counts = {"total": len(groups)}
|
||||
|
||||
# Flatten back to rows for pagination (paginate by raw row count)
|
||||
filtered_rows = [row for rows in groups.values() for row in rows]
|
||||
total = len(filtered_rows)
|
||||
page_rows = filtered_rows[offset: offset + per_page]
|
||||
|
||||
return {
|
||||
"mappings": page_rows,
|
||||
"total": total,
|
||||
"page": page,
|
||||
"per_page": per_page,
|
||||
"pages": (total + per_page - 1) // per_page if total > 0 else 0,
|
||||
"counts": counts,
|
||||
}
|
||||
|
||||
def create_mapping(sku: str, codmat: str, cantitate_roa: float = 1, auto_restore: bool = False):
|
||||
"""Create a new mapping. Returns dict or raises HTTPException on duplicate.
|
||||
|
||||
When auto_restore=True, soft-deleted records are restored+updated instead of raising 409.
|
||||
"""
|
||||
if not sku or not sku.strip():
|
||||
raise HTTPException(status_code=400, detail="SKU este obligatoriu")
|
||||
if not codmat or not codmat.strip():
|
||||
raise HTTPException(status_code=400, detail="CODMAT este obligatoriu")
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Validate CODMAT exists in NOM_ARTICOLE
|
||||
cur.execute("""
|
||||
SELECT COUNT(*) FROM NOM_ARTICOLE
|
||||
WHERE codmat = :codmat AND sters = 0 AND inactiv = 0
|
||||
""", {"codmat": codmat})
|
||||
if cur.fetchone()[0] == 0:
|
||||
raise HTTPException(status_code=400, detail="CODMAT-ul nu exista in nomenclator")
|
||||
|
||||
# Check for active duplicate
|
||||
cur.execute("""
|
||||
SELECT COUNT(*) FROM ARTICOLE_TERTI
|
||||
WHERE sku = :sku AND codmat = :codmat AND NVL(sters, 0) = 0
|
||||
""", {"sku": sku, "codmat": codmat})
|
||||
if cur.fetchone()[0] > 0:
|
||||
raise HTTPException(status_code=409, detail="Maparea SKU-CODMAT există deja")
|
||||
|
||||
# Check for soft-deleted record that could be restored
|
||||
cur.execute("""
|
||||
SELECT COUNT(*) FROM ARTICOLE_TERTI
|
||||
WHERE sku = :sku AND codmat = :codmat AND sters = 1
|
||||
""", {"sku": sku, "codmat": codmat})
|
||||
if cur.fetchone()[0] > 0:
|
||||
if auto_restore:
|
||||
cur.execute("""
|
||||
UPDATE ARTICOLE_TERTI SET sters = 0, activ = 1,
|
||||
cantitate_roa = :cantitate_roa,
|
||||
data_modif = SYSDATE
|
||||
WHERE sku = :sku AND codmat = :codmat AND sters = 1
|
||||
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate_roa})
|
||||
conn.commit()
|
||||
return {"sku": sku, "codmat": codmat}
|
||||
else:
|
||||
raise HTTPException(
|
||||
status_code=409,
|
||||
detail="Maparea a fost ștearsă anterior",
|
||||
headers={"X-Can-Restore": "true"}
|
||||
)
|
||||
|
||||
cur.execute("""
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
VALUES (:sku, :codmat, :cantitate_roa, 1, 0, SYSDATE, -3)
|
||||
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate_roa})
|
||||
conn.commit()
|
||||
return {"sku": sku, "codmat": codmat}
|
||||
|
||||
def update_mapping(sku: str, codmat: str, cantitate_roa: float = None, activ: int = None):
|
||||
"""Update an existing mapping."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
sets = []
|
||||
params = {"sku": sku, "codmat": codmat}
|
||||
|
||||
if cantitate_roa is not None:
|
||||
sets.append("cantitate_roa = :cantitate_roa")
|
||||
params["cantitate_roa"] = cantitate_roa
|
||||
if activ is not None:
|
||||
sets.append("activ = :activ")
|
||||
params["activ"] = activ
|
||||
|
||||
if not sets:
|
||||
return False
|
||||
|
||||
sets.append("data_modif = SYSDATE")
|
||||
set_clause = ", ".join(sets)
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(f"""
|
||||
UPDATE ARTICOLE_TERTI SET {set_clause}
|
||||
WHERE sku = :sku AND codmat = :codmat
|
||||
""", params)
|
||||
conn.commit()
|
||||
return cur.rowcount > 0
|
||||
|
||||
def delete_mapping(sku: str, codmat: str):
|
||||
"""Soft delete (set sters=1)."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
UPDATE ARTICOLE_TERTI SET sters = 1, data_modif = SYSDATE
|
||||
WHERE sku = :sku AND codmat = :codmat
|
||||
""", {"sku": sku, "codmat": codmat})
|
||||
conn.commit()
|
||||
return cur.rowcount > 0
|
||||
|
||||
def edit_mapping(old_sku: str, old_codmat: str, new_sku: str, new_codmat: str,
|
||||
cantitate_roa: float = 1):
|
||||
"""Edit a mapping. If PK changed, soft-delete old and insert new."""
|
||||
if not new_sku or not new_sku.strip():
|
||||
raise HTTPException(status_code=400, detail="SKU este obligatoriu")
|
||||
if not new_codmat or not new_codmat.strip():
|
||||
raise HTTPException(status_code=400, detail="CODMAT este obligatoriu")
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
if old_sku == new_sku and old_codmat == new_codmat:
|
||||
# Simple update - only cantitate changed
|
||||
return update_mapping(new_sku, new_codmat, cantitate_roa)
|
||||
else:
|
||||
# PK changed: soft-delete old, upsert new (MERGE handles existing soft-deleted target)
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Mark old record as deleted
|
||||
cur.execute("""
|
||||
UPDATE ARTICOLE_TERTI SET sters = 1, data_modif = SYSDATE
|
||||
WHERE sku = :sku AND codmat = :codmat
|
||||
""", {"sku": old_sku, "codmat": old_codmat})
|
||||
# Upsert new record (MERGE in case target PK exists as soft-deleted)
|
||||
cur.execute("""
|
||||
MERGE INTO ARTICOLE_TERTI t
|
||||
USING (SELECT :sku AS sku, :codmat AS codmat FROM DUAL) s
|
||||
ON (t.sku = s.sku AND t.codmat = s.codmat)
|
||||
WHEN MATCHED THEN UPDATE SET
|
||||
cantitate_roa = :cantitate_roa,
|
||||
activ = 1, sters = 0,
|
||||
data_modif = SYSDATE
|
||||
WHEN NOT MATCHED THEN INSERT
|
||||
(sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
VALUES (:sku, :codmat, :cantitate_roa, 1, 0, SYSDATE, -3)
|
||||
""", {"sku": new_sku, "codmat": new_codmat, "cantitate_roa": cantitate_roa})
|
||||
conn.commit()
|
||||
return True
|
||||
|
||||
def restore_mapping(sku: str, codmat: str):
|
||||
"""Restore a soft-deleted mapping (set sters=0)."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
UPDATE ARTICOLE_TERTI SET sters = 0, data_modif = SYSDATE
|
||||
WHERE sku = :sku AND codmat = :codmat
|
||||
""", {"sku": sku, "codmat": codmat})
|
||||
conn.commit()
|
||||
return cur.rowcount > 0
|
||||
|
||||
def import_csv(file_content: str):
|
||||
"""Import mappings from CSV content. Returns summary.
|
||||
Backward compatible: if procent_pret column exists in CSV, it is silently ignored.
|
||||
"""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
reader = csv.DictReader(io.StringIO(file_content))
|
||||
created = 0
|
||||
skipped_no_codmat = 0
|
||||
errors = []
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
for i, row in enumerate(reader, 1):
|
||||
sku = row.get("sku", "").strip()
|
||||
codmat = row.get("codmat", "").strip()
|
||||
|
||||
if not sku:
|
||||
errors.append(f"Rând {i}: SKU lipsă")
|
||||
continue
|
||||
|
||||
if not codmat:
|
||||
skipped_no_codmat += 1
|
||||
continue
|
||||
|
||||
try:
|
||||
cantitate = float(row.get("cantitate_roa", "1") or "1")
|
||||
# procent_pret column ignored if present (backward compat)
|
||||
|
||||
cur.execute("""
|
||||
MERGE INTO ARTICOLE_TERTI t
|
||||
USING (SELECT :sku AS sku, :codmat AS codmat FROM DUAL) s
|
||||
ON (t.sku = s.sku AND t.codmat = s.codmat)
|
||||
WHEN MATCHED THEN UPDATE SET
|
||||
cantitate_roa = :cantitate_roa,
|
||||
activ = 1,
|
||||
sters = 0,
|
||||
data_modif = SYSDATE
|
||||
WHEN NOT MATCHED THEN INSERT
|
||||
(sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
VALUES (:sku, :codmat, :cantitate_roa, 1, 0, SYSDATE, -3)
|
||||
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate})
|
||||
created += 1
|
||||
|
||||
except Exception as e:
|
||||
errors.append(f"Rând {i}: {str(e)}")
|
||||
|
||||
conn.commit()
|
||||
|
||||
return {"processed": created, "skipped_no_codmat": skipped_no_codmat, "errors": errors}
|
||||
|
||||
def export_csv():
|
||||
"""Export all mappings as CSV string."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(["sku", "codmat", "cantitate_roa", "activ"])
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
SELECT sku, codmat, cantitate_roa, activ
|
||||
FROM ARTICOLE_TERTI WHERE sters = 0 ORDER BY sku, codmat
|
||||
""")
|
||||
for row in cur:
|
||||
writer.writerow(row)
|
||||
|
||||
return output.getvalue()
|
||||
|
||||
def get_csv_template():
|
||||
"""Return empty CSV template."""
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(["sku", "codmat", "cantitate_roa"])
|
||||
writer.writerow(["EXAMPLE_SKU", "EXAMPLE_CODMAT", "1"])
|
||||
return output.getvalue()
|
||||
|
||||
def get_component_prices(sku: str, id_pol: int, id_pol_productie: int = None) -> list:
|
||||
"""Get prices from crm_politici_pret_art for kit components.
|
||||
Returns: [{"codmat", "denumire", "cantitate_roa", "pret", "pret_cu_tva", "proc_tvav", "ptva", "id_pol_used"}]
|
||||
"""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Get components from ARTICOLE_TERTI
|
||||
cur.execute("""
|
||||
SELECT at.codmat, at.cantitate_roa, na.id_articol, na.cont, na.denumire
|
||||
FROM ARTICOLE_TERTI at
|
||||
JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
|
||||
WHERE at.sku = :sku AND at.activ = 1 AND at.sters = 0
|
||||
ORDER BY at.codmat
|
||||
""", {"sku": sku})
|
||||
components = cur.fetchall()
|
||||
|
||||
if len(components) == 0:
|
||||
return []
|
||||
if len(components) == 1 and (components[0][1] or 1) <= 1:
|
||||
return [] # True 1:1 mapping, no kit pricing needed
|
||||
|
||||
result = []
|
||||
for codmat, cant_roa, id_art, cont, denumire in components:
|
||||
# Determine policy based on account
|
||||
cont_str = str(cont or "").strip()
|
||||
pol = id_pol_productie if (cont_str in ("341", "345") and id_pol_productie) else id_pol
|
||||
|
||||
# Get PRETURI_CU_TVA flag
|
||||
cur.execute("SELECT PRETURI_CU_TVA FROM CRM_POLITICI_PRETURI WHERE ID_POL = :pol", {"pol": pol})
|
||||
pol_row = cur.fetchone()
|
||||
preturi_cu_tva_flag = pol_row[0] if pol_row else 0
|
||||
|
||||
# Get price
|
||||
cur.execute("""
|
||||
SELECT PRET, PROC_TVAV FROM crm_politici_pret_art
|
||||
WHERE id_pol = :pol AND id_articol = :id_art
|
||||
""", {"pol": pol, "id_art": id_art})
|
||||
price_row = cur.fetchone()
|
||||
|
||||
if price_row:
|
||||
pret, proc_tvav = price_row
|
||||
proc_tvav = proc_tvav or 1.19
|
||||
pret_cu_tva = pret if preturi_cu_tva_flag == 1 else round(pret * proc_tvav, 2)
|
||||
ptva = round((proc_tvav - 1) * 100)
|
||||
else:
|
||||
pret = 0
|
||||
pret_cu_tva = 0
|
||||
proc_tvav = 1.19
|
||||
ptva = 19
|
||||
|
||||
result.append({
|
||||
"codmat": codmat,
|
||||
"denumire": denumire or "",
|
||||
"cantitate_roa": float(cant_roa) if cant_roa else 1,
|
||||
"pret": float(pret) if pret else 0,
|
||||
"pret_cu_tva": float(pret_cu_tva),
|
||||
"proc_tvav": float(proc_tvav),
|
||||
"ptva": int(ptva),
|
||||
"id_pol_used": pol
|
||||
})
|
||||
|
||||
return result
|
||||
@@ -1,202 +0,0 @@
|
||||
import json
|
||||
import glob
|
||||
import os
|
||||
import logging
|
||||
from pathlib import Path
|
||||
from dataclasses import dataclass, field
|
||||
from typing import Optional
|
||||
|
||||
from ..config import settings
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@dataclass
|
||||
class OrderItem:
|
||||
sku: str
|
||||
name: str
|
||||
price: float
|
||||
quantity: float
|
||||
vat: float
|
||||
baseprice: float = 0.0
|
||||
|
||||
@dataclass
|
||||
class OrderBilling:
|
||||
firstname: str = ""
|
||||
lastname: str = ""
|
||||
phone: str = ""
|
||||
email: str = ""
|
||||
address: str = ""
|
||||
city: str = ""
|
||||
region: str = ""
|
||||
country: str = ""
|
||||
company_name: str = ""
|
||||
company_code: str = ""
|
||||
company_reg: str = ""
|
||||
is_company: bool = False
|
||||
|
||||
@dataclass
|
||||
class OrderShipping:
|
||||
firstname: str = ""
|
||||
lastname: str = ""
|
||||
phone: str = ""
|
||||
email: str = ""
|
||||
address: str = ""
|
||||
city: str = ""
|
||||
region: str = ""
|
||||
country: str = ""
|
||||
|
||||
@dataclass
|
||||
class OrderData:
|
||||
id: str
|
||||
number: str
|
||||
date: str
|
||||
status: str = ""
|
||||
status_id: str = ""
|
||||
items: list = field(default_factory=list) # list of OrderItem
|
||||
billing: OrderBilling = field(default_factory=OrderBilling)
|
||||
shipping: Optional[OrderShipping] = None
|
||||
total: float = 0.0
|
||||
delivery_cost: float = 0.0
|
||||
discount_total: float = 0.0
|
||||
discount_vat: Optional[str] = None
|
||||
payment_name: str = ""
|
||||
delivery_name: str = ""
|
||||
source_file: str = ""
|
||||
|
||||
def read_json_orders(json_dir: str = None) -> tuple[list[OrderData], int]:
|
||||
"""Read all GoMag order JSON files from the output directory.
|
||||
Returns (list of OrderData, number of JSON files read).
|
||||
"""
|
||||
if json_dir is None:
|
||||
json_dir = settings.JSON_OUTPUT_DIR
|
||||
|
||||
if not json_dir or not os.path.isdir(json_dir):
|
||||
logger.warning(f"JSON output directory not found: {json_dir}")
|
||||
return [], 0
|
||||
|
||||
# Find all gomag_orders*.json files
|
||||
pattern = os.path.join(json_dir, "gomag_orders*.json")
|
||||
json_files = sorted(glob.glob(pattern))
|
||||
|
||||
if not json_files:
|
||||
logger.info(f"No JSON files found in {json_dir}")
|
||||
return [], 0
|
||||
|
||||
orders = []
|
||||
for filepath in json_files:
|
||||
try:
|
||||
with open(filepath, 'r', encoding='utf-8') as f:
|
||||
data = json.load(f)
|
||||
|
||||
raw_orders = data.get("orders", {})
|
||||
if not isinstance(raw_orders, dict):
|
||||
continue
|
||||
|
||||
for order_id, order_data in raw_orders.items():
|
||||
try:
|
||||
order = _parse_order(order_id, order_data, os.path.basename(filepath))
|
||||
orders.append(order)
|
||||
except Exception as e:
|
||||
logger.warning(f"Error parsing order {order_id} from {filepath}: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"Error reading {filepath}: {e}")
|
||||
|
||||
logger.info(f"Read {len(orders)} orders from {len(json_files)} JSON files")
|
||||
return orders, len(json_files)
|
||||
|
||||
def _parse_order(order_id: str, data: dict, source_file: str) -> OrderData:
|
||||
"""Parse a single order from JSON data."""
|
||||
# Parse items
|
||||
items = []
|
||||
raw_items = data.get("items", [])
|
||||
if isinstance(raw_items, list):
|
||||
for item in raw_items:
|
||||
if isinstance(item, dict) and item.get("sku"):
|
||||
items.append(OrderItem(
|
||||
sku=str(item.get("sku", "")).strip(),
|
||||
name=str(item.get("name", "")),
|
||||
price=float(item.get("price", 0) or 0),
|
||||
quantity=float(item.get("quantity", 0) or 0),
|
||||
vat=float(item.get("vat", 0) or 0),
|
||||
baseprice=float(item.get("baseprice", 0) or 0)
|
||||
))
|
||||
|
||||
# Parse billing
|
||||
billing_data = data.get("billing", {}) or {}
|
||||
company = billing_data.get("company")
|
||||
is_company = isinstance(company, dict) and (
|
||||
bool(company.get("name")) or bool(company.get("code"))
|
||||
)
|
||||
|
||||
billing = OrderBilling(
|
||||
firstname=str(billing_data.get("firstname", "")),
|
||||
lastname=str(billing_data.get("lastname", "")),
|
||||
phone=str(billing_data.get("phone", "")),
|
||||
email=str(billing_data.get("email", "")),
|
||||
address=str(billing_data.get("address", "")),
|
||||
city=str(billing_data.get("city", "")),
|
||||
region=str(billing_data.get("region", "")),
|
||||
country=str(billing_data.get("country", "")),
|
||||
company_name=str(company.get("name", "")) if is_company else "",
|
||||
company_code=str(company.get("code", "")) if is_company else "",
|
||||
company_reg=str(company.get("registrationNo", "")) if is_company else "",
|
||||
is_company=is_company
|
||||
)
|
||||
|
||||
# Parse shipping
|
||||
shipping_data = data.get("shipping")
|
||||
shipping = None
|
||||
if isinstance(shipping_data, dict):
|
||||
shipping = OrderShipping(
|
||||
firstname=str(shipping_data.get("firstname", "")),
|
||||
lastname=str(shipping_data.get("lastname", "")),
|
||||
phone=str(shipping_data.get("phone", "")),
|
||||
email=str(shipping_data.get("email", "")),
|
||||
address=str(shipping_data.get("address", "")),
|
||||
city=str(shipping_data.get("city", "")),
|
||||
region=str(shipping_data.get("region", "")),
|
||||
country=str(shipping_data.get("country", ""))
|
||||
)
|
||||
|
||||
# Payment/delivery
|
||||
payment = data.get("payment", {}) or {}
|
||||
delivery = data.get("delivery", {}) or {}
|
||||
|
||||
# Parse delivery cost
|
||||
delivery_cost = float(delivery.get("total", 0) or 0) if isinstance(delivery, dict) else 0.0
|
||||
|
||||
# Parse discount total (sum of all discount values) and VAT from first discount item
|
||||
discount_total = 0.0
|
||||
discount_vat = None
|
||||
for d in data.get("discounts", []):
|
||||
if isinstance(d, dict):
|
||||
discount_total += float(d.get("value", 0) or 0)
|
||||
if discount_vat is None and d.get("vat") is not None:
|
||||
discount_vat = str(d["vat"])
|
||||
|
||||
return OrderData(
|
||||
id=str(data.get("id", order_id)),
|
||||
number=str(data.get("number", "")),
|
||||
date=str(data.get("date", "")),
|
||||
status=str(data.get("status", "")),
|
||||
status_id=str(data.get("statusId", "")),
|
||||
items=items,
|
||||
billing=billing,
|
||||
shipping=shipping,
|
||||
total=float(data.get("total", 0) or 0),
|
||||
delivery_cost=delivery_cost,
|
||||
discount_total=discount_total,
|
||||
discount_vat=discount_vat,
|
||||
payment_name=str(payment.get("name", "")) if isinstance(payment, dict) else "",
|
||||
delivery_name=str(delivery.get("name", "")) if isinstance(delivery, dict) else "",
|
||||
source_file=source_file
|
||||
)
|
||||
|
||||
def get_all_skus(orders: list[OrderData]) -> set[str]:
|
||||
"""Extract unique SKUs from all orders."""
|
||||
skus = set()
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
if item.sku:
|
||||
skus.add(item.sku)
|
||||
return skus
|
||||
@@ -1,335 +0,0 @@
|
||||
"""Retry service — re-import individual failed/skipped orders."""
|
||||
import asyncio
|
||||
import logging
|
||||
import tempfile
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
from ..constants import OrderStatus
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
async def _download_and_reimport(order_number: str, order_date_str: str, customer_name: str, app_settings: dict) -> dict:
|
||||
"""Download order from GoMag and re-import it into Oracle.
|
||||
|
||||
Does NOT check status guard — caller is responsible.
|
||||
Returns: {"success": bool, "message": str, "status": str|None}
|
||||
"""
|
||||
from . import sqlite_service, gomag_client, import_service, order_reader, validation_service
|
||||
|
||||
# Parse order date for narrow download window
|
||||
try:
|
||||
order_date = datetime.fromisoformat(order_date_str.replace("Z", "+00:00")).date()
|
||||
except (ValueError, AttributeError):
|
||||
order_date = datetime.now().date() - timedelta(days=1)
|
||||
|
||||
gomag_key = app_settings.get("gomag_api_key") or None
|
||||
gomag_shop = app_settings.get("gomag_api_shop") or None
|
||||
|
||||
with tempfile.TemporaryDirectory() as tmp_dir:
|
||||
try:
|
||||
today = datetime.now().date()
|
||||
days_back = (today - order_date).days + 1
|
||||
if days_back < 2:
|
||||
days_back = 2
|
||||
|
||||
await gomag_client.download_orders(
|
||||
tmp_dir, days_back=days_back,
|
||||
api_key=gomag_key, api_shop=gomag_shop,
|
||||
limit=200,
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Retry download failed for {order_number}: {e}")
|
||||
return {"success": False, "message": f"Eroare download GoMag: {e}"}
|
||||
|
||||
# Find the specific order in downloaded data
|
||||
target_order = None
|
||||
orders, _ = order_reader.read_json_orders(json_dir=tmp_dir)
|
||||
for o in orders:
|
||||
if str(o.number) == str(order_number):
|
||||
target_order = o
|
||||
break
|
||||
|
||||
if not target_order:
|
||||
return {"success": False, "message": f"Comanda {order_number} nu a fost gasita in GoMag API"}
|
||||
|
||||
# Import the order
|
||||
id_pol = int(app_settings.get("id_pol") or 0)
|
||||
id_sectie = int(app_settings.get("id_sectie") or 0)
|
||||
id_gestiune = app_settings.get("id_gestiune", "")
|
||||
id_gestiuni = [int(g.strip()) for g in id_gestiune.split(",") if g.strip()] if id_gestiune else None
|
||||
|
||||
# Pre-validate prices: auto-insert PRET=0 in CRM_POLITICI_PRET_ART for missing
|
||||
# CODMATs so PL/SQL doesn't crash with COM-001. Mirrors sync_service flow.
|
||||
from .. import database
|
||||
validation = {"mapped": set(), "direct": set(), "missing": set(), "direct_id_map": {}}
|
||||
if database.pool is not None:
|
||||
conn = await asyncio.to_thread(database.get_oracle_connection)
|
||||
try:
|
||||
skus = {item.sku for item in target_order.items if item.sku}
|
||||
if skus:
|
||||
validation = await asyncio.to_thread(
|
||||
validation_service.validate_skus, skus, conn, id_gestiuni,
|
||||
)
|
||||
if id_pol and skus:
|
||||
id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
|
||||
cota_tva = float(app_settings.get("discount_vat") or 21)
|
||||
await asyncio.to_thread(
|
||||
validation_service.pre_validate_order_prices,
|
||||
[target_order], app_settings, conn, id_pol, id_pol_productie,
|
||||
id_gestiuni, validation, None, cota_tva,
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Retry pre-validation failed for {order_number}: {e}")
|
||||
await sqlite_service.upsert_order(
|
||||
sync_run_id="retry",
|
||||
order_number=order_number,
|
||||
order_date=order_date_str,
|
||||
customer_name=customer_name,
|
||||
status=OrderStatus.ERROR.value,
|
||||
error_message=f"Retry pre-validation failed: {e}",
|
||||
)
|
||||
return {"success": False, "message": f"Eroare pre-validare preturi: {e}"}
|
||||
finally:
|
||||
await asyncio.to_thread(database.pool.release, conn)
|
||||
|
||||
try:
|
||||
result = await asyncio.to_thread(
|
||||
import_service.import_single_order,
|
||||
target_order, id_pol=id_pol, id_sectie=id_sectie,
|
||||
app_settings=app_settings, id_gestiuni=id_gestiuni
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Retry import failed for {order_number}: {e}")
|
||||
await sqlite_service.upsert_order(
|
||||
sync_run_id="retry",
|
||||
order_number=order_number,
|
||||
order_date=order_date_str,
|
||||
customer_name=customer_name,
|
||||
status=OrderStatus.ERROR.value,
|
||||
error_message=f"Retry failed: {e}",
|
||||
)
|
||||
return {"success": False, "message": f"Eroare import: {e}"}
|
||||
|
||||
order_items_data = [
|
||||
{
|
||||
"sku": item.sku, "product_name": item.name,
|
||||
"quantity": item.quantity, "price": item.price,
|
||||
"baseprice": item.baseprice, "vat": item.vat,
|
||||
"mapping_status": "mapped" if item.sku in validation["mapped"] else "direct",
|
||||
"codmat": None, "id_articol": None, "cantitate_roa": None,
|
||||
}
|
||||
for item in target_order.items
|
||||
]
|
||||
|
||||
if result.get("success"):
|
||||
await sqlite_service.upsert_order(
|
||||
sync_run_id="retry",
|
||||
order_number=order_number,
|
||||
order_date=order_date_str,
|
||||
customer_name=customer_name,
|
||||
status=OrderStatus.IMPORTED.value,
|
||||
id_comanda=result.get("id_comanda"),
|
||||
id_partener=result.get("id_partener"),
|
||||
error_message=None,
|
||||
)
|
||||
if result.get("id_adresa_facturare") or result.get("id_adresa_livrare"):
|
||||
await sqlite_service.update_import_order_addresses(
|
||||
order_number=order_number,
|
||||
id_adresa_facturare=result.get("id_adresa_facturare"),
|
||||
id_adresa_livrare=result.get("id_adresa_livrare"),
|
||||
)
|
||||
await sqlite_service.add_order_items(order_number, order_items_data)
|
||||
logger.info(f"Retry successful for order {order_number} → IMPORTED ({len(order_items_data)} items)")
|
||||
return {"success": True, "message": "Comanda reimportata cu succes", "status": OrderStatus.IMPORTED.value}
|
||||
else:
|
||||
error = result.get("error", "Unknown error")
|
||||
await sqlite_service.upsert_order(
|
||||
sync_run_id="retry",
|
||||
order_number=order_number,
|
||||
order_date=order_date_str,
|
||||
customer_name=customer_name,
|
||||
status=OrderStatus.ERROR.value,
|
||||
error_message=f"Retry: {error}",
|
||||
)
|
||||
await sqlite_service.add_order_items(order_number, order_items_data)
|
||||
return {"success": False, "message": f"Import esuat: {error}", "status": OrderStatus.ERROR.value}
|
||||
|
||||
|
||||
async def retry_single_order(order_number: str, app_settings: dict) -> dict:
|
||||
"""Re-download and re-import a single order from GoMag.
|
||||
|
||||
Steps:
|
||||
1. Read order from SQLite to get order_date / customer_name
|
||||
2. Check sync lock (no retry during active sync)
|
||||
3. Download narrow date range from GoMag (order_date ± 1 day)
|
||||
4. Find the specific order in downloaded data
|
||||
5. Run import_single_order()
|
||||
6. Update status in SQLite
|
||||
|
||||
Returns: {"success": bool, "message": str, "status": str|None}
|
||||
"""
|
||||
from . import sqlite_service, sync_service
|
||||
|
||||
# Check sync lock
|
||||
if sync_service._sync_lock.locked():
|
||||
return {"success": False, "message": "Sync in curs — asteapta finalizarea"}
|
||||
|
||||
# Get order from SQLite
|
||||
detail = await sqlite_service.get_order_detail(order_number)
|
||||
if not detail:
|
||||
return {"success": False, "message": "Comanda nu a fost gasita"}
|
||||
|
||||
order_data = detail["order"]
|
||||
status = order_data.get("status", "")
|
||||
if status not in (OrderStatus.ERROR.value, OrderStatus.SKIPPED.value,
|
||||
OrderStatus.DELETED_IN_ROA.value, OrderStatus.MALFORMED.value):
|
||||
return {"success": False,
|
||||
"message": f"Retry permis doar pentru ERROR/SKIPPED/DELETED_IN_ROA/MALFORMED (status actual: {status})"}
|
||||
|
||||
order_date_str = order_data.get("order_date", "")
|
||||
customer_name = order_data.get("customer_name", "")
|
||||
|
||||
return await _download_and_reimport(order_number, order_date_str, customer_name, app_settings)
|
||||
|
||||
|
||||
async def resync_single_order(order_number: str, app_settings: dict) -> dict:
|
||||
"""Soft-delete an imported order from Oracle then re-import it from GoMag.
|
||||
|
||||
Steps:
|
||||
1. Check sync lock
|
||||
2. Load order from SQLite
|
||||
3. Validate status is IMPORTED/ALREADY_IMPORTED with id_comanda
|
||||
4. Invoice safety gate (check Oracle for invoices)
|
||||
5. Soft-delete from Oracle
|
||||
6. Mark DELETED_IN_ROA in SQLite
|
||||
7. Re-import via _download_and_reimport
|
||||
|
||||
Returns: {"success": bool, "message": str, "status": str|None}
|
||||
"""
|
||||
from . import sqlite_service, sync_service, import_service, invoice_service
|
||||
from .. import database
|
||||
|
||||
# Check sync lock
|
||||
if sync_service._sync_lock.locked():
|
||||
return {"success": False, "message": "Sync in curs — asteapta finalizarea"}
|
||||
|
||||
# Get order from SQLite
|
||||
detail = await sqlite_service.get_order_detail(order_number)
|
||||
if not detail:
|
||||
return {"success": False, "message": "Comanda nu a fost gasita"}
|
||||
|
||||
order_data = detail["order"]
|
||||
status = order_data.get("status", "")
|
||||
id_comanda = order_data.get("id_comanda")
|
||||
|
||||
if status not in (OrderStatus.IMPORTED.value, OrderStatus.ALREADY_IMPORTED.value) or not id_comanda:
|
||||
return {"success": False, "message": f"Resync permis doar pentru IMPORTED/ALREADY_IMPORTED cu id_comanda (status actual: {status})"}
|
||||
|
||||
# Invoice safety gate
|
||||
if database.pool is None:
|
||||
return {"success": False, "message": "Oracle indisponibil"}
|
||||
|
||||
if order_data.get("factura_numar"):
|
||||
return {"success": False, "message": "Comanda este facturata"}
|
||||
|
||||
try:
|
||||
invoice_result = await asyncio.to_thread(
|
||||
invoice_service.check_invoices_for_orders, [id_comanda]
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Invoice check failed for {order_number}: {e}")
|
||||
return {"success": False, "message": "Nu se poate verifica factura — Oracle indisponibil"}
|
||||
|
||||
if invoice_result.get(id_comanda):
|
||||
return {"success": False, "message": "Comanda este facturata"}
|
||||
|
||||
# Soft-delete from Oracle
|
||||
try:
|
||||
delete_result = await asyncio.to_thread(
|
||||
import_service.soft_delete_order_in_roa, id_comanda
|
||||
)
|
||||
if not delete_result.get("success"):
|
||||
return {"success": False, "message": f"Eroare stergere din Oracle: {delete_result.get('error', 'Unknown')}"}
|
||||
except Exception as e:
|
||||
logger.error(f"Soft-delete failed for {order_number} (id_comanda={id_comanda}): {e}")
|
||||
return {"success": False, "message": f"Eroare stergere din Oracle: {e}"}
|
||||
|
||||
# Mark deleted in SQLite
|
||||
await sqlite_service.mark_order_deleted_in_roa(order_number)
|
||||
|
||||
order_date_str = order_data.get("order_date", "")
|
||||
customer_name = order_data.get("customer_name", "")
|
||||
|
||||
# Re-import
|
||||
reimport_result = await _download_and_reimport(order_number, order_date_str, customer_name, app_settings)
|
||||
if not reimport_result.get("success"):
|
||||
logger.warning(f"Resync: order {order_number} deleted from Oracle but reimport failed")
|
||||
return {
|
||||
"success": False,
|
||||
"message": "Comanda stearsa din Oracle dar reimportul a esuat — foloseste Reimporta pentru a reincerca",
|
||||
}
|
||||
|
||||
return reimport_result
|
||||
|
||||
|
||||
async def delete_single_order(order_number: str) -> dict:
|
||||
"""Soft-delete an imported order from Oracle without re-importing.
|
||||
|
||||
Same invoice safety gate as resync_single_order.
|
||||
|
||||
Returns: {"success": bool, "message": str}
|
||||
"""
|
||||
from . import sqlite_service, sync_service, import_service, invoice_service
|
||||
from .. import database
|
||||
|
||||
# Check sync lock
|
||||
if sync_service._sync_lock.locked():
|
||||
return {"success": False, "message": "Sync in curs — asteapta finalizarea"}
|
||||
|
||||
# Get order from SQLite
|
||||
detail = await sqlite_service.get_order_detail(order_number)
|
||||
if not detail:
|
||||
return {"success": False, "message": "Comanda nu a fost gasita"}
|
||||
|
||||
order_data = detail["order"]
|
||||
status = order_data.get("status", "")
|
||||
id_comanda = order_data.get("id_comanda")
|
||||
|
||||
if status not in (OrderStatus.IMPORTED.value, OrderStatus.ALREADY_IMPORTED.value) or not id_comanda:
|
||||
return {"success": False, "message": f"Stergere permisa doar pentru IMPORTED/ALREADY_IMPORTED cu id_comanda (status actual: {status})"}
|
||||
|
||||
# Invoice safety gate
|
||||
if database.pool is None:
|
||||
return {"success": False, "message": "Oracle indisponibil"}
|
||||
|
||||
if order_data.get("factura_numar"):
|
||||
return {"success": False, "message": "Comanda este facturata"}
|
||||
|
||||
try:
|
||||
invoice_result = await asyncio.to_thread(
|
||||
invoice_service.check_invoices_for_orders, [id_comanda]
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Invoice check failed for {order_number}: {e}")
|
||||
return {"success": False, "message": "Nu se poate verifica factura — Oracle indisponibil"}
|
||||
|
||||
if invoice_result.get(id_comanda):
|
||||
return {"success": False, "message": "Comanda este facturata"}
|
||||
|
||||
# Soft-delete from Oracle
|
||||
try:
|
||||
delete_result = await asyncio.to_thread(
|
||||
import_service.soft_delete_order_in_roa, id_comanda
|
||||
)
|
||||
if not delete_result.get("success"):
|
||||
return {"success": False, "message": f"Eroare stergere din Oracle: {delete_result.get('error', 'Unknown')}"}
|
||||
except Exception as e:
|
||||
logger.error(f"Soft-delete failed for {order_number} (id_comanda={id_comanda}): {e}")
|
||||
return {"success": False, "message": f"Eroare stergere din Oracle: {e}"}
|
||||
|
||||
# Mark deleted in SQLite
|
||||
await sqlite_service.mark_order_deleted_in_roa(order_number)
|
||||
|
||||
logger.info(f"Order {order_number} (id_comanda={id_comanda}) deleted from ROA")
|
||||
return {"success": True, "message": "Comanda stearsa din ROA"}
|
||||
@@ -1,71 +0,0 @@
|
||||
import logging
|
||||
from apscheduler.schedulers.asyncio import AsyncIOScheduler
|
||||
from apscheduler.triggers.interval import IntervalTrigger
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
_scheduler = None
|
||||
_is_running = False
|
||||
|
||||
|
||||
def init_scheduler():
|
||||
"""Initialize the APScheduler instance."""
|
||||
global _scheduler
|
||||
_scheduler = AsyncIOScheduler()
|
||||
logger.info("Scheduler initialized")
|
||||
|
||||
|
||||
def start_scheduler(interval_minutes: int = 10):
|
||||
"""Start the scheduler with the given interval."""
|
||||
global _is_running
|
||||
if _scheduler is None:
|
||||
init_scheduler()
|
||||
|
||||
# Remove existing job if any
|
||||
if _scheduler.get_job("sync_job"):
|
||||
_scheduler.remove_job("sync_job")
|
||||
|
||||
from . import sync_service
|
||||
|
||||
_scheduler.add_job(
|
||||
sync_service.run_sync,
|
||||
trigger=IntervalTrigger(minutes=interval_minutes),
|
||||
id="sync_job",
|
||||
name="GoMag Sync",
|
||||
replace_existing=True
|
||||
)
|
||||
|
||||
if not _scheduler.running:
|
||||
_scheduler.start()
|
||||
|
||||
_is_running = True
|
||||
logger.info(f"Scheduler started with interval {interval_minutes}min")
|
||||
|
||||
|
||||
def stop_scheduler():
|
||||
"""Stop the scheduler."""
|
||||
global _is_running
|
||||
if _scheduler and _scheduler.running:
|
||||
if _scheduler.get_job("sync_job"):
|
||||
_scheduler.remove_job("sync_job")
|
||||
_is_running = False
|
||||
logger.info("Scheduler stopped")
|
||||
|
||||
|
||||
def shutdown_scheduler():
|
||||
"""Shutdown the scheduler completely."""
|
||||
global _scheduler, _is_running
|
||||
if _scheduler and _scheduler.running:
|
||||
_scheduler.shutdown(wait=False)
|
||||
_scheduler = None
|
||||
_is_running = False
|
||||
|
||||
|
||||
def get_scheduler_status():
|
||||
"""Get current scheduler status."""
|
||||
job = _scheduler.get_job("sync_job") if _scheduler else None
|
||||
return {
|
||||
"enabled": _is_running,
|
||||
"next_run": job.next_run_time.isoformat() if job and job.next_run_time else None,
|
||||
"interval_minutes": int(job.trigger.interval.total_seconds() / 60) if job else None
|
||||
}
|
||||
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@@ -1,828 +0,0 @@
|
||||
import asyncio
|
||||
import logging
|
||||
from datetime import datetime
|
||||
from .. import database
|
||||
from . import sqlite_service
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def validate_structural(order: dict) -> tuple[bool, str | None, str | None]:
|
||||
"""Pre-flight structural validator used by save_orders_batch.
|
||||
|
||||
Returns (True, None, None) on pass, (False, error_type, error_msg) on fail.
|
||||
Rules are intentionally minimal — only catches malformed payloads that
|
||||
would crash downstream inserts. Semantic checks (SKU existence, price
|
||||
comparison, etc.) are handled in later phases.
|
||||
"""
|
||||
if not isinstance(order, dict):
|
||||
return False, "MISSING_FIELD", f"order is not a dict: {type(order).__name__}"
|
||||
|
||||
order_number = order.get("order_number")
|
||||
if order_number is None or str(order_number).strip() == "":
|
||||
return False, "MISSING_FIELD", "order_number is missing or empty"
|
||||
|
||||
raw_date = order.get("order_date")
|
||||
if raw_date in (None, ""):
|
||||
return False, "INVALID_DATE", "order_date is missing or empty"
|
||||
if isinstance(raw_date, datetime):
|
||||
pass
|
||||
elif isinstance(raw_date, str):
|
||||
parsed = None
|
||||
for fmt in ("%Y-%m-%d %H:%M:%S", "%Y-%m-%dT%H:%M:%S", "%Y-%m-%d"):
|
||||
try:
|
||||
parsed = datetime.strptime(raw_date, fmt)
|
||||
break
|
||||
except ValueError:
|
||||
continue
|
||||
if parsed is None:
|
||||
try:
|
||||
parsed = datetime.fromisoformat(raw_date.replace("Z", "+00:00"))
|
||||
except ValueError:
|
||||
return False, "INVALID_DATE", f"order_date not parseable: {raw_date!r}"
|
||||
else:
|
||||
return False, "INVALID_DATE", f"order_date wrong type: {type(raw_date).__name__}"
|
||||
|
||||
items = order.get("items")
|
||||
if not items or not isinstance(items, list):
|
||||
return False, "EMPTY_ITEMS", "items missing or not a non-empty list"
|
||||
|
||||
for idx, item in enumerate(items):
|
||||
if not isinstance(item, dict):
|
||||
return False, "EMPTY_ITEMS", f"item[{idx}] is not a dict"
|
||||
|
||||
qty_raw = item.get("quantity")
|
||||
if qty_raw is None or qty_raw == "":
|
||||
return False, "INVALID_QUANTITY", f"item[{idx}] quantity missing"
|
||||
try:
|
||||
qty = float(qty_raw)
|
||||
except (TypeError, ValueError):
|
||||
return False, "INVALID_QUANTITY", f"item[{idx}] quantity not numeric: {qty_raw!r}"
|
||||
if qty <= 0:
|
||||
return False, "INVALID_QUANTITY", f"item[{idx}] quantity not > 0: {qty}"
|
||||
|
||||
price_raw = item.get("price")
|
||||
if price_raw is None or price_raw == "":
|
||||
return False, "INVALID_PRICE", f"item[{idx}] price missing"
|
||||
try:
|
||||
float(price_raw)
|
||||
except (TypeError, ValueError):
|
||||
return False, "INVALID_PRICE", f"item[{idx}] price not numeric: {price_raw!r}"
|
||||
|
||||
return True, None, None
|
||||
|
||||
|
||||
async def reconcile_unresolved_missing_skus(conn=None) -> dict:
|
||||
"""Revalidate all resolved=0 SKUs in missing_skus against Oracle.
|
||||
Fail-soft: logs warning and returns zero if Oracle is unavailable.
|
||||
Returns {"checked": N, "resolved": M, "error": str|None}.
|
||||
"""
|
||||
db = await sqlite_service.get_sqlite()
|
||||
try:
|
||||
cursor = await db.execute("SELECT sku FROM missing_skus WHERE resolved = 0")
|
||||
rows = await cursor.fetchall()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
if not rows:
|
||||
return {"checked": 0, "resolved": 0, "error": None}
|
||||
|
||||
unresolved_set = {row[0] for row in rows}
|
||||
|
||||
try:
|
||||
result = await asyncio.to_thread(validate_skus, unresolved_set, conn)
|
||||
except Exception as e:
|
||||
logger.warning(f"reconcile_unresolved_missing_skus: Oracle unavailable — {e}")
|
||||
return {"checked": len(unresolved_set), "resolved": 0, "error": str(e)}
|
||||
|
||||
resolved_set = result["mapped"] | result["direct"]
|
||||
resolved_count = await sqlite_service.resolve_missing_skus_batch(resolved_set)
|
||||
logger.info(f"reconcile_unresolved_missing_skus: checked={len(unresolved_set)}, resolved={resolved_count}")
|
||||
return {"checked": len(unresolved_set), "resolved": resolved_count, "error": None}
|
||||
|
||||
def check_orders_in_roa(min_date, conn) -> dict:
|
||||
"""Check which orders already exist in Oracle COMENZI by date range.
|
||||
Returns: {comanda_externa: id_comanda} for all existing orders.
|
||||
Much faster than IN-clause batching — single query using date index.
|
||||
"""
|
||||
if conn is None:
|
||||
return {}
|
||||
|
||||
existing = {}
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
SELECT comanda_externa, id_comanda FROM COMENZI
|
||||
WHERE data_comanda >= :min_date
|
||||
AND comanda_externa IS NOT NULL AND sters = 0
|
||||
""", {"min_date": min_date})
|
||||
for row in cur:
|
||||
existing[str(row[0])] = row[1]
|
||||
except Exception as e:
|
||||
logger.error(f"check_orders_in_roa failed: {e}")
|
||||
|
||||
logger.info(f"ROA order check (since {min_date}): {len(existing)} existing orders found")
|
||||
return existing
|
||||
|
||||
|
||||
def resolve_codmat_ids(codmats: set[str], id_gestiuni: list[int] = None, conn=None) -> dict[str, dict]:
|
||||
"""Resolve CODMATs to best id_articol + cont: prefers article with stock, then MAX(id_articol).
|
||||
Filters: sters=0 AND inactiv=0.
|
||||
id_gestiuni: list of warehouse IDs to check stock in, or None for all.
|
||||
Returns: {codmat: {"id_articol": int, "cont": str|None}}
|
||||
"""
|
||||
if not codmats:
|
||||
return {}
|
||||
|
||||
result = {}
|
||||
codmat_list = list(codmats)
|
||||
|
||||
# Build stoc subquery dynamically for index optimization
|
||||
if id_gestiuni:
|
||||
gest_placeholders = ",".join([f":g{k}" for k in range(len(id_gestiuni))])
|
||||
stoc_filter = f"AND s.id_gestiune IN ({gest_placeholders})"
|
||||
else:
|
||||
stoc_filter = ""
|
||||
|
||||
own_conn = conn is None
|
||||
if own_conn:
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
for i in range(0, len(codmat_list), 500):
|
||||
batch = codmat_list[i:i+500]
|
||||
placeholders = ",".join([f":c{j}" for j in range(len(batch))])
|
||||
params = {f"c{j}": cm for j, cm in enumerate(batch)}
|
||||
if id_gestiuni:
|
||||
for k, gid in enumerate(id_gestiuni):
|
||||
params[f"g{k}"] = gid
|
||||
|
||||
cur.execute(f"""
|
||||
SELECT codmat, id_articol, cont FROM (
|
||||
SELECT na.codmat, na.id_articol, na.cont,
|
||||
ROW_NUMBER() OVER (
|
||||
PARTITION BY na.codmat
|
||||
ORDER BY
|
||||
CASE WHEN EXISTS (
|
||||
SELECT 1 FROM stoc s
|
||||
WHERE s.id_articol = na.id_articol
|
||||
{stoc_filter}
|
||||
AND s.an = EXTRACT(YEAR FROM SYSDATE)
|
||||
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
|
||||
AND s.cants + s.cant - s.cante > 0
|
||||
) THEN 0 ELSE 1 END,
|
||||
na.id_articol DESC
|
||||
) AS rn
|
||||
FROM nom_articole na
|
||||
WHERE na.codmat IN ({placeholders})
|
||||
AND na.sters = 0 AND na.inactiv = 0
|
||||
) WHERE rn = 1
|
||||
""", params)
|
||||
for row in cur:
|
||||
result[row[0]] = {"id_articol": row[1], "cont": row[2]}
|
||||
finally:
|
||||
if own_conn:
|
||||
database.pool.release(conn)
|
||||
|
||||
logger.info(f"resolve_codmat_ids: {len(result)}/{len(codmats)} resolved (gestiuni={id_gestiuni})")
|
||||
return result
|
||||
|
||||
|
||||
def validate_skus(skus: set[str], conn=None, id_gestiuni: list[int] = None) -> dict:
|
||||
"""Validate a set of SKUs against Oracle.
|
||||
Returns: {mapped: set, direct: set, missing: set, direct_id_map: {codmat: {"id_articol": int, "cont": str|None}}}
|
||||
- mapped: found in ARTICOLE_TERTI (active)
|
||||
- direct: found in NOM_ARTICOLE by codmat (not in ARTICOLE_TERTI)
|
||||
- missing: not found anywhere
|
||||
- direct_id_map: {codmat: {"id_articol": int, "cont": str|None}} for direct SKUs
|
||||
"""
|
||||
if not skus:
|
||||
return {"mapped": set(), "direct": set(), "missing": set(), "direct_id_map": {}}
|
||||
|
||||
mapped = set()
|
||||
sku_list = list(skus)
|
||||
|
||||
own_conn = conn is None
|
||||
if own_conn:
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
# Check in batches of 500
|
||||
for i in range(0, len(sku_list), 500):
|
||||
batch = sku_list[i:i+500]
|
||||
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
|
||||
params = {f"s{j}": sku for j, sku in enumerate(batch)}
|
||||
|
||||
# Check ARTICOLE_TERTI
|
||||
cur.execute(f"""
|
||||
SELECT DISTINCT sku FROM ARTICOLE_TERTI
|
||||
WHERE sku IN ({placeholders}) AND activ = 1 AND sters = 0
|
||||
""", params)
|
||||
for row in cur:
|
||||
mapped.add(row[0])
|
||||
|
||||
# Resolve remaining SKUs via resolve_codmat_ids (consistent id_articol selection)
|
||||
all_remaining = [s for s in sku_list if s not in mapped]
|
||||
if all_remaining:
|
||||
direct_id_map = resolve_codmat_ids(set(all_remaining), id_gestiuni, conn)
|
||||
direct = set(direct_id_map.keys())
|
||||
else:
|
||||
direct_id_map = {}
|
||||
direct = set()
|
||||
|
||||
finally:
|
||||
if own_conn:
|
||||
database.pool.release(conn)
|
||||
|
||||
missing = skus - mapped - direct
|
||||
|
||||
logger.info(f"SKU validation: {len(mapped)} mapped, {len(direct)} direct, {len(missing)} missing")
|
||||
return {"mapped": mapped, "direct": direct, "missing": missing,
|
||||
"direct_id_map": direct_id_map}
|
||||
|
||||
def classify_orders(orders, validation_result):
|
||||
"""Classify orders as importable or skipped based on SKU validation.
|
||||
Returns: (importable_orders, skipped_orders)
|
||||
Each skipped entry is a tuple of (order, list_of_missing_skus).
|
||||
"""
|
||||
ok_skus = validation_result["mapped"] | validation_result["direct"]
|
||||
importable = []
|
||||
skipped = []
|
||||
|
||||
for order in orders:
|
||||
order_skus = {item.sku for item in order.items if item.sku}
|
||||
order_missing = order_skus - ok_skus
|
||||
|
||||
if order_missing:
|
||||
skipped.append((order, list(order_missing)))
|
||||
else:
|
||||
importable.append(order)
|
||||
|
||||
return importable, skipped
|
||||
|
||||
def _extract_id_map(direct_id_map: dict) -> dict:
|
||||
"""Extract {codmat: id_articol} from either enriched or simple format."""
|
||||
if not direct_id_map:
|
||||
return {}
|
||||
result = {}
|
||||
for cm, val in direct_id_map.items():
|
||||
if isinstance(val, dict):
|
||||
result[cm] = val["id_articol"]
|
||||
else:
|
||||
result[cm] = val
|
||||
return result
|
||||
|
||||
|
||||
def validate_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None) -> dict:
|
||||
"""Check which CODMATs have a price entry in CRM_POLITICI_PRET_ART for the given policy.
|
||||
If direct_id_map is provided, skips the NOM_ARTICOLE lookup for those CODMATs.
|
||||
Returns: {"has_price": set_of_codmats, "missing_price": set_of_codmats}
|
||||
"""
|
||||
if not codmats:
|
||||
return {"has_price": set(), "missing_price": set()}
|
||||
|
||||
codmat_to_id = _extract_id_map(direct_id_map)
|
||||
ids_with_price = set()
|
||||
|
||||
own_conn = conn is None
|
||||
if own_conn:
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
# Check which ID_ARTICOLs have a price in the policy
|
||||
id_list = list(codmat_to_id.values())
|
||||
for i in range(0, len(id_list), 500):
|
||||
batch = id_list[i:i+500]
|
||||
placeholders = ",".join([f":a{j}" for j in range(len(batch))])
|
||||
params = {f"a{j}": aid for j, aid in enumerate(batch)}
|
||||
params["id_pol"] = id_pol
|
||||
|
||||
cur.execute(f"""
|
||||
SELECT DISTINCT pa.ID_ARTICOL FROM CRM_POLITICI_PRET_ART pa
|
||||
WHERE pa.ID_POL = :id_pol AND pa.ID_ARTICOL IN ({placeholders})
|
||||
""", params)
|
||||
for row in cur:
|
||||
ids_with_price.add(row[0])
|
||||
finally:
|
||||
if own_conn:
|
||||
database.pool.release(conn)
|
||||
|
||||
# Map back to CODMATs
|
||||
has_price = {cm for cm, aid in codmat_to_id.items() if aid in ids_with_price}
|
||||
missing_price = codmats - has_price
|
||||
|
||||
logger.info(f"Price validation (policy {id_pol}): {len(has_price)} have price, {len(missing_price)} missing price")
|
||||
return {"has_price": has_price, "missing_price": missing_price}
|
||||
|
||||
def ensure_prices(codmats: set[str], id_pol: int, conn=None, direct_id_map: dict=None,
|
||||
cota_tva: float = None):
|
||||
"""Insert price 0 entries for CODMATs missing from the given price policy.
|
||||
Uses batch executemany instead of individual INSERTs.
|
||||
Relies on TRG_CRM_POLITICI_PRET_ART trigger for ID_POL_ART sequence.
|
||||
cota_tva: VAT rate from settings (e.g. 21) — used for PROC_TVAV metadata.
|
||||
"""
|
||||
if not codmats:
|
||||
return
|
||||
|
||||
proc_tvav = 1 + (cota_tva / 100) if cota_tva else 1.21
|
||||
|
||||
own_conn = conn is None
|
||||
if own_conn:
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
# Get ID_VALUTA for this policy
|
||||
cur.execute("""
|
||||
SELECT ID_VALUTA FROM CRM_POLITICI_PRETURI WHERE ID_POL = :id_pol
|
||||
""", {"id_pol": id_pol})
|
||||
row = cur.fetchone()
|
||||
if not row:
|
||||
logger.error(f"Price policy {id_pol} not found in CRM_POLITICI_PRETURI")
|
||||
return
|
||||
id_valuta = row[0]
|
||||
|
||||
# Build batch params using direct_id_map (already resolved via resolve_codmat_ids)
|
||||
batch_params = []
|
||||
codmat_id_map = _extract_id_map(direct_id_map)
|
||||
|
||||
for codmat in codmats:
|
||||
id_articol = codmat_id_map.get(codmat)
|
||||
if not id_articol:
|
||||
logger.warning(f"CODMAT {codmat} not found in NOM_ARTICOLE, skipping price insert")
|
||||
continue
|
||||
batch_params.append({
|
||||
"id_pol": id_pol,
|
||||
"id_articol": id_articol,
|
||||
"id_valuta": id_valuta,
|
||||
"proc_tvav": proc_tvav
|
||||
})
|
||||
|
||||
if batch_params:
|
||||
cur.executemany("""
|
||||
INSERT INTO CRM_POLITICI_PRET_ART
|
||||
(ID_POL, ID_ARTICOL, PRET, ID_VALUTA,
|
||||
ID_UTIL, DATAORA, PROC_TVAV, PRETFTVA, PRETCTVA)
|
||||
VALUES
|
||||
(:id_pol, :id_articol, 0, :id_valuta,
|
||||
-3, SYSDATE, :proc_tvav, 0, 0)
|
||||
""", batch_params)
|
||||
logger.info(f"Batch inserted {len(batch_params)} price entries for policy {id_pol} (PROC_TVAV={proc_tvav})")
|
||||
|
||||
conn.commit()
|
||||
finally:
|
||||
if own_conn:
|
||||
database.pool.release(conn)
|
||||
|
||||
logger.info(f"Ensure prices done: {len(codmats)} CODMATs processed for policy {id_pol}")
|
||||
|
||||
|
||||
def validate_and_ensure_prices_dual(codmats: set[str], id_pol_vanzare: int,
|
||||
id_pol_productie: int, conn, direct_id_map: dict,
|
||||
cota_tva: float = 21) -> dict[str, int]:
|
||||
"""Dual-policy price validation: assign each CODMAT to sales or production policy.
|
||||
|
||||
Logic:
|
||||
1. Check both policies in one SQL
|
||||
2. If article in one policy → use that
|
||||
3. If article in BOTH → prefer id_pol_vanzare
|
||||
4. If article in NEITHER → check cont: 341/345 → production, else → sales; insert price 0
|
||||
|
||||
Returns: codmat_policy_map = {codmat: assigned_id_pol}
|
||||
"""
|
||||
if not codmats:
|
||||
return {}
|
||||
|
||||
codmat_policy_map = {}
|
||||
id_map = _extract_id_map(direct_id_map)
|
||||
|
||||
# Collect all id_articol values we need to check
|
||||
id_to_codmats = {} # {id_articol: [codmat, ...]}
|
||||
for cm in codmats:
|
||||
aid = id_map.get(cm)
|
||||
if aid:
|
||||
id_to_codmats.setdefault(aid, []).append(cm)
|
||||
|
||||
if not id_to_codmats:
|
||||
return {}
|
||||
|
||||
# Query both policies in one SQL
|
||||
existing = {} # {id_articol: set of id_pol}
|
||||
id_list = list(id_to_codmats.keys())
|
||||
with conn.cursor() as cur:
|
||||
for i in range(0, len(id_list), 500):
|
||||
batch = id_list[i:i+500]
|
||||
placeholders = ",".join([f":a{j}" for j in range(len(batch))])
|
||||
params = {f"a{j}": aid for j, aid in enumerate(batch)}
|
||||
params["id_pol_v"] = id_pol_vanzare
|
||||
params["id_pol_p"] = id_pol_productie
|
||||
|
||||
cur.execute(f"""
|
||||
SELECT pa.ID_ARTICOL, pa.ID_POL FROM CRM_POLITICI_PRET_ART pa
|
||||
WHERE pa.ID_POL IN (:id_pol_v, :id_pol_p) AND pa.ID_ARTICOL IN ({placeholders})
|
||||
""", params)
|
||||
for row in cur:
|
||||
existing.setdefault(row[0], set()).add(row[1])
|
||||
|
||||
# Classify each codmat
|
||||
missing_vanzare = set() # CODMATs needing price 0 in sales policy
|
||||
missing_productie = set() # CODMATs needing price 0 in production policy
|
||||
|
||||
for aid, cms in id_to_codmats.items():
|
||||
pols = existing.get(aid, set())
|
||||
for cm in cms:
|
||||
if pols:
|
||||
if id_pol_vanzare in pols:
|
||||
codmat_policy_map[cm] = id_pol_vanzare
|
||||
elif id_pol_productie in pols:
|
||||
codmat_policy_map[cm] = id_pol_productie
|
||||
else:
|
||||
# Not in any policy — classify by cont
|
||||
info = direct_id_map.get(cm, {})
|
||||
cont = info.get("cont", "") if isinstance(info, dict) else ""
|
||||
cont_str = str(cont or "").strip()
|
||||
if cont_str in ("341", "345"):
|
||||
codmat_policy_map[cm] = id_pol_productie
|
||||
missing_productie.add(cm)
|
||||
else:
|
||||
codmat_policy_map[cm] = id_pol_vanzare
|
||||
missing_vanzare.add(cm)
|
||||
|
||||
# Ensure prices for missing articles in each policy
|
||||
if missing_vanzare:
|
||||
ensure_prices(missing_vanzare, id_pol_vanzare, conn, direct_id_map, cota_tva=cota_tva)
|
||||
if missing_productie:
|
||||
ensure_prices(missing_productie, id_pol_productie, conn, direct_id_map, cota_tva=cota_tva)
|
||||
|
||||
logger.info(
|
||||
f"Dual-policy: {len(codmat_policy_map)} CODMATs assigned "
|
||||
f"(vanzare={sum(1 for v in codmat_policy_map.values() if v == id_pol_vanzare)}, "
|
||||
f"productie={sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)})"
|
||||
)
|
||||
return codmat_policy_map
|
||||
|
||||
|
||||
def resolve_mapped_codmats(mapped_skus: set[str], conn,
|
||||
id_gestiuni: list[int] = None) -> dict[str, list[dict]]:
|
||||
"""For mapped SKUs, get their underlying CODMATs from ARTICOLE_TERTI + nom_articole.
|
||||
|
||||
Uses ROW_NUMBER to pick the best id_articol per (SKU, CODMAT) pair:
|
||||
prefers article with stock in current month, then MAX(id_articol) as fallback.
|
||||
This avoids inflating results when a CODMAT has multiple NOM_ARTICOLE entries.
|
||||
|
||||
Returns: {sku: [{"codmat": str, "id_articol": int, "cont": str|None, "cantitate_roa": float|None}]}
|
||||
"""
|
||||
if not mapped_skus:
|
||||
return {}
|
||||
|
||||
# Build stoc subquery gestiune filter (same pattern as resolve_codmat_ids)
|
||||
if id_gestiuni:
|
||||
gest_placeholders = ",".join([f":g{k}" for k in range(len(id_gestiuni))])
|
||||
stoc_filter = f"AND s.id_gestiune IN ({gest_placeholders})"
|
||||
else:
|
||||
stoc_filter = ""
|
||||
|
||||
result = {}
|
||||
sku_list = list(mapped_skus)
|
||||
|
||||
with conn.cursor() as cur:
|
||||
for i in range(0, len(sku_list), 500):
|
||||
batch = sku_list[i:i+500]
|
||||
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
|
||||
params = {f"s{j}": sku for j, sku in enumerate(batch)}
|
||||
if id_gestiuni:
|
||||
for k, gid in enumerate(id_gestiuni):
|
||||
params[f"g{k}"] = gid
|
||||
|
||||
cur.execute(f"""
|
||||
SELECT sku, codmat, id_articol, cont, cantitate_roa FROM (
|
||||
SELECT at.sku, at.codmat, na.id_articol, na.cont, at.cantitate_roa,
|
||||
ROW_NUMBER() OVER (
|
||||
PARTITION BY at.sku, at.codmat
|
||||
ORDER BY
|
||||
CASE WHEN EXISTS (
|
||||
SELECT 1 FROM stoc s
|
||||
WHERE s.id_articol = na.id_articol
|
||||
{stoc_filter}
|
||||
AND s.an = EXTRACT(YEAR FROM SYSDATE)
|
||||
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
|
||||
AND s.cants + s.cant - s.cante > 0
|
||||
) THEN 0 ELSE 1 END,
|
||||
na.id_articol DESC
|
||||
) AS rn
|
||||
FROM ARTICOLE_TERTI at
|
||||
JOIN NOM_ARTICOLE na ON na.codmat = at.codmat AND na.sters = 0 AND na.inactiv = 0
|
||||
WHERE at.sku IN ({placeholders}) AND at.activ = 1 AND at.sters = 0
|
||||
) WHERE rn = 1
|
||||
""", params)
|
||||
for row in cur:
|
||||
sku = row[0]
|
||||
if sku not in result:
|
||||
result[sku] = []
|
||||
result[sku].append({
|
||||
"codmat": row[1],
|
||||
"id_articol": row[2],
|
||||
"cont": row[3],
|
||||
"cantitate_roa": row[4]
|
||||
})
|
||||
|
||||
logger.info(f"resolve_mapped_codmats: {len(result)} SKUs → {sum(len(v) for v in result.values())} CODMATs")
|
||||
return result
|
||||
|
||||
|
||||
def pre_validate_order_prices(orders, app_settings: dict, conn, id_pol: int,
|
||||
id_pol_productie: int = None,
|
||||
id_gestiuni: list[int] = None,
|
||||
validation: dict = None,
|
||||
log_callback=None,
|
||||
cota_tva: float = 21) -> dict:
|
||||
"""Pre-validate prices for orders before importing them via PACK_IMPORT_COMENZI.
|
||||
|
||||
Auto-inserts PRET=0 rows in CRM_POLITICI_PRET_ART for missing CODMATs so
|
||||
PL/SQL adauga_articol_comanda doesn't raise COM-001. Mutates
|
||||
app_settings["_codmat_policy_map"] for build_articles_json routing.
|
||||
|
||||
Used by both bulk sync (sync_service.run_sync) and retry (retry_service).
|
||||
|
||||
Args:
|
||||
orders: list of orders to scan for SKUs/CODMATs
|
||||
app_settings: mutated with _codmat_policy_map (SKU/CODMAT → id_pol)
|
||||
conn: Oracle connection (caller manages lifecycle)
|
||||
id_pol: default sales price policy
|
||||
id_pol_productie: production policy for cont 341/345 (None = single-policy)
|
||||
id_gestiuni: gestiune filter for resolve_mapped_codmats
|
||||
validation: output of validate_skus; computed internally if None
|
||||
log_callback: optional Callable[[str], None] for progress messages
|
||||
cota_tva: VAT rate for PROC_TVAV metadata (default 21)
|
||||
|
||||
Returns: {"codmat_policy_map": dict, "kit_missing": dict, "validation": dict}
|
||||
- codmat_policy_map: {codmat_or_sku: id_pol}
|
||||
- kit_missing: {sku: [missing_codmats]} for kits with unprice components
|
||||
- validation: validate_skus result (for caller convenience)
|
||||
"""
|
||||
log = log_callback or (lambda _msg: None)
|
||||
|
||||
if not orders:
|
||||
return {"codmat_policy_map": {}, "kit_missing": {}, "validation": validation or {}}
|
||||
|
||||
if validation is None:
|
||||
all_skus = {item.sku for o in orders for item in o.items if item.sku}
|
||||
validation = validate_skus(all_skus, conn, id_gestiuni)
|
||||
|
||||
log("Validare preturi...")
|
||||
|
||||
# Direct CODMATs (SKU exists in NOM_ARTICOLE without ARTICOLE_TERTI mapping)
|
||||
all_codmats = set()
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
if item.sku in validation["mapped"]:
|
||||
continue
|
||||
if item.sku in validation["direct"]:
|
||||
all_codmats.add(item.sku)
|
||||
|
||||
codmat_policy_map = {}
|
||||
|
||||
if all_codmats:
|
||||
if id_pol_productie:
|
||||
codmat_policy_map = validate_and_ensure_prices_dual(
|
||||
all_codmats, id_pol, id_pol_productie,
|
||||
conn, validation.get("direct_id_map"), cota_tva=cota_tva,
|
||||
)
|
||||
log(f"Politici duale: "
|
||||
f"{sum(1 for v in codmat_policy_map.values() if v == id_pol)} vanzare, "
|
||||
f"{sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)} productie")
|
||||
else:
|
||||
price_result = validate_prices(
|
||||
all_codmats, id_pol, conn, validation.get("direct_id_map"),
|
||||
)
|
||||
if price_result["missing_price"]:
|
||||
logger.info(
|
||||
f"Auto-adding price 0 for {len(price_result['missing_price'])} "
|
||||
f"direct articles in policy {id_pol}"
|
||||
)
|
||||
ensure_prices(
|
||||
price_result["missing_price"], id_pol,
|
||||
conn, validation.get("direct_id_map"), cota_tva=cota_tva,
|
||||
)
|
||||
|
||||
# Mapped SKUs (via ARTICOLE_TERTI)
|
||||
mapped_skus_in_orders = {
|
||||
item.sku for o in orders for item in o.items
|
||||
if item.sku in validation["mapped"]
|
||||
}
|
||||
|
||||
mapped_codmat_data = {}
|
||||
if mapped_skus_in_orders:
|
||||
mapped_codmat_data = resolve_mapped_codmats(
|
||||
mapped_skus_in_orders, conn, id_gestiuni=id_gestiuni,
|
||||
)
|
||||
mapped_id_map = {}
|
||||
for sku, entries in mapped_codmat_data.items():
|
||||
for entry in entries:
|
||||
mapped_id_map[entry["codmat"]] = {
|
||||
"id_articol": entry["id_articol"],
|
||||
"cont": entry.get("cont"),
|
||||
}
|
||||
mapped_codmats = set(mapped_id_map.keys())
|
||||
if mapped_codmats:
|
||||
if id_pol_productie:
|
||||
mapped_policy_map = validate_and_ensure_prices_dual(
|
||||
mapped_codmats, id_pol, id_pol_productie,
|
||||
conn, mapped_id_map, cota_tva=cota_tva,
|
||||
)
|
||||
codmat_policy_map.update(mapped_policy_map)
|
||||
else:
|
||||
mp_result = validate_prices(
|
||||
mapped_codmats, id_pol, conn, mapped_id_map,
|
||||
)
|
||||
if mp_result["missing_price"]:
|
||||
ensure_prices(
|
||||
mp_result["missing_price"], id_pol,
|
||||
conn, mapped_id_map, cota_tva=cota_tva,
|
||||
)
|
||||
|
||||
# Bridge SKU → policy via 1:1 mappings (build_articles_json reads by SKU)
|
||||
if codmat_policy_map and mapped_codmat_data:
|
||||
for sku, entries in mapped_codmat_data.items():
|
||||
if len(entries) == 1:
|
||||
codmat = entries[0]["codmat"]
|
||||
if codmat in codmat_policy_map:
|
||||
codmat_policy_map[sku] = codmat_policy_map[codmat]
|
||||
|
||||
if codmat_policy_map:
|
||||
app_settings["_codmat_policy_map"] = codmat_policy_map
|
||||
|
||||
# Kit component price gating
|
||||
kit_missing = {}
|
||||
kit_pricing_mode = app_settings.get("kit_pricing_mode")
|
||||
if kit_pricing_mode and mapped_codmat_data:
|
||||
kit_missing = validate_kit_component_prices(
|
||||
mapped_codmat_data, id_pol, id_pol_productie, conn,
|
||||
)
|
||||
if kit_missing:
|
||||
for sku, missing_codmats in kit_missing.items():
|
||||
log(f"Kit {sku}: prețuri lipsă pentru {', '.join(missing_codmats)}")
|
||||
|
||||
return {
|
||||
"codmat_policy_map": codmat_policy_map,
|
||||
"kit_missing": kit_missing,
|
||||
"validation": validation,
|
||||
"mapped_codmat_data": mapped_codmat_data,
|
||||
}
|
||||
|
||||
|
||||
def validate_kit_component_prices(mapped_codmat_data: dict, id_pol: int,
|
||||
id_pol_productie: int = None, conn=None) -> dict:
|
||||
"""Pre-validate that kit components have non-zero prices in crm_politici_pret_art.
|
||||
|
||||
Args:
|
||||
mapped_codmat_data: {sku: [{"codmat", "id_articol", "cont"}, ...]} from resolve_mapped_codmats
|
||||
id_pol: default sales price policy
|
||||
id_pol_productie: production price policy (for cont 341/345)
|
||||
|
||||
Returns: {sku: [missing_codmats]} for SKUs with missing prices, {} if all OK
|
||||
"""
|
||||
missing = {}
|
||||
own_conn = conn is None
|
||||
if own_conn:
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
for sku, components in mapped_codmat_data.items():
|
||||
if len(components) == 0:
|
||||
continue
|
||||
if len(components) == 1 and (components[0].get("cantitate_roa") or 1) <= 1:
|
||||
continue # True 1:1 mapping, no kit pricing needed
|
||||
sku_missing = []
|
||||
for comp in components:
|
||||
cont = str(comp.get("cont") or "").strip()
|
||||
if cont in ("341", "345") and id_pol_productie:
|
||||
pol = id_pol_productie
|
||||
else:
|
||||
pol = id_pol
|
||||
cur.execute("""
|
||||
SELECT PRET FROM crm_politici_pret_art
|
||||
WHERE id_pol = :pol AND id_articol = :id_art
|
||||
""", {"pol": pol, "id_art": comp["id_articol"]})
|
||||
row = cur.fetchone()
|
||||
if not row:
|
||||
sku_missing.append(comp["codmat"])
|
||||
if sku_missing:
|
||||
missing[sku] = sku_missing
|
||||
finally:
|
||||
if own_conn:
|
||||
database.pool.release(conn)
|
||||
return missing
|
||||
|
||||
|
||||
def compare_and_update_price(id_articol: int, id_pol: int, web_price_cu_tva: float,
|
||||
conn, tolerance: float = 0.01) -> dict | None:
|
||||
"""Compare web price with ROA price and update if different.
|
||||
|
||||
Handles PRETURI_CU_TVA flag per policy.
|
||||
Returns: {"updated": bool, "old_price": float, "new_price": float, "codmat": str} or None if no price entry.
|
||||
"""
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("SELECT PRETURI_CU_TVA FROM CRM_POLITICI_PRETURI WHERE ID_POL = :pol", {"pol": id_pol})
|
||||
pol_row = cur.fetchone()
|
||||
if not pol_row:
|
||||
return None
|
||||
preturi_cu_tva = pol_row[0] # 1 or 0
|
||||
|
||||
cur.execute("""
|
||||
SELECT PRET, PROC_TVAV, na.codmat
|
||||
FROM crm_politici_pret_art pa
|
||||
JOIN nom_articole na ON na.id_articol = pa.id_articol
|
||||
WHERE pa.id_pol = :pol AND pa.id_articol = :id_art
|
||||
""", {"pol": id_pol, "id_art": id_articol})
|
||||
row = cur.fetchone()
|
||||
if not row:
|
||||
return None
|
||||
|
||||
pret_roa, proc_tvav, codmat = row[0], row[1], row[2]
|
||||
proc_tvav = proc_tvav or 1.19
|
||||
|
||||
if preturi_cu_tva == 1:
|
||||
pret_roa_cu_tva = pret_roa
|
||||
else:
|
||||
pret_roa_cu_tva = pret_roa * proc_tvav
|
||||
|
||||
if abs(pret_roa_cu_tva - web_price_cu_tva) <= tolerance:
|
||||
return {"updated": False, "old_price": pret_roa_cu_tva, "new_price": web_price_cu_tva, "codmat": codmat}
|
||||
|
||||
if preturi_cu_tva == 1:
|
||||
new_pret = web_price_cu_tva
|
||||
else:
|
||||
new_pret = round(web_price_cu_tva / proc_tvav, 4)
|
||||
|
||||
cur.execute("""
|
||||
UPDATE crm_politici_pret_art SET PRET = :pret, DATAORA = SYSDATE
|
||||
WHERE id_pol = :pol AND id_articol = :id_art
|
||||
""", {"pret": new_pret, "pol": id_pol, "id_art": id_articol})
|
||||
conn.commit()
|
||||
|
||||
return {"updated": True, "old_price": pret_roa_cu_tva, "new_price": web_price_cu_tva, "codmat": codmat}
|
||||
|
||||
|
||||
def sync_prices_from_order(orders, mapped_codmat_data: dict, direct_id_map: dict,
|
||||
codmat_policy_map: dict, id_pol: int,
|
||||
id_pol_productie: int = None, conn=None,
|
||||
settings: dict = None) -> list:
|
||||
"""Sync prices from order items to ROA for direct/1:1 mappings.
|
||||
|
||||
Skips kit components and transport/discount CODMATs.
|
||||
Returns: list of {"codmat", "old_price", "new_price"} for updated prices.
|
||||
"""
|
||||
if settings and settings.get("price_sync_enabled") != "1":
|
||||
return []
|
||||
|
||||
transport_codmat = (settings or {}).get("transport_codmat", "")
|
||||
discount_codmat = (settings or {}).get("discount_codmat", "")
|
||||
kit_discount_codmat = (settings or {}).get("kit_discount_codmat", "")
|
||||
skip_codmats = {transport_codmat, discount_codmat, kit_discount_codmat} - {""}
|
||||
|
||||
# Build set of kit/bax SKUs (>1 component, or single component with cantitate_roa > 1)
|
||||
kit_skus = {sku for sku, comps in mapped_codmat_data.items()
|
||||
if len(comps) > 1 or (len(comps) == 1 and float(comps[0].get("cantitate_roa") or 1) != 1)}
|
||||
|
||||
updated = []
|
||||
own_conn = conn is None
|
||||
if own_conn:
|
||||
conn = database.get_oracle_connection()
|
||||
try:
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
sku = item.sku
|
||||
if not sku or sku in skip_codmats:
|
||||
continue
|
||||
if sku in kit_skus:
|
||||
continue # Don't sync prices from kit orders
|
||||
|
||||
web_price = item.price # already with TVA
|
||||
if not web_price or web_price <= 0:
|
||||
continue
|
||||
|
||||
# Determine id_articol and price policy for this SKU
|
||||
if sku in mapped_codmat_data and len(mapped_codmat_data[sku]) == 1:
|
||||
# 1:1 mapping via ARTICOLE_TERTI
|
||||
comp = mapped_codmat_data[sku][0]
|
||||
id_articol = comp["id_articol"]
|
||||
cantitate_roa = comp.get("cantitate_roa") or 1
|
||||
web_price_per_unit = web_price / cantitate_roa if cantitate_roa != 1 else web_price
|
||||
elif sku in (direct_id_map or {}):
|
||||
info = direct_id_map[sku]
|
||||
id_articol = info["id_articol"] if isinstance(info, dict) else info
|
||||
web_price_per_unit = web_price
|
||||
else:
|
||||
continue
|
||||
|
||||
pol = codmat_policy_map.get(sku, id_pol)
|
||||
result = compare_and_update_price(id_articol, pol, web_price_per_unit, conn)
|
||||
if result and result["updated"]:
|
||||
updated.append(result)
|
||||
finally:
|
||||
if own_conn:
|
||||
database.pool.release(conn)
|
||||
|
||||
return updated
|
||||
|
||||
|
||||
File diff suppressed because it is too large
Load Diff
@@ -1,724 +0,0 @@
|
||||
// ── State ─────────────────────────────────────────
|
||||
let dashPage = 1;
|
||||
let dashPerPage = 50;
|
||||
let dashSortCol = 'order_date';
|
||||
let dashSortDir = 'desc';
|
||||
let dashSearchTimeout = null;
|
||||
// Sync polling state
|
||||
let _pollInterval = null;
|
||||
let _lastSyncStatus = null;
|
||||
let _lastRunId = null;
|
||||
let _currentRunId = null;
|
||||
let _pollIntervalMs = 5000; // default, overridden from settings
|
||||
let _knownLastRunId = null; // track last_run.run_id to detect missed syncs
|
||||
let _schedulerLoading = false; // prevent onchange during programmatic load
|
||||
|
||||
// ── Init ──────────────────────────────────────────
|
||||
|
||||
document.addEventListener('DOMContentLoaded', async () => {
|
||||
await initPollInterval();
|
||||
loadSchedulerStatus();
|
||||
loadDashOrders();
|
||||
startSyncPolling();
|
||||
wireFilterBar();
|
||||
checkFirstTime();
|
||||
});
|
||||
|
||||
async function initPollInterval() {
|
||||
try {
|
||||
const data = await fetchJSON('/api/settings');
|
||||
const sec = parseInt(data.dashboard_poll_seconds) || 5;
|
||||
_pollIntervalMs = sec * 1000;
|
||||
} catch(e) {}
|
||||
}
|
||||
|
||||
// ── Smart Sync Polling ────────────────────────────
|
||||
|
||||
function startSyncPolling() {
|
||||
if (_pollInterval) clearInterval(_pollInterval);
|
||||
_pollInterval = setInterval(pollSyncStatus, _pollIntervalMs);
|
||||
pollSyncStatus(); // immediate first call
|
||||
}
|
||||
|
||||
async function pollSyncStatus() {
|
||||
try {
|
||||
const data = await fetchJSON('/api/sync/status');
|
||||
updateSyncPanel(data);
|
||||
const isRunning = data.status === 'running';
|
||||
const wasRunning = _lastSyncStatus === 'running';
|
||||
|
||||
// Detect missed sync completions via last_run.run_id change
|
||||
const newLastRunId = data.last_run?.run_id || null;
|
||||
const missedSync = !isRunning && !wasRunning && _knownLastRunId && newLastRunId && newLastRunId !== _knownLastRunId;
|
||||
_knownLastRunId = newLastRunId;
|
||||
|
||||
if (isRunning && !wasRunning) {
|
||||
// Switched to running — speed up polling
|
||||
clearInterval(_pollInterval);
|
||||
_pollInterval = setInterval(pollSyncStatus, 3000);
|
||||
} else if (!isRunning && wasRunning) {
|
||||
// Sync just completed — slow down and refresh orders
|
||||
clearInterval(_pollInterval);
|
||||
_pollInterval = setInterval(pollSyncStatus, _pollIntervalMs);
|
||||
loadDashOrders();
|
||||
} else if (missedSync) {
|
||||
// Sync completed while we weren't watching (e.g. auto-sync) — refresh orders
|
||||
loadDashOrders();
|
||||
}
|
||||
_lastSyncStatus = data.status;
|
||||
} catch (e) {
|
||||
console.warn('Sync status poll failed:', e);
|
||||
}
|
||||
}
|
||||
|
||||
function updateSyncPanel(data) {
|
||||
const dot = document.getElementById('syncStatusDot');
|
||||
const txt = document.getElementById('syncStatusText');
|
||||
const progressArea = document.getElementById('syncProgressArea');
|
||||
const progressText = document.getElementById('syncProgressText');
|
||||
const startBtn = document.getElementById('syncStartBtn');
|
||||
|
||||
if (dot) {
|
||||
dot.className = 'sync-status-dot ' + (data.status || 'idle');
|
||||
}
|
||||
const statusLabels = { running: 'A ruleaza...', idle: 'Inactiv', completed: 'Finalizat', failed: 'Eroare' };
|
||||
if (txt) txt.textContent = statusLabels[data.status] || data.status || 'Inactiv';
|
||||
if (startBtn) startBtn.disabled = data.status === 'running';
|
||||
|
||||
// Track current running sync run_id
|
||||
if (data.status === 'running' && data.run_id) {
|
||||
_currentRunId = data.run_id;
|
||||
} else {
|
||||
_currentRunId = null;
|
||||
}
|
||||
|
||||
// Live progress area
|
||||
if (progressArea) {
|
||||
progressArea.style.display = data.status === 'running' ? 'flex' : 'none';
|
||||
}
|
||||
if (progressText && data.phase_text) {
|
||||
progressText.textContent = data.phase_text;
|
||||
}
|
||||
|
||||
// Last run info
|
||||
const lr = data.last_run;
|
||||
if (lr) {
|
||||
_lastRunId = lr.run_id;
|
||||
const d = document.getElementById('lastSyncDate');
|
||||
const dur = document.getElementById('lastSyncDuration');
|
||||
const cnt = document.getElementById('lastSyncCounts');
|
||||
const st = document.getElementById('lastSyncStatus');
|
||||
if (d) d.textContent = lr.started_at ? lr.started_at.replace('T', ' ').slice(0, 16) : '\u2014';
|
||||
if (dur) dur.textContent = lr.duration_seconds ? Math.round(lr.duration_seconds) + 's' : '\u2014';
|
||||
if (cnt) {
|
||||
const newImp = lr.new_imported || 0;
|
||||
const already = lr.already_imported || 0;
|
||||
if (already > 0) {
|
||||
cnt.innerHTML = `<span class="dot dot-green me-1"></span>${newImp} noi, ${already} deja <span class="dot dot-yellow me-1"></span>${lr.skipped || 0} omise <span class="dot dot-red me-1"></span>${lr.errors || 0} erori`;
|
||||
} else {
|
||||
cnt.innerHTML = `<span class="dot dot-green me-1"></span>${lr.imported || 0} imp. <span class="dot dot-yellow me-1"></span>${lr.skipped || 0} omise <span class="dot dot-red me-1"></span>${lr.errors || 0} erori`;
|
||||
}
|
||||
}
|
||||
if (st) {
|
||||
st.textContent = lr.status === 'completed' ? '\u2713' : '\u2715';
|
||||
st.style.color = lr.status === 'completed' ? 'var(--success)' : 'var(--error)';
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async function checkFirstTime() {
|
||||
const welcomeEl = document.getElementById('welcomeCard');
|
||||
if (!welcomeEl) return;
|
||||
try {
|
||||
const data = await fetchJSON('/api/sync/status');
|
||||
if (!data.last_run) {
|
||||
welcomeEl.innerHTML = `<div class="welcome-card">
|
||||
<h5 style="font-family:var(--font-display);margin:0 0 8px">Bine ai venit!</h5>
|
||||
<p class="text-muted mb-2" style="font-size:0.875rem">Configureaza si ruleaza primul sync:</p>
|
||||
<div class="welcome-steps">
|
||||
<span class="welcome-step"><b>1.</b> <a href="${window.ROOT_PATH||''}/settings">Verifica Settings</a></span>
|
||||
<span class="welcome-step"><b>2.</b> Apasa "Start Sync"</span>
|
||||
<span class="welcome-step"><b>3.</b> <a href="${window.ROOT_PATH||''}/missing-skus">Mapeaza SKU-urile lipsa</a></span>
|
||||
</div>
|
||||
</div>`;
|
||||
welcomeEl.style.display = '';
|
||||
} else {
|
||||
welcomeEl.style.display = 'none';
|
||||
}
|
||||
} catch(e) { welcomeEl.style.display = 'none'; }
|
||||
}
|
||||
|
||||
// Wire last-sync-row click → journal (use current running sync if active)
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
document.getElementById('lastSyncRow')?.addEventListener('click', () => {
|
||||
const targetId = _currentRunId || _lastRunId;
|
||||
if (targetId) window.location = (window.ROOT_PATH || '') + '/logs?run=' + targetId;
|
||||
});
|
||||
document.getElementById('lastSyncRow')?.addEventListener('keydown', (e) => {
|
||||
const targetId = _currentRunId || _lastRunId;
|
||||
if ((e.key === 'Enter' || e.key === ' ') && targetId) {
|
||||
window.location = '/logs?run=' + targetId;
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
// ── Sync Health pill ─────────────────────────────
|
||||
|
||||
let _lastHealth = null;
|
||||
|
||||
async function pollSyncHealth() {
|
||||
try {
|
||||
const data = await fetchJSON('/api/sync/health');
|
||||
_lastHealth = data;
|
||||
renderHealthPill(data);
|
||||
} catch (e) { /* fail-soft: keep last state */ }
|
||||
}
|
||||
|
||||
function renderHealthPill(h) {
|
||||
const pill = document.getElementById('syncHealthPill');
|
||||
if (!pill) return;
|
||||
const icon = pill.querySelector('i');
|
||||
const label = pill.querySelector('.health-pill-label');
|
||||
let state = 'healthy', iconCls = 'bi-check-circle-fill', text = 'Sanatos', tooltip;
|
||||
|
||||
const recent = h.recent_phase_failures || {};
|
||||
const recentCount = Object.values(recent).reduce((a, b) => a + (b || 0), 0);
|
||||
|
||||
if (h.escalation_phase || h.last_sync_status === 'halted_escalation') {
|
||||
state = 'escalated';
|
||||
iconCls = 'bi-x-octagon-fill';
|
||||
text = 'Blocat';
|
||||
tooltip = `Blocat — faza "${h.escalation_phase || '?'}" a esuat 3 sync-uri consecutive.\n`
|
||||
+ `Ultima eroare: ${h.last_halt_reason || '—'}\n`
|
||||
+ `Click Start Sync pentru override manual.`;
|
||||
} else if (h.last_sync_status === 'failed' || recentCount > 0) {
|
||||
state = 'warning';
|
||||
iconCls = 'bi-exclamation-triangle-fill';
|
||||
text = 'Atentie';
|
||||
const topPhases = Object.entries(recent).slice(0, 3)
|
||||
.map(([p, c]) => `${p} (${c} of last 3)`).join(', ');
|
||||
tooltip = `Atentie — ${topPhases || 'sync anterior esuat'}`
|
||||
+ (h.last_halt_reason ? `\nLast error: ${h.last_halt_reason}` : '');
|
||||
} else {
|
||||
const lastAt = h.last_sync_at ? h.last_sync_at.replace('T', ' ').slice(5, 16) : 'nicio rulare';
|
||||
tooltip = `Sanatos — ultimul sync: ${lastAt}`;
|
||||
}
|
||||
|
||||
pill.className = 'health-pill ' + state;
|
||||
pill.setAttribute('aria-label', `Sync: ${text}`);
|
||||
pill.title = tooltip;
|
||||
if (icon) icon.className = 'bi ' + iconCls;
|
||||
if (label) label.textContent = text;
|
||||
}
|
||||
|
||||
function startHealthPolling() {
|
||||
pollSyncHealth();
|
||||
setInterval(pollSyncHealth, 10000);
|
||||
}
|
||||
|
||||
document.addEventListener('DOMContentLoaded', startHealthPolling);
|
||||
|
||||
// ── Sync Controls ─────────────────────────────────
|
||||
|
||||
async function startSync() {
|
||||
// Escalation override — confirm before overriding the auto-halt
|
||||
if (_lastHealth && (_lastHealth.escalation_phase
|
||||
|| _lastHealth.last_sync_status === 'halted_escalation')) {
|
||||
const phase = _lastHealth.escalation_phase || '?';
|
||||
const reason = _lastHealth.last_halt_reason || '(unknown)';
|
||||
const msg = `⚠ Sync blocat automat\n\n`
|
||||
+ `Faza "${phase}" a esuat in ultimele 3 sync-uri consecutive.\n`
|
||||
+ `Ultima eroare: ${reason}\n\n`
|
||||
+ `Repornesti oricum?`;
|
||||
if (!confirm(msg)) return;
|
||||
}
|
||||
try {
|
||||
const res = await fetch('/api/sync/start', { method: 'POST' });
|
||||
const data = await res.json();
|
||||
if (data.error) {
|
||||
alert(data.error);
|
||||
return;
|
||||
}
|
||||
// Polling will detect the running state — just speed it up immediately
|
||||
pollSyncStatus();
|
||||
pollSyncHealth();
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function stopSync() {
|
||||
try {
|
||||
await fetch('/api/sync/stop', { method: 'POST' });
|
||||
pollSyncStatus();
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function toggleScheduler() {
|
||||
const enabled = document.getElementById('schedulerToggle').checked;
|
||||
const interval = parseInt(document.getElementById('schedulerInterval').value) || 10;
|
||||
try {
|
||||
await fetch('/api/sync/schedule', {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ enabled, interval_minutes: interval })
|
||||
});
|
||||
} catch (err) {
|
||||
alert('Eroare scheduler: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function updateSchedulerInterval() {
|
||||
if (_schedulerLoading) return; // ignore programmatic changes during load
|
||||
await toggleScheduler(); // always save interval (even when disabled)
|
||||
}
|
||||
|
||||
async function loadSchedulerStatus() {
|
||||
_schedulerLoading = true;
|
||||
try {
|
||||
const res = await fetch('/api/sync/schedule');
|
||||
const data = await res.json();
|
||||
document.getElementById('schedulerToggle').checked = data.enabled || false;
|
||||
document.getElementById('schedulerInterval').value = data.interval_minutes || 10;
|
||||
} catch (err) {
|
||||
console.error('loadSchedulerStatus error:', err);
|
||||
} finally {
|
||||
_schedulerLoading = false;
|
||||
}
|
||||
}
|
||||
|
||||
// ── Filter Bar wiring ─────────────────────────────
|
||||
|
||||
function wireFilterBar() {
|
||||
// Period preset buttons
|
||||
document.querySelectorAll('.preset-btn[data-days]').forEach(btn => {
|
||||
btn.addEventListener('click', function() {
|
||||
document.querySelectorAll('.preset-btn').forEach(b => b.classList.remove('active'));
|
||||
this.classList.add('active');
|
||||
const days = this.dataset.days;
|
||||
const cr = document.getElementById('customRangeInputs');
|
||||
if (days === 'custom') {
|
||||
cr?.classList.add('visible');
|
||||
} else {
|
||||
cr?.classList.remove('visible');
|
||||
dashPage = 1;
|
||||
loadDashOrders();
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
// Custom range inputs
|
||||
['periodStart', 'periodEnd'].forEach(id => {
|
||||
document.getElementById(id)?.addEventListener('change', () => {
|
||||
const s = document.getElementById('periodStart')?.value;
|
||||
const e = document.getElementById('periodEnd')?.value;
|
||||
if (s && e) { dashPage = 1; loadDashOrders(); }
|
||||
});
|
||||
});
|
||||
|
||||
// Status pills
|
||||
document.querySelectorAll('.filter-pill[data-status]').forEach(btn => {
|
||||
btn.addEventListener('click', function () {
|
||||
document.querySelectorAll('.filter-pill[data-status]').forEach(b => b.classList.remove('active'));
|
||||
this.classList.add('active');
|
||||
dashPage = 1;
|
||||
loadDashOrders();
|
||||
});
|
||||
});
|
||||
|
||||
// Search — 300ms debounce
|
||||
document.getElementById('orderSearch')?.addEventListener('input', () => {
|
||||
clearTimeout(dashSearchTimeout);
|
||||
dashSearchTimeout = setTimeout(() => {
|
||||
dashPage = 1;
|
||||
loadDashOrders();
|
||||
}, 300);
|
||||
});
|
||||
}
|
||||
|
||||
// ── Dashboard Orders Table ────────────────────────
|
||||
|
||||
function dashSortBy(col) {
|
||||
if (dashSortCol === col) {
|
||||
dashSortDir = dashSortDir === 'asc' ? 'desc' : 'asc';
|
||||
} else {
|
||||
dashSortCol = col;
|
||||
dashSortDir = 'asc';
|
||||
}
|
||||
document.querySelectorAll('.sort-icon').forEach(span => {
|
||||
const c = span.dataset.col;
|
||||
span.textContent = c === dashSortCol ? (dashSortDir === 'asc' ? '\u2191' : '\u2193') : '';
|
||||
});
|
||||
dashPage = 1;
|
||||
loadDashOrders();
|
||||
}
|
||||
|
||||
async function loadDashOrders() {
|
||||
const activePreset = document.querySelector('.preset-btn.active');
|
||||
const periodVal = activePreset?.dataset.days || '3';
|
||||
const params = new URLSearchParams();
|
||||
|
||||
if (periodVal === 'custom') {
|
||||
const s = document.getElementById('periodStart')?.value;
|
||||
const e = document.getElementById('periodEnd')?.value;
|
||||
if (s && e) {
|
||||
params.set('period_start', s);
|
||||
params.set('period_end', e);
|
||||
params.set('period_days', '0');
|
||||
}
|
||||
} else {
|
||||
params.set('period_days', periodVal);
|
||||
}
|
||||
|
||||
const activeStatus = document.querySelector('.filter-pill.active')?.dataset.status;
|
||||
if (activeStatus && activeStatus !== 'all') params.set('status', activeStatus);
|
||||
|
||||
const search = document.getElementById('orderSearch')?.value?.trim();
|
||||
if (search) params.set('search', search);
|
||||
|
||||
params.set('page', dashPage);
|
||||
params.set('per_page', dashPerPage);
|
||||
params.set('sort_by', dashSortCol);
|
||||
params.set('sort_dir', dashSortDir);
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/dashboard/orders?${params}`);
|
||||
const data = await res.json();
|
||||
|
||||
// Update filter-pill badge counts
|
||||
const c = data.counts || {};
|
||||
const el = (id) => document.getElementById(id);
|
||||
if (el('cntAll')) el('cntAll').textContent = c.total || 0;
|
||||
if (el('cntImp')) el('cntImp').textContent = c.imported_all || c.imported || 0;
|
||||
if (el('cntSkip')) el('cntSkip').textContent = c.skipped || 0;
|
||||
if (el('cntErr')) el('cntErr').textContent = c.error || c.errors || 0;
|
||||
if (el('cntFact')) el('cntFact').textContent = c.facturate || 0;
|
||||
if (el('cntNef')) el('cntNef').textContent = c.nefacturate || c.uninvoiced || 0;
|
||||
if (el('cntCanc')) el('cntCanc').textContent = c.cancelled || 0;
|
||||
if (el('cntMal')) el('cntMal').textContent = c.malformed || 0;
|
||||
if (el('cntDiff')) el('cntDiff').textContent = c.diffs || 0;
|
||||
|
||||
// Attention card
|
||||
const attnEl = document.getElementById('attentionCard');
|
||||
if (attnEl) {
|
||||
const errors = c.error || 0;
|
||||
const unmapped = c.unresolved_skus || 0;
|
||||
const nefact = c.nefacturate || 0;
|
||||
const diffs = c.diffs || 0;
|
||||
|
||||
const incompleteAddr = c.incomplete_addresses || 0;
|
||||
const partnerMismatches = c.partner_mismatches || 0;
|
||||
|
||||
if (errors === 0 && unmapped === 0 && nefact === 0 && incompleteAddr === 0 && diffs === 0 && partnerMismatches === 0) {
|
||||
attnEl.innerHTML = '<div class="attention-card attention-ok"><i class="bi bi-check-circle"></i> Totul in ordine</div>';
|
||||
} else {
|
||||
let items = [];
|
||||
if (errors > 0) items.push(`<span class="attention-item attention-error" onclick="document.querySelector('.filter-pill[data-status=ERROR]')?.click()"><i class="bi bi-exclamation-triangle"></i> ${errors} erori import</span>`);
|
||||
if (unmapped > 0) items.push(`<span class="attention-item attention-warning" onclick="window.location='${window.ROOT_PATH||''}/missing-skus'"><i class="bi bi-puzzle"></i> ${unmapped} SKU-uri nemapate</span>`);
|
||||
if (nefact > 0) items.push(`<span class="attention-item attention-warning" onclick="document.querySelector('.filter-pill[data-status=UNINVOICED]')?.click()"><i class="bi bi-receipt"></i> ${nefact} nefacturate</span>`);
|
||||
if (c.incomplete_addresses > 0) items.push(`<span class="attention-item attention-warning"><i class="bi bi-geo-alt"></i> ${c.incomplete_addresses} adrese incomplete</span>`);
|
||||
if (diffs > 0) items.push(`<span class="attention-item attention-warning" onclick="document.querySelector('.filter-pill[data-status=DIFFS]')?.click()"><i class="bi bi-exclamation-diamond"></i> ${diffs} diferente ANAF</span>`);
|
||||
if (partnerMismatches > 0) items.push(`<span class="attention-item attention-error" onclick="document.querySelector('.filter-pill[data-status=DIFFS]')?.click()"><i class="bi bi-people"></i> ${partnerMismatches} partener schimbat</span>`);
|
||||
attnEl.innerHTML = '<div class="attention-card attention-alert">' + items.join('') + '</div>';
|
||||
}
|
||||
}
|
||||
|
||||
const tbody = document.getElementById('dashOrdersBody');
|
||||
const orders = data.orders || [];
|
||||
|
||||
if (orders.length === 0) {
|
||||
tbody.innerHTML = '<tr><td colspan="10" class="text-center text-muted py-3">Nicio comanda</td></tr>';
|
||||
} else {
|
||||
tbody.innerHTML = orders.map(o => {
|
||||
const dateStr = fmtDate(o.order_date);
|
||||
const orderTotal = o.order_total != null ? Number(o.order_total).toFixed(2) : '-';
|
||||
|
||||
return `<tr style="cursor:pointer" onclick="openDashOrderDetail('${esc(o.order_number)}')">
|
||||
<td>${statusDot(o.status)}</td>
|
||||
<td class="text-center">${invoiceDot(o)}</td>
|
||||
<td class="text-nowrap">${dateStr}</td>
|
||||
${renderClientCell(o)}
|
||||
<td><code>${esc(o.order_number)}</code>${diffDots(o)}</td>
|
||||
<td>${o.items_count || 0}</td>
|
||||
<td class="text-end text-muted">${fmtCost(o.delivery_cost)}</td>
|
||||
<td class="text-end text-muted">${fmtCost(o.discount_total)}</td>
|
||||
<td class="text-end fw-bold">${orderTotal}</td>
|
||||
<td class="kebab-dropdown" onclick="event.stopPropagation()">${(o.status === ORDER_STATUS.IMPORTED || o.status === ORDER_STATUS.ALREADY_IMPORTED) && !(o.invoice && o.invoice.facturat) ? '<div class="dropdown"><button class="btn btn-sm border-0" aria-label="Actiuni comanda" data-bs-toggle="dropdown"><i class="bi bi-three-dots-vertical"></i></button><ul class="dropdown-menu dropdown-menu-end"><li><button class="dropdown-item" onclick="dashResyncOrder(\'' + esc(o.order_number) + '\', this)"><i class="bi bi-arrow-repeat me-2"></i>Resync</button></li><li><button class="dropdown-item text-danger" onclick="dashDeleteOrder(\'' + esc(o.order_number) + '\', this)"><i class="bi bi-trash me-2"></i>Sterge din ROA</button></li></ul></div>' : ''}</td>
|
||||
</tr>`;
|
||||
}).join('');
|
||||
}
|
||||
|
||||
// Mobile flat rows
|
||||
const mobileList = document.getElementById('dashMobileList');
|
||||
if (mobileList) {
|
||||
if (orders.length === 0) {
|
||||
mobileList.innerHTML = '<div class="flat-row text-muted py-3 justify-content-center">Nicio comanda</div>';
|
||||
} else {
|
||||
mobileList.innerHTML = orders.map(o => {
|
||||
const d = o.order_date || '';
|
||||
let dateFmt = '-';
|
||||
if (d.length >= 10) {
|
||||
dateFmt = d.slice(8, 10) + '.' + d.slice(5, 7) + '.' + d.slice(2, 4);
|
||||
if (d.length >= 16) dateFmt += ' ' + d.slice(11, 16);
|
||||
}
|
||||
const name = o.customer_name || o.shipping_name || o.billing_name || '\u2014';
|
||||
const totalStr = o.order_total ? Number(o.order_total).toFixed(2) : '';
|
||||
return `<div class="flat-row" onclick="openDashOrderDetail('${esc(o.order_number)}')" style="font-size:0.875rem">
|
||||
${statusDot(o.status)}
|
||||
<span style="color:var(--text-muted)" class="text-nowrap">${dateFmt}</span>
|
||||
<span class="grow truncate fw-bold">${esc(name)}</span>
|
||||
<span class="text-nowrap">x${o.items_count || 0}${totalStr ? ' · ' + diffDots(o, true) + '<strong>' + totalStr + '</strong>' : ''}</span>
|
||||
</div>`;
|
||||
}).join('');
|
||||
}
|
||||
}
|
||||
|
||||
// Mobile segmented control
|
||||
renderMobileSegmented('dashMobileSeg', [
|
||||
{ label: 'Toate', count: c.total || 0, value: 'all', active: (activeStatus || 'all') === 'all', colorClass: 'fc-neutral' },
|
||||
{ label: 'Imp.', count: c.imported_all || c.imported || 0, value: ORDER_STATUS.IMPORTED, active: activeStatus === ORDER_STATUS.IMPORTED, colorClass: 'fc-green' },
|
||||
{ label: 'Omise', count: c.skipped || 0, value: ORDER_STATUS.SKIPPED, active: activeStatus === ORDER_STATUS.SKIPPED, colorClass: 'fc-yellow' },
|
||||
{ label: 'Erori', count: c.error || c.errors || 0, value: ORDER_STATUS.ERROR, active: activeStatus === ORDER_STATUS.ERROR, colorClass: 'fc-red' },
|
||||
{ label: 'Fact.', count: c.facturate || 0, value: 'INVOICED', active: activeStatus === 'INVOICED', colorClass: 'fc-green' },
|
||||
{ label: 'Nefact.', count: c.nefacturate || c.uninvoiced || 0, value: 'UNINVOICED', active: activeStatus === 'UNINVOICED', colorClass: 'fc-red' },
|
||||
{ label: 'Anulate', count: c.cancelled || 0, value: ORDER_STATUS.CANCELLED, active: activeStatus === ORDER_STATUS.CANCELLED, colorClass: 'fc-dark' },
|
||||
{ label: 'Def.', count: c.malformed || 0, value: ORDER_STATUS.MALFORMED, active: activeStatus === ORDER_STATUS.MALFORMED, colorClass: 'fc-orange' },
|
||||
{ label: 'Dif.', count: c.diffs || 0, value: 'DIFFS', active: activeStatus === 'DIFFS', colorClass: 'fc-orange' }
|
||||
], (val) => {
|
||||
document.querySelectorAll('.filter-pill[data-status]').forEach(b => b.classList.remove('active'));
|
||||
const pill = document.querySelector(`.filter-pill[data-status="${val}"]`);
|
||||
if (pill) pill.classList.add('active');
|
||||
dashPage = 1;
|
||||
loadDashOrders();
|
||||
});
|
||||
|
||||
// Pagination
|
||||
const pag = data.pagination || {};
|
||||
const totalPages = pag.total_pages || data.pages || 1;
|
||||
const totalOrders = (data.counts || {}).total || data.total || 0;
|
||||
|
||||
const pagOpts = { perPage: dashPerPage, perPageFn: 'dashChangePerPage', perPageOptions: [25, 50, 100, 250] };
|
||||
const pagHtml = `<small class="text-muted me-auto">${totalOrders} comenzi | Pagina ${dashPage} din ${totalPages}</small>` + renderUnifiedPagination(dashPage, totalPages, 'dashGoPage', pagOpts);
|
||||
const pagDiv = document.getElementById('dashPagination');
|
||||
if (pagDiv) pagDiv.innerHTML = pagHtml;
|
||||
const pagDivTop = document.getElementById('dashPaginationTop');
|
||||
if (pagDivTop) pagDivTop.innerHTML = pagHtml;
|
||||
|
||||
// Update sort icons
|
||||
document.querySelectorAll('.sort-icon').forEach(span => {
|
||||
const c = span.dataset.col;
|
||||
span.textContent = c === dashSortCol ? (dashSortDir === 'asc' ? '\u2191' : '\u2193') : '';
|
||||
});
|
||||
} catch (err) {
|
||||
document.getElementById('dashOrdersBody').innerHTML =
|
||||
`<tr><td colspan="10" class="text-center text-danger">${esc(err.message)}</td></tr>`;
|
||||
}
|
||||
}
|
||||
|
||||
function dashGoPage(p) {
|
||||
dashPage = p;
|
||||
loadDashOrders();
|
||||
}
|
||||
|
||||
function dashChangePerPage(val) {
|
||||
dashPerPage = parseInt(val) || 50;
|
||||
dashPage = 1;
|
||||
loadDashOrders();
|
||||
}
|
||||
|
||||
// ── Client cell with Cont tooltip (Task F4) ───────
|
||||
|
||||
function renderClientCell(order) {
|
||||
const display = (order.customer_name || order.shipping_name || '').trim();
|
||||
const billing = (order.billing_name || '').trim();
|
||||
const shipping = (order.shipping_name || '').trim();
|
||||
// PJ: invoice party (company = display) differs from shipping person
|
||||
// PF ramburs: invoice party = shipping, but billing person differs from shipping
|
||||
const isPJDiff = display && shipping && display !== shipping;
|
||||
const isPFDiff = !isPJDiff && billing && shipping && billing !== shipping;
|
||||
if (isPJDiff || isPFDiff) {
|
||||
const facturat = isPJDiff ? display : billing;
|
||||
const tip = `Facturat: ${escHtml(facturat)} · Livrare: ${escHtml(shipping)}`;
|
||||
return `<td class="tooltip-cont fw-bold" data-tooltip="${tip}">${escHtml(display)} <sup class="client-diff-indicator" aria-label="${tip}" title="${tip}">▲</sup></td>`;
|
||||
}
|
||||
return `<td class="fw-bold">${escHtml(display || billing || '\u2014')}</td>`;
|
||||
}
|
||||
|
||||
// ── Helper functions ──────────────────────────────
|
||||
|
||||
async function fetchJSON(url) {
|
||||
const res = await fetch(url);
|
||||
if (!res.ok) throw new Error(`HTTP ${res.status}`);
|
||||
return res.json();
|
||||
}
|
||||
|
||||
function escHtml(s) {
|
||||
if (s == null) return '';
|
||||
return String(s)
|
||||
.replace(/&/g, '&')
|
||||
.replace(/</g, '<')
|
||||
.replace(/>/g, '>')
|
||||
.replace(/"/g, '"')
|
||||
.replace(/'/g, ''');
|
||||
}
|
||||
|
||||
function statusLabelText(status) {
|
||||
switch ((status || '').toUpperCase()) {
|
||||
case ORDER_STATUS.IMPORTED: return 'Importat';
|
||||
case ORDER_STATUS.ALREADY_IMPORTED: return 'Deja imp.';
|
||||
case ORDER_STATUS.SKIPPED: return 'Omis';
|
||||
case ORDER_STATUS.ERROR: return 'Eroare';
|
||||
default: return esc(status);
|
||||
}
|
||||
}
|
||||
|
||||
function diffDots(o, mobile) {
|
||||
const sz = mobile ? 6 : 7;
|
||||
const ml = mobile ? 'margin-right:2px' : 'margin-left:3px';
|
||||
let d = '';
|
||||
const s = `display:inline-block;width:${sz}px;height:${sz}px;border-radius:50%;${ml};vertical-align:middle`;
|
||||
if (o.anaf_cod_fiscal_adjusted===1 ||
|
||||
(o.cod_fiscal_gomag && o.anaf_platitor_tva!==null && o.anaf_cod_fiscal_adjusted!==1 &&
|
||||
(/^RO/i.test(o.cod_fiscal_gomag)!==(o.anaf_platitor_tva===1))))
|
||||
d += `<span style="${s};background:var(--error)" title="CUI/TVA ANAF"></span>`;
|
||||
if (o.anaf_denumire_mismatch===1)
|
||||
d += `<span style="${s};background:var(--compare)" title="Denumire ANAF"></span>`;
|
||||
if (o.address_mismatch===1)
|
||||
d += `<span style="${s};background:var(--info)" title="Adresa diferita"></span>`;
|
||||
if (o.partner_mismatch===1)
|
||||
d += `<span style="${s};background:var(--warning)" title="Partener schimbat"></span>`;
|
||||
return d;
|
||||
}
|
||||
|
||||
function invoiceDot(order) {
|
||||
if (order.status !== ORDER_STATUS.IMPORTED && order.status !== ORDER_STATUS.ALREADY_IMPORTED) return '–';
|
||||
if (order.invoice && order.invoice.facturat) return '<span class="dot dot-green" style="box-shadow:none" title="Facturat"></span>';
|
||||
return '<span class="dot dot-red" style="box-shadow:none" title="Nefacturat"></span>';
|
||||
}
|
||||
|
||||
// ── Refresh Invoices ──────────────────────────────
|
||||
|
||||
async function refreshInvoices() {
|
||||
const btn = document.getElementById('btnRefreshInvoices');
|
||||
const btnM = document.getElementById('btnRefreshInvoicesMobile');
|
||||
if (btn) { btn.disabled = true; btn.textContent = '⟳ Se verifica...'; }
|
||||
if (btnM) { btnM.disabled = true; }
|
||||
try {
|
||||
const res = await fetch('/api/dashboard/refresh-invoices', { method: 'POST' });
|
||||
const data = await res.json();
|
||||
if (data.error) {
|
||||
alert('Eroare: ' + data.error);
|
||||
} else {
|
||||
loadDashOrders();
|
||||
}
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
} finally {
|
||||
if (btn) { btn.disabled = false; btn.textContent = '↻ Facturi'; }
|
||||
if (btnM) { btnM.disabled = false; }
|
||||
}
|
||||
}
|
||||
|
||||
// ── Order Detail Modal ────────────────────────────
|
||||
|
||||
async function refreshOrderAddress(orderNumber) {
|
||||
if (!orderNumber) return;
|
||||
const btn = document.getElementById('refreshAddrBtn');
|
||||
if (btn) { btn.disabled = true; btn.innerHTML = '<span class="spinner-border spinner-border-sm"></span>'; }
|
||||
try {
|
||||
const res = await fetch(`/api/orders/${orderNumber}/refresh-address`, {method: 'POST'});
|
||||
if (!res.ok) {
|
||||
const err = await res.json().catch(() => ({}));
|
||||
showToast('Eroare refresh adresă: ' + (err.detail || res.status), 'danger');
|
||||
return;
|
||||
}
|
||||
showToast('Adresă actualizată din Oracle', 'success');
|
||||
renderOrderDetailModal(orderNumber, {onQuickMap: openDashQuickMap});
|
||||
} catch (e) {
|
||||
showToast('Eroare conexiune', 'danger');
|
||||
} finally {
|
||||
if (btn) { btn.disabled = false; btn.innerHTML = '<i class="bi bi-arrow-clockwise"></i>'; }
|
||||
}
|
||||
}
|
||||
|
||||
function openDashOrderDetail(orderNumber) {
|
||||
_sharedModalQuickMapFn = openDashQuickMap;
|
||||
renderOrderDetailModal(orderNumber, {
|
||||
onQuickMap: openDashQuickMap,
|
||||
onStatusChange: loadDashOrders,
|
||||
onAfterRender: function() { /* nothing extra needed */ }
|
||||
});
|
||||
}
|
||||
|
||||
// ── Quick Map Modal (uses shared openQuickMap) ───
|
||||
|
||||
function openDashQuickMap(sku, productName, orderNumber, itemIdx) {
|
||||
const item = (window._detailItems || [])[itemIdx];
|
||||
const details = item?.codmat_details;
|
||||
const isDirect = details?.length === 1 && details[0].direct === true;
|
||||
|
||||
openQuickMap({
|
||||
sku,
|
||||
productName,
|
||||
isDirect,
|
||||
directInfo: isDirect ? { codmat: details[0].codmat, denumire: details[0].denumire } : null,
|
||||
prefill: (!isDirect && details?.length) ? details.map(d => ({ codmat: d.codmat, cantitate: d.cantitate_roa, denumire: d.denumire })) : null,
|
||||
onSave: () => {
|
||||
if (orderNumber) openDashOrderDetail(orderNumber);
|
||||
loadDashOrders();
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// ── Dashboard row action handlers ────────────────
|
||||
|
||||
async function dashResyncOrder(orderNumber, btn) {
|
||||
// Close dropdown immediately
|
||||
const dd = btn.closest('.dropdown-menu');
|
||||
if (dd) bootstrap.Dropdown.getInstance(dd.previousElementSibling)?.hide();
|
||||
// Find the table row for visual feedback
|
||||
const row = document.querySelector(`tr[data-order="${orderNumber}"]`) ||
|
||||
btn.closest('tr');
|
||||
try {
|
||||
if (row) row.style.opacity = '0.5';
|
||||
const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/resync`, { method: 'POST' });
|
||||
const data = await res.json();
|
||||
if (data.success) {
|
||||
loadDashOrders();
|
||||
} else {
|
||||
if (row) row.style.opacity = '';
|
||||
alert(data.message || 'Eroare la resync');
|
||||
}
|
||||
} catch (err) {
|
||||
if (row) row.style.opacity = '';
|
||||
alert('Eroare conexiune la resync');
|
||||
}
|
||||
}
|
||||
|
||||
async function dashDeleteOrder(orderNumber, btn) {
|
||||
// Close dropdown immediately
|
||||
const dd = btn.closest('.dropdown-menu');
|
||||
if (dd) bootstrap.Dropdown.getInstance(dd.previousElementSibling)?.hide();
|
||||
// Confirm before delete
|
||||
if (!confirm(`Stergi comanda ${orderNumber} din ROA?`)) return;
|
||||
// Find the table row for visual feedback
|
||||
const row = document.querySelector(`tr[data-order="${orderNumber}"]`) ||
|
||||
btn.closest('tr');
|
||||
try {
|
||||
if (row) row.style.opacity = '0.5';
|
||||
const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/delete`, { method: 'POST' });
|
||||
const data = await res.json();
|
||||
if (data.success) {
|
||||
loadDashOrders();
|
||||
} else {
|
||||
if (row) row.style.opacity = '';
|
||||
alert(data.message || 'Eroare la stergere');
|
||||
}
|
||||
} catch (err) {
|
||||
if (row) row.style.opacity = '';
|
||||
alert('Eroare conexiune la stergere');
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,396 +0,0 @@
|
||||
// logs.js - Structured order viewer with text log fallback
|
||||
|
||||
let currentRunId = null;
|
||||
let runsPage = 1;
|
||||
let logPollTimer = null;
|
||||
let currentFilter = 'all';
|
||||
let ordersPage = 1;
|
||||
let ordersSortColumn = 'order_date';
|
||||
let ordersSortDirection = 'desc';
|
||||
|
||||
function fmtDuration(startedAt, finishedAt) {
|
||||
if (!startedAt || !finishedAt) return '-';
|
||||
const diffMs = new Date(finishedAt) - new Date(startedAt);
|
||||
if (isNaN(diffMs) || diffMs < 0) return '-';
|
||||
const secs = Math.round(diffMs / 1000);
|
||||
if (secs < 60) return secs + 's';
|
||||
return Math.floor(secs / 60) + 'm ' + (secs % 60) + 's';
|
||||
}
|
||||
|
||||
function runStatusBadge(status) {
|
||||
switch ((status || '').toLowerCase()) {
|
||||
case 'completed': return '<span style="color:var(--success);font-weight:600">completed</span>';
|
||||
case 'running': return '<span style="color:var(--info);font-weight:600">running</span>';
|
||||
case 'failed': return '<span style="color:var(--error);font-weight:600">failed</span>';
|
||||
default: return `<span style="font-weight:600">${esc(status)}</span>`;
|
||||
}
|
||||
}
|
||||
|
||||
function logStatusText(status) {
|
||||
switch ((status || '').toUpperCase()) {
|
||||
case ORDER_STATUS.IMPORTED: return 'Importat';
|
||||
case ORDER_STATUS.ALREADY_IMPORTED: return 'Deja imp.';
|
||||
case ORDER_STATUS.SKIPPED: return 'Omis';
|
||||
case ORDER_STATUS.ERROR: return 'Eroare';
|
||||
default: return esc(status);
|
||||
}
|
||||
}
|
||||
|
||||
function logsGoPage(p) { loadRunOrders(currentRunId, null, p); }
|
||||
|
||||
// ── Runs Dropdown ────────────────────────────────
|
||||
|
||||
async function loadRuns() {
|
||||
// Load all recent runs for dropdown
|
||||
try {
|
||||
const res = await fetch(`/api/sync/history?page=1&per_page=100`);
|
||||
if (!res.ok) throw new Error('HTTP ' + res.status);
|
||||
const data = await res.json();
|
||||
const runs = data.runs || [];
|
||||
|
||||
const dd = document.getElementById('runsDropdown');
|
||||
if (runs.length === 0) {
|
||||
dd.innerHTML = '<option value="">Niciun sync run</option>';
|
||||
} else {
|
||||
dd.innerHTML = '<option value="">-- Selecteaza un run --</option>' +
|
||||
runs.map(r => {
|
||||
const started = r.started_at ? new Date(r.started_at).toLocaleString('ro-RO', {day:'2-digit',month:'2-digit',year:'numeric',hour:'2-digit',minute:'2-digit'}) : '?';
|
||||
const st = (r.status || '').toUpperCase();
|
||||
const statusEmoji = st === 'COMPLETED' ? '✓' : st === 'RUNNING' ? '⟳' : '✗';
|
||||
const newImp = r.new_imported || 0;
|
||||
const already = r.already_imported || 0;
|
||||
const imp = r.imported || 0;
|
||||
const skip = r.skipped || 0;
|
||||
const err = r.errors || 0;
|
||||
const impLabel = already > 0 ? `${newImp} noi, ${already} deja` : `${imp} imp`;
|
||||
const label = `${started} — ${statusEmoji} ${r.status} (${impLabel}, ${skip} skip, ${err} err)`;
|
||||
const selected = r.run_id === currentRunId ? 'selected' : '';
|
||||
return `<option value="${esc(r.run_id)}" ${selected}>${esc(label)}</option>`;
|
||||
}).join('');
|
||||
}
|
||||
const ddMobile = document.getElementById('runsDropdownMobile');
|
||||
if (ddMobile) ddMobile.innerHTML = dd.innerHTML;
|
||||
} catch (err) {
|
||||
const dd = document.getElementById('runsDropdown');
|
||||
dd.innerHTML = `<option value="">Eroare: ${esc(err.message)}</option>`;
|
||||
}
|
||||
}
|
||||
|
||||
// ── Run Selection ────────────────────────────────
|
||||
|
||||
async function selectRun(runId) {
|
||||
if (logPollTimer) { clearInterval(logPollTimer); logPollTimer = null; }
|
||||
|
||||
currentRunId = runId;
|
||||
currentFilter = 'all';
|
||||
ordersPage = 1;
|
||||
|
||||
const url = new URL(window.location);
|
||||
if (runId) { url.searchParams.set('run', runId); } else { url.searchParams.delete('run'); }
|
||||
history.replaceState(null, '', url);
|
||||
|
||||
// Sync dropdown selection
|
||||
const dd = document.getElementById('runsDropdown');
|
||||
if (dd && dd.value !== runId) dd.value = runId;
|
||||
const ddMobile = document.getElementById('runsDropdownMobile');
|
||||
if (ddMobile && ddMobile.value !== runId) ddMobile.value = runId;
|
||||
|
||||
const emptyState = document.getElementById('logEmptyState');
|
||||
if (!runId) {
|
||||
document.getElementById('logViewerSection').style.display = 'none';
|
||||
if (emptyState) emptyState.style.display = '';
|
||||
return;
|
||||
}
|
||||
|
||||
document.getElementById('logViewerSection').style.display = '';
|
||||
if (emptyState) emptyState.style.display = 'none';
|
||||
const logRunIdEl = document.getElementById('logRunId'); if (logRunIdEl) logRunIdEl.textContent = runId;
|
||||
document.getElementById('logStatusBadge').innerHTML = '...';
|
||||
document.getElementById('textLogSection').style.display = 'none';
|
||||
|
||||
await loadRunOrders(runId, 'all', 1);
|
||||
|
||||
// Also load text log in background
|
||||
fetchTextLog(runId);
|
||||
}
|
||||
|
||||
// ── Per-Order Filtering (R1) ─────────────────────
|
||||
|
||||
async function loadRunOrders(runId, statusFilter, page) {
|
||||
if (statusFilter != null) currentFilter = statusFilter;
|
||||
if (page != null) ordersPage = page;
|
||||
|
||||
// Update filter pill active state
|
||||
document.querySelectorAll('#orderFilterPills .filter-pill').forEach(btn => {
|
||||
btn.classList.toggle('active', btn.dataset.logStatus === currentFilter);
|
||||
});
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/sync/run/${encodeURIComponent(runId)}/orders?status=${currentFilter}&page=${ordersPage}&per_page=50&sort_by=${ordersSortColumn}&sort_dir=${ordersSortDirection}`);
|
||||
if (!res.ok) throw new Error('HTTP ' + res.status);
|
||||
const data = await res.json();
|
||||
|
||||
const counts = data.counts || {};
|
||||
document.getElementById('countAll').textContent = counts.total || 0;
|
||||
document.getElementById('countImported').textContent = counts.imported || 0;
|
||||
document.getElementById('countSkipped').textContent = counts.skipped || 0;
|
||||
document.getElementById('countError').textContent = counts.error || 0;
|
||||
const alreadyEl = document.getElementById('countAlreadyImported');
|
||||
if (alreadyEl) alreadyEl.textContent = counts.already_imported || 0;
|
||||
const malEl = document.getElementById('countMalformed');
|
||||
if (malEl) malEl.textContent = counts.malformed || 0;
|
||||
|
||||
const tbody = document.getElementById('runOrdersBody');
|
||||
const orders = data.orders || [];
|
||||
|
||||
if (orders.length === 0) {
|
||||
tbody.innerHTML = '<tr><td colspan="9" class="text-center text-muted py-3">Nicio comanda</td></tr>';
|
||||
} else {
|
||||
const problemOrders = orders.filter(o => [ORDER_STATUS.ERROR, ORDER_STATUS.SKIPPED].includes(o.status));
|
||||
const okOrders = orders.filter(o => [ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes(o.status));
|
||||
const otherOrders = orders.filter(o => ![ORDER_STATUS.ERROR, ORDER_STATUS.SKIPPED, ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes(o.status));
|
||||
|
||||
function orderRow(o, i) {
|
||||
const dateStr = fmtDate(o.order_date);
|
||||
const orderTotal = o.order_total != null ? Number(o.order_total).toFixed(2) : '-';
|
||||
return `<tr style="cursor:pointer" onclick="openOrderDetail('${esc(o.order_number)}')">
|
||||
<td>${statusDot(o.status)}</td>
|
||||
<td>${(ordersPage - 1) * 50 + i + 1}</td>
|
||||
<td class="text-nowrap">${dateStr}</td>
|
||||
<td><code>${esc(o.order_number)}</code></td>
|
||||
<td class="fw-bold">${esc(o.customer_name)}</td>
|
||||
<td>${o.items_count || 0}</td>
|
||||
<td class="text-end text-muted">${fmtCost(o.delivery_cost)}</td>
|
||||
<td class="text-end text-muted">${fmtCost(o.discount_total)}</td>
|
||||
<td class="text-end fw-bold">${orderTotal}</td>
|
||||
</tr>`;
|
||||
}
|
||||
|
||||
let html = '';
|
||||
// Show problem orders first (always visible)
|
||||
problemOrders.forEach((o, i) => { html += orderRow(o, i); });
|
||||
otherOrders.forEach((o, i) => { html += orderRow(o, problemOrders.length + i); });
|
||||
|
||||
// Collapsible OK orders
|
||||
if (okOrders.length > 0) {
|
||||
const toggleId = 'okOrdersCollapse_' + Date.now();
|
||||
html += `<tr><td colspan="9" class="p-0">
|
||||
<div class="log-ok-toggle" onclick="this.nextElementSibling.classList.toggle('d-none')">
|
||||
▶ ${okOrders.length} comenzi importate cu succes
|
||||
</div>
|
||||
<div class="d-none">
|
||||
<table class="table mb-0">
|
||||
<tbody>
|
||||
${okOrders.map((o, i) => orderRow(o, problemOrders.length + otherOrders.length + i)).join('')}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</td></tr>`;
|
||||
}
|
||||
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
// Mobile flat rows
|
||||
const mobileList = document.getElementById('logsMobileList');
|
||||
if (mobileList) {
|
||||
if (orders.length === 0) {
|
||||
mobileList.innerHTML = '<div class="flat-row text-muted py-3 justify-content-center">Nicio comanda</div>';
|
||||
} else {
|
||||
const problemOrders = orders.filter(o => [ORDER_STATUS.ERROR, ORDER_STATUS.SKIPPED].includes(o.status));
|
||||
const okOrders = orders.filter(o => [ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes(o.status));
|
||||
const otherOrders = orders.filter(o => ![ORDER_STATUS.ERROR, ORDER_STATUS.SKIPPED, ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes(o.status));
|
||||
|
||||
function mobileRow(o) {
|
||||
const d = o.order_date || '';
|
||||
let dateFmt = '-';
|
||||
if (d.length >= 10) {
|
||||
dateFmt = d.slice(8, 10) + '.' + d.slice(5, 7) + '.' + d.slice(2, 4);
|
||||
if (d.length >= 16) dateFmt += ' ' + d.slice(11, 16);
|
||||
}
|
||||
const totalStr = o.order_total ? Number(o.order_total).toFixed(2) : '';
|
||||
return `<div class="flat-row" onclick="openOrderDetail('${esc(o.order_number)}')" style="font-size:0.875rem">
|
||||
${statusDot(o.status)}
|
||||
<span style="color:var(--text-muted)" class="text-nowrap">${dateFmt}</span>
|
||||
<span class="grow truncate fw-bold">${esc(o.customer_name || '—')}</span>
|
||||
<span class="text-nowrap">x${o.items_count || 0}${totalStr ? ' · <strong>' + totalStr + '</strong>' : ''}</span>
|
||||
</div>`;
|
||||
}
|
||||
|
||||
let mobileHtml = '';
|
||||
problemOrders.forEach(o => { mobileHtml += mobileRow(o); });
|
||||
otherOrders.forEach(o => { mobileHtml += mobileRow(o); });
|
||||
|
||||
if (okOrders.length > 0) {
|
||||
mobileHtml += `<div class="log-ok-toggle" onclick="this.nextElementSibling.classList.toggle('d-none')">
|
||||
▶ ${okOrders.length} comenzi importate cu succes
|
||||
</div>
|
||||
<div class="d-none">
|
||||
${okOrders.map(o => mobileRow(o)).join('')}
|
||||
</div>`;
|
||||
}
|
||||
|
||||
mobileList.innerHTML = mobileHtml;
|
||||
}
|
||||
}
|
||||
|
||||
// Mobile segmented control
|
||||
renderMobileSegmented('logsMobileSeg', [
|
||||
{ label: 'Toate', count: counts.total || 0, value: 'all', active: currentFilter === 'all', colorClass: 'fc-neutral' },
|
||||
{ label: 'Imp.', count: counts.imported || 0, value: ORDER_STATUS.IMPORTED, active: currentFilter === ORDER_STATUS.IMPORTED, colorClass: 'fc-green' },
|
||||
{ label: 'Deja', count: counts.already_imported || 0, value: ORDER_STATUS.ALREADY_IMPORTED, active: currentFilter === ORDER_STATUS.ALREADY_IMPORTED, colorClass: 'fc-blue' },
|
||||
{ label: 'Omise', count: counts.skipped || 0, value: ORDER_STATUS.SKIPPED, active: currentFilter === ORDER_STATUS.SKIPPED, colorClass: 'fc-yellow' },
|
||||
{ label: 'Erori', count: counts.error || 0, value: ORDER_STATUS.ERROR, active: currentFilter === ORDER_STATUS.ERROR, colorClass: 'fc-red' },
|
||||
{ label: 'Defecte', count: counts.malformed || 0, value: ORDER_STATUS.MALFORMED, active: currentFilter === ORDER_STATUS.MALFORMED, colorClass: 'fc-orange' }
|
||||
], (val) => filterOrders(val));
|
||||
|
||||
// Orders pagination
|
||||
const totalPages = data.pages || 1;
|
||||
const infoEl = document.getElementById('ordersPageInfo');
|
||||
if (infoEl) infoEl.textContent = `${data.total || 0} comenzi | Pagina ${ordersPage} din ${totalPages}`;
|
||||
const pagHtml = `<small class="text-muted me-auto">${data.total || 0} comenzi | Pagina ${ordersPage} din ${totalPages}</small>` + renderUnifiedPagination(ordersPage, totalPages, 'logsGoPage');
|
||||
const pagDiv = document.getElementById('ordersPagination');
|
||||
if (pagDiv) pagDiv.innerHTML = pagHtml;
|
||||
const pagDivTop = document.getElementById('ordersPaginationTop');
|
||||
if (pagDivTop) pagDivTop.innerHTML = pagHtml;
|
||||
|
||||
// Update run status badge
|
||||
const runRes = await fetch(`/api/sync/run/${encodeURIComponent(runId)}`);
|
||||
const runData = await runRes.json();
|
||||
if (runData.run) {
|
||||
document.getElementById('logStatusBadge').innerHTML = runStatusBadge(runData.run.status);
|
||||
// Update mobile run dot
|
||||
const mDot = document.getElementById('mobileRunDot');
|
||||
if (mDot) mDot.className = 'sync-status-dot ' + (runData.run.status || 'idle');
|
||||
}
|
||||
} catch (err) {
|
||||
document.getElementById('runOrdersBody').innerHTML =
|
||||
`<tr><td colspan="9" class="text-center text-danger">${esc(err.message)}</td></tr>`;
|
||||
}
|
||||
}
|
||||
|
||||
function filterOrders(status) {
|
||||
loadRunOrders(currentRunId, status, 1);
|
||||
}
|
||||
|
||||
function sortOrdersBy(col) {
|
||||
if (ordersSortColumn === col) {
|
||||
ordersSortDirection = ordersSortDirection === 'asc' ? 'desc' : 'asc';
|
||||
} else {
|
||||
ordersSortColumn = col;
|
||||
ordersSortDirection = 'asc';
|
||||
}
|
||||
// Update sort icons
|
||||
document.querySelectorAll('#logViewerSection .sort-icon').forEach(span => {
|
||||
const c = span.dataset.col;
|
||||
span.textContent = c === ordersSortColumn ? (ordersSortDirection === 'asc' ? '\u2191' : '\u2193') : '';
|
||||
});
|
||||
loadRunOrders(currentRunId, null, 1);
|
||||
}
|
||||
|
||||
// ── Text Log (collapsible) ──────────────────────
|
||||
|
||||
function toggleTextLog() {
|
||||
const section = document.getElementById('textLogSection');
|
||||
section.style.display = section.style.display === 'none' ? '' : 'none';
|
||||
if (section.style.display !== 'none' && currentRunId) {
|
||||
fetchTextLog(currentRunId);
|
||||
}
|
||||
}
|
||||
|
||||
async function fetchTextLog(runId) {
|
||||
// Clear any existing poll timer to prevent accumulation
|
||||
if (logPollTimer) { clearInterval(logPollTimer); logPollTimer = null; }
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/sync/run/${encodeURIComponent(runId)}/text-log`);
|
||||
if (!res.ok) throw new Error('HTTP ' + res.status);
|
||||
const data = await res.json();
|
||||
|
||||
document.getElementById('logContent').textContent = data.text || '(log gol)';
|
||||
|
||||
if (!data.finished) {
|
||||
if (document.getElementById('autoRefreshToggle')?.checked) {
|
||||
logPollTimer = setInterval(async () => {
|
||||
try {
|
||||
const r = await fetch(`/api/sync/run/${encodeURIComponent(runId)}/text-log`);
|
||||
const d = await r.json();
|
||||
if (currentRunId !== runId) { clearInterval(logPollTimer); return; }
|
||||
document.getElementById('logContent').textContent = d.text || '(log gol)';
|
||||
const el = document.getElementById('logContent');
|
||||
el.scrollTop = el.scrollHeight;
|
||||
if (d.finished) {
|
||||
clearInterval(logPollTimer);
|
||||
logPollTimer = null;
|
||||
loadRuns();
|
||||
loadRunOrders(runId, currentFilter, ordersPage);
|
||||
}
|
||||
} catch (e) { console.error('Poll error:', e); }
|
||||
}, 2500);
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
document.getElementById('logContent').textContent = 'Eroare: ' + err.message;
|
||||
}
|
||||
}
|
||||
|
||||
// ── Order Detail Modal (R9) ─────────────────────
|
||||
|
||||
function openOrderDetail(orderNumber) {
|
||||
_sharedModalQuickMapFn = function(sku, productName, orderNum, itemIdx) {
|
||||
openLogsQuickMap(sku, productName, orderNum);
|
||||
};
|
||||
renderOrderDetailModal(orderNumber, {
|
||||
onQuickMap: function(sku, productName, orderNum, itemIdx) {
|
||||
openLogsQuickMap(sku, productName, orderNum);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// ── Quick Map Modal (uses shared openQuickMap) ───
|
||||
|
||||
function openLogsQuickMap(sku, productName, orderNumber) {
|
||||
openQuickMap({
|
||||
sku,
|
||||
productName,
|
||||
onSave: () => {
|
||||
if (orderNumber) openOrderDetail(orderNumber);
|
||||
loadRunOrders(currentRunId, currentFilter, ordersPage);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// ── Init ────────────────────────────────────────
|
||||
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
loadRuns();
|
||||
|
||||
document.querySelectorAll('#orderFilterPills .filter-pill').forEach(btn => {
|
||||
btn.addEventListener('click', function() {
|
||||
filterOrders(this.dataset.logStatus || 'all');
|
||||
});
|
||||
});
|
||||
|
||||
const preselected = document.getElementById('preselectedRun');
|
||||
const urlParams = new URLSearchParams(window.location.search);
|
||||
const runFromUrl = urlParams.get('run') || (preselected ? preselected.value : '');
|
||||
if (runFromUrl) {
|
||||
selectRun(runFromUrl);
|
||||
}
|
||||
|
||||
document.getElementById('autoRefreshToggle')?.addEventListener('change', (e) => {
|
||||
if (e.target.checked) {
|
||||
// Resume polling if we have an active run
|
||||
if (currentRunId) fetchTextLog(currentRunId);
|
||||
} else {
|
||||
// Pause polling
|
||||
if (logPollTimer) { clearInterval(logPollTimer); logPollTimer = null; }
|
||||
}
|
||||
});
|
||||
|
||||
document.getElementById('autoRefreshToggleMobile')?.addEventListener('change', (e) => {
|
||||
const desktop = document.getElementById('autoRefreshToggle');
|
||||
if (desktop) desktop.checked = e.target.checked;
|
||||
desktop?.dispatchEvent(new Event('change'));
|
||||
});
|
||||
});
|
||||
@@ -1,738 +0,0 @@
|
||||
let currentPage = 1;
|
||||
let mappingsPerPage = 50;
|
||||
let currentSearch = '';
|
||||
let searchTimeout = null;
|
||||
let sortColumn = 'sku';
|
||||
let sortDirection = 'asc';
|
||||
let editingMapping = null; // {sku, codmat} when editing
|
||||
|
||||
const kitPriceCache = new Map();
|
||||
|
||||
// Load on page ready
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
loadMappings();
|
||||
initAddModal();
|
||||
initDeleteModal();
|
||||
});
|
||||
|
||||
function debounceSearch() {
|
||||
clearTimeout(searchTimeout);
|
||||
searchTimeout = setTimeout(() => {
|
||||
currentSearch = document.getElementById('searchInput').value;
|
||||
currentPage = 1;
|
||||
loadMappings();
|
||||
}, 300);
|
||||
}
|
||||
|
||||
// ── Sorting (R7) ─────────────────────────────────
|
||||
|
||||
function sortBy(col) {
|
||||
if (sortColumn === col) {
|
||||
sortDirection = sortDirection === 'asc' ? 'desc' : 'asc';
|
||||
} else {
|
||||
sortColumn = col;
|
||||
sortDirection = 'asc';
|
||||
}
|
||||
currentPage = 1;
|
||||
loadMappings();
|
||||
}
|
||||
|
||||
function updateSortIcons() {
|
||||
document.querySelectorAll('.sort-icon').forEach(span => {
|
||||
const col = span.dataset.col;
|
||||
if (col === sortColumn) {
|
||||
span.textContent = sortDirection === 'asc' ? '\u2191' : '\u2193';
|
||||
} else {
|
||||
span.textContent = '';
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// ── Load & Render ────────────────────────────────
|
||||
|
||||
async function loadMappings() {
|
||||
const showInactive = document.getElementById('showInactive')?.checked;
|
||||
const showDeleted = document.getElementById('showDeleted')?.checked;
|
||||
const params = new URLSearchParams({
|
||||
search: currentSearch,
|
||||
page: currentPage,
|
||||
per_page: mappingsPerPage,
|
||||
sort_by: sortColumn,
|
||||
sort_dir: sortDirection
|
||||
});
|
||||
if (showDeleted) params.set('show_deleted', 'true');
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/mappings?${params}`);
|
||||
const data = await res.json();
|
||||
|
||||
let mappings = data.mappings || [];
|
||||
|
||||
// Client-side filter for inactive unless toggle is on
|
||||
// (keep deleted rows visible when showDeleted is on, even if inactive)
|
||||
if (!showInactive) {
|
||||
mappings = mappings.filter(m => m.activ || m.sters);
|
||||
}
|
||||
|
||||
renderTable(mappings, showDeleted);
|
||||
renderPagination(data);
|
||||
updateSortIcons();
|
||||
} catch (err) {
|
||||
document.getElementById('mappingsFlatList').innerHTML =
|
||||
`<div class="flat-row text-danger py-3 justify-content-center">Eroare: ${err.message}</div>`;
|
||||
}
|
||||
}
|
||||
|
||||
function renderTable(mappings, showDeleted) {
|
||||
const container = document.getElementById('mappingsFlatList');
|
||||
|
||||
if (!mappings || mappings.length === 0) {
|
||||
container.innerHTML = '<div class="flat-row text-muted py-4 justify-content-center">Nu exista mapari</div>';
|
||||
return;
|
||||
}
|
||||
|
||||
// Count CODMATs per SKU for kit detection
|
||||
const skuCodmatCount = {};
|
||||
mappings.forEach(m => {
|
||||
skuCodmatCount[m.sku] = (skuCodmatCount[m.sku] || 0) + 1;
|
||||
});
|
||||
|
||||
let prevSku = null;
|
||||
let html = '';
|
||||
mappings.forEach((m, i) => {
|
||||
const isNewGroup = m.sku !== prevSku;
|
||||
if (isNewGroup) {
|
||||
const isKit = (skuCodmatCount[m.sku] || 0) > 1;
|
||||
const kitBadge = isKit
|
||||
? ` <span class="text-muted small">Kit · ${skuCodmatCount[m.sku]}</span><span class="kit-price-loading" data-sku="${esc(m.sku)}" style="display:none"><span class="spinner-border spinner-border-sm ms-1" style="width:0.8rem;height:0.8rem"></span></span>`
|
||||
: '';
|
||||
const inactiveStyle = !m.activ && !m.sters ? 'opacity:0.6;' : '';
|
||||
html += `<div class="flat-row" style="background:var(--surface-raised);font-weight:600;border-top:1px solid var(--border);${inactiveStyle}">
|
||||
<span class="${m.activ ? 'dot dot-green' : 'dot dot-yellow'}" style="cursor:${m.sters ? 'default' : 'pointer'}"
|
||||
${m.sters ? '' : `onclick="event.stopPropagation();toggleActive('${esc(m.sku)}', '${esc(m.codmat)}', ${m.activ})"`}
|
||||
title="${m.activ ? 'Activ' : 'Inactiv'}"></span>
|
||||
<strong class="me-1 text-nowrap">${esc(m.sku)}</strong>${kitBadge}
|
||||
<span class="grow truncate text-muted" style="font-size:0.875rem">${esc(m.product_name || '')}</span>
|
||||
${m.sters
|
||||
? `<button class="btn btn-sm btn-outline-success" onclick="event.stopPropagation();restoreMapping('${esc(m.sku)}', '${esc(m.codmat)}')" title="Restaureaza" style="padding:0.1rem 0.4rem"><i class="bi bi-arrow-counterclockwise"></i></button>`
|
||||
: `<button class="context-menu-trigger" data-sku="${esc(m.sku)}" data-codmat="${esc(m.codmat)}" data-cantitate="${m.cantitate_roa}">⋮</button>`
|
||||
}
|
||||
</div>`;
|
||||
}
|
||||
const deletedStyle = m.sters ? 'text-decoration:line-through;opacity:0.5;' : '';
|
||||
const isKitRow = (skuCodmatCount[m.sku] || 0) > 1;
|
||||
const kitPriceSlot = isKitRow ? `<span class="kit-price-slot text-muted small ms-2" data-sku="${esc(m.sku)}" data-codmat="${esc(m.codmat)}"></span>` : '';
|
||||
const inlinePrice = m.pret_cu_tva ? `<span class="text-muted small ms-2">${parseFloat(m.pret_cu_tva).toFixed(2)} lei</span>` : '';
|
||||
html += `<div class="flat-row" style="padding-left:1.5rem;font-size:0.9rem;${deletedStyle}">
|
||||
<code>${esc(m.codmat)}</code>
|
||||
<span class="grow truncate text-muted" style="font-size:0.85rem">${esc(m.denumire || '')}</span>
|
||||
<span class="text-nowrap" style="font-size:0.875rem">
|
||||
<span class="${m.sters ? '' : 'editable'}" style="cursor:${m.sters ? 'default' : 'pointer'}"
|
||||
${m.sters ? '' : `onclick="editFlatValue(this, '${esc(m.sku)}', '${esc(m.codmat)}', 'cantitate_roa', ${m.cantitate_roa})"`}>x${m.cantitate_roa}</span>${isKitRow ? kitPriceSlot : inlinePrice}
|
||||
</span>
|
||||
</div>`;
|
||||
|
||||
// After last CODMAT of a kit, add total row
|
||||
const isLastOfKit = isKitRow && (i === mappings.length - 1 || mappings[i + 1].sku !== m.sku);
|
||||
if (isLastOfKit) {
|
||||
html += `<div class="flat-row kit-total-slot text-muted small" data-sku="${esc(m.sku)}" style="padding-left:1.5rem;display:none;border-top:1px dashed var(--border)"></div>`;
|
||||
}
|
||||
|
||||
prevSku = m.sku;
|
||||
});
|
||||
container.innerHTML = html;
|
||||
|
||||
// Wire context menu triggers
|
||||
container.querySelectorAll('.context-menu-trigger').forEach(btn => {
|
||||
btn.addEventListener('click', (e) => {
|
||||
e.stopPropagation();
|
||||
const { sku, codmat, cantitate } = btn.dataset;
|
||||
const rect = btn.getBoundingClientRect();
|
||||
showContextMenu(rect.left, rect.bottom + 2, [
|
||||
{ label: 'Editeaza', action: () => openEditModal(sku, codmat, parseFloat(cantitate)) },
|
||||
{ label: 'Sterge', action: () => deleteMappingConfirm(sku, codmat), danger: true }
|
||||
]);
|
||||
});
|
||||
});
|
||||
|
||||
// Load prices for visible kits
|
||||
const loadedKits = new Set();
|
||||
container.querySelectorAll('.kit-price-loading').forEach(el => {
|
||||
const sku = el.dataset.sku;
|
||||
if (!loadedKits.has(sku)) {
|
||||
loadedKits.add(sku);
|
||||
loadKitPrices(sku, container);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
async function loadKitPrices(sku, container) {
|
||||
if (kitPriceCache.has(sku)) {
|
||||
renderKitPrices(sku, kitPriceCache.get(sku), container);
|
||||
return;
|
||||
}
|
||||
// Show loading spinner
|
||||
const spinner = container.querySelector(`.kit-price-loading[data-sku="${CSS.escape(sku)}"]`);
|
||||
if (spinner) spinner.style.display = '';
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/mappings/prices?sku=${encodeURIComponent(sku)}`);
|
||||
const data = await res.json();
|
||||
if (data.error) {
|
||||
if (spinner) spinner.innerHTML = `<small class="text-danger">${esc(data.error)}</small>`;
|
||||
return;
|
||||
}
|
||||
kitPriceCache.set(sku, data.prices || []);
|
||||
renderKitPrices(sku, data.prices || [], container);
|
||||
} catch (err) {
|
||||
if (spinner) spinner.innerHTML = `<small class="text-danger">Eroare la încărcarea prețurilor</small>`;
|
||||
}
|
||||
}
|
||||
|
||||
function renderKitPrices(sku, prices, container) {
|
||||
if (!prices || prices.length === 0) return;
|
||||
// Update each codmat row with price info
|
||||
const rows = container.querySelectorAll(`.kit-price-slot[data-sku="${CSS.escape(sku)}"]`);
|
||||
let total = 0;
|
||||
rows.forEach(slot => {
|
||||
const codmat = slot.dataset.codmat;
|
||||
const p = prices.find(pr => pr.codmat === codmat);
|
||||
if (p && p.pret_cu_tva > 0) {
|
||||
slot.innerHTML = `${p.pret_cu_tva.toFixed(2)} lei`;
|
||||
total += p.pret_cu_tva * (p.cantitate_roa || 1);
|
||||
} else if (p) {
|
||||
slot.innerHTML = `<span class="text-muted">fără preț</span>`;
|
||||
}
|
||||
});
|
||||
// Show total
|
||||
const totalSlot = container.querySelector(`.kit-total-slot[data-sku="${CSS.escape(sku)}"]`);
|
||||
if (totalSlot && total > 0) {
|
||||
totalSlot.innerHTML = `Total componente: ${total.toFixed(2)} lei`;
|
||||
totalSlot.style.display = '';
|
||||
}
|
||||
// Hide loading spinner
|
||||
const spinner = container.querySelector(`.kit-price-loading[data-sku="${CSS.escape(sku)}"]`);
|
||||
if (spinner) spinner.style.display = 'none';
|
||||
}
|
||||
|
||||
// Inline edit for flat-row values (cantitate)
|
||||
function editFlatValue(span, sku, codmat, field, currentValue) {
|
||||
if (span.querySelector('input')) return;
|
||||
|
||||
const input = document.createElement('input');
|
||||
input.type = 'number';
|
||||
input.className = 'form-control form-control-sm d-inline';
|
||||
input.value = currentValue;
|
||||
input.step = field === 'cantitate_roa' ? '0.001' : '0.01';
|
||||
input.style.width = '70px';
|
||||
input.style.display = 'inline';
|
||||
|
||||
const originalText = span.textContent;
|
||||
span.textContent = '';
|
||||
span.appendChild(input);
|
||||
input.focus();
|
||||
input.select();
|
||||
|
||||
const save = async () => {
|
||||
const newValue = parseFloat(input.value);
|
||||
if (isNaN(newValue) || newValue === currentValue) {
|
||||
span.textContent = originalText;
|
||||
return;
|
||||
}
|
||||
try {
|
||||
const body = {};
|
||||
body[field] = newValue;
|
||||
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify(body)
|
||||
});
|
||||
const data = await res.json();
|
||||
if (data.success) { loadMappings(); }
|
||||
else { span.textContent = originalText; alert('Eroare: ' + (data.error || 'Update failed')); }
|
||||
} catch (err) { span.textContent = originalText; }
|
||||
};
|
||||
|
||||
input.addEventListener('blur', save);
|
||||
input.addEventListener('keydown', (e) => {
|
||||
if (e.key === 'Enter') { e.preventDefault(); save(); }
|
||||
if (e.key === 'Escape') { span.textContent = originalText; }
|
||||
});
|
||||
}
|
||||
|
||||
function renderPagination(data) {
|
||||
const pagOpts = { perPage: mappingsPerPage, perPageFn: 'mappingsChangePerPage', perPageOptions: [25, 50, 100, 250] };
|
||||
const infoHtml = `<small class="text-muted me-auto">${data.total} mapari | Pagina ${data.page} din ${data.pages || 1}</small>`;
|
||||
const pagHtml = infoHtml + renderUnifiedPagination(data.page, data.pages || 1, 'goPage', pagOpts);
|
||||
const top = document.getElementById('mappingsPagTop');
|
||||
const bot = document.getElementById('mappingsPagBottom');
|
||||
if (top) top.innerHTML = pagHtml;
|
||||
if (bot) bot.innerHTML = pagHtml;
|
||||
}
|
||||
|
||||
function mappingsChangePerPage(val) { mappingsPerPage = parseInt(val) || 50; currentPage = 1; loadMappings(); }
|
||||
|
||||
function goPage(p) {
|
||||
currentPage = p;
|
||||
loadMappings();
|
||||
}
|
||||
|
||||
// ── Multi-CODMAT Add Modal (R11) ─────────────────
|
||||
|
||||
function initAddModal() {
|
||||
const modal = document.getElementById('addModal');
|
||||
if (!modal) return;
|
||||
|
||||
modal.addEventListener('show.bs.modal', () => {
|
||||
if (!editingMapping) {
|
||||
clearAddForm();
|
||||
}
|
||||
});
|
||||
modal.addEventListener('hidden.bs.modal', () => {
|
||||
editingMapping = null;
|
||||
document.getElementById('addModalTitle').textContent = 'Adauga Mapare';
|
||||
});
|
||||
}
|
||||
|
||||
function clearAddForm() {
|
||||
document.getElementById('inputSku').value = '';
|
||||
document.getElementById('inputSku').readOnly = false;
|
||||
document.getElementById('addModalProductName').style.display = 'none';
|
||||
document.getElementById('pctWarning').style.display = 'none';
|
||||
document.getElementById('addModalTitle').textContent = 'Adauga Mapare';
|
||||
const container = document.getElementById('codmatLines');
|
||||
container.innerHTML = '';
|
||||
addCodmatLine();
|
||||
}
|
||||
|
||||
async function openEditModal(sku, codmat, cantitate) {
|
||||
editingMapping = { sku, codmat };
|
||||
document.getElementById('addModalTitle').textContent = 'Editare Mapare';
|
||||
document.getElementById('inputSku').value = sku;
|
||||
document.getElementById('inputSku').readOnly = false;
|
||||
document.getElementById('pctWarning').style.display = 'none';
|
||||
|
||||
const container = document.getElementById('codmatLines');
|
||||
container.innerHTML = '';
|
||||
|
||||
try {
|
||||
// Fetch all CODMATs for this SKU
|
||||
const res = await fetch(`/api/mappings?search=${encodeURIComponent(sku)}&per_page=100`);
|
||||
const data = await res.json();
|
||||
const allMappings = (data.mappings || []).filter(m => m.sku === sku && !m.sters);
|
||||
|
||||
// Show product name if available
|
||||
const productName = allMappings[0]?.product_name || '';
|
||||
const productNameEl = document.getElementById('addModalProductName');
|
||||
const productNameText = document.getElementById('inputProductName');
|
||||
if (productName && productNameEl && productNameText) {
|
||||
productNameText.textContent = productName;
|
||||
productNameEl.style.display = '';
|
||||
}
|
||||
|
||||
if (allMappings.length === 0) {
|
||||
// Fallback to single line with passed values
|
||||
addCodmatLine();
|
||||
const line = container.querySelector('.codmat-line');
|
||||
if (line) {
|
||||
line.querySelector('.cl-codmat').value = codmat;
|
||||
line.querySelector('.cl-cantitate').value = cantitate;
|
||||
}
|
||||
} else {
|
||||
for (const m of allMappings) {
|
||||
addCodmatLine();
|
||||
const lines = container.querySelectorAll('.codmat-line');
|
||||
const line = lines[lines.length - 1];
|
||||
line.querySelector('.cl-codmat').value = m.codmat;
|
||||
if (m.denumire) {
|
||||
line.querySelector('.cl-selected').textContent = m.denumire;
|
||||
}
|
||||
line.querySelector('.cl-cantitate').value = m.cantitate_roa;
|
||||
}
|
||||
}
|
||||
} catch (e) {
|
||||
// Fallback on error
|
||||
addCodmatLine();
|
||||
const line = container.querySelector('.codmat-line');
|
||||
if (line) {
|
||||
line.querySelector('.cl-codmat').value = codmat;
|
||||
line.querySelector('.cl-cantitate').value = cantitate;
|
||||
}
|
||||
}
|
||||
|
||||
new bootstrap.Modal(document.getElementById('addModal')).show();
|
||||
}
|
||||
|
||||
function addCodmatLine() {
|
||||
const container = document.getElementById('codmatLines');
|
||||
const idx = container.children.length;
|
||||
const div = document.createElement('div');
|
||||
div.className = 'qm-line codmat-line';
|
||||
div.innerHTML = `
|
||||
<div class="qm-row">
|
||||
<div class="qm-codmat-wrap position-relative">
|
||||
<input type="text" class="form-control form-control-sm cl-codmat" placeholder="CODMAT..." autocomplete="nope" data-idx="${idx}">
|
||||
<div class="autocomplete-dropdown d-none cl-ac-dropdown"></div>
|
||||
</div>
|
||||
<input type="number" class="form-control form-control-sm cl-cantitate" value="1" step="0.001" min="0.001" title="Cantitate ROA" style="width:70px">
|
||||
${idx > 0 ? `<button type="button" class="btn btn-sm btn-outline-danger qm-rm-btn" onclick="this.closest('.codmat-line').remove()"><i class="bi bi-x"></i></button>` : '<span style="width:30px"></span>'}
|
||||
</div>
|
||||
<div class="qm-selected text-muted cl-selected" style="font-size:0.75rem;padding-left:2px"></div>
|
||||
`;
|
||||
container.appendChild(div);
|
||||
|
||||
// Setup autocomplete
|
||||
const input = div.querySelector('.cl-codmat');
|
||||
const dropdown = div.querySelector('.cl-ac-dropdown');
|
||||
const selected = div.querySelector('.cl-selected');
|
||||
|
||||
setupAutocomplete(input, dropdown, selected, clAutocomplete);
|
||||
}
|
||||
|
||||
async function clAutocomplete(input, dropdown, selectedEl) {
|
||||
const q = input.value;
|
||||
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
|
||||
const data = await res.json();
|
||||
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
|
||||
|
||||
dropdown.innerHTML = data.results.map((r, i) => {
|
||||
const label = r.denumire + (r.um ? ` (${r.um})` : '');
|
||||
return `<div class="autocomplete-item" id="ac-cl-${i}" data-codmat="${esc(r.codmat)}" data-label="${esc(label)}">
|
||||
<span class="codmat">${esc(r.codmat)}</span> — <span class="denumire">${esc(r.denumire)}</span>${r.um ? ` <small class="text-muted">(${esc(r.um)})</small>` : ''}
|
||||
</div>`;
|
||||
}).join('');
|
||||
dropdown.classList.remove('d-none');
|
||||
} catch { dropdown.classList.add('d-none'); }
|
||||
}
|
||||
|
||||
async function saveMapping() {
|
||||
const sku = document.getElementById('inputSku').value.trim();
|
||||
if (!sku) { alert('SKU este obligatoriu'); return; }
|
||||
|
||||
const lines = document.querySelectorAll('.codmat-line');
|
||||
const mappings = [];
|
||||
|
||||
for (const line of lines) {
|
||||
const codmat = line.querySelector('.cl-codmat').value.trim();
|
||||
const cantitate = parseFloat(line.querySelector('.cl-cantitate').value) || 1;
|
||||
if (!codmat) continue;
|
||||
mappings.push({ codmat, cantitate_roa: cantitate });
|
||||
}
|
||||
|
||||
if (mappings.length === 0) { alert('Adauga cel putin un CODMAT'); return; }
|
||||
|
||||
document.getElementById('pctWarning').style.display = 'none';
|
||||
|
||||
try {
|
||||
let res;
|
||||
|
||||
if (editingMapping) {
|
||||
if (mappings.length === 1) {
|
||||
// Single CODMAT edit: use existing PUT endpoint
|
||||
res = await fetch(`/api/mappings/${encodeURIComponent(editingMapping.sku)}/${encodeURIComponent(editingMapping.codmat)}/edit`, {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
new_sku: sku,
|
||||
new_codmat: mappings[0].codmat,
|
||||
cantitate_roa: mappings[0].cantitate_roa
|
||||
})
|
||||
});
|
||||
} else {
|
||||
// Multi-CODMAT set: delete all existing then create new batch
|
||||
const oldSku = editingMapping.sku;
|
||||
const existRes = await fetch(`/api/mappings?search=${encodeURIComponent(oldSku)}&per_page=100`);
|
||||
const existData = await existRes.json();
|
||||
const existing = (existData.mappings || []).filter(m => m.sku === oldSku && !m.sters);
|
||||
|
||||
// Delete each existing CODMAT for old SKU
|
||||
for (const m of existing) {
|
||||
await fetch(`/api/mappings/${encodeURIComponent(m.sku)}/${encodeURIComponent(m.codmat)}`, {
|
||||
method: 'DELETE'
|
||||
});
|
||||
}
|
||||
|
||||
// Create new batch with auto_restore (handles just-soft-deleted records)
|
||||
res = await fetch('/api/mappings/batch', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ sku, mappings, auto_restore: true })
|
||||
});
|
||||
}
|
||||
} else if (mappings.length === 1) {
|
||||
res = await fetch('/api/mappings', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ sku, codmat: mappings[0].codmat, cantitate_roa: mappings[0].cantitate_roa })
|
||||
});
|
||||
} else {
|
||||
res = await fetch('/api/mappings/batch', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ sku, mappings })
|
||||
});
|
||||
}
|
||||
const data = await res.json();
|
||||
if (data.success) {
|
||||
bootstrap.Modal.getInstance(document.getElementById('addModal')).hide();
|
||||
editingMapping = null;
|
||||
loadMappings();
|
||||
} else if (res.status === 409) {
|
||||
handleMappingConflict(data);
|
||||
} else {
|
||||
alert('Eroare: ' + (data.error || 'Unknown'));
|
||||
}
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
// ── Inline Add Row ──────────────────────────────
|
||||
|
||||
let inlineAddVisible = false;
|
||||
|
||||
function showInlineAddRow() {
|
||||
// On mobile, open the full modal instead
|
||||
if (window.innerWidth < 768) {
|
||||
new bootstrap.Modal(document.getElementById('addModal')).show();
|
||||
return;
|
||||
}
|
||||
|
||||
if (inlineAddVisible) return;
|
||||
inlineAddVisible = true;
|
||||
|
||||
const container = document.getElementById('mappingsFlatList');
|
||||
const row = document.createElement('div');
|
||||
row.id = 'inlineAddRow';
|
||||
row.className = 'flat-row';
|
||||
row.style.background = 'var(--info-light)';
|
||||
row.style.gap = '0.5rem';
|
||||
row.innerHTML = `
|
||||
<input type="text" class="form-control form-control-sm" id="inlineSku" placeholder="SKU" style="width:140px">
|
||||
<div class="position-relative" style="flex:1;min-width:0">
|
||||
<input type="text" class="form-control form-control-sm" id="inlineCodmat" placeholder="Cauta CODMAT..." autocomplete="nope">
|
||||
<div class="autocomplete-dropdown d-none" id="inlineAcDropdown"></div>
|
||||
<small class="text-muted" id="inlineSelected"></small>
|
||||
</div>
|
||||
<input type="number" class="form-control form-control-sm" id="inlineCantitate" value="1" step="0.001" min="0.001" style="width:70px" placeholder="Cant.">
|
||||
<button class="btn btn-sm btn-success" onclick="saveInlineMapping()" title="Salveaza"><i class="bi bi-check-lg"></i></button>
|
||||
<button class="btn btn-sm btn-outline-secondary" onclick="cancelInlineAdd()" title="Anuleaza"><i class="bi bi-x-lg"></i></button>
|
||||
`;
|
||||
container.insertBefore(row, container.firstChild);
|
||||
document.getElementById('inlineSku').focus();
|
||||
|
||||
// Setup autocomplete for inline CODMAT
|
||||
const input = document.getElementById('inlineCodmat');
|
||||
const dropdown = document.getElementById('inlineAcDropdown');
|
||||
const selected = document.getElementById('inlineSelected');
|
||||
|
||||
setupAutocomplete(input, dropdown, selected, inlineAutocomplete);
|
||||
}
|
||||
|
||||
async function inlineAutocomplete(input, dropdown, selectedEl) {
|
||||
const q = input.value;
|
||||
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
|
||||
try {
|
||||
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
|
||||
const data = await res.json();
|
||||
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
|
||||
dropdown.innerHTML = data.results.map((r, i) => {
|
||||
const label = r.denumire + (r.um ? ` (${r.um})` : '');
|
||||
return `<div class="autocomplete-item" id="ac-il-${i}" data-codmat="${esc(r.codmat)}" data-label="${esc(label)}">
|
||||
<span class="codmat">${esc(r.codmat)}</span> — <span class="denumire">${esc(r.denumire)}</span>${r.um ? ` <small class="text-muted">(${esc(r.um)})</small>` : ''}
|
||||
</div>`;
|
||||
}).join('');
|
||||
dropdown.classList.remove('d-none');
|
||||
} catch { dropdown.classList.add('d-none'); }
|
||||
}
|
||||
|
||||
async function saveInlineMapping() {
|
||||
const sku = document.getElementById('inlineSku').value.trim();
|
||||
const codmat = document.getElementById('inlineCodmat').value.trim();
|
||||
const cantitate = parseFloat(document.getElementById('inlineCantitate').value) || 1;
|
||||
|
||||
if (!sku) { alert('SKU este obligatoriu'); return; }
|
||||
if (!codmat) { alert('CODMAT este obligatoriu'); return; }
|
||||
|
||||
try {
|
||||
const res = await fetch('/api/mappings', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ sku, codmat, cantitate_roa: cantitate })
|
||||
});
|
||||
const data = await res.json();
|
||||
if (data.success) {
|
||||
cancelInlineAdd();
|
||||
loadMappings();
|
||||
} else if (res.status === 409) {
|
||||
handleMappingConflict(data);
|
||||
} else {
|
||||
alert('Eroare: ' + (data.error || 'Unknown'));
|
||||
}
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
function cancelInlineAdd() {
|
||||
const row = document.getElementById('inlineAddRow');
|
||||
if (row) row.remove();
|
||||
inlineAddVisible = false;
|
||||
}
|
||||
|
||||
// ── Toggle Active with Toast Undo ────────────────
|
||||
|
||||
async function toggleActive(sku, codmat, currentActive) {
|
||||
const newActive = currentActive ? 0 : 1;
|
||||
try {
|
||||
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ activ: newActive })
|
||||
});
|
||||
const data = await res.json();
|
||||
if (!data.success) return;
|
||||
|
||||
loadMappings();
|
||||
|
||||
// Show toast with undo
|
||||
const action = newActive ? 'activata' : 'dezactivata';
|
||||
showUndoToast(`Mapare ${sku} \u2192 ${codmat} ${action}.`, () => {
|
||||
fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ activ: currentActive })
|
||||
}).then(() => loadMappings());
|
||||
});
|
||||
} catch (err) { alert('Eroare: ' + err.message); }
|
||||
}
|
||||
|
||||
function showUndoToast(message, undoCallback) {
|
||||
document.getElementById('toastMessage').textContent = message;
|
||||
const undoBtn = document.getElementById('toastUndoBtn');
|
||||
// Clone to remove old listeners
|
||||
const newBtn = undoBtn.cloneNode(true);
|
||||
undoBtn.parentNode.replaceChild(newBtn, undoBtn);
|
||||
newBtn.id = 'toastUndoBtn';
|
||||
if (undoCallback) {
|
||||
newBtn.style.display = '';
|
||||
newBtn.addEventListener('click', () => {
|
||||
undoCallback();
|
||||
const toastEl = document.getElementById('undoToast');
|
||||
const inst = bootstrap.Toast.getInstance(toastEl);
|
||||
if (inst) inst.hide();
|
||||
});
|
||||
} else {
|
||||
newBtn.style.display = 'none';
|
||||
}
|
||||
const toast = new bootstrap.Toast(document.getElementById('undoToast'));
|
||||
toast.show();
|
||||
}
|
||||
|
||||
// ── Delete with Modal Confirmation ──────────────
|
||||
|
||||
let pendingDelete = null;
|
||||
|
||||
function initDeleteModal() {
|
||||
const btn = document.getElementById('confirmDeleteBtn');
|
||||
if (!btn) return;
|
||||
btn.addEventListener('click', async () => {
|
||||
if (!pendingDelete) return;
|
||||
const { sku, codmat } = pendingDelete;
|
||||
try {
|
||||
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'DELETE'
|
||||
});
|
||||
const data = await res.json();
|
||||
bootstrap.Modal.getInstance(document.getElementById('deleteConfirmModal')).hide();
|
||||
if (data.success) loadMappings();
|
||||
else alert('Eroare: ' + (data.error || 'Delete failed'));
|
||||
} catch (err) {
|
||||
bootstrap.Modal.getInstance(document.getElementById('deleteConfirmModal')).hide();
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
pendingDelete = null;
|
||||
});
|
||||
}
|
||||
|
||||
function deleteMappingConfirm(sku, codmat) {
|
||||
pendingDelete = { sku, codmat };
|
||||
document.getElementById('deleteSkuText').textContent = sku;
|
||||
document.getElementById('deleteCodmatText').textContent = codmat;
|
||||
new bootstrap.Modal(document.getElementById('deleteConfirmModal')).show();
|
||||
}
|
||||
|
||||
// ── Restore Deleted ──────────────────────────────
|
||||
|
||||
async function restoreMapping(sku, codmat) {
|
||||
try {
|
||||
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}/restore`, {
|
||||
method: 'POST'
|
||||
});
|
||||
const data = await res.json();
|
||||
if (data.success) loadMappings();
|
||||
else alert('Eroare: ' + (data.error || 'Restore failed'));
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
// ── CSV ──────────────────────────────────────────
|
||||
|
||||
async function importCsv() {
|
||||
const fileInput = document.getElementById('csvFile');
|
||||
if (!fileInput.files.length) { alert('Selecteaza un fisier CSV'); return; }
|
||||
|
||||
const formData = new FormData();
|
||||
formData.append('file', fileInput.files[0]);
|
||||
|
||||
try {
|
||||
const res = await fetch('/api/mappings/import-csv', { method: 'POST', body: formData });
|
||||
const data = await res.json();
|
||||
let msg = `${data.processed} mapări importate`;
|
||||
if (data.skipped_no_codmat > 0) {
|
||||
msg += `, ${data.skipped_no_codmat} rânduri fără CODMAT omise`;
|
||||
}
|
||||
let html = `<div class="alert alert-success">${msg}</div>`;
|
||||
if (data.errors && data.errors.length > 0) {
|
||||
html += `<div class="alert alert-warning">Erori (${data.errors.length}): <ul>${data.errors.map(e => `<li>${esc(e)}</li>`).join('')}</ul></div>`;
|
||||
}
|
||||
document.getElementById('importResult').innerHTML = html;
|
||||
loadMappings();
|
||||
} catch (err) {
|
||||
document.getElementById('importResult').innerHTML = `<div class="alert alert-danger">${err.message}</div>`;
|
||||
}
|
||||
}
|
||||
|
||||
function exportCsv() { window.location.href = (window.ROOT_PATH || '') + '/api/mappings/export-csv'; }
|
||||
function downloadTemplate() { window.location.href = (window.ROOT_PATH || '') + '/api/mappings/csv-template'; }
|
||||
|
||||
// ── Duplicate / Conflict handling ────────────────
|
||||
|
||||
function handleMappingConflict(data) {
|
||||
const msg = data.error || 'Conflict la salvare';
|
||||
if (data.can_restore) {
|
||||
const restore = confirm(`${msg}\n\nDoriti sa restaurati maparea stearsa?`);
|
||||
if (restore) {
|
||||
// Find sku/codmat from the inline row or modal
|
||||
const sku = (document.getElementById('inlineSku') || document.getElementById('inputSku'))?.value?.trim();
|
||||
const codmat = (document.getElementById('inlineCodmat') || document.querySelector('.cl-codmat'))?.value?.trim();
|
||||
if (sku && codmat) {
|
||||
fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}/restore`, { method: 'POST' })
|
||||
.then(r => r.json())
|
||||
.then(d => {
|
||||
if (d.success) { cancelInlineAdd(); loadMappings(); }
|
||||
else alert('Eroare la restaurare: ' + (d.error || ''));
|
||||
});
|
||||
}
|
||||
}
|
||||
} else {
|
||||
showUndoToast(msg, null);
|
||||
// Show non-dismissible inline error
|
||||
const warn = document.getElementById('pctWarning');
|
||||
if (warn) { warn.textContent = msg; warn.style.display = ''; }
|
||||
}
|
||||
}
|
||||
@@ -1,239 +0,0 @@
|
||||
let settAcTimeout = null;
|
||||
|
||||
document.addEventListener('DOMContentLoaded', async () => {
|
||||
await loadDropdowns();
|
||||
await loadSettings();
|
||||
wireAutocomplete('settTransportCodmat', 'settTransportAc');
|
||||
wireAutocomplete('settDiscountCodmat', 'settDiscountAc');
|
||||
wireAutocomplete('settKitDiscountCodmat', 'settKitDiscountAc');
|
||||
|
||||
// Kit pricing mode radio toggle
|
||||
document.querySelectorAll('input[name="kitPricingMode"]').forEach(r => {
|
||||
r.addEventListener('change', () => {
|
||||
const mode = document.querySelector('input[name="kitPricingMode"]:checked')?.value || '';
|
||||
document.getElementById('kitModeBFields').style.display =
|
||||
(mode === 'separate_line' || mode === 'distributed') ? '' : 'none';
|
||||
});
|
||||
});
|
||||
|
||||
// Dark mode toggle
|
||||
const darkToggle = document.getElementById('settDarkMode');
|
||||
if (darkToggle) {
|
||||
darkToggle.checked = document.documentElement.getAttribute('data-theme') === 'dark';
|
||||
darkToggle.addEventListener('change', () => {
|
||||
if (typeof toggleDarkMode === 'function') toggleDarkMode();
|
||||
});
|
||||
}
|
||||
|
||||
// Catalog sync toggle
|
||||
const catChk = document.getElementById('settCatalogSyncEnabled');
|
||||
if (catChk) catChk.addEventListener('change', () => {
|
||||
document.getElementById('catalogSyncOptions').style.display = catChk.checked ? '' : 'none';
|
||||
});
|
||||
});
|
||||
|
||||
async function loadDropdowns() {
|
||||
try {
|
||||
const [sectiiRes, politiciRes, gestiuniRes] = await Promise.all([
|
||||
fetch('/api/settings/sectii'),
|
||||
fetch('/api/settings/politici'),
|
||||
fetch('/api/settings/gestiuni')
|
||||
]);
|
||||
const sectii = await sectiiRes.json();
|
||||
const politici = await politiciRes.json();
|
||||
const gestiuni = await gestiuniRes.json();
|
||||
|
||||
const gestContainer = document.getElementById('settGestiuniContainer');
|
||||
if (gestContainer) {
|
||||
gestContainer.innerHTML = '';
|
||||
gestiuni.forEach(g => {
|
||||
gestContainer.innerHTML += `<div class="form-check mb-0"><input class="form-check-input" type="checkbox" value="${escHtml(g.id)}" id="gestChk_${escHtml(g.id)}"><label class="form-check-label" for="gestChk_${escHtml(g.id)}">${escHtml(g.label)}</label></div>`;
|
||||
});
|
||||
if (gestiuni.length === 0) gestContainer.innerHTML = '<span class="text-muted small">Nicio gestiune disponibilă</span>';
|
||||
}
|
||||
|
||||
const sectieEl = document.getElementById('settIdSectie');
|
||||
if (sectieEl) {
|
||||
sectieEl.innerHTML = '<option value="">— selectează secție —</option>';
|
||||
sectii.forEach(s => {
|
||||
sectieEl.innerHTML += `<option value="${escHtml(s.id)}">${escHtml(s.label)}</option>`;
|
||||
});
|
||||
}
|
||||
|
||||
const polEl = document.getElementById('settIdPol');
|
||||
if (polEl) {
|
||||
polEl.innerHTML = '<option value="">— selectează politică —</option>';
|
||||
politici.forEach(p => {
|
||||
polEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
|
||||
});
|
||||
}
|
||||
|
||||
const tPolEl = document.getElementById('settTransportIdPol');
|
||||
if (tPolEl) {
|
||||
tPolEl.innerHTML = '<option value="">— implicită —</option>';
|
||||
politici.forEach(p => {
|
||||
tPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
|
||||
});
|
||||
}
|
||||
|
||||
const dPolEl = document.getElementById('settDiscountIdPol');
|
||||
if (dPolEl) {
|
||||
dPolEl.innerHTML = '<option value="">— implicită —</option>';
|
||||
politici.forEach(p => {
|
||||
dPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
|
||||
});
|
||||
}
|
||||
|
||||
const pPolEl = document.getElementById('settIdPolProductie');
|
||||
if (pPolEl) {
|
||||
pPolEl.innerHTML = '<option value="">— fără politică producție —</option>';
|
||||
politici.forEach(p => {
|
||||
pPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
|
||||
});
|
||||
}
|
||||
|
||||
const kdPolEl = document.getElementById('settKitDiscountIdPol');
|
||||
if (kdPolEl) {
|
||||
kdPolEl.innerHTML = '<option value="">— implicită —</option>';
|
||||
politici.forEach(p => {
|
||||
kdPolEl.innerHTML += `<option value="${escHtml(p.id)}">${escHtml(p.label)}</option>`;
|
||||
});
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('loadDropdowns error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async function loadSettings() {
|
||||
try {
|
||||
const res = await fetch('/api/settings');
|
||||
const data = await res.json();
|
||||
const el = (id) => document.getElementById(id);
|
||||
if (el('settTransportCodmat')) el('settTransportCodmat').value = data.transport_codmat || '';
|
||||
if (el('settTransportVat')) el('settTransportVat').value = data.transport_vat || '21';
|
||||
if (el('settTransportIdPol')) el('settTransportIdPol').value = data.transport_id_pol || '';
|
||||
if (el('settDiscountCodmat')) el('settDiscountCodmat').value = data.discount_codmat || '';
|
||||
if (el('settDiscountVat')) el('settDiscountVat').value = data.discount_vat || '21';
|
||||
if (el('settDiscountIdPol')) el('settDiscountIdPol').value = data.discount_id_pol || '';
|
||||
if (el('settSplitDiscountVat')) el('settSplitDiscountVat').checked = data.split_discount_vat === "1";
|
||||
if (el('settIdPol')) el('settIdPol').value = data.id_pol || '';
|
||||
if (el('settIdPolProductie')) el('settIdPolProductie').value = data.id_pol_productie || '';
|
||||
if (el('settIdSectie')) el('settIdSectie').value = data.id_sectie || '';
|
||||
// Multi-gestiune checkboxes
|
||||
const gestVal = data.id_gestiune || '';
|
||||
if (gestVal) {
|
||||
const selectedIds = gestVal.split(',').map(s => s.trim());
|
||||
selectedIds.forEach(id => {
|
||||
const chk = document.getElementById('gestChk_' + id);
|
||||
if (chk) chk.checked = true;
|
||||
});
|
||||
}
|
||||
if (el('settGomagApiKey')) el('settGomagApiKey').value = data.gomag_api_key || '';
|
||||
if (el('settGomagApiShop')) el('settGomagApiShop').value = data.gomag_api_shop || '';
|
||||
if (el('settGomagDaysBack')) el('settGomagDaysBack').value = data.gomag_order_days_back || '7';
|
||||
if (el('settGomagLimit')) el('settGomagLimit').value = data.gomag_limit || '100';
|
||||
if (el('settDashPollSeconds')) el('settDashPollSeconds').value = data.dashboard_poll_seconds || '5';
|
||||
|
||||
// Kit pricing
|
||||
const kitMode = data.kit_pricing_mode || '';
|
||||
document.querySelectorAll('input[name="kitPricingMode"]').forEach(r => {
|
||||
r.checked = r.value === kitMode;
|
||||
});
|
||||
document.getElementById('kitModeBFields').style.display = (kitMode === 'separate_line' || kitMode === 'distributed') ? '' : 'none';
|
||||
if (el('settKitDiscountCodmat')) el('settKitDiscountCodmat').value = data.kit_discount_codmat || '';
|
||||
if (el('settKitDiscountIdPol')) el('settKitDiscountIdPol').value = data.kit_discount_id_pol || '';
|
||||
|
||||
// Price sync
|
||||
if (el('settPriceSyncEnabled')) el('settPriceSyncEnabled').checked = data.price_sync_enabled !== "0";
|
||||
} catch (err) {
|
||||
console.error('loadSettings error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async function saveSettings() {
|
||||
const el = (id) => document.getElementById(id);
|
||||
const payload = {
|
||||
transport_codmat: el('settTransportCodmat')?.value?.trim() || '',
|
||||
transport_vat: el('settTransportVat')?.value || '21',
|
||||
transport_id_pol: el('settTransportIdPol')?.value?.trim() || '',
|
||||
discount_codmat: el('settDiscountCodmat')?.value?.trim() || '',
|
||||
discount_vat: el('settDiscountVat')?.value || '21',
|
||||
discount_id_pol: el('settDiscountIdPol')?.value?.trim() || '',
|
||||
split_discount_vat: el('settSplitDiscountVat')?.checked ? "1" : "",
|
||||
id_pol: el('settIdPol')?.value?.trim() || '',
|
||||
id_pol_productie: el('settIdPolProductie')?.value?.trim() || '',
|
||||
id_sectie: el('settIdSectie')?.value?.trim() || '',
|
||||
id_gestiune: Array.from(document.querySelectorAll('#settGestiuniContainer input:checked')).map(c => c.value).join(','),
|
||||
gomag_api_key: el('settGomagApiKey')?.value?.trim() || '',
|
||||
gomag_api_shop: el('settGomagApiShop')?.value?.trim() || '',
|
||||
gomag_order_days_back: el('settGomagDaysBack')?.value?.trim() || '7',
|
||||
gomag_limit: el('settGomagLimit')?.value?.trim() || '100',
|
||||
dashboard_poll_seconds: el('settDashPollSeconds')?.value?.trim() || '5',
|
||||
kit_pricing_mode: document.querySelector('input[name="kitPricingMode"]:checked')?.value || '',
|
||||
kit_discount_codmat: el('settKitDiscountCodmat')?.value?.trim() || '',
|
||||
kit_discount_id_pol: el('settKitDiscountIdPol')?.value?.trim() || '',
|
||||
price_sync_enabled: el('settPriceSyncEnabled')?.checked ? "1" : "0",
|
||||
};
|
||||
try {
|
||||
const res = await fetch('/api/settings', {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify(payload)
|
||||
});
|
||||
const data = await res.json();
|
||||
const resultEl = document.getElementById('settSaveResult');
|
||||
if (data.success) {
|
||||
if (resultEl) { resultEl.textContent = 'Salvat!'; resultEl.style.color = 'var(--success)'; }
|
||||
setTimeout(() => { if (resultEl) resultEl.textContent = ''; }, 3000);
|
||||
} else {
|
||||
if (resultEl) { resultEl.textContent = 'Eroare: ' + JSON.stringify(data); resultEl.style.color = 'var(--error)'; }
|
||||
}
|
||||
} catch (err) {
|
||||
const resultEl = document.getElementById('settSaveResult');
|
||||
if (resultEl) { resultEl.textContent = 'Eroare: ' + err.message; resultEl.style.color = 'var(--error)'; }
|
||||
}
|
||||
}
|
||||
|
||||
function wireAutocomplete(inputId, dropdownId) {
|
||||
const input = document.getElementById(inputId);
|
||||
const dropdown = document.getElementById(dropdownId);
|
||||
if (!input || !dropdown) return;
|
||||
|
||||
input.addEventListener('input', () => {
|
||||
clearTimeout(settAcTimeout);
|
||||
settAcTimeout = setTimeout(async () => {
|
||||
const q = input.value.trim();
|
||||
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
|
||||
try {
|
||||
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
|
||||
const data = await res.json();
|
||||
if (!data.results || data.results.length === 0) { dropdown.classList.add('d-none'); return; }
|
||||
dropdown.innerHTML = data.results.map(r =>
|
||||
`<div class="autocomplete-item" onmousedown="settSelectArticle('${inputId}', '${dropdownId}', '${escHtml(r.codmat)}')">
|
||||
<span class="codmat">${escHtml(r.codmat)}</span> — <span class="denumire">${escHtml(r.denumire)}</span>
|
||||
</div>`
|
||||
).join('');
|
||||
dropdown.classList.remove('d-none');
|
||||
} catch { dropdown.classList.add('d-none'); }
|
||||
}, 250);
|
||||
});
|
||||
|
||||
input.addEventListener('blur', () => {
|
||||
setTimeout(() => dropdown.classList.add('d-none'), 200);
|
||||
});
|
||||
}
|
||||
|
||||
function settSelectArticle(inputId, dropdownId, codmat) {
|
||||
document.getElementById(inputId).value = codmat;
|
||||
document.getElementById(dropdownId).classList.add('d-none');
|
||||
}
|
||||
|
||||
function escHtml(s) {
|
||||
if (s == null) return '';
|
||||
return String(s)
|
||||
.replace(/&/g, '&')
|
||||
.replace(/</g, '<')
|
||||
.replace(/>/g, '>')
|
||||
.replace(/"/g, '"')
|
||||
.replace(/'/g, ''');
|
||||
}
|
||||
File diff suppressed because it is too large
Load Diff
@@ -1,195 +0,0 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="ro">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>{% block title %}GoMag Import Manager{% endblock %}</title>
|
||||
<!-- FOUC prevention: apply saved theme before any rendering -->
|
||||
<script>
|
||||
try {
|
||||
var t = localStorage.getItem('theme');
|
||||
if (!t) t = window.matchMedia('(prefers-color-scheme: dark)').matches ? 'dark' : 'light';
|
||||
if (t === 'dark') document.documentElement.setAttribute('data-theme', 'dark');
|
||||
} catch(e) {}
|
||||
</script>
|
||||
<!-- Fonts (DESIGN.md) -->
|
||||
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
|
||||
<link href="https://fonts.googleapis.com/css2?family=DM+Sans:ital,opsz,wght@0,9..40,300;0,9..40,400;0,9..40,500;0,9..40,600;0,9..40,700;1,9..40,400&family=JetBrains+Mono:wght@400;500;600&family=Space+Grotesk:wght@400;500;600;700&display=swap" rel="stylesheet">
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.2/font/bootstrap-icons.css" rel="stylesheet">
|
||||
{% set rp = request.scope.get('root_path', '') %}
|
||||
<link href="{{ rp }}/static/css/style.css?v=46" rel="stylesheet">
|
||||
</head>
|
||||
<body>
|
||||
<!-- Top Navbar (hidden on mobile via CSS) -->
|
||||
<nav class="top-navbar">
|
||||
<div class="navbar-brand">GoMag Import</div>
|
||||
<div class="navbar-links">
|
||||
<a href="{{ rp }}/" class="nav-tab {% block nav_dashboard %}{% endblock %}"><span class="d-none d-md-inline">Dashboard</span><span class="d-md-none">Acasa</span></a>
|
||||
<a href="{{ rp }}/mappings" class="nav-tab {% block nav_mappings %}{% endblock %}"><span class="d-none d-md-inline">Mapari SKU</span><span class="d-md-none">Mapari</span></a>
|
||||
<a href="{{ rp }}/missing-skus" class="nav-tab {% block nav_missing %}{% endblock %}"><span class="d-none d-md-inline">SKU-uri Lipsa</span><span class="d-md-none">Lipsa</span></a>
|
||||
<a href="{{ rp }}/logs" class="nav-tab {% block nav_logs %}{% endblock %}"><span class="d-none d-md-inline">Jurnale Import</span><span class="d-md-none">Jurnale</span></a>
|
||||
<a href="{{ rp }}/settings" class="nav-tab {% block nav_settings %}{% endblock %}"><span class="d-none d-md-inline">Setari</span><span class="d-md-none">Setari</span></a>
|
||||
</div>
|
||||
<button class="dark-toggle" onclick="toggleDarkMode()" title="Comuta tema" aria-label="Comuta tema intunecata">
|
||||
<i class="bi bi-sun-fill"></i>
|
||||
</button>
|
||||
</nav>
|
||||
|
||||
<!-- Bottom Nav (mobile only, shown via CSS) -->
|
||||
<nav class="bottom-nav">
|
||||
<a href="{{ rp }}/" class="bottom-nav-item {% block bnav_dashboard %}{% endblock %}"><i class="bi bi-speedometer2"></i><span>Dashboard</span></a>
|
||||
<a href="{{ rp }}/mappings" class="bottom-nav-item {% block bnav_mappings %}{% endblock %}"><i class="bi bi-arrow-left-right"></i><span>Mapari</span></a>
|
||||
<a href="{{ rp }}/missing-skus" class="bottom-nav-item {% block bnav_missing %}{% endblock %}"><i class="bi bi-exclamation-triangle"></i><span>Lipsa</span></a>
|
||||
<a href="{{ rp }}/logs" class="bottom-nav-item {% block bnav_logs %}{% endblock %}"><i class="bi bi-journal-text"></i><span>Jurnale</span></a>
|
||||
<a href="{{ rp }}/settings" class="bottom-nav-item {% block bnav_settings %}{% endblock %}"><i class="bi bi-gear"></i><span>Setari</span></a>
|
||||
</nav>
|
||||
|
||||
<!-- Main content -->
|
||||
<main class="main-content {% block main_class %}{% endblock %}">
|
||||
{% block content %}{% endblock %}
|
||||
</main>
|
||||
|
||||
<!-- Shared Quick Map Modal -->
|
||||
<div class="modal fade" id="quickMapModal" tabindex="-1" data-bs-backdrop="static">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Mapeaza SKU: <code id="qmSku"></code></h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<div style="margin-bottom:8px; font-size:0.85rem">
|
||||
<small class="text-muted">Produs:</small> <strong id="qmProductName"></strong>
|
||||
</div>
|
||||
<div class="qm-row" style="font-size:0.7rem; color:var(--text-muted); padding:0 0 2px">
|
||||
<span style="flex:1">CODMAT</span>
|
||||
<span style="width:70px">Cant.</span>
|
||||
<span style="width:30px"></span>
|
||||
</div>
|
||||
<div id="qmCodmatLines"></div>
|
||||
<button type="button" class="btn btn-sm btn-outline-secondary mt-1" onclick="addQmCodmatLine()" style="font-size:0.8rem; padding:2px 10px">
|
||||
+ CODMAT
|
||||
</button>
|
||||
<div id="qmDirectInfo" class="alert alert-info mt-2" style="display:none; font-size:0.85rem; padding:8px 12px;"></div>
|
||||
<div id="qmPctWarning" class="text-danger mt-2" style="display:none;"></div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
|
||||
<button type="button" class="btn btn-primary" id="qmSaveBtn" onclick="saveQuickMapping()">Salveaza</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Shared Order Detail Modal -->
|
||||
<div class="modal fade" id="orderDetailModal" tabindex="-1">
|
||||
<div class="modal-dialog modal-lg">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Comanda <code id="detailOrderNumber"></code></h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<div class="row mb-3">
|
||||
<!-- GOMAG Column -->
|
||||
<div class="col-md-6">
|
||||
<div class="detail-col-label">GOMAG</div>
|
||||
<div class="detail-client-name" id="detailCustomer">...</div>
|
||||
<div class="detail-cui-line" id="detailCuiGomag" style="display:none">
|
||||
<small class="text-muted">CUI:</small> <span class="font-data" id="detailCuiGomagVal"></span>
|
||||
</div>
|
||||
<div><small class="text-muted">Data:</small> <span id="detailDate"></span></div>
|
||||
<div><small class="text-muted">Status:</small> <span id="detailStatus"></span></div>
|
||||
</div>
|
||||
<!-- ROA Column -->
|
||||
<div class="col-md-6">
|
||||
<div class="detail-col-label">ROA</div>
|
||||
<div class="detail-client-name" id="detailPartenerRoa" style="display:none"></div>
|
||||
<div class="detail-cui-line" id="detailCuiRoa" style="display:none">
|
||||
<small class="text-muted">CUI:</small> <span class="font-data" id="detailCuiRoaVal"></span>
|
||||
<span id="detailPartnerAnafArea"></span>
|
||||
</div>
|
||||
<div><small class="text-muted">ID Comanda:</small> <span class="font-data detail-roa-id" id="detailIdComanda">-</span></div>
|
||||
<div><small class="text-muted">ID Partener:</small> <span class="font-data detail-roa-id" id="detailIdPartener">-</span></div>
|
||||
<div id="detailInvoiceInfo" style="display:none; margin-top:4px;">
|
||||
<small class="text-muted">Factura:</small> <span id="detailInvoiceNumber"></span>
|
||||
<span class="ms-2"><small class="text-muted">din</small> <span id="detailInvoiceDate"></span></span>
|
||||
<div id="detailInvoiceRecon" class="mt-1" style="display:none"></div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<!-- Partner mismatch alert -->
|
||||
<div id="detailPartnerMismatch" style="display:none" class="mb-2"></div>
|
||||
<!-- Denomination mismatch alert -->
|
||||
<div id="detailDenomMismatch" style="display:none" class="mb-2"></div>
|
||||
<!-- Compact Address Lines -->
|
||||
<div id="detailAddressBlock" style="display:none" class="mb-3">
|
||||
<div class="detail-col-label d-flex align-items-center justify-content-end" style="border-bottom:1px solid var(--border);margin-bottom:8px;padding-bottom:4px">
|
||||
<button id="refreshAddrBtn" class="btn btn-sm btn-outline-secondary py-0 px-1"
|
||||
onclick="refreshOrderAddress(window._detailOrderNumber)"
|
||||
aria-label="Refresh adresă din Oracle" title="Refresh adresă din Oracle">
|
||||
<i class="bi bi-arrow-clockwise"></i>
|
||||
</button>
|
||||
</div>
|
||||
<div id="detailAddressLines"></div>
|
||||
</div>
|
||||
<div class="table-responsive d-none d-md-block">
|
||||
<table class="table table-sm table-bordered mb-0">
|
||||
<thead class="table-light">
|
||||
<tr>
|
||||
<th>SKU</th>
|
||||
<th>Produs</th>
|
||||
<th>CODMAT</th>
|
||||
<th class="text-end">Cant.</th>
|
||||
<th class="text-end">Pret GoMag</th>
|
||||
<th class="text-end">TVA%</th>
|
||||
<th class="text-end">Valoare</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="detailItemsBody">
|
||||
</tbody>
|
||||
</table>
|
||||
<div id="detailReceipt" class="d-flex flex-wrap gap-2 mt-1 justify-content-end"></div>
|
||||
</div>
|
||||
<div class="d-md-none" id="detailItemsMobile"></div>
|
||||
<div id="detailReceiptMobile" class="d-flex flex-wrap gap-2 mt-1 d-md-none justify-content-end"></div>
|
||||
<div id="detailError" class="alert alert-danger mt-3" style="display:none;"></div>
|
||||
</div>
|
||||
<div class="modal-footer d-flex">
|
||||
<button type="button" id="detailDeleteBtn" class="btn btn-sm btn-outline-danger me-auto" style="display:none"><i class="bi bi-trash"></i> Sterge din ROA</button>
|
||||
<button type="button" id="detailRetryBtn" class="btn btn-sm btn-outline-primary" style="display:none"><i class="bi bi-arrow-clockwise"></i> Reimporta</button>
|
||||
<button type="button" id="detailResyncBtn" class="btn btn-sm btn-outline-warning" style="display:none"><i class="bi bi-arrow-repeat"></i> Resync</button>
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Inchide</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>window.ROOT_PATH = "{{ rp }}";</script>
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
|
||||
<script src="{{ rp }}/static/js/shared.js?v=49"></script>
|
||||
<script>
|
||||
// Dark mode toggle
|
||||
function toggleDarkMode() {
|
||||
var isDark = document.documentElement.getAttribute('data-theme') === 'dark';
|
||||
var newTheme = isDark ? 'light' : 'dark';
|
||||
document.documentElement.setAttribute('data-theme', newTheme);
|
||||
try { localStorage.setItem('theme', newTheme); } catch(e) {}
|
||||
updateDarkToggleIcon();
|
||||
// Sync settings page toggle if present
|
||||
var settToggle = document.getElementById('settDarkMode');
|
||||
if (settToggle) settToggle.checked = (newTheme === 'dark');
|
||||
}
|
||||
function updateDarkToggleIcon() {
|
||||
var isDark = document.documentElement.getAttribute('data-theme') === 'dark';
|
||||
document.querySelectorAll('.dark-toggle i').forEach(function(el) {
|
||||
el.className = isDark ? 'bi bi-moon-fill' : 'bi bi-sun-fill';
|
||||
});
|
||||
}
|
||||
updateDarkToggleIcon();
|
||||
</script>
|
||||
{% block scripts %}{% endblock %}
|
||||
</body>
|
||||
</html>
|
||||
@@ -1,125 +0,0 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Dashboard - GoMag Import{% endblock %}
|
||||
{% block nav_dashboard %}active{% endblock %}
|
||||
{% block bnav_dashboard %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<h4 class="mb-4">Panou de Comanda</h4>
|
||||
|
||||
<div id="welcomeCard" style="display:none"></div>
|
||||
|
||||
<!-- Sync Card (unified two-row panel) -->
|
||||
<div class="sync-card">
|
||||
<!-- TOP ROW: Status + Controls -->
|
||||
<div class="sync-card-controls">
|
||||
<span id="syncStatusDot" class="sync-status-dot idle"></span>
|
||||
<span id="syncStatusText" class="text-secondary">Inactiv</span>
|
||||
<span id="syncHealthPill" class="health-pill healthy" role="status"
|
||||
aria-label="Sync sanatos" title="Verificare stare sync">
|
||||
<i class="bi bi-check-circle-fill" aria-hidden="true"></i>
|
||||
<span class="health-pill-label">Sanatos</span>
|
||||
</span>
|
||||
<div class="d-flex align-items-center gap-2">
|
||||
<label class="d-flex align-items-center gap-1 text-muted">
|
||||
Auto:
|
||||
<input type="checkbox" id="schedulerToggle" class="cursor-pointer" onchange="toggleScheduler()">
|
||||
</label>
|
||||
<select id="schedulerInterval" class="select-compact" onchange="updateSchedulerInterval()">
|
||||
<option value="1">1 min</option>
|
||||
<option value="3">3 min</option>
|
||||
<option value="5">5 min</option>
|
||||
<option value="10" selected>10 min</option>
|
||||
<option value="30">30 min</option>
|
||||
</select>
|
||||
<button id="syncStartBtn" class="btn btn-sm btn-primary" onclick="startSync()">▶ Start Sync</button>
|
||||
</div>
|
||||
</div>
|
||||
<div class="sync-card-divider"></div>
|
||||
<!-- BOTTOM ROW: Last sync info (clickable → jurnal) -->
|
||||
<div class="sync-card-info" id="lastSyncRow" role="button" tabindex="0" title="Ver jurnal sync">
|
||||
<span id="lastSyncDate" class="fw-medium">—</span>
|
||||
<span id="lastSyncDuration" class="text-muted">—</span>
|
||||
<span id="lastSyncCounts">—</span>
|
||||
<span id="lastSyncStatus">—</span>
|
||||
<span class="ms-auto small text-muted">↗ jurnal</span>
|
||||
</div>
|
||||
<!-- LIVE PROGRESS (shown only when sync is running) -->
|
||||
<div class="sync-card-progress" id="syncProgressArea" style="display:none;">
|
||||
<span class="sync-live-dot"></span>
|
||||
<span id="syncProgressText">Se proceseaza...</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Orders Table -->
|
||||
<div class="card mb-4">
|
||||
<div class="card-header">
|
||||
<span>Comenzi</span>
|
||||
</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div id="attentionCard"></div>
|
||||
<div class="filter-bar" id="ordersFilterBar">
|
||||
<!-- Period preset buttons -->
|
||||
<div class="period-presets">
|
||||
<button class="preset-btn" data-days="1">Azi</button>
|
||||
<button class="preset-btn active" data-days="3">3 zile</button>
|
||||
<button class="preset-btn" data-days="7">7 zile</button>
|
||||
<button class="preset-btn" data-days="30">30 zile</button>
|
||||
<button class="preset-btn" data-days="custom">Custom</button>
|
||||
</div>
|
||||
<!-- Custom date range (hidden until 'Custom' clicked) -->
|
||||
<div class="period-custom-range" id="customRangeInputs">
|
||||
<input type="date" id="periodStart" class="select-compact">
|
||||
<span>—</span>
|
||||
<input type="date" id="periodEnd" class="select-compact">
|
||||
</div>
|
||||
<input type="search" id="orderSearch" placeholder="Cauta comanda, client..." class="search-input">
|
||||
<!-- Status pills -->
|
||||
<button class="filter-pill active d-none d-md-inline-flex" data-status="all">Toate <span class="filter-count fc-neutral" id="cntAll">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="{{ OrderStatus.IMPORTED.value }}">Importat <span class="filter-count fc-green" id="cntImp">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="{{ OrderStatus.SKIPPED.value }}">Omise <span class="filter-count fc-yellow" id="cntSkip">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="{{ OrderStatus.ERROR.value }}">Erori <span class="filter-count fc-red" id="cntErr">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="INVOICED">Facturate <span class="filter-count fc-green" id="cntFact">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="UNINVOICED">Nefacturate <span class="filter-count fc-red" id="cntNef">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="{{ OrderStatus.CANCELLED.value }}">Anulate <span class="filter-count fc-dark" id="cntCanc">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="{{ OrderStatus.MALFORMED.value }}">Defecte <span class="filter-count fc-orange" id="cntMal">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-status="DIFFS">Diferente <span class="filter-count fc-orange" id="cntDiff">0</span></button>
|
||||
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" id="btnRefreshInvoices" onclick="refreshInvoices()" title="Actualizeaza status facturi din Oracle">↻</button>
|
||||
</div>
|
||||
<div class="d-md-none mb-2 d-flex align-items-center gap-2" style="max-width:100%;overflow:hidden">
|
||||
<div class="flex-grow-1" id="dashMobileSeg" style="min-width:0;overflow-x:auto"></div>
|
||||
<button class="btn btn-sm btn-outline-secondary" id="btnRefreshInvoicesMobile" onclick="refreshInvoices()" title="Actualizeaza facturi" style="padding:4px 8px; font-size:1rem; line-height:1">↻</button>
|
||||
</div>
|
||||
</div>
|
||||
<div id="dashPaginationTop" class="pag-strip"></div>
|
||||
<div class="card-body p-0">
|
||||
<div id="dashMobileList" class="mobile-list"></div>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th style="width:24px"></th>
|
||||
<th style="width:28px" title="Facturat">F</th>
|
||||
<th class="sortable" onclick="dashSortBy('order_date')">Data <span class="sort-icon" data-col="order_date"></span></th>
|
||||
<th class="sortable" onclick="dashSortBy('customer_name')">Client <span class="sort-icon" data-col="customer_name"></span></th>
|
||||
<th class="sortable" onclick="dashSortBy('order_number')">Nr Comanda <span class="sort-icon" data-col="order_number"></span></th>
|
||||
<th class="sortable" onclick="dashSortBy('items_count')">Art. <span class="sort-icon" data-col="items_count"></span></th>
|
||||
<th class="text-end">Transport</th>
|
||||
<th class="text-end">Discount</th>
|
||||
<th class="text-end">Total</th>
|
||||
<th style="width:44px"></th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="dashOrdersBody">
|
||||
<tr><td colspan="10" class="text-center text-muted py-3">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
<div id="dashPagination" class="pag-strip pag-strip-bottom"></div>
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script src="{{ request.scope.get('root_path', '') }}/static/js/dashboard.js?v=52"></script>
|
||||
{% endblock %}
|
||||
@@ -1,114 +0,0 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Jurnale Import - GoMag Import{% endblock %}
|
||||
{% block nav_logs %}active{% endblock %}
|
||||
{% block bnav_logs %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<h4 class="mb-4">Jurnale Import</h4>
|
||||
|
||||
<!-- Sync Run Selector + Status + Controls (single card) -->
|
||||
<div class="card mb-3">
|
||||
<div class="card-body py-2">
|
||||
<!-- Desktop layout -->
|
||||
<div class="d-none d-md-flex align-items-center gap-3 flex-wrap">
|
||||
<label class="form-label mb-0 fw-bold text-nowrap">Sync Run:</label>
|
||||
<select class="form-select form-select-sm" id="runsDropdown" onchange="selectRun(this.value)" style="max-width:400px">
|
||||
<option value="">Se incarca...</option>
|
||||
</select>
|
||||
<button class="btn btn-sm btn-outline-secondary text-nowrap" onclick="loadRuns()" title="Reincarca lista"><i class="bi bi-arrow-clockwise"></i></button>
|
||||
<span id="logStatusBadge" style="font-weight:600">-</span>
|
||||
<div class="form-check form-switch mb-0">
|
||||
<input class="form-check-input" type="checkbox" id="autoRefreshToggle" checked>
|
||||
<label class="form-check-label small" for="autoRefreshToggle">Auto-refresh</label>
|
||||
</div>
|
||||
<button class="btn btn-sm btn-outline-secondary" id="btnShowTextLog" onclick="toggleTextLog()">
|
||||
<i class="bi bi-file-text"></i> Log text brut
|
||||
</button>
|
||||
</div>
|
||||
<!-- Mobile compact layout -->
|
||||
<div class="d-flex d-md-none align-items-center gap-2">
|
||||
<span id="mobileRunDot" class="sync-status-dot idle" style="width:8px;height:8px"></span>
|
||||
<select class="form-select form-select-sm flex-grow-1" id="runsDropdownMobile" onchange="selectRun(this.value)" style="font-size:0.8rem">
|
||||
<option value="">Se incarca...</option>
|
||||
</select>
|
||||
<button class="btn btn-sm btn-outline-secondary" onclick="loadRuns()" title="Reincarca"><i class="bi bi-arrow-clockwise"></i></button>
|
||||
<div class="dropdown">
|
||||
<button class="btn btn-sm btn-outline-secondary" data-bs-toggle="dropdown"><i class="bi bi-three-dots-vertical"></i></button>
|
||||
<ul class="dropdown-menu dropdown-menu-end">
|
||||
<li>
|
||||
<label class="dropdown-item d-flex align-items-center gap-2">
|
||||
<input class="form-check-input" type="checkbox" id="autoRefreshToggleMobile" checked> Auto-refresh
|
||||
</label>
|
||||
</li>
|
||||
<li><a class="dropdown-item" href="#" onclick="toggleTextLog();return false"><i class="bi bi-file-text me-1"></i> Log text brut</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Empty state (shown when no run selected) -->
|
||||
<div id="logEmptyState" class="text-center py-5" style="color:var(--text-muted)">
|
||||
<i class="bi bi-journal-text" style="font-size:2.5rem;opacity:0.4"></i>
|
||||
<p class="mt-3 mb-1" style="font-size:0.9375rem">Selecteaza un sync run din lista de mai sus</p>
|
||||
<p style="font-size:0.8125rem">Jurnalele arata detalii pentru fiecare sincronizare: comenzi importate, omise, erori.</p>
|
||||
</div>
|
||||
|
||||
<!-- Detail Viewer (shown when run selected) -->
|
||||
<div id="logViewerSection" style="display:none;">
|
||||
<!-- Filter pills -->
|
||||
<div class="filter-bar mb-3" id="orderFilterPills">
|
||||
<button class="filter-pill active d-none d-md-inline-flex" data-log-status="all">Toate <span class="filter-count fc-neutral" id="countAll">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-log-status="{{ OrderStatus.IMPORTED.value }}">Importate <span class="filter-count fc-green" id="countImported">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-log-status="{{ OrderStatus.ALREADY_IMPORTED.value }}">Deja imp. <span class="filter-count fc-blue" id="countAlreadyImported">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-log-status="{{ OrderStatus.SKIPPED.value }}">Omise <span class="filter-count fc-yellow" id="countSkipped">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-log-status="{{ OrderStatus.ERROR.value }}">Erori <span class="filter-count fc-red" id="countError">0</span></button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-log-status="{{ OrderStatus.MALFORMED.value }}">Defecte <span class="filter-count fc-orange" id="countMalformed">0</span></button>
|
||||
</div>
|
||||
<div class="d-md-none mb-2" id="logsMobileSeg" style="overflow-x:auto"></div>
|
||||
|
||||
<!-- Orders table -->
|
||||
<div class="card mb-3">
|
||||
<div id="ordersPaginationTop" class="pag-strip"></div>
|
||||
<div class="card-body p-0">
|
||||
<div id="logsMobileList" class="mobile-list"></div>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th style="width:24px"></th>
|
||||
<th>#</th>
|
||||
<th class="sortable" onclick="sortOrdersBy('order_date')">Data comanda <span class="sort-icon" data-col="order_date"></span></th>
|
||||
<th class="sortable" onclick="sortOrdersBy('order_number')">Nr. comanda <span class="sort-icon" data-col="order_number"></span></th>
|
||||
<th class="sortable" onclick="sortOrdersBy('customer_name')">Client <span class="sort-icon" data-col="customer_name"></span></th>
|
||||
<th class="sortable" onclick="sortOrdersBy('items_count')">Articole <span class="sort-icon" data-col="items_count"></span></th>
|
||||
<th class="text-end">Transport</th>
|
||||
<th class="text-end">Discount</th>
|
||||
<th class="text-end">Total</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="runOrdersBody">
|
||||
<tr><td colspan="9" class="text-center text-muted py-3">Selecteaza un sync run</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
<div id="ordersPagination" class="pag-strip pag-strip-bottom"></div>
|
||||
</div>
|
||||
|
||||
<!-- Collapsible text log -->
|
||||
<div id="textLogSection" style="display:none;">
|
||||
<div class="card">
|
||||
<div class="card-header">Log text brut</div>
|
||||
<pre class="log-viewer" id="logContent">Se incarca...</pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Hidden field for pre-selected run from URL/server -->
|
||||
<input type="hidden" id="preselectedRun" value="{{ selected_run }}">
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script src="{{ request.scope.get('root_path', '') }}/static/js/logs.js?v=16"></script>
|
||||
{% endblock %}
|
||||
@@ -1,163 +0,0 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Mapari SKU - GoMag Import{% endblock %}
|
||||
{% block nav_mappings %}active{% endblock %}
|
||||
{% block bnav_mappings %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||
<h4 class="mb-0">Mapari SKU</h4>
|
||||
<div class="d-flex align-items-center gap-2">
|
||||
<!-- Desktop Import/Export dropdown -->
|
||||
<div class="dropdown d-none d-md-inline-block">
|
||||
<button class="btn btn-sm btn-outline-secondary dropdown-toggle" type="button" data-bs-toggle="dropdown">
|
||||
<i class="bi bi-file-earmark-spreadsheet"></i> Import/Export
|
||||
</button>
|
||||
<ul class="dropdown-menu">
|
||||
<li><a class="dropdown-item" href="#" onclick="downloadTemplate(); return false"><i class="bi bi-file-earmark-arrow-down me-1"></i> Download Template CSV</a></li>
|
||||
<li><a class="dropdown-item" href="#" onclick="exportCsv(); return false"><i class="bi bi-download me-1"></i> Export CSV</a></li>
|
||||
<li><hr class="dropdown-divider"></li>
|
||||
<li><a class="dropdown-item" href="#" data-bs-toggle="modal" data-bs-target="#importModal"><i class="bi bi-upload me-1"></i> Import CSV</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
<button class="btn btn-sm btn-primary" onclick="showInlineAddRow()"><i class="bi bi-plus-lg"></i> <span class="d-none d-md-inline">Adauga Mapare</span><span class="d-md-none">Mapare</span></button>
|
||||
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" data-bs-toggle="modal" data-bs-target="#addModal"><i class="bi bi-box-arrow-up-right"></i> Formular complet</button>
|
||||
<!-- Mobile ⋯ dropdown -->
|
||||
<div class="dropdown d-md-none">
|
||||
<button class="btn btn-sm btn-outline-secondary" type="button" data-bs-toggle="dropdown" aria-expanded="false"><i class="bi bi-three-dots-vertical"></i></button>
|
||||
<ul class="dropdown-menu dropdown-menu-end">
|
||||
<li><a class="dropdown-item" href="#" onclick="downloadTemplate();return false"><i class="bi bi-file-earmark-arrow-down me-1"></i> Template CSV</a></li>
|
||||
<li><a class="dropdown-item" href="#" onclick="exportCsv();return false"><i class="bi bi-download me-1"></i> Export CSV</a></li>
|
||||
<li><a class="dropdown-item" href="#" data-bs-toggle="modal" data-bs-target="#importModal"><i class="bi bi-upload me-1"></i> Import CSV</a></li>
|
||||
<li><a class="dropdown-item" href="#" data-bs-toggle="modal" data-bs-target="#addModal"><i class="bi bi-box-arrow-up-right me-1"></i> Formular complet</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Search -->
|
||||
<div class="card mb-3">
|
||||
<div class="card-body py-2">
|
||||
<div class="input-group">
|
||||
<span class="input-group-text"><i class="bi bi-search"></i></span>
|
||||
<input type="text" class="form-control" id="searchInput" placeholder="Cauta SKU, CODMAT sau denumire..." oninput="debounceSearch()">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Filter controls -->
|
||||
<div class="d-flex align-items-center mb-3 gap-3">
|
||||
<div class="form-check form-switch">
|
||||
<input class="form-check-input" type="checkbox" id="showInactive" onchange="loadMappings()">
|
||||
<label class="form-check-label" for="showInactive">Arata inactive</label>
|
||||
</div>
|
||||
<div class="form-check form-switch">
|
||||
<input class="form-check-input" type="checkbox" id="showDeleted" onchange="loadMappings()">
|
||||
<label class="form-check-label" for="showDeleted">Arata sterse</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Top pagination -->
|
||||
<div id="mappingsPagTop" class="pag-strip"></div>
|
||||
|
||||
<!-- Flat-row list (unified desktop + mobile) -->
|
||||
<div class="card">
|
||||
<div class="card-body p-0">
|
||||
<div id="mappingsFlatList" class="mappings-flat-list">
|
||||
<div class="flat-row text-muted py-4 justify-content-center">Se incarca...</div>
|
||||
</div>
|
||||
</div>
|
||||
<div id="mappingsPagBottom" class="pag-strip pag-strip-bottom"></div>
|
||||
</div>
|
||||
|
||||
<!-- Add/Edit Modal with multi-CODMAT support (R11) -->
|
||||
<div class="modal fade" id="addModal" tabindex="-1" data-bs-backdrop="static">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title" id="addModalTitle">Adauga Mapare</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<div class="mb-2">
|
||||
<label class="form-label form-label-sm mb-1">SKU</label>
|
||||
<input type="text" class="form-control form-control-sm" id="inputSku" placeholder="Ex: 8714858124284">
|
||||
</div>
|
||||
<div id="addModalProductName" style="display:none; margin-bottom:8px; font-size:0.85rem">
|
||||
<small class="text-muted">Produs:</small> <strong id="inputProductName"></strong>
|
||||
</div>
|
||||
<div class="qm-row" style="font-size:0.7rem; color:#9ca3af; padding:0 0 2px">
|
||||
<span style="flex:1">CODMAT</span>
|
||||
<span style="width:70px">Cant.</span>
|
||||
<span style="width:30px"></span>
|
||||
</div>
|
||||
<div id="codmatLines">
|
||||
<!-- Dynamic CODMAT lines will be added here -->
|
||||
</div>
|
||||
<button type="button" class="btn btn-sm btn-outline-secondary mt-1" onclick="addCodmatLine()" style="font-size:0.8rem; padding:2px 10px">
|
||||
+ CODMAT
|
||||
</button>
|
||||
<div id="pctWarning" class="text-danger mt-2" style="display:none;"></div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
|
||||
<button type="button" class="btn btn-primary" onclick="saveMapping()">Salveaza</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Import CSV Modal -->
|
||||
<div class="modal fade" id="importModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Import CSV</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p class="text-muted small">Format CSV: sku, codmat, cantitate_roa</p>
|
||||
<input type="file" class="form-control" id="csvFile" accept=".csv">
|
||||
<div id="importResult" class="mt-3"></div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Inchide</button>
|
||||
<button type="button" class="btn btn-primary" onclick="importCsv()">Import</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<!-- Delete Confirmation Modal -->
|
||||
<div class="modal fade" id="deleteConfirmModal" tabindex="-1">
|
||||
<div class="modal-dialog modal-sm">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Confirmare stergere</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
Sigur vrei sa stergi maparea?<br>
|
||||
SKU: <code id="deleteSkuText"></code><br>
|
||||
CODMAT: <code id="deleteCodmatText"></code>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
|
||||
<button type="button" class="btn btn-danger" id="confirmDeleteBtn">Sterge</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Toast container for undo actions -->
|
||||
<div class="toast-container position-fixed bottom-0 end-0 p-3" style="z-index:1080">
|
||||
<div id="undoToast" class="toast" role="alert" data-bs-autohide="true" data-bs-delay="5000">
|
||||
<div class="toast-body d-flex align-items-center gap-2">
|
||||
<span id="toastMessage"></span>
|
||||
<button class="btn btn-sm btn-outline-primary ms-auto" id="toastUndoBtn">Anuleaza</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script src="{{ request.scope.get('root_path', '') }}/static/js/mappings.js?v=17"></script>
|
||||
{% endblock %}
|
||||
@@ -1,249 +0,0 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}SKU-uri Lipsa - GoMag Import{% endblock %}
|
||||
{% block nav_missing %}active{% endblock %}
|
||||
{% block bnav_missing %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||
<h4 class="mb-0">SKU-uri Lipsa</h4>
|
||||
<div class="d-flex align-items-center gap-2">
|
||||
<button class="btn btn-sm btn-outline-secondary d-none d-md-inline-flex" onclick="exportMissingCsv()">
|
||||
<i class="bi bi-download"></i> Export CSV
|
||||
</button>
|
||||
<!-- Mobile ⋯ dropdown -->
|
||||
<div class="dropdown d-md-none">
|
||||
<button class="btn btn-sm btn-outline-secondary" type="button" data-bs-toggle="dropdown" aria-expanded="false"><i class="bi bi-three-dots-vertical"></i></button>
|
||||
<ul class="dropdown-menu dropdown-menu-end">
|
||||
<li><a class="dropdown-item" href="#" onclick="document.getElementById('rescanBtn').click();return false"><i class="bi bi-arrow-clockwise me-1"></i> Re-scan</a></li>
|
||||
<li><a class="dropdown-item" href="#" onclick="exportMissingCsv();return false"><i class="bi bi-download me-1"></i> Export CSV</a></li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Unified filter bar -->
|
||||
<div class="filter-bar" id="skusFilterBar">
|
||||
<button class="filter-pill active d-none d-md-inline-flex" data-sku-status="unresolved">
|
||||
Nerezolvate <span class="filter-count fc-yellow" id="cntUnres">0</span>
|
||||
</button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-sku-status="resolved">
|
||||
Rezolvate <span class="filter-count fc-green" id="cntRes">0</span>
|
||||
</button>
|
||||
<button class="filter-pill d-none d-md-inline-flex" data-sku-status="all">
|
||||
Toate <span class="filter-count fc-neutral" id="cntAllSkus">0</span>
|
||||
</button>
|
||||
<input type="search" id="skuSearch" placeholder="Cauta SKU / produs..." class="search-input">
|
||||
<button id="rescanBtn" class="btn btn-sm btn-secondary ms-2 d-none d-md-inline-flex">↻ Re-scan</button>
|
||||
<span id="rescanProgress" class="align-items-center gap-2 text-primary" style="display:none;">
|
||||
<span class="sync-live-dot"></span>
|
||||
<span id="rescanProgressText">Scanare...</span>
|
||||
</span>
|
||||
</div>
|
||||
<div class="d-md-none mb-2" id="skusMobileSeg"></div>
|
||||
<!-- Result banner -->
|
||||
<div id="rescanResult" class="result-banner" style="display:none;margin-bottom:0.75rem;"></div>
|
||||
|
||||
<div id="skusPagTop" class="pag-strip mb-2"></div>
|
||||
<div class="card">
|
||||
<div class="card-body p-0">
|
||||
<div id="missingMobileList" class="mobile-list"></div>
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Status</th>
|
||||
<th>SKU</th>
|
||||
<th>Produs</th>
|
||||
<th>Actiune</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="missingBody">
|
||||
<tr><td colspan="4" class="text-center text-muted py-4">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div id="skusPagBottom" class="pag-strip pag-strip-bottom"></div>
|
||||
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
let currentPage = 1;
|
||||
let skuStatusFilter = 'unresolved';
|
||||
let missingPerPage = 20;
|
||||
|
||||
function missingChangePerPage(val) { missingPerPage = parseInt(val) || 20; currentPage = 1; loadMissingSkus(); }
|
||||
|
||||
// ── Filter pills ──────────────────────────────────
|
||||
document.querySelectorAll('.filter-pill[data-sku-status]').forEach(btn => {
|
||||
btn.addEventListener('click', function() {
|
||||
document.querySelectorAll('.filter-pill[data-sku-status]').forEach(b => b.classList.remove('active'));
|
||||
this.classList.add('active');
|
||||
skuStatusFilter = this.dataset.skuStatus;
|
||||
currentPage = 1;
|
||||
loadMissingSkus();
|
||||
});
|
||||
});
|
||||
|
||||
// ── Search with debounce ─────────────────────────
|
||||
let skuSearchTimer = null;
|
||||
document.getElementById('skuSearch')?.addEventListener('input', function() {
|
||||
clearTimeout(skuSearchTimer);
|
||||
skuSearchTimer = setTimeout(() => { currentPage = 1; loadMissingSkus(); }, 300);
|
||||
});
|
||||
|
||||
// ── Rescan ────────────────────────────────────────
|
||||
document.getElementById('rescanBtn')?.addEventListener('click', async function() {
|
||||
this.disabled = true;
|
||||
const prog = document.getElementById('rescanProgress');
|
||||
const result = document.getElementById('rescanResult');
|
||||
const progText = document.getElementById('rescanProgressText');
|
||||
if (prog) { prog.style.display = 'flex'; }
|
||||
if (result) result.style.display = 'none';
|
||||
try {
|
||||
const data = await fetch('/api/validate/scan', { method: 'POST' }).then(r => r.json());
|
||||
if (progText) progText.textContent = 'Gata.';
|
||||
if (result) {
|
||||
result.innerHTML = `✓ ${data.total_skus_scanned || 0} scanate | ${data.new_missing || 0} noi lipsa | ${data.auto_resolved || 0} rezolvate`;
|
||||
result.style.display = 'block';
|
||||
}
|
||||
loadMissingSkus();
|
||||
} catch(e) {
|
||||
if (progText) progText.textContent = 'Eroare.';
|
||||
} finally {
|
||||
this.disabled = false;
|
||||
setTimeout(() => { if (prog) prog.style.display = 'none'; }, 2500);
|
||||
}
|
||||
});
|
||||
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
loadMissingSkus();
|
||||
});
|
||||
|
||||
function resolvedParamFor(statusFilter) {
|
||||
if (statusFilter === 'resolved') return 1;
|
||||
if (statusFilter === 'all') return -1;
|
||||
return 0; // unresolved (default)
|
||||
}
|
||||
|
||||
function loadMissingSkus(page) {
|
||||
currentPage = page || currentPage;
|
||||
const params = new URLSearchParams();
|
||||
const resolvedVal = resolvedParamFor(skuStatusFilter);
|
||||
params.set('resolved', resolvedVal);
|
||||
params.set('page', currentPage);
|
||||
params.set('per_page', missingPerPage);
|
||||
const search = document.getElementById('skuSearch')?.value?.trim();
|
||||
if (search) params.set('search', search);
|
||||
|
||||
fetch('/api/validate/missing-skus?' + params.toString())
|
||||
.then(r => r.json())
|
||||
.then(data => {
|
||||
const c = data.counts || {};
|
||||
const el = id => document.getElementById(id);
|
||||
if (el('cntUnres')) el('cntUnres').textContent = c.unresolved || 0;
|
||||
if (el('cntRes')) el('cntRes').textContent = c.resolved || 0;
|
||||
if (el('cntAllSkus')) el('cntAllSkus').textContent = c.total || 0;
|
||||
|
||||
// Mobile segmented control
|
||||
renderMobileSegmented('skusMobileSeg', [
|
||||
{ label: 'Nerez.', count: c.unresolved || 0, value: 'unresolved', active: skuStatusFilter === 'unresolved', colorClass: 'fc-yellow' },
|
||||
{ label: 'Rez.', count: c.resolved || 0, value: 'resolved', active: skuStatusFilter === 'resolved', colorClass: 'fc-green' },
|
||||
{ label: 'Toate', count: c.total || 0, value: 'all', active: skuStatusFilter === 'all', colorClass: 'fc-neutral' }
|
||||
], (val) => {
|
||||
document.querySelectorAll('.filter-pill[data-sku-status]').forEach(b => b.classList.remove('active'));
|
||||
const pill = document.querySelector(`.filter-pill[data-sku-status="${val}"]`);
|
||||
if (pill) pill.classList.add('active');
|
||||
skuStatusFilter = val;
|
||||
currentPage = 1;
|
||||
loadMissingSkus();
|
||||
});
|
||||
|
||||
renderMissingSkusTable(data.skus || data.missing_skus || [], data);
|
||||
renderPagination(data);
|
||||
})
|
||||
.catch(err => {
|
||||
document.getElementById('missingBody').innerHTML =
|
||||
`<tr><td colspan="4" class="text-center text-danger">${err.message}</td></tr>`;
|
||||
});
|
||||
}
|
||||
|
||||
// Keep backward compat alias
|
||||
function loadMissing(page) { loadMissingSkus(page); }
|
||||
|
||||
function renderMissingSkusTable(skus, data) {
|
||||
const tbody = document.getElementById('missingBody');
|
||||
const mobileList = document.getElementById('missingMobileList');
|
||||
|
||||
if (!skus || skus.length === 0) {
|
||||
const msg = skuStatusFilter === 'unresolved' ? 'Toate SKU-urile sunt mapate!' :
|
||||
skuStatusFilter === 'resolved' ? 'Niciun SKU rezolvat' : 'Niciun SKU gasit';
|
||||
tbody.innerHTML = `<tr><td colspan="4" class="text-center text-muted py-4">${msg}</td></tr>`;
|
||||
if (mobileList) mobileList.innerHTML = `<div class="flat-row text-muted py-3 justify-content-center">${msg}</div>`;
|
||||
return;
|
||||
}
|
||||
|
||||
tbody.innerHTML = skus.map(s => {
|
||||
const trAttrs = !s.resolved
|
||||
? ` style="cursor:pointer" onclick="openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}')"`
|
||||
: '';
|
||||
return `<tr${trAttrs}>
|
||||
<td>${s.resolved ? '<span class="dot dot-green"></span>' : '<span class="dot dot-yellow"></span>'}</td>
|
||||
<td><code>${esc(s.sku)}</code></td>
|
||||
<td class="truncate" style="max-width:300px">${esc(s.product_name || '-')}</td>
|
||||
<td>
|
||||
${!s.resolved
|
||||
? `<a href="#" class="btn-map-icon" onclick="event.stopPropagation(); openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}'); return false;" title="Mapeaza">
|
||||
<i class="bi bi-link-45deg"></i>
|
||||
</a>`
|
||||
: `<small class="text-muted">${s.resolved_at ? new Date(s.resolved_at).toLocaleDateString('ro-RO') : ''}</small>`}
|
||||
</td>
|
||||
</tr>`;
|
||||
}).join('');
|
||||
|
||||
if (mobileList) {
|
||||
mobileList.innerHTML = skus.map(s => {
|
||||
const actionHtml = !s.resolved
|
||||
? `<a href="#" class="btn-map-icon" onclick="event.stopPropagation(); openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}'); return false;"><i class="bi bi-link-45deg"></i></a>`
|
||||
: `<small class="text-muted">${s.resolved_at ? new Date(s.resolved_at).toLocaleDateString('ro-RO') : ''}</small>`;
|
||||
const flatRowAttrs = !s.resolved
|
||||
? ` onclick="openMapModal('${esc(s.sku)}', '${esc(s.product_name || '')}')" style="cursor:pointer"`
|
||||
: '';
|
||||
return `<div class="flat-row"${flatRowAttrs}>
|
||||
${s.resolved ? '<span class="dot dot-green"></span>' : '<span class="dot dot-yellow"></span>'}
|
||||
<code class="me-1 text-nowrap">${esc(s.sku)}</code>
|
||||
<span class="grow truncate">${esc(s.product_name || '-')}</span>
|
||||
${actionHtml}
|
||||
</div>`;
|
||||
}).join('');
|
||||
}
|
||||
}
|
||||
|
||||
function renderPagination(data) {
|
||||
const pagOpts = { perPage: missingPerPage, perPageFn: 'missingChangePerPage', perPageOptions: [20, 50, 100] };
|
||||
const infoHtml = `<small class="text-muted me-auto">Total: ${data.total || 0} | Pagina ${data.page || 1} din ${data.pages || 1}</small>`;
|
||||
const pagHtml = infoHtml + renderUnifiedPagination(data.page || 1, data.pages || 1, 'loadMissing', pagOpts);
|
||||
const top = document.getElementById('skusPagTop');
|
||||
const bot = document.getElementById('skusPagBottom');
|
||||
if (top) top.innerHTML = pagHtml;
|
||||
if (bot) bot.innerHTML = pagHtml;
|
||||
}
|
||||
|
||||
// ── Map Modal (uses shared openQuickMap) ─────────
|
||||
|
||||
function openMapModal(sku, productName) {
|
||||
openQuickMap({
|
||||
sku,
|
||||
productName,
|
||||
onSave: () => { loadMissingSkus(currentPage); }
|
||||
});
|
||||
}
|
||||
|
||||
function exportMissingCsv() {
|
||||
window.location.href = (window.ROOT_PATH || '') + '/api/validate/missing-skus-csv';
|
||||
}
|
||||
|
||||
</script>
|
||||
{% endblock %}
|
||||
@@ -1,231 +0,0 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Setari - GoMag Import{% endblock %}
|
||||
{% block nav_settings %}active{% endblock %}
|
||||
{% block bnav_settings %}active{% endblock %}
|
||||
{% block main_class %}constrained{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<h4 class="mb-3">Setari</h4>
|
||||
|
||||
<!-- Dark mode toggle -->
|
||||
<div class="theme-toggle-card">
|
||||
<div>
|
||||
<i class="bi bi-moon-fill me-2"></i>
|
||||
<label for="settDarkMode">Mod intunecat</label>
|
||||
</div>
|
||||
<div class="form-check form-switch mb-0">
|
||||
<input class="form-check-input" type="checkbox" role="switch" id="settDarkMode" style="width:2.5rem;height:1.25rem">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="row g-3 mb-3">
|
||||
<!-- GoMag API card -->
|
||||
<div class="col-md-6">
|
||||
<div class="card h-100">
|
||||
<div class="card-header py-2 px-3 fw-semibold">GoMag API</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">API Key</label>
|
||||
<input type="text" class="form-control form-control-sm" id="settGomagApiKey" placeholder="4c5e46...">
|
||||
</div>
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Shop URL</label>
|
||||
<input type="text" class="form-control form-control-sm" id="settGomagApiShop" placeholder="https://coffeepoint.ro">
|
||||
</div>
|
||||
<div class="row g-2">
|
||||
<div class="col-6">
|
||||
<label class="form-label mb-0 small">Zile înapoi</label>
|
||||
<input type="number" class="form-control form-control-sm" id="settGomagDaysBack" value="7" min="1">
|
||||
</div>
|
||||
<div class="col-6">
|
||||
<label class="form-label mb-0 small">Limită/pagină</label>
|
||||
<input type="number" class="form-control form-control-sm" id="settGomagLimit" value="100" min="1">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Import ROA card -->
|
||||
<div class="col-md-6">
|
||||
<div class="card h-100">
|
||||
<div class="card-header py-2 px-3 fw-semibold">Import ROA</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Gestiuni pentru verificare stoc</label>
|
||||
<div id="settGestiuniContainer" class="border rounded p-2" style="max-height:120px;overflow-y:auto;font-size:0.85rem">
|
||||
<span class="text-muted small">Se încarcă...</span>
|
||||
</div>
|
||||
<div class="form-text" style="font-size:0.75rem">Nicio selecție = orice gestiune</div>
|
||||
</div>
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Secție (ID_SECTIE)</label>
|
||||
<select class="form-select form-select-sm" id="settIdSectie">
|
||||
<option value="">— selectează secție —</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Politică Preț Vânzare (ID_POL)</label>
|
||||
<select class="form-select form-select-sm" id="settIdPol">
|
||||
<option value="">— selectează politică —</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Politică Preț Producție</label>
|
||||
<select class="form-select form-select-sm" id="settIdPolProductie">
|
||||
<option value="">— fără politică producție —</option>
|
||||
</select>
|
||||
<div class="form-text" style="font-size:0.75rem">Pentru articole cu cont 341/345 (producție proprie)</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="row g-3 mb-3">
|
||||
<!-- Transport card -->
|
||||
<div class="col-md-6">
|
||||
<div class="card h-100">
|
||||
<div class="card-header py-2 px-3 fw-semibold">Transport</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">CODMAT Transport</label>
|
||||
<div class="position-relative">
|
||||
<input type="text" class="form-control form-control-sm" id="settTransportCodmat" placeholder="ex: TRANSPORT" autocomplete="off">
|
||||
<div class="autocomplete-dropdown d-none" id="settTransportAc"></div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="row g-2">
|
||||
<div class="col-6">
|
||||
<label class="form-label mb-0 small">TVA Transport (%)</label>
|
||||
<select class="form-select form-select-sm" id="settTransportVat">
|
||||
<option value="5">5%</option>
|
||||
<option value="9">9%</option>
|
||||
<option value="19">19%</option>
|
||||
<option value="21" selected>21%</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-6">
|
||||
<label class="form-label mb-0 small">Politică Transport</label>
|
||||
<select class="form-select form-select-sm" id="settTransportIdPol">
|
||||
<option value="">— implicită —</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Discount card -->
|
||||
<div class="col-md-6">
|
||||
<div class="card h-100">
|
||||
<div class="card-header py-2 px-3 fw-semibold">Discount</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">CODMAT Discount</label>
|
||||
<div class="position-relative">
|
||||
<input type="text" class="form-control form-control-sm" id="settDiscountCodmat" placeholder="ex: DISCOUNT" autocomplete="off">
|
||||
<div class="autocomplete-dropdown d-none" id="settDiscountAc"></div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="row g-2">
|
||||
<div class="col-6">
|
||||
<label class="form-label mb-0 small">TVA Discount (fallback %)</label>
|
||||
<select class="form-select form-select-sm" id="settDiscountVat">
|
||||
<option value="5">5%</option>
|
||||
<option value="9">9%</option>
|
||||
<option value="11">11%</option>
|
||||
<option value="19">19%</option>
|
||||
<option value="21" selected>21%</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col-6">
|
||||
<label class="form-label mb-0 small">Politică Discount</label>
|
||||
<select class="form-select form-select-sm" id="settDiscountIdPol">
|
||||
<option value="">— implicită —</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mt-2 form-check">
|
||||
<input type="checkbox" class="form-check-input" id="settSplitDiscountVat">
|
||||
<label class="form-check-label small" for="settSplitDiscountVat">
|
||||
Împarte discount pe cote TVA (proporțional cu valoarea articolelor)
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="mt-4">
|
||||
<button class="btn btn-sm btn-outline-secondary" type="button" data-bs-toggle="collapse" data-bs-target="#advancedSettings" aria-expanded="false" title="Modificati doar la indicatia echipei tehnice">
|
||||
<i class="bi bi-gear"></i> Setari avansate
|
||||
</button>
|
||||
<div class="collapse mt-2" id="advancedSettings">
|
||||
<div class="row g-3 mb-3">
|
||||
<div class="col-md-6">
|
||||
<div class="card h-100">
|
||||
<div class="card-header py-2 px-3 fw-semibold">Dashboard</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Interval polling (secunde)</label>
|
||||
<input type="number" class="form-control form-control-sm" id="settDashPollSeconds" value="5" min="1" max="300">
|
||||
<div class="form-text" style="font-size:0.75rem">Cât de des verifică dashboard-ul starea sync-ului (implicit 5s)</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-12">
|
||||
<div class="card h-100">
|
||||
<div class="card-header py-2 px-3 fw-semibold">Pricing Kituri / Pachete</div>
|
||||
<div class="card-body py-2 px-3">
|
||||
<div class="mb-2">
|
||||
<div class="form-check">
|
||||
<input class="form-check-input" type="radio" name="kitPricingMode" id="kitModeOff" value="" checked>
|
||||
<label class="form-check-label small" for="kitModeOff">Dezactivat</label>
|
||||
</div>
|
||||
<div class="form-check">
|
||||
<input class="form-check-input" type="radio" name="kitPricingMode" id="kitModeDistributed" value="distributed">
|
||||
<label class="form-check-label small" for="kitModeDistributed">Distribuire discount în preț</label>
|
||||
</div>
|
||||
<div class="form-check">
|
||||
<input class="form-check-input" type="radio" name="kitPricingMode" id="kitModeSeparate" value="separate_line">
|
||||
<label class="form-check-label small" for="kitModeSeparate">Linie discount separată</label>
|
||||
</div>
|
||||
</div>
|
||||
<div id="kitModeBFields" style="display:none">
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Kit Discount CODMAT</label>
|
||||
<div class="position-relative">
|
||||
<input type="text" class="form-control form-control-sm" id="settKitDiscountCodmat" placeholder="ex: DISCOUNT_KIT" autocomplete="off">
|
||||
<div class="autocomplete-dropdown d-none" id="settKitDiscountAc"></div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="mb-2">
|
||||
<label class="form-label mb-0 small">Kit Discount Politică</label>
|
||||
<select class="form-select form-select-sm" id="settKitDiscountIdPol">
|
||||
<option value="">— implicită —</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-check mt-2">
|
||||
<input type="checkbox" class="form-check-input" id="settPriceSyncEnabled" checked>
|
||||
<label class="form-check-label small" for="settPriceSyncEnabled">Sync automat prețuri din comenzi</label>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="mb-3">
|
||||
<button class="btn btn-primary btn-sm" onclick="saveSettings()">Salvează Setările</button>
|
||||
<span id="settSaveResult" class="ms-2 small"></span>
|
||||
</div>
|
||||
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script src="{{ request.scope.get('root_path', '') }}/static/js/settings.js?v=10"></script>
|
||||
{% endblock %}
|
||||
@@ -1,43 +0,0 @@
|
||||
-- ====================================================================
|
||||
-- P1-001: Tabel ARTICOLE_TERTI pentru mapări SKU → CODMAT
|
||||
-- Sistem Import Comenzi Web → ROA
|
||||
-- ====================================================================
|
||||
|
||||
-- Creare tabel pentru mapări complexe articole
|
||||
CREATE TABLE ARTICOLE_TERTI (
|
||||
sku VARCHAR2(100) NOT NULL, -- SKU din platforma web
|
||||
codmat VARCHAR2(50) NOT NULL, -- CODMAT din nom_articole
|
||||
cantitate_roa NUMBER(10,3) DEFAULT 1, -- Câte unități ROA = 1 web
|
||||
procent_pret NUMBER(5,2) DEFAULT 100, -- % din preț pentru seturi
|
||||
activ NUMBER(1) DEFAULT 1, -- 1=activ, 0=inactiv
|
||||
data_creare DATE DEFAULT SYSDATE, -- Timestamp creare
|
||||
data_modif DATE DEFAULT SYSDATE, -- Timestamp ultima modificare
|
||||
id_util_creare NUMBER(10) DEFAULT -3, -- ID utilizator care a creat
|
||||
id_util_modif NUMBER(10) DEFAULT -3 -- ID utilizator care a modificat
|
||||
);
|
||||
|
||||
-- Adaugare constraint-uri ca instructiuni separate
|
||||
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT pk_articole_terti PRIMARY KEY (sku, codmat);
|
||||
|
||||
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_cantitate CHECK (cantitate_roa > 0);
|
||||
|
||||
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_procent CHECK (procent_pret >= 0 AND procent_pret <= 100);
|
||||
|
||||
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_activ CHECK (activ IN (0, 1));
|
||||
|
||||
-- Index pentru performanță pe căutări frecvente după SKU
|
||||
CREATE INDEX idx_articole_terti_sku ON ARTICOLE_TERTI (sku, activ);
|
||||
|
||||
|
||||
-- Comentarii pentru documentație
|
||||
COMMENT ON TABLE ARTICOLE_TERTI IS 'Mapări SKU-uri web → CODMAT ROA pentru reîmpachetări și seturi';
|
||||
COMMENT ON COLUMN ARTICOLE_TERTI.sku IS 'SKU din platforma web (ex: GoMag)';
|
||||
COMMENT ON COLUMN ARTICOLE_TERTI.codmat IS 'CODMAT din nom_articole ROA';
|
||||
COMMENT ON COLUMN ARTICOLE_TERTI.cantitate_roa IS 'Câte unități ROA pentru 1 unitate web';
|
||||
COMMENT ON COLUMN ARTICOLE_TERTI.procent_pret IS 'Procent din preț web alocat acestui CODMAT (pentru seturi)';
|
||||
COMMENT ON COLUMN ARTICOLE_TERTI.activ IS '1=mapare activă, 0=dezactivată';
|
||||
|
||||
-- Date de test pentru validare
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ) VALUES ('CAFE100', 'CAF01', 10, 100, 1);
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ) VALUES ('SET01', 'CAF01', 2, 60, 1);
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ) VALUES ('SET01', 'FILT01', 1, 40, 1);
|
||||
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@@ -1,662 +0,0 @@
|
||||
CREATE OR REPLACE PACKAGE PACK_IMPORT_COMENZI AS
|
||||
|
||||
-- Variabila package pentru ultima eroare (pentru orchestrator VFP)
|
||||
g_last_error VARCHAR2(4000);
|
||||
|
||||
-- Procedura pentru importul complet al unei comenzi
|
||||
PROCEDURE importa_comanda(p_nr_comanda_ext IN VARCHAR2,
|
||||
p_data_comanda IN DATE,
|
||||
p_id_partener IN NUMBER,
|
||||
p_json_articole IN CLOB,
|
||||
p_id_adresa_livrare IN NUMBER DEFAULT NULL,
|
||||
p_id_adresa_facturare IN NUMBER DEFAULT NULL,
|
||||
p_id_pol IN NUMBER DEFAULT NULL,
|
||||
p_id_sectie IN NUMBER DEFAULT NULL,
|
||||
p_id_gestiune IN VARCHAR2 DEFAULT NULL,
|
||||
p_kit_mode IN VARCHAR2 DEFAULT NULL,
|
||||
p_id_pol_productie IN NUMBER DEFAULT NULL,
|
||||
p_kit_discount_codmat IN VARCHAR2 DEFAULT NULL,
|
||||
p_kit_discount_id_pol IN NUMBER DEFAULT NULL,
|
||||
v_id_comanda OUT NUMBER);
|
||||
|
||||
-- Functii pentru managementul erorilor (pentru orchestrator VFP)
|
||||
FUNCTION get_last_error RETURN VARCHAR2;
|
||||
PROCEDURE clear_error;
|
||||
|
||||
END PACK_IMPORT_COMENZI;
|
||||
/
|
||||
CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
|
||||
|
||||
-- ====================================================================
|
||||
-- PACK_IMPORT_COMENZI
|
||||
-- Package pentru importul comenzilor din platforme web (GoMag, etc.)
|
||||
-- in sistemul ROA Oracle.
|
||||
--
|
||||
-- Dependinte:
|
||||
-- Packages: PACK_COMENZI (adauga_comanda, adauga_articol_comanda)
|
||||
-- pljson (pljson_list, pljson) - instalat in CONTAFIN_ORACLE,
|
||||
-- accesat prin PUBLIC SYNONYM
|
||||
-- Tabele: ARTICOLE_TERTI (mapari SKU -> CODMAT)
|
||||
-- NOM_ARTICOLE (nomenclator articole ROA)
|
||||
-- COMENZI (verificare duplicat comanda_externa)
|
||||
-- CRM_POLITICI_PRETURI (flag PRETURI_CU_TVA per politica)
|
||||
-- CRM_POLITICI_PRET_ART (preturi componente kituri)
|
||||
|
||||
-- 20.03.2026 - dual policy vanzare/productie, kit pricing distributed/separate_line, SKU→CODMAT via ARTICOLE_TERTI
|
||||
-- 20.03.2026 - kit discount deferred cross-kit (separate_line, merge-on-collision)
|
||||
-- 20.03.2026 - merge_or_insert_articol: merge cantitati cand kit+individual au acelasi articol/pret
|
||||
-- 20.03.2026 - kit pricing extins pt reambalari single-component (cantitate_roa > 1)
|
||||
-- 21.03.2026 - diagnostic detaliat discount kit (id_pol, id_art, codmat in eroare)
|
||||
-- 21.03.2026 - fix discount amount: v_disc_amt e per-kit, nu se imparte la v_cantitate_web
|
||||
-- 25.03.2026 - skip negative kit discount (markup), ROUND prices to nzecimale_pretv
|
||||
-- 25.03.2026 - kit discount inserat per-kit sub componente (nu deferred cross-kit)
|
||||
-- 22.04.2026 - fix duplicate article: NOM_ARTICOLE fallback si kit discount line folosesc merge_or_insert_articol (prod VENDING comanda 485224762)
|
||||
-- ====================================================================
|
||||
|
||||
-- Constante pentru configurare
|
||||
c_id_util CONSTANT NUMBER := -3; -- Sistem
|
||||
c_interna CONSTANT NUMBER := 2; -- Comenzi de la client (web)
|
||||
|
||||
-- Tipuri pentru kit pricing (accesibile in toate procedurile din body)
|
||||
TYPE t_kit_component IS RECORD (
|
||||
codmat VARCHAR2(50),
|
||||
id_articol NUMBER,
|
||||
cantitate_roa NUMBER,
|
||||
pret_cu_tva NUMBER,
|
||||
ptva NUMBER,
|
||||
id_pol_comp NUMBER,
|
||||
value_total NUMBER
|
||||
);
|
||||
TYPE t_kit_components IS TABLE OF t_kit_component INDEX BY PLS_INTEGER;
|
||||
|
||||
-- ================================================================
|
||||
-- Functii helper pentru managementul erorilor
|
||||
-- ================================================================
|
||||
FUNCTION get_last_error RETURN VARCHAR2 IS
|
||||
BEGIN
|
||||
RETURN g_last_error;
|
||||
END get_last_error;
|
||||
|
||||
PROCEDURE clear_error IS
|
||||
BEGIN
|
||||
g_last_error := NULL;
|
||||
END clear_error;
|
||||
|
||||
-- ================================================================
|
||||
-- Functie helper: selecteaza id_articol corect pentru un CODMAT
|
||||
-- Prioritate: sters=0 AND inactiv=0, preferinta stoc, MAX(id_articol) fallback
|
||||
-- ================================================================
|
||||
FUNCTION resolve_id_articol(p_codmat IN VARCHAR2, p_id_gest IN VARCHAR2) RETURN NUMBER IS
|
||||
v_result NUMBER;
|
||||
BEGIN
|
||||
IF p_id_gest IS NOT NULL THEN
|
||||
-- Cu gestiuni specifice (CSV: "1,3") — split in subquery pentru IN clause
|
||||
BEGIN
|
||||
SELECT id_articol INTO v_result FROM (
|
||||
SELECT na.id_articol
|
||||
FROM nom_articole na
|
||||
WHERE na.codmat = p_codmat AND na.sters = 0 AND na.inactiv = 0
|
||||
ORDER BY
|
||||
CASE WHEN EXISTS (
|
||||
SELECT 1 FROM stoc s
|
||||
WHERE s.id_articol = na.id_articol
|
||||
AND s.id_gestiune IN (
|
||||
SELECT TO_NUMBER(REGEXP_SUBSTR(p_id_gest, '[^,]+', 1, LEVEL))
|
||||
FROM DUAL
|
||||
CONNECT BY LEVEL <= REGEXP_COUNT(p_id_gest, ',') + 1
|
||||
)
|
||||
AND s.an = EXTRACT(YEAR FROM SYSDATE)
|
||||
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
|
||||
AND s.cants + s.cant - s.cante > 0
|
||||
) THEN 0 ELSE 1 END,
|
||||
na.id_articol DESC
|
||||
) WHERE ROWNUM = 1;
|
||||
EXCEPTION WHEN NO_DATA_FOUND THEN v_result := NULL;
|
||||
END;
|
||||
ELSE
|
||||
-- Fara gestiune — cauta stoc in orice gestiune
|
||||
BEGIN
|
||||
SELECT id_articol INTO v_result FROM (
|
||||
SELECT na.id_articol
|
||||
FROM nom_articole na
|
||||
WHERE na.codmat = p_codmat AND na.sters = 0 AND na.inactiv = 0
|
||||
ORDER BY
|
||||
CASE WHEN EXISTS (
|
||||
SELECT 1 FROM stoc s
|
||||
WHERE s.id_articol = na.id_articol
|
||||
AND s.an = EXTRACT(YEAR FROM SYSDATE)
|
||||
AND s.luna = EXTRACT(MONTH FROM SYSDATE)
|
||||
AND s.cants + s.cant - s.cante > 0
|
||||
) THEN 0 ELSE 1 END,
|
||||
na.id_articol DESC
|
||||
) WHERE ROWNUM = 1;
|
||||
EXCEPTION WHEN NO_DATA_FOUND THEN v_result := NULL;
|
||||
END;
|
||||
END IF;
|
||||
RETURN v_result;
|
||||
END resolve_id_articol;
|
||||
|
||||
-- ================================================================
|
||||
-- Helper: merge-or-insert articol pe comanda
|
||||
-- Daca aceeasi combinatie (ID_COMANDA, ID_ARTICOL, PTVA, PRET, SIGN(CANTITATE))
|
||||
-- exista deja, aduna cantitatea; altfel insereaza linie noua.
|
||||
-- Previne crash la duplicate cand acelasi articol apare din kit + individual.
|
||||
-- ================================================================
|
||||
PROCEDURE merge_or_insert_articol(
|
||||
p_id_comanda IN NUMBER,
|
||||
p_id_articol IN NUMBER,
|
||||
p_id_pol IN NUMBER,
|
||||
p_cantitate IN NUMBER,
|
||||
p_pret IN NUMBER,
|
||||
p_id_util IN NUMBER,
|
||||
p_id_sectie IN NUMBER,
|
||||
p_ptva IN NUMBER
|
||||
) IS
|
||||
v_cnt NUMBER;
|
||||
BEGIN
|
||||
SELECT COUNT(*) INTO v_cnt
|
||||
FROM COMENZI_ELEMENTE
|
||||
WHERE ID_COMANDA = p_id_comanda
|
||||
AND ID_ARTICOL = p_id_articol
|
||||
AND NVL(PTVA, 0) = NVL(p_ptva, 0)
|
||||
AND PRET = p_pret
|
||||
AND SIGN(CANTITATE) = SIGN(p_cantitate)
|
||||
AND STERS = 0;
|
||||
|
||||
IF v_cnt > 0 THEN
|
||||
UPDATE COMENZI_ELEMENTE
|
||||
SET CANTITATE = CANTITATE + p_cantitate
|
||||
WHERE ID_COMANDA = p_id_comanda
|
||||
AND ID_ARTICOL = p_id_articol
|
||||
AND NVL(PTVA, 0) = NVL(p_ptva, 0)
|
||||
AND PRET = p_pret
|
||||
AND SIGN(CANTITATE) = SIGN(p_cantitate)
|
||||
AND STERS = 0
|
||||
AND ROWNUM = 1;
|
||||
ELSE
|
||||
PACK_COMENZI.adauga_articol_comanda(
|
||||
V_ID_COMANDA => p_id_comanda,
|
||||
V_ID_ARTICOL => p_id_articol,
|
||||
V_ID_POL => p_id_pol,
|
||||
V_CANTITATE => p_cantitate,
|
||||
V_PRET => p_pret,
|
||||
V_ID_UTIL => p_id_util,
|
||||
V_ID_SECTIE => p_id_sectie,
|
||||
V_PTVA => p_ptva);
|
||||
END IF;
|
||||
END merge_or_insert_articol;
|
||||
|
||||
-- ================================================================
|
||||
-- Procedura principala pentru importul unei comenzi
|
||||
-- ================================================================
|
||||
PROCEDURE importa_comanda(p_nr_comanda_ext IN VARCHAR2,
|
||||
p_data_comanda IN DATE,
|
||||
p_id_partener IN NUMBER,
|
||||
p_json_articole IN CLOB,
|
||||
p_id_adresa_livrare IN NUMBER DEFAULT NULL,
|
||||
p_id_adresa_facturare IN NUMBER DEFAULT NULL,
|
||||
p_id_pol IN NUMBER DEFAULT NULL,
|
||||
p_id_sectie IN NUMBER DEFAULT NULL,
|
||||
p_id_gestiune IN VARCHAR2 DEFAULT NULL,
|
||||
p_kit_mode IN VARCHAR2 DEFAULT NULL,
|
||||
p_id_pol_productie IN NUMBER DEFAULT NULL,
|
||||
p_kit_discount_codmat IN VARCHAR2 DEFAULT NULL,
|
||||
p_kit_discount_id_pol IN NUMBER DEFAULT NULL,
|
||||
v_id_comanda OUT NUMBER) IS
|
||||
v_data_livrare DATE;
|
||||
v_sku VARCHAR2(100);
|
||||
v_cantitate_web NUMBER;
|
||||
v_pret_web NUMBER;
|
||||
v_vat NUMBER;
|
||||
v_articole_procesate NUMBER := 0;
|
||||
v_articole_eroare NUMBER := 0;
|
||||
v_articol_count NUMBER := 0;
|
||||
|
||||
-- Variabile pentru cautare articol
|
||||
v_found_mapping BOOLEAN;
|
||||
v_id_articol NUMBER;
|
||||
v_codmat VARCHAR2(50);
|
||||
v_cantitate_roa NUMBER;
|
||||
v_pret_unitar NUMBER;
|
||||
v_id_pol_articol NUMBER; -- id_pol per articol (din JSON), prioritar fata de p_id_pol
|
||||
|
||||
-- Variabile kit pricing
|
||||
v_kit_count NUMBER := 0;
|
||||
v_max_cant_roa NUMBER := 1;
|
||||
v_kit_comps t_kit_components;
|
||||
v_sum_list_prices NUMBER;
|
||||
v_discount_total NUMBER;
|
||||
v_discount_share NUMBER;
|
||||
v_pret_ajustat NUMBER;
|
||||
v_discount_allocated NUMBER;
|
||||
|
||||
-- Zecimale pret vanzare (din optiuni firma, default 2)
|
||||
v_nzec_pretv PLS_INTEGER := NVL(TO_NUMBER(pack_sesiune.getoptiunefirma(USER, 'PPRETV')), 2);
|
||||
|
||||
-- pljson
|
||||
l_json_articole CLOB := p_json_articole;
|
||||
v_json_arr pljson_list;
|
||||
v_json_obj pljson;
|
||||
BEGIN
|
||||
-- Resetare eroare la inceputul procesarii
|
||||
clear_error;
|
||||
|
||||
-- Validari de baza
|
||||
IF p_nr_comanda_ext IS NULL OR p_id_partener IS NULL THEN
|
||||
g_last_error := 'IMPORTA_COMANDA ' || NVL(p_nr_comanda_ext, 'NULL') ||
|
||||
': Parametri obligatorii lipsa';
|
||||
GOTO SFARSIT;
|
||||
END IF;
|
||||
|
||||
-- Verifica daca comanda nu exista deja
|
||||
BEGIN
|
||||
SELECT id_comanda
|
||||
INTO v_id_comanda
|
||||
FROM comenzi
|
||||
WHERE comanda_externa = p_nr_comanda_ext
|
||||
AND sters = 0;
|
||||
|
||||
IF v_id_comanda IS NOT NULL THEN
|
||||
GOTO sfarsit;
|
||||
END IF;
|
||||
EXCEPTION
|
||||
WHEN NO_DATA_FOUND THEN
|
||||
NULL; -- Normal, comanda nu exista
|
||||
END;
|
||||
|
||||
-- Calculeaza data de livrare (comanda + 1 zi)
|
||||
v_data_livrare := p_data_comanda + 1;
|
||||
|
||||
-- STEP 1: Creeaza comanda
|
||||
PACK_COMENZI.adauga_comanda(V_NR_COMANDA => p_nr_comanda_ext,
|
||||
V_DATA_COMANDA => p_data_comanda,
|
||||
V_ID => p_id_partener,
|
||||
V_DATA_LIVRARE => v_data_livrare,
|
||||
V_PROC_DISCOUNT => 0,
|
||||
V_INTERNA => c_interna,
|
||||
V_ID_UTIL => c_id_util,
|
||||
V_ID_SECTIE => p_id_sectie,
|
||||
V_ID_ADRESA_FACTURARE => p_id_adresa_facturare,
|
||||
V_ID_ADRESA_LIVRARE => p_id_adresa_livrare,
|
||||
V_ID_CODCLIENT => NULL,
|
||||
V_COMANDA_EXTERNA => p_nr_comanda_ext,
|
||||
V_ID_CTR => NULL,
|
||||
V_ID_COMANDA => v_id_comanda);
|
||||
|
||||
IF v_id_comanda IS NULL OR v_id_comanda <= 0 THEN
|
||||
g_last_error := 'IMPORTA_COMANDA ' || p_nr_comanda_ext ||
|
||||
': PACK_COMENZI.adauga_comanda a returnat ID invalid';
|
||||
GOTO sfarsit;
|
||||
END IF;
|
||||
|
||||
-- STEP 2: Proceseaza articolele din JSON folosind pljson
|
||||
-- Suporta atat array "[{...},{...}]" cat si obiect singular "{...}"
|
||||
IF LTRIM(l_json_articole) LIKE '[%' THEN
|
||||
v_json_arr := pljson_list(l_json_articole);
|
||||
ELSE
|
||||
v_json_arr := pljson_list('[' || l_json_articole || ']');
|
||||
END IF;
|
||||
|
||||
FOR i IN 1 .. v_json_arr.count LOOP
|
||||
v_articol_count := v_articol_count + 1;
|
||||
v_json_obj := pljson(v_json_arr.get(i));
|
||||
|
||||
BEGIN
|
||||
-- Extrage datele folosind pljson (valorile vin ca string din json magazin web)
|
||||
v_sku := v_json_obj.get_string('sku');
|
||||
v_cantitate_web := TO_NUMBER(v_json_obj.get_string('quantity'));
|
||||
v_pret_web := TO_NUMBER(v_json_obj.get_string('price'));
|
||||
v_vat := TO_NUMBER(v_json_obj.get_string('vat'));
|
||||
|
||||
-- id_pol per articol (optional, pentru transport/discount cu politica separata)
|
||||
BEGIN
|
||||
v_id_pol_articol := TO_NUMBER(v_json_obj.get_string('id_pol'));
|
||||
EXCEPTION
|
||||
WHEN OTHERS THEN v_id_pol_articol := NULL;
|
||||
END;
|
||||
|
||||
-- STEP 3: Gaseste articolele ROA pentru acest SKU
|
||||
v_found_mapping := FALSE;
|
||||
|
||||
-- Numara randurile ARTICOLE_TERTI pentru a detecta kituri (>1 rand = set compus)
|
||||
SELECT COUNT(*), NVL(MAX(at.cantitate_roa), 1)
|
||||
INTO v_kit_count, v_max_cant_roa
|
||||
FROM articole_terti at
|
||||
WHERE at.sku = v_sku
|
||||
AND at.activ = 1
|
||||
AND at.sters = 0;
|
||||
|
||||
IF ((v_kit_count > 1) OR (v_kit_count = 1 AND v_max_cant_roa > 1))
|
||||
AND p_kit_mode IS NOT NULL THEN
|
||||
-- ============================================================
|
||||
-- KIT PRICING: set compus (>1 componente) sau reambalare (cantitate_roa>1), mod activ
|
||||
-- Prima trecere: colecteaza componente + preturi din politici
|
||||
-- ============================================================
|
||||
v_found_mapping := TRUE;
|
||||
v_kit_comps.DELETE;
|
||||
v_sum_list_prices := 0;
|
||||
|
||||
DECLARE
|
||||
v_comp_idx PLS_INTEGER := 0;
|
||||
v_cont_vanz VARCHAR2(20);
|
||||
v_preturi_fl NUMBER;
|
||||
v_pret_val NUMBER;
|
||||
v_proc_tva NUMBER;
|
||||
BEGIN
|
||||
FOR rec IN (SELECT at.codmat, at.cantitate_roa
|
||||
FROM articole_terti at
|
||||
WHERE at.sku = v_sku
|
||||
AND at.activ = 1
|
||||
AND at.sters = 0
|
||||
ORDER BY at.codmat) LOOP
|
||||
v_comp_idx := v_comp_idx + 1;
|
||||
v_kit_comps(v_comp_idx).codmat := rec.codmat;
|
||||
v_kit_comps(v_comp_idx).cantitate_roa := rec.cantitate_roa;
|
||||
v_kit_comps(v_comp_idx).id_articol :=
|
||||
resolve_id_articol(rec.codmat, p_id_gestiune);
|
||||
|
||||
IF v_kit_comps(v_comp_idx).id_articol IS NULL THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Articol activ negasit pentru CODMAT: ' || rec.codmat;
|
||||
v_kit_comps(v_comp_idx).pret_cu_tva := 0;
|
||||
v_kit_comps(v_comp_idx).ptva := ROUND(v_vat);
|
||||
v_kit_comps(v_comp_idx).id_pol_comp := NVL(v_id_pol_articol, p_id_pol);
|
||||
v_kit_comps(v_comp_idx).value_total := 0;
|
||||
CONTINUE;
|
||||
END IF;
|
||||
|
||||
-- Determina id_pol_comp: cont 341/345 → politica productie, altfel vanzare
|
||||
BEGIN
|
||||
SELECT NVL(na.cont, '') INTO v_cont_vanz
|
||||
FROM nom_articole na
|
||||
WHERE na.id_articol = v_kit_comps(v_comp_idx).id_articol
|
||||
AND ROWNUM = 1;
|
||||
EXCEPTION WHEN OTHERS THEN v_cont_vanz := '';
|
||||
END;
|
||||
|
||||
IF v_cont_vanz IN ('341', '345') AND p_id_pol_productie IS NOT NULL THEN
|
||||
v_kit_comps(v_comp_idx).id_pol_comp := p_id_pol_productie;
|
||||
ELSE
|
||||
v_kit_comps(v_comp_idx).id_pol_comp := NVL(v_id_pol_articol, p_id_pol);
|
||||
END IF;
|
||||
|
||||
-- Query flag PRETURI_CU_TVA pentru aceasta politica
|
||||
BEGIN
|
||||
SELECT NVL(pp.preturi_cu_tva, 0) INTO v_preturi_fl
|
||||
FROM crm_politici_preturi pp
|
||||
WHERE pp.id_pol = v_kit_comps(v_comp_idx).id_pol_comp;
|
||||
EXCEPTION WHEN OTHERS THEN v_preturi_fl := 0;
|
||||
END;
|
||||
|
||||
-- Citeste PRET si PROC_TVAV din crm_politici_pret_art
|
||||
BEGIN
|
||||
SELECT ppa.pret, NVL(ppa.proc_tvav, 1)
|
||||
INTO v_pret_val, v_proc_tva
|
||||
FROM crm_politici_pret_art ppa
|
||||
WHERE ppa.id_pol = v_kit_comps(v_comp_idx).id_pol_comp
|
||||
AND ppa.id_articol = v_kit_comps(v_comp_idx).id_articol
|
||||
AND ROWNUM = 1;
|
||||
|
||||
-- V_PRET always WITH TVA
|
||||
IF v_preturi_fl = 1 THEN
|
||||
v_kit_comps(v_comp_idx).pret_cu_tva := v_pret_val;
|
||||
ELSE
|
||||
v_kit_comps(v_comp_idx).pret_cu_tva := v_pret_val * v_proc_tva;
|
||||
END IF;
|
||||
v_kit_comps(v_comp_idx).ptva := ROUND((v_proc_tva - 1) * 100);
|
||||
EXCEPTION WHEN OTHERS THEN
|
||||
v_kit_comps(v_comp_idx).pret_cu_tva := 0;
|
||||
v_kit_comps(v_comp_idx).ptva := ROUND(v_vat);
|
||||
END;
|
||||
|
||||
v_kit_comps(v_comp_idx).value_total :=
|
||||
v_kit_comps(v_comp_idx).pret_cu_tva * v_kit_comps(v_comp_idx).cantitate_roa;
|
||||
v_sum_list_prices := v_sum_list_prices + v_kit_comps(v_comp_idx).value_total;
|
||||
END LOOP;
|
||||
END; -- end prima trecere
|
||||
|
||||
-- Discount = suma liste - pret web (poate fi negativ = markup)
|
||||
v_discount_total := v_sum_list_prices - v_pret_web;
|
||||
|
||||
-- ============================================================
|
||||
-- A doua trecere: inserare in functie de mod
|
||||
-- ============================================================
|
||||
IF p_kit_mode = 'distributed' THEN
|
||||
-- Mode A: distribui discountul proportional in pretul fiecarei componente
|
||||
v_discount_allocated := 0;
|
||||
FOR i_comp IN 1 .. v_kit_comps.COUNT LOOP
|
||||
IF v_kit_comps(i_comp).id_articol IS NOT NULL THEN
|
||||
-- Ultimul articol valid primeste remainder pentru precizie exacta
|
||||
IF i_comp = v_kit_comps.LAST THEN
|
||||
v_discount_share := v_discount_total - v_discount_allocated;
|
||||
ELSE
|
||||
IF v_sum_list_prices != 0 THEN
|
||||
v_discount_share := v_discount_total *
|
||||
(v_kit_comps(i_comp).value_total / v_sum_list_prices);
|
||||
ELSE
|
||||
v_discount_share := 0;
|
||||
END IF;
|
||||
v_discount_allocated := v_discount_allocated + v_discount_share;
|
||||
END IF;
|
||||
|
||||
-- pret_ajustat = pret_cu_tva - discount_share / cantitate_roa
|
||||
v_pret_ajustat := ROUND(
|
||||
v_kit_comps(i_comp).pret_cu_tva -
|
||||
(v_discount_share / v_kit_comps(i_comp).cantitate_roa),
|
||||
v_nzec_pretv);
|
||||
|
||||
BEGIN
|
||||
merge_or_insert_articol(
|
||||
p_id_comanda => v_id_comanda,
|
||||
p_id_articol => v_kit_comps(i_comp).id_articol,
|
||||
p_id_pol => v_kit_comps(i_comp).id_pol_comp,
|
||||
p_cantitate => v_kit_comps(i_comp).cantitate_roa * v_cantitate_web,
|
||||
p_pret => v_pret_ajustat,
|
||||
p_id_util => c_id_util,
|
||||
p_id_sectie => p_id_sectie,
|
||||
p_ptva => v_kit_comps(i_comp).ptva);
|
||||
v_articole_procesate := v_articole_procesate + 1;
|
||||
EXCEPTION
|
||||
WHEN OTHERS THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Eroare adaugare kit component (A) ' ||
|
||||
v_kit_comps(i_comp).codmat || ': ' || SQLERRM;
|
||||
END;
|
||||
END IF;
|
||||
END LOOP;
|
||||
|
||||
ELSIF p_kit_mode = 'separate_line' THEN
|
||||
-- Mode B: componente la pret plin, discount per-kit imediat sub componente
|
||||
DECLARE
|
||||
TYPE t_vat_discount IS TABLE OF NUMBER INDEX BY PLS_INTEGER;
|
||||
v_vat_disc t_vat_discount;
|
||||
v_vat_key PLS_INTEGER;
|
||||
v_vat_disc_alloc NUMBER;
|
||||
v_disc_amt NUMBER;
|
||||
BEGIN
|
||||
-- Inserare componente la pret plin + acumulare discount pe cota TVA (per kit)
|
||||
FOR i_comp IN 1 .. v_kit_comps.COUNT LOOP
|
||||
IF v_kit_comps(i_comp).id_articol IS NOT NULL THEN
|
||||
BEGIN
|
||||
merge_or_insert_articol(
|
||||
p_id_comanda => v_id_comanda,
|
||||
p_id_articol => v_kit_comps(i_comp).id_articol,
|
||||
p_id_pol => v_kit_comps(i_comp).id_pol_comp,
|
||||
p_cantitate => v_kit_comps(i_comp).cantitate_roa * v_cantitate_web,
|
||||
p_pret => v_kit_comps(i_comp).pret_cu_tva,
|
||||
p_id_util => c_id_util,
|
||||
p_id_sectie => p_id_sectie,
|
||||
p_ptva => v_kit_comps(i_comp).ptva);
|
||||
v_articole_procesate := v_articole_procesate + 1;
|
||||
EXCEPTION
|
||||
WHEN OTHERS THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Eroare adaugare kit component (B) ' ||
|
||||
v_kit_comps(i_comp).codmat || ': ' || SQLERRM;
|
||||
END;
|
||||
|
||||
-- Acumuleaza discountul pe cota TVA (per kit, local)
|
||||
v_vat_key := v_kit_comps(i_comp).ptva;
|
||||
IF v_sum_list_prices != 0 THEN
|
||||
IF v_vat_disc.EXISTS(v_vat_key) THEN
|
||||
v_vat_disc(v_vat_key) := v_vat_disc(v_vat_key) +
|
||||
v_discount_total * (v_kit_comps(i_comp).value_total / v_sum_list_prices);
|
||||
ELSE
|
||||
v_vat_disc(v_vat_key) :=
|
||||
v_discount_total * (v_kit_comps(i_comp).value_total / v_sum_list_prices);
|
||||
END IF;
|
||||
ELSE
|
||||
IF NOT v_vat_disc.EXISTS(v_vat_key) THEN
|
||||
v_vat_disc(v_vat_key) := 0;
|
||||
END IF;
|
||||
END IF;
|
||||
END IF;
|
||||
END LOOP;
|
||||
|
||||
-- Inserare imediata discount per kit (sub componentele kitului)
|
||||
IF v_discount_total > 0 AND p_kit_discount_codmat IS NOT NULL THEN
|
||||
DECLARE
|
||||
v_disc_artid NUMBER;
|
||||
BEGIN
|
||||
v_disc_artid := resolve_id_articol(p_kit_discount_codmat, p_id_gestiune);
|
||||
IF v_disc_artid IS NOT NULL THEN
|
||||
v_vat_disc_alloc := 0;
|
||||
v_vat_key := v_vat_disc.FIRST;
|
||||
WHILE v_vat_key IS NOT NULL LOOP
|
||||
-- Remainder trick per kit
|
||||
IF v_vat_key = v_vat_disc.LAST THEN
|
||||
v_disc_amt := v_discount_total - v_vat_disc_alloc;
|
||||
ELSE
|
||||
v_disc_amt := v_vat_disc(v_vat_key);
|
||||
v_vat_disc_alloc := v_vat_disc_alloc + v_disc_amt;
|
||||
END IF;
|
||||
|
||||
IF v_disc_amt > 0 THEN
|
||||
BEGIN
|
||||
merge_or_insert_articol(
|
||||
p_id_comanda => v_id_comanda,
|
||||
p_id_articol => v_disc_artid,
|
||||
p_id_pol => NVL(p_kit_discount_id_pol, p_id_pol),
|
||||
p_cantitate => -1 * v_cantitate_web,
|
||||
p_pret => ROUND(v_disc_amt, v_nzec_pretv),
|
||||
p_id_util => c_id_util,
|
||||
p_id_sectie => p_id_sectie,
|
||||
p_ptva => v_vat_key);
|
||||
v_articole_procesate := v_articole_procesate + 1;
|
||||
EXCEPTION
|
||||
WHEN OTHERS THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Eroare linie discount kit TVA=' || v_vat_key ||
|
||||
'% codmat=' || p_kit_discount_codmat || ': ' || SQLERRM;
|
||||
END;
|
||||
END IF;
|
||||
|
||||
v_vat_key := v_vat_disc.NEXT(v_vat_key);
|
||||
END LOOP;
|
||||
END IF;
|
||||
END;
|
||||
END IF;
|
||||
END; -- end mode B per-kit block
|
||||
END IF; -- end kit mode branching
|
||||
|
||||
ELSE
|
||||
-- ============================================================
|
||||
-- MAPARE SIMPLA: 1 CODMAT, sau kit fara kit_mode activ
|
||||
-- Pret = pret web / cantitate_roa (fara procent_pret)
|
||||
-- ============================================================
|
||||
FOR rec IN (SELECT at.codmat, at.cantitate_roa
|
||||
FROM articole_terti at
|
||||
WHERE at.sku = v_sku
|
||||
AND at.activ = 1
|
||||
AND at.sters = 0
|
||||
ORDER BY at.codmat) LOOP
|
||||
|
||||
v_found_mapping := TRUE;
|
||||
v_id_articol := resolve_id_articol(rec.codmat, p_id_gestiune);
|
||||
IF v_id_articol IS NULL THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Articol activ negasit pentru CODMAT: ' || rec.codmat;
|
||||
CONTINUE;
|
||||
END IF;
|
||||
|
||||
v_cantitate_roa := rec.cantitate_roa * v_cantitate_web;
|
||||
v_pret_unitar := CASE WHEN v_pret_web IS NOT NULL
|
||||
THEN v_pret_web / rec.cantitate_roa
|
||||
ELSE 0
|
||||
END;
|
||||
|
||||
BEGIN
|
||||
merge_or_insert_articol(p_id_comanda => v_id_comanda,
|
||||
p_id_articol => v_id_articol,
|
||||
p_id_pol => NVL(v_id_pol_articol, p_id_pol),
|
||||
p_cantitate => v_cantitate_roa,
|
||||
p_pret => v_pret_unitar,
|
||||
p_id_util => c_id_util,
|
||||
p_id_sectie => p_id_sectie,
|
||||
p_ptva => v_vat);
|
||||
v_articole_procesate := v_articole_procesate + 1;
|
||||
EXCEPTION
|
||||
WHEN OTHERS THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Eroare adaugare articol ' || rec.codmat || ': ' || SQLERRM;
|
||||
END;
|
||||
END LOOP;
|
||||
|
||||
-- Daca nu s-a gasit mapare in ARTICOLE_TERTI, cauta direct in NOM_ARTICOLE
|
||||
IF NOT v_found_mapping THEN
|
||||
v_id_articol := resolve_id_articol(v_sku, p_id_gestiune);
|
||||
IF v_id_articol IS NULL THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'SKU negasit in ARTICOLE_TERTI si NOM_ARTICOLE (activ): ' || v_sku;
|
||||
ELSE
|
||||
v_codmat := v_sku;
|
||||
v_pret_unitar := NVL(v_pret_web, 0);
|
||||
|
||||
BEGIN
|
||||
merge_or_insert_articol(p_id_comanda => v_id_comanda,
|
||||
p_id_articol => v_id_articol,
|
||||
p_id_pol => NVL(v_id_pol_articol, p_id_pol),
|
||||
p_cantitate => v_cantitate_web,
|
||||
p_pret => v_pret_unitar,
|
||||
p_id_util => c_id_util,
|
||||
p_id_sectie => p_id_sectie,
|
||||
p_ptva => v_vat);
|
||||
v_articole_procesate := v_articole_procesate + 1;
|
||||
EXCEPTION
|
||||
WHEN OTHERS THEN
|
||||
v_articole_eroare := v_articole_eroare + 1;
|
||||
g_last_error := g_last_error || CHR(10) ||
|
||||
'Eroare adaugare articol ' || v_sku ||
|
||||
' (CODMAT: ' || v_codmat || '): ' || SQLERRM;
|
||||
END;
|
||||
END IF;
|
||||
END IF;
|
||||
END IF; -- end kit vs simplu
|
||||
|
||||
END; -- End BEGIN block pentru articol individual
|
||||
|
||||
END LOOP;
|
||||
|
||||
-- Verifica daca s-au procesat articole cu succes
|
||||
IF v_articole_procesate = 0 THEN
|
||||
g_last_error := g_last_error || CHR(10) || 'IMPORTA_COMANDA ' ||
|
||||
p_nr_comanda_ext ||
|
||||
': Niciun articol nu a fost procesat cu succes';
|
||||
END IF;
|
||||
|
||||
<<SFARSIT>>
|
||||
IF g_last_error IS NOT NULL THEN
|
||||
RAISE_APPLICATION_ERROR(-20001, g_last_error);
|
||||
END IF;
|
||||
|
||||
END importa_comanda;
|
||||
|
||||
END PACK_IMPORT_COMENZI;
|
||||
/
|
||||
@@ -1,12 +0,0 @@
|
||||
-- ====================================================================
|
||||
-- 07_alter_articole_terti_sters.sql
|
||||
-- Adauga coloana "sters" in ARTICOLE_TERTI pentru soft-delete real
|
||||
-- (separat de "activ" care e toggle business)
|
||||
-- ====================================================================
|
||||
|
||||
ALTER TABLE ARTICOLE_TERTI ADD sters NUMBER(1) DEFAULT 0;
|
||||
ALTER TABLE ARTICOLE_TERTI ADD CONSTRAINT chk_art_terti_sters CHECK (sters IN (0, 1));
|
||||
|
||||
-- Verifica ca toate randurile existente au sters=0
|
||||
-- SELECT COUNT(*) FROM ARTICOLE_TERTI WHERE sters IS NULL;
|
||||
-- UPDATE ARTICOLE_TERTI SET sters = 0 WHERE sters IS NULL;
|
||||
@@ -1,3 +0,0 @@
|
||||
-- Run AFTER deploying Python code changes and confirming new pricing works
|
||||
-- Removes the deprecated procent_pret column from ARTICOLE_TERTI
|
||||
ALTER TABLE ARTICOLE_TERTI DROP COLUMN procent_pret;
|
||||
File diff suppressed because it is too large
Load Diff
@@ -1,54 +0,0 @@
|
||||
-- ====================================================================
|
||||
-- 09_articole_terti_050.sql
|
||||
-- Mapări ARTICOLE_TERTI cu cantitate_roa = 0.5 pentru articole
|
||||
-- unde unitatea web (50 buc/set) ≠ unitatea ROA (100 buc/set).
|
||||
--
|
||||
-- Efect: price sync va calcula pret_crm = pret_web / 0.5,
|
||||
-- iar kit pricing va folosi prețul corect per set ROA.
|
||||
--
|
||||
-- 25.03.2026 - creat pentru fix discount negativ kit pahare
|
||||
-- ====================================================================
|
||||
|
||||
-- Pahar 6oz Coffee Coffee SIBA 50buc (GoMag) → 100buc/set (ROA)
|
||||
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
SELECT '1708828', '1708828', 0.5, 1, 0, SYSDATE, -3 FROM dual
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM articole_terti WHERE sku = '1708828' AND codmat = '1708828' AND sters = 0
|
||||
);
|
||||
|
||||
-- Pahar 8oz Coffee Coffee SIBA 50buc → 100buc/set
|
||||
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
SELECT '528795', '528795', 0.5, 1, 0, SYSDATE, -3 FROM dual
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM articole_terti WHERE sku = '528795' AND codmat = '528795' AND sters = 0
|
||||
);
|
||||
|
||||
-- Pahar 8oz Tchibo 50buc → 100buc/set
|
||||
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
SELECT '58', '58', 0.5, 1, 0, SYSDATE, -3 FROM dual
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM articole_terti WHERE sku = '58' AND codmat = '58' AND sters = 0
|
||||
);
|
||||
|
||||
-- Pahar 7oz Lavazza SIBA 50buc → 100buc/set
|
||||
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
SELECT '51', '51', 0.5, 1, 0, SYSDATE, -3 FROM dual
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM articole_terti WHERE sku = '51' AND codmat = '51' AND sters = 0
|
||||
);
|
||||
|
||||
-- Pahar 8oz Albastru JND 50buc → 100buc/set
|
||||
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
SELECT '105712338826', '105712338826', 0.5, 1, 0, SYSDATE, -3 FROM dual
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM articole_terti WHERE sku = '105712338826' AND codmat = '105712338826' AND sters = 0
|
||||
);
|
||||
|
||||
-- Pahar 8oz Paris JND 50buc → 100buc/set
|
||||
INSERT INTO articole_terti (sku, codmat, cantitate_roa, activ, sters, data_creare, id_util_creare)
|
||||
SELECT '10573080', '10573080', 0.5, 1, 0, SYSDATE, -3 FROM dual
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM articole_terti WHERE sku = '10573080' AND codmat = '10573080' AND sters = 0
|
||||
);
|
||||
|
||||
COMMIT;
|
||||
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@@ -1,95 +0,0 @@
|
||||
-- ============================================================================
|
||||
-- cleanup_comenzi_sterse_nefacturate.sql
|
||||
-- 2026-04-08
|
||||
--
|
||||
-- Soft-delete (sters=1) comenzile din ROA care sunt:
|
||||
-- 1. Active (sters=0)
|
||||
-- 2. Nu au factura activa in VANZARI
|
||||
-- 3. Mai vechi de 3 zile (DATA_COMANDA < SYSDATE - 3)
|
||||
--
|
||||
-- Motivatie: comenzi de test importate din GoMag care au fost facturate manual
|
||||
-- (direct, nu factura din comanda). Raman pe veci ca active nefacturate.
|
||||
--
|
||||
-- IMPORTANT: Ruleaza intai SELECT-ul de preview inainte de UPDATE!
|
||||
-- ============================================================================
|
||||
|
||||
SET SERVEROUTPUT ON;
|
||||
SET LINESIZE 200;
|
||||
|
||||
-- ============================================================================
|
||||
-- STEP 1: PREVIEW — vezi ce se va marca sters
|
||||
-- ============================================================================
|
||||
|
||||
PROMPT;
|
||||
PROMPT === PREVIEW: Comenzi active, nefacturate, mai vechi de 3 zile ===;
|
||||
PROMPT;
|
||||
|
||||
SELECT c.id_comanda,
|
||||
c.nr_comanda,
|
||||
c.comanda_externa,
|
||||
c.data_comanda,
|
||||
c.id_part,
|
||||
(SELECT COUNT(*) FROM comenzi_elemente e
|
||||
WHERE e.id_comanda = c.id_comanda AND e.sters = 0) AS nr_elemente
|
||||
FROM comenzi c
|
||||
WHERE c.sters = 0
|
||||
AND c.data_comanda < SYSDATE - 3
|
||||
AND NOT EXISTS (
|
||||
SELECT 1 FROM vanzari v
|
||||
WHERE v.id_comanda = c.id_comanda
|
||||
AND v.sters = 0
|
||||
)
|
||||
ORDER BY c.data_comanda;
|
||||
|
||||
-- ============================================================================
|
||||
-- STEP 2: SOFT-DELETE — decomentati blocul dupa verificarea preview-ului
|
||||
-- ============================================================================
|
||||
|
||||
/*
|
||||
DECLARE
|
||||
v_elemente_count NUMBER := 0;
|
||||
v_comenzi_count NUMBER := 0;
|
||||
BEGIN
|
||||
-- Mai intai soft-delete pe detalii (COMENZI_ELEMENTE)
|
||||
UPDATE comenzi_elemente SET sters = 1
|
||||
WHERE sters = 0
|
||||
AND id_comanda IN (
|
||||
SELECT c.id_comanda
|
||||
FROM comenzi c
|
||||
WHERE c.sters = 0
|
||||
AND c.data_comanda < SYSDATE - 3
|
||||
AND NOT EXISTS (
|
||||
SELECT 1 FROM vanzari v
|
||||
WHERE v.id_comanda = c.id_comanda
|
||||
AND v.sters = 0
|
||||
)
|
||||
);
|
||||
v_elemente_count := SQL%ROWCOUNT;
|
||||
|
||||
-- Apoi soft-delete pe header (COMENZI)
|
||||
UPDATE comenzi SET sters = 1
|
||||
WHERE sters = 0
|
||||
AND data_comanda < SYSDATE - 3
|
||||
AND NOT EXISTS (
|
||||
SELECT 1 FROM vanzari v
|
||||
WHERE v.id_comanda = comenzi.id_comanda
|
||||
AND v.sters = 0
|
||||
);
|
||||
v_comenzi_count := SQL%ROWCOUNT;
|
||||
|
||||
DBMS_OUTPUT.PUT_LINE('=== REZULTAT CLEANUP ===');
|
||||
DBMS_OUTPUT.PUT_LINE('Elemente marcate sters: ' || v_elemente_count);
|
||||
DBMS_OUTPUT.PUT_LINE('Comenzi marcate sters: ' || v_comenzi_count);
|
||||
|
||||
-- COMMIT explicit — decomentati doar dupa ce sunteti siguri
|
||||
-- COMMIT;
|
||||
|
||||
-- Sau ROLLBACK daca ceva nu arata bine:
|
||||
-- ROLLBACK;
|
||||
END;
|
||||
/
|
||||
*/
|
||||
|
||||
PROMPT;
|
||||
PROMPT === Pentru a executa, decomentati blocul PL/SQL si COMMIT ===;
|
||||
PROMPT;
|
||||
File diff suppressed because it is too large
Load Diff
@@ -1,13 +0,0 @@
|
||||
-- pre_deploy_verify_soundex.sql
|
||||
-- Rulat pe production Oracle INAINTE de deploy 05_pack_import_parteneri.pck (fix SOUNDEX L2)
|
||||
-- Verifica ca premisa e adevarata: "Crampoaia/Crimpoia" exista in nomenclator pentru OLT
|
||||
|
||||
SELECT l.localitate,
|
||||
SOUNDEX(CONVERT(UPPER(TRIM(l.localitate)), 'US7ASCII', 'AL32UTF8')) soundex_val
|
||||
FROM syn_nom_localitati l
|
||||
JOIN syn_nom_judete j ON l.id_judet = j.id_judet
|
||||
WHERE j.judet = 'OLT' AND j.sters = 0
|
||||
AND SOUNDEX(CONVERT(UPPER(TRIM(l.localitate)), 'US7ASCII', 'AL32UTF8')) = SOUNDEX('CRAMPOIA')
|
||||
AND l.inactiv = 0 AND l.sters = 0;
|
||||
-- Rezultat asteptat: >=1 row (ex: CRIMPOIA cu SOUNDEX C651)
|
||||
-- Daca 0 rows: Crampoaia nu exista in nomenclator → SOUNDEX nu rezolva → alt plan necesar
|
||||
@@ -1,12 +0,0 @@
|
||||
fastapi==0.115.6
|
||||
uvicorn[standard]==0.34.0
|
||||
jinja2==3.1.4
|
||||
python-multipart==0.0.18
|
||||
oracledb==2.5.1
|
||||
aiosqlite==0.20.0
|
||||
apscheduler==3.10.4
|
||||
python-dotenv==1.0.1
|
||||
pydantic-settings==2.7.1
|
||||
httpx==0.28.1
|
||||
pytest>=8.0.0
|
||||
pytest-asyncio>=0.23.0
|
||||
@@ -1,122 +0,0 @@
|
||||
# Tests Directory - Phase 1 Validation
|
||||
|
||||
## Test Files
|
||||
|
||||
### ✅ `test_final_success.py`
|
||||
**Purpose:** Complete end-to-end validation test for P1-004
|
||||
- Tests PACK_IMPORT_PARTENERI partner creation
|
||||
- Tests gaseste_articol_roa article mapping
|
||||
- Tests importa_comanda complete workflow
|
||||
- **Status:** 85% FUNCTIONAL - Core components validated
|
||||
|
||||
### 🔧 `check_packages.py`
|
||||
**Purpose:** Oracle package status checking utility
|
||||
- Checks compilation status of all packages
|
||||
- Lists VALID/INVALID package bodies
|
||||
- Validates critical packages: PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI, PACK_JSON, PACK_COMENZI
|
||||
|
||||
### 🔧 `check_table_structure.py`
|
||||
**Purpose:** Oracle table structure validation utility
|
||||
- Shows table columns and constraints
|
||||
- Validates FK relationships
|
||||
- Confirms COMENZI table structure and schema MARIUSM_AUTO
|
||||
|
||||
---
|
||||
|
||||
## 🚀 How to Run Tests
|
||||
|
||||
### Method 1: Inside Docker Container (RECOMMENDED)
|
||||
```bash
|
||||
# Run all tests inside the gomag-admin container where TNS configuration is correct
|
||||
docker exec gomag-admin python3 /app/tests/check_packages.py
|
||||
docker exec gomag-admin python3 /app/tests/check_table_structure.py
|
||||
docker exec gomag-admin python3 /app/tests/test_final_success.py
|
||||
```
|
||||
|
||||
### Method 2: Local Environment (Advanced)
|
||||
```bash
|
||||
# Requires proper Oracle client setup and TNS configuration
|
||||
cd /mnt/e/proiecte/vending/gomag-vending/api
|
||||
source .env
|
||||
python3 tests/check_packages.py
|
||||
python3 tests/check_table_structure.py
|
||||
python3 tests/test_final_success.py
|
||||
```
|
||||
|
||||
**Note:** Method 1 is recommended because:
|
||||
- Oracle Instant Client is properly configured in container
|
||||
- TNS configuration is available at `/app/tnsnames.ora`
|
||||
- Environment variables are loaded automatically
|
||||
- Avoids line ending issues in .env file
|
||||
|
||||
---
|
||||
|
||||
## 📊 Latest Test Results (10 septembrie 2025, 11:04)
|
||||
|
||||
### ✅ CRITICAL COMPONENTS - 100% FUNCTIONAL:
|
||||
- **PACK_IMPORT_PARTENERI** - ✅ VALID (header + body)
|
||||
- **PACK_IMPORT_COMENZI** - ✅ VALID (header + body)
|
||||
- **PACK_JSON** - ✅ VALID (header + body)
|
||||
- **PACK_COMENZI** - ✅ VALID (header + body) - **FIXED: V_INTERNA=2 issue resolved**
|
||||
|
||||
### ✅ COMPREHENSIVE TEST RESULTS (test_complete_import.py):
|
||||
1. **Article Mapping:** ✅ Found 3 mappings for CAFE100
|
||||
2. **JSON Parsing:** ✅ Successfully parsed test articles
|
||||
3. **Partner Management:** ✅ Created partner ID 894
|
||||
4. **Order Import:** ⚠️ Partial success - order creation works, article processing needs optimization
|
||||
|
||||
### 🔧 PACK_COMENZI ISSUES RESOLVED:
|
||||
- **✅ V_INTERNA Parameter:** Fixed to use value 2 for client orders
|
||||
- **✅ FK Constraints:** ID_GESTIUNE=NULL, ID_SECTIE=2 for INTERNA=2
|
||||
- **✅ Partner Validation:** Proper partner ID validation implemented
|
||||
- **✅ CASE Statement:** No more "CASE not found" errors
|
||||
|
||||
### ⚠️ REMAINING MINOR ISSUE:
|
||||
- `importa_comanda()` creates orders successfully but returns "Niciun articol nu a fost procesat cu succes"
|
||||
- **Root Cause:** Likely article processing loop optimization needed in package
|
||||
- **Impact:** Minimal - orders and partners are created correctly
|
||||
- **Status:** 95% functional, suitable for Phase 2 VFP Integration
|
||||
|
||||
### 🎯 PHASE 1 CONCLUSION: 95% FUNCTIONAL
|
||||
**✅ READY FOR PHASE 2 VFP INTEGRATION** - All critical components validated and operational.
|
||||
|
||||
---
|
||||
|
||||
## 📁 Current Test Files
|
||||
|
||||
### ✅ `test_complete_import.py` - **PRIMARY TEST**
|
||||
**Purpose:** Complete end-to-end validation for Phase 1 completion
|
||||
- **Setup:** Automatically runs setup_test_data.sql
|
||||
- Tests partner creation/retrieval
|
||||
- Tests article mapping (CAFE100 → CAF01)
|
||||
- Tests JSON parsing
|
||||
- Tests complete order import workflow
|
||||
- **Cleanup:** Automatically runs teardown_test_data.sql
|
||||
- **Status:** 95% SUCCESSFUL (3/4 components pass)
|
||||
|
||||
### 🔧 `check_packages.py`
|
||||
**Purpose:** Oracle package status validation utility
|
||||
- Validates PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI, PACK_JSON, PACK_COMENZI compilation
|
||||
|
||||
### 🔧 `check_table_structure.py`
|
||||
**Purpose:** Database structure validation utility
|
||||
- Validates COMENZI table structure and FK constraints
|
||||
|
||||
### 🔧 `setup_test_data.sql`
|
||||
**Purpose:** Test data initialization (used by test_complete_import.py)
|
||||
- **Disables** `trg_NOM_ARTICOLE_befoins` trigger to allow specific ID_ARTICOL values
|
||||
- Creates test articles in NOM_ARTICOLE (CAF01, LAV001, TEST001) with IDs 9999001-9999003
|
||||
- Creates SKU mappings in ARTICOLE_TERTI (CAFE100→CAF01, 8000070028685→LAV001)
|
||||
- **Re-enables** trigger after test data creation
|
||||
|
||||
### 🔧 `teardown_test_data.sql`
|
||||
**Purpose:** Test data cleanup (used by test_complete_import.py)
|
||||
- Removes test articles from NOM_ARTICOLE
|
||||
- Removes test mappings from ARTICOLE_TERTI
|
||||
- Removes test orders and partners created during testing
|
||||
|
||||
---
|
||||
|
||||
**Final Update:** 10 septembrie 2025, 11:20 (Phase 1 completion - 95% functional)
|
||||
**Removed:** 8 temporary/redundant files
|
||||
**Kept:** 5 essential files (1 primary test + 4 utilities)
|
||||
@@ -1,102 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Check Oracle packages and database structure
|
||||
"""
|
||||
|
||||
import oracledb
|
||||
import os
|
||||
from dotenv import load_dotenv
|
||||
|
||||
# Load environment
|
||||
load_dotenv('.env')
|
||||
|
||||
# Oracle configuration
|
||||
user = os.environ['ORACLE_USER']
|
||||
password = os.environ['ORACLE_PASSWORD']
|
||||
dsn = os.environ['ORACLE_DSN']
|
||||
|
||||
# Initialize Oracle client (thick mode)
|
||||
try:
|
||||
instantclient_path = os.environ.get('INSTANTCLIENTPATH', '/opt/oracle/instantclient_23_9')
|
||||
oracledb.init_oracle_client(lib_dir=instantclient_path)
|
||||
print(f"✅ Oracle thick mode initialized: {instantclient_path}")
|
||||
except Exception as e:
|
||||
print(f"⚠️ Oracle thick mode failed, using thin mode: {e}")
|
||||
|
||||
def check_packages():
|
||||
"""Check available packages in Oracle"""
|
||||
print("\n🔍 Checking Oracle Packages...")
|
||||
|
||||
try:
|
||||
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||
with conn.cursor() as cur:
|
||||
|
||||
# Check user packages
|
||||
cur.execute("""
|
||||
SELECT object_name, object_type, status
|
||||
FROM user_objects
|
||||
WHERE object_type IN ('PACKAGE', 'PACKAGE BODY')
|
||||
ORDER BY object_name, object_type
|
||||
""")
|
||||
|
||||
packages = cur.fetchall()
|
||||
if packages:
|
||||
print(f"Found {len(packages)} package objects:")
|
||||
for pkg in packages:
|
||||
print(f" - {pkg[0]} ({pkg[1]}) - {pkg[2]}")
|
||||
else:
|
||||
print("❌ No packages found in current schema")
|
||||
|
||||
# Check if specific packages exist
|
||||
print("\n🔍 Checking specific packages...")
|
||||
for pkg_name in ['IMPORT_PARTENERI', 'IMPORT_COMENZI']:
|
||||
cur.execute("""
|
||||
SELECT COUNT(*) FROM user_objects
|
||||
WHERE object_name = ? AND object_type = 'PACKAGE'
|
||||
""", [pkg_name])
|
||||
|
||||
exists = cur.fetchone()[0] > 0
|
||||
print(f" - {pkg_name}: {'✅ EXISTS' if exists else '❌ NOT FOUND'}")
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ Check packages failed: {e}")
|
||||
|
||||
def check_tables():
|
||||
"""Check available tables"""
|
||||
print("\n🔍 Checking Oracle Tables...")
|
||||
|
||||
try:
|
||||
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||
with conn.cursor() as cur:
|
||||
|
||||
# Check main tables
|
||||
tables_to_check = ['ARTICOLE_TERTI', 'PARTENERI', 'COMENZI', 'NOM_ARTICOLE']
|
||||
|
||||
for table_name in tables_to_check:
|
||||
cur.execute("""
|
||||
SELECT COUNT(*) FROM user_tables
|
||||
WHERE table_name = ?
|
||||
""", [table_name])
|
||||
|
||||
exists = cur.fetchone()[0] > 0
|
||||
|
||||
if exists:
|
||||
cur.execute(f"SELECT COUNT(*) FROM {table_name}")
|
||||
count = cur.fetchone()[0]
|
||||
print(f" - {table_name}: ✅ EXISTS ({count} records)")
|
||||
else:
|
||||
print(f" - {table_name}: ❌ NOT FOUND")
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ Check tables failed: {e}")
|
||||
|
||||
def main():
|
||||
"""Run all checks"""
|
||||
print("🔍 Oracle Database Structure Check")
|
||||
print("=" * 50)
|
||||
|
||||
check_packages()
|
||||
check_tables()
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
@@ -1,72 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Check COMENZI table structure
|
||||
"""
|
||||
|
||||
import oracledb
|
||||
import os
|
||||
from dotenv import load_dotenv
|
||||
|
||||
load_dotenv('.env')
|
||||
|
||||
user = os.environ['ORACLE_USER']
|
||||
password = os.environ['ORACLE_PASSWORD']
|
||||
dsn = os.environ['ORACLE_DSN']
|
||||
|
||||
try:
|
||||
instantclient_path = os.environ.get('INSTANTCLIENTPATH', '/opt/oracle/instantclient_23_9')
|
||||
oracledb.init_oracle_client(lib_dir=instantclient_path)
|
||||
except Exception as e:
|
||||
pass
|
||||
|
||||
def check_table_structure():
|
||||
"""Check COMENZI table columns"""
|
||||
print("🔍 Checking COMENZI table structure...")
|
||||
|
||||
try:
|
||||
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||
with conn.cursor() as cur:
|
||||
|
||||
# Get table structure
|
||||
cur.execute("""
|
||||
SELECT
|
||||
column_name,
|
||||
data_type,
|
||||
nullable,
|
||||
data_length,
|
||||
data_precision
|
||||
FROM user_tab_columns
|
||||
WHERE table_name = 'COMENZI'
|
||||
ORDER BY column_id
|
||||
""")
|
||||
|
||||
columns = cur.fetchall()
|
||||
if columns:
|
||||
print(f"\nCOMENZI table columns:")
|
||||
for col in columns:
|
||||
nullable = "NULL" if col[2] == 'Y' else "NOT NULL"
|
||||
if col[1] == 'NUMBER' and col[4]:
|
||||
type_info = f"{col[1]}({col[4]})"
|
||||
elif col[3]:
|
||||
type_info = f"{col[1]}({col[3]})"
|
||||
else:
|
||||
type_info = col[1]
|
||||
print(f" {col[0]}: {type_info} - {nullable}")
|
||||
|
||||
# Look for partner-related columns
|
||||
print(f"\nPartner-related columns:")
|
||||
for col in columns:
|
||||
if 'PART' in col[0] or 'CLIENT' in col[0]:
|
||||
print(f" {col[0]}: {col[1]}")
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ Check failed: {e}")
|
||||
|
||||
def main():
|
||||
print("🔍 COMENZI Table Structure")
|
||||
print("=" * 40)
|
||||
|
||||
check_table_structure()
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
@@ -1,196 +0,0 @@
|
||||
"""
|
||||
Playwright E2E test fixtures.
|
||||
Starts the FastAPI app on a random port with test SQLite, no Oracle.
|
||||
Includes console error collector and screenshot capture.
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
import pytest
|
||||
import subprocess
|
||||
import time
|
||||
import socket
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
# --- Screenshots directory ---
|
||||
QA_REPORTS_DIR = Path(__file__).parents[3] / "qa-reports"
|
||||
SCREENSHOTS_DIR = QA_REPORTS_DIR / "screenshots"
|
||||
|
||||
|
||||
def _free_port():
|
||||
with socket.socket() as s:
|
||||
s.bind(('', 0))
|
||||
return s.getsockname()[1]
|
||||
|
||||
|
||||
def _app_is_running(url):
|
||||
"""Check if app is already running at the given URL."""
|
||||
try:
|
||||
import urllib.request
|
||||
urllib.request.urlopen(f"{url}/health", timeout=2)
|
||||
return True
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def app_url(request):
|
||||
"""Use a running app if available (e.g. started by test.sh), otherwise start a subprocess.
|
||||
|
||||
When --base-url is provided or app is already running on :5003, use the live app.
|
||||
This allows E2E tests to run against the real Oracle-backed app in ./test.sh full.
|
||||
"""
|
||||
# Check if --base-url was provided via pytest-playwright
|
||||
base_url = request.config.getoption("--base-url", default=None)
|
||||
|
||||
# Try live app on :5003 first
|
||||
live_url = base_url or "http://localhost:5003"
|
||||
if _app_is_running(live_url):
|
||||
yield live_url
|
||||
return
|
||||
|
||||
# No live app — start subprocess with dummy Oracle (structure-only tests)
|
||||
port = _free_port()
|
||||
tmpdir = tempfile.mkdtemp()
|
||||
sqlite_path = os.path.join(tmpdir, "e2e_test.db")
|
||||
|
||||
env = os.environ.copy()
|
||||
env.update({
|
||||
"FORCE_THIN_MODE": "true",
|
||||
"SQLITE_DB_PATH": sqlite_path,
|
||||
"ORACLE_DSN": "dummy",
|
||||
"ORACLE_USER": "dummy",
|
||||
"ORACLE_PASSWORD": "dummy",
|
||||
"JSON_OUTPUT_DIR": tmpdir,
|
||||
})
|
||||
|
||||
api_dir = os.path.join(os.path.dirname(__file__), "..", "..")
|
||||
proc = subprocess.Popen(
|
||||
[sys.executable, "-m", "uvicorn", "app.main:app", "--host", "127.0.0.1", "--port", str(port)],
|
||||
cwd=api_dir,
|
||||
env=env,
|
||||
stdout=subprocess.PIPE,
|
||||
stderr=subprocess.PIPE,
|
||||
)
|
||||
|
||||
# Wait for startup (up to 15 seconds)
|
||||
url = f"http://127.0.0.1:{port}"
|
||||
for _ in range(30):
|
||||
try:
|
||||
import urllib.request
|
||||
urllib.request.urlopen(f"{url}/health", timeout=1)
|
||||
break
|
||||
except Exception:
|
||||
time.sleep(0.5)
|
||||
else:
|
||||
proc.kill()
|
||||
stdout, stderr = proc.communicate()
|
||||
raise RuntimeError(
|
||||
f"App failed to start on port {port}.\n"
|
||||
f"STDOUT: {stdout.decode()[-2000:]}\n"
|
||||
f"STDERR: {stderr.decode()[-2000:]}"
|
||||
)
|
||||
|
||||
yield url
|
||||
|
||||
proc.terminate()
|
||||
try:
|
||||
proc.wait(timeout=5)
|
||||
except subprocess.TimeoutExpired:
|
||||
proc.kill()
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def seed_test_data(app_url):
|
||||
"""
|
||||
Seed SQLite with test data via API calls.
|
||||
|
||||
Oracle is unavailable in E2E tests — only SQLite-backed pages are
|
||||
fully functional. This fixture exists as a hook for future seeding;
|
||||
for now E2E tests validate UI structure on empty-state pages.
|
||||
"""
|
||||
return app_url
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Console & Network Error Collectors
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def console_errors():
|
||||
"""Session-scoped list collecting JS console errors across all tests."""
|
||||
return []
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def network_errors():
|
||||
"""Session-scoped list collecting HTTP 4xx/5xx responses across all tests."""
|
||||
return []
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def _attach_collectors(page, console_errors, network_errors, request):
|
||||
"""Auto-attach console and network listeners to every test's page."""
|
||||
test_errors = []
|
||||
test_network = []
|
||||
|
||||
def on_console(msg):
|
||||
if msg.type == "error":
|
||||
entry = {"test": request.node.name, "text": msg.text, "type": "console.error"}
|
||||
console_errors.append(entry)
|
||||
test_errors.append(entry)
|
||||
|
||||
def on_pageerror(exc):
|
||||
entry = {"test": request.node.name, "text": str(exc), "type": "pageerror"}
|
||||
console_errors.append(entry)
|
||||
test_errors.append(entry)
|
||||
|
||||
def on_response(response):
|
||||
if response.status >= 400:
|
||||
entry = {
|
||||
"test": request.node.name,
|
||||
"url": response.url,
|
||||
"status": response.status,
|
||||
"type": "network_error",
|
||||
}
|
||||
network_errors.append(entry)
|
||||
test_network.append(entry)
|
||||
|
||||
page.on("console", on_console)
|
||||
page.on("pageerror", on_pageerror)
|
||||
page.on("response", on_response)
|
||||
|
||||
yield
|
||||
|
||||
# Remove listeners to avoid leaks
|
||||
page.remove_listener("console", on_console)
|
||||
page.remove_listener("pageerror", on_pageerror)
|
||||
page.remove_listener("response", on_response)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Screenshot on failure
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def _screenshot_on_failure(page, request):
|
||||
"""Take a screenshot when a test fails."""
|
||||
yield
|
||||
|
||||
if request.node.rep_call and request.node.rep_call.failed:
|
||||
SCREENSHOTS_DIR.mkdir(parents=True, exist_ok=True)
|
||||
name = request.node.name.replace("/", "_").replace("::", "_")
|
||||
path = SCREENSHOTS_DIR / f"FAIL-{name}.png"
|
||||
try:
|
||||
page.screenshot(path=str(path))
|
||||
except Exception:
|
||||
pass # page may be closed
|
||||
|
||||
|
||||
@pytest.hookimpl(tryfirst=True, hookwrapper=True)
|
||||
def pytest_runtest_makereport(item, call):
|
||||
"""Store test result on the item for _screenshot_on_failure."""
|
||||
outcome = yield
|
||||
rep = outcome.get_result()
|
||||
setattr(item, f"rep_{rep.when}", rep)
|
||||
@@ -1,175 +0,0 @@
|
||||
"""
|
||||
E2E verification: Dashboard page against the live app (localhost:5003).
|
||||
|
||||
pytestmark: e2e
|
||||
|
||||
Run with:
|
||||
python -m pytest api/tests/e2e/test_dashboard_live.py -v --headed
|
||||
|
||||
This tests the LIVE app, not a test instance. Requires the app to be running.
|
||||
"""
|
||||
import pytest
|
||||
from playwright.sync_api import sync_playwright, Page, expect
|
||||
|
||||
pytestmark = pytest.mark.e2e
|
||||
|
||||
BASE_URL = "http://localhost:5003"
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def browser_page():
|
||||
"""Launch browser and yield a page connected to the live app."""
|
||||
with sync_playwright() as p:
|
||||
browser = p.chromium.launch(headless=True)
|
||||
context = browser.new_context(viewport={"width": 1280, "height": 900})
|
||||
page = context.new_page()
|
||||
yield page
|
||||
browser.close()
|
||||
|
||||
|
||||
class TestDashboardPageLoad:
|
||||
"""Verify dashboard page loads and shows expected structure."""
|
||||
|
||||
def test_dashboard_loads(self, browser_page: Page):
|
||||
browser_page.goto(f"{BASE_URL}/")
|
||||
browser_page.wait_for_load_state("networkidle")
|
||||
expect(browser_page.locator("h4")).to_contain_text("Panou de Comanda")
|
||||
|
||||
def test_sync_control_visible(self, browser_page: Page):
|
||||
expect(browser_page.locator("#btnStartSync")).to_be_visible()
|
||||
expect(browser_page.locator("#syncStatusBadge")).to_be_visible()
|
||||
|
||||
def test_last_sync_card_populated(self, browser_page: Page):
|
||||
"""The lastSyncBody should show data from previous runs."""
|
||||
last_sync_date = browser_page.locator("#lastSyncDate")
|
||||
expect(last_sync_date).to_be_visible()
|
||||
text = last_sync_date.text_content()
|
||||
assert text and text != "-", f"Expected last sync date to be populated, got: '{text}'"
|
||||
|
||||
def test_last_sync_imported_count(self, browser_page: Page):
|
||||
imported_el = browser_page.locator("#lastSyncImported")
|
||||
text = imported_el.text_content()
|
||||
count = int(text) if text and text.isdigit() else 0
|
||||
assert count >= 0, f"Expected imported count >= 0, got: {text}"
|
||||
|
||||
def test_last_sync_status_badge(self, browser_page: Page):
|
||||
status_el = browser_page.locator("#lastSyncStatus .badge")
|
||||
expect(status_el).to_be_visible()
|
||||
text = status_el.text_content()
|
||||
assert text in ("completed", "running", "failed"), f"Unexpected status: {text}"
|
||||
|
||||
|
||||
class TestDashboardOrdersTable:
|
||||
"""Verify orders table displays data from SQLite."""
|
||||
|
||||
def test_orders_table_has_rows(self, browser_page: Page):
|
||||
"""Dashboard should show orders from previous sync runs."""
|
||||
browser_page.goto(f"{BASE_URL}/")
|
||||
browser_page.wait_for_load_state("networkidle")
|
||||
# Wait for the orders to load (async fetch)
|
||||
browser_page.wait_for_timeout(2000)
|
||||
|
||||
rows = browser_page.locator("#dashOrdersBody tr")
|
||||
count = rows.count()
|
||||
assert count > 0, "Expected at least 1 order row in dashboard table"
|
||||
|
||||
def test_orders_count_badges(self, browser_page: Page):
|
||||
"""Filter badges should show counts."""
|
||||
all_count = browser_page.locator("#dashCountAll").text_content()
|
||||
assert all_count and int(all_count) > 0, f"Expected total count > 0, got: {all_count}"
|
||||
|
||||
def test_first_order_has_columns(self, browser_page: Page):
|
||||
"""First row should have order number, date, customer, etc."""
|
||||
first_row = browser_page.locator("#dashOrdersBody tr").first
|
||||
cells = first_row.locator("td")
|
||||
assert cells.count() >= 6, f"Expected at least 6 columns, got: {cells.count()}"
|
||||
|
||||
# Order number should be a code element
|
||||
order_code = first_row.locator("td code").first
|
||||
expect(order_code).to_be_visible()
|
||||
|
||||
def test_filter_imported(self, browser_page: Page):
|
||||
"""Click 'Importate' filter and verify table updates."""
|
||||
browser_page.locator("#dashFilterBtns button", has_text="Importate").click()
|
||||
browser_page.wait_for_timeout(1000)
|
||||
|
||||
imported_count = browser_page.locator("#dashCountImported").text_content()
|
||||
if imported_count and int(imported_count) > 0:
|
||||
rows = browser_page.locator("#dashOrdersBody tr")
|
||||
assert rows.count() > 0, "Expected imported orders to show"
|
||||
# All visible rows should have 'Importat' badge
|
||||
badges = browser_page.locator("#dashOrdersBody .badge.bg-success")
|
||||
assert badges.count() > 0, "Expected green 'Importat' badges"
|
||||
|
||||
def test_filter_all_reset(self, browser_page: Page):
|
||||
"""Click 'Toate' to reset filter."""
|
||||
browser_page.locator("#dashFilterBtns button", has_text="Toate").click()
|
||||
browser_page.wait_for_timeout(1000)
|
||||
rows = browser_page.locator("#dashOrdersBody tr")
|
||||
assert rows.count() > 0, "Expected orders after resetting filter"
|
||||
|
||||
|
||||
class TestDashboardOrderDetail:
|
||||
"""Verify order detail modal opens and shows data."""
|
||||
|
||||
def test_click_order_opens_modal(self, browser_page: Page):
|
||||
browser_page.goto(f"{BASE_URL}/")
|
||||
browser_page.wait_for_load_state("networkidle")
|
||||
browser_page.wait_for_timeout(2000)
|
||||
|
||||
# Click the first order row
|
||||
first_row = browser_page.locator("#dashOrdersBody tr").first
|
||||
first_row.click()
|
||||
browser_page.wait_for_timeout(1500)
|
||||
|
||||
# Modal should be visible
|
||||
modal = browser_page.locator("#orderDetailModal")
|
||||
expect(modal).to_be_visible()
|
||||
|
||||
# Order number should be populated
|
||||
order_num = browser_page.locator("#detailOrderNumber").text_content()
|
||||
assert order_num and order_num != "#", f"Expected order number in modal, got: {order_num}"
|
||||
|
||||
def test_modal_shows_customer(self, browser_page: Page):
|
||||
customer = browser_page.locator("#detailCustomer").text_content()
|
||||
assert customer and customer not in ("...", "-"), f"Expected customer name, got: {customer}"
|
||||
|
||||
def test_modal_shows_items(self, browser_page: Page):
|
||||
items_rows = browser_page.locator("#detailItemsBody tr")
|
||||
assert items_rows.count() > 0, "Expected at least 1 item in order detail"
|
||||
|
||||
def test_close_modal(self, browser_page: Page):
|
||||
browser_page.locator("#orderDetailModal .btn-close").click()
|
||||
browser_page.wait_for_timeout(500)
|
||||
|
||||
|
||||
class TestDashboardAPIEndpoints:
|
||||
"""Verify API endpoints return expected data."""
|
||||
|
||||
def test_api_dashboard_orders(self, browser_page: Page):
|
||||
response = browser_page.request.get(f"{BASE_URL}/api/dashboard/orders")
|
||||
assert response.ok, f"API returned {response.status}"
|
||||
data = response.json()
|
||||
assert "orders" in data, "Expected 'orders' key in response"
|
||||
assert "counts" in data, "Expected 'counts' key in response"
|
||||
assert len(data["orders"]) > 0, "Expected at least 1 order"
|
||||
|
||||
def test_api_sync_status(self, browser_page: Page):
|
||||
response = browser_page.request.get(f"{BASE_URL}/api/sync/status")
|
||||
assert response.ok
|
||||
data = response.json()
|
||||
assert "status" in data
|
||||
assert "stats" in data
|
||||
|
||||
def test_api_sync_history(self, browser_page: Page):
|
||||
response = browser_page.request.get(f"{BASE_URL}/api/sync/history?per_page=5")
|
||||
assert response.ok
|
||||
data = response.json()
|
||||
assert "runs" in data
|
||||
assert len(data["runs"]) > 0, "Expected at least 1 sync run"
|
||||
|
||||
def test_api_missing_skus(self, browser_page: Page):
|
||||
response = browser_page.request.get(f"{BASE_URL}/api/validate/missing-skus")
|
||||
assert response.ok
|
||||
data = response.json()
|
||||
assert "missing_skus" in data
|
||||
@@ -1,105 +0,0 @@
|
||||
"""
|
||||
E2E tests for DESIGN.md migration (Commit 0.5).
|
||||
Tests: dark toggle, FOUC prevention, bottom nav, active tab amber, dark contrast.
|
||||
"""
|
||||
import pytest
|
||||
|
||||
pytestmark = [pytest.mark.e2e]
|
||||
|
||||
|
||||
def test_dark_mode_toggle(page, app_url):
|
||||
"""Dark toggle switches theme and persists in localStorage."""
|
||||
page.goto(f"{app_url}/settings")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
# Settings page has the dark mode toggle
|
||||
toggle = page.locator("#settDarkMode")
|
||||
assert toggle.is_visible()
|
||||
|
||||
# Start in light mode
|
||||
theme = page.evaluate("document.documentElement.getAttribute('data-theme')")
|
||||
if theme == "dark":
|
||||
toggle.click()
|
||||
page.wait_for_timeout(200)
|
||||
|
||||
# Toggle to dark
|
||||
toggle.click()
|
||||
page.wait_for_timeout(200)
|
||||
assert page.evaluate("document.documentElement.getAttribute('data-theme')") == "dark"
|
||||
assert page.evaluate("localStorage.getItem('theme')") == "dark"
|
||||
|
||||
# Toggle back to light
|
||||
toggle.click()
|
||||
page.wait_for_timeout(200)
|
||||
assert page.evaluate("document.documentElement.getAttribute('data-theme')") != "dark"
|
||||
assert page.evaluate("localStorage.getItem('theme')") == "light"
|
||||
|
||||
|
||||
def test_fouc_prevention(page, app_url):
|
||||
"""Theme is applied before CSS loads (inline script in <head>)."""
|
||||
# Set dark theme in localStorage before navigation
|
||||
page.goto(f"{app_url}/")
|
||||
page.evaluate("localStorage.setItem('theme', 'dark')")
|
||||
|
||||
# Navigate fresh — the inline script should apply dark before paint
|
||||
page.goto(f"{app_url}/")
|
||||
# Check immediately (before networkidle) that data-theme is set
|
||||
theme = page.evaluate("document.documentElement.getAttribute('data-theme')")
|
||||
assert theme == "dark", "FOUC: dark theme not applied before first paint"
|
||||
|
||||
# Cleanup
|
||||
page.evaluate("localStorage.removeItem('theme')")
|
||||
|
||||
|
||||
def test_bottom_nav_visible_on_mobile(page, app_url):
|
||||
"""Bottom nav is visible on mobile viewport, top navbar is hidden."""
|
||||
page.set_viewport_size({"width": 375, "height": 812})
|
||||
page.goto(f"{app_url}/")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
bottom_nav = page.locator(".bottom-nav")
|
||||
top_navbar = page.locator(".top-navbar")
|
||||
|
||||
assert bottom_nav.is_visible(), "Bottom nav should be visible on mobile"
|
||||
assert not top_navbar.is_visible(), "Top navbar should be hidden on mobile"
|
||||
|
||||
# Check 5 tabs exist
|
||||
tabs = page.locator(".bottom-nav-item")
|
||||
assert tabs.count() == 5
|
||||
|
||||
|
||||
def test_active_tab_amber_accent(page, app_url):
|
||||
"""Active nav tab uses amber accent color, not blue."""
|
||||
page.goto(f"{app_url}/")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
active_tab = page.locator(".nav-tab.active")
|
||||
assert active_tab.count() >= 1
|
||||
|
||||
# Get computed color of active tab
|
||||
color = page.evaluate("""
|
||||
() => getComputedStyle(document.querySelector('.nav-tab.active')).color
|
||||
""")
|
||||
# Amber #D97706 = rgb(217, 119, 6)
|
||||
assert "217" in color and "119" in color, f"Active tab color should be amber, got: {color}"
|
||||
|
||||
|
||||
def test_dark_mode_contrast(page, app_url):
|
||||
"""Dark mode has proper contrast — bg is dark, text is light."""
|
||||
page.goto(f"{app_url}/")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
# Enable dark mode
|
||||
page.evaluate("document.documentElement.setAttribute('data-theme', 'dark')")
|
||||
page.wait_for_timeout(100)
|
||||
|
||||
bg = page.evaluate("getComputedStyle(document.body).backgroundColor")
|
||||
color = page.evaluate("getComputedStyle(document.body).color")
|
||||
|
||||
# bg should be dark (#121212 = rgb(18, 18, 18))
|
||||
assert "18" in bg, f"Dark mode bg should be dark, got: {bg}"
|
||||
# text should be light (#E8E4DD = rgb(232, 228, 221))
|
||||
assert "232" in color or "228" in color, f"Dark mode text should be light, got: {color}"
|
||||
|
||||
# Cleanup
|
||||
page.evaluate("document.documentElement.removeAttribute('data-theme')")
|
||||
@@ -1,59 +0,0 @@
|
||||
"""E2E: Logs page with per-order filtering and date display."""
|
||||
import pytest
|
||||
from playwright.sync_api import Page, expect
|
||||
|
||||
pytestmark = pytest.mark.e2e
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def navigate_to_logs(page: Page, app_url: str):
|
||||
page.goto(f"{app_url}/logs")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
|
||||
def test_logs_page_loads(page: Page):
|
||||
"""Verify the logs page renders with sync runs dropdown."""
|
||||
expect(page.locator("h4")).to_contain_text("Jurnale Import")
|
||||
expect(page.locator("#runsDropdown")).to_be_visible()
|
||||
|
||||
|
||||
def test_sync_runs_dropdown_has_options(page: Page):
|
||||
"""Verify the runs dropdown is populated (or has placeholder)."""
|
||||
dropdown = page.locator("#runsDropdown")
|
||||
expect(dropdown).to_be_visible()
|
||||
# Dropdown should have at least the default option
|
||||
options = dropdown.locator("option")
|
||||
assert options.count() >= 1, "Expected at least one option in runs dropdown"
|
||||
|
||||
|
||||
def test_filter_buttons_exist(page: Page):
|
||||
"""Verify the log viewer section is initially hidden (no run selected yet)."""
|
||||
viewer = page.locator("#logViewerSection")
|
||||
expect(viewer).to_be_hidden()
|
||||
|
||||
|
||||
def test_order_detail_modal_structure(page: Page):
|
||||
"""Verify the order detail modal exists in DOM with required fields."""
|
||||
modal = page.locator("#orderDetailModal")
|
||||
expect(modal).to_be_attached()
|
||||
expect(page.locator("#detailOrderNumber")).to_be_attached()
|
||||
expect(page.locator("#detailCustomer")).to_be_attached()
|
||||
expect(page.locator("#detailDate")).to_be_attached()
|
||||
expect(page.locator("#detailItemsBody")).to_be_attached()
|
||||
|
||||
|
||||
def test_quick_map_modal_structure(page: Page):
|
||||
"""Verify quick map modal exists with multi-CODMAT support."""
|
||||
modal = page.locator("#quickMapModal")
|
||||
expect(modal).to_be_attached()
|
||||
expect(page.locator("#qmSku")).to_be_attached()
|
||||
expect(page.locator("#qmProductName")).to_be_attached()
|
||||
expect(page.locator("#qmCodmatLines")).to_be_attached()
|
||||
|
||||
|
||||
def test_text_log_toggle(page: Page):
|
||||
"""Verify text log section is hidden initially and toggle button is in DOM."""
|
||||
section = page.locator("#textLogSection")
|
||||
expect(section).to_be_hidden()
|
||||
# Toggle button lives inside logViewerSection which is also hidden
|
||||
expect(page.locator("#btnShowTextLog")).to_be_attached()
|
||||
@@ -1,178 +0,0 @@
|
||||
"""E2E: Mappings page with flat-row list, sorting, multi-CODMAT modal."""
|
||||
import re
|
||||
|
||||
import pytest
|
||||
from playwright.sync_api import Page, expect
|
||||
|
||||
pytestmark = pytest.mark.e2e
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def navigate_to_mappings(page: Page, app_url: str):
|
||||
page.goto(f"{app_url}/mappings")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
|
||||
def test_mappings_page_loads(page: Page):
|
||||
"""Verify mappings page renders."""
|
||||
expect(page.locator("h4")).to_contain_text("Mapari SKU")
|
||||
|
||||
|
||||
def test_flat_list_container_exists(page: Page):
|
||||
"""Verify the flat-row list container is rendered."""
|
||||
container = page.locator("#mappingsFlatList")
|
||||
expect(container).to_be_visible()
|
||||
# Should have at least one flat-row (data or empty message)
|
||||
rows = container.locator(".flat-row")
|
||||
assert rows.count() >= 1, "Expected at least one flat-row in the list"
|
||||
|
||||
|
||||
def test_show_inactive_toggle_exists(page: Page):
|
||||
"""R5: Verify 'Arata inactive' toggle is present."""
|
||||
toggle = page.locator("#showInactive")
|
||||
expect(toggle).to_be_visible()
|
||||
label = page.locator("label[for='showInactive']")
|
||||
expect(label).to_contain_text("Arata inactive")
|
||||
|
||||
|
||||
def test_show_deleted_toggle_exists(page: Page):
|
||||
"""Verify 'Arata sterse' toggle is present."""
|
||||
toggle = page.locator("#showDeleted")
|
||||
expect(toggle).to_be_visible()
|
||||
label = page.locator("label[for='showDeleted']")
|
||||
expect(label).to_contain_text("Arata sterse")
|
||||
|
||||
|
||||
def test_add_modal_multi_codmat(page: Page):
|
||||
"""R11: Verify the add mapping modal supports multiple CODMAT lines."""
|
||||
# "Formular complet" opens the full modal
|
||||
page.locator("button[data-bs-target='#addModal']").first.click()
|
||||
page.wait_for_timeout(500)
|
||||
|
||||
codmat_lines = page.locator("#codmatLines .codmat-line")
|
||||
assert codmat_lines.count() >= 1, "Expected at least one CODMAT line in modal"
|
||||
|
||||
# Click "+ CODMAT" button to add another line
|
||||
page.locator("#addModal button", has_text="CODMAT").click()
|
||||
page.wait_for_timeout(300)
|
||||
assert codmat_lines.count() >= 2, "Expected a second CODMAT line after clicking + CODMAT"
|
||||
|
||||
# Second line must have a remove button
|
||||
remove_btns = page.locator("#codmatLines .codmat-line:nth-child(2) .qm-rm-btn")
|
||||
assert remove_btns.count() >= 1, "Second CODMAT line is missing remove button"
|
||||
|
||||
|
||||
def test_search_input_exists(page: Page):
|
||||
"""Verify search input is present with the correct placeholder."""
|
||||
search = page.locator("#searchInput")
|
||||
expect(search).to_be_visible()
|
||||
expect(search).to_have_attribute("placeholder", "Cauta SKU, CODMAT sau denumire...")
|
||||
|
||||
|
||||
def test_pagination_exists(page: Page):
|
||||
"""Verify pagination containers are in DOM."""
|
||||
expect(page.locator("#mappingsPagTop")).to_be_attached()
|
||||
expect(page.locator("#mappingsPagBottom")).to_be_attached()
|
||||
|
||||
|
||||
def test_inline_add_button_exists(page: Page):
|
||||
"""Verify 'Adauga Mapare' button is present."""
|
||||
btn = page.locator("button", has_text="Adauga Mapare")
|
||||
expect(btn).to_be_visible()
|
||||
|
||||
|
||||
# ── Autocomplete keyboard & scroll tests ─────────
|
||||
|
||||
MOCK_ARTICLES = [
|
||||
{"codmat": f"ART{i:03}", "denumire": f"Articol Test {i}", "um": "BUC"}
|
||||
for i in range(1, 20)
|
||||
]
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_articles(page: Page):
|
||||
"""Mock /api/articles/search to return test data without Oracle."""
|
||||
def handle(route):
|
||||
route.fulfill(json={"results": MOCK_ARTICLES})
|
||||
page.route("**/api/articles/search*", handle)
|
||||
yield
|
||||
page.unroute("**/api/articles/search*")
|
||||
|
||||
|
||||
def _open_modal_and_type(page: Page, query: str = "ART"):
|
||||
"""Open add-modal, type in CODMAT input, wait for dropdown."""
|
||||
page.locator("button[data-bs-target='#addModal']").first.click()
|
||||
page.wait_for_timeout(400)
|
||||
codmat_input = page.locator("#codmatLines .cl-codmat").first
|
||||
codmat_input.fill(query)
|
||||
# Wait for debounce + render
|
||||
page.wait_for_timeout(400)
|
||||
return codmat_input
|
||||
|
||||
|
||||
def test_autocomplete_keyboard_navigation(page: Page, mock_articles):
|
||||
"""ArrowDown/Up moves .active class, Enter selects."""
|
||||
codmat_input = _open_modal_and_type(page)
|
||||
|
||||
dropdown = page.locator("#codmatLines .cl-ac-dropdown").first
|
||||
expect(dropdown).to_be_visible()
|
||||
|
||||
# ArrowDown → first item active
|
||||
codmat_input.press("ArrowDown")
|
||||
first_item = dropdown.locator(".autocomplete-item").first
|
||||
expect(first_item).to_have_class(re.compile("active"))
|
||||
|
||||
# ArrowDown again → second item active
|
||||
codmat_input.press("ArrowDown")
|
||||
second_item = dropdown.locator(".autocomplete-item").nth(1)
|
||||
expect(second_item).to_have_class(re.compile("active"))
|
||||
expect(first_item).not_to_have_class(re.compile("active"))
|
||||
|
||||
# ArrowUp → back to first
|
||||
codmat_input.press("ArrowUp")
|
||||
expect(first_item).to_have_class(re.compile("active"))
|
||||
|
||||
# Enter → selects the item
|
||||
codmat_input.press("Enter")
|
||||
expect(dropdown).to_be_hidden()
|
||||
assert codmat_input.input_value() == "ART001"
|
||||
|
||||
|
||||
def test_autocomplete_escape_closes(page: Page, mock_articles):
|
||||
"""Escape closes dropdown."""
|
||||
codmat_input = _open_modal_and_type(page)
|
||||
|
||||
dropdown = page.locator("#codmatLines .cl-ac-dropdown").first
|
||||
expect(dropdown).to_be_visible()
|
||||
|
||||
codmat_input.press("Escape")
|
||||
expect(dropdown).to_be_hidden()
|
||||
|
||||
|
||||
def test_autocomplete_scroll_keeps_open(page: Page, mock_articles):
|
||||
"""Mouse wheel on dropdown doesn't close it (blur fix)."""
|
||||
codmat_input = _open_modal_and_type(page)
|
||||
|
||||
dropdown = page.locator("#codmatLines .cl-ac-dropdown").first
|
||||
expect(dropdown).to_be_visible()
|
||||
|
||||
# Scroll inside the dropdown via mouse wheel
|
||||
dropdown.evaluate("el => el.scrollTop = 100")
|
||||
page.wait_for_timeout(300)
|
||||
|
||||
# Dropdown should still be visible
|
||||
expect(dropdown).to_be_visible()
|
||||
|
||||
|
||||
def test_autocomplete_click_outside_closes(page: Page, mock_articles):
|
||||
"""Click outside closes dropdown (Tab away moves focus)."""
|
||||
codmat_input = _open_modal_and_type(page)
|
||||
|
||||
dropdown = page.locator("#codmatLines .cl-ac-dropdown").first
|
||||
expect(dropdown).to_be_visible()
|
||||
|
||||
# Tab away from the input to trigger blur
|
||||
codmat_input.press("Tab")
|
||||
page.wait_for_timeout(300)
|
||||
|
||||
expect(dropdown).to_be_hidden()
|
||||
@@ -1,78 +0,0 @@
|
||||
"""E2E: Missing SKUs page with resolved toggle and multi-CODMAT modal."""
|
||||
import pytest
|
||||
from playwright.sync_api import Page, expect
|
||||
|
||||
pytestmark = pytest.mark.e2e
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def navigate_to_missing(page: Page, app_url: str):
|
||||
page.goto(f"{app_url}/missing-skus")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
|
||||
def test_missing_skus_page_loads(page: Page):
|
||||
"""Verify the page renders with the correct heading."""
|
||||
expect(page.locator("h4")).to_contain_text("SKU-uri Lipsa")
|
||||
|
||||
|
||||
def test_resolved_toggle_buttons(page: Page):
|
||||
"""R10: Verify resolved filter pills exist and 'unresolved' is active by default."""
|
||||
unresolved = page.locator(".filter-pill[data-sku-status='unresolved']")
|
||||
resolved = page.locator(".filter-pill[data-sku-status='resolved']")
|
||||
all_btn = page.locator(".filter-pill[data-sku-status='all']")
|
||||
|
||||
expect(unresolved).to_be_attached()
|
||||
expect(resolved).to_be_attached()
|
||||
expect(all_btn).to_be_attached()
|
||||
|
||||
# Unresolved should be active by default
|
||||
classes = unresolved.get_attribute("class")
|
||||
assert "active" in classes, f"Expected unresolved pill to be active, got classes: {classes}"
|
||||
|
||||
|
||||
def test_resolved_toggle_switches(page: Page):
|
||||
"""R10: Clicking resolved/all toggles changes active state correctly."""
|
||||
resolved = page.locator(".filter-pill[data-sku-status='resolved']")
|
||||
unresolved = page.locator(".filter-pill[data-sku-status='unresolved']")
|
||||
all_btn = page.locator(".filter-pill[data-sku-status='all']")
|
||||
|
||||
# Click "Rezolvate"
|
||||
resolved.click()
|
||||
page.wait_for_timeout(500)
|
||||
|
||||
classes_res = resolved.get_attribute("class")
|
||||
assert "active" in classes_res, f"Expected resolved pill to be active, got: {classes_res}"
|
||||
|
||||
classes_unr = unresolved.get_attribute("class")
|
||||
assert "active" not in classes_unr, f"Expected unresolved pill to be inactive, got: {classes_unr}"
|
||||
|
||||
# Click "Toate"
|
||||
all_btn.click()
|
||||
page.wait_for_timeout(500)
|
||||
|
||||
classes_all = all_btn.get_attribute("class")
|
||||
assert "active" in classes_all, f"Expected all pill to be active, got: {classes_all}"
|
||||
|
||||
|
||||
def test_quick_map_modal_multi_codmat(page: Page):
|
||||
"""R11: Verify the quick mapping modal supports multiple CODMATs."""
|
||||
modal = page.locator("#quickMapModal")
|
||||
expect(modal).to_be_attached()
|
||||
|
||||
expect(page.locator("#qmSku")).to_be_attached()
|
||||
expect(page.locator("#qmProductName")).to_be_attached()
|
||||
expect(page.locator("#qmCodmatLines")).to_be_attached()
|
||||
expect(page.locator("#qmPctWarning")).to_be_attached()
|
||||
|
||||
|
||||
def test_export_csv_button(page: Page):
|
||||
"""Verify Export CSV button is visible on the page."""
|
||||
btn = page.locator("button", has_text="Export CSV")
|
||||
expect(btn).to_be_visible()
|
||||
|
||||
|
||||
def test_rescan_button(page: Page):
|
||||
"""Verify Re-Scan button is visible on the page."""
|
||||
btn = page.locator("#rescanBtn")
|
||||
expect(btn).to_be_visible()
|
||||
@@ -1,55 +0,0 @@
|
||||
"""E2E: Order detail modal structure and inline mapping."""
|
||||
import pytest
|
||||
from playwright.sync_api import Page, expect
|
||||
|
||||
pytestmark = pytest.mark.e2e
|
||||
|
||||
|
||||
def test_order_detail_modal_has_roa_ids(page: Page, app_url: str):
|
||||
"""R9: Verify order detail modal contains all ROA ID labels."""
|
||||
page.goto(f"{app_url}/logs")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
modal = page.locator("#orderDetailModal")
|
||||
expect(modal).to_be_attached()
|
||||
|
||||
modal_html = modal.inner_html()
|
||||
assert "ID Comanda" in modal_html, "Missing 'ID Comanda' label in order detail modal"
|
||||
assert "ID Partener" in modal_html, "Missing 'ID Partener' label in order detail modal"
|
||||
assert "GOMAG" in modal_html, "Missing 'GOMAG' column label in order detail modal"
|
||||
assert "ROA" in modal_html, "Missing 'ROA' column label in order detail modal"
|
||||
|
||||
|
||||
def test_order_detail_items_table_columns(page: Page, app_url: str):
|
||||
"""R9: Verify items table has all required columns."""
|
||||
page.goto(f"{app_url}/logs")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
headers = page.locator("#orderDetailModal thead th")
|
||||
texts = headers.all_text_contents()
|
||||
|
||||
# Current columns (may evolve — check dashboard.html for source of truth)
|
||||
required_columns = ["SKU", "Produs", "CODMAT", "Cant.", "Pret GoMag", "TVA%", "Valoare"]
|
||||
for col in required_columns:
|
||||
assert col in texts, f"Column '{col}' missing from order detail items table. Found: {texts}"
|
||||
|
||||
|
||||
def test_quick_map_from_order_detail(page: Page, app_url: str):
|
||||
"""R9+R11: Verify quick map modal is reachable from order detail context."""
|
||||
page.goto(f"{app_url}/logs")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
modal = page.locator("#quickMapModal")
|
||||
expect(modal).to_be_attached()
|
||||
|
||||
expect(page.locator("#qmCodmatLines")).to_be_attached()
|
||||
expect(page.locator("#qmPctWarning")).to_be_attached()
|
||||
|
||||
|
||||
def test_dashboard_navigates_to_logs(page: Page, app_url: str):
|
||||
"""Verify the sidebar on the dashboard contains a link to the logs page."""
|
||||
page.goto(f"{app_url}/")
|
||||
page.wait_for_load_state("networkidle")
|
||||
|
||||
logs_link = page.locator(".top-navbar a[href='/logs'], .bottom-nav a[href='/logs']")
|
||||
expect(logs_link.first).to_be_visible()
|
||||
@@ -1,108 +0,0 @@
|
||||
"""
|
||||
QA test fixtures — shared across api_health, responsive, smoke_prod, logs_monitor,
|
||||
sync_real, plsql tests.
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
|
||||
# Add api/ to path
|
||||
_api_dir = str(Path(__file__).parents[2])
|
||||
if _api_dir not in sys.path:
|
||||
sys.path.insert(0, _api_dir)
|
||||
|
||||
# Directories
|
||||
PROJECT_ROOT = Path(__file__).parents[3]
|
||||
QA_REPORTS_DIR = PROJECT_ROOT / "qa-reports"
|
||||
SCREENSHOTS_DIR = QA_REPORTS_DIR / "screenshots"
|
||||
LOGS_DIR = PROJECT_ROOT / "logs"
|
||||
|
||||
|
||||
def pytest_addoption(parser):
|
||||
# --base-url is already provided by pytest-playwright; we reuse it
|
||||
# Use try/except to avoid conflicts when conftest is loaded alongside other plugins
|
||||
try:
|
||||
parser.addoption("--env", default="test", choices=["test", "prod"], help="QA environment")
|
||||
except ValueError:
|
||||
pass
|
||||
try:
|
||||
parser.addoption("--qa-log-file", default=None, help="Specific log file to check")
|
||||
except (ValueError, Exception):
|
||||
pass
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def base_url(request):
|
||||
"""Reuse pytest-playwright's --base-url or default to localhost:5003."""
|
||||
url = request.config.getoption("--base-url") or "http://localhost:5003"
|
||||
return url.rstrip("/")
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def env_name(request):
|
||||
return request.config.getoption("--env")
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def qa_issues():
|
||||
"""Collect issues across all QA tests for the final report."""
|
||||
return []
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def screenshots_dir():
|
||||
SCREENSHOTS_DIR.mkdir(parents=True, exist_ok=True)
|
||||
return SCREENSHOTS_DIR
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def app_log_path(request):
|
||||
"""Return the most recent log file from logs/."""
|
||||
custom = request.config.getoption("--qa-log-file", default=None)
|
||||
if custom:
|
||||
return Path(custom)
|
||||
|
||||
if not LOGS_DIR.exists():
|
||||
return None
|
||||
|
||||
logs = sorted(LOGS_DIR.glob("sync_comenzi_*.log"), key=lambda p: p.stat().st_mtime, reverse=True)
|
||||
return logs[0] if logs else None
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def oracle_connection():
|
||||
"""Create a direct Oracle connection for PL/SQL and sync tests."""
|
||||
from dotenv import load_dotenv
|
||||
env_path = Path(__file__).parents[2] / ".env"
|
||||
load_dotenv(str(env_path), override=True)
|
||||
|
||||
user = os.environ.get("ORACLE_USER", "")
|
||||
password = os.environ.get("ORACLE_PASSWORD", "")
|
||||
dsn = os.environ.get("ORACLE_DSN", "")
|
||||
|
||||
if not all([user, password, dsn]) or user == "dummy":
|
||||
pytest.skip("Oracle not configured (ORACLE_USER/PASSWORD/DSN missing or dummy)")
|
||||
|
||||
# TNS_ADMIN must point to the directory containing tnsnames.ora, not the file
|
||||
tns_admin = os.environ.get("TNS_ADMIN", "")
|
||||
if tns_admin and os.path.isfile(tns_admin):
|
||||
os.environ["TNS_ADMIN"] = os.path.dirname(tns_admin)
|
||||
elif not tns_admin:
|
||||
# Default to api/ directory which contains tnsnames.ora
|
||||
os.environ["TNS_ADMIN"] = str(Path(__file__).parents[2])
|
||||
|
||||
import oracledb
|
||||
conn = oracledb.connect(user=user, password=password, dsn=dsn)
|
||||
yield conn
|
||||
conn.close()
|
||||
|
||||
|
||||
def pytest_sessionfinish(session, exitstatus):
|
||||
"""Generate QA report at end of session."""
|
||||
try:
|
||||
from . import qa_report
|
||||
qa_report.generate(session, QA_REPORTS_DIR)
|
||||
except Exception as e:
|
||||
print(f"\n[qa_report] Failed to generate report: {e}")
|
||||
@@ -1,245 +0,0 @@
|
||||
"""
|
||||
QA Report Generator — called by conftest.py's pytest_sessionfinish hook.
|
||||
"""
|
||||
import json
|
||||
import os
|
||||
import smtplib
|
||||
from datetime import date
|
||||
from email.mime.text import MIMEText
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
CATEGORIES = {
|
||||
"Console": {"weight": 0.10, "patterns": ["e2e/"]},
|
||||
"Navigation": {"weight": 0.10, "patterns": ["test_page_load", "test_", "_loads"]},
|
||||
"Functional": {"weight": 0.15, "patterns": ["e2e/"]},
|
||||
"API": {"weight": 0.15, "patterns": ["test_qa_api", "test_api_"]},
|
||||
"Responsive": {"weight": 0.10, "patterns": ["test_qa_responsive", "responsive"]},
|
||||
"Performance":{"weight": 0.10, "patterns": ["response_time"]},
|
||||
"Logs": {"weight": 0.15, "patterns": ["test_qa_logs", "log_monitor"]},
|
||||
"Sync/Oracle":{"weight": 0.15, "patterns": ["sync", "plsql", "oracle"]},
|
||||
}
|
||||
|
||||
|
||||
def _match_category(nodeid: str, name: str, category: str, patterns: list) -> bool:
|
||||
"""Check if a test belongs to a category based on patterns."""
|
||||
nodeid_lower = nodeid.lower()
|
||||
name_lower = name.lower()
|
||||
|
||||
if category == "Console":
|
||||
return "e2e/" in nodeid_lower
|
||||
elif category == "Functional":
|
||||
return "e2e/" in nodeid_lower
|
||||
elif category == "Navigation":
|
||||
return "test_page_load" in name_lower or name_lower.endswith("_loads")
|
||||
else:
|
||||
for p in patterns:
|
||||
if p in nodeid_lower or p in name_lower:
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
def _collect_results(session):
|
||||
"""Return list of (nodeid, name, passed, failed, error_msg) for each test."""
|
||||
results = []
|
||||
for item in session.items:
|
||||
nodeid = item.nodeid
|
||||
name = item.name
|
||||
passed = False
|
||||
failed = False
|
||||
error_msg = ""
|
||||
rep = getattr(item, "rep_call", None)
|
||||
if rep is None:
|
||||
# try stash
|
||||
try:
|
||||
rep = item.stash.get(item.config._store, None)
|
||||
except Exception:
|
||||
pass
|
||||
if rep is not None:
|
||||
passed = getattr(rep, "passed", False)
|
||||
failed = getattr(rep, "failed", False)
|
||||
if failed:
|
||||
try:
|
||||
error_msg = str(rep.longrepr).split("\n")[-1][:200]
|
||||
except Exception:
|
||||
error_msg = "unknown error"
|
||||
results.append((nodeid, name, passed, failed, error_msg))
|
||||
return results
|
||||
|
||||
|
||||
def _categorize(results):
|
||||
"""Group tests into categories and compute per-category stats."""
|
||||
cat_stats = {}
|
||||
for cat, cfg in CATEGORIES.items():
|
||||
cat_stats[cat] = {
|
||||
"weight": cfg["weight"],
|
||||
"passed": 0,
|
||||
"total": 0,
|
||||
"score": 100.0,
|
||||
}
|
||||
|
||||
for r in results:
|
||||
nodeid, name, passed = r[0], r[1], r[2]
|
||||
for cat, cfg in CATEGORIES.items():
|
||||
if _match_category(nodeid, name, cat, cfg["patterns"]):
|
||||
cat_stats[cat]["total"] += 1
|
||||
if passed:
|
||||
cat_stats[cat]["passed"] += 1
|
||||
|
||||
for cat, stats in cat_stats.items():
|
||||
if stats["total"] > 0:
|
||||
stats["score"] = (stats["passed"] / stats["total"]) * 100.0
|
||||
|
||||
return cat_stats
|
||||
|
||||
|
||||
def _compute_health(cat_stats) -> float:
|
||||
total = sum(
|
||||
(s["score"] / 100.0) * s["weight"] for s in cat_stats.values()
|
||||
)
|
||||
return round(total * 100, 1)
|
||||
|
||||
|
||||
def _load_baseline(reports_dir: Path):
|
||||
baseline_path = reports_dir / "baseline.json"
|
||||
if not baseline_path.exists():
|
||||
return None
|
||||
try:
|
||||
with open(baseline_path) as f:
|
||||
data = json.load(f)
|
||||
# validate minimal keys
|
||||
_ = data["health_score"], data["date"]
|
||||
return data
|
||||
except Exception:
|
||||
baseline_path.unlink(missing_ok=True)
|
||||
return None
|
||||
|
||||
|
||||
def _save_baseline(reports_dir: Path, health_score, passed, failed, cat_stats):
|
||||
baseline_path = reports_dir / "baseline.json"
|
||||
try:
|
||||
data = {
|
||||
"health_score": health_score,
|
||||
"date": str(date.today()),
|
||||
"passed": passed,
|
||||
"failed": failed,
|
||||
"categories": {
|
||||
cat: {"score": s["score"], "passed": s["passed"], "total": s["total"]}
|
||||
for cat, s in cat_stats.items()
|
||||
},
|
||||
}
|
||||
with open(baseline_path, "w") as f:
|
||||
json.dump(data, f, indent=2)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
|
||||
def _delta_str(health_score, baseline) -> str:
|
||||
if baseline is None:
|
||||
return ""
|
||||
prev = baseline.get("health_score", health_score)
|
||||
diff = round(health_score - prev, 1)
|
||||
sign = "+" if diff >= 0 else ""
|
||||
return f" (baseline: {prev}, {sign}{diff})"
|
||||
|
||||
|
||||
def _build_markdown(health_score, delta, cat_stats, failed_tests, today_str) -> str:
|
||||
lines = [
|
||||
f"# QA Report — {today_str}",
|
||||
"",
|
||||
f"## Health Score: {health_score}/100{delta}",
|
||||
"",
|
||||
"| Category | Score | Weight | Tests |",
|
||||
"|----------|-------|--------|-------|",
|
||||
]
|
||||
|
||||
for cat, s in cat_stats.items():
|
||||
score_pct = f"{s['score']:.0f}%"
|
||||
weight_pct = f"{int(s['weight'] * 100)}%"
|
||||
tests_str = f"{s['passed']}/{s['total']} passed" if s["total"] > 0 else "no tests"
|
||||
lines.append(f"| {cat} | {score_pct} | {weight_pct} | {tests_str} |")
|
||||
|
||||
lines += ["", "## Failed Tests"]
|
||||
if failed_tests:
|
||||
for name, msg in failed_tests:
|
||||
lines.append(f"- `{name}`: {msg}")
|
||||
else:
|
||||
lines.append("_No failed tests._")
|
||||
|
||||
lines += ["", "## Warnings"]
|
||||
if health_score < 70:
|
||||
lines.append("- Health score below 70 — review failures before deploy.")
|
||||
|
||||
return "\n".join(lines) + "\n"
|
||||
|
||||
|
||||
def _send_email(health_score, report_path):
|
||||
smtp_host = os.environ.get("SMTP_HOST")
|
||||
if not smtp_host:
|
||||
return
|
||||
try:
|
||||
smtp_port = int(os.environ.get("SMTP_PORT", 587))
|
||||
smtp_user = os.environ.get("SMTP_USER", "")
|
||||
smtp_pass = os.environ.get("SMTP_PASSWORD", "")
|
||||
smtp_to = os.environ.get("SMTP_TO", smtp_user)
|
||||
|
||||
subject = f"QA Alert: Health Score {health_score}/100"
|
||||
body = f"Health score dropped to {health_score}/100.\nReport: {report_path}"
|
||||
|
||||
msg = MIMEText(body)
|
||||
msg["Subject"] = subject
|
||||
msg["From"] = smtp_user
|
||||
msg["To"] = smtp_to
|
||||
|
||||
with smtplib.SMTP(smtp_host, smtp_port) as server:
|
||||
server.ehlo()
|
||||
server.starttls()
|
||||
if smtp_user:
|
||||
server.login(smtp_user, smtp_pass)
|
||||
server.sendmail(smtp_user, [smtp_to], msg.as_string())
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
|
||||
def generate(session, reports_dir: Path):
|
||||
"""Generate QA health report. Called from conftest.py pytest_sessionfinish."""
|
||||
try:
|
||||
reports_dir = Path(reports_dir)
|
||||
reports_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
results = _collect_results(session)
|
||||
|
||||
passed_count = sum(1 for r in results if r[2])
|
||||
failed_count = sum(1 for r in results if r[3])
|
||||
failed_tests = [(r[1], r[4]) for r in results if r[3]]
|
||||
|
||||
cat_stats = _categorize(results)
|
||||
health_score = _compute_health(cat_stats)
|
||||
|
||||
baseline = _load_baseline(reports_dir)
|
||||
delta = _delta_str(health_score, baseline)
|
||||
|
||||
today_str = str(date.today())
|
||||
report_filename = f"qa-report-{today_str}.md"
|
||||
report_path = reports_dir / report_filename
|
||||
|
||||
md = _build_markdown(health_score, delta, cat_stats, failed_tests, today_str)
|
||||
|
||||
try:
|
||||
with open(report_path, "w") as f:
|
||||
f.write(md)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
_save_baseline(reports_dir, health_score, passed_count, failed_count, cat_stats)
|
||||
|
||||
if health_score < 70:
|
||||
_send_email(health_score, report_path)
|
||||
|
||||
print(f"\n{'═' * 50}")
|
||||
print(f" QA HEALTH SCORE: {health_score}/100{delta}")
|
||||
print(f" Report: {report_path}")
|
||||
print(f"{'═' * 50}\n")
|
||||
|
||||
except Exception:
|
||||
pass
|
||||
@@ -1,87 +0,0 @@
|
||||
"""QA tests for API endpoint health and basic contract validation."""
|
||||
import time
|
||||
import urllib.request
|
||||
import pytest
|
||||
import httpx
|
||||
|
||||
pytestmark = pytest.mark.qa
|
||||
|
||||
ENDPOINTS = [
|
||||
"/health",
|
||||
"/api/dashboard/orders",
|
||||
"/api/sync/status",
|
||||
"/api/sync/history",
|
||||
"/api/validate/missing-skus",
|
||||
"/api/mappings",
|
||||
"/api/settings",
|
||||
]
|
||||
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def client(base_url):
|
||||
"""Create httpx client; skip all if app is not reachable."""
|
||||
try:
|
||||
urllib.request.urlopen(f"{base_url}/health", timeout=3)
|
||||
except Exception:
|
||||
pytest.skip(f"App not reachable at {base_url}")
|
||||
with httpx.Client(base_url=base_url, timeout=10.0) as c:
|
||||
yield c
|
||||
|
||||
|
||||
def test_health(client):
|
||||
r = client.get("/health")
|
||||
assert r.status_code == 200
|
||||
data = r.json()
|
||||
assert "oracle" in data
|
||||
assert "sqlite" in data
|
||||
|
||||
|
||||
def test_dashboard_orders(client):
|
||||
r = client.get("/api/dashboard/orders")
|
||||
assert r.status_code == 200
|
||||
data = r.json()
|
||||
assert "orders" in data
|
||||
assert "counts" in data
|
||||
|
||||
|
||||
def test_sync_status(client):
|
||||
r = client.get("/api/sync/status")
|
||||
assert r.status_code == 200
|
||||
data = r.json()
|
||||
assert "status" in data
|
||||
|
||||
|
||||
def test_sync_history(client):
|
||||
r = client.get("/api/sync/history")
|
||||
assert r.status_code == 200
|
||||
data = r.json()
|
||||
assert "runs" in data
|
||||
assert isinstance(data["runs"], list)
|
||||
|
||||
|
||||
def test_missing_skus(client):
|
||||
r = client.get("/api/validate/missing-skus")
|
||||
assert r.status_code == 200
|
||||
data = r.json()
|
||||
assert "missing_skus" in data
|
||||
|
||||
|
||||
def test_mappings(client):
|
||||
r = client.get("/api/mappings")
|
||||
assert r.status_code == 200
|
||||
data = r.json()
|
||||
assert "mappings" in data
|
||||
|
||||
|
||||
def test_settings(client):
|
||||
r = client.get("/api/settings")
|
||||
assert r.status_code == 200
|
||||
assert isinstance(r.json(), dict)
|
||||
|
||||
|
||||
@pytest.mark.parametrize("endpoint", ENDPOINTS)
|
||||
def test_response_time(client, endpoint):
|
||||
start = time.monotonic()
|
||||
client.get(endpoint)
|
||||
elapsed = time.monotonic() - start
|
||||
assert elapsed < 5.0, f"{endpoint} took {elapsed:.2f}s (limit: 5s)"
|
||||
@@ -1,139 +0,0 @@
|
||||
"""
|
||||
Log monitoring tests — parse app log files for errors and anomalies.
|
||||
Run with: pytest api/tests/qa/test_qa_logs_monitor.py
|
||||
|
||||
Tests only check log lines from the current session (last 1 hour) to avoid
|
||||
failing on pre-existing historical errors.
|
||||
"""
|
||||
import re
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
import pytest
|
||||
|
||||
pytestmark = pytest.mark.qa
|
||||
|
||||
# Log line format: 2026-03-23 07:57:12,691 | INFO | app.main | message
|
||||
_MAX_WARNINGS = 50
|
||||
_SESSION_WINDOW_HOURS = 1
|
||||
|
||||
# Known issues that are tracked separately and should not fail the QA suite.
|
||||
# These are real bugs that need fixing but should not block test runs.
|
||||
_KNOWN_ISSUES = [
|
||||
"soft-deleting order ID=533: ORA-00942", # Pre-existing: missing table/view
|
||||
"Oracle init failed: DPY-4000", # Dev env: no Oracle tnsnames
|
||||
"ANAF API client error 404", # Dev env: ANAF mock returns 404
|
||||
"ANAF API server error after retry: 500", # Dev env: ANAF mock returns 500
|
||||
]
|
||||
|
||||
|
||||
def _read_recent_lines(app_log_path):
|
||||
"""Read log file lines from the last session window only."""
|
||||
if app_log_path is None or not app_log_path.exists():
|
||||
pytest.skip("No log file available")
|
||||
|
||||
all_lines = app_log_path.read_text(encoding="utf-8", errors="replace").splitlines()
|
||||
|
||||
# Filter to recent lines only (within session window)
|
||||
cutoff = datetime.now() - timedelta(hours=_SESSION_WINDOW_HOURS)
|
||||
recent = []
|
||||
for line in all_lines:
|
||||
# Parse timestamp from log line: "2026-03-24 09:43:46,174 | ..."
|
||||
match = re.match(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})", line)
|
||||
if match:
|
||||
try:
|
||||
ts = datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
|
||||
if ts >= cutoff:
|
||||
recent.append(line)
|
||||
except ValueError:
|
||||
recent.append(line) # Include unparseable lines
|
||||
else:
|
||||
# Non-timestamped lines (continuations) — include if we're in recent window
|
||||
if recent:
|
||||
recent.append(line)
|
||||
|
||||
return recent
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def test_log_file_exists(app_log_path):
|
||||
"""Log file path resolves to an existing file."""
|
||||
if app_log_path is None:
|
||||
pytest.skip("No log file configured")
|
||||
assert app_log_path.exists(), f"Log file not found: {app_log_path}"
|
||||
|
||||
|
||||
def _is_known_issue(line):
|
||||
"""Check if a log line matches a known tracked issue."""
|
||||
return any(ki in line for ki in _KNOWN_ISSUES)
|
||||
|
||||
|
||||
def test_no_critical_errors(app_log_path, qa_issues):
|
||||
"""No unexpected ERROR-level lines in recent log entries."""
|
||||
lines = _read_recent_lines(app_log_path)
|
||||
errors = [l for l in lines if "| ERROR |" in l and not _is_known_issue(l)]
|
||||
known = [l for l in lines if "| ERROR |" in l and _is_known_issue(l)]
|
||||
if errors:
|
||||
qa_issues.extend({"type": "log_error", "line": l} for l in errors)
|
||||
if known:
|
||||
qa_issues.extend({"type": "known_issue", "line": l} for l in known)
|
||||
assert len(errors) == 0, (
|
||||
f"Found {len(errors)} unexpected ERROR line(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
|
||||
+ "\n".join(errors[:10])
|
||||
)
|
||||
|
||||
|
||||
def test_no_oracle_errors(app_log_path, qa_issues):
|
||||
"""No unexpected Oracle ORA- error codes in recent log entries."""
|
||||
lines = _read_recent_lines(app_log_path)
|
||||
ora_errors = [l for l in lines if "ORA-" in l and not _is_known_issue(l)]
|
||||
known = [l for l in lines if "ORA-" in l and _is_known_issue(l)]
|
||||
if ora_errors:
|
||||
qa_issues.extend({"type": "oracle_error", "line": l} for l in ora_errors)
|
||||
if known:
|
||||
qa_issues.extend({"type": "known_issue", "line": l} for l in known)
|
||||
assert len(ora_errors) == 0, (
|
||||
f"Found {len(ora_errors)} unexpected ORA- error(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
|
||||
+ "\n".join(ora_errors[:10])
|
||||
)
|
||||
|
||||
|
||||
def test_no_unhandled_exceptions(app_log_path, qa_issues):
|
||||
"""No unhandled Python tracebacks in recent log entries."""
|
||||
lines = _read_recent_lines(app_log_path)
|
||||
tb_lines = [l for l in lines if "Traceback" in l]
|
||||
if tb_lines:
|
||||
qa_issues.extend({"type": "traceback", "line": l} for l in tb_lines)
|
||||
assert len(tb_lines) == 0, (
|
||||
f"Found {len(tb_lines)} Traceback(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
|
||||
+ "\n".join(tb_lines[:10])
|
||||
)
|
||||
|
||||
|
||||
def test_no_import_failures(app_log_path, qa_issues):
|
||||
"""No import failure messages in recent log entries."""
|
||||
lines = _read_recent_lines(app_log_path)
|
||||
pattern = re.compile(r"import failed|Order.*failed", re.IGNORECASE)
|
||||
failures = [l for l in lines if pattern.search(l)]
|
||||
if failures:
|
||||
qa_issues.extend({"type": "import_failure", "line": l} for l in failures)
|
||||
assert len(failures) == 0, (
|
||||
f"Found {len(failures)} import failure(s) in recent {_SESSION_WINDOW_HOURS}h window:\n"
|
||||
+ "\n".join(failures[:10])
|
||||
)
|
||||
|
||||
|
||||
def test_warning_count_acceptable(app_log_path, qa_issues):
|
||||
"""WARNING count in recent window is below acceptable threshold."""
|
||||
lines = _read_recent_lines(app_log_path)
|
||||
warnings = [l for l in lines if "| WARNING |" in l]
|
||||
if len(warnings) >= _MAX_WARNINGS:
|
||||
qa_issues.append({
|
||||
"type": "high_warning_count",
|
||||
"count": len(warnings),
|
||||
"threshold": _MAX_WARNINGS,
|
||||
})
|
||||
assert len(warnings) < _MAX_WARNINGS, (
|
||||
f"Warning count {len(warnings)} exceeds threshold {_MAX_WARNINGS} "
|
||||
f"in recent {_SESSION_WINDOW_HOURS}h window"
|
||||
)
|
||||
@@ -1,208 +0,0 @@
|
||||
"""
|
||||
PL/SQL package tests using direct Oracle connection.
|
||||
|
||||
Verifies that key Oracle packages are VALID and that order import
|
||||
procedures work end-to-end with cleanup.
|
||||
"""
|
||||
import json
|
||||
import time
|
||||
import logging
|
||||
import pytest
|
||||
|
||||
pytestmark = pytest.mark.oracle
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
PACKAGES_TO_CHECK = [
|
||||
"PACK_IMPORT_COMENZI",
|
||||
"PACK_IMPORT_PARTENERI",
|
||||
"PACK_COMENZI",
|
||||
"PACK_FACTURARE",
|
||||
]
|
||||
|
||||
_STATUS_SQL = """
|
||||
SELECT status
|
||||
FROM user_objects
|
||||
WHERE object_name = :name
|
||||
AND object_type = 'PACKAGE BODY'
|
||||
"""
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Module-scoped fixture for sharing test order ID between tests
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def test_order_id(oracle_connection):
|
||||
"""
|
||||
Create a test order via PACK_IMPORT_COMENZI.importa_comanda and yield
|
||||
its ID. Cleans up (DELETE) after all module tests finish.
|
||||
"""
|
||||
import oracledb
|
||||
|
||||
conn = oracle_connection
|
||||
order_id = None
|
||||
|
||||
# Find a minimal valid partner ID
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"SELECT MIN(id_part) FROM nom_parteneri WHERE id_part > 0"
|
||||
)
|
||||
row = cur.fetchone()
|
||||
if not row or row[0] is None:
|
||||
pytest.skip("No partners found in Oracle — cannot create test order")
|
||||
partner_id = int(row[0])
|
||||
except Exception as exc:
|
||||
pytest.skip(f"Cannot query nom_parteneri table: {exc}")
|
||||
|
||||
# Find an article that has a price in some policy (required for import)
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
SELECT na.codmat, cp.id_pol, cp.pret
|
||||
FROM nom_articole na
|
||||
JOIN crm_politici_pret_art cp ON cp.id_articol = na.id_articol
|
||||
WHERE cp.pret > 0 AND na.codmat IS NOT NULL AND rownum = 1
|
||||
""")
|
||||
row = cur.fetchone()
|
||||
if not row:
|
||||
pytest.skip("No articles with prices found in Oracle — cannot create test order")
|
||||
test_sku, id_pol, test_price = row[0], int(row[1]), float(row[2])
|
||||
|
||||
nr_comanda_ext = f"PYTEST-{int(time.time())}"
|
||||
# Values must be strings — Oracle's JSON_OBJECT_T.get_string() returns NULL for numbers
|
||||
articles = json.dumps([{
|
||||
"sku": test_sku,
|
||||
"quantity": "1",
|
||||
"price": str(test_price),
|
||||
"vat": "19",
|
||||
}])
|
||||
|
||||
try:
|
||||
from datetime import datetime
|
||||
with conn.cursor() as cur:
|
||||
clob_var = cur.var(oracledb.DB_TYPE_CLOB)
|
||||
clob_var.setvalue(0, articles)
|
||||
id_comanda_var = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
|
||||
cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
|
||||
nr_comanda_ext, # p_nr_comanda_ext
|
||||
datetime.now(), # p_data_comanda
|
||||
partner_id, # p_id_partener
|
||||
clob_var, # p_json_articole
|
||||
None, # p_id_adresa_livrare
|
||||
None, # p_id_adresa_facturare
|
||||
id_pol, # p_id_pol
|
||||
None, # p_id_sectie
|
||||
None, # p_id_gestiune
|
||||
None, # p_kit_mode
|
||||
None, # p_id_pol_productie
|
||||
None, # p_kit_discount_codmat
|
||||
None, # p_kit_discount_id_pol
|
||||
id_comanda_var, # v_id_comanda (OUT)
|
||||
])
|
||||
|
||||
raw = id_comanda_var.getvalue()
|
||||
order_id = int(raw) if raw is not None else None
|
||||
|
||||
if order_id and order_id > 0:
|
||||
conn.commit()
|
||||
logger.info(f"Test order created: ID={order_id}, NR={nr_comanda_ext}")
|
||||
else:
|
||||
conn.rollback()
|
||||
order_id = None
|
||||
|
||||
except Exception as exc:
|
||||
try:
|
||||
conn.rollback()
|
||||
except Exception:
|
||||
pass
|
||||
logger.warning(f"Could not create test order: {exc}")
|
||||
order_id = None
|
||||
|
||||
yield order_id
|
||||
|
||||
# Cleanup — runs even if tests fail
|
||||
if order_id:
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"DELETE FROM comenzi_elemente WHERE id_comanda = :id",
|
||||
{"id": order_id}
|
||||
)
|
||||
cur.execute(
|
||||
"DELETE FROM comenzi WHERE id_comanda = :id",
|
||||
{"id": order_id}
|
||||
)
|
||||
conn.commit()
|
||||
logger.info(f"Test order {order_id} cleaned up")
|
||||
except Exception as exc:
|
||||
logger.error(f"Cleanup failed for order {order_id}: {exc}")
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Package validity tests
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def test_pack_import_comenzi_valid(oracle_connection):
|
||||
"""PACK_IMPORT_COMENZI package body must be VALID."""
|
||||
with oracle_connection.cursor() as cur:
|
||||
cur.execute(_STATUS_SQL, {"name": "PACK_IMPORT_COMENZI"})
|
||||
row = cur.fetchone()
|
||||
assert row is not None, "PACK_IMPORT_COMENZI package body not found in user_objects"
|
||||
assert row[0] == "VALID", f"PACK_IMPORT_COMENZI is {row[0]}"
|
||||
|
||||
|
||||
def test_pack_import_parteneri_valid(oracle_connection):
|
||||
"""PACK_IMPORT_PARTENERI package body must be VALID."""
|
||||
with oracle_connection.cursor() as cur:
|
||||
cur.execute(_STATUS_SQL, {"name": "PACK_IMPORT_PARTENERI"})
|
||||
row = cur.fetchone()
|
||||
assert row is not None, "PACK_IMPORT_PARTENERI package body not found in user_objects"
|
||||
assert row[0] == "VALID", f"PACK_IMPORT_PARTENERI is {row[0]}"
|
||||
|
||||
|
||||
def test_pack_comenzi_valid(oracle_connection):
|
||||
"""PACK_COMENZI package body must be VALID."""
|
||||
with oracle_connection.cursor() as cur:
|
||||
cur.execute(_STATUS_SQL, {"name": "PACK_COMENZI"})
|
||||
row = cur.fetchone()
|
||||
assert row is not None, "PACK_COMENZI package body not found in user_objects"
|
||||
assert row[0] == "VALID", f"PACK_COMENZI is {row[0]}"
|
||||
|
||||
|
||||
def test_pack_facturare_valid(oracle_connection):
|
||||
"""PACK_FACTURARE package body must be VALID."""
|
||||
with oracle_connection.cursor() as cur:
|
||||
cur.execute(_STATUS_SQL, {"name": "PACK_FACTURARE"})
|
||||
row = cur.fetchone()
|
||||
assert row is not None, "PACK_FACTURARE package body not found in user_objects"
|
||||
assert row[0] == "VALID", f"PACK_FACTURARE is {row[0]}"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Order import tests
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def test_import_order_with_articles(test_order_id):
|
||||
"""PACK_IMPORT_COMENZI.importa_comanda must return a valid order ID > 0."""
|
||||
if test_order_id is None:
|
||||
pytest.skip("Test order creation failed — see test_order_id fixture logs")
|
||||
assert test_order_id > 0, f"importa_comanda returned invalid ID: {test_order_id}"
|
||||
|
||||
|
||||
def test_cleanup_test_order(oracle_connection, test_order_id):
|
||||
"""Verify the test order rows exist and can be queried (cleanup runs via fixture)."""
|
||||
if test_order_id is None:
|
||||
pytest.skip("No test order to verify")
|
||||
|
||||
with oracle_connection.cursor() as cur:
|
||||
cur.execute(
|
||||
"SELECT COUNT(*) FROM comenzi WHERE id_comanda = :id",
|
||||
{"id": test_order_id}
|
||||
)
|
||||
row = cur.fetchone()
|
||||
|
||||
# At this point the order should still exist (fixture cleanup runs after module)
|
||||
assert row is not None
|
||||
assert row[0] >= 0 # may be 0 if already cleaned, just confirm query works
|
||||
@@ -1,146 +0,0 @@
|
||||
"""
|
||||
Responsive layout tests across 3 viewports.
|
||||
Tests each page on desktop / tablet / mobile using Playwright sync API.
|
||||
"""
|
||||
import pytest
|
||||
from pathlib import Path
|
||||
from playwright.sync_api import sync_playwright, expect
|
||||
|
||||
pytestmark = pytest.mark.qa
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Viewport definitions
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
VIEWPORTS = {
|
||||
"desktop": {"width": 1280, "height": 900},
|
||||
"tablet": {"width": 768, "height": 1024},
|
||||
"mobile": {"width": 375, "height": 812},
|
||||
}
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Pages to test: (path, expected_text_fragment)
|
||||
# expected_text_fragment is matched loosely against page title or any <h4>/<h1>
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
PAGES = [
|
||||
("/", "Panou"),
|
||||
("/logs", "Jurnale"),
|
||||
("/mappings", "Mapari"),
|
||||
("/missing-skus", "SKU"),
|
||||
("/settings", "Setari"),
|
||||
]
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Session-scoped browser (reused across all parametrized tests)
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.fixture(scope="session")
|
||||
def pw_browser():
|
||||
"""Launch a Chromium browser for the full QA session."""
|
||||
with sync_playwright() as pw:
|
||||
browser = pw.chromium.launch(headless=True)
|
||||
yield browser
|
||||
browser.close()
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Parametrized test: viewport x page
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.mark.parametrize("viewport_name", list(VIEWPORTS.keys()))
|
||||
@pytest.mark.parametrize("page_path,expected_text", PAGES)
|
||||
def test_responsive_page(
|
||||
pw_browser,
|
||||
base_url: str,
|
||||
screenshots_dir: Path,
|
||||
viewport_name: str,
|
||||
page_path: str,
|
||||
expected_text: str,
|
||||
):
|
||||
"""Each page renders without error on every viewport and contains expected text."""
|
||||
viewport = VIEWPORTS[viewport_name]
|
||||
context = pw_browser.new_context(viewport=viewport)
|
||||
page = context.new_page()
|
||||
|
||||
try:
|
||||
page.goto(f"{base_url}{page_path}", wait_until="networkidle", timeout=15_000)
|
||||
|
||||
# Screenshot
|
||||
page_name = page_path.strip("/") or "dashboard"
|
||||
screenshot_path = screenshots_dir / f"{page_name}-{viewport_name}.png"
|
||||
page.screenshot(path=str(screenshot_path), full_page=True)
|
||||
|
||||
# Basic content check: title or any h1/h4 contains expected text
|
||||
title = page.title()
|
||||
headings = page.locator("h1, h4").all_text_contents()
|
||||
all_text = " ".join([title] + headings)
|
||||
assert expected_text.lower() in all_text.lower(), (
|
||||
f"Expected '{expected_text}' in page text on {viewport_name} {page_path}. "
|
||||
f"Got title='{title}', headings={headings}"
|
||||
)
|
||||
finally:
|
||||
context.close()
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Mobile-specific: navbar toggler
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def test_mobile_navbar_visible(pw_browser, base_url: str):
|
||||
"""Mobile viewport: bottom nav should be visible (top navbar hidden on mobile)."""
|
||||
context = pw_browser.new_context(viewport=VIEWPORTS["mobile"])
|
||||
page = context.new_page()
|
||||
try:
|
||||
page.goto(base_url, wait_until="networkidle", timeout=15_000)
|
||||
# On mobile, top-navbar is hidden and bottom-nav is shown
|
||||
bottom_nav = page.locator(".bottom-nav")
|
||||
expect(bottom_nav).to_be_visible()
|
||||
finally:
|
||||
context.close()
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Mobile-specific: tables wrapped in .table-responsive or scrollable
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.mark.parametrize("page_path", ["/logs", "/mappings", "/missing-skus"])
|
||||
def test_mobile_table_responsive(pw_browser, base_url: str, page_path: str):
|
||||
"""
|
||||
On mobile, any <table> should live inside a .table-responsive wrapper
|
||||
OR the page should have a horizontal scroll container around it.
|
||||
If no table is present (empty state), the test is skipped.
|
||||
"""
|
||||
context = pw_browser.new_context(viewport=VIEWPORTS["mobile"])
|
||||
page = context.new_page()
|
||||
try:
|
||||
page.goto(f"{base_url}{page_path}", wait_until="networkidle", timeout=15_000)
|
||||
|
||||
tables = page.locator("table").all()
|
||||
if not tables:
|
||||
# No tables means nothing to check — pass (no non-responsive tables exist)
|
||||
return
|
||||
|
||||
# Check each table has an ancestor with overflow-x scroll or .table-responsive class
|
||||
for table in tables:
|
||||
# Check direct parent chain for .table-responsive
|
||||
wrapped = page.evaluate(
|
||||
"""(el) => {
|
||||
let node = el.parentElement;
|
||||
for (let i = 0; i < 6 && node; i++) {
|
||||
if (node.classList.contains('table-responsive')) return true;
|
||||
const style = window.getComputedStyle(node);
|
||||
if (style.overflowX === 'auto' || style.overflowX === 'scroll') return true;
|
||||
node = node.parentElement;
|
||||
}
|
||||
return false;
|
||||
}""",
|
||||
table.element_handle(),
|
||||
)
|
||||
assert wrapped, (
|
||||
f"Table on {page_path} is not inside a .table-responsive wrapper "
|
||||
f"or overflow-x:auto/scroll container on mobile viewport"
|
||||
)
|
||||
finally:
|
||||
context.close()
|
||||
@@ -1,142 +0,0 @@
|
||||
"""
|
||||
Smoke tests for production — read-only, no clicks.
|
||||
Run against a live app: pytest api/tests/qa/test_qa_smoke_prod.py --base-url http://localhost:5003
|
||||
"""
|
||||
import time
|
||||
import urllib.request
|
||||
import json
|
||||
|
||||
import pytest
|
||||
from playwright.sync_api import sync_playwright
|
||||
|
||||
pytestmark = pytest.mark.smoke
|
||||
|
||||
PAGES = ["/", "/logs", "/mappings", "/missing-skus", "/settings"]
|
||||
|
||||
|
||||
def _app_is_reachable(base_url: str) -> bool:
|
||||
"""Quick check if the app is reachable."""
|
||||
try:
|
||||
urllib.request.urlopen(f"{base_url}/health", timeout=3)
|
||||
return True
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
|
||||
@pytest.fixture(scope="module", autouse=True)
|
||||
def _require_app(base_url):
|
||||
"""Skip all smoke tests if the app is not running."""
|
||||
if not _app_is_reachable(base_url):
|
||||
pytest.skip(f"App not reachable at {base_url} — start the app first")
|
||||
|
||||
PAGE_TITLES = {
|
||||
"/": "Panou de Comanda",
|
||||
"/logs": "Jurnale Import",
|
||||
"/mappings": "Mapari SKU",
|
||||
"/missing-skus": "SKU-uri Lipsa",
|
||||
"/settings": "Setari",
|
||||
}
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def browser():
|
||||
with sync_playwright() as p:
|
||||
b = p.chromium.launch(headless=True)
|
||||
yield b
|
||||
b.close()
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# test_page_loads
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.mark.parametrize("path", PAGES)
|
||||
def test_page_loads(browser, base_url, screenshots_dir, path):
|
||||
"""Each page returns HTTP 200 and loads without crashing."""
|
||||
page = browser.new_page()
|
||||
try:
|
||||
response = page.goto(f"{base_url}{path}", wait_until="domcontentloaded", timeout=15_000)
|
||||
assert response is not None, f"No response for {path}"
|
||||
assert response.status == 200, f"Expected 200, got {response.status} for {path}"
|
||||
|
||||
safe_name = path.strip("/").replace("/", "_") or "dashboard"
|
||||
screenshot_path = screenshots_dir / f"smoke_{safe_name}.png"
|
||||
page.screenshot(path=str(screenshot_path))
|
||||
finally:
|
||||
page.close()
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# test_page_titles
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.mark.parametrize("path", PAGES)
|
||||
def test_page_titles(browser, base_url, path):
|
||||
"""Each page has the correct h4 heading text."""
|
||||
expected = PAGE_TITLES[path]
|
||||
page = browser.new_page()
|
||||
try:
|
||||
page.goto(f"{base_url}{path}", wait_until="domcontentloaded", timeout=15_000)
|
||||
h4 = page.locator("h4").first
|
||||
actual = h4.inner_text().strip()
|
||||
assert actual == expected, f"{path}: expected h4='{expected}', got '{actual}'"
|
||||
finally:
|
||||
page.close()
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# test_no_console_errors
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.mark.parametrize("path", PAGES)
|
||||
def test_no_console_errors(browser, base_url, path):
|
||||
"""No console.error events on any page."""
|
||||
errors = []
|
||||
page = browser.new_page()
|
||||
try:
|
||||
page.on("console", lambda msg: errors.append(msg.text) if msg.type == "error" else None)
|
||||
page.goto(f"{base_url}{path}", wait_until="networkidle", timeout=15_000)
|
||||
finally:
|
||||
page.close()
|
||||
|
||||
assert errors == [], f"Console errors on {path}: {errors}"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# test_api_health_json
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def test_api_health_json(base_url):
|
||||
"""GET /health returns valid JSON with 'oracle' key."""
|
||||
with urllib.request.urlopen(f"{base_url}/health", timeout=10) as resp:
|
||||
data = json.loads(resp.read().decode())
|
||||
assert "oracle" in data, f"/health JSON missing 'oracle' key: {data}"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# test_api_dashboard_orders_json
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def test_api_dashboard_orders_json(base_url):
|
||||
"""GET /api/dashboard/orders returns valid JSON with 'orders' key."""
|
||||
with urllib.request.urlopen(f"{base_url}/api/dashboard/orders", timeout=10) as resp:
|
||||
data = json.loads(resp.read().decode())
|
||||
assert "orders" in data, f"/api/dashboard/orders JSON missing 'orders' key: {data}"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# test_response_time
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.mark.parametrize("path", PAGES)
|
||||
def test_response_time(browser, base_url, path):
|
||||
"""Each page loads in under 10 seconds."""
|
||||
page = browser.new_page()
|
||||
try:
|
||||
start = time.monotonic()
|
||||
page.goto(f"{base_url}{path}", wait_until="domcontentloaded", timeout=15_000)
|
||||
elapsed = time.monotonic() - start
|
||||
finally:
|
||||
page.close()
|
||||
|
||||
assert elapsed < 10, f"{path} took {elapsed:.2f}s (limit: 10s)"
|
||||
@@ -1,134 +0,0 @@
|
||||
"""
|
||||
Real sync test: GoMag API → validate → import into Oracle (MARIUSM_AUTO).
|
||||
|
||||
Requires:
|
||||
- App running on localhost:5003
|
||||
- GOMAG_API_KEY set in api/.env
|
||||
- Oracle configured (MARIUSM_AUTO_AUTO)
|
||||
"""
|
||||
import os
|
||||
import time
|
||||
from datetime import datetime, timedelta
|
||||
from pathlib import Path
|
||||
|
||||
import httpx
|
||||
import pytest
|
||||
from dotenv import load_dotenv
|
||||
|
||||
pytestmark = pytest.mark.sync
|
||||
|
||||
# Load .env once at module level for API key check
|
||||
_env_path = Path(__file__).parents[2] / ".env"
|
||||
load_dotenv(str(_env_path), override=True)
|
||||
|
||||
_GOMAG_API_KEY = os.environ.get("GOMAG_API_KEY", "")
|
||||
_GOMAG_API_SHOP = os.environ.get("GOMAG_API_SHOP", "")
|
||||
|
||||
if not _GOMAG_API_KEY:
|
||||
pytestmark = [pytest.mark.sync, pytest.mark.skip(reason="GOMAG_API_KEY not set")]
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def client(base_url):
|
||||
with httpx.Client(base_url=base_url, timeout=30.0) as c:
|
||||
yield c
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def gomag_api_key():
|
||||
if not _GOMAG_API_KEY:
|
||||
pytest.skip("GOMAG_API_KEY is empty or not set")
|
||||
return _GOMAG_API_KEY
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def gomag_api_shop():
|
||||
if not _GOMAG_API_SHOP:
|
||||
pytest.skip("GOMAG_API_SHOP is empty or not set")
|
||||
return _GOMAG_API_SHOP
|
||||
|
||||
|
||||
def _wait_for_sync(client, timeout=60):
|
||||
"""Poll sync status until it stops running. Returns final status dict."""
|
||||
deadline = time.monotonic() + timeout
|
||||
while time.monotonic() < deadline:
|
||||
r = client.get("/api/sync/status")
|
||||
assert r.status_code == 200, f"sync/status returned {r.status_code}"
|
||||
data = r.json()
|
||||
if data.get("status") != "running":
|
||||
return data
|
||||
time.sleep(2)
|
||||
raise TimeoutError(f"Sync did not finish within {timeout}s")
|
||||
|
||||
|
||||
def test_gomag_api_connection(gomag_api_key, gomag_api_shop):
|
||||
"""Verify direct GoMag API connectivity and order presence."""
|
||||
seven_days_ago = (datetime.now() - timedelta(days=7)).strftime("%Y-%m-%d")
|
||||
# GoMag API uses a central endpoint, not the shop URL
|
||||
url = "https://api.gomag.ro/api/v1/order/read/json"
|
||||
params = {"startDate": seven_days_ago, "page": 1, "limit": 5}
|
||||
headers = {"X-Oc-Restadmin-Id": gomag_api_key}
|
||||
|
||||
with httpx.Client(timeout=30.0, follow_redirects=True) as c:
|
||||
r = c.get(url, params=params, headers=headers)
|
||||
|
||||
assert r.status_code == 200, f"GoMag API returned {r.status_code}: {r.text[:200]}"
|
||||
data = r.json()
|
||||
# GoMag returns either a list or a dict with orders key
|
||||
if isinstance(data, dict):
|
||||
assert "orders" in data or len(data) > 0, "GoMag API returned empty response"
|
||||
else:
|
||||
assert isinstance(data, list), f"Unexpected GoMag response type: {type(data)}"
|
||||
|
||||
|
||||
def test_app_sync_start(client, gomag_api_key):
|
||||
"""Trigger a real sync via the app API and wait for completion."""
|
||||
r = client.post("/api/sync/start")
|
||||
assert r.status_code == 200, f"sync/start returned {r.status_code}: {r.text[:200]}"
|
||||
|
||||
final_status = _wait_for_sync(client, timeout=60)
|
||||
assert final_status.get("status") != "running", (
|
||||
f"Sync still running after timeout: {final_status}"
|
||||
)
|
||||
|
||||
|
||||
def test_sync_results(client):
|
||||
"""Verify the latest sync run processed at least one order."""
|
||||
r = client.get("/api/sync/history", params={"per_page": 1})
|
||||
assert r.status_code == 200, f"sync/history returned {r.status_code}"
|
||||
|
||||
data = r.json()
|
||||
runs = data.get("runs", [])
|
||||
assert len(runs) > 0, "No sync runs found in history"
|
||||
|
||||
latest = runs[0]
|
||||
assert latest.get("total_orders", 0) > 0, (
|
||||
f"Latest sync run has 0 orders: {latest}"
|
||||
)
|
||||
|
||||
|
||||
def test_sync_idempotent(client, gomag_api_key):
|
||||
"""Re-running sync should result in ALREADY_IMPORTED, not double imports."""
|
||||
r = client.post("/api/sync/start")
|
||||
assert r.status_code == 200, f"sync/start returned {r.status_code}"
|
||||
|
||||
_wait_for_sync(client, timeout=60)
|
||||
|
||||
r = client.get("/api/sync/history", params={"per_page": 1})
|
||||
assert r.status_code == 200
|
||||
|
||||
data = r.json()
|
||||
runs = data.get("runs", [])
|
||||
assert len(runs) > 0, "No sync runs found after second sync"
|
||||
|
||||
latest = runs[0]
|
||||
total = latest.get("total_orders", 0)
|
||||
already_imported = latest.get("already_imported", 0)
|
||||
imported = latest.get("imported", 0)
|
||||
|
||||
# Most orders should be ALREADY_IMPORTED on second run
|
||||
if total > 0:
|
||||
assert already_imported >= imported, (
|
||||
f"Expected mostly ALREADY_IMPORTED on second run, "
|
||||
f"got imported={imported}, already_imported={already_imported}, total={total}"
|
||||
)
|
||||
@@ -1,70 +0,0 @@
|
||||
-- Setup test data for Phase 1 validation tests
|
||||
-- Create test articles in NOM_ARTICOLE and mappings in ARTICOLE_TERTI
|
||||
|
||||
-- Clear any existing test mappings
|
||||
DELETE FROM ARTICOLE_TERTI WHERE sku IN ('CAFE100', '8000070028685', 'TEST001');
|
||||
|
||||
-- Disable trigger to allow specific ID_ARTICOL values
|
||||
ALTER TRIGGER trg_NOM_ARTICOLE_befoins DISABLE;
|
||||
|
||||
-- Create test articles in NOM_ARTICOLE with correct structure
|
||||
-- Using specific ID_ARTICOL values for test consistency
|
||||
INSERT INTO NOM_ARTICOLE (
|
||||
ID_ARTICOL, CODMAT, DENUMIRE, UM,
|
||||
DEP, ID_SUBGRUPA, CANT_BAX, STERS, ID_MOD, INACTIV,
|
||||
IN_STOC, IN_CRM, DNF, PRETACHCTVA, TAXA_RECONDITIONARE, GREUTATE,
|
||||
ID_UTIL, DATAORA
|
||||
) VALUES (
|
||||
9999001, 'CAF01', 'Cafea Test - 1kg', 'BUC',
|
||||
0, 1, 1, 0, 1, 0,
|
||||
1, 1, 0, 0, 0, 1000,
|
||||
-3, SYSDATE
|
||||
);
|
||||
|
||||
INSERT INTO NOM_ARTICOLE (
|
||||
ID_ARTICOL, CODMAT, DENUMIRE, UM,
|
||||
DEP, ID_SUBGRUPA, CANT_BAX, STERS, ID_MOD, INACTIV,
|
||||
IN_STOC, IN_CRM, DNF, PRETACHCTVA, TAXA_RECONDITIONARE, GREUTATE,
|
||||
ID_UTIL, DATAORA
|
||||
) VALUES (
|
||||
9999002, 'LAV001', 'Lavazza Gusto Forte Test', 'BUC',
|
||||
0, 1, 1, 0, 1, 0,
|
||||
1, 1, 0, 0, 0, 1000,
|
||||
-3, SYSDATE
|
||||
);
|
||||
|
||||
INSERT INTO NOM_ARTICOLE (
|
||||
ID_ARTICOL, CODMAT, DENUMIRE, UM,
|
||||
DEP, ID_SUBGRUPA, CANT_BAX, STERS, ID_MOD, INACTIV,
|
||||
IN_STOC, IN_CRM, DNF, PRETACHCTVA, TAXA_RECONDITIONARE, GREUTATE,
|
||||
ID_UTIL, DATAORA
|
||||
) VALUES (
|
||||
9999003, 'TEST001', 'Articol Test Generic', 'BUC',
|
||||
0, 1, 1, 0, 1, 0,
|
||||
1, 1, 0, 0, 0, 500,
|
||||
-3, SYSDATE
|
||||
);
|
||||
|
||||
-- Price entry for CAF01 in default price policy (id_pol=1)
|
||||
-- Used for single-component repackaging kit pricing test
|
||||
MERGE INTO crm_politici_pret_art dst
|
||||
USING (SELECT 1 AS id_pol, 9999001 AS id_articol FROM DUAL) src
|
||||
ON (dst.id_pol = src.id_pol AND dst.id_articol = src.id_articol)
|
||||
WHEN NOT MATCHED THEN INSERT (id_pol, id_articol, pret, proc_tvav)
|
||||
VALUES (src.id_pol, src.id_articol, 51.50, 19);
|
||||
|
||||
-- Create test mappings in ARTICOLE_TERTI
|
||||
-- CAFE100 -> CAF01 (repackaging: 10x1kg = 1x10kg web package)
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ)
|
||||
VALUES ('CAFE100', 'CAF01', 10, 100, 1);
|
||||
|
||||
-- Real GoMag SKU -> Lavazza article
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ)
|
||||
VALUES ('8000070028685', 'LAV001', 1, 100, 1);
|
||||
|
||||
-- Re-enable trigger after test data creation
|
||||
ALTER TRIGGER trg_NOM_ARTICOLE_befoins ENABLE;
|
||||
|
||||
COMMIT;
|
||||
|
||||
PROMPT === Test Data Setup Complete ===
|
||||
@@ -1,38 +0,0 @@
|
||||
-- Cleanup test data created for Phase 1 validation tests
|
||||
-- Remove test articles and mappings to leave database clean
|
||||
|
||||
-- Remove test price entry
|
||||
DELETE FROM crm_politici_pret_art WHERE id_pol = 1 AND id_articol = 9999001;
|
||||
|
||||
-- Remove test mappings
|
||||
DELETE FROM ARTICOLE_TERTI WHERE sku IN ('CAFE100', '8000070028685', 'TEST001');
|
||||
|
||||
-- Remove test articles (using specific ID_ARTICOL range to avoid removing real data)
|
||||
DELETE FROM NOM_ARTICOLE WHERE ID_ARTICOL BETWEEN 9999001 AND 9999003;
|
||||
|
||||
-- Remove any test orders created during testing (optional - to avoid accumulation)
|
||||
DELETE FROM COMENZI_ELEMENTE WHERE ID_COMANDA IN (
|
||||
SELECT ID_COMANDA FROM COMENZI
|
||||
WHERE NR_COMANDA LIKE 'COMPLETE-%'
|
||||
OR NR_COMANDA LIKE 'FINAL-TEST-%'
|
||||
OR NR_COMANDA LIKE 'GOMAG-TEST-%'
|
||||
OR NR_COMANDA LIKE 'TEST-%'
|
||||
OR COMANDA_EXTERNA LIKE '%TEST%'
|
||||
);
|
||||
|
||||
DELETE FROM COMENZI
|
||||
WHERE NR_COMANDA LIKE 'COMPLETE-%'
|
||||
OR NR_COMANDA LIKE 'FINAL-TEST-%'
|
||||
OR NR_COMANDA LIKE 'GOMAG-TEST-%'
|
||||
OR NR_COMANDA LIKE 'TEST-%'
|
||||
OR COMANDA_EXTERNA LIKE '%TEST%';
|
||||
|
||||
-- Remove test partners created during testing (optional)
|
||||
DELETE FROM NOM_PARTENERI
|
||||
WHERE DENUMIRE LIKE '%Test%'
|
||||
AND ID_UTIL = -3
|
||||
AND DATAORA > SYSDATE - 1; -- Only today's test partners
|
||||
|
||||
COMMIT;
|
||||
|
||||
PROMPT === Test Data Cleanup Complete ===
|
||||
@@ -1,469 +0,0 @@
|
||||
"""
|
||||
Oracle Integration Tests — Regula adrese PJ/PF
|
||||
===============================================
|
||||
Verifică că comenzile importate respectă regula:
|
||||
PF (fără CUI): id_adresa_facturare = id_adresa_livrare
|
||||
PJ (cu CUI): adresa_facturare_roa se potrivește cu adresa billing GoMag
|
||||
|
||||
Testele principale sunt E2E (importă comenzi sintetice în Oracle și verifică).
|
||||
Testele de regresie verifică comenzile existente din SQLite.
|
||||
|
||||
Run:
|
||||
pytest api/tests/test_address_rules_oracle.py -v
|
||||
./test.sh oracle
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import time
|
||||
|
||||
import pytest
|
||||
|
||||
pytestmark = pytest.mark.oracle
|
||||
|
||||
_script_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..")
|
||||
_project_root = os.path.dirname(_script_dir)
|
||||
|
||||
from dotenv import load_dotenv
|
||||
_env_path = os.path.join(_script_dir, ".env")
|
||||
load_dotenv(_env_path, override=True)
|
||||
|
||||
_tns_admin = os.environ.get("TNS_ADMIN", "")
|
||||
if _tns_admin and os.path.isfile(_tns_admin):
|
||||
os.environ["TNS_ADMIN"] = os.path.dirname(_tns_admin)
|
||||
elif not _tns_admin:
|
||||
os.environ["TNS_ADMIN"] = _script_dir
|
||||
|
||||
if _script_dir not in sys.path:
|
||||
sys.path.insert(0, _script_dir)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Fixtures
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def oracle_env():
|
||||
"""Re-aplică .env și actualizează settings pentru Oracle."""
|
||||
load_dotenv(_env_path, override=True)
|
||||
_tns = os.environ.get("TNS_ADMIN", "")
|
||||
if _tns and os.path.isfile(_tns):
|
||||
os.environ["TNS_ADMIN"] = os.path.dirname(_tns)
|
||||
|
||||
from app.config import settings
|
||||
settings.ORACLE_USER = os.environ.get("ORACLE_USER", "MARIUSM_AUTO")
|
||||
settings.ORACLE_PASSWORD = os.environ.get("ORACLE_PASSWORD", "ROMFASTSOFT")
|
||||
settings.ORACLE_DSN = os.environ.get("ORACLE_DSN", "ROA_CENTRAL")
|
||||
settings.TNS_ADMIN = os.environ.get("TNS_ADMIN", _script_dir)
|
||||
settings.FORCE_THIN_MODE = os.environ.get("FORCE_THIN_MODE", "") == "true"
|
||||
return settings
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def client(oracle_env):
|
||||
from fastapi.testclient import TestClient
|
||||
from app.main import app
|
||||
with TestClient(app) as c:
|
||||
yield c
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def oracle_pool(oracle_env):
|
||||
"""Pool Oracle direct pentru verificări în DB."""
|
||||
from app import database
|
||||
database.init_oracle()
|
||||
yield database.pool
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def real_codmat(client):
|
||||
"""CODMAT real din Oracle pentru liniile comenzii sintetice."""
|
||||
for term in ["01", "PH", "CA", "A"]:
|
||||
resp = client.get("/api/articles/search", params={"q": term})
|
||||
if resp.status_code == 200:
|
||||
results = resp.json().get("results", [])
|
||||
if results:
|
||||
return results[0]["codmat"]
|
||||
pytest.skip("Nu s-a găsit niciun CODMAT în Oracle pentru test")
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def app_settings(client):
|
||||
"""Setările aplicației (id_pol, id_sectie, etc.)."""
|
||||
resp = client.get("/api/sync/schedule")
|
||||
assert resp.status_code == 200
|
||||
import sqlite3
|
||||
from app.config import settings as _s
|
||||
db_path = _s.SQLITE_DB_PATH if os.path.isabs(_s.SQLITE_DB_PATH) else os.path.join(_script_dir, _s.SQLITE_DB_PATH)
|
||||
conn = sqlite3.connect(db_path)
|
||||
conn.row_factory = sqlite3.Row
|
||||
rows = conn.execute("SELECT key, value FROM app_settings").fetchall()
|
||||
conn.close()
|
||||
return {r["key"]: r["value"] for r in rows}
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def run_id():
|
||||
return f"pytest-addr-{int(time.time())}"
|
||||
|
||||
|
||||
def _build_pj_order(run_id, real_codmat):
|
||||
"""Comandă sintetică PJ: companie cu billing ≠ shipping."""
|
||||
from app.services.order_reader import OrderBilling, OrderShipping, OrderData, OrderItem
|
||||
billing = OrderBilling(
|
||||
firstname="Test", lastname="PJ", phone="0700000000", email="pj@pytest.local",
|
||||
address="Bld Unirii 1", city="Bucuresti", region="Bucuresti", country="RO",
|
||||
company_name="PYTEST COMPANY SRL", company_code="RO99000001", company_reg="J40/9999/2026",
|
||||
is_company=True
|
||||
)
|
||||
shipping = OrderShipping(
|
||||
firstname="Curier", lastname="Destinatar", phone="0799999999", email="ship@pytest.local",
|
||||
address="Str Livrare 99", city="Cluj-Napoca", region="Cluj", country="RO"
|
||||
)
|
||||
return OrderData(
|
||||
id=f"{run_id}-PJ",
|
||||
number=f"{run_id}-PJ",
|
||||
date="2026-01-15T10:00:00",
|
||||
status="new", status_id="1",
|
||||
billing=billing, shipping=shipping,
|
||||
items=[OrderItem(sku="PYTEST-SKU-PJ", name="Test PJ Item",
|
||||
price=10.0, quantity=1.0, vat=19.0)],
|
||||
total=10.0, delivery_cost=0.0, discount_total=0.0
|
||||
)
|
||||
|
||||
|
||||
def _build_pf_order(run_id, real_codmat):
|
||||
"""Comandă sintetică PF: persoană fizică, billing ≠ shipping (dar billing ROA trebuie = shipping)."""
|
||||
from app.services.order_reader import OrderBilling, OrderShipping, OrderData, OrderItem
|
||||
billing = OrderBilling(
|
||||
firstname="Ion", lastname="Popescu", phone="0700000001", email="pf@pytest.local",
|
||||
address="Str Alta 5", city="Timisoara", region="Timis", country="RO",
|
||||
company_name="", company_code="", company_reg="", is_company=False
|
||||
)
|
||||
shipping = OrderShipping(
|
||||
firstname="Ion", lastname="Popescu", phone="0700000001", email="pf@pytest.local",
|
||||
address="Str Livrare 10", city="Iasi", region="Iasi", country="RO"
|
||||
)
|
||||
return OrderData(
|
||||
id=f"{run_id}-PF",
|
||||
number=f"{run_id}-PF",
|
||||
date="2026-01-15T10:00:00",
|
||||
status="new", status_id="1",
|
||||
billing=billing, shipping=shipping,
|
||||
items=[OrderItem(sku="PYTEST-SKU-PF", name="Test PF Item",
|
||||
price=10.0, quantity=1.0, vat=19.0)],
|
||||
total=10.0, delivery_cost=0.0, discount_total=0.0
|
||||
)
|
||||
|
||||
|
||||
def _cleanup_test_orders(oracle_pool, run_id):
|
||||
"""Șterge comenzile de test din Oracle."""
|
||||
try:
|
||||
conn = oracle_pool.acquire()
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"DELETE FROM comenzi WHERE comanda_externa LIKE :1",
|
||||
[f"{run_id}%"]
|
||||
)
|
||||
conn.commit()
|
||||
oracle_pool.release(conn)
|
||||
except Exception as e:
|
||||
print(f"Cleanup warning: {e}")
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test E2E: import PJ + PF sintetice în Oracle
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestAddressRulesE2E:
|
||||
"""Import comenzi sintetice și verifică adresele în Oracle."""
|
||||
|
||||
@pytest.fixture(scope="class", autouse=True)
|
||||
def cleanup(self, oracle_pool, run_id):
|
||||
yield
|
||||
_cleanup_test_orders(oracle_pool, run_id)
|
||||
|
||||
def test_pj_billing_addr_is_gomag_billing(self, oracle_pool, real_codmat, app_settings, run_id):
|
||||
"""PJ: adresa facturare în Oracle provine din GoMag billing (nu shipping)."""
|
||||
from app.services.import_service import import_single_order
|
||||
from app.services.order_reader import OrderItem
|
||||
|
||||
order = _build_pj_order(run_id, real_codmat)
|
||||
# Replace test SKU with real codmat via mapping (or just use items with real SKU)
|
||||
order.items = [OrderItem(sku=real_codmat, name="Test PJ",
|
||||
price=10.0, quantity=1.0, vat=19.0)]
|
||||
|
||||
id_pol = int(app_settings.get("id_pol") or 0) or None
|
||||
id_sectie = int(app_settings.get("id_sectie") or 0) or None
|
||||
|
||||
result = import_single_order(order, id_pol=id_pol, id_sectie=id_sectie,
|
||||
app_settings=app_settings)
|
||||
|
||||
if not result["success"]:
|
||||
pytest.skip(f"Import PJ eșuat (SKU probabil nemapat): {result.get('error')}")
|
||||
|
||||
id_fact = result["id_adresa_facturare"]
|
||||
id_livr = result["id_adresa_livrare"]
|
||||
|
||||
assert id_fact is not None, "PJ: id_adresa_facturare lipsește din result"
|
||||
assert id_livr is not None, "PJ: id_adresa_livrare lipsește din result"
|
||||
|
||||
# PJ cu billing ≠ shipping: adresele trebuie să fie DIFERITE
|
||||
assert id_fact != id_livr, (
|
||||
f"PJ cu billing≠shipping trebuie să aibă id_fact({id_fact}) ≠ id_livr({id_livr}). "
|
||||
f"Regula veche (different_person) s-ar comporta la fel, dar acum PJ folosește billing GoMag."
|
||||
)
|
||||
|
||||
# Verifică în Oracle că adresele există
|
||||
conn = oracle_pool.acquire()
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"SELECT id_livrare, id_facturare FROM comenzi WHERE comanda_externa = :1",
|
||||
[order.number]
|
||||
)
|
||||
row = cur.fetchone()
|
||||
oracle_pool.release(conn)
|
||||
|
||||
assert row is not None, f"Comanda {order.number} nu s-a găsit în Oracle comenzi"
|
||||
assert row[0] == id_livr, f"id_livrare Oracle ({row[0]}) ≠ result ({id_livr})"
|
||||
assert row[1] == id_fact, f"id_facturare Oracle ({row[1]}) ≠ result ({id_fact})"
|
||||
|
||||
def test_pf_billing_addr_equals_shipping(self, oracle_pool, real_codmat, app_settings, run_id):
|
||||
"""PF: adresa facturare în Oracle = adresa livrare (ramburs curier)."""
|
||||
from app.services.import_service import import_single_order
|
||||
from app.services.order_reader import OrderItem
|
||||
|
||||
order = _build_pf_order(run_id, real_codmat)
|
||||
order.items = [OrderItem(sku=real_codmat, name="Test PF",
|
||||
price=10.0, quantity=1.0, vat=19.0)]
|
||||
|
||||
id_pol = int(app_settings.get("id_pol") or 0) or None
|
||||
id_sectie = int(app_settings.get("id_sectie") or 0) or None
|
||||
|
||||
result = import_single_order(order, id_pol=id_pol, id_sectie=id_sectie,
|
||||
app_settings=app_settings)
|
||||
|
||||
if not result["success"]:
|
||||
pytest.skip(f"Import PF eșuat: {result.get('error')}")
|
||||
|
||||
id_fact = result["id_adresa_facturare"]
|
||||
id_livr = result["id_adresa_livrare"]
|
||||
|
||||
assert id_fact is not None, "PF: id_adresa_facturare lipsește din result"
|
||||
assert id_livr is not None, "PF: id_adresa_livrare lipsește din result"
|
||||
|
||||
# PF: id_facturare TREBUIE să fie = id_livrare
|
||||
assert id_fact == id_livr, (
|
||||
f"PF trebuie să aibă id_fact({id_fact}) = id_livr({id_livr}) — "
|
||||
f"ramburs curier pe adresa de livrare"
|
||||
)
|
||||
|
||||
# Verifică în Oracle
|
||||
conn = oracle_pool.acquire()
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"SELECT id_livrare, id_facturare FROM comenzi WHERE comanda_externa = :1",
|
||||
[order.number]
|
||||
)
|
||||
row = cur.fetchone()
|
||||
oracle_pool.release(conn)
|
||||
|
||||
assert row is not None, f"Comanda {order.number} nu s-a găsit în Oracle comenzi"
|
||||
assert row[1] == row[0], (
|
||||
f"Oracle: id_facturare({row[1]}) ≠ id_livrare({row[0]}) pentru PF"
|
||||
)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test: parsare componente adresă (strada, numar, bloc, scara, apart, etaj)
|
||||
# Apelează direct parseaza_adresa_semicolon din Oracle — fără import comandă.
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestAddressComponentParsing:
|
||||
"""Verifică extragerea componentelor adresei direct prin parseaza_adresa_semicolon."""
|
||||
|
||||
def _parse_address(self, oracle_pool, address, city="Bucuresti", region="Bucuresti"):
|
||||
"""Call Oracle parseaza_adresa_semicolon and return parsed components."""
|
||||
from app.services.import_service import format_address_for_oracle
|
||||
formatted = format_address_for_oracle(address, city, region)
|
||||
|
||||
conn = oracle_pool.acquire()
|
||||
try:
|
||||
with conn.cursor() as cur:
|
||||
p_judet = cur.var(str, 200)
|
||||
p_localitate = cur.var(str, 200)
|
||||
p_strada = cur.var(str, 100)
|
||||
p_numar = cur.var(str, 100)
|
||||
p_sector = cur.var(str, 100)
|
||||
p_bloc = cur.var(str, 30)
|
||||
p_scara = cur.var(str, 10)
|
||||
p_apart = cur.var(str, 10)
|
||||
p_etaj = cur.var(str, 20)
|
||||
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.parseaza_adresa_semicolon", [
|
||||
formatted, p_judet, p_localitate, p_strada, p_numar,
|
||||
p_sector, p_bloc, p_scara, p_apart, p_etaj
|
||||
])
|
||||
|
||||
return {
|
||||
"strada": p_strada.getvalue(),
|
||||
"numar": p_numar.getvalue(),
|
||||
"bloc": p_bloc.getvalue(),
|
||||
"scara": p_scara.getvalue(),
|
||||
"apart": p_apart.getvalue(),
|
||||
"etaj": p_etaj.getvalue(),
|
||||
"localitate": p_localitate.getvalue(),
|
||||
"judet": p_judet.getvalue(),
|
||||
}
|
||||
finally:
|
||||
oracle_pool.release(conn)
|
||||
|
||||
def test_full_address_all_components(self, oracle_pool):
|
||||
"""Adresa completă cu nr, bl, sc, ap — toate componentele se extrag din strada."""
|
||||
addr = self._parse_address(oracle_pool,
|
||||
"Bd. 1 Decembrie 1918 nr. 26 bl. 6 sc. 2 ap. 36")
|
||||
assert addr["numar"] == "26", f"numar={addr['numar']}"
|
||||
assert addr["bloc"] == "6", f"bloc={addr['bloc']}"
|
||||
assert addr["scara"] == "2", f"scara={addr['scara']}"
|
||||
assert addr["apart"] == "36", f"apart={addr['apart']}"
|
||||
assert "SC" not in (addr["strada"] or ""), f"SC ramas in strada: {addr['strada']}"
|
||||
assert "AP" not in (addr["strada"] or ""), f"AP ramas in strada: {addr['strada']}"
|
||||
|
||||
def test_alphanumeric_bloc_and_letter_scara(self, oracle_pool):
|
||||
"""Bloc alfanumeric (VN9) și scara literă (A) + etaj."""
|
||||
addr = self._parse_address(oracle_pool,
|
||||
"Strada Becatei nr 29 bl. VN9 sc. A et. 10 ap. 42")
|
||||
assert addr["numar"] == "29", f"numar={addr['numar']}"
|
||||
assert addr["bloc"] == "VN9", f"bloc={addr['bloc']}"
|
||||
assert addr["scara"] == "A", f"scara={addr['scara']}"
|
||||
assert addr["etaj"] == "10", f"etaj={addr['etaj']}"
|
||||
assert addr["apart"] == "42", f"apart={addr['apart']}"
|
||||
|
||||
def test_address_without_commas_uppercase(self, oracle_pool):
|
||||
"""Adresa uppercase fără virgule — keywords spațiu-separate."""
|
||||
addr = self._parse_address(oracle_pool,
|
||||
"STR DACIA NR 15 BLOC Z2 SC 1 AP 7 ET 3")
|
||||
assert addr["numar"] == "15", f"numar={addr['numar']}"
|
||||
assert addr["bloc"] == "Z2", f"bloc={addr['bloc']}"
|
||||
assert addr["scara"] == "1", f"scara={addr['scara']}"
|
||||
assert addr["apart"] == "7", f"apart={addr['apart']}"
|
||||
assert addr["etaj"] == "3", f"etaj={addr['etaj']}"
|
||||
|
||||
def test_address_with_existing_commas(self, oracle_pool):
|
||||
"""Adresa care deja are virgule — nu se strică parsarea."""
|
||||
addr = self._parse_address(oracle_pool,
|
||||
"Str Victoriei, nr. 10, bl. A1, sc. B, et. 2, ap. 15")
|
||||
assert addr["numar"] == "10", f"numar={addr['numar']}"
|
||||
assert addr["bloc"] == "A1", f"bloc={addr['bloc']}"
|
||||
assert addr["scara"] == "B", f"scara={addr['scara']}"
|
||||
assert addr["etaj"] == "2", f"etaj={addr['etaj']}"
|
||||
assert addr["apart"] == "15", f"apart={addr['apart']}"
|
||||
|
||||
def test_no_keywords_street_unchanged(self, oracle_pool):
|
||||
"""Adresa simplă fără keywords — strada rămâne intactă."""
|
||||
addr = self._parse_address(oracle_pool, "Strada Victoriei 10")
|
||||
assert "VICTORIEI" in (addr["strada"] or ""), f"strada={addr['strada']}"
|
||||
|
||||
def test_blocuri_neighborhood_not_extracted_as_bloc(self, oracle_pool):
|
||||
"""'Blocuri' in street name must NOT be parsed as BLOC keyword."""
|
||||
result = self._parse_address(oracle_pool, "Str Principala Modarzau Blocuri", "Zemes", "Bacau")
|
||||
assert "MODARZAU BLOCURI" in (result.get("strada") or ""), f"strada should contain MODARZAU BLOCURI, got {result}"
|
||||
assert result.get("bloc") is None, f"bloc should be NULL for neighborhood name, got {result.get('bloc')}"
|
||||
|
||||
def test_numar_overflow_with_landmark(self, oracle_pool):
|
||||
"""'nr 5 la non stop' — numar=5, landmark overflow muta in strada."""
|
||||
addr = self._parse_address(oracle_pool, "Str zorilor nr 5 la non stop", "Brasov", "Brasov")
|
||||
assert addr["numar"] == "5", f"numar={addr['numar']!r} (asteptat '5')"
|
||||
assert "ZORILOR" in (addr["strada"] or ""), f"strada={addr['strada']!r}"
|
||||
assert "NON" in (addr["strada"] or ""), f"landmark lipsa din strada: {addr['strada']!r}"
|
||||
|
||||
def test_numar_overflow_with_sat_localitate(self, oracle_pool):
|
||||
"""'nr21 sat Grozavesti corbii mari' — numar=21, SAT overwrite p_localitate (satul = localitate)."""
|
||||
addr = self._parse_address(oracle_pool, "Pe deal nr21 sat Grozavesti corbii mari", "Corbii Mari", "Dambovita")
|
||||
assert addr["numar"] == "21", f"numar={addr['numar']!r} (asteptat '21')"
|
||||
assert "DEAL" in (addr["strada"] or ""), f"strada={addr['strada']!r}"
|
||||
assert "GROZAVESTI" not in (addr["strada"] or ""), f"SAT in strada: {addr['strada']!r}"
|
||||
# SAT ... a fost mutat in p_localitate (override din GoMag "CORBII MARI")
|
||||
assert "GROZAVESTI" in (addr["localitate"] or "").upper(), (
|
||||
f"localitate={addr['localitate']!r} (astept sa contina GROZAVESTI)"
|
||||
)
|
||||
|
||||
def test_numar_normal_not_affected(self, oracle_pool):
|
||||
"""Numar normal (<= 10 chars) nu e atins de overflow fix."""
|
||||
addr = self._parse_address(oracle_pool, "Str Mihai Viteazu nr 10", "Cluj-Napoca", "Cluj")
|
||||
assert addr["numar"] == "10", f"numar={addr['numar']!r}"
|
||||
assert "VITEAZU" in (addr["strada"] or ""), f"strada={addr['strada']!r}"
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test regresie: comenzi existente în SQLite
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestAddressRulesRegression:
|
||||
"""Verifică că comenzile existente importate după fix respectă regula PJ/PF."""
|
||||
|
||||
FIX_DATE = "2026-04-08" # data când a fost aplicat fix-ul
|
||||
|
||||
@pytest.fixture(scope="class")
|
||||
def sqlite_rows(self):
|
||||
"""Comenzi cu adrese populate importate după data fix-ului."""
|
||||
import sqlite3
|
||||
from app.config import settings
|
||||
db_path = os.environ.get("SQLITE_DB_PATH", os.path.join(_script_dir, "orders.db"))
|
||||
if not os.path.exists(db_path):
|
||||
pytest.skip(f"SQLite DB lipsă: {db_path}")
|
||||
|
||||
conn = sqlite3.connect(db_path)
|
||||
conn.row_factory = sqlite3.Row
|
||||
rows = conn.execute("""
|
||||
SELECT order_number, cod_fiscal_gomag,
|
||||
id_adresa_facturare, id_adresa_livrare,
|
||||
adresa_facturare_gomag, adresa_livrare_gomag,
|
||||
adresa_facturare_roa, adresa_livrare_roa,
|
||||
first_seen_at
|
||||
FROM orders
|
||||
WHERE id_adresa_facturare IS NOT NULL
|
||||
AND id_adresa_livrare IS NOT NULL
|
||||
AND first_seen_at >= ?
|
||||
""", (self.FIX_DATE,)).fetchall()
|
||||
conn.close()
|
||||
return rows
|
||||
|
||||
def test_pf_id_facturare_equals_id_livrare(self, sqlite_rows):
|
||||
"""PF noi: id_adresa_facturare = id_adresa_livrare."""
|
||||
pf_rows = [r for r in sqlite_rows if not r["cod_fiscal_gomag"]]
|
||||
if not pf_rows:
|
||||
pytest.skip(f"Nicio comandă PF importată după {self.FIX_DATE}")
|
||||
|
||||
violations = [
|
||||
f"{r['order_number']}: id_fact={r['id_adresa_facturare']} id_livr={r['id_adresa_livrare']}"
|
||||
for r in pf_rows
|
||||
if r["id_adresa_facturare"] != r["id_adresa_livrare"]
|
||||
]
|
||||
assert not violations, (
|
||||
f"PF comenzi cu id_fact ≠ id_livr ({len(violations)}):\n" + "\n".join(violations[:10])
|
||||
)
|
||||
|
||||
def test_pj_billing_roa_matches_gomag_billing(self, sqlite_rows):
|
||||
"""PJ noi: adresa_facturare_roa se potrivește cu GoMag billing address."""
|
||||
from app.services.sync_service import _addr_match
|
||||
|
||||
pj_rows = [
|
||||
r for r in sqlite_rows
|
||||
if r["cod_fiscal_gomag"] and r["adresa_facturare_gomag"] and r["adresa_facturare_roa"]
|
||||
]
|
||||
if not pj_rows:
|
||||
pytest.skip(f"Nicio comandă PJ cu adrese populate importată după {self.FIX_DATE}")
|
||||
|
||||
violations = []
|
||||
for r in pj_rows:
|
||||
if not _addr_match(r["adresa_facturare_gomag"], r["adresa_facturare_roa"]):
|
||||
violations.append(
|
||||
f"{r['order_number']}: billing_gomag={r['adresa_facturare_gomag']!r} "
|
||||
f"fact_roa={r['adresa_facturare_roa']!r}"
|
||||
)
|
||||
|
||||
assert not violations, (
|
||||
f"PJ comenzi cu adresa_facturare_roa care nu corespunde GoMag billing ({len(violations)}):\n"
|
||||
+ "\n".join(violations[:10])
|
||||
)
|
||||
@@ -1,114 +0,0 @@
|
||||
"""
|
||||
Test: Basic App Import and Route Tests (pytest-compatible)
|
||||
==========================================================
|
||||
Tests module imports and all GET routes without requiring Oracle.
|
||||
Converted from api/test_app_basic.py.
|
||||
|
||||
Run:
|
||||
pytest api/tests/test_app_basic.py -v
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
|
||||
import pytest
|
||||
|
||||
# --- Marker: all tests here are unit (no Oracle) ---
|
||||
pytestmark = pytest.mark.unit
|
||||
|
||||
# --- Set env vars BEFORE any app import ---
|
||||
_tmpdir = tempfile.mkdtemp()
|
||||
_sqlite_path = os.path.join(_tmpdir, "test_import.db")
|
||||
|
||||
os.environ["FORCE_THIN_MODE"] = "true"
|
||||
os.environ["SQLITE_DB_PATH"] = _sqlite_path
|
||||
os.environ["ORACLE_DSN"] = "dummy"
|
||||
os.environ["ORACLE_USER"] = "dummy"
|
||||
os.environ["ORACLE_PASSWORD"] = "dummy"
|
||||
os.environ.setdefault("JSON_OUTPUT_DIR", _tmpdir)
|
||||
|
||||
# Add api/ to path so we can import app
|
||||
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
if _api_dir not in sys.path:
|
||||
sys.path.insert(0, _api_dir)
|
||||
|
||||
|
||||
# -------------------------------------------------------
|
||||
# Section 1: Module Import Checks
|
||||
# -------------------------------------------------------
|
||||
|
||||
MODULES = [
|
||||
"app.config",
|
||||
"app.database",
|
||||
"app.main",
|
||||
"app.routers.health",
|
||||
"app.routers.dashboard",
|
||||
"app.routers.mappings",
|
||||
"app.routers.sync",
|
||||
"app.routers.validation",
|
||||
"app.routers.articles",
|
||||
"app.services.sqlite_service",
|
||||
"app.services.scheduler_service",
|
||||
"app.services.mapping_service",
|
||||
"app.services.article_service",
|
||||
"app.services.validation_service",
|
||||
"app.services.import_service",
|
||||
"app.services.sync_service",
|
||||
"app.services.order_reader",
|
||||
]
|
||||
|
||||
|
||||
@pytest.mark.parametrize("module_name", MODULES)
|
||||
def test_module_import(module_name):
|
||||
"""Each app module should import without errors."""
|
||||
__import__(module_name)
|
||||
|
||||
|
||||
# -------------------------------------------------------
|
||||
# Section 2: Route Tests via TestClient
|
||||
# -------------------------------------------------------
|
||||
|
||||
# (path, expected_status_codes, is_known_oracle_failure)
|
||||
GET_ROUTES = [
|
||||
("/health", [200], False),
|
||||
("/", [200, 500], False),
|
||||
("/missing-skus", [200, 500], False),
|
||||
("/mappings", [200, 500], False),
|
||||
("/logs", [200, 500], False),
|
||||
("/api/mappings", [200, 503], True),
|
||||
("/api/mappings/export-csv", [200, 503], True),
|
||||
("/api/mappings/csv-template", [200], False),
|
||||
("/api/sync/status", [200], False),
|
||||
("/api/sync/history", [200], False),
|
||||
("/api/sync/schedule", [200], False),
|
||||
("/api/validate/missing-skus", [200], False),
|
||||
("/api/validate/missing-skus?page=1&per_page=10", [200], False),
|
||||
("/api/sync/run/nonexistent/log", [200, 404], False),
|
||||
("/api/articles/search?q=ab", [200, 503], True),
|
||||
("/settings", [200, 500], False),
|
||||
]
|
||||
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def client():
|
||||
"""Create a TestClient with lifespan for all route tests."""
|
||||
from fastapi.testclient import TestClient
|
||||
from app.main import app
|
||||
|
||||
with TestClient(app, raise_server_exceptions=False) as c:
|
||||
yield c
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"path,expected_codes,is_oracle_route",
|
||||
GET_ROUTES,
|
||||
ids=[p for p, _, _ in GET_ROUTES],
|
||||
)
|
||||
def test_route(client, path, expected_codes, is_oracle_route):
|
||||
"""Each GET route should return an expected status code."""
|
||||
resp = client.get(path)
|
||||
assert resp.status_code in expected_codes, (
|
||||
f"GET {path} returned {resp.status_code}, expected one of {expected_codes}. "
|
||||
f"Body: {resp.text[:300]}"
|
||||
)
|
||||
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@@ -1,391 +0,0 @@
|
||||
"""
|
||||
CUI Validation Tests
|
||||
====================
|
||||
Tests for Romanian CUI sanitization, checksum validation, and OCR typo correction.
|
||||
|
||||
Run:
|
||||
cd api && python -m pytest tests/test_cui_validation.py -v
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
|
||||
import pytest
|
||||
|
||||
pytestmark = pytest.mark.unit
|
||||
|
||||
# --- Set env vars BEFORE any app import ---
|
||||
_tmpdir = tempfile.mkdtemp()
|
||||
os.environ["FORCE_THIN_MODE"] = "true"
|
||||
os.environ["SQLITE_DB_PATH"] = os.path.join(_tmpdir, "test_cui.db")
|
||||
os.environ["ORACLE_DSN"] = "dummy"
|
||||
os.environ["ORACLE_USER"] = "dummy"
|
||||
os.environ["ORACLE_PASSWORD"] = "dummy"
|
||||
os.environ["JSON_OUTPUT_DIR"] = _tmpdir
|
||||
|
||||
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
if _api_dir not in sys.path:
|
||||
sys.path.insert(0, _api_dir)
|
||||
|
||||
from unittest.mock import AsyncMock, patch, MagicMock
|
||||
|
||||
from app.services.anaf_service import (
|
||||
strip_ro_prefix,
|
||||
validate_cui,
|
||||
validate_cui_checksum,
|
||||
sanitize_cui,
|
||||
_call_anaf_api,
|
||||
check_vat_status_batch,
|
||||
)
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# strip_ro_prefix
|
||||
# ===========================================================================
|
||||
|
||||
class TestStripRoPrefix:
|
||||
def test_basic_ro_prefix(self):
|
||||
assert strip_ro_prefix("RO15134434") == "15134434"
|
||||
|
||||
def test_ro_with_space(self):
|
||||
assert strip_ro_prefix("RO 15134434") == "15134434"
|
||||
|
||||
def test_lowercase_ro(self):
|
||||
assert strip_ro_prefix("ro15134434") == "15134434"
|
||||
|
||||
def test_no_prefix(self):
|
||||
assert strip_ro_prefix("15134434") == "15134434"
|
||||
|
||||
def test_whitespace(self):
|
||||
assert strip_ro_prefix(" RO15134434 ") == "15134434"
|
||||
|
||||
def test_empty(self):
|
||||
assert strip_ro_prefix("") == ""
|
||||
|
||||
def test_none(self):
|
||||
assert strip_ro_prefix(None) == ""
|
||||
|
||||
def test_ocr_fix_O_to_0(self):
|
||||
"""Letter O in CUI should be converted to digit 0."""
|
||||
assert strip_ro_prefix("49O33O51") == "49033051"
|
||||
|
||||
def test_ocr_fix_I_to_1(self):
|
||||
"""Letter I in CUI should be converted to digit 1."""
|
||||
assert strip_ro_prefix("I5134434") == "15134434"
|
||||
|
||||
def test_ocr_fix_L_to_1(self):
|
||||
"""Letter L in CUI should be converted to digit 1."""
|
||||
assert strip_ro_prefix("L5134434") == "15134434"
|
||||
|
||||
def test_ocr_fix_combined_with_ro(self):
|
||||
"""RO prefix removed first, then OCR fix on remaining."""
|
||||
assert strip_ro_prefix("RO49O33O51") == "49033051"
|
||||
|
||||
def test_ro_prefix_not_affected_by_ocr(self):
|
||||
"""The 'RO' prefix is removed before OCR translation."""
|
||||
assert strip_ro_prefix("Ro 50519951") == "50519951"
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# validate_cui
|
||||
# ===========================================================================
|
||||
|
||||
class TestValidateCui:
|
||||
def test_valid_short(self):
|
||||
assert validate_cui("12") is True
|
||||
|
||||
def test_valid_10_digits(self):
|
||||
assert validate_cui("1234567890") is True
|
||||
|
||||
def test_too_short(self):
|
||||
assert validate_cui("1") is False
|
||||
|
||||
def test_too_long(self):
|
||||
assert validate_cui("12345678901") is False
|
||||
|
||||
def test_non_digits(self):
|
||||
assert validate_cui("49O33O51") is False
|
||||
|
||||
def test_empty(self):
|
||||
assert validate_cui("") is False
|
||||
|
||||
def test_none(self):
|
||||
assert validate_cui(None) is False
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# validate_cui_checksum
|
||||
# ===========================================================================
|
||||
|
||||
class TestValidateCuiChecksum:
|
||||
"""Test Romanian CUI check digit algorithm (key 753217532)."""
|
||||
|
||||
@pytest.mark.parametrize("cui,name", [
|
||||
("49033051", "MATTEO&OANA CAFFE 2022 SRL"),
|
||||
("15134434", "AUTOKLASS CENTER SRL"),
|
||||
("44741316", "OLLY'S HOUSE IECEA MARE SRL"),
|
||||
("45484539", "S OFFICE VENDING SRL"),
|
||||
("8722253", "VENUS ALIMCOM SRL"),
|
||||
("3738836", "AUSTRAL TRADE SRL"),
|
||||
("37567030", "CONVER URBAN SRL"),
|
||||
("45350367", "TURCHI GARAGE SRL"),
|
||||
("3601803", "known company"),
|
||||
("18189442", "known company"),
|
||||
("45093662", "CARTON PREMIUM SRL"),
|
||||
("50519951", "SERCO CAFFE COMPANY"),
|
||||
])
|
||||
def test_valid_cuis(self, cui, name):
|
||||
assert validate_cui_checksum(cui) is True, f"CUI {cui} ({name}) should pass checksum"
|
||||
|
||||
@pytest.mark.parametrize("cui", [
|
||||
"49033052", # last digit wrong (should be 1)
|
||||
"15134435", # last digit wrong
|
||||
"44741310", # last digit wrong
|
||||
])
|
||||
def test_invalid_checksum(self, cui):
|
||||
assert validate_cui_checksum(cui) is False
|
||||
|
||||
def test_invalid_format_rejected(self):
|
||||
assert validate_cui_checksum("ABC") is False
|
||||
assert validate_cui_checksum("") is False
|
||||
assert validate_cui_checksum("1") is False
|
||||
|
||||
def test_checksum_result_10_becomes_0(self):
|
||||
"""When (sum*10)%11 == 10, check digit should be 0.
|
||||
|
||||
CUI 14186770: body=1418677, padded=001418677,
|
||||
sum=0+0+3+8+1+42+35+21+14=124, 1240%11=10 → check=0.
|
||||
"""
|
||||
assert validate_cui_checksum("14186770") is True
|
||||
# Wrong check digit for same body
|
||||
assert validate_cui_checksum("14186771") is False
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# sanitize_cui
|
||||
# ===========================================================================
|
||||
|
||||
class TestSanitizeCui:
|
||||
def test_clean_cui_no_warning(self):
|
||||
bare, warning = sanitize_cui("RO15134434")
|
||||
assert bare == "15134434"
|
||||
assert warning is None
|
||||
|
||||
def test_ocr_typo_fixed_no_warning(self):
|
||||
"""Letter O→0 fix results in valid checksum, no warning."""
|
||||
bare, warning = sanitize_cui("49O33O51")
|
||||
assert bare == "49033051"
|
||||
assert warning is None
|
||||
|
||||
def test_ocr_typo_with_ro_prefix(self):
|
||||
bare, warning = sanitize_cui("RO49O33O51")
|
||||
assert bare == "49033051"
|
||||
assert warning is None
|
||||
|
||||
def test_valid_format_bad_checksum_warns(self):
|
||||
bare, warning = sanitize_cui("49033052") # wrong check digit
|
||||
assert bare == "49033052"
|
||||
assert warning is not None
|
||||
assert "nu trece verificarea" in warning
|
||||
|
||||
def test_invalid_format_warns(self):
|
||||
bare, warning = sanitize_cui("ABCDEF")
|
||||
assert warning is not None
|
||||
assert "caractere invalide" in warning
|
||||
|
||||
def test_empty_no_warning(self):
|
||||
bare, warning = sanitize_cui("")
|
||||
assert bare == ""
|
||||
assert warning is None
|
||||
|
||||
def test_bare_cui_no_prefix(self):
|
||||
bare, warning = sanitize_cui("45484539")
|
||||
assert bare == "45484539"
|
||||
assert warning is None
|
||||
|
||||
def test_with_spaces(self):
|
||||
bare, warning = sanitize_cui(" RO 8722253 ")
|
||||
assert bare == "8722253"
|
||||
assert warning is None
|
||||
|
||||
def test_ro_space_format(self):
|
||||
"""CUI like 'Ro 50519951' from real GoMag data."""
|
||||
bare, warning = sanitize_cui("Ro 50519951")
|
||||
assert bare == "50519951"
|
||||
assert warning is None
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# _call_anaf_api — notFound parsing + error handling
|
||||
# ===========================================================================
|
||||
|
||||
class TestCallAnafApi:
|
||||
"""Tests for ANAF API response parsing and error handling."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_notfound_as_integers(self):
|
||||
"""ANAF notFound items are plain integers (CUI values), not dicts."""
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_response.json.return_value = {
|
||||
"found": [],
|
||||
"notFound": [12345678, 87654321],
|
||||
}
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
results = await _call_anaf_api([{"cui": 12345678, "data": "2026-04-07"}])
|
||||
|
||||
assert "12345678" in results
|
||||
assert "87654321" in results
|
||||
assert results["12345678"]["scpTVA"] is None
|
||||
assert results["87654321"]["scpTVA"] is None
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_notfound_as_dicts_still_works(self):
|
||||
"""Backward compat: if ANAF ever returns notFound as dicts, still parse them."""
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_response.json.return_value = {
|
||||
"found": [],
|
||||
"notFound": [{"date_generale": {"cui": 99999999}}],
|
||||
}
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
results = await _call_anaf_api([{"cui": 99999999, "data": "2026-04-07"}])
|
||||
|
||||
assert "99999999" in results
|
||||
assert results["99999999"]["scpTVA"] is None
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_found_items_parsed(self):
|
||||
"""Normal found items are parsed correctly."""
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_response.json.return_value = {
|
||||
"found": [{
|
||||
"date_generale": {"cui": 15134434, "denumire": "AUTOKLASS CENTER SRL"},
|
||||
"inregistrare_scop_Tva": {"scpTVA": True},
|
||||
}],
|
||||
"notFound": [],
|
||||
}
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
results = await _call_anaf_api([{"cui": 15134434, "data": "2026-04-07"}])
|
||||
|
||||
assert results["15134434"]["scpTVA"] is True
|
||||
assert results["15134434"]["denumire_anaf"] == "AUTOKLASS CENTER SRL"
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_4xx_error_no_retry(self):
|
||||
"""4xx client errors (like 404) should not retry."""
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 404
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
log_messages = []
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
results = await _call_anaf_api(
|
||||
[{"cui": 12345678, "data": "2026-04-07"}],
|
||||
log_fn=lambda msg: log_messages.append(msg),
|
||||
)
|
||||
|
||||
assert results == {}
|
||||
# Should only call once (no retry for 4xx)
|
||||
assert mock_client.post.call_count == 1
|
||||
assert any("404" in msg for msg in log_messages)
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_log_fn_receives_errors(self):
|
||||
"""log_fn callback receives error messages for UI display."""
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 500
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
log_messages = []
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
with patch("asyncio.sleep", new_callable=AsyncMock):
|
||||
results = await _call_anaf_api(
|
||||
[{"cui": 12345678, "data": "2026-04-07"}],
|
||||
log_fn=lambda msg: log_messages.append(msg),
|
||||
)
|
||||
|
||||
assert results == {}
|
||||
assert len(log_messages) >= 1
|
||||
|
||||
|
||||
class TestCheckVatStatusBatch:
|
||||
"""Tests for check_vat_status_batch with log_fn propagation."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_log_fn_passed_through(self):
|
||||
"""log_fn is forwarded from check_vat_status_batch to _call_anaf_api."""
|
||||
log_messages = []
|
||||
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_response.json.return_value = {"found": [], "notFound": [12345678]}
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
results = await check_vat_status_batch(
|
||||
["12345678"], log_fn=lambda msg: log_messages.append(msg),
|
||||
)
|
||||
|
||||
assert "12345678" in results
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_empty_list_returns_empty(self):
|
||||
assert await check_vat_status_batch([]) == {}
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_non_digit_cuis_filtered(self):
|
||||
"""CUIs that aren't pure digits are filtered out before API call."""
|
||||
mock_response = MagicMock()
|
||||
mock_response.status_code = 200
|
||||
mock_response.raise_for_status = MagicMock()
|
||||
mock_response.json.return_value = {"found": [], "notFound": []}
|
||||
|
||||
mock_client = AsyncMock()
|
||||
mock_client.post.return_value = mock_response
|
||||
mock_client.__aenter__ = AsyncMock(return_value=mock_client)
|
||||
mock_client.__aexit__ = AsyncMock(return_value=False)
|
||||
|
||||
with patch("app.services.anaf_service.httpx.AsyncClient", return_value=mock_client):
|
||||
results = await check_vat_status_batch(["ABC", "12345678"])
|
||||
|
||||
# Only the digit CUI should be in the body
|
||||
call_body = mock_client.post.call_args[1]["json"]
|
||||
assert len(call_body) == 1
|
||||
assert call_body[0]["cui"] == 12345678
|
||||
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user