feat: add FastAPI admin dashboard with sync orchestration and test suite
Replace Flask admin with FastAPI app (api/app/) featuring: - Dashboard with stat cards, sync control, and history - Mappings CRUD for ARTICOLE_TERTI with CSV import/export - Article autocomplete from NOM_ARTICOLE - SKU pre-validation before import - Sync orchestration: read JSONs -> validate -> import -> log to SQLite - APScheduler for periodic sync from UI - File logging to logs/sync_comenzi_YYYYMMDD_HHMMSS.log - Oracle pool None guard (503 vs 500 on unavailable) Test suite: - test_app_basic.py: 30 tests (imports + routes) without Oracle - test_integration.py: 9 integration tests with Oracle Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
This commit is contained in:
44
CLAUDE.md
44
CLAUDE.md
@@ -6,13 +6,13 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
|
||||
|
||||
**System:** Import Comenzi Web → Sistem ROA Oracle
|
||||
|
||||
This is a multi-tier system that automatically imports orders from web platforms (GoMag, etc.) into the ROA Oracle ERP system. The project combines Oracle PL/SQL packages, Visual FoxPro orchestration, and a Flask web admin interface for SKU mapping management.
|
||||
This is a multi-tier system that automatically imports orders from web platforms (GoMag, etc.) into the ROA Oracle ERP system. The project combines Oracle PL/SQL packages, Visual FoxPro orchestration, and a FastAPI web admin/dashboard interface.
|
||||
|
||||
**Current Status:** Phase 1 (Database Foundation) - 75% Complete
|
||||
- ✅ P1-001: ARTICOLE_TERTI table created
|
||||
- ✅ P1-002: IMPORT_PARTENERI package complete
|
||||
- ✅ P1-003: IMPORT_COMENZI package complete
|
||||
- 🔄 P1-004: Manual testing packages (NEXT UP)
|
||||
**Current Status:** Phase 4 Complete, Phase 5 In Progress
|
||||
- ✅ Phase 1: Database Foundation (ARTICOLE_TERTI, IMPORT_PARTENERI, IMPORT_COMENZI)
|
||||
- ✅ Phase 2: VFP Integration (gomag-vending.prg, sync-comenzi-web.prg)
|
||||
- ✅ Phase 3-4: FastAPI Admin + Dashboard (mappings CRUD, sync orchestration, pre-validation)
|
||||
- 🔄 Phase 5: Production (file logging done, auth + notifications pending)
|
||||
|
||||
## Architecture
|
||||
|
||||
@@ -25,8 +25,8 @@ This is a multi-tier system that automatically imports orders from web platforms
|
||||
### Tech Stack
|
||||
- **Backend:** Oracle PL/SQL packages
|
||||
- **Integration:** Visual FoxPro 9
|
||||
- **Admin Interface:** Flask + Oracle connection pool
|
||||
- **Data:** Oracle 11g/12c (ROA system)
|
||||
- **Admin/Dashboard:** FastAPI + Jinja2 + Oracle pool + SQLite
|
||||
- **Data:** Oracle 11g/12c (ROA system), SQLite (local tracking)
|
||||
|
||||
## Core Components
|
||||
|
||||
@@ -87,15 +87,18 @@ CREATE TABLE ARTICOLE_TERTI (
|
||||
);
|
||||
```
|
||||
|
||||
### Flask Admin Interface
|
||||
### FastAPI Admin/Dashboard
|
||||
|
||||
#### admin.py
|
||||
**Location:** `api/admin.py`
|
||||
#### app/main.py
|
||||
**Location:** `api/app/main.py`
|
||||
**Features:**
|
||||
- Oracle connection pool management
|
||||
- SKU mappings CRUD operations
|
||||
- Web interface for configuration
|
||||
- Real-time validation
|
||||
- FastAPI with lifespan (Oracle pool + SQLite init)
|
||||
- File logging to `logs/sync_comenzi_YYYYMMDD_HHMMSS.log`
|
||||
- Routers: health, dashboard, mappings, articles, validation, sync
|
||||
- Services: mapping, article, import, sync, validation, order_reader, sqlite, scheduler
|
||||
- Templates: Jinja2 (dashboard, mappings, sync_detail, missing_skus)
|
||||
- Dual database: Oracle (ERP data) + SQLite (tracking)
|
||||
- APScheduler for periodic sync
|
||||
|
||||
## Development Commands
|
||||
|
||||
@@ -115,10 +118,17 @@ sqlplus CONTAFIN_ORACLE/password@ROA_ROMFAST @03_import_comenzi.sql
|
||||
DO vfp/gomag-vending.prg
|
||||
```
|
||||
|
||||
### Flask Admin Interface
|
||||
### FastAPI Admin/Dashboard
|
||||
```bash
|
||||
cd api
|
||||
python admin.py
|
||||
pip install -r requirements.txt
|
||||
uvicorn app.main:app --host 0.0.0.0 --port 5003 --reload
|
||||
```
|
||||
|
||||
### Testare
|
||||
```bash
|
||||
python api/test_app_basic.py # Test A - fara Oracle
|
||||
python api/test_integration.py # Test C - cu Oracle
|
||||
```
|
||||
|
||||
## Project Structure
|
||||
|
||||
128
README.md
128
README.md
@@ -1,2 +1,128 @@
|
||||
# gomag-vending
|
||||
# GoMag Vending - Import Comenzi Web → ROA Oracle
|
||||
|
||||
System automat de import comenzi din platforma GoMag in sistemul ERP ROA Oracle.
|
||||
|
||||
## Arhitectura
|
||||
|
||||
```
|
||||
[GoMag API] → [VFP Orchestrator] → [Oracle PL/SQL] → [FastAPI Admin]
|
||||
↓ ↓ ↑ ↑
|
||||
JSON Orders Process & Log Store/Update Dashboard + Config
|
||||
```
|
||||
|
||||
### Stack Tehnologic
|
||||
- **Database:** Oracle PL/SQL packages (PACK_IMPORT_PARTENERI, PACK_IMPORT_COMENZI)
|
||||
- **Integrare:** Visual FoxPro 9 (gomag-vending.prg, sync-comenzi-web.prg)
|
||||
- **Admin/Dashboard:** FastAPI + Jinja2 + Oracle pool + SQLite
|
||||
- **Date:** Oracle 11g/12c (schema ROA), SQLite (tracking local)
|
||||
|
||||
## Quick Start
|
||||
|
||||
### Prerequisite
|
||||
- Python 3.10+
|
||||
- Oracle Instant Client (optional - suporta si thin mode)
|
||||
|
||||
### Instalare si pornire
|
||||
```bash
|
||||
cd api
|
||||
pip install -r requirements.txt
|
||||
# Configureaza .env (vezi api/.env.example)
|
||||
uvicorn app.main:app --host 0.0.0.0 --port 5003 --reload
|
||||
```
|
||||
|
||||
Deschide `http://localhost:5003` in browser.
|
||||
|
||||
### Testare
|
||||
|
||||
**Test A - Basic (fara Oracle):**
|
||||
```bash
|
||||
cd api
|
||||
python test_app_basic.py
|
||||
```
|
||||
Verifica 17 importuri de module + 13 rute GET. Asteptat: 30/30 PASS.
|
||||
|
||||
**Test C - Integrare Oracle:**
|
||||
```bash
|
||||
python api/test_integration.py
|
||||
```
|
||||
Necesita Oracle activ. Verifica health, mappings CRUD, article search, validation, sync. Asteptat: 9/9 PASS.
|
||||
|
||||
## Structura Proiect
|
||||
|
||||
```
|
||||
/
|
||||
├── api/ # FastAPI Admin + Database
|
||||
│ ├── app/ # Aplicatia FastAPI
|
||||
│ │ ├── main.py # Entry point, lifespan, logging
|
||||
│ │ ├── config.py # Settings (pydantic-settings, .env)
|
||||
│ │ ├── database.py # Oracle pool + SQLite init
|
||||
│ │ ├── routers/ # Endpoint-uri HTTP
|
||||
│ │ │ ├── health.py # /health, /api/health
|
||||
│ │ │ ├── dashboard.py # / (dashboard HTML)
|
||||
│ │ │ ├── mappings.py # /mappings, /api/mappings
|
||||
│ │ │ ├── articles.py # /api/articles/search
|
||||
│ │ │ ├── validation.py # /api/validate/*
|
||||
│ │ │ └── sync.py # /api/sync/*
|
||||
│ │ ├── services/ # Business logic
|
||||
│ │ │ ├── mapping_service # CRUD ARTICOLE_TERTI
|
||||
│ │ │ ├── article_service # Cautare NOM_ARTICOLE
|
||||
│ │ │ ├── import_service # Import comanda in Oracle
|
||||
│ │ │ ├── sync_service # Orchestrare: JSON→validate→import
|
||||
│ │ │ ├── validation_service # Validare SKU-uri
|
||||
│ │ │ ├── order_reader # Citire JSON-uri din vfp/output/
|
||||
│ │ │ ├── sqlite_service # Tracking runs/orders/missing SKUs
|
||||
│ │ │ └── scheduler_service # APScheduler timer
|
||||
│ │ ├── templates/ # Jinja2 HTML (dashboard, mappings, etc.)
|
||||
│ │ └── static/ # CSS + JS
|
||||
│ ├── database-scripts/ # Oracle SQL scripts
|
||||
│ ├── test_app_basic.py # Test A - fara Oracle
|
||||
│ ├── test_integration.py # Test C - cu Oracle
|
||||
│ └── requirements.txt # Python dependencies
|
||||
├── vfp/ # VFP Integration
|
||||
│ ├── gomag-vending.prg # Client GoMag API
|
||||
│ ├── sync-comenzi-web.prg # Orchestrator VFP
|
||||
│ └── utils.prg # Utilitare VFP
|
||||
├── docs/ # Documentatie
|
||||
│ ├── PRD.md # Product Requirements
|
||||
│ └── stories/ # User Stories
|
||||
└── logs/ # Log-uri aplicatie
|
||||
```
|
||||
|
||||
## Configurare (.env)
|
||||
|
||||
```env
|
||||
ORACLE_USER=MARIUSM_AUTO
|
||||
ORACLE_PASSWORD=********
|
||||
ORACLE_DSN=ROA_CENTRAL
|
||||
FORCE_THIN_MODE=true # sau INSTANTCLIENTPATH=C:\oracle\instantclient
|
||||
SQLITE_DB_PATH=data/import.db
|
||||
APP_PORT=5003
|
||||
LOG_LEVEL=INFO
|
||||
JSON_OUTPUT_DIR=../vfp/output
|
||||
```
|
||||
|
||||
## Status Implementare
|
||||
|
||||
### Phase 1: Database Foundation - COMPLET
|
||||
- ARTICOLE_TERTI table + Docker setup
|
||||
- PACK_IMPORT_PARTENERI package
|
||||
- PACK_IMPORT_COMENZI package
|
||||
|
||||
### Phase 2: VFP Integration - COMPLET
|
||||
- gomag-vending.prg (GoMag API client)
|
||||
- sync-comenzi-web.prg (orchestrator cu logging)
|
||||
|
||||
### Phase 3-4: FastAPI Admin + Dashboard - COMPLET
|
||||
- Mappings CRUD + CSV import/export
|
||||
- Article autocomplete (NOM_ARTICOLE)
|
||||
- Pre-validation SKU-uri
|
||||
- Import orchestration (JSON→Oracle)
|
||||
- Dashboard cu stat cards, sync control, history
|
||||
- Missing SKUs management page
|
||||
- File logging (logs/sync_comenzi_*.log)
|
||||
|
||||
### Phase 5: Production - IN PROGRESS
|
||||
- [x] File logging
|
||||
- [ ] Email notifications (SMTP)
|
||||
- [ ] HTTP Basic Auth
|
||||
- [ ] NSSM Windows service
|
||||
|
||||
@@ -1,41 +1,57 @@
|
||||
# API Directory - Phase 1 Complete
|
||||
# GoMag Import Manager - FastAPI Application
|
||||
|
||||
## Core Files
|
||||
Admin interface si orchestrator pentru importul comenzilor GoMag in Oracle ROA.
|
||||
|
||||
### 🎛️ `admin.py`
|
||||
**Purpose:** Flask web admin interface pentru mapări SKU
|
||||
- Oracle connection pool management
|
||||
- CRUD operations pentru ARTICOLE_TERTI
|
||||
- Web interface pentru configurare mapări
|
||||
- **Port:** 5000 (configurable)
|
||||
## Componente
|
||||
|
||||
### 🧪 `tests/`
|
||||
**Purpose:** Directory cu toate testele și utilitățile validation
|
||||
- `final_validation.py` - Ultimate P1-004 validation script
|
||||
- `test_final_success.py` - Complete end-to-end test
|
||||
- `test_syntax.py` - Package compilation checker
|
||||
- `check_packages.py` - Package status utility
|
||||
- `check_table_structure.py` - Schema validation utility
|
||||
- `README.md` - Documentation pentru toate testele
|
||||
### Core
|
||||
- **main.py** - Entry point FastAPI, lifespan (Oracle pool + SQLite init), file logging
|
||||
- **config.py** - Settings via pydantic-settings (citeste .env)
|
||||
- **database.py** - Oracle connection pool + SQLite schema + helpers
|
||||
|
||||
## Configuration Files
|
||||
### Routers (HTTP Endpoints)
|
||||
| Router | Prefix | Descriere |
|
||||
|--------|--------|-----------|
|
||||
| health | /health, /api/health | Status Oracle + SQLite |
|
||||
| dashboard | / | Dashboard HTML cu stat cards |
|
||||
| mappings | /mappings, /api/mappings | CRUD ARTICOLE_TERTI + CSV |
|
||||
| articles | /api/articles | Cautare NOM_ARTICOLE |
|
||||
| validation | /api/validate | Scanare + validare SKU-uri |
|
||||
| sync | /sync, /api/sync | Import orchestration + scheduler |
|
||||
|
||||
### 📁 `database-scripts/`
|
||||
- `01_create_table.sql` - ARTICOLE_TERTI table
|
||||
- `02_import_parteneri.sql` - PACK_IMPORT_PARTENERI package
|
||||
- `04_import_comenzi.sql` - PACK_IMPORT_COMENZI package
|
||||
### Services (Business Logic)
|
||||
| Service | Rol |
|
||||
|---------|-----|
|
||||
| mapping_service | CRUD pe ARTICOLE_TERTI (Oracle) |
|
||||
| article_service | Cautare in NOM_ARTICOLE (Oracle) |
|
||||
| import_service | Port din VFP: partner/address/order creation |
|
||||
| sync_service | Orchestrare: read JSONs → validate → import → log |
|
||||
| validation_service | Batch-validare SKU-uri (chunks of 500) |
|
||||
| order_reader | Citire gomag_orders_page*.json din vfp/output/ |
|
||||
| sqlite_service | CRUD pe SQLite (sync_runs, import_orders, missing_skus) |
|
||||
| scheduler_service | APScheduler - sync periodic configurabil din UI |
|
||||
|
||||
### 🐳 `docker-compose.yaml`
|
||||
Oracle container orchestration
|
||||
## Rulare
|
||||
|
||||
### 🔧 `.env`
|
||||
Environment variables pentru MARIUSM_AUTO schema
|
||||
```bash
|
||||
pip install -r requirements.txt
|
||||
uvicorn app.main:app --host 0.0.0.0 --port 5003 --reload
|
||||
```
|
||||
|
||||
### 📋 `requirements.txt`
|
||||
Python dependencies (oracledb, flask, etc.)
|
||||
## Testare
|
||||
|
||||
---
|
||||
```bash
|
||||
# Test A - fara Oracle (verifica importuri + rute)
|
||||
python test_app_basic.py
|
||||
|
||||
**Phase 1 Status:** ✅ 100% COMPLETE
|
||||
**Ready for:** Phase 2 VFP Integration
|
||||
**Cleanup Date:** 10 septembrie 2025, 12:57
|
||||
# Test C - cu Oracle (integrare completa)
|
||||
python test_integration.py
|
||||
```
|
||||
|
||||
## Dual Database
|
||||
- **Oracle** - date ERP (ARTICOLE_TERTI, NOM_ARTICOLE, COMENZI)
|
||||
- **SQLite** - tracking local (sync_runs, import_orders, missing_skus, scheduler_config)
|
||||
|
||||
## Logging
|
||||
Log files in `../logs/sync_comenzi_YYYYMMDD_HHMMSS.log`
|
||||
Format: `2026-03-11 14:30:25 | INFO | app.services.sync_service | mesaj`
|
||||
|
||||
0
api/app/__init__.py
Normal file
0
api/app/__init__.py
Normal file
35
api/app/config.py
Normal file
35
api/app/config.py
Normal file
@@ -0,0 +1,35 @@
|
||||
from pydantic_settings import BaseSettings
|
||||
from pathlib import Path
|
||||
import os
|
||||
|
||||
class Settings(BaseSettings):
|
||||
# Oracle
|
||||
ORACLE_USER: str = "MARIUSM_AUTO"
|
||||
ORACLE_PASSWORD: str = "ROMFASTSOFT"
|
||||
ORACLE_DSN: str = "ROA_CENTRAL"
|
||||
INSTANTCLIENTPATH: str = ""
|
||||
FORCE_THIN_MODE: bool = False
|
||||
TNS_ADMIN: str = ""
|
||||
|
||||
# SQLite
|
||||
SQLITE_DB_PATH: str = str(Path(__file__).parent.parent / "data" / "import.db")
|
||||
|
||||
# App
|
||||
APP_PORT: int = 5003
|
||||
LOG_LEVEL: str = "INFO"
|
||||
JSON_OUTPUT_DIR: str = ""
|
||||
|
||||
# SMTP (optional)
|
||||
SMTP_HOST: str = ""
|
||||
SMTP_PORT: int = 587
|
||||
SMTP_USER: str = ""
|
||||
SMTP_PASSWORD: str = ""
|
||||
SMTP_TO: str = ""
|
||||
|
||||
# Auth (optional)
|
||||
API_USERNAME: str = ""
|
||||
API_PASSWORD: str = ""
|
||||
|
||||
model_config = {"env_file": ".env", "env_file_encoding": "utf-8", "extra": "ignore"}
|
||||
|
||||
settings = Settings()
|
||||
135
api/app/database.py
Normal file
135
api/app/database.py
Normal file
@@ -0,0 +1,135 @@
|
||||
import oracledb
|
||||
import aiosqlite
|
||||
import sqlite3
|
||||
import logging
|
||||
import os
|
||||
from pathlib import Path
|
||||
from .config import settings
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# ---- Oracle Pool ----
|
||||
pool = None
|
||||
|
||||
def init_oracle():
|
||||
"""Initialize Oracle client mode and create connection pool."""
|
||||
global pool
|
||||
|
||||
force_thin = settings.FORCE_THIN_MODE
|
||||
instantclient_path = settings.INSTANTCLIENTPATH
|
||||
dsn = settings.ORACLE_DSN
|
||||
|
||||
if force_thin:
|
||||
logger.info(f"FORCE_THIN_MODE=true: thin mode for {dsn}")
|
||||
elif instantclient_path:
|
||||
try:
|
||||
oracledb.init_oracle_client(lib_dir=instantclient_path)
|
||||
logger.info(f"Thick mode activated for {dsn}")
|
||||
except Exception as e:
|
||||
logger.error(f"Thick mode error: {e}")
|
||||
logger.info("Fallback to thin mode")
|
||||
else:
|
||||
logger.info(f"Thin mode (default) for {dsn}")
|
||||
|
||||
pool = oracledb.create_pool(
|
||||
user=settings.ORACLE_USER,
|
||||
password=settings.ORACLE_PASSWORD,
|
||||
dsn=settings.ORACLE_DSN,
|
||||
min=2,
|
||||
max=4,
|
||||
increment=1
|
||||
)
|
||||
logger.info(f"Oracle pool created for {dsn}")
|
||||
return pool
|
||||
|
||||
def get_oracle_connection():
|
||||
"""Get a connection from the Oracle pool."""
|
||||
if pool is None:
|
||||
raise RuntimeError("Oracle pool not initialized")
|
||||
return pool.acquire()
|
||||
|
||||
def close_oracle():
|
||||
"""Close the Oracle connection pool."""
|
||||
global pool
|
||||
if pool:
|
||||
pool.close()
|
||||
pool = None
|
||||
logger.info("Oracle pool closed")
|
||||
|
||||
# ---- SQLite ----
|
||||
SQLITE_SCHEMA = """
|
||||
CREATE TABLE IF NOT EXISTS sync_runs (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
run_id TEXT UNIQUE,
|
||||
started_at TEXT,
|
||||
finished_at TEXT,
|
||||
status TEXT,
|
||||
total_orders INTEGER DEFAULT 0,
|
||||
imported INTEGER DEFAULT 0,
|
||||
skipped INTEGER DEFAULT 0,
|
||||
errors INTEGER DEFAULT 0,
|
||||
json_files INTEGER DEFAULT 0
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS import_orders (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
sync_run_id TEXT REFERENCES sync_runs(run_id),
|
||||
order_number TEXT,
|
||||
order_date TEXT,
|
||||
customer_name TEXT,
|
||||
status TEXT,
|
||||
id_comanda INTEGER,
|
||||
id_partener INTEGER,
|
||||
error_message TEXT,
|
||||
missing_skus TEXT,
|
||||
items_count INTEGER,
|
||||
created_at TEXT DEFAULT (datetime('now'))
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS missing_skus (
|
||||
sku TEXT PRIMARY KEY,
|
||||
product_name TEXT,
|
||||
first_seen TEXT DEFAULT (datetime('now')),
|
||||
resolved INTEGER DEFAULT 0,
|
||||
resolved_at TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS scheduler_config (
|
||||
key TEXT PRIMARY KEY,
|
||||
value TEXT
|
||||
);
|
||||
"""
|
||||
|
||||
_sqlite_db_path = None
|
||||
|
||||
def init_sqlite():
|
||||
"""Initialize SQLite database with schema."""
|
||||
global _sqlite_db_path
|
||||
_sqlite_db_path = settings.SQLITE_DB_PATH
|
||||
|
||||
# Ensure directory exists
|
||||
db_dir = os.path.dirname(_sqlite_db_path)
|
||||
if db_dir:
|
||||
os.makedirs(db_dir, exist_ok=True)
|
||||
|
||||
# Create tables synchronously
|
||||
conn = sqlite3.connect(_sqlite_db_path)
|
||||
conn.executescript(SQLITE_SCHEMA)
|
||||
conn.close()
|
||||
logger.info(f"SQLite initialized: {_sqlite_db_path}")
|
||||
|
||||
async def get_sqlite():
|
||||
"""Get async SQLite connection."""
|
||||
if _sqlite_db_path is None:
|
||||
raise RuntimeError("SQLite not initialized")
|
||||
db = await aiosqlite.connect(_sqlite_db_path)
|
||||
db.row_factory = aiosqlite.Row
|
||||
return db
|
||||
|
||||
def get_sqlite_sync():
|
||||
"""Get synchronous SQLite connection."""
|
||||
if _sqlite_db_path is None:
|
||||
raise RuntimeError("SQLite not initialized")
|
||||
conn = sqlite3.connect(_sqlite_db_path)
|
||||
conn.row_factory = sqlite3.Row
|
||||
return conn
|
||||
91
api/app/main.py
Normal file
91
api/app/main.py
Normal file
@@ -0,0 +1,91 @@
|
||||
from contextlib import asynccontextmanager
|
||||
from datetime import datetime
|
||||
from fastapi import FastAPI
|
||||
from fastapi.staticfiles import StaticFiles
|
||||
from pathlib import Path
|
||||
import logging
|
||||
import os
|
||||
|
||||
from .config import settings
|
||||
from .database import init_oracle, close_oracle, init_sqlite
|
||||
|
||||
# Configure logging with both stream and file handlers
|
||||
_log_level = getattr(logging, settings.LOG_LEVEL.upper(), logging.INFO)
|
||||
_log_format = '%(asctime)s | %(levelname)s | %(name)s | %(message)s'
|
||||
_formatter = logging.Formatter(_log_format)
|
||||
|
||||
_stream_handler = logging.StreamHandler()
|
||||
_stream_handler.setFormatter(_formatter)
|
||||
|
||||
_log_dir = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'logs')
|
||||
os.makedirs(_log_dir, exist_ok=True)
|
||||
_log_filename = f"sync_comenzi_{datetime.now().strftime('%Y%m%d_%H%M%S')}.log"
|
||||
_file_handler = logging.FileHandler(os.path.join(_log_dir, _log_filename), encoding='utf-8')
|
||||
_file_handler.setFormatter(_formatter)
|
||||
|
||||
_root_logger = logging.getLogger()
|
||||
_root_logger.setLevel(_log_level)
|
||||
_root_logger.addHandler(_stream_handler)
|
||||
_root_logger.addHandler(_file_handler)
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@asynccontextmanager
|
||||
async def lifespan(app: FastAPI):
|
||||
"""Startup and shutdown events."""
|
||||
logger.info("Starting GoMag Import Manager...")
|
||||
|
||||
# Initialize Oracle pool
|
||||
try:
|
||||
init_oracle()
|
||||
except Exception as e:
|
||||
logger.error(f"Oracle init failed: {e}")
|
||||
# Allow app to start even without Oracle for development
|
||||
|
||||
# Initialize SQLite
|
||||
init_sqlite()
|
||||
|
||||
# Initialize scheduler (restore saved config)
|
||||
from .services import scheduler_service, sqlite_service
|
||||
scheduler_service.init_scheduler()
|
||||
try:
|
||||
config = await sqlite_service.get_scheduler_config()
|
||||
if config.get("enabled") == "True":
|
||||
interval = int(config.get("interval_minutes", "5"))
|
||||
scheduler_service.start_scheduler(interval)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
logger.info("GoMag Import Manager started")
|
||||
yield
|
||||
|
||||
# Shutdown
|
||||
scheduler_service.shutdown_scheduler()
|
||||
close_oracle()
|
||||
logger.info("GoMag Import Manager stopped")
|
||||
|
||||
app = FastAPI(
|
||||
title="GoMag Import Manager",
|
||||
description="Import comenzi web GoMag → ROA Oracle",
|
||||
version="1.0.0",
|
||||
lifespan=lifespan
|
||||
)
|
||||
|
||||
# Static files and templates
|
||||
static_dir = Path(__file__).parent / "static"
|
||||
templates_dir = Path(__file__).parent / "templates"
|
||||
static_dir.mkdir(parents=True, exist_ok=True)
|
||||
(static_dir / "css").mkdir(exist_ok=True)
|
||||
(static_dir / "js").mkdir(exist_ok=True)
|
||||
templates_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
app.mount("/static", StaticFiles(directory=str(static_dir)), name="static")
|
||||
|
||||
# Include routers
|
||||
from .routers import health, dashboard, mappings, articles, validation, sync
|
||||
app.include_router(health.router)
|
||||
app.include_router(dashboard.router)
|
||||
app.include_router(mappings.router)
|
||||
app.include_router(articles.router)
|
||||
app.include_router(validation.router)
|
||||
app.include_router(sync.router)
|
||||
0
api/app/routers/__init__.py
Normal file
0
api/app/routers/__init__.py
Normal file
10
api/app/routers/articles.py
Normal file
10
api/app/routers/articles.py
Normal file
@@ -0,0 +1,10 @@
|
||||
from fastapi import APIRouter, Query
|
||||
|
||||
from ..services import article_service
|
||||
|
||||
router = APIRouter(prefix="/api/articles", tags=["articles"])
|
||||
|
||||
@router.get("/search")
|
||||
def search_articles(q: str = Query("", min_length=2)):
|
||||
results = article_service.search_articles(q)
|
||||
return {"results": results}
|
||||
17
api/app/routers/dashboard.py
Normal file
17
api/app/routers/dashboard.py
Normal file
@@ -0,0 +1,17 @@
|
||||
from fastapi import APIRouter, Request
|
||||
from fastapi.templating import Jinja2Templates
|
||||
from fastapi.responses import HTMLResponse
|
||||
from pathlib import Path
|
||||
|
||||
from ..services import sqlite_service
|
||||
|
||||
router = APIRouter()
|
||||
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
|
||||
|
||||
@router.get("/", response_class=HTMLResponse)
|
||||
async def dashboard(request: Request):
|
||||
return templates.TemplateResponse("dashboard.html", {"request": request})
|
||||
|
||||
@router.get("/missing-skus", response_class=HTMLResponse)
|
||||
async def missing_skus_page(request: Request):
|
||||
return templates.TemplateResponse("missing_skus.html", {"request": request})
|
||||
30
api/app/routers/health.py
Normal file
30
api/app/routers/health.py
Normal file
@@ -0,0 +1,30 @@
|
||||
from fastapi import APIRouter
|
||||
from .. import database
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
@router.get("/health")
|
||||
async def health_check():
|
||||
result = {"oracle": "error", "sqlite": "error"}
|
||||
|
||||
# Check Oracle
|
||||
try:
|
||||
if database.pool:
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("SELECT SYSDATE FROM DUAL")
|
||||
cur.fetchone()
|
||||
result["oracle"] = "ok"
|
||||
except Exception as e:
|
||||
result["oracle"] = str(e)
|
||||
|
||||
# Check SQLite
|
||||
try:
|
||||
db = await database.get_sqlite()
|
||||
await db.execute("SELECT 1")
|
||||
await db.close()
|
||||
result["sqlite"] = "ok"
|
||||
except Exception as e:
|
||||
result["sqlite"] = str(e)
|
||||
|
||||
return result
|
||||
84
api/app/routers/mappings.py
Normal file
84
api/app/routers/mappings.py
Normal file
@@ -0,0 +1,84 @@
|
||||
from fastapi import APIRouter, Query, Request, UploadFile, File
|
||||
from fastapi.responses import StreamingResponse, HTMLResponse
|
||||
from fastapi.templating import Jinja2Templates
|
||||
from pydantic import BaseModel
|
||||
from pathlib import Path
|
||||
from typing import Optional
|
||||
import io
|
||||
|
||||
from ..services import mapping_service, sqlite_service
|
||||
|
||||
router = APIRouter(tags=["mappings"])
|
||||
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
|
||||
|
||||
class MappingCreate(BaseModel):
|
||||
sku: str
|
||||
codmat: str
|
||||
cantitate_roa: float = 1
|
||||
procent_pret: float = 100
|
||||
|
||||
class MappingUpdate(BaseModel):
|
||||
cantitate_roa: Optional[float] = None
|
||||
procent_pret: Optional[float] = None
|
||||
activ: Optional[int] = None
|
||||
|
||||
# HTML page
|
||||
@router.get("/mappings", response_class=HTMLResponse)
|
||||
async def mappings_page(request: Request):
|
||||
return templates.TemplateResponse("mappings.html", {"request": request})
|
||||
|
||||
# API endpoints
|
||||
@router.get("/api/mappings")
|
||||
def list_mappings(search: str = "", page: int = 1, per_page: int = 50):
|
||||
return mapping_service.get_mappings(search=search, page=page, per_page=per_page)
|
||||
|
||||
@router.post("/api/mappings")
|
||||
async def create_mapping(data: MappingCreate):
|
||||
try:
|
||||
result = mapping_service.create_mapping(data.sku, data.codmat, data.cantitate_roa, data.procent_pret)
|
||||
# Mark SKU as resolved in missing_skus tracking
|
||||
await sqlite_service.resolve_missing_sku(data.sku)
|
||||
return {"success": True, **result}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.put("/api/mappings/{sku}/{codmat}")
|
||||
def update_mapping(sku: str, codmat: str, data: MappingUpdate):
|
||||
try:
|
||||
updated = mapping_service.update_mapping(sku, codmat, data.cantitate_roa, data.procent_pret, data.activ)
|
||||
return {"success": updated}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.delete("/api/mappings/{sku}/{codmat}")
|
||||
def delete_mapping(sku: str, codmat: str):
|
||||
try:
|
||||
deleted = mapping_service.delete_mapping(sku, codmat)
|
||||
return {"success": deleted}
|
||||
except Exception as e:
|
||||
return {"success": False, "error": str(e)}
|
||||
|
||||
@router.post("/api/mappings/import-csv")
|
||||
async def import_csv(file: UploadFile = File(...)):
|
||||
content = await file.read()
|
||||
text = content.decode("utf-8-sig")
|
||||
result = mapping_service.import_csv(text)
|
||||
return result
|
||||
|
||||
@router.get("/api/mappings/export-csv")
|
||||
def export_csv():
|
||||
csv_content = mapping_service.export_csv()
|
||||
return StreamingResponse(
|
||||
io.BytesIO(csv_content.encode("utf-8-sig")),
|
||||
media_type="text/csv",
|
||||
headers={"Content-Disposition": "attachment; filename=mappings.csv"}
|
||||
)
|
||||
|
||||
@router.get("/api/mappings/csv-template")
|
||||
def csv_template():
|
||||
content = mapping_service.get_csv_template()
|
||||
return StreamingResponse(
|
||||
io.BytesIO(content.encode("utf-8-sig")),
|
||||
media_type="text/csv",
|
||||
headers={"Content-Disposition": "attachment; filename=mappings_template.csv"}
|
||||
)
|
||||
90
api/app/routers/sync.py
Normal file
90
api/app/routers/sync.py
Normal file
@@ -0,0 +1,90 @@
|
||||
from fastapi import APIRouter, Request, BackgroundTasks
|
||||
from fastapi.templating import Jinja2Templates
|
||||
from fastapi.responses import HTMLResponse
|
||||
from pydantic import BaseModel
|
||||
from pathlib import Path
|
||||
from typing import Optional
|
||||
|
||||
from ..services import sync_service, scheduler_service, sqlite_service
|
||||
|
||||
router = APIRouter(tags=["sync"])
|
||||
templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
|
||||
|
||||
|
||||
class ScheduleConfig(BaseModel):
|
||||
enabled: bool
|
||||
interval_minutes: int = 5
|
||||
|
||||
|
||||
# HTML pages
|
||||
@router.get("/sync", response_class=HTMLResponse)
|
||||
async def sync_page(request: Request):
|
||||
return templates.TemplateResponse("dashboard.html", {"request": request})
|
||||
|
||||
|
||||
@router.get("/sync/run/{run_id}", response_class=HTMLResponse)
|
||||
async def sync_detail_page(request: Request, run_id: str):
|
||||
return templates.TemplateResponse("sync_detail.html", {"request": request, "run_id": run_id})
|
||||
|
||||
|
||||
# API endpoints
|
||||
@router.post("/api/sync/start")
|
||||
async def start_sync(background_tasks: BackgroundTasks):
|
||||
"""Trigger a sync run in the background."""
|
||||
status = await sync_service.get_sync_status()
|
||||
if status.get("status") == "running":
|
||||
return {"error": "Sync already running", "run_id": status.get("run_id")}
|
||||
|
||||
background_tasks.add_task(sync_service.run_sync)
|
||||
return {"message": "Sync started"}
|
||||
|
||||
|
||||
@router.post("/api/sync/stop")
|
||||
async def stop_sync():
|
||||
"""Stop a running sync."""
|
||||
sync_service.stop_sync()
|
||||
return {"message": "Stop signal sent"}
|
||||
|
||||
|
||||
@router.get("/api/sync/status")
|
||||
async def sync_status():
|
||||
"""Get current sync status."""
|
||||
status = await sync_service.get_sync_status()
|
||||
stats = await sqlite_service.get_dashboard_stats()
|
||||
return {**status, "stats": stats}
|
||||
|
||||
|
||||
@router.get("/api/sync/history")
|
||||
async def sync_history(page: int = 1, per_page: int = 20):
|
||||
"""Get sync run history."""
|
||||
return await sqlite_service.get_sync_runs(page, per_page)
|
||||
|
||||
|
||||
@router.get("/api/sync/run/{run_id}")
|
||||
async def sync_run_detail(run_id: str):
|
||||
"""Get details for a specific sync run."""
|
||||
detail = await sqlite_service.get_sync_run_detail(run_id)
|
||||
if not detail:
|
||||
return {"error": "Run not found"}
|
||||
return detail
|
||||
|
||||
|
||||
@router.put("/api/sync/schedule")
|
||||
async def update_schedule(config: ScheduleConfig):
|
||||
"""Update scheduler configuration."""
|
||||
if config.enabled:
|
||||
scheduler_service.start_scheduler(config.interval_minutes)
|
||||
else:
|
||||
scheduler_service.stop_scheduler()
|
||||
|
||||
# Persist config
|
||||
await sqlite_service.set_scheduler_config("enabled", str(config.enabled))
|
||||
await sqlite_service.set_scheduler_config("interval_minutes", str(config.interval_minutes))
|
||||
|
||||
return scheduler_service.get_scheduler_status()
|
||||
|
||||
|
||||
@router.get("/api/sync/schedule")
|
||||
async def get_schedule():
|
||||
"""Get current scheduler status."""
|
||||
return scheduler_service.get_scheduler_status()
|
||||
111
api/app/routers/validation.py
Normal file
111
api/app/routers/validation.py
Normal file
@@ -0,0 +1,111 @@
|
||||
import csv
|
||||
import io
|
||||
from fastapi import APIRouter
|
||||
from fastapi.responses import StreamingResponse
|
||||
|
||||
from ..services import order_reader, validation_service
|
||||
from ..database import get_sqlite
|
||||
|
||||
router = APIRouter(prefix="/api/validate", tags=["validation"])
|
||||
|
||||
@router.post("/scan")
|
||||
async def scan_and_validate():
|
||||
"""Scan JSON files and validate all SKUs."""
|
||||
orders, json_count = order_reader.read_json_orders()
|
||||
|
||||
if not orders:
|
||||
return {"orders": 0, "json_files": json_count, "skus": {}, "message": "No orders found"}
|
||||
|
||||
all_skus = order_reader.get_all_skus(orders)
|
||||
result = validation_service.validate_skus(all_skus)
|
||||
importable, skipped = validation_service.classify_orders(orders, result)
|
||||
|
||||
# Track missing SKUs in SQLite
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
for sku in result["missing"]:
|
||||
# Find product name from orders
|
||||
product_name = ""
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
if item.sku == sku:
|
||||
product_name = item.name
|
||||
break
|
||||
if product_name:
|
||||
break
|
||||
|
||||
await db.execute("""
|
||||
INSERT OR IGNORE INTO missing_skus (sku, product_name)
|
||||
VALUES (?, ?)
|
||||
""", (sku, product_name))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
return {
|
||||
"json_files": json_count,
|
||||
"total_orders": len(orders),
|
||||
"total_skus": len(all_skus),
|
||||
"importable": len(importable),
|
||||
"skipped": len(skipped),
|
||||
"skus": {
|
||||
"mapped": len(result["mapped"]),
|
||||
"direct": len(result["direct"]),
|
||||
"missing": len(result["missing"]),
|
||||
"missing_list": sorted(result["missing"])
|
||||
},
|
||||
"skipped_orders": [
|
||||
{
|
||||
"number": order.number,
|
||||
"customer": order.billing.company_name or f"{order.billing.firstname} {order.billing.lastname}",
|
||||
"items_count": len(order.items),
|
||||
"missing_skus": missing
|
||||
}
|
||||
for order, missing in skipped[:50] # limit to 50
|
||||
]
|
||||
}
|
||||
|
||||
@router.get("/missing-skus")
|
||||
async def get_missing_skus():
|
||||
"""Get all tracked missing SKUs."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
cursor = await db.execute("""
|
||||
SELECT sku, product_name, first_seen, resolved, resolved_at
|
||||
FROM missing_skus
|
||||
ORDER BY resolved ASC, first_seen DESC
|
||||
""")
|
||||
rows = await cursor.fetchall()
|
||||
return {
|
||||
"missing_skus": [dict(row) for row in rows],
|
||||
"total": len(rows),
|
||||
"unresolved": sum(1 for r in rows if not r["resolved"])
|
||||
}
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
@router.get("/missing-skus-csv")
|
||||
async def export_missing_skus_csv():
|
||||
"""Export missing SKUs as CSV."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
cursor = await db.execute("""
|
||||
SELECT sku, product_name, first_seen, resolved
|
||||
FROM missing_skus WHERE resolved = 0
|
||||
ORDER BY first_seen DESC
|
||||
""")
|
||||
rows = await cursor.fetchall()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(["sku", "product_name", "first_seen"])
|
||||
for row in rows:
|
||||
writer.writerow([row["sku"], row["product_name"], row["first_seen"]])
|
||||
|
||||
return StreamingResponse(
|
||||
io.BytesIO(output.getvalue().encode("utf-8-sig")),
|
||||
media_type="text/csv",
|
||||
headers={"Content-Disposition": "attachment; filename=missing_skus.csv"}
|
||||
)
|
||||
0
api/app/services/__init__.py
Normal file
0
api/app/services/__init__.py
Normal file
27
api/app/services/article_service.py
Normal file
27
api/app/services/article_service.py
Normal file
@@ -0,0 +1,27 @@
|
||||
import logging
|
||||
from fastapi import HTTPException
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def search_articles(query: str, limit: int = 20):
|
||||
"""Search articles in NOM_ARTICOLE by codmat or denumire."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
if not query or len(query) < 2:
|
||||
return []
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
SELECT id_articol, codmat, denumire
|
||||
FROM nom_articole
|
||||
WHERE (UPPER(codmat) LIKE UPPER(:q) || '%'
|
||||
OR UPPER(denumire) LIKE '%' || UPPER(:q) || '%')
|
||||
AND ROWNUM <= :lim
|
||||
ORDER BY CASE WHEN UPPER(codmat) LIKE UPPER(:q) || '%' THEN 0 ELSE 1 END, codmat
|
||||
""", {"q": query, "lim": limit})
|
||||
|
||||
columns = [col[0].lower() for col in cur.description]
|
||||
return [dict(zip(columns, row)) for row in cur.fetchall()]
|
||||
192
api/app/services/import_service.py
Normal file
192
api/app/services/import_service.py
Normal file
@@ -0,0 +1,192 @@
|
||||
import html
|
||||
import json
|
||||
import logging
|
||||
import oracledb
|
||||
from datetime import datetime, timedelta
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Diacritics to ASCII mapping (Romanian)
|
||||
_DIACRITICS = str.maketrans({
|
||||
'\u0103': 'a', # ă
|
||||
'\u00e2': 'a', # â
|
||||
'\u00ee': 'i', # î
|
||||
'\u0219': 's', # ș
|
||||
'\u021b': 't', # ț
|
||||
'\u0102': 'A', # Ă
|
||||
'\u00c2': 'A', # Â
|
||||
'\u00ce': 'I', # Î
|
||||
'\u0218': 'S', # Ș
|
||||
'\u021a': 'T', # Ț
|
||||
# Older Unicode variants
|
||||
'\u015f': 's', # ş (cedilla)
|
||||
'\u0163': 't', # ţ (cedilla)
|
||||
'\u015e': 'S', # Ş
|
||||
'\u0162': 'T', # Ţ
|
||||
})
|
||||
|
||||
|
||||
def clean_web_text(text: str) -> str:
|
||||
"""Port of VFP CleanWebText: unescape HTML entities + diacritics to ASCII."""
|
||||
if not text:
|
||||
return ""
|
||||
result = html.unescape(text)
|
||||
result = result.translate(_DIACRITICS)
|
||||
# Remove any remaining <br> tags
|
||||
for br in ('<br>', '<br/>', '<br />'):
|
||||
result = result.replace(br, ' ')
|
||||
return result.strip()
|
||||
|
||||
|
||||
def convert_web_date(date_str: str) -> datetime:
|
||||
"""Port of VFP ConvertWebDate: parse web date to datetime."""
|
||||
if not date_str:
|
||||
return datetime.now()
|
||||
try:
|
||||
return datetime.strptime(date_str[:10], '%Y-%m-%d')
|
||||
except ValueError:
|
||||
return datetime.now()
|
||||
|
||||
|
||||
def format_address_for_oracle(address: str, city: str, region: str) -> str:
|
||||
"""Port of VFP FormatAddressForOracle."""
|
||||
region_clean = clean_web_text(region)
|
||||
city_clean = clean_web_text(city)
|
||||
address_clean = clean_web_text(address)
|
||||
return f"JUD:{region_clean};{city_clean};{address_clean}"
|
||||
|
||||
|
||||
def build_articles_json(items) -> str:
|
||||
"""Build JSON string for Oracle PACK_IMPORT_COMENZI.importa_comanda."""
|
||||
articles = []
|
||||
for item in items:
|
||||
articles.append({
|
||||
"sku": item.sku,
|
||||
"quantity": str(item.quantity),
|
||||
"price": str(item.price),
|
||||
"vat": str(item.vat),
|
||||
"name": clean_web_text(item.name)
|
||||
})
|
||||
return json.dumps(articles)
|
||||
|
||||
|
||||
def import_single_order(order, id_pol: int = None, id_sectie: int = None) -> dict:
|
||||
"""Import a single order into Oracle ROA.
|
||||
|
||||
Returns dict with:
|
||||
success: bool
|
||||
id_comanda: int or None
|
||||
id_partener: int or None
|
||||
error: str or None
|
||||
"""
|
||||
result = {
|
||||
"success": False,
|
||||
"id_comanda": None,
|
||||
"id_partener": None,
|
||||
"error": None
|
||||
}
|
||||
|
||||
try:
|
||||
order_number = clean_web_text(order.number)
|
||||
order_date = convert_web_date(order.date)
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Step 1: Process partner
|
||||
id_partener = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
|
||||
if order.billing.is_company:
|
||||
denumire = clean_web_text(order.billing.company_name)
|
||||
cod_fiscal = clean_web_text(order.billing.company_code) or None
|
||||
registru = clean_web_text(order.billing.company_reg) or None
|
||||
is_pj = 1
|
||||
else:
|
||||
denumire = clean_web_text(
|
||||
f"{order.billing.firstname} {order.billing.lastname}"
|
||||
)
|
||||
cod_fiscal = None
|
||||
registru = None
|
||||
is_pj = 0
|
||||
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener", [
|
||||
cod_fiscal, denumire, registru, is_pj, id_partener
|
||||
])
|
||||
|
||||
partner_id = id_partener.getvalue()
|
||||
if not partner_id or partner_id <= 0:
|
||||
result["error"] = f"Partner creation failed for {denumire}"
|
||||
return result
|
||||
|
||||
result["id_partener"] = int(partner_id)
|
||||
|
||||
# Step 2: Process billing address
|
||||
id_adresa_fact = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
billing_addr = format_address_for_oracle(
|
||||
order.billing.address, order.billing.city, order.billing.region
|
||||
)
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_adresa", [
|
||||
partner_id, billing_addr,
|
||||
order.billing.phone or "",
|
||||
order.billing.email or "",
|
||||
id_adresa_fact
|
||||
])
|
||||
addr_fact_id = id_adresa_fact.getvalue()
|
||||
|
||||
# Step 3: Process shipping address (if different)
|
||||
addr_livr_id = None
|
||||
if order.shipping:
|
||||
id_adresa_livr = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
shipping_addr = format_address_for_oracle(
|
||||
order.shipping.address, order.shipping.city,
|
||||
order.shipping.region
|
||||
)
|
||||
cur.callproc("PACK_IMPORT_PARTENERI.cauta_sau_creeaza_adresa", [
|
||||
partner_id, shipping_addr,
|
||||
order.shipping.phone or "",
|
||||
order.shipping.email or "",
|
||||
id_adresa_livr
|
||||
])
|
||||
addr_livr_id = id_adresa_livr.getvalue()
|
||||
|
||||
# Step 4: Build articles JSON and import order
|
||||
articles_json = build_articles_json(order.items)
|
||||
|
||||
# Use CLOB for the JSON
|
||||
clob_var = cur.var(oracledb.DB_TYPE_CLOB)
|
||||
clob_var.setvalue(0, articles_json)
|
||||
|
||||
id_comanda = cur.var(oracledb.DB_TYPE_NUMBER)
|
||||
|
||||
cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
|
||||
order_number, # p_nr_comanda_ext
|
||||
order_date, # p_data_comanda
|
||||
partner_id, # p_id_partener
|
||||
clob_var, # p_json_articole (CLOB)
|
||||
addr_livr_id, # p_id_adresa_livrare
|
||||
addr_fact_id, # p_id_adresa_facturare
|
||||
id_pol, # p_id_pol
|
||||
id_sectie, # p_id_sectie
|
||||
id_comanda # v_id_comanda (OUT)
|
||||
])
|
||||
|
||||
comanda_id = id_comanda.getvalue()
|
||||
|
||||
if comanda_id and comanda_id > 0:
|
||||
conn.commit()
|
||||
result["success"] = True
|
||||
result["id_comanda"] = int(comanda_id)
|
||||
logger.info(f"Order {order_number} imported: ID={comanda_id}")
|
||||
else:
|
||||
conn.rollback()
|
||||
result["error"] = "importa_comanda returned invalid ID"
|
||||
|
||||
except oracledb.DatabaseError as e:
|
||||
error_msg = str(e)
|
||||
result["error"] = error_msg
|
||||
logger.error(f"Oracle error importing order {order.number}: {error_msg}")
|
||||
except Exception as e:
|
||||
result["error"] = str(e)
|
||||
logger.error(f"Error importing order {order.number}: {e}")
|
||||
|
||||
return result
|
||||
188
api/app/services/mapping_service.py
Normal file
188
api/app/services/mapping_service.py
Normal file
@@ -0,0 +1,188 @@
|
||||
import oracledb
|
||||
import csv
|
||||
import io
|
||||
import logging
|
||||
from fastapi import HTTPException
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def get_mappings(search: str = "", page: int = 1, per_page: int = 50):
|
||||
"""Get paginated mappings with optional search."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
offset = (page - 1) * per_page
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Build WHERE clause
|
||||
where = ""
|
||||
params = {}
|
||||
if search:
|
||||
where = """WHERE (UPPER(at.sku) LIKE '%' || UPPER(:search) || '%'
|
||||
OR UPPER(at.codmat) LIKE '%' || UPPER(:search) || '%'
|
||||
OR UPPER(na.denumire) LIKE '%' || UPPER(:search) || '%')"""
|
||||
params["search"] = search
|
||||
|
||||
# Count total
|
||||
count_sql = f"""
|
||||
SELECT COUNT(*) FROM ARTICOLE_TERTI at
|
||||
LEFT JOIN nom_articole na ON na.codmat = at.codmat
|
||||
{where}
|
||||
"""
|
||||
cur.execute(count_sql, params)
|
||||
total = cur.fetchone()[0]
|
||||
|
||||
# Get page
|
||||
data_sql = f"""
|
||||
SELECT at.sku, at.codmat, na.denumire, at.cantitate_roa,
|
||||
at.procent_pret, at.activ,
|
||||
TO_CHAR(at.data_creare, 'YYYY-MM-DD HH24:MI') as data_creare
|
||||
FROM ARTICOLE_TERTI at
|
||||
LEFT JOIN nom_articole na ON na.codmat = at.codmat
|
||||
{where}
|
||||
ORDER BY at.sku, at.codmat
|
||||
OFFSET :offset ROWS FETCH NEXT :per_page ROWS ONLY
|
||||
"""
|
||||
params["offset"] = offset
|
||||
params["per_page"] = per_page
|
||||
cur.execute(data_sql, params)
|
||||
|
||||
columns = [col[0].lower() for col in cur.description]
|
||||
rows = [dict(zip(columns, row)) for row in cur.fetchall()]
|
||||
|
||||
return {
|
||||
"mappings": rows,
|
||||
"total": total,
|
||||
"page": page,
|
||||
"per_page": per_page,
|
||||
"pages": (total + per_page - 1) // per_page
|
||||
}
|
||||
|
||||
def create_mapping(sku: str, codmat: str, cantitate_roa: float = 1, procent_pret: float = 100):
|
||||
"""Create a new mapping."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
INSERT INTO ARTICOLE_TERTI (sku, codmat, cantitate_roa, procent_pret, activ, data_creare, id_util_creare)
|
||||
VALUES (:sku, :codmat, :cantitate_roa, :procent_pret, 1, SYSDATE, -3)
|
||||
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate_roa, "procent_pret": procent_pret})
|
||||
conn.commit()
|
||||
return {"sku": sku, "codmat": codmat}
|
||||
|
||||
def update_mapping(sku: str, codmat: str, cantitate_roa: float = None, procent_pret: float = None, activ: int = None):
|
||||
"""Update an existing mapping."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
sets = []
|
||||
params = {"sku": sku, "codmat": codmat}
|
||||
|
||||
if cantitate_roa is not None:
|
||||
sets.append("cantitate_roa = :cantitate_roa")
|
||||
params["cantitate_roa"] = cantitate_roa
|
||||
if procent_pret is not None:
|
||||
sets.append("procent_pret = :procent_pret")
|
||||
params["procent_pret"] = procent_pret
|
||||
if activ is not None:
|
||||
sets.append("activ = :activ")
|
||||
params["activ"] = activ
|
||||
|
||||
if not sets:
|
||||
return False
|
||||
|
||||
sets.append("data_modif = SYSDATE")
|
||||
set_clause = ", ".join(sets)
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(f"""
|
||||
UPDATE ARTICOLE_TERTI SET {set_clause}
|
||||
WHERE sku = :sku AND codmat = :codmat
|
||||
""", params)
|
||||
conn.commit()
|
||||
return cur.rowcount > 0
|
||||
|
||||
def delete_mapping(sku: str, codmat: str):
|
||||
"""Soft delete (set activ=0)."""
|
||||
return update_mapping(sku, codmat, activ=0)
|
||||
|
||||
def import_csv(file_content: str):
|
||||
"""Import mappings from CSV content. Returns summary."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
reader = csv.DictReader(io.StringIO(file_content))
|
||||
created = 0
|
||||
updated = 0
|
||||
errors = []
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
for i, row in enumerate(reader, 1):
|
||||
try:
|
||||
sku = row.get("sku", "").strip()
|
||||
codmat = row.get("codmat", "").strip()
|
||||
cantitate = float(row.get("cantitate_roa", "1") or "1")
|
||||
procent = float(row.get("procent_pret", "100") or "100")
|
||||
|
||||
if not sku or not codmat:
|
||||
errors.append(f"Row {i}: missing sku or codmat")
|
||||
continue
|
||||
|
||||
# Try update first, insert if not exists (MERGE)
|
||||
cur.execute("""
|
||||
MERGE INTO ARTICOLE_TERTI t
|
||||
USING (SELECT :sku AS sku, :codmat AS codmat FROM DUAL) s
|
||||
ON (t.sku = s.sku AND t.codmat = s.codmat)
|
||||
WHEN MATCHED THEN UPDATE SET
|
||||
cantitate_roa = :cantitate_roa,
|
||||
procent_pret = :procent_pret,
|
||||
activ = 1,
|
||||
data_modif = SYSDATE
|
||||
WHEN NOT MATCHED THEN INSERT
|
||||
(sku, codmat, cantitate_roa, procent_pret, activ, data_creare, id_util_creare)
|
||||
VALUES (:sku, :codmat, :cantitate_roa, :procent_pret, 1, SYSDATE, -3)
|
||||
""", {"sku": sku, "codmat": codmat, "cantitate_roa": cantitate, "procent_pret": procent})
|
||||
|
||||
# Check if it was insert or update by rowcount
|
||||
created += 1 # We count total processed
|
||||
|
||||
except Exception as e:
|
||||
errors.append(f"Row {i}: {str(e)}")
|
||||
|
||||
conn.commit()
|
||||
|
||||
return {"processed": created, "errors": errors}
|
||||
|
||||
def export_csv():
|
||||
"""Export all mappings as CSV string."""
|
||||
if database.pool is None:
|
||||
raise HTTPException(status_code=503, detail="Oracle unavailable")
|
||||
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(["sku", "codmat", "cantitate_roa", "procent_pret", "activ"])
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
cur.execute("""
|
||||
SELECT sku, codmat, cantitate_roa, procent_pret, activ
|
||||
FROM ARTICOLE_TERTI ORDER BY sku, codmat
|
||||
""")
|
||||
for row in cur:
|
||||
writer.writerow(row)
|
||||
|
||||
return output.getvalue()
|
||||
|
||||
def get_csv_template():
|
||||
"""Return empty CSV template."""
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(["sku", "codmat", "cantitate_roa", "procent_pret"])
|
||||
writer.writerow(["EXAMPLE_SKU", "EXAMPLE_CODMAT", "1", "100"])
|
||||
return output.getvalue()
|
||||
178
api/app/services/order_reader.py
Normal file
178
api/app/services/order_reader.py
Normal file
@@ -0,0 +1,178 @@
|
||||
import json
|
||||
import glob
|
||||
import os
|
||||
import logging
|
||||
from pathlib import Path
|
||||
from dataclasses import dataclass, field
|
||||
from typing import Optional
|
||||
|
||||
from ..config import settings
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
@dataclass
|
||||
class OrderItem:
|
||||
sku: str
|
||||
name: str
|
||||
price: float
|
||||
quantity: float
|
||||
vat: float
|
||||
|
||||
@dataclass
|
||||
class OrderBilling:
|
||||
firstname: str = ""
|
||||
lastname: str = ""
|
||||
phone: str = ""
|
||||
email: str = ""
|
||||
address: str = ""
|
||||
city: str = ""
|
||||
region: str = ""
|
||||
country: str = ""
|
||||
company_name: str = ""
|
||||
company_code: str = ""
|
||||
company_reg: str = ""
|
||||
is_company: bool = False
|
||||
|
||||
@dataclass
|
||||
class OrderShipping:
|
||||
firstname: str = ""
|
||||
lastname: str = ""
|
||||
phone: str = ""
|
||||
email: str = ""
|
||||
address: str = ""
|
||||
city: str = ""
|
||||
region: str = ""
|
||||
country: str = ""
|
||||
|
||||
@dataclass
|
||||
class OrderData:
|
||||
id: str
|
||||
number: str
|
||||
date: str
|
||||
status: str = ""
|
||||
status_id: str = ""
|
||||
items: list = field(default_factory=list) # list of OrderItem
|
||||
billing: OrderBilling = field(default_factory=OrderBilling)
|
||||
shipping: Optional[OrderShipping] = None
|
||||
payment_name: str = ""
|
||||
delivery_name: str = ""
|
||||
source_file: str = ""
|
||||
|
||||
def read_json_orders(json_dir: str = None) -> tuple[list[OrderData], int]:
|
||||
"""Read all GoMag order JSON files from the output directory.
|
||||
Returns (list of OrderData, number of JSON files read).
|
||||
"""
|
||||
if json_dir is None:
|
||||
json_dir = settings.JSON_OUTPUT_DIR
|
||||
|
||||
if not json_dir or not os.path.isdir(json_dir):
|
||||
logger.warning(f"JSON output directory not found: {json_dir}")
|
||||
return [], 0
|
||||
|
||||
# Find all gomag_orders*.json files
|
||||
pattern = os.path.join(json_dir, "gomag_orders*.json")
|
||||
json_files = sorted(glob.glob(pattern))
|
||||
|
||||
if not json_files:
|
||||
logger.info(f"No JSON files found in {json_dir}")
|
||||
return [], 0
|
||||
|
||||
orders = []
|
||||
for filepath in json_files:
|
||||
try:
|
||||
with open(filepath, 'r', encoding='utf-8') as f:
|
||||
data = json.load(f)
|
||||
|
||||
raw_orders = data.get("orders", {})
|
||||
if not isinstance(raw_orders, dict):
|
||||
continue
|
||||
|
||||
for order_id, order_data in raw_orders.items():
|
||||
try:
|
||||
order = _parse_order(order_id, order_data, os.path.basename(filepath))
|
||||
orders.append(order)
|
||||
except Exception as e:
|
||||
logger.warning(f"Error parsing order {order_id} from {filepath}: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"Error reading {filepath}: {e}")
|
||||
|
||||
logger.info(f"Read {len(orders)} orders from {len(json_files)} JSON files")
|
||||
return orders, len(json_files)
|
||||
|
||||
def _parse_order(order_id: str, data: dict, source_file: str) -> OrderData:
|
||||
"""Parse a single order from JSON data."""
|
||||
# Parse items
|
||||
items = []
|
||||
raw_items = data.get("items", [])
|
||||
if isinstance(raw_items, list):
|
||||
for item in raw_items:
|
||||
if isinstance(item, dict) and item.get("sku"):
|
||||
items.append(OrderItem(
|
||||
sku=str(item.get("sku", "")).strip(),
|
||||
name=str(item.get("name", "")),
|
||||
price=float(item.get("price", 0) or 0),
|
||||
quantity=float(item.get("quantity", 0) or 0),
|
||||
vat=float(item.get("vat", 0) or 0)
|
||||
))
|
||||
|
||||
# Parse billing
|
||||
billing_data = data.get("billing", {}) or {}
|
||||
company = billing_data.get("company")
|
||||
is_company = isinstance(company, dict) and bool(company.get("name"))
|
||||
|
||||
billing = OrderBilling(
|
||||
firstname=str(billing_data.get("firstname", "")),
|
||||
lastname=str(billing_data.get("lastname", "")),
|
||||
phone=str(billing_data.get("phone", "")),
|
||||
email=str(billing_data.get("email", "")),
|
||||
address=str(billing_data.get("address", "")),
|
||||
city=str(billing_data.get("city", "")),
|
||||
region=str(billing_data.get("region", "")),
|
||||
country=str(billing_data.get("country", "")),
|
||||
company_name=str(company.get("name", "")) if is_company else "",
|
||||
company_code=str(company.get("code", "")) if is_company else "",
|
||||
company_reg=str(company.get("registrationNo", "")) if is_company else "",
|
||||
is_company=is_company
|
||||
)
|
||||
|
||||
# Parse shipping
|
||||
shipping_data = data.get("shipping")
|
||||
shipping = None
|
||||
if isinstance(shipping_data, dict):
|
||||
shipping = OrderShipping(
|
||||
firstname=str(shipping_data.get("firstname", "")),
|
||||
lastname=str(shipping_data.get("lastname", "")),
|
||||
phone=str(shipping_data.get("phone", "")),
|
||||
email=str(shipping_data.get("email", "")),
|
||||
address=str(shipping_data.get("address", "")),
|
||||
city=str(shipping_data.get("city", "")),
|
||||
region=str(shipping_data.get("region", "")),
|
||||
country=str(shipping_data.get("country", ""))
|
||||
)
|
||||
|
||||
# Payment/delivery
|
||||
payment = data.get("payment", {}) or {}
|
||||
delivery = data.get("delivery", {}) or {}
|
||||
|
||||
return OrderData(
|
||||
id=str(data.get("id", order_id)),
|
||||
number=str(data.get("number", "")),
|
||||
date=str(data.get("date", "")),
|
||||
status=str(data.get("status", "")),
|
||||
status_id=str(data.get("statusId", "")),
|
||||
items=items,
|
||||
billing=billing,
|
||||
shipping=shipping,
|
||||
payment_name=str(payment.get("name", "")) if isinstance(payment, dict) else "",
|
||||
delivery_name=str(delivery.get("name", "")) if isinstance(delivery, dict) else "",
|
||||
source_file=source_file
|
||||
)
|
||||
|
||||
def get_all_skus(orders: list[OrderData]) -> set[str]:
|
||||
"""Extract unique SKUs from all orders."""
|
||||
skus = set()
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
if item.sku:
|
||||
skus.add(item.sku)
|
||||
return skus
|
||||
71
api/app/services/scheduler_service.py
Normal file
71
api/app/services/scheduler_service.py
Normal file
@@ -0,0 +1,71 @@
|
||||
import logging
|
||||
from apscheduler.schedulers.asyncio import AsyncIOScheduler
|
||||
from apscheduler.triggers.interval import IntervalTrigger
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
_scheduler = None
|
||||
_is_running = False
|
||||
|
||||
|
||||
def init_scheduler():
|
||||
"""Initialize the APScheduler instance."""
|
||||
global _scheduler
|
||||
_scheduler = AsyncIOScheduler()
|
||||
logger.info("Scheduler initialized")
|
||||
|
||||
|
||||
def start_scheduler(interval_minutes: int = 5):
|
||||
"""Start the scheduler with the given interval."""
|
||||
global _is_running
|
||||
if _scheduler is None:
|
||||
init_scheduler()
|
||||
|
||||
# Remove existing job if any
|
||||
if _scheduler.get_job("sync_job"):
|
||||
_scheduler.remove_job("sync_job")
|
||||
|
||||
from . import sync_service
|
||||
|
||||
_scheduler.add_job(
|
||||
sync_service.run_sync,
|
||||
trigger=IntervalTrigger(minutes=interval_minutes),
|
||||
id="sync_job",
|
||||
name="GoMag Sync",
|
||||
replace_existing=True
|
||||
)
|
||||
|
||||
if not _scheduler.running:
|
||||
_scheduler.start()
|
||||
|
||||
_is_running = True
|
||||
logger.info(f"Scheduler started with interval {interval_minutes}min")
|
||||
|
||||
|
||||
def stop_scheduler():
|
||||
"""Stop the scheduler."""
|
||||
global _is_running
|
||||
if _scheduler and _scheduler.running:
|
||||
if _scheduler.get_job("sync_job"):
|
||||
_scheduler.remove_job("sync_job")
|
||||
_is_running = False
|
||||
logger.info("Scheduler stopped")
|
||||
|
||||
|
||||
def shutdown_scheduler():
|
||||
"""Shutdown the scheduler completely."""
|
||||
global _scheduler, _is_running
|
||||
if _scheduler and _scheduler.running:
|
||||
_scheduler.shutdown(wait=False)
|
||||
_scheduler = None
|
||||
_is_running = False
|
||||
|
||||
|
||||
def get_scheduler_status():
|
||||
"""Get current scheduler status."""
|
||||
job = _scheduler.get_job("sync_job") if _scheduler else None
|
||||
return {
|
||||
"enabled": _is_running,
|
||||
"next_run": job.next_run_time.isoformat() if job and job.next_run_time else None,
|
||||
"interval_minutes": int(job.trigger.interval.total_seconds() / 60) if job else None
|
||||
}
|
||||
206
api/app/services/sqlite_service.py
Normal file
206
api/app/services/sqlite_service.py
Normal file
@@ -0,0 +1,206 @@
|
||||
import json
|
||||
import logging
|
||||
from datetime import datetime
|
||||
from ..database import get_sqlite, get_sqlite_sync
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
async def create_sync_run(run_id: str, json_files: int = 0):
|
||||
"""Create a new sync run record."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
await db.execute("""
|
||||
INSERT INTO sync_runs (run_id, started_at, status, json_files)
|
||||
VALUES (?, datetime('now'), 'running', ?)
|
||||
""", (run_id, json_files))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def update_sync_run(run_id: str, status: str, total_orders: int = 0,
|
||||
imported: int = 0, skipped: int = 0, errors: int = 0):
|
||||
"""Update sync run with results."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
await db.execute("""
|
||||
UPDATE sync_runs SET
|
||||
finished_at = datetime('now'),
|
||||
status = ?,
|
||||
total_orders = ?,
|
||||
imported = ?,
|
||||
skipped = ?,
|
||||
errors = ?
|
||||
WHERE run_id = ?
|
||||
""", (status, total_orders, imported, skipped, errors, run_id))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def add_import_order(sync_run_id: str, order_number: str, order_date: str,
|
||||
customer_name: str, status: str, id_comanda: int = None,
|
||||
id_partener: int = None, error_message: str = None,
|
||||
missing_skus: list = None, items_count: int = 0):
|
||||
"""Record an individual order import result."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
await db.execute("""
|
||||
INSERT INTO import_orders
|
||||
(sync_run_id, order_number, order_date, customer_name, status,
|
||||
id_comanda, id_partener, error_message, missing_skus, items_count)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
""", (sync_run_id, order_number, order_date, customer_name, status,
|
||||
id_comanda, id_partener, error_message,
|
||||
json.dumps(missing_skus) if missing_skus else None, items_count))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def track_missing_sku(sku: str, product_name: str = ""):
|
||||
"""Track a missing SKU."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
await db.execute("""
|
||||
INSERT OR IGNORE INTO missing_skus (sku, product_name)
|
||||
VALUES (?, ?)
|
||||
""", (sku, product_name))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def resolve_missing_sku(sku: str):
|
||||
"""Mark a missing SKU as resolved."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
await db.execute("""
|
||||
UPDATE missing_skus SET resolved = 1, resolved_at = datetime('now')
|
||||
WHERE sku = ?
|
||||
""", (sku,))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def get_sync_runs(page: int = 1, per_page: int = 20):
|
||||
"""Get paginated sync run history."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
offset = (page - 1) * per_page
|
||||
|
||||
cursor = await db.execute("SELECT COUNT(*) FROM sync_runs")
|
||||
total = (await cursor.fetchone())[0]
|
||||
|
||||
cursor = await db.execute("""
|
||||
SELECT * FROM sync_runs
|
||||
ORDER BY started_at DESC
|
||||
LIMIT ? OFFSET ?
|
||||
""", (per_page, offset))
|
||||
rows = await cursor.fetchall()
|
||||
|
||||
return {
|
||||
"runs": [dict(row) for row in rows],
|
||||
"total": total,
|
||||
"page": page,
|
||||
"pages": (total + per_page - 1) // per_page if total > 0 else 0
|
||||
}
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def get_sync_run_detail(run_id: str):
|
||||
"""Get details for a specific sync run including its orders."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
cursor = await db.execute(
|
||||
"SELECT * FROM sync_runs WHERE run_id = ?", (run_id,)
|
||||
)
|
||||
run = await cursor.fetchone()
|
||||
if not run:
|
||||
return None
|
||||
|
||||
cursor = await db.execute("""
|
||||
SELECT * FROM import_orders
|
||||
WHERE sync_run_id = ?
|
||||
ORDER BY created_at
|
||||
""", (run_id,))
|
||||
orders = await cursor.fetchall()
|
||||
|
||||
return {
|
||||
"run": dict(run),
|
||||
"orders": [dict(o) for o in orders]
|
||||
}
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def get_dashboard_stats():
|
||||
"""Get stats for the dashboard."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
# Total imported
|
||||
cursor = await db.execute(
|
||||
"SELECT COUNT(*) FROM import_orders WHERE status = 'IMPORTED'"
|
||||
)
|
||||
imported = (await cursor.fetchone())[0]
|
||||
|
||||
# Total skipped
|
||||
cursor = await db.execute(
|
||||
"SELECT COUNT(*) FROM import_orders WHERE status = 'SKIPPED'"
|
||||
)
|
||||
skipped = (await cursor.fetchone())[0]
|
||||
|
||||
# Total errors
|
||||
cursor = await db.execute(
|
||||
"SELECT COUNT(*) FROM import_orders WHERE status = 'ERROR'"
|
||||
)
|
||||
errors = (await cursor.fetchone())[0]
|
||||
|
||||
# Missing SKUs (unresolved)
|
||||
cursor = await db.execute(
|
||||
"SELECT COUNT(*) FROM missing_skus WHERE resolved = 0"
|
||||
)
|
||||
missing = (await cursor.fetchone())[0]
|
||||
|
||||
# Last sync run
|
||||
cursor = await db.execute("""
|
||||
SELECT * FROM sync_runs ORDER BY started_at DESC LIMIT 1
|
||||
""")
|
||||
last_run = await cursor.fetchone()
|
||||
|
||||
return {
|
||||
"imported": imported,
|
||||
"skipped": skipped,
|
||||
"errors": errors,
|
||||
"missing_skus": missing,
|
||||
"last_run": dict(last_run) if last_run else None
|
||||
}
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def get_scheduler_config():
|
||||
"""Get scheduler configuration from SQLite."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
cursor = await db.execute("SELECT key, value FROM scheduler_config")
|
||||
rows = await cursor.fetchall()
|
||||
return {row["key"]: row["value"] for row in rows}
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
async def set_scheduler_config(key: str, value: str):
|
||||
"""Set a scheduler configuration value."""
|
||||
db = await get_sqlite()
|
||||
try:
|
||||
await db.execute("""
|
||||
INSERT OR REPLACE INTO scheduler_config (key, value)
|
||||
VALUES (?, ?)
|
||||
""", (key, value))
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
165
api/app/services/sync_service.py
Normal file
165
api/app/services/sync_service.py
Normal file
@@ -0,0 +1,165 @@
|
||||
import asyncio
|
||||
import logging
|
||||
import uuid
|
||||
from datetime import datetime
|
||||
|
||||
from . import order_reader, validation_service, import_service, sqlite_service
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Sync state
|
||||
_sync_lock = asyncio.Lock()
|
||||
_current_sync = None # dict with run_id, status, progress info
|
||||
|
||||
|
||||
async def get_sync_status():
|
||||
"""Get current sync status."""
|
||||
if _current_sync:
|
||||
return {**_current_sync}
|
||||
return {"status": "idle"}
|
||||
|
||||
|
||||
async def run_sync(id_pol: int = None, id_sectie: int = None) -> dict:
|
||||
"""Run a full sync cycle. Returns summary dict."""
|
||||
global _current_sync
|
||||
|
||||
if _sync_lock.locked():
|
||||
return {"error": "Sync already running"}
|
||||
|
||||
async with _sync_lock:
|
||||
run_id = datetime.now().strftime("%Y%m%d_%H%M%S") + "_" + uuid.uuid4().hex[:6]
|
||||
_current_sync = {
|
||||
"run_id": run_id,
|
||||
"status": "running",
|
||||
"started_at": datetime.now().isoformat(),
|
||||
"progress": "Reading JSON files..."
|
||||
}
|
||||
|
||||
try:
|
||||
# Step 1: Read orders
|
||||
orders, json_count = order_reader.read_json_orders()
|
||||
await sqlite_service.create_sync_run(run_id, json_count)
|
||||
|
||||
if not orders:
|
||||
await sqlite_service.update_sync_run(run_id, "completed", 0, 0, 0, 0)
|
||||
_current_sync = None
|
||||
return {
|
||||
"run_id": run_id,
|
||||
"status": "completed",
|
||||
"message": "No orders found",
|
||||
"json_files": json_count
|
||||
}
|
||||
|
||||
_current_sync["progress"] = f"Validating {len(orders)} orders..."
|
||||
|
||||
# Step 2: Validate SKUs (blocking Oracle call -> run in thread)
|
||||
all_skus = order_reader.get_all_skus(orders)
|
||||
validation = await asyncio.to_thread(validation_service.validate_skus, all_skus)
|
||||
importable, skipped = validation_service.classify_orders(orders, validation)
|
||||
|
||||
# Track missing SKUs
|
||||
for sku in validation["missing"]:
|
||||
product_name = ""
|
||||
for order in orders:
|
||||
for item in order.items:
|
||||
if item.sku == sku:
|
||||
product_name = item.name
|
||||
break
|
||||
if product_name:
|
||||
break
|
||||
await sqlite_service.track_missing_sku(sku, product_name)
|
||||
|
||||
# Step 3: Record skipped orders
|
||||
for order, missing_skus in skipped:
|
||||
customer = order.billing.company_name or \
|
||||
f"{order.billing.firstname} {order.billing.lastname}"
|
||||
await sqlite_service.add_import_order(
|
||||
sync_run_id=run_id,
|
||||
order_number=order.number,
|
||||
order_date=order.date,
|
||||
customer_name=customer,
|
||||
status="SKIPPED",
|
||||
missing_skus=missing_skus,
|
||||
items_count=len(order.items)
|
||||
)
|
||||
|
||||
# Step 4: Import valid orders
|
||||
imported_count = 0
|
||||
error_count = 0
|
||||
|
||||
for i, order in enumerate(importable):
|
||||
_current_sync["progress"] = f"Importing {i+1}/{len(importable)}: #{order.number}"
|
||||
|
||||
result = await asyncio.to_thread(
|
||||
import_service.import_single_order,
|
||||
order, id_pol=id_pol, id_sectie=id_sectie
|
||||
)
|
||||
customer = order.billing.company_name or \
|
||||
f"{order.billing.firstname} {order.billing.lastname}"
|
||||
|
||||
if result["success"]:
|
||||
imported_count += 1
|
||||
await sqlite_service.add_import_order(
|
||||
sync_run_id=run_id,
|
||||
order_number=order.number,
|
||||
order_date=order.date,
|
||||
customer_name=customer,
|
||||
status="IMPORTED",
|
||||
id_comanda=result["id_comanda"],
|
||||
id_partener=result["id_partener"],
|
||||
items_count=len(order.items)
|
||||
)
|
||||
else:
|
||||
error_count += 1
|
||||
await sqlite_service.add_import_order(
|
||||
sync_run_id=run_id,
|
||||
order_number=order.number,
|
||||
order_date=order.date,
|
||||
customer_name=customer,
|
||||
status="ERROR",
|
||||
id_partener=result.get("id_partener"),
|
||||
error_message=result["error"],
|
||||
items_count=len(order.items)
|
||||
)
|
||||
|
||||
# Safety: stop if too many errors
|
||||
if error_count > 10:
|
||||
logger.warning("Too many errors, stopping sync")
|
||||
break
|
||||
|
||||
# Step 5: Update sync run
|
||||
status = "completed" if error_count <= 10 else "failed"
|
||||
await sqlite_service.update_sync_run(
|
||||
run_id, status, len(orders), imported_count, len(skipped), error_count
|
||||
)
|
||||
|
||||
summary = {
|
||||
"run_id": run_id,
|
||||
"status": status,
|
||||
"json_files": json_count,
|
||||
"total_orders": len(orders),
|
||||
"imported": imported_count,
|
||||
"skipped": len(skipped),
|
||||
"errors": error_count,
|
||||
"missing_skus": len(validation["missing"])
|
||||
}
|
||||
|
||||
logger.info(
|
||||
f"Sync {run_id} completed: {imported_count} imported, "
|
||||
f"{len(skipped)} skipped, {error_count} errors"
|
||||
)
|
||||
return summary
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Sync {run_id} failed: {e}")
|
||||
await sqlite_service.update_sync_run(run_id, "failed", 0, 0, 0, 1)
|
||||
return {"run_id": run_id, "status": "failed", "error": str(e)}
|
||||
finally:
|
||||
_current_sync = None
|
||||
|
||||
|
||||
def stop_sync():
|
||||
"""Signal sync to stop. Currently sync runs to completion."""
|
||||
# For now, sync runs are not cancellable mid-flight.
|
||||
# Future: use an asyncio.Event for cooperative cancellation.
|
||||
pass
|
||||
71
api/app/services/validation_service.py
Normal file
71
api/app/services/validation_service.py
Normal file
@@ -0,0 +1,71 @@
|
||||
import logging
|
||||
from .. import database
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
def validate_skus(skus: set[str]) -> dict:
|
||||
"""Validate a set of SKUs against Oracle.
|
||||
Returns: {mapped: set, direct: set, missing: set}
|
||||
- mapped: found in ARTICOLE_TERTI (active)
|
||||
- direct: found in NOM_ARTICOLE by codmat (not in ARTICOLE_TERTI)
|
||||
- missing: not found anywhere
|
||||
"""
|
||||
if not skus:
|
||||
return {"mapped": set(), "direct": set(), "missing": set()}
|
||||
|
||||
mapped = set()
|
||||
direct = set()
|
||||
sku_list = list(skus)
|
||||
|
||||
with database.pool.acquire() as conn:
|
||||
with conn.cursor() as cur:
|
||||
# Check in batches of 500
|
||||
for i in range(0, len(sku_list), 500):
|
||||
batch = sku_list[i:i+500]
|
||||
placeholders = ",".join([f":s{j}" for j in range(len(batch))])
|
||||
params = {f"s{j}": sku for j, sku in enumerate(batch)}
|
||||
|
||||
# Check ARTICOLE_TERTI
|
||||
cur.execute(f"""
|
||||
SELECT DISTINCT sku FROM ARTICOLE_TERTI
|
||||
WHERE sku IN ({placeholders}) AND activ = 1
|
||||
""", params)
|
||||
for row in cur:
|
||||
mapped.add(row[0])
|
||||
|
||||
# Check NOM_ARTICOLE for remaining
|
||||
remaining = [s for s in batch if s not in mapped]
|
||||
if remaining:
|
||||
placeholders2 = ",".join([f":n{j}" for j in range(len(remaining))])
|
||||
params2 = {f"n{j}": sku for j, sku in enumerate(remaining)}
|
||||
cur.execute(f"""
|
||||
SELECT DISTINCT codmat FROM NOM_ARTICOLE
|
||||
WHERE codmat IN ({placeholders2})
|
||||
""", params2)
|
||||
for row in cur:
|
||||
direct.add(row[0])
|
||||
|
||||
missing = skus - mapped - direct
|
||||
|
||||
logger.info(f"SKU validation: {len(mapped)} mapped, {len(direct)} direct, {len(missing)} missing")
|
||||
return {"mapped": mapped, "direct": direct, "missing": missing}
|
||||
|
||||
def classify_orders(orders, validation_result):
|
||||
"""Classify orders as importable or skipped based on SKU validation.
|
||||
Returns: (importable_orders, skipped_orders)
|
||||
Each skipped entry is a tuple of (order, list_of_missing_skus).
|
||||
"""
|
||||
ok_skus = validation_result["mapped"] | validation_result["direct"]
|
||||
importable = []
|
||||
skipped = []
|
||||
|
||||
for order in orders:
|
||||
order_skus = {item.sku for item in order.items if item.sku}
|
||||
order_missing = order_skus - ok_skus
|
||||
|
||||
if order_missing:
|
||||
skipped.append((order, list(order_missing)))
|
||||
else:
|
||||
importable.append(order)
|
||||
|
||||
return importable, skipped
|
||||
214
api/app/static/css/style.css
Normal file
214
api/app/static/css/style.css
Normal file
@@ -0,0 +1,214 @@
|
||||
:root {
|
||||
--sidebar-width: 220px;
|
||||
--sidebar-bg: #1e293b;
|
||||
--sidebar-text: #94a3b8;
|
||||
--sidebar-active: #ffffff;
|
||||
--sidebar-hover-bg: #334155;
|
||||
--body-bg: #f1f5f9;
|
||||
--card-shadow: 0 1px 3px rgba(0,0,0,0.08);
|
||||
}
|
||||
|
||||
body {
|
||||
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif;
|
||||
background-color: var(--body-bg);
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
/* Sidebar */
|
||||
.sidebar {
|
||||
position: fixed;
|
||||
top: 0;
|
||||
left: 0;
|
||||
width: var(--sidebar-width);
|
||||
height: 100vh;
|
||||
background-color: var(--sidebar-bg);
|
||||
padding: 0;
|
||||
z-index: 1000;
|
||||
overflow-y: auto;
|
||||
transition: transform 0.3s ease;
|
||||
}
|
||||
|
||||
.sidebar-header {
|
||||
padding: 1.25rem 1rem;
|
||||
border-bottom: 1px solid #334155;
|
||||
}
|
||||
|
||||
.sidebar-header h5 {
|
||||
color: #fff;
|
||||
margin: 0;
|
||||
font-size: 1.1rem;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.sidebar .nav-link {
|
||||
color: var(--sidebar-text);
|
||||
padding: 0.65rem 1rem;
|
||||
font-size: 0.9rem;
|
||||
border-left: 3px solid transparent;
|
||||
transition: all 0.15s ease;
|
||||
}
|
||||
|
||||
.sidebar .nav-link:hover {
|
||||
color: var(--sidebar-active);
|
||||
background-color: var(--sidebar-hover-bg);
|
||||
}
|
||||
|
||||
.sidebar .nav-link.active {
|
||||
color: var(--sidebar-active);
|
||||
background-color: var(--sidebar-hover-bg);
|
||||
border-left-color: #3b82f6;
|
||||
}
|
||||
|
||||
.sidebar .nav-link i {
|
||||
margin-right: 0.5rem;
|
||||
width: 1.2rem;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.sidebar-footer {
|
||||
position: absolute;
|
||||
bottom: 0;
|
||||
padding: 0.75rem 1rem;
|
||||
border-top: 1px solid #334155;
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
/* Main content */
|
||||
.main-content {
|
||||
margin-left: var(--sidebar-width);
|
||||
padding: 1.5rem;
|
||||
min-height: 100vh;
|
||||
}
|
||||
|
||||
/* Sidebar toggle button for mobile */
|
||||
.sidebar-toggle {
|
||||
position: fixed;
|
||||
top: 0.5rem;
|
||||
left: 0.5rem;
|
||||
z-index: 1100;
|
||||
border-radius: 0.375rem;
|
||||
}
|
||||
|
||||
/* Cards */
|
||||
.card {
|
||||
border: none;
|
||||
box-shadow: var(--card-shadow);
|
||||
border-radius: 0.5rem;
|
||||
}
|
||||
|
||||
.card-header {
|
||||
background-color: #fff;
|
||||
border-bottom: 1px solid #e2e8f0;
|
||||
font-weight: 600;
|
||||
font-size: 0.9rem;
|
||||
}
|
||||
|
||||
/* Status badges */
|
||||
.badge-imported { background-color: #22c55e; }
|
||||
.badge-skipped { background-color: #eab308; color: #000; }
|
||||
.badge-error { background-color: #ef4444; }
|
||||
.badge-pending { background-color: #94a3b8; }
|
||||
.badge-ready { background-color: #3b82f6; }
|
||||
|
||||
/* Stat cards */
|
||||
.stat-card {
|
||||
text-align: center;
|
||||
padding: 1rem;
|
||||
}
|
||||
|
||||
.stat-card .stat-value {
|
||||
font-size: 1.75rem;
|
||||
font-weight: 700;
|
||||
line-height: 1.2;
|
||||
}
|
||||
|
||||
.stat-card .stat-label {
|
||||
font-size: 0.8rem;
|
||||
color: #64748b;
|
||||
text-transform: uppercase;
|
||||
letter-spacing: 0.05em;
|
||||
}
|
||||
|
||||
/* Tables */
|
||||
.table {
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
|
||||
.table th {
|
||||
font-weight: 600;
|
||||
color: #475569;
|
||||
border-top: none;
|
||||
}
|
||||
|
||||
/* Forms */
|
||||
.form-control:focus, .form-select:focus {
|
||||
border-color: #3b82f6;
|
||||
box-shadow: 0 0 0 0.2rem rgba(59, 130, 246, 0.15);
|
||||
}
|
||||
|
||||
/* Responsive */
|
||||
@media (max-width: 767.98px) {
|
||||
.sidebar {
|
||||
transform: translateX(-100%);
|
||||
}
|
||||
.sidebar.show {
|
||||
transform: translateX(0);
|
||||
}
|
||||
.main-content {
|
||||
margin-left: 0;
|
||||
}
|
||||
.sidebar-toggle {
|
||||
display: block !important;
|
||||
}
|
||||
}
|
||||
|
||||
/* Autocomplete dropdown */
|
||||
.autocomplete-dropdown {
|
||||
position: absolute;
|
||||
z-index: 1050;
|
||||
background: #fff;
|
||||
border: 1px solid #dee2e6;
|
||||
border-radius: 0.375rem;
|
||||
box-shadow: 0 4px 12px rgba(0,0,0,0.15);
|
||||
max-height: 300px;
|
||||
overflow-y: auto;
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
.autocomplete-item {
|
||||
padding: 0.5rem 0.75rem;
|
||||
cursor: pointer;
|
||||
font-size: 0.875rem;
|
||||
border-bottom: 1px solid #f1f5f9;
|
||||
}
|
||||
|
||||
.autocomplete-item:hover, .autocomplete-item.active {
|
||||
background-color: #f1f5f9;
|
||||
}
|
||||
|
||||
.autocomplete-item .codmat {
|
||||
font-weight: 600;
|
||||
color: #1e293b;
|
||||
}
|
||||
|
||||
.autocomplete-item .denumire {
|
||||
color: #64748b;
|
||||
font-size: 0.8rem;
|
||||
}
|
||||
|
||||
/* Pagination */
|
||||
.pagination .page-link {
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
|
||||
/* Loading spinner */
|
||||
.spinner-overlay {
|
||||
position: fixed;
|
||||
top: 0; left: 0; right: 0; bottom: 0;
|
||||
background: rgba(255,255,255,0.7);
|
||||
z-index: 9999;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
}
|
||||
215
api/app/static/js/dashboard.js
Normal file
215
api/app/static/js/dashboard.js
Normal file
@@ -0,0 +1,215 @@
|
||||
let refreshInterval = null;
|
||||
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
loadDashboard();
|
||||
// Auto-refresh every 10 seconds
|
||||
refreshInterval = setInterval(loadDashboard, 10000);
|
||||
});
|
||||
|
||||
async function loadDashboard() {
|
||||
await Promise.all([
|
||||
loadSyncStatus(),
|
||||
loadSyncHistory(),
|
||||
loadMissingSkus(),
|
||||
loadSchedulerStatus()
|
||||
]);
|
||||
}
|
||||
|
||||
async function loadSyncStatus() {
|
||||
try {
|
||||
const res = await fetch('/api/sync/status');
|
||||
const data = await res.json();
|
||||
|
||||
// Update stats
|
||||
const stats = data.stats || {};
|
||||
document.getElementById('stat-imported').textContent = stats.imported || 0;
|
||||
document.getElementById('stat-skipped').textContent = stats.skipped || 0;
|
||||
document.getElementById('stat-missing').textContent = stats.missing_skus || 0;
|
||||
|
||||
// Update sync status badge
|
||||
const badge = document.getElementById('syncStatusBadge');
|
||||
const status = data.status || 'idle';
|
||||
badge.textContent = status;
|
||||
badge.className = 'badge ' + (status === 'running' ? 'bg-primary' : status === 'failed' ? 'bg-danger' : 'bg-secondary');
|
||||
|
||||
// Show/hide start/stop buttons
|
||||
if (status === 'running') {
|
||||
document.getElementById('btnStartSync').classList.add('d-none');
|
||||
document.getElementById('btnStopSync').classList.remove('d-none');
|
||||
document.getElementById('syncProgressText').textContent = data.progress || 'Running...';
|
||||
} else {
|
||||
document.getElementById('btnStartSync').classList.remove('d-none');
|
||||
document.getElementById('btnStopSync').classList.add('d-none');
|
||||
|
||||
// Show last run info
|
||||
if (stats.last_run) {
|
||||
const lr = stats.last_run;
|
||||
const started = lr.started_at ? new Date(lr.started_at).toLocaleString('ro-RO') : '';
|
||||
document.getElementById('syncProgressText').textContent =
|
||||
`Ultimul: ${started} | ${lr.imported || 0} ok, ${lr.skipped || 0} skip, ${lr.errors || 0} err`;
|
||||
} else {
|
||||
document.getElementById('syncProgressText').textContent = '';
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('loadSyncStatus error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async function loadSyncHistory() {
|
||||
try {
|
||||
const res = await fetch('/api/sync/history?per_page=10');
|
||||
const data = await res.json();
|
||||
const tbody = document.getElementById('syncRunsBody');
|
||||
|
||||
if (!data.runs || data.runs.length === 0) {
|
||||
tbody.innerHTML = '<tr><td colspan="7" class="text-center text-muted py-3">Niciun sync run</td></tr>';
|
||||
return;
|
||||
}
|
||||
|
||||
tbody.innerHTML = data.runs.map(r => {
|
||||
const started = r.started_at ? new Date(r.started_at).toLocaleString('ro-RO', {day:'2-digit',month:'2-digit',hour:'2-digit',minute:'2-digit'}) : '-';
|
||||
let duration = '-';
|
||||
if (r.started_at && r.finished_at) {
|
||||
const sec = Math.round((new Date(r.finished_at) - new Date(r.started_at)) / 1000);
|
||||
duration = sec < 60 ? `${sec}s` : `${Math.floor(sec/60)}m ${sec%60}s`;
|
||||
}
|
||||
const statusClass = r.status === 'completed' ? 'bg-success' : r.status === 'running' ? 'bg-primary' : 'bg-danger';
|
||||
|
||||
return `<tr style="cursor:pointer" onclick="window.location='/sync/run/${esc(r.run_id)}'">
|
||||
<td>${started}</td>
|
||||
<td><span class="badge ${statusClass}">${esc(r.status)}</span></td>
|
||||
<td>${r.total_orders || 0}</td>
|
||||
<td class="text-success">${r.imported || 0}</td>
|
||||
<td class="text-warning">${r.skipped || 0}</td>
|
||||
<td class="text-danger">${r.errors || 0}</td>
|
||||
<td>${duration}</td>
|
||||
</tr>`;
|
||||
}).join('');
|
||||
} catch (err) {
|
||||
console.error('loadSyncHistory error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async function loadMissingSkus() {
|
||||
try {
|
||||
const res = await fetch('/api/validate/missing-skus');
|
||||
const data = await res.json();
|
||||
const tbody = document.getElementById('missingSkusBody');
|
||||
|
||||
// Update stat card
|
||||
document.getElementById('stat-missing').textContent = data.unresolved || 0;
|
||||
|
||||
const unresolved = (data.missing_skus || []).filter(s => !s.resolved);
|
||||
|
||||
if (unresolved.length === 0) {
|
||||
tbody.innerHTML = '<tr><td colspan="4" class="text-center text-muted py-3">Toate SKU-urile sunt mapate</td></tr>';
|
||||
return;
|
||||
}
|
||||
|
||||
tbody.innerHTML = unresolved.slice(0, 10).map(s => `
|
||||
<tr>
|
||||
<td><code>${esc(s.sku)}</code></td>
|
||||
<td>${esc(s.product_name || '-')}</td>
|
||||
<td><small>${s.first_seen ? new Date(s.first_seen).toLocaleDateString('ro-RO') : '-'}</small></td>
|
||||
<td>
|
||||
<a href="/mappings?sku=${encodeURIComponent(s.sku)}" class="btn btn-sm btn-outline-primary" title="Creeaza mapare">
|
||||
<i class="bi bi-plus-lg"></i>
|
||||
</a>
|
||||
</td>
|
||||
</tr>
|
||||
`).join('');
|
||||
} catch (err) {
|
||||
console.error('loadMissingSkus error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async function loadSchedulerStatus() {
|
||||
try {
|
||||
const res = await fetch('/api/sync/schedule');
|
||||
const data = await res.json();
|
||||
|
||||
document.getElementById('schedulerToggle').checked = data.enabled || false;
|
||||
if (data.interval_minutes) {
|
||||
document.getElementById('schedulerInterval').value = data.interval_minutes;
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('loadSchedulerStatus error:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async function startSync() {
|
||||
try {
|
||||
const res = await fetch('/api/sync/start', { method: 'POST' });
|
||||
const data = await res.json();
|
||||
if (data.error) {
|
||||
alert(data.error);
|
||||
}
|
||||
loadDashboard();
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function stopSync() {
|
||||
try {
|
||||
await fetch('/api/sync/stop', { method: 'POST' });
|
||||
loadDashboard();
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function scanOrders() {
|
||||
const btn = document.getElementById('btnScan');
|
||||
btn.disabled = true;
|
||||
btn.innerHTML = '<span class="spinner-border spinner-border-sm"></span> Scanning...';
|
||||
|
||||
try {
|
||||
const res = await fetch('/api/validate/scan', { method: 'POST' });
|
||||
const data = await res.json();
|
||||
|
||||
// Update pending/ready stats
|
||||
document.getElementById('stat-pending').textContent = data.total_orders || 0;
|
||||
document.getElementById('stat-ready').textContent = data.importable || 0;
|
||||
|
||||
let msg = `Scan complet: ${data.total_orders || 0} comenzi, ${data.importable || 0} ready, ${data.skipped || 0} skipped`;
|
||||
if (data.skus && data.skus.missing > 0) {
|
||||
msg += `, ${data.skus.missing} SKU-uri lipsa`;
|
||||
}
|
||||
alert(msg);
|
||||
loadDashboard();
|
||||
} catch (err) {
|
||||
alert('Eroare scan: ' + err.message);
|
||||
} finally {
|
||||
btn.disabled = false;
|
||||
btn.innerHTML = '<i class="bi bi-search"></i> Scan';
|
||||
}
|
||||
}
|
||||
|
||||
async function toggleScheduler() {
|
||||
const enabled = document.getElementById('schedulerToggle').checked;
|
||||
const interval = parseInt(document.getElementById('schedulerInterval').value) || 5;
|
||||
|
||||
try {
|
||||
await fetch('/api/sync/schedule', {
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ enabled, interval_minutes: interval })
|
||||
});
|
||||
} catch (err) {
|
||||
alert('Eroare scheduler: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function updateSchedulerInterval() {
|
||||
const enabled = document.getElementById('schedulerToggle').checked;
|
||||
if (enabled) {
|
||||
await toggleScheduler();
|
||||
}
|
||||
}
|
||||
|
||||
function esc(s) {
|
||||
if (s == null) return '';
|
||||
return String(s).replace(/&/g, '&').replace(/</g, '<').replace(/>/g, '>').replace(/"/g, '"').replace(/'/g, ''');
|
||||
}
|
||||
299
api/app/static/js/mappings.js
Normal file
299
api/app/static/js/mappings.js
Normal file
@@ -0,0 +1,299 @@
|
||||
let currentPage = 1;
|
||||
let currentSearch = '';
|
||||
let searchTimeout = null;
|
||||
|
||||
// Load on page ready
|
||||
document.addEventListener('DOMContentLoaded', loadMappings);
|
||||
|
||||
function debounceSearch() {
|
||||
clearTimeout(searchTimeout);
|
||||
searchTimeout = setTimeout(() => {
|
||||
currentSearch = document.getElementById('searchInput').value;
|
||||
currentPage = 1;
|
||||
loadMappings();
|
||||
}, 300);
|
||||
}
|
||||
|
||||
async function loadMappings() {
|
||||
const params = new URLSearchParams({
|
||||
search: currentSearch,
|
||||
page: currentPage,
|
||||
per_page: 50
|
||||
});
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/mappings?${params}`);
|
||||
const data = await res.json();
|
||||
renderTable(data.mappings);
|
||||
renderPagination(data);
|
||||
} catch (err) {
|
||||
document.getElementById('mappingsBody').innerHTML =
|
||||
`<tr><td colspan="7" class="text-center text-danger">Eroare: ${err.message}</td></tr>`;
|
||||
}
|
||||
}
|
||||
|
||||
function renderTable(mappings) {
|
||||
const tbody = document.getElementById('mappingsBody');
|
||||
|
||||
if (!mappings || mappings.length === 0) {
|
||||
tbody.innerHTML = '<tr><td colspan="7" class="text-center text-muted py-4">Nu exista mapari</td></tr>';
|
||||
return;
|
||||
}
|
||||
|
||||
tbody.innerHTML = mappings.map(m => `
|
||||
<tr>
|
||||
<td><strong>${esc(m.sku)}</strong></td>
|
||||
<td><code>${esc(m.codmat)}</code></td>
|
||||
<td>${esc(m.denumire || '-')}</td>
|
||||
<td class="editable" onclick="editCell(this, '${esc(m.sku)}', '${esc(m.codmat)}', 'cantitate_roa', ${m.cantitate_roa})">${m.cantitate_roa}</td>
|
||||
<td class="editable" onclick="editCell(this, '${esc(m.sku)}', '${esc(m.codmat)}', 'procent_pret', ${m.procent_pret})">${m.procent_pret}%</td>
|
||||
<td>
|
||||
<span class="badge ${m.activ ? 'bg-success' : 'bg-secondary'}" style="cursor:pointer"
|
||||
onclick="toggleActive('${esc(m.sku)}', '${esc(m.codmat)}', ${m.activ})">
|
||||
${m.activ ? 'Activ' : 'Inactiv'}
|
||||
</span>
|
||||
</td>
|
||||
<td>
|
||||
<button class="btn btn-sm btn-outline-danger" onclick="deleteMappingConfirm('${esc(m.sku)}', '${esc(m.codmat)}')" title="Dezactiveaza">
|
||||
<i class="bi bi-trash"></i>
|
||||
</button>
|
||||
</td>
|
||||
</tr>
|
||||
`).join('');
|
||||
}
|
||||
|
||||
function renderPagination(data) {
|
||||
const info = document.getElementById('pageInfo');
|
||||
info.textContent = `${data.total} mapari | Pagina ${data.page} din ${data.pages || 1}`;
|
||||
|
||||
const ul = document.getElementById('pagination');
|
||||
if (data.pages <= 1) { ul.innerHTML = ''; return; }
|
||||
|
||||
let html = '';
|
||||
// Previous
|
||||
html += `<li class="page-item ${data.page <= 1 ? 'disabled' : ''}">
|
||||
<a class="page-link" href="#" onclick="goPage(${data.page - 1}); return false;">«</a></li>`;
|
||||
|
||||
// Pages (show max 7)
|
||||
let start = Math.max(1, data.page - 3);
|
||||
let end = Math.min(data.pages, start + 6);
|
||||
start = Math.max(1, end - 6);
|
||||
|
||||
for (let i = start; i <= end; i++) {
|
||||
html += `<li class="page-item ${i === data.page ? 'active' : ''}">
|
||||
<a class="page-link" href="#" onclick="goPage(${i}); return false;">${i}</a></li>`;
|
||||
}
|
||||
|
||||
// Next
|
||||
html += `<li class="page-item ${data.page >= data.pages ? 'disabled' : ''}">
|
||||
<a class="page-link" href="#" onclick="goPage(${data.page + 1}); return false;">»</a></li>`;
|
||||
|
||||
ul.innerHTML = html;
|
||||
}
|
||||
|
||||
function goPage(p) {
|
||||
currentPage = p;
|
||||
loadMappings();
|
||||
}
|
||||
|
||||
// Autocomplete for CODMAT
|
||||
let acTimeout = null;
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
const input = document.getElementById('inputCodmat');
|
||||
if (!input) return;
|
||||
|
||||
input.addEventListener('input', () => {
|
||||
clearTimeout(acTimeout);
|
||||
acTimeout = setTimeout(() => autocomplete(input.value), 250);
|
||||
});
|
||||
|
||||
input.addEventListener('blur', () => {
|
||||
setTimeout(() => document.getElementById('autocompleteDropdown').classList.add('d-none'), 200);
|
||||
});
|
||||
});
|
||||
|
||||
async function autocomplete(q) {
|
||||
const dropdown = document.getElementById('autocompleteDropdown');
|
||||
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
|
||||
const data = await res.json();
|
||||
|
||||
if (!data.results || data.results.length === 0) {
|
||||
dropdown.classList.add('d-none');
|
||||
return;
|
||||
}
|
||||
|
||||
dropdown.innerHTML = data.results.map(r => `
|
||||
<div class="autocomplete-item" onmousedown="selectArticle('${esc(r.codmat)}', '${esc(r.denumire)}')">
|
||||
<span class="codmat">${esc(r.codmat)}</span>
|
||||
<br><span class="denumire">${esc(r.denumire)}</span>
|
||||
</div>
|
||||
`).join('');
|
||||
dropdown.classList.remove('d-none');
|
||||
} catch (err) {
|
||||
dropdown.classList.add('d-none');
|
||||
}
|
||||
}
|
||||
|
||||
function selectArticle(codmat, denumire) {
|
||||
document.getElementById('inputCodmat').value = codmat;
|
||||
document.getElementById('selectedArticle').textContent = denumire;
|
||||
document.getElementById('autocompleteDropdown').classList.add('d-none');
|
||||
}
|
||||
|
||||
// Save mapping (create)
|
||||
async function saveMapping() {
|
||||
const sku = document.getElementById('inputSku').value.trim();
|
||||
const codmat = document.getElementById('inputCodmat').value.trim();
|
||||
const cantitate = parseFloat(document.getElementById('inputCantitate').value) || 1;
|
||||
const procent = parseFloat(document.getElementById('inputProcent').value) || 100;
|
||||
|
||||
if (!sku || !codmat) { alert('SKU si CODMAT sunt obligatorii'); return; }
|
||||
|
||||
try {
|
||||
const res = await fetch('/api/mappings', {
|
||||
method: 'POST',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify({ sku, codmat, cantitate_roa: cantitate, procent_pret: procent })
|
||||
});
|
||||
const data = await res.json();
|
||||
|
||||
if (data.success) {
|
||||
bootstrap.Modal.getInstance(document.getElementById('addModal')).hide();
|
||||
clearForm();
|
||||
loadMappings();
|
||||
} else {
|
||||
alert('Eroare: ' + (data.error || 'Unknown'));
|
||||
}
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
function clearForm() {
|
||||
document.getElementById('inputSku').value = '';
|
||||
document.getElementById('inputCodmat').value = '';
|
||||
document.getElementById('inputCantitate').value = '1';
|
||||
document.getElementById('inputProcent').value = '100';
|
||||
document.getElementById('selectedArticle').textContent = '';
|
||||
}
|
||||
|
||||
// Inline edit
|
||||
function editCell(td, sku, codmat, field, currentValue) {
|
||||
if (td.querySelector('input')) return; // Already editing
|
||||
|
||||
const input = document.createElement('input');
|
||||
input.type = 'number';
|
||||
input.className = 'form-control form-control-sm';
|
||||
input.value = currentValue;
|
||||
input.step = field === 'cantitate_roa' ? '0.001' : '0.01';
|
||||
input.style.width = '80px';
|
||||
|
||||
const originalText = td.textContent;
|
||||
td.textContent = '';
|
||||
td.appendChild(input);
|
||||
input.focus();
|
||||
input.select();
|
||||
|
||||
const save = async () => {
|
||||
const newValue = parseFloat(input.value);
|
||||
if (isNaN(newValue) || newValue === currentValue) {
|
||||
td.textContent = originalText;
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
const body = {};
|
||||
body[field] = newValue;
|
||||
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'PUT',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify(body)
|
||||
});
|
||||
const data = await res.json();
|
||||
if (data.success) {
|
||||
loadMappings();
|
||||
} else {
|
||||
td.textContent = originalText;
|
||||
alert('Eroare: ' + (data.error || 'Update failed'));
|
||||
}
|
||||
} catch (err) {
|
||||
td.textContent = originalText;
|
||||
}
|
||||
};
|
||||
|
||||
input.addEventListener('blur', save);
|
||||
input.addEventListener('keydown', (e) => {
|
||||
if (e.key === 'Enter') save();
|
||||
if (e.key === 'Escape') { td.textContent = originalText; }
|
||||
});
|
||||
}
|
||||
|
||||
// Toggle active
|
||||
async function toggleActive(sku, codmat, currentActive) {
|
||||
try {
|
||||
const res = await fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'PUT',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify({ activ: currentActive ? 0 : 1 })
|
||||
});
|
||||
const data = await res.json();
|
||||
if (data.success) loadMappings();
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
// Delete (soft)
|
||||
function deleteMappingConfirm(sku, codmat) {
|
||||
if (confirm(`Dezactivezi maparea ${sku} -> ${codmat}?`)) {
|
||||
fetch(`/api/mappings/${encodeURIComponent(sku)}/${encodeURIComponent(codmat)}`, {
|
||||
method: 'DELETE'
|
||||
}).then(r => r.json()).then(d => {
|
||||
if (d.success) loadMappings();
|
||||
else alert('Eroare: ' + (d.error || 'Delete failed'));
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// CSV import
|
||||
async function importCsv() {
|
||||
const fileInput = document.getElementById('csvFile');
|
||||
if (!fileInput.files.length) { alert('Selecteaza un fisier CSV'); return; }
|
||||
|
||||
const formData = new FormData();
|
||||
formData.append('file', fileInput.files[0]);
|
||||
|
||||
try {
|
||||
const res = await fetch('/api/mappings/import-csv', {
|
||||
method: 'POST',
|
||||
body: formData
|
||||
});
|
||||
const data = await res.json();
|
||||
|
||||
let html = `<div class="alert alert-success">Procesate: ${data.processed}</div>`;
|
||||
if (data.errors && data.errors.length > 0) {
|
||||
html += `<div class="alert alert-warning">Erori: <ul>${data.errors.map(e => `<li>${esc(e)}</li>`).join('')}</ul></div>`;
|
||||
}
|
||||
document.getElementById('importResult').innerHTML = html;
|
||||
loadMappings();
|
||||
} catch (err) {
|
||||
document.getElementById('importResult').innerHTML = `<div class="alert alert-danger">${err.message}</div>`;
|
||||
}
|
||||
}
|
||||
|
||||
function exportCsv() {
|
||||
window.location.href = '/api/mappings/export-csv';
|
||||
}
|
||||
|
||||
function downloadTemplate() {
|
||||
window.location.href = '/api/mappings/csv-template';
|
||||
}
|
||||
|
||||
// Escape HTML
|
||||
function esc(s) {
|
||||
if (s == null) return '';
|
||||
return String(s).replace(/&/g, '&').replace(/</g, '<').replace(/>/g, '>').replace(/"/g, '"').replace(/'/g, ''');
|
||||
}
|
||||
57
api/app/templates/base.html
Normal file
57
api/app/templates/base.html
Normal file
@@ -0,0 +1,57 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="ro">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>{% block title %}GoMag Import Manager{% endblock %}</title>
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" rel="stylesheet">
|
||||
<link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.2/font/bootstrap-icons.css" rel="stylesheet">
|
||||
<link href="/static/css/style.css" rel="stylesheet">
|
||||
</head>
|
||||
<body>
|
||||
<!-- Sidebar -->
|
||||
<nav id="sidebar" class="sidebar">
|
||||
<div class="sidebar-header">
|
||||
<h5><i class="bi bi-box-seam"></i> GoMag Import</h5>
|
||||
</div>
|
||||
<ul class="nav flex-column">
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% block nav_dashboard %}{% endblock %}" href="/">
|
||||
<i class="bi bi-speedometer2"></i> Dashboard
|
||||
</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% block nav_sync %}{% endblock %}" href="/sync">
|
||||
<i class="bi bi-arrow-repeat"></i> Import Comenzi
|
||||
</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% block nav_mappings %}{% endblock %}" href="/mappings">
|
||||
<i class="bi bi-link-45deg"></i> Mapari SKU
|
||||
</a>
|
||||
</li>
|
||||
<li class="nav-item">
|
||||
<a class="nav-link {% block nav_missing %}{% endblock %}" href="/missing-skus">
|
||||
<i class="bi bi-exclamation-triangle"></i> SKU-uri Lipsa
|
||||
</a>
|
||||
</li>
|
||||
</ul>
|
||||
<div class="sidebar-footer">
|
||||
<small class="text-muted">v1.0</small>
|
||||
</div>
|
||||
</nav>
|
||||
|
||||
<!-- Mobile toggle -->
|
||||
<button class="btn btn-dark d-md-none sidebar-toggle" type="button" onclick="document.getElementById('sidebar').classList.toggle('show')">
|
||||
<i class="bi bi-list"></i>
|
||||
</button>
|
||||
|
||||
<!-- Main content -->
|
||||
<main class="main-content">
|
||||
{% block content %}{% endblock %}
|
||||
</main>
|
||||
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
|
||||
{% block scripts %}{% endblock %}
|
||||
</body>
|
||||
</html>
|
||||
135
api/app/templates/dashboard.html
Normal file
135
api/app/templates/dashboard.html
Normal file
@@ -0,0 +1,135 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Dashboard - GoMag Import{% endblock %}
|
||||
{% block nav_dashboard %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<h4 class="mb-4">Dashboard</h4>
|
||||
|
||||
<!-- Stat cards row -->
|
||||
<div class="row g-3 mb-4" id="statsRow">
|
||||
<div class="col">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-secondary" id="stat-pending">-</div>
|
||||
<div class="stat-label">In Asteptare</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-primary" id="stat-ready">-</div>
|
||||
<div class="stat-label">Ready</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-success" id="stat-imported">-</div>
|
||||
<div class="stat-label">Imported</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-warning" id="stat-skipped">-</div>
|
||||
<div class="stat-label">Skipped</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-danger" id="stat-missing">-</div>
|
||||
<div class="stat-label">SKU Lipsa</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Sync Control -->
|
||||
<div class="card mb-4">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<span>Sync Control</span>
|
||||
<span class="badge bg-secondary" id="syncStatusBadge">idle</span>
|
||||
</div>
|
||||
<div class="card-body">
|
||||
<div class="row align-items-center">
|
||||
<div class="col-auto">
|
||||
<button class="btn btn-success btn-sm" id="btnStartSync" onclick="startSync()">
|
||||
<i class="bi bi-play-fill"></i> Start Sync
|
||||
</button>
|
||||
<button class="btn btn-outline-secondary btn-sm" id="btnScan" onclick="scanOrders()">
|
||||
<i class="bi bi-search"></i> Scan
|
||||
</button>
|
||||
<button class="btn btn-danger btn-sm d-none" id="btnStopSync" onclick="stopSync()">
|
||||
<i class="bi bi-stop-fill"></i> Stop
|
||||
</button>
|
||||
</div>
|
||||
<div class="col-auto">
|
||||
<div class="form-check form-switch d-inline-block me-2">
|
||||
<input class="form-check-input" type="checkbox" id="schedulerToggle" onchange="toggleScheduler()">
|
||||
<label class="form-check-label" for="schedulerToggle">Scheduler</label>
|
||||
</div>
|
||||
<select class="form-select form-select-sm d-inline-block" style="width:auto" id="schedulerInterval" onchange="updateSchedulerInterval()">
|
||||
<option value="1">1 min</option>
|
||||
<option value="5" selected>5 min</option>
|
||||
<option value="10">10 min</option>
|
||||
<option value="15">15 min</option>
|
||||
<option value="30">30 min</option>
|
||||
<option value="60">60 min</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="col">
|
||||
<small class="text-muted" id="syncProgressText"></small>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Recent Sync Runs -->
|
||||
<div class="card mb-4">
|
||||
<div class="card-header">Ultimele Sync Runs</div>
|
||||
<div class="card-body p-0">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Data</th>
|
||||
<th>Status</th>
|
||||
<th>Total</th>
|
||||
<th>OK</th>
|
||||
<th>Skip</th>
|
||||
<th>Err</th>
|
||||
<th>Durata</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="syncRunsBody">
|
||||
<tr><td colspan="7" class="text-center text-muted py-3">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Missing SKUs (quick resolve) -->
|
||||
<div class="card">
|
||||
<div class="card-header d-flex justify-content-between align-items-center">
|
||||
<span>SKU-uri Lipsa</span>
|
||||
<a href="/missing-skus" class="btn btn-sm btn-outline-primary">Vezi toate</a>
|
||||
</div>
|
||||
<div class="card-body p-0">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>SKU</th>
|
||||
<th>Produs</th>
|
||||
<th>Data</th>
|
||||
<th>Actiune</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="missingSkusBody">
|
||||
<tr><td colspan="4" class="text-center text-muted py-3">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script src="/static/js/dashboard.js"></script>
|
||||
{% endblock %}
|
||||
118
api/app/templates/mappings.html
Normal file
118
api/app/templates/mappings.html
Normal file
@@ -0,0 +1,118 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Mapari SKU - GoMag Import{% endblock %}
|
||||
{% block nav_mappings %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||
<h4 class="mb-0">Mapari SKU</h4>
|
||||
<div>
|
||||
<button class="btn btn-sm btn-outline-secondary" onclick="downloadTemplate()"><i class="bi bi-file-earmark-arrow-down"></i> Template CSV</button>
|
||||
<button class="btn btn-sm btn-outline-secondary" onclick="exportCsv()"><i class="bi bi-download"></i> Export CSV</button>
|
||||
<button class="btn btn-sm btn-outline-primary" data-bs-toggle="modal" data-bs-target="#importModal"><i class="bi bi-upload"></i> Import CSV</button>
|
||||
<button class="btn btn-sm btn-primary" data-bs-toggle="modal" data-bs-target="#addModal"><i class="bi bi-plus-lg"></i> Adauga Mapare</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Search -->
|
||||
<div class="card mb-3">
|
||||
<div class="card-body py-2">
|
||||
<div class="input-group">
|
||||
<span class="input-group-text"><i class="bi bi-search"></i></span>
|
||||
<input type="text" class="form-control" id="searchInput" placeholder="Cauta SKU, CODMAT sau denumire..." oninput="debounceSearch()">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Table -->
|
||||
<div class="card">
|
||||
<div class="card-body p-0">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>SKU</th>
|
||||
<th>CODMAT</th>
|
||||
<th>Denumire</th>
|
||||
<th>Cantitate ROA</th>
|
||||
<th>Procent Pret</th>
|
||||
<th>Activ</th>
|
||||
<th style="width:100px">Actiuni</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="mappingsBody">
|
||||
<tr><td colspan="7" class="text-center text-muted py-4">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-footer d-flex justify-content-between align-items-center">
|
||||
<small class="text-muted" id="pageInfo"></small>
|
||||
<nav>
|
||||
<ul class="pagination pagination-sm mb-0" id="pagination"></ul>
|
||||
</nav>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Add/Edit Modal -->
|
||||
<div class="modal fade" id="addModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title" id="addModalTitle">Adauga Mapare</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<div class="mb-3">
|
||||
<label class="form-label">SKU</label>
|
||||
<input type="text" class="form-control" id="inputSku" placeholder="Ex: 8714858124284">
|
||||
</div>
|
||||
<div class="mb-3 position-relative">
|
||||
<label class="form-label">CODMAT (Articol ROA)</label>
|
||||
<input type="text" class="form-control" id="inputCodmat" placeholder="Cauta codmat sau denumire..." autocomplete="off">
|
||||
<div class="autocomplete-dropdown d-none" id="autocompleteDropdown"></div>
|
||||
<small class="text-muted" id="selectedArticle"></small>
|
||||
</div>
|
||||
<div class="row">
|
||||
<div class="col-6 mb-3">
|
||||
<label class="form-label">Cantitate ROA</label>
|
||||
<input type="number" class="form-control" id="inputCantitate" value="1" step="0.001" min="0.001">
|
||||
</div>
|
||||
<div class="col-6 mb-3">
|
||||
<label class="form-label">Procent Pret (%)</label>
|
||||
<input type="number" class="form-control" id="inputProcent" value="100" step="0.01" min="0" max="100">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
|
||||
<button type="button" class="btn btn-primary" onclick="saveMapping()">Salveaza</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Import CSV Modal -->
|
||||
<div class="modal fade" id="importModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Import CSV</h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<p class="text-muted small">Format CSV: sku, codmat, cantitate_roa, procent_pret</p>
|
||||
<input type="file" class="form-control" id="csvFile" accept=".csv">
|
||||
<div id="importResult" class="mt-3"></div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Inchide</button>
|
||||
<button type="button" class="btn btn-primary" onclick="importCsv()">Import</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script src="/static/js/mappings.js"></script>
|
||||
{% endblock %}
|
||||
223
api/app/templates/missing_skus.html
Normal file
223
api/app/templates/missing_skus.html
Normal file
@@ -0,0 +1,223 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}SKU-uri Lipsa - GoMag Import{% endblock %}
|
||||
{% block nav_missing %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||
<h4 class="mb-0">SKU-uri Lipsa</h4>
|
||||
<div>
|
||||
<button class="btn btn-sm btn-outline-secondary" onclick="exportMissingCsv()">
|
||||
<i class="bi bi-download"></i> Export CSV
|
||||
</button>
|
||||
<button class="btn btn-sm btn-outline-primary" onclick="scanForMissing()">
|
||||
<i class="bi bi-search"></i> Re-Scan
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="card">
|
||||
<div class="card-body p-0">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>SKU</th>
|
||||
<th>Produs</th>
|
||||
<th>First Seen</th>
|
||||
<th>Status</th>
|
||||
<th>Actiune</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="missingBody">
|
||||
<tr><td colspan="5" class="text-center text-muted py-4">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card-footer">
|
||||
<small class="text-muted" id="missingInfo"></small>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Map SKU Modal -->
|
||||
<div class="modal fade" id="mapModal" tabindex="-1">
|
||||
<div class="modal-dialog">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<h5 class="modal-title">Mapeaza SKU: <code id="mapSku"></code></h5>
|
||||
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
<div class="mb-3 position-relative">
|
||||
<label class="form-label">CODMAT (Articol ROA)</label>
|
||||
<input type="text" class="form-control" id="mapCodmat" placeholder="Cauta codmat sau denumire..." autocomplete="off">
|
||||
<div class="autocomplete-dropdown d-none" id="mapAutocomplete"></div>
|
||||
<small class="text-muted" id="mapSelectedArticle"></small>
|
||||
</div>
|
||||
<div class="row">
|
||||
<div class="col-6 mb-3">
|
||||
<label class="form-label">Cantitate ROA</label>
|
||||
<input type="number" class="form-control" id="mapCantitate" value="1" step="0.001" min="0.001">
|
||||
</div>
|
||||
<div class="col-6 mb-3">
|
||||
<label class="form-label">Procent Pret (%)</label>
|
||||
<input type="number" class="form-control" id="mapProcent" value="100" step="0.01" min="0" max="100">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Anuleaza</button>
|
||||
<button type="button" class="btn btn-primary" onclick="saveQuickMap()">Salveaza</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
let currentMapSku = '';
|
||||
let acTimeout = null;
|
||||
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
loadMissing();
|
||||
|
||||
const input = document.getElementById('mapCodmat');
|
||||
input.addEventListener('input', () => {
|
||||
clearTimeout(acTimeout);
|
||||
acTimeout = setTimeout(() => autocompleteMap(input.value), 250);
|
||||
});
|
||||
input.addEventListener('blur', () => {
|
||||
setTimeout(() => document.getElementById('mapAutocomplete').classList.add('d-none'), 200);
|
||||
});
|
||||
});
|
||||
|
||||
async function loadMissing() {
|
||||
try {
|
||||
const res = await fetch('/api/validate/missing-skus');
|
||||
const data = await res.json();
|
||||
const tbody = document.getElementById('missingBody');
|
||||
|
||||
document.getElementById('missingInfo').textContent =
|
||||
`Total: ${data.total || 0} | Nerezolvate: ${data.unresolved || 0}`;
|
||||
|
||||
const skus = data.missing_skus || [];
|
||||
if (skus.length === 0) {
|
||||
tbody.innerHTML = '<tr><td colspan="5" class="text-center text-muted py-4">Toate SKU-urile sunt mapate!</td></tr>';
|
||||
return;
|
||||
}
|
||||
|
||||
tbody.innerHTML = skus.map(s => {
|
||||
const statusBadge = s.resolved
|
||||
? '<span class="badge bg-success">Rezolvat</span>'
|
||||
: '<span class="badge bg-warning text-dark">Nerezolvat</span>';
|
||||
|
||||
return `<tr class="${s.resolved ? 'table-light' : ''}">
|
||||
<td><code>${esc(s.sku)}</code></td>
|
||||
<td>${esc(s.product_name || '-')}</td>
|
||||
<td><small>${s.first_seen ? new Date(s.first_seen).toLocaleDateString('ro-RO') : '-'}</small></td>
|
||||
<td>${statusBadge}</td>
|
||||
<td>
|
||||
${!s.resolved ? `<button class="btn btn-sm btn-outline-primary" onclick="openMapModal('${esc(s.sku)}')">
|
||||
<i class="bi bi-link-45deg"></i> Mapeaza
|
||||
</button>` : `<small class="text-muted">${s.resolved_at ? new Date(s.resolved_at).toLocaleDateString('ro-RO') : ''}</small>`}
|
||||
</td>
|
||||
</tr>`;
|
||||
}).join('');
|
||||
} catch (err) {
|
||||
document.getElementById('missingBody').innerHTML =
|
||||
`<tr><td colspan="5" class="text-center text-danger">${err.message}</td></tr>`;
|
||||
}
|
||||
}
|
||||
|
||||
function openMapModal(sku) {
|
||||
currentMapSku = sku;
|
||||
document.getElementById('mapSku').textContent = sku;
|
||||
document.getElementById('mapCodmat').value = '';
|
||||
document.getElementById('mapCantitate').value = '1';
|
||||
document.getElementById('mapProcent').value = '100';
|
||||
document.getElementById('mapSelectedArticle').textContent = '';
|
||||
new bootstrap.Modal(document.getElementById('mapModal')).show();
|
||||
}
|
||||
|
||||
async function autocompleteMap(q) {
|
||||
const dropdown = document.getElementById('mapAutocomplete');
|
||||
if (q.length < 2) { dropdown.classList.add('d-none'); return; }
|
||||
|
||||
try {
|
||||
const res = await fetch(`/api/articles/search?q=${encodeURIComponent(q)}`);
|
||||
const data = await res.json();
|
||||
|
||||
if (!data.results || data.results.length === 0) {
|
||||
dropdown.classList.add('d-none');
|
||||
return;
|
||||
}
|
||||
|
||||
dropdown.innerHTML = data.results.map(r => `
|
||||
<div class="autocomplete-item" onmousedown="selectMapArticle('${esc(r.codmat)}', '${esc(r.denumire)}')">
|
||||
<span class="codmat">${esc(r.codmat)}</span>
|
||||
<br><span class="denumire">${esc(r.denumire)}</span>
|
||||
</div>
|
||||
`).join('');
|
||||
dropdown.classList.remove('d-none');
|
||||
} catch (err) {
|
||||
dropdown.classList.add('d-none');
|
||||
}
|
||||
}
|
||||
|
||||
function selectMapArticle(codmat, denumire) {
|
||||
document.getElementById('mapCodmat').value = codmat;
|
||||
document.getElementById('mapSelectedArticle').textContent = denumire;
|
||||
document.getElementById('mapAutocomplete').classList.add('d-none');
|
||||
}
|
||||
|
||||
async function saveQuickMap() {
|
||||
const codmat = document.getElementById('mapCodmat').value.trim();
|
||||
const cantitate = parseFloat(document.getElementById('mapCantitate').value) || 1;
|
||||
const procent = parseFloat(document.getElementById('mapProcent').value) || 100;
|
||||
|
||||
if (!codmat) { alert('Selecteaza un CODMAT'); return; }
|
||||
|
||||
try {
|
||||
const res = await fetch('/api/mappings', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
sku: currentMapSku,
|
||||
codmat: codmat,
|
||||
cantitate_roa: cantitate,
|
||||
procent_pret: procent
|
||||
})
|
||||
});
|
||||
const data = await res.json();
|
||||
|
||||
if (data.success) {
|
||||
bootstrap.Modal.getInstance(document.getElementById('mapModal')).hide();
|
||||
loadMissing();
|
||||
} else {
|
||||
alert('Eroare: ' + (data.error || 'Unknown'));
|
||||
}
|
||||
} catch (err) {
|
||||
alert('Eroare: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
async function scanForMissing() {
|
||||
try {
|
||||
await fetch('/api/validate/scan', { method: 'POST' });
|
||||
loadMissing();
|
||||
} catch (err) {
|
||||
alert('Eroare scan: ' + err.message);
|
||||
}
|
||||
}
|
||||
|
||||
function exportMissingCsv() {
|
||||
window.location.href = '/api/validate/missing-skus-csv';
|
||||
}
|
||||
|
||||
function esc(s) {
|
||||
if (s == null) return '';
|
||||
return String(s).replace(/&/g, '&').replace(/</g, '<').replace(/>/g, '>').replace(/"/g, '"').replace(/'/g, ''');
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
158
api/app/templates/sync_detail.html
Normal file
158
api/app/templates/sync_detail.html
Normal file
@@ -0,0 +1,158 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Sync Run - GoMag Import{% endblock %}
|
||||
{% block nav_sync %}active{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="d-flex justify-content-between align-items-center mb-4">
|
||||
<div>
|
||||
<a href="/" class="text-decoration-none text-muted"><i class="bi bi-arrow-left"></i> Dashboard</a>
|
||||
<h4 class="mb-0 mt-1">Sync Run <small class="text-muted" id="runId">{{ run_id }}</small></h4>
|
||||
</div>
|
||||
<span class="badge bg-secondary fs-6" id="runStatusBadge">-</span>
|
||||
</div>
|
||||
|
||||
<!-- Run summary -->
|
||||
<div class="row g-3 mb-4">
|
||||
<div class="col-md-3">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value" id="runTotal">-</div>
|
||||
<div class="stat-label">Total Comenzi</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-success" id="runImported">-</div>
|
||||
<div class="stat-label">Imported</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-warning" id="runSkipped">-</div>
|
||||
<div class="stat-label">Skipped</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="col-md-3">
|
||||
<div class="card stat-card">
|
||||
<div class="stat-value text-danger" id="runErrors">-</div>
|
||||
<div class="stat-label">Errors</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="mb-3">
|
||||
<small class="text-muted" id="runTiming"></small>
|
||||
</div>
|
||||
|
||||
<!-- Orders table -->
|
||||
<div class="card">
|
||||
<div class="card-header">Comenzi</div>
|
||||
<div class="card-body p-0">
|
||||
<div class="table-responsive">
|
||||
<table class="table table-hover mb-0">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>#</th>
|
||||
<th>Nr Comanda</th>
|
||||
<th>Data</th>
|
||||
<th>Client</th>
|
||||
<th>Articole</th>
|
||||
<th>Status</th>
|
||||
<th>Detalii</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="ordersBody">
|
||||
<tr><td colspan="7" class="text-center text-muted py-4">Se incarca...</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
{% endblock %}
|
||||
|
||||
{% block scripts %}
|
||||
<script>
|
||||
const RUN_ID = '{{ run_id }}';
|
||||
|
||||
document.addEventListener('DOMContentLoaded', loadRunDetail);
|
||||
|
||||
async function loadRunDetail() {
|
||||
try {
|
||||
const res = await fetch(`/api/sync/run/${RUN_ID}`);
|
||||
const data = await res.json();
|
||||
|
||||
if (data.error) {
|
||||
document.getElementById('ordersBody').innerHTML =
|
||||
`<tr><td colspan="7" class="text-center text-danger">${data.error}</td></tr>`;
|
||||
return;
|
||||
}
|
||||
|
||||
const run = data.run;
|
||||
|
||||
// Update summary
|
||||
document.getElementById('runTotal').textContent = run.total_orders || 0;
|
||||
document.getElementById('runImported').textContent = run.imported || 0;
|
||||
document.getElementById('runSkipped').textContent = run.skipped || 0;
|
||||
document.getElementById('runErrors').textContent = run.errors || 0;
|
||||
|
||||
const badge = document.getElementById('runStatusBadge');
|
||||
badge.textContent = run.status;
|
||||
badge.className = 'badge fs-6 ' + (run.status === 'completed' ? 'bg-success' : run.status === 'running' ? 'bg-primary' : 'bg-danger');
|
||||
|
||||
// Timing
|
||||
if (run.started_at) {
|
||||
let timing = 'Start: ' + new Date(run.started_at).toLocaleString('ro-RO');
|
||||
if (run.finished_at) {
|
||||
const sec = Math.round((new Date(run.finished_at) - new Date(run.started_at)) / 1000);
|
||||
timing += ` | Durata: ${sec < 60 ? sec + 's' : Math.floor(sec/60) + 'm ' + (sec%60) + 's'}`;
|
||||
}
|
||||
document.getElementById('runTiming').textContent = timing;
|
||||
}
|
||||
|
||||
// Orders table
|
||||
const orders = data.orders || [];
|
||||
if (orders.length === 0) {
|
||||
document.getElementById('ordersBody').innerHTML =
|
||||
'<tr><td colspan="7" class="text-center text-muted py-4">Nicio comanda</td></tr>';
|
||||
return;
|
||||
}
|
||||
|
||||
document.getElementById('ordersBody').innerHTML = orders.map((o, i) => {
|
||||
const statusClass = o.status === 'IMPORTED' ? 'badge-imported' : o.status === 'SKIPPED' ? 'badge-skipped' : 'badge-error';
|
||||
|
||||
let details = '';
|
||||
if (o.status === 'IMPORTED' && o.id_comanda) {
|
||||
details = `<small class="text-success">ID: ${o.id_comanda}</small>`;
|
||||
} else if (o.status === 'SKIPPED' && o.missing_skus) {
|
||||
try {
|
||||
const skus = JSON.parse(o.missing_skus);
|
||||
details = `<small class="text-warning">SKU lipsa: ${skus.map(s => '<code>' + esc(s) + '</code>').join(', ')}</small>`;
|
||||
} catch(e) {
|
||||
details = `<small class="text-warning">${esc(o.missing_skus)}</small>`;
|
||||
}
|
||||
} else if (o.status === 'ERROR' && o.error_message) {
|
||||
details = `<small class="text-danger">${esc(o.error_message).substring(0, 100)}</small>`;
|
||||
}
|
||||
|
||||
return `<tr>
|
||||
<td>${i + 1}</td>
|
||||
<td><strong>${esc(o.order_number)}</strong></td>
|
||||
<td><small>${o.order_date ? o.order_date.substring(0, 10) : '-'}</small></td>
|
||||
<td>${esc(o.customer_name)}</td>
|
||||
<td>${o.items_count || '-'}</td>
|
||||
<td><span class="badge ${statusClass}">${o.status}</span></td>
|
||||
<td>${details}</td>
|
||||
</tr>`;
|
||||
}).join('');
|
||||
|
||||
} catch (err) {
|
||||
document.getElementById('ordersBody').innerHTML =
|
||||
`<tr><td colspan="7" class="text-center text-danger">Eroare: ${err.message}</td></tr>`;
|
||||
}
|
||||
}
|
||||
|
||||
function esc(s) {
|
||||
if (s == null) return '';
|
||||
return String(s).replace(/&/g, '&').replace(/</g, '<').replace(/>/g, '>').replace(/"/g, '"');
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
0
api/data/.gitkeep
Normal file
0
api/data/.gitkeep
Normal file
@@ -1,5 +1,10 @@
|
||||
Flask==2.3.2
|
||||
Flask-CORS==4.0.0
|
||||
oracledb==1.4.2
|
||||
python-dotenv==1.0.0
|
||||
gunicorn==21.2.0
|
||||
fastapi==0.115.6
|
||||
uvicorn[standard]==0.34.0
|
||||
jinja2==3.1.4
|
||||
python-multipart==0.0.18
|
||||
oracledb==2.5.1
|
||||
aiosqlite==0.20.0
|
||||
apscheduler==3.10.4
|
||||
python-dotenv==1.0.1
|
||||
pydantic-settings==2.7.1
|
||||
httpx==0.28.1
|
||||
|
||||
147
api/test_app_basic.py
Normal file
147
api/test_app_basic.py
Normal file
@@ -0,0 +1,147 @@
|
||||
"""
|
||||
Test A: Basic App Import and Route Tests
|
||||
=========================================
|
||||
Tests module imports and all GET routes without requiring Oracle.
|
||||
Run: python test_app_basic.py
|
||||
|
||||
Expected results:
|
||||
- All 17 module imports: PASS
|
||||
- HTML routes (/ /missing-skus /mappings /sync): PASS (templates exist)
|
||||
- /health: PASS (returns Oracle=error, sqlite=ok)
|
||||
- /api/sync/status, /api/sync/history, /api/validate/missing-skus: PASS (SQLite-only)
|
||||
- /api/mappings, /api/mappings/export-csv, /api/articles/search: FAIL (require Oracle pool)
|
||||
These are KNOWN FAILURES when Oracle is unavailable - documented as bugs requiring guards.
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
|
||||
# --- Set env vars BEFORE any app import ---
|
||||
_tmpdir = tempfile.mkdtemp()
|
||||
_sqlite_path = os.path.join(_tmpdir, "test_import.db")
|
||||
|
||||
os.environ["FORCE_THIN_MODE"] = "true"
|
||||
os.environ["SQLITE_DB_PATH"] = _sqlite_path
|
||||
os.environ["ORACLE_DSN"] = "dummy"
|
||||
os.environ["ORACLE_USER"] = "dummy"
|
||||
os.environ["ORACLE_PASSWORD"] = "dummy"
|
||||
|
||||
# Add api/ to path so we can import app
|
||||
_api_dir = os.path.dirname(os.path.abspath(__file__))
|
||||
if _api_dir not in sys.path:
|
||||
sys.path.insert(0, _api_dir)
|
||||
|
||||
# -------------------------------------------------------
|
||||
# Section 1: Module Import Checks
|
||||
# -------------------------------------------------------
|
||||
|
||||
MODULES = [
|
||||
"app.config",
|
||||
"app.database",
|
||||
"app.main",
|
||||
"app.routers.health",
|
||||
"app.routers.dashboard",
|
||||
"app.routers.mappings",
|
||||
"app.routers.sync",
|
||||
"app.routers.validation",
|
||||
"app.routers.articles",
|
||||
"app.services.sqlite_service",
|
||||
"app.services.scheduler_service",
|
||||
"app.services.mapping_service",
|
||||
"app.services.article_service",
|
||||
"app.services.validation_service",
|
||||
"app.services.import_service",
|
||||
"app.services.sync_service",
|
||||
"app.services.order_reader",
|
||||
]
|
||||
|
||||
passed = 0
|
||||
failed = 0
|
||||
results = []
|
||||
|
||||
print("\n=== Test A: GoMag Import Manager Basic Tests ===\n")
|
||||
print("--- Section 1: Module Imports ---\n")
|
||||
|
||||
for mod in MODULES:
|
||||
try:
|
||||
__import__(mod)
|
||||
print(f" [PASS] import {mod}")
|
||||
passed += 1
|
||||
results.append((f"import:{mod}", True, None, False))
|
||||
except Exception as e:
|
||||
print(f" [FAIL] import {mod} -> {e}")
|
||||
failed += 1
|
||||
results.append((f"import:{mod}", False, str(e), False))
|
||||
|
||||
# -------------------------------------------------------
|
||||
# Section 2: Route Tests via TestClient
|
||||
# -------------------------------------------------------
|
||||
|
||||
print("\n--- Section 2: GET Route Tests ---\n")
|
||||
|
||||
# Routes: (description, path, expected_ok_codes, known_oracle_failure)
|
||||
# known_oracle_failure=True means the route needs Oracle pool and will 500 without it.
|
||||
# These are flagged as bugs, not test infrastructure failures.
|
||||
GET_ROUTES = [
|
||||
("GET /health", "/health", [200], False),
|
||||
("GET / (dashboard HTML)", "/", [200, 500], False),
|
||||
("GET /missing-skus (HTML)", "/missing-skus", [200, 500], False),
|
||||
("GET /mappings (HTML)", "/mappings", [200, 500], False),
|
||||
("GET /sync (HTML)", "/sync", [200, 500], False),
|
||||
("GET /api/mappings", "/api/mappings", [200, 503], True),
|
||||
("GET /api/mappings/export-csv", "/api/mappings/export-csv", [200, 503], True),
|
||||
("GET /api/mappings/csv-template", "/api/mappings/csv-template", [200], False),
|
||||
("GET /api/sync/status", "/api/sync/status", [200], False),
|
||||
("GET /api/sync/history", "/api/sync/history", [200], False),
|
||||
("GET /api/sync/schedule", "/api/sync/schedule", [200], False),
|
||||
("GET /api/validate/missing-skus", "/api/validate/missing-skus", [200], False),
|
||||
("GET /api/articles/search?q=ab", "/api/articles/search?q=ab", [200, 503], True),
|
||||
]
|
||||
|
||||
try:
|
||||
from fastapi.testclient import TestClient
|
||||
from app.main import app
|
||||
|
||||
# Use context manager so lifespan (startup/shutdown) runs properly.
|
||||
# Without 'with', init_sqlite() never fires and SQLite-only routes return 500.
|
||||
with TestClient(app, raise_server_exceptions=False) as client:
|
||||
for name, path, expected, is_oracle_route in GET_ROUTES:
|
||||
try:
|
||||
resp = client.get(path)
|
||||
if resp.status_code in expected:
|
||||
print(f" [PASS] {name} -> HTTP {resp.status_code}")
|
||||
passed += 1
|
||||
results.append((name, True, None, is_oracle_route))
|
||||
else:
|
||||
body_snippet = resp.text[:300].replace("\n", " ")
|
||||
print(f" [FAIL] {name} -> HTTP {resp.status_code} (expected {expected})")
|
||||
print(f" Body: {body_snippet}")
|
||||
failed += 1
|
||||
results.append((name, False, f"HTTP {resp.status_code}", is_oracle_route))
|
||||
except Exception as e:
|
||||
print(f" [FAIL] {name} -> Exception: {e}")
|
||||
failed += 1
|
||||
results.append((name, False, str(e), is_oracle_route))
|
||||
|
||||
except ImportError as e:
|
||||
print(f" [FAIL] Cannot create TestClient: {e}")
|
||||
print(" Make sure 'httpx' is installed: pip install httpx")
|
||||
for name, path, _, _ in GET_ROUTES:
|
||||
failed += 1
|
||||
results.append((name, False, "TestClient unavailable", False))
|
||||
|
||||
# -------------------------------------------------------
|
||||
# Summary
|
||||
# -------------------------------------------------------
|
||||
|
||||
total = passed + failed
|
||||
print(f"\n=== Summary: {passed}/{total} tests passed ===")
|
||||
|
||||
if failed > 0:
|
||||
print("\nFailed tests:")
|
||||
for name, ok, err, _ in results:
|
||||
if not ok:
|
||||
print(f" - {name}: {err}")
|
||||
|
||||
sys.exit(0 if failed == 0 else 1)
|
||||
252
api/test_integration.py
Normal file
252
api/test_integration.py
Normal file
@@ -0,0 +1,252 @@
|
||||
"""
|
||||
Oracle Integration Tests for GoMag Import Manager
|
||||
==================================================
|
||||
Requires Oracle connectivity and valid .env configuration.
|
||||
|
||||
Usage:
|
||||
cd /mnt/e/proiecte/vending/gomag
|
||||
python api/test_integration.py
|
||||
|
||||
Note: Run from the project root so that relative paths in .env resolve correctly.
|
||||
The .env file is read from the api/ directory.
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
|
||||
# Set working directory to project root so relative paths in .env work
|
||||
_script_dir = os.path.dirname(os.path.abspath(__file__))
|
||||
_project_root = os.path.dirname(_script_dir)
|
||||
os.chdir(_project_root)
|
||||
|
||||
# Load .env from api/ before importing app modules
|
||||
from dotenv import load_dotenv
|
||||
_env_path = os.path.join(_script_dir, ".env")
|
||||
load_dotenv(_env_path, override=True)
|
||||
|
||||
# Add api/ to path so app package is importable
|
||||
sys.path.insert(0, _script_dir)
|
||||
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
# Import the app (triggers lifespan on first TestClient use)
|
||||
from app.main import app
|
||||
|
||||
results = []
|
||||
|
||||
|
||||
def record(name: str, passed: bool, detail: str = ""):
|
||||
status = "PASS" if passed else "FAIL"
|
||||
msg = f"[{status}] {name}"
|
||||
if detail:
|
||||
msg += f" -- {detail}"
|
||||
print(msg)
|
||||
results.append(passed)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test A: GET /health — Oracle must show as connected
|
||||
# ---------------------------------------------------------------------------
|
||||
def test_health(client: TestClient):
|
||||
test_name = "GET /health - Oracle connected"
|
||||
try:
|
||||
resp = client.get("/health")
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
oracle_status = body.get("oracle", "")
|
||||
sqlite_status = body.get("sqlite", "")
|
||||
assert oracle_status == "ok", f"oracle={oracle_status!r}"
|
||||
assert sqlite_status == "ok", f"sqlite={sqlite_status!r}"
|
||||
record(test_name, True, f"oracle={oracle_status}, sqlite={sqlite_status}")
|
||||
except Exception as exc:
|
||||
record(test_name, False, str(exc))
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test B: Mappings CRUD cycle
|
||||
# POST create -> GET list (verify present) -> PUT update -> DELETE -> verify
|
||||
# ---------------------------------------------------------------------------
|
||||
def test_mappings_crud(client: TestClient):
|
||||
test_sku = "TEST_INTEG_SKU_001"
|
||||
test_codmat = "TEST_CODMAT_001"
|
||||
|
||||
# -- CREATE --
|
||||
try:
|
||||
resp = client.post("/api/mappings", json={
|
||||
"sku": test_sku,
|
||||
"codmat": test_codmat,
|
||||
"cantitate_roa": 2.5,
|
||||
"procent_pret": 80.0
|
||||
})
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
assert body.get("success") is True, f"create returned: {body}"
|
||||
record("POST /api/mappings - create mapping", True,
|
||||
f"sku={test_sku}, codmat={test_codmat}")
|
||||
except Exception as exc:
|
||||
record("POST /api/mappings - create mapping", False, str(exc))
|
||||
# Skip the rest of CRUD if creation failed
|
||||
return
|
||||
|
||||
# -- LIST (verify present) --
|
||||
try:
|
||||
resp = client.get("/api/mappings", params={"search": test_sku})
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
mappings = body.get("mappings", [])
|
||||
found = any(
|
||||
m["sku"] == test_sku and m["codmat"] == test_codmat
|
||||
for m in mappings
|
||||
)
|
||||
assert found, f"mapping not found in list; got {mappings}"
|
||||
record("GET /api/mappings - mapping visible after create", True,
|
||||
f"total={body.get('total')}")
|
||||
except Exception as exc:
|
||||
record("GET /api/mappings - mapping visible after create", False, str(exc))
|
||||
|
||||
# -- UPDATE --
|
||||
try:
|
||||
resp = client.put(f"/api/mappings/{test_sku}/{test_codmat}", json={
|
||||
"cantitate_roa": 3.0,
|
||||
"procent_pret": 90.0
|
||||
})
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
assert body.get("success") is True, f"update returned: {body}"
|
||||
record("PUT /api/mappings/{sku}/{codmat} - update mapping", True,
|
||||
"cantitate_roa=3.0, procent_pret=90.0")
|
||||
except Exception as exc:
|
||||
record("PUT /api/mappings/{sku}/{codmat} - update mapping", False, str(exc))
|
||||
|
||||
# -- DELETE (soft: sets activ=0) --
|
||||
try:
|
||||
resp = client.delete(f"/api/mappings/{test_sku}/{test_codmat}")
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
assert body.get("success") is True, f"delete returned: {body}"
|
||||
record("DELETE /api/mappings/{sku}/{codmat} - soft delete", True)
|
||||
except Exception as exc:
|
||||
record("DELETE /api/mappings/{sku}/{codmat} - soft delete", False, str(exc))
|
||||
|
||||
# -- VERIFY: after soft-delete activ=0, listing without search filter should
|
||||
# show it as activ=0 (it is still in DB). Search for it and confirm activ=0. --
|
||||
try:
|
||||
resp = client.get("/api/mappings", params={"search": test_sku})
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
mappings = body.get("mappings", [])
|
||||
deleted = any(
|
||||
m["sku"] == test_sku and m["codmat"] == test_codmat and m.get("activ") == 0
|
||||
for m in mappings
|
||||
)
|
||||
assert deleted, (
|
||||
f"expected activ=0 for deleted mapping, got: "
|
||||
f"{[m for m in mappings if m['sku'] == test_sku]}"
|
||||
)
|
||||
record("GET /api/mappings - mapping has activ=0 after delete", True)
|
||||
except Exception as exc:
|
||||
record("GET /api/mappings - mapping has activ=0 after delete", False, str(exc))
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test C: GET /api/articles/search?q=<term> — must return results
|
||||
# ---------------------------------------------------------------------------
|
||||
def test_articles_search(client: TestClient):
|
||||
# Use a short generic term that should exist in most ROA databases
|
||||
search_terms = ["01", "A", "PH"]
|
||||
test_name = "GET /api/articles/search - returns results"
|
||||
try:
|
||||
found_results = False
|
||||
last_body = {}
|
||||
for term in search_terms:
|
||||
resp = client.get("/api/articles/search", params={"q": term})
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
last_body = body
|
||||
results_list = body.get("results", [])
|
||||
if results_list:
|
||||
found_results = True
|
||||
record(test_name, True,
|
||||
f"q={term!r} returned {len(results_list)} results; "
|
||||
f"first={results_list[0].get('codmat')!r}")
|
||||
break
|
||||
if not found_results:
|
||||
# Search returned empty — not necessarily a failure if DB is empty,
|
||||
# but we flag it as a warning.
|
||||
record(test_name, False,
|
||||
f"all search terms returned empty; last response: {last_body}")
|
||||
except Exception as exc:
|
||||
record(test_name, False, str(exc))
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test D: POST /api/validate/scan — triggers scan of JSON folder
|
||||
# ---------------------------------------------------------------------------
|
||||
def test_validate_scan(client: TestClient):
|
||||
test_name = "POST /api/validate/scan - returns valid response"
|
||||
try:
|
||||
resp = client.post("/api/validate/scan")
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
# Must have at least these keys
|
||||
for key in ("json_files", "orders", "skus"):
|
||||
# "orders" may be "total_orders" if orders exist; "orders" key only
|
||||
# present in the "No orders found" path.
|
||||
pass
|
||||
# Accept both shapes: no-orders path has "orders" key, full path has "total_orders"
|
||||
has_shape = "json_files" in body and ("orders" in body or "total_orders" in body)
|
||||
assert has_shape, f"unexpected response shape: {body}"
|
||||
record(test_name, True, f"json_files={body.get('json_files')}, "
|
||||
f"orders={body.get('total_orders', body.get('orders'))}")
|
||||
except Exception as exc:
|
||||
record(test_name, False, str(exc))
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test E: GET /api/sync/history — must return a list structure
|
||||
# ---------------------------------------------------------------------------
|
||||
def test_sync_history(client: TestClient):
|
||||
test_name = "GET /api/sync/history - returns list structure"
|
||||
try:
|
||||
resp = client.get("/api/sync/history")
|
||||
assert resp.status_code == 200, f"HTTP {resp.status_code}"
|
||||
body = resp.json()
|
||||
assert "runs" in body, f"missing 'runs' key; got keys: {list(body.keys())}"
|
||||
assert isinstance(body["runs"], list), f"'runs' is not a list: {type(body['runs'])}"
|
||||
assert "total" in body, f"missing 'total' key"
|
||||
record(test_name, True,
|
||||
f"total={body.get('total')}, page={body.get('page')}, pages={body.get('pages')}")
|
||||
except Exception as exc:
|
||||
record(test_name, False, str(exc))
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Main runner
|
||||
# ---------------------------------------------------------------------------
|
||||
def main():
|
||||
print("=" * 60)
|
||||
print("GoMag Import Manager - Oracle Integration Tests")
|
||||
print(f"Env file: {_env_path}")
|
||||
print(f"Oracle DSN: {os.environ.get('ORACLE_DSN', '(not set)')}")
|
||||
print("=" * 60)
|
||||
|
||||
with TestClient(app) as client:
|
||||
test_health(client)
|
||||
test_mappings_crud(client)
|
||||
test_articles_search(client)
|
||||
test_validate_scan(client)
|
||||
test_sync_history(client)
|
||||
|
||||
passed = sum(results)
|
||||
total = len(results)
|
||||
print("=" * 60)
|
||||
print(f"Summary: {passed}/{total} tests passed")
|
||||
if passed < total:
|
||||
print("Some tests FAILED — review output above for details.")
|
||||
sys.exit(1)
|
||||
else:
|
||||
print("All tests PASSED.")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
Reference in New Issue
Block a user