# Compare commits

Comparing `7e30523242...main` (11 commits):

- 32974e3b85
- ab20856cd6
- 956667086d
- 9b62b2b457
- b13d9a466c
- 18acfd2226
- bcd65d9fd6
- 874ba4ca4e
- e42b1f63b7
- c8bed18978
- 6620b28ed1

Files changed: README.md (78 changed lines)
@@ -412,10 +412,10 @@ Loguri aplicatie: `logs/sync_comenzi_*.log`

```bash
# SSH access (PowerShell remoting, public key)
ssh -p 22122 gomag@79.119.86.134
ssh -i ~/.ssh/id_ed25519 -p 22122 -o StrictHostKeyChecking=no gomag@79.119.86.134

# Inspect .env
cmd /c type C:\gomag-vending\api\.env
powershell -Command "Get-Content C:\gomag-vending\api\.env | Select-String 'ORACLE_'"

# Test the Oracle connection
C:\gomag-vending\venv\Scripts\python.exe -c "import oracledb, os; os.environ['TNS_ADMIN']='C:/roa/instantclient_11_2_0_2'; conn=oracledb.connect(user='VENDING', password='ROMFASTSOFT', dsn='ROA'); print('Connected!'); conn.close()"

@@ -423,20 +423,80 @@ C:\gomag-vending\venv\Scripts\python.exe -c "import oracledb, os; os.environ['TN

# Inspect tnsnames.ora
cmd /c type C:\roa\instantclient_11_2_0_2\tnsnames.ora

# List Python processes
Get-Process *python* | Select-Object Id,ProcessName,Path
# List Python processes (IDs for kill/restart)
powershell -Command "Get-Process python -ErrorAction SilentlyContinue | Format-Table Id, CPU -AutoSize"

# Check recent logs
Get-ChildItem C:\gomag-vending\logs\*.log | Sort-Object LastWriteTime -Descending | Select-Object -First 3

# Manual sync test (verifies that the Oracle pool starts)
curl http://localhost:5003/health
curl -X POST http://localhost:5003/api/sync/start
# Test the app (through the nginx reverse proxy)
powershell -Command "Invoke-WebRequest -Uri 'http://localhost/gomag/' -UseBasicParsing | Select-Object StatusCode"

# Manual invoice refresh
curl -X POST http://localhost:5003/api/dashboard/refresh-invoices
# Retry an order from the command line
powershell -Command "Invoke-WebRequest -Uri 'http://localhost/gomag/api/orders/NRCOMANDA/retry' -Method POST -UseBasicParsing | Select-Object -ExpandProperty Content"
```

#### Deploy Oracle PL/SQL package via SSH

```bash
# Correct method: sqlplus with a .pck file (contains both PACKAGE and PACKAGE BODY)
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
  "powershell -Command \"echo exit | sqlplus -S VENDING/PAROLA@ROA '@C:\\gomag-vending\\api\\database-scripts\\05_pack_import_parteneri.pck'\""
# Expected output: "Package created." + "Package body created."
```

#### Restart FastAPI service via SSH

The `gomag` user has no access to `nssm` or `sc` (that requires Administrator).
The available workaround: kill python, then relaunch start.ps1:

```bash
# 1. Find the Python PIDs
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
  "powershell -Command \"Get-Process python -ErrorAction SilentlyContinue | Format-Table Id, CPU -AutoSize\""

# 2. Kill + restart (replace PID1,PID2 with the real values)
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
  "powershell -Command \"Stop-Process -Id PID1,PID2 -Force -ErrorAction SilentlyContinue; Start-Sleep 2; cd C:\\gomag-vending; Start-Process powershell -ArgumentList '-NoExit','-File','start.ps1' -WindowStyle Hidden\""

# 3. Verify it is back up (wait ~5s)
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
  "powershell -Command \"Invoke-WebRequest -Uri 'http://localhost/gomag/' -UseBasicParsing | Select-Object StatusCode\""
```

#### What does NOT work via SSH (gomag user without Administrator)

| Command | Error | Alternative |
|---------|--------|-------------|
| `nssm restart GoMagVending` | `Error opening service manager!` | Kill python + Start-Process start.ps1 (see above) |
| `sc query` / `sc stop` | `Access is denied` | No alternative; requires direct access to the server |
| `Get-WmiObject Win32_Process` | `Access denied` | Plain `Get-Process`, without CommandLine |
| Pipe `\|` in -Command with nested quotes | `An empty pipe element is not allowed` | Write the SQL to a temp file, copy it with scp, run `@fisier.sql` |
| `&&` (bash syntax) in PowerShell | `The term '&&' is not recognized` | Use `;` (continues regardless of failure) or `-Command "cmd1; cmd2"` |
| `-m` flag on `curl` in PowerShell | `Ambiguous parameter name` | Use `Invoke-WebRequest` instead of curl |
| Here-doc `<< 'EOF'` in PowerShell | `Missing file specification` | Write the file locally, copy it with scp |
#### Running ad-hoc SQL over SSH (non-interactive)

PowerShell does not support piping into sqlplus with complex nested quoting. The working method:

```bash
# 1. Write the SQL locally
cat > /tmp/query.sql << 'EOF'
SELECT coloana FROM tabel WHERE conditie;
exit
EOF

# 2. Copy it to prod
scp -i ~/.ssh/id_ed25519 -P 22122 /tmp/query.sql "gomag@79.119.86.134:C:/gomag-vending/query.sql"

# 3. Run it
ssh -i ~/.ssh/id_ed25519 -p 22122 gomag@79.119.86.134 \
  "powershell -Command \"sqlplus -S VENDING/PAROLA@ROA '@C:\\gomag-vending\\query.sql'\""
```

**Do not use** `echo 'SQL;' | sqlplus`: PowerShell treats `|` differently and can fail with "empty pipe element".

### Common issues

| Error | Cause | Solution |
|---|---|---|
@@ -141,6 +141,12 @@ async def _call_anaf_api(body: list[dict], retry: int = 0, log_fn=None) -> dict[

    checked_at = datetime.now().isoformat()

    # CONTRACT (consumed by sync_service.evaluate_cui_gate):
    #   Return {}                                        → transient error (down/429/5xx/timeout)
    #   Return {cui: {scpTVA: None, denumire_anaf: ""}}  → ANAF notFound explicit
    #   Return {cui: {scpTVA: bool, denumire_anaf: str}} → ANAF found
    # If you change these semantics, update the gate in sync_service too.

    # Parse ANAF response
    found_list = data.get("found", [])
    for item in found_list:
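The three contract shapes above can be exercised with a small pure helper (hypothetical `classify_anaf_result` name; the real parsing lives inside `_call_anaf_api`):

```python
def classify_anaf_result(result: dict, cui: str) -> str:
    """Classify one CUI lookup per the contract documented above.

    `result` is the dict the API wrapper returns: {} on transient errors,
    {cui: {...}} otherwise.
    """
    if not result:
        return "transient"   # ANAF down / 429 / 5xx / timeout → caller tolerates
    entry = result.get(cui) or {}
    if entry.get("scpTVA") is None and not (entry.get("denumire_anaf") or "").strip():
        return "not_found"   # explicit notFound → the CUI gate blocks the order
    return "found"

print(classify_anaf_result({}, "123"))  # transient
```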
@@ -2,38 +2,39 @@ import html
import json
import logging
import re
import unicodedata
import oracledb
from datetime import datetime, timedelta
from .. import database

logger = logging.getLogger(__name__)

# Diacritics to ASCII mapping (Romanian)
_DIACRITICS = str.maketrans({
    '\u0103': 'a',  # ă
    '\u00e2': 'a',  # â
    '\u00ee': 'i',  # î
    '\u0219': 's',  # ș
    '\u021b': 't',  # ț
    '\u0102': 'A',  # Ă
    '\u00c2': 'A',  # Â
    '\u00ce': 'I',  # Î
    '\u0218': 'S',  # Ș
    '\u021a': 'T',  # Ț
    # Older Unicode variants
    '\u015f': 's',  # ş (cedilla)
    '\u0163': 't',  # ţ (cedilla)
    '\u015e': 'S',  # Ş
    '\u0162': 'T',  # Ţ
})

# Stroke/ligature letters NFKD does not decompose (structural mod, not a
# combining mark). Everything else — RO cedilla ş/ţ, RO comma-below ș/ț,
# HU ő/ű, DE umlaut, CZ háček, FR accent, ES tilde — is handled
# universally by unicodedata.normalize('NFKD') + Mn-category strip below.
_NFKD_OVERRIDES = str.maketrans({
    'ß': 'ss',             # ß
    'æ': 'ae', 'Æ': 'AE',  # æ Æ
    'œ': 'oe', 'Œ': 'OE',  # œ Œ
    'ł': 'l', 'Ł': 'L',    # ł Ł (Polish)
    'đ': 'd', 'Đ': 'D',    # đ Đ (Croatian)
    'ø': 'o', 'Ø': 'O',    # ø Ø (Danish/Norwegian)
})


def clean_web_text(text: str) -> str:
    """Port of VFP CleanWebText: unescape HTML entities + diacritics to ASCII."""
    """Port of VFP CleanWebText: unescape HTML entities + strip diacritics to ASCII.

    NFKD decomposition + combining-mark filter covers RO/HU/DE/CZ/PL/FR/ES in
    one pass; _NFKD_OVERRIDES handles stroke letters NFKD leaves alone.
    """
    if not text:
        return ""
    result = html.unescape(text)
    result = result.translate(_DIACRITICS)
    result = result.translate(_NFKD_OVERRIDES)
    decomposed = unicodedata.normalize('NFKD', result)
    result = ''.join(ch for ch in decomposed if not unicodedata.combining(ch))
    # Remove any remaining <br> tags
    for br in ('<br>', '<br/>', '<br />'):
        result = result.replace(br, ' ')
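The pipeline above can be condensed into a self-contained sketch (the Romanian diacritics map is abbreviated here; the real module carries the full table):

```python
import html
import unicodedata

# Abbreviated versions of the module-level tables above.
_DIACRITICS = str.maketrans({'\u0219': 's', '\u021b': 't', '\u0103': 'a',
                             '\u00ee': 'i', '\u00e2': 'a'})
_NFKD_OVERRIDES = str.maketrans({'ß': 'ss', 'ø': 'o', 'ł': 'l'})

def clean_web_text(text: str) -> str:
    if not text:
        return ""
    # explicit maps first, then NFKD + combining-mark strip for everything else
    result = html.unescape(text).translate(_DIACRITICS).translate(_NFKD_OVERRIDES)
    decomposed = unicodedata.normalize('NFKD', result)
    result = ''.join(ch for ch in decomposed if not unicodedata.combining(ch))
    for br in ('<br>', '<br/>', '<br />'):
        result = result.replace(br, ' ')
    return result

print(clean_web_text('Bucure&#537;ti'))  # Bucuresti
```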
@@ -93,14 +94,23 @@ def format_address_for_oracle(address: str, city: str, region: str) -> str:
    city_clean = clean_web_text(city)
    address_clean = clean_web_text(address)
    address_clean = " ".join(address_clean.replace(",", " ").split())
    # Strip city name from end of address (users often type it)
    if city_clean:
    # Strip city/region suffixes users often append to address
    if city_clean or region_clean:
        addr_upper = address_clean.upper().rstrip()
        city_upper = city_clean.upper().strip()
        if addr_upper.endswith(city_upper):
            stripped = address_clean[:len(address_clean.rstrip()) - len(city_upper)].rstrip()
            if stripped:  # don't strip if nothing remains
                address_clean = stripped
        city_upper = city_clean.upper().strip() if city_clean else ""
        region_upper = region_clean.upper().strip() if region_clean else ""
        for pattern in [
            (city_upper + " " + region_upper).strip(),
            (region_upper + " " + city_upper).strip(),
            city_upper,
            region_upper,
        ]:
            if pattern and addr_upper.endswith(pattern):
                stripped = address_clean[:len(address_clean.rstrip()) - len(pattern)].rstrip()
                if stripped:
                    address_clean = stripped
                    addr_upper = address_clean.upper().rstrip()
                break
    return f"JUD:{region_clean};{city_clean};{address_clean}"
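The suffix-stripping loop above can be sketched in isolation (hypothetical `strip_location_suffix` name; diacritic cleanup elided; the pattern order mirrors the snippet: city+region, region+city, city, region):

```python
def strip_location_suffix(address: str, city: str, region: str) -> str:
    """Drop a trailing city/region suffix that users often append to the street."""
    addr_upper = address.upper().rstrip()
    city_u, region_u = city.upper().strip(), region.upper().strip()
    for pattern in [(city_u + " " + region_u).strip(),
                    (region_u + " " + city_u).strip(),
                    city_u, region_u]:
        if pattern and addr_upper.endswith(pattern):
            stripped = address[:len(address.rstrip()) - len(pattern)].rstrip()
            if stripped:          # never strip the address down to nothing
                return stripped
            break
    return address

print(strip_location_suffix("Str Exemplu 5 Cluj-Napoca Cluj", "Cluj-Napoca", "Cluj"))
# Str Exemplu 5
```

The longest patterns come first, so "Cluj-Napoca Cluj" is removed in one pass rather than leaving a dangling "Cluj-Napoca".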
@@ -59,6 +59,40 @@ async def _download_and_reimport(order_number: str, order_date_str: str, custome
    id_gestiune = app_settings.get("id_gestiune", "")
    id_gestiuni = [int(g.strip()) for g in id_gestiune.split(",") if g.strip()] if id_gestiune else None

    # Pre-validate prices: auto-insert PRET=0 in CRM_POLITICI_PRET_ART for missing
    # CODMATs so PL/SQL doesn't crash with COM-001. Mirrors sync_service flow.
    from .. import database
    validation = {"mapped": set(), "direct": set(), "missing": set(), "direct_id_map": {}}
    if database.pool is not None:
        conn = await asyncio.to_thread(database.get_oracle_connection)
        try:
            skus = {item.sku for item in target_order.items if item.sku}
            if skus:
                validation = await asyncio.to_thread(
                    validation_service.validate_skus, skus, conn, id_gestiuni,
                )
            if id_pol and skus:
                id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
                cota_tva = float(app_settings.get("discount_vat") or 21)
                await asyncio.to_thread(
                    validation_service.pre_validate_order_prices,
                    [target_order], app_settings, conn, id_pol, id_pol_productie,
                    id_gestiuni, validation, None, cota_tva,
                )
        except Exception as e:
            logger.error(f"Retry pre-validation failed for {order_number}: {e}")
            await sqlite_service.upsert_order(
                sync_run_id="retry",
                order_number=order_number,
                order_date=order_date_str,
                customer_name=customer_name,
                status=OrderStatus.ERROR.value,
                error_message=f"Retry pre-validation failed: {e}",
            )
            return {"success": False, "message": f"Eroare pre-validare preturi: {e}"}
        finally:
            await asyncio.to_thread(database.pool.release, conn)

    try:
        result = await asyncio.to_thread(
            import_service.import_single_order,
@@ -77,17 +111,6 @@ async def _download_and_reimport(order_number: str, order_date_str: str, custome
        )
        return {"success": False, "message": f"Eroare import: {e}"}

    # Build order_items data from fresh GoMag download (mirrors sync_service:882-891).
    # Resolves ARTICOLE_TERTI mapping so UI shows mapped/direct badge.
    try:
        skus = {item.sku for item in target_order.items if item.sku}
        validation = await asyncio.to_thread(
            validation_service.validate_skus, skus, None, id_gestiuni
        ) if skus else {"mapped": set(), "direct": set()}
    except Exception as e:
        logger.warning(f"Retry: validate_skus failed for {order_number}, defaulting mapping_status=direct: {e}")
        validation = {"mapped": set(), "direct": set()}

    order_items_data = [
        {
            "sku": item.sku, "product_name": item.name,
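The mapped/direct badge derivation used in both retry and sync can be sketched as a pure function (hypothetical `build_order_items_data` name; only the `validation["mapped"]` membership decides the badge, everything else defaults to direct):

```python
def build_order_items_data(skus: list[str], validation: dict) -> list[dict]:
    """Tag each SKU with its mapping badge from a validate_skus-style result."""
    return [
        {"sku": sku,
         "mapping_status": "mapped" if sku in validation["mapped"] else "direct"}
        for sku in skus
    ]

print(build_order_items_data(["A1", "B2"], {"mapped": {"A1"}, "direct": {"B2"}}))
# [{'sku': 'A1', 'mapping_status': 'mapped'}, {'sku': 'B2', 'mapping_status': 'direct'}]
```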
@@ -1253,6 +1253,19 @@ async def get_all_imported_orders() -> list:
        await db.close()


async def get_deleted_in_roa_order_numbers() -> set[str]:
    """Return set of order_numbers marked DELETED_IN_ROA (sticky-excluded from auto-sync)."""
    db = await get_sqlite()
    try:
        cursor = await db.execute(
            f"SELECT order_number FROM orders WHERE status = '{OrderStatus.DELETED_IN_ROA.value}'"
        )
        rows = await cursor.fetchall()
        return {r[0] for r in rows}
    finally:
        await db.close()


async def clear_order_invoice(order_number: str):
    """Clear cached invoice data when invoice was deleted in ROA."""
    db = await get_sqlite()
@@ -1275,10 +1288,13 @@ async def clear_order_invoice(order_number: str):


async def mark_order_deleted_in_roa(order_number: str):
    """Mark an order as deleted in ROA — clears id_comanda, invoice cache, and stale items."""
    """Mark an order as deleted in ROA — clears id_comanda + invoice cache.

    order_items are preserved so the detail view can still show what was
    originally ordered. On 'Reimporta', add_order_items replaces them.
    """
    db = await get_sqlite()
    try:
        await db.execute("DELETE FROM order_items WHERE order_number = ?", (order_number,))
        await db.execute(f"""
            UPDATE orders SET
                status = '{OrderStatus.DELETED_IN_ROA.value}',
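A synchronous `sqlite3` sketch of the sticky-exclusion query (table shape and the `DELETED_IN_ROA` status literal mirrored from the snippet; shown here in parameterized form rather than the f-string):

```python
import sqlite3

def get_deleted_in_roa_order_numbers(conn: sqlite3.Connection) -> set[str]:
    # Same query as above, with the status bound as a parameter.
    rows = conn.execute(
        "SELECT order_number FROM orders WHERE status = ?", ("DELETED_IN_ROA",)
    ).fetchall()
    return {r[0] for r in rows}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_number TEXT, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("1001", "DELETED_IN_ROA"), ("1002", "IMPORTED")])
print(get_deleted_in_roa_order_numbers(conn))  # {'1001'}
```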
@@ -48,7 +48,7 @@ def _addr_match(gomag_json, roa_json):
        r'ET|ETAJ|COM|COMUNA|SAT|MUN|MUNICIPIUL|JUD|JUDETUL|CARTIER|PARTER|SECTOR|SECTORUL|ORAS)(?:\b|(?=\d))'
    )
    def norm(s):
        s = (s or '').translate(import_service._DIACRITICS).upper()
        s = import_service.clean_web_text(s or '').upper()
        s = _ADDR_WORDS.sub('', s)
        return re.sub(r'[^A-Z0-9]', '', s)
    def _soundex(s):
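`norm` can be demonstrated standalone with an abbreviated stop-word pattern (the real `_ADDR_WORDS` covers the full token list shown above; diacritic cleanup is elided in this sketch):

```python
import re

# Abbreviated stop-word pattern; the real _ADDR_WORDS strips many more tokens.
_ADDR_WORDS = re.compile(r'\b(?:STR|STRADA|NR|BL|SC|AP|ET|ETAJ|JUD|SECTOR)(?:\b|(?=\d))')

def norm(s: str) -> str:
    s = (s or '').upper()                # diacritic stripping elided here
    s = _ADDR_WORDS.sub('', s)           # drop address stop-words
    return re.sub(r'[^A-Z0-9]', '', s)   # keep only alphanumerics

print(norm('Str. Exemplu nr. 5, Ap 3'))  # EXEMPLU53
```

Two differently punctuated spellings of the same address thus normalize to the same key before the soundex comparison.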
@@ -112,6 +112,58 @@ async def _record_phase_err(run_id: str, phase: str, err: Exception) -> None:
        logger.warning(f"record_phase_failure failed for phase={phase}: {rec_err}")


def evaluate_cui_gate(
    is_ro_company: bool,
    company_code_raw: str | None,
    bare_cui: str,
    anaf_data: dict | None,
) -> str | None:
    """Return block reason or None if the order passes the CUI gate.

    CONTRACT on anaf_data:
      - None → ANAF down / transient error → tolerate (pass)
      - {scpTVA: None, denumire_anaf: ""} → ANAF notFound explicit → block
      - {scpTVA: bool, denumire_anaf: str} → ANAF found → pass
    """
    if not is_ro_company or not company_code_raw:
        return None
    if not anaf_service.validate_cui(bare_cui):
        return f"CUI invalid (format): {company_code_raw!r}"
    if not anaf_service.validate_cui_checksum(bare_cui):
        return f"CUI invalid (cifra de control): {bare_cui}"
    if (
        anaf_data is not None
        and anaf_data.get("scpTVA") is None
        and not (anaf_data.get("denumire_anaf") or "").strip()
    ):
        return (
            f"CUI {company_code_raw!r} (sanitizat: {bare_cui}) nu exista in registrul ANAF — "
            f"verifica daca nu e inversat cu numarul de la registrul comertului"
        )
    return None
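The gate's decision table can be exercised with stubbed validators (the real `validate_cui` / `validate_cui_checksum` live in `anaf_service`; the simplified block-reason strings here are placeholders):

```python
def evaluate_cui_gate(is_ro_company, company_code_raw, bare_cui, anaf_data,
                      validate_cui=lambda c: c.isdigit(),      # stub
                      validate_checksum=lambda c: True):       # stub
    if not is_ro_company or not company_code_raw:
        return None                      # gate only applies to RO companies
    if not validate_cui(bare_cui):
        return "invalid format"
    if not validate_checksum(bare_cui):
        return "invalid checksum"
    if anaf_data is not None and anaf_data.get("scpTVA") is None \
            and not (anaf_data.get("denumire_anaf") or "").strip():
        return "not in ANAF registry"    # explicit notFound → block
    return None

# ANAF being down (anaf_data=None) is tolerated; explicit notFound blocks.
print(evaluate_cui_gate(True, "RO123", "123", None))  # None
print(evaluate_cui_gate(True, "RO123", "123",
                        {"scpTVA": None, "denumire_anaf": ""}))  # not in ANAF registry
```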
async def _record_order_error(
    run_id: str, order, customer: str, shipping_name: str, billing_name: str,
    payment_method: str, delivery_method: str, discount_split_json: str | None,
    order_items_data: list, reason: str, id_partener: int | None = None,
) -> None:
    """Write an ERROR row to SQLite (orders + sync_run_orders + order_items)."""
    await sqlite_service.upsert_order(
        sync_run_id=run_id, order_number=order.number, order_date=order.date,
        customer_name=customer, status=OrderStatus.ERROR.value,
        id_partener=id_partener, error_message=reason,
        items_count=len(order.items),
        shipping_name=shipping_name, billing_name=billing_name,
        payment_method=payment_method, delivery_method=delivery_method,
        order_total=order.total or None, delivery_cost=order.delivery_cost or None,
        discount_total=order.discount_total or None, web_status=order.status or None,
        discount_split=discount_split_json,
    )
    await sqlite_service.add_sync_run_order(run_id, order.number, OrderStatus.ERROR.value)
    await sqlite_service.add_order_items(order.number, order_items_data)


async def _check_escalation() -> tuple[str | None, dict[str, int]]:
    """Return (phase_to_halt_on, recent_counts).
@@ -424,6 +476,18 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None

    orders = active_orders

    # ── Sticky exclusion: skip orders previously marked DELETED_IN_ROA ──
    deleted_set = await sqlite_service.get_deleted_in_roa_order_numbers()
    if deleted_set:
        excluded_deleted = [o for o in orders if o.number in deleted_set]
        orders = [o for o in orders if o.number not in deleted_set]
        if excluded_deleted:
            _log_line(run_id,
                f"Excluse {len(excluded_deleted)} comenzi marcate DELETED_IN_ROA "
                f"(stergeri sticky — foloseste 'Reimporta' pentru override)")
            for o in excluded_deleted:
                _log_line(run_id, f"#{o.number} [{o.date or '?'}] → IGNORAT (DELETED_IN_ROA)")

    if not orders:
        _log_line(run_id, "Nicio comanda activa dupa filtrare anulate.")
        await sqlite_service.update_sync_run(run_id, "completed", cancelled_count, 0, 0, 0)
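The two list comprehensions above amount to a partition of the order list by the exclusion set, which can be sketched in one pass (hypothetical `partition_orders` name, shown on bare order numbers):

```python
def partition_orders(orders: list[str], deleted_set: set[str]):
    """Split orders into (kept, excluded) by DELETED_IN_ROA membership."""
    kept, excluded = [], []
    for number in orders:
        (excluded if number in deleted_set else kept).append(number)
    return kept, excluded

print(partition_orders(["1", "2", "3"], {"2"}))  # (['1', '3'], ['2'])
```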
@@ -540,132 +604,34 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
        # Step 2d: Pre-validate prices for importable articles
        if id_pol and (truly_importable or already_in_roa):
            _update_progress("validation", "Validating prices...", 0, len(truly_importable))
            _log_line(run_id, "Validare preturi...")
            all_codmats = set()
            for order in (truly_importable + already_in_roa):
                for item in order.items:
                    if item.sku in validation["mapped"]:
                        pass
                    elif item.sku in validation["direct"]:
                        all_codmats.add(item.sku)
            # Get standard VAT rate from settings for PROC_TVAV metadata
            cota_tva = float(app_settings.get("discount_vat") or 21)

            # Dual pricing policy support
            id_pol_productie = int(app_settings.get("id_pol_productie") or 0) or None
            codmat_policy_map = {}

            if all_codmats:
                if id_pol_productie:
                    # Dual-policy: classify articles by cont (sales vs production)
                    codmat_policy_map = await asyncio.to_thread(
                        validation_service.validate_and_ensure_prices_dual,
                        all_codmats, id_pol, id_pol_productie,
                        conn, validation.get("direct_id_map"),
                        cota_tva=cota_tva
                    )
                    _log_line(run_id,
                        f"Politici duale: {sum(1 for v in codmat_policy_map.values() if v == id_pol)} vanzare, "
                        f"{sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)} productie")
                else:
                    # Single-policy (backward compatible)
                    price_result = await asyncio.to_thread(
                        validation_service.validate_prices, all_codmats, id_pol,
                        conn, validation.get("direct_id_map")
                    )
                    if price_result["missing_price"]:
                        logger.info(
                            f"Auto-adding price 0 for {len(price_result['missing_price'])} "
                            f"direct articles in policy {id_pol}"
                        )
                        await asyncio.to_thread(
                            validation_service.ensure_prices,
                            price_result["missing_price"], id_pol,
                            conn, validation.get("direct_id_map"),
                            cota_tva=cota_tva
                        )
            pv_result = await asyncio.to_thread(
                validation_service.pre_validate_order_prices,
                truly_importable + already_in_roa,
                app_settings, conn, id_pol, id_pol_productie,
                id_gestiuni, validation,
                lambda msg: _log_line(run_id, msg),
                cota_tva,
            )

            # Also validate mapped SKU prices (cherry-pick 1)
            mapped_skus_in_orders = set()
            for order in (truly_importable + already_in_roa):
                for item in order.items:
                    if item.sku in validation["mapped"]:
                        mapped_skus_in_orders.add(item.sku)

            mapped_codmat_data = {}
            if mapped_skus_in_orders:
                mapped_codmat_data = await asyncio.to_thread(
                    validation_service.resolve_mapped_codmats, mapped_skus_in_orders, conn,
                    id_gestiuni=id_gestiuni
                )
            # Build id_map for mapped codmats and validate/ensure their prices
            mapped_id_map = {}
            for sku, entries in mapped_codmat_data.items():
                for entry in entries:
                    mapped_id_map[entry["codmat"]] = {
                        "id_articol": entry["id_articol"],
                        "cont": entry.get("cont")
                    }
            mapped_codmats = set(mapped_id_map.keys())
            if mapped_codmats:
                if id_pol_productie:
                    mapped_policy_map = await asyncio.to_thread(
                        validation_service.validate_and_ensure_prices_dual,
                        mapped_codmats, id_pol, id_pol_productie,
                        conn, mapped_id_map, cota_tva=cota_tva
                    )
                    codmat_policy_map.update(mapped_policy_map)
            # Filter truly_importable for kits with missing component prices
            kit_missing = pv_result.get("kit_missing") or {}
            if kit_missing:
                kit_skus_missing = set(kit_missing.keys())
                new_truly = []
                for order in truly_importable:
                    order_skus = {item.sku for item in order.items}
                    if order_skus & kit_skus_missing:
                        missing_list = list(order_skus & kit_skus_missing)
                        skipped.append((order, missing_list))
                    else:
                        mp_result = await asyncio.to_thread(
                            validation_service.validate_prices,
                            mapped_codmats, id_pol, conn, mapped_id_map
                        )
                        if mp_result["missing_price"]:
                            await asyncio.to_thread(
                                validation_service.ensure_prices,
                                mp_result["missing_price"], id_pol,
                                conn, mapped_id_map, cota_tva=cota_tva
                            )

            # Add SKU → policy entries for mapped articles (1:1 and kits)
            # codmat_policy_map has CODMAT keys, but build_articles_json
            # looks up by GoMag SKU — bridge the gap here
            if codmat_policy_map and mapped_codmat_data:
                for sku, entries in mapped_codmat_data.items():
                    if len(entries) == 1:
                        # 1:1 mapping: SKU inherits the CODMAT's policy
                        codmat = entries[0]["codmat"]
                        if codmat in codmat_policy_map:
                            codmat_policy_map[sku] = codmat_policy_map[codmat]

            # Pass codmat_policy_map to import via app_settings
            if codmat_policy_map:
                app_settings["_codmat_policy_map"] = codmat_policy_map

            # ── Kit component price validation ──
            kit_pricing_mode = app_settings.get("kit_pricing_mode")
            if kit_pricing_mode and mapped_codmat_data:
                id_pol_prod = int(app_settings.get("id_pol_productie") or 0) or None
                kit_missing = await asyncio.to_thread(
                    validation_service.validate_kit_component_prices,
                    mapped_codmat_data, id_pol, id_pol_prod, conn
                )
                if kit_missing:
                    kit_skus_missing = set(kit_missing.keys())
                    for sku, missing_codmats in kit_missing.items():
                        _log_line(run_id, f"Kit {sku}: prețuri lipsă pentru {', '.join(missing_codmats)}")
                    new_truly = []
                    for order in truly_importable:
                        order_skus = {item.sku for item in order.items}
                        if order_skus & kit_skus_missing:
                            missing_list = list(order_skus & kit_skus_missing)
                            skipped.append((order, missing_list))
                        else:
                            new_truly.append(order)
                    truly_importable = new_truly
                            new_truly.append(order)
                    truly_importable = new_truly

            # Mode B config validation
            if kit_pricing_mode == "separate_line":
            if app_settings.get("kit_pricing_mode") == "separate_line":
                if not app_settings.get("kit_discount_codmat"):
                    _log_line(run_id, "EROARE: Kit mode 'separate_line' dar kit_discount_codmat nu e configurat!")
    finally:
@@ -919,6 +885,7 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
            cod_fiscal_override = None
            anaf_data_for_order = None
            raw_cf = ""
            bare_cui = ""
            if order.billing.is_company and order.billing.company_code:
                raw_cf = import_service.clean_web_text(order.billing.company_code) or ""
                bare_cui, cui_warning = anaf_service.sanitize_cui(raw_cf)
@@ -938,6 +905,36 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
            if is_ro_company and anaf_data_for_order and anaf_data_for_order.get("scpTVA") is not None:
                anaf_strict = 1  # ANAF data available → strict search

            # Build order items data and discount split (needed by gate error path)
            order_items_data = []
            for item in order.items:
                ms = "mapped" if item.sku in validation["mapped"] else "direct"
                order_items_data.append({
                    "sku": item.sku, "product_name": item.name,
                    "quantity": item.quantity, "price": item.price,
                    "baseprice": item.baseprice, "vat": item.vat,
                    "mapping_status": ms, "codmat": None, "id_articol": None,
                    "cantitate_roa": None
                })
            ds = import_service.compute_discount_split(order, app_settings)
            discount_split_json = json.dumps(ds) if ds else None

            # Gate CUI (RO PJ): block if CUI invalid or ANAF explicit notFound
            block_reason = evaluate_cui_gate(
                is_ro_company, order.billing.company_code, bare_cui, anaf_data_for_order
            )
            if block_reason:
                error_count += 1
                _log_line(run_id, f"#{order.number} BLOCAT: {block_reason}")
                await _record_order_error(
                    run_id, order, customer, shipping_name, billing_name,
                    payment_method, delivery_method, discount_split_json,
                    order_items_data, block_reason,
                )
                if error_count > 10:
                    break
                continue

            # ANAF official name override: used at partner creation (not lookup).
            # Strip before truthy check → reject whitespace-only values.
            denumire_override = None
@@ -955,22 +952,6 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None
                denumire_override=denumire_override,
            )

            # Build order items data for storage (R9)
            order_items_data = []
            for item in order.items:
                ms = "mapped" if item.sku in validation["mapped"] else "direct"
                order_items_data.append({
                    "sku": item.sku, "product_name": item.name,
                    "quantity": item.quantity, "price": item.price,
                    "baseprice": item.baseprice, "vat": item.vat,
                    "mapping_status": ms, "codmat": None, "id_articol": None,
                    "cantitate_roa": None
                })

            # Compute discount split for SQLite storage
            ds = import_service.compute_discount_split(order, app_settings)
            discount_split_json = json.dumps(ds) if ds else None

            if result["success"]:
                imported_count += 1
                await sqlite_service.upsert_order(
@@ -1040,28 +1021,13 @@ async def run_sync(id_pol: int = None, id_sectie: int = None, run_id: str = None

            if not result["success"]:
                error_count += 1
                await sqlite_service.upsert_order(
                    sync_run_id=run_id,
                    order_number=order.number,
                    order_date=order.date,
                    customer_name=customer,
                    status=OrderStatus.ERROR.value,
                    id_partener=result.get("id_partener"),
                    error_message=result["error"],
                    items_count=len(order.items),
                    shipping_name=shipping_name,
                    billing_name=billing_name,
                    payment_method=payment_method,
                    delivery_method=delivery_method,
                    order_total=order.total or None,
                    delivery_cost=order.delivery_cost or None,
                    discount_total=order.discount_total or None,
                    web_status=order.status or None,
                    discount_split=discount_split_json,
                )
                await sqlite_service.add_sync_run_order(run_id, order.number, OrderStatus.ERROR.value)
                await sqlite_service.add_order_items(order.number, order_items_data)
                _log_line(run_id, f"#{order.number} [{order.date or '?'}] {customer} → EROARE: {result['error']}")
                await _record_order_error(
                    run_id, order, customer, shipping_name, billing_name,
                    payment_method, delivery_method, discount_split_json,
                    order_items_data, result["error"],
                    id_partener=result.get("id_partener"),
                )

            # Safety: stop if too many errors
            if error_count > 10:
@@ -529,6 +529,147 @@ def resolve_mapped_codmats(mapped_skus: set[str], conn,
    return result


def pre_validate_order_prices(orders, app_settings: dict, conn, id_pol: int,
                              id_pol_productie: int = None,
                              id_gestiuni: list[int] = None,
                              validation: dict = None,
                              log_callback=None,
                              cota_tva: float = 21) -> dict:
    """Pre-validate prices for orders before importing them via PACK_IMPORT_COMENZI.

    Auto-inserts PRET=0 rows in CRM_POLITICI_PRET_ART for missing CODMATs so
    PL/SQL adauga_articol_comanda doesn't raise COM-001. Mutates
    app_settings["_codmat_policy_map"] for build_articles_json routing.

    Used by both bulk sync (sync_service.run_sync) and retry (retry_service).

    Args:
        orders: list of orders to scan for SKUs/CODMATs
        app_settings: mutated with _codmat_policy_map (SKU/CODMAT → id_pol)
        conn: Oracle connection (caller manages lifecycle)
        id_pol: default sales price policy
        id_pol_productie: production policy for cont 341/345 (None = single-policy)
        id_gestiuni: gestiune filter for resolve_mapped_codmats
        validation: output of validate_skus; computed internally if None
        log_callback: optional Callable[[str], None] for progress messages
        cota_tva: VAT rate for PROC_TVAV metadata (default 21)

    Returns: {"codmat_policy_map": dict, "kit_missing": dict, "validation": dict}
        - codmat_policy_map: {codmat_or_sku: id_pol}
        - kit_missing: {sku: [missing_codmats]} for kits with unpriced components
        - validation: validate_skus result (for caller convenience)
    """
    log = log_callback or (lambda _msg: None)

    if not orders:
        return {"codmat_policy_map": {}, "kit_missing": {}, "validation": validation or {}}

    if validation is None:
        all_skus = {item.sku for o in orders for item in o.items if item.sku}
        validation = validate_skus(all_skus, conn, id_gestiuni)

    log("Validare preturi...")

    # Direct CODMATs (SKU exists in NOM_ARTICOLE without ARTICOLE_TERTI mapping)
    all_codmats = set()
    for order in orders:
        for item in order.items:
            if item.sku in validation["mapped"]:
                continue
            if item.sku in validation["direct"]:
                all_codmats.add(item.sku)

    codmat_policy_map = {}

    if all_codmats:
        if id_pol_productie:
            codmat_policy_map = validate_and_ensure_prices_dual(
                all_codmats, id_pol, id_pol_productie,
                conn, validation.get("direct_id_map"), cota_tva=cota_tva,
            )
            log(f"Politici duale: "
                f"{sum(1 for v in codmat_policy_map.values() if v == id_pol)} vanzare, "
                f"{sum(1 for v in codmat_policy_map.values() if v == id_pol_productie)} productie")
        else:
            price_result = validate_prices(
                all_codmats, id_pol, conn, validation.get("direct_id_map"),
            )
            if price_result["missing_price"]:
                logger.info(
                    f"Auto-adding price 0 for {len(price_result['missing_price'])} "
                    f"direct articles in policy {id_pol}"
                )
                ensure_prices(
                    price_result["missing_price"], id_pol,
                    conn, validation.get("direct_id_map"), cota_tva=cota_tva,
                )

    # Mapped SKUs (via ARTICOLE_TERTI)
    mapped_skus_in_orders = {
        item.sku for o in orders for item in o.items
        if item.sku in validation["mapped"]
    }

    mapped_codmat_data = {}
    if mapped_skus_in_orders:
        mapped_codmat_data = resolve_mapped_codmats(
            mapped_skus_in_orders, conn, id_gestiuni=id_gestiuni,
        )
    mapped_id_map = {}
    for sku, entries in mapped_codmat_data.items():
        for entry in entries:
            mapped_id_map[entry["codmat"]] = {
                "id_articol": entry["id_articol"],
                "cont": entry.get("cont"),
            }
    mapped_codmats = set(mapped_id_map.keys())
    if mapped_codmats:
        if id_pol_productie:
            mapped_policy_map = validate_and_ensure_prices_dual(
                mapped_codmats, id_pol, id_pol_productie,
                conn, mapped_id_map, cota_tva=cota_tva,
            )
            codmat_policy_map.update(mapped_policy_map)
        else:
            mp_result = validate_prices(
                mapped_codmats, id_pol, conn, mapped_id_map,
            )
            if mp_result["missing_price"]:
                ensure_prices(
                    mp_result["missing_price"], id_pol,
                    conn, mapped_id_map, cota_tva=cota_tva,
|
||||
)
|
||||
|
||||
# Bridge SKU → policy via 1:1 mappings (build_articles_json reads by SKU)
|
||||
if codmat_policy_map and mapped_codmat_data:
|
||||
for sku, entries in mapped_codmat_data.items():
|
||||
if len(entries) == 1:
|
||||
codmat = entries[0]["codmat"]
|
||||
if codmat in codmat_policy_map:
|
||||
codmat_policy_map[sku] = codmat_policy_map[codmat]
|
||||
|
||||
if codmat_policy_map:
|
||||
app_settings["_codmat_policy_map"] = codmat_policy_map
|
||||
|
||||
# Kit component price gating
|
||||
kit_missing = {}
|
||||
kit_pricing_mode = app_settings.get("kit_pricing_mode")
|
||||
if kit_pricing_mode and mapped_codmat_data:
|
||||
kit_missing = validate_kit_component_prices(
|
||||
mapped_codmat_data, id_pol, id_pol_productie, conn,
|
||||
)
|
||||
if kit_missing:
|
||||
for sku, missing_codmats in kit_missing.items():
|
||||
log(f"Kit {sku}: prețuri lipsă pentru {', '.join(missing_codmats)}")
|
||||
|
||||
return {
|
||||
"codmat_policy_map": codmat_policy_map,
|
||||
"kit_missing": kit_missing,
|
||||
"validation": validation,
|
||||
"mapped_codmat_data": mapped_codmat_data,
|
||||
}
|
||||
|
||||
|
||||
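The dual-policy split above (articles on production accounts 341/345 routed to `id_pol_productie`, everything else to the default sales policy) can be sketched in isolation. This is a hypothetical illustration, not the actual body of `validate_and_ensure_prices_dual`; the prefix-match on `cont` is an assumption about how account codes like "341.1" are compared.

```python
# Hypothetical sketch of the cont 341/345 routing rule described in the
# docstring above. `id_map` mirrors the {codmat: {"id_articol", "cont"}}
# shape the validator passes around; prefix matching is an assumption.
PRODUCTION_CONTS = ("341", "345")

def route_policy(id_map: dict, id_pol: int, id_pol_productie: int) -> dict:
    policy_map = {}
    for codmat, info in id_map.items():
        cont = info.get("cont") or ""
        is_production = cont.startswith(PRODUCTION_CONTS)
        policy_map[codmat] = id_pol_productie if is_production else id_pol
    return policy_map
```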
def validate_kit_component_prices(mapped_codmat_data: dict, id_pol: int,
                                  id_pol_productie: int = None, conn=None) -> dict:
    """Pre-validate that kit components have non-zero prices in crm_politici_pret_art.
@@ -601,6 +601,142 @@ function _renderReceipt(items, order) {
}

// ── Order Detail Modal (shared) ──────────────────
function _configureDetailButtons(order, orderNumber, opts) {
  const status = (order.status || '').toUpperCase();
  const isInvoiced = !!(order.factura_numar);

  const retryBtn = document.getElementById('detailRetryBtn');
  if (retryBtn) {
    const canRetry = [ORDER_STATUS.ERROR, ORDER_STATUS.SKIPPED, ORDER_STATUS.DELETED_IN_ROA].includes(status);
    retryBtn.style.display = canRetry ? '' : 'none';
    if (canRetry) {
      retryBtn.onclick = async () => {
        retryBtn.disabled = true;
        retryBtn.innerHTML = '<span class="spinner-border spinner-border-sm me-1"></span> Reimportare...';
        try {
          const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/retry`, { method: 'POST' });
          const data = await res.json();
          if (data.success) {
            retryBtn.innerHTML = '<i class="bi bi-check-circle"></i> ' + (data.message || 'Reimportat');
            retryBtn.className = 'btn btn-sm btn-success';
            if (opts.onStatusChange) opts.onStatusChange();
            setTimeout(() => renderOrderDetailModal(orderNumber, opts), 1500);
          } else {
            retryBtn.innerHTML = '<i class="bi bi-exclamation-triangle"></i> ' + (data.message || 'Eroare');
            retryBtn.className = 'btn btn-sm btn-danger';
            setTimeout(() => {
              retryBtn.innerHTML = '<i class="bi bi-arrow-clockwise"></i> Reimporta';
              retryBtn.className = 'btn btn-sm btn-outline-primary';
              retryBtn.disabled = false;
            }, 3000);
          }
        } catch (err) {
          retryBtn.innerHTML = 'Eroare: ' + err.message;
          retryBtn.disabled = false;
        }
      };
    }
  }

  const resyncBtn = document.getElementById('detailResyncBtn');
  if (resyncBtn) {
    const canResync = [ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes(status);
    resyncBtn.style.display = canResync ? '' : 'none';
    if (canResync) {
      if (isInvoiced) {
        resyncBtn.disabled = true;
        resyncBtn.style.opacity = '0.5';
        resyncBtn.style.pointerEvents = 'none';
        resyncBtn.title = 'Comanda facturata';
      } else {
        resyncBtn.disabled = false;
        resyncBtn.style.opacity = '';
        resyncBtn.style.pointerEvents = '';
        resyncBtn.title = '';
        resyncBtn.onclick = () => {
          inlineConfirmAction(resyncBtn, 'Confirmi resync?', async (btn) => {
            try {
              const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/resync`, { method: 'POST' });
              const data = await res.json();
              if (data.success) {
                btn.innerHTML = '<i class="bi bi-check-circle"></i> Reimportat';
                btn.className = 'btn btn-sm btn-success';
                if (opts.onStatusChange) opts.onStatusChange();
                setTimeout(() => renderOrderDetailModal(orderNumber, opts), 1500);
              } else {
                btn.innerHTML = '<i class="bi bi-exclamation-triangle"></i> ' + (data.message || 'Eroare');
                btn.className = 'btn btn-sm btn-danger';
                setTimeout(() => {
                  btn.innerHTML = '<i class="bi bi-arrow-repeat"></i> Resync';
                  btn.className = 'btn btn-sm btn-outline-warning';
                  btn.disabled = false;
                }, 3000);
              }
            } catch (err) {
              btn.innerHTML = 'Eroare: ' + err.message;
              btn.disabled = false;
            }
          }, {
            defaultHtml: '<i class="bi bi-arrow-repeat"></i> Resync',
            loadingText: 'Resync...',
            confirmClass: 'btn-warning',
            defaultBtnClass: 'btn-outline-warning'
          });
        };
      }
    }
  }

  const deleteBtn = document.getElementById('detailDeleteBtn');
  if (deleteBtn) {
    const canDelete = [ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes(status);
    deleteBtn.style.display = canDelete ? '' : 'none';
    if (canDelete) {
      if (isInvoiced) {
        deleteBtn.disabled = true;
        deleteBtn.style.opacity = '0.5';
        deleteBtn.style.pointerEvents = 'none';
        deleteBtn.title = 'Comanda facturata';
      } else {
        deleteBtn.disabled = false;
        deleteBtn.style.opacity = '';
        deleteBtn.style.pointerEvents = '';
        deleteBtn.title = '';
        deleteBtn.onclick = () => {
          inlineConfirmAction(deleteBtn, 'Confirmi stergerea?', async (btn) => {
            try {
              const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/delete`, { method: 'POST' });
              const data = await res.json();
              if (data.success) {
                btn.innerHTML = '<i class="bi bi-check-circle"></i> Sters';
                btn.className = 'btn btn-sm btn-danger';
                if (opts.onStatusChange) opts.onStatusChange();
                setTimeout(() => renderOrderDetailModal(orderNumber, opts), 1500);
              } else {
                btn.innerHTML = '<i class="bi bi-exclamation-triangle"></i> ' + (data.message || 'Eroare');
                btn.className = 'btn btn-sm btn-danger';
                setTimeout(() => {
                  btn.innerHTML = '<i class="bi bi-trash"></i> Sterge din ROA';
                  btn.className = 'btn btn-sm btn-outline-danger';
                  btn.disabled = false;
                }, 3000);
              }
            } catch (err) {
              btn.innerHTML = 'Eroare: ' + err.message;
              btn.disabled = false;
            }
          }, {
            defaultHtml: '<i class="bi bi-trash"></i> Sterge din ROA',
            loadingText: 'Stergere...',
            confirmClass: 'btn-danger',
            defaultBtnClass: 'btn-outline-danger'
          });
        };
      }
    }
  }
}

/**
 * Render and show the order detail modal.
 * @param {string} orderNumber
@@ -717,9 +853,14 @@ async function renderOrderDetailModal(orderNumber, opts) {
      document.getElementById('detailError').style.display = '';
    }

    // Configure footer action buttons BEFORE any early-return on items —
    // DELETED_IN_ROA orders have no items but must still expose the Reimporta button.
    _configureDetailButtons(order, orderNumber, opts);

    const items = data.items || [];
    if (items.length === 0) {
      document.getElementById('detailItemsBody').innerHTML = '<tr><td colspan="9" class="text-center text-muted">Niciun articol</td></tr>';
      if (opts.onAfterRender) opts.onAfterRender(order, items);
      return;
    }
@@ -853,143 +994,6 @@ async function renderOrderDetailModal(orderNumber, opts) {
    document.getElementById('detailItemsBody').innerHTML = tableHtml;
    _renderReceipt(items, order);

    // Retry button (only for ERROR/SKIPPED orders)
    const retryBtn = document.getElementById('detailRetryBtn');
    if (retryBtn) {
      const canRetry = [ORDER_STATUS.ERROR, ORDER_STATUS.SKIPPED, ORDER_STATUS.DELETED_IN_ROA].includes((order.status || '').toUpperCase());
      retryBtn.style.display = canRetry ? '' : 'none';
      if (canRetry) {
        retryBtn.onclick = async () => {
          retryBtn.disabled = true;
          retryBtn.innerHTML = '<span class="spinner-border spinner-border-sm me-1"></span> Reimportare...';
          try {
            const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/retry`, { method: 'POST' });
            const data = await res.json();
            if (data.success) {
              retryBtn.innerHTML = '<i class="bi bi-check-circle"></i> ' + (data.message || 'Reimportat');
              retryBtn.className = 'btn btn-sm btn-success';
              if (opts.onStatusChange) opts.onStatusChange();
              // Refresh modal after short delay
              setTimeout(() => renderOrderDetailModal(orderNumber, opts), 1500);
            } else {
              retryBtn.innerHTML = '<i class="bi bi-exclamation-triangle"></i> ' + (data.message || 'Eroare');
              retryBtn.className = 'btn btn-sm btn-danger';
              setTimeout(() => {
                retryBtn.innerHTML = '<i class="bi bi-arrow-clockwise"></i> Reimporta';
                retryBtn.className = 'btn btn-sm btn-outline-primary';
                retryBtn.disabled = false;
              }, 3000);
            }
          } catch (err) {
            retryBtn.innerHTML = 'Eroare: ' + err.message;
            retryBtn.disabled = false;
          }
        };
      }
    }

    // Resync button (IMPORTED/ALREADY_IMPORTED only)
    const resyncBtn = document.getElementById('detailResyncBtn');
    if (resyncBtn) {
      const canResync = [ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes((order.status || '').toUpperCase());
      resyncBtn.style.display = canResync ? '' : 'none';
      if (canResync) {
        const isInvoiced = !!(order.factura_numar);
        if (isInvoiced) {
          resyncBtn.disabled = true;
          resyncBtn.style.opacity = '0.5';
          resyncBtn.style.pointerEvents = 'none';
          resyncBtn.title = 'Comanda facturata';
        } else {
          resyncBtn.disabled = false;
          resyncBtn.style.opacity = '';
          resyncBtn.style.pointerEvents = '';
          resyncBtn.title = '';
          resyncBtn.onclick = () => {
            inlineConfirmAction(resyncBtn, 'Confirmi resync?', async (btn) => {
              try {
                const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/resync`, { method: 'POST' });
                const data = await res.json();
                if (data.success) {
                  btn.innerHTML = '<i class="bi bi-check-circle"></i> Reimportat';
                  btn.className = 'btn btn-sm btn-success';
                  if (opts.onStatusChange) opts.onStatusChange();
                  setTimeout(() => renderOrderDetailModal(orderNumber, opts), 1500);
                } else {
                  btn.innerHTML = '<i class="bi bi-exclamation-triangle"></i> ' + (data.message || 'Eroare');
                  btn.className = 'btn btn-sm btn-danger';
                  setTimeout(() => {
                    btn.innerHTML = '<i class="bi bi-arrow-repeat"></i> Resync';
                    btn.className = 'btn btn-sm btn-outline-warning';
                    btn.disabled = false;
                  }, 3000);
                }
              } catch (err) {
                btn.innerHTML = 'Eroare: ' + err.message;
                btn.disabled = false;
              }
            }, {
              defaultHtml: '<i class="bi bi-arrow-repeat"></i> Resync',
              loadingText: 'Resync...',
              confirmClass: 'btn-warning',
              defaultBtnClass: 'btn-outline-warning'
            });
          };
        }
      }
    }

    // Delete button (IMPORTED/ALREADY_IMPORTED only)
    const deleteBtn = document.getElementById('detailDeleteBtn');
    if (deleteBtn) {
      const canDelete = [ORDER_STATUS.IMPORTED, ORDER_STATUS.ALREADY_IMPORTED].includes((order.status || '').toUpperCase());
      deleteBtn.style.display = canDelete ? '' : 'none';
      if (canDelete) {
        const isInvoiced = !!(order.factura_numar);
        if (isInvoiced) {
          deleteBtn.disabled = true;
          deleteBtn.style.opacity = '0.5';
          deleteBtn.style.pointerEvents = 'none';
          deleteBtn.title = 'Comanda facturata';
        } else {
          deleteBtn.disabled = false;
          deleteBtn.style.opacity = '';
          deleteBtn.style.pointerEvents = '';
          deleteBtn.title = '';
          deleteBtn.onclick = () => {
            inlineConfirmAction(deleteBtn, 'Confirmi stergerea?', async (btn) => {
              try {
                const res = await fetch(`/api/orders/${encodeURIComponent(orderNumber)}/delete`, { method: 'POST' });
                const data = await res.json();
                if (data.success) {
                  btn.innerHTML = '<i class="bi bi-check-circle"></i> Sters';
                  btn.className = 'btn btn-sm btn-danger';
                  if (opts.onStatusChange) opts.onStatusChange();
                  setTimeout(() => renderOrderDetailModal(orderNumber, opts), 1500);
                } else {
                  btn.innerHTML = '<i class="bi bi-exclamation-triangle"></i> ' + (data.message || 'Eroare');
                  btn.className = 'btn btn-sm btn-danger';
                  setTimeout(() => {
                    btn.innerHTML = '<i class="bi bi-trash"></i> Sterge din ROA';
                    btn.className = 'btn btn-sm btn-outline-danger';
                    btn.disabled = false;
                  }, 3000);
                }
              } catch (err) {
                btn.innerHTML = 'Eroare: ' + err.message;
                btn.disabled = false;
              }
            }, {
              defaultHtml: '<i class="bi bi-trash"></i> Sterge din ROA',
              loadingText: 'Stergere...',
              confirmClass: 'btn-danger',
              defaultBtnClass: 'btn-outline-danger'
            });
          };
        }
      }
    }

    if (opts.onAfterRender) opts.onAfterRender(order, items);
  } catch (err) {
    document.getElementById('detailError').textContent = err.message;
@@ -169,7 +169,7 @@

<script>window.ROOT_PATH = "{{ rp }}";</script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.bundle.min.js"></script>
<script src="{{ rp }}/static/js/shared.js?v=47"></script>
<script src="{{ rp }}/static/js/shared.js?v=49"></script>
<script>
// Dark mode toggle
function toggleDarkMode() {
@@ -21,6 +21,11 @@ CREATE OR REPLACE PACKAGE PACK_IMPORT_PARTENERI AS
  -- 22.04.2026 - fix numar overflow: the first component stays in numar; "SAT X" → p_localitate
  --              (the village = localitate; the existing TIER L1/L2/L3 lookup resolves id_loc);
  --              landmark → strada; COM/ORAS/MUN ignored (already in p_localitate from the GoMag city)
  -- 23.04.2026 - hardening: unconditional SUBSTR(1,10) after the split blocks
  --              residual overflow when a long prefix has no space in its first 10 chars.
  -- 28.04.2026 - fix ORA-06502: v_bloc/scara/apart/etaj in cauta_sau_creeaza_adresa
  --              widened to VARCHAR2(100) — the Oracle OUT parameter inherited the
  --              VARCHAR2(10) constraint and failed on "apartament 140 interfon 140 Municipiul..."

  -- ====================================================================
  -- CONSTANTS
@@ -712,6 +717,10 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_PARTENERI AS
        p_numar := SUBSTR(p_numar, 1, 10);
      END IF;
    END IF;
    -- Safety net: if the split above left more than 10 chars (e.g. the prefix
    -- before the first space was itself longer than 10), force the column limit.
    -- 23.04.2026 - residual-overflow hardening
    p_numar := SUBSTR(p_numar, 1, 10);
    p_bloc  := SUBSTR(p_bloc, 1, 30);
    p_scara := SUBSTR(p_scara, 1, 10);
    p_apart := SUBSTR(p_apart, 1, 10);
@@ -969,10 +978,10 @@ CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_PARTENERI AS
    v_strada     VARCHAR2(1000);
    v_numar      VARCHAR2(1000);
    v_sector     VARCHAR2(100);
    v_bloc       VARCHAR2(30);
    v_scara      VARCHAR2(10);
    v_apart      VARCHAR2(10);
    v_etaj       VARCHAR2(20);
    v_bloc       VARCHAR2(100);
    v_scara      VARCHAR2(100);
    v_apart      VARCHAR2(100);
    v_etaj       VARCHAR2(100);
    v_id_tara    NUMBER(10);
    v_principala NUMBER(1);
  begin
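The PL/SQL safety net above truncates every address part to its column width unconditionally, so no split path can leave an oversized value. A minimal Python mirror of that rule (hypothetical helper, limits taken from the SUBSTR calls above) looks like this:

```python
# Hypothetical mirror of the PL/SQL safety net: clamp each address part to the
# column limit used by the SUBSTR calls, regardless of how the split went.
LIMITS = {"numar": 10, "bloc": 30, "scara": 10, "apart": 10}

def clamp_address_parts(parts: dict) -> dict:
    # Missing/None parts become empty strings; everything is hard-truncated.
    return {key: (parts.get(key) or "")[:limit] for key, limit in LIMITS.items()}
```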
@@ -821,6 +821,49 @@ class TestFormatAddressForOracle:
        assert result == "JUD:Bacau;Zemes;Str Principala Modarzau Blocuri"


class TestCleanWebTextDiacritics:
    """clean_web_text strips diacritics across RO/HU/DE/CZ/PL via NFKD."""

    def test_hungarian_acute(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("BALÁZS LORÁNT") == "BALAZS LORANT"

    def test_hungarian_double_acute(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("Lőrincz Ödön") == "Lorincz Odon"
        assert clean_web_text("Erdős Pál") == "Erdos Pal"

    def test_romanian_comma_below_modern(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("Ștefan Țîrcă") == "Stefan Tirca"
        assert clean_web_text("ȘTEFAN ȚÎRCĂ") == "STEFAN TIRCA"

    def test_romanian_cedilla_legacy_preserved(self):
        """Cedilla ş/ţ/Ş/Ţ must still normalize to s/t/S/T (regression guard)."""
        from app.services.import_service import clean_web_text
        assert clean_web_text("şcoala") == "scoala"
        assert clean_web_text("ţara") == "tara"
        assert clean_web_text("ŞTEFAN ŢARA") == "STEFAN TARA"
        assert clean_web_text("IAŞI") == "IASI"

    def test_german_umlaut_and_eszett(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("Müller Straße") == "Muller Strasse"

    def test_czech_polish(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("Dvořák") == "Dvorak"
        assert clean_web_text("Łódź") == "Lodz"

    def test_html_entity_unescape(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("Caf&eacute;") == "Cafe"

    def test_empty_input(self):
        from app.services.import_service import clean_web_text
        assert clean_web_text("") == ""
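The behavior these tests pin down can be sketched without the app: NFKD decomposition followed by dropping combining marks handles most of RO/HU/DE/CZ, while characters with no canonical decomposition (German sharp s, Polish l-stroke) need an explicit map. This is a standalone sketch of the technique, not the actual `clean_web_text` implementation.

```python
import unicodedata

# Sketch of NFKD-based diacritics stripping. ß and ł/Ł have no canonical
# decomposition, so NFKD leaves them untouched; they need an explicit table.
_NO_DECOMPOSITION = str.maketrans({"ß": "ss", "ẞ": "SS", "ł": "l", "Ł": "L"})

def strip_diacritics(text: str) -> str:
    text = text.translate(_NO_DECOMPOSITION)
    decomposed = unicodedata.normalize("NFKD", text)
    # Keep base characters, drop the combining marks split off by NFKD.
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))
```

Note that Hungarian ő/ű (double acute) and Romanian ș/ț (comma below) do decompose under NFKD, which is why the tests above pass without special-casing them.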
# ===========================================================================
# Group 11: TestRefreshOrderAddress
# ===========================================================================
@@ -168,12 +168,15 @@ async def test_save_orders_batch_overwrite():


# ===========================================================================
# mark_order_deleted_in_roa — must purge items
# mark_order_deleted_in_roa — preserves items so detail view stays useful
# ===========================================================================

@pytest.mark.asyncio
async def test_mark_order_deleted_removes_items():
    """Soft-delete must remove order_items (no ghost rows)."""
async def test_mark_order_deleted_preserves_items():
    """Soft-delete keeps order_items so the detail view shows what was ordered.

    On 'Reimporta', add_order_items replaces them (DELETE+INSERT inside _safe_upsert_order_items).
    """
    await _seed_order("ORD-DEL")
    await sqlite_service.add_order_items("ORD-DEL", [
        _item("SKU1", qty=5), _item("SKU2", qty=3),
@@ -182,8 +185,10 @@ async def test_mark_order_deleted_removes_items():

    await sqlite_service.mark_order_deleted_in_roa("ORD-DEL")

    # Items purged
    assert await _items_for("ORD-DEL") == []
    # Items preserved — detail view can still display them alongside "Comanda stearsa din ROA"
    items = await _items_for("ORD-DEL")
    assert len(items) == 2
    assert {i["sku"] for i in items} == {"SKU1", "SKU2"}

    # Orders row still present with DELETED_IN_ROA status (not hard-deleted)
    db = await sqlite_service.get_sqlite()
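The DELETE+INSERT replace pattern the docstring leans on (items survive soft-delete, then `add_order_items` swaps them atomically on reimport) can be sketched with plain sqlite3. This is a hypothetical illustration of the pattern, not the project's `_safe_upsert_order_items`; the table and column names are assumptions.

```python
import sqlite3

# Sketch of the DELETE+INSERT replace pattern: one transaction deletes the
# order's old item rows and inserts the fresh set, so a reimport never leaves
# a mix of stale and new rows. Table/columns are hypothetical.
def replace_order_items(db: sqlite3.Connection, order_number: str,
                        items: list[tuple[str, int]]) -> None:
    with db:  # context manager = one transaction: commit on success, rollback on error
        db.execute("DELETE FROM order_items WHERE order_number = ?", (order_number,))
        db.executemany(
            "INSERT INTO order_items (order_number, sku, qty) VALUES (?, ?, ?)",
            [(order_number, sku, qty) for sku, qty in items],
        )
```

Calling it twice for the same order is idempotent in row count: the second call simply replaces the first call's rows.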
api/tests/test_pre_validate_order_prices.py — 412 lines (Normal file)
@@ -0,0 +1,412 @@
|
||||
"""Tests for validation_service.pre_validate_order_prices and retry pre-validation.
|
||||
|
||||
Regression source: production VENDING orders #485841978 and #485841895 (2026-04-28)
|
||||
crashed with PL/SQL COM-001 'Pretul pentru acest articol nu a fost gasit in lista
|
||||
de preturi' because the Retry button skipped the price-list pre-population step
|
||||
that bulk sync runs.
|
||||
|
||||
These tests verify:
|
||||
- pre_validate_order_prices auto-inserts PRET=0 in CRM_POLITICI_PRET_ART for
|
||||
CODMATs missing entries (so PL/SQL doesn't crash).
|
||||
- Dual-policy routing: cont 341/345 → id_pol_productie; else → id_pol.
|
||||
- Empty input returns empty result without DB calls.
|
||||
- Idempotent: running twice when prices already exist does no inserts.
|
||||
- retry_service propagates pre-validation failures as ERROR with clear message.
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
from unittest.mock import patch, MagicMock, AsyncMock
|
||||
|
||||
import pytest
|
||||
|
||||
sys.path.insert(0, os.path.join(os.path.dirname(__file__), ".."))
|
||||
|
||||
from app.services.order_reader import OrderData, OrderItem, OrderShipping, OrderBilling
|
||||
|
||||
|
||||
def _make_order(number: str, items: list[tuple[str, float, float]]) -> OrderData:
|
||||
"""items = [(sku, quantity, price), ...]"""
|
||||
return OrderData(
|
||||
id=number,
|
||||
number=number,
|
||||
date="2026-04-28 10:00:00",
|
||||
items=[
|
||||
OrderItem(sku=sku, name=f"Product {sku}", price=price,
|
||||
quantity=qty, vat=21.0, baseprice=price)
|
||||
for sku, qty, price in items
|
||||
],
|
||||
billing=OrderBilling(firstname="Ion", lastname="Test"),
|
||||
shipping=OrderShipping(firstname="Ion", lastname="Test"),
|
||||
)
|
||||
|
||||
|
||||
# ============================================================
|
||||
# UNIT TESTS — no Oracle
|
||||
# ============================================================
|
||||
|
||||
@pytest.mark.unit
|
||||
def test_pre_validate_empty_orders_returns_empty():
|
||||
"""Empty orders list short-circuits without any DB calls."""
|
||||
from app.services import validation_service
|
||||
|
||||
app_settings = {}
|
||||
mock_conn = MagicMock()
|
||||
|
||||
result = validation_service.pre_validate_order_prices(
|
||||
orders=[],
|
||||
app_settings=app_settings,
|
||||
conn=mock_conn,
|
||||
id_pol=1,
|
||||
)
|
||||
|
||||
assert result["codmat_policy_map"] == {}
|
||||
assert result["kit_missing"] == {}
|
||||
# No DB cursor was opened
|
||||
mock_conn.cursor.assert_not_called()
|
||||
# app_settings unchanged
|
||||
assert "_codmat_policy_map" not in app_settings
|
||||
|
||||
|
||||
@pytest.mark.unit
|
||||
def test_pre_validate_no_skus_in_orders():
|
||||
"""Orders with no items skip price validation entirely."""
|
||||
from app.services import validation_service
|
||||
|
||||
order = OrderData(id="1", number="1", date="2026-04-28", items=[],
|
||||
billing=OrderBilling())
|
||||
app_settings = {}
|
||||
mock_conn = MagicMock()
|
||||
|
||||
# validation passed-in is empty
|
||||
validation = {"mapped": set(), "direct": set(), "missing": set(),
|
||||
"direct_id_map": {}}
|
||||
|
||||
result = validation_service.pre_validate_order_prices(
|
||||
orders=[order],
|
||||
app_settings=app_settings,
|
||||
conn=mock_conn,
|
||||
id_pol=1,
|
||||
validation=validation,
|
||||
)
|
||||
|
||||
assert result["codmat_policy_map"] == {}
|
||||
assert result["kit_missing"] == {}
|
||||
|
||||
|
||||
@pytest.mark.unit
|
||||
async def test_retry_propagates_pre_validation_error():
|
||||
"""Pre-validation failure in retry path returns ERROR with clear message."""
|
||||
from app.services import retry_service
|
||||
|
||||
target_order = _make_order("RETRY-FAIL-1", [("SKU-X", 1, 100)])
|
||||
|
||||
fake_pool = MagicMock()
|
||||
fake_conn = MagicMock()
|
||||
fake_pool.release = MagicMock()
|
||||
|
||||
with patch("app.services.gomag_client.download_orders",
|
||||
new=AsyncMock(return_value=None)), \
|
||||
patch("app.services.order_reader.read_json_orders",
|
||||
return_value=([target_order], 1)), \
|
||||
patch("app.services.sqlite_service.upsert_order",
|
||||
new=AsyncMock()) as mock_upsert, \
|
||||
patch("app.services.validation_service.validate_skus",
|
||||
return_value={"mapped": set(), "direct": {"SKU-X"},
|
||||
"missing": set(), "direct_id_map": {"SKU-X": 1}}), \
|
||||
patch("app.services.validation_service.pre_validate_order_prices",
|
||||
side_effect=RuntimeError("ORA-12541: TNS no listener")), \
|
||||
patch("app.database.pool", fake_pool), \
|
||||
patch("app.database.get_oracle_connection", return_value=fake_conn):
|
||||
|
||||
app_settings = {"id_pol": "1", "id_gestiune": "1", "discount_vat": "21"}
|
||||
|
||||
result = await retry_service._download_and_reimport(
|
||||
order_number="RETRY-FAIL-1",
|
||||
order_date_str="2026-04-28T10:00:00",
|
||||
customer_name="Ion Test",
|
||||
app_settings=app_settings,
|
||||
)
|
||||
|
||||
assert result["success"] is False
|
||||
assert "pre-validare" in result["message"].lower()
|
||||
assert "TNS" in result["message"]
|
||||
# Verify ERROR persisted to SQLite
|
||||
mock_upsert.assert_called_once()
|
||||
call_kwargs = mock_upsert.call_args.kwargs
|
||||
assert call_kwargs["status"] == "ERROR"
|
||||
|
||||
|
||||
# ============================================================
|
||||
# ORACLE INTEGRATION TESTS — require live Oracle
|
||||
# ============================================================
|
||||
|
||||
# Oracle connection setup (lazy import to keep unit tests isolated)
|
||||
def _get_oracle_conn():
|
||||
import oracledb
|
||||
from dotenv import load_dotenv
|
||||
load_dotenv('.env')
|
||||
user = os.environ['ORACLE_USER']
|
||||
password = os.environ['ORACLE_PASSWORD']
|
||||
dsn = os.environ['ORACLE_DSN']
|
||||
try:
|
||||
instantclient_path = os.environ.get(
|
||||
'INSTANTCLIENTPATH', '/opt/oracle/instantclient_23_9'
|
||||
)
|
||||
oracledb.init_oracle_client(lib_dir=instantclient_path)
|
||||
except Exception:
|
||||
pass
|
||||
return oracledb.connect(user=user, password=password, dsn=dsn)
|
||||
|
||||
|
||||
def _has_price_entry(cur, id_pol: int, id_articol: int) -> tuple[bool, float | None]:
|
||||
"""Returns (exists, pret). pret is None if row doesn't exist."""
|
||||
cur.execute("""
|
||||
SELECT PRET FROM crm_politici_pret_art
|
||||
WHERE id_pol = :p AND id_articol = :a
|
||||
""", {"p": id_pol, "a": id_articol})
|
||||
row = cur.fetchone()
|
||||
return (row is not None, row[0] if row else None)
|
||||
|
||||
|
||||
def _pick_unpriced_article(cur, id_pol: int, count: int = 1) -> list[tuple[int, str]]:
|
||||
"""Find existing NOM_ARTICOLE rows without CRM_POLITICI_PRET_ART entry for id_pol.
|
||||
Returns: [(id_articol, codmat), ...]. Skips test if not enough found.
|
||||
"""
|
||||
cur.execute("""
|
||||
SELECT id_articol, codmat FROM nom_articole na
|
||||
WHERE sters = 0 AND inactiv = 0
|
||||
AND NOT EXISTS (
|
||||
SELECT 1 FROM crm_politici_pret_art pa
|
||||
WHERE pa.id_articol = na.id_articol AND pa.id_pol = :pol
|
||||
)
|
||||
AND ROWNUM <= :n
|
||||
""", {"pol": id_pol, "n": count})
|
||||
return cur.fetchall()
|
||||
|
||||
|
||||
def _pick_priced_article(cur, id_pol: int) -> tuple[int, str, float] | None:
|
||||
"""Find any (id_articol, codmat, pret) with existing CRM_POLITICI_PRET_ART entry."""
|
||||
cur.execute("""
|
||||
SELECT na.id_articol, na.codmat, pa.pret
|
||||
FROM nom_articole na
|
||||
JOIN crm_politici_pret_art pa ON pa.id_articol = na.id_articol
|
||||
WHERE pa.id_pol = :pol AND na.sters = 0 AND na.inactiv = 0
|
||||
AND ROWNUM <= 1
|
||||
""", {"pol": id_pol})
|
||||
return cur.fetchone()
|
||||
|
||||
|
||||
def _pick_default_id_pol(cur) -> int | None:
|
||||
"""Pick first usable id_pol from CRM_POLITICI_PRETURI."""
|
||||
cur.execute("""
|
||||
SELECT id_pol FROM crm_politici_preturi
|
||||
WHERE sters = 0 AND ROWNUM <= 1
|
||||
ORDER BY id_pol
|
||||
""")
|
||||
row = cur.fetchone()
|
||||
return row[0] if row else None
|
||||
|
||||
|
||||
@pytest.mark.oracle
def test_pre_validate_inserts_missing_prices_for_direct_sku():
    """REGRESSION (prod orders #485841978, #485841895):
    A SKU that resolves directly to a CODMAT in NOM_ARTICOLE with NO entry
    in CRM_POLITICI_PRET_ART must auto-insert PRET=0 so the import doesn't
    crash with COM-001.

    Uses a real unpriced article from the test schema. Cleans up after.
    """
    from app.services import validation_service

    with _get_oracle_conn() as conn:
        with conn.cursor() as cur:
            id_pol = _pick_default_id_pol(cur)
            assert id_pol is not None, "No usable id_pol found in CRM_POLITICI_PRETURI"

            unpriced = _pick_unpriced_article(cur, id_pol, count=1)
            if not unpriced:
                pytest.skip(f"All articles in policy {id_pol} already have prices")

            id_art, codmat = unpriced[0]
            inserted = False
            try:
                # Pre-condition
                exists, _ = _has_price_entry(cur, id_pol, id_art)
                assert not exists, f"Pre-condition: {codmat} should be unpriced"

                # Use codmat as direct SKU. validate_skus → direct (matches NOM_ARTICOLE)
                order = _make_order("VEN-PV-DIRECT", [(codmat, 1, 100)])
                app_settings = {}
                validation = {
                    "mapped": set(),
                    "direct": {codmat},
                    "missing": set(),
                    "direct_id_map": {codmat: {"id_articol": id_art, "cont": None}},
                }

                validation_service.pre_validate_order_prices(
                    orders=[order], app_settings=app_settings, conn=conn,
                    id_pol=id_pol, validation=validation, cota_tva=21,
                )
                conn.commit()
                inserted = True

                # Post-condition: PRET=0 row created
                exists, pret = _has_price_entry(cur, id_pol, id_art)
                assert exists, (
                    f"REGRESSION: price entry for {codmat} (id={id_art}) "
                    f"in policy {id_pol} should be auto-created"
                )
                assert pret == 0, f"Auto-inserted price should be 0, got {pret}"
            finally:
                if inserted:
                    cur.execute(
                        "DELETE FROM crm_politici_pret_art "
                        "WHERE id_pol = :p AND id_articol = :a AND pret = 0",
                        {"p": id_pol, "a": id_art},
                    )
                    conn.commit()


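The regression above locks in an insert-if-missing behaviour. A minimal Python sketch of that idea, under stated assumptions: the real SQL lives in `validation_service` and is not shown here, and `ensure_price_rows` is an illustrative name, not the app's helper. Only articles with no existing price row get a PRET=0 row, which also makes a second pass a no-op.

```python
def ensure_price_rows(existing, article_keys):
    """existing: {(id_pol, id_articol): pret}; returns the PRET=0 rows to insert.

    Keys already present in `existing` are skipped, so repeated calls with the
    same inputs produce no further inserts (idempotent).
    """
    return [
        {"id_pol": pol, "id_articol": art, "pret": 0}
        for pol, art in article_keys
        if (pol, art) not in existing
    ]

# (5, 1) is already priced, so only (5, 2) gets an auto-inserted PRET=0 row.
print(ensure_price_rows({(5, 1): 9.9}, [(5, 1), (5, 2)]))
# [{'id_pol': 5, 'id_articol': 2, 'pret': 0}]
```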
@pytest.mark.oracle
def test_pre_validate_idempotent_when_prices_exist():
    """When all CODMATs already have CRM_POLITICI_PRET_ART entries, no INSERTs run.

    Verifies idempotency on a second pre-validation pass — existing prices untouched.
    """
    from app.services import validation_service

    with _get_oracle_conn() as conn:
        with conn.cursor() as cur:
            id_pol = _pick_default_id_pol(cur)
            assert id_pol is not None, "No usable id_pol found"

            priced = _pick_priced_article(cur, id_pol)
            if not priced:
                pytest.skip(f"No priced articles in policy {id_pol}")

            id_art, codmat, pret_orig = priced

            cur.execute("""SELECT COUNT(*) FROM crm_politici_pret_art
                           WHERE id_articol = :a AND id_pol = :p""",
                        {"a": id_art, "p": id_pol})
            count_before = cur.fetchone()[0]

            order = _make_order("VEN-IDEM", [(codmat, 1, 200)])
            app_settings = {}
            validation = {
                "mapped": set(), "direct": {codmat}, "missing": set(),
                "direct_id_map": {codmat: {"id_articol": id_art, "cont": None}},
            }

            for _ in range(2):  # Run twice
                validation_service.pre_validate_order_prices(
                    orders=[order], app_settings=app_settings, conn=conn,
                    id_pol=id_pol, validation=validation, cota_tva=21,
                )
                conn.commit()

            cur.execute("""SELECT COUNT(*), MAX(pret) FROM crm_politici_pret_art
                           WHERE id_articol = :a AND id_pol = :p""",
                        {"a": id_art, "p": id_pol})
            count_after, pret_after = cur.fetchone()
            assert count_after == count_before, (
                f"Idempotency violated: {count_before} → {count_after} rows"
            )
            assert pret_after == pret_orig, (
                f"Existing price changed: {pret_orig} → {pret_after}"
            )


@pytest.mark.oracle
def test_pre_validate_dual_policy_routing():
    """Articles with cont 341/345 route to id_pol_productie; others to id_pol_vanzare.

    Picks two existing unpriced articles, marks one with cont=341, runs
    pre_validate, asserts each landed in the expected policy.
    """
    from app.services import validation_service

    with _get_oracle_conn() as conn:
        with conn.cursor() as cur:
            id_pol = _pick_default_id_pol(cur)
            assert id_pol is not None, "No usable id_pol"

            # Find a second policy to use as productie (any other usable id_pol).
            # Subquery needed: Oracle applies ROWNUM before ORDER BY.
            cur.execute("""SELECT id_pol FROM (
                               SELECT id_pol FROM crm_politici_preturi
                               WHERE sters = 0 AND id_pol != :p
                               ORDER BY id_pol
                           ) WHERE ROWNUM <= 1""", {"p": id_pol})
            row = cur.fetchone()
            if not row:
                pytest.skip("Need 2 distinct id_pol values for dual-policy test")
            id_pol_productie = row[0]

            unpriced = _pick_unpriced_article(cur, id_pol, count=2)
            if len(unpriced) < 2:
                pytest.skip("Need 2 unpriced articles for dual-policy test")
            (id_prod, codmat_prod), (id_sales, codmat_sales) = unpriced[0], unpriced[1]

            # Save original cont values for cleanup
            cur.execute("SELECT cont FROM nom_articole WHERE id_articol = :a",
                        {"a": id_prod})
            cont_prod_orig = cur.fetchone()[0]

            try:
                cur.execute("UPDATE nom_articole SET cont = '341' "
                            "WHERE id_articol = :a", {"a": id_prod})
                conn.commit()

                order = _make_order(
                    "VEN-DUAL",
                    [(codmat_prod, 1, 50), (codmat_sales, 1, 80)],
                )
                app_settings = {}
                validation = {
                    "mapped": set(),
                    "direct": {codmat_prod, codmat_sales},
                    "missing": set(),
                    "direct_id_map": {
                        codmat_prod: {"id_articol": id_prod, "cont": "341"},
                        # Any cont outside 341/345 should route to the sales
                        # policy; fall back to "302" when the original is NULL.
                        codmat_sales: {"id_articol": id_sales, "cont": cont_prod_orig or "302"},
                    },
                }

                result = validation_service.pre_validate_order_prices(
                    orders=[order], app_settings=app_settings, conn=conn,
                    id_pol=id_pol, id_pol_productie=id_pol_productie,
                    validation=validation, cota_tva=21,
                )
                conn.commit()

                policy_map = result["codmat_policy_map"]
                assert policy_map.get(codmat_prod) == id_pol_productie, (
                    f"cont=341 article ({codmat_prod}) should route to "
                    f"productie={id_pol_productie}, got {policy_map.get(codmat_prod)}"
                )
                assert policy_map.get(codmat_sales) == id_pol, (
                    f"non-341 article ({codmat_sales}) should route to "
                    f"vanzare={id_pol}, got {policy_map.get(codmat_sales)}"
                )

                # Verify rows landed in the right policy
                exists_prod_in_prod, _ = _has_price_entry(cur, id_pol_productie, id_prod)
                exists_prod_in_sales, _ = _has_price_entry(cur, id_pol, id_prod)
                exists_sales_in_sales, _ = _has_price_entry(cur, id_pol, id_sales)
                exists_sales_in_prod, _ = _has_price_entry(cur, id_pol_productie, id_sales)
                assert exists_prod_in_prod and not exists_prod_in_sales, (
                    "cont=341 row should be in productie policy only"
                )
                assert exists_sales_in_sales and not exists_sales_in_prod, (
                    "Non-341 row should be in sales policy only"
                )
            finally:
                # Cleanup: restore cont, delete inserted PRET=0 rows
                cur.execute("UPDATE nom_articole SET cont = :c "
                            "WHERE id_articol = :a",
                            {"c": cont_prod_orig, "a": id_prod})
                cur.execute(
                    "DELETE FROM crm_politici_pret_art "
                    "WHERE id_pol IN (:p1, :p2) "
                    "AND id_articol IN (:a1, :a2) AND pret = 0",
                    {"p1": id_pol, "p2": id_pol_productie,
                     "a1": id_prod, "a2": id_sales},
                )
                conn.commit()
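The routing rule the dual-policy test exercises can be reduced to a one-line predicate. This is a hypothetical sketch (the real logic lives in `validation_service.pre_validate_order_prices`; the function and parameter names below are illustrative): articles whose account ("cont") is 341 or 345 go to the production price policy, everything else to the sales policy.

```python
def route_policy(cont, id_pol_vanzare, id_pol_productie):
    # cont 341/345 = production accounts; anything else is a sales article.
    return id_pol_productie if cont in ("341", "345") else id_pol_vanzare

print(route_policy("341", 10, 20))  # 20 (production policy)
print(route_policy("302", 10, 20))  # 10 (sales policy)
```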
api/tests/test_sticky_deleted_filter.py (new file, 140 lines)
@@ -0,0 +1,140 @@
"""
|
||||
Sticky DELETED_IN_ROA Filter Tests
|
||||
===================================
|
||||
Unit tests for get_deleted_in_roa_order_numbers() helper and integration
|
||||
test for the sticky-exclusion filter applied in sync_service before
|
||||
order classification.
|
||||
|
||||
Run:
|
||||
cd api && python -m pytest tests/test_sticky_deleted_filter.py -v
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
|
||||
import pytest
|
||||
|
||||
pytestmark = pytest.mark.unit
|
||||
|
||||
_tmpdir = tempfile.mkdtemp()
|
||||
os.environ.setdefault("FORCE_THIN_MODE", "true")
|
||||
os.environ.setdefault("SQLITE_DB_PATH", os.path.join(_tmpdir, "test_sticky_deleted.db"))
|
||||
os.environ.setdefault("ORACLE_DSN", "dummy")
|
||||
os.environ.setdefault("ORACLE_USER", "dummy")
|
||||
os.environ.setdefault("ORACLE_PASSWORD", "dummy")
|
||||
os.environ.setdefault("JSON_OUTPUT_DIR", _tmpdir)
|
||||
|
||||
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
if _api_dir not in sys.path:
|
||||
sys.path.insert(0, _api_dir)
|
||||
|
||||
from app import database
|
||||
from app.services import sqlite_service
|
||||
from app.constants import OrderStatus
|
||||
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
async def _clean_orders():
|
||||
"""Ensure schema exists, clear orders table before each test."""
|
||||
database.init_sqlite()
|
||||
db = await database.get_sqlite()
|
||||
try:
|
||||
await db.execute("DELETE FROM orders")
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
yield
|
||||
|
||||
|
||||
async def _insert_order(order_number: str, status: str, id_comanda: int | None = None):
|
||||
db = await database.get_sqlite()
|
||||
try:
|
||||
await db.execute(
|
||||
"""
|
||||
INSERT INTO orders (order_number, order_date, customer_name, status, id_comanda)
|
||||
VALUES (?, ?, ?, ?, ?)
|
||||
""",
|
||||
(order_number, "2026-04-22", "Test Customer", status, id_comanda),
|
||||
)
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_returns_empty_set_when_no_orders():
|
||||
"""Helper unit: empty table → empty set."""
|
||||
result = await sqlite_service.get_deleted_in_roa_order_numbers()
|
||||
assert result == set()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_returns_only_deleted_in_roa_status():
|
||||
"""Helper unit: filters only DELETED_IN_ROA, ignores other statuses."""
|
||||
await _insert_order("ORD-1", OrderStatus.IMPORTED.value, id_comanda=100)
|
||||
await _insert_order("ORD-2", OrderStatus.DELETED_IN_ROA.value)
|
||||
await _insert_order("ORD-3", OrderStatus.CANCELLED.value)
|
||||
await _insert_order("ORD-4", OrderStatus.ERROR.value)
|
||||
await _insert_order("ORD-5", OrderStatus.DELETED_IN_ROA.value)
|
||||
await _insert_order("ORD-6", OrderStatus.SKIPPED.value)
|
||||
|
||||
result = await sqlite_service.get_deleted_in_roa_order_numbers()
|
||||
assert result == {"ORD-2", "ORD-5"}
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_mark_order_deleted_then_helper_returns_it():
|
||||
"""Integration: mark_order_deleted_in_roa → helper picks it up."""
|
||||
await _insert_order("ORD-100", OrderStatus.IMPORTED.value, id_comanda=500)
|
||||
|
||||
before = await sqlite_service.get_deleted_in_roa_order_numbers()
|
||||
assert "ORD-100" not in before
|
||||
|
||||
await sqlite_service.mark_order_deleted_in_roa("ORD-100")
|
||||
|
||||
after = await sqlite_service.get_deleted_in_roa_order_numbers()
|
||||
assert "ORD-100" in after
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_filter_excludes_deleted_orders():
|
||||
"""Integration: simulates sync filter step.
|
||||
|
||||
Pre-mark ORD-2 as DELETED_IN_ROA, run the same filter logic from
|
||||
sync_service:478-489, assert ORD-2 is excluded.
|
||||
"""
|
||||
await _insert_order("ORD-1", OrderStatus.IMPORTED.value, id_comanda=100)
|
||||
await _insert_order("ORD-2", OrderStatus.DELETED_IN_ROA.value)
|
||||
await _insert_order("ORD-3", OrderStatus.IMPORTED.value, id_comanda=300)
|
||||
|
||||
incoming = [
|
||||
type("O", (), {"number": "ORD-1", "date": "2026-04-22"})(),
|
||||
type("O", (), {"number": "ORD-2", "date": "2026-04-22"})(),
|
||||
type("O", (), {"number": "ORD-3", "date": "2026-04-22"})(),
|
||||
type("O", (), {"number": "ORD-NEW", "date": "2026-04-22"})(),
|
||||
]
|
||||
|
||||
deleted_set = await sqlite_service.get_deleted_in_roa_order_numbers()
|
||||
excluded = [o for o in incoming if o.number in deleted_set]
|
||||
survivors = [o for o in incoming if o.number not in deleted_set]
|
||||
|
||||
assert {o.number for o in excluded} == {"ORD-2"}
|
||||
assert {o.number for o in survivors} == {"ORD-1", "ORD-3", "ORD-NEW"}
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_filter_with_no_deleted_is_noop():
|
||||
"""Integration: deleted_set empty → all orders pass through."""
|
||||
await _insert_order("ORD-1", OrderStatus.IMPORTED.value, id_comanda=100)
|
||||
|
||||
incoming = [
|
||||
type("O", (), {"number": "ORD-1", "date": "2026-04-22"})(),
|
||||
type("O", (), {"number": "ORD-NEW", "date": "2026-04-22"})(),
|
||||
]
|
||||
|
||||
deleted_set = await sqlite_service.get_deleted_in_roa_order_numbers()
|
||||
survivors = [o for o in incoming if o.number not in deleted_set]
|
||||
|
||||
assert deleted_set == set()
|
||||
assert {o.number for o in survivors} == {"ORD-1", "ORD-NEW"}
|
||||
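Stripped of the database round-trips, the sticky-exclusion step these integration tests simulate is a set-membership partition. A minimal sketch, assuming only what the docstrings state about sync_service:478-489 (`apply_sticky_filter` is an illustrative name, not the app's function): any incoming order whose number was ever marked DELETED_IN_ROA is dropped before classification, so re-syncs never resurrect it.

```python
def apply_sticky_filter(incoming, deleted_set):
    # Partition incoming orders: anything previously deleted in ROA is
    # excluded for good; the rest continue to classification.
    excluded = [o for o in incoming if o["number"] in deleted_set]
    survivors = [o for o in incoming if o["number"] not in deleted_set]
    return survivors, excluded

orders = [{"number": "ORD-1"}, {"number": "ORD-2"}, {"number": "ORD-NEW"}]
survivors, excluded = apply_sticky_filter(orders, {"ORD-2"})
print([o["number"] for o in survivors])  # ['ORD-1', 'ORD-NEW']
```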
api/tests/test_sync_cui_gate.py (new file, 256 lines)
@@ -0,0 +1,256 @@
"""
|
||||
CUI Gate Tests
|
||||
==============
|
||||
Unit tests for evaluate_cui_gate() and _record_order_error() in sync_service.
|
||||
|
||||
Tests 1-6: pure predicate, no IO.
|
||||
Test 7: integration — _record_order_error with pre-seeded SQLite IMPORTED row
|
||||
verifies COALESCE preserves existing id_partener.
|
||||
|
||||
Run:
|
||||
cd api && python -m pytest tests/test_sync_cui_gate.py -v
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
|
||||
import pytest
|
||||
|
||||
pytestmark = pytest.mark.unit
|
||||
|
||||
_tmpdir = tempfile.mkdtemp()
|
||||
os.environ.setdefault("FORCE_THIN_MODE", "true")
|
||||
os.environ.setdefault("SQLITE_DB_PATH", os.path.join(_tmpdir, "test_cui_gate.db"))
|
||||
os.environ.setdefault("ORACLE_DSN", "dummy")
|
||||
os.environ.setdefault("ORACLE_USER", "dummy")
|
||||
os.environ.setdefault("ORACLE_PASSWORD", "dummy")
|
||||
os.environ.setdefault("JSON_OUTPUT_DIR", _tmpdir)
|
||||
|
||||
_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
if _api_dir not in sys.path:
|
||||
sys.path.insert(0, _api_dir)
|
||||
|
||||
from app import database
|
||||
from app.services import sqlite_service
|
||||
from app.services.sync_service import evaluate_cui_gate, _record_order_error
|
||||
from app.services.order_reader import OrderBilling, OrderShipping, OrderData, OrderItem
|
||||
from app.constants import OrderStatus
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
_VALID_ANAF_FOUND = {"scpTVA": True, "denumire_anaf": "NONA ROYAL SRL", "checked_at": "2026-04-22T10:00:00"}
|
||||
_ANAF_NOT_FOUND = {"scpTVA": None, "denumire_anaf": "", "checked_at": "2026-04-22T10:00:00"}
|
||||
|
||||
# A CUI with valid format and valid checksum (MATTEO&OANA CAFFE 2022 SRL)
|
||||
_VALID_CUI = "49033051"
|
||||
# Same body but last digit modified → fails checksum
|
||||
_BAD_CHECKSUM_CUI = "49033052"
|
||||
# J-format — the incident CUI (registru number in the CUI field)
|
||||
_J_FORMAT = "J1994000194225"
|
||||
|
||||
|
||||
def _make_pj_order(company_code=_VALID_CUI, number="O-001"):
|
||||
billing = OrderBilling(
|
||||
firstname="Ion", lastname="Pop", phone="0700", email="x@x.ro",
|
||||
address="Str A 1", city="Cluj", region="Cluj", country="Romania",
|
||||
company_name="TEST SRL", company_code=company_code,
|
||||
company_reg="J12/123/2020", is_company=True,
|
||||
)
|
||||
shipping = OrderShipping(
|
||||
firstname="Ion", lastname="Pop", phone="0700", email="x@x.ro",
|
||||
address="Str A 1", city="Cluj", region="Cluj", country="Romania",
|
||||
)
|
||||
return OrderData(
|
||||
id=number, number=number, date="2026-04-22",
|
||||
billing=billing, shipping=shipping,
|
||||
items=[OrderItem(sku="SKU1", name="Prod", price=10.0, quantity=1, vat=19)],
|
||||
)
|
||||
|
||||
|
||||
def _make_pf_order(number="O-PF-1"):
|
||||
billing = OrderBilling(
|
||||
firstname="Ana", lastname="Pop", phone="0700", email="a@x.ro",
|
||||
address="Str B 2", city="Iasi", region="Iasi", country="Romania",
|
||||
is_company=False,
|
||||
)
|
||||
shipping = OrderShipping(
|
||||
firstname="Ana", lastname="Pop", phone="0700", email="a@x.ro",
|
||||
address="Str B 2", city="Iasi", region="Iasi", country="Romania",
|
||||
)
|
||||
return OrderData(
|
||||
id=number, number=number, date="2026-04-22",
|
||||
billing=billing, shipping=shipping,
|
||||
items=[OrderItem(sku="SKU1", name="Prod", price=10.0, quantity=1, vat=19)],
|
||||
)
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Tests 1-6: pure predicate — no IO
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
class TestEvaluateCuiGate:
|
||||
|
||||
def test_format_invalid_incident_case(self):
|
||||
"""Test 1: J-format in cod_fiscal field (the 22-Apr-2026 incident) → blocked."""
|
||||
# bare_cui from sanitize_cui("J1994000194225") = "J1994000194225" (not digits)
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=_J_FORMAT,
|
||||
bare_cui=_J_FORMAT,
|
||||
anaf_data=_VALID_ANAF_FOUND,
|
||||
)
|
||||
assert result is not None
|
||||
assert "format" in result
|
||||
|
||||
def test_checksum_invalid(self):
|
||||
"""Test 2: valid format, wrong check digit → blocked."""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=_BAD_CHECKSUM_CUI,
|
||||
bare_cui=_BAD_CHECKSUM_CUI,
|
||||
anaf_data=_VALID_ANAF_FOUND,
|
||||
)
|
||||
assert result is not None
|
||||
assert "cifra de control" in result
|
||||
|
||||
def test_anaf_not_found_explicit(self):
|
||||
"""Test 3: ANAF explicit notFound → blocked with registry hint."""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=_VALID_CUI,
|
||||
bare_cui=_VALID_CUI,
|
||||
anaf_data=_ANAF_NOT_FOUND,
|
||||
)
|
||||
assert result is not None
|
||||
assert "nu exista in registrul ANAF" in result
|
||||
assert "registrul comertului" in result
|
||||
|
||||
def test_anaf_found_vat_payer_passes(self):
|
||||
"""Test 4: ANAF found + platitor TVA → pass."""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=_VALID_CUI,
|
||||
bare_cui=_VALID_CUI,
|
||||
anaf_data=_VALID_ANAF_FOUND,
|
||||
)
|
||||
assert result is None
|
||||
|
||||
def test_anaf_down_fallback_passes(self):
|
||||
"""Test 5 [CRITICAL REGRESSION]: ANAF down (anaf_data=None) + valid CUI → pass.
|
||||
|
||||
If this test fails, the gate is breaking the ANAF-down fallback and all
|
||||
RO company orders would error when ANAF is unavailable.
|
||||
"""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=_VALID_CUI,
|
||||
bare_cui=_VALID_CUI,
|
||||
anaf_data=None, # ANAF down / transient error
|
||||
)
|
||||
assert result is None, (
|
||||
"ANAF down must NOT block orders — gate must only block on explicit notFound"
|
||||
)
|
||||
|
||||
def test_pf_always_passes(self):
|
||||
"""Test 6: PF order (is_ro_company=False) → always pass, regardless of CUI."""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=False,
|
||||
company_code_raw=_J_FORMAT,
|
||||
bare_cui=_J_FORMAT,
|
||||
anaf_data=_ANAF_NOT_FOUND,
|
||||
)
|
||||
assert result is None
|
||||
|
||||
def test_no_company_code_passes(self):
|
||||
"""PJ without company_code → pass (nothing to validate)."""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=None,
|
||||
bare_cui="",
|
||||
anaf_data=None,
|
||||
)
|
||||
assert result is None
|
||||
|
||||
def test_anaf_found_non_vat_passes(self):
|
||||
"""ANAF found non-platitor TVA (scpTVA=False) → pass."""
|
||||
result = evaluate_cui_gate(
|
||||
is_ro_company=True,
|
||||
company_code_raw=_VALID_CUI,
|
||||
bare_cui=_VALID_CUI,
|
||||
anaf_data={"scpTVA": False, "denumire_anaf": "FIRMA SRL", "checked_at": "2026-04-22T10:00:00"},
|
||||
)
|
||||
assert result is None
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Test 7: integration — COALESCE preserves id_partener on gate block
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
@pytest.fixture(autouse=True)
|
||||
def _init_db():
|
||||
database.init_sqlite()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_record_order_error_preserves_id_partener():
|
||||
"""Test 7: _record_order_error called with id_partener=None preserves existing id_partener
|
||||
via SQLite COALESCE in upsert_order.
|
||||
|
||||
Scenario: order was previously IMPORTED with id_partener=9001.
|
||||
At resync the gate blocks it (bad CUI). _record_order_error passes id_partener=None.
|
||||
After upsert, the row should have status=ERROR and id_partener=9001 (preserved).
|
||||
"""
|
||||
order = _make_pj_order(company_code=_J_FORMAT, number="O-COALESCE-1")
|
||||
run_id = "test-run-coalesce"
|
||||
|
||||
# Seed an existing IMPORTED row with id_partener=9001
|
||||
db = await sqlite_service.get_sqlite()
|
||||
try:
|
||||
await db.execute(
|
||||
"""INSERT OR REPLACE INTO orders
|
||||
(order_number, order_date, customer_name, status, id_partener, items_count)
|
||||
VALUES (?, ?, ?, ?, ?, ?)""",
|
||||
(order.number, order.date, "TEST SRL", OrderStatus.IMPORTED.value, 9001, 1),
|
||||
)
|
||||
await db.commit()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
# Gate fires → calls _record_order_error with id_partener=None (gate doesn't know it)
|
||||
await _record_order_error(
|
||||
run_id=run_id,
|
||||
order=order,
|
||||
customer="TEST SRL",
|
||||
shipping_name="Ion Pop",
|
||||
billing_name="TEST SRL",
|
||||
payment_method="card",
|
||||
delivery_method="curier",
|
||||
discount_split_json=None,
|
||||
order_items_data=[{
|
||||
"sku": "SKU1", "product_name": "Prod", "quantity": 1,
|
||||
"price": 10.0, "baseprice": None, "vat": 19,
|
||||
"mapping_status": "direct", "codmat": None, "id_articol": None, "cantitate_roa": None,
|
||||
}],
|
||||
reason=f"CUI invalid (format): {_J_FORMAT!r}",
|
||||
id_partener=None, # gate doesn't have it
|
||||
)
|
||||
|
||||
# Verify: status=ERROR, id_partener=9001 (COALESCE preserved), error_message populated
|
||||
db = await sqlite_service.get_sqlite()
|
||||
try:
|
||||
row = await db.execute(
|
||||
"SELECT status, id_partener, error_message FROM orders WHERE order_number = ?",
|
||||
(order.number,),
|
||||
)
|
||||
row = await row.fetchone()
|
||||
finally:
|
||||
await db.close()
|
||||
|
||||
assert row is not None, "Order row missing after _record_order_error"
|
||||
assert row[0] == OrderStatus.ERROR.value, f"Expected ERROR, got {row[0]}"
|
||||
assert row[1] == 9001, f"Expected id_partener=9001 (preserved by COALESCE), got {row[1]}"
|
||||
assert row[2] and "format" in row[2], f"Expected error_message with 'format', got {row[2]!r}"
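The checksum fixtures above (`_VALID_CUI` passes, `_BAD_CHECKSUM_CUI` fails on its last digit) assume the standard Romanian CUI check-digit rule: multiply the body digits, left-padded to 9, by the key 753217532, sum, multiply by 10, take mod 11, and map a result of 10 to 0. A minimal sketch of that rule; `cui_checksum_ok` is an illustrative name, not the app's validator.

```python
def cui_checksum_ok(cui: str) -> bool:
    """Check the ANAF control digit of a bare (digits-only) CUI."""
    if not cui.isdigit() or not 2 <= len(cui) <= 10:
        return False  # e.g. a J-format trade-register number fails here
    body, control = cui[:-1].zfill(9), int(cui[-1])
    total = sum(int(d) * k for d, k in zip(body, (7, 5, 3, 2, 1, 7, 5, 3, 2)))
    return control == (total * 10 % 11) % 10  # mod 11 result 10 maps to 0

print(cui_checksum_ok("49033051"))       # True  (_VALID_CUI)
print(cui_checksum_ok("49033052"))       # False (_BAD_CHECKSUM_CUI)
print(cui_checksum_ok("J1994000194225")) # False (_J_FORMAT, not digits)
```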
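The COALESCE-preservation pattern that Test 7 verifies can be demonstrated standalone against an in-memory SQLite database. This is a sketch of the pattern only: the app's real `upsert_order` SQL and schema are not shown here, and the table below mirrors just the columns the test asserts.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_number TEXT PRIMARY KEY, status TEXT, id_partener INTEGER)")
conn.execute("INSERT INTO orders VALUES ('O-1', 'IMPORTED', 9001)")

# Resync blocks the order; the caller only has id_partener=None.
# COALESCE(excluded.id_partener, id_partener) keeps the stored 9001 when the
# incoming value is NULL (an unqualified column in DO UPDATE is the old row).
conn.execute(
    """INSERT INTO orders (order_number, status, id_partener)
       VALUES (?, ?, ?)
       ON CONFLICT(order_number) DO UPDATE SET
           status = excluded.status,
           id_partener = COALESCE(excluded.id_partener, id_partener)""",
    ("O-1", "ERROR", None),
)
print(conn.execute("SELECT status, id_partener FROM orders").fetchone())
# ('ERROR', 9001)
```

The same upsert with a non-NULL incoming id_partener would overwrite the stored value, which is why the test passes None explicitly.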