Compare commits: f07946b489...main (4 commits)

| SHA1 |
|---|
| c6d69ac0e0 |
| 9f2fd24d93 |
| 7a1fa16fef |
| 61193b793f |
@@ -106,6 +106,13 @@ python3 scripts/sync_vending_to_mariusm.py --apply --yes
 Synchronizes via SSH from VENDING (Windows prod) into MARIUSM_AUTO (ROA_CENTRAL dev):
 nom_articole (new by codmat, codmat updated) + articole_terti (new, modified, soft-deleted).
 
+## Design System
+
+Always read DESIGN.md before making any visual or UI decisions.
+All font choices, colors, spacing, and aesthetic direction are defined there.
+Do not deviate without explicit user approval.
+In QA mode, flag any code that doesn't match DESIGN.md.
+
 ## Deploy Windows
 
 See [README.md](README.md#deploy-windows)
DESIGN.md (new file, 324 lines)
@@ -0,0 +1,324 @@
# Design System — GoMag Vending

## Product Context

- **What this is:** Internal admin dashboard for importing web orders from GoMag e-commerce into ROA Oracle ERP
- **Who it's for:** Ops/admin team who monitor order sync daily, fix SKU mappings, check import errors
- **Space/industry:** Internal tools, B2B operations, ERP integration
- **Project type:** Data-heavy admin dashboard (tables, status indicators, sync controls)

## Aesthetic Direction

- **Direction:** Industrial/Utilitarian — function-first, data-dense, quietly confident
- **Decoration level:** Minimal — typography and color do the work. No illustrations, gradients, or decorative elements. The data IS the decoration.
- **Mood:** Command console. This tool says "built by someone who respects the operator." Serious, efficient, warm.
- **Anti-patterns:** No purple gradients, no 3-column icon grids, no centered-everything layouts, no decorative blobs, no stock-photo heroes

## Typography

### Font Stack

- **Display/Headings:** Space Grotesk — geometric, slightly techy, distinctive `a` and `g`. Says "engineered."
- **Body/UI:** DM Sans — clean, excellent readability, good tabular-nums for inline numbers
- **Data/Tables:** JetBrains Mono — order IDs, CODMATs, status codes align perfectly. Tables become scannable.
- **Code:** JetBrains Mono

### Loading

```html
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=DM+Sans:ital,opsz,wght@0,9..40,300;0,9..40,400;0,9..40,500;0,9..40,600;0,9..40,700;1,9..40,400&family=JetBrains+Mono:wght@400;500;600&family=Space+Grotesk:wght@400;500;600;700&display=swap" rel="stylesheet">
```

### CSS Variables

```css
--font-display: 'Space Grotesk', sans-serif;
--font-body: 'DM Sans', sans-serif;
--font-data: 'JetBrains Mono', monospace;
```

### Type Scale

| Level | Size | Weight | Font | Usage |
|-------|------|--------|------|-------|
| Page title | 18px | 600 | Display | "Panou de Comanda" |
| Section title | 16px | 600 | Display | Card headers |
| Label/uppercase | 12px | 500 | Display | Column headers, section labels (letter-spacing: 0.04em) |
| Body | 14px | 400 | Body | Paragraphs, descriptions |
| UI/Button | 13px | 500 | Body | Buttons, nav links, form labels |
| Data cell | 13px | 400 | Data | Codes, IDs, numbers, sums, dates (NOT text names — those use Body font) |
| Data small | 12px | 400 | Data | Timestamps, secondary data |
| Code/mono | 11px | 400 | Data | Inline code, debug info |
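
The scale above can be captured as utility classes. A minimal sketch, assuming the `--font-*` variables from the previous section; the class names are illustrative, not taken from the codebase:

```css
/* Illustrative utility classes for the type scale (class names are assumptions) */
.page-title    { font: 600 18px var(--font-display); }
.section-title { font: 600 16px var(--font-display); }
.label-upper   { font: 500 12px var(--font-display); text-transform: uppercase; letter-spacing: 0.04em; }
.body-text     { font: 400 14px var(--font-body); }
.ui-label      { font: 500 13px var(--font-body); }
.data-cell     { font: 400 13px var(--font-data); font-variant-numeric: tabular-nums; }
.data-small    { font: 400 12px var(--font-data); }
.code-inline   { font: 400 11px var(--font-data); }
```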
## Color

### Approach: Two-accent system (amber state + blue action)

Every admin tool is blue. This one uses amber — reads as "operational" and "attention-worthy."

- **Amber (--accent):** Navigation active state, filter pill active, accent backgrounds. "Where you are."
- **Blue (--info):** Primary buttons, CTAs, actionable links. "What you can do."
- Primary buttons (`btn-primary`) stay blue for clear action hierarchy.
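
A minimal sketch of how the two accents split state from action; the selectors are illustrative:

```css
/* Amber = state ("where you are"); selectors are assumptions */
.nav-link.active {
  color: var(--accent);
  border-bottom: 2px solid var(--accent);
}
.filter-pill.active {
  background: var(--accent-light);
  color: var(--accent-text);
}

/* Blue = action ("what you can do") */
.btn-primary {
  background: var(--info);
  color: #FFFFFF;
}
```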
### Light Mode (default)

```css
:root {
  /* Surfaces */
  --bg: #F8F7F5;             /* warm off-white, not clinical gray */
  --surface: #FFFFFF;
  --surface-raised: #F3F2EF; /* hover states, table headers */
  --card-shadow: 0 1px 3px rgba(28,25,23,0.1), 0 1px 2px rgba(28,25,23,0.06);

  /* Text */
  --text-primary: #1C1917;   /* warm black */
  --text-secondary: #57534E; /* warm gray */
  --text-muted: #78716C;     /* labels, timestamps */

  /* Borders */
  --border: #E7E5E4;
  --border-subtle: #F0EFED;

  /* Accent — amber */
  --accent: #D97706;
  --accent-hover: #B45309;
  --accent-light: #FEF3C7;   /* amber backgrounds */
  --accent-text: #92400E;    /* text on amber bg */

  /* Semantic */
  --success: #16A34A;
  --success-light: #DCFCE7;
  --success-text: #166534;

  --warning: #CA8A04;
  --warning-light: #FEF9C3;
  --warning-text: #854D0E;

  --error: #DC2626;
  --error-light: #FEE2E2;
  --error-text: #991B1B;

  --info: #2563EB;
  --info-light: #DBEAFE;
  --info-text: #1E40AF;

  --cancelled: #78716C;
  --cancelled-light: #F5F5F4;
}
```
### Dark Mode

Strategy: invert surfaces, reduce accent saturation ~15%, keep semantic colors recognizable.

```css
[data-theme="dark"] {
  --bg: #121212;
  --surface: #1E1E1E;
  --surface-raised: #2A2A2A;
  --card-shadow: 0 1px 3px rgba(0,0,0,0.4), 0 1px 2px rgba(0,0,0,0.3);

  --text-primary: #E8E4DD;   /* warm bone white */
  --text-secondary: #A8A29E;
  --text-muted: #78716C;

  --border: #333333;
  --border-subtle: #262626;

  --accent: #F59E0B;
  --accent-hover: #D97706;
  --accent-light: rgba(245,158,11,0.12);
  --accent-text: #FCD34D;

  --success: #16A34A;
  --success-light: rgba(22,163,74,0.15);
  --success-text: #4ADE80;

  --warning: #CA8A04;
  --warning-light: rgba(202,138,4,0.15);
  --warning-text: #FACC15;

  --error: #DC2626;
  --error-light: rgba(220,38,38,0.15);
  --error-text: #FCA5A5;

  --info: #2563EB;
  --info-light: rgba(37,99,235,0.15);
  --info-text: #93C5FD;

  --cancelled: #78716C;
  --cancelled-light: rgba(120,113,108,0.15);
}
```
### Status Color Mapping

| Status | Dot Color | Badge BG | Glow |
|--------|-----------|----------|------|
| IMPORTED | `--success` | `--success-light` | none (quiet when healthy) |
| ERROR | `--error` | `--error-light` | `0 0 8px 2px rgba(220,38,38,0.35)` |
| SKIPPED | `--warning` | `--warning-light` | `0 0 6px 2px rgba(202,138,4,0.3)` |
| ALREADY_IMPORTED | `--info` | `--info-light` | none |
| CANCELLED | `--cancelled` | `--cancelled-light` | none |
| DELETED_IN_ROA | `--cancelled` | `--cancelled-light` | none |

**Design rule:** Problems glow, success is calm. The operator's eye is pulled to rows that need action.
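
The mapping above can be sketched as CSS; the `.status-dot` class and its modifiers are illustrative names, not from the codebase:

```css
/* Status dot sketch: only problem states get a glow (class names are assumptions) */
.status-dot {
  display: inline-block;
  width: 8px;
  height: 8px;
  border-radius: 9999px;
}
.status-dot.imported { background: var(--success); } /* calm, no glow */
.status-dot.error {
  background: var(--error);
  box-shadow: 0 0 8px 2px rgba(220, 38, 38, 0.35);   /* glows */
}
.status-dot.skipped {
  background: var(--warning);
  box-shadow: 0 0 6px 2px rgba(202, 138, 4, 0.3);
}
```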
## Spacing

- **Base unit:** 4px
- **Density:** Comfortable — not cramped, not wasteful
- **Scale:**

| Token | Value | Usage |
|-------|-------|-------|
| 2xs | 2px | Tight internal gaps |
| xs | 4px | Icon-text gap, badge padding |
| sm | 8px | Compact card padding, table cell padding |
| md | 16px | Standard card padding, section gaps |
| lg | 24px | Section spacing |
| xl | 32px | Major section gaps |
| 2xl | 48px | Page-level spacing |
| 3xl | 64px | Hero spacing (rarely used) |
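
As custom properties, the scale might look like this; the `--space-*` variable names are assumptions:

```css
/* Spacing tokens as CSS variables (names are illustrative) */
:root {
  --space-2xs: 2px;
  --space-xs: 4px;
  --space-sm: 8px;
  --space-md: 16px;
  --space-lg: 24px;
  --space-xl: 32px;
  --space-2xl: 48px;
  --space-3xl: 64px;
}
```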
## Layout

### Approach: Grid-disciplined, full-width

Tables with 8+ columns and hundreds of rows need every pixel of width.

- **Nav:** Horizontal top bar, fixed, 48px height. Active tab has amber underline (2px).
- **Content max-width:** None on desktop (full width for tables), 1200px for non-table content
- **Grid:** Single-column layout, cards stack vertically
- **Breakpoints:**

| Name | Width | Columns | Behavior |
|------|-------|---------|----------|
| Desktop | >= 1024px | Full width | All features visible |
| Tablet | 768-1023px | Full width | Nav labels abbreviated, tables scroll horizontally |
| Mobile | < 768px | Single column | Bottom nav, cards stack, condensed views |

### Border Radius

| Token | Value | Usage |
|-------|-------|-------|
| sm | 4px | Buttons, inputs, badges, status dots |
| md | 8px | Cards, dropdowns, modals |
| lg | 12px | Large containers, mockup frames |
| full | 9999px | Pills, avatar circles |
## Motion

- **Approach:** Minimal-functional — only transitions that aid comprehension
- **Easing:** enter (ease-out), exit (ease-in), move (ease-in-out)
- **Duration:**

| Token | Value | Usage |
|-------|-------|-------|
| micro | 50-100ms | Button hover, focus ring |
| short | 150-250ms | Dropdown open, tab switch, color transitions |
| medium | 250-400ms | Modal open/close, page transitions |
| long | 400-700ms | Only for sync pulse animation |

- **Sync pulse:** The live sync dot uses a 2s infinite pulse (opacity 1 → 0.4 → 1)
- **No:** entrance animations, scroll effects, decorative motion
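
The sync pulse described above can be sketched as a keyframe animation; the `.sync-dot.live` selector is an illustrative name:

```css
/* 2s infinite opacity pulse for the live sync indicator (selector is an assumption) */
@keyframes sync-pulse {
  0%, 100% { opacity: 1; }
  50%      { opacity: 0.4; }
}
.sync-dot.live {
  animation: sync-pulse 2s ease-in-out infinite;
}
/* Disable the pulse for users who opt out of motion */
@media (prefers-reduced-motion: reduce) {
  .sync-dot.live { animation: none; }
}
```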
## Mobile Design

### Navigation

- **Bottom tab bar** replaces top horizontal nav on screens < 768px
- 5 tabs: Dashboard, Mapari, Lipsa, Jurnale, Setari
- Each tab: icon (Bootstrap Icons) + short label below
- Active tab: amber accent color; inactive: `--text-muted`
- Height: 56px, safe-area padding for notched devices
- Fixed position at the bottom, with `padding-bottom: env(safe-area-inset-bottom)`

```css
@media (max-width: 767px) {
  .top-navbar { display: none; }
  .bottom-nav {
    position: fixed;
    bottom: 0;
    left: 0;
    right: 0;
    height: 56px;
    padding-bottom: env(safe-area-inset-bottom);
    background: var(--surface);
    border-top: 1px solid var(--border);
    display: flex;
    justify-content: space-around;
    align-items: center;
    z-index: 1000;
  }
  .main-content {
    padding-bottom: 72px; /* clear bottom nav */
    padding-top: 8px;     /* no top navbar */
  }
}
```
### Dashboard — Mobile

- **Sync card:** Full width, stacked vertically
  - Status + controls row wraps to 2 lines
  - Sync button full-width at bottom of card
  - Last sync info wraps naturally
- **Orders table:** Condensed card view instead of horizontal table
  - Each order = a compact card showing: status dot + ID + client name + total
  - Tap to expand: shows date, factura, full details
  - Swipe left on card: quick action (view error details)
- **Filter bar:** Horizontal scrollable chips instead of dropdowns
  - Period selector: pill chips (1zi, 7zi, 30zi, Toate)
  - Status filter: colored chips matching status colors
- **Touch targets:** Minimum 44x44px for all interactive elements

### Orders Mobile Card Layout

```
┌────────────────────────────────┐
│ ● CMD-47832        2,450.00 RON│
│   SC Automate Express SRL      │
│   27.03.2026 · FCT-2026-1847   │
└────────────────────────────────┘
```

- Status dot (8px, left-aligned, with glow for errors)
- Order ID in JetBrains Mono, amount right-aligned
- Client name in DM Sans
- Date + factura in muted data font
### SKU Mappings — Mobile

- Each mapping = expandable card
- Collapsed: SKU + product name + type badge (KIT/SIMPLU)
- Expanded: full CODMAT list with quantities
- Search: full-width sticky search bar at top
- Filter: horizontal scrollable type chips

### Logs — Mobile

- Timeline view instead of table
- Each log entry = timestamp + status icon + summary
- Tap to expand full log details
- Infinite scroll with date separators

### Settings — Mobile

- Standard stacked form layout
- Full-width inputs
- Toggle switches for boolean settings (min 44px touch target)
- Save button sticky at bottom

### Gestures

- **Pull to refresh** on Dashboard: triggers a sync status check
- **Swipe left** on an order card: reveal quick actions
- **Long press** on a SKU mapping: copy the CODMAT to the clipboard
- **No swipe navigation** between pages (use the bottom tabs)

### Mobile Typography Adjustments

| Level | Desktop | Mobile |
|-------|---------|--------|
| Page title | 18px | 16px |
| Body | 14px | 14px (no change) |
| Data cell | 13px | 13px (no change) |
| Data small | 12px | 12px (no change) |
| Table header | 12px | 11px |
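
The two overrides in the table can be sketched as a media query; the selectors are illustrative, not from the codebase:

```css
/* Mobile type adjustments (selectors are assumptions) */
@media (max-width: 767px) {
  .page-title { font-size: 16px; } /* 18px on desktop */
  th          { font-size: 11px; } /* table header, 12px on desktop */
  /* body, data cell, and data small sizes are unchanged */
}
```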
### Responsive Images & Icons

- Use Bootstrap Icons throughout (already loaded via CDN)
- Icon size: 16px desktop, 20px mobile (larger touch targets)
- No images in the admin interface (data-only)

## Decisions Log

| Date | Decision | Rationale |
|------|----------|-----------|
| 2026-03-27 | Initial design system created | Created by /design-consultation. Industrial/utilitarian aesthetic with amber accent, Space Grotesk + DM Sans + JetBrains Mono. |
| 2026-03-27 | Amber accent over blue | Every admin tool is blue. Amber reads as "operational" and gives the tool its own identity. Confirmed by Claude subagent ("Control Room Noir" also converged on amber). |
| 2026-03-27 | JetBrains Mono for data tables | Both primary analysis and subagent independently recommended monospace for data tables. Scannability win outweighs the ~15% wider columns. |
| 2026-03-27 | Warm tones throughout | Off-white (#F8F7F5) instead of clinical gray. Warm black text instead of blue-gray. Makes the tool feel handcrafted. |
| 2026-03-27 | Glowing status dots for errors | Problems glow (box-shadow), success is calm. Operator's eye is pulled to rows that need action. Inspired by subagent's "LED indicator" concept. |
| 2026-03-27 | Full mobile design | Bottom nav, card-based order views, touch-optimized gestures. Supports quick-glance usage from phone. |
| 2026-03-27 | Two-accent system | Blue = action (buttons, CTAs), amber = state (nav active, filter active). Clear hierarchy. |
| 2026-03-27 | JetBrains Mono selective | Mono font only for codes, IDs, numbers, sums, dates. Text names use DM Sans for readability. |
| 2026-03-27 | Dark mode in scope | CSS variables + toggle + localStorage. All DESIGN.md dark tokens implemented in Commit 0.5. |
TODOS.md (new file, 15 lines)
@@ -0,0 +1,15 @@
# TODOS

## P2: Refactor sync_service.py into separate modules

**What:** Split sync_service.py (870 lines) into: download_service, parse_service, sync_orchestrator.

**Why:** Makes debugging and testing easier. A bug in price sync should not affect the import flow.

**Effort:** M (human: ~1 week / CC: ~1-2h)

**Context:** After implementing the Command Center plan (retry_service already extracted). sync_service does download + parse + validate + import + price sync + invoice check — too many responsibilities.

**Depends on:** Completion of the Command Center plan.


## P2: Email/webhook alert on failed sync

**What:** When a sync finds >5 errors or fails completely, send an email/webhook.

**Why:** Post-launch, when the app runs automatically, nobody is watching it constantly.

**Effort:** M (human: ~1 week / CC: ~1h)

**Context:** Depends on the email/webhook infrastructure available at the client. Implementation: plain SMTP or a webhook URL configurable in Settings.

**Depends on:** Production launch + email infrastructure at the client.
@@ -1,74 +1,3 @@
--- ====================================================================
--- PACK_IMPORT_COMENZI
--- Package for importing orders from web platforms (GoMag, etc.)
--- into the ROA Oracle system.
---
--- Dependencies:
---   Packages: PACK_COMENZI (adauga_comanda, adauga_articol_comanda)
---             pljson (pljson_list, pljson) - installed in CONTAFIN_ORACLE,
---             accessed through a PUBLIC SYNONYM
---   Tables:   ARTICOLE_TERTI (SKU -> CODMAT mappings)
---             NOM_ARTICOLE (ROA article nomenclature)
---             COMENZI (duplicate check on comanda_externa)
---             CRM_POLITICI_PRETURI (PRETURI_CU_TVA flag per policy)
---             CRM_POLITICI_PRET_ART (kit component prices)
---
--- Public procedures:
---
---   importa_comanda(...)
---     Imports a complete order: creates the order + adds its articles.
---     p_json_articole accepts:
---       - JSON array:  [{"sku":"X","quantity":"1","price":"10","vat":"19"}, ...]
---       - JSON object: {"sku":"X","quantity":"1","price":"10","vat":"19"}
---     Optional per article: "id_pol":"5" — a specific price policy
---       (for shipping/discount lines with a policy separate from the order's)
---     The sku, quantity, price, vat values are extracted as STRINGs and converted.
---     If the order already exists (comanda_externa), it is not duplicated.
---     On error it raises RAISE_APPLICATION_ERROR(-20001, message).
---     Returns v_id_comanda (OUT) = the ID of the created order.
---
---   Kit pricing parameters:
---     p_kit_mode — 'distributed' | 'separate_line' | NULL
---       distributed: the discount relative to the sum of the components is
---         distributed proportionally into each component's price
---       separate_line: components are inserted at full price +
---         separate per-kit discount lines below the components, grouped by VAT rate
---     p_id_pol_productie — price policy for production articles
---       (account in 341/345); NULL = not used
---     p_kit_discount_codmat — CODMAT of the discount article (separate_line mode)
---     p_kit_discount_id_pol — id_pol for the discount lines (separate_line mode)
---
---   Article lookup logic per SKU:
---     1. Special mappings from ARTICOLE_TERTI (repackaging, composite sets)
---        - if the SKU has >1 row and p_kit_mode IS NOT NULL: kit pricing logic
---        - otherwise (1 row or kit_mode NULL): web price / cantitate_roa directly
---     2. Fallback: direct lookup in NOM_ARTICOLE by CODMAT = SKU
---
---   get_last_error / clear_error
---     Error management for the VFP orchestrator.
---
--- Usage example:
---   DECLARE
---     v_id NUMBER;
---   BEGIN
---     PACK_IMPORT_COMENZI.importa_comanda(
---       p_nr_comanda_ext => '479317993',
---       p_data_comanda   => SYSDATE,
---       p_id_partener    => 1424,
---       p_json_articole  => '[{"sku":"5941623003366","quantity":"1.00","price":"40.99","vat":"21"}]',
---       p_id_pol         => 39,
---       v_id_comanda     => v_id);
---     DBMS_OUTPUT.PUT_LINE('ID comanda: ' || v_id);
---   END;
--- 20.03.2026 - dual policy sale/production, kit pricing distributed/separate_line, SKU→CODMAT via ARTICOLE_TERTI
--- 20.03.2026 - kit discount deferred cross-kit (separate_line, merge-on-collision)
--- 20.03.2026 - merge_or_insert_articol: merge quantities when kit+individual have the same article/price
--- 20.03.2026 - kit pricing extended for single-component repackagings (cantitate_roa > 1)
--- 21.03.2026 - detailed kit discount diagnostics (id_pol, id_art, codmat in the error)
--- 21.03.2026 - fix discount amount: v_disc_amt is per-kit, not divided by v_cantitate_web
--- 25.03.2026 - skip negative kit discount (markup), ROUND prices to nzecimale_pretv
--- 25.03.2026 - kit discount inserted per-kit below components (not deferred cross-kit)
--- ====================================================================
 CREATE OR REPLACE PACKAGE PACK_IMPORT_COMENZI AS
 
 -- Package variable for the last error (for the VFP orchestrator)
@@ -98,6 +27,31 @@ END PACK_IMPORT_COMENZI;
 /
 CREATE OR REPLACE PACKAGE BODY PACK_IMPORT_COMENZI AS
 
+-- ====================================================================
+-- PACK_IMPORT_COMENZI
+-- Package for importing orders from web platforms (GoMag, etc.)
+-- into the ROA Oracle system.
+--
+-- Dependencies:
+--   Packages: PACK_COMENZI (adauga_comanda, adauga_articol_comanda)
+--             pljson (pljson_list, pljson) - installed in CONTAFIN_ORACLE,
+--             accessed through a PUBLIC SYNONYM
+--   Tables:   ARTICOLE_TERTI (SKU -> CODMAT mappings)
+--             NOM_ARTICOLE (ROA article nomenclature)
+--             COMENZI (duplicate check on comanda_externa)
+--             CRM_POLITICI_PRETURI (PRETURI_CU_TVA flag per policy)
+--             CRM_POLITICI_PRET_ART (kit component prices)
+--
+-- 20.03.2026 - dual policy sale/production, kit pricing distributed/separate_line, SKU→CODMAT via ARTICOLE_TERTI
+-- 20.03.2026 - kit discount deferred cross-kit (separate_line, merge-on-collision)
+-- 20.03.2026 - merge_or_insert_articol: merge quantities when kit+individual have the same article/price
+-- 20.03.2026 - kit pricing extended for single-component repackagings (cantitate_roa > 1)
+-- 21.03.2026 - detailed kit discount diagnostics (id_pol, id_art, codmat in the error)
+-- 21.03.2026 - fix discount amount: v_disc_amt is per-kit, not divided by v_cantitate_web
+-- 25.03.2026 - skip negative kit discount (markup), ROUND prices to nzecimale_pretv
+-- 25.03.2026 - kit discount inserted per-kit below components (not deferred cross-kit)
+-- ====================================================================
 
 -- Configuration constants
 c_id_util  CONSTANT NUMBER := -3;  -- System
 c_interna  CONSTANT NUMBER := 2;   -- Orders from the client (web)
@@ -47,54 +47,59 @@ def test_order_id(oracle_connection):
     try:
         with conn.cursor() as cur:
             cur.execute(
-                "SELECT MIN(id_partener) FROM parteneri WHERE id_partener > 0"
+                "SELECT MIN(id_part) FROM nom_parteneri WHERE id_part > 0"
             )
             row = cur.fetchone()
             if not row or row[0] is None:
                 pytest.skip("No partners found in Oracle — cannot create test order")
             partner_id = int(row[0])
     except Exception as exc:
-        pytest.skip(f"Cannot query parteneri table: {exc}")
+        pytest.skip(f"Cannot query nom_parteneri table: {exc}")
 
-    # Build minimal JSON articles — use a SKU known from NOM_ARTICOLE if possible
+    # Find an article that has a price in some policy (required for import)
     with conn.cursor() as cur:
-        cur.execute(
-            "SELECT codmat FROM nom_articole WHERE rownum = 1"
-        )
+        cur.execute("""
+            SELECT na.codmat, cp.id_pol, cp.pret
+            FROM nom_articole na
+            JOIN crm_politici_pret_art cp ON cp.id_articol = na.id_articol
+            WHERE cp.pret > 0 AND na.codmat IS NOT NULL AND rownum = 1
+        """)
         row = cur.fetchone()
-        test_sku = row[0] if row else "CAFE100"
+        if not row:
+            pytest.skip("No articles with prices found in Oracle — cannot create test order")
+        test_sku, id_pol, test_price = row[0], int(row[1]), float(row[2])
 
     nr_comanda_ext = f"PYTEST-{int(time.time())}"
+    # Values must be strings — Oracle's JSON_OBJECT_T.get_string() returns NULL for numbers
     articles = json.dumps([{
         "sku": test_sku,
-        "cantitate": 1,
-        "pret": 50.0,
-        "denumire": "Test article (pytest)",
-        "tva": 19,
-        "discount": 0,
+        "quantity": "1",
+        "price": str(test_price),
+        "vat": "19",
     }])
 
     try:
+        from datetime import datetime
         with conn.cursor() as cur:
            clob_var = cur.var(oracledb.DB_TYPE_CLOB)
            clob_var.setvalue(0, articles)
            id_comanda_var = cur.var(oracledb.DB_TYPE_NUMBER)
 
            cur.callproc("PACK_IMPORT_COMENZI.importa_comanda", [
                nr_comanda_ext,    # p_nr_comanda_ext
-               None,              # p_data_comanda (NULL = SYSDATE in pkg)
+               datetime.now(),    # p_data_comanda
                partner_id,        # p_id_partener
                clob_var,          # p_json_articole
                None,              # p_id_adresa_livrare
                None,              # p_id_adresa_facturare
-               None,              # p_id_pol
+               id_pol,            # p_id_pol
                None,              # p_id_sectie
                None,              # p_id_gestiune
                None,              # p_kit_mode
                None,              # p_id_pol_productie
                None,              # p_kit_discount_codmat
                None,              # p_kit_discount_id_pol
                id_comanda_var,    # v_id_comanda (OUT)
            ])
 
            raw = id_comanda_var.getvalue()
@@ -122,11 +127,11 @@ def test_order_id(oracle_connection):
     try:
         with conn.cursor() as cur:
             cur.execute(
-                "DELETE FROM comenzi_articole WHERE id_comanda = :id",
+                "DELETE FROM comenzi_elemente WHERE id_comanda = :id",
                 {"id": order_id}
             )
             cur.execute(
-                "DELETE FROM com_antet WHERE id_comanda = :id",
+                "DELETE FROM comenzi WHERE id_comanda = :id",
                 {"id": order_id}
             )
             conn.commit()
@@ -193,7 +198,7 @@ def test_cleanup_test_order(oracle_connection, test_order_id):
 
     with oracle_connection.cursor() as cur:
         cur.execute(
-            "SELECT COUNT(*) FROM com_antet WHERE id_comanda = :id",
+            "SELECT COUNT(*) FROM comenzi WHERE id_comanda = :id",
             {"id": test_order_id}
         )
         row = cur.fetchone()
@@ -119,7 +119,8 @@ def test_mobile_table_responsive(pw_browser, base_url: str, page_path: str):
 
     tables = page.locator("table").all()
     if not tables:
-        pytest.skip(f"No tables on {page_path} (empty state)")
+        # No tables means nothing to check — pass (no non-responsive tables exist)
+        return
 
     # Check each table has an ancestor with overflow-x scroll or .table-responsive class
     for table in tables:
|
|||||||
494
api/tests/test_business_rules.py
Normal file
494
api/tests/test_business_rules.py
Normal file
@@ -0,0 +1,494 @@
"""
Business Rule Regression Tests
==============================

Regression tests for historical bug fixes in kit pricing, discount calculation,
duplicate CODMAT resolution, price sync, and VAT normalization.

Run:
    cd api && python -m pytest tests/test_business_rules.py -v
"""

import json
import os
import sys
import tempfile

import pytest

pytestmark = pytest.mark.unit

# --- Set env vars BEFORE any app import ---
_tmpdir = tempfile.mkdtemp()
_sqlite_path = os.path.join(_tmpdir, "test_biz.db")

os.environ["FORCE_THIN_MODE"] = "true"
os.environ["SQLITE_DB_PATH"] = _sqlite_path
os.environ["ORACLE_DSN"] = "dummy"
os.environ["ORACLE_USER"] = "dummy"
os.environ["ORACLE_PASSWORD"] = "dummy"
os.environ["JSON_OUTPUT_DIR"] = _tmpdir

_api_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if _api_dir not in sys.path:
    sys.path.insert(0, _api_dir)

from unittest.mock import MagicMock, patch

from app.services.import_service import build_articles_json, compute_discount_split
from app.services.order_reader import OrderData, OrderItem


# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

def make_item(sku="SKU1", price=100.0, quantity=1, vat=19):
    return OrderItem(sku=sku, name=f"Product {sku}", price=price, quantity=quantity, vat=vat)


def make_order(items, discount_total=0.0, delivery_cost=0.0, discount_vat=None):
    order = OrderData(
        id="1", number="TEST-001", date="2026-01-01",
        items=items, discount_total=discount_total,
        delivery_cost=delivery_cost,
    )
    if discount_vat is not None:
        order.discount_vat = discount_vat
    return order


def is_kit(comps):
    """Kit detection pattern used in validation_service and price_sync_service."""
    return len(comps) > 1 or (
        len(comps) == 1 and (comps[0].get("cantitate_roa") or 1) > 1
    )


# ===========================================================================
# Group 1: compute_discount_split()
# ===========================================================================

class TestDiscountSplit:
    """Regression: discount split by VAT rate (import_service.py:63)."""

    def test_single_vat_rate(self):
        order = make_order([make_item(vat=19), make_item("SKU2", vat=19)], discount_total=10.0)
        result = compute_discount_split(order, {"split_discount_vat": "1"})
        assert result == {"19": 10.0}

    def test_multiple_vat_proportional(self):
        items = [make_item("A", price=100, quantity=1, vat=19),
                 make_item("B", price=50, quantity=1, vat=9)]
        order = make_order(items, discount_total=15.0)
        result = compute_discount_split(order, {"split_discount_vat": "1"})
        assert result == {"9": 5.0, "19": 10.0}

    def test_zero_returns_none(self):
        order = make_order([make_item()], discount_total=0)
        assert compute_discount_split(order, {"split_discount_vat": "1"}) is None

    def test_zero_price_items_excluded(self):
        items = [make_item("A", price=0, quantity=1, vat=19),
                 make_item("B", price=100, quantity=2, vat=9)]
        order = make_order(items, discount_total=5.0)
        result = compute_discount_split(order, {"split_discount_vat": "1"})
        assert result == {"9": 5.0}

    def test_disabled_multiple_rates(self):
        items = [make_item("A", vat=19), make_item("B", vat=9)]
        order = make_order(items, discount_total=10.0)
        result = compute_discount_split(order, {"split_discount_vat": "0"})
        assert result is None

    def test_rounding_remainder(self):
        items = [make_item("A", price=33.33, quantity=1, vat=19),
                 make_item("B", price=33.33, quantity=1, vat=9),
                 make_item("C", price=33.34, quantity=1, vat=5)]
        order = make_order(items, discount_total=10.0)
        result = compute_discount_split(order, {"split_discount_vat": "1"})
        assert result is not None
        assert abs(sum(result.values()) - 10.0) < 0.001


# ===========================================================================
# Group 2: build_articles_json()
# ===========================================================================

class TestBuildArticlesJson:
    """Regression: discount lines, policy bridge, transport (import_service.py:117)."""

    def test_discount_line_negative_quantity(self):
        items = [make_item()]
        order = make_order(items, discount_total=5.0)
        settings = {"discount_codmat": "DISC01", "split_discount_vat": "0"}
        result = json.loads(build_articles_json(items, order, settings))
        disc_lines = [a for a in result if a["sku"] == "DISC01"]
        assert len(disc_lines) == 1
        assert disc_lines[0]["quantity"] == "-1"
        assert disc_lines[0]["price"] == "5.0"

    def test_discount_uses_actual_vat_not_21(self):
        items = [make_item("A", vat=9), make_item("B", vat=9)]
        order = make_order(items, discount_total=3.0)
        settings = {"discount_codmat": "DISC01", "split_discount_vat": "1"}
        result = json.loads(build_articles_json(items, order, settings))
        disc_lines = [a for a in result if a["sku"] == "DISC01"]
        assert len(disc_lines) == 1
        assert disc_lines[0]["vat"] == "9"

    def test_discount_multi_vat_creates_multiple_lines(self):
        items = [make_item("A", price=100, vat=19), make_item("B", price=50, vat=9)]
        order = make_order(items, discount_total=15.0)
        settings = {"discount_codmat": "DISC01", "split_discount_vat": "1"}
        result = json.loads(build_articles_json(items, order, settings))
        disc_lines = [a for a in result if a["sku"] == "DISC01"]
        assert len(disc_lines) == 2
        vats = {d["vat"] for d in disc_lines}
        assert "9" in vats
        assert "19" in vats

    def test_discount_fallback_uses_gomag_vat(self):
        items = [make_item("A", vat=19), make_item("B", vat=9)]
        order = make_order(items, discount_total=5.0, discount_vat="9")
        settings = {"discount_codmat": "DISC01", "split_discount_vat": "0"}
        result = json.loads(build_articles_json(items, order, settings))
        disc_lines = [a for a in result if a["sku"] == "DISC01"]
        assert len(disc_lines) == 1
        assert disc_lines[0]["vat"] == "9"

    def test_per_article_policy_bridge(self):
        items = [make_item("SKU1")]
        settings = {"_codmat_policy_map": {"SKU1": 42}, "id_pol": "1"}
        result = json.loads(build_articles_json(items, settings=settings))
        assert result[0]["id_pol"] == "42"

    def test_policy_same_as_default_omitted(self):
        items = [make_item("SKU1")]
        settings = {"_codmat_policy_map": {"SKU1": 1}, "id_pol": "1"}
        result = json.loads(build_articles_json(items, settings=settings))
        assert "id_pol" not in result[0]

    def test_transport_line_added(self):
        items = [make_item()]
        order = make_order(items, delivery_cost=15.0)
        settings = {"transport_codmat": "TR", "transport_vat": "19"}
        result = json.loads(build_articles_json(items, order, settings))
        tr_lines = [a for a in result if a["sku"] == "TR"]
        assert len(tr_lines) == 1
        assert tr_lines[0]["quantity"] == "1"
        assert tr_lines[0]["price"] == "15.0"


# ===========================================================================
# Group 3: Kit Detection Pattern
# ===========================================================================

class TestKitDetection:
    """Regression: kit detection for single-component repackaging (multiple code locations)."""

    def test_multi_component(self):
        comps = [{"codmat": "A", "cantitate_roa": 1}, {"codmat": "B", "cantitate_roa": 1}]
        assert is_kit(comps) is True

    def test_single_component_repackaging(self):
        comps = [{"codmat": "CAF01", "cantitate_roa": 10}]
        assert is_kit(comps) is True

    def test_true_1to1_not_kit(self):
        comps = [{"codmat": "X", "cantitate_roa": 1}]
        assert is_kit(comps) is False

    def test_none_cantitate_treated_as_1(self):
        comps = [{"codmat": "X", "cantitate_roa": None}]
        assert is_kit(comps) is False

    def test_empty_components(self):
        assert is_kit([]) is False


# ===========================================================================
# Group 4: sync_prices_from_order() — Kit Skip Logic
# ===========================================================================

class TestSyncPricesKitSkip:
    """Regression: kit SKUs must be skipped in order-based price sync."""

    def _make_mock_order(self, sku, price=50.0):
        mock_order = MagicMock()
        mock_item = MagicMock()
        mock_item.sku = sku
        mock_item.price = price
        mock_order.items = [mock_item]
        return mock_order

    @patch("app.services.validation_service.compare_and_update_price")
    def test_skips_multi_component_kit(self, mock_compare):
        from app.services.validation_service import sync_prices_from_order
        orders = [self._make_mock_order("KIT01")]
        mapped = {"KIT01": [
            {"codmat": "A", "id_articol": 1, "cantitate_roa": 1},
            {"codmat": "B", "id_articol": 2, "cantitate_roa": 1},
        ]}
        mock_conn = MagicMock()
        sync_prices_from_order(orders, mapped, {}, {}, 1, conn=mock_conn,
                               settings={"price_sync_enabled": "1"})
        mock_compare.assert_not_called()

    @patch("app.services.validation_service.compare_and_update_price")
    def test_skips_repackaging_kit(self, mock_compare):
        from app.services.validation_service import sync_prices_from_order
        orders = [self._make_mock_order("CAFE100")]
        mapped = {"CAFE100": [
            {"codmat": "CAF01", "id_articol": 1, "cantitate_roa": 10},
        ]}
        mock_conn = MagicMock()
        sync_prices_from_order(orders, mapped, {}, {}, 1, conn=mock_conn,
                               settings={"price_sync_enabled": "1"})
        mock_compare.assert_not_called()

    @patch("app.services.validation_service.compare_and_update_price")
    def test_processes_1to1_mapping(self, mock_compare):
        from app.services.validation_service import sync_prices_from_order
        mock_compare.return_value = {"updated": False, "old_price": 50.0, "new_price": 50.0, "codmat": "X"}
        orders = [self._make_mock_order("SKU1", price=50.0)]
        mapped = {"SKU1": [
            {"codmat": "X", "id_articol": 100, "cantitate_roa": 1, "cont": "371"},
        ]}
        mock_conn = MagicMock()
        sync_prices_from_order(orders, mapped, {}, {"SKU1": 1}, 1, conn=mock_conn,
                               settings={"price_sync_enabled": "1"})
        mock_compare.assert_called_once()
        call_args = mock_compare.call_args
        assert call_args[0][0] == 100  # id_articol
        assert call_args[0][2] == 50.0  # price

    @patch("app.services.validation_service.compare_and_update_price")
    def test_skips_transport_discount_codmats(self, mock_compare):
        from app.services.validation_service import sync_prices_from_order
        orders = [self._make_mock_order("TRANSP", price=15.0)]
        mock_conn = MagicMock()
        sync_prices_from_order(orders, {}, {"TRANSP": {"id_articol": 99}}, {}, 1,
                               conn=mock_conn,
                               settings={"price_sync_enabled": "1",
                                         "transport_codmat": "TRANSP",
                                         "discount_codmat": "DISC"})
        mock_compare.assert_not_called()


# ===========================================================================
# Group 5: Kit Component with Own Mapping
# ===========================================================================

class TestKitComponentOwnMapping:
    """Regression: price_sync_service skips kit components that have their own ARTICOLE_TERTI mapping."""

    def test_component_with_own_mapping_skipped(self):
        """If comp_codmat is itself a key in mapped_data, it's skipped."""
        mapped_data = {
            "PACK-A": [{"codmat": "COMP-X", "id_articol": 1, "cantitate_roa": 1, "cont": "371"}],
            "COMP-X": [{"codmat": "COMP-X", "id_articol": 1, "cantitate_roa": 1, "cont": "371"}],
        }
        # The check is: if comp_codmat in mapped_data: continue
        comp_codmat = "COMP-X"
        assert comp_codmat in mapped_data  # Should be skipped

    def test_component_without_own_mapping_processed(self):
        """If comp_codmat is NOT in mapped_data, it should be processed."""
        mapped_data = {
            "PACK-A": [{"codmat": "COMP-Y", "id_articol": 2, "cantitate_roa": 1, "cont": "371"}],
        }
        comp_codmat = "COMP-Y"
        assert comp_codmat not in mapped_data  # Should be processed


# ===========================================================================
# Group 6: VAT Included Type Normalization
# ===========================================================================

class TestVatIncludedNormalization:
    """Regression: GoMag returns vat_included as int 1 or string '1' (price_sync_service.py:144)."""

    def _compute_price_cu_tva(self, product):
        price = float(product.get("price", "0"))
        vat = float(product.get("vat", "19"))
        if str(product.get("vat_included", "1")) == "1":
            return price
        else:
            return price * (1 + vat / 100)

    def test_vat_included_int_1(self):
        result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": 1})
        assert result == 100.0

    def test_vat_included_str_1(self):
        result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": "1"})
        assert result == 100.0

    def test_vat_included_int_0(self):
        result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": 0})
        assert result == 119.0

    def test_vat_included_str_0(self):
        result = self._compute_price_cu_tva({"price": "100", "vat": "19", "vat_included": "0"})
        assert result == 119.0


# ===========================================================================
# Group 7: validate_kit_component_prices — pret=0 allowed
# ===========================================================================

class TestKitComponentPriceValidation:
    """Regression: pret=0 in CRM is valid for kit components (validation_service.py:469)."""

    def _call_validate(self, fetchone_returns):
        from app.services.validation_service import validate_kit_component_prices

        mock_conn = MagicMock()
        mock_cursor = MagicMock()
        mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
        mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
        mock_cursor.fetchone.return_value = fetchone_returns

        mapped = {"KIT-SKU": [
            {"codmat": "COMP1", "id_articol": 100, "cont": "371", "cantitate_roa": 5},
        ]}
        return validate_kit_component_prices(mapped, id_pol=1, conn=mock_conn)

    def test_price_zero_not_rejected(self):
        result = self._call_validate((0,))
        assert result == {}

    def test_missing_entry_rejected(self):
        result = self._call_validate(None)
        assert "KIT-SKU" in result
        assert "COMP1" in result["KIT-SKU"]

    def test_skips_true_1to1(self):
        from app.services.validation_service import validate_kit_component_prices
        mock_conn = MagicMock()
        mapped = {"SKU1": [
            {"codmat": "X", "id_articol": 1, "cont": "371", "cantitate_roa": 1},
        ]}
        result = validate_kit_component_prices(mapped, id_pol=1, conn=mock_conn)
        assert result == {}

    def test_checks_repackaging(self):
        """Single component with cantitate_roa > 1 should be checked."""
        from app.services.validation_service import validate_kit_component_prices

        mock_conn = MagicMock()
        mock_cursor = MagicMock()
        mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
        mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
        mock_cursor.fetchone.return_value = (51.50,)

        mapped = {"CAFE100": [
            {"codmat": "CAF01", "id_articol": 100, "cont": "371", "cantitate_roa": 10},
        ]}
        result = validate_kit_component_prices(mapped, id_pol=1, conn=mock_conn)
        assert result == {}
        mock_cursor.execute.assert_called_once()


# ===========================================================================
# Group 8: Dual Policy Assignment
# ===========================================================================

class TestDualPolicyAssignment:
    """Regression: cont 341/345 → production policy, others → sales (validation_service.py:282)."""

    def _call_dual(self, codmats, direct_id_map, cursor_rows):
        from app.services.validation_service import validate_and_ensure_prices_dual

        mock_conn = MagicMock()
        mock_cursor = MagicMock()
        mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
        mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
        # The code uses `for row in cur:` to iterate, not fetchall
        mock_cursor.__iter__ = MagicMock(return_value=iter(cursor_rows))
        # Mock ensure_prices to do nothing
        with patch("app.services.validation_service.ensure_prices"):
            return validate_and_ensure_prices_dual(
                codmats, id_pol_vanzare=1, id_pol_productie=2,
                conn=mock_conn, direct_id_map=direct_id_map
            )

    def test_cont_341_production(self):
        result = self._call_dual(
            {"COD1"},
            {"COD1": {"id_articol": 100, "cont": "341"}},
            []  # no existing prices
        )
        assert result["COD1"] == 2  # id_pol_productie

    def test_cont_345_production(self):
        result = self._call_dual(
            {"COD1"},
            {"COD1": {"id_articol": 100, "cont": "345"}},
            []
        )
        assert result["COD1"] == 2

    def test_other_cont_sales(self):
        result = self._call_dual(
            {"COD1"},
            {"COD1": {"id_articol": 100, "cont": "371"}},
            []
        )
        assert result["COD1"] == 1  # id_pol_vanzare

    def test_existing_sales_preferred(self):
        result = self._call_dual(
            {"COD1"},
            {"COD1": {"id_articol": 100, "cont": "345"}},
            [(100, 1), (100, 2)]  # price exists in BOTH policies
        )
        assert result["COD1"] == 1  # sales preferred when both exist


# ===========================================================================
# Group 9: Duplicate CODMAT — resolve_codmat_ids
# ===========================================================================

class TestResolveCodmatIds:
    """Regression: ROW_NUMBER dedup returns exactly 1 id_articol per CODMAT."""

    @patch("app.services.validation_service.database")
    def test_returns_one_per_codmat(self, mock_db):
        from app.services.validation_service import resolve_codmat_ids

        mock_conn = MagicMock()
        mock_cursor = MagicMock()
        mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
        mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
        # Simulate ROW_NUMBER already deduped: 1 row per codmat
        mock_cursor.__iter__ = MagicMock(return_value=iter([
            ("COD1", 100, "345"),
            ("COD2", 200, "341"),
        ]))

        result = resolve_codmat_ids({"COD1", "COD2"}, conn=mock_conn)
        assert len(result) == 2
        assert result["COD1"]["id_articol"] == 100
        assert result["COD2"]["id_articol"] == 200

    @patch("app.services.validation_service.database")
    def test_resolve_mapped_one_per_sku_codmat(self, mock_db):
        from app.services.validation_service import resolve_mapped_codmats

        mock_conn = MagicMock()
        mock_cursor = MagicMock()
        mock_conn.cursor.return_value.__enter__ = MagicMock(return_value=mock_cursor)
        mock_conn.cursor.return_value.__exit__ = MagicMock(return_value=False)
        # 1 row per (sku, codmat) pair
        mock_cursor.__iter__ = MagicMock(return_value=iter([
            ("SKU1", "COD1", 100, "345", 10),
            ("SKU1", "COD2", 200, "341", 1),
        ]))

        result = resolve_mapped_codmats({"SKU1"}, mock_conn)
        assert "SKU1" in result
        assert len(result["SKU1"]) == 2
        codmats = [c["codmat"] for c in result["SKU1"]]
        assert "COD1" in codmats
        assert "COD2" in codmats
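The proportional discount split that TestDiscountSplit asserts above can be sketched as a standalone helper. This is a hypothetical simplification for illustration only (no remainder correction); the real logic is `compute_discount_split` in `app/services/import_service.py`:

```python
def split_discount_by_vat(items, discount_total):
    """Split a total discount across VAT rates, proportional to line value.

    items: list of (price, quantity, vat) tuples; zero-value lines are
    excluded, mirroring test_zero_price_items_excluded above.
    Returns None when there is nothing to split.
    """
    totals = {}
    for price, qty, vat in items:
        value = price * qty
        if value > 0:
            totals[str(vat)] = totals.get(str(vat), 0.0) + value
    grand = sum(totals.values())
    if not grand or not discount_total:
        return None
    # Each VAT bucket receives a share proportional to its gross value.
    return {vat: round(discount_total * v / grand, 2) for vat, v in totals.items()}


print(split_discount_by_vat([(100, 1, 19), (50, 1, 9)], 15.0))
# {'19': 10.0, '9': 5.0}
```

With a single VAT rate the whole discount lands in one bucket, which is the `{"19": 10.0}` case in `test_single_vat_rate`.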
@@ -528,6 +528,379 @@ def test_repackaging_kit_pricing():
         return False
|
||||||
|
|
||||||
|
|
||||||
|
# ===========================================================================
|
||||||
|
# Group 10: Business Rule Regression Tests (Oracle integration)
|
||||||
|
# ===========================================================================
|
||||||
|
|
||||||
|
def _create_test_partner(cur, suffix):
|
||||||
|
"""Helper: create a test partner and return its ID."""
|
||||||
|
partner_var = cur.var(oracledb.NUMBER)
|
||||||
|
name = f'Test BizRule {suffix}'
|
||||||
|
cur.execute("""
|
||||||
|
DECLARE v_id NUMBER;
|
||||||
|
BEGIN
|
||||||
|
v_id := PACK_IMPORT_PARTENERI.cauta_sau_creeaza_partener(
|
||||||
|
NULL, :name, 'JUD:Bucuresti;BUCURESTI;Str Test;1',
|
||||||
|
'0720000000', 'bizrule@test.com');
|
||||||
|
:result := v_id;
|
||||||
|
END;
|
||||||
|
""", {'name': name, 'result': partner_var})
|
||||||
|
return partner_var.getvalue()
|
||||||
|
|
||||||
|
|
||||||
|
def _import_order(cur, order_number, partner_id, articles_json, kit_mode='separate_line', id_pol=1):
|
||||||
|
"""Helper: call importa_comanda and return order ID."""
|
||||||
|
result_var = cur.var(oracledb.NUMBER)
|
||||||
|
cur.execute("""
|
||||||
|
DECLARE v_id NUMBER;
|
||||||
|
BEGIN
|
||||||
|
PACK_IMPORT_COMENZI.importa_comanda(
|
||||||
|
:order_number, SYSDATE, :partner_id,
|
||||||
|
:articles_json,
|
||||||
|
NULL, NULL,
|
||||||
|
:id_pol, NULL, NULL,
|
||||||
|
:kit_mode,
|
||||||
|
NULL, NULL, NULL,
|
||||||
|
v_id);
|
||||||
|
:result := v_id;
|
||||||
|
END;
|
||||||
|
""", {
|
||||||
|
'order_number': order_number,
|
||||||
|
'partner_id': partner_id,
|
||||||
|
'articles_json': articles_json,
|
||||||
|
'id_pol': id_pol,
|
||||||
|
'kit_mode': kit_mode,
|
||||||
|
'result': result_var
|
||||||
|
})
|
||||||
|
return result_var.getvalue()
|
||||||
|
|
||||||
|
|
||||||
|
def _get_order_lines(cur, order_id):
|
||||||
|
"""Helper: fetch COMENZI_ELEMENTE rows for an order."""
|
||||||
|
cur.execute("""
|
||||||
|
SELECT ce.CANTITATE, ce.PRET, na.CODMAT, ce.PTVA
|
||||||
|
FROM COMENZI_ELEMENTE ce
|
||||||
|
JOIN NOM_ARTICOLE na ON ce.ID_ARTICOL = na.ID_ARTICOL
|
||||||
|
WHERE ce.ID_COMANDA = :oid
|
||||||
|
ORDER BY ce.CANTITATE DESC, ce.PRET DESC
|
||||||
|
""", {'oid': order_id})
|
||||||
|
return cur.fetchall()
|
||||||
|
|
||||||
|
|
||||||
|
def test_multi_kit_discount_merge():
|
||||||
|
"""Regression (0666d6b): 2 identical kits at same VAT must merge discount lines,
|
||||||
|
not crash on duplicate check collision."""
|
||||||
|
print("\n" + "=" * 60)
|
||||||
|
print("TEST: Multi-kit discount merge (separate_line)")
|
||||||
|
print("=" * 60)
|
||||||
|
|
||||||
|
try:
|
||||||
|
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||||
|
with conn.cursor() as cur:
|
||||||
|
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
|
||||||
|
setup_test_data(cur)
|
||||||
|
partner_id = _create_test_partner(cur, suffix)
|
||||||
|
|
||||||
|
# 2 identical CAFE100 kits: total web = 2 * 450 = 900
|
||||||
|
articles_json = '[{"sku": "CAFE100", "cantitate": 2, "pret": 450}]'
|
||||||
|
order_id = _import_order(cur, f'TEST-BIZ-MERGE-{suffix}', partner_id, articles_json)
|
||||||
|
|
||||||
|
assert order_id and order_id > 0, "Order import failed"
|
||||||
|
rows = _get_order_lines(cur, order_id)
|
||||||
|
|
||||||
|
art_lines = [r for r in rows if r[0] > 0]
|
||||||
|
disc_lines = [r for r in rows if r[0] < 0]
|
||||||
|
assert len(art_lines) >= 1, f"Expected article line(s), got {len(art_lines)}"
|
||||||
|
assert len(disc_lines) >= 1, f"Expected discount line(s), got {len(disc_lines)}"
|
||||||
|
|
||||||
|
total = sum(r[0] * r[1] for r in rows)
|
||||||
|
expected = 900.0
|
||||||
|
print(f" Total: {total:.2f} (expected: {expected:.2f})")
|
||||||
|
assert abs(total - expected) < 0.02, f"Total {total:.2f} != expected {expected:.2f}"
|
||||||
|
print(" PASS")
|
||||||
|
|
||||||
|
conn.commit()
|
||||||
|
teardown_test_data(cur)
|
||||||
|
conn.commit()
|
||||||
|
return True
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f" FAIL: {e}")
|
||||||
|
import traceback
|
||||||
|
traceback.print_exc()
|
||||||
|
try:
|
||||||
|
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||||
|
with conn.cursor() as cur:
|
||||||
|
teardown_test_data(cur)
|
||||||
|
conn.commit()
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
return False
|
||||||
|
|
||||||
|
|
||||||
|
def test_kit_discount_per_kit_placement():
|
||||||
|
"""Regression (580ca59): discount lines must appear after article lines (both present)."""
|
||||||
|
print("\n" + "=" * 60)
|
||||||
|
print("TEST: Kit discount per-kit placement")
|
||||||
|
print("=" * 60)
|
||||||
|
|
||||||
|
try:
|
||||||
|
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||||
|
with conn.cursor() as cur:
|
||||||
|
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
|
||||||
|
setup_test_data(cur)
|
||||||
|
partner_id = _create_test_partner(cur, suffix)
|
||||||
|
|
||||||
|
articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 450}]'
|
||||||
|
order_id = _import_order(cur, f'TEST-BIZ-PLACE-{suffix}', partner_id, articles_json)
|
||||||
|
|
||||||
|
assert order_id and order_id > 0, "Order import failed"
|
||||||
|
rows = _get_order_lines(cur, order_id)
|
||||||
|
|
||||||
|
art_lines = [r for r in rows if r[0] > 0]
|
||||||
|
disc_lines = [r for r in rows if r[0] < 0]
|
||||||
|
print(f" Article lines: {len(art_lines)}, Discount lines: {len(disc_lines)}")
|
||||||
|
assert len(art_lines) >= 1, "No article line found"
|
||||||
|
assert len(disc_lines) >= 1, "No discount line found — kit pricing did not activate"
|
||||||
|
print(" PASS")
|
||||||
|
|
||||||
|
conn.commit()
|
||||||
|
teardown_test_data(cur)
|
||||||
|
conn.commit()
|
||||||
|
return True
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f" FAIL: {e}")
|
||||||
|
import traceback
|
||||||
|
traceback.print_exc()
|
||||||
|
try:
|
||||||
|
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||||
|
with conn.cursor() as cur:
|
||||||
|
teardown_test_data(cur)
|
||||||
|
conn.commit()
|
||||||
|
except:
|
||||||
|
pass
|
||||||
|
return False
|
||||||
|
|
||||||
|
|
||||||
|
def test_repackaging_distributed_total_matches_web():
|
||||||
|
"""Regression (61ae58e): distributed mode total must match web price exactly."""
|
||||||
|
print("\n" + "=" * 60)
|
||||||
|
print("TEST: Repackaging distributed total matches web")
|
||||||
|
print("=" * 60)
|
||||||
|
|
||||||
|
try:
|
||||||
|
with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
|
||||||
|
with conn.cursor() as cur:
|
||||||
|
suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
|
||||||
|
setup_test_data(cur)
|
||||||
|
partner_id = _create_test_partner(cur, suffix)
|
||||||
|
|
||||||
|
# 3 packs @ 400 lei => total web = 1200
|
||||||
|
articles_json = '[{"sku": "CAFE100", "cantitate": 3, "pret": 400}]'
|
||||||
|
order_id = _import_order(cur, f'TEST-BIZ-DIST-{suffix}', partner_id,
|
||||||
|
articles_json, kit_mode='distributed')
|
||||||
|
|
||||||
|
assert order_id and order_id > 0, "Order import failed"
|
||||||
|
rows = _get_order_lines(cur, order_id)
|
||||||
|
|
||||||
|
# Distributed: single line with adjusted price
|
||||||
|
positive_lines = [r for r in rows if r[0] > 0]
|
||||||
|
assert len(positive_lines) == 1, f"Expected 1 line in distributed mode, got {len(positive_lines)}"
|
||||||
|
|
||||||
|
total = positive_lines[0][0] * positive_lines[0][1]
|
||||||
|
expected = 1200.0
|
||||||
|
print(f" Line: qty={positive_lines[0][0]}, price={positive_lines[0][1]:.2f}")
|
||||||
|
                print(f" Total: {total:.2f} (expected: {expected:.2f})")
                assert abs(total - expected) < 0.02, f"Total {total:.2f} != expected {expected:.2f}"
                print(" PASS")

                conn.commit()
                teardown_test_data(cur)
                conn.commit()
                return True

    except Exception as e:
        print(f" FAIL: {e}")
        import traceback
        traceback.print_exc()
        try:
            with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
                with conn.cursor() as cur:
                    teardown_test_data(cur)
                    conn.commit()
        except:
            pass
        return False


def test_kit_markup_no_negative_discount():
    """Regression (47b5723): when web price > list price (markup), no discount line should be inserted."""
    print("\n" + "=" * 60)
    print("TEST: Kit markup — no negative discount")
    print("=" * 60)

    try:
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            with conn.cursor() as cur:
                suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
                setup_test_data(cur)
                partner_id = _create_test_partner(cur, suffix)

                # CAF01 list price = 51.50/unit, 10 units = 515
                # Web price 600 > 515 => markup, no discount line
                articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 600}]'
                order_id = _import_order(cur, f'TEST-BIZ-MARKUP-{suffix}', partner_id, articles_json)

                assert order_id and order_id > 0, "Order import failed"
                rows = _get_order_lines(cur, order_id)

                disc_lines = [r for r in rows if r[0] < 0]
                print(f" Total lines: {len(rows)}, Discount lines: {len(disc_lines)}")
                assert len(disc_lines) == 0, f"Expected 0 discount lines for markup, got {len(disc_lines)}"
                print(" PASS")

                conn.commit()
                teardown_test_data(cur)
                conn.commit()
                return True

    except Exception as e:
        print(f" FAIL: {e}")
        import traceback
        traceback.print_exc()
        try:
            with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
                with conn.cursor() as cur:
                    teardown_test_data(cur)
                    conn.commit()
        except:
            pass
        return False

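The markup rule the test above checks reduces to a clamp at zero. The sketch below is a hypothetical standalone illustration; `discount_amount` and its values are illustrative, not the importer's actual API:

```python
# Hypothetical sketch of the markup rule: a discount line is emitted only
# when the web price falls below the list-price total.
def discount_amount(list_total: float, web_total: float) -> float:
    # Clamp at zero: a markup (web > list) must not yield a negative discount.
    return max(list_total - web_total, 0.0)

# CAF01 list total 515.00 vs web 600.00 (markup): no discount line
print(discount_amount(515.0, 600.0))  # 0.0
# Web below list: discount equals the difference
print(discount_amount(515.0, 450.0))  # 65.0
```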
def test_kit_component_price_zero_import():
    """Regression (1703232): kit components with pret=0 should import successfully."""
    print("\n" + "=" * 60)
    print("TEST: Kit component price=0 import")
    print("=" * 60)

    try:
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            with conn.cursor() as cur:
                suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
                setup_test_data(cur)
                partner_id = _create_test_partner(cur, suffix)

                # Temporarily set CAF01 price to 0
                cur.execute("""
                    UPDATE crm_politici_pret_art SET PRET = 0
                    WHERE id_articol = 9999001 AND id_pol = 1
                """)
                conn.commit()

                try:
                    # Import with pret=0 — should succeed (discount = full web price)
                    articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 100}]'
                    order_id = _import_order(cur, f'TEST-BIZ-PRET0-{suffix}', partner_id, articles_json)

                    print(f" Order ID: {order_id}")
                    assert order_id and order_id > 0, "Order import failed with pret=0"
                    print(" PASS: Order imported successfully with pret=0")

                    conn.commit()
                finally:
                    # Restore original price
                    cur.execute("""
                        UPDATE crm_politici_pret_art SET PRET = 51.50
                        WHERE id_articol = 9999001 AND id_pol = 1
                    """)
                    conn.commit()

                teardown_test_data(cur)
                conn.commit()
                return True

    except Exception as e:
        print(f" FAIL: {e}")
        import traceback
        traceback.print_exc()
        try:
            with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
                with conn.cursor() as cur:
                    # Restore price on error
                    cur.execute("""
                        UPDATE crm_politici_pret_art SET PRET = 51.50
                        WHERE id_articol = 9999001 AND id_pol = 1
                    """)
                    conn.commit()
                    teardown_test_data(cur)
                    conn.commit()
        except:
            pass
        return False


def test_duplicate_codmat_different_prices():
    """Regression (95565af): same CODMAT at different prices should create separate lines,
    discriminated by PRET + SIGN(CANTITATE)."""
    print("\n" + "=" * 60)
    print("TEST: Duplicate CODMAT different prices")
    print("=" * 60)

    try:
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            with conn.cursor() as cur:
                suffix = f'{datetime.now().strftime("%H%M%S")}-{random.randint(1000, 9999)}'
                setup_test_data(cur)
                partner_id = _create_test_partner(cur, suffix)

                # Two lines end up on the same CODMAT (CAF01) at different prices:
                # CAFE100 -> CAF01 via ARTICOLE_TERTI (kit pricing). In separate_line
                # mode the article line gets the list price (51.50) and the discount
                # line lands on the same id_articol at a different PRET.
                # Real scenario: kit article line + discount line on same id_articol.
                articles_json = '[{"sku": "CAFE100", "cantitate": 1, "pret": 450}]'
                order_id = _import_order(cur, f'TEST-BIZ-DUP-{suffix}', partner_id, articles_json)

                assert order_id and order_id > 0, "Order import failed"
                rows = _get_order_lines(cur, order_id)

                # separate_line mode: article at list price + discount at negative qty.
                # Both reference the same CODMAT (CAF01) but differ in PRET and SIGN(CANTITATE).
                print(f" Lines: {len(rows)}")
                for r in rows:
                    print(f"   qty={r[0]}, pret={r[1]:.2f}, codmat={r[2]}")

                # Should have at least 2 lines with same CODMAT but different qty sign
                caf_lines = [r for r in rows if r[2] == 'CAF01']
                assert len(caf_lines) >= 2, f"Expected 2+ CAF01 lines (article + discount), got {len(caf_lines)}"
                signs = {1 if r[0] > 0 else -1 for r in caf_lines}
                assert len(signs) == 2, "Expected both positive and negative quantity lines for same CODMAT"
                print(" PASS: Same CODMAT with different PRET/SIGN coexist")

                conn.commit()
                teardown_test_data(cur)
                conn.commit()
                return True

    except Exception as e:
        print(f" FAIL: {e}")
        import traceback
        traceback.print_exc()
        try:
            with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
                with conn.cursor() as cur:
                    teardown_test_data(cur)
                    conn.commit()
        except:
            pass
        return False

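The discrimination rule the duplicate-CODMAT test relies on can be sketched as a composite key. `line_key` below is a hypothetical helper written for illustration, not code from the importer, and the price values are made up:

```python
# Hypothetical sketch: order lines are distinguished by (CODMAT, PRET, sign of
# CANTITATE), so an article line and its discount line can share a CODMAT
# without being merged into one row.
def line_key(codmat: str, pret: float, cantitate: float) -> tuple:
    sign = 1 if cantitate > 0 else -1
    return (codmat, round(pret, 2), sign)

lines = [
    ("CAF01", 51.50, 10),   # article line at list price, positive quantity
    ("CAF01", 6.50, -10),   # discount line, negative quantity
]
keys = {line_key(*line) for line in lines}
print(len(keys))  # 2: same CODMAT, two distinct lines
```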
if __name__ == "__main__":
    print("Starting complete order import test...")
    print(f"Timestamp: {datetime.now()}")

    # … (unchanged driver code elided in the diff) …

    print(f"\nTest completed at: {datetime.now()}")
    if success:
        print("PHASE 1 VALIDATION: SUCCESSFUL")
    else:
        print("PHASE 1 VALIDATION: NEEDS ATTENTION")

    # Run repackaging kit pricing test
    print("\n")
    repack_success = test_repackaging_kit_pricing()
    if repack_success:
        print("REPACKAGING KIT PRICING: SUCCESSFUL")
    else:
        print("REPACKAGING KIT PRICING: NEEDS ATTENTION")

    # Run business rule regression tests
    print("\n")
    biz_tests = [
        ("Multi-kit discount merge", test_multi_kit_discount_merge),
        ("Kit discount per-kit placement", test_kit_discount_per_kit_placement),
        ("Distributed total matches web", test_repackaging_distributed_total_matches_web),
        ("Markup no negative discount", test_kit_markup_no_negative_discount),
        ("Component price=0 import", test_kit_component_price_zero_import),
        ("Duplicate CODMAT different prices", test_duplicate_codmat_different_prices),
    ]
    biz_passed = 0
    for name, test_fn in biz_tests:
        if test_fn():
            biz_passed += 1
    print(f"\nBusiness rule tests: {biz_passed}/{len(biz_tests)} passed")

    exit(0 if success else 1)
# … (test_health_oracle_connected and earlier tests elided in the diff) …

# ---------------------------------------------------------------------------
# Test B: Mappings CRUD cycle (uses real CODMAT from Oracle nomenclator)
# ---------------------------------------------------------------------------
@pytest.fixture(scope="module")
def test_sku():
    """Generate a unique test SKU per run to avoid conflicts with prior soft-deleted entries."""
    import time
    return f"PYTEST_SKU_{int(time.time())}"


@pytest.fixture(scope="module")
def real_codmat(client):
    """Find a real CODMAT from Oracle nomenclator to use in mappings tests."""
    # min_length=2 on the endpoint, so use 2+ char search terms
    for term in ["01", "PH", "CA"]:
        resp = client.get("/api/articles/search", params={"q": term})
        if resp.status_code == 200:
            results = resp.json().get("results", [])
            if results:
                return results[0]["codmat"]
    pytest.skip("No articles found in Oracle for CRUD test")


def test_mappings_create(client, real_codmat, test_sku):
    resp = client.post("/api/mappings", json={
        "sku": test_sku,
        "codmat": real_codmat,
        "cantitate_roa": 2.5,
    })
    assert resp.status_code == 200, f"create returned {resp.status_code}: {resp.json()}"
    body = resp.json()
    assert body.get("success") is True, f"create returned: {body}"


def test_mappings_list_after_create(client, real_codmat, test_sku):
    resp = client.get("/api/mappings", params={"search": test_sku})
    assert resp.status_code == 200
    body = resp.json()
    mappings = body.get("mappings", [])
    found = any(
        m["sku"] == test_sku and m["codmat"] == real_codmat
        for m in mappings
    )
    assert found, f"mapping not found in list; got {mappings}"


def test_mappings_update(client, real_codmat, test_sku):
    resp = client.put(f"/api/mappings/{test_sku}/{real_codmat}", json={
        "cantitate_roa": 3.0,
    })
    assert resp.status_code == 200
    body = resp.json()
    assert body.get("success") is True, f"update returned: {body}"


def test_mappings_delete(client, real_codmat, test_sku):
    resp = client.delete(f"/api/mappings/{test_sku}/{real_codmat}")
    assert resp.status_code == 200
    body = resp.json()
    assert body.get("success") is True, f"delete returned: {body}"


def test_mappings_verify_soft_deleted(client, real_codmat, test_sku):
    resp = client.get("/api/mappings", params={"search": test_sku, "show_deleted": "true"})
    assert resp.status_code == 200
    body = resp.json()
    mappings = body.get("mappings", [])
    deleted = any(
        m["sku"] == test_sku and m["codmat"] == real_codmat and m.get("sters") == 1
        for m in mappings
    )
    assert deleted, (
        f"expected sters=1 for deleted mapping, got: "
        f"{[m for m in mappings if m['sku'] == test_sku]}"
    )
test.sh
GREEN='\033[32m'
RED='\033[31m'
YELLOW='\033[33m'
CYAN='\033[36m'
RESET='\033[0m'

# ─── Log file setup ──────────────────────────────────────────────────────────
LOG_DIR="qa-reports"
mkdir -p "$LOG_DIR"
TIMESTAMP=$(date '+%Y%m%d_%H%M%S')
LOG_FILE="${LOG_DIR}/test_run_${TIMESTAMP}.log"

# Strip ANSI codes for log file
strip_ansi() {
    sed 's/\x1b\[[0-9;]*m//g'
}

# Tee to both terminal and log file (log without colors)
log_tee() {
    tee >(strip_ansi >> "$LOG_FILE")
}

# ─── Stage tracking ───────────────────────────────────────────────────────────
declare -a STAGE_NAMES=()
declare -a STAGE_RESULTS=()  # 0=pass, 1=fail, 2=skip
declare -a STAGE_SKIPPED=()  # count of skipped tests per stage
declare -a STAGE_DETAILS=()  # pytest summary line per stage
EXIT_CODE=0
TOTAL_SKIPPED=0

record() {
    local name="$1"
    local code="$2"
    local skipped="${3:-0}"
    local details="${4:-}"
    STAGE_NAMES+=("$name")
    STAGE_SKIPPED+=("$skipped")
    STAGE_DETAILS+=("$details")
    TOTAL_SKIPPED=$((TOTAL_SKIPPED + skipped))
    if [ "$code" -eq 0 ]; then
        STAGE_RESULTS+=(0)
    else
        # … (unchanged lines elided in the diff) …
    fi
}

skip_stage() {
    STAGE_NAMES+=("$1")
    STAGE_RESULTS+=(2)
    STAGE_SKIPPED+=(0)
    STAGE_DETAILS+=("")
}

# ─── Environment setup ────────────────────────────────────────────────────────
# … (unchanged lines elided in the diff) …

run_stage() {
    # … (unchanged lines elided in the diff) …
    shift
    echo ""
    echo -e "${YELLOW}=== $label ===${RESET}"

    # Capture output for skip parsing while showing it live
    local tmpout
    tmpout=$(mktemp)
    set +e
    "$@" 2>&1 | tee "$tmpout" | log_tee
    local code=${PIPESTATUS[0]}
    set -e

    # Parse pytest summary line for skip count
    # Matches lines like: "= 5 passed, 3 skipped in 1.23s ="
    local skipped=0
    local summary_line=""
    summary_line=$(grep -E '=+.*passed|failed|error|skipped.*=+' "$tmpout" | tail -1 || true)
    if [ -n "$summary_line" ]; then
        skipped=$(echo "$summary_line" | grep -oP '\d+(?= skipped)' || echo "0")
        [ -z "$skipped" ] && skipped=0
    fi
    rm -f "$tmpout"

    record "$label" $code "$skipped" "$summary_line"
    # Don't return $code — let execution continue to next stage
}
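The skip-count extraction used in `run_stage` can be checked in isolation. This is a standalone sketch with made-up summary lines (it assumes GNU grep, since `-oP` with a lookahead is a Perl-regex feature):

```shell
# Feed a pytest-style summary line through the same extraction as run_stage.
summary_line="===== 5 passed, 3 skipped in 1.23s ====="
skipped=$(echo "$summary_line" | grep -oP '\d+(?= skipped)' || echo "0")
echo "$skipped"   # 3

# No "skipped" token: grep exits non-zero and the fallback yields 0.
skipped=$(echo "= 7 passed in 0.5s =" | grep -oP '\d+(?= skipped)' || echo "0")
echo "$skipped"   # 0
```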

# ─── Summary box ──────────────────────────────────────────────────────────────
print_summary() {
    echo ""
    echo -e "${YELLOW}╔══════════════════════════════════════════════════╗${RESET}"
    echo -e "${YELLOW}║               TEST RESULTS SUMMARY               ║${RESET}"
    echo -e "${YELLOW}╠══════════════════════════════════════════════════╣${RESET}"

    for i in "${!STAGE_NAMES[@]}"; do
        local name="${STAGE_NAMES[$i]}"
        local result="${STAGE_RESULTS[$i]}"
        local skipped="${STAGE_SKIPPED[$i]}"
        # Pad name to 24 chars
        local padded
        padded=$(printf "%-24s" "$name")
        if [ "$result" -eq 0 ]; then
            if [ "$skipped" -gt 0 ]; then
                echo -e "${YELLOW}║${RESET} ${GREEN}✅${RESET} ${padded} ${GREEN}passed${RESET} ${CYAN}(${skipped} skipped)${RESET} ${YELLOW}║${RESET}"
            else
                echo -e "${YELLOW}║${RESET} ${GREEN}✅${RESET} ${padded} ${GREEN}passed${RESET} ${YELLOW}║${RESET}"
            fi
        elif [ "$result" -eq 1 ]; then
            echo -e "${YELLOW}║${RESET} ${RED}❌${RESET} ${padded} ${RED}FAILED${RESET} ${YELLOW}║${RESET}"
        else
            echo -e "${YELLOW}║${RESET} ${YELLOW}⏭️ ${RESET} ${padded} ${YELLOW}skipped${RESET} ${YELLOW}║${RESET}"
        fi
    done

    echo -e "${YELLOW}╠══════════════════════════════════════════════════╣${RESET}"
    if [ "$EXIT_CODE" -eq 0 ]; then
        if [ "$TOTAL_SKIPPED" -gt 0 ]; then
            echo -e "${YELLOW}║${RESET} ${GREEN}All stages passed!${RESET} ${CYAN}(${TOTAL_SKIPPED} tests skipped total)${RESET} ${YELLOW}║${RESET}"
        else
            echo -e "${YELLOW}║${RESET} ${GREEN}All stages passed!${RESET} ${YELLOW}║${RESET}"
        fi
    else
        echo -e "${YELLOW}║${RESET} ${RED}Some stages FAILED — check output above${RESET} ${YELLOW}║${RESET}"
    fi
    echo -e "${YELLOW}║${RESET} Log: ${CYAN}${LOG_FILE}${RESET}"
    echo -e "${YELLOW}║${RESET} Health Score: see qa-reports/"
    echo -e "${YELLOW}╚══════════════════════════════════════════════════╝${RESET}"
}

# ─── Cleanup trap ────────────────────────────────────────────────────────────
# … (unchanged lines elided in the diff) …

setup_env

# Write log header
echo "=== test.sh ${MODE} — $(date '+%Y-%m-%d %H:%M:%S') ===" > "$LOG_FILE"
echo "" >> "$LOG_FILE"

case "$MODE" in
  ci)
    run_stage "Unit tests" python -m pytest -m unit -v
    # … (unchanged lines elided in the diff) …
    ;;
esac

print_summary 2>&1 | log_tee
echo ""
echo -e "${CYAN}Full log saved to: ${LOG_FILE}${RESET}"
exit $EXIT_CODE