Aptli

Validations

Validations are the quality control step after a report is submitted. As of the current release, validations are no longer a standalone page — they live alongside the report they apply to, shown as a traffic-light badge on the reports list and a status-grouped badge row in the expanded report view.

A supervisor or QC inspector opens a report, clicks a validation badge to view or create a validation in an overlay modal, verifies measurements and installation quality, and records findings against specific line items in the report. Each finding carries a severity level (critical, warning, info), and the overall validation status determines whether payment is released or the worker must correct their work.

Where Validations Live

Reports list: Every report row shows an aggregate validation badge (grey/red/yellow/green) in its own column. The badge displays the most-severe status with a count — for example, FAILED · 3 if there are three failed validations on that report. Hovering the badge reveals a per-status breakdown like failed: 1, passed: 2.
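The aggregation described above can be sketched as a small ranking function. This is an illustrative sketch only: the worst-first ordering, display labels, and helper name are assumptions, not the actual implementation.

```typescript
type ValidationStatus = "fail" | "needs-revision" | "approved-with-notes" | "pass";

// Worst-first ranking and display labels — both assumed for illustration.
const RANK: ValidationStatus[] = ["fail", "needs-revision", "approved-with-notes", "pass"];
const LABEL: Record<ValidationStatus, string> = {
  fail: "FAILED",
  "needs-revision": "REVISION",
  "approved-with-notes": "NOTED",
  pass: "PASSED",
};

// Returns the badge text for a report, e.g. "FAILED · 3".
// A report with no validations gets the grey "NONE" badge.
function aggregateBadge(statuses: ValidationStatus[]): string {
  const worst = RANK.find((s) => statuses.includes(s));
  if (!worst) return "NONE";
  const count = statuses.filter((s) => s === worst).length;
  return `${LABEL[worst]} · ${count}`;
}
```

For example, three failed validations yield `FAILED · 3`, matching the badge text described above.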

Expanded report: Opening a report in the list shows a validation badge row grouped by status. Clicking any badge opens the ValidationEditModal — a full-featured overlay for viewing, editing, or creating a validation without leaving the report.

Mobile: The mobile supervisor queue at /m/validations is unchanged. Mobile validators continue to work through assigned validations from a dedicated list.

Print view: The standalone /fulfillment/validations/:id page is preserved as a print-only view. The modal has a Print button in its header that opens the print page in a new tab. Deep-links from old bookmarks still resolve.

Opening a Validation

From the reports list:

  1. Find the report you want to validate
  2. Click the validation badge in the report row (or expand the row to see badges grouped by status)
  3. The ValidationEditModal opens over the page
  4. View the existing validation, or click Create validation if none exists
  5. Record findings, upload photos, set status, save

Access gates:

  • Viewing validations requires no special right
  • Creating requires the validationsCreate admin right
  • Editing requires the validationsUpdate admin right

Unified Submit (Report + Validation Together)

The redesign unifies editing across the report and its validations. When you have both the report and a validation open, the Submit button batches all changes together — report edits and validation edits share a single diffData payload and commit atomically. You don't save validations separately from the report.
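A batched submit payload might look like the sketch below. Only the name `diffData` comes from the docs; every other field name and the overall shape are assumptions made for illustration.

```typescript
// Hypothetical shape of a unified submit payload.
interface UnifiedSubmit {
  reportId: string;
  diffData: {
    report?: Record<string, unknown>;   // changed report fields only
    validations?: Array<{
      id?: string;                      // omitted when creating a new validation
      changes: Record<string, unknown>; // changed validation fields
    }>;
  };
}

// One atomic commit carrying both a report edit and a validation edit.
const payload: UnifiedSubmit = {
  reportId: "rep_123",
  diffData: {
    report: { notes: "Corrected cable length" },
    validations: [{ id: "val_456", changes: { status: "pass" } }],
  },
};
```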

Validation Structure

Core Fields:

  • Report - Reference to validated report (set automatically when you open from the report)
  • Validator - User who performed QC check
  • Status - pass, fail, needs-revision, approved-with-notes
  • Validation Date - When QC performed
  • Overall Notes - General observations, summary

Detailed Findings:

  • Findings - Array of specific issues discovered
  • Photos - Documentation of quality issues
  • Recommended Actions - Follow-up tasks, corrections needed
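Put together, the fields above suggest a shape like the following. The field names follow the docs, but the exact property spellings and types are assumptions.

```typescript
type ValidationStatus = "pass" | "fail" | "needs-revision" | "approved-with-notes";
type Severity = "critical" | "warning" | "info";

interface Finding {
  workItemIndex: number;        // position in the report's Work Completed list
  issueType: string;            // e.g. "volume_mismatch", "quality_issue"
  severity: Severity;
  description: string;
}

interface Validation {
  report: string;               // reference to the validated report
  validator: string;            // user who performed the QC check
  status: ValidationStatus;
  validationDate: string;       // when the QC check was performed
  overallNotes?: string;        // general observations, summary
  findings: Finding[];          // specific issues discovered
  photos?: string[];            // documentation of quality issues
  recommendedActions?: string[];
}

const example: Validation = {
  report: "rep_123",
  validator: "user_qc_1",
  status: "needs-revision",
  validationDate: "2024-05-01",
  findings: [],
};
```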

Findings Structure

Each finding in a validation targets a specific line item from the report's Work Completed list. Findings include:

  • Which work item has the issue (by position in the work completed list)
  • Issue type — the category of problem (e.g., volume mismatch, quality issue)
  • Severity — critical, warning, or info
  • Description — a plain-language explanation of the issue

Example: A report records Cat6 Cable (45 m), Junction Boxes (8 units), and Electrician Labour (3.5 hours). The validation finds a volume mismatch on the cable (42 m physically measured vs. 45 m reported, severity: warning) and a critical quality issue on two improperly mounted junction boxes.
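The worked example above, expressed as findings data (property names are illustrative, not the actual schema):

```typescript
const findings = [
  {
    workItemIndex: 0,               // Cat6 Cable — first work item
    issueType: "volume_mismatch",
    severity: "warning",
    description: "42 m measured on site vs. 45 m reported",
  },
  {
    workItemIndex: 1,               // Junction Boxes — second work item
    issueType: "quality_issue",
    severity: "critical",
    description: "2 of 8 junction boxes improperly mounted",
  },
];
```

Note that the labour line item gets no finding at all: items without issues are simply not listed.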

Key Points:

  • Each finding references a specific line item, so validators can pass some work and flag others in the same report
  • Multiple findings can be added per report
  • Detailed notes preserve context for the worker's review

Validation Status

pass - Work meets quality standards

  • All measurements accurate
  • Installation per specifications
  • Materials properly documented

fail - Work does not meet standards

  • Critical quality issues
  • Significant measurement discrepancies
  • Requires rework

needs-revision - Minor issues, corrections needed

  • Small measurement differences (within tolerance)
  • Documentation incomplete
  • Photos needed

approved-with-notes - Acceptable with caveats

  • Work meets minimum standards
  • Issues noted for tracking
  • Patterns to address in training

Issue Types

volume_mismatch - Reported volume doesn't match physical measurement

  • Example: Report claims 50m cable, measurement shows 45m
  • Severity: warning (small difference) or critical (large discrepancy)

quality_issue - Installation doesn't meet standards

  • Example: Improper mounting, missing weatherproofing, damage
  • Severity: critical (safety hazard) or warning (cosmetic)

documentation_incomplete - Missing required information

  • Example: No photos, vague description, missing certifications
  • Severity: info (minor) or warning (regulatory requirement)

location_discrepancy - Work performed at wrong location

  • Example: Report geometry doesn't match task geometry
  • Severity: critical (completely wrong site) or warning (slightly off)

consumption_mismatch - Consumed materials don't align with work completed

  • Example: Reported 50m cable work, consumed 80m from stock
  • Severity: warning (investigate waste/theft)

safety_violation - Unsafe practices observed

  • Example: Missing safety equipment, improper procedures
  • Severity: critical (always)
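The issue types and their typical severities listed above can be summarized in one table; the map itself is illustrative, not part of the product's API.

```typescript
type Severity = "critical" | "warning" | "info";

// Typical severities per issue type, most common first.
const typicalSeverities: Record<string, Severity[]> = {
  volume_mismatch: ["warning", "critical"],
  quality_issue: ["critical", "warning"],
  documentation_incomplete: ["info", "warning"],
  location_discrepancy: ["critical", "warning"],
  consumption_mismatch: ["warning"],
  safety_violation: ["critical"],   // always critical
};
```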

Severity Levels

critical - Requires immediate correction, work not acceptable

  • Safety violations
  • Major quality defects
  • Significant measurement errors
  • Financial impact > threshold

warning - Needs attention, but work marginally acceptable

  • Minor quality issues
  • Small measurement discrepancies
  • Documentation gaps
  • Training opportunities

info - Noted for tracking, no immediate action required

  • Best practices suggestions
  • Efficiency improvements
  • Informational observations

Creating Validations

Access Required: validationsCreate admin right

Workflow:

  1. Navigate to Fulfillment → Reports
  2. Open (or expand) the report you want to validate
  3. Click a validation badge to open the ValidationEditModal, or click Create validation if none exists
  4. Review work completed against physical site
  5. For each resource in work completed:
    • Verify measurements
    • Check quality of installation
    • Document any issues in findings
  6. Upload photos of quality issues
  7. Set overall status (pass/fail/needs-revision/approved-with-notes)
  8. Add overall notes (summary)
  9. Click Submit to save report + validation edits together

Field Validation (mobile workflow at /m/validations):

  • GPS verification (are you at the site?)
  • Camera integration for quality issue photos
  • Voice-to-text for descriptions
  • Offline mode (submit when back online)

Validation Workflow

Typical Process:

  1. Report Submitted - Worker completes report
  2. Badge Visible - Reports list shows a grey "none" badge for reports without validations
  3. Validator Opens Report - Clicks the badge, opens the modal
  4. Site Visit - Validator visits location (or reviews based on photos + data)
  5. Measurements - Physical verification of work
  6. Photos - Document quality (good or bad)
  7. Findings - Record any discrepancies inside the modal
  8. Status - Set pass/fail/needs-revision on the validation
  9. Submit - Saves report + validation atomically
  10. Follow-Up - If needs-revision, worker addresses issues
  11. Re-Validation - Click the badge again to reopen the modal, then add findings to the existing validation or create a new one
  12. Final Approval - Report approved, payment released

Filtering Reports by Validation Status

Because validations live on the reports page now, the reports list accepts a validation status filter:

  • all — every report (default)
  • none — reports without any validation (needs QC attention)
  • any — reports with at least one validation
  • failed — reports with at least one failed validation
  • revision — reports with at least one needs-revision validation
  • passed — reports where all validations pass

Under the hood, the /api/reports/get endpoint accepts populateValidations=true and validationStatusFilter=<status> query parameters. The reports list uses these to show the badges and to filter.
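Building the request URL is straightforward. Only the endpoint path and the two query parameters are documented; the helper name below is illustrative.

```typescript
// Builds the reports URL with validation badges populated and a status filter applied.
function reportsUrl(statusFilter: string): string {
  const params = new URLSearchParams({
    populateValidations: "true",
    validationStatusFilter: statusFilter,   // "none", "failed", "passed", ...
  });
  return `/api/reports/get?${params}`;
}

// In the browser: const res = await fetch(reportsUrl("failed"));
```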

Custom filters still apply:

  • Validator (who performed QC)
  • Date range
  • Severity (critical findings only)
  • Issue type (volume_mismatch, quality_issue, etc.)

Payment Integration

Validations control payment release:

Payment Hold:

  • Reports with fail status → payment held
  • Reports with needs-revision → partial payment (configurable)
  • Reports with approved-with-notes → full payment, notes tracked

Payment Release:

  1. Report submitted
  2. Validation performed via the badge/modal
  3. If pass or approved-with-notes → payment released
  4. If needs-revision → partial payment, re-validation required
  5. If fail → no payment, rework required
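The payment rules above amount to a simple mapping from validation status to a payment decision. This is a sketch of those rules, not the billing code; the names and the notion of a "partial" release fraction being configured elsewhere are assumptions.

```typescript
type ValidationStatus = "pass" | "fail" | "needs-revision" | "approved-with-notes";
type PaymentDecision = { release: "full" | "partial" | "none"; revalidate: boolean };

function paymentFor(status: ValidationStatus): PaymentDecision {
  if (status === "pass" || status === "approved-with-notes") {
    return { release: "full", revalidate: false };
  }
  if (status === "needs-revision") {
    return { release: "partial", revalidate: true };  // partial amount is configurable
  }
  return { release: "none", revalidate: true };       // fail: rework required
}
```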

Financial Protection:

  • Prevents payment for substandard work
  • Incentivizes quality
  • Audit trail for payment decisions

Validation Analytics

Track quality trends over time:

By Worker:

  • Pass rate per worker
  • Common issue types
  • Improvement trends
  • Training needs identification

By Task Type:

  • Which tasks have highest fail rate
  • Resource-specific quality issues (cable vs. labor vs. equipment)
  • Estimation accuracy (volume mismatches indicate poor estimates)

By Validator:

  • Consistency checks (is one validator too strict/lenient?)
  • Validation turnaround time
  • Finding severity distribution
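As an illustration of the per-worker metric, pass rate is just the fraction of a worker's validations with status pass. The row shape and function name below are assumptions for the sketch.

```typescript
interface ValidationRow { worker: string; status: string }

// Pass rate per worker: passed validations / total validations.
function passRateByWorker(rows: ValidationRow[]): Record<string, number> {
  const totals: Record<string, { pass: number; all: number }> = {};
  for (const { worker, status } of rows) {
    if (!totals[worker]) totals[worker] = { pass: 0, all: 0 };
    totals[worker].all += 1;
    if (status === "pass") totals[worker].pass += 1;
  }
  const rates: Record<string, number> = {};
  for (const worker in totals) {
    rates[worker] = totals[worker].pass / totals[worker].all;
  }
  return rates;
}
```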

Best Practices

Timely Validation:

  • Use the reports list's none filter to find reports awaiting QC
  • Validate within 24-48 hours of report submission
  • Fresh evidence (materials still visible, worker still remembers)
  • Faster payment to workers

Specific Findings:

  • Reference the specific work completed entry by position (the first item, second item, etc.) to make findings actionable
  • Detailed descriptions help worker understand issue
  • Photos provide indisputable evidence

Constructive Feedback:

  • Frame as training opportunity
  • Explain why issue matters (safety, standards, cost)
  • Offer suggestions for improvement

Consistent Standards:

  • Apply same criteria to all workers
  • Document quality standards clearly
  • Regular validator calibration (ensure consistency)

Use the Print View for Hard Copies:

  • Open the ValidationEditModal on an existing validation
  • Click the Print button in the modal header
  • The print page opens in a new tab, ready for paper or PDF export

Validation Immutability

Soft Deletes: Validations can be marked as deleted but remain in the database:

  • Preserves QC history
  • Maintains payment audit trail
  • Configurable retention period
  • viewDeleted admin right to see deleted validations

Edit Restrictions: After payment is released:

  • Validations become read-only (modal is view-only)
  • Corrections require a new validation with notes
  • Preserves financial integrity
  • Version history tracks all changes before finalization