GROOT FORCE - Test Cases: Variant-Specific Features
Document Version: 1.0
Date: November 2025
Status: Production Ready
Classification: Internal - QA & Product Engineering
Document Control
| Version | Date | Author | Changes |
|---|---|---|---|
| 1.0 | Nov 2025 | Product Team | Initial variant-specific test cases |
Approval:
- Product Manager: _________________ Date: _______
- QA Lead: _________________ Date: _______
- Variant Product Owners (9): _________________ Date: _______
- Technical Lead: _________________ Date: _______
Table of Contents
- GF-BE: Ben's Assistive Edition
- GF-CL: Care & Joy Edition (NDIS)
- GF-NF: Nick's Fitness Edition
- GF-TX: TradeForce Pro Edition
- GF-DI: SilentLink Edition (Deaf/HoH)
- GF-VI: VisionAssist Edition
- GF-TR: Traveller Edition
- GF-LX: Lifestyle Edition
- GF-EN: Enterprise Vision Edition
Test Overview
Total Test Cases: 45 variant-specific validation procedures
Priority Distribution:
- P0 (Critical): 27 test cases - Core variant functionality
- P1 (High): 15 test cases - Enhanced features
- P2 (Medium): 3 test cases - Optional features
Testing Philosophy:
Each GROOT FORCE variant builds on the common platform but adds specialized features for specific user needs. These tests validate that unique variant features work correctly and integrate seamlessly with the core platform.
Test Coverage:
- 9 product variants
- 5 test cases per variant (focused on unique features)
- All tests validate features not covered in core platform tests
1. GF-BE: Ben's Assistive Edition
Target User: Individuals with limited mobility, wheelchair users, those requiring assistive technology
Key Features: Wheelchair mounting, voice/gaze control, advanced hazard alerts, SOS integration
Hardware Differences: Reinforced mounting bracket, enhanced IMU for wheelchair motion
TC-VAR-BE-001: Wheelchair Mounting System
Priority: P0
Category: Mechanical Integration
Requirement Trace: REQ-BE-100
Automation: Manual
Objective:
Verify wheelchair mounting system is secure, adjustable, and safe.
Test Equipment:
- Standard wheelchair (24" wheels)
- Manual wheelchair
- Power wheelchair
- Mounting bracket
- Vibration test platform
- Force gauge
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Install mounting bracket on wheelchair | Bracket installs without tools | ☐ |
| 2 | Check installation time | Installation ≤3 minutes | ☐ |
| 3 | Attach GF-BE device to bracket | Secure magnetic attachment | ☐ |
| 4 | Test attachment force | Requires ≥5N to detach (secure) | ☐ |
| 5 | Adjust viewing angle (±45°) | Smooth adjustment, stays in position | ☐ |
| 6 | Test on manual wheelchair (rough terrain) | No detachment over 30 min | ☐ |
| 7 | Test on power wheelchair (vibration) | No detachment over 30 min | ☐ |
| 8 | Simulate bump (10cm drop) | Device remains attached | ☐ |
| 9 | Check quick-release mechanism | Detaches in < 2 seconds when needed | ☐ |
| 10 | Test bracket on 5 different wheelchairs | Compatible with all models | ☐ |
| 11 | Verify field of view from wheelchair | FOV matches standing user (adjusted) | ☐ |
| 12 | Check cable management | Charging cable secure, no snag hazard | ☐ |
Pass Criteria:
- ✅ Secure attachment (≥5N force)
- ✅ Quick installation (≤3 min)
- ✅ Survives vibration and bumps
- ✅ Compatible with standard wheelchairs
TC-VAR-BE-002: Voice & Gaze Control System
Priority: P0
Category: Accessibility Interface
Requirement Trace: REQ-BE-101
Automation: Semi-automated
Objective:
Validate voice and gaze control provide reliable hands-free operation.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Calibrate gaze tracking | Calibration completes in ≤1 minute | ☐ |
| 2 | Test gaze selection (10 targets) | Accuracy ≥90% (9/10 correct) | ☐ |
| 3 | Measure gaze selection latency | Latency ≤300ms (dwell time) | ☐ |
| 4 | Test voice command recognition | Accuracy ≥95% (20 commands) | ☐ |
| 5 | Test "hands-free mode" activation | "Enable hands-free" command works | ☐ |
| 6 | Test menu navigation via gaze | Can navigate full menu hierarchy | ☐ |
| 7 | Test voice + gaze hybrid control | Both modes work seamlessly together | ☐ |
| 8 | Simulate wheelchair movement (bumps) | Gaze tracking maintains accuracy | ☐ |
| 9 | Test SOS activation (voice only) | "Emergency" command triggers SOS | ☐ |
| 10 | Test in noisy environment (60 dB) | Voice recognition still ≥85% | ☐ |
| 11 | Check customizable command phrases | User can set custom commands | ☐ |
| 12 | Test fatigue resistance (30 min use) | Accuracy maintained over time | ☐ |
Pass Criteria:
- ✅ Gaze accuracy ≥90%
- ✅ Voice recognition ≥95%
- ✅ Latency ≤300ms
- ✅ Works during wheelchair motion
Voice Commands Tested:
- "Open menu"
- "Emergency"
- "Call [contact]"
- "Navigate home"
- "Read message"
- "Take photo"
TC-VAR-BE-003: Enhanced Hazard Detection (Wheelchair-Specific)
Priority: P0
Category: Safety - Mobility
Requirement Trace: REQ-BE-102
Automation: Semi-automated
Objective:
Verify enhanced hazard detection for wheelchair users (ramps, doorways, obstacles).
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Approach doorway (80cm width) | Width detected, clearance indicated | ☐ |
| 2 | Test narrow doorway (70cm) | Warning: "Doorway too narrow" | ☐ |
| 3 | Approach ramp (8° slope) | Ramp detected, slope indicated | ☐ |
| 4 | Test steep ramp (15° slope) | Warning: "Steep ramp ahead" | ☐ |
| 5 | Detect low-hanging obstacle (1.2m height) | Warning given at safe distance | ☐ |
| 6 | Test curb detection (up and down) | Both directions detected | ☐ |
| 7 | Detect uneven surface (threshold > 2cm) | Warning given before reaching | ☐ |
| 8 | Test elevator door detection | Detects opening/closing doors | ☐ |
| 9 | Measure warning distance | Warnings ≥3m ahead (reaction time) | ☐ |
| 10 | Test 50 varied environments | Hazard detection rate ≥95% | ☐ |
| 11 | Verify haptic directional alerts | Left/right vibration guides correctly | ☐ |
| 12 | Check audio descriptiveness | Alerts clear: "Narrow doorway, 65cm wide, ahead 2 meters" | ☐ |
Pass Criteria:
- ✅ Doorway width detection ±5cm
- ✅ Ramp angle detection ±2°
- ✅ Hazard warnings ≥3m ahead
- ✅ Detection rate ≥95%
Wheelchair-Specific Hazards:
- Narrow doorways
- Ramps (up/down)
- Low-hanging obstacles
- Curbs
- Thresholds
- Elevator doors
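The doorway checks in steps 1–2 can be sketched as a simple clearance decision. This is an illustrative harness helper, not device firmware: the 75 cm minimum passable width is an assumed constant, and `doorway_alert` / `width_within_tolerance` are hypothetical names.

```python
# Sketch of the doorway-clearance logic exercised in TC-VAR-BE-003 steps 1-2,
# plus a helper for the ±5 cm width pass criterion. The 75 cm clearance
# threshold is an illustrative assumption, not a product constant.

WHEELCHAIR_CLEARANCE_CM = 75   # assumed minimum passable doorway width
TOLERANCE_CM = 5               # pass criterion: width detection ±5 cm

def doorway_alert(measured_width_cm: float) -> str:
    """Map a measured doorway width to the alert text shown on the HUD."""
    if measured_width_cm < WHEELCHAIR_CLEARANCE_CM:
        return f"Doorway too narrow, {measured_width_cm:.0f}cm wide"
    return f"Doorway clear, {measured_width_cm:.0f}cm wide"

def width_within_tolerance(measured_cm: float, actual_cm: float) -> bool:
    """Check a measurement against the ±5 cm pass criterion."""
    return abs(measured_cm - actual_cm) <= TOLERANCE_CM
```

With these thresholds, the 80 cm doorway in step 1 reports clear and the 70 cm doorway in step 2 triggers the "too narrow" warning.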
TC-VAR-BE-004: Remote Assistance Integration
Priority: P1
Category: Support Features
Requirement Trace: REQ-BE-103
Automation: Manual
Objective:
Validate remote assistance allows caregiver to provide support.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | User initiates remote assistance request | Request sent to caregiver | ☐ |
| 2 | Caregiver receives notification | Notification received ≤5 seconds | ☐ |
| 3 | Establish video/audio connection | Connection established ≤10 seconds | ☐ |
| 4 | Test two-way audio quality | Clear audio both directions | ☐ |
| 5 | Test live camera feed sharing | Caregiver sees user's view in real-time | ☐ |
| 6 | Test AR annotation (caregiver draws) | Annotations appear in user's HUD | ☐ |
| 7 | Caregiver sends navigation waypoint | User receives visual path guidance | ☐ |
| 8 | Test screen sharing (user's HUD) | Caregiver sees what user sees | ☐ |
| 9 | Check session recording (with consent) | Session recorded for review | ☐ |
| 10 | Test emergency override | Caregiver can trigger SOS remotely | ☐ |
| 11 | Verify privacy controls | User can end session anytime | ☐ |
| 12 | Test connection over cellular (LTE) | Works on mobile network | ☐ |
Pass Criteria:
- ✅ Connection established ≤10 seconds
- ✅ Clear audio/video quality
- ✅ AR annotations visible to user
- ✅ User maintains privacy control
TC-VAR-BE-005: Wheelchair Motion Detection & Adaptation
Priority: P1
Category: Adaptive Features
Requirement Trace: REQ-BE-104
Automation: Semi-automated
Objective:
Verify system adapts to wheelchair motion patterns.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Detect wheelchair (vs walking) mode | Auto-detects wheelchair movement | ☐ |
| 2 | Check motion calibration | System learns wheelchair-specific motion | ☐ |
| 3 | Test stationary stability | No false motion alerts when parked | ☐ |
| 4 | Test slow rolling (0.5 m/s) | System tracks position accurately | ☐ |
| 5 | Test fast rolling (2 m/s) | System maintains tracking | ☐ |
| 6 | Simulate turning (90° and 180°) | Orientation updated correctly | ☐ |
| 7 | Test navigation guidance (wheelchair paths) | Routes prefer ramps over stairs | ☐ |
| 8 | Check fall detection adaptation | No false falls during transfers | ☐ |
| 9 | Test over rough terrain (grass, gravel) | Motion tracking maintained | ☐ |
| 10 | Verify battery optimization | Lower power mode during stationary | ☐ |
Pass Criteria:
- ✅ Auto-detects wheelchair mode
- ✅ Accurate motion tracking
- ✅ No false fall detections
- ✅ Navigation prefers accessible routes
2. GF-CL: Care & Joy Edition (NDIS)
Target User: NDIS support workers, disability care providers
Key Features: NDIS-compliant note-taking, consent management, incident reporting, care documentation
Hardware Differences: None (software-focused variant)
TC-VAR-CL-001: NDIS-Compliant Note-Taking
Priority: P0
Category: Documentation
Requirement Trace: REQ-CL-100
Automation: Semi-automated
Objective:
Verify automatic note-taking generates NDIS-compliant progress notes.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Start support session (voice: "Start session") | Recording begins, consent checked | ☐ |
| 2 | Conduct 30-minute support activity | System transcribes continuously | ☐ |
| 3 | End session (voice: "End session") | Transcription stops, note generated | ☐ |
| 4 | Review generated note structure | Includes all NDIS required fields | ☐ |
| 5 | Check "Participant Details" section | Name, NDIS number, date/time present | ☐ |
| 6 | Check "Activity Delivered" section | Activities listed and categorized | ☐ |
| 7 | Check "Support Provided" section | Assistance level documented | ☐ |
| 8 | Check "Observations" section | Participant behavior/mood noted | ☐ |
| 9 | Check "Outcomes Achieved" section | Goals and progress documented | ☐ |
| 10 | Check "Next Steps" section | Follow-up actions listed | ☐ |
| 11 | Verify note editability | Support worker can review/edit | ☐ |
| 12 | Test export to NDIS portal format | Exports as compliant CSV/XML | ☐ |
| 13 | Check timestamp accuracy | All times accurate to minute | ☐ |
| 14 | Verify privacy (participant name) | Can be anonymized if needed | ☐ |
Pass Criteria:
- ✅ All NDIS required fields present
- ✅ Accurate transcription (≥90%)
- ✅ Editable by support worker
- ✅ Exports to NDIS format
NDIS Progress Note Fields:
- Participant Details
- Date & Time
- Activity Delivered
- Support Provided
- Observations
- Outcomes Achieved
- Next Steps
- Support Worker Signature
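The field check in step 4 amounts to validating a generated note against the list above. A minimal sketch, assuming notes arrive as a dict keyed by snake_case field names (the key names and `missing_fields` helper are assumptions of this harness, not the product's data format):

```python
# Sketch of the note-structure validation in TC-VAR-CL-001 step 4: report
# any required NDIS progress-note field that is absent or empty.

REQUIRED_FIELDS = [
    "participant_details", "date_time", "activity_delivered",
    "support_provided", "observations", "outcomes_achieved",
    "next_steps", "support_worker_signature",
]

def missing_fields(note: dict) -> list[str]:
    """Return the required fields that are missing or empty in a note."""
    return [f for f in REQUIRED_FIELDS if not note.get(f)]
```

A note passes the structural check only when `missing_fields` returns an empty list.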
TC-VAR-CL-002: Consent Management System
Priority: P0
Category: Privacy & Compliance
Requirement Trace: REQ-CL-101
Automation: Manual
Objective:
Validate consent system ensures participant consent before recording.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Attempt to start recording without consent | System blocks, prompts for consent | ☐ |
| 2 | Request consent (voice: "Request consent") | Consent prompt displayed to participant | ☐ |
| 3 | Participant grants consent (voice: "I consent") | Consent recorded with timestamp | ☐ |
| 4 | Check consent log entry | Includes: who, when, what activity | ☐ |
| 5 | Start recording after consent | Recording begins normally | ☐ |
| 6 | Participant revokes consent mid-session | Recording stops immediately | ☐ |
| 7 | Check partial recording handling | Partial session saved with note | ☐ |
| 8 | Test written consent (participant signs device) | Signature captured and stored | ☐ |
| 9 | Test guardian/carer consent (proxy) | Proxy consent accepted and logged | ☐ |
| 10 | Check consent expiry | Consent valid for session only | ☐ |
| 11 | Verify consent audit trail | All consent events logged immutably | ☐ |
| 12 | Test consent withdrawal rights | Participant can request deletion | ☐ |
Pass Criteria:
- ✅ No recording without consent
- ✅ Consent logged with timestamp
- ✅ Participant can revoke anytime
- ✅ Full audit trail maintained
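The consent behaviour tested above (block without consent, stop immediately on revocation, session-scoped expiry, append-only audit trail) can be sketched as a small state machine. Class and method names are hypothetical; the real implementation and its audit-log storage are not specified here.

```python
# Illustrative consent gate matching TC-VAR-CL-002: recording is blocked
# until consent is granted (step 1), stops immediately on revocation
# (step 6), and consent expires with the session (step 10).

from datetime import datetime, timezone

class ConsentGate:
    def __init__(self):
        self.consented = False
        self.recording = False
        self.audit_log = []                      # append-only event trail

    def _log(self, event: str):
        stamp = datetime.now(timezone.utc).isoformat()
        self.audit_log.append((stamp, event))    # who/when for every event

    def grant(self, who: str):
        self.consented = True
        self._log(f"consent granted by {who}")

    def revoke(self, who: str):
        self.consented = False
        self.recording = False                   # stop immediately
        self._log(f"consent revoked by {who}")

    def start_recording(self) -> bool:
        if not self.consented:                   # block without consent
            self._log("recording blocked: no consent")
            return False
        self.recording = True
        self._log("recording started")
        return True

    def end_session(self):
        self.consented = False                   # consent valid per session only
        self._log("session ended, consent expired")
```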
TC-VAR-CL-003: Incident Detection & Reporting
Priority: P0
Category: Safety & Documentation
Requirement Trace: REQ-CL-102
Automation: Semi-automated
Objective:
Verify system detects and assists with incident reporting.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Simulate fall incident | Fall detected automatically | ☐ |
| 2 | Check incident alert | "Incident detected - Start report?" | ☐ |
| 3 | Start incident report (voice: "Yes") | Incident report template loaded | ☐ |
| 4 | System pre-fills known data | Time, location, people present | ☐ |
| 5 | Support worker adds details (voice) | Transcribed into report | ☐ |
| 6 | Check incident report structure | Includes: what, when, where, who, how | ☐ |
| 7 | Verify injury assessment prompts | System asks about injuries | ☐ |
| 8 | Check immediate actions section | First aid, emergency services logged | ☐ |
| 9 | Test photo attachment | Can attach incident photos | ☐ |
| 10 | Verify notification to supervisor | Supervisor notified immediately | ☐ |
| 11 | Check report submission | Submitted to NDIS portal | ☐ |
| 12 | Test manual incident logging | Can log incident without auto-detect | ☐ |
Pass Criteria:
- ✅ Auto-detects major incidents
- ✅ Report includes all required fields
- ✅ Supervisor notified immediately
- ✅ Submits to NDIS portal
Incident Types Detected:
- Falls
- Medical emergencies
- Behavioral incidents
- Medication errors
- Equipment malfunctions
TC-VAR-CL-004: Multi-Participant Session Management
Priority: P1
Category: Workflow
Requirement Trace: REQ-CL-103
Automation: Manual
Objective:
Validate system handles multiple participants in group sessions.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Start group session (voice: "Start group session") | Group session created | ☐ |
| 2 | Add participant 1 (voice: "Add [name]") | Participant 1 added to session | ☐ |
| 3 | Add participant 2 | Participant 2 added | ☐ |
| 4 | Add participant 3 | Participant 3 added | ☐ |
| 5 | Check consent for all participants | Consent requested from each | ☐ |
| 6 | Conduct group activity (30 min) | Session transcribed | ☐ |
| 7 | System identifies speakers | Speaker diarization works | ☐ |
| 8 | End group session | Session ends for all participants | ☐ |
| 9 | Review generated notes | Separate note per participant | ☐ |
| 10 | Check individual participant notes | Each note relevant to that person | ☐ |
| 11 | Verify group activity logged | Shared activity documented | ☐ |
| 12 | Test individual note customization | Worker can add participant-specific notes | ☐ |
Pass Criteria:
- ✅ Handles up to 5 participants
- ✅ Separate notes per participant
- ✅ Speaker identification functional
- ✅ Individual consent management
TC-VAR-CL-005: NDIS Compliance Verification
Priority: P0
Category: Compliance
Requirement Trace: REQ-CL-104
Automation: Automated
Objective:
Verify all documentation meets NDIS Practice Standards.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Generate 20 sample progress notes | All notes generated successfully | ☐ |
| 2 | Run compliance checker | All notes pass validation | ☐ |
| 3 | Check "Rights and Dignity" documentation | Participant consent always present | ☐ |
| 4 | Check "Person-Centred" language | Notes use participant-preferred terms | ☐ |
| 5 | Verify goal-oriented documentation | Outcomes linked to NDIS goals | ☐ |
| 6 | Check time recording accuracy | Time blocks sum correctly | ☐ |
| 7 | Verify billing code suggestions | Correct NDIS support item codes | ☐ |
| 8 | Check data retention compliance | Notes stored securely for 7 years | ☐ |
| 9 | Test access controls | Only authorized staff can view notes | ☐ |
| 10 | Verify audit trail | All edits logged with who/when | ☐ |
| 11 | Check data export compliance | Export includes all required fields | ☐ |
| 12 | Test with NDIS auditor checklist | Passes all checklist items | ☐ |
Pass Criteria:
- ✅ 100% notes pass NDIS validation
- ✅ Rights and dignity preserved
- ✅ Goal-oriented documentation
- ✅ 7-year secure retention
NDIS Practice Standards Verified:
- Core Module (all providers)
- Rights and Dignity
- Person-Centred
- Outcome-focused
- Privacy and dignity
3. GF-NF: Nick's Fitness Edition
Target User: Fitness enthusiasts, athletes, active lifestyle users
Key Features: Real-time vitals, fatigue detection, performance tracking, training guidance
Hardware Differences: Enhanced health sensors (HR, SpO₂, temp)
TC-VAR-NF-001: Real-Time Heart Rate Accuracy
Priority: P0
Category: Health Sensors
Requirement Trace: REQ-NF-100
Automation: Semi-automated
Objective:
Validate heart rate monitoring accuracy during exercise.
Test Equipment:
- Medical-grade chest strap HR monitor (reference)
- ECG monitor (validation)
- Exercise bike or treadmill
- Test subjects (5+ people)
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Calibrate GF-NF HR sensor | Sensor ready | ☐ |
| 2 | Attach reference chest strap | Reference device ready | ☐ |
| 3 | Measure resting HR (5 minutes) | GF-NF HR within ±5 BPM of reference | ☐ |
| 4 | Light exercise (50-60% max HR) | Accuracy within ±5 BPM | ☐ |
| 5 | Moderate exercise (60-70% max HR) | Accuracy within ±7 BPM | ☐ |
| 6 | Vigorous exercise (70-85% max HR) | Accuracy within ±10 BPM | ☐ |
| 7 | Test during movement artifacts | System rejects bad readings | ☐ |
| 8 | Check HR display latency | Updates every 1-2 seconds | ☐ |
| 9 | Test 20-minute continuous workout | Continuous accurate tracking | ☐ |
| 10 | Calculate correlation with reference | Correlation coefficient r ≥ 0.95 | ☐ |
| 11 | Test on 5 different subjects | Accuracy consistent across users | ☐ |
| 12 | Verify HR zone detection | Correctly identifies zones (rest/fat burn/cardio/peak) | ☐ |
Pass Criteria:
- ✅ Resting HR: ±5 BPM
- ✅ Exercise HR: ±10 BPM
- ✅ Correlation ≥0.95 vs reference
- ✅ Consistent across users
Heart Rate Zones:
- Resting: < 60% max HR
- Fat burn: 60-70% max HR
- Cardio: 70-85% max HR
- Peak: > 85% max HR
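The zone detection in step 12 follows directly from the boundaries above. This sketch uses the common 220-minus-age estimate for max HR; whether GF-NF uses that formula or a measured max is an assumption here.

```python
# HR-zone classification as verified in TC-VAR-NF-001 step 12, using the
# zone boundaries listed above. Max HR via 220 - age is an assumed model.

def max_hr(age: int) -> int:
    """Estimated maximum heart rate (220 - age convention)."""
    return 220 - age

def hr_zone(hr_bpm: float, age: int) -> str:
    """Classify a heart rate into rest / fat burn / cardio / peak."""
    pct = hr_bpm / max_hr(age) * 100
    if pct < 60:
        return "rest"
    if pct < 70:
        return "fat burn"
    if pct < 85:
        return "cardio"
    return "peak"
```

For a 30-year-old (estimated max 190 BPM), 125 BPM lands in fat burn and 150 BPM in cardio, matching the boundaries above.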
TC-VAR-NF-002: Fatigue Detection System
Priority: P0
Category: Performance Monitoring
Requirement Trace: REQ-NF-101
Automation: Semi-automated
Objective:
Verify fatigue detection using HRV and performance metrics.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Measure baseline HRV (rested state) | Baseline HRV recorded | ☐ |
| 2 | Perform intense workout (30 min) | HRV decreases | ☐ |
| 3 | Check fatigue score during workout | Score increases appropriately | ☐ |
| 4 | Test fatigue alert threshold | Alert triggers at predetermined level | ☐ |
| 5 | Verify alert content | "Fatigue detected - Consider rest" | ☐ |
| 6 | Continue past fatigue threshold | System tracks declining performance | ☐ |
| 7 | Check recovery monitoring (post-workout) | HRV recovers over time | ☐ |
| 8 | Test next-day readiness score | Score reflects recovery status | ☐ |
| 9 | Simulate overtraining (multiple days) | System warns of overtraining risk | ☐ |
| 10 | Verify training recommendations | Suggests lighter workout when fatigued | ☐ |
| 11 | Test 7-day fatigue trends | Trends visualized clearly | ☐ |
| 12 | Check integration with training plan | Adapts plan based on fatigue | ☐ |
Pass Criteria:
- ✅ Fatigue score correlates with HRV
- ✅ Alerts at appropriate thresholds
- ✅ Recovery tracking functional
- ✅ Training recommendations adaptive
Fatigue Indicators:
- HRV decrease > 20% from baseline
- Resting HR increase > 10% from baseline
- Performance metrics declining
- Subjective fatigue score (user input)
TC-VAR-NF-003: Performance Tracking & Analytics
Priority: P1
Category: Data Analysis
Requirement Trace: REQ-NF-102
Automation: Automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Complete workout with GF-NF | Workout data recorded | ☐ |
| 2 | Check automatic workout detection | Type detected (run/bike/gym/etc.) | ☐ |
| 3 | Verify workout summary | Duration, HR avg/max, calories displayed | ☐ |
| 4 | Check HR zone breakdown | Time in each zone calculated | ☐ |
| 5 | Test calorie calculation | Calories = f(HR, duration, user profile) | ☐ |
| 6 | Verify VO₂ max estimation | Estimate updated after cardio workout | ☐ |
| 7 | Check training load calculation | Acute and chronic load tracked | ☐ |
| 8 | Test performance trends (7/30/90 days) | Trends visualized on HUD | ☐ |
| 9 | Verify personal records tracking | PRs automatically detected | ☐ |
| 10 | Check data export | Exports to CSV/GPX/TCX formats | ☐ |
| 11 | Test sync with fitness apps | Syncs to Strava, Apple Health, etc. | ☐ |
| 12 | Verify historical data access | Can view past 12 months | ☐ |
Pass Criteria:
- ✅ Workout auto-detection ≥90%
- ✅ Calorie calculation within ±15%
- ✅ Data exports correctly
- ✅ Syncs with major fitness platforms
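Step 5 states calories are a function of HR, duration, and user profile. As one concrete instance of such a function, this sketch uses the widely cited Keytel et al. HR-based regression; whether GF-NF uses this exact model is an assumption.

```python
# Illustrative HR-based calorie estimate for TC-VAR-NF-003 step 5, using
# the Keytel et al. (2005) regression. Using this particular model is an
# assumption; the ±15% pass tolerance accommodates model differences.

def kcal_per_min(hr_bpm: float, weight_kg: float, age: int, male: bool) -> float:
    """Energy expenditure (kcal/min) from heart rate and user profile."""
    if male:
        return (-55.0969 + 0.6309 * hr_bpm + 0.1988 * weight_kg
                + 0.2017 * age) / 4.184
    return (-20.4022 + 0.4472 * hr_bpm - 0.1263 * weight_kg
            + 0.0740 * age) / 4.184

def workout_kcal(avg_hr: float, minutes: float, weight_kg: float,
                 age: int, male: bool) -> float:
    """Total workout calories from average HR and duration."""
    return kcal_per_min(avg_hr, weight_kg, age, male) * minutes
```

A 30-minute workout at an average 140 BPM for a 75 kg, 30-year-old male comes out near 390 kcal under this model.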
TC-VAR-NF-004: Real-Time Form Guidance (Running)
Priority: P1
Category: Performance Enhancement
Requirement Trace: REQ-NF-103
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Start running workout | Form monitoring activated | ☐ |
| 2 | Check cadence detection | Cadence (steps/min) displayed | ☐ |
| 3 | Test optimal cadence guidance | Recommends 170-180 SPM | ☐ |
| 4 | Simulate overstriding | Alert: "Reduce stride length" | ☐ |
| 5 | Check ground contact time | GCT measured via IMU | ☐ |
| 6 | Test vertical oscillation detection | Excessive bounce flagged | ☐ |
| 7 | Verify real-time audio cues | Voice: "Increase cadence" | ☐ |
| 8 | Check asymmetry detection | Left/right imbalance detected | ☐ |
| 9 | Test fatigue impact on form | Form degradation tracked | ☐ |
| 10 | Verify post-run form analysis | Summary with improvement tips | ☐ |
Pass Criteria:
- ✅ Cadence accuracy ±3 SPM
- ✅ Real-time guidance functional
- ✅ Form degradation detected
- ✅ Actionable improvement tips
Form Metrics:
- Cadence (steps/min)
- Ground contact time
- Vertical oscillation
- Stride length
- Left/right asymmetry
TC-VAR-NF-005: SpO₂ Monitoring (Altitude/Hypoxia)
Priority: P1
Category: Health Monitoring
Requirement Trace: REQ-NF-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Measure baseline SpO₂ (sea level) | SpO₂ ≥95% | ☐ |
| 2 | Compare to medical pulse oximeter | Within ±2% of reference | ☐ |
| 3 | Simulate altitude (hypoxic chamber 2500m) | SpO₂ decreases appropriately | ☐ |
| 4 | Check low SpO₂ alert (threshold 88%) | Alert triggers correctly | ☐ |
| 5 | Verify alert content | "Low blood oxygen - Slow down" | ☐ |
| 6 | Test during exercise at altitude | Continuous monitoring | ☐ |
| 7 | Check SpO₂ trend visualization | Trend graph displayed | ☐ |
| 8 | Test in various lighting conditions | Works in sunlight and darkness | ☐ |
| 9 | Verify motion artifact rejection | Bad readings flagged | ☐ |
| 10 | Check battery impact | SpO₂ monitoring adds < 5% drain | ☐ |
Pass Criteria:
- ✅ Accuracy within ±2% of medical device
- ✅ Low SpO₂ alerts functional
- ✅ Works in varied lighting
- ✅ Minimal battery impact
4. GF-TX: TradeForce Pro Edition
Target User: Tradespeople, technicians, industrial workers
Key Features: IP65 protection, OH&S reminders, AR measurement, SOP integration
Hardware Differences: Reinforced frame, IP65 rating, enhanced microphone (noise canceling)
TC-VAR-TX-001: IP65 Environmental Protection
Priority: P0
Category: Durability
Requirement Trace: REQ-TX-100, IEC 60529
Automation: Manual (requires test lab)
Objective:
Verify IP65 rating (dust-tight, water jet protected).
Test Equipment:
- Dust chamber
- Water jet nozzle (6.3mm, 12.5 L/min)
- Talcum powder (test dust)
- Pressure gauge
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Visual inspection of seals | All seals intact, no gaps | ☐ |
| 2 | Dust test: place in dust chamber (8 hours) | Device operates normally | ☐ |
| 3 | Open device after dust test | No dust ingress visible | ☐ |
| 4 | Water jet test: 6.3mm nozzle, 12.5 L/min | Device operates during test | ☐ |
| 5 | Test from all angles (6 directions) | Water doesn't penetrate | ☐ |
| 6 | Check functionality after water test | All functions work | ☐ |
| 7 | Test USB-C port protection | Port cover seals properly | ☐ |
| 8 | Simulate worksite abuse (dust + sweat) | Survives combined exposure | ☐ |
| 9 | Check microphone/speaker seals | Audio quality maintained | ☐ |
| 10 | Verify display visibility when wet | Touchscreen works when wet | ☐ |
| 11 | Test in construction environment (7 days) | No degradation | ☐ |
| 12 | Post-test inspection | No damage, all seals intact | ☐ |
Pass Criteria:
- ✅ No dust ingress (IP6X)
- ✅ No water ingress from jets (IPX5)
- ✅ All functions work after exposure
- ✅ Passes IEC 60529 IP65 test
IP65 Standard:
- 6 (Dust): Dust-tight, no ingress
- 5 (Water): Protected against water jets from any direction
TC-VAR-TX-002: OH&S Safety Reminder System
Priority: P0
Category: Workplace Safety
Requirement Trace: REQ-TX-101
Automation: Semi-automated
Objective:
Validate occupational health & safety reminders function correctly.
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Configure worksite safety rules | Rules loaded (PPE, heights, confined space) | ☐ |
| 2 | Enter work zone (geofence) | System detects entry | ☐ |
| 3 | Check PPE reminder | "Put on safety glasses and hard hat" | ☐ |
| 4 | User confirms PPE donned | Reminder cleared | ☐ |
| 5 | Work at height scenario | "Working at height - Check harness" | ☐ |
| 6 | Test confined space detection | "Confined space - Gas test required" | ☐ |
| 7 | Check periodic break reminders | Reminds every 2 hours for break | ☐ |
| 8 | Test hazardous material alert | Detects chemical storage area | ☐ |
| 9 | Verify noise level warning | Alert when > 85 dB SPL | ☐ |
| 10 | Check heat stress monitoring | Alert when temp > 35°C + exertion | ☐ |
| 11 | Test emergency mustering | Can trigger evacuation alert | ☐ |
| 12 | Verify compliance logging | All safety checks logged with timestamp | ☐ |
Pass Criteria:
- ✅ Geofence detection ≤10 sec
- ✅ Appropriate reminders for hazards
- ✅ Break reminders every 2 hours
- ✅ Compliance audit trail complete
OH&S Reminders:
- PPE requirements
- Height work precautions
- Confined space protocols
- Hazardous materials
- Noise exposure
- Heat stress
- Break reminders
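The geofence entry check in steps 2–3 can be sketched as a haversine distance test against a circular work zone. The zone record layout, radius, and reminder string are illustrative; production worksites may use polygonal zones.

```python
# Sketch of the geofence entry detection behind TC-VAR-TX-002 steps 2-3:
# compare the wearer's position to a circular work zone and surface the
# zone's PPE reminder on entry. Zone shape and fields are assumptions.

import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def inside_zone(lat: float, lon: float, zone: dict) -> bool:
    """True when the position falls within the zone's radius."""
    return haversine_m(lat, lon, zone["lat"], zone["lon"]) <= zone["radius_m"]

SITE_ZONE = {"lat": -33.8688, "lon": 151.2093, "radius_m": 100,
             "reminder": "Put on safety glasses and hard hat"}
```

On entry detection, the system would display `SITE_ZONE["reminder"]` and log the event for the compliance audit trail (step 12).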
TC-VAR-TX-003: AR Measurement Tools
Priority: P1
Category: Productivity
Requirement Trace: REQ-TX-102
Automation: Semi-automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Activate AR measurement mode | Mode active, calibration check | ☐ |
| 2 | Measure known distance (1.0m) | Measurement: 1.0m ±2cm | ☐ |
| 3 | Test distance range (0.5m to 5m) | All distances accurate ±2% | ☐ |
| 4 | Measure vertical height | Height accuracy ±2% | ☐ |
| 5 | Test angle measurement | Angle accuracy ±2° | ☐ |
| 6 | Use virtual level (plumb/horizontal) | Accuracy ±1° | ☐ |
| 7 | Measure area (rectangle 2m × 3m) | Area: 6.0m² ±5% | ☐ |
| 8 | Test volume calculation | Volume calculated correctly | ☐ |
| 9 | Check measurement annotation | Can add notes to measurements | ☐ |
| 10 | Test measurement export | Exports with photos to PDF | ☐ |
| 11 | Verify measurement history | Last 50 measurements saved | ☐ |
| 12 | Test in varied lighting | Works in bright and dim light | ☐ |
Pass Criteria:
- ✅ Distance accuracy ±2%
- ✅ Angle accuracy ±2°
- ✅ Level accuracy ±1°
- ✅ Measurements exportable
AR Tools:
- Distance measurement
- Height measurement
- Angle measurement
- Virtual level
- Area calculation
- Volume estimation
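The pass criteria above (distance ±2%, angle ±2°, level ±1°) reduce to simple tolerance checks the test harness can apply to each reading. The helper names and tolerance table below are a sketch of that harness, not device code.

```python
# Verification helpers for the TC-VAR-TX-003 pass criteria: distance within
# ±2% of actual, angles within ±2 deg, virtual level within ±1 deg.

TOLERANCES = {
    "distance_pct": 2.0,   # ±2% of the true distance
    "angle_deg": 2.0,      # ±2 degrees for general angle measurement
    "level_deg": 1.0,      # ±1 degree for the virtual level
}

def distance_ok(measured_m: float, actual_m: float) -> bool:
    """Pass when the distance error is within ±2% of the true value."""
    return abs(measured_m - actual_m) <= actual_m * TOLERANCES["distance_pct"] / 100

def angle_ok(measured_deg: float, actual_deg: float, level: bool = False) -> bool:
    """Pass within ±2 deg, or ±1 deg when checking the virtual level."""
    tol = TOLERANCES["level_deg"] if level else TOLERANCES["angle_deg"]
    return abs(measured_deg - actual_deg) <= tol
```

Note the same 1.5° error passes as a general angle reading but fails the stricter virtual-level check.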
TC-VAR-TX-004: SOP (Standard Operating Procedure) Viewer
Priority: P1
Category: Workflow
Requirement Trace: REQ-TX-103
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Upload test SOP document (PDF) | SOP loaded successfully | ☐ |
| 2 | Open SOP viewer (voice: "Show SOP") | SOP displayed on HUD | ☐ |
| 3 | Test step-by-step navigation | Can advance through steps | ☐ |
| 4 | Check voice control ("Next step") | Advances to next step | ☐ |
| 5 | Test gaze control (look at next button) | Advances via gaze | ☐ |
| 6 | Verify hands-free operation | Fully operable without hands | ☐ |
| 7 | Test image/diagram display | Diagrams visible and clear | ☐ |
| 8 | Check zoom functionality | Can zoom images with voice | ☐ |
| 9 | Test checklist completion | Can check off completed steps | ☐ |
| 10 | Verify completion logging | SOP completion logged with timestamp | ☐ |
| 11 | Test SOP search | Can search SOPs by keyword | ☐ |
| 12 | Check offline access | SOPs available without network | ☐ |
Pass Criteria:
- ✅ Fully hands-free operation
- ✅ Step-by-step navigation clear
- ✅ Images/diagrams readable
- ✅ Offline access functional
TC-VAR-TX-005: Noise-Canceling Communication (High Noise Environments)
Priority: P0
Category: Communication
Requirement Trace: REQ-TX-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Test in quiet environment (40 dB) | Voice recognition baseline | ☐ |
| 2 | Test at construction site (85 dB) | Voice recognition ≥85% | ☐ |
| 3 | Test near power tools (95 dB) | Voice recognition ≥75% | ☐ |
| 4 | Test during phone call (construction noise) | Call audio clear to recipient | ☐ |
| 5 | Check bone conduction audio clarity | User can hear clearly despite noise | ☐ |
| 6 | Test 4-microphone beamforming | Background noise suppressed | ☐ |
| 7 | Verify ANC (active noise canceling) | Reduces noise by ≥15 dB | ☐ |
| 8 | Test AI voice enhancement | AI improves voice clarity | ☐ |
| 9 | Check communication range | Bluetooth works at 15m despite interference | ☐ |
| 10 | Test emergency override (loud alarm) | Can still trigger SOS via voice | ☐ |
Pass Criteria:
- ✅ Voice recognition ≥85% at 85 dB
- ✅ Call audio clear to recipient
- ✅ ANC reduces noise ≥15 dB
- ✅ SOS reliable in extreme noise
5. GF-DI: SilentLink Edition (Deaf/Hard-of-Hearing)
Target User: Deaf and hard-of-hearing individuals
Key Features: Live captioning, sound-class detection, directional sound indicators, visual alerts
Hardware Differences: None (software + camera focus)
TC-VAR-DI-001: Live Captioning Accuracy
Priority: P0
Category: Accessibility
Requirement Trace: REQ-DI-100
Automation: Semi-automated
Objective:
Validate real-time speech-to-text captioning accuracy.
Test Equipment:
- Calibrated audio playback (known speech samples)
- Multiple speakers (male, female, various accents)
- Noisy environment (cafe, traffic)
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Enable live captioning | Captions appear on HUD | ☐ |
| 2 | Test with clear speech (50 sentences) | WER ≤5% (≥95% accuracy) | ☐ |
| 3 | Test with single speaker | Speaker identified, captions accurate | ☐ |
| 4 | Test with multiple speakers (conversation) | Speakers differentiated | ☐ |
| 5 | Check speaker diarization | "Person 1:", "Person 2:" labels | ☐ |
| 6 | Test in moderate noise (cafe 65 dB) | WER ≤15% | ☐ |
| 7 | Test in high noise (traffic 75 dB) | WER ≤25% (still usable) | ☐ |
| 8 | Check caption latency | Latency ≤1 second | ☐ |
| 9 | Test caption readability | Font size, contrast appropriate | ☐ |
| 10 | Verify punctuation & capitalization | Proper grammar applied | ☐ |
| 11 | Test caption history (scroll back) | Can review last 5 minutes | ☐ |
| 12 | Check caption export | Can save transcript | ☐ |
Pass Criteria:
- ✅ WER ≤5% (clean speech)
- ✅ WER ≤15% (moderate noise)
- ✅ Latency ≤1 second
- ✅ Speaker diarization functional
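The WER thresholds above follow the standard definition: word-level edit distance (substitutions + insertions + deletions) divided by the reference word count. A self-contained scoring sketch for the test harness:

```python
# Word error rate as used for the TC-VAR-DI-001 pass criteria, computed
# via word-level Levenshtein distance. Standard metric definition; not a
# product API.

def wer(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference words."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # DP table of edit distances between word-sequence prefixes
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

A caption run passes the clean-speech criterion when `wer(reference, caption) <= 0.05` across the 50-sentence set.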
TC-VAR-DI-002: Sound-Class Detection & Alerts
Priority: P0
Category: Environmental Awareness
Requirement Trace: REQ-DI-101
Automation: Automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Play fire alarm sound | Detected: "🔥 Fire alarm" | ☐ |
| 2 | Play doorbell sound | Detected: "🔔 Doorbell" | ☐ |
| 3 | Play baby crying sound | Detected: "👶 Baby crying" | ☐ |
| 4 | Play dog barking | Detected: "🐕 Dog barking" | ☐ |
| 5 | Play car horn | Detected: "🚗 Car horn" | ☐ |
| 6 | Play phone ringing | Detected: "📞 Phone ringing" | ☐ |
| 7 | Play siren (ambulance) | Detected: "🚨 Emergency siren" | ☐ |
| 8 | Test 30 sound classes | Detection accuracy ≥90% | ☐ |
| 9 | Check alert visibility | Large icon + text on HUD | ☐ |
| 10 | Verify haptic alert | Vibration accompanies visual | ☐ |
| 11 | Test alert priority | Critical sounds (fire) override others | ☐ |
| 12 | Check false positive rate | FPR ≤5% | ☐ |
Pass Criteria:
- ✅ Sound detection accuracy ≥90%
- ✅ Critical sounds always detected
- ✅ Visual + haptic alerts clear
- ✅ False positive rate ≤5%
Sound Classes:
- Fire alarm
- Doorbell
- Baby crying
- Dog barking
- Car horn
- Phone ringing
- Sirens
- Knocking
- Glass breaking
- Smoke alarm
- Timer/alarm clock
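Step 11 requires critical sounds to override others. One way to sketch that is a priority-tier lookup; the class names and tier assignments below are hypothetical (the authoritative list comes from REQ-DI-101):

```python
# Hypothetical priority tiers: lower number = higher priority.
PRIORITY = {
    "fire_alarm": 0, "smoke_alarm": 0, "siren": 0,
    "glass_breaking": 1, "car_horn": 1,
    "baby_crying": 2,
    "doorbell": 3, "knocking": 3, "dog_barking": 3, "phone_ringing": 3,
    "timer": 4,
}

def select_alert(detections: list) -> str:
    """Return the highest-priority detected class (unknown classes rank last)."""
    return min(detections, key=lambda c: PRIORITY.get(c, 99))

print(select_alert(["doorbell", "fire_alarm", "dog_barking"]))  # fire_alarm
```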
TC-VAR-DI-003: Directional Sound Indicators
Priority: P1
Category: Spatial Awareness
Requirement Trace: REQ-DI-102
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Play sound from left (90°) | Arrow points left on HUD | ☐ |
| 2 | Play sound from right (270°) | Arrow points right | ☐ |
| 3 | Play sound from behind (180°) | Arrow points backward | ☐ |
| 4 | Play sound from front (0°) | Arrow points forward | ☐ |
| 5 | Test angular accuracy | Direction within ±20° | ☐ |
| 6 | Test with 4-mic array beamforming | Accurate directional detection | ☐ |
| 7 | Check indicator visibility | Arrow clearly visible on HUD | ☐ |
| 8 | Test with moving sound source | Arrow updates in real-time | ☐ |
| 9 | Verify distance indication | Near/far indicated by size/opacity | ☐ |
| 10 | Test multiple sounds simultaneously | Shows dominant direction | ☐ |
Pass Criteria:
- ✅ Direction accuracy within ±20°
- ✅ Real-time updates (≤500ms latency)
- ✅ Clear visual indicators
- ✅ Works with moving sources
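The ±20° accuracy target above comes from time-difference-of-arrival (TDOA) estimation. A simplified two-mic far-field sketch of the geometry (the actual device uses a 4-mic beamformer, which also resolves the front/back ambiguity a single mic pair cannot):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at ~20 °C

def doa_from_tdoa(delay_s: float, mic_spacing_m: float) -> float:
    """Angle of arrival (degrees) from the time delay between two mics:
    theta = asin(c * dt / d), valid for a far-field source."""
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Zero delay -> broadside (0 deg); maximum delay -> endfire (90 deg)
print(round(doa_from_tdoa(0.0, 0.1)))          # 0
print(round(doa_from_tdoa(0.1 / 343.0, 0.1)))  # 90
```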
TC-VAR-DI-004: Visual Alert System (Non-Audio)
Priority: P0
Category: Accessibility
Requirement Trace: REQ-DI-103
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Test all alert types (20+ scenarios) | All alerts have visual representation | ☐ |
| 2 | Check notification visibility | Large, high-contrast icons | ☐ |
| 3 | Test color-coding | Emergency=red, Info=blue, etc. | ☐ |
| 4 | Verify haptic patterns | Different vibrations for different alerts | ☐ |
| 5 | Test alert persistence | Stays on screen until acknowledged | ☐ |
| 6 | Check alert stacking | Multiple alerts queued properly | ☐ |
| 7 | Test brightness in sunlight | Visible in bright outdoor light | ☐ |
| 8 | Verify nighttime visibility | Visible but not blinding at night | ☐ |
| 9 | Test alert customization | User can adjust size, position | ☐ |
| 10 | Check alert history | Last 20 alerts accessible | ☐ |
Pass Criteria:
- ✅ 100% alerts have visual representation
- ✅ High contrast, large icons
- ✅ Haptic patterns distinctive
- ✅ Visible in all lighting
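Steps 5 and 6 (persistence until acknowledged, proper stacking) can be modeled as a simple queue. A minimal sketch; the production system would additionally reorder by severity:

```python
from collections import deque

class AlertQueue:
    """Alerts persist until acknowledged and are shown in arrival order."""
    def __init__(self):
        self._queue = deque()

    def push(self, alert: str):
        self._queue.append(alert)

    def current(self):
        """The alert currently displayed, or None if the queue is empty."""
        return self._queue[0] if self._queue else None

    def acknowledge(self):
        """Dismiss the displayed alert; the next queued alert surfaces."""
        if self._queue:
            self._queue.popleft()

q = AlertQueue()
q.push("🔥 Fire alarm")
q.push("🔔 Doorbell")
print(q.current())   # 🔥 Fire alarm
q.acknowledge()
print(q.current())   # 🔔 Doorbell
```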
TC-VAR-DI-005: Sign Language Translation (Beta Feature)
Priority: P2
Category: Experimental
Requirement Trace: REQ-DI-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Activate sign language mode | Camera focuses on hands | ☐ |
| 2 | Sign basic phrase: "Hello" | Recognized and translated | ☐ |
| 3 | Test 20 common signs (ASL/Auslan) | Recognition ≥70% | ☐ |
| 4 | Check translation latency | Translation ≤2 seconds | ☐ |
| 5 | Test sentence formation | Multi-sign phrases recognized | ☐ |
| 6 | Verify accuracy disclaimer | "Beta feature - accuracy may vary" | ☐ |
| 7 | Test in varied lighting | Works in indoor/outdoor light | ☐ |
| 8 | Check fingerspelling support | Can spell alphabet | ☐ |
| 9 | Test user feedback mechanism | Can report incorrect translations | ☐ |
| 10 | Verify continuous learning | System improves over time | ☐ |
Pass Criteria:
- ✅ Recognition ≥70% (beta acceptable)
- ✅ Latency ≤2 seconds
- ✅ Clear beta disclaimer
- ✅ User feedback functional
Note: Sign language translation is a beta/experimental feature and is held to lower accuracy requirements than production features.
6. GF-VI: VisionAssist Edition
Target User: Low-vision and blind individuals
Key Features: Scene description, OCR text reading, object recognition, navigation guidance
Hardware Differences: None (software + camera focus)
TC-VAR-VI-001: Scene Description Quality
Priority: P0
Category: Accessibility - Vision
Requirement Trace: REQ-VI-100
Automation: Semi-automated
Objective:
Validate AI-powered scene description provides useful information.
Test Equipment:
- 50 test scenes (indoor, outdoor, varied complexity)
- Human evaluation panel (5 blind/low-vision evaluators)
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Capture simple scene (empty room) | Description generated | ☐ |
| 2 | Review description accuracy | All major elements mentioned | ☐ |
| 3 | Test complex scene (busy street) | Prioritizes important elements | ☐ |
| 4 | Check description length | 2-4 sentences (not overwhelming) | ☐ |
| 5 | Verify object detection | Common objects identified | ☐ |
| 6 | Test person detection | People mentioned with position | ☐ |
| 7 | Check spatial relationships | "Table on left, door ahead" | ☐ |
| 8 | Test 50 varied scenes | Human eval: usefulness ≥4.0/5.0 | ☐ |
| 9 | Measure description latency | Generated in ≤3 seconds | ☐ |
| 10 | Verify audio output quality | TTS clear and natural | ☐ |
| 11 | Test continuous mode | Updates as user moves | ☐ |
| 12 | Check battery impact | Description mode < 10% extra drain/hour | ☐ |
Pass Criteria:
- ✅ Human evaluation ≥4.0/5.0
- ✅ Latency ≤3 seconds
- ✅ Spatial relationships included
- ✅ Prioritizes important elements
Scene Elements Detected:
- Objects (furniture, items)
- People (count, position)
- Obstacles
- Doors/exits
- Spatial layout
- Lighting conditions
TC-VAR-VI-002: OCR Text Reading Accuracy
Priority: P0
Category: Accessibility - Vision
Requirement Trace: REQ-VI-101
Automation: Automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Point at printed text (book page) | Text recognized | ☐ |
| 2 | Check OCR accuracy | Accuracy ≥95% (clear print) | ☐ |
| 3 | Test various fonts (10 different) | All fonts recognized | ☐ |
| 4 | Test small text (8pt font) | Readable when zoomed | ☐ |
| 5 | Test handwritten text | Recognition ≥70% (clear writing) | ☐ |
| 6 | Check reading speed | Reads aloud at natural pace | ☐ |
| 7 | Test menu reading (restaurant) | Menu items read correctly | ☐ |
| 8 | Test medication label reading | Critical info (dosage, warnings) prioritized | ☐ |
| 9 | Test street sign reading | Signs detected and read from distance | ☐ |
| 10 | Verify document mode | Can read multi-page documents | ☐ |
| 11 | Test in varied lighting | Works in dim and bright light | ☐ |
| 12 | Check reading controls | Can pause, resume, adjust speed | ☐ |
Pass Criteria:
- ✅ OCR accuracy ≥95% (printed text)
- ✅ Handwriting ≥70% accuracy
- ✅ Medication labels prioritized
- ✅ Works in varied lighting
Text Types Tested:
- Printed books
- Menus
- Medication labels
- Street signs
- Packaging
- Handwritten notes
- Screen text
TC-VAR-VI-003: Object & Obstacle Recognition
Priority: P0
Category: Safety - Vision
Requirement Trace: REQ-VI-102
Automation: Semi-automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Present common object (chair) | Identified: "Chair ahead, 1.5 meters" | ☐ |
| 2 | Test 100 common objects | Recognition ≥90% | ☐ |
| 3 | Check distance estimation | Accuracy ±20% | ☐ |
| 4 | Test obstacle detection (walking path) | All obstacles flagged | ☐ |
| 5 | Verify audio alert priority | Obstacles get immediate alert | ☐ |
| 6 | Test object search: "Find door" | Guides user to nearest door | ☐ |
| 7 | Check continuous scanning mode | Scans and announces as user moves | ☐ |
| 8 | Test color detection: "What color is this?" | Color announced accurately | ☐ |
| 9 | Verify brand/logo recognition | Can identify 50+ common brands | ☐ |
| 10 | Test currency recognition | Identifies bills/coins | ☐ |
| 11 | Check low-light performance | Works in dim lighting (uses flash) | ☐ |
| 12 | Verify object history | Last 20 objects saved | ☐ |
Pass Criteria:
- ✅ Object recognition ≥90%
- ✅ Distance estimation ±20%
- ✅ Obstacle alerts reliable
- ✅ Currency recognition functional
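The ±20% distance-estimation criterion above is a relative tolerance against ground truth, which a test harness can check directly:

```python
def within_tolerance(estimated_m: float, actual_m: float, tol: float = 0.20) -> bool:
    """True when the distance estimate is within ±tol (default 20%) of ground truth."""
    return abs(estimated_m - actual_m) <= tol * actual_m

print(within_tolerance(1.7, 1.5))  # True  (~13% error)
print(within_tolerance(2.0, 1.5))  # False (~33% error)
```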
TC-VAR-VI-004: Audio Navigation Guidance
Priority: P0
Category: Navigation - Vision
Requirement Trace: REQ-VI-103
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Set destination (within building) | Route calculated | ☐ |
| 2 | Start navigation | Turn-by-turn audio guidance | ☐ |
| 3 | Check guidance clarity | Clear instructions: "Walk 10 steps, turn right" | ☐ |
| 4 | Verify obstacle avoidance | Route avoids detected obstacles | ☐ |
| 5 | Test indoor navigation | Works without GPS | ☐ |
| 6 | Check audio cue frequency | Updates every 5-10 steps or at turns | ☐ |
| 7 | Test haptic directional cues | Vibration guides left/right | ☐ |
| 8 | Verify landmark mentions | "Passing coffee shop on your left" | ☐ |
| 9 | Test destination confirmation | "Destination reached" announcement | ☐ |
| 10 | Check re-routing if off-path | Recalculates route automatically | ☐ |
| 11 | Test outdoor GPS navigation | Works with standard maps | ☐ |
| 12 | Verify safety warnings | Warns of hazards (stairs, crossings) | ☐ |
Pass Criteria:
- ✅ Clear, timely audio guidance
- ✅ Obstacle avoidance functional
- ✅ Indoor navigation works
- ✅ Safety warnings given
TC-VAR-VI-005: Facial Recognition (Opt-In Feature)
Priority: P1
Category: Social Assistance
Requirement Trace: REQ-VI-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | User enables face recognition (consent) | Feature activated | ☐ |
| 2 | Add known person (train with 10 photos) | Person profile created | ☐ |
| 3 | Detect known person (3m distance) | "John approaching, 3 o'clock" | ☐ |
| 4 | Check recognition accuracy (20 people) | Accuracy ≥90% | ☐ |
| 5 | Test unknown person detection | "Unknown person approaching" | ☐ |
| 6 | Verify privacy controls | Can disable feature anytime | ☐ |
| 7 | Check data storage | Faces stored locally, encrypted | ☐ |
| 8 | Test varied lighting conditions | Works in indoor/outdoor light | ☐ |
| 9 | Verify detection range | Detects faces up to 5 meters | ☐ |
| 10 | Check multi-person detection | Can recognize multiple people | ☐ |
| 11 | Test false positive rate | FPR < 5% | ☐ |
| 12 | Verify consent requirements | Clear privacy notice shown | ☐ |
Pass Criteria:
- ✅ Recognition accuracy ≥90%
- ✅ Detection range up to 5m
- ✅ Privacy controls robust
- ✅ Explicit consent required
Privacy Requirements:
- Opt-in only (disabled by default)
- Local storage (not cloud)
- Encrypted data
- User can delete data anytime
- Clear privacy notice
7. GF-TR: Traveller Edition
Target User: International travelers, tourists, digital nomads
Key Features: Multi-language translation, currency conversion, travel assistance
Hardware Differences: None (software-focused)
TC-VAR-TR-001: Real-Time Translation Accuracy
Priority: P0
Category: Language Translation
Requirement Trace: REQ-TR-100
Automation: Semi-automated
Objective:
Validate real-time translation accuracy across multiple languages.
Test Equipment:
- Native speakers (5+ languages)
- Standard translation test sets (BLEU score)
- Audio playback (varied accents)
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Set target language: Spanish | Language selected | ☐ |
| 2 | Speak English phrase | Translated to Spanish | ☐ |
| 3 | Check translation accuracy | BLEU score ≥30 (acceptable) | ☐ |
| 4 | Test 100 common travel phrases | Accuracy ≥85% | ☐ |
| 5 | Test conversation mode (bi-directional) | Both directions work | ☐ |
| 6 | Check translation latency | Latency ≤3 seconds | ☐ |
| 7 | Test 10 languages | All 10 functional | ☐ |
| 8 | Verify offline translation | Works without internet (limited) | ☐ |
| 9 | Test accent robustness | Works with varied accents | ☐ |
| 10 | Check audio output quality | TTS clear in target language | ☐ |
| 11 | Test text translation (signs, menus) | OCR + translation pipeline | ☐ |
| 12 | Verify context awareness | Translates based on context | ☐ |
Pass Criteria:
- ✅ BLEU score ≥30 (acceptable translation)
- ✅ Common phrases ≥85% accurate
- ✅ Latency ≤3 seconds
- ✅ Works on 10+ languages
Languages Tested (Priority):
- Spanish
- French
- German
- Mandarin
- Japanese
- Korean
- Italian
- Portuguese
- Arabic
- Thai
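The BLEU ≥30 criterion in TC-VAR-TR-001 is an n-gram overlap score. A simplified sentence-level sketch (geometric mean of clipped 1- and 2-gram precisions with a brevity penalty) illustrates the computation; production scoring should use a standard corpus-level BLEU implementation such as sacreBLEU:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def simple_bleu(reference: str, hypothesis: str, max_n: int = 2) -> float:
    """Simplified sentence-level BLEU on a 0-100 scale."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    precisions = []
    for n in range(1, max_n + 1):
        hyp_counts, ref_counts = ngrams(hyp, n), ngrams(ref, n)
        overlap = sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # floor avoids log(0)
    # Brevity penalty discourages overly short hypotheses
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return 100 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(round(simple_bleu("where is the train station",
                        "where is the train station")))  # 100
```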
TC-VAR-TR-002: Currency Conversion & Recognition
Priority: P1
Category: Travel Assistance
Requirement Trace: REQ-TR-101
Automation: Semi-automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Point camera at price tag "€50" | Recognizes currency symbol | ☐ |
| 2 | System converts to home currency | Shows "~$55 USD" (current rate) | ☐ |
| 3 | Test 10 major currencies | All recognized correctly | ☐ |
| 4 | Check exchange rate updates | Rates updated daily | ☐ |
| 5 | Test manual conversion (voice) | "Convert 100 euros to dollars" | ☐ |
| 6 | Verify quick math overlay | Conversion shown on HUD instantly | ☐ |
| 7 | Test banknote recognition | Identifies bills (10+ currencies) | ☐ |
| 8 | Check coin recognition | Identifies coins (limited support) | ☐ |
| 9 | Test split bill calculator | Can split restaurant bill | ☐ |
| 10 | Verify tipping calculator | Suggests appropriate tips by country | ☐ |
Pass Criteria:
- ✅ Currency recognition ≥95%
- ✅ Exchange rates updated daily
- ✅ Conversion shown instantly
- ✅ Tipping suggestions by country
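The conversion and tipping steps above reduce to rate lookups. A sketch with illustrative static rates and hypothetical per-country tipping norms (the device would refresh real rates daily from a rate feed, per Step 4):

```python
# Illustrative static rates to USD; the device refreshes these daily.
RATES_TO_USD = {"EUR": 1.10, "GBP": 1.27, "JPY": 0.0067, "USD": 1.0}
# Hypothetical per-country tipping norms for the Step 10 calculator.
TIP_NORMS = {"US": 0.18, "JP": 0.0, "DE": 0.10}

def convert(amount: float, currency: str, to: str = "USD") -> float:
    """Convert via USD as the pivot currency, rounded to cents."""
    usd = amount * RATES_TO_USD[currency]
    return round(usd / RATES_TO_USD[to], 2)

def suggested_tip(bill: float, country: str) -> float:
    """Tip suggestion by local norm; defaults to 10% for unlisted countries."""
    return round(bill * TIP_NORMS.get(country, 0.10), 2)

print(convert(50, "EUR"))       # 55.0  -> the "€50 ≈ $55 USD" overlay in Step 2
print(suggested_tip(80, "US"))  # 14.4
```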
TC-VAR-TR-003: Local Navigation & POI Discovery
Priority: P1
Category: Travel Navigation
Requirement Trace: REQ-TR-102
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Ask: "Find nearby restaurants" | List of restaurants shown | ☐ |
| 2 | Check POI relevance | Results appropriate for location | ☐ |
| 3 | Test navigation to selected POI | Turn-by-turn guidance | ☐ |
| 4 | Verify multi-language directions | Directions in user's language | ☐ |
| 5 | Test public transit integration | Shows bus/train options | ☐ |
| 6 | Check offline maps availability | Maps pre-downloadable | ☐ |
| 7 | Test landmark identification | Recognizes famous landmarks | ☐ |
| 8 | Verify hours of operation | Shows open/closed status | ☐ |
| 9 | Check user reviews integration | Shows ratings/reviews | ☐ |
| 10 | Test AR walking directions | Arrows overlaid on real world | ☐ |
Pass Criteria:
- ✅ POI results relevant
- ✅ Navigation accurate
- ✅ Offline maps functional
- ✅ AR directions clear
TC-VAR-TR-004: Travel Document Management
Priority: P2
Category: Travel Organization
Requirement Trace: REQ-TR-103
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Scan passport (photo) | Data extracted (name, number, expiry) | ☐ |
| 2 | Store document securely | Encrypted storage | ☐ |
| 3 | Scan boarding pass | Flight details extracted | ☐ |
| 4 | Check itinerary organization | Trip organized by date | ☐ |
| 5 | Test notification system | Alerts for flight times, check-in | ☐ |
| 6 | Verify offline access | Documents available offline | ☐ |
| 7 | Check document expiry warnings | Warns if passport expiring < 6 months | ☐ |
| 8 | Test visa requirement lookup | Shows visa needs by country | ☐ |
| 9 | Verify backup system | Documents backed up (encrypted) | ☐ |
| 10 | Check emergency access | Can access docs quickly via voice | ☐ |
Pass Criteria:
- ✅ OCR accuracy ≥95%
- ✅ Encrypted storage
- ✅ Offline access works
- ✅ Expiry warnings functional
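The Step 7 expiry warning (passport expiring in under 6 months) is simple date arithmetic. A sketch, approximating months as 30-day periods:

```python
from datetime import date

def expiry_warning(expiry: date, today: date, months: int = 6) -> bool:
    """True when the document expires within `months` months (~30-day months)."""
    return (expiry - today).days < months * 30

print(expiry_warning(date(2026, 3, 1), date(2025, 11, 20)))  # True (~101 days left)
print(expiry_warning(date(2027, 1, 1), date(2025, 11, 20)))  # False
```

Many countries require six months of passport validity for entry, which is why the threshold sits at 6 rather than 0.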
TC-VAR-TR-005: Cultural Context & Safety Alerts
Priority: P1
Category: Travel Safety
Requirement Trace: REQ-TR-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Enter new country (geofence) | Welcome message with cultural tips | ☐ |
| 2 | Check tipping customs info | Local tipping norms displayed | ☐ |
| 3 | Verify business hours context | Local working hours noted | ☐ |
| 4 | Test safety alerts | Warns of unsafe areas (if applicable) | ☐ |
| 5 | Check local emergency numbers | Shows local 911 equivalent | ☐ |
| 6 | Test dress code reminders | Suggests appropriate attire | ☐ |
| 7 | Verify gesture warnings | Warns about offensive gestures | ☐ |
| 8 | Check travel advisory integration | Shows government travel advisories | ☐ |
| 9 | Test embassy locator | Finds nearest embassy | ☐ |
| 10 | Verify SOS local integration | SOS works with local services | ☐ |
Pass Criteria:
- ✅ Cultural tips appropriate
- ✅ Safety alerts timely
- ✅ Emergency info accurate
- ✅ Embassy locator functional
8. GF-LX: Lifestyle Edition
Target User: General consumers, daily lifestyle users
Key Features: Journaling, mood tracking, wellness reminders, everyday productivity
Hardware Differences: None (software-focused)
TC-VAR-LX-001: Voice Journaling & Diary
Priority: P1
Category: Lifestyle Features
Requirement Trace: REQ-LX-100
Automation: Semi-automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Voice: "Start journal entry" | Recording begins | ☐ |
| 2 | Speak for 2 minutes (free-form) | Transcribed continuously | ☐ |
| 3 | Voice: "End journal entry" | Entry saved with timestamp | ☐ |
| 4 | Check transcription accuracy | WER ≤5% | ☐ |
| 5 | Verify automatic categorization | Entry tagged (work/personal/health) | ☐ |
| 6 | Test mood detection | Sentiment analyzed (positive/negative/neutral) | ☐ |
| 7 | Check photo attachment | Can attach photos to entries | ☐ |
| 8 | Test entry search | Can search by keyword or date | ☐ |
| 9 | Verify privacy encryption | Entries encrypted at rest | ☐ |
| 10 | Check daily prompt feature | "How was your day?" reminder | ☐ |
| 11 | Test export functionality | Can export as PDF/text | ☐ |
| 12 | Verify backup system | Entries backed up automatically | ☐ |
Pass Criteria:
- ✅ Transcription accuracy ≥95%
- ✅ Automatic categorization works
- ✅ Privacy encryption enabled
- ✅ Search functional
TC-VAR-LX-002: Mood Tracking & Wellness Insights
Priority: P1
Category: Mental Wellness
Requirement Trace: REQ-LX-101
Automation: Semi-automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Daily mood check-in prompt | "How are you feeling today?" | ☐ |
| 2 | User selects mood (1-5 scale) | Mood logged with timestamp | ☐ |
| 3 | Check mood history (7/30/90 days) | Trends visualized | ☐ |
| 4 | Test mood correlation with activities | Shows patterns (e.g., exercise vs mood) | ☐ |
| 5 | Verify stress detection (HRV) | High stress triggers check-in | ☐ |
| 6 | Check wellness tips | Context-appropriate suggestions | ☐ |
| 7 | Test sleep quality tracking | Sleep duration/quality logged | ☐ |
| 8 | Verify water intake reminders | Hydration reminders every 2 hours | ☐ |
| 9 | Check screen time awareness | Tracks HUD usage time | ☐ |
| 10 | Test mindfulness prompts | Breathing exercise suggestions | ☐ |
Pass Criteria:
- ✅ Mood tracking consistent
- ✅ Trends visualized clearly
- ✅ Wellness tips appropriate
- ✅ Reminders timely
TC-VAR-LX-003: Smart Reminders & Productivity
Priority: P2
Category: Productivity
Requirement Trace: REQ-LX-102
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Voice: "Remind me to call mom at 5pm" | Reminder set | ☐ |
| 2 | Check reminder notification at 5pm | Alert triggered on time | ☐ |
| 3 | Test location-based reminders | "Remind me when I get home" works | ☐ |
| 4 | Verify smart suggestions | AI suggests reminders based on habits | ☐ |
| 5 | Test recurring reminders | Daily/weekly/monthly options | ☐ |
| 6 | Check snooze functionality | Can snooze for 10/30/60 minutes | ☐ |
| 7 | Test shopping list integration | Can add items via voice | ☐ |
| 8 | Verify task prioritization | Important tasks highlighted | ☐ |
| 9 | Check calendar integration | Syncs with Google Calendar | ☐ |
| 10 | Test voice note capture | Quick voice notes saved | ☐ |
Pass Criteria:
- ✅ Reminders accurate (±1 minute)
- ✅ Location-based reminders work
- ✅ Calendar sync functional
- ✅ Voice notes clear
TC-VAR-LX-004: Social & Communication Features
Priority: P2
Category: Communication
Requirement Trace: REQ-LX-103
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Voice: "Call Sarah" | Initiates phone call | ☐ |
| 2 | Check contact recognition | Correct contact selected | ☐ |
| 3 | Test hands-free calling | Can answer/end calls via voice | ☐ |
| 4 | Voice: "Send message to John" | Message composition starts | ☐ |
| 5 | Dictate message content | Transcribed accurately | ☐ |
| 6 | Check message preview | User can review before sending | ☐ |
| 7 | Test notification management | Can read/reply to messages | ☐ |
| 8 | Verify do-not-disturb mode | Silences non-urgent notifications | ☐ |
| 9 | Check caller ID display | Shows caller name/number on HUD | ☐ |
| 10 | Test group call support | Can handle conference calls | ☐ |
Pass Criteria:
- ✅ Voice calling reliable
- ✅ Message dictation accurate
- ✅ Notification management smooth
- ✅ Hands-free operation complete
TC-VAR-LX-005: Photo & Memory Capture
Priority: P2
Category: Lifestyle
Requirement Trace: REQ-LX-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Voice: "Take a photo" | Photo captured | ☐ |
| 2 | Check photo quality | Resolution ≥12MP, clear | ☐ |
| 3 | Test video recording | Records HD video | ☐ |
| 4 | Verify automatic tagging | Photos tagged by location/time | ☐ |
| 5 | Test AI scene recognition | Detects scenes (sunset, food, people) | ☐ |
| 6 | Check cloud backup | Photos backed up automatically | ☐ |
| 7 | Test photo search | Can search by content or date | ☐ |
| 8 | Verify sharing functionality | Can share to social media | ☐ |
| 9 | Check memory highlights | "On this day" reminders | ☐ |
| 10 | Test photo editing | Basic filters/adjustments available | ☐ |
Pass Criteria:
- ✅ Photo quality ≥12MP
- ✅ Auto-tagging accurate
- ✅ Cloud backup reliable
- ✅ Search functional
9. GF-EN: Enterprise Vision Edition
Target User: Enterprise IT, fleet deployments, corporate use
Key Features: Fleet management, remote admin, SOP integration, compliance reporting
Hardware Differences: Optional enterprise modem (LTE), enhanced security chip
TC-VAR-EN-001: Fleet Management Dashboard
Priority: P0
Category: Enterprise IT
Requirement Trace: REQ-EN-100
Automation: Semi-automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Admin logs into fleet dashboard | Dashboard accessible | ☐ |
| 2 | Check device inventory | All deployed devices listed | ☐ |
| 3 | Verify device status (online/offline) | Real-time status accurate | ☐ |
| 4 | Check battery levels (all devices) | Battery % displayed | ☐ |
| 5 | Test remote device location | GPS locations accurate | ☐ |
| 6 | Verify firmware version tracking | All versions listed | ☐ |
| 7 | Check usage statistics | Hours used per device | ☐ |
| 8 | Test alert system | Alerts for low battery, offline devices | ☐ |
| 9 | Verify user assignment tracking | Devices assigned to correct users | ☐ |
| 10 | Check compliance status | Displays compliance metrics | ☐ |
| 11 | Test bulk operations | Can update multiple devices | ☐ |
| 12 | Verify audit logs | All admin actions logged | ☐ |
Pass Criteria:
- ✅ Real-time device status accurate
- ✅ Location tracking ≤10m accuracy
- ✅ Alerts timely and reliable
- ✅ Audit logs complete
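The Step 8 alert logic (low battery, offline devices) can be sketched as a pass over the fleet inventory; the device record fields below are hypothetical:

```python
def fleet_alerts(devices: list, low_battery: int = 20) -> list:
    """Flag devices that are offline or below the battery threshold."""
    alerts = []
    for d in devices:
        if not d["online"]:
            alerts.append(f"{d['id']}: OFFLINE")
        elif d["battery"] < low_battery:
            alerts.append(f"{d['id']}: LOW BATTERY ({d['battery']}%)")
    return alerts

fleet = [
    {"id": "GF-0001", "online": True,  "battery": 85},
    {"id": "GF-0002", "online": False, "battery": 60},
    {"id": "GF-0003", "online": True,  "battery": 12},
]
print(fleet_alerts(fleet))  # ['GF-0002: OFFLINE', 'GF-0003: LOW BATTERY (12%)']
```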
TC-VAR-EN-002: Remote Device Management
Priority: P0
Category: IT Administration
Requirement Trace: REQ-EN-101
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Admin initiates remote wipe | Confirmation prompt appears | ☐ |
| 2 | Confirm wipe command | Device data wiped in ≤30 seconds | ☐ |
| 3 | Verify data removal | All user data removed | ☐ |
| 4 | Test remote lock | Device locks, requires admin PIN | ☐ |
| 5 | Check remote software push | OTA update pushed successfully | ☐ |
| 6 | Test configuration deployment | Settings applied to device | ☐ |
| 7 | Verify geofencing | Device alerts when leaving area | ☐ |
| 8 | Check device disable command | Device can be remotely disabled | ☐ |
| 9 | Test remote diagnostic mode | Can view device diagnostics | ☐ |
| 10 | Verify compliance enforcement | Non-compliant devices flagged | ☐ |
Pass Criteria:
- ✅ Remote wipe completes ≤30 sec
- ✅ Remote lock effective
- ✅ OTA push successful
- ✅ Geofencing accurate
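Geofencing (Step 7) is typically a great-circle distance check against a fence center and radius. A minimal haversine sketch; the coordinates are illustrative only:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance in meters between two WGS-84 points."""
    R = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def outside_geofence(device, center, radius_m: float) -> bool:
    """True when the device has left the fenced area -> raise an admin alert."""
    return haversine_m(*device, *center) > radius_m

site = (-33.8688, 151.2093)  # illustrative site coordinates
print(outside_geofence((-33.8688, 151.2093), site, 200))  # False (on site)
print(outside_geofence((-33.8800, 151.2093), site, 200))  # True (~1.2 km away)
```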
TC-VAR-EN-003: Enterprise SOP Integration
Priority: P1
Category: Workflow Integration
Requirement Trace: REQ-EN-102
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Admin uploads SOP library (100 docs) | All SOPs uploaded | ☐ |
| 2 | Check SOP distribution to devices | All devices receive SOPs | ☐ |
| 3 | Test SOP access control | Users see only assigned SOPs | ☐ |
| 4 | Verify version control | Latest SOP version always used | ☐ |
| 5 | Check SOP completion tracking | Tracks which users completed SOPs | ☐ |
| 6 | Test offline SOP access | SOPs available without network | ☐ |
| 7 | Verify SOP analytics | Shows usage statistics | ☐ |
| 8 | Check SOP search functionality | Users can search SOP library | ☐ |
| 9 | Test SOP update notifications | Users notified of updates | ☐ |
| 10 | Verify compliance reporting | Reports on SOP adherence | ☐ |
Pass Criteria:
- ✅ SOP distribution reliable
- ✅ Version control automatic
- ✅ Completion tracking accurate
- ✅ Offline access functional
TC-VAR-EN-004: Security & Compliance Reporting
Priority: P0
Category: Compliance
Requirement Trace: REQ-EN-103
Automation: Automated
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Generate security audit report | Report generated | ☐ |
| 2 | Check encryption status (all devices) | All devices encrypted | ☐ |
| 3 | Verify authentication logs | All login attempts logged | ☐ |
| 4 | Check privacy compliance (GDPR/CCPA) | Compliant data handling verified | ☐ |
| 5 | Test data retention policies | Data deleted per policy | ☐ |
| 6 | Verify access control reports | Shows who accessed what | ☐ |
| 7 | Check incident report generation | Security incidents documented | ☐ |
| 8 | Test compliance dashboard | Shows compliance metrics | ☐ |
| 9 | Verify export for audit | Reports exportable (PDF/CSV) | ☐ |
| 10 | Check tamper detection | Detects unauthorized access | ☐ |
Pass Criteria:
- ✅ 100% devices encrypted
- ✅ Complete audit trail
- ✅ GDPR/CCPA compliant
- ✅ Reports comprehensive
TC-VAR-EN-005: Role-Based Access Control
Priority: P0
Category: Security
Requirement Trace: REQ-EN-104
Automation: Manual
Test Procedure:
| Step | Action | Expected Result | Pass/Fail |
|---|---|---|---|
| 1 | Create user role: "Field Worker" | Role created | ☐ |
| 2 | Assign permissions (limited) | Permissions set | ☐ |
| 3 | User logs in with Field Worker role | Sees only authorized features | ☐ |
| 4 | Attempt to access admin features | Access denied | ☐ |
| 5 | Create role: "Supervisor" (elevated) | Role created | ☐ |
| 6 | Test supervisor access | Can view worker data | ☐ |
| 7 | Create role: "Admin" (full access) | Role created | ☐ |
| 8 | Verify admin privileges | Full system access | ☐ |
| 9 | Test role modification | Permissions can be updated | ☐ |
| 10 | Check role audit trail | All role changes logged | ☐ |
Pass Criteria:
- ✅ RBAC enforced strictly
- ✅ No privilege escalation possible
- ✅ Audit trail complete
- ✅ Flexible role management
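The role hierarchy exercised above (Field Worker < Supervisor < Admin) comes down to a deny-by-default permission lookup. A sketch with a hypothetical permission matrix (the real matrix lives in the enterprise admin console):

```python
# Hypothetical role -> permission mapping for illustration.
ROLE_PERMISSIONS = {
    "field_worker": {"view_sop", "trigger_sos"},
    "supervisor":   {"view_sop", "trigger_sos", "view_worker_data"},
    "admin":        {"view_sop", "trigger_sos", "view_worker_data",
                     "remote_wipe", "manage_roles"},
}

def authorize(role: str, permission: str) -> bool:
    """Deny by default: unknown roles or permissions get no access."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(authorize("field_worker", "remote_wipe"))  # False (Step 4: access denied)
print(authorize("admin", "remote_wipe"))         # True
```

Deny-by-default is what makes the "no privilege escalation" pass criterion testable: any permission not explicitly granted must come back `False`.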
Appendix A: Variant Test Summary Matrix
| Variant | Code | Test Cases | Priority P0 | Priority P1 | Priority P2 | Key Focus |
|---|---|---|---|---|---|---|
| Ben's Assistive | GF-BE | 5 | 3 | 2 | 0 | Wheelchair, voice control, hazards |
| Care & Joy | GF-CL | 5 | 4 | 1 | 0 | NDIS compliance, consent, incidents |
| Fitness | GF-NF | 5 | 2 | 3 | 0 | HR accuracy, fatigue, performance |
| TradeForce | GF-TX | 5 | 3 | 2 | 0 | IP65, OH&S, AR tools, noise |
| SilentLink | GF-DI | 5 | 3 | 1 | 1 | Captions, sound alerts, visual |
| VisionAssist | GF-VI | 5 | 4 | 1 | 0 | Scene description, OCR, navigation |
| Traveller | GF-TR | 5 | 2 | 3 | 0 | Translation, currency, safety |
| Lifestyle | GF-LX | 5 | 0 | 2 | 3 | Journaling, mood, productivity |
| Enterprise | GF-EN | 5 | 4 | 1 | 0 | Fleet mgmt, security, compliance |
| TOTAL (9 variants) | — | 45 | 27 | 15 | 3 | All variants covered |
Appendix B: Cross-Variant Compatibility Tests
Shared Platform Verification
All variants must pass core platform tests:
- Hardware tests (70 cases)
- Software/firmware tests (45 cases)
- AI/ML tests (32 cases)
- Safety tests (28 cases)
Plus variant-specific tests (5 per variant)
Total per variant: 175 core + 5 variant-specific = 180 test cases minimum
Appendix C: Variant-Specific BOM Differences
| Variant | Unique Hardware | Cost Delta |
|---|---|---|
| GF-BE | Mounting bracket, enhanced IMU | +$45 |
| GF-CL | None (software only) | $0 |
| GF-NF | Enhanced health sensors (HR, SpO₂, temp) | +$35 |
| GF-TX | IP65 seals, reinforced frame, noise-cancel mics | +$60 |
| GF-DI | None (software only) | $0 |
| GF-VI | None (software + camera focus) | $0 |
| GF-TR | None (software only) | $0 |
| GF-LX | None (software only) | $0 |
| GF-EN | Optional: LTE modem, enterprise security chip | +$95 (optional) |
Document Approval
Reviewed by:
- Product Manager: _________________ Date: _______
- QA Lead: _________________ Date: _______
- GF-BE Product Owner: _________________ Date: _______
- GF-CL Product Owner: _________________ Date: _______
- GF-NF Product Owner: _________________ Date: _______
- GF-TX Product Owner: _________________ Date: _______
- GF-DI Product Owner: _________________ Date: _______
- GF-VI Product Owner: _________________ Date: _______
- GF-TR Product Owner: _________________ Date: _______
- GF-LX Product Owner: _________________ Date: _______
- GF-EN Product Owner: _________________ Date: _______
- Technical Lead: _________________ Date: _______
END OF VARIANT-SPECIFIC TEST CASES
This completes the GROOT FORCE comprehensive test documentation suite with 220+ total test procedures covering all aspects of the platform and all 9 product variants.