
๐Ÿ“ Manual Test Cases - AegisAI

Detailed step-by-step test cases for manual verification.


🎯 Test Case 1: First Run Experience

Objective: Verify clean installation works end-to-end

Prerequisites: Fresh clone of repository

Steps:

  1. Open PowerShell in project root
  2. Run: .\verify.ps1
  3. Expected: All checks pass OR clear instructions shown
  4. Run: .\create-files.ps1 (if needed)
  5. Edit .env - add your Gemini API key
  6. Edit frontend\.env.local - add your Gemini API key (a quick way to confirm the key is picked up follows these steps)
  7. Run backend setup:
    cd backend
    python -m venv venv
    .\venv\Scripts\activate
    pip install -r requirements.txt
    
  8. Run frontend setup:
    cd frontend
    npm install
    
  9. Start frontend:
    npm run dev
    
  10. Open browser to http://localhost:3000
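
If the page loads but analysis never starts, the API key may not be reaching the frontend. A quick sanity check, assuming the app reads the key via Vite's `import.meta.env` (the variable name `VITE_GEMINI_API_KEY` appears again in Test Case 9): drop this line temporarily into any frontend module and watch the console.

```ts
// Temporary debug line for any frontend module (e.g. the app entry point).
// Vite only exposes env vars prefixed with VITE_, and only after restarting
// `npm run dev`. Logs the presence of the key, never the key itself.
console.log("Gemini key set:", Boolean(import.meta.env.VITE_GEMINI_API_KEY));
```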

Expected Result:

  • ✅ Page loads without errors
  • ✅ Dark theme with cyan accents
  • ✅ "AEGISAI" header visible
  • ✅ 4 stat cards showing zeros
  • ✅ Video placeholder or camera prompt
  • ✅ Dashboard with empty chart

Pass Criteria: All expected results achieved


🎯 Test Case 2: Camera Permission Flow

Objective: Verify camera access works correctly

Steps:

  1. Open http://localhost:3000 in Chrome (recommended)
  2. Browser shows permission prompt
  3. Click "Allow"
  4. Wait 2 seconds

Expected Result:

  • ✅ Video feed shows webcam stream
  • ✅ HUD elements visible:
    • Corner brackets (4 corners)
    • Center reticle (circle with red dot)
    • "AEGIS // STANDBY" text (top-left)
    • "CAM_01 // 1080p // 30FPS" (bottom-right)
    • Current time (top-right)
  • ✅ No console errors (F12 → Console)

Alternative Test (if "Block" was clicked):

  1. Click camera icon in browser address bar
  2. Change to "Allow"
  3. Refresh page
  4. Verify video appears

Pass Criteria: Video stream visible with HUD overlay
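
If neither path produces video, you can probe camera access directly in the console with the standard `getUserMedia` API; this bypasses the app entirely, so it separates browser/OS permission problems from application bugs.

```ts
// Paste into the browser console (F12 → Console). Requests the webcam,
// logs the device name, then releases it immediately.
navigator.mediaDevices
  .getUserMedia({ video: true })
  .then((stream) => {
    console.log("Camera OK:", stream.getVideoTracks()[0].label);
    stream.getTracks().forEach((track) => track.stop()); // release device
  })
  .catch((err) => console.error("Camera unavailable:", err.name, err.message));
```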


🎯 Test Case 3: Monitoring Activation

Objective: Verify monitoring can be started and stopped

Steps:

  1. Ensure camera is active (from Test Case 2)
  2. Click "ACTIVATE AEGIS" button (green, top-right)
  3. Observe changes for 10 seconds
  4. Click "STOP SURVEILLANCE" button (now red)
  5. Wait 5 seconds

Expected During Monitoring (Active):

  • ✅ Button turns red with "STOP SURVEILLANCE"
  • ✅ Red indicator dot pulses
  • ✅ Video shows "AEGIS // LIVE"
  • ✅ Scan line animation appears (blue gradient moving)
  • ✅ Center reticle enlarges and becomes more opaque
  • ✅ Console logs appear every ~4 seconds:
    Processing frame...
    Analysis complete: {...}

  • ✅ "Scans Performed" counter increments (every 4 sec)
  • ✅ "System Load" fluctuates (40-60%)

Expected After Stopping:

  • ✅ Button turns green with "ACTIVATE AEGIS"
  • ✅ Indicator dot stops pulsing (solid green)
  • ✅ Video shows "AEGIS // STANDBY"
  • ✅ Scan line disappears
  • ✅ No new console logs
  • ✅ Counters stop incrementing

Pass Criteria: All transitions work as expected
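
For context when reading the console output: the ~4-second cadence implies a timer-driven capture loop roughly like the sketch below. Names and structure are illustrative assumptions, not the actual AegisAI source.

```ts
// Illustrative sketch of a 4-second scan loop (not the real implementation).
const SCAN_INTERVAL_MS = 4000;

// Grab the current video frame as a base64 JPEG for the analysis API.
function captureFrame(video: HTMLVideoElement): string {
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d")!.drawImage(video, 0, 0);
  return canvas.toDataURL("image/jpeg", 0.8);
}

let timer: number | undefined;

function activate(video: HTMLVideoElement): void {
  timer = window.setInterval(() => {
    console.log("Processing frame...");
    const frame = captureFrame(video);
    // The real app would send `frame` to Gemini here, then log
    // "Analysis complete: {...}" with the parsed result.
    void frame;
  }, SCAN_INTERVAL_MS);
}

function stop(): void {
  clearInterval(timer); // "STOP SURVEILLANCE" ends the cadence
}
```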


🎯 Test Case 4: Normal Behavior Detection

Objective: Verify system correctly identifies safe behavior

Steps:

  1. Activate monitoring (Test Case 3)
  2. Sit normally facing camera
  3. Type on keyboard naturally
  4. Look at screen
  5. Wait for 2 analysis cycles (~8 seconds)
  6. Check "Latest Inference" card

Expected Result:

  • ✅ Type: "normal" or "None"
  • ✅ Incident: false (shows "✓ SECURE")
  • ✅ Severity: "low"
  • ✅ Confidence: 60-95%
  • ✅ Reasoning mentions: "normal activity", "working", "sitting"
  • ✅ NO red border on video
  • ✅ NO alert sound
  • ✅ Event log shows: [HH:MM:SS] INFO @normal

Pass Criteria: System recognizes normal behavior
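
The fields verified here and in the following test cases suggest an analysis result shaped roughly as below. This is an inference from what the UI displays, not a schema taken from the source.

```ts
// Assumed shape of one analysis result, reconstructed from the UI fields
// (type, incident flag, severity, confidence, reasoning, timestamp).
interface InferenceResult {
  incident: boolean;   // true => red border, alert sound, ALRT log line
  type: string;        // e.g. "normal", "violence", "suspicious_behavior"
  severity: "low" | "medium" | "high";
  confidence: number;  // 0-100, plotted on the threat analysis chart
  reasoning: string;   // model's free-text explanation
  timestamp: string;   // HH:MM:SS, shown in the event log
}
```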


🎯 Test Case 5: Threat Detection - Gun Gesture

Objective: Verify threat detection works

Steps:

  1. Activate monitoring
  2. Form a "finger gun" with one hand (index finger extended, thumb up)
  3. Point at camera
  4. Hold steady for 6 seconds
  5. Wait for analysis (~4 sec)
  6. Observe UI changes

Expected Result:

  • ✅ Red border appears on video (pulsing animation)
  • ✅ "THREAT DETECTED" overlay bounces
  • ✅ Alert sound plays (short beep)
  • ✅ Latest Inference shows:
    • Incident: true
    • Type: "violence", "suspicious_behavior", or "threat"
    • Severity: "medium" or "high"
    • Confidence: 70-95%
    • Reasoning mentions: "weapon", "gun", "threatening gesture", "simulated threat"
  • ✅ "Incidents Detected" counter increments to 1
  • ✅ Event log shows: [HH:MM:SS] ALRT @[type]
  • ✅ Chart updates with spike in confidence

If Detection Fails:

  • Try more exaggerated gesture
  • Ensure good lighting
  • Hold longer (10 seconds)
  • Check console for errors
  • Verify API key is correct

Pass Criteria: Threat detected within 12 seconds


🎯 Test Case 6: Threat Detection - Suspicious Behavior

Objective: Verify detection of subtle threats

Scenario A: Face Covering

Steps:

  1. Activate monitoring
  2. Cover lower face with hand/cloth
  3. Move head side to side suspiciously
  4. Hold for 8 seconds
  5. Wait for analysis

Expected:

  • Type: "suspicious_behavior" or "concealment"
  • Severity: "low" or "medium"
  • Reasoning mentions: "covering face", "concealment", "suspicious"

Scenario B: Nervous Behavior

Steps:

  1. Look around rapidly
  2. Glance at camera then away quickly
  3. Fidget hands nervously
  4. Repeat for 10 seconds

Expected:

  • Type: "suspicious_behavior" or "loitering"
  • Reasoning mentions: "nervous", "erratic", "unusual behavior"

Pass Criteria: At least 1 of 2 scenarios detected


🎯 Test Case 7: Dashboard Real-time Updates

Objective: Verify all dashboard elements update correctly

Setup: Activate monitoring, trigger 2-3 incidents

Verify Each Element:

Stats Cards:

  • ✅ Scans Performed: Increments every 4 sec
  • ✅ Incidents: Shows correct count
  • ✅ System Load: Animates between 10-60%
  • ✅ Activate/Stop button: Works both ways

Threat Analysis Chart:

  • ✅ X-axis shows timestamps (HH:MM:SS)
  • ✅ Y-axis shows 0-100
  • ✅ Blue gradient area appears
  • ✅ Line updates with new data points
  • ✅ Shows last 10 data points maximum (see the sketch after this list)
  • ✅ Tooltip shows confidence on hover
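
The 10-point cap implies a simple sliding window over the chart data. A minimal sketch, assuming an array of `{time, confidence}` points (an illustrative shape, not the app's actual state):

```ts
// Keep at most the newest 10 points; older ones fall off the left edge.
const MAX_POINTS = 10;

type ChartPoint = { time: string; confidence: number };

function pushPoint(points: ChartPoint[], next: ChartPoint): ChartPoint[] {
  return [...points, next].slice(-MAX_POINTS);
}
```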

Latest Inference Card:

  • ✅ Updates with each analysis
  • ✅ Shows correct type, severity, confidence
  • ✅ Reasoning text displayed
  • ✅ Color matches severity (red/orange/blue)
  • ✅ Timestamp accurate

Event Log (Terminal):

  • ✅ New events appear at bottom
  • ✅ Auto-scrolls to latest
  • ✅ Format: [time] TYPE @incident (sketched below)
  • ✅ Color coding: Green=INFO, Red=ALRT
  • ✅ Reasoning text shown
  • ✅ At least the last 20 events visible
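
The `[time] TYPE @incident` format can be reproduced with a one-line formatter (illustrative sketch, not the app's code):

```ts
// Matches the log style described above: INFO for safe frames,
// ALRT for incidents, e.g. "[14:32:07] ALRT @violence".
function formatLogLine(time: string, incident: boolean, type: string): string {
  return `[${time}] ${incident ? "ALRT" : "INFO"} @${type}`;
}

console.log(formatLogLine("14:32:07", true, "violence"));
// -> [14:32:07] ALRT @violence
```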

Pass Criteria: All elements update correctly


🎯 Test Case 8: Performance Under Load

Objective: Verify system stability during extended use

Steps:

  1. Activate monitoring
  2. Let run for 5 minutes continuously
  3. Trigger 5 incidents during this time
  4. Monitor browser performance (F12 → Performance tab)

During Test, Verify:

  • ✅ No browser crashes
  • ✅ No UI freezing
  • ✅ Memory usage stable (< 500 MB; probe sketched below)
  • ✅ CPU usage reasonable (< 50% average)
  • ✅ All incidents logged correctly
  • ✅ Chart doesn't overflow/break
  • ✅ Event log doesn't cause lag

After 5 Minutes:

  • Check total scans: Should be ~75 (one scan every 4 seconds over 300 seconds)
  • Check UI responsiveness: Should still be smooth
  • Check console: No memory leak warnings
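
For the memory check, Chromium browsers expose a non-standard `performance.memory` object; this console snippet samples the JS heap every 30 seconds during the run (written as TypeScript to match the other sketches — drop the `as any` cast when pasting directly into the console).

```ts
// Paste into the console before activating monitoring. Logs the JS heap
// every 30 s; a steadily climbing number suggests a leak.
setInterval(() => {
  const mem = (performance as any).memory; // non-standard, Chromium-only
  if (mem) {
    console.log(`JS heap: ${(mem.usedJSHeapSize / 1048576).toFixed(1)} MB`);
  }
}, 30_000);
```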

Pass Criteria: Runs smoothly for 5 minutes


🎯 Test Case 9: Error Handling

Objective: Verify graceful error handling

Scenario A: Invalid API Key

Steps:

  1. Edit frontend\.env.local
  2. Change API key to: VITE_GEMINI_API_KEY=invalid_key_123
  3. Restart frontend: npm run dev
  4. Activate monitoring
  5. Wait for analysis

Expected:

  • ✅ Error in console: "API error" or "Invalid key"
  • ✅ UI still functional (no crash)
  • ✅ Error message shown in Latest Inference
  • ✅ System continues trying
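
"System continues trying" implies the analysis call is wrapped so a failed request is logged and skipped rather than thrown. A minimal sketch of that pattern, with `analyzeFrame` as a hypothetical stand-in for the app's Gemini call:

```ts
// Hypothetical wrapper showing the graceful-failure pattern under test:
// one bad response should cost one scan cycle, not the whole session.
declare function analyzeFrame(frame: string): Promise<unknown>;

async function safeAnalyze(frame: string): Promise<unknown | null> {
  try {
    return await analyzeFrame(frame);
  } catch (err) {
    console.error("API error:", err); // surfaced in the console check above
    return null;                      // UI shows the error; next cycle retries
  }
}
```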

Scenario B: Camera Disconnected

Steps:

  1. Start with camera working
  2. Physically cover/disconnect webcam
  3. Observe behavior

Expected:

  • ✅ Error message in video feed area
  • ✅ Option to retry/refresh
  • ✅ No crash

Pass Criteria: Errors handled gracefully


🎯 Test Case 10: Browser Compatibility

Objective: Verify cross-browser support

Browsers to Test:

  1. Chrome (primary)
  2. Edge
  3. Firefox

For Each Browser:

  1. Open http://localhost:3000
  2. Allow camera
  3. Activate monitoring
  4. Trigger 1 incident
  5. Verify all UI elements work

Expected:

  • ✅ Chrome: Full support (reference)
  • ✅ Edge: Full support
  • ✅ Firefox: Full support (may need different camera permissions)

Known Issues:

  • Safari: May have camera issues (not primary target)
  • Mobile: Not optimized (desktop-first design)

Pass Criteria: Works in Chrome, Edge, Firefox


🎯 Test Case 11: Docker Deployment

Objective: Verify Docker setup works

Steps:

  1. Ensure Docker Desktop is running
  2. From project root:
    docker-compose build
    
  3. Wait for build (3-5 minutes)
  4. Run:
    docker-compose up
    
  5. Wait for startup
  6. Open http://localhost:3000

Expected:

  • ✅ Build completes without errors
  • ✅ Both containers start:
    • aegisai-backend
    • aegisai-frontend
  • ✅ Frontend accessible at :3000
  • ✅ Backend accessible at :8000
  • ✅ All features work the same as the local setup

Test API:

curl http://localhost:8000/health
curl http://localhost:8000/api/stats
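
The same check scripted in TypeScript (Node 18+, which ships a global `fetch`). A 200 status is the assumed healthy response; the exact response bodies aren't documented here, so they are only printed, not asserted.

```ts
// Run with any TS runner on Node 18+ (e.g. `npx tsx check-api.ts`);
// top-level await requires module (ESM) context.
for (const path of ["/health", "/api/stats"]) {
  const res = await fetch(`http://localhost:8000${path}`);
  console.log(path, res.status, await res.text());
}
```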

Pass Criteria: Docker deployment functional


📊 Test Report Template

After running all tests, fill out:

# Test Report - AegisAI v2.5.0

**Date**: _____________
**Tester**: _____________
**Environment**: Local / Docker / Cloud

## Test Results

| # | Test Case | Result | Notes |
|---|-----------|--------|-------|
| 1 | First Run | ✅ / ❌ | |
| 2 | Camera Access | ✅ / ❌ | |
| 3 | Monitoring | ✅ / ❌ | |
| 4 | Normal Detection | ✅ / ❌ | |
| 5 | Threat: Gun | ✅ / ❌ | Detected in X seconds |
| 6 | Threat: Suspicious | ✅ / ❌ | X/2 scenarios |
| 7 | Dashboard Updates | ✅ / ❌ | |
| 8 | Performance | ✅ / ❌ | Memory: XXX MB |
| 9 | Error Handling | ✅ / ❌ | |
| 10 | Browser Compat | ✅ / ❌ | Chrome/Edge/Firefox |
| 11 | Docker | ✅ / ❌ | |

## Summary

**Passed**: __/11
**Failed**: __/11

**Critical Issues**: 
- 

**Minor Issues**:
-

**Recommendation**: 
[ ] Approved for deployment
[ ] Needs fixes

✅ Acceptance Criteria

To pass testing phase:

  • Minimum 9/11 tests must pass
  • No critical failures (Tests 1, 2, 3, and 5 must pass)
  • Performance acceptable
  • No crashes during normal use

Ready to test! 🧪