# 🧪 Manual Test Cases - AegisAI
Detailed step-by-step test cases for manual verification.
## 🎯 Test Case 1: First Run Experience
Objective: Verify clean installation works end-to-end
Prerequisites: Fresh clone of repository
Steps:
- Open PowerShell in the project root
- Run `.\verify.ps1`
- Expected: all checks pass OR clear instructions are shown
- Run `.\create-files.ps1` (if needed)
- Edit `.env` - add your Gemini API key
- Edit `frontend\.env.local` - add your Gemini API key
- Run the backend setup:
  ```powershell
  cd backend
  python -m venv venv
  .\venv\Scripts\activate
  pip install -r requirements.txt
  ```
- Run the frontend setup:
  ```powershell
  cd frontend
  npm install
  ```
- Start the frontend: `npm run dev`
- Open a browser to http://localhost:3000
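Optionally, before walking through the UI checks, a small scripted smoke test can confirm both servers respond. This is a minimal sketch assuming Node 18+ for the global `fetch`; the backend `/health` endpoint is the same one exercised in Test Case 11.
```typescript
// smoke-check.ts - optional pre-flight check that both servers respond.
// Assumes Node 18+ (global fetch); run with e.g.: npx tsx smoke-check.ts
async function check(url: string): Promise<void> {
  try {
    const res = await fetch(url);
    console.log(`${url} -> HTTP ${res.status} ${res.ok ? "OK" : "FAIL"}`);
  } catch {
    console.log(`${url} -> unreachable (is the server running?)`);
  }
}

await check("http://localhost:3000");        // Vite dev server
await check("http://localhost:8000/health"); // backend health (see Test Case 11)
```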
Expected Result:
- ✅ Page loads without errors
- ✅ Dark theme with cyan accents
- ✅ "AEGISAI" header visible
- ✅ 4 stat cards showing zeros
- ✅ Video placeholder or camera prompt
- ✅ Dashboard with empty chart
Pass Criteria: All expected results achieved
## 🎯 Test Case 2: Camera Permission Flow
Objective: Verify camera access works correctly
Steps:
- Open http://localhost:3000 in Chrome (recommended)
- Browser shows permission prompt
- Click "Allow"
- Wait 2 seconds
Expected Result:
- ✅ Video feed shows webcam stream
- ✅ HUD elements visible:
  - Corner brackets (4 corners)
  - Center reticle (circle with red dot)
  - "AEGIS // STANDBY" text (top-left)
  - "CAM_01 // 1080p // 30FPS" (bottom-right)
  - Current time (top-right)
- ✅ No console errors (F12 → Console)
Alternative Test (if "Block" was clicked):
- Click camera icon in browser address bar
- Change to "Allow"
- Refresh page
- Verify video appears
Pass Criteria: Video stream visible with HUD overlay
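If the feed stays blank, the camera can be tested in isolation from the browser console (F12). This bypasses the app entirely and uses only the standard `getUserMedia` API; the 1080p constraint mirrors the HUD label and is a preference, not a requirement:
```typescript
// Paste into the browser console to test camera access directly.
// Success logs the device name; failure names the cause (e.g. NotAllowedError).
navigator.mediaDevices
  .getUserMedia({ video: { width: 1920, height: 1080 } })
  .then((stream) => {
    console.log("Camera OK:", stream.getVideoTracks()[0].label);
    stream.getTracks().forEach((t) => t.stop()); // release the device again
  })
  .catch((err) => console.error("Camera failed:", err.name, err.message));
```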
## 🎯 Test Case 3: Monitoring Activation
Objective: Verify monitoring can be started and stopped
Steps:
- Ensure camera is active (from Test Case 2)
- Click "ACTIVATE AEGIS" button (green, top-right)
- Observe changes for 10 seconds
- Click "STOP SURVEILLANCE" button (now red)
- Wait 5 seconds
Expected During Monitoring (Active):
- ✅ Button turns red with "STOP SURVEILLANCE"
- ✅ Indicator dot pulses
- ✅ Scan line animates over the video
- ✅ Console logs each analysis cycle
- ✅ Counters increment (one scan per ~4 seconds)
Expected After Stopping:
- ✅ Button turns green with "ACTIVATE AEGIS"
- ✅ Indicator dot stops pulsing (solid green)
- ✅ Video shows "AEGIS // STANDBY"
- ✅ Scan line disappears
- ✅ No new console logs
- ✅ Counters stop incrementing
Pass Criteria: All transitions work as expected
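For reference, the start/stop cycle above amounts to a simple polling loop. The sketch below is a hypothetical illustration only (`analyzeFrame` and `INTERVAL_MS` are assumed names, not AegisAI's actual code); the 4-second interval matches the scan rate verified in Test Case 7.
```typescript
// Hypothetical sketch of the activate/stop cycle - not AegisAI's actual code.
const INTERVAL_MS = 4000; // ~1 scan per 4 s, matching the "Scans Performed" rate
let timer: ReturnType<typeof setInterval> | null = null;

function activate(analyzeFrame: () => Promise<void>): void {
  if (timer !== null) return; // ignore double activation
  timer = setInterval(() => void analyzeFrame(), INTERVAL_MS);
}

function stop(): void {
  if (timer !== null) {
    clearInterval(timer); // no further console logs or counter increments
    timer = null;
  }
}
```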
## 🎯 Test Case 4: Normal Behavior Detection
Objective: Verify system correctly identifies safe behavior
Steps:
- Activate monitoring (Test Case 3)
- Sit normally facing camera
- Type on keyboard naturally
- Look at screen
- Wait for 2 analysis cycles (~8 seconds)
- Check "Latest Inference" card
Expected Result:
- ✅ Type: "normal" or "None"
- ✅ Incident: false (shows "✓ SECURE")
- ✅ Severity: "low"
- ✅ Confidence: 60-95%
- ✅ Reasoning mentions: "normal activity", "working", "sitting"
- ✅ NO red border on video
- ✅ NO alert sound
- ✅ Event log shows: `[HH:MM:SS] INFO @normal`
Pass Criteria: System recognizes normal behavior
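The fields checked above also suggest the shape of a single analysis result. The interface below is inferred purely from the UI labels in this document; the field names are assumptions, not taken from the AegisAI source:
```typescript
// Inferred from the "Latest Inference" card and event log - an assumption,
// not the actual type from the codebase.
interface InferenceResult {
  incident: boolean;   // false renders as "SECURE"
  type: "normal" | "violence" | "suspicious_behavior" | "threat"
      | "concealment" | "loitering";
  severity: "low" | "medium" | "high";
  confidence: number;  // 0-100, plotted on the threat analysis chart
  reasoning: string;   // the model's free-text explanation
  timestamp: string;   // HH:MM:SS, shown in the event log
}
```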
## 🎯 Test Case 5: Threat Detection - Gun Gesture
Objective: Verify threat detection works
Steps:
- Activate monitoring
- Make hand into gun shape (index finger pointed, thumb up)
- Point at camera
- Hold steady for 6 seconds
- Wait for analysis (~4 sec)
- Observe UI changes
Expected Result:
- ✅ Red border appears on video (pulsing animation)
- ✅ "THREAT DETECTED" overlay bounces
- ✅ Alert sound plays (short beep)
- ✅ Latest Inference shows:
  - Incident: true
  - Type: "violence", "suspicious_behavior", or "threat"
  - Severity: "medium" or "high"
  - Confidence: 70-95%
  - Reasoning mentions: "weapon", "gun", "threatening gesture", "simulated threat"
- ✅ "Incidents Detected" counter increments to 1
- ✅ Event log shows: `[HH:MM:SS] ALRT @[type]`
- ✅ Chart updates with a spike in confidence
If Detection Fails:
- Try more exaggerated gesture
- Ensure good lighting
- Hold longer (10 seconds)
- Check console for errors
- Verify API key is correct (see the key check below)
Pass Criteria: Threat detected within 12 seconds
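To rule out a bad key quickly, it can be tested outside the app. This assumes the frontend calls the public Generative Language API directly (which the `VITE_GEMINI_API_KEY` variable suggests); if requests are routed through the backend instead, test there:
```typescript
// Quick API-key sanity check - assumes the public Generative Language API.
// Paste into a browser console or run under Node 18+.
const key = "YOUR_GEMINI_API_KEY"; // the value from frontend\.env.local
fetch(`https://generativelanguage.googleapis.com/v1beta/models?key=${key}`)
  .then((res) => console.log(res.ok ? "Key valid" : `Key rejected (HTTP ${res.status})`))
  .catch((err) => console.error("Request failed:", err));
```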
## 🎯 Test Case 6: Threat Detection - Suspicious Behavior
Objective: Verify detection of subtle threats
Scenario A: Face Covering
Steps:
- Activate monitoring
- Cover lower face with hand/cloth
- Move head side to side suspiciously
- Hold for 8 seconds
- Wait for analysis
Expected:
- Type: "suspicious_behavior" or "concealment"
- Severity: "low" or "medium"
- Reasoning mentions: "covering face", "concealment", "suspicious"
Scenario B: Nervous Behavior
Steps:
- Look around rapidly
- Glance at camera then away quickly
- Fidget hands nervously
- Repeat for 10 seconds
Expected:
- Type: "suspicious_behavior" or "loitering"
- Reasoning mentions: "nervous", "erratic", "unusual behavior"
Pass Criteria: At least 1 of 2 scenarios detected
## 🎯 Test Case 7: Dashboard Real-time Updates
Objective: Verify all dashboard elements update correctly
Setup: Activate monitoring, trigger 2-3 incidents
Verify Each Element:
Stats Cards:
- ✅ Scans Performed: increments every 4 sec
- ✅ Incidents: shows correct count
- ✅ System Load: animates between 10-60%
- ✅ Activate/Stop button: works both ways
Threat Analysis Chart:
- ✅ X-axis shows timestamps (HH:MM:SS)
- ✅ Y-axis shows 0-100
- ✅ Blue gradient area appears
- ✅ Line updates with new data points
- ✅ Shows last 10 data points maximum
- ✅ Tooltip shows confidence on hover
Latest Inference Card:
- ✅ Updates with each analysis
- ✅ Shows correct type, severity, confidence
- ✅ Reasoning text displayed
- ✅ Color matches severity (red/orange/blue)
- ✅ Timestamp accurate
Event Log (Terminal):
- ✅ New events appear at bottom
- ✅ Auto-scrolls to latest
- ✅ Format: `[time] TYPE @incident`
- ✅ Color coding: green = INFO, red = ALRT
- ✅ Reasoning text shown
- ✅ At least the last 20 events visible
Pass Criteria: All elements update correctly
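The bounded chart (last 10 points) and event log (last 20+ entries) imply a simple capped-history pattern. A minimal sketch of that behavior, with hypothetical names:
```typescript
// Capped history: append, then keep only the most recent `max` entries.
function pushBounded<T>(history: T[], item: T, max: number): T[] {
  const next = [...history, item];
  return next.length > max ? next.slice(-max) : next;
}

// e.g. the chart keeps 10 points; the event log would use a larger cap.
let chartData: Array<{ time: string; confidence: number }> = [];
chartData = pushBounded(chartData, { time: "12:00:04", confidence: 87 }, 10);
```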
## 🎯 Test Case 8: Performance
Objective: Verify system stability during extended use
Steps:
- Activate monitoring
- Let run for 5 minutes continuously
- Trigger 5 incidents during this time
- Monitor browser performance (F12 → Performance tab)
During Test, Verify:
- ✅ No browser crashes
- ✅ No UI freezing
- ✅ Memory usage stable (< 500 MB)
- ✅ CPU usage reasonable (< 50% average)
- ✅ All incidents logged correctly
- ✅ Chart doesn't overflow/break
- ✅ Event log doesn't cause lag
After 5 Minutes:
- Check total scans: should be ~75 (5 min = 300 s at 1 scan per 4 s)
- Check UI responsiveness: Should still be smooth
- Check console: No memory leak warnings
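For the memory figure, Chrome exposes a non-standard `performance.memory` object (absent in Firefox) that can be read from the console. Note it reports only the JS heap, which is a subset of the total browser memory budget above:
```typescript
// Chrome-only, non-standard API - spot-check the JS heap from the console.
const mem = (performance as unknown as { memory?: { usedJSHeapSize: number } }).memory;
if (mem) {
  console.log(`JS heap: ${(mem.usedJSHeapSize / 1048576).toFixed(1)} MB`);
} else {
  console.log("performance.memory not available in this browser");
}
```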
Pass Criteria: Runs smoothly for 5 minutes
## 🎯 Test Case 9: Error Handling
Objective: Verify graceful error handling
Scenario A: Invalid API Key
Steps:
- Edit `frontend\.env.local`
- Change the API key to:
  ```
  VITE_GEMINI_API_KEY=invalid_key_123
  ```
- Restart the frontend with `npm run dev`
- Activate monitoring
- Wait for analysis
Expected:
- ✅ Error in console: "API error" or "Invalid key"
- ✅ UI still functional (no crash)
- ✅ Error message shown in Latest Inference
- ✅ System continues trying
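The "system continues trying" behavior usually comes down to catching the failure on each cycle rather than letting it break the polling loop. A hypothetical sketch, reusing the `InferenceResult` shape inferred in Test Case 4 (none of these names come from the actual codebase):
```typescript
// Hypothetical per-cycle error handling - catch, surface the error, and let
// the next interval tick retry instead of crashing the UI.
async function analyzeSafely(
  analyze: () => Promise<InferenceResult>
): Promise<InferenceResult> {
  try {
    return await analyze();
  } catch (err) {
    console.error("API error:", err); // visible in the console, as expected above
    return {
      incident: false,
      type: "normal",
      severity: "low",
      confidence: 0,
      reasoning: `Analysis failed: ${(err as Error).message}`, // shown in Latest Inference
      timestamp: new Date().toLocaleTimeString(),
    };
  }
}
```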
Scenario B: Camera Disconnected
Steps:
- Start with camera working
- Physically cover/disconnect webcam
- Observe behavior
Expected:
- ✅ Error message in video feed area
- ✅ Option to retry/refresh
- ✅ No crash
Pass Criteria: Errors handled gracefully
## 🎯 Test Case 10: Browser Compatibility
Objective: Verify cross-browser support
Browsers to Test:
- Chrome (primary)
- Edge
- Firefox
For Each Browser:
- Open http://localhost:3000
- Allow camera
- Activate monitoring
- Trigger 1 incident
- Verify all UI elements work
Expected:
- ✅ Chrome: full support (reference)
- ✅ Edge: full support
- ✅ Firefox: full support (may need different camera permissions)
Known Issues:
- Safari: May have camera issues (not primary target)
- Mobile: Not optimized (desktop-first design)
Pass Criteria: Works in Chrome, Edge, Firefox
## 🎯 Test Case 11: Docker Deployment
Objective: Verify Docker setup works
Steps:
- Ensure Docker Desktop is running
- From the project root, run `docker-compose build`
- Wait for the build (3-5 minutes)
- Run `docker-compose up`
- Wait for startup
- Open http://localhost:3000
Expected:
- ✅ Build completes without errors
- ✅ Both containers start: `aegisai-backend` and `aegisai-frontend`
- ✅ Frontend accessible at :3000
- ✅ Backend accessible at :8000
- ✅ All features work the same as locally
Test API:
```powershell
curl http://localhost:8000/health
curl http://localhost:8000/api/stats
```
Pass Criteria: Docker deployment functional
## 📋 Test Report Template
After running all tests, fill out:
```markdown
# Test Report - AegisAI v2.5.0
**Date**: _____________
**Tester**: _____________
**Environment**: Local / Docker / Cloud

## Test Results
| # | Test Case | Result | Notes |
|---|-----------|--------|-------|
| 1 | First Run | ✅ / ❌ | |
| 2 | Camera Access | ✅ / ❌ | |
| 3 | Monitoring | ✅ / ❌ | |
| 4 | Normal Detection | ✅ / ❌ | |
| 5 | Threat: Gun | ✅ / ❌ | Detected in X seconds |
| 6 | Threat: Suspicious | ✅ / ❌ | X/2 scenarios |
| 7 | Dashboard Updates | ✅ / ❌ | |
| 8 | Performance | ✅ / ❌ | Memory: XXX MB |
| 9 | Error Handling | ✅ / ❌ | |
| 10 | Browser Compat | ✅ / ❌ | Chrome/Edge/Firefox |
| 11 | Docker | ✅ / ❌ | |

## Summary
**Passed**: __/11
**Failed**: __/11

**Critical Issues**:
-

**Minor Issues**:
-

**Recommendation**:
[ ] Approved for deployment
[ ] Needs fixes
```
## ✅ Acceptance Criteria
To pass the testing phase:
- Minimum 9/11 tests must pass
- No critical failures (Test 1, 2, 3, 5 must pass)
- Performance acceptable
- No crashes during normal use
Ready to test! 🧪