Complete testing guide for AegisAI backend services.
```bash
# Navigate to backend
cd backend

# Activate virtual environment
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install test dependencies
pip install pytest pytest-asyncio pytest-cov httpx
```
```bash
# Run all tests
pytest

# Run with an HTML coverage report
pytest --cov --cov-report=html

# Run individual suites verbosely
pytest tests/test_agents.py -v
pytest tests/test_services.py -v
pytest tests/test_api.py -v
```
```bash
# Unit tests only
pytest -m unit

# Integration tests
pytest -m integration

# API tests
pytest -m api
```
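The `-m` filters above only select tests if the markers are registered. A minimal `pytest.ini` sketch, assuming the marker names used above (adjust to the project's actual configuration):

```ini
# pytest.ini (illustrative sketch, not the project's actual file)
[pytest]
testpaths = tests
markers =
    unit: fast, isolated unit tests
    integration: tests that touch the database or external services
    api: HTTP endpoint tests
# Lets pytest-asyncio run async tests without per-test decorators
asyncio_mode = auto
```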
```
backend/
├── tests/
│   ├── __init__.py
│   ├── test_agents.py     # AI agent tests
│   ├── test_services.py   # Service layer tests
│   ├── test_api.py        # API endpoint tests
│   └── conftest.py        # Shared fixtures (optional)
├── pytest.ini             # Pytest configuration
└── .coveragerc            # Coverage configuration
```
### Agent Tests (`test_agents.py`)

Run:
```bash
pytest tests/test_agents.py -v
```
Expected Output:
```
tests/test_agents.py::TestVisionAgent::test_agent_initialization PASSED
tests/test_agents.py::TestVisionAgent::test_analyze_frame_with_numpy PASSED
tests/test_agents.py::TestVisionAgent::test_validate_result PASSED
...
==================== 15 passed in 12.34s ====================
```
### Service Tests (`test_services.py`)

Run:
```bash
pytest tests/test_services.py -v
```
Expected Output:
```
tests/test_services.py::TestDatabaseService::test_save_incident PASSED
tests/test_services.py::TestDatabaseService::test_get_statistics PASSED
...
==================== 12 passed in 5.67s ====================
```
### API Tests (`test_api.py`)

Run:
```bash
pytest tests/test_api.py -v
```
Expected Output:
```
tests/test_api.py::TestRootEndpoints::test_root PASSED
tests/test_api.py::TestIncidentEndpoints::test_get_incidents PASSED
...
==================== 18 passed in 8.91s ====================
```
Steps:
```bash
cd backend
source venv/bin/activate
python main.py
```
Expected Output:
```
INFO: ✅ AegisAI Backend Ready
INFO: 📡 API: http://0.0.0.0:8000
INFO: 📚 Docs: http://0.0.0.0:8000/docs
INFO: Uvicorn running on http://0.0.0.0:8000
```
Verification:
```bash
curl http://localhost:8000/
# Should return: {"name":"AegisAI","version":"2.5.0",...}
```
✅ Pass: Server starts without errors
```bash
curl http://localhost:8000/api/health
```
Expected Response:
```json
{
  "status": "healthy",
  "components": {
    "database": "ok",
    "vision_agent": "ok",
    "planner_agent": "ok"
  },
  "timestamp": "2024-XX-XXTXX:XX:XX"
}
```
✅ Pass: All components healthy
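To script this check instead of reading the JSON by eye, a small stdlib-only sketch that verifies every component reports "ok" (the payload shape is taken from the response above; the function names are illustrative):

```python
import json
import urllib.request


def all_healthy(payload: dict) -> bool:
    """True when status is 'healthy' and every listed component reports 'ok'."""
    return (payload.get("status") == "healthy"
            and all(v == "ok" for v in payload.get("components", {}).values()))


def check_health(url: str = "http://localhost:8000/api/health") -> bool:
    """Fetch the health endpoint and evaluate it (requires a running server)."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return all_healthy(json.load(resp))
```

`all_healthy` is pure logic and can be unit-tested offline; `check_health` needs the backend running.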
Visit: http://localhost:8000/docs

Expected endpoints:
- `/`
- `/health`
- `/api/incidents`
- `/api/incidents/{id}`
- `/api/incidents/{id}/status`
- `/api/stats`
- `/api/agents/stats`
- `/api/incidents/cleanup`

✅ Pass: Docs accessible and complete
```python
# Start a Python shell from backend/ (python)
>>> from services.database_service import db_service
>>> from datetime import datetime
>>>
>>> incident_id = db_service.save_incident({
...     'timestamp': datetime.now().isoformat(),
...     'type': 'test',
...     'severity': 'medium',
...     'confidence': 80,
...     'reasoning': 'Manual test',
...     'subjects': [],
...     'evidence_path': '',
...     'response_plan': []
... })
>>>
>>> print(f"Created incident ID: {incident_id}")
>>> exit()
```
Verify via API:
```bash
curl http://localhost:8000/api/incidents
```
✅ Pass: Incident appears in response
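Typos in the hand-written payload above are easy to make; a hedged validation sketch that checks the payload before calling `save_incident` (the required-field list mirrors the example payload and is an assumption, not the service's actual schema):

```python
# Assumed schema, inferred from the manual-test payload above
REQUIRED_FIELDS = {
    'timestamp': str, 'type': str, 'severity': str, 'confidence': int,
    'reasoning': str, 'subjects': list, 'evidence_path': str,
    'response_plan': list,
}


def validate_incident(payload: dict) -> list:
    """Return a list of problems; an empty list means the payload looks valid."""
    problems = []
    for field, expected in REQUIRED_FIELDS.items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            problems.append(f"{field}: expected {expected.__name__}")
    if payload.get('severity') not in (None, 'low', 'medium', 'high'):
        problems.append("severity must be low/medium/high")
    return problems
```

Call `validate_incident(payload)` in the shell before `db_service.save_incident(payload)` and only save when it returns an empty list.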
```bash
cd backend
source venv/bin/activate
python -m services.video_processor demo
```
Expected Output:
```
🎬 Starting AegisAI Demo Scenario

=== SCENARIO: Normal Activity ===
✅ No incident detected - Normal operation

=== SCENARIO: Suspicious Behavior ===
⚠️ Detected: suspicious_behavior
   Severity: medium
   Confidence: 82%

=== SCENARIO: Critical Threat ===
⚠️ Detected: violence
   Severity: high
   Response plan executed: 5 actions

✅ Demo sequence completed
```
✅ Pass: All scenarios execute successfully
```bash
cd backend
python
```
```python
>>> from agents.vision_agent import VisionAgent
>>> import numpy as np
>>> import asyncio
>>>
>>> agent = VisionAgent()
>>> frame = np.zeros((480, 640, 3), dtype=np.uint8)
>>>
>>> # Test multiple analyses
>>> async def test_performance():
...     for i in range(10):
...         result = await agent.process(frame=frame, frame_number=i)
...         print(f"Frame {i}: {result['type']}")
...     stats = agent.get_stats()
...     print(f"\nStats: {stats}")
>>>
>>> asyncio.run(test_performance())
```
Expected:
- `total_calls: 10`
- `total_errors: 0`
- `avg_response_time: < 3.0 seconds`

✅ Pass: Consistent performance
```python
# Create incident (in a Python shell)
>>> from services.database_service import db_service
>>> incident_id = db_service.save_incident({
...     'timestamp': '2024-01-01T12:00:00',
...     'type': 'persistence_test',
...     'severity': 'low',
...     'confidence': 75,
...     'reasoning': 'Test persistence',
...     'subjects': [],
...     'evidence_path': '',
...     'response_plan': []
... })
>>> exit()
```
```bash
# Restart server
# (Press Ctrl+C, then `python main.py`)

# Verify incident still exists
curl http://localhost:8000/api/incidents
```
✅ Pass: Incident persists across restarts
| Metric | Target | Acceptable |
|---|---|---|
| API Response Time | < 100ms | < 500ms |
| Frame Analysis | < 2s | < 5s |
| Database Query | < 50ms | < 200ms |
| Memory Usage | < 300MB | < 500MB |
| CPU Usage | < 20% | < 40% |
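The targets above can be checked programmatically rather than by eyeballing `time curl`. A stdlib sketch that times a callable and grades the median latency against the table (the thresholds come from the API Response Time row; the function names are illustrative):

```python
import statistics
import time


def median_latency_ms(fn, runs: int = 20) -> float:
    """Call fn() repeatedly and return the median latency in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)


def grade(latency_ms: float) -> str:
    """Grade a latency against the targets table: <100ms target, <500ms acceptable."""
    if latency_ms < 100:
        return "target"
    return "acceptable" if latency_ms < 500 else "too slow"
```

Pass a closure that performs the actual request, e.g. `median_latency_ms(lambda: urllib.request.urlopen("http://localhost:8000/api/incidents").read())`.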
```bash
# API response time
time curl http://localhost:8000/api/incidents

# Memory usage
ps aux | grep python
# Look at RSS (Resident Set Size)

# Load test with Apache Bench
ab -n 100 -c 10 http://localhost:8000/api/stats
```
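If Apache Bench is not installed, a rough stdlib equivalent of `ab -n 100 -c 10` using a thread pool (the URL and counts mirror the command above; this is a sketch, not a benchmarking tool):

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor


def load_test(url: str, requests: int = 100, concurrency: int = 10):
    """Fire `requests` GETs with `concurrency` workers.

    Returns (succeeded, failed, elapsed_seconds).
    """
    def hit(_):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status == 200
        except OSError:  # URLError (refused, timeout, DNS) subclasses OSError
            return False

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(hit, range(requests)))
    elapsed = time.perf_counter() - start
    return sum(results), results.count(False), elapsed
```

Usage against a running backend: `load_test("http://localhost:8000/api/stats")`.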
**Problem: pytest can't find project modules**

Solution:
```bash
# Make sure you're running from the backend/ directory
cd backend
python -m pytest tests/
```
**Problem: stale or locked test database**

Solution:
```bash
# Stop any running instances
pkill -f "python main.py"

# Delete test databases
rm test_*.db
```
**Problem: import errors (`ModuleNotFoundError`)**

Solution:
```bash
# Ensure __init__.py files exist
ls backend/__init__.py
ls backend/agents/__init__.py
ls backend/services/__init__.py
```
**Problem: Gemini API errors**

Solution:
```bash
# Check API key is set
echo $GEMINI_API_KEY
```
```python
# Test connection manually (in a Python shell)
>>> from agents.vision_agent import VisionAgent
>>> agent = VisionAgent()
>>> # If no errors, API key is valid
```
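A small preflight sketch that fails fast when the key is missing, instead of waiting for the first Gemini call to error (the variable name `GEMINI_API_KEY` is taken from the check above; the function name is illustrative):

```python
import os


def require_api_key(name: str = "GEMINI_API_KEY") -> str:
    """Return the key, or raise a clear error before any agent is constructed."""
    key = os.environ.get(name, "").strip()
    if not key:
        raise RuntimeError(
            f"{name} is not set; export it before starting the backend"
        )
    return key
```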
Before deployment, verify:

- [ ] Agent tests pass (`pytest tests/test_agents.py`)
- [ ] Service tests pass (`pytest tests/test_services.py`)
- [ ] API tests pass (`pytest tests/test_api.py`)

After running tests with coverage:
```bash
pytest --cov --cov-report=html
```
Open `htmlcov/index.html` in a browser.
Target Coverage:
For automated testing:
```yaml
# .github/workflows/backend-tests.yml
name: Backend Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          cd backend
          pip install -r requirements.txt
          pip install pytest pytest-cov

      - name: Run tests
        run: |
          cd backend
          pytest --cov --cov-report=xml

      - name: Upload coverage
        uses: codecov/codecov-action@v3
```
# Backend Test Report
**Date**: YYYY-MM-DD
**Tester**: [Name]
**Environment**: [Local/Docker/CI]
## Unit Tests
- Agents: X/15 passed
- Services: X/12 passed
- API: X/18 passed
## Integration Tests
- E2E Flow: ✅/❌
- Database: ✅/❌
- Video Processor: ✅/❌
## Performance
- API Response: XXX ms
- Frame Analysis: X.X s
- Memory: XXX MB
- CPU: XX%
## Coverage
- Overall: XX%
- Agents: XX%
- Services: XX%
- API: XX%
## Issues
1. [Description]
## Recommendation
- [ ] Ready for deployment
- [ ] Needs fixes
Happy Testing! 🧪