- Without optimization: container restart + Django load = 20-30s of overhead per test run
- With this setup:
  - First run: ~0.3s (runs the tests)
  - Subsequent runs (no changes): ~0.01s (30x faster!)
  - Parallel execution: 2-4x faster on multi-core systems
  - Container + Django stay loaded between runs
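The big win comes from testmon only re-running tests whose dependencies changed. A toy sketch of that selection idea (not testmon's actual implementation; the file names and test-to-file mapping below are made up for illustration):

```python
"""Toy sketch of testmon's core idea: fingerprint the code each test
depends on, and on the next run select only tests whose dependencies'
fingerprints changed. (testmon stores its real data in .testmondata.)"""
import hashlib

def fingerprint(source: str) -> str:
    return hashlib.sha256(source.encode()).hexdigest()

# Pretend these are files on disk...
sources = {
    "corporations/models.py": "class Corporation: ...",
    "corporations/api.py": "def list_corporations(): ...",
}
# ...mapped to the tests that execute them.
deps = {
    "test_models": ["corporations/models.py"],
    "test_api": ["corporations/api.py", "corporations/models.py"],
}

# First run: record a fingerprint per file.
stored = {path: fingerprint(src) for path, src in sources.items()}

# Developer edits api.py only.
sources["corporations/api.py"] = "def list_corporations(): return []"

# Second run: select only tests that touch a changed file.
changed = {p for p, src in sources.items() if fingerprint(src) != stored[p]}
to_run = [t for t, paths in deps.items() if changed & set(paths)]
print(to_run)  # → ['test_api']
```

Only `test_api` is selected because `test_models` touches no changed file; that is the whole reason a no-change rerun drops to ~0.01s.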
Add this service to `docker-compose.yml` after the `backend-tests` service:
```yaml
backend-tests-dev:
  # `*python-dev` references a `&python-dev` anchor assumed to be defined on a base service
  <<: *python-dev
  profiles: ["dev"]
  command: sleep infinity
  stdin_open: true
  tty: true
```

Then:
```shell
# Start the persistent container
docker compose up -d backend-tests-dev

# Install testmon and xdist (for parallel runs) once
docker compose exec backend-tests-dev pip install pytest-testmon pytest-xdist pytest-watch

# Enter the container
docker compose exec backend-tests-dev bash
```

Alternatively, start a long-running container manually:

```shell
docker compose run --name backend-test-dev -d backend-tests sleep infinity

# Install testmon and xdist
docker exec backend-test-dev pip install pytest-testmon pytest-xdist pytest-watch

# Enter the container
docker exec -it backend-test-dev bash
```

Inside the container:

```shell
# Run tests with testmon (keeps track of what changed)
pytest --testmon --reuse-db tests/backend/path/to/tests

# Edit code in your editor...
# Rerun - instantly! Only affected tests run
pytest --testmon --reuse-db tests/backend/path/to/tests

# Run with auto-detected CPU cores
pytest --testmon --reuse-db -n auto tests/backend/path/to/tests

# Or specify the number of workers
pytest --testmon --reuse-db -n 4 tests/backend/path/to/tests

# For unit tests (no DB)
pytest --testmon -n auto tests/unit/path/to/tests
```

Add these aliases to make the commands shorter:
```shell
alias pyt='pytest --testmon --reuse-db'
alias pytp='pytest --testmon --reuse-db -n auto'   # parallel
alias pytu='pytest --testmon -n auto'              # unit tests (no DB)

# Then use:
pyt tests/backend/eshares/corporations/
pytp tests/backend/eshares/corporations/   # parallel version
```

Run this inside the container to see the difference:
```shell
cd /app

echo "════════════════════════════════════════════════════════════"
echo " SPEED COMPARISON: Sequential vs Parallel vs testmon"
echo "════════════════════════════════════════════════════════════"
echo ""

# Clean slate
rm -rf .testmondata

echo "1️⃣ BASELINE: Without testmon, sequential"
time pytest --reuse-db tests/backend/eshares/corporations/ -q --tb=no 2>&1 | tail -1
echo ""

echo "2️⃣ PARALLEL: Without testmon, parallel (-n auto)"
time pytest --reuse-db -n auto tests/backend/eshares/corporations/ -q --tb=no 2>&1 | tail -1
echo ""

echo "3️⃣ WITH TESTMON: First run (collecting data)"
time pytest --testmon --reuse-db tests/backend/eshares/corporations/ -q --tb=no 2>&1 | tail -1
echo ""

echo "4️⃣ WITH TESTMON: Second run (NO changes) ⚡"
time pytest --testmon --reuse-db tests/backend/eshares/corporations/ -q --tb=no 2>&1 | tail -1
echo ""

echo "5️⃣ WITH TESTMON + PARALLEL: Second run (NO changes) ⚡⚡"
time pytest --testmon --reuse-db -n auto tests/backend/eshares/corporations/ -q --tb=no 2>&1 | tail -1
echo ""

echo "════════════════════════════════════════════════════════════"
echo "Result: testmon (0.01s) >> parallel (2-4x) > sequential"
echo "════════════════════════════════════════════════════════════"
```

pytest-watch (`ptw`) watches files and reruns tests automatically:

```shell
# Inside the container
ptw --testmon --reuse-db tests/backend/path/to/tests

# With parallel execution
ptw --testmon --reuse-db tests/backend/path/to/tests -- -n auto
```

| Scenario | Command | Why |
|---|---|---|
| Iterating on code | `pytest --testmon --reuse-db` | Skips unchanged tests |
| First run / large suite | `pytest --reuse-db -n auto` | Parallel = faster |
| Unit tests (no DB) | `pytest --testmon -n auto` | No DB = safe parallel |
| CI / full test run | `pytest -n auto` | Run everything |
| Watch mode | `ptw --testmon --reuse-db` | Auto-rerun on changes |
```shell
# Stop and remove the container
docker compose down backend-tests-dev
# or, if you used the manually started container
docker rm -f backend-test-dev
```

- Container stays running: No Docker startup overhead
- Django stays loaded: No framework initialization overhead
- pytest-testmon: Tracks dependencies, only reruns affected tests
- pytest-xdist: Distributes tests across multiple CPU cores
- --reuse-db: Reuses test database between runs
Combined = Ultra-fast feedback loop for development!
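Of these levers, pytest-xdist is the only one that needs multiple cores: it partitions the collected tests across workers and runs the chunks concurrently. A toy sketch of that partitioning idea (not xdist itself; threads stand in for xdist's separate worker processes to keep the sketch small):

```python
"""Toy sketch of the pytest-xdist idea: split the test list across N
workers and run chunks concurrently. Real xdist spawns worker processes
and schedules tests dynamically; this is only the shape of the idea."""
from concurrent.futures import ThreadPoolExecutor

def run_test(name: str) -> str:
    # Stand-in for actually executing one test.
    return f"{name}: passed"

tests = [f"test_{i}" for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:  # like `-n 4`
    results = list(pool.map(run_test, tests))

print(len(results))  # prints 8 — all tests ran, at most 4 at a time
```

With roughly even test durations and no shared state, wall-clock time scales close to the worker count, which is where the 2-4x figure above comes from.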
- First run slow? That's normal - testmon collects dependency data. Second run will be instant.
- Tests not rerunning after change? testmon tracks bytecode changes. Comments won't trigger reruns.
- Parallel tests failing? Some tests may have race conditions. Use sequential runs for those: `pytest --testmon --reuse-db`
- Clear testmon cache: `rm -rf .testmondata` to start fresh
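The "comments won't trigger reruns" behavior follows from how CPython compiles code: comment-only edits leave a function's bytecode unchanged, and testmon fingerprints code at that level rather than raw text. A quick standalone demo (plain Python, no testmon needed):

```python
"""Comment-only edits produce identical bytecode; real edits don't."""

def bytecode(src: str) -> bytes:
    ns = {}
    exec(compile(src, "<mod>", "exec"), ns)
    return ns["add"].__code__.co_code  # line numbers live elsewhere, not here

original  = "def add(a, b):\n    return a + b\n"
commented = "# NEW COMMENT\ndef add(a, b):  # inline comment\n    return a + b\n"
real_edit = "def add(a, b):\n    return a - b\n"

print(bytecode(original) == bytecode(commented))  # True  -> testmon skips
print(bytecode(original) == bytecode(real_edit))  # False -> testmon reruns
```

So after adding a docstring or comment, an "instant" rerun with no tests selected is correct behavior, not a stale cache.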