Testing Setup and Architecture

This document describes the testing infrastructure for the Chaverim ALPR Platform.


Overview

Testing Philosophy

  • Test in a production-like environment - Use the same database type as production
  • Test early, test often - Catch bugs before they reach production
  • Automated testing - CI/CD runs all tests on every commit
  • Coverage targets - Maintain high coverage on critical paths

Test Types

  1. Unit Tests - Test individual functions/methods in isolation
  2. Integration Tests - Test components working together
  3. End-to-End Tests - Test complete user workflows
  4. Performance Tests - Validate system performance under load
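
A common way to make these categories individually runnable is pytest markers. Below is a minimal sketch; the marker names are illustrative and should be registered in pytest configuration to avoid warnings, and the db_session fixture is introduced later in this document:

import pytest

# `pytest -m unit` selects only unit tests; `pytest -m integration` the rest
@pytest.mark.unit
def test_plate_string_is_normalized():
    # Pure-function check: no database, no network
    assert " abc-123 ".strip().replace("-", "").upper() == "ABC123"

@pytest.mark.integration
def test_event_round_trip(db_session):
    """Exercises the real test database via the db_session fixture."""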

Test Database Setup

Why Use Production Database Type for Testing?

Different databases have different features, constraints, and behaviors. Testing against the same database type as production ensures accuracy.

Examples of database-specific behavior:

  • Foreign key constraints
  • Transaction isolation levels
  • Date/time handling
  • JSON/JSONB operations
  • Full-text search
  • Triggers and stored procedures
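
As a concrete illustration, the sketch below exercises PostgreSQL's JSONB containment operator and would error on SQLite - exactly the class of difference this policy guards against. It assumes the db_session fixture defined later in this document:

from sqlalchemy import text

def test_jsonb_containment(db_session):
    # The @> containment operator is PostgreSQL-specific
    result = db_session.execute(
        text("""SELECT '{"plate": "ABC123", "lane": 2}'::jsonb
                    @> '{"plate": "ABC123"}'::jsonb""")
    ).scalar()
    assert result is True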

Test Database Configuration

Database Type: PostgreSQL 17 (matches production)

Docker Service:

# docker-compose.yml excerpt
db-test:
  image: postgres:17
  environment:
    POSTGRES_USER: testuser
    POSTGRES_PASSWORD: testpass
    POSTGRES_DB: alpr_test
  ports:
    - "${DB_TEST_PORT:-5433}:5432"
  volumes:
    - db_test_data:/var/lib/postgresql/data

Starting Test Database

# Start test database
./scripts/worktree-docker.sh up db-test -d

# Verify it's running
./scripts/worktree-docker.sh ps | grep db-test

# Run migrations on test database
# NOTE: The app container connects to db-test via TEST_DATABASE_URL environment variable
./scripts/worktree-docker.sh exec app alembic upgrade head

Test Database Connection

Configuration:

# .env.test or test configuration
TEST_DATABASE_URL=[connection-string]

Example (.env.test):

# Test database credentials (match docker-compose.yml db-test service)

# PostgreSQL test database
TEST_DATABASE_URL=postgresql://testuser:testpass@db-test:5432/alpr_test
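
A minimal sketch of wiring this variable into the SQLAlchemy engine and session factory used by the fixtures later in this document (the file location and names are illustrative):

# tests/conftest.py
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Fail fast if the variable is missing, rather than at the first query
TEST_DATABASE_URL = os.environ["TEST_DATABASE_URL"]

engine = create_engine(TEST_DATABASE_URL)
SessionLocal = sessionmaker(bind=engine)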


Running Tests

Backend Tests

Framework: pytest

Run all tests:

./scripts/worktree-docker.sh exec app pytest

Run specific test file:

./scripts/worktree-docker.sh exec app pytest tests/test_events.py

Run with coverage:

./scripts/worktree-docker.sh exec app pytest --cov=src --cov-report=html

Frontend Tests

Note: This project uses server-side rendering with HTMX + Alpine.js + Jinja2 templates. Frontend testing is primarily done via API integration tests and browser-based E2E tests rather than traditional JavaScript unit tests.

End-to-End Tests

Framework: Playwright (when implemented)

Run E2E tests:

./scripts/worktree-docker.sh exec app pytest tests/e2e/

Alpine.js Testing Strategy: Alpine.js components are tested via E2E tests with Playwright rather than JavaScript unit tests. This approach:

  • Tests actual user interactions in a real browser environment
  • Validates the HTMX + Alpine.js integration as a whole
  • Avoids maintaining a separate JavaScript test infrastructure
  • Matches the server-rendered architecture of this project
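
A hedged sketch of such a test, assuming pytest-playwright supplies the page fixture; the route, selectors, and E2E_BASE_URL variable are illustrative:

# tests/e2e/test_event_search.py
import os

BASE_URL = os.environ.get("E2E_BASE_URL", "http://localhost:8000")

def test_plate_search_swaps_results(page):
    page.goto(f"{BASE_URL}/events")
    # Typing triggers an HTMX request; Alpine.js manages the input state
    page.fill("#plate-search", "ABC123")
    # Wait for the HTMX swap to land in the results container
    page.wait_for_selector("#results .event-row")
    assert page.locator("#results .event-row").count() >= 1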


Test Organization

Directory Structure

backend/
├── src/
│   ├── services/
│   │   └── user.py
│   └── models/
│       └── user.py
└── tests/
    ├── unit/
    │   └── services/
    │       └── test_user.py
    ├── integration/
    │   └── test_user_api.py
    └── fixtures/
        └── user_fixtures.py

frontend/   (generic layout for a JavaScript frontend; this project is server-rendered - see Frontend Tests above)
├── src/
│   ├── components/
│   │   └── UserProfile.tsx
│   └── services/
│       └── userService.ts
└── tests/
    ├── components/
    │   └── UserProfile.test.tsx
    ├── services/
    │   └── userService.test.ts
    └── fixtures/
        └── userFixtures.ts

Test File Naming

Backend:

  • test_[module_name].py (pytest convention)

Frontend:

  • [ComponentName].test.tsx
  • [serviceName].test.ts (Jest convention)


Test Fixtures and Factories

What are Fixtures?

Fixtures provide reusable test data, database setup, and teardown logic.

Creating Fixtures

Backend Example (pytest):

# tests/fixtures/user_fixtures.py
import pytest

from src.database import SessionLocal  # illustrative; or the factory from tests/conftest.py
from src.models import User

@pytest.fixture
def sample_user(db_session):
    """Create a sample user for testing."""
    user = User(
        email="test@example.com",
        name="Test User",
        password_hash="hashed_password"
    )
    db_session.add(user)
    db_session.commit()
    return user

@pytest.fixture
def db_session():
    """Provide a database session with automatic rollback."""
    # Setup: Create session
    session = SessionLocal()
    yield session
    # Teardown: Rollback changes
    session.rollback()
    session.close()
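
The "factories" half of this section needs no extra library; a plain-Python sketch with illustrative field defaults:

# tests/fixtures/user_factory.py
from itertools import count

from src.models import User

_seq = count(1)

def make_user(**overrides):
    """Build an unsaved User with unique defaults; override any field."""
    n = next(_seq)
    fields = dict(
        email=f"user{n}@example.com",
        name=f"Test User {n}",
        password_hash="hashed_password",
    )
    fields.update(overrides)
    return User(**fields)

Tests can then create several distinct users via make_user() or make_user(name="Alice") without colliding on unique columns such as email.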

Frontend Example (Jest):

// tests/fixtures/userFixtures.ts
export const mockUser = {
    id: 1,
    email: "test@example.com",
    name: "Test User"
};

export const mockUserResponse = {
    data: mockUser,
    meta: { timestamp: "2025-01-15T12:00:00Z" }
};

Using Fixtures

Backend:

def test_user_creation(sample_user):
    assert sample_user.email == "test@example.com"
    assert sample_user.name == "Test User"

Frontend:

import { render, screen } from '@testing-library/react';

import { UserProfile } from '../components/UserProfile'; // path illustrative
import { mockUser } from '../fixtures/userFixtures';

test('renders user profile', () => {
    render(<UserProfile user={mockUser} />);
    expect(screen.getByText('Test User')).toBeInTheDocument();
});


Database Transaction Handling

Pattern: Rollback After Each Test

Why: Keep tests isolated and prevent test data pollution.

Backend Example (pytest with SQLAlchemy):

import pytest

# `engine` and `SessionLocal` are the objects built in tests/conftest.py

@pytest.fixture(scope='function')
def db_session():
    """Create a new database session with a rollback for each test."""
    connection = engine.connect()
    transaction = connection.begin()
    session = SessionLocal(bind=connection)

    yield session

    session.close()
    transaction.rollback()
    connection.close()
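
One caveat: if the code under test calls session.commit(), the pattern above may no longer isolate the test. SQLAlchemy 2.0's documented recipe for joining an external transaction routes commits into SAVEPOINTs instead; a sketch, assuming SQLAlchemy 2.0:

@pytest.fixture(scope='function')
def db_session():
    """Session whose commits land in a SAVEPOINT, discarded at teardown."""
    connection = engine.connect()
    transaction = connection.begin()
    # session.commit() releases a savepoint instead of committing the
    # outer transaction
    session = SessionLocal(
        bind=connection, join_transaction_mode="create_savepoint"
    )

    yield session

    session.close()
    transaction.rollback()  # discards everything, including "committed" work
    connection.close()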

Alternative: Truncate Tables

from sqlalchemy import text

@pytest.fixture(autouse=True)
def cleanup_database(db_session):
    """Clean up database after each test."""
    yield
    # After the test completes; raw SQL must be wrapped in text() (SQLAlchemy 2.0)
    db_session.execute(text("TRUNCATE users, posts CASCADE"))
    db_session.commit()


Mocking and Stubbing

When to Mock

Mock external services:

  • Payment processors
  • Email services
  • Third-party APIs
  • File storage services

Mock for unit tests:

  • Database calls (when testing business logic)
  • External dependencies

Don't mock:

  • Your own code in integration tests
  • The database in integration tests
  • Framework internals (usually)

Mock Examples

Backend (pytest with unittest.mock):

from unittest.mock import patch

from src.services.notifications import notify_user  # import path is illustrative

def test_send_email(sample_user):
    # Patch send_email where notify_user looks it up
    with patch('src.services.email.send_email') as mock_send:
        mock_send.return_value = True

        result = notify_user(sample_user)

        assert result is True
        mock_send.assert_called_once_with(
            to=sample_user.email,
            subject="Welcome",
            body="Welcome to our app!"
        )

Frontend (Jest):

import { render, screen, waitFor } from '@testing-library/react';

import { UserProfile } from '../components/UserProfile'; // path illustrative
import { fetchUser } from '../services/userService';
import { mockUser } from '../fixtures/userFixtures';

jest.mock('../services/userService');
const mockFetchUser = fetchUser as jest.MockedFunction<typeof fetchUser>;

test('loads user data', async () => {
    mockFetchUser.mockResolvedValue(mockUser);

    render(<UserProfile userId={1} />);

    await waitFor(() => {
        expect(screen.getByText('Test User')).toBeInTheDocument();
    });
});


Continuous Integration

CI/CD Pipeline

Steps:

  1. Checkout code
  2. Build containers
  3. Start test database
  4. Run migrations
  5. Run linter
  6. Run tests
  7. Generate coverage report
  8. Build production images (if tests pass)

Example CI Configuration

GitHub Actions:

name: Test Suite

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest

    # Provided for pipelines that run pytest directly on the runner;
    # redundant when the compose stack below supplies db-test
    services:
      test_db:
        image: postgres:17
        env:
          POSTGRES_USER: testuser
          POSTGRES_PASSWORD: testpass
          POSTGRES_DB: alpr_test
        ports:
          - 5432:5432

    steps:
      - uses: actions/checkout@v3

      - name: Run tests
        run: |
          docker compose up -d
          docker compose exec -T app pytest --cov

      - name: Upload coverage
        uses: codecov/codecov-action@v3


Coverage Requirements

Targets

  • Critical paths: 100% (authentication, payment, data integrity)
  • Business logic: 90%+
  • Utilities: 80%+
  • UI components: 70%+

Viewing Coverage

Backend:

# Generate HTML coverage report
./scripts/worktree-docker.sh exec app pytest --cov=src --cov-report=html

# View report (generated in htmlcov/)
open htmlcov/index.html

Frontend (only applicable if a JavaScript test suite is added; see the note under Frontend Tests):

# Generate coverage report
./scripts/worktree-docker.sh exec frontend npm test -- --coverage

# View report
open coverage/index.html


Troubleshooting

Issue: Tests Can't Connect to Database

Symptoms: Connection refused, timeout errors

Solutions:

  1. Verify db-test is running: ./scripts/worktree-docker.sh ps
  2. Check network configuration in docker-compose.yml
  3. Verify TEST_DATABASE_URL in configuration
  4. Check database logs: ./scripts/worktree-docker.sh logs db-test
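
If the basics look right, a quick probe from inside the app container can confirm connectivity directly (a sketch; run it as a scratch script):

import os

from sqlalchemy import create_engine, text

# Prints the server version if the DSN, network, and credentials all work
engine = create_engine(os.environ["TEST_DATABASE_URL"])
with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())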

Issue: Tests Pass Locally but Fail in CI

Common Causes:

  • Environment variable differences
  • Timing issues (tests run faster or slower in CI)
  • Database version mismatch
  • Dependency version differences

Solutions:

  1. Make the CI environment match local
  2. Add waits/retries for async operations
  3. Pin dependency versions
  4. Use the same database version locally and in CI

Issue: Slow Test Suite

Solutions:

  1. Run tests in parallel (e.g., with pytest-xdist)
  2. Use an in-memory test database (if supported)
  3. Mock external services
  4. Optimize fixtures (session vs function scope; see the sketch below)
  5. Split the test suite (unit vs integration)
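
For point 4, the biggest win is usually moving expensive one-time setup to session scope while keeping per-test isolation at function scope; a sketch:

import os

import pytest
from sqlalchemy import create_engine

@pytest.fixture(scope="session")
def engine():
    """Built once per test run instead of once per test."""
    return create_engine(os.environ["TEST_DATABASE_URL"])

The function-scoped db_session fixture shown earlier can then depend on this session-scoped engine, leaving only cheap connect/rollback work per test.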


Best Practices

Do's

  ✅ Isolate tests (each test independent)
  ✅ Use descriptive test names
  ✅ Test one thing per test
  ✅ Use fixtures for setup/teardown
  ✅ Mock external dependencies
  ✅ Test error cases, not just the happy path
  ✅ Keep tests fast

Don'ts

  ❌ Don't share state between tests
  ❌ Don't test framework code
  ❌ Don't use the production database for tests
  ❌ Don't skip tests (fix or delete them)
  ❌ Don't test implementation details
  ❌ Don't write tests without assertions


Last Updated: 2025-12-29
Test Database Version: PostgreSQL 17
Coverage: TBD - update this field once tests are written and coverage reporting is configured.