Calejo Control Adapter Test Suite

This directory contains comprehensive tests for the Calejo Control Adapter system, following idiomatic Python testing practices.

Test Organization

Directory Structure

tests/
├── unit/                    # Unit tests (fast, isolated)
│   ├── test_database_client.py
│   ├── test_auto_discovery.py
│   ├── test_safety_framework.py
│   └── test_configuration.py
├── integration/             # Integration tests (require external services)
│   └── test_phase1_integration.py
├── fixtures/               # Test data and fixtures
├── conftest.py            # Pytest configuration and shared fixtures
├── test_phase1.py         # Legacy Phase 1 test script
├── test_safety.py         # Legacy safety tests
└── README.md              # This file

Test Categories

  • Unit Tests: Fast tests that don't require external dependencies
  • Integration Tests: Tests that require database or other external services
  • Database Tests: Tests marked with @pytest.mark.database
  • Safety Tests: Tests for safety framework components
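
A test opts into these categories with pytest markers. A minimal sketch (the test name and values are hypothetical, not taken from the real suite):

```python
import pytest

# Hypothetical test combining the category markers described above.
@pytest.mark.database
@pytest.mark.safety
def test_emergency_stop_limit_is_enforced():
    """Illustrative only: a safety test that also touches the database."""
    max_speed = 100  # stand-in value for a configured safety limit
    assert max_speed <= 100
```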

Running Tests

Using the Test Runner

# Run all tests
./run_tests.py

# Run unit tests only
./run_tests.py --type unit

# Run integration tests only
./run_tests.py --type integration

# Run with coverage
./run_tests.py --coverage

# Run quick tests (unit tests without database)
./run_tests.py --quick

# Run tests with specific markers
./run_tests.py --marker safety --marker database

# Verbose output
./run_tests.py --verbose

Using Pytest Directly

# Run all tests
pytest

# Run unit tests
pytest tests/unit/

# Run integration tests
pytest tests/integration/

# Run tests with specific markers
pytest -m "safety and database"

# Run tests excluding specific markers
pytest -m "not database"

# Run with coverage
pytest --cov=src --cov-report=html

Test Configuration

Pytest Configuration

Configuration is in pytest.ini at the project root:

  • Test Discovery: Files matching test_*.py, classes starting with Test*, methods starting with test_*
  • Markers: Predefined markers for different test types
  • Coverage: Minimum 80% coverage required
  • Async Support: Auto-mode for async tests
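
The options above would map to pytest.ini entries roughly like this (marker names and paths are illustrative; the project's actual file is authoritative):

```ini
[pytest]
testpaths = tests
python_files = test_*.py
python_classes = Test*
python_functions = test_*
markers =
    database: tests that require the test database
    integration: tests that require external services
    safety: tests for safety framework components
addopts = --cov=src --cov-fail-under=80
asyncio_mode = auto
```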

Test Fixtures

Shared fixtures are defined in tests/conftest.py:

  • test_db_client: Database client for integration tests
  • mock_pump_data: Mock pump data
  • mock_safety_limits: Mock safety limits
  • mock_station_data: Mock station data
  • mock_pump_plan: Mock pump plan
  • mock_feedback_data: Mock feedback data

Writing Tests

Unit Test Guidelines

  1. Isolation: Mock external dependencies
  2. Speed: Tests should run quickly
  3. Readability: Clear test names and assertions
  4. Coverage: Test both success and failure cases

Example:

import pytest
from unittest.mock import Mock, patch

# Inside a test class, e.g. TestDatabaseClient
@pytest.mark.asyncio
async def test_database_connection_success(self, db_client):
    """Test successful database connection."""
    # Patch the pool class so no real database is contacted
    with patch('psycopg2.pool.ThreadedConnectionPool') as mock_pool:
        mock_pool_instance = Mock()
        mock_pool.return_value = mock_pool_instance

        await db_client.connect()

        assert db_client.connection_pool is not None
        mock_pool.assert_called_once()

Integration Test Guidelines

  1. Markers: Use @pytest.mark.integration and @pytest.mark.database
  2. Setup: Use fixtures for test data setup
  3. Cleanup: Ensure proper cleanup after tests
  4. Realistic: Test with realistic data and scenarios

Example:

@pytest.mark.integration
@pytest.mark.database
class TestPhase1Integration:
    @pytest.mark.asyncio
    async def test_database_connection_integration(self, integration_db_client):
        """Test database connection and basic operations."""
        assert integration_db_client.health_check() is True

Test Data

Mock Data

Mock data is provided through fixtures for consistent testing:

  • Pump Data: Complete pump configuration
  • Safety Limits: Safety constraints and limits
  • Station Data: Pump station metadata
  • Pump Plans: Optimization plans from Calejo Optimize
  • Feedback Data: Real-time pump feedback
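
For instance, a mock pump plan could be sketched as a plain dict (all keys and values here are hypothetical stand-ins, not the real Calejo Optimize schema):

```python
from datetime import datetime, timedelta, timezone

def mock_pump_plan():
    """Illustrative optimization plan: hourly speed setpoints for one pump."""
    start = datetime(2025, 1, 1, tzinfo=timezone.utc)
    return {
        "pump_id": "P-001",  # hypothetical identifier
        "setpoints": [
            {"time": start + timedelta(hours=h), "speed_rpm": 900 + 50 * h}
            for h in range(3)
        ],
    }
```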

Database Test Data

Integration tests use the test database with predefined data:

  • Multiple pump stations with different configurations
  • Various pump types and control methods
  • Safety limits for different scenarios
  • Historical pump plans and feedback

Continuous Integration

Test Execution in CI

  1. Unit Tests: Run on every commit
  2. Integration Tests: Run on main branch and PRs
  3. Coverage: Enforce minimum 80% coverage
  4. Safety Tests: Required for safety-critical components

Environment Setup

Integration tests require:

  • PostgreSQL database with test schema
  • Test database user with appropriate permissions
  • Environment variables for database connection

Best Practices

Test Naming

  • Files: test_<module_name>.py
  • Classes: Test<ClassName>
  • Methods: test_<scenario>_<expected_behavior>

Assertions

  • Use descriptive assertion messages
  • Test both positive and negative cases
  • Verify side effects when appropriate
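
For example, a descriptive message turns a bare comparison into an actionable failure report (function name and values are illustrative):

```python
def check_speed_within_limit(actual_rpm, limit_rpm):
    """Assert with a message that states both values on failure."""
    assert actual_rpm <= limit_rpm, (
        f"pump speed {actual_rpm} rpm exceeds safety limit {limit_rpm} rpm"
    )

check_speed_within_limit(950, 1450)  # passes silently
```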

Async Testing

  • Use @pytest.mark.asyncio for async tests
  • Use pytest_asyncio.fixture for async fixtures
  • Handle async context managers properly
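
Async context managers should be entered and exited inside the running event loop. A stdlib-only sketch of the pattern (FakeConnection is a hypothetical stand-in for something like an async database connection):

```python
import asyncio

class FakeConnection:
    """Stand-in async context manager, mimicking e.g. a DB connection."""
    async def __aenter__(self):
        self.open = True
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.open = False

async def use_connection():
    # Enter and exit the context inside the running loop
    async with FakeConnection() as conn:
        assert conn.open  # inside the context the connection is open
        return "ok"

result = asyncio.run(use_connection())
```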

Mocking

  • Mock external dependencies
  • Use unittest.mock.patch for module-level mocking
  • Verify mock interactions when necessary
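
A sketch of module-level patching with unittest.mock; the patched target here, time.time, is just an example, not something the real suite patches:

```python
from unittest.mock import patch

def current_timestamp():
    import time
    return time.time()

def test_timestamp_is_patched():
    # Replace time.time at module level for the duration of the block
    with patch("time.time", return_value=1234.0) as mock_time:
        assert current_timestamp() == 1234.0
        mock_time.assert_called_once()  # verify the interaction

test_timestamp_is_patched()
```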

Troubleshooting

Common Issues

  1. Database Connection: Ensure test database is running
  2. Async Tests: Use proper async fixtures and markers
  3. Import Errors: Check PYTHONPATH and module structure
  4. Mock Issues: Verify mock setup and teardown

Debugging

  • Use pytest -v for verbose output
  • Use pytest --pdb to drop into debugger on failure
  • Check test logs for additional information

Coverage Reports

Coverage reports are generated in HTML format:

  • Location: htmlcov/index.html
  • Requirements: Run with --coverage flag
  • Minimum: 80% coverage enforced

Run pytest --cov=src --cov-report=html and open htmlcov/index.html in a browser.