# Testing Guide

This guide covers testing practices for Crypto Trader, including backend API testing, integration testing, and end-to-end testing.

## Test Structure

Tests are organized to mirror the source code structure:

```
tests/
├── unit/              # Unit tests
│   ├── backend/       # Backend API tests
│   ├── core/          # Core module tests
│   └── ...
├── integration/       # Integration tests
├── e2e/               # End-to-end tests
├── fixtures/          # Test fixtures
├── utils/             # Test utilities
└── performance/       # Performance benchmarks
```
## Running Tests

### All Tests

```bash
pytest
```

### With Coverage

```bash
pytest --cov=src --cov-report=html
```

### Specific Test File

```bash
pytest tests/unit/core/test_config.py
```

### Specific Test

```bash
pytest tests/unit/core/test_config.py::test_config_loading
```

### Verbose Output

```bash
pytest -v
```

### Test Categories

```bash
# Unit tests only
pytest -m unit

# Integration tests only
pytest -m integration

# End-to-end tests only
pytest -m e2e
```
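For `pytest -m <marker>` to work without `PytestUnknownMarkWarning`, the `unit`, `integration`, and `e2e` markers need to be registered. A minimal sketch of doing that in `conftest.py`, assuming the markers are not already declared in `pytest.ini`:

```python
# conftest.py -- register the custom markers used with `pytest -m ...`
def pytest_configure(config):
    config.addinivalue_line("markers", "unit: fast, isolated unit tests")
    config.addinivalue_line("markers", "integration: component interaction tests")
    config.addinivalue_line("markers", "e2e: end-to-end tests")
```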
## Writing Tests

### Unit Tests

Test individual functions and classes in isolation:

```python
import pytest
from unittest.mock import Mock, patch

from src.core.config import get_config


class TestConfig:
    """Tests for configuration system."""

    def test_config_loading(self):
        """Test configuration loading."""
        config = get_config()
        assert config is not None
        assert config.config_dir is not None
```
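Where the same behaviour must hold for several inputs, `pytest.mark.parametrize` keeps the cases together. A small sketch; the `normalize_symbol` helper is hypothetical and exists only to illustrate the pattern:

```python
import pytest


def normalize_symbol(symbol: str) -> str:
    # Hypothetical helper, defined inline only for this example.
    return symbol.strip().upper().replace("-", "/")


@pytest.mark.parametrize(
    ("raw", "expected"),
    [
        ("btc-usd", "BTC/USD"),
        (" eth-usd ", "ETH/USD"),
        ("BTC/USD", "BTC/USD"),
    ],
)
def test_normalize_symbol(raw, expected):
    """Each (raw, expected) pair runs as its own test case."""
    assert normalize_symbol(raw) == expected
```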
### Backend API Tests

Test FastAPI endpoints using TestClient:

```python
from fastapi.testclient import TestClient

from backend.main import app

client = TestClient(app)


def test_get_orders():
    """Test getting orders endpoint."""
    response = client.get("/api/trading/orders")
    assert response.status_code == 200
    data = response.json()
    assert isinstance(data, list)


def test_place_order():
    """Test placing an order."""
    order_data = {
        "exchange_id": 1,
        "symbol": "BTC/USD",
        "side": "buy",
        "type": "market",
        "quantity": 0.1,
        "paper_trading": True
    }
    response = client.post("/api/trading/orders", json=order_data)
    assert response.status_code == 201
    data = response.json()
    assert data["symbol"] == "BTC/USD"
```
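Error paths deserve the same coverage as the happy path. A sketch of checking failure behaviour, assuming the API returns FastAPI's default 422 for invalid payloads and a 404 for unknown order IDs:

```python
def test_place_order_rejects_invalid_payload():
    """A payload missing required fields should fail validation."""
    response = client.post("/api/trading/orders", json={"symbol": "BTC/USD"})
    assert response.status_code == 422  # FastAPI validation error


def test_get_unknown_order_returns_404():
    """Requesting an order that does not exist should return 404."""
    response = client.get("/api/trading/orders/999999")
    assert response.status_code == 404
```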
### Integration Tests

Test component interactions:

```python
import pytest
from fastapi.testclient import TestClient

from backend.main import app


@pytest.mark.integration
def test_trading_workflow(client: TestClient):
    """Test complete trading workflow."""
    # Place order
    order_response = client.post("/api/trading/orders", json={...})
    assert order_response.status_code == 201
    order_id = order_response.json()["id"]

    # Check order status
    status_response = client.get(f"/api/trading/orders/{order_id}")
    assert status_response.status_code == 200

    # Check portfolio update
    portfolio_response = client.get("/api/portfolio/current")
    assert portfolio_response.status_code == 200
```
### End-to-End Tests

Test complete user workflows:

```python
@pytest.mark.e2e
async def test_paper_trading_scenario():
    """Test complete paper trading scenario."""
    # Test full application flow through API
    pass
```
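The stub above leaves the scenario open. One way it might look, as a sketch that reuses the `client` fixture and the paper-trading order payload from earlier in this guide (not an existing project test):

```python
@pytest.mark.e2e
def test_paper_trading_buy_then_portfolio(client):
    """Place a paper order, then confirm the portfolio endpoint responds."""
    order_data = {
        "exchange_id": 1,
        "symbol": "BTC/USD",
        "side": "buy",
        "type": "market",
        "quantity": 0.1,
        "paper_trading": True,
    }

    order_response = client.post("/api/trading/orders", json=order_data)
    assert order_response.status_code == 201

    portfolio_response = client.get("/api/portfolio/current")
    assert portfolio_response.status_code == 200
```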
## Test Fixtures

Use fixtures for common setup:

```python
import pytest
from unittest.mock import Mock

from fastapi.testclient import TestClient

from backend.main import app
from src.core.database import get_database


@pytest.fixture
def client():
    """Test client fixture."""
    return TestClient(app)


@pytest.fixture
def mock_exchange():
    """Mock exchange adapter."""
    exchange = Mock()
    exchange.fetch_balance.return_value = {'USD': {'free': 1000}}
    return exchange


@pytest.fixture
def test_db():
    """Test database fixture."""
    # Use in-memory SQLite for unit tests (fast, isolated)
    # Note: requires aiosqlite installed as a test dependency
    db = get_database()
    # Setup test data
    yield db
    # Cleanup
```
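Fixtures are consumed by naming them as test parameters; pytest injects them automatically. A short usage sketch with the `client` and `mock_exchange` fixtures defined above:

```python
def test_orders_endpoint(client):
    """The client fixture supplies a fresh TestClient per test."""
    response = client.get("/api/trading/orders")
    assert response.status_code == 200


def test_mock_exchange_balance(mock_exchange):
    """The mocked adapter returns the canned balance set in the fixture."""
    balance = mock_exchange.fetch_balance()
    assert balance['USD']['free'] == 1000
```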
## Mocking

### Mocking External APIs

```python
import pytest
from unittest.mock import patch, AsyncMock

# Assumed import path, based on the patch target below
from src.exchanges.coinbase import CoinbaseExchange


@pytest.mark.asyncio
@patch('src.exchanges.coinbase.ccxt')
async def test_coinbase_connection(mock_ccxt):
    """Test Coinbase connection."""
    mock_exchange = AsyncMock()
    mock_ccxt.coinbaseadvanced.return_value = mock_exchange
    mock_exchange.load_markets = AsyncMock()

    adapter = CoinbaseExchange(...)
    await adapter.connect()

    assert adapter.is_connected
```
### Mocking Database

```python
@pytest.fixture
def test_db():
    """Create test database."""
    # Use in-memory SQLite for unit tests (internal use only)
    from src.core.database import Database

    db = Database("sqlite:///:memory:")
    db.create_tables()
    return db
```
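To make API tests hit the in-memory database instead of the real one, FastAPI's `app.dependency_overrides` can swap out the `get_database` dependency. A sketch, assuming the route handlers declare `get_database` with `Depends`:

```python
from fastapi.testclient import TestClient

from backend.main import app
from src.core.database import get_database


def test_orders_use_test_db(test_db):
    """Route handlers resolve get_database to the in-memory test database."""
    app.dependency_overrides[get_database] = lambda: test_db
    try:
        client = TestClient(app)
        response = client.get("/api/trading/orders")
        assert response.status_code == 200
    finally:
        # Always restore the real dependency so other tests are unaffected
        app.dependency_overrides.clear()
```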
### Mocking Services

```python
import pytest
from unittest.mock import Mock, patch


@pytest.fixture
def mock_trading_engine():
    """Mock trading engine."""
    engine = Mock()
    # A plain Mock stands in for the order object returned by the engine
    engine.execute_order.return_value = Mock(id=1, status="filled")
    return engine


def test_place_order_endpoint(client, mock_trading_engine):
    """Test order placement with mocked engine."""
    with patch('backend.api.trading.get_trading_engine', return_value=mock_trading_engine):
        response = client.post("/api/trading/orders", json={...})
        assert response.status_code == 201
```
## Async Testing

Use `pytest-asyncio` for async tests:

```python
import pytest


@pytest.mark.asyncio
async def test_async_function():
    """Test async function."""
    result = await my_async_function()
    assert result is not None
```
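Async setup and teardown work the same way through async fixtures, which `pytest-asyncio` declares with `@pytest_asyncio.fixture`. A minimal sketch; the connected-adapter fixture is illustrative rather than an existing project fixture:

```python
import pytest
import pytest_asyncio
from unittest.mock import AsyncMock


@pytest_asyncio.fixture
async def connected_adapter():
    """Illustrative async fixture: connect before the test, disconnect after."""
    adapter = AsyncMock()
    await adapter.connect()
    yield adapter
    await adapter.disconnect()


@pytest.mark.asyncio
async def test_balance_from_connected_adapter(connected_adapter):
    connected_adapter.fetch_balance.return_value = {"USD": {"free": 1000}}
    balance = await connected_adapter.fetch_balance()
    assert balance["USD"]["free"] == 1000
```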
## WebSocket Testing

Test WebSocket endpoints:

```python
from fastapi.testclient import TestClient


def test_websocket_connection(client: TestClient):
    """Test WebSocket connection."""
    with client.websocket_connect("/ws/") as websocket:
        # Send message
        websocket.send_json({"type": "subscribe", "channel": "prices"})

        # Receive message
        data = websocket.receive_json()
        assert data["type"] == "price_update"
```
## Test Coverage

Target 95% code coverage:

```bash
# Generate coverage report
pytest --cov=src --cov-report=html

# View in browser
open htmlcov/index.html
```

### Coverage Configuration

Configure in `pytest.ini` (coverage flags are passed through `addopts`, since pytest-cov does not read bare `cov =` keys):

```ini
[pytest]
addopts = --cov=src --cov-report=term-missing --cov-report=html --cov-fail-under=95
```
## Frontend Testing

The frontend uses React Testing Library and Vitest for testing. See [Frontend Testing Guide](./frontend_testing.md) for detailed information.

### Quick Start

```bash
cd frontend
npm install --save-dev @testing-library/react @testing-library/jest-dom @testing-library/user-event vitest jsdom
npm test
```
### Component Testing

```typescript
import { render, screen, waitFor } from '@testing-library/react';
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
import { describe, it, expect, vi } from 'vitest';
import StrategiesPage from '../pages/StrategiesPage';
import * as strategiesApi from '../api/strategies';

vi.mock('../api/strategies');

describe('StrategiesPage', () => {
  it('renders and displays strategies', async () => {
    const mockStrategies = [{ id: 1, name: 'Test Strategy', ... }];
    vi.mocked(strategiesApi.strategiesApi.listStrategies).mockResolvedValue(mockStrategies);

    const queryClient = new QueryClient({ defaultOptions: { queries: { retry: false } } });

    render(
      <QueryClientProvider client={queryClient}>
        <StrategiesPage />
      </QueryClientProvider>
    );

    await waitFor(() => {
      expect(screen.getByText('Test Strategy')).toBeInTheDocument();
    });
  });
});
```
### Testing New Components

All new components should have corresponding test files:

- `StrategiesPage` → `StrategiesPage.test.tsx`
- `TradingPage` → `TradingPage.test.tsx`
- `StrategyDialog` → `StrategyDialog.test.tsx`
- `OrderForm` → `OrderForm.test.tsx`
- And all other new components

See [Frontend Testing Guide](./frontend_testing.md) for comprehensive testing patterns.
## Best Practices

1. **Test Independence**: Tests should not depend on each other
2. **Fast Tests**: Unit tests should run quickly (< 1 second each)
3. **Clear Names**: Test names should describe what they test
4. **One Assertion**: Prefer one assertion per test when possible
5. **Mock External**: Mock external dependencies (APIs, databases)
6. **Test Edge Cases**: Test boundary conditions and errors
7. **Documentation**: Document complex test scenarios
8. **Arrange-Act-Assert**: Structure tests clearly (see the sketch after this list)
9. **Use Fixtures**: Reuse common setup code
10. **Isolation**: Each test should be able to run independently
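A short illustration of the Arrange-Act-Assert shape, reusing the paper-trading order payload from earlier (a sketch, not an existing project test):

```python
def test_place_paper_order_is_created(client):
    # Arrange: build a valid paper-trading order payload
    order_data = {
        "exchange_id": 1,
        "symbol": "BTC/USD",
        "side": "buy",
        "type": "market",
        "quantity": 0.1,
        "paper_trading": True,
    }

    # Act: call the endpoint under test
    response = client.post("/api/trading/orders", json=order_data)

    # Assert: verify the observable outcome
    assert response.status_code == 201
    assert response.json()["symbol"] == "BTC/USD"
```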
## Test Organization

### Unit Tests

- Test single function/class
- Mock external dependencies
- Fast execution
- High coverage

### Integration Tests

- Test component interactions
- Use test database
- Test real workflows
- Moderate speed

### E2E Tests

- Test complete user flows
- Use test environment
- Slow execution
- Critical paths only
## Continuous Integration

Tests run automatically in CI/CD:

- All tests must pass
- Coverage must meet threshold (95%)
- Code style must pass
- Type checking must pass (if using mypy)
## Debugging Tests

### Verbose Output

```bash
pytest -vv   # Very verbose
pytest -s    # Show print statements
```

### Debugging Failed Tests

```bash
# Drop into debugger on failure
pytest --pdb

# Drop into debugger on first failure
pytest -x --pdb
```

### Running Last Failed Tests

```bash
pytest --lf  # Last failed
pytest --ff  # Failed first
```