# Development Guide

## Adding New Features

### Adding an API Endpoint

1. **Define route** in `src/api/v1/documents.py`:

```python
@router.post("/documents/submit")
async def submit_document(
    file: UploadFile,
    storage: StorageService = Depends(get_storage_service)
):
    # Implementation
    ...
```

2. **Add business logic** in `src/services/`:

```python
# src/services/document_service.py
class DocumentService:
    def __init__(self, storage: StorageService, queue: QueueService):
        self.storage = storage
        self.queue = queue
```

3. **Create Pydantic models** in `src/shared/models/`:

```python
# src/shared/models/document.py
class DocumentSubmitRequest(BaseModel):
    filename: str
    content_type: str
```

4. **Write tests**:

- Unit: `tests/unit/api/test_documents.py`
- Integration: `tests/integration/test_document_flow.py`

5. **Update dependency injection** in `src/dependencies.py` if needed
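The service layer from step 2 lends itself to unit tests with mocked dependencies. A minimal sketch, assuming stand-in names: the `DocumentService.submit` method, `storage.put`, and `queue.enqueue` below are illustrative, not the project's actual API.

```python
import asyncio
from unittest.mock import AsyncMock

# Illustrative stand-in mirroring the DocumentService shape from step 2.
class DocumentService:
    def __init__(self, storage, queue):
        self.storage = storage
        self.queue = queue

    async def submit(self, filename: str, data: bytes) -> str:
        key = await self.storage.put(filename, data)   # store the file
        await self.queue.enqueue({"key": key})         # queue it for processing
        return key

async def main() -> str:
    storage = AsyncMock()
    storage.put.return_value = "uploads/report.pdf"
    queue = AsyncMock()

    service = DocumentService(storage=storage, queue=queue)
    key = await service.submit("report.pdf", b"%PDF-1.7")

    # Verify the service called its collaborators as expected.
    storage.put.assert_awaited_once_with("report.pdf", b"%PDF-1.7")
    queue.enqueue.assert_awaited_once_with({"key": "uploads/report.pdf"})
    return key

if __name__ == "__main__":
    print(asyncio.run(main()))  # uploads/report.pdf
```

`AsyncMock` keeps the test free of Redis and S3, so it belongs in `tests/unit/`; the integration test exercises the real wiring.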

### Adding a Background Worker

Workers use a **class-based pattern** with manual service initialization (NOT `Depends()`).

1. **Create worker class** in `src/workers/new_worker.py`:

```python
class NewWorker:
    def __init__(self, queue_service: QueueService, job_service: JobService):
        self.queue = queue_service
        self.job_service = job_service
        self.running = False

    async def start(self, shutdown_event: asyncio.Event):
        """Main worker loop."""
        self.running = True
        while self.running and not shutdown_event.is_set():
            try:
                job_data = await self.queue.dequeue(
                    QUEUE_NEW,
                    timeout=settings.pii_worker_queue_timeout_seconds  # reuse or add a timeout setting for this worker
                )
                if job_data:
                    await self._process_job(job_data)
            except Exception as e:
                logger.error(f"Worker error: {e}")
                await asyncio.sleep(settings.worker_error_sleep_seconds)
```

2. **Create startup function** (workers manually create services, NOT via Depends):

```python
async def start_new_worker(shutdown_event: asyncio.Event | None = None) -> None:
    """Initialize services and start worker."""
    from ..dependencies import get_redis_client, get_s3_client

    # Manually get clients (NOT using Depends)
    redis_client = await anext(get_redis_client())
    s3_client = await anext(get_s3_client())

    # Create service instances directly
    queue_service = QueueService(redis_client=redis_client)
    job_service = JobService(redis_client=redis_client)

    # Create and start worker
    worker = NewWorker(queue_service=queue_service, job_service=job_service)
    await worker.start(shutdown_event)
```

3. **Start worker in** `src/main.py` lifespan:

```python
new_worker_task = asyncio.create_task(start_new_worker(shutdown_event))
```

4. **Add queue constant** in `src/shared/constants/queues.py`:

```python
QUEUE_NEW = "eq-pdf:queue:new"
```

5. **Test worker** in `tests/unit/workers/test_new_worker.py`

**Important:** Workers bypass `Depends()` because they run in background threads, not HTTP request context.
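The loop-and-shutdown pattern above can be exercised without Redis by substituting an in-memory queue for `QueueService`. A self-contained sketch, with illustrative names (`InMemoryQueueService`, `DemoWorker` are stand-ins, not project classes):

```python
import asyncio

class InMemoryQueueService:
    """Stand-in for QueueService: dequeue with a timeout, like a blocking pop."""
    def __init__(self):
        self._queue = asyncio.Queue()

    async def enqueue(self, item):
        await self._queue.put(item)

    async def dequeue(self, timeout: float):
        try:
            return await asyncio.wait_for(self._queue.get(), timeout)
        except asyncio.TimeoutError:
            return None  # timed out with nothing queued

class DemoWorker:
    def __init__(self, queue_service):
        self.queue = queue_service
        self.processed = []

    async def start(self, shutdown_event: asyncio.Event):
        # Same shape as the worker loop above: poll until shutdown is requested.
        while not shutdown_event.is_set():
            job = await self.queue.dequeue(timeout=0.05)
            if job:
                self.processed.append(job)

async def main():
    queue = InMemoryQueueService()
    worker = DemoWorker(queue)
    shutdown = asyncio.Event()

    task = asyncio.create_task(worker.start(shutdown))
    await queue.enqueue({"job_id": "1"})
    await asyncio.sleep(0.1)  # let the worker drain the queue
    shutdown.set()            # request graceful shutdown
    await task                # loop exits on its next timeout check
    return worker.processed

if __name__ == "__main__":
    print(asyncio.run(main()))  # [{'job_id': '1'}]
```

The short dequeue timeout is what makes shutdown responsive: the loop re-checks `shutdown_event` after every timed-out pop instead of blocking forever.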

### Adding a Service

1. **Create service** in `src/services/new_service.py`:

```python
class NewService:
    def __init__(self, redis_client, s3_client):
        self.redis = redis_client
        self.s3_client = s3_client
```

2. **Add dependency injection** in `src/dependencies.py`:

```python
# For services that should be singletons (e.g., with circuit breakers):
@lru_cache
def _get_new_service_singleton() -> NewService:
    s3_client = _get_s3_client_singleton()
    return NewService(s3_client=s3_client, ...)

async def get_new_service() -> NewService:
    return _get_new_service_singleton()

# For services that can be created per-request (pick one pattern;
# defining both in the same module would shadow the first):
async def get_new_service(
    redis_client = Depends(get_redis_client)
) -> NewService:
    return NewService(redis_client=redis_client)
```

3. **Write tests** in `tests/unit/services/test_new_service.py`

**Note:** Use singletons (`@lru_cache`) for services that maintain state like circuit breakers. Use per-request creation for stateless services.
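The difference is easy to demonstrate with the stdlib alone: `@lru_cache` on a zero-argument factory returns the same instance to every caller, so state such as a circuit breaker's failure count is shared. The class and function names below are illustrative.

```python
from functools import lru_cache

class CircuitBreakerService:
    """Toy stateful service: counts failures across all callers."""
    def __init__(self):
        self.failures = 0

@lru_cache
def get_singleton() -> CircuitBreakerService:
    return CircuitBreakerService()  # constructed once, then cached

def get_per_request() -> CircuitBreakerService:
    return CircuitBreakerService()  # fresh instance on every call

a, b = get_singleton(), get_singleton()
a.failures += 1
print(a is b, b.failures)                      # True 1 -> state is shared
print(get_per_request() is get_per_request())  # False  -> no shared state
```

With per-request creation, each caller's "breaker" would start from zero failures, defeating the purpose; that is why stateful services get the `@lru_cache` treatment.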

## Package Management

ALL Python development uses `uv`:

```bash
# Add dependency (from inside container)
docker exec -it equalify-pdf-api-gateway uv add <package>

# Add dev dependency
docker exec -it equalify-pdf-api-gateway uv add --dev <package>

# Remove dependency
docker exec -it equalify-pdf-api-gateway uv remove <package>

# Update dependency
docker exec -it equalify-pdf-api-gateway uv add <package>@latest
```

## Debugging

### View Logs

```bash
make logs           # All services
make logs-api       # API only
docker logs equalify-pdf-redis -f
```

### Access Container Shell

```bash
make shell          # API container bash
make redis-cli      # Redis CLI
```

### Check Infrastructure Health

```bash
make health         # Run health checks
curl http://localhost:8080/health
curl http://localhost:8080/metrics
```

## Common Issues

### Tests failing with Redis connection refused

- Check Docker is running: `docker ps`
- Restart services: `make down && make dev`
- Verify Redis is healthy: `make redis-cli` then `PING`
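Before digging into test internals, it can help to confirm the Redis port is reachable at all. A quick stdlib check (host and port below are the usual defaults; adjust to your compose mapping):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Default Redis port; change if docker-compose maps it differently.
    print("redis reachable:", port_open("localhost", 6379))
```

If this prints `False`, the problem is the container or port mapping, not the test suite.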

### Hot reload not working

- Check logs: `make logs-api`
- Verify volume mounts in `docker-compose.dev.yml`
- Restart: `make down && make dev`

### Container not starting

- Check logs: `docker logs equalify-pdf-api-gateway`
- Verify `.env` file exists
- Clean and rebuild: `make clean && make build && make dev`

## Environment Configuration

### Local Development

- `.env` contains application config (NO AWS credentials)
- `docker-compose.dev.yml` sets AWS credentials for containers
- LocalStack runs inside Docker network

### AWS Operations

- Use AWS profiles from `~/.aws/config` (see `.aws-config-example`)
- Makefile commands handle profiles automatically:

```bash
make aws-health   # Check deployment
make aws-logs     # View CloudWatch logs
make aws-status   # ECS service status
```

**Never** source `.env` manually in your shell; it is consumed by Docker Compose only.