Equalify Reflow is open source under the AGPL-3.0 license. We welcome contributions — bug fixes, pipeline improvements, documentation, and new integrations.
```bash
# Clone the repo
git clone https://github.com/EqualifyEverything/equalify-reflow.git
cd equalify-reflow

# Copy the example environment file
cp .env.example .env

# Start all services
make dev
```
This starts:
| Service | Port | Description |
|---|---|---|
| API Gateway | localhost:8080 | FastAPI app with hot reload |
| Redis | localhost:6379 | Job state, queues, pub/sub |
| LocalStack | localhost:4566 | S3 emulation |
| Docling Serve | localhost:5001 | PDF extraction sidecar |
| Prometheus | localhost:9090 | Metrics |
| Grafana | localhost:3001 | Dashboards |
| Jaeger | localhost:16686 | Tracing |
The dev environment mounts `src/` into the container. Edits on your host trigger an automatic reload; no rebuild is needed.
```bash
make dev          # Start all services
make down         # Stop all services
make logs         # View all logs
make logs-api     # View API logs only
make shell        # Shell into the API container
make redis-cli    # Connect to Redis CLI
make health       # Verify infrastructure health
```
Important: Do not run `python`, `pytest`, or `uv` directly on your host. Everything runs inside Docker:
```bash
make test-fast          # Run unit tests (~30s)
make test-integration   # Run integration tests (~2min)
make test-e2e           # Run end-to-end tests (~5min)
make coverage           # Tests with coverage report
```
```
src/
├── main.py                    # FastAPI app entry point + worker startup
├── config.py                  # Settings from environment variables
├── dependencies.py            # Dependency injection (S3, Redis, services)
├── api/                       # REST endpoints
│   ├── documents.py           # Document submission and status
│   ├── pipeline_viewer.py     # Pipeline viewer
│   └── approval.py            # PII approval workflow
├── workers/                   # Background task processors
│   ├── pii_worker.py          # PII detection queue consumer
│   └── timeout_worker.py      # Approval timeout checks
├── services/                  # Business logic
│   ├── pipeline_viewer.py     # 5-stage pipeline orchestration
│   ├── document_processing.py # Job lifecycle management
│   ├── storage.py             # S3 with circuit breakers
│   ├── queue.py               # Redis queue operations
│   ├── job.py                 # Job state (Lua scripts)
│   └── pii_detection.py       # Presidio integration
├── agents/                    # AI pipeline
│   ├── orchestrator.py        # Pipeline orchestration + dossier
│   ├── dossier.py             # Document context model
│   ├── shared_prompts.py      # Reusable prompt fragments
│   ├── model_tiers.py         # Model selection (Sonnet/Haiku)
│   ├── worker.py              # Per-page content correction agent
│   ├── paragraph_agent.py     # Sub-agent orchestration
│   ├── recovery.py            # Error recovery agent
│   ├── critic.py              # Verification agent
│   ├── document_worker.py     # Cross-page assembly agent
│   └── prompts/               # Stage-specific prompt modules
│       ├── structure_analysis.py
│       ├── heading_reconciliation.py
│       ├── boundary_fix.py
│       ├── footnote_relocation.py
│       └── revision.py
├── middleware/                # HTTP middleware
│   ├── auth.py                # API key + docs auth
│   ├── rate_limit.py          # Per-IP rate limiting
│   └── metrics.py             # Prometheus instrumentation
├── shared/                    # Constants and shared models
└── utils/                     # Helpers (retry, circuit breaker, tokens)
```
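Both the S3 path (`services/storage.py`) and the LLM path guard external calls with circuit breakers from `utils/`. As a rough sketch of the idea only — the class name, thresholds, and method names below are illustrative assumptions, not the project's real helper API:

```python
import time


class CircuitBreaker:
    """Minimal circuit-breaker sketch: trip open after repeated failures,
    then allow a trial call once a cool-down period has passed."""

    def __init__(self, failure_threshold: int = 3, reset_after: float = 30.0):
        self.failure_threshold = failure_threshold
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at: float | None = None

    def allow(self) -> bool:
        """Return True if a call may proceed."""
        if self.opened_at is None:
            return True
        if time.monotonic() - self.opened_at >= self.reset_after:
            # Half-open: cool-down elapsed, permit one trial call.
            self.opened_at = None
            self.failures = 0
            return True
        return False

    def record_failure(self) -> None:
        self.failures += 1
        if self.failures >= self.failure_threshold:
            self.opened_at = time.monotonic()  # trip the breaker open

    def record_success(self) -> None:
        self.failures = 0
        self.opened_at = None
```

A caller wraps each S3 or LLM request in `allow()` and reports the outcome, so a flapping dependency stops receiving traffic instead of stalling the pipeline.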
The project uses a three-tier testing strategy:
**Unit tests** (`make test-fast`): fast, isolated tests that mock external dependencies (S3, Redis, LLM calls). Run these before every commit.
```bash
make test-fast
```
Key patterns:
- Use the `_get_*_agent` or `_get_*_subagent` factories to avoid real LLM calls
- Call `reset_llm_circuit_breaker()` in `autouse=True` fixtures to prevent state leakage between tests
- `prepare` functions return `ToolDefinition` or `None` based on task type

**Integration tests** (`make test-integration`): tests that exercise real service interactions (Redis, S3 via LocalStack) but still mock LLM calls. Run before PRs.
```bash
make test-integration
```
**End-to-end tests** (`make test-e2e`): full pipeline tests with real documents. Requires AWS credentials for Bedrock. Run before merges.
```bash
make test-e2e
```
Tests are tagged with pytest markers:
```python
@pytest.mark.unit         # Unit test (mocked dependencies)
@pytest.mark.integration  # Needs Redis + S3
@pytest.mark.e2e          # Full pipeline, real LLM calls
@pytest.mark.slow         # Takes >10 seconds
```
The typical development workflow:

1. Branch from `main`
2. Start the stack with `make dev`
3. Edit code in `src/`; changes auto-reload in the container
4. Run `make test-fast` for quick feedback
5. Inspect results in the viewer at `http://localhost:8080/viewer` or the API at `http://localhost:8080/docs`
6. Run `make test-integration` before opening a PR
7. Open the PR against `main`

Pipeline stages are defined in `src/services/pipeline_viewer.py`. Each stage:
- has a prompt module in `src/agents/prompts/` defining the agent's instructions
- is registered with `PipelineViewerService` in `src/services/pipeline_viewer.py`
- implements a `process()` method
- is surfaced in the viewer UI via `clients/viewer/src/components/pipeline-viewer/StageTabs.tsx`
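For orientation, a new stage might look roughly like the sketch below. The class shape, `process()` signature, and prompt constant are assumptions based on the layout described above, not the real interface:

```python
from dataclasses import dataclass

# Hypothetical stage: a prompt (normally its own module under
# src/agents/prompts/) plus a process() method that the pipeline
# service would invoke. Names and signatures are illustrative only.
STAGE_PROMPT = "Reconcile heading levels so they nest consistently across pages."


@dataclass
class HeadingReconciliationStage:
    name: str = "heading_reconciliation"
    prompt: str = STAGE_PROMPT

    def process(self, pages: list[str]) -> list[str]:
        # A real stage would call an LLM agent with self.prompt; this
        # sketch passes pages through unchanged to stay self-contained.
        return list(pages)
```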