Overview
BoxBilling is built as a FastAPI application using a layered architecture with clear separation of concerns. The system supports multi-tenancy through organization-based isolation and provides both synchronous API endpoints and asynchronous background job processing.

Tech stack
| Component | Technology |
|---|---|
| Web framework | FastAPI |
| ORM | SQLAlchemy 2.0 |
| Database | PostgreSQL (production), SQLite (development) |
| Analytics store | ClickHouse (optional, for high-volume events) |
| Task queue | ARQ + Redis |
| Validation | Pydantic v2 |
| Migrations | Alembic |
| Package manager | UV |
| Containerization | Docker |
Application layers
Routers
23 router modules handle HTTP endpoints. Each router corresponds to a resource (customers, plans, subscriptions, etc.) and is registered in app/main.py with a versioned prefix like /v1/customers.
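A minimal sketch of how one router might be wired up in app/main.py; the imported module names are assumptions, not the actual codebase:

```python
# app/main.py (sketch): router registration with versioned prefixes.
# The router module paths below are hypothetical.
from fastapi import FastAPI

from app.routers import customers, plans, subscriptions  # hypothetical import paths

app = FastAPI(title="BoxBilling API")

# Each resource router is mounted under a versioned prefix.
app.include_router(customers.router, prefix="/v1/customers", tags=["customers"])
app.include_router(plans.router, prefix="/v1/plans", tags=["plans"])
app.include_router(subscriptions.router, prefix="/v1/subscriptions", tags=["subscriptions"])
```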
Schemas
Pydantic v2 models with from_attributes=True for ORM compatibility. Each resource has separate Create, Update, and Response schemas to enforce validation at API boundaries.
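For illustration, a hedged sketch of what the Create/Response split might look like for a customer resource; the field names are assumptions:

```python
# Hypothetical customer schemas; field names are assumptions for illustration.
from datetime import datetime
from uuid import UUID

from pydantic import BaseModel, ConfigDict


class CustomerCreate(BaseModel):
    # Validated at the API boundary before the request reaches the service layer.
    name: str
    email: str


class CustomerResponse(BaseModel):
    # from_attributes=True lets this schema be built directly from an ORM object.
    model_config = ConfigDict(from_attributes=True)

    id: UUID
    name: str
    email: str
    created_at: datetime
```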
Services
Business logic is encapsulated in service classes (see the sketch after this list). Key services include:
- InvoiceGenerationService — Orchestrates charge calculation, tax application, coupon discounts, and wallet credit consumption
- SubscriptionLifecycleService — Manages subscription state machine (activate, upgrade, downgrade, cancel, terminate)
- UsageAggregationService — Aggregates usage events into metric values
- WalletService — Priority-based prepaid credit management
- WebhookService — Event delivery with retry and exponential backoff
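As an example of the service-layer shape, here is a hedged sketch of how WebhookService's retry with exponential backoff could be structured; the method names, repository interface, and retry policy are assumptions:

```python
# Hypothetical sketch of a service class; names, the repository interface,
# and the retry policy are assumptions.
from datetime import datetime, timedelta, timezone


class WebhookService:
    MAX_ATTEMPTS = 5          # assumed cap
    BASE_DELAY_SECONDS = 60   # assumed base delay

    def __init__(self, webhook_repository):
        self.webhook_repository = webhook_repository

    def schedule_retry(self, delivery) -> None:
        """Reschedule a failed delivery, doubling the delay on each attempt."""
        if delivery.attempts >= self.MAX_ATTEMPTS:
            delivery.status = "failed"
        else:
            delay = self.BASE_DELAY_SECONDS * (2 ** delivery.attempts)  # 60s, 120s, 240s, ...
            delivery.next_attempt_at = datetime.now(timezone.utc) + timedelta(seconds=delay)
            delivery.status = "pending"
        self.webhook_repository.update(delivery)
```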
Repositories
Data access layer using the repository pattern. Each model has a corresponding repository with standard CRUD methods (get_all, get_by_id, create, update, delete) plus specialized queries.
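A minimal sketch of a repository with the standard CRUD shape, assuming synchronous SQLAlchemy 2.0 sessions; the import path, class layout, and tenant-scoping details are assumptions:

```python
# Hypothetical repository sketch; the real base class and query helpers may differ.
from uuid import UUID

from sqlalchemy import select
from sqlalchemy.orm import Session

from app.models import Customer  # hypothetical import path


class CustomerRepository:
    """Tenant-scoped data access for the Customer model."""

    def __init__(self, session: Session, organization_id: UUID):
        self.session = session
        self.organization_id = organization_id  # every query is scoped to one organization

    def get_all(self) -> list[Customer]:
        stmt = select(Customer).where(Customer.organization_id == self.organization_id)
        return list(self.session.scalars(stmt))

    def get_by_id(self, customer_id: UUID) -> Customer | None:
        stmt = select(Customer).where(
            Customer.id == customer_id,
            Customer.organization_id == self.organization_id,
        )
        return self.session.scalars(stmt).one_or_none()

    def create(self, customer: Customer) -> Customer:
        self.session.add(customer)
        self.session.flush()  # populate defaults such as timestamps before returning
        return customer
```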
Models
40 SQLAlchemy models defining the database schema. All models use UUID primary keys (except the legacy Item model) and include created_at/updated_at audit timestamps.
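A hedged sketch of the common column pattern using SQLAlchemy 2.0 declarative mapping; the column defaults and the example model are assumptions:

```python
# Hypothetical model sketch; actual column definitions may differ.
import uuid
from datetime import datetime

from sqlalchemy import func
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Customer(Base):
    __tablename__ = "customers"

    # UUID primary key, generated application-side.
    id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    organization_id: Mapped[uuid.UUID] = mapped_column(index=True)
    name: Mapped[str]

    # Audit timestamps present on every model.
    created_at: Mapped[datetime] = mapped_column(server_default=func.now())
    updated_at: Mapped[datetime] = mapped_column(server_default=func.now(), onupdate=func.now())
```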
Multi-tenancy
Every resource is scoped to an organization_id.
Authentication
API keys are hashed and associated with organizations. The get_current_organization FastAPI dependency extracts the organization context from the Authorization: Bearer <api_key> header on every request.
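A hedged sketch of what that dependency might look like; the hashing scheme, the lookup helper, and the error shape are assumptions:

```python
# Hypothetical sketch of the get_current_organization dependency; the hashing
# scheme and lookup helper are assumptions.
import hashlib

from fastapi import Depends, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

bearer_scheme = HTTPBearer()


def find_organization_by_api_key_hash(key_hash: str):
    """Placeholder lookup; the real app would query the API key / organization tables."""


def get_current_organization(
    credentials: HTTPAuthorizationCredentials = Depends(bearer_scheme),
):
    # Only a hash of the API key is stored, so hash the presented key before lookup.
    key_hash = hashlib.sha256(credentials.credentials.encode()).hexdigest()
    organization = find_organization_by_api_key_hash(key_hash)
    if organization is None:
        raise HTTPException(status_code=401, detail="Invalid API key")
    return organization
```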
Background jobs
The ARQ task queue handles asynchronous processing via Redis. Scheduled tasks:

| Schedule | Task |
|---|---|
| Every 5 min | Retry failed webhooks |
| Hourly | Process trial expirations |
| Hourly | Generate periodic invoices |
| Daily (00:00) | Process pending downgrades |
| Daily (00:30) | Aggregate daily usage |
Two additional tasks run on demand rather than on a schedule (see the worker sketch below):
- Usage threshold checks — triggered after event ingestion
- Data export processing — triggered on export creation
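A hedged sketch of how the ARQ worker and cron schedule could be wired; the task function names and settings layout are assumptions:

```python
# Hypothetical ARQ worker settings; task names are assumptions.
from arq import cron
from arq.connections import RedisSettings


async def retry_failed_webhooks(ctx): ...
async def process_trial_expirations(ctx): ...
async def generate_periodic_invoices(ctx): ...
async def process_pending_downgrades(ctx): ...
async def aggregate_daily_usage(ctx): ...
async def check_usage_thresholds(ctx, subscription_id): ...  # enqueued after event ingestion
async def process_data_export(ctx, export_id): ...           # enqueued on export creation


class WorkerSettings:
    redis_settings = RedisSettings.from_dsn("redis://localhost:6379")
    functions = [check_usage_thresholds, process_data_export]
    cron_jobs = [
        cron(retry_failed_webhooks, minute=set(range(0, 60, 5))),  # every 5 minutes
        cron(process_trial_expirations, minute=0),                 # hourly
        cron(generate_periodic_invoices, minute=0),                # hourly
        cron(process_pending_downgrades, hour=0, minute=0),        # daily at 00:00
        cron(aggregate_daily_usage, hour=0, minute=30),            # daily at 00:30
    ]
```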
ClickHouse integration
For high-volume event ingestion, ClickHouse can be enabled as an optional analytics store. When configured via CLICKHOUSE_URL, events are written to both the primary database and a ReplacingMergeTree table in ClickHouse, with aggregation queries routed to ClickHouse for performance.
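A hedged sketch of the dual-write path using the clickhouse-connect client; the table schema, column names, and client wiring are all assumptions:

```python
# Hypothetical dual-write sketch; the table schema and client configuration are assumptions.
import clickhouse_connect

# In the real app the connection details would come from CLICKHOUSE_URL.
client = clickhouse_connect.get_client(host="localhost")

# ReplacingMergeTree deduplicates rows sharing the same ORDER BY key during merges,
# so re-ingesting an event with the same id is effectively idempotent.
client.command("""
    CREATE TABLE IF NOT EXISTS events (
        id UUID,
        organization_id UUID,
        code String,
        timestamp DateTime,
        value Float64
    ) ENGINE = ReplacingMergeTree
    ORDER BY (organization_id, code, id)
""")


def save_to_primary_database(event: dict) -> None:
    """Placeholder for the normal SQLAlchemy write to PostgreSQL/SQLite."""


def ingest_event(event: dict) -> None:
    # Write to the primary database first, then mirror the row into ClickHouse.
    save_to_primary_database(event)
    client.insert(
        "events",
        [[event["id"], event["organization_id"], event["code"], event["timestamp"], event["value"]]],
        column_names=["id", "organization_id", "code", "timestamp", "value"],
    )
```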
Directory structure
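The layout is not spelled out in this section; the sketch below is only a plausible arrangement of the layers described above, and every path except app/main.py is an assumption:

```
app/
├── main.py          # FastAPI app and router registration
├── routers/         # HTTP endpoint modules
├── schemas/         # Pydantic request/response models
├── services/        # business logic
├── repositories/    # data access layer
└── models/          # SQLAlchemy models
alembic/             # database migrations
```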
Configuration
All configuration is managed through environment variables loaded via Pydantic Settings:

| Variable | Description | Default |
|---|---|---|
| APP_DATABASE_DSN | Database connection string | sqlite:///... |
| REDIS_URL | Redis for task queue | redis://localhost:6379 |
| CORS_ORIGINS | Allowed CORS origins | localhost:3000,5173 |
| RATE_LIMIT_EVENTS_PER_MINUTE | Event ingestion rate limit | 1000 |
| CLICKHOUSE_URL | ClickHouse connection (optional) | — |
| stripe_api_key | Stripe API key | — |
| adyen_api_key | Adyen API key | — |
| gocardless_access_token | GoCardless token | — |
See .env.example for the full list.
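A hedged sketch of how these variables might map onto a pydantic-settings class; the field names mirror the table above, but the class layout and defaults are assumptions:

```python
# Hypothetical settings class; field layout and defaults are assumptions.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    # pydantic-settings matches environment variables case-insensitively by default.
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    app_database_dsn: str = "sqlite:///dev.db"  # placeholder; the real default is elided in the table above
    redis_url: str = "redis://localhost:6379"
    cors_origins: str = "localhost:3000,5173"
    rate_limit_events_per_minute: int = 1000
    clickhouse_url: str | None = None

    stripe_api_key: str | None = None
    adyen_api_key: str | None = None
    gocardless_access_token: str | None = None


settings = Settings()
```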