Overview

BoxBilling is built as a FastAPI application using a layered architecture with clear separation of concerns. The system supports multi-tenancy through organization-based isolation and provides both synchronous API endpoints and asynchronous background job processing.

Tech stack

Component          Technology
----------------   ---------------------------------------------
Web framework      FastAPI
ORM                SQLAlchemy 2.0
Database           PostgreSQL (production), SQLite (development)
Analytics store    ClickHouse (optional, for high-volume events)
Task queue         ARQ + Redis
Validation         Pydantic v2
Migrations         Alembic
Package manager    UV
Containerization   Docker

Application layers

┌─────────────────────────────────────────┐
│              Routers (API)              │  HTTP endpoints, request validation
├─────────────────────────────────────────┤
│              Schemas                    │  Pydantic models for request/response
├─────────────────────────────────────────┤
│              Services                   │  Business logic, orchestration
├─────────────────────────────────────────┤
│              Repositories               │  Database queries, data access
├─────────────────────────────────────────┤
│              Models                     │  SQLAlchemy ORM definitions
├─────────────────────────────────────────┤
│              Core                       │  Config, auth, database, rate limiting
└─────────────────────────────────────────┘

Routers

23 router modules handle HTTP endpoints. Each router corresponds to a resource (customers, plans, subscriptions, etc.) and is registered in app/main.py with a versioned prefix like /v1/customers.
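
A minimal sketch of this pattern, assuming a customers resource; the endpoint body and router wiring are illustrative, not copied from the codebase:

from fastapi import APIRouter, FastAPI

# app/routers/customers.py would define a router like this:
router = APIRouter()

@router.get("")
async def list_customers():
    # Request validation and the call into the service layer happen here.
    return []

# app/main.py then registers each router under its versioned prefix:
app = FastAPI(title="BoxBilling API")
app.include_router(router, prefix="/v1/customers", tags=["customers"])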

Schemas

Pydantic v2 models with from_attributes=True for ORM compatibility. Each resource has separate Create, Update, and Response schemas to enforce validation at API boundaries.
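
A sketch of how the three schema types typically look for one resource; the field names are hypothetical, not the actual Customer schemas:

from datetime import datetime
from uuid import UUID

from pydantic import BaseModel, ConfigDict

class CustomerCreate(BaseModel):
    name: str
    email: str

class CustomerUpdate(BaseModel):
    # Partial updates: every field is optional.
    name: str | None = None
    email: str | None = None

class CustomerResponse(BaseModel):
    # from_attributes=True lets the schema be built directly from an ORM object.
    model_config = ConfigDict(from_attributes=True)

    id: UUID
    name: str
    email: str
    created_at: datetime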

Services

Business logic is encapsulated in service classes. Key services include:
  • InvoiceGenerationService — Orchestrates charge calculation, tax application, coupon discounts, and wallet credit consumption
  • SubscriptionLifecycleService — Manages subscription state machine (activate, upgrade, downgrade, cancel, terminate)
  • UsageAggregationService — Aggregates usage events into metric values
  • WalletService — Priority-based prepaid credit management
  • WebhookService — Event delivery with retry and exponential backoff (see the backoff sketch below)
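
The retry behaviour mentioned for WebhookService boils down to scheduling the next attempt at an exponentially growing delay. A minimal sketch of that idea, with the attempt cap and base delay as assumptions:

from datetime import datetime, timedelta, timezone

MAX_ATTEMPTS = 5          # assumed retry cap
BASE_DELAY_SECONDS = 60   # assumed base delay between attempts

def next_retry_at(attempts: int) -> datetime | None:
    """Return when a failed delivery should be retried, or None to give up."""
    if attempts >= MAX_ATTEMPTS:
        return None
    # Exponential backoff: 1 min, 2 min, 4 min, 8 min, ...
    delay = BASE_DELAY_SECONDS * (2 ** attempts)
    return datetime.now(timezone.utc) + timedelta(seconds=delay)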

Repositories

Data access layer using the repository pattern. Each model has a corresponding repository with standard CRUD methods (get_all, get_by_id, create, update, delete) plus specialized queries.
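
A sketch of what one such repository might look like using SQLAlchemy 2.0's select() style; the class name and import path are assumptions:

from uuid import UUID

from sqlalchemy import select
from sqlalchemy.orm import Session

from app.models.customer import Customer  # assumed import path

class CustomerRepository:
    def __init__(self, db: Session):
        self.db = db

    def get_all(self, organization_id: UUID) -> list[Customer]:
        stmt = select(Customer).where(Customer.organization_id == organization_id)
        return list(self.db.scalars(stmt))

    def get_by_id(self, organization_id: UUID, customer_id: UUID) -> Customer | None:
        stmt = select(Customer).where(
            Customer.organization_id == organization_id,
            Customer.id == customer_id,
        )
        return self.db.scalars(stmt).first()

    def create(self, customer: Customer) -> Customer:
        self.db.add(customer)
        self.db.commit()
        self.db.refresh(customer)
        return customer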

Models

40 SQLAlchemy models defining the database schema. All models use UUID primary keys (except the legacy Item model) and include created_at/updated_at audit timestamps.
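
For illustration, a model following those conventions might look like this (the columns beyond the conventions are hypothetical):

import uuid
from datetime import datetime

from sqlalchemy import DateTime, ForeignKey, String, func
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class Customer(Base):
    __tablename__ = "customers"

    # UUID primary key rather than an auto-incrementing integer.
    id: Mapped[uuid.UUID] = mapped_column(primary_key=True, default=uuid.uuid4)
    organization_id: Mapped[uuid.UUID] = mapped_column(ForeignKey("organizations.id"), index=True)
    name: Mapped[str] = mapped_column(String(255))

    # Audit timestamps present on every model.
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), server_default=func.now())
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now(), onupdate=func.now()
    )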

Multi-tenancy

Every resource is scoped to an organization_id. API keys are hashed and associated with organizations. The get_current_organization FastAPI dependency extracts the organization context from the Authorization: Bearer <api_key> header on every request.
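
A sketch of how an endpoint consumes that dependency (the endpoint itself is illustrative):

from fastapi import APIRouter, Depends

from app.core.auth import get_current_organization  # the dependency described above

router = APIRouter()

@router.get("")
async def list_customers(organization=Depends(get_current_organization)):
    # Everything below is filtered by the caller's organization, so tenants never
    # see each other's data. Whether the dependency yields an ORM object or just
    # the organization_id is an implementation detail of app/core/auth.py.
    ...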

Authentication

Request → Authorization: Bearer <api_key>
       → Hash API key
       → Look up in api_keys table
       → Verify not revoked/expired
       → Return organization_id

API keys are never stored in plaintext — only the hash and a display prefix are persisted.
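
A minimal sketch of the hash-and-look-up step, assuming SHA-256 hashing and a hypothetical ApiKey model; the real scheme and column names live in app/core/auth.py and may differ:

import hashlib
from datetime import datetime, timezone

from fastapi import HTTPException
from sqlalchemy import select
from sqlalchemy.orm import Session

from app.models.api_key import ApiKey  # hypothetical import path

def resolve_organization_id(db: Session, raw_key: str):
    key_hash = hashlib.sha256(raw_key.encode()).hexdigest()
    api_key = db.scalars(select(ApiKey).where(ApiKey.key_hash == key_hash)).first()
    if api_key is None or api_key.revoked_at is not None:
        raise HTTPException(status_code=401, detail="Invalid API key")
    if api_key.expires_at is not None and api_key.expires_at < datetime.now(timezone.utc):
        raise HTTPException(status_code=401, detail="API key expired")
    return api_key.organization_id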

Background jobs

The ARQ task queue handles asynchronous processing via Redis (see the worker sketch below):

Schedule        Task
-------------   ----------------------------
Every 5 min     Retry failed webhooks
Hourly          Process trial expirations
Hourly          Generate periodic invoices
Daily (00:00)   Process pending downgrades
Daily (00:30)   Aggregate daily usage

On-demand tasks are enqueued by API endpoints:
  • Usage threshold checks — triggered after event ingestion
  • Data export processing — triggered on export creation
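
A minimal sketch of an ARQ worker wired up this way; the task names mirror the schedule above, but the real signatures and configuration live in app/tasks.py and app/worker.py:

from arq import cron
from arq.connections import RedisSettings

# Hypothetical task implementations (the real ones live in app/tasks.py).
async def retry_failed_webhooks(ctx):
    ...

async def check_usage_thresholds(ctx, event_id: str):
    ...

class WorkerSettings:
    redis_settings = RedisSettings.from_dsn("redis://localhost:6379")
    # On-demand tasks enqueued by API endpoints.
    functions = [check_usage_thresholds]
    # Scheduled tasks.
    cron_jobs = [
        cron(retry_failed_webhooks, minute=set(range(0, 60, 5))),  # every 5 minutes
    ]

An endpoint enqueues an on-demand task with a call like await pool.enqueue_job("check_usage_thresholds", event_id), where pool is created with arq.create_pool.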

ClickHouse integration

For high-volume event ingestion, ClickHouse can be enabled as an optional analytics store. When configured via CLICKHOUSE_URL, events are written to both the primary database and a ReplacingMergeTree table in ClickHouse, with aggregation queries routed to ClickHouse for performance.
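
As a rough sketch of the dual-write idea, assuming the clickhouse-connect client and a hypothetical usage_events table; the real table schema and client wiring live in app/core/clickhouse.py:

import uuid
from datetime import datetime, timezone

import clickhouse_connect

# The real client is built from CLICKHOUSE_URL; host/port are placeholders here.
client = clickhouse_connect.get_client(host="localhost", port=8123)

# ReplacingMergeTree deduplicates rows sharing the same ORDER BY key on merge,
# which keeps repeated writes of the same event cheap and idempotent.
client.command("""
    CREATE TABLE IF NOT EXISTS usage_events (
        event_id        UUID,
        organization_id UUID,
        code            String,
        timestamp       DateTime,
        properties      String
    )
    ENGINE = ReplacingMergeTree
    ORDER BY (organization_id, code, event_id)
""")

# After the event is committed to the primary database, mirror it to ClickHouse.
client.insert(
    "usage_events",
    [[uuid.uuid4(), uuid.uuid4(), "api_calls", datetime.now(timezone.utc), "{}"]],
    column_names=["event_id", "organization_id", "code", "timestamp", "properties"],
)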

Directory structure

backend/
├── app/
│   ├── main.py              # FastAPI app initialization
│   ├── worker.py             # ARQ worker configuration
│   ├── tasks.py              # Background task implementations
│   ├── core/
│   │   ├── auth.py           # API key authentication
│   │   ├── config.py         # Settings (Pydantic Settings)
│   │   ├── database.py       # SQLAlchemy engine & sessions
│   │   ├── clickhouse.py     # ClickHouse client
│   │   └── rate_limiter.py   # In-memory sliding window
│   ├── models/               # 40 SQLAlchemy models
│   ├── schemas/              # Pydantic request/response schemas
│   ├── repositories/         # Data access layer
│   ├── routers/              # 23 API router modules
│   ├── services/             # Business logic
│   │   ├── charge_models/    # 8 pricing calculators
│   │   ├── integrations/     # Accounting & CRM adapters
│   │   └── payment_providers/# Adyen, GoCardless adapters
│   └── alembic/              # Database migrations
├── tests/                    # Test suite (100% coverage required)
├── scripts/                  # OpenAPI generation
├── pyproject.toml            # Dependencies & config
├── Dockerfile                # Container build
└── alembic.ini               # Migration config

Configuration

All configuration is managed through environment variables loaded via Pydantic Settings:
Variable                       Description                        Default
----------------------------   --------------------------------   ----------------------
APP_DATABASE_DSN               Database connection string         sqlite:///...
REDIS_URL                      Redis for task queue               redis://localhost:6379
CORS_ORIGINS                   Allowed CORS origins               localhost:3000,5173
RATE_LIMIT_EVENTS_PER_MINUTE   Event ingestion rate limit         1000
CLICKHOUSE_URL                 ClickHouse connection (optional)
stripe_api_key                 Stripe API key
adyen_api_key                  Adyen API key
gocardless_access_token        GoCardless token

See .env.example for the full list.
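
A minimal sketch of the corresponding Settings class with pydantic-settings; only the variables named above are included, and the field types and defaults are assumptions:

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Environment variable names are matched case-insensitively against field names.
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    app_database_dsn: str = "sqlite:///..."  # default truncated in the table above; see .env.example
    redis_url: str = "redis://localhost:6379"
    cors_origins: str = "localhost:3000,5173"
    rate_limit_events_per_minute: int = 1000
    clickhouse_url: str | None = None

    stripe_api_key: str | None = None
    adyen_api_key: str | None = None
    gocardless_access_token: str | None = None

settings = Settings()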