Logical Architecture
The logical architecture separates functional responsibilities from physical hosting. This follows the Arcadia method’s principle of keeping logical components independent from their deployment targets.
Domain modules
Each module owns a bounded set of data and operations:
| Module | Responsibility | Domain | Primary tables / location |
|---|---|---|---|
| Identity & Session | Extract staff identity from CF Access JWT; provide Principal (email, subject) to all downstream handlers | Agent identification | N/A (stateless) |
| Performance | Measurement data queries, classification spec CRUD, nominal classification computation, mart-serving endpoints | Metrics + classification thresholds | measurement.performance + measurement.sales (R2 Iceberg), macrodata_artifact (kind: classification_spec) |
| Tags | Product tags, product categories, unclassified-tag discovery, tag analytics | Records + collections | product_tag, product_category, product |
| Combos | Combo report CRUD, import/export, combo templates, suggestion generation + accept-to-log | Reports + templates + suggestions | macrodata_artifact (kinds: combo_report, combo_template, combo_suggestion) |
| Assets | Design asset lifecycle, content hash deduplication, product-artifact links, ingest status | Stored files + product links | macrodata_artifact (kind: design_asset), link_product_macrodata_artifact |
| CSV Exports | Document artifact request lifecycle, generation job orchestration, download URL management | Documents carried by stored files | macrodata_artifact (kind: document_export) |
| Orders | Order header + line items, shipping, fulfillment tracking, order lookup | Order records | order_header, order_line_item, order_address, order_summary |
| Ingest & Sync | Shopify data sync orchestration, watermarks, pipeline execution, enrichment + stream push | Pipeline orchestration | CF Workflows + Durable Objects; domain writes to order_header, product; stream push to Pipelines |
| Health | Health endpoint, Workflow execution reporting, operational status | Workflow status observation | CF Workflows execution logs, sync.telemetry on R2 |
| Publisher & Telemetry | Publisher management, ad servers, publications, engagement telemetry, referrals, conversions, commissions | Creator commerce lifecycle | publisher, ad_server, publication, publication_item, publication_telemetry, commission/ledger tables |
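The Identity & Session module is the only stateless one: it derives a Principal from the CF Access JWT and hands it to every downstream handler. A minimal sketch of that extraction is below; the `Principal` and `extractPrincipal` names are illustrative, and signature verification (which real code must perform against the CF Access team's public certs) is omitted:

```typescript
// Illustrative sketch: derive a Principal from the CF Access JWT payload.
// Real code MUST verify the JWT signature before trusting these claims.
interface Principal {
  email: string;
  subject: string;
}

function extractPrincipal(jwt: string): Principal {
  // CF Access JWTs are standard three-part tokens: header.payload.signature
  const [, payloadB64] = jwt.split(".");
  const json = Buffer.from(payloadB64, "base64url").toString("utf8");
  const claims = JSON.parse(json) as { email: string; sub: string };
  return { email: claims.email, subject: claims.sub };
}
```

Because the module is stateless, every handler receives the same Principal shape regardless of which domain module serves the request.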
Data flow between modules
```mermaid
graph LR
    subgraph "Staff Dashboard"
        Perf["Performance"]
        Tags["Tags"]
        Combos["Combos"]
        Assets["Assets"]
        CSV["CSV Export"]
        Orders["Orders"]
    end
    subgraph "Background"
        Sync["Ingest & Sync"]
        Health["Health"]
    end
    subgraph "Creator Commerce"
        Pub["Publisher & Telemetry"]
    end
    subgraph "Data Layer"
        PG[(PlanetScale)]
        R2[(R2 Iceberg)]
    end
    Sync -->|write orders, products| PG
    Sync -->|stream events| R2
    Perf -->|query| R2
    Perf -->|classification specs| PG
    Tags -->|CRUD| PG
    Combos -->|CRUD| PG
    Assets -->|trigger ingest| Sync
    CSV -->|trigger export| Sync
    Orders -->|query| PG
    Pub -->|CRUD + telemetry| PG
    Pub -->|stream| R2
```
Contract types
HTTP contracts
REST-style endpoints for CRUD and dashboard queries. All endpoints are staff-only, protected by CF Access:
- `GET /api/performance-metrics` → list with filters (Measurement Data retrieval)
- `POST /api/performance-metrics/sync` → trigger sync Workflow (ProcedureExecution initiation)
- `GET /api/combo-logs` → list with filters (combo_report artifact retrieval)
- `POST /api/combo-logs/import` → trigger CSV import Workflow (ProcedureExecution)
- `POST /api/assets/ingest` → trigger asset ingest Workflow (ProcedureExecution)
- `POST /api/csv/generate` → trigger CSV export Workflow (Document Artifact generation)
- `GET /api/health` → service + Workflow status (EngineeredSystem status)

Full endpoint inventory in Scenarios Overview.
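The endpoints above can be dispatched by a plain Worker fetch handler. The sketch below assumes no routing framework; the handler bodies are stubs (the real handlers delegate to their domain modules), and `route` is a hypothetical name:

```typescript
// Illustrative routing sketch for the staff-only endpoints. CF Access sits
// in front of the Worker, so requests reaching this code are authenticated.
type Handler = (req: Request) => Promise<Response> | Response;

const routes: Record<string, Handler> = {
  "GET /api/health": () => Response.json({ status: "ok" }),
  "POST /api/csv/generate": async (_req) => {
    // Real handler would start the CSV Export Workflow here.
    return Response.json({ accepted: true }, { status: 202 });
  },
  // ...remaining endpoints registered the same way
};

function route(req: Request): Promise<Response> | Response {
  const key = `${req.method} ${new URL(req.url).pathname}`;
  const handler = routes[key];
  return handler ? handler(req) : new Response("Not found", { status: 404 });
}
```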
Job contracts
Background work orchestrated by CF Workflows. Each Workflow is a durable multi-step execution with step tracking:
| Workflow | Trigger | Steps | Scenario |
|---|---|---|---|
| Shopify Bulk Sync | Cron / API | initiate → request → poll → download → parse+upsert → enrich+stream → advance watermark | Bulk Sync |
| Measurement Refresh | Cron (hourly) | query → compute → classify → write (DuckDB over R2) | Pipeline |
| Asset Ingest | API | download → hash → upload R2 → link product → write Shopify metafield | Asset Ingest |
| CSV Export | API | query → format → upload R2 → notify | CSV Export |
| Suggestion Generation | API | query measurement data → generate suggestions → write artifacts | Combo Suggestions |
| Combo Import | API | parse CSV → validate → upsert combo_report artifacts | Combo CRUD |
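The Shopify Bulk Sync row can be read as an ordered step pipeline. The framework-agnostic sketch below mirrors those step names; in production each would be a `step.do(...)` call inside a CF Workflow (which persists step results durably), and the step bodies here are stubs:

```typescript
// Sketch of the Bulk Sync step sequence; names mirror the table above,
// bodies are stubs. A real CF Workflow persists each step's result.
type Step = { name: string; run: () => Promise<void> };

async function runSteps(steps: Step[], log: string[] = []): Promise<string[]> {
  for (const s of steps) {
    await s.run();    // real Workflows checkpoint here for durability
    log.push(s.name); // stand-in for CF's per-step execution tracking
  }
  return log;
}

const bulkSyncSteps: Step[] = [
  "initiate", "request", "poll", "download",
  "parse+upsert", "enrich+stream", "advance watermark",
].map((name) => ({ name, run: async () => undefined }));
```

Because steps run strictly in order and each is tracked, a failed run can report exactly which step it reached, which is what the Health module surfaces.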
Data contracts
Stable DTOs for API responses and Workflow payloads. Defined as Zod schemas shared between Workers:
```ts
// Shared between internal and platform workers
const WorkflowTriggerSchema = z.discriminatedUnion('workflow_type', [
  z.object({
    workflow_type: z.literal('SHOPIFY_BULK_SYNC'),
    run_id: z.string().uuid(),
  }),
  z.object({
    workflow_type: z.literal('ASSET_INGEST'),
    artifact_identifier: z.string(), // product being denoted
    dropbox_url: z.string().url(),
  }),
  // ...
]);
```
```ts
// API response DTOs
interface ProductMeasurementDatum {
  product_id: string;
  product_name: string;
  aggregate_measurement_value: number;
  sales_last_30_days: number;
  nominal_classification: string;
  // ...
}
```

Shopify client module
A single shared module replaces the 6 duplicated implementations. The Shopify client acts as a boundary between the external Shopify Admin API service and BluntDashboard’s internal domain:
```ts
// Logical interface — physical implementation in integrations/shopify
interface ShopifyClient {
  bulkQuery(query: string): Promise<BulkOperationResult>
  pollBulkOperation(id: string): Promise<BulkOperationStatus>
  getProducts(params: PaginationParams): AsyncIterable<ShopifyProduct>
  updateMetafield(productId: string, namespace: string, key: string, value: string): Promise<void>
}
```

This client handles authentication (permanent token preferred, Client Credentials Grant fallback), rate limiting (leaky bucket), retry with backoff, and structured logging.
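Of those cross-cutting behaviors, retry with backoff is the easiest to sketch in isolation. The helper below is illustrative (`withRetry`, the attempt count, and the delays are assumptions, not the real configuration); the injectable `sleep` makes the policy testable without waiting:

```typescript
// Illustrative retry-with-exponential-backoff helper, as described above.
// Attempt count and base delay are placeholder values, not real config.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Back off exponentially: 250ms, 500ms, 1000ms, ... (with defaults)
      if (i < attempts - 1) await sleep(baseDelayMs * 2 ** i);
    }
  }
  throw lastErr;
}
```

A production client would also classify errors (retrying throttles but not auth failures) and feed the leaky-bucket limiter's state into the delay.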