
# Logical Architecture

The logical architecture separates functional responsibilities from physical hosting. This follows the Arcadia method’s principle of keeping logical components independent from their deployment targets.

Each module owns a bounded set of data and operations:

| Module | Responsibility | Domain | Primary tables / location |
| --- | --- | --- | --- |
| Identity & Session | Extract staff identity from CF Access JWT; provide Principal (email, subject) to all downstream handlers | Agent identification | N/A (stateless) |
| Performance | Measurement data queries, classification spec CRUD, nominal classification computation, mart-serving endpoints | Metrics + classification thresholds | `measurement.performance` + `measurement.sales` (R2 Iceberg), `macrodata_artifact` (kind: `classification_spec`) |
| Tags | Product tags, product categories, unclassified-tag discovery, tag analytics | Records + collections | `product_tag`, `product_category`, `product` |
| Combos | Combo report CRUD, import/export, combo templates, suggestion generation + accept-to-log | Reports + templates + suggestions | `macrodata_artifact` (kinds: `combo_report`, `combo_template`, `combo_suggestion`) |
| Assets | Design asset lifecycle, content hash deduplication, product-artifact links, ingest status | Stored files + product links | `macrodata_artifact` (kind: `design_asset`), `link_product_macrodata_artifact` |
| CSV Exports | Document artifact request lifecycle, generation job orchestration, download URL management | Documents carried by stored files | `macrodata_artifact` (kind: `document_export`) |
| Orders | Order header + line items, shipping, fulfillment tracking, order lookup | Order records | `order_header`, `order_line_item`, `order_address`, `order_summary` |
| Ingest & Sync | Shopify data sync orchestration, watermarks, pipeline execution, enrichment + stream push | Pipeline orchestration | CF Workflows + Durable Objects; domain writes to `order_header`, `product`; stream push to Pipelines |
| Health | Health endpoint, Workflow execution reporting, operational status | Workflow status observation | CF Workflows execution logs, `sync.telemetry` on R2 |
| Publisher & Telemetry | Publisher management, ad servers, publications, engagement telemetry, referrals, conversions, commissions | Creator commerce lifecycle | `publisher`, `ad_server`, `publication`, `publication_item`, `publication_telemetry`, commission/ledger tables |
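The Identity & Session module's responsibility can be sketched as a small helper that turns the CF Access JWT into a Principal. This is an illustrative sketch only: the function name is hypothetical, and signature verification against the Access team's public keys (which the real module must perform) is omitted here.

```typescript
// Hypothetical sketch of Principal extraction from a CF Access JWT.
// NOTE: real code must verify the JWT signature before trusting its claims.
interface Principal {
  email: string;
  subject: string;
}

function principalFromAccessJwt(token: string): Principal {
  const parts = token.split('.');
  if (parts.length !== 3) throw new Error('malformed JWT');
  // The payload is the base64url-encoded middle segment
  const payload = JSON.parse(
    Buffer.from(parts[1], 'base64url').toString('utf8'),
  );
  return { email: payload.email, subject: payload.sub };
}
```

Downstream handlers then receive only the `Principal`, never the raw token, which keeps the module stateless as the table above notes.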
```mermaid
graph LR
  subgraph "Staff Dashboard"
    Perf["Performance"]
    Tags["Tags"]
    Combos["Combos"]
    Assets["Assets"]
    CSV["CSV Export"]
    Orders["Orders"]
  end

  subgraph "Background"
    Sync["Ingest & Sync"]
    Health["Health"]
  end

  subgraph "Creator Commerce"
    Pub["Publisher & Telemetry"]
  end

  subgraph "Data Layer"
    PG[(PlanetScale)]
    R2[(R2 Iceberg)]
  end

  Sync -->|write orders, products| PG
  Sync -->|stream events| R2
  Perf -->|query| R2
  Perf -->|classification specs| PG
  Tags -->|CRUD| PG
  Combos -->|CRUD| PG
  Assets -->|trigger ingest| Sync
  CSV -->|trigger export| Sync
  Orders -->|query| PG
  Pub -->|CRUD + telemetry| PG
  Pub -->|stream| R2
```

The modules expose REST-style endpoints for CRUD and dashboard queries. All are staff-only, protected by CF Access:

- `GET /api/performance-metrics` → list with filters (Measurement Data retrieval)
- `POST /api/performance-metrics/sync` → trigger sync Workflow (ProcedureExecution initiation)
- `GET /api/combo-logs` → list with filters (`combo_report` artifact retrieval)
- `POST /api/combo-logs/import` → trigger CSV import Workflow (ProcedureExecution)
- `POST /api/assets/ingest` → trigger asset ingest Workflow (ProcedureExecution)
- `POST /api/csv/generate` → trigger CSV export Workflow (Document Artifact generation)
- `GET /api/health` → service + Workflow status (EngineeredSystem status)

Full endpoint inventory in Scenarios Overview.
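For the list endpoints, filters travel as query parameters. A minimal sketch of building such a request URL, assuming hypothetical filter names (the real parameter names live in the endpoint inventory):

```typescript
// Build a filtered list-endpoint URL; filter keys here are illustrative.
function performanceMetricsUrl(
  base: string,
  filters: Record<string, string>,
): string {
  const qs = new URLSearchParams(filters).toString();
  const path = `${base}/api/performance-metrics`;
  return qs ? `${path}?${qs}` : path;
}
```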

Background work is orchestrated by CF Workflows. Each Workflow is a durable multi-step execution with step tracking:

| Workflow | Trigger | Steps | Scenario |
| --- | --- | --- | --- |
| Shopify Bulk Sync | Cron / API | initiate → request → poll → download → parse+upsert → enrich+stream → advance watermark | Bulk Sync |
| Measurement Refresh | Cron (hourly) | query → compute → classify → write (DuckDB over R2) | Pipeline |
| Asset Ingest | API | download → hash → upload R2 → link product → write Shopify metafield | Asset Ingest |
| CSV Export | API | query → format → upload R2 → notify | CSV Export |
| Suggestion Generation | API | query measurement data → generate suggestions → write artifacts | Combo Suggestions |
| Combo Import | API | parse CSV → validate → upsert `combo_report` artifacts | Combo CRUD |
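The final "advance watermark" step of the bulk sync deserves a note: because a Workflow run can be retried or replayed, the watermark must only ever move forward. A hedged sketch of that invariant, with illustrative field names:

```typescript
// Sketch of the advance-watermark step's invariant: a re-run over an
// older window must never rewind sync progress. Field names are illustrative.
interface SyncWatermark {
  pipeline: string;
  updatedAtCursor: string; // ISO-8601 timestamp of the newest synced record
}

function advanceWatermark(
  current: SyncWatermark,
  candidateCursor: string,
): SyncWatermark {
  // Same-format ISO-8601 timestamps compare correctly as strings
  if (candidateCursor <= current.updatedAtCursor) return current;
  return { ...current, updatedAtCursor: candidateCursor };
}
```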

API responses and Workflow payloads use stable DTOs, defined as Zod schemas shared between Workers:

```typescript
import { z } from 'zod';

// Shared between internal and platform workers
const WorkflowTriggerSchema = z.discriminatedUnion('workflow_type', [
  z.object({
    workflow_type: z.literal('SHOPIFY_BULK_SYNC'),
    run_id: z.string().uuid(),
  }),
  z.object({
    workflow_type: z.literal('ASSET_INGEST'),
    artifact_identifier: z.string(), // the product the asset denotes
    dropbox_url: z.string().url(),
  }),
  // ...
]);

// API response DTOs
interface ProductMeasurementDatum {
  product_id: string;
  product_name: string;
  aggregate_measurement_value: number;
  sales_last_30_days: number;
  nominal_classification: string;
  // ...
}
```
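At compile time the same discriminant gives exhaustive narrowing on the consumer side. A plain-TypeScript mirror of the union above, for illustration only (runtime validation still goes through `WorkflowTriggerSchema.parse`):

```typescript
// Illustrative mirror of the Zod discriminated union; `routeTrigger` is a
// hypothetical dispatcher, not part of the real codebase.
type WorkflowTrigger =
  | { workflow_type: 'SHOPIFY_BULK_SYNC'; run_id: string }
  | { workflow_type: 'ASSET_INGEST'; artifact_identifier: string; dropbox_url: string };

function routeTrigger(t: WorkflowTrigger): string {
  switch (t.workflow_type) {
    case 'SHOPIFY_BULK_SYNC':
      // t is narrowed: run_id is available, artifact_identifier is not
      return `bulk-sync:${t.run_id}`;
    case 'ASSET_INGEST':
      return `asset-ingest:${t.artifact_identifier}`;
  }
}
```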

A single shared module replaces the six duplicated implementations. The Shopify client acts as a boundary between the external Shopify Admin API service and BluntDashboard’s internal domain:

```typescript
// Logical interface — physical implementation in integrations/shopify
interface ShopifyClient {
  bulkQuery(query: string): Promise<BulkOperationResult>;
  pollBulkOperation(id: string): Promise<BulkOperationStatus>;
  getProducts(params: PaginationParams): AsyncIterable<ShopifyProduct>;
  updateMetafield(productId: string, namespace: string, key: string, value: string): Promise<void>;
}
```

This client handles authentication (permanent token preferred, Client Credentials Grant fallback), rate limiting (leaky bucket), retry with backoff, and structured logging.
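The retry-with-backoff behavior can be sketched as a pure delay schedule: exponential growth from a base delay, capped so a long outage does not produce unbounded waits. The constants here are illustrative, not the values the real client uses.

```typescript
// Hedged sketch of an exponential-backoff schedule with a cap.
// baseMs and capMs are illustrative defaults, not the client's real tuning.
function backoffDelaysMs(attempts: number, baseMs = 500, capMs = 30_000): number[] {
  const delays: number[] = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(capMs, baseMs * 2 ** i));
  }
  return delays;
}

// backoffDelaysMs(4) → [500, 1000, 2000, 4000]
```

In practice each delay would also get random jitter so that concurrent Workflow steps do not retry in lockstep against Shopify's leaky-bucket limiter.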