Physical Architecture

This section binds the logical architecture to Cloudflare and PlanetScale components. Each physical component is an EngineeredSystem (IOF) realizing functions defined in the logical layer.

dashboard-api is the primary HTTP surface, built with Hono and deployed as a single Worker:

dashboard-api Worker (EngineeredSystem)
├── middleware/
│   ├── cf-access.ts      # JWT validation → Principal (CCO Agent identification)
│   ├── error-handler.ts  # Structured error responses
│   └── request-id.ts     # Trace correlation
├── routes/
│   ├── performance.ts    # Measurement Data + Objective Specification CRUD
│   ├── combos.ts         # Creative Act Reports, Plan Specifications, Suggestions
│   ├── tags.ts           # Tag ICEs, ICE Collections, analytics
│   ├── assets.ts         # ICE ingest trigger, denotation status
│   ├── csv.ts            # Document Artifact trigger, download
│   ├── orders.ts         # Commercial Transaction Record lookup
│   └── health.ts         # EngineeredSystem health
├── lib/
│   ├── db.ts             # Drizzle + Hyperdrive
│   └── shopify.ts        # Shared Shopify client
└── bindings: R2, Queue, Hyperdrive, KV (optional)
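To illustrate what cf-access.ts produces, here is a minimal sketch of mapping a Cf-Access-Jwt-Assertion payload to a Principal. The Principal shape and the claim names (`email`, `sub`) are assumptions; real middleware must first verify the JWT signature against the Access team's public keys, which this sketch deliberately omits.

```typescript
// Sketch only: maps already-validated JWT claims to a Principal.
// Signature verification against the CF Access certs endpoint is
// omitted here and must happen before this mapping runs.
interface Principal {
  email: string
  subject: string
}

export function principalFromJwt(jwt: string): Principal {
  const parts = jwt.split('.')
  if (parts.length !== 3) throw new Error('Malformed JWT')
  // Convert base64url to base64 before decoding the payload segment.
  const b64 = parts[1].replace(/-/g, '+').replace(/_/g, '/')
  const claims = JSON.parse(atob(b64)) as { email?: string; sub?: string }
  if (!claims.email || !claims.sub) throw new Error('Missing identity claims')
  return { email: claims.email, subject: claims.sub }
}
```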

ingest is the queue consumer and cron handler; it processes all background jobs as ProcedureExecutions (PKO):

ingest Worker (EngineeredSystem — ProcedureExecution engine)
├── handlers/
│   ├── shopify-bulk-sync.ts  # ProcedureExecution: initiate, poll, download, parse, upsert
│   ├── product-tag-sync.ts   # Act: updating MaterialArtifact Qualities (tags)
│   ├── asset-ingest.ts       # ProcedureExecution: Dropbox → R2 (IBE) → DB → Shopify
│   ├── csv-export.ts         # ProcedureExecution: query → Document Artifact → R2 (IBE)
│   ├── combo-suggestions.ts  # ProcedureExecution: generate Suggestion ICEs from Measurement Data
│   ├── combo-import.ts       # ProcedureExecution: parse CSV → upsert Creative Act Reports
│   └── mart-refresh.ts       # Aggregation Process: rebuild performance_measurement_dataset
├── scheduled.ts              # Cron trigger → enqueue ProcedureExecution jobs
├── queue.ts                  # Message dispatch by job_type discriminator
└── bindings: R2, Queue, Hyperdrive
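The queue.ts dispatch can be sketched as a lookup keyed on the job_type discriminator. The job_type values and the handlers' return values below are illustrative assumptions that mirror the handlers/ tree; the real handlers perform the ProcedureExecution side effects instead of returning strings.

```typescript
// Sketch of message dispatch by the job_type discriminator.
type JobType =
  | 'shopify_bulk_sync'
  | 'product_tag_sync'
  | 'asset_ingest'
  | 'csv_export'
  | 'combo_suggestions'
  | 'combo_import'
  | 'mart_refresh'

interface JobMessage {
  job_type: JobType
  payload?: unknown
}

// Stub handlers return a status string so the dispatch is observable;
// real handlers run the corresponding ProcedureExecution.
const handlers: Record<JobType, (payload: unknown) => Promise<string>> = {
  shopify_bulk_sync: async () => 'bulk sync started',
  product_tag_sync: async () => 'tags synced',
  asset_ingest: async () => 'asset ingested',
  csv_export: async () => 'csv exported',
  combo_suggestions: async () => 'suggestions generated',
  combo_import: async () => 'combos imported',
  mart_refresh: async () => 'mart refreshed',
}

export async function dispatch(msg: JobMessage): Promise<string> {
  const handler = handlers[msg.job_type]
  if (!handler) {
    // Unknown job types fail loudly so the queue can retry or dead-letter them.
    throw new Error(`Unknown job_type: ${msg.job_type}`)
  }
  return handler(msg.payload)
}
```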

React SPA deployed to Cloudflare Pages:

  • Build: cd client && npm run build (Vite)
  • Routing: Pages handles SPA fallback (all paths → index.html)
  • API proxy: Pages Functions or custom domain routing to dashboard-api Worker
  • Auth: CF Access gate sits in front of both Pages and Workers
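One way to wire the API proxy is a catch-all Pages Function that forwards to the Worker over a service binding. The file path `functions/api/[[path]].ts` and the binding name `DASHBOARD_API` are hypothetical; this is a sketch, not the project's actual wiring.

```typescript
// Hypothetical functions/api/[[path]].ts Pages Function: forwards every
// /api/* request to the dashboard-api Worker through a service binding.
// The binding name DASHBOARD_API is an assumption.
type PagesContext<Env> = { request: Request; env: Env }

interface Env {
  DASHBOARD_API: { fetch(req: Request): Promise<Response> }
}

export async function onRequest({ request, env }: PagesContext<Env>): Promise<Response> {
  // Pass the request through unchanged; CF Access has already gated it.
  return env.DASHBOARD_API.fetch(request)
}
```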

When dashboard-api needs to invoke ingest Worker directly (e.g., synchronous health aggregation), Service Bindings provide zero-overhead calls:

# wrangler.toml for dashboard-api
[[services]]
binding = "INGEST"
service = "ingest"

// In a dashboard-api route handler:
const result = await env.INGEST.fetch(new Request('http://internal/health'));

Service bindings execute on the same thread in the same runtime as the caller, so the call involves no network hop.

| Service | Role | Connection | Ontological role |
| --- | --- | --- | --- |
| PlanetScale | Primary database (PostgreSQL) | Via Hyperdrive | IBE store for all persistent ICEs, Measurement Data, Process Records |
| Cloudflare R2 | Object storage (assets, JSONL archives, CSV exports) | Worker binding (env.R2_BUCKET) | IBE store for design asset ICEs, JSONL archives, Document Artifacts |
| Cloudflare Queues | Job dispatch between Workers | Worker binding (env.INGEST_QUEUE) | ProcedureExecution initiation channel |
| Cloudflare KV (optional) | Configuration cache, feature flags | Worker binding (env.KV) | Cached ICE fragments |

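As a sketch, these bindings might be declared in the Workers' wrangler.toml like this; the bucket, queue, and namespace names (and placeholder IDs) are assumptions, not the project's actual configuration.

```toml
# Hypothetical wrangler.toml fragment; resource names are assumptions.
[[r2_buckets]]
binding = "R2_BUCKET"
bucket_name = "dashboard-assets"

[[queues.producers]]
binding = "INGEST_QUEUE"
queue = "ingest-jobs"

[[hyperdrive]]
binding = "HYPERDRIVE"
id = "<hyperdrive-config-id>"

[[kv_namespaces]]
binding = "KV"
id = "<kv-namespace-id>"
```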

| Schedule | Job | Job type | Scenario |
| --- | --- | --- | --- |
| 0 6 * * * (daily 6am UTC) | Shopify bulk sync | PROCEDURE_EXECUTION_INITIATE | Daily Sync |
| 0 * * * * (hourly) | Mart refresh | MEASUREMENT_DATASET_REFRESH | Pipeline |
| 0 3 * * 0 (weekly Sun 3am) | Asset health check | ICE_HEALTH_CHECK | Asset Ingest |

All cron triggers execute in the ingest Worker’s scheduled() handler and enqueue jobs to the queue for actual processing. Each job is a ProcedureExecution (PKO) with defined Steps.
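A sketch of the scheduled() side of this, assuming the cron expression on the event selects the job type and the queue binding is named INGEST_QUEUE (both assumptions):

```typescript
// Maps each configured cron expression to the job_type it enqueues,
// mirroring the schedule table above.
const CRON_JOBS: Record<string, string> = {
  '0 6 * * *': 'PROCEDURE_EXECUTION_INITIATE', // daily Shopify bulk sync
  '0 * * * *': 'MEASUREMENT_DATASET_REFRESH',  // hourly mart refresh
  '0 3 * * 0': 'ICE_HEALTH_CHECK',             // weekly asset health check
}

export function jobForCron(cron: string): string {
  const jobType = CRON_JOBS[cron]
  if (!jobType) throw new Error(`No job mapped for cron: ${cron}`)
  return jobType
}

// In the ingest Worker entrypoint (message shape is an assumption):
// export default {
//   async scheduled(event: ScheduledEvent, env: Env) {
//     await env.INGEST_QUEUE.send({ job_type: jobForCron(event.cron) })
//   },
// }
```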