# Logical Architecture
The logical architecture separates functional responsibilities from physical hosting. This follows the Arcadia method’s principle of keeping logical components independent from their deployment targets. Each module is aligned to a specific ontological domain from the ontology stack.
## Domain modules
Each module owns a bounded set of data and operations, organized by ontological domain:
| Module | Responsibility | Ontological Domain | Primary tables |
|---|---|---|---|
| Identity & Session | Extract staff identity from CF Access JWT; provide Principal (email, subject) to all downstream handlers | CCO Agent identification | N/A (stateless) |
| Performance | Measurement data queries, objective specification CRUD, nominal classification computation, mart-serving endpoints | IAO Measurement Data + Objective Specifications | performance_measurement_dataset, objective_specification |
| Tags | Tag content entities, information entity collections, unclassified-tag discovery, tag analytics | IAO Information Content Entities + Collections | tag_content_entity, information_entity_collection, material_artifact |
| Combos | Creative act reports CRUD, import/export, plan specifications, suggestion generation + accept-to-log | IAO Reports + Plan Specifications + Suggestion ICEs | creative_act_report, plan_specification, suggestion_content_entity |
| Assets | Information content entity lifecycle, content hash deduplication, denotation relations, ingest status | IAO ICEs + denotation relations | information_content_entity, denotation_relation |
| CSV Exports | Document artifact request lifecycle, generation job orchestration, download URL management | IAO Documents carried by IBEs | document_artifact |
| Ingest & Sync | Shopify data sync orchestration, watermarks, procedure execution tracking, repeatable rebuild/mart refresh | PKO ProcedureExecution — the entire pipeline as process chain | procedure_execution_*, commercial_transaction_record, transaction_participation_record, material_artifact, sales_measurement_dataset, performance_measurement_dataset |
| Health | Health endpoint, procedure execution reporting, operational status | ProcedureExecution status observation | procedure_execution_record |
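The Identity & Session module's contract can be illustrated with a minimal sketch. This assumes Cloudflare Access has already verified the JWT signature at the edge; `decodePrincipal` and everything beyond the standard `email`/`sub` claims are illustrative, not the actual implementation.

```typescript
// Minimal sketch: extracting a Principal from a CF Access JWT payload.
// Assumes CF Access already verified the token at the edge; this only
// decodes claims, it does NOT validate the signature.
interface Principal {
  email: string;
  subject: string;
}

function decodePrincipal(jwt: string): Principal {
  const payloadB64 = jwt.split('.')[1];
  const json = Buffer.from(payloadB64, 'base64url').toString('utf8');
  const claims = JSON.parse(json) as { email: string; sub: string };
  return { email: claims.email, subject: claims.sub };
}
```

Keeping the Principal a plain value object is what lets every downstream handler stay stateless with respect to identity.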
## Contract types
### HTTP contracts
REST-style endpoints for CRUD and dashboard queries. All staff-only, protected by CF Access:
- `GET /api/performance-metrics` → list with filters (Measurement Data retrieval)
- `POST /api/performance-metrics/sync` → enqueue sync job (ProcedureExecution initiation)
- `GET /api/combo-logs` → list with filters (Creative Act Report retrieval)
- `POST /api/combo-logs/import` → enqueue CSV import (ProcedureExecution)
- `POST /api/assets/ingest` → enqueue asset ingest (ProcedureExecution)
- `POST /api/csv/generate` → enqueue CSV export (Document Artifact generation)
- `GET /api/health` → service + DB status (EngineeredSystem status)

Full endpoint inventory in Scenarios Overview.
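The CF Access protection shared by all of these endpoints can be sketched as a guard at the top of the request path. This assumes CF Access injects the `Cf-Access-Jwt-Assertion` header on authenticated requests; a real guard must also verify the JWT against the Access team's public keys rather than trusting the header's mere presence.

```typescript
// Hedged sketch of the staff-only guard in front of every endpoint.
// A real implementation verifies the JWT signature; this only checks
// that the request came through CF Access at all.
function accessGuard(headers: Map<string, string>): number {
  const jwt = headers.get('cf-access-jwt-assertion');
  if (!jwt) return 403; // request did not come through CF Access
  return 200; // identity extraction (Principal) would happen here
}
```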
### Job contracts
Queue messages representing async work. Each message carries a `job_type` discriminator, typed as a PKO ProcedureExecution initiation:
| Job type | Producer | Consumer | Scenario |
|---|---|---|---|
| `PROCEDURE_EXECUTION_INITIATE` | Cron / API | Ingest Worker | Bulk Sync |
| `PROCEDURE_EXECUTION_POLL` | Ingest Worker | Ingest Worker | Bulk Sync (16s interval, 128 max) |
| `FULL_BACKFILL_EXECUTE` | API (manual) | Ingest Worker | Full Backfill |
| `ARTIFACT_TAG_SYNC` | API | Ingest Worker | Tag Sync |
| `ICE_INGEST` | API | Ingest Worker | Asset Ingest |
| `DOCUMENT_ARTIFACT_GENERATE` | API | Ingest Worker | CSV Export |
| `SUGGESTION_ICE_GENERATE` | API | Ingest Worker | Combo Suggestions |
| `CREATIVE_ACT_REPORT_IMPORT` | API | Ingest Worker | Combo CRUD |
| `MEASUREMENT_DATASET_REFRESH` | Cron / Ingest Worker | Ingest Worker | Pipeline |
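The producer side can be sketched as building a message whose shape matches the table above. The queue binding name (`INGEST_QUEUE`) and the `run_id` generation shown here are assumptions, not the actual configuration.

```typescript
// Illustrative producer: build a job message with the job_type
// discriminator from the table above.
function makeInitiateJob(runId: string) {
  return { job_type: 'PROCEDURE_EXECUTION_INITIATE' as const, run_id: runId };
}

// In a Worker, the cron or HTTP handler would enqueue it via the
// (hypothetical) queue binding:
//   await env.INGEST_QUEUE.send(makeInitiateJob(crypto.randomUUID()));
```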
### Data contracts
Stable DTOs for API responses and queue payloads, defined as Zod schemas shared between the Workers, with field names matching the ontological naming conventions:
```typescript
// Shared between dashboard-api and ingest workers
import { z } from 'zod';

const IngestJobSchema = z.discriminatedUnion('job_type', [
  z.object({
    job_type: z.literal('PROCEDURE_EXECUTION_INITIATE'),
    run_id: z.string().uuid(),
  }),
  z.object({
    job_type: z.literal('ICE_INGEST'),
    artifact_identifier: z.string(), // MaterialArtifact being denoted
    dropbox_url: z.string().url(),
  }),
  // ...
]);
```
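On the consumer side, the ingest Worker can narrow on the `job_type` discriminator. A dependency-free sketch follows; the real contract validates incoming messages with the shared Zod schema (`IngestJobSchema.safeParse`) first, and the union here repeats only two of the variants for brevity.

```typescript
// Dispatch on the job_type discriminator. Assumes the message has
// already been validated against the shared schema.
type IngestJob =
  | { job_type: 'PROCEDURE_EXECUTION_INITIATE'; run_id: string }
  | { job_type: 'ICE_INGEST'; artifact_identifier: string; dropbox_url: string };

function dispatch(job: IngestJob): string {
  switch (job.job_type) {
    case 'PROCEDURE_EXECUTION_INITIATE':
      return `initiate run ${job.run_id}`;
    case 'ICE_INGEST':
      return `ingest asset for ${job.artifact_identifier}`;
  }
}
```

Because the union is discriminated, the `switch` is exhaustively checked by the compiler: adding a new job type forces every consumer to handle it.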
```typescript
// API response DTOs
interface ArtifactMeasurementDatum {
  artifact_identifier: string;
  product_name: string;
  aggregate_measurement_value: number;
  sales_last_30_days: number;
  nominal_classification: string;
  // ...
}
```

## Shopify client module
A single shared module replaces the six previously duplicated implementations. The Shopify client acts as the boundary between the external EngineeredSystem (the Shopify Admin API) and BluntDashboard's internal domain:
```typescript
// Logical interface — physical implementation in integrations/shopify
interface ShopifyClient {
  bulkQuery(query: string): Promise<BulkOperationResult>;
  pollBulkOperation(id: string): Promise<BulkOperationStatus>;
  getProducts(params: PaginationParams): AsyncIterable<ShopifyProduct>;
  updateMetafield(
    productId: string,
    namespace: string,
    key: string,
    value: string,
  ): Promise<void>;
}
```

This client handles authentication (permanent token preferred, Client Credentials Grant fallback), rate limiting (leaky bucket), retry with backoff, and structured logging.
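The retry-with-backoff behaviour can be sketched as a small wrapper around any client call. Attempt count and delays here are illustrative defaults, not the configured values, and a real implementation would also honour Shopify's rate-limit headers.

```typescript
// Sketch of retry with exponential backoff around a client call.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Exponential backoff: baseDelayMs, 2x, 4x, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

Usage would wrap individual client operations, e.g. `withRetry(() => client.pollBulkOperation(id))`, so transient API failures never surface to the domain modules.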