# Ontological Decision Records
Ontological Decision Records (ODRs) capture choices about how we apply formal ontology to BluntDashboard’s design. They complement the Architecture Decision Records — ADRs address technology choices, ODRs address modeling choices.
## ODR-001: BFO 2020 as upper ontology {#odr-001}
Context: Multiple upper ontologies exist (BFO, DOLCE, SUMO, UFO). We need one that aligns with the mid-level ontologies we want to use (IAO, CCO, IOF, PKO).
Decision: Use BFO 2020 (ISO/IEC 21838-2) as the upper ontology.
Rationale:
- BFO is the only upper ontology with an ISO standard
- IAO, CCO, IOF, and PKO are all built to extend BFO — using BFO means these layers compose naturally without impedance mismatch
- BFO’s Continuant/Occurrent distinction maps cleanly to our data warehouse pattern: Continuants are dimension/core tables, Occurrents are fact/event tables
- BFO is small (~36 classes) — it provides just enough structure without overwhelming engineering documentation
Consequences:
- All entity typing traces to BFO categories
- The Continuant/Occurrent split organizes the schema
- Spatial and temporal regions from BFO ground our pipeline ordering (`precedes`) and measurement anchoring
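The Continuant/Occurrent split can be made concrete with a small sketch. Everything below is illustrative: of the table names, only `commercial_transaction_record` appears in this spec, and per ODR-002 the actual schema carries this typing in documentation and naming only, not in runtime code.

```python
from enum import Enum

class BfoCategory(Enum):
    CONTINUANT = "continuant"  # persists through time: dimension/core tables
    OCCURRENT = "occurrent"    # unfolds in time: fact/event tables

# Hypothetical table-to-category map; only commercial_transaction_record
# is a real table name from this spec, the others are illustrative.
TABLE_CATEGORIES = {
    "product_dimension": BfoCategory.CONTINUANT,
    "customer_dimension": BfoCategory.CONTINUANT,
    "commercial_transaction_record": BfoCategory.OCCURRENT,
}

def fact_tables() -> list[str]:
    """Occurrent-typed tables are the fact/event tables of the warehouse."""
    return [t for t, c in TABLE_CATEGORIES.items() if c is BfoCategory.OCCURRENT]
```

The point of the sketch is the mapping itself: dimension tables line up with Continuants, fact/event tables with Occurrents.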
## ODR-002: Ontological grounding is documentation, not runtime enforcement {#odr-002}
Context: Some platforms enforce ontological constraints at runtime (OWL reasoners, SHACL validators, type-theoretic enforcement). This adds complexity, dependencies, and latency.
Decision: Ontological grounding lives in documentation and naming conventions only — not in runtime code.
Rationale:
- BluntDashboard is an internal ops tool, not a knowledge management system
- Naming conventions (table names, column names, DTO field names) carry ontological meaning without runtime overhead
- Documentation annotations (`[BFO: Process | PKO: ProcedureExecution]`) provide traceability without importing OWL libraries
- This aligns with ADR-003 (no stored procedures) and ADR-004 (FKs as documentation) — the pattern of documenting intent without runtime enforcement is consistent across the architecture
Consequences:
- No OWL files, RDF triples, or SHACL shapes in the codebase
- No runtime reasoner or validator
- Ontological consistency is maintained by naming conventions, code review, and this specification
- If naming conventions drift from ontological grounding, the fix is documentation, not a runtime error
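As a sketch of how the annotation convention could be checked without violating this decision, a hypothetical review-time helper might extract tags like `[BFO: Process | PKO: ProcedureExecution]` from docstrings or schema comments. The helper is an assumption, not part of the codebase; nothing like it would run at request time.

```python
import re

BRACKET = re.compile(r"\[([^\]]+)\]")

def parse_annotations(text: str) -> dict[str, str]:
    """Return {ontology: class} pairs from a '[BFO: X | PKO: Y]' style tag.

    Documentation-only tooling: useful during code review, never at runtime.
    """
    result: dict[str, str] = {}
    for body in BRACKET.findall(text):
        for part in body.split("|"):
            if ":" in part:
                onto, cls = part.split(":", 1)
                result[onto.strip()] = cls.strip()
    return result
```

If such a check ever existed, it would live in CI or a lint step, keeping the "no runtime reasoner" consequence intact.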
## ODR-003: FIBO used pragmatically, not structurally {#odr-003}
Context: FIBO (Financial Industry Business Ontology) provides excellent commercial transaction patterns, but FIBO is not fully BFO-aligned — its upper-level classes diverge from BFO’s hierarchy in several places. Importing FIBO structurally would create conflicts with the BFO → IAO → CCO chain.
Decision: Use FIBO’s conceptual patterns (transaction structure, product lifecycle, identifier schemes) without importing FIBO classes into our type hierarchy.
Rationale:
- FIBO’s commercial transaction pattern (agreement → commitment → event) maps well to our order/combo domain
- FIBO’s structured identifier patterns inform our combo ID convention
- But FIBO’s class hierarchy would conflict with CCO’s Agent and IOF’s MaterialArtifact at the mid-level
- Pragmatic adoption gets the design benefit without the impedance mismatch
Consequences:
- FIBO patterns appear in domain descriptions but not in formal type annotations
- Where FIBO and CCO/IOF overlap (e.g., “product”), we use CCO/IOF types
- The stack diagram shows FIBO with a dashed arrow (pragmatic use) rather than solid (structural import)
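The agreement → commitment → event pattern borrowed from FIBO can be sketched without importing any FIBO class. The dataclass names below are hypothetical stand-ins in our own vocabulary, not actual BluntDashboard or FIBO types; the point is that the pattern travels while the type hierarchy stays home.

```python
from dataclasses import dataclass

@dataclass
class Agreement:          # terms both parties accept (the placed order)
    order_id: str

@dataclass
class Commitment:         # obligation arising from the agreement
    agreement: Agreement
    description: str

@dataclass
class EconomicEvent:      # fulfillment of the commitment, unfolding in time
    commitment: Commitment
    occurred_at: str      # ISO 8601 timestamp, illustrative

def lifecycle(order_id: str) -> list[str]:
    """Trace one agreement -> commitment -> event chain for illustration."""
    agreement = Agreement(order_id)
    commitment = Commitment(agreement, "fulfill combo")
    event = EconomicEvent(commitment, "2024-01-01T00:00:00Z")
    return [type(step).__name__ for step in (agreement, commitment, event)]
```

None of these classes would carry formal type annotations tracing to FIBO; that is exactly the "pragmatic, not structural" boundary.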
## ODR-004: Raw layer stores IBEs; measurement typing happens at core/mart {#odr-004}
Context: The data pipeline has three layers (raw → core → mart). Raw data arrives as untyped JSON payloads from Shopify. At what layer do we apply ontological measurement typing?
Decision: Raw-layer tables store Information Bearing Entities carrying payloads. Ontological measurement typing (nominal, ordinal, ratio) is applied at the core and mart layers.
Rationale:
- Raw data is an archive of what Shopify sent — it should be stored faithfully, not re-interpreted
- The IAO distinction between IBE (the carrier) and ICE (the content) maps perfectly: raw tables hold IBEs (`raw_json` JSONB columns), core/mart tables hold typed ICEs (Measurement Data, Nominal Classifications)
- This matches the warehouse pattern: raw = staging, core = typed dimensions/facts, mart = business-ready aggregates
- Applying types too early would couple raw storage to business logic changes
Consequences:
- `commercial_transaction_record` and `transaction_participation_record` retain their `raw_json` columns as IBE payloads
- `sales_measurement_dataset` (core) contains typed IAO Measurement Data (ratio scale)
- `performance_measurement_dataset` (mart) contains typed CCO Nominal Measurements alongside ratio aggregates
- Schema comments in the DDL annotate each layer’s ontological status
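A minimal sketch of the IBE-to-ICE boundary, assuming a hypothetical payload shape (the `total_price` and `currency` fields are illustrative, not a commitment to Shopify's actual schema):

```python
import json
from dataclasses import dataclass

@dataclass
class RatioMeasurement:      # an ICE: typed content on a ratio scale
    value: float
    unit: str

def raw_to_core(raw_json: str) -> RatioMeasurement:
    """Measurement typing happens here, at the core layer, never at ingestion.

    The raw layer stores raw_json verbatim as an IBE; this function reads
    the carrier's content and produces a typed ICE from it.
    """
    payload = json.loads(raw_json)  # parse the IBE's content, don't mutate it
    return RatioMeasurement(value=float(payload["total_price"]),
                            unit=payload["currency"])
```

Because typing lives in the transform rather than the raw table, a business-logic change reruns the transform without touching the archived payloads.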
## ODR-005: Orders modeled as Processes (Occurrents), not Objects (Continuants) {#odr-005}
Context: Should a Shopify order be modeled as a thing (like a document you can hold) or an event (something that happens)?
Decision: Orders are modeled as BFO Processes (Occurrents) — specifically, commercial transactions that unfold over time.
Rationale:
- An order is not a physical object — it’s a commercial exchange event with temporal extent (placed → paid → fulfilled → delivered)
- BFO Process is the correct category: an order `has_participant` (customer, merchant), unfolds over a Temporal Region, and `precedes` downstream events (fulfillment, delivery)
- This aligns with REA: orders are Economic Events, not Economic Resources
- The alternative (modeling orders as Documents / IAO) would conflate the record of the transaction with the transaction itself
- We resolve this by splitting: the order is a Process (Occurrent), and `commercial_transaction_record` stores the Report (IAO) of that process
Consequences:
- Order tables are named as records of processes (`commercial_transaction_record`), not as the processes themselves
- Orders have temporal properties (`process_initiated_at`, financial status progression) rather than just object properties
- Line items are `transaction_participation_record` — records of how products participated in the transaction process
- This informs the Pipeline design: processing order data is processing records of events, not transforming objects
- Order Lookup is framed as information retrieval of Process Records, not object search
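The process framing can be sketched as follows. `commercial_transaction_record` and `process_initiated_at` come from this spec; the other field names and the simplified `precedes` check (a plain instant comparison standing in for BFO's relation over occurrents) are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CommercialTransactionRecord:    # the Report (IAO) of the order process
    order_id: str
    process_initiated_at: datetime    # temporal property of the process
    financial_status: str             # progression, e.g. "placed" -> "paid"

def precedes(earlier: datetime, later: datetime) -> bool:
    """Simplified stand-in for BFO 'precedes' over temporal instants."""
    return earlier < later

record = CommercialTransactionRecord(
    order_id="order-1",
    process_initiated_at=datetime(2024, 1, 1, tzinfo=timezone.utc),
    financial_status="placed",
)
fulfilled_at = datetime(2024, 1, 2, tzinfo=timezone.utc)
```

The record carries when-properties rather than where-properties: querying it is retrieving information about an event, not inspecting an object.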