
Tag Analytics

Tag analytics provides aggregate Measurement Data (IAO) views across product tags — enabling Creative and Ops teams (Persons bearing Roles) to understand which design categories, themes, and product lines are performing well. It builds on the tag classification system (Tag ICEs) and the mart refresh pipeline.

Goal: Answer questions like “How are all iPhone-floral MaterialArtifacts performing?” or “Which tag ICE category has the highest 30-day Measurement Data?”

| Actor | Ontological type | Role |
| --- | --- | --- |
| Staff user (Creative/Ops) | Person (CCO) bearing Role | Explores tag Measurement Data, creates ICE Collections, refreshes views |
| Dashboard API Worker | EngineeredSystem (IOF) | Serves analytics endpoints |
| PlanetScale | EngineeredSystem (IOF) | IBE store for Measurement Data |
| Mart refresh job | ProcedureExecution (PKO) | Rebuilds performance aggregates |
| Method | Path | Description |
| --- | --- | --- |
| POST | /api/tag-analytics/performance | Get aggregated Measurement Data for a set of tags |
| GET | /api/tag-analytics/distribution/:category | Get tag distribution within a category |
| GET | /api/tag-analytics/group-performance/:groupId | Get Measurement Data for a saved ICE Collection |
| POST | /api/tag-analytics/refresh | Trigger a refresh of the tag performance aggregate |
```ts
// POST /api/tag-analytics/performance
{
  tagNames: string[] // Tag ICE identifiers, e.g. ["floral", "iphone-15", "summer-2024"]
}
```

Returns aggregated Measurement Data (total units, 30-day units, MaterialArtifact counts) for the given set of Tag ICEs.
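As a sketch of the aggregation semantics — assuming the endpoint matches MaterialArtifacts carrying *all* of the requested tags (the AND-vs-OR rule is not stated above), and with illustrative row shapes standing in for the real `material_artifact` / `sales_measurement_dataset` / `tag_content_entity` join:

```typescript
// Hypothetical denormalized row: one MaterialArtifact with its Tag ICEs and
// its sales Measurement Data. Field names are illustrative, not the schema.
interface ArtifactRow {
  artifactId: string;
  tags: string[];     // Tag ICE identifiers attached to this artifact
  totalUnits: number; // lifetime units sold
  units30d: number;   // trailing 30-day units sold
}

interface TagPerformance {
  totalUnits: number;
  units30d: number;
  artifactCount: number;
}

// Aggregate Measurement Data for artifacts carrying ALL requested tags.
function tagPerformance(rows: ArtifactRow[], tagNames: string[]): TagPerformance {
  const matching = rows.filter(r => tagNames.every(t => r.tags.includes(t)));
  return {
    totalUnits: matching.reduce((sum, r) => sum + r.totalUnits, 0),
    units30d: matching.reduce((sum, r) => sum + r.units30d, 0),
    artifactCount: matching.length,
  };
}
```

With this reading, `tagPerformance(rows, ["floral", "iphone-15"])` answers the "iPhone-floral" question from the goal above by intersecting the two tags.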

The distribution endpoint returns how Tag ICEs within a category are distributed across MaterialArtifacts — useful for understanding coverage (e.g., “80% of MaterialArtifacts have a season tag, but only 30% have a designer tag”).
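A minimal in-memory sketch of the coverage figure and the per-tag counts the distribution endpoint reports; the row shape (tags grouped by category per artifact) is an assumption for illustration:

```typescript
// Hypothetical shape: each MaterialArtifact's Tag ICEs keyed by category,
// e.g. { season: ["summer-2024"], designer: [] }.
interface TaggedArtifact {
  artifactId: string;
  tagsByCategory: Record<string, string[]>;
}

// Fraction of artifacts carrying at least one tag in the category
// (the "80% have a season tag" style of coverage figure).
function categoryCoverage(rows: TaggedArtifact[], category: string): number {
  if (rows.length === 0) return 0;
  const covered = rows.filter(r => (r.tagsByCategory[category] ?? []).length > 0);
  return covered.length / rows.length;
}

// Artifact count per tag within the category.
function tagCountsInCategory(
  rows: TaggedArtifact[],
  category: string
): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const r of rows) {
    for (const tag of r.tagsByCategory[category] ?? []) {
      counts[tag] = (counts[tag] ?? 0) + 1;
    }
  }
  return counts;
}
```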

[IAO: Aggregation over ICEs producing Measurement Data]

```mermaid
sequenceDiagram
    participant User as Staff (Person bearing Role)
    participant API as Dashboard API (EngineeredSystem)
    participant DB as PlanetScale (IBE Store)

    User->>API: POST /tag-analytics/performance
    API->>DB: Query material_artifact + sales_measurement_dataset + tag_content_entity
    DB-->>API: Aggregated Measurement Data
    API-->>User: Tag performance Measurement Data

    User->>API: POST /tag-analytics/refresh
    API->>DB: Rebuild tag performance aggregate (Aggregation Process)
    DB-->>API: Success
    API-->>User: { success: true }
```
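For illustration, a client-side call to the performance endpoint could be prepared as below. Only the path and the `{ tagNames }` body shape come from the endpoint table; the base URL, header set, and helper name are assumptions:

```typescript
// Plain request description, kept free of fetch so the shape is easy to test.
interface HttpRequest {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
}

// Build the POST /api/tag-analytics/performance request for a set of Tag ICEs.
function buildPerformanceRequest(baseUrl: string, tagNames: string[]): HttpRequest {
  return {
    url: `${baseUrl}/api/tag-analytics/performance`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ tagNames }),
  };
}
```

The returned object maps directly onto `fetch(url, { method, headers, body })` in the Dashboard API Worker's client code.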

The current system relies on Supabase RPC functions for all tag analytics:

| Current RPC | Target implementation |
| --- | --- |
| `get_tag_performance_aggregate(tagNames)` | TypeScript query joining `material_artifact` + `sales_measurement_dataset` + `tag_content_entity`, filtered by tag names |
| `get_tag_distribution_by_category(category)` | TypeScript query counting MaterialArtifacts per tag within a category |
| `get_tag_group_performance(groupId)` | Lookup ICE Collection → get tag names → run aggregate query |
| `refresh_tag_performance_aggregate()` | Background ProcedureExecution that rebuilds the aggregate table (runs hourly via cron) |
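The `get_tag_group_performance` replacement — look up the ICE Collection, then aggregate over its tag names — can be sketched as below. All shapes are illustrative, and the any-tag matching rule (an artifact counts if it carries *any* tag in the collection) is an assumption:

```typescript
// Hypothetical saved ICE Collection: a named group of Tag ICE identifiers.
interface IceCollection {
  groupId: string;
  tagNames: string[];
}

// Minimal stand-in for the joined artifact + sales rows.
interface TaggedSales {
  tags: string[];
  totalUnits: number;
}

// Resolve the collection, then aggregate Measurement Data for artifacts
// carrying any of its tags. Unknown groupIds fail loudly.
function groupPerformance(
  collections: IceCollection[],
  rows: TaggedSales[],
  groupId: string
): { totalUnits: number; artifactCount: number } {
  const collection = collections.find(c => c.groupId === groupId);
  if (!collection) throw new Error(`Unknown ICE Collection: ${groupId}`);
  const matching = rows.filter(r => r.tags.some(t => collection.tagNames.includes(t)));
  return {
    totalUnits: matching.reduce((sum, r) => sum + r.totalUnits, 0),
    artifactCount: matching.length,
  };
}
```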
- Performance aggregate returns correct Measurement Data totals for a given set of Tag ICEs
- Distribution endpoint shows accurate MaterialArtifact counts per tag within a category
- ICE Collection performance matches manually querying the collection's constituent Tag ICEs
- Refresh endpoint triggers a rebuild ProcedureExecution, and the refreshed Measurement Data is reflected in the next query