
AI Director System Context

Consolidated reference document for the Sim RaceCenter AI Director pipeline — system architecture, intents, sequence format, broadcast rules, event types, and template generation guidelines. Optimised as AI context.

Sim RaceCenter Team · 21 min read

This document is the consolidated knowledge base for the Sim RaceCenter AI Director pipeline. It is optimised for injection into AI model prompts. Read all sections before generating sequences or evaluating broadcast decisions.


1. Platform Overview

Sim RaceCenter automates live sim racing broadcasts. It replaces the human broadcast director for iRacing events. Tagline: Orchestrate the Chaos.

Three-Component Architecture

Component | Role | Location
Race Control (Next.js web UI) | Session config, driver management, monitoring | Azure Static Web Apps
Race Control API (Azure Functions) | AI pipeline, data store, Director contract | Azure Functions + Cosmos DB
Director App (Electron desktop) | Sequence execution against local hardware | On-premise media/broadcast rig

User Roles

  • Broadcast Director — operates the Race Control UI to configure sessions, drivers, and OBS scenes
  • Broadcast Agent (AI assistant) — guides the director through setup via conversational flows
  • AI Director (this pipeline) — makes real-time camera and scene decisions during live races

2. Director App

The Director App runs on the broadcast rig alongside iRacing and OBS Studio. It is an Electron application written in TypeScript.

Extension System

Each hardware integration is an extension that registers intents (actions it can execute) with the system.

Extension | Controls | Key Intents
iRacing | Simulator camera, replay | broadcast.showLiveCam, broadcast.replayEvent
OBS | Scene switching, source visibility | obs.switchScene, overlay.show, overlay.hide
Discord | TTS, notifications | communication.announce
YouTube | Stream health, chat | communication.talkToChat

Director Loop

POST /sessions/{id}/checkin  →  Planner fires (async)
POST /sessions/{id}/sequences/next  →  Executor fires (per poll, ~10–30s)
Execute PortableSequence locally
Repeat poll

The Director sends its CapabilityCatalog at check-in — the full list of intents it can execute. The Planner generates only templates that use available intents.
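As a minimal sketch of how that catalog might be assembled at check-in time (the `ExtensionInfo` shape and helper name are illustrative, not the actual contract), intents are gathered only from extensions that are actually online:

```typescript
// Hypothetical shape -- the real CapabilityCatalog contract may differ.
interface ExtensionInfo {
  name: string;
  connected: boolean;
  intents: string[];
}

// Collect intents only from online extensions, so the Planner never
// generates templates that use actions this rig cannot execute.
function buildCapabilityCatalog(extensions: ExtensionInfo[]): string[] {
  return extensions
    .filter((ext) => ext.connected)
    .flatMap((ext) => ext.intents)
    .sort();
}
```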

Two Rig Roles

Role | Extensions | Director Loop | Network
Driver Rig (publisher mode) | iRacing only | Disabled | Outbound POST to Race Control
Media/Director Rig | iRacing + OBS + Discord + YouTube | Active | Poll Race Control for sequences

Driver rigs are fire-and-forget. They never receive commands. Race Control is the sole intermediary between rigs.

Sequence Library Tiers

  1. Built-in — shipped with the app
  2. Cloud — AI-generated templates from this session's Planner run
  3. Custom — operator-created, stored in user data directory

3. Race Control API

Runtime: Node.js 20, TypeScript, Azure Functions (Flex Consumption)
Database: Azure Cosmos DB (NoSQL serverless), partitioned by raceSessionId or centerId
AI: Google Vertex AI via @google/genai — Gemini 3.0 Pro (Planner), Gemini 2.5 Flash (Executor)
Auth: Microsoft Entra ID JWT for API; managed identity for Cosmos; Workload Identity Federation for Google Cloud

Key Director Endpoints

Method | Path | Purpose
POST | /api/director/v1/sessions/{id}/checkin | Send capabilities, trigger Planner
POST | /api/director/v1/sessions/{id}/sequences/next | Poll for next AI sequence (triggers Executor)
POST | /api/telemetry/events | Publisher rigs POST RaceEvent[]

Cosmos DB Containers

Container | Partition Key | TTL | Contents
raceSessions | /centerId | none | Session config
sequenceTemplates | /raceSessionId | 7 days | Planner output
portableSequences | /centerId | none | Operator library
sessionCheckins | /raceSessionId | none | Director check-ins
raceEvents | /raceSessionId | 90 days | Publisher events + cloud-synthesised events
pendingCommands | /raceSessionId | 1 hour | Chat bot command buffer
telemetry | /raceSessionId | 24 hours | Raw telemetry frames

4. AI Two-Tier Pipeline

Tier 1: Planner

  • Model: Gemini 3.0 Pro
  • Trigger: Once at session check-in (fire-and-forget)
  • Output: 20–30 SequenceTemplate documents stored in Cosmos DB with 7-day TTL

Planner receives:

  • Full intent registry (available actions)
  • Hardware connection health (which extensions are online)
  • Session config: drivers, car numbers, OBS scenes (display names, not UUIDs), camera groups
  • Operator's custom sequences (as style examples)
  • Race context: session type, track, series, caution rules, field size
  • Race Director's Notes from the Center entity (operator broadcast style guidance)
  • Race Session Summary (narrative context for this specific broadcast)

Planner generates templates by category. Categories depend on session type:

Session Type | Template Categories
Race | battle, leader, incident, caution, pit-stop, victory, restart, closing
Practice | solo-driver, scenic, hot-lap
Qualifying | hot-lap, timing-comparison, solo-driver

Generation rules:

  • No pace car templates for sessions with local cautions
  • No victory templates for practice or qualifying sessions
  • No caution templates for qualifying sessions
  • Every template must use only intents present in the capability catalog
  • Every broadcast.showLiveCam step must be followed by a system.wait step
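The last two rules are mechanical and can be checked outside the model. A minimal sketch (`validateSteps` is an illustrative helper, not part of the actual codebase):

```typescript
interface Step {
  intent: string;
}

// Flags violations of two Planner rules: every intent must exist in the
// capability catalog, and every broadcast.showLiveCam must be followed
// immediately by a system.wait step.
function validateSteps(steps: Step[], catalog: Set<string>): string[] {
  const errors: string[] = [];
  steps.forEach((step, i) => {
    if (!catalog.has(step.intent)) {
      errors.push(`step ${i}: intent not in capability catalog: ${step.intent}`);
    }
    if (
      step.intent === "broadcast.showLiveCam" &&
      steps[i + 1]?.intent !== "system.wait"
    ) {
      errors.push(`step ${i}: broadcast.showLiveCam not followed by system.wait`);
    }
  });
  return errors;
}
```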

Tier 2: Executor

  • Model: Gemini 2.5 Flash
  • Trigger: Every /sequences/next poll (~every 10–30s during a live broadcast)
  • Retry: Exponential backoff, 3 attempts, 1s–15s, random jitter

Executor receives:

  • All session templates from the Planner
  • Live AISnapshot: race flags, leaderboard (top 20), battles, driver history, recent raceEvents timeline
  • RaceContext from Director: session type, track, laps remaining, pitting cars, current OBS scene, detected battles
  • Pending commands from the command buffer (these take priority)
  • Race Director's Notes and Session Summary (via static prompt prefix / context cache)

Executor selection rules:

  • No caution or pace car templates when the flag is GREEN
  • No green-flag racing templates when the flag is CAUTION or RED
  • No victory templates in practice or qualifying
  • No battle templates when no battles are detected (no cars within a 1.0s gap)
  • No pit stop templates when no cars are currently pitting
  • Never repeat the same template or the same primary driver as the immediately preceding sequence
  • Prefer templates matching the current race phase (opening, rhythm, action, pit-cycle, caution, closing)
  • If the primary driver is OFF-TRACK or IN-PIT, select a field coverage template, not a team driver template
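Several of these rules are hard constraints that can be applied as a pre-filter before the model ranks what remains. A sketch under assumed names (the `category` field and `SelectionState` shape are illustrative):

```typescript
interface Template {
  id: string;
  category: string; // e.g. "battle", "caution", "pit-stop", "leader"
}

interface SelectionState {
  flag: "GREEN" | "CAUTION" | "RED";
  battleCount: number; // cars within a 1.0s gap
  pittingCount: number;
  lastTemplateId?: string;
}

// Removes templates the selection rules forbid outright; ranking by race
// phase and driver variety happens afterwards in the model.
function eligibleTemplates(templates: Template[], s: SelectionState): Template[] {
  return templates.filter((t) => {
    if (t.id === s.lastTemplateId) return false; // never repeat
    if (t.category === "caution" && s.flag === "GREEN") return false;
    if (t.category === "battle" && s.battleCount === 0) return false;
    if (t.category === "pit-stop" && s.pittingCount === 0) return false;
    return true;
  });
}
```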

Executor response format:

{
  "templateIndex": 2,
  "variables": {
    "targetDriver": "5",
    "secondDriver": "8",
    "cameraGroup": "Chase",
    "durationMs": 10000
  },
  "durationMs": 30000
}

templateIndex is the 0-based index into the template array. durationMs at the top level is total sequence duration; durationMs inside variables is the per-step camera hold time used by system.wait steps. Both must be provided — they are independent values.

Pending Commands

Chat bot commands are stored in pendingCommands (Cosmos DB). On each poll the Executor checks the buffer first. If a pending command exists, it is returned as-is (or interlaced with the AI sequence). This enables real-time operator overrides: "Show driver 42" → immediate camera switch, takes priority over the current AI-generated sequence.
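The buffer-first rule reduces to a simple precedence check on each poll; a sketch with invented names:

```typescript
interface Poll<T> {
  pending: T[]; // commands from pendingCommands, oldest first
  aiSequence: T | null;
}

// Operator commands always win: serve the buffer before falling back
// to the AI-generated sequence for this poll.
function nextSequence<T>(poll: Poll<T>): T | null {
  return poll.pending.length > 0 ? poll.pending[0] : poll.aiSequence;
}
```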


5. OBS Broadcast Topology — Two Feed Types

This is the most important OBS concept for sequence generation. The broadcast switches between two fundamentally different video sources. Understanding this distinction is required to generate valid sequences.

The Two Feed Types

Feed Type | OBS Scene | Controlled By | What the Viewer Sees
Driver Onboard | <DriverName>_Onboard (one per driver) | obs.switchScene | That driver's personal streaming view: cockpit capture, wheel cam, overlays, their screen
Race Director | Single "director" scene (e.g., Race_Director) | obs.switchScene + broadcast.showLiveCam | The iRacing broadcast camera output — TV1, TV2, Chase, Blimp, etc.

Driver Onboard Scenes

Each configured SimRaceCenter driver has their own OBS scene containing their personal gaming-streamer feed. This is captured directly from their rig and shows exactly what they see and feel:

  • Cockpit or wheel cam perspective
  • Their game overlays (HUD, mirrors, inputs)
  • Their face/reaction cam if configured
  • Full audio from their rig

Driver onboard scenes are an A-list source. They are the most personal and immersive feed available. Use them with higher frequency than the Race Director feed because they give viewers direct emotional connection with the driver. A viewer watching someone's onboard during a battle feels the same tension the driver feels.

Race Director Scene

The single Race Director scene routes the iRacing broadcast camera output to the stream. It is a composite feed showing whatever camera angle is set via broadcast.showLiveCam. It can show any car from any camera group (TV1, TV2, Chase, Blimp, Pit Lane, etc.).

The Race Director scene requires two intents working together:

  1. obs.switchScene → switches the broadcast output to the Race Director scene
  2. broadcast.showLiveCam → controls which car and camera angle appears on that scene

These are independent: switching to the Race Director scene while showLiveCam still points at an old car means the viewer sees the old car until showLiveCam is updated. Always issue both intents together when targeting a specific car on the Race Director feed.

Correct Usage Pattern

Battle between Driver A and Driver B (both are SimRaceCenter drivers):

obs.switchScene → "Driver_A_Onboard"     (show A's personal feed)
system.wait    → 10000ms
obs.switchScene → "Driver_B_Onboard"     (show B's personal feed)
system.wait    → 10000ms
obs.switchScene → "Race_Director"         (wide shot showing both cars)
broadcast.showLiveCam → carNum: "A", camGroup: "TV2"
system.wait    → 8000ms

Solo driver feature (single SimRaceCenter driver on track):

obs.switchScene → "Driver_A_Onboard"     (personal immersive feed)
system.wait    → 15000ms
obs.switchScene → "Race_Director"         (external perspective on same car)
broadcast.showLiveCam → carNum: "A", camGroup: "Chase"
system.wait    → 10000ms
obs.switchScene → "Driver_A_Onboard"     (back to onboard)
system.wait    → 10000ms

Field car coverage (car not a SimRaceCenter driver — no onboard scene available):

obs.switchScene → "Race_Director"
broadcast.showLiveCam → carNum: "42", camGroup: "TV1"
system.wait    → 12000ms

Scene Frequency Rule

Driver onboard scenes should make up the majority of time in any sequence involving SimRaceCenter drivers. A ratio of approximately 2:1 onboard-to-director is a reasonable default. The Race Director feed is most valuable for:

  • Establishing wide shots that show the gap between cars (TV2/TV3)
  • Blimp/scenic transitions between segments
  • Field cars that have no onboard scene
  • Moments when the driver is off the racing line or in the pits (onboard is static/boring)

Never switch to a driver's onboard scene while they are in the pits — the onboard view during a pit stop is a static overhead shot with minimal viewer interest. Use the Race Director's Pit Lane camera instead.

How Intents Map to Scene Control

Goal | Required Intents
Show a driver's personal stream | obs.switchScene (driver onboard scene name)
Show a specific car from a specific camera angle | obs.switchScene (Race Director scene) + broadcast.showLiveCam (carNum + camGroup)
Change iRacing camera angle without switching OBS source | broadcast.showLiveCam only (works silently if Race Director scene is not currently active)
Show an overlay | obs.switchScene (utility scene, e.g., Standings)

6. PortableSequence Format

Every broadcast action travels as a PortableSequence — the universal wire format. The Director client executes steps regardless of origin (AI, operator library, command buffer).

interface PortableSequence {
  id: string;
  name?: string;
  priority?: boolean;          // true = cancel current and execute immediately
  steps: SequenceStep[];
  variables?: SequenceVariable[];
  metadata?: {
    totalDurationMs?: number;
    generatedAt?: string;
    source?: 'ai-director' | 'command-buffer' | 'library';
    templateId?: string;
    templateName?: string;
  };
}
 
interface SequenceStep {
  id: string;                  // unique within sequence
  intent: string;              // domain.action
  payload: Record<string, unknown>;
  metadata?: {
    label?: string;
    timeout?: number;
    narrativePurpose?: 'establish' | 'build' | 'action' | 'reaction' | 'resolve';
  };
}

Intent Registry (Canonical)

Intent | Extension | Required Payload Fields
broadcast.showLiveCam | iRacing | carNum: string, camGroup: string|number
broadcast.replayEvent | iRacing | eventId?: string
obs.switchScene | OBS | sceneName: string
overlay.show | OBS | sourceName: string
overlay.hide | OBS | sourceName: string
communication.announce | Discord | text: string
communication.talkToChat | YouTube | message: string
system.wait | (built-in) | durationMs: number
system.log | (built-in) | message: string
system.executeSequence | (built-in) | sequenceId: string

system.wait is the only blocking step. All other intents are fire-and-forget — the Director dispatches the command and immediately advances to the next step. A sequence without system.wait steps fires all commands simultaneously; the viewer sees only the last camera change. Always insert system.wait after every broadcast.showLiveCam and obs.switchScene.
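A sketch of this dispatch loop, with `dispatch` standing in for whatever mechanism the Director uses to route intents to extensions (the helper names are assumptions):

```typescript
interface SequenceStep {
  intent: string;
  payload: Record<string, unknown>;
}

const sleep = (ms: number) => new Promise<void>((res) => setTimeout(res, ms));

// Only system.wait blocks; every other step is dispatched fire-and-forget
// and the loop advances immediately to the next step.
async function runSequence(
  steps: SequenceStep[],
  dispatch: (step: SequenceStep) => void,
): Promise<void> {
  for (const step of steps) {
    if (step.intent === "system.wait") {
      await sleep(Number(step.payload.durationMs ?? 0));
    } else {
      dispatch(step); // not awaited -- fire-and-forget
    }
  }
}
```

Without any `system.wait` steps, the loop above runs to completion in one synchronous burst, which is exactly the "all commands fire simultaneously" failure mode described above.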

Variables

Variable placeholders use ${varName} syntax in payload values.

Source | Resolved By | When
cloud | AI Executor | Before delivery to Director
context | Director client | At execution time (from live iRacing telemetry)
user | Operator | On demand (manual override)

Unknown intents are skipped, not fatal. Unresolved required variables cause the step to be skipped.
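A sketch of placeholder resolution that also reports unresolved names, so the caller can skip the step as described (the helper name is illustrative):

```typescript
// Substitutes ${varName} placeholders in string payload values and
// reports any names that could not be resolved.
function resolvePayload(
  payload: Record<string, unknown>,
  vars: Record<string, string>,
): { payload: Record<string, unknown>; unresolved: string[] } {
  const unresolved: string[] = [];
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(payload)) {
    if (typeof value === "string") {
      out[key] = value.replace(/\$\{(\w+)\}/g, (match, name) => {
        if (name in vars) return vars[name];
        unresolved.push(name); // caller should skip this step
        return match;
      });
    } else {
      out[key] = value; // non-string values pass through unchanged
    }
  }
  return { payload: out, unresolved };
}
```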

Priority Semantics

  • priority: false (default) — queued for next available slot
  • priority: true — cancel current sequence, execute immediately (used for incidents, operator overrides)

7. SequenceTemplate Format

Templates are parameterized blueprints stored in Cosmos DB. They are the Planner's output and the Executor's input.

interface SequenceTemplate {
  id: string;
  raceSessionId: string;
  name: string;
  applicability: string;         // prose or structured conditions
  priority: 'normal' | 'incident' | 'caution';
  durationRange: { min: number; max: number };  // milliseconds
  steps: SequenceStep[];         // with ${variable} placeholders
  variables: SequenceVariable[];
  source: 'ai-planner' | 'operator-library' | 'hybrid';
  ttl: 604800;                   // 7 days
}

Priority Levels

Priority | Meaning | Execution
incident | Race-defining moments (crash, final lap, red flag) | Interrupt current sequence
caution | Flag changes, safety car deployment | Show soon; don't queue behind long sequences
normal | Standard broadcast rotation | Queue normally

Minimum Valid Template

A template must have: name, applicability, priority, steps (at least one), and variables. Every step must have a unique id and a valid intent from the capability catalog. The Planner returns templates as a JSON array; the parseTemplates function validates and filters them. Invalid templates are dropped, not surfaced to the Executor.
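A sketch of that validate-and-drop behaviour (the real parseTemplates likely checks more; field names follow the SequenceTemplate interface, and the helper names are illustrative):

```typescript
// Structural subset of the SequenceTemplate checks described above.
interface CandidateTemplate {
  name?: string;
  applicability?: string;
  priority?: string;
  steps?: { id?: string; intent?: string }[];
  variables?: unknown[];
}

function isValidTemplate(t: CandidateTemplate, catalog: Set<string>): boolean {
  if (!t.name || !t.applicability || !t.priority || !t.variables) return false;
  if (!t.steps || t.steps.length === 0) return false;
  const ids = new Set<string>();
  for (const step of t.steps) {
    if (!step.id || ids.has(step.id)) return false; // step ids must be unique
    ids.add(step.id);
    if (!step.intent || !catalog.has(step.intent)) return false;
  }
  return true;
}

// Invalid templates are dropped silently, never surfaced to the Executor.
function filterTemplates(
  raw: CandidateTemplate[],
  catalog: Set<string>,
): CandidateTemplate[] {
  return raw.filter((t) => isValidTemplate(t, catalog));
}
```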


8. iRacing Domain Knowledge

Session Types

Type | Broadcast Priority | Coverage Style
Race | Primary | All template categories; full narrative arc
Qualifying | Secondary | Hot lap focus; solo driver features; no battles
Practice | Low | Solo driver and scenic; testing camera shots
Warmup | Very low | Minimal coverage
Cooldown | Very low | Podium/celebration shots only

Race Types

  • Sprint (15–45 min): Fast pacing, frequent cuts, every position battle matters
  • Endurance (1–24 hrs): Slow mid-race pacing, emphasis on pit strategy; intense at start/end
  • Heat Racing: Track storylines across heats for narrative continuity
  • Time Trial: One driver at a time; split time comparisons

Car Classes (multi-class priority)

  • Primarily cover the leading class unless another class has more dramatic action
  • Show class interactions when faster cars lap slower ones
  • Maintain awareness of each class lead battle simultaneously

Flag States and Required Broadcast Response

Flag | Broadcast Response | Priority
Green | Normal coverage — follow narrative (battles, leaders, strategy) | Normal
Yellow / Full Caution | Immediately show the incident → safety vehicles → field packing → pit stops during caution | Critical
Yellow / Local | Brief sector incident shot, then return to active racing elsewhere | High
Red | Show causing incident → field stopping → wide shots | Critical
White | Focus leaders and closest challenger; build intensity | High
Checkered | Winner crossing line → celebration → top finishers | Critical
Black | Brief shot of penalized driver | Normal
Meatball | Show damaged car (external camera if possible) | Normal–High

Caution rule: Hold on the incident for at least 10–15 seconds before cutting away.

Camera Groups

Group | Type | Best For | Typical Hold
TV1 | Trackside close | Individual car detail, battles at a corner | 3–8s
TV2 | Trackside medium | 2–3 cars together, overtakes, racing line | 5–12s
TV3 | Trackside wide | Pack racing, starts, restarts, field overview | 3–10s
Cockpit | In-car POV | Immersion, tense moments — use sparingly | 5–15s
Chase | Behind/above car | Following a driver, pit stops | 5–20s
Blimp | High overhead | Track layout, field spread, establishing shots | 3–8s
Scenic | Artistic fixed | Atmosphere, venue — very sparingly during racing | 3–6s
Pit Lane | Pit road fixed | Pit stops (stay for entire stop) | Full stop duration

Camera variety rules:

  • Never use the same camera group more than 3 times consecutively
  • Rotate through at least 4 different drivers every 2 minutes (multi-driver sessions)
  • Static cameras (TV1/TV2/TV3) to dynamic cameras (Cockpit/Chase) ratio: 3:1
  • Blimp/Scenic as transitions only, not primary coverage

Shot Pacing

Metric | Rule
Minimum hold | 3 seconds — never cut before the viewer processes the shot
Maximum static hold | 30 seconds before engagement drops
Battle coverage | 8–15s alternating between the two drivers
Leader spotlight | 15–25s before returning to other action

Sprint race pacing by phase:

  • Start: fast (2–3s cuts) → Early: medium (5–8s) → Mid: relaxed (8–15s) → Late: increasing (5–10s) → Final lap: fast (3–5s)

When to cut: New battle detected, incident, leader about to be overtaken, contender pitting, shot held >20s, nothing interesting in current shot.

When NOT to cut: Overtake in progress, car mid-corner in dramatic moment, active battle still close, hot qualifying lap, shot held <3s.

Telemetry Thresholds

Condition | Gap / Value | Action
Battle (ENGAGED) | < 1.0s | High priority — cover immediately
Battle (CLOSING) | < 2.0s and shrinking | Monitor — likely battle soon
Comfortable gap | > 3.0s | Lower priority
Breakaway | > 10.0s | Variety coverage only
Fastest lap set | Any reduction | Worth a mention/camera shot
Lap time degradation | Consistent increase | Anticipate pit stop or tyre issue
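The gap rows in this table can be read as an ordered classifier. A sketch (the 2–3s band is not specified above, so it falls through to a neutral default here):

```typescript
type GapClass = "ENGAGED" | "CLOSING" | "COMFORTABLE" | "BREAKAWAY" | "NEUTRAL";

// Ordered checks matching the thresholds above. `shrinking` means the gap
// decreased compared with the previous telemetry frame.
function classifyGap(gapSec: number, shrinking: boolean): GapClass {
  if (gapSec < 1.0) return "ENGAGED";              // cover immediately
  if (gapSec < 2.0 && shrinking) return "CLOSING"; // likely battle soon
  if (gapSec > 10.0) return "BREAKAWAY";           // variety coverage only
  if (gapSec > 3.0) return "COMFORTABLE";          // lower priority
  return "NEUTRAL";                                // band not covered by the table
}
```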

Standard Camera Sequences by Scenario

Battle approach: TV2 (both cars at corner entry) → TV1 (wheel-to-wheel) → Chase (attacking car if making a move) → TV2/TV3 (next corner result)

Leader spotlight: TV1 or TV2 (establish position) → Cockpit or Chase (personal moment) → TV3 wide (show gap to second)

Incident response: Nearest camera (live if possible) → hold 10–15s → show effect on other drivers → significant incident: show aftermath from multiple angles

Pit stop: Pit Lane camera (from pit entry) → hold entire stop → show car rejoining → check if position was maintained

Narrative Phases

Phase | Typical Content
Race start | Charge to turn 1: who got a good start, contact, early incidents
Opening laps | Field sorting: early battles, aggressive moves
Mid-race | Tyre management, fuel strategy, evolving battles — show the chess match
Pit windows | Who pits first, undercut/overcut, track position changes
Late race | Tyres degrading, desperation moves, gaps closing
Final lap | All focus on position-deciding battles
Post-race | Winner celebration, final results, key moment replays

9. Publisher Event Pipeline

Why Events, Not Raw Telemetry

The legacy Python prototype sent ~150 raw iRacing telemetry fields at 5Hz (~25–50 KB/s per rig). The AI used 8 of them. Events replace the firehose with structured, pre-analysed facts about what happened.

Eight Telemetry Fields Read (per rig, 5Hz)

CarIdxPosition, CarIdxOnPitRoad, CarIdxTrackSurface, CarIdxLastLapTime, CarIdxBestLapTime, CarIdxLapCompleted, CarIdxClassPosition, SessionFlags

RaceEvent Wire Format

interface RaceEvent {
  id: string;              // UUID v4 — idempotency key
  raceSessionId: string;   // Cosmos partition key
  type: RaceEventType;
  timestamp: number;       // Unix ms
  lap: number;             // Leader lap at time of event
  involvedCars: { carIdx: number; carNumber: string; driverName: string; position?: number }[];
  payload: Record<string, unknown>;  // event-specific data
  ttl: number;             // 7,776,000 (90 days)
}

Driver names are resolved to real-world booked names on the rig via IdentityOverride before transmission. The AI never sees CarIdx numbers — only driver names.
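An illustrative OVERTAKE event as a rig might emit it; every value below is invented for the example, and the shape follows the RaceEvent interface above:

```typescript
// Invented example values -- only the shape is canonical.
const overtakeEvent = {
  id: "6f1d2c3b-8a4e-4f5a-9b0c-1d2e3f4a5b6c", // UUID v4 idempotency key
  raceSessionId: "session-123",
  type: "OVERTAKE",
  timestamp: 1718000000000, // Unix ms
  lap: 14, // leader lap at time of event
  involvedCars: [
    { carIdx: 7, carNumber: "5", driverName: "A. Driver", position: 3 },
    { carIdx: 12, carNumber: "8", driverName: "B. Driver", position: 4 },
  ],
  payload: { corner: "Turn 3" }, // event-specific data; field invented
  ttl: 7_776_000, // 90 days
};
```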

Rig-Sourced Event Types

Type | Trigger | Broadcast Relevance
OVERTAKE | Position swap between frames, excluding pit cycles | High — immediate camera opportunity
BATTLE_STATE | Gap transitions: ENGAGED <1.0s · CLOSING <2.0s shrinking · BROKEN >2.0s | High — sustained narrative arc
PIT_ENTRY | CarIdxOnPitRoad false → true | Medium — strategy story
PIT_EXIT | CarIdxOnPitRoad true → false | Medium — rejoins, undercut completion
INCIDENT | Flag bitmask change + speed/position anomaly | High — safety and drama
LAP_COMPLETE | CarIdxLapCompleted increment + lap time | Low — timing reference
POSITION_CHANGE | Position change not caused by an overtake | Medium — leaderboard context
SECTOR_COMPLETE | Lap distance crosses sector boundary | Low — sector timing

Cloud-Synthesised Event Types (source: 'cloud')

These are written by the Race Control event synthesiser after correlating events from multiple rigs:

Type | Trigger
FOCUS_VS_FOCUS_BATTLE | ≥2 rigs focused on cars with gap <1.0s across 2+ frames
STINT_HANDOFF | DRIVER_SWAP events correlated across rigs
RIG_FAILOVER | A rig's heartbeat lapses; another rig covers the same car
UNDERCUT_DETECTED | PIT_ENTRY timing patterns suggest undercut attempt
IN_LAP_DECLARED | Lap-time degradation after PIT_EXIT matches an in-lap
FOCUS_GROUP_ON_TRACK | Deduplicated group of cars in focus across publisher rigs
SESSION_LEADER_CHANGE | Overall or class leader changes

How the Executor Uses Race Events

The Executor's AISnapshot now includes a recent raceEvents timeline from Cosmos DB. This shifts the AI from "what is true right now?" to "what story has been building over the last few laps?"

Key reasoning patterns:

  • BATTLE_STATE: ENGAGED persisting for multiple laps → better broadcast story than a momentary gap
  • OVERTAKE in last 30 seconds → strong trigger for a replay/highlight template
  • Back-to-back PIT_EXIT events for cars that were battling → potential undercut narrative
  • Car with no recent OVERTAKE/BATTLE events but improving LAP_COMPLETE times → building for a late charge ("sleeper" storyline)
  • FOCUS_VS_FOCUS_BATTLE (cloud-synthesised) → strong signal the broadcast should follow this battle

scan_recent_events Tool

The Executor can call this tool to query the raceEvents container during sequence generation:

scan_recent_events({
  sessionId: string,
  eventTypes: RaceEventType[],
  sinceMs: number,     // Unix ms timestamp
  limit: number        // max results
})

Use when: the leaderboard snapshot alone doesn't explain a car's current position, or when choosing between two equally-ranked templates where event recency would break the tie.


10. Template Quality and Editorial Intelligence

Context Inputs That Improve Template Quality

Race Director's Notes (Center.directorNotes) — free-text operator guidance injected into both Planner and Executor prompts. Describes:

  • Which OBS scenes to use in which situations
  • Preferred pacing style (slow/relaxed vs. fast/punchy)
  • Hardware-specific notes (e.g., stinger transition timing, overlay auto-hide duration)
  • Track-specific camera recommendations
  • "Signature shots" the operator always uses

Race Session Summary (RaceSession.summary) — narrative context for this specific broadcast. Describes:

  • The broadcast story (rivalry, esports championship, casual creator content)
  • Driver priorities (which cars matter most editorially)
  • Tone (competitive vs. entertainment-first)
  • Specific moments to prioritise (e.g., a specific corner, a debut driver)

When both fields are present, they override generic broadcast heuristics. Operator intent beats default rules.

Race Phase Detection

The Executor derives the current race phase from RaceContext:

CAUTION flag set           → 'caution'
lapsRemain ≤ 5 OR timeRemainSec ≤ 300  → 'closing'
leaderLap ≤ 3              → 'opening'
pitting.length > 0 AND pit window active → 'pit-cycle'
battles.length > 0         → 'action'
default                    → 'rhythm'

Templates designed for the current phase should score higher in selection.
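The decision order above is significant: the first matching condition wins. A direct transcription as a sketch (the context field names are assumptions loosely based on RaceContext):

```typescript
type Phase = "caution" | "closing" | "opening" | "pit-cycle" | "action" | "rhythm";

// Ordered phase detection -- mirrors the decision list above exactly.
function detectPhase(ctx: {
  flagCaution: boolean;
  lapsRemain: number;
  timeRemainSec: number;
  leaderLap: number;
  pittingCount: number;
  pitWindowActive: boolean;
  battleCount: number;
}): Phase {
  if (ctx.flagCaution) return "caution";
  if (ctx.lapsRemain <= 5 || ctx.timeRemainSec <= 300) return "closing";
  if (ctx.leaderLap <= 3) return "opening";
  if (ctx.pittingCount > 0 && ctx.pitWindowActive) return "pit-cycle";
  if (ctx.battleCount > 0) return "action";
  return "rhythm";
}
```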

Structural Template Quality Rules

  • Every template must have a narrative arc: establish → build → action → reaction → resolve
  • Sequences should open with a wide or medium establishing shot before close-up action
  • TTS communication.announce payloads must reference specific driver names and race context — not generic text like "Battle for position 5"
  • Per-step hold times should vary: establishing shots 10–15s, action cuts 2–3s, reaction shots 4–6s
  • Total sequence duration must stay within the template's durationRange
  • Camera variety: no more than 2 consecutive steps on the same camera group within one sequence

Common Errors to Avoid

Error | Cause | Prevention
All camera switches fire simultaneously | Missing system.wait | Always add system.wait after every camera/scene step
Template runs past durationRange | Uniform durationMs × many steps | Sum of all system.wait durationMs must be ≤ durationRange.max
Unknown scene name in obs.switchScene | UUID instead of display name | Use scene display names from DirectorCapabilities
Wrong driver identifier | carIdx instead of carNumber | Use carNumber string (e.g., "42"), not an index
Repeated template selection | Executor ignoring variety rule | Check lastExecutedTemplateId and exclude it
Battle template with no battle | Ignoring gap check | Pre-filter: only select battle templates when a battle is confirmed

11. Output Validation Checklist

Before returning a PortableSequence, verify:

  • All step IDs are unique within the sequence
  • All intents exist in the Director's capability catalog from the check-in
  • All ${variable} placeholders are resolved to concrete values
  • Every broadcast.showLiveCam or obs.switchScene is followed by system.wait
  • metadata.source is "ai-director"
  • metadata.templateId references the selected template
  • Total system.wait durations sum to the durationMs value in the response
  • No car numbers are CarIdx integers — must be string car numbers from the leaderboard
  • No OBS scene names are UUID strings — must be display names
  • Priority is false unless the situation explicitly requires an interrupt (incident, operator override)
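The structural items on this checklist can be verified mechanically before handing the sequence to the Director. A sketch under assumed names; semantic checks (real scene names, leaderboard car numbers, metadata fields) need session data and are omitted:

```typescript
interface Step {
  id: string;
  intent: string;
  payload: Record<string, unknown>;
}

// Mechanical checks from the checklist: unique ids, known intents,
// resolved variables, wait placement, and the wait-total invariant.
function validateSequence(
  steps: Step[],
  catalog: Set<string>,
  durationMs: number,
): string[] {
  const errors: string[] = [];
  const ids = new Set<string>();
  let waitTotal = 0;
  steps.forEach((step, i) => {
    if (ids.has(step.id)) errors.push(`duplicate step id: ${step.id}`);
    ids.add(step.id);
    if (!catalog.has(step.intent)) errors.push(`unknown intent: ${step.intent}`);
    if (JSON.stringify(step.payload).includes("${")) {
      errors.push(`unresolved variable in step ${step.id}`);
    }
    if (step.intent === "system.wait") {
      waitTotal += Number(step.payload.durationMs ?? 0);
    }
    if (
      (step.intent === "broadcast.showLiveCam" || step.intent === "obs.switchScene") &&
      steps[i + 1]?.intent !== "system.wait"
    ) {
      errors.push(`step ${step.id} not followed by system.wait`);
    }
  });
  if (waitTotal !== durationMs) {
    errors.push(`system.wait total ${waitTotal} != durationMs ${durationMs}`);
  }
  return errors;
}
```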