Substrate extracts a paired, time-indexed representation of human–LLM dialogue: the user's evolving problem formulation, the model's inferred formulation, and the moments where they silently diverge.
Two typed problem graphs evolve in lockstep — one per side of the dyad.
Watch nodes appear, get refined, or supersede each other as the conversation unfolds.
Grounding acts, scope drift, ignored constraints, premature commitment — all surfaced.
Extraction is intentionally rough — the contribution is the representation, not the F1 score. Once an externalised, inspectable dual representation exists, the moments that were previously invisible become legible.
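Once both formulations are externalised, the divergences reduce to comparisons between them. A toy sketch, assuming each side's active formulation can be flattened to a set of comparable statements (the statements and the two category names are invented for illustration):

```python
# Hypothetical flattened formulations -- one set per side of the dyad.
user_formulation = {
    "migrate the service to Postgres",
    "keep downtime under five minutes",  # a stated constraint
}
model_formulation = {
    "migrate the service to Postgres",
    "rewrite the ORM layer",             # never requested
}

# A constraint the user stated that the model's inferred formulation
# never picked up reads as an ignored constraint; a goal the model
# holds that the user never stated reads as scope drift.
ignored_constraints = user_formulation - model_formulation
scope_drift = model_formulation - user_formulation
```

The point of the representation is exactly this: both differences were silently present in the dialogue, but only become checkable once the two sides exist as inspectable objects.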