Tools and techniques for moving performance capture from script to
engine. Faster, cheaper, more consistent. Four years building a
proprietary pipeline at Deck Nine, followed by independent R&D
extending the same thread into markerless capture and agentic
authoring.
Deck Nine Pipeline
SceneCalculatorPro | Script Scoping and Auditing
Motion Capture Director · Tool Development · 2025
SceneCalculatorPro parsed .xml exports from Deck Nine’s proprietary
script editor and compared them against production scope, reference
video shot on stage, and in-game cinematic minutes once scenes were
implemented.
Midway through Life is Strange: Reunion, SceneCalculatorPro
identified a 35% project overage and became an essential tool in
bringing the game back into scope and budget.
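The core comparison can be sketched in a few lines. This is a hypothetical reconstruction, not Deck Nine's tool: the XML schema, tag names, and `estimated_minutes` attribute are illustrative assumptions.

```python
# Hypothetical sketch of the scoping audit: parse per-scene duration
# estimates from a script-editor XML export and report overage against
# the scoped budget. Schema and field names are assumptions.
import xml.etree.ElementTree as ET

def audit_scope(script_xml: str, scoped_minutes: float) -> dict:
    """Sum estimated minutes across scenes and compare to scope."""
    root = ET.fromstring(script_xml)
    estimated = sum(float(s.get("estimated_minutes", 0))
                    for s in root.iter("scene"))
    overage = (estimated - scoped_minutes) / scoped_minutes
    return {"estimated": estimated,
            "scoped": scoped_minutes,
            "overage_pct": round(overage * 100, 1)}

sample = """
<script>
  <scene id="E1S01" estimated_minutes="4.5"/>
  <scene id="E1S02" estimated_minutes="6.0"/>
  <scene id="E1S03" estimated_minutes="3.0"/>
</script>
"""
report = audit_scope(sample, scoped_minutes=10.0)
# 13.5 estimated minutes against 10.0 scoped → 35.0% overage
```

The real tool extends the same comparison to stage reference footage and implemented cinematic minutes, so the overage number tracks all three stages of a scene's life.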
Backfill | Automated Reference Video Assembly and Delivery
Motion Capture Director · Tool Development · 2025
Backfill is a tool suite that automates the assembly, editing,
subtitling, and notation of reference videos from the mocap stage. It
leverages the DaVinci Resolve and Confluence APIs to deliver witness
video and performance context to downstream teams as a searchable,
indexed Confluence library, turning a manual, day-long handoff
into a push-button pipeline step.
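The indexing step at the end of that pipeline reduces to a data transformation. A minimal sketch, with invented clip metadata and table markup standing in for the real Resolve and Confluence API calls:

```python
# Hypothetical sketch of Backfill's final indexing step: turn witness-video
# metadata into a searchable table body for a library page. Field names and
# markup are illustrative assumptions, not the actual Confluence payload.
def build_index_rows(clips):
    """Sort clips by scene and take, one markup row per witness video."""
    rows = []
    for clip in sorted(clips, key=lambda c: (c["scene"], c["take"])):
        rows.append("| {scene} | {take} | {actor} | {notes} |".format(**clip))
    return "\n".join(rows)

clips = [
    {"scene": "E1S02", "take": 3, "actor": "A. Chen", "notes": "circled take"},
    {"scene": "E1S01", "take": 1, "actor": "A. Chen", "notes": "blocking pass"},
]
page_body = "| Scene | Take | Actor | Notes |\n" + build_index_rows(clips)
```

In the real suite the rows carry links to the rendered, subtitled clips, which is what makes the library searchable by scene, actor, or note text.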
Narrative Engine
Scripty | branching narrative editor | state management engine
Scripty is a branching narrative editor built for
interactive stories that accumulate hundreds of player choices without
the exponential branch explosion of traditional tools. Each decision
writes to a running state portrait of the player, letting scenes react
to patterns of behavior instead of single flags. A CodeMirror
screenplay surface sits alongside a live state inspector, giving
writers and directors one authoring environment for drafting,
auditing, and testing narrative state across the full runtime.
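The state portrait idea can be shown with a toy counter-based model. The trait names and thresholds below are hypothetical, not Scripty's actual schema:

```python
# Sketch of a running "state portrait": each choice writes traits into an
# accumulating profile, and scenes branch on patterns of behavior rather
# than single flags. Trait names are illustrative assumptions.
from collections import Counter

portrait = Counter()

def record_choice(traits):
    """Each decision increments one or more trait counters."""
    portrait.update(traits)

def matches(pattern):
    """A scene condition over accumulated behavior, not a one-off flag."""
    return all(portrait[trait] >= n for trait, n in pattern.items())

record_choice({"empathy": 1})
record_choice({"empathy": 1, "risk": 1})
record_choice({"risk": 1})

matches({"empathy": 2})             # True: the player leans empathetic
matches({"empathy": 2, "risk": 3})  # False: not consistently reckless
```

Because conditions read the portrait instead of individual flags, adding a new choice enriches existing scene conditions without multiplying branches.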
A dedicated wire-planning and whiteboard layer turns the same document
into a visual workspace: scenes become drag-and-drop cards, choice
dependencies render as explicit wires between them, and story beats can
be rearranged spatially before a single line of dialogue is touched.
The two surfaces are bi-directional. Build the wire plan first and let
it scaffold the script, or start writing and generate the wire plan
from what you’ve already drafted. Planning and writing stay in
the same tool, anchored to the same underlying state.
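Generating the wire plan from a drafted script is essentially an edge extraction over choice targets. A sketch under an assumed data shape (scene and choice fields here are hypothetical):

```python
# Hypothetical sketch of deriving a wire plan from a drafted script: each
# scene's choices name the scenes they lead to, and the wire plan is the
# edge list between scene cards. Data shape is an illustrative assumption.
def wire_plan(scenes):
    """Return (card, card) edges from each scene's choice targets."""
    edges = []
    for scene in scenes:
        for choice in scene.get("choices", []):
            edges.append((scene["id"], choice["goto"]))
    return edges

script = [
    {"id": "diner", "choices": [{"text": "leave", "goto": "parking_lot"},
                                {"text": "stay", "goto": "diner_night"}]},
    {"id": "parking_lot", "choices": [{"text": "drive", "goto": "highway"}]},
]
wire_plan(script)
# [('diner', 'parking_lot'), ('diner', 'diner_night'), ('parking_lot', 'highway')]
```

Going the other direction, scaffolding a script from a wire plan, is the inverse: each edge becomes a stub choice in the source card's scene.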
bodypipe | markerless body, face, and hand capture
bodypipe is a markerless pipeline that extracts body,
face, and hand data from video: a machine-learning-driven
replacement for the Vicon and Faceware systems of old. It runs on
consumer GPUs, with a Qt GUI, a real-time 3D preview with Live Link
connections, and BVH/FBX export.
facepipe | tracking refinement and regression model training
facepipe takes raw MediaPipe face tracking and
refines it into performance-ready ARKit blendshape data. Actor-specific
profiles capture corrections through Cubase-style automation handles,
letting a director apply, undo, and reshape tracking curves directly
on the timeline. Those corrections feed regression models trained
per-actor, so the system sharpens to the performer the more you work
with them. Live Link output streams straight into Unreal.
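The per-actor refinement idea reduces, in its simplest form, to fitting a correction from raw tracker values to director-corrected ones. A toy sketch in pure Python, with a single linear channel standing in for facepipe's richer models (channel names and data are illustrative):

```python
# Toy sketch of per-actor refinement: for one blendshape channel, fit a
# linear correction y = a*x + b from raw tracker values to the director's
# corrected curve, then apply it to new frames. Real models are richer.
def fit_channel(raw, corrected):
    """Closed-form least squares on one blendshape channel."""
    n = len(raw)
    mx, my = sum(raw) / n, sum(corrected) / n
    a = sum((x - mx) * (y - my) for x, y in zip(raw, corrected)) / \
        sum((x - mx) ** 2 for x in raw)
    return a, my - a * mx

def apply_channel(a, b, raw):
    return [a * x + b for x in raw]

# The director's automation handles produced a corrected "jawOpen" curve;
# the fitted model now sharpens future takes from the same actor.
a, b = fit_channel(raw=[0.1, 0.4, 0.8], corrected=[0.15, 0.5, 0.95])
refined = apply_channel(a, b, [0.2, 0.6])
```

Every correction a director makes adds a training pair, which is why the system converges on a performer the more sessions it sees.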
sandpipe-body | inertia and physics from monocular video
sandpipe is a general-purpose pixel-physics sandbox:
elemental particles, each with unique physics, interacting with a
spring system and boundary collisions, all running on the GPU.
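A single particle of that simulation can be sketched CPU-side. The constants here are illustrative, and the real sandbox runs many particle types in parallel on the GPU:

```python
# Minimal single-particle sketch of the sandpipe loop: a damped spring
# toward a rest height, gravity, and a floor collision, integrated
# explicitly per frame. Constants are illustrative assumptions.
def step(y, vy, rest=1.0, k=40.0, damping=0.9, gravity=-9.8, dt=1 / 60):
    """Spring toward rest, damp velocity, bounce at the boundary."""
    force = k * (rest - y) + gravity
    vy = (vy + force * dt) * damping
    y += vy * dt
    if y < 0.0:  # boundary collision: reflect with energy loss
        y, vy = 0.0, -vy * 0.5
    return y, vy

y, vy = 2.0, 0.0
for _ in range(300):
    y, vy = step(y, vy)
# after ~5 simulated seconds, settles near the gravity-offset rest height
```

Swapping the spring and collision response per particle type is what makes the elements behave differently while sharing one integration loop.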
sandpipe-body is the pipeline that extracts physics
cues from monocular video. The research roadmap folds sandpipe-body
back into bodypipe’s skeletal data as a physical prior. The
result will be mocap with weight and grounding baked in, recovered
from a single camera.
Agentic Network
The pipeline runs on an interconnected homelab of heterogeneous agents:
an orchestration model directing a cluster of specialized local LLMs,
all of them reading from and writing to a single shared memory layer.
Tailscale holds the nodes together as a private mesh, and a central
dispatcher routes work to whichever model is best suited (long-context
reasoning, code, vision, embeddings) without ever leaving the local
network.
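The dispatcher's routing rule is, at heart, a capability lookup. A sketch with invented node names and capabilities (the actual homelab layout is not shown here):

```python
# Hypothetical sketch of capability-based routing: classify a task and
# hand it to the node whose model is best suited. Node names and the
# capability map are illustrative assumptions.
NODES = {
    "long_context": "node-a/llm-128k",
    "code":         "node-b/coder-model",
    "vision":       "node-c/vlm",
    "embeddings":   "node-d/embedder",
}

def route(task):
    """Pick a node by capability; fall back to the long-context reasoner."""
    return NODES.get(task["kind"], NODES["long_context"])

route({"kind": "code", "prompt": "refactor the exporter"})
# → 'node-b/coder-model'
```

Keeping the map on the dispatcher means models can be swapped per node without touching the agents that submit work.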
A unified memory service gives every agent the same persistent context:
semantic memories that consolidate over time, heat-decayed chat memory
that cools when topics go quiet, and raw transcripts of every session
ingested automatically. Inbound traffic flows through a Telegram bridge
that carries conversations from phone to orchestrator. Outbound
messages are council-gated before anything leaves the network.
Autonomous wake cycles run on cron, checking state and picking up work
while I’m away.
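The heat-decayed chat memory described above can be sketched with simple exponential cooling. The half-life and scoring here are illustrative assumptions, not the service's actual math:

```python
# Sketch of heat-decayed chat memory: a topic's heat decays exponentially
# with time since it was last touched, and is reheated on access.
# Half-life and boost values are illustrative assumptions.
import math

HALF_LIFE_H = 24.0  # hours for heat to halve when a topic goes quiet

def heat(last_heat, hours_since_touch):
    return last_heat * math.exp(-math.log(2) * hours_since_touch / HALF_LIFE_H)

def touch(last_heat, hours_since_touch, boost=1.0):
    """Reheating on access: decayed heat plus a fresh boost."""
    return heat(last_heat, hours_since_touch) + boost

heat(1.0, 24.0)  # → 0.5: a day of silence halves a topic's heat
heat(1.0, 48.0)  # → 0.25
```

Ranking retrieval by this score is what lets active topics dominate context while cold ones fade without being deleted.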