OpenMontage website-to-hyperframes

Install
source · Clone the upstream repo
git clone https://github.com/calesthio/OpenMontage
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/calesthio/OpenMontage "$T" && mkdir -p ~/.claude/skills && cp -r "$T/.agents/skills/website-to-hyperframes" ~/.claude/skills/calesthio-openmontage-website-to-hyperframes && rm -rf "$T"
manifest: .agents/skills/website-to-hyperframes/SKILL.md
Source content

Website to HyperFrames

Capture a website, then produce a professional video from it.

Users say things like:

  • "Capture https://... and make me a 25-second product launch video"
  • "Turn this website into a 15-second social ad for Instagram"
  • "Create a 30-second product tour from https://..."

The workflow has 7 steps. Each produces an artifact that gates the next.


Step 1: Capture & Understand

Read: references/step-1-capture.md

Run the capture, read the extracted data, and build a working summary using the write-down-and-forget method.

Gate: Print your site summary (name, top colors, fonts, key assets, one-sentence vibe).
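As a sketch of what the Step 1 gate output might look like, the working summary can be written down as a small record before moving on. All field values below are hypothetical placeholders, not part of the skill:

```python
# Hypothetical site summary record; the field names follow the Step 1 gate
# (name, top colors, fonts, key assets, one-sentence vibe). Values are
# placeholders, not real capture output.
site_summary = {
    "name": "Acme Cloud",                    # placeholder site name
    "top_colors": ["#0A84FF", "#111827"],    # placeholder hex values
    "fonts": ["Inter", "JetBrains Mono"],    # placeholder font stack
    "key_assets": ["logo.svg", "hero.png"],  # placeholder captured assets
    "vibe": "Confident, minimal, developer-first.",
}

# Print the summary to satisfy the gate.
for field, value in site_summary.items():
    print(f"{field}: {value}")
```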


Step 2: Write DESIGN.md

Read: references/step-2-design.md

Write a simple brand reference for the captured website. 6 sections, ~90 lines. This is a cheat sheet, not the creative plan — that comes in Step 4.

Gate: DESIGN.md exists in the project directory.


Step 3: Write SCRIPT

Read: references/step-3-script.md

Write the narration script. The story backbone. Scene durations come from the narration, not from guessing.

Gate: SCRIPT.md exists in the project directory.
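Since scene durations come from the narration, a rough sanity check before Step 5 is to estimate each beat's length from its word count. The ~150 words-per-minute pace is an assumed average and the beat lines are placeholders; authoritative timings come from the transcript in Step 5:

```python
# Rough per-beat duration estimate from word count, assuming an average
# narration pace of ~150 words per minute (an assumption; real timings
# come from Step 5's word-level transcript).
WORDS_PER_MINUTE = 150

def estimate_seconds(narration: str) -> float:
    words = len(narration.split())
    return round(words / WORDS_PER_MINUTE * 60, 1)

# Placeholder beats, not from any real script.
beats = {
    "hook": "Meet the fastest way to ship your docs.",
    "feature": "Edit in the browser, preview instantly, publish in one click.",
    "cta": "Start free today.",
}

total = 0.0
for name, line in beats.items():
    secs = estimate_seconds(line)
    total += secs
    print(f"{name}: {secs}s")
print(f"total: {round(total, 1)}s")
```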


Step 4: Write STORYBOARD

Read: references/step-4-storyboard.md

Write per-beat creative direction: mood, camera, animations, transitions, assets, depth layers, SFX. This is the creative north star — the document the engineer follows to build each composition.

Gate: STORYBOARD.md exists with beat-by-beat direction and an asset audit table.


Step 5: Generate VO + Map Timing

Read: references/step-5-vo.md

Generate TTS audio, transcribe for word-level timestamps, and map timestamps to beats. Update STORYBOARD.md with real durations.

Gate: narration.wav (or .mp3) + transcript.json exist, and beat timings in STORYBOARD.md are updated.
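The timestamp-to-beat mapping can be sketched as follows. The transcript shape shown here (a `words` list with per-word `start`/`end` seconds, as Whisper-style transcribers emit) is an assumption; the actual format of transcript.json depends on the transcription tool used:

```python
# Assumed transcript shape (Whisper-style word-level timestamps):
#   {"words": [{"word": "Meet", "start": 0.0, "end": 0.28}, ...]}
# The real transcript.json schema depends on the transcriber.

def beat_timings(words, beat_word_counts):
    """Split the word list into beats and return (start, end) per beat."""
    timings, i = [], 0
    for count in beat_word_counts:
        chunk = words[i : i + count]
        timings.append((chunk[0]["start"], chunk[-1]["end"]))
        i += count
    return timings

# Placeholder words; a real run would load them from transcript.json.
words = [
    {"word": "Meet", "start": 0.0, "end": 0.3},
    {"word": "Acme.", "start": 0.3, "end": 0.8},
    {"word": "Ship", "start": 1.2, "end": 1.5},
    {"word": "faster.", "start": 1.5, "end": 2.1},
]
# Two beats of two words each.
print(beat_timings(words, [2, 2]))  # → [(0.0, 0.8), (1.2, 2.1)]
```

The resulting (start, end) pairs are what Step 5 writes back into STORYBOARD.md as real beat durations.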


Step 6: Build Compositions

Read: the /hyperframes skill (invoke it — every rule matters), then references/step-6-build.md

Build each composition following the storyboard. After each one: self-review for layout, asset placement, and animation quality.

Gate: Every composition has been self-reviewed. No overlapping elements, no misplaced assets, no static images without motion.


Step 7: Validate & Deliver

Read: references/step-7-validate.md

Lint, validate, preview. Create a HANDOFF.md for multi-session continuity.

Gate: npx hyperframes lint and npx hyperframes validate pass with zero errors.


Quick Reference

Video Types

Type                   Duration  Beats  Narration
Social ad (IG/TikTok)  10-15s    3-4    Optional hook sentence
Product demo           30-60s    5-8    Full narration
Feature announcement   15-30s    3-5    Full narration
Brand reel             20-45s    4-6    Optional, music focus
Launch teaser          10-20s    2-4    Minimal, high energy
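The table above can also serve as a lookup when a user names a duration but not a video type. A minimal sketch, with durations in seconds taken from the table (the snake_case keys are this example's own shorthand):

```python
# Video-type table as a lookup; ranges are from the Quick Reference table.
# Key names are this sketch's shorthand, not identifiers from the skill.
VIDEO_TYPES = {
    "social_ad": {"duration": (10, 15), "beats": (3, 4)},
    "product_demo": {"duration": (30, 60), "beats": (5, 8)},
    "feature_announcement": {"duration": (15, 30), "beats": (3, 5)},
    "brand_reel": {"duration": (20, 45), "beats": (4, 6)},
    "launch_teaser": {"duration": (10, 20), "beats": (2, 4)},
}

def types_for_duration(seconds):
    """Return the video types whose duration range covers `seconds`."""
    return [
        name
        for name, spec in VIDEO_TYPES.items()
        if spec["duration"][0] <= seconds <= spec["duration"][1]
    ]

print(types_for_duration(25))  # → ['feature_announcement', 'brand_reel']
```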

Format

  • Landscape: 1920x1080 (default)
  • Portrait: 1080x1920 (Instagram Stories, TikTok)
  • Square: 1080x1080 (Instagram feed)
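A small lookup keeps these resolutions in one place when wiring up a render. A sketch under the assumption that "landscape" is the default, as stated above (the function name is hypothetical; the pixel values are from the list):

```python
# Output resolutions from the Format list, as (width, height) in pixels.
FORMATS = {
    "landscape": (1920, 1080),  # default
    "portrait": (1080, 1920),   # Instagram Stories, TikTok
    "square": (1080, 1080),     # Instagram feed
}

def resolution(fmt: str = "landscape") -> tuple[int, int]:
    """Look up the render resolution for a format name."""
    return FORMATS[fmt]

print(resolution("portrait"))  # → (1080, 1920)
```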

Reference Files

File                  When to read
step-1-capture.md     Step 1 — reading captured data
step-2-design.md      Step 2 — writing DESIGN.md
step-3-script.md      Step 3 — writing the narration script
step-4-storyboard.md  Step 4 — per-beat creative direction
step-5-vo.md          Step 5 — TTS, transcription, timing
step-6-build.md       Step 6 — building compositions with self-review
step-7-validate.md    Step 7 — lint, validate, preview, handoff
techniques.md         Steps 4 & 6 — 10 visual techniques with code patterns (SVG drawing, Canvas 2D, 3D, typography, Lottie, video, typing, variable fonts, MotionPath, transitions)