On April 17, 2026, HeyGen released HyperFrames, an open-source tool that converts HTML and CSS animations into MP4 video files, using headless Puppeteer for frame capture and FFmpeg for encoding. Output is deterministic: the same HTML file generates the same video every time. The tool also ships with a flag designed to let AI coding agents generate video programmatically.

For the broader landscape, see our complete guide to AI video generation in 2026.

What Happened

HeyGen published HyperFrames on GitHub under the Apache 2.0 license. The project reached over 6,600 stars within days of launch. The tool takes an HTML file with data-* attributes as input and exports an MP4, with no API keys or cloud services required.

Supported animation types include GSAP 3, Lottie JSON, CSS transitions, Three.js scenes, and WebGL shader transitions. Any animation that runs in a browser can be captured frame-by-frame and assembled into video. The core CLI is a single command: npx hyperframes render --output output.mp4.
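As a sketch of what an input file might look like: the article states only that input is an HTML file with data-* attributes, so the specific attribute names below (data-duration, data-fps) are hypothetical placeholders, not the documented schema.

```html
<!-- Illustrative input only: data-duration and data-fps are assumed
     attribute names; check the HyperFrames docs for the real schema. -->
<div data-duration="2" data-fps="30">
  <h1 style="animation: slide-in 2s ease-out;">Hello</h1>
  <style>
    @keyframes slide-in {
      from { transform: translateX(-100%); opacity: 0; }
      to   { transform: translateX(0);     opacity: 1; }
    }
  </style>
</div>
```

Rendering would then be the documented one-liner, npx hyperframes render --output output.mp4.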

The release includes direct AI agent integration. HyperFrames ships with a --non-interactive flag for automated pipelines and is listed as a supported skill for Claude Code, Cursor, Gemini CLI, and OpenAI Codex. Agents install the skill via npx skills add heygen-com/hyperframes and can go from a text prompt to a finished MP4 without manual steps.
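The two commands above can be chained into a hands-off pipeline. Both commands are quoted from the release; the surrounding script shape is an illustrative sketch, not official documentation.

```shell
# One-time setup: make the skill available to the agent (documented command)
npx skills add heygen-com/hyperframes

# In an automated run: render with no prompts; --non-interactive is the
# documented flag for scripted and agent-driven pipelines
npx hyperframes render --non-interactive --output output.mp4
```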

Why It Matters

The defining property is deterministic rendering. Screen recording captures whatever the display shows at a given moment, which introduces dropped frames, inconsistent timing, and GPU-dependent results. HyperFrames renders frame-by-frame at the specified frame rate, so output is identical across machines and runs.
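The determinism claim follows from fixed time-stepping: rather than sampling a live clock, each frame is rendered at an exact timestamp derived from the frame index and frame rate. A minimal sketch of that arithmetic (not HyperFrames' actual code):

```javascript
// Fixed time-stepping: frame i is always rendered at i / fps seconds,
// so the capture schedule never depends on wall-clock time or GPU load.
function frameTimestamps(fps, durationSeconds) {
  const count = Math.round(fps * durationSeconds);
  return Array.from({ length: count }, (_, i) => i / fps);
}

// A 2-second clip at 30 fps always yields the same 60 timestamps.
const ts = frameTimestamps(30, 2);
console.log(ts.length);     // 60
console.log(ts[0], ts[30]); // 0 1
```

Screen recording, by contrast, samples whenever the display happens to refresh, which is exactly where dropped frames and machine-to-machine variation creep in.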

For creators building animated graphics, lower thirds, or motion-designed title cards in HTML, the tool removes the screen recording step entirely. Everything stays in version control, runs locally, and adds no recurring cost.

The agent integration is the forward-looking part. Claude Code and Cursor can now write HTML animations and render them directly to MP4 in a single session. That is a new class of capability: AI agents producing polished animated output rather than static assets.

Key Details

  • License: Apache 2.0, fully open source
  • Requirements: Node.js 22+, FFmpeg
  • Supported animations: GSAP 3, Lottie, CSS, Three.js, WebGL shaders
  • Output: MP4 at any resolution and frame rate
  • AI agent flag: --non-interactive for scripted pipelines
  • Agent skill install: npx skills add heygen-com/hyperframes
  • GitHub: github.com/heygen-com/hyperframes
  • Docs: hyperframes.heygen.com

What to Do Next

The fastest test is npx hyperframes render on an existing HTML animation. The HyperFrames documentation includes starter templates for GSAP and Lottie that work without any additional configuration.

For teams already using AI coding agents, add HyperFrames as a skill in a Claude Code or Cursor session and prompt for a short animated segment. The full pipeline from text prompt to MP4 runs in a single agent session with no manual intervention.