Hyperframes can render to HDR10 MP4 (H.265 10-bit, BT.2020) when your composition references HDR video or HDR still images. HDR is auto-detected by default from your media sources and falls back to SDR when none are present.
By default, Hyperframes probes your media and enables HDR only when HDR sources are present. Use
--hdr to force HDR even without HDR sources, or --sdr to force SDR even when HDR sources are present.
Quickstart
Add an HDR source to your composition
Hyperframes detects HDR from the source’s color space metadata. The most reliable HDR sources are:
- HDR video tagged BT.2020 with PQ (smpte2084) or HLG (arib-std-b67) transfer
- HDR still images as 16-bit PNG with BT.2020 PQ encoding
Render normally
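A hypothetical invocation sketch (the hyperframes binary name and bare render usage are assumptions; --format, --hdr, and --sdr are the documented flags):

```shell
# Auto-detect (default): renders HDR only if an HDR source is present
hyperframes render --format mp4

# Force HDR even when no HDR sources are present
hyperframes render --format mp4 --hdr

# Force SDR even when HDR sources are present
hyperframes render --format mp4 --sdr
```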
Render with --format mp4. If Hyperframes detects HDR sources, it renders HDR automatically. If you pass --format mov or --format webm instead, Hyperframes logs a warning and falls back to SDR.
Verify the output is HDR
Use ffprobe to confirm the encoded stream carries HDR color tagging and HDR10 metadata. See Verifying HDR output for what to look for.
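For example (standard ffprobe flags; output.mp4 is a placeholder for your render):

```shell
ffprobe -v error -select_streams v:0 \
  -show_entries stream=pix_fmt,color_primaries,color_transfer,color_space \
  -of default=noprint_wrappers=1 output.mp4
# An HDR10 render should report yuv420p10le, bt2020, smpte2084, bt2020nc
```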
How HDR Mode Works
During render, the producer:
Probes every video and image source
Runs ffprobe on each <video> and <img> source to read its color space (primaries, transfer function, matrix). This probe drives the default auto-detect behavior and is skipped only when you explicitly force SDR with --sdr.
Picks the dominant HDR transfer
If any source uses PQ (smpte2084), the output uses PQ. Otherwise, if any source uses HLG (arib-std-b67), the output uses HLG. If no HDR sources are found, the render stays SDR.
Encodes to H.265 10-bit BT.2020
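That precedence rule can be sketched in shell (pick_transfer is an illustrative helper, not part of Hyperframes):

```shell
# Given the color_transfer values probed from all sources, pick the
# output transfer: PQ beats HLG; neither present means SDR.
pick_transfer() {
  case " $* " in
    *" smpte2084 "*)    echo pq  ;;  # any PQ source wins
    *" arib-std-b67 "*) echo hlg ;;  # else any HLG source
    *)                  echo sdr ;;  # else stay SDR
  esac
}

pick_transfer bt709 arib-std-b67 smpte2084   # prints "pq"
```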
The video encoder switches to libx265 with -pix_fmt yuv420p10le, color tagging colorprim=bt2020:transfer=<smpte2084|arib-std-b67>:colormatrix=bt2020nc, and HDR10 static metadata (master-display and max-cll). Without that metadata, players (QuickTime, YouTube, HDR TVs) tone-map the stream as if it were SDR BT.2020 — which looks wrong.
Composites HDR sources natively, converts SDR overlays
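For reference, a roughly equivalent hand-rolled FFmpeg command (the input name and the mastering-display / MaxCLL values are illustrative; Hyperframes supplies its own):

```shell
ffmpeg -i frames.mov -c:v libx265 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:\
master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):\
max-cll=1000,400" \
  hdr10.mp4
```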
HDR videos and images are extracted as 16-bit linear-light pixels via FFmpeg, kept out of the DOM screenshot, and composited server-side at full bit depth. SDR DOM overlays (text, shapes, UI from your HTML) are converted from sRGB to BT.2020 before being layered on top, so colors do not shift.
Source Media Requirements
HDR video
Hyperframes recognizes HDR video from its ffprobe color space metadata:
| Indicator | Recognized as HDR |
|---|---|
| color_primaries contains bt2020 | Yes |
| color_space contains bt2020 | Yes |
| color_transfer = smpte2084 (PQ) | Yes — PQ |
| color_transfer = arib-std-b67 (HLG) | Yes — HLG |
| All else (e.g. bt709, smpte170m) | No — treated as SDR |
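To check what ffprobe reports for one of your sources (input.mp4 is a placeholder):

```shell
ffprobe -v error -select_streams v:0 \
  -show_entries stream=color_primaries,color_transfer,color_space \
  -of default=noprint_wrappers=1 input.mp4
```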
HDR still images
Hyperframes supports HDR still images delivered as 16-bit PNGs tagged with BT.2020 primaries and PQ transfer. Drop them into the composition as a normal <img>:
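A minimal sketch (hero-hdr.png is an illustrative filename for a 16-bit BT.2020 PQ PNG):

```html
<!-- 16-bit PNG exported with BT.2020 primaries and PQ transfer -->
<img src="hero-hdr.png" />
```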
HDR <img> decoding is limited to 16-bit PNG. JPEG, WebP, AVIF, and APNG are not recognized as HDR sources — they load through the normal SDR DOM path. For HDR motion, use a <video> element.
SDR sources mixed with HDR
You can freely mix SDR and HDR media in the same composition:
- SDR videos stay in the DOM screenshot path and get the sRGB → BT.2020 conversion described above
- HDR videos are extracted natively at 16-bit and composited underneath the SDR DOM layer
- SDR images and DOM elements (text, shapes, gradients, GSAP animations) are converted from sRGB to BT.2020
Output Format Requirements
| Output format | HDR supported |
|---|---|
| mp4 | Yes — H.265 10-bit BT.2020, HDR10 metadata |
| mov | No — falls back to SDR |
| webm | No — falls back to SDR |
If you pass --format mov or --format webm, Hyperframes logs a message and produces the equivalent SDR render. There is no error — the render still completes — so check the logs (or your verification step) to confirm you got HDR.
Verifying HDR Output
Use ffprobe to confirm both the color tagging and the HDR10 static metadata are present:
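Two standard ffprobe checks (output.mp4 is a placeholder for your render):

```shell
# 1) Color tagging on the video stream
ffprobe -v error -select_streams v:0 \
  -show_entries stream=pix_fmt,color_primaries,color_transfer,color_space \
  -of default=noprint_wrappers=1 output.mp4
# Expect yuv420p10le, bt2020, smpte2084, bt2020nc

# 2) HDR10 static metadata on the first frame
ffprobe -v error -select_streams v:0 -read_intervals "%+#1" \
  -show_entries frame=side_data_list -of default=noprint_wrappers=1 output.mp4
# Expect "Mastering display metadata" and "Content light level metadata"
```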
For HLG output, expect color_transfer=arib-std-b67 — the rest of the checks are the same.
Docker Rendering
Docker uses the same auto-detect logic as local rendering, so you can produce HDR10 MP4 output from the containerized renderer without extra flags.
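A hypothetical sketch (the image name, mount, and binary invocation are placeholders; the documented flags behave exactly as in a local render):

```shell
docker run --rm -v "$PWD:/work" -w /work <your-hyperframes-image> \
  hyperframes render --format mp4
```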
Then verify the result with the same ffprobe checks described in Verifying HDR output.
Docker HDR rendering currently runs CPU-side for the SDR DOM layer (the container falls back to software WebGL because GPU passthrough is not configured by default). Frame capture is therefore slower than local headed Chrome — measure your own composition with
--quiet off and compare wall-clock times before sizing CI runners. The encoded HDR10 metadata and pixel data are identical to a local render.
Limitations
- MP4 only — HDR output with --format mov or --format webm falls back to SDR
- HDR images: 16-bit PNG only — other formats (JPEG, WebP, AVIF, APNG) are not decoded as HDR and fall through the SDR DOM path
- H.265 only — H.264 HDR requests are stripped: calling the encoder with codec: "h264" and hdr: { transfer } is rejected; the encoder logs a warning, drops hdr, and tags the output as SDR/BT.709. libx264 cannot encode HDR, so the alternative would be a "half-HDR" file (BT.2020 container tags but a BT.709 VUI block in the bitstream) which confuses HDR-aware players.
- GPU H.265 emits color tags but no static mastering metadata — useGpu: true with HDR (nvenc, videotoolbox, qsv, vaapi) tags the stream with BT.2020 and the correct transfer (smpte2084 / arib-std-b67) but does not embed master-display or max-cll SEI; FFmpeg does not pass those flags through to hardware encoders. The output is suitable for previews and authoring but not for HDR10-aware delivery (Apple TV, YouTube, Netflix). For spec-compliant HDR10 production output, leave useGpu: false so the software libx265 path embeds the mastering metadata.
- Player support — the <hyperframes-player> web component plays the encoded MP4 in the browser and inherits whatever HDR support the host browser provides; it does not implement its own HDR pipeline
- Headed Chrome HDR DOM capture — the engine ships a separate WebGPU-based capture path for rendering CSS-animated DOM directly into HDR (initHdrReadback, launchHdrBrowser). It requires headed Chrome with --enable-unsafe-webgpu and is not used by the default render pipeline. See Engine: HDR if you are building a custom integration.
Common Pitfalls
| Symptom | Likely cause |
|---|---|
| Output looks identical to SDR | Source media is SDR, or SDR was forced with --sdr. Run ffprobe on your inputs and check the render logs |
| Output is “kind of HDR” but tone-mapped wrong on YouTube/QuickTime | Missing HDR10 static metadata on the encoded stream. Verify with the ffprobe snippet above |
| Docker render is much slower than local | Expected — the container falls back to software WebGL for SDR DOM capture. Pixel output is the same |
| Used --format webm and got SDR | Expected — HDR output is MP4 only |
| HDR <img> looks SDR / washed out | Source is not a 16-bit PNG. Re-export as 16-bit PNG (BT.2020 PQ) or use a <video> element instead |
Next Steps
Rendering
Local vs Docker, quality presets, workers
CLI
Full render command reference including HDR auto-detect, --hdr, and --sdr
Engine: HDR APIs
Public HDR utilities exported from @hyperframes/engine
Common Mistakes
Pitfalls that affect render output