HDR10 is the open, royalty-free HDR (High Dynamic Range) standard that every premium video pipeline ships in 2026. It's the universal HDR baseline — every HDR-capable display decodes it, every modern codec encodes it, every streaming service that ships HDR content offers HDR10 as the default. Dolby Vision and HDR10+ exist as premium tiers above HDR10; HLG exists for broadcast contribution. HDR10 is the floor that everything else builds on. This page is the engineering reference for what HDR10 actually specifies, how to encode it correctly, and where it fits in HDR delivery.
What HDR10 is
HDR10 is a bundle of standards published by the Consumer Technology Association (CTA) starting in 2015. It specifies:
- Transfer function — SMPTE ST 2084 (PQ, Perceptual Quantizer). The mapping between digital code values and perceived luminance.
- Color space — BT.2020 primaries (wide color gamut).
- Bit depth — 10-bit per color component (sometimes called "Main 10" in HEVC profile terminology).
- Static metadata — SMPTE ST 2086 mastering display metadata + MaxFALL/MaxCLL content luminance metadata.
The "10" in HDR10 refers to 10-bit color, not the "HDR-1.0" version designation it sounds like. HDR10 is the first widely-deployed HDR consumer format; the version number is implicit (1).
The combination of PQ + BT.2020 + 10-bit gives:
- Higher dynamic range — PQ supports up to 10,000 nits peak luminance (vs SDR's 100 nits ceiling).
- Wider color gamut — BT.2020 covers ~75% of the visible color spectrum (vs Rec.709's ~36%).
- More precision — 10-bit reduces banding in subtle gradients (skies, shadows, smoke) that 8-bit chokes on.
Together, these make HDR10 content meaningfully better than SDR — assuming the display is HDR-capable and properly calibrated.
ST 2084 (PQ) transfer function
PQ — Perceptual Quantizer — is the transfer function ST 2084 defines, specified as an EOTF (electro-optical transfer function): the mapping from digital code values to absolute display luminance (encoders apply its inverse). The defining characteristic: PQ is display-referred, not scene-referred. The code value 1023 (10-bit max) maps to a fixed luminance value (10,000 nits) regardless of context.
This contrasts with SDR, where the gamma curve (BT.1886 on the display side) is relative rather than absolute: code values map to a fraction of the display's peak luminance. In SDR, the 100% white code value displays at whatever brightness the display is calibrated to (typically 100 nits in studio reference).
The implication: HDR10 content has absolute luminance encoded in the bits. A bright sun graded at 5,000 nits is encoded at 5,000 nits; the display reproduces it as faithfully as it can (typical consumer HDR TVs peak at 1,000-2,000 nits, so the highlight gets tone-mapped down to the display's capability).
PQ enables the dynamic range that makes HDR matter — bright lights look bright, dark shadows look dark, and the perceptual mapping is consistent across HDR-capable displays.
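The curve itself is compact enough to state directly. As a minimal sketch, here is the ST 2084 EOTF in Python (normalized code value to absolute nits), using the exact constants from the standard:

```python
# ST 2084 (PQ) EOTF: normalized code value in [0, 1] -> absolute luminance in nits.
# Constants are the exact rationals from SMPTE ST 2084.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value (e.g. 10-bit value / 1023) to nits."""
    e = code ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y

print(pq_eotf(1.0))    # 10000.0 — the top code value hits the PQ ceiling
print(pq_eotf(0.0))    # 0.0
print(pq_eotf(0.75))   # roughly 1,000 nits, a common mastering peak
```

The inverse of this function (nits to code value) is what encoders apply; both directions use the same five constants. Note how much of the code range sits below 1,000 nits — PQ spends its precision where human vision is most sensitive.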
BT.2020 color space
BT.2020 (also called Rec.2020) is the wide color gamut HDR10 uses. It defines:
- Red, green, blue primaries that bound a triangle in the CIE 1931 chromaticity diagram much larger than Rec.709's triangle.
- D65 white point (same as Rec.709).
- Y'CbCr matrix coefficients for converting between RGB and Y'CbCr representations.
BT.2020 covers roughly 75% of the visible spectrum compared to Rec.709's 36%. The practical effect: more saturated colors, especially in the reds and greens. A fully saturated BT.2020 red lies outside the Rec.709 gamut entirely, so the same code values decode to visibly different colors depending on which primaries are assumed.
Most consumer displays don't fully cover BT.2020 — they're typically in the DCI-P3 range (~46% of visible spectrum). Content encoded as BT.2020 gets gamut-mapped to the display's actual capability. Over time, displays are expanding toward full BT.2020; for now, partial coverage is the production reality.
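To make the gamut comparison concrete, here is a small Python sketch comparing the three gamut triangles in the CIE 1931 xy plane. One caveat, noted in the comments: triangle area is a crude size proxy, and the ~75%/~36% coverage figures above are measured against the visible locus, not computed this way.

```python
# Compare color gamut triangle areas in the CIE 1931 xy chromaticity plane.
# Triangle area is a crude size proxy; the ~75% / ~36% figures in the text
# are coverage of the visible locus, which requires the spectral boundary.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(p):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# BT.2020's triangle is roughly 1.9x the area of Rec.709's,
# with DCI-P3 sitting between the two.
ratio = triangle_area(BT2020) / triangle_area(REC709)
print(f"BT.2020 / Rec.709 area ratio: {ratio:.2f}")
```

The Rec.709 < DCI-P3 < BT.2020 ordering mirrors the display reality described above: consumer panels mostly land near P3, inside the BT.2020 container.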
Static metadata
HDR10 ships static metadata — values that apply to the entire video, not per-frame. The two metadata categories:
SMPTE ST 2086 mastering display metadata — describes the display the content was mastered on:
- Display primaries (RGBW chromaticity coordinates).
- White point coordinates.
- Min and max display luminance (e.g., 0.005 nits min, 1000 nits max for a typical HDR mastering monitor).
This tells consumer displays "this is how the colorist saw it; do your best to match."
MaxFALL (Maximum Frame Average Light Level) — the maximum frame-average luminance across the video. Helps consumer displays plan their tone-mapping budget.
MaxCLL (Maximum Content Light Level) — the maximum pixel luminance across the video. Helps displays size their highlight reproduction.
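Both values are simple reductions over per-pixel luminance, which a toy Python sketch makes concrete (frames here are flat lists of per-pixel nit values; real tooling derives luminance from decoded PQ code values):

```python
# MaxCLL / MaxFALL from per-pixel luminance in nits. Illustrative only;
# real measurement tools work on decoded frames from the graded master.
def compute_light_levels(frames):
    """frames: iterable of flat lists of per-pixel luminance in nits."""
    max_cll = 0.0    # brightest single pixel anywhere in the program
    max_fall = 0.0   # highest frame-average luminance in the program
    for pixels in frames:
        max_cll = max(max_cll, max(pixels))
        max_fall = max(max_fall, sum(pixels) / len(pixels))
    return max_cll, max_fall

# A dim frame, then a frame with one 1,000-nit specular highlight:
frames = [
    [5.0, 5.0, 5.0, 5.0],
    [10.0, 10.0, 10.0, 1000.0],
]
print(compute_light_levels(frames))   # (1000.0, 257.5)
```

Note how a single specular highlight drives MaxCLL to 1,000 while MaxFALL stays much lower — which is exactly why displays want both numbers to budget their tone mapping.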
The metadata is "static" — same values applied to the whole video. For content with varying dynamic ranges across scenes, static metadata can't represent the per-scene optimal mapping. Dynamic metadata formats (Dolby Vision, HDR10+) address this, at cost of additional encoding/playback complexity.
HDR10 in HEVC
HDR10 in HEVC uses the Main 10 profile (10-bit color depth, 4:2:0 chroma sampling). The signaling is in:
- VUI (Video Usability Information) — colour_primaries = 9 (BT.2020), transfer_characteristics = 16 (PQ / ST 2084), and matrix_coeffs = 9 (BT.2020 non-constant luminance).
- SEI (Supplemental Enhancement Information) messages — mastering display colour volume (payload type 137) and content light level info (payload type 144).
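Since players key off these fields, verifying them on encoded output is worth automating. A small sketch that checks one stream entry of the shape `ffprobe -v quiet -of json -show_streams` emits (field names as recent ffprobe builds report them):

```python
# Check HDR10 signaling on an ffprobe-style stream dict, i.e. one entry of
# the streams[] array from `ffprobe -v quiet -of json -show_streams out.mp4`.
def is_hdr10_signaled(stream: dict) -> bool:
    return (
        stream.get("color_primaries") == "bt2020"
        and stream.get("color_transfer") == "smpte2084"
        and stream.get("color_space") == "bt2020nc"
        and stream.get("pix_fmt", "").endswith("10le")   # 10-bit pixel format
    )

stream = {
    "codec_name": "hevc",
    "pix_fmt": "yuv420p10le",
    "color_primaries": "bt2020",
    "color_transfer": "smpte2084",
    "color_space": "bt2020nc",
}
print(is_hdr10_signaled(stream))   # True
```

A check like this catches the most common HDR10 delivery bug: correctly encoded 10-bit PQ video with missing or default VUI signaling, which displays as miscolored SDR.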
Encoding HDR10 with x265 requires explicit HDR signaling:
ffmpeg -i input -c:v libx265 -profile:v main10 -pix_fmt yuv420p10le \
-x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:\
master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):\
max-cll=1000,400" \
-c:a copy -tag:v hvc1 output.mp4
The cryptic master-display and max-cll strings are the SEI metadata. Get them wrong and HDR-capable displays may not engage HDR mode for the content; they'll display it as miscolored SDR.
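The integer units are fixed by the SEI semantics: chromaticity coordinates in increments of 0.00002, luminance in increments of 0.0001 nit. A small Python sketch decodes the example string's values back into human-readable numbers:

```python
# Decode x265 master-display integers back to chromaticities and nits.
# Per the HEVC mastering display colour volume SEI semantics:
# chromaticity unit = 0.00002, luminance unit = 0.0001 cd/m^2 (nit).
def decode_chromaticity(x: int, y: int):
    return (x * 0.00002, y * 0.00002)

def decode_luminance(max_l: int, min_l: int):
    return (max_l * 0.0001, min_l * 0.0001)

print(decode_chromaticity(13250, 34500))   # ~(0.265, 0.690): P3 green primary
print(decode_chromaticity(15635, 16450))   # ~(0.3127, 0.3290): D65 white point
print(decode_luminance(10000000, 1))       # ~(1000, 0.0001) nits max/min
```

So the example string describes a P3-D65 mastering display with a 1,000-nit peak and 0.0001-nit black, which is a typical grading monitor profile.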
HDR10 in AV1
AV1 supports HDR10 natively. The signaling is similar to HEVC: color primaries, transfer characteristics, and matrix coefficients in the sequence header OBU, plus mastering display and content light level metadata carried in metadata OBUs.
Encoding HDR10 with SVT-AV1:
ffmpeg -i input -c:v libsvtav1 -pix_fmt yuv420p10le \
-svtav1-params "color-primaries=9:transfer-characteristics=16:matrix-coefficients=9:\
mastering-display=G(0.265,0.690)B(0.150,0.060)R(0.680,0.320)WP(0.3127,0.3290)L(1000,0.0001):\
content-light=1000,400" \
-c:a copy output.mp4
AV1 HDR10 ecosystem maturity has caught up with HEVC; the encoding details differ slightly but the standards-compliant output is functionally equivalent.
HDR10 in CMAF / fMP4
For streaming, HDR10 content is packaged in CMAF segments (fragmented MP4) with codec-specific metadata in the init segment. The packaging signals HDR10 via:
- ftyp brand — typically cmfc or similar.
- colr box — color information atom signaling color primaries, transfer characteristics, matrix coefficients.
- mdcv box (mastering display color volume) — for HEVC HDR10 metadata.
- clli box (content light level information) — for MaxCLL/MaxFALL.
Players read these boxes when initializing decode and apply HDR rendering accordingly.
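A naive Python sketch makes the box layout concrete. This is not a real MP4 parser — it searches for fourccs and trusts the 4-byte size field preceding each match, so arbitrary payload bytes could false-positive — and production code should use a proper ISO BMFF parser:

```python
import struct

# Naive scan for HDR-related boxes in an init segment. Illustration only:
# it searches for fourccs and reads the 4-byte size field preceding each
# match rather than walking the real box hierarchy.
def scan_hdr_boxes(buf: bytes):
    found = {}
    for fourcc in (b"colr", b"mdcv", b"clli"):
        i = buf.find(fourcc)
        if i >= 4:
            (size,) = struct.unpack(">I", buf[i - 4 : i])
            found[fourcc.decode()] = buf[i + 4 : i - 4 + size]
    return found

# Build a colr box of colour_type 'nclx': primaries, transfer, and matrix
# as 16-bit ints plus a range flag byte. 9 / 16 / 9 is the HDR10 triplet.
payload = struct.pack(">4sHHHB", b"nclx", 9, 16, 9, 0x00)
box = struct.pack(">I4s", 8 + len(payload), b"colr") + payload

boxes = scan_hdr_boxes(box)
prim, transfer, matrix = struct.unpack(">HHH", boxes["colr"][4:10])
print(prim, transfer, matrix)   # 9 16 9
```

The 9/16/9 triplet in the colr box is the same signaling as the HEVC VUI values, just carried at the container layer so players can configure rendering before decode starts.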
Player and display ecosystem
HDR10 support across the consumer ecosystem in 2026:
- iOS / iPadOS — HDR10 supported on capable devices since iPhone 8/iPad Pro (2017+).
- Android — HDR10 supported on flagship phones since 2017+. Specific support varies by SoC.
- Smart TVs — every HDR-capable TV from 2016+ supports HDR10. Universal across LG, Samsung, Sony, Vizio, etc.
- Set-top boxes — Roku, Fire TV, Apple TV all support HDR10.
- Browsers — HDR10 in browsers is more limited. Safari supports HDR via HEVC; Chromium-based browsers have gained HDR support over time, but the browser ecosystem still trails mobile and TV.
- Game consoles — PS5, Xbox Series X|S support HDR10 natively.
The install base for HDR10 playback is large and growing. Premium streaming services that ship HDR content can reach a meaningful audience with HDR10 alone.
HDR10 vs Dolby Vision vs HDR10+
The three HDR formats compared:
| Dimension | HDR10 | Dolby Vision | HDR10+ |
|---|---|---|---|
| Open / royalty status | Open, royalty-free | Proprietary, licensed | Open with Samsung-led licensing |
| Static / dynamic metadata | Static | Dynamic | Dynamic |
| Bit depth | 10-bit | 10/12-bit | 10-bit |
| Color space | BT.2020 | BT.2020 | BT.2020 |
| Display ecosystem | Universal HDR-capable | Premium TVs and devices | Samsung TVs primary, growing |
| Content support | Massive | Apple TV+, Netflix premium, Disney+ premium | Amazon Prime, some studio content |
| Quality vs HDR10 | Baseline | Better via per-scene metadata | Better via per-scene metadata |
The pragmatic 2026 answer: ship HDR10 as the universal baseline. Add Dolby Vision for premium tiers if your content licensing allows and your audience justifies it. HDR10+ is a niche addition, primarily for Samsung TV audience reach.
Operational considerations
Things that matter for production HDR10:
Mastering monitor calibration — HDR10 metadata is meaningful only if it accurately describes the mastering process. Mis-set metadata causes incorrect display tone-mapping. Verify metadata against the actual mastering setup.
Tone-mapping on consumer displays — most consumer HDR TVs can't display 10,000 nits peak luminance. They tone-map to their actual peak (typically 600-2000 nits). Quality of tone-mapping varies by TV; some look great, some look worse than SDR. Test on real devices.
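The general shape of display-side tone mapping is a pass-through region below a knee and a highlight roll-off above it. A Python sketch of that shape (illustrative only; real displays approximate curves like the BT.2390 EETF, and this is not that spec curve):

```python
# Illustrative display-side tone map: linear below a knee, then a smooth
# asymptotic roll-off that compresses highlights up to 10,000 nits into
# the display's actual peak. NOT the BT.2390 EETF, just the same idea.
def tonemap(nits: float, display_peak: float, knee_frac: float = 0.75) -> float:
    knee = display_peak * knee_frac
    if nits <= knee:
        return nits                       # dark/midtone content passes through
    span = display_peak - knee
    excess = nits - knee
    # Asymptotically approach display_peak as input luminance grows.
    return knee + span * excess / (excess + span)

peak = 1000.0
print(tonemap(100.0, peak))    # 100.0 — below the knee, unchanged
print(tonemap(5000.0, peak))   # compressed to just under 1000
```

The design intuition: midtones (where faces and story live) should survive untouched, and only highlights pay the compression cost, so a 5,000-nit sun still reads as "much brighter" than a 900-nit lamp even on a 1,000-nit panel.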
Codec compatibility — HDR10 in HEVC works universally. HDR10 in H.264 is technically possible but rarely supported by consumer hardware decoders. AV1 HDR10 is supported on AV1-capable hardware (Ada GPUs, modern Android, recent smart TVs).
Bandwidth — 10-bit HDR10 typically needs ~10-20% more bitrate than 8-bit SDR to hold similar perceptual quality. The wider gamut and higher dynamic range have real bandwidth cost.
SDR fallback — for audiences without HDR-capable displays, you ship SDR variants. Often produced via tone-mapping from the HDR master. The fallback path needs careful color management to avoid washed-out or oversaturated output.
What MpegFlow does with HDR10
MpegFlow's FfmpegExecutor worker image runs HEVC Main 10 (x265) and AV1 10-bit (SVT-AV1) for HDR10 output. The DAG runtime expresses HDR10 encoding as parallel rendition stages on job_stages with explicit dependency tracking; per-stage retry handles transient failures; sibling cancellation propagates fatal failures.
An FfprobeExecutor stage characterizes source HDR signaling upstream (mastering display, MaxCLL, MaxFALL, color primaries, transfer characteristics); cross-stage data flow wires the probe output into the downstream rendition stages so the encoder receives accurate per-asset metadata rather than fixed defaults.
For pipelines that produce both HDR and SDR variants (typical premium streaming setup), the workflow runs an FfmpegExecutor tone-mapping stage that derives the SDR variant from the HDR master via BT.2390-based mapping; customers with specific colorist preferences can configure tone-mapping parameters.
The CMAF muxer at the packaging stage preserves HDR signaling boxes (colr, mdcv, clli) the encoder populated. Multi-DRM packaging (CENC/cbcs encryption applied to HDR or SDR variants) is part of the Phase 2D / Shaka Packager roadmap, not currently shipped runtime executors — customers needing DRM-protected HDR delivery handle the encryption/signaling step in their own packaging tooling alongside MpegFlow today.
For customers shipping HDR for the first time, the conversation typically focuses on the master-side tooling (where does the HDR-graded master come from?), the HDR metadata accuracy (do you have ST 2086 + MaxCLL/MaxFALL?), and the SDR fallback strategy (tone-map automatically vs manual SDR grade). HDR10 as the technical implementation is solved; the editorial and operational integration is where customer-specific work happens.
The strict-broker security model handles HDR content like any pipeline payload — workers carry no ambient credentials; content access flows through short-lived presigned URLs scoped per stage; access is disposed on completion. HDR doesn't change the security posture; the additional metadata is just additional fields in the stage spec.