A color space defines what colors a video signal can encode. Different color spaces have different reachable colors — Rec.709 (the SDR HD standard) covers about 36% of visible colors; BT.2020 (the modern HDR standard) covers about 75%. The color space your content uses determines what saturated reds, greens, and blues are actually representable. Getting color space wrong in a pipeline produces washed-out or oversaturated output, color shifts, and the kind of subtle quality bugs that are hard to debug. This page is the engineering reference for the major color spaces, what they cover, and how to work with them in production.
What a color space is
A color space defines:
- Primary chromaticities — the exact red, green, and blue colors used as primaries.
- White point — the chromaticity that represents pure white.
- Transfer function — the mapping between linear light and the encoded code values (related to but distinct from the color space proper; OETF/EOTF curves).
- Matrix coefficients — for Y'CbCr-encoded content, the matrix used to convert between RGB and Y'CbCr representations.
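The matrix-coefficient step can be sketched directly from BT.709's luma coefficients (Kr = 0.2126, Kb = 0.0722). A minimal full-range example — real encoders additionally quantize to limited-range 8- or 10-bit code values, which is omitted here:

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Convert non-linear R'G'B' in [0, 1] to Y'CbCr using BT.709
    matrix coefficients (Kr = 0.2126, Kb = 0.0722)."""
    kr, kb = 0.2126, 0.0722
    y = kr * r + (1 - kr - kb) * g + kb * b   # luma
    cb = (b - y) / (2 * (1 - kb))             # blue-difference chroma
    cr = (r - y) / (2 * (1 - kr))             # red-difference chroma
    return y, cb, cr

# White carries zero chroma; pure red carries maximum Cr.
print(rgb_to_ycbcr_bt709(1.0, 1.0, 1.0))  # ~(1.0, 0.0, 0.0)
print(rgb_to_ycbcr_bt709(1.0, 0.0, 0.0))  # ~(0.2126, -0.115, 0.5)
```

The same function with Kr = 0.2627, Kb = 0.0593 gives the BT.2020 non-constant-luminance matrix; this is exactly why Y'CbCr content must signal which matrix was used.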
The primaries form a triangle in the CIE 1931 chromaticity diagram. All colors representable in the color space lie within (or on) that triangle. Larger triangles cover more colors but require more bits to represent at equivalent precision.
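The relative triangle sizes can be checked with the shoelace formula. A sketch — raw area in CIE 1931 xy is a crude proxy because the diagram is perceptually non-uniform, which is why coverage figures are usually quoted against the visible gamut rather than as triangle areas:

```python
def triangle_area(primaries):
    """Shoelace area of a gamut triangle given [(x, y), ...] primaries."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(BT2020) / triangle_area(REC709)
print(f"BT.2020 triangle is {ratio:.2f}x the Rec.709 triangle")  # ~1.89x
```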
The most common color spaces for digital video:
- Rec.601 — SD video standard. Smaller gamut than Rec.709.
- Rec.709 — HD video standard. The SDR baseline.
- DCI-P3 — Digital Cinema Initiatives standard. Wider than Rec.709.
- Display P3 — Apple's variant of DCI-P3 with D65 white point. Common on Apple displays.
- BT.2020 / Rec.2020 — UHD/HDR standard. Significantly wider than DCI-P3.
Rec.709 — the HD/SDR standard
Rec.709 (also called BT.709) is the color space defined for HD video. It covers approximately 36% of the visible gamut as plotted on the CIE 1931 chromaticity diagram. The primaries are:
- Red: x=0.640, y=0.330
- Green: x=0.300, y=0.600
- Blue: x=0.150, y=0.060
- White: D65 (x=0.3127, y=0.3290)
The display-side transfer function is BT.1886 (a power-law gamma curve, approximately gamma 2.4 in display-referred form); BT.709 itself defines the matching camera-side OETF.
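In its simplest form — display black level of zero — BT.1886 collapses to a pure gamma-2.4 power law. A sketch of that simplified EOTF and its inverse:

```python
def bt1886_eotf(v, gamma=2.4):
    """Map a normalized code value V in [0, 1] to normalized linear light.
    Simplified BT.1886: with display black level Lb = 0, the a/b terms
    of the full formula vanish and only the power law remains."""
    return v ** gamma

def bt1886_inverse(l, gamma=2.4):
    """Inverse: normalized linear light back to a code value."""
    return l ** (1.0 / gamma)

# Mid-scale code 0.5 corresponds to only ~19% linear light — the
# perceptual point of gamma encoding: more code values for dark tones.
print(bt1886_eotf(0.5))  # ~0.189
```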
Rec.709 is the universal SDR baseline. Virtually every HD SDR pipeline produces Rec.709 content (legacy SD content uses Rec.601), and every consumer display supports Rec.709 at minimum. If you're shipping HD SDR video, Rec.709 is what you're using.
The 36% gamut coverage means many real-world colors aren't representable — saturated reds and greens that exist in nature get clipped to less-saturated approximations. For most consumer content, this is acceptable; viewers don't notice the missing colors. For premium content (cinema, commercials, certain documentary), the limitation is more visible.
DCI-P3 — the cinema standard
DCI-P3 (Digital Cinema Initiatives P3) is the color space used in digital cinema. It covers approximately 46% of the visible gamut — wider than Rec.709 but narrower than BT.2020. The primaries are:
- Red: x=0.680, y=0.320
- Green: x=0.265, y=0.690
- Blue: x=0.150, y=0.060
- White: DCI white (x=0.314, y=0.351) for theatrical use; D65 for "Display P3" variant
DCI-P3 was developed for digital cinema projection. The wider gamut covers most of the colors a film print can reproduce, which made DCI-P3 a meaningful improvement over older digital projection standards.
For consumer displays, "Display P3" is the more common variant — DCI-P3 primaries with D65 white point (the standard white for display content). Apple's iMac, MacBook Pro, and recent iPhones display Display P3 content natively. Many premium consumer TVs from 2018+ also cover Display P3.
DCI-P3 sits in an interesting middle position — wider than Rec.709 (so meaningful for premium SDR content), narrower than BT.2020 (so less complete for HDR content). Most premium SDR content from Apple's ecosystem ships in Display P3; HDR content uses BT.2020.
BT.2020 / Rec.2020 — the wide-gamut HDR standard
BT.2020 (also called Rec.2020) is the wide color gamut defined for UHDTV (4K and 8K) and HDR video. It covers approximately 75% of the visible gamut. The primaries are:
- Red: x=0.708, y=0.292
- Green: x=0.170, y=0.797
- Blue: x=0.131, y=0.046
- White: D65
BT.2020 has very saturated primaries — saturated enough that no consumer display in 2026 fully reaches them. The primaries are essentially "where the laser pointers point" — pure single-wavelength colors. Producing red light that pure requires a laser; consumer displays use phosphor or quantum-dot emission that's less pure.
The implication: BT.2020-encoded content is gamut-mapped to the actual display's capability. A Display P3 TV can show ~46% of visible colors; a BT.2020-encoded video gets gamut-mapped from 75% to 46% during display.
Most current premium HDR TVs cover ~70-80% of BT.2020 (with quantum dot displays reaching higher than older WOLED designs). The displays are catching up to the standard; for now, BT.2020 is "encode for the future capability with current display fallback."
For HDR content, BT.2020 is the standard. HDR10, Dolby Vision, HLG, and HDR10+ all use BT.2020 primaries with D65 white point.
Gamut comparison
Visualizing the relative gamut sizes (rough percentages of visible color space):
Rec.709: ████████████ (36%)
Display P3: ███████████████ (46%)
DCI-P3: ███████████████ (46%)
BT.2020: █████████████████████████ (75%)
The jumps are meaningful:
- Rec.709 → Display P3: ~28% more colors. Premium-quality enhancement noticeable on capable displays.
- Display P3 → BT.2020: ~63% more colors. Beyond current consumer display capability for the most saturated regions.
- BT.2020 → full visible gamut: roughly 25% of visible colors still fall outside BT.2020 (equivalently, the visible gamut holds ~33% more colors than BT.2020 covers).
The progression matches the historical evolution: Rec.709 (HD era), DCI-P3 (early premium / Apple), BT.2020 (HDR / future-proofing).
Color space conversion
Converting between color spaces requires:
- Linearize — apply the EOTF (electro-optical transfer function) to convert encoded code values to linear light.
- Matrix transform — apply a 3×3 matrix to convert between source and target primaries.
- Gamut handling — colors outside the target gamut must be either clipped (lossy) or compressed (gamut compression algorithms).
- Re-encode — apply the target OETF (the inverse EOTF) to convert linear light back to encoded code values, then quantize to the target bit depth.
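The steps above, sketched for a single pixel in the easy direction (Rec.709 → BT.2020, where gamut handling is a no-op because every Rec.709 color fits inside BT.2020). The 3×3 matrix is the rounded BT.709→BT.2020 linear-light conversion from ITU-R BT.2087; pure gamma 2.4 stands in here for the real transfer functions:

```python
# Rounded linear-light RGB conversion matrix, BT.709 -> BT.2020 (BT.2087).
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert_709_to_2020(rgb, gamma=2.4):
    """Encoded Rec.709 R'G'B' in [0, 1] -> encoded BT.2020 R'G'B'."""
    lin = [c ** gamma for c in rgb]                      # 1. linearize
    out = [sum(m * c for m, c in zip(row, lin))          # 2. matrix transform
           for row in M_709_TO_2020]
    out = [min(max(c, 0.0), 1.0) for c in out]           # 3. gamut clip (no-op here)
    return [c ** (1.0 / gamma) for c in out]             # 4. re-encode

# Saturated Rec.709 red lands well inside the BT.2020 gamut, so its
# BT.2020 code values pick up nonzero green and blue components.
print(convert_709_to_2020([1.0, 0.0, 0.0]))
```

The reverse direction (BT.2020 → Rec.709) uses the inverse matrix, and step 3 stops being a no-op: out-of-gamut colors must be clipped or compressed, which is where the real complexity lives.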
The ffmpeg colormatrix filter handles basic conversion (note it converts only the Y'CbCr matrix coefficients, not primaries or transfer function — the colorspace filter performs the fuller conversion):
ffmpeg -i bt2020.mp4 -vf "colormatrix=bt2020:bt709" rec709.mp4
For HDR-to-SDR conversion (BT.2020 + PQ → Rec.709 + BT.1886), the conversion is more complex because both color space and transfer function change, and tone-mapping is required for the dynamic range reduction. The tonemap filter family handles this:
ffmpeg -i hdr10.mp4 -vf \
"zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
sdr.mp4
Color science is genuinely hard. Bad conversion produces visible quality bugs (color shifts, banding, oversaturation). For production pipelines, use well-tested conversion tools and verify output against expected color targets.
Signaling in containers
Color space information is signaled in the video container metadata:
- colr box in MP4/CMAF — color information atom. Specifies color primaries, transfer function, and matrix coefficients.
- VUI in H.264/HEVC — Video Usability Information, carried in the sequence parameter set, signals the same fields in the bitstream.
- Sequence header color config in AV1 — the equivalent signaling in AV1's sequence header OBU.
Common signaling combinations:
| Format | Color primaries | Transfer | Matrix |
|---|---|---|---|
| Rec.709 SDR | BT.709 (1) | BT.709 (1) | BT.709 (1) |
| BT.2020 SDR | BT.2020 (9) | BT.2020 10-bit (14) | BT.2020 NCL (9) |
| HDR10 | BT.2020 (9) | PQ (16) | BT.2020 NCL (9) |
| HLG | BT.2020 (9) | ARIB STD-B67 (18) | BT.2020 NCL (9) |
| DCI-P3 (theatrical) | P3 DCI (11) | varies | varies |
| Display P3 | P3 D65 (12) | sRGB (13) | varies |
Mismatched signaling is a common source of quality bugs. Content encoded as BT.2020 but signaled (or assumed) as Rec.709 displays with washed-out colors; content encoded as Rec.709 but signaled as BT.2020 displays with oversaturated colors.
For automated pipelines, the signaling must match the actual encoded content. Verify with media inspection tools (ffprobe, mediainfo) that the signaled color space matches the encoder's actual output.
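A sketch of that check: ffprobe reports color_primaries, color_transfer, and color_space per video stream (e.g. via -show_streams -of json), and a validator can compare those fields against the expected combinations. The EXPECTED table and validate_color_signaling below are hypothetical names, but the field names and values match what ffprobe emits:

```python
# Expected ffprobe field values per delivery format (hypothetical table).
EXPECTED = {
    "rec709-sdr": {"color_primaries": "bt709",
                   "color_transfer": "bt709",
                   "color_space": "bt709"},
    "hdr10":      {"color_primaries": "bt2020",
                   "color_transfer": "smpte2084",    # PQ
                   "color_space": "bt2020nc"},
    "hlg":        {"color_primaries": "bt2020",
                   "color_transfer": "arib-std-b67",
                   "color_space": "bt2020nc"},
}

def validate_color_signaling(stream, fmt):
    """Compare one ffprobe video-stream dict against the expected
    signaling for `fmt`; return a list of mismatch descriptions."""
    problems = []
    for field, want in EXPECTED[fmt].items():
        got = stream.get(field, "unknown")
        if got != want:
            problems.append(f"{field}: expected {want!r}, got {got!r}")
    return problems

# e.g. an HDR10 stream that lost its transfer signaling in a re-mux:
stream = {"color_primaries": "bt2020", "color_space": "bt2020nc"}
print(validate_color_signaling(stream, "hdr10"))
# ["color_transfer: expected 'smpte2084', got 'unknown'"]
```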
Common color space pitfalls
Things that go wrong in production:
- Color space metadata stripping — pipelines that re-mux without preserving color metadata produce content with default (and often wrong) signaling. Players assume Rec.709 by default; BT.2020 content with stripped metadata gets misinterpreted.
- Mid-pipeline transcode without color awareness — re-encoding without explicit color space specification can default to Rec.709 even when source is BT.2020. Specify color space explicitly at every encoder stage.
- Display-referred vs scene-referred confusion — HLG is scene-referred; PQ is display-referred. Mixing tooling that assumes one with content that uses the other produces wrong output.
- Chroma sub-sampling color shifts — 4:2:0 chroma sub-sampling can introduce subtle color shifts. For colorist-graded premium content, 4:2:2 chroma sub-sampling preserves more color detail but has different operational characteristics.
- Out-of-gamut colors during editorial — content authored in Display P3 then shipped as Rec.709 has out-of-gamut colors clipped. Better to author with the delivery color space in mind, or to ship in the wider color space.
What MpegFlow does with color spaces
MpegFlow's DAG runtime handles color-space metadata explicitly through stage parameters. An FfprobeExecutor stage characterizes the source's color space, primaries, and transfer; cross-stage data flow wires the probe output into downstream FfmpegExecutor rendition stages so each rendition's encoder receives accurate per-asset color metadata. The partitioner persists each stage to job_stages with explicit dependency tracking; per-stage retry handles transient failures.
For color-space conversion (BT.2020 HDR → Rec.709 SDR for SDR variant production), the workflow runs a conversion stage on FfmpegExecutor with BT.2390-aligned tone mapping by default; customers with specific colorist requirements can configure conversion parameters. Sibling cancellation propagates fatal failures across rendition stages so dependent encodes don't waste compute.
For pipelines that ship multiple color-space variants (e.g., BT.2020 HDR + Rec.709 SDR + Display P3 mid-tier), each variant is a parallel rendition stage. The color-space parameter is per-stage; the same source produces different color-space outputs through parallel sibling stages.
The CMAF muxer at the packaging stage preserves the colr box signaling the encoder populated; players read the signaling and render appropriately.
The strict-broker security model handles color-space work like any pipeline payload — workers carry no ambient credentials; content access flows through short-lived presigned URLs scoped per stage; access is disposed on completion.
For customers building their first multi-variant color-space pipeline (typically when adding HDR delivery to an existing SDR pipeline), the conversation focuses on: source color space (what's the master?), target color spaces per delivery (HDR + SDR variants), conversion algorithm choices (which tone-mapping for HDR→SDR?), and validation procedures (how do we verify the variants look correct?). Color management is a real area of pipeline engineering; getting it right requires explicit design rather than defaults.