
Limited vs full range color — TV range vs PC range and the conversion bugs

Practical reference on video color range — TV range (16-235 for 8-bit), PC range (0-255), historical reasons, signaling, range mismatch artifacts, ffmpeg conversion.

By MpegFlow Engineering Team · Color
May 9, 2026 · 8 min read · 1,588 words
In this topic
  1. What color range is
  2. Why TV range exists
  3. When each is used
  4. Range mismatch artifacts
  5. Signaling
  6. ffmpeg range handling
  7. Detection
  8. Conversion procedure
  9. Common range bugs
  10. Range and HDR
  11. Pipeline range strategy
  12. Range and color management
  13. Operational considerations
  14. What MpegFlow does with color range

Color range — whether YUV values use the full 0-255 (or 0-1023 for 10-bit) or a limited 16-235 range — is one of the small color details that produce big visible bugs when handled wrong. Limited range (TV range) reserves the lowest 16 and highest 20 luma values; full range uses everything. Mixing the two without correct signaling produces washed-out or crushed output. This page is the engineering reference.

#What color range is

YUV color encoding typically uses 8 bits per channel (10 bits for HDR). The 8-bit code space spans 0-255 (256 values). There are two conventions for using it:

TV range / Limited range / Studio range:

  • Y' (luma): 16 to 235.
  • Cb, Cr (chroma): 16 to 240.
  • Values 0-15 and 236-255 for luma (0-15 and 241-255 for chroma) are reserved as headroom for processing.

PC range / Full range:

  • Y' (luma): 0 to 255.
  • Cb, Cr (chroma): 0 to 255.
  • Full digital dynamic range; no reserved headroom.

For 10-bit content, the math scales:

  • TV range: Y' = 64-940, Cb/Cr = 64-960.
  • PC range: 0-1023.
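The limits scale by simple bit-shifts, so they can be computed for any bit depth. A minimal sketch (the function name is illustrative, not from any library):

```python
def range_limits(bits: int, full: bool = False):
    """Return ((luma_min, luma_max), (chroma_min, chroma_max)) for a bit depth.

    TV/limited range scales the 8-bit limits (16-235 luma, 16-240 chroma)
    by 2^(bits-8); full range always spans 0 .. 2^bits - 1.
    """
    if full:
        lo, hi = 0, (1 << bits) - 1
        return (lo, hi), (lo, hi)
    shift = bits - 8
    luma = (16 << shift, 235 << shift)
    chroma = (16 << shift, 240 << shift)
    return luma, chroma

# 8-bit TV range: luma 16-235, chroma 16-240.
# 10-bit TV range: luma 64-940, chroma 64-960; full range 0-1023.
```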

Most consumer streaming uses TV range. Most computer graphics and screen captures use PC range. Mixing them without conversion is the source of most range-related bugs.

#Why TV range exists

The historical reason: analog video had blacker-than-black and whiter-than-white signals (sync pulses, headroom for processing). When digital video was introduced, the convention preserved this by reserving the low/high values:

  • 0-15 for "blacker-than-black" (sync, headroom).
  • 236-255 for "whiter-than-white" (specular highlights, processing headroom).

These reserved values aren't displayed as black or white pixels — they represent signal excursions "below black" or "above white" in the analog sense.

Modern digital video doesn't need analog sync pulses, but the convention persists for compatibility. Broadcast television, professional video, and most consumer streaming use TV range as the de facto standard.

#When each is used

TV range (limited):

  • Broadcast television.
  • Most consumer streaming (Netflix, Disney+, HBO Max, etc.).
  • Blu-ray and DVD.
  • HDMI video output (typically; configurable on some TVs).
  • ProRes, DNxHR, and most professional codec streams.

PC range (full):

  • Computer graphics output.
  • Screen captures and screen recording.
  • Some web video (depends on the encoder and authoring tool).
  • DisplayPort output (configurable; default varies).
  • Mobile device displays (many devices use full range internally).

For pipelines mixing content from these sources, range conversion at ingest is essential.

#Range mismatch artifacts

When TV-range content is interpreted as full-range (or vice versa):

TV-range content interpreted as full-range:

  • Black is rendered as gray (16/255 = 6.3% gray).
  • White is rendered as light gray (235/255 = 92% gray).
  • Output looks washed-out, low-contrast.

Full-range content interpreted as TV-range:

  • Black at 0 is treated as below-black (clipped to TV-black).
  • White at 255 is treated as above-white (clipped to TV-white).
  • Output looks crushed in shadows and highlights.

Either way, the result is visibly wrong. Color is shifted, contrast is wrong, and quality is compromised even though the underlying pixel data is correct.
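Both failure modes can be reproduced numerically. A sketch that normalizes an 8-bit luma code value to display brightness under each interpretation (decode_luma is an illustrative name):

```python
def decode_luma(y: int, assumed_range: str) -> float:
    """Map an 8-bit luma code value to normalized brightness (0.0-1.0)
    under the given range assumption ("tv" or "pc")."""
    if assumed_range == "tv":
        # Limited range: 16 is black, 235 is white; values outside clip.
        return min(max((y - 16) / 219.0, 0.0), 1.0)
    # Full range: 0 is black, 255 is white.
    return y / 255.0

# TV-range black (code 16) misread as full range: ~6.3% gray, not black.
washed_out = decode_luma(16, "pc")   # ~0.063
# Full-range near-black (code 10) misread as TV range: clipped to 0.
crushed = decode_luma(10, "tv")      # 0.0
```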

#Signaling

The container should signal the range so playback systems handle it correctly:

MP4 / fMP4 / MOV:

  • colr box with range field (limited vs full).
  • Some players default to limited if not specified.

MPEG-TS:

  • VUI in H.264/HEVC stream signals video_full_range_flag.
  • 0 = limited range; 1 = full range.

MKV / WebM:

  • Colour metadata in the track's Video element (the Matroska Colour element's Range field) signals limited vs full.

For pipelines, the signaling must match the actual content. Mis-signaled content produces the artifacts above.

#ffmpeg range handling

ffmpeg handles range explicitly in encoding and processing:

Specifying range during encoding:

ffmpeg -i input.mov -c:v libx264 -pix_fmt yuv420p -color_range tv output.mp4

-color_range tv declares limited range. Use pc for full range.

Range conversion via filter:

ffmpeg -i input.mp4 -vf "scale=in_range=full:out_range=limited" -c:v libx264 output.mp4

The scale filter (used here as an identity scaler) performs the pixel-value conversion. in_range is the source range; out_range is the target. Declare the new range with -color_range as well so the output metadata matches the converted values.

zscale for more precise conversion:

ffmpeg -i input.mp4 -vf "zscale=in=full:out=limited" -c:v libx264 output.mp4

zscale is more accurate for range conversions than scale.

#Detection

Detecting source range:

ffprobe -v error -select_streams v:0 -show_entries stream=color_range -of default=noprint_wrappers=1:nokey=1 input.mp4

Output is tv, pc, or unknown/empty (unsignaled).

If unsignaled, you have to infer:

  • If source is from broadcast, professional video, or known streaming: assume TV range.
  • If source is from screen capture or computer graphics: assume PC range.
  • If unknown: pixel inspection (find the lowest and highest Y values; if min is around 16, it's TV range; if min is around 0, it's PC range).
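The pixel-inspection heuristic in the last bullet can be sketched as follows (the thresholds are illustrative assumptions, not from any standard):

```python
def infer_range(y_min: int, y_max: int) -> str:
    """Guess the range of unsignaled 8-bit content from the observed
    luma extremes. Heuristic only: low-contrast content can fool it."""
    # Values below 16 or above 235 cannot occur in well-formed TV range.
    if y_min < 16 or y_max > 235:
        return "pc"
    # Extremes sitting near 16/235 suggest TV range.
    if y_min <= 20 and y_max >= 230:
        return "tv"
    # Low-contrast content: the extremes tell us nothing.
    return "unknown"

infer_range(0, 255)    # "pc"
infer_range(16, 235)   # "tv"
infer_range(60, 180)   # "unknown"
```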

For pipelines ingesting heterogeneous content, range detection at ingest is part of the standard inspection stage.

#Conversion procedure

For TV range → PC range conversion (limited to full):

PC_value = (TV_value - 16) * 255 / (235 - 16)
PC_value = (TV_value - 16) * 255 / 219

For 8-bit luma, this stretches the limited 16-235 range to fill 0-255. (Chroma uses the 16-240 span, so its divisor is 224.)

For PC range → TV range conversion (full to limited):

TV_value = TV_low + (PC_value * (TV_high - TV_low) / 255)
TV_value = 16 + (PC_value * 219 / 255)

This compresses the full 0-255 to fit in 16-235, with 16 = TV-black.

The conversion is mathematically clean but loses precision (going TV → PC → TV doesn't return exactly to start values). Avoid round-trip range conversions when possible.
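Both conversions, and the round-trip precision loss, in a short sketch (8-bit luma only; function names are illustrative):

```python
def tv_to_pc(y: int) -> int:
    """Stretch limited-range luma (16-235) to full range (0-255)."""
    y = min(max(y, 16), 235)          # clip out-of-range inputs
    return round((y - 16) * 255 / 219)

def pc_to_tv(y: int) -> int:
    """Compress full-range luma (0-255) into limited range (16-235)."""
    return round(16 + y * 219 / 255)

# Endpoints map exactly ...
assert tv_to_pc(16) == 0 and tv_to_pc(235) == 255
assert pc_to_tv(0) == 16 and pc_to_tv(255) == 235
# ... but compression merges neighboring codes, so a
# full -> limited -> full round trip does not always return the input:
assert tv_to_pc(pc_to_tv(4)) == 3    # 4 -> 19 -> 3
```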

#Common range bugs

Bug 1: ffmpeg defaults vary by version.

Older ffmpeg versions handled range defaults differently than current releases. Pipelines that worked on an old ffmpeg may produce range bugs after an upgrade.

Bug 2: Browser playback differences.

Different browsers handle unsignaled range differently. Chrome may default to TV; Safari may default to PC. Test on actual targets.

Bug 3: HDR handoff to SDR.

HDR content is typically TV range. When converting to SDR, ensure the conversion preserves range.

Bug 4: Screen capture in PC range encoded as TV range.

Screen captures are PC range; encoding as TV range without conversion produces washed-out output. Use the range conversion filter at ingest.

Bug 5: Reset on metadata strip.

Some pipeline operations (transcoding, re-multiplexing) can strip range metadata. Output then defaults to whatever the new player chooses. Ensure metadata preservation through pipeline.

#Range and HDR

HDR signals are typically TV range (specifically, narrow-range PQ). "Narrow range" is the same convention scaled to the higher bit depth — for 10-bit HDR it is 64-940, exactly the 8-bit 16-235 limits multiplied by 4.

For pipelines:

  • HDR encoding: typically TV range narrow.
  • HDR-to-SDR conversion: preserves TV range narrow → TV range BT.709.
  • Screen-captured HDR (rare but possible): PC range. Convert before downstream processing.

The HDR specification of "narrow range" is the same concept as SDR's "TV range" — reserved low/high values for processing headroom and historical compatibility.

#Pipeline range strategy

For most pipeline operations:

  1. Detect range at ingest.
  2. Convert to TV range if input is PC range (most consumer streaming uses TV range).
  3. Preserve range through processing.
  4. Signal range correctly at output.
  5. Test on target players.

For pipelines with internal-only PC range workflows (screen recording, computer graphics rendering), keep PC range throughout and convert only at the final consumer-facing output stage.
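The ingest decision in steps 1-2 reduces to choosing a filter string. A sketch using the scale-filter syntax shown earlier (the function name and fallback policy are illustrative):

```python
def ingest_range_filter(detected, working="tv"):
    """Return an ffmpeg scale-filter argument that converts the detected
    range into the pipeline's working range, or None if no conversion is
    needed. Range names follow ffprobe's tv/pc convention."""
    names = {"tv": "limited", "pc": "full"}
    if detected not in names:
        # Unsignaled: fall back to the pipeline's assumed default.
        return None
    if detected == working:
        return None
    return f"scale=in_range={names[detected]}:out_range={names[working]}"

ingest_range_filter("pc")   # "scale=in_range=full:out_range=limited"
ingest_range_filter("tv")   # None: already in the working range
```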

#Range and color management

Range interacts with color space and transfer function:

  • Rec.709 SDR: TV range standard.
  • DCI-P3 / Display P3: typically TV range.
  • BT.2020 HDR: TV range narrow.
  • sRGB (computer graphics): PC range.

Pipelines that handle multiple color spaces need consistent range handling per color space, with explicit conversion at boundaries.
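As a configuration sketch, the defaults above can be encoded as a lookup consulted at those boundaries (names and the tv fallback are illustrative assumptions):

```python
# Default range convention per color space, from the list above.
DEFAULT_RANGE = {
    "bt709": "tv",       # Rec.709 SDR
    "display-p3": "tv",  # DCI-P3 / Display P3 (typically)
    "bt2020": "tv",      # BT.2020 HDR: TV range narrow
    "srgb": "pc",        # sRGB computer graphics
}

def expected_range(color_space):
    """Look up the conventional range for a color space; default to tv,
    the safer assumption for consumer video."""
    return DEFAULT_RANGE.get(color_space, "tv")
```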

#Operational considerations

Things that matter for color range in production:

  • Detect at ingest — automatic range detection saves manual configuration.
  • Convert early — handle range in early pipeline stages; downstream operates in expected range.
  • Signal correctly — output metadata must match actual content range.
  • Document conventions — record what range your pipeline operates in; new engineers benefit.
  • Cross-platform testing — verify output looks correct on multiple browsers/players.
  • Round-trip avoidance — minimize range conversions; each conversion loses precision.

#What MpegFlow does with color range

MpegFlow's DAG runtime detects color range through an FfprobeExecutor stage upstream of encode. Cross-stage data flow wires the probe output (range, primaries, transfer) into the downstream FfmpegExecutor rendition stages so the encoder receives accurate range metadata rather than assumed defaults; the partitioner persists each stage to job_stages with explicit dependency tracking. PC-range content (typically from screen captures or non-broadcast sources) is converted to TV range at the encode stage via filter parameters by default; TV-range content passes through unchanged.

For customers producing internal-only PC-range workflows, the workflow YAML configures PC-range output as a per-rendition parameter on the FfmpegExecutor stage.

The CMAF muxer at the packaging stage preserves range signaling in the container metadata that the encoder emitted.

The strict-broker security model handles range conversion like any pipeline payload — workers carry no ambient credentials; content access flows through short-lived presigned URLs scoped per stage; access is disposed on completion.

For customers debugging range-related issues in production, the standing recommendation: verify range signaling in source content; verify range conversion happens at the encode stage; verify output signaling matches actual range; test on actual target players to catch player-specific range handling differences.

The general guidance: color range is a small detail with big visible impact. Get it right at ingest; preserve it through processing; signal it correctly at output. Range bugs are the source of "this video looks wrong on platform X but fine on platform Y" in production — almost always traceable to range mishandling somewhere in the pipeline.

Tags
  • color-range
  • tv-range
  • pc-range
  • color
  • conversion
  • FFmpeg
See also

  • HDR to SDR conversion — the full pipeline from PQ to BT.709
  • Engineering blog
    FFmpeg presets that survive production
    What works, what bites, what to pin across encoder versions
  • Engineering blog
    FFmpeg in Kubernetes: pod, queue, operator
    The four patterns and where each one breaks