Here, found the post I was looking for. I would trust Jim Houston the most on this subject.
sRGB and rec709 have the same primaries on the horseshoe curve.
For the characteristic curve, both of these contain a linear segment near the toe, although the breakpoint is different.
On the camera-taking characteristic curve (the signal definition),
both of these curves use roughly a 1/2.2 gamma over the upper scale.
On the display characteristic curve, sRGB uses a gamma 2.2 display assumption
(computer monitors were assumed to be the output device) for the overall display curve.
However, the upper-scale contrast is actually at gamma 2.4; only the inclusion of the
linear section causes a curve-fit gamma function to come out at about gamma 2.2.
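As a quick illustration, here is a rough Python sketch of my own using the constants from the published IEC 61966-2-1 (sRGB) and ITU-R BT.709 definitions; the function names and sample values are just mine for the example:

def srgb_encode(lin):
    # sRGB encoding: linear toe below 0.0031308, then a 1/2.4 power with scale/offset
    return 12.92 * lin if lin <= 0.0031308 else 1.055 * lin ** (1 / 2.4) - 0.055

def rec709_oetf(lin):
    # Rec.709 camera curve: linear toe below 0.018, then a 0.45 (~1/2.2) power
    return 4.5 * lin if lin < 0.018 else 1.099 * lin ** 0.45 - 0.099

def srgb_display(v):
    # sRGB display curve (inverse of srgb_encode): gamma 2.4 above the break
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

# Same basic shape, different breakpoints in the toe:
for lin in (0.001, 0.01, 0.18, 0.5, 1.0):
    print(f"linear {lin:5.3f}  sRGB {srgb_encode(lin):.4f}  709 {rec709_oetf(lin):.4f}")

# The sRGB power segment uses an exponent of 2.4, but with the linear toe the
# overall curve tracks a plain 2.2 power much more closely than a plain 2.4
# power, e.g. at a mid-scale code value:
v = 0.5
print(srgb_display(v), v ** 2.2, v ** 2.4)   # ~0.214 vs ~0.218 vs ~0.189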
rec709 (and rec601 before it) did not fully define a display characteristic curve; the
existence of brightness and contrast dials on CRTs made it impossible to know
"exactly" how the user would see a file. Instead of gamma as the prime determinant,
all video systems used a white point plus a PLUGE definition to provide a visibility
guarantee for the signal. Exact reproduction was not assured (NTSC = never
twice the same color). When Sony BVM CRTs became the gold standard, it
was possible to measure a large set of them to determine what the 'mastering'
output standard really was in practice. This has been done several times,
most notably by Steve Roberts at the BBC, who came up with the 2.35 gamma
that became part of the ITU reference monitor standard. Hollywood studios
have also done studies of the outputs and have centered around 2.4 gamma
for the output display (though, like anything, there are some outliers that range
between 2.2 and 2.4). The difference between 2.35 and 2.4 gamma at the output
of the display is slight, and mostly shows as a small mid-tone brightness
difference and slightly more visible shadows with the 2.35 gamma.
Jim Houston
VP Technology and Engineering
Sony Pictures
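To put rough numbers on that last 2.35 vs 2.4 point, here is a minimal sketch (my own, not from the post) that simply treats each display as a pure power function, output = signal ** gamma, at a few arbitrary signal levels:

for signal in (0.05, 0.1, 0.18, 0.5, 0.9):
    out_235 = signal ** 2.35
    out_240 = signal ** 2.40
    # ratio = signal ** -0.05: the 2.35 display is brighter, and the gap widens
    # toward black (~16% at 0.05, ~3-4% at mid-scale, well under 1% near white)
    print(f"signal {signal:.2f}  2.35 -> {out_235:.4f}  2.4 -> {out_240:.4f}  "
          f"ratio {out_235 / out_240:.3f}")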