Monitor Calibration

  • Monitor Calibration

    Hi All,

    I am definitely off-topic, but since this is one of the best forums around and the question is related to visualisation, I'm giving it a go.

    I have mixed feelings about monitor calibration; sometimes it works, sometimes it doesn't, at least for me.
    I have had several versions of the EyeOne Display Pro and a Spyder 3, and I have used them with different monitors over the last 10 years, including an EIZO CG243W.

    To make a long story short, I would like to know what your current setup and workflow are.
    At what luminance (cd/m²) do you set the brightness? The usual suggestion is 120 or 160 cd/m², but with today's high-end monitors that seems to me like a bit of a waste in terms of dynamic range.

    My monitors are set up at 200 cd/m², but then, when I see the same images on less bright screens, they don't look as good.

    What is your standard workflow?

    Do you load the ICC profile in the VFB or leave it at the default?
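    (To make the question concrete: by "load the ICC profile" I mean a conversion like the one below. A minimal Pillow sketch, purely illustrative; the file paths are placeholders.)

    Code:
    # Soft-proof an sRGB render through a monitor's ICC profile,
    # roughly what a colour-managed frame buffer does on display.
    from PIL import Image, ImageCms

    im = Image.open("render.png").convert("RGB")        # placeholder path
    srgb = ImageCms.createProfile("sRGB")               # built-in sRGB profile
    monitor = ImageCms.getOpenProfile("monitor.icc")    # placeholder display profile
    proofed = ImageCms.profileToProfile(
        im, srgb, monitor,
        renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
    )
    proofed.save("render_proofed.png")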

    Regards,

    Giacomo.



  • #2
    Do you have any updates on this?
    I am at the point where you were standing in 2017.
    Any tips for me?
    for my blog and tutorials:
    www.alfasmyrna.com



    • #3
      We're in a *huge* transitional phase towards broader gamuts and advanced display technologies.
      Tired of ant-sized text (i still have a ~230dpi HP sitting somewhere), i bought myself a big 72dpi OLED TV (~98% DCI-P3).
      That's my only monitor now, for all kinds of work (and play).

      Now, i am far removed from the need for accurate color calibration these days, and as such i didn't even bother calibrating it: it just looks gorgeous and rangy as shipped from the factory (once the TV "AI" gimmicks are turned off).
      The next step is to get to the point where i turn on HDR in Windows and move to scRGB @ 1000+ nits.
      That'd need a whole different calibration, and likely a wholly upgraded sensor and software to calibrate it (imaginary colors aren't physically creatable).
      Even just in sRGB, an OLED display will create perfect blacks, and result in infinite contrast (any value divided by zero...).
      Pure blacks instead of milky ones will make an enormous perceptual difference, which is difficult to fathom and quantify/correct via calibration.
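
      To put numbers on that, a quick back-of-the-envelope sketch (the 80-nit reference white for scRGB is the Windows convention; the rest is plain arithmetic):

      Code:
      # Contrast ratio = white luminance / black luminance.
      lcd_contrast = 200.0 / 0.2    # a typical LCD: ~1000:1, milky blacks
      # OLED black is 0 cd/m2, so 200.0 / 0.0 raises ZeroDivisionError:
      # the contrast ratio is unbounded ("infinite").

      # Windows HDR uses linear scRGB where 1.0 = 80 nits, so a
      # 1000-nit highlight encodes as a value far outside 0..1:
      scrgb_value = 1000.0 / 80.0   # = 12.5
      print(lcd_contrast, scrgb_value)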

      Relevant?
      It depends: my phone is HDR and OLED, but uses a different color space (BT.2020).
      Colors will never match the monitor (see the sketch below).
      Someone else will have a standard sRGB display, and unless i catered to their specific needs, they'd see washed-out images instead of rangy, punchy ones (if i authored them with HDR/scRGB as the target).
      Also, my other monitor, not OLED, supported scRGB and HDR, but its rendition was nothing short of terrible, truly an unusable gimmick.
      In no way could i have made pictures look good there.
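
      To make the mismatch concrete: the same RGB triplet means a different colour in each space, and matching them needs an explicit conversion, e.g. the standard linear BT.709 → BT.2020 matrix (a numpy sketch; the matrix is from ITU-R BT.2087):

      Code:
      import numpy as np

      # Linear-light BT.709 -> BT.2020 conversion (ITU-R BT.2087).
      # Sending BT.709 values to a BT.2020 display *without* this
      # step is exactly why the colours will never match.
      M_709_TO_2020 = np.array([
          [0.6274, 0.3293, 0.0433],
          [0.0691, 0.9195, 0.0114],
          [0.0164, 0.0880, 0.8956],
      ])

      rgb_709 = np.array([1.0, 0.0, 0.0])   # pure BT.709 red, linear
      rgb_2020 = M_709_TO_2020 @ rgb_709    # same colour, BT.2020 coordinates
      print(rgb_2020)                       # [0.6274 0.0691 0.0164]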

      The whole industry is in a state of flux over this: the competing standards are many, the hardware differs like it never has before (the kind of display tech used makes a ton of difference to the results), content makers will often make video available in one standard only (HDR10 / HDR10+ / Dolby Vision... it's a long list), and some even attach metadata to *each frame* to ensure proper display behavior.

      My guess/hope is that at some point in the not-too-distant future, we'll have display devices talk to the OS the same way media talks to the player: they'll broadcast their technical specs and qualities (instead of just "i'm sRGB compliant"), and content will get color-transformed -where possible- to look best on that specific device.
      Some games and video players are already able to turn on HDR mode as the media demands it, while others require the user to fiddle first and enjoy the media the right way later.
      A sort of ICC, on steroids, and automated.
      Work on this is just starting, though.

      TL;DR: it's complicated these days, more so than in the past, as new, competing display panel techs and color standards are being created.
      The best way is likely to create a custom color pipeline to go from the rendered linear data to the target display device, in a way that makes sense while authoring it on one's own display device.
      OCIO may help here, providing transform profiles, device descriptors and a framework (sketch after this paragraph).
      If sRGB is your only concern, then your only option is to stick to the standard, to the letter, and forget the fact that monitors got better/brighter.
      It's then the consumer's onus to ensure their display device matches the sRGB standard as well.
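
      As a sketch of what such a pipeline looks like with OCIO (Python bindings, OCIO v2 API; the colour-space names are assumptions, use whatever your config actually defines):

      Code:
      import PyOpenColorIO as OCIO

      # Load the config pointed to by the $OCIO environment variable.
      config = OCIO.Config.CreateFromEnv()

      # From the renderer's linear working space to the display space.
      # "lin_rec709" and "sRGB" are hypothetical names from the config.
      processor = config.getProcessor("lin_rec709", "sRGB")
      cpu = processor.getDefaultCPUProcessor()

      pixel = cpu.applyRGB([0.18, 0.18, 0.18])  # linear mid-grey -> display
      print(pixel)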


      edit: i'm also not alone in thinking it's a bit messy, today.
      Last edited by ^Lele^; 14-01-2021, 05:36 AM.
      Lele
      Trouble Stirrer in RnD @ Chaos
      ----------------------
      emanuele.lecchi@chaos.com

      Disclaimer:
      The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



      • #4
        archviz experience here - a simple iPad is still the easiest way of making sure you and your clients are looking at the same thing. lowest common denominator, but surprisingly useful imo.
        Marcin Piotrowski
        youtube



        • #5
          I think the only thing I would really trust (not to say it's actually "correct") is a self-calibrating monitor. I bought Dell UltraSharps just because I don't want to spend big $$ on a self-calibrating monitor, and UltraSharps come factory calibrated. Whether it's correct or not, it allows me to at least pretend it is.
          James Burrell www.objektiv-j.com
          Visit my Patreon patreon.com/JamesBurrell

