Hi,
I've been creating HDRI images for a long time now, but one thing has always bothered me. I guess my main question is this:
Why does an HDRI (merged from a set of JPG files), viewed on a computer screen at a given exposure, look different from the original JPG file at that same exposure?
So for example, you create an HDRI from 9 exposures, where number 7 is 'well' exposed. You create the HDRI and then set its base exposure to match image n°7. When you compare the HDRI with the original n°7 image, it looks nothing like it: the HDRI looks washed out, dark areas are lit up, and light areas are darker. It looks as if tone mapping has already been applied.
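To make the comparison concrete, here's a minimal Python/numpy sketch of what I understand the merge and exposure preview to be doing. All function names are my own, and it assumes the JPGs use a pure sRGB transfer curve (which, as I suspect below, may be exactly the wrong assumption):

import numpy as np

def srgb_to_linear(v):
    # Invert the sRGB encoding (values in 0..1) to get linear light.
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    # Re-apply the sRGB encoding for display.
    v = np.clip(v, 0.0, 1.0)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def merge_hdr(jpgs, exposure_times):
    # Debevec-style merge: linearize each bracket, divide by its
    # exposure time to get relative radiance, and average with a hat
    # weight that distrusts near-clipped pixels.
    acc = np.zeros_like(jpgs[0], dtype=np.float64)
    wsum = np.zeros_like(acc)
    for img, t in zip(jpgs, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # peak weight at mid-grey
        acc += w * srgb_to_linear(img) / t
        wsum += w
    return acc / np.maximum(wsum, 1e-8)

# Preview at the exposure of shot n°7 (index 6), hypothetical data:
# radiance = merge_hdr(brackets, times)
# preview  = linear_to_srgb(radiance * times[6])

If the viewer previews the linear values through a plain gamma curve like this, any mismatch between that assumed curve and whatever the camera actually baked into the JPGs will show up as exactly the kind of contrast shift I'm seeing.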
I made a small test to compare different programs that can merge JPG files into HDRIs. Here is an image showing the results; I tried to show all files at very similar exposures. Also note the histograms.

Here's an animated gif:
http://www.aversis.be/vrayrhino/hdri.gif
What surprises me is that the godfather of all HDRI programs creates the most washed-out image. It is very similar to Photoshop's result, though. Then comes PTGui: I made a stitch directly from the bracketed sets of JPGs and rendered it to an HDRI (so PTGui does the HDR merging). Photomatix produces the most saturated and contrasty result, but it is still different from the original LDR stitch.
My concern is that when you use an HDRI as your render background, it will have this washed-out look. I am in the process of creating very high-res HDRI files and would like to get this right, so that they are usable as render backgrounds too.
I know I can adjust the merged HDRI in Photoshop to give it more contrast, but I have tried that in the past and noticed it can have drastic effects on the lighting in the 3D scene.
Any thoughts would be greatly appreciated.
Would this have anything to do with sRGB or Adobe RGB profiles in the JPG files? Is it just a gamma issue? And why do different programs produce such different results?
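Trying to answer my own gamma question, here's a toy numpy experiment. The "camera" curve below is a made-up S-curve (smoothstep on top of sRGB), just a stand-in for the contrast curve cameras bake into JPGs; the scene values and bracket times are invented too. If the merge software inverts plain sRGB instead of the camera's real curve, the redisplayed HDR comes out flatter than the source JPG:

import numpy as np

def srgb_decode(v):
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):
    v = np.clip(v, 0.0, 1.0)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def camera_encode(linear):
    # Toy camera JPG tone curve: sRGB plus a smoothstep S-curve.
    s = srgb_encode(linear)
    return s * s * (3.0 - 2.0 * s)

scene = np.array([0.02, 0.05, 0.1, 0.2])   # "true" relative radiance
times = [1.0, 4.0]                         # two brackets, 2 stops apart

# JPGs as the camera would write them:
jpgs = [camera_encode(scene * t) for t in times]

# Merge assuming the JPGs are pure sRGB (a common default for JPG input):
radiance = np.mean([srgb_decode(j) / t for j, t in zip(jpgs, times)], axis=0)

print("bracket 1 JPG  :", np.round(jpgs[0], 3))
print("HDR redisplayed:", np.round(srgb_encode(radiance * times[0]), 3))
# -> shadows lift and overall contrast drops: the washed-out look.

If each program estimates (or assumes) a different response curve when inverting the JPGs, that alone could explain why every program's merge looks different, and why none of them matches the original frame.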
Thanks,
wouter