Biased vs Unbiased

  • #31
    Originally posted by CCOVIZ View Post
    you're still dependent on a third party, even if it is Chaos Group
    You're dependent on third parties at all times, from the hardware to the OS to the various software makers; I'm not sure I understand your point.
    but put those FStorm renders next to this.
    It's a material relying on stray hairs orthogonal to the base strands, so it would need enrichment with fur on the VRscans side as well, if one is to see the hair so directly. It has of course been done before to quite good effect (although I don't think we're at liberty to show the piece I had in mind).
    It is, however, a very different take on the problem when there is no client target to hit, only a custom sample with no measurable accuracy against a live sample across lighting and viewing conditions.
    It's pretty, no doubt, but also very subjective and prone to unwanted behaviour (shading response, sizzling, long render times, and so on).
    that's a never-ending topic,
    As broad as the opinions in the world, indeed.
    But look at what could be achieved on an eight-core workstation, by one user, at 3 minutes per render.
    Then casting our minds back to 5 hours on a 32-core Xeon machine for a 1280px image should put things in perspective.
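    To put rough numbers on that (assuming near-linear scaling with core count): 8 cores × 3 minutes ≈ 0.4 core-hours per image, against 32 cores × 5 hours = 160 core-hours, about a 400× difference in compute spent per frame.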
    And back to what I was saying earlier about the papers: great academia, eminently interesting, but not nearly ready for production.
    Lele
    Trouble Stirrer in RnD @ Chaos
    ----------------------
    emanuele.lecchi@chaos.com

    Disclaimer:
    The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



    • #32
      Originally posted by ^Lele^ View Post
      You're dependent on third parties at all times, from the hardware to the OS to the various software makers; I'm not sure I understand your point.
      What I meant there is that to benefit from the VRscans capabilities, you have to pay for them. I'm totally fine with that, especially considering the results you posted here. But it would be great to have an alternative we could use on our own side, effortlessly. Much more flexibility that way, IMHO.

      Originally posted by ^Lele^ View Post
      It's a material relying on stray hairs orthogonal to the base strands, so it would need enrichment with fur on the VRscans side as well, if one is to see the hair so directly. It has of course been done before to quite good effect (although I don't think we're at liberty to show the piece I had in mind).
      It is, however, a very different take on the problem when there is no client target to hit, only a custom sample with no measurable accuracy against a live sample across lighting and viewing conditions.
      It's pretty, no doubt, but also very subjective and prone to unwanted behaviour (shading response, sizzling, long render times, and so on).
      Can't disagree with that; you're absolutely right. We can put some fur on top of a VRscan and it will look gorgeous (and probably make a huge impact on render times), at least for those samples with really tight patterns (no gaps between core fibers, almost no flyaway fibers on those kinds of fabrics). A fiber shader is much more versatile. Send a sample to be scanned on your side? Sure, but what happens if the client changes their mind in the middle of the project? That's pretty common in archviz. To be honest, VRscans are probably better suited to product viz, automotive rendering, etc. It's a good tool, no doubt, so please don't be offended; I'm actually quite impressed by the results you're showing here.

      Originally posted by ^Lele^ View Post
      As broad as the opinions in the world, indeed.
      But look at what could be achieved on an eight-core workstation, by one user, at 3 minutes per render.
      Then casting our minds back to 5 hours on a 32-core Xeon machine for a 1280px image should put things in perspective.
      And back to what I was saying earlier about the papers: great academia, eminently interesting, but not nearly ready for production.
      As for the 2010 (9 years ago, yeah) 32-core Xeon render, with whatever render engine and whatever shader, I'm a bit more skeptical. I mean, keeping the FStorm example, we can render high-res images like those in less than an hour on a single GPU these days.

      But like I said, I'm kind of a no-compromise guy. If it looks better and my computer can handle it, I'll use it. But I totally understand that some people are much more concerned about render times. Different needs, different opinions, different workflows... And I'm pretty sure that would be our final conclusion if one day we had this discussion over a beer.



      Last edited by CCOVIZ; 12-02-2019, 05:07 AM.



      • #33
        By the way, one feature I've wanted for a while, which would help bridge these two, is a way to do a VRscan capture virtually. Go insane with your detail down to the fiber level and each strand of woven filament, then run it through a VRscan render process and generate a multi-res BRDF in a vrscan file. Then we could do fun things like model grass, trees, etc. and just throw it on as a texture on BG geometry to get full trillion-poly forest BRDFs. Crazy stuff that your scanners could never capture, but you could generate virtual versions as if you scaled a few miles of forest down to a one-square-foot tile. A rough sketch of that capture loop follows below.
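        Purely as an illustration of the idea (this is not an actual VRscans pipeline or file format; every name here is made up, and render_patch() is a toy analytic stand-in for an offline render of the fiber-level asset), a minimal "virtual gonioreflectometer" loop might look like this:

        import numpy as np

        def dir_from_angles(theta, phi):
            """Unit vector from polar angle theta (off the surface normal) and azimuth phi."""
            s = np.sin(theta)
            return np.array([s * np.cos(phi), s * np.sin(phi), np.cos(theta)])

        def render_patch(wi, wo):
            """Stand-in for rendering the detailed asset lit from wi and seen from wo.
            A real pipeline would launch an offline render of the fiber geometry here;
            this toy Blinn-Phong lobe just keeps the sketch self-contained and runnable."""
            n = np.array([0.0, 0.0, 1.0])
            h = wi + wo
            h = h / np.linalg.norm(h)  # half vector; nonzero since both directions face up
            diffuse = max(np.dot(n, wi), 0.0) / np.pi
            specular = max(np.dot(n, h), 0.0) ** 64
            return diffuse + 0.25 * specular

        def capture_brdf_table(n_theta=8, n_phi=16):
            """Brute-force capture: one table cell per (light, view) direction pair."""
            thetas = np.linspace(0.01, np.pi / 2 - 0.01, n_theta)  # stay off grazing angles
            phis = np.linspace(0.0, 2 * np.pi, n_phi, endpoint=False)
            table = np.zeros((n_theta, n_phi, n_theta, n_phi))
            for i, ti in enumerate(thetas):
                for j, pi_ in enumerate(phis):
                    wi = dir_from_angles(ti, pi_)
                    for k, to in enumerate(thetas):
                        for l, po in enumerate(phis):
                            wo = dir_from_angles(to, po)
                            table[i, j, k, l] = render_patch(wi, wo)
            return table

        table = capture_brdf_table()
        print("captured table:", table.shape)  # one radiance sample per direction pair

        A real capture would also keep per-texel spatial variation (closer to a BTF than a single BRDF) and far denser direction sampling, with the toy shader swapped for actual renders of the detailed asset; that's where the "scaled-down forest" trick would come in.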
        Gavin Greenwalt
        im.thatoneguy[at]gmail.com || Gavin[at]SFStudios.com
        Straightface Studios



        • #34
          Yes, that's interesting, I like the idea!

