VRAY Stereoscopic Helper Distortion (6x1 Cam)
-
Hi Neilg, thanks for the reply! I followed the Chaosgroup VR guide PDF and this from IrisVR:
https://help.irisvr.com/hc/en-us/art...mas-in-3ds-Max
I can't attach the guide because it's too big, but it can be found here:
https://labs.chaosgroup.com/index.ph...ual-reality-2/
Attached are my helper and cam settings
The strange thing is that it looks fine as a 6x1 camera, but as soon as I switch on the Stereo Helper it does that.
Thanks
-
Of course! That makes so much sense! My eyes are 6.5cm apart (I went back and looked at the documentation and both say 6.5cm).
That could also explain the very strong 3d effect.
Anyway, made the change and the distortion is totally gone.
Thank you AlexP!
-T
-
This brings up an important point that most people - including developers of VR software and apps - are completely unaware of or choose to ignore: the 1/30th rule of stereoscopic photography.
When planning out your stereoscopic VR scene you need to run a tape measure from your camera to the CLOSEST object in a 360° radius. Once you have that measurement, divide by 30. THAT should be your eye distance, more properly known as IPD (interpupillary distance). For instance, if the closest object in the scene is 120cm away, your eye distance/IPD in the V-Ray stereoscopic helper should be 4cm.
Because 6.5cm is the average human IPD, you really don't have to worry if the closest object is 195cm away or further (6.5cm × 30 = 195cm). Only if something is closer than 195cm should you apply the 1/30th rule.
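If it helps to see the arithmetic spelled out, here's a minimal Python sketch of the rule as I've described it (the function name and the 195cm cap are just the numbers above restated; nothing here is a V-Ray or 3ds Max setting):
Code:
# Minimal sketch of the 1/30 rule described above. All distances in centimetres.
AVERAGE_HUMAN_IPD_CM = 6.5  # average human interpupillary distance

def stereo_ipd_cm(nearest_object_cm, divisor=30.0):
    """Eye distance to enter in the stereoscopic helper.
    nearest_object_cm: distance from the camera to the closest object
                       in a full 360-degree sweep around the camera.
    divisor: 30 for the 1/30 rule (a more conservative 60 is common in 3D film).
    """
    ipd = nearest_object_cm / divisor
    # Never exceed the human average; beyond 195cm the rule no longer applies.
    return min(ipd, AVERAGE_HUMAN_IPD_CM)

print(stereo_ipd_cm(120))  # 4.0 -> closest object at 120cm, as in the example above
print(stereo_ipd_cm(300))  # 6.5 -> far enough away, keep the human average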
Why?
If you don't follow this rule then you run the risk of something called a Window Violation - or Edge Violation more specifically. What happens, and I see this in countless VR scenes created in V-Ray, is that an object on the edge of the screen in the left eye image is barely showing up in the right eye image. That causes your brain to try and fill in that missing info and you get an almost shimmering effect. Again, it's amazing how often I see this happening and how virtually no one accounts for it - including Chaos - in their documentation. You simply cannot create great 3D without taking this into account. Hope that helps!
-
Originally posted by landrvr1: Again, it's amazing how often I see this happening and how virtually no one accounts for it - including Chaos - in their documentation.
Best regards,
Vlado
-
Do you mean depth perception instead of scale perception? Changing the IPD of the lenses when doing interior scenes won't be an issue at all as long as you don't exceed the average human measurement of 6.5cm. You can push past that a little bit but it's not advisable. In fact, in the world of 3D films the formula is usually MUCH more conservative - about 1/60. For any 3D work, whether it's photography or VR or film, even the smallest of IPD will give you depth. Somewhere along the line in the creation of all this VR the basics of 3D photography and film got completely overlooked.
I mean, it's kind of simple: If you look at the screen and there's a chair on the left side but it's barely there on the right side, that's a problem. It's a problem not because I say it's a problem, haha. It's a problem because your brain says it's a problem. It's human physiology at work.
I've had VR developers argue with me endlessly about how I don't know what I'm talking about - that somehow because it's VR the rules and methods don't apply. That's complete rubbish, of course. If you are working in ANY stereoscopic medium they always apply. The 1/30 rule isn't something that's set in stone by any means. It's merely an accepted guide to avoid horrid window/edge violations.
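To put a number on the more conservative film divisor (again, just the arithmetic; neither value is a V-Ray default):
Code:
nearest_cm = 120            # same nearest-object distance as the earlier 120cm example
print(nearest_cm / 30.0)    # 4.0cm IPD with the 1/30 rule
print(nearest_cm / 60.0)    # 2.0cm IPD with the more conservative 1/60 film rule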
-
I'll be honest, that sounds like complete nonsense to me. The main benefit of VR and real-world scale in real estate is that you can put the goggles on and instinctively read distances, depth and scale accurately. Some shimmering is a by-product of doing it with pre-rendered images; the idea of changing the eye distance and in effect shrinking the person so the space seems bigger or smaller seems like madness.
It makes sense for movies, where depth is arbitrary and used for effect, but for 360-degree stereo in real estate it has to be accurate to what it is on a real human.
-
The fact that it's a 360° environment doesn't make any difference, and that's where VR developers and users get confused. Think of it this way: There's no difference at all when you spin around a 360° environment with a headset on vs watching a 3D movie and the camera spins around the scene in a 360° circle. Or half that circle. Or a fraction of that circle.
The main reason that the 1/30 rule is important has to do with the current state of VR viewer technology more than anything else.
People think of the VR experience as "actually putting the viewer into the space!" That's how it was sold and marketed. The problem is that the vast majority of VR viewers being used are phone based: a small screen with two side-by-side images sitting mere inches from your eyeballs. It's missing the key element, and that's a wide field of view to match human vision. To make up for that loss the developers do crazy things like FORCE a wider field of view in the two images - leading to hilarious amounts of barrel distortion that's only compounded by (and at the mercy of) the bi-convex lenses.
Each new generation of expensive VR goggles seeks to give you a wider and wider field of vision when you put them on, and THAT is the key to success in terms of always leaving the distance between the cameras at the average human IPD of 6.5cm. Right now, with ordinary vision just sitting at my desk, I've got natural edge/window violations happening. I've got a coffee cup off to the far right of me that, if I close my right eye, I cannot see with just my left eye open. The reason - the exact reason - that my brain isn't going crazy over that fact is one thing: I've got the basic human FOV of 180°, and the part of my brain that's connected to my vision receptors is programmed to compensate. That's entirely not the case when viewing VR with a limited field of vision and incorrect camera IPD.
With respect to changing the depth or scale as a result of decreasing or increasing IPD: as mentioned, if done in small increments there is no perceptible effect on depth and scale. Going from a camera lens IPD of 6.5cm to 3cm is NOT going to make the chair that's 2m away seem like it's actually 1m away.
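For the curious, here is a quick sketch of the straight-ahead parallax angles involved (simple geometry only; angular disparity on its own doesn't determine perceived scale, so treat these numbers as illustration rather than proof either way):
Code:
import math

def parallax_deg(ipd_cm, distance_cm):
    # Angle subtended between the two eyes' lines of sight for a point
    # straight ahead at the given distance.
    return math.degrees(2.0 * math.atan((ipd_cm / 2.0) / distance_cm))

print(parallax_deg(6.5, 200))  # ~1.86 degrees: chair 2m away, normal 6.5cm IPD
print(parallax_deg(3.0, 200))  # ~0.86 degrees: same chair, reduced 3cm IPD
print(parallax_deg(6.5, 100))  # ~3.72 degrees: what 1m away looks like at 6.5cm IPD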
Lastly, this is not a scientific exercise by any means; nor should it be. VR, at this point in its development (and especially with the proliferation of Google Cardboard viewers), needs to be thought of more as a great stereoscopic 3D experience as opposed to 'virtual reality'.
-
A great comparison chart, although a year old. Only one headset actually meets or exceeds the human FOV of 180° - in which case a constant camera IPD of 6.5cm might be okay.
http://www.virtualrealitytimes.com/2...w-vr-headsets/