I've been thinking about the problems with 3D viewing. One of the biggest limitations is that screen-based 3D today relies almost entirely on side-by-side (horizontal) parallax to produce a sense of depth. But that doesn't match how we actually see. Natural vision also factors in the vertical axis: we look up and down in conjunction with side to side.
I speculate that this mismatch is one of the main reasons people experience nausea when viewing 3D images through glasses.
For example, if you've ever experienced sea sickness, it's often the up-and-down motion of a boat on waves that upsets your sense of equilibrium. Experienced sailors know that a quick way to restore balance is to keep your focus on the horizon.
A quad-camera arrangement that adds a vertical (Z-axis) baseline would let the viewer perceive parallax up and down as well as side to side.
Not only could this relieve nausea, it would also provide a more accurate and immersive 3D experience.
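To make the quad-camera idea concrete, here's a minimal sketch (my own illustration, not from any existing product) using the standard pinhole-camera disparity relation d = f·b/z. A conventional stereo pair provides only the horizontal term; a hypothetical 2x2 rig with a vertical baseline adds a vertical disparity of the same form, which is the up/down parallax described above. The baseline and focal-length values are assumptions chosen for illustration.

```python
def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Disparity in pixels between two cameras separated by baseline_m,
    for a point at depth depth_m (pinhole relation: d = f * b / z)."""
    return focal_px * baseline_m / depth_m

# Hypothetical quad rig: 65 mm horizontal baseline (roughly eye spacing)
# plus a 65 mm vertical baseline, with a 1000 px focal length.
f = 1000.0
b_h, b_v = 0.065, 0.065

for z in (1.0, 2.0, 5.0):
    dh = disparity_px(f, b_h, z)  # horizontal parallax (left/right pair)
    dv = disparity_px(f, b_v, z)  # vertical parallax (top/bottom pair)
    print(f"depth {z} m: horizontal {dh:.1f} px, vertical {dv:.1f} px")
```

The point of the sketch is simply that the vertical pair contributes depth information by exactly the same geometry as the horizontal pair; nothing about the math privileges the side-by-side axis.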

A new type of camera lens would need to be created, something like a fisheye that captures the entire hemisphere of the scene facing the viewer.
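As a point of reference, one common fisheye model, the equidistant projection, already maps a full hemisphere onto the image plane: a ray at angle θ from the optical axis lands at radius r = f·θ, so θ = 90° reaches the image edge. A minimal sketch (the focal length here is an arbitrary assumption):

```python
import math

def fisheye_equidistant(theta_rad: float, focal_px: float) -> float:
    """Radial image position for an equidistant fisheye: r = f * theta.
    At theta = pi/2 (90 degrees off-axis) the mapping reaches the rim
    of a full hemisphere of coverage."""
    return focal_px * theta_rad

# With an assumed focal length of 500 px, the hemisphere edge
# (theta = pi/2) lands at radius 500 * pi/2 ~ 785 px.
r_edge = fisheye_equidistant(math.pi / 2, 500.0)
print(f"hemisphere edge radius: {r_edge:.1f} px")
```

So the optics for hemispheric capture exist; the novel part of the proposal is pairing such lenses in a rig that preserves both horizontal and vertical baselines.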