Stereoscopy with objects in distance?

digital
Explorer
At what distance does stereoscopy become irrelevant, so that everything is effectively at 'infinity'?

E.g. if I had a 360 degree panorama with no close objects (say, a photo taken from a helicopter), would a stereo display be needed? Would a single image be enough?

I guess with the current rift being very low resolution, the parallax distance between objects 100 meters away would be minuscule?
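To put a rough number on my guess (the eye separation and panel figures below are assumptions on my part, not specs):

```python
# Back-of-the-envelope check (all numbers are my own guesses):
# angular disparity of an object 100 m away vs. the angular size of
# one pixel on the rift's panel.
import math

IPD = 0.064                                 # metres, assumed eye separation
d = 100.0                                   # object distance, metres
disparity = math.degrees(IPD / d) * 60      # arcminutes, parallel gaze
pixel = math.degrees(math.radians(90) / 640) * 60  # assumed 90 deg over 640 px

print(f"disparity {disparity:.1f} arcmin vs. pixel {pixel:.1f} arcmin")
# ~2.2 arcmin of disparity against ~8.4 arcmin per pixel: well under
# one pixel, so the two eye images would be essentially identical.
```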

Thoughts?

Regards,

John
14 REPLIES

KuraIthys
Honored Guest
I originally thought it worked this way, but unfortunately, someone pointed out that it depends on convergence.

I believe the rift assumes convergence is parallel, in which case at 'infinity' the image seen by both eyes is identical.

However, as was pointed out to me, if you look at something closer (convergence nearer than infinity), then anything past that point would diverge.

For instance, let's say you're staring at your own finger. Your vision converges at about arm's length.

At distances closer than this, the two images your eyes see would have some amount of separation that gets larger as the object gets closer to your face.


But if you look at something further away, the images cross over, and rather than getting even closer together, they move apart again, with the separation growing as the object gets more distant.

Distant objects, then, only look identical when the two images are assumed to be recorded by parallel cameras...

This appears to be an acceptable assumption for the rift headset, but for most other 3d tasks you have to assume that convergence relates to the distance between the viewer and the screen (in which case, anything behind the screen in 3d space would diverge, and there is no such thing as 'infinity' for stereoscopic purposes...).

Basically, you first have to establish what assumptions about convergence are valid for the 3d system in question before you can say at what distance (if any) 3d information no longer matters.
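To put rough numbers on the two assumptions (the 64 mm eye spacing, the 2 m screen, and the function names below are just my own illustrative guesses):

```python
# Sketch contrasting the two convergence models described above.
import math

IPD = 0.064  # assumed average eye separation, metres

def parallel_disparity_arcmin(d):
    """Angular disparity for a point at distance d with parallel view
    axes (the HMD-style assumption). Approximation: IPD / d radians."""
    return math.degrees(IPD / d) * 60

def screen_disparity_mm(d, screen=2.0):
    """On-screen separation of the left/right images of a point at
    distance d when the pair is converged on a screen `screen` metres
    away (the 3D-TV-style assumption)."""
    return IPD * (1.0 - screen / d) * 1000

for d in (2.0, 10.0, 100.0, 1e9):
    print(f"d={d:>12.0f} m  parallel: {parallel_disparity_arcmin(d):7.2f} arcmin"
          f"   screen-referenced: {screen_disparity_mm(d):5.1f} mm")

# Parallel disparity falls toward zero with distance, so 'infinity'
# exists; screen-referenced disparity creeps up toward the full 64 mm
# eye spacing and never settles, so distant objects keep diverging.
```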

cybereality
Grand Champion
I've heard different numbers, but I think it's somewhere around 20-30 meters where there is no longer any stereo effect.

ganzuul
Honored Guest
Does that mean that after 30 meters a rendering engine could compute just one view on the geometry instead of two?

captain3d
Honored Guest
Although stereoscopic roundness disappears quickly, you can still separate one object from another at very great distances. A building 1000 feet away can still be seen to be slightly closer than the clouds or mountains, for example. Look out the window and try it for yourself.
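As a rough sanity check (the numbers below are assumed, not measured):

```python
# Relative disparity between a building 1000 ft away and the far
# horizon, compared with human stereoacuity.
import math

IPD = 0.064                       # metres, assumed eye separation
d = 1000 * 0.3048                 # 1000 feet in metres
disparity_arcsec = math.degrees(IPD / d) * 3600

print(f"{disparity_arcsec:.0f} arcseconds")   # ~43 arcsec
# Quoted stereoacuity thresholds are commonly in the 10-60 arcsecond
# range, so ~43 arcsec is plausibly detectable -- consistent with the
# building reading as slightly nearer than the clouds.
```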

phil

DieKatzchen
Honored Guest
Mostly this is done by our brain using two methods: (A) parallax, which is simply the fact that moving parallel to an object produces more apparent movement the closer you are to it, and (B) already having a pretty good idea how big the building is. The human brain has a huge database of how big things generally are, which it uses to tell how far away they are when stereoscopy fails.

There are a number of optical illusions that use this to mess with people, the most obvious being the classic Ames room: a room you can look into through a peephole (thus eliminating stereo vision), where people appear larger or smaller depending on where in the room they are. The objects in the room are custom made so that the brain interprets them as, say, a perfectly ordinary pool table, but the table is larger at one end than the other, so people standing at that end appear smaller by comparison.

jtoeppen
Honored Guest
http://www.3dpan.org/3d/76266-76265-180-180

The viewing system impacts apparent resolution of depth in stereo images.

With two panoramas taken with a separation of more than twenty feet, one can see depth for a couple of miles. A flyby of Jupiter can provide many separations to choose from. There is science to the art, and an empirical aspect as well.

The math is simple; eye spacing / object distance should be 1/30 for an object to appear at about 5 feet. If your eyes were twice as far apart as they are now, the world would appear half sized. Cloud and mountain range photos benefit from spacings greater than 200 feet. A series of photos may be taken from a moving car and the different separations tried.
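A quick sketch of that rule (the 2.5 inch natural eye spacing and the function name are my own assumptions for illustration):

```python
# The 1/30 hyperstereo rule of thumb: widening the baseline scales
# the apparent world down proportionally.
NATURAL_IPD_FT = 2.5 / 12   # ~2.5 inches, in feet

def apparent_distance_ft(true_distance_ft, baseline_ft):
    """Apparent distance when shot with a widened baseline: the scene
    scales down by the factor (natural eye spacing / baseline)."""
    return true_distance_ft * (NATURAL_IPD_FT / baseline_ft)

# An object 600 ft away shot with a 20 ft baseline (ratio 1/30):
print(apparent_distance_ft(600, 20))   # ~6.2 ft, roughly the
                                       # "about 5 feet" figure above
# Doubling the baseline halves the result, matching "if your eyes were
# twice as far apart, the world would appear half sized".
```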

DieKatzchen
Honored Guest
Yes, but your brain is just interpreting it as a smaller object that's closer to you. I believe the question was whether processing power could be saved by rendering a single view for things farther than 30 feet away, and the answer is "probably." I'm not sure how well it would work, but I think it would work.

sobchak
Honored Guest
I don't think using the same image for objects 30 feet away is the way to go. First, I can't think of any engine that supports this, and some weird z-buffer compositing method could potentially be more processor intensive than just rendering the scene twice in the traditional way.

Second, I can imagine it also looking odd the closer the "2D barrier" is. At 30 feet away, I can picture it looking to the player like they're constantly at the center of some weird 2D-wallpapered cylinder, with 3D objects approaching out of it and receding into it as the player moves. A weird effect that would be a neat thing to see, but probably not what you're going for.

From viewing 3D movies taken while using the rift, like TF2 footage, I can see stereoscopic info at distances greater than 30 feet. Realistically, you're talking about a barrier more like 300 feet away, or even more... which is territory usually reserved for low-poly or texture-efficient LODs (fast to render) and skyspheres.
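If you did want to pick a cutoff, one way (with assumed, roughly DK1-like display numbers, not actual specs) is to find where disparity drops below a pixel:

```python
# Sketch: distance beyond which disparity falls under one pixel -- one
# plausible place for the mono cutoff. The display figures (90 deg
# across 640 horizontal pixels per eye) are assumptions.
import math

IPD = 0.064                              # metres, assumed
pixel_rad = math.radians(90.0) / 640     # angular size of one pixel

def mono_cutoff_m(pixels=1.0):
    """Distance at which disparity (~IPD / d) equals `pixels` pixels."""
    return IPD / (pixels * pixel_rad)

print(f"{mono_cutoff_m():.0f} m")       # ~26 m for one pixel
print(f"{mono_cutoff_m(0.5):.0f} m")    # ~52 m for half a pixel
# A sharper display shrinks the pixel and pushes the cutoff out, which
# fits still seeing stereo well past 30 feet in Rift footage.
```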

edzieba
Honored Guest
According to this experimental paper, beyond a few metres humans can still perceive depth purely from stereo disparity. However, they are really bad at it. You might be able to get away with mono rendering beyond a few tens of metres, but it might look a bit weird. In some cases where you have things nearby and things far away with little in-between (e.g. spaceflight simulator), this will look fine.