
Where's my infinity at?

R0dluvan
Level 3
I got a Quest recently, and just as with other VR I've tried previously, the sense of depth I get from large-scale scenes is very underwhelming. One of the draws of VR for me is the promise of experiencing the exhilaration of flight or the vastness of space, but that falls flat when infinitely distant objects all appear to be ten meters away - which is what I always see.

If I look at the sky or the mountains in the distance in, for example, the default living room scene that comes with the Quest, they look like they're painted on a dome no more than maybe 20 meters in diameter. The planets and stars in Elite: Dangerous look the same - planets take on a distinctly concave appearance (because the dome is concave).

Why does infinity not appear infinitely far away? I know the HMD uses stereo disparity as its main cue for conveying distance. If the light from a distant point enters both my eyes at the same angle, i.e. as parallel rays, the point appears infinitely distant. So the rays must not be parallel here even though they should be; there is some convergence.

With an HMD, infinitely distant objects are rendered identically on the left screen and the right screen. But that alone is not enough to ensure their rays enter my eyes in parallel; the rendered (virtual) images must also be the right physical distance apart - that distance being my IPD. Now, the Quest has an IPD slider that lets me move the lenses to match my eyes, but this by itself has no effect on convergence; it just makes for a more or less distorted view of the images. However, I have also read that the IPD slider moves the screens themselves along with the lenses. If that is so, it will affect convergence, and should change the apparent depth of all objects, including those at infinity. But this does not happen: the slider only affects distortion, not apparent distances. From the maximum to the minimum setting there is a 20% difference in IPD, which should produce a very noticeable distance change.
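(A back-of-the-envelope check of how large that change should be, under the assumption that the lenses place the virtual images at roughly the commonly quoted 1.5 m focal distance: if the two virtual images of an "infinity" pixel sit at distance $v$ with lateral separation $s$, the rays from them cross at

$$d = \frac{v \cdot \mathrm{IPD}}{\mathrm{IPD} - s},$$

which is infinite exactly when $s = \mathrm{IPD}$. With $v = 1.5\ \mathrm{m}$ and a 20% mismatch, i.e. $s = 0.8\,\mathrm{IPD}$, this gives $d = 1.5/0.2 = 7.5\ \mathrm{m}$ - right in "dome" territory, so a mismatch of that size should be unmissable.)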

I created a simple scene in Unity to test distance perception. I have 7 cubes, all facing the camera at distances of 1, 3, 10, 30, 100, 300 and 1000 units (meters by default).
[screenshot: the seven cubes laid out in the Unity scene]
Their sizes are proportional to their distances, so that their monocular appearances are pretty much identical:
[screenshot: in-game view, all seven cubes appearing the same size]
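For anyone who wants to reproduce this, a minimal Unity script along these lines sets the scene up (a sketch rather than my exact script - the class name, cube spacing, and scale factor are incidental; only the size-to-distance proportionality matters):

```csharp
using UnityEngine;

// Spawns seven cubes at 1, 3, 10, 30, 100, 300 and 1000 m, each scaled
// proportionally to its distance so all of them subtend the same visual
// angle from the viewpoint at the origin.
public class DepthTestCubes : MonoBehaviour
{
    void Start()
    {
        float[] distances = { 1f, 3f, 10f, 30f, 100f, 300f, 1000f };
        for (int i = 0; i < distances.Length; i++)
        {
            float d = distances[i];
            // Fan the cubes out at 10-degree intervals so they don't
            // occlude each other regardless of distance.
            float angle = (i - 3) * 10f * Mathf.Deg2Rad;
            GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.position =
                new Vector3(Mathf.Sin(angle) * d, 1.6f, Mathf.Cos(angle) * d);
            // Size grows linearly with distance: 10x as far away, 10x as big,
            // so the monocular (retinal) size is identical for every cube.
            cube.transform.localScale = Vector3.one * (0.3f * d);
            cube.transform.LookAt(new Vector3(0f, 1.6f, 0f));
        }
    }
}
```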
No matter the IPD slider setting, the leftmost cube appears, correctly, 1 m distant - just within reach. The second seems a reasonable 3 m away. But as I go on, instead of the apparent distance increments growing ever larger, the farther cubes appear bunched together at the "dome", with little apparent difference in distance or size.

Why doesn't the slider affect apparent distance? An error as big as 20% in where the image lies should make a drastic difference in where the 1 m object appears to be, and should also affect the "dome". I can think of two possibilities: either it is in fact not correct that the slider moves the screens, or the Quest changes its rendering to compensate for this, so that I can't use this method to produce the infinite depth I want. Does anybody know the answer to this one?

My guess as to what is happening is something like this: the IPD slider does move the screens, the Quest does compensate in software for this, and the rays from infinitely distant objects do impinge in parallel on my eyes - but the focal depth of the Quest (which, as I understand it, is around 1.5 m?) won't let my brain accept the cue from stereo disparity entirely, and it comes up with a compromise - the dome. But I still would like to verify this, and the way I thought of to do so is to make the left and right VR cameras slightly "cross-eyed", forcing the disparity of all objects to increase uniformly. I haven't figured out how to override the cameras in this way yet, however.
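Something like the following is the shape of what I'm after, if anyone can confirm it works (an untested sketch, assuming the built-in render pipeline; whether the Quest runtime actually honors SetStereoViewMatrix is exactly the thing I'd need to verify, and the toe-in sign may need flipping):

```csharp
using UnityEngine;

// Adds a small extra inward rotation (toe-in) to each eye's view matrix,
// which increases the convergence of every object uniformly - including
// objects at infinity. Attach to the VR camera.
[RequireComponent(typeof(Camera))]
public class ToeInCameras : MonoBehaviour
{
    [Tooltip("Extra inward rotation per eye, in degrees.")]
    public float toeInDegrees = 0.5f;

    Camera cam;

    void OnEnable() { cam = GetComponent<Camera>(); }

    void OnPreCull()
    {
        Matrix4x4 left  = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Left);
        Matrix4x4 right = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Right);
        // Pre-multiplying applies the extra yaw in eye space. Flip the sign
        // of toeInDegrees if the effect goes the wrong way.
        Matrix4x4 yawL = Matrix4x4.Rotate(Quaternion.Euler(0f, -toeInDegrees, 0f));
        Matrix4x4 yawR = Matrix4x4.Rotate(Quaternion.Euler(0f,  toeInDegrees, 0f));
        cam.SetStereoViewMatrix(Camera.StereoscopicEye.Left,  yawL * left);
        cam.SetStereoViewMatrix(Camera.StereoscopicEye.Right, yawR * right);
    }

    void OnDisable() { cam.ResetStereoViewMatrices(); }
}
```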

(Postscript: The obvious thing the Quest might be doing to offset the IPD slider in software is to increase the virtual camera baseline - i.e. move your virtual eyes so they're at the same distance as your real eyes are according to the IPD setting. This would explain why the 1 m cube doesn't change. It doesn't explain why infinite objects aren't affected by the slider, though. The virtual camera baseline has no effect on infinite objects, so that mechanism can't be responsible for infinite objects staying at the same distance when the screens move.)
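(To spell out that last step: the angular disparity that a camera baseline $b$ produces for an object at distance $D$ is roughly

$$\delta \approx \frac{b}{D} \longrightarrow 0 \quad \text{as } D \to \infty,$$

so no choice of $b$ can move an object that is already at infinity.)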

kojack
Volunteer Moderator
There are two primary depth cues that humans use: vergence and accommodation.
Vergence is when you angle your eyes in/out. This is what VR headsets work with. Humans can only sense distance using vergence out to about 30 m at most; beyond that, our eyes just aren't precise enough to see the difference.
That's where accommodation (focus) comes in. We can focus on things much further away. Accommodation is inaccurate, but has a huge range (out to infinity).
The problem is that no VR headset on the market can do dynamic accommodation (Oculus is working on it with varifocal and multifocal devices). Instead, they are all fixed focus, usually around 1.5 m.
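To put a rough number on it (assuming a typical 63 mm IPD): the vergence angle for a point at distance $D$ is about

$$\theta \approx \frac{\mathrm{IPD}}{D},$$

which at 30 m is only about 2.1 mrad (roughly 7 arc minutes). Everything from 30 m out to infinity has to be discriminated within that last 7 arc minutes of eye rotation.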

So no matter which current headset you get (Oculus, Vive, Index, etc.), they all have the same issue: focus is fixed, and vergence alone can't handle long distances.

Author: Oculus Monitor,  Auto Oculus Touch,  Forum Dark Mode, Phantom Touch Remover,  X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080TI, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2

R0dluvan
Level 3
You mention vergence and accommodation but there is also stereopsis. Wikipedia says vergence (it calls it "convergence") works up to 10 m, but for stereopsis "One study shows that 97.3% are able to distinguish depth at horizontal disparities of 2.3 minutes of arc or smaller, and at least 80% could distinguish depth at horizontal differences of 30 seconds of arc". 30 seconds of arc corresponds to a distance of over 400 m (provided an IPD of 60 mm). I think you'll find your stereo vision can do a lot better than 30 m if you just take a look through your window. 😉
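The conversion, for anyone who wants to check: the disparity between a point at distance $d$ and a point at infinity is roughly $\mathrm{IPD}/d$, so a 30-arc-second threshold gives

$$d \approx \frac{\mathrm{IPD}}{\theta} = \frac{0.060\ \mathrm{m}}{30/206265\ \mathrm{rad}} \approx 410\ \mathrm{m}.$$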

None of this answers any of my questions though.

Nunyabinez
Level 8
Actually, he did answer your question. You are talking about what is called the Vista Space: the space where things are too far away to take action on them. The cues that matter in this space are Occlusion, Relative Size, Aerial Perspective, and Height in the Visual Field. None of these is affected by what you are referring to, which in the VR field is called Binocular Disparity. The Personal and Action Spaces are where Binocular Disparity, Motion Parallax, and Convergence and Accommodation come into play.

The reason that the distant objects don't appear right is that there is one focus for the entire screen. Hold up an object a foot away from your eyes and you will see that everything in the distance has become blurry. Then focus on something far away and look at something a foot away in your peripheral vision. It will be blurry.

This is one of the least important cues in simulating 3D environments, but for truly convincing 3D, objects that are further away need to be blurred when you are looking at something in the near field, and vice versa.

As kojack said, once dynamic accommodation is implemented, this problem should disappear.

This guy explains it much better than I can: https://youtu.be/YWA4gVibKJE

i7 8700, 16GB, RTX 2080 TI, Rift CV1 | i5 4690K, 16GB, GTX 1660 TI, Rift CV1 | Quest | Quest 2

R0dluvan
Level 3
Okay, I see that you and kojack are suggesting that it's the accommodation depth cue that's messing with me, which was also one of my theories.
I'd still like to know explicitly how the IPD slider works on the Quest, and I'd still like to experiment with manipulating the cameras as I described. I've posted a separate question specifically addressing the latter.

R0dluvan
Level 3
As for the vista space: aerial perspective, motion parallax and occlusion can have no role in the sensation of depth when looking at a starry sky, or the moon, so their absence doesn't explain why space is so flat and close in Elite: Dangerous. More evidence pointing toward accommodation as the chief problem. I've got to ask myself: if this is the whole problem, shouldn't an HMD design that focuses at infinity eliminate it (for distant objects - presumably nearby objects would look worse)? Do they?

Good video BTW.

SecretGerbil
Level 2
Wikipedia says vergence (it calls it "convergence") works up to 10 m, but for stereopsis "One study shows that 97.3% are able to distinguish depth at horizontal disparities of 2.3 minutes of arc or smaller, and at least 80% could distinguish depth at horizontal differences of 30 seconds of arc". 30 seconds of arc corresponds to a distance of over 400 m (provided an IPD of 60 mm).

R0dluvan
Level 3


Wikipedia says vergence (it calls it "convergence") works up to 10 m, but for stereopsis "One study shows that 97.3% are able to distinguish depth at horizontal disparities of 2.3 minutes of arc or smaller, and at least 80% could distinguish depth at horizontal differences of 30 seconds of arc". 30 seconds of arc corresponds to a distance of over 400 m (provided an IPD of 60 mm).


Did you mean to quote me?