Could SLI allow extremely high resolution dual screens?

captaintrips
Explorer
Just out of curiosity, wondering if Oculus plans to release a dual-screen Oculus in the future (two independent screens at full resolution, one per eye), and if so... could there be any possibility of having each screen rendered by its own GPU? One GPU drives one 1920x1080 (or higher) screen and the other GPU does the same with the other? I know SLI profiles already let you choose what each GPU does... such as one handling the graphics and the other handling the MSAA, or both handling graphics, or whatnot.

I remember reading a while back that one of the biggest hurdles with VR was the bandwidth and graphics power required to render two separate images at high resolution per eye. However, it seems plausible that simply having SLI, or more specifically having a separate GPU handle each of the two screens, would resolve that?

My only concern with this idea would be possible latency between GPUs and whether microstutter would affect anything, but I'm not technical enough in how cross-GPU architecture works to understand if or how that might play a role.

Thoughts or comments?
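For a rough feel of the numbers behind the question, here is a back-of-the-envelope fill-rate sketch. The 1920x1080-per-eye resolution and the 60 Hz refresh rate are illustrative assumptions for this thread, not announced Rift specs:

```python
def pixels_per_second(width, height, refresh_hz):
    """Pixels that must be shaded each second to drive one screen."""
    return width * height * refresh_hz

# Two independent 1080p eye screens at an assumed 60 Hz:
per_eye = pixels_per_second(1920, 1080, 60)

# A single GPU rendering both eyes shades twice the pixels per frame;
# dedicating one GPU per eye halves the per-card fill-rate load.
one_gpu_both_eyes = 2 * per_eye
one_gpu_per_eye = per_eye

print(per_eye)            # pixels/s for one eye
print(one_gpu_both_eyes)  # pixels/s if a single card drives both eyes
```

This ignores overdraw, MSAA, and any distortion pass, so the real shading cost is higher, but the halving argument holds regardless of those factors.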
13 REPLIES

jwilkins
Explorer
I don't think so. 30 flashes per second would be perceived as flicker. I don't think you could tell the flicker was alternating between eyes.
(╯°□°)╯︵┻━┻

320x200
Explorer
"jwilkins" wrote:
I don't think so. 30 flashes per second would be perceived as flicker. I don't think you could tell the flicker was alternating between eyes.


Well, possibly at that speed, but you want to consider large deltas across a wide range of frequencies. To stick with the muzzle-flash example, 30 per second is well over the fire cap of many weapons in CoD, so I don't think you can discount the problem just by assuming the delta will be either infrequent or hyper-frequent.

It would be a simple test to run with the Rift: just manually desynchronize the eyes. I haven't tried it myself, though. Given recent issues I've hit with sound perception, I don't discount any chance of human perception picking up unnatural things as feeling "off" or "somehow wrong". Yeah, I know the ear is a lot faster than the eye, but still, doing unnatural things, even quickly, is often perceptible.

jwilkins
Explorer
I think you are absolutely right and we need to actually test it. We can only discuss this so much 🙂 My problem right now is that I don't have a Rift or a high-speed camera (to confirm the actual delay between the eyes instead of trusting that I programmed it correctly). It would still be anecdotal until we could study more people's reactions. Unfortunately I can't do that without going through a lot of red tape, but an individual or company can experiment on people with impunity :lol:
(╯°□°)╯︵┻━┻

Antonyc
Honored Guest
Been thinking a lot about SLI support, specifically for the Rift.

I reckon if NVIDIA could implement an SLI mode that renders the 3D image side by side with a vertical split, so that each card drives one half of the screen, that would be the best solution for everyone.

It would work for one screen and also for two screens, since you would just set them up as a two-screen Surround setup. Imagine dual 1080p or even 1440p screens 🙂
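The vertical-split idea boils down to simple viewport arithmetic: each GPU gets the rectangle covering its half of the combined surface. The function name and the (x, y, width, height) rectangle convention below are my own, purely for illustration:

```python
def split_viewports(total_width, total_height):
    """Left and right viewport rectangles (x, y, width, height) for a
    side-by-side vertical split, one half per GPU."""
    half = total_width // 2
    left = (0, 0, half, total_height)
    right = (half, 0, total_width - half, total_height)
    return left, right

# A dual-1080p setup treated as one 3840x1080 surface:
left, right = split_viewports(3840, 1080)
print(left)   # (0, 0, 1920, 1080)
print(right)  # (1920, 0, 1920, 1080)
```

With two physical screens, each half maps directly onto one panel, which is why the split lines up naturally with one-GPU-per-eye rendering.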

I think this would also give us the best performance. I currently use SLI with three screens at a resolution of 5760x1080, and my two GTX 580s handle most modern games with ease at that resolution, so I imagine I have a lot of headroom as far as the Rift dev kit is concerned.
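As a sanity check on the headroom claim, comparing raw pixel counts (refresh rate and shading cost aside; the dual-screen figures are illustrative, not dev-kit specs):

```python
surround = 5760 * 1080          # the triple-screen Surround setup above
dual_1080p = 2 * 1920 * 1080    # two hypothetical 1080p eye screens
dual_1440p = 2 * 2560 * 1440    # two hypothetical 1440p eye screens

print(surround)    # 6220800
print(dual_1080p)  # 4147200 -- fewer pixels than the Surround setup
print(dual_1440p)  # 7372800 -- roughly 19% more than the Surround setup
```

So dual 1080p would actually be an easier load than 5760x1080, while dual 1440p asks only modestly more, which supports the headroom intuition.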

It would work even better for two screens, as they should stay perfectly in sync going via SLI.

I can see it being beneficial for NVIDIA too: if an SLI side-by-side vertical-split mode became the primary way of powering an HMD like the Rift, they'd probably sell quite a few extra cards because of it.