
CV1 must not be less than 1440p...

WhiteSkyMage
Honored Guest
It's true you might need 2 cards, but it's time to push Nvidia and AMD to "pre-order" the next, smaller nanometer nodes. If they buy the new, smaller chips first, then mobile chip companies will have nothing (which is what must happen! Who needs phones? They're pointless with the Oculus Rift on the market!). The high-end market must start getting leaps in performance in order to drive virtual reality to full immersion!
43 REPLIES

WhiteSkyMage
Honored Guest
"jngdwe" wrote:
After seeing the lessened SDE of the Note 4, I see no reason not to go for 1440p. At the very least the screen should be 1440p, even if we don't render native.


1 GPU IS enough! But some people are like me: they want to use every single technology possible to make the picture look better (like DSR), and this is where 2 GPUs come into play. You can't do 1440p DSR with just one 980... You must keep your FPS above 60 at all times... and that will not always be possible when rendering at 4K and downscaling, unless the game is well optimized...

Imagine a game like Skyrim... you add 50 texture mods, including 4K ones, with anti-aliasing and DSR on top (I know I'm going a bit too far here) - but imagine the most extreme maxed-out setup with one GTX 980... will it be possible to stay at 60+ FPS? Not everyone here is looking to spend the minimum - some people want performance without mercy...
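A quick back-of-the-envelope sketch of why DSR is so expensive: 4x DSR renders four times the pixels of the native target. The resolutions below are illustrative assumptions, not measured GTX 980 figures.

```python
# Rough pixel-throughput comparison: native 1440p vs. 4x DSR (render large,
# then downscale). Illustrative numbers only.
def pixels_per_second(width, height, fps):
    return width * height * fps

native = pixels_per_second(2560, 1440, 60)
dsr_4x = pixels_per_second(5120, 2880, 60)  # 4x DSR: 2x width, 2x height

print(f"native 1440p @60: {native / 1e6:.0f} Mpx/s")   # 221 Mpx/s
print(f"4x DSR @60:       {dsr_4x / 1e6:.0f} Mpx/s")   # 885 Mpx/s, 4x the load
```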

MrsVR
Honored Guest
"WhiteSkyMage" wrote:
will it be possible to remain at 60+ FPS?


90 FPS with CV1. So funny.
Rendering/Game engineer

WhiteSkyMage
Honored Guest
"MrsVR" wrote:
"WhiteSkyMage" wrote:
will it be possible to remain at 60+ FPS?


90 FPS with CV1. So funny.


That's what Palmer said... You're looking at a minimum of 60 FPS and no less...

I wonder how Sony will do their Project Morpheus... It will be funny if they fail...

MrMonkeybat
Explorer
60 fps async-timewarped to 120 Hz would probably work; the flicker from strobing a low-persistence display at 60 Hz is the main reason higher refresh rates are needed.

If you optimise for something between Half-Life 2 and mid PS3/X360-era graphics, something like an R9 280 should be able to render 4K at 120 Hz and output it through its twin DisplayPorts. They can be had for under $200 USD these days.
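The raw throughput behind that claim can be checked with simple arithmetic. Note this counts only finished pixels; the real cost depends entirely on shading complexity, which is why the "Half-Life 2 era graphics" caveat matters.

```python
# Back-of-the-envelope check of the "4K @ 120 Hz on a mid-range card" claim:
# raw finished-pixel throughput only, ignoring shading cost.
w, h, hz = 3840, 2160, 120
mpx_per_s = w * h * hz / 1e6
print(f"{mpx_per_s:.0f} Mpx/s")  # ~995 Mpx/s of finished pixels per second
```

That is well under the multi-gigapixel peak fill rates of 2014-era mid-range GPUs, so the bottleneck is shading work per pixel, not fill rate.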

We have yet to see game engines and drivers fully optimised for VR, with things like dynamic render buffers, binocular culling, and simultaneous stereo rendering.

It all comes down to what screen technology they have access to. 2560 RGB might be better than 3840 PenTile. It doesn't necessarily need to be a standard resolution either; it's possible to cut the screens to whatever size fits best.

RonsonPL
Heroic Explorer
Just my "2 cents":

1. 60fps is not the minimum; it's the minimum that sometimes works, a "in case of disaster, use this" fallback, not "minimal but enough". There will be many serious downsides to having only 60 instead of 90-120: latency, and image artifacts where time-warp cannot work well (and will never work well, because of the nature of the technique, so don't count on "they'll optimize it").

2. You never strobe at 60Hz. It has been said many times before: X fps is required at X Hz to make low persistence work. You cannot even run 89fps at 90Hz; the two need to match exactly.
Even 80fps at 160Hz (not talking about VR and time warp now) looks awful and light-years away from what low persistence can offer.
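The "fps must equal Hz" point can be sketched numerically: if the display strobes more often than the game delivers frames, some strobes re-show the previous frame, and an eye tracking motion sees a doubled image. A minimal simulation:

```python
# Count how many strobes of a low-persistence display re-show the previous
# frame when the render rate (fps) doesn't match the strobe rate (hz).
def duplicated_strobes(fps, hz, seconds=1):
    # Which source frame is on screen at each strobe instant:
    shown = [int(i * fps / hz) for i in range(hz * seconds)]
    return sum(1 for a, b in zip(shown, shown[1:]) if a == b)

print(duplicated_strobes(90, 90))   # 0  -> every strobe shows a fresh frame
print(duplicated_strobes(89, 90))   # 1  -> one doubled frame per second
print(duplicated_strobes(80, 160))  # 80 -> every frame flashed twice
```

Even one doubled frame per second is a visible hitch under low persistence, and the 80fps/160Hz case doubles every single frame, which matches the "looks awful" description above.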
Not an Oculus hater, but not a fan anymore. Still lots of respect for the team-Carmack, Abrash. Oculus is driven by big corporation principles now. That brings painful effects already, more to come in the future. This is not the Oculus I once cheered for.

pappythefoo
Protege
"RonsonPL" wrote:
"VRoom" wrote:
I don't care if it's 1440p or 1080p as long as the SDE is almost solved (and, as I read, CB prototype is practically there). No matter what, resolution won't be perfect for quite some time... so as long as the image quality is good enough that it doesn't bother you anymore and the experience itself becomes more important (again, CB seems to deliver in that department), I'm OK with it.


Well...yes.
And no.


You don't care in the same way people say "I don't want 3D or VR, since I don't see anything wrong with 2D. 2D is fine, it's enough." And that stems from not knowing what you're losing, what things are blocked, what you cannot experience.
Go play Battlefield 4. Set the resolution to 800x600 and the resolution scale to the lowest. Now go play a round on foot, inside a building. If it were your first time playing any game, you would say it's sufficient to enjoy. After all, you scored a few kills. It was fun.
But go fly a heli at that resolution...
Go drive a tank and try to shoot a guy at some distance.
Go read the small text/indicators in a flight simulator,
and you'll see what the problem is.

These are the limits you get when dealing with "good enough" hardware.
For example, right now, some people with 3D Vision and a DK2 are playing Project Cars on their monitors because on the DK2 they can't quite see the objects they use to judge their braking points. That's just another example.

You haven't seen hi-res VR, so you don't know what difference it makes.
You haven't seen a much wider FOV (than DK2 and probably CV1) in VR, so you don't know what doors that opens either.

We now know we're not getting any proper controller that is widely accepted by developers. So we are very limited at the very beginning. 1080p would add even more limitations.

Don't get me wrong. I know what DK2 can do, I know it can be fun, and with SDE improved, CV1 will be significantly better. But I also know what people said when 320x240 was a standard. And then at 480p. And then at 720p and 1080p.
Always the same: resolution doesn't matter. And then... you go back to the lower one that you called "OK" back in the day and... you clearly see you were wrong, and you decide to watch out for that mistake in the future. At least that's what I did.

We should get to 4x the pixel count of DK2 as soon as possible, even if it means changing the price from the Facebook-casual $300 to the levels initially announced, which means $450-500.
A similar FOV and no more than 20-30% additional pixels compared to DK2 is simply as far from the promised "best VR we can offer under $500" as it gets. And the $500 quote is from Oculus Kickstarter times.

edit: and if a PS3 can display 60fps at 1080p (2006's Ridge Racer), and a smartphone can do VR on a 1440p display, then a single-GPU card cheaper than a Titan can handle 1440p+50% without a problem. Even with some DSR, which you can easily replace with proper AA, although not if you want to keep the advantages of a deferred-rendering engine. But no one forces anyone to use it, so DSR is not the only option we have to improve quality that much.
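The pixel budgets being argued about here can be put side by side (panel figures assumed: DK2 = 1920x1080, "1440p" = 2560x1440):

```python
# Pixel budgets from the thread, relative to DK2's assumed 1920x1080 panel.
dk2 = 1920 * 1080                      # 2,073,600 px
p1440_plus50 = int(2560 * 1440 * 1.5)  # the "1440p + 50%" render target
target_4x = dk2 * 4                    # "4x the pixels of DK2"

print(f"DK2:        {dk2:,}")
print(f"1440p +50%: {p1440_plus50:,}  ({p1440_plus50 / dk2:.1f}x DK2)")  # 2.7x
print(f"4x DK2:     {target_4x:,}  (equivalent to 3840x2160)")
```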


I agree. The Oculus Rift does simulate human eyesight (seeing things with two eyes), but currently (DK2) the view is like severe near-sightedness without glasses, and that's definitely a problem in most games we play on a regular monitor.

I'm not asking for fighter-pilot eyesight, just eyesight that can read small text from 3 feet away.
Ordered : 20 Sep 2013 Number : 74XXX Status : Arrived Holy Blessed Email : Nov 5 Arrived : Nov 13 Shipping Email : Nov 14

TomSD
Honored Guest
"pappythefoo" wrote:
I agree. The Oculus Rift does simulate human eyesight (seeing things with two eyes), but currently (DK2) the view is like severe near-sightedness without glasses, and that's definitely a problem in most games we play on a regular monitor.

I'm not asking for fighter-pilot eyesight, just eyesight that can read small text from 3 feet away.

I agree too. Downsampling/sharpening is not enough to solve this problem, nor is elimination of SDE. And using the DK2, it's a very obvious problem that everyone notices right away.

This stands in contrast to the frame rate. When things are working properly with low persistence and timewarp, 75Hz in DK2 seems to be quite sufficient, especially for a version 1 kind of thing. Much more sufficient than 1080p. So what's the motivation for the increase to 90Hz? Is it one of those "if you experienced 90Hz and then went back to 75Hz you'd understand" kind of things? What's the benefit that justifies the (substantial) cost?
i7-4770K, 2x GTX 780 SLI, Windows 7 64-bit, Oculus runtime 0.6.0.0

RonsonPL
Heroic Explorer
"TomSD" wrote:

This stands in contrast to the frame rate. When things are working properly with low persistence and timewarp, 75Hz in DK2 seems to be quite sufficient, especially for a version 1 kind of thing. Much more sufficient than 1080p. So what's the motivation for the increase to 90Hz? Is it one of those "if you experienced 90Hz and then went back to 75Hz you'd understand" kind of things? What's the benefit that justifies the (substantial) cost?


1. The biggest part of 75Hz being not enough is flickering. Some people are OK with 75Hz, while others are not until it goes over 110Hz. 75Hz is not enough for most. For me 90Hz isn't enough. This will limit your time before your eyes/brain get fatigued, so the more, the better. I wouldn't mind 240Hz for just this reason. Also, the brighter the screen/scene, the more obvious and painful is the flicker. DK2 is awfully dim in LP mode, so 75Hz would look a lot worse on CV1 (I sure hope the CV1 screen is much brighter than DK2, but after playing on Sony's HMZ-T2 (also OLED screen) I'm starting to be afraid. Smaller space between pixels and sub-pixels could help a lot here).

2. Latency: the more Hz, the lower the latency, for positional tracking and for controllers.

3. The higher the framerate, the more useful time warp technique gets.

4. If the framerate is higher, then latency goes down, and there are game genres requiring a lot more than 90Hz to feel good while playing. My opinion: the framerate should be as high as possible. We should choose less detailed graphical assets over a lower framerate/screen frequency.
Unfortunately I don't think we'll get 120Hz soon, or 180+Hz in the near future. I hope for at least 96Hz so movies can be easily interpolated from 24fps to 96fps (like this program does: http://www.svp-team.com/ ), and 120Hz would be even better, since there will be a lot of 60Hz 360° videos made for casual VR (Gear VR, Android VR, Sony Morpheus), although I don't know if the ultra-low latency required to interpolate 60Hz 360° videos is achievable in 2015/16.
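The refresh rates in this list can be compared directly: per-frame scan-out interval (one component of motion-to-photon latency), and whether each rate is an integer multiple of 24 fps film, which is what makes the 96Hz and 120Hz interpolation cases "easy".

```python
# Frame interval and 24-fps-film compatibility for the rates discussed above.
for hz in (60, 75, 90, 96, 120, 240):
    frame_ms = 1000 / hz
    note = f"24 fps x{hz // 24}" if hz % 24 == 0 else "not a 24-fps multiple"
    print(f"{hz:3d} Hz: {frame_ms:5.2f} ms/frame ({note})")
```

Only 96, 120, and 240 divide evenly by 24, which is why those rates allow clean film interpolation while 75 and 90 do not.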
Not an Oculus hater, but not a fan anymore. Still lots of respect for the team-Carmack, Abrash. Oculus is driven by big corporation principles now. That brings painful effects already, more to come in the future. This is not the Oculus I once cheered for.

TomSD
Honored Guest
So... you see flickering with DK2? I've never been able to see it. Nor do I find it dim, but I tend to prefer darker displays. I've also never felt that the display was a significant factor in fatigue I've experienced, but I'll admit that's hard to nail down.

I clearly remember seeing uncomfortable flicker on CRT monitors running at 60Hz. I identified and corrected it on many occasions, even when the regular user of the monitor claimed not to have realized it was a problem. But I was never able to see a benefit from exceeding 75Hz on a CRT, and when LCDs came along, I always felt that 60Hz was sufficient. When it comes to latency, the 8.3ms reduction going from 60Hz to 120Hz seems minuscule in comparison to my tested reaction time of about 240ms.
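The 8.3ms figure is just the difference in frame intervals, which is easy to verify:

```python
# The frame-interval saving from 60 Hz to 120 Hz, set against a ~240 ms
# human reaction time (the figure quoted above).
delta_ms = 1000 / 60 - 1000 / 120
print(f"{delta_ms:.1f} ms saved per frame")              # 8.3 ms
print(f"{delta_ms / 240 * 100:.1f}% of a 240 ms reaction time")
```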

Don't get me wrong, I know that when it comes to latency you have to chase every millisecond everywhere you can because it all adds up. But at some point there are always diminishing returns, and it can be harmful to the overall experience if valuable resources such as compute capacity are squandered by chasing them.

I hope that if refresh rates higher than 75Hz are achievable on CV1, advanced users will be able to select the refresh rate that best fits their personal situation and preference. I'd much rather optimize my experience for benefits that are more useful to me: higher graphical fidelity and more easily achievable consistent synchronization between frame rate and refresh rate.

It would be a real face-palm thing IMO if a requirement of exceeding 75Hz ends up being the thing that limits CV1 to a 1080p screen. I think Oculus is better than that, though.
i7-4770K, 2x GTX 780 SLI, Windows 7 64-bit, Oculus runtime 0.6.0.0

WhiteSkyMage
Honored Guest
"RonsonPL" wrote:
"TomSD" wrote:

This stands in contrast to the frame rate. When things are working properly with low persistence and timewarp, 75Hz in DK2 seems to be quite sufficient, especially for a version 1 kind of thing. Much more sufficient than 1080p. So what's the motivation for the increase to 90Hz? Is it one of those "if you experienced 90Hz and then went back to 75Hz you'd understand" kind of things? What's the benefit that justifies the (substantial) cost?


1. The biggest part of 75Hz being not enough is flickering. Some people are OK with 75Hz, while others are not until it goes over 110Hz. 75Hz is not enough for most. For me 90Hz isn't enough. This will limit your time before your eyes/brain get fatigued, so the more, the better. I wouldn't mind 240Hz for just this reason. Also, the brighter the screen/scene, the more obvious and painful is the flicker. DK2 is awfully dim in LP mode, so 75Hz would look a lot worse on CV1 (I sure hope the CV1 screen is much brighter than DK2, but after playing on Sony's HMZ-T2 (also OLED screen) I'm starting to be afraid. Smaller space between pixels and sub-pixels could help a lot here).

2. Latency: the more Hz, the lower the latency, for positional tracking and for controllers.

3. The higher the framerate, the more useful time warp technique gets.

4. If the framerate is higher, then latency goes down, and there are game genres requiring a lot more than 90Hz to feel good while playing. My opinion: the framerate should be as high as possible. We should choose less detailed graphical assets over a lower framerate/screen frequency.


OK, but then framerate = GPU power. Game companies will want to give us the best visuals possible, and on top of that they won't optimize the games... Oculus Rift targets low cost, remember? Great, but maybe they're assuming you've got some powerful beast behind it to render at that frame rate. Remember, PC graphics will not go back to console-level graphics just because graphics adapters are weak and cannot achieve high framerates. Either the consumer pays $$$$ for 2 or 3 flagship GPUs, or turns down the graphical detail.

If I, for example, want a perfect 90 FPS @ 90Hz at 1440p (+DSR), then I will have to get another 980 and overclock the **** out of both of them... and who knows if that's going to work when a game is badly optimized...