CV1 must not be less than 1440p...

WhiteSkyMage Posts: 8
edited February 2015 in General
It's true you might need 2 cards, but it is time to push Nvidia and AMD to "preorder" the next, smaller nanometer nodes. If they buy the new smaller chips first, then mobile chip companies will not have anything (which is what must happen! Who needs phones? - they are just pointless with the Oculus Rift on the market!). The high-end market must start getting leaps in performance in order to drive virtual reality to full immersion!

Comments

  • ...or it will FAIL?
  • kojack Posts: 6,474 Volunteer Moderator
    Who needs phones? - they are just pointless with Oculus rift on the market!
    True. I much prefer using a pc, DK2 and a portable generator to make phone calls while walking around rather than a phone.
    I can wheel it around on a trolley, so it's technically mobile. :)
  • jngdwe Posts: 566
    Art3mis
    I agree that it needs to be 1440p.

    I'm waiting until the 12th to order my Gear VR, but I've already held my DK2 lenses up to the screen, and the SDE was noticeably improved. I see no reason not to use 1440p; we can always upscale from 1080p.
  • Wireline Posts: 1,203
    NerveGear
    I'm not sure I agree with this anymore. We've already had the "higher res does not equal no SDE" arguments, which, though not conclusively proven, have come from some high-level sources.

    However, for me the issue has been about resolution, and to be honest I have found that using Nvidia DSR makes a MASSIVE difference to the image quality in the Rift. A great example is Prepar3D, where usually I would struggle to read the gauges in the aircraft, and from the normal pilot position the cockpit detail would be quite jaggy.

    However, I set the resolution of the Rift to essentially 4K, and the difference is like night and day. It's still not as good as looking at a normal monitor, but it is a leaps-and-bounds improvement.

    The problem of course is that with higher resolutions and higher refresh, the hardware you need will just keep going up and up. How many people are currently maxing everything out with their DK2s? I've got an overclocked Haswell-E and two overclocked 980s, and there is still stuff that struggles (without DSR), such as DCS World.

    I think a lot of this is going to come down to software.
  • A 4K screen (Note 5) wouldn't be possible, from whatever angle you look at it:
    - HDMI 2.0 does not support 4K @ 90fps; the max is 60fps, so unless they use DisplayPort 1.3, I don't see the chance...
    - No GPU can do 90fps at 4K, and Oculus needs to run on a PC with no more than 2 cards in SLI.
    - It will be more expensive - not only the gear itself but the PC behind it... meaning that it will be an expensive upgrade even for enthusiasts.
    - Even with a 1440p screen, it will still be good for a couple of years. Even when graphics cards reach the performance for 4K using 1 card, people will just use DSR and AA - and actually even now, if you have 2 GTX 980s or Titans. Bear in mind that 1440p is still challenging in games at the highest settings (none of you want to turn down graphics quality, am I right?) if you are looking at anything less than a GTX 970.

    I will, however, be disappointed if Oculus decides on a normal 1080p screen again... It would just be bad...
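A quick sanity check on the bandwidth claim in the post above. This is only a rough sketch: the constants are the commonly published effective data rates for HDMI 2.0 (14.4 Gbit/s after 8b/10b encoding) and DisplayPort 1.3 (25.92 Gbit/s over four HBR3 lanes), and the math ignores blanking intervals, so real requirements run somewhat higher.

```python
# Naive uncompressed-video bandwidth estimate (ignores blanking overhead).
def raw_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

HDMI_2_0_GBPS = 14.4   # effective payload rate (18.0 Gbit/s link, 8b/10b coded)
DP_1_3_GBPS = 25.92    # effective payload rate, four HBR3 lanes

for hz in (60, 90):
    need = raw_gbps(3840, 2160, hz)
    print(f"4K @ {hz} Hz: ~{need:.1f} Gbit/s, "
          f"fits HDMI 2.0: {need < HDMI_2_0_GBPS}, "
          f"fits DP 1.3: {need < DP_1_3_GBPS}")
```

4K@60 works out to roughly 12 Gbit/s (fits HDMI 2.0), while 4K@90 needs roughly 18 Gbit/s, which is why DisplayPort 1.3 keeps coming up in these threads.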
  • Wireline Posts: 1,203
    NerveGear
    A 4K screen (Note 5) wouldn't be possible, from whatever angle you look at it:
    - HDMI 2.0 does not support 4K @ 90fps; the max is 60fps, so unless they use DisplayPort 1.3, I don't see the chance...
    - No GPU can do 90fps at 4K, and Oculus needs to run on a PC with no more than 2 cards in SLI.
    - It will be more expensive - not only the gear itself but the PC behind it... meaning that it will be an expensive upgrade even for enthusiasts.
    - Even with a 1440p screen, it will still be good for a couple of years. Even when graphics cards reach the performance for 4K using 1 card, people will just use DSR and AA - and actually even now, if you have 2 GTX 980s or Titans. Bear in mind that 1440p is still challenging in games at the highest settings (none of you want to turn down graphics quality, am I right?) if you are looking at anything less than a GTX 970.

    I will, however, be disappointed if Oculus decides on a normal 1080p screen again... It would just be bad...

    I think you have misunderstood what I was saying. With DSR, you are oversampling. The final image is still 1080p because that's what the DK2 is. I am not saying we need 4K; I am saying 1080p is fine - with oversampling.

    That has been my experience at least :)
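To illustrate the oversampling point, here is a small sketch, assuming DSR behaves as a simple area multiplier the way NVidia's control panel presents it (the actual downscale filter is glossed over):

```python
# DSR-style oversampling: render above panel resolution, then filter
# back down to native. The factor is an area multiplier, so each axis
# scales by its square root.
def dsr_render_resolution(native_w, native_h, factor):
    scale = factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

# DK2's panel is 1920x1080; 4.00x DSR renders a 4K image and
# downsamples it, which is where the sharper cockpit gauges come from.
print(dsr_render_resolution(1920, 1080, 4.0))  # -> (3840, 2160)
```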
  • WhiteSkyMage Posts: 8
    edited February 2015
    Wireline wrote:
    A 4K screen (Note 5) wouldn't be possible, from whatever angle you look at it:
    - HDMI 2.0 does not support 4K @ 90fps; the max is 60fps, so unless they use DisplayPort 1.3, I don't see the chance...
    - No GPU can do 90fps at 4K, and Oculus needs to run on a PC with no more than 2 cards in SLI.
    - It will be more expensive - not only the gear itself but the PC behind it... meaning that it will be an expensive upgrade even for enthusiasts.
    - Even with a 1440p screen, it will still be good for a couple of years. Even when graphics cards reach the performance for 4K using 1 card, people will just use DSR and AA - and actually even now, if you have 2 GTX 980s or Titans. Bear in mind that 1440p is still challenging in games at the highest settings (none of you want to turn down graphics quality, am I right?) if you are looking at anything less than a GTX 970.

    I will, however, be disappointed if Oculus decides on a normal 1080p screen again... It would just be bad...

    I think you have misunderstood what I was saying. With DSR, you are oversampling. The final image is still 1080p because that's what the DK2 is. I am not saying we need 4K; I am saying 1080p is fine - with oversampling.

    That has been my experience at least :)

    I meant a 4K display directly in the Oculus CV1 (in other words, skipping over 1440p), in case somebody comes along suggesting it... I wonder what will happen if you do 1440p using DSR... will it be unplayable?

    Anyway, I just wish all companies making mobile devices, like Apple, got a 1-year delay because there's no mass production of chips left for them... It would feel good to visit TSMC and see the Apple guys crying after finding out that all the next-node chips were bought by Nvidia...
  • Wireline Posts: 1,203
    NerveGear

    I meant a 4K display directly, in case somebody comes along suggesting it... I wonder what will happen if you do 1440p using DSR... will it be unplayable?

    It will be machine dependent. I am running a 5820K @ 4.5GHz and SLI superclocked 980s, and I can run 4K DSR with P3D (flight sim) without judder. 2K DSR would be fine for it. I have also tried it in Elite Dangerous at 2K res and it was fine - it looks so much better.

    That hardware setup is of course pretty far from mainstream, so again we come back to the hardware / accessibility problem.
  • Wireline wrote:

    I meant a 4K display directly, in case somebody comes along suggesting it... I wonder what will happen if you do 1440p using DSR... will it be unplayable?

    It will be machine dependent. I am running a 5820K @ 4.5GHz and SLI superclocked 980s, and I can run 4K DSR with P3D (flight sim) without judder. 2K DSR would be fine for it. I have also tried it in Elite Dangerous at 2K res and it was fine - it looks so much better.

    That hardware setup is of course pretty far from mainstream, so again we come back to the hardware / accessibility problem.

    Yeah, I got a 5820K but haven't gotten around to OCing it yet. Also, only one GTX 980 for now... I will buy the other one when I get my ROG 4K IPS screen this summer... I will watercool the 2nd one. At least for the Oculus Rift CV1 I will be ready to rock it with some heavy graphics...

    I will OC my CPU when I see I need the extra performance for future multi-threaded games and video editing...
  • Wireline Posts: 1,203
    NerveGear
    Wireline wrote:

    I meant a 4K display directly, in case somebody comes along suggesting it... I wonder what will happen if you do 1440p using DSR... will it be unplayable?

    It will be machine dependent. I am running a 5820K @ 4.5GHz and SLI superclocked 980s, and I can run 4K DSR with P3D (flight sim) without judder. 2K DSR would be fine for it. I have also tried it in Elite Dangerous at 2K res and it was fine - it looks so much better.

    That hardware setup is of course pretty far from mainstream, so again we come back to the hardware / accessibility problem.

    Yeah, I got a 5820K but haven't gotten around to OCing it yet. Also, only one GTX 980 for now... I will buy the other one when I get my ROG 4K IPS screen this summer... I will watercool the 2nd one. At least for the Oculus Rift CV1 I will be ready to rock it with some heavy graphics...

    I will OC my CPU when I see I need the extra performance for future multi-threaded games and video editing...

    Nice :) If you have got yourself a good chip, that 5820K should cruise easily up to 4.5GHz, with some folks seeing up to 4.8 if they are really lucky. In fact, 4.5 is the recommended initial overclock from ASUS.

    That said, my first 5820K was one of the early 'dog' batches, and wouldn't go above 4GHz no matter how much voltage you threw at it, and it started degrading fast. I had the Intel Tuning Plan (highly recommended if you are going to OC) and now have a new chip that easily does 4.5GHz at 1.26V (it may even run at a lower voltage, but I have not tested yet - this is just running AUTO from the X99 Deluxe BIOS).

    When Haswell-E works well, it's a simple multiplier-plus-voltage job (taking into account any strap changes if you want to run your memory over 2400MHz).

    I have mine OC'd for flight sims, which are enormously CPU reliant.
  • TomSD Posts: 429
    Brain Burst
    Oculus should make CV1 1080p just to spite all the people who were frothing over 4K this, 4K that, and instantly close all new threads discussing resolution. Keep just one thread to be the cesspool of resolution whining. Sticky it. Have the first post say "nope, not gonna happen -Palmer".

    You'll get 1080p+SDE and like it!

    (Wouldn't I make a great community manager?)
    i7-4770K, 2x GTX 780 SLI, Windows 7 64-bit, Oculus runtime 0.6.0.0
  • janherca Posts: 36
    Brain Burst
    Give me a better aperture ratio on the OLED screen, don't give me PenTile anymore, forget the resolution, and CV1 would blow my mind for sure.

    What we need is a CUSTOM VR SCREEN (from Samsung or anyone else), and no more phone screens with super-resolutions. The problem with this is how to hold the $300-400 retail price.

    And that's the equation that Oculus is dealing with.
  • VizionVR Posts: 3,022
    Wintermute
    kojack wrote:
    Who needs phones? - they are just pointless with Oculus rift on the market!
    True. I much prefer using a pc, DK2 and a portable generator to make phone calls while walking around rather than a phone.
    I can wheel it around on a trolley, so it's technically mobile. :)
    :lol:
    Ah yes, Riftcarts. I've read about these.
    If only they made booths with phones inside them. For added convenience they'd be on every street corner and gas station. The phone companies could charge us to make phone calls. We'd never need to carry those bulky mobile phones around any more; we'd only need our "Riftcarts" for VR. It would be revolutionary!!
    Not a Rift fanboi. Not a Vive fanboi. I'm a VR fanboi. Get it straight.
  • Walky Posts: 357
    Brain Burst
    frankzappa wrote:
    ...or it will FAIL?

    This made my day :lol:
  • ennogs Posts: 230
    Art3mis
    It's true you might need 2 cards, but it is time to push Nvidia and AMD to "preorder" the next, smaller nanometer nodes. If they buy the new smaller chips first, then mobile chip companies will not have anything (which is what must happen! Who needs phones? - they are just pointless with the Oculus Rift on the market!). The high-end market must start getting leaps in performance in order to drive virtual reality to full immersion!


    If the Rift came out and needed two high-end cards to run it, then it would fail. Needing one high-end card is fair enough, but two?
  • jngdwe Posts: 566
    Art3mis
    After seeing the lessened SDE of the Note 4, I see no reason not to go for 1440p. At the very least the screen should be 1440p, even if we don't render native.
  • Wireline wrote:
    I'm not sure I agree with this anymore. We've already had the "higher res does not equal no SDE" arguments, which, though not conclusively proven, have come from some high-level sources.

    However, for me the issue has been about resolution, and to be honest I have found that using Nvidia DSR makes a MASSIVE difference to the image quality in the Rift. A great example is Prepar3D, where usually I would struggle to read the gauges in the aircraft, and from the normal pilot position the cockpit detail would be quite jaggy.

    However, I set the resolution of the Rift to essentially 4K, and the difference is like night and day. It's still not as good as looking at a normal monitor, but it is a leaps-and-bounds improvement.

    The problem of course is that with higher resolutions and higher refresh, the hardware you need will just keep going up and up. How many people are currently maxing everything out with their DK2s? I've got an overclocked Haswell-E and two overclocked 980s, and there is still stuff that struggles (without DSR), such as DCS World.

    I think a lot of this is going to come down to software.

    I think this is the reason Crescent Bay is not higher resolution than Gear VR. They have found that 1440p with DSR is better than 4K at native resolution.

    I believe CV1 will be something like 1300x3250 pixels (i.e. 1625x1300 per eye, a 5:4 aspect ratio).
  • VRoom Posts: 596
    I don't care if it's 1440p or 1080p as long as the SDE is almost solved (and, as I read, CB prototype is practically there). No matter what, resolution won't be perfect for quite some time... so as long as the image quality is good enough that it doesn't bother you anymore and the experience itself becomes more important (again, CB seems to deliver in that department), I'm OK with it.
  • RonsonPL Posts: 1,115
    Trinity
    VRoom wrote:
    I don't care if it's 1440p or 1080p as long as the SDE is almost solved (and, as I read, CB prototype is practically there). No matter what, resolution won't be perfect for quite some time... so as long as the image quality is good enough that it doesn't bother you anymore and the experience itself becomes more important (again, CB seems to deliver in that department), I'm OK with it.

    Well...yes.
    And no.


    You don't care in the same way people say "I don't want 3D or VR, since I don't see anything wrong with 2D. 2D is fine, it's enough." And that originates from a lack of knowledge of what you are losing, what things are blocked, what you cannot experience.
    Go play Battlefield 4. Set the resolution to 800x600 and the resolution scale to the lowest. Now go play a round when you're on foot and in a building. If you were playing a game for the first time in your life, you would say it's sufficient to enjoy. After all, you scored a few kills. It was fun.
    But go fly a heli at this resolution...
    Go drive a tank and try to shoot a guy at some distance.
    Go read the small text/indicators in a flight simulator,
    and you'll see what the problem is.

    These are the limits you get when dealing with "good enough" hardware.
    For example, right now, some people with 3D Vision and DK2 are playing Project Cars on their monitors, because on the DK2 they can't quite see the objects by which they judge their braking points. That's just another example.

    You haven't seen hi-res VR - you don't know what difference it makes.
    You haven't seen a much wider FOV in VR (than DK2 and probably CV1), so you don't know what doors it opens either.

    We now know we're not getting any proper, widely accepted (by developers) controller. So we are limited (very much!) at the very beginning. 1080p would add even more limitations.

    Don't get me wrong. I know what DK2 can do, I know it can be fun, and with SDE improved, CV1 will be significantly better. But I also know what people said when 320x240 was the standard. And then at 480p. And then at 720p and 1080p.
    Always the same - resolution doesn't matter. And then... you go back to the lower one, the one you called "OK" back in the day, and... you clearly see you were wrong, and you decide to watch out for this mistake in the future. At least that's what I did.

    We should get to 4x the pixel count of the DK2 as soon as possible. Even if it means changing the price from the Facebook-casualish $300 to the levels initially announced, which means $450-500.
    A similar FOV and not more than 20-30% additional pixels compared to DK2 is simply as far from the promised "best VR we can offer under $500" as it gets. And the $500 quote is from Oculus Kickstarter times.

    edit: and if the PS3 is able to display 60fps at 1080p (2006's Ridge Racer), and if a smartphone can do VR on a 1440p display, then a single-GPU card cheaper than a Titan can handle 1440p+50% without a problem. Even with some DSR, which you can easily replace with proper AA - although not if you want to keep the advantages of a deferred rendering engine. But no one forces anyone to use that, so DSR is not the only option we have to improve the quality that much.
    Not an Oculus hater, but not a fan anymore.
    Still lots of respect for the team-Carmack, Abrash.
    Oculus is driven by big corporation principles now. That brings painful effects already, more to come in the future. This is not the Oculus I once cheered for.
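For reference, the pixel-count arithmetic behind "4x the pixel amount of the DK2" and "1440p+50%" works out as below (a quick sketch; the panel resolutions are assumed to be the standard 16:9 ones):

```python
# Megapixel counts for the panels discussed in this thread.
def mpix(w, h):
    return w * h / 1e6

dk2 = mpix(1920, 1080)   # 2.07 MPix (DK2 panel)
qhd = mpix(2560, 1440)   # 3.69 MPix ("1440p")
uhd = mpix(3840, 2160)   # 8.29 MPix ("4K")

print(f"1440p       = {qhd / dk2:.2f}x DK2")        # ~1.78x
print(f"1440p + 50% = {qhd * 1.5 / dk2:.2f}x DK2")  # ~2.67x
print(f"4K          = {uhd / dk2:.2f}x DK2")        # 4.00x
```

So "4x DK2" is, conveniently, exactly a 4K panel's pixel count, while "1440p+50%" sits a bit past the halfway point.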
  • WhiteSkyMage Posts: 8
    edited February 2015
    jngdwe wrote:
    After seeing the lessened SDE of the Note 4, I see no reason not to go for 1440p. At the very least the screen should be 1440p, even if we don't render native.

    1 GPU IS enough! But some people are like me - they want to use every single technology possible to make the picture look better (like DSR), and this is where 2 GPUs come into play. You cannot do 1440p DSR with just one 980... You must keep your FPS above 60 at all times... and that will not be possible every time when rendering at 4K and downscaling, unless the game is well optimized...

    Imagine a game like Skyrim... you put on 50 texture mods, including 4K ones, with anti-aliasing and DSR on top (I know I am going a bit too far here) - but imagine the most extreme of the extreme maxed out with 1 GTX 980... will it be possible to remain at 60+ FPS? Not everyone is looking at the minimum cash to spend here - some people want performance without mercy...
  • MrsVR Posts: 203
    will it be possible to remain at 60+ FPS?

    90 FPS with CV1. So funny.
    Rendering/Game engineer
  • MrsVR wrote:
    will it be possible to remain at 60+ FPS?

    90 FPS with CV1. So funny.

    That's what Palmer said... You are looking at a minimum of 60 FPS and no less...

    I wonder how Sony will do their Project Morpheus... It will be funny if they fail...
  • MrMonkeybat Posts: 640
    Brain Burst
    60fps async-timewarped to 120Hz would probably work; the strobing at 60Hz in low persistence mode is the main reason for higher Hz.

    If you optimise for something between Half-Life 2 and mid PS3/X360-era graphics, something like an R9 280 should be able to render 4K at 120Hz and output through its twin DisplayPorts. They can be had for under $200 USD these days.

    We have still yet to see game engines and drivers fully optimised for VR, with things like dynamic render buffers, binocular culling, and simultaneous stereo rendering.

    It all comes down to what screen technology they have access to. 2560 RGB might be better than 3840 PenTile. It does not necessarily need to be a standard resolution either; it's possible to cut the screens to whatever size fits best.
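The "2560 RGB vs 3840 PenTile" comparison can be made concrete with some back-of-envelope subpixel counting. A sketch, assuming 3 subpixels per pixel for an RGB stripe panel and 2 for PenTile (RGBG), with the usual 16:9 panel heights:

```python
# Subpixel totals for a 2560x1440 RGB panel vs a 3840x2160 PenTile panel.
def subpixels(w, h, per_pixel):
    return w * h * per_pixel

rgb_qhd = subpixels(2560, 1440, 3)      # RGB stripe: 3 subpixels per pixel
pentile_uhd = subpixels(3840, 2160, 2)  # PenTile RGBG: 2 subpixels per pixel

print(f"2560 RGB:     {rgb_qhd / 1e6:.1f}M subpixels")
print(f"3840 PenTile: {pentile_uhd / 1e6:.1f}M subpixels")

# Horizontally the two tie: 2560 * 3 == 3840 * 2 == 7680 subpixels per
# row, which is part of why the comparison is not clear-cut.
print(2560 * 3 == 3840 * 2)  # True
```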
  • RonsonPL Posts: 1,115
    Trinity
    edited February 2015
    Just my "2 cents":

    1. 60fps is not the minimum; it's the minimum that sometimes works, but it's just "in case of disaster, use this", not "minimal, enough". There will be many serious downsides to only having 60 instead of 90-120: latency, and the image artifacts where time-warp cannot work well (and will never work well, because of its nature, so don't count on "they'll optimize it").

    2. You never strobe at 60Hz. It was said many times before: X fps is required for X Hz to make low persistence work. You cannot have even 89fps at 90Hz; you need to make it exactly the same.
    Even 80fps at 160Hz (not talking about VR and time warp now) looks awful and light-years away from what low persistence can offer.
    Not an Oculus hater, but not a fan anymore.
    Still lots of respect for the team-Carmack, Abrash.
    Oculus is driven by big corporation principles now. That brings painful effects already, more to come in the future. This is not the Oculus I once cheered for.
  • pappythefoo Posts: 36
    Brain Burst
    RonsonPL wrote:
    VRoom wrote:
    I don't care if it's 1440p or 1080p as long as the SDE is almost solved (and, as I read, CB prototype is practically there). No matter what, resolution won't be perfect for quite some time... so as long as the image quality is good enough that it doesn't bother you anymore and the experience itself becomes more important (again, CB seems to deliver in that department), I'm OK with it.

    Well...yes.
    And no.


    You don't care in the same way people say "I don't want 3D or VR, since I don't see anything wrong with 2D. 2D is fine, it's enough." And that originates from a lack of knowledge of what you are losing, what things are blocked, what you cannot experience.
    Go play Battlefield 4. Set the resolution to 800x600 and the resolution scale to the lowest. Now go play a round when you're on foot and in a building. If you were playing a game for the first time in your life, you would say it's sufficient to enjoy. After all, you scored a few kills. It was fun.
    But go fly a heli at this resolution...
    Go drive a tank and try to shoot a guy at some distance.
    Go read the small text/indicators in a flight simulator,
    and you'll see what the problem is.

    These are the limits you get when dealing with "good enough" hardware.
    For example, right now, some people with 3D Vision and DK2 are playing Project Cars on their monitors, because on the DK2 they can't quite see the objects by which they judge their braking points. That's just another example.

    You haven't seen hi-res VR - you don't know what difference it makes.
    You haven't seen a much wider FOV in VR (than DK2 and probably CV1), so you don't know what doors it opens either.

    We now know we're not getting any proper, widely accepted (by developers) controller. So we are limited (very much!) at the very beginning. 1080p would add even more limitations.

    Don't get me wrong. I know what DK2 can do, I know it can be fun, and with SDE improved, CV1 will be significantly better. But I also know what people said when 320x240 was the standard. And then at 480p. And then at 720p and 1080p.
    Always the same - resolution doesn't matter. And then... you go back to the lower one, the one you called "OK" back in the day, and... you clearly see you were wrong, and you decide to watch out for this mistake in the future. At least that's what I did.

    We should get to 4x the pixel count of the DK2 as soon as possible. Even if it means changing the price from the Facebook-casualish $300 to the levels initially announced, which means $450-500.
    A similar FOV and not more than 20-30% additional pixels compared to DK2 is simply as far from the promised "best VR we can offer under $500" as it gets. And the $500 quote is from Oculus Kickstarter times.

    edit: and if the PS3 is able to display 60fps at 1080p (2006's Ridge Racer), and if a smartphone can do VR on a 1440p display, then a single-GPU card cheaper than a Titan can handle 1440p+50% without a problem. Even with some DSR, which you can easily replace with proper AA - although not if you want to keep the advantages of a deferred rendering engine. But no one forces anyone to use that, so DSR is not the only option we have to improve the quality that much.

    I agree. The Oculus Rift does simulate human eye view (seeing things with two eyes), but currently (DK2) the view is like severe near-sightedness without glasses,
    and that's definitely a problem in most games we play on a regular monitor.

    It's not that I'm asking for a fighter-pilot level of eyesight, just eyesight that can read small text 3 feet away.
    Ordered : 20 Sep 2013 Number : 74XXX Status : Arrived
    Holy Blessed Email : Nov 5
    Arrived : Nov 13
    Shipping Email : Nov 14
  • TomSD Posts: 429
    Brain Burst
    I agree. The Oculus Rift does simulate human eye view (seeing things with two eyes), but currently (DK2) the view is like severe near-sightedness without glasses,
    and that's definitely a problem in most games we play on a regular monitor.

    It's not that I'm asking for a fighter-pilot level of eyesight, just eyesight that can read small text 3 feet away.
    I agree too. Downsampling/sharpening is not enough to solve this problem, nor is the elimination of SDE. And using the DK2, it's a very obvious problem that everyone notices right away.

    This stands in contrast to the frame rate. When things are working properly with low persistence and timewarp, 75Hz in DK2 seems to be quite sufficient, especially for a version 1 kind of thing. Much more sufficient than 1080p. So what's the motivation for the increase to 90Hz? Is it one of those "if you experienced 90Hz and then went back to 75Hz you'd understand" kind of things? What's the benefit that justifies the (substantial) cost?
    i7-4770K, 2x GTX 780 SLI, Windows 7 64-bit, Oculus runtime 0.6.0.0
  • RonsonPL Posts: 1,115
    Trinity
    TomSD wrote:
    This stands in contrast to the frame rate. When things are working properly with low persistence and timewarp, 75Hz in DK2 seems to be quite sufficient, especially for a version 1 kind of thing. Much more sufficient than 1080p. So what's the motivation for the increase to 90Hz? Is it one of those "if you experienced 90Hz and then went back to 75Hz you'd understand" kind of things? What's the benefit that justifies the (substantial) cost?

    1. The biggest part of 75Hz not being enough is flickering. Some people are OK with 75Hz, while others are not until it goes over 110Hz. 75Hz is not enough for most; for me, even 90Hz isn't enough. This will limit your time before your eyes/brain get fatigued, so the more, the better. I wouldn't mind 240Hz for just this reason. Also, the brighter the screen/scene, the more obvious and painful the flicker. DK2 is awfully dim in LP mode, so 75Hz would look a lot worse on CV1 (I sure hope the CV1 screen is much brighter than the DK2's, but after playing on Sony's HMZ-T2 (also an OLED screen) I'm starting to worry. Smaller space between pixels and sub-pixels could help a lot here).

    2. The latency- the more Hz, the lower the latency, for positional tracking and for controllers.

    3. The higher the framerate, the more useful time warp technique gets.

    4. If the framerate is higher, then latency goes down, and there are game genres requiring a lot more than 90Hz to feel good while playing. My opinion: the framerate should be as high as possible. We should choose less detailed graphics assets over a lower framerate/screen frequency.
    Unfortunately I don't think we'll get 120Hz soon, or 180+Hz in the near future. I hope for at least 96Hz, so movies can be easily interpolated from 24fps to 96fps (like this program does: http://www.svp-team.com/ ), and 120Hz would be even better, since there will be a lot of 60Hz 360° videos made for casual VR (Gear VR, Android VR, Sony Morpheus) - although I don't know if achieving the ultra-low latency required to interpolate 60Hz 360° videos is possible in 2015/16.
    Not an Oculus hater, but not a fan anymore.
    Still lots of respect for the team-Carmack, Abrash.
    Oculus is driven by big corporation principles now. That brings painful effects already, more to come in the future. This is not the Oculus I once cheered for.
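The refresh-rate points above mostly reduce to frame-time arithmetic: one refresh period is the floor on display latency, so higher Hz directly shaves milliseconds. A quick sketch:

```python
# Frame time (ms) per refresh rate; halving it going from 60 Hz to
# 120 Hz saves about 8.3 ms of display latency.
def frame_ms(hz):
    return 1000.0 / hz

for hz in (60, 75, 90, 120, 240):
    print(f"{hz:3d} Hz -> {frame_ms(hz):5.2f} ms per frame")

print(f"60 -> 120 Hz saves {frame_ms(60) - frame_ms(120):.2f} ms")  # 8.33
```

The returns diminish quickly: 75 to 90 Hz saves about 2.2 ms, and 120 to 240 Hz only about 4.2 ms more.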
  • TomSD Posts: 429
    Brain Burst
    So... you see flickering with DK2? I've never been able to see it. Nor do I find it dim, but I tend to prefer darker displays. I've also never felt that the display was a significant factor in fatigue I've experienced, but I'll admit that's hard to nail down.

    I clearly remember seeing uncomfortable flicker on CRT monitors running at 60Hz. I identified and corrected it on many occasions, even when the regular user of the monitor claimed not to have realized it was a problem. But I was never able to see a benefit from exceeding 75Hz on a CRT. When LCDs came along, I always felt that 60Hz was sufficient on an LCD. When it comes to latency, the 8.3ms reduction going from 60Hz to 120Hz seems minuscule in comparison to my tested reaction time of about 240ms.

    Don't get me wrong, I know that when it comes to latency you have to chase every millisecond everywhere you can because it all adds up. But at some point there are always diminishing returns, and it can be harmful to the overall experience if valuable resources such as compute capacity are squandered by chasing them.

    I hope that if refresh rates higher than 75Hz are achievable on CV1, advanced users will be able to select the refresh rate that best fits their personal situation and preference. I'd much rather optimize my experience for benefits that are more useful to me: higher graphical fidelity and more easily achievable consistent synchronization between frame rate and refresh rate.

    It would be a real face-palm thing IMO if a requirement of exceeding 75Hz ends up being the thing that limits CV1 to a 1080p screen. I think Oculus is better than that, though.
    i7-4770K, 2x GTX 780 SLI, Windows 7 64-bit, Oculus runtime 0.6.0.0
  • RonsonPL wrote:
    TomSD wrote:
    This stands in contrast to the frame rate. When things are working properly with low persistence and timewarp, 75Hz in DK2 seems to be quite sufficient, especially for a version 1 kind of thing. Much more sufficient than 1080p. So what's the motivation for the increase to 90Hz? Is it one of those "if you experienced 90Hz and then went back to 75Hz you'd understand" kind of things? What's the benefit that justifies the (substantial) cost?

    1. The biggest part of 75Hz not being enough is flickering. Some people are OK with 75Hz, while others are not until it goes over 110Hz. 75Hz is not enough for most; for me, even 90Hz isn't enough. This will limit your time before your eyes/brain get fatigued, so the more, the better. I wouldn't mind 240Hz for just this reason. Also, the brighter the screen/scene, the more obvious and painful the flicker. DK2 is awfully dim in LP mode, so 75Hz would look a lot worse on CV1 (I sure hope the CV1 screen is much brighter than the DK2's, but after playing on Sony's HMZ-T2 (also an OLED screen) I'm starting to worry. Smaller space between pixels and sub-pixels could help a lot here).

    2. The latency- the more Hz, the lower the latency, for positional tracking and for controllers.

    3. The higher the framerate, the more useful time warp technique gets.

    4. If the framerate is higher, then latency goes down, and there are game genres requiring a lot more than 90Hz to feel good while playing. My opinion: the framerate should be as high as possible. We should choose less detailed graphics assets over a lower framerate/screen frequency.

    OK, but then framerate = GPU power. Game companies will want to give us the best visuals possible, and on top of that they won't optimize the games... The Oculus Rift targets low cost, remember? Great, but maybe they are assuming you've got some powerful beast in the back to render at that frame rate. Remember, PC graphics will not go back to something like console graphics just because graphics adapters are weak and cannot achieve a high framerate. Either the consumer pays $$$$ for 2 or 3 flagship GPUs, or turns down graphical detail.

    If I, for example, want a perfect 90FPS @ 90Hz at 1440p (+DSR), then I will have to get another 980 and overclock the **** out of both of them... and who knows if that's gonna work when a game is badly optimized...
  • andrewtek Posts: 976
    Art3mis
    Any game "built for VR" will not be badly optimized. If the requirement to put that moniker on a game is 90 or 120 FPS, game and engine developers will make it happen. Oculus will not choose an impossible target, and engine makers (at least Unreal Engine 4's) are open about how to achieve the needed frame rates.