Stereo Shading Reprojection

bigmike20vtbigmike20vt Posts: 4,153 Valuable Player
edited August 2017 in General
I am surprised this has not been posted here yet. I have not had time to read it properly, but a decrease of up to 20% in GPU load sounds nice to me.

Whilst I am not really a huge fan of Oculus seemingly putting so much time into mobile VR, the plus point is that working on weaker devices really pushes them to cut the compute needed for low-end VR,

which then has the knock-on effect of letting those with high-end VR crank up the details :)



Fiat Coupe, gone. 350Z gone. Dirty nappies, no sleep & practical transport incoming. Thank goodness for VR :)

Comments

  • snowdogsnowdog Posts: 7,503 Valuable Player
    I meant to post about this yesterday after I saw it on Reddit but I didn't get around to it. I might have mentioned it in a post here somewhere but I can't remember.

    Only for Unity so far, but they reckon it's easy enough to bring to other engines.

    This is what I meant when I said that Oculus are so far ahead of everyone in terms of R&D. Valve haven't even sorted out their version of ASW yet. What with this and ASW the performance of the Rift (and the quality of a PC capable of running one) is going to piss all over other headsets.
    "This you have to understand. There's only one way to hurt a man who's lost everything. Give him back something broken."

    Thomas Covenant, Unbeliever
  • YoLolo69YoLolo69 Posts: 1,129
    Wintermute
    The Oculus dev team still amaze me with their imagination and creativity; it's pretty rare to go that deep and far to provide a better experience instead of relying only on new hardware.

    “Dreams feel real while we are in them, it's only when we wake up that we realize something was strange.” - Dom Cobb

    "Be careful, if you are killed in real life you die in VR too." - TD_4242

    I7 3770K OC 4.5GHz, GTX1080 OC 10%, 16GB DDR3 2448  OC, Oculus Rift CV1

  • bigmike20vtbigmike20vt Posts: 4,153 Valuable Player
    YoLolo69 said:
    The Oculus dev team still amaze me with their imagination and creativity; it's pretty rare to go that deep and far to provide a better experience instead of relying only on new hardware.
    Well, I guess it's in their interests really. The biggest thing naysayers say about VR is that the price of entry is too high. The lower-end the PC that can be classed as VR-capable, the more people can board the VR train without having to buy a new PC. Plus, in the future I imagine it will help mobile VR.
  • snowdogsnowdog Posts: 7,503 Valuable Player
    YoLolo69 said:
    The Oculus dev team still amaze me with their imagination and creativity; it's pretty rare to go that deep and far to provide a better experience instead of relying only on new hardware.
    That's what @vannagirl said! :o:pB)
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    Copied from my thread, which I opened in haste as I panicked about this news.

    :(

    I read this article and see similar technology to Nvidia's depth buffer 3D (aka fake 3D), which is absolutely shocking. Similar to Crysis 2 and 3's awful in-house 3D. Basically, lots of 2D assets pushed to depth, like 2D characters in pop-up books. Not good.

    I come from 3D Vision. One of the things that killed 3D, I think, was the depth buffer approach. It does not blow you away with S3D; frankly, it turns you off it.

    Hopefully that image they chose to use of those ball things was just very poorly chosen and isn't remotely representative. If it is representative [*shock horror*] you will be able to see the scene as lots of 2D objects pushed to depth. Obviously, for the scene with the spheres to be rendered accurately you would see halos to various degrees all over the spheres.
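The halo worry above is easy to demonstrate with a toy example. This is a sketch of Nvidia-style single-image depth-buffer reprojection, not anything from Oculus' code; the 1D scene, the simplified disparity model, and all numbers are invented for illustration. Warping one eye's image to the other using only its depth buffer leaves holes where the background behind a near object was never rendered:

```python
# Toy 1D depth-buffer reprojection: warp a left-eye image to the right
# eye by shifting each pixel by its disparity (nearer pixels shift more).
# Pixels no source sample lands on are disocclusion holes - the "halos".

def reproject(colors, depths, baseline_px):
    width = len(colors)
    out = [None] * width                 # None marks a hole
    out_depth = [float("inf")] * width
    for x in range(width):
        disparity = round(baseline_px / depths[x])  # simplified pinhole model
        tx = x - disparity
        if 0 <= tx < width and depths[x] < out_depth[tx]:
            out[tx] = colors[x]          # nearest surface wins the pixel
            out_depth[tx] = depths[x]
    return out

# A far wall ('.') with a near sphere ('O') in front of its middle.
colors = list("....OOOO....")
depths = [8.0] * 4 + [2.0] * 4 + [8.0] * 4

right = reproject(colors, depths, 8.0)
print("".join(c if c else "?" for c in right))  # OOOO???....?
```

The run of `?` beside the sphere is the halo: background the right eye should see but the left eye never rendered, which a reprojection-only pipeline has to fake, typically by stretching neighbouring pixels.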

    Has anyone seen this in action?
  • bigmike20vtbigmike20vt Posts: 4,153 Valuable Player
    Not seen it in action, but I have faith that Oculus wouldn't blow their load and kill VR on a feature which doesn't work for the sake of a 20% improvement. ATW worked. ASW worked. I have faith this will too.
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    edited August 2017
    Hope so!

    It does say depth buffer, and that picture...  Who's Oculus' quality control? I know 100% that Palmer knew what he was talking about when it came to 3D quality. He was still around for ATW and ASW. He is not around now.

    Sorry but depth buffer is a right load of poo.

  • bigmike20vtbigmike20vt Posts: 4,153 Valuable Player
    Palmer Luckey helped start this whole thing off, and for that I will forever be grateful, but now the ball is rolling, if push came to shove and it was a choice between PL and John Carmack as a software guy improving the features, I think I would take Carmack.
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    edited August 2017
    Yeah, forgot about Carmack; he is cool (he made Doom 3's S3D).

    I think depth buffer is a bit different to ATW and ASW though; don't they predict where the pixels go for missing frames? If so, that's different to depth buffer. With ATW and ASW you get the missing frame's predicted information injected into the scene, but depth buffer removes information.

    You need all the information from both eyes' viewpoints for decent S3D. I'm not sure how you would achieve that 20% performance gain without some information loss. The performance boost with depth buffer was good, but the information loss had a rude impact on image definition.

    I am concerned because this sounds just like depth buffer 3D to me. The article says depth buffer, and the picture shows a *single* halo for a sphere. The only difference between this and Nvidia's depth buffer is that they correct the halo effect on the final pass.

    Soz but warning bells are chiming.
  • vannagirlvannagirl Posts: 2,007 Valuable Player
    snowdog said:
    YoLolo69 said:
    The Oculus dev team still amaze me with their imagination and creativity; it's pretty rare to go that deep and far to provide a better experience instead of relying only on new hardware.
    That's what @vannagirl said! :o:pB)
    hahaha i go get my car



    is it possible this is some new tech oculus are coming out with, and not them watering down our current vr?

    do i even know what i am talking about??
    Look, man. I only need to know one thing: where they are. 
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    vannagirl said:
    is it possible this is some new tech oculus are coming out with, and not them watering down our current vr?

    do i even know what i am talking about??
    Obviously yes! You've just summed up my fears in one sentence
  • snowdogsnowdog Posts: 7,503 Valuable Player
    I think your worries are unfounded @andysonofbob because Oculus have the best in the business working for them and they know their onions. As I've mentioned a few times on here before Oculus are so far ahead of everyone else in terms of VR R&D that it's ridiculous, they won't release something unless it's the dog's bollocks. We're not talking about Valve here lol
  • snowdogsnowdog Posts: 7,503 Valuable Player
    @vannagirl Getting your car won't do you much good. I have now moved from living in your crawlspace to living inside your car. If you've got a spare few minutes a cup of tea and a sarnie would be great and an empty bottle would be even better. I'm dying to go to the loo! :o>:)
  • vannagirlvannagirl Posts: 2,007 Valuable Player
    Hahahaha

    You make me laugh @snowdog you never miss any chance to bring the pain to valve
  • AndyW1384AndyW1384 Posts: 307
    Trinity
    edited August 2017
    vannagirl said:
    is it possible this is some new tech oculus are coming out with, and not them watering down our current vr?

    do i even know what i am talking about??
    Obviously yes! You've just summed up my fears in one sentence

    I guess the fact they leave it up to developers to decide which frames or even materials get reprojection applied means it's just another tool for developers to use or not, depending on their particular game's needs and their own quality standards.

    I'm sure it'll introduce artefacts, just like ASW does. Reprojection artefacts may be worse in some situations, it'll be up to developers to learn how to minimise those, or when not to use reprojection at all.

    I'm sure there'll be some developers who'll eschew its use entirely, and expect people to have triple 1080Ti's to play that developer's game at max settings ;)

    [edit]
    It's quite possible a lot of games will only start enabling reprojection when framerates drop too low, or will turn it on when users select lower-quality in exchange for higher framerates. But it's clear this isn't going to be foisted by Oculus onto all developers.
  • flexy123flexy123 Posts: 793
    3Jane
    edited August 2017
    I understand the concerns, but I think if they get this right WITHOUT compromising visual quality, it means even lower hardware requirements... this is a GOOD thing.
    That being said, I am still wondering whether the Oculus SDK has an equivalent to Valve's "The Lab" shader. It's the best VR MSAA shader, and Valve made it open source; it's on the Unity Asset Store. It's awesome because it's a forward-rendered MSAA renderer that self-adjusts its quality.
    https://www.assetstore.unity3d.com/en/#!/content/63141
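For context, the self-adjusting quality flexy123 mentions is, at heart, a feedback loop on GPU frame cost: drop the eye-buffer render scale fast when the GPU nears its budget, creep it back up when there is headroom. A minimal sketch; the thresholds, step sizes, and clamp range below are invented placeholders, not Valve's published values:

```python
# Hypothetical adaptive render-scale controller in the spirit of
# The Lab's adaptive quality. All constants are illustrative only.

def adapt_render_scale(scale, gpu_frame_ms, budget_ms=11.1):
    """One control step: back off quickly near the budget (90 FPS ~= 11.1 ms),
    raise slowly when there is comfortable headroom, and clamp the result."""
    if gpu_frame_ms > budget_ms * 0.9:       # danger zone: drop fast
        scale *= 0.8
    elif gpu_frame_ms < budget_ms * 0.7:     # headroom: creep back up
        scale *= 1.05
    return max(0.65, min(1.4, scale))

scale = 1.0
for cost_ms in [12.0, 12.0, 7.0, 7.0, 7.0]:  # simulated GPU timings
    scale = adapt_render_scale(scale, cost_ms)
print(round(scale, 3))  # 0.752
```

The asymmetry (fast drop, slow rise) is the important design choice: missing the 11.1 ms deadline causes a dropped frame immediately, while running at slightly reduced resolution for a few frames is barely noticeable.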


  • kzintzikzintzi Posts: 1,068
    Wintermute
    snowdog said:
    @vannagirl Getting your car won't do you much good. I have now moved from living in your crawlspace to living inside your car. If you've got a spare few minutes a cup of tea and a sarnie would be great and an empty bottle would be even better. I'm dying to go to the loo! :o>:)
    @vannagirl, they have a hangie thing for your car for this sort of thing these days.. what was it again.. ahh, that's right it's called a tow-rope :tongue:
    Though you are more than slightly incoherent, I agree with you Madam,
    a plum is a terrible thing to do to a nostril.
  • vannagirlvannagirl Posts: 2,007 Valuable Player
    Haha @kzintzi

    Genius i like how you think
  • snowdogsnowdog Posts: 7,503 Valuable Player
     :o:(  
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    edited August 2017
    AndyW1384 said:

    I guess the fact they leave it up to developers to decide which frames or even materials get reprojection applied means it's just another tool for developers to use or not, depending on their particular game's needs and their own quality standards.

    I'm sure it'll introduce artefacts, just like ASW does. Reprojection artefacts may be worse in some situations, it'll be up to developers to learn how to minimise those, or when not to use reprojection at all.

    I'm sure there'll be some developers who'll eschew its use entirely, and expect people to have triple 1080Ti's to play that developer's game at max settings ;)

    [edit]
    It's quite possible a lot of games will only start enabling reprojection when framerates drop too low, or will turn it on when users select lower-quality in exchange for higher framerates. But it's clear this isn't going to be foisted by Oculus onto all developers.
    Soz about bringing this up again - I have seen the damage depth buffer reprojection has done to gaming 3D before.

    @AndyW1384
    ASW has artifacts, but at least ASW replaces information with what it thinks should be there in true 3D. The artifacts are where it guesses wrong.
    Depth buffer reprojection removes information. It's instantly obvious when used. Its moniker of fake 3D is not a term of endearment.


    That said, I was wondering how it could be done... The fact it saves 20% made me think they might not be doing it every frame, maybe only one frame in three or something:
    F1 - Stereo
    F2 - Stereo
    F3 - Reprojection

    That might not be too noticeable?
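That alternating schedule is trivial to sketch. To be clear, this is only andysonofbob's speculation: the technique the Oculus blog describes reprojects between the two eyes within a single frame, not across successive frames. A hypothetical frame labelling:

```python
# Hypothetical frame schedule: render full stereo for two frames,
# reproject the third from the previous stereo pair.

def schedule(num_frames, stereo_every=3):
    """Label each frame 'stereo' (full render) or 'reproject'."""
    return ["reproject" if (f + 1) % stereo_every == 0 else "stereo"
            for f in range(num_frames)]

print(schedule(6))
# ['stereo', 'stereo', 'reproject', 'stereo', 'stereo', 'reproject']
```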

    As for leaving it to the devs - you're joking, right? Crytek with Crysis 2 and 3? Deus Ex? Civ 6? When devs control 3D it is rarely good. In over ten years of 3D gaming I have observed that for every developer who does 3D well, there are far more who do it badly.

    Challenge: there are hundreds of games with depth buffer reprojection. Show me one game where stereoscopic 3D has worked (patched or vanilla) - just one PC game - where depth buffer reprojection has come anywhere close to its true S3D counterpart.
    Betcha can't!
  • AndyW1384AndyW1384 Posts: 307
    Trinity
    andysonofbob said:
    [snip]

    As for leaving it to the devs - you're joking, right? Crytek with Crysis 2 and 3? Deus Ex? Civ 6? When devs control 3D it is rarely good. In over ten years of 3D gaming I have observed that for every developer who does 3D well, there are far more who do it badly.

    Challenge: there are hundreds of games with depth buffer reprojection. Show me one game where stereoscopic 3D has worked (patched or vanilla) - just one PC game - where depth buffer reprojection has come anywhere close to its true S3D counterpart.
    Betcha can't!
    I'm not sure that this argument holds water. That some (many) game developers have poor standards isn't a good reason for game engines to exclude valid techniques or tools just because they can be misused. Lots of developers have shoved ridiculous, ugly bloom effects into their games or produced fugly character models or animations. That doesn't mean the shaders used to implement bloom are a bad thing in themselves, nor the modelling or animation suites used by the developers.

    Yes, developers are going to be mucking things up in VR, just as many of them muck up other aspects of games. That's why we read reviews and avoid the turkeys. And, yes, while developers learn what does and doesn't work with VR they'll be mucking things up more.

    I would be interested to know what, if any, differences there are between Oculus' Stereo Shading Reprojection and Crytek's Screen Space Reprojection Stereo. From what I've read about the Crytek method it was way faster than what Oculus are claiming for theirs. I was also interested in the limitations Oculus described in their blog, including the comment that it's only worth using if shaders are complex.

    Anyway, until there's some evidence that this particular technique, implemented the way Oculus have, always and inevitably produces rubbish VR then I think it's a bit too early to be panicking. I'll wait for some independent developer to put it through its paces - then I'll start panicking.

    Unfortunately I can't even check the 4 games you quoted (let alone the hundreds of others you didn't) as I don't have a stereoscopic monitor/glasses setup, so I'll have to decline your challenge.
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    With respect, if it is like Nvidia's reprojection, it's not an effect that can be misused by devs.

    For Nvidia's reprojection, the driver taps into the depth buffer and carries out the scene reprojection from there. If you go to the Nvidia forums you can see how to apply the technique to any game. In other words, it's not a matter of degree; it's on or off. Devs can't really affect it.

    Obviously, I'm really hoping to be proved wrong here. Hopefully Oculus' depth buffer reprojection is sufficiently different from the fake-3D depth buffer reprojection used by Nvidia to be effective without watering down the experience.

    PS I didn't mean Civ 6, I meant Civ: Beyond Earth. :/
  • Thane_Thane_ Posts: 242
    Nexus 6
    Long-time S3D gamer here, and I share andysonofbob's concerns. I chose to play Crysis 2 in 2D over its depth buffer 3D; I'd compare the result to a 2D-to-3D movie conversion. Not saying it can't work, and the fact it's only a 20% gain does sound more promising.
  • ZenbaneZenbane Posts: 15,155 Valuable Player
    edited August 2017
    I somehow missed this discussion when it was first posted, but a new article came out this morning reinforcing it:

    According to Oculus, 26% performance improvement was observed while testing the technology in Unity. The developers tested the technology using GTX 1080 and AMD R920 and observed similar improvements

    http://www.techleer.com/articles/249-oculus-to-use-rendering-technology-for-enhancing-performance/

    Pretty cool, can't wait to see this become part of a new standard.

    Are you a fan of the Myst games? Check out my Mod at http://www.mystrock.com/
    Catch me on Twitter: twitter.com/zenbane
  • cyberealitycybereality Posts: 26,156 Oculus Staff
    edited August 2017
    I agree that the stereo reprojection technique used in Crysis 2 was not great quality, even though it was extremely fast. This new method from Oculus is more advanced than that, though I am not 100% familiar with the technical aspects (of either method) so it's hard to say definitively what's what. From what I read on the blog, it appears the Oculus method still renders depth buffers for both eyes and only uses reprojection to transfer the color and shading from one eye to the other. The Crytek method was only using a single render (for example, from the center eye) and reprojecting left and right camera shifts basically as a post-process effect, which is a little different and probably lower quality. In addition, Crysis 2 wasn't doing much about occlusion, so you end up with stretched pixels, causing jarring visual artifacts, while this new method re-renders the occluded pixels so you don't lose any information.
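Reading that description, the per-pixel decision could look something like the sketch below. This is an interpretation of the blog post, not Oculus' actual shader; the 1D depth buffers and the agreement tolerance are invented. Colour warped over from the left eye is reused where its depth agrees with the right eye's own depth pre-pass, and re-shaded where no sample landed or the depths disagree (an occlusion change):

```python
# Sketch of a reuse-vs-reshade test for stereo shading reprojection.
# EPS is a made-up tolerance; a real implementation would tune this.

EPS = 0.05

def classify_pixels(right_depth, reprojected_depth):
    """For each right-eye pixel, decide whether the colour warped over from
    the left eye is valid ('reuse') or must be re-shaded ('reshade')."""
    decisions = []
    for own, warped in zip(right_depth, reprojected_depth):
        if warped is None or abs(own - warped) > EPS:
            decisions.append("reshade")  # hole or occlusion mismatch
        else:
            decisions.append("reuse")    # cheap: copy left-eye shading
    return decisions

right_depth       = [8.0, 8.0, 2.0, 2.0, 8.0]
reprojected_depth = [8.0, None, 2.0, 2.02, 2.0]  # None = no sample landed

print(classify_pixels(right_depth, reprojected_depth))
# ['reuse', 'reshade', 'reuse', 'reuse', 'reshade']
```

Only the 'reshade' pixels pay the full shader cost, which would explain why the saving scales with shader complexity and why the blog says the technique is only worth using when shaders are expensive.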
    AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i
    Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    edited August 2017
    Thane_ said:
    Long time s3d gamer here and i share andysonofbob's concerns. I choose to play Crysis 2 in 2D over their depth buffer 3D. I'd compare the result to a 2d to 3d movie conversion. Not saying it cant work and only a 20% gain sounds better.
    Phew!  Glad I am not the only one who has seen depth buffer reprojection in action and is concerned about this news as a result. :/

    Zenbane said:
    I missed this discussion when it was first posted somehow, but a new article came out this morning reinforcing this:

    According to Oculus, 26% performance improvement was observed while testing the technology in Unity. The developers tested the technology using GTX 1080 and AMD R920 and observed similar improvements

    http://www.techleer.com/articles/249-oculus-to-use-rendering-technology-for-enhancing-performance/

    Pretty cool, can't wait to see this become part of a new standard.

    Thanks for the article; I think it is the same as Nvidia's fake 3D:

    To tackle this issue, Oculus has come up with the new rendering technology known as Stereo Shading Reprojection. In the usual VR applications, an image is rendered twice, one for the right eye and one for the left one. But, the Stereo Shading Reprojection technique allows the pixels to be rendered just once. Then, the rendered pixels are projected to the other eye which reduces the time and resource requirement that was wasted in rendering the image twice.

    But they do say 'new'. It really does sound the same as Nvidia's depth buffer reprojection technique (they are even using the same names!). I think the difference might be a more sophisticated halo-correction algorithm. But that doesn't get around the fact that you will be losing visual information.

    I agree that the stereo reprojection technique used in Crysis 2 was not great quality, even though it was extremely fast. This new method from Oculus is more advanced than that, though I am not 100% familiar with the technical aspects (of either method) so it's hard to say definitively what's what. From what I read on the blog, it appears the Oculus method still renders depth buffers for both eyes and only uses reprojection to transfer the color and shading from one eye to the other.
    The blog Zenbane posted mentions pixels being rendered once.

    @cybereality
    What would be the chances of someone knocking up a quick Unity asset-flip-like demo that uses the technique - a third-person scene with maybe a road going off into the distance, and the standard cloudscape with the sun in the distance? If it is just a bunch of code you need to add, and if Unity is anything like Unreal Engine, which provides a bunch of demo maps you can tinker with, I think even one of the starter packs' demo maps would do. :)

    I am really sorry for being such a downer on this but if it looks like a duck, sounds like a duck, walks like a duck and all that...

    EDIT
    The reason I am so concerned is that I have played Elite Dangerous, Assetto Corsa, Robo Recall, Alien: Isolation, and a shedload of other games. They're all awesome! The Rift is the best toy I have ever had, and I am just concerned that the hunt for more efficiency SOMETIMES really does come at a cost.

    I think the reason ASW is so good is that it maintains visual information at the cost of clamping frames to 45 FPS, and the reason Nvidia's and Crytek's reprojection systems suck is that they remove visual information.
  • cyberealitycybereality Posts: 26,156 Oculus Staff
    You can read the Oculus blog if you want to understand more about how it works.
    https://developer.oculus.com/blog/introducing-stereo-shading-reprojection-for-unity

    Particularly this line: 
    • It is still stereoscopically correct, i.e. the sense of depth should be identical to normal rendering
  • andysonofbobandysonofbob Posts: 247
    Nexus 6
    Thanks! It looks like the blog has a link to a sample of it running in Unity. I'll give it a lookie when I get back.

    :)
  • WildtWildt Posts: 2,266 Valuable Player
    Reading the Reddit thread reveals some legit concerns about this approach. I share them.
    PCVR: CV1 || 4 sensors || TPcast wireless adapter || MamutVR Gun stock V3
    PSVR: PS4 Pro || Move Controllers || Aim controller
    WMR: HP Reverb