
How to integrate OVROverlay and Avatar SDK correctly

sycx
Explorer
We are using an OVROverlay Underlay Cubemap as our skybox.

Today we are integrating the Oculus Avatar SDK.
We found that the Avatar was overdrawn by the Cubemap and became completely invisible.

If we disable OVROverlay, the Avatar renders fine.

How do we get the OVROverlay to render behind the Avatar?
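
For context, the overlay is configured roughly like this (just a sketch; the OVROverlay field names are the ones from the Oculus Utilities version we are on, so double-check them against your own OVROverlay.cs):

```csharp
// Rough sketch of our skybox setup -- field names taken from the Oculus
// Utilities we use; verify against your own OVROverlay.cs.
using UnityEngine;

public class SkyboxUnderlay : MonoBehaviour
{
    public Cubemap skyboxCubemap;

    void Start()
    {
        var overlay = gameObject.AddComponent<OVROverlay>();
        overlay.currentOverlayType  = OVROverlay.OverlayType.Underlay;    // composited behind the eye buffer
        overlay.currentOverlayShape = OVROverlay.OverlayShape.Cubemap;    // full skybox shape
        overlay.textures = new Texture[] { skyboxCubemap, skyboxCubemap }; // same cubemap for both eyes
    }
}
```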
5 REPLIES

jerrytian
Protege
Hi, 

I am on the same project as @sycx, and after some discussion I can add a few more details here.

We are aware of the quoted documentation, and the cubemap overlay is indeed set as an "underlay", which, as we understand it, is rendered behind the eye buffer.

We are using the same setup illustrated in the documentation (see the image at the bottom of this comment), and the avatar should be rendered into the eye buffer, but it just doesn't show up. However, if we draw a normal cube in front of the cubemap underlay, it shows up as expected.

@sycx will prepare a bare-bones project to reproduce the problem and upload it to GitHub when it is ready; hopefully our understanding of the mechanism is correct.

[Attached image: jvcwjmm0b99u.png — overlay setup illustration from the documentation]





sycx
Explorer
Here is a demo:
 https://github.com/sycx/AvatarDemo

We simply put a cube, an OvrAvatar, and an OVROverlay in the scene.




Pressing OVRInput.Button.One shows/hides the Overlay.
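
The toggle is roughly the following (a sketch, not the exact script from the repo):

```csharp
// Sketch of the overlay toggle used in the demo (not the exact repo script).
using UnityEngine;

public class OverlayToggle : MonoBehaviour
{
    public OVROverlay overlay;

    void Update()
    {
        // Enable/disable the OVROverlay component when Button.One is pressed.
        if (OVRInput.GetDown(OVRInput.Button.One))
            overlay.enabled = !overlay.enabled;
    }
}
```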

With this demo running on Gear VR, we can see the cube and the OVROverlay texture, but no avatar.
If we disable OVROverlay, we can see both the cube and the avatar.

We are not sure how to apply the OVRUnderlayTransparentOccluder shader to the OvrAvatar.
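
The only idea we had was to force-swap the shader onto every renderer the avatar creates, roughly like the sketch below, but that throws away the avatar's own shading, so it is presumably not the intended approach. (The shader name string is a guess; the avatar meshes also load asynchronously, so this would have to run after they appear.)

```csharp
// Naive attempt (sketch): force the occluder shader onto every avatar renderer.
// The shader name string is a guess -- check OVRUnderlayTransparentOccluder.shader
// for the real Shader "..." path. Must run after the avatar meshes have loaded.
using UnityEngine;

public class AvatarShaderSwap : MonoBehaviour
{
    public GameObject avatarRoot; // the OvrAvatar GameObject

    public void Swap()
    {
        var occluder = Shader.Find("OVRUnderlayTransparentOccluder"); // name may differ
        if (occluder == null) return;

        foreach (var r in avatarRoot.GetComponentsInChildren<Renderer>())
            foreach (var m in r.materials)
                m.shader = occluder;
    }
}
```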

JianZ
Expert Protege
hey @sycx

To make transparency work with an underlay, the eye-buffer layer needs to write alpha properly. In your case, the avatar shader doesn't seem to be writing alpha; I'm passing this on to our Avatar SDK team.

However, even if the avatar could write alpha, this combination (underlay cubemap + eye buffer transparency) can get really tricky for many reasons as your project grows in scale, so heed these warnings:

1. All transparency shaders have to write alpha. This might not be feasible for materials you don't control; a workaround is to render a second pass that only writes alpha, but you need to set up the alpha-pass geometry to stay in sync with the visual geometry (see the sketch after this list).

2. Cases with multiple layers and multiple blending modes get complicated, since you can only store one alpha value in the eye buffer, and how to generate and composite those values is tricky.
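
To illustrate point 1, here is one way the alpha-only second pass could be kept in sync with a skinned mesh. This is only a sketch under some assumptions: "alphaOnlyMaterial" stands in for a material whose shader writes only the alpha channel (e.g. via ColorMask A) and renders after the visual pass; nothing like it ships with the integration, you would have to author it.

```csharp
// Sketch of the workaround in point 1: mirror the visual skinned mesh with a
// second renderer that uses an alpha-only material, so the alpha pass always
// uses the same skinned geometry. "alphaOnlyMaterial" is a placeholder for a
// hypothetical material whose shader writes only alpha (e.g. ColorMask A).
using UnityEngine;

public class AlphaPassMirror : MonoBehaviour
{
    public SkinnedMeshRenderer visualRenderer; // renderer whose material you can't change
    public Material alphaOnlyMaterial;         // hypothetical alpha-only material

    void Start()
    {
        // Cloning only the renderer's GameObject keeps its bone references
        // pointing at the original rig, so the copy deforms in sync with it.
        var clone = Instantiate(visualRenderer.gameObject, visualRenderer.transform.parent);
        var alphaRenderer = clone.GetComponent<SkinnedMeshRenderer>();
        alphaRenderer.sharedMaterial = alphaOnlyMaterial;
    }
}
```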

JasonWalters
Protege
Hi @imperativity, @JianZ, 

I am struggling with this issue too. Here is my scenario:
1. I'm using a Quad OVROverlay for my UI, which appears about 2-3 meters from the player.
2. The player's OVRAvatar hands appear behind the UI.

Does Oculus use OVROverlay for the UI in Home? If so, is it possible to share the solution? I have read the documents posted above but am still not clear on how to resolve this. Thanks!

JianZ
Expert Protege
hey @JasonWalters

We do use overlays in Gear VR Home, and in fact everything in Gear VR Home is rendered as an overlay, so we can sort them correctly.

For PC, I don't think we use overlays much (if at all). I agree it's not that easy to use overlays on PC, especially when mixing them with OVRAvatar rendering; you'll have to work out some tricks like the ones I mentioned above. Fortunately, our SDK will support depth compositing soon, which means overlays will be able to do depth testing in a future SDK.

-Best