Hi all, I have a Unity scene with a mixture of regular geometry and two background spheres: Left360 and Right360. The geometry is totally normal 3D scene geometry. The background spheres are seen only by the left and right cameras respectively, via culling layers, in order to get the proper stereo depth effect.
With "Use Per Eye Cameras" disabled, I only have the option of seeing both spheres together, or one of them at a time. With "Use Per Eye Cameras" enabled, the 360 spheres appear correctly, but positional tracking, IPD, FOV, and depth in general are mega broken.
I am unable to find any combination of settings that fixes the problem. Surely I can't be the only person who needs mixed culling layers with a tracked Oculus headset? Has anyone else found a solution to this?
Just an extra note - I have all plugins, utilities etc. updated to the latest versions. I'm using the Quest 2, Unity 2019.3.14f1, the regular 3D template, and the new XR Plugin Management system.
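For anyone trying to reproduce the setup above, here's a minimal sketch of per-eye culling with two cameras (assumptions on my part: layers named "Left360" and "Right360" exist in the project, and the two cameras are assigned in the inspector; this is the general idea, not my exact scene):

```csharp
// Sketch: one camera per eye, each hiding the other eye's sphere.
using UnityEngine;

public class PerEyeCullingSetup : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;

    void Start()
    {
        // Each camera renders to a single eye only.
        leftEye.stereoTargetEye = StereoTargetEyeMask.Left;
        rightEye.stereoTargetEye = StereoTargetEyeMask.Right;

        // Remove the opposite eye's background-sphere layer
        // from each camera's culling mask.
        leftEye.cullingMask &= ~(1 << LayerMask.NameToLayer("Right360"));
        rightEye.cullingMask &= ~(1 << LayerMask.NameToLayer("Left360"));
    }
}
```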
Managed to get around the problem for my simple stereo 360 setup via the attached shader. I think its CustomEditor doesn't exist, which is why it shows the normally hidden "3D Layout" and "Mirror on Back" parameters.
- Just to update this with extra info: I think the reason the default panoramic skybox shader currently won't show the stereo options is that it doesn't recognise the XR Plugin Management system as a valid stereo setup, only the legacy XR settings in Project Settings.
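Since the attached shader isn't reproduced in this thread, here's a rough sketch of the general technique it relies on: branching on `unity_StereoEyeIndex` so a single camera shows each eye a different texture (property names `_LeftTex`/`_RightTex` and the shader name are my own; the stereo macros are Unity's standard single-pass instancing ones):

```shaderlab
Shader "Unlit/Stereo360Sketch"
{
    Properties
    {
        _LeftTex ("Left Eye", 2D) = "white" {}
        _RightTex ("Right Eye", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue"="Background" "RenderType"="Background" }
        ZWrite Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _LeftTex;
            sampler2D _RightTex;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
                UNITY_VERTEX_INPUT_INSTANCE_ID
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                UNITY_VERTEX_OUTPUT_STEREO
            };

            v2f vert (appdata v)
            {
                v2f o;
                UNITY_SETUP_INSTANCE_ID(v);
                UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
                // unity_StereoEyeIndex: 0 = left eye, 1 = right eye
                return unity_StereoEyeIndex == 0
                    ? tex2D(_LeftTex, i.uv)
                    : tex2D(_RightTex, i.uv);
            }
            ENDCG
        }
    }
}
```

This approach sidesteps "Use Per Eye Cameras" entirely, so tracking and IPD behave normally.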
I have the exact same issue. I'm trying to use "Use Per Eye Cameras" to render a stereo image (not a skybox, just a regular square image). I have no idea how to do this with the one-camera setup. When I build, my hands are small or not in the right position. Do you know where I can find a shader that will work on a regular image, rather than the skybox in the Lighting settings? Thanks!