I am developing a game in Unity using the OVRCameraRig with the per-eye camera setting, but I have run into a problem with the per-eye camera positions.
One effect in my game involves raycasting from the individual left/right eye positions to determine the world positions of certain elements. These elements are rendered to each eye separately so they don't appear "out of focus" as the user shifts their gaze from near to far.
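For context, the per-eye placement I'm describing looks roughly like this (a minimal sketch, assuming the rig's standard `leftEyeAnchor`/`rightEyeAnchor` transforms; `leftEyeElement`/`rightEyeElement` are hypothetical names for the two per-eye copies of an element):

```csharp
using UnityEngine;

// Sketch only: positions a per-eye copy of an element at whatever each
// eye's forward ray hits, so each eye sees it at its own convergence point.
public class PerEyeFocusPlacer : MonoBehaviour
{
    public OVRCameraRig rig;            // assigned in the inspector
    public Transform leftEyeElement;    // copy on a layer rendered only to the left eye
    public Transform rightEyeElement;   // copy on a layer rendered only to the right eye

    void LateUpdate()
    {
        Place(rig.leftEyeAnchor, leftEyeElement);
        Place(rig.rightEyeAnchor, rightEyeElement);
    }

    void Place(Transform eye, Transform element)
    {
        // Cast from the eye position along its forward vector and park the
        // element at the hit point (fall back to a fixed distance on a miss).
        if (Physics.Raycast(eye.position, eye.forward, out RaycastHit hit))
            element.position = hit.point;
        else
            element.position = eye.position + eye.forward * 10f;
    }
}
```

The effect obviously depends on `leftEyeAnchor` and `rightEyeAnchor` actually being at the two separate eye positions, which is exactly what seems to break in the build.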
This effect works well when testing in the editor, but not in a build: the elements drift out of focus as the user changes their gaze.
The problem appears to be the left/right eye "anchor" positions of the OVRCameraRig: in the build they both seem to sit at the center eye position, while in editor testing they are properly separated. To verify this, I placed simple debug objects in front of each eye. The objects have no parent; their positions are manually updated each frame to sit just in front of each eye anchor, and each object is on a layer rendered by only one eye. In the editor, each object appears centered in its own eye's view, but in the build both objects appear together, centered down in front of the user's nose.
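The debug setup is essentially the following (a sketch under the same assumptions as above; the `Debug.Log` line is something I'd suggest adding, since in a build it directly reports whether the anchors are separated or collapsed to the center):

```csharp
using UnityEngine;

// Sketch only: unparented markers held just in front of each eye anchor,
// plus a log of the anchor separation to check the anchors in a build.
public class EyeAnchorProbe : MonoBehaviour
{
    public OVRCameraRig rig;
    public Transform leftMarker;   // layer culled to the left eye camera only
    public Transform rightMarker;  // layer culled to the right eye camera only
    public float distance = 0.5f;  // meters in front of each eye

    void LateUpdate()
    {
        leftMarker.position = rig.leftEyeAnchor.position
                            + rig.leftEyeAnchor.forward * distance;
        rightMarker.position = rig.rightEyeAnchor.position
                             + rig.rightEyeAnchor.forward * distance;

        // Expect roughly the IPD (~0.06 m); ~0 would mean centered anchors.
        float sep = Vector3.Distance(rig.leftEyeAnchor.position,
                                     rig.rightEyeAnchor.position);
        Debug.Log($"Eye anchor separation: {sep:F4} m");
    }
}
```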
Is this the expected behavior? When using per-eye cameras, should the actual transform positions of the left/right eye anchors be separate, or centered?
If not, what could be causing this?
Also, any insight into why this would differ between the editor and a build would be helpful.