5.1.2p2 - lots of VR fixes

Anonymous
Not applicable
I just noticed the 5.1.2p2 patch is out with lots of fixes. Downloading now:

http://unity3d.com/unity/qa/patch-releases

Improvements

VR: Added an option to allow Cameras to render specifically to left/right eye for stereo 3D.

Fixes

(695727) - VR Oculus: Camera Viewport Rect now takes up the appropriate amount of Screen space.
(703281) - VR Oculus: Fixed crash when trying to access VRSettings.GetNativePtr while the Oculus HMD was not connected.
(none) - VR Oculus: HMD can now be connected to application after it has already started. It can also be disconnected and reconnected during application run time.
(none) - VR Oculus: Update to latest oculus dependencies.
(none) - VR: Camera transforms now reset when VRSettings.enabled is false or the loadedDevice is set to none. Camera parent transforms are not affected.
(none) - VR: Fixed stereo rendering regression.
(none) - VR: Improved MSAA performance by only antialiasing eye textures, not the composited back buffer.
(691345) - VR: Reenable vsync when device is disconnected.
(none) - VR: Virtual Reality Supported toggle in player settings no longer recreates the Graphics Device. VRSettings defaults to enabled and oculus if Virtual Reality is supported.

patrickbulman
Honored Guest
Heads up, I'm experiencing an issue with the Unity Pro Water reflection/refraction cameras being transformed by head tracking in this new version.

jmorris142
Explorer
Another heads up: if you use any image effects like bloom, etc., they render differently in this version.

Lupus_Solus
Protege
It's really encouraging the way Unity is pushing its VR dev tools.

phileday
Expert Protege
That's great to hear. I hope to implement it over the weekend then. I make a stereo 3D player, so I'll let you know how I get on with the stereo 3D integration.

phileday
Expert Protege
Actually, having said that, if anyone has any idea how to implement this and can give me any advice, that would be amazing. Previously I put the objects for the left and right eyes in different layers and then used culling on the cameras. I have no idea how it is implemented here.
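
For context, my old per-eye setup looked roughly like this. This is only a sketch; "LeftEyeOnly" and "RightEyeOnly" are example layer names that would need to exist in Tags & Layers:

using UnityEngine;

// Old-style per-eye rendering: objects for each eye live on their own layer,
// and each eye's camera culls away the other eye's layer.
public class PerEyeLayerCulling : MonoBehaviour
{
    public Camera leftCamera;
    public Camera rightCamera;

    void Start()
    {
        int leftLayer = LayerMask.NameToLayer("LeftEyeOnly");
        int rightLayer = LayerMask.NameToLayer("RightEyeOnly");

        // Each camera renders everything except the other eye's layer.
        leftCamera.cullingMask = ~(1 << rightLayer);
        rightCamera.cullingMask = ~(1 << leftLayer);
    }
}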

Anonymous
Not applicable
Based on what I see, you now need to add multiple cameras and select the eye each camera is for from the new dropdown list available on the Camera component. It isn't clear where you put the new cameras, though. I would guess you would remove the camera on the center anchor and replace it with two cameras, one on the left anchor and one on the right. Then it looks more like the legacy integration, where you have separate cameras with separate culling masks that you can use.

I'm guessing you can also use layered cameras (multiple cameras per eye) in combination with this feature. I would like to see if you can use a mix of a shared camera (rendering both eyes) for the main scene and two additional cameras for per-eye effects (a 3D movie overlay, for example). I'm going to play around with this a bit and will report back.
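
If it helps, here's a rough script version of what I'm describing. I'm assuming the new dropdown corresponds to Camera.stereoTargetEye / StereoTargetEyeMask in the scripting API; if that property isn't available in your version, just set the dropdown on the Camera component in the Inspector instead:

using UnityEngine;

// Sketch: two cameras, each told to render only one eye.
public class SeparateEyeCameras : MonoBehaviour
{
    public Camera leftEyeCamera;   // e.g. attached to the left eye anchor
    public Camera rightEyeCamera;  // e.g. attached to the right eye anchor

    void Start()
    {
        // Assumed script equivalent of the new "Target Eye" dropdown.
        leftEyeCamera.stereoTargetEye = StereoTargetEyeMask.Left;
        rightEyeCamera.stereoTargetEye = StereoTargetEyeMask.Right;
    }
}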

ccs

Anonymous
Not applicable
I can confirm both camera layering and separate left/right cameras work well now. The only catch is you have to disable these lines of code in OVRCameraRig.cs (Oculus should remove this now that this is no longer a restriction):

//Debug.LogWarning("Having a Camera on " + c.name + " is deprecated. Disabling the Camera. Please use the Camera on " + leftEyeCamera.name + " instead.");
//c.enabled = false;

If you don't disable those lines of code, you won't be able to stick cameras on the left and right anchors as this code will disable them as soon as you attach them to the gameobject.

This opens a lot of possibilities. I was able to use a main both-eye camera for main scene at depth 0, and then have two depth 1 cameras, one for each eye to accomplish per eye rendering for things that need that (3d movie overlay, special effects, etc.).

The only downside I can see, which others have mentioned, is that there doesn't appear to be a way to turn off tracking for specific cameras. All cameras now track the head, even if they are not on the camera rig and tracking anchors.
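
In script form, the setup I ended up with looks roughly like this. It's only a sketch, assuming the Target Eye dropdown is exposed as Camera.stereoTargetEye; attaching the cameras to the rig's anchors is done in the editor:

using UnityEngine;

// A main both-eye camera renders the shared scene; two depth-1 overlay
// cameras render per-eye content on top of it.
public class BothEyePlusOverlays : MonoBehaviour
{
    public Camera mainCamera;        // on the center anchor, renders both eyes
    public Camera leftOverlayCam;    // on the left anchor
    public Camera rightOverlayCam;   // on the right anchor

    void Start()
    {
        mainCamera.depth = 0;
        mainCamera.stereoTargetEye = StereoTargetEyeMask.Both;

        // Overlay cameras draw after the main camera and only clear depth,
        // so their content composites over the shared render.
        leftOverlayCam.depth = 1;
        leftOverlayCam.clearFlags = CameraClearFlags.Depth;
        leftOverlayCam.stereoTargetEye = StereoTargetEyeMask.Left;

        rightOverlayCam.depth = 1;
        rightOverlayCam.clearFlags = CameraClearFlags.Depth;
        rightOverlayCam.stereoTargetEye = StereoTargetEyeMask.Right;
    }
}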

ccs

wheatgrinder
Explorer
"ccs" wrote:
I can confirm both camera layering and separate left/right cameras work well now. The only catch is you have to disable these lines of code in OVRCameraRig.cs (Oculus should remove this now that this is no longer a restriction):

//Debug.LogWarning("Having a Camera on " + c.name + " is deprecated. Disabling the Camera. Please use the Camera on " + leftEyeCamera.name + " instead.");
//c.enabled = false;

If you don't disable those lines of code, you won't be able to stick cameras on the left and right anchors as this code will disable them as soon as you attach them to the gameobject.

This opens a lot of possibilities. I was able to use a main both-eye camera for main scene at depth 0, and then have two depth 1 cameras, one for each eye to accomplish per eye rendering for things that need that (3d movie overlay, special effects, etc.).

The only downside I can see, which others have mentioned, is that there doesn't appear to be a way to turn off tracking for specific cameras. All cameras now track the head, even if they are not on the camera rig and tracking anchors.

ccs


That sounds interesting. Can you explain to the noob "why" I might want to do this? Can I improve performance by only rendering what is different for each eye? (But then isn't everything different for each eye?) Or does this let me use things like bloom and other camera effects?

Anonymous
Not applicable
I think the potential benefit of the 3-camera approach (one main both-eye camera for the main rendering plus two separate per-eye cameras for overlays/effects) is that you gain the optimizations Unity/Oculus made for single-camera VR (shadow maps rendered once, etc.) for the main scene/object rendering, while still keeping the ability to do per-eye effects as an overlay.

A simple example of this is a movie theater application. The theater itself is rendered by the both-eye camera, and the 3D movie file/stream presented on the virtual screen is rendered to the left and right eyes separately, so each eye gets the correct image from the 3D movie. You could also accomplish this with a 2-camera setup, but you would lose some of the benefits Unity/Oculus added for the shared both-eye camera in the main scene rendering.
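
As a sketch of the culling side of that theater example ("MovieLeft" and "MovieRight" are hypothetical layer names for the quads showing each eye's movie frame):

using UnityEngine;

// Restrict each overlay camera to its own eye's movie layer, and keep the
// main both-eye camera from rendering either movie layer.
public class MovieScreenCulling : MonoBehaviour
{
    public Camera mainCamera;
    public Camera leftOverlayCam;
    public Camera rightOverlayCam;

    void Start()
    {
        int movieLeft = LayerMask.NameToLayer("MovieLeft");
        int movieRight = LayerMask.NameToLayer("MovieRight");

        leftOverlayCam.cullingMask = 1 << movieLeft;
        rightOverlayCam.cullingMask = 1 << movieRight;

        // The main camera ignores both movie layers.
        mainCamera.cullingMask &= ~((1 << movieLeft) | (1 << movieRight));
    }
}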

ccs