
UnityVr vs Oculus Utilities

kilogold Posts: 32
Brain Burst
edited February 2016 in Unity Development
Hi guys!
I've hit a point of confusion...
I'm reading two separate docs: the built-in Unity VR documentation and the Oculus Utilities documentation.
I'm trying to use a single, unified base for my project, but I don't know which one I should stick with (integrating both sounds like a debugging nightmare waiting to happen).
I understand that Unity wants to cover more than just Oculus (Vive & PSVR), but what does that mean for Oculus developers when considering the two integrations?

My immediate questions are:
  • What should I weigh (pros/cons) when considering which one to marry my project to?
  • Is there any indication that Oculus Utilities will eventually fully dissolve into UnityVR?


Any thoughts?

Comments

  • cybereality Posts: 26,156 Oculus Staff
    You would use them together. The Oculus Utilities are not really a stand-alone SDK, but rather helper functions and classes that are built on top of the native Unity VR features.

    Technically you could use the standard Unity built-in VR tools without anything else, and it would work on a basic level. However, you would be missing out on a lot of functionality you may find useful.
    AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i
    Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
  • motorsep Posts: 1,471 Oculus Start Member
    I inquired about the same thing here: viewtopic.php?f=37&t=27874

    Today I tried dropping the OVR Player Controller into the Intro VR scene from Unity. It's not a trivial task. Unity's samples use a lot of scripts that rely on one another, so there's going to be some work to add all the required scripts to the OVR Player Controller and get it working with Oculus Utils.

    I also recall there are a few things from Oculus Utils that are required in order to publish on the Oculus Store. I was hoping Oculus would release a new version of Utils designed to work with Unity's VR scripts out of the box. That way all this tedious setup could be avoided.

    I'd rather use Unity's setup because it's all there - drop in stuff and focus on making your game instead of fiddling with player setup and controls.

    P.S. I deployed the Intro VR scene from Unity to Gear VR and it worked like a charm, with basic UI, interactions, and the Back button working. So, can we build apps/games without using Oculus Utils at all? What precisely do we lose by not using Utils?
  • motorsep Posts: 1,471 Oculus Start Member
    So, is there any good reason not to use Unity's VR examples and drop Oculus Utils completely?
  • cybereality Posts: 26,156 Oculus Staff
    I would not drop Oculus Utilities, as there is some Oculus-specific functionality you may need (for example, if you are developing for Gear VR and wish to put your game on the Store).
  • motorsep Posts: 1,471 Oculus Start Member
    cybereality wrote:
    I would not drop Oculus Utilities, as there is some Oculus-specific functionality you may need (for example, if you are developing for Gear VR and wish to put your game on the Store).

    Well, the problem is that Oculus Utils can't be "married" to Unity's VR examples easily. And the Oculus Utils are bare-bones assets, which means anything beyond standing in place and looking around, or moving with a gamepad, requires knowing C# programming. That means people who are artists, not programmers, are doomed to either learn C# and waste time, so to speak, or team up with programmers, who are few and far between nowadays.

    Unity's VR examples let you create a lot of experiences and/or games because they provide functionality for locomotion, GUI input, etc. So almost everything a VR app/game needs is in there, ready to be used out of the box.

    Could you please specify why Oculus Utils has to be used in order to publish games on the Oculus Store?

    Is it some script that can be easily added to Unity's VR assets just so the app can make it to the store?
  • motorsep Posts: 1,471 Oculus Start Member
    So, why are Utils necessary for an app to be published on the Oculus Store?

    Is it just some script, unrelated to the player controller, that can simply be dropped into Unity's VR example to complete an app?
  • vrdaveb Posts: 1,596 Oculus Staff
    If you are publishing to the Oculus store, there are certain requirements on the manifest, the behavior of the back button, and access to our Universal Menu, pass-through camera, etc, as outlined here: https://developer.oculus.com/documentat ... blish-req/. Some of this is vendor-specific and Unity has chosen not to include support for it in their sample assets. Therefore, you need to use OVRManager in your project, at least for now. Over time, we will work with Unity to make our Utilities more compatible with their Standard Assets, but we aren't there yet. In the meantime, if this is a blocking issue for you, you have all of the relevant code for the Utilities and Unity's Standard Assets, so there is nothing stopping you from modifying both to suit your needs.
  • Can anyone shed any light on marrying the two together? Specifically the Unity VR Sample's Gaze Interaction. I've had a pretty hard time trying to get this to work. I'm assuming it has something to do with the raycast not having proper coordinates once it's been parented to the Oculus Utilities player controller. But as someone stated above, not being a dev myself, it's harder to connect the dots for this simple functionality.
  • motorsep Posts: 1,471 Oculus Start Member
    Can anyone shed any light on marrying the two together? Specifically the Unity VR Sample's Gaze Interaction. I've had a pretty hard time trying to get this to work. I'm assuming it has something to do with the raycast not having proper coordinates once it's been parented to the Oculus Utilities player controller. But as someone stated above, not being a dev myself, it's harder to connect the dots for this simple functionality.

    As vrdaveb said, use OVRManager in the Unity VR examples. It has functions that call built-in Oculus functionality (global menu and such). All you need is to call those functions from OVRManager in the Unity VR scripts where the back button is handled. I recall there are three states: single press, double press, and long press.

    Everything else can be Unity VR. In other words, you load the Unity VR sample, add OVRManager there, and you are good to go. Of course I am oversimplifying - maybe there is a bit more involved than just adding OVRManager and calling its functions :)

    Btw, Unity VR examples run on Gear VR out of the box, without Oculus Utils.
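The OVRManager wiring motorsep describes might look roughly like this (an editorial sketch, not code from either SDK: the long-press threshold is an assumption, double press is omitted, and the PlatformUIGlobalMenu/PlatformUIConfirmQuit method names are from the 0.8-era Utilities and may differ in your version):

```csharp
using UnityEngine;

// Hypothetical sketch of Gear VR back-button handling on top of OVRManager.
public class BackButtonHandler : MonoBehaviour
{
    const float longPressTime = 0.75f; // assumed threshold for a long press
    float downTime = -1f;

    void Update()
    {
        // On Gear VR, the back button is reported to Unity as Escape.
        if (Input.GetKeyDown(KeyCode.Escape))
            downTime = Time.realtimeSinceStartup;

        if (Input.GetKey(KeyCode.Escape) && downTime >= 0f &&
            Time.realtimeSinceStartup - downTime >= longPressTime)
        {
            downTime = -1f;                     // fire only once per press
            OVRManager.PlatformUIGlobalMenu();  // long press: Universal Menu
        }
        else if (Input.GetKeyUp(KeyCode.Escape) && downTime >= 0f)
        {
            downTime = -1f;
            OVRManager.PlatformUIConfirmQuit(); // short press: quit dialog
        }
    }
}
```

Dropping a component like this onto any object in the Unity VR sample scene is the kind of "add OVRManager and call its functions" step described above.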
  • Personally, I started with the Oculus Utils and stripped out whatever I didn't need (i.e. button mapping and such, because I was using another plugin for that), and added the Unity VR stuff in as well. There's lots of good stuff from both companies, and I think it's great we're all sharing our experiences. What's really going to make this tech take off is the collaboration and experimentation of devs trying new things, adopting what's best, and chucking the rest.


    Also, I'm surprised Oculus hasn't put the Oculus Utils in the Unity Asset Store. It'd make updating and pulling them into a project that much easier.
  • motorsep Posts: 1,471 Oculus Start Member
    Depending on how long it takes for Unity to add all the necessary stuff from Oculus Utils into the engine, it might make sense for Oculus to release a stripped-down Utils that can be dropped into the Unity VR examples, with a detailed tutorial on how to hook up the functionality most critical for an app/game to be accepted into the Oculus Store. Although I doubt that will ever happen :roll:
  • motorsep Posts: 1,471 Oculus Start Member
    @vrdaveb: Do you think you guys can make such a stripped-down version of Utils work with Unity's VR examples?
  • vrdaveb Posts: 1,596 Oculus Staff
    We are working with Unity on a plan for reconciling the Utilities and their new VR samples. There is nothing to announce yet, but you are free to modify both.
  • kilogold Posts: 32
    Brain Burst
    vrdaveb wrote:
    We are working with Unity on a plan for reconciling the Utilities and their new VR samples. There is nothing to announce yet, but you are free to modify both.

    Thanks for chiming in!
    This is exactly the info I was looking for :)
  • justin.wasilenko Posts: 33
    Brain Burst
    Is there an update or a timeframe for this?

    Just wondering if I should try to integrate both packages now, or wait a month for a better solution?
  • vrdaveb Posts: 1,596 Oculus Staff
    There is nothing specific to announce at this time. My expectation is that given their newness, the samples will probably change more over the next several months than the utilities will.
  • justin.wasilenko Posts: 33
    Brain Burst
    OK, thanks. I was looking through it and there isn't much that needs to be changed to get it to work. Thanks for the update.
  • Conz Posts: 83
    Hiro Protagonist
    After some testing, I'm not that happy with the Oculus Utilities (including the UI sample).

    - I have trouble with the OVR character controller. In some cases it starts to rotate very slowly on the vertical axis when looking left and right. (It seems to be a problem with calculating the world matrix; in that case the OVR GazePointer has an offset, too.)

    - Why does the GazePointer use RaycastAll and sort the result? Why not a simple ray? The OVR solution is very slow and detects "gazeable" objects through walls. Removing the gazeable layer mask would be way too slow. What is the benefit of detecting more than the closest target?

    - The OVR input solution is proprietary and doesn't build on Unity's "CrossPlatformInput", which makes it really annoying to change the controller settings.

    - The OVR input throws a compiler warning for using deprecated code.

    - I'm not convinced by the mouse pointer logic in the OVR Input.

    I got the best results by using only the UnityVR implementation (and a modified FPSCharacterController) with the classic "CrossPlatformInput" and my own (simple ray) gaze pointer.
    The OVR Utilities are a little bit overcomplicated and don't really follow the classic Unity style. I would welcome deeper cooperation between Oculus and Unity here.
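The "simple ray" gaze pointer mentioned above could be sketched like this (a minimal illustration, not code from either package; the "Gazeable" layer name, reticle field, and max distance are assumptions):

```csharp
using UnityEngine;

// Minimal sketch of a single-ray gaze pointer: one Physics.Raycast against a
// "Gazeable" layer instead of RaycastAll plus sorting. Attach to the camera.
public class SimpleGazePointer : MonoBehaviour
{
    public Transform reticle;       // visual cursor placed at the hit point
    public float maxDistance = 20f; // illustrative gaze range
    int gazeableMask;

    void Start()
    {
        gazeableMask = LayerMask.GetMask("Gazeable");
    }

    void Update()
    {
        // Cast straight ahead from the camera; only the closest hit on the
        // mask is returned. Put walls on the mask too if they should block
        // gaze, since objects outside the mask are ignored entirely.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward,
                            out hit, maxDistance, gazeableMask))
        {
            reticle.position = hit.point;
            reticle.gameObject.SetActive(true);
        }
        else
        {
            reticle.gameObject.SetActive(false);
        }
    }
}
```

This is the trade-off under discussion: a single raycast is cheaper and stops at the first obstacle, while UGUI's GraphicRaycaster deliberately returns every hit in sorted order.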
  • vrdaveb Posts: 1,596 Oculus Staff
    Conz wrote:
    I have trouble with the OVR character controller. In some cases it starts to rotate very slowly on the vertical axis
    You mean OVRPlayerController? Sounds like the dead zone is too tight for your controller. In a future release, we're planning to increase it a little to prevent this. In the meantime, can you change the 0.15f dead zone in OVRInput.cs to 0.2f?
    Conz wrote:
    Why does the GazePointer use RaycastAll and sort the result?
    It sounds like you're using the sample VR UI project from our blog (https://developer.oculus.com/blog/unity ... tem-in-vr/). That isn't part of the Utilities, it's a drop-in replacement for the usual mouse-driven input modules in UGUI. It implements Unity's GraphicRaycaster interface, which returns all hits in sorted order. This is more flexible than Physics.Raycast and Unity probably needs it for a number of corner cases in their UI logic.
    Conz wrote:
    The OVR Input solution is properitary and doesn't depend on the Unity "CrossPlatformInput", what makes it really annoying to change the controller settings.
    OVRInput is designed to be similar to UnityEngine.Input, which drives the undocumented CrossPlatformInput class's StandaloneInput back-end. Unity prefers not to include any Oculus-specific code in their Standard Assets, but we are still working with them to expose this in a cross-platform way.
    Conz wrote:
    I'm not convinced by the mouse pointer logic in the OVR Input.
    Which logic is that?
    Conz wrote:
    - The OVR-Input throws a compiler warning, for using deprecated code.
    I'm unable to reproduce this. What Unity version and Utilities version are you using? What is the warning? Is it in OVRInput.cs or OVRInputModule.cs?
    Conz wrote:
    The OVR Utilities are a little bit over complicated and not really following the classic Unity style. I would welcome a deeper cooperation between Oculus and Unity here.
    Sorry, but I need more detail to understand what is wrong here. Can you describe what part of the workflow is too complicated and how you would like it to change?
  • Conz Posts: 83
    Hiro Protagonist
    Thank you for your response.
    vrdaveb wrote:
    You mean OVRPlayerController? Sounds like the dead zone is too tight for your controller. In a future release, we're planning to increase it a little to prevent this. In the meantime, can you change the 0.15f dead zone in OVRInput.cs to 0.2f?

    I don't think the dead zone is the problem. I don't have a control for vertical movement with the controller. The strange thing is that your gaze pointer (UI project) has an offset too (when the shifting is happening). It detects the colliders offset from where they are in the world.

    vrdaveb wrote:
    It sounds like you're using the sample VR UI project from our blog (https://developer.oculus.com/blog/unity ... tem-in-vr/). That isn't part of the Utilities, it's a drop-in replacement for the usual mouse-driven input modules in UGUI. It implements Unity's GraphicRaycaster interface, which returns all hits in sorted order. This is more flexible than Physics.Raycast and Unity probably needs it for a number of corner cases in their UI logic.

    Yes, that seems to be the reason for my problems. I thought it was the sample scene showing how to use the OVR Utils.

    Maybe RaycastAll is more flexible and needed for some UI effects, but it has some nasty side effects, like detecting gazeable objects through walls and through "non-gazeable" objects. And RaycastAll with sorting seems unnecessarily slow. From a user's point of view, I can't imagine a situation where I would need this.
    In all my tests a simple ray with a check whether the collider is "gazeable" is much faster and gives the result I would expect from a gaze pointer.
    vrdaveb wrote:
    OVRInput is designed to be similar to UnityEngine.Input, which drives the undocumented CrossPlatformInput class's StandaloneInput back-end. Unity prefers not to include any Oculus-specific code in their Standard Assets, but we are still working with them to expose this in a cross-platform way.

    Maybe I don't get this. How can I change the controls? Not with the Unity input settings, right? The only way I found was to modify OVRInput.cs - for example, the button method there.
    vrdaveb wrote:
    Which logic is that?
    The logic with the SphereCast. Or is the SphereCast not for the mouse?
    Maybe it will be different in the future (comment from the code: "// In future versions of Uinty RaycastResult will contain screenPosition so this will not be necessary")

    vrdaveb wrote:
    I'm unable to reproduce this. What Unity version and Utilities version are you using? What is the warning? Is it in OVRInput.cs or OVRInputModule.cs?

    Unity Version 5.3.2p2 on Win7 64
    Oculus 0.8
    And the UI-Demo from your link.

    OK, my problem then seems to be that I'm mixing the OVR Utils and the VR UI project here. The UI project is your official demo scene for how to implement the controller and a gaze pointer. It shouldn't throw warnings. Please update the demo scene to your own current API and the current Unity version.

    Messages I get with your scene:
    "OVRGamepadController has been deprecated and will be removed in a future release. Please migrate to OVRInput. Refer to the documentation here for more information: https://developer.oculus.com/documentat ... -ovrinput/"

    "Assets/Scripts/ParticleGazeCursor.cs(88,20): warning CS0618: `UnityEngine.ParticleSystem.emissionRate' is obsolete: `emissionRate property is deprecated. Use emission.rate instead.'"

    "Assets/Scripts/OVRInputModule.cs(97,14): warning CS0114: `UnityEngine.EventSystems.OVRInputModule.Reset()' hides inherited member `UnityEngine.EventSystems.UIBehaviour.Reset()'. To make the current member override that implementation, add the override keyword. Otherwise add the new keyword"

    vrdaveb wrote:
    Sorry, but I need more detail to understand what is wrong here. Can you describe what part of the workflow is too complicated and how you would like it to change?

    Not wrong - overcomplicated.
    - Your input logic bypasses the Unity input settings.
    - The RaycastAll with the cloning of the PointerEventData

    Please keep it as "KISS" as possible. I'm only a scripter, not a programmer. Maybe I have a problem understanding all your reasons for building the API and the demo scene the way you did.
    After a few days of trying to implement your example UI demo logic in my project, I dropped it and built my own character controller and GazePointer.
  • vrdaveb Posts: 1,596 Oculus Staff
    Conz wrote:
    The strange thing is that your gaze pointer (UI project) has an offset too (when the shifting is happening). It detects the colliders offset from where they are in the world.
    Strange. I haven't seen anything like that. Can you send a project that reproduces the issue?
    Conz wrote:
    How can I change the controls? Not with the Unity input settings, right?
    Right, InputManager.asset doesn't affect OVRInput. The axes and buttons map directly to the controllers, so you would have to query a different value.
    Conz wrote:
    It shouldn't throw warnings.
    We'll update OVRInputModule to use OVRInput instead of OVRGamepadController soon.
    Conz wrote:
    Please make it "KISS" as possible.
    The goal is to build all of this into Unity itself. Then you won't have to do anything special.
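In practice, "query a different value" means remapping in code rather than in InputManager.asset. A hypothetical sketch (the OVRInput enum members are from its public API; the wrapper class itself is invented for illustration):

```csharp
using UnityEngine;

// Since InputManager.asset does not affect OVRInput, remapping is done by
// reading a different OVRInput button or axis. Centralizing the queries in
// one wrapper keeps the rest of the game code unchanged when controls move.
public static class MyControls
{
    // Swap which physical button means "interact" here, in one place.
    public static bool InteractPressed()
    {
        return OVRInput.GetDown(OVRInput.Button.One); // e.g. the A button
        // or: OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger);
    }

    public static Vector2 MoveAxis()
    {
        return OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
    }
}
```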
  • Conz Posts: 83
    Hiro Protagonist
    vrdaveb wrote:
    Conz wrote:
    The strange thing is that your gaze pointer (UI project) has an offset too (when the shifting is happening). It detects the colliders offset from where they are in the world.
    Strange. I haven't seen anything like that. Can you send a project that reproduces the issue?

    Sorry, I have moved on with the project. Can't offer a version with the bug.

    It happened only in unity version 5.3.2p2.
    Maybe it was related to the bug you posted in the other thread.
    ( viewtopic.php?f=37&t=29996&e=1#p330682 )