Welcome to the Oculus Developer Forums!


Rift + VIVE development?

rikkcarey Posts: 7
NerveGear
I want to build a Unity app that runs on both Rift and VIVE. I have read everything I can find and still remain uncertain. ;-)

I started here: https://developer.oculus.com/documentation/unity/latest/concepts/unity-cross-platform-dev/

So, I understand that Unity has built-in (but limited) Oculus/OVR support that runs on both the Oculus Rift and the VIVE, and I believe a single target exe will run on both. However, the features that work are limited to OVRCameraRig, OVRDisplay, OVRInput, OVROverlay, OVRBoundary, etc. Note that the doc says this also supports cross-platform Avatars (though that support is quite limited). So far, so good (I think).
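For concreteness, here is a minimal sketch of input polling through that built-in OVR layer, using the documented OVRInput API (the script name, threshold, and logging are illustrative):

```csharp
using UnityEngine;

// Minimal input polling via the OVR utilities layer. OVRInput exposes one
// logical controller model, so per the cross-platform doc linked above the
// same script is meant to work on a Rift (Touch) or a VIVE.
public class TriggerWatcher : MonoBehaviour
{
    void Update()
    {
        // Analog index trigger on the right-hand controller, range 0..1.
        float squeeze = OVRInput.Get(
            OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

        if (squeeze > 0.8f)
        {
            Debug.Log("Right trigger squeezed: " + squeeze);
        }
    }
}
```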

However, things get messy if you use the Oculus Integration package (and probably similarly if you use extra SteamVR API features). Since I would prefer not to build this stuff myself, I am unclear how to use these non-conforming features and still deliver on both platforms. My first instinct is to build abstract components (e.g. AbstractAvatar) that perform run-time hardware checks and call either the OVR or SteamVR APIs, but I would love to avoid this if possible. This approach would require two separate builds, I think.
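That abstract-component idea could be sketched roughly like this; the driver classes are hypothetical placeholders, and `XRSettings.loadedDeviceName` is Unity's report of which VR SDK it booted ("Oculus" vs. "OpenVR"):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical AbstractAvatar: one interface, two SDK-specific backends.
public abstract class AbstractAvatar : MonoBehaviour
{
    public abstract void UpdatePose();
}

public class OvrAvatarDriver : AbstractAvatar
{
    public override void UpdatePose() { /* call Oculus (OVR) APIs here */ }
}

public class SteamVrAvatarDriver : AbstractAvatar
{
    public override void UpdatePose() { /* call SteamVR/OpenVR APIs here */ }
}

public static class AvatarFactory
{
    // Run-time hardware check: pick the backend for the SDK Unity loaded.
    public static AbstractAvatar Create(GameObject host)
    {
        if (XRSettings.loadedDeviceName == "Oculus")
            return host.AddComponent<OvrAvatarDriver>();
        return host.AddComponent<SteamVrAvatarDriver>();
    }
}
```

In principle a single build can branch at run time this way when both SDKs are present in the project, though store submission rules may still force separate builds.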

Any advice or correction would be greatly appreciated.

P.S. I anticipate that I will also have problems when it comes time to put the app into the various stores. For example, I have read that Oculus requires that your app contain no references to SteamVR in order to be accepted. Sigh...

Comments

  • BigDaddio Posts: 26 Oculus Start Member
    I literally have separate projects for Rift/Vive/Go. It's just easier that way; I've read that others do it this way as well. Even using something like VRTK it's a PITA. One of the biggest issues is dealing with Steam Cloud and stats validation, which work differently from Oculus's equivalents. Maybe there is someone way better than me who can handle this, but this has been the easiest and fastest approach for us.
  • PixelPup Posts: 5
    NerveGear
    So...  The way we went about this was not to have separate projects for Vive/Rift; instead we had objects with scripts that we toggled on and off to test various things.  We ran into a lot of issues with the transforms/anchor points, etc. being different between the headsets, so at first we always had hacky offsets to make it work, but later we just toggled the settings for one headset or the other.  It's kind of a pain, but I would almost say you build one to completion and then port it over.  We use the Rift all the way to the end in this manner and then go back and fix whatever breaks on the Vive.  Not ideal, but there isn't anything fundamentally different about them that needs to be accounted for in most cases with an HMD plus two controllers.

  • treeviewstudios Posts: 39
    Brain Burst
    PixelPup said:
    So...  The way we went about doing this was not to have separate projects for Vive/Rift but we had objects with scripts that we toggled on and off...

    We use VRTK and it works like a charm for all VR Platforms, easy to handle.
  • rikkcarey Posts: 7
    NerveGear
    @PixelPup  Thanks for letting me know.  I rarely hear VRTK mentioned, but it certainly looks like good work.

    Do you find that you still need to access the Oculus VR and SteamVR APIs for some features?  Or are you able to build your entire app on one API that works transparently on both the Oculus Rift and VIVE?
  • rikkcarey Posts: 7
    NerveGear
    It looks to me as though Oculus (OVR), Valve (SteamVR/OpenVR), and VRTK all provide the same BASIC cross-platform support: rendering, tracking, and input. However, they all appear to stop short of avatars (i.e. hand or controller avatars). So I assume this is just the state of things and that developers must build their own avatar systems. Do I have this right?
  • Welby Posts: 1,063 Oculus Start Member
    We use both Oculus and SteamVR plugins (and to be honest we also support PSVR).

    Basically we just have a different prefab in the scene for each supported headset. Each prefab has all the needed stuff for its platform, and we do a check in Awake to enable the right prefab depending on which headset the user has connected to their PC.

    It's kinda clean that way. There are some disadvantages to not having a single prefab/player: you have to do some things twice. For example, if you change something on your OculusPlayer prefab you have to do the same on the VivePlayer prefab. But at least you can have everything in a single project, still kept separate, so you can take full advantage of all the platform-specific features.
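    A sketch of that Awake check, assuming the two player prefabs are assigned in the Inspector (the class and field names here are illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR;

    // Enable exactly one per-headset player prefab at startup.
    public class RigSelector : MonoBehaviour
    {
        public GameObject oculusPlayer;  // rig with OVR components
        public GameObject vivePlayer;    // rig with SteamVR components

        void Awake()
        {
            // XRSettings.loadedDeviceName is "Oculus" or "OpenVR" on PC.
            bool isOculus = XRSettings.loadedDeviceName == "Oculus";
            oculusPlayer.SetActive(isOculus);
            vivePlayer.SetActive(!isOculus);
        }
    }
    ```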
  • treeviewstudios Posts: 39
    Brain Burst
    rikkcarey said:
    It looks to me that Oculus (OVR), Valve (SteamVR/OpenVR), and VRTK all provide the same BASIC cross platform support: rendering, tracking, and input...
    In my case, with VRTK and Avatars I have no problem; you only have to put the avatar inside the TrackingSpace, or the SteamVR camera rig, or whatever tracking space applies.

    Then handle the local positions of each avatar using your own AvatarDriver (inherit from it and you are done).

    I have already implemented this in two different projects with no problems, though reaching the solution took me some time. I use the same avatar but switch from an OculusDriver to a SteamVRDriver: one uses native Oculus input, the other uses SteamVR (UnityXR tracking positions). I could also use this for PSVR or MR if I wanted.

    The only problem I see is that you have to map hands and buttons so you can get gestures, which is not a big problem, depending on the controller.
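    The driver-swap pattern described in this post could look roughly like this (class and field names are illustrative sketches, not actual VRTK types):

    ```csharp
    using UnityEngine;

    // One shared avatar under the rig's tracking space; a per-SDK driver
    // subclass supplies poses in tracking-space local coordinates.
    public abstract class AvatarDriver : MonoBehaviour
    {
        public Transform head;
        public Transform leftHand;
        public Transform rightHand;

        // Subclasses (e.g. a hypothetical OculusDriver or SteamVRDriver)
        // read these from their SDK's tracked devices.
        protected abstract Pose HeadPose();
        protected abstract Pose HandPose(bool left);

        void LateUpdate()
        {
            Apply(head, HeadPose());
            Apply(leftHand, HandPose(true));
            Apply(rightHand, HandPose(false));
        }

        static void Apply(Transform t, Pose p)
        {
            t.localPosition = p.position;
            t.localRotation = p.rotation;
        }
    }
    ```

    Because the base class applies poses in local space, the same avatar works under any rig's tracking space, which is what makes the driver swap cheap.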

