Welcome to the Oculus Developer Forums!


Handtracking and Unity UI

Theformand Posts: 10
Brain Burst
Hey everyone.

Since we don't have hand tracking in-editor via Link (guys, please tell me this is coming; I don't see how hand tracking development will take off without it), I'm working on a small tool app to record and serialize hand poses for pose recognition in a game.
So, I would really like to have a way to interact with Unity UI. Raycast and click, that's all I need. I tried making my own HandTrackingRaycaster to use with a Canvas, but it doesn't seem to work. Is there really no built-in component for this?

Comments

  • Theformand Posts: 10
    Brain Burst
    Figured it out. On startup, find the OVRInputModule and set its rayTransform to OVRHand.PointerPose. Also find the OVRRaycaster and set its pointer to the OVRHand.PointerPose object. Now you can interact with UI.
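    A minimal sketch of that wiring, assuming this script sits on the same GameObject as the OVRHand component (the class name and the namespace comment are illustrative and may differ by Oculus Integration version):

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems; // OVRInputModule is declared here in the Oculus Integration

    public class HandPointerSetup : MonoBehaviour
    {
        void Start()
        {
            // OVRHand exposes the PointerPose transform used for UI pointing
            var hand = GetComponent<OVRHand>();

            // Locate the scene's OVR UI plumbing
            var inputModule = FindObjectOfType<OVRInputModule>();
            var raycaster = FindObjectOfType<OVRRaycaster>();

            // Drive UI rays from the hand's pointer pose
            inputModule.rayTransform = hand.PointerPose;
            raycaster.pointer = hand.PointerPose.gameObject;
        }
    }
    ```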
  • El_in Posts: 26
    Brain Burst
    Can you give me a hint where to search for the OVRInputModule?
    I tried adding it to the OVRCameraRig, but when I want to set its rayTransform to OVRHand.PointerPose I get an error.
  • Theformand Posts: 10
    Brain Burst
    When you create a Unity UI, it will create an EventSystem for you in the scene. On the EventSystem, remove the Standalone Input Module and replace it with the OVR Input Module.
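    If you'd rather do that swap from code than in the Inspector, here is a hedged sketch (it assumes an EventSystem already exists in the scene; the class name is illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.EventSystems;

    public class InputModuleSwapper : MonoBehaviour
    {
        void Awake()
        {
            var es = FindObjectOfType<EventSystem>().gameObject;

            // Disable the default module Unity adds alongside a new Canvas
            var standalone = es.GetComponent<StandaloneInputModule>();
            if (standalone != null)
                standalone.enabled = false;

            // Add the Oculus input module if it isn't there already
            if (es.GetComponent<OVRInputModule>() == null)
                es.AddComponent<OVRInputModule>();
        }
    }
    ```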
  • El_in Posts: 26
    Brain Burst
    Okay, I have added the OVR Input Module to the EventSystem and deactivated the Standalone Input Module. Then I added the OVR Raycaster to my world-space canvas and deactivated the Graphic Raycaster.
    When you wrote to assign OVRHand.PointerPose to the OVRInputModule's rayTransform, I thought it was the HandRight or HandLeft GameObject that gets spawned by the Hands prefab. The Hand prefab does have a Hand.cs script, but it doesn't provide a PointerPose transform variable.
    So I need the OVRHand.cs script somewhere with the left or right hand selected, but I have no idea where to put it.
  • Theformand Posts: 10
    Brain Burst
    I'm using the OTHER set of hand prefabs (it's a totally messy release by Oculus), the ones that have OVRHand.cs attached to them. I'm NOT using the Hands prefab to spawn them.
  • El_in Posts: 26
    Brain Burst
    Got it.
    But I think I have to learn some fundamentals about UI raycasting first. I thought I could use the ray tools like in the train demo scene.

  • 1778657840117786578401 Posts: 1
    NerveGear
    Can you be more specific? I want to see the whole process
  • El_in Posts: 26
    Brain Burst
    I started using the Windows Mixed Reality Toolkit, since there is an integration for the Quest.
    Native Unity UI interaction is already included, plus a lot of useful features like grabbing, scaling, and rotating:
    github.com/provencher/MRTKExtensionForOculusQuest
  • nico.acevedo Posts: 1
    NerveGear
    Hi guys, I'm a little new to Unity, but not to programming, so I'm doing fairly well. Anyway, I would really appreciate it if you could explain this a bit more specifically :( Pleaaaaase. This thread is exactly what I was looking for, but I can't seem to make it work, or I don't understand how. Thanks, guys!
  • MarsAstro Posts: 3
    NerveGear
    I've got an EventSystem with the OVRInputModule and a Canvas with the OVRRaycaster, and on my right OVRHandPrefab I've put a script with this code:

        // Fields assumed declared on the script
        OVRHand Hand;
        OVRInputModule InputModule;
        OVRRaycaster Raycaster;

        void Start()
        {
            Hand = GetComponent<OVRHand>();
            InputModule = FindObjectOfType<OVRInputModule>();
            Raycaster = FindObjectOfType<OVRRaycaster>();

            // Drive the UI ray from the hand's pointer pose
            InputModule.rayTransform = Hand.PointerPose;
            Raycaster.pointer = Hand.PointerPose.gameObject;
        }

    However, it doesn't seem like I can interact with the buttons on the Canvas. Any idea what I'm doing wrong?
  • alienheretic Posts: 1
    NerveGear
    Any further info on this? I also can't seem to get hand tracking working with Unity UI.
  • platymatrix Posts: 1
    NerveGear
    edited July 15
    As far as I know, you can follow this tutorial to get the UI set up: (it won't let me post a link, but go to YouTube and search for Valem and User Interface; if you see "How to make a VR Game Part 4 User Interface", it's the right one)

    And then after that, write a script that sets the OVRInputModule's rayTransform = hand.PointerPose. That worked for me. You'll also have to use the laser pointer from the video as the pointer in your OVRRaycaster. Hope that helps.

    Note: If you're using the UI Helpers from the video, you'll have to comment out one line of code in HandedInputSelector, which sets the ray transform to the right hand anchor. That line will interfere with the other script.
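    For reference, the line to comment out in the sample's HandedInputSelector looks roughly like this (member names here are approximations of the Oculus UI Helpers sample and may differ by version):

    ```csharp
    // Inside the sample's HandedInputSelector (names approximate)
    void SetActiveController(OVRInput.Controller c)
    {
        // Comment out the assignment that overwrites the UI ray,
        // so the hand's PointerPose assignment is not clobbered:
        // m_InputModule.rayTransform = (c == OVRInput.Controller.LTouch)
        //     ? m_CameraRig.leftHandAnchor
        //     : m_CameraRig.rightHandAnchor;
    }
    ```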