
Hand tracking and Unity UI

Theformand
Protege
Hey everyone.

Since we don't have hand tracking in-editor via Link (guys, please tell me this is coming; I don't see how hand tracking development will take off without it), I'm working on a small tool app to record and serialize hand poses for pose recognition in a game.
So I would really like to have a way to interact with Unity UI. Raycast and click, that's all I need. I tried making my own HandTrackingRaycaster to use with a Canvas, but it doesn't seem to be working. Is there really no built-in component for this?

13 REPLIES

MarsAstro
Honored Guest
I've got an EventSystem with the OVRInputModule and a Canvas with the OVRRaycaster, and on my right OVRHandPrefab I've put a script with this code:

    private OVRHand Hand;
    private OVRInputModule InputModule;
    private OVRRaycaster Raycaster;

    void Start()
    {
        Hand = GetComponent<OVRHand>();
        InputModule = FindObjectOfType<OVRInputModule>();
        Raycaster = FindObjectOfType<OVRRaycaster>();

        // Point the input module's ray and the canvas raycaster at the hand's pointer pose
        InputModule.rayTransform = Hand.PointerPose;
        Raycaster.pointer = Hand.PointerPose.gameObject;
    }

However, it doesn't seem like I can interact with the buttons on the Canvas. Any idea what I'm doing wrong?

Anonymous
Not applicable
Any further info on this? I also can't seem to get hand tracking working with Unity UI.

platymatrix
Honored Guest
As far as I know, you can follow this tutorial to get the UI set up: (it won't let me post a link, but go to YouTube and search for Valem and User Interface; if you see "How to make a VR Game Part 4 User Interface", that's the right one).

And then after that, write a script that sets the OVRInputModule's rayTransform to hand.PointerPose. That worked for me. You'll also have to use your laser pointer from the video as the pointer in your OVRRaycaster; see the sketch after the note below. Hope that helps.

Note: If you're using the UI Helpers from the video, you'll have to comment out one line of code in the HandedInputSelector that sets the ray transform to the right hand anchor, since it will interfere with the other script.
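For reference, a minimal sketch of what that script could look like (HandPointerSetup is a hypothetical name; laserPointer stands in for the laser pointer object from the video, and the OVRInputModule/OVRRaycaster references come from the EventSystem and the Canvas respectively):

HandPointerSetup.cs
using UnityEngine;
using UnityEngine.EventSystems; // OVRInputModule lives in this namespace

// Attach to the OVRHandPrefab (or any object with an OVRHand on it)
public class HandPointerSetup : MonoBehaviour
{
   [SerializeField] private OVRInputModule inputModule; // on the EventSystem
   [SerializeField] private OVRRaycaster raycaster;     // on the Canvas
   [SerializeField] private GameObject laserPointer;    // the laser pointer from the video

   private void Start()
   {
      OVRHand hand = GetComponent<OVRHand>();

      // Drive UI rays from the hand's system pointer pose instead of a controller
      inputModule.rayTransform = hand.PointerPose;
      raycaster.pointer = laserPointer;
   }
}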

I was using the Hands prefab back in the day, since I only had the old Oculus plugin and was on Unity 2018. The goal is the same: override the rayTransform property of the OVRInputModule (which sits on the EventSystem) and the rayTransform of the OVRGazePointer (which you can access via the static instance on the class). With the old Hands prefab, we need to do this from code...

You need to get the ray transform from the ray tool. The ray tool is spawned dynamically by the InteractableToolsSDKDriver prefab, so I marked the right ray tool with a tag so I can access it later.

RayTool.cs
public void Initialize()
{
   ...
   // Tag the right-hand ray tool so it can be found after it is spawned
   if (IsRightHandedTool)
   {
      gameObject.tag = "RightRayTool";
   }
}


I then expose the target ray from RayToolView.cs, which can then be accessed through RayTool.cs:

RayToolView.cs
public Transform GetTargetTransform()
{
   return _targetTransform;
}

RayTool.cs
public Transform GetRay()
{
   return _rayToolView?.GetTargetTransform();
}

Then I use it in my overrider class accordingly:

Overrider.cs
// In Awake() -- make sure the tag exists in the tag asset first
var rayTool = GameObject.FindGameObjectWithTag("RightRayTool")?.GetComponent<RayTool>();

// In a coroutine some time later, once the hands are ready:
OVRGazePointer pointer = OVRGazePointer.instance;
pointer.rayTransform = rayTool.GetRay();
ovrInput.rayTransform = rayTool.GetRay();

// NOTE: ovrInput is the OVRInputModule referenced from the component on the EventSystem object
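Put together, a minimal sketch of such an overrider could look like this (Overrider is my own name for it, and the polling coroutine is just one way to handle the ray tool being spawned at runtime):

Overrider.cs
using System.Collections;
using UnityEngine;
using UnityEngine.EventSystems;  // OVRInputModule lives in this namespace
using OculusSampleFramework;     // RayTool lives in the SampleFramework namespace

public class Overrider : MonoBehaviour
{
   [SerializeField] private OVRInputModule ovrInput; // the component on the EventSystem

   private IEnumerator Start()
   {
      // The ray tool is spawned at runtime by the InteractableToolsSDKDriver,
      // so poll until the tagged object exists (the tag must be defined in the Tag Manager)
      GameObject rayToolObject = null;
      while (rayToolObject == null)
      {
         rayToolObject = GameObject.FindGameObjectWithTag("RightRayTool");
         yield return null;
      }

      RayTool rayTool = rayToolObject.GetComponent<RayTool>();

      // Redirect both the gaze pointer and the input module to the hand ray
      OVRGazePointer.instance.rayTransform = rayTool.GetRay();
      ovrInput.rayTransform = rayTool.GetRay();
   }
}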