Since we don't have hand tracking in-editor via Link (please tell me this is coming; I don't see how hand-tracking development will take off without it), I'm working on a small tool app to record and serialize hand poses for pose recognition in a game. So I would really like a way to interact with Unity UI. Raycast and click, that's all I need. I tried making my own HandTrackingRaycaster to use with a Canvas, but it doesn't seem to be working. Is there really no built-in component for this?
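For reference, here's roughly the approach I tried: a minimal sketch that pinch-clicks whatever UI Button the hand ray hits. It assumes the Oculus Integration's OVRHand component and that each button has a collider on its own layer; the class name HandUIClicker, the uiLayerMask field, and the pinch-edge logic are just my own choices, not anything built in.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Rough sketch: raycast from the tracked hand's pointer pose and fire a
// click on the UI Button it hits when the index finger starts pinching.
// Assumes the Oculus Integration's OVRHand; buttons need colliders to be hit.
public class HandUIClicker : MonoBehaviour
{
    public OVRHand hand;            // hand to track (left or right)
    public float maxDistance = 5f;  // ray length in meters
    public LayerMask uiLayerMask;   // layer holding the canvas/button colliders

    private bool _wasPinching;

    void Update()
    {
        if (hand == null || !hand.IsPointerPoseValid) return;

        Transform pose = hand.PointerPose;
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Only click on the frame the pinch begins, not while it is held.
        if (pinching && !_wasPinching &&
            Physics.Raycast(pose.position, pose.forward, out RaycastHit hit,
                            maxDistance, uiLayerMask))
        {
            var button = hit.collider.GetComponentInParent<Button>();
            if (button != null)
            {
                ExecuteEvents.Execute(button.gameObject,
                    new PointerEventData(EventSystem.current),
                    ExecuteEvents.pointerClickHandler);
            }
        }
        _wasPinching = pinching;
    }
}
```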
Okay, I have added the OVRInputModule to the EventSystem and deactivated the StandaloneInputModule. Then I added the OVRRaycaster to my world-space canvas and deactivated the GraphicRaycaster. When you wrote to assign OVRHand.PointerPose to the OVRInputModule's rayTransform, I thought that meant the HandRight or HandLeft GameObject that gets spawned by the Hands prefab. That Hand prefab does have a Hand.cs script, but it doesn't provide a PointerPose transform variable. So I need the OVRHand.cs script somewhere, with left or right hand selected, but I have no idea where to put it. :#
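For what it's worth, the runtime wiring itself seems doable with a small binder script. This is only a rough sketch assuming an OVRHand component exists somewhere on a hand object; HandPointerBinder and its Inspector-assigned fields are my own naming.

```csharp
using UnityEngine;
using UnityEngine.EventSystems; // OVRInputModule lives in this namespace

// Rough sketch: push the hand's PointerPose into the OVRInputModule's
// rayTransform once tracking reports a valid pose. Assign both fields
// in the Inspector; OVRHand must be on some hand object in the rig.
public class HandPointerBinder : MonoBehaviour
{
    public OVRHand hand;               // the hand whose ray should drive the UI
    public OVRInputModule inputModule; // the module on the EventSystem

    void Update()
    {
        if (hand == null || inputModule == null) return;

        // PointerPose is only meaningful while the pose is valid.
        if (hand.IsPointerPoseValid && inputModule.rayTransform != hand.PointerPose)
            inputModule.rayTransform = hand.PointerPose;
    }
}
```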
I started using the Windows Mixed Reality Toolkit, since there is an integration for the Quest. Native Unity UI interaction is already included, plus a lot of useful features like grabbing, scaling, and rotating: github.com/provencher/MRTKExtensionForOculusQuest
Hi guys, I'm a little new to Unity, but not to programming, so I'm doing pretty well. Anyway, I would really appreciate it if you could explain this a bit more specifically 😞 Pleaaaaase. This thread is exactly what I was looking for, but I can't seem to make it work, or I don't understand how. Thanks, guys!