
Hand tracking with Quest 2

femke.vanbeek.58
Explorer

I'm considering buying a Quest 2 for our perception lab, but there's one question I haven't been able to get a clear answer to, so I hope someone here can help: I need to be able to access the (23 DOF?) hand tracking data and save it for offline analysis. So I not only need to be able to put colliders on, for instance, the fingertips to initiate collisions with virtual objects in a Unity scene, but I also need to be able to read and store all the hand position data as experimental data. Has someone successfully done that before?
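
For concreteness, here is a minimal sketch of the collider part of that setup, assuming the Oculus Integration package's OVRSkeleton component in Unity (the component name, the Hand_IndexTip bone ID, and the collider settings below are taken from that package and chosen purely for illustration, not confirmed anywhere in this thread). It waits for the hand skeleton to initialize and then parents a small trigger sphere to the index fingertip bone.

```csharp
using UnityEngine;

// Sketch only: attach a trigger collider to the index fingertip of an
// OVRSkeleton hand (Oculus Integration package), so the fingertip can
// fire OnTriggerEnter events against virtual objects in a Unity scene.
public class FingertipCollider : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton; // hand skeleton to read bones from
    private bool attached;

    void Update()
    {
        // Bones are only populated once the skeleton has initialized.
        if (attached || skeleton == null || !skeleton.IsInitialized)
            return;

        foreach (var bone in skeleton.Bones)
        {
            if (bone.Id != OVRSkeleton.BoneId.Hand_IndexTip)
                continue;

            // Small trigger sphere that follows the fingertip transform.
            var tip = bone.Transform.gameObject;
            var sphere = tip.AddComponent<SphereCollider>();
            sphere.isTrigger = true;
            sphere.radius = 0.008f; // ~8 mm, tune to taste

            // Kinematic rigidbody so trigger events are actually generated.
            var body = tip.AddComponent<Rigidbody>();
            body.isKinematic = true;
            body.useGravity = false;

            attached = true;
            break;
        }
    }
}
```

If I recall correctly, OVRSkeleton also offers an option to generate physics capsules on the bones for you, which may make a manual script like this unnecessary.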


5 REPLIES

mz8i
Explorer

I don't know about Unity (I presume it must be possible), but if you are open to using web technologies instead, you can definitely get access to the hand positions in WebXR (VR in an HTML canvas); see the hand tracking documentation in the standard here: https://www.w3.org/TR/webxr-hand-input-1/#physical-hand

There are multiple libraries that can help you make a WebXR app in JavaScript, for example in React: https://github.com/pmndrs/react-xr

femke.vanbeek.58
Explorer

Thank you very much for your response, and sorry for my slow reply. It's great to hear that you've been able to access and use hand tracking data. My main concern was that Oculus might consider hand tracking data a potential privacy issue, so I was afraid I would only be able to use it as input to a game, but never be able to actually look at the data and save it. However, your response implies that this might be possible. I do have to work in Unity due to other dependencies, but that shouldn't change the principle. Nonetheless, it would be great if anyone happens to have experience with using and saving hand tracking data with Quest 2 in Unity, so if anyone does, please shoot me a message 🙂

mz8i
Explorer

Accepted Solution

Well, at least as it's defined in the WebXR specification, hand tracking data is sensitive data (due to its potential use for fingerprinting, i.e. identifying users), so devices are required to add enough noise to it to prevent user identification. So it depends on what analysis you want to conduct on it.

As for Unity, after a quick search I can see that https://developer.oculus.com/documentation/unity/unity-handtracking/ says: "Enabling support for Hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose."

So you do get access to hand tracking data through Unity, but I'd say you'd be best off getting legal advice on whether your intended use would be in breach of Oculus T&Cs.
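
To make the Unity route concrete, here is a minimal sketch of the kind of logging the original question asks about, assuming the Oculus Integration package's OVRSkeleton component (the member names come from that package; the file path and CSV layout are purely illustrative, and whether storing this data is permitted is exactly the T&C question above). It appends the pose of every tracked hand bone to a CSV file each frame for offline analysis.

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Sketch only: append the pose of every tracked hand bone to a CSV file
// each frame, for offline analysis. Assumes the Oculus Integration
// package's OVRSkeleton component; path and column layout are illustrative.
public class HandPoseLogger : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton; // left or right hand skeleton
    private StreamWriter writer;

    void Start()
    {
        // persistentDataPath is a writable per-app directory on Quest 2.
        string path = Path.Combine(Application.persistentDataPath, "hand_poses.csv");
        writer = new StreamWriter(path, append: false);
        writer.WriteLine("time,bone_id,pos_x,pos_y,pos_z,rot_x,rot_y,rot_z,rot_w");
    }

    void Update()
    {
        // Only log frames where the skeleton reports valid tracking data.
        if (skeleton == null || !skeleton.IsInitialized || !skeleton.IsDataValid)
            return;

        var sb = new StringBuilder();
        foreach (var bone in skeleton.Bones)
        {
            Vector3 p = bone.Transform.position;    // world-space position
            Quaternion q = bone.Transform.rotation; // world-space rotation
            sb.AppendLine($"{Time.time},{bone.Id},{p.x},{p.y},{p.z},{q.x},{q.y},{q.z},{q.w}");
        }
        writer.Write(sb.ToString());
    }

    void OnDestroy()
    {
        // Flush and close the file when the component is destroyed / app quits.
        writer?.Dispose();
    }
}
```

The resulting file ends up under Application.persistentDataPath on the headset, from where it can be copied to a PC (for example over adb) for analysis.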

femke.vanbeek.58
Explorer

Thanks, that is super helpful info! That indeed sounds like it might not be as straightforward as I thought. We're just a simple academic lab, so often things are okay as long as they're for academic purposes, but 'expressly forbidden' sounds pretty harsh. Thanks for helping!

Did you manage to extract the hand tracking data? Similar to what you explained, I am trying to extract the skeleton position of each bone of the hand using Oculus, in order to use it afterwards for offline analysis of the hand interaction.

Have you been able to do it? Could you help me out?

Thanks 🙂