
Meta Avatar Hand Bones -- What coordinate system?

Anonymous
Not applicable

I'm trying to manually convert OVRSkeleton bone transforms to drive Meta Avatars (for testing, I want to drive avatars with pre-recorded input), but the coordinate systems don't seem to match. It's very difficult to debug because the avatar system's IK defaults to a T-pose whenever anything goes wrong.

 

I'm using a hand tracking input delegate (IOvrAvatarHandTrackingDelegate) and populating its fields. Sometimes I get some sort of motion but the wrists are completely screwed up, and other times nothing happens at all (it seems very sensitive to the wrist positions, but I have no clue what coordinate system they're in). I've tried piping in the local position and rotation directly, as well as setting wristPos to the transform of the controllers (because that bone is actually attached to a node that isn't stored as a bone in the OVR camera rig).
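For reference, here's a stripped-down sketch of what my delegate looks like. Big caveat: the GetHandData signature, the OvrAvatarTrackingHandsState type, and the wristPosLeft/wristPosRight field names are from my memory of the SDK sample code, and RecordedHandFrame is just my own container for the pre-recorded data -- adjust for whatever your SDK version actually exposes:

using UnityEngine;

// Sketch only -- method/field names on the SDK side are assumptions (see note above).
// RecordedHandFrame is my own type holding one frame of pre-recorded data.
public struct RecordedHandFrame
{
    public Pose leftWrist;              // wrist pose (position + rotation)
    public Pose rightWrist;
    public Quaternion[] boneRotations;  // per-finger joint rotations
}

public class RecordedHandTrackingDelegate : IOvrAvatarHandTrackingDelegate
{
    private readonly RecordedHandFrame[] _frames;
    private int _frameIndex;

    public RecordedHandTrackingDelegate(RecordedHandFrame[] frames) => _frames = frames;

    public bool GetHandData(OvrAvatarTrackingHandsState handData)
    {
        if (_frames == null || _frames.Length == 0) return false;

        RecordedHandFrame frame = _frames[_frameIndex];
        _frameIndex = (_frameIndex + 1) % _frames.Length;

        // The wrist transforms are the part whose coordinate system I can't pin down.
        // (Convert Pose -> the SDK's transform type here if your version requires it.)
        handData.wristPosLeft = frame.leftWrist;
        handData.wristPosRight = frame.rightWrist;

        // Per-finger joint rotations, in the order the SDK expects.
        for (int i = 0; i < frame.boneRotations.Length; i++)
        {
            handData.boneRotations[i] = frame.boneRotations[i];
        }

        return true; // assuming true means "use this data this frame"
    }
}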

 

Any help would be appreciated!

 

EDIT: I've made some progress. I found that the avatars have a root node with a scale of z = -1. The wrist transforms returned from the hand input delegate serve as the root of each hand, and transforming them as follows helps:

// The avatar's Root node has scale (1, 1, -1); this is the pivot in whose coordinate
// system the hands actually need to be expressed. Found this by examining the Root
// node created by the avatar system.
Matrix4x4 m = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1));
leftWristPose = m.inverse * leftWristPose;
rightWristPose = m.inverse * rightWristPose;

However, this still results in backwards fingers (in this photo I was typing on a keyboard, but the fingers curl upwards toward the avatar's face):

 

fingers_backwards.png

Note that the thumb is on the wrong side of each hand too.

 

So I tried fixing the bone rotations by rotating the first (proximal) joint of each finger 180 degrees about its x axis:



// Rotate the first (proximal) joint of each finger 180 degrees about its x axis.
// The first group of indices is one hand, the second group the other.
Quaternion rot1 = Quaternion.Euler(180, 0, 0);
int[] proximalJointIndices = { 0, 4, 7, 10, 13, 17, 21, 24, 27, 31 };

foreach (int i in proximalJointIndices)
{
    _currentTrackingHands.boneRotations[i] = rot1 * _currentTrackingHands.boneRotations[i];
}

 

And now we have sausage fingers that are roughly in the correct place and seem to bend correctly. 

 

sausage_fingers.png

 

I've tried numerous other adjustments, including pre-rotating the wrist, but nothing helps. What is happening here?

6 REPLIES

thearperson
Explorer

Just curious, is this the new Meta Avatar or the old one? I thought OVRSkeleton was meant for the old version?

Anonymous
Not applicable

It's the new ones. The examples and documentation are very minimal, so I'm not sure what I'm supposed to be doing. I understand that for networking we can obtain an opaque blob that remote avatars can ingest, but I want to map raw hand joint positions that I've already pre-recorded (and which I can reuse with different kinds of avatars).

 

What I've discovered since posting is that Oculus has a ConvertSpace() method that converts the handedness of the transforms. This helps, but the wrist transform is still a mystery. Finger rotations are converted by negating the x and y quaternion components, and positions by negating z. The wrist is the only full transform provided, and there's something else going on with it that isn't obvious to me.
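To spell out the conversion I mean, here it is as code (these helper methods are mine, not SDK calls; they just implement the mirror across the z axis that ConvertSpace appears to do per bone):

using UnityEngine;

// My own helpers -- not SDK calls -- for the RH<->LH mirror across the z axis
// described above: negate z on positions, negate x and y on quaternions.
public static class HandednessConversion
{
    public static Vector3 MirrorPosition(Vector3 p)
    {
        return new Vector3(p.x, p.y, -p.z);
    }

    public static Quaternion MirrorRotation(Quaternion q)
    {
        return new Quaternion(-q.x, -q.y, q.z, q.w);
    }
}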

 

Anonymous
Not applicable

I tried two more things, neither of which is suitable:

1. Custom hand poses. This doesn't work because the skeleton of the Oculus hand prefabs is not compatible with the avatars! The skeleton they provide in the sample scene is somehow different: the rotations don't behave the same way. This is quite frustrating. The rig is supposed to be customizable by attaching scripts that define the joints, but it doesn't work as expected. I'm not sure where they got their example skeleton model from.

 

2. Network streaming. The RecordStreamData() and ApplyStreamData() functions do work: the recorded data can be serialized to a file and played back -- but only once! And there is no CAPI hook for resetting the playback state. Wonderful! It's totally unusable. (A rough sketch of what I tried is below.)
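For completeness, the streaming experiment looked roughly like this (the class, fields, and frame list are mine, and I'm going from memory on the exact RecordStreamData/ApplyStreamData overloads and the StreamLOD argument, so double-check them against your SDK version):

using System.Collections.Generic;
using UnityEngine;

// Sketch of the streaming experiment. localAvatar/playbackAvatar and the frame list
// are my own; RecordStreamData/ApplyStreamData are the entity methods mentioned
// above, but verify the exact signatures in your SDK version.
public class AvatarStreamRecorder : MonoBehaviour
{
    public OvrAvatarEntity localAvatar;     // entity driven by live tracking
    public OvrAvatarEntity playbackAvatar;  // entity I want to replay onto
    public bool record;

    private readonly List<byte[]> _frames = new List<byte[]>();
    private int _playbackIndex;

    void Update()
    {
        if (record)
        {
            // Capture one packet per frame from the live avatar.
            _frames.Add(localAvatar.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium));
        }
        else if (_frames.Count > 0)
        {
            // Feed recorded packets back in. This works the first time through,
            // but looping back to index 0 is where playback stops responding for me.
            playbackAvatar.ApplyStreamData(_frames[_playbackIndex]);
            _playbackIndex = (_playbackIndex + 1) % _frames.Count;
        }
    }
}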

 

I think custom hand poses came the closest to a workable solution, but the real fix would be for Oculus to properly document what the heck the "wrist position" is supposed to be. It's not simply an RHS-to-LHS conversion: my hands always come out inverted along z, and I can't seem to apply any transform (including negative scaling) to undo it. There really should be a way to feed hand bones into the system and have it do the right thing, or at least a provided conversion function.

Similar interaction issue here. Were you able to add physics capsules to the hands like you can with the OVRCustomHandPrefab?

beastus
Explorer

Where did you come across these Meta Avatar (2.0?) rigged meshes? I haven't seen them in the docs or samples. And what tooling are you using? Thanks!

P.S. I saw the .glb files that the third-person avatars sample uses, but they don't import into Blender for me. Do they require a proprietary plug-in?

I record stream data and apply it on Update and it works fine. I just need to figure out how to programmatically disable the view.
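In case it helps, the loop I'm using looks roughly like this (the entity fields are mine; verify the exact RecordStreamData/ApplyStreamData signatures and the StreamLOD argument against your SDK version):

using UnityEngine;

// Per-frame loopback: record from one entity and immediately apply to another.
public class AvatarStreamLoopback : MonoBehaviour
{
    public OvrAvatarEntity sourceAvatar;  // entity driven by live tracking
    public OvrAvatarEntity mirrorAvatar;  // entity receiving the stream

    void Update()
    {
        byte[] packet = sourceAvatar.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);
        mirrorAvatar.ApplyStreamData(packet);
    }
}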