
Use Take Recorder to record animations from tracked hands

Laramena
Explorer

Hi everyone,

I have hand tracking activated and am trying to record the hand components using the Take Recorder plugin. My goal is to capture hand animations that I can later apply to an NPC without having to animate hand poses manually.

 

I was able to record animations from the official HandSample when using the controllers to manipulate the hand mesh. But when I switch to hand tracking, the hand poses are not recorded; only the rotation and position of the hand component are stored in the sequence.

 

I hope I managed to express what I am trying to achieve in a comprehensible manner. Has anyone recorded hand animations from tracked hands using Take Recorder and got it running, or does anyone have an idea what I am missing?

 

Thanks for your help!


10 REPLIES

EnterReality
Protege

I still don't have experience with Quest 2 hand tracking, but I have a ton of experience with mocap in UE4, and I usually use Take Recorder to record any skeletal animation I want.

If you select a BP that has a skeletal mesh inside, it is fully recorded/baked into the animation that gets saved.

If not, make sure that within the Take Recorder details, after you select the actor you want, the skeletal mesh of the hands is also selected (expand the BP you selected to be recorded).

Hi, thanks for your reply! The thing is, within Take Recorder I cannot access the Skeletal Mesh of the Oculus Hand Component (I guess it's private), so there is no animation track. Do you know if I'm missing a step to prepare the Oculus Hand Component for animation? I've seen Take Recorder running smoothly with MetaHumans, which seem so much more complicated, which is why I think I'm missing something very basic and obvious.

Oh, the fact that you can't access the hands' skeletal mesh is very annoying... can you at least access the component once it is spawned?
The workaround would be to access the spawned hand skeletal mesh and grab the hand/finger rotation data, then use a separate skeletal mesh (maybe the hands of the Mannequin) and assign to it the rotation data you grabbed from the Quest 2 hand tracking.

By doing so you're using a kind of "proxy" skeletal mesh that you can then record using Take Recorder.
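The heart of this workaround is a per-frame copy of bone rotations from the tracked hand onto a second, recordable skeleton. A minimal standalone sketch of that copy step in C++ (plain structs stand in for engine types like FQuat/FTransform; every name here is a hypothetical placeholder, not actual UE API):

```cpp
#include <array>
#include <cstddef>
#include <string>

// Plain-struct stand-ins for engine types (FQuat / FTransform analogues);
// all names in this sketch are hypothetical, not UE API.
struct Quat {
    float x = 0, y = 0, z = 0, w = 1;
};

struct BoneTransform {
    std::string boneName;  // e.g. "index_01"
    Quat rotation;         // local-space rotation of the bone
};

// The Oculus hand skeleton exposes roughly two dozen bones; the exact
// count here is just a placeholder.
constexpr std::size_t kNumHandBones = 24;
using HandPose = std::array<BoneTransform, kNumHandBones>;

// Per-frame copy: read rotations from the tracked (source) hand and write
// them onto the recordable "proxy" skeleton. Assumes both skeletons list
// bones in the same order; a real implementation would match by bone name.
void CopyPoseToProxy(const HandPose& tracked, HandPose& proxy) {
    for (std::size_t i = 0; i < kNumHandBones; ++i) {
        proxy[i].rotation = tracked[i].rotation;
    }
}
```

In engine terms, the read side corresponds to querying bone/socket transforms on the tracked hand component, and the write side is whatever mechanism drives the proxy mesh's bones each tick.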

 

I know that this kind of workaround works because I have a lot of experience with VR gloves and retargeting their data to realtime characters, so if you need this done, feel free to contact me via DM.

Sorry, my bad, I am kind of new to this. I meant to say that I cannot access the skeletal mesh component within Take Recorder; it doesn't show up in the hierarchy.

 

I tried using "Get Bone Location by Name" on the Oculus hand, which is no problem at all. But to pass the bone location data over to my "proxy" hand I am using another Poseable Mesh Component, which again doesn't give me an animation track. Can I pass the bone location data to a Skeletal Mesh Component? Which function would I use to manipulate the bone locations of a Skeletal Mesh Component?

Use Get Socket Transform to get the data; to apply the transforms to your proxy hands, you can use the Transform (Modify) Bone node in the AnimBP.

Thanks, just found the nodes 🙂

RenderNinja
Honored Guest

Hello @Laramena, could you please share a GitHub project where you made this work? I'd be happy to buy you a coffee for the share, or at least please add a YouTube tutorial. I am sure the community will be proud of you!

Laramena
Explorer

Hi everyone,

sorry, I haven't been on this thread for quite a while. I cannot share the project; however, I can share some screenshots with you. I hope they will be helpful to someone.

I assume you already have your pawn actor with Oculus hand tracking running.

1. Add a SkeletalMeshComponent to the actor, with its Skeletal Mesh set to the OculusHand mesh.

2. I also added a function that accesses the Oculus Hand's bone transforms and stores them in an array. I created a child class of the Oculus Hand Component, which you don't need to do; instead, you would connect the Oculus Hand Component into the target pins:

GetBoneTransforms.jpg

3. Next, create an Animation Blueprint and assign it to the Skeletal Mesh. Within the EventGraph of the Animation Blueprint, get a reference to your actor and call the 'Get Bone Transforms' function.

EventGraph.png

4. Within the 'Split Bone Transforms' function of the Animation Blueprint, I assigned the transforms to separate variables:

SplitBoneTransforms.jpg

5. Finally, within the AnimGraph of the Animation Blueprint, use the transforms to modify the bones of the Skeletal Mesh:

AnimGraph.jpg

These are the settings for the Transform (Modify) Bone node (adjust 'bone to modify' accordingly):

TransformBone.jpg
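For reference, the data flow of steps 2-5 above can be sketched in standalone C++ (plain structs instead of FTransform and the hand component; all names here are hypothetical stand-ins for the Blueprint functions described, not engine API):

```cpp
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// Position-only stand-in for FTransform, just to show the data flow.
struct Transform {
    float px = 0, py = 0, pz = 0;
};

// Stand-in for the Oculus Hand Component's bone data.
struct TrackedHand {
    std::vector<std::string> boneNames;
    std::vector<Transform> boneTransforms;  // parallel to boneNames
};

// Step 2: gather the tracked hand's bone transforms into one array
// (the 'Get Bone Transforms' function in the Blueprint).
std::vector<Transform> GetBoneTransforms(const TrackedHand& hand) {
    return hand.boneTransforms;
}

// Step 4: split the array into per-bone entries that the AnimGraph can
// feed into individual Transform (Modify) Bone nodes (step 5).
std::map<std::string, Transform> SplitBoneTransforms(
        const TrackedHand& hand, const std::vector<Transform>& transforms) {
    std::map<std::string, Transform> perBone;
    for (std::size_t i = 0; i < hand.boneNames.size(); ++i) {
        perBone[hand.boneNames[i]] = transforms[i];
    }
    return perBone;
}
```

The AnimGraph then applies each per-bone transform with its own Transform (Modify) Bone node, as shown in the screenshots.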

 

I hope I did not forget anything and explained everything in an understandable way. Let me know if you need further help. Maybe I'll find some time to create a YouTube tutorial in the near future.

Best, Laramena

Thank you so much @Laramena, this is exactly what I needed. I think I should be able to get things working with all this!