I have gone through the Interaction SDK and now understand how to create custom hand poses and behavior, custom interactions, with controllers and/or hand tracking, and so on. However, after having just gone through the Avatars 2 SDK docs and explored the samples it is not at all clear to me how the two SDKs are meant to be used together.
Does anyone know of a sample or anything that illustrates how to set up a player, camera rig, etc. to use Meta Avatars with Interaction SDK objects & patterns?
P.S. As a first crappy experiment I dropped a player controller rig from one of my Interaction SDK projects into my Avatars SDK project and got it working such that my avatar had two sets of hands: the avatar hands plus the hands from my rig. It actually works to some extent in hand tracking mode if I then disable the rendering of the extraneous hands, but clearly this isn't the correct way to do it. Also, the hands on the avatars are smaller and don't line up at all with the Hands in controller mode (different sized hands, different poses, different positioning, etc.).
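For reference, my "disable the rendering of the extraneous hands" hack boils down to something like the sketch below. This is just plain Unity, not anything from either SDK; the `handVisualRoots` field is my own naming, and I assign the Interaction SDK hand visual objects to it by hand in the Inspector:

```csharp
using UnityEngine;

// Rough sketch of the hack described above: once both rigs are in the
// scene, disable the renderers under the Interaction SDK hand visuals so
// only the avatar's hands are drawn. The tracking and interaction
// components on the rig keep running; only the duplicate meshes are hidden.
public class HideDuplicateHands : MonoBehaviour
{
    [Tooltip("Roots of the Interaction SDK hand visuals to hide (assigned in Inspector)")]
    public GameObject[] handVisualRoots;

    void Start()
    {
        foreach (var root in handVisualRoots)
        {
            // Include inactive children so hands hidden at startup stay hidden
            foreach (var r in root.GetComponentsInChildren<SkinnedMeshRenderer>(true))
            {
                r.enabled = false; // hide the mesh, keep the rig's logic active
            }
        }
    }
}
```

As noted, this "works" visually but does nothing about the mismatch in hand size, pose, and positioning between the avatar hands and the Interaction SDK hand transforms, which is the real problem.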
But what is the correct way to set up an OVRPlayerController, CameraRig, AvatarEntity, and so on, such that it works with Interaction SDK and one can create interactions and hand poses using the tools from that SDK?
As a POC I'd like to get the Interaction SDK samples working with an Avatar rather than hands. I.e., Complex Grab, Direct Touch, and Pose Detection. I feel like this must already exist somewhere?
Alternatively, is there an easier option for building custom hand poses and object interactions? I find XR Interaction Toolkit + Unity animation to be better to work with than Interaction SDK, honestly. It took me way too long to recreate my XR-based drawing tools using Interaction SDK, and I had to resort to hacks.
It would be nice to have a reference mesh for the 2.0 avatars, at least for the hands, so one could rig it the normal Unity way and record animations from within Unity. I've seen the CustomHandPose example, but it's extremely crude. The code also contains typos and comments like:
// HACK: Random rotations allow us to pass the BodyAPI "is in hand" check. Without it, BodyAPI overrides and goes into rest pose
// I tried to get the random value as small as possible, but at .01 variance, rest pose triggers again 😕
Curious to know what others are doing. Not getting much clarity from the Avatars SDK docs or examples.
Also, are these Meta SDKs for Unity just really half-baked? From what I'm seeing, I don't think I can create the kind of experience I want, although judging by Horizon Worlds and Horizon Workrooms, the avatar system should be capable of it. Perhaps it is simply much better in the Native SDK?