
How to move Player in Hand tracking

modhdhruv
Explorer
How do you move the player while using hand tracking on the Oculus Quest? Has anyone implemented player movement?
I have tried moving the OVRCameraRig with a four-button joystick (forward/backward/left/right). The OVRCameraRig moves perfectly, but the issue is that both hand models do not move with it.
I also tried putting the "Hands" game object as a parent of the OVRCameraRig, but no luck.
Does anyone have an idea?
19 REPLIES

modhdhruv
Explorer
I tried changing UpdatePose() in HandSkeleton.cs:
transform.position = pose.RootPose.Position.FromFlippedZVector3f() + OVRCameraRig.transform.localPosition;
Now the hand moves perfectly with the OVRCameraRig.
But how do I move the OVRCameraRig in the direction the center camera is looking?

Stimmits
Explorer

Note that adding positions this way does not account for the local rotated translation of the OVRCameraRig, which InverseTransformPoint does handle.
You can test this by rotating the OVRCameraRig 90 degrees on the Y axis: if you add the positions, your hands will appear to your left/right, while with InverseTransformPoint they stay in front of you. Note that the returned position is then also affected by scale, which is a good thing in this case.
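As a sketch, the earlier UpdatePose() change could go through the rig's transform instead of adding positions (assuming OVRCameraRig here is the rig's Transform, as in the snippet above; TransformPoint maps a rig-local point into world space, InverseTransformPoint is its inverse):

```csharp
// Map the tracked hand pose (rig-local space) into world space via the
// rig's transform, so the rig's rotation and scale are respected.
// Sketch only: pose and OVRCameraRig come from the surrounding
// HandSkeleton.cs code.
Vector3 localHandPosition = pose.RootPose.Position.FromFlippedZVector3f();
transform.position = OVRCameraRig.transform.TransformPoint(localHandPosition);
```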

Moving the OVRCameraRig in the direction the camera is looking can be done by calculating a flat (yaw-only) rotation and translating forward in that direction, something like this:

float speed = 2.0f;
Transform ovrCameraRig; // assign the OVRCameraRig's transform in the Inspector

void Update()
{
    // Keep only the yaw of the head rotation so movement stays parallel to the ground
    Quaternion headRotationFlat = Quaternion.Euler(0, Camera.main.transform.eulerAngles.y, 0);
    ovrCameraRig.Translate(headRotationFlat * Vector3.forward * speed * Time.deltaTime, Space.World);
}

But there are many alternatives to this. Try them and see which solution works best for your application. Maybe yours is better suited to teleportation locomotion, for example?

You can also check out the new Unity XR Interaction Toolkit. I can't link it here, but Google it and you'll find it quickly; it's also in the Unity Package Manager. It gives you some really nifty drag-and-drop teleportation, interaction, and other XR-based components.

modhdhruv
Explorer
Thanks, @Stimmits, for your efforts. 🙂
It's moving perfectly, as I expected.
Have you tried OVRPlayerController instead of OVRCameraRig?
When using the PlayerController I have an issue with the PanelHMDFollower.cs script. I'm using PanelHMDFollower to keep a button near the OVRCameraRig, but with the PlayerController it moves off in a random direction when the scene loads.

Stimmits
Explorer

Great! I'm using the OVRCameraRig as part of my own player rig setup. But as I mentioned, there are many alternatives for moving your player around. I'm glad you figured it out.

By the way: if you are planning to use pinch gestures or pointer/raycast functionality with hand tracking, you also have to translate the pointer position and rotation into your player rig's reference frame.

It's this part in Hand.cs:
// this part:
_pointer.PointerPosition = _currentState.PointerPose.Position.FromFlippedZVector3f();
_pointer.PointerOrientation = _currentState.PointerPose.Orientation.FromFlippedZQuatf();

// change to:
_pointer.PointerPosition = transform.TransformPoint(_currentState.PointerPose.Position.FromFlippedZVector3f());
Quaternion rotation = _currentState.PointerPose.Orientation.FromFlippedZQuatf();
_pointer.PointerOrientation = Quaternion.LookRotation(transform.TransformDirection(rotation * Vector3.forward), transform.TransformDirection(rotation * Vector3.up));

I hope @Oculus fixes this world to local hassle soon in their SDK, otherwise we'll find ourselves changing all these lines over and over again when they push an update to the SDK.

Zig420
Protege
Hi, I'm also trying to build some kind of teleport function with hand tracking using the pinch gesture. Could someone explain to me, at a high level, what I need to do to get this working? I haven't found any tutorials and I'm not really sure where to start. Thanks.

modhdhruv
Explorer


Yes. I have started moving the OVRCameraRig in the forward direction based on the pinch value. But the issue is that when I apply a Rigidbody to the OVRCameraRig it behaves oddly: it moves around the scene very fast and at random, without any translate code running. If I don't apply a Rigidbody (isKinematic = false), it is not possible to keep the user inside the boundary. Any idea how to solve this?

modhdhruv
Explorer

This movement of the OVRCameraRig works properly if we do not add a Rigidbody and Collider. After adding them, the movement becomes far too fast (the OVRCameraRig flies out of view).

Stimmits
Explorer

Yes, that is true. Manipulating the transform directly does not account for the physics engine. I encourage you to use the OVRPlayerController when trying to incorporate physics.

An alternative is to avoid Transform.Translate and opt for Rigidbody movement instead, such as Rigidbody.MovePosition or manipulating the Rigidbody's velocity to move towards a direction/position. These methods incorporate physics behaviour in their calculations. Be sure to check out the Rigidbody documentation.

Keep in mind that using physics to drive a player controller can drastically decrease the player's comfort level.
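A minimal sketch of the Rigidbody-based alternative, assuming the OVRCameraRig has a non-kinematic Rigidbody assigned to the rigBody field (both field names are placeholders, not Oculus SDK names):

```csharp
float speed = 2.0f;
Rigidbody rigBody; // Rigidbody on the OVRCameraRig, assigned in the Inspector

void FixedUpdate()
{
    // Yaw-only rotation so the rig moves parallel to the ground
    Quaternion headRotationFlat = Quaternion.Euler(0, Camera.main.transform.eulerAngles.y, 0);
    Vector3 step = headRotationFlat * Vector3.forward * speed * Time.fixedDeltaTime;

    // Unlike Transform.Translate, MovePosition lets the physics engine
    // resolve collisions along the path of movement.
    rigBody.MovePosition(rigBody.position + step);
}
```

Physics updates run in FixedUpdate, so Time.fixedDeltaTime is used here instead of Time.deltaTime.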

alejandro_casta
Expert Protege
Hi, sorry for my ignorance, but I am interested in trying the hand-tracking features for the Quest. Could anybody advise me on where I can get the SDK for development? Thanks in advance.

modhdhruv
Explorer




For Unity development:
1. Create an empty project in Unity.
2. Import the Oculus Integration package from the Asset Store.
3. Oculus provides several demos; you can find the HandInteraction demo in the SampleFramework directory.
4. Run and test.