
How to move Player in Hand tracking

modhdhruv Posts: 10
NerveGear
How do I move the player while using hand tracking on the Oculus Quest? Has anyone implemented player movement?
I've tried moving the OVRCameraRig with four buttons (forward/backward/left/right). The OVRCameraRig moves perfectly, but the issue is that both hand models do not move with it.
I tried parenting the "Hands" game object under the OVRCameraRig, but no luck.
Does anyone have an idea?

Comments

  • TomG Posts: 20
    Brain Burst
    Same issue here. The moment you move the OVRCameraRig away from 0,0,0, the hands stay at the original location. It's pre-release, but still, it seems like that shouldn't be hard to fix. I'll have a look.
  • TomG Posts: 20
    Brain Burst
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();

  • xiesi Posts: 3
    NerveGear
    TomG said:
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();

    It's not working properly. Can you help me solve this problem? Thanks.
  • TomG Posts: 20
    Brain Burst
    edited December 2019
    xiesi said:
    TomG said:
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();

    It's not working properly. Can you help me solve this problem? Thanks.
    What's not working? Please be a bit more specific. I found some issues when you're starting outside of 0,0,0, but if you start there you can teleport elsewhere and still have your hands working.

    video of the result:
    photos.app.goo.gl/EH6Di834PsYH3WgH9
  • xiesi Posts: 3
    NerveGear
    edited December 2019
    TomG said:
    xiesi said:
    TomG said:
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();

    It's not working properly. Can you help me solve this problem? Thanks.
    What's not working? Please be a bit more specific. I found some issues when you're starting outside of 0,0,0, but if you start there you can teleport elsewhere and still have your hands working.

    video of the result:
    photos.app.goo.gl/EH6Di834PsYH3WgH9
    The hand can be used normally. It can move to a position other than 0,0,0, but the position of the ray does not follow the position of the hand. The ray is still emitted from 0,0,0.
  • stefaneisele Posts: 1
    NerveGear

    Don't know if this solution is the best, but it works.
    Add this code to RayTool.cs to get the hand anchors inside the class:

    public GameObject handAnchorLeft, handAnchorRight;

    public void Awake()
    {
        // Note: look up the hand anchor on both sides (the original snippet
        // searched for "RightControllerAnchor", which is the controller anchor).
        handAnchorLeft = GameObject.Find("LeftHandAnchor");
        handAnchorRight = GameObject.Find("RightHandAnchor");
    }

    Also change the GetRayCastOrigin() function inside RayTool.cs as follows:

    private Vector3 GetRayCastOrigin()
    {
        if (IsRightHandedTool)
        {
            return handAnchorRight.transform.position +
                   MINIMUM_RAY_CAST_DISTANCE * handAnchorRight.transform.forward;
        }
        else
        {
            return handAnchorLeft.transform.position +
                   MINIMUM_RAY_CAST_DISTANCE * handAnchorLeft.transform.forward;
        }
    }

    In the script RayToolView.cs go to the Update() function and change "var myPosition =" to:

    var myPosition = InteractableTool.IsRightHandedTool
        ? transform.parent.GetComponent<RayTool>().handAnchorRight.transform.position
        : transform.parent.GetComponent<RayTool>().handAnchorLeft.transform.position;

    This should do the trick. The rays now follow the positions of the hands when you move the character.

  • modhdhruv Posts: 10
    NerveGear
    I'm trying to move the OVRCameraRig forward/back/left/right using hand button interaction. The issue is that when the OVRCameraRig moves forward, the hands do not move with it. Does anyone have a solution for this?
  • modhdhruv Posts: 10
    NerveGear
    TomG said:
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();

    I tried changing the position in HandSkeleton.cs, but it's not working.
  • Stimmits Posts: 7
    NerveGear
    edited January 7
    TomG said:
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();


    Ah, amazing, this worked. I also had to change the rotation to use localRotation. Thank you very much!

    modhdhruv said:
    TomG said:
    Look for the file: HandSkeleton.cs, change line 145 to use localPosition

    // line 145: transform.position = pose.RootPose.Position.FromFlippedZVector3f();
    transform.localPosition = pose.RootPose.Position.FromFlippedZVector3f();

    I tried changing the position in HandSkeleton.cs, but it's not working.

    This only works when you have the skeleton hands active. For the mesh hands, change HandMesh.cs to the following:
    // line 119:
    vertices[i] = transform.InverseTransformPoint(mesh.VertexPositions[i].FromFlippedZVector3f());

    Also, make sure that your hands are parented under the OVRCameraRig.
  • modhdhruv Posts: 10
    NerveGear
    edited January 8
    I tried this change in UpdatePose() in HandSkeleton.cs:
    transform.position = pose.RootPose.Position.FromFlippedZVector3f() + OVRCameraRig.transform.localPosition;
    Now, how do I move the OVRCameraRig in the direction the center camera is looking?
  • Stimmits Posts: 7
    NerveGear
    edited January 8
    modhdhruv said:
    I tried this change in UpdatePose() in HandSkeleton.cs:
    transform.position = pose.RootPose.Position.FromFlippedZVector3f() + OVRCameraRig.transform.localPosition;
    Now, how do I move the OVRCameraRig in the direction the center camera is looking?
    Note that adding positions this way does not account for local rotated translation of the OVRCameraRig, which InverseTransformPoint does do.
    You can test this by rotating the OVRCameraRig 90 degrees on the Y axis. Your hands will appear left/right next to you if you add the positions, and still in front of you when using InverseTransformPoint. Note that the returned position is then also affected by scale, which is a good thing in this case.
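    To make the difference concrete, here is a minimal sketch (the names `rig` and `handLocalPos` are placeholders for illustration, not SDK identifiers):

    ```csharp
    using UnityEngine;

    // Minimal illustration of why InverseTransformPoint/TransformPoint matters.
    // 'rig' stands in for the OVRCameraRig transform; 'handLocalPos' for the
    // tracked hand position in the rig's local space (hypothetical names).
    public class HandSpaceExample : MonoBehaviour
    {
        public Transform rig;

        public Vector3 NaiveWorldPosition(Vector3 handLocalPos)
        {
            // Simple addition ignores the rig's rotation and scale:
            // after rotating the rig 90 degrees, hands end up beside you.
            return handLocalPos + rig.position;
        }

        public Vector3 CorrectWorldPosition(Vector3 handLocalPos)
        {
            // TransformPoint applies the rig's position, rotation AND scale,
            // so the hands stay in front of you after the rig rotates.
            return rig.TransformPoint(handLocalPos);
        }
    }
    ```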

    Moving the OVRCameraRig in the direction the camera is looking - can be done by calculating a flat rotation and translating forward in that direction, doing something like this:

    float speed = 2.0f;
    Transform ovrCameraRig;

    void Update()
    {
        // Flatten the head rotation so only yaw drives the movement direction.
        Quaternion headRotationFlat = Quaternion.Euler(0, Camera.main.transform.eulerAngles.y, 0);
        ovrCameraRig.Translate(headRotationFlat * Vector3.forward * speed * Time.deltaTime, Space.World);
    }

    But there are many alternatives to this. Try and see which solution works best for your application. Maybe yours is better suited to incorporate teleportation locomotion for example?

    You can also try and check out the new Unity XR Interaction Toolkit. Can't link it here, but try and google it, you'll find it really quick. It's also in the Unity Package Manager. It allows you to use some really nifty 'drag-and-drop' teleportation, interaction and more XR-based components.
  • modhdhruv Posts: 10
    NerveGear
    edited January 8
    Thanks @Stimmits for your efforts.  :)
    It's moving perfectly, as I expected.
    Have you tried OVRPlayerController instead of OVRCameraRig?
    When using the PlayerController I have an issue with the PanelHMDFollower.cs script. I'm using PanelHMDFollower to keep a button near the OVRCameraRig, but when I use the PlayerController it moves automatically in some direction when the scene loads.
  • Stimmits Posts: 7
    NerveGear
    edited January 8
    modhdhruv said:
    Thanks @Stimmits for your efforts.  :)
    It's moving perfectly, as I expected.
    Have you tried OVRPlayerController instead of OVRCameraRig?
    When using the PlayerController I have an issue with the PanelHMDFollower.cs script. I'm using PanelHMDFollower to keep a button near the OVRCameraRig, but when I use the PlayerController it moves automatically in some direction when the scene loads.
    Great! I'm using OVRCameraRig as part of my own player rig setup. But as I mentioned, there are many alternatives to moving your player around. I'm glad you figured it out.

    Btw, when you're planning to use pinch gestures or pointer/raycast functionality with hand tracking, you also have to translate the pointer position and rotation into the player rig's reference frame in your situation.

    It's this part in Hand.cs:
    // this part:
    _pointer.PointerPosition = _currentState.PointerPose.Position.FromFlippedZVector3f();
    _pointer.PointerOrientation = _currentState.PointerPose.Orientation.FromFlippedZQuatf();
    // change to:
    _pointer.PointerPosition = transform.TransformPoint(_currentState.PointerPose.Position.FromFlippedZVector3f());
    Quaternion rotation = _currentState.PointerPose.Orientation.FromFlippedZQuatf();
    _pointer.PointerOrientation = Quaternion.LookRotation(transform.TransformDirection(rotation * Vector3.forward), transform.TransformDirection(rotation * Vector3.up));

    I hope @Oculus fixes this world to local hassle soon in their SDK, otherwise we'll find ourselves changing all these lines over and over again when they push an update to the SDK.

  • Zig420 Posts: 27 Oculus Start Member
    Hi, I'm also trying to build some kind of teleport function with hand tracking using the pinch gesture. Could someone explain to me at a high level what I need to do to get this working? I haven't found any tutorials and I'm not really sure where to start. Thanks.
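    Pulling together the pieces discussed above, a pinch-to-teleport approach might look like the following sketch. Hedged: `OVRHand.GetFingerIsPinching` and `OVRHand.PointerPose` are part of the Oculus Integration, but the rig and layer wiring here is an assumption, not a tested recipe:

    ```csharp
    using UnityEngine;

    // Hedged sketch: teleport the camera rig to where the hand's pointer ray
    // hits the floor when an index-finger pinch is released.
    public class PinchTeleport : MonoBehaviour
    {
        public OVRHand hand;          // the tracked hand to read pinches from
        public Transform cameraRig;   // the OVRCameraRig root to move
        public LayerMask floorMask;   // layers that count as teleport targets

        private bool wasPinching;

        void Update()
        {
            bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

            // Teleport on pinch release, using the hand's pointer pose as the ray.
            if (wasPinching && !isPinching)
            {
                Transform pointer = hand.PointerPose;
                if (Physics.Raycast(pointer.position, pointer.forward,
                                    out RaycastHit hit, 30f, floorMask))
                {
                    cameraRig.position = hit.point;
                }
            }
            wasPinching = isPinching;
        }
    }
    ```

    Note that if the hands are parented under the rig, the pointer pose may need the same local-to-world conversion discussed earlier in this thread.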
  • modhdhruv Posts: 10
    NerveGear
    Stimmits said:
    modhdhruv said:
    Thanks @Stimmits for your efforts.  :)
    It's moving perfectly, as I expected.
    Have you tried OVRPlayerController instead of OVRCameraRig?
    When using the PlayerController I have an issue with the PanelHMDFollower.cs script. I'm using PanelHMDFollower to keep a button near the OVRCameraRig, but when I use the PlayerController it moves automatically in some direction when the scene loads.
    Great! I'm using OVRCameraRig as part of my own player rig setup. But as I mentioned, there are many alternatives to moving your player around. I'm glad you figured it out.

    Btw, when you're planning to use pinch gestures or pointer/raycast functionality with hand tracking, you also have to translate the pointer position and rotation into the player rig's reference frame in your situation.

    It's this part in Hand.cs:
    // this part:
    _pointer.PointerPosition = _currentState.PointerPose.Position.FromFlippedZVector3f();
    _pointer.PointerOrientation = _currentState.PointerPose.Orientation.FromFlippedZQuatf();
    // change to:
    _pointer.PointerPosition = transform.TransformPoint(_currentState.PointerPose.Position.FromFlippedZVector3f());
    Quaternion rotation = _currentState.PointerPose.Orientation.FromFlippedZQuatf();
    _pointer.PointerOrientation = Quaternion.LookRotation(transform.TransformDirection(rotation * Vector3.forward), transform.TransformDirection(rotation * Vector3.up));

    I hope @Oculus fixes this world to local hassle soon in their SDK, otherwise we'll find ourselves changing all these lines over and over again when they push an update to the SDK.

    Yes. I've started moving the OVRCameraRig in the forward direction based on the pinch value. But I'm having an issue: when I apply a Rigidbody to the OVRCameraRig, it behaves oddly and moves around the scene far too fast, randomly, without any translate code. If I don't apply a Rigidbody (isKinematic = false), it isn't possible to keep the user inside the boundary. Any idea how to solve this?
  • modhdhruv Posts: 10
    NerveGear
    Stimmits said:
    modhdhruv said:
    I tried this change in UpdatePose() in HandSkeleton.cs:
    transform.position = pose.RootPose.Position.FromFlippedZVector3f() + OVRCameraRig.transform.localPosition;
    Now, how do I move the OVRCameraRig in the direction the center camera is looking?
    Note that adding positions this way does not account for local rotated translation of the OVRCameraRig, which InverseTransformPoint does do.
    You can test this by rotating the OVRCameraRig 90 degrees on the Y axis. Your hands will appear left/right next to you if you add the positions, and still in front of you when using InverseTransformPoint. Note that the returned position is then also affected by scale, which is a good thing in this case.

    Moving the OVRCameraRig in the direction the camera is looking - can be done by calculating a flat rotation and translating forward in that direction, doing something like this:

    float speed = 2.0f;
    Transform ovrCameraRig;

    void Update()
    {
        // Flatten the head rotation so only yaw drives the movement direction.
        Quaternion headRotationFlat = Quaternion.Euler(0, Camera.main.transform.eulerAngles.y, 0);
        ovrCameraRig.Translate(headRotationFlat * Vector3.forward * speed * Time.deltaTime, Space.World);
    }

    But there are many alternatives to this. Try and see which solution works best for your application. Maybe yours is better suited to incorporate teleportation locomotion for example?

    You can also try and check out the new Unity XR Interaction Toolkit. Can't link it here, but try and google it, you'll find it really quick. It's also in the Unity Package Manager. It allows you to use some really nifty 'drag-and-drop' teleportation, interaction and more XR-based components.
    This movement of the OVRCameraRig works properly if we don't add a Rigidbody and Collider. After adding a Rigidbody and Collider, the movement becomes far too fast in the scene (the OVRCameraRig flies out of view).
  • Stimmits Posts: 7
    NerveGear
    modhdhruv said:
    Stimmits said:
    modhdhruv said:
    I tried this change in UpdatePose() in HandSkeleton.cs:
    transform.position = pose.RootPose.Position.FromFlippedZVector3f() + OVRCameraRig.transform.localPosition;
    Now, how do I move the OVRCameraRig in the direction the center camera is looking?
    Note that adding positions this way does not account for local rotated translation of the OVRCameraRig, which InverseTransformPoint does do.
    You can test this by rotating the OVRCameraRig 90 degrees on the Y axis. Your hands will appear left/right next to you if you add the positions, and still in front of you when using InverseTransformPoint. Note that the returned position is then also affected by scale, which is a good thing in this case.

    Moving the OVRCameraRig in the direction the camera is looking - can be done by calculating a flat rotation and translating forward in that direction, doing something like this:

    float speed = 2.0f;
    Transform ovrCameraRig;

    void Update()
    {
        // Flatten the head rotation so only yaw drives the movement direction.
        Quaternion headRotationFlat = Quaternion.Euler(0, Camera.main.transform.eulerAngles.y, 0);
        ovrCameraRig.Translate(headRotationFlat * Vector3.forward * speed * Time.deltaTime, Space.World);
    }

    But there are many alternatives to this. Try and see which solution works best for your application. Maybe yours is better suited to incorporate teleportation locomotion for example?

    You can also try and check out the new Unity XR Interaction Toolkit. Can't link it here, but try and google it, you'll find it really quick. It's also in the Unity Package Manager. It allows you to use some really nifty 'drag-and-drop' teleportation, interaction and more XR-based components.
    This movement of the OVRCameraRig works properly if we don't add a Rigidbody and Collider. After adding a Rigidbody and Collider, the movement becomes far too fast in the scene (the OVRCameraRig flies out of view).
    Yes, that's true. Manipulating the transform directly does not accommodate interaction with the physics engine. I encourage you to use the OVRPlayerController when trying to implement physics.

    An alternative is not using the Transform Translate functionality, but opting for Rigidbody movement functionality; such as Rigidbody.MovePosition or manipulating the Rigidbody velocity to move towards a direction/position. These methods also incorporate Physics behaviour in their calculation. Be sure to check out the Rigidbody documentation.

    Keep in mind that using Physics to drive a player controller can decrease the comfort level of the player drastically.
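    The Rigidbody-driven alternative mentioned above could be sketched like this. This is a minimal, hedged example, not a tested controller; the flat-yaw calculation mirrors the Translate snippet earlier in the thread:

    ```csharp
    using UnityEngine;

    // Sketch: move the rig through the physics engine instead of
    // Transform.Translate. Attach to the rig object alongside a Collider.
    [RequireComponent(typeof(Rigidbody))]
    public class RigidbodyRigMover : MonoBehaviour
    {
        public float speed = 2.0f;
        private Rigidbody rb;

        void Awake()
        {
            rb = GetComponent<Rigidbody>();
            rb.freezeRotation = true;   // keep collisions from tipping the rig
            rb.useGravity = false;      // optional: prevent the rig from falling
        }

        void FixedUpdate()
        {
            // Flat head yaw, as in the Translate example earlier in the thread.
            Quaternion headYaw = Quaternion.Euler(0, Camera.main.transform.eulerAngles.y, 0);
            Vector3 step = headYaw * Vector3.forward * speed * Time.fixedDeltaTime;

            // MovePosition runs through the physics step, so colliders can
            // stop the rig instead of it tunnelling or flying off.
            rb.MovePosition(rb.position + step);
        }
    }
    ```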
  • alejandro.castan.503 Posts: 85 Oculus Start Member
    Hi, sorry for my ignorance, but I'm interested in trying the hand tracking features for Quest. Could anybody advise where I can get the SDK for development? Thanks in advance.
  • modhdhruv Posts: 10
    NerveGear
    alejandro.castan.503 said:
    Hi, sorry for my ignorance, but I'm interested in trying the hand tracking features for Quest. Could anybody advise where I can get the SDK for development? Thanks in advance.
    For Unity development:
    1. Create an empty project in Unity.
    2. Import the Oculus Integration from the Asset Store.
    3. There are several demos available from Oculus. You can find the HandInteraction demo in the SampleFramework directory.
    4. Run and test.
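    Once the integration is imported, a quick sanity check that hand tracking is actually running is to log the state of an OVRHand component. Hedged: `OVRHand.IsTracked` and `HandConfidence` are part of the Oculus Integration; assigning the component in the Inspector is an assumption about your scene setup:

    ```csharp
    using UnityEngine;

    // Sanity check: logs whether a hand is being tracked and with what
    // confidence. Assign an OVRHand (e.g. from the HandInteraction sample)
    // in the Inspector.
    public class HandTrackingCheck : MonoBehaviour
    {
        public OVRHand hand;

        void Update()
        {
            if (hand != null && hand.IsTracked)
            {
                Debug.Log("Hand tracked, confidence: " + hand.HandConfidence);
            }
        }
    }
    ```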