Drift Correction - Replace the Tracker with MoCap Data

xflagx
Level 2
Hello everyone,

I'm trying to build a walking VR environment where people can walk through a fairly large (12x15 meter) real space and experience a corresponding virtual one. I've got a PhaseSpace motion capture system, which works quite well for carrying the camera through VR. (I'm using the latest version of Unity, 5.3.2, but this may be a general issue.)

Due to practical constraints, I'm not allowed to use the camera shipped with the Rift.
So I've got the yaw drift issue... :? Only sometimes, but it's reproducible :geek:

First attempt to solve the issue: calling UnityEngine.VR.InputTracking.Recenter() every time the orientation passes a reference point. This results in glitches and feels uncomfortable...
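What I imagine instead is blending the correction in gradually rather than snapping. A rough Unity sketch of the idea (mocapYawDegrees is a placeholder for whatever reference yaw my mocap system reports; this is just a sketch, not tested code):

```csharp
using UnityEngine;

// Sketch: instead of a hard Recenter() at a reference point, slerp a
// yaw correction into the camera's parent transform over many frames,
// so the user never notices the adjustment.
public class SmoothYawCorrection : MonoBehaviour
{
    // Parent transform the VR camera sits under; rotating it applies a
    // correction without touching the tracked camera itself.
    public Transform cameraParent;

    // Yaw reported by the mocap system at the reference point
    // (an assumption -- fill this in from your own tracking code).
    public float mocapYawDegrees;

    // Convergence rate; keep it small so the blend stays unnoticeable.
    public float correctionSpeed = 0.5f;

    void LateUpdate()
    {
        // World-space yaw the headset currently reports.
        float hmdYaw = Camera.main.transform.eulerAngles.y;

        // Drift = difference between the mocap truth and the HMD report;
        // the target parent rotation absorbs that difference.
        Quaternion target = cameraParent.rotation *
                            Quaternion.Euler(0f, mocapYawDegrees - hmdYaw, 0f);

        // Slerp a little each frame instead of snapping.
        cameraParent.rotation = Quaternion.Slerp(
            cameraParent.rotation, target, correctionSpeed * Time.deltaTime);
    }
}
```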

After reading the well-written article* on the drift topic, the following question came to my mind.

Is it possible to feed the drift correction with my own reference points, created by a motion capture system?
Or more specifically:
Are there any API calls that allow passing in data that overrides the tracker data? In other words: can I use my own tracker based on the motion capture information?

So far I've only found the option to disable the tracker, which isn't enough.

Any suggestion or comment is welcome! If there is a solution, it will become part of a free and open-source piece of software 8-)

Thanks in advance!

* https://developer.oculus.com/blog/magnetometer/
17 REPLIES

pasta
Level 2
Ah, okay, it must have been some weird effect from running at 10 fps that made it look that way.  Right, I was sleeping 100 ms to do what I described in my previous post.  It also appears I did not solve the issue using the Oculus SDK directly either.

I think I understand the issue now.  I have the display board of the CV1 angled so it is not horizontal to the floor when a user is looking straight ahead.  This causes the wobble from timewarp.  I see that in the older Oculus SDK, an ovrTimewarpProjectionDesc could be submitted in the ovrLayerEyeFovDepth layer.  That no longer appears to be the case.  If I had access to ovrTimewarpProjectionDesc and a way to submit it, I believe I could fix this issue.

So, my question is: is there still access to this directly in Unity or in the Oculus SDK?  Thanks!

vrdaveb
Oculus Staff
> I have the display board of the CV1 angled so it is not horizontal to the floor when a user is looking straight ahead.

Sorry, the source code for Unity's VR support is not public and there isn't an easy way to modify its behavior. Without that, it will be hard to support a modified CV1. Can I ask why it needs to be angled?

pasta
Level 2
Got it, I thought that might be the case.  Right, the angle was for a better fit.  It's not absolutely necessary, though, so I should still be okay even if this particular path won't work.

Thanks for your time!

Fuby1000
Level 4

vrdaveb said:




using UnityEngine;
using System.Collections;

public class FakeTracking : MonoBehaviour
{
    public OVRPose centerEyePose = OVRPose.identity;
    public OVRPose leftEyePose = OVRPose.identity;
    public OVRPose rightEyePose = OVRPose.identity;
    public OVRPose leftHandPose = OVRPose.identity;
    public OVRPose rightHandPose = OVRPose.identity;
    public OVRPose trackerPose = OVRPose.identity;

    void Awake()
    {
        OVRCameraRig rig = GameObject.FindObjectOfType<OVRCameraRig>();

        if (rig != null)
            rig.UpdatedAnchors += OnUpdatedAnchors;
        // print("in awake");
    }

    void OnUpdatedAnchors(OVRCameraRig rig)
    {
        if (!enabled)
            return;

        //This doesn't work because VR camera poses are read-only.
        //rig.centerEyeAnchor.FromOVRPose(OVRPose.identity);

        //Instead, invert out the current pose and multiply in the desired pose.
        OVRPose current = rig.centerEyeAnchor.ToOVRPose(true);
        OVRPose pose = current.Inverse();

        OVRPose correctedCenterEyePose = current;
        correctedCenterEyePose.orientation *= Quaternion.Euler(-38.25f, 0.0f, 0.0f);
        // pose = centerEyePose * pose;
        pose = correctedCenterEyePose * pose;
        rig.trackingSpace.FromOVRPose(pose, true);

        //OVRPose referenceFrame = pose.Inverse();

        //The rest of the nodes are updated by OVRCameraRig, not Unity, so they're easy.
        rig.leftEyeAnchor.FromOVRPose(leftEyePose);
        rig.rightEyeAnchor.FromOVRPose(rightEyePose);
        rig.leftHandAnchor.FromOVRPose(leftHandPose);
        rig.rightHandAnchor.FromOVRPose(rightHandPose);
        rig.trackerAnchor.FromOVRPose(trackerPose);
    }
}






I'm trying to replace all tracking data with mocap data. I thought this might be the answer, but frankly I have no idea what your code does. Can you explain how exactly I insert my mocap data into this code? Also, where would I attach that script? On an empty GameObject? I'm working with Unity 5.4.1f1.

vrdaveb
Oculus Staff
> I have no idea what your code does.

This code forces the head and hand poses to match ones that you provide by assigning to the *Pose fields. It applies your poses right after OVRCameraRig normally would have applied our tracking data by subscribing to the UpdatedAnchors event.

> Can you explain how exactly I insert my mocap data into this code?

Write your own MonoBehaviour and, in an Update function, set FakeTracking.*Pose to the values reported by your mocap system.

> Also, where would I attach that script? On an empty GameObject?

Yep, that should work. I tested it by attaching it to the same GameObject as my OVRCameraRig, but it's up to you.
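For example, a minimal feeder might look like this. MocapSource and its fields are placeholders for whatever API your mocap system exposes, not a real PhaseSpace or Oculus API:

```csharp
using UnityEngine;

// Placeholder for your mocap client -- these fields are assumptions;
// fill them in from whatever your mocap SDK actually reports.
public class MocapSource : MonoBehaviour
{
    public Vector3 HeadPosition;
    public Quaternion HeadRotation = Quaternion.identity;
}

// Feeder in the spirit described above: each Update, copy the latest
// mocap pose into FakeTracking's public fields, which FakeTracking then
// applies in its UpdatedAnchors callback.
public class MocapPoseFeeder : MonoBehaviour
{
    public FakeTracking fakeTracking;  // the script posted above
    public MocapSource mocap;          // your mocap data source

    void Update()
    {
        OVRPose head;
        head.position = mocap.HeadPosition;
        head.orientation = mocap.HeadRotation;
        fakeTracking.centerEyePose = head;

        // Hand poses could be assigned the same way if your suit tracks them.
    }
}
```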

J_Egels
Level 2


vrdaveb said:

Sorry, the ability to disable rotation tracking was never prioritized. It's unlikely to get fixed in the near future. However, if you are tracking rotation at all with your own system, the high-frequency data from TimeWarp should complement it and reduce latency rather than cause wobble. You can set whatever position and rotation you want as follows:

using UnityEngine;
using System.Collections;

public class FakeTracking : MonoBehaviour
{
    public OVRPose centerEyePose = OVRPose.identity;
    public OVRPose leftEyePose = OVRPose.identity;
    public OVRPose rightEyePose = OVRPose.identity;
    public OVRPose leftHandPose = OVRPose.identity;
    public OVRPose rightHandPose = OVRPose.identity;
    public OVRPose trackerPose = OVRPose.identity;

    void Awake()
    {
        OVRCameraRig rig = GameObject.FindObjectOfType<OVRCameraRig>();

        if (rig != null)
            rig.UpdatedAnchors += OnUpdatedAnchors;
    }

    void OnUpdatedAnchors(OVRCameraRig rig)
    {
        if (!enabled)
            return;

        //This doesn't work because VR camera poses are read-only.
        //rig.centerEyeAnchor.FromOVRPose(OVRPose.identity);

        //Instead, invert out the current pose and multiply in the desired pose.
        OVRPose pose = rig.centerEyeAnchor.ToOVRPose(true).Inverse();
        pose = centerEyePose * pose;
        rig.trackingSpace.FromOVRPose(pose, true);

        //OVRPose referenceFrame = pose.Inverse();

        //The rest of the nodes are updated by OVRCameraRig, not Unity, so they're easy.
        rig.leftEyeAnchor.FromOVRPose(leftEyePose);
        rig.rightEyeAnchor.FromOVRPose(rightEyePose);
        rig.leftHandAnchor.FromOVRPose(leftHandPose);
        rig.rightHandAnchor.FromOVRPose(rightHandPose);
        rig.trackerAnchor.FromOVRPose(trackerPose);
    }
}




I see this thread was last commented on in November 2016.
I'm currently using the CV1 with a mocap suit, and I use the quoted code to "disable" the tracking on the CV1 so that the camera behaves as a child of the mocap suit's head.

My problem is that when I move my head slightly, there's still a tiny bit of movement, and I think this is making me feel sick.
Does anyone know if you can fully disable all of this, so that the camera's rotation comes entirely from the parent head of my mocap suit?

I can completely turn off the tracking with this code:
using UnityEngine;

public class EnableDisableOVRPosAndRotTracking : MonoBehaviour {

    public bool useOVRTracking = false;

    // Toggle both rotation and position tracking once at startup.
    void Start() {
        OVRPlugin.rotation = useOVRTracking;
        OVRPlugin.position = useOVRTracking;
    }
}

___________________________________________________

The problem is that in play mode, the headset displays the camera render in a 2D window floating in a completely black space.

seanarvr
Level 2

vrdaveb said:

> I have no idea what your code does.

This code forces the head and hand poses to match ones that you provide by assigning to the *Pose fields. It applies your poses right after OVRCameraRig normally would have applied our tracking data by subscribing to the UpdatedAnchors event.

> Can you explain how exactly I insert my mocap data into this code? 

Write your own MonoBehavior and in an Update function, set FakeTracking.*Pose to the values reported by your mocap data.

> Also, where would I attach that script? On an empty Game Object?

Yep, that should work. I tested it by attaching it to the same GameObject as my OVRCameraRig, but it's up to you.
//This doesn't work because VR camera poses are read-only.
//rig.centerEyeAnchor.FromOVRPose(OVRPose.identity);

Are there any ways to modify the centerEyeAnchor transform?
I would like to use my own mocap data to override centerEyeAnchor.



Erikges
Level 2
Hi everyone, I'm trying to correct the Oculus Rift S SLAM drift over long distances. I assume this drift is due to the XYZ 0,0,0 reference set at the beginning of the experience and the impossibility of creating or adding anchors in the real world.

So... I would like to hook OpenCV ArUco markers in some spots of my warehouse and make the virtual world move slowly toward this reference so it fits and matches the real world.

That way I should also be able to use real-object occlusion rather than having to avoid it... Is there a way to access the Rift S SDK to use the buffer of the SLAM tracking camera? I would like to implement all of this in a UE4 project 😉 Does anyone else have an idea to share, or previous experience with this goal? Thanks.
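The slow blend I have in mind, sketched in Unity-style C# since that's what the rest of this thread uses (my project is UE4, but the math is the same; markerPositionError is a placeholder for the offset an ArUco detection would report between the real and virtual marker positions):

```csharp
using UnityEngine;

// Sketch of "move the virtual world slowly toward the marker reference":
// each frame, nudge the tracking origin a small fraction of the remaining
// error, so the correction spreads over many seconds and stays invisible.
public class MarkerDriftCorrection : MonoBehaviour
{
    // Root transform of the tracked rig / virtual world.
    public Transform trackingOrigin;

    // Real-vs-virtual offset measured from the ArUco marker
    // (placeholder -- updated by your own detection code).
    public Vector3 markerPositionError;

    // Fraction of the error corrected per second; keep it small.
    public float blendPerSecond = 0.05f;

    void LateUpdate()
    {
        trackingOrigin.position = Vector3.Lerp(
            trackingOrigin.position,
            trackingOrigin.position + markerPositionError,
            blendPerSecond * Time.deltaTime);
    }
}
```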