
Divergence after turning off position tracking

Fuby1000
Protege
I want to increase the area in which I can use my CV1. To do that, I'm taking the CV1's orientation but using the position data that the motion capture system delivers. In the OVRCameraRig I've turned off position tracking. Unfortunately, I now have divergence issues, since the eye positions aren't matched anymore (the left eye looks left, the right eye looks right, which results in a lot of pain).

How can I access the right/left eye positions or the space between them? Might there be an easy solution?
Note that if I move across the room it sometimes gets better or worse.

Unity Version: 5.4.03f
Oculus Utilities Version: 1.9
Mocap System: Motive Body 1.9
Plug-Ins: OptiTrack Unity Plugin
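
For reference, the per-eye values I'm after are the ones Unity 5.4 exposes through UnityEngine.VR; a throwaway probe script along these lines (class name is just for illustration) shows what the SDK reports:

using UnityEngine;
using VR = UnityEngine.VR;

// Small probe: print the per-eye positions the SDK reports and their
// separation, which should correspond to the configured IPD.
public class EyeSeparationProbe : MonoBehaviour
{
    void Update()
    {
        Vector3 left  = VR.InputTracking.GetLocalPosition(VR.VRNode.LeftEye);
        Vector3 right = VR.InputTracking.GetLocalPosition(VR.VRNode.RightEye);
        Debug.Log("Left eye: " + left + "  Right eye: " + right +
                  "  Separation: " + Vector3.Distance(left, right));
    }
}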
4 REPLIES

Fuby1000
Protege
Still no replies? Is there no way to exchange the CV1's position data with the mocap's position data before the software corrects the divergence issues?

Dragonfrost
Protege
What exactly are you trying to do and how? 

Fuby1000
Protege


"What exactly are you trying to do and how?"


For my master's thesis I'm trying to increase the area in which I can use my CV1. To achieve this, I've tried to get rid of the CV1's own tracking and replace it with data I get from tracking the CV1 with a motion capture system (Motive Body 1.9). I stream this data to Unity 3D using the plug-in that OptiTrack (the Motive developers) provides for this exact use case.

I've already given up on the rotation, though, since I would miss too much of the CV1's software that's needed for correction. And the rotation isn't bound to the Rift sensor anyway.

What's left now is the position of my CV1. I want to replace the position tracking data the sensor delivers with that of the mocap system. This works fine (I can just turn off position tracking using the OVRManager script on the OVRCameraRig), but while I get the position of the WHOLE HMD (or, to be precise, its center), I have no information about the IPD (eye distance), which results in divergence.
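
For context, applying the mocap position currently looks roughly like this ("Use Position Tracking" is unticked on the OVRManager; the mocap transform is a placeholder for whatever the OptiTrack plugin streams the HMD rigid body into):

using UnityEngine;

// Rough sketch of my current setup: the Rift keeps delivering orientation,
// but the rig's position comes from the mocap system.
public class MocapPositionDriver : MonoBehaviour
{
    public Transform cameraRig;       // the OVRCameraRig (position tracking disabled on OVRManager)
    public Transform mocapRigidBody;  // placeholder: the HMD rigid body streamed from Motive

    void Update()
    {
        // Move the whole rig; the eye anchors inside it should keep their own
        // local offsets, which is exactly the part that seems to go wrong.
        cameraRig.position = mocapRigidBody.position;
    }
}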

I've found out that I can interact with the position of the eye anchors, so I've tried to add a distance of 66 mm manually, but this doesn't do a thing once I put on the HMD (it works fine in the Unity preview, though, which is confusing). The code looks like this:
// Rotations still come from the Rift's own tracking.
trackerAnchor.localRotation = tracker.orientation;
centerEyeAnchor.localRotation = VR.InputTracking.GetLocalRotation(VR.VRNode.CenterEye);
leftEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : VR.InputTracking.GetLocalRotation(VR.VRNode.LeftEye);
rightEyeAnchor.localRotation = monoscopic ? centerEyeAnchor.localRotation : VR.InputTracking.GetLocalRotation(VR.VRNode.RightEye);

// Positions from the Rift, with each eye pushed a further 33 mm outwards
// (i.e. adding 66 mm to the eye separation).
trackerAnchor.localPosition = tracker.position;
centerEyeAnchor.localPosition = VR.InputTracking.GetLocalPosition(VR.VRNode.CenterEye);
leftEyeAnchor.localPosition = VR.InputTracking.GetLocalPosition(VR.VRNode.LeftEye) + new Vector3(-0.033f, 0, 0);
rightEyeAnchor.localPosition = VR.InputTracking.GetLocalPosition(VR.VRNode.RightEye) + new Vector3(0.033f, 0, 0);

// Debug output to check what actually ends up on the anchors.
print("Left eye position: " + leftEyeAnchor.localPosition);
print("Right eye position: " + rightEyeAnchor.localPosition);
print("IPD: " + (rightEyeAnchor.localPosition - leftEyeAnchor.localPosition));
So how would I achieve this? If I could just manually insert the IPD somehow, that would be fine.
But I get the feeling it's not that straightforward after all. Is there something I'm missing?
Maybe the Rift sensor provides more than just IPD and position?
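
In other words, what I imagine is something along these lines: measure the eye separation once from the SDK and re-apply it myself around the mocap-driven center anchor (untested sketch; the 64 mm fallback is just a guess):

using UnityEngine;
using VR = UnityEngine.VR;

// Untested sketch: place the left/right eye anchors symmetrically around the
// center eye anchor, using the SDK-reported eye separation or a fixed value.
public class ManualIpdApplier : MonoBehaviour
{
    public Transform centerEyeAnchor;  // position from mocap, rotation from the Rift
    public Transform leftEyeAnchor;
    public Transform rightEyeAnchor;
    public float fallbackIpd = 0.064f; // metres, used if the SDK reports nothing useful

    void LateUpdate()
    {
        // Eye separation as reported by the SDK (should match the configured IPD).
        Vector3 left  = VR.InputTracking.GetLocalPosition(VR.VRNode.LeftEye);
        Vector3 right = VR.InputTracking.GetLocalPosition(VR.VRNode.RightEye);
        float ipd = Vector3.Distance(left, right);
        if (ipd < 0.001f)
            ipd = fallbackIpd;

        // Offset each eye by half the IPD along the head's local x axis.
        Vector3 halfOffset = centerEyeAnchor.localRotation * new Vector3(0.5f * ipd, 0f, 0f);
        leftEyeAnchor.localPosition  = centerEyeAnchor.localPosition - halfOffset;
        rightEyeAnchor.localPosition = centerEyeAnchor.localPosition + halfOffset;
        leftEyeAnchor.localRotation  = centerEyeAnchor.localRotation;
        rightEyeAnchor.localRotation = centerEyeAnchor.localRotation;
    }
}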

I hope this explains my problem a little better, if not feel free to ask.
(Sorry for the formatting, it simply puts the code boxes where it wants...)

nalex66
MVP
You should post this in the Developer forum (this is the Community forum, which is more geared towards end-users). You can switch over to the Dev forum with the drop-down at the top-right of the screen where it says Community.

DK2, CV1, Go, Quest, Quest 2, Quest 3.


Try my game: Cyclops Island Demo