New "VR Object" examples ?

Anonymous
Not applicable
We're quite interested in playing around with "VR Object", but we've failed to find any information about integrating VR Objects with Unity. Any idea where I could find some information about this?
19 REPLIES

mfmf
Oculus Staff
Is this referring to the public test realm mixed reality support? We're hoping to have more information on this before long. We will post it to the forum when it's available.

Anonymous
Not applicable
Correct, it's in the 1.16 update, where you can pair a third Touch controller as a "VR Object".


Constellation
Expert Protege

mfmf said:

Is this referring to the public test realm mixed reality support? We're hoping to have more information on this before long. We will post it to the forum when it's available.


Any update on this? I actually don't need to do any MR capture; I mainly want to get access to the position & orientation of the VR Object in Unity so I can attach a third Touch controller to something (in the real world) and track it.

Constellation
Expert Protege
I couldn't find a way to do it from Unity, but I could do it via the native SDK, and it should be easy enough to wrap the call in a native plugin so it can be made from Unity. Here's my code:

ovrPoseStatef pose;
ovrTrackedDeviceType type = ovrTrackedDevice_Object0;  // the paired "VR Object" (third Touch)
ovr_GetDevicePoses(Session, &type, 1, 0, &pose);        // query one device pose at absTime 0
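To get that into Unity, the C# side of such a plugin could be a small P/Invoke wrapper. Rough sketch below; the plugin name ("VRObjectPlugin") and the exported GetVRObjectPose function are just placeholders for whatever the native DLL ends up exposing, not anything in the SDK, and the axis flip is the usual LibOVR-to-Unity coordinate conversion:

using System.Runtime.InteropServices;
using UnityEngine;

public static class VRObjectNative
{
    // Placeholder native export: fills position (x, y, z) and orientation
    // (x, y, z, w) for ovrTrackedDevice_Object0; returns true on success.
    [DllImport("VRObjectPlugin")]
    private static extern bool GetVRObjectPose(float[] position, float[] orientation);

    public static bool TryGetPose(out Vector3 position, out Quaternion rotation)
    {
        float[] p = new float[3];
        float[] q = new float[4];
        bool ok = GetVRObjectPose(p, q);

        // Convert from the SDK's right-handed frame to Unity's left-handed one
        // (negate Z for the position, negate X and Y for the quaternion).
        position = new Vector3(p[0], p[1], -p[2]);
        rotation = new Quaternion(-q[0], -q[1], q[2], q[3]);
        return ok;
    }
}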


Constellation
Expert Protege
I feel like I'm pretty close to getting it working in Unity. I found a Node called DeviceObjectZero and I had a hunch it would correspond with ovrTrackedDevice_Object0. I can get the orientation of the VR Object (third Touch) as follows:

OVRPose p = OVRPlugin.GetNodePose(OVRPlugin.Node.DeviceObjectZero, OVRPlugin.Step.Render).ToOVRPose();

For some reason it doesn't seem to update position; any ideas?
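A minimal repro is just logging the reported pose every frame (inside any MonoBehaviour); the orientation values change as I rotate the Touch, but the position stays fixed:

void Update()
{
    OVRPose p = OVRPlugin.GetNodePose(OVRPlugin.Node.DeviceObjectZero, OVRPlugin.Step.Render).ToOVRPose();

    // Dump the raw pose so it's easy to see whether the position ever changes.
    Debug.Log("VR Object pos: " + p.position.ToString("F3")
            + "  rot: " + p.orientation.eulerAngles.ToString("F1"));
}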

mfmf
Oculus Staff
Hmm, that should work. Are you sure the object's actually being positionally tracked? (e.g. if it's not in view of the sensors, the position won't be updated)

Constellation
Expert Protege
Since I was only working with the (third) Touch and mostly working in Visual Studio, I'd actually left my Rift on my desk. Once the head sensor inside the Rift was activated (in this case by holding my thumb over it), the Touch position tracking started working. When I uncovered the sensor again, the position tracking continued to work for a while, but eventually it stopped. From looking at the LEDs on the cameras, it seems like the head sensor in the Rift is what turns the cameras on and off; is there any way to decouple this other than by putting tape over the sensor? I have a use case where I'd like to track a real-world object using a third Touch, and I need to maintain tracking at all times, even if the user temporarily removes the Rift.

I've noticed that the orientation tracking of the Touch seems to work even when both the Rift and the Touch are outside the camera range and the cameras are off; I suppose the updates I'm getting are purely IMU-based. I've also noticed that I'm getting position updates as well; the values are non-zero, but I can't figure out what they're actually based on. I modified my Update() to check GetNodePositionTracked and GetNodeOrientationTracked, so I'm not sure why those updates are still coming through. It would be nice if these functions only returned true when the Node in question was within range of the cameras (and the cameras were actually on).

Here's my Update():
void Update()
{
    // Query the VR Object's pose for the current frame.
    OVRPose p = OVRPlugin.GetNodePose(OVRPlugin.Node.DeviceObjectZero, OVRPlugin.Step.Render).ToOVRPose();

    // Apply position/orientation only when the node reports as tracked.
    if (OVRPlugin.GetNodePositionTracked(OVRPlugin.Node.DeviceObjectZero))
    {
        this.transform.position = p.position;
    }

    if (OVRPlugin.GetNodeOrientationTracked(OVRPlugin.Node.DeviceObjectZero))
    {
        this.transform.rotation = p.orientation;
    }
}



mfmf
Oculus Staff
Odd-- yeah, unlike the hands, the tracked object assumes it's always tracked. (Your code should work as expected for the hands.) I'll ask around about that. May be a limitation of our implementation, though.

mfmf
Oculus Staff
Update: looks like we're planning a fix for that in an upcoming integration, but it could still be a month or so out.
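In the meantime, a rough app-side workaround (just a sketch, nothing official, and the class name here is only for illustration) would be to treat a reported position that stops changing as stale and ignore it:

using UnityEngine;

public class VRObjectFollower : MonoBehaviour
{
    private Vector3 lastPosition;
    private int framesUnchanged;
    private const int StaleFrameThreshold = 30; // roughly a third of a second at 90 Hz; tune as needed

    void Update()
    {
        OVRPose p = OVRPlugin.GetNodePose(OVRPlugin.Node.DeviceObjectZero, OVRPlugin.Step.Render).ToOVRPose();

        // Count consecutive frames in which the reported position hasn't changed.
        framesUnchanged = (p.position == lastPosition) ? framesUnchanged + 1 : 0;
        lastPosition = p.position;

        // Apply the position only while it's still updating; orientation is
        // IMU-driven and keeps working out of camera range, so always apply it.
        if (framesUnchanged < StaleFrameThreshold)
        {
            this.transform.position = p.position;
        }

        this.transform.rotation = p.orientation;
    }
}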