When trying to do sensor fusion with other devices, I've noticed that while CameraFrustumFarZInMeters reports 2.5 m, I start seeing positional issues closer to about 2.4 m. There also seems to be no validation that the reported pose actually lies within the camera frustum: at roughly 2.4 m out I've seen it lose tracking, regain it, and then report a pose more than 10 m from the camera, which fails a basic sanity test of being inside the frustum. As a result, my head becomes very disconnected from the objects attached to it until tracking is reacquired with valid data. Is there any way to force it to ignore bad positional data that isn't obvious? I could stop listening to the positions it's reporting, but I would prefer to keep the position offsets being generated for the head orientation. I should also mention I'm experiencing this in Unreal Engine 4.8 with SDK 0.6.1, so I'd prefer not to modify the OculusRiftHMD implementation too much, for future integration concerns.
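In case it helps illustrate the kind of sanity test I mean: a minimal sketch of rejecting positions that couldn't possibly be inside the camera frustum. The camera is assumed at the origin looking down +Z; the near/far and FOV values here are placeholders standing in for the CameraFrustum* properties, not values I've confirmed from the SDK.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical sanity check: reject reported positions that cannot lie
// inside the tracking camera's frustum. Camera assumed at the origin,
// looking down +Z. The defaults are illustrative placeholders.
bool IsPositionPlausible(const Vec3& p,
                         float nearZ   = 0.4f,    // stand-in for CameraFrustumNearZInMeters
                         float farZ    = 2.5f,    // stand-in for CameraFrustumFarZInMeters
                         float hFovRad = 1.292f,  // ~74 deg horizontal FOV (assumed)
                         float vFovRad = 0.942f)  // ~54 deg vertical FOV (assumed)
{
    if (p.z < nearZ || p.z > farZ)                  // outside the depth range
        return false;
    if (std::fabs(p.x) > p.z * std::tan(hFovRad * 0.5f))  // outside horizontal cone
        return false;
    if (std::fabs(p.y) > p.z * std::tan(vFovRad * 0.5f))  // outside vertical cone
        return false;
    return true;
}
```

Something like this would have flagged the 10 m pose immediately, since it is four times the far plane distance.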
I'm not sure if this is what you are seeing, but when positional tracking loses sight of the headset (whether through leaving the frustum, occlusion, etc.), the software falls back on a head-and-neck model. So you will still receive some position information, but it won't be entirely accurate. Could that be what you're observing?
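On the asker's side, one way to keep the orientation-derived offsets while discarding the bad jumps is to gate the position channel only: pass orientation through every frame, but freeze the position at the last accepted value whenever a new sample moves implausibly far. This is a sketch under my own assumptions; the class, field names, and the 0.3 m per-frame threshold are illustrative, not anything from the SDK or the UE4 OculusRiftHMD code.

```cpp
#include <cassert>

// Illustrative pose type: position plus quaternion orientation.
struct Pose { float px, py, pz; float qx, qy, qz, qw; };

// Sketch of a position gate: orientation always passes through, while a
// position sample that jumps farther than maxStepMeters from the last
// accepted position is replaced by that last accepted position.
class PositionGate {
public:
    Pose Filter(const Pose& sample, float maxStepMeters = 0.3f) {
        if (!hasLast_) { last_ = sample; hasLast_ = true; return sample; }
        const float dx = sample.px - last_.px;
        const float dy = sample.py - last_.py;
        const float dz = sample.pz - last_.pz;
        const bool jump =
            (dx * dx + dy * dy + dz * dz) > maxStepMeters * maxStepMeters;
        Pose out = sample;                 // orientation from the new sample
        if (jump) {                        // hold position at last good value
            out.px = last_.px; out.py = last_.py; out.pz = last_.pz;
        } else {
            last_ = sample;                // accept and remember this sample
        }
        return out;
    }
private:
    Pose last_{};
    bool hasLast_ = false;
};
```

The trade-off is that during a fallback period the rendered position lags reality, but that is usually less jarring than a 10 m teleport; you could also blend back toward the live position over a few frames once samples become plausible again.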