I'm working on a project where I will be using a motion capture tracking system to track the HMD (Oculus DK2). Now I'm curious because I could not really find any documentation on manual drift correction and wanted to ask around what other developers are doing.
In another thread, I have seen the suggestion that you should only correct yaw drift error when the HMD is rotating slowly or at a standstill.
However, after reading a paper on redirection techniques (http://viscg.uni-muenster.de/publications/2009/SBHJFL09/), I made an interesting observation: applying any drift correction while the HMD is not rotating is actually the worst option for user comfort, because that is when the correction is most obvious and most easily spotted by the user. The paper states that for rotations, they could compress/expand the angle by up to 10% without the users actually noticing. The following quote from the paper is remarkable:
Users can be turned physically about 41% more or 10% less than the perceived virtual rotation,
Therefore, my current idea is to do the following (this all applies only to yaw-rotations):
- Measure yaw error only when the HMD is not rotating, or rotating only very slowly (after a short delay, so the external tracking system has caught up).
- Apply no correction while the HMD is not rotating.
- Apply the correction as a function of rotation speed with a factor of 0.1, so if the user rotates at 10 degrees/second, apply a correction of 1 degree/second.
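The scheme above can be sketched as a per-frame update. This is only an illustration of the idea, not tested DK2 code; the names (`correction_step`, `yaw_error_deg`), the threshold value, and the degrees/seconds units are my own assumptions.

```python
import math

# Speed-proportional yaw drift correction, per the three bullets above.
# Assumption: `yaw_error_deg` was measured while the head was (nearly)
# still, and this function is called once per frame with dt in seconds.

CORRECTION_FACTOR = 0.1          # remove 10% of the current yaw speed
MEASURE_THRESHOLD_DEG_S = 2.0    # below this speed: measure, don't correct

def correction_step(yaw_error_deg, yaw_rate_deg_s, dt):
    """Return (applied_correction, remaining_error) for one frame."""
    if abs(yaw_rate_deg_s) < MEASURE_THRESHOLD_DEG_S:
        # Head (nearly) still: this is when you (re)measure the error,
        # and deliberately apply no correction at all.
        return 0.0, yaw_error_deg
    # Correct proportionally to rotation speed: at 10 deg/s the user
    # absorbs up to 1 deg/s of correction, hidden inside their own motion.
    step = math.copysign(
        min(abs(yaw_error_deg), CORRECTION_FACTOR * abs(yaw_rate_deg_s) * dt),
        yaw_error_deg)
    return step, yaw_error_deg - step
```

The `min(...)` clamp just makes sure the correction never overshoots the remaining error on a frame with a very fast rotation.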
I'm curious how other developers are solving this, and what you think of this idea.
An approach like that could work reasonably well, but the general answer to this problem is a Kalman filter (an EKF is probably most applicable). It is purpose-built for combining sensors with different capabilities (and different sources of error) into a smooth output with prediction. The details are fairly complex, but there is a lot of material out there, as it is used all the time in robotics and control systems.
The general idea is to create a simulated model of your system, and apply statistical methods to remove noise and bias by comparing expected vs. observed outputs. As a high-level example, when you observe a certain displacement from a camera, you would expect to see a certain acceleration from your IMU (plus noise), and vice versa. How they correlate tells you a lot about the true state of the system, i.e. the real position at a given instant.
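To make the predict/correct idea concrete, here is a deliberately minimal one-dimensional Kalman filter for yaw: the gyro rate drives the prediction, and the (slower, noisier, but drift-free) external tracker drives the correction. The noise values are made-up illustration numbers, not tuned for a DK2, and a real HMD filter would be a multi-dimensional EKF over full orientation.

```python
# Minimal 1-D Kalman filter: predict yaw from the gyro (IMU) rate,
# correct it with an external tracker measurement.

class YawKalman1D:
    def __init__(self, q=0.01, r=4.0):
        self.yaw = 0.0   # state estimate (degrees)
        self.p = 1.0     # estimate variance
        self.q = q       # process noise (gyro integration error per step)
        self.r = r       # measurement noise (external tracker variance)

    def predict(self, gyro_rate_deg_s, dt):
        self.yaw += gyro_rate_deg_s * dt   # integrate the gyro
        self.p += self.q                   # uncertainty grows over time

    def update(self, tracker_yaw_deg):
        k = self.p / (self.p + self.r)     # gain: how much to trust tracker
        self.yaw += k * (tracker_yaw_deg - self.yaw)
        self.p *= (1.0 - k)                # uncertainty shrinks
```

Because the gain depends on the accumulated uncertainty, the correction is spread smoothly over many frames instead of snapping the view, which is exactly the "invisible correction" behaviour being discussed here.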
The paper you link is about deliberate misdirection of a walking user, allowing exploration of a whole village in a room the size of a basketball court. It would be (I imagine) important to apply this misdirection only while the user is actually walking; it would be very awkward to be standing still, turning your head left and right, and have the village slowly yaw around you.
For yaw correction (for 'true' direction) you need an external reference, such as the magnetometer (DK1) or camera (DK2). As you say, it might be odd to have the system correcting while your head is stationary. However, with a properly calibrated tracker the yaw drift should actually be pretty small, and the user will (probably) be moving their head enough for you to sneak in the correction at any time without it being noticed.
For yaw correction (for 'true' direction) you need an external reference, such as magnetometer (DK1) or camera (DK2).
I'm currently working on an AR project with the DK2; however, I've noticed that there seems to be no way to use the external camera as a reference point. In Unity, the position of the HMD is set to (0, 0, 0) when the demo starts, and it tracks from there. You can get the position of the camera, but that is relative to the HMD's starting position of (0, 0, 0) and doesn't update.
Any ideas? I don't want to have to use the HMD's start pose as the origin and force the user to start from the same position every time.
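One common way around a tracker that only reports poses relative to its arbitrary start pose is to capture, once at calibration time, a simultaneous sample of (a) the pose in your external/world reference (e.g. from the mocap system) and (b) the tracker-relative pose, and derive a fixed offset you then apply every frame. A sketch of that idea, simplified to 2-D position plus yaw; the names (`make_offset`, `to_world`) and the 2-D simplification are my own, not a Unity or Oculus API:

```python
import math

# Re-anchoring sketch: map tracker-relative poses into a world frame
# using one calibration sample of (world pose, relative pose).
# Angles in radians, positions as (x, y) tuples.

def make_offset(world_pos, world_yaw, rel_pos, rel_yaw):
    """Offset (rotation + translation) from one simultaneous sample."""
    dyaw = world_yaw - rel_yaw
    c, s = math.cos(dyaw), math.sin(dyaw)
    # world_pos = R(dyaw) * rel_pos + t   =>   t = world_pos - R(dyaw) * rel_pos
    tx = world_pos[0] - (c * rel_pos[0] - s * rel_pos[1])
    ty = world_pos[1] - (s * rel_pos[0] + c * rel_pos[1])
    return (dyaw, tx, ty)

def to_world(offset, rel_pos, rel_yaw):
    """Apply the stored offset to any later tracker-relative pose."""
    dyaw, tx, ty = offset
    c, s = math.cos(dyaw), math.sin(dyaw)
    x = c * rel_pos[0] - s * rel_pos[1] + tx
    y = s * rel_pos[0] + c * rel_pos[1] + ty
    return (x, y), rel_yaw + dyaw
```

In a Unity setup, the same idea usually means parenting the tracked camera under an anchor object and setting that anchor's transform from the calibration offset, so the user can start anywhere.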