Hi everyone! I'm trying to correct SLAM drift on the Oculus Rift S at warehouse scale. I assume the drift comes from the XYZ 0,0,0 reference set at the start of the experience, and from the inability to create or add anchors in the real world.
So... I would like to mount OpenCV ArUco markers at known spots in my warehouse and have the virtual world move slowly toward those references, so it fits and matches the real world.
Ideally I should also be able to use real-object occlusion rather than just avoiding obstacles... Is there a way to access the Rift S SDK and read the buffer of the tracking (SLAM) cameras? I would like to implement all of this in a UE4 project 😉 Does anyone have ideas to share, or previous experience with this goal? Thanks.
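For the "move the virtual world slowly toward the marker reference" part, one common approach is to never snap the world origin (a snap is jarring in VR) but to blend a small fraction of the measured error in every frame. The sketch below is a minimal illustration of that idea in plain Python; the function name and the per-frame structure are hypothetical, and in practice the measured offset would come from OpenCV's `cv2.aruco` pose estimation and the correction would be applied to the UE4 tracking origin.

```python
# Hypothetical sketch: gradually pull the applied world offset toward the
# offset implied by a detected ArUco marker, instead of snapping.

def smooth_correct(current_offset, measured_offset, alpha=0.02):
    """Exponentially blend the applied offset toward the marker-measured
    offset. alpha is the per-frame correction rate; small values make the
    world drift back into place slowly and (ideally) imperceptibly."""
    return tuple(c + alpha * (m - c)
                 for c, m in zip(current_offset, measured_offset))

# Example: 10 cm of drift on X detected against a wall-mounted marker.
offset = (0.0, 0.0, 0.0)       # correction currently applied to the level
measured = (0.10, 0.0, 0.0)    # correction implied by the ArUco marker
for _ in range(200):           # ~200 frames of gentle convergence
    offset = smooth_correct(offset, measured)
```

After a couple of hundred frames the applied offset has converged to within a couple of millimetres of the measured one, without any visible jump.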
I think you could probably help the SLAM by putting more visually distinct markers on the floor of your warehouse. You should also consider the intensity of any lighting -- it may oversaturate the camera sensors and lower the reliability of the SLAM algorithm. Generally speaking, the majority of the tracking is done by the IMU on the HMD, and the SLAM cameras are there mostly to correct for IMU drift.
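The IMU-plus-camera split described above is essentially a complementary filter: dead-reckon from fast IMU deltas every frame, then blend part of the residual back in whenever a slower camera fix arrives. This toy 1-D Python sketch is only an illustration of the principle (the function name, gain, and bias values are made up, and real headset fusion is far more elaborate), but it shows why a slightly biased IMU stays bounded once occasional camera corrections are applied.

```python
# Toy 1-D complementary-filter sketch (illustrative values throughout).

def step(position, imu_delta, camera_fix=None, gain=0.5):
    """Advance one frame: integrate the IMU delta, then, if a camera
    fix is available this frame, blend a fraction of the error back in."""
    position += imu_delta                  # dead-reckon from the IMU
    if camera_fix is not None:             # occasional SLAM correction
        position += gain * (camera_fix - position)
    return position

# Headset is actually stationary at 0, but the IMU has a +1 cm/frame bias.
pos = 0.0
for frame in range(1, 101):
    fix = 0.0 if frame % 10 == 0 else None  # camera fix every 10th frame
    pos = step(pos, imu_delta=0.01, camera_fix=fix)
```

Without the periodic fixes the position would drift a full metre over those 100 frames; with them it oscillates around roughly 10 cm of residual error instead of growing without bound.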