09-01-2021 12:41 PM
Now that I have my head wrapped around how to use the new AR Passthrough API (using Unity), I want to try a "shared" AR experience for two players in the same physical room using stage tracking mode - maybe like hitting a floating ball back and forth (to keep it simple) or maybe a ping pong game: real room, virtual ball and table.
How would you go about making sure that virtual objects appear in the same location for BOTH players in real time? How would you synchronize the origin and initial orientation on the two Quests?
09-01-2021 05:26 PM
Seems like one player needs to be the authority on where the origin of the shared world is. You could have player 1 put their controller down on a marker (in a specific orientation) to set the origin and then have player 2 put their controller down on the same marker/orientation to synchronize. Then object positions could be communicated back and forth in the local coordinate system of that synchronization transform.
If you had a 3D printer you could print a "cup holder" for touch controllers that ensured a very specific position and orientation to guarantee close synchronization.
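A minimal sketch of that exchange in Unity C#, assuming a `syncAnchor` Transform captured when the controller sits on the marker (the class and member names here are made up, and the networking layer itself is left out):

```csharp
using UnityEngine;

// Hypothetical helper for the scheme described above: each headset keeps its
// own syncAnchor Transform at the shared physical marker, and all networked
// positions/rotations are expressed relative to that anchor.
public class SharedAnchorSync : MonoBehaviour
{
    public Transform syncAnchor; // set from the controller pose on the marker

    // Before sending: world space -> anchor-local space,
    // so both headsets agree on what the numbers mean.
    public Vector3 ToShared(Vector3 worldPos) =>
        syncAnchor.InverseTransformPoint(worldPos);

    public Quaternion ToShared(Quaternion worldRot) =>
        Quaternion.Inverse(syncAnchor.rotation) * worldRot;

    // On receipt: anchor-local space -> this headset's world space.
    public Vector3 FromShared(Vector3 localPos) =>
        syncAnchor.TransformPoint(localPos);

    public Quaternion FromShared(Quaternion localRot) =>
        syncAnchor.rotation * localRot;
}
```

With this approach neither rig ever has to move: each player sends the ball's pose through `ToShared` and applies received poses through `FromShared`, and the shared anchor does all the reconciliation.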
09-01-2021 05:47 PM
I *DO* have a 3D printer. Cool idea... once I have the difference transform (rotation plus position offset, not just a quaternion)... I would have to apply the inverse of it to the "non-authoritative" OVRCameraRig parent object, right?
09-01-2021 07:15 PM
Yeah, I think that would work. If you offset and rotate player 2's camera rig so that the synchronization transform's world position and rotation match player 1's, you'd have aligned worlds.
(Of course, I'm just closing my eyes and imagining, but it seems like it would work.)
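That rig adjustment might look something like the sketch below, assuming player 2 has measured the marker pose locally and received player 1's pose for the same marker over the network (`Align` and both `Pose` parameters are illustrative names, not an existing API):

```csharp
using UnityEngine;

public static class RigAlignment
{
    // Rigidly move/rotate the camera rig root (e.g. the OVRCameraRig parent)
    // so that the locally measured anchor pose coincides with the pose the
    // authoritative player reported for the same physical marker.
    public static void Align(Transform cameraRig, Pose localAnchor, Pose remoteAnchor)
    {
        // Rotation carrying the local anchor's orientation onto the remote one.
        Quaternion delta =
            remoteAnchor.rotation * Quaternion.Inverse(localAnchor.rotation);

        // Apply the same rigid transform to the rig: rotate it, then place it
        // so the anchor point lands exactly on the remote anchor position.
        cameraRig.rotation = delta * cameraRig.rotation;
        cameraRig.position = remoteAnchor.position
            + delta * (cameraRig.position - localAnchor.position);
    }
}
```

One caveat worth noting: in practice you would probably constrain `delta` to yaw only (rotation about the world up axis), since both headsets already agree on gravity and a full 3D correction would tilt the tracked world.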