Some months ago you released the Oculus Avatar SDK for Unreal. Along with it you released a sample project that shows how to spawn local and remote avatars (the ALocalAvatar and ARemoteAvatar classes). The same project does a form of packet recording to replicate local movements onto the remote avatars. What we noticed is that everything stays local: the structure that holds the recorded packets is opaque, which makes it unusable for a real multiplayer application. All of the local avatar's movements are stored in a local structure, and all the "remote" avatars read from it and replay the movements on themselves.
Since we would like to use Oculus avatars, what we plan to do is track the hand and head transforms and replicate them from one player to another: a sort of custom packet recording without opaque structures. Before doing that, is there a "right way" to use your ALocalAvatar and ARemoteAvatar to replicate movements in a real multiplayer application? If not, do you have anything scheduled for the near future that could help us with this?
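To illustrate what we mean by a non-opaque custom packet, here is a minimal sketch: a plain struct holding head and hand poses that each client could fill from tracking data and send to the other players. All of the names here (Pose, AvatarPose, Serialize, Deserialize) are illustrative and are not part of any Oculus or Unreal API; a real implementation would also need to worry about endianness and struct padding, which a raw memcpy of the struct ignores.

```cpp
#include <array>
#include <cstdint>
#include <cstring>
#include <vector>

// Illustrative pose data: position plus quaternion orientation.
struct Pose {
    std::array<float, 3> position{};    // x, y, z
    std::array<float, 4> orientation{}; // quaternion x, y, z, w
};

// The "custom packet" payload: one pose per tracked body part.
struct AvatarPose {
    Pose head;
    Pose leftHand;
    Pose rightHand;
};

// Flatten the struct into a byte buffer suitable for a network send.
// Note: raw memcpy assumes both ends share endianness and struct layout.
std::vector<uint8_t> Serialize(const AvatarPose& pose) {
    std::vector<uint8_t> out(sizeof(AvatarPose));
    std::memcpy(out.data(), &pose, sizeof(AvatarPose));
    return out;
}

// Rebuild the struct on the receiving side; returns a default-initialized
// pose if the buffer is too small to contain one.
AvatarPose Deserialize(const std::vector<uint8_t>& bytes) {
    AvatarPose pose{};
    if (bytes.size() >= sizeof(AvatarPose)) {
        std::memcpy(&pose, bytes.data(), sizeof(AvatarPose));
    }
    return pose;
}
```

On the receiving side each client would apply the deserialized transforms to its copy of that player's remote avatar every frame (or interpolate between the last two received poses to smooth out network jitter).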
I'm not sure about the specifics of the Unreal integration, but with Unity the output memory stream produced when recording a packet is just a byte array, so you should be able to send that over the network and unpack it on the local instance of each client.