So... the way we went about this was not to have separate projects for Vive/Rift; instead we had objects with scripts that we toggled on and off to test various things. We ran into a lot of issues with the transforms, anchor points, etc. being different between the headsets, so in the beginning we always had hacky offsets to make things work, but later we just toggled the settings on and off for one headset or the other. It's kind of a pain, but I would almost say you build for one headset to completion and then port it over. We used Rift all the way to the end in this manner and then went back and fixed whatever broke on the Vive. Not ideal, but there isn't anything fundamentally different about them that needs to be accounted for in most cases with an HMD plus two controllers.
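A minimal sketch of what that toggle-per-headset approach can look like in a single Unity project. Everything here is made up for illustration: the `HeadsetRigSwitcher` class, the rig object names, and the offset value are assumptions, not code from our project; the only real API used is `UnityEngine.XR.XRDevice.model` for detecting the connected headset.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: one project, two rig variants, toggled at startup
// based on which headset is connected.
public class HeadsetRigSwitcher : MonoBehaviour
{
    public GameObject riftRig;   // objects/scripts tuned for the Rift
    public GameObject viveRig;   // objects/scripts tuned for the Vive

    // Hacky per-headset anchor offset (placeholder value) to compensate
    // for the differing tracking origins mentioned above.
    public Vector3 viveAnchorOffset = new Vector3(0f, -0.05f, 0f);

    void Start()
    {
        // XRDevice.model reports the connected HMD, e.g. "Vive MV".
        bool isVive = XRDevice.model.IndexOf(
            "Vive", System.StringComparison.OrdinalIgnoreCase) >= 0;

        riftRig.SetActive(!isVive);
        viveRig.SetActive(isVive);

        if (isVive)
            viveRig.transform.localPosition += viveAnchorOffset;
    }
}
```

In practice you'd tune the offset (or eliminate it by using the same tracking-origin mode on both platforms) rather than hard-code it, but this is the shape of the "toggle on and off" setup.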
It looks to me like Oculus (OVR), Valve (SteamVR/OpenVR), and VRTK all provide the same BASIC cross-platform support: rendering, tracking, and input. However, they all appear to stop short of avatars (i.e., hand or controller avatars). So I assume this is just the state of things and that developers must build their own avatar system. Do I have that right?