I am running Unity 2019.4.11f1 with the latest Oculus Integration.
My scene contains a camera that renders to a RenderTexture. The camera is only enabled occasionally to update the otherwise static texture, but it drains performance even while disabled. The strange thing is that this only starts after the camera has rendered for the first time; before that, everything is fine. The cost while disabled is equal to rendering the camera continuously, and it scales with the resolution of the target texture. However, a script with OnPostRender attached to the camera never receives the event, so the camera is not actually rendering (at least not in Unity's domain).
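For reference, this is roughly the setup, stripped down to the essentials (class and field names are simplified placeholders, not my exact code):

```csharp
using UnityEngine;

// Attached to the RenderTexture camera. The camera stays disabled
// and is rendered on demand to refresh the otherwise static texture.
// OnPostRender is purely a diagnostic: it never fires while the
// camera is disabled, so Unity itself is not rendering it.
public class SnapshotCamera : MonoBehaviour
{
    public Camera snapshotCam;      // disabled by default
    public RenderTexture target;

    // Called occasionally to update the texture.
    public void RefreshTexture()
    {
        snapshotCam.targetTexture = target;
        snapshotCam.Render();       // one-shot render while disabled
    }

    void OnPostRender()
    {
        Debug.Log("OnPostRender fired on " + name);
    }
}
```

The performance drain appears only after the first call to RefreshTexture and then persists even though OnPostRender stops firing.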
I have tried disabling the camera component and the entire GameObject, assigning a 1x1 render texture to the camera while it is disabled, and setting the camera's culling mask to Nothing while it is disabled.
The only way to stop the performance drain is to destroy the camera.
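In code, everything I tried looks roughly like this (a sketch; none of these calls removes the drain except the final Destroy):

```csharp
using UnityEngine;

public class CameraIdleWorkarounds : MonoBehaviour
{
    public Camera snapshotCam;
    RenderTexture tinyRT;

    void Start()
    {
        // 1x1 dummy target to swap in while idle
        tinyRT = new RenderTexture(1, 1, 16);
    }

    // Attempts to make the camera truly idle -- no effect on the drain:
    public void TryToMakeIdle()
    {
        snapshotCam.enabled = false;                // disable the component
        snapshotCam.gameObject.SetActive(false);    // disable the whole GameObject
        snapshotCam.targetTexture = tinyRT;         // shrink the target to 1x1
        snapshotCam.cullingMask = 0;                // culling mask = Nothing
    }

    // The only thing that actually stops the performance drain:
    public void DestroyCamera()
    {
        Destroy(snapshotCam);
    }
}
```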
If it is any help: Graphics.Blit also does not work on Quest, although it works in the Editor and on a regular Android phone. I suspect the Oculus Integration is hijacking render textures/cameras somehow.
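The Blit case is as simple as it gets (sketch; nothing Quest-specific in the code):

```csharp
using UnityEngine;

public class BlitTest : MonoBehaviour
{
    public Texture source;
    public RenderTexture destination;

    void Start()
    {
        // Copies source into destination in the Editor and on an
        // Android phone, but appears to do nothing on Quest.
        Graphics.Blit(source, destination);
    }
}
```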