Oculus Mobile side-by-side stereo rendering— is it possible?

I am developing an Oculus Mobile app using the C++ SDK. My target is the Oculus Go (I am not concerned with any platform older than the Oculus Go). A basic version of my code (based closely on the VrCubeWorld_NativeActivity sample from the SDK) is here.

I am looking at the multiview rendering feature described here. As I understand VrApi, the swapchain hands you two framebuffers, one for each eye; if you use the GL_OVR_multiview OpenGL extension, it allows you to render into both framebuffers simultaneously (one draw call draws to both).
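For concreteness, a multiview vertex shader looks roughly like this (the uniform and attribute names here are my own illustration, not taken from the SDK sample):

```glsl
#version 300 es
#extension GL_OVR_multiview2 : require

// Every shader used with multiview must declare the view count up front.
layout(num_views = 2) in;

// Per-eye view-projection matrices, indexed by the built-in gl_ViewID_OVR
// (0 for the left eye, 1 for the right).
uniform mat4 u_ViewProj[2];
in vec3 a_Position;

void main()
{
    gl_Position = u_ViewProj[gl_ViewID_OVR] * vec4(a_Position, 1.0);
}
```

A single draw call with this shader runs once per view, writing each eye's result into its own layer/framebuffer.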

However, in my exploration of VR on desktop I have encountered three separate ways of doing stereo rendering. The first is multi-pass stereo, where you simply render each eye one at a time. The second is view-based single-pass stereo, which is what the GL_OVR_multiview extension does. The third is side-by-side single-pass stereo, using extensions such as GL_ARB_viewport_array and GL_AMD_vertex_shader_viewport_index. In this approach there is only one framebuffer but multiple viewports, and a single pass draws twice into the one double-wide framebuffer. The project I work on (LOVR) has found this final method, side-by-side single-pass stereo, to be more flexible than multiview, and therefore preferable for us. (The big problem we encountered with multiview is that it requires a declaration like "layout(num_views = 2) in;" at the top of every shader, whereas with viewport arrays a single shader can be used for both stereo and mono renders.)
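To illustrate the side-by-side approach, here is a rough sketch of the vertex shader we use on desktop, selecting a viewport per instance (uniform names are illustrative; the host code would set two half-width viewports with glViewportIndexedf and issue one instanced draw with an instance count of 2):

```glsl
#version 330
#extension GL_AMD_vertex_shader_viewport_index : require

// One instance per eye: gl_InstanceID picks both the viewport half and
// the per-eye view-projection matrix.
uniform mat4 u_ViewProj[2];
in vec3 a_Position;

void main()
{
    // Route instance 0 to the left half-viewport, instance 1 to the right.
    gl_ViewportIndex = gl_InstanceID;
    gl_Position = u_ViewProj[gl_InstanceID] * vec4(a_Position, 1.0);
}
```

Note there is no num_views declaration; rendering the same shader in mono just means drawing with a single instance and a single full-width viewport.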

My question is: Is it possible to do side-by-side stereo rendering on any Oculus Mobile platform? Naively, looking at VrApi it does not seem to be possible, because the swapchain issues a separate framebuffer per eye…