OpenGL context management VS Oculus

eskil
Honored Guest
Hi

I'm having some issues writing an Oculus plugin for a platform layer I have.

To explain my issue I need to explain a bit about my plugin interface. The platform layer opens a window and creates an OpenGL context; once this is done, it looks for plugins. Plugins can do all sorts of things, but in this example all that matters is rendering. When a plugin gets initialized, it can tell the API that it wants to intercept the application's rendering. When this call is made, the platform layer creates an additional OpenGL context and uses wglShareLists to link it to the OpenGL context attached to the window. Then, once the plugin is activated, it takes over the main rendering loop: it can trigger the application's main render loop as many times as it likes and then present the result to the screen. The API intercepts framebuffer binds so that the application actually draws to a texture when it binds FBO 0. (Kind of nifty, right? :-))
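
A minimal sketch of that interception, assuming the platform layer routes the application's GL calls through a wrapper (platform_bind_framebuffer, intercept_fbo and intercept_active are invented names, and a loader such as GLEW is assumed for the FBO entry points):

#include <GL/glew.h>

static GLuint intercept_fbo = 0;  /* texture-backed FBO owned by the platform layer */
static int intercept_active = 0;  /* set while a plugin is intercepting rendering */

void platform_bind_framebuffer(GLenum target, GLuint framebuffer)
{
    if (intercept_active && framebuffer == 0)
        glBindFramebuffer(target, intercept_fbo); /* "screen" becomes a texture */
    else
        glBindFramebuffer(target, framebuffer);
}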

I have written a fair number of plugins for this, including color correction and a DK1 plugin among other things, but I can't get it to work with the new API. The issue I'm having is that the output shows up on screen but not in the HMD. I assume the rendering is working, since it is warping and tracking correctly, but the SDK is unable to capture the image for some reason.

Right now the order of things is like this (a sketch in code follows the list):

-OpenWindow.
-Create a GL context. /* WON'T BE USED BY OCULUS! */
-ovr_Initialize()
-ovrHmd_Create()
-Create the GL context the Oculus renderer will use.
-ovrHmd_AttachToWindow()
-ovrHmd_ConfigureRendering()
-Create my textures in the Oculus context.
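
A condensed sketch of that order against the 0.4-era C API (open_window and create_gl_context stand in for the platform layer's own helpers):

#include <windows.h>
#include <OVR_CAPI.h>

extern HWND  open_window(void);
extern HGLRC create_gl_context(HWND hwnd);

void init_in_this_order(void)
{
    HWND  hwnd     = open_window();              /* -OpenWindow */
    HGLRC main_ctx = create_gl_context(hwnd);    /* won't be used by Oculus */

    ovr_Initialize();
    ovrHmd hmd = ovrHmd_Create(0);

    HGLRC oculus_ctx = create_gl_context(hwnd);  /* context for the Oculus renderer */
    wglShareLists(main_ctx, oculus_ctx);         /* link it to the window's context */

    ovrHmd_AttachToWindow(hmd, hwnd, NULL, NULL);
    /* ... fill an ovrGLConfig and call ovrHmd_ConfigureRendering(), then
     * create the eye textures with oculus_ctx current. */
}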

Is this wrong, and if so, how? I do create the window before I run ovr_Initialize, but I'm not giving ovr access to it until later, and I am creating the context that will be used later. ovr shouldn't care that I have another OpenGL context. I'm assuming that ovr needs to create its own context and use wglShareLists, in which case it would matter that it gets access to the render context before the textures are created. I'm a bit lost here. If you like, I can send you source code. I would be very interested to know how the capturing of the image has been implemented.

Cheers

E

Side note: the 64-bit service exe crashes; the 32-bit one works. I'm running 64-bit Windows with a 64-bit app, and all the demos work fine.
22 REPLIES

kojack
MVP
There's no need to call them individually if there's no gap between them: ovr_Initialize includes the same functionality as ovr_InitializeRenderingShim (all ovr_InitializeRenderingShim does is call OVR::System::DirectDisplayInitialize(), which ovr_Initialize also calls).
The purpose of calling ovr_InitializeRenderingShim manually is to set up the render shim (which must happen before the program uses any OpenGL) without creating all the other Oculus stuff too; you then call ovr_Initialize later to finish the start-up.
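
A sketch of that split, assuming a 0.4-era application entry point:

#include <OVR_CAPI.h>

int main(void)
{
    ovr_InitializeRenderingShim(); /* before ANY OpenGL use by the process */

    /* ... open the window, create GL contexts, load plugins ... */

    ovr_Initialize();              /* finish the SDK start-up later */
    ovrHmd hmd = ovrHmd_Create(0);
    /* ... run ... */
    ovrHmd_Destroy(hmd);
    ovr_Shutdown();
    return 0;
}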

jfeh
Honored Guest
"cybereality" wrote:
Sorry for the delay.

First, make sure that you call the following function before you set up any of the other stuff you're doing.

ovr_InitializeRenderingShim();


Also, watch out because framebuffer 0 isn't the backbuffer when the Oculus shims are loaded.


Hi,

Could you please elaborate on the latter point?
Is it a global side effect? That is, in a process that has called "ovr_InitializeRenderingShim" at start-up, will every OpenGL context created afterwards no longer have its default framebuffer referenced by index 0, regardless of whether its window is attached to the HMD?

I am having an issue when trying to bind the default framebuffer on the OpenGL context that serves my main window (this window being attached to the Rift via "ovrHmd_AttachToWindow").
Up until the first call to "ovrHmd_GetEyePoses" or "ovrHmd_EndFrame", I can successfully rebind my default framebuffer. However, as soon as either of those two functions has been called, glBindFramebuffer(GL_FRAMEBUFFER, 0) returns with no error, yet querying the current framebuffer with "glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &fbo)" constantly yields 2. What made me realize this was a call to glReadBuffer(GL_FRONT) (same for GL_BACK) failing with GL_INVALID_OPERATION: any framebuffer with a non-zero index is an FBO, and for an FBO only the GL_COLOR_ATTACHMENTi values are valid read buffers.
It is as if the OpenGL context had "lost" its default framebuffer at runtime. (Note: as Nsight cannot be launched while libOVR is initialized, I cannot truly analyze the situation; any suggestions are welcome for proper OpenGL profiling with the Rift.)
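
For reference, a minimal check that reproduces what I am seeing (query_effective_default_fbo is an invented name, and a loader such as GLEW is assumed for the FBO entry points):

#include <stdio.h>
#include <GL/glew.h>

/* Returns the framebuffer the "default" framebuffer actually resolves to. */
GLint query_effective_default_fbo(void)
{
    GLint fbo = 0;
    glBindFramebuffer(GL_FRAMEBUFFER, 0);             /* raises no GL error */
    glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &fbo); /* reports 2 here, not 0 */
    if (fbo != 0)
        printf("default framebuffer redirected to FBO %d\n", fbo);
    return fbo;
}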

If you do have an explanation, I would be really grateful, as understanding this point is of great importance for integrating the DK2 into our software.

As to why I am trying to do this: I realized that when attaching the window to the HMD via "ovrHmd_AttachToWindow", one has to configure rendering with config.OGL.Header.BackBufferSize set to the window size and not the Rift's optimal resolution (1920*1080), for two reasons (a config sketch follows the list):
- Setting the Rift to its optimal resolution (BackBufferSize = {1920,1080}) and attaching it to a smaller window clips/truncates the two distorted views to the window's viewport (i.e. no stereoscopy, i.e. rubbish in the HMD). So obviously, no minification is done inside libOVR.
- If the window is smaller than the Rift's optimal resolution and the Rift is configured to match (BackBufferSize = {window.w, window.h}), stereoscopy works but rendering quality degrades (screen-space undersampling).
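
In code, what this implies for the 0.4-era configuration (configure_for_window is an invented helper; hwnd and dc come from the application):

#include <string.h>
#include <windows.h>
#include <OVR_CAPI_GL.h>

void configure_for_window(ovrHmd hmd, HWND hwnd, HDC dc, int win_w, int win_h)
{
    ovrEyeRenderDesc eyeRenderDesc[2];
    ovrGLConfig cfg;
    memset(&cfg, 0, sizeof(cfg));
    cfg.OGL.Header.API              = ovrRenderAPI_OpenGL;
    cfg.OGL.Header.BackBufferSize.w = win_w;  /* the window size, NOT 1920 */
    cfg.OGL.Header.BackBufferSize.h = win_h;  /* the window size, NOT 1080 */
    cfg.OGL.Window = hwnd;
    cfg.OGL.DC     = dc;
    ovrHmd_ConfigureRendering(hmd, &cfg.Config,
                              ovrDistortionCap_TimeWarp | ovrDistortionCap_Vignette,
                              hmd->DefaultEyeFov, eyeRenderDesc);
}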

So, in order to benefit from the full HMD resolution (and thus optimal rendering quality) without being constrained to show a 1920*1080 window on screen, I tried to render to the Rift via a dedicated hidden window that is always at the optimal resolution (hmd->Resolution).
If the application requires mirroring of the two views (because it rocks), one can copy the Rift's framebuffer to a texture and then texture-map the full viewport of another, visible, window (e.g. an 800*600 OpenGL widget inside a full UI); a sketch of that copy follows.
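
A sketch of the copy, assuming a texture mirror_tex created in advance at the Rift resolution in a context shared with the visible window (copy_rift_frame_to_mirror is an invented name, and a loader such as GLEW is assumed):

#include <GL/glew.h>

/* Call with the hidden Rift window's context current, after ovrHmd_EndFrame.
 * Under the shim, binding 0 resolves to whatever framebuffer the runtime
 * substituted, which is exactly the content we want to mirror. */
void copy_rift_frame_to_mirror(GLuint mirror_tex, int w, int h)
{
    glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
    glBindTexture(GL_TEXTURE_2D, mirror_tex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, w, h);
    /* the visible window then texture-maps mirror_tex over its viewport */
}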
This should be possible: when tracing "ovrHmd_ConfigureRendering" and "ovrHmd_EndFrame" we can see that an OpenGL context is created by libOVR for distortion rendering, and that this context shares data with our original context passed inside the ovrGLConfig structure.
However, as we cannot trace "ovrHmd_AttachToWindow" (socket serialization towards the service), we cannot know how the HWND of the attached window is used. Does it have to have an OpenGL context? Will the service, for example, set an adequate pixel format and create a context shared with the distortion context (to access the distorted-view internal textures)? Is the content of the attached window copied each frame to the native framebuffer of the Rift (which would explain the undersampling/truncation above)? Or is the attached window "physically" bound to the hardware (i.e. no copy)?
Any information regarding the internals of "ovrHmd_AttachToWindow" that would help proper SDK use would be greatly appreciated.

Thanks for your support (and for reading this too-long post 🙂)

eskil
Honored Guest
I have posted a new version that includes two projects and all the source code here:
http://www.quelsolaar.com/oculus_test.zip

The application activates Oculus mode with F1 and quits with "q". Note that once you have built the DLL, you have to move it to the directory the test application runs from; I have left a copy there already.

All Oculus-related code is in betray_plugin_oculus_rift2.c. At the point where this plugin gets initialized, the window has been opened and a context has been created. When the plugin calls betray_plugin_callback_set_image_warp, a new context gets created exclusively for the Oculus plugin; the plugin will never touch or make use of any other context. To be clear: the code that creates this context is in the EXE, and the context is then handed back to the DLL; it is not created in the DLL. (Creating it in the DLL doesn't help, by the way; I have tried.)

I hope this helps.

Cheers

E

jfeh
Honored Guest
As this topic was started by eskil, I will create a new topic dedicated to my specific issue (sorry if I overstepped on your topic, eskil).
However, cybereality, could you please elaborate on the point you previously mentioned:
"Also, watch out because framebuffer 0 isn't the backbuffer when the Oculus shims are loaded."


Where is the backbuffer when the shims are loaded (is it an FBO index)?
How can we retrieve its content (which is the specific question I wanted to address)?

Thanks,
Jeff.

eskil
Honored Guest
Is anyone from Oculus around? I was told to give you my projects, so I did. Will you take a look at them?

E

cybereality
Grand Champion
I did take a quick look at the code, but honestly I was having trouble figuring out how it was supposed to work or even what I was looking at. I can see if one of the other guys more familiar with OpenGL can look at it.

eskil
Honored Guest
If you can find someone, I can always give more help over mail or Skype. You can reach me at eskil at obsession dot se.

eskil
Honored Guest
Well, great! Now I have tried to uninstall Oculus, and that has broken my AMD graphics driver by messing with the driver's registry entries. Uninstalling and reinstalling does not help. Thanks, Oculus! Can you please, in the future, limit yourselves to writing your own broken software and not try to break other people's software too?

jfeh
Honored Guest
Cybereality,

I would appreciate an answer regarding what you previously said:
Also, watch out because framebuffer 0 isn't the backbuffer when the Oculus shims are loaded.


Even though it helps a bit (in a way) to know that 0 is not the default framebuffer, it would be even more useful, and a lot less frustrating, for us developers to know what the default framebuffer's index is after loading the shims, or how to retrieve it.

Jeff.

cybereality
Grand Champion
@eskil: See the post here on how to fix a failed driver uninstall:

viewtopic.php?f=34&t=19343&p=237425#p237425