Neither my own applications nor the SDK examples appear to work in OpenGL mode if I enable DirectHMD with a DK1 (no DK2 to test with yet).
In my own example code, if I create a context with GLFW and don't specify any special attributes (like the GL version or the profile), the window is created on my primary monitor at the resolution of the Rift, in the upper-left corner. If I attempt to specify the GL version and profile type, I get an error that the context could not be created.
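For reference, here's a minimal sketch of the failing setup (GLFW 3; the 3.3 core profile hints are just an example of "specifying the version and profile", nothing special to my app):

    // Minimal repro sketch with GLFW 3. Without the three hint calls the
    // window opens (on the primary monitor); with them, context creation
    // fails under DirectHMD on my machine.
    #include <GLFW/glfw3.h>

    int main() {
        if (!glfwInit())
            return -1;

        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        // 1280x800 is the DK1 panel resolution.
        GLFWwindow* window = glfwCreateWindow(1280, 800, "Rift test", 0, 0);
        if (!window) {
            // This is the failure path I hit in DirectHMD mode.
            glfwTerminate();
            return -1;
        }
        glfwMakeContextCurrent(window);
        // ... render loop ...
        glfwTerminate();
        return 0;
    }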
In the OculusWorldDemo, if I set the OpenGL renderer as the primary renderer, the example crashes. This is because the wglCreateContextAttribsARB call in Render_GL_Win32_Device.cpp returns NULL and the application doesn't check for that. The crash occurs a little later, when glGetString(GL_VERSION) returns null (because there is no context active) and the SDK calls strstr() with the null pointer.
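The fix on the SDK side would just be a couple of null checks, something along these lines (a sketch of what I'd expect, not the actual SDK code):

    // Hypothetical null-check sketch for Render_GL_Win32_Device.cpp:
    HGLRC context = wglCreateContextAttribsARB(hdc, 0, attribs);
    if (!context)
        return NULL; // fail cleanly instead of carrying on with no context

    // ... later, where the GL version is probed:
    const char* version = (const char*)glGetString(GL_VERSION);
    if (!version)
        return NULL; // glGetString returns NULL with no current context;
                     // feeding that into strstr() is the crash I'm seeing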
I'm wondering if this is related to this thread and the behind-the-scenes voodoo that Oculus is using to render to a non-existent monitor surface.
If I set the Rift mode back to 'Extend Desktop', rendering in OpenGL works properly in both my application code and the Oculus SDK examples.
Brad Davis - Developer for High Fidelity
Co-author of Oculus Rift in Action
Comments
Moving right past the :shock: at AMD not being supported, that's not the issue:
nVidia GeForce GTX 650 Ti, Driver version 337.50.
I render my own distortion at 250 FPS. Will I be able to continue doing this without the user having to jump through hoops to play the game with this new screen driver?
It's really simple to just rotate the rendering by 90 degrees. It seems you are pushing extended mode and moving away from cloned (which you should have done a long time ago), so that's good, but how does this Direct Mode work? And why haven't you asked the community about this? You asked whether the box was important, but don't you think this is a little more crucial?
"It's like Homeworld in first person."
Disable Aero and vSync for a completely simulator sickness free experience with 2xHz FPS.
Keep the config utility open for tracking to work.
It hides the display from the OS and drives it independently. It captures the swap buffers created by the application and sends them to the Rift for scan out.
Not exactly sure what you're asking about here. I don't know what you mean by "box."
Bottom line: don't force us into your software chain; Oculus is not the only platform we need to cater to! I, for one, have to support VR (not only Oculus), a normal screen, and Android. Make my job easier!
"It's like Homeworld in first person."
Disable Aero and vSync for a completely simulator sickness free experience with 2xHz FPS.
Keep the config utility open for tracking to work.
According to the documentation you can still perform your own distortion and even timewarp. However, I would go with the SDK-provided versions, as they are more likely to be optimised and correct. If you render just the distortion at 250 FPS, the distortion alone eats 4 ms of your 13.3 ms frame budget on the 75 Hz DK2 - you might consider going with the much faster Oculus-provided distortion.
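For reference, the SDK-rendered path in the 0.4 API is only a handful of calls. A rough sketch based on the documentation (error handling and render-target setup omitted; hwnd and eyeTexture are assumed to be your own window handle and per-eye textures):

    // Sketch of SDK-rendered distortion with the 0.4 OpenGL API:
    ovrGLConfig cfg;
    memset(&cfg, 0, sizeof(cfg));
    cfg.OGL.Header.API = ovrRenderAPI_OpenGL;
    cfg.OGL.Header.RTSize = hmd->Resolution;
    cfg.OGL.Header.Multisample = 1;
    cfg.OGL.Window = hwnd;

    ovrEyeRenderDesc eyeDesc[2];
    ovrHmd_ConfigureRendering(hmd, &cfg.Config,
        ovrDistortionCap_Chromatic | ovrDistortionCap_TimeWarp,
        hmd->DefaultEyeFov, eyeDesc);

    // Per frame: render each eye into your own texture, then let the
    // SDK distort, timewarp, and present.
    ovrHmd_BeginFrame(hmd, 0);
    ovrPosef pose[2];
    for (int i = 0; i < 2; ++i) {
        ovrEyeType eye = hmd->EyeRenderOrder[i];
        pose[eye] = ovrHmd_GetEyePose(hmd, eye);
        // ... draw the scene for this eye into eyeTexture[eye] ...
    }
    ovrHmd_EndFrame(hmd, pose, &eyeTexture[0].Texture);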
I think Oculus is going in the right direction here. For customers it's better if the Rift is not visible as a screen (how would you even configure that? 2D windows end up on that screen, the OS menu bar (OSX) could end up on that screen, etc.; also, games start on the correct screen automatically!). Tracking and distortion can be updated without having to touch the applications.
It's less complicated for the users and just a bit more complex for the developers, which is fine.
Because that setting overrides the DirectHMD setting if you have a DK1 connected.
Follow along: https://github.com/CarlKenner/dolphin/commits/VR-Hydra
Latest Version: viewtopic.php?f=42&t=11241&start=1020#p249426
No, my game's full stereoscopic frames run at 250 FPS and I distort them myself. On Nvidia it runs at 1000 FPS, on ATI 200 FPS!?
"It's like Homeworld in first person."
Disable Aero and vSync for a completely simulator sickness free experience with 2xHz FPS.
Keep the config utility open for tracking to work.
Sure, if it were standard and properly implemented, yes. But there has to be a way to write your own platform layer without being punished for it by your users having to jump through hoops to get your game running!
"It's like Homeworld in first person."
Disable Aero and vSync for a completely simulator sickness free experience with 2xHz FPS.
Keep the config utility open for tracking to work.
Unless your application is the only Rift application your customers have installed, they will already have the runtime installed. It's not so uncommon for users to have to install drivers when they buy new hardware.
Facebook is not a nice grandma; they're going to squeeze us where they can. I need them to first deliver unbiased, simple hardware that works perfectly. Second, they can build all the software they want, provided they don't force it down my throat.
"It's like Homeworld in first person."
Disable Aero and vSync for a completely simulator sickness free experience with 2xHz FPS.
Keep the config utility open for tracking to work.
How does what you're asking for make things easier for anyone? It's not easier for Oculus, as there would be no way to delegate access to the Rift. It's not easier for developers, since they're left on their own to implement much of what will become routine as the hardware kinks are worked out. And it's not easier for users, as their configurations are ignored by every single VR app they own.
I'm down with the idea that access to the tracker information is delegated through a centralized service in order to allow multiple applications to access the Rift and the potential for allowing seamless transitions from application to application. However, it seems like this relatively straightforward functionality is orthogonal to the new (much more fragile and experimental) direct HMD code. The two things shouldn't be tied together in the same service.
One USB sensor, one display, one camera = One Rift. There's quite a bit being tied together in the service, along with HMD virtualization, hot plug, hardware matching, etc.
I'm developing with OpenGL on AMD hardware, so I won't be able to do that until the next update?
No, you just can't use direct mode. You have to use the legacy extended mode.
Win7: window->OVR: DK1: <not tested>
Win7: window->OVR: DK2: black
Win7: OVR->window: DK1: ok
Win7: OVR->window: DK2: artefacts (GL3.3) - crash (highest GL (4.x))
Win8: window->OVR: DK1: black
Win8: window->OVR: DK2: black
Win8: OVR->window: DK1: crash
Win8: OVR->window: DK2: crash
(OVR->window means init the SDK first, then create a window) Everything on nvidia.
No, as I'm just using the Rifts for porting right now, both machines are set to direct HMD mode only.
Win7: window->OVR: DK2: black
Win7: OVR->window: crash
This gave me a bit more info about the crash:
It's triggered in ovrHmd_EndFrame at line 637 (hmds->pRenderer->EndFrame(true);), then at line 284 (ReleaseDC(RParams.Window, dc);), and then it dies in C:\Windows\System32\OVRDisplay64.dll: Unhandled exception at 0x000007FEF4418D08 (OVRDisplay64.dll) in sdl_glew_ovr.exe: 0xC0000005: Access violation reading location 0x0000000000000008.
You're zeroing out the memory in ovrRenderAPIConfig before calling ConfigureRendering, right? If you don't, the HDC and HWND members of ovrGLConfigData will likely contain garbage, which could trigger such a crash.
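I.e., something like this (a sketch following the 0.4 docs; hwnd is assumed to be your real window handle):

    // Zero the whole config union first so no stale garbage survives:
    ovrGLConfig cfg;
    memset(&cfg, 0, sizeof(cfg));
    cfg.OGL.Header.API = ovrRenderAPI_OpenGL;
    cfg.OGL.Header.RTSize = hmd->Resolution;
    cfg.OGL.Header.Multisample = 1;
    cfg.OGL.Window = hwnd;    // the HWND member
    cfg.OGL.DC = GetDC(hwnd); // the HDC member mentioned above
    // ... then pass &cfg.Config to ovrHmd_ConfigureRendering.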
No, it's set up to store the HMD resolution, my window handle, etc.; see http://pastebin.com/b39QEsM1 - it's as described in the SDK documentation on page 29.
Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
But if you build a driver that makes people's computers bluescreen, you should not be shipping a service. There is no need for a service; you can share data without one. Whoever made that decision wears a suit and is greedy in every sense of the word.
We need a standalone driver that does not create more problems than it solves. I don't need positional tracking for my game yet, and the 0.4 Java driver (the StellaArtois one) won't be ready for a while. I will tell my users not to install 0.4 yet, and to uninstall it if they already have, since 0.3.x works now. Because of this:
You are fragmenting your own product.
You need to learn to communicate when taking a big step like this: ask the community (here on your forum, where your developers are, not your consumers on Reddit). NOBODY wants yet another crummy service bogging down our already-crap OS (which we were also forced to use).
"It's like Homeworld in first person."
Disable Aero and vSync for a completely simulator sickness free experience with 2xHz FPS.
Keep the config utility open for tracking to work.
Thank you.
I actually see the point of using a centralized process to ensure that multiple apps can access the sensor data, if for no other reason than to ensure that there can be a VR launcher that can launch other apps directly into VR mode. It's the first step toward making a VR environment a viable desktop replacement. As for sharing data without a service: since only one process can access the hardware at a time, the only real alternative would be for VR applications to find a running process that's already accessing the hardware, or launch one if none exists. That is essentially a poor man's service. If you think services are inherently resource-consuming to a significant degree, you're mistaken.
Well, the problems are stemming from Direct HMD mode, which is orthogonal to the fact that there's a service at all. I can see the purpose of doing it, and I've said before that one of the biggest hurdles Oculus had to overcome would be the issues inherent in the Rift being treated just like another monitor by the OS. I was just hoping that the cure wouldn't be worse than the disease.
I don't think Direct HMD should have been rolled out concurrently with the DK2 SDK. Since it clearly has some stability issues, it would have been far better to release the runtime with the existing extended-mode mechanism for working with the Rift and include Direct HMD as a beta that people could try out. That way, people willing to bear the burden of potential blue screens and unbootable systems could have taken the risks, and people who just wanted stability while developing software for the DK2 would have had it.
Try to remember that you can't fragment a product that doesn't exist yet. The Rift is still in the dev kit stage.
The (D3D-based) Oculus demos work fine with direct mode, though.
Windows 7 64bit + nVidia GPU.