Oculus support for the cross-platform Godot game engine

Mux213
Explorer
Hi all,

First post for me on this forum, hope I'm in the right place. I've been working on adding VR support to the open source game engine Godot (https://godotengine.org/). Recently all of the core work was merged into the master branch of the project, and basic VR support through OpenVR is working fine.

Seeing as I now own a CV1 and Touch controllers, I was thinking about trying my hand at supporting the Oculus SDK as well (and in the long run looking at mobile support, as Godot runs beautifully on Android). I've only done preliminary research, looking at the sample source code in the SDK, but thought I would do a post to announce I'll be working on this and hopefully get some feedback and help 🙂

One question that I do already have is around the render buffer textures. The sample code seems pretty adamant that the Oculus SDK creates the render buffers for the game engine to render into, and that those render buffers are then committed. Godot already handles all the render buffer creation and has some specific requirements for the way these are created. Is there an issue with not using the Oculus SDK's approach here and instead submitting the textures Godot creates and renders to directly to the Oculus SDK?

Note that everything in Godot is OpenGL 3 based. 
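
For reference, this is roughly the pattern I see in the SDK samples (a simplified sketch from my reading of LibOVR's OVR_CAPI_GL.h; variable names like eye_w/eye_h are mine):

    // The SDK owns the swap chain textures; the engine renders into them.
    ovrTextureSwapChainDesc desc = {};
    desc.Type = ovrTexture_2D;
    desc.Format = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
    desc.ArraySize = 1;
    desc.Width = eye_w;
    desc.Height = eye_h;
    desc.MipLevels = 1;
    desc.SampleCount = 1;

    ovrTextureSwapChain chain;
    ovr_CreateTextureSwapChainGL(session, &desc, &chain);

    // Each frame: fetch the current texture, render into it, then commit.
    int index = 0;
    ovr_GetTextureSwapChainCurrentIndex(session, chain, &index);
    GLuint tex_id = 0;
    ovr_GetTextureSwapChainBufferGL(session, chain, index, &tex_id);
    // ... attach tex_id to a framebuffer and render the eye ...
    ovr_CommitTextureSwapChain(session, chain);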

Cheers,

Bastiaan Olij
10 REPLIES

deftware
Expert Protege
I am not familiar with Godot, but searching through the code there aren't very many framebuffers in use at all. I don't see anything in the way of a post-processing setup, and it appears that you could easily usurp the rendered output and send it wherever you want.

deftware
Expert Protege
https://github.com/godotengine/godot/search?utf8=%E2%9C%93&q=glBindFrameBuffer&type=

There's a system_fbo, which is set to zero, but you could easily change it to your own framebuffer.
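
Something like this, for instance (an untested sketch; I'm guessing at where system_fbo actually lives):

    // Give the engine a framebuffer of our own to treat as "the screen".
    GLuint vr_fbo = 0, vr_tex = 0;
    glGenTextures(1, &vr_tex);
    glBindTexture(GL_TEXTURE_2D, vr_tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

    glGenFramebuffers(1, &vr_fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, vr_fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, vr_tex, 0);

    // Wherever the engine stores it (guessing at the exact member):
    // storage->system_fbo = vr_fbo;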

However, you'd want to take advantage of whatever VR capabilities are built into the engine, because you don't want to naively redraw the entire scene from scratch for each eye.

Mux213
Explorer
Thanks for the answers, guys. I've been busy getting the OpenVR stuff finished and totally lost track of checking back here, so sorry for the late response. I hope to finally sink my teeth into this. Using my Rift over OpenVR has been fun, but I'd like to try native support.

@imperativity that will likely be the course I will have to take. Is this the same approach the OpenVR driver takes when the Rift is used, seeing as OpenVR does not have this limitation? I'm assuming it must do this copy of the buffer committed to OpenVR.
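
If it is a copy, I imagine it would look roughly like this on the Oculus side (just a sketch; all names here are mine, and glBlitFramebuffer keeps it within OpenGL 3.0):

    // Blit Godot's per-eye texture into the SDK-owned swap chain texture.
    int index = 0;
    ovr_GetTextureSwapChainCurrentIndex(session, chain, &index);
    GLuint dst_tex = 0;
    ovr_GetTextureSwapChainBufferGL(session, chain, index, &dst_tex);

    glBindFramebuffer(GL_READ_FRAMEBUFFER, godot_eye_fbo); // Godot's render target
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, copy_fbo);      // scratch FBO we own
    glFramebufferTexture2D(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, dst_tex, 0);
    glBlitFramebuffer(0, 0, eye_w, eye_h, 0, 0, eye_w, eye_h,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);

    ovr_CommitTextureSwapChain(session, chain);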

Not sure how to take the "you will have to use a different engine" remark.. 🙂

@deftware 
The system_fbo was introduced for iOS, which also supplies its own buffer to render to. The problem is that simply replacing this would render just one eye.
Right now Godot does render each eye separately, but that is just the first step in making the engine VR ready. The way VR was added to the engine will allow us to rejig a few things relatively easily to render both eyes in parallel, or at the very least do a lot of preprocessing of the scene once per frame. But we're not there yet.

Mux213
Explorer
@imperativity wrote:

"This is feedback from our PC SDK team; they suggested using a different (more supported) game engine due to the amount of modifications that would be required for your project to work properly."

@imperativity, that's kind of like telling an Oculus engineer they'd be better off working for Microsoft, so you can imagine that remark felt a little misplaced 🙂 But I'm sure the PC SDK team didn't realise I wasn't looking for an engine to use for VR; I was looking to add Rift support to a game engine 🙂

I shall see how far I can get with adding Oculus support to Godot. I'm sure I will have plenty of questions as I go.

Vrally
Protege

Hi @Mux213

I have added support for the Oculus SDK to another open source engine and I know your pain. 🙂

I started integrating the Oculus SDK with the OpenSceneGraph engine back with the very first public SDK. It was not that hard to get working, since the DK1 was pretty much just a monitor with a tracking sensor attached. But I have been forced to rework and redo a lot of that work as the Oculus SDK has evolved.

The Oculus SDK does make some pretty strong demands about the order in which things have to be initialized, which can be a bit tough to shoehorn into an existing architecture. The benefit of an open source engine, though, is that it is often possible to work around these issues. So far I have only been forced to submit one change to the OpenSceneGraph main codebase in order to make the Oculus integration work.
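
A trimmed sketch of that startup order, as I understand LibOVR's C API (error handling mostly omitted):

    // LibOVR insists ovr_Initialize comes before any other ovr_ call.
    ovrInitParams params = {};
    if (OVR_FAILURE(ovr_Initialize(&params)))
        return false;                              // 1. initialize the library

    ovrSession session;
    ovrGraphicsLuid luid;
    if (OVR_FAILURE(ovr_Create(&session, &luid)))
        return false;                              // 2. then create the session

    ovrHmdDesc hmd_desc = ovr_GetHmdDesc(session); // 3. query FOV/resolution
    // 4. only now create GL resources: swap chains, mirror texture, layers...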

If you want to see my code, you'll find a link to my GitHub repository in my signature below.

Mux213
Explorer
@pixelminer thanks man! I will definitely take a look at how you've approached things when I get stuck.

Indeed, with open source engines you do have more choice in working around problems, but it is still an existing engine. The SDK screams "build your engine on top of this", and I kind of get that, because that is how you get every last inch of performance out of it. But it's not always the most practical approach to take when you have an existing engine.

I wouldn't mind going down the "build it from the ground up" approach some day, just not today 🙂

Mux213
Explorer
Not bad for a morning's work. HMD support is all working; I only need to add controller support:

https://github.com/BastiaanOlij/godot_oculus

*edit* Touch controllers work... now it needs a lot of polish 🙂

Vrally
Protege
Nice work!

Another HMD integration project that is worth mentioning is QVR, which shows how to do render-agnostic integrations with:
  • Oculus Rift
  • HTC Vive
  • Google Cardboard and Daydream
  • OSVR
As well as:
  • Almost all tracking and interaction hardware with VRPN
  • Custom large-scale VR labs with multiple GPUs and/or render clusters
  • Desktop-based fake-VR

Source here:
https://github.com/marlam/qvr

deftware
Expert Protege
I can't really imagine how you would even design an API for VR that doesn't demand either that the engine be designed for it from the ground up, or at least some serious wrenching on a conventional engine to get it in there.

The reality is that never before have we had a display device that is also a user input, similar to how mouse input is treated in a game engine. VR integration isn't going to be like swapping physics engines or rendering APIs; it requires that your input system be reworked just for the head/controller tracking.

To add VR rendering to my engine, I just replaced the existing scene camera with scene eyes and generate their matrices from the player entity plus the OVR tracking state. Then, instead of calling glDrawArrays once per piece of geometry, I now call glViewport, glDrawArrays, glViewport, glDrawArrays per piece of geometry, with some glUniform calls to pass in each eye's view/projection matrices. So far so good. Next I have to add in some code to render all the 2D stuff that draws at the end of the frame to a swapchain texture and submit it as an ovrLayerCylinder frame layer, and then replace the UI cursor's mouse input with tracked controller state input... maybe I'll trace a line from the controller position to a cylinder that matches the ovrLayer.
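
In simplified form, the per-geometry loop now looks something like this (names are from my own engine and purely illustrative):

    // One wide render target, split into left/right halves via glViewport.
    // eye_view/eye_proj: per-eye float[16] matrices built from OVR tracking.
    for (const Mesh &mesh : visible_meshes) {
        glBindVertexArray(mesh.vao);
        for (int eye = 0; eye < 2; ++eye) {
            glViewport(eye * eye_w, 0, eye_w, eye_h);
            glUniformMatrix4fv(u_view_loc, 1, GL_FALSE, eye_view[eye]);
            glUniformMatrix4fv(u_proj_loc, 1, GL_FALSE, eye_proj[eye]);
            glDrawArrays(GL_TRIANGLES, 0, mesh.vertex_count);
        }
    }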

Oh yeah, I also have to rework all the little in-game 2D UI overlay stuff to actually render into the scene now: little health bar indicators and targeting reticle type stuff. I can attest to underestimating the work required, but I'm also trying to ensure maximum flexibility, and I have an affinity for making things more complicated than they probably should be.