Hello Oculus developers! I'm David Borel. I've been in graphics for over 10 years at NVIDIA, Havok, and zSpace, and 4 years next month at Oculus, where I lead the Engine Integrations team. We focus on core VR and performance in popular engines like Unity and Unreal. At OC4, I did a talk with Jian Zhang and Remi Palandri that covered some of what we do, including recent optimizations (https://www.youtube.com/watch?v=pjg309WSzlM).
I'm looking forward to your questions and will answer as many as possible. Let's do this!
Twitter: @Dave_Borel Oculus ID: vrdaveb
There may be a slight delay after you submit your questions. Please submit only once; we've received your message, and it will appear in the thread shortly!
I've found that great custom hand poses help immersion in games. For Unity you have the Grip Poses scene, but do you plan to expand on it?
For example, modifying a left-hand pose and having it automatically mirrored to a right-hand pose, and vice versa.
Also, some tutorials on how to lerp from one custom hand pose to another?
Finally, it would be amazing if you released hand poses like the ones in Oculus First Contact, Toybox, or the new Home screen, so those of us without much design skill can hit the ground running.
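The mirroring and lerping asked about above are both straightforward joint-level operations. Here is a minimal, engine-agnostic sketch (not Oculus or Unity API; the joint layout and the YZ-plane mirroring convention are assumptions) showing quaternion slerp between two poses and left/right mirroring of a single joint:

```python
import math

def slerp(q0, q1, t):
    """Spherical interpolation between two unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                      # flip to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:                   # nearly parallel: lerp, then renormalize
        q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def mirror_joint(pos, rot):
    """Mirror one joint across the YZ plane: negate x of the position and
    negate y and z of the quaternion (one common left/right convention)."""
    px, py, pz = pos
    w, x, y, z = rot
    return (-px, py, pz), (w, x, -y, -z)

# Blend a single joint halfway from an open hand to a fist.
open_rot = (1.0, 0.0, 0.0, 0.0)        # identity
fist_rot = (0.7071, 0.7071, 0.0, 0.0)  # 90 degrees about x
half = slerp(open_rot, fist_rot, 0.5)  # 45 degrees about x
```

A full pose is just an array of these joints, so lerping two hand poses means running `slerp` per joint (plus a lerp of positions if your rig animates them).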
The hardest challenge we've taken on recently may be Multiview. It required a lot of changes to engines' core rendering, moving from a per-eye traversal to a single traversal that renders objects to both eyes at once. Qualcomm, ARM, Epic, Unity, and the Oculus Mobile SDK team all worked hard with us to deliver it.
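The traversal change can be illustrated with a toy model. This is not engine or SDK code (all names here are hypothetical); it just shows why collapsing two scene walks into one roughly halves CPU-side submission work, with the GPU broadcasting each draw to both views:

```python
# Toy model of the scene-traversal change behind multiview rendering.
# "scene" and "views" stand in for real engine structures; in the real
# extension the shader picks its view via an index like gl_ViewID_OVR.

def render_per_eye(scene, eye_views):
    """Classic stereo: cull and submit the whole scene once per eye."""
    draw_calls = []
    for view in eye_views:                  # two full traversals
        for obj in scene:
            draw_calls.append((obj, view))
    return draw_calls

def render_multiview(scene, eye_views):
    """Multiview: one traversal; each submission targets every view."""
    return [(obj, tuple(eye_views)) for obj in scene]

scene = ["rock", "tree", "hand"]
views = ["left", "right"]
classic = render_per_eye(scene, views)      # 6 CPU-side submissions
single = render_multiview(scene, views)     # 3 CPU-side submissions
```

The draw count is the same on the GPU either way; the win is that culling, state setup, and command generation happen once instead of twice.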
Hi David, do you have any tips for native debugging of UE4? We are able to get the NVIDIA Tegra debugger to attach, but we get SIGILLs any time there is input (controller buttons, touchpad, etc.). I think I saw the same thing when I tried using a command-line GDB client. Have you encountered this, and if so, has your team found a fix?