Displaying an image pair on the rift

Hello,

I have a stereo image pair generated by a simulated environment (Blender, on Linux) where two cameras are placed side by side, separated by the same distance as a pair of eyes. The two views are combined into a single side-by-side image.

This image is sent via a socket to a program running on Windows, where the Rift is plugged in. I would like to display this image on the Rift.

Since the changes that prevent the Rift from being used as an external monitor, it is not clear how this can be done. Is there a sample that converts this type of image to an OpenGL texture and displays it on the Rift through the SDK, with proper handling of the distortion and so on?

Regards,

Jeremy

Comments

  • WreckLuse68 Posts: 250
    Nexus 6
    edited October 2017
    Could something like Virtual Desktop or VorpX perhaps display the stream? Virtual Space on Steam will stream 360 YT videos very decently, and they all have different 3D viewing settings for SBS/OU etc., so maybe you could get the stream onto the Windows desktop first... but I don't have a clue how to get a stream into them. Or maybe Unity can do it. (I apologise if I am totally misunderstanding what you are doing.)
    Regards
    When Einstein was asked how it felt to be the smartest man on Earth, he replied, “I wouldn’t know. Ask Nikola Tesla”.
  • kojack Posts: 5,249 Volunteer Moderator
    One important consideration is that the stereo images should be rendered with the correct camera frustum settings. The Rift expects each eye's image to be asymmetric: the centre of the eye isn't in the centre of each image half. Here are the values the SDK gives me for FOV (in degrees from centre):
    up: 41.653
    down: 48.008
    left: 43.997
    right: 35.575
    That's for the left eye; the right eye is the same but with the left and right angles swapped. So each eye has more downward FOV than upward, and more to the side away from the nose.
    In a 3D engine that's pretty easy (just use the projection matrix Oculus provides). In Blender this would be done with the camera's shift property, but I don't know offhand how to convert those angles into Blender-friendly shift values. (OK, it's probably simple, but it's 3am and I don't want to think about trig.) One possible conversion is sketched below.
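
    Here is a rough sketch of that conversion in C++, using the left-eye angles quoted above. The Blender side is an assumption, not something confirmed in this thread: shift_x/shift_y are treated as fractions of the horizontal frame size, and the signs may need flipping per eye, so treat the output as a starting point rather than verified values.

    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double pi = 3.14159265358979323846;
        const double deg2rad = pi / 180.0;

        // Left-eye FOV from the SDK, in degrees from the view centre.
        double up = 41.653, down = 48.008, left = 43.997, right = 35.575;

        double tUp    = std::tan(up * deg2rad);
        double tDown  = std::tan(down * deg2rad);
        double tLeft  = std::tan(left * deg2rad);
        double tRight = std::tan(right * deg2rad);

        // At unit distance the image plane spans [-tLeft, +tRight]
        // horizontally and [-tDown, +tUp] vertically.
        double width  = tLeft + tRight;
        double height = tDown + tUp;

        // Symmetric FOV that covers the same span.
        double fovX = 2.0 * std::atan(0.5 * width) / deg2rad;
        double fovY = 2.0 * std::atan(0.5 * height) / deg2rad;

        // Offset of the optical centre from the middle of the span,
        // as a fraction of the horizontal frame size (Blender-style shift).
        double shiftX = (tRight - tLeft) / (2.0 * width);
        double shiftY = (tUp - tDown) / (2.0 * width);

        std::printf("fov x/y: %.2f / %.2f deg, shift x/y: %.4f / %.4f\n",
                    fovX, fovY, shiftX, shiftY);
        // For the right eye, swap the left/right angles (shiftX flips sign).
        return 0;
    }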

    If you don't do that, the distortion process won't be correct. With a static image like this you most likely won't even notice; it's mainly when you move your head around in a scene that flaws in the distortion show up.

    It shouldn't be hard to get it working. The SDK will give you a render texture; you just need to copy the image into it and call the submit method. You can ignore all of the tracking and stuff.
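
    For what it's worth, a rough sketch of that loop against the LibOVR 1.x C API (OpenGL swap-chain path) could look like the following. It assumes an OpenGL context is already current (e.g. from a hidden GLFW/SDL window) and that `pixels` holds the side-by-side RGBA image received from the socket; the function name and the missing error/socket handling are placeholders, so check it against the SDK samples before trusting it.

    #include <windows.h>
    #include <GL/gl.h>
    #include <OVR_CAPI.h>
    #include <OVR_CAPI_GL.h>

    void showSideBySide(const unsigned char* pixels, int w, int h)
    {
        ovr_Initialize(nullptr);
        ovrSession session;
        ovrGraphicsLuid luid;
        ovr_Create(&session, &luid);
        ovrHmdDesc hmd = ovr_GetHmdDesc(session);

        // One swap chain holding the whole side-by-side image.
        ovrTextureSwapChainDesc desc = {};
        desc.Type        = ovrTexture_2D;
        desc.ArraySize   = 1;
        desc.Format      = OVR_FORMAT_R8G8B8A8_UNORM_SRGB;
        desc.Width       = w;
        desc.Height      = h;
        desc.MipLevels   = 1;
        desc.SampleCount = 1;
        ovrTextureSwapChain chain;
        ovr_CreateTextureSwapChainGL(session, &desc, &chain);

        // Both eyes sample the same chain; the viewports split it in half.
        ovrLayerEyeFov layer = {};
        layer.Header.Type  = ovrLayerType_EyeFov;
        layer.Header.Flags = ovrLayerFlag_TextureOriginAtBottomLeft;
        ovrPosef identity = {};
        identity.Orientation.w = 1.0f; // fixed pose, tracking ignored
        for (int eye = 0; eye < 2; ++eye)
        {
            layer.ColorTexture[eye]  = chain;
            layer.Fov[eye]           = hmd.DefaultEyeFov[eye];
            layer.Viewport[eye].Pos  = { eye * w / 2, 0 };
            layer.Viewport[eye].Size = { w / 2, h };
            layer.RenderPose[eye]    = identity;
        }

        long long frame = 0;
        for (;;) // in practice: loop while new images arrive on the socket
        {
            int index = 0;
            unsigned int texId = 0;
            ovr_GetTextureSwapChainCurrentIndex(session, chain, &index);
            ovr_GetTextureSwapChainBufferGL(session, chain, index, &texId);
            glBindTexture(GL_TEXTURE_2D, texId);
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                            GL_RGBA, GL_UNSIGNED_BYTE, pixels);
            ovr_CommitTextureSwapChain(session, chain);

            ovrLayerHeader* layers = &layer.Header;
            ovr_SubmitFrame(session, frame++, nullptr, &layers, 1);
        }
    }

    The Fov fields tell the compositor what frustum the image was rendered with, which is why matching Blender's camera to those values matters for the distortion to come out right.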

    (I'd try knocking out a quick demo, but I'm busy until at least Friday.)

  • Jeremy-FORSSEA Posts: 2
    NerveGear
    Hi Kojack,

    Thanks for the information. As a first approximation, I would be happy just to render this side-by-side image pair to a texture and draw it on the Rift. Would you by any chance have a code snippet that does this?

    Regards