
HOWTO: Blender prerendered equirectangular stereoscopic

DePingus Posts: 12
I think I figured out how to render equirectangular (360x180 degree) stereoscopic (3D) image sequences with Blender!!! Well... technically other people figured it out and I just put the pieces together.

Here is the full sized rendered image to try for yourself in Whirligig.

Blender 2.72b http://www.blender.org/
noeol's Stereoscopic Rendering Blender script 1.6.9 http://www.noeol.de/s3d/
VirtualDub 1.10.4 http://www.virtualdub.org/
Whirligig 1.47 http://www.centzon.co.uk/whirligig/

How to:
There are basically two steps involved. First you set up an equirectangular lens on Blender's camera. Then you convert that camera into a stereoscopic rig using noeol's script.

Equirectangular Setup:
To set up an equirectangular camera lens you must first set your render engine to Cycles Render. Then select your camera, and in the Lens section of the camera settings choose Panoramic with an Equirectangular lens type. You won't see the Lens Type setting if you don't change the render engine to Cycles. That's it, try rendering a frame. Use something like FSPViewer http://www.fsoft.it/panorama/FSPViewer.htm to view the 2D panoramic image.
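For anyone who prefers scripting those clicks, here is a minimal sketch using Blender's Python API (property names are from the 2.7x Cycles API; run it from Blender's text editor, and adjust the resolution to taste):

```python
import bpy  # only available inside Blender

scene = bpy.context.scene

# The Panoramic lens type only appears when Cycles is the render engine
scene.render.engine = 'CYCLES'

cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# A full 360x180 degree pano needs a 2:1 output aspect ratio
scene.render.resolution_x = 4096
scene.render.resolution_y = 2048
scene.render.resolution_percentage = 100
```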

Stereoscopic Setup:
After setting up your 360 camera, you need to convert the same camera into a 3D stereoscopic rig. noeol's script is freakin' awesome for this. He has a great YouTube tutorial on his website. WATCH THE VIDEO. Seriously. The whole thing. You're going to be using the Side by Side stereoscopic preset, and there are special instructions just for that one.

After you set it up according to the video and hit render, Blender's image viewer won't be able to display the whole image, but it will output the full image file to the directory you selected previously (during the video tutorial that you totally watched). If you render out an animation, you will get a sequence of images in the chosen directory. You can use VirtualDub to import that image sequence and export an AVI. Just drag the first image of the sequence into VirtualDub's main window, change the framerate to whatever you want in the Video section, and save to AVI. Your video is now ready for viewing in Whirligig or LiveViewRift.

Final Thoughts:
I just pieced all this together last night and it's very much a work in progress. But I do think I'm on the right track here. If anyone has anything to contribute, please do! There's a lot of discussion on this topic in another thread here, but it's more focused on Maya/Max/MentalRay. I hope this helps someone else.

It appears that someone has solved the problems and streamlined the process. Hopefully it will be officially supported in Blender soon. In the meantime, check out the website below for an explanation and patch.


  • j1vvy Posts: 143
    Brain Burst
    I am glad this is all coming together. I hope to see some animation examples in the future.
  • noemis Posts: 33
    Brain Burst
    Hey "DePingus",
    thank you for starting this topic... I'm also a Blender user, owner of an Oculus DK2, and testing 360° stuff (for example 360° videos from the Kodak SP360 with the Oculus)... you named very useful scripts and tools to start with. But as you mentioned, there's a lot of work to do...

    I downloaded your image and tested it in Whirligig... yes, it's stereoscopic, but not equirectangular, as the topic title says.

    We can render with Blender equirectangular images (with cycles)... check.
    We can render stereoscopic images with the help of "noeol's Stereoscopic"... check.

    But render stereoscopic equirectangular images? TO DO! Right?

    I'm still happy about this thread and hope to contribute in the future...
  • noemis Posts: 33
    Brain Burst
    After some testing I managed to create a quite good result:
    an equirectangular stereoscopic / side by side image

    It works great in Whirligig. Download the image from the link above, maybe rename the long filename, copy it into the media folder inside the Whirligig folder, start Whirligig, and choose the format "Barrel Stereo SBS".

    here's an anaglyph version:

    It seems that you can't see the description on imgur without being signed in, so here's the direct link:

    There's a link to an online viewer in the description.

    Creating good-looking results with the anaglyph method isn't easy, because you have to think carefully about colors. But it's a first try.
  • Harolddd Posts: 24
    Brain Burst
    Thanks DePingus and noemis!

    This is great. My main use of Blender is making stereoscopic images, so now being able to make stereoscopic full environment images for Oculus Rift is exciting. :D
  • Harolddd Posts: 24
    Brain Burst
    Here is a quick image I rendered with Blender. The background is a spherical mono image taken at Burning Man (not my photo), the silver statue is a model of myself from a 3D scan, and the Burning Man is a model I made using Wings 3D and Blender.

    I notice that the images need to be much higher resolution than 3840x1080.

  • Harolddd Posts: 24
    Brain Burst
    Nice image. :D Thanks. Although it seems a bit weak on stereo effect. What was the separation of the virtual cameras?
  • Thanks.
    Camera separation was 2 cm. I had to keep it low because of the objects very close to the camera. If you increase the separation it causes eye strain.
  • j1vvy Posts: 143
    Brain Burst
    noemis's example with the hardwood floor shows that the camera separation should be 0 when rendering the zenith and nadir. That last 1° should be near zero. See http://www.andrewhazelden.com/blog/2012/04/domemaster3d-stereoscopic-shader-for-autodesk-maya/

    @noemis this looks great. Except that 1° at the floor.

    @Nurul3d the stereo does not look quite right. When I look at it in LiveViewRift and switch between left and right, I can see the image or objects shift, but not in the right way. And the shift does not seem like enough. I suggest moving the camera rather than changing the separation.

    @Harolddd your example is too small; only 1024x288
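A minimal sketch of the separation falloff j1vvy describes, assuming a simple cosine ramp toward the poles (the Domemaster3D shader linked above uses its own falloff curves; the function name and the 70° ramp start here are just illustrative):

```python
import math

def eye_separation(base_sep, elevation_deg, falloff_start_deg=70.0):
    """Scale the stereo camera separation down to zero toward the poles.

    base_sep: interaxial distance used near the horizon (e.g. 0.065 m)
    elevation_deg: latitude of the ray being rendered (0 = horizon,
        +/-90 = zenith/nadir)
    falloff_start_deg: elevation where the ramp toward zero begins
    """
    e = abs(elevation_deg)
    if e <= falloff_start_deg:
        return base_sep
    # Cosine ramp from full separation down to 0 at 90 degrees
    t = (e - falloff_start_deg) / (90.0 - falloff_start_deg)
    return base_sep * math.cos(t * math.pi / 2)

# Full separation at the horizon, mono (zero separation) at the poles,
# which avoids the broken stereo on the floor and ceiling.
```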
  • @j1vvy, Thanks for your valuable feedback. I am digging into it.
  • mediavr Posts: 234
    It is possible to render accurate depth at the nadir and zenith, but most 360 3D renderers don't do it. Micoy does, and it is used e.g. for full-dome 3D projections where the audience can sit facing any direction but still see good 3D at the dome zenith, e.g. in the Marvel Experience tour.



  • Harolddd Posts: 24
    Brain Burst
    Yes, very true. It was much too low resolution. So I rendered again at higher resolution.

  • Another test of SBS stereo panorama


    Please note:
    - camera shift 4"
    - No camera rotation
    - objects within 2 meters create slight eye strain (better to avoid putting objects very close to the camera)

    Let me know what you think.
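To put a rough number on that eye-strain observation: the angular disparity a stereo pair produces grows quickly as objects approach the camera. A quick back-of-the-envelope check (the function name is just for illustration):

```python
import math

INCH = 0.0254  # meters

def screen_disparity_deg(separation_m, distance_m):
    """Angular disparity (in degrees) of a point seen by two cameras
    separated by separation_m, at distance_m straight ahead."""
    return math.degrees(2 * math.atan(separation_m / (2 * distance_m)))

sep = 4 * INCH  # the 4" camera shift used above

close = screen_disparity_deg(sep, 2.0)   # ~2.9 degrees at 2 m
far = screen_disparity_deg(sep, 10.0)    # ~0.6 degrees at 10 m
```

Disparity of a degree or two is roughly where viewing starts to feel uncomfortable, which would explain why objects inside 2 meters strain the eyes at this separation.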
  • Non-panoramic SBS image test


    Please note:
    - camera shift 6"
    - camera rotation (inward) to match the virtual camera target (this is not good for 360-degree stereo panoramas)

  • j1vvy Posts: 143
    Brain Burst
    @Harolddd much better. The two models look like they are floating in the air and scaled smaller than life size. But seeing them makes me feel at first like I am floating really high, until I look away at everything else and can tell my height is about right for standing.

    @Nurul3D some more colors would make this easier to describe. I viewed it in LiveViewRift and there is a one-pixel discrepancy where the edges join, in both the left and right views. It is visible on the blue tea pot. The stereo of the yellow tea pot on the ceiling to my left when looking forward and up looks bad. The same tea pot looking back and up looks ok.

    Looking at the cropped view everything looked sharper and had more depth. Some of that would be because of the number of pixels per degree. With so many different size tea pots I did not have a sense of size.
  • j1vvy wrote:
    I viewed it in LiveViewRift and there is a one-pixel discrepancy where the edges join, in both the left and right views. It is visible on the blue tea pot.
    - Can you please elaborate on this part? I didn't understand the "pixel discrepancy" part at all; this term is absolutely new to me.
    j1vvy wrote:
    Looking at the cropped view everything looked sharper and had more depth. Some of that would be because of the number of pixels per degree.

    - Can you please explain the pixels-per-degree term? This term is new to me also.

    Thanks again "j1vvy" for your valuable feedback
  • j1vvy Posts: 143
    Brain Burst
    There is a problem one pixel wide producing a line going through the blue tea cup in the back.

    The full 360° pano is 1920 pixels wide per eye, giving 5.3 pixels per degree.
    The cropped view is maybe 90° wide and still 1920 pixels, giving 21.3 pixels per degree.

    If the pano were 4 times wider, then the two would be equal.

    I think you need about 12 pixels per degree to get the native resolution of the DK2. Anything smaller is being upsampled when displayed.
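j1vvy's arithmetic can be checked in a couple of lines (the function name is just for illustration):

```python
def pixels_per_degree(width_px, fov_deg):
    """Horizontal pixel density of an image spanning fov_deg degrees."""
    return width_px / fov_deg

full_pano = pixels_per_degree(1920, 360)  # ~5.3 px/deg, as quoted above
cropped = pixels_per_degree(1920, 90)     # ~21.3 px/deg for a 90-degree crop

# Hitting ~12 px/deg over a full 360-degree pano would take
# about 12 * 360 = 4320 pixels of width per eye.
required_width = 12 * 360
```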
  • Thanks "j1vvy", this info is really valuable. I didn't know any of this. I am going to apply it to my future test panos.
  • Here is a simple SBS rectangular animation test (not pano)


    Please note, for depth exaggeration I used an extremely wide-angle virtual camera lens.
  • Here is an animation sample

    Merry Christmas
  • hi everybody!

    I'm also interested in the subject. I'm exploring the creation of stereo panoramic content both with live action and CG.

    @Nurul3d: I looked at your pics and at first glance they look good, but I'm not sure they're correct.
    For example, here is a quick test I made with the method you described:


    (I marked Front, Right, Left and Behind for better discussion)

    The main camera is centered on the torus, which therefore should exhibit constant parallax. But it clearly doesn't.
    We can clearly see that horizontal parallax is maximum in front and behind the camera, and then goes to zero at left and right. This is what I expected, since the R and L cameras are shifted along the R-L direction, and thus can't show any parallax for objects along the R-L direction.

    Do you have this problem too? Or how did you solve it?
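ErikCaretta's observation matches simple geometry: for a fixed (non-rotating) stereo pair offset along the left-right axis, the effective baseline an object sees shrinks with the cosine of its azimuth. A quick sketch, with illustrative names and an assumed 6.5 cm separation:

```python
import math

def horizontal_parallax_deg(separation, distance, azimuth_deg):
    """Approximate angular parallax for a fixed stereo pair.

    azimuth_deg is measured from straight ahead; at 90 degrees
    (directly left or right) the baseline is aligned with the view
    direction and the horizontal parallax collapses to zero."""
    effective_baseline = separation * abs(math.cos(math.radians(azimuth_deg)))
    return math.degrees(2 * math.atan(effective_baseline / (2 * distance)))

front = horizontal_parallax_deg(0.065, 2.0, 0)   # maximum, front and behind
side = horizontal_parallax_deg(0.065, 2.0, 90)   # ~0: no depth to the sides
```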
  • Nurul3d Posts: 96
    @ ErikCaretta

    Yes, you are right. This is due to software distortion. When I use 2:1 equirectangular renderings they are good, with no parallax problem. But when I render a panorama, this problem occurs. Not sure how to overcome this problem in renderings.

    Can anyone post a perfect stereo panorama example (CG render)?

    Thanks in advance.
  • Hello all, just stumbled upon this forum yesterday, and have had a chance to have a go with an Oculus, which was very exciting. Anyway, if anyone is interested, here's my attempt! It's a mock-up of a gameshow set.

  • Just been playing around with this and it is really interesting.

    I have noticed with the scenes that I have created, and the other scenes on this thread, that the stereo effect works really well in one direction, but when you look in another direction it is more difficult to get the two eyes to match and you get slight eye strain. Has anyone else noticed this and/or found a solution?


  • MHWorkshop wrote:
    Just been playing around with this and it is really interesting.

    I have noticed with the scenes that I have created, and the other scenes on this thread, that the stereo effect works really well in one direction, but when you look in another direction it is more difficult to get the two eyes to match and you get slight eye strain. Has anyone else noticed this and/or found a solution?



    I just found this thread, and was thinking that would be an issue with the setup of "two stereo cameras rendering once for a whole scene". If the cameras render as if they're always looking "straight ahead", then the parts of the scene that are "left" and "right" of "straight ahead" will be rendered without the cameras "turning to look at them", meaning the left/right separation between the two cameras becomes zero (from the scene's perspective, they're now front/back of each other rather than left/right).

    The definition of Blender's "panorama" camera mode says that it "renders the scene as if the camera rotated in place", taking strips of the scene and stitching them together. For a stereo camera pair, the cameras would need to rotate around the centerpoint between them, not their own centers, otherwise you'd get the same problem with depth when looking to the left and right. Maybe if someone dug into the Python internals of the "panorama" camera mode, we could create a rig/script that rotates the cameras around an off-axis point (the pair's centerpoint) while doing the panorama sweep-and-stitch?

    Maybe there could be another type of script that creates "left eye" and "right eye" cameras set up in the six directions needed to make a cube map, and stitches them all together?
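The rotate-around-the-midpoint idea can be sketched as plain geometry: for each vertical strip of the sweep, place the two cameras so their baseline stays perpendicular to that strip's view direction. A minimal sketch (names are illustrative; midpoint at the origin, top-down 2D coordinates):

```python
import math

def stereo_pair_positions(azimuth_deg, separation):
    """Left/right camera positions for one vertical strip of a pano sweep.

    The pair orbits its shared midpoint (the origin) instead of each
    camera spinning in place, so the baseline stays perpendicular to
    the view direction for every strip and side depth is preserved.
    Returns ((lx, ly), (rx, ry))."""
    a = math.radians(azimuth_deg)
    dx, dy = math.sin(a), math.cos(a)      # view direction for this strip
    px, py = dy, -dx                       # perpendicular: baseline direction
    h = separation / 2
    return (-h * px, -h * py), (h * px, h * py)
```

Rendering each strip from these positions, with both cameras aimed along the strip's view direction, keeps the full baseline at every azimuth instead of letting it collapse at 90° left and right.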