05-14-2015 07:23 AM
"Ser3nity" wrote:
Had the same issue, too. They really should make the download continue in the background.
No, you cannot travel in those images, yet. The light field approach from Otoy is groundbreaking, but I'm afraid the computational power required to create those images, and the storage and bandwidth needed to transmit them, are ginormous...
I don't think it will be a real thing in the near future, maybe in a few years. But their Octane renderer is really awesome, already!
File size of the compressed light field data is 36 MB (original source data was ~40 GB). It is visually lossless and retains HDR lighting from the source (see notes below for details). The capture diameter is 75 cm.
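For scale, the quoted numbers work out to roughly a 1000:1 compression ratio. A back-of-the-envelope check (treating the "~40 GB" figure as approximate):

```python
# Back-of-the-envelope: compression ratio implied by the quoted figures.
raw_bytes = 40 * 1024**3         # "~40 GB" of raw source data (approximate)
compressed_bytes = 36 * 1024**2  # 36 MB encoded light field
ratio = raw_bytes / compressed_bytes
print(f"compression ratio ~ {ratio:.0f}:1")  # ~ 1138:1
```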
Multiple captures at < 1 meter apart can seamlessly expand the navigation volume to cover walkable spaces of any size. No hard limits on total size other than what we can load in memory or from disk/network.
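A hypothetical sketch of how that tiling could work: lay capture centers out on a grid spaced under a meter apart so the 75 cm capture volumes cover a walkable area. The spacing and room dimensions here are made-up numbers, not Otoy's:

```python
# Hypothetical layout (assumed numbers): grid of capture centers,
# spaced < 1 m apart, covering a walkable room.
SPACING = 0.9              # meters between capture centers (assumed, < 1 m)
ROOM_W, ROOM_D = 4.0, 3.0  # room footprint in meters (made up)

centers = [
    (x * SPACING, y * SPACING)
    for x in range(int(ROOM_W / SPACING) + 1)
    for y in range(int(ROOM_D / SPACING) + 1)
]
print(f"{len(centers)} captures cover a {ROOM_W} x {ROOM_D} m area")
```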
Camera, codec and viewer support HDR, but to keep things simple for this first test, the source captures were converted to LDR and color corrected by Paul before encoding. We will be using raw HDR source data normally.
The demo uses the same light field viewer that plays back synthetic light fields created with OctaneRender/OctaneVR, which we have been showing since August.
The LF capture is automated. The raw data from the capture can be encoded and played back in the viewer, just as you see in this demo, with no human intervention. The goal with this setup is for it to be not only affordable but also super simple for anyone to use.
We tested depth extraction software from our Light Stage tools on this capture. Mesh reconstruction and depth compositing look promising.
Manual refocusing to different depths in the LF works in the current viewer. You can bring objects in and out of focus based on their distance from the camera.
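For anyone curious how depth-based refocusing of a light field works in principle, the textbook approach is shift-and-add synthetic-aperture refocusing. This is a minimal illustrative sketch of that classic technique, not Otoy's viewer code:

```python
import numpy as np

def refocus(lf, slope):
    """Shift-and-add synthetic-aperture refocus (classic technique,
    not Otoy's implementation). lf is a 4D array (U, V, H, W) of
    sub-aperture views. Each view is shifted in proportion to its
    offset from the central view, then all views are averaged:
    scene points whose per-view disparity matches `slope` align
    and come into focus, everything else blurs out."""
    U, V, H, W = lf.shape
    acc = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            du = int(round((u - U // 2) * slope))
            dv = int(round((v - V // 2) * slope))
            acc += np.roll(np.roll(lf[u, v], du, axis=0), dv, axis=1)
    return acc / (U * V)
```

With `slope = 0` the views are averaged unshifted (focus at infinity for a planar rig); sweeping `slope` moves the focal plane through the scene, which is exactly the "bring objects in and out of focus by distance" behavior described above.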
Light field streaming support is being added to the ORBX media player app (PC). We have had success decoding and playing LF datasets on Project Tango and other devices with complete OpenGL ES 3.1 support; WebVR/WebGL2 in HTML5 is another option down the line.
~30 billion rays were sampled in the capture. We tested different capture modes: from one that takes a few minutes to one that takes hours. The idea is you can have casual captures in a few minutes and archive-quality captures in a few hours (in the latter case, sun movement is an issue).
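As a rough throughput comparison between the two modes, assuming the same ~30 billion ray budget for both and made-up capture times ("a few minutes" and "hours" are all the post gives us):

```python
# Rough arithmetic only; the 5-minute and 3-hour figures are assumptions.
TOTAL_RAYS = 30e9        # "~30 billion rays" from the post
casual_secs = 5 * 60     # assumed "a few minutes" casual capture
archive_secs = 3 * 3600  # assumed multi-hour archive-quality capture

casual_rate = TOTAL_RAYS / casual_secs
archive_rate = TOTAL_RAYS / archive_secs
print(f"casual:  {casual_rate:.1e} rays/s")   # 1.0e+08
print(f"archive: {archive_rate:.1e} rays/s")  # 2.8e+06
```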
We are working on video light field camera rigs for professional VR sports broadcasts like what we did in Feb. with the NHL. The cost of the LF video camera and hardware needed to approach the fidelity we have with static and synthetic light fields is high. We are thinking of clever ways to make LF video capture affordable on consumer hardware.
05-16-2015 09:02 AM
"Yeticrab" wrote:
From what I can tell it's just a stereo cube map viewer. Like 360 photo but in 3d.
Probably the best looking thing on the gear vr right now
They have a free browser based renderer on their website so you can create your own stereo cubemaps.
The "synthetic light field" talk is misleading imo, it's just a png image, there is zero LF info in the image.
05-16-2015 07:27 PM
"Malkmus1979" wrote:"Yeticrab" wrote:
From what I can tell it's just a stereo cube map viewer. Like 360 photo but in 3d.
Probably the best looking thing on the gear vr right now
They have a free browser based renderer on their website so you can create your own stereo cubemaps.
The "synthetic light field" talk is misleading imo, it's just a png image, there is zero LF info in the image.
Yeticrab, I believe they are referring not to the photos viewable in the ORBX viewer at the current stage, but rather to their light field demos that actually allow positional tracking within a photograph, and eventually video. Their goal in a nutshell is to be able to give you front row tickets to sporting events (for example) where the positional tracking allows you to move and get better angles of the action. They have just within the last couple weeks achieved positional tracking within a photograph, laying the groundwork for this.
Here is a video of their light field technology allowing PT in real time:
https://www.youtube.com/watch?v=pyJUg-ja0cg
05-16-2015 08:52 PM
"Yeticrab" wrote:
I've seen that, but that's not what is currently on the gear vr which is what this forum is for.