Can someone explain what the ORBX Media Player is?

jodyrbrennan
Explorer
Are these still images all there is to it, or am I missing something? How do I use it, and what is it for?
20 REPLIES

cybereality
Grand Champion
You may only see a few photos if you removed the headset while it was downloading. To fix this, you need to delete the ORBX folder from the phone. Then go into the Android settings menu, force stop and clear data on ORBX, then uninstall. After that, you can reinstall the app through the Oculus Store (but be sure to keep the Gear VR on your face as the pictures download).
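(If you would rather script those steps over USB debugging, here is a minimal Python sketch that drives adb. The package name is a placeholder, not confirmed; check yours with "adb shell pm list packages" first.)

import subprocess

# Placeholder package name for the ORBX app -- verify it on your own device first.
PACKAGE = "com.otoy.orbx"

def adb(*args):
    # Run a single adb command and show its output.
    result = subprocess.run(["adb", *args], capture_output=True, text=True)
    print("adb", " ".join(args), "->", (result.stdout or result.stderr).strip())

# Mirror the manual steps above: force-stop the app, clear its data, then uninstall it.
adb("shell", "am", "force-stop", PACKAGE)
adb("shell", "pm", "clear", PACKAGE)
adb("uninstall", PACKAGE)
# Reinstall from the Oculus Store afterwards and keep the Gear VR on while the pictures download.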
AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i | Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV

Ser3nity
Honored Guest
Had the same issue, too. They really should make the download continue in the background.

No, you cannot travel in those images yet. The light field approach from Otoy is groundbreaking, but I'm afraid the computational power required to create those images, and the storage and bandwidth to transmit them, are ginormous...
I don't think it will be a real thing in the near future, maybe in a few years. But their Octane renderer is already really awesome!

Malkmus1979
Explorer
"Ser3nity" wrote:
Had the same issue, too. They really should make the download continue in the background.

No, you cannot travel in those images yet. The light field approach from Otoy is groundbreaking, but I'm afraid the computational power required to create those images, and the storage and bandwidth to transmit them, are ginormous...
I don't think it will be a real thing in the near future, maybe in a few years. But their Octane renderer is already really awesome!


Ser3nity, not as big as you would think. From OTOY:

File size of the compressed light field data is 36 MB (original source data was ~40 GB). It is visually lossless and retains HDR lighting from the source (see notes below for details). The capture diameter is 75 cm.

Multiple captures at < 1 meter apart can seamlessly expand the navigation volume to cover walkable spaces of any size. There are no hard limits on total size other than what we can load in memory or from disk/network.

Camera, codec and viewer support HDR, but to keep things simple for this first test, the source captures were converted to LDR and color corrected by Paul before encoding. We will be using raw HDR source data normally.

The demo uses the same light field viewer that plays back synthetic light fields created with OctaneRender/OctaneVR, which we have been showing since August.

The LF capture is automated. The raw data from the capture can be encoded and played back in the viewer, just as you see in this demo, with no human intervention. The goal with this setup is for it to be not only affordable, but super simple for anyone to use.

We tested depth extraction software from our Light Stage tools on this capture. Mesh reconstruction and depth compositing look promising.

Manual refocusing to different depths in the LF works in the current viewer. You can bring objects in and out of focus based on their distance from the camera.

Light field streaming support is being added to the ORBX media player app (PC). We have had success decoding and playing LF datasets on Project Tango and other devices with complete OpenGL ES 3.1 support; WebVR/WebGL2 in HTML5 is another option down the line.

~30 billion rays were sampled in the capture. We tested different capture modes, from one that takes a few minutes to one that takes hours. The idea is you can have casual captures in a few minutes, and archive-quality captures in a few hours (in the latter case, sun movement is an issue).

We are working on video light field camera rigs for professional VR sports broadcasts, like what we did in February with the NHL. The cost of the LF video camera and hardware needed to approach the fidelity we have with static and synthetic light fields is high. We are thinking of clever ways to make LF video capture cheap and affordable on consumer hardware.
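To put those figures in perspective, here is a quick back-of-the-envelope calculation in Python. Only the 36 MB per capture, the ~40 GB of source data, and the "< 1 meter apart" spacing come from the notes above; the grid spacing and room size are my own assumptions.

capture_mb = 36        # compressed size of one static light field capture (quoted above)
source_gb = 40         # approximate raw source data per capture (quoted above)
spacing_m = 0.75       # assumed capture spacing, within the "< 1 meter apart" guideline

ratio = source_gb * 1024 / capture_mb
print(f"compression ratio: roughly {ratio:.0f}:1")          # about 1138:1

# Rough data budget for tiling a hypothetical 5 m x 4 m walkable room on a square grid.
room_w_m, room_d_m = 5.0, 4.0
captures = (int(room_w_m / spacing_m) + 1) * (int(room_d_m / spacing_m) + 1)
print(f"about {captures} captures, roughly {captures * capture_mb} MB compressed")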

Yeticrab
Explorer
From what I can tell it's just a stereo cube map viewer. Like a 360 photo, but in 3D.

Probably the best-looking thing on the Gear VR right now.

They have a free browser-based renderer on their website so you can create your own stereo cubemaps.

The "synthetic light field" talk is misleading IMO; it's just a PNG image, there is zero LF info in the image.

Malkmus1979
Explorer
"Yeticrab" wrote:
From what I can tell it's just a stereo cube map viewer. Like a 360 photo, but in 3D.

Probably the best-looking thing on the Gear VR right now.

They have a free browser-based renderer on their website so you can create your own stereo cubemaps.

The "synthetic light field" talk is misleading IMO; it's just a PNG image, there is zero LF info in the image.


Yeticrab, I believe they are referring not to the photos viewable in the ORBX viewer at the current stage, but rather to their light field demos that actually allow positional tracking within a photograph, and eventually video. Their goal in a nutshell is to be able to give you front row tickets to sporting events (for example) where the positional tracking allows you to move and get better angles of the action. They have just within the last couple weeks achieved positional tracking within a photograph, laying the groundwork for this.

Here is a video of their light field technology allowing PT in real time:

https://www.youtube.com/watch?v=pyJUg-ja0cg

Yeticrab
Explorer
"Malkmus1979" wrote:
"Yeticrab" wrote:
From what I can tell it's just a stereo cube map viewer. Like a 360 photo, but in 3D.

Probably the best-looking thing on the Gear VR right now.

They have a free browser-based renderer on their website so you can create your own stereo cubemaps.

The "synthetic light field" talk is misleading IMO; it's just a PNG image, there is zero LF info in the image.


Yeticrab, I believe they are referring not to the photos viewable in the ORBX viewer at the current stage, but rather to their light field demos that actually allow positional tracking within a photograph, and eventually video. Their goal in a nutshell is to be able to give you front row tickets to sporting events (for example) where the positional tracking allows you to move and get better angles of the action. They have just within the last couple weeks achieved positional tracking within a photograph, laying the groundwork for this.

Here is a video of their light field technology allowing PT in real time:

https://www.youtube.com/watch?v=pyJUg-ja0cg


I've seen that, but that's not what is currently on the Gear VR, which is what this forum is for.

All the technologies this company is working on are exciting, but all we have so far is a high-resolution stereo cube map viewer.

Malkmus1979
Explorer
"Yeticrab" wrote:


I've seen that, but that's not what is currently on the Gear VR, which is what this forum is for.


People here are curious about what light field tech is, and there's no reason not to talk about it here. OTOY have said that their light field videos are coming to the Gear VR, including the NHL game they recently demonstrated in public. So, I'm sorry, but I disagree; I definitely think this is an appropriate place to discuss it. We may not have positional tracking yet, but eventually we will.

http://venturebeat.com/2015/02/26/nhl-s ... l-reality/

standard3d
Honored Guest
It is a web-based 360/360 panorama viewer for video and stills created by OctaneVR.
Otoy.com is working to get a stereoscopic Gear VR version out.
The only Android-based 360 stereo VR viewer that works with OctaneVR is

http://standard3d.com/Holodeck/

gabrielefx
Honored Guest
If you want to understand what light field technology is, have a look at the Lytro website.
The difference between a stereoscopic photo or render and a light-field-embedded photo is the dynamic parallax.
Lytro photos are limited; with 3D renders I think it's possible to extend this parallax effect.
Imagine splitting the space into a multitude of sections: if you slightly move these photos or renders, you will discover what is behind every object.
This is a trick often used in movies when actors are shot on a green background. The virtual backgrounds are painted in Photoshop on several layers, and all these 2D layers are animated in compositing programs.
Goldorack can explain better what Octane VR will do.
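The layered-parallax idea described above can be illustrated with a toy calculation: the closer a layer is, the more it shifts when the viewer moves, which is what reveals what sits behind each object. The numbers below are made up for illustration; this is a sketch of the principle, not how Octane VR or any light field viewer is actually implemented.

head_offset_m = 0.05                       # assume the viewer moves 5 cm to the side
layer_depths_m = [1.0, 3.0, 10.0, 100.0]   # hypothetical distances of the 2D layers

for depth in layer_depths_m:
    # Small-angle approximation: angular shift is the head offset divided by the distance.
    shift_mrad = head_offset_m / depth * 1000
    print(f"layer at {depth:6.1f} m shifts by about {shift_mrad:.2f} mrad")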

Anonymous
Not applicable
Might just be me, but I cannot find the ORBX Media Player for my Gear VR / Samsung S7 anywhere. How do you load it?