
Waiting is hard... (Ideas while we wait)

donkaradiablo
Explorer
I think I know what's going to be the next big thing.

Real 3D "movies" of real people that you can watch from all angles as if they were there with you, with positional tracking, captured with multiple scanners, played back thanks to Carmack code compressing the hell out of that 4D data to be accessed and streamed to the GPU as needed depending on where you look.

Not 3D 360 degree movies... not real time rendered fake stuff. Real people, like they are really there with you.

Not whole stadiums, just one person. Also movies as they scan actors and put them in CGI spaces anyway. And as we take our loved ones to be scanned so we get to keep their memories forever, maybe we'll get that feeling our grandparents must have felt when they took our parents in for their first photos.

No physics, no dynamic lighting, no simulations, no artificial intelligence, no destructible environments... just predetermined, recorded stuff streamed using the information of where you are looking and the prediction of where you will be looking, which Oculus is already doing.
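Just as a toy sketch of that "stream what you're about to look at" idea (pure speculation on my part — every name and number here is made up), prioritizing recorded tiles by predicted gaze could look something like this:

```python
# Hypothetical sketch: each tile of a recorded volumetric frame covers a
# viewing direction (yaw, in degrees); we prefetch the tiles closest to
# where the viewer is predicted to be looking, within a bandwidth budget.

def angular_distance(a, b):
    """Absolute difference between two yaw angles, wrapped to [0, 180]."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def predict_gaze(current_yaw, yaw_velocity, latency_s):
    """Dead-reckoning prediction: assume the head keeps turning at its current rate."""
    return (current_yaw + yaw_velocity * latency_s) % 360.0

def pick_tiles(tiles, predicted_yaw, budget_bytes):
    """Greedy selection: stream the tiles closest to the predicted view first."""
    ranked = sorted(tiles, key=lambda t: angular_distance(t["yaw"], predicted_yaw))
    chosen, used = [], 0
    for t in ranked:
        if used + t["size"] <= budget_bytes:
            chosen.append(t["yaw"])
            used += t["size"]
    return chosen
```

The real thing would obviously predict full head pose, not just yaw, but the principle — spend the bandwidth where the prediction says the eyes will be — is the same.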

Yes, it's a whole lot of data. But all the work and optimizations Carmack has done, and has told us he looked into over the past years, should translate so well to this. And he has supercomputers with special-purpose computing units at his fingertips, with lower-level access.

What we have seen so far is nothing. Knowing that is what makes waiting hard. It's like knowing the photograph is coming to town and everyone is excited, but you know they are working on this thing called "movies" and you believe it will be big.

Design with input solution, unifying mobile and PC product lines the input solution that could have been ideas Revolutionize the way we interact with... Change the world... Community...
47 REPLIES

donkaradiablo
Explorer
"In the weeks ahead, we’ll be revealing the details around hardware, software, input, and many of our unannounced made-for-VR games and experiences coming to the Rift. Next week, we’ll share more of the technical specifications here on the Oculus blog."



E3...
A new IP with Carmack magic
Set to define the next step in first person games

Set on a land that feels magical
Like being inside a Disney/Pixar universe
You are inside an animated universe... after all

It has Story Studio-made experiences scattered around
It feels like a dream
A lucid dream where you take control and you let go of it
And have experiences that almost feel spiritual

Or nostalgic... like going into an arcade game you played as a kid
And an adventure game... And an animated movie.

The trailer ends with a cutscene where a familiar voice says "feel the energy between your hands"
And there is an energy ball/lightning/fireball forming between your "virtual hands"

GSS
Explorer
CV1 was just announced. https://www.oculus.com/blog/first-look- ... g-q1-2016/

There may be an input solution too

donkaradiablo
Explorer
That's what this bit was referring to:
"donkaradiablo" wrote:
The trailer ends with a cutscene where a familiar voice says "feel the energy between your hands"
And there is an energy ball/lightning/fireball forming between your "virtual hands"


I'm guessing hand tracking will be Oculus VR's way of one-upping Valve, as it will enable the team to say: made-for-SteamVR games can work perfectly on the Oculus Rift, but made-for-Oculus-Rift content that uses our robust hand tracking is best experienced on the Rift.

donkaradiablo
Explorer
We will probably see something like this soon:



And if that happens to be the case, it's probably going to be available to order on the Oculus VR website as the input dev kit, and it's probably going to be demoed on stage at E3, with some in-house games that make use of it, and with gestures mapped to actions to control the interface, showcasing the next step in human-computer interaction.

Funny thing is, if this and the cam that is used for tracking, plus a wearable headband or shutter glasses with IR LEDs similar to those on the Rift, were bundled together (without a Rift), that would enable TV users to use their hands to control games and apps, and to have parallax on their TVs similar to this (but with more precision):




That would be a great way to make sure Oculus VR's input method gets support in non-VR games and apps too. Gamers who do not want to give up their TVs and who aren't interested in wearing a VR helmet would still get a great level of immersion out of this.

It would also enable VR game developers to port their titles to non-VR environments, as even a VR-only game design like EvadeVR would be possible to bring to the TV.

This is probably what Nintendo should have done to keep its Wii mojo and take it to the next level.

Would be funny if that was presented by Palmer, with a dodge-the-bullet type of game on TV where, at some point, Palmer lifts his hand, stops the bullets, and reflects them back.

What would make this presentation with in-house games and a hand tracking solution trend faster than the speed of light is a Star Wars game deal, of course. Force lightning, reflecting lasers, telekinesis, training where you try to dodge lightsaber attacks... In front of a TV or in VR. Both experiences powered by Oculus.

donkaradiablo
Explorer
Mobile is going to get so powerful so very soon.
1. With lower power consumption and a compact form factor, GPUs with HBM/HBM2 should find their way to mobile faster than previous-gen memory tech did.
2. APUs and SoCs with a high-speed interconnect between the CPU and that high-bandwidth memory, linked just the way the GPU and HBM are, must be a no-brainer.
3. Manufacturing tech will be there.

Just as an example, an AMD APU with Zen CPU cores, next-gen GPU cores and HBM, accessed with Mantle/Vulkan, should allow for a mobile VR device (with a fan) that is much more powerful than current gen consoles and even most current gaming PCs, probably sooner than we think.

A GearVR that ships in 2017 is likely to have very advanced graphics.
______________________________________________________________________

nvidia
http://wccftech.com/nvidia-ceo-talks-pascal-pk100-pk104-gpus-produced-finfets-hbm/

shifting from GDDR5 memory to HBM2
up to 32GB of video buffer
up to 1 TB/s of bandwidth
2X the performance per watt
2.7X memory capacity
3X the bandwidth of Maxwell
either the 14nm or 16nm node using FinFETs


AMD
http://www.forbes.com/sites/jasonevangelho/2015/05/06/confirmed-amd-to-launch-new-hbm-equipped-deskt...

High Bandwidth Memory (HBM)
Much higher memory bandwidth speed than GDDR5
Less power consumption
On-package memory reduces complexity of enthusiast-class graphics
Opportunities to extend across AMD product portfolio


Samsung
http://techreport.com/news/28244/report-samsung-spending-14-billion-on-new-semiconductor-fab
Plans to spend $14 billion on a new semiconductor fab.
The biggest investment in a single semiconductor production line
Initial production at the new facility is expected to begin in the first half of 2017
That's a timeline that could point to the production of 10-nm chips

______________________________________________________________________

What that means for VR is

1. We've had DK1 and 2 to develop games and experiences that are also compatible with mobile systems (2013-2014)
2. Next we'll have consumer GearVR, with OTOY and Carmack magic providing very impressive visuals in experiences and minigames. (2015-2016)
3. Next we'll have CV1 that we can plug into a computer with expensive hardware accessed with modern low-level APIs, allowing impressive visuals in endless universes. (2016-2017)
4. Next we'll have GearVR type mobile VR solutions, with SOCs having HBM, DX12 level GPUs and 8+ core CPUs in the same package. (2017-2018)
5. Next we'll have those at retinal resolution (2018-2019)

All the tech that will shape VR and gaming in general until 2020, is being carved today.

donkaradiablo
Explorer
I'd back this if it was a Kickstarter project:




It would be aimed at consumers looking for untethered VR at high quality, and at developers looking to test high-quality content they are making for the mobile VR hardware coming to market H2 2017 - H1 2018.

donkaradiablo
Explorer
"donkaradiablo" wrote:



This... With interactive 3D animated characters that are scan-based and use image-based HDR lighting...


This look-and-click adventure gameplay style seems cut out for OTOY light-field-capture-based environments, plus capture-based animations on LightScan-based models rendered with image-based HDR lighting.




Would need one 3D model of the room, invisible to the user, to occlude animated models when they are behind the couch, for example. Could include a simple voxel-based version of that model used for lighting the characters in real time. Animated models could even cast shadows on that 3D model using that light information: a shadow on the transparent 3D layer that would be overlaid on top of the light field layer. Soft shadows, using ray casting in real time, which is another tech that OTOY is working on. OTOY's capture tools could evolve to a point where every light field capture also constructs those 3D models of the captured space for the game engine to use, out of the images acquired from all angles.
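A toy sketch of that invisible-proxy occlusion idea (my own made-up illustration, nothing from OTOY): a depth map rendered from the hidden room model decides, per pixel, whether the animated character or the captured light field wins:

```python
# Hypothetical sketch: `proxy_depth` is a depth map rendered from the
# invisible 3D model of the captured room; a character pixel is only
# drawn on top of the light-field image where it is nearer than the
# captured geometry (e.g. not behind the couch).

def composite(background, proxy_depth, char_color, char_depth):
    """Overlay character pixels that lie in front of the room proxy."""
    out = []
    for bg, pd, cc, cd in zip(background, proxy_depth, char_color, char_depth):
        # cd is None where the character does not cover this pixel
        if cd is not None and cd < pd:
            out.append(cc)   # character is in front of captured geometry
        else:
            out.append(bg)   # occluded (or empty): keep the light field
    return out
```

In a real engine this is just a depth test against the proxy's depth buffer; the point is that the proxy itself is never shaded, only its depth (and, as above, its light info for shadows) is used.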

What would complete this and make it perfect is Virtual Desktop turned into a UE4/Unity asset, displayed on that TV. It could be your virtual office where you are the boss and NPCs work for you, if "creating macros and running them" were turned into "showing the NPC what to do and letting the NPC take over and do the repetitive task for you".
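That "show the NPC what to do" idea is basically macro record-and-replay; a toy sketch (entirely hypothetical, not any real Virtual Desktop API):

```python
# Hypothetical sketch: demonstrate a task once while the NPC is
# "watching" (recording), then let the NPC repeat it any number of times.

class MacroNPC:
    def __init__(self):
        self.recorded = []
        self.recording = False

    def start_recording(self):
        """Begin a new demonstration; discard any old one."""
        self.recorded = []
        self.recording = True

    def perform(self, action):
        """The user performs an action; capture it while recording."""
        if self.recording:
            self.recorded.append(action)
        return action

    def stop_recording(self):
        self.recording = False

    def replay(self, times=1):
        """The NPC takes over and repeats the demonstrated task."""
        return [a for _ in range(times) for a in self.recorded]
```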

Virtual Desktop turned into a UE4/Unity asset would also let you take your desktop computer into any virtual world as your mobile computer. Could be on your virtual wristband, could be a tablet, could be a holographic floating screen. You could be in space and still get emails, chase dragons and buy diapers on Amazon, working from wherever your soul desires to go, even if you are physically at the office.

donkaradiablo
Explorer
Either those 1080x1200 panels are smaller than previously anticipated, something like 1.2" dense panels, or I have been way too optimistic with my expectations.

Looking forward to the details about the lenses, the input solution and the in-house games. Oculus pretty much got all the "not fun" news out first (release date, resolution, even requirements). From now on, the road to E3 can be fascinating, or I'm being optimistic again.

The layers and compositor in the last SDK seem to be all I hoped for. It looks like they're going to allow some funky optimizations. Different resolutions and different FPS targets for different layers, composed together, then distorted and positionally timewarped at the final stage, is going to turn into magic in the hands of talented programmers.
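To illustrate what per-layer FPS targets buy you (my own toy sketch, not the actual Oculus SDK): on display frames where a slow layer doesn't render, the compositor just reuses that layer's most recent image:

```python
# Hypothetical sketch: each layer renders at its own FPS target; the
# compositor composes whatever image each layer last produced, so a
# 30 fps HUD layer only pays for a third of the frames a 90 fps world
# layer does.

class Layer:
    def __init__(self, name, fps):
        self.name, self.fps = name, fps
        self.last_image = None

    def maybe_render(self, frame, display_fps):
        """Render only on display frames that align with this layer's budget."""
        frames_per_update = max(1, round(display_fps / self.fps))
        if frame % frames_per_update == 0:
            self.last_image = f"{self.name}@frame{frame}"
        return self.last_image

def compose(layers, frame, display_fps=90):
    """Back-to-front composition of each layer's most recent image."""
    return [layer.maybe_render(frame, display_fps) for layer in layers]
```

The real compositor also distorts and timewarps the stale layers to the latest head pose, which is what makes reusing old images tolerable.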

donkaradiablo
Explorer
Requiring 2 USB 3.0 ports could mean some cool things, like a very high resolution IR cam, or a 3-cam solution to reconstruct you and your game space in the virtual space, or multiple IR cams on the HMD, or IR cams on wristbands and one on the desk... The resolution of the capture that happens in your room is more important than the resolution of the display, if SDE is eliminated.

Deep Echo, with soft shadows and good lighting in Ultra mode, would rock with a little better resolution, no SDE and a fixed high refresh rate. It would be fantastic if those hands mirrored the movement of your own, with the lighting and shadows from the environment on your in-game avatar. For NPCs, scan-based models with masks are "freaking real" good, like The Enforcer in the latest Technolust demo. UE4 with great animation and great design can offer great experiences at the Rift's resolution with a high-end card when the Rift comes out.

I wonder: if lenses can take care of SDE and create a smooth image out of what is basically fixed black noise between pixels, maybe they could create smooth images out of the noise of real-time ray tracing too, especially if the patterns of that noise were programmable and used in conjunction with positional tracking and timewarp.

donkaradiablo
Explorer



Uber-cool. Hoping one of the two USB 3.0 ports required is for something like this. Would make CV1 worth the wait (vs getting the Vive).