
Zuckerberg and Bosworth talk future of VR

Zenbane
Level 16

Zuckerberg and Bosworth recently (June 3rd) hosted an AMA which you can watch in its entirety here:

https://youtu.be/ncUbbVHn9vk

 

The highlights I found most interesting:


Are Advertisements coming to VR? Yes.

 

They claim that we won't "hate it" and I can imagine some very useful ways to have advertising in VR. Although I'm sure there will be lots of complaints leading up to it. I mean... I see advertisements everywhere I look in the real world, even when I'm driving down the road or on the highway. Advertisements are part of our lives for anyone living in a populated city or town. 

 

As long as we don't get the terrible pop-up ads that we see in Mobile Apps, then we should be good!

 

 

Will we get neural interfaces in VR? Yes.


Zuckerberg: Well, whenever you’re designing a new platform, I think one of the most important aspects of it is input. I think in a lot of ways, how you control it is the most defining aspect of a platform, right? A lot of people think about AR and VR as sort of ‘what’s the output? Like what do you see?’

 

The bigger thing that defines PCs is you have keyboard and mouse. For phones, it was [that] you have this multi-touch and kind of swipe input.

 

So the question is — what are you going to use to control this natural interface around AR and VR? Our view is that it’s going to be somewhat of a combination of things, right?

 

You’ll have voice assistance and that’s going to be neat. But you’re not always going to want to use voice, because there are privacy issues with that – you want to sometimes control things without it telling everyone around you what you’re doing.

 

Hands are going to be a thing. People want to control hands. But you’re not always going to be walking around through the world with your hands outstretched in front of you doing stuff. So that will work sometimes better than others.

 

Controllers are going to be one interesting dimension of this too. Because as good as hands can get, if you’re doing something that’s really a micro movement – any gamer can tell you this – actually having a thumb pad and that kind of tactile feedback is super important. So for things like writing, you want a stylus – that’s super helpful to have something physical.

 

But, in some ways the holy grail of all this is a neural interface, where you basically just think something and your mind kind of tells the computer how you want it to go and that works.

 

There’s a bunch of research that we and others are doing into this. I think the key insight that our team has had… A lot of people, when they think about neural interfaces, they think about ‘how can we understand what you’re thinking?’

 

And it’s actually not about that. You don’t want to read the person’s mind. You’re not trying to understand what they’re thinking. What you’re trying to do is give the person an ability to have their brain send signals to the rest of the body about how this works.

 

And we have a system that does this, right? With motor neurons where your brain basically sends signals to your hands and your body telling them when you want to make movements, how to control it. And it turns out that we all have some extra redundant capacity for that, right? It’s part of the neuro-plasticity. If one pathway gets damaged, your brain can kind of get rewired, but you can train those extra pathways to control, for example, a second set of virtual hands, so that way you just kind of think and, like, down the line, your virtual hands are typing and controlling what you’re doing in VR and AR, and then you don’t need to actually have a physical controller or anything like that, because that’s awesome.

 

When you get to that, we’re gonna have this whole constellation of inputs, but that is perhaps one of the more ambitious projects that we have going on. But I think it’s really promising long-term and I think the team is making good progress towards it.

 

 

A good blurb on why Wireless VR is required for "presence" (note that this applies to PCVR as well):

 

Zuckerberg: Yeah, I think there’s two big pieces here, in terms of the experience and how you get it to be accessible. One thing, that I think people probably underrate, is that if you’re delivering a product that’s about presence, you really can’t have wires.

 

That’s probably not the most obvious place to go with this question, but I do think that if you want this to be something that a lot of people are going to experience, it needs to be a good experience. If you’re trying to deliver a sense of presence, you don’t want a wire wrapped around your neck. It really breaks the whole thing.

I think that is going to be the bar for VR and AR products of high quality, going forward. I think you’ll kind of see the market split into wired experiences (which are maybe going to be less accessible to a broader number of people) and then the things that are going to be the mainstream line of technology, even if it’s a little harder to develop… I think getting on that wireless path is really important.

 

 

Full article:

https://uploadvr.com/zuckerberg-bosworth-vr-transcript/

14 REPLIES

hoppingbunny123
Level 8

it sounds like he's just spending money and doesn't know what will come of it. whenever you do the ai machine learning stuff you need to focus on one job, such as vision tracking, not vision tracking, nlp, plus 7 other machine learning jobs.

 

if i was spending my money i would find the most effective, easy-to-do job, then spend money on that.

 

personally i would spend the money on tactile feedback gloves, or the fancy eye tracking to get higher resolution. then with eye tracking finished i would apply that to my machine learning, because the eyes have fine, precise control similar to the ps5 controller thumb sticks.


@hoppingbunny123 wrote:

it sounds like he's just spending money and doesn't know what will come of it. whenever you do the ai machine learning stuff you need to focus on one job, such as vision tracking, not vision tracking, nlp, plus 7 other machine learning jobs.

 

if i was spending my money i would find the most effective, easy-to-do job, then spend money on that.

 

personally i would spend the money on tactile feedback gloves, or the fancy eye tracking to get higher resolution. then with eye tracking finished i would apply that to my machine learning, because the eyes have fine, precise control similar to the ps5 controller thumb sticks.


 

You just tried to describe "successful multitasking" as a bad thing. It's not a bad thing. It's the key to success.

 

I'm experienced with AI and Machine Learning, and it does take NLP plus 7+ other machine learning jobs. Machine Learning itself is a sub-field of AI. So to think that you can only focus on one or the other directly contradicts the very nature of these sciences.

 

That's like saying that when focusing on Gravity, you should ignore Mass and Acceleration.

 

Lastly, part of the science of innovation involves experimenting without knowing what will come of it. That's called "theorizing" and "experimentation." We wouldn't have virtual reality, or even computers and electricity, today if it wasn't for people working on things without knowing what will come of it.

 

Another word for this is: Ideas.

hoppingbunny123
Level 8

i may be wrong, but the activation functions in machine learning change the response time of the machine to user input, with the sigmoid being used mostly, and that's slow to respond to user input.

 

the neuralink demo of the monkey showed near instantaneous response time that's beyond my expectations for machine learning, especially on a cell phone. i wonder if that's real.

 

for the machine learning to be useful for facebook, the activation function needs to be something fast, maybe relu.

 

these are the types of problems i would solve first before going to some fancy project that's too slow for consumers to use intuitively. what machine learning works with the relu activation function is the first problem i would solve. then based on that i would pick what i would fund.
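For what it's worth, and setting aside whether the activation function is really the latency bottleneck (the matrix multiplies usually dominate), the cost difference being gestured at here is real: sigmoid needs an exponential per element, while ReLU is just an elementwise comparison. A quick NumPy illustration (function names are mine, not from any particular framework):

```python
import numpy as np

def sigmoid(x):
    # Smooth, saturating activation; needs an exponential per element.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Piecewise-linear activation; just an elementwise max with zero.
    return np.maximum(x, 0.0)

x = np.array([-2.0, 0.0, 3.0])
print(sigmoid(x))  # sigmoid(0) is exactly 0.5
print(relu(x))     # [0. 0. 3.]
```

In practice ReLU's popularity comes at least as much from better gradient flow during training as from the cheaper forward pass.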

TomCgcmfc
Level 16

Good article, thanks for sharing this mate.  Main thing I get out of this is Wireless Rules, lol!  While there are still some shortcomings with the Q2 wireless w/Air Link (or VD), not being tethered to a PC more than makes up for these, for me anyway.  I do hope that the Q2 Pro makes this even better.  Better passthrough cameras may allow for some better AR experiences and improved hand/body tracking.  

 

As much as I still like my Vive Pro w/Index controllers for most of my sims, I doubt very much that I will ever buy another wired PCVR headset.

i9 9900K, RTX 3090, 32GB RAM, 1TB SSD, 4TB HDD, XI Hero WiFi MB, 750W PSU, Q2 w/Air Link, Vive Pro


@TomCgcmfc wrote:

Good article, thanks for sharing this mate.  Main thing I get out of this is Wireless Rules, lol!  While there are still some shortcomings with the Q2 wireless w/Air Link (or VD), not being tethered to a PC more than makes up for these, for me anyway.  I do hope that the Q2 Pro makes this even better.  Better passthrough cameras may allow for some better AR experiences and improved hand/body tracking.  

 

As much as I still like my Vive Pro w/Index controllers for most of my sims, I doubt very much that I will ever buy another wired PCVR headset.


 

Wireless does rule, my friend! What's interesting is that some 4 years ago when TPCast was being promoted (anyone remember that?), this forum was full of Oculus naysayers bragging about how the future of PCVR is in fact... Wireless VR.

 

Fast forward, and TPCast didn't do so well while Quest 2 offers Wireless PCVR better than any unit on the market. And suddenly... magically... the Oculus naysayers have miraculously changed their tone and BOOM: Wireless VR isn't so grand anymore. lol?

 

It's the same ol' song and dance year after year: Whatever Oculus is "not" currently doing is supposed to be the best thing for VR, but as soon as Oculus "does it better," then it's no longer the best thing for VR 😂

inovator2
Level 4

It's a very nice video. I'm living what was talked about. I've mentioned on another thread that my brother, who moved away, now watches our favorite TV shows with me in VR, and we have for the past 9 months or more. Watching together on apps, or on YouTube TV with its watch-together way of putting your face on screen with the show synched, is nice. But when we watch our TV shows on the big screen it's an OMG moment. Just like Zuckerberg said, you feel that presence even though we are avatars. It's presence in a profound way.

We always lived in the same town and it's sad my brother has moved. After we watch shows together and we leave, I feel like, even though he's hundreds of miles away, I didn't just turn off a screen but left him after a visit. As avatars get better and more real, the experience will improve. I'm grateful not only for the awesome times I have with games and experiences, but for the privilege of being able to still do this with my brother. 

I can concur with that. I live in the UK, my brother lives in the States. I persuaded him to get a Quest 2, and the first time we met up in Facebook Venues it was kind of neat. We watched some stuff and chatted with a few people, and when the batteries started flagging we said goodnight to each other. We both spontaneously went for a hug. It was a magic experience 🙂

Intel Core i7 6700K @ 4.5GHz. Asus-Z170-PRO MB - CORSAIR H105 HYDRO CPU COOLER EVGA GTX 1080Ti FTW3 Elite - 16GB DDR4 2666MHZ HYPERX SAVAGE.

Nekto2
Level 9

> ... Zuckerberg: Well, whenever you’re designing a new platform, I think one of the most important aspects of it is input.

> ... So for things like writing, you want a stylus – that’s super helpful to have something physical.

 

Great thought! Different types of input are important!

If you could sit at your physical desk and use a pen/stylus to handwrite math formulas and draw graphs in VR, that could conquer school education entirely!

Remote learning with video/audio is ok for talking, but you definitely need some precise writing ability, with a real pen/paper/marker/whiteboard transferred to VR.

 

Or... could you take a real pen and an existing Touch controller, then track the movement and rotation of the controller to recreate the pen tip's path? 🙂 🙂 🙂

Maybe 3D print a pen holder for the Touch controller? Then you could write on real paper and the Touch would transfer it to a VR image 🙂
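The tracking idea boils down to a fixed rigid offset: the tip position is the controller's tracked position plus its orientation applied to a constant offset in the controller's local frame. A minimal sketch in Python/NumPy (function names and the quaternion convention are my assumptions, not any Oculus SDK API):

```python
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # Expansion of q * v * q^-1 for unit quaternions.
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def pen_tip_world(controller_pos, controller_quat, tip_offset):
    """World-space pen tip from the tracked controller pose and a fixed
    tip offset measured in the controller's local frame."""
    return np.asarray(controller_pos, float) + rotate(
        controller_quat, np.asarray(tip_offset, float))

# Evaluate this every tracking frame to sample the tip path; projecting
# the points onto the desk plane would give 2D "ink" strokes.
```

The hard part in practice isn't this math, it's measuring `tip_offset` accurately for a taped-on pen, since orientation error scales with the offset length.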

 

Any ideas whether this will work? @kojack  ?

kojack
Volunteer Moderator

@Nekto2 wrote:

> ... Zuckerberg: Well, whenever you’re designing a new platform, I think one of the most important aspects of it is input.

> ... So for things like writing, you want a stylus – that’s super helpful to have something physical.

 

Great thought! Different types of input are important!

If you could sit at your physical desk and use a pen/stylus to handwrite math formulas and draw graphs in VR, that could conquer school education entirely!

Remote learning with video/audio is ok for talking, but you definitely need some precise writing ability, with a real pen/paper/marker/whiteboard transferred to VR.

 

Or... could you take a real pen and an existing Touch controller, then track the movement and rotation of the controller to recreate the pen tip's path? 🙂 🙂 🙂

Maybe 3D print a pen holder for the Touch controller? Then you could write on real paper and the Touch would transfer it to a VR image 🙂

 

Any ideas whether this will work? @kojack  ?


Actually I worked on a thing like this.

I sticky-taped a pen to a Touch controller and created a calibration system to work out the pen tip position, based on touching one point on an object from different directions. The use case wasn't writing though, it was as a digitiser: you could touch points on a real object and record their 3D positions, which would be exported to something like Blender.

It wasn't very accurate though. Touch controllers are good, but their rotation magnifies the error at the pen tip, since the tip is further from the centre of tracking.
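That one-point calibration can be posed as a small least-squares problem: every recorded controller pose maps the same unknown local tip offset onto the same unknown world point. A sketch in Python/NumPy under that formulation (names and pose representation are mine, not kojack's actual code):

```python
import numpy as np

def calibrate_pen_tip(rotations, positions):
    """Estimate the pen-tip offset in the controller's local frame.

    Input: controller poses (3x3 rotation R_i, world position p_i)
    recorded while the taped-on pen tip rests on ONE fixed world point.
    Each pose satisfies  R_i @ tip + p_i = pivot,  so stacking all poses
    gives a linear least-squares system in the unknowns (tip, pivot).
    """
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, p) in enumerate(zip(rotations, positions)):
        A[3 * i:3 * i + 3, :3] = R           # coefficients of the tip offset
        A[3 * i:3 * i + 3, 3:] = -np.eye(3)  # coefficients of the world pivot
        b[3 * i:3 * i + 3] = -np.asarray(p)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]  # tip offset (controller frame), pivot (world)
```

The more varied the orientations you sample, the better conditioned the system. The `R_i @ tip` term also shows exactly why tracking rotation error magnifies tip error: it scales with the length of the offset.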

 

Or just buy the Logitech VR stylus. 🙂

https://www.logitech.com/en-roeu/promo/vr-ink.html

However that VR pen (for Lighthouse) costs twice as much as a 256GB Quest 2!

 

Author: Oculus Monitor,  Auto Oculus Touch,  Forum Dark Mode, Phantom Touch Remover,  X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080TI, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2