VR at CES2019

Comments

  • Digikid1 Posts: 2,073 Valuable Player


    Do not ever trust that FRAUD. EVER. Linus is a fool. I actually know him IRL and he is not to be trusted. 
  • Digikid1 Posts: 2,073 Valuable Player
    Mradr said:


    Man, this stuff opens so many doors. Imagine the Oculus Go not needing a controller - just look at what you want to play for music or video and it auto-starts. No need for hand tracking, for example, while still providing the other eye-tracking benefits. Granted, I know it has that - but not in the same way that tracking the eyes alone would allow. For example, while still watching a video, increase the volume on the fly with less wear and tear on your neck.
    Not a good thing. Things automatically activating just by looking at them... I would hate that.
  • DaftnDirect Posts: 5,560 Volunteer Moderator
    edited January 2019

    Yeah, I'd welcome the foveated rendering benefits of eye tracking of course, but I'd much rather keep controls independent of eyes.

    I'm a little surprised that the media seem to concentrate more on the control side than on the performance improvements from eye-tracking tech.
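
    Just to put a rough number on why the performance side excites me, here's a back-of-the-envelope sketch (plain Python, with made-up resolution and region sizes, not from any actual headset spec or SDK) of how much pixel-shading work you save when only the small region around the gaze point is rendered at full resolution:

```python
# Back-of-the-envelope foveated rendering estimate.
# All numbers are illustrative assumptions, not real headset specs.

def shaded_pixels(width, height, foveal_fraction=0.2, periphery_scale=0.25):
    """Estimate pixels shaded when only the gaze region gets full resolution.

    foveal_fraction: fraction of each axis covered by the sharp foveal window.
    periphery_scale: per-axis resolution scale applied everywhere else.
    """
    total = width * height
    foveal = int(total * foveal_fraction ** 2)                # sharp window around the gaze point
    periphery = int((total - foveal) * periphery_scale ** 2)  # rest of the frame at reduced resolution
    return foveal + periphery

full = 2160 * 1200                      # assumed panel resolution, full-res shading
foveated = shaded_pixels(2160, 1200)
print(f"full: {full}  foveated: {foveated}  saving: {1 - foveated / full:.0%}")
```

    Exactly how big the saving is depends on eye-tracker latency and how the regions are blended, but that's the gist of why the rendering gains interest me more than gaze-based controls.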

    Intel 5820K [email protected], Titan X (Maxwell), 16GB Corsair Vengeance DDR4, ASRock X99 Taichi, Samsung 500Gb 960 Evo M.2, Corsair H100i v2 Cooler, Inateck KTU3FR-4P USB 3 card, Windows 10 Pro v1903 (18363.720)
  • Mradr Posts: 3,617 Valuable Player
    edited January 2019

    DaftnDirect said:
    Yeah, I'd welcome the foveated rendering benefits of eye tracking of course, but I'd much rather keep controls independent of eyes.

    I'm a little surprised that the media seem to concentrate more on the control side than on the performance improvements from eye-tracking tech.

    What do you mean? We're already using the same idea by looking at buttons and other UI in the Rift and Go. What's the difference when they track your eyes instead? Honestly, it seems a bit short-sighted to want to keep things independent of your eyes. I can think of a number of good reasons to use eye tracking for UI control, in a number of ways.

    Take an MMO - when you look at a player for a given amount of time, it can show statistics about that player such as their health points, strength, magic, etc.

    In a hack and slash, bring up menus for your skills and abilities on the fly without having to press as many buttons.
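
    The logic needed for that kind of look-to-inspect UI is tiny. A minimal dwell-timer sketch in Python (gaze_target(), show_stats() and hide_stats() are hypothetical stand-ins for whatever the eye-tracking SDK and the game's UI actually expose):

```python
import time

DWELL_SECONDS = 0.8  # how long the gaze must rest on a player before the UI reacts

def gaze_dwell_loop(gaze_target, show_stats, hide_stats):
    """Show a player's stat panel once the gaze has rested on them long enough.

    gaze_target(): hypothetical call returning the entity under the gaze ray (or None).
    show_stats(entity) / hide_stats(): hypothetical game-UI hooks.
    """
    current, dwell_start, shown = None, 0.0, False
    while True:
        target = gaze_target()
        if target is not current:            # gaze moved to something new: restart the timer
            current, dwell_start, shown = target, time.monotonic(), False
            hide_stats()
        elif current is not None and not shown and time.monotonic() - dwell_start >= DWELL_SECONDS:
            show_stats(current)              # e.g. HP, strength, magic of the looked-at player
            shown = True
        time.sleep(1 / 90)                   # poll roughly once per frame
```

    The same pattern covers the hack-and-slash case - just swap show_stats for opening the skill menu.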
  • snowdog Posts: 7,354 Valuable Player
    Having eye tracking will also mean that when I develop my @vannagirl Simulator the 3D model of her will flash her tits and show her arse whenever you look at them. Eye tracking can't come soon enough imo :D
    "This you have to understand. There's only one way to hurt a man who's lost everything. Give him back something broken."

    Thomas Covenant, Unbeliever
  • DaftnDirect Posts: 5,560 Volunteer Moderator
    edited January 2019

    Mradr said:
    What do you mean? We're already using the same idea by looking at buttons and other UI in the Rift and Go. What's the difference when they track your eyes instead? Honestly, it seems a bit short-sighted to want to keep things independent of your eyes. I can think of a number of good reasons to use eye tracking for UI control, in a number of ways.

    Take an MMO - when you look at a player for a given amount of time, it can show statistics about that player such as their health points, strength, magic, etc.

    In a hack and slash, bring up menus for your skills and abilities on the fly without having to press as many buttons.
    The difference is that we're used to doing things with our hands at the same time that we're doing other things with our eyes.

    It's very intuitive to look at something you want to control, move your hands towards it and, at the same time that you're doing that, look towards the next thing that you want to do. If you're relying on controlling things with your eyes you have to wait for that control to be done before looking elsewhere. It may seem quicker to control with your eyes but I bet it works out slower and more awkward in reality.

    One of the first things I noticed with the VR version of Fallout 4 in comparison with the pancake version was that you can point your hand towards a stream, or other water supply, and then look around you while you fill your empty bottles with water (provided you keep your hand pointed at the water). This was really liberating: you can check for danger while your hands are busy. There are several other examples in the game where this is useful. In the pancake version you have to maintain your gaze at what you're doing in order to carry out that action... just like you would in an eye-tracked control scenario.

    If I was in a hack & slash... I'd definitely want to have my eye on the enemy, not on any menus.

    Intel 5820K [email protected], Titan X (Maxwell), 16GB Corsair Vengeance DDR4, ASRock X99 Taichi, Samsung 500Gb 960 Evo M.2, Corsair H100i v2 Cooler, Inateck KTU3FR-4P USB 3 card, Windows 10 Pro v1903 (18363.720)
  • Mradr Posts: 3,617 Valuable Player
    edited January 2019

    Mradr said:
    What do you mean? We're already using the same idea by looking at buttons and other UI in the Rift and Go. What's the difference when they track your eyes instead? Honestly, it seems a bit short-sighted to want to keep things independent of your eyes. I can think of a number of good reasons to use eye tracking for UI control, in a number of ways.

    Take an MMO - when you look at a player for a given amount of time, it can show statistics about that player such as their health points, strength, magic, etc.

    In a hack and slash, bring up menus for your skills and abilities on the fly without having to press as many buttons.
    DaftnDirect said:
    The difference is that we're used to doing things with our hands at the same time that we're doing other things with our eyes.

    It's very intuitive to look at something you want to control, move your hands towards it and, at the same time that you're doing that, look towards the next thing that you want to do. If you're relying on controlling things with your eyes you have to wait for that control to be done before looking elsewhere. It may seem quicker to control with your eyes but I bet it works out slower and more awkward in reality.

    One of the first things I noticed with the VR version of Fallout 4 in comparison with the pancake version was that you can point your hand towards a stream, or other water supply, and then look around you while you fill your empty bottles with water (provided you keep your hand pointed at the water). This was really liberating: you can check for danger while your hands are busy. There are several other examples in the game where this is useful. In the pancake version you have to maintain your gaze at what you're doing in order to carry out that action... just like you would in an eye-tracked control scenario.

    If I was in a hack & slash... I'd definitely want to have my eye on the enemy, not on any menus.

    I understand the case for a controller-based system to do more complex things. I mean, hand tracking isn't better than controller-based input either. Hand tracking comes with a lot of limits on input, the same as voice input, plus the learning curve that comes with them. I'm talking about simple, basic, Go-level requests (from my first post there) to load up music, movies and so on, which don't require a full, complex control system for selecting said content. Plus, looking at something is far easier to understand than hand tracking alone, and it allows extra control functions too, such as going backwards in a menu by just looking at the back arrow and double blinking, for example.

    As a healer in some MMOs, for example, my eyes are focused more on health bars than on anything else, while in other games my eyes are focused on what is coming at me, to make sure I am safe and out of danger. Where you are being short-sighted is that you are only looking at it from one side - the danger part - instead of everything else that's going on as well. More or less, you could use eye tracking for both, alongside the controller system, for the different operations going on. I'm sure as more software comes out there will be more examples where both can be used in a more complex way - but to say eye tracking shouldn't be used for UI design at all is simply silly at best.

    Honestly, we're NOT used to doing things with our hands anymore - we're used to doing things with input devices such as a mouse, keyboard, or controller.
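
    And the "look at the back arrow and double blink" idea needs barely any code either. A sketch of the blink side (Python; the blink events would come from the eye tracker, and gazing_at_back_arrow()/menu.go_back() are made-up stand-ins, not a real SDK):

```python
import time

DOUBLE_BLINK_WINDOW = 0.4  # max seconds between two blinks to count as a double blink

def make_double_blink_detector(window=DOUBLE_BLINK_WINDOW):
    """Return a callback to feed blink events into; it reports double blinks."""
    last_blink = [None]                       # timestamp of the previous unpaired blink
    def on_blink(now=None):
        now = time.monotonic() if now is None else now
        is_double = last_blink[0] is not None and (now - last_blink[0]) <= window
        last_blink[0] = None if is_double else now  # consume the pair so a triple blink doesn't fire twice
        return is_double
    return on_blink

# Hypothetical usage, called once per blink event from the eye tracker:
#   if on_blink() and gazing_at_back_arrow():   # both names are stand-ins, not a real SDK
#       menu.go_back()
```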
  • DaftnDirect Posts: 5,560 Volunteer Moderator
    edited January 2019
    That's true, but mouse, keyboard and controllers are all extensions of your hands. I think you may be surprised by the difference between how you currently glance at things and how much more deliberately you'd have to look at things to control them.

    When you say you focus on things, I believe it's more a mental focus than an actual eye focus. In reality, your eyes are constantly moving very quickly all over the screen, with brief pauses where your mental focus is. This would be fundamentally different if eye control were implemented.

    Yes, for headsets like the Go that don't have hand-tracked controls, eye-tracked controls would be a useful addition for some experiences, but I think we're talking about the Vive, which has hand controls, together with the next iteration of the Rift (probably).

    There may be a use case for flight simulators, where the screen is filled with controls and your hands are busy with joystick and throttle, but even then, I think the value of eye tracking is largely down to simulators not currently being able to implement hand tracking alongside flight-controller use at the same time (or at least there are difficulties in doing so).

    Edit:
    I think all I'm saying is, our eyes have evolved over millions of years to be the perfect tools for gathering information very quickly. Our hands have evolved to be the perfect control instruments. And I think swapping those tasks in any way will not be the primary benefit of eye tracking... but performance gains definitely will be vital.
    Intel 5820K [email protected], Titan X (Maxwell), 16GB Corsair Vengeance DDR4, ASRock X99 Taichi, Samsung 500Gb 960 Evo M.2, Corsair H100i v2 Cooler, Inateck KTU3FR-4P USB 3 card, Windows 10 Pro v1903 (18363.720)
  • Mradr Posts: 3,617 Valuable Player
    edited January 2019
    I can still think of a ton of uses for eye tracking beyond performance, either way. From horror games to simulation games, there are so many new tricks possible that current flat games can't even hope to do. The performance gains are just icing on the cake, and honestly - I said it in another post - eye tracking is going to make or break future generations. If we don't start working with eye tracking early and get the software community working on it now, we'll just end up in the same boat Nvidia's RTX is in right now with its RT and DLSS features: limited to future hardware, with the only way out being to increase the cost for more performance.


    Agreed - eyes are designed to take in a large amount of information - but your hands can only work as well as the information they're given. What makes humans different, though, isn't that we rely on our hands alone, but our ability to extend them with tools (in this case input devices) to do things outside the normal range of our biology.
  • DaftnDirect Posts: 5,560 Volunteer Moderator
    Well, it's an interesting discussion Mradr, I think we just have slightly different perspectives on it. I'd say performance gains are going to be the main thing and eye control will be the icing on the cake... if implemented only where it's a benefit!

    It's definitely going to be a fantastic addition for anyone with restrictions in movement for sure.
    Intel 5820K [email protected], Titan X (Maxwell), 16GB Corsair Vengeance DDR4, ASRock X99 Taichi, Samsung 500Gb 960 Evo M.2, Corsair H100i v2 Cooler, Inateck KTU3FR-4P USB 3 card, Windows 10 Pro v1903 (18363.720)
  • JD-UK Posts: 2,362 Valuable Player
    Not VR, but nevertheless, interesting for those that refuse to use maps :p

    https://www.bbc.co.uk/news/av/technology-46860454/ces-2019-the-sat-nav-of-the-future-has-arrived

  • DaftnDirect Posts: 5,560 Volunteer Moderator

    Yep, saw that this morning. The sat nav side of it is a big improvement on anything small-screen devices can do.

    I've got reservations about all the safety alerts popping up though. My feeling is, if you're unable to identify a cyclist or a truck pulling out in front of you as a cause for warning, then it may be time to take a taxi... or give in to autonomous driving completely!

    I'm already against the ever-increasing functionality of infotainment systems in modern cars. Touch screens or touch-sensitive buttons should be strictly limited, as they require you to look at them much more than physical buttons do.

    Now we seem to have digital configurable instrument panels and ambient lighting on the rise with LED strips all over the place.

    In fact everything you can think of to distract the driver from looking at what's really outside the car!

    Anyway, the Sat Nav part was very good.

    Intel 5820K [email protected], Titan X (Maxwell), 16GB Corsair Vengeance DDR4, ASRock X99 Taichi, Samsung 500Gb 960 Evo M.2, Corsair H100i v2 Cooler, Inateck KTU3FR-4P USB 3 card, Windows 10 Pro v1903 (18363.720)
  • Shadowmask72 Posts: 4,044 Valuable Player
    Brixmis said:
    Not VR, but nevertheless, interesting for those that refuse to use maps :p

    https://www.bbc.co.uk/news/av/technology-46860454/ces-2019-the-sat-nav-of-the-future-has-arrived


    Hmm. Turning real driving into a video game. Nice. I wonder if we get a score-multiplier overlay for avoiding drunk peds and arrogant Audi drivers?


    System Specs: RTX 2080 ti , i9 9900K CPU, 16 GB DDR 4 RAM, Win 10 64 Bit OS.
  • DaftnDirect Posts: 5,560 Volunteer Moderator
    omg Audi drivers... what is it with them! they used to be the antidote to BMW drivers, now they've replaced them
    Intel 5820K [email protected], Titan X (Maxwell), 16GB Corsair Vengeance DDR4, ASRock X99 Taichi, Samsung 500Gb 960 Evo M.2, Corsair H100i v2 Cooler, Inateck KTU3FR-4P USB 3 card, Windows 10 Pro v1903 (18363.720)
  • snowdog Posts: 7,354 Valuable Player
    Pah! Driving is for peasants. It won't be long before we're teleporting everywhere. Couple of hundred years at the most.

    In the meantime, once I win the Euromillions, I'll be employing a chauffeur to do all my driving for me. B)
    "This you have to understand. There's only one way to hurt a man who's lost everything. Give him back something broken."

    Thomas Covenant, Unbeliever
  • Techy111 Posts: 6,735 Volunteer Moderator
    You guys haven't experienced bad drivers until you drive on blue lights grrrrrrrrr
    A PC with lots of gadgets inside and a thing to see in 3D that you put on your head.

  • Greyman Posts: 1,315 Wintermute
    I'm sure that ambulance drivers aren't as bad as that Techy  :)
  • JD-UK Posts: 2,362 Valuable Player
    Yeah, when we drove with our blue lights on, everyone moved out the way very sharply! (But they do tend to do it more for police than other emergency vehicles...) :D