Hello there! I like to write a little too much sometimes, so if you don't have time for reading… just TRY THIS!
OK, maybe check the "README.txt" first for controls.
The demo can still be enjoyed even if you don't want to try my alternative navigation methods or don't have a Razer Hydra (just use gamepad thumbsticks).
So apart from my particular obsession with positional tracking and motion controls, another thing I've been very interested in lately is THE VR NAVIGATION PROBLEM.
I believe letting people navigate VR environments intuitively, without inducing sickness in a significant percentage of them, is going to be crucial for the mainstream success of VR.
Unfortunately, I think we are not quite there yet.
Even if we get the perfect HMD with ultra-accurate tracking, a high refresh rate and zero latency, simulating the usual game avatar walking or turning with a gamepad or mouse will still induce sickness in many people, for the simple reason that there is still a mismatch between their visual and vestibular systems.
Basically, an HMD with a wide FOV can create the illusion of self-motion (vection) while your inner ear is not detecting any movement, and this creates a conflict. Acceleration is the main problem here, not velocity, since constant velocity doesn't really affect our inner-ear fluids.
We know one possible solution is to design experiences where your avatar is not moving, or stays inside a fixed reference point like a cockpit, but I think this new alternate reality begs to be explored without restrictions.
Another aspect that I think needs a lot of improvement is navigation usability. Ideally we would have a navigation system that could be used even by non-technical people or those with no previous gaming experience. Grandmas should be able to enjoy VR in all its glory!
Personally, I don't think traditional controller thumbsticks have a place in the future of VR navigation. Yes, they served their purpose in the history of computer games; when Microsoft released Halo in 2001 it was great being able to play an FPS while sitting on your couch. But really, I think having only 2DOF (per thumbstick) with very limited granularity and precision is far from ideal for VR navigation.
13 years later we still don't know what the "standard" VR controller is going to look like, but we do know it's very likely going to involve 6DOF tracking on each hand (on top of 6DOF tracking for the head). We already have great products that meet these requirements (Oculus DK2 and Razer Hydra), so I think it's time to try something different… So DEATH TO THE THUMBSTICK!! Oh, and DEATH TO THE MOUSE TOO!! There!
A BIT OF HISTORY
There have already been a good number of efforts to solve some of these navigation issues, but as far as I know, all of them involve some sort of tradeoff.
- Time Rifters seems to do a really good job of reducing sim-sickness by keeping the HUD in a fixed position, giving the user the feeling of being inside a helmet and creating the same effect as cockpit games (where having a reference point reduces vection).
- The guys at CloudHead Games introduced what they called "Comfort Mode" for their game "The Gallery", which effectively removes rotational vection by turning in discrete increments (of around 30 degrees, I believe). This seems to help a lot of people, but many find it quite immersion-breaking.
introduced an instant-teleport mode which, similarly to Comfort Mode, completely eliminates vection, but in this case for translations instead of rotations. I think it's a really clever system, and it can be useful in many circumstances, but obviously it feels very different from what most people are used to. I think using it in a game like Skyrim, for instance, would remove some of the "magic" of exploration.
- I think the developer of Lava Inc had a good idea using positional tracking to control forward acceleration, but unfortunately the sharp turns of the rollercoaster, combined with a small wagon with no fixed reference, made it a bit sim-sickness-inducing, at least for me.
- EvilMech Runner
(not sure which one was first) implemented quite an original navigation method, also with the help of positional tracking: it detects the up-and-down motion of the HMD as you walk in place in RL and translates it into forward movement in VR. Some people said it really helps, but it can lack the control finesse of other methods and be a bit too "active" for some users.
is reported to be rather comfortable by quite a few people; I think this is due to the mechanic of turning your head before moving and then going quickly forward. The open-landscape nature of the level also helps reduce the amount of vection.
- There are probably a few more examples worth noting that I don't remember right now, but I'll finish with a quote from Palmer Luckey, who said after Oculus Connect, regarding the vestibular mismatch:
We have tested countless ideas (locked backgrounds, grids, FOV reduction, blur, head-movement based navigation, etc), but as of yet, there is no universal solution. Everything promising is very situationally dependent, not a silver bullet.
I'm disappointed that our extensive research has not turned up a universal fix for vection, but it is not for lack of trying.
Personally, I have been thinking about and discussing this issue in a few different places over the last couple of years; here are some examples of interesting threads worth a read:
MOTION-BASED NAVIGATION (MBN)
So I decided to build a little demo to test some of the navigation methods that seemed most promising to me.
In fact, I started testing some of these methods by modifying the Tuscany demo almost a year ago, but I soon realized that in order to tell whether something "felt" right, it was essential to have a body inside the game: a more or less decent avatar you could identify with. That would take a bit more work than just a quick hack.
Apart from that, I also wanted an alternative demo to Tuscany for showing VR locomotion to first-timers, so I decided to do both things at the same time while learning UE4, basing my work on the already great UE4 SUN TEMPLE demo (more about this later).
I finally have something to show: after trying a few different things that didn't work out, I eventually settled on one alternative system for controlling avatar movement (translation) and three alternatives for controlling avatar rotation (one using motion controls and two using head tracking).
Basically, I was inspired by the "1 button thread" on MTBS I mentioned before, and I implemented what I named "PositionalMove" + "MotionTurn" in the Oculus forum discussion, as well as two variations of "HeadStick + TurnButton".
The main idea is to use only one or two buttons in combination with other inputs in order to move or turn. The trick is that the buttons only "activate" navigation mode; to control it you use your torso and/or hands. This means that to stop moving or turning you return to your original position, so there is always a smooth, directly controlled motion, without sudden accelerations or decelerations.
The basic idea for translation is to lean in real life (RL) in the same direction you want to move in virtual reality (VR). The theory is that this helps reduce the mismatch between accelerating in VR and being idle in RL; we find a middle ground by mimicking the motions we want to perform in VR.
Let me define some terms first:
S = Stopped
V = Velocity
A = Acceleration
D = Deceleration
J = Jerk
↑ = Increase
↓ = Decrease
Here's an example of the correspondence between leaning forward in RL and moving the character forward in VR:
RL --> VR
S --> S          Head/torso stopped in RL / Avatar stopped in VR
V↑(A) --> A↑(Ja) We start moving (accelerating) the head forward / Acceleration increases in VR
V --> A          We keep moving the head at constant velocity / Constant acceleration in VR
V↓(D) --> A↓(Jd) We start stopping (decelerating) the head / Acceleration decreases in VR
S --> V          Head stopped in the forward position / Constant velocity in VR
So what we are actually doing here is applying in VR the time derivative of the action performed in RL: the avatar's acceleration tracks the head's velocity. There is still a mismatch between the two, but it is no longer so obvious to the brain, since the accelerations are correlated.
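As a rough illustration, this mapping can be sketched as a per-frame update where avatar velocity is proportional to the head's lean offset; differentiating both sides then gives the acceleration correspondence in the table above. All names, gains and thresholds here are made up for illustration, not taken from the demo's actual implementation:

```python
# Sketch of the lean-to-move mapping: avatar velocity is proportional
# to the head's forward offset from a calibrated neutral position.
# Differentiating both sides, avatar acceleration then tracks head
# velocity, which is the RL/VR derivative correspondence shown above.
# gain and dead_zone_m are illustrative values, not from the demo.

def avatar_velocity(head_offset_m, gain=2.0, dead_zone_m=0.02):
    """Map forward head lean (meters) to avatar forward speed (m/s)."""
    if abs(head_offset_m) < dead_zone_m:
        return 0.0                  # ignore small jitter: stay stopped
    return gain * head_offset_m     # lean further = move faster

def step(avatar_pos, head_offset_m, dt):
    """One frame of the update loop: integrate velocity over dt."""
    return avatar_pos + avatar_velocity(head_offset_m) * dt
```

With this mapping, holding a steady lean produces constant velocity, and moving the head produces acceleration, which is exactly the derivative correspondence in the table.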
The implementation sounds quite simple, but I ran into a few challenges, like how to compensate for rotation of the head when you are only interested in translation of the torso (so it's possible to turn your head without moving). Since the SDK simply provides the location of the HMD, you have to subtract the changes in translation caused by the rotation of your head from the "Device Position". My solution is not perfect, since you have to approximate using a factor that depends on the size of the head, but it's still much better than using no correction at all.
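A minimal sketch of that correction, under the simplifying assumption of a fixed neck-pivot model in 2D (top-down, yaw only); `head_radius_m` plays the role of the per-user, head-size-dependent factor mentioned above, and all names are hypothetical:

```python
import math

# Sketch of the head-rotation compensation: the HMD sits roughly a head
# radius in front of the neck pivot, so rotating the head translates the
# HMD even when the torso hasn't moved. Subtracting that rotation-induced
# offset approximates the torso position. 2D (yaw only) for brevity.

def estimated_torso_xy(hmd_x, hmd_y, yaw_rad, head_radius_m=0.10):
    """Remove the HMD translation caused purely by head yaw."""
    # Forward vector of the head in the horizontal plane.
    fx, fy = math.cos(yaw_rad), math.sin(yaw_rad)
    # The neck pivot sits behind the HMD along the forward vector.
    return hmd_x - head_radius_m * fx, hmd_y - head_radius_m * fy
```

With this correction, turning your head in place moves the HMD but leaves the estimated torso position (and therefore the avatar) still.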
Head-Based Rotation (Standard)
A similar principle applies to head-based rotation. Here the rotation of your head in RL drives the rotation of your avatar in VR; however, it has some problems that we didn't have with head-based translation.
I think it works quite well when mainly walking forward and making adjustments to your walking direction, but it's a bit counter-intuitive when, for instance, you are standing still, notice something to your side and want to rotate towards it. In real life you would keep staring at the object while rotating, but here you effectively have to look away in order to turn, and then face it again.
I still left it in because some people may prefer it over using a thumbstick, or find it less nausea-inducing than other options.
Head-Based Rotation (Direct)
As an alternative to the Standard version, here you immediately turn towards the point you are looking at, so you have to counter the rotation with your head.
I think it works pretty well, because all you have to do is fix your view in the direction you want to face and naturally compensate for the avatar's rotation until it stops. The problem is that it doesn't follow the correlation between accelerations explained before, so I think it can induce motion sickness more easily; still, I believe it could be a decent alternative in the absence of motion controls.
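To make the contrast between the two modes concrete, here is a per-frame sketch (yaw only, with made-up names and gains; this is an illustration of the idea, not the demo's actual code):

```python
# Sketch contrasting the two head-based rotation modes, per frame.
# head_yaw_deg is the head's yaw relative to the avatar's forward
# direction; gain, dead_zone_deg and fraction are illustrative.

def standard_turn_rate(head_yaw_deg, gain=1.5, dead_zone_deg=5.0):
    """Standard mode: head yaw offset drives the avatar's turn *rate*
    (deg/s). Look straight ahead again and the avatar stops turning."""
    if abs(head_yaw_deg) < dead_zone_deg:
        return 0.0
    return gain * head_yaw_deg

def direct_step(avatar_yaw_deg, head_yaw_deg, fraction=0.2):
    """Direct mode: each frame the avatar turns a fraction of the way
    toward the gaze direction, so the user counters the rotation with
    the head until avatar forward and gaze coincide."""
    return avatar_yaw_deg + fraction * head_yaw_deg
```

The Standard variant keeps the acceleration correlation described earlier (steady head offset = steady turn rate), while the Direct variant trades that away for the more natural "look where you want to go" behavior.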
Hand-Based Rotation
This is the last method I implemented for rotation. Basically, you hold a motion controller like the Razer Hydra and use the yaw value of the controller, in combination with a button press, to determine the rotation of the avatar in VR.
It's really quite similar to using a thumbstick, with the difference that it's much more precise, so you have greater control over the movement and it can provide much smoother transitions. I believe this can help reduce sim-sickness too.
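A sketch of the idea (hypothetical names and gain; the actual demo logic may differ): while the button is held, the avatar's turn rate follows how far the controller's yaw has moved from its yaw at the moment of the press:

```python
# Sketch of hand-based rotation: pressing the button records the
# controller's current yaw as a reference; while held, the avatar's
# turn rate is proportional to the yaw offset from that reference.
# Returning the hand to the reference yaw stops the turn smoothly.

class HandTurn:
    def __init__(self, gain=2.0):
        self.gain = gain
        self.ref_yaw = None          # controller yaw at button press

    def press(self, controller_yaw_deg):
        self.ref_yaw = controller_yaw_deg

    def release(self):
        self.ref_yaw = None

    def turn_rate(self, controller_yaw_deg):
        """Avatar turn rate in deg/s; zero while the button is up."""
        if self.ref_yaw is None:
            return 0.0
        return self.gain * (controller_yaw_deg - self.ref_yaw)
```

Because the yaw offset is continuous rather than a snapped stick axis, small wrist movements give fine-grained, smooth turn rates.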
Finally, I also implemented what I call 1-Button Mode. If you activate this mode, the same buttons used for rotation activate head-based translation as well. I think it works especially well with hand-based rotation (Hydras).
Control Methods Conclusions
All input methods are enabled at the same time (using different buttons, explained in the readme), so you can try different combinations: thumbstick to move and Hand-Based Rotation to rotate, play with the sensitivity adjustments, etc. You can also try stepping forward instead of leaning, but I personally prefer to stay in place.
For me, the best combination by far is Head-Based Translation + Hand-Based Rotation, but I left the other options in so people can test them. I'm quite certain I will settle on this or a very similar control scheme in the game I'm working on, but I would be honored if other games or apps considered it.
Pros:
- Intuitive and easy to learn
- It can attenuate simulation sickness
- Responsive, precise and smooth (considerably more than using wasd keys or gamepad thumbsticks)
- Simplifies hand controls (from 2 thumbsticks to only 1 or 2 buttons), freedom to do other things, ideal in combination with motion controls, motion capture cameras or even pedals
- Decoupling of movement and vision -> allows richer interactions
- Works both standing and seated
Cons:
- You can't turn and do something that involves rotating both hands at the same time
- Requires HMD positional tracking and motion controls, so it is not applicable to something like GearVR. Having said that, since only rotational tracking of the hands is required, I guess something like the DroidGlove could be used instead of the Hydra/Stem/Move.
- I might be a bit biased, but I can't think of any other one right now :-P
Personally, I don't usually get motion sickness, so I'm not the best test subject. I tested with only three people besides myself; no sim-sickness was reported using Hydra motion controls, and one of them experienced a bit of dizziness using standard thumbsticks.
The only discomfort mentioned by one of the users (my brother) came from crashing into a wall, so please try not to crash into walls when testing, as this creates a mismatch. Maybe it could be sold as a feature to increase immersion: you wouldn't crash your head into a wall in real life, so don't do it in VR either!
Unfortunately, I don't think I can say I "solved" the vestibular mismatch issue; in the end, my approach has its own set of tradeoffs, like the other methods. But I thought it was worth the effort to explore. At least it's another option on the list, one that some people might prefer, which I believe is a good thing.
ABOUT THE SUN TEMPLE MAP
As I mentioned before, I wanted an alternative to Tuscany as an intro to VR and as a possible standing experience with movement. The main problems I found with Tuscany are:
- No avatar body
- Center of rotation when turning seems off
- Fairly simple graphics
- Stairs, which can induce sim-sickness
- Strange bumps on the floor
So I adapted/modded the UE4 SUN TEMPLE map for VR using the VR Template provided by mitchemmc from the Epic forums. I think it's an awesome map; it looks good and runs quite well, since it was originally optimized for mobile hardware. Another advantage is that it's quite small, with close walls and columns, so it produces a good amount of vection.
Its only problem is that it has two different levels with stairs in between, which is not good, since stairs can induce sim-sickness too. I wanted to remove this from the equation, so I decided to remove the stairs and flatten out the whole map; unfortunately, the first hallway was a bit too much work to bring to the same level as the rest, so it's not accessible at the moment.
Here are some of the changes I made to the original level:
- Flattened map (minus hallway)
- Removed a few things for performance
- Changed scale / location of some objects that didn't look quite right in VR
- Adapted lighting / post-process
- Revised collisions
- Added ambient sounds
- Added the avatar from the VR template, slightly tweaked (I know the animations could be improved a lot)
So basically I butchered the whole map; apologies to the original designers at Epic! :-P
PERFORMANCE AND OTHER CONSIDERATIONS
- The demo is compatible with Oculus runtimes 0.4.2 through 0.4.4, although I don't really recommend 0.4.3.
- The best performance for me is with 0.4.2 Direct Mode / Mirroring OFF / Aero OFF (hmd sp 140).
- I think it's VERY IMPORTANT to tweak the Screen Percentage value as best as you can for your machine. It has a big impact on visual fidelity, so it should always be the maximum value that still runs at a constant 75 fps. You can access it in the menu (Start button to enter the menu, right trigger to select). By default it's 130. If the demo is not running at a constant 75 fps, turn it down until it does; if it is, turn it up until it isn't and then back down again.
- As always there is a tradeoff: 0.4.4 has awesome near-zero latency, but in my case I have to turn the screen percentage down too low for my taste (110), which is why I personally prefer 0.4.2.
- If using 0.4.4 Direct Mode, turn Aero ON!
- Remember to leave the Razer Hydra on its base until the demo has started!
- I recommend playing standing in place to get a better sense of immersion/presence.
- Try to avoid crashing into walls or anything that will suddenly stop your avatar, as this can easily induce sim-sickness.
- Ambient sound: I did a very quick job with this, so I recommend playing it at very low volume (so you can barely hear your own steps) and without headphones (they will only help you notice that there is no proper audio spatialization).
TO FINISH OFF
Please help test this! I appreciate honest feedback, and I'm especially interested in people who experience problems with vection (discomfort moving around the Tuscany demo, for instance).
Let me know what you think!
Which was your favorite mode or combination? Did you notice an improvement in sim-sickness, or was it worse than with classic navigation methods? Did you find the new controls intuitive?
Special Thanks to skyrimer (MTBS forums), MrGreen (Oculus forums), mitchemmc (UE4 VR Game Template) and getnamo (UE4 Control Plugins).