[Hand Tracking] New Oculus Start Button

DetectiveBunss
Honored Guest
Hello,

With the latest update (v17), I've noticed there's now a Start button showing inside the left hand, just like the Oculus button on the right hand.

I've also noticed that Waltz of the Wizards uses this button to trigger a settings menu.

My question is: which script manages this new button? I searched OVRHand.cs but didn't find anything.
I also tried OVRInput, the way I would to read the Start button on a Touch controller, but that didn't seem to work.

I want to use this feature to create a menu for my game, but I haven't found any documentation about it so far.

Has anyone found where it is generated in Unity and how we can customize it?

Have a good day everyone

Usually the "menu" button (or the system gesture equivalent, in this case) causes the app to pause and open some kind of main menu. I like to react to the native gesture; it feels weird if nothing happens when the gesture triggers. Emulating it manually works to a point, but the internal system that tracks and renders the overlay runs independently of the app framerate, so precise timing is impossible to implement. The gesture should trigger the matching controller button so we can read its state; even their documentation tells us to do that. If the user first uses controllers and sees the three-line menu button open the main menu, then switches to hands and sees the same button icon but nothing happens, that's a pretty bad disconnect.
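The approach described above (reading the gesture through the matching controller button) can be sketched like this in a Unity MonoBehaviour. GetDown and Button.Start are real OVRInput API; the assumption, per this post, is that the hand system gesture is forwarded to that button, which later replies show does not hold on every SDK version. TogglePauseMenu is a hypothetical placeholder:

```csharp
using UnityEngine;

public class MenuGestureListener : MonoBehaviour
{
    void Update()
    {
        // OVRInput.Button.Start is the left controller's menu button.
        // On SDK versions where the hand system gesture is mapped to it,
        // this fires for both controllers and hand tracking.
        if (OVRInput.GetDown(OVRInput.Button.Start))
        {
            TogglePauseMenu();
        }
    }

    void TogglePauseMenu()
    {
        // Hypothetical placeholder: open or close your game's main menu here.
        Debug.Log("Menu button / system gesture pressed");
    }
}
```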

DanielYu6
Explorer

For anyone wondering, this is fixed in Oculus Integration v46.

AndyDeveloper
Explorer

I can no longer detect the menu button after installing the Meta Quest Software Update v47.

I have tried both binding the input action <XRController>{LeftHand}/start and checking the controller state directly through OVRPlugin. Both trigger with the controller button, but not with the hand gesture. The animation also appears to have changed, as it no longer shows a circular progress bar.
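For reference, the input action binding mentioned above can be set up in code like this. This is a standard Unity Input System pattern rather than anything the post spells out, and per this report the action fires for the controller button but not for the hand gesture on v47:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class MenuActionProbe : MonoBehaviour
{
    InputAction menuAction;

    void OnEnable()
    {
        // Same binding path as in the post: the left controller's "start" (menu) control.
        menuAction = new InputAction(binding: "<XRController>{LeftHand}/start");
        menuAction.performed += _ => Debug.Log("Menu action performed");
        menuAction.Enable();
    }

    void OnDisable()
    {
        menuAction.Disable();
        menuAction.Dispose();
    }
}
```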

Nate27624
Honored Guest

Hello all! I have found a solution that works for me. I am using Unity 2022.1.20f and Oculus Integration v47, meaning this works even with the new, shorter menu-press animation.

The code:

if (OVRInput.GetDown(OVRInput.Button.Three) &&
    OVRInput.GetActiveController() == OVRInput.Controller.Hands)
{
    // Open menu
}


I have tested this code, and I have reason to believe it should also work on prior versions of the Oculus Integration package. However, I have not tested that hypothesis yet.

This code only detects the menu press while the user is using hand tracking. To also catch the menu button press on a controller, OR the condition with OVRInput.GetDown(OVRInput.Button.Start).
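Putting the two checks together, a minimal sketch (assuming the Button.Three and Button.Start mappings described in this post; ShowMenu is a hypothetical placeholder):

```csharp
using UnityEngine;

public class MenuButtonRouter : MonoBehaviour
{
    void Update()
    {
        // Hand tracking: the system gesture reports as Button.Three,
        // but only treat it as the menu while hands are the active input.
        bool handMenu = OVRInput.GetDown(OVRInput.Button.Three) &&
                        OVRInput.GetActiveController() == OVRInput.Controller.Hands;

        // Controllers: the left controller's menu button.
        bool controllerMenu = OVRInput.GetDown(OVRInput.Button.Start);

        if (handMenu || controllerMenu)
        {
            ShowMenu();
        }
    }

    void ShowMenu()
    {
        // Hypothetical placeholder for your game's menu logic.
        Debug.Log("Open menu");
    }
}
```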


An additional note, and why I believe this works on prior versions:

It is necessary to include the OVRInput.GetActiveController() check because, while the user is using controllers, the X button is mapped to OVRInput.Button.Three. When the user uses their hands, however, OVRInput.Button.Three is mapped to the menu button.

There used to be a similar issue with OVRInput.Button.Four around 1-2 years ago; I guess they remapped the button at some point. Since that issue was almost identical, I doubt much has changed beyond OVRInput.Button.Start no longer working as intended. That is why I believe this solution will work on previous versions.


Hope this helps!

-Nate

Thanks Nate for the write-up. I just upgraded Oculus Integration to v47 and even this works now (for both the hand system gesture and the controller menu button):

var state = OVRPlugin.GetControllerState4((uint)OVRInput.Controller.Hands);
bool menu_gesture = (state.Buttons & (uint)OVRInput.RawButton.Start) > 0;

(however, the input action still doesn't work for me)
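Wrapped in a MonoBehaviour, the snippet above would look something like this. GetControllerState4 and RawButton.Start are the names used in the post; the edge-detection and log message are illustrative additions, since the raw state reports "held" every frame rather than a one-shot press event:

```csharp
using UnityEngine;

public class RawMenuGestureProbe : MonoBehaviour
{
    bool wasDown;

    void Update()
    {
        // Read the raw button state for the hand-tracking "controller".
        var state = OVRPlugin.GetControllerState4((uint)OVRInput.Controller.Hands);
        bool isDown = (state.Buttons & (uint)OVRInput.RawButton.Start) > 0;

        // Edge-detect so we react once per press, not every frame it is held.
        if (isDown && !wasDown)
        {
            Debug.Log("Menu gesture / button pressed");
        }
        wasDown = isDown;
    }
}
```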