05-27-2020 07:54 AM
10-01-2022 12:48 AM
Usually the "menu" button (or, in this case, the equivalent system gesture) causes the app to pause and open some kind of main menu. I like to react to the native gesture; it feels weird if nothing happens when the gesture triggers. Emulating it manually works up to a point, but the internal system that tracks and renders the overlay runs independently of the app framerate, so precise timing is impossible to implement. The gesture should trigger the matching controller button so we can read its state; even their documentation tells us to do that. If the user first uses controllers and sees the three-line menu button open the main menu, then switches to hands and sees the same button icon but nothing happens, that's a pretty bad disconnect.
11-01-2022 05:31 AM
For anyone wondering, this is fixed in Oculus Integration v46.
12-12-2022 06:32 AM
I can no longer detect the menu button after installing the Meta Quest Software Update v47.
I have tried both binding the input action <XRController>{LeftHand}/start and checking the controller state directly through OVRPlugin. The button triggers with the physical controller button, but not with the hand gesture. The animation also appears to have changed: it no longer shows a circular progress bar.
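For context, this is roughly how the input action side is set up (a minimal sketch using Unity's Input System; the action name "Menu" is just an example):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Bind directly to the left controller's start/menu button.
// In v47 this fires for the physical controller button, but
// no longer fires when the hand-tracking system gesture is used.
var menuAction = new InputAction("Menu", binding: "<XRController>{LeftHand}/start");
menuAction.performed += ctx => Debug.Log("Menu pressed");
menuAction.Enable();
```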
12-18-2022 09:36 PM
Hello all! I have found a solution that works for me. I am using Unity 2022.1.20f and Oculus Integration version 47, meaning this works even with the new, shorter menu press animation.
The code:
if (OVRInput.GetDown(OVRInput.Button.Three) && OVRInput.GetActiveController() == OVRInput.Controller.Hands)
{
    // Open menu
}
I have tested this code, and I have reason to believe it should also work on prior versions of the Oculus Integration package; however, I have not tested that hypothesis yet.
This code only detects the menu press while the user is on hand tracking. To also catch the menu button pressed on a controller, OR in a check for OVRInput.GetDown(OVRInput.Button.Start).
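Put together, the combined condition might look like this (a sketch based on the snippet above; I have only tested the hand-tracking half myself):

```csharp
// Hand-tracking path: the system gesture surfaces as Button.Three,
// but only while hands are the active controller.
bool handMenuGesture = OVRInput.GetDown(OVRInput.Button.Three)
    && OVRInput.GetActiveController() == OVRInput.Controller.Hands;

// Controller path: the physical menu button is Button.Start.
bool controllerMenuButton = OVRInput.GetDown(OVRInput.Button.Start);

if (handMenuGesture || controllerMenuButton)
{
    // Open menu
}
```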
An additional note on why I believe this works on prior versions: the OVRInput.GetActiveController() check is necessary because, when the user is on controllers, OVRInput.Button.Three is mapped to the X button, whereas with hand tracking it is mapped to the menu button.
A similar issue existed with OVRInput.Button.Four around 1-2 years ago; I guess they remapped the button at some point. That issue was almost identical, and I doubt much has changed beyond OVRInput.Button.Start no longer working as intended. That is why I believe this solution will work for previous versions of the Oculus Integration package.
Hope this helps!
-Nate
12-19-2022 04:28 AM
Thanks Nate for the write-up. I just upgraded Oculus Integration to v47 and even this works now (for both the hand system gesture and the controller menu button):
// Poll the hand-tracking "controller" state directly through OVRPlugin;
// in v47 the Start bit is set for both the hand gesture and the controller button.
var state = OVRPlugin.GetControllerState4((uint)OVRInput.Controller.Hands);
bool menu_gesture = (state.Buttons & (uint)OVRInput.RawButton.Start) > 0;
(however, the input action still doesn't work for me)
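For reference, here is roughly how I poll it each frame (a sketch; the MenuWatcher class name is made up, and note that the Buttons bit is a held state, not a press event):

```csharp
using UnityEngine;

public class MenuWatcher : MonoBehaviour
{
    bool wasPressed;

    void Update()
    {
        // Read the hand "controller" state directly through OVRPlugin.
        var state = OVRPlugin.GetControllerState4((uint)OVRInput.Controller.Hands);
        bool isPressed = (state.Buttons & (uint)OVRInput.RawButton.Start) > 0;

        // Edge-detect manually, since the raw state stays set while held.
        if (isPressed && !wasPressed)
        {
            // Open menu
        }
        wasPressed = isPressed;
    }
}
```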