I'd like to suggest a feature for the Oculus Quest and Quest 2. Both headsets track hands, but there is one issue: tracking breaks down when you clap your hands or interlace your fingers. I'd like to suggest a feature that lets you manually calibrate your hands. This is how it would work:
-A white 'cel-shaded' outline of a hand appears while in passthrough. A window instructs you to put your hand in the position of the outline.
-Once your hand matches the outline, another hand position appears. You repeat the same process until the calibration is done.
-There will be several poses you are asked to hold: clapping hands, interlaced fingers, common hand gestures, etc. You get the point.
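The steps above could be sketched as a simple calibration loop. Everything here is invented for illustration (the pose names, the joint-angle representation, and the tolerance value are all assumptions, not part of any real headset SDK):

```python
# Minimal sketch of the pose-calibration loop described above.
# Poses are represented as hypothetical lists of joint angles (radians);
# a real headset SDK would supply far richer tracking data.
TARGET_POSES = {
    "clap": [0.0, 0.1, 0.1, 0.0, 0.2],
    "interlaced_fingers": [0.8, 0.9, 0.85, 0.9, 0.8],
    "thumbs_up": [0.1, 1.2, 1.2, 1.2, 1.2],
}

MATCH_TOLERANCE = 0.15  # assumed max mean joint-angle error to count as a match


def pose_error(tracked, target):
    """Mean absolute difference between tracked and target joint angles."""
    return sum(abs(a - b) for a, b in zip(tracked, target)) / len(target)


def calibrate(sample_source):
    """Show each target pose in turn and record the user's version of it.

    sample_source(pose_name) stands in for the tracker: it returns the
    user's joint angles while the outline for pose_name is displayed.
    Returns a dict of per-pose samples the headset could train on.
    """
    recorded = {}
    for name, target in TARGET_POSES.items():
        angles = sample_source(name)
        if pose_error(angles, target) <= MATCH_TOLERANCE:
            recorded[name] = angles  # store the user's take on this pose
    return recorded


if __name__ == "__main__":
    # Simulated user: holds each target pose with a small, consistent offset.
    def fake_source(name):
        return [a + 0.05 for a in TARGET_POSES[name]]

    print(sorted(calibrate(fake_source)))
```

The recorded per-user samples are what the headset would then use to adapt its hand-tracking model, in the spirit of the Siri comparison below.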
This works much like Apple's voice assistant, Siri: Siri trains on your voice so it can recognize it. My idea is the same, except the hand calibration would train on your hands and make your hand movements recognizable to the headset. This way, hand tracking could be much more accurate.