Getting both Touch controllers to interact with Unity UI

greggman
I have one Touch controller working with Unity UI. I was using the HandedInputSelector from Oculus/SampleFramework/Code/DebugUI/Scripts/HandedInputSelector.cs, but it only supports one controller at a time. I'd like to support two controllers. What do I need to do to get both controllers to manipulate the UI?
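For context, here's roughly what that sample script does (paraphrased from memory, so details may be slightly off): each frame it points the single OVRInputModule's rayTransform at whichever hand anchor belongs to the currently active controller.

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Paraphrased from Oculus/SampleFramework/Code/DebugUI/Scripts/HandedInputSelector.cs:
    // each frame, aim the single OVRInputModule's ray at the active hand.
    public class HandedInputSelector : MonoBehaviour
    {
        OVRCameraRig m_CameraRig;
        OVRInputModule m_InputModule;

        void Start()
        {
            m_CameraRig = FindObjectOfType<OVRCameraRig>();
            m_InputModule = FindObjectOfType<OVRInputModule>();
        }

        void Update()
        {
            // Only one rayTransform can be set at a time,
            // hence only one working pointer.
            if (OVRInput.GetActiveController() == OVRInput.Controller.LTouch)
                m_InputModule.rayTransform = m_CameraRig.leftHandAnchor;
            else
                m_InputModule.rayTransform = m_CameraRig.rightHandAnchor;
        }
    }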

What I thought might work: I disabled the HandedInputSelector script, then added a second OVRInputModule to the EventSystem object that already had one. The first module has its rayTransform set to the right hand anchor, the second to the left. For the left hand I duplicated the LaserPointer and assigned the duplicate to the "cursor" property of the second OVRInputModule. The original LaserPointer has a LaserPointer script with a Cursor Visual reference to a sphere, so I duplicated the sphere as well and pointed the second LaserPointer at the second sphere. (See the sketch below for the full wiring.)
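In code form, the wiring I ended up with looks roughly like this (DualHandUISetup is just a hypothetical helper to describe my inspector setup, not an Oculus script; rayTransform and m_Cursor are the public fields I see on OVRInputModule, if I'm reading them right):

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Hypothetical helper mirroring the inspector wiring described above:
    // two OVRInputModules on the EventSystem object, one per hand, each
    // with its own LaserPointer duplicate as its cursor.
    public class DualHandUISetup : MonoBehaviour
    {
        public OVRCameraRig cameraRig;
        public LaserPointer rightPointer; // the original LaserPointer
        public LaserPointer leftPointer;  // duplicate, with its own cursor sphere

        void Start()
        {
            OVRInputModule[] modules = GetComponents<OVRInputModule>();
            modules[0].rayTransform = cameraRig.rightHandAnchor;
            modules[0].m_Cursor = rightPointer;
            modules[1].rayTransform = cameraRig.leftHandAnchor;
            modules[1].m_Cursor = leftPointer;
        }
    }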

And... nothing. No pointer comes out of the second controller, and there's no interaction. (My guess is that a Unity EventSystem only ever activates one input module at a time, so the second OVRInputModule is simply ignored, but I'm not sure.)

What should I do to get both controllers to manipulate the UI?

I'm on Unity 2019.3.6 with version 14.0 of the Oculus Integration from the Unity Asset Store (March 14th, 2020).