Event System and SDK

I have literally searched everywhere, but I couldn't find an answer, and frankly, all the documentation available online is quite outdated.
I am in the process of converting a game I made for Google Daydream to Oculus Go, and I'm having some pretty bad issues with this task, especially when it comes to interaction with objects.
Daydream routed all controller events (movement and raycasting) through Unity's Event System, which meant that a click on the touchpad was enough to fire an OnPointerClick() event on a highlighted object.
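For context, the Event System pattern described above looks like this on the receiving end: any object with a collider (plus a raycaster and input module in the scene) gets standard pointer callbacks, regardless of which device drives the pointer. A minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Receives OnPointerClick() through Unity's Event System.
// Works for any pointer source (mouse, Daydream, Oculus Go controller)
// as long as a raycaster and input module are wired up in the scene.
public class ClickableObject : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Clicked: " + name);
    }
}
```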

How do I route all Oculus Go interactions through Unity's built-in event system? I don't want to roll my own event system; I just want to know whether the scripts in the current Oculus Utilities package (version 1.27) are enough to fire event triggers on objects without writing a custom event system.

PS: So far, I've tried putting an OVRCameraRig in the scene, attaching an OVRPhysicsRaycaster to it, setting the camera as the Ray transform, and replacing the GraphicRaycaster on a Canvas with an OVRRaycaster.



Hi,

Welcome to the Oculus Developer Forums.

Are you familiar with OVRInput? This is the script you can use to route all input requests properly in your scene. You can read more about it here.


Hi, thank you for the reply.
I forgot to mention it earlier, but I was also using OVRInput in place of the default Unity input module. In the end, the solution came after I started reading the code for GazePointer.cs.

Here's how I solved my problem:
First, I set up the scene with an OVRCameraRig, an Event System with OVRInput, an OVRPhysicsRaycaster attached to the root of the OVRCameraRig, and an OVRRaycaster on a world-space Canvas, replacing the default GraphicRaycaster.
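The wiring above can be sketched as a small setup script. This is only an illustration under the assumptions of Oculus Utilities 1.27 (component names like OVRPhysicsRaycaster and OVRRaycaster may differ in other versions), not an official setup:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hedged sketch: ensures the raycasters described above are present.
// Assumes Oculus Utilities 1.27 component names; verify against your version.
public class RigSetupCheck : MonoBehaviour
{
    public OVRCameraRig rig;          // the OVRCameraRig in the scene
    public Canvas worldSpaceCanvas;   // the world-space Canvas for the GUI

    void Start()
    {
        // The physics raycaster goes on the rig root so 3D colliders receive events.
        if (rig.GetComponent<OVRPhysicsRaycaster>() == null)
            rig.gameObject.AddComponent<OVRPhysicsRaycaster>();

        // The Canvas needs an OVRRaycaster instead of the stock GraphicRaycaster.
        if (worldSpaceCanvas.GetComponent<OVRRaycaster>() == null)
            worldSpaceCanvas.gameObject.AddComponent<OVRRaycaster>();
    }
}
```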
At this point, the console was still throwing errors about a missing GazePointer.prefab file. In Oculus Utilities 1.27 there's no such prefab; instead there's a file named Cursor_Timer, which holds only a quad with a custom material (a radial-filling circle) and has no scripts attached to it.
So I started reading through the code, and it turns out the Oculus ecosystem expects either a GazePointer object or a GazePointer prefab to work correctly. I made an empty GameObject, attached a GazePointer script to it, and added a child object named "GazeIcon". It worked as it should, allowing me to point with my head and trigger buttons with Input.One.
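The replacement pointer hierarchy described above can be built at startup. This is a sketch under the same assumptions (the GazePointer script from Oculus Utilities 1.27 and a child the code looks up by the name "GazeIcon"), so adjust the names to match your package:

```csharp
using UnityEngine;

// Hedged sketch: builds the GazePointer object the Oculus scripts expect,
// replacing the prefab that is missing from Utilities 1.27.
public class PointerBootstrap : MonoBehaviour
{
    void Awake()
    {
        var pointer = new GameObject("GazePointer");
        pointer.AddComponent<GazePointer>();   // the script from GazePointer.cs

        // A visible cursor as a child; the name "GazeIcon" is what made it work.
        var icon = GameObject.CreatePrimitive(PrimitiveType.Quad);
        icon.name = "GazeIcon";
        icon.transform.SetParent(pointer.transform, false);
        Destroy(icon.GetComponent<Collider>()); // the cursor must not block raycasts
    }
}
```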
Afterwards, I used RightHandAnchor instead of CenterEyeAnchor and everything kept working correctly: I was finally able to select items on the GUI and interact with 3D items without any trouble (well, except that to interact correctly with buttons you have to disable Unity's Button Navigation system, but that's more of a Unity issue than Oculus').
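For anyone hitting the same Button Navigation issue: it can be switched off in the inspector (Navigation → None on each Button), or in code across a whole canvas. A sketch using Unity's standard Selectable API:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Disables Unity's built-in keyboard/gamepad navigation on all Selectables
// under this object, so buttons don't stay stuck in a "selected" state
// and interfere with pointer-driven input.
public class DisableButtonNavigation : MonoBehaviour
{
    void Start()
    {
        foreach (var selectable in GetComponentsInChildren<Selectable>(true))
        {
            var nav = selectable.navigation; // Navigation is a struct: copy, edit, assign back
            nav.mode = Navigation.Mode.None;
            selectable.navigation = nav;
        }
    }
}
```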

I think Oculus should fix the Unity package: either add the GazePointer prefab back or remove the references to it from the code, because it gets really confusing, especially since the samples lack documentation altogether or need additional undocumented work to build. You should also rename the prefab to just Pointer, because one might think it only works with gaze input and that one has to write a custom input system every time to use controllers.