
Use GearVrController with Selection Ray - Unity

davidbeloosesky
Explorer
Hi, I figured out how to add a GearVrController to a scene.
(Adding an OVRCameraRig, and adding a GearVRController under the RightHandAnchor.)
The controller follows the hand movements. 

But how can I add selection support plus a ray from the device to the scene?
I mean a ray that can select items, like in the Oculus Store below:

[Screenshot: selection ray in the Oculus Store]





oculus_gabor
Protege
There is some documentation relating to this in the works. Be on the lookout for that. In the meantime, I can give you a quick and easy way to achieve this effect.

EDIT: The documentation mentioned above is now live at https://developer.oculus.com/blog/adding-gear-vr-controller-support-to-the-unity-vr-samples/

First though, Gear VR lets a user choose whether the controller is left-handed or right-handed. To support this, you need to drag a copy of GearVRController as a child of LeftHandAnchor as well. When you do this, look at the GearVRController instance. It has an OVRGearVRController script attached. This script has a dropdown for controller type. Select R Tracked Remote and L Tracked Remote for the right and left controllers respectively. The OVRGearVRController script will automatically show / hide the appropriate controller prefab.
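If you want to check handedness at runtime rather than rely on the prefab's show / hide behavior, you can query OVRInput yourself. A minimal sketch, assuming the Oculus Utilities OVRInput API (GetConnectedControllers and the LTrackedRemote / RTrackedRemote enum values; exact names may vary by SDK version):

```csharp
using UnityEngine;

public class HandednessCheck : MonoBehaviour {
    void Update() {
        // GetConnectedControllers() returns a bitmask of all paired controllers.
        OVRInput.Controller connected = OVRInput.GetConnectedControllers();
        bool leftRemote  = (connected & OVRInput.Controller.LTrackedRemote) != OVRInput.Controller.None;
        bool rightRemote = (connected & OVRInput.Controller.RTrackedRemote) != OVRInput.Controller.None;

        // Use whichever anchor matches the user's handedness setting.
        if (rightRemote)     Debug.Log("Right-handed remote connected");
        else if (leftRemote) Debug.Log("Left-handed remote connected");
    }
}
```

The same bitmask test shows up again in the raycaster script in this thread, so this is just the handedness check pulled out on its own.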

Since you have the LeftHandAnchor and RightHandAnchor transforms, you can use either one of those to create a ray. A ray needs only a position and a forward vector:
Transform rightHandAnchor; // Assign to the proper transform
Ray pointer = new Ray (rightHandAnchor.position, rightHandAnchor.forward);
You can now use this to do raycasts! You can render a selection ray with Unity's built-in LineRenderer. Make sure the line renderer has at least two points (world space) and set them in an Update method to match the pointer.
lineRenderer.SetPosition (0, pointer.origin);
lineRenderer.SetPosition (1, pointer.origin + pointer.direction * 500.0f);
One more thing to keep in mind: a Gear VR headset is not guaranteed to have a controller paired. If that is the case, you'll want to fall back to gaze controls.

I wrote a simple script to demonstrate this. Attach it to OVRCameraRig and you should be good to go.
using UnityEngine;
using UnityEngine.Events;

public class VRRaycaster : MonoBehaviour {

    [System.Serializable]
    public class Callback : UnityEvent<Ray, RaycastHit> {}

    public Transform leftHandAnchor = null;
    public Transform rightHandAnchor = null;
    public Transform centerEyeAnchor = null;
    public LineRenderer lineRenderer = null;
    public float maxRayDistance = 500.0f;
    public LayerMask excludeLayers;
    public VRRaycaster.Callback raycastHitCallback;

    void Awake() {
        if (leftHandAnchor == null) {
            Debug.LogWarning ("Assign LeftHandAnchor in the inspector!");
            GameObject left = GameObject.Find ("LeftHandAnchor");
            if (left != null) {
                leftHandAnchor = left.transform;
            }
        }
        if (rightHandAnchor == null) {
            Debug.LogWarning ("Assign RightHandAnchor in the inspector!");
            GameObject right = GameObject.Find ("RightHandAnchor");
            if (right != null) {
                rightHandAnchor = right.transform;
            }
        }
        if (centerEyeAnchor == null) {
            Debug.LogWarning ("Assign CenterEyeAnchor in the inspector!");
            GameObject center = GameObject.Find ("CenterEyeAnchor");
            if (center != null) {
                centerEyeAnchor = center.transform;
            }
        }
        if (lineRenderer == null) {
            Debug.LogWarning ("Assign a line renderer in the inspector!");
            lineRenderer = gameObject.AddComponent<LineRenderer> ();
            lineRenderer.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off;
            lineRenderer.receiveShadows = false;
            lineRenderer.widthMultiplier = 0.02f;
        }
    }

    Transform Pointer {
        get {
            OVRInput.Controller controller = OVRInput.GetConnectedControllers ();
            if ((controller & OVRInput.Controller.LTrackedRemote) != OVRInput.Controller.None) {
                return leftHandAnchor;
            } else if ((controller & OVRInput.Controller.RTrackedRemote) != OVRInput.Controller.None) {
                return rightHandAnchor;
            }
            // If no controllers are connected, we use a ray from the view camera.
            // This looks super awkward! Should probably fall back to a simple reticle!
            return centerEyeAnchor;
        }
    }

    void Update() {
        Transform pointer = Pointer;
        if (pointer == null) {
            return;
        }

        Ray laserPointer = new Ray (pointer.position, pointer.forward);

        if (lineRenderer != null) {
            lineRenderer.SetPosition (0, laserPointer.origin);
            lineRenderer.SetPosition (1, laserPointer.origin + laserPointer.direction * maxRayDistance);
        }

        RaycastHit hit;
        if (Physics.Raycast (laserPointer, out hit, maxRayDistance, ~excludeLayers)) {
            if (lineRenderer != null) {
                lineRenderer.SetPosition (1, hit.point);
            }

            if (raycastHitCallback != null) {
                raycastHitCallback.Invoke (laserPointer, hit);
            }
        }
    }
}

mallmagician
Expert Protege
This is excellent. Thanks. 

davidbeloosesky
Explorer




oculus_gabor Thanks a lot for this very detailed answer.
It works like a charm! 

What's needed to add "clicking" functionality on buttons using this controller ray?

oculus_gabor
Protege
Glad that was helpful. I assume by buttons you mean Unity's built-in UI. There is some official documentation being worked on for this as well! Until that documentation is ready, I can point you in the right direction. This one is going to take a bit more work to get up and running.

EDIT: The documentation mentioned above is now live at: https://developer.oculus.com/blog/adding-gear-vr-controller-support-to-unitys-ui/

Oculus already has samples of how to interact with Unity's UI using a gaze pointer at: https://developer.oculus.com/blog/unitys-ui-system-in-vr/

If you use the scripts / UI system from that post, you'll have to remove references to OVRGazePointer.instance from the OVRInputModule.cs script, as you probably won't have the gaze pointer in your scene. Even if you don't remove them, you'll have to add null checks.
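The null checks would look something like the following. This is a hypothetical guard around one gaze-pointer call site; OVRGazePointer is the class from the gaze-pointer sample linked above, and the actual call sites and method names depend on which version of the sample you pulled:

```csharp
// Hypothetical guard inside OVRInputModule.cs, wherever the module
// touches the gaze pointer. Skips the call when no gaze pointer exists.
if (OVRGazePointer.instance != null) {
    OVRGazePointer.instance.SetPosition(hitPosition); // example call site
}
```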

The ray used to interact with UI is constructed in the GetGazePointerData function of OVRInputModule.cs; it looks like this:
leftData.worldSpaceRay = new Ray (rayTransform.position, rayTransform.forward);
You need to replace leftData.worldSpaceRay with the pick ray of the controller. Once you do that, this should be a drag-and-drop solution. You might want to do something like:

if (controllerIsPresent && VRRaycaster.Instance != null) {
    leftData.worldSpaceRay = VRRaycaster.Instance.PickRay;
} else {
    leftData.worldSpaceRay = new Ray(rayTransform.position, rayTransform.forward);
}
That way you have support for Gear VR Controller picking rays, and gaze rays when no controller is present.

myBadStudios
Protege
This truly sucks incredibly much. I've been trying to do something as simple as getting a pointer to click on a button for over a week now, and all I get is outdated documentation that points from one place to another place to another place, and at each place the code is outdated.

I rewrote the above code to also take into account Touch controllers and guess what I found? The code that is supposed to tell you if the left or right controller is active is saying "Yes, both are". Well that is super helpful.
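For what it's worth, GetConnectedControllers() returns a bitmask, so with Touch it legitimately reports both controllers at once. One way to disambiguate is GetActiveController(), which (assuming it exists in your OVRInput version) returns only the most recently used controller:

```csharp
// Connected is a bitmask: with Touch, LTouch and RTouch are usually both set.
OVRInput.Controller connected = OVRInput.GetConnectedControllers();

// Active is a single value: the controller the user touched last.
// Assumption: OVRInput.GetActiveController() is available in your SDK version.
OVRInput.Controller active = OVRInput.GetActiveController();

Debug.Log("Connected: " + connected + "  Active: " + active);
```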

So they link to another page that says "This is how you click on a UGUI button. First you need to drag this script onto that object"... only that script seems to have been removed from the latest SDKs. Seeing as how "This is the script that handles the UGUI Events" removing it from the SDK and then giving a tutorial that starts with "Right, first attach that component we removed" is entirely pointless to the max.

Seriously, man, how difficult can it be to just give us a piece of code that allows us to click on a button????? A whole week I have been trying already. I get this example and that demo and such-and-such sample project, and each one was written for a different SDK, and each time I try to use it in the latest I either get "This class could not be found. Are you missing a using directive?" errors all over the board, or it simply crashes Unity the moment I start running.

I ask again: How hard could it possibly be to just give us working code to do something as basic as clicking on a button?????

myBadStudios
Protege
using UnityEngine;
using UnityEngine.Events;

public class VRRaycaster : MonoBehaviour
{
    [System.Serializable]
    public class Callback : UnityEvent<Ray, RaycastHit> { }

    [SerializeField] Transform 
        leftHandAnchor = null,
        rightHandAnchor = null,
        centerEyeAnchor = null;

    public float maxRayDistance = 500.0f;
    public LayerMask excludeLayers;
    public VRRaycaster.Callback raycastHitCallback;

    Transform pointer;
    OVRInput.Controller controller;
    LineRenderer lineRenderer = null;
    Ray laserPointer;
    RaycastHit hit;

    void Assign( ref Transform var, string parent )
    {
        if (var == null)
            var = GameObject.Find( parent )?.transform;
    }

    void Awake()
    {
        Assign( ref leftHandAnchor, "LeftHandAnchor" );
        Assign( ref rightHandAnchor, "RightHandAnchor" );
        Assign( ref centerEyeAnchor, "CenterEyeAnchor" );
        
        lineRenderer = gameObject.AddComponent<LineRenderer>();
        lineRenderer.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off;
        lineRenderer.receiveShadows = false;
        lineRenderer.widthMultiplier = 0.02f;        
    }

    bool LeftTouch => ( controller & OVRInput.Controller.LTouch ) != OVRInput.Controller.None; 
    bool RightTouch => ( controller & OVRInput.Controller.RTouch ) != OVRInput.Controller.None; 
    bool LeftControl => ( controller & OVRInput.Controller.LTrackedRemote ) != OVRInput.Controller.None; 
    bool RightControl => ( controller & OVRInput.Controller.RTrackedRemote ) != OVRInput.Controller.None;

    Transform Pointer
    {
        get
        {
            if ( RightControl || RightTouch ) return rightHandAnchor;
            if ( LeftControl  || LeftTouch  ) return leftHandAnchor;
            
            // If no controllers are connected, we use a ray from the view camera.
            // This looks super awkward! Should probably fall back to a simple reticle!
            return centerEyeAnchor;
        }
    }

    private void OnGUI()
    {
        GUILayout.Label( "Controller: " + controller );
        GUILayout.Label( "Left: " + ( LeftTouch ? "True" : "False") );
        GUILayout.Label( "Right: " + ( RightTouch ? "True" : "False" ) );
    }

    void Update()
    {
        controller = OVRInput.GetConnectedControllers();
        pointer = Pointer;
        if ( pointer == null )
            return;

        laserPointer = new Ray( pointer.position, pointer.forward );

        lineRenderer.SetPosition( 0, laserPointer.origin );
        lineRenderer.SetPosition( 1, laserPointer.origin + laserPointer.direction * maxRayDistance );
                
        if ( Physics.Raycast( laserPointer, out hit, maxRayDistance, ~excludeLayers ) )
        {
            lineRenderer.SetPosition( 1, hit.point );
            raycastHitCallback?.Invoke( laserPointer, hit );            
        }
    }
}

oculus_gabor
Protege
Hi @myBadStudios

Sorry to hear about your frustration. My posts are aimed mostly at mobile; I tend not to check them against Rift for functionality. From now on I'll try to make sure my posts are compatible with both. If I have time in the future, I'll even go back and update the blog posts where possible.

Anyway, I've attached a Unity project to this post. This project contains the minimal code needed for a controller that can interact with both UI and non-UI objects.

The project works on both Gear and Rift. The selection ray will come from the controller whose trigger was last pulled. Everything should be configured in the sample scene.

Below is a pretty (low quality) gif of what the attached project looks like. Let me know if you have any questions!

[GIF: selection ray demo from the attached project]

~Gabor

tamer_ozturk2
Explorer




Holy cow, after getting lost in the blogs and the messy Oculus docs, here I find the answer I needed to finally begin development with Oculus.
You should add this to the blog posts to help newcomers, so people know the basic code before diving into the sample framework mess.

myBadStudios
Protege
@gabor 
Dude, I cannot thank you enough! I'm gonna download this immediately and give this a go. Again, thank you ever so much! :smiley: