
Use GearVrController with Selection Ray - Unity

Hi, I figured out how to add a GearVrController to a scene
(add an OVRCameraRig, then add a GearVRController under the RightHandAnchor).
The controller follows the hand movements. 

But how can I add selection support plus a ray from the device to the scene?
I mean a ray that can select items, like in the Oculus store.







Comments

  • mallmagician Posts: 297 Oculus Start Member
    This is excellent. Thanks. 
  • oculus_gabor Posts: 40 Oculus Staff
    There is some documentation relating to this in the works. Be on the lookout for that. In the meantime, I can give you a quick and easy way to achieve this effect.

    First though, Gear VR lets a user select whether the controller is left-handed or right-handed. To support this, you need to drag a copy of GearVRController as a child of LeftHandAnchor as well. When you do this, look at the GearVRController instance. It has an OVRGearVRController script attached. This script has a dropdown for controller type. Select R Tracked Remote for the right controller and L Tracked Remote for the left one. The OVRGearVRController script will automatically show / hide the appropriate controller prefab.
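
    For reference, you can also ask OVRInput at runtime which remote is currently active; a small sketch of just that check:
    OVRInput.Controller active = OVRInput.GetActiveController ();
    bool leftHanded = (active & OVRInput.Controller.LTrackedRemote) != OVRInput.Controller.None;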

    Since you have the LeftHandAnchor and RightHandAnchor transforms, you can use either one of those to create a ray. A ray needs only a position and a forward vector:
    Transform rightHandAnchor; // Assign to the proper transform
    
    Ray pointer = new Ray (rightHandAnchor.position, rightHandAnchor.forward);
    You can now use this to do raycasts! You can render a selection ray with Unity's built-in LineRenderer. Make sure the line renderer has at least two points (in world space) and set them in an Update method to match the pointer.
    lineRenderer.SetPosition (0, pointer.origin);
    lineRenderer.SetPosition (1, pointer.origin + pointer.direction * 500.0f);
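    For completeness, the one-time LineRenderer setup would be something like this (assuming a 2017-era Unity where positionCount exists):
    lineRenderer.positionCount = 2;     // just a start point and an end point
    lineRenderer.useWorldSpace = true;  // so the positions set above are world-space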
    One more thing you might want to keep in mind: a Gear VR is not guaranteed to have a controller paired. If that happens to be the case, you want to fall back on gaze controls.

    I wrote a simple script to demonstrate this. Attach it to OVRCameraRig and you should be good to go.
    using UnityEngine;
    using UnityEngine.Events;
    
    public class VRRaycaster : MonoBehaviour {
    
    	[System.Serializable]
    	public class Callback : UnityEvent<Ray, RaycastHit> {}
    
    	public Transform leftHandAnchor = null;
    	public Transform rightHandAnchor = null;
    	public Transform centerEyeAnchor = null;
    	public LineRenderer lineRenderer = null;
    	public float maxRayDistance = 500.0f;
    	public LayerMask excludeLayers;
    	public VRRaycaster.Callback raycastHitCallback;
    
    	void Awake() {
    		if (leftHandAnchor == null) {
    			Debug.LogWarning ("Assign LeftHandAnchor in the inspector!");
    			GameObject left = GameObject.Find ("LeftHandAnchor");
    			if (left != null) {
    				leftHandAnchor = left.transform;
    			}
    		}
    		if (rightHandAnchor == null) {
    			Debug.LogWarning ("Assign RightHandAnchor in the inspector!");
    			GameObject right = GameObject.Find ("RightHandAnchor");
    			if (right != null) {
    				rightHandAnchor = right.transform;
    			}
    		}
    		if (centerEyeAnchor == null) {
    			Debug.LogWarning ("Assign CenterEyeAnchor in the inspector!");
    			GameObject center = GameObject.Find ("CenterEyeAnchor");
    			if (center != null) {
    				centerEyeAnchor = center.transform;
    			}
    		}
    		if (lineRenderer == null) {
    			Debug.LogWarning ("Assign a line renderer in the inspector!");
    			lineRenderer = gameObject.AddComponent<LineRenderer> ();
    			lineRenderer.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off;
    			lineRenderer.receiveShadows = false;
    			lineRenderer.widthMultiplier = 0.02f;
    		}
    	}
    
    	Transform Pointer {
    		get {
    			OVRInput.Controller controller = OVRInput.GetConnectedControllers ();
    			if ((controller & OVRInput.Controller.LTrackedRemote) != OVRInput.Controller.None) {
    				return leftHandAnchor;
    			} else if ((controller & OVRInput.Controller.RTrackedRemote) != OVRInput.Controller.None) {
    				return rightHandAnchor;
    			}
    		// If no controllers are connected, we use a ray from the view camera.
    		// This looks super awkward! Should probably fall back to a simple reticle!
    			return centerEyeAnchor;
    		}
    	}
    
    	void Update() {
    		Transform pointer = Pointer;
    		if (pointer == null) {
    			return;
    		}
    
    		Ray laserPointer = new Ray (pointer.position, pointer.forward);
    
    		if (lineRenderer != null) {
    			lineRenderer.SetPosition (0, laserPointer.origin);
    			lineRenderer.SetPosition (1, laserPointer.origin + laserPointer.direction * maxRayDistance);
    		}
    
    
    		RaycastHit hit;
    		if (Physics.Raycast (laserPointer, out hit, maxRayDistance, ~excludeLayers)) {
    			if (lineRenderer != null) {
    				lineRenderer.SetPosition (1, hit.point);
    			}
    
    			if (raycastHitCallback != null) {
    				raycastHitCallback.Invoke (laserPointer, hit);
    			}
    		}
    	}
    }
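    As a quick usage sketch, a listener for that callback could be as simple as this (RaycastHitLogger is just an example name, not part of the SDK):
    using UnityEngine;
    
    // Hook OnRaycastHit up to VRRaycaster's raycastHitCallback in the
    // inspector, or from code via raycastHitCallback.AddListener.
    public class RaycastHitLogger : MonoBehaviour {
    	public void OnRaycastHit(Ray ray, RaycastHit hit) {
    		Debug.Log ("Selection ray is pointing at " + hit.collider.gameObject.name);
    	}
    }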
  • davidbeloosesky Posts: 7
    NerveGear
    @oculus_gabor Thanks a lot for this very detailed answer.
    It works like a charm!

    What is needed to add "clicking" functionality on buttons using this controller ray?
  • oculus_gabor Posts: 40 Oculus Staff
    edited August 2017
    Glad that was helpful. I assume by buttons you mean Unity's built-in UI. There is some official documentation being worked on for this as well! Until that documentation is ready, I can point you in the right direction. This one is going to take a bit more work to get up and running.

    EDIT: The documentation mentioned above is now live at: https://developer.oculus.com/blog/adding-gear-vr-controller-support-to-unitys-ui/

    Oculus already has samples of how to interact with Unity's UI using a gaze pointer at: https://developer.oculus.com/blog/unitys-ui-system-in-vr/

    If you use the scripts / UI system from that post, you'll have to remove the references to OVRGazePointer.instance from the OVRInputModule.cs script, as you probably won't have the gaze pointer in your scene. Even if you don't remove it, you'll have to add null checks.

    The ray used to interact with UI is constructed in the GetGazePointerData function of OVRInputModule.cs. It looks like this:
    leftData.worldSpaceRay = new Ray (rayTransform.position, rayTransform.forward);
    You need to replace leftData.worldSpaceRay with the pick ray of the controller. Once you do that, this should be a drag-and-drop solution. You might want to do something like:

    if (controllerIsPresent && VRRaycaster.Instance != null) {
      leftData.worldSpaceRay = VRRaycaster.Instance.PickRay;
    }
    else {
      leftData.worldSpaceRay = new Ray(rayTransform.position, rayTransform.forward);
    }
    That way you have support for Gear VR Controller picking rays, and gaze rays when no controller is present.
  • myBadStudios Posts: 20
    Brain Burst
    This truly sucks incredibly much. I've been trying to do something as simple as getting a pointer to click on a button for over a week now, and all I get is outdated documentation that points from one place to another place to another place, and at each place the code is outdated. I rewrote the above code to also take into account Touch controllers, and guess what I found? The code that is supposed to tell you whether the left or right controller is active says "Yes, both are". Well, that is super helpful.

    So they link to another page that says "This is how you click on a UGUI button. First you need to drag this script onto that object"... only that script seems to have been removed from the latest SDKs. Seeing as it is "the script that handles the UGUI events", removing it from the SDK and then giving a tutorial that starts with "Right, first attach that component we removed" is entirely pointless to the max.

    Seriously, man, how difficult can it be to just give us a piece of code that allows us to click on a button????? A whole week I have been trying already. I get this example and that demo and such-and-such sample project, and each one was written for a different SDK, and each time I try to use it in the latest I either get "This class could not be found. Are you missing a using directive?" errors all over the board, or it simply crashes Unity the moment I start running. I ask again: how hard could it possibly be to just give us working code to do something as basic as clicking on a button?????
  • myBadStudios Posts: 20
    Brain Burst
    using UnityEngine;
    using UnityEngine.Events;
  • oculus_gabor Posts: 40 Oculus Staff

    Sorry to hear about your frustration. My posts are aimed mostly at mobile; I tend not to check them against Rift for functionality. From now on I'll try to make sure my posts are compatible with both. If I have time in the future, I'll even go back and update the blog posts where possible.

    Anyway, I've attached a Unity project to this post. This project contains the minimal code needed for a controller that can interact with both UI and non-UI objects.

    The project works on both Gear VR and Rift. The selection ray will come from the controller whose trigger was last pulled. Everything should be configured in the sample scene.

    Below is a pretty (low quality) gif of what the attached project looks like. Let me know if you have any questions!



    ~Gabor

  • tamer.ozturk2 Posts: 24
    Brain Burst
    Holy cow, after getting lost in the blogs and the messy Oculus docs, here I find the answer I needed to finally begin development with Oculus.
    You should add this to the blog posts to help newcomers, so people know the basic code before diving into the sample framework mess.
  • myBadStudios Posts: 20
    Brain Burst
    @gabor
    Dude, I cannot thank you enough! I'm gonna download this immediately and give it a go. Again, thank you ever so much! :smiley:

  • tamer.ozturk2 Posts: 24
    Brain Burst

    Unfortunately, if no controller is connected, gaze input doesn't work with Gear VR.

    The RayPointer.cs UpdateCastRayIfPossible() function needs to be modified to make it work.

    Or

    RayPointer.cs needs another if check for the active controller, to fall back to:

    OVRInput.Controller.Touchpad

    The question is, which one and how?
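
    For illustration, the kind of if check I mean would look roughly like this (hypothetical; castRay and centerEyeAnchor stand in for whatever the sample actually uses):
    OVRInput.Controller active = OVRInput.GetActiveController ();
    bool hasRemote = (active & (OVRInput.Controller.LTrackedRemote | OVRInput.Controller.RTrackedRemote)) != OVRInput.Controller.None;
    if (!hasRemote) {
    	// No remote paired: fall back to a gaze ray from the head.
    	castRay = new Ray (centerEyeAnchor.position, centerEyeAnchor.forward);
    }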

  • oculus_gabor Posts: 40 Oculus Staff
    Glad you guys like the sample.

    @tamer.ozturk2, you are right, that sample has no gaze fallback currently. I put it together pretty hastily, for Touch controllers specifically. There are also some issues with how the active controller is being tracked, and a few other API pain points.

    I've just finished writing a more in-depth sample with fallback support for gaze, as well as a bunch of API improvements. This blog post will be updated with the new code and instructions. I'll be writing the blog update on Friday; it should be live in about two weeks. I'll let you know as soon as it's online!
  • tamer.ozturk2 Posts: 24
    Brain Burst
    I've just finished writing a more in-depth sample with fallback support for gaze, as well as a bunch of API improvements. This blog post will be updated with the new code and instructions. I'll be writing the blog update on Friday; it should be live in about two weeks. I'll let you know as soon as it's online!

    Thank you. Could you kindly share the code for the new sample here, so we don't have to wait two weeks?

  • myBadStudios Posts: 20
    Brain Burst
    edited December 2017
    I am bookmarking this page so I can get up to date info on this also.

    Here's the thing: I have an asset on the Asset Store that is supposed to block the game from starting until after you have passed through my kit. Some of my customers have asked me "Will this work in VR?" and I said "I have no idea, I never tried it", so when I got my Rift the first thing I did was see if my kit could work. After spending the 30 seconds updating the UI to work in world space, I immediately noticed a HUGE stumbling block that will prevent my kit from working in VR:
    I am asking people to enter text into a text field and, well, for that I kinda need a keyboard... :(

    So I went and created a world-space keyboard that can be skinned and customised to contain any combination of letters the player chooses, and all of this can be done with 2 lines of code and by changing the background image of 1 prefab. I made that keyboard so skinnable, so customisable and so easy to use that there is nothing out there that can beat it, as far as I have seen...

    Only problem is... although it works great as a skinned keyboard to replace the native keyboards on mobiles etc., as soon as I go into VR I have no means of clicking that keyboard. This means I can now release this asset as a "skinnable native keyboard replacement", not the "VR keyboard" that I intended. I want to include it as a free update to my existing kit, but now I am forced to tell my customers "Here is a keyboard you can use in VR. Use this and you can now use this asset of mine in VR!!!!!! ...you just have to figure out for yourself how to actually press the buttons".

    Not good :(
    This is what it is doing at the moment...

    https://youtu.be/coIy2F0QJBI 
  • oculus_gabor Posts: 40 Oculus Staff
    @tamer.ozturk2 The code might not make a lot of sense without the documentation that is the blog post. It should be live pretty soon. Hang tight.

    @myBadStudios I just took a look at your video; it's really odd that the selection ray would be so far off. There are two potential issues I can think of:

    First, some configuration of the visual ray might not match up with the ray actually being cast into the world (this is the likely cause).

    Second, I assume the canvas that the keyboard belongs to is a world-space canvas? Is the Event Camera of the canvas set to the center eye anchor?
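
    For reference, setting the Event Camera from code is a one-liner; a sketch, assuming the standard OVRCameraRig names:
    Canvas canvas = GetComponent<Canvas> ();
    // The Event Camera of a world-space canvas drives its UI raycasts.
    canvas.worldCamera = GameObject.Find ("CenterEyeAnchor").GetComponent<Camera> ();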

    The code I attached to this forum post was a bit hacky (proof-of-concept quality). When the blog is updated with the new code, if that code still breaks your input I'll take a closer look and help you resolve the issue.
  • tamer.ozturk2 Posts: 24
    Brain Burst
    @tamer.ozturk2 The code might not make a lot of sense without the documentation that is the blog post. It should be live pretty soon. Hang tight.

    Okay, great news. Sorry for the triple post, by the way; the forum has its own problems, as I see.

    Could you at least add swipe controls to the blog post, or a new post, as well?

  • oculus_gabor Posts: 40 Oculus Staff
    Is there a use case for swipe that OVRInput in its current state does not cover?

    From: https://developer.oculus.com/documentation/unity/latest/concepts/unity-ovrinput/
    // returns true on the frame when a user’s finger pulled off Gear VR touchpad controller on a swipe down
    OVRInput.GetDown(OVRInput.Button.DpadDown);
       
    // returns true the frame AFTER user’s finger pulled off Gear VR touchpad controller on a swipe right
    OVRInput.GetUp(OVRInput.RawButton.DpadRight);
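    For example, polling all four swipe directions in an Update loop (a minimal sketch):
    void Update() {
    	// Swipes on the Gear VR touchpad surface as D-pad button presses:
    	if (OVRInput.GetDown (OVRInput.Button.DpadUp))    Debug.Log ("swiped up");
    	if (OVRInput.GetDown (OVRInput.Button.DpadDown))  Debug.Log ("swiped down");
    	if (OVRInput.GetDown (OVRInput.Button.DpadLeft))  Debug.Log ("swiped left");
    	if (OVRInput.GetDown (OVRInput.Button.DpadRight)) Debug.Log ("swiped right");
    }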
  • tamer.ozturk2 Posts: 24
    Brain Burst
    Is there a use case for swipe that OVRInput in its current state does not cover?

    From: https://developer.oculus.com/documentation/unity/latest/concepts/unity-ovrinput/
    // returns true on the frame when a user’s finger pulled off Gear VR touchpad controller on a swipe down
    OVRInput.GetDown(OVRInput.Button.DpadDown);
       
    // returns true the frame AFTER user’s finger pulled off Gear VR touchpad controller on a swipe right
    OVRInput.GetUp(OVRInput.RawButton.DpadRight);


    OVRInputModule has the following variables defined but not used anywhere, so I thought there is (or was supposed to be) some swipe-related code in the module for interacting with UI / non-UI items.

      [Header("Touchpad Swipe Scroll")]
      #region GearVR swipe scroll
  • oculus_gabor Posts: 40 Oculus Staff
    I see, that code is for using the Gear VR touchpad, or the Gear VR Controller touchpad, to scroll a scrollable area, like a text field with long text. It's not something you can hook into.
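    If you want touchpad-driven scrolling of your own, you can read the touchpad directly. A rough sketch (scrollRect and scrollSpeed are placeholders, not part of the Oculus code):
    // PrimaryTouchpad reports the finger position on the Gear VR touchpad.
    Vector2 touch = OVRInput.Get (OVRInput.Axis2D.PrimaryTouchpad);
    scrollRect.verticalNormalizedPosition += touch.y * scrollSpeed * Time.deltaTime;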
  • myBadStudios Posts: 20
    Brain Burst
    edited December 2017
    @oculus_gabor
    First off, I just wanna say thank you again for taking the time to make this thing work. The moment I put the Rift on my head and saw that orientation demo, I said I am not making anything that is not made for the Rift! I was about 90% done with a (darn it, I can never remember that acronym: Universal Windows SomethingOrOther) game that was made to be mobile and Mac Store compliant also, and I just scrapped it, and am now down to about 50% complete on the VR version, thanks to all the extra stuff that I now wanna add since the static camera is becoming a VR experience :smiley:

    I am THAT committed to making Rift games now that I am prepared to sacrifice any and all other platforms (maybe make an exception for the Gear VR)... but it is super disheartening to discover that something this basic is so hard to pull off, so yeah, I truly appreciate the assistance.

    Now the naysayer in me has to point out: what happens when the next version of the SDK is released? Will all your effort be for nothing again, like the other demos and samples out there? :O All I can say is I hope my game is done by then :worried:

    Now, back on topic. I used your sample scene and just dragged my prefab in there, so if and when the time comes for you to figure out "So why the heck is this not working for him???", I can always send you back your project with my prefab in it, or I could just send you the prefab and you can add it yourself. Either way, getting your hands on that sample project will be super easy, as it will be a small upload.

    To answer your questions: yes, it is a world-space canvas, and yes, I use the centre eye as the event camera. I thought that maybe the fact that I scaled my canvas down to 0.01 might be the issue, but I see you did the same with yours. Seems we are both aware of Unity's sorely lacking support for showing text in world space without scaling down a huge canvas... So with that idea proving not to be the issue, I was again stumped.

    I thought I saw, when looking at the code, that you build up a list of items the ray intersects with, and then calculate from there whether this is something to be concerned with or not. I was wondering if that list doesn't get updated in time, or at all, and retains a pointer to an old selected object or something, and thus sends the input to the wrong object. I was going to go look into that, as I could not think of what else it could possibly be.

    You saying the cast ray and the visible ray might not match up... interesting... I wonder what might be causing that... Before I found your samples (I actually did this before even looking for any demos/samples at all), I created the raycast from the hand in the forward direction. In the first version of your script, I saw you made provision for using an existing line renderer, or creating one if not present. I removed that code and just made the script always create the line renderer; I am a big fan of that option... It also means you know for sure the start and end point of your ray and raycast. I didn't do it with your latest code, though, because I wanted to see it working before I started tinkering with it.

    Your code is clean, minimal, and easy to follow and understand, even for someone completely new to your API, so I look forward to your next update and hope it will finally put this issue to bed. I'm looking forward to seeing the code even without the documentation that goes with it; your code is easy enough to follow without it.

    ...but if I might make one little suggestion (and perhaps I should do this myself, just to test your theory about the mismatched ray): could you modify the raycast code so that when it points at something, the ray ends where it is pointing, instead of shooting miles out into the distance? That single change would make it infinitely more clear what is being pointed at, wouldn't you agree?

    Again, thanks a lot for your efforts
  • myBadStudios Posts: 20
    Brain Burst
    edited December 2017
    PS: I did notice this:
    void RayHitSomething(Vector3 hitPosition, Vector3 hitNormal) {
    	if (lineRenderer != null) {
    		lineRenderer.SetPosition(1, hitPosition);
    	}
    }
    

    ...but although it's not clearly visible in the video, when the headset is on you can clearly see the ray going straight through and pointing to the middle of nowhere. It seems you THOUGHT you were doing just that, but for whatever reason it just ain't playing ball...

    Well, not with UI elements at least. I noticed you calling the callbacks and the text getting updated when the ray intersects 3D objects, but nothing happens with UI objects. Also, just in case you missed it, the camera rig has its own canvas at exactly the same position as the camera.
  • myBadStudios Posts: 20
    Brain Burst
    Good news: I just got my first successful demo out of the way. Everything works fine now! :D Still want to run a few more tests to verify, but so far it all works just dandy :D

    Bad news: I've got no idea what I did to make it work. Sigh :(

    Basically, I noticed there is this extra canvas on the rig, so I took that out. Then I noticed that I had basically duplicated the scripts that were on your canvas on mine, so I started removing all the duplicates. I.e. you had a canvas and I had a canvas, and since my object was the only one I wanted to interact with, I removed your canvas. I just got rid of all duplicated stuff like that.

    Next I removed your code to interact with world objects and left it as GUI-interactable only. Finally (and I think this might have been the big one), my canvas had a GraphicRaycaster on it where you used the OVRRaycaster. I had actually moved that from the camera rig to my canvas, and that was when I noticed my scene had both active at the same time. So I disabled the GraphicRaycaster, hit play, and went looking for trouble... but I couldn't find any. Worked smooth as a baby's bottom. :smile: Yeay! :) Finally, I can press buttons! :D Yoohoo! :D

  • oculus_gabor Posts: 40 Oculus Staff
    Hey yo!

    Good news: I just got my first successful demo out of the way.
    Glad to hear you have something up and running! Congrats, that's awesome!

    Finally (and I think this might have been the big one) my canvas had a GraphicRaycaster on it where you used the OVRRaycaster. 
    Indeed, the GraphicRaycaster / OVRRaycaster conflict was the issue. This has to do with how Unity's UI system works: you can't have two raycasters on the same canvas. This will be better explained in the updated blog post. Sorry this is so convoluted.
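
    If you want a guard against this, something like the following on the canvas would do (a sketch; UseOVRRaycasterOnly is just an example, not part of the sample project):
    using UnityEngine;
    using UnityEngine.UI;
    
    // Disables Unity's default GraphicRaycaster so only the OVRRaycaster
    // on this canvas feeds the event system.
    public class UseOVRRaycasterOnly : MonoBehaviour {
    	void Awake() {
    		GraphicRaycaster standard = GetComponent<GraphicRaycaster> ();
    		if (standard != null) {
    			standard.enabled = false;
    		}
    	}
    }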

    Your code is clean, very little and easy to follow and understand even for someone completely new to your API so I look forward to your next update and hope it will finally put this issue to bed. 
    The sample code is clean, but it's not architected well. Interaction between the different systems is not well planned out, and random things need to be assigned in the editor with no defaults. All of these issues are addressed in the updated blog post; I managed to make it simple enough that all that's required to interact with UI is to drag 'n drop three components into the scene.

    What happens when the next version of the SDK is released? Will all your effort be for nothing again like the other demos and samples out there?
    For the most part, our SDK is surprisingly solid. The core of the input module is actually Andy Borell's code from 2015 (written against some ancient version of the SDK); you can find details about how it works here. All I had to do was update the SDK and replace the gaze pointer with a ray, very similar to this. Updating the SDK caused no issues :)

    The holidays are going to push the blog update out a bit; I think it's landing sometime early January.
  • tamer.ozturk2 Posts: 24
    Brain Burst
    The holidays are going to push the blog update out a bit; I think it's landing sometime early January.
    We shouldn't have to wait a month for a blog post update; you should also post the new scripts here or somewhere else.
  • tamer.ozturk2 Posts: 24
    Brain Burst
    Okay, the new 1.21 Sample Framework is out, and the OVRInputModule there is fixed. If it is your doing, oculus_gabor, thank you.
  • vivalavida Posts: 2
    NerveGear
    Thank you so much for this. Why isn't basic UI interaction with the Gear VR controller part of the utilities?
    (I've just finished the Daydream build, and that has been so much smoother.
    My main pain points with Gear VR so far are:
    - the ability to test in the Unity Editor
    - UI interaction with the controller)
  • oculus_gabor Posts: 40 Oculus Staff
    @vivalavida thanks for the feedback, it's really useful!

    I don't know the answers to your questions, but when I have some time I'll try to get the issues you pointed out resolved.
  • vivalavida Posts: 2
    NerveGear
    I see, that code is for using the Gear VR touchpad, or the Gear VR Controller touchpad, to scroll a scrollable area, like a text field with long text. It's not something you can hook into.
    Hey, sorry for shifting focus back to this, but does the scrolling code work?
    Could you maybe show which script handles it?
    I was unable to find any references to the scroll-related variables.
