Auto Oculus Touch v0.1.5 (Latest release 5 Sep 2018)

kojack Posts: 4,906 Volunteer Moderator
edited September 2018 in Games and Apps
AutoOculusTouch is a library and script for AutoHotKey. It allows you to read the state of oculus controller devices (Touch, Remote or XBox controller) from within an AutoHotKey script.

Downloads

Binary release - https://github.com/rajetic/auto_oculus_touch/releases/download/0.1.5/auto_oculus_touch_v0.1.5.zip
Source - https://github.com/rajetic/auto_oculus_touch


Change History

v0.1.1 - Initial release. Built against Oculus SDK 1.10
v0.1.2 - Added capacitive sensor support. Built against Oculus SDK 1.20
v0.1.3 - Changed initialisation to use Invisible mode. Added button comments to example script.
v0.1.4 - Added vibration. Added orientation tracking (yaw, pitch, roll) for touch. More example scripts. Built against Oculus SDK 1.26
v0.1.5 - Added vJoy integration. You can now output gamepad/joystick values for axes and buttons.

Prerequisites

You must have AutoHotKey installed (64 bit version). It is available from https://autohotkey.com
(AutoOculusTouch is tested against AutoHotKey version 1.1.24.04)
Oculus runtimes 1.26 and below or 1.29 and above are required. Runtimes 1.27 and 1.28 broke AutoOculusTouch by not returning any input to applications that aren't rendering or that have the ovrInit_Invisible flag set. Runtime 1.29 fixes this. Currently 1.29 is only in beta; you need to opt in to betas to get it.
If you want vJoy support, you must have it installed. It is available from http://vjoystick.sourceforge.net/site/index.php/download-a-install
(AutoOculusTouch is tested against vJoy version 2.1.8)

Installation

AutoOculusTouch can be placed anywhere. No explicit installation is required.
There are several files included in the AutoOculusTouch binary release:
- auto_oculus_touch.dll     (A library that wraps around the Oculus SDK)
- vJoyInterface.dll         (Interface to the vJoy drivers)
- auto_oculus_touch.ahk     (An AutoHotKey script that defines various Oculus related functions. This is needed by the other scripts)
- oculus_remote_mouse.ahk   (Lets you use the remote as a mouse. Up/Down/Left/Right move. The centre button is left mouse. The back button is right mouse)
- oculus_remote_spotify.ahk (Send media keys with the remote. Up is previous track, Down is next track, centre button is play/pause. Not just for spotify, works with any media key compatible program)
- oculus_touch_mouse.ahk    (Move the mouse with the right touch controller. Index trigger is left mouse. A is right mouse. Thumbstick is mouse move. Hold the hand trigger to enable tracking, then pitch or yaw the mouse to move. Press B to reset yaw centre point to your current heading)
- oculus_touch_test.ahk     (Display a GUI of all touch values)
- oculus_touch_vjoy.ahk     (Turn the Touch controllers into a virtual gamepad for DirectInput games)

You should keep these files together.



Running

To start AutoOculusTouch, double click on one of the scripts (not auto_oculus_touch.ahk, it doesn't do anything on its own). A green icon with a white H should appear in your system tray. AutoOculusTouch is now running.


Customising

The provided scripts have some example behaviour already defined. You can change any of this, or make your own (a minimal skeleton is sketched after the list below).

AutoOculusTouch can give you:
- index and hand triggers of a Touch, as floats from 0.0 to 1.0.
- thumbstick axes as floats from -1.0 to 1.0.
- all Touch, Remote and XBox buttons.
- all Touch capacitive sensors.
- all Touch capacitive gestures (index pointing and thumbs up for either hand).
- pitch, roll and yaw of both Touch controllers, in degrees.
- the ability to set continuous or one-shot vibration effects of different frequencies and amplitudes on either Touch controller.
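
For example, a minimal polling script could look like this. This is only a sketch: the helper names (InitOculus, Poll, GetIndexTrigger, GetButtonsDown) are illustrative assumptions, so check auto_oculus_touch.ahk and the example scripts for the exact functions. The ovr* button constants are the ones used throughout the example scripts.

; Minimal polling skeleton (AutoHotKey v1). Helper names are assumptions,
; not necessarily the real API - see auto_oculus_touch.ahk.
#include auto_oculus_touch.ahk

InitOculus()                         ; start an invisible Oculus SDK session
Loop
{
    Poll()                           ; refresh the controller state each tick
    trigger := GetIndexTrigger(1)    ; right index trigger, 0.0 to 1.0 (assumed helper)
    if (trigger > 0.5)
        Send, {Space}                ; tap Space while the trigger is half pulled
    pressed := GetButtonsDown()      ; assumed helper: bitmask tested with ovr* constants
    if (pressed & ovrA)
        Send, {Enter}                ; Touch A taps Enter
    Sleep, 10                        ; poll at roughly 100Hz
}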

Vibration Use
Vibration has 3 properties: frequency, amplitude and oneshot.
Frequency: 1==320Hz, 2==160Hz, 3==106.7Hz, 4==80Hz
Amplitude: 0-255 (0 stops vibration, 1-255 are the strength)
Oneshot: If this is zero, the vibration will stay at the level you set until you manually change/stop it. If this is 1, the vibration will be a short pulse that stops on its own.
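
For example (a sketch only: the helper name Vibrate(controller, frequency, amplitude, oneshot) and the controller numbering 0==left, 1==right are assumptions based on the description above):

; Hedged sketch: Vibrate() and the controller numbering are assumed.
#include auto_oculus_touch.ahk
InitOculus()
Poll()
Vibrate(0, 2, 128, 0)   ; left Touch: 160Hz at half strength, continuous
Sleep, 1000
Vibrate(0, 2, 0, 0)     ; amplitude 0 stops the continuous vibration
Vibrate(1, 1, 255, 1)   ; right Touch: a single 320Hz full-strength pulse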

Orientation Use
Yaw: clockwise is positive
Pitch: aiming up is positive
Roll: tilting clockwise is positive
Yaw works a little differently from the other two angles. Pitch and roll have definite zero angles that make sense (holding the controller level). But yaw is based on where the controller was aiming when powered on. To fix this, you can call ResetFacing(controllerNumber) to record the current direction as a yaw of 0 degrees.
Each controller may have a different Yaw origin.
If you aren't wearing the Oculus headset, camera tracking is disabled. Touch rotation can still be tracked, but it is done with the onboard IMUs, which may drift over time.
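
A sketch of using this (ResetFacing is described above; GetYaw and GetButtonsDown are assumed helper names, and controller 1 is assumed to be the right Touch):

; Yaw-driven mouse sketch. GetYaw()/GetButtonsDown() are assumed helpers.
#include auto_oculus_touch.ahk
InitOculus()
Loop
{
    Poll()
    if (GetButtonsDown() & ovrB)
        ResetFacing(1)               ; B re-zeroes yaw to the current heading
    yaw := GetYaw(1)                 ; right Touch yaw in degrees, clockwise positive
    MouseMove, % yaw * 2, 0, 0, R    ; rate control: the bigger the yaw, the faster the cursor
    Sleep, 10
}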

vJoy Support
Normally AutoHotKey can only generate keyboard and mouse events.
vJoy is a driver that emulates one or more virtual joysticks with configurable features. AutoOculusTouch can now send analog axis and digital button values to vJoy. This lets you use Touch (or the remote) as a gamepad in games that support DirectInput.
Note: while most of the controls match an XBox controller, it technically isn't one. Any game that uses XInput directly can't see vJoy. Only DirectInput games will work here.

To use vJoy support, you need to install the vJoy drivers. You can download them from here: http://vjoystick.sourceforge.net/site/index.php/download-a-install
If you don't want vJoy support, you don't need to install the drivers; it's an optional feature, and AutoOculusTouch works like v0.1.4 without it.
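
As a sketch of the idea (InitvJoy, SetvJoyAxis, SetvJoyButton, the Get* helpers and the axis numbering are assumed names for illustration; oculus_touch_vjoy.ahk shows the shipped way to do this):

; Hedged sketch of feeding vJoy. Names and axis IDs are assumptions.
#include auto_oculus_touch.ahk
InitOculus()
InitvJoy(1)                               ; claim vJoy device #1
Loop
{
    Poll()
    SetvJoyAxis(1, 1, GetThumbStickX(1))  ; right stick X -> vJoy X axis
    SetvJoyAxis(1, 2, GetThumbStickY(1))  ; right stick Y -> vJoy Y axis
    SetvJoyButton(1, 1, (GetButtonsDown() & ovrA) ? 1 : 0)  ; Touch A -> vJoy button 1
    Sleep, 10
}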


Important Notes

(v0.1.2 and below)
Due to the way the Oculus SDK works, running AutoOculusTouch will make Oculus Home or Dash think that a VR application is running that isn't rendering. The original intent was for AutoOculusTouch to run when no headset is being worn, such as using the Oculus Remote for controlling a PC for media playback, so this didn't matter. Running both AutoOculusTouch and another VR application at the same time probably shouldn't work, but it currently seems to.

What does this mean? Well, Oculus Home and Dash refuse to run a VR app while another is already running. But you can still run multiple apps at once if you start them using a means besides Home or Dash (such as explorer).

(v0.1.3 and above)
The note above is no longer valid in these versions. By setting the Invisible flag when calling the Oculus SDK, AutoOculusTouch no longer appears to Dash as a VR app. This means it won't make the headset show a never-ending loading screen, and other VR apps can be run from Dash without an error.



Scripts

Here are some scripts that others have made using this:
Fallout4 VR  - https://www.reddit.com/r/oculus/comments/7ltym1/fully_functional_touch_profile_for_fallout_4_vr/  by DickDastardlyUK
Doom VFR - https://www.reddit.com/r/oculus/comments/7h6nva/smooth_locomotion_in_doomvfr_for_rift_i_scripted/ by SpeakeasyArcade




















Comments

  • nalex66 Posts: 4,185 Valuable Player
    Nice. I had wondered about using your AHK solution for cleaning up the controls in Fallout 4 VR. The wonky way that the game interprets some inputs would likely be an issue, though. 
  • kojack Posts: 4,906 Volunteer Moderator
    It would depend on how it's doing input.
    If FO4VR can be keyboard controlled (I don't have FO4VR, so don't really know the issues), then you could get touch to control it. Using the capacitive sensors you could have some shift functionality to give more buttons.
    But if the game has hard coded vive controls that can't be disabled, they'd be mapped to touch by steamvr. AutoOculusTouch can't stop a game from seeing touch via oculus sdk or steamvr.

  • nalex66 Posts: 4,185 Valuable Player
    edited December 2017
    Yeah, that's the problem: the game doesn't recognize keyboard input (except for about 3 keys), and does some weird things that interfere with the way SteamVR interprets the Touch to Wand mapping. Clicking the Touchpad, for instance, demands a small offset from center to register, making it harder than necessary to click the stick. AutoOculusTouch might (or might not) help with remapping functions on the controller, but the game has other issues. Anyway, it's been reasonably sorted out using OpenVR InputEmulator to clean things up.
  • WreckLuse68 Posts: 250
    Nexus 6
    I'm wondering if something like this can be used for the Assetto Corsa menus.
  • kojack Posts: 4,906 Volunteer Moderator
    WreckLuse68 said:
    I'm wondering if something like this can be used for the Assetto Corsa menus.
    I just tested it, yep, it works.
    Start AC. While in the 2d part of the game, alt tab out and start autooculustouch. Then go back to AC and enter the VR part of the game. With my default script you can move the yellow circle mouse cursor with the right touch thumbstick and click on stuff with the right trigger.

    (If you start autooculustouch while AC is already in VR mode, you'll lose the VR view because oculus home gets confused. You might be able to start autooculustouch before AC, but I tried it the way above).

  • WreckLuse68 Posts: 250
    Nexus 6
    excellent!!
  • cybereality Posts: 26,156 Oculus Staff
    Good stuff.
  • KillCard Posts: 1,078 Poster of the Week
    Awesome! .. Thanks so much for this.

    Re: Running with OH open - if you hit the system button and then hit "cancel" after a few seconds it will state "OH could not close this down, press continue to return to OH". From then on it remains in memory and works fine.

    As an additional request/question (requestion? xD).

    Is it possible to do the reverse, so that we can press a button on the keyboard to trigger an Oculus Touch press? Or even press one Touch button/action to trigger another Touch press (e.g. pressing the trigger on the left controller triggers the analog stick press on the right controller)? That could be useful for those SteamVR games that rely on the Thumb-pad press for actions, which is a lot less comfortable on an Analog Stick press.
  • kojack Posts: 4,906 Volunteer Moderator
    I've got some plans for that.

    This is currently just using (abusing?) the oculus sdk: by being a vr app that doesn't render anything, it just reads the input state. To do the opposite means injecting input into the sdk, which isn't supported. However, using the same method as Revive might let me hijack the function that returns touch state. It's something I might experiment with over the next few weeks (I start Christmas vacation tomorrow, until Feb).
    No promises though. :)

  • KillCard Posts: 1,078 Poster of the Week
    kojack said:
    I've got some plans for that.

    This is currently just using (abusing?) the oculus sdk: by being a vr app that doesn't render anything, it just reads the input state. To do the opposite means injecting input into the sdk, which isn't supported. However, using the same method as Revive might let me hijack the function that returns touch state. It's something I might experiment with over the next few weeks (I start Christmas vacation tomorrow, until Feb).
    No promises though. :)

    I wish you the best of luck! 

    Please let us know if you have any success.
  • kojack Posts: 4,906 Volunteer Moderator
    edited December 2017
    KillCard said:
    I wish you the best of luck! 

    Please let us know if you have any success.
    @KillCard Success!
    Well, the very first step.
    I've just managed to intercept programs asking for Touch state and replace the data with something else. Currently I'm tricking Google Earth into reversing the touch thumbsticks, so back zooms in and forward zooms out.
    So at the very least, I could make a simple app that lets you (via a script) remap touch controls.

    The downside is that right now you need to manually enter the process ID of the program to trick (so a DLL payload can be injected into it to bypass the oculus runtime). I'm sure I can come up with a better method, but for now it works as a basic test.

    Something that would be very easy to do is swap index and grab triggers, or even swap left and right hands.


  • KillCard Posts: 1,078 Poster of the Week
    kojack said:
    @KillCard Success!
    Well, the very first step.
    I've just managed to intercept programs asking for Touch state and replace the data with something else. Currently I'm tricking Google Earth into reversing the touch thumbsticks, so back zooms in and forward zooms out.
    So at the very least, I could make a simple app that lets you (via a script) remap touch controls.

    The downside is that right now you need to manually enter the process ID of the program to trick (so a DLL payload can be injected into it to bypass the oculus runtime). I'm sure I can come up with a better method, but for now it works as a basic test.

    Something that would be very easy to do is swap index and grab triggers, or even swap left and right hands.


    Swapping Left Grab or Left Trigger with Left Analog Press would be exactly what I'm after actually. 

    Awesome work! .. I can't wait to see what comes of this.
  • kojack Posts: 4,906 Volunteer Moderator
    More progress.
    It can now search for oculus apps by name rather than manually entering the process id.
    It's now lua scripted. Every time a program (that's been taken over) calls ovr_GetInputState, it will instead call a lua script that can access the real data, plus provide fake data. Here's an example script:
    ProcessInput = function(controllerType)
        -- read the real state from the runtime
        state, result = ovr_GetInputState(controllerType)
        -- swap the right hand's index and hand trigger values
        state.IndexTrigger[1], state.HandTrigger[1] = state.HandTrigger[1], state.IndexTrigger[1]
        ovr_SetInputState(state)
        return result
    end
    This grabs the real state, then swaps the right hand index and hand triggers.
    (Lua lets you do swaps like that without temps, such as a,b = b,a)

    I don't like the structure of that, but it's a proof of concept.

    So when I ran it in google earth, it made the index trigger rotate the world and the hand trigger drag the world.

    Hmm, 6:37pm and I haven't eaten today. Break time for Christmas leftovers.

    The only annoying thing is that the input state structure returned by the oculus sdk has a heap of similar members, such as HandTrigger, HandTriggerNoDeadzone and HandTriggerRaw. There's no way of knowing which an app wants to look at, so you might need to do triplicate code (index, hand and thumbsticks all have 3 versions for each hand).


    I'm kind of treading on Revive territory here, so hopefully Oculus aren't going to throw a fit and block me. I'm not actually using Revive in any way, but I am injecting code into oculus apps, which is how Revive works.
    Avast has already gotten pissed off at my injector code, thinking it's a virus.




  • kojack Posts: 4,906 Volunteer Moderator
    New progress.
    Here's a little script (note: this is unrelated to the AutoHotKey scripts; this is a lua script for a new tool I'm writing):
    require 'bit'

    ProcessInput = function(controllerType)
        -- read the state the app asked for, plus the remote's state
        state, result = ovr_GetInputState(controllerType)
        state2, res2 = ovr_GetInputState(ovrControllerType.Remote)
        -- map the remote's dpad onto the right touch thumbstick
        if bit.band(state2.Buttons, ovrButton.Up) ~= 0 then
            state.Thumbstick[1].y = 1
        end
        if bit.band(state2.Buttons, ovrButton.Down) ~= 0 then
            state.Thumbstick[1].y = -1
        end
        if bit.band(state2.Buttons, ovrButton.Right) ~= 0 then
            state.Thumbstick[1].x = 1
        end
        if bit.band(state2.Buttons, ovrButton.Left) ~= 0 then
            state.Thumbstick[1].x = -1
        end
        return state, result
    end

    What this one does is read the requested controller type (for example, Google Earth asks for the Touch controllers) into state.
    Then it does another read just on the oculus remote, into state2.
    It then checks the remote's dpad and sets the right thumbstick of the touch to match.
    So it turns the remote into a touch analog thumbstick. I can use it to zoom in and out in google earth.
    :)
     

  • kojack Posts: 4,906 Volunteer Moderator
    I really should make a new thread for it, but I'm not quite ready for a release yet, so I'll just dump this here. More features to add before the base version.

    I've done a large code refactoring of Oculus Injector, or whatever I end up calling it. I've added some new stuff to it too.

    Some new features:
    - time access
    - keyboard access (so you can make keys trigger Touch/Remote/Xbox inputs)
    - intercept tracking data!

    The last one means I can mess with position and orientation data of both touch controllers and the headset.
    For example, this code:
    function HookGetTrackingState(time, latency)
        state = ovr_GetTrackingState(time, latency)
        -- copy the right hand's orientation onto the left hand
        state.HandPoses[0].ThePose.Orientation = state.HandPoses[1].ThePose.Orientation
        return state
    end
    copies the right touch orientation to the left touch, while leaving its position alone.

    Although now most of my ideas for using this stuff will mean I'll have to implement a full vector/quaternion library in lua. :)

    I'm going to try intercepting the calls to ask which controllers are present. I might be able to create fake touches or remotes.

  • KillCard Posts: 1,078 Poster of the Week
    kojack said:
    I've done a large code refactoring of Oculus Injector, or whatever I end up calling it. I've added some new stuff to it too.

    Dude this is great news! .. I might even be able to use this to make TAS Speedruns of Oculus games. xD
  • kojack Posts: 4,906 Volunteer Moderator
    New release: v0.1.3

    I found a flag I've never seen before: ovrInit_Invisible
    By starting the sdk using this flag, I've removed the annoying bit where auto oculus touch stops Dash from loading other VR apps at the same time. It also stops auto oculus touch from appearing as the "app not responding" grey loading screen.

    I've also put some comments on the buttons enums to explain what they represent on Touch, Xbox and the Remote.

  • Iskarien Posts: 1
    NerveGear
    Thank you for creating this!

    I downloaded it, but somehow I can't seem to get it to work, even though what I'd need is a very simple thing: 
    I'm handicapped, and I can't use the A and B buttons with my right hand, so I wanted to map the Y button as B... Which should be as simple as
    ovrY :: ovrB
    ovrB :: ovrY
    right?

    Doesn't work, and I think I may have placed it wrong... Any clue what I'm doing wrong? Would be great if you could help me with that, I can't play most games without such a "fix" and I suck at scripting stuff :/

    Thanks a lot :)
  • kojack Posts: 4,906 Volunteer Moderator
    At the moment all that you can do in AutoOculusTouch is read the state of oculus controls then generate keyboard or mouse events. It's not possible to generate other oculus touch events. So you could make the Touch Y button press the keyboard B, but not the Touch B.
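
    For example, something like this would map Touch Y to the keyboard B key (a sketch only: GetButtonsDown is an assumed helper name for filling the `pressed` bitmask, as in the example scripts):
    #include auto_oculus_touch.ahk
    InitOculus()
    Loop
    {
        Poll()
        pressed := GetButtonsDown()   ; assumed helper: bitmask of buttons currently down
        down := (pressed & ovrY) != 0
        if (down && !wasDown)
            Send, {b down}            ; press keyboard B when Touch Y goes down
        if (!down && wasDown)
            Send, {b up}              ; release it when Touch Y is released
        wasDown := down
        Sleep, 10
    }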

    The other program I'm working on (mentioned a little above) called Oculus Injector can do it, but it's not ready for release yet.
  • Animus777 Posts: 4
    NerveGear
    Any updates on Oculus Injector? I would like to use Oculus Touch as a mouse with non-VR games. And it seems that program is my only hope.
  • kojack Posts: 4,906 Volunteer Moderator
    I haven't had a chance to touch (no pun intended) it lately, but I'm planning on getting back to it soon.
  • Johnny1234 Posts: 10
    NerveGear
    Did anyone get auto_oculus_touch to work with the Oculus Remote? I can only get left/right/up/down to work; the rest of the buttons do not register ;(

    ; working:
    if pressed & ovrRight
        Send, {WheelUp}

    ; not working:
    if pressed & ovrB
        Send, {WheelDown}

  • Baronthor Posts: 1
    NerveGear
    edited June 2018
    Hey. Any updates on your Injector? I'm trying to use a Wii Balance Board as an input for my VR games. All I can do with the Wii Balance Walker program is trigger keyboard events. So your injector would be awesome for this!
  • huliqan Posts: 11
    NerveGear
    edited June 2018
    auto_oculus_touch doesn't work with the Oculus 1.27 runtime :(
  • kojack Posts: 4,906 Volunteer Moderator
    edited June 2018
    Sorry, I've been busy with work and haven't had much time for VR.

    Yeah, looks like runtime 1.27 has broken AutoOculusTouch.

    It seems the new behaviour is that ovr_GetInputState (which reads the buttons on the touch and remote) now only works in apps that are rendering with the headset on. In the past you could ask for controller state at any time, including in invisible apps and when not wearing the headset. Now, that's gone.
    Without being the rendering app, all inputs are reported as zero.
    If you are rendering but take the headset off, all inputs go to zero in less than a second.
    There are no errors reported, it just stops giving state data.

    The crazy thing is I can still get touch tracking data, the IMUs are reported via a different function that hasn't been nerfed. But no buttons or thumbsticks.

    Unless I can find some hack to work around it, or Oculus made a mistake and fix it in 1.28, I have a feeling AutoOculusTouch is dead. :(


    Edit: Ok, looks like it's not just me; other developers in C++ and Unity are reporting ovr_GetInputState is broken in 1.27. So maybe this is temporary after all. Now to play the waiting game...


  • huliqan Posts: 11
    NerveGear
    kojack
    Please write to Oculus support about the broken ovr_GetInputState in 1.27

  • Agterbosch Posts: 413
    Nexus 6
    That sounds awesome, can it be used to emulate a mouse? That'd be great with my RSI

  • huliqan Posts: 11
    NerveGear
    edited June 2018
    Oculus updated to 1.28 but auto_oculus_touch still doesn't work :(
    What the hell? >:)
  • kojack Posts: 4,906 Volunteer Moderator
    Yep, I tested as soon as I saw that 1.28 beta had come out. :(
    I don't see any indication in the dev forums that they've found the issue.

    Agterbosch said:
    That sounds awesome, can it be used to emulate a mouse? That'd be great with my RSI
    Yep, you can use the thumbsticks to move a mouse cursor. Or at least you used to be able to, since the project is now broken until Oculus fix a bug.

  • huliqan Posts: 11
    NerveGear
    edited June 2018
    I do this:
    - Stop the Oculus runtime service:
      "C:\Program Files\Oculus\Support\oculus-runtime\OVRServiceLauncher.exe" -stop
    - Rename the folder C:\Program Files\Oculus\Support
    - Copy in the Support folder from Oculus 1.26 (35 MB, without Oculus Home): https://drive.google.com/file/d/1TWE0Q5PbcncHyitGagE0B9OLpPXS6XVt/view?usp=sharing
    - Start the Oculus runtime service:
      "C:\Program Files\Oculus\Support\oculus-runtime\OVRServiceLauncher.exe" -start
    - And auto_oculus_touch works as before

    ...Oculus 1.26 full (with Home): https://drive.google.com/file/d/1ZIK5FEpEyldqIXQrM6cFDXAFfFjEZgSl/view?usp=sharing
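
    The same steps wrapped in an AutoHotKey script (a sketch: run it elevated, and C:\oculus_126\Support is a placeholder for wherever you extracted the 1.26 files):
    ; Roll the Oculus runtime back to 1.26 using the steps above.
    RunWait, "C:\Program Files\Oculus\Support\oculus-runtime\OVRServiceLauncher.exe" -stop
    FileMoveDir, C:\Program Files\Oculus\Support, C:\Program Files\Oculus\Support_old, R  ; rename the broken runtime's folder
    FileCopyDir, C:\oculus_126\Support, C:\Program Files\Oculus\Support                   ; drop in the 1.26 copy
    RunWait, "C:\Program Files\Oculus\Support\oculus-runtime\OVRServiceLauncher.exe" -start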