I use Unity and wrote a C# script to control first-person camera movement using the Oculus Touch controllers. It works fine on my Rift (both in the Unity editor and in exe builds), and it works in the Unity project from which I made the Android build for the Quest, but not in the scene I sideloaded onto the Quest itself. It doesn't look like Oculus has changed the Touch input mapping (Button.One is still Button.One, etc.). Can anyone explain what's going on?
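For context, the input reads involved are along these lines (a minimal sketch using the Oculus Integration's OVRInput API; the class name, speed value, and choice of thumbstick are placeholders, not my actual script):

```csharp
using UnityEngine;

// Minimal first-person movement sketch using Oculus Touch via OVRInput.
// Assumes the Oculus Integration package is imported into the project.
public class TouchMove : MonoBehaviour
{
    public float speed = 2f; // arbitrary placeholder value

    void Update()
    {
        // Left thumbstick drives horizontal movement.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        Vector3 move = transform.forward * stick.y + transform.right * stick.x;
        transform.position += move * speed * Time.deltaTime;

        // Button.One (A on the right Touch controller) as an example action.
        if (OVRInput.GetDown(OVRInput.Button.One))
            Debug.Log("Button.One pressed");
    }
}
```

On the Rift and in the editor this all works; on the sideloaded Quest build none of these calls return anything.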
Same problem here. Input has stopped working altogether in all the apps I've developed. It was a really fun surprise to get just as I was about to go on holiday. Thanks for the gift, Oculus 😞
Try the Oculus menu -> Tools -> Create store-compatible manifest. I agree it's an awful surprise. I was about to take my Quest out to demo to some friends and now everything's borked.
Oculus tech support recommended the same solution: clicking on Oculus -> Tools -> Create store-compatible AndroidManifest.xml. So I tried it again and it worked!
This is related to support for Go games on the Quest, which changed recently. If you don't have the Quest app set up right (the manifest thing above), then the Quest will think it's a Go app and will drop down to Go-level controller support instead of Touch.
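For anyone checking their build by hand, the key thing the store-compatible manifest adds is the 6DOF head-tracking feature declaration, which is what marks the build as a Quest (Touch) app rather than a Go one. A sketch of the relevant fragment (package name and surrounding attributes are placeholders, not the full generated manifest):

```xml
<!-- Fragment of AndroidManifest.xml; package name is a placeholder. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.myquestapp">
  <!-- Declaring 6DOF head tracking identifies the build as a Quest app,
       so it receives Touch-level controller support instead of Go-level. -->
  <uses-feature android:name="android.hardware.vr.headtracking"
                android:required="true"
                android:version="1" />
</manifest>
```

The "Create store-compatible AndroidManifest.xml" menu item generates this (and the rest of the required entries) for you, so using the tool is safer than editing by hand.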
Thanks. In light of kojack's comment, it seems at least possible that the manifest fix worked the second time but not the first because, between the two attempts, I changed the Element 0 setting in the OVR camera rig's OVR Manager from "Gear VR or Go" to "Quest".