I got a new Oculus Quest 2 headset a week ago. I have a Unity project depicting a debris flow (from a real numerical simulation), where a terrain and the "flows" are produced procedurally via script, with the flows starting to animate after pressing Play (the "flow" game objects appear in the scene at runtime). When I tried to build the project for both Standalone and Android, I got an "error CS0115 ... no suitable method found to override" in my Unity editor. I added #if UNITY_EDITOR and #endif at the start and end of the editor script respectively, and the error was solved. However, after building the project for the Android platform, the "flows" do not show up on the Oculus Quest 2. While I can walk around the terrain, I cannot see the debris flow coming. The same project built for Standalone PC gives the expected result (I can see the terrain and the debris flow in the built application, just as in the editor). What did I miss in the Android build settings?
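For context, this is roughly what the #if UNITY_EDITOR fix looks like. Anything that uses the UnityEditor namespace (such as overriding a method of the Editor class) must be excluded from player builds, because that assembly does not exist outside the editor; without the guard, the compiler reports CS0115 on the override. The class and target names below are hypothetical, not from my actual project:

```csharp
// Editor-only script: the #if UNITY_EDITOR guard excludes it from
// Standalone/Android player builds, where UnityEditor is unavailable.
// "FlowImporter" and "FlowImporterEditor" are illustrative names only.
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

[CustomEditor(typeof(FlowImporter))]
public class FlowImporterEditor : Editor
{
    // Without the #if guard around this file, this override has no base
    // method to override in a player build, producing error CS0115.
    public override void OnInspectorGUI()
    {
        DrawDefaultInspector();
    }
}
#endif
```

(The cleaner long-term fix is to move such scripts into a folder named `Editor`, which Unity excludes from player builds automatically.)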
P.S: My unity and VR experience is limited.
I'd say that without more details on how your simulation is actually implemented, it will be hard for anyone on here to guess what the issue is. There are many things that could be the culprit.
If I had to take a wild guess, I'd say this could be related to how the simulation results are rendered. I assume they are rendered using some sort of custom Unity shader. On Windows, Unity compiles these for the Direct3D 11 graphics API. Android (the OS that the Quest 2 runs on) uses a different graphics API (OpenGL ES, or Vulkan).
If the shader wasn't written to support OpenGL ES at all, or for a myriad of other reasons, it is quite possible that modifications are needed to make it run on that specific platform.
Thank you for the reply. Let me elaborate a bit. The whole scene contains a base terrain and the simulations, animated at specific time intervals. The base terrain was taken from a digital elevation model (DEM), and the simulations were the terrain DEM plus the flow height of the debris, which changed with time. A custom script reads the DEMs, imports them as 3D objects into Unity, and animates the simulations (two consecutive simulation frames appear at a time, and when the third one is added, the first one gets destroyed). The project was done in URP, so I used Shader Graph for both the terrain and the simulations. For the terrain I used triplanar mapping blended with the normal vector to create rocky slopes and grassy horizontal surfaces. For the simulations, I used a complicated shader graph which included tiling and offset (UVs taken from the flow velocity of the debris). I hope this helps somehow.
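To make the animation scheme concrete, here is a minimal sketch of the frame rotation I described, under the assumption that each simulation frame is instantiated from a mesh built from a DEM; all names are illustrative, not my actual code:

```csharp
// Minimal sketch of the frame rotation described above: two consecutive
// simulation meshes are alive at any time; when a third is spawned,
// the oldest one is destroyed. "flowMeshPrefab" is a hypothetical prefab
// generated from one DEM + flow-height frame.
using System.Collections.Generic;
using UnityEngine;

public class FlowAnimator : MonoBehaviour
{
    public GameObject flowMeshPrefab;   // one simulation frame as a mesh
    public float frameInterval = 0.5f;  // seconds between frames

    private readonly Queue<GameObject> liveFrames = new Queue<GameObject>();

    void Start()
    {
        InvokeRepeating(nameof(SpawnNextFrame), 0f, frameInterval);
    }

    void SpawnNextFrame()
    {
        liveFrames.Enqueue(Instantiate(flowMeshPrefab, transform));
        if (liveFrames.Count > 2)           // keep two consecutive frames
            Destroy(liveFrames.Dequeue());  // destroy the oldest one
    }
}
```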
If I were in your place, I would probably troubleshoot this by simplifying the complicated shader graph for the part that's not working on the Quest, one step at a time, to see which part of it introduces the issue.
Even though the Unity documentation here is not specifically targeted at Shader Graph, some of the information could still be useful. It outlines some of the differences between Direct3D-like and OpenGL-like platforms.
One example of such a difference: UV coordinates start from the top on DirectX platforms, while they start from the bottom on OpenGL platforms.
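Since your tiling-and-offset UV math is driven by flow velocity, this is exactly the kind of thing that can look correct on PC and wrong (or invisible, if it pushes samples outside the texture) on the Quest. Unity exposes the platform difference through the built-in UNITY_UV_STARTS_AT_TOP macro, which you could use in a Shader Graph Custom Function node; this is only a sketch of the idea, and the function name is made up:

```hlsl
// Hypothetical Custom Function node body for Shader Graph.
// UNITY_UV_STARTS_AT_TOP is a built-in Unity macro: it is set on
// Direct3D-like platforms (UVs start at the top) and unset on
// OpenGL-like platforms such as the Quest 2 (UVs start at the bottom).
void FlipUVIfNeeded_float(float2 In, out float2 Out)
{
#if UNITY_UV_STARTS_AT_TOP
    Out = float2(In.x, 1.0 - In.y);  // compensate on D3D-like platforms
#else
    Out = In;                        // leave untouched on OpenGL ES
#endif
}
```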
What I meant to say with my first post is that there's probably no way to help you without having access to your project. If you're new to Unity and VR, this could be anything from a simple fix up to digging deep into the "complicated shader graph".
If you can share the project, I'd suggest posting on the Unity forums and linking to your project on GitHub. The community there is much bigger, so I think you'll be more likely to find somebody taking the time to look into this specific issue.