I'm heavily invested in implementing stereo-correct specular highlighting in Gear VR. One of my biggest issues with almost all existing Gear VR titles is that even the best looking ones still have a dull, unrealistic look. My working theory, which I'm looking to test, is that this comes from the complete lack of specular highlighting.
Of course, a full featured reflection shader will kill mobile performance, so that's off the table. Normally, the right optimization would be to use specular & roughness maps (Daedalus for Gear VR uses this). Unfortunately, this is still incredibly GPU intensive, and causes heavy aliasing if done wrong.
John Carmack's suggestion is to do specular highlighting in realtime, but in the simplest possible case: one directional light (e.g. the Sun), with the camera & normal angles calculated in the vertex shader, and the light angle & the specular highlight itself calculated in the fragment shader.
Even with this simple case, fairly expensive functions (pow, cos) need to run in the vertex & fragment shaders. I was wondering: has anyone tried using LUTs (look-up textures) to speed up the calculations? Is the bandwidth tradeoff acceptable in a real-world Gear VR game? What other optimizations could I try to speed up calculating specular highlights in realtime?