So, our Unity app has a lot of simple objects and pushes the GPU pretty hard. Looking to squeeze out every last drop, I took the oft-cited recommendation to override the texture compression to ASTC. Unfortunately, I noticed a fairly significant drop in runtime performance - maybe 10% or more. Enough to dump us down from 72 fps to 60 in many cases.
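For context, by "override" I mean the per-platform texture format override. Project-wide it's the Texture Compression dropdown in Build Settings, but per-texture it can be applied with an editor script roughly like the sketch below (the `ASTC_6x6` block size and the `"Android"` platform string are just illustrative choices, and the enum name assumes a reasonably recent Unity version):

```csharp
// Rough sketch of forcing an ASTC override on import via an AssetPostprocessor.
// Block size (6x6) and target platform ("Android") are assumptions - adjust to taste.
using UnityEditor;

public class AstcOverridePostprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;

        // Fetch the Android-specific settings, enable the override, and pick a format.
        var settings = importer.GetPlatformTextureSettings("Android");
        settings.overridden = true;
        settings.format = TextureImporterFormat.ASTC_6x6;

        importer.SetPlatformTextureSettings(settings);
    }
}
```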
Is this unusual? I'm curious to know the technical reasons why ASTC is supposed to give better performance (as per OVR Performance Lint and several articles) and what our use case might be doing to foil that.
Any info, or even anecdotal notes, would be appreciated.