Wall of text, but really the question is in the title ;)
Some (or most) Android devices have a hard framerate cap. One of my HTC devices is capped at 30 fps, another at 60 fps, and disabling vsync in Unity (whether via quality settings or from script) has no effect there.
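For reference, the "from script" attempt amounts to the following (these are the standard Unity APIs for this; the script name is just illustrative):

```csharp
using UnityEngine;

// What "disabling vsync from script" looks like -- on the capped HTC
// devices mentioned above, this has no visible effect:
public class VSyncOff : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;   // "Don't Sync" (same as the Quality setting)
        Application.targetFrameRate = -1; // don't additionally throttle the player loop
    }
}
```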
For continuous on-device performance monitoring during development (tracking the impact of game additions over time), it would be essential to measure (or at least plausibly estimate) **time taken per frame, CPU-side and GPU-side**, even while the typical simplistic FPS metric still sits at or above the hard cap.
Currently, for a near-empty Unity scene (hence very fast rendering) on a device with an enforced hard vsync (which sadly cannot be disabled via Unity project settings or scripts), a simple FPS measurement script just reports a constant 30 or 60 FPS. But that doesn't help analyze, for example, whether a newly added shader or script adds a heavy 5 ms to frame time while we're still within budget "until it's too late": the cost only becomes noticeable once the whole setup later drops below the device's hard vsync cap --- and by then there's too much new stuff going on, and we'd have to dive into the whole messiness of profiling (if we're even on Unity Pro), debug builds, etc.
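To illustrate why the naive metric is useless here, this is roughly what such a "simple FPS measurement script" looks like (a minimal sketch; the class name and log interval are arbitrary). Averaging `deltaTime` just flatlines at the cap no matter how much headroom exists:

```csharp
using UnityEngine;

// Naive FPS counter: averages frame delta over a half-second window.
// On a vsync-capped device this reads a constant 30 or 60 regardless
// of how cheap or expensive the scene actually is.
public class FpsCounter : MonoBehaviour
{
    float accum;
    int frames;

    void Update()
    {
        accum += Time.unscaledDeltaTime;
        frames++;
        if (accum >= 0.5f)
        {
            Debug.Log($"FPS: {frames / accum:F1}");
            accum = 0f;
            frames = 0;
        }
    }
}
```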
Of course it's really hard to measure whole frame time (including GPU time) with vsync enforced:
- measuring CPU-side frame time is fairly easy by sensibly picking the proper points to record the approximate frame start (in `Update()`) and frame end (via `WaitForEndOfFrame`)
- **but alas** most of the time we might be GPU-bound!
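The CPU-side measurement described in the first bullet can be sketched like this (assuming a `Stopwatch` restarted in `Update()` and read in a `WaitForEndOfFrame` coroutine; whether the end-of-frame callback fires before or after the vsync wait can vary by platform, so treat the numbers as approximate):

```csharp
using System.Collections;
using System.Diagnostics;
using UnityEngine;

// CPU-side frame timer: Stopwatch restarted in Update() (approximate
// frame start) and read after WaitForEndOfFrame (after rendering
// commands have been issued on the CPU). As noted above, this misses
// GPU time entirely -- if the GPU is the bottleneck, this stays short
// while the real frame cost grows.
public class CpuFrameTimer : MonoBehaviour
{
    readonly Stopwatch sw = new Stopwatch();

    void OnEnable()
    {
        StartCoroutine(Measure());
    }

    void Update()
    {
        sw.Restart(); // approximate start of this frame's CPU work
    }

    IEnumerator Measure()
    {
        var wait = new WaitForEndOfFrame();
        while (true)
        {
            yield return wait; // resumes after this frame's rendering was submitted
            UnityEngine.Debug.Log(
                $"CPU frame time: {sw.Elapsed.TotalMilliseconds:F2} ms");
        }
    }
}
```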
I guess if I want to avoid profiling/debugging I'll need to figure out a way to force-disable vsync somehow on unrooted Androids...
But maybe someone else has another sneaky, hacky idea to at least produce reasonably realistic guesstimates under these imperfect measuring conditions?