Network Update Rate
In game development, the rate at which server and client send updates determines how accurately the remote player's real position can be predicted. However, not every increase in send rate gives equal returns for the extra CPU and bandwidth cost you're paying.
I've recorded and averaged the prediction error over multiple runs at some common send rates to see how much the prediction improves. The test has a character running in circles, which exercises a range of parameters such as extrapolation, interpolation and update rate.
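The extrapolation side of this can be as simple as dead reckoning: between packets, the client walks a straight line from the last received position and velocity. A minimal sketch, assuming 2D positions as tuples (the function name and representation are my own, not from any particular engine):

```python
def extrapolate(last_pos, last_vel, dt):
    """Dead-reckon a position dt seconds after the last received update.

    last_pos and last_vel are (x, y) tuples taken from the most recent
    server packet; until the next packet arrives, the client predicts
    along this straight line.
    """
    return (last_pos[0] + last_vel[0] * dt,
            last_pos[1] + last_vel[1] * dt)
```

The straight-line assumption is exactly what breaks down on curved or reversing movement, which is why the circular-run test below produces measurable error at all.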
Data
| Update rate | Error distance |
|---|---|
| 1 Hz | 73.4 cm |
| 2 Hz | 51.5 cm |
| 5 Hz | 23.3 cm |
| 10 Hz | 12.5 cm |
| 15 Hz | 8.9 cm |
| 20 Hz | 7.4 cm |
| 25 Hz | 6.2 cm |
| 50 Hz | 4.5 cm |
| 100 Hz | 3.5 cm |
The error distance is the distance between the client's predicted position and the server position, sampled each time a network packet arrives. The movement speed used in the test was 4.5 meters per second.
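The measurement itself is easy to reproduce. The sketch below is an idealized version of the circular-run test: the true circular motion stands in for the server state, the client predicts by linear extrapolation from the previous packet, and the error is sampled at each packet arrival. The radius, duration and function names are my assumptions, and because this setup has no interpolation, latency or noise, the absolute numbers won't match the table, only the trend:

```python
import math

SPEED = 4.5    # m/s, matching the test above
RADIUS = 2.0   # m, assumed circle radius for this sketch

def true_pos(t):
    """Ground-truth position of a character running a circle at SPEED."""
    w = SPEED / RADIUS  # angular velocity in rad/s
    return (RADIUS * math.cos(w * t), RADIUS * math.sin(w * t))

def true_vel(t):
    """Ground-truth velocity (tangent to the circle)."""
    w = SPEED / RADIUS
    return (-SPEED * math.sin(w * t), SPEED * math.cos(w * t))

def mean_error(rate_hz, duration=10.0):
    """Average distance between the linearly extrapolated position and
    the server position, sampled at each simulated packet arrival."""
    dt = 1.0 / rate_hz
    errors = []
    t = dt
    while t <= duration:
        # Client extrapolates from the packet received at t - dt.
        px, py = true_pos(t - dt)
        vx, vy = true_vel(t - dt)
        pred = (px + vx * dt, py + vy * dt)
        # Server position in the packet arriving now, at time t.
        sx, sy = true_pos(t)
        errors.append(math.hypot(pred[0] - sx, pred[1] - sy))
        t += dt
    return sum(errors) / len(errors)
```

Running `mean_error` for a few rates shows the same diminishing returns as the table: each doubling of the rate shrinks the error, but by less and less in absolute terms.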
The test data shows that there's barely any noticeable benefit to running more than 20 Hz. Movement already looks very smooth at only 10 Hz, but it suffers in a few edge cases (ping-pong movement) where 20 Hz gives much better results. The final choice will, of course, depend on your prediction algorithms and the type of game.
Conclusion
- Too few updates make it very hard to predict where the player is
- Too many updates severely increase server CPU and bandwidth usage
- Prefer a send and receive rate somewhere in the ballpark of 10-20 Hz
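In practice, the chosen send rate is usually decoupled from the (much higher) simulation or render frame rate with a fixed-interval loop. A rough sketch under assumed names, not taken from any specific engine:

```python
import time

SEND_RATE_HZ = 20              # assumed target from the ballpark above
SEND_INTERVAL = 1.0 / SEND_RATE_HZ

def run_send_loop(send_snapshot, should_stop):
    """Call send_snapshot() at a fixed rate until should_stop() is true.

    Scheduling against an absolute deadline (next_send) instead of
    sleeping a fixed amount each pass keeps the cadence from drifting.
    """
    next_send = time.monotonic()
    while not should_stop():
        now = time.monotonic()
        if now >= next_send:
            send_snapshot()
            next_send += SEND_INTERVAL
        # Sleep only until the next deadline, never a negative amount.
        time.sleep(max(0.0, next_send - time.monotonic()))
```

Raising `SEND_RATE_HZ` here is exactly the CPU/bandwidth trade-off from the list above: each extra snapshot costs serialization work and payload bytes for every connected client.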