I think I know what is wrong with my tickrate system now.
I calculate how many ticks the world should step forward based on how much time has passed since the last tick, divided by the expected interval between ticks (1 over w_tickrate, in this case 1/120th of a second).
The math is sound. I have gone over it again and again and again and again. However:
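For the record, the update path boils down to something like this (reconstructed from memory, so now() and tick() are stand-ins, but the shape is right):

```cpp
#include <chrono>
#include <cstdint>

// Stand-in for however the engine reads wall-clock time, in seconds.
static double now() {
    using clock = std::chrono::steady_clock;
    static const auto start = clock::now();
    return std::chrono::duration<double>(clock::now() - start).count();
}

struct World {
    double  w_tickrate   = 120.0; // target ticks per second
    double  w_lastUpdate = 0.0;   // timestamp of the previous update
    uint8_t w_ticksDue   = 0;     // whole ticks to process this frame

    void tick() { /* advance the simulation by one 1/120 s step */ }

    void update() {
        double elapsed = now() - w_lastUpdate;
        // Elapsed time divided by the tick interval (1 / w_tickrate),
        // truncated down to a whole number of ticks.
        w_ticksDue = static_cast<uint8_t>(elapsed * w_tickrate);
        // Reset the reference point; the fractional remainder of the
        // division above is not kept anywhere.
        w_lastUpdate = now();
        for (uint8_t i = 0; i < w_ticksDue; ++i) tick();
    }
};
```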
There are always microscopic differences in latency which are simply out of my control. And because everything runs on a single thread, the speed at which the graphics are drawn each frame can have a wild influence on the latency between world updates. There's clue number one.
Clue number two: I don't store the delta anywhere as a member within the world class. I only truncate the result down to an 8-bit counter that says how many ticks to process. The time measurement itself then gets thrown away, because I reset w_lastUpdate every tick without keeping the remainder of the w_ticksDue calculation.
Clue number three: certain events within the world drift from their intended speed, but only up to a certain point, before wrapping around to being slow again.
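To put numbers on that: at 120 ticks per second, one tick is about 8.33 ms. If a frame takes 12 ms, the division gives 12 / 8.33 ≈ 1.44 ticks due, which truncates to 1, and the leftover 3.67 ms evaporates when w_lastUpdate resets. The amount lost per frame is the fractional part of that division, so it creeps up toward a full tick as frames get longer, then snaps back to zero once a frame crosses the next whole-tick boundary. That would explain the deviation growing and then wrapping around.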
I think what's happening is that I am constantly throwing away the microscopic delays introduced by the rendering loop. The world's internal clock stays very close to real time but rapidly desyncs anyway, because the w_ticksDue calculation simply forgets that extra time ever existed.
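So the plan is the standard accumulator approach: bank all of the elapsed time in a new member (I'll call it w_accumulator here; the name is mine) and only ever subtract whole tick intervals from it, so the remainder survives into the next frame. Roughly:

```cpp
struct World {
    double w_tickrate    = 120.0; // target ticks per second
    double w_lastUpdate  = 0.0;   // timestamp of the previous update
    double w_accumulator = 0.0;   // unspent time carried between frames

    void tick() { /* advance the simulation by one 1/120 s step */ }

    void update() {
        double t = now();                  // now() as in the sketch above
        w_accumulator += t - w_lastUpdate; // bank ALL elapsed time
        w_lastUpdate = t;

        const double step = 1.0 / w_tickrate; // one tick, ~8.33 ms
        while (w_accumulator >= step) {
            w_accumulator -= step;         // remainder stays banked
            tick();
        }
    }
};
```

With that, the 8-bit w_ticksDue counter isn't even needed; the loop consumes whole ticks and the fraction that used to vanish just rolls over into the next frame.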
Gonna try a fix when I get home.
It didn't work. Going to throw the entire thing out and try again with a different system.