Well I finally fixed it.

It wasn't the timers.

I designed a bitmasking input system, which in hindsight I have no idea what the hell I was thinking, because there is already a global bitmask for exactly that purpose. The actual bug: my mask was getting cleared every tick, instead of individual bits only being cleared when their keys were released.

This was unnoticeable at higher framerates because the renderer polled inputs faster than the world updated, so key events were re-registered long before the next game tick.

However at lower framerates this would cause anything reading the bitmask to constantly start and stop moving because the bitmask was completely empty whenever multiple ticks occurred within one frame.
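For the curious, the fix described above amounts to something like this: bits are set on key-down and cleared on key-up, and the mask is never wiped per tick. A minimal sketch; the type and key names here are illustrative, not the project's actual identifiers.

```cpp
#include <cstdint>

// Illustrative key bits; the real project presumably has its own layout.
enum : std::uint32_t {
    KEY_LEFT  = 1u << 0,
    KEY_RIGHT = 1u << 1,
    KEY_JUMP  = 1u << 2,
};

struct InputMask {
    std::uint32_t held = 0;

    void onKeyDown(std::uint32_t bit) { held |= bit; }
    void onKeyUp(std::uint32_t bit)   { held &= ~bit; }

    // The buggy version effectively did `held = 0;` every tick, so any
    // tick that ran between key events saw an empty mask.
    bool isHeld(std::uint32_t bit) const { return (held & bit) != 0; }
};
```

Because nothing resets the mask between ticks, a key held across several consecutive ticks reads as held in every one of them, even when multiple ticks run inside a single frame.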

This also explains why visual effects worked fine.

So yeah, I wasted 3 days reimplementing various timer systems over and over before realizing that the system meant to prevent dropped inputs was itself dropping inputs, and that I could have avoided all of this.

Somehow my timings are wrong...?
It's gotta be some kind of typo in my delta timer, I've checked everything else.
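For reference, a delta timer of the sort mentioned here can be very small with std::chrono; this is a sketch, and the class name is mine, not the project's.

```cpp
#include <chrono>

// Minimal delta timer: call tick() once per frame; it returns the
// seconds elapsed since the previous call. steady_clock is monotonic,
// so the delta can never go negative.
class DeltaTimer {
    using Clock = std::chrono::steady_clock;
    Clock::time_point last_ = Clock::now();
public:
    double tick() {
        const auto now = Clock::now();
        const std::chrono::duration<double> delta = now - last_;
        last_ = now;
        return delta.count();
    }
};
```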

It is a good thing I have not started writing enemy AI yet, or else the whole thing would have to be rebalanced.

I wonder what alternate DEs exist for AmigaOS 3? Workbench isn't my thing, but I'm trying.

Well I found out why nothing was showing up on the screen and I'm very, very embarrassed.

Quake-style developer console & my own cout wrapper for logging. The cvar and command system is a work in progress.

Mostly for the Windows users, since they don't have a visible std::cout to diagnose issues with, but also so I can tweak/monitor vars without sifting through a debugger. CLion's gdb frontend only gets you so far.
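A cvar store like the one being described can start as little more than a string map; this is a hedged sketch of the general idea, and none of these names come from the actual project.

```cpp
#include <string>
#include <unordered_map>

// Minimal cvar registry: console commands write values in, game code
// reads them out by name, with a fallback for unset vars.
class CvarRegistry {
public:
    void set(const std::string& name, const std::string& value) {
        vars_[name] = value;
    }

    std::string get(const std::string& name,
                    const std::string& fallback = "") const {
        auto it = vars_.find(name);
        return it != vars_.end() ? it->second : fallback;
    }

private:
    std::unordered_map<std::string, std::string> vars_;
};
```

Keeping values as strings keeps the console side trivial; typed accessors (int, float) can be layered on top later with parsing.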

Context: friend booted up Windows to play an anticheat protected game and this popped up

Progress report: Menu UI framework is mostly done. No logic to handle anything other than just buttons right now, but it's good enough I can move on soon and get to work on game logic.

Implemented a globally visible bitmask of user input to be passed to worldThread during the game loop.
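Since the mask is written by the input/render side and read by a separate world thread, one safe shape for it is a single atomic word; this is a sketch of that pattern, with hypothetical names, not the project's actual code.

```cpp
#include <atomic>
#include <cstdint>

// Shared input bitmask. fetch_or / fetch_and make the set/clear
// operations atomic, so the two threads never race on a
// read-modify-write.
std::atomic<std::uint32_t> g_inputMask{0};

void pressKey(std::uint32_t bit) {
    g_inputMask.fetch_or(bit, std::memory_order_relaxed);
}

void releaseKey(std::uint32_t bit) {
    g_inputMask.fetch_and(~bit, std::memory_order_relaxed);
}

// The world thread takes one consistent snapshot per tick.
std::uint32_t snapshotInput() {
    return g_inputMask.load(std::memory_order_relaxed);
}
```

Snapshotting once per tick also means a key can't appear and disappear mid-tick if events arrive while the world is updating.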

I have no I/O functions yet, so my debug and render variables are set from the command line.

Taking a few days because I want to perfect the absolute basics before I continue and get rid of any technical debt. Simple menu framework on the main thread will be the next step. Then a globally visible GameObject layer to be read by the renderer and animation systems in state INGAME

LibAlleg does not have a guide on framerate-independent game logic, which makes removing the FPS cap difficult. No worries, I could do multithreading.

Except, for some ungodly reason, the renderer is hard-capped at 60 frames per second regardless of my own code. This is "120".

I only found this out the hard way when my timescale math inadvertently made the publisher intro play twice as long instead of at double the framerate.

At that time I had not implemented a delta timer because I was relying on Allegro's event timers for pacing, so I had no way of viewing the framerate until I took this screenshot. You can probably imagine the hours I wasted trying to fix my code not realizing what was happening.
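The textbook answer to framerate-independent logic when the render rate can't be trusted is a fixed-timestep accumulator: render whenever, but advance the world in fixed slices, running zero ticks on fast frames and several on slow ones. A sketch under that assumption; names are illustrative and this is not the project's implementation.

```cpp
// Fixed-timestep accumulator: bank up real frame time, then pay it out
// in whole world ticks of constant length.
struct TickClock {
    double accumulator = 0.0;
    double tickLength;  // seconds per world tick, e.g. 1.0 / 60.0

    explicit TickClock(double len) : tickLength(len) {}

    // Feed in the real time the last frame took; returns how many
    // world ticks to run this frame.
    int consume(double frameDelta) {
        accumulator += frameDelta;
        int ticks = 0;
        while (accumulator >= tickLength) {
            accumulator -= tickLength;
            ++ticks;
        }
        return ticks;
    }
};
```

Note this is exactly the situation where the earlier input bug bit: when `consume` returns 2 or more, every one of those ticks reads the input state, so the mask has to survive across ticks within a frame.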

*sigh* time to contact the IRC again...

