A few days ago, a friend of mine, an old demo coder, came over for a code review. I showed off the game and got some feedback.
One choice I had made earlier was to use a constant timebox for my game loop, which updates coordinates and then draws them. The loop would only draw if enough time (about 5 ms) was left in the timebox after the update, or at least every fifth loop. Each timebox was 20 ms, which kept the game at 50 FPS (1000 ms / 20 ms = 50).
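To make the old scheme concrete, here is a minimal sketch of that kind of fixed-timebox loop. The constants and function names (`TIMEBOX_MS`, `DRAW_BUDGET_MS`, `fixed_timestep_loop`, and so on) are my own illustration, not the game's actual code; I'm only assuming the behaviour described above: a 20 ms loop that draws when roughly 5 ms of budget remains, or at the latest every fifth iteration.

```python
import time

TIMEBOX_MS = 20        # fixed timestep: 20 ms per loop -> 50 updates per second
DRAW_BUDGET_MS = 5     # only draw if about 5 ms of the timebox is left after update
FORCE_DRAW_EVERY = 5   # ...but never skip drawing more than 4 loops in a row

def fixed_timestep_loop(update, draw, running):
    """Run update/draw in fixed 20 ms timeboxes until running() is False."""
    loops_since_draw = 0
    while running():
        start = time.monotonic()
        update()                                   # advance game state one fixed step
        elapsed_ms = (time.monotonic() - start) * 1000
        loops_since_draw += 1
        # draw if there is budget left in the timebox, or at least every 5th loop
        if TIMEBOX_MS - elapsed_ms >= DRAW_BUDGET_MS or loops_since_draw >= FORCE_DRAW_EVERY:
            draw()
            loops_since_draw = 0
        # sleep away whatever remains of the 20 ms timebox
        remaining = TIMEBOX_MS / 1000 - (time.monotonic() - start)
        if remaining > 0:
            time.sleep(remaining)
```

Note that nothing here looks at the wall clock between updates: each `update()` call assumes exactly 20 ms has passed, which is precisely what goes wrong when the display refreshes at a different rate.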
However, when we looked closer, we noticed that the asteroids moving across the screen had a slight 'lag' feeling to them. My friend mentioned that in some graphics libraries, the buffer-swap function (which replaces the image currently shown on screen with whatever new image has been drawn since) blocks and waits for the graphics card. I figured that was the cause of the lag.
The problem was that the graphics card updated the screen at 60 Hz (60 times per second) while the game only generated 50 FPS (50 images per second). That meant that 10 times per second, the same image was shown twice in a row, which the player experienced as "lag". It's also why the first trailers of the game were even laggier: they were recorded at yet another frame rate that was completely out of sync.
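You can check that "10 duplicates per second" number with a tiny simulation. This is a hypothetical sketch (the function and its parameters are mine, not part of the game): at each display refresh it shows the newest game frame that has been completed, and counts how often a refresh repeats the previous frame.

```python
def duplicated_refreshes(refresh_hz, game_fps, seconds=1):
    """Count display refreshes that show the same game frame as the one before."""
    duplicates = 0
    last_frame = -1
    for r in range(refresh_hz * seconds):
        t = r / refresh_hz              # time of this display refresh, in seconds
        frame = int(t * game_fps)       # index of the newest completed game frame
        if frame == last_frame:         # nothing new finished since the last refresh
            duplicates += 1
        last_frame = frame
    return duplicates
```

With a 60 Hz display and a 50 FPS game, every sixth refresh has no new frame ready, so `duplicated_refreshes(60, 50)` comes out to 10, matching the stutter described above; at a matched 60 FPS it is 0.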
I've now changed the implementation completely by using delta time in all update functions: each function multiplies the movement per millisecond by the number of milliseconds elapsed since the last update.
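A minimal sketch of that delta-time idea, with a made-up `Asteroid` class standing in for the game's real objects (the names and speed values are assumptions for illustration): movement is expressed per millisecond, so the position after some amount of time is the same no matter how the frames were spaced.

```python
class Asteroid:
    """Toy moving object whose speed is expressed per millisecond."""
    def __init__(self, x, vx_per_ms):
        self.x = x
        self.vx_per_ms = vx_per_ms

    def update(self, delta_ms):
        # movement per millisecond times milliseconds elapsed since last update
        self.x += self.vx_per_ms * delta_ms

def run(asteroid, frame_times_ms):
    """Feed the asteroid a series of frame timestamps, updating by the deltas."""
    last = 0.0
    for now in frame_times_ms:
        asteroid.update(now - last)
        last = now
```

Updating one asteroid at a steady 20 ms per frame and another at irregular intervals covering the same 100 ms leaves both at the same position, which is exactly the frame-rate independence the rewrite was after.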
The game flows great now, without lag, and it will run at the same speed no matter which computer or refresh rate is used. And I've learnt to choose delta time over a fixed time window next time.