60fps is always going to feel smoother than 45fps. In 66ms you get around 4 frames at 60fps but only 3 frames at 45fps. This matters most in games with a lot of motion, such as racing games.
However, the input lag depends on the sync between the game and the monitor. The higher the monitor's refresh rate, the sooner it "asks" for a new frame, so a frame that is ready spends less time waiting to be picked up.
The maximum sync delay is dictated by the monitor's refresh rate. For example, a game running at 45fps (frame time of 1s/45 ≈ 22ms) has a maximum wait (the delay between the frame being ready and it being displayed on the screen) of about 11ms, because the screen checks for a new frame at 90Hz (1s/90 ≈ 11ms).
Ok, let’s talk frame times first:
1s / 90Hz = 11ms
1s / 60Hz = 16ms
1s / 45Hz = 22ms
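If it helps, that frame-time arithmetic can be sketched in a few lines of Python (the function name is mine, just for illustration):

```python
# Frame time in ms for a given rate (Hz or fps), as in the list above.
def frame_time_ms(rate):
    return 1000 / rate

for rate in (90, 60, 45):
    print(f"{rate} -> {frame_time_ms(rate):.1f}ms per frame")
```

The values in the list above are just these results rounded down.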
Vsync introduces a delay between the frame being ready and it being displayed on the screen. This means input lag is heavily dependent on the screen refresh rate: the higher the refresh rate, the lower the vsync lag. Then we need to add at least one frame time to the vsync lag, which is the minimum required for us to actually see the new frame. So it goes like this:
60/60: up to 16ms vsync wait + 16ms frame time ≈ 33ms
90/45: up to 11ms vsync wait + 22ms frame time ≈ 33ms
The worst-case input lag for 60/60 and 90/45 is identical. Some game engines might respond better at higher fps (their input handling is tied to the render frequency), so 60/60 has a slight advantage over 90/45 regarding input lag overall.
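A quick sketch of that worst-case arithmetic (the function name is hypothetical, not from any real API): the lag is the wait for the screen to "ask" for the frame, plus one frame time to actually see it.

```python
# Worst-case vsync input lag: up to one refresh interval of waiting,
# plus one full frame time before the new frame is visible.
def worst_case_lag_ms(refresh_hz, fps):
    vsync_wait = 1000 / refresh_hz  # max wait for the next refresh
    frame_time = 1000 / fps         # one frame time to see the result
    return vsync_wait + frame_time

print(worst_case_lag_ms(60, 60))  # 60Hz screen at 60fps -> ~33ms
print(worst_case_lag_ms(90, 45))  # 90Hz screen at 45fps -> ~33ms
```

Both combinations land on roughly the same ~33ms worst case, which is why they feel similar latency-wise.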
60/60 shows 4 frames in 66ms, but 90/45 only presents 3 frames, so 60/60 is smoother.
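The smoothness difference comes down to simple counting, which can be sketched like this (rounding to the nearest whole frame; helper name is mine):

```python
# Rough count of frames presented within a fixed time window.
def frames_in_window(fps, window_ms=66):
    return round(window_ms / (1000 / fps))

print(frames_in_window(60))  # -> 4
print(frames_in_window(45))  # -> 3
```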
90/45 uses less battery and is generally more stable, since the longer frame time leaves more headroom for hiccups in frame pacing.
In conclusion: 60/60 is preferred in fast-paced/action games (like shooters or racing games), but 90/45 is better in slower-paced games, especially open worlds with big variance between scenes (like RDR2 or Spider-Man).