Does anyone know whether there's a general principle for estimating how much performance you lose at higher screen resolutions? My current monitor is a very old 5:4 19" 1280 x 1024 model and I'm looking to upgrade to something bigger, preferably a 34" ultra-wide. These only seem to come in 1080 and 1440 versions (why no 1200?) and, as I sit quite close to the screen, I feel a 1080 model would be much too pixelated for my liking.

This leaves me with the prospect of moving from a 1280 x 1024 screen with around 1.3 million pixels to a 3440 x 1440 one with nearly 5 million, plus a sizable increase in viewable area. Would a roughly 4x increase in pixel count mean frame rates dropping to roughly a quarter of what they are now, or is it just not as simple as that?

My current Assetto Corsa benchmark figure is around 250 fps and I don't think I could accept anything less than 120; after all, there's no point in buying a 100Hz monitor (the minimum I plan on getting) if you can't maintain at least 100 fps most of the time. Incidentally, this will be run from an i5-7600K/GTX 1080 PC (wish I'd gone for the 1080Ti!).
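
For what it's worth, here's the naive back-of-the-envelope sum I'm working from, assuming the worst case where the game is entirely GPU-bound and frame rate scales inversely with the number of pixels rendered (which I gather is pessimistic, since some of the load is on the CPU and doesn't change with resolution):

```python
# Worst-case estimate: frame rate assumed to scale inversely with pixel count.
old_pixels = 1280 * 1024   # current 19" 5:4 monitor
new_pixels = 3440 * 1440   # prospective 34" ultra-wide

ratio = new_pixels / old_pixels
print(f"Pixel count increase: {ratio:.2f}x")          # ~3.78x

current_fps = 250          # current Assetto Corsa benchmark figure
estimated_fps = current_fps / ratio
print(f"Worst-case estimate at 3440 x 1440: ~{estimated_fps:.0f} fps")  # ~66 fps
```

If that inverse scaling really held, I'd be looking at something like 66 fps, well short of my 120 fps target, which is exactly why I'm asking whether it actually works out that linearly in practice.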