AMD GPUs for better CPU-limited performance? (Hardware Unboxed video)

RasmusP

Premium
7,079
4,301
Germany
Hey fellow simracers,

I just saw this video and since CPU performance is very important for us simracers, I thought it might be interesting for you too!

Here's the video:

So I have a 10600K @ 4.9 GHz with 2x 8 GB DDR4-3466 memory and an RTX 3070.
Resolution is 3440x1440.

I mainly race in AC and rF2, plus some ACC. I have a 100 Hz G-Sync monitor and cap my FPS at 90. In theory I should cap at 97, but I'm getting CPU-related fps drops into the 80s every now and then, so I prefer to set the limit to 90.
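For anyone curious about the numbers behind capping below refresh: here's a rough sketch of the frame-time math. The "refresh minus 3 fps" rule of thumb and the function names are my own assumptions for illustration, not something from the video.

```python
# Rough frame-time math behind capping fps below refresh on a VRR display.
# The 3-fps margin is a common rule of thumb, assumed here for illustration.

def frame_time_ms(fps: float) -> float:
    """Frame-time budget in milliseconds at a given fps."""
    return 1000.0 / fps

def suggested_cap(refresh_hz: float, margin_fps: float = 3.0) -> float:
    """Cap a few fps below refresh to stay inside the VRR range."""
    return refresh_hz - margin_fps

refresh = 100.0
for cap in (suggested_cap(refresh), 90.0):
    # Extra milliseconds each frame may take before hitting the refresh limit.
    headroom = frame_time_ms(cap) - frame_time_ms(refresh)
    print(f"cap {cap:.0f} fps -> {frame_time_ms(cap):.2f} ms/frame, "
          f"{headroom:.2f} ms headroom vs {refresh:.0f} Hz")
```

So a 90 fps cap buys roughly 1.1 ms of per-frame headroom over the 100 Hz budget, versus about 0.3 ms at 97, which is why the lower cap absorbs CPU spikes better.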

In all three sims, almost all fps drops are CPU-related. The GPU load never tops out; I chose my settings to make sure it won't.

So according to that video, I could get better performance with an AMD graphics card. That's a very interesting result in my opinion!

I'm stuck with Nvidia because of my monitor, but overall that's very interesting new information!
 
I am still holding out for the 6800XT this round.
I generally tend to keep GPUs for quite some time.
I'd actually like the reference model and would even spend up to $50 over the advertised price...but not a penny more.
I don't plan to buy anything less this time out...not the 3060, 3070 or 6700.
If it happens...it happens. If it doesn't...oh well. No big deal.
I haven't really put much effort into sourcing one either.
I believe it is futile until production either stabilizes...or over time, more gamers get a card and the market reaches a lower level of desperation.
I see the 6800XT as the best GPU for my current 3600x, plus I like its build features for simracing.
Doesn't need to be the outright fastest GPU, just smooth and with enough VRAM to load maximum textures.
 
The question is, though...does this even apply to simracing titles? I have no idea how Watch Dogs: Legion or Horizon Zero Dawn load the CPU, as in: how many threads do they use, and does the driver run on different ones? Since sims usually leave a lot of threads to spare on CPUs like these, does the Nvidia driver use those spare threads, making the added overhead effectively a non-issue, since those threads would otherwise be doing nothing anyway?
 

RasmusP

The question is, though...does this even apply to simracing titles? I have no idea how Watch Dogs: Legion or Horizon Zero Dawn load the CPU, as in: how many threads do they use, and does the driver run on different ones? Since sims usually leave a lot of threads to spare on CPUs like these, does the Nvidia driver use those spare threads, making the added overhead effectively a non-issue, since those threads would otherwise be doing nothing anyway?
Indeed... No idea!

When the video got to the "That's enough graphs" part, I was like "Not really... can we have a non-multithreaded game too?".
Afaik both of these games gain fps when going from 6 to 8 cores, so they're definitely not really comparable to simracing titles...
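One quick way to tell a sim-style single-thread limit apart from a well-threaded game is to look at per-core load: one pegged core with a low average is the classic main-thread bottleneck. A minimal sketch of that check, with arbitrary 85%/50% thresholds I've picked purely for illustration (live readings could come from something like `psutil.cpu_percent(percpu=True)`, a third-party library):

```python
# Sketch: decide from per-core load percentages whether a game looks
# single-thread limited. Thresholds are assumptions, not measured values.

def looks_single_thread_limited(per_core_load, hot=85.0, cool=50.0):
    """True if one core is pegged while the average stays low,
    i.e. a classic single-thread (main/render thread) CPU limit."""
    avg = sum(per_core_load) / len(per_core_load)
    return max(per_core_load) >= hot and avg <= cool

# Typical sim-style load: one pegged thread, the rest mostly idle.
print(looks_single_thread_limited([98, 30, 25, 20, 15, 10]))  # True
# Well-threaded load, like a game that scales from 6 to 8 cores:
print(looks_single_thread_limited([80, 78, 75, 74, 70, 72]))  # False
```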
 
I like Hardware Unboxed a lot, but their handling of multithreading and CPU limits is not much better than that of the vast majority of hardware YouTube channels out there :( (as in, not very good). I don't get why it seems to be such a problem to get good data on it...

And yeah, there's definitely not much data in there. I'd say there's actually very little data behind some of the fairly bold statements they're making. But I also admit I somewhat skimmed through the video since I wasn't really in the mood, so maybe I missed something.
 

pattikins

Premium
1,263
1,533
It's purely subjective and based on flaky memory, but I recall a drop in performance in Dirt Rally (VR, Oculus Rift CV1) when I went from a Radeon Vega 56 to a GTX 1080 Ti on a system with an i7 3770 (non-'K', 3.7 GHz boost). The Vega 56 was far from perfect, but it was much better in VR than my previous cards (R9 Nano & 390). Based on the reviews I'd read, I was expecting a vast graphical improvement and a smooth 90 fps from the Nvidia card. However, those reviews were based on high-end CPUs. With my old Intel setup, I ended up turning down the graphics settings and still didn't get my desired 90 fps. It wasn't until I upgraded the system to a Ryzen 3600X that I got a decent frame rate.

I've found some old AC benchmarks that I ran with the i7 3770 and the R9 390/GTX 1080 Ti, both churning out 7+ megapixels, but with different setups and AC versions:

2016 Triple monitor set up
AC VERSION: 1.8.1 (x64)
POINTS: 14034
FPS: AVG=95 MIN=23 MAX=130 VARIANCE=0 CPU=62%

LOADING TIME: 38s
GPU: AMD Radeon (TM) R9 390 Series (6576x1080)
OS-Version: 6.2.9200 () 0x300-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:2X AF:2X SHDW:2048 BLUR:0
WORLD DETAIL: 3 SMOKE:3
PP: QLT:3 HDR:1 FXAA:0 GLR:3 DOF:3 RAYS:0 HEAT:0

2019
AC VERSION: 1.16.3 (x64)
POINTS: 12685
FPS: AVG=86 MIN=5 MAX=126 VARIANCE=0 CPU=55%

LOADING TIME: 33s
GPU: NVIDIA GeForce GTX 1080 Ti (3620x2036)
OS-Version: 6.2.9200 () 0x300-0x1
CPU CORES: 8
FULLSCREEN: ON
AA:4X AF:16X SHDW:4096 BLUR:12
WORLD DETAIL: 5 SMOKE:4
PP: QLT:4 HDR:1 FXAA:1 GLR:4 DOF:4 RAYS:1 HEAT:1
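Out of interest, the "FPS:" lines from runs like these are easy to compare programmatically. A small hypothetical helper, assuming the exact line format shown above (nothing here is an official AC tool):

```python
import re

# Pull the numbers out of an AC benchmark "FPS:" line so two runs
# can be diffed in code instead of by eye.
FPS_LINE = re.compile(
    r"FPS: AVG=(?P<avg>\d+) MIN=(?P<min>\d+) MAX=(?P<max>\d+) "
    r"VARIANCE=(?P<var>\d+) CPU=(?P<cpu>\d+)%"
)

def parse_fps_line(line: str) -> dict:
    m = FPS_LINE.search(line)
    if m is None:
        raise ValueError(f"not an FPS line: {line!r}")
    return {k: int(v) for k, v in m.groupdict().items()}

run_2016 = parse_fps_line("FPS: AVG=95 MIN=23 MAX=130 VARIANCE=0 CPU=62%")
run_2019 = parse_fps_line("FPS: AVG=86 MIN=5 MAX=126 VARIANCE=0 CPU=55%")
print(run_2016["avg"] - run_2019["avg"])  # 9
```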
 

RasmusP

There's a new test on the German PC Games Hardware:

The conclusion is basically:
With DX11, Nvidia has lower driver overhead and thus higher performance. Example: The Witcher 3!

With DX12, the 6900 XT is sometimes faster, and sometimes the 3090 still has the edge. Under DX12, the driver can't optimize certain things anymore; the work the driver handles under DX11 and earlier APIs now has to be done by the game itself.
Apparently Nvidia's advantage shrinks, or AMD even takes the lead in some games.

Here's the Witcher 3 benchmark (DX11):


And this is F1 2020 (DX12):
 

TedBrosby-

Premium
458
547
If you have a Ryzen 3000 series or Intel 9000 series CPU or better, it's mostly a non-issue. Even an 8600K/8700K is probably more than enough to overcome Nvidia's driver overhead. This is mostly an issue when both a low resolution and a lower-end CPU are paired with a very powerful GPU. Apparently Nvidia GPUs are then limited to barely above RX 580 performance, regardless of how powerful the GPU is. I don't see why anyone would pair a 3070 with a 7700K, though. Honestly, if you can get a 3070, you should upgrade your CPU too.
 
