
Configuring New PC

Discussion in 'Computech' started by arturo7, Aug 16, 2019.

  1. demetri

    demetri
    Premium

    Messages:
    286
    Ratings:
    +133
    And a 1070 Ti is clearly not enough for 3x 144Hz screens, even at 1080p. That's 75% of 4K resolution, mind you. You need at least 2080 or 2070 Super level performance to get high FPS on three screens.
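
    Quick sanity check on that 75% figure, in Python (a throwaway sketch; the numbers are just the standard 1080p and 4K resolutions):

        triple_1080p = 3 * 1920 * 1080   # three FHD panels side by side
        single_4k = 3840 * 2160          # one UHD panel
        print(triple_1080p, single_4k, triple_1080p / single_4k)  # 6220800 8294400 0.75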
     
    • Agree Agree x 1
  2. Andrew_WOT

    Andrew_WOT

    Messages:
    2,332
    Ratings:
    +529
    The 2080 Ti consumed about 50W more during tests compared to the 1080 Ti. Nvidia also quoted a higher system wattage requirement for that card vs the 1080 Ti.
    Maybe things change in the bright, distant future, for better OR worse.
     
  3. RCHeliguy

    RCHeliguy
    Premium

    Messages:
    1,619
    Ratings:
    +554
    The 2080 Ti used an absolutely HUGE chip, which made yields low and increased costs dramatically.

    It's a bad scenario where everyone loses, because they couldn't get the process below 12nm in time for the release.
     
  4. Jacco van der Zaag

    Jacco van der Zaag
    Premium

    Messages:
    354
    Ratings:
    +135
    @Andrew_WOT

    That is not a sponsored video. Take a look at this sheet:
    https://docs.google.com/spreadsheets/d/1Ej9mOe5NamLldpfyFfXrqvLMZP-aG2Zi1su7QzPRNJY/edit#gid=0

    There are plenty of FreeSync monitors that can't run G-Sync at all, or only with artifacts like flickering, ghosting, stuttering and so on.

    I got into this quite a while ago, when I heard about Nvidia adopting Adaptive Sync. After that, I went out and looked at a few monitors for my triple setup. I don't own a certified monitor either; however, its sister model, the one with speakers, is certified, and mine just lacks the speakers. So it's not a real worry, since it won't have a different panel or anything like that.

    Besides that, what is the real argument for the 1000W PSU you are recommending?

    Future proofing? A good 750W unit will suffice for everything a sim PC needs.
    Overclocking? Overclocked, a 9700K will draw 30-40W during gaming. Stress tests are a different story, but still.
    https://www.tomshardware.com/reviews/intel-core-i7-9700k-9th-gen-cpu,5876-2.html
    https://www.techpowerup.com/review/intel-core-i7-9700k/16.html

    During that gaming test, the 9700K (OC) with a 1080 Ti draws 380W total, whilst the 1080 Ti alone takes 260-270W.

    An overclocked 2070 will draw roughly as much power as the 1080 Ti.
    https://www.techpowerup.com/review/asus-geforce-rtx-2070-super-strix-oc/30.html
    https://www.techpowerup.com/review/nvidia-geforce-gtx-1080-ti/28.html


    USB power draw, sound card power draw (who uses those nowadays anyway?) and stuff like keyboards, mice and SSDs won't take 100W in total.
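
    For anyone sizing a PSU, here's the back-of-the-envelope sum I'm doing, as a rough Python sketch (the GPU and CPU figures are the ones quoted above, and the 100W "everything else" allowance is my own guess, not a measurement):

        gpu_1080ti   = 270   # 1080 Ti under gaming load (W)
        cpu_9700k_oc = 40    # 9700K (OC) during gaming -- stress tests are much higher
        board_misc   = 100   # motherboard, fans, USB, audio, drives, peripherals (generous)

        load = gpu_1080ti + cpu_9700k_oc + board_misc
        print(load, 750 - load)   # ~410W load, ~340W of headroom on a 750W unit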
     
  5. Andrew_WOT

    Andrew_WOT

    Messages:
    2,332
    Ratings:
    +529
    Your point? In theory power consumption drops but in practice...
     
    • Agree Agree x 1
    • Disagree Disagree x 1
  6. Jacco van der Zaag

    Jacco van der Zaag
    Premium

    Messages:
    354
    Ratings:
    +135
    Which test? I see 20W. The power draw on Nvidia's site is 260W versus 250W. Recommended PSU: 650W versus 600W.

    https://www.nvidia.com/nl-nl/geforce/graphics-cards/rtx-2080-ti/
    https://www.nvidia.com/nl-nl/geforce/products/10series/geforce-gtx-1080-ti/
     
  7. Jacco van der Zaag

    Jacco van der Zaag
    Premium

    Messages:
    354
    Ratings:
    +135
    I'm showing you real-world results. You're not helping anyone with childish reactions about sponsored videos, and recommending things you can't really justify.

    Here are some references for newer and older GPUs and CPUs.

    [Attached images: power draw charts for newer and older GPUs and CPUs]
     
    Last edited: Aug 16, 2019
  8. demetri

    demetri
    Premium

    Messages:
    286
    Ratings:
    +133
    An overclocked (~5GHz all-core) 9700K will draw more like 60-70W during gaming, and up to 220W during torture tests (Prime95 with AVX, small FFT set) if you remove power limits in the BIOS. But a 750W PSU is enough for such a build, even if you go with a 2080 Ti.
     
    • Agree Agree x 1
  9. RobertR1

    RobertR1

    Messages:
    833
    Ratings:
    +389
    I have an 8-year-old 850W PSU. It's so old the company is out of business (OCZ). It powers a heavily OC'd 9900K with OC'd RAM and an OC'd 2080 Ti just fine (I have nothing left to OC :(). I'd expect 2019 PSUs to have less ripple, better transient response and cleaner power delivery. In my setup a 750W would be more than fine. If anything, I'd go for a high-quality PSU (e.g. Platinum-rated) over higher wattage and lower quality.

    Unless you're doing extreme overclocking (XOC), you don't need an over-the-top PSU, and if you were doing XOC, this thread wouldn't exist.

    Run RealBench with HWiNFO and you can approximate your usage.
     
    • Agree Agree x 3
    • Like Like x 1
  10. RCHeliguy

    RCHeliguy
    Premium

    Messages:
    1,619
    Ratings:
    +554
    Really?

    The audio card on my motherboard drives my 4-channel transducer amplifier.
    The second audio card is a 7.1 card. Its optical out drives my 5.1 surround system for mirroring audio in VR when I have friends over.

    Other people will run Crew Chief over their headset and use a second card to drive speakers for everything else.

    If I connect it to my high-end stereo, I use async USB to the DAC.
     
  11. RobertR1

    RobertR1

    Messages:
    833
    Ratings:
    +389
    The audio card you're using requires so little power that it's effectively a rounding error.
     
    • Agree Agree x 2
  12. Jacco van der Zaag

    Jacco van der Zaag
    Premium

    Messages:
    354
    Ratings:
    +135
    Yes, really. I don't drive a transducer amplifier. If I were running one, I would use an audio card as well. However, there are more people not running an audio card than the other way around.

    I get your point however.
     
    • Like Like x 1
  13. RCHeliguy

    RCHeliguy
    Premium

    Messages:
    1,619
    Ratings:
    +554
    I wasn't trying to suggest it was a power hog, especially when running an optical out.
     
  14. Jacco van der Zaag

    Jacco van der Zaag
    Premium

    Messages:
    354
    Ratings:
    +135
    By the way, why no Ryzen 7 or Ryzen 9?
     
  15. RCHeliguy

    RCHeliguy
    Premium

    Messages:
    1,619
    Ratings:
    +554
    To throw a wrench into this, the current USB 3.2 power specification gives a computer the potential to deliver quite a bit of power to external USB devices. My laptop has a portable secondary monitor that gets power and signal off a single USB-C connector. Not that it uses much, but that is coming.

    As an example, my Valve Index has an external power supply. Eventually VR headsets will pull power directly from the computer; at least that is the thought behind the new USB-C connector on the back of the RTX video cards that no one is using yet. And yes, I realize that the Rift and many other headsets currently run off the computer. That is something else to factor in.

    The USB 3.1 USB-C connector allows for up to 100W of power to be pulled.
    However, the VirtualLink (USB 3.1 Gen 2) connector only allows for 27W.
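
    If anyone wants the arithmetic behind those wattage figures, it's just volts times amps. A quick Python sketch below; the voltage/current pairs are the commonly quoted USB-C Power Delivery levels, so treat the exact profile list as an assumption rather than a spec quote:

        # power (W) = volts * amps for a few USB-C delivery levels
        profiles = {
            "USB-PD 5V/3A":  5 * 3,    # 15W, typical default
            "USB-PD 9V/3A":  9 * 3,    # 27W
            "USB-PD 20V/5A": 20 * 5,   # 100W, needs a 5A-rated cable
            "VirtualLink":   27,       # the 27W budget mentioned above
        }
        for name, watts in profiles.items():
            print(name, watts, "W")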
     
    Last edited: Aug 16, 2019
  16. RobertR1

    RobertR1

    Messages:
    833
    Ratings:
    +389
    Sims aren't very multi-core friendly and prefer maximum per-core performance on a couple of cores. Ryzen can't hang. If you're building a gaming/sim machine, the Intel i9 still wins, especially if you put some time into overclocking the chip. The Ryzen chips have no headroom, so while they have great IPC now, they can't keep up in raw frequency.

    If you're doing a lot of productivity work alongside gaming (code compiling, high-quality streaming, rendering, encoding), then the Ryzen chips are a better buy.
     
  17. RCHeliguy

    RCHeliguy
    Premium

    Messages:
    1,619
    Ratings:
    +554
    Incidentally, 2020 = NVidia 30XX release with 7nm
    https://wccftech.com/nvidias-ampere-gpu-launching-in-2020-will-be-based-on-samsungs-7nm-euv-process/

    Intel has started to release samples of their 10nm laptop CPUs and should have more substantial offerings in 2020. In 2021 Intel "claims" they will have a 7nm process going, but it will mostly be for their server-based graphics cards. We will see... their 10nm is only 4-5 years behind schedule.

    My question is whether the Intel server graphics cards are intended for game streaming as a service or for use in rendering farms, etc., since they are referring to them as server cards.
     
  18. Andrew_WOT

    Andrew_WOT

    Messages:
    2,332
    Ratings:
    +529
    @Jacco van der Zaag, that was a response to a different user, not sure why you quoted it.
    On FreeSync, I posted the list of Nvidia-certified monitors; those should work 100%. If your monitor is not on the list, there is a community-supported spreadsheet of what works and what doesn't.
    On my WM UHD420 (4K, 42"), which is not on the Nvidia list, FreeSync works flawlessly with an EVGA 1080 Ti.
    Sorry to hear your experience was different, but that does not mean that Nvidia's FreeSync support sucks in general; they wouldn't be certifying G-Sync Compatible monitors otherwise, or have many happy users like myself. :p
     
  19. Jacco van der Zaag

    Jacco van der Zaag
    Premium

    Messages:
    354
    Ratings:
    +135
    Can you even read? I'm saying my screen works fine, even though it's not on the supported list. However, the version with speakers is, which is why I made that choice.
     
  20. Andrew_WOT

    Andrew_WOT

    Messages:
    2,332
    Ratings:
    +529
    Perhaps not with what ROG packages in. This baby can drive high-impedance headphones without breaking a sweat.
     