Will you be Buying an RTX 3080?

Thanks for chiming in, RasmusP, very informative as always :thumbsup:

Since it's a 2 x 8-pin card, it's pretty much always bouncing off the 350 W power limit, whether it's running at stock clocks or OC'ed. Only a couple of watts' difference, really.

But after reading your post I did some more tinkering, and I think I found the card's sweet spot: 1890 MHz at 875 mV.
With the boost curve flattened at that point, the core frequency stays pretty much locked at 1890 under 98-99% load without downclocking, whereas before it was constantly fluctuating.
It's still flirting with the power limit though... averaging 346 W and occasionally peaking over 350. I might have to go for something like 1860 @ 850 mV, or drop the +500 on the memory. Then again, my main goal now is achieving the highest stable average core clock possible, not lowering power consumption, since temps are good.

So far it seems stable in the Heaven benchmark (a 2-hour loop) and in TimeSpy (benchmark & stress test); now I need to test it in games to determine final stability.
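For anyone who wants to sanity-check average clocks and power while dialing in a curve like this, here's a minimal logging sketch using Nvidia's NVML Python bindings. It assumes an Nvidia card with the nvidia-ml-py package installed (`pip install nvidia-ml-py`); the one-second sample interval is an arbitrary choice, not anything from the post above.

```python
# Minimal clock/power logger for checking an undervolt curve under load.
# Assumes an Nvidia GPU and the nvidia-ml-py bindings (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

samples = []
try:
    while True:
        mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports mW
        load = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        samples.append((mhz, watts))
        print(f"{mhz} MHz  {watts:6.1f} W  {load}% load")
        time.sleep(1)  # arbitrary 1 s sample interval
except KeyboardInterrupt:
    if samples:
        avg_mhz = sum(s[0] for s in samples) / len(samples)
        avg_w = sum(s[1] for s in samples) / len(samples)
        print(f"avg over {len(samples)} samples: {avg_mhz:.0f} MHz, {avg_w:.1f} W")
finally:
    pynvml.nvmlShutdown()
```

Run it in a terminal while the benchmark loops; Ctrl+C prints the averages, which makes comparing something like 1890 @ 875 against 1860 @ 850 straightforward.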
 
I think they're launching it early to beat AMD out the door again, because AMD has a monster multi-chip-design GPU (the 7900 XT) launching this year that is potentially more powerful (on paper) and uses far less energy. But it's likely not going to launch till Q4.

Makes sense to launch early to keep mind share and market share if they know they might lose on performance in any way. Plus they can charge a fortune, as it's rumoured to be 2x the 3090?! Charge what, 3k for it?

I think the AMD card might win at gaming via brute force this time, but still lose on CUDA performance for rendering etc. because of how Nvidia's doubled-up architecture works. Last generation AMD caught up in gaming; this time I think it's going to be even more interesting.
 
However we look at it, GPU pricing and availability will get better (for now anyway).
That is all subject to change though...should other GPU-mined coins become more appealing.
We did get a good look at where we all stand with regard to customer loyalty. There is none.
It also sucks royally that, after all this time, most guys are 'happy' to pay the imaginary MSRP for almost-two-year-old technology.
It also shows how desperate gamers are.
 
Not sure where I'm going to fall on this one. I might skip the 4000 series and hold out for the one after. The price increases on GPUs have gotten completely out of hand for me.
 
Yeah, I'm running triple 1080s, plus an ultrawide at 5160x1440 on my 3090 and 4K on my 3080 Ti. I would love to have more headroom, but I'd need a very large increase in performance to make it worthwhile.
 
The 4000 series cards look like they'll be complete power hogs... though extremely fast.
We may be back to Fermi levels of hot.
The need for a new power supply will, in many cases, add to the cost of upgrading to that series.
The whole GPU market is skewed though because guys went crazy buying these things at insane prices.
Right now Intel seems to be a complete mess with ARC.
I really wanted that to be good and was seriously considering the top card... if priced well.
Sadly, they can't seem to get it together, so I am moving on.
Nvidia has quite the conundrum...
With GPU prices falling as a whole... do they dare release the 4000 series now, while they're still commanding extremely high prices for the current top-of-the-line 3000 series? That would massively decrease those cards' value rather quickly.
On the other hand.... if they wait too long, an AMD launch and the expected 'flood' of used GPUs could force them to lower the initial pricing of the new series at launch.
That is something they're probably trying to avoid.
Nvidia will more than likely delay the 4000 series with an eye fixed firmly on AMD's release timeline.
Even if these cards are ready now, don't expect to see them until roughly one month before any AMD release announcement.
If AMD is smart, they'll surprise the heck out of Nvidia with an earlier-than-expected launch, unannounced until the very last minute.
With the lower power requirements and thus lower upgrade cost, comparable performance, etc... guess which one will garner the most attention?
AMD screwed up their first opportunity to drastically cut into Nvidia's market by adopting a stupid strategy last time out.
Let us see if they'll screw it up again.
 
Funny thing about that delay...

I don't care.

I don't see any ATX 3.0 PSUs yet, and I'm still waiting for a VR headset announcement that makes me itch for a new headset.

I also won't be upset to get 3+ years out of both my Index and 2080 Ti.

When I got both of those, my expectation was that I'd be itching to replace them before I'd had them for two years.
 
That's how you should feel, I'd say, with that GPU and headset. If I'd stayed with my Rift S and 2070 Super, I would probably feel similar. But a 2070 Super and a G2 at 60 Hz feels exactly how you'd imagine it would. I want the 40 series to launch so I know what I'm going to buy once I see the proper stats and prices etc. Right now it just feels like limbo, and limbo at 60 Hz isn't great lol
 
I think it's more that if you need both A and B for either to have full value, and B hasn't even been announced, the value of A goes down dramatically.

Hence not caring yet. If Valve announced anything resembling the rumors, I would suddenly be extremely excited about the new Nvidia cards and impatient to see both.
 
The rumors for next-generation cards are always over-hyped. Maybe we'll get 2x a 3080, and maybe not. I suspect a 50% improvement is about the best people should realistically hope for in real applications. And maybe they'll be available in a month, or maybe you'll only find them scalped on eBay for a year again.

Anyway, since my Linux box's 1070 was giving out, I just upgraded it to a 3080, straight from EVGA. Quite a nice improvement!

My drive/fly sim rig is still on my 2080 Ti. Its monitor is an older 55-inch 4K OLED that's stuck at 60 Hz, and the 2080 Ti does well enough with that. But for sim racing, I think at least 120 Hz should be considered mandatory, so I'll save the 4080/4090 for the sim rig. At that point I'll also upgrade to a 120 Hz or better OLED. And who knows, maybe the performance will be high enough to give MS Flight Simulator a go.

Having each system leap-frog the other keeps me entertained. :)
 
The 40 series does appear to have a very good chance of doubling performance, and it should: they dropped the process size dramatically and increased the CUDA core count quite a bit, while also raising clock speeds and memory throughput.

There have been realistic estimates based on that.

But!!! They're also introducing a 4090 Ti, so I'm guessing you'll need to go up one tier to get that doubling. I'm betting some cards will double while others in the range gain less.

So yes doubling in some places, but probably not others.

That's my guess and I'm sticking to it until I see something that doesn't align with everything else I've read.

The downside will be needing new power supplies, plus the serious power draw of the more powerful cards.
 
The 40 series does appear to have a very good chance of doubling performance, and it should: they dropped the process size dramatically and increased the CUDA core count quite a bit, while also raising clock speeds and memory throughput.

If chiplet technology is used, then 2x is definitely doable. AMD seems poised to go that way for their top-tier product(s), at least.

And if we're comparing a 3080 to a 4090 Ti, then sure, I suppose 2x is a possibility.

All for a price, naturally.
 
I might buy a 4060 (Ti) and sell my 3080.
I'd very much like the same performance with a lower idle power draw.
The 3070 I had before drew about half as much as the 3080 when not gaming.
I do all my office stuff on my PC, play guitar via simulated amps, and record a lot of meetings with OBS.

I did some calculations, and the 3080 costs about €40 more per year.
Not a big issue, but I also notice the additional 50-80 W in the air quality of my room.
Especially when it stays on overnight...
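For reference, here's a rough sketch of that kind of calculation. The 65 W extra draw (midpoint of the 50-80 W quoted above), 8 hours/day of non-gaming use, and €0.21/kWh price are illustrative assumptions, not the poster's actual figures:

```python
# Back-of-the-envelope yearly cost of extra idle power draw.
# All three inputs are assumptions for illustration.
extra_draw_w = 65      # assumed midpoint of the quoted 50-80 W difference
hours_per_day = 8      # assumed daily office/recording use
price_per_kwh = 0.21   # assumed electricity price in EUR

extra_kwh_per_year = extra_draw_w / 1000 * hours_per_day * 365
extra_cost_eur = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year -> ~{extra_cost_eur:.0f} EUR/year")
# ~190 kWh/year -> ~40 EUR/year, in line with the ~40 EUR figure above
```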
 
Laughing at a bunch of kids frothing at the mouth over new toys?

I've not seen GPUs judged so pragmatically by idle power draw before. Interesting perspective.

My dev system is always on like you mention, and I plan to leave the 1080 Ti in it. If that card dies, I'll likely put in a 4060 or something like that: basically the least powerful card that can drive 3 monitors for work.

I don't see the point in putting my 2080 Ti into it. Seems a complete waste. My daughter will like it.
 
