Question Spending $2k for a RTX 3080 Ti worth it?

I am going for the MSI RTX 3080 Ti Gaming X Trio (LHR), priced at a hefty $2k. That's 20% more than the total cost of my PC's specs (see my sig below), so it feels like I could have bought a duplicate of my rig instead.

I have never spent this much on a GPU in my entire life, but I do need it for production purposes as well, and for the peace of mind of not needing an upgrade for the next 3-4 years or so.

Originally the price was around $2.4-$2.5k, but after GPU prices started going down a bit, it landed around $2k flat. I am considering buying the dip.

I don't really feel like sticking with Nvidia's 60-class GPUs (i.e. 1060, 2060, 3060), even though that's the economical choice. Their performance just doesn't compare, though a 60-class card is the backup to fall back on if the higher-tier GPU gets busted.

From what I can research:
MSRP of 3080 is $699
MSRP of 3080 Ti is $1,199, and board partners (Asus, Gigabyte, MSI, etc.) can price it even higher than that.

So I am basically paying roughly two-thirds over the Ti's MSRP.
However, the reason I feel like buying now is that the silicon shortage is predicted to last until the end of 2022 / start of 2023.
Even if the pandemic ends, there are other factors driving up demand for silicon, such as:
- The shift from traditional cars to electric/smart vehicles (cars, trucks, etc.), which rely heavily on chips
- More IoT devices
- China cutting down on raw material (silicon) production to meet environmental goals (the Chinese government is unpredictable)
- Ethereum delaying EIP-3675, the switch to proof of stake, for as long as they want

Silicon is an abundant resource, but it takes fabs to process it into usable chips. From what I can see, the bottleneck is the lack of fabs.

Silicon fabs take years to set up.
I feel like silicon's price over time is starting to behave like copper's.
It's also worth pointing out that one had better be careful with their GPU, since replacements may be harder to come by.
The whole situation will likely get worse for a long time before it gets better.
Next-generation GPU MSRPs may be even higher. (1060 MSRP: $299, 2060 MSRP: $329, 3060 MSRP: $399)
For all the reasons above, paying $2k for a GPU seems not so bad if things stay bad for the entirety of 2022, but deep down I still balk at paying something this absurdly high.
Eventually this price bubble should and will collapse, but it would be wishful thinking to hope for that anytime soon.

https://www.bloomberg.com/news/arti...surge-throws-another-price-shock-at-the-world
 
I'm seeing my total system load hit around 520-540 W with one particular game so far, and then PowerChute Personal Edition says my "UPS battery is out of juice" even though it's still plugged into the wall outlet.
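For a rough sanity check on why the UPS complains, a back-of-the-envelope comparison of the load against the UPS's real-power rating is enough (the rating in this snippet is a made-up placeholder, not my actual unit):

```python
# Rough UPS headroom check. The UPS rating below is a hypothetical example,
# not my actual unit; the point is that the watt rating (not the VA figure)
# is what a ~540 W gaming load has to fit under.
ups_va = 850            # apparent-power rating printed on the UPS (VA)
ups_watts = 510         # real-power rating (W), typically ~0.6x the VA figure
system_load_w = 540     # measured whole-system draw while gaming (W)

headroom_w = ups_watts - system_load_w
print(f"Load: {system_load_w} W vs. UPS real-power limit: {ups_watts} W")
print(f"Headroom: {headroom_w} W")
if headroom_w < 0:
    print("Overloaded: the UPS will complain even while on wall power.")
```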

Gonna have to tame this beast and reduce the power limit with MSI Afterburner. The default is 100%; I'll step down in 5% decrements and see if it crashes or stays playable.
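If anyone wants to script the stepping instead of dragging the Afterburner slider each time, a sketch like this works on cards/drivers that let nvidia-smi change the power limit (it needs admin rights, and some GeForce boards refuse it entirely):

```python
# Sketch: step the GPU power limit down in 5% decrements via nvidia-smi.
# Assumes nvidia-smi is on PATH and the driver accepts -pl on this card
# (requires admin/root; if the card refuses, stick to the Afterburner slider).
import subprocess

def query_power(field: str) -> float:
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu=power.{field}",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])

default_limit = query_power("default_limit")   # the board's "100%" limit in watts

for pct in range(100, 35, -5):                 # 100%, 95%, ... down to 40%
    target_watts = default_limit * pct / 100
    print(f"Setting power limit to {pct}% ({target_watts:.0f} W)")
    subprocess.run(["nvidia-smi", "-pl", f"{target_watts:.0f}"], check=True)
    input("Play for a bit, then press Enter to step down again...")
```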
 

Karadjgne

Titan
Ambassador
Nah, Nvidia (and AMD) will pull the classic "discontinue the previous-gen mid-high-end ahead of the next-gen mid-high-end" move, so the 3070-3090s' new 'retail' price will be "SOLD OUT" and they won't really compete against the 4070+ unless you include the used market. If we're lucky, maybe Nvidia will then bump production of the 3060 and below to burn their remaining 8nm wafers on parts that can't compete against the new stuff.
Nvidia can do a lot more than that. There's plenty of room in the market for a higher-end GTX card, like the rumors that they're going to re-release a 20-series card as a filler. There's honestly not much demand for RT cores atm; even a 1080 Ti will hold its own against a 2070 or 3060 as far as fps goes. Only a few older titles like Minecraft show any massive visual difference between RT and non-RT.

So bring on the GTX 2660...
 

InvalidError

Titan
Moderator
So bring on the GTX 2660...
The rumored "filler 20-series" cards are supposed to be 12GB RTX2060 and the VRAM alone pretty much guarantees they will be $300+ SKUs, quite possibly a lot more if Nvidia decides to align MSRP with its other SKUs' street prices. Also got to slap the 20% wafer processing price bump on top and it is a safe bet that all of the other hiccups introduced by China's rolling blackouts will cause another round of price hikes, delays, shortages, etc. across the board.
 

InvalidError

Titan
Moderator
Why? The only real reason for 12GB is 4K gaming, and a 2060 isn't enough card for that. It's passable at 1440p and good at 1080p. An 8GB card would be plenty at that level.
Probably doesn't want to put 2060s with 256-bit memory in too many people's hands. That's two more VRAM chips per board too, unless they drop to 128-bit, which would starve the GPU most of the time unless they're aiming for a hypothetical RTX 2050 (S).
 
Let me guess, New World game? I'd start at closer to 80% power limit.
Not even gonna let my fresh new card touch that game until they fix whatever's causing GPUs to die there lol.
I find myself using the 55% power limit and surprisingly my game still runs. Not sure how low I can safely go. Will post a data table here soon.
 

InvalidError

Titan
Moderator
I find myself using the 55% power limit
Not having much faith in the manufacturers, heh? :)

If GPUs' power-vs-performance curve is similar to CPUs', you get the first 80+% of performance from the first ~50% of power. If you don't need the absolute peak performance, you can save a fair amount of power.
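As a toy illustration of that kind of diminishing-returns curve (the curve shape below is invented purely to show the idea, not measured data for any real GPU):

```python
# Toy diminishing-returns model: performance as a saturating function of power.
# The shape is invented for illustration only; it is not measured GPU data.
import math

def relative_perf(power_fraction: float) -> float:
    """Hypothetical saturating curve, normalized so 100% power = 100% perf."""
    k = 2.9  # arbitrary steepness chosen so ~50% power lands near ~80% perf
    return (1 - math.exp(-k * power_fraction)) / (1 - math.exp(-k))

for pct in (40, 50, 60, 80, 100):
    print(f"{pct:3d}% power -> ~{relative_perf(pct / 100) * 100:.0f}% performance")
```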

I doubt your GPU is in any real danger with New World as long as you lowered the maximum power enough for it to register and keep those "power bug" events in check.
 
OK, so I applied this guy's voltage/frequency curve as a safety measure, to avoid the "RTX 3080 issues" from launch (the problem also seems to happen on the RTX 3090).
Article here: https://videocardz.com/newz/manufacturers-respond-to-geforce-rtx-3080-3090-crash-to-desktop-issues

But this guy explains that the problem isn't the capacitors per se, but rather the voltage fluctuation allowance given to the GPU.
View: https://www.youtube.com/watch?v=--YvuxE0xl4

*Would recommend watching the whole vid to get what he's saying.
The voltage alone is fine.
The frequency alone is fine.
It's the fluctuation of voltage and frequency past 1 volt that isn't fine, hence he recommends locking it.

I don't know if the latest Nvidia drivers fix it, but I'm definitely on the latest driver right now.
Either way, it's still a good idea to give the caps some breathing room from the noise/fluctuation of voltage and frequency.

Here is what my voltage/frequency curve looks like right now. I lowered the clock speed by 80 MHz from base.
[Screenshot: MSI Afterburner voltage/frequency curve]

So my max voltage would be 1 V. I'll play around with this later by going 25 mV lower at a time until I crash, to find my sweet spot.
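To keep the test plan straight, a throwaway helper like this just prints the voltage points I intend to walk through in the curve editor; it doesn't touch the GPU at all, and the 850 mV floor is an arbitrary stopping point picked for the example:

```python
# Throwaway helper: print the undervolt test points to walk through in
# Afterburner's curve editor. Nothing here talks to the GPU; the 850 mV
# floor is an arbitrary example value, not a recommendation.
start_mv = 1000   # current voltage cap (1 V)
step_mv = 25      # step size per test
floor_mv = 850    # lowest point worth trying in this example

for mv in range(start_mv, floor_mv - 1, -step_mv):
    print(f"Test point: {mv} mV -> note stable / crash")
```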

Anyhow, after that, here are the power limits I have been testing with MSI Afterburner. The game I tested was MechWarrior 5: Mercenaries on default graphics settings.
For some reason I can't get average FPS to show up with RivaTuner, which would be a better metric, so I just glanced at whatever min/max fps values I could find.
[Screenshot: table of power limit vs. min/max FPS results]
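As a workaround for the missing average-FPS readout, if you can export a frame-time log from your overlay tool, averaging it yourself is trivial. The sketch below assumes a hypothetical log format of one frame time in milliseconds per line, not RTSS's exact layout:

```python
# Compute average / min / max FPS from a frame-time log.
# Assumes a hypothetical format: one frame time in milliseconds per line.
import sys

def fps_stats(path: str) -> None:
    with open(path) as f:
        frame_times_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / t for t in frame_times_ms if t > 0]
    print(f"Frames: {len(fps)}")
    print(f"Average FPS: {sum(fps) / len(fps):.1f}")
    print(f"Min FPS: {min(fps):.1f}   Max FPS: {max(fps):.1f}")

if __name__ == "__main__":
    fps_stats(sys.argv[1])   # e.g. python fps_stats.py frametimes.txt
```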

Previously I said 55% was OK, but I actually feel comfortable going even lower than that.
My sweet spot would be a 40% power limit if the game doesn't present any issues; otherwise I'd raise it to 50%.
I find it unnecessary to have high, uncapped FPS in single-player games where there's no competition.

It's also worth noting that my max GPU clock is 1965 MHz, since I voltage-locked it with the voltage/frequency curve mentioned previously, as per the guy's recommendation.
Old articles say the RTX 3080 runs into issues once the clock goes past 2 GHz and the voltage fluctuates with it, harming the capacitors in the long run, so I'm heeding that recommendation as a safety measure.
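Just to double-check that the lock actually holds in-game, a quick polling loop like this (nvidia-smi assumed to be on PATH) will flag any excursion above the 2 GHz region:

```python
# Sanity check that the voltage/frequency lock is holding: poll the SM clock
# via nvidia-smi (assumed on PATH) and flag anything above 2 GHz.
import subprocess, time

CLOCK_CEILING_MHZ = 2000

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.sm",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    clock_mhz = int(out.strip().splitlines()[0])
    flag = "  <-- above 2 GHz!" if clock_mhz > CLOCK_CEILING_MHZ else ""
    print(f"SM clock: {clock_mhz} MHz{flag}")
    time.sleep(1)
```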

Maybe I'll be confident going back to the 90-95% power limit once I get a higher-wattage UPS.

I'm also quite impressed with this GPU: at idle it doesn't need to spin its fans, unlike some Corsair PSUs. That should help fan longevity compared to my old 1060, where the fans kept spinning even at idle.

 
Also, part of the reason I went with the RTX 30 series now is that the 40 series looks quite power-hungry: around 400 W, possibly even up to 600 W for the high-end model according to leaks. The RTX 40 series may require a new kind of PSU to power the GPU.

View: https://www.youtube.com/watch?v=UoTx0tqpaDg


I take what info I can find. True or not, at least there's some info rather than none to work with.
Whatever the outcome, I'll definitely look back at this thread and see if my post aged like wine :)
 
Watched the New World vid you linked. It looks like the issue is that even if you set a 100% power limit in MSI Afterburner, the GPU goes above it while playing New World, and that hits the GPU hard.

People are complaining that their GPUs are dying on the main menu.
It seems to happen on any card where GPU power usage exceeds the power limit that was set.
Some GPUs can run New World and behave as they should with respect to their power limit.
I'm inclined to think that there's something wrong with that game.
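If anyone wants to check whether their own card does this, a rough logger like the sketch below (nvidia-smi assumed to be on PATH) shows how far and how often the reported draw goes past the enforced limit. Keep in mind nvidia-smi's readout is coarse, so it only catches sustained excursions, not millisecond spikes:

```python
# Rough excursion logger: sample power draw vs. the enforced power limit via
# nvidia-smi (assumed on PATH). The readout is coarse, so this only catches
# sustained overshoot, not millisecond transients.
import subprocess, time

def sample() -> tuple[float, float]:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    draw, limit = (float(x) for x in out.strip().splitlines()[0].split(","))
    return draw, limit

worst = 0.0
while True:
    draw, limit = sample()
    over = draw - limit
    worst = max(worst, over)
    status = f"OVER by {over:.0f} W" if over > 0 else "within limit"
    print(f"draw {draw:.0f} W / limit {limit:.0f} W -> {status} "
          f"(worst so far: {worst:.0f} W)")
    time.sleep(0.5)
```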
 

InvalidError

Titan
Moderator
I'm inclined to think that there's something wrong with that game.
It isn't software developers' job to enforce the GPU's power limits; game developers should only worry about writing their games in the most hardware-agnostic manner possible, for the best backward, current, and forward compatibility, without having to integrate hardware-specific code for every GPU ever made.

Power limit excursions are the hardware developer's problem. It is their job to manage power and make potentially fatal excursions impossible.