> The issue becomes that the basic gaming card is still $400. That's a chunk for a lot of people.
I agree, the pricing is outrageous. Ever since the 40 series was released, nVidia has been screwing over customers with horrible price segmentation.
"8 GB is dead" discourse skyrocketed in the last year, although the RTX 3070 probably got the ball rolling back in 2020.This article is late to the party by a few good years. 8GB has been obsolete for a long time now.
> Nice to see your writing again!

I sent Paul/Aaron a note (from the great beyond). I don't know if this was just outdated text based on real-world pricing (at one point the 16GB card was trending closer to $500 than $430), but these days you can find the 8GB card starting at $370, or $350 if you want to try to get Newegg's mail-in rebate. The 16GB can be had for $430, straight up from Amazon and Newegg with no rebates.
> Where is THE DISLIKE BUTTON?

I understand that 8GB of VRAM isn't enough for a lot of people, and there is a borderline absurd amount of discourse surrounding technologies like DLSS and frame gen, but I've been getting by on a laptop with a 4060, and it's been perfectly fine. Better than fine, actually. Cyberpunk and Elden Ring run phenomenally well at 1600p on max settings. TLOU2, God of War, Ghost of Tsushima. It handles everything I throw at it better than I think it will. All of the older games that I spend 90% of my gaming time playing run perfectly.

Obviously if it weren't for upscaling we would be having a different conversation, but I'm not what one would call a "pixel peeper," so it doesn't bother me at all. As long as I can get over 60 fps I'm happy. However, looking at a game like The Witcher 4 makes me think I might have some issues in the near future. We shall see.

To be frank, I think a lot of people just like complaining, or nothing is ever good enough, or they get caught up in the cycle of negativity and discourse or whatever. But for now, yes, I'd say 8GB of VRAM is passable.
> Say that you've never heard of texture quality and a lot of other graphics settings that mostly depend on VRAM, which is not expensive, but have a great impact on the quality of what you see.

Say that you've never heard of the "8GB isn't enough" claim. What happens then is that when you encounter a game that hiccups because of the VRAM limit, you do what every gamer would do, which is to lower your settings until the hiccup stops. As reported by those same tech tubers, 1080p at medium settings would be fine for the vast majority of games, including AAA games.
> This actually. It's kind of ridiculous that it can't do 1080p ultra at $400, brand new, for a GPU only... I am kind of fine with them selling a sky-high-priced HALO product, but you'd better offer something reasonably decent at 4K ultra for most current games at a reasonable price, plus a "student" kind of entry gaming card where the 60 series used to be, at something like $200 for 1080p high settings. A HALO product is 50% show-off and future-proofing, but buying a $400 card and having to live with 1080p and compromised settings is... kind of too greedy.

The issue becomes that the basic gaming card is still $400. That's a chunk for a lot of people.
> It's funny how some people say something like this. Are all monitors 1080p or 4K? Do you have to put all the settings at max when you have a 4K monitor?

Hell, 4K ultra? If that's a concern, then surely you put some money on the table to make it possible by buying a decent GPU instead.
For 1080p, though? With ray tracing and DLSS, 8GB does the job just fine unless you're trying to play Alan Wake 2, which is a beast of its own (I played it on a 2060, though).
> I wouldn't say this is a "good vs bad" option, but "bad vs worse".
> and it costs amd likely less than $20 for that 8gb to make it 16gb.

While I'm not super thrilled about it, I think AMD can be afforded quite a bit more slack in this regard, for two reasons:

1 - At this point the 9060 XT is, particularly with discounts (albeit in the form of a Newegg gift card), as low as $250.

2 - More importantly, there was a video, I think it was Hardware Unboxed, that compared the 9060 XT 8GB and the 5060 Ti 8GB and found that the AMD card suffers notably less from the VRAM limitation than the Nvidia card does.
> Not sure what the "bias" is here? That if you're spending $350 or more, 8GB is a bad investment? The problem is that $50 more gives you so much more potential over the long haul. If this was a $200~$250 card versus a $250~$300 card, you might be able to argue that the extra $50 for doubling the memory is a bit of a tough sell. Bumping the price up to $380 vs. $430, though?

They almost had it, but the bias leaked through when they mixed RT and non-RT.
The writer already had a conclusion and just wrote material to support that conclusion.
8GB is sufficient for 1080p/1440p up to High / Very High. Ultra on modern games simply has too-large texture sizes, which honestly do not belong at resolutions like 1920x1080 or 2560x1440, but do belong at 3840x2160, 7680x4320, or the various ultrawide desktop displays. 8GB is for entry-level 128-bit cards like the xx50 (which nVidia now calls xx60). No enthusiast should be buying those cards.
IMHO the 4060 Ti / 5060 Ti should never have existed as 128-bit cards and instead should have been cut-down versions of the higher-tier 192-bit cards. But nVidia keeps playing games with model numbers to justify ridiculous per-chip prices.
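For a rough sense of why it's textures, not the render resolution itself, that eat the VRAM, here is a back-of-the-envelope sketch. It assumes a single 32-bit color target per resolution (the helper name is mine; real engines keep several targets plus depth and intermediate buffers, but the point stands):

```python
# Back-of-the-envelope: one 32-bit color buffer at common resolutions.
# Assumes 4 bytes per pixel; this is a sketch, not an engine memory model.
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB per render target")
```

Even a dozen 4K-sized targets total well under half a gigabyte, so it's the texture pool, not the monitor, that actually overruns an 8GB card at Ultra.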
> Dude, the Steam survey is a HORRIBLE metric to use. It has tons of problems, not the least of which is that it relies on users reporting what they use. If you look at it now, it shows a tiny fraction of AMD cards being used, but SALES data suggest the percentage should be FAR higher than what it is on there!

If you widen the context to be representative of the wider gaming population, e.g. those in the Steam survey, the statement is no longer true.
Not sure what the "bias" is here? That if you're spending $350 or more, 8GB is a bad investment? The problem is that $50 more gives you so much more potential over the long haul. If this was a $200~$250 card versus a $250~$300 card, you might be able to argue that the extra $50 for doubling the memory is a bit of a tough sell. Bumping the price up to $380 vs. $430, though?
Nvidia shouldn't have created the 5060 Ti 8GB. Period. The regular 5060 already covers the same market for less money, and yes it's a bit slower... but if you don't need 16GB, then you sure as hell don't need the extra performance of the 5060 Ti vs the 5060.
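The arithmetic behind that "$50 more" point is easy to make concrete. A quick sketch (the helper name is mine; the $200/$250 tier is the hypothetical from the comment above, and $380/$430 are the street prices cited earlier in the thread):

```python
# Same $50 for doubling VRAM, but a very different relative premium
# depending on the card's base price.
def premium(price_8gb: float, price_16gb: float) -> str:
    delta = price_16gb - price_8gb
    return (f"${price_8gb:.0f} -> ${price_16gb:.0f}: "
            f"+${delta:.0f} ({delta / price_8gb:.0%}) to double VRAM")

print(premium(200, 250))  # hypothetical budget tier: +25%
print(premium(380, 430))  # 5060 Ti 8GB vs 16GB street prices: +13%
```

At $380 the doubled memory is a 13% premium rather than 25%, which is why the "tough sell" framing stops working at this price tier.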
Dude, the Steam survey is a HORRIBLE metric to use. It has tons of problems, not the least of which is that it relies on users reporting what they use. If you look at it now, it shows a tiny fraction of AMD cards being used, but SALES data suggest the percentage should be FAR higher than what it is on there!
> It's been a bad joke for YEARS:

It has been shown to be highly accurate as a representation of the PC gaming market as a whole.
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
It's not just for the enthusiast PC Master Race crowd who treat their gaming rigs' specs as a badge of honor or bragging rights. It's the de facto PC gaming distribution platform; any PC gamer who doesn't at least have it installed would be the definition of a statistical outlier.
Random sampling over large numbers is kind of how science works; using mathematical models you can figure out your confidence level. For Valve it's 2~3 sigma, which is 95~99% confidence, good enough for business and marketing data. Scientific data usually requires 4~6 sigma (99.994% ~ 99.99966%), which is where the myth of "Steam is unreliable" comes from.
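Those sigma-to-confidence figures are easy to check numerically. A minimal sketch, assuming a plain two-sided Gaussian interval (the industrial "Six Sigma" 99.99966% figure additionally bakes in a 1.5-sigma shift) and a made-up sample size, since Valve doesn't publish theirs:

```python
from math import erf, sqrt

# Two-sided coverage of a normal distribution within k standard deviations:
# P(|X - mu| <= k*sigma) = erf(k / sqrt(2))
for k in range(1, 7):
    print(f"{k} sigma: {erf(k / sqrt(2)):.7%} confidence")

# Margin of error for a share estimate p from a simple random sample of
# size n at k-sigma confidence: k * sqrt(p * (1 - p) / n).
n = 100_000   # hypothetical monthly sample size (assumption, not Valve's number)
p = 0.05      # e.g. a GPU model holding ~5% share
for k in (2, 3):
    print(f"{k} sigma margin at p={p:.0%}, n={n}: ±{k * sqrt(p*(1-p)/n):.3%}")
```

Even at 3 sigma the share estimate for a mid-popularity card moves by only about ±0.2 points at that sample size, so pure sampling error is tiny; the dispute above is really about selection bias in who gets surveyed, not sample size.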
It's been a bad joke for YEARS:
Why the Steam hardware survey is a joke for what people try to use it for.
1) Me, from May: You really don't want to look at the Steam numbers for the simple fact that they don't really bear any relation to the actual market. I say this based on the fact that the 2 products that hard numbers have ever been released for, the HD5700 and HD5800 cards, shortly after release...
forums.anandtech.com
Btw, quoting an article from Steam themselves doesn't really make a good example, bub.