News Nvidia vs AMD vs Intel: Last 12 Months of GPUs Are Disappointing


Sleepy_Hollowed

Distinguished
Jan 1, 2017
Good article that even touches on the direction they're probably going. Much like with cryptocurrencies, they'd better be extremely careful: there's only so much machine learning can do, so at some point, when the investment dries up, we'll be right back here.

To those who are optimistic: good on you, but I don't think the same investors who fell for cryptocurrency (these are public companies, after all) are thinking long range at all.

If these three were smart, they'd internally split their efforts to focus on the ML, gaming, and pro sectors separately, though I highly doubt they will, considering Nvidia had a massive lead to do this and decided not to, even with huge clients like Nintendo.

AMD, on the other hand, has had Sony and Microsoft as clients and, all things considered, has been a bit better for gamers, price vs. performance notwithstanding. Hopefully they don't mess up. Intel should learn from both of its competitors, but given recent moves, I'm not holding my breath.
 
I'm ready for a new system, but I'm waiting for the Nvidia 50-series GPUs when they finally appear, hopefully using PCIe 5.0.
I'm hoping for a big jump in performance, and Z790 motherboards will be cheaper by then (or superseded).
My 3080 will hang on till then. (y)
You're smoking copium if you think Ngreedia is going to get any better with the price/performance of the 5000-series GPUs. They've already made a billion dollars at a 1000% markup in their AI business (yes, 1000% markup); that's a profit margin companies can only dream of. That's Apple levels of margin. They have no interest in consumer products anymore, and why should they? There's no margin in GPU pricing. The real money is in AI.

With AMD not planning to compete with the higher end of Nvidia's product stack, the high-end Nvidia products will see a price increase in line with the performance increase so as not to undercut the existing product on the shelf. The MSRP for the 4090 is $1,600; if the 5090 is a 30% improvement over the 4090, expect its MSRP to be about $2,080.

You can do the same estimation with the rest of their product stack.
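
For anyone who wants to sanity-check that math, here is a minimal sketch of the performance-proportional pricing estimate (the 30% uplift, and the 5090 itself, are the poster's assumptions above, not confirmed specs):

# Sketch of the performance-proportional pricing estimate above.
# The 30% uplift is a hypothetical assumption, not a confirmed spec.

def projected_msrp(current_msrp: float, perf_uplift: float) -> float:
    """Scale the MSRP by the same factor as the expected performance gain."""
    return current_msrp * (1 + perf_uplift)

print(projected_msrp(1600, 0.30))  # 2080.0 -- the ~$2,080 figure above

Plug in the launch MSRP and a guessed uplift for any other tier to repeat the estimate.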
 
  • Like
Reactions: sherhi

Giroro

Splendid
I was actually impressed moving from a mobile 2060 to a 4060. Obviously it's faster, but it was the improvements in areas other than raw speed that got my attention. I can now encode AV1 video at lightning speed and with good-quality output (I never got good video encoding quality from a video card before). With DLSS 3, I get 4x the performance, on a larger 2560x1600 screen versus 1080p! I also get that speed from a very portable 14" ultrabook rather than the much larger/thicker 15" chassis that was the smallest a notebook 2060 could fit in.

The mobile 2060 has good encoders. Its HEVC/H.265 quality should be about the same as AV1 from a 40xx, plus HEVC is more widely supported at the moment.
 
Well seeing how GPU sales have fallen off a cliff, I expect the market to correct itself.

To anyone thinking "they won't change": that's not how supply and demand works. Consumers will only pay so much, and suppliers can easily lose money on SKUs that aren't moving. Suppliers will not tolerate continuously losing money, and someone is going to step in to scoop up that demand.

Also realize that products are done a year in advance, not 48 hours after an embargo is lifted. Expect AMD to adjust prices next year to match market conditions; their CEO is rather level-headed. Nvidia is anyone's guess; they risk losing market share to AMD or Intel if they rest too long on their laurels.

Halo products sell to the same folks that buy $100K sports cars and Threadripper CPUs. There is a massive market for midrange products.
 

InvalidError

Titan
Moderator
Also realize that products are done a year in advance, not 48 hours after an embargo is lifted.
While the design may get started 3-4 years ahead of launch, product tiers aren't final until the launch is announced. Branding 50-tier specs with 60-tier names carrying 70-tier price tags was entirely avoidable; the same goes for AMD and Intel to a lesser extent.
 
  • Like
Reactions: SunMaster

ilukey77

Reputable
Jan 30, 2021
If you are going to complain about 8GB, you forgot the A750 and the 8GB A770.

8GB is fine in the $180-240 range. The RX 6600 is a solid buy at $180-200.
Sure, I could mention the A380 and the 8GB A750/A770, but Intel is new to the market, and I wouldn't buy an 8GB Intel card either!

AMD and Nvidia are charging a premium for their 8GB rubbish. HUGE difference.

The 6600 is last gen, so maybe read the OP thread, as we are talking about the last 12 months!

I have a PowerColor 6650 XT Sakura Hellhound in a build. Great card, but it would still struggle in a lot of modern games!

Would the blanket statement that all 8GB cards sold in 2023 are rubbish be better?
 
I've used this analogy to explain it to some friends who don't have the desire or time to read everything about Nvidia, AMD, and Intel in the GPU space, and I think it's fitting.

This whole generation has been like walking into your favorite restaurant and asking for a fine 9oz/300g steak, medium rare (or insert whatever you like best so it works for you), and instead getting a 6oz/200g hamburger. You then say, "Yeah, no... I asked for something else," and the cook comes over and says, "Well bud, you either have that or go hungry." And there's this table full of weirdos saying, "It's fine! It still tastes like steak!"

Regards.
 
Jul 11, 2023
This whining about the lack of progress is discouraging.
If my goal is to buy a card at the 3060 Ti level and I buy a card at the 4060 Ti level, I don't see a problem.
The 4060 Ti has advantages: 1) frame generation (FG); 2) lower power consumption.
It's worth the money.
Already 42 games support FG, and future games will most likely support it too.
It's strange that people think FG acceleration costs nothing; firstly, it requires hardware support, and secondly, the technology takes significant effort to implement.
It's sad that progress is slowing down, but there is an objective reason for this: semiconductor physics.
 
  • Like
Reactions: KyaraM

InvalidError

Titan
Moderator
If my goal is to buy a card at the 3060 Ti level and I buy a card at the 4060 Ti level, I don't see a problem.
The problem is only being marginally better (slightly worse in a few cases due to the memory interface going down from 256 bits to 128 bits) for a higher MSRP. The 3060 Ti itself caught quite a bit of flak for having 4GB less VRAM than the non-Ti 3060, too. And then you have the debacle of 8GB being a tight squeeze for modern titles at a time when GDDR6 prices are less than half what they were during COVID/crypto, which makes it that much more difficult to excuse scroogey VRAM sizes at $300+ price points.

If you are happy with all of the rounded corners on the 4060 Ti, good for you. Specs-wise, though, there are some very clear regressions and failures that make it perfectly sensible to complain. If nobody complains, they'll push their corner-cutting even harder next time around.
 
  • Like
Reactions: Jagar123

Order 66

Grand Moff
Apr 13, 2023
The problem is only being marginally better (slightly worse in a few cases due to the memory interface going down from 256 bits to 128 bits) for a higher MSRP. The 3060 Ti itself caught quite a bit of flak for having 4GB less VRAM than the non-Ti 3060, too. And then you have the debacle of 8GB being a tight squeeze for modern titles at a time when GDDR6 prices are less than half what they were during COVID/crypto, which makes it that much more difficult to excuse scroogey VRAM sizes at $300+ price points.

If you are happy with all of the rounded corners on the 4060 Ti, good for you. Specs-wise, though, there are some very clear regressions and failures that make it perfectly sensible to complain. If nobody complains, they'll push their corner-cutting even harder next time around.

Exactly why I decided to go for a 6800 for my build, since it will vastly outperform the 4060 Ti 16GB while costing $70 less. I am going to complain, and I hope others do the same so that Nvidia doesn't get away with it next gen as well.
 
Jul 11, 2023
The problem is only being marginally better (slightly worse in a few cases due to the memory interface going down from 256 bits to 128 bits)
As soon as I see a mention of 128/256 bits, I immediately understand that the other person doesn't know the technical details and is just parroting the same talking points I've heard from bloggers.
The 4060/4060 Ti have a much higher chip frequency and at the same time a much larger L2 cache (32MB in the 4060 Ti versus 4MB in the 3060 Ti), and the 4000 series also has the better architecture.
I agree that the 128-bit bus may be the reason why, in some games at some resolutions (as a rule, 4K), the 4060 can lose to the 3060, but in practice those tests don't matter. It's much more important to me that the 4060 outputs 5-10 frames more than the 3060, giving us the necessary and sufficient 65 fps at 1080p, than whether it could deliver 15 fps at 4K with a 256-bit bus instead of 10 fps with a 128-bit one. Even if the 4060 or 3060 had a 512-bit bus, they would still remain at approximately the same level.
And one more thing: everyone likes to point a finger at the few games that have a problem with the bus while carefully avoiding the fact that FG (DLSS 3) is already supported in ~40 games, that their number will most likely double within a year, and that FG gives a significant boost on the order of 30-50%.
 
  • Like
Reactions: KyaraM
Jul 11, 2023
Exactly why I decided to go for a 6800 for my build, since it will vastly outperform the 4060 Ti 16GB while costing $70 less. I am going to complain, and I hope others do the same so that Nvidia doesn't get away with it next gen as well.
1) The 16GB 4060 Ti has practically no advantage over the 8GB 4060 Ti, with rare exceptions, yet it costs $100 more; perhaps it can be useful for some tasks related to neural networks.
2) The 6800/6800 XT is a card from a different price range. In our region they cost about ~$700. Even if you have the opportunity to buy such a card new somewhere on eBay, a comparison with the new 8GB 4060 Ti, which costs about $400-450 with a warranty from a local store, is not quite fair.
And of course it's impossible not to mention DLSS 3, which gives a good boost, as well as the fact that the 6800 runs hotter and requires a more powerful PSU.
 
  • Like
Reactions: KyaraM

InvalidError

Titan
Moderator
As soon as I see a mention of 128/256 bits, I immediately understand that the other person doesn't know the technical details and is just parroting the same talking points I've heard from bloggers.
The 4060/4060 Ti have a much higher chip frequency and at the same time a much larger L2 cache (32MB in the 4060 Ti versus 4MB in the 3060 Ti), and the 4000 series also has the better architecture.
Cache is no substitute for raw memory bandwidth when pushing higher-resolution textures and increased buffer sizes throughout. The shortcomings of 8GB, especially on 128 bits, will become a major sore point soon enough, and that is what the "bloggers who don't understand technical details" are complaining about.

8GB is struggling in a growing number of current-day games and will fall flat on its face in the near future, at least for people who insist on pushing high-ish details.
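
For anyone who wants the raw numbers behind the bus-width argument, a minimal sketch using the two cards' public GDDR6 specs (256-bit/14Gbps for the 3060 Ti, 128-bit/18Gbps for the 4060 Ti):

# Peak theoretical memory bandwidth:
# bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 14))  # RTX 3060 Ti: 448.0 GB/s
print(bandwidth_gb_s(128, 18))  # RTX 4060 Ti: 288.0 GB/s

The larger L2 cache only hides that ~36% deficit when the working set fits in it; workloads that spill past the cache pay the full penalty.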
 
Jul 11, 2023
Cache is no substitute for raw memory bandwidth when pushing higher-resolution textures and increased buffer sizes throughout. The shortcomings of 8GB, especially on 128 bits, will become a major sore point soon enough, and that is what the "bloggers who don't understand technical details" are complaining about.

8GB is struggling in a growing number of current-day games and will fall flat on its face in the near future, at least for people who insist on pushing high-ish details.
I have no confidence in the opinions of people who don't see the prospects of DLSS 2 + DLSS 3 and at the same time panic about the lack of 8GB of memory.
For example: Remnant 2 is a heavy game built on UE5 that will be relevant for at least the next 3-5 years; on a 4060 Ti it gives about 100 fps at 1080p ultra with FG active while consuming about 6GB of memory, probably an effect of its Nanite technology.
And in particularly severe cases you can enable DLSS 2 "Quality," which automatically reduces memory consumption, while the loss of quality is not actually noticeable.
And I have not just reviewed a lot of tests: I own a 4080 and have tested a 4060 Ti myself, and what I saw with my own eyes suggests that many bloggers at the very least don't tell the whole truth; they deliberately focus on insignificant facts while hiding important ones.
 
  • Like
Reactions: KyaraM

enewmen

Distinguished
Mar 6, 2005
The mobile 2060 has good encoders. Its HEVC/H.265 quality should be about the same as AV1 from a 40xx, plus HEVC is more widely supported at the moment.
Yes, H.265 should be similar to AV1. I've tried to encode H.265 on a 2060 and a 4060 for many hours, and I've always ended up using the (dog-slow) CPU for encoding instead (I can tell the difference, especially in fast-moving video). AV1 on the 4060 just gave me good results the first time. My 2 cents' worth.
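
For anyone wanting to reproduce that comparison, a minimal sketch driving ffmpeg's NVENC encoders from Python (the file names and the -cq 28 quality setting are placeholder assumptions; hevc_nvenc works on the mobile 2060, while av1_nvenc requires a 40-series card):

# Sketch: hardware video encoding via ffmpeg's NVENC encoders.
# "input.mp4"/"out_*.mkv" are placeholder names; tune -cq to taste.
import subprocess

def nvenc_encode(src: str, dst: str, codec: str) -> None:
    """Encode src to dst with an NVENC codec ("hevc_nvenc" or "av1_nvenc")."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-c:v", codec, "-preset", "p5", "-cq", "28",
         "-c:a", "copy", dst],
        check=True,
    )

nvenc_encode("input.mp4", "out_hevc.mkv", "hevc_nvenc")  # Turing and newer
nvenc_encode("input.mp4", "out_av1.mkv", "av1_nvenc")    # Ada (40-series) only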
 
While the design may get started 3-4 years ahead of launch, product tiers aren't final until the launch is announced. Branding 50-tier specs with 60-tier names carrying 70-tier price tags was entirely avoidable; the same goes for AMD and Intel to a lesser extent.

Well, yeah, the names they give them are settled near the end, but the actual silicon tiers and marketing plan are done well in advance. What I am referencing here is that AMD isn't going to change its naming/pricing structure immediately after Nvidia's embargo is lifted and the market responds poorly. I expect them to be able to adjust fire within a quarter or two, though, attack that market weakness, and try to snag market share.

Cache is no substitute for raw memory bandwidth when pushing higher-resolution textures and increased buffer sizes throughout. The shortcomings of 8GB, especially on 128 bits, will become a major sore point soon enough, and that is what the "bloggers who don't understand technical details" are complaining about.

8GB is struggling in a growing number of current-day games and will fall flat on its face in the near future, at least for people who insist on pushing high-ish details.

Eh, 8GB is perfectly fine for the settings you'd be using on a card with such limited memory bandwidth. The 4060 Ti 16GB demonstrated this quite well. It's too bad Nvidia decided to name the cards one tier higher than they should be.
 

InvalidError

Titan
Moderator
Eh, 8GB is perfectly fine for the settings you'd be using on a card with such limited memory bandwidth. The 4060 Ti 16GB demonstrated this quite well.
Most people don't view market segments by memory bandwidth; they view them by compute performance and then whatever may bottleneck it. The easily foreseeable future is higher texture resolutions readily blowing through 8GB, and practically everyone who knows a thing about graphics quality vs. compute effort will tell you that higher-resolution textures are the lowest-hanging fruit on the visual quality tree. Cards that lack the bandwidth and space to accommodate that will hit brick walls beyond bad launch-month ports in the near future.
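
To put rough numbers on the texture argument, a back-of-the-envelope sketch (the 400-texture scene is a made-up illustration, and real games use block compression that shrinks these figures by roughly 4-8x):

# VRAM cost of uncompressed RGBA8 textures with full mip chains.
def texture_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    """One texture plus its mip chain (mips add ~1/3 overhead), in MiB."""
    return width * height * bytes_per_texel * 4 / 3 / 1024**2

print(texture_mib(2048, 2048))  # ~21.3 MiB per 2K texture
print(texture_mib(4096, 4096))  # ~85.3 MiB per 4K texture

# A hypothetical scene keeping 400 unique 4K textures resident:
print(texture_mib(4096, 4096) * 400 / 1024)  # ~33 GiB without compression

Even with 4:1 block compression, that hypothetical scene still wants over 8GiB for textures alone, before geometry, buffers, and render targets.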
 
Jul 11, 2023
Most people don't view market segments by memory bandwidth; they view them by compute performance and then whatever may bottleneck it. The easily foreseeable future is higher texture resolutions readily blowing through 8GB, and practically everyone who knows a thing about graphics quality vs. compute effort will tell you that higher-resolution textures are the lowest-hanging fruit on the visual quality tree. Cards that lack the bandwidth and space to accommodate that will hit brick walls beyond bad launch-month ports in the near future.
In Remnant 2, memory consumption at 1080p does not exceed 6GB. Projects such as Cyberpunk 2077: Phantom Liberty, Starfield, and Immortals of Aveum will be released very soon; from them it will be possible to judge the performance requirements for video cards for at least the next 3-5 years.

I believe the problem of 8GB being insufficient is inflated by bloggers, which benefits AMD because it is their competitive advantage. But in fact, the chip has always played the crucial role in performance.
 
  • Like
Reactions: KyaraM
Jul 11, 2023
The pressure against 8GB comes largely from consoles having 16GB of shared memory for developers to play with.
Given that the console's 16GB is shared, the split will most likely fluctuate around 8GB + 8GB, which doesn't look like a problem at all for the 4060, which has those 8GB. And we must also remember that the console has to drive a 4K TV, while the 4060 is positioned as a 1080p card.
 
  • Like
Reactions: KyaraM

InvalidError

Titan
Moderator
Given that the console's 16GB is shared, the split will most likely fluctuate around 8GB + 8GB, which doesn't look like a problem at all for the 4060, which has those 8GB.
Consoles have a pared-down OS and almost no background processes, and unified memory means there is little to no data duplication between GPU memory and CPU memory like you have on a PC. Also, because every console is exactly the same, console games can be optimized much more tightly than their PC ports.
 