Discussion: Is it just me, or do AMD GPUs just not feel interesting anymore?

I would say that NVIDIA's pricing and yet another disappointing release have pushed me further into AMD's court rather than away from it. I have an RX 7800 XT that has been really good so far, so I find AMD more interesting in light of this. In addition, I have recently become a convert to FSR after trying it on an older card and finding out what a valuable tool it can be for NVIDIA or AMD cards.
 
I find the copycat nature of AMD, and the lack of innovation on their GPU side, boring if I'm honest.

I take more interest in their CPU side, mainly the APUs.

It's a shame their GPU side hasn't really evolved much. I feel their hardware is greatly under-tapped because they haven't poured enough resources into software beyond their very dated hair rendering methods. They need a feature that's exclusively their own idea, not something Nvidia can copy: something built around their tech that can be used in consoles and in PC hardware, and that can scale.

As much as I love FreeSync etc., I feel AMD needs some exclusive feature set to set it apart, so that they're not relying solely on being the fastest in raster.

And truly stable drivers.

AMD users also need to stop giving AMD a free pass on buggy software. If I bought a car and the dealer told me it would be running like fine wine in three years, I'd be mad as hell.
 
I wouldn't use "interesting" for any GPU. The best word I'd use would be "intriguing", and even then only for Intel Arc GPUs.

But when it comes to Radeon vs Nvidia, both have their pros and cons, and which one to prefer depends on the user.

For gaming use, Nvidia is best: top-notch performance, solid drivers. The downside is the higher cost compared to Radeon.
Radeon GPUs do well in workstation use (and did well in cryptocurrency mining too). The downsides are higher power consumption and running hotter than a same-spec Nvidia GPU. But Radeon is cheaper than Nvidia.

So, for people on a small budget, Radeon GPUs actually offer good value, given that one then pays more for a beefier PSU and spends more on the PC's cooling, while also putting up with the driver issues (e.g. stutters).
Nvidia is a solid bet for anyone willing to pay top dollar for solid performance.

Personally, I prefer Nvidia, but I've used ATI and Radeon HD GPUs before too.
 
Definitely you :)

AMD vs Nvidia: up to the 40x0 series, pound for pound on price, raster is as good or better on AMD, while "features" such as ray tracing and DLSS are better on Nvidia.

I prefer to use raster, so the features are moot. Heat on my 7900 XT: 70°C at 320 W, and it's silent. Drivers are stable, no crashes in the year I've had the card listed in my sig. Don't believe that Nvidia are perfect either: my 3070 was running on outdated drivers for a year because there was a dual-monitor bug in the later drivers; it wouldn't idle down when the card was not loaded, the clocks were flat out.

You pay your money, you take your choice.
 
AMD's software/tech has become stale, and they have not introduced anything new in terms of advancing display/visual tech in quite a while. So I feel AMD is just playing catch-up to the leader and has not focused enough on pushing new ideas into the GPU sector.

If you look at the last 15 or more years of GPU tech, you will see that Nvidia has worked with more software developers to push visual tech in the direction they wanted, to the point that AMD has had to play catch-up to tech that Nvidia pioneered, putting AMD behind. Because Nvidia pioneered the tech, they have such a large head start that any other company, AMD and Intel, is forced to start years behind them, keeping the little guys at the bottom.

In these regards Nvidia has a monopoly, keeping them in the lead and ensuring their dominance in the sector. What tech has pushed AMD, and now Intel, to play catch-up? Ray tracing is the most recent; then you have G-Sync, which took AMD a year to answer with FreeSync, an open, royalty-free way of implementing the same tech. There are more, but these are some of the most recent tech disadvantages that Nvidia has forced the rest of the sector to implement. While these are good improvements, it leaves the other companies a few years behind in research and development, ensuring Nvidia's dominance.

The last big visual tech that AMD released and pioneered was the TressFX hair technology. Nvidia GPUs had issues running it because AMD was pioneering the tech at the time, so Nvidia was behind in R&D and didn't fully understand how to implement it. That was in 2013, and AMD has not pushed any boundaries since.

These are the reasons I do not find AMD interesting as a company right now, but I do like their GPUs for performance per dollar spent.
 
Further showing that AMD Radeon GPUs aren't a match for Nvidia's top end is AMD's move to abandon making high-end GPUs in the future.
Article: https://www.pcgamesn.com/amd/new-radeon-8000-series-strategy
Interview: https://www.tomshardware.com/pc-com...ck-hyunh-talks-new-strategy-for-gaming-market

Whereby anyone who wants a high-end GPU (e.g. for 4K) has only one option: Nvidia. In the 1080p and 1440p segments, AMD remains.
Intel Arc is also focusing more on the 1080p segment (the highest market share is there), and maybe the 1440p segment as well.

TH's GPU hierarchy: https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

So, moving forward:
The 1080p segment would have a nice selection between Nvidia, Radeon and Intel.
The 1440p segment would have slimmer options, mostly between Nvidia and Radeon (given that the Intel Arc B-series doesn't do great at 1440p).
The 4K segment would only have one option: Nvidia.
 

Honestly, I think it's a pro that AMD is focusing on the middle ground, as that's where a lot of people are going to look. The 100-600 price range is where most people will shop for a GPU.

AMD is considerably smaller than Nvidia, and I feel RDNA doesn't scale well enough, which is why I figure they are skipping the high end to focus on midrange; they don't have the capacity or a large enough team. Nvidia can roll out driver updates within a week, while AMD takes more like months. Intel is also proving it can push out updates quickly, and I feel it's too early to call Battlemage a 1080p card. If the rumours floating around are true, there may be another card in the distant future as well.

It's a slow shift, but 1440p will become more the norm.

I do think AMD will focus on the high end again if the tech feels like it has a fighting chance; I think they see RDNA as a bit of a flop.
 
It's a slow shift, but 1440p will become more the norm.
I don't think 1440p will be the norm any day soon, if ever.

Currently 1080p is cheap and good enough for most people.
E.g. even I would rather have 144 FPS at 1080p than 60 FPS at 1440p, with the more expensive monitor needed to display 1440p.
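
To put some rough numbers on that trade-off (my own back-of-the-envelope sums, not from any benchmark):

```python
# Rough pixel-throughput comparison: 1080p at 144 FPS vs 1440p at 60 FPS.
# Illustrative only; real GPU load also depends on settings, upscaling, etc.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

print(pixels_per_second(1920, 1080, 144))  # 298,598,400  (~299 million pixels/s)
print(pixels_per_second(2560, 1440, 60))   # 221,184,000  (~221 million pixels/s)
```

So 1080p at 144 FPS is, if anything, the bigger GPU workload; the saving is mostly in the monitor price.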

Also, 1440p is only a PC monitor resolution. TVs are either 720p, 1080p or 4K; there are no 1440p TVs. And many people also use a TV as the display for their PC.
So, unless display manufacturers stop making 1080p displays (monitors and TVs), 1080p remains the norm.

It used to be that the norm was 768p (1024x768, 4:3 aspect ratio), but once 16:9 came, 720p became the norm. 900p was around as well, but not much. Then the norm became 1080p, which still stands as the norm today, though even now 720p lingers around.
If anything, 4K could become the new norm; it more or less already is the norm with TVs.
 
AMD are the only real competition for NVIDIA, and tbh I get completely put off NVIDIA products by their constant marketing blitzkriegs in the form of paid influencer sponsorships, which turn out to be full of lies. That, and AMD rarely release a "bad" card, though user error leads to a lot of issues for people (myself included).

I like to think of AMD vs NVIDIA the same way as Linux vs Microsoft.
AMD/Linux are less user friendly, but you normally get a nice clean slate without the bloatware.
NVIDIA/Microsoft are more "consumer friendly" with little hassle, but you are making sacrifices you didn't know were yours to make.

One thing I will admit to, though, is that AMD have hiked their prices to match NVIDIA, which is a shame as they used to be the cheaper alternative.
 
I would say that NVIDIA's pricing and yet another disappointing release have pushed me further into AMD's court rather than away from it. I have an RX 7800 XT that has been really good so far, so I find AMD more interesting in light of this. In addition, I have recently become a convert to FSR after trying it on an older card and finding out what a valuable tool it can be for NVIDIA or AMD cards.
Agreed. Nvidia's recent release has been meh at best. My RX 6800 is still plenty for my needs, but if I were in the market for a new GPU, it would be an AMD card. The sky-high pricing, pitiful VRAM amounts, and reliance on fake frames to mislead the public don't sit well with me.
 
Purely opinion

Most people don't buy x90 cards, and they don't buy x80 cards either. The x70 and x60 cards are the prevalent cards reported in surveys such as Steam's.

The x90 cards give an indication of what the x70 and x60 cards will be a few years down the line.
Developers produce AAA games that need ever stronger GPUs. They make games that require the latest and greatest to run at 4K with max details, and as such exclude the majority from experiencing their products with full eye candy.

The inclusion of "fake frames" (a horrible idea) mitigates the shortfall in performance to some extent; the reported problem is an apparent sluggishness in response to inputs. This makes me think of my old 486SX. That sluggishness disappeared when I got a P120, in the games I played at the time.

Upscaling is less problematic. In fast-moving scenes, so long as it doesn't smear, it is acceptable. In steadier scenes more effort could and should be placed on fidelity. Ideally there should be both fidelity and motion; the point of the balance is to maintain frame rates. Hardware is capable of achieving the ideal, or at least getting very close.
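
To illustrate why upscaling helps maintain frame rates, here's a minimal sketch (my own numbers, assuming render cost scales roughly with the number of shaded pixels):

```python
# Fraction of output pixels the GPU actually shades when it renders at a lower
# internal resolution and upscales the result to the display resolution.
def shaded_fraction(render_res: tuple[int, int], output_res: tuple[int, int]) -> float:
    render_w, render_h = render_res
    output_w, output_h = output_res
    return (render_w * render_h) / (output_w * output_h)

# "Quality"-style upscaling is roughly 1.5x per axis (an assumption): 1440p -> 4K.
print(round(shaded_fraction((2560, 1440), (3840, 2160)), 2))  # 0.44 -> ~56% fewer shaded pixels
```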

Target FPS

With the 486, people were looking for a solid 30 fps; on the CRT monitors of the time, that frame rate looked good. By the Pentium/Athlon era the target hit 60 fps, and the display hardware available was pretty much locked there for a long time, as that was what commodity LCD displays could present.

Question: what can your monitor display?

I don't need more than 165 Hz to saturate my displays (not a flex; any frame rate greater than that is wasted).

Twitchy FPS shooters and MMORPGs typically need a quick real refresh rate: see the bad guy and take him down, or lose. The fake frames don't help with this. Going by the Nvidia slide, a quarter of the screen's pixels are rendered, upscaled, and then interpolated three times until the next quarter arrives; you won't see the changes in between. New data is needed. Real frames, whether rasterised or ray traced, are needed to follow an unpredictable enemy.
(Hence my preference for raster and no “features” such as upscaling and fake frames)
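
A rough sketch of why the generated frames don't help responsiveness (illustrative numbers of my own; it assumes the common approach of interpolating between two real frames, which means each real frame is held back while its successor is produced):

```python
# Frame generation raises the displayed FPS, but new input only lands on real frames,
# and interpolation delays each real frame by roughly one real-frame interval.
real_fps = 60            # frames the game actually simulates and renders
generated_per_real = 3   # e.g. three interpolated frames per real frame

displayed_fps = real_fps * (1 + generated_per_real)
real_frame_ms = 1000 / real_fps   # gap between frames that contain new information
held_back_ms = real_frame_ms      # a real frame waits for the next one before it can be shown

print(displayed_fps)                           # 240 "FPS" on screen
print(round(real_frame_ms, 1))                 # 16.7 ms between genuinely new frames
print(round(real_frame_ms + held_back_ms, 1))  # ~33.3 ms rough floor before your input shows up, ignoring other latency
```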

Ray tracing is becoming usable across all newly equipped cards, but the implementation lies in the hands of the devs. Do they optimise the render path for Nvidia and throw in a token effort for AMD/Intel, or do they implement it properly for all? (Remember the occluded-object tessellation business a few years ago.) Could Windows ENFORCE a path on all manufacturers and level the playing field so that the best hardware shines?

People in the forum have been saying that AMD/Intel are copying Nvidia. Ray tracing, upscaling, fake frames, and AI implementations of the three are the examples I can think of. Assume that ray tracing gives photorealistic images (it can look really close): then what else is there to develop?
AMD and Intel have developed techniques to do much the same, and they are improving. It will soon get to the point where the results are indistinguishable whichever hardware produced them. Improvements in visual fidelity will be in the realm of diminishing returns. What follows is a race for FPS again.

A few assumptions:
1, the monitors settle at a refresh rate of 240 Hz (a future target?)
2, the GPUs are generating photorealistic images at 4K
3, the games can feed the render pipelines at 240 Hz and the GPUs can process that amount of data (rough sums below).
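
Some rough sums for assumptions 2 and 3 (my own numbers, uncompressed output, ignoring the rendering cost entirely):

```python
# Raw pixel and link throughput for 4K at 240 Hz with 10-bit-per-channel colour.
width, height, fps = 3840, 2160, 240
bits_per_pixel = 30  # 10 bits per RGB channel, no chroma subsampling (an assumption)

pixels_per_second = width * height * fps
raw_gbit_per_s = pixels_per_second * bits_per_pixel / 1e9

print(f"{pixels_per_second / 1e9:.2f} Gpixels/s")   # ~1.99 billion pixels every second
print(f"{raw_gbit_per_s:.0f} Gbit/s uncompressed")  # ~60 Gbit/s just to move the finished frames
```

That is roughly 60 Gbit/s just to carry the finished frames to the display, which only the newest display links (or ones using stream compression) manage, never mind rendering photorealism at that rate.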

If photorealism is truly achieved there are no real improvements left to make: the subtleties of detail, the near-infinities of colour and shade, the difference between the edge of a leather jacket and a woollen sweater, the diffuse shadow from indirect light contrasting with the hard edge of sunlight, the light quality (sun/incandescent/fluorescent/LED), realistic skin... all rendered in real time will be incredible.
Once the hardware can do this at the assumed refresh rate and resolution across the board, what else does a gamer want? The trio are relatively close to doing this, and once it is achieved, incremental improvements will give diminishing returns.
It is a matter of time and commercial pressure, as you will need to upgrade less often if/when some degree of stability is achieved. Nvidia has an out: AI. AMD has an out: ML. Does Intel have enough goodwill and market penetration to get into AI in a significant way?

GPUs are still interesting, but what makes them interesting is becoming commonplace. They won't be boring, but I think the best times have passed. From simply being an adapter to drive a monitor, to gaining rudimentary acceleration (Rage, 3dfx, TNT), followed by the basic GPU processing ability of transform and lighting; then 18 years of iterative advancement and, in 2018, RTX. We are now 7 years into ray tracing and it is close to usable maturity across the manufacturers.
 
2, the GPUs are generating photorealistic images at 4K
Your whole reply was an interesting read. Though I think manufacturers won't stick to 4K and will instead go higher in resolution: 8K, 16K and so forth. We can already see that today with TVs, and TVs are the forerunners of PC monitors.

Once the hardware can do this at the assumed refresh rate and resolution across the board, what else does a gamer want?
I think manufacturers (and gamers alike) would then focus on (or want) VR, since the next step away from monitors is VR. It gives a far higher degree of immersion in games than a monitor + headset does.

VR is already here, albeit not immersive enough to completely replace reality. But I think the future brings something similar to what was shown in the movie Ready Player One (2018).

Assume that ray tracing gives photorealistic images (it can look really close): then what else is there to develop?
Another thing manufacturers would most likely focus on is making the hardware's physical dimensions as small as possible. What good is a near-perfect GPU (or PC in general) when it takes up an entire room (or a warehouse)?

Power draw and thermals are another challenge. You can't have a near-perfect GPU when it pulls 5 kW on its own and outputs as much heat as a space heater.