AMD Radeon RX 7800 XT Review: The Lateral Pass

I don't really understand why this article is so bad, other than the fact that it told people to buy vastly overpriced 20-series GPUs when (especially at the time) RTX wasn't viable enough to get most people to care. Would you please explain why the article is so bad?
Basically, the article lied about the value of RTX at the time. Let's face it, here we are five years later and it's still not worth it. Watch the Gamers Nexus video for more elaboration; Steve made it back then, when he still had it fresh in his mind.

When that article came out, it was so bad that I swore to never read it again. I already had 30 years of PC building under my belt when that video came out. You develop an innate understanding of PC components and PC configuration from decades of experience, even if you never worked in the industry (and I did work for years at Tiger Direct). As a result, this video provoked the same reaction in me that I would have if a real media outlet tried to unironically defend the "logic" of the Flat-Earth movement.

The reaction that I and most tech experts had to this thinly-veiled commercial for nVidia was both negative and visceral. We were literally pissed off by it, which is why I vividly remember its existence five years later.

I want to reiterate, however, that none of this has anything to do with Jarred Walton. He did not write that article, and his reviews have been very well done. It's unfair that something like this should reflect poorly on him, because none of us have any control over things we don't personally do.
 

Order 66

Respectable
Apr 13, 2023
I can't think of any games where RTX is worth turning on, other than Minecraft RTX and maybe Portal RTX, just because of the nostalgia associated with Portal. Are there any other games where you think RTX is worth it?
 
Well, no, I don't think that there are any applications of RT that are worth paying hundreds more for a video card to use. I don't even think that Minecraft RTX is worth it. Minecraft RTX definitely needs RT to function properly, but I don't think that it's worth the difference in cost either. Games like Minecraft RTX, Control and Portal RTX were basically just nVidia RT commercials disguised as video games.
 

Order 66

Respectable
Apr 13, 2023
Why don't you think Minecraft RTX is worth it? It's not like other games where you can't tell that it's on. It's a night-and-day difference, on vs. off.
 

JarredWaltonGPU

Senior GPU Editor
Editor
You have to remember though, even if not you specifically, Tom's Hardware has been quite guilty of shilling for nVidia in the past. I'm sure that a lot of people (you included, for sure) remember a certain sorry excuse for tech journalism by Avram Piltch back in 2018, an "article" that was so pro-nVidia that, to this day, I believe he was paid off by nVidia to push their false narrative. He took a serious risk with the TH brand name when he did this and caused it irreparable damage. He sided with nVidia when it came to their predatory pricing tactics instead of siding with consumers. That betrayal of your core audience, consumers, has still never been forgotten and will never truly be forgiven, especially since he has never retracted the article or apologised for it. That's the work of a narcissist and people know it.
No, you're way off base. Sorry. I know Avram, I know his history, and he's still EIC for Tom's Hardware. And you have to know what EIC has come to mean: Not "knows everything" but "upper management who guides direction." That in itself was a big change from historical Tom's Hardware, but that's a different matter.

He's not the same level of PC enthusiast as some people (including me), and perhaps that's part of the issue. But the real issue is that he came from Laptop Mag and worked with other sites where the "opposing views" approach was pretty common and worked well (Tom's Guide still does them). One side takes the hard "Product X sucks" stance and the other side takes the "Product X is great" position, and they both post an op-ed on the subject. This is, literally, the whole story. It was a play for traffic, pure and simple. It wasn't because Nvidia paid anyone, but it was an attempt to get "paid" by the internet traffic gods. It failed and backfired, quite badly.

Now, to be fair there are people that would wholeheartedly agree with Avram. I know of plenty of enthusiasts that looked at the generational uplift compared to the Titan X and said, "Hell yeah, count me in!" We don't do the opposing views approach at Tom's Hardware anymore, for precisely this reason. We try to give you the overall view, and not silly shenanigans to get more traffic.

What's still funny to me is that, in retrospect, anyone who paid $1200 for an RTX 2080 Ti in 2018 made an excellent choice. Not because it was so incredibly fast, but because the unpredictability of crypto and the pandemic made the 2080 Ti an amazing pick. If you didn't like your 2080 Ti in 2021, you could sell it for more than you originally paid! Also funny is that the guy who wrote the anti-RTX 2080 Ti piece now works for AMD.

But if you disregard that op-ed piece, what did the actual Tom's Hardware review of the RTX 2080 Ti say? 4.5-stars, which is where most people stopped, but it was backed up by a lot of testing data. A generational 30% (give or take) improvement over the GTX 1080 Ti, with substantially better than Titan V and Titan X performance, plus a new feature that was very hard to pin down at the time? I can see why people were excited. I took a slightly more muted approach at PC Gamer, scoring it an 84, but a lot of people gave the 2080 Ti a 9/10 score, and there were even 10/10 ratings at places. Too high, just like a 10/10 for the 7800 XT is overblown, but mid to high 80s certainly wasn't out of the question.

Anyway, mistakes were made. People on the internet inflated those mistakes for their own reasons, like the GN videos you linked that are frankly everything that's wrong with a lot of the YouTuber stuff where personal attacks and drama are more important than reality. Also, Tom himself (Pabst) is absolutely not a shining pillar of greatness. LOL
 

JarredWaltonGPU

Senior GPU Editor
Editor
And this is the problem with the blanket condemnation of ray tracing that's happening. AMD diehards can't admit that there are some instances where it actually can be pretty impressive. Cyberpunk 2077 RT Overdrive is another example. You basically have to do DLSS or some other upscaling to get reasonable performance, and of course it's heavily Nvidia promoted, but it does look better. And back in 2018, we really weren't sure how long it would take to get more (meaningful) RTX games.

Right now, here are the games where I think DXR is worth enabling (especially on high-end Nvidia GPUs):

The Ascent
Bright Memory: Infinite
Control
Cyberpunk 2077
Deliver Us The Moon
Dying Light 2
FIST: Forged In Shadow Torch
Ghostwire: Tokyo
Hogwarts Legacy
Minecraft
Portal
Pumpkin Jack
Spider-Man: Miles Morales
Spider-Man Remastered
Watch Dogs Legion (only reflections, but it does look better)

That's fifteen games, and while it's a subjective list, also note that there are nearly 100 games now with ray tracing support. So 15% of those, give or take, seem to use ray tracing in a way that makes for at least a somewhat noticeable difference in image fidelity. That's... not a very good rate. LOL Even if we're generous, I can't say that more than 25% of DXR games use it in a meaningful way.

And good (better) graphics alone doesn't make for a better game. Cyberpunk 2077 is the perfect example of this. It was hugely hyped and I was super excited for it to come out. I did finish the game, but I was pretty disappointed overall with so much of the plot and gameplay. I'd also score it a solid 3.5-star (on my own rating scale), which is again "okay but not good, certainly not great."

Given what can now be accomplished — especially with DLSS 3.5 Ray Reconstruction — we're probably about to get a new collection of enhanced games where the DXR effects become even more noticeable, with performance hits that perhaps will be less than before. It took half a decade to get here, and DXR is by no means ubiquitous, but I suspect sometime in the next couple of years we'll start to see more truly impressive use cases where you don't end up thinking, "Okay, that's nice, but it looked fine without DXR."

Which of course is funny, because Jensen's "It just works" bit about ray tracing back in 2018 also discussed how Nvidia was able to do RT ten years before we normally would have reached the point of having "fast enough" hardware to do it. Well, maybe by kicking it off five years ago, we won't have to wait another ten years after we have hardware that's fast enough to handle full RT.
 

Order 66

Respectable
Apr 13, 2023
I don't understand how ray tracing existed before the RTX 20-series. I'm not sure how ray tracing was used in movies before RTX cards. Would someone please explain this? I have always wondered.
 

NeoMorpheus

Commendable
Jun 8, 2021
Absolutely, but the number of professionals who game, compared to the number of gamers who don't do professional work, is so small as to be statistically near zero. The people who do professional work aren't who we generally talk about here, because if your card is making you money, you're probably looking at buying whatever replaced the Quadro, not a GeForce card.

What you have to keep in mind though is that not all professional applications for content creation require CUDA. There are content creators on YouTube who do use Radeons for content creation. In fact, Graphically Challenged actually sold his RTX 4090 in favour of using his RX 7900 XTX and he uses it for professional work. He actually did two videos about it. Here's the video he did when he first made the decision:
And here's his follow-up video:
As you see in both videos, he flat-out states that he's more impressed with Radeon drivers than GeForce drivers. GeForce driver instability was one of the reasons that he switched to Radeon in the first place. He's a pretty prolific YouTuber with 111K subs and he's perfectly happy using a Radeon as his main card for gaming and content creation. This guy is a serious YouTuber whose whole channel is all about video cards and he chose to use Radeon over GeForce.

I think this tells us that the belief that one must use a GeForce card for content creation, while not necessarily a myth, has been very exaggerated. This also tells us that people who talk about "bad Radeon drivers" are in fact the ignorant fools that I always claimed them to be. This guy is a respected video card expert so it's not like I'm just picking some random video that I found. I've been subbed to him for years.
I think that you will like this video:

View: https://youtu.be/2WeDQ9FcMl0?si=5uXDXRziiPaFRGE5
 

NeoMorpheus

Commendable
Jun 8, 2021
AMD diehards can't admit that there are some instances where it actually can be pretty impressive.
I am an AMD diehard, as you said before, but that blanket statement is simply wrong. Many of us who are capable of understanding data have reached the conclusion that the performance hit is not worth the eye candy returned.
but it does look better.
In some cases, in some specific parts (puddles and mirrors galore), and don't forget: either paused or simply not moving.
The Ascent
I really love that game, and I found the excessive glare annoying. Not to mention, as stated above, everything needs to be wet so you can have RT reflections.
That's fifteen games, and while it's a subjective list,
15 games out of thousands simply doesn't justify the price of a 4090.
And good (better) graphics alone doesn't make for a better game.
Bingo. That is my biggest problem with reviewers who right away say "AMD's RT performance sucks, buy ngreedia!" Why should I believe that claim, since none of these games are any better because of RT? Except by you here, that exact point is never made, and a less technical reader will not know better and will end up wasting money on a more expensive GPU because of the misguided "RT performance".
Which of course is funny, because Jensen's "It just works" bit about ray tracing back in 2018 also discussed how Nvidia was able to do RT ten years before we normally would have reached the point of having "fast enough" hardware to do it. Well, maybe by kicking it off five years ago, we won't have to wait another ten years after we have hardware that's fast enough to handle full RT.
Since the first RTX GPU came out, I've said pretty much that: RT isn't worth the hype until at least five more GPU generations. So perhaps that "statement" will be accurate.

When I can buy a GPU that can do 4K full RT without any gimmicks (upscaling) or cheats (fake frames), and that doesn't cost a kidney, then I will care about it.

And that’s assuming that RT ends up making the game better, not just pretty.
 
Aug 30, 2023
My opinion is that, while the AMD Radeon RX 7800 XT may improve on certain hardware specifications, it does not offer a substantial increase in gaming performance over the RX 6800 XT. It may be best to wait for the next generation of GPUs to be released before deciding which graphics card to buy.
 

Deleted member 2950210

Guest
I don't know how low they could go, price-wise, but they should have released the 7900 XTX at a way lower price. And about the name: we humans will automatically (as many reviewers do) compare it with the 4090 because of that 9 in the model name.
So maybe changing the name (besides a way lower price) would have been a better approach.

It has nothing to do with that 9: people keep comparing the 4090 and 7900 XTX - and rightfully so - because each of them is the flagship card of two competing companies.

I understand that, as the AMD die hard that you are, you don't like the results of that comparison, and that's a tough pill for you to swallow, but a lower price, along with a different name, won't make it any less tough.

If AMD really want their cards to be more attractive, they actually have to make them better than the competition.

When i can buy a gpu that can do 4k full RT without any gimmicks (upscaling) , cheats (fake frames), etc and doent cost a kidney, then i will care for it.

Um, that would be... NEVER?

The 4090 can pretty much handle all of the above, but you've already said you're willing to ignore Nvidia's top-notch features in favor of your misguided notion of the "greater good".

So you sell the PC... buy an XBox Series X or PS5 and call it a day.

When it comes to cutting-edge PC gaming... you have to pay the premium. A cutting-edge GPU should be expensive...

And that’s assuming that RT ends up making the game better, not just pretty.

But making a game look prettier is ray tracing's way of making it better. What more would you have it do?
 

Elusive Ruse

Commendable
Nov 17, 2022
There are definitely instances where the XTX shines in professional workloads and can be overall on par with a 4090, e.g. DaVinci Resolve (Fusion still goes to the 4090). However, in my humble opinion, if you're making good dough with your GPU, you're better off with the 4090 in most cases.
 

NeoMorpheus

Commendable
Jun 8, 2021
It has nothing to do with that 9:
Yes it does and the comment below confirms it.
people keep comparing the 4090 and 7900 XTX - and rightfully so - because each of them is the flagship card of two competing companies.
Again, you are wrong, since again you are ignoring price points. Not everyone can (or is willing to) spend that much money, and even AMD themselves stated that the 7900 XTX competes with the 4080.

Hell, I wouldn't rest so much on that point, because it shows the disregard for their customers: charging almost 100% more yet being only around 30% faster and, in some cases, slower.

But hey, you made it clear you have money to burn, so that's nothing for you.
I understand that, as the AMD die hard that you are, you don't like the results of that comparison, and that's a tough pill for you to swallow, but a lower price, along with a different name, won't make it any less tough.
The best part is that, being the ngreedia diehard that you are, you have completely ignored how not only are they overcharging you and their loyal customers, you have fallen into the "brand religion" to the point that you are only observing what you want.
Perfect example: like it or not, the 7800 XT is faster than whatever is offered in the same price range.
But it seems that in your book, only the 4090 exists.
Crazy how people (you, in case it needs to be mentioned) act when someone else doesn't worship the same brand that they do.
If AMD really want their cards to be more attractive, they actually have to make them better than the competition.
Funny how in every single one of your comments you assume that, in everything they do, they are that far behind, and yet you conveniently ignore the price points, which are the biggest factor to many, if not all.

4090, can pretty much handle all of the above, but you 've already said you 're willing to ignore Nvidia's top notch features, in favor of your misguided notion of "greater good".
Again, you conveniently ignore the valid points that a consumer should be wary of (being locked into their hardware, hence removing options), and once again you ignore the problem of price. You might have daddy to pay for everything, but many people cannot afford a 4090, or worse, their moral compass (which yours is clearly skewed) won't let them justify such prices.
So you sell the PC... buy an XBox Series X or PS5 and call it a day.
Actually, I have a Series X plus this awesome all AMD system, very happy with both.
When it comes to cutting edge PC gaming... you have to pay the premium. A cutting edge GPU, should be expensive...
And that was spoken like a proud member of the PCMR....
But, making a game look prettier, is RayTracing's way of making it better.
If it doesn't involve an insane performance hit then it's acceptable, but even that is debatable...
 

Order 66

Respectable
Apr 13, 2023
I also have an all-AMD system. Couldn't be happier; it runs everything at 1080p ultra at 60 fps (except Starfield, because Bethesda).
 

NeoMorpheus

Commendable
Jun 8, 2021
Because it is an all-AMD system, it allows me to use ChimeraOS (Linux), and out of the box I don't have to mess with anything. Since AMD's open-source support is awesome, this thing simply works.
 

Order 66

Respectable
Apr 13, 2023
Starfield is a good game with some flaws. It desperately needs an optimization update. It also (if it gets optimized) needs a graphics update, as the graphics are good but not great.
 

JarredWaltonGPU

Senior GPU Editor
Editor
I don't understand how ray tracing existed before the RTX 20-series. I'm not sure how ray tracing was used in movies before RTX cards. Would someone please explain this? I have always wondered.
Are you serious? Because I'm not sure. But basically, all the computations involved in ray tracing can be done in other ways; they're just slower. So ray tracing hardware acceleration gives about a 10X speedup, possibly more if you want to factor in AI upscaling tech. And that's 10X relative to doing the calculations on GPU shaders, whereas the movies do all the calculations on the CPU, and they also tend to use data sets stretching into the hundreds of gigabytes, so that sort of RT still isn't really possible on consumer hardware.
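To make the point concrete that ray tracing is just math any processor can run (only much slower without dedicated hardware), here's a toy sketch of the operation at its core, a ray-sphere intersection test. The scene values at the bottom are made up purely for illustration:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    which is an ordinary quadratic a*t^2 + b*t + c = 0.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None        # hit must be in front of the ray

# A ray from the origin pointing down -z toward a unit sphere at z = -5:
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
# nearest intersection is the sphere's front face at z = -4, i.e. t = 4
```

RT cores essentially accelerate huge numbers of tests like this (against triangles and bounding boxes rather than spheres), which is why the same image can always be produced, slowly, on shaders or a CPU.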

When I can buy a GPU that can do 4K full RT without any gimmicks (upscaling) or cheats (fake frames), and that doesn't cost a kidney, then I will care about it.
Even movies are using AI-enhanced denoising algorithms and other "cheats" and "gimmicks." Assuming the upscaling looks as good as native (which Quality mode DLSS does in general, esp. at 4K), I have no problems with using it. The target will always be RT plus upscaling now, because as hardware gets faster, the RT effects will get more complex, and doing 4K native will always be "too much" for modern game engines.
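For a sense of why upscaling buys back so much performance, here's a small sketch of the internal render resolutions behind the common upscaler presets. The per-axis scale factors are the commonly documented DLSS ones (FSR uses similar ratios); treat them as reference assumptions, not guarantees for any particular game:

```python
# Per-axis render-scale factors for common upscaler presets (assumed values).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w, out_h, scale):
    """Resolution the GPU actually shades before upscaling to the output size."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in MODES.items():
    w, h = internal_resolution(3840, 2160, scale)
    share = (w * h) / (3840 * 2160)
    print(f"{name}: {w}x{h} ({share:.0%} of native 4K pixels)")
```

Quality mode at 4K, for example, shades a 2560x1440 image, roughly 44% of the native pixel count, which is where most of the RT headroom comes from.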
 

Order 66

Respectable
Apr 13, 2023
I knew that ray tracing could be done in other ways for gaming (e.g. running RTX features on a GTX card); however, I did not think it would be viable for movies, since I thought I heard somewhere that it takes several hours to render one frame of a movie. With how long a movie is and how many frames there are, I just didn't think RT would be viable for movies if it took several hours to render a single frame. (I am probably wrong, though.)
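The back-of-envelope arithmetic actually supports that intuition; studios get around it by spreading frames across render farms. All numbers in this sketch are illustrative assumptions, not real studio figures:

```python
# Why offline ray tracing is viable for film despite hours per frame.
# Every number here is an illustrative assumption.
hours_per_frame = 10            # assumed render cost of one final frame
fps = 24                        # standard cinema frame rate
runtime_minutes = 90
machines = 5000                 # assumed render-farm size

frames = fps * runtime_minutes * 60        # 129,600 frames in the film
total_hours = frames * hours_per_frame     # 1,296,000 machine-hours
wall_clock_days = total_hours / machines / 24

print(frames, total_hours, round(wall_clock_days, 1))
```

Since every frame is independent, a farm of a few thousand machines turns over a million machine-hours into a wall-clock render time on the order of days, which is manageable inside a multi-year production schedule.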
 

NeoMorpheus

Commendable
Jun 8, 2021
And I am OK with most of that, but at the same time we cannot ignore what they are trying to pull with these techniques, since I think they are not being "honest". I am not too crazy about FSR claims of a 3x increase in FPS, because you can count on reviewers only showing the total FPS count while ignoring the fact that most of those frames are now fake.

Heck, I can expect DLSS 4 or whatever to then claim 5 fake frames per real frame, and people will run with that.

That said, again, my other problem is being forced to use one brand of hardware (in this case), since we know that's how DLSS rolls. For many it doesn't matter, but to me it does.
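As a sketch of the "fake frames" arithmetic being objected to here (the ratios are hypothetical examples, not vendor figures):

```python
# Displayed FPS vs. frames actually rendered, for a frame-generation
# scheme that inserts N interpolated frames per rendered frame.
def rendered_fps(displayed_fps, generated_per_real):
    """How many frames per second the GPU truly renders."""
    return displayed_fps / (1 + generated_per_real)

print(rendered_fps(120, 1))   # 1 generated per real frame: 60 rendered
print(rendered_fps(120, 5))   # hypothetical "5 fake per real": 20 rendered
```

The complaint, in other words, is that a headline "120 FPS" can correspond to very different amounts of real rendering (and real input latency) depending on the generation ratio.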
 

Deleted member 2950210

Guest
Again, you are wrong, since again you are ignoring price points. Not everyone can (or is willing to) spend that much money, and even AMD themselves stated that the 7900 XTX competes with the 4080.

It's not just about price points: it's about which card sits on top of each tier.

A flagship card is the top card of a company's new generation of GPUs.

Regardless of what that company thinks, or would love to do, consumers will pick the top dog out of each company and compare it with each other.

The winner, pretty much determines the consumers' perception of each company.

It may be a bit unfair, but that's exactly what happens.

At present, like it or not, Nvidia looks like the better company. And that's mostly due to what a beast of a card the 4090 is.

No wonder AMD would very much like to compare their 7900 XTX with RTX 4080: 'cause that's pretty much the only way they can come out on top.

The alternative is partnerships, like the one with Bethesda, which will make their cards seem better than Nvidia's, only for 4K to kick in and remind us who the true leader really is. That's right: even on its own turf, AMD can't beat Nvidia, 'cause at 4K the 4090 beats the 7900 XTX in Starfield, even if only marginally.

Hell, I wouldn't rest so much on that point, because it shows the disregard for their customers: charging almost 100% more yet being only around 30% faster and, in some cases, slower.

The 4090 is better than the 7900 XTX in RT performance, upscaling and 4K gaming. No wonder it's so expensive: it's a much better GPU. Do I like these prices? Of course not. But Nvidia is never gonna lower them, so long as there's no competition.

Also, you have to take into account the fact that, compared to its previous-generation counterpart (the 3090), the RTX 4090 is a 60-80% improvement. That hasn't been seen since... when?

The performance gain truly is massive.
Usually, generational upgrades are in the 30% range.

Look at it this way: whoever buys a 4090 pays $1,650 for the best GPU on the market, gets amazing 4K Ultra performance... and has no GPU worries for the next 4-5 years at a minimum.

It's honestly the biggest reason why one should upgrade.

Also: the 4090 purchase has nothing to do with framerate alone. It has to do with eye candy as well. 4K looks night and day better than 1080p... and quite honestly I'm surprised more people don't game at that resolution.

But hey, you made it clear you have money to burn, so that's nothing for you.

Actually, I said the exact opposite: I said I'm far from being a rich guy, adding that it's probably the first time in 24 years that I was able to afford a card like the 4090. It is exactly because I don't have much money that I bought the 4090: if paying for overpriced GPUs is inevitable in today's world, I'd rather buy the top dog right from the start, so I don't have to be back on the market every time a new game release renders my mid-tier card obsolete.

The best part is that, being the ngreedia diehard that you are, you have completely ignored not only that they are overcharging you and their loyal customers, but that you have fallen into "brand religion" to the point of only seeing what you want to see.
Perfect example: like it or not, the 7800 XT is faster than anything else offered in the same price range.
But it seems that in your book, only the 4090 exists.
Crazy how people (you, in case it needs to be mentioned) act when someone else doesn't worship the same brand that I do.

Hate to break it to ya buddy, but just because you admitted you're a fanboy doesn't mean everybody thinks the same way you do. If I were a fanboy, I wouldn't dare to admit that, even though the 4090 is an out-of-this-world upgrade, the rest of Nvidia's 4000 series is complete garbage!

Funny how in every single one of your comments you assume that in everything they do, they are that far behind, and yet you conveniently ignore the price points, which are the biggest factor to many, if not all.

Yes, I am ignoring price points, and I have already explained why: 'cause it's not just the price, it's what you get for it.

As long as I see people like you mentioning prices like it's the only thing that matters, you better believe I will post from the other side of the fence, pointing to the quality/performance factor. Looking at prices alone can lead to some terrible decisions.

Again, you conveniently ignore the valid points a consumer should be wary of (being locked into their hardware, hence removing options), and once again you ignore the problem of price. You might have daddy to pay for everything, but many people cannot afford a 4090, or worse, their moral compass (yours is clearly skewed) won't let them justify such prices.

As far as personal attacks are concerned, you could have done a lot better than that, 'cause I'm currently working two jobs.
 
Last edited by a moderator:
Jan 4, 2023
Yes, I am ignoring price points, and I have already explained why: 'cause it's not just the price, it's what you get for it. ...Looking at prices alone can lead to some terrible decisions.
You ignore price points and then incorrectly accuse him of ignoring performance and only looking at price??

Why are you even commenting on a mid-tier graphics card article if you only care about the 4090?
 

Phaaze88

Titan
Ambassador
These review articles can be more entertaining than some TV soaps...
Need more popcorn...



Look at it this way: whoever buys 4090, pays $1650 for the best GPU on the market, that gives amazing 4K Ultra performance... and has no GPU worries for the next 4-5 years at a minimum.
Hmm, AMD/Nvidia/Intel won't like that. I'm sure they would prefer customers doing more frequent gpu upgrades. What's the average been, every other generation? IDK.
Some folks even saw/see DLSS/FSR/XeSS as a means to stretch out their next upgrade - I mean, I was one of them. Now, if only there was a method, or technology in place to maintain the average upgrade cycle...

4K looks night and day better than 1080p... and quite honestly I'm surprised more people don't game at that resolution.
Cost - long-term cost is why 4K gaming hasn't blown up. It requires $$$ up front, as well as warranting more frequent GPU upgrades, because the game of catch-up at this resolution moves faster than at the resolutions below it.
Once enough time has passed and 4K becomes the 1080p of yesteryear, many more will be on it, and by then, 16K or whatever will be the new 4K, and the cycle continues.
 