News Nvidia expects next-gen Blackwell GPUs to be supply constrained

Status
Not open for further replies.
Are you kidding? A 24GB 7900 XTX for $939 and up vs. a 24GB 4090 for $2,000 and up at street pricing. And what corners? Back up your preposterous claim.
Do you actually think the purpose of the 7900xtx is to compete with the 4090? Because that's absolutely not the card the 7900xtx is competing with. It's competing with the 4080/4080 Super. AMD even went on record saying they weren't trying to compete with the 4090.

Don't put bias and loyalty ahead of logic. The 7900xtx is a good value relative to the 4080, so long as you don't care about ray tracing. But make no mistake, it's still overly expensive and has some drawbacks, and the rest of the AMD stack is otherwise mundane comparatively.
 
Guess hiding their greed isn't a priority anymore when stupid consumers keep buying anyway. Didn't they say this about the last 3 generations?
 
Again, back up your claim… I'm waiting…
 
Which part do you want? The part where the 7900 XTX is still a $1000 card that doesn't do ray tracing well? That it has higher power consumption than the 4080/4080 Super, that it's louder by default in its cooling, and that if you run multiple monitors the power consumption is even worse?

Do you want the other part where AMD themselves said that the 7900 XTX wasn't competing with the 4090? Because that was the original issue you took with my comment and I don't see you defending it now, so why don't we both just get past the bias and move on to something more interesting than arguing on the internet?

I just don't understand why you'd be so upset about this. The 7900 XTX has better power connectors, more RAM, and it's a better card if you don't care about RT. AMD still put a huge price on the card to compete with NVidia's offerings, and this entire gen, excluding the 4090 (price aside), is mediocre. It's not one company doing the damage.
 
Call centers and the like are already very high-turnover workplaces, so that wouldn't be a big concern. Not to mention the number that are already outsourced.

Programming will change, but not go away. Still need people to actually validate AI code. So what I expect there is programmers to be able to make more complex things in shorter amounts of time. May be some layoffs of people who don't adapt.

Order taking at restaurants is already automated at quite a few places, and they still have food prep staff. Drive-through systems have also been outsourced at many franchises already.

People will move to things that can make money in an AI society. Handmade goods and services are usually touted as something that would arise. If we could just detach healthcare from jobs in the US, a lot of small businesses could pop up overnight.

I'll respectfully disagree. Businesses will press every advantage allowed to them at the expense of employees. To say otherwise is delusional. If they can replace six skilled workers with one skilled worker plus AI, they will. Because... profits. But that's five skilled people out of work. Eventually businesses will only shoot themselves in the foot: no skilled, paying jobs means no customer base, because everyone is broke after their skills were deemed obsolete by AI in a short period of time.

The AI revolution isn't coming after just one industry, but after all of them that don't require a human presence, like a doctor's. And more people work in call centers in the USA than you imagine. Not everything is outsourced to "Bob" in Bumfawk, Akabar.

Cottage industries on places like Etsy are a pipe dream.
 

I'm not saying that companies aren't exploiting people; that is the general way capitalism works. In reference specifically to call centers, I am saying that it is an easier job to transition out of than most others. And I didn't say they were all outsourced, just that your local job market is already impacted by outsourcing. Now, they can't all take on retail or other service jobs en masse, but it also won't be an overnight transition. Companies with the proper capital can invest in AI sooner rather than later. The more hard-pressed call centers that barely make ends meet are likely to close up shop completely because they would no longer be cost-competitive.

You're mixing up skilled and unskilled labor there a little. Skilled labor is going to see a transition to people working with AI, not necessarily being replaced by it. You don't fire assets if you don't have to. If you can increase productivity by giving each employee an AI assistant, that is the smart thing to do; you can then take on more contracts/jobs. If you let go five experts who are just as good as the one you kept, they will end up working for a competitor. It doesn't always work out that way, though; some managers and executives may be short-sighted and make it happen as you describe.

Now general AI would be a different matter and could outright replace any non-physical job and potentially many physical jobs with robotics.

I'm more scared about the transportation industry in the near future with self driving trucks/cars or true rideshare AI cars. How many truck drivers are out there? How many forklift operators and warehouse personnel? Uber drivers, etc.

It's the same discussion people had with the introduction of the 'paperless' office. They found new jobs.
 
AMD didn’t cut corners; they honestly thought RDNA 3 was going to compete with the 4090 at $999, but their performance and efficiency simulations did not match the actual silicon, so they dedicated their entire driver team to optimizing RDNA 3 for three months straight to figure out why. You can say that RDNA 3 is a disappointment, but you can’t say it was on purpose, and with the reality of RDNA 3 in hand, AMD can only lower prices so much. It is unfortunate for AMD that it didn’t pan out like they genuinely thought it would, but you can’t justify the assertion that they cut corners, given the radical departure from monolithic GPU norms.
 
Could you possibly nitpick a comment any further just to make an "AMD good, NVidia bad" point?

Let's replace "cut corners" with "there are concessions to ownership" and then revise everything you're saying. You're missing the point entirely, which is that, at the end of the day, AMD didn't help anyone with their pricing, and making excuses for them while deriding NVidia looks shortsighted and bad-faith. The 7900 XTX is still a $1000 card that isn't wholly superior to the 4080. This is not even up for debate; it's factual by virtue of evidence.

You're also still asserting the 7900 XTX wasn't designed to compete with the 4080, even though AMD literally said as much. So, again, using the 4090 to compare to the 7900 XTX completely misses the point I was making about how bad pricing was across both brands, not just one.
 
Huh, so now 3090 levels of ray-tracing performance are bad. Good to know. I would guess you were one of the guys who justified spending a gazillion dollars on that card during the crypto craze, but now that AMD has gotten there, oh no, that performance isn't acceptable!
I get that Nvidia has advantages in ray tracing, but man, they use like a square foot of silicon just for that and Tensor, and holy <Mod Edit> do they make you pay for that fancy sand.
Also, stop defending <Mod Edit> companies; they won't send you freebies, nor will it increase the value of your stocks.
 

Lisa Su herself said they would have a 4090 competitor. And internal documents showed that Lisa Su was upset they didn't yet have a flagship GPU to compete toe to toe with Nvidia.

The rumor mill says RDNA 3 had a physical design flaw, with the interposer failing and corrupting data after extended use at high clock speeds. This was a flaw they could not fix without some major changes and revalidation.

That said, AMD is inferior to the 4080 in RT, upscaling, AV1, and broadcasting tech for streamers. NVIDIA is inferior in raster by about 5% (depending on resolution) and in memory offerings (meaning the NVIDIA card will age quicker).

Pick your poison. Even though I bought a 7900 XT, we really can't dismiss RT. It's no longer a gimmick with full path tracing (Portal, Minecraft, Cyberpunk, and Alan Wake 2, just to name a few).

6900 XT was $900 MSRP
6800 XT was $650 MSRP
7900 XTX is/was $1000, replacing the $900 6900 XT
7900 XT was $900, replacing the $650 6800 XT

The only reason AMD got away with this is NVIDIA's pricing. But gamers revolted against both.

And even with a $700 7900 XT, an equivalent $400 8700 XT and $500 8700 XTX are coming out in October. That's 8 months... and the RT will still be awful, equivalent to ~3070 levels.

But all this is moot because AI will suck up all production nodes.
 
Don't make assumptions or blanket accusations. I've had a 1070 since right before crypto hit the first time.

You are also misrepresenting what I'm saying because, as far as I can tell, you have an allegiance to a brand. I never said the ray tracing on the 7900 XTX wasn't at least acceptable, but at $1000 and a generation later, you've only made my point that the performance isn't particularly compelling, especially considering the 4080, a card it competes against, handily outperforms it on that front (while being slower in raster, though not by much). Hell, the 7900 XTX doesn't even outperform cards cheaper than it in ray tracing. Not everyone cares about it, and that's entirely their choice, but how about you be objective rather than impulsive in your responses?
 
I am not an AMD fanboy. Here is my history:
Nvidia GeForce4 Ti 4400, FX 5800, 6800 GT, GTX 280, GTX 470, GTX 680, GTX 980, GTX 1080. My 6900 XT was my first non-Nvidia card, and honestly I couldn’t be happier. Nvidia was good to me for two decades, but they’re not the same company today.

And no, Lisa Su said the 7900 XTX was a 4090 competitor, which is why it has the same memory amount. Their simulations pointed to the ability to efficiently clock to 3 GHz+ and a massive 50% efficiency improvement over RDNA 2. However, when real silicon testing began, they quickly found RDNA 3 did not behave how they expected. That is when AMD began positioning the 7900 XTX against the 4080. And with the 4080 at $1200 for a 16GB card with a third less RAM and 5% slower rasterization, the $1000 7900 XTX is the much better value, with future-proofing to boot.
 
I’m only saying RT is a gimmick because it requires fake-frame insertion and low-pixel-count upscaling to be playable. When full-resolution real-time path tracing is achievable, I will no longer discount RT’s worth.
 
That's great, glad to hear it: the 7900 XTX was no longer a 4090 competitor at the end of its development, so it's irrelevant what the intention was at the start. Also, you're still talking about things that are beside the point: both AMD and NVidia are at fault for maintaining the current pricing, and neither is free of blame.

I have no interest in arguing which card, between the 4080 and 7900 XTX, is the better buy. It was never my point and never will be. I have no idea how many times I have to reiterate my stance before you stop redirecting the conversation.
 
It won't pop.

Unlike crypto, it's not volatile and has a great deal of future growth.
It will pop. The dot-com bubble popped, and the Internet obviously had a great deal of future growth ahead of it before that one popped, too.

Even crypto has a future, but it's not really a good investment vehicle. Crypto was supposed to decentralize power, but it just transferred it from banks to exchanges (which are basically banks under a different name for regulatory purposes).
 
Yeah, until the AI bubble pops and they come crawling back to the gamers that supported them since the beginning. I hope gamers instead buy Intel and AMD cards to send them a message.

The problem is... AI is not a bubble!
AI is something you can actually use: for the economy, the military, anything!
Unlike virtual currency! And the company/army/country that has the most powerful AI wins! That means unlimited demand for faster and faster AI hardware, boys and girls. This situation does not go away, disappear, or diminish. It is getting bigger, more demanding, and more sought after!
 