News: Some RTX 4070s Already Discounted a Day After Launch

I'm shocked. SHOCKED!

Well, not that shocked.
--

This is good, and nVidia needs to strike a balance, because most people will weigh price against value when you keep raising prices without touching grass, which is quite ironic, since nVidia is green. HA.

Anyway, I hope nVidia takes some lessons from this and competes in good faith and doesn't try to screw its fanbase so blatantly.

Inb4 "in good faith": check their latest marketing shenanigans and how God awfully misleading they are. More than previously. No hat AMD is much better, but they do get called out rightfully for it.

Regards.
Uhm. Where do you see AMD called out for bad marketing takes but not Nvidia? Because Nvidia got ripped apart for their "3x performance" claim with the 4070 Ti, gets constantly criticized for pricing, had much more backlash over the burning power adapters they weren't really responsible for in the first place, and most people are critical of their latest RTX statistics as well. That is quite a bit of flak right there. Where is the criticism towards AMD again? For their 8K claims, for example, or their newest stunt concerning "maximum ray tracing performance" I read about just today? Or the vapor chamber fiasco? Their performance claims for the 7900 series?
 
  • Like
Reactions: Why_Me
Uhm. Where do you see AMD called out for bad marketing takes but not Nvidia? Because Nvidia got ripped apart for their "3x performance" claim with the 4070 Ti, gets constantly criticized for pricing, had much more backlash over the burning power adapters they weren't really responsible for in the first place, and most people are critical of their latest RTX statistics as well. That is quite a bit of flak right there. Where is the criticism towards AMD again? For their 8K claims, for example, or their newest stunt concerning "maximum ray tracing performance" I read about just today? Or the vapor chamber fiasco? Their performance claims for the 7900 series?
You just mentioned them? LOL.

Going back a bit further into the past, they were called out for the overhyped Vega and Fury releases as well. Look at the RX 5700 siblings' release and all the driver problems that followed, after claiming to be "super duper" competitive. Then there's Polaris' PCIe power draw stupidity.

On the nVidia side, I remember quite a few incidents as well, but I don't recall a single time they were called out for their marketing slides in particular before recent times.

Regards.
 
They want to shift as many units as possible before the summer slowdown kicks in, and they don't want to risk having to cut prices deeper if, by some miracle, AMD decides to be price competitive instead of price matching. Plus, as TH has pointed out, more powerful last-generation cards are the same price or cheaper right now.
 
Out of curiosity, does anyone here know the actual production costs of the 20-, 30-, and 40-series cards?

To make any kind of reasonable analysis of pricing, you do kind of need that. So far not a single person has provided such data. Not that it doesn't exist publicly or can't be reasonably estimated, but I'm positive at least several of the posters so far don't have a clue what it is.

The 4070 may be overpriced... maybe it isn't. Hard to tell with the missing information.
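For the curious, here's the shape the napkin math would take; a minimal sketch where every cost figure is a made-up placeholder (the very data nobody in this thread has), not an actual NVIDIA number:

```python
# Napkin math for GPU pricing. Every cost below is a HYPOTHETICAL
# placeholder, not a real figure; the point is the structure of the
# analysis, not the answer.

def gross_margin(msrp: float, bom: dict[str, float]) -> float:
    """Gross margin as a fraction of MSRP for a given bill of materials."""
    total_cost = sum(bom.values())
    return (msrp - total_cost) / msrp

# Illustrative bill of materials for a hypothetical $599 card.
hypothetical_bom = {
    "gpu_die": 120.0,       # guess: depends on wafer price, die size, yield
    "12gb_gddr6x": 70.0,    # guess
    "pcb_and_vrm": 50.0,    # guess
    "cooler_shroud": 40.0,  # guess
    "assembly_test": 30.0,  # guess
}

print(f"Gross margin: {gross_margin(599.0, hypothetical_bom):.0%}")  # ~48% with these inputs
```

Swap real numbers into that dict and the "overpriced or not" question answers itself; without them, it's all guesswork.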
 
You just mentioned them? LOL.

Going back a bit further into the past, they were called out for the overhyped Vega and Fury releases as well. Look at the RX 5700 siblings' release and all the driver problems that followed, after claiming to be "super duper" competitive. Then there's Polaris' PCIe power draw stupidity.

On the nVidia side, I remember quite a few incidents as well, but I don't recall a single time they were called out for their marketing slides in particular before recent times.

Regards.
I see almost nothing about the vapor chamber, but constant jabs about the adapters, especially on this forum. Nothing about the rest. Just because I mentioned those issues does not mean many others do, and most comments I have seen basically wrote them off. And I do very well see complaints about Nvidia's marketing, even more so lately. So I think claiming they don't get called out for BS is simply wrong.
 
  • Like
Reactions: Why_Me
Tying a resolution to a tier makes absolutely no sense to me. Graphics performance is almost infinitely tunable; almost any GPU can play at any resolution as long as you are willing to make compromises. Historically, "entry level" GPUs meant aiming for medium-high settings out of the box, at best, to get a steady 60FPS. An actual entry-level shopper wouldn't even dream of achieving Ultra-Psycho-Nightmare at 80+FPS (1440p) in launch-day titles like the RTX4070 can.

While the price-performance may suck and the 12GB of VRAM could be problematic for enthusiasts in the future, the RTX4070 isn't performing like an entry-level GPU ("got to make significant compromises just to hit 60FPS") until you hit 4K or turn RT on.
I did not come up with it; that's been the marketing, to a degree. You do know that NVIDIA themselves market tiers by resolution. The xx50 card has been an entry-level card since, like, forever. NVIDIA has positioned the xx80 cards as entry to 4K, and I think I read that the 4070 Ti is the entry to 4K now. Claiming the xx60 is a 4K card is silly. My 2060 can barely handle Hogwarts at 1440p. There are more and more videos showing the need for more VRAM than ever before to achieve playable frame rates. I can go on... but no need.
 
There are more and more videos showing the need for more VRAM than ever before to achieve playable frame rates. I can go on... but no need.
If you are a more budget-conscious person willing to lower details to keep things running smoothly, most games will let you drop the VRAM footprint down to at least 4GB. VRAM is only an issue if you refuse to compromise.

Should people have to make that sort of compromise on a new $600 GPU in launch-day titles? No. But that is a whole other debate.
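To put a rough number on how much settings move the VRAM needle, here's a minimal sketch. It only models render targets plus a texture pool, with assumed pool sizes per preset; real engines allocate considerably more (shadow maps, G-buffers, streaming caches):

```python
# Rough VRAM estimate: render targets scale with resolution, while the
# texture pool scales with the quality preset. Pool sizes are assumptions
# for illustration, not measurements from any real engine.

BYTES_PER_PIXEL = 16  # assume HDR color + depth + a couple of G-buffer planes

TEXTURE_POOL_GB = {"low": 1.5, "medium": 2.5, "high": 4.0, "ultra": 6.5}

def vram_estimate_gb(width: int, height: int, preset: str) -> float:
    render_targets_gb = width * height * BYTES_PER_PIXEL / 1024**3
    return render_targets_gb + TEXTURE_POOL_GB[preset]

for (w, h), preset in [((1920, 1080), "medium"), ((2560, 1440), "high"), ((3840, 2160), "ultra")]:
    print(f"{w}x{h} @ {preset}: ~{vram_estimate_gb(w, h, preset):.1f} GB")
```

Even with made-up pool sizes the pattern holds: the texture pool, not the resolution, dominates the footprint, which is why dropping a texture notch or two usually rescues a card that's short on VRAM.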
 
  • Like
Reactions: atomicWAR
I did not come up with it; that's been the marketing, to a degree. You do know that NVIDIA themselves market tiers by resolution. The xx50 card has been an entry-level card since, like, forever. NVIDIA has positioned the xx80 cards as entry to 4K, and I think I read that the 4070 Ti is the entry to 4K now. Claiming the xx60 is a 4K card is silly. My 2060 can barely handle Hogwarts at 1440p. There are more and more videos showing the need for more VRAM than ever before to achieve playable frame rates. I can go on... but no need.
Actually, they market the 4070 Ti as perfect for 1440p and entry to 4K, but mainly 1440p. Which is also the resolution it is best at relative to other cards.
 
  • Like
Reactions: Why_Me
There are limits on how much of an "afterthought" gaming can be before shareholders start asking questions about collapsing consumer sales. Nvidia could also find itself in somewhat of a predicament as more new players enter the AI field with dedicated chips that shed all unnecessary functions to pack the most BF16 and INT8 throughput they can in the least silicon and power possible.

While inflation may be a thing, practically every year until four years ago brought cheaper, faster chips regardless of inflation. And while the cost of GPU wafers may have gone up, the cost of support components has fallen substantially now that the supply chain is mostly back to pre-COVID normal, which should easily offset it; yet GPU prices are still going up. Someone is being exceedingly greedy and attempting to hog all the gains.

Most of the inflation we see today is thanks to market consolidation where too many economic sectors have only one or two real players controlling nearly everything.
I'm an Nvidia investor... I invested $120,000 back in October and it's now worth $265,000... I have no complaints

Nvidia is cutting production of their consumer lines because they need that time at TSMC to make A100 and H100 boards for generative AI... Elon Musk just ordered 10,000 H100 units for Twitter; that's 80,000 GH100 chips (eight GPUs per HGX H100 board)... There is already such high demand for the H100 that the model that normally goes for $35,000 is going for $40,000 - $46,000 on eBay right now

What you people fail to understand is that Nvidia booked time at TSMC well over a year ago, before anyone realized how big generative AI was going to be, so they really have no choice but to lower production of the consumer lines so they can make the much more profitable professional/server lines like the A100 and H100 .... If Nvidia lowered the margins on their consumer lines when they could be making the more profitable A100 and H100, THEN you would see pissed-off investors, mad because Nvidia was leaving so much money on the table to please a few "gamers", so quit whining
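If you want to see why, run the napkin math yourself; here's a minimal sketch using rough public die sizes and list prices, where the yields and per-die revenue shares are outright guesses on my part:

```python
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    """First-order dies-per-wafer approximation; ignores defect density."""
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

# Die areas are rough public figures; yield and revenue-per-good-die are
# GUESSES (NVIDIA only captures part of a card's retail price, and an H100
# module price includes HBM, packaging, and board costs).
chips = {
    "AD104 (RTX 4070-class)": (295, 0.80, 250),
    "GH100 (H100-class)":     (814, 0.60, 20000),
}

for name, (area, yield_rate, revenue_per_die) in chips.items():
    good_dies = dies_per_wafer(area) * yield_rate
    print(f"{name}: ~{good_dies:.0f} good dies/wafer, "
          f"~${good_dies * revenue_per_die / 1e3:,.0f}k revenue/wafer")
```

With these guesses, a wafer of GH100s grosses well over an order of magnitude more than a wafer of consumer dies, so every consumer wafer carries a massive opportunity cost right now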
 
Last edited:
  • Like
Reactions: Why_Me
I see almost nothing about the vapor chamber, but constant jabs about the adapters, especially on this forum. Nothing about the rest. Just because I mentioned those issues does not mean many others do, and most comments I have seen basically wrote them off. And I do very well see complaints about Nvidia's marketing, even more so lately. So I think claiming they don't get called out for BS is simply wrong.
Don't mix "blunders" with "marketing". They're definitely different things. Both AMD and nVidia have a long list of very public blunders to which I have no energy to get into, as I just came back from watching Sabaton in the Wembley Arena.

Plus, are you going to tell me you really believe that the nVidia crowd, having like 2 or 3 times more die hard fans (quick high level estimation based on market share and Steam survey and overall impression in public media), they won't point out every little thing from AMD when it comes to marketing slides? I remember some people quoting the foot notes of the presentations arguing they were fake. FAKE! THE NOTES! xD

But hey, it's simpler to just agree to disagree on the premise you have your own view and I have my own on the matter.

Regards.
 
I'm an Nvidia investor... I invested $120,000 back in October and it's now worth $265,000... I have no complaints

Nvidia is cutting production of their consumer lines because they need that time at TSMC to make A100 and H100 boards for generative AI... Elon Musk just ordered 10,000 H100 units for Twitter; that's 80,000 GH100 chips (eight GPUs per HGX H100 board)... There is already such high demand for the H100 that the model that normally goes for $35,000 is going for $40,000 - $46,000 on eBay right now

What you people fail to understand is that Nvidia booked time at TSMC well over a year ago, before anyone realized how big generative AI was going to be, so they really have no choice but to lower production of the consumer lines so they can make the much more profitable professional/server lines like the A100 and H100 .... If Nvidia lowered the margins on their consumer lines when they could be making the more profitable A100 and H100, THEN you would see pissed-off investors, mad because Nvidia was leaving so much money on the table to please a few "gamers", so quit whining
I don't think he was denying the economics of the consumer vs. enterprise/professional markets, nor was I (...he was partially replying to me). I think the argument gamers have is about the pricing of consumer cards and the economics gamers bring to Nvidia's overall sales. While gaming is no longer the driving force of Nvidia's profits, their gaming division is still a third of their revenue, and it's not exactly wise to piss off a full third of your clientele. Yes, we gamers get that the majority of wafers are going to go to professional-grade GPUs. BUT that doesn't mean Nvidia needs to screw over their gaming base with ridiculous pricing either. With Intel entering the market, who knows where consumer mind share will be in 10 years' time. I don't think Nvidia is ready to give up that market share so easily, but I could be wrong given the value AI brings to the table. Regardless, if they keep treating their gaming base in this fashion, Nvidia will be ripe to be rolled over by another company in the gaming market.
 
  • Like
Reactions: SunMaster
What you people fail to understand is that Nvidia booked time at TSMC well over a year ago, before anyone realized how big generative AI was going to be, so they really have no choice but to lower production of the consumer lines so they can make the much more profitable professional/server lines like the A100 and H100 ....
That would only be true if Nvidia were wafer-constrained. However, Nvidia reduced its wafer orders and TSMC is nowhere near max capacity anymore. The "scarcity" is entirely artificial at this point.
 
  • Like
Reactions: SunMaster
Piss on a market segment long enough and someone else may usurp your market share, leaving you the one struggling to get it back. If Intel decides to go for a market share grab with a $299 "B750", there will be carnage in the mid-range segment.
Intel, or anybody else for that matter, isn't allowed to do that; the FTC would be on their backs in a heartbeat.
Selling product at a price deemed too low for a competitor to match without financial harm was a big part of the famous AMD "lawsuit" against Intel.
an as efficient competitor would have had to offer prices which would not have been viable and that, accordingly, the rebate scheme at issue was capable of foreclosing such a competitor.

While gaming is no longer the driving force of Nvidia's profits, their gaming division is still a third of their revenue, and it's not exactly wise to piss off a full third of your clientele.
It's not 1/3 that gets pissed off; it's whatever the DIY percentage of that 1/3 is, which is probably a pretty small number.
A big percentage of that 1/3 is pre-built buyers who will barely notice a few hundred dollars more on a multi-thousand-dollar build.
 
I'm an Nvidia investor... I invested $120,000 back in October and it's now worth $265,000... I have no complaints

Nvidia is cutting production of their consumer lines because they need that time at TSMC to make A100 and H100 boards for generative AI... Elon Musk just ordered 10,000 H100 units for Twitter; that's 80,000 GH100 chips (eight GPUs per HGX H100 board)... There is already such high demand for the H100 that the model that normally goes for $35,000 is going for $40,000 - $46,000 on eBay right now

What you people fail to understand is that Nvidia booked time at TSMC well over a year ago, before anyone realized how big generative AI was going to be, so they really have no choice but to lower production of the consumer lines so they can make the much more profitable professional/server lines like the A100 and H100 .... If Nvidia lowered the margins on their consumer lines when they could be making the more profitable A100 and H100, THEN you would see pissed-off investors, mad because Nvidia was leaving so much money on the table to please a few "gamers", so quit whining
I appreciate your candour; this made up my mind about any potential Nvidia purchases in the future. Nothing wrong with a company deciding to listen to its investors and follow the money; however, that doesn't mean I have to follow them as well.
 
Intel, or anybody else for that matter, isn't allowed to do that; the FTC would be on their backs in a heartbeat.
Selling product at a price deemed too low for a competitor to match without financial harm was a big part of the famous AMD "lawsuit" against Intel.
Anti-dumping law only aims to prevent companies from selling products below cost for predatory reasons. As long as you can turn a profit, you can sell stuff as cheap as you want. You can still sell stuff below cost for liquidation purposes; I'm quite certain AMD wasn't making any profit on the heaps of 8GB RX 580s it had to sell off for ~$120, two-game bundle included, after the first crypto crash.

What really got Intel in trouble with AMD was unfair trade practices: paying shops (marketing allowances, freebies, bulk discounts) and blackmailing them (reducing or halting shipments, exclusivity agreements) to not sell AMD chips.
 
What really got Intel in trouble with AMD was unfair trade practices: paying shops (marketing allowances, freebies, bulk discounts) and blackmailing them (reducing or halting shipments, exclusivity agreements) to not sell AMD chips.
I linked the document for a reason, and it explains what they were charged for. In short: a company that is too big making better prices = fined without even checking.
The main part of importance is right here:
"As regards that complaint, the Court of Justice notes that the General Court confirmed the Commission’s line of argument that loyalty rebates granted by an undertaking in a dominant position were, by their very nature, capable of restricting competition such that an analysis of all the circumstances of the case and, in particular, an as efficient competitor test (‘AEC test’) were not necessary. "
 
It's not 1/3 that gets pissed off; it's whatever the DIY percentage of that 1/3 is, which is probably a pretty small number.
A big percentage of that 1/3 is pre-built buyers who will barely notice a few hundred dollars more on a multi-thousand-dollar build.

I disagree. Yes, DIY is a sliver of that 1/3, but ALL the PC gamers I know are pissed. Not just the DIY ones, but also those who buy prebuilts, or buy prebuilts and upgrade the GPU. That is literally the whole (or nearly the whole) gaming market. Yes, it's the DIY/upgrade crowd you'll hear most online, as their comfort with all things tech tends to be higher than the prebuilt weekend warriors', and thus they are more visible online. Idk, maybe the noob PC gamer might not mind or notice these price hikes, but it appears to me that almost every other gaming 'clique' has noticed and is upset about it.

I'll give it to you, though: it's tough to gauge accurately, and at the end of the day Nvidia leads the pack, so many gamers will likely go Nvidia regardless. But rubbing your consumer base the wrong way over and over is not wise. With Intel entering the GPU market, Nvidia could easily become the next 3dfx if Intel releases the right product(s) at the right price. 3dfx lost 80% of its market share almost overnight. If Nvidia is not careful, the same could happen to them. Just saying, even if it's unlikely for now. I just know I'm ready to jump ship as soon as someone can compete with Nvidia on performance/features.
 
  • Like
Reactions: Jagar123
I'll give it to you, though: it's tough to gauge accurately, and at the end of the day Nvidia leads the pack, so many gamers will likely go Nvidia regardless. But rubbing your consumer base the wrong way over and over is not wise. With Intel entering the GPU market, Nvidia could easily become the next 3dfx if Intel releases the right product(s) at the right price. 3dfx lost 80% of its market share almost overnight. If Nvidia is not careful, the same could happen to them. Just saying, even if it's unlikely for now. I just know I'm ready to jump ship as soon as someone can compete with Nvidia on performance/features.
Intel could definitely flood the market with well-priced GPUs, especially if they use their new fabs to make the GPU dies themselves... and the VRAM, but it will take them a long while to get their drivers anywhere close to the level of nvidia's, or AMD's for that matter.
Also, IF any of that happens, nvidia can just lower prices on the then-current or upcoming products and everybody would be back to buying nvidia.

It's like what many people say about intel/amd: they want AMD to do well so that intel lowers prices, letting them buy a better and cheaper intel chip rather than AMD...

For nvidia to lose something like 80% of its business, intel would have to come up with a GPU that runs RTX games at 60FPS+ at the same price points where nvidia's cards do RTX at 10-15FPS.
Just a crude example; don't get caught up in the percentages.
 
  • Like
Reactions: atomicWAR
Intel could definitely flood the market with well-priced GPUs, especially if they use their new fabs to make the GPU dies themselves... and the VRAM, but it will take them a long while to get their drivers anywhere close to the level of nvidia's, or AMD's for that matter.
Also, IF any of that happens, nvidia can just lower prices on the then-current or upcoming products and everybody would be back to buying nvidia.

It's like what many people say about intel/amd: they want AMD to do well so that intel lowers prices, letting them buy a better and cheaper intel chip rather than AMD...

For nvidia to lose something like 80% of its business, intel would have to come up with a GPU that runs RTX games at 60FPS+ at the same price points where nvidia's cards do RTX at 10-15FPS.
Just a crude example; don't get caught up in the percentages.
I agree for the most part. Yes, Nvidia can drop prices as well, but if a good product from another manufacturer hits the market and Nvidia has burned their base enough times, it won't take much to get gamers to switch sides. While it is unlikely, it's not out of the realm of possibility that Nvidia could lose its lead if they keep rubbing their base the wrong way. Intel has made great strides with their drivers when, if I'm being honest, I had expected them to quit already (seriously flabbergasted Intel is still in the market). Intel does have a long way to go, but as stated, it only takes one product to change that dynamic. When I got a Riva TNT2 card in the late '90s, I was the only one of my friends without a Voodoo card and was teased for it. By the time I dropped in my GeForce 256, most of my friends had Nvidia, and by the time I put in a GeForce 2, all of my friends had switched to Nvidia or at least stopped using 3dfx. Times can change with a quickness in the tech field, as you well know. I just think Nvidia would be wise to treat gamers better, is all. Yes, we are shrinking as a share of their income, but a third is still a large slice of the pie to mistreat (imho).
 
Anti-dumping law only aims to prevent companies from selling products below cost for predatory reasons. As long as you can turn a profit, you can sell stuff as cheap as you want. You can still sell stuff below cost for liquidation purposes; I'm quite certain AMD wasn't making any profit on the heaps of 8GB RX 580s it had to sell off for ~$120, two-game bundle included, after the first crypto crash.

What really got Intel in trouble with AMD was unfair trade practices: paying shops (marketing allowances, freebies, bulk discounts) and blackmailing them (reducing or halting shipments, exclusivity agreements) to not sell AMD chips.
Exactly the reason I am excited and terrified that Intel has entered the GPU market. If anyone can climb this mountain of mind share and performance, it's them. We need more competition in the GPU space, but I won't lie, I worry about what COULD happen should Intel ever become top dog. Talk about a catch-18... I mean 22 *wink for those in the know*.
 
For nvidia to lose something like 80% of its business, intel would have to come up with a GPU that runs RTX games at 60FPS+ at the same price points where nvidia's cards do RTX at 10-15FPS.
Just a crude example; don't get caught up in the percentages.
80% may be a tall order, but ~50% of the dGPU market (if we go by the Steam survey, at least until last month's anomaly where the RTX 3050 mysteriously surged to #1) is sub-$300 GPUs.

Battlemage is supposed to be twice as powerful as Alchemist. If Intel sticks to the same price points as Alchemist, as the slides appeared to imply, it will be offering about twice the performance per dollar that the other, uncompetitive brands' GPUs currently do. I'd say that would stand a pretty good chance of upturning the market.

Intel doesn't need a be-all, end-all product. Only something that can shatter the stalemate.
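The performance-per-dollar claim is easy to put into numbers. A minimal sketch with placeholder figures (an Alchemist-class card doubled at the same price versus a $599 incumbent); these are illustrations, not benchmarks:

```python
# Performance per dollar with HYPOTHETICAL inputs. Performance units are
# arbitrary but consistent; prices are placeholders, not quotes.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

alchemist  = perf_per_dollar(100, 289)  # A750-class card at an assumed street price
battlemage = perf_per_dollar(200, 289)  # "twice Alchemist" at the same price point
incumbent  = perf_per_dollar(170, 599)  # hypothetical current mid-range card

print(f"Battlemage vs incumbent: {battlemage / incumbent:.1f}x perf per dollar")  # ~2.4x
```

If anything close to that ratio shows up at sub-$300, the mid-range "carnage" scenario stops being hypothetical.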
 
Something some of you may be forgetting, so I'll remind you: we were all kids once, and we all had our first PC experiences that led at least some of us to become CSes, software engineers, or to work in something IT-related. No matter what we call ourselves from the "consumer" angle, it is still an important market to dominate. This is also why Microsoft is all over schools and universities. And by "all over", I mean ALL OVER them. A lot of scientific research is done using CUDA, so if they remove consumer-grade GPUs that can run all of that for students, then you can imagine what would happen: they will diversify.

When you learn something using a set of tools, you're more than likely to continue using that same set of tools for a very long time, unless your work/job/position forces you to learn new things.

So, in short: if nVidia takes the ball home because they're not getting enough revenue or something, that is going to come back and bite them later on. I'd imagine they want to go the IBM route (maybe?), but... I'm not sure. Considering they're very much into building things for some very relevant markets without hard breakthroughs (like IBM), I don't think they're quite there yet, or even close.

Regards.
 
  • Like
Reactions: atomicWAR
A lot of scientific research is done using CUDA, so if they remove consumer-grade GPUs that can run all of that for students, then you can imagine what would happen: they will diversify.
Much like how Microsoft underestimated mobile platforms early on: by the time it decided to take a shot at that market, it got almost completely shut out by Android and iOS. Now it is afraid of losing desktop market share too, and we are getting free OS upgrades for all eligible computers out of it.
 
  • Like
Reactions: atomicWAR
Something some of you may be forgetting, so I'll remind you: we were all kids once, and we all had our first PC experiences that led at least some of us to become CSes, software engineers, or to work in something IT-related. No matter what we call ourselves from the "consumer" angle, it is still an important market to dominate. This is also why Microsoft is all over schools and universities. And by "all over", I mean ALL OVER them. A lot of scientific research is done using CUDA, so if they remove consumer-grade GPUs that can run all of that for students, then you can imagine what would happen: they will diversify.

When you learn something using a set of tools, you're more than likely to continue using that same set of tools for a very long time, unless your work/job/position forces you to learn new things.

So, in short: if nVidia takes the ball home because they're not getting enough revenue or something, that is going to come back and bite them later on. I'd imagine they want to go the IBM route (maybe?), but... I'm not sure. Considering they're very much into building things for some very relevant markets without hard breakthroughs (like IBM), I don't think they're quite there yet, or even close.

Regards.
As usual...extremely well said. I agree 1000%.