News: Chinese Nvidia RTX 4070 Ti Price Listing Matches Cancelled 4080 12GB

Intel's die is the size of a 3070 Ti's, yet it can't beat a 3060 consistently. It's not just drivers, but their design. I won't nitpick which factor matters more (pretty pointless and moot to do so), but Intel is still years behind, and Arc made that clear.
There are definite design issues at a fundamental level (e.g. memory throughput). But as for the die-size argument, one point against a direct comparison, in Intel's favor, is that Intel has decided to put a ton of high-level features/accelerators on every dGPU, which beefs up the die size without any direct gaming performance uplift (AV1 encode, AI training, etc.), while AMD/Nvidia reserve most/all of these features/accelerators for a pro/data-center line. Now, if XeSS were working/functional everywhere it could help close that performance gap, although it is more of a trick/shortcut to improve FPS; Arc still has fundamental frame-delivery issues compared to the others that hopefully can be addressed in drivers.

Arc is still a very compelling card for creators thanks to its pro features at consumer prices.
 
nVidia's greed is gonna destroy their own business pretty quick. These prices would be expensive for a 90-tier card, let alone a 70 Ti. Jensen and his people need to realise most of the western world is in recession and most of Europe is struggling with crippling energy prices (a 300% increase in a year) with real-world incomes dropping ~10% for the majority; if he thinks he can double the price of every SKU and still sell product, he is absolutely crazy. nVidia will crash and burn, either by AMD taking a more sensible approach or by nobody buying GPUs from anyone. Gaming is a luxury, and only a tiny percentage of people can afford it at these prices.
 
It's not the 2024 $1000 RTX 5050 that worries me, but the 2026 "acting out your favorite games with your friends in your living room because the RTX 6000 series low-end GPUs cost as much as a used car".
How do you "act out your favorite games" if you cannot afford a GPU to play those games and find out how to "act them out" in the first place?

But yeah, I'm still on a GTX 1050 and not planning to upgrade while this pricing insanity is in full force. It still runs WoW well enough, and I'll stick with that for the foreseeable future.
 
nVidia's greed is gonna destroy their own business pretty quick. These prices would be expensive for a 90-tier card, let alone a 70 Ti. Jensen and his people need to realise most of the western world is in recession and most of Europe is struggling with crippling energy prices (a 300% increase in a year) with real-world incomes dropping ~10% for the majority; if he thinks he can double the price of every SKU and still sell product, he is absolutely crazy. nVidia will crash and burn, either by AMD taking a more sensible approach or by nobody buying GPUs from anyone. Gaming is a luxury, and only a tiny percentage of people can afford it at these prices.
Yeah, this is what record inflation, an unprecedented 30% TSMC leading-edge price increase (plus the elimination of bulk-order discounts, because TSMC knows it is the sole leading-edge semiconductor manufacturer for all the world's fabless design firms), and a recession do. After applying the first two variables, anticipated supply and demand determine what price Nvidia will publish. (Not trying to justify any greed added on top of the 4000-series prices.) Basically, Nvidia knows it will have to price its cards higher due to the added costs from inflation and the TSMC price hikes, because Nvidia has to maintain or increase revenue year over year or be liable to shareholder lawsuits for failing to uphold shareholder interests. A recession already lowers demand, and this inevitable price hike will lower demand for their GPUs further still. So I presume the bean counters at Nvidia have decided to lower the supply of 4000-series cards compared to past launches, then increase margins on the cards to match the adjusted demand and keep the shareholders happy.

We can complain all day about the prices, but it seems Nvidia knows what it is doing in this economy. It is my hypothesis (based on my business management experience) that, since all the initial 4090 supply was bought up on the first day, they can simply lower AD103 production and raise AD102 production proportionally, keeping just enough poor-price-to-performance 4080s on the shelf to remind people that the higher-margin 4090 is the only card in the 4000 series that is borderline worth its price.
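To put rough numbers on the idea, here's a back-of-the-envelope sketch (every figure below is invented purely to show the shape of the trade-off I'm describing):

```python
# Hypothetical revenue math for the "cut supply, raise margin"
# strategy described above. All numbers are made up.

def revenue(units_sold: int, unit_price: int) -> int:
    """Gross revenue; ignores BOM, logistics, etc. for simplicity."""
    return units_sold * unit_price

# Scenario A: launch-sized supply at a modest price.
baseline = revenue(units_sold=500_000, unit_price=800)       # $400,000,000

# Scenario B: ~40% less supply at a ~75% higher price.
constrained = revenue(units_sold=300_000, unit_price=1_400)  # $420,000,000

print(f"Baseline:    ${baseline:,}")
print(f"Constrained: ${constrained:,}")
# As long as demand falls more slowly than price rises, top-line
# revenue still grows -- which is what keeps the shareholders happy.
```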

What do y'all think? Does my hypothesis have merit, or am I missing data points here?

Thanks ahead of time for any rebuttals, I love hypothesizing haha
 
There are definite design issues at a fundamental level (e.g. memory throughput). But as for the die-size argument, one point against a direct comparison, in Intel's favor, is that Intel has decided to put a ton of high-level features/accelerators on every dGPU, which beefs up the die size without any direct gaming performance uplift (AV1 encode, AI training, etc.), while AMD/Nvidia reserve most/all of these features/accelerators for a pro/data-center line. Now, if XeSS were working/functional everywhere it could help close that performance gap, although it is more of a trick/shortcut to improve FPS; Arc still has fundamental frame-delivery issues compared to the others that hopefully can be addressed in drivers.

Arc is still a very compelling card for creators thanks to its pro features at consumer prices.
Uh... Can you let me know what specific features Intel has put in their cards which are in use and working (via drivers) right now that AMD or nVidia don't have?

I'm actually curious, because throwing out "it has ML and AI stuff in it!" is a very bad blanket statement when there's zero software or it doesn't work.

Regards.
 
We will get a 4030 for $300, so PC gaming will be just fine!
😉

In reality, what happens is the same as with cell phones.
The new-gen 4050 is gonna be $500, and that's the weakest card we're gonna get (maybe).
The price of the 3050 will drop to $350, the price of the 2050 will drop to $300 or even $250, and voilà, Nvidia has a cheap GPU for the low end... aka last-generation GPUs get sold at a good profit as low-end options so that manufacturers don't have to sell anything at a real discount = profit for the company and the stockholders.

So PC gaming will not die; people are just gonna use two- or three-generation-old hardware when they build a new gaming computer, unless they have more money than brain cells and buy the newest generation. And there are enough customers to buy these next-gen products at incredibly high prices! 4090s at 2500€ are sold out! 4080s at 1800€ are not sold out, but still selling, while normal people buy 3000-series cards that are still above MSRP! Nvidia can only say they have done a good job!

I personally have such a big Steam game library that I will never go to consoles. I may be forced to join the people who buy two- or three-generation-old GPU hardware, but that is that. AMD and Nvidia still get my money, and at the same time they get even more money from the people who buy next-gen tech. They win in every situation, and now they win even more than before.

Nah, Nvidia failed their earnings call hard and their stock is about to take a massive dump. This is the generation they learn their lesson.

There's a reason I only buy used GPUs; they won't ever get a dime out of me. Miner GPUs are flooding the market, and Nvidia will mess this up like they did with the 2000 series and go back to cheaper prices again.

They might even be forced to release Super models again due to sheer incompetence. This isn't the first time they've changed a card's model number, and the last time did not go well for them. Once this recession hits and people start losing jobs, no one in their right mind will buy GPUs at these prices, just like no one bought the Titan card.
 
Uh... Can you let me know what specific features Intel has put in their cards which are in use and working (via drivers) right now that AMD or nVidia don't have?

I'm actually curious, because throwing out "it has ML and AI stuff in it!" is a very bad blanket statement when there's zero software or it doesn't work.

Regards.
Intel has very fast, efficient AV1 encoders. They make for good streaming PCs.
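For anyone wanting to try it, a minimal sketch of a hardware AV1 encode on an Arc card (this assumes an ffmpeg build with Intel QSV/oneVPL support on your PATH; the filenames are placeholders):

```python
# Offload AV1 encoding to an Arc GPU through ffmpeg's QSV encoder
# (av1_qsv). Requires an ffmpeg build with Intel QSV/oneVPL support.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "gameplay.mkv",   # hypothetical source recording
        "-c:v", "av1_qsv",      # Arc's fixed-function AV1 encoder
        "-b:v", "6M",           # ~6 Mbit/s target, plausible for 1080p streaming
        "-c:a", "copy",         # pass the audio through untouched
        "gameplay_av1.mkv",
    ],
    check=True,
)
```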
 
Yeah, this is what record inflation, an unprecedented 30% TSMC leading-edge price increase (plus the elimination of bulk-order discounts, because TSMC knows it is the sole leading-edge semiconductor manufacturer for all the world's fabless design firms), and a recession do. After applying the first two variables, anticipated supply and demand determine what price Nvidia will publish. (Not trying to justify any greed added on top of the 4000-series prices.) Basically, Nvidia knows it will have to price its cards higher due to the added costs from inflation and the TSMC price hikes, because Nvidia has to maintain or increase revenue year over year or be liable to shareholder lawsuits for failing to uphold shareholder interests. A recession already lowers demand, and this inevitable price hike will lower demand for their GPUs further still. So I presume the bean counters at Nvidia have decided to lower the supply of 4000-series cards compared to past launches, then increase margins on the cards to match the adjusted demand and keep the shareholders happy.

We can complain all day about the prices, but it seems Nvidia knows what it is doing in this economy. It is my hypothesis (based on my business management experience) that, since all the initial 4090 supply was bought up on the first day, they can simply lower AD103 production and raise AD102 production proportionally, keeping just enough poor-price-to-performance 4080s on the shelf to remind people that the higher-margin 4090 is the only card in the 4000 series that is borderline worth its price.

What do y'all think? Does my hypothesis have merit, or am I missing data points here?

Thanks ahead of time for any rebuttals, I love hypothesizing haha

Inflation plays a part, but there is no way it justifies the increase. TSMC has made things more difficult, but let's face it, that was inevitable at some point, given everyone else just let TSMC take over the whole market and hand them a monopoly.
I agree that might be what nVidia are trying to do, but I don't believe the jack-up-the-prices-and-cut-production model can work. I can't see how nVidia will find enough people paying these inflated prices to keep their revenue growing. The 90-tier market is tiny as a percentage of the total GPU market, and only a few 80-tier customers would be either willing or able to step up to the 90 tier even if the 80 card isn't good value. This gets harder and harder further down the stack: gamers used to paying $200-250 for 60-class cards won't pay $500+, whether because they can't (most of them) or because they refuse to be ripped off and will settle for keeping a 1060 or 1660 longer. I can't see the endgame here. I think nVidia are gambling that there is a large enough group of hardcore gamers who will pay any price for a GPU; I think that gamble will fail, and if they don't see it quickly enough it could cost them a lot, especially if either AMD or Intel realises there is an enormous market out there in the $200-$400 bracket.
 
Uh... Can you let me know what specific features Intel has put in their cards which are in use and working (via drivers) right now that AMD or nVidia don't have?

I'm actually curious, because throwing out "it has ML and AI stuff in it!" is a very bad blanket statement when there's zero software or it doesn't work.
Check this Reddit thread ↓ (and see the sketch after the quote), but yes, with Intel oneAPI there is current support for PyTorch and TensorFlow; the 16 GB of memory seems to be important to this crowd.
This was a replacement for my GTX 1070. I don't have any direct benchmarks, but the memory increase alone allowed me to train some models I had issues with before.

For “pros”, I’d say the performance for the price point is pretty money. Looking at NVIDIA GPUs that have 16+ GB of memory, you’d need a 3070 which looks to be in the $600-$700 range. The setup took me an evening to get everything figured out, but it wasn’t too bad.
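A minimal sketch of what that PyTorch setup looks like (this assumes the XPU build of intel-extension-for-pytorch and the oneAPI runtime are installed; the model here is a toy placeholder):

```python
# Toy training step on an Arc GPU via intel-extension-for-pytorch,
# which exposes the card as PyTorch's "xpu" device through oneAPI.
import torch
import intel_extension_for_pytorch as ipex  # registers the xpu backend

device = "xpu" if torch.xpu.is_available() else "cpu"

model = torch.nn.Linear(4096, 4096).to(device)
model.train()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
# ipex.optimize() applies layout/kernel optimizations for Intel hardware.
model, optimizer = ipex.optimize(model, optimizer=optimizer)

x = torch.randn(64, 4096, device=device)  # 16 GB of VRAM leaves headroom for bigger batches
loss = model(x).sum()
loss.backward()
optimizer.step()
print(f"ran one training step on {device}")
```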
And as LawlessQuill already stated, there's AV1 hardware encode; many have already touted this on the lowly A380, and Intel's hardware solution not only performs faster but at higher quality (closer to a software encode) than Nvidia's and AMD's.

But for Adobe you are right: GPU acceleration is still not implemented... even the machine-learning support wasn't added until October. These features, like gaming performance, have the potential to get a lot better, or may never be implemented/improve from their current state...

I would hope that with the data-center side of the business (Arc Pro, Ponte Vecchio) we will see continued improvement and adoption, but I'm still waiting to see how right MLID is; even if he isn't right about Intel cancelling discrete GPUs, his info may predict the general direction (reduced funding/investment/resources to execute in the graphics division).
 
With ~50% of dGPU users running sub-$300 cards based on Steam's survey, I believe there is no shortage of interest in sub-200W GPUs should Intel decide to make a serious push for market share instead of profit.

Yeah, totally agree. The majority of gamers are sporting mainstream/mid-range GPUs, and the sub-$500 USD market is still very much active. Mid-range graphics cards are what the companies should make for the majority of people out there.

They are the best in terms of price-to-performance ratio! High-end GPUs are hilariously overpriced these days, as are low-end GPUs, to be honest. The mid-range is where it's at, most of the time.