Hardware leaker claims Nvidia will use harvested AD103 dies in some GeForce RTX 4070 cards.
New RTX 4070 May Come With Salvaged RTX 4080 Dies: Read more
The SEO gods require tribute. Also, I don't think the 4070 is all that bad, considering the whole package. It's ~3080 performance, plus DLSS 3 and a couple extra GB of VRAM, for $100 less, and with ~60% of the power draw. It's not AWESOME, and you wouldn't upgrade from a 3080 or 6800 (or higher) to the 4070. But if you have an RTX 20-series or RX 5000-series card and are looking for a potential upgrade, for the time being it's the "best $600" card in my book.

And no, that doesn't mean you should run out and buy a $600 GPU right now. Give it another month and we'll see what the 4060 Ti and maybe even the RX 7600 XT look like, hopefully for a lot less than $600. I also want to see where the RX 7800/7700 land in performance, features, and price. But everything going on right now suggests things aren't actually going to get much better than the current status quo. I'd love to see an RX 7800 XT that clearly beats the RTX 4070 for about the same price; I'm not convinced that will happen.
Don't they hardwire in wattage limits for those mobile chips? If so, I don't think they could really reuse them for desktop cards, though it wouldn't be the first time mobile GPU chips ended up in desktop graphics cards.
I know my aging 2080 is needing an upgrade. I just don't feel anything released so far warrants it, so I'm content to continue waiting. If that means I skip this generation because the prices aren't where they should be, then so be it. I do want AMD to do better with its 7800 XT release, but I don't have much hope for it being priced that well either.
Just wait until a $400 card gives you +100% performance. I usually don't jump on a graphics card upgrade until I get at least +100%.
Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective. The laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 desktop GPU as far as core configurations are concerned. More importantly, the RTX 4090 mobile uses 16GB of dedicated GDDR6 graphics memory with an effective clock speed of 20 Gbps. So NVIDIA could reuse some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and a 256-bit bus. At the same time, the laptop 4090 is still available in a multitude of variants, with power packages between 80 and 150W and support for up to 25W extra with Dynamic Boost 2.0.

I'm not sure what you're getting at. It's why we said "AD103" in the article, which is the same chip in both the desktop 4080 and laptop 4090. Any differences are due to binning and finalizing of what voltage to use. Maybe you're just referring to the use of GDDR6 in the mobile 4090 rather than GDDR6X in the desktop 4080, but I'd be surprised for Nvidia to use GDDR6 in a 16GB desktop 4070 now. I actually expected the RTX 4070 to use GDDR6 before it was announced, and the fact that it doesn't probably means Nvidia has plenty of GDDR6X to go around.
I did the exact same thing, but with the 20-series, for similar reasons. I jumped on my current card because it dropped to 10GB pricing, and I had fears that AMD/Nvidia would try to keep as much of the crypto price inflation as possible. I'm firmly in the camp of "buy nothing unless you actually need it" with this generation of GPUs, due to the poor value proposition (unless you're buying halo, in which case the 4090 isn't really that bad compared to the rest).
I actually audibly laughed out loud reading the first sentence of this article, "It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers."
I know management is making you put that in there to get clicks, but from everything I've read gamers everywhere are actually shunning the 4070 pretty hard. I know I am.
I'd recommend maybe choosing a different opening in future articles? It just sounds inauthentic the way it is written now.
Seems like a gently used one-generation-old card might fit your needs. Back in January I picked up a mint AMD 6800 XT reference card for about $480 on eBay, and it works awesome. At the time these were about $700-ish if you could find them.
Shockingly (to me, anyway), the going rate for RX 6800 XT cards on eBay over the past 30 days is still $455. Considering you can find brand-new cards for not much more than that ($510 for an MSI RX 6800 XT at Newegg), I would never buy a card off eBay that was potentially used for mining for over two years just to save $55.
Just because it's on eBay doesn't mean it was used for mining, though, right? I know a lot of folks who never mined and sell their cards secondhand on eBay, et cetera. $55 could mean the difference between a 2TB and a 1TB drive, or a Windows license, among other things.
Right, you can't know. That's the problem. My take is that any GPU on eBay that has been out for 2.5 years (like the RX 6800/6800 XT/6900 XT cards and the RTX 3070/3080/3090) was very likely used for mining. That might not matter too much to some people, but it's a risk. If you get an RTX 30-series LHR card, it would be from slightly later and potentially saw less mining use. (LHR came out with the RTX 3060 in February 2021, and then around April to May 2021 new versions of the other RTX 30-series cards switched to LHR.)
While I agree with the logic of what may or may not have been mined on, and with the precaution of assuming that people are lying about the condition of the cards, I disagree with the sentiment that cards with mining hours are so much worse than used gaming-only cards. I have seen gamers treat and use their cards in similar if not worse ways than some miners: leaving them clogged with dust and dog hair, spilling drinks on them, shipping them still in the socket, and so on. There is at least anecdotal evidence out there that mined-on cards are fine to use, but the best advice for anything used is caveat emptor.
There's no good way, AFAIK, to tell for certain when a particular card was manufactured. Maybe there's a date on the card or box, but you'd need a good photo to show that. I just know that I would always approach eBay sellers with an assumption of them lying/exaggerating about whether a card was ever used for mining. RX 6650/6750/6950 XT are safer, because those weren't released until Ethereum mining was nearly over — they came out May 10, 2022. Ethereum mining profits were much lower by then, and officially ended on September 15. So at worst, the 6x50 cards were used for four months of mining, and probably not even that. RTX 3090 Ti would also be safer by the same logic.
But personally, buying a used graphics card, I would expect to pay at least 25% below what a brand-new card currently costs. Which means RX 6800 for ~$360, 6800 XT for ~$380, RX 6950 XT for ~$450. Also, RTX 3070 for ~$350, RTX 3080 for ~$425, RTX 3090 for ~$600. (Nvidia's "acceptable prices" are based off the similarly performing RTX 40-series cards, minus a chunk because the newer architecture GPUs are better in various ways.)
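As a quick sketch of that rule of thumb (the 25% discount and the $510 Newegg price for a new RX 6800 XT come from the posts above; the helper function name is just for illustration):

```python
def used_price_target(new_price: float, discount: float = 0.25) -> float:
    """Target price for a used GPU: at least `discount` below the current new price."""
    return new_price * (1 - discount)

# $510 new RX 6800 XT -> 382.5, in line with the ~$380 used target quoted above.
print(used_price_target(510))
```

The same one-liner reproduces the other targets within rounding, assuming comparable new-card street prices.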
Yes, and that's my point with the prices at the end of the last post. If you're going to buy a used card, know how to stress test it to ensure it's running stable, and don't overpay for the card. $500 for a 6800 XT when most were selling at $700+ was a reasonable deal; $455 when you can buy new for $510, at least in my book, isn't a great deal. I'd rather pay 10% more to get a card that has a full warranty, original packaging, etc., and probably free shipping.
And included games.
Which have dubious value, unless you want those specific games. I'd personally rather they just skip the included games and make the product cheaper, so I can decide whether I want to spend the savings on those or other games.
Agreed. I think the last included game I even installed was The Witcher 3, but I didn't actually play it; I just used it for benchmarking.