News: New RTX 4070 May Come With Salvaged RTX 4080 Dies

Jagar123

Prominent
Dec 28, 2022
I actually audibly laughed out loud reading the first sentence of this article, "It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers."

I know management is making you put that in there to get clicks, but from everything I've read gamers everywhere are actually shunning the 4070 pretty hard. I know I am.

I'd recommend maybe choosing a different opening in future articles? It just sounds inauthentic the way it is written now.
 
I actually audibly laughed out loud reading the first sentence of this article, "It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers."

I know management is making you put that in there to get clicks, but from everything I've read gamers everywhere are actually shunning the 4070 pretty hard. I know I am.

I'd recommend maybe choosing a different opening in future articles? It just sounds inauthentic the way it is written now.
The SEO gods require tribute. Also, I don't think the 4070 is all that bad, considering the whole package. It's ~3080 performance, plus DLSS3 and a couple extra GB of VRAM, for $100 less, and with ~60% of the power draw. It's not AWESOME and you wouldn't upgrade from a 3080 or 6800 (or higher) to the 4070. But if you have an RTX 20-series or RX 5000-series card and are looking for a potential upgrade, for the time being it's the "best $600" card in my book.

And no, that doesn't mean you should run out and buy a $600 GPU right now. Give it another month and we'll see what 4060 Ti and maybe even RX 7600 XT look like, hopefully for a lot less than $600. I also want to see where RX 7800/7700 land, in performance, features, and price. But everything going on right now suggests things aren't actually going to get much better than the current status quo. I'd love to see an RX 7800 XT that clearly beats the RTX 4070 for about the same price. I'm not convinced that will happen.
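To put rough numbers on that value argument, here's a minimal Python sketch of performance-per-dollar and performance-per-watt, assuming roughly equal raster performance between the two cards and using the $599 / $699 launch MSRPs and 200W / 320W board-power figures (those specific numbers are assumptions for illustration, not from the post above):

```python
# Rough value comparison, assuming ~equal performance between the two cards.
# MSRPs and board-power figures below are illustrative assumptions.
cards = {
    "RTX 4070": {"perf": 1.00, "price": 599, "power_w": 200},
    "RTX 3080": {"perf": 1.00, "price": 699, "power_w": 320},
}

for name, c in cards.items():
    perf_per_kilobuck = c["perf"] / c["price"] * 1000  # performance per $1000
    perf_per_100w = c["perf"] / c["power_w"] * 100      # performance per 100 W
    print(f"{name}: {perf_per_kilobuck:.2f} perf/$1000, {perf_per_100w:.2f} perf/100W")

# 200 W / 320 W is ~62%, in line with the "~60% of the power draw" figure above.
```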
 
Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective.

The Laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 Desktop GPU, as far as core configurations are concerned. And more importantly, the RTX 4090 mobile uses 16 GB of GDDR6 dedicated graphics memory with a clock speed of 20 Gbps (effective).

So NVIDIA could re-use some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and a 256-bit bus.

At the same time, the 4090 is still available in a multitude of variants, with power packages ranging from 80 to 150W and support for up to +25W extra with Dynamic Boost 2.0.
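For what it's worth, the memory bandwidth implied by that configuration is easy to sanity-check: effective per-pin data rate times bus width, divided by 8. A small sketch (the desktop 4080's 22.4 Gbps GDDR6X rate is an assumed comparison figure, not something stated above):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

# Laptop RTX 4090: 20 Gbps GDDR6 on a 256-bit bus (figures from the post above)
print(memory_bandwidth_gbs(20, 256))    # 640.0 GB/s

# Desktop RTX 4080 for comparison (22.4 Gbps GDDR6X is an assumed figure)
print(memory_bandwidth_gbs(22.4, 256))  # 716.8 GB/s
```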
 
Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective.

The Laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 Desktop GPU, as far as core configurations are concerned. And more importantly, the RTX 4090 mobile uses 16 GB of GDDR6 dedicated graphics memory with a clock speed of 20 Gbps (effective).

So NVIDIA could re-use some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and a 256-bit bus.
Don't they hardwire in wattage limits for those mobile chips? If so I don't think they could really reuse them for desktop cards. Though it would not be the first time mobile GPU chips ended up in desktop graphics cards.
 

Jagar123

Prominent
Dec 28, 2022
The SEO gods require tribute. Also, I don't think the 4070 is all that bad, considering the whole package. It's ~3080 performance, plus DLSS3 and a couple extra GB of VRAM, for $100 less, and with ~60% of the power draw. It's not AWESOME and you wouldn't upgrade from a 3080 or 6800 (or higher) to the 4070. But if you have an RTX 20-series or RX 5000-series card and are looking for a potential upgrade, for the time being it's the "best $600" card in my book.

And no, that doesn't mean you should run out and buy a $600 GPU right now. Give it another month and we'll see what 4060 Ti and maybe even RX 7600 XT look like, hopefully for a lot less than $600. I also want to see where RX 7800/7700 land, in performance, features, and price. But everything going on right now suggests things aren't actually going to get much better than the current status quo. I'd love to see an RX 7800 XT that clearly beats the RTX 4070 for about the same price. I'm not convinced that will happen.
I know my aging 2080 needs an upgrade. I just don't feel anything released so far warrants it. I am content continuing to wait. If that means I skip this generation because I don't think the prices are where they should be, then so be it. I do want AMD to do better with their 7800 XT release, but I don't have much hope for it being priced that well either.
 
I know my aging 2080 needs an upgrade. I just don't feel anything released so far warrants it. I am content continuing to wait. If that means I skip this generation because I don't think the prices are where they should be, then so be it. I do want AMD to do better with their 7800 XT release, but I don't have much hope for it being priced that well either.
Just wait until a 400 dollar card gives you +100% performance. I usually do not jump on a graphics card upgrade until I get at least +100%.
 
Don't they hardwire in wattage limits for those mobile chips? If so I don't think they could really reuse them for desktop cards. Though it would not be the first time mobile GPU chips ended up in desktop graphics cards.

No, the 4090 is actually available in a multitude of variants, with power packages ranging from 80 to 150W and support for up to +25W extra with Dynamic Boost 2.0. So they can re-use some of these chips if need be. I just edited my post now btw to add this text.
 
Another theory is that Nvidia will use the RTX 4090 Laptop GPU, where some of the dies are defective.

The Laptop 4090 SKU has the same specs as the NVIDIA GeForce RTX 4080 Desktop GPU, as far as core configurations are concerned. And more importantly, the RTX 4090 mobile uses 16 GB of GDDR6 dedicated graphics memory with a clock speed of 20 Gbps (effective).

So NVIDIA could re-use some of the laptop chips as well, since both the desktop 4080 and laptop 4090 chips have a 16GB memory capacity and a 256-bit bus.

At the same time, the 4090 is still available in a multitude of variants, with power packages ranging from 80 to 150W and support for up to +25W extra with Dynamic Boost 2.0.
I'm not sure what you're getting at. It's why we said "AD103" in the article, which is the same chip in both desktop 4080 and laptop 4090. Any differences are due to binning and finalizing of what voltage to use. Maybe you're just referring to the use of GDDR6 in the mobile 4090 rather than GDDR6X in the desktop 4080, but I'd be surprised for Nvidia to use GDDR6 in a 16GB desktop 4070 now. I actually expected the RTX 4070 to use GDDR6 before it was announced, and the fact that it doesn't probably means Nvidia has plenty of GDDR6X to go around.
 
I'm not sure what you're getting at. It's why we said "AD103" in the article, which is the same chip in both desktop 4080 and laptop 4090. Any differences are due to binning and finalizing of what voltage to use. Maybe you're just referring to the use of GDDR6 in the mobile 4090 rather than GDDR6X in the desktop 4080, but I'd be surprised for Nvidia to use GDDR6 in a 16GB desktop 4070 now. I actually expected the RTX 4070 to use GDDR6 before it was announced, and the fact that it doesn't probably means Nvidia has plenty of GDDR6X to go around.

I'm just saying that the Laptop 4090 SKU is based on the same AD103 chip as the desktop RTX 4080 variant. No other GPU is based on AD103, except the RTX 5000 Mobile Ada Generation SKU.

So there might be a possibility for Nvidia to salvage some of these laptop chips, though using GDDR6 memory would be kind of odd. Even I think this would be weird, unless GDDR6X chips are costing Nvidia too much (seems a bit unlikely, though)?

Or maybe there aren't that many defective desktop 4080 dies to begin with?
 
I know my aging 2080 needs an upgrade. I just don't feel anything released so far warrants it. I am content continuing to wait. If that means I skip this generation because I don't think the prices are where they should be, then so be it. I do want AMD to do better with their 7800 XT release, but I don't have much hope for it being priced that well either.
I did the exact same thing, but with the 20 series, for similar reasons. I jumped on my current card because it dropped to 10GB pricing and I had fears that AMD/Nvidia would try to keep as much crypto price inflation as possible. I'm firmly in the camp of "buy nothing unless you actually need it" with this generation of GPUs due to the poor value proposition (unless you're buying halo, in which case the 4090 isn't really that bad compared to the rest).
 

Giroro

Splendid
Salvaged 4080 dies, specifically AD103, are probably a step up from the upbranded dies that they're currently using.
But unfortunately, Nvidia got greedy and took the underpowered AD104 die and called it an "RTX 4080 12GB" (now the 4070 Ti), so everything is a mess and none of it matters. They'll sell the worst-possible chips that they can force the market to accept at the highest-possible price.
You have to love the heartwarmingly defiant "What are you going to do about it, punk? That's what I thought" attitude from the green team.
Let's just be glad they haven't decided to call their minimum-possible-functionality AD107 chip the "RTX 4070 6GB, GDDR6, 96-bit" and price it at $549.99... yet.

Custom-tailored leather jackets don't pay for themselves, usually.
 

FunSurfer

Distinguished
Yes! This is the card I was waiting for: the 4070 Super with 16GB of VRAM and a 256-bit interface. Lower power consumption than the 4080 and an adequate VRAM pool. Hoping this will come true.
 

PEnns

Reputable
Apr 25, 2020
I actually audibly laughed out loud reading the first sentence of this article, "It didn't take long for the GeForce RTX 4070 to position itself as one of the best graphics cards for gamers."

I know management is making you put that in there to get clicks, but from everything I've read gamers everywhere are actually shunning the 4070 pretty hard. I know I am.

I'd recommend maybe choosing a different opening in future articles? It just sounds inauthentic the way it is written now.

The same author just posted another article...showing his real feelings about AMD:

AMD Can't Beat Ada, So Brags About Old Ampere Comparisons

The number of ridiculous articles here today is about to become a record in absurdity!
 

autobahn

Distinguished
Mar 11, 2009
Just wait until a 400 dollar card gives you +100% performance. I usually do not jump on a graphics card upgrade until I get at least +100%.
Seems like a gently used one-gen-old card might fit your needs. Back in Jan I picked up a mint AMD 6800 XT reference card for about $480 on eBay, and it works awesome. At the time these were about $700-ish if you could find them.
 
Seems like a gently used one-gen-old card might fit your needs. Back in Jan I picked up a mint AMD 6800 XT reference card for about $480 on eBay, and it works awesome. At the time these were about $700-ish if you could find them.
Shockingly (to me, anyway), the going rate for RX 6800 XT cards on eBay over the past 30 days is still $455. Considering you can find brand-new cards for not much more than that ($510 for an MSI RX 6800 XT at Newegg), I would never buy a card off eBay that was potentially used for mining for over two years just to save $55.
 
Shockingly (to me, anyway), the going rate for RX 6800 XT cards on eBay over the past 30 days is still $455. Considering you can find brand-new cards for not much more than that ($510 for an MSI RX 6800 XT at Newegg), I would never buy a card off eBay that was potentially used for mining for over two years just to save $55.
Just because it is on eBay does not mean it was used for mining though, right? I know a lot of folks that never mined and sell their cards secondhand on eBay, et cetera. $55 could mean the difference between a 2TB and a 1TB drive, or a Windows license, among other things.
 
Just because it is on eBay does not mean it was used for mining though, right? I know a lot of folks that never mined and sell their cards secondhand on eBay, et cetera. $55 could mean the difference between a 2TB and a 1TB drive, or a Windows license, among other things.
Right, you can't know. That's the problem. My take is that any GPU on eBay that has been out for 2.5 years (like the RX 6800/6800 XT/6900 XT cards and RTX 3070/3080/3090) was very likely used for mining. That might not matter too much to some people, but it's a risk. If you get an RTX 30-series LHR card, then it would be from slightly later and potentially less used for mining. (LHR came out with the RTX 3060 in February 2021, and then around April-May new versions of the other RTX 30-series cards switched to LHR.)

There's no good way, AFAIK, to tell for certain when a particular card was manufactured. Maybe there's a date on the card or box, but you'd need a good photo to show that. I just know that I would always approach eBay sellers with an assumption of them lying/exaggerating about whether a card was ever used for mining. RX 6650/6750/6950 XT are safer, because those weren't released until Ethereum mining was nearly over — they came out May 10, 2022. Ethereum mining profits were much lower by then, and officially ended on September 15. So at worst, the 6x50 cards were used for four months of mining, and probably not even that. RTX 3090 Ti would also be safer by the same logic.
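To make the "at worst four months" point concrete, here's a quick sketch of the maximum possible Ethereum-mining window for a few cards, from launch to the merge (the RX 6950 XT date and the September 15 end date come from the post above; the RTX 3080 and RX 6800 XT launch dates are my additions):

```python
from datetime import date

ETH_MERGE = date(2022, 9, 15)  # proof-of-work Ethereum mining ends

launch_dates = {
    "RTX 3080": date(2020, 9, 17),     # assumed launch date
    "RX 6800 XT": date(2020, 11, 18),  # assumed launch date
    "RX 6950 XT": date(2022, 5, 10),   # launch date given above
}

for card, launched in launch_dates.items():
    months = (ETH_MERGE - launched).days / 30.44  # average month length
    print(f"{card}: at most ~{months:.0f} months of possible Ethereum mining")

# RTX 3080: ~24 months, RX 6800 XT: ~22 months, RX 6950 XT: ~4 months
```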

But personally, buying a used graphics card, I would expect to pay at least 25% below what a brand-new card currently costs. Which means RX 6800 for ~$360, 6800 XT for ~$380, RX 6950 XT for ~$450. Also, RTX 3070 for ~$350, RTX 3080 for ~$425, RTX 3090 for ~$600. (Nvidia's "acceptable prices" are based off the similarly performing RTX 40-series cards, minus a chunk because the newer architecture GPUs are better in various ways.)
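As a quick check on those numbers, the rule of thumb above is just "used target ≈ 75% of the current new price." A minimal sketch, where only the $510 RX 6800 XT figure comes from earlier in the thread and the other new prices are assumptions for illustration:

```python
# Used-price targets at ~25% below current new pricing.
# Only the $510 RX 6800 XT figure comes from the thread; the rest are assumed new prices.
new_prices = {
    "RX 6800": 480,
    "RX 6800 XT": 510,
    "RX 6950 XT": 600,
    "RTX 3070": 470,
    "RTX 3080": 565,
    "RTX 3090": 800,
}

DISCOUNT = 0.75  # pay at least 25% below new

for card, new_price in new_prices.items():
    print(f"{card}: new ~${new_price}, used target ~${new_price * DISCOUNT:.0f}")

# e.g. RX 6800 XT: new ~$510 -> used target ~$382, close to the ~$380 quoted above.
```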
 
Right, you can't know. That's the problem. My take is that any GPU on eBay that has been out for 2.5 years (like the RX 6800/6800 XT/6900 XT cards and RTX 3070/3080/3090) was very likely used for mining. That might not matter too much to some people, but it's a risk. If you get an RTX 30-series LHR card, then it would be from slightly later and potentially less used for mining. (LHR came out with the RTX 3060 in February 2021, and then around April-May new versions of the other RTX 30-series cards switched to LHR.)

There's no good way, AFAIK, to tell for certain when a particular card was manufactured. Maybe there's a date on the card or box, but you'd need a good photo to show that. I just know that I would always approach eBay sellers with an assumption of them lying/exaggerating about whether a card was ever used for mining. RX 6650/6750/6950 XT are safer, because those weren't released until Ethereum mining was nearly over — they came out May 10, 2022. Ethereum mining profits were much lower by then, and officially ended on September 15. So at worst, the 6x50 cards were used for four months of mining, and probably not even that. RTX 3090 Ti would also be safer by the same logic.

But personally, buying a used graphics card, I would expect to pay at least 25% below what a brand-new card currently costs. Which means RX 6800 for ~$360, 6800 XT for ~$380, RX 6950 XT for ~$450. Also, RTX 3070 for ~$350, RTX 3080 for ~$425, RTX 3090 for ~$600. (Nvidia's "acceptable prices" are based off the similarly performing RTX 40-series cards, minus a chunk because the newer architecture GPUs are better in various ways.)
While I agree with the logic of what may or may not have been mined on, and the precaution of assuming that people are lying about the condition of the cards, I disagree with the sentiment that cards with mining hours are so much worse than used gaming-only cards. I have seen gamers treat and use their cards in similar if not worse ways than some miners: leaving them clogged with dust and dog hair, spilling drinks on them, shipping them in socket, and so on. There is at least anecdotal evidence out there that mined-on cards are fine to use, but the best advice for anything used is caveat emptor.
 
While I agree with the logic of what may or may not have been mined on, and the precaution of assuming that people are lying about the condition of the cards, I disagree with the sentiment that cards with mining hours are so much worse than used gaming-only cards. I have seen gamers treat and use their cards in similar if not worse ways than some miners: leaving them clogged with dust and dog hair, spilling drinks on them, shipping them in socket, and so on. There is at least anecdotal evidence out there that mined-on cards are fine to use, but the best advice for anything used is caveat emptor.
Yes, and that's my point with the prices at the end of the last post. If you're going to buy a used card, know how to stress test it to ensure it's running stable, and don't overpay for the card. $500 for a 6800 XT when most were selling at $700+ is a reasonable deal. $455 when you can buy new for $510, at least in my book, isn't a great deal. I'd rather pay 10% more to get a card that has a full warranty, original packaging, etc. and probably has free shipping.
 

Eximo

Titan
Ambassador
Yes, and that's my point with the prices at the end of the last post. If you're going to buy a used card, know how to stress test it to ensure it's running stable, and don't overpay for the card. $500 for a 6800 XT when most were selling at $700+ is a reasonable deal. $455 when you can buy new for $510, at least in my book, isn't a great deal. I'd rather pay 10% more to get a card that has a full warranty, original packaging, etc. and probably has free shipping.
And included games.
 

Eximo

Titan
Ambassador
Which have dubious value, unless you want those specific games. I'd personally rather they just skip the included games and make the product cheaper so I can decide if I want to spend the savings on those or other games.
Agreed. I think the last included game I even installed was Witcher III, but I didn't actually play it, just used it for benchmarking.

Got all the Batman games that way though. I got one of them bundled, and the class action reward for the bad PC port was to give everyone the other titles in the franchise for free.

Still didn't play them. Not a big fan of that style of RPG.