News Nvidia GeForce RTX 3070 Founders Edition Review: Taking on Turing's Best at $499


Deleted member 2783327

Guest
I know TH caters mostly for the US but you do have a global audience.

The RTX 2080 Ti debuted at around A$2500
The RTX 3070 seems to be advertised at around A$1500. Not quite half the price, but still good if it's better than the RTX 2080 Ti.

I bought an RTX 2080 Ti on sale for A$1900 (about US$1300 at the time) to replace the GTX 1080 Ti I had purchased two years earlier for A$1200.

I should have waited :(

Hoping for great stuff from AMD...
 

johnners2981

Distinguished
Jul 28, 2010
Fast and efficient, the RTX 3070 basically matches the previous gen 2080 Ti at less than half the cost.

Nvidia GeForce RTX 3070 Founders Edition Review: Taking on Turing's Best at $499

I wouldn't call it efficient; it uses nearly 100W more than the 1070 FE. Maybe my expectations of new-generation cards are too high.

It does have roughly three times the performance, but the watts per frame isn't staggeringly better when compared to a four-and-a-half-year-old card.
 
Reactions: aalkjsdflkj

King_V

Illustrious
Ambassador
I get that at stock settings, it skates in at just under the 75W PCIe limit, which is good. If you're overclocking, well, that's on you, though I would've thought that the card might try to avoid taxing the PCIe slot even under those conditions.

Though, if I refer back to the old article discussing this with regard to the RX480, it's supposed to be 66W, or 71 if you include the "plus or minus 8%" - so, pulling 70 stock is really cutting it very close.

Still, we allow for 75W cards to be without a PCIe connector. But I thought GPUs have been trying to avoid flirting with crossing the line since the original RX 480. Maybe the 1050 3GB was the exception that violated it, but not by as much as the RX 480.

That said, that it's pulling more than 150W from the single 8-pin connector, at STOCK clocks, is a problem in my book, and should not be happening. Nvidia absolutely should be called out on this. 165.3W is about 10% over spec there.
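To put rough numbers on those margins (a back-of-the-envelope sketch in Python using only the figures quoted above, not a reading of the PCIe spec itself):

```python
# Rough spec-margin math using the figures quoted above.
SLOT_12V_SPEC_W = 66.0     # 12V allocation for the x16 slot, per the figure cited above
SLOT_TOLERANCE = 0.08      # the "plus or minus 8%" mentioned above
EIGHT_PIN_SPEC_W = 150.0   # nominal rating of a single 8-pin PEG connector

slot_ceiling = SLOT_12V_SPEC_W * (1 + SLOT_TOLERANCE)  # ~71.3 W
measured_slot_w = 70.0     # "pulling 70W at stock"
measured_8pin_w = 165.3    # reported 8-pin draw at stock

print(f"Slot headroom at {measured_slot_w:.0f} W: {slot_ceiling - measured_slot_w:.1f} W")
print(f"8-pin overshoot: {measured_8pin_w / EIGHT_PIN_SPEC_W - 1:.1%}")
```

So pulling ~70W from the slot leaves barely a watt of headroom against the tolerance ceiling, while 165.3W on the 8-pin works out to roughly 10% over its nominal 150W rating.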
 
Reactions: merlinq and RodroX

Let's hope partners will go with 8+6 PCIe power connectors.
 
Reactions: King_V

JarredWaltonGPU

Senior GPU Editor
Editor
I know TH caters mostly for the US but you do have a global audience.

The RTX 2080 Ti debuted at around A$2500
The RTX 3070 seems to be advertised at around A$1500. Not quite half the price, but still good if it's better than the RTX 2080 Ti.

I bought an RTX 2080 Ti on sale for A$1900 (about US$1300 at the time) to replace the GTX 1080 Ti I had purchased two years earlier for A$1200.

I should have waited :(

Hoping for great stuff from AMD...
Tracking down all the regional pricing data for various GPUs is very time consuming, and I figure people in other areas can fill in the blanks. If the 3070 is being sold for A$1500, though, that's just price gouging. Worst-case, it should be maybe A$1000, and probably lower than that. You can find RX 5700 XT starting at around A$550, whereas it's currently $380 in the US, so only a 45% difference -- and 40% of that is exchange rate.
 

JarredWaltonGPU

Senior GPU Editor
Editor
I get that at stock settings, it skates in at just under the 75W PCIe limit, which is good. If you're overclocking, well, that's on you, though I would've thought that the card might try to avoid taxing the PCIe slot even under those conditions.

Though, if I refer back to the old article discussing this with regard to the RX480, it's supposed to be 66W, or 71 if you include the "plus or minus 8%" - so, pulling 70 stock is really cutting it very close.

Still, we allow for 75W cards to be without a PCIe connector. But I thought GPUs have been trying to avoid flirting with crossing the line since the original RX 480. Maybe the 1050 3GB was the exception that violated it, but not by as much as the RX 480.

That said, that it's pulling more than 150W from the single 8-pin connector, at STOCK clocks, is a problem in my book, and should not be happening. Nvidia absolutely should be called out on this. 165.3W is about 10% over spec there.
The 6-pin and 8-pin cables are designed to handle a lot more than 150W, though. I believe the spec for ATX power supplies is to allow for up to two 8-pin PEG connectors on a single cable harness, which means the cables are already rated for up to 300W. I'm not a PSU guru, so maybe that's not always correct, but for any moderate-quality PSU (80 Plus Bronze or higher), I'd be surprised if the PEG harnesses used wire too thin to handle more than 150W. I know I've seen solutions (Bitcoin mining ASICs) that pulled like 250W over a single 6-pin PEG connector -- though that's way over spec and I wouldn't recommend it. (Gee, how did that BTC mining farm catch fire!?)

The issue with the PCIe x16 slot power is that it's routed through the motherboard, and going over spec on lower tier boards is a real risk -- especially if you're really over spec, like say pulling 100W instead of 75W. But for PCIe 8-pin connectors, I'm not even remotely concerned. A really cheap PSU could conceivably have problems, but no one should be using really cheap PSUs in the first place. I do agree with the general sentiment that dropping to a single 8-pin instead of 8-pin plus 6-pin is weird. Would a 6-pin + 8-pin to 12-pin adapter make it less of a problem, though? Because then you'd have people taking a single harness with two connectors and plugging them both into the y-combiner, and at that point you could have just pulled the extra power over a single 8-pin connector.
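For a sense of why the connector itself isn't the worry, here's a rough per-pin current estimate. It assumes the 8-pin PEG connector carries +12V on three pins and splits the load evenly; the 6-9 A terminal rating mentioned in the comments is a typical figure for this style of connector and wire gauge, not something taken from the review:

```python
# Back-of-the-envelope per-pin current. Assumes the 8-pin PEG connector
# carries +12V on three pins and splits the load evenly; terminals for this
# style of connector are commonly rated somewhere around 6-9 A each depending
# on terminal type and wire gauge (an assumption, not a measured figure).
VOLTAGE = 12.0
TWELVE_VOLT_PINS = 3

def amps_per_12v_pin(total_watts: float) -> float:
    """Current per +12V pin for a given total draw over one 8-pin connector."""
    return total_watts / VOLTAGE / TWELVE_VOLT_PINS

for watts in (150.0, 165.3, 300.0):
    print(f"{watts:6.1f} W -> {amps_per_12v_pin(watts):.1f} A per 12V pin")
```

At the 165.3W reported above, that's under 5 A per 12V pin, which is why a reasonable-quality harness has plenty of margin; the 150W number is a spec allocation rather than a physical limit of the cable.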
 

JarredWaltonGPU

Senior GPU Editor
Editor
Let's hope partners will go with 8+6 PCIe power connectors.
They will. Probably most will just use dual 8-pin I bet. Well, maybe not most, but a lot of them will. Also, I'm not immediately aware of any GPUs outside of Nvidia's 30-series Founders Editions that are using the new 12-pin connectors. I kind of hope it doesn't catch on ... though I'd happily see the demise of 6-pin connectors, because then we could just use 8-pin and not worry about the 6+2-pin connectors with a fiddly extra two pins that don't always line up when you try to plug them in. I'm sure we'll be seeing those things for the next 10 years or more. 😑
 
I wouldn't call it efficient; it uses nearly 100W more than the 1070 FE. Maybe my expectations of new-generation cards are too high.

It does have roughly three times the performance, but the watts per frame isn't staggeringly better when compared to a four-and-a-half-year-old card.
Samsung's 8nm is a single process node generation jump from 16/12nm, as Samsung's 8nm is just a 10nm refinement. Pascal made a two process node generation jump. Things may change if NVIDIA uses TSMC's 7nm in the future.

Let's hope partners will go with 8+6 PCIe power connectors.
It looks like some of them are using 8+8 PCIe for no apparent reason other than, at least in my eyes, to make it easier on logistics. Unfortunately this comes at the expense of consumer confusion.
 
They will. Probably most will just use dual 8-pin I bet. Well, maybe not most, but a lot of them will. Also, I'm not immediately aware of any GPUs outside of Nvidia's 30-series Founders Editions that are using the new 12-pin connectors. I kind of hope it doesn't catch on ... though I'd happily see the demise of 6-pin connectors, because then we could just use 8-pin and not worry about the 6+2-pin connectors with a fiddly extra two pins that don't always line up when you try to plug them in. I'm sure we'll be seeing those things for the next 10 years or more. 😑

Samsung's 8nm is a single process node generation jump from 16/12nm, as Samsung's 8nm is just a 10nm refinement. Pascal made a two process node generation jump. Things may change if NVIDIA uses TSMC's 7nm in the future.


It looks like some of them are using 8+8 PCIe for no apparent reason other than, at least in my eyes, to make it easier on logistics. Unfortunately this comes at the expense of consumer confusion.

Yeap, I really don't mind 8+6 or 8+8, as long as it's not the 12-pin connector.
 

King_V

Illustrious
Ambassador
The 6-pin and 8-pin cables are designed to handle a lot more than 150W, though. I believe the spec for ATX power supplies is to allow for up to two 8-pin PEG connectors on a single cable harness, which means the cables are already rated for up to 300W. I'm not a PSU guru, so maybe that's not always correct, but for any moderate-quality PSU (80 Plus Bronze or higher), I'd be surprised if the PEG harnesses used wire too thin to handle more than 150W. I know I've seen solutions (Bitcoin mining ASICs) that pulled like 250W over a single 6-pin PEG connector -- though that's way over spec and I wouldn't recommend it. (Gee, how did that BTC mining farm catch fire!?)

The issue with the PCIe x16 slot power is that it's routed through the motherboard, and going over spec on lower tier boards is a real risk -- especially if you're really over spec, like say pulling 100W instead of 75W. But for PCIe 8-pin connectors, I'm not even remotely concerned. A really cheap PSU could conceivably have problems, but no one should be using really cheap PSUs in the first place. I do agree with the general sentiment that dropping to a single 8-pin instead of 8-pin plus 6-pin is weird. Would a 6-pin + 8-pin to 12-pin adapter make it less of a problem, though? Because then you'd have people taking a single harness with two connectors and plugging them both into the y-combiner, and at that point you could have just pulled the extra power over a single 8-pin connector.

Yeah, going over spec on the motherboard is definitely a bigger issue for the reasons you said - which is why I thought it odd that Nvidia was flirting with the edge of the limit on the slot in addition to going over with the 8-pin.

I don't have nearly the familiarity with lots of models of PSUs that the PSU gurus do, but I was under the impression that there were PSUs out there with a single harness that had only a single 8-pin connector. It just seems odd for Nvidia to violate the standard in this way, and I would've been more comfortable with the idea of having it as an 8+6-to-12 adapter. Then any violation of spec would be on the PSU maker, rather than Nvidia themselves.

I guess that, yes, having two connectors at the end of a single harness is common, but can we guarantee that every PSU with an 8-pin connector can really handle going over spec, even if only by a small amount?
 

aalkjsdflkj

Honorable
Jun 30, 2018
I wouldn't call it efficient; it uses nearly 100W more than the 1070 FE. Maybe my expectations of new-generation cards are too high.

It does have roughly three times the performance, but the watts per frame isn't staggeringly better when compared to a four-and-a-half-year-old card.

Agreed. Based on the power consumption figures, the 1070 FE was 141/146 W vs. 218/224 W for the stock 3070. Over two generations that's about a 50% increase in power for what looks to me like a doubling of performance. I'm surprised by how little increased efficiency that provides. I suspect a big part of the issue is all the space dedicated to ray tracing, which at this point seems like more of a gimmick. Eventually RT will revolutionize game graphics, but right now I can only tell the difference between RT on and RT off in playbacks and screenshots, 95% of the time. During gameplay it makes no real difference most of the time. I'd love to purchase a 3070 without any of the RT cores for a lower price and less power consumption, but that's not happening.
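Working through those numbers (a quick sketch using the power figures in this post and treating the performance uplift as roughly 2x, which is an assumption rather than a measured average):

```python
# Rough efficiency comparison using the numbers above (gaming power draw,
# plus an assumed ~2x average performance uplift for the 3070 over the 1070 FE).
power_1070_fe = 146.0   # W, gaming load per the figures above
power_3070 = 224.0      # W, stock 3070 per the figures above
perf_uplift = 2.0       # assumed average performance ratio (3070 / 1070 FE)

power_ratio = power_3070 / power_1070_fe         # ~1.53 -> ~53% more power
perf_per_watt_gain = perf_uplift / power_ratio   # ~1.30 -> ~30% better perf/W

print(f"Power increase: {power_ratio - 1:.0%}")
print(f"Perf-per-watt improvement: {perf_per_watt_gain - 1:.0%}")
```

That lands at roughly a 30% improvement in performance per watt, in line with the ~33% figure in the reply below.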
 
Reactions: King_V

King_V

Illustrious
Ambassador

So, about a 33% increase overall, except for the RT being an extra feature.

I am actually curious as to what the performance efficiency would be if they made, say, a GTX 17- series that was the RT-less version of these cards.


Any which way, though, this is definitely a case of shots fired in Navi 2's direction, assuming Nvidia can supply these cards.
 
So, about a 33% increase overall, except for the RT being an extra feature.

I am actually curious as to what the performance efficiency would be if they made, say, a GTX 17- series that was the RT-less version of these cards.


Any which way, though, this is definitely a case of shots fired in Navi 2's direction, assuming Nvidia can supply these cards.
Comparing the 2060 and the 1660 Ti using Tom's Hardware's reviews of the two cards (and keeping in mind that there are still differences in shader count), the difference in watts per frame was about 0.10 in favor of the 1660 Ti.

I still want to point out that Pascal was an outlier. It had a process jump of two generations (from the base nodes of 32nm to 14nm), whereas Ampere only had one from Turing (from the base nodes of 14nm to 10nm). The only other outlier I found with regard to watts/frame was Kepler from Fermi, but in that case NVIDIA changed so many things that I don't think it's a fair comparison.
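For anyone unfamiliar with the metric, watts per frame is just average board power divided by average frame rate. The numbers below are hypothetical placeholders chosen to illustrate a 0.10 W/frame gap, not figures from the reviews mentioned above:

```python
# Watts per frame = average board power / average frame rate.
# The inputs below are hypothetical placeholders, not review data.
def watts_per_frame(avg_power_w: float, avg_fps: float) -> float:
    return avg_power_w / avg_fps

hypothetical_2060 = watts_per_frame(avg_power_w=160.0, avg_fps=100.0)    # 1.60 W/frame
hypothetical_1660ti = watts_per_frame(avg_power_w=120.0, avg_fps=80.0)   # 1.50 W/frame
print(f"Gap: {hypothetical_2060 - hypothetical_1660ti:.2f} W per frame")
```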
 
Reactions: JarredWaltonGPU

King_V

Illustrious
Ambassador
Yeah, with the 16-series, they didn't have any sort of "everything is exactly the same as a particular 20-series model, just without the RT" option.

If Nvidia does a 17-series at all, I strongly suspect they'll go the same route: nothing will be made that can compete with a 30-series card's performance when not using RT.
 

spongiemaster

Admirable
Dec 12, 2019
I wouldn't call it efficient; it uses nearly 100W more than the 1070 FE. Maybe my expectations of new-generation cards are too high.

It does have roughly three times the performance, but the watts per frame isn't staggeringly better when compared to a four-and-a-half-year-old card.

https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/36.html

26% better than the 1070 at 1080p, 41% at 1440p, 54% at 4k. Using performance per watt as your measuring stick, it's the most efficient card available at all 3 resolutions (could change tomorrow).
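One way to read those performance-per-watt percentages (a rough interpretation that assumes efficiency stays roughly constant if you cap the frame rate, which real cards only approximate):

```python
# Converting a perf-per-watt advantage into relative power at matched
# performance: relative power = 1 / (1 + advantage). Percentages are the
# ones quoted above; the iso-performance reading is an approximation.
for resolution, advantage in (("1080p", 0.26), ("1440p", 0.41), ("4K", 0.54)):
    print(f"{resolution}: ~{1 / (1 + advantage):.0%} of the power for the same frame rate")
```

In other words, at 4K the 3070 would deliver the 1070's frame rate at roughly two-thirds of the power.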
 

Math Geek

Titan
Ambassador
i recall when the rx 480 came out and tom's tested more than 75w being pulled from the pcie slot. it was a big deal and caused a lot of grief for amd from many many angles.

what changed in how motherboards are made that it's now ok to break 75w on the pcie slot? not screaming bias or anything but this does seem rather odd to just shrug off what not too long ago was a massive deal. it's pulling too much from both connections and all it gets is a shrug and a "it should be ok"....

anyone explain why it's ok now all of a sudden??? seems like a dangerous precedent to set
 
Reactions: merlinq
Yeah, with the 16-series, they didn't have any sort of "everything is exactly the same as a particular 20-series model, just without the RT" option.

If Nvidia does a 17-series at all, I strongly suspect they'll go the same route: nothing will be made that can compete with a 30-series card's performance when not using RT.
I have a feeling they won't do a successor to the 16 series, since the 3070 is already smaller than the 2060 in die size, and I would put money on the 3060 being nearly the same size as the 1660. They'd only have to get within spitting distance of the 2060's performance for a 50-level SKU, which seems reasonable to do.

If they're going to do a 30-level SKU, I'd imagine they'll just recycle one of the 16-series Turing GPUs.
i recall when the rx 480 came out and tom's tested more than 75w being pulled from the pcie slot. it was a big deal and caused a lot of grief for amd from many many angles.

what changed in how motherboards are made that it's now ok to break 75w on the pcie slot? not screaming bias or anything but this does seem rather odd to just shrug off what not too long ago was a massive deal. it's pulling too much from both connections and all it gets is a shrug and a "it should be ok"....

anyone explain why it's ok now all of a sudden??? seems like a dangerous precedent to set
I recall the RX 480 could exceed that limit at stock. The 3070 doesn't appear to be exceeding it unless you overclock it.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Honestly I'd rather see the 12-pin have wider adoption. Fewer connectors to deal with, and I'm sure PSU manufacturers would love to save a penny or two by not needing to add that 6+2-pin appendix.
The problem is that the 6-pin and 8-pin standards are legacy. It's the old XKCD joke:
[Image: xkcd "Standards"]


6-pin and 8-pin can't go away until and unless everything stops using them, and that won't happen for decades. So the new standard is just a third thing for all the PSUs to support, rather than a replacement of the old standards.