Review Nvidia GeForce RTX 3070 Founders Edition Review: Taking on Turing's Best at $499

Status
Not open for further replies.
Did this post time travel from a period when we naively believed that a 3070 could be bought for $500? I'll keep my 1070 in perpetuity. Thanks.
 
Please don't irritate people with a $500 price for the 3070. It was never true. Think $1,000 and above now.

The lack of RGB is a giant bonus for anyone who wants their PC hidden and silent under the table.
I created an account just to complain about this LOL
 
The initial GPU review posts, which all went up on the appropriate launch dates (as in, not one second late on any of them), were all a single monolithic page. There are reasons behind that, but the short summary is that the managing editors at Tom's Hardware are now redirecting the original URL to the new paginated version. The text and charts have not changed on any of the reviews; it's just a change in presentation. The original review was posted at https://www.tomshardware.com/news/nvidia-geforce-rtx-3070-founders-edition-review and the new version with pages is at https://www.tomshardware.com/reviews/nvidia-geforce-rtx-3070-founders-edition-review -- which is why the comments reset. (You can still read the original comments in the old forum thread if you want.)

 

ctn.gooners

Just logging in to complain about the horrible availability of this card at Best Buy. Their ordering system doesn't work, and buyers are just getting screwed by scalpers.

Best Buy needs to honor orders based on the time they were placed rather than just releasing them.

And shame on Nvidia. It's a horrible company for allowing this.
 
I hate to break it to the people in the comments, but the MSRP for the 3070 is still $499. You can buy one at that price, albeit with infamously extreme difficulty. Use Hotstock and keep an eye on stores other than eBay for stock. I literally just bought a 3070 on 1/6/21 for MSRP after 3 weeks of waiting, on Best Buy, the same store another comment complained was facilitating bots that instantly sweep up everything.

Stop being so dramatic; it's just a graphics card. It will not change your life. If you want to complain about paying $1,000 to scalpers, go for it. But you paid the idiot tax and put money in their pocket. Do not assume the rest of us need to as well.
 
Use Hotstock and keep an eye on stores other than eBay for stock.
.....
Stop being so dramatic; it's just a graphics card. It will not change your life. If you want to complain about paying $1,000 to scalpers, go for it. But you paid the idiot tax and put money in their pocket. Do not assume the rest of us need to as well.

Hotstock is still a US-oriented service for US buyers. It's of no use here in Europe, and particularly not in my country.

For someone building a completely new system, the GPU shortage is a serious roadblock: the choice is between months of waiting or getting something from two generations ago at an inflated price. No surprise that people are willing to pay double the price or even more.
 

coxbw

I do not blame the websites for the lack of stock, but the bots. Assuming new items are announced today, within a few minutes we once again will not be able to get them due to bots.
 
Why are you posting a review on new hardware 3 months after it came out?
They seem to have sorted out the mess with the older articles.

I do not blame the websites for the lack of stock, but the bots. Assuming new items are announced today, within a few minutes we once again will not be able to get them due to bots.
Blame impatient buyers and scalpers. Bots are only a tool in the hands of not-so-nice people (the scalpers).
 
Fast and efficient, the RTX 3070 basically matches the previous gen 2080 Ti at less than half the cost.

Nvidia GeForce RTX 3070 Founders Edition Review: Taking on Turing's Best at $499: Read more

I think that the 3070 is a big disappointment:

Metro Exodus (a 2-year-old game) cannot keep a stable 60 fps at 1440p with RT and DLSS on.
View: https://youtu.be/mUj2aSLpE0o


Cyberpunk cannot keep a stable 60 fps at 1440p with RT and DLSS on.
View: https://youtu.be/ENY6WQpFW6A


The Medium cannot keep a stable 50 fps at 1440p with RT and DLSS on.
View: https://youtu.be/qfJBoAebC3c


A card that can't keep up in an old game or in upcoming AAA games is not a future-proof card.
The 3080 is the upgrade that's really worth it this year.
 

Diceman_2037

It's a bit surprising that Nvidia only uses a single 8-pin PEG connector to feed the 12-pin connector, though, since the 8-pin is rated for 150W.

Sure would love to see this misinformation die. The 8-pin molex connector and the cabling behind it are not rated for a max of 150W; that's the rating for its usage as an input on a graphics card and has no relevance to the capabilities of the PSU or its cabling.

Add in 75W from the PCIe slot

The PCIe slot is 66W max on 12V power, and the sensors GPU-Z reports are all 12V.

I'd be more concerned about seeing 70W drawn from there than about the 12-pin connector drawing up to 200W; the molex microfit 12-pin is good for up to 468W on paper when all 6 pins are active @ 6.5A. Only 3 live pins and 3 grounds are present on the adapter and socket on the 3070 FE.

and it's basically right at the limit

No, it isn't. This card technically isn't PCIe spec, and I don't believe there are PCIe logos anywhere on the packaging or manuals, because it doesn't use a compliant power connector. By not using a compliant power connector and not claiming PCIe compliance, it doesn't have to adhere to 150W per terminal; it can use whatever the cable is capable of delivering, as far as the insulation rating and cable gauge allow.
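The per-connector arithmetic in this post can be sanity-checked in a few lines. This is a rough sketch using the figures quoted in the thread (6.5A per live Micro-Fit pin, and the slot's 66W on 12V, i.e. 5.5A); none of it is verified against a datasheet:

```python
# Back-of-the-envelope check of the per-connector power figures quoted above.
# Assumed inputs (from the posts, not verified against datasheets): a 12V rail,
# ~6.5A per live Micro-Fit pin, and a 66W (5.5A at 12V) slot limit.

RAIL_VOLTS = 12.0

def connector_ceiling_w(live_pins: int, amps_per_pin: float) -> float:
    """Theoretical ceiling when every live pin carries its full rated current."""
    return live_pins * amps_per_pin * RAIL_VOLTS

full_12pin_w = connector_ceiling_w(6, 6.5)  # 468.0: the "on paper" 12-pin figure
fe_adapter_w = connector_ceiling_w(3, 6.5)  # 234.0: only 3 live pins on the 3070 FE
slot_12v_w = connector_ceiling_w(1, 5.5)    # 66.0: PCIe slot, 12V portion only
```

Even with only three live pins populated, the adapter's theoretical ceiling comfortably exceeds the 3070's roughly 220W board power, which is the poster's point.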
 
Sure would love to see this misinformation die. The 8-pin molex connector and the cabling behind it are not rated for a max of 150W; that's the rating for its usage as an input on a graphics card and has no relevance to the capabilities of the PSU or its cabling.

The PCIe slot is 66W max on 12V power, and the sensors GPU-Z reports are all 12V.

I'd be more concerned about seeing 70W drawn from there than about the 12-pin connector drawing up to 200W; the molex microfit 12-pin is good for up to 468W on paper when all 6 pins are active @ 6.5A. Only 3 live pins and 3 grounds are present on the adapter and socket on the 3070 FE.

No, it isn't. This card technically isn't PCIe spec, and I don't believe there are PCIe logos anywhere on the packaging or manuals, because it doesn't use a compliant power connector. By not using a compliant power connector and not claiming PCIe compliance, it doesn't have to adhere to 150W per terminal; it can use whatever the cable is capable of delivering, as far as the insulation rating and cable gauge allow.
You're only about 15 months late to the party, but for the record:

  1. Yes, most PSUs can send much more than 150W over the 8-pin cable. Did you know that the extra two pins on 8-pin PEG are for... ground! So a 6-pin connector is rated for 75W, add in two more ground pins and it's double the power delivery. Meaning the 12V lines are easily capable of delivering 300W on most good quality PSUs. But what about low quality PSUs, which absolutely do exist? There will inevitably be some PSUs with a single 8-pin connector on the end of a cable and the cable won't really be rated for more than 150W delivery. If there were two 8-pin connections, there would be less risk.
  2. There's no such thing as an "8-pin Molex connector" or "molex microfit 12-pin." Molex was a company that designed early interconnect systems for electronics, and the 4-pin Molex connector became the standard for PCs back in the 80s. But no one ever calls the modern 6-pin, 8-pin, and 12-pin connectors "Molex" because the company had nothing to do with designing them. (See, I can also be pedantic.)
  3. The main point is that it would have been trivial for Nvidia to add dual 8-pin inputs, as it already had the adapter cables available and economies of scale would come into play.
  4. You're seriously going to say that a graphics card that slots into a PCIe slot and uses PCIe power connectors for auxiliary power doesn't actually need to conform to the PCIe standard? :rolleyes:
More critically, it was rather stupid to push an early 12-pin connector, even on cards that don't need it, and then include an adapter in every card. Notice how many of the AIC partners used 12-pin connections for RTX 30-series? Exactly none. Not a single one. There are now, more than 18 months later, RTX 3090 Ti cards that use... a 16-pin connector, which is basically the same thing as Nvidia's 12-pin plus four extra signaling pins. Still, it's an official standard at least. And all of the 3090 and lower GPUs that aren't Founders Edition models will still use dual 8-pin connectors, because it really doesn't simplify card design or improve the cards in any measurable way.

It's like Nvidia crowing about how awesome the new cooling design is for the RTX 30-series (mostly the 3080), but then all of the AIC partners used traditional cooling designs that in most cases cool better than the Founders Edition equivalent. Why? Because Nvidia spent time and money on making a card that looks different, rather than a card that actually performs better. 12-pin was different for the sake of being different, not because it was actually necessary.
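The spec-versus-capability gap described in point 1 can be sketched numerically. The 8A-per-pin terminal rating below is an illustrative assumption (real Mini-Fit-style ratings vary with the terminal and wire gauge), not a figure from the posts:

```python
# Why a single 8-pin cable on a good PSU can carry far more than its 150W spec:
# the spec demands only a few amps per 12V pin, while typical terminals are
# rated much higher. The 8A per-pin rating is an assumption for illustration.

RAIL_VOLTS = 12.0
SPEC_WATTS = {"6-pin PEG": 75.0, "8-pin PEG": 150.0}
LIVE_12V_PINS = {"6-pin PEG": 3, "8-pin PEG": 3}  # most 6-pin cables carry 3x 12V too

def spec_amps_per_pin(name: str) -> float:
    """Current per 12V pin actually demanded by the spec power rating."""
    return SPEC_WATTS[name] / RAIL_VOLTS / LIVE_12V_PINS[name]

def wire_capability_w(name: str, amps_per_pin: float = 8.0) -> float:
    """Rough ceiling if each 12V pin ran at an assumed terminal rating."""
    return LIVE_12V_PINS[name] * amps_per_pin * RAIL_VOLTS

# 8-pin spec needs only ~4.2A per pin; at an assumed ~8A per pin the same
# wiring is good for ~288W, in line with the ~300W capability in point 1.
```

The takeaway matches the post: the per-pin current the spec asks for is a fraction of what decent terminals and wiring can carry, so the 150W figure is a compliance limit, not a physical one.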
 

Eximo

Molex does design the Mini-Fit and Micro-Fit systems. The odd one is that we call the 4-pin connector "Molex" when it was designed by AMP. The old six-pin AT connector was by Molex.
 
  1. Yes, most PSUs can send much more than 150W over the 8-pin cable. Did you know that the extra two pins on 8-pin PEG are for... ground! So a 6-pin connector is rated for 75W, add in two more ground pins and it's double the power delivery. Meaning the 12V lines are easily capable of delivering 300W on most good quality PSUs. But what about low quality PSUs, which absolutely do exist? There will inevitably be some PSUs with a single 8-pin connector on the end of a cable and the cable won't really be rated for more than 150W delivery. If there were two 8-pin connections, there would be less risk.
  2. There's no such thing as an "8-pin Molex connector" or "molex microfit 12-pin." Molex was a company that designed early interconnect systems for electronics, and the 4-pin Molex connector became the standard for PCs back in the 80s. But no one ever calls the modern 6-pin, 8-pin, and 12-pin connectors "Molex" because the company had nothing to do with designing them. (See, I can also be pedantic.)
  3. The main point is that it would have been trivial for Nvidia to add dual 8-pin inputs, as it already had the adapter cables available and economies of scale would come into play.
  4. You're seriously going to say that a graphics card that slots into a PCIe slot and uses PCIe power connectors for auxiliary power doesn't actually need to conform to the PCIe standard? :rolleyes:
More critically, it was rather stupid to push an early 12-pin connector, even on cards that don't need it, and then include an adapter in every card. Notice how many of the AIC partners used 12-pin connections for RTX 30-series? Exactly none. Not a single one. There are now, more than 18 months later, RTX 3090 Ti cards that use... a 16-pin connector, which is basically the same thing as Nvidia's 12-pin plus four extra signaling pins. Still, it's an official standard at least. And all of the 3090 and lower GPUs that aren't Founders Edition models will still use dual 8-pin connectors, because it really doesn't simplify card design or improve the cards in any measurable way.

It's like Nvidia crowing about how awesome the new cooling design is for the RTX 30-series (mostly the 3080), but then all of the AIC partners used traditional cooling designs that in most cases cool better than the Founders Edition equivalent. Why? Because Nvidia spent time and money on making a card that looks different, rather than a card that actually performs better. 12-pin was different for the sake of being different, not because it was actually necessary.

Indeed, 12V PCIe cables are capable of supporting power above 75W. If I can trust my be quiet! Dark Power Pro 11 PSU manual, a single cable in 8-pin configuration can support up to 300W; obviously, only when the original cables are used. For other PSUs, users should check their PSU manual and avoid exceeding the current/power limit per output. 150W per 8-pin 12V PCIe power cable seems a safe option for an average PSU with whatever cable.
 

Eximo

I should add that a 6-pin PCIe cable only has to have 5 wires in it: two 12V, two ground, and one signal. The 8-pin opts to add a third 12V wire and two additional grounds, so effectively most 6+2-pin PCIe cables carry all three 12V wires anyway.

What the cable can really handle comes down to the gauge of the wire used and whether the pins have been soldered or crimped. The specification was meant to prevent excessive wire heating and voltage drop; really, you can just keep pumping current up until the wires get hot enough to melt. Not advisable, but it is just wire.
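The point about gauge and heating can be illustrated with a quick I²R estimate. The wire resistance and cable length below are illustrative assumptions (roughly 18 AWG copper over a 0.6 m run), not values from the thread:

```python
# Rough I^2 * R estimate of resistive heating in a PCIe power cable.
# Assumptions for illustration only: ~18 AWG copper (~0.021 ohm/m),
# a 0.6 m cable, load split evenly across three 12V wires, and the
# ground return path doubling the effective conductor length.

OHMS_PER_M = 0.021   # approx. resistance of 18 AWG copper
LENGTH_M = 0.6

def per_pair_loss_w(total_watts: float, volts: float = 12.0,
                    live_wires: int = 3) -> float:
    amps = total_watts / volts / live_wires   # current per 12V wire
    resistance = OHMS_PER_M * LENGTH_M * 2    # supply + return path
    return amps ** 2 * resistance             # heat dissipated per wire pair

loss_150w = per_pair_loss_w(150.0)  # ~0.44W of heat per pair: benign
loss_300w = per_pair_loss_w(300.0)  # double the current, 4x the heat
```

Losses scale with the square of the current, which is why the spec margin matters more for heating and voltage drop than for raw deliverable power.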
 
Molex does design the Mini-Fit and Micro-Fit systems. The odd one is that we call the 4-pin connector "Molex" when it was designed by AMP. The old six-pin AT connector was by Molex.
Molex technically adapted the AMP standard and tweaked it, I believe. It was backward compatible with AMP, though.
But while Molex may still be involved with interconnects, I have never in my life heard someone refer to a 6-pin, 8-pin, 12-pin, or even 24-pin connector as "Molex." (Well, until this thread. LOL) They're commonly called by names like 6-pin/8-pin PEG (PCI Express Graphics) or just PCIe, 8-pin EPS12V, or 24-pin ATX.
 