Nvidia GeForce GTX 1070 8GB Pascal Performance Review


TJ Hooker

Titan
Ambassador

Am I blind? Where are you seeing these values? All I see are the power over time graphs for the PCIe slot power.
 


Yeah I know, nowhere did the author of the article even state the average.

One thing I don't understand is this statement:
In order to find out, we performed 10 to 20 tests per board. Step by step, we both overclocked them and lowered their power target to achieve the lowest possible power consumption and performance. The following bar graph shows the three main results representing the lowest possible power consumption, stock settings, and the highest possible overclock.
So why are they trying to lower power consumption and then putting that power consumption data on the graph? I just don't understand this statement. It's as if it was higher than 150W, but they dialed it down to Nvidia's specification? I also don't know why the average for the graph below was not shown.
[Image: power consumption bar graph from the review]
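For what it's worth, the power-target half of that "step by step" procedure can be approximated in software. A minimal sketch, assuming nvidia-smi is on the PATH and the card allows its power limit to be changed; run_benchmark() below is a hypothetical placeholder for whatever workload is actually being measured:

import subprocess

def set_power_limit(watts):
    # Lower (or raise) the board power target; needs admin/root rights on most systems.
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def read_power_draw():
    # Ask the driver for the current board power draw in watts.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return float(out.strip())

def run_benchmark():
    # Hypothetical placeholder: run the actual game/benchmark here and return its average fps.
    return 0.0

# Sweep the power target downward from stock and record power and performance at each step.
for target in range(150, 90, -10):
    set_power_limit(target)
    fps = run_benchmark()
    print(f"target={target} W  draw={read_power_draw():.1f} W  fps={fps:.1f}")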
 

d_s_c_8

Commendable
Jul 9, 2016
6
0
1,510


The 12V rail has a tolerance of ±5%, so a card consuming 75W will draw somewhere between 6.58A @ 11.4V and 5.95A @ 12.6V. That range of 5.95A to 6.58A is still over the 5.5A allowed by the PCIe spec.
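A quick sanity check of those numbers (a minimal sketch; the 5.5A figure is the slot's 12V allowance in the PCIe CEM spec):

# Current drawn from the slot's 12V supply at a given power,
# evaluated across the +/-5% voltage tolerance window.
NOMINAL_V = 12.0
TOLERANCE = 0.05
SLOT_12V_LIMIT_A = 5.5  # PCIe CEM allowance for the slot's 12V rail

power_w = 75.0
v_low = NOMINAL_V * (1 - TOLERANCE)   # 11.4 V
v_high = NOMINAL_V * (1 + TOLERANCE)  # 12.6 V

i_max = power_w / v_low    # ~6.58 A at the low end of the tolerance
i_min = power_w / v_high   # ~5.95 A at the high end

print(f"{i_min:.2f} A to {i_max:.2f} A vs. the {SLOT_12V_LIMIT_A} A slot limit")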
 

d_s_c_8

Commendable
Jul 9, 2016
6
0
1,510


Full paragraph quote from page 7 of review:
"Taking a closer look at the motherboard slot yields a surprising finding: none of the cards in this round-up use the 3V rail at all. This means that the PCIe slot doesn’t really provide the 75W most enthusiasts assume it does, since the 12V rail only offers about 65W on its own. This is almost exactly where Nvidia’s GeForce GTX 1070 Founders Edition ends up, with spikes well in excess of 75W. They're not particularly dangerous, but can cause audible artifacts if you're using on-board audio on a poorly-designed motherboard."

From the paragraph above, the review clearly implies the card was averaging 75W from the PCIe slot. The graph below shows data points coalescing around the 75W line, from which we can deduce an average of about 75W drawn from the PCIe slot.

[Image: GTX 1070 Founders Edition PCIe slot power draw over time, from the review]
 

kamhagh

Honorable
Mar 10, 2013
331
0
10,810
I wanted to build a PC with this GPU, but then I remembered there are no games I like :D And paying $2,000 just for gaming is a bit much, so I'm gonna buy a Chromebook :D Unless a new BioShock comes out; I would pay $5,000 to play it :D
 


And it really doesn't matter anyway, just like how the 75W and 150W specs of the PCIe cables don't matter either.
 

TJ Hooker

Titan
Ambassador
That's not how I read it at all. As @neblogai said, I think they're simply talking about the spec for slot power (66W for the 12V rail, 9W for the 3.3V rail), not about how much power the cards in question are drawing through the slot. As for the graph... I really don't think you can conclude an average just by looking at it. No idea why they didn't just state what the average is.
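For reference, here's roughly where those slot numbers come from (a minimal sketch; the 5.5A and 3A figures are the per-rail current allowances for a 75W graphics slot in the PCIe CEM spec):

# Power budget of a PCIe x16 graphics slot, broken down by rail.
RAIL_12V_V, RAIL_12V_MAX_A = 12.0, 5.5  # ~66 W
RAIL_3V3_V, RAIL_3V3_MAX_A = 3.3, 3.0   # ~9.9 W

p_12v = RAIL_12V_V * RAIL_12V_MAX_A
p_3v3 = RAIL_3V3_V * RAIL_3V3_MAX_A

print(f"12V: {p_12v:.1f} W, 3.3V: {p_3v3:.1f} W, total: {p_12v + p_3v3:.1f} W")
# A card that never touches the 3.3V rail only has the ~66 W 12V budget to work with.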
 

ssdpro

Honorable
Apr 10, 2013
162
0
10,680


I can think of three reasons to purchase the FE over cards with partner designed coolers:

1) Desire to exhaust heat out the rear of the chassis. If you make use of M.2 drives and a card or two covers them up and exhausts heat in that general area, you will find that an already hot M.2 drive gets a whole lot hotter and throttles. I am such a person and can verify that reality.
2) The overclockability lottery is randomized with the FE cards. There are tiers of overclockability. Partners test samples and then apply a tier system based on their performance. All cards meet a certain standard. Those that test the highest become higher-end factory-overclocked products (see EVGA FTW). Those that meet a baseline standard but have a lower ceiling become standard edition products. Reports on partner forums seem to indicate the FE cards cover that entire spectrum: you might get one that is much more capable than the specification (like the FTW products), or you might get one that has a lower ceiling (like the standard editions).
3) A card with one fan has a lower chance than a card with two or more fans of being a "ticker".

That said, I think the better choice this time around is a product with a custom cooling design. That by no means makes the purchaser of an FE card an idiot. There are valid reasons to purchase the FE.

 


Probably 90% of people who purchased them don't have small cases, and 97% probably don't have an M.2 drive. Let's say that 95% of FE purchasers did not have those valid reasons and just wanted to get their hands on one as quickly as possible. Also, with proper case airflow, aftermarket coolers don't even make the case hot at all.
 
I am really liking these new detailed power consumption tests. IMO, if you're overclocking your CPU and/or GPU, it's a great idea to have the graphics card drawing most of its power from the PCIe PSU connector, leaving more, cleaner power for overclocking the CPU and putting less strain on motherboard components.
 


I don't think it'd really affect the CPU; it seems it only affects audio.
 

neblogai

Distinguished


CPUs have their own power connector. And I think those spikes could only affect the basic analog audio jacks on the motherboard, not audio that is transferred by anything digital: optical, DVI-D, HDMI, or DP.
 

sillynilly

Reputable
Jan 6, 2016
170
0
4,680
The FE looks the best, hands down, for me. I also prefer them to vent out of the case and not into it, and I do run a PCIe SSD, so it matters to me.

While I don't have a 1070 - I did grab a 1080 - these are beautifully made. Love me some team Green!
 


I'm just thinking that if the GPU isn't hogging power from the motherboard, everything else has a better chance of running more stably.
 

mahanddeem

Distinguished
Apr 30, 2007
496
3
18,865
Why not always compare against actual factory-overclocked cards, like the MSI, Gigabyte, and Asus factory-OC'ed Matrix, G1, TwinFrozr, etc. 980 Tis and 980s?
It would be more realistic to compare against these cards, since they are more common than reference-design ones.
 

stairmand

Distinguished
Apr 21, 2009
40
3
18,535
I wish Tom's would stop using Project CARS for testing. It uses PhysX and it's massively skewed in favour of Nvidia cards.
 


Nothing new here .... unfortunately we still have uninformed folks posting "they all have the same GPU, just buy the cheapest one and overclock it".

Ya woulda thought this practice would have stopped when reference cards and "non-reference cards with reference PCBs" were failing left and right with the GTX 570. Unfortunately, as time goes on, the old adage "good things come to those who wait" is considered less and less.

1. Non-reference cards provide an opportunity for AIB partners to distinguish themselves. This is a delicate dance between incurring more costs for premium components and extra PCB component cooling, and having that actually produce a performance advantage. Vendors seem to rethink their approach here year to year, as we rarely see any one vendor holding the title from one year to the next. Which one finishes "on top" often isn't known for a while.

2. Obviously, any bugs, design errors, fabrication issues, BIOS problems, or fan curve mistargets will be corrected in later steppings.

3. Price will be essentially "what the market will bear". Buying when supply is low and demand is high ... as in right after a new card is released ... means you will pay more to get less.

I was glad to see both camps make a mistake here. This allows it to be looked at as what it is .... a reference versus non-reference issue, without the distraction of those wanting to make it an AMD versus Nvidia issue. Both camps made a misstep here, and both of them are "paying" for it by laughing all the way to the bank.

It's kind of funny how each side only sees the issues in the other camp.

Many have made an issue of the power problem with AMD, but don't seem to notice the FE throttling issue, and post distorted reasoning arguing that it isn't taking place.

The other side rails about the price of the 10xx cards being way over the published MSRP, but fails to acknowledge that the cheapest 480 you can actually buy is $80 over MSRP.

Buying, making decisions, or even arguing about which card is better should always wait until non-reference cards are available, at which point the limitations of the cooler and PCB are no longer gimping card performance.


Any chance we will see benchmarks revisited for the AIB cards? Would sure like to see how min fps stacks up against the reference models, given the throttling issues.
 

I think it is OK to keep benchmarks like this, but it should be explained in the benchmark why there is such a difference. It is also not the only bench AMD does poorly in; scroll down to the Tomb Raider results. This isn't a new thing, and it has always concerned me that AMD drops the ball in performance on a couple of current-gen titles every time, whereas it hardly ever seems to go the other way. Does AMD need to put some more effort into driver tweaks for these titles, perhaps? Or is there nothing they can do; is it the game code?
 

neblogai

Distinguished


But does it matter if all the usual review titles get perfectly optimised while other games are not? And we have to understand that proprietary tech like PhysX and HairWorks cannot be truly optimised by both companies.
But I agree with the need for a tech intro and an explanation of why this or that is important. A more diverse list of benchmarked games would also be great: for example, a MOBA title at different settings, Doom as a representative of Vulkan, and some DX12 game that is light on the CPU plus another that is very CPU-heavy, like Total War: Warhammer, to see the benefit of multiple cores finally being used and their positive effect on minimum frame rates.
 