Nvidia GeForce GTX 1000 Series (Pascal) MegaThread: FAQ and Resources

Page 51 - Tom's Hardware Community
Also, has anybody found anything online as to when an EVGA 1080 Hybrid or MSI GTX 1080 Seahawk will be available on Newegg or TigerDirect? Or any other non-reference cards, for that matter. It's been over 2 weeks since release and I haven't been able to find anything, except on eBay, but for ridiculous prices.
 
Wish I knew the answer to that. I would expect those models to be slower coming than the others; history has shown that the higher-end models are delayed a good bit. I have not seen any hints at when those will show. Clock speeds and such are not even released yet for the EVGA; nothing but the pic on their main 1080 page.
 
Some people already have custom 1080s so you just have to keep checking to see if they're in stock.
 


It does seem shocking how quickly graphics cards have aged. However, I think it's more a case of forgetting how old our cards are. I first bought a GTX 970 in about October 2014, which is one year eight months ago, yet it does not feel anything like that long ago. My GTX 650 Ti Boost is practically a dinosaur. That is compounded by the fact that that card was out for ages before I bought it.

After my 970 had to be returned I only moved on to a 980 in late 2015. It is already only a 1080p card (for max settings and 60fps), and only just that.

I remember looking longingly at the GTX 760 myself and thinking how powerful it must be, or how much more capable than my 650 Ti Boost.

 


I still have a dinosaur HD 5870 2GB, and even though I am gonna upgrade as soon as flagship AMD GPUs hit the market, I can play all the newer games at 1080p without a problem. Just lower the graphics a bit; the card is 6 years old.
While I'm not implying my card hasn't aged, I do also think we gamers sometimes overreact to how old the tech we own is when the new, shiny knight on a white horse arrives and blinds us with his greatness (i.e. the GTX 1080).
 
A new 1080 review from Guru3D. This time it is the Gigabyte G1 Gaming, which is usually a good card:

http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-1080-g1-gaming-review,1.html

MSI is clearly not being shy about sending out samples of its new cards. Here is another Gaming X 1070/1080 review if you need more info on these cards :)

http://www.overclockersclub.com/reviews/msi_gtx1080_gtx1070_gaming_x_8g/

The Gainward Phoenix GLH 1080 has been tested as well, by hardware.info:

http://us.hardware.info/product/351450/gainward-geforce-gtx-1080-phoenix-glh-8gb/testresults

So far all these custom cards seem to be right on par with each other, running the same fps and temps for the most part from what I am seeing. Price is gonna be a big issue with the 1080 cards if they are all gonna perform roughly the same.
 
Something came as a shock for me over power usage in the Pascal cards.

The GTX 1080 is specified to use 20W more than the GTX 980 on the Nvidia website; reference cards, I assume.

However, reviews of the Asus Strix 1080 show a very different result. They claim the Asus Strix 1080 uses 50W more than the Strix 980. That's quite a power draw.

NB I think the most I ever saw my 980 and i5-4690 use was 320W from the power supply. (Usage at the wall socket would be higher.) It was measured using Corsair Link.
 
The few reviews we have seem to average about 200W at stock speeds for the custom cards; overclocking of course adds to this. The reference cards were clearly at about the 180W point, but I doubt any of the OC'd custom cards will get that low at all.
 


Looks like a good performer. It hits the same overclocking limit of 2.0-2.1GHz that all the other cards seem to be hitting. This is more backup for my theory that you may as well go with a cheaper model custom cooled card when looking at Pascal since they all seem to overclock about the same. As long as the card you choose has sufficient cooling you will probably get to the 2.0-2.1GHz range.
 


The relevant point is what level you are playing at ...

At 1440p, w/ the 970
Tomb Raider goes from 29.8 to 58.7; scaling = 96.98%
Battlefield 3 goes from 62.1 to 121.4; scaling = 95.49%
Far Cry 3 goes from 35.6 to 68.8; scaling = 93.26%
Crysis 3 goes from 22.5 to 43.3; scaling = 92.44%
Thief goes from 70.8 to 136.1; scaling = 92.23%
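For anyone wanting to check these figures, the scaling percentage is just how much of a perfect 2x speedup the second card delivered. A quick sketch, using the fps pairs from the list above:

```python
# SLI scaling at 1440p with GTX 970s: percent of a perfect 2x speedup
# actually achieved. The (single-card, SLI) fps pairs are the numbers
# quoted in the list above.
fps_pairs = {
    "Tomb Raider": (29.8, 58.7),
    "Battlefield 3": (62.1, 121.4),
    "Far Cry 3": (35.6, 68.8),
    "Crysis 3": (22.5, 43.3),
    "Thief": (70.8, 136.1),
}

def scaling_pct(single_fps, sli_fps):
    # 100% would mean the second card exactly doubled the frame rate.
    return (sli_fps / single_fps - 1) * 100

for game, (single, sli) in fps_pairs.items():
    print(f"{game}: {scaling_pct(single, sli):.2f}%")
# Tomb Raider: 96.98% ... Thief: 92.23%
```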

With the 770, and again with the 970, at the resolution you were likely playing at (1080p), had just moved to, or would like to play at (1440p), scaling was great and SLI made sense.

OTOH, the 1070 is a DisplayPort 1.4 card in a market with no DP 1.4 monitors. What can a pair of 1070s do with 120+ Hz at 4K? Nothing ... none exist as yet. Yes, certainly CPU limitations play a role, but the critical factor has always been performance at relevant resolutions. No one bought SLI to play at 1600 x 900.

But even accounting for the decrease in scaling as you drop resolution, we have never seen the low numbers we are seeing now. For one, at least one site seems to have purposely gimped their test using two unmatched cards, one of which is known to thermal throttle ... they also picked games that aren't exactly known for their scaling ability. What I am saying here is that, at this point in time, we don't have enough information. nVidia has been purposely trying to decrease the attractiveness of SLI'd x70s as compared to the x80 for some time, as it cannibalizes x80 sales. Until there's more data with more popular titles so that we can compare against previous 9xx results, there's no way to tell how much is a) CPU limitations, b) driver limitations, or c) the author's agenda.
 

I can't agree with that. Buying two x70s will always cost more than a single x80. Why would nVidia cut the rope on the x70s when they're a solid income source?
 


It's more than cooling ... as we see in this article on the 970s, there are numerous differences in design and construction between the cards, as detailed in the bottom third of pages 2 through 4 of the review:

http://www.bit-tech.net/hardware/graphics/2014/09/19/nvidia-geforce-gtx-970-review/1

While the three cards tested have improved coolers, they do perform quite differently ... the thing is, long before you get to the performance testing page near the end of the article, you can easily "predict" the order of finish when overclocked by reading about the component differences on pages 2-4: number of power phases on the VRM, VRM/MOSFET cooling, available power, quality of chokes used. Of two cards with great coolers, one which improves all those components (MSI) will simply outperform one which uses reference or only slightly modified reference components ... the coolers may cool the GPU equally, but VRM temps limit OCs as often as, if not more often than, GPU temps. When a card uses heat sinks and thermal pads on the VRM, MOSFETs, etc., it will have better performance whenever GPU temp is not the limiting factor.
 


This is exactly what I was thinking - they cobbled together a hodgepodge SLI setup using a PCI extender, unmatched cards, and a previous-generation SLI bridge, and then tested games with poor SLI scaling. I couldn't believe they were willing to let it go at that - I have SLI 980s and they scale very well in, say, DOOM, for one example. I hope somebody else gets two matched 1070s with a new HB bridge and tests games that actually scale well. As an SLI user I'm not exactly about to rush out and buy a game that doesn't support it, now am I? So why would they test games that don't?
 


But then it kind of defeats the purpose. You go SLI for more performance. I wouldn't want to limit myself to only a handful of games that scale well with SLI.
 


History says otherwise ... we built dozens each generation, and two x70s or even cheaper cards have been both cheaper and better performing ... As such, nVidia has juiced the price of the x70s and even purposely nerfed performance. With the 9xx series they reduced the throttling point so as to increase the performance gap, and still they only managed 11.7%. How do you justify paying twice as much for a 12% performance increase?

1. Two $200 560 Tis were 40% faster than the $510 580 ... $110 back in pocket and 40% more performance.
2. Two $150 650 Ti Boosts were cheaper than one $500 680 and outperformed it ... $200 less and more performance!
3. The 980 was going for $640 - $650 when my son and half his friends, bought two 970s.

As I said, nVidia has been adjusting pricing and performance to make this less attractive because the margins on the 980 exceed those of the 970 ... think about that. A card which is 20% faster does not cost 20% more to make ... and they aren't charging 20% more, they are charging 100% more.

Getting back to No. 3 above: my son paid $319 for his first MSI 970 Gaming when the MSI 980 Gaming was $639 ... each card came with two $60 games. He had his friend buy a second 970 and paid him $260 for it new in box; his friend kept the coupons for the two free games (each worth $60 a head) ... you couldn't buy a 980 for $570 then. Aside from base cost, the 980 would have to cost $457 to be competitive on an fps-per-dollar basis.
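For what it's worth, that $457 figure falls out of simple fps-per-dollar arithmetic: two 970s at the $319 retail price against the ~40% average SLI advantage quoted elsewhere in the thread. A rough sketch of the math:

```python
# Rough fps-per-dollar break-even for a single GTX 980 versus two GTX 970s.
# The prices and the ~40% average SLI advantage are the figures from this
# thread; this is back-of-envelope math, not a benchmark.
sli_price = 2 * 319      # two MSI 970 Gaming cards at $319 retail
sli_speedup = 1.40       # 970 SLI averaged ~40% faster than one 980

# Equal fps per dollar means: 980 price / 1.0 == SLI price / 1.40
break_even_price = sli_price / sli_speedup
print(f"980 break-even price: ${break_even_price:.0f}")  # ~$456
```

That lands within rounding of the $457 figure quoted above.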
 


I pretty much agree with all you just said but still cannot find "evidence", if you will, for nVidia nerfing their smaller cards.
 


Huh? How does 25 out of 27* games tested constitute a handful? Did you read the post in question? It's the reverse ... no, it's worse than the reverse.

* EDIT: I said 26 before; I missed one. Dead Rising is also slower in SLI, but that game has a 30 fps cap.

If you don't want to limit yourself to improved performance in a "handful of games", then you needed to avoid the 980 over twin 970s.

- In only two instances out of 27 was the 980 faster ... when tested; I haven't checked whether the two games have since had SLI profiles added.
- Over the 27 games, 970s in SLI averaged 40% faster than the 980.
- Did you play Wolfenstein: New Order or the fps-cap-limited Dead Rising? Without those titles it's a complete shutout.
- On popular demanding games, scaling was over 90% for Tomb Raider, Battlefield 3, Far Cry 3, Crysis 3, Thief, etc.

Your position is that you would buy the 980 over two 970s because the 980 is faster in 2 games and the 970 is faster in 25 ?

Dragon Age Inquisition* goes from 38.8 with a 980 to 66.0 with twin 970s for performance increase of 70.10%
Thief goes from 81.1 with a 980 to 136.1 with twin 970s for performance increase of 67.82%
Far Cry 3 goes from 41.0 with a 980 to 68.8 with twin 970s for performance increase of 67.80%
Tomb Raider goes from 35.0 with a 980 to 58.7 with twin 970s for performance increase of 67.71%
Battlefield 3 goes from 73.1 with a 980 to 121.4 with twin 970s for performance increase of 66.07%
Bioshock Infinite goes from 88.0 with a 980 to 143.9 with twin 970s for performance increase of 63.52%
Crysis 3 goes from 26.8 with a 980 to 43.3 with twin 970s for performance increase of 61.57%
Splinter Cell: Blacklist goes from 57.1 with a 980 to 92.2 with twin 970s for performance increase of 61.47%
Battlefield 4 goes from 53.1 with a 980 to 83.2 with twin 970s for performance increase of 56.69%
Batman: Arkham Origins goes from 95.0 with a 980 to 148.3 with twin 970s for performance increase of 56.11%
Assassins Creed* Unity goes from 24.1 with a 980 to 37.0 with twin 970s for performance increase of 53.53%
Far Cry4* goes from 53.4 with a 980 to 81.7 with twin 970s for performance increase of 53.00%
Shadow of Mordor* goes from 70.8 with a 980 to 107.1 with twin 970s for performance increase of 51.27%
Ryse* goes from 62.4 with a 980 to 94.3 with twin 970s for performance increase of 51.12%
Watch Dogs goes from 57.1 with a 980 to 85.1 with twin 970s for performance increase of 49.04%
Grid 2 goes from 95.5 with a 980 to 141.8 with twin 970s for performance increase of 48.48%
Assassins Creed goes from 43.2 with a 980 to 61.8 with twin 970s for performance increase of 43.06%
Crysis goes from 52.0 with a 980 to 74.0 with twin 970s for performance increase of 42.31%
Civilization Beyond Earth* goes from 80.2 with a 980 to 113.7 with twin 970s for performance increase of 41.77%
Metro LL goes from 53.2 with a 980 to 74.6 with twin 970s for performance increase of 40.23%
WoW: Mists of Pandaria goes from 126.3 with a 980 to 174.8 with twin 970s for performance increase of 38.40%
COD Advanced Warfare* goes from 113.1 with a 980 to 154.5 with twin 970s for performance increase of 36.60%
Witcher 3 goes from 48.0 with a 980 to 63.8 with twin 970s for performance increase of 32.92%
Alien Isolation* goes from 99.0 with a 980 to 121.0 with twin 970s for performance increase of 22.22%
Diablo III: Reaper of Souls goes from 183.0 with a 980 to 220.8 with twin 970s for performance increase of 20.66%
Wolfenstein: New Order goes from 61.8 with a 980 to 56.2 with twin 970s for performance increase of -9.06%
Dead Rising* goes from 43.6 with a 980 to 38.6 with twin 970s for performance increase of -11.47%
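Each line above is just the percent change from the single-980 fps to the twin-970 fps. A minimal sketch that reproduces a few of the entries, with fps values copied from the table:

```python
# Percent increase of twin 970s over a single 980 - the same formula behind
# every line in the table above. The fps pairs are copied from the table.
def pct_increase(fps_980, fps_sli):
    return (fps_sli / fps_980 - 1) * 100

samples = {
    "Dragon Age Inquisition": (38.8, 66.0),   # best case listed
    "Witcher 3": (48.0, 63.8),
    "Wolfenstein: New Order": (61.8, 56.2),   # one of the two SLI losses
}
for game, (fps_980, fps_sli) in samples.items():
    print(f"{game}: {pct_increase(fps_980, fps_sli):+.2f}%")
# Dragon Age Inquisition: +70.10%, Witcher 3: +32.92%,
# Wolfenstein: New Order: -9.06%
```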

That being said, we'll have to see how the 1070/1080 do ... both cards are so powerful that it may make less sense this time around to go SLI, and nVidia is certainly doing everything it can to make this option less attractive, as it loses money when folks choose two cheaper cards in SLI over its top dog.

Current pricing has also increased the relative price ratio. Until drivers mature over the next 2-3 weeks, it's difficult to predict where the SLI option will fall for 1440p / 1080p.

 


I said nothing like that. I merely replied to you saying you won't be buying a game that doesn't support SLI.

Regarding SLI, though, it's not all sunshine and rainbows. I have never used an SLI setup but many people complain about micro-stutters, screen tearing and such.
 


I'm just not sure an article about how 970s overclock is all that relevant to the 1070 and 1080. Several 1080 reviews have come out and all of them mention that they are able to achieve overclocks in the 2.0-2.1GHz range. This is true in the founders cards and it's also true in the Gigabyte and MSI reviews I have read.

I don't doubt that all the parts you talk about are better in the more expensive models but if they have the same overclocking limit as the cheaper designs does that really matter? The FE cards won't perform as well because they will hit that thermal throttle point and lower their clock speeds more often but assuming that the cheaper cards' coolers are sufficient they shouldn't run into thermal throttling much, if ever. If they aren't thermal throttling they should theoretically be able to hold the same overclocks as the better models.

Again, all those parts you talk about probably are better but if Nvidia is artificially capping the amount of overclock they can achieve with their limiters I'm not sure why I should pay for them.
 


That is why I am waiting on AMD and their tier-1 GPUs that, supposedly, will give the user much, much more control over overclocking the card.
 


I have often wondered how many of those "complaints" come from AMD's paid shills.
 


This was my point - people that use SLI won't be buying games that don't support it well, at least until an SLI profile is added. Why then, in the 1070 SLI review, did they only benchmark games with terrible scaling or brand new ones that aren't supported yet? It seemed like a fairly obviously biased review in favor of not supporting SLI. Today, when the vast majority of AAA titles support SLI fairly well within a month or so after release, there's certainly no shortage of titles with which to show off what SLI can do, but again, for whatever reason, they didn't.
 


You did say that doing SLI would limit you to only "a handful of games" ... clearly 25 out of 27 does not constitute a "handful". Like the "telephone" game kids play, each time an erroneous post is repeated, it grows in the telling. Here's your quote:

But then it kind of defeats the purpose. You go SLI for more performance. I wouldn't want to limit myself to only a handful of games that scale well with SLI.

I didn't say I wouldn't buy a game that doesn't support SLI, just that I didn't play, nor do I know anyone who played, the mentioned game that didn't support SLI. It's not that the lack of SLI support would be a deal killer, just that I don't play a lot of obscure games.

You have never used SLI and yet repeat the same thing that a lot of people say who have also never used SLI ... we have built dozens of SLI systems, we own 3 SLI systems, and have never seen microstuttering ... saw screen tearing once in the BF3 beta, and it went away with a driver update. Granted, maybe there is a game out there, a poor console port perhaps, that has the issue, but no one here has ever seen it, nor has anyone we have ever built for reported it.

Yes, back in the day, before the current x00-series numbering scheme, nVidia supported SLI on lower-end cards. When one chose, poorly, one of those cards to SLI, you would sometimes experience those issues ... but what you are saying is equivalent to "cell phones can't do wifi" ... back then, no; today, yes. nVidia doesn't support SLI on that level of card anymore.

Here's a review of a low-to-medium-end card with SLI. That's 3 tiers, not 1 tier, below the top ... $170 MSRP, $150 retail, still no mention of microstuttering, and it still provides comparable or better performance:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost_SLI/23.html

After running the GeForce GTX 650 Ti Boost SLI through our test suite, I have to admit that I'm impressed. The duo delivered performance easily matching and often exceeding much more expensive single-card options such as the GeForce GTX 680 and Radeon HD 7970 GHz Edition, and they don't cost as much. SLI multi-GPU scaling works well with all of our titles except for F1 2012. Scaling by going from one to two GTX 650 Ti Boost cards is around 70%, even with F1 2012 taken into account. Unlike AMD, NVIDIA does a good job of maintaining its SLI profiles, so you should be able to play new games without a long wait for multi-GPU support. However, the risk that a game will not be supported still exists, and you might, at worst, end up with single-card performance. This is in my opinion, given the massive performance-per-dollar advantage, an acceptable tradeoff. I would definitely recommend a GTX 650 Ti Boost SLI setup to a friend looking to spend as little money as possible on a high-end gaming rig.
* Note F1 support was later provided.

So here we have it: 1 game in the 17-game test suite didn't support SLI **at the time of testing**, and 1 game in the 19-game suite of the 970 test didn't support SLI. Under no stretch of the imagination are you thereby limited to only a handful of games, as 18 out of 19 or 16 out of 17 does not constitute a "handful".

 
My biggest concern with XFire/SLI would be those games that are not well optimized for it. You drop back to a single card which is not as good as it could be, since you went with 2 lower cards to better a single higher card. Great idea when both cards can work, not so great when you have to drop back to a single card. For me personally, I'd rather get the single better card and just have relatively consistent performance from every game, and not have to go back and forth between great 2-card gaming and not-so-great single-card gaming.

Just my personal opinion on the subject. I'll stick to a single card for now :)
 
