Nvidia GeForce GTX 1080 Pascal Review

Status
Not open for further replies.

mapesdhs

Distinguished


Probably depends on the game, resolution, etc. Best to check various reviews and see if you can find performance data to compare to 780 SLI, even if some extrapolation is required (e.g. from data available for 780 Ti SLI). The benchmark databases on Tom's, etc. should help.
 

morpheas768

Distinguished
Mar 3, 2009
270
0
18,960
Honestly, I feel sorry for those "poor" people who are going to buy the Founders Edition of this card. It's 100 dollars extra to get a reference card just a bit sooner than the properly good partner boards.

I mean, yeah, if you must have this card as soon as possible, you never want to OC, and price means nothing to you, then by all means go ahead and buy it.
But to me, this is just a not-so-elaborate scheme for Nvidia to make even more money.
They're pretty tired of partners selling more than 90% of the cards with NV GPUs, and they want a piece of the action, a real piece, not a tiny one.
Nvidia even has a plan to sell the cards directly now, without third-party sellers eating into their margins.

I'm not saying Nvidia is evil or anything, I'm just saying that you'd be a fool not to recognize what they're doing.
 

TJ Hooker

Titan
Ambassador


This doesn't really make sense. First off, every partner card still has an Nvidia-made GPU in it, so Nvidia makes money regardless of who sells the card. Secondly, if Nvidia wanted to sell more cards, they could just limit how many GPUs they sell to partners. Hell, Nvidia could cut out partners altogether and sell every card themselves if they wanted to, but they obviously don't.

The idea that this is a ploy to muscle into the VGA market at the expense of the partners doesn't really add up.
 

morpheas768

Distinguished
Mar 3, 2009
270
0
18,960

No, no and no. You misunderstood nearly everything I said and missed the point altogether.
Nvidia does make money from partner boards, since it's Nvidia GPUs that sell. I never said Nvidia makes no money from partner boards.

What I said is that Nvidia makes even more money with this "Founders Edition" scheme. It's not doing it at the expense of partners; partners will still make money more or less as usual (maybe a bit less).
But by releasing ONLY the Founders Edition earlier than the partner boards, pricing it 100 dollars higher, and cutting out the middleman, they do make more money.

I don't know how much more obvious it can be.
Nvidia even named its reference boards "Founders Edition" to appeal to consumers even more. It's a simple idea, but why not? If it works, it works.

Sure, maybe in the future we will see Nvidia trying even more aggressively to sell its Founders Edition cards, but for the time being, this isn't a typical launch, and even someone as stubborn as you can't deny that.
 
I actually think the Founder's Edition pricing may be intended to help their partners by encouraging people to spend more for the top of the line custom models.

In the past, that G1 Gaming card meant you had to splurge and spend about $50 more than the reference model. Now that same G1 seems like a bargain at $50 less than the reference price.

The relative price means that more G1 Gaming Extremes will be sold than just the base Windforce model. People will be more willing to splurge since the baseline was raised. Combine that with the fact that it's clear the custom models will actually benefit from more high-end design components, and it should be a big boost for the partner companies.
 

morpheas768

Distinguished
Mar 3, 2009
270
0
18,960

It could be; it's not unreasonable, at least.
It's definitely something to think about, but either way, the result is more money for Nvidia.

We just have to wait and see what AMD has in store for the enthusiast/gaming market.
 


Nobody expected it to be bad. Who would expect that, with the node shrink and all, performance would go down and efficiency would get worse? Well, efficiency actually stayed about the same as the GTX 980's. It has fewer high spikes, but the actual average power requirement under load remains the same. I don't consider perf/watt alone a sign of good efficiency, because by that standard the 295X2 is a very efficient GPU. Just look at how much better its perf/watt is than that of GPUs from 2008.
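The point about perf/watt being a ratio, not an absolute measure, can be made concrete with a quick sketch. All the numbers below are made up for illustration; they are not measurements of any real card:

```python
# Perf/watt is a ratio: a card can triple it while drawing exactly
# the same absolute power, which is the poster's objection.
def perf_per_watt(fps, watts):
    """Frames per second delivered per watt of board power."""
    return fps / watts

old_card = perf_per_watt(30, 180)   # hypothetical older card: 30 fps at 180 W
new_card = perf_per_watt(90, 180)   # newer card: 90 fps at the same 180 W

print(new_card / old_card)  # 3.0 — triple the "efficiency", identical heat output
```

So a card can look three times more "efficient" by this metric while putting exactly as much heat into your case as before.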
 

rush21hit

Honorable
Mar 5, 2012
580
0
11,160
I think 17Seconds is implying that many people here seem unimpressed, even though we also state the reasons why.

But after what Maxwell achieved over Kepler, and all the hype Nvidia built up for its successor, we have every reason to be. "2 times performance, 3 times efficiency", yes, of course! Because now we know it's basically the wonder of 16nm, while still using Maxwell with slight tweaks. Too slight, even; it barely matters.
And yet they call this billion-dollar research? Well, surely not for a gaming card. I believe that amount of money goes toward the big stuff for companies and professionals.

Then again, our hopes and expectations may just have gotten way too high.

Also, I don't dare imagine what happens if AMD does the same thing: going 16nm with what they achieved in the Nano and applying it top to bottom. Please AMD, no... though I have a gut feeling that's exactly what they will do.
 

g00ey

Distinguished
Aug 15, 2009
470
0
18,790
I get slightly confused by the explanations of the voltage regulators and PWM controllers. I don't quite understand the '5+1-phase design'. I assume it means that one phase supplies voltage to the memory and five supply voltage to the GPU. But what exactly is a phase? I mean, it's DC current we are talking about here, not AC. What components does a 'phase' consist of, exactly? A Zener diode circuit? And what measures are taken to prevent current from flowing between these phases, if we assume they operate independently to supply voltage to the graphics card?

I assume that the PWM controller just regulates the cooling fan and has nothing to do with power delivery to the card as a whole.

I must say that I don't like that they have reduced the auxiliary power input. I think a GPU should do its best to minimize its power consumption from the motherboard and instead draw as much of it as possible directly from the PSU. It's too much to assume that every motherboard is up for it, as motherboards have to deal with other power-consuming components to boot, such as the CPU. Then imagine what happens with SLI. A more sustainable hardware philosophy is to reduce the strain on the motherboard's power delivery as much as possible.
 

TJ Hooker

Titan
Ambassador
The VRMs are step-down DC-DC (buck) converters. Their essential components are switches (transistors) and an inductor. Each transistor has a limit on the average current that can pass through it, as well as on how fast it can switch. Multiple instances of these converters are combined, with each instance referred to as a phase, switched in a staggered (interleaved) sequence. Using multiple phases increases overall current capacity (and therefore power capacity) and raises the effective switching frequency seen at the output (improving response time and lowering ripple). The PWM controller is an essential part of the power delivery circuitry: it controls how much current is delivered by setting the duty cycle of the switches.
AFAIK, voltage/power supplied to the PCIe slot doesn't go through any VRMs on the mobo; the slot is connected directly to the mobo's power connectors. If that's the case, I don't think it places any particular burden on the mobo.
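The two benefits of interleaving can be shown with back-of-the-envelope math. The numbers here (150 A total GPU current, 300 kHz per-phase switching) are illustrative assumptions, not the GTX 1080's actual VRM specs:

```python
# Rough sketch of why multiphase buck converters are used:
# each phase carries only a fraction of the load current, and
# staggered switching multiplies the ripple frequency at the output.
def multiphase_summary(total_current_a, phases, f_switch_hz):
    """Return (current per phase, effective output ripple frequency)."""
    per_phase = total_current_a / phases   # load current shared across phases
    f_effective = f_switch_hz * phases     # interleaved edges add up
    return per_phase, f_effective

per_phase, f_eff = multiphase_summary(150, 5, 300e3)
print(per_phase)  # 30.0  -> each phase's transistors see only 30 A
print(f_eff)      # 1500000.0 -> output ripple at 1.5 MHz, easier to filter
```

This is also why the memory gets its own phase in a "5+1" layout: its current demand is far smaller than the GPU core's.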
 

Wooooowwe

Commendable
May 11, 2016
3
0
1,510


The 1080 costs 600 at launch; that extra 100 is the sucker's price Nvidia is charging day-one buyers. They are charging 100 more because they know most people are willing to pay it. The actual price is 600.

It costs 700 at launch. It remains to be seen, though I'm sure it will happen, whether any manufacturer prices the 1080 at 600. Still, the other person had a valid point: what will the Ti or Titan cost? If AMD can't compete, then Nvidia has free rein to keep squeezing consumers with price increases. Yes, it's a different process and delivers great performance, but that shouldn't cause the price to jump $100+ from one generation to the next.
 

Joseph Jasik

Reputable
Feb 24, 2015
26
0
4,530
For argument's sake, let's say the PS4 is somewhat equivalent to a GTX 580. Devs will optimize a game for the PS4 so it runs as well as it can. Enter the GTX 1080, which has so much more power that it lets devs release less and less optimized games, because, what the hell, the card will muscle through this crap, and 60 fps is good enough. Why support that kind of behavior with my wallet, just to get a crappy release of a game that is even more broken than the last?
 

jwl3

Distinguished
Mar 15, 2008
341
0
18,780
I hate it when fanboys compare their overclocked old equipment with the latest and greatest and claim that their old card or CPU, OC'd, is equal to the new one. It's not.

That's like strapping a rocket booster onto a Camry and claiming it's faster than the new LaFerrari. If you're going to use your OC benchmark, then use the OC numbers for the new card too. Otherwise, you're just a fanboy.
 

JTWrenn

Distinguished
Aug 5, 2008
331
234
19,170
Random question: what's the best way to estimate how much a new card would heat up a room? Do you think power usage would be the best comparison? I figure heat is relative to the cooler used, so it may not be the best answer, and all cards run hotter than room temperature in general. So wattage is all I can come up with. Any ideas would be helpful. Posting it here because a GTX 1080 or 1070 is what I'm looking at to get good gaming performance without needing the AC on to keep from sweating.
 

Sam Hain

Honorable
Apr 21, 2013
366
0
10,960
If you own a 980 Ti, there's no need to get overly excited/worried about upgrading to this particular GPU, unless you just have to have the latest and greatest. Initial testing was done vs. the GTX 980? Rather odd, and it goes back to my initial statement.

On the other hand, if you're running a 970 (or below), a 980 (apparently, according to NV), or a 700-series GPU, this presents a very good upgrade path.

Time will tell what happens with that MSRP, however, once vendors start doing their add-ons: cooling solutions, factory OCs, software bundles, etc.

 

TJ Hooker

Titan
Ambassador

The amount of power the card consumes is equal to the amount of heat that is dissipated into the case/room. Cooler is irrelevant.
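Right: essentially every watt the card draws ends up as heat in the room, regardless of the cooler, so board power is the number to compare. A small sketch converting wattage into the BTU/h figure HVAC sizing uses (the 180 W value is an assumed board power, roughly the 1080's rated TDP, not a measured draw):

```python
# A GPU is effectively a space heater: sustained electrical draw
# equals heat dumped into the room. Coolers only move heat off the
# die faster; they don't reduce the total emitted.
def watts_to_btu_per_hour(watts):
    """1 W sustained = 3.412 BTU/h (the unit air conditioners are rated in)."""
    return watts * 3.412

gpu_heat = watts_to_btu_per_hour(180)  # assumed ~180 W board power
print(round(gpu_heat))  # 614 BTU/h added to the room under load
```

For comparison, a small portable space heater is often around 1500 W (~5100 BTU/h), so a single high-end card under load is a noticeable but much smaller heat source.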
 

mapesdhs

Distinguished


And the 1080 isn't exactly a power/heat monster anyway, not like the 480 was, or something like the 590.


 

BorgOvermind

Distinguished
Oct 4, 2011
46
0
18,540
Marketing does wonders.
The card is good, but it's just a normal generational step, with the kind of steady gains most next-gen launches bring.

The 1070 seems to have 5 clusters disabled, not 4. Apparently, they want to make sure there's better availability at launch.
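The cluster arithmetic behind that claim checks out against the published specs, assuming a full GP104 die with 2560 CUDA cores grouped 128 per SM and the GTX 1070's listed 1920 cores:

```python
# Counting disabled SMs ("clusters") on the GTX 1070 from core counts.
full_gp104_cores = 2560   # full GP104 die (as shipped on the GTX 1080)
cores_per_sm = 128        # Pascal consumer parts: 128 CUDA cores per SM
gtx1070_cores = 1920      # GTX 1070's advertised core count

full_sms = full_gp104_cores // cores_per_sm      # 20 SMs on a full die
enabled_sms = gtx1070_cores // cores_per_sm      # 15 SMs enabled on the 1070

print(full_sms - enabled_sms)  # 5 SMs disabled, matching the post's figure
```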
 