Discussion: Nvidia Pascal

Status
Not open for further replies.
Oh trust me, I've studied a LOT on whether three 1080p displays are good, but I've decided against it. Mainly, it's worse for gaming than a single monitor.

I'm not going to get a G-Sync monitor; I'm going to put that $100 into a better GPU.

Plus I sort of do want IPS if possible.
 


It's not gonna work like that. History already has examples where some GPUs use twice the power of the current high-end cards (while cards in the same range use much less), or the other way around. And Pascal is just one architecture with a couple of GPUs so far; they're not just gonna do that. How many times faster was DX12 supposed to be in theory? Then it comes down to 50%, and then maybe 10% in practice.

Nothing goes by that math.
 
I do care too, because I have a high-quality 850W PSU and I want something with the performance of 2x 980 Ti SLI, and I can barely do that while keeping them at stock clocks. The Pascal equivalent will consume less (so one can OC them) and still have headroom, plus more performance, lower heat, 8GB of VRAM, etc.
 
I'm not saying the performance increase will be linear. But the early expectation is that the upcoming gen will get you 960 or 950 performance for 75W. Then again, the new 950 is a 75W part (expected to be slower than the regular 950). There are rumors that the new GP106 will be a 75W part, though usually Nvidia's Gx106 chips address the sub-150W segment, and sub-75W usually goes to Gx107.
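The speculation above boils down to a TDP ratio. A minimal sketch of that arithmetic, using illustrative figures (the 90W GTX 950 TDP and the rumored 75W Pascal part are assumptions, not confirmed specs):

```python
# Rough perf-per-watt arithmetic behind the 75 W speculation.
# Both TDP figures are illustrative assumptions, not confirmed specs.
gtx950_tdp_w = 90    # approximate stock GTX 950 board power
rumored_tdp_w = 75   # rumored TDP of the low-end Pascal part

# If the rumored 75 W part merely matches GTX 950 performance,
# the efficiency gain is just the TDP ratio:
improvement = gtx950_tdp_w / rumored_tdp_w
print(f"perf/W improvement: {improvement:.2f}x")  # prints 1.20x
```

Even a modest-sounding TDP drop at equal performance is a real efficiency gain, which is why the 75W tier keeps coming up in these rumors.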
 


What? I was running 2x 980 Ti G1 at 1500/8000 on an 850W EVGA G2 PSU last week while I was testing a 4K screen.
 


That makes little sense, old chap. Did you mean to say "couldn't" or "could not"? To me that would make more sense in this particular rant.
 


I have that PSU too, but it doesn't leave a lot of headroom if you run them OC'd.
 


Why not? A Tesla doesn't use gas at all, and some models can go 0-60 in 3.2 seconds....

I prefer more power efficient video cards...
 


Agreed. Why not?

While more power-efficient cards aren't absolutely necessary for high-end systems, why not have the benefits of power savings?

I would also disagree with saying that if a card is 75W, look at what a 300W version could do. I don't think silicon works that way; a given architecture can only take so much voltage before it starts damaging the GPU.
 
Tesla is a bad comparison. Those cars, once depleted of their batteries, require (the new one anyway) something like an 8-hour charge to be full. Sorry, but I wouldn't want to run out of juice or have to wait 8 hours for a recharge. This is PC gaming; if you want to be the fastest, it's expected that you'll use power. I expect cards to get more power-efficient, but I still expect a high-end card to use more juice than any other card. Like I said, if you want to be the fastest, you have to have the power to back it up.

Also, Tesla cars can't really be modded to go faster, lol. You pay $80,000+ for a high-end Tesla; I'll pay $4,000 for an '89 5.0 Mustang coupe, dump $10,000 into the engine, and destroy that Tesla in the 1/4 mile. 😀 Just saying: if you wanna be a big dog, you gotta expect to have the power to back it up. An energy-efficient system is great for those who want one, but those aren't the people dropping thousands of dollars on the most powerful stuff on the market either. At that level, power consumption is less of an issue; it's more about the output or results of the GPUs.
 


I disagree however you're entitled to your own opinion...
 


The new card will possibly use less than half the power of your current GPU configuration and possibly be 1.5x to 2x more powerful...
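Taken together, those two hedged claims imply a large perf-per-watt jump. A quick sketch (both ratios are the claims above, not measured figures):

```python
# Implied perf-per-watt gain from the two claims above:
# at most half the power draw, 1.5x to 2x the performance.
power_ratio = 0.5              # new config draws half the power (claimed)
perf_low, perf_high = 1.5, 2.0  # claimed performance multiplier range

gain_low = perf_low / power_ratio
gain_high = perf_high / power_ratio
print(f"implied perf/W gain: {gain_low}x to {gain_high}x")  # 3.0x to 4.0x
```

A 3x-4x efficiency gain in one generation would be unusually large, which is why several posters here are skeptical of the claim.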
 
I used to think there's no way a 4K card would be called a GTX 1080, and that's probably still true. The false assumption I made, though, was that these are "4K" cards in the first place. Breaking from the usual naming scheme would strongly imply they really are 4K cards, and that's probably not the case.

This would explain why the leaked shrouds have the GTX 1080/1070 nomenclature etched on them. Nvidia might not want to give the false impression that these cards are designed for any particular resolution, 4K specifically. In that light, keeping the usual scheme makes sense, particularly for a product stack that includes the 1070, the 1060, and mid-range parts that are definitely not 4K cards.

What I don't like is repeated names. GTX 1080 is available, I think, but AMD's upcoming 400 series reminds me of Nvidia's Fermi generation. I've already been there, done that with a (GTX) 480, and had way too many discussions about a 460 vs. a 470, and so on.
 