Nvidia Reveals Pascal: GTX 1080 And 1070 To Beat Titan X, GDDR5X Debuts



The 970 had an MSRP of $329. That made sense, since the GTX 970 was so much better than the GTX 960 and only cost $100 more.
 


Agreed. I don't believe Nvidia is exaggerating its claims about the performance of the GTX 1070 and 1080. Like with Maxwell in 2014, it seems AMD will have no immediate answer. Nvidia's cards have been more efficient in terms of power consumption for the last three generations; it would be nice to see AMD beat them this time around, but that's being optimistic.
 
I wouldn't expect them to have the same power efficiency as Maxwell if they are attempting to catch up to AMD in DX12 performance. Maxwell doesn't have separate compute engines like Fermi and GCN do. It's more efficient at rendering DX11 and older pipelines, since everything runs down a single track. But DX12 is asynchronous, so the separate compute engines give the GCN architecture a clear advantage. I don't see Nvidia being competitive without adding more compute engines, and that means lower power efficiency compared to Maxwell (a minimal sketch of what those separate queues look like in code is below).
I also still think AMD will be first to market with its competing cards. The 1080 shouldn't be available any sooner than Q4 2016, since GDDR5X won't be widely available until then. All the recent AMD news shows them being ahead this production run. We really won't know until it happens.
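For readers wondering what "separate compute engines" means in software terms, here is a minimal, hypothetical D3D12 sketch (error handling omitted; it assumes a valid ID3D12Device already exists, and the CreateQueues helper name is just for illustration). An engine that wants asynchronous compute submits work to a dedicated compute queue alongside the normal direct (graphics) queue; hardware with independent compute engines can overlap the two instead of serializing them.

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Illustrative helper (not from the article): create the two queues that
// asynchronous compute relies on. Assumes 'device' is a valid ID3D12Device*.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // Direct (graphics) queue: accepts draw, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Dedicated compute queue: on GPUs with independent compute engines
    // (e.g. GCN's ACEs), work submitted here can run concurrently with
    // graphics work instead of waiting behind it.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```

Whether work on those two queues actually overlaps is up to the GPU's hardware scheduler, which is exactly the Maxwell-versus-GCN question being debated above.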
 



Yeah, I'm pretty sure you're right about that. Another reason I was planning on skipping this generation. But here's to hoping.
 


The 1080 will be available for purchase on 5/27.
 
I say we should wait until the other camp releases their stuff too. Once that's done, the experts will do all the comparisons and give us their conclusions. By then the market will have settled and prices might have come down somewhat, and we can get the best card for the price/performance ratio, whatever that may be. If I'm going to get a new card, it should use less power than my GTX 970 and show at least a general 30% performance increase.
 
Let's wait for the retail products to be tested; then we will see what is closer to real user experience. It has been noted elsewhere that the on-screen information during part of the show put the card at 2114MHz. That is a significant jump over the base clock of 1607MHz, or even over the 1733MHz boost clock. It is possible, but unlikely, that users will be able to realize that type of overclock. Remember the marketing team? They are one of the most important parts of the magic show, and they use all sorts of tricks to amaze the audience.
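For context, simple arithmetic on the clocks quoted above gives the size of that jump:

$$\frac{2114 - 1607}{1607} \approx 31.5\%\ \text{over the base clock}, \qquad \frac{2114 - 1733}{1733} \approx 22\%\ \text{over the boost clock.}$$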
 


No, if you look at the graph, Pascal requires more power than Maxwell. The chart labels the GTX 980 at 160W and the GTX 1080 at 180W. Those figures are probably understated anyway, so you're realistically looking at more of a 200W GTX 1080.
 
Also, the presumed lack of FreeSync is the only real drawback I see, and the thing that would have me buying a comparable AMD card (if there were one).
Why would you be looking for FreeSync in an Nvidia card?
You must really like your Samsung monitor.
 


A higher binned chip with more OC potential for $699.
The Founders Edition is nothing more than the stock cooler that Nvidia made. It's not a specially binned chip that overclocks better. In fact, the Founders Edition has zero stock overclock. You'll want to buy a pre-overclocked card for better performance instead of relying on software overclocking tools, which are finicky and have to be redone every time you format or install Windows updates.

^^ This
 
The referenced benchmark for the Doom run was a 1080 with a 23% overclock already applied (which was also about the average upper limit of what a 980 could manage). And since that was the card used for the demo, it's not unreasonable to assume it's the card referenced in the slides. If that's the case, then a 1080 is around 5-10% faster than a 980 Ti for a lot less power consumption. But in the end, it's a $600 card that is 5-10% faster than... another $600 card. Nothing revolutionary about it for conventional gaming.

The hardware updates for the VR support are where the real performance gains are, but then, if you aren't using a VR display, you won't see those bigger jumps.
 
I loved the article, but you wrapped it up the wrong way, I believe. This is how I would've finished this article:

Last time around, we got an x70 card with only 87.5% of the (high-speed) VRAM we were told we'd get, and were lied to about it. Is history repeating itself with the GeForce GTX 1070?
Is Nvidia milking its customers again? That would mean selling a card with 7GB of practically usable memory while telling customers they're buying an 8GB one.
Stay tuned for the actual reviews to see what you're being told to spend your money on!

It's probably 7.5GB + 0.5GB
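For anyone who missed the GTX 970 reference: that card advertised 4GB, but only 3.5GB ran at full speed, and applying the same ratio to the 1070's 8GB is where the 87.5% and 7GB figures above come from:

$$\frac{3.5\ \text{GB}}{4\ \text{GB}} = 87.5\%, \qquad 0.875 \times 8\ \text{GB} = 7\ \text{GB}.$$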
 

Um, maybe because you already own a FreeSync monitor? Maybe because FreeSync displays are a lot cheaper than G-Sync displays?
 
I don't think the 1080 Ti will be released until after AMD launches its Polaris cards, or the week before they're due to launch, depending on what Nvidia learns in advance about their performance. If Nvidia knows the 1080 Ti will definitely be faster, they will time the launch just before AMD's release. If they don't know for sure, they will wait until after the AMD release to see how Polaris performs, make what adjustments they can, and then release the card when they feel it is competitive enough in performance or price.
 

I am pretty skeptical of PR talk, but I think that presenting OC specs and implying that it is reference card performance is a bit outrageous. Would they really do something like that? I'm not so sure...
 


I agree; I hope AMD is going to do something awesome with Polaris and Zen. Given their recent track record, though, I'm not holding out much hope. On the other hand, they're moving to a new manufacturing process, so we could see something completely different from what we're used to with FX in its current form. Guess we'll have to wait for the official announcements before we know for certain.
 
What I'm hoping will happen is that once these monsters get released, the well-performing graphics cards of today, such as the 980 Ti, will go down in price.
 


The 16nm and 14nm processes are essentially the same thing; here, have a read.

In their analysis of the finFET's influence on layout, Rob Aitken and colleagues at ARM found: "Fin and metal pitches have different scaling pressures, so they have not tended to line up. For example, at 14nm GlobalFoundries has stated that it uses a fin pitch of 48nm and a metal pitch of 64nm. The same values are used in TSMC's 16nm process."

http://www.techdesignforums.com/practice/guides/finfets/
 


That could happen, but statistically speaking, based on the last couple of generations of hardware I've seen in my time here, you'll see discounts on 980 Tis, but don't expect Nvidia to undercut the cost by something like $400. That ain't happening. You could theoretically see a 980 Ti drop below $500 (I've seen some Gigabyte models go for $550), but I wouldn't expect a price decrease much steeper than that.
 


The basic Zotac one (90101-10P or something) had gone as low as $289.99. However, it's now at $304.99.
 