Nvidia Announces GeForce GTX 1080 Ti; $700, Coming Next Week

Newsflash: the RX 480 isn't their best card. Have you heard of the Fury X? It beats the 1070 in at least a few games and settings.

Of course, on the eve of Vega's launch, that's kind of an academic point.
 
Anyhow, can anyone confirm whether the memory controllers operate independently? I assume they each address a distinct address range, rather than supporting an interleaved arrangement.

I wonder whether disabling the memory controller was necessitated by yield issues.
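
To illustrate what I mean by interleaved vs. partitioned, here's a toy Python sketch; the channel count, sizes, and stride below are made up purely for illustration, not anything Nvidia has documented:

```python
# Toy model of two ways a GPU could map physical addresses onto its
# memory controllers. All numbers are made up for illustration only.

NUM_CHANNELS = 11        # e.g. eleven 32-bit controllers on a 352-bit bus
CHANNEL_SIZE = 1 << 30   # pretend each controller backs 1 GiB
STRIDE = 256             # hypothetical interleave granularity, in bytes

def channel_partitioned(addr):
    """Each controller owns one contiguous slice of the address space."""
    return addr // CHANNEL_SIZE

def channel_interleaved(addr):
    """Consecutive 256-byte blocks rotate across controllers, so a big
    linear copy keeps all of them busy at once."""
    return (addr // STRIDE) % NUM_CHANNELS

for addr in (0, 256, 512, 768):
    print(addr, channel_partitioned(addr), channel_interleaved(addr))
```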

Gotta say, $700 is sounding like a breath of fresh air, after the high pricing we've seen on the 1080 and Titan XP. I smell further discounting of 1080s coming, and that will probably push down 1070 prices a bit.

I'm really impressed by the unlocking of all SMs and the extra memory performance. I think we all knew Titan XP buyers were getting ripped off, but I'm a bit surprised by how much.
 
At this point I'm just curious to see what happens. AMD has more TFLOPS on their card and better memory. If they price their card at $600 and it matches or beats the Ti, it will be a winning situation. Otherwise, it might be Fury all over again.
 


It won't be Fury all over again. Fury's weakness was its low amount of HBM, which hampered its high-res performance. Vega uses HBM2 to cache the PC's main memory, so you can address a large amount of memory and still get great performance. AMD has taken quite a long time to develop Vega, so I don't expect it to flop, even if it can't beat Pascal.

 
I have been waiting for this, hoping for a price drop on the 1070. How long do you think I should give it before I say screw it and just buy one? I'm kinda sick of waiting lol.
 


Personally, I see Fury as a large-scale experiment to see if truckloads of VRAM throughput would really translate into better performance; gluing two 285s together and feeding them 4 GB of HBM would show whether or not the high end really gets memory-bottlenecked in modern games. Using HBM as a large cache and adding a bunch of GDDR5 on top has the advantage of reducing costs while still improving performance, at the expense of a slightly more complex memory controller.

Still, one question: is the HBM2 "cache" inclusive or exclusive? I mean, will a Vega chip with 4 GB of HBM2 + 8 GB of GDDR5 show up as 8 GB or as 12 GB? Maybe it's written somewhere, but I'm feeling lazy.
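
To make the inclusive-vs-exclusive question concrete, here's the back-of-the-envelope version in Python, using my hypothetical 4 GB + 8 GB split from above (not a confirmed Vega configuration):

```python
# Usable capacity if the HBM2 acts as a cache in front of a larger pool.
# Sizes are hypothetical, in GB.
hbm2_cache = 4
backing_pool = 8

# Inclusive: anything in the cache also lives in the backing pool,
# so the cache adds bandwidth but no capacity.
inclusive_capacity = backing_pool                 # shows up as 8 GB

# Exclusive: a given line lives in exactly one of the two,
# so the capacities add up.
exclusive_capacity = hbm2_cache + backing_pool    # shows up as 12 GB

print(f"inclusive: {inclusive_capacity} GB, exclusive: {exclusive_capacity} GB")
```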
 


On the games that use hardware-based PhysX you'll see spectacular results. My current build listed below started as a similar setup, except with two G1 OC 970s and an EVGA SC 780. Batman: Arkham City and the Metro games got 60 fps in 4K. It was awesome. The downside is that almost everyone abandoned hardware PhysX shortly after.
 
Hoping by spring/summer the dust will clear and the best of the AIB cards will be out. Definitely going to put one in my older HTPC rig. Can't wait to see a 2600k rocking one of these.
 


Titan XP buyers knew full well the nature of what they were buying; the history of the Titan XM and 980 Ti was already established. It was their choice to make the purchase, nobody forced them. Top-end products always carry a premium because there are always those willing to pay the extra, to have the best, to be the first, or simply because they can; whatever. If you don't like the price, don't buy the product.

Ian.

 

I've been pondering this model:

https://www.scan.co.uk/search?q=LN77175

Not something FPS fans would likely go for, but I err on the side of image quality and certain other features, and I've never liked the narrow viewing angles of TN panels.





I'll be adding results from testing one with a 5 GHz 2700K to my site as soon as I can. I don't have a page yet with all my Futuremark data, but PM me and I can send you the links when they're ready.

Ian.

 
Naughty, greedy NV... Vega is right around the corner with HBM2, a new arch, and competitive pricing. NV knows its days of targeting consumers duped into believing the hype it paid a fortune for are nearing their end. Rehashing Maxwell/Pascal, cherry-picked card review scandals and benches, proprietary ashattery... all await their rightful reckoning.

Any consumer with the least bit of sense will have PATIENCE, wait for Vega's release, and see how the benchmarks look before making a choice, rather than rushing to throw more money at NV so soon.
 


People who buy Titan cards don't care at all ... I expect them to be using $5,000 speakers for gaming ... and a $10,000 OLED TV as well ...
 


IMO a gaming card should never reach $699 ... there was a time when the flagship was always $500 ... until Nvidia suddenly decided their new flagship would be $650 instead of replacing the $500 older card ...

The last time was the GTX 680 ... $500, remember? And the better one was really a dual-GPU GTX 690 for $999, not a stupid "Ti".

Then they came out with the "Ti" trick ... meh ...

After that, Nvidia went crazy ... the Ti should also be $500 ... the flagship should always be $500 and replace the older flagship, instead of fooling us with the "Ti" naming trick.

I wish AMD would come in and crush the prices again like they did to Intel ...
 


It's a cache, so it doesn't add. Still, being able to access the PC's main memory through a cache is a welcome addition. And, of course, it will be key to achieving the fastest integrated graphics ever, when Vega gets used in APUs.
 


So, there's this thing called inflation. Ever heard of it?
 
So, there's this thing called inflation. Ever heard of it?

Nah, it's not the case here ... actually, electronics get cheaper with time ... PC components are the only goods that go against inflation and get cheaper and cheaper ...

We used to pay $20,000 for a tiny hard disk ... and $10,000 for mere megabytes of RAM ... and at that time the US dollar was worth 20 times what it is today ... so that $20,000 for a tiny hard disk is like $400,000 today for the same tiny hard disk.

And the funny thing is, today we have hard disks 10,000 times that size for just $100.
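
Just to put rough numbers on that (purely illustrative figures, from memory, and taking my own "20 times" claim at face value):

```python
# Rough, illustrative figures only -- the point is the ratio, not exact prices.
old_disk_price = 20_000       # dollars, decades ago
dollar_value_ratio = 20       # claimed purchasing-power multiplier vs. today
old_disk_size_gb = 0.01       # a "tiny" disk, roughly 10 MB

new_disk_price = 100          # dollars today
new_disk_size_gb = 1000       # roughly 1 TB today

old_price_adjusted = old_disk_price * dollar_value_ratio    # ~ $400,000 in today's money
old_dollars_per_gb = old_price_adjusted / old_disk_size_gb  # ~ $40,000,000 per GB
new_dollars_per_gb = new_disk_price / new_disk_size_gb      # ~ $0.10 per GB

print(f"then: ${old_dollars_per_gb:,.0f}/GB, now: ${new_dollars_per_gb:.2f}/GB")
```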
 
ELI_HARPER13 (Mar 1, 2017, 8:06 AM):
Well, everyone that already owns a 1080 just got kicked in the nuts...
___________

They have learned to expect and like it. Otherwise they wouldn't keep doing it.
 

Eh, I don't know about that. I mean yes, you do get more FLOPS/gigabytes/whatever per dollar as time goes by. But when you look at the cost of a given product tier/relative performance level, I don't think the cost really changes much from one generation to the next. An example would be Intel's Core CPUs.
 
So Nvidia could repurpose the full Pascal Titan into a boosted 1080 Ti with the new memory, given Volta in May. Volta becomes the new Titan with either GDDR6 or HBM2 at different card tiers. GDDR6 is reportedly reaching memory speeds of up to 16 Gbps; Nvidia always has multiple aces up its sleeve.
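
For rough context on what those per-pin speeds mean in bandwidth terms: peak bandwidth is just the bus width (in bytes) times the data rate. The 352-bit / 11 Gbps figures below are the announced 1080 Ti numbers; the 384-bit GDDR6 pairing is only my speculation:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

# GTX 1080 Ti as announced: 352-bit bus, 11 Gbps GDDR5X -> ~484 GB/s.
print(bandwidth_gbs(352, 11))

# Speculative future part: 16 Gbps GDDR6 on a 384-bit bus -> ~768 GB/s.
print(bandwidth_gbs(384, 16))
```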
 
When the Fury X was released, it was supposed to be a 980 Ti killer at the same $650 price point. It only barely kept up with the reference 980 Ti in some games. And the Fury X, already nearly maxed out from the factory, left little user overclock headroom. When GPU makers started coming out with overclocked 980 Tis with custom cooling solutions, they had upwards of 10% overclock headroom *on top* of the ~10% factory overclock.

Example here [www.guru3d.com/index.php?ct=articles&action=file&id=16329] and here [www.guru3d.com/index.php?ct=articles&action=file&id=16346], with a Gigabyte overclocked one showing Hitman at 1440p going from 84 -> 91 -> 100 FPS in stock / factory overclock / user overclock, respectively. Not even a contest against the Fury X's 83 -> 89 FPS [http://www.guru3d.com/index.php?ct=articles&action=file&id=16512&admin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1]. So the bottom line is I don't think the $400 1070 (now $350) is a fair market comparison. If there is going to be a comparison of the Fury X to a Pascal GPU, the $600 1080 (now $500) is the fairer one. And we all know what the 1080 does.
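
If you work the percentages out of those Hitman numbers (taken straight from the linked Guru3D charts), the gap in headroom is obvious:

```python
# Percentage gains implied by the Hitman 1440p figures quoted above.
def gain(base, oc):
    return (oc - base) / base * 100

# 980 Ti: stock -> factory OC -> user OC
print(f"{gain(84, 91):.1f}%")    # ~8.3% from the factory overclock
print(f"{gain(84, 100):.1f}%")   # ~19.0% with the user overclock on top

# Fury X: stock -> overclocked
print(f"{gain(83, 89):.1f}%")    # ~7.2%
```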

 