MasterMace
Distinguished
So the Titan X drops below 30 FPS in 3 out of 6 games at 2160p. I can fairly call that not 2160p-ready. If you want 2160p in 2015, you need two cards.
cst1992: People go on comparing a dual-GPU 295X2 to a single-GPU Titan X. What about games where there is no Crossfire profile? It's effectively a Titan X vs 290X comparison.
Personally, I think a fair comparison would be the GTX Titan X vs the R9 390X, although I hear NVIDIA's card will be the slower one then.
Alternatively, we could go for 295X2 vs Titan X SLI, or 1080 SLI (assuming a 1080 is a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and the 780).
What games don't have a Crossfire profile? And why bother comparing Titan X SLI vs a 295X2 when the SLI setup would cost almost 4x as much? Sure, the performance would be better (30-40% at most), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.
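To put rough numbers on that performance-per-dollar point: here's a quick sketch. The prices and relative-performance figures below are illustrative assumptions for 2015 street pricing, not benchmarked results.

```python
# Rough performance-per-dollar comparison. Prices and relative performance
# are assumed, illustrative numbers -- not measured benchmarks.
cards = {
    "R9 295X2":    {"price": 699,  "rel_perf": 1.00},  # baseline
    "Titan X":     {"price": 999,  "rel_perf": 0.95},  # a bit behind the dual GPU
    "Titan X SLI": {"price": 1998, "rel_perf": 1.35},  # ~35% over the 295X2
}

for name, c in cards.items():
    per_thousand = c["rel_perf"] / c["price"] * 1000
    print(f"{name:12s} perf per $1000 spent: {per_thousand:.2f}")
```

Even with generous SLI scaling assumed, doubling the price for ~35% more performance drops the perf-per-dollar figure well below the single-card options.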
Hmm, seeing how we have a fully unlocked GM200 and it's not double the performance per watt over Kepler like they marketed Maxwell in the beginning, it's kind of disappointing that they only got about half of that.
Well, NVidia did state that GM20x was supposed to get a die shrink to 20 nm before launch, but TSMC pushed back on 20 nm because it wanted to drop to a node in the teens instead. That's why you haven't seen Maxwell go below 28 nm, and why AMD hasn't released the 3xx cards yet. If the Maxwell cards had shipped at 20 nm, they very likely would have hit double the performance per watt. So in this case, TSMC is holding up both NVidia and AMD on producing lower-power cards.
IGN said that the R9 390X (8.6 TF) is 38% more powerful than the Titan X (6.2 TF); is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked
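The 38% figure is just the ratio of the two quoted (rumored) FP32 throughput numbers; it says nothing about actual game performance, where architecture and drivers matter at least as much. The arithmetic checks out, though:

```python
# Verify the quoted "38% more powerful" claim from the raw FP32 numbers.
# These TFLOPS figures come from the rumor in the linked article.
r9_390x_tflops = 8.6
titan_x_tflops = 6.2

advantage_pct = (r9_390x_tflops / titan_x_tflops - 1) * 100
print(f"{advantage_pct:.1f}% more raw FP32 throughput")  # prints "38.7% more raw FP32 throughput"
```

So "38%" is simply 8.6 / 6.2, rounded down; it's a paper-spec comparison, not a benchmark.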
photonboy: SLI 2x Titan X + G-SYNC.
If money was not an issue that's what I would do.
*And why do people whine about the COST of any of the Titan cards? NVidia isn't misleading anybody here; if you don't think it's worth the cost, don't buy it.
I don't complain because my FERRARI wasn't a good value.
As companies strive for perfection, they should make their cards as appealing to as many audiences as they can.
In your case, you "own a Ferrari." When someone buys a Ferrari (or any car, house, etc.), it is assumed that they are in for a more permanent investment. New graphics cards, by nature, are released every 12 to 18 months, which makes them outdated and obsolete within 3-5 years of release. When someone gets a Ferrari, it's assumed the car will retain most, if not all, of its value after being purchased. Where graphics cards are concerned, they lose value relatively quickly.