CptBarbossa :
@somebodyspecial
Look at the tests. In frame time variance the 295x2 is the best among dual-GPU configurations, including two-card setups. I posted a link to the page that shows that.
There is no denying that a single card will have better frame time variance than dual GPUs; that isn't even debated. The 295x2 comes closer than any other multi-GPU setup, though.
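To be concrete about what frame time variance means, here's a quick Python sketch of my own (not TechReport's actual code) computing the kind of metrics those pages report, the 99th-percentile frame time and time spent beyond 33.3 ms:

```python
# Two runs with the same 50 FPS average but very different smoothness.
def frame_time_metrics(frame_times_ms, threshold_ms=33.3):
    ordered = sorted(frame_times_ms)
    # 99th-percentile frame time: 99% of frames render at least this fast.
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    # Total time spent on frames slower than the threshold (~30 FPS).
    time_beyond = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return avg_fps, p99, time_beyond

steady = [20.0] * 100       # single GPU: every frame takes 20 ms
uneven = [10.0, 30.0] * 50  # multi-GPU-style alternation, same average
print(frame_time_metrics(steady))  # (50.0, 20.0, 0)
print(frame_time_metrics(uneven))  # (50.0, 30.0, 0)
```

Same 50 FPS average, but the 99th-percentile frame time is 50% worse on the uneven run. That's the gap those tests measure, and the 295x2 closes it better than two-card setups do.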
And to say you would pick the 290x is simply silly. In the worst case you can disable CrossFire and get the same if not better performance than a standard 290x, thanks to the higher base clocks. And you'd still have the CrossFire option if you needed it.
I have stated that the TitanX is a great card if you can shell out for it. I agree that it has great frame time variance. I just don't think it holds its VALUE compared to the 295x2. Similar performance for significantly lower cost is hard to pass up.
somebodyspecial :
@CptBarbossa
The problem is it's only good when it works. Even with AMD's new beta driver, supplied to TechReport for the test, you get this:
"New driver is ~50 FPS. Better than before, but seriously doesn't feel like 50 FPS on a single GPU."
"The Radeon R9 295 X2 is capable of producing some very high FPS averages, but the gaming experience it delivers doesn't always track with the traditional benchmark scores."
FPS scores don't matter if the game doesn't feel fluid while you're playing it, right? That is his point. You can say all you want about CF vs. SLI, but the point is that a single GPU is never a crapshoot. Also, as you see with Shadow of Mordor, you need 6GB or more to run without suffering at 4K.
http://techreport.com/review/27969/nvidia-geforce-gtx-titan-x-graphics-card-reviewed/7
I'm thinking we'll be seeing more situations where the 12GB shows its "value" as games move up to Unreal Engine 4 etc., pushing hardware in ways most of today's games don't. The 4GB per GPU on the 295x2 will suffer over time, correct? He notes it effectively has less usable memory than a 290x.
" Thus, the 295 X2 really struggles here. My notes say: "Super-slow ~7 FPS when starting game. Occasional slowdowns during, should show up in Fraps. Slow on enemy kill sequences. Super-slow in menus. Unacceptable." The fix, of course, is to turn down the texture quality, but that is a compromise required by the 295 X2 that the 290X might be able to avoid. And the Titan X laughs. "
It won't get better as games advance; it will end up in even more situations like Shadow of Mordor. So you have a profile problem (AMD must be able to fund constant updates for new games, same as NV) and a memory problem vs. the single-GPU TitanX. I'm not a fan of SLI or CF, mind you, so no love for either side on that one. I prefer the single-GPU, always-working solution (preferably the winner at low watts, but I live in AZ and bake in the PC room in summer...LOL).
Further, note that in BF4 the TitanX and 295x2 perform nearly identically, and this is a MANTLE game. It's not always a straight-up loss for a single GPU, even setting aside CF profiles needing constant updates for new games, the memory limitations, the watts used, etc.
"Remarkably, this is also another case where the R9 295 X2's performance matches that of the Titan X almost exactly."
ANOTHER case. One other point: this card holds value for FP32 usage and CUDA in certain pro scenarios. It stomps on a $3700 K6000 and absolutely destroys the K5000. There's more bad news for the 295x2 in the Crysis 3 test too. Game after game there is an issue. Titan runs away with Borderlands: The Pre-Sequel. Power draw at load is nearly double, roughly 330w extra. Add that up over 4-5yrs at gamer hours and the Titan has no extra cost.
Tom's says much the same about Far Cry 4:
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-3.html
"On paper, the Radeon R9 295X2 enjoys a commanding 32% advantage over Nvidia’s GeForce GTX Titan X in Far Cry 4 at 2560x1440 using the Ultra quality preset. But a look at frame rate over time shows that card’s lead to be sporadic. At times, it’s actually slower than the more consistent GeForce GTX Titan X. "
Note here that it took over 4 months for a profile to arrive, and it's still bad at 4K. This is just like HardOCP saying you sometimes wait 6 months for one from AMD. By that time many people are on another game. As Tom's notes earlier in the article, you're depending on AMD's CF drivers to make the case for this card over the TitanX. Note that Tom's recommends the Titan as well, calling it their unicorn.
http://www.hardocp.com/article/2015/03/17/nvidia_geforce_gtx_titan_x_video_card_preview/4
"Whether you play at 4K or 1440p, 4GB video cards may not be enough! Just with this single TITAN X we got two games to use up to 8GB of VRAM. When applying SLI, this demand could increase beyond 8GB at 4K. At this point, 8GB may be the minimum you would want for 4K gaming."
Again, another site saying 4GB isn't enough; ouch, possibly even at 1440p in some games this year. Look at page 3 of the preview: only one game came in under 4GB at 4K cranked up. Sure, you can turn stuff down to get under 4GB, but...
http://www.rapidtables.com/calc/electric/electricity-calculator.htm
330w x 3hrs per day ≈ $43/yr at 12c/kWh. Is 21hrs a week reasonable for a gamer (more if you're sharing with kids etc. in the house)? I can put that in on a Saturday/Sunday with a new game alone, especially an RPG/strategy title. So 5yrs is over ~$210 at 12c/kWh, and many US states are above 14c (some above 18c). At the high end you're talking ~$300 in savings, assuming a buyer of a card this expensive uses it a while longer than most, and easily more if you have a few users clocking up hours.
I upgrade every 3yrs or so, usually in the $250-360 range, but at $700-1000 I'd be aiming for 4-5yrs for sure, OCing it in the last year or two if needed. I'm at that point now on my Radeon 5850, though I admit I've been waiting for 20nm for a (hopefully) drop in heat along with a major perf increase. When buying a vid card you have to consider TCO over the life of the device (especially at these wattages), not just the day-1 purchase price.
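If you want to sanity-check that math, here's the arithmetic spelled out (a back-of-the-envelope Python snippet using my own assumptions, nothing from the reviews):

```python
# Yearly cost of the 295x2's extra load power vs. the TitanX.
# Assumed: ~330 W extra at load, 3 hrs of gaming per day.
def yearly_power_cost(extra_watts, hours_per_day, usd_per_kwh):
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(yearly_power_cost(330, 3, 0.12))      # ~43.4 -> ~$43/yr at 12c/kWh
print(yearly_power_cost(330, 3, 0.12) * 5)  # ~216.8 -> ~$217 over 5 yrs
print(yearly_power_cost(330, 3, 0.18) * 5)  # ~325.2 -> ~$325 over 5 yrs at 18c/kWh
```

Plug in whatever watts/hours/rate match your house; the point stands that the gap over the card's life is a real chunk of the TitanX's price premium.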
One last point on the TitanX: I wouldn't mind overclocking it given its watts, heat, etc. But there's no way I'd do that on a 295x2.
http://anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/17
At ~250w less, the Titan OC'd is right on the 295x2's tail, without all the problems of CF/SLI or the 4GB limit.
http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,32.html
http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,34.html
The TitanX blows out the 295x2 in all tests except Tomb Raider when both are OC'd. Check out their cost-of-use numbers too, where they assume 4hrs a day, 5 days a week (roughly the same as my 21hrs) and peg the cost of the 295x2 at 2x: about €160 a year vs. €80. Even at 4yrs that's a €320 difference. He's giving euro costs there; my example was the USA. Either way, it's a massive haircut to the TitanX's price premium, right?
I guess we'll have to agree to disagree on the merits of the 295x2.
Having said that, I'll wait for a die shrink. And for anyone thinking the TitanX is priced high: I think a 6GB version that lands somewhere between the 980 and the TitanX is coming when AMD's 380/390s hit; it will take over the 980's price point and push that card down some. You have to do something with the slightly defective 610mm^2 dies, right?