GeForce GTX Titan X Review: Can One GPU Handle 4K?



Except in a 4K environment with Ultra settings. Games at 4K with those settings chew through more than 4 GB of VRAM, and that is where the 970s would choke the system. Going forward, 6-8 GB of VRAM is going to be needed for Ultra 4K. At 1080p and 1440p, two GTX 970s are probably a solid bet.
 

FormatC
Distinguished · Apr 4, 2011
Mantle isn't really relevant for fast CPUs in BF4's campaign. The reduced overhead helps an AMD FX survive in multiplayer, but it gives you only marginal advantages in our benchmark scene, if any. For mid-range and entry-level systems it might play a role, 100% acknowledged. :)

4 GB really is too little for UHD. My 980 SLI very often runs into that limit, and the result is well known. :(
 

FormatC
Distinguished · Apr 4, 2011
In this one benchmark scene the difference simply isn't worth mentioning. That's all; it's not a general boycott. Mantle was a good idea to speed up development. :)
 


Now give me a single AMD GPU that is not the 290X but is faster than the 290X to compare with the 980.
 


Even AnandTech mentions that the software favors AMD cards, and looking at the product page there is no mention of CUDA. To me it's not that Nvidia can't do better in OpenCL; they simply choose not to.
 

TheOtherOne
Distinguished · Oct 19, 2013
Everyone who keeps saying the R9 295X2 beats the Titan X at almost half the price keeps forgetting about power usage. Even if they don't mind all the noise and heat, nearly double the power draw alone will add to your "cost."
How long will it take for the extra electricity bill to eat up the price difference versus the Titan X, and then to keep costing you for as long as you run that hot, noisy, power-hungry R9 295X2?
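To put rough numbers on that question, here is a minimal break-even sketch. The wattage gap, price gap, electricity rate, and gaming hours are all illustrative assumptions, not figures from the review:

```python
# Rough break-even estimate: how long until the extra power draw of a
# cheaper-but-hungrier card eats up its price advantage.
# All numbers below are illustrative assumptions, not measured values.

EXTRA_WATTS = 250      # assumed extra draw of an R9 295X2 vs. a Titan X under load
PRICE_GAP = 310        # assumed price difference in dollars (e.g. $999 vs. $689)
RATE_PER_KWH = 0.12    # assumed electricity rate in $/kWh
HOURS_PER_DAY = 4      # assumed daily gaming hours

extra_kwh_per_day = EXTRA_WATTS / 1000 * HOURS_PER_DAY
extra_cost_per_year = extra_kwh_per_day * 365 * RATE_PER_KWH
years_to_break_even = PRICE_GAP / extra_cost_per_year

print(f"Extra cost per year: ${extra_cost_per_year:.2f}")
print(f"Years until the price gap is erased: {years_to_break_even:.1f}")
```

Under these particular assumptions the gap takes several years to close, so the outcome hinges heavily on local rates and on how many hours a day the card actually runs.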
 
From a personal standpoint, I don't care if the 295X2 beats the Titan in performance; what I hear about the power draw, heat, and noise from those things is enough to keep me away. I find it humorous when people blatantly ignore power usage and treat it as if it doesn't matter. If these people were paying their own electricity bill, they would not be so happy. I think for an adult who pays his or her own bills, Nvidia is the wiser choice, because you won't run up the electricity bill every month.

If AMD really wants to make the 300 series a killer, they need to make cooler, quieter, and more power-efficient cards. Nvidia is undoubtedly already ahead of the game there, and if AMD plans to increase the performance of their cards they also have to ensure they maintain low power draw and temperatures. It's almost like gas mileage in cars: if a new car comes out with a better engine but gets 10 fewer miles to the gallon, its mileage-to-performance ratio stays roughly the same. To actually improve that ratio you need to make more efficient use of the gas, just as a graphics card needs to make more efficient use of power.
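That ratio is just performance per watt, and it takes one line to compute. A toy illustration follows; the fps and wattage figures are made-up placeholders, not benchmark results:

```python
# Toy performance-per-watt comparison, mirroring the mileage analogy above.
# The fps and wattage figures are made-up placeholders, not benchmark data.

cards = {
    "Card A": {"fps": 60, "watts": 250},   # slower card, modest power draw
    "Card B": {"fps": 75, "watts": 400},   # faster card, much higher draw
}

for name, c in cards.items():
    # Efficiency: frames delivered per watt consumed (like miles per gallon).
    print(f"{name}: {c['fps'] / c['watts']:.3f} fps/W")
```

Card B wins on raw speed but delivers fewer frames per watt: exactly the "better engine, worse mileage" case described above.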

AMD seems to be taking their time with the 300 series, so I hope the cards come out as very worthy products that increase competition and make Nvidia more inclined to release better-performing cards.
 
...If AMD really wants to make the 300 series a killer, they need to make cooler, quieter, and more power-efficient cards.
[Chart: gaming power consumption]

I can't seem to reconcile your statement with this chart ...
 

somebodyspecial
Honorable · Sep 20, 2012


I think he means the Titan X is blowing the doors off the 290X while drawing less power; to compete with the Titan X it takes the bar showing the 295X2. See the electricity-bill info I gave previously, and the Titan X still wins a few things versus that card. The 300 series needs Titan X power levels and also Titan X performance levels, and it will be facing a 6 GB version of the Titan X that will replace the 980 (maybe with slightly lower power usage than the Titan X). You could say the same for the 980 versus the 290X: 185 W against 242 W for the 290X, and the 980 does it while smacking the 290X around in everything. Some might not care about the difference, but it adds up over 4-5 years, especially if you're a heavy gamer or have kids using the machine as well.

With the stuff the EPA keeps doing, now trying to push monitoring of water usage in hotels...LOL, how much will that raise my bill on a room after the costs of installing and monitoring the equipment, etc.? On top of already-high electric bills, you could really end up regretting not getting the lowest-wattage GPU you can live with (performance-wise). They already charge more here for prime hours, from 3-11pm IIRC (as in from when I get home from work until bedtime, which is basically most people's gaming hours during the week). They tell you to do laundry and dishes at midnight to avoid getting dinged...LOL.
 

somebodyspecial
Honorable · Sep 20, 2012


Correct, because they chose to use Adobe apps, and CUDA support has been in them for ages and still is today. AnandTech purposely avoids testing Adobe, IMHO. You wouldn't want to show the same operation on Vegas/AMD/OpenCL versus Premiere/NV/CUDA; NV would smoke AMD.

I remember when AMD was bragging about getting OpenCL into Adobe, but clearly it means nothing if it can't beat NV with CUDA in the same app. I'll note AMD has been silent since they posted the graphic showing massive improvements coming. :( It's comic that everyone is afraid to test Adobe with AMD/OpenCL versus NV/CUDA; you simply run the same tests and check a box for each card. I've been harping on Tom's Hardware for ages to do this. All these sites that hate proprietary stuff let that get in the way of showing the power of CUDA for amateurs at home. At $50 a month for Adobe's CC suite (cheaper if you don't take the whole suite), you'd think they would want to show how it performs for both sides now that OpenCL is in there too. At this price it's in the hands of regular, everyday people; today it's a subscription, not a $1,500 boxed suite.

There is no point in NV supporting OpenCL when CUDA does better and is in almost every pro app on the market (over 200 apps use it). There is a reason NV/CUDA owns ~75% of the workstation market. You don't spend 8+ years and billions building an ecosystem around CUDA and then dump it because someone likes OpenCL. You tell them: let me know when you catch us, and we'll talk then. ;)
 

somebodyspecial
Honorable · Sep 20, 2012


I didn't realize the 295X2 was $500. Silly me, I thought it was $690 at Newegg today, which is hardly half of $1,000, and the power used over a few years means they're about equal in total cost. You don't own these cards for a day or an hour; you game for hours on end for 3-5 years (some of us more, some longer, and some have multiple users in the house).
 

somebodyspecial
Honorable · Sep 20, 2012
http://techreport.com/review/27969/nvidia-geforce-gtx-titan-x-graphics-card-reviewed/8
Just in case I forgot to mention it: TechReport tests BF4 with Mantle on for AMD, and gets 50 fps for AMD versus 48 for NV. Not sure if I mentioned that before. So although Tom's didn't use it (and, as FormatC mentioned, it's pointless in their scene), TechReport did.

I also just read AnandTech on the M6000 (a copy of the Titan X) for pros. $5,000! Wow.
http://www.anandtech.com/show/9096/nvidia-announces-quadro-m6000-quadro-vca-2015
So the Titan X should fly off the shelves at $1,000 for people trying to save money. Gamers should probably just ignore the card and wait for the inevitable 6 GB version, IMHO. :) Let the pros on a budget buy this one.
 

Orlando Caba
Reputable · Mar 21, 2015
Tom's Hardware: no matter how you try to justify spending $1,000 on a single GPU that loses to its dual-GPU competition costing a lot less, the price still doesn't make sense. I own three 290Xs in CrossFire, I'm running a 4K monitor, and I can play every single game maxed out with no microstutter; I'm talking about every game with 4x to 8x supersampling. Plus, you can't reference Far Cry 4, an Nvidia GameWorks title; we all knew that was going to happen. I don't have any issues with CrossFire, and this is the first time I have ever really, truly loved it; it's the first time it's actually better than SLI, thanks to XDMA. Either way, I'm not upgrading my GTX 780 Ti rig until next year, because this Titan X doesn't make much sense.
 

Arabian Knight
Reputable · Feb 26, 2015


And that's why Nvidia never released an 8 GB GTX 970. I'm telling you, Nvidia are thieves.

I will gladly pay $200 more for the 12 GB; as I said, $800 is OK for the Titan X, and $200 more than dual GTX 970s is understandable! Remember that dual GTX 970s have 4+4 GB of RAM, which counts as only 4 GB in use (SLI mirrors the memory across cards) but is still 8 GB in cost!!! So the Titan X has only 4 GB more, and I would still pay $200 more than for 2x GTX 970; I would compromise.
But $1,000? In their dreams. I will never pay money for greed.
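A minimal sketch of that effective-versus-purchased VRAM point, assuming a two-way SLI setup with alternate-frame rendering (which duplicates the working set on each card); the price figure is an illustrative assumption:

```python
# Effective vs. purchased VRAM for multi-GPU setups that mirror memory
# (alternate-frame rendering keeps a full copy of the working set per card).
# The price is an illustrative assumption, not a quote.

def effective_vram(num_cards: int, gb_per_card: float) -> float:
    """Usable VRAM when every card must hold the same data."""
    # num_cards is irrelevant here: mirroring means capacity does not add up.
    return gb_per_card

setup_price = 660              # assumed price of 2x GTX 970 in dollars
purchased = 2 * 4              # you pay for 8 GB of memory chips...
usable = effective_vram(2, 4)  # ...but games can only address 4 GB

print(f"Purchased: {purchased} GB, usable: {usable} GB")
print(f"Dollars per usable GB: ${setup_price / usable:.0f}")
```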

 

somebodyspecial
Honorable · Sep 20, 2012


Greed? You don't read balance sheets or quarterly reports, do you? Nvidia has not made as much as it did in 2007 in any of the last 8 years; that would be an impossible statement to make if they were greedy at any point in those 8 years. On top of that, the ~$500 million they make right now would really be about $233 million if you took off Intel's $266 million/year payments. So they are essentially making a quarter of their 2007 profits. These numbers are puny. Intel brings in $10 billion and you're whining about a company barely making $233 million...LOL.

Get a better job if you want a better card. They are not ripping anyone off; in fact, they need to charge more for everything they sell. The same goes for AMD, which has lost $6 billion in the last 12 years, laid off a third of its workers, sold off everything it used to own, etc. Even NV put off plans to expand into a new building. Neither company is doing great, or none of this could be said; you put off a new HQ when you're not doing as well as you should be. If they're being greedy, I'm the King of England...LOL.

If it weren't for the people who do pay for these super-high-end cards, you wouldn't be getting half the performance you do at $200-500. It's people like you who are helping to kill AMD... "Waaah, I want stuff for free, you're too greedy, waaah." Get a real job and get out of the welfare line. The professional version of this card, the M6000, will be $5,000+. Expensive to you, but for a company that lives or dies on pumping out graphics, the driver support alone is more than worth it. Your statement might have some merit if they were clearing a billion a year (meaning beating 2007's ~$800-850 million), but they're nowhere near that.
 


Same power consumption? No:

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,8.html

http://www.guru3d.com/articles_pages/geforce_gtx_970_sli_review,4.html

Also, it is not stupid; it is about making as much profit as possible.
 


It seems that you're just angry with Nvidia because they are not selling the hardware at the price you're willing to pay. The top card has always come at a premium price; this has been the case for a long time. For these companies there is no good or evil, just profit. Let me tell you this: do you think AMD likes having to sell the 295X2 for $600+ when the card was supposed to sell at a $1,500 price tag?
 

Sekeira
Reputable · Oct 17, 2014
If I had such a budget for a GPU right now, I would go for the 295X2 without a second thought. It costs half as much here in Europe (€630 versus the Titan's whopping €1,240 right now on Amazon), it looks way better, runs a lot cooler, has a backplate, and runs just as fast or even faster. With the rest of the money I could buy the PSU and a kick-ass monitor to go along with it, or I could go on a 15-day vacation to Spain or Italy... Nvidia just ruins it for all of us with these pricing policies. I truly hope AMD does not follow suit, but I think they will.
 

Sekeira
Reputable · Oct 17, 2014
SLI 2x Titan X + G-SYNC.

If money were not an issue, that's what I would do.

*And why do people whine about the COST of any of the Titan cards? Nvidia isn't misleading anybody here; if you don't think it's worth the cost, then don't buy it.

I don't complain because my Ferrari wasn't a good value.
 
Even if I had $1,000 to spend on a GPU, I would not buy either the Titan X or the 295X2; they are both bad value. By Black Friday this year you will be able to buy a cut-down GM200 or Fiji card for under $600 anyway, at which point you could have two for under $1,200, which would completely destroy the Titan X and the 295X2. Much better value. Otherwise, the reviews are just that: forget the price, it's a sneak peek at single-GPU performance for the next two years... running beta drivers, mind you. As always, driver maturation will add roughly 5-10% performance over the next year or so. We all need to root for AMD to deliver a just-as-fast, cool-running, stable Fiji card with just as much overclocking headroom as GM200; it's simply better for all of us.
 