GeForce GTX Titan X Review: Can One GPU Handle 4K?

Status
Not open for further replies.

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960

All I am saying is that power efficiency isn't a big deal when you are building a PC with one discrete GPU, while the other poster claims it is the single most important aspect of the chip. If that were true, nobody would buy Titans or GTX x80 cards. Nobody would buy ANY AMD cards. This has nothing to do with the 295X2 vs. the Titan X unless you want it to. It is about what an enthusiast PC builder looks at when shopping for GPUs and CPUs. One of those things is power efficiency, but it isn't at the top of the list. If it were at the top of the list, then enthusiasts with unlimited budgets would go with the 960 over the 980 or Titan X - but they don't. They go with the most powerful GPU, sometimes 2 or 3 of them, even though they are less efficient. Performance will always trump power efficiency. And most of the people who buy the 960 don't buy it solely for its high efficiency, although that is a huge plus - they buy it because it is within their budget, and for $200 it is the best-performing card you can get.
 

anthony8989

Distinguished
Just imagine for a moment that AMD's offerings were on par with Nvidia's in every respect except power efficiency, and that all competing cards were within a few dollars of each other. What enthusiast gamer in their right mind would purchase a less efficient card at any performance level when a more efficient option is available, even for only a few dollars more?

That's the situation AMD wishes it was in. Because as it stands right now, in reality, their chief competitor has faster, more efficient models out with an onslaught of features (that actually work) - complemented by an array of affiliated products that utilize interwoven technologies. All to make our gaming experience more delightful. How great?

Not to mention, there are other benefits to power efficiency than just $$$. Run 500 watts through a semiconductor, or hell, even a piece of pure silver, then run 200 watts through another - guess which one wears down faster.

The fact that you bring the Titan X into a discussion of gaming performance per dollar just confirms that you don't understand this card at all. That fact, combined with the previous points you've attempted to make, is why you should not be commenting on this discussion.
 
A rough guess is that AMD will offer a product that draws 10-15% more power than Nvidia's and is maybe only 3-5% slower in overall gaming performance on average, but at a price that is 10-15% lower than Nvidia's.
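A back-of-the-envelope sketch of that guess (all figures hypothetical, just plugging in the 15% / 5% / 15% numbers above): a card like that wins on performance per dollar but loses on performance per watt.

```python
# Hypothetical figures only, plugging in the guess above: 15% more power,
# 5% slower, 15% cheaper. Baseline card normalized to perf = 1.0.
baseline = {"perf": 1.00, "power_w": 250.0, "price": 1000.0}
cheaper  = {"perf": 0.95, "power_w": 250.0 * 1.15, "price": 1000.0 * 0.85}

def perf_per_dollar(card):
    return card["perf"] / card["price"]

def perf_per_watt(card):
    return card["perf"] / card["power_w"]

# The cheaper card comes out ~12% ahead on perf/$ ...
print(round(perf_per_dollar(cheaper) / perf_per_dollar(baseline), 2))  # 1.12
# ... but ~17% behind on perf/W.
print(round(perf_per_watt(cheaper) / perf_per_watt(baseline), 2))      # 0.83
```

Which ratio matters more is exactly the question the thread is arguing about.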
 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960

If you are talking to me, bud, I never said anything about performance per dollar. Performance per watt, yes. Per dollar? No. And I am not comparing AMD vs. Nvidia. I am just saying that power efficiency isn't a top priority when shopping for GPUs if you are a rich enthusiast. They look at performance. Not performance per dollar or per watt - performance, per-iod. I don't think you guys realize that I am not trying to argue AMD vs. Nvidia. I am just replying to someone who made a comment that bluntly said "power efficiency is the single most important aspect of modern chips" or something like that. The last time you shopped for a CPU or GPU, what did you look at? Only power efficiency? Then there is the warm and delightful community we have here - two people basically called me a stupid idiot, and now you are pretty much telling me to shut up with your last sentence. Why? Probably because you guys think I am an AMD fanboy? I am not even talking about AMD vs. Nvidia. I am comparing models not only within the Nvidia desktop GPU lineup, but across all Maxwells as well. If power efficiency is such a big deal when an enthusiast is shopping for GPUs, why do any of them buy a 980 or Titan X over a 960?
 
Well, generally speaking, last generation the 290X vs. the 780 Ti were about the same performance per dollar through each of the price drops. The 290X did not overclock worth a crap - 100 MHz over shipped clocks and the card was well above 80°C. The 780 Ti, however, could generally be pushed to a 200-300 MHz overclock before it reached 79-82°C, and at that point it was much faster. In the top ten most-played popular online games (WoW, LoL, CS:GO, BF4, Diablo 3, Skyrim, etc.), I think AMD did come out slightly ahead, by about 10% more fps per dollar, but at the cost of about 7% overall (roughly 3% when factored) in the top 5 most-played PC games, according to the study. At stock clocks AMD held a considerable lead in performance per dollar, but this still came at the cost of more fan noise, higher power consumption, and nearly no overclocking headroom.
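The overclocking-headroom point above can be sketched numerically. The clocks below are illustrative assumptions, not measured values, and performance is assumed to scale roughly linearly with core clock, which is only an approximation:

```python
# Illustrative clocks only, not measured values. Assumes performance scales
# roughly linearly with core clock (an approximation that ignores memory
# bandwidth, thermal throttling, etc.).
cards = {
    "290X":   {"stock_mhz": 1000.0, "max_oc_mhz": 1100.0},  # ~100 MHz headroom
    "780 Ti": {"stock_mhz": 875.0,  "max_oc_mhz": 1150.0},  # ~275 MHz headroom
}

for name, card in cards.items():
    gain = card["max_oc_mhz"] / card["stock_mhz"] - 1.0
    print(f"{name}: roughly +{gain * 100:.0f}% performance from overclocking")
```

Under these assumptions the card with more headroom gains about +31% from overclocking versus +10%, which is how a stock perf-per-dollar lead can evaporate once both cards are pushed.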
 

anthony8989

Distinguished
If you are talking to me, bud, I never said anything about performance per dollar. Performance per watt, yes.

Oh good, then look at this:

[Image: performance-per-watt chart at 3840x2160 (perfwatt_3840.gif)]


In case it doesn't show, here's the link: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/30.html

Titan X > every other card, per-iod. You are not an enthusiast gamer; stop speaking on their behalf.
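For reference, the metric in that chart is simple to reproduce: average fps divided by board power. The figures below are made up for illustration, not TechPowerUp's measurements; the point is that a card can top the perf/W chart and still be the fastest outright.

```python
# Made-up figures for illustration; not TechPowerUp's measurements.
cards = {
    "fast, efficient card":  {"avg_fps_4k": 44.0, "board_power_w": 240.0},
    "slower, hungrier card": {"avg_fps_4k": 40.0, "board_power_w": 300.0},
}

for name, c in cards.items():
    ppw = c["avg_fps_4k"] / c["board_power_w"]  # fps per watt
    print(f"{name}: {ppw:.3f} fps per watt")
# 44/240 ≈ 0.183 fps/W beats 40/300 ≈ 0.133 fps/W, and the same card is
# also faster outright: efficiency and raw speed are not mutually exclusive.
```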
 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960

Oh, really? Look at my current rig. Thanks for proving that mine is better than yours. Still, nobody looks at these figures when they shop for these cards. I didn't give a crap - that is why I didn't know it was this good per watt. I just wanted the best performance, PERIOD. Thanks for proving my point, BUD. Again, I don't care if the Titan X generates its own power; power efficiency is not the top aspect/selling point of desktop enthusiast GPUs. I am not trying to say any card is better than any other card. I am just saying that isn't the only thing that people shopping for these cards look at. I don't exactly know why you find that hard to believe. I don't get why you think I am trying to make a certain card look bad when I OWN TWO OF THEM. If you have facts that prove that people do prioritize power efficiency over raw performance, then please share them. Until then, stop telling people what to do. Stop trying to tell me what I am or what I am not. Unless you are an omniscient god, you don't know anything about me.
 

Eggz

Distinguished


Going in circles at this point. Your whole argument above was that manufacturers should engineer chips to be as powerful as possible even if it wastes all sorts of electricity.



It's simple. GPU capability is the most important thing to people looking for performance. No surprise there. But it's not the only thing to consider. What else? Well, power efficiency is a major factor, for the reasons I already explained above. Go back and re-read if you don't remember. I just did, and nothing you're bringing up went unaddressed up there.

Good luck with everything.
 


Yes, it might not matter much in regular consumer applications, but the very same chip is also used in servers and supercomputers.
 

anthony8989

Distinguished
Oh, really? Look at my current rig. Thanks for proving that mine is better than yours. Still, nobody looks at these figures when they shop for these cards. I didn't give a crap - that is why I didn't know it was this good per watt. I just wanted the best performance, PERIOD. Thanks for proving my point, BUD. Again, I don't care if the Titan X generates its own power; power efficiency is not the top aspect/selling point of desktop enthusiast GPUs. I am not trying to say any card is better than any other card. I am just saying that isn't the only thing that people shopping for these cards look at. I don't exactly know why you find that hard to believe. I don't get why you think I am trying to make a certain card look bad when I OWN TWO OF THEM. If you have facts that prove that people do prioritize power efficiency over raw performance, then please share them. Until then, stop telling people what to do. Stop trying to tell me what I am or what I am not. Unless you are an omniscient god, you don't know anything about me.

LOL. I typically try to avoid getting personal in forum arguments, but I'm afraid in this case I can't help myself.

"My rig is better than yours." Lol, first of all sir, with a rig like: Asus X99 Rampage V-Ex, Intel 5960X, 16 GB (2x8 GB) G.Skill 2400, 2x Titan X SLI, 512 GB 850 Pro SSD, 2 TB WD Black, Sony Optiarc BD-R, SeaSonic X-850, 2x Acer XB280HK 4K G-Sync monitors, Thermaltake Level 10 GT case, NH-D14, NZXT Sentry-2 fan controller

It's pics or it didn't happen, and gtfo! :) JK, sorta.

Second, since you clearly have no concept of what a Titan-series card really is, and therefore couldn't possibly utilize it to its full potential: if that rig is truly real and indeed yours, I'd be inclined to infer that you are a wannabe enthusiast with more money than brains. The type of person who would hire someone to purchase the parts and build the PC for them with a practically unlimited budget.

Now riddle me this, oh great PC enthusiast with pro rig in profile: why spend $300-$400 more on your GPUs for less performance than 3-way SLI 970s or 980s when you don't care about power efficiency or space? Why pay the premium on the Titan Xs when you don't use CUDA? Why make comments like
"Once all the people who actually wanted Titan Xs buy them, the "demand" will end. And there really aren't that many people who will wake up tomorrow and be willing to charge $1,000+ on their paypal account for one of them. Right now, there are more of these people than there are cards"
and also,
"I don't care what the people who buy it are using it for, be it gaming, CUDA or as a tiny boat anchor"
- Aren't you "the people"?

For someone who owns two Titan Xs, it really does seem as though you are an outsider looking in.
 

tristan huot

Honorable
Dec 25, 2013
5
0
10,510
There is something wrong with their benchmarks. I think they added MSAA, because I have a 4K TV with an overclocked 970 and I get 50 fps average in Battlefield 4 at ultra settings. They should turn it off because you don't need it at that resolution.
 

