blackkstar
Honorable
@juan you miss my point. HEDT is a market that will put two 300W graphics cards in a system, then buy 125W CPUs and overclock them to 200W+. The only metric that matters on HEDT is raw performance. If it were all about efficiency, no one on HEDT would overclock at all.
Let's say those rumors I posted are true.
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/13
Anandtech has the GTX 980 doing ~42fps in BF4 at 4K on ultra. Add the 20% performance from the rumor I posted and you get 50.4fps. The GTX 980 system pulls 294W under load, so a 6% power increase puts it around 311W.
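To make the rumor math concrete, here's a quick sketch. The baseline numbers are from Anandtech's BF4 4K ultra test above; the +20% performance and +6% power deltas are just the rumor, not confirmed specs:

```python
# Rough perf-per-watt math for the rumored part vs. the GTX 980.
# Baseline fps/watts are Anandtech's BF4 4K ultra figures; the
# +20% perf / +6% power deltas are rumored, not confirmed.
base_fps = 42.0        # GTX 980, BF4 4K ultra
base_watts = 294.0     # full system power under load

rumor_fps = base_fps * 1.20      # 50.4 fps
rumor_watts = base_watts * 1.06  # ~311.6 W

print(f"rumored part: {rumor_fps:.1f} fps at {rumor_watts:.1f} W")
print(f"perf/W: base {base_fps / base_watts:.3f}, "
      f"rumored {rumor_fps / rumor_watts:.3f}")
```

Note the perf-per-watt actually goes *up* slightly on the rumored part, which is exactly why HEDT buyers would take the higher absolute power draw without blinking.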
What do you think gamers and other HEDT enthusiasts will want? The one with the lower TDP or the one with the higher performance (ignoring metrics like driver quality since they're irrelevant to what I'm getting at)? Have you seen anyone take a GTX 660 over a GTX 980 because the GTX 660 used less power? I don't think I've ever seen that happen in a forum that specialized in HEDT parts.
My entire point is that this is basically a given fact. AMD released a 200W+ TDP CPU and people bought it! Yes, efficiency is important, but people will take the higher-performing card even if it's far less efficient. People still bought Fermi and the Fermi refresh. People buy Titans and dual-GPU high-end cards from Nvidia and AMD.
Also, there's no need to treat me like some sort of idiot who doesn't understand efficiency. I have a 4-CPU Opteron rig, all 75W TDP chips, and in what I use it for it trades blows in multi-threaded work with a 4.5GHz 6-core Haswell-E or a stock 8-core Haswell-E. Yes, in total that's 300W of CPU, but it cost significantly less than the 8-core Haswell-E EE and I still use it because it has more raw performance. And this might be difficult for you to fathom, but most HEDT people only have one or two computers and pay the average US electricity rate of 12 cents/kWh. Spending even $20 a month on electricity for a ridiculous overclocked two-GPU system is still far cheaper than something like going to the movies or going out to the bar.
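For what it's worth, the electricity math is easy to check. A sketch with made-up illustrative numbers (the 800W draw and 7 hours/day of load are assumptions, not measurements; 12 cents/kWh is the US average rate mentioned above):

```python
# Monthly electricity cost for a hypothetical heavily-loaded rig.
# The 800 W draw and 7 h/day of load are assumed for illustration;
# 0.12 $/kWh is the average US residential rate.
watts = 800            # full-system draw under load (assumed)
hours_per_day = 7      # daily hours at that load (assumed)
rate = 0.12            # $/kWh
days = 30

kwh_per_month = watts / 1000 * hours_per_day * days
cost = kwh_per_month * rate
print(f"{kwh_per_month:.0f} kWh/month -> ${cost:.2f}/month")
```

Even with those generous load assumptions it lands right around $20 a month, i.e. roughly one night out.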
I know you love your precious efficiency, but the vast majority of HEDT owners don't care at all. It's irrelevant in that market. What matters is raw performance. And whenever Intel, AMD, or Nvidia start talking up efficiency instead, HEDT ends up disappointed.
Even those little APUs you fawn over so much have been a massive disappointment to HEDT owners. In fact, it's such a huge disappointment that people will buy a two-year-old architecture on an archaic platform over something more efficient, because the old architecture and platform offer much better raw performance, regardless of how efficient the APUs are. The majority of people who want high-end gaming rigs go for a discrete CPU and dGPU even if an APU is way more efficient, because they only care about raw performance.
I agree with you that in servers there's a "sweet spot" between power consumption and raw performance if you're doing something like running Disney's 55,000-core render farm, but no one in their right mind cares about it for HEDT as long as they have access to reasonably priced electricity, which most of the developed world has unless you're living on a little island somewhere.
Servers: efficiency good
GPGPU farms: efficiency good
Mobile: efficiency good
Embedded: efficiency good
High end desktop: no one cares about efficiency, they care about performance.
The biggest problem with enthusiasts is that they masturbate over bar graphs of raw performance without even understanding them. To say that HEDT cares about anything besides raw performance is borderline blasphemy.