cangelini:
Hate? The R9 280X won an *award*. I think Tahiti at $300 is pretty much brilliant.
I wrote one of the least flattering GTX 780 stories out there. I only identified a couple of situations where a Titan made any sense at all. And although the 760 *did* change the balance at $250, that card still didn't get an award. I liked the 770 for the simple fact that it delivered better-than-680 performance for close to $100 less.
The rest of AMD's new line-up is a lot like what exists already. Again, the 7870 is a better value than the 270X. So what are you getting worked up over? The fact that I'm pointing out these aren't new GPUs? They're not. 😉
Unfortunately, you can get another 100 MHz (10%; 104 MHz, actually) for $249 after a $10 rebate on multiple GTX 760 models at Newegg (probably elsewhere too). So if you add 10% to all the benchmarks, going AMD is a $40-50 loss, not totally brilliant. Correction: just checked, and the MSI model is 117 MHz over stock. Also $249. Overclocked out of the box and shipped that way is basically what I call reference. There is no point in buying a reference card for $249 when you can get 10%+ more for the same cost. At that point you should be testing one of the boxed cards we'd all actually buy, not some reference point nobody in their right mind would purchase.

EVGA and MSI are $249, and Gigabyte has one at $259 (no rebate, but still overclocked in the box). Zotac has an even faster model for $259 (1111/1176 MHz, far better than the 980/1033 MHz reference, right? A FREE 143 MHz, well over 12%, 13%+ depending on which way you look at it), and its memory runs an extra 200 MHz on top of that. You should at least be benching one of those and saying there is a $40 difference. Why test reference? What idiot buys those?
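Since I'm throwing percentages around, here's a quick sanity check of that clock math; a minimal sketch using only the numbers above (980/1033 MHz reference, 1111/1176 MHz for the Zotac), not anything measured in the review:

```python
# Back-of-the-envelope check of the factory-OC deltas quoted above.
# Assumed inputs (from this post, not the review): GTX 760 reference
# base/boost = 980/1033 MHz, the Zotac model = 1111/1176 MHz.
clocks = {"base": (980, 1111), "boost": (1033, 1176)}

for label, (ref, oc) in clocks.items():
    delta = oc - ref
    print(f"{label}: +{delta} MHz = {delta / ref:.1%} over reference")

# base:  +131 MHz = 13.4% over reference
# boost: +143 MHz = 13.8% over reference
# (143 / 1176 is about 12.2% if you divide by the OC clock instead,
#  hence "well over 12%, 13%+ depending on which way you look at it".)
```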
Having said that, I can get a 1000/1050 MHz 7970 for $289 after rebate, so I guess you'd call that a $30 difference. No free games with the 200 series either (yet? until shelves clear of the old cards, I'd guess), versus a brand-new AAA Batman: Arkham Origins bundle, right? The EVGA card also comes with Rise of the Triad.
"For the past two years, we’ve watched AMD dominate compute-oriented workloads. It does particularly well in the OpenCL-accelerated LuxMark benchmark, based on the LuxRender rendering system. Nvidia’s Kepler architecture isn't as inspiring for this type of task."
Well, yeah, when you ignore the fact that only a fool runs OpenCL on NV when CUDA can be used just by swapping out the plugin. 😉

Why do you guys still insist on forcing NV into a situation no pro would actually use? How about pitting the LuxRender plugin against a CUDA renderer on NV, like Octane? You buy a CUDA card specifically to take advantage of seven years of CUDA work done by Nvidia, right? Would you seriously use an OpenCL rendering plugin when Octane and company are readily available for ALL of the same apps? Please tell me you are not this dumb. You can say NV sucks at OpenCL, but in the same sentence you should say "but only a fool would use it over CUDA." You're leaving out half the story. PhysX is also an advantage for games; it isn't used enough yet, but when it is, it's pretty cool.
I could almost say the same for Maya and the rest. FurryBall, Octane, etc. can be used. With LightWave you can use Octane for CUDA. I don't understand why you guys keep ignoring CUDA.
OctaneRender™ supports all of these:
ArchiCAD, Cinema 4D, Inventor, Maya, Revit, Softimage, 3ds Max, Blender, DAZ Studio, LightWave, Poser, Rhino
AutoCAD (coming soon)
SketchUp (in development)
Pretty much the same set as LuxRender (which, IMHO, should only be used on AMD, since it isn't running CUDA optimizations). Someone please explain why you NEVER pit OpenCL-on-AMD against CUDA-on-Nvidia. As long as you've been doing this and tossing in "Nvidia sucks for this type of work," I can only assume you don't want the world to know that CUDA vs. OpenCL is a no-contest situation. ratGPU for Maya or 3ds Max? Why? Use Octane or FurryBall and CUDA. We are talking UNFUNDED (OpenCL; AMD is broke) vs. FUNDED (NV has had seven years of profits to dump into CUDA, 640 universities teach it, over 200 apps use it, etc.). Even bitmining has CUDA versions... ROFL. Though it's useless now anyway with ASICs. You admit this, but bench it anyway... ROFL. What for? To show a niche case nobody uses? Isn't that like benchmarking a game from 1995 and saying "expect this performance today"?
http://www.hpcwire.com/hpcwire/2013-08-02/nvidia_cuda_55_production_release_now_available.html
Aug 2013. Wow, up from 500 about a year ago (my older posts say 500... LOL). CUDA keeps gaining ground, and it's on ARM now too.
One more observation here: what's up with bitmining using 100 W more on the 280X vs. the 760 (really, that high?), and 57 W more in gaming? So AMD is using 40% more power in bitmining, 28% more in gaming, 3x more at multi-monitor idle, and a little over 2x more while playing a movie? But this is a brilliant card? I think NV figures less power, less noise, better drivers, and less heat = HIGHER PRICE. According to HardOCP, they don't plan on cutting the 760/770, only the lower models (the 660 drops to $179 today, I guess), but I suspect that will change shortly. Supposedly they have two new cards coming too (my guess: just a fully unlocked GK110, probably clocked higher).
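Coming back to those power deltas for a second, here's the rough math, working backward from just the deltas and percentages in this post; the absolute wattages are my own inference, not numbers from the review:

```python
# What the claimed deltas imply for total draw. The baselines are
# back-calculated from this post's figures (an assumption on my part),
# not measurements from the article.
claims = {"bitmining": (100, 0.40), "gaming": (57, 0.28)}  # (extra watts, fraction "more")

for workload, (delta_w, pct_more) in claims.items():
    gtx760 = delta_w / pct_more      # implied GTX 760 draw
    r9_280x = gtx760 + delta_w       # implied R9 280X draw
    print(f"{workload}: ~{gtx760:.0f} W (760) vs ~{r9_280x:.0f} W (280X)")

# bitmining: ~250 W (760) vs ~350 W (280X)
# gaming:    ~204 W (760) vs ~261 W (280X)
```

Either way you slice it, that's a big gap for cards in the same price bracket.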
Can you say CRAP drivers? 😉 Or is it just a crappy product? You guys gloss over that like it's nothing; not even a comment at the bottom of the power page... ROFL. Mine would say something like "with this kind of power usage we'd expect FAR better performance, since NV cleans AMD's clock on power."

Why was the 280X left off the NOISE test, and why were none of them compared to NV? Oh, right, they suck on noise too. Why no overclocking either? Already too hot and too loud, with wattage skyrocketing even before you touch it? No comment on any of that in the article either. 😉