AMD Radeon R9 280X, R9 270X, And R7 260X: Old GPUs, New Names

Do the company's price adjustments make this introduction newsworthy, or will the excitement need to wait for its upcoming Radeon R9 290 and 290X, based on fresh silicon?

Well, I don't know; you got an article and a news report out of it, so it seems to me the answer's obvious.
 


Unfortunately you can get another 100 MHz (10%; actually 104 MHz) for $249 after a $10 rebate on multiple models at Newegg (probably elsewhere too). So if you add 10% to all the benchmarks, this is a $40-50 loss if you go AMD; not totally brilliant. Correction: just checked, and the MSI model is 117 MHz over stock, also $249. Overclocked out of the box and shipped that way is basically what I call reference. There is no point in buying a reference card for $249 when you can get 10%+ more for the same cost. At that point you should be testing one of the boxed cards we'd all buy, not some reference point nobody in their right mind would purchase. EVGA/MSI are $249, and Gigabyte has one at $259 (no rebate, but still OC'd in the box). Zotac has an even faster model for $259 (1111/1176, far better than the stock 980/1033, right? A free 143 MHz, well over 12%, 13%+ depending on which way you look at it). Memory runs an extra 200 MHz on the Zotac too. You should at least be benching one of these and saying there is a $40 difference. Why test reference? What idiot buys those?
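To put numbers on that uplift, here's a quick back-of-the-envelope sketch (Python, using only the reference and Zotac clocks quoted above):

```python
# Rough check of the factory-OC uplift quoted above: reference clocks
# vs. the Zotac model's clocks, both taken from the comment.
ref_base, ref_boost = 980, 1033   # MHz, reference base/boost
oc_base, oc_boost = 1111, 1176    # MHz, Zotac factory-OC base/boost

for label, ref, oc in (("base", ref_base, oc_base), ("boost", ref_boost, oc_boost)):
    delta = oc - ref
    print(f"{label}: +{delta} MHz ({delta / ref:.1%} over reference)")

# base: +131 MHz (13.4% over reference)
# boost: +143 MHz (13.8% over reference)
```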

Having said that, I can get a 1000/1050 MHz model of the 7970 for $289 after rebate, so I guess you'd say a $30 difference. No free games with the 200 series either (yet? until shelves clear of old cards, I'd guess), versus the brand-new AAA Batman: Arkham Origins, right? EVGA also throws in Rise of the Triad.

"For the past two years, we’ve watched AMD dominate compute-oriented workloads. It does particularly well in the OpenCL-accelerated LuxMark benchmark, based on the LuxRender rendering system. Nvidia’s Kepler architecture isn't as inspiring for this type of task."

Well, yeah, when you ignore the fact that only a fool runs OpenCL on NV when CUDA can be used just by swapping out the plugin 😉 Why do you guys still insist on forcing NV into a situation no pro would actually use? How about putting the LuxRender plugin up against a CUDA renderer for NV, like Octane, etc.? You buy a CUDA card specifically to take advantage of seven years of CUDA work done by Nvidia, right? Would you seriously use an OpenCL rendering plugin when Octane and the like are readily available for all of the same apps? Please tell me you are not this dumb. You can say NV sucks at OpenCL, but in the same sentence you should say "but only a fool would use it over CUDA." You're leaving out half the story. PhysX is also an advantage for games; it's not used enough yet, but when it is, it's pretty cool.

I could almost say the same for Maya and the rest; FurryBall, Octane, etc. can be used. With LightWave you can use Octane for CUDA. I don't understand why you guys keep ignoring CUDA.

OctaneRender™ supports all of these:
ArchiCAD, Cinema 4D, Inventor, Maya, Revit, Softimage, 3ds Max, Blender, DAZ Studio, LightWave, Poser, Rhino
AutoCAD (coming soon)
SketchUp (in development)

Pretty much the same set as LuxRender (which, IMHO, should only be used on AMD since it isn't running CUDA optimizations). Someone please explain why you NEVER pit OpenCL on AMD against CUDA on Nvidia. As long as you've been doing it, and tossing in "Nvidia sucks for this type of work," I can only assume you don't want the world to know that CUDA vs. OpenCL is a no-contest situation. RatGPU for Maya or 3ds Max? Why? Use Octane or FurryBall etc. and CUDA. We are talking UNFUNDED (OpenCL; AMD is broke) vs. FUNDED (Nvidia has dumped seven years of profits into CUDA, 640 universities teach it, over 200 apps use it, etc.). Even bitcoin mining has CUDA versions... ROFL. Though it's useless now anyway with ASICs. You admit this, but bench it anyway... ROFL. What for? To show a niche case nobody uses? Isn't that like benchmarking a game from 1995 and saying to expect that performance today?

http://www.hpcwire.com/hpcwire/2013-08-02/nvidia_cuda_55_production_release_now_available.html
Aug 2013. Wow, 640 universities now, up from about 500 a year ago (my older posts say 500... LOL). CUDA is gaining even more ground, and it's on ARM now too.

One more observation here: what's up with bitcoin mining using 100 W more on the 280X vs. the 760 (really, that high?)? And 57 W more in gaming. So AMD is using 40% more watts in bitcoin mining, 28% more in gaming, 3x more watts at multi-monitor idle, and a little over 2x more watts while playing a movie? But this is a brilliant card? I think NV figures less power, less noise, better drivers, less heat = higher price. According to HardOCP they do not plan on cutting the 760/770, only lower models (the 660 drops to $179 today, I guess). But I suspect that will change shortly. Supposedly they have two new cards coming too (my guess is just a fully unlocked GK110, and probably higher clocked).
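For what it's worth, here's a minimal sketch (Python) of the arithmetic behind those percentages; the GTX 760 baselines are back-calculated from the deltas and percentages quoted above, not measured figures, and presumably reflect whole-system draw:

```python
# Back-calculate the power draws implied by the quoted deltas/percentages.
# These are inferred from the comment's own numbers, not measurements.
claims = {
    "bitcoin mining": (100, 0.40),  # +100 W on the 280X, ~40% more
    "gaming": (57, 0.28),           # +57 W on the 280X, ~28% more
}

for workload, (delta_w, pct_more) in claims.items():
    gtx760 = delta_w / pct_more   # implied GTX 760 draw
    r9_280x = gtx760 + delta_w    # implied R9 280X draw
    print(f"{workload}: GTX 760 ~{gtx760:.0f} W, R9 280X ~{r9_280x:.0f} W")

# bitcoin mining: GTX 760 ~250 W, R9 280X ~350 W
# gaming: GTX 760 ~204 W, R9 280X ~261 W
```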

Can you say CRAP drivers? 😉 Or is that just a crappy product? You guys gloss over that like it's nothing; not even a comment at the bottom of the power page... ROFL. Mine would say something like "and with this kind of power usage we'd expect FAR better performance, as NV cleans AMD's clock on power." :) Why was the 280X left off the noise test, and why were none of them compared to NV? Oh, right, they suck at noise too. Why no overclocking either? Already too hot and noisy, with wattage skyrocketing even without it? No comments on any of that in the article either 😉
 


Reference cards use reference design boards and are easier/cheaper to water cool.

Beyond that, I am unsure why you are comparing the 760 to the 280X. The 280X competes with the 770, which is $100 more. The 7950 is still available at $50-70 less than the 760 and matches or exceeds its performance. Yes, the 270X is not a direct competitor for the 760, and it doesn't appear that AMD had any intention of having it compete against the 760, but it does lock down the $200 price point for performance. The 280 (non-X) will eventually be released as a rebranded 7950 to occupy the $250 price point, and it shouldn't have any issues handling the 760, since the 7950 still overclocks well beyond the 760's overclocking headroom.

Also, not sure where you are getting the idea that the game bundle isn't included with the 280X. Models on sale in Europe include the bundles...

With that said, feel free to continue with your mindless rant :)
 


Probably because neither is at the same price point. But having them included in the benchmarks would be useful.
 
I'm kind of confused: I keep seeing "7970 GE for $30 more (than the R9 280X)" in this article, yet on both Newegg.ca and Newegg.com I can't find one for less than $379.99. I'm assuming you mean the non-GE 7970, but still, it just seems a little unfair to AMD's effort at providing more performance for a better price.
 
Are you a troll? 😛

The Tensilica DSPs are likely the same ones used in the XB1. They're highly programmable, and will give developers greater control. They are capable of offloading essentially all of a game's advanced audio processing, which would otherwise eat into your CPU budget, even with an X-Fi installed. With that being said, they don't replace your sound card. These DSPs are complementary, and won't stop me from seeking a high-quality add-in sound card (such as an X-Fi).
 
You're right. And congrats for doing your research on CUDA vs. OpenCL. The bottom line is: in GPU and CPU technology, AMD is just plain inferior to its competitors. They struggle to squeeze every last drop out of their architecture while nVidia and Intel continue to perfect and refine theirs. IMO, you only buy AMD components when you can't afford nVidia or Intel (I'm sure it's been said before). And AMD must recognize this on some level, because it seems to be their exact market strategy in discrete graphics vs. nVidia.

I lost my respect for that company after that stupid D-bag commercial they peddled out: "You don't want nVidia. Their cards don't even work properly, their tech is outdated, and they try to bribe you with free games." Projecting much?

 


Easier and cheaper to water-cool? I'm not sure that's true. But anyhow, water cooling aside, the point stands: why buy a reference card when enhanced versions of the card are available for the same price? It just does not make sense unless you're somehow saving a bunch of cash on your water-cooling setup (which I really don't think you would).
Also, he's comparing the 280X with the GTX 760 because practically every reviewer out there is doing exactly the same thing. It's currently the closest nVidia card to the 280X in terms of price. They've already established that.
 
Boring article. The R9 280X is slower than the 7970 GHz? Lol, you have AMD OverDrive right there to fix it. At $300 and with even a beginner's overclock, this card, just like the 7970, simply blows the 680/770 away without breaking a sweat.
Tom's review of the 770, an OC version of the 680 at the same price, is more positive. Why do you have to mess AMD up like in this garbage?
 
OMG, these names are just too much. Seriously, have they heard that less is more? R9 280X-FDHD... that has to be one of the worst names for a card ever. What does the X even stand for? If the name is already long and redundant, why bother tacking on even more redundant things?
 
I'm most interested in this "TrueAudio" thing. Can it be enabled anyway, like Nvidia's PhysX, and just run on the CPU? Or are they being a$$holes and completely locking it out without the right graphics (sound?) card? Can AMD just make a sound card? That would make more sense, rather than integrating it into the GPU.
 


Actually, I think you may have to blame Sony for that, as they were the money-grabbing gits who came up with HDMI and then charged the world to use it.
 


Probably not much, since professional-series drivers are optimized for graphics functions that are generally not used in games, such as antialiased lines, whereas games use functions rarely used in pro apps, e.g. two-sided textures.

Ian.

 

Is that a whiff of troll, or just fanboy?
I've owned cards from both companies over the last dozen+ years, and nothing has convinced me that one is notably inferior.
IIRC, image quality metrics favor AMD, but I haven't watched movies on my cards, so I couldn't tell you firsthand that there's a difference. At the lower end of the budget and wattage scale where I typically buy, AMD wins on bang/buck hands down, and has pretty much since the HD 5670, maybe before. I'd love for nVidia to compete in that range, but they don't. If nVidia had a competitor to the low-profile HD 7750, I'd buy it right now. OTOH, nVidia appears to win at the top end on pure bang, and SLI seems to have been the better multi-card solution for a long time. I'm using weasel words like "seems" and "appears," and that's part of my point: the two companies have their strengths and weaknesses, but overall neither one is remarkably "inferior" to the other, and the differences will largely be a matter of opinion.
 
Thanks to the Origin PC debate, there have been some interesting numbers making the rounds on the internet, though. 😗

Jon Bach, president and founder of Seattle’s Puget Systems, provided a cornucopia of reliability data culled from testing of 5698 units. Here’s what he had to say via email:

“It is hard to quantify customer experience, but one thing I can quantify is reliability. How many have failed? Here’s a report from the last 3 years.

Nvidia: 5.36% total failures (in our testing + in the field)
AMD: 8.89% total failures (in our testing + in the field)

But more important than failure rate is how many failed in our customer hands? We do a lot of testing here to weed out as many bad cards as possible in our build process. Here’s how many have failed in the field over the last 3 years:

Nvidia: 2.42% failures in the field
AMD: 3.23% failures in the field

Here’s that same info, over the last 1 year only:

Nvidia: 4.95% total failures (in our testing + in the field)
AMD: 7.79% total failures (in our testing + in the field)
Nvidia: 1.02% failures in the field
AMD: 3.25% failures in the field

So yes, AMD does have a higher failure rate, but nothing that puts up such a big red flag that I would want to drop their product.”

http://www.pcworld.com/article/2052184/whats-behind-origin-pcs-decision-to-so-publicly-dump-amd-video-cards-.html
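If it helps, here's a quick sketch (Python, using only the three-year figures quoted above) of how those failure rates compare relative to each other:

```python
# Relative failure rates implied by the quoted Puget Systems data
# (last 3 years; "field" = failures after the unit reached the customer).
rates = {
    "total (testing + field)": (8.89, 5.36),  # AMD %, Nvidia %
    "field only": (3.23, 2.42),
}

for label, (amd, nvidia) in rates.items():
    print(f"{label}: AMD {amd}% vs Nvidia {nvidia}% -> {amd / nvidia:.2f}x")

# total (testing + field): AMD 8.89% vs Nvidia 5.36% -> 1.66x
# field only: AMD 3.23% vs Nvidia 2.42% -> 1.33x
```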
 
I was thinking of getting the R7 260X, but I had a gut feeling that AMD was doing a rebrand. Now I'm glad I went with the GTX 650 Ti BOOST.
 
Same shit Nvidia pulled! The only new cards will be WAY TOO MUCH FREAKING MONEY! Old cards rebranded.

I'll stick with my water-cooled SLI GTX 680s @ 1301 MHz core / 6908 MHz RAM till the Maxwell cores release, I guess. This AMD release isn't going to spawn anything worth giving a ____ about from Nvidia or AMD... QQ
 
My HD7870 remains sufficient to max all of my games at 1920x1080 (I don't currently play any shooters). When it isn't, I'll just install the HD7970 from my backup rig that was BTC mining until the ASICs took over.
If AMD or nVidia wants me to uncork my wallet, they'll need to produce a low-profile card that doesn't need PCIe power and is at least as powerful as an HD 7770 or GTX 650 Ti.
 


No, AMD leaves more room for the custom card makers, and their stock coolers are generally worse than Nvidia's. Nvidia doesn't leave that much room for the custom card makers. That's the reason AMD cards can be overclocked higher, along with the lack of a voltage lock.
 
Lol, what do you mean those 7870s won't last much longer? Geez, it hasn't even been six months since I bought a Sapphire 7850 OC.
 
Sorry to nitpick here, but the Radeon HD 7870 GHz Edition is a Pitcairn part, not a Tahiti part. I think you meant to say Radeon HD 7870 XT, because that card has the Tahiti LE GPU. It has a stock clock of 925 MHz and is more powerful than the HD 7870 GHz Edition. I have two of them in CrossFire; they're awesome. 😀
 