Report: AMD Working on Radeon R9 280 Graphics Card


Bitcoin and Litecoin may be the two best-known cryptocurrencies, but there are dozens of them out there, and there are even kits to help people put their own together. Some of the newer currencies operate on large chunks of data specifically to make RAM size, bandwidth and latency the bottleneck, which eliminates most of the advantage of ASIC/FPGA custom processing and of the GPUs' bandwidth edge: it does not matter how much processing power you can cram into an ASIC if it spends all its time waiting on RAM.
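For anyone curious what "memory-hard" means in practice, here is a minimal Python sketch using the standard library's hashlib.scrypt. The 80-byte header stand-in and the parameter choices are purely illustrative (N=1024, r=1, p=1 is roughly the Litecoin-style setting), but they show how the per-hash scratchpad grows as N is raised, which is exactly what squeezes out the custom-silicon advantage:

```python
# Rough illustration: scrypt's per-hash working set is about 128 * N * r bytes,
# so raising N forces every miner - ASIC, FPGA, GPU or CPU - to provision that
# much fast RAM per hashing unit (or recompute values and lose throughput).
# The header below is a zero-filled placeholder, not real block data.
import hashlib

fake_header = b"\x00" * 80                    # 80-byte block-header stand-in

for n in (1024, 16384, 262144):               # Litecoin-style setting, then larger ones
    scratchpad = 128 * n * 1                  # approximate memory per hash with r = 1
    digest = hashlib.scrypt(fake_header, salt=fake_header,
                            n=n, r=1, p=1, dklen=32,
                            maxmem=scratchpad + 2 * 1024 * 1024)
    print(f"N={n:>7}: ~{scratchpad // 1024:>6} KiB scratchpad, hash={digest.hex()[:16]}...")
```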

As for making multi-currency ASICs, that is only practical within closely related coin families that rely heavily on raw processing power and can easily be scaled up. Once you drift too far away from that (significant architecture changes), you end up wasting tons of die area and timing margin on circuitry that only gets used on one specific currency. Multi-currency might work for RAM-bound currencies where the die size would be dictated by RAM controller power/ground/IO pads, leaving spare room to implement a handful of different algorithms... but the performance benefits of ASIC-mining RAM-bound coins would be slim to none compared to a CPU or GPU with a similar RAM configuration.
 
Mining isn't saving AMD's GPU line. Get over it, that is for ASICs now. Even if you buy now and mine 24/7 for the next 4 months (by which point LTC etc. ASICs that can be used for multiple currencies will be out), you won't get your money back or even half of it, no matter the price. Quit saying this is a good idea or even a situation to ponder. It's not, and as the income statements show (both AMD's and Nvidia's), it isn't happening at all right now, as EVERY card is in stock.
ASICs are still not practical for scrypt mining because of its relatively high memory usage.

And they are in stock because of the prices, which were raised significantly.
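Just to put rough numbers on the "you won't get your money back in 4 months" point above, here is a back-of-the-envelope sketch. Every figure in it (card price, hash rate, power draw, electricity rate, revenue per MH/s per day) is a made-up placeholder rather than a quoted market rate, and the flat revenue figure actually flatters mining, since difficulty keeps rising:

```python
# Back-of-the-envelope mining break-even sketch. All numbers are illustrative
# placeholders - plug in current figures before drawing any real conclusion.
card_price_usd          = 400.0    # inflated retail price of the card
hash_rate_khs           = 700.0    # scrypt hash rate, kH/s
power_draw_w            = 250.0    # wall power while mining
electricity_usd_per_kwh = 0.12     # local electricity rate
revenue_usd_per_mhs_day = 2.50     # network-dependent; drops as difficulty rises
days                    = 120      # "mine 24/7 for the next 4 months"

gross = (hash_rate_khs / 1000.0) * revenue_usd_per_mhs_day * days
power_cost = (power_draw_w / 1000.0) * 24 * days * electricity_usd_per_kwh
net = gross - power_cost

print(f"Gross revenue over {days} days: ${gross:.2f}")
print(f"Electricity cost:               ${power_cost:.2f}")
print(f"Net mined vs. card price:       ${net:.2f} against a ${card_price_usd:.2f} card")
```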
 
Indeed, there are many scrypt coins out there that are still profitable for GPU miners and not affected by ASICs. I find it funny that so many people have read a mainstream article about Bitcoin being dominated by ASICs and think that's the case for all other cryptocurrencies. Fact is, AMD GPUs are still hugely in demand by miners. Dogecoin in particular is one that is still profitable on GPU.
 
So now, instead of creating newer and better cards, they've resigned themselves to rebadging older cards, so the Bitcoin miners can swoop them all up, forcing retailers to increase the prices by 150-300%. Nvidia, you have a new fan.
Any new cards AMD churns out will just be swooped up by miners. What AMD needs, or should be doing, is to somehow figure out how to make separate cards for gamers and miners. This is the only way to normalize the market. But I'm not sure if AMD has the incentive to do that because as long as they get their money, it doesn't matter who it is from.
 

Twice the R&D expense for half the revenue or possibly less per product line... not worth it.

In order to "differentiate" GPUs from mining GPUs, AMD/Nvidia would have to artificially cripple their GPU's GPGPU performance, which would be a very bad thing when applications and games are starting to use GPGPU too.

Unless you missed all of AMD's promotional campaigns for hUMA and HSA over the past year, you should know that AMD is betting the barn on GPGPU's success. If they started crippling GPGPU performance on their GPUs now, they would effectively ruin any chance of their hybrid GPGPU plans being taken seriously ever again, waste a year of promotion, and undermine their chances of raising ASPs with their next-gen chips - something AMD desperately needs right now.
 

NVidia already does...
Which is why miners don't like them
 


I was kind of being facetious when I said that, because I know there's no way in hell AMD would, in a million years, do such a thing. The best PC gamers can hope for is a limit on the quantity each account/person can buy. We should at least make the miners work for it.
 

AFAIK, the only thing that is crippled on Nvidia's mainstream cards compared to Tesla/Quadro is double-precision float, which is of little importance for Bitcoin and scrypt mining since both rely entirely on integer operations (SHA-256 and, for scrypt, Salsa20/8). In double precision, mainstream Nvidia cards score about half their equivalent "pro" models, but for scrypt they score about the same: ~450 kH/s.
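To illustrate why FP64 throughput is beside the point here, a tiny Python sketch of Bitcoin's proof-of-work follows: the whole thing is two passes of SHA-256 over an 80-byte header, i.e. pure integer/bitwise work. The header bytes are zero-filled placeholders, not a real block:

```python
# Bitcoin proof-of-work is just double SHA-256 over an 80-byte block header -
# integer and bitwise operations only, so FP32/FP64 throughput is irrelevant.
# The header here is a zero-filled placeholder, not real chain data.
import hashlib

def btc_pow_hash(header_80: bytes) -> bytes:
    """SHA-256 applied twice, as in Bitcoin's proof-of-work."""
    return hashlib.sha256(hashlib.sha256(header_80).digest()).digest()

base_header = bytes(76)                       # version/prev-hash/merkle/time/bits stand-in

# A miner simply sweeps the 4-byte nonce and compares each hash against the target.
for nonce in range(3):
    header = base_header + nonce.to_bytes(4, "little")
    print(nonce, btc_pow_hash(header)[::-1].hex())
```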

Based on Litecoin's mining hardware comparison wiki, AMD's R9 290X is about twice as fast as Nvidia's Tesla cards and the 780 Ti at scrypt-based mining (900-1000 kH/s), so if GPU mining is truly driving AMD GPU prices right now, that makes Nvidia's GPUs about half as desirable as their nearest AMD counterparts.

I doubt Nvidia will let AMD enjoy higher ASPs from having the GPU-mining monopoly without a fight. If Nvidia does get their mining ducks in a row, prices may come back down a bit as long as there are no component shortages such as GDDR5.
 


Looks like things may change. Look at the 750 Ti review: it has very good single-precision compute performance.
 

Well, if we are talking about mining, single- and double-precision float are of no importance, since all the BTC/LTC hashing is done using integer operations.

That said, Maxwell does look like a much better mining option than anything else Nvidia had out there until now, and at less than half the power per kH/s for scrypt, the power savings might be quite tempting as well.
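For the power-per-kH/s point, a quick comparison along these lines shows how the efficiency gap would look. The hash rates and board powers below are rough ballpark figures of the sort reported in period reviews, used purely for illustration - substitute measured numbers from real cards before relying on them:

```python
# Rough scrypt-mining efficiency comparison (kH/s per watt).
# All figures are ballpark placeholders for illustration only.
cards = {
    "GTX 750 Ti (Maxwell)": {"khs": 260.0, "watts": 60.0},
    "GTX 780 Ti (Kepler)":  {"khs": 450.0, "watts": 250.0},
    "R9 280X (Tahiti)":     {"khs": 700.0, "watts": 250.0},
}

for name, c in sorted(cards.items(), key=lambda kv: kv[1]["khs"] / kv[1]["watts"],
                      reverse=True):
    print(f"{name:22s}: {c['khs'] / c['watts']:5.2f} kH/s per watt")
```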
 
GeForce FX is released. ATi fans laugh at its power usage numbers and loud fan. nVIDIA users praise it.
Radeon HD 2900 XT is released. nVIDIA fans laugh at its power usage numbers. ATi fans praise it.
GeForce GTX 480 comes out. nVIDIA fans praise it. ATi/AMD fans laugh at its power usage numbers.
Etc, etc... I'm not taking you folks seriously anymore. You're like Democrats and Republicans. To be honest, most of you absolutely do not care if a card uses more power if that card is made by your preferred brand and/or performs well.
 


I understand the concept of the rebadge, aside from lacking the Nvidia side of the spectrum, but to me it doesn't give anyone further rights to complain when they upgrade every generation. I know it's a minority that complain about it. It's just annoying, considering you can spend half of the cost of a medium-powered graphics card to purchase a great 3rd-party cooler and overclock the hell out of the card (unless you're like me and purchase a non-reference card that lacks the voltage controller, DOH!).

I'm still rockin' a 5850 and just barely getting to the point where I gotta play on lowest settings, and only on the newest AAA titles of course. The idea that someone needs to upgrade every new release and then complain it's a rebadge just sounds redundant. Have I ever told you the meaning of insanity?
 

I'm complaining about the rebadge and I'm still using an HD 5770, so definitely not upgrading every generation. That old card still gives me 45 FPS in SC2 and 60 FPS in WoW on the 1200p Ultra preset (minus shadows and FSAA), and that's good enough for me to skip all the way until 20-22nm GPUs come out.

The other thing that annoys me with the rebadging, apart from obfuscating which products are genuinely new (e.g., Nvidia just launched Maxwell with the 750 even though the 7xx lineup is already full of Kepler-based chips - rebadging in reverse to avoid launching Maxwell as the 850 and making the 760-780 seem obsolete before their 8xx replacements are lined up), is that it makes older models seem much older (or newer, for recently rebadged models) than they actually are, just because of their model number.
 


Just to clarify, I was mainly generalizing across all the comments speaking negatively about rebadges, not necessarily yours only. It's just something that has been an industry standard for quite a few years at this point, and it's not even limited to the PC component industry, as vehicle manufacturers have been doing the same thing for years. Personally, I don't mind entertaining the thought of buying a rebadge (i.e., the 280/280X) for the additional benefits of updated binning, an updated cooler, support longevity, etc. Theoretically, it should also make the previous generation cheaper after the newer rebadge is released (with or without updated bins).

I know what you're talking about with the new Maxwell 750. AMD did the same thing with the 7790, which used the newer Hawaii-generation architecture even though the lower-end R7 chips use an even older one. It just seems like a live test to see if the new architecture will work in the wild without having to charge an arm and a leg for it to cover expenses if something goes wrong (warranty replacements).

Still, I digress, and come back to my point that it amazes me how well the last several years of graphics processing power have held up compared to the days of old (GT 8800 768, I'm looking at you. And GT 5700 LE, how I miss you...). They'd more than likely run out of money and run into an overstock of unused GPUs, since the existing cards are capable of running games (among other things) for a much longer period.

I'm definitely in the same boat with you regarding 20/22nm cards. We will definitely reap the benefits of holding out so long.
 


Are you referring to the GTX760 by any chance?
 


Who said I was insulted? Did I say you were complaining? I find your comments baseless. It's comical that some people in this forum treat a detailed post as a fight post and then just denounce it with a useless comment instead of any kind of debate or any reason why they disagree with it. Why can't people discuss opinions/topics in a forum without people like you calling it a fight? Perhaps you don't understand exactly what a forum is for: DISCUSSION.

If you're saying I'm making assumptions, tell us how. Instead you give useless posts with nothing to back them up. You called my posts conspiracy theory, grabbing at straws, etc., but with nothing to show for it - just a baseless comment without anything backing it. How am I misrepresenting the facts? How am I wrong? You give nothing. If you're going to simply throw out accusations about every post you don't like, just sit there quietly and don't bother. You're wasting our time. :sarcastic:

http://www.paulgraham.com/disagree.html
Read that. Learn how to make a useful comment backed by something please 😉
https://commons.wikimedia.org/wiki/File:Graham%27s_Hierarchy_of_Disagreement.svg
(a better pic, but doesn't show up here).
Until you can make it into the top 3 in that chart, don't bother posting. At best your posts reach contradiction and that's it. Here's one that's jpg so it shows here:
[image: disagreement-hierarchy.jpg]


https://en.wikipedia.org/wiki/Argument_from_ignorance
Maybe read this while you're at it. Between that and the chart, you should have a better idea of how to form a decent argument.

So I'm full of assumptions, but...
You assume I'm insulted.
You assume I'm looking for a fight, though I'm not sure why. Is it because of a detailed post that outlines my opinion and how I got to it?
"Nopony reads walls of text. Use syntax and paragraphs"...blah blah...Come back when you actually have something meaningful to say.

FACT: Intel has lost 21% of market share in notebooks to ARM. The same will happen to desktops as we hit 64bit 8/16GB etc.
FACT: Intel and ARM are both coming for low end and at some point ARM comes for high-end (what do you think Denver etc is for?).
FACT: Most people just play games, browse, email, watch some YouTube/Netflix and very little else (Twitter/Facebook etc.).
FACT: You don't need Wintel to do any of the stuff mentioned above now and even casual gamers get by on phones/tablets.
FACT: Samsung is coming with 4GB modules for phones just 6 months after 3GB (THIS YEAR). Phones/tablets of 2015 will have 6-8GB.
FACT: Most people don't have more than 8GB in their PC's.
FACT: Intel has ~$52B in revenue while ARM has ~$26B. The only way for either side to grow is to steal market from the other.
FACT: Phone sales growth is slowing, and tablets will hit a wall soon too, which further supports my revenue point above.

Not sure which part of my statements you think are assumptions, but any assumptions I made are based on a lot of facts we all know. I could keep listing facts, but you should get the point. Do I make a few assumptions after looking at the data? Of course - how else do you predict anything? You don't appear to understand what forms an argument (or even what the definition of the word is in this context: an opinion supported by facts/data, which has nothing to do with fighting). Debating someone on a topic isn't fighting.

It's not much of an assumption to see that major apps will be running on Android once Denver (and everything Qcom etc. follow with on 64-bit) comes and anything over 4GB hits these things. Chromebooks already come with 2GB of DDR3L-1600 memory; nothing is stopping them from slapping 16-32GB in a 500-watt box with a GPU. If you don't think this is the goal, you are ignoring the obvious. ARM wants Intel's $52B+ revenue (Qcom, NV, Samsung, etc. - everyone wants this $52B). There is only half that amount on ARM's side; the only way to grow is to go after Wintel. These are not assumptions, they are reality. Intel has the same goal, just from the top down as opposed to the bottom up.

"This is Jen Hsun's goal and he has Valve, Google, Amazon, Apple, Samsung etc helping to ensure linux, opengl, android, chrome etc become decent Wintel replacements."
This isn't an assumption either. It is reality. The second MS mentioned its store, Valve went crazy. The rest are already succeeding to some extent and will further that cause with the A57 etc. as ARM moves up from low-end notebooks, taking more x86 share. Everyone has server/desktop chips coming or on their roadmaps. What do you think A57/64-bit is for? Everybody's roadmaps show these are DESKTOP or SERVER chips. What do you do with more than 4GB? Stuff that is ABOVE mobile crap/Chromebooks. You call it a wall of text; I call it making a decent argument (and no, that doesn't mean fighting... ROFL), backed by supporting evidence.

You respond to tone, contradict, or worse. Attack the substance, not the grammar, not the person. :no:
Learn how to disagree or just sit silent. Reality is, you really didn't even state your case (you rarely do), so saying you made a contradiction is already reaching, but I was being generous 😉 Try giving a well-thought-out argument instead of junk like "your grammar sucks and I hate your walls". Move along; you don't have to read it or respond. If you had any data backing your position, you'd post it instead of complaining (yes, you did whine about my wall) about my wall or grammar or paragraphs, none of which mean anything if the data is good. Are we done?
 


Interesting? No, factually correct. He wasn't arguing your point, he was correcting your data. You seem to have trouble with DATA/FACTS. I digress...
 
FACT: Intel has lost 21% of market share in notebooks to ARM. The same will happen to desktops as we hit 64bit 8/16GB etc.
FACT: We are already at 8/16GB on desktop and x86 still rules
FACT: Most people don't have more than 8GB in their PC's.
I will bet that will change in the next 3 years or less

FACT: The vast majority of desktop software is written for x86, which makes changing architectures beyond stupid.
 

Who knows how much longer this will hold true? If ARM-based tablets and phones continue evolving at their current pace for 2-3 more years, they will be nearly as powerful as current desktops and people who are not ball-and-chained to x86/Windows won't have much of a reason to own an x86-based system.

About half the people I know already do practically all their everyday personal computing on their phone or tablet. The other half is mostly PC gamers.
 

I use my tablet when I am away from the computer. Much easier to carry than a laptop.
I know people that use their phone online a lot too. But when I am at home in the main room, I use a desktop.

 


Those cards wouldn't be in such short supply if it weren't for the miners buying them up by the dozen. And naturally, retailers have to do something about the short supply, so they mark them up higher than they normally would.

When I look at Newegg listing the R9 290X at $900, my blood boils; nobody is more responsible for this than the miners.
 

Lack of worthy competition is still a candidate: Kepler's poor mining performance deserves some of the blame for miners raiding AMD's GPU supply - AMD's GPUs were up to twice as fast as their nearest Nvidia counterparts in some mining benchmarks. If mid/high-end Maxwell's mining performance scales as well as the 750 Ti seems to be hinting, AMD's GPUs are about to get a whole lot less appealing, at least in terms of kH per watt.

Only time will tell whether the Maxwell-based 760/770 successors knock AMD prices down by providing competitive mining alternatives, or whether even AMD's and Nvidia's combined output is still insufficient to meet current mining demand.
 