News AMD Aims to Increase Efficiency of Its Chips Thirtyfold by 2025

More power efficient = more mining purchases.

Let's face it: AMD got fat on the mining boom. How many of you want to bet that if you extrapolate mining difficulty and profitability out to 2025, it will require GPUs that are 30x more efficient?

AMD and NVIDIA at this point should just create a sub-company dedicated to mining solutions. And when the bottom falls out of the market, they can write it off as a loss.
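As a rough back-of-the-envelope version of that extrapolation (every number below is a made-up assumption for illustration, not real network data), the claim amounts to compounding difficulty growth against a fixed power budget:

```python
# Hypothetical sketch: how much more efficient a GPU would need to be to keep
# the same mining margin if network difficulty keeps compounding.
# All inputs are assumptions for illustration, not measured data; it also
# assumes coin price and electricity cost stay flat.

def required_efficiency_gain(annual_difficulty_growth: float, years: int) -> float:
    """Hashes needed per coin scale with difficulty, so at a fixed power budget
    perf-per-watt must grow by the same factor to keep profit per kWh flat."""
    return (1 + annual_difficulty_growth) ** years

if __name__ == "__main__":
    years = 4  # e.g. 2021 -> 2025
    for growth in (0.5, 1.0, 1.3):  # assumed 50%, 100%, 130% difficulty growth per year
        factor = required_efficiency_gain(growth, years)
        print(f"{growth:.0%}/yr difficulty growth over {years} years "
              f"-> ~{factor:.1f}x more perf/W needed to break even")
```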
 
  • Like
Reactions: artk2219

King_V

Illustrious
Ambassador
And... Nvidia didn't? Intel didn't?

Also, isn't Ethereum supposed to move to proof of stake by early next year?

Bitcoin is easier to mine with ASICs. Chia mostly requires storage.

I'm not sure why you're assuming that cryptocurrency will completely stagnate rather than move forward (or get banned, if I had my preference).
 
  • Like
Reactions: Krotow and artk2219
More power efficient = more mining purchases.

Let's face it: AMD got fat on the mining boom. How many of you want to bet that if you extrapolate mining difficulty and profitability out to 2025, it will require GPUs that are 30x more efficient?

AMD and NVIDIA at this point should just create a sub-company dedicated to mining solutions. And when the bottom falls out of the market, they can write it off as a loss.
Honestly, I think this has nothing to do with mining. I think AMD will end up referring to some specific hardware feature that it doesn't currently support, meaning it has to do the work via software. Then it will implement it in hardware and get a 30X increase in efficiency over the next five years. It will almost certainly be something AI / machine learning related, as that's the big area of growth right now in datacenter stuff.
 

eklipz330

Distinguished
Jul 7, 2008
3,034
19
20,795
More power efficient = more mining purchases.

Let's face it: AMD got fat on the mining boom. How many of you want to bet that if you extrapolate mining difficulty and profitability out to 2025, it will require GPUs that are 30x more efficient?

AMD and NVIDIA at this point should just create a sub-company dedicated to mining solutions. And when the bottom falls out of the market, they can write it off as a loss.
ah yes, the grand conspiracy! they should stop becoming more efficient altogether! they should actually go in the OTHER direction and create LESS efficient products! BRILLIANT!
 
  • Like
Reactions: Krotow
Honestly, I think this has nothing to do with mining. I think AMD will end up referring to some specific hardware feature that it doesn't currently support, meaning it has to do the work via software. Then it will implement it in hardware and get a 30X increase in efficiency over the next five years. It will almost certainly be something AI / machine learning related, as that's the big area of growth right now in datacenter stuff.
Darn, I was hoping for a 10W GPU with the same performance chops as a 6800XT :p

EDIT: Out of curiosity, I wanted to see how well the lower-TDP GPUs can perform these days, and the best I'd bother to investigate is that <=25W GPUs can now roughly match a higher-end Radeon HD 5000 or GTX 500 series GPU.

It only took about ten years, but hey, it's something!
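For a rough sense of that ten-year gap (the TDP figures below are assumed round numbers, e.g. treating the older card as a ~110W part and the newer one as a 25W part with similar performance, not measured data):

```python
# Rough perf-per-watt comparison: an older ~110W card vs. a modern <=25W card
# delivering similar performance. Both TDP values are assumed round numbers.

OLD_CARD_TDP_W = 110   # assumed, roughly a higher-end Radeon HD 5000 class part
NEW_CARD_TDP_W = 25    # assumed, a modern low-power GPU with similar performance
YEARS = 10

perf_per_watt_gain = OLD_CARD_TDP_W / NEW_CARD_TDP_W        # same perf, less power
annual_rate = perf_per_watt_gain ** (1 / YEARS) - 1         # implied yearly improvement

print(f"~{perf_per_watt_gain:.1f}x perf/W over {YEARS} years "
      f"(~{annual_rate:.0%} per year) -- for context against a 30x target.")
```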
 
Last edited:
  • Like
Reactions: King_V

plateLunch

Honorable
Mar 31, 2017
89
29
10,560
Honestly, I think this has nothing to do with mining.
I agree. I believe there was a press release in the past month or two where AMD won a large data center or supercomputer order because their power consumption was much lower than Intel's. The customer had evaluated the total cost of the installation, and power was a deciding factor.

This was the first time I'd seen power mentioned as a deciding factor, so it surprised me. But it's a cost factor, so it's not unexpected. Power consumption is a selling point.
 
  • Like
Reactions: ddcservices
Darn, I was hoping for a 10W GPU with the same performance chops as a 6800XT :p

EDIT: Out of curiosity, I wanted to see how well the lower-TDP GPUs can perform these days, and the best I'd bother to investigate is that <=25W GPUs can now roughly match a higher-end Radeon HD 5000 or GTX 500 series GPU.

It only took about ten years, but hey, it's something!

Hey, the PCIe slot can supply 75W, so why not go for a 6800XT that needs no additional power connectors! That would be 30x more efficient (to my wallet)! :)
 
Honestly, I think this has nothing to do with mining. I think AMD will end up referring to some specific hardware feature that it doesn't currently support, meaning it has to do the work via software. Then it will implement it in hardware and get a 30X increase in efficiency over the next five years. It will almost certainly be something AI / machine learning related, as that's the big area of growth right now in datacenter stuff.

I know you are likely right. AI & ML are huge profit makers.

But I have a lot of cynicism about motivations lately. They're turning their back on the very community that kept them afloat (the value-oriented builder community).
 

mattkiss

Honorable
Sep 22, 2016
48
9
10,535
Honestly, I think this has nothing to do with mining. I think AMD will end up referring to some specific hardware feature that it doesn't currently support, meaning it has to do the work via software. Then it will implement it in hardware and get a 30X increase in efficiency over the next five years. It will almost certainly be something AI / machine learning related, as that's the big area of growth right now in datacenter stuff.

It's not over the next five years. It's from the beginning of 2020 to the beginning of 2025.
 
  • Like
Reactions: TJ Hooker

TheOtherOne

Distinguished
Oct 19, 2013
220
74
18,670
Hey, the PCIe slot can supply 75W, so why not go for a 6800XT that needs no additional power connectors! That would be 30x more efficient (to my wallet)! :)
I'll take the bait ... even though I think it's sarcasm.

No additional power connectors doesn't mean no power consumption. If the PCIe slot can supply enough power, the same power is still being consumed, and it comes from the same source (the wall socket); it just gets to the GPU via a different route, so to speak. It still won't be 30x (or any x) more efficient (to your wallet)! :p
 
I'll take the bait ... even though I think it's sarcasm.

No additional power connectors doesn't mean no power consumption. If the PCIe slot can supply enough power, the same power is still being consumed, and it comes from the same source (the wall socket); it just gets to the GPU via a different route, so to speak. It still won't be 30x (or any x) more efficient (to your wallet)! :p

No, no sarcasm - I would love a GPU so efficient that you could get 6800XT-level performance from the 75W supplied through the slot! It would surely allow for some cheaper pricing.
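To put numbers on that wish (using the 75W PCIe slot limit and an approximate ~300W board power for the 6800 XT):

```python
# How much perf/W improvement a slot-powered (75W) card would need to match a
# ~300W 6800 XT at the same performance. 300W is an approximate board-power figure.

CURRENT_BOARD_POWER_W = 300   # approximate 6800 XT board power
PCIE_SLOT_LIMIT_W = 75        # maximum power a PCIe slot can supply

needed_gain = CURRENT_BOARD_POWER_W / PCIE_SLOT_LIMIT_W
print(f"Same performance within {PCIE_SLOT_LIMIT_W}W needs ~{needed_gain:.0f}x better perf/W")
print("A 30x efficiency jump would clear that bar several times over.")
```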
 

ddcservices

Honorable
Oct 12, 2017
54
26
10,560
More power efficient = more mining purchases.

Let's face it: AMD got fat on the mining boom. How many of you want to bet that if you extrapolate mining difficulty and profitability out to 2025, it will require GPUs that are 30x more efficient?

AMD and NVIDIA at this point should just create a sub-company dedicated to mining solutions. And when the bottom falls out of the market, they can write it off as a loss.
You don't think AMD would have sold everything it makes even without mining? Get real: Zen3 is a great core design, and RDNA 2, even though it isn't quite as fast as NVIDIA at ray tracing, has proven to be a solid graphics architecture when it comes to game performance.

You also don't seem to understand that AMD and NVIDIA make NOTHING extra whether the GPUs they make get used for one purpose or another. These companies make GPUs and sell them to the video card makers, THAT IS IT. If the video card makers sell for more or less, that doesn't make more money for AMD or NVIDIA. It would be like saying a given model of tire brings the tire manufacturer more money when put on a Mercedes than when those same tires go on a Ford. Tires are tires, and GPUs are GPUs. The companies that package these components into a higher-tier product may make more in profit, but the components don't make more money for the component maker.
 

ddcservices

Honorable
Oct 12, 2017
54
26
10,560
I know you are likely right. AI & ML are huge profit makers.

But I have a lot of cynicism about motivations lately. They're turning their back on the very community that kept them afloat (the value-oriented builder community).

Through the AMD FX period, AMD had to be a "value" brand, because at the top end AMD wasn't competitive on performance. The shift to Ryzen has come with some headaches, but many people who complain have forgotten what changed with the Zen3 generation.

Now, as a refresher: Zen, Zen+, and Zen2 used 3 or 4 cores per CCX, two CCXs per CCD, and up to two CCDs for Ryzen. This allowed for 4-core, 6-core, 8-core, 12-core, and 16-core processors. The move to Zen3 eliminated the CCX entirely, and now there are 8 cores per CCD with no CCX. AMD does have 6-core CCDs as well, mostly from one extra core being disabled after another had to be disabled for failing QA.

So, with 6 or 8 cores per CCD, that means 6-core, 8-core, 12-core, and 16-core chips. Note that there aren't any 4-core chips here.

AMD also had another interesting thing happen: very few dies failed QA at the highest speed. So, in the past, you had the 3600, 3600X, 3700, 3700X, 3800X, 3900X, and 3950X. The Zen3 generation not only dropped the 5600 (non-X), it also dropped the 5700 and 5700X... why make three 8-core processors if they all QA to the same speed as the 5800X?

Now, the lack of a 5700 and 5700X means the lower-cost 8-core is GONE, and with it we've seen a lot of people complaining that there are no cheap 8-core chips that can be overclocked to around the same speed as the 5800X; so, no "bargain" chip.

The OEMs suck up most of the APUs, because OEMs hate putting video cards into computers if they can avoid it, so you don't see a LOT of APUs on the market. You complain that you feel AMD abandoned the budget builders without realizing the "why". If AMD is selling out of all the Zen3 chips it can make, and doesn't have excess fab capacity, why should AMD sell lower-priced chips?

So, failed chips... like the 5600 non-X... those go to the OEMs as CPUs, since they can't overclock anyway (the BIOS won't support it). So, again, with a shortage of fab capacity, why should AMD take perfectly good 5800X chips and sell them as a 5700 or 5700X?
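To make that CCD math concrete, here's a tiny sketch (my own illustration of the lineup logic described above, not AMD data) enumerating the core counts you can build from one or two Zen3 CCDs that ship with 6 or 8 working cores each:

```python
# Illustration of the Zen3 SKU math described above: each CCD ships with
# 6 or 8 working cores, and a Ryzen package uses one or two CCDs.
from itertools import combinations_with_replacement

CORES_PER_CCD = (6, 8)   # 8 = fully enabled, 6 = cores fused off after failing QA
MAX_CCDS = 2             # desktop Ryzen uses up to two CCDs

totals = set()
for n_ccds in range(1, MAX_CCDS + 1):
    for combo in combinations_with_replacement(CORES_PER_CCD, n_ccds):
        totals.add(sum(combo))

# -> [6, 8, 12, 14, 16]: no 4-core option, and the 14-core mix isn't sold as a retail SKU
print(sorted(totals))
```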
 

ddcservices

Honorable
Oct 12, 2017
54
26
10,560
I agree. I believe there was a press release in the past month or two where AMD won a large data center or supercomputer order because their power consumption was much lower than Intel's. The customer had evaluated the total cost of the installation, and power was a deciding factor.

This was the first time I'd seen power mentioned as a deciding factor, so it surprised me. But it's a cost factor, so it's not unexpected. Power consumption is a selling point.

The data center is where power draw really matters, but it goes a bit beyond that. Intel is at MOST at 38 cores per CPU at this point for Xeon, so you would need a quad-Xeon to get more cores per server than a dual 64-core EPYC server. The POWER draw is critical: if you can do the same job with fewer physical servers, that saves a lot of physical room in server racks/data centers, plus it saves on power draw. Big companies like Google will go for EPYC over Xeon for that reason; it actually saves money. All large data centers also need to look at overall power draw when it comes to the total cost of running things.
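As a quick sanity check on that core-density point (core counts per the post above; the per-socket TDPs below are assumed ballpark figures, not official specs):

```python
# Rough server-count / CPU-power comparison for a fixed core target, assuming
# dual-socket servers. Core counts are from the post; TDPs are assumed ballpark
# figures for illustration only.
import math

TARGET_CORES = 1024
SOCKETS_PER_SERVER = 2

configs = {
    "EPYC 64-core": {"cores": 64, "tdp_w": 280},   # assumed ~280W per socket
    "Xeon 38-core": {"cores": 38, "tdp_w": 270},   # assumed ~270W per socket
}

for name, c in configs.items():
    cores_per_server = c["cores"] * SOCKETS_PER_SERVER
    servers = math.ceil(TARGET_CORES / cores_per_server)
    cpu_power_kw = servers * SOCKETS_PER_SERVER * c["tdp_w"] / 1000
    print(f"{name}: {servers} servers, ~{cpu_power_kw:.1f} kW just for the CPUs")
```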
 
Through the AMD FX period, AMD had to be a "value" brand, because at the top end AMD wasn't competitive on performance. The shift to Ryzen has come with some headaches, but many people who complain have forgotten what changed with the Zen3 generation.

Now, as a refresher: Zen, Zen+, and Zen2 used 3 or 4 cores per CCX, two CCXs per CCD, and up to two CCDs for Ryzen. This allowed for 4-core, 6-core, 8-core, 12-core, and 16-core processors. The move to Zen3 eliminated the CCX entirely, and now there are 8 cores per CCD with no CCX. AMD does have 6-core CCDs as well, mostly from one extra core being disabled after another had to be disabled for failing QA.

So, with 6 or 8 cores per CCD, that means 6-core, 8-core, 12-core, and 16-core chips. Note that there aren't any 4-core chips here.

AMD also had another interesting thing happen: very few dies failed QA at the highest speed. So, in the past, you had the 3600, 3600X, 3700, 3700X, 3800X, 3900X, and 3950X. The Zen3 generation not only dropped the 5600 (non-X), it also dropped the 5700 and 5700X... why make three 8-core processors if they all QA to the same speed as the 5800X?

Now, the lack of a 5700 and 5700X means the lower-cost 8-core is GONE, and with it we've seen a lot of people complaining that there are no cheap 8-core chips that can be overclocked to around the same speed as the 5800X; so, no "bargain" chip.

The OEMs suck up most of the APUs, because OEMs hate putting video cards into computers if they can avoid it, so you don't see a LOT of APUs on the market. You complain that you feel AMD abandoned the budget builders without realizing the "why". If AMD is selling out of all the Zen3 chips it can make, and doesn't have excess fab capacity, why should AMD sell lower-priced chips?

So, failed chips... like the 5600 non-X... those go to the OEMs as CPUs, since they can't overclock anyway (the BIOS won't support it). So, again, with a shortage of fab capacity, why should AMD take perfectly good 5800X chips and sell them as a 5700 or 5700X?

It's their right to sell for what they want. I never claimed otherwise. Loyalty's a @#$@ though. I'm most likely flipping back over to Intel for my next build.
 

JamesJones44

Reputable
Jan 22, 2021
620
560
5,760
It's their right to sell for what they want. I never claimed otherwise. Loyalty's a @#$@ though. I'm most likely flipping back over to Intel for my next build.

Brand loyalty is a seriously flawed concept anyway. 99.99999% of the companies out there don't give a poop about you, so why people care about them has always baffled me. Buy the best thing at the best price and you'll always be happy with the purchase. Plus, companies get the message far better that way than when you stick around while they're making crap products.
 
  • Like
Reactions: GenericUser