News Lisa Su announces AMD is on the path to a 100x power efficiency improvement by 2027 — CEO outlines AMD’s advances during keynote at imec’s ITF Worl...

This feels like a 'what have you done for me lately' problem. Cool, AMD, that 100x figure sounds super awesome on paper, but nobody in the know really believes it. We know the total power draw will end up exactly the same, and that this is just a kind of greenwashing. Besides, everyone will likely see a large increase in efficiency in this arena, since it's the big push right now; it always turns into a 'same power, more output' pipeline.

This just feels a little desperately timed, or something. AMD needs to actually compete with Nvidia head to head at some point instead of always being off by a bit, with some caveat. I would love a real Nvidia competitor.
 
If you get away from the idea that you need ray tracing in games, AMD provides alternative products to Nvidia.
If you then get away from the idea that you need an Intel CPU for gaming, they also provide alternatives to Intel's products.

I have no doubt that people will once again buy RTX 5000 GPUs just because it's in their heads that they need to upgrade. Nvidia is very good at marketing its products and making consumers believe they need its technology. That is the only area where AMD is really lagging compared to Intel and Nvidia.

In a way, it's instilled in many consumers' heads that only Intel and Nvidia make good gaming products.
 
This feels like a 'what have you done for me lately' problem. Cool, AMD, that 100x figure sounds super awesome on paper, but nobody in the know really believes it. We know the total power draw will end up exactly the same, and that this is just a kind of greenwashing. Besides, everyone will likely see a large increase in efficiency in this arena, since it's the big push right now; it always turns into a 'same power, more output' pipeline.

This just feels a little desperately timed, or something. AMD needs to actually compete with Nvidia head to head at some point instead of always being off by a bit, with some caveat. I would love a real Nvidia competitor.
What are you rambling about? AMD is surpassing Nvidia's market share this year. They are competing with Nvidia, and winning.
 
If you get away from the idea that you need ray tracing in games, AMD provides alternative products to Nvidia.
If you then get away from the idea that you need an Intel CPU for gaming, they also provide alternatives to Intel's products.

I have no doubt that people will once again buy RTX 5000 GPUs just because it's in their heads that they need to upgrade. Nvidia is very good at marketing its products and making consumers believe they need its technology. That is the only area where AMD is really lagging compared to Intel and Nvidia.

Is RT really something we don't need, or is AMD simply not interested in doing it properly? Because the way I see it, Nvidia has GPUs that deliver higher RT performance while using less power than AMD's.

Nvidia's GPUs are certainly overpriced right now at every tier, but AMD's offer only slightly better price/performance for rasterization, use more power, and lack features like proper tensor/matrix cores and more robust RT hardware.

Intel went after Nvidia with rasterization, RT, and matrix hardware — the trifecta. AMD keeps saying, "you don't really need RT or matrix hardware for games..." At some point that will be flatly wrong, and there will be even more AI and RT workloads that just don't run as well on AMD. RDNA 4 had better reprioritize RT and AI if AMD wants to stay in the (GPU) game.

In a way, it's instilled in many consumers' heads that only Intel and Nvidia make good gaming products.

If you're consistently better than your competitors for more than two decades, then people start to believe you're always better, no matter what.

It's called perception.

In order for that perception to change, AMD have to try to do the one thing they persistently avoid doing: create a better flagship card than Nvidia.

It's gonna be costly, no doubt about that, and it's gonna be bought by a small number of consumers. But it's not about the sales: it's about sending a message that you can still maintain a dynamic presence in the GPU market.

With a ~20% market share, AMD are getting closer and closer to being reduced to a non-factor in the GPU realm.

And that's nothing but bad news for every consumer out there.
 
AMD, QC, STMicro, and Broadcom know the missing piece in Nvidia's "moat" is power. Nvidia hardware is very power hungry and inefficient, mainly in size, consumption, and thermals. Even Tegras are home heaters.

Intel knows it, but can't execute outside its legacy architectures (Core CPUs). Now, with all the Snapdragon hype around MS laptops, AMD is really going to compete on two fronts: datacenter efficiency and edge AI, so the competition is Nvidia, QC... and Apple. Didn't Nvidia announce that ARM CPUs are incoming (likely for edge AI)? Thus efficiency is a top selling point and breaks the Nvidia moat... for now.

Intel? They are all over the place, and with Altera spun out, they're heading toward just competing with TSMC if their management doesn't get its act together with a real vision. But maybe that is their vision, since TSMC's business is with all of the above?
 
This just feels a little desperately timed, or something. AMD needs to actually compete with Nvidia head to head at some point instead of always being off by a bit, with some caveat. I would love a real Nvidia competitor.

Couldn't agree more.

AMD hardware isn't bad, but I can't immediately point to one area where it's leading from the front in the GPU realm right now. Which, of course, is hard to do when you're ~20% of the market.

For Nvidia, I can point to ray tracing, AI, and upscaling as three concrete examples of technology and techniques that it established, and not surprisingly it continues to lead in those areas.

In other words, if AMD wants to get ahead of Nvidia, it actually has to get ahead of Nvidia. It can't follow and thereby become the leader. Drafting (like we see in cycling or running or other sports) doesn't work in the business world. AMD needs to figure out a way to leapfrog Nvidia. And that's damn difficult to do, obviously.
 
..
In other words, if AMD wants to get ahead of Nvidia, it actually has to get ahead of Nvidia. It can't follow and thereby become the leader. Drafting (like we see in cycling or running or other sports) doesn't work in the business world. AMD needs to figure out a way to leapfrog Nvidia. And that's damn difficult to do, obviously.
No kidding it's difficult. A bunch of you folks are saying "AMD needs to compete with Nvidia," but when the other side has the all-star team, that's tough to disrupt. I'm going to use the Bulls' Jordan-era dynasty as an example: teams DID compete with them. Sorry, but all the lousy talk of AMD not competing doesn't fly with me. If the measuring stick for competing is only when two parties are almost exactly toe-to-toe and jab-for-jab, that's lovely and all, but when do you see that happening in duopoly and small-oligopoly scenarios? Almost every imaginable market in the world has clear market leaders. If it were easy to do, there'd be more competitors, but there aren't. Look at Matrox, VIA, and others. Intel has barely survived Alchemist, to the point that many don't believe Battlemage will arrive.

No one needs to feel bad for AMD; that's not what I'm trying to do here. They've come a LONG way since Ryzen launched, and their trajectory is still impressive. They benefit greatly from this AI boom just as Nvidia does. AMD's marketing has long been weak -- I think this is where we all agree.
 
People, what is up with the AMD hate?

A 30x or maybe even 100x power-efficiency improvement is amazing.

So why suggest that a big increase in efficiency will mean nothing because future chips will still use the same power? That is just stupid talk. If new chips are many times more efficient but still use the same power, then those chips will do many times more work, and that is not nothing.
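To put the simple version in numbers (a rough sketch; the figures below are purely illustrative, not anything AMD has published):

```python
# Toy illustration only: a fixed power budget with 100x better
# performance-per-watt does 100x the work, which is hardly "nothing".
rack_power_watts = 10_000          # hypothetical fixed power budget
work_per_watt_today = 1.0          # normalized units of work per watt today
work_per_watt_future = 100.0       # the claimed 100x efficiency gain

print(rack_power_watts * work_per_watt_today)   # 10,000 units of work today
print(rack_power_watts * work_per_watt_future)  # 1,000,000 units at the same power
```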

And as for AMD GPUs vs. Nvidia, the fact of the matter is that apart from the top models from either team, there simply isn't a reason to care about RT, since neither company's mid-range or lower products come with usable RT. That many people still overpay for mid-range Nvidia offerings, maybe even getting only just-acceptable amounts of VRAM, speaks to the power of marketing and not the tech they buy. Instead, they should buy AMD.
 
Isn't this 100x stuff ignoring basic physics? You can draw all the straight lines you like on a log scale, but that completely ignores the fact that progress in real-world silicon is already hitting diminishing returns. I might just believe an upside of 10x, but 100x? That seems like fairytale stuff without a wholesale shift to optoelectronics.
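For what it's worth, here's the year-over-year rate a 100x target implies, assuming the baseline year is 2020 as in AMD's earlier 30x25 program (that baseline is my assumption, not something stated in the article excerpt above):

```python
# Implied compound annual efficiency gain; the 2020 baseline is assumed.
def per_year(total_gain: float, years: int) -> float:
    return total_gain ** (1 / years)

print(f"10x by 2027:  ~{per_year(10, 2027 - 2020):.2f}x per year")   # ~1.39x
print(f"100x by 2027: ~{per_year(100, 2027 - 2020):.2f}x per year")  # ~1.93x
```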
 
What are you rambling about? AMD is surpassing Nvidia's market share this year. They are competing with Nvidia, and winning.
Got a link for this? Every metric I see shows AMD largely holding steady at a small share: roughly 20% in desktop and about 10% in server GPUs. The rest is predominantly Nvidia.
 
According to the slide, AMD's 286 had 16 megabytes of cache RAM. Cache, incidentally, that would have filled the entire 16 MB address space of its 24-bit address bus. Neat.

IIRC the 286 had neither L1 nor L2 cache. Atari strapped 16 KILOBYTES of L2 cache to its 16MHz 68000s, that I do know.
 
Isn't this 100x stuff ignoring basic physics? You can draw all the straight lines you like on a log scale, but that completely ignores the fact that progress in real-world silicon is already hitting diminishing returns. I might just believe an upside of 10x, but 100x? That seems like fairytale stuff without a wholesale shift to optoelectronics.
So far AMD has over-delivered on what they promised with their Ryzen generations, so I doubt they would be talking BS.
And in contrast we have Intel, which has been selling essentially overclocked new "generations" of CPUs for a while; with the latest, it turns out that running them the way Intel said was okay makes them burn out.

I know which company I have more faith in, that's for sure. But still, let's hope Qualcomm brings something interesting and that Microsoft doesn't lose interest in backing the ARM platform.
 
According to the slide, AMD's 286 had 16 megabytes of cache RAM. Cache, incidentally, that would have filled the entire 16 MB address space of its 24-bit address bus. Neat.

Yeah, I looked at that and was trying to imagine how awesome a chip would have been back then if it actually had 16 megabytes of cache RAM. Maybe it was AMD's first attempt at a Ryzen: a 286X3D, 1-core chip?!

Get that extra performance on those old games like Prince of Persia. 😉
 
What are you rambling about? AMD is surpassing Nvidia's market share this year. They are competing with Nvidia, and winning.
Uh, what? In what market? It sure isn't add-in graphics cards, or AI, AI being the most profitable market. I think AMD is making great headway into the datacenter market, but only on the CPU side. Nvidia still stomps them in AIB GPUs and in overall datacenter share, because GPUs are such a big part of data crunching now, especially in AI. They are pushing Intel in several markets, but not Nvidia. Where did you get this idea?
For example, per wwctech's AIB GPU market figures, Nvidia holds 80%, AMD 19%, and Intel 1% as of the fourth quarter of 2023.

AMD is making gains in that sector, but not in AI. Their gains are primarily coming from hitting Intel in the server CPU market, which Nvidia is just starting to poke its head into.

AMD isn't useless, but they are not really an Nvidia competitor yet on anything but price.
 
AMD GPUs are just fine and dandy. I have a couple, as well as a couple of Nvidia cards, including a 4090 and a 3090. In non-RT games it's really neck and neck; in fact, with last-gen cards AMD usually tops it, but once you add the Nvidia-developed RT then yes, AMD loses out. Now, I have to be honest and say I really struggle to see where RT actually makes much of a difference in games. If I really look hard, yes, I can tell, but when playing I don't have time, to be honest.
I paid £1,750 for my 3090 and just over £1,100 for the 6900, and with similar prices for the current gen I really cannot, in all honesty, justify the extra any more. Be honest: in online games we turn down the eye candy because it can give you the edge in a frag. And the 4090 worries me; I'm always looking for hot cables!
A 7800X3D at 105 W plus my 7900 XTX will do me, over my bonfire of a 4090 and a potential Intel room heater at 300 W. Haven't you heard? Energy is frickin' expensive these days!
 
Yeah, I looked at that and was trying to imagine how awesome a chip would have been back then if it actually had 16 megabytes of cache RAM. Maybe it was AMD's first attempt at a Ryzen: a 286X3D, 1-core chip?!

Get that extra performance on those old games like Prince of Persia. 😉
LOL. I reckon the CPU must have been an engineering sample, then! I believe games were particularly I/O bound back in the day so RAM disks were all the rage if you did not have a HDD. Monkey Island would use superfluous RAM as cache, but many others just thrashed the drive hard.

Good luck configuring HIMEM.SYS, though.
 
Su pointed to 3nm Gate All Around (GAA) transistors as AMD’s next step on the silicon roadmap
AMD tends to use TSMC, so that would be "2nm" for GAAFETs.

Isn't this 100x stuff ignoring basic physics? You can draw all the straight lines you like on a log scale, but that completely ignores the fact that progress in real-world silicon is already hitting diminishing returns. I might just believe an upside of 10x, but 100x? That seems like fairytale stuff without a wholesale shift to optoelectronics.
I remember AnandTech did an article about AMD's 25x20 claims back in 2020:
https://www.anandtech.com/show/1588...oal-renoir-zen2-vega-crosses-the-line-in-2020

The calculation for compute performance was relatively straightforward, showing a 5x gain between Kaveri (2014) and Renoir (2020). The energy-efficiency part was fishier, because it was more a measurement of idle power, applicable to laptops that idle most of the time, but not to performance per watt under load.

So you have to check what these claims are actually measuring, but AMD is definitely improving their CPUs and GPUs/accelerators.

The 100x they are talking about here seems to pertain to "data center compute node power efficiency" and to be an extension of their 30x goal. So it is only about 3.33x between 2025 and 2027, which might be achievable.
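Spelled out, assuming the 30x goal is actually hit on schedule in 2025 and both targets share the same 2020 baseline:

```python
# Remaining improvement needed if 30x lands in 2025 and 100x is the 2027 target.
remaining = 100 / 30             # ≈ 3.33x still to go
per_year = remaining ** (1 / 2)  # spread over the two years 2025 -> 2027
print(f"remaining: {remaining:.2f}x, ~{per_year:.2f}x per year")
```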
 
I'm just going to take a wild guess that the metric will be something along the lines of instructions or ops per watt. That can be achieved in a lot of ways, but those efficiencies will vary based on the workload.

I highly doubt what is being claimed is that a 125-watt processor lineup will become a 1.25-watt one by 2027.
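For concreteness, an ops-per-watt comparison along those lines would look something like this (every number below is invented for illustration, not a real AMD or Nvidia figure):

```python
# Hypothetical accelerators: efficiency = sustained throughput / power.
# The workload you measure with changes these numbers, which is the catch.
parts = {
    "node_2025": {"tops": 500.0, "watts": 400.0},
    "node_2027": {"tops": 1500.0, "watts": 350.0},
}
efficiency = {name: p["tops"] / p["watts"] for name, p in parts.items()}
gain = efficiency["node_2027"] / efficiency["node_2025"]
print(efficiency, f"-> {gain:.2f}x better ops/W")  # ~3.43x for this made-up pair
```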
 
LOL. Yes, she's right, but everyone is doing these things. I've run these calculations periodically over many years. You make the cache 10% bigger and count a 50% increase in power efficiency. Switch from an HDD to an SSD and count a 10x improvement. Switch from a truckload of punched cards to a thumb drive and count a 100x improvement. It's real enough, but a bit misleading when stated without context.
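A toy version of the kind of arithmetic I mean (the factors are invented for illustration, roughly echoing the examples above):

```python
# Stack a few loosely scoped component "wins" and the headline multiplier
# balloons, even though each factor needs its own context to mean anything.
claimed_factors = {
    "slightly bigger cache": 1.5,
    "HDD -> SSD": 10.0,
    "punched cards -> flash": 100.0,
}
headline = 1.0
for change, factor in claimed_factors.items():
    headline *= factor
print(f"headline 'efficiency' gain: {headline:,.0f}x")  # 1,500x on paper
```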
 