News AMD stock reaches 52-week high - driven by AI demand and analyst optimism

Status
Not open for further replies.

bit_user

Titan
Ambassador
If there's one thing working in AMD's favor, it's that they've hopefully learned just how hard it is to leap-frog Nvidia. Ideally, this will have them doing some really outside-the-box thinking about their AI product plans, rather than continuing down the path of evolutionary improvements to current designs. I expect hybrid CPU/GPU processors (e.g. the MI300) are just the start.
 
  • Like
Reactions: valthuer

spongiemaster

Admirable
The problem for AMD has always been the software side. With Nvidia's ecosystem so dominant in the industry, it's probably too late for AMD to make any serious headway. It's going to take the rest of the industry moving to an open standard for AMD to have any chance in the market.
 

bit_user

Titan
Ambassador
The problem for AMD has always been the software side. With Nvidia's ecosystem so dominant in the industry, it's probably too late for AMD to make any serious headway.
They have a CUDA clone called HIP + tools for porting CUDA code to HIP. They've used these to add support for their GPUs to popular deep learning frameworks and other software, but it does have to be maintained as a separate backend. It does also mean that AMD will always have the perception (if not also the reality) of being a cheaper imitation.

AMD would point out that their HIP API also supports Nvidia hardware, letting you convert your CUDA code to HIP and then maintain only that version. However, I'm not sure how many would trust AMD's support for Nvidia hardware to be at the same level as CUDA's native support for it.
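
Just to make that concrete, here's roughly what a HIP port ends up looking like. This is only a rough, untested sketch of a trivial kernel, but the same source is meant to build with hipcc against ROCm on AMD or against the CUDA backend on Nvidia. It's basically CUDA with the cuda* runtime calls renamed to hip*, which is why the hipify tools can do most of a port mechanically.

Code:
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Element-wise add; the kernel body is identical to the CUDA version.
__global__ void vec_add(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    // Same structure as cudaMalloc/cudaMemcpy, just with the hip* names.
    float *da, *db, *dc;
    hipMalloc(&da, n * sizeof(float));
    hipMalloc(&db, n * sizeof(float));
    hipMalloc(&dc, n * sizeof(float));
    hipMemcpy(da, a.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, b.data(), n * sizeof(float), hipMemcpyHostToDevice);

    hipLaunchKernelGGL(vec_add, dim3((n + 255) / 256), dim3(256), 0, 0,
                       da, db, dc, n);

    hipMemcpy(c.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    hipFree(da);
    hipFree(db);
    hipFree(dc);

    std::printf("c[0] = %f\n", c[0]);  // expect 3.0
    return 0;
}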

It's going to take the rest of the industry moving to an open standard for AMD to have any chance in the market.
We have open standards: OpenCL and SYCL. Some people like to bring up Vulkan Compute, but that's not comparable to OpenCL or CUDA. Intel is the main one still pushing OpenCL and SYCL, but even they have oneAPI, at a higher level.
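
For comparison, SYCL kernels are just modern C++, with no vendor-specific kernel source at all. Here's a rough sketch from memory of the same kind of vector add, using the SYCL 2020 API as implemented by compilers like Intel's DPC++ or AdaptiveCpp (untested, so treat it as illustrative only):

Code:
#include <sycl/sycl.hpp>
#include <cstdio>
#include <vector>

int main()
{
    const size_t n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    // The runtime picks whatever device is available (GPU, CPU, ...).
    sycl::queue q{sycl::default_selector_v};

    {
        sycl::buffer<float> ab(a.data(), sycl::range<1>(n));
        sycl::buffer<float> bb(b.data(), sycl::range<1>(n));
        sycl::buffer<float> cb(c.data(), sycl::range<1>(n));

        q.submit([&](sycl::handler& h) {
            sycl::accessor A(ab, h, sycl::read_only);
            sycl::accessor B(bb, h, sycl::read_only);
            sycl::accessor C(cb, h, sycl::write_only, sycl::no_init);
            h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];
            });
        });
    }   // buffers go out of scope here, which copies results back to the host vectors

    std::printf("c[0] = %f\n", c[0]);  // expect 3.0
    return 0;
}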

IMO, the main reason OpenCL didn't dominate is that no big platform provider forced the issue. It's still out there, and if support for it were ever ubiquitous across all hardware, it could enjoy something of a resurgence. Not sure it'll ever displace CUDA, at this point.
 

Jimbojan

Distinguished
I am sorry, but AMD will never gain any significant share in AI, as most data center companies are making their own chips. Also, Intel is building its next-generation servers and accelerators at scale, and they are more power efficient than AMD's 5nm chips. Intel is moving to Intel 4 and Intel 3; data centers will not use AMD chips anymore, mainly because Intel has more power-efficient chips on Intel 3 and Intel 4, and later on 18A. Both AMD and TSMC will fall behind in 2024. I can't see AMD being in any reasonable position from here. All those analysts are jokers; they are doing it for their pump-and-dump, to make a profit off of you.
 

spongiemaster

Admirable
Nothing you said changes what I said. I am aware OpenCL exists. It's not going to be a serious threat to Nvidia unless the rest of the industry gets behind it, and there is no indication that is happening any time soon. AMD supporting OpenCL is not going to move the industry.
 

bit_user

Titan
Ambassador
I am sorry, but AMD will never gain any significant share in AI, as most data center companies are making their own chips.
And yet they still buy Nvidia GPUs! That tells me the market is still there, if you can build something good enough.

Also, Intel is building its next-generation servers and accelerators at scale, and they are more power efficient than AMD's 5nm chips. Intel is moving to Intel 4 and Intel 3; data centers will not use AMD chips anymore, mainly because Intel has more power-efficient chips on Intel 3 and Intel 4, and later on 18A.
Ooh, someone is counting their chickens before they've hatched!

Too bad Meteor Lake (made on Intel 4) seems to be so underwhelming.

Both AMD and TSMC will fall behind in 2024.
It's only Intel's 20A node that supposedly gains a lead, and I think we won't see a big volume ramp of those products until 2025.

I can't see AMD being in any reasonable position from here. All those analysts are jokers; they are doing it for their pump-and-dump, to make a profit off of you.
Considering your 100% stalwart track record of pumping Intel, that sounds like the pot calling the kettle black.
 
  • Like
Reactions: Neilbob

bit_user

Titan
Ambassador
Nothing you said changes what I said. I am aware OpenCL exists. It's not going to be a serious threat to Nvidia unless the rest of the industry gets behind it, and there is no indication that is happening any time soon.
What you're missing is that most of the industry doesn't write CUDA code. Their stuff sits a layer above it, by using frameworks like TensorFlow, PyTorch, etc. They (mostly) don't care what's underneath.
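
For example, a typical training or inference program is written entirely against the framework's API and never mentions CUDA or HIP. Here's a rough, untested sketch using PyTorch's C++ frontend (libtorch), which mirrors the Python API; as far as I know, the ROCm builds expose AMD GPUs through the same device type, so code like this doesn't change between vendors.

Code:
#include <torch/torch.h>
#include <iostream>

int main()
{
    // Ask for whatever accelerator the installed build exposes; ROCm builds
    // of PyTorch report their GPUs through the same "CUDA" device type, so
    // the application code stays backend-agnostic.
    torch::Device device(torch::cuda::is_available() ? torch::kCUDA : torch::kCPU);

    torch::nn::Linear model(128, 10);
    model->to(device);

    auto x = torch::randn({32, 128}, device);
    auto y = model->forward(x);

    std::cout << "output shape: " << y.sizes() << " on " << device << "\n";
    return 0;
}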

AMD supporting OpenCL is not going to move the industry.
Maybe you need to reread what I wrote, because I didn't say anything about that. AMD backed away from OpenCL a long time ago.
 

Neilbob

Distinguished
Every few months you post this exact message, or some variation of it, completely disregarding articles and tests from multiple sources that suggest otherwise when it comes to power efficiency in particular. That's the case for now, but you somehow possess the ability to see into the future in order to make assessments about forthcoming products.

You also seem to be intent on suggesting that Intel will make advancements but AMD (and TSMC) will remain stationary, which certainly hasn't been the case for several years.

Also, you seem determined that AMD are going to fall and/or become irrelevant. We've already seen what happens when this weird event you seem to be hoping for actually comes to pass: products stagnate and prices rise, and consumers don't win.

As for Intel's process developments, they often look to me like 33.3% improvement and 66.6% marketing.
 
  • Like
Reactions: bit_user

spongiemaster

Admirable
What you're missing is that most of the industry doesn't write CUDA code. Their stuff sits a layer above it, by using frameworks like TensorFlow, PyTorch, etc. They (mostly) don't care what's underneath.


Maybe you need to reread what I wrote, because I didn't say anything about that. AMD backed away from OpenCL a long time ago.
My original point was that AMD doesn't have the software to make any headway against Nvidia in AI. You brought up OpenCL as some sort of counter, so whether or not you actually said it, OpenCL is not the answer to AMD's problems.
 

waltc3

Honorable
Good advice, Mark. "Past profitability is no guarantor of future success." Just ask Intel. Not picking on Intel, but it's true. Investors are often a volatile lot--just watch what happens when the AI craze begins to wear thin and investors realize that AI has nothing to do with thinking computers, and when they discover that for every job AI supposedly replaces, the company must hire two employees to constantly edit its output for much-needed corrections and copyright violations! My advice: pick the stocks you like, buy them, and when one doubles, sell half of it; when it doubles again, sell half again; rinse, repeat, and do not put all your eggs into one basket! And never invest more than you can afford to lose at any one time. There is no free lunch. Greed kills, etc.
 

bit_user

Titan
Ambassador
My original point was that AMD doesn't have the software to make any headway against Nvidia in AI.
In spite of my lack of enthusiasm for HIP, it did enable AMD to quickly add support for a lot of packages people normally use Nvidia to accelerate. I think if AMD could get the major Linux distros to ship with ROCm built-in (and improve the user experience of installing it, for those which don't) and provide robust ROCm support for their consumer GPUs and iGPUs, they could have a viable alternative to Nvidia (assuming it's priced accordingly).
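
Even a basic "does ROCm actually see my GPU?" check ought to be something anyone can compile and run. Something along these lines, an untested sketch against the HIP runtime API:

Code:
#include <hip/hip_runtime.h>
#include <cstdio>

int main()
{
    int count = 0;
    hipError_t err = hipGetDeviceCount(&count);
    if (err != hipSuccess || count == 0) {
        std::printf("No HIP-capable devices found (%s)\n", hipGetErrorString(err));
        return 1;
    }

    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop;
        hipGetDeviceProperties(&prop, i);
        // gcnArchName (e.g. "gfx1030") is what decides whether the ROCm
        // libraries actually support a given consumer card or iGPU.
        std::printf("device %d: %s (%s), %.1f GiB\n",
                    i, prop.name, prop.gcnArchName,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}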
 