News: AMD takes $800M haircut as US gov't cuts off China's AI GPU supply

Maybe this will be good for gamers if these start ending up as GPUs?
Not if they don't have any shaders or graphics outputs.

However, I don't see why other businesses can't use these for AI. They may be hobbled for the Chinese market but they are still powerful.

What about the $5.5B haircut taken by NVidia on H20 cards? Makes AMD's $800M look relatively small.
Here it is:
https://www.tomshardware.com/tech-i...as-us-govt-chokes-off-supply-of-h20s-to-china

In proportion to AI accelerator market share / market cap, maybe it's worse for AMD??
 
In proportion to AI accelerator market share / market cap, maybe it's worse for AMD??
Much worse. Nvidia can absorb it; AMD, less so. However, that does mean they should still be able to sell them to others. It might help some fledgling AI companies jump the queue and get accelerators faster.
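Purely as a back-of-the-envelope check on the "much worse for AMD" point: the write-down figures come from this thread, but the market caps below are rough ballpark values and only illustrative, so treat them as assumptions rather than exact numbers.

```python
# Back-of-the-envelope: each write-down as a share of company value.
# The $5.5B and $800M charges are from this thread; the market caps are
# rough ballpark values and purely illustrative.
writedowns = {"Nvidia": 5.5e9, "AMD": 0.8e9}
market_caps = {"Nvidia": 2.7e12, "AMD": 160e9}   # ~$2.7T and ~$160B, approximate

for company, hit in writedowns.items():
    print(f"{company}: {hit / market_caps[company]:.2%} of market cap")
# Roughly 0.2% for Nvidia vs ~0.5% for AMD -- a proportionally bigger hit.
```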

As to using them for gaming, it could be done. Nothing says someone won't make an AI-dependent game, and having an AI accelerator card may become a thing in the future. Now, I would expect this to get baked into gaming GPUs by the time that happens, but still.

I think an RPG with LLM driven NPCs would make for an interesting experience.
 
I think an RPG with LLM driven NPCs would make for an interesting experience.
We'll definitely see open-world RPGs/games using LLMs and other AI/ML methods for NPC interactivity, and maybe complex user-driven storylines (although, if done wrong, it will become buggier than you could possibly imagine). We will also see real-time voice synthesis, which can help solve the problem of games needing gigabytes to store a relatively small amount of voice-acted canned responses. There are already companies that let you license voice actors for synthesis.
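For the curious, here is a minimal sketch of what LLM-driven NPC dialogue could look like: persona plus game state go into a prompt, the model's reply comes back and is remembered. The `generate()` function is a hypothetical stub standing in for whatever local or remote inference backend a game would actually use, not a real API.

```python
# Minimal sketch of LLM-driven NPC dialogue: build a prompt from the NPC's
# persona plus the current game state, hand it to a text generator, and
# remember the exchange. `generate()` is a placeholder stub standing in for
# whatever local or remote inference backend a game would actually ship with.
from dataclasses import dataclass, field


def generate(prompt: str) -> str:
    # Hypothetical stub so the sketch runs without any model installed;
    # a real game would call into an on-device LLM or a service here.
    return "Hmm... the roads north are crawling with bandits, traveler."


@dataclass
class NPC:
    name: str
    persona: str                                      # short character brief fed to the model
    memory: list[str] = field(default_factory=list)   # running conversation log

    def talk(self, player_line: str, game_state: dict) -> str:
        prompt = (
            f"You are {self.name}. {self.persona}\n"
            f"World state: {game_state}\n"
            f"Recent conversation: {' | '.join(self.memory[-6:])}\n"
            f"Player says: {player_line}\n"
            f"{self.name} replies:"
        )
        reply = generate(prompt)
        self.memory += [f"Player: {player_line}", f"{self.name}: {reply}"]
        return reply


blacksmith = NPC("Brynja", "A gruff blacksmith who trades gossip for coin.")
print(blacksmith.talk("Any news from the north?", {"quest": "missing caravan"}))
```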

I think it comes down to what the next-gen consoles ship with, since PC gaming follows wherever they go. I can't see PS6 or the Xbox 720 shipping with more than 32 GB of memory, even though the sky's the limit for some of this stuff. We may see integrated NPUs in the 300-1000 TOPS range, or they could simply make the GPU larger so developers can choose.

I doubt PC gamers will ever need to add PCIe/M.2 AI accelerators. AMD claims the 9070 XT is good for up to 779 INT8 TOPS with sparsity, or 1557 INT4 TOPS with sparsity. What effect that has on using it for graphics at the same time, I do not know. But it may be another 5+ years before games require a form of AI acceleration matching next-gen consoles, and more powerful GPUs will be available by then.
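For what it's worth, those two quoted figures are consistent with the usual rules of thumb that halving precision (INT8 to INT4) doubles throughput and structured sparsity doubles it again; a quick sanity check (the "implied dense" number is my extrapolation, not an AMD figure):

```python
# Quick sanity check on the quoted RX 9070 XT numbers, assuming the usual
# rules of thumb: halving precision (INT8 -> INT4) doubles throughput, and
# structured sparsity doubles it again versus dense.
int8_sparse_tops = 779     # AMD's quoted figure, INT8 with sparsity
int4_sparse_tops = 1557    # AMD's quoted figure, INT4 with sparsity

print(f"INT4/INT8 ratio: {int4_sparse_tops / int8_sparse_tops:.2f} (expect ~2.0)")

# Extrapolating backwards (my assumption, not an AMD figure): the dense INT8
# rate would be roughly half the sparse one.
print(f"Implied dense INT8: ~{int8_sparse_tops / 2:.0f} TOPS")
```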
 
"The U.S. government has been taking steps to limit Beijing’s access to America’s most advanced chips to ensure the U.S. will retain its edge in artificial intelligence and prevent its East Asian rival from outpacing it."

In music production, film editing, or photography, it has long been known that limited tools may actually boost creativity and output. For example, a musician who has installed more than 500 VST plugins in his DAW will never get to know and master each individual virtual instrument as well as a similarly talented artist who works with only 20-30 VSTs. Excellent and skilled photographers often use relatively old cameras and lenses (5-10 years old) which are technically outdated, but which they have mastered so well that the camera never gets in the way of the creative process.
Something similar may be expected with CPU/GPU hardware: consoles such as the PlayStation, the Nintendo Switch, or the Xbox are much more constrained than contemporary high-end PCs. However, the programmers are forced to optimize their code for exactly one (albeit limited) configuration, which actually leads to perfectly playable and great-looking games on consoles.
And with DeepSeek we have seen that restrictions on computing hardware seem to stimulate lots of workarounds and code and workflow optimizations that ultimately deliver surprisingly competitive results.
 
Something similar may be expected with CPU/GPU hardware: consoles such as the PlayStation, the Nintendo Switch, or the Xbox are much more constrained than contemporary high-end PCs. However, the programmers are forced to optimize their code for exactly one (albeit limited) configuration, which actually leads to perfectly playable and great-looking games on consoles.
LOL!
No!
All console games are made on a handful of game engines that generate the same crap code for consoles as for PCs. It's the same engine producing the code; they just tick the PC box instead of the console box and recompile it.
It's also all the same hardware except for Nintendo; everybody is on x86 and a normal GPU.
A lot of games run very badly on consoles and also on PC.
The PC just has a lot more options for extra filters and more crap on top (ray tracing and so on).

For more than a decade now (PS4 and Xbone), every PC game has not been a port of a console game; it has been a console game with different settings applied.
 
AMD and Nvidia should sell these products in the West, then. There is essentially unlimited demand for AI chips everywhere; there is no need to landfill these things.