News Nvidia warns of gaming GPU shortage this quarter, recovery in early 2025 — Chipmaker rakes in record profits as net income soars by 109% YoY

Admin

Administrator
Staff member

TeamRed2024

Upstanding
Aug 12, 2024
191
125
260
Looking forward to next year. The 50 series GPUs and the Ryzen 9 9950X3D.

Someone correct me if I'm wrong... but aren't the X3D chips pretty irrelevant if you are gaming in 4K? I've seen the comment that they are irrelevant at 4K made by many in various threads... and every X3D benchmark video shows 1080p performance.

Asking because I game in 4K, which is all on the GPU... so I would assume the answer is yes? I remember 1080p HD coming out around 2007... but I don't remember the last time I actually used that resolution on my PC.

That being said... gonna keep on trucking with my 9950X and 4090 until the 5090 is readily available at MSRP.
 
  • Like
Reactions: valthuer

valthuer

Prominent
Oct 26, 2023
170
170
760
Someone correct me if I'm wrong... but aren't the X3D chips pretty irrelevant if you are gaming in 4K?

Well, if I’m not much mistaken, even a 4K gamer can benefit from the 9800X3D, in the sense that there’s a great performance improvement in the 0.1% and 1% FPS lows, and that can come in handy when dealing with games that are known to be stutterfests.
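For anyone wondering how those lows are actually measured, here is a minimal sketch, assuming a plain list of frame times in milliseconds. Reviewers differ on the exact method (some average the slowest 1% of frames, others take the 99th-percentile frame time), and the function here is just mine for illustration:

```python
def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    """Return (average FPS, 1% low FPS) from frame times in milliseconds."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    # Take the slowest 1% of frames (at least one frame) and average them.
    worst = sorted(frame_times_ms, reverse=True)
    n_worst = max(1, len(worst) // 100)
    low_ms = sum(worst[:n_worst]) / n_worst
    return 1000.0 / avg_ms, 1000.0 / low_ms

# A mostly smooth ~120 fps run with ten 40 ms stutters mixed in:
times = [8.3] * 990 + [40.0] * 10
avg_fps, one_pct_low = fps_stats(times)
print(f"avg: {avg_fps:.0f} fps, 1% low: {one_pct_low:.0f} fps")
# -> avg: 116 fps, 1% low: 25 fps. The average barely moves; the lows tank.
```

That's the whole point of the metric: a handful of stutters is nearly invisible in the average but dominates the 1% low.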
 
  • Like
Reactions: CelicaGT
Mar 10, 2020
420
384
5,070
Someone correct me if I'm wrong... but aren't the X3D chips pretty irrelevant if you are gaming in 4K? I've seen the comment that they are irrelevant at 4K made by many in various threads... and every X3D benchmark video shows 1080p performance.

Asking because I game in 4K, which is all on the GPU... so I would assume the answer is yes? I remember 1080p HD coming out around 2007... but I don't remember the last time I actually used that resolution on my PC.

That being said... gonna keep on trucking with my 9950X and 4090 until the 5090 is readily available at MSRP.

Today the GPU is the system bottleneck at 4K. However, assuming a notable increase in throughput from the 4000-series to the 5000-series Nvidia cards, the CPU will then be able to stretch its legs and feed that throughput.

X3D chips appear to have the headroom, the performance available, to fulfill the GPU’s needs.
 
Sep 6, 2024
6
7
15
Well, if I’m not much mistaken, even a 4K gamer can benefit from the 9800X3D, in the sense that there’s a great performance improvement in the 0.1% and 1% FPS lows, and that can come in handy when dealing with games that are known to be stutterfests.
The TechPowerUp site provides benchmarks for 4K, including 1% lows at 4K. For me, the difference in the 1% lows between X3D and non-X3D is still negligible. No benefit. When the 7800X3D was $300 it was a no-brainer, but at more than $500 in my country it's a big pass.
 

JamesJones44

Reputable
Jan 22, 2021
853
784
5,760
Someone correct me if I'm wrong... but aren't the X3D chips pretty irrelevant if you are gaming in 4K?
With current GPUs this is largely correct (there are some benefits, but they're far less pronounced than at lower resolutions). However, we don't yet know how much performance the 50xx GPUs will bring to the table. They could make the X3D processors relevant for 4K if things start to become more CPU-bound instead of GPU-bound, but we don't have enough info to know as of right now.
 
nvidia: there might be a gaming gpu shortage this quarter, but next quarter it'll be fine.

can you say staged any better?

stops making 40 series to keep quantities low for the release of the 50 series next quarter.
This is called business planning. If you know you are about to make a bunch of existing products... well, not "obsolete" but "no longer worth the original price"... then you don't keep making those products. You halt production long before the replacement is ready.
"Don't buy a GPU now! Wait till we finally get our new generation onto market ... sometime next year!"
Also nope. I think this is more like, "Oh, you want our old GPUs for the holidays? Great. Prices have gone up due to the shortages that are now present in the marketplace. Thanks for the money!"

Really, this all goes back to the data center Blackwell delays. All of the original chips that had issues? They're basically junk. Or if not junk, not able to be used right now. So then Nvidia has to allocate a bunch of extra wafers to the GB200 die, and TSMC is tapped out on production capacity. So, in order to fulfill the massively lucrative data center obligations first, the consumer Blackwell GPUs get pushed back.

And now, instead of having launched already, we have "Blackwell coming in January."

I am 100% certain that at the beginning of 2024, Nvidia fully intended to launch RTX 50-series this fall. But plans have to be flexible, and sacrificing tens of billions in GB200 orders in order to make several billion in the gaming sector just doesn't make any kind of sense on a business level.

AD102 production has long since halted I'm sure. AD103? Probably also halted around the same time. I'd wager only AD104 and above are still being produced, and even that is probably winding down now to make room for the lower tier Blackwell GPUs in the spring/summer timeframe.
 
Well, if I’m not much mistaken, even a 4K gamer can benefit from the 9800X3D, in the sense that there’s a great performance improvement in the 0.1% and 1% FPS lows, and that can come in handy when dealing with games that are known to be stutterfests.
It also depends on what kind of game you're playing. An FPS, yeah, it's all GPU-bound. But something like Factorio is heavily physics/simulation-bound, so the CPU matters more even at 4K.
 
  • Like
Reactions: P.Amini

TeamRed2024

Upstanding
Aug 12, 2024
191
125
260
As always it depends on your frame rate target and not the resolution. Steve from HUB has railed about this repeatedly for years and people still don't get it.

Not a HUB subscriber... about as useful as Jayz2cents IMO. I did glance at the article though, and I'm definitely a "casual 60 fps" gamer (4K). I also grinned at the part about people questioning whether 1080p benchmarks are relevant... because that's something I've always wondered as well.

Why are we still seeing 1080p benchmarks in 2024? Must be for those hardcore FPS junkies wanting their max fps, because for me 1080p gaming is about as visually exciting as watching a movie on DVD.

To each their own. :cheese:
 
  • Like
Reactions: bolweval
Mar 10, 2020
420
384
5,070
Why are we still seeing 1080p benchmarks in 2024?

The 1080 Ti was the bottleneck, then it was replaced by the 2080 Ti, the 3090 Ti, and the 4090… with each generation the GPU became more “transparent”, less opaque, to the data presented to it. More data can pass through it.

Processors behave similarly generation on generation. For example (no preference, not necessarily the fastest, just easier to type):
1800X, 2700X, 3700X, 5800X, 7700X, 9700X

To benchmark the processor:

If you pair each of those processors with a 1080 Ti at 1080p, there will be an increase in frames rendered up to (for example) the 3700X, at which point the GPU is saturated; testing at 1440p or 4K is pointless, since the GPU can’t render more frames.

Switching up to a 3090 Ti, more frames can be rendered. None of the listed CPUs is likely to saturate that GPU at 1080p, so you would see increasing frame rates as the processor generations are tested in order. A 4090 would be even further from saturation.

At 1440p you are loading the GPU more: the 1080 Ti and 2080 Ti will begin to struggle before the newer processors bottleneck, and at 4K even the mighty 4090 will be the bottleneck with the newer processors.

At 4K you might see some processor scaling, but the faster, newer processors will show the same frame rates within a few percent, as the GPU cannot do more work.

CPU tests are conducted at 1080p to demonstrate CPU scaling. That resolution is reasonably easy for the GPU, and using the fastest GPU available means the GPU is not the restriction in the tests, so any scaling seen is likely to be due to the CPU.

1080p is still prevalent in the Steam surveys and is still relevant due to this.
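If it helps, here is that whole argument as a toy model, assuming the observed frame rate is simply the minimum of what the CPU can prepare and what the GPU can render at a given resolution. Every number and name below is invented for illustration, not a real benchmark result:

```python
# Toy bottleneck model: the observed frame rate is capped by whichever of the
# CPU or GPU is slower. All figures are made up for illustration only.

# Hypothetical max FPS each CPU can prepare (roughly resolution-independent).
cpu_fps = {"3700x": 120, "5800x": 160, "9700x": 220}

# Hypothetical max FPS each GPU can render at each resolution.
gpu_fps = {
    "1080 Ti": {"1080p": 130, "1440p": 90, "4K": 45},
    "4090":    {"1080p": 400, "1440p": 280, "4K": 140},
}

for gpu, by_res in gpu_fps.items():
    for res, g in by_res.items():
        for cpu, c in cpu_fps.items():
            observed = min(c, g)  # the slower component sets the frame rate
            limiter = "CPU" if c < g else "GPU"
            print(f"{cpu} + {gpu} @ {res}: {observed} fps ({limiter}-bound)")
```

Run it and the 1080p rows on the fast card spread out by CPU, while every 4K row on the slower card prints the same GPU-bound number, which is exactly why CPU reviews test at 1080p with the fastest GPU available.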
 

TeamRed2024

Upstanding
Aug 12, 2024
191
125
260
1080p is still prevalent in the Steam surveys and is still relevant due to this.

Yeah, I saw that mentioned somewhere recently. Not enough info though, IMO, and it still doesn't answer the question: why is a resolution coming up on 20 years of age still relevant?

It certainly isn't because of visual quality. Must be GPU prices, because I'd bet those same people surveyed are all running 4K UHD panels in their living rooms.
 
  • Like
Reactions: valthuer
Not a HUB subscriber... about as useful as Jayz2cents IMO. I did glance at the article though, and I'm definitely a "casual 60 fps" gamer (4K). I also grinned at the part about people questioning whether 1080p benchmarks are relevant... because that's something I've always wondered as well.

Why are we still seeing 1080p benchmarks in 2024? Must be for those hardcore FPS junkies wanting their max fps, because for me 1080p gaming is about as visually exciting as watching a movie on DVD.

To each their own. :cheese:
If you don't understand why nothing higher than 1080p is relevant in a CPU benchmark/review, then you don't understand benchmarking/reviews. In a CPU review, if you're testing resolutions which are GPU-limited, then you're not actually testing the CPU. The point is to remove the GPU from the equation as much as possible without causing other issues (ex: GN and HUB have both experienced some anomalous results when using a 4090 at 720p, so they both stick to 1080p for CPU reviews). This way you know the upper bounds of what the CPU is capable of, no matter the video card.

Putting that into practice: if all you care about is 60 fps, then what you need to look for in a CPU is one that's capable of 60 fps in anything you're likely to play. At that point you'd need to match it with a video card that can do the same at whatever resolution you play.

CPUs don't scale with resolution like video cards do, so while testing multiple resolutions is important for a video card review, it simply isn't for a CPU review. I'd bet any 6-core ADL or newer could probably do 4K/60 in most games.

edit: an analogy popped into my head: testing games in a CPU test at 4K would be like testing GPUs on a 4th-gen Intel CPU. Sure, you'll get results, but they won't be testing what you intend to test.
 
Last edited:

Mama Changa

Great
Sep 4, 2024
78
45
60
It's time we begin boycotting NVIDIA and supporting their competitors, AMD and Intel; may glory go to them!
As long as the RX 8800 XT delivers on the rumours, it's my next card. That means 7900 XT/XTX raster performance, 50% faster RT than the 7900 XTX, lower power draw than the 7900 XT, 16GB, and $599 max. Oh, and it had better support FSR4 when that's available.