News: Leak suggests DDR6 development has already started, aiming for 21 GT/s

Here are all the LPDDR6 memory slides as well. :)

Synopsys has laid out 14.4 Gbps as the highest defined data rate, with an introductory rate of 10.667 Gbps.

LPDDR6 will also use a 24-bit wide channel composed of two 12-bit sub-channels, offering an introductory per-channel bandwidth of roughly 28.5 GB/s and up to 38.4 GB/s with the fastest 14.4 Gbps dies.
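For what it's worth, those figures line up if you assume each 24-bit channel transfer carries 256 data bits out of 288 total bits, with the remainder going to metadata. Here's a minimal Python sketch under that assumption (the 256/288 efficiency factor is my inference, not something stated on the slides):

```python
# Minimal sketch: LPDDR6 per-channel bandwidth, assuming a 24-bit channel
# whose transfers carry 256 data bits out of 288 total (256/288 ~= 0.889).
# The metadata-overhead ratio is an assumption, not quoted from the slides.
CHANNEL_WIDTH_BITS = 24
DATA_EFFICIENCY = 256 / 288  # assumed data-to-total-bits ratio

def lpddr6_channel_bw_gbs(pin_rate_gbps: float) -> float:
    """Per-channel bandwidth in GB/s for a given per-pin data rate in Gbps."""
    return pin_rate_gbps * CHANNEL_WIDTH_BITS * DATA_EFFICIENCY / 8

print(f"{lpddr6_channel_bw_gbs(10.667):.1f} GB/s")  # ~28.4 GB/s (introductory rate)
print(f"{lpddr6_channel_bw_gbs(14.4):.1f} GB/s")    # 38.4 GB/s (14.4 Gbps dies)
```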

[Attached: LPDDR6 memory slides, LPDDR6-Memory-_1 through _16]
 

TechyIT223
We need more details on the specs of next-gen GDDR7 memory for GPUs. Rumors point to 28 Gbps as the minimum starting speed.

But it's uncertain whether such high speeds will be utilised by Nvidia, let alone AMD.
 
Yes, very likely the first iteration of GDDR7 "next-gen" GPUs will use the 28 Gbps dies, with only a few high-end flagship variants or their refreshes getting 32 Gbps pin speeds.

But I think only Nvidia will go for this, since AMD might stick with GDDR6(/X) memory for its mainstream Navi 48 and Navi 44 GPUs (speculation).

Still, this would be a huge improvement over GDDR6, which tops out at 24 Gbps; so far we have only seen up to 23 Gbps on certain GPUs such as the RTX 4080 SUPER.

However, the GDDR7 standard allows for memory bandwidths of up to 192 GB/s per chip while keeping the 32-bit data width per chip, which works out to an effective speed of up to 48 Gb/s per bus bit (192 GB/s × 8 ÷ 32), or an "effective clock" of 48 GHz.

So here's a 'pipe dream' based on that future "48 GHz" effective speed! It's unlikely to happen, at least on consumer gaming GPUs initially. :sleep:

A graphics card with a narrow 64-bit bus, typical of low-end GPUs (think RX 6500 XT), could achieve 384 GB/s of bandwidth.

A mainstream GPU with a 128-bit memory bus would already have 768 GB/s of bandwidth.

And high-performance GPUs with 256-bit and 384-bit buses could sport bandwidths of 1536 GB/s and 2304 GB/s respectively!
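Since this is just linear scaling, here's a quick Python sketch of the arithmetic above; the 48 Gb/s per-pin rate is the hypothetical ceiling implied by the 192 GB/s-per-chip figure, not a shipping speed:

```python
# Sketch of the 'pipe dream' GDDR7 math above: bandwidth scales linearly
# with bus width at a fixed per-pin rate.
PER_PIN_GBPS = 48  # ceiling implied by 192 GB/s per 32-bit chip (192 * 8 / 32)

def bus_bw_gbs(bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s for a given bus width in bits."""
    return PER_PIN_GBPS * bus_width_bits / 8

for width in (64, 128, 256, 384):
    print(f"{width}-bit bus: {bus_bw_gbs(width):.0f} GB/s")
# 64-bit: 384, 128-bit: 768, 256-bit: 1536, 384-bit: 2304 (all GB/s)
```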
 

TechyIT223
What really worries me is the pricing of next-gen GDDR7 GPUs.

I'll be skipping the whole next generation unless AMD has something better than Nvidia.

In fact, there's no way I'd pay over $1000 for a graphics card, so that basically rules out the RTX 5000 series at this point anyway. 🙁

If my current Nvidia 3080 GPU fails before Radeon 8000 launches, I'll probably pick up a cheap 6900 XT, because frankly, after two years of having "good" ray tracing, I haven't found it all that memorable.
 
I won't be upgrading to a discrete GPU anytime soon. I'd go straight for an iGPU setup like the upcoming Strix Halo APU lineup, Intel's next-gen Panther Lake series, or whichever lineup offers the higher iGPU performance.

Given how powerful integrated graphics has become in recent years, and the type of games I usually play, an APU with a high CU/Xe-core count should be more than enough for my casual gaming needs.

Currently rocking an RTX 4060 8GB GPU, so that will keep me busy for the next few years or so, unless of course it dies (*touches wood*). :cautious:


Sorry, this is kind of OFF TOPIC:

But the current AMD Radeon 780M iGPU can already deliver over 60 FPS at 1080p in games such as Cyberpunk 2077. That's a huge leap for integrated graphics; they are getting close to the point where they can replace entry-level discrete solutions.

There's a reason NVIDIA isn't expected to invest any more in its MX line of laptop GPUs: both AMD and Intel are prepping powerful integrated chips for future laptop designs. Some recent Radeon 780M results (via ETA Prime):
  • Vanilla Skyrim (1080P / High) - 120 FPS Average
  • Crysis Remastered (1080P / High) - 71 FPS Average
  • World of Warcraft (1080P / Max) - 98 FPS Average
  • Genshin Impact (1080P / High) - 60 FPS Average
  • Spiderman Miles Morales (1080P / Low) - 74 FPS Average
  • Dirt 5 (1080P / Low) - 71 FPS Average
  • Ghostwire Tokyo (1080P / Low) - 58 FPS Average
  • Borderlands 3 (1080P / Med) - 73 FPS Average
  • God of War (1080P / Original / FSR Balanced) - 68 FPS Average
  • God of War (1080P / Original / Native Res) - 58 FPS Average
  • Mortal Kombat 11 (1080P / High) - 60 FPS Average
  • Red Dead Redemption 2 (1080P / Low / FSR Performance) - 71 FPS Average
The previous gaming benchmark results from ETA Prime's last video:
  • CSGO (1080P / High) - 138 FPS (Average)
  • GTA V (1080P / Very High) - 81 FPS (Average)
  • Forza Horizon 5 (1080P / High) - 86 FPS (Average)
  • Fortnite (1080P / Medium) - 78 FPS (Average)
  • Doom Eternal (1080P / Medium) - 83 FPS (Average)
  • Horizon Zero Dawn (1080P / Perf) - 69 FPS (Average)
  • COD: MW2 (1080P / Recommended / FSR 1) - 106 FPS (Average)
  • Cyberpunk 2077 (1080P / Med+Low) - 65 FPS (Average)

https://www.youtube.com/watch?v=KKaoxe5dd3M
 

TechyIT223
Well, I can't blame you for going the iGPU route; it actually makes sense if you're into casual gaming.

Pretty impressive performance numbers for the 780M iGPU, though.

Next-gen iGPUs will surely be better, more efficient, and faster. Imagine a 40 CU iGPU 😜
 
