
Do you think GDDR6 will have a big impact on 4K? What about HDR?

WINTERLORD

Distinguished
Sep 20, 2008
1,805
19
19,815
Well, I'm wondering because you could probably get a graphics card now with a GPU capable of acceptable frame rates, but I'm curious what kind of impact GDDR6 will have on next-gen cards. A 1070 Ti has enough cores, so to speak, to run my game (GW2) at plenty acceptable frames, considering it's CPU-bound and I have an 8700K. But what about an 1170? It may have 2,666 stream processors, not much of an increase, but the memory will be GDDR6. I wonder if it'll make a huge difference, and whether it's worth waiting rather than buying something now, or if maybe it'll just be a fancy way of dressing up a 15% increase, as companies sometimes do.
 
It is fruitless to make graphics card choices based on the underlying specs; designers pick the components to achieve a given price/performance objective.

Better to look at benchmarks for the games you will play.
You will get fair value at every price point.
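One concrete way to apply that advice is to rank cards by benchmark fps per dollar in the games you actually play. A minimal sketch, using made-up placeholder numbers (substitute real benchmark results and street prices; the card names and figures here are hypothetical):

```python
# Hypothetical example values -- plug in real benchmark fps and real prices.
cards = {
    "card_a": {"fps": 62, "price": 450},
    "card_b": {"fps": 88, "price": 700},
}

# Sort by value (fps per dollar), best first.
for name, c in sorted(cards.items(),
                      key=lambda kv: kv[1]["fps"] / kv[1]["price"],
                      reverse=True):
    print(f"{name}: {c['fps'] / c['price']:.3f} fps per dollar")
```

The cheaper card can easily win on value even while the pricier card wins on raw frames, which is why benchmarks-plus-price beats spec sheets.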

GDDR6 is likely to first show up on the very highest top end cards, and expect to pay a hefty premium at launch.
 
While GDDR6 *may* improve 4K performance/capability, it's not going to be the sole factor.

I don't believe the bandwidth of GDDR5/X is too great an issue currently, even at 4K... although I'm sure it's part of the equation.

While yes, with the rumored 384-bit bus (Hynix), the bandwidth jump is substantial (to ~768 GB/s), I would expect the lower voltage required to be the greater "benefit", along with VRAM capacity. Less voltage required for the memory could allow more voltage/power to be directed elsewhere on the card and/or to additional memory.
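For what it's worth, the ~768 GB/s figure falls out of simple arithmetic: peak bandwidth = (bus width in bytes) × (per-pin data rate). A quick sketch, where the 16 Gbps per-pin rate is Hynix's quoted GDDR6 target and actual shipping speeds are an assumption:

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bytes per transfer) x (transfers per pin per second)."""
    return (bus_width_bits / 8) * data_rate_gbps

# GDDR5X on a 352-bit bus at 11 Gbps (GTX 1080 Ti class): 484 GB/s
print(peak_bandwidth_gbs(352, 11))   # 484.0

# Rumored GDDR6 on a 384-bit bus at 16 Gbps: 768 GB/s
print(peak_bandwidth_gbs(384, 16))   # 768.0
```

So the jump over a 1080 Ti-class card would come from both the wider bus and the faster per-pin rate.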

I wouldn't be surprised at all to see 12+ GB of VRAM on consumer GPUs (albeit at the higher end).


There's also likely to be one eye on 8K monitors.
4K monitors are (relatively) affordable/mainstream today, ~5 years after they first debuted.

8K monitors debuted around 2016 (albeit most were prototypes, and it wasn't until mid-to-late 2017 that some were actually available).
The 11xx (or 20xx) series may be a little soon to accommodate 8K, but some of the bandwidth requirements for 8K are likely to be trialed in these cards, if I were to hazard a guess.
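To give a rough sense of why 8K stresses the display link: uncompressed video data rate is pixels × refresh rate × bits per pixel. A quick sketch, assuming 8-bit RGB (24 bpp) and ignoring blanking/encoding overhead, so the real signaling requirement is somewhat higher:

```python
def display_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s (no blanking or encoding overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

print(display_gbps(3840, 2160, 60))  # 4K60: ~11.9 Gbit/s
print(display_gbps(7680, 4320, 60))  # 8K60: ~47.8 Gbit/s
```

8K at 60 Hz is roughly four times the raw data of 4K60, which is why it pushes past what current cable standards carry without compression.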
 
Solution


Given two otherwise identical GPUs, one with GDDR3 and one with GDDR5... can you tell the difference?
No, because you can't find two identical cards with different generations of VRAM.
Similarly, you won't find the same card available with both GDDR5 and GDDR6. A host of other factors come into play.

It's like trying to compare DDR2 and DDR3: impossible, because all the other things are also different.