Adata's DDR4 Memory Modules Are Finally Arriving

Can anyone explain why we are on DDR4 when graphics cards already use DDR5? Why not just jump straight to 5? Cheers.

They are different things. Graphics cards use GDDR5, not DDR5.

For clarification, graphics memory is "dual-ported". What this means is that data in GDDR can be written to by the GPU and read out by the display controller at the same time. This is different from normal DRAM, in which a block can only be read or written at a given time.
 
I'm considering getting an X99 board with an i7-5930K (it seems to be the best pick for gaming: better PCIe connectivity than the 5820K and 500MHz faster than the 5960X, and also more likely to overclock well), along with some decent DDR4. (As for the DDR4 comments above, there are other benefits to DDR4 than just basic RAS/CAS timings and power savings.)
Personally I am aiming for an octa-core with a mild/average OC on it. The reason is that DirectX 12 next year will finally utilize more cores in games instead of typically loading most of the work onto just one core, so it should balance out. With either one (5930K or 5960X) you can't go wrong, but I would skip the 5820K if you will be considering multiple GPUs and maybe adding an M.2 SSD onto those PCIe lanes to share that bus.
Bit of a shame about the DDR4 speeds/latency, but it has to start somewhere. Also, next-gen GPUs will be another round of 28nm, so it seems like next year will be when all the good stuff gets sorted out; most likely there will be slightly updated/faster Haswell-E chips by then as well.

What do you guys think? Aim for the slower-per-core octa-core or the quicker hexa-core for gaming?
 

GDDR is not dual-ported. There are two major differences between DDR and GDDR:
- GDDR uses quad-data-rate signaling on the data pins, which is made possible mainly because GDDR signals do not need to go through socket interfaces
- GDDR has separate read and write data strobe signals to enable faster turnaround when switching data bus direction, while DDR has bidirectional strobe pins, which introduce one cycle of latency to switch the signal driver from one end to the other

Conventional dual-ported memory went practically extinct over 15 years ago because dual-ported memory itself was much more expensive (more expensive to produce due to the extra complexity, and also a niche product, so far fewer sales to spread development costs over), and having dedicated read and write data ports also doubled the pin cost on the video chip.

The concept of dual-ported access is still alive and well but mostly in (semi-)custom form... like Intel's Crystalwell 128MB L4 cache for Haswell with GT3e, Broadwell and Skylake.
 
If the timings were so important, then why is DDR5 memory being used in graphics cards?

Of course, when 1600 CL6 RAM is compared with 1600 CL9 RAM the former will be the winner, but if 1600 CL6 is compared with 2133 CL9 then the latter one will be faster, attributable to its higher frequency (the actual timing numbers are worked out below).

DDR4 is something new & better and should be taken positively.
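
For reference, here is a quick sketch of how such timings translate into absolute latency: CAS latency in nanoseconds is the cycle count times the clock period, and with two transfers per clock that works out to CL x 2000 / (data rate in MT/s). This only covers first-word latency, not bandwidth, and the kits used are just the ones mentioned above.

```c
#include <stdio.h>

/* Absolute CAS latency in nanoseconds for a DDR-style data rate given in MT/s.
   One clock carries two transfers, so the clock period is 2000 / data_rate ns. */
static double cas_ns(double cl_cycles, double data_rate_mts)
{
    return cl_cycles * 2000.0 / data_rate_mts;
}

int main(void)
{
    printf("DDR3-1600 CL6: %5.2f ns\n", cas_ns(6, 1600));   /* ~7.50 ns  */
    printf("DDR3-1600 CL9: %5.2f ns\n", cas_ns(9, 1600));   /* ~11.25 ns */
    printf("DDR3-2133 CL9: %5.2f ns\n", cas_ns(9, 2133));   /* ~8.44 ns  */
    return 0;
}
```

By that measure the 1600 CL6 kit still answers first (about 7.5 ns versus 8.4 ns), while the 2133 kit moves more data per second once the transfer starts, so which one is "faster" depends on whether latency or bandwidth dominates the workload.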
 

DDR5 does not exist. GPUs use GDDR5.

GPUs can work with GDDR5's 20+ cycles of latency because streaming texture data from memory is far more bandwidth-intensive than latency-sensitive. The order in which data will be requested for most GPU-optimized tasks is known well in advance, so the GPU can easily hide the latency with prefetching.

For a main CPU on the other hand, guessing which pieces of data will be needed next is far more difficult since the code running on CPUs is so much more diverse and traditionally much less linear.
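
To illustrate the latency-hiding idea above in CPU terms, here is a minimal sketch using GCC/Clang's __builtin_prefetch: when the access order is known well in advance, as it is for streaming texture reads, the next blocks can be requested long before they are needed, so the long round trip to memory overlaps with useful work. The prefetch distance of 64 elements is just an assumed tuning value.

```c
#include <stddef.h>

/* Sum a large array while prefetching data a fixed distance ahead.
   Because the traversal order is known in advance, the fetch of data[i + PREFETCH_DIST]
   overlaps with the work done on data[i], hiding most of the memory latency. */
#define PREFETCH_DIST 64  /* elements ahead; an assumed tuning value */

double streaming_sum(const double *data, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        if (i + PREFETCH_DIST < n)
            __builtin_prefetch(&data[i + PREFETCH_DIST], 0 /* read */, 0 /* low temporal locality */);
        sum += data[i];
    }
    return sum;
}
```

In practice a CPU's hardware prefetcher already handles a simple linear walk like this on its own; the point is that GPU workloads look like this almost all the time, while general-purpose CPU code often does not.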
 
Wonder what the prices on those could be!
I'll wait till G.Skill release their kits on the market, and then maybe I'll buy a decent X99 rig... say a 5930K... 64GB of RAM... a Rampage V if released... and a GTX 880 if it comes before Christmas.
 

Previous generations started with a 50-100% premium over mainstream memory during most of their first year... so I would not be surprised if the first generation of 16GB DDR4 kits started around $200-250.
 
2133-15 DDR4 when there is 2133-9 DDR3 @ 1.5V on the market... I'm not convinced those DDR4 DIMMs will be a hit with the X99 crowd, which is not exactly known for caring about saving 3-5W - particularly if it comes with any sort of performance penalty like the 66% higher latency DDR4 would have (checked below).

Well, the DDR4 transition has to start somewhere... even if the starting point seems to make little to no sense.
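
For what it's worth, the 66% figure checks out with the same cycles-to-nanoseconds conversion sketched earlier in the thread:

```c
#include <stdio.h>

int main(void)
{
    /* Absolute CAS latency in ns = cycles * 2000 / data rate in MT/s. */
    double ddr3 = 9.0  * 2000.0 / 2133.0;   /* DDR3-2133 CL9  -> ~8.44 ns  */
    double ddr4 = 15.0 * 2000.0 / 2133.0;   /* DDR4-2133 CL15 -> ~14.07 ns */
    printf("DDR3-2133 CL9 : %.2f ns\n", ddr3);
    printf("DDR4-2133 CL15: %.2f ns\n", ddr4);
    printf("DDR4 penalty  : %.1f%% higher\n", 100.0 * (ddr4 / ddr3 - 1.0));  /* ~66.7% */
    return 0;
}
```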
I agree. I have OC'ed my memory to 1866 at CAS 9 and it is still at 1.5V. DDR4 has no point right now.
 