Nvidia next generation to use 512-bit and 448-bit memory controllers

marvelous211

Distinguished
Aug 15, 2006
http://www.fudzilla.com/index.php?option=com_content&task=view&id=6702&Itemid=1


Can it be that Nvidia will bring a real next generation product just a quarter after it released its 9800 series? Well, we don’t have the answer to that particular question, but as we reported here, Nvidia is working on a new GPU codenamed D10U-30. The D10U-30 will feature 1,024MB of GDDR3 memory and we learned that there will be one more SKU below it.

The second one is codenamed D10U-20 and it will have 896MB of memory, again of the GDDR3 flavor. This new card indicates that Nvidia can play with the memory configuration and that the new chip might support more than the regular 256-bit memory interface.

This one might support a 384-bit or some other memory configuration, but we still don't have enough details about it. It looks like Nvidia doesn't feel that going to GDDR4 is necessary, and the company will instead jump directly from GDDR3 to GDDR5.
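The odd 896MB figure is what hints at the bus width: each GDDR3 chip drives its own 32-bit channel, so the chip count fixes the interface width. A minimal sketch of that relation, assuming 64 MB (512 Mbit) chips; the chip density is an assumption for illustration, not something stated in the article:

```python
# Why 1,024MB and 896MB hint at 512-bit and 448-bit buses:
# one 32-bit channel per GDDR3 chip, so chip count sets bus width.
CHIP_DENSITY_MB = 64   # assumed 512 Mbit GDDR3 chips
CHIP_WIDTH_BITS = 32   # each chip sits on its own 32-bit channel

def bus_width_bits(total_mb: int) -> int:
    """Bus width implied by total memory size."""
    chips = total_mb // CHIP_DENSITY_MB
    return chips * CHIP_WIDTH_BITS

print(bus_width_bits(1024))  # 16 chips -> 512
print(bus_width_bits(896))   # 14 chips -> 448
```

Under the same assumption, a plain 256-bit card would carry 512MB, which is why 896MB doesn't fit the regular configuration.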
 

marvelous211

Distinguished
Aug 15, 2006
My guess would be....

32 ROPs, either 48 or 96 TMUs, 192 SPs, 512-bit memory controller

28 ROPs, either 40 or 80 TMUs, 160 SPs, 448-bit memory controller.
 

LukeBird

Distinguished
Nov 14, 2007
Unfortunately, after the (unfounded!) rumours about the 9800GX2, I'll believe it when I see it!
If it's true, it looks like Nvidia reckons the HD 4xxx cards will be monsters! :D
 

John Vuong

Distinguished
Mar 4, 2008
Well, I hope NVIDIA will use a 512-bit memory bus; then I will be happy. And FFS, use GDDR5: ATI is going to use it, so why stick with GDDR3? That is getting old.
 

iluvgillgill

Splendid
Jan 1, 2007
Will Nvidia do what they have done before with the 8800GTX: release a "REAL KING" that will last for 2 years, and in those 2 years bring out many renamed, remasked, lightly tweaked cards to fool people and empty their pockets?

And to John Vuong: even though GDDR3 is old, the GDDR3 used in the 9800GTX is a 0.8ns memory chip. Its rated clock is very high, high enough to go into the GDDR4 territory of 2400MHz, whereas standard GDDR4 starts off at 2000MHz. It's like DDR2-1066 versus DDR3-1066.
 
Yeah, I read that last night before going to bed and sent a quick e-mail to FUAD saying that if the memory size is correct, it's unlikely to be 384-bit rather than 448-bit.

Marv, I like the breakdown above; I think the former TMU number rather than the latter, though, but potentially a higher internal ratio vis-à-vis addressing within the TMU.
 


It sounds like they're still having tolerance issues in supporting GDDR4/5.

The cost issue is no longer prohibitive, and the power/heat issue wouldn't be a problem, as GDDR5 doesn't require 1.8V to run at high speed, unlike GDDR4.

It'll be interesting to see how the bandwidths match up, because GDDR5 will launch at just under twice the speed of GDDR3. However, I wouldn't be surprised if, with good cooling, the GDDR3 overclocks pretty well; multiplied across the 512-bit interface, the bandwidth would scale amazingly under overclocking. That was rather limited in the early HD2900s.
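As a back-of-the-envelope check, peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch; the 2.0 GT/s GDDR3 and 3.6 GT/s GDDR5 clocks here are illustrative assumptions, not announced specs:

```python
# Peak bandwidth in GB/s = (bus width in bits / 8 bytes) * data rate in GT/s.
def bandwidth_gb_s(bus_bits: int, data_rate_gt_s: float) -> float:
    return bus_bits / 8 * data_rate_gt_s

# Assumed, illustrative figures: a wide GDDR3 bus can rival a
# narrower GDDR5 one if the chips clock high enough.
print(bandwidth_gb_s(512, 2.0))  # 512-bit GDDR3 at 2.0 GT/s -> 128.0
print(bandwidth_gb_s(256, 3.6))  # 256-bit GDDR5 at 3.6 GT/s -> 115.2
```

This is why overclocking the memory pays off disproportionately on a 512-bit card: every extra GT/s is multiplied by 64 bytes per transfer instead of 32.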
 

radnor

Distinguished
Apr 9, 2008
I love these discussions about VGA e-peen.

More MGhz doesn't mean more performance, just more MGhz. Netburst is good proof of that.

Next-gen cards? They will come with a PPU attached.

The GeForce 9xxx are just refurbished G92s. Nvidia is holding the PPU card (or the 2-core card) while peeps continue to buy VGAs to expose their e-peen.
 

radnor

Distinguished
Apr 9, 2008
Well, I still have an old chip that can still muscle with an 8600GT. Fast GPUs and fast RAM are nice, but if the bus is really slow... well, nothing much more to say about it, is there?

The resolution of LCDs is a nice point, but doubling the cards for 50% (or less) improved performance is the perfect sale. Really, it is. Not to mention that games, due to their development cycles, never (or very rarely) take advantage of cutting-edge hardware/CUDA/development tools.

God, I hope ATI/AMD really get their act together, so Nvidia NEEDS to draw out good cards (pun intended). I hate the refurbishing of old chips with just more and faster RAM and an OCed GPU.

I won't upgrade my build for now. The "upgrades" are nice and dandy, but they aren't sufficient performance/money-wise.
 

marvelous211

Distinguished
Aug 15, 2006


I don't know about e-peens, but I buy cards because the games run slow. I don't know about others, but I do try to hold out as long as I can, or get a good deal on one that isn't going to hurt my pockets.

These DX10 cards aren't Pentium 4s. FUD didn't even mention MHz. A PPU is not necessary when we have 4-core CPUs. The biggest leaps came from SPs, and you could call the GeForce 7 the Pentium 4 of GPUs. Looking at the speculated specs, though, it seems to be leaps and bounds faster than the 8800GTX. A card that can finally play Crysis at very high detail. A single-GPU card with no SLI nonsense.

 

radnor

Distinguished
Apr 9, 2008


Just one thing: a PPU, or a specialized CPU if you prefer, would be the next logical leap. Two GPUs on the same PCB is a very old tale, called the V5000 with 2x VSA chips. I had one; it still works, by the way, although drivers haven't been made for it in a long time. As many are showing their enthusiasm and their opinions (some of them filled with valid reasons themselves), I show my opinion. I created this account on this forum just for it. It's an older opinion, you may say, less enthusiastic, with more... hmm... memory of old breakthroughs.

The 8800GTX is a great card, but in the VGA market it's already a very old card (or chip, if you prefer). Crysis isn't heavy in itself; the VGA market this last year has just moved really slowly compared to other years. ATI and Nvidia are both to blame on this matter. I remember the hype about Far Cry, and it was run fully a few months after launch by new hardware.

Honestly, I want the best bang for my buck as well (Euro in this case), but I look at the options, and my R480 chip still runs smoothly enough (Crysis at medium, 30-40 fps).

About the PPU, a small hint: for rasterization (or ray tracing, where I think Intel will fail deeply) your CPU is pure muscle. Some operations it will do nicely; others will take loads of time. It's what we call a generic processor: it can do everything, but nothing nicely. The next gen, eye-candy- and performance-wise, will be (I hope) an embedded PPU, or, instead of this silly SLI/Crossfire fight (double the money for less than 50% performance, plus software problems, plus games aren't made for SLI/Crossfire yet), an "add-on card" with dedicated support. A second card, made for dedicated work. Wanna play at max settings? Sure, buy the "add-on" card. I know I will. But it's silly to add another card with the same specs, with all the trouble that comes with it.

Bah, just venting my disappointment about the GPU industry at the moment. Sorry if I hit some soft spots.
 

spotless

Distinguished
Sep 7, 2007
Well, for the sake of the customer (us), AMD & Nvidia (placement due to the alphabet) should do unified coding so every game engine can be optimized as on consoles. What do you think, guys?
 

homerdog

Distinguished
Apr 16, 2007

I agree that SLI has historically been a bad investment, but this latest generation of cards (and more importantly their associated drivers) has made it a much more viable option. I would go so far as to say that 2 8800GTs in SLI is the best value high performance solution on the market right now.

E-peen has little to do with it. If my 8800GT ran Crysis fine at very high settings with a little bit of AA then I wouldn't upgrade again until there was a game that needed more power.

Actually, a 3GHz Netburst processor is faster than a 2GHz Netburst processor, all other things being equal. Higher frequencies do mean more performance if the architecture is unchanged.

More likely, the "real" next-gen cards will be based on an architecture that lends itself well to physics calculations. Nvidia is already planning a CUDA implementation of PhysX for the 8 and 9 series, and those cards pretty much suck at branching calculations.

I agree with you there. Hopefully Larrabee will provide some competition as well; the more the merrier :)
 

radnor

Distinguished
Apr 9, 2008


Pretty much. Want me to post a screenshot? I'm at work at the moment, but I can do it later. Only BioShock doesn't run; it hangs up a lot. With the exception of BioShock, I can run almost everything with decent playability. It's got 512MB GDDR3 and overall still handles pretty well.

@Homerdog:

Appreciated the comments; some I agree with, some I don't, but hey, that's life. I wrote 2 more posts after that little "flame". At least people are picking at the e-peen statement. Sometimes, in a VGA forum, it just seems like it.
 

radnor

Distinguished
Apr 9, 2008


No. Dunno if it was at 1440x900 or a bit lower. It's an X850XT 512MB, not an X800XT 256MB.
By the way, the link you showed me got really low results on other VGAs I've already played Crysis and other games on as well.
Sorry, but some results there just don't add up.

I don't want to sound crazy, but some results seem really low compared to other systems I've already played with, within those specs.
 

marvelous211

Distinguished
Aug 15, 2006
I had an 850XT like 2 years ago. The 850XT is just clocked slightly higher than the X800XT: a 5% difference. At medium settings in Crysis, a 256MB card is enough for 1280x1024 resolution. I'm not too sure, since I haven't tested it. At high settings it uses slightly over 320MB of VRAM @ 1280x1024.

The benchmarks aren't off at all when you consider they tested when the game had just been released. Crysis eats GPUs alive and spits them back out.

I had an 8600GTS that would chug at 1440x900 medium. Playable? Yes, but still slow as hell.
 

marvelous211

Distinguished
Aug 15, 2006


I had a Voodoo 5500 until I sold it off and got a Radeon 64MB ViVo. :sol:



It is old, but it's a 24-ROP beast with a 384-bit memory bus. The whole point of G92 was to bring down the price for the mainstream, which it did. It was a re-spin of the old. Look how dirt cheap G92 chips are: they are downright cheap and get within 10% of a full 8800GTX at a fraction of the cost. Crysis is a GPU killer. Medium settings look decent, but you don't know what you are missing until you can play this game at very high. I get 17fps @ the very high settings on my card. I just like looking at the pretty water and jungle. Nvidia's GT200 should be able to play this game at very high. It's going to be expensive: 512-bit memory controllers aren't cheap, and neither is double the transistor count.

I really doubt that. My 8600GTS, which is as fast as a 1950Pro, did 28fps at max overclocked settings.

Ray tracing is up to Nvidia, not Intel. As far as I know, Nvidia is going with a hybrid technology. At least this is what I've heard.
 

radnor

Distinguished
Apr 9, 2008
1,021
0
19,290
3
Playable. I don't say it's a breeze in a loaded scene (smoke, particles, mobs, etc.), but the FPS doesn't drop too much.

Bottom line, what I do mean, and I think we agree on this point, is that Nvidia & ATI should be shipping much better products than they are now. Right now the top-of-the-line boards are just 2 GPUs on the same PCB (or on 2 PCBs in the same PCIe slot; this shows you can make omelettes... in several ways). There isn't any major breakthrough.

Smells like Vista. Really.

On-topic: 512-bit or 448-bit buses on GPUs? Now that would be a nice upgrade. But I think Nvidia will leave us out to dry. Unless ATI can pull a rabbit out of the hat.
 


It's about architecture, actually, which seems to have escaped you; that isn't surprising given what you wrote.

More MGhz Doesnt mean more performance. Just more MGhz. Netburst is a good proof of that.
Actually it does, more than anything else. Like homerdog mentioned, 3GHz of the same architecture is faster than 2GHz of the same architecture, and a 2GHz Athlon XP will be faster than a 2MHz Core2Quad. So while everything plays a part, performance is determined most importantly by speed. And anything running at a "MGhz" would likely outperform the others in computational power and be restricted by other areas.

Next Gen Cards ? They will come with PPU attached.
No, they won't. The PPU is dead. Period.
nVidia bought Ageia, and all their IP belong to US, err... them.
nV is putting it all into their GPGPU architecture, and it makes more sense to assign SPs to physics when needed and to graphics when needed than to waste an IC, memory, and board space on a dedicated PPU whose utility was non-existent when Ageia pushed a hardware PPU. The PPU as an IC is dead; GPGPU is the way things are going.

Use your eBrain more than your eWang.