NVidia Readies Dual Chip, Single Chip 9800GX2, 9800GTX and 9800GT



I remember when the 8800GTX replaced the 7900GTX: the 8800GTX was more than four times as powerful. I was waiting for a huge upgrade, not some small 30% bump. We need a new architecture from NVidia by November; it's not worth upgrading to a 9800GX2 unless you own an 8800GTS 320MB or a lesser graphics card.
 
As for ATI being in for a rough time... I don't think so (but I could be wrong).

1) First, the 3870 X2 should be cheaper to produce than the 9800 GX2. The 3870 X2 will be a single-PCB board versus Nvidia's dual-PCB 9800 GX2. That should make it cheaper to manufacture, so ATI may be able to pass the savings on to the consumer and undercut the 9800 GX2.

i.e., if a 3870 X2 costs $500 and delivers 90% of the performance of a 9800 GX2 that sells for $700, I think the 3870 X2 will do well.

2) Second, ATI's architecture scales better than Nvidia's. A 9800 GX2 may only increase performance over an 8800 Ultra by 30%, but two 3870s on a 3870 X2 generally give over a 75% increase in performance over a single 3870.

I'm not saying the 3870 X2 will be stronger than a 9800 GX2 (it almost 100% won't be), but it may turn out to be a good product/buy; see the quick numbers sketched below.
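To put rough numbers on points 1) and 2), here's a minimal back-of-the-envelope sketch in Python. Everything in it comes from the figures quoted above ($500/$700 prices, 90% relative performance, 30%/75% dual-GPU gains), which are this post's guesses, not confirmed specs:

```python
# Back-of-the-envelope math using only the hypothetical figures quoted above.

gx2_price, x2_price = 700.0, 500.0  # assumed street prices (USD)
x2_relative_perf = 0.90             # 3870 X2 assumed to hit 90% of a 9800 GX2

# Performance per dollar, normalizing the 9800 GX2 to 1.0 performance units.
gx2_value = 1.0 / gx2_price
x2_value = x2_relative_perf / x2_price
print(f"GX2: {gx2_value:.5f} perf/$  |  X2: {x2_value:.5f} perf/$")
print(f"X2 advantage: {x2_value / gx2_value - 1:.0%}")  # ~26% more perf per dollar

# Dual-GPU scaling: fraction of the ideal +100% gain actually realized.
# Note the baselines differ (8800 Ultra vs. a single 3870), so this is rough.
print(f"GX2 scaling: {0.30:.0%} of ideal  |  X2 scaling: {0.75:.0%} of ideal")
```

Under those assumptions the 3870 X2 comes out roughly 26% ahead on performance per dollar, which is the whole argument in two lines of arithmetic.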

Note: the only thing that annoys me is the performance hit that ATI cards take with AA. Enthusiasts who buy enthusiast cards generally like a high-quality picture (AA helps with that), and the performance hit ATI takes with AA enabled is just too much! I really hope they can somehow fix this!
 
I'll add to this post what I added to the shorter one: what do you guys think about what looks like an optical audio port on the back? Is it really the hybrid graphics/sound card that has been rumored, or is it an audio input for the HDMI? The second option seems fairly uneventful, but it's still something not mentioned in the specs.
 



I wondered about that too. If it's for sound, then maybe Creative has something to compete against. It's an optical port that can pass simple digital data to just about anything; maybe it's for an A/V receiver in your home theater.


@babybudha, I think you're right about ATI's dual-chip graphics card being cheaper and somewhat slower than the 9800GX2. ATI seems to be targeting PC gamers with smaller screens (19"-22") instead of the 24"+ sizes of the high-end market. I don't know how ATI fans deal with the huge loss of performance when using AA; that alone would make me jump over to the other side.
 
This is awesome.

I was SO on the fence regarding my eVGA 8800GTS 640; it killed me to buy it. I mean, I've had it for quite a while now, so I got a lot of gaming out of it, which made waiting for an 8800GT not such a good move. BUT the GT is a sweet pat on the back for the "holdouts". Good job, your patience paid off well, guys :)

BUT this??? WTF!!! I am SO happy I didn't wait!!! This is the FIRST time I ever made a good call to just buy instead of waiting. This sammich-board hack POS would never have gone into my PC. I mean, all c*ck and no balls! HAHA!!!

I think nVidia slightly underestimated ATI. I mean, pull out the duct tape, wrap it around two 8800GTs, and get 30% more power than an Ultra?? Come on. Ultras are ancient now.

This is going to be a trainwreck. I am going to piss myself laughing when the game sites post the Crysis benchmarks showing frame rates in the high 30s, a bunch of other titles with no frame increase worth the $$$, and really old games getting 200fps.

What would have been interesting is a sandwich card with 2x1GB for people running huge LCDs; then it would be OK, I guess.

Seriously, nVidia, you'd better do something to get 60 frames out of Crysis with SOMETHING, 'cause we ain't buying your 9x series otherwise.
 
 
Other than Crysis and Oblivion, what games really tax a system badly enough to require a 9800 GX2 anyway? I played Crysis and Oblivion at PLAYABLE frame rates! Unless I decide to get a 30" monitor, it makes no sense to me.

 
The 9800GX2 is for people who either have an older graphics card and are due for an upgrade anyway, or for those who just have to have the best and don't care about the money.
 


People like me have been wanting more power for a long time. I can slow down an 8800 GTX on a 19" screen in many games... and I play on a CRT at 2048x1536 (see the pixel math below).
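As a quick illustration of why that CRT resolution is so punishing, here's a small pixel-count sketch; the specific monitor resolutions are common examples I've picked, not figures from the article:

```python
# Pixels per frame scale roughly linearly with GPU fill/shading work,
# so a high-resolution CRT demands far more than a typical 19" LCD.
resolutions = {
    '19" LCD (1280x1024)': (1280, 1024),
    '24" LCD (1920x1200)': (1920, 1200),
    'CRT (2048x1536)':     (2048, 1536),
}

base = 1280 * 1024  # normalize against the 19" panel
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px/frame ({pixels / base:.2f}x the 19-inch load)")
# The 2048x1536 CRT pushes ~2.4x the pixels of 1280x1024 every frame,
# which is roughly why the same card feels so much slower on it.
```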

I would have to see retail prices and real performance figures before I made any decisions. It sounds to me like they're just rolling out a die shrink as a new GPU. It looks like Nvidia struck it so lucky with the 8800 design that they haven't been able to make any headway since... or they just can't be bothered, because ATI is behind and they have no "need" to push the boat out :pfff:
 



Or this really is all NVidia can do performance-wise; we'll never know now. Thanks, ATI, for not pushing the graphics card market. This is an example of why we need ATI to push out great cards: when there's healthy competition, there tend to be greater leaps and bounds.
 
Crysis will play at Very High settings at 1920-wide resolutions only on a quad configuration paired with the new Intel CPUs about to arrive; that's over $2,000 on video cards alone. Who in the world will spend that much just to play Crysis? Anybody know how much these cards will cost? Hopefully when they arrive, 8800 GTX prices will drop dramatically, and then I'll pair up two 8800 GTXs in SLI.
 
For the people who were talking about the image of the nekkid chick spinning: I was going to post saying it was fake, and actually I did originally post that, but it turns out it isn't. When I first removed the rest of the frames, I just saw left, right, left, right. But after looking a little longer, I noticed it looked like it was going only clockwise, and I really think it IS the way your brain perceives it.

Here is the image, frames 20-34:
fakemy8.gif
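If anyone wants to pull the frames apart themselves, here's a minimal sketch using Python's Pillow library. The filename is the fakemy8.gif attached above and the 20-34 range is the frame span mentioned; both come from this thread, the rest is boilerplate:

```python
# Extract frames 20-34 of an animated GIF as separate images for inspection.
# Requires Pillow: pip install Pillow
from PIL import Image

gif = Image.open("fakemy8.gif")
for index in range(20, 35):      # frames 20 through 34, inclusive
    gif.seek(index)              # jump to that frame of the animation
    frame = gif.convert("RGBA")  # flatten the palette so it saves cleanly
    frame.save(f"frame_{index:02d}.png")
    print(f"saved frame {index}")
```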

 

Nvidia already used that design (7950GX2), so they save a little on development costs. In addition, putting two GPUs on a single PCB can be a nightmare to develop and manufacture: usually the PCB needs more layers AND has to be longer, and both are major cost factors. And if the chips on that PCB put out a lot of heat, it might not be possible to fit them on a single PCB without reducing clock speeds to keep the card from going supernova.
Given Nvidia's history with dual cards, and looking at their 8800-generation cards, I'd say they use the dual-PCB approach for a few reasons:
1. Heat. Two PCBs, while not as good as two separate cards in SLI, provide a larger surface area and should be easier to cool.
2. Cost. If mass-produced, the single-PCB approach, combined with a cooler or lower-clocked chip, might be cheaper. That holds true only at high volume, since the PCB is quite complex and its size is non-standard. If the card is sold in limited numbers (I'm looking at the GX2 again), it might be cheaper to use the dual-PCB approach. Should demand skyrocket, they can always create a refresh; that's how Nvidia works.

I prefer the single-PCB approach because, to me, it is cleaner. On the other hand, I'm using a Pentium D right now (the epitome of glued-together ugliness), so cost and efficiency play a role too. I think Nvidia will place this card at the very high end and price it accordingly. The GPUs on those PCBs are probably something we've seen already, so this card doesn't really deserve the 9xxx name, as others have already noted.


PS: It is dual, not duel.
 



Dunno what the hell you're talking about, but the nipply little vixen is spinning both left and right to me... first one way, then the other... tease.
 


To me she goes clockwise one moment, then counterclockwise the next. I personally think the image is set to go left-to-right, then right-to-left, and it has nothing to do with your brain whatsoever.

And it's so funny to hear "anti-clockwise". Is that an east coast thing? Over here we just say counterclockwise. 😛
 
Not sure what this chick spinning is representing... I just hopped on this thread and skipped the first pages. It doesn't seem to follow a pattern; she switches between clockwise and counterclockwise. You can tell just by looking at the hand/hip/leg placement. It's interesting that it switches between the two in an odd pattern even though the file is only 100KB. I don't have much experience with GIFs... but I guess you can set one up to repeat randomly? That way it keeps the file size small.
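For what it's worth, an animated GIF can only replay its frames in a fixed order; it can loop, but it can't branch or randomize, so any "switching" is either baked into the frame sequence itself or happening in your head. Here's a quick sketch for checking the file's actual frame count and loop settings with Pillow (again assuming the fakemy8.gif posted above):

```python
# Inspect an animated GIF's frame count, loop flag, and per-frame timings.
# A GIF always plays frames in stored order, so this shows everything it can do.
from PIL import Image

gif = Image.open("fakemy8.gif")
print(f"frames: {gif.n_frames}")                     # total frame count
print(f"loop:   {gif.info.get('loop', 'not set')}")  # 0 means loop forever

durations = []
for index in range(gif.n_frames):
    gif.seek(index)                                  # step through each frame
    durations.append(gif.info.get("duration", 0))    # display time in ms
print(f"durations (ms): {durations}")
```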