New Nvidia GeForce 8800GT Surprises



I'll be building soon and the arrival of the GT hasn't changed anything for me. I'll still be going with a GTX or Ultra, mainly because I'll be gaming at 1920x1200. I wish I could save some money 'cause I don't exactly have it like that... but the GT won't cut it for me.
 


Yeah, I guess the GT is meant to appeal to a very broad spectrum of people, but at the very high end it falls down. The 3800 series might be a winner in the value segment like the 8800GT, and ATI's drivers have come along leaps and bounds, so their new parts should be competitive more or less right out of the box. Rumours of a dual-GPU ATI part look quite exciting.

 


Yeah, but you can only step up to stock cards, and I know how to use nTune anyway :)

I'm still a bit wary, as there are a couple of areas where the GTS 320MB is better on paper, and Nvidia insists that there are *some* cases where a GTS will outperform a GT, though I haven't seen one of those yet.
 
SLI 8800GTs?
 
Hello all, I just want to hear your opinion on a VGA matter.
I have a P4 3.2GHz (1MB L2 cache/800 FSB), 2.5GB DDR2 533MHz with an ASUS 7800GTX 256MB PCI-E.
I play almost all my games with full details at 1680x1050 widescreen - games like HL2, FEAR, WoW, Dawn of War, etc.
The problem is that my two latest games (C&C 3 and Company of Heroes: Opposing Fronts) drop frames badly on High details and I'm facing delays. When I reduce the quality I see a serious improvement.
So I came to the question...

If I change my VGA (which is 1.5-2 years old) to an 8800GTX 768MB, would that solve my problems? Or do I also need to change my CPU to a Core 2 Duo?



Regards
Faiakas
 


To notice a HUGE difference in any game you should go with an 8800 GTX and a Core 2 Duo. In COH you should see 100+ fps with every detail setting at max. If you're gaming at or under 1680x1050 then the 8800GT 512 is good enough. I am getting rid of my 7800GTX today (selling it for $150 US) and replacing it with the 8800 GTX, yeaaa!

On my 7800 GTX I had almost every setting on high quality except physics and shaders, and I got 30-52 fps in Crysis on XP. Crysis sees a bigger fps difference from an overclocked CPU than most games do. If you're in a money crysis then go for a Core 2 Duo E6600 or better and the 8800GT 512MB and you're all good. For under $500 you can get an 8800GT with a Core 2 Duo; the only thing left to buy is your mobo.

You should have no problem getting a good P35 mobo from Gigabyte for about $129.99.
 
EVGA's highest model of the 8800GT also has its shader clocks higher than stock.

I'm in the process of stepping up my card... I literally only had one day left to step up :) Getting the card for free. I ordered the Thermaltake DuOrb to replace the GT's stock cooler when it arrives. What is the best tool for overclocking?
 
Does that allow for the shader clocks to be overclocked too?

EVGA's Step-Up only allows for the stock model. I read a review somewhere that said overclocking the shader clocks increases its performance a lot... somewhere in the 13,000-14,000 range in 3DMark06.
 


That would be a big mistake before the Christmas period.

Does Nvidia really think anyone is going to buy the GTX or Ultra with the GT running so close to them in performance at around half the price?

Let's face it, very few people game at 1920x1200, and even if they do, 2x 8800GTs in SLI paste the GTX for around the same price at all resolutions!

Nvidia was entirely right to launch this card with this performance, as it's what the market needed, but unless they bring out a high-end version, from now on they're going to be selling GTs and nothing else - a bad move with the holiday season nearly upon us.

Time to get that high-end version out NOW!
 
You can step up any of EVGA's cards, but only to the reference one.

If you have an EVGA 8600GTS OC you can step up to the 8800GT stock-speed card.

You can't always step up to an overclocked edition; they do sometimes have OC editions in the Step-Up program, but usually not until a couple of months after the launch of a new line...
 


The same people who bought the GTX/Ultra before. Anyone wanting the top framerates in Crysis will still want an Ultra. It's the money-no-object crowd, not the penny-conscious.
Also, with Tri-SLI on the doorstep, you can ONLY use that with the GTX/Ultra, so anyone who already has two buys their third. I don't think nV is all that worried about moving old product, though; they'd probably be fine just dropping them altogether. But instead they sell off what they have left to those who don't care about price, and then in 2008, voila, new hardware for those very same people to buy again.

Let's face it, very few people game at 1920x1200, and even if they do, 2x 8800GTs in SLI paste the GTX for around the same price at all resolutions!

Yeah, but 2 Ultras paste those 2 GTs. And the small group that does game at or above 19x12 is their target market. Yes, it's small, but it was small even before the GT arrived - it's never been the majority of the market.

Time to get that high-end version out NOW!

I disagree - time to get Tri-SLI out now so they can sell even more GTX/Ultras.
 
I've been reading this thread and the others with interest! This is a question about the 8800GT, even if it doesn't seem like it :) I think anyone with one of the Nvidia cards sporting 2x DVI outputs may be able to help me...

I've recently bought a 52-inch Sharp 1080p LCD TV, and I've been downloading 720p/1080p films off (tsk tsk). Now I just need to work out how to hook up my PC to the TV.

Currently I have a GeForce 6800 with a single DVI output (powering my PC monitor) and S-video (which I hooked up to the TV).

Clearly this is sub-optimal, and I would like to upgrade to the GeForce 8800GT which has just been announced. I see it has two DVI outputs, so I just need a DVI-to-HDMI converter or a cable with DVI on one end and HDMI on the other.

My question is: will I be able to have two signals as I do currently - 1920x1200 to my monitor through the DVI1 output and 1080p to the TV through the DVI2 output?

If so, this is great - does anyone know the answer for sure?

(Other specs: ASUS A8N-SLI mobo, Athlon 64 3000+, 1GB system RAM, GeForce 6800, BenQ 1920x1200 LCD monitor, Windows XP.) (Edit: I might upgrade to an Athlon X2 4000+.)
 
There's no way I'm buying another 8-series card with the Crysis demo performing so poorly. I seriously hope Nvidia really beefs up the ROPs, because current cards aren't fast enough to use antialiasing properly in Crysis. Then again, they don't really perform well without antialiasing either. 🙁
 


Crysis doesn't run poorly; it's the people who set every setting in Crysis to max quality. I remember when Oblivion came out, it too was stressful on the cards of the day. You can't always expect to run newer games at max settings even with high-end hardware; it's unrealistic to expect that.
 
Drivers will improve over time, don't get your knickers in a twist.

Besides, with the 8800GT out, when Tri-SLI is released the 8800GTX may get a big price drop, and 3 GTXs may then cost about what 2 GTXs do now - which loads of people have anyway. Though if I were rich and I wanted to game, there would be no reason why I wouldn't go with 3x XFX Ultra XXX editions. It's not meant for the faint-hearted; it's meant for the rich and for enthusiasts.

At 1920x1200 or higher, the scaling for Tri-SLI seems to be close to 2.8x in some cases - sorry, but that's awesome! Like greatgrapeape said, not many people use that high a res. I know I wouldn't, simply because screens that big are bad IMO for FPS games; it takes too long to move the mouse over to targets, even though I find higher-precision shots easier on my 27-inch. Besides, the profit made by selling one Ultra card is MUCH more than from selling a low-end card, so they can still make a lot of money by just making a few tweaks to their boards to run PCI-E 2.0 and Tri-SLI - in which case they sell tons more motherboards and tons more graphics cards, and get rid of their "old" 8800GTX stock as well, if that's their goal, lol.
 
I'm in a bit of a sticky situation with my Gigabyte N650SLI-DS4.

I'm planning on getting a pair of 8800 GTs and doing some SLI, but should I upgrade to an N680SLI first, or stick with this one?

Will I get 2x the power/speed from an N680SLI, or will my N650SLI handle both my 8800GTs with no bandwidth ceiling?
 
I think the review is somewhat incomplete and could be a bit misleading as a complete overview of the performance of the different cards. I would like Tom's Hardware to redo this review, but this time include the 8800 GTS 640MB in the mix. We all know there is a huge performance gap between cards with less RAM and cards with lots of RAM. The GT and GTS are essentially different chips, just as the GTX, Ultra, GTS, and GT are all different chips. But you're also comparing apples and oranges when you pit 320MB against 512MB.

Personally, I own an 8800GTS 640MB SLI setup, and I'm curious how the GT will perform against the GTS 640MB in both single-card and SLI setups, as well as against the other cards - ESPECIALLY at high resolutions like 1920x1200 and 1080p.

Please redo this review or do another review with the 8800GTS 640MB included.