GeForce 8600: DirectX 10 For The Masses

That's the point... we all have NO idea. This article was labeled "DX10 Cards for the masses" not "8600 Cards for the masses", so why base your entire testing process around DX 9.0c??

It's still a DX10 card; since there are no DX10 apps to test it on other than nV demos, you test it on current DX9 titles. It's still a DX10 card, blame M$ and the devs for poor DX10 implementation. So regardless of your semantics, DX9 is the area these cards currently compete in until there are actual titles in stores or patches on sites.

If you guys don't want to be "beta testers" for MS with Vista, then you would be stupid to go out and spend the money on a DX10 card anyhow.

Except the GF8800 dominates everything else in DX9 and will of course beat them in DX10, since DX10 isn't even an option for the others; in this case it's a question of price/performance. It's still fine to spend your money on this card, because it can play old, current and future titles; it's just not worth the current MSRP.

There are much better deals on 79xx cards to be had.

Yes there are, but there's little point in getting them, because dollar for dollar right now the X19xx series outperforms them.

Let's see some comparison marks with an 8600 on Vista and XP playing CoH with DX9, and CoH with the DX10 patch... then we can judge for ourselves.

You could if the official CoH patch were out of alpha/beta testing; it's still only 1.5 on THQ's site (at this time), 1.6 was just announced (still in dev/testing), and DX10 doesn't come until patch 1.7.
FSX's patch isn't out of beta yet either, so what mythical DX10 games are out there that are publicly available?

Quit yer moaning and get a clue.
Informed criticism is valid; yours, on the other hand, is moaning about the state of the industry, not this review.
All I hear is LAME. Lame article, Lame 8600 card, Lame 32 pipes, Lame 128 bit bus, Lame DX10 implementation, Lame Vista. LAME LAME LAME. I guess you CAN play DX10 Chess on Vista Home Premium?
I can't be too harsh; the 9700 Pro was way ahead of DX9 and no one complained.
 
There was a ton of 8600 info leaked to the web way before its official release.
What about AMD? How come we're not seeing much new HD 2x00 info being released? Aren't they scheduled for a 4/23/07 release?
 
What is THIS BS in the review:

" .. it does not seem suited for those looking for good image quality accompanied with high frame rates. While it can produce 62 frames a second at a resolution of 12x10 with AA enabled in Doom 3, the card can only produce 64 frames with trilinear filtering and no antialiasing."

What does this mean? He is trying to say something, but I'm not getting it...

bi-linear sample, 4x AA
tri-linear sample, 4x AA
8x AF, 4x AA

WTF is "image quality enabled"? What does this mean? Is trilinear something important? I was told anisotropic filtering was superior.
 
Seems the very best of them is around 800 / 1250(2500).

Which is only about +75 core /+100 memory boost from the GTS XXX and about +100/+200 from stock.

Now I'm not sure if any of them are shipping with GDDR4 yet, I haven't seen them, but if you could get another 100-200 (200-400) MHz out of the memory, it would likely make a huge difference.
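To put some napkin math behind that (just a rough sketch; the 128-bit bus and the DDR doubling are from the specs discussed in this thread, and the 1450 MHz figure is a made-up "what if GDDR4" number):

# Peak memory bandwidth = effective clock x bus width / 8
BUS_WIDTH_BITS = 128  # the 8600's much-maligned memory bus

def bandwidth_gb_s(base_clock_mhz):
    effective_mhz = base_clock_mhz * 2  # GDDR is double data rate
    return effective_mhz * 1e6 * BUS_WIDTH_BITS / 8 / 1e9

for clock in (1000, 1250, 1450):  # stock, best OC seen, hypothetical GDDR4
    print(f"{clock} ({clock * 2} effective) MHz -> {bandwidth_gb_s(clock):.1f} GB/s")
# 1000 (2000 effective) MHz -> 32.0 GB/s
# 1250 (2500 effective) MHz -> 40.0 GB/s
# 1450 (2900 effective) MHz -> 46.4 GB/s

So every memory overclock translates directly into bandwidth on a bus this narrow, which is why faster RAM would matter so much here.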

edited (bad math), d'oh! 😳
 
Well I guess it's for the best. The people who can't afford the better cards still need something, but I think NVIDIA is cramping the performance too much.

Dahak

AMD X2-4400+@2.6 TOLEDO
EVGA NF4 SLI MB
2X EVGA 7950GT KO IN SLI
4X 512MB CRUCIAL BALLISTIX DDR500
WD300GIG HD/SAMSUNG 250GIG HD
ACER 22IN WIDESCREEN LCD 1600X1200
THERMALTAKE TOUGHPOWER 850WATT PSU
COOLERMASTER MINI R120
3DMARK05 13,471
 
WTF is "image quality enabled"? What does this mean? Is trilinear something important? I was told anisotropic filtering was superior.


"Image quality enabled" implies that anisotropic filtering is better than plain bi/trilinear, but the card can only play decently with trilinear... this means that the card cannot run with high image quality settings (like AF) turned on.

I agree that it is worded poorly, but I think what you are saying is the same thing.
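Just to spell out the arithmetic (a quick sketch using only the two framerates the review quotes; the interpretation at the end is one way to read the reviewer's point):

# Doom 3 framerates at 12x10, straight from the review
fps_trilinear_no_aa = 64
fps_with_aa = 62

aa_cost = (fps_trilinear_no_aa - fps_with_aa) / fps_trilinear_no_aa
print(f"Enabling AA costs about {aa_cost:.1%}")  # ~3.1%
# If disabling AA only buys ~2 fps, the card is limited elsewhere,
# which is why piling on more image quality (AF, etc.) isn't going to fly.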
 
Well I guess it's for the best. The people who can't afford the better cards still need something, but I think NVIDIA is cramping the performance too much.

Wait until the X2600's come out, then you should see the 8600's price drop to something more reasonable... although everyone in this thread will probably have grandkids by the time AMD release a new card 😛

Will be interesting to see how the X1950XT stands up in a DX10 title against the 8600.
 
Will be interesting to see how the X1950XT stands up in a DX10 title against the 8600.

And therein lies the great unknown... until we see that, we have no idea if the 8 series are FX cards or "real" GeForce cards... 😉
 
Folks, we have been listening to this DX10 crapola for a YEAR. I am beginning to think that PC gaming and the video card era have now peaked. Maybe consoles are the future, after all.
I know, it's annoying. Same phenomenon as dual-core processors when the Pentium D and Athlon X2s came out: I think I heard 3245 times that although there were no measurable benefits from dual-core now, there would be in some kind of near future, and people with single-core would be let down. Well, two years later we have finally seen one game to back up this statement: Supreme Commander. And there are new, much faster processors (the Core 2 Duo) to replace those Pentium Ds and X2s now, so the kind of gaming enthusiast who paid a premium back in 2005 for dual-core has probably upgraded by now and never actually used the second core on his Pentium D or X2.

I'll play Crysis when there's a $200 video card that can play it perfectly. There's no rush, plus by that time there will be all the free strategy guides, patches, mods, etc.
 
wow, don't be optimistic or anything... surely don't sugar-coat it, tell us how you really feel.

😉


Honestly, even "way back when" the dual cores hit, there was a tangible benefit from having 2 procs. Even on games that did not "support" it, you could still run more stuff in the background while gaming. Still can. Very nice.


But your frustration over hardware tech moving faster than software is an echo of times past. This has always been the case, and likely will not change. Software dev cannot be fully realized on vapor hardware. It must be done on the actual hardware. This means you can't do much until you have the item.

If it takes years to dev large projects then I am glad we have come this far in such a short time. 8)
 
I'll play Crysis when there's a $200 video card that can play it perfectly. There's no rush, plus by that time there will be all the free strategy guides, patches, mods, etc.

By that time, Crysis will be in the discount bin and the latest games may not run at all on what we have today, while the hardware will be so far advanced that an Nvidia 8800 will be like an ATI 9800 Pro is now.
 
By that time, Crysis will be in the discount bin and the latest games may not run at all on what we have today, while the hardware will be so far advanced that an Nvidia 8800 will be like an ATI 9800 Pro is now.
I don't think so. In 2004, Doom 3 and HL2 were released and the X800XT was the dream GPU at $500; in 2006 I bought an X800XL for $140 and it plays both these games flawlessly. Even FEAR runs extremely well; I am very pleased. At that time the best card was the X1900XTX, so I was only one generation late, not 3 like you're saying.

Honestly, even "way back when" the dual cores hit, there was a tangible benefit from having 2 procs. Even on games that did not "support" it, you could still run more stuff in the background while gaming. Still can. Very nice.
Yet even when people with dual-core processors complain about slow framerates, the first advice they get is to turn off any useless background processes. And honestly, who needs to encode H.264 video while playing Oblivion :roll:
 
I don't think so. In 2004, Doom 3 and HL2 were released and the X800XT was the dream GPU at $500; in 2006 I bought an X800XL for $140 and it plays both these games flawlessly. Even FEAR runs extremely well; I am very pleased. At that time the best card was the X1900XTX, so I was only one generation late, not 3 like you're saying.
You say you bought a card that was 1.5 gens behind (? the 1900 was a different core than the 1800, but not a "full" gen different) to play a 2-year-old game, wow. You are still way off the curve. (Budget may force you to this, but that does not mean it is the "best" way to go.)

Yet even when people with dual-core processors complain about slow framerates, the first advice they get is to turn off any useless background processes. And honestly, who needs to encode H.264 video while playing Oblivion :roll:
Sure, if you are having problems, that is the first thing you do to debug any issue... duh. Once the issue is found, go back to business as usual. I did not say to encode video while gaming (H.264 is super intensive on the CPU and most encoders use multi-core, so that is a dumb idea anyway), but I have gotten in some quick online matches of CoH and UT2k4 while burning CDs... Others run Folding@home or other apps while gaming. Dual cores free you up for that on a game that only uses one core.
 
Seems the very best of them is around 800 / 1250(2500).

Which is only about +75 core /+100 memory boost from the GTS XXX and about +100/+200 from stock.

Mmm...

Stock: 675/1000
"The very best of them": 800/1250
Result: +125/+250

Keep editing Ape! :wink:
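For the record, here's the math in one place (a sanity-check sketch; the stock and best-OC clocks are the ones quoted above):

stock   = {"core": 675, "mem": 1000}  # 8600 GTS stock clocks, MHz
best_oc = {"core": 800, "mem": 1250}  # "the very best of them"

for part in stock:
    delta = best_oc[part] - stock[part]
    print(f"{part}: +{delta} MHz ({delta / stock[part]:.1%})")
# core: +125 MHz (18.5%)
# mem: +250 MHz (25.0%) -- i.e. 2500 effective after the DDR doubling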
 
In the last 2 generations of nVIDIA cards, the mid-range was really good. It stood up remarkably well against the previous high-end.

The 6600GT against the FX5900 Ultra, the 7600GT against the 6800 Ultra.
Excellent performers.

I was expecting more from the 8600GTS, especially with those rumors of it being a 64-shader GPU with 256-bit memory. If it had come with 64 SPs and 256-bit memory, its performance would have been something like the 7900GTX and X1950XTX (if not more).

32 SPs and 128-bit memory are not enough. I'm seriously thinking about saving some more money for an 8800GTS 320.
 
That is why I made the possible comparison to the FX-gen cards. The Ti4200 and 6600GT were insane deals at the time for the money. The FX5600 was not. Granted, the FX gen did great on DX8 (the previous gen, like DX9 is now) but sucked at the "next gen" DX9 (like DX10 is now, maybe), and the midrange of the FX REALLY sucked.

Nv totally came back on both the high end and midrange with the 6 series... and everyone was hoping for a repeat of that (the 7 series was good too, btw), but with the midrange seemingly missing this time, I hope the whole 8 series is not a repeat of the FX bomb.


Just some thoughts... :)
 
It's that stupid 128-bit interface. The last card from Nvidia that should have had that was the GF 7600GT; it's just a major loss. Well, let's see what ATI spits out, they are still without a contender.
 
Seems the very best of them is around 800 / 1250(2500).

Which is only about +75 core /+100 memory boost from the GTS XXX and about +100/+200 from stock.

Mmm...

Stock: 675/1000
"The very best of them": 800/1250
Result: +125/+250

Keep editing Ape! :wink:

Well, my edit was for the original 1250 (2400) error, since I originally put 1200, but I noticed I had seen a low core / high memory result in another one, so I originally changed the base memory; the 'about' covers the area of overclocking in 'general'. 😛
 
some pricing info


http://www.ewiz.com/detail.php?name=XFX-86SEXM&src=Deal
Thanks for the pricing.
But I will NEVER get an 8600 as long as I can get an X1950XT for the same price.
I hope things change soon.
I myself (like most) wished the 8600 was 64 pipes / 256-bit, etc.
 
