GeForce 8600: DirectX 10 For The Masses

You say you bought a card that was 1.5 gens behind (roughly: the X1900 was a different core than the X1800, but not a "full" generation apart) to play a 2-year-old game, wow. You are still way off the curve. (Budget may force you to this, but that does not mean it is the "best" way to go.)
Excuse me but I don't get your point: what do you mean by "way off the curve"?

Sure, if you are having problems, that is the first thing you do to debug any issue... duh. Once the issue is found, go back to business as usual. I did not say to encode video while gaming (H.264 is super intensive on the CPU, and most encoders use multiple cores, so that is a dumb idea anyway), but I have gotten in some quick online matches of CoH and UT2k4 while burning CDs... Others run Folding@home or other apps while gaming. Dual cores free you up for that on a game that only uses one core.
Well, on a single core you can also burn CDs and run apps that require little CPU power, and it won't have a significant impact on your framerate, so, so much for going to dual-core. If running other apps does have an impact, it can also be the result of more I/O on the hard drive or RAM usage, and dual-core doesn't help with that. So seriously, X2s were just marketing, for gamers that is.
 
Excuse me but I don't get your point: what do you mean by "way off the curve"?

I mean just that... you are not on the leading edge, the front of the curve... whatever term you want to use. You were nowhere near the top-performing cards of that time. It was not a slam, just that your budget did not allow (or you did not want it to allow) you to get the best of breed. You were basing performance on old games, so that old card did what you wanted it to. Running newer games hit it much harder.

Well, on a single core you can also burn CDs and run apps that require little CPU power, and it won't have a significant impact on your framerate, so, so much for going to dual-core. If running other apps does have an impact, it can also be the result of more I/O on the hard drive or RAM usage, and dual-core doesn't help with that. So seriously, X2s were just marketing, for gamers that is.
You are totally missing my point, man... and arguing with a wall, I suppose. I am saying that you can run MANY apps while gaming. That disc burning includes some encoding for a particular format (Video CD, for example), plus all the things running in the background (like Xfire, Folding@home, SpeedFan, key-loggers, spyware, rootkits, trojans... whatever).

You are whining about an issue from 2 years ago that you seem to think is still relevant today... give it up and admit that everyone is moving to dual core, and those of us who have are seeing tangible benefits that, while not directly a result of games "seeing" two cores, are a result of having that extra processing power for any number of things.

Whether your hobby is computing in general, PC gaming, CAD or crunching code... more power = teh good! That is a rule that will always hold.

If you are only a passer-by, then you will never understand, and you should not whine about those who do... get a console or go weave some baskets and just be happy. :)
 
Also, dual cores are not marketing. FPS ain't everything. A second core can also prevent stuttering, which will happen on a single core if it gets too busy at certain points when background app usage spikes.
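
If you want to force that split yourself instead of trusting the Windows scheduler, here is a rough Python sketch using the third-party psutil library (the background process name is a made-up placeholder):

```python
# Rough sketch, assuming the third-party "psutil" library is installed.
# Idea: pin background work to the second core so a single-threaded
# game keeps core 0 to itself when the background load spikes.
import psutil

BACKGROUND_CORE = 1  # second core of a dual-core CPU; core 0 stays for the game

for proc in psutil.process_iter(["name"]):
    # "burner.exe" stands in for whatever you run in the background
    if proc.info["name"] == "burner.exe":
        proc.cpu_affinity([BACKGROUND_CORE])  # keep it off the game's core
```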

Good point, stranger, I have seen this happen...

I missed that one 8)
 
I'll never use a single-core CPU again (in my main PC, anyway). The system is a lot smoother, and even though the games I play are not multi-threaded, it still uses more than one CPU when playing games.
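
You can actually watch that happen. Here's a minimal Python sketch (again assuming the third-party psutil library) that prints per-core load once a second while a game runs:

```python
# Minimal sketch assuming the third-party "psutil" library: sample
# per-core CPU load once a second to see the OS spread a game plus
# background tasks across both cores.
import psutil

for _ in range(10):  # watch for about ten seconds
    loads = psutil.cpu_percent(interval=1, percpu=True)
    print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(loads)))
```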

Some users still think single-core CPUs are faster than dual-core. In the past that may have been right, but you can't get a Core 2 Duo as a single core, so no single core can beat it. (Quad core is still overkill unless maybe Supreme Commander is involved, or programs that use more than 2 CPUs.)

The X1950 is not a DX10 card; it cannot run DX10 games, only DX9. Yet someone posted comparing an 8800 and an X1950 with DX10. (A DX10 card will wipe the floor with DX9 cards, unless it's an 8600, which might be slower.)
 
The X1950 is not a DX10 card; it cannot run DX10 games, only DX9.

And not only are there no DX10 games yet, there won't be any DX10-and-above-only games for at least a year, if not two.

Right now the GF8600GT being DX10 doesn't make it better than an X1950XT if they're selling at the same price.

Like I mentioned before I think there is a benefit to DX10, but for these cards I'd say it's worth about $20 in my pricing scheme.

And if someone is serious about a DX10 future, then the GF8800GTS is the minimum consideration IMO.
 
I mean just that... you are not on the leading edge, the front of the curve... whatever term you want to use. You were nowhere near the top-performing cards of that time. It was not a slam, just that your budget did not allow (or you did not want it to allow) you to get the best of breed. You were basing performance on old games, so that old card did what you wanted it to. Running newer games hit it much harder.
Of course, that's my basic initial point you're just restating. That's just another way of saying "I'll play Crysis when there's a $200 card that plays it perfectly." I mean, you pay not for the technology being what it is; you pay for it being what it is now, i.e., the top end. My point is, you save a lot of money and trouble by living 1-2 years in the past. Half-Life 2 is the same game it was upon its release, it's just as freaking good, and it runs flawlessly on cheap hardware.

You are totally missing my point, man... and arguing with a wall, I suppose. I am saying that you can run MANY apps while gaming. That disc burning includes some encoding for a particular format (Video CD, for example), plus all the things running in the background (like Xfire, Folding@home, SpeedFan, key-loggers, spyware, rootkits, trojans... whatever).
So OK, earlier you said video encoding was a dumb idea, but now you're suggesting doing it again :roll:. And anyhow, EVEN with a dual-core processor you'll want to minimize that kind of background activity to ensure the best possible performance; a spyware scan, for instance, even if its CPU cost were absolutely free, implies a lot of hard drive activity, which WILL be the major source of stutters in-game.

You are whining about an issue from 2 years ago that you seem to think is still relevant today... give it up and admit that everyone is moving to dual core, and those of us who have are seeing tangible benefits that, while not directly a result of games "seeing" two cores, are a result of having that extra processing power for any number of things.
The Core 2 Duo is just plain faster in games and everywhere, and I could hardly care less whether that's because it's dual-core or not: it's a faster processor. But the X2 in 90% of game benchmarks was NOT faster, so the selling argument was not performance, unlike the C2D; it was just "two CPUs lolz". Which is why the issue is not the same today. I don't care what makes CPUs faster as long as they prove to be so; in the case of the X2 there was only the theoretical feature and little performance.
 
Linky no worky !! :?

Folks, try this one maybe:
http://www.tomshardware.com/2007/04/17/geforce_8600/

And seriously, WTF!?! Only 3DMark05, Doom 3, FEAR, and Oblivion?

And at least publish the composite sub-scores of 3DMark05!!

Oohh, a final bungholiomark score on a card known to have exaggerated 3DMark results!?! C'mon! :roll:

If you guys need more software, let me know; I can send you a few titles.

At least they finally stopped using Quake3 as a benchmark.
 
The X1950 is not a DX10 card; it cannot run DX10 games, only DX9.

And not only are there no DX10 games yet, there won't be any DX10-and-above-only games for at least a year, if not two.

Right now the GF8600GT being DX10 doesn't make it better than an X1950XT if they're selling at the same price.

I'm not brazen enough, like so many, to say that I "know" this will happen. But I'm wagering on the likelihood that by the time DX10 is so widespread that the 8600/8500's support really makes them pull away from their current competition, ATI and Nvidia will have a better card, or two, at the same price point, making adoption now insignificant. And that otherwise even non-native cards (i.e., DX9 cards like the 79xx and X19xx series) will still be sufficient by comparison to what the 86xx and 85xx are doing. With that in mind, if I still had a budget that placed me way too far below the 320MB GTS, I'd still comfortably get a DX9 card for the same short term as a "midrange" DX10 one now. I'm not saying nVidia is dumb or anything; they did exactly what they should do. But I am saying that the DX10 compatibility seems much ado about nothing.
 
To me, the DX10 benefit/option is currently worth about a $20-25 premium over a similar DX9 card in that class, because of its potential use for those cards.

So, seeing it usually in the X1950GT range and at best the X1950Pro range ($135), I add $20 to that and think the GF8600GT should be priced closer to $150-160.

If they were selling for $150 then I'd definitely recommend them.
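
Spelling the arithmetic out, here's a toy Python restatement of that rule (these are just the ballpark street prices from this thread, not official numbers):

```python
# Toy restatement of the pricing rule above; the $135 X1950Pro street
# price and the ~$20 DX10 premium are the ballpark figures from this
# thread, not official numbers.
x1950pro_street_price = 135  # best-case DX9 competitor, per the post above
dx10_premium = 20            # what DX10 support is worth today, IMO

fair_8600gt_price = x1950pro_street_price + dx10_premium
print(f"A GF8600GT is worth about ${fair_8600gt_price}")  # -> about $155
```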
 
I should have been clearer, but I meant my post to be in agreement with yours, so I hope you didn't get the idea that I was calling you out.
Oh and... "Beagle Beagle"
 
No, I didn't get that feeling; I was just explaining my position. It's not a terrible card, just not what we hoped for or were led to believe, and a tough sell considering its price competitors.

It's all good man!

BTW, it's Beegle Beagle, but props for knowing !! 8)

 
To me, the DX10 benefit/option is currently worth about a $20-25 premium over a similar DX9 card in that class, because of its potential use for those cards.

So, seeing it usually in the X1950GT range and at best the X1950Pro range ($135), I add $20 to that and think the GF8600GT should be priced closer to $150-160.

If they were selling for $150 then I'd definitely recommend them.

I must be missing something here... I've just read the review and this entire post and the numbers just aren't adding up.

Here (in Australia), the 8600GT is $180, comparable in price to the X1950GT. Performance between the two would presumably be close, but as it wasn't included, it's hard to determine. I'd say that it (the 8600GT) would perform on par with the 7900GS (also not included; clearly this review was centered around the 8600GTS), which in turn is similar to the X1950GT. So from a realistic POV, the 8600GT gets my vote: similar performance, similar price, plus DX10, and possibly less power/heat/noise.

As for the 8600GTS, it would certainly appear that it's not worth the PCB it's printed on. The X1950Pro/XT craps all over it, and the 8600GTS carries the higher price tag to boot.

So, if you were looking at getting the 7600GT, would it be safe to assume that the 8600GT is a welcome upgrade, for very little $ change?

And if you had a slightly higher budget, you would choose the X1950Pro/XT instead?

Would be interesting to see the VGA charts updated, as well as a new price/performance chart. Not everyone is here to purchase the biggest, baddest GPU available. :)

My bad: the price difference between the 8600GT & 7600GT here is about $50 now... guess the comparison between them is a bit off. Still, my points are valid. :roll:
 
No, I didn't get that feeling; I was just explaining my position. It's not a terrible card, just not what we hoped for or were led to believe, and a tough sell considering its price competitors.

It's all good man!

BTW, it's Beegle Beagle, but props for knowing !! 8)
It's easier to swallow if you think of the 8800 GTS 320 as midrange, the 8600 GT as budget-end, and the 8300 as integrated.
 
Of course, that's my basic initial point you're just restating. That's just another way of saying "I'll play Crysis when there's a $200 card that plays it perfectly." I mean, you pay not for the technology being what it is; you pay for it being what it is now, i.e., the top end. My point is, you save a lot of money and trouble by living 1-2 years in the past. Half-Life 2 is the same game it was upon its release, it's just as freaking good, and it runs flawlessly on cheap hardware.

Fair enough; the only reason I responded was that if you want to be 2 years behind on HL2 (which does still look good, in all fairness), that is fine, but playing that while a friend plays Crysis is not for me... when I see it, I want it... to each his own.

So OK, earlier you said video encoding was a dumb idea, but now you're suggesting doing it again :roll:. And anyhow, EVEN with a dual-core processor you'll want to minimize that kind of background activity to ensure the best possible performance; a spyware scan, for instance, even if its CPU cost were absolutely free, implies a lot of hard drive activity, which WILL be the major source of stutters in-game.
No, earlier I said encoding H.264 was dumb, as it hits the CPU very hard. Encoding something lower quality is fine... like, say, a Video CD. 😉
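
For the curious, here's roughly how I'd kick one off from Python at low priority so it stays out of a game's way (this assumes ffmpeg is on your PATH; the file names are placeholders):

```python
# Rough sketch: start a Video CD (MPEG-1) encode at low priority so it
# yields CPU time to a running game. Assumes ffmpeg is on the PATH;
# input.avi / output.mpg are placeholder file names.
import subprocess
import sys

encode = ["ffmpeg", "-i", "input.avi", "-target", "pal-vcd", "output.mpg"]

if sys.platform == "win32":
    # Run the encoder below normal priority so the game wins CPU contention
    subprocess.Popen(encode, creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS)
else:
    # On Unix-likes, "nice -n 19" gives the encode the lowest priority
    subprocess.Popen(["nice", "-n", "19"] + encode)
```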

The rest of that list was sarcasm... I did not say a spyware scan, but literally spyware (and rootkits and trojans and...). It is tough to get sarcasm across in a forum, but I figured that if you missed the spyware bit, you would get it with the rootkit comment. I was wrong. No harm, no foul. 8)

The Core 2 Duo is just plain faster in games and everywhere, and I could hardly care less whether that's because it's dual-core or not: it's a faster processor. But the X2 in 90% of game benchmarks was NOT faster, so the selling argument was not performance, unlike the C2D; it was just "two CPUs lolz". Which is why the issue is not the same today. I don't care what makes CPUs faster as long as they prove to be so; in the case of the X2 there was only the theoretical feature and little performance.
The X2 was faster in games than a P4 when it came out (Socket 940 and 939... 754 was roughly equal). Still is, I believe... :)

Sure, it was not as big a gap as the Core 2 currently owns, but it was a win.

Regardless, this is an old argument that is boring and irrelevant now. Moving on. :arrow:
 
A mid-range DX10 card is nice and all, but when in the hell are the mid-range cards going to adopt a 256-bit interface, since the high end is moving beyond 256-bit?

I totally agree with you on this one. I mean, the 8-series mid-range should have been 256-bit at least. I guess they're probably not taking their competition seriously. I really hope a GeForce 9 is in the works right now, to be released after the R600 comes out, because as of right now the 8-series seems to suck big time. Even the old 7950 could still do the trick for most games right now.
Heck, the 7-series in the PlayStation 3 could even play Crysis and Oblivion with HDR and anti-aliasing enabled. The 8-series just seems to be the card Nvidia released to enable HDR and anti-aliasing for current games on the PC.
Don't get me wrong, I really like Nvidia, but they really need to step it up a notch with the 8-series.
 
*sigh* I was really looking forward to the 8600s, but they don't seem to be worth buying. I guess I will wait until ATI's cards come out, then decide which card to purchase. I hope the 8800's price goes down.
 
The target area for these is $150 to $250. I agree with the ape: IF these cards are in that segment (actually only the GTS), then it should be at the low end, or $150. This segment is where they make the most of their money, sell the most, etc. Selling in this segment with this low of a product means they're trying to get all they can for as little as possible. Time will tell if ATI/AMD will do the same.
 
*sigh* I was really looking forward to the 8600s, but they don't seem to be worth buying. I guess I will wait until ATI's cards come out, then decide which card to purchase. I hope the 8800's price goes down.

You are right; I will wait until the 8800 GTS price falls and the first DX10 game arrives, because who knows the performance of Crysis and UT2007 on the 8800.