HD2400/HD2600 VS 8600GTS

Oh, my bad, Miss "You Know Everything." I don't see anything that shows the 8800 has dedicated HD acceleration.

Guess you didn't see their launch information;
http://www.nvidia.com/page/8800_features.html

"The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity…"

"Hardware Decode Acceleration:
Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies. "


And oh, look: when not using specialized software, the GF8600 performs just like the GF7 and X1K series:
http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=379&Itemid=27&limit=1&limitstart=10

And I must be blind, seeing everyone talking about how AMD doesn't have UVD when it's supposed to have it.

Yeah, you, like all the knobs who missed it the first time around, missed that AMD said it supported AVIVO HD, not UVD; people read in the latter because of descriptions of UVD and AVIVO HD together. But some reviewers like EliteBastards got it right at launch; others didn't figure out their own mistake until weeks later:
http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=388&Itemid=31&limit=1&limitstart=6
"Functionality offered for both the mid-range and low-end on their new range of boards, with the processing power of R600 left to do much of the work on the Radeon HD 2900."

We've covered this already.

You must have a better source than I.

Seems pretty obvious now, doesn't it? :roll:
 
EliteBastards must be blind too, because I don't get past 15% CPU utilization with a 1080p movie. I guess Windows Media Player is specialized software?

It doesn't matter what you claim to get; what do you get on your G80, X1950, HD2900, and GF7900GS, and how do they compare?

So far you've been wrong.

The point is, there is HD acceleration of some kind on the G80, although like I said it may be slightly different from the GF8400/8600's; but then again, the HD2400/2600 differs from the HD2900 (and from the GF8600) in the same way.

And like the HD2900 and G80, the GF8600 needs software to make the most of its features, obviously, since it won't do better than the others in every situation, only in apps tweaked for it. So it's all still software dependent to some extent, but whether the shader-based HD2900 method, the dedicated UVD method, or the G80's method shows any difference to the driver is another story. Who knows, maybe the X1950 and GF7600GT are running at max themselves to reduce the load while the GF8600 is running at idle, but the impact on the CPU is about the same; and no one's tested the rest of the situations extensively enough, other than popping in encrypted discs, where the DRM decode makes the difference as much as the HD acceleration does.

Either way I don't think it's EB who's blind considering they bothered to check it out, whereas you missed the HD info in the G80 launch info.

Personally, I'm waiting for the good AVIVO and PureVideo reviews from people like Cleeve, Crashman, or A E D at FiringSquad; I respect their attention to detail for a larger picture than HQV tests, even the new BR version.
 
Once again, that shows that the GF8600 is also software dependent. So while we wait for drivers to turn the GF8500 from a video decelerator in XP into an accelerator, we'll have to wait for similar drivers for the HD2900 to offer its benefits too.

Thus confirming my previous statement:
"So it's all still software dependent to some extent, but whether the shader-based HD2900 method, the dedicated UVD method, or the G80's method shows any difference to the driver is another story."
 

srgess

What do you mean by software dependent? Drivers? Well, the 8800 has been out for a long time, and you're going to tell me they didn't figure out a driver to benefit from the HD acceleration? Same for ATI: the card may have been out only a few days, but the card itself was under construction for years, and that's all they can do? They can't do sh**.
 
What do you mean by software dependent? Drivers? Well, the 8800 has been out for a long time, and you're going to tell me they didn't figure out a driver to benefit from the HD acceleration?

How long did it take ATi and nV to get AVIVO and PureVideo working near their potential? They leapfrog each other every 6-9 months with upgrades. So considering how much software is involved, there's a long way to go before this story even begins to be explored, given their need for optimized apps to support them as well as better drivers.

Same for ATI: the card may have been out only a few days, but the card itself was under construction for years, and that's all they can do? They can't do sh**.

Did you even read Cleeve's review?

Based on the above, how do you explain the GF8500 being a video decelerator in XP, with the GF8600 doing so too in H.264? XP has been out for as long as either the G80 or R600 has been in the development stages on paper, yet the GF8500 and 8600 perform worse than software decoding in XP. So despite being out longer than the HD2900, having a sibling in the market, and having forever to work on drivers for XP, which is a standard rather than a moving target like Vista, they still couldn't get it to work.

So how does that link help your argument against the HD2900 and AVIVO-HD, which AMD says needs the software update to allow the shaders to perform the functions as described?

Also, explain to me how your analogy favours the GF8500/8600 over the other two options. Right now the G80 beats the GF8500 in three out of four of the situations explored by Cleeve, and versus the GF8600 it's one win, two ties, and one loss either way, and that's despite the supposed benefit of the new hardware acceleration.

Seems like they all have a long way to go when it comes to software integration/compatibility. And I expect that to come rather quickly now that HTPCs are the target, but it's still very early.

Once we have both solutions running without those glaring issues, then we can start comparing which one does what better/worse.
 

srgess

Once we have both solutions running without those glaring issues, then we can start comparing which one does what better/worse.

That's right, but I doubt AMD will find a solution soon; they are just too busy finishing Barcelona. Since you always find answers (this is not a graphics question): how is the BE-2350 processor (45W TDP) consuming more power than an X2 5000 at idle and nearly as much as an X2 3800 (65nm) at load, and performing worse than an X2 3800? Which I saw here: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3003&p=1
 

vic20


Okay, knock it off.

Are you going to use a 1.4 GHz mobile Celeron to play HD movies? Unless you have a laptop, who really gives a damn about H.264 acceleration on GPUs?

Does it bother anyone that their CPU runs at 80% instead of 20% utilization while watching HD video? If it does, you need your head examined. Watch your movies in the living room with the gf, not on your PC with a towel....

By the time BD-ROMs, HD-DVD drives, and most importantly recorders reach the magic sub-$100 range, Mickey Mouse video cards will have it.

Maybe watching a Bluray flick on a second monitor while you surf the net and encode video..... nm gimme a break. This is stupid.

Argue about something meaningful please :roll:
 
Okay, knock it off.

Are you going to use a 1.4 GHz mobile Celeron to play HD movies? Unless you have a laptop, who really gives a damn about H.264 acceleration on GPUs?

Uh, yeah, those are exactly the ones I'm considering it for; to be specific, two candidates, one for each VPU: the HP Dragon HDX or the new Apple 17" LED-lit MacBook Pro.

Does it bother anyone that their CPU runs at 80% instead of 20% utilization while watching HD video? If it does, you need your head examined. Watch your movies in the living room with the gf, not on your PC with a towel....

Well, remember a lot of people are considering these cards for HTPCs, which is currently the ONLY benefit of the GF8600. A lot of people are taking their old CPUs and putting them in these rigs, or using cool underclocked CPUs, so being able to remove the burden is nice. And anything that approaches 80%+ means that if another task uses system resources you will get stutters, freezes, or crashes. So it is an issue even if running just under 100% on a new Core 2 Duo desktop.
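To put rough numbers on that headroom point, here's a minimal sketch; the percentages are illustrative assumptions, not figures from any review linked in this thread:

```python
# Whatever the decoder leaves free is all the headroom background tasks get;
# once combined load passes 100%, frames drop. All percentages are assumed.

def headroom(playback_pct: float, background_pct: float) -> float:
    """Remaining CPU headroom; negative means stutters or dropped frames."""
    return 100.0 - (playback_pct + background_pct)

# Software decode on an older CPU vs. hardware-assisted decode:
print(headroom(80, 25))  # -5.0 -> a virus scan or indexer tips it over
print(headroom(20, 25))  # 55.0 -> plenty of slack left for background tasks
```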

I still prefer the idea of the HD2600 for laptops, especially for HDMI 1.2 over DVI, but I'm open to the GF8600GT (not the GS) as it's pretty competent for a laptop too, which is why I want the info first rather than people deciding the fate of yet-to-be-released products.

By the time BD-ROMs, HD-DVD drives, and most importantly recorders reach the magic sub-$100 range, Mickey Mouse video cards will have it.

I already have an Xbox 360 HD-DVD Drive, and I'm waiting for the LaCie to come down or LG to bring an external BR+HD-DVD recorder.

Maybe watching a Bluray flick on a second monitor while you surf the net and encode video..... nm gimme a break. This is stupid.

Argue about something meaningful please :roll:

If you don't see the benefits, that's fine, but dismissing the benefits is ignorant, just as much as pretending that any of the solutions are finalized yet. However, considering this is very relevant to some of us, and is a graphics card/chip issue, I don't understand your gripe.
 

vic20


My gripe is that you are being way too anal about something that won't be an issue for much longer.

Reminds me of when DVD-ROMs came out. You needed GPU acceleration or a dedicated decoder with a sub-500 MHz CPU. Once CPUs got into the 600+ MHz range, you could easily play DVDs with no assisted acceleration at all. Today you can play multiple 480p videos simultaneously with an old PCI 2D card if you wanted to.

We are already looking at cheap quads in the coming months. CPUs will soon make HD acceleration via GPUs a non-issue. History repeats itself. Companies like CyberLink will ensure it.

Background tasks causing stuttering in HD on a recent CPU? Now I know you are full of it...

[Image: HDPlaybackmultitask.jpg]


My old P4 tech bench machine. Yes that is HD video playing in Nero.
 
My gripe is that you are being way too anal about something that won't be an issue for much longer.

It will be an issue for laptops for a long time; even 20% versus 60% CPU load means a huge difference in battery life, as well as heat.
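As a back-of-envelope sketch of why that load difference hits battery life (the capacity and wattage figures below are assumptions for illustration, not measurements):

```python
# Estimated laptop runtime at a sustained CPU load. Assumed figures: a 60 Wh
# battery, ~15 W baseline draw (screen, chipset, drive), and a mobile CPU
# adding up to ~30 W at full load.

BATTERY_WH = 60.0   # assumed battery capacity
BASE_DRAW_W = 15.0  # assumed CPU-independent system draw
CPU_MAX_W = 30.0    # assumed CPU package power at 100% load

def runtime_hours(cpu_load: float) -> float:
    """Battery runtime in hours for a CPU load fraction between 0.0 and 1.0."""
    return BATTERY_WH / (BASE_DRAW_W + CPU_MAX_W * cpu_load)

print(f"20% load: {runtime_hours(0.20):.1f} h")  # ~2.9 h
print(f"60% load: {runtime_hours(0.60):.1f} h")  # ~1.8 h
```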

Reminds me of when DVD-ROMs came out. You needed GPU acceleration or a dedicated decoder with a sub-500 MHz CPU. Once CPUs got into the 600+ MHz range, you could easily play DVDs with no assisted acceleration at all. Today you can play multiple 480p videos simultaneously with an old PCI 2D card if you wanted to.

Yeah, however even those better CPUs were nowhere near as good as the dedicated solutions, like my Creative DVD drive that came with the daughtercard decoder.

We are already looking at cheap quads in the coming months. CPUs will soon make HD acceleration via GPUs a non-issue. History repeats itself. Companies like CyberLink will ensure it.

Yes it does, but most people are moving their older rigs into the HTPC role, so for the time being it will be necessary. Sure, it's not a big deal for future cores, but then again it won't be a big deal for them because they will have the VPU+CPU option anyway.

Background tasks causing stuttering in HD on a recent CPU?

Sure, it depends on the tasks and on what's being watched and done. Your image is really pretty, but it doesn't show anything about the stability of the video, nor its quality or compression method. It also doesn't show the priority level of your tasks; putting Nero and Norton in low-priority mode means little as a stress test, and considering your two threads top out simultaneously, it's far from efficient, nor even a good comparison of what would happen in a single-threaded situation, which is still the majority of secondary HTPC PCs for now, using old Athlons and P4s that couldn't take advantage of HT if they're running above 50% all the time. Also, what's the bit rate of your content? I doubt it's anywhere near the future 40+ Mb/s, so it's not like we're even at the end of the challenge from future titles yet, and your current rig's ability to handle them is in question as well.

Now I know you are full of it...

Yeah, unlike you, who keeps talking about quad-core uber rigs when we're talking about VPU assist at the midrange level. Like I said, the features of the HD2900 and GF8800 will matter little since they'll be slapped on big CPUs, and even at idle they consume more than the GF8600 and GF7900GS at full tilt. Why don't you post the results of a huge compressor-equipped phase-change rig? I'm sure HTPC owners would love that loud rig just so they can avoid upgrading their X300SE. :roll:

Once again, your statements are ignorant of and irrelevant to the market that looks at these features we're talking about, and the original criticism is pointless.
 

vic20

You're an @ss. My statements aren't ignorant, they are sarcastic but realistic. Yours are assuming, anal retentive and arrogant, and not just to me. You have a serious attitude problem. Which is why I posted in the first place.

Who is going to buy an 8600 and put it in their Athlon XP HTPC? Oh wait.....you can't....

So maybe those with HTPC rigs built out of old spare parts (i.e., an old P4 2.4) will buy ATi HD 2600 AGP cards if and when they come out. Then go out and spend another $200 on a 360 HD-DVD or $500 on a Blu-ray writer.

How many users is that? Come on. People slap together old stuff to have fun for free or for little money. These are machines playing below-standard-def downloaded videos and music, like a modded Xbox user's.

Seriously. Almost everyone serious about HD and using a PCI-e board capable of an 8600 in their HTPC would already have a halfway decent CPU. Pentium E2160s are what? Under a hundred bucks?

I don't know if you've noticed, but CPUs are getting cooler, quieter, consuming less power while getting cheaper and more powerful. This is what I was getting at. I'm sure you know that and are just being a pr1ck to make yourself look better than me.

Sorry I didn't use a really crappy chip, a high-priority background process, and a 40 Mbit HD video stream in my cap. I really didn't think someone would want to, or attempt, something so stupid.
 

Slobogob

the lowly single ROP cluster (like 4 traditional ROPs).

Which means to me that while it may or may not be 'better' than the GF8600GTS, it will likely still suffer when compared to the value of the previous generation's higher-end cards, especially the X1950Pro and X1950XT.
That's what I'm afraid of. Somehow AMD can't get the midrange right. When the 1xxx midrange showed up it was too expensive. And even now the 1650 XT is, at least in Europe, almost as expensive as the 1950 Pro. :?


I would wait until the HD2600 is tested better before deciding.

As for Crysis performance, I wouldn't expect the GF8600 to be all that great, but I wouldn't expect the HD2600 to be much better.

IMO the X1950XT is much more attractive for the money. Of course it consumes more power.

Overall, unless you NEED to upgrade now, I'd say wait for those summer refreshes like the RV670.

In building the 1950 Pro, AMD created a price/performance monster that keeps killing and maiming everything close by. I hope they make a nice profit with those things, yet I believe they will cut into the 2600 market, at least until the 256-bit variants are released.
And while AMD stated late summer, I honestly don't believe they will show up then. Given AMD's history I wouldn't even be surprised if we don't see them this year.
 
You're an @ss. My statements aren't ignorant, they are sarcastic but realistic. Yours are assuming, anal retentive and arrogant, and not just to me. You have a serious attitude problem. Which is why I posted in the first place.

And you're acting like a Fuktard; your statements are ignorant of the HTPC and laptop markets, which is obvious to anyone interested in either.

Who is going to buy an 8600 and put it in their Athlon XP HTPC? Oh wait.....you can't....

Who said anything about an Athlon XP? I said OLD Athlons, which includes the Athlon 64 2800+ through 3500+, which would be perfect candidates for these benefits and often find themselves in HTPCs built from old parts. As for the older Athlons, they are candidates for the AGP HD2400-2600.

How many users is that? Come on. People slap together old stuff to have fun for free or for little money. These are machines playing below-standard-def downloaded videos and music, like a modded Xbox user's.

You can try to characterize them like that, but other than HD these rigs are usually quite competent at everything else, and they were their owners' OC'ed gaming rigs before the launch of the C2D; now they're back to stock for cooler operation. And there are a lot of them out there; I've parted three of them together in the last year for people, and was planning on moving my old parts into one before I went all-laptop. There are a lot of members here who have such rigs.

Seriously. Almost everyone serious about HD and using a PCI-e board capable of an 8600 in their HTPC would already have a halfway decent CPU. Pentium E2160s are what? Under a hundred bucks?

Anyone 'serious about HD' already had standalone solutions before Xmas. :roll:
The market for HTPC builders is people who know the value of low CPU utilization and of being able to do many things in hardware, like audio.

I don't know if you've noticed, but CPUs are getting cooler, quieter, consuming less power while getting cheaper and more powerful.

CPUs don't get quieter, unless you're hearing the gates switching. :tongue:

This is what I was getting at. I'm sure you know that and are just being a pr1ck to make yourself look better than me.

I don't need that to make me look better than you; all I need is the fact that an old CPU running @ idle still consumes less power, and thus generates less heat, than a newer 'efficient' CPU running @ load. That's obvious to anyone who has HTPCs or laptops, but seems so foreign to you.

Like I said, the GF8800 and HD2900 don't matter because they both consume a ton of energy at idle, so for them the difference between hardware assist and software assist will be minor, and their benefit in CPU utilization will also be minor; but in a low-power, low-heat mid-range card/chip the overall effect will be big. If you can't see that then nothing will enlighten you.
 
That's what I'm afraid of. Somehow AMD can't get the midrange right. When the 1xxx midrange showed up it was too expensive. And even now the 1650 XT is, at least in Europe, almost as expensive as the 1950 Pro. :?

Yeah, I think we have to wait for the RV670 and a GF8800GS-class product to get a good replacement; and yeah, it's very similar to the X1600 situation last generation, which I've been saying since the GF8600 launched.

In building the 1950 Pro, AMD created a price/performance monster that keeps killing and maiming everything close by.

Yeah, and the cheap X1900XTs in the US didn't help either; to a lesser extent the GF7900GS is also killing the value of the GF8600, and likely the HD2600, in that anyone who already owns any of those three, or even a GF7600GT/X1650XT, will have a difficult time justifying a GF8600GTS or HD2600XT, and the same goes for the GS and GF8500.

I hope they make a nice profit with those things, yet I believe they will cut into the 2600 market, at least until the 256-bit variants are released.
And while AMD stated late summer, I honestly don't believe they will show up then. Given AMD's history I wouldn't even be surprised if we don't see them this year.

Yeah, that's the big question; no solid dates and a lot of speculation from both extremes of the timeframe. I suspect fall is a realistic timeframe, but we'll have to wait and see. Also, the idea of a Gemini/dual part seems to be getting a lot more traction, which personally I think is the wrong direction for that, but I'll reserve final judgement until we know whether it's more effective than the lackluster X1950 Pro DUAL card was.

I think whoever gets that upper mid-range product to market first at a reasonable price will win a lot of contracts and a lot of value-conscious, power-conscious, and performance upgraders. The GTS-320 is a great value performer, but it doesn't consume much less power than the other 8800s, or even the HD2900, for people with PSU constraints to consider it. A GF8800GS or HD2900Pro would be sweet.
 

Slobogob

Yeah, and the cheap X1900XTs in the US didn't help either; to a lesser extent the GF7900GS is also killing the value of the GF8600, and likely the HD2600, in that anyone who already owns any of those three, or even a GF7600GT/X1650XT, will have a difficult time justifying a GF8600GTS or HD2600XT, and the same goes for the GS and GF8500.
The 7900 is another good example. Last week I noticed a 99€ offer at some online shop. That's 20€ less than the already cheap 1950 Pro. It looks like some people are clearing out their inventory thanks to the 8600. 8O

Yeah, that's the big question; no solid dates and a lot of speculation from both extremes of the timeframe. I suspect fall is a realistic timeframe, but we'll have to wait and see. Also, the idea of a Gemini/dual part seems to be getting a lot more traction, which personally I think is the wrong direction for that, but I'll reserve final judgement until we know whether it's more effective than the lackluster X1950 Pro DUAL card was.
I liked the idea of that dual card. It's not really the same as NVIDIA's GX2, since ATI only took a mid-range chip. The pricing of these offerings is critical, though, and so far I haven't seen a single dual-chip solution that is worth its money.

I think whoever gets that upper mid-range product to market first at a reasonable price will win a lot of contracts and a lot of value-conscious, power-conscious, and performance upgraders. The GTS-320 is a great value performer, but it doesn't consume much less power than the other 8800s, or even the HD2900, for people with PSU constraints to consider it. A GF8800GS or HD2900Pro would be sweet.
Indeed! Looking at the hardware specs, the gap between the 8800GTS and the 8600GTS is too big. The 2600XT might or might not close it, but the refresh chips will; I'm quite sure of that!
 

vic20

I don't need that to make me look better than you; all I need is the fact that an old CPU running @ idle still consumes less power, and thus generates less heat, than a newer 'efficient' CPU running @ load. That's obvious to anyone who has HTPCs or laptops, but seems so foreign to you.

Like I said, the GF8800 and HD2900 don't matter because they both consume a ton of energy at idle, so for them the difference between hardware assist and software assist will be minor, and their benefit in CPU utilization will also be minor; but in a low-power, low-heat mid-range card/chip the overall effect will be big. If you can't see that then nothing will enlighten you.

Yeah. I'm sure an old AMD64 3000+ with an 8600/2600 video card would consume less power at idle (and run cooler/quieter) than a board with onboard component video and an EE chip at load.... :roll:

http://www.hardwaresecrets.com/fullimage.php?image=6714
 
Well, unless you show something a little more impressive, I'll go with the cool-running card and CPU combo outstripping the hard-running CPU, especially since the 690G you link to isn't that good with 1080p:
http://www.techreport.com/reviews/2007q1/amd-690g/index.x?pg=10

And that's a movie trailer, not HD-DVD or Blu-ray, so you can expect the CPU to work much harder on that integrated solution, if it can even handle the content at all without skipping, since most of those mobos carry a warning about not being able to play 1080p smoothly. So you can focus on individual portions, but your solutions don't combine all the benefits of the GF8600 and HD2600 for HTPC owners. And if people already have a GF8800 or HD2900 anyway, then it's a nice free bonus for them. So you're still either missing the issue or being obtuse.

Also, now you've got them buying a new processor, a new mobo, and new memory to replace that AMD64 3000+; that's a little much compared to a $90 graphics card, don't you think? C'mon!
 

vic20


I don't disagree. I've been playing devil's advocate because you love to argue, never admit you might be wrong about something, and love to dismiss those around you.

My system would consume less power and produce less heat, even at full load vs. your 3000+ and 8600/2600 system at idle.

Which is what I was commenting on and what you avoided.

Anyway, I'm getting bored of this.
 
My system would consume less power and produce less heat, even at full load vs. your 3000+ and 8600/2600 system at idle.

Got any numbers for that?

Which is what I was commenting on and what you avoided.

I didn't avoid anything; I said at the start of the section you quoted that you need to show me something more impressive to back up your statement.

Because your P4 isn't that great.

http://www.xbitlabs.com/articles/cpu/display/sempron-3000_9.html#sect0
http://www.xbitlabs.com/articles/cpu/display/core2duo-shootout_11.html#sect0
http://www.xbitlabs.com/articles/cpu/display/amd-energy-efficient_6.html#sect0

A minimum of a +50W difference on those CPUs (usually more, but let's say less, like 40W, for a 20% vs. 100% comparison), while the worst-case-scenario power-hungry GF8600GTS (not the GF8500) at its 2D max is ~30W, and the HD2400 will be much, much less thanks to 65nm:

http://www.xbitlabs.com/articles/video/display/geforce8600gts_7.html#sect0
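A minimal sketch of that arithmetic, using the rough figures above; the exact wattages vary by chip and review:

```python
# Net effect of offloading decode to the GPU: the CPU's load-vs-idle delta
# saved, minus what the GPU draws while doing the work. Rough figures only.

cpu_load_delta_w = 40.0   # conservative CPU draw increase, 20% vs ~100% load
gpu_decode_draw_w = 30.0  # worst-case GF8600GTS 2D max (per X-bit labs link)

print(cpu_load_delta_w - gpu_decode_draw_w)  # 10.0 W net saving, minimum case
# With the 50W+ deltas typical of those CPUs, or a 65nm HD2400 drawing far
# less than 30W, the net saving only grows.
```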

So really, like I said, show me something more compelling to support your statement. Your 60-80°C CPU is going to kick out more heat, period (laws of physics), and it will also run into temperature issues (different from heat, of course) because it's going to tax the thermal capacity of the HSF and the air-cooling capacity inside the rig, while the other system won't be taxed at all and will easily dissipate the heat, leading to lower temperatures all around.

Anyway, I'm getting bored of this.

I got bored with your first post, because it's obvious you don't understand the need, yet you wanted to jump in and criticise for personal reasons.