Redneck5439
Honorable
tea urchin :
The answers are all in this thread and on line. Meaningful numbers and a calculation?
I've gone over and over it. I'm a British trained Sparks,Dude. I'm BORED of showing how 60 watts per hour at 20 odd cents per unit = what it does. I'm sick to the gills of explaining the a.m.Dinosaur boards shag out cos of the extra ampage.
I should just start a pc building biz for el Barto's and explain 'but it will play f.o.4' _
Send Homer round when its knackered.
I can build him another bag of crap and waste another windows licence.
Everyone knows what you believe about the electricity used by an FX versus an Intel. EVERYONE knows how totally flawed your math is on the subject, and EVERYONE knows you are extremely "biased" toward Intel. We can talk about the cost of running a stock FX and a stock Intel all day long, but it won't help, because that extra dollar or less (US currency) a month on the power bill is really going to kick you in the pants ~10 years down the road. Of course, by then just about everyone will have upgraded or be looking at their next upgrade, as parts do wear out.
The newest, best Intel i7 6700K is a 91 W chip. The FX 8370 is a 125 W chip. The difference in TDP is 34 W, about half that of a 60 W light bulb. Would anyone in their right mind refuse to turn on a light bulb because of its horrendous power consumption? You know how asinine you would sound yelling at someone, "Don't turn that light bulb on, I can't afford the extra power on my electric bill"?
Let's not forget, either, that whether it's Intel or AMD, the very first thing people do when they get that shiny new processor is see how far they can overclock it. Will an overclocked processor use more power? Yes. Will an overclocked processor jack your bill up over $20 a month? Never. Even overclocked, a processor doesn't use enough power to really affect your power bill; at most an overclocked processor (be it Intel or AMD) will cost $5 a month more. An overclocked AMD (FX 8-core) will typically cost about $1.30 more per month than an Intel for a computer running 5-8 hours a day. The whole idea that a processor uses so much power that it can't be a good option for a budget-minded home PC user is just totally ridiculous, but by all means keep pounding your nonsense view into the dirt, since you can't come up with a real answer as to why AMD doesn't make a good budget build.
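For anyone who wants to check the arithmetic, here's a quick back-of-envelope sketch. The 20¢/kWh rate comes from the quoted post; the 8 hours/day of use and the assumption that the chips run at full TDP the whole time are worst-case guesses on my part, so real-world numbers will be lower:

```python
def monthly_cost_usd(extra_watts, hours_per_day, cents_per_kwh, days=30):
    """Extra electricity cost per month for a given extra power draw."""
    extra_kwh = extra_watts * hours_per_day * days / 1000  # Wh -> kWh
    return extra_kwh * cents_per_kwh / 100                 # cents -> dollars

# TDP gap between an i7 6700K (91 W) and an FX 8370 (125 W)
gap_watts = 125 - 91  # 34 W

cost = monthly_cost_usd(gap_watts, hours_per_day=8, cents_per_kwh=20)
print(f"${cost:.2f} per month")  # about $1.63 even at constant full-TDP draw
```

Even with everything stacked against the FX (full load, 8 hours every day, a fairly high electricity rate), the difference is under two dollars a month, which is in the same ballpark as the $1.30 figure above.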
It is totally obvious that you are trying very hard to start a "flame war," and at this point you are trolling. Calling AMD junk (direct quote: "a.m.Dinosaur junk") is trolling, and it will only result in yet another flame war of the kind that has already played out on forums more times than can be counted.