Last week, I issued a challenge to dvdpiddy to see if he could actually stop posting for an entire weekend. In return, I would have to tell everyone how much the Intel NetBurst technology sucks. So, I decided to conduct a few tests of my own for comparison. This comparison is between two machines, both of which are currently available.
First off, I ran some simple tests, but before I did, I set my Prescott back to stock, which is 3GHz. My Opty is currently running at 2.5GHz, which still gives it a 500MHz disadvantage.
The first thing right off the bat is the temps. The Opty runs at 28C idle and has never gone over 36C at load. My Intel runs at 33C idle and hits around 50C under load. I do want to note that both machines are in identical Chieftec Dragon cases, and both use the same case fans. Both coolers are Zalmans, but different models. Winner of this test: Opty
http://photobucket.com/albums/f16/Luminaris/630%20Screens/?action=view&current=IntelStockTemps.jpg
http://photobucket.com/albums/f16/Luminaris/Opteron%20Screens/?action=view&current=OpteronCurrentSpecs.jpg
The second thing I did was time the startup of both machines, from the moment I hit the power button until I could actually start using applications. I did this three times consecutively with each machine. The Intel averaged 29 seconds while the Opteron averaged 15 seconds. Winner: Opty
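For anyone who wants to repeat the boot-timing comparison, the math is just a mean over the three runs. A quick Python sketch of that averaging (the per-run readings below are hypothetical placeholders, since I only kept the final averages):

```python
# Average repeated stopwatch readings from boot tests.
# The sample readings below are hypothetical placeholders;
# I only recorded the final averages (Intel ~29s, Opteron ~15s).

def average_boot_time(times_seconds):
    """Return the mean of a list of boot-time measurements in seconds."""
    return sum(times_seconds) / len(times_seconds)

intel_runs = [30.0, 29.0, 28.0]     # hypothetical readings
opteron_runs = [15.0, 15.5, 14.5]   # hypothetical readings

print(average_boot_time(intel_runs))    # 29.0
print(average_boot_time(opteron_runs))  # 15.0
```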
The third thing I did was conduct a room temperature test. I used an electronic thermometer, which I placed in my office where all of my computers are located. It's an enclosed room, 14x15 feet. The house thermostat is normally set to 69 degrees. To start this test, I shut off all equipment in my office; the room temperature hovered right around 70 degrees. I then turned each machine on in turn and let it run overnight under load using Prime95. The first machine up was the Opty. After confirming the room temp was at its normal 70 degrees, I closed the door and let the Opty do its thing all night. I came in the next morning and the temp was at 71 degrees.
Next up was the Intel machine. Again, I made sure the room was at its normal 70 degrees. I shut down the Opty machine, and everything else was off. I let it run Prime95 all night. I came in the next morning and found the temperature at a staggering 78 degrees in my office. That's 7 degrees higher than with the Opty. Draw your own conclusions on that one.
Lastly, the benchmarks. I'm just using 3DMark on this one. I know the 3DMark tests have been noted as being Intel-biased, and that's exactly why I ran them, since I have an Intel machine. The results are not even remotely close. Keep in mind the Opty machine uses a 7800GT while the Intel machine uses a 6800XT, so I will also include a screen from when the Intel had a 7800GT in it and was overclocked to almost 4GHz. Here are the results. Winner: Opty
http://photobucket.com/albums/f16/Luminaris/630%20Screens/?action=view&current=Intel3DMark.jpg
http://photobucket.com/albums/f16/Luminaris/Opteron%20Screens/?action=view&current=3DMark06Score.jpg
http://photobucket.com/albums/f16/Luminaris/630%20Screens/?action=view&current=3DMARK061.jpg
Now I know you're probably wondering why the hell I'm even posting this. Well, I think it's pretty obvious, as it compares two very much talked-about processors. I realize there are many factors involved, but there really is no comparing Intel's current lineup to AMD's. Unless you heavily overclock the current Intel processors, it just isn't a contest. Even if I ran these tests with the Opty at stock speeds, it would still beat up my Intel pretty badly. If you feel I'm wrong about that, I'd be more than happy to run the tests at stock to prove my point.
In conclusion, Intel really has nothing right now to compare against AMD. The NetBurst architecture is, simply put, bad technology and, IMO, a waste. I've been an avid Intel enthusiast for years, but for now, I'll use AMD. Please do not turn this thread into a flame fest, as we have enough of those already; all I'm doing is throwing out some real-world observations here. I would value your opinions, though.
Love, peace and harmony to all
~Luminaris~