(GAMERS ONLY) i7 vs 955: is $300 worth it?


bboynatural

I'm tired of seeing gamers who never (or maybe once a month) use video encoding or zipping go for an i7 for their new build. What about the Phenom II 955, people? Intel does beat AMD in performance, but I think that for the mainstream, us gamers are just buying the i7 for the bragging rights. Nothing else. ONCE AGAIN, I DO SAY THAT INTEL BEATS AMD IN PERFORMANCE, SO PLEASE KEEP YOUR EGO DOWN:


http://www.tomshardware.com/reviews/phenom...955,2278-9.html

Same performance.

http://www.tomshardware.com/reviews/phenom...55,2278-10.html


Here in World in Conflict, the i7 looks like it has the advantage. But it's weird how a commonly used option such as AA makes the i7 drop by 20 FPS, while the 955 drops only 10. Anyway, you will HARDLY see a difference for the ~$300 (mobo + CPU + RAM) extra you're paying. This FPS drop is very common with Intel CPUs. If you get an Intel CPU that performs the same in gaming as a Phenom, you will get a much less stable frame rate, sometimes dropping from 60 to 30, making the game look extremely lame compared to a constant 45-35 from the Phenom.

http://www.tomshardware.com/reviews/phenom...55,2278-11.html


The Phenom is more power efficient, and this is what gamers need: the power budget should go to the GPUs, and a lower-power CPU means paying less for the PSU.


Now let's talk about that ADVANTAGE of the i7 that MOST (not ALL, because that would be a lie), and by MOST I mean at least 70%, NEVER use.


http://www.tomshardware.com/reviews/phenom...955,2278-8.html


Less than a minute, less than 50 seconds, and a 10-second difference. Is that EVEN such a big deal? I would rather pay $300 less and wait 1 minute for something I might use... once a year? I never zip anyway, and most don't. We torrent and UNZIP, not ZIP. The only REAL disadvantage is the video encoding that MAYBE some of us use. But $300 pops into my mind again, and 1 minute for $300 is FAR from being a good deal to me.


ONCE AGAIN, THIS IS AIMED AT THE GAMER COMMUNITY AND PEOPLE WHO DON'T VIDEO ENCODE/ZIP A LOT (a.k.a. every gamer...).

As you can see, the ONLY reason AMD is bleeding money is that people buy for the bragging rights; they don't even take the time to compare for their needs and often end up paying more.
$300, people. $300! Add the $200 you were going to spend on your GPU and you can get a $500 GPU: 2x HD 5850, 1x HD 5870, or even 1x GTX 295. Do you understand how much gamers waste? This is why I started this thread.

This is the only reason AMD is bleeding money and can't make even more "gamer" CPUs: nobody takes the time to READ about the actual technology, and they don't even ask themselves what they will do with that CPU they're getting for over $300. And please don't mention the i5; I really don't want to go dig up those scathing reviews that cut this CPU to pieces, and it's definitely not a "gamer" CPU if it doesn't support dual x16 scaling. Intel is definitely not made for "mid range" CPUs.


One review is not enough? Here's more:
http://www.hardwarecanucks.com/charts/cpu....76,77&tid=2

Watch what happens if you overclock it to a mere 3.8 GHz.
Yes, it reaches the performance of a $1,000 CPU.


http://www.hardwarecanucks.com/charts/cpu....76,77&tid=4


It beats the i7.


Now don't get me wrong, the i7 is DEFINITELY a better CPU. DEFINITELY. But NOT for US gamers, only for corporations and workloads like that.
But is it worth the money? Is it worth $300 that you could have spent on a better GPU?
Check, for example, the Valve particle simulation: of course the i7 beats the Phenom II 955, but look at the frame rates; they're already past 60, so you will not notice any difference because YOUR MONITOR IS SET TO 60 HZ. Even the best mainstream monitors are only 75 Hz. I've heard about new 120 Hz monitors, but I think our eyes limit us and you won't see the difference. Is there a difference to "see" anyway? I think 60 is the "limit" required for gaming...


AMD isn't bleeding money because they suck; AMD is bleeding money because PEOPLE SUCK.
Computing used to be a professional domain; now anyone can come along and claim they can use a computer while hardly knowing what actually provides their gaming power. If people took the time to CHECK and LEARN, AMD would be rich by now, and gaming would have advanced a bit faster.


Now, I didn't take the time to make this thread just to get diced or to start an i7 vs Phenom II 955 war; I've already lost that. I'm talking GAME-WISE. And since I am on a "gamer" forum, I'm hoping NOT to get flamed, just to have it explained where I went wrong. I think AMD should rule the gaming community. But it doesn't, because most people only care about brand name or fashion: "since everybody is going i7, I'm going too".

I just felt the need to show the truth to everyone who SHOULD see it. I personally would prefer saving my money and laughing at people who will NEVER use the REAL power of the i7 but still wasted 300 bucks.


Anyway, if two reviews aren't enough, and you want to talk about the "awesome" scaling power the i7 has to offer, here's a little something for you:

http://www.modreactor.com/english/Reviews/...is-Warhead.html


People claim that CPU bottlenecking can be seen at maximum quality. Here's your bottlenecking.

http://www.modreactor.com/english/Reviews/...-Clear-Sky.html


Once again, the moment we crank up the AA, the i7 shows its true weakness.


http://www.modreactor.com/english/Reviews/...n-Conflict.html

Okay, I give the i7 ONE win. GG. We NEARLY reached 60 FPS with the Phenom, though; if you overclock it, you'll go over 60. Once again, this is a performance difference that will not be seen.

http://www.modreactor.com/english/Reviews/...-May-Cry-4.html


Enough said. I am ready to get flamed, trashed, and jumped on for this insane blasphemy. Go on. Just remember: I am talking GAME-WISE.
I admit the i7 is a better CPU than the Phenom II 955, but still ask yourself if 1 minute is worth the $300 you could have turned into 30 more FPS instead.

I will admit something FRANKLY: last generation's Intel quad cores DID beat AMD! I will NEVER deny that. But I am talking about the current generation. Also, 6-core and 8-core (Bulldozer) parts are coming from AMD while the i9 and 8-core parts are coming from Intel. Now I wonder which one will cost less for performance that will never be used, and which one "gamers" will get?


One last thing: I mentioned that the i5 is totally out of the question, and this is why. First, I read a review that said the testers were disappointed by the i5, and that the i7-860 was the real deal for LGA1156. Another thing is that the i5 only scales at dual x8 for the same price as dual x16 from AMD. And last but not least, LGA1156 is already dead. This is Intel's response to next year's 6-core and 2011's 8-core: http://www.fudzilla.com/content/view/15749/35/
Less cache, more pricey, still dual x8. GG Intel.



What do you people think about the whole i7 frenzy happening in the gaming world? Please refrain from hateful comments; I just showed plain facts. This is in no way biased, and it's not meant to dice Intel or Intel owners.
 

bboynatural

Wow, 28 reads and not a single flame?
C'mon guys, this is pure blasphemy.
You can't let me get away with that? Where are the egos, people?
Wow, I feel so rejected right now...
 

bboynatural

:( Just don't hit me...
Sorry, daddy...
And what about the actual thread?
I would like to know if I'm going out of my mind or if I am right.
What's your position?
 

MiamiU

I always felt AMD vs Intel was pretty much like ATI vs Nvidia. Just like Nvidia used to have the best-performing graphics cards but charged a premium that I thought wasn't worth it, Intel charges a premium for their better-performing CPUs. I'd personally rather go AMD and have a well-rounded PC while saving money.
 

bboynatural

What's sad is that you don't lose gaming performance going AMD.
Seriously, man, 1 minute for $300?? You've gotta be kidding me.
If it were 2 or 3 minutes, then yeah, for NON-GAMERS I would recommend the i7.
But c'mon guys, in the GRAPHICS forum, I'm amazed that people still go i7... what a waste...
 

bboynatural

Oh, thank you!! >< I don't know about the latest monitor technology XD
Can you give me an example of pricing?
Still, the human eye limits the frames per second you can see, and 120 is far past that limit.
 

MiamiU

Well, the 120 Hz monitors are used for Nvidia 3D Vision; other than that, I don't see why anyone would want to spend the extra money on them. Besides, I think the best resolution you can get them in is 1680x1050. Correct me if I'm wrong.
 

randomizer

The only reason you'd go with the i7 is if you were doing video encoding (with something like x264), or even more so 3D rendering. If either of these tasks is your job, the extra $300 for the i7 will pay for itself in increased productivity pretty quickly.

For gaming there is no reason to go with i7, except perhaps for FSX, but a 2P workstation is better for that :D
 

bboynatural

randomizer said:
"The only reason you'd go with the i7 is if you were doing video encoding [...] For gaming there is no reason to go with i7, except perhaps for FSX, but a 2P workstation is better for that :D"

Finally, someone understands!!!
Now, if you could explain this to the over 9000 gamers here on the graphics forum who buy an i7...
Maybe AMD would have more money to REALLY beat Intel at a fair price?
 
I posted a thread when these i7s first came out. It said, "Why do you all keep recommending the i7 when there is no real-world difference in games between the i7 and a top-end C2D or AMD?"
See, that's a lot easier than your wall of text, which, I'm sorry, I'm not awake enough to read all of right now :)
The reason I was given was basically that enthusiasts want the best regardless.
Oh, and on the refresh thing: there is no set limit to what the human eye can perceive. Sure, there are reports around the net claiming A or B, and that movies only run at X speed, etc. We have done the subject to death a few times on these forums, and it's clear that it comes down to individual preference, with some people having a different tolerance as far as the actual refresh rate is concerned. While I'm quite happy running a monitor at 60 Hz without an issue, I know people who claim it gives them problems: nausea, headaches, etc.
Personally, I can tell the difference between 60 and 75 on my own screen. That's not to say I can sit and look at it and say "that's 75" and "that's 60", but only last week the wife had changed something in her settings and asked me to look at something, and I thought, hang on, something isn't right here; I went and looked at the properties and sure enough it was running at 60.

Mactronix
 

bboynatural

Oh, OK, thanks for the information!

But yeah... I still wonder if it's REALLY worth the extra cash...
Well, anyway, both CPUs only reach 120 FPS in one out of two games, so... currently, the i7 has no clear advantage over the Phenom II 955.

Well, I know this wall of text is HUGE, but it's hard to convince people by just claiming something, and it took me quite a good amount of time to prepare a claim without holes... Seems it's working pretty well; I have not been flamed YET.

Well, this one is full of hatred against that dude in the quote; maybe this one will look better?
http://forums.overclockersclub.com/index.php?showtopic=168710

YOU KNOW WHAT? I'M GONNA POST IT INSTEAD!!
 

bboynatural

Well... at least it looks a little better... O.O I took out that unnecessary hate quote. Does it look any better? :D Makes you want to read it, maybe?
 

michaelmk86

I cannot believe that there is such a big difference, $300 (for CPU + mobo + RAM), but I guess different countries, different prices.

For me, it is worth the money to go with the i7 instead of the PII 955, because in my country the price difference is 100€ (for CPU + mobo + RAM), plus the fact that the i7 is reasonably faster in games considering it costs only 100€ more.

http://www.tomshardware.com/reviews/phenom-versus-i7,2360.html
AMD Phenom II X4 955 Black Edition (OC 3.7 GHz) with 2 x HD 4890
Intel i7-920 (OC 3.4 GHz) with 2 x HD 4870

Here are some of the results:

World in Conflict (1920x1200, very high):
Phenom II @ 3.7 GHz + 2x HD 4890: 63 fps
Core i7-920 @ 3.4 GHz + 2x HD 4870: 88 fps

Far Cry 2 (1920x1200, very high):
Phenom II @ 3.7 GHz + 2x HD 4890: 71 fps
Core i7-920 @ 3.4 GHz + 2x HD 4870: 92 fps

Prototype (1920x1200, 4xAA, high):
Phenom II @ 3.7 GHz + 2x HD 4890: 54 fps
Core i7-920 @ 3.4 GHz + 2x HD 4870: 70 fps

As you can see, the i7 system (with slower graphics cards) shows a solid lead over the Phenom II (with faster graphics cards). Keep in mind that the i7-920 was only OC'd to 3.4 GHz; now imagine the difference if the i7-920 were OC'd to 4.2 GHz (easy with a good air cooler): total domination over the Phenom II.

Also, the performance difference (between the i7-920 and the PII 955) will be even bigger with a more powerful GPU setup (e.g. two HD 5870s), because the Phenom II will bottleneck the two HD 5870s very badly.

To conclude:
Combination 1: i7 (CPU + mobo + RAM) = 500€, plus 2x HD 4870 = 250€, total = 750€
Combination 2: PII (CPU + mobo + RAM) = 400€, plus 2x HD 4890 = 350€, total = 750€

Combination 1 gives better fps in games than Combination 2 for the same amount of money.
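To put a rough number on that, here's a quick back-of-the-envelope script using only the three results quoted above (the 750€ totals and the fps figures are the poster's own; other games or local prices would obviously shift the result):

```python
# Rough fps-per-euro comparison using only the three 1920x1200 results
# quoted above; both builds are priced at ~750 EUR by the poster.
builds = {
    "i7-920 @ 3.4 GHz + 2x HD 4870": [88, 92, 70],   # WiC, Far Cry 2, Prototype
    "PII 955 @ 3.7 GHz + 2x HD 4890": [63, 71, 54],
}
cost_eur = 750

for name, fps in builds.items():
    avg = sum(fps) / len(fps)
    print(f"{name}: avg {avg:.1f} fps, {avg / cost_eur * 100:.1f} fps per 100 EUR")
```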

Now you are going to disagree or whatever, but I don't care, because you are a blinded AMD fanboy who likes to "flame" :p
 

cobot

mactronix said:
"I posted a thread when these i7s first came out [...] Personally, I can tell the difference between 60 and 75 on my own screen."


But then you have a CRT screen.
Very simplified: a CRT screen will "draw" an image on the screen 60, 75, or 85 times per second (depending on the refresh rate), which can result in flickering. It looks like the screen is being turned on and off veeery quickly all the time, which is actually not so far from the truth.

An LCD screen doesn't have this problem, as the shutters keep their opacity until they get a new instruction. Theoretically, the backlight could produce a flickering effect, but the backlight is usually locked at around 200 Hz, so that is rarely a problem.

In short:

The refresh rate on a CRT screen is how often a picture is drawn on the screen.

The refresh rate on an LCD screen is how often the picture can "change". A low refresh rate can cause stuttering, but I can't imagine that ever being the case at 60 Hz.



I found a picture that shows the image being "drawn" on a CRT screen:

(image: Refresh_scan.jpg)

Looking at that, one understands why CRT screens can give the impression of flickering.
 

bboynatural

michaelmk86 said:
"Now you are going to disagree or whatever, but I don't care because you are a blinded AMD fanboy that likes "flame" :p"



Saying that a Phenom II 955 will bottleneck two HD 5870s was rather not intelligent, yes.
In fact, you don't have a single benchmark proving that. I think it's pure speculation with nothing to back it up.

As for World in Conflict, you are one of the few people I've met who will base their whole case on a SINGLE game, while the Phenom II 955 has proven to be AS good as, or BETTER than, the i7. Also, if you overclock the i7, you should overclock the Phenom II 955 too to make it fair, right? Your domination is now quite reduced...

http://www.modreactor.com/english/Reviews/Test-ATI-HD-4890-1GB-CrossFire-AMD-Phenom-II-955-BE-vs-Intel-Core-i7-920/Page-6-Performance-World-in-Conflict.html

In this benchmark, both systems used the same cards. 6 FPS. Total domination, yes. Overclock the Phenom II 955 and comparing becomes pointless, because your monitor is probably set to 60 Hz. Even at 75, you would need to concentrate to see the difference.

If you like concentrating on one game to prove domination, what about this:

http://www.modreactor.com/english/Reviews/Test-ATI-HD-4890-1GB-CrossFire-AMD-Phenom-II-955-BE-vs-Intel-Core-i7-920/Page-4-Performance-Crysis-Warhead.html

Here is your total domination. =/
By the way, the mobo used was the M4A79T Deluxe, known to have "hyper flood" problems. With an Asus Crosshair III Formula, overclocked or not, the i7 would have been crushed.

Well, yeah, it really depends on your country, but here in Canada:
AMD PHENOM II X4 955 QUAD CORE AM3 125W 3200 MHZ 8MB CACHE BLACK EDITION
ASUS CROSSHAIR III FORMULA SKT.AM3 NVIDIA 790FX 2PCI-E 2.0 X16/2PCI-E X1/4D.DDR3/1600/1333 /ATX
OCZ OCZ3G1333LVAM4GK AMD Gold PC3 10666 2X2GB Kit 1333MHz 9-9-9-20
2xASUS EAH5850/G/2DIS/1GD5 PCI-E RADEON HD5850 1GB-DDR5,256BIT DUAL DVI-I,HDMI,HEATSINK

$1,090

And for the i7:

INTEL CORE I7-920 2.66GHZ 8MB CACHE LGA-1366 4.8GT QPI
ASUS RAMPAGE II EXTREME SK.1366 INT X58/ICH10R 3XPCI-E 16X 6.D.DDR3-1066MHZ 1600/1333 FSB SATA,ATX
OCZ OCZ3G1333LV6GK Gold Tri Channel Kit PC3 10666 3X2GB Kit 1333MHz 8-8-8-20
2xASUS EAH5850/G/2DIS/1GD5 PCI-E RADEON HD5850 1GB-DDR5,256BIT DUAL DVI-I,HDMI,HEATSINK

$1,462.59 exactly

http://www.sohodiffusion.com/configurpc.asp#

Now you might ask why I took the Asus Rampage II Extreme. Well, I matched the quality of the parts, of course.
It's easy to take the cheapest mobo, one that costs less but will never allow your so-claimed 4.2 GHz overclock, and make the Phenom II 955 look horrible, but no, it doesn't work that way. Even if I take the CHEAPEST mobo for the i7:

MSI X58M SKT1366 INTEL X58 6 DDR3 1600 SATA FSB 6.4GT/S MATX AUDIO 1394 GB LAN

We're still at $1,252, over $160 more than a Phenom II 955 build that will stomp the i7 IN GAMING (the i7 will win in things you will never use... have fun...).
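For the record, the price gaps from the totals above work out like this (totals only, since individual part prices weren't listed; the dollar figures are the ones quoted in this post):

```python
# Price gap between the Canadian builds, using only the totals quoted above.
pii_build = 1090.00        # Phenom II 955 + Crosshair III Formula + DDR3 + 2x HD 5850
i7_rampage = 1462.59       # i7-920 + Rampage II Extreme + DDR3 + 2x HD 5850
i7_cheap_board = 1252.00   # same i7 build with the cheapest X58 board (MSI X58M)

print(f"i7 with a comparable board: +${i7_rampage - pii_build:.2f}")
print(f"i7 with the cheapest board: +${i7_cheap_board - pii_build:.2f}")
```

That prints +$372.59 and +$162.00, which is where the "over $160 more" above comes from.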

I shoot facts, with websites, part names, and benchmarks, and all you can afford to show me is your writing? With the EXTREMELY BAD speculation that a PHENOM II 955 WILL BOTTLENECK 2 HD 5870s?
You must be out of your mind... I wonder who sounds more like a fanboy here?
 

bboynatural

One more thing.

Once again, notice the MASSIVE drop of the i7 when you crank up the AA, while the Phenom II 955 doesn't even flinch one bit.
Do you know why they didn't crank the AA to the max? I mean, with that setup, it would still have been playable at 1920x1200.
Probably because they did not want to show the i7's weakness with AA.

Also, two "advantages" that were actually mistakes:
Two HD 4890s scale worse than two HD 4870s, because the HD 4890 is just a "revised" HD 4870 with a single-GPU clock boost.
The only resolution at which the i7 wins by a big margin is 1280x1024.
Before buying an i7, you should first upgrade your monitor.

Now check Stalker at 1920x1200 with the AA cranked up:
50.4 FPS for the Phenom II
46.1 FPS for the i7

Crysis at 1920x1200, no AA (AA would have killed the i7):
42.1 FPS for the PII
36.8 FPS for the i7

A very good chart of the i7's INCREDIBLY LAME AA performance is in HAWX, where they crank the AA to the max.

Notice how the Phenom II 955 DOESN'T FLINCH at 1920x1200 and goes from 99 down to 95, a mere 4 FPS drop.

Watch how the i7 goes straight down from 113 to 98, a 15 FPS drop. Of course, it's pointless in a game that runs as well as HAWX does.
But imagine Crysis 2, for example? The i7 seems to be quite weak under heavy AA. That's not future proof for gaming, especially when AA is becoming VERY widely used.
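To keep the comparison apples to apples, here's a tiny script that turns those HAWX figures into relative drops (the fps values are the ones quoted above; nothing else is assumed):

```python
# Relative fps drop when max AA is enabled, using the HAWX figures quoted above.
results = {
    "Phenom II 955": (99, 95),    # (no AA, max AA) at 1920x1200
    "Core i7-920":   (113, 98),
}

for cpu, (no_aa, max_aa) in results.items():
    drop = no_aa - max_aa
    print(f"{cpu}: {no_aa} -> {max_aa} fps ({drop} fps, {drop / no_aa:.0%} drop)")
```

So the Phenom loses about 4% and the i7 about 13% in that test.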

Prototype and World in Conflict seem to be games the Phenom II has problems with; I don't know why.
ANYWAY, any frame rate over 60 is pointless to compare. But I did compare how little the Phenom II flinches and how hard the i7 drops.

Anything else to say? Some more cherry-picked benchmarks, maybe?
 

randomizer

The i7 system has more CPU power available to it. Once you shift more emphasis onto the GPUs (with AA, for example), the extra benefit of that CPU power diminishes. The i7 drops a lot when AA is applied because it has more to lose as the bottleneck shifts toward the GPUs (that sounds so cliché, I hate using the term "bottleneck").
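Here's a toy model of that argument. The millisecond costs below are made up purely to illustrate the bottleneck shift; they are not measurements of either CPU:

```python
# Toy frame-time model: each frame costs max(CPU time, GPU time) milliseconds.
# All per-frame costs below are invented for illustration only.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = {"faster CPU": 8.0, "slower CPU": 11.0}
gpu_ms = {"no AA": 9.0, "max AA": 14.0}   # enabling AA raises the GPU's cost per frame

for cpu, c in cpu_ms.items():
    before, after = fps(c, gpu_ms["no AA"]), fps(c, gpu_ms["max AA"])
    print(f"{cpu}: {before:.0f} -> {after:.0f} fps ({before - after:.0f} fps lost to AA)")
```

Once both systems end up GPU-limited they land at the same fps, so the faster CPU shows the bigger drop simply because it started higher.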
 

bboynatural

Yup, but even on the HD 4870 I never saw such drops.
The 4800 series was actually known for having some of the best AA performance around.
Both the HD 4870 AND the HD 4890 should deliver about the same behaviour with AA, or at least not a 20 FPS drop to death...
Many benchmarks I've seen (I personally like Guru3D a lot) showed the same "AA" hit for both of these cards.

I've heard of Intel CPUs being less "consistent" in FPS, though: for example, having lots of FPS drops, making demanding games go from 60 to 30, while a Phenom II would hold a steady 45-35 FPS, which looks less awful than Intel's massive drops... I actually read it here, when someone posted a really biased review of the i7-920 vs the Phenom II 550 (a dual-core against a quad with eight threads; who does that?) by [H]... I'll try to find it...
 

randomizer

[H] reviews are actually useful for this sort of thing, as they show the frame rate over time, not just one or two bars (where only the average is somewhat useful anyway, since the minimum frame rate on its own is pretty meaningless).
 