What is it with all the Phenom "sucks" responses....


endyen

Splendid
They desperately needed the 9700s and high-end Barcys shipping, and as such, I just can't see them freezing shipments for a minor problem. It's conjecture, but if TLB isn't an issue, and AMD didn't overreact, then there must be another problem. A mystery.
The TLB is the issue.
They are afraid that it could be exploitable in a DDoS attack.
Without Barcelona, and with the current price war, the Phenoms are not economically viable.
I would not expect that they ran any quad-core wafers after the scope of the errata was discovered.
We should also remember that AMD is well known for having binning problems on early wafers. They should be able to get up to dual-core speeds in Q4/08 or Q1/09.
I expect that the higher-binned Phenoms and Barcelonas will transfer to 45nm around Q3/09.
 

endyen

Splendid

Can you find a website that has subjective benchmarks?
The better sites used to do subjective tests. Sometimes they'd do blind comparisons with staff; sometimes they'd go out into "the wild".
If you don't care about benchmarks, why come here?
Sometimes it's a good place to learn; sometimes the best benchmark is the people on sites like this.
I have learned to trust your "objective" benchmarks less and less over the last ten years or so. As often as not, it's the benchmark that decides the winner, and the person who chooses the benchmark has more tools than that at their disposal.
It used to be that when a site ran a benchmark, they would supply a file should you want to check their results. Then things were a little more objective. There is a benchmark for what I say, though: compare the benchmarks used by THG in their different annual CPU charts.
At the end of the day, there is only one benchmark that really counts, and it's very subjective.
I've been a hobbyist and sideline builder for many years, with about a couple thousand builds behind me. I've always judged the quality of my builds by the enthusiasm of the user.
 

Falken699

Distinguished
Aug 26, 2007
374
0
18,780
Encode 10 DVD movies, then burn and verify them ALL in a row, in one sitting, and time it. The Phenom will trail by quite a large margin.
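For what it's worth, that kind of batch test is just wall-clock timing wrapped around the whole pipeline; a minimal sketch (the commands here are placeholders, not a real encode/burn toolchain):

```python
# Minimal sketch: wall-clock timing of a whole batch job.
# The commands are placeholders; substitute actual encode/burn/verify steps.
import subprocess
import time

jobs = [["echo", f"processing disc {i}"] for i in range(1, 11)]

start = time.perf_counter()
for job in jobs:
    subprocess.run(job, check=True)   # run each step to completion, in a row
print(f"batch took {time.perf_counter() - start:.1f} s")
```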

When Phenom "tocks" (Shanghai) and they get a few Revs at that node, and you can actually OC the thing 30% without a hassle, then ok, maybe it would be more worth it.

But, people buying now (not in 18 months) will notice a big difference if they are power users.

Otherwise, mom and dad won't know the difference; you are correct.
 

LukeBird

Distinguished
Nov 14, 2007
654
0
18,980

In your haste to bash harna, I think perhaps you should have proof-read your post before taking the knowledge high ground here....
The E6850 is indeed based on the older Conroe, but it was the last of the revisions (I believe) and is not that different to a Wolfdale-based machine. As for "there are cheaper and faster 45nm quads available as the E8400": the E8400 is not a quad.
"62fps vs 45fps is a massive difference playability wise": you cannot make such a statement on average framerates. What if the average of 62fps was made up of the machine running at 200fps for x amount of time, but running at 1fps for x/10 amount of time? (Those are just arbitrary numbers and have no mathematical relation to the figures quoted.) That is certainly not playable for a large part. Now let's say that the 45fps was a steady 45 +/- 5% the whole time. Which is more playable now?
Average FPS is much like benchmarks: half the story for a quarter of the time...
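To put illustrative numbers on that (my own, not from any benchmark): a machine that holds 100fps for 60 seconds and then crawls at 1fps for about 37 seconds still averages out to roughly 62fps, because the average is weighted by time:

```python
# Time-weighted average FPS over two segments (illustrative numbers only).
fast_fps, fast_s = 100, 60.0    # smooth stretch
slow_fps, slow_s = 1, 37.4      # slideshow stretch

total_frames = fast_fps * fast_s + slow_fps * slow_s
total_time = fast_s + slow_s
print(total_frames / total_time)   # ~62 fps average...
print(100 * slow_s / total_time)   # ...yet ~38% of the time is unplayable
```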
Everyone here has an opinion (which, however seemingly 'useless', doesn't give you the right to bash them for it... I may make an exception for "Thunderman"! ;)), so perhaps allow people to make it? :)
I'm glad there is discussion on what I started, but one thing that worries me when I read replies is the use of "subjective benchmarks". I would like to make it clear I don't believe in such things; I meant (as someone pointed out) that personal user experience is always better than a benchmark, IMO. :)
 
http://www.techspot.com/review/84-amd-phenom-family-performance/

Techspot has done a bit of a Phenom review which I thought might be of interest.

Quite a few of the scores have Phenom performing quite well, though I still think the poor scaling on a few tests is indicative of a memory controller and write-caching issue that hopefully B3 will fix.

I just thought I'd throw something useful into the mix.

The comparison to the new E8400 is there - not sure if they have it over at the other E8400 thread yet ... I'll check.

 

harna

Distinguished
Jan 2, 2008
282
0
18,790
Epselon84 writes:

What does GPU choice have to do with it? Both platforms can be upgraded with a faster GPU; it doesn't take away the fact that the Core 2 is a much better gaming CPU for Crysis.

I see you have nothing to say about the X2 6400+ benchmarks. What happened to "Oh, X2 is so fast I just creamed my pants!"? :lol:

NEW!! MSI K9A2 Platinum 790FX Socket AM2+ Motherboard

Quad CrossFire (8X+8X+8X+8X)

9600 Phenom BE

Win VISTA 64

I would be willing to bet your 20 FPS just vanished into thin air, baby.

8GB (4 x 2GB) DDR2 800

Let's see that Wolfdale setup eat this. Which is the sucky system now??

Oh, and just for good measure: if you plonked that hopeless 6400+ on there, it would probably do it too.
 


Still looks like the Q6600 is besting the Phenom 9900 in most benchmarks...
 

harna

Distinguished
Jan 2, 2008
282
0
18,790
lukebird writes:

"62fps vs 45fps is a massive difference playability wise", you cannot make such a statement on average framerates. What if the average of 62fps was made up of them machine running at 200fps for x amount of time, but running at 1fps for x/10 amount of time. (Those are just arbitary numbers and have no mathematical basis on the percentages I used) That is certainly not playable for a large part. Lets now say that the 45fps was 45 +/- 5% for x time. Which is more playable now?"

Hear ye, hear ye, hear ye! Because here speaks a critic with common sense. Even with an average frame rate of 65, Crysis will grind a system to a halt in places. I know this for certain because I've played and finished it.

I'm currently in the process of tuning my drivers in the most intensive parts of the game at 16x10. If you have a powerful system, lukebird is dead right, because in some not-so-intensive parts 100+ FPS can readily be obtained, and that does wonders for the average. A far more useful figure would be the percentage of time spent below 30 FPS.
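A minimal sketch of that metric, assuming you have per-frame render times in milliseconds (the filename is hypothetical; a Fraps-style frametime dump would do):

```python
# Sketch: share of wall time spent below a playability threshold,
# from per-frame render times in milliseconds, one value per line.
THRESHOLD_FPS = 30.0

with open("frametimes.csv") as f:        # hypothetical frametime dump
    frame_ms = [float(line) for line in f if line.strip()]

total_ms = sum(frame_ms)
# Any frame slower than 1000/30 ms means we were under 30 fps while it drew.
slow_ms = sum(ms for ms in frame_ms if ms > 1000.0 / THRESHOLD_FPS)

print(f"average: {1000.0 * len(frame_ms) / total_ms:.1f} fps")
print(f"time below {THRESHOLD_FPS:.0f} fps: {100.0 * slow_ms / total_ms:.1f}%")
```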

But of course this test was presumably done on high settings, and since those can be turned back a bit without a great deal of visual loss, the net result is that the Phenom is indeed a very good platform on which to play Crysis.

I can obtain 62 FPS in parts of the game at 16x10 medium with one 3870 and a 939 3800+ with 2.0GB of DDR400, running the damn game in Vista under DX10, and I already know that it runs one hell of a lot smoother in Win XP.

With my 5000 BE & 8800 GT running under Win XP it will be considerably more. With either system I am not yet stretched enough to require a CrossFire or SLI solution, but they are within reach should I change my mind. Every other game I have runs maxed out and smooth as can be. Crysis is a challenge and I enjoy it, but there is still so much improvement to come on the software and driver front before it's necessary to hammer away at it with hardware extremes. And let's face it, we are still not getting consistent gains out of multi-GPU/CPU setups; that time is still ahead of us.
 

harna

Distinguished
Jan 2, 2008
282
0
18,790


Is that funny peculiar or funny ha-ha?

What's funny about that? Isn't a dual core a multi core? Any huge advantage quad may get over dual is still some time distant.

The comparison was simply to consider the first C2D release vs. the first Phenom release, nothing more. All architectures start at low speeds and graduate to higher speeds. I can't understand why the Phenom is treated any differently, but somehow it sucks because the competition is a little tougher today than it was for the Core 2 18 months ago. Well, I guess that's progress for you.

And to TC, thanks for the link, but the first comparison I turned to made me chuckle a bit:

source: http://www.techspot.com/review/84- [...] rformance/

"For reasons that are unknown to us the Phenom processors deliver superior performance in the PCmark05 graphics test. In fact, so does the Athlon64 X2, so we believe this probably has something to do with the on-die memory controller."
 

UncleDave

Distinguished
Jun 4, 2007
223
0
18,680



:bounce: :bounce: :bounce: RESPECT :bounce: :bounce: :bounce:
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790


True, personal experience is going to be better on all accounts, but benchmarks and reviews from websites such as Tom's are really all most of us have to go on, because of the cost involved in these reviews; we don't get the parts sent to us to try out before we buy them. They are a great basis for many of us, not only for making recommendations on newly released parts, but also for backing our arguments and, most importantly, for guiding which new products we buy ourselves... thus giving us the ability to at least share our personal experience of said product with others. So in essence they are not irrelevant, and they hold more value than I believe many give them credit for. Just thought I would point that out... feel free to disagree.

Best,

3Ball
 
That review had the Q6600 in it as well as the new E8400.

Some of you need to grow up and contribute something useful in terms of criticism or comment that at least gives some insight.

All I see here are blind fanbois.
 

jivorivor

Distinguished
Jan 31, 2007
5
0
18,510
I occasionally dabble in the THG forums, while usually preferring overclock.net for true enthusiast forums. However, has anyone asked the big question about overclocking? If all 2140s are capable of overclocking to, say, 3.0GHz at stock voltage, then 3.0GHz ought to be Intel's base speed rating.
I love reading all the overclocking BS: when everyone's processor is doing at least a base increase from 2.4 to 3.0GHz, then Intel is, in my opinion, understating their processors for some reason. Maybe so more people will buy them because they OVERCLOCK, whoop-de-doo! This is not an overclock; this is hitting what Intel should have stated as the base. If Intel knew every processor in the bin could reach this speed, they should have just raised the multi. But I guess it rings a bell to say it overclocks better than the competition (of which there currently is none). I have two Q6600s, one at work and one at home; they both hit 3.0, one's a B3 and one's a G0. Maybe AMD should sell their X2 at 1.9GHz and then, when people hit 3.0, say it overclocks like a beast but still can't beat Intel's C2D.
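For context, the core clock is just the bus (FSB) clock times the multiplier, which is also why the Q6600's 2.4-to-3.0GHz jump is usually done with a bus bump rather than the multi (Intel locks the multiplier upward on non-Extreme chips). A toy calculation with the Q6600's stock numbers:

```python
# Core clock = FSB clock x multiplier.
def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    return fsb_mhz * multiplier

print(core_clock_mhz(266, 9))   # ~2394 MHz: the Q6600's stock 2.4 GHz
print(core_clock_mhz(333, 9))   # ~3000 MHz: the common 3.0 GHz overclock
```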
 

LukeBird

Distinguished
Nov 14, 2007
654
0
18,980

Nope, I agree with you. I just said that, IMO, user reviews are more useful as they give insight beyond numbers that can be (let's face it) manipulated to display anything we like! :)
If there were more reviews from long-term users, then perhaps people would buy slightly differently?
An interesting (but admittedly unreal) conundrum. :)
But as I said, 3Ball, I very much appreciate reading your responses as they are structured and useful; as I'd like mine to come across, in fact!

I should also add I completely agree with the above post, my favourite OCing quote being that Q6600 G0s will hit 3.6 with ease....
Perhaps not all of them; I guess theoretically some could be limited to, say, 2.8GHz or so...
I guess it's just being careful with your wording that gets the intended message across...
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790


This is a very good point, and could well be true. When I need to look for information myself, I do look for answers from actual people as well as from reviewers (not to say that reviewers aren't people... lol), but I tend to find that comparing the two actually leads me to a more educated decision (although that's not always the case... unfortunately).



Yeah, 3.6GHz would be nice, but I will take 3.2GHz on it... lol, who wouldn't? I just want one, even though my gain might be negligible at best (at this point in time, anyway). lol

Best,

3Ball
 

turpit

Splendid
Feb 12, 2006
6,373
0
25,780
Well, both the Phenom and Intel 45nm questions are likely to remain mysteries. I've seen articles with 'quotes' from Intel personnel on both sides of the issue, saying it was engineering problems, or that there was simply no reason to dump the 45nm quads onto the market now. Given the lack of information, both are equally viable. Same with Phenom: there may be a bigger problem, or they may have overreacted, or they may not have wanted to take the smallest chance. It's just that with Phenom, they needed it so badly the benefits seemed to outweigh the risk. In Intel's case, there's no pressure to speak of, other than meeting roadmap dates.

At this point, with AMD, I wouldn't even worry about future products. They need to fix their current flagship line. And if the rumours are true about them doing away with 90nm, they need to do it fast, and get clock speeds up to 90nm levels.
 

turpit

Splendid
Feb 12, 2006
6,373
0
25,780


Really? Do you have any links showing that? Not attacking, just curious. It would certainly go a long, long way toward explaining a lot. If the chips were really vulnerable to distributed denial-of-service attacks, well, that's more than a showstopper. If that were true, it would actually make Intel's FDIV bug pale in comparison. Hell, that's a business killer. The loss of sales and further damage to reputation would be nothing compared to the lawsuits.

If that's true... just wow.


On the wafers: stopping quad etching would make sense, and would go some way towards explaining the absence of the tri-cores and quad-masked dual cores, if the TLB issue really was a potential DoS door.

If you have any links, I would love to see them. If there is proof, this looks like the key piece of the puzzle.

On the higher-binned parts... I wouldn't expect to see anything higher than they are currently getting with Brisbane; slightly less, in fact, due to the extra heat generated being dumped into the same surface area. Heat soak could force throttling. Beyond that... not much, unless they thicken the gate layer again, but then they're into the deteriorating cycle of having to pump in more power to overcome the extra thickness to get the higher clock speed, which raises waste heat further, and on and on. The issue with clock speed has been SOI. It worked great at 90nm, but at 65nm the trade-offs seem too costly, and if they proceed to 45nm with SOI... I don't see any advantage.
 

NuclearShadow

Distinguished
Sep 20, 2007
1,535
0
19,810
Well, it's as simple as this: naturally, people want what's best. The Phenom isn't what's best, so naturally it's viewed as weaker and less desirable. Wanting what's best is a natural instinct of every creature on this planet, whether the object is a mate, food, or, in special cases like us humans, computer hardware. It's also in our human nature to socialize with insults and jokes against whatever isn't #1, the way a poor politician tends to be made fun of more than a good one, while the better counterpart is paraded around like a hero or treasure until something or someone better comes along, at which point it is naturally forgotten and no longer desired.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


I don't expect you to be hard on one company and not the other. I have issues with Phenom too. Part of me wants to get one that's B3, but the sensible side says just get a Phenom dual core (or triple core) and wait for 45nm and more apps for quad core. My whole point is that many of the Intel fanboys who ignored Netburst's issues, and preferred their company's failure of design vision back then, are now eager to attack AMD for theirs.

I guess you won't respond to my answer re: the 6000+ and Phenom. It's the clock speed within the same architecture. It's not rational to compare the P4 situation vis-a-vis the X2 with the Phenom situation vis-a-vis the X2. Netburst was so inefficient, at least by Prescott days, that clock speed meant bupkis overall. People crowed that they got their Smithfield 805 stable at 4 gigahertz on expensive water cooling, and yet it barely equaled or beat a much lower-clocked Athlon X2.

Phenom beats the X2 6000+ in some apps, but the X2 6000+ beats Phenom in games. Not by all that much, mind you, but it still wins out. The efficiency of the Phenom cores, as documented by Tom's Hardware's test results, is real; it just does not mean that a 2.3 gigahertz Phenom always beats an X2 6000+:

http://www.tomshardware.com/2008/01/14/phenom_vs_athlon_core_scaling_compared/

Phenom beats the X2 6000+ in 3DS Max: [chart 921-1075-435.png]

In MainConcept: [chart 921-1075-431.png]

The Athlon X2 6000+ beats Phenom in AVG antivirus: [chart 921-1075-433.png]

In Prey, the Athlon X2 6000+ wins again: [chart 921-1075-425.png]

Yet in Supreme Commander, which benefits from more cores, the Phenom beats the X2 6000+: [chart 921-1075-421.png]


I want a Phenom over an Athlon X2 6000+ because I don't play barely-multithreaded FPS. The older CRPGs I play won't be hindered by a Phenom, but the newer CRPGs and RTS titles I want to play will benefit. A lower-clocked Phenom will beat out an Athlon X2 at any clock speed if the game or application it runs supports more than 2 cores.

Supreme Commander, World in Conflict, Age of Conan, LOTR Online, The Witcher, Hellgate: London and other games will do well enough on a Phenom. Games like Spore, Alan Wake and probably even Fallout 3 and the as-yet-unannounced TES V: Fargoth Dines on the Dunmer (nonworking title :D ) will do even better.

When AMD goes 45nm and gets a Phenom up to X2 6000+ clock speeds, it will shine even in single- or dual-threaded games like the slightly outdated Prey, Half-Life 2 and the Serious Sam titles. As for Crysis, I'm sure a Phenom does quite well today. Sure, Intel does better, but Phenom beating an X2 core for core is not at all bad. Keep in mind that adding 25% to the Phenom's 2200 megahertz, to rate it against the X2, puts it somewhere around 2.75 gigahertz, which is enough under the X2 6000+'s 3 gigahertz to be hampered in single- or dual-threaded games.
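A quick check of that arithmetic (the ~25% per-clock advantage is the premise of the argument above, not a measured figure):

```python
# Effective clock if a 2.2 GHz Phenom does ~25% more work per clock than an X2.
phenom_mhz = 2200
per_clock_advantage = 0.25                      # assumed, per the argument above
print(phenom_mhz * (1 + per_clock_advantage))   # 2750 MHz, still under the X2 6000+'s 3000
```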



I consider 30 fps to be the sweet spot, but then again, I don't play Crysis. The last FPS I tried was Far Cry, and I liked it but didn't love going around shooting people, not even bad guys. I enjoyed Oblivion much more, it's more fun tossing fireballs at daedra, hacking goblins to pieces with silver axes and sneaking around stealing an Elder Scroll from blind moth priests. The only quest arc that didn't work for me in Oblivion is the Dark Brotherhood, because I think that murder is bad for business (yay Thieves Guild!).

In CRPGs, 30 fps is quite playable. All the Fraps junkies need to realize that not every genre is an FPS, where 10 fps makes or breaks it for the enthusiast gamer.

Yes, Wolfdale beats C2D, which beats Phenom, which sometimes beats the X2 6000+ and sometimes doesn't. Wolfdale is the one ring for Sauron to rule them all. Phenom at 45nm won't beat the 45nm Intel quad cores. AMD is back to the K6-2 days; that's my budget and it's fine by me. They will eventually get a new architecture out, after Swift fuses CPU and IGP into a quad core. Then we'll be back to seeing some real competition.

Until then, I'll still buy AMD because they can play Oblivion and other CRPGs, past, present and future. The only Intel I've recently considered is a Pentium dual core with a bundled ECS mobo at Fry's for $88, to replace the aging P4 630 that I cobbled together out of old parts when a friend gave me the CPU on an ASUS P5RD1 X200 mobo.

I considered it, but I decided not to go that way. The worm-tongue lies of "Dude, you got a Dell" and "Intel Inside" during the Netburst days helped Intel survive what would have been a disaster for AMD. Restrictive agreements, questioned in courts of law on three continents, kept AMD out of the OEM marketplace when they had the best product for three years.

So, dude, I'm not buying Intel. Not until AMD catches up fully and has a much bigger market share. Sometimes it's about fairness in economic choices, and not just 20 fps.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780


Core-for-core comparison within the same architecture is always dependent on clock speeds; that's a given. I never meant to imply that a lowly B2 Phenom would beat a high-end X2, except in heavily multithreaded titles, as I pointed out above.



The transformation from Netburst to C2D is so stunning that we have to question the sanity of the suits and engineers who pushed Netburst when they already had the right design to build upon, but ignored it. If life were fair, Intel would have gone into bankruptcy over Netburst. C2D is the best "Hail Mary" play I can think of in recent business history.



Intel promises similar increases in performance, and the previews I've seen support gains that are phenomenal (sorry, AMD, pun intended), but not as great as 40%. Were I not an obstinate AMD fan with a sense of humor, I would go Intel today. I just don't like the business practices they used during the Netburst days, so I'm supporting the underdog, which will eventually have its day, even if it won't be until May 8th, 2012.
 
