Barton 2500+ Preview

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
<A HREF="http://www.thetechboard.com/reviews/barton.php" target="_new"> Click to read </A>

<b> "You can put lipstick on a pig, but hey, it's still a pig!" - RobD </b>
 

lhgpoobaa

Illustrious
Dec 31, 2007
14,462
1
40,780
hmmm. Bit too preliminary to draw any conclusions... a damn fine overclocker though :smile:

methinks the XP2500+ will be a hot item.

<b>My Computer is so powerful Sauron Desires it and mortal men Covet it, <i>My Precioussssssss</i></b>
 

Quetzacoatl

Distinguished
Jan 30, 2002
1,790
0
19,780
not enough details yet, but looks promising. Scaling? When are we going to see that?! We all want a 2500+ Barton to go from 1.83GHz to 2.8GHz ^^

Instead of Rdram, why not just merge 4 Sdram channels...
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
hmmm. Bit too preliminary to draw any conclusions... a damn fine overclocker though :smile:
I disagree vehemently. The article in no way indicates good OCing potential.

It <i>seems</i> like a good overclocker, until you look at those Comanche4 benchmark results. The overclock got a whole 0.05% performance increase in frames/sec and a whole 0.06% performance increase in triangles/sec. Yet it's clocked 20.84% faster <i>and</i> that's with a raised FSB (meaning that it has more bandwidth). And no, I didn't forget to multiply that 0.06% by 100. It really is six hundredths of a percent.
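To put those percentages in absolute terms, here is a quick sanity check. This is only a sketch: the exact clock values and the 50fps baseline are hypothetical round numbers, and only the 20.84% and 0.06% figures come from the review.

```python
# Sketch: shows why a 0.06% gain is effectively measurement noise.
# The clock speeds and the 50 fps baseline are hypothetical; only the
# 20.84% and 0.06% figures come from the review discussed above.
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100.0

stock_mhz = 1833   # Barton 2500+ stock clock (assumed round value)
oc_mhz = 2215      # overclocked clock (hypothetical), ~20.84% faster
print(round(pct_increase(stock_mhz, oc_mhz), 2))   # ≈ 20.84

baseline_fps = 50.0                 # hypothetical frame rate
oc_fps = baseline_fps * 1.0006      # the reported 0.06% gain
print(round(oc_fps - baseline_fps, 3))  # ≈ 0.03 fps, below run-to-run noise
```

On a hypothetical 50fps baseline, a 0.06% gain is about three hundredths of a frame per second, far smaller than normal benchmark run-to-run variation.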

Obviously something is wrong. A 20.84% clock speed increase should give more than a 0.06% performance gain. As far as I can figure, there are only three possible answers:

1) The dolt who did the benchmarking was using insufficient cooling, so while running real-world apps where the CPU actually warms up, it was already being throttled. (The non-OCed Comanche benchmark scores were virtually identical to the OCed scores because the CPU was being throttled down to a 'safe' level both times.)

2) The Barton was right at the very edge of its performance, so much so that when OCed it was throttled right back down to the same performance as when not OCed.

3) AMD has implemented a new way to prevent OCing by ensuring that the performance is the same no matter how you externally clock the chip. (With the possible exception of performance gains from a higher FSB.)

These are, of course, in the order that I believe likely, meaning that my impression of the reviewer is awfully low. Either way, no matter how you cut it, that review makes its own validity extremely questionable. I personally don't think the review in any way indicates what the Barton is or isn't capable of.


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
 

baldurga

Distinguished
Feb 14, 2002
727
0
18,980
Maybe it is a stupid point, but could the GeForce <b>2</b> Ti be the reason for the bad scaling? I wonder what the result would have been if an ATI 9700 Pro had been used.

With a config like that, I can only give some credit to pure CPU benchmarks, not whole-system benchmarks.


Still looking for a <b>good online retailer</b> in Spain :frown:
 

paulj

Distinguished
Feb 15, 2001
523
0
18,980
Uh, the Comanche benchmark is a 3D test that pushes the video card as well as the system. It is not a pure measure of the CPU. The video card is the limiting factor in this case, not the platform.

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red><P ID="edit"><FONT SIZE=-1><EM>Edited by paulj on 02/06/03 09:30 AM.</EM></FONT></P>
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
baldurga and paulj, I do not believe that the graphics card put up an impenetrable wall. Yes, it does indeed hinder the performance gain possible. However, I have <b>never</b> seen a point where any game cannot be sped up by a faster processor, regardless of how crappy the graphics card is.

Think of all of those systems with dinky onboard graphics. Even <i>they</i> see a gain in performance from faster processors, and you <i>know</i> that their graphics system is maxed out.

Yet in this case, the game sped up a whole 0.05%, or in effect, it didn't improve <i>at all</i>. Even with a maxed out graphics card, it should have seen <i>some</i> improvement. So it didn't have anything to do with the graphics card. The Barton was simply hitting some sort of a wall. Now, whether that wall was the fault of the reviewer or of AMD is all that really remains to be answered.


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
 

Crashman

Polypheme
Former Staff
So clock for clock the Barton is better, but XP rating for XP rating it's worse? Yet AMD's PRICES are based on XP ratings, not clock speed! So unless they can produce one with a higher clock speed (say, 2200MHz) they still aren't progressing!

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
It seems like a good overclocker, until you look at those Comanche4 benchmark results. The overclock got a whole 0.05% performance increase in frames/sec and a whole 0.06% performance increase in triangles/sec. Yet it's clocked 20.84% faster and that's with a raised FSB (meaning that it has more bandwidth). And no, I didn't forget to multiply that 0.06% by 100. It really is six hundredths of a percent.
Comanche4 sucks in terms of pulling faster fps. It's one of the worst-optimized games of today. Even with the fastest video card, you can expect very little fps gain in this game. In this case, the reviewer is using a GF2 Ti, and Comanche4 makes full use of DX8. So it's certain the video card is not letting the CPU scale properly.

<b> "You can put lipstick on a pig, but hey, it's still a pig!" - RobD </b>
 

Lonemagi

Distinguished
Feb 20, 2002
969
0
18,980
Right now on pricewatch.com you can get an XP 3000+ (2.16GHz) 333FSB Barton for <font color=red>$624</font color=red>!

Now I can get my 2100+ to 2.1GHz... and it cost me $98. Don't see people jumping on this unless they are numbers freaks.

<A HREF="http://tekkoshocon.com/" target="_new">http://tekkoshocon.com/</A> Southeast Pennsylvania gets an Anime Convention!
 

eden

Champion
Sorry Slvr, but I think it can be entirely the card here. Look at the THG VGA Charts 2, at the disparity between the XP2700 and the 3.06GHz for one card at the low end.
<A HREF="http://www6.tomshardware.com/graphic/20021218/vgacharts-06.html" target="_new">http://www6.tomshardware.com/graphic/20021218/vgacharts-06.html</A>
Here we see the Radeon VE getting a very small boost, 0.1% at most, when going to the XP2700 from the 3.06GHz. Even though the 3.06 was badly fitted with its RAM and all, it still shows that despite such a significant clock speed difference, with all the extra bandwidth over the AthlonXP, it does little. Jedi Knight II also only starts to bottleneck on the CPU later on.

Therefore these results, if truly from a cheap GF2, are in fact normal. I would've been surprised if they were like this on an R300.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

Crashman

Polypheme
Former Staff
Yes, pricing based on XP numbers means that AMD processors at the higher speeds are not the value people claim them to be, because the XP rating excludes any performance advantage.

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
 

eden

Champion
Additionally, I looked up an even more evident article, the mainstream comparison. Notice here <A HREF="http://www6.tomshardware.com/graphic/20030120/vgacharts-02.html#aquanox" target="_new">http://www6.tomshardware.com/graphic/20030120/vgacharts-02.html#aquanox</A>
how moving from a 1GHz T-Bird to an AthlonXP 2700, more than twice the clock speed (and, theoretically, the performance), on a GeForce 2 MX yielded no more than a 7th of a frame.
Therefore I believe you are wrong in seeing the Barton as having bad scaling because of this very negligible performance increase in Comanche, and I believe these scores are more than normal when the video card was long ago the bottleneck. And here we are talking about a small ~21% clock increase for the Barton overclock.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile: <P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 02/06/03 11:09 PM.</EM></FONT></P>
 

Quetzacoatl

Distinguished
Jan 30, 2002
1,790
0
19,780
Look, it's been said before: when you're testing for CPU scaling, you have to take out the vid card factor, so the games and benchies aren't vid reliant. Hence, yes, the GF2 Ti was holding it back somewhat here. Although, when you look at a lot of benches like Winstone and SiSoft, all that theoretical and synthetic crap, the vid card won't have any say in it. Also, I never believed for once that these would have a perfect scaling ratio of 100 percent performance per MHz for each increase. Nice overclock nonetheless.

Instead of Rdram, why not just merge 4 Sdram channels...
 

eden

Champion
Yes, but I was replying to Slvr's comments about the Comanche benchmark. I believe it is obvious the card was the limiting factor there, and the reviewer shows very poor reviewing ability if he ran a scaling benchmark like this with such an old card.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
Sorry Eden, but I still think you're mistaken.

Notice here http://www6.tomshardware.com/graphic/20030120/vgacharts-02.html#aquanox
how moving from a 1GHz T-Bird to an AthlonXP 2700, more than twice the clock speed (and, theoretically, the performance), on a GeForce 2 MX yielded no more than a 7th of a frame.
Only a 0.7FPS difference, yet that's still a 4.07% increase in performance, which is a <b>lot</b> larger than a 0.06% increase. Further, you're talking about an MX card. If you use that same graph to look at a GF2 Ti, you're getting an 18.15% increase, which is a <i>hell</i> of a lot better than 0.06%.

Now, to be fair, let's factor in the differences. An AXP 2700+ is <i>theoretically</i> equivalent to a T-Bird at 2.7GHz, meaning that it is about 170% faster. The Barton OC was only 20.84% faster. So if we multiply the 18.15% increase for the GF2Ti listed above by (20.84 / 170.0) to normalize the result to the Barton OC, we still get an expected 2.22% performance increase, which is a heck of a lot larger than the actual 0.06% recorded.

Even if we take your GF2MX difference of 4.07% and normalize it in the same way, we get a 0.5% increase on the GF2MX, which is still more than eight times the 0.06%.
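The normalization argument above can be written out explicitly. This is only a sketch of the arithmetic, under the (strong) assumption that performance gain scales linearly with the size of the clock increase, which real games rarely follow exactly:

```python
# Scale a gain observed over one clock increase down to a smaller increase,
# assuming gain is proportional to the clock bump (a rough linear model).
def normalized_gain(observed_gain_pct, reference_oc_pct, target_oc_pct):
    return observed_gain_pct * (target_oc_pct / reference_oc_pct)

# GF2 Ti: 18.15% gain over a 170% clock jump, scaled to the 20.84% Barton OC
print(round(normalized_gain(18.15, 170.0, 20.84), 2))  # ≈ 2.22
# GF2 MX: 4.07% gain over the same 170% jump, same scaling
print(round(normalized_gain(4.07, 170.0, 20.84), 2))   # ≈ 0.5
```

Even the conservative GF2 MX estimate comes out well above the 0.06% the review actually recorded.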

So sorry Eden, but 0.06% is just <i>way</i> too low for me to believe, no matter what the graphics card. Even the GF2MX should have done several times better than that.

I'm not saying that the 20.84% OC should have given a direct 20.84% FPS increase. Heck, I'm not even saying that it should have given a 1% FPS increase. I am, however, saying that the recorded 0.06% increase was <i>way</i> too low. It indicates that <i>something</i> wasn't quite right.

Especially when you consider that all of the other benchmarks were <i>synthetic</i>. The <i>only</i> real-world benchmark is the one that seems off. To me, it just indicates that we can't really put any stock in the validity of that review.

That's all I'm saying: I trust that review about as far as I can throw it. If you want to trust it, fine, go ahead. Just remember that even the author admits his benchmarks are lame.
<font color=blue>Update 02/05/03: There was some discussion about my "lame" benchmarks on Anandtech. Do I think they're lame? Compared to others I've done and seen, yes.</font color=blue>
After an endorsement like that from the very author of the article, do I even have to say any more?


PC Repair-Vol 1:Getting To Know Your PC.
PC Repair-Vol 2:Troubleshooting Your PC.
PC Repair-Vol 3:Having Trouble Troubleshooting Your PC?
PC Repair-Vol 4:Having Trouble Shooting Your PC?
 

paulj

Distinguished
Feb 15, 2001
523
0
18,980
I am however saying that the recorded 0.06% increase was way too low. It indicates that something wasn't quite right......

To me, it just indicates that we can't really put any stock in the validity of that review.
Well I think we can all agree that something was wrong with this example of benchmarking. Why use a GF2 anyway? I feel guilty for having even discussed this benchmark. :smile:

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
 

paulj

Distinguished
Feb 15, 2001
523
0
18,980
I think it's ironic that probably everyone in this thread would do the same thing if they owned this system: get a new video card.

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
 

paulj

Distinguished
Feb 15, 2001
523
0
18,980
So clock for clock the Barton is better, but XP rating for XP rating it's worse?
Yes, pricing based on XP numbers means that AMD processors at the higher speeds are not the value people claim them to be, because the XP rating excludes any performance advantage.
It seems that way in the synthetic benchmarks but not in the application benchmarks (unfortunately there is only the Comanche benchmark).

You still have to compare benchmarks relevant to your individual applications regardless of the XP rating or P4 speed and then compare price. But you may be right.

The game will change again when we go to 400MHz (AMD) and 800MHz (Intel) fsb. :smile:

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
 

eden

Champion
I just wish they wouldn't always lower the clock speed so much whenever they increase the IPC, in order to inflate the XP rating even further. Would it hurt if they let an XP2800+ at 2.25GHz become a Barton and stay at 2.25GHz? It'd sure as hell compete better!

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile: