Well, let's not consider overclocking. I may be a fool, but I'm assuming each CPU can be overclocked to some extent.
In native mode, out of the box, how much of a performance gain can one expect going from the base 3000+ to the premium chips, for someone who wants to play games (online or offline) and run some development software (the .NET suite or Java)?
From what I've read and gleaned here and other places (and I'm talking just the Winchester cores here; all other specs are the same):
3000+ = 1.8GHz
3200+ = 2.0GHz
3500+ = 2.2GHz
So the 3000+ is clocked 10% lower than the 3200+, and the 3500+ is clocked 10% higher than the 3200+ (about 22% higher than the 3000+). Is it a real difference? It's noticeable. More noticeable from 3000+ to 3500+ than from 3000+ to 3200+ or 3200+ to 3500+. I don't have an A64, so I can't say firsthand, but there was a noticeable, though not remarkable, difference when I oc'd my XP 2600+ to 3200+ speeds. Is it worth ~66% more expense ($150 vs. $250)? That depends on your needs and budget. Only the one spending the money can make that decision.
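The clock and price gaps above are easy to sanity-check yourself. A quick sketch (the prices are the rough US figures quoted in this thread; street prices vary):

```python
# Relative clock and price gaps between the Winchester models.
clocks = {"3000+": 1.8, "3200+": 2.0, "3500+": 2.2}  # GHz, stock
prices = {"3000+": 150, "3500+": 250}                # approximate USD

def pct_more(a, b):
    """How much bigger a is than b, in percent."""
    return (a / b - 1) * 100

print(f"3200+ vs 3000+: {pct_more(clocks['3200+'], clocks['3000+']):.1f}% faster clock")
print(f"3500+ vs 3200+: {pct_more(clocks['3500+'], clocks['3200+']):.1f}% faster clock")
print(f"3500+ vs 3000+: {pct_more(clocks['3500+'], clocks['3000+']):.1f}% faster clock")
print(f"3500+ vs 3000+: {pct_more(prices['3500+'], prices['3000+']):.1f}% more money")
```

So you're paying roughly three times the percentage premium in dollars that you get back in clock speed, before even asking whether clock speed is your bottleneck.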
For games: today's games are usually not (drastically) CPU-limited on the 3000+. A hard-core gamer may find it limiting, because a slightly faster CPU means slightly faster response when reacting to something. Future games will of course be CPU-limited, but there's no way of knowing when that point will be reached. For dev software, well, M$ dev software craves speed. It can soak up more CPU cycles than you have at any moment, so speed matters more there. Still, there's a point of diminishing returns: $100 for a 5% reduction in compile times (numbers out of thin air - no backup - no idea if they're accurate - just to illustrate what I mean) may or may not be worth it.
For overclocking, what I've gathered, and what makes sense to me, is that all three chips come from the same process, potentially side by side on the same physical wafer, so they all should have the same maximum 'potential' speed. That speed isn't a percentage overclock but a maximum chip speed. Almost all Winchesters appear able to reach 2.6GHz on air cooling. Whether an individual chip tops out at only 2.4 or over 2.8 depends on the subatomic structure and the accuracy of all the cuts for that particular chip (not the wafer, that individual chip). AMD takes the chips that test best (at least I would, if I were a chip mfg.) and sells them as the fastest ones, and the chips that test worst get sold as the slowest ones. That means that while any Winchester CPU can probably hit 2.8GHz, a 3500+ is 'more likely' to be able to. How much more likely? Dunno.
It seems a safe bet, if you (or whoever) are only reaching for 2.4GHz, to just buy the cheapest one, since it appears they can all go at least that high. 2.6GHz might be reaching the edge of a poor-testing chip's abilities, and for your best chance at getting over 2.8GHz, get the fastest chip. But any of the chips has a chance at hitting 2.8GHz. (Does that make sense?)
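To put numbers on that: if the claim above holds and every Winchester tops out around the same absolute speed, the cheapest chip needs the biggest relative overclock to get there. A quick sketch, assuming the 2.6GHz-on-air figure from the thread:

```python
# How big an overclock each model needs to reach the same absolute speed.
stock = {"3000+": 1.8, "3200+": 2.0, "3500+": 2.2}  # GHz, stock clocks
target = 2.6  # GHz - the speed "almost all" Winchesters reportedly hit on air

for model, ghz in stock.items():
    gain = (target / ghz - 1) * 100
    print(f"{model}: {ghz}GHz -> {target}GHz is a {gain:.0f}% overclock")
```

So the 3000+ needs roughly a 44% overclock to hit 2.6GHz, versus about 18% for the 3500+ - which is why the cheap chip is the better gamble if they really do share the same ceiling.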