epsilon84:
DAMMIT! I was looking forward to getting a Q9450...
Looking further ahead, this could spell bad news for Nehalem as well. If AMD can't get into a competitive position soon, Nehalem can easily be pushed out to 2009 for the very same reasons...
Well, if all the Intel fanboys get their wish and AMD dies, then we'll be back to Netburst days with regard to Intel's business practices. They might even cancel their own Fusion plans and go head to head with Nvidia in the discrete GPU market.
I can't understand people buying Intel right now. Sure, they have the "best," but the X2 is still quite good, and though Phenom won't be decent before 45nm (AMD should have followed Intel's lead and avoided a 65nm native quad core), buying Intel now is like asking for trouble a few years down the line.
Is it all about increments of 10 fps in a particular FPS or 40 seconds in video encoding? I can see the time differences in 3ds Max or DivX making a difference in the workstation market, but at home? All in all, I've bought more Intel CPUs than AMD since 1993, but only when Intel had the working product. I chose the Pentium over the K5, but I went K6-2 because it was a decent CPU at a budget price. I went P4 Northwood over Athlon XP because of heat issues, but I avoided Prescott and Smithfield Netburst and went Athlon X2 instead.
Now, I don't see the real-world difference that everyone claims is there, at least not enough of a difference to matter to me. So I buy AMD. Plus, I'm a bit old-fashioned: I prefer pins on the CPU and not the motherboard, and I find AMD stock coolers decent enough at stock speeds (I don't overclock). Now AMD has the benefit of ATI chipsets and GPUs, and I've only bought an Nvidia chipset and GPU combo once, as a barebones offer.
I do wish AMD had planned to release a hybrid CrossFire in March that actually benefited from power savings and worked with more cards than the upcoming 3400 series. As is, I'll just have to buy a couple of 3870s instead and wait to see how Fusion pans out by mid-2009.
Xbit Labs on Fusion:
http://www.xbitlabs.com/news/cpu/display/20071216231717_AMD_Claims_First_Swift_Fusion_Processor_Due_in_Second_Half_2009.html
“The first APU platform is code-named Swift. It gives you the choice of technologies for high-confidence volume production ramp. We want to re-use as much [IP] as possible to accelerate our quality [qualification] and time to market. So, we have an AMD Stars CPU core, the graphics core that is based on the present high-end discrete GPU core and leverages the North Bridge that is presently found in Griffin, the CPU of the Puma platform. It will be our second 45nm generation product, so the maturity of the [production technology] will be proven. It is done on the current SOI design rules, which is the process that we know how to build on very well,” Mr. Rivas explained.
Initially the company indicated that Fusion processors “are expected in late 2008/early 2009”, and that it anticipated using them across all of the chipmaker’s “priority computing categories”, including laptops, desktops, workstations and servers, as well as in “consumer electronics and solutions tailored for the unique needs of emerging markets”. A little later the company said that the first generation of Fusion chips will be aimed at laptops and that production will start in early 2009. This time AMD claims that the actual chips will reach the market only in the second half of 2009, which may mean that the product will only be launched commercially in Q4 2009. Still, the company said that it is minimizing all the risks and hopes to really deliver the product on time.
“By optimizing the choice of IP blocks we have less risks and faster time to market in the second half of 2009,” claimed Mr. Rivas, executive vice president of the computing solutions group at AMD.
IMHO, Intel won't do a darned thing unless the market forces them to, and the market right now is AMD/ATI CPUs, chipsets and the coming Fusion. Nvidia won't be a factor until Intel brings out its discrete GPUs. Expect some Intel dirty tricks where they lock out Nvidia and ATI discrete GPUs on Intel chipsets once they have a full line of discrete GPUs up and running.
So buy Intel if you want, but don't kvetch too much about their delays being due to AMD's failures. AMD has vision and tries for the improbable. They get flamed for sitting pretty during the X2 era and not putting enough into R&D, but the flamers seldom had a bad thing to say about Intel during the Netburst days.
atomicWAR:
No matter how you fry this bacon up, it leaves a nasty taste in your mouth... and just yesterday I said what a bad time it was for a build. Apparently the timing just got worse. Merry Christmas, everyone!!!!
Yes, indeed. We have two AMD systems that I was going to upgrade, but it didn't pan out, for three reasons:
1: The 3870 is virtually unavailable (and I wanted the 1GB version anyway).
2: Hybrid CrossFire won't be out in the U.S. before March, the first incarnation won't have the power-saving features, and it won't work with any cards above the 3400 series anyway. I had hoped that, while it gives a 60% improvement with entry-level cards, it would give both power savings and a 10% improvement with midrange GPUs.
3: Phenom is a bust at 65nm. So I'll just upgrade my AM2 boards with 2.9GHz 65W Brisbanes in the first quarter of 2008 instead. I'll wait for Fusion at 45nm instead of going Phenom this spring.
I'll still try to find 3870s for our two systems after the 1GB version arrives. If worse comes to worst, I can always get the regular 512MB version or, as an absolute last resort, two 3850s with 512MB.
Makes you think that both companies wanted to get rid of old inventory for the holidays. Add Nvidia to the list too; I don't know how many people went for the older 8800 GTS when the 8800 GT wasn't available. Ideally, old tech should be marked down once new GPUs hit the market in force, but the launches of ATI's and Nvidia's new cards have been sort of papier-mâché.