Haswell (LGA 1150, due next year) is DDR3 only, and I doubt they will replace 2011 that fast just to make use of DDR4. My guess is it might be 2015 before DDR4 becomes mainstream. By then AM3+ and 1155 will be long dead, so I don't know what will use it first; maybe AMD will get there, or Intel on its high-end server platform. But dumping 1150 (Haswell) and 2011 that fast won't sit well with most.
They don't have to throw the socket away. They could just do the same as their 4-series chipsets, which had both DDR2 and DDR3 controllers in them, or like the Phenom IIs that also had both controllers, and offer both kinds of boards. Then companies could offer hybrid DDR3/DDR4 boards to allow an upgrade path.
Other than the recent "clock mesh" announcement, I don't think there have been any details yet.
I thought there was going to be a new FM2 socket for Trinity but that appears to have been scrapped for now.
It still is FM2 for Trinity; that hasn't changed yet. I can see why, since it's the same situation as AM3+ with BD: a different arch with a vastly different pin layout. There were people who thought they had AM3 mobos that supported BD, but they either did not or had an AM3+ socket on an 800-series chipset (even some 700-series chipsets), since the 900-series chipset was the same as the 800 series.
I stated it before: if the socket was black it supported BD; if it was white it did not.
Not to be a party pooper, but I still haven't seen a low-profile 6670 (not even from Sapphire =/), so they might be a great combo (which is good), but they still don't fit in my lil' HTPC case XD
Also, the MoBo price... Did it have the same set of features as the Asus? I mean, I have that MoBo and it has a TON of things. It's one of the few integrated audio chipsets with almost no annoying signal noise. I have it attached to a pair of studio monitors, so any noise is very noticeable.
And yeah, the A8 heats like a stove, nothing to argue there 😛
I really feel bad when I think about that, mal. I'm a FM1 and AM3+ owner, lol. Well, it's more like "owned" at the moment, hahaha.
Cheers!
http://www.newegg.com/Product/Product.aspx?Item=N82E16814161397
There ya go. Looks like it would be nice and comfy in an HTPC.
No, people trashed BD because outside of heavily multithreaded apps, it isn't any faster, and in many cases is SLOWER than the previous architecture. Meanwhile, AMD promised it would be significantly faster than it was, and killed its own competing architecture so everyone would have no choice but to move to BD.
Meanwhile, BD is priced the same as the i5-2500k, which beats it outright in the majority of benchmarks out there.
And no, the "it will improve as software gets more threaded" argument does not work, because as I have already noted, software simply does not and will not scale beyond a few cores.
Finally, no one here can recommend a processor on the hope it *might* be better a few years down the road.
What you are doing, rather than making an argument for why BD is better than everyone thinks it is, is attacking everyone that's been pointing out the obvious flaws in its design.
Don't even try. keithlm always tries to turn it against you; best to just ignore him. When Barcelona failed to beat Kentsfield per clock and per core with its "superior" monolithic design, it became a matter of it "feeling smoother".
Just do as I will this time around and ignore.
Tom's benchmarks showed exactly what I said they would: Intel is favored whenever you can install a low-to-mid-range dGPU. The dGPU and the Intel CPU each have their own heat sinks and fans, so they don't share the same thermal headroom. The APU, on the other hand, must constrain both its CPU and its GPU operations within the same thermal envelope. With low-to-mid-range desktop dGPUs as cheap as they are, there is little reason not to have one, and thus an APU on a regular desktop doesn't make sense. The APU may have more theoretical performance (four complete cores), but the Intel unit has better per-core performance and a dedicated GPU that is superior to the APU's IGP.
Now, all that being said, Tom's picked the 3870 for a reason: it gave them the most room in the budget to add a superior GPU to the Intel chip. Try that same setup again, but instead use something a bit more realistic:
109.99 A6-3650
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103943
79.99 A6-3500
http://www.newegg.com/Product/Product.aspx?Item=N82E16819103951
Now if we're talking HTPCs and other appliance devices, that's a slightly different story, although you still shouldn't use a 3870.
Actually, Tom's was trying to take the top-of-the-line APU and see how it stacked up, and they priced an Intel CPU plus a discrete card at the same price to do that. They made note that the APU did perform better in multithreaded apps, as expected with 4 cores vs 2, and that the IGP is not the best for gaming at $140 when you can get a dual core and a better discrete GPU for the same or a bit less.
I see nothing wrong with it. I would have preferred to see results with a GPU closer to the 3870's performance, though it would have been cheaper in price.
But you are basically saying the same thing Tom's did for gaming: if you can get a better option, why buy the A8-3870K?
I am sure that if the A8 had won, you wouldn't be using it as an example for this.
How did you spend $100 on a 6670 when Tom's bought it for $70? The i3-2120 is $130 and the 6770 is $100, $230 total. The A8 is $140 and the 6670 is $70, making $210 total.
http://3dmark.com/search?resultTypeId=232&linkedDisplayAdapters=1&cpuModelId=1315&chipsetId=688
http://3dmark.com/search?resultTypeId=232&linkedDisplayAdapters=1&cpuModelId=1090&chipsetId=570
So why pair the i3 with a more costly GPU and the A8 with a cheaper one? Is it to give the A8 an advantage so it can Hybrid CF and get better performance? Why not take the A8 with a 6770 and the i3 with a 6770?
That's like the people who didn't want to compare Phenom II to Nehalem, only to Kentsfield/Yorkfield, or who made it an uneven playing field based on whatever made AMD shine.
BTW, 3DMark is not the best way to show real world performance.
http://www.ebay.com/itm/Authentic-AMD-FX-4100-Processor-Key-Chain-/260968373598?pt=CPUs&hash=item3cc2ed5d5e
Nice. I heard Intel does this with their bad CPUs.