AMD to Release Two New Llano APUs: A8-3870K, A6-3670K

Status
Not open for further replies.
[citation][nom]Dacatak[/nom]K for UNLOC[K]ED already used by Intel. If they went for [UK] for NLOC[K]ED an entire state might think it was made just for them. Personally, I'd go for [OC] for UNL[OC]KED for the double entendre.[/citation]
OC is short for overclocked.
 
[citation][nom]Pazero01[/nom]hmmm A8-3870K vs I5-2700K? The A8 has a bigger number! Lets get that![/citation]

i7, not i5.

BTW, I find the naming strange: (Intel) i3, i5, i7; (AMD) A4, A6, A8.

Great, they can't sue them.
 
I remember when these processors were announced, the K was said to be shorthand for Black Edition, and that made sense, but I still think AMD did it on purpose.
 
Sorry, but I need CPU performance over GPU performance. Almost everything I do uses the CPU a great deal more than the GPU. GPU performance is great, but you only see it during gaming; you won't notice a difference in most other workloads, so bragging about GPU performance is a bit lame. Both do 2D the same, and low-end 3D performance is similar. High-end 3D is almost strictly for gaming, aside from some engineering applications like CAD, which still won't be that different. Basically, high-end GPU is a niche market. It just so happens that the people who post on this page are gamers; 90% of the rest of the population simply aren't high-end PC gamers. This is why Intel is winning, and why Dell still recommends Intel for the non-gamer and Intel plus a desktop GPU for the high-end gamer.
 
@silverblue

I highly doubt AMD is laying down more transistors than needed; that's a costly exercise in pointlessness. AMD has always been good at getting more bang out of fewer transistors, and I was surprised the transistor count for the FX was so high. Sounds like marketing just got their numbers wrong, that's all.

I have no doubt IB graphics will be a real improvement over their existing offering, but I think you'd have to be masochistically optimistic to believe Intel can come close to AMD's offering. HD3K is a feeble gfx solution at the best of times, and it's not hard to improve on feeble; to get it to compete requires more than just improving, it requires an overhaul.
 
@iGamer

Strange that you talk about GPU performance not being important, but then go on to cite gaming as the recommended use for a high-end GPU.

Truth of the matter is, very few applications will see a marked difference between similarly priced Intel and AMD chips. Shaving a few seconds here and there doesn't mean much in the overall scheme of things; sure, it's nice for bragging rights, but unless you're into really intensive computing it's not going to make an ounce of difference. I like to get as much power for my buck as anyone, but I have used Intel chips with their IGP and it's a damn pain and hassle. I would gladly take a small hit in CPU performance so I don't have to deal with the hassle that is the Intel IGP.
 
@Maso

Nobody has refuted the 213M transistors required for a module. Do you really think that 8MB of L3 cache, the IMC, processor interconnects etc. make up the other 348M (if we're going by 1.2B as a strict figure)? The difference between an Athlon II X4 and a Phenom II X4 was the 6MB L3 cache which, when you compare the numbers (300M and 758M), is a difference of 458M transistors. I can't for one moment see how Bulldozer's larger L3 cache would have that many fewer transistors than Phenom II's.
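The arithmetic behind those figures can be sketched quickly (a back-of-the-envelope check, assuming the four-module layout and the quoted counts; no claims beyond the numbers already in this thread):

```python
# Bulldozer transistor budget, per the figures quoted in this thread:
# 213M per module, 1.2B total, and 300M / 758M for Athlon II X4 / Phenom II X4.

MODULE = 213_000_000
TOTAL = 1_200_000_000

cores_total = 4 * MODULE          # four modules
uncore = TOTAL - cores_total      # what's left for L3, IMC, interconnects, etc.
print(f"uncore budget: {uncore / 1e6:.0f}M")

# Phenom II X4 vs Athlon II X4: the delta is essentially the 6MB L3
l3_delta = 758_000_000 - 300_000_000
print(f"6MB L3 on Phenom II: {l3_delta / 1e6:.0f}M")
print(f"per MB of L3: {l3_delta / 6 / 1e6:.1f}M")
```

That leaves 348M for everything outside the modules, while Phenom II's L3 alone cost roughly 76M per MB, which is the tension silverblue is pointing at.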

As for being "masochistically optimistic to believe intel can come close to AMDs offering", let's put it this way: IVB will have a much larger GPU than the HD3K. The number of execution units (EUs) goes from 12 in HD3K to 16 in the upper IVB parts, each EU is more powerful than before, and the instruction support will finally include DX11. Here's a link which offers far more detail:

http://www.anandtech.com/show/4830/intels-ivy-bridge-architecture-exposed/5

If Intel can offer decent drivers, who's to say that Ivy Bridge won't significantly close that gap? Perhaps it won't realistically match the A8, but it shows Intel is getting serious about on-die graphics, and that Haswell will only improve on them. Perhaps they do need an architecture change, but that's not going to stop Intel trying to get one up on AMD in their stronger area.
 
[citation][nom]d00dad[/nom]but will it play Battlefield 3[/citation]

The 3850 will play BF3, so my guess is yes... yes it will (at medium settings at 1280x1024, anyway).
 
[citation][nom]silverblue[/nom]Ivy Bridge's graphics will be a large improvement on before; enough to bring them close to, if not equal to, the higher end of the Llanos we have now.[/citation]
I seriously doubt Ivy Bridge will come close; the HD 4000's main improvements are going from 12 EUs to 16 and adding DX11 support, the latter being the source of some media claims of "catching up." Of course, almost all enthusiasts know that simply adding support for the latest DX hardly does anything; otherwise the HD 4000 would be as good as a GTX 590 or 6990. This DX support, however, accounts for most of the "bulking up" here; after all, Intel spent the (likely wasted) resources to put in full hardware tessellation support.

All told, even though the HD graphics clock higher than other GPUs, that hardly makes up for having only 16 shaders. Since they're vector units rather than superscalar, we have to compare the quantity to GeForce cards... and even then, 16 is an abysmally low number; we're talking something like the GeForce 8500GT (the only GF card that had 16 shaders without also having a 64-bit memory interface). And yes, the clock speed difference is somewhat mitigated here, since nVidia clocks their shaders higher than the core clock; 900 MHz is within the range for the HD 4000, and a bit above the middle of that range.

An 8500GT isn't a match for even a low-end Radeon 6000-series card; it doesn't even come close. So once again, Ivy Bridge's graphics will merely keep pace with Sandy Bridge's, and remain a joke.

[citation][nom]silverblue[/nom]The only way they're going to reduce transistor count is to cut the L3 cache size or redesign it completely so it doesn't require so many transistors, though I'm not sure as to how that would be achieved.[/citation]
Well, more efficient (read: simpler) designs could be figured out to accomplish the same results in hardware. As I mentioned, AMD did it before in the transition from R600/RV670 to RV770, where they managed to make each stream processor take up less die space without a die shrink, purely by redesigning them.

Also, +1 for the careful thought and linkage. I hadn't had time to pay attention to the Bulldozer transistor count controversy, so it's good to have that cleared up.

[citation][nom]silverblue[/nom]MasoNobody has refuted the 213m transistors required for a module. Do you really think that 8MB of L3 cache, the IMC, processor interconnects etc. make up the other 348M[/citation]
Of course, the extra has to be even more than that, since 1MB of cache requires slightly more than 50M transistors (each SRAM cell uses 6 transistors, vs. DRAM cells, which typically use one transistor and one capacitor).
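That ~50M figure follows directly from 6T SRAM cells; a quick check (pure arithmetic on the data array only, ignoring tag and control overhead):

```python
# 1 MB of 6T SRAM, counting only the data array
bits_per_mb = 1024 * 1024 * 8        # 8,388,608 bits in a megabyte
transistors = bits_per_mb * 6        # 6 transistors per SRAM cell
print(f"{transistors / 1e6:.1f}M transistors per MB")
```

So just the raw cells of an 8MB L3 would need over 400M transistors, before counting tags, sense amps, or control logic.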
 
[citation][nom]theuniquegamer[/nom]I just get a news that amd has cut about 800 million transisters from the new bulldozer production in future. That is the bulldozer will be of 1.2 billion transister instead of 2.0. But the architecture will remain the same i.e 8 core/4 module. Why amd is doing such bad things i don't know.Amd is trying to make more profit from it like the llano.[/citation]

AMD didn't cut 800 million transistors from Bulldozer; that's how many transistors it had all along. AMD's PR department screwed up and said it had over 2 billion when in reality it has about 1.2 billion. Learn to read... lol
 
@silverblue

That puts BD's L3 only slightly smaller than Phenom II's, which is not inconceivable considering they have a completely different connection topology, plus some amount of optimization.

If I recall, some of the older IGPs from Intel were supposed to support DX9, but I can tell you from past experience that 'support' was a very loose use of the word. Just making the GPU core bigger isn't going to make a competitive product; sure, it's going to be faster than the previous generation, but the previous generation was nowhere near competitive. They need a complete overhaul.

The biggest problem is that Intel's gfx capability was gained through acquisition, and the IP from that acquisition was nowhere near competitive with nVidia or ATI (back then). Sure, Intel can sink large amounts of resources and cash into developing a new architecture (in which they have no background knowledge) that can compete with today's offerings, but by then nVidia and AMD will have moved on; it's a moving goalpost. I highly doubt Intel will funnel a significant amount of R&D away from the CPU division to try to develop a competitive gfx solution.

Personally, I think they should just not bother: sell their chips minus the IGP for cheaper (I believe something like that could really put a dent in AMD's armor), or dedicate that real estate to more CPU horsepower.
 
They may not want to funnel lots of money away from CPU development, but they did before with Larrabee. All I'm saying is that with an IGP that's 50-60% faster than the HD3000, they could be very close to first-generation Llano. If each EU is being significantly improved, it's got to result in a decent speed-up.

You could remove the IGP, but QuickSync would be sorely missed.
 
[citation][nom]silverblue[/nom]They may not want to funnel lots of money away from CPU development, but they did before with Larrabee. All I'm saying is that by having an IGP that's 50-60% faster than the HD3000,[/citation]
Wait, did I miss benchmarks for this somewhere? What I'd seen was that per-clock performance for the HD 4000 is close to the same (which is typical of GPUs, unlike CPUs, where even mild changes can have a big impact on per-clock performance), and average clock rates were DROPPED across the board, by an average of 200 MHz. That drop is countered purely by the increase in "EUs" (I'll just go ahead and call them "SPs") from 12 to 16.

Keep in mind that while Cleeve's GPU Hierarchy chart technically shows the HD 3000 two steps below the lowest Llano GPUs, each step is roughly logarithmic: those two steps span from a GeForce 8500GT to an 8600GT, which is a big jump, nearly double.

And of course, keep in mind that the single HD 3000 listing assumes the fastest variant, which won't be facing those low-end, near-Brazos-level GPUs; it'll be up against the higher-end stuff, which is nearly an order of magnitude up from there. An idealized 60% improvement would perhaps almost reach the 6370D, sure, but that still leaves it a joke, as you likely wouldn't see it unless you've got a $200US+ Core i5, while $130US gets you the 6550D. Given that current Llanos can be had for as little as $65US on Newegg right now (and that's with the 6410, not the 6370), I honestly can't see Ivy Bridge making any ground here at all.
 
@silverblue

Hands up, anyone who uses QuickSync. I highly doubt it would be sorely missed; given the choice, I believe folks would prefer a cheaper chip or more CPU horsepower over QuickSync any day of the week.
 
Most of the desktop CPUs with HD 4000 won't see much use for it anyway. Intel is probably aiming HD 4000 at the mobile market, like they did with HD 3000; AMD's APUs likewise make more sense in laptops. That's why Intel keeps pairing the better IGP with its higher-end CPUs. They might make some of the lower desktop i3s and i5s available with HD 4000, e.g. when Trinity makes it to market. AMD also restricted Llano, putting it in the separate Socket FM1 with no option to use an older Phenom or newer FX, and it's rumored to be replaced by FM2 when Trinity comes out. Their approach is a bit better than Intel's, but similar nonetheless.
 
[citation][nom]RazorBurn[/nom]For $500 to $1000 i can have an i5 or i7 with descrete GPU that will run around Llano in any benchmarks.. Llano will only be sellable below $300 PC's.. More than $300? Its not practical to use Llano anymore..[/citation]
What a joke! The 2500K costs $200+. So you're going to add an OS, mobo, case, RAM, HDD, DVD and PSU, and STILL add dedicated graphics, for $500?

EPIC FAIL

You will not even BEGIN to build a decent i5 rig until $700, MINIMUM!

AND what's with everyone's confusion over the Intel naming? ATI made a card with the same number over three years ago.
 
[citation][nom]joytech22[/nom]Am I the only person posting who instantly thought "Lawsuit" when they read the model numbers?I mean.. Intel has similar numbering schemes with their newer CPU's.Anyway..I know this might seem biased towards AMD but it's just my thoughts so I expect thumbs down lolI'm honestly surprised OEM's don't strongly consider using these chips in their desktops.I mean sure there are desktops and laptops out there with them but wherever I go and look around the $500-$1000 PC's have SB Pentiums or i3's or even i5's but mostly with the horrendous Intel IGP's.I think it would be a MUCH better deal for the customer if they stuck these Llano CPU's into those systems, less things nowdays are limited by CPU power, and these Llano CPU's perform almost identically to their Phenom II brothers and think about it that kind of CPU power is already enough for most consumers.The GPU on those Llano CPU's also trounces anything Intel has and everybody knows it including OEM's.They are the perfect all-in-one CPU..I know if I was to start my own OEM brand I would use them and leave the more expensive space to Intel and a dedicated GPU, or a Llano matched with a dedicated GPU for Crossfire.My two cents. :\[/citation]
Well, I think you're looking at the wrong price range. Llano APUs are geared more toward the $350-$600 segment.
 
[citation][nom]phatboe[/nom]You can't trademark numbers so AMD is safe.The problem with AMD is supply. Globalfoundries has production problems and can not keep up with demand. It's not that OEM's don't like AMD chips, it just that AMD is having a hard time supplying chips to OEMs so they tend to stick with Intel because Intel does not suffer from production problems at this time.[/citation]
Yes you can! Ferrari had to change the model number of their F1 car from F150 because Ford already had a truck with that same model number :) On the other hand, Intel's deep pockets prevail; even consumer PC builders put in a crappy i3 and crappy Intel HD graphics instead of AMD's Llano APU.
 
This is simply amazing news. APUs are already awesome, combining a respectable CPU with a GPU that is 4 to 10 times faster than anything Intel has.

We can already enjoy low- to mid-range gaming with AMD's existing APUs, and now the gaming experience will get even better! Capable gaming computers can be built for $200-400 thanks to AMD!

AMD's APUs are a great product, and there is nothing to compete with them.
 
Echoing some great words from Joytech22:

"I'm honestly surprised OEM's don't strongly consider using these chips in their desktops. I mean sure there are desktops and laptops out there with them but wherever I go and look around the $500-$1000 PC's have SB Pentiums or i3's or even i5's but mostly with the horrendous Intel IGP's. I think it would be a MUCH better deal for the customer if they stuck these Llano CPU's into those systems, less things nowdays are limited by CPU power, and these Llano CPU's perform almost identically to their Phenom II brothers and think about it that kind of CPU power is already enough for most consumers. The GPU on those Llano CPU's also trounces anything Intel has and everybody knows it including OEM's. They are the perfect all-in-one CPU.. I know if I was to start my own OEM brand I would use them and leave the more expensive space to Intel and a dedicated GPU, or a Llano matched with a dedicated GPU for Crossfire.My two cents. :\"

 
1) As already stated, you cannot trademark numbers. However, I think the move to the -K suffix is a marketing decision, especially since the new Intel chips are the 3000 series. Personally, I'd rather see the suffix as -BK, or the "0" changed to a "5".
2) I also wish I found more OEM machines with Llano chips, but it's easier for OEM tech support to have a shorter checklist when troubleshooting over the phone. Having an i3-2100 or i5-2400 machine using only the IGP also gives retailers/e-tailers the opportunity to make more money by "upselling" a discrete graphics card to the unsuspecting customer.

Personally, I like "Black Edition" better than just the letter "k".

Saving my cash for another A8-38xx build for the family. My work PC runs great and doubles as a presentation/movie/entertainment center.

"but will it play Battlefield 3?"

CPU/APU: AMD (Llano) A8-3850
Cooler: stock
Motherboard: MSI A75MA-G55 (ver. 1.0)
System RAM: Corsair CMV4GX3M2A1333C9
Case: Antec Sonata (gen. 1)
PSU: Antec EA-650
Optical: Plextor PX-B320SA

Ran the campaign at 1440x900 at medium settings (I did not have MSI Afterburner running, so I don't know what the FPS was), but could not test multiplayer because it was blocked at work.
 