AMD to Launch 95 Watt FX-8300 CPU


ojas

[citation][nom]apache_lives[/nom]Ummmm, you do know the i7 3770K is a 77W chip, and even the old i5 2500K at stock eats those AMDs alive, right? More positive news means they're tweaking the process some more, squeezing more out of it -- a few reviews will show whether this increases the overclockability or hampers it.[/citation]
He was being sarcastic...
 
[citation][nom]moricon[/nom]True, heavily threaded it can win in some applications, but gaming shows the 3570K as the faster chip for that purpose. But when I OC my 3570K it leaves the FX-8350 far behind. At 4.5GHz, there is simply nothing the FX-8350 can do to keep up, and it uses way more power than the 3570K as well, especially once you OC it! Nope, honestly, the 3570K is definitely the better buy no matter which way you look at it. Shame really; I loved my 1055T, it ran so sweet at 4.2GHz and was plenty fast. I really want AMD to do better![/citation]

Actually, disabling one core per module and considerably overclocking the CPU/NB, NB, and memory lets AMD's eight-core FX models catch Intel's quad-core models quite nicely. It's true that at stock configuration, CPU frequency overclocking alone won't let AMD catch up, but it's wrong to say that nothing can be done. Disabling the four unnecessary cores (one per module, so the remaining core in each module gets the whole module's resources) speeds up workloads that aren't heavily threaded, at the cost of heavily threaded performance, and cuts power consumption as well. It won't catch Ivy in efficiency most of the time, but it's a big improvement over stock.
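
For anyone curious what the one-core-per-module experiment looks like in practice, here's a minimal sketch for Linux using the kernel's CPU hotplug interface. It assumes the odd-numbered logical CPUs are each module's second core, which is the typical FX-8xxx topology, but check lscpu --extended on your own box first (and it needs root):

[code]
# Sketch: take one core per Bulldozer/Piledriver module offline on Linux.
# Assumes CPUs 1, 3, 5, 7 are the second core of modules 0-3; verify
# the topology with `lscpu --extended` before running. Requires root.
from pathlib import Path

def set_cpu_online(cpu: int, online: bool) -> None:
    """Write 0/1 to the kernel's CPU hotplug control file."""
    path = Path(f"/sys/devices/system/cpu/cpu{cpu}/online")
    path.write_text("1" if online else "0")

if __name__ == "__main__":
    for cpu in range(1, 8, 2):  # second core of each of the four modules
        set_cpu_online(cpu, False)
        print(f"cpu{cpu} taken offline")
[/code]

Flip the writes back to "1" (or just reboot) to get all eight cores back.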
 
[citation][nom]gaborbarla[/nom]I would love to see it sooner, but knowing a bit about programming myself, I know how difficult it is to implement most algorithms on multiple threads, so I think it will be later.[/citation]

We already have several games that are able to use six or eight threads with very high scaling. Sooner is more likely.
 
[citation][nom]uglynerdman[/nom]I'm facepalming at some of the AMD enthusiasts around here. I used to be one. The QMs do as well as the AMD desktop parts, and they're 45-55W parts. If that doesn't make AMD's CPUs look like a joke, I don't know what does. AMD is competing with i3s and trying to compete with the i5s. If you pay your own power bill, you understand the bang for your buck in both performance and energy savings is just Intel all around. I don't like to sound like an Intel fanboy; I'm on my first Intel system ever and, well, it blows my mind how my laptop can outdo my former AMD desktop. I've also stopped buying AMD GPUs; they don't offer anything Nvidia doesn't, and the green team has PhysX. I bought Alice: Madness for my woman, and she loves the sparkles; the difference in Borderlands 2 is very noticeable too, with the environment cloth and dirt flying about, bits of fragments, everything. I do not see any value in AMD anymore. I feel bad for the company; I also think, looking at the benchmarks, the prices they tout aren't even low enough.[/citation]

Why are so many people mentioning TDP? TDP doesn't matter except for cooling, and even then it's just a synthetic number. It isn't synonymous with power consumption, so please stop pretending that it is.

As much as I like to focus on power consumption myself, most people on this site simply don't have a power bill where these small differences matter. How many of you realize that the power consumption differences are smaller than the draw of most household light bulbs? A few bucks a year shouldn't be ignored, but it shouldn't be used as a noose for AMD either, especially in situations where AMD has a pricing advantage to start with.
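
To put rough numbers on that (all inputs are my own assumptions: a 50W delta at load, four loaded hours a day, $0.12/kWh), the yearly difference works out to under ten dollars:

[code]
# Back-of-envelope yearly cost of a CPU power-draw difference.
# Every input here is an assumption for illustration, not a measurement.
delta_watts = 50          # extra draw under load, watts
hours_per_day = 4         # time spent at load each day
price_per_kwh = 0.12      # electricity price, $/kWh

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * price_per_kwh:.2f}/year")
# prints: 73 kWh/year -> $8.76/year
[/code]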

As for graphics, AMD most certainly does offer things that Nvidia doesn't, just as Nvidia offers things that AMD doesn't. For example, AMD has far greater performance with MSAA, which is far superior to FXAA in image quality. AMD also has far greater performance with DirectCompute and OpenCL accelerated features, which have been getting more and more support; several games use features based on these technologies, such as advanced lighting that can be said to compete with PhysX, and unlike PhysX, you can't just throw in a low-end card from the other team or use the CPU to handle them.

There's more to be said on it, but I think that the point is made well enough.
 
[citation][nom]lostmyclan[/nom]I sold my 2500K to buy a new PC, and what a surprise I got: the 2500K is less laggy than the 3770K. I don't know if HT causes some of the lag, but I miss my old 2500K at 5GHz; the maximum I can reach before everything in my case burns is about 4.5GHz. Bulldozer scores low against the average Intel CPU in synthetic benchmarks, but in real life it kicks ass.[/citation]

The 3770K at 4.5GHz is roughly equal to the 2500K at 5GHz per core. If you think Hyper-Threading is the cause of your issues, why not simply disable it? If the 2500K is still noticeably better, then you're either imagining it (no offense intended) or there is a different problem in your computer (maybe something is overheating and throttling). The 3770K should not be more "laggy" than the 2500K.
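
That clock-for-clock claim is just arithmetic if you grant Ivy Bridge roughly a 10% per-clock advantage over Sandy Bridge (my rough figure for illustration, not a measured one):

[code]
# If Ivy Bridge does ~10% more work per clock than Sandy Bridge (an
# assumed figure), a 4.5GHz 3770K lands about where a 5GHz 2500K does
# on lightly threaded work.
ipc_ratio = 1.10   # assumed Ivy-over-Sandy per-clock advantage
print(f"3770K @ 4.5GHz ~ 2500K @ {4.5 * ipc_ratio:.2f}GHz")  # ~4.95GHz
[/code]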
 

tului

[citation][nom]crisan_tiberiu[/nom]I said another rig; I didn't say that I am going to throw away my main ^-[/citation]
Indeed, I agree with you. I believe AMD can eventually turn their game around, and I want a second player in the retail CPU market, so for my second machine I'll go FX series. I'm waiting for whatever is beyond Piledriver, as I already have a Trinity HTPC. I went AMD on these systems mainly to support them.

My main system is an i7 3930K, which does eat anything AMD has alive.
 

LEXX911

If it could run on my AM3 M4A89GTD PRO/USB3 motherboard, then we would talk. Still running and stuck with the 1090T on this motherboard. They shot themselves in the foot by tying this underwhelming chip to another motherboard chipset.
 

kenyee

Here's hoping they redesign it using hand layouts instead of autorouted layouts and rework the architecture for the next version so Intel has competition again.
The Ivy Bridge "upgrade" from Sandy Bridge was delayed and minor because they had no real competition :p
 
[citation][nom]blppt[/nom]I think you mean MLAA, which is roughly AMD's equivalent to FXAA.[/citation]

Even FXAA is far better than the original MLAA; MLAA 2.0 is AMD's true competitor for FXAA. FXAA and MLAA both suck, and MLAA 2.0 is only good when used in conjunction with MSAA. The original MLAA was utter garbage compared to FXAA, and that's saying something considering I'm someone who dislikes FXAA.

Regardless, my point was that MSAA is better than FXAA, and although Nvidia supports MSAA, AMD handles it far more efficiently (it impacts AMD's performance much less than Nvidia's, especially on the Radeon 79xx models).
 

bustapr

[citation][nom]apache_lives[/nom]Ummmm, you do know the i7 3770K is a 77W chip, and even the old i5 2500K at stock eats those AMDs alive, right? More positive news means they're tweaking the process some more, squeezing more out of it -- a few reviews will show whether this increases the overclockability or hampers it.[/citation]
The 8350 trades blows with the 2500K in most benchmarks. The 8350 costs around $190; the 2500K still costs around $210. An 8300 would be cheaper and amazing bang for the buck if it isn't too much weaker than the 8350.
 

kartu

So what kind of programs do we have that would benefit from a CPU upgrade at all?

1) Video/audio compression - oh well, already parallelized.
2) Games - most game devs use engines from Unreal, Valve, id Software, and the like; once those start using 4+ cores, so will most games (see the worker-pool sketch after this list).
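
A minimal sketch of that worker-pool pattern, with a pure-Python busy loop standing in for real per-frame work (the batch count and the arithmetic are made-up illustration values, not engine code):

[code]
# Minimal worker-pool sketch: split a frame's independent jobs across
# every available core, the way 4+ core-aware engines schedule work.
# The arithmetic below is a stand-in for real physics/AI updates.
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_batch(seed: int) -> float:
    """Stand-in for one batch of per-frame entity updates."""
    x = float(seed)
    for _ in range(100_000):
        x = (x * 1.000001) % 97.0
    return x

if __name__ == "__main__":
    batches = range(64)  # 64 independent job batches for this "frame"
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(simulate_batch, batches))
    print(f"updated {len(results)} batches on {os.cpu_count()} cores")
[/code]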

[citation][nom]killerclick[/nom]Here's a benchmark comparison of the FX-8350 and i5-3570K: http://www.anandtech.com/bench/Product/697?vs=701 The i5-3570K comes out as the overall winner; it loses only in heavily threaded applications, but consumes less power and also has a potentially useful GPU, unlike the FX-8350. I'd love to be able to justify buying AMD again, but they're just too far behind. AMD bet on GPU and multicore, and their bet is still not paying off.[/citation]

At least in Europe, the AMD chip costs 25 euros less and also runs on a cheaper mobo (often forgotten when comparing the AMD vs. Intel price/performance ratio).
 
[citation][nom]bustapr[/nom]The 8350 trades blows with the 2500K in most benchmarks. The 8350 costs around $190; the 2500K still costs around $210. An 8300 would be cheaper and amazing bang for the buck if it isn't too much weaker than the 8350.[/citation]

The 2500K overclocks further and is more efficient, and of course the K edition costs more because it's unlocked.
 
[citation][nom]bjaminnyc[/nom]Nice chip. Curious how well it overclocks; it should run cool at 95W.[/citation]

Sometimes the more efficient chips end up overclocking less.

I remember a time when people considered the Intel Prescott 89W chips "too hot" (given their average-at-best performance for the time); that was a long time ago now.

Oh, and 95W isn't that "cool", but for an eight-core chip at 3+GHz, yes, that's pretty darn cool.
 
[citation][nom]apache_lives[/nom]Sometimes the more efficient chips end up overclocking less. I remember a time when people considered the Intel Prescott 89W chips "too hot" (given their average-at-best performance for the time); that was a long time ago now. Oh, and 95W isn't that "cool", but for an eight-core chip at 3+GHz, yes, that's pretty darn cool.[/citation]

Well, back in the day, coolers were oftentimes just solid blocks of aluminum (i.e. very weak compared to most modern coolers), so even 60-90W CPUs could run very hot. Also, to be fair to Netburst, it really wasn't too bad an architecture. A huge part of its performance problem was its huge memory bottleneck, which is easily shown by comparing Netburst's performance with DDR-400, DDR2-667, and DDR3-1600.
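
That kind of memory-bottleneck argument is easy to sanity-check with a crude bandwidth probe; this sketch (numpy assumed available) just times big array copies, which are bound by memory throughput rather than the CPU core:

[code]
# Crude main-memory bandwidth probe: stream copies of a buffer far
# larger than any cache and report effective GB/s. Only the relative
# numbers between machines/RAM configs are meaningful.
import time
import numpy as np

n = 32 * 1024 * 1024                    # 256 MiB of float64, well past cache
src = np.ones(n, dtype=np.float64)
dst = np.empty_like(src)

reps = 10
t0 = time.perf_counter()
for _ in range(reps):
    np.copyto(dst, src)                 # ~512 MiB of RAM traffic per pass
elapsed = time.perf_counter() - t0

gb_moved = reps * 2 * src.nbytes / 1e9  # read + write
print(f"~{gb_moved / elapsed:.1f} GB/s effective memory bandwidth")
[/code]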

Unlike the Athlon 64, Netburst didn't have an on-die memory controller, and that was a big issue. Core 2 was more than a better architecture: it also dealt better with poor memory performance, and it was generally paired with better memory than Netburst was.
 
[citation][nom]kenyee[/nom]Here's hoping they redesign it using hand layouts instead of autorouted layouts and rework the architecture for the next version so Intel has competition again. The Ivy Bridge "upgrade" from Sandy Bridge was delayed and minor because they had no real competition :p[/citation]

Ivy Bridge being minor compared to Sandy had little to do with AMD. It was a tick in Intel's schedule, and ticks generally don't bring much performance. I agree with you on what AMD should do, but that doesn't change my stance on Ivy.
 
[citation][nom]Madjimms[/nom]Double cache & shrink the node.... THEN it would be absolutely beastly![/citation]

AMD is already beyond the point where adding cache helps performance in most workloads. They need faster cache, not more cache.
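
For a feel of why latency beats capacity, here's a rough sketch (numpy assumed; Python overhead blunts the gap, but the trend still shows): random reads from a cache-resident buffer versus a RAM-resident one.

[code]
# Rough illustration that access latency, not capacity, is the pain:
# time random gathers from a small (cache-resident) buffer vs a large
# (RAM-resident) one. Sizes and counts are arbitrary illustration values.
import time
import numpy as np

rng = np.random.default_rng(0)

def gather_time(n_elems: int, n_gathers: int = 2_000_000) -> float:
    data = np.zeros(n_elems, dtype=np.int64)
    idx = rng.integers(0, n_elems, size=n_gathers)
    t0 = time.perf_counter()
    data.take(idx)                      # random reads: latency-bound when large
    return time.perf_counter() - t0

small = gather_time(4 * 1024)           # 32 KiB: fits in L1 cache
large = gather_time(64 * 1024 * 1024)   # 512 MiB: forced out to RAM
print(f"cache-resident: {small:.3f}s  RAM-resident: {large:.3f}s")
[/code]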
 

demonhorde665

[citation][nom]apache_lives[/nom]Ummmm, you do know the i7 3770K is a 77W chip, and even the old i5 2500K at stock eats those AMDs alive, right? More positive news means they're tweaking the process some more, squeezing more out of it -- a few reviews will show whether this increases the overclockability or hampers it.[/citation]


You obviously don't pay much attention to benchmarks. The Piledriver 8-cores perform at nearly the same level as a Sandy Bridge i7 in most regular applications, so NO, the old i5 (Sandy Bridge) does not "eat the AMDs alive"; in fact, the FX beats out the Sandy Bridge i5s in nearly every application except ones obviously optimized for Intel CPUs. Do some research before spewing fanboy nonsense.
 

demonhorde665

Really, OK, first I want to say I'm not a fanboy, but I'm sick of hearing Intel fanboys say there is no value in AMD anymore and then go on to talk about gaming. Get f---ing realistic here: i5, i7, or Bulldozer/Piledriver 8-core, there is next to no noticeable difference in gaming frame rates. THE ONLY TIME one of those four chips makes a difference is when you get into productivity apps like 3ds Max, AutoCAD, or Adobe Premiere. As for cost savings, sure, you can get an i5 priced the same as an AMD chip, but if you want a same-level mainboard to go with that i5, it will cost more than the AMD mainboard, so in the end you still pay more for Intel. Also of note: while Intel does do better in work apps, AMD doesn't do badly in them. AMD is still a viable platform on a budget, and to say otherwise, or to insult "the other camp" for saying AMD is still in the game, now those are the true marks of fanboyism.
 