
Ags1

Honorable
Apr 26, 2012
I've always been partial to AMD and I like the prices of bulldozer CPUs (and associated motherboards), but the performance lags notably behind Intel. It seems in the few years I've been away from computing hardware AMD has lost its way a little. Are there any bulldozer chips that represent particularly good value?

And what would a 6 core bulldozer compare to in the intel architecture? From benchmarks it would seem they compare to (and slightly beat) Intel quad cores.
 
The 6-core Bulldozers actually tend to fall behind Intel's current quad-core offerings, even when the software is heavily threaded. The FX-8120 and FX-8150 are competitive with the current i5 and i7 quad-core CPUs in certain heavily threaded tasks that can take advantage of the higher core count. For less threaded tasks the quad-core Intel offerings vastly outperform AMD's 8-core CPUs. For most day-to-day tasks the Bulldozer CPUs fall well behind Intel, as very few programs are designed to use more than 4 cores; a lot of programs still use only 1 or 2.

I suppose the FX-4100 is probably the best value overall if you really need a quad core and have a very limited budget, say only $120 for a CPU. In certain heavily threaded tasks it can outperform the similarly priced Core i3-2100, which is a dual core with Hyper-Threading. The 8-core models may also be worthwhile if you have software that takes particular advantage of the high core count.
 

night wolf

Distinguished
Jan 1, 2012
I've heard from people I know who bought Bulldozer, and they told me the CPU is very disappointing compared to the Intel CPUs.

Personally, I'd say go for the Intel 2600K, but Bulldozer is a lot cheaper than the Intel, and don't forget you'd need to buy a new mobo as well with the Intel.

So if your budget won't allow the Intel, go for the AMD.
 
The less "failish" of the bunch would have to be the 8120 if you're willing to do some mild OC before melting the MoBo, haha. Price wise, it's not such a bad purchase if you already have an AM3+ MoBo and something lower than a Phenom II 965 with no OC.

Cheers!
 

Ags1

Honorable
Apr 26, 2012
I'm thinking my next build should involve a nice socket 1155 motherboard and the cheapest Celeron I can put on it. You can get a 2.6GHz Celeron for 36 euros, so it won't cost much as a disposable stepping stone to a 2500K. It's a shame though - I've always liked AMD.
 

Terry1212

Distinguished
Aug 3, 2009
There's nothing wrong with going AMD. What it comes down to is the user experience. At the end of the day, you're not gonna care if the Intel CPU got 100 fps and your AMD CPU got 80 fps. You'll save money and at the same time won't be able to tell a difference. This is assuming it's for gaming. There may be areas where Intel CPUs have a noticeable advantage, but again, you've got to take into account your own personal user experience. Personally, I'd rather save money and just have an overall quality machine.
 

sykozis

Distinguished
Dec 17, 2008
For gaming: Does it really matter what the framerate is, if the game play is smooth?

For MultiMedia: Does it really matter if it gets lower benchmark scores, if it browses the internet and plays media just as well as more expensive processors?
 

welshmousepk

Distinguished
What the framerate is determines how smooth it is. And I disagree with the above poster: you CAN tell the difference between 80 and 100fps.
I don't see any reason to buy Bulldozer when in virtually every task the Intel equivalent is cheaper and more powerful.
 

InvalidError

Titan
Moderator

Since Intel has faster chips than AMD at nearly every price point, AMD is a gray area even for cost-cutting, unless you want to use their IGP, which is twice as fast as Intel's best.
 

sykozis

Distinguished
Dec 17, 2008

On a 60Hz monitor, there is no difference between 80 and 100fps: the output framerate is still only 60fps, as that's all the monitor can actually display. The difference between 80 and 100fps only matters if you're using a 120Hz monitor. So, the "above poster" is completely accurate.
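To put rough numbers on it, here's a quick back-of-the-envelope sketch (Python purely for illustration, assuming a simplified model where the monitor just shows the newest complete frame at each refresh):

# Simplified model: a fixed-refresh display shows at most one frame per refresh,
# no matter how fast the GPU renders. Real vsync/scanout behaviour is messier.
refresh_hz = 60
refresh_interval_ms = 1000 / refresh_hz          # ~16.67 ms between refreshes

for rendered_fps in (60, 80, 100, 120):
    frame_time_ms = 1000 / rendered_fps          # time the GPU spends per frame
    shown_fps = min(rendered_fps, refresh_hz)    # the screen can't show more than 60
    dropped_fps = rendered_fps - shown_fps       # frames rendered but never displayed
    print(f"{rendered_fps:>3} fps rendered -> {shown_fps} shown, "
          f"{dropped_fps} never displayed (frame time {frame_time_ms:.2f} ms)")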
 

Raidur

Distinguished
Nov 27, 2008
Tell that to my eyes/hands, which can very much tell/feel a difference between 90-120FPS and 280-300FPS on CS:S. 60Hz monitor. Your eyes may not be able to "see it", but the "feel" is definitely there.
 

sykozis

Distinguished
Dec 17, 2008

Regardless of the FPS being reported, a 60Hz display has a maximum output of 60fps. Period. This means every frame over that 60fps barrier is omitted from display. Due to how monitors work, it's 100% impossible to see/feel any difference between framerates once you exceed 60fps. The reported FPS is simply what the graphics card is rendering; it has no relation at all to what is being displayed. You simply want to believe it "feels" smoother and convince yourself it's true, even though it's not.
 

anxiousinfusion

Distinguished
Jul 1, 2011


You are absolutely correct. Beyond a certain point, you just won't notice the difference. But that's where aging comes into play. While today's games will run at (for argument's sake) 80 fps on your build, tomorrow's may run at ~30 fps then ~15 fps and so on. Said Intel processor may get 100 fps today, and later on a still-playable ~50 fps. Without upgrading your system, you will be limiting the amount of time that your system is good for recent games. (But the money you save is for upgrades, right?! :D )
 

InvalidError

Titan
Moderator

Yes and no.

This assumes the graphics card is able to render the vast majority of frames in under 20ms to maintain a nearly perfect illusion of continuous movement. Ridiculously high FPS at least indicates that the brute force to pull this off is present, but as the TechReport article points out, some obscure factors sometimes cause individual frames to take much longer than average to complete, causing perceptible stuttering even on high-end GPUs and SLI configurations... and Radeons seem to be the worst offenders there.
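A rough way to see why average FPS hides this kind of stutter is to look at the worst frame times rather than the mean. A minimal sketch with made-up per-frame numbers (not real benchmark data):

# Hypothetical per-frame render times in milliseconds -- illustrative only.
frame_times_ms = [10, 11, 10, 12, 10, 45, 11, 10, 10, 11]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
slow_frames = [t for t in frame_times_ms if t > 20]   # frames missing the ~20 ms budget

print(f"Average FPS: {avg_fps:.0f}")                  # looks fine on paper (~71 fps)
print(f"Worst frame: {max(frame_times_ms)} ms")       # but this one is a visible hitch
print(f"Frames over 20 ms: {len(slow_frames)} of {len(frame_times_ms)}")

One 45ms frame out of ten reads as a hitch even though the average still looks like a fast card.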
 

welshmousepk

Distinguished


Incorrect. Higher framerate = faster rendering = more control.
This is why console games that require precision control (Forza, for instance) render the video at 30fps but run the physics at 240fps (roughly along the lines of the loop sketched below).

Yes, it won't look any smoother beyond 60, but it will feel smoother because you have more individual frames in which to make your input and control the outcome.

Or do you think console devs are stupid and just making stuff up?
Why do you think pro gamers also go for the highest FPS possible, even on 60Hz monitors?

I absolutely assure you I can tell a difference between 60 and 120fps on a 60Hz monitor.
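Roughly, the decoupling looks like this (a generic fixed-timestep sketch, not Forza's actual code; the rates and names are just illustrative):

# Generic fixed-timestep game loop: physics/input stepped at a high fixed rate,
# rendering at whatever rate the display gets. Illustrative sketch, not a real engine.
PHYSICS_DT = 1 / 240          # ~4.2 ms per physics/input step
RENDER_DT = 1 / 30            # ~33.3 ms per displayed frame

def run(seconds=1.0, loop_dt=0.001):
    t = accumulator = next_render = 0.0
    physics_steps = frames = 0
    while t < seconds:
        t += loop_dt                       # pretend each loop pass takes 1 ms
        accumulator += loop_dt
        while accumulator >= PHYSICS_DT:   # step physics (and sample input) at 240 Hz
            physics_steps += 1             # update_physics(); poll_input()
            accumulator -= PHYSICS_DT
        if t >= next_render:               # draw far less often than we simulate
            frames += 1                    # draw_frame()
            next_render += RENDER_DT
    print(f"{physics_steps} physics steps, {frames} frames in {seconds:.0f}s")

run()

The point is just that input and physics get sampled many more times per displayed frame, which is what people mean by it "feeling" more responsive.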
 

InvalidError

Titan
Moderator

Run Fraps and I bet you'll find that the reason you can "tell a difference" is that your setup regularly fails to render some frames in under 16.67ms with vsync on, and what you are actually noticing is the stutter when that happens, not the extra FPS itself.
 

welshmousepk

Distinguished
Nope, it just feels generally smoother due to the higher framerate.

I'm not just making stuff up; console devs regularly run physics engines at a higher framerate than video. Why would they do this if it couldn't possibly make it smoother?

A good way to test this is to switch your monitor to 24Hz mode. If you still get 60fps, it will look choppy but feel MUCH smoother than the same game at 30 frames per second on a 60Hz monitor.
 

anxiousinfusion

Distinguished
Jul 1, 2011


Unless OP plans on being a pro gamer, considering a BD purchase is a rational decision.
 

welshmousepk

Distinguished


I still don't understand this logic.

BD = less performance for the same money.

Regardless of whether you'll actually use the performance, why wouldn't you want it?

Don't get me wrong, I'm an AMD fan who used AMD chips in nearly every build until SB came out, but SB has since wiped the floor with everything AMD has released.
When it was Phenom 955 vs i5 750, AMD had the clear advantage on price. But with SB you stand to save maybe 40 dollars for a considerably lower-performing chip.
 


Exactly what I was thinking. Which is why I don't understand why some people are so hellbent on buying the Bulldozer anyway.
 

InvalidError

Titan
Moderator

Someone has to contribute to AMD's welfare plan. Since we need AMD to stay afloat to avoid ending up with an x86 CPU monopoly, people should cut AMD fanboys some slack. If they still want to buy AMD despite benchmarks saying Intel currently has the best bang-per-buck at nearly every price point, let them.
 