AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

8350rocks

Distinguished
I currently have no problems with FX8350 + R9-290X...so I fail to see where the issue would be otherwise...

EDIT: MANTLE is likely going to be a great API for devs who want to run on Steam boxes...so DX12 is jumping on the bandwagon for Windows...meh.
 

iron8orn

Admirable


I see that.

Hey, what are your CPU-NB speed and voltage at for 2400 MHz RAM? I was having trouble getting 2133 MHz stable.
 

8350rocks

Distinguished
Well, the RAM is technically running at 2133 MHz; however, it is DDR3-2400 RAM, so while it runs at 2133 in an overclocked state, it is capable of running higher. The NB is overclocked 10%; voltage is stock...
 
There is no such thing as an "upgrade path"; that stopped being realistic about three to four years ago. Technology is changing at such a pace that you will need to replace your CPU/MB every three to four years, and socket compatibility for future upgrades is a thing of the past.

Putting a dGPU on a platform with an APU is sheer folly and completely misses the entire point of the APU in the first place (as of now anyhow). APUs are for low-profile / low-power devices, things like living room entertainment PCs or something for your children to use. Enthusiasts won't really be using them unless you like to build project PCs for sh!ts and giggles. That is why so many here hate them so much; they don't see a use for them because they don't personally have a use for them, yet there is definitely a large market segment they belong in.

I currently have no problems with FX8350 + R9-290X...so I fail to see where the issue would be otherwise...

Bottlenecking in terms of CPU vs GPU has nothing to do with bandwidth; we haven't come anywhere close to saturating the PCIe interconnects. Typically it's when a particular CPU doesn't have enough performance to feed geometry and vertex data to a GPU, and it is usually the result of the coders using a single thread that packs rendering in along with everything else. Modern APIs now have multi-threaded rendering support, and as such there should be FAR less "bottlenecking" taking place.
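
To make the "single thread that packs rendering along with everything else" point concrete, here is a minimal sketch in plain C++ (no real graphics API involved; the Frame struct, the queue, and the logging are invented for illustration) of the alternative: the game thread builds a frame's worth of draw commands and hands it off, and a dedicated submission thread is the only one paying the API/driver cost.

#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Stand-in for the per-frame data a game thread would hand to the renderer.
struct Frame { int id; std::vector<int> draw_commands; };

std::mutex m;
std::condition_variable cv;
std::queue<Frame> pending;   // frames waiting to be submitted
bool done = false;

// The only thread that talks to the (pretend) graphics API, so game logic
// never stalls behind draw-call submission.
void render_thread() {
    std::unique_lock<std::mutex> lock(m);
    for (;;) {
        cv.wait(lock, [] { return !pending.empty() || done; });
        if (pending.empty()) return;            // done and nothing left to submit
        Frame f = std::move(pending.front());
        pending.pop();
        lock.unlock();
        // Real code would issue the draw calls here; we just log the hand-off.
        std::printf("frame %d: %zu draws submitted\n", f.id, f.draw_commands.size());
        lock.lock();
    }
}

int main() {
    std::thread renderer(render_thread);
    for (int i = 0; i < 3; ++i) {                 // "game loop": simulate, then enqueue
        Frame f{i, std::vector<int>(100, 0)};     // pretend we built 100 draw commands
        { std::lock_guard<std::mutex> g(m); pending.push(std::move(f)); }
        cv.notify_one();
    }
    { std::lock_guard<std::mutex> g(m); done = true; }
    cv.notify_one();
    renderer.join();
}

The exact mechanism doesn't matter; the point is that submission overhead no longer eats into the time budget of the thread running game logic.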
 

iron8orn

Admirable


My thinking really had to do with the single thread performance.



 



It's far more complicated than that. The vast majority of rendering work happens inside the GPU, but before the GPU can actually do anything, vertex and geometry data need to be fed into it. That is a very parallel task, though prior to DX11 developers were forced to treat the render pipeline as serial. This gets worse if the developer does it inside the main control thread, since that dramatically limits your performance ceiling.

The amount of data required vs how much rasterization work the GPU has to do is what creates the balance between the two, and why you can get different benchmark numbers depending on who's doing the benchmarking. Different graphics settings place more burden on the GPU (rasterization / lighting) or the CPU (vertices and geometry data). APIs like MANTLE work by being more efficient at the part where the CPU sets up the data for the GPU to use, while also offering a render pipeline that can host multiple render threads. NVidia's GPU drivers are more efficient at managing that data feed than AMD's (ATI) are.

It's also why I laugh whenever I hear such gross statements as "X bottlenecks Y", there are so many factors involved that such statements become useless.
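
To put the "render pipeline that can host multiple render threads" idea above into code, here's a rough sketch in plain C++ standing in for DX11 deferred contexts or Mantle-style command buffers (CommandList and record_slice are made-up names, not any real API): worker threads record draw commands for their slice of the scene in parallel, and only the final, ordered submission stays serial.

#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

// Pretend "command list": just the indices of the objects a worker recorded.
using CommandList = std::vector<int>;

// Each worker records draw commands for its slice of the scene in parallel.
// Pre-DX11 (and with naive DX11 use) all of this was effectively serialized.
CommandList record_slice(const std::vector<int>& objects, size_t begin, size_t end) {
    CommandList cl;
    for (size_t i = begin; i < end; ++i)
        cl.push_back(objects[i]);   // real code: cull, build constants, record the draw
    return cl;
}

int main() {
    std::vector<int> scene(10000);
    std::iota(scene.begin(), scene.end(), 0);   // 10,000 stand-in objects

    const size_t workers = 4;
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;
    const size_t chunk = scene.size() / workers;

    for (size_t w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w + 1 == workers) ? scene.size() : begin + chunk;
        pool.emplace_back([&, w, begin, end] { lists[w] = record_slice(scene, begin, end); });
    }
    for (auto& t : pool) t.join();

    // Single, ordered submission of the recorded lists -- the only serial step left.
    size_t total = 0;
    for (const auto& cl : lists) total += cl.size();
    std::printf("submitted %zu draws from %zu command lists\n", total, lists.size());
}

Under the older serial model all of that recording work lands on one thread, which is exactly where the "CPU can't feed the GPU" situation comes from.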
 

iron8orn

Admirable
Maybe I think of it differently than others... but I think of a bottleneck as an all-around scenario, whether it be DX9 on one thread or DX11/12 on eight.

I assume DX12 will be backwards compatible, as DX11 is.

I really hope AMD just makes the needed adjustments to the new cards to be as efficient as possible with DirectX instead of continuing Mantle, which from my point of view is a right-now thing and nothing else. Something of a waste of time, and for lower-end cards, no?


 


As obvious as it will sound, I disagree with just one point, palladin.

You can still talk about "bottlenecking" using your own argument as the basis. For a given game, built on a given game engine, you do know which combination of GPU and CPU will be bottlenecking each other. So it's kind of semantics, but you can talk about bottlenecking for a given set of fixed variables (Game - CPU - GPU) :p

Maybe using it when you don't really know what you mean is what is funny. I was there at some point, but then I learned, haha.

Cheers!
 
btw, it looks like if gpus keep growing in performance like this, cpus might become irrelevant in the next 2-3 years and get replaced by gpus with decent per-core performance. won't that be great?


when i was talking about bandwidth i wasn't talking about pcie, prolly shoulda specified. i was talking about how much data cpus can process and output without the gpu stalling. in retrospect, maybe shoulda said latency as well.

we, the less knowledgeable people (e.g. moi), use "x bottlenecks y" as an (over)simplification of an end situation (in terms of gaming experience), since knowing the mechanism doesn't always help in obtaining the desired performance. i am not ignoring the knowledge, mind you. but the game coding, gpu, driver design, cpu design are out of the user's hands. when someone else asks "will x bottleneck y?" the most i can offer is a "specification" as in "in which tasks, on which pc config (incl. software), will x bottleneck y?", treating the bottlenecking situation as the worst case scenario.

edit: when someone asks questions like that, there is always a certain amount of helplessness and frustration involved. :)
 

iron8orn

Admirable
 

some of the trinity and kaveri apus, and intel's own cpus, have "pro" or "workstation" designations, but that's due to specialized software and support, not advanced/high-performance hardware capabilities. the underlying chip is usually the same as the consumer chips.
 


Except graphics options and the control panel exist. Every last one of those settings changes the performance dynamic and thus the balance. Some games are also highly moddable, which further changes that balance, e.g. Skyrim. Skyrim looks to be a "CPU bound" game because the original release is very much limited by "console experience" crap. Load in some high-def lighting, weather / environmental and other mods and suddenly it does a 180 and becomes a very "GPU bound" game, not to mention you can manually increase the threading, just don't go too high. My Skyrim setup was pushing my two 780 Hydros to the limit while having plenty of CPU power on my 8350fx @ 4.8 GHz to spare.

Different settings can do different things, and by carefully changing what's in the application control profile I can make the game perform better or worse on any particular hardware setup. This is because some of those settings involve heavy CPU-side calculations while others require more GPU-side work. Something as simple as the mode of AA you choose can favor the CPU or the GPU. This is one of the reasons I favor NVidia GPUs; their drivers are much better at not inflicting additional CPU load whenever possible.
 

juanrga

Distinguished
BANNED


I agree with Yuka. Moreover, the main point of MANTLE is to eliminate the API overheads that make some CPUs bottleneck some GPUs. During the MANTLE presentation it was shown how an 8350 underclocked to 2 GHz could feed a 290X in a demo with about 10x more draw calls than ordinary games. Under DX11 the same CPU would bottleneck the same GPU using the same settings for the demo.
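
The draw-call point is easy to put rough numbers on. A tiny sketch (the per-call CPU costs below are assumptions picked for illustration, not measurements from AMD's demo): a frame's CPU budget divided by the per-call overhead caps how many draw calls you can issue, so a thinner API raises that ceiling by roughly an order of magnitude without touching the GPU.

#include <cstdio>

int main() {
    // Illustrative assumptions only -- not measurements from the Mantle demo.
    const double frame_budget_ms   = 16.7;  // 60 fps target
    const double cost_thick_api_us = 40.0;  // assumed CPU cost per draw call, heavy API/driver path
    const double cost_thin_api_us  = 5.0;   // assumed cost with most validation/translation removed

    const double calls_thick = frame_budget_ms * 1000.0 / cost_thick_api_us;
    const double calls_thin  = frame_budget_ms * 1000.0 / cost_thin_api_us;

    // Same CPU, same GPU: only the per-call overhead changed, and the ceiling
    // on draw calls per frame moved by roughly an order of magnitude.
    std::printf("draw calls per 60 fps frame: ~%.0f (thick API) vs ~%.0f (thin API)\n",
                calls_thick, calls_thin);
}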




AMD will continue developing MANTLE for the same reasons it was developed in the first place.




It would be terrible! GPUs are optimized for throughput; CPUs are (usually) optimized for latency. The whole APU approach and the main emphasis of HSA is precisely to use both kinds of processing units together.

In fact, my prediction is that GPUs will disappear around the year 2020 and will be replaced by asymmetric CPUs.
 

iron8orn

Admirable
I highly doubt Microsoft has not taken Mantle into consideration while developing DX12.

DX11 is nothing new, and game devs continued using old versions, really just holding things back.

I like AMD, but they really just need to keep up with Intel, Nvidia, and Microsoft instead of... well, causing a division, because they have low IPC with high power consumption.

They have a chance to go 14nm vs 14nm in the near future, so will they take it and become a true competitor for enthusiast CPUs?

Even Steamroller was late out of the gate, and we still do not see an FX version, only embarrassingly overpriced superclocked chips.
 

logainofhades

Titan
Moderator



I would think this could play any game coming out and has a bit of upgrade room as well.

PCPartPicker part list / Price breakdown by merchant

CPU: Intel Core i3-4130 3.4GHz Dual-Core Processor ($109.98 @ SuperBiiz)
Motherboard: ASRock H81M-DGS R2.0 Micro ATX LGA1150 Motherboard ($45.99 @ Newegg)
Memory: Kingston Fury Red Series 8GB (2 x 4GB) DDR3-1866 Memory ($72.99 @ Amazon)
Storage: Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive ($52.92 @ Amazon)
Video Card: XFX Radeon HD 7850 1GB Core Edition Video Card ($119.99 @ Newegg)
Case: NZXT Gamma Classic (Black) ATX Mid Tower Case ($29.99 @ Newegg)
Power Supply: EVGA 500W 80+ Bronze Certified ATX Power Supply ($48.91 @ Amazon)
Optical Drive: Lite-On iHAS124-14 DVD/CD Writer ($14.99 @ Newegg)
Total: $495.76
Prices include shipping, taxes, and discounts when available
 

logainofhades

Titan
Moderator
The i3 is a valid gaming CPU. That system would be far better than a console. You are kidding yourself to think otherwise. I cannot consider any AMD CPU for a gaming system at this point, unless you live near a Microcenter. I love their GPUs, but their CPUs just do not impress me anymore for a gaming rig. I do want a Kabini based HTPC, though. I really hope the new arch they are working on is a winner. I want to see competition. Right now there really isn't any on the CPU side. Faildozer's concept was a bad direction to go in. Considering they are scrapping it, AMD realizes this as well.
 
core i3s don't have staying power. they're okay as starter gaming cpus but lose their performance edge soon when demanding games come out. amd's starter gaming cpu is the 750k; it performs more or less on par with the lower core i3s, except those crazy 3.8GHz ones.
due to performance and upgrade reasons, neither intel nor amd have "upgradable" platforms. intel's lga1150 will soon be replaced by lga1151 and, at the high end, lga2011 (haswell-e). meanwhile amd's entry level is isolated on socket fm2/+ and, at its high end, am3+. cpus like the core i3 and fx6300 (due to present pricing) occupy that gap.
 

logainofhades

Titan
Moderator
LGA 1150 has at least a good year left, given the fact we haven't seen Broadwell yet. Granted, any Haswell/Haswell refresh i5/i7/Xeon are still going to be very capable up to, and probably past, Skylake. I am sitting two years now on my 3570k, and still don't really feel the need for an upgrade. I would get the i3 now, replace it with a 1231v3 when funds allow, and enjoy performance that AMD cannot match right now.

AM3+ is essentially dead, and FM2+ is just plain sad. I had high hopes for Kaveri, at least on the GPU side, and was quite disappointed. Instead of near HD 7750 performance some were predicting, we got R7 240 performance, which isn't much better than the HD 6670 it replaced. AMD needs a die shrink, badly.
 

8350rocks

Distinguished
Broadwell will likely not be on LGA 1150...expect a new rash of Z107 boards to come out and no backward compatibility yet again. The Haswell refresh was LGA1150's last hurrah...hence no Broadwell yet...because Intel knew it would not be compatible.

Die shrink(s) are coming... but Rome was not built in a day, you know.
 

con635

Honorable
An Athlon/Pentium and HD 7750/GT is better than a console for most. You guys are assuming people game on PC for visuals alone; people like that are a very tiny minority. I'm amazed if anybody plays on PC for fifaxx/cod/halo etc. only for 1080p/4k/x8aa etc. I and everyone I know who games on PC does so for the games that aren't on console, and for the communities that come with those games, who mod them and make them last many years longer, and which also don't exist on console. Also, a lot of games that come from console to PC have been very poorly optimized, to the point that the weaker console hardware plays them just as well, so why buy a big £££ PC for this?
I'll say it again: the myth that you need high-end h/w to game is one that only hurts PC gaming, especially the small devs.
 

blackkstar

Honorable


Did you miss the part where I said that anything lower than GTX 760 or R9 280 was a waste since the GPUs in the Xbone and PS4 are just a little below that?

You can't do a decent gaming PC for $500. You need to spend $700 at least, and that's expecting that you already own a monitor. If you don't have a monitor, you're basically going to be coming close to $1000 for an "ok" gaming PC from completely scratch.

Meanwhile, I think it's safe to say that there are more people with TVs only than there are people with computer monitors only, and you don't (usually) need a new TV for a console.

It's just not possible. I know Intel guys never like to admit that an Intel product isn't up for the task, but nothing below FX 6300 or i5 and GTX 760 or R9 280 is worth it for a gaming PC, unless you really plan on using that PC for other things and you don't own a laptop.

But that i3 is $10 less than FX 6300. I really don't feel like going through a ton of benchmarks and showing you that i3 wins when it doesn't matter (60fps+ already, etc) while FX 6300 wins hard in situations that use more cores, but something like that is getting close.

The cheapest 280 I can find is $190 after rebate on newegg. So you add another 100 dollars or so for a vastly superior rig.

Or are you going to tell me that Core i3 + 7850 is just as good as the extra $100 you spend on FX 6300 + R9 280? Either way it brings me to my point that a $500 gaming PC sucks, and that price is irrelevant anyway because the PS4 is only $400. The Xbone has a $400 bundle without Kinect now.

You aren't beating a console with a gaming PC if you're budget-constrained. And I'm not even touching the fact that if you just want to play games, you can get a refurbished Wii U for $200 with a free game.

Or the fact that a console is (probably) going to get you on a big LCD TV that looks great while a budget computer monitor is going to be some god-awful TN panel trash that turns into eye cancer as soon as you move your head.

http://pcpartpicker.com/p/VCZVGX

That is the bare minimum I would even consider telling someone to build, and it sucks hardcore. Unless you're telling me you're going to:

A. Use an old monitor that might not even be 1080p (ha, the irony of PC gamers going "lol PS4 and Xbone can't 1080p" and then buying a crappy lower-resolution TN panel)
B. Pirate Windows
C. Are ok with these god-awful bottom of the barrel parts

Do you want more proof? I basically took the cheapest options I could possibly find on PCPartPicker for the specs I was looking for. You can probably do around $500 if you plan on pirating Windows (or building a gaming PC to run Linux) and you have an extra monitor handy, but that's a lot of assumptions to deal with. Ones that your build doesn't even take into account.
 