AMD CPU speculation... and expert conjecture



This is kinda what I've been saying for a year now. To code for multiple processor resources you have to rethink the problem. Crysis kinda did this by shifting some of the burden to the CPU under the assumption that everyone would have at least three to four cores. That shift reduced some of the burden on the GPU, which in turn allows the GPU to process more data.
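
To make that concrete, here's a minimal sketch (not CryEngine code; the Particle struct and both function names are made up for illustration) of splitting one per-frame job across however many cores the machine actually reports:

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Hypothetical per-frame update for one slice of the particle array.
void updateParticleRange(std::vector<Particle>& p, std::size_t begin,
                         std::size_t end, float dt)
{
    for (std::size_t i = begin; i < end; ++i) {
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

// Split the update across however many hardware threads exist
// (the "at least three to four cores" assumption from the post).
void updateParticles(std::vector<Particle>& particles, float dt)
{
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (particles.size() + n - 1) / n;
    std::vector<std::thread> workers;

    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = std::min(particles.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(updateParticleRange, std::ref(particles),
                             begin, end, dt);
    }
    for (auto& w : workers) w.join();
}
```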

Gamer's just kinda angry cause his old declaration of "games never needing more than two cores" was just handily disproved. Good engineers, when faced with an insurmountable problem, do not just throw their hands up and quit. They find creative ways to rewrite the rules so that the insurmountable problem is no longer insurmountable.
 

mayankleoboy1

Distinguished
Aug 11, 2010


How do we know Crysis 3 is not just calculating prime numbers on 2 threads, just to make the game appear to be multithreaded?

 


o_O?

http://tinypic.com/view.php?pic=mh3gud&s=6

Benchmarks show the dual-core CPUs lagging significantly behind the 4+ core CPUs, especially in the demanding "Welcome to the Jungle" level. There was a link posted earlier that explained that C3's engine moved some code that traditionally runs on the GPU onto the CPU. Since gaming by and large tends to be "GPU limited" and typically 25~60% of a CPU's resources go underutilized during game play, they made the decision to move it over to free up GPU resources and get more performance out of their product. You really notice this with the i3; it gets creamed. We're going to be seeing more and more games utilize 3~4 cores, possibly more, as we move forward.
 
Now how does all this talk relate to SR? Well, it's just demonstrating that "IPC ONRY" and "you don't need more than two cores for gaming" are not valid metrics for evaluating a processor's capabilities / value. We're beginning to move beyond that, and a processor with multiple cores is going to be evaluated under better conditions than previously. Intel still has the advantage of the best engineers money can buy and a dominating R&D budget, so their CPUs are still going to be amazing. Just don't expect i3s to be beating FX-6xxxs anymore.
 

mayankleoboy1

Distinguished
Aug 11, 2010


So I had a thought.
AMD has about 15% of the desktop market. My guess is that of that 15%, at least 60% of the people are AMD fanbois, who will buy only BD/PD/Trinity/Llano products, mainly due to fanboism and ideological reasons (supporting the underdog, big corps are evil, Intel bribes S/W devs, etc.).

So even if Intel were to reduce the price of all i3 and i5 procs by $50-$70, they still can't break into that "AMD fanboi" market space. (Which would be about 10% of the total market).
 


It's not dynamic; this is something the developers actively chose to do. They gambled on four-core processors being the norm amongst their target demographic and designed their product around that target.
 


That is a whole lot of flamebait.

AMD's more prolific market is not us enthusiasts but OEM offerings. OEMs like HP, Dell, Sony, and Lenovo sell the vast majority of the world's computers, and their buyers tend to choose based on price. They walk into the store, or nowadays go to the website, with a fixed budget in mind and browse the listed products. They pick out one that *looks* like it will do what they want it to do and is at or under that price point. OEMs know this and create their offerings around these price points while encouraging you to "price up" with optional extras.

This is important because AMD CPUs tend to be cheap: the FX-8350 is something like $190 on Newegg, the 3570K is $210~219, the 3770K is $329, and the FX-6300 is ~$130 USD. From an OEM's point of view they can lower their prices or increase their profit margins by going with AMD CPUs over Intel's in the value segments, as long as the product gives the expected experience to the consumer; it wouldn't be good to get any bad press, after all.
 

griptwister

Distinguished
Oct 7, 2012
The only fanboys I see here are the people that come onto an "AMD" thread to defend their "Intel" product in ONE Crysis 3 benchmark where a quad-core Intel lost to an 8-core AMD. Seriously, it's kind of annoying. We do understand Intel is ahead, but AMD isn't that far behind. And especially with the value AMD has over Intel, the performance is Phenomenal. (No pun intended)

Also, what kind of an excuse is "The GPU should be doing more"? Have you considered that there is a reason why they offload more things onto the CPU in Crysis 3? The game was optimized to "melt" your PC. lol.

*EDIT* Found this... not sure if legit or not, but interesting nonetheless.

http://wccftech.com/amd-kaveri-apu-features-gddr5ddr4-memory-controller/
 
So I had a thought.
AMD has about 15% of the desktop market. My guess is that of that 15%, at least 60% of the people are AMD fanbois, who will buy only BD/PD/Trinity/Llano products, mainly due to fanboism and ideological reasons (supporting the underdog, big corps are evil, Intel bribes S/W devs, etc.).

So even if Intel were to reduce the price of all i3 and i5 procs by $50-$70, they still can't break into that "AMD fanboi" market space. (Which would be about 10% of the total market).

This is like watching one of those R-rated Samuel L Jackson movies, where every second word is an expletive. Just in this case every second word was some form of fanboyism.

I trollalala'd my way to the Lollipop lounge at the desperado efforts at WUMage.

But tell me, why can't people use their AMDs, which basically are all well up there on FPS and give a good blend of gaming and heavy system usage at a not-expensive cost? Or are you trying to play the fool martyr here, justifying why you bought whatever Intel chip you have? I mean, my HTPC can play FIFA 13 at maxed settings in full HD while the HD 4000 stutters along its merry way. F1 2012, DayZ, Stalker, Diablo 3, Mirror's Edge, Dead Space 1 and 2, and FEAR are just some of the games I can play maxed out on my (in your impressions) rubbish Trinity. BF3 plays at 16:9 with low settings and maxed mesh quality at over 40 FPS on multiplayer servers, and it can run Skyrim on medium/high settings at 1366x768, or on medium at 1600x900, and still be playable.

In actual fact Intel's iGPU is further behind AMD's than AMD's x86 is behind Intel's, and Haswell will be further behind Kaveri than Ivy is behind Trinity. So it comes down to buying what makes you happy. Most here are fully content with the value and performance, along with the innovations AMD at least tries to give people rather than brick-walling x86.

 

blackkstar

Honorable
Sep 30, 2012


It isn't that hard of a concept. If you have a GPU at 100% load and a CPU at 25% load, what do you want to shift your load to?

Ideally, both would be able to hit 100% usage and both would be simultaneously maxed out.

If Crysis 3 was CPU bottlenecked on everything that came out and GPUs were sitting around at 50% usage, of course everyone would be jumping on OpenCL/DirectCompute/CUDA. Why wouldn't you? Do you not find it ridiculous that people are spending $500+ on CPUs and they're not seeing much out of gaming from it at all?

Why drive down a crowded street when you can drive down one that's barely used?
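
That "barely used street" routing doesn't have to be complicated; here's a hedged sketch of the idea, where the queue-depth number and all three functions are stand-ins rather than any real engine or driver API:

```cpp
#include <cstdio>

// Stand-in: pretend the driver reports how many frames of GPU work are queued.
int gpuFramesQueued() { return 3; }

void processOnGpu(int jobId)        { std::printf("job %d -> GPU compute path\n", jobId); }
void processOnCpuWorkers(int jobId) { std::printf("job %d -> CPU worker threads\n", jobId); }

void dispatch(int jobId)
{
    // If the GPU already has more than two frames of work backed up,
    // run this job on CPU cores that would otherwise sit at ~25% load.
    if (gpuFramesQueued() > 2)
        processOnCpuWorkers(jobId);
    else
        processOnGpu(jobId);
}

int main()
{
    for (int job = 0; job < 4; ++job)
        dispatch(job);
    return 0;
}
```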

It's a very good trend. It's going to change the landscape from "lol my i3 runs Skyrim just as good as a 3960x!" to "the i3 just got humiliated."

AMD is at the biggest advantage since they have the PS4 CPU and they probably have the Xbox CPU. For the first time ever consoles are going to be just like PCs, and it's going to mean optimizing for console is the same as optimizing for PC. I simply can't fathom why you would think that getting more usage out of components would be a bad thing.

CPUs have been at a horrid stand-still and it's killing the desktop market. I've ranted about this before, but in the 5 years leading up to Nehalem we went from horrible Netburst single cores to the Core i7 920. In the last 5 years we've gone from Nehalem to Sandy Bridge to Ivy Bridge (which was hardly an upgrade), and now Haswell is rumored to have about a 10% to 15% increase in performance?

And no one can figure out why no one buys desktops anymore? Because if you've built one in the last 4 years, you have no reason to upgrade. If you bought one in 2004, by the time 2008 came along, you'd have one pathetic system if you didn't toss out the CPU.

I mean, look at this as a reminder:

2004: ruled by single core
http://www.tomshardware.com/charts/cpu-charts-2004/Farcry,440.html

2008, four years later:
http://www.tomshardware.com/charts/desktop-cpu-charts-q3-2008/Crysis-1680x1050,818.html
The dominant core counts from 4 years ago do not even show up on the benchmarks, because they'd do so poorly.

2012, four years later:
http://www.tomshardware.com/charts/cpu-charts-2012/-20-Crysis-II,3175.html
Quad cores still dominate.

Notice how from 2004 to 2008 we went from single to quad, and then from 2008 to 2012 we got hex cores and they don't do a damn thing? And look how well the old single cores held up four years later. Notice how in 2012 the duals are doing just fine and giving playable frame rates?


 

mayankleoboy1

Distinguished
Aug 11, 2010


More like: would you rather go in a sports car on a crowded highway, or walk on an empty street? When the highway is crowded, you are still pretty fast. And when it's open, there is simply no comparison.

The devs would have to check this hypothesis in every situation: whether the CPU off-loading is worth it or not, or whether it would still be faster to just let the GPU do all the stuff. There has to be a clear benefit to doing the parallel work on the CPU over the GPU.

I am all for using multiple cores for AI, memory management, stacks, syncing, engine housekeeping and the hundreds of other serial/barely-parallel tasks that go on continuously. But to use the CPU for textures/graphics work is foolish IMO. This approach sounds a lot like "putting a tick in a checkbox for marketing, just to show PC is our main platform".
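
Checking that hypothesis doesn't have to be fancy. A rough sketch of the "measure it on the target hardware first" idea, where both workloads are placeholders rather than real engine or GPU calls:

```cpp
#include <chrono>
#include <cmath>
#include <cstdio>
#include <functional>

// Time an arbitrary callable in milliseconds.
double millisecondsToRun(const std::function<void()>& fn)
{
    auto t0 = std::chrono::steady_clock::now();
    fn();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main()
{
    // Placeholder "CPU path" and "GPU path" workloads for illustration only.
    auto cpuPath = [] { volatile double s = 0; for (int i = 0; i < 1000000; ++i) s += std::sqrt(i); };
    auto gpuPath = [] { volatile double s = 0; for (int i = 0; i < 2000000; ++i) s += std::sqrt(i); };

    double cpuMs = millisecondsToRun(cpuPath);
    double gpuMs = millisecondsToRun(gpuPath);
    std::printf("CPU path: %.2f ms, GPU path: %.2f ms -> use %s\n",
                cpuMs, gpuMs, cpuMs < gpuMs ? "CPU offload" : "GPU");
    return 0;
}
```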
 


Go check the benchmarks. Their method is obviously providing a performance benefit. Using an underutilized resource is faster than not using it at all.
 

noob2222

Distinguished
Nov 19, 2007
I wonder if you even realize this is exactly where HSA is headed. HSA with its IGP will handle all of the computations, while the dGPU will do just graphics.

The tricky part will be when the system is only an APU: how much of the IGP gets shuffled over to computation tasks?
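
For what that IGP/dGPU split could look like in practice, here's a rough sketch using plain OpenCL device selection. This isn't HSA itself (HSA adds shared memory, unified queues, and more); it just shows the "send compute to the integrated GPU, leave the discrete card for graphics" routing, and it assumes an OpenCL runtime and headers are installed:

```cpp
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main()
{
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id plat : platforms) {
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices) != CL_SUCCESS)
            continue;
        if (numDevices == 0)
            continue;
        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, numDevices, devices.data(), nullptr);

        for (cl_device_id dev : devices) {
            cl_bool unified = CL_FALSE;   // integrated GPUs share memory with the host
            char name[256] = {0};
            clGetDeviceInfo(dev, CL_DEVICE_HOST_UNIFIED_MEMORY,
                            sizeof(unified), &unified, nullptr);
            clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            std::printf("%s -> %s\n", name,
                        unified ? "integrated: send compute here"
                                : "discrete: leave it for graphics");
        }
    }
    return 0;
}
```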

Funny how when a game comes out that puts as much strain on the CPU as is available, it's a bad thing just because Intel lost its lead. There is a certain group of people who try to nullify Crysis 3 as a valid benchmark.

Kinda funny now that the shoe is on the other foot no?
 


Heh

HSA is the word I didn't want to bring up yet, though yes, this is exactly what HSA is about. When most people think HSA they usually think of taking CPU-oriented work and putting it on the dGPU; well, that door swings both ways. If the dGPU is becoming overloaded then you could also take work from the dGPU and put it on the less-used CPU. It's about viewing all the resources in a combined manner rather than compartmentalizing everything. Now, it's my understanding that the workload shift in C3 is not dynamic; during the design phase they specifically took features and coded them to be used on the CPU rather than the GPU. Fortuitously it seems to have worked, and the game is using both CPU and GPU resources instead of only using the GPU or forcing everything into two worker threads on the CPU. The only people who are going to be overly butt hurt are the ones that did something like an i3 + 670 because "games only use two cores and are GPU bound anyway".
 
I'll just leave this here, a.k.a. AMD is doing it again :D
AMD_Richland_Page_07.jpg

http://www.bjorn3d.com/2013/03/amd-apu-richland/

moar richland
http://semiaccurate.com/2013/03/12/amd-goes-mobile-first-with-richland/
http://techreport.com/news/24482/amd-intros-35w-richland-mobile-apus
http://www.techpowerup.com/181318/AMD-Announces-Next-Generation-quot-Richland-quot-A-Series-Mobile-APUs.html
 

daerohn

Distinguished
Jan 18, 2009
Three years ago, when I purchased an AMD Phenom II X4 955 with a 4890 graphics card, I did not believe it would perform dramatically better at video converting compared to an Athlon 2 3800 with a 3870 graphics card. Well, I tried turning on the option to use the graphics card's processing ability for the conversion job, started converting a 6GB DVD, and went to make some coffee for myself, as it took around 1 hour on my previous system. When I returned to my desk I saw that the task had already finished in less than 16 minutes. Then I realized that a PC with its GPU allowed to do the calculations together with the CPU is a lot more effective than a CPU-only system. AMD had also seen this and purchased ATI to combine these two technologies; the result is the A series, which outperforms its rivals. I hope they will achieve the same success in desktop CPUs too, as I am not a laptop lover.
 
It isn't that hard of a concept. If you have a GPU at 100% load and a CPU at 25% load, what do you want to shift your load to?

What CPU and what GPU? What if I purchase the game two years from now, with my i7-2600k and NVIDIA 890 GTX? Oh, guess what? The CPU is bottlenecked while my GPU is at 50%. Whoops, what did we actually solve here?

See the problem? You aren't actually "solving" anything, you're just moving the bottleneck around. The code in question is typically run on the GPU because, guess what? The GPU executes it FASTER than the CPU does.

And that's the problem I have: in the absence of any bottlenecks, the code is sub-optimal, and will run slower than if it were coded the "traditional" way.
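
A quick toy model of that trade-off (all the numbers are invented): treat frame time as roughly max(CPU ms, GPU ms), and the same relocation that helps a GPU-bound rig hurts a CPU-bound one:

```cpp
#include <algorithm>
#include <cstdio>

int main()
{
    const double costOnGpu = 3.0;  // the moved work runs faster on the GPU...
    const double costOnCpu = 6.0;  // ...and slower on the CPU, as noted above

    // Case 1: fast CPU, GPU-bound rig (the Crysis 3 benchmark situation).
    double cpu = 8.0, gpu = 20.0;
    std::printf("GPU-bound: work on GPU %.0f ms/frame, moved to CPU %.0f ms/frame\n",
                std::max(cpu, gpu + costOnGpu), std::max(cpu + costOnCpu, gpu));

    // Case 2: older CPU paired with a much stronger future GPU
    // (the hypothetical i7-2600k + "890 GTX" scenario above).
    cpu = 18.0; gpu = 9.0;
    std::printf("CPU-bound: work on GPU %.0f ms/frame, moved to CPU %.0f ms/frame\n",
                std::max(cpu, gpu + costOnGpu), std::max(cpu + costOnCpu, gpu));
    return 0;
}
```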
 


I'm not convinced AMD is going to make a lot of profit off the consoles, especially since Wii U sales are already falling, and I suspect PS4/Xbox Next sales aren't going to do that well either.

Then again, I'm very pessimistic on consoles in general at this point. I'm convinced Smartphones/Tablets have more or less made them obsolete.
 