AMD CPU speculation... and expert conjecture


Cazalan

Distinguished


Given enough time, everything will become an artifact. Binary computers included.

 

con635

Honorable

The way some see it is "no way you could fit that 290X on a CPU die", but think of it the other way: look at how small powerful CPU cores are getting, so why not move them onto the GPU die and stick that in a socket? I'm sure I saw a video describing how the CPU cores on future APUs wouldn't sit in a separate region of the die; instead each core would be in among the GPU cores, like each CU containing a CPU core and GPU cores together. Sorry if my description is a bit off.
 


Physics says this. Thermal dissipation is a very big part of engineering, and powerful GPUs generate 150~200W+ of thermal power while CPUs do 70~100W. Going with smaller processes lets you use less raw power, but it also means your thermal density goes up: you have more power being generated in a much smaller space than before, and shunting that heat away becomes a real challenge. Now compound this with the fact that two separate chips get two separate thermal solutions, which together can remove more power than a single HSF, and you see why a dCPU + dGPU will always be more powerful than a combined unit.
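Here's a rough back-of-the-envelope sketch of the thermal density point (all numbers are illustrative assumptions, not measurements from any real chip):

```python
# Back-of-the-envelope heat-flux sketch; all figures are illustrative
# assumptions, not measurements from any real chip.

def heat_flux(watts: float, die_area_mm2: float) -> float:
    """Average power density across the die in W/mm^2."""
    return watts / die_area_mm2

# The same 100 W of heat before and after a full node shrink (~half the area):
print(heat_flux(100, 250))  # 0.40 W/mm^2
print(heat_flux(100, 125))  # 0.80 W/mm^2 -- same heat through half the silicon

# Separate chips also mean separate coolers: a 200 W dGPU plus an 85 W dCPU
# spread ~285 W across two heatsinks, while a combined APU has to push all
# of its power through a single HSF.
```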

The real question is how much power does a user need? Will we get to a point where an integrated solution can provide everything a user will want to do? This won't happen in the next five years, guaranteed, and it doesn't look possible in the next ten, seeing as graphics power requirements look primed to explode again. CPU power requirements at home have leveled off slightly, but only because games were console-focused and thus artificially handicapped. As we move further into real 64-bit computing, I expect CPU requirements to start going up again, but at a slower pace than graphics requirements.

While it's possible that our "gaming" requirements could be met by a 25~50W CPU, they most certainly won't be met by a 30~55W iGPU. So I don't expect a shift away from separate specialized processors, operating independently on graphics rendering and central processing, for the next 20~25 years.
 

noob2222

Distinguished
A user will take all the computing power they can get. The trend I'm starting to see is that companies, aka Intel, will simply stop making parts more powerful and force people to settle for what they decide to offer. What option would we have to stop them? They could just stop making DT parts and tell everyone "there is no demand", and people would believe it because there are no parts to be found.

That doesn't mean that if they made a leap in performance and offered it to the public, people wouldn't buy one; quite the opposite, people would have a reason to upgrade a SB system. Let's say AMD or Intel somehow makes a CPU that doubles performance: I can guarantee you'd see more than a doubling in sales, provided Intel didn't quadruple the price just to reduce demand.
 

Fr33Th1nk3r

Reputable
I don't know why they keep wanting to stick a GPU die in CPUs. You have to replace a GPU just about every year or two. They come out with new CPUs every year or two and make us change the motherboard. Or we could buy a new GPU and squeeze out a few more years.

Why the F would I waste 300 dollars on an APU+mobo that will be outdated in 2 years?
 


I think you just explained why Intel/AMD want to combine the CPU/GPU better than anyone else here has.
 


I think you guys forget how much you can get done with older hardware :p Most modern games will run fine on any reasonable DX11-class card, so we're talking Radeon 5000 cards or first-gen Fermi GT(X) 400 cards. Couple that with a quad-core processor (C2Quad or Phenom II X4) and you've got a machine that can play pretty much anything once the settings are turned down. There are games *in development or just coming out* that may defy that, but they are still the exception.

Don't get me wrong, I recently upgraded from such a machine and the improvement is noticeable (particularly the fidelity); however, it was only very recently that I started having performance problems even with the settings turned down. On that basis, you can quickly see how a decent APU could soon be good enough. It's also worth keeping in mind that some of the best games ever made had little to do with top-end graphics. A good story, interesting mechanics, and enjoyable gameplay are all just as (if not more) important than the game looking pretty. So I really don't agree with the "I have to replace my whole machine every 2 years" argument. You may want to, to turn the settings up a few notches, but can you honestly tell me you've hit the "can't run it" point after just 2 years of use?

Edit: Another point- I actually think the fact that most modern CPUs include passable graphics is a huge improvement. It wasn't that long ago that people buying expensive brand-new machines from places like PC World would take home something totally deficient in 3D graphics capability. The OEM manufacturers simply don't want to include a dGPU in the bill of materials if they can avoid it. People not "in the know" then felt pretty ripped off: "why can't my kid play Battlefield on our brand new machine?" Because you need a discrete GPU to do that, which your new machine doesn't have (and it's hit or miss whether the system even includes a PCIe slot to install one). Now at least a decent new machine has a baseline graphics capability that's enough to play most things, albeit on low settings.
 

logainofhades

Titan
Moderator


You forget that we enthusiasts are the minority. For the average user, an A8-7600 is more than enough power; it is enough for browsing the net and watching Netflix/Hulu. Most people do not have a need for the kind of rigs we have. Since I have not been gaming, my Sandy Bridge i5-equipped laptop has been plenty for my needs. I browse Craigslist for Bronco and boat parts, and am on Facebook. I do little else once I get home from work.
 


To be honest, for many the A8-7600 is enough for games too :p It just depends what you want to play, and at what settings/fps. I gave a friend of mine an ancient Radeon HD 4670 that had given me many years of use; he added it to his first-gen i3 machine and can't stop thanking me :p He can now play Minecraft at double-digit frame rates, as well as load all sorts of other games that just wouldn't work before :) To him, that A8-7600 is high-end, futuristic stuff of fiction :p
 

Fr33Th1nk3r

Reputable


You're right. But our market share in dollars is equal to, if not greater than, that of the "I just want to play Call of Duty and pew pew" crowd. We have a right to say what we do and don't want. I don't want to game on a tablet APU, and I don't want to play something with Minecraft graphics. I want to immerse myself in a pretty-looking world for hours on end. It's entertainment for me and hundreds of thousands of other people.
 

Fr33Th1nk3r

Reputable
http://jonpeddie.com/publications/pc_gaming_hardware_market_report/
According to this, we (the mid- and high-end gaming market) are between half and three-quarters of the market.
Mainstream being: "I want to play Minecraft and Facebook games on my Apple computer."
 

truegenius

Distinguished
BANNED
What am I reading :ouch: ? Everyone is like "hail APU" :heink:

Let's suppose we are now at the verge of peak efficiency, and Intel/AMD/Nvidia have nothing new to offer. So what will happen?

If APUs become good enough, then there will be no gaming PC anymore, just an Xbox 2 and a PS5. Don't you think so?
Maybe some variants too, like an Xbox 2 Extreme and PS5 Extreme with a dual-APU-like setup for enthusiasts.

And then we will be running everything on a 4th-gen i3/i5/i7-based laptop.
No more need for a new operating system, because there will be no new hardware capabilities that need an OS upgrade.
No more software upgrades required, because there will be nothing new in them except a Metro or minimalistic look ( :vomi: ).

And if MS decides to sell the OS on the Xbox 2, then we will say "screw you MS, I am buying a PS5".

For normal browsing work even phones are enough, and they have games at a cheaper price than PCs or consoles that can offer the same story as a PC/console game.

Even a Tegra 4 based Nvidia Tegra Note tablet is enough to replace a laptop for all the movie watching, web browsing, and gaming stuff: attach a keyboard to do office work, attach a remote control to play games like you do on consoles, insert a 64GB memory card to save your files, attach a pen drive to back up or access other data, connect to a monitor over HDMI or WiFi to make it even more like a PC or laptop or to use it as an HTPC, and bring out the stylus to draw your imagination.

If a dGPU-killer APU ever happens, and our mentality becomes "we only need a 7400 to play games and do our stuff", then it will kill PCs too;
there will be only tablets and consoles.

Indeed, we don't need a dGPU-killer APU; we only need the above mentality to kill the dGPU, and thus to kill the PC/laptop along with it.
If it ever happens that an APU kills the dGPUs, it won't be because of that APU but because of market strategy and people's mentality.

And it is possible, because people killed the phone, and now we have something closer to a tablet than a phone.
4" is considered puny now, but when I ask those 5" users to reach every corner of their phone's screen, I see them adjusting their grip accordingly.
I wish Jobs were alive; he would have done something to keep the phone form factor alive. His 4" idea was good, but then crapsung went ahead and crapped on the idea.

Edit:
The best APU we can have today is in the PS4, with the current fastest memory and the most efficient CPU and GPU under a limited TDP.
But see how it is struggling with 1080p, even with all the prioritization and optimization done, and we will be maxing out 4K in a few years.
So whatever the best APU we have in 2020, it will be nothing in front of the PC power of that time.
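The raw pixel arithmetic alone makes the point (a quick sketch; it ignores per-pixel shading cost, which only grows):

```python
# Pixel counts behind the 1080p -> 4K jump.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

print(pixels_4k / pixels_1080p)  # 4.0 -- four times the pixels to render
# every frame, before any increase in per-pixel shading work.
```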
 

blackkstar

Honorable
You have to be cautious with some benchmarks. An i3 might do well in single player with Mantle, but there's no chance it'll keep up with something like a 4M/8C Piledriver on a large multiplayer map in BF4. It's the same trap a lot of people fell into when they bought a Pentium G: they saw less demanding parts of the game benchmarked, assumed that since the weaker parts did well there they'd do well everywhere, and were then massively disappointed when doing things like playing multiplayer.

You've got to account for cherry-picking too. DX12 and Mantle are in a PR fight, and the goal of the reviews is to show that their API is more efficient. You aren't going to do that by loading up a game like BF4 multiplayer that wants 6+ cores and then throwing Mantle at it. You're going to find a place where the lower-end CPUs bottleneck before the big ones do, and then go "wow, look how much faster this is!"

People are assuming that draw calls are the only thing the CPU is doing while gaming, and that's far from true. DX12 and Mantle are going to alleviate CPU requirements, not do away with them entirely.
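To make that concrete, here's a hypothetical sketch of one frame of a game's CPU work; every name in it is an illustrative stub, not any real engine's API. Only the last step is what a thin API like Mantle or DX12 makes cheaper:

```python
# Hypothetical one-frame CPU workload; all names are illustrative stubs,
# not any real engine's API.

def poll_input_and_network(players): pass       # scales with player count in MP
def step_ai(world):                  pass       # pathfinding, decision making
def step_physics(world):             pass       # collision, destruction
def step_animation_and_audio(world): pass
def cull_visible(world):             return []  # what can the camera see?
def submit_draw_calls(drawables):    pass       # the only part thin APIs slim down

def run_frame(world, players):
    poll_input_and_network(players)
    step_ai(world)
    step_physics(world)
    step_animation_and_audio(world)
    drawables = cull_visible(world)
    submit_draw_calls(drawables)  # cheaper under Mantle/DX12; the rest is not
```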

And in regards to Mesa, I have no idea what happened. I upgraded to 10.4.4 and everything is super smooth. I used to have bad performance with dual monitors (oddly enough, my A4-5000 is fine with the open source drivers), and now almost everything runs well; this forum page still runs terribly, but everything else is fine, and my wobbly windows are buttery smooth. AMD's open source drivers are getting really good. I hope they continue with what they're doing here; it's great.
 
An i3 might do well in single player with Mantle, but there's no chance it'll keep up with something like a 4M/8C Piledriver on a large multiplayer map in BF4

The models with HTT do. And if you lower the CPU-side requirements, I'd make the leap that the i3 would actually outperform the FX-8xxx series.

Hence why I've been cautioning from the very beginning that the LAST thing AMD wants is lower CPU-side requirements, since that lets Intel's i3 undercut their entire FX lineup. If an i3 can hold steady frame rates at medium/high settings, FX gets priced out of the market.
 

For the money Core i3s ask, and since you'd need a new mobo anyway when Skylake hits, the quad-module FX retains its place in the market. Core i3s haven't undercut anything since 1st-gen Haswell came out.
 

logainofhades

Titan
Moderator
You'll need a new motherboard once Zen hits; that is the life of a PC enthusiast. I doubt Skylake will be much of an improvement over Haswell or Broadwell. Intel doesn't have to make huge performance improvements, because it doesn't have any real competition. For budget systems, an i3 is still a better buy than FX, unless you live near a Micro Center. Less than $100 for an FX-6300 and a GA-78LMT-USB3 is a damn fine deal for budget rigs.
 

jdwii

Splendid
 

bmacsys

Honorable
BANNED


You have it all wrong. The die is huge; that is one of AMD's big problems. They have no room for an L3 cache because the iGPU takes up so much real estate on the die: the A10-7850K is 245 mm² vs. 177 mm² for the Core i5-4690. All you do is read press releases and slides without having a clue how to properly interpret them.
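For what it's worth, the quoted figures make the gap easy to check (a trivial sketch using the die areas as cited in the post):

```python
# Die areas as quoted above, in mm^2.
a10_7850k = 245  # Kaveri: CPU modules + 8 GCN CUs, no L3 cache
i5_4690   = 177  # Haswell: 4 cores + GT2 graphics + L3 cache

print(round(a10_7850k / i5_4690, 2))  # ~1.38x the silicon
```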
 

bmacsys

Honorable
BANNED


Because 99.99% of computers sold are OEM machines for the average home user or business. They don't know, and couldn't care less, what a "discrete GPU" is. They don't know or care what "fps" is.
 