AMD CPU speculation... and expert conjecture


8350rocks

Distinguished
By realistic projections, Skylake's iGPU is supposed to be about on par with Kaveri's.

Yet, most gamers are not using Kaveri APUs in their HEDT gaming rigs.

I smell a slippery slope here.

IF iGPUs are what juan expects to win HPC, THEN iGPUs must kill gaming dGPUs soon.

HOWEVER, iGPUs are NOT killing dGPUs soon. Probably not ever.

When ray tracing becomes viable, dGPUs will have forever cemented their necessity in gaming rigs.

/argument
 

noob2222

Distinguished
There is a huge difference between the realm of possibility and reality. Marketing loves the realm of possibility. Yes, it may one day be possible to manufacture a 10-billion-transistor, teraflop-class piece of silicon that replaces everything inside a computer; however, making that a reality, and making it not cost $10,000 apiece, is not likely any time soon.

Heck, KL doesn't even have an IGP. All it is doing is putting the RAM closer to the x86 CPU cores.

"From Intel's point of view, today's hottest trend in high-performance computing – GPU acceleration – is just a phase, one that will be superseded by the advent of many-core CPUs, beginning with Chipzilla's next-generation Xeon Phi, codenamed "Knights Landing"."

And everyone cusses AMD for trying to beat Intel to many lower-IPC cores over a few faster ones. Intel's KL is a lowly Atom-based core, 72 of them.

How useful is KL in anything other than servers? How many games can use 72 cores? Let's talk about that 3 TFLOPS: 3,000 GFLOPS / 72 cores ≈ 41.7 GFLOPS per core.

By comparison, Haswell is about 44 GFLOPS per core.

Not so impressive when you break it down and look at reality instead of marketing.
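
If you want to sanity-check that arithmetic, here's a quick back-of-envelope sketch in Python; the inputs are just the peak figures quoted above, not benchmark results:

```python
# Back-of-envelope math for the peak figures quoted above. These are
# marketing numbers, not measured results.

knl_peak_gflops = 3000.0   # the "3 TFLOPS" claimed for Knights Landing
knl_cores = 72             # Atom-derived cores per chip

print(f"KNL: {knl_peak_gflops / knl_cores:.1f} GFLOPS per core")   # ~41.7

haswell_gflops_per_core = 44.0   # the figure quoted above for a Haswell core
print(f"Haswell: {haswell_gflops_per_core:.1f} GFLOPS per core")
```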
 


I think if you take a step back, you'd be surprised *how many* games are played on iGPUs. I mean, 'gamers' covers the huge number of people playing stuff like DOTA or LoL. Also, many of the less 'hardcore' games run well on weaker GPUs (e.g. The Sims, SimCity, etc.).

Even stuff like Battlefield will run playably on Kaveri @ 720p. No, it isn't going to 'kill' the dGPU; however, it's probably *enough* for most. At the end of the day, many users will only look at changing something if there is a problem. I remember having to add a GeForce2 MX to a neighbour's system when their kid was lent a copy of Deus Ex by one of his friends and it wouldn't run on the brand new machine they'd picked up from PC World (as it only had Intel integrated motherboard graphics, which back then were a total waste of time).

Now, PC World are still selling cheap machines with big 'headline' numbers and no dGPU. The difference is that most of these machines now have a reasonably capable iGPU from either company, and that is a real plus. I mean, you can play a *huge amount* with a pretty weak GPU these days. I'm still rocking a first-gen i5 laptop with a lowly GeForce GT 420M (which is about on par with HD 4000), and it plays stuff like PA, Tomb Raider (2013), even Battlefield 3 without much issue, albeit on low settings.

That's the big change here: dGPUs are no longer a necessity for most games, and I'm certain that shipments of low-end dGPUs must be dwindling, as there really is *no need* with a modern machine to buy something like an R7 240 when it's matched by the (effectively free) iGPU.

I think as things move forward, iGPUs are set to get more capable, meaning more erosion at the bottom end of the dGPU market. I don't see them replacing dGPUs completely any time soon (2020 sounds a bit early to me as well), but they *are* likely to become more niche moving forward as the number of situations where you need one diminishes. A good example of the effect of providing 'good enough' to all is sound cards. You can still buy impressive discrete sound cards that are much more capable than the integrated ones, but I'll bet none of you have one (I certainly don't), as the audio capabilities of the integrated card are ample for my needs.
 

wh3resmycar

Distinguished
I still laugh at that notion of dGPU extinction. It took a 280X for me to play Crysis 1 near 60 FPS (from start to finish) @ 1080p with no AA; mind you, I started at around 15-17 FPS with an Athlon X2 and an 8600GT DDR3.

Add to that the fact that new technologies like VXGI add more realism at the expense of computing power, and we're looking at the customary juanrga pipe dream.

APUs replacing dGPUs would only work if technology stopped at 1080p, but it didn't and it won't. Buzz me when an APU can hold Crysis at a constant 60 FPS @ 1080p with 8x AA. By then we'd be talking about 8K already.

Maybe juan meant mobile dGPU chips running Linux games :lol:
 


And on this I agree. We're getting to the point where you can expect an IGP to at least RUN most recent titles, albeit at low settings. That's a significant improvement. But you'll never reach the point where an IGP beats out a dedicated GPU, simply due to thermal constraints.

The irony here is that I think AMD's strategy is going to backfire on them in spectacular fashion. I simply do not see ARM being competitive in servers, due to both more entrenched CPU architectures and a lack of software, which means most of AMD's profits are going to come from their GPU division, which, if you believe Juan, is going to be a thing of the past. That would leave AMD in a declining role on the desktop, absent in servers, and basically reliant on legacy contracts (Trinity) for the next decade. Not a good strategy for success.
 
Slightly different take: "good enough" graphics performance has been around for years, in every generation of GPUs. In terms of unit sales, entry-level and midrange graphics cards outsell high-end ones by a large number. However, higher-end cards, especially upper-midrange (in terms of price) to high-end cards, have bigger profit margins, halo products even more so. This is why AMD and Nvidia haven't stopped making big GPUs (for gaming, professional, HPC, etc.).

As for iGPU vs. dGPU for PC gaming: people will only stop buying higher-performing discrete graphics when motherboard companies abandon PCIe/expansion slots. Until then, people will keep adding discrete graphics after playing games on the iGPU for a year or two. I've seen this since Llano and it hasn't changed yet. Since graphics performance is so easy to add, people won't stop buying more powerful cards. Heck, OEMs can do the same thing in laptops: sell one with an empty MXM slot for a discrete graphics upgrade.

And if anyone sings Intel's praise, then they should know this (and I know for sure that people who do haven't really "gamed" on an Intel iGPU): driver/application support, image quality and smoothness matter. AMD and Nvidia have much more experience here; they've worked out their execution while Intel is still in its infancy, or just doesn't care. Moreover, Intel's iGPUs, including Iris Pro, don't have graphics performance comparable to AMD's. You can play some games at low settings, yes. But try to keep playing games on that on a non-casual basis, say for a year or two (or at least for three months straight), and see if your satisfaction level changes. A lot of desktop Kaveri owners are already adding more powerful discrete graphics, while Richland and Trinity owners have already added one, some even upgrading to newer cards.

Edit: actually, if people keep playing games on locked PCs like laptops and slimline desktops, it'll only drive them to full-blown gaming desktops. :p
 


You've made my point though: the iGPU is simply replacing the entry-level dGPU nowadays. I agree Intel graphics aren't ideal (they rely on Nvidia patent licensing anyway...), mainly due to the software support, though the drivers *are* getting better (the number of PA players running on Intel graphics is astonishing, more so that the drivers appear to work reasonably well; other games of course may not be as well supported).

I agree with both of you that dGPUs won't go away, but I don't think *everyone* bothers to upgrade. You're only going to upgrade *if* the performance of the iGPU is impacting your experience. There are already games out there that run so fluidly on an iGPU (e.g. DOTA 2) that, if that's what you play, you aren't going to gain anything by upgrading. Now, the proportion of modern games in this category is small, but I can see it increasing as iGPUs develop. The tragic thing is, once iGPUs really are 'good enough' for most games, that is when the OEMs will remove the expansion capability to shave £2 off the BOM price, and then we're in a worse position than before iGPUs started getting good :(

As for AMD abandoning graphics, well, I just don't believe that. I can envisage that at some point down the line it may make more sense to sell a high-performance 'APU' rather than a 'dGPU' (maybe; that is highly speculative), but that hinges on having decent graphics tech, and if there is still demand for a separate card, you can bet they will offer it :) AMD's whole strategy of late is about being able to offer diversity: a "mix and match the cores and components YOU need for YOUR application and we will build it" kind of approach. Decent graphics IP is a critical component of that, and frankly AMD has been firing on all cylinders in the GPU space over the last year.
 

Yes, as long as it's cheaper and the iGPU delivers good performance. The "cheaper" bit will help with the folks who want to buy a desktop/laptop on the cheap, and the "iGPU" bit helps both OEMs and buyers. AMD can be more competitive in the entry/mid-level segment outside the U.S. as long as they can supply chips; so far they've always prioritized the U.S. and Europe.

I thought upper-mid and higher-end GPUs could be in danger if display resolutions and software demands didn't go up. But now, with the new consoles, 4K and the rest, FreeSync/G-Sync, VR etc. coming out, I think discrete graphics demand will rise again. The entry-level and low-end segment will be nearly taken over by iGPUs, save for non-gaming, CPU-heavy PCs (e.g. an Intel HSW-E PC with a GT 730 for display output only).
 


Well, having followed the AMD vs. Intel thing for far too long, I wouldn't count them out on the CPU side just yet... I agree that currently their offerings are somewhat unbalanced though (mind you, you could argue the same about Intel the other way: the HD 4600 is OK, but if I had the choice *without a dGPU* then I'd take Kaveri any day).
 

blackkstar

Honorable


If you're going off of Steam Hardware Survey numbers: the survey just counts all the GPUs in a system that show up in the OS, adds them together, and then divides for market share.

For example, if you have a single computer with a 4770K (iGPU present but unused) and a GTX 980, the survey will count that as two GPUs and give Intel and Nvidia a 50:50 market share.

If you don't believe me, look for Steam forum posts with system information: the iGPU is included unless it's explicitly disabled in the BIOS.

But my point is that Intel iGPU usage is drastically overstated in the Steam Hardware Survey.
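
A toy sketch of how that double-counting plays out, assuming the tallying behaviour described above; the machines and numbers are made up for illustration, and this is not Valve's actual methodology or code:

```python
# Toy model of the double-counting effect described above. It assumes the
# survey simply tallies every GPU the OS enumerates, per the claim in this
# post; it is NOT Valve's actual code.
from collections import Counter

# One machine with an idle Intel iGPU plus a GTX 980, and two dGPU-only boxes.
machines = [
    ["Intel HD 4600", "NVIDIA GTX 980"],   # iGPU enumerated but never used
    ["NVIDIA GTX 970"],
    ["AMD R9 290"],
]

tally = Counter(gpu.split()[0] for machine in machines for gpu in machine)
total = sum(tally.values())
for vendor, count in tally.most_common():
    print(f"{vendor}: {count / total:.0%}")
# NVIDIA: 50%, Intel: 25%, AMD: 25% -- Intel gets "share" for rendering nothing.
```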

And as I have said before, we are at an awkward stage where GPUs are too strong for pretending to ray trace and not strong enough to actually ray trace. Once we reach the ray tracing stage, iGPUs will not be able to keep up. And ray tracing scales directly with hardware; the workload grows simply by adding light sources (each ray is traced as it bounces off of and passes through things, so adding more lights scales very naturally, since there are just more paths to trace).
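
A rough sketch of that "more lights, more paths" scaling; every number here is invented purely for illustration, not taken from any real renderer:

```python
# Rough ray-budget sketch of the "more lights = more paths" argument above.
# All numbers are invented for illustration.
width, height = 1920, 1080
bounces = 2   # secondary bounces traced per primary hit

def rays_per_frame(lights: int) -> int:
    primary = width * height             # one primary ray per pixel
    shadow = primary * bounces * lights  # one shadow ray per hit, per light
    return primary + shadow

for lights in (1, 4, 16):
    print(f"{lights:>2} lights: {rays_per_frame(lights) / 1e6:.0f}M rays per frame")
# Ray count grows roughly linearly with the number of light sources.
```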

The conversation is not worth bringing up again. Yes, iGPUs will catch up with fake ray tracing, but once actual ray tracing comes along, they are finished for gaming.

Look at ratGPU: http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635-17.html

It takes 29 seconds for a Titan to render a single frame. Do you want to play a game at 2 frames per minute? GPUs still have a really long way to go, and to ignore ray tracing in video games is ridiculous.

Ray tracing in games means real reflections (no more faking it), shiny cars and the like, working glass, refraction, mirrors, etc., all done without having to fake things like mirrors by rendering the whole scene twice.

Once we see dGPUs hit photorealism with ray tracing, then iGPUs will start to kill dGPUs. Until then we are just in a long stalemate. But to say that the iGPU is going to completely kill the dGPU in a few years is extremely ignorant of the technologies video games will require going into the future.
 
To be fair though, we'll see ray casting implementations first, for exactly those speed reasons. But yeah, ray tracing scales with hardware, so that's where you need a dedicated external piece of equipment, since core count is all that matters. You aren't going to improve ray tracing performance through clock bumps or architecture improvements; you need execution resources.

And the fact that a GTX Titan can only manage about 2 frames per minute tells you how far we have yet to go to get stable 30 FPS ray tracing. And of course, the more light sources, the harder and slower the computation gets. Those games with lots of torches lined up in an underground cavern? Yeah, not in a ray tracing engine; that's committing performance suicide. So there are gameplay factors to consider as well.
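
Putting a number on that gap, using only the 29-seconds-per-frame ratGPU figure quoted above:

```python
# How far is 29 s/frame (the ratGPU Titan figure quoted above) from playable?
seconds_per_frame = 29.0

print(f"{60.0 / seconds_per_frame:.1f} frames per minute")   # ~2.1

target_fps = 30.0
speedup_needed = seconds_per_frame * target_fps   # 29 s/frame -> 1/30 s/frame
print(f"~{speedup_needed:.0f}x speed-up needed for stable {target_fps:.0f} FPS")
# ~870x, before adding any extra light sources.
```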
 


I'm a bit confused as to why you quoted me with this response: I didn't mention the Steam Hardware Survey... If you go and buy a PC from a shop, chances are it has an iGPU only. There are many people who then play games on these systems; a few years ago that wouldn't have been possible.

As for ray tracing: I am an engineer / product designer and I do some photo-realistic rendering for my job. So far, *none of the applications I use support GPU acceleration for ray tracing*. It's all done on the CPU, and at high quality it takes 30 minutes a frame or more. Trust me, I would *love* for GPU acceleration to be added, as it's got to be an improvement; still, this isn't what the majority of users need.

Also, the point you're making completely misses what I was saying. There are always going to be new ways to push graphics that demand more graphics power. That *isn't* likely to change. The thing is, though, that fully 'photo-real' ray traced graphics *aren't required* for many games. Good graphics isn't the same thing as a good game, and there is a point where the presentation is good enough and more bells and whistles don't really add much.

People will *only* bother with a dGPU in the event that there is something *they want to play* that they either can't, or where the experience is severely limited by the iGPU. So for the FPS fans out there, sure, a dGPU is a must-have. For people into other genres (e.g. DOTA, which is hugely popular), if the game runs well and they enjoy it, they're probably not going to be that bothered about paying for a dGPU. This is why things like iPads and tablets sell well: if it does what people want and doesn't cause problems, they're usually happy. I'm also not the one predicting imminent 'doom' for dGPUs... I'm just stating that the more stuff that runs well on iGPUs, the more niche dGPUs will become.
 

etayorius

Honorable
One of my friends was gaming on an i7 2700K with HD3000 graphics after his old GTX 285 died (he removed it and broke it while trying to reinsert it into the motherboard). He played for a good 3 months on that crap HD3000, but he kept posting pics on Steam of Dark Souls 1, Resident Evil 5, Dead Island Riptide, Payday 2, Left 4 Dead 2, the FEAR series, Skyrim and Saints Row 2... He was basically playing the same titles he'd played on his GTX 285 at almost the same speed, just at lower detail at 1920x1080. I think he has since got a GTX 650 or something.

So yeah, as much as I hate Intel and their crappy iGPUs, they do the job just fine if you can look past the lower detail. I don't think I could game on that sort of crap... but if I had to choose between no gaming at all and using an HD3000, I guess I'll take the HD3000.

On a side note, I also have a Radeon HD 3300 iGPU on my motherboard. Just for the "LULZ" I tried removing my current GPU and using the HD 3300... DANG, I could even run Skyrim, GTA4, Saints Row 2, Left 4 Dead 2, The Witcher 2 and even Tomb Raider 2013 on it. What was more amazing is that all the games ran at 28-32 FPS average, obviously at 1280x1024 and the lowest detail available for each game.

So yeah, iGPUs do pack some punch if you're drifting without a strong dGPU... as a last resort, obviously. Sometimes it amazes me that they can run games at more than 30 FPS with such crappy graphics.

One of my cousins just built a PC for less than 200 USD. He got a second-hand CPU, motherboard and 6GB of RAM for just 40 USD and asked me if it was a good deal... For the price, yeah, but the CPU he got was a Phenom X4 9650 at 2.3 GHz with an old 790-chipset AM2+ motherboard; for the money it was OK, but it's still very old... So I gave him a Radeon HD 6670, and he got a cheap PSU with a case for less than 40 USD. He's been gaming heavily on GTA4, FIFA 14 and The Sims 3, and I just saw him launch FIFA 15 on Origin and it seems to be working fine for him.

People will just use whatever they've got in order to game at a decent speed; they don't really care about graphical detail as long as it runs smoothly enough.

I used to be like that back in 2002 when I built my first PC. I didn't care about detail as long as it RAN smooth... I remember having a 2.2 GHz Pentium 4 with 512MB of RAM and a horrible FX 5200 that could barely run games even at the lowest detail and resolution, and I did not care one bit... But as I got more and more into PCs, I started craving better detail and higher resolutions.
 
Also, the point you're making completely misses what I was saying. There are always going to be new ways to push graphics that demand more graphics power. That *isn't* likely to change. The thing is, though, that fully 'photo-real' ray traced graphics *aren't required* for many games. Good graphics isn't the same thing as a good game, and there is a point where the presentation is good enough and more bells and whistles don't really add much.

Very much true. Advancement in gameplay has basically died, though, since there are no real AA studios left to drive innovation.
 

Cazalan

Distinguished


Or that voxel cone lighting (VXGI) Nvidia demoed last week. That was the coolest thing I've seen in years, but I don't think even the 980 can do it at 60 FPS. I hope AMD has an answer for that.

A combination of that with a higher-resolution Oculus will need significant horsepower. In about 2 years, visual fidelity is going to skyrocket, demanding these beefy dGPUs.
 


There's been some interesting stuff from smaller studios on Kickstarter... I'd suggest you look at the new campaign for 'Flagship'... it's the RTS game I've personally always wanted to play :) Big companies are too worried about maintaining the formula to do anything radical :/
 


And those companies continue to eat everyone else. Any AA studio that has any amount of success will simply be eaten by EA/Activision. 2K looks to be the next studio bought out, leaving Ubisoft as the only major non-EA/Activision studio left.
 

I wouldn't say niche... just yet. Low-end and entry-level graphics cards are niche now, only for office PCs, old PCs and CPU-heavy, non-gaming PCs (e.g. an HSW-E/Centurion/Xeon box with a GT 730/Radeon HD 7730 for display output). iGPUs have effectively replaced that segment in mainstream PCs.

The upcoming 4K (and maybe VR too) will certainly cause many people to upgrade, for many reasons: demand for more memory bandwidth and capacity, HEVC support (although that's showing up in mobile SoCs too), etc. Reduced CPU overhead from new APIs will also drive discrete graphics sales, as people will have to worry less about CPU bottlenecks. These and other factors will drive the lower-mid and higher segments.
 

jdwii

Splendid


When they mentioned SimCity, I was like, no A10 will run that game smoothly (my CPU struggles at a low 25-27 FPS in cities).
 

GOM3RPLY3R

Honorable


Sorry to butt in, but having played GTA IV (not the best-coded GTA of the bunch) on maximum settings @ 1080p (view distance on high, not very high), I've gotten over 80 FPS in most situations, except when there are about 30 helicopters spawned and you're blowing them all up at the same time (lol).

This was with a GTX 770 and an overclocked i5-3570K, so it's really not that bad of a game (compared to ArmA especially). However, considering that GTA IV can barely get higher than 30 FPS in most situations @ 720p on a PS3, whereas GT5 sat at 30+ FPS the whole time @ 1080p, that really shows how bad the engine was (at least for the PS3's processor / board / aluminum sheet-thing).

Also, I haven't been on here in a while. Any chatter about the 900 series?
 