Amazing. I just posted a question on May 30 about integrated graphics that this article answers.
"A fairer comparison would've been to use the Intel Iris GPUs, since the UHD 630 is a mid-range Intel iGPU. The Iris Pro 580 from 6th-gen Skylake is still Intel's most powerful iGPU to date, still a bit more powerful than even the latest Iris Plus iGPU in Coffee Lake. But either way, even the Iris Pro 580 is still way behind the GTX 1050 in performance by all measures."

I believe the Iris Pro 580 didn't actually see widespread use, though. It was used in some mini-PC NUCs that cost around $1,000 while still not quite offering the gaming performance of an entry-level PC with a dedicated card costing significantly less, making it kind of pointless. So it probably makes sense to stick with testing the common integrated solutions that people are far more likely to encounter.
Yeah, Iris Pro Graphics 580 was not widely used. Incidentally, I did have a Core i7-5775C processor, which I also tried in the same Z97 motherboard as the i7-4770K. That has Iris Pro Graphics 6200, which is basically the same as HD 4600 except with eDRAM and twice the EU/shader count. Unfortunately, the drivers for whatever reason were even worse than on the 4770K!
"This begs the question: why doesn't AMD just take one of their mobile APUs, put it in a desktop package, and just turn the clocks/TDP up on both the iGPU and CPU?"

It's all about money for sure -- but not the way you're suggesting. No one has successfully made a fast iGPU solution for PCs. You're saying Microsoft paid off not just AMD, but every other company that might think about doing such a thing. Not a chance. And it's equally unlikely MS was able to pay AMD off.
Easiest way to disprove that assertion: If MS could pay AMD to not make a faster iGPU for PC ... wouldn't it make far more sense to pay AMD to not make such a thing for Sony? Microsoft isn't worried about competition from PCs killing the market for Xbox at all. It hasn't been a problem since the original Xbox; why would things change now?
And $50 for 4GB of HBM2 is extremely expensive on a component level. I gave the math earlier, but basically a chip like Picasso costs AMD around $35 to make, and Renoir is maybe $50-$60. So if AMD paired a $60 chip with $100 of HBM2, it would need to either sell tens of millions of those chips at around $200 each, or else the price would have to be much, much higher -- like $500+.
There simply isn't enough demand or profit potential in making an extreme-performance integrated graphics solution right now. Only custom designs that are basically guaranteed to sell tens of millions of units over time (Xbox, PlayStation, or Apple) can justify the cost.
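To put rough numbers on that volume argument, here's a back-of-the-envelope sketch. The R&D budget, margin, and channel-markup figures are purely illustrative assumptions; only the ~$60 chip and ~$100 HBM2 figures come from the post above.

```python
# Back-of-the-envelope pricing model for a hypothetical APU + HBM2 package.
# Every input here is an illustrative assumption, not a real AMD/vendor figure.

def required_price(bom_cost, rd_cost, units, vendor_margin=1.1, channel_markup=1.1):
    """Retail price needed to cover BOM, amortized R&D, vendor margin,
    and partner/channel markup (all assumed multipliers)."""
    amortized_rd = rd_cost / units            # R&D spread over every unit sold
    cost_per_unit = bom_cost + amortized_rd   # total cost to build one unit
    return cost_per_unit * vendor_margin * channel_markup

bom = 60 + 100    # ~$60 APU die + ~$100 HBM2/interposer (figures from the post)
rd = 300e6        # assumed platform R&D budget

for units in (50e6, 10e6, 1e6):
    print(f"{units / 1e6:>4.0f}M units -> ~${required_price(bom, rd, units):,.0f}")
```

With those placeholder inputs it lands around $200 per chip at 50M units but well over $500 at 1M units, which is the same shape as the argument above: without console-class volume, the price has to climb.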
"This begs the question: why doesn't AMD just take one of their mobile APUs, put it in a desktop package, and just turn the clocks/TDP up on both the iGPU and CPU? The 7nm Vega and 7nm CPU would provide a good improvement over the current 3200G/3400G, and most of the development costs are already done."

Are you talking about Renoir? Because that is coming to desktops, but as with past APUs, it's a mobile-first chip. And there's a limit to what you can do with higher TDPs.
If Intel uses HBM for their future integrated graphics solution, could the high-speed VRAM also function as an L4 cache?
"The Iris Pro 580 from 6th-gen Skylake is still Intel's most powerful iGPU to date, still a bit more powerful than even the latest Iris Plus iGPU in Coffee Lake. But either way, even the Iris Pro 580 is still way behind the GTX 1050 in performance by all measures."
Erm...maybe if they stopped making laptops 9mm thick?
This race to make everything wafer thin and wafer light has consequences you know.
Even if they'd theoretically make a 200 W integrated GPU it would run into even more cooling issues due to the heat from the CPU side.
Ryzen 5 3400G has other factors, namely that it runs its PCIe graphics link at x8 width, but I still don't think it will make much if any difference. I could test with a Pentium Gold G5400, or even a Ryzen 3 3200G, and I'm sure the GTX 1050 will be the bottleneck in most games. Which do you want, then: G5400, 3200G, or just stick with the 3400G? (I've swapped the CPU out already, so regardless of which I test, I'll need to change the CPU.)
Yes, generally speaking -- and the GTX 1050 is definitely hindered any time I run it in DX12 mode if DX11 is an option. Metro Exodus, Shadow of the Tomb Raider, Borderlands 3, The Division 2 all run better in DX11. So if you really want to see GTX 1050 performance in the best light, I'll test at 720p using the 'best' API as a comparison point. I haven't done that lately because often it doubles the amount of testing -- I need to check both APIs to see which is faster, and sometimes it's resolution/setting dependent as well. (That mostly only applied to ultra-fast GPUs, however, like RTX 2070 Super and above -- the GTX 1050 likely never benefits from DX12 vs. DX11.)
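To make the "doubles the amount of testing" point concrete, here's a rough sketch of that workflow. The game list and the run_benchmark() stub are hypothetical stand-ins, not the actual test harness.

```python
import random

# Sketch of the "check both APIs, keep whichever is faster" workflow.
# run_benchmark() just simulates an fps result; the point is that every
# game/resolution combo now needs two passes instead of one.

GAMES = ["Metro Exodus", "Shadow of the Tomb Raider", "Borderlands 3"]
RESOLUTIONS = ["1280x720", "1920x1080"]
APIS = ["DX11", "DX12"]

def run_benchmark(game, resolution, api):
    """Placeholder: pretend to run the built-in benchmark and return avg fps."""
    return random.uniform(20, 120)  # simulated number, for illustration only

results = {}
passes = 0
for game in GAMES:
    for res in RESOLUTIONS:
        # The faster API can differ by game and even by resolution,
        # so both have to be measured before one result goes in the chart.
        scores = {api: run_benchmark(game, res, api) for api in APIS}
        passes += len(scores)
        best = max(scores, key=scores.get)
        results[(game, res)] = (best, round(scores[best], 1))

print(f"{passes} benchmark passes for {len(results)} chart entries")
```

Six chart entries, twelve benchmark passes -- and that's before any repeat runs for consistency.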
I didn't include 1080p testing because I only ran it on the Vega 11 and GTX 1050. Since this article was about integrated graphics, and since Intel's UHD 630 clearly can't handle 720p minimum settings in most games, trying to run it at 1080p medium settings either results in crashes, failure to run due to framerate requirements, or performance that's so slow as to be meaningless. Many games also choke if they fall below 10 fps -- everything slows down, physics gets wonky, etc. So a benchmark that takes 60-90 seconds normally might take five minutes or more at 5 fps. That's a lot of time wasted just to prove what is already known. So, if I had included 1080p medium, I'd only have two results in the charts -- GTX 1050 and the Vega 11 in the 3400G -- which isn't very useful.
Really? AMD APUs would be good for Fortnite, CS:GO, LoL, and older games.

But that's about it.
"These APUs are last-gen. Why even write about it, ...?"

They're the latest-gen APUs on the market.
"Are we forgetting who the APU market is? These are people on a budget who just need a computer; they don't need the bleeding edge. I have built, or upgraded, a number of computers over the years for people with very basic needs. The most demanding games that they play are variations of solitaire, or an updated version of Minesweeper. Sorry, but playing solitaire at 144 fps on a 52-inch 4K monitor is not something the target market of these APUs is interested in. Post to Facebook, read ads on Craigslist, respond to emails, fire up a spreadsheet or word processor in Microsoft Works (yes, Works, not Office), maybe watch a puppy or kitten video on YouTube because their grandchild sent them a link in an email (after you explain how to click on that link in the email to be able to view that video). So, who is up for playing Spider Solitaire on the 72-inch 240 Hz 8K monitor with the $200 computer I will custom build for you with these thrilling APUs?"

Well said. I don't always agree with TH, but this article was a good opinion piece. I was, and still am, tired of people recommending AMD integrated chips as if they were great cheap mainstream gaming chips. Plus, in my experience, AMD laptops are not great quality, whether or not the CPU is more "bang for the buck" -- which again is debatable depending on what you use it for. Most of my AMD laptops, compared to the Intel/Nvidia ones, are junky in my opinion: they wear fast and pieces fall off. But again, that is my experience.
And that has nothing to do with AMD, rather Intel making sure AMD never gets into any premium designs, perhaps through paying people off.
It has less to do with Intel and more to do with AMD not having a really great laptop solution prior to Renoir. Because when you start with a bunch of disclaimers like weaker battery life, that inherently limits the appeal. Now combine that with more than a decade of underwhelming AMD laptop solutions, and that's hard to overcome.
AMD's 4th-gen mobile CPUs destroy everything Intel has to offer in mobile. AMD beats Intel in games and most creative workloads. In addition to superior performance, AMD draws less power and outputs less heat, which is good for a laptop.
However, have fun finding a high end AMD laptop with a high end GPU. It doesn't happen. There is a reason for this...
"MS doesn't need to pay AMD off; they just order millions of chips on the condition that nothing like it is released into the desktop market."

You don't get how things work if you think a component cost of $50 is small. That's the raw material cost. You have to integrate it with other components (which costs money), pay for R&D (lots of money), and still make a profit -- not just you, but your partners. So a chip that has a raw cost of $150 either needs to sell a LOT of units at a price of around $225, or it needs to sell a decent number of units at a price of $300, or it can sell a smaller number of units at $500. Economies of scale.
And big no: $50 more for HBM2 is not that expensive when you give me an APU that runs at least 6 Tflops. An integrated GPU is a lot cheaper than graphics cards, which have more components on the board, plus the defect rates in making cards.

That $50 more for HBM2 plus the APU price will still be cheaper than a 6 Tflops GPU no matter how you put it, and add to it the bonus of compactness and lower power and heat, etc.

$500??? Sorry, that's your overpriced estimation based on nothing. And don't worry: if AMD decides to raise the onboard GPU to at least 6 Tflops like the last-gen Xbox, up to 12 Tflops like the new one, then they WILL sell millions, and they will make Intel CPUs with onboard GPUs a joke.

A few years ago I posted my ideas about how the CPU market was cheating us on prices and everything was overpriced, and I got flamed here like crazy by everyone. When AMD stepped in, Intel showed us they were ripping us off when they lowered their CPU prices to half. But back then? You all talked the same way, trying to justify monopoly pricing.

Sorry, that doesn't work anymore. Actually, there is another reason why AMD is not releasing a powerful APU to the market: it would hurt their GPU cards badly. No one would buy them, because AMD is not good at the moment in the flagship segment; Nvidia is beating them at the high end. AMD's GPU successes are all in the midrange cards, which are a direct competitor to the 6-12 Tflops APUs that are in consoles. So AMD would lose the GPU card market if they sold a fast APU.
The reason no one makes a console-like "APU" is that they can't guarantee the numbers to make it profitable. Xbox or PlayStation are basically guaranteed to do tens of millions of unit sales over the first few years, probably 100 million or more by the time the console is ready to be retired.
"Intel can probably do it, since they wouldn't have to turn a profit on the product. Achieving economy of scale is more important. Money will come in from the enterprise side."

Lots of companies could do it, but none of them are doing it because they don't see a demand for such a product. That's the simple truth. Enterprise wants either a laptop chip where no one cares about gaming -- which is what all current Intel chips provide, and it suits enterprise users just fine -- or massive compute for HPC environments, in which case integrated graphics is a stupid idea. Intel will have Xe HP and Xe HPC for data centers this year (or next year for Xe HPC?), and it will have laptops with serviceable graphics with Tiger Lake. That satisfies pretty much all the demands of enterprise users.
How much is this APU with HBM2 memory supposed to cost? It's been possible to get ~6 Tflops graphics cards for a little over $150 for quite a while now. The 4GB RX 480 launched for just $200 four years ago while offering nearly 6 Tflops of compute performance, which was multiple times the graphics performance of the PS4 and Xbox One that came out just two and a half years prior, and roughly on par with the raw graphics performance of the "4K" consoles that didn't come until later. After this next generation of consoles launches, it likely won't be too long before relatively inexpensive graphics cards are significantly outperforming them as well.
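As a rough cross-check on the Tflops numbers being thrown around in this thread: theoretical FP32 throughput is just 2 FLOPs (one FMA) per shader per clock. A quick sketch using the commonly published shader counts and clocks of the parts mentioned above:

```python
# Theoretical FP32 throughput = 2 FLOPs (FMA) x shader count x clock speed.
# Shader counts and boost clocks below are the commonly published specs.

parts = {
    "Radeon RX 480 (4GB)":     (2304, 1.266),  # shaders, clock in GHz
    "GeForce GTX 1050":        (640,  1.455),
    "Vega 11 (Ryzen 5 3400G)": (704,  1.400),
    "PlayStation 4":           (1152, 0.800),
    "Xbox One X":              (2560, 1.172),
}

for name, (shaders, ghz) in parts.items():
    tflops = 2 * shaders * ghz / 1000
    print(f"{name:<26} ~{tflops:.2f} TFLOPS")
```

That works out to roughly 5.8 TFLOPS for the RX 480, 1.84 for the PS4, and about 6.0 for the Xbox One X, which is where the "nearly 6 Tflops" and "4K console" comparisons come from.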
"About 22 years ago I worked retail in the electronics department of a large nationwide retailer. This was around the same time that HD was making inroads, and rear-projection televisions were the main way to watch something on a 52-inch or 60-inch screen. On a night I had off, a co-worker had a customer who had '$500 eyes.' He came in with his wife and could not tell the difference between a $500 SD TV (really it was just a high-end analog set, but I mention SD so you have some idea of the picture quality) and a $2,500 top-of-the-line HD TV, both 32 inches in size. He could not tell the difference, and so he could save $2,000 by going with the $500 TV -- hence the '$500 eyes.'"

What system was being used back in ... 1998? I can't recall seeing any actual HDTV (720p or above) sets until the early 00s, though I admit I wasn't living on the cutting edge of TV setups. I do remember my roommate in college buying a $2,500 Sony Wega Trinitron TV at one point. I can't remember the size or even if it was widescreen, but I'm pretty sure it was still SD -- this would have been around 1997-1998.