News AMD vs Intel Integrated Graphics: Can't We Go Any Faster?

@JarredWaltonGPU
At the time most TVs were tube-based, and the largest screen that wasn't a projection set was the 40" Mitsubishi Diamond. Vizio was a nobody trying to make a name for itself, with an LCD (or was it LED?) screen that used a muffin fan to circulate air as quietly as possible - I think it was a 48" or 52" screen. Toshiba and Sony were setting high standards for HDTVs, with Panasonic making about the only worthwhile LaserDisc player (12" discs) on the market. JVC did not think the failure of Betamax was the end of the quality VCR picture, and they introduced an HD version - just as good as those newfangled video discs (X-Men was released on it as working proof of VHS-HD). And yes, if you wanted quality at 40" or larger you went with a plasma display back then.
 
It seems that the GPU choice is between integrated graphics and a $300 graphics card. Man, it would be a lot easier with an unlimited budget and a component industry that stayed put for 2 years at a time.
 
It's frustrating that the only thing holding the Ryzens back from being basically competent is a measly 2GB of GDDR5, which can't cost that much at this point. It's not about wanting a budget gaming rig; obviously, slapping whatever GPU you can afford into whatever PC you can scrounge up is the solution there. Even a Core 2 Quad @ 3GHz, which is scrap metal at this point, would probably not bottleneck a GPU much worse than the Jaguar thing in consoles.
The thing I want an APU for is simply the packaging. The existing Mini-ITX solutions just aren't truly portable in the sense of being able to be stuffed into a backpack and taken wherever, because of the stupid graphics card sticking up. I know it's a niche market no one cares about, but the number of cool things you can do with case design without a discrete card to contend with is staggering, if you care about such things. For instance, I've always wanted to stuff a Mini-ITX board into a '70s-vintage hi-fi amp case along with a new Class-D amp board and see if I could get all the controls to work.
 
It's frustrating that the only thing holding the Ryzens back from being basically competent is a measly 2GB of GDDR5, which can't cost that much at this point. It's not about wanting a budget gaming rig; obviously, slapping whatever GPU you can afford into whatever PC you can scrounge up is the solution there. Even a Core 2 Quad @ 3GHz, which is scrap metal at this point, would probably not bottleneck a GPU much worse than the Jaguar thing in consoles.
The thing I want an APU for is simply the packaging. The existing Mini-ITX solutions just aren't truly portable in the sense of being able to be stuffed into a backpack and taken wherever, because of the stupid graphics card sticking up. I know it's a niche market no one cares about, but the number of cool things you can do with case design without a discrete card to contend with is staggering, if you care about such things. For instance, I've always wanted to stuff a Mini-ITX board into a '70s-vintage hi-fi amp case along with a new Class-D amp board and see if I could get all the controls to work.

Ryzen APUs use the system RAM, so it would be DDR4, not GDDR5. While you can only configure 2GB in the BIOS, the GPU actually helps itself to more system RAM when required. Thus, the bottleneck is not so much the amount of RAM as the speed of the RAM, since it's being shared with the CPU at the same time.
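To put rough numbers on that shared-bandwidth point, here's a back-of-the-envelope sketch (the DDR4-3200 and GT 1030 GDDR5 figures below are typical published specs used purely for illustration, not measurements from the article):

```python
# Back-of-the-envelope peak memory bandwidth, using nominal spec numbers.

def bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

# Dual-channel DDR4-3200: 3200 MT/s over a combined 128-bit bus,
# and the CPU cores and the iGPU both draw from this one pool.
ddr4_shared = bandwidth_gbs(3200, 128)      # ~51.2 GB/s, shared

# GT 1030 GDDR5: ~6000 MT/s effective over a 64-bit bus, dedicated to the GPU.
gt1030_dedicated = bandwidth_gbs(6000, 64)  # ~48.0 GB/s, GPU-only

print(f"DDR4-3200 dual channel (shared): {ddr4_shared:.1f} GB/s")
print(f"GT 1030 GDDR5 (dedicated):       {gt1030_dedicated:.1f} GB/s")
```

Even before the CPU cores take their cut, the iGPU has no more raw bandwidth to work with than a bottom-tier dedicated card, which is why faster system RAM helps APU gaming so much.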

Anyway, I feel iGPUs have come a long way. Below is a comparison of specs I put together between an MX350 and an overclocked Vega 11:
[Image: spec comparison table, MX350 vs. overclocked Vega 11]
 
It seems that the GPU choice is between integrated graphics and a $300 graphics card. Man, it would be a lot easier with an unlimited budget and a component industry that stayed put for 2 years at a time.
The article was testing integrated graphics against a GTX 1050, a card that was priced in the US $110-$140 range when it launched well over three years ago. And even then, you could get cards around twice as fast as a 1050 for around $200.

The thing I want an APU for is simply the packaging. The existing Mini-ITX solutions just aren't truly portable in the sense of being able to be stuffed into a backpack and taken wherever, because of the stupid graphics card sticking up. I know it's a niche market no one cares about, but the number of cool things you can do with case design without a discrete card to contend with is staggering, if you care about such things. For instance, I've always wanted to stuff a Mini-ITX board into a '70s-vintage hi-fi amp case along with a new Class-D amp board and see if I could get all the controls to work.
So, you want a gaming laptop? Those can be stuffed in a backpack, and even include their own power source, screen, and input devices built in, allowing them to be used anywhere. The dedicated graphics hardware in them is either built into the board, or on a small dedicated card, and optimized for lower power and heat output, allowing the system to remain as compact as possible. Such hardware could optionally be stripped out of a laptop and installed in a custom case.

As for fitting a desktop card in a tiny case, there are options like PCIe riser cables that could allow it to lie flat and provide some additional flexibility in terms of positioning. The cooler will take up a fair amount of space, but an APU with much faster graphics hardware than current models would likely require a relatively large cooler as well.

Ryzen APUs use the system RAM, so it would be DDR4, not GDDR5. While you can only configure 2GB in the BIOS, the GPU actually helps itself to more system RAM when required. Thus, the bottleneck is not so much the amount of RAM as the speed of the RAM, since it's being shared with the CPU at the same time.
I think they were referring to the possibility of adding a couple GB of VRAM to an APU, so that it could potentially offer more performance without getting bottlenecked by system RAM as much. However, as was already mentioned, that would likely require something like HBM2 memory to work, not GDDR5. Combined with more graphics cores, that would add to the processor's cost, likely making it just about as expensive as a dedicated card and processor combined, which is probably why it hasn't really been considered a viable product up to this point.
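For a sense of why HBM2 gets mentioned rather than GDDR5 for on-package memory, here is a rough per-package comparison (nominal spec figures assumed purely for illustration):

```python
# Rough peak bandwidth per memory package, from nominal spec figures (illustrative assumptions).

def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8  # GB/s

# One HBM2 stack: 1024-bit interface at roughly 2.0 Gb/s per pin.
hbm2_stack = bandwidth_gbs(2.0, 1024)   # ~256 GB/s from a single stack sitting next to the die

# One GDDR5 chip: 32-bit interface at roughly 8 Gb/s per pin.
gddr5_chip = bandwidth_gbs(8.0, 32)     # ~32 GB/s per chip

print(f"HBM2, one stack: {hbm2_stack:.0f} GB/s")
print(f"GDDR5, one chip: {gddr5_chip:.0f} GB/s")
print(f"GDDR5 chips needed to match one stack: {hbm2_stack / gddr5_chip:.0f}")
```

A single HBM2 stack on the package delivers bandwidth that would otherwise take around eight GDDR5 chips and a wide external bus, which is why the on-package VRAM idea points at HBM2 despite its cost.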
 
I think they were referring to the possibility of adding a couple GB of VRAM to an APU, so that it could potentially offer more performance without getting bottlenecked by system RAM as much. However, as was already mentioned, that would likely require something like HBM2 memory to work, not GDDR5. Combined with more graphics cores, that would add to the processor's cost, likely making it just about as expensive as a dedicated card and processor combined, which is probably why it hasn't really been considered a viable product up to this point.
I see. Thank you for pointing that out.

While it is not impossible for Intel or AMD to add dedicated memory to their iGPUs, as you rightfully pointed out, it is not cost-efficient for them, nor would it be for us as consumers. It is not just about the cost of the RAM chips, but also the cost of accommodating them on the chip they sell. In this case, perhaps it would make more sense to get an iGPU-less processor and just add a low-end dedicated graphics card.
 
"even the fastest integrated solutions pale in comparison to a dedicated GPU"
The GT 710 exists solely to disprove this, and the Ryzen 4000 and 5000 APUs are a testament to that fact, beating even the GT 1030 in many tasks both in and out of gaming.
 
"even the fastest integrated solutions pale in comparison to a dedicated GPU"
The GT 710 exists solely to disprove this, and the Ryzen 4000 and 5000 APUs are a testament to that fact, beating even the GT 1030 in many tasks both in and out of gaming.
The point isn’t that the fastest integrated can’t beat some of the worst dedicated graphics. It’s that any decent dedicated GPU blows away integrated graphics. GT 710 is a complete joke that’s over six years old (and based on hardware from 2014), and GT 1030 GDDR5 is also complete weak sauce. Maybe when we get stacked dies and memory on CPUs, things will change, but dedicated GPUs will still be much faster.
 
Yeah, the coming RDNA 2 iGPUs are supposed to be double the performance, at least for laptops. I would bet in a couple of years iGPUs will be able to match mid-range cards like the GTX 1660 or something.
 
Yeah, the coming RDNA 2 iGPUs are supposed to be double the performance, at least for laptops. I would bet in a couple of years iGPUs will be able to match mid-range cards like the GTX 1660 or something.
Perhaps, though in three years GTX 1660 hardware will be very much entry level on the graphics card market. Right now, the previous (Vega 8) iGPU from AMD is at about the level of the slowest Pascal GPU Nvidia released (GT 1030). If things progress, maybe we'll see GTX 1650 Super or GTX 1660 levels of performance. It will likely take stacked dies and HBM2 to get there, however.
 
Perhaps, though in three years GTX 1660 hardware will be very much entry level on the graphics card market. Right now, the previous (Vega 8) iGPU from AMD is at about the level of the slowest Pascal GPU Nvidia released (GT 1030). If things progress, maybe we'll see GTX 1650 Super or GTX 1660 levels of performance. It will likely take stacked dies and HBM2 to get there, however.
Oh yeah, I guess I should say something that will be what the GTX 1660 is right now: a solid 1080p card.
 
Maybe when we get stacked dies and memory on CPUs, things will change, but dedicated GPUs will still be much faster.
Discrete GPUs may still be faster, but if IGPs become ~5X faster, a much larger chunk of the market won't care about dGPUs anymore, no matter how much faster they may become. I don't care how much faster GPUs are when none that are worth upgrading to are available new under $200.
 
Discrete GPUs may still be faster, but if IGPs become ~5X faster, a much larger chunk of the market won't care about dGPUs anymore, no matter how much faster they may become. I don't care how much faster GPUs are when none that are worth upgrading to are available new under $200.
The problem is that if integrated graphics become 5X faster, you can rest assured they're not going to be "basically free" like Intel's current iGPU. So you'll be able to get a CPU with a slow iGPU like the i5-12400 for $200, or $260 for the Ryzen 5 5600G. And then you'll have the "fast iGPU" solutions that will be priced at $400-$500 and have graphics performance perhaps equal to a $200 GPU. But when the GPU shortages end, most people won't want to pay extra for a budget GPU integrated into their CPU when they could get a dedicated budget GPU that can be resold/upgraded. And I'm still skeptical that we'll actually see mainstream GPU levels of performance from an iGPU at a reasonable price.

The place where a fast iGPU will matter most is in mobile. Laptop makers would much rather have a single chip doing decent graphics and CPU than a dedicated GPU. That's why Apple has the M1/M1 Pro/M1 Max. Intel and AMD would both like to offer something competitive, though there's a big difference in what Apple users will pay and what Windows users will pay.
 
The problem is that if integrated graphics become 5X faster, you can rest assured they're not going to be "basically free" like Intel's current iGPU. So you'll be able to get a CPU with a slow iGPU like the i5-12400 for $200, or $260 for the Ryzen 5 5600G. And then you'll have the "fast iGPU" solutions that will be priced at $400-$500 and have graphics performance perhaps equal to a $200 GPU.
A fast IGP should be much cheaper than a similarly performing dGPU, since you eliminate the cost of a separate PCB, separate VRM, separate HSF, the costs and overheads of going over PCIe, separate packaging, marketing, distribution, etc. With tile/chiplet-based designs, you don't have to pick a one-size-kinda-fits-some compromise: you can physically omit the IGP chiplet/tile and HBM for IGP-less SKUs, use a small HBM-less IGP for the minimum necessary IGP action in office-like and headless use, fit the biggest IGP that can go under the IHS with HBM, or whatever else in between.
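As a toy sketch of that mix-and-match idea (the SKU names and tile options below are hypothetical, made up purely to illustrate the configurability, not real products):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical SKU compositions under a tile/chiplet scheme -- illustration only.
@dataclass
class ApuSku:
    name: str
    cpu_chiplets: int
    igp_tile: Optional[str]  # None means the IGP tile is physically omitted
    hbm_stacks: int

skus = [
    ApuSku("IGP-less",          cpu_chiplets=2, igp_tile=None,    hbm_stacks=0),
    ApuSku("Office / headless", cpu_chiplets=1, igp_tile="small", hbm_stacks=0),
    ApuSku("Big-IGP gaming",    cpu_chiplets=1, igp_tile="large", hbm_stacks=1),
]

for sku in skus:
    print(sku)
```

Same CPU tile everywhere; only the graphics tile and the memory next to it change per SKU.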

We'll see in 2-3 years, when Intel brings tile-based CPUs to the mainstream.

In the meantime, AMD's 680M IGP is a pretty good jump over previous IGPs; it would already kill off most of the market for previous sub-$200 GPUs if it were available on desktops. The only question is how much AMD would retail an AM5 version of the same APUs for.
 
Some manufacturers create custom chips that may have a CPU that works with the x86 instruction set, yet also has a security sub-chip that uses custom code, and another sub-chip that might do video encoding/decoding, again not using the x86 instruction set. The sub-chip could be grafted directly into the CPU's silicon, or it could be a chiplet sitting on the same carrier. An iGPU is just the most commonly known form of grafting something other than a CPU onto a CPU.

I have built some very simple, low-key computers based on last year's (even last decade's) hardware for people who just want to check Craigslist, Facebook, and Gmail, look through a family photo album, and maybe read the top stories from the local television stations. In that case I am looking for a CPU that is at least a generation or two older than current, with iGPU support. I don't want anything fancy, and I don't care for Chromebooks/Chromeboxes - even though they will do the job, I just don't care for them (same thing with netbooks). What happens if they decide to type a letter on a word processor? Yes, Google Docs can do it, yet it still leaves something to be desired.

As far as dGPUs go, I do not consider anything less than an (#)#50, and I skip the GTX 16#0 line of GPUs. The [G|R]TX ##70 gives good value; the price:performance seems to strike a good balance.

Just remember, opinions are like flowers, some smell better than others, and of course my opinion smells like a field full of roses 😉
 
As far as dGPUs go, I do not consider anything less than an (#)#50, and I skip the GTX 16#0 line of GPUs. The [G|R]TX ##70 gives good value; the price:performance seems to strike a good balance.
Just keep in mind Nvidia shifted around their product names since the 20-series. The 2060 basically took over the role of what would have previously been marketed as an x70 card, the 2070 took over the x80 slot, the 2080 took the place of an x80 Ti, and the 2080 Ti was closer to what they might have once marketed as a "Titan". This is all in terms of pricing, power draw and chip sizes. They likely adjusted the model numbers in an attempt to make the 20-series cards look less underwhelming at launch outside of their then-unusable RT hardware, and to encourage people to move up in the product stack to a tier above what they would normally consider buying.

The 1660 / 1660 Ti filled the traditional x60 product segment (without RT hardware), and the 1650 did so for the x50 segment, so the numbering remained similar to previous cards at the lower-end of the lineup, just with an odd starting number thrown in to make the numbering shenanigans at the higher-end less obvious. That's why there's both 1660 cards and 2060 cards in the same generation.

As for the 30-series, the 3090 took over the traditional "Titan" slot that had been occupied by the 2080 Ti, and the 3050 took over the x60 slot. At least as far as their official MSRPs go. The crypto-induced price gouging currently makes such comparisons less meaningful, as prices are currently close to double what these cards should officially cost.