News Intel Gen12 Xe Graphics Leads AMD's Vega in Integrated GPU Performance


EdgeT

Distinguished
Jan 8, 2009
280
7
18,865
Not sure what it is with AMD fanboys, but more than any other fanboys they attribute inventions to their company that it didn't actually invent. The first multicore CPU was made by IBM, four years before the first dual-core Opteron. First 64-bit CPU? Seriously? Depending on the definition of 64-bit, the first 64-bit CPU was released no later than the early '90s, a decade and a half before AMD's. Intel released the far more innovative Itanium years before AMD released an x86-64 CPU. AMD was one of, if not the, last of the major CPU manufacturers to reach 64-bit, and their 64-bit band-aid to the x86 instruction set has crippled the advancement of the industry, leaving us stuck with an instruction set from the 1970s that should have been scrapped two decades ago.

I didn't say they invented the stuff; they just adopted it across their consumer product line. I'm talking about the consumer market specifically.
Itanium wasn't even that widespread; didn't they actually abandon it a few years ago?
Also, I'm not an AMD fanboy; not a fanboy of anything, really. I usually pick the lesser of two evils and/or the best value for the money. And Intel, as far as we know, has done a lot more shady stuff than AMD while arbitrarily limiting their hardware.
 
  • Like
Reactions: Olle P and bit_user

EdgeT

Distinguished
Jan 8, 2009
280
7
18,865
Never said they were innovations, but Intel did adopt them first.

And wait, glued-together CPUs are a joke? So Zen 2 is a joke? Because they are using the exact same design to implement higher core counts. And if it was such a joke, why did Core 2 Quad wipe the floor with Phenom? Sometimes a design that works just works, even if you think it's a "joke".

And if we want to talk firsts, neither AMD nor Intel had the first 64-bit CPU. Between them, though, Intel did have a 64-bit CPU before AMD; it was just never mass-produced for the general populace. And Intel's was a true 64-bit CPU, not extensions added onto the existing design. In fact, AMD is the reason we are still stuck with the aging x86 design.

Everyone is bad at security. AMD, Intel, Apple, Google. There is not one company that has not had a flaw or been hacked. Just because we have not seen everything doesn't mean it doesn't exist.

But I digress. Both companies have pros and cons. Both are after your dollar and in the end couldn't care less how you view them.

Zen 2 "gluing" is of modules, not of actual cores. The Core 2 Quads had much higher IPC than the Phenoms, especially the Phenom I, which were garbage.

Well yeah, but I was talking about consumer products, not professional hardware. Consumer-wise, the Athlon 64 was the first, IIRC.

To be honest, I'm pretty sure that if Intel hadn't "insisted" on having separate sockets, basically splitting the industry in half back in the day, AMD wouldn't have had to resort to mere extensions AND we would still have more competition (think VIA, which is now basically dead).

Yes, I know no system is perfect, but Intel seems to stack up bugs/vulnerabilities.

True, in the end companies are just that: they want cash, plain and simple. It's up to us to make informed choices and not follow the herd.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
It was all about the market and the way they wanted to go. The problem in the PC world is that major changes are hard to push. Intel's IA64 was a pure 64-bit design, which means it would have to emulate x86 and thus take a performance loss. AMD64 was just x86 with 64-bit extensions added to it, which means it would still run x86 code the same as their other products, but also 64-bit code.

Because AMD's option was the easy way to go, that's where the market decided to go. If we had gone the route of IA64, it would have been a rough transition, as software would have had to be recoded for the new ISA.

In short, the market was too lazy to move in the direction that would have taken work and instead moved in the direction that took little to no work, and thus we are still on the x86 ISA, with its inherent limitations that are harder to work around.



The PC gaming market back in the SCSI days was even smaller than it is now and more niche. Yes, gamers used more exotic parts, we always have and still do, but the vast majority did not and had little to no need for them.

You cannot use a niche market to proclaim it was for the majority.

And until Intel ran into process tech issues, it wasn't the same. In fact, they had a well-working plan that provided decent uplifts in performance. I am sure that if AMD hadn't sold their fabs they would have run into issues as well, especially if you look at how ambitious Intel's original 10nm targets were.

And how much process tech have you studied? I did in college, and Intel had scholarships for people there. We got to go to their fabs once (along with a Texas Instruments fab), and I can tell you it is a complex product. It's insanely mind-blowing seeing it in real time versus just reading about it.

Again, all non-Intel PCs used SCSI and left IDE behind as early as 1990... My Amiga was a gaming PC and had a Quantum SCSI drive...

and when the Amiga went bankrupt I switched back to PC and never used IDE drives...

At that time, the Apple Mac II was around and AGAIN it used a SCSI drive as standard...

Intel made the better choices a "niche" market... they forced people to use IDE and low-speed ports for a long, long time just to sell them cheap products at the price of better technology... and they never changed.

When other PCs had 10-times-faster ports and faster SCSI HDDs as the standard choice for the SAME PRICE, Intel never allowed it to happen, for more than ten years...

As I said, Intel was always behind in technology compared to others... Sadly, Amiga and Apple failed to stay in the business market because they made stupid "business" choices and not stupid "innovation" choices...
 

bit_user

Titan
Ambassador
Oh my, yet another of these threads.

...

....document.writeln('My PC is better than your PC.')
Not exactly. Nobody has a Tiger Lake CPU, so there's no argument on that basis.

It's more like people arguing according to the bias of their preferred brand. Not saying everyone is biased, but biases tend to amplify the typical variations of opinions and perspectives.
 

bit_user

Titan
Ambassador
To all the folks asking about TDP of the Intel Tiger Lake 1165G7: it is generally assumed to be a 4C/8T 15 W TDP part.
So this would make it fairly comparable with current AMD Renoir 4xxx-U APUs operating at a 15 W TDP specification.
As @jimmysmitty already pointed out, the i7-1065G7 has a 15 W TDP, but is configurable up to 25 W!


As someone else already commented in this forum: AMD could easily regain that 4.4% performance lead from Intel's Tiger Lake by simply adding just one more CU to its GPU.
But the more important question is: do they really need to? Nobody is going to pick a notebook based on 4% better (or worse) iGPU performance, which is so negligible that it won't make any meaningful difference in everyday use.
Agreed. And if you care that much about GPU performance, you're probably buying a laptop with a dGPU, anyhow.
 

bit_user

Titan
Ambassador
I have an Intel 8750H 17" laptop and it's also under-cooled. I've seen an option with an 8th-gen i9 and the same cooler. So it happens on both teams. I'm keeping it at 80% max speed to avoid a hot/warm area under my hands.
My work laptop probably has borderline-adequate cooling. The CPU is a quad-core Skylake i7, plus it has an Nvidia GPU that I never really tax. The keyboard never gets hot, but the fans quickly spin up with a bit of load and make a fair bit of noise. My employer has so much crap running in the background that the fans are going most of the time, and rarely stay at one speed.

My solution was as you say. First, I tried changing the cooling policy to "passive" and that helped, but not enough. Eventually, I had to limit power to 90%. That made a massive difference in fan speeds, but now the machine is even more painfully slow. Even though it has an NVMe SSD, it feels like it actually has an HDD!
 

bit_user

Titan
Ambassador
It's amazing how many stories I have to hear about a part that is not released defeating a 2-year-old part. WOW!
Except the Ryzen 9 4900HS isn't 2 years old. It just launched like 10 weeks ago.

And AMD is not likely to have a new generation of laptop parts by the time Intel's i7-1165G7 reaches the market.
 

bit_user

Titan
Ambassador
You're right, comparing a pre-release product to a mature product is stupid. It takes AMD 6-12 months minimum after launch to get GPU drivers working reasonably well,
Being a Vega iGPU, there shouldn't really be any work to do. This is the 3rd generation of APUs with Vega graphics.

lord knows what a disaster they are months before release.
It's not the AMD chip that's months away from release.

 

bit_user

Titan
Ambassador
They're even bad at business because even while AMD's wiping the floor with their CPU division, they refuse to lower their CPU prices.
The reason Intel isn't lowering CPU prices is that they don't need to. They're still selling every chip they can make.

Where they could be suffering some erosion of their margins is in server CPUs. However, we wouldn't know, because the prices big customers actually pay aren't disclosed to the public.
 
  • Like
Reactions: TJ Hooker

bit_user

Titan
Ambassador
Again, all non-Intel PCs used SCSI and left IDE behind as early as 1990... My Amiga was a gaming PC and had a Quantum SCSI drive...

and when the Amiga went bankrupt I switched back to PC and never used IDE drives...

At that time, the Apple Mac II was around and AGAIN it used a SCSI drive as standard...
In 1997, Apple's PowerMac G3 switched over to IDE (though they had servers which still used SCSI).


IDE was commodity hardware. To compete, Apple switched to PCI and IDE. I forget if/when Power Macs switched to SIMMs, or if they always stuck with DIMMs and the PC world just caught up to them. Anyway, the first time I saw a G3, I was struck by the fact that it was basically a PC (aside from the CPU, of course).
 

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
What you said makes no sense.
Either way you cannot deny that Intel has closer working relationships.
So nothing of what I stated makes sense, or can you be more specific?

What Intel does is to go to these companies and say (indirectly):
If you allow us to provide "help" with the software, and don't make much of an investment in competing products, we will pay you lots of money (that you could use to market our products).

Clearly there must be something wrong when the entire yearly profit made by Dell was money received from Intel.

[Intel] They're even bad at business because even while AMD's wiping the floor with their CPU division, they refuse to lower their CPU prices.
No, that's wrong.
  1. If you look at the OEM market for consumer and office products, the vast majority of offers available are still based on Intel CPUs. AMD is not wiping the floor, yet.
  2. Intel sells about as much as they can produce (partly as a result of production problems), so there's no reason to cut prices.
  3. Only on the server side is AMD coming on strong for new installations. There, Intel is on life support, aided by all the security issues, which force existing installations to purchase more CPUs to make up for the processing power lost to the security fixes. Here Intel has cut their prices by around 50%.
 
  • Like
Reactions: bit_user

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
The reason Intel isn't lowering CPU prices is that they don't need to. They're still selling every chip they can make.

Where they could be suffering some erosion of their margins is in server CPUs. However, we wouldn't know, because the prices big customers actually pay aren't disclosed to the public.

Actually, Intel lowered their HEDT per-core CPU prices by half. It's just the low-end CPUs that didn't get a huge price cut, because they are still the best for gaming thanks to the higher clocks per core they can reach...

However, as time passes, people who game are also streaming and making videos of their gaming on their PCs, which demands more threads... and I think in the next two years Intel will have to reduce their consumer CPU prices or they will lose badly and AMD will dominate.
 

bit_user

Titan
Ambassador
Actually, Intel lowered their HEDT per-core CPU prices by half. It's just the low-end CPUs that didn't get a huge price cut, because they are still the best for gaming thanks to the higher clocks per core they can reach...
Yeah, I read that. It sort of falls in the category of what I said about server CPUs, although you're correct that they dropped the official list prices with the introduction of either Cascade Lake or Cascade Lake Refresh.

Anyway, what I thought @EdgeT was talking about was mainstream desktop CPUs, which is the point I was trying to address.
 

domih

Reputable
Jan 31, 2020
205
183
4,760
Not exactly. Nobody has a Tiger Lake CPU, so there's no argument on that basis.

It's more like people arguing according to the bias of their preferred brand. Not saying everyone is biased, but biases tend to amplify the typical variations of opinions and perspectives.

I'm OK with the linguistic nuance. So replace "_pc" with "_pc_brand" :)

The notion is the same however: one should not care about the Red team against the Blue team but rather about which part satisfies one's need, also assuming that, all other things being equal, two solutions within 5% performance are basically equal due to margin of error and barely noticeable in use anyway.
 

bit_user

Titan
Ambassador
The notion is the same however: one should not care about the Red team against the Blue team but rather about which part satisfies one's need,
That's not realistic, though. People have their own brand preferences and are innately tribal (if sports fans haven't proven this, I don't know what will).

also assuming that all other things being equal
Except they're not. The CPU performance is somewhat lopsided towards AMD, in this comparison.

two solutions within 5% performance are basically equal due to margin of error and barely noticeable in use anyway.
Agreed. That's basically what I've been saying. Also, people who care about GPU performance that much will likely already be in the market for a laptop with a separate GPU, anyhow.

I think you might be missing the main point of this article, however. Since the dawn of iGPUs and APUs, about a decade ago, AMD has always had a strong lead on graphics performance. So, it's a pretty big deal that Intel seems to have convincingly caught up with AMD on this front.
 

TJ Hooker

Titan
Ambassador
I don't think it's fair to lay all the blame for x86 still being around on AMD, just because they're the ones who debuted x86-64. Anyone could have developed a better ISA, but no one did. Even though x86 is sub-optimal, it apparently still managed to be the best option at the time.
 
I don't think it's fair to lay all the blame for x86 still being around on AMD, just because they're the ones who debuted x86-64. Anyone could have developed a better ISA, but no one did. Even though x86 is sub-optimal, it apparently still managed to be the best option at the time.

Intel did develop a new ISA: IA64. It was a pure 64-bit ISA. It wasn't that x86-64 was better, though; it won because it was easier. Intel still held a majority market share when they launched Itanium and probably could have pushed harder, but it probably also would have come at a much greater cost.

The market was lazy. If AMD had developed a pure 64-bit ISA instead of adding to x86, it would have failed as well.
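
For anyone curious, here is a minimal sketch (my own illustration, not something from the article or the posts above) of just how "bolted on" AMD64 is: 64-bit support isn't exposed through any new mechanism at all, it is simply advertised via the decades-old x86 CPUID instruction, through the AMD-defined extended leaf 0x80000001 (EDX bit 29, the "LM" long-mode flag). This assumes GCC or Clang on an x86 machine; build with: gcc lm_check.c -o lm_check

/* lm_check.c - report whether the CPU advertises x86-64 long mode. */
#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang wrapper around the x86 CPUID instruction */

int main(void)
{
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

    /* Extended leaf 0x80000001 holds the AMD-defined extended feature flags. */
    if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        printf("Extended CPUID leaf not supported on this CPU.\n");
        return 1;
    }

    /* EDX bit 29 is the "LM" (long mode, i.e. x86-64) flag. */
    if (edx & (1u << 29))
        printf("CPU reports x86-64 long mode support.\n");
    else
        printf("CPU is 32-bit x86 only (no long mode).\n");

    return 0;
}

The same CPUID mechanism (and the same legacy 16/32-bit operating modes) still sits underneath every x86-64 chip today, which is exactly the backward compatibility that made AMD64 the path of least resistance.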
 

spongiemaster

Admirable
Dec 12, 2019
2,346
1,325
7,560
I don't think it's fair to lay all the blame for x86 still being around on AMD, just because they're the ones who debuted x86-64. Anyone could have developed a better ISA, but no one did. Even though x86 is sub-optimal, it apparently still managed to be the best option at the time.
AMD was last to market with a 64-bit ISA, and they took the easiest route because they had to come up with something or get locked out of the CPU industry, and that's pretty much all they were capable of. The one major advantage of taking the shortcut was that it ran all existing x86 code without penalty. All the ground-up ISAs had to emulate x86 code, which resulted in huge performance penalties. It's a difficult marketing route to tell your customer base their brand-new 64-bit CPU will run all their existing programs like turtles in molasses in January. jimmysmitty is right, the market took the lazy route. They chose the easier and cheaper route in the short term, with complete backwards compatibility, instead of something truly new that would have been better for the industry in the long term.
AMD was last to market with a 64 bit ISA and they took the easiest route because they had to come up with something, or get locked out of the CPU industry, and that's pretty much all they were capable of. The one major advantage taking the shortcut had was that it ran all existing x86 code without penalty. All the ground up ISA's had to emulate x86 code which resulted in huge performance penalties. It's a difficult marketing route to tell your customer base their brand new 64bit CPU will run all their existing program like turtles in molasses in January. jimmysmitty is right, the market took the lazy route. They chose the easiest and cheaper route in the short term with complete backwards compatibility instead of something truly new that would have been better for the industry in the long term.