News Intel Gen12 Xe Graphics Leads AMD's Vega in Integrated GPU Performance


domih

Reputable
Jan 31, 2020
188
170
4,760
Oh my, yet another of these threads.

let myPC = ...;
let yourPC = ...;

if (myPC > yourPC) {
    document.writeln('My PC is better than your PC.');
} else if (myPC == yourPC) {
    document.writeln('My PC is better than your PC.');
} else {
    document.writeln('My PC is better than your PC.');
}
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
AMD is still the winner for one reason: they are keeping their integrated-graphics potential in their pocket. Any time they sense danger, they can increase the CU count and win big.

They are simply not using many CUs because they don't need to in order to control the market and the competition. There is no real threat here.
 

ottonis

Reputable
Jun 10, 2020
166
133
4,760
To all the folks asking about the TDP of the Intel Tiger Lake i7-1165G7: it is generally assumed to be a 4C/8T, 15 W TDP part.
So this would make it fairly comparable with current AMD Renoir 4xxx-U APUs that operate at a 15 W TDP specification.
The Tiger Lake line of CPUs is expected to be announced in Q3 2020, so it presumably won't be long before we see these little babies entering the market.

This is actually a good thing for the consumer, as it of course puts some additional pressure on AMD to further optimize their current Renoir mobile APUs or accelerate the development of next-gen APUs.

As someone else already commented in this forum: AMD could easily regain that 4.4% performance lead held by Intel Tiger Lake by simply adding one more CU to its GPU.
But the more important question is: do they really need to? Nobody is going to pick a notebook based on 4% better (or worse) iGPU performance, which is so negligible that it won't make any meaningful difference in everyday use.
Much more important would be some juicy, power-saving hardware encoding/decoding blocks for H.264 and other common video codecs. To my knowledge, AMD could seriously improve on this - and they should.

All in all - being a huge AMD fan myself, I am not unhappy with Intel offering some serious competition, as this will further stimulate the spirits at AMD and let them accelerate their innovation even further.
 
"If the latest Time Spy benchmarks (via @TUM_APISAK) are accurate"...."That means Intel's Gen12 Xe Graphics delivered up to 4.4% higher performance"

So that is to say: if AMD's iGPU can deliver 45 FPS in a game, then perhaps (a BIG maybe) Intel's iGPU could deliver 46.98 FPS. I guess that's a win - if, and only if, a "result" from a synthetic benchmark actually translated into real gaming performance.

And that's without even knowing what kind of RAM configuration, TDP configuration, etc. the i7 was tested with.
 
It'd be nice if this were based on some actual reporting. I suspect what happens is they sell the Ryzen models down-market, and simply shave costs by under-cooling them.

What will be interesting to see is whether that still happens with AMD's top-end APUs. Now that they're delivering market-leading CPU performance, they might get positioned as a premium product, rather than a value model.
That makes sense. I have an Intel i7-8750H 17-inch laptop and it's also under-cooled. I've seen a variant offered with an 8th-gen i9 using the same cooler, so it happens on both teams. I keep mine at 80% max speed to avoid a hot area under my hands.
 

Wendigo

Distinguished
Nov 26, 2002
133
35
18,620
Interesting, but let's wait to see the actual chip and its pricing. This kind of IGP doesn't mean that much in a high-end part like an i7 or Ryzen 9. If you can buy these chips and everything that comes with them, you can probably also afford a decent graphics card to pair with them.

The real question for me is whether Intel will be able to integrate these Xe graphics into budget and mainstream chips (like the i3 and i5, or even Pentium) while still keeping the price competitive, because this kind of APU is much more interesting for a budget build than for a high-end system. Adding $100 to the price of an already pricey i7 isn't a big problem; adding the same amount to an i3 or i5 is enough to make it uncompetitive on the market.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
come on guys!!
we are talking about next-gen Intel vs. a two-year-old Vega 8
You're right, comparing a pre-release product to a mature product is stupid. It takes AMD 6-12 months minimum after launch to get GPU drivers working reasonably well; lord knows what a disaster they are months before release. Try using pre-release Vega 8 drivers in the comparison and the deficit will be far more than 4.4%. How about waiting for an actual product review before getting all bent out of shape?
 
Haha, keep trying Intel, you're a dinosaur.

A dinosaur that keeps making tons of money and still has a lot of innovative designs.

We don't even know the TDP of the Intel part! When we are talking laptop APUs, TDP is everything. If it's a 45 W part then Intel is actually losing to AMD. Let's wait and see before we claim an Intel victory over some random leak.

It would be a safe assumption that this will slot into the same spot as the i7-1065G7, which is a 15 W part configurable up to 25 W. It's using 10nm+, which means they can probably get more clock speed and performance in the same thermal envelope.

I highly doubt it's a 45 W part. It would make no logical sense for Intel to make the next-gen part the same but with a better iGPU and throw it into a much higher TDP class.

I agree that AMD would win with just one more CU, but the numbers are probably a bit off, since performance doesn't scale in a linear fashion.

Compared to how Intel's current IGPs perform, it's a huge leap!

One important reason Intel is so big in laptops is their marketing grip on the manufacturers.
I suspect that's still a factor, given Asus' designs for the latest Ryzen-based gaming laptops. (Totally inferior cooling that reduces computing performance!) A manufacturer that makes Ryzen laptops without built-in flaws probably won't get any money from Intel.

Intel is big in laptops because they tend to have better relationships with OEMs than AMD does and work more closely with them. The Ultrabook was an Intel design. Project Athena is another. It's one thing to design a system and its cooling on your own, another to have the hardware manufacturer help you design the best possible solution.

AMD still has a ton of work to do building OEM relationships and matching what Intel does. Everyone thinks Intel just throws money around - which is true in that they invest a ton into these kinds of things - but they also work with OEMs and software developers.

Kind of a benefit of making money hand over fist.
 
Yes, through much illegal and immoral doing. Morally, they have always been bankrupt; technically, they have only ever been good performance-wise - feature-wise, they have always been behind.

Except that until recently AMD was the one behind. Intel was almost always ahead with features: first with integrated USB 3, first with DDR4, and first with PCIe 3.0. AMD jumping on PCIe 4.0 first is a rarity.

And "illegal" depends. Both have done shady things, and some people think certain things are illegal which are not.

Either way, it's a profitable company, and they have multiple innovations that have come and will come to market. Without a giant like Intel, a lot of things would not see the light of day. No matter what anyone buys, Intel has their hands in pretty much every single piece of technology, from USB to the 802.11 protocols.

FYI, I wouldn't call Phenom I or Bulldozer Intel's doing at all. Those were just bad choices by AMD execs that led to hard financial times for the company and a boon for Intel, who then dominated the HPC/server market (that's what happens with no real competition) and a massive slice of the consumer market.
 

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
It'd be nice if this were based on some actual reporting. ...
What will be interesting to see is whether that still happens with AMD's top-end APUs. Now that they're delivering market-leading CPU performance, they might get positioned as a premium product, ...
You think this marketing material has a flair of "low budget"?
From Asus: "... ROG Zephyrus G14 is the world's most powerful 14-inch Windows 10 Pro gaming laptop. Outclass the competition..."
That's just one of the laptops with insufficient cooling.

Intel is big in laptops because they tend to have better relationships with OEMs than AMD ...
Intel is able to have close relationships with OEMs because they've been very big in laptops and servers for a long time.
When it's "cheaper" for an OEM to buy parts from Intel than to get comparable parts for free from AMD, there's something wrong with the relationship, though...
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
A dinosaur that keeps making tons of money and still has a lot of innovative designs.



The Ultrabook was an Intel design. Project Athena is another. It's one thing to design a system and its cooling on your own, another to have the hardware manufacturer help you design the best possible solution.

Sorry, but the Ultrabook was an APPLE design - the MacBook Air. Intel stole it from Apple.

Not even that: it was Steve Jobs who convinced Intel to focus on low-voltage CPUs, and the first Intel-based MacBook Air was his design - he even told them to cut its size down. I still remember the Intel CEO saying "what Steve Jobs did with the MacBook Air and low-voltage CPUs was an EYE OPENER."

As for the rest, Intel was always behind in tech. The external ports on Intel motherboards were always behind compared to Apple's. Even the hard-disk interface stayed ATA/IDE for ten years after everyone else had abandoned it in favor of SCSI as the standard in their machines. Even the AMIGA computers in the early '90s never used IDE and used SCSI hard drives.

Parallel ports stayed around for a long time as well - stupid USB 1 when Apple had FireWire 800 for TEN YEARS...

Intel is not innovative. Period. They love money only - they kept us on 4-core CPUs just to milk our money.
 
You think this marketing material has a flair of "low budget"?
From Asus: "... ROG Zephyrus G14 is the world's most powerful 14-inch Windows 10 Pro gaming laptop. Outclass the competition..."
That's just one of the laptops with insufficient cooling.

Intel is able to have close relationships with OEMs because they've been very big in laptops and servers for a long time.
When it's "cheaper" for an OEM to buy parts from Intel than to get comparable parts for free from AMD, there's something wrong with the relationship, though...

What you said makes no sense.

Either way, you cannot deny that Intel has closer working relationships. Intel's software team dwarfs most software companies.

It's something AMD has lacked and needs to build in order to really compete on an even playing field with Intel.

Sorry, but the Ultrabook was an APPLE design - the MacBook Air. Intel stole it from Apple.

Not even that: it was Steve Jobs who convinced Intel to focus on low-voltage CPUs, and the first Intel-based MacBook Air was his design - he even told them to cut its size down. I still remember the Intel CEO saying "what Steve Jobs did with the MacBook Air and low-voltage CPUs was an EYE OPENER."

As for the rest, Intel was always behind in tech. The external ports on Intel motherboards were always behind compared to Apple's. Even the hard-disk interface stayed ATA/IDE for ten years after everyone else had abandoned it in favor of SCSI as the standard in their machines. Even the AMIGA computers in the early '90s never used IDE and used SCSI hard drives.

Parallel ports stayed around for a long time as well - stupid USB 1 when Apple had FireWire 800 for TEN YEARS...

Intel is not innovative. Period. They love money only - they kept us on 4-core CPUs just to milk our money.

And where is FireWire now?

USB 4 is going to integrate Thunderbolt because Thunderbolt is superior to USB.

And do you really think Intel wasn't involved or didn't have SCSI earlier? ANSI is the body that published SCSI. Guess who is a member of that group? Intel. Guess who isn't? AMD. SCSI was more heavily used in servers than in consumer drives, and by the time it trickled down to consumers, SATA was here and a better option, especially when not having to worry about the chain.

What company doesn't love money? All companies love money and will find the best way to make it.
 

EdgeT

Distinguished
Jan 8, 2009
280
7
18,815
Except that until recently AMD was the one behind. Intel was almost always ahead with features: first with integrated USB 3, first with DDR4, and first with PCIe 3.0. AMD jumping on PCIe 4.0 first is a rarity.

And "illegal" depends. Both have done shady things, and some people think certain things are illegal which are not.

Either way, it's a profitable company, and they have multiple innovations that have come and will come to market. Without a giant like Intel, a lot of things would not see the light of day. No matter what anyone buys, Intel has their hands in pretty much every single piece of technology, from USB to the 802.11 protocols.

FYI, I wouldn't call Phenom I or Bulldozer Intel's doing at all. Those were just bad choices by AMD execs that led to hard financial times for the company and a boon for Intel, who then dominated the HPC/server market (that's what happens with no real competition) and a massive slice of the consumer market.

I don't call USB 3, DDR4, and PCIe 3 innovations.
Innovation = (AMD) first multi-core, first 64-bit CPU, etc. Intel's first multi-core chips were just single cores "glued" together - what a joke.

At that rate, I could mention the Pentium 4, which was an embarrassment, even if Bulldozer was probably AMD's greatest failure.

I agree, all companies do shady stuff, but Intel seems to do it with a never-ending passion.

Yes, they have their hands in pretty much everything, which raises quite a few red flags, knowing how bad they are at security.

They're even bad at business, because even while AMD is wiping the floor with their CPU division, they refuse to lower their CPU prices.

To me, Intel is the Apple of the x86 world. And I'm not saying this as an AMD fanboy - just check my signature. It's just that, to me, Sandy Bridge was their last line of CPUs worth anything; it was either that or *shivers* Bulldozer.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
I don't call USB 3, DDR4, and PCIe 3 innovations.
Innovation = (AMD) first multi-core, first 64-bit CPU, etc. Intel's first multi-core chips were just single cores "glued" together - what a joke.

To me, Intel is the Apple of the x86 world. And I'm not saying this as an AMD fanboy - just check my signature. It's just that, to me, Sandy Bridge was their last line of CPUs worth anything; it was either that or *shivers* Bulldozer.

Not sure what it is with AMD fanboys, but more than any other fanboy they attribute inventions to their company that it didn't actually invent. The first multi-core CPU was made by IBM, four years before the first dual-core Opteron. First 64-bit CPU? Seriously? Depending on the definition of 64-bit, the first 64-bit CPU was released no later than the early '90s, a decade and a half before AMD's. Intel released the far more innovative Itanium years before AMD released an x86-64 CPU. AMD was one of, if not the, last of the major CPU manufacturers to reach 64-bit, and their 64-bit band-aid on the x86 instruction set has crippled the advancement of the industry, leaving us stuck with an instruction set from the 1970s that should have been scrapped two decades ago.
 
I don't call USB 3, DDR4, and PCIe 3 innovations.
Innovation = (AMD) first multi-core, first 64-bit CPU, etc. Intel's first multi-core chips were just single cores "glued" together - what a joke.

At that rate, I could mention the Pentium 4, which was an embarrassment, even if Bulldozer was probably AMD's greatest failure.

I agree, all companies do shady stuff, but Intel seems to do it with a never-ending passion.

Yes, they have their hands in pretty much everything, which raises quite a few red flags, knowing how bad they are at security.

They're even bad at business, because even while AMD is wiping the floor with their CPU division, they refuse to lower their CPU prices.

To me, Intel is the Apple of the x86 world. And I'm not saying this as an AMD fanboy - just check my signature. It's just that, to me, Sandy Bridge was their last line of CPUs worth anything; it was either that or *shivers* Bulldozer.

Never said they were innovations, but Intel did adopt them first.

And wait, glued-together CPUs are a joke? So Zen 2 is a joke? Because it uses the same basic approach to implement higher core counts. And if it was such a joke, why did the Core 2 Quad wipe the floor with Phenom? Sometimes a design that works just works, even if you think it's a "joke".

And if we want to talk firsts, neither AMD nor Intel had the first 64-bit CPU. Between the two of them, though, Intel did have a 64-bit CPU before AMD; it was just never mass-produced for the general populace. And Intel's was a true 64-bit CPU, not extensions bolted onto the existing design. In fact, AMD is the reason we are still stuck with the aging x86 design.

Everyone is bad at security: AMD, Intel, Apple, Google. There is not one company that has not had a flaw or been hacked. Just because we have not seen everything doesn't mean it doesn't exist.

But I digress. Both companies have pros and cons. Both are after your dollar and, in the end, couldn't care less how you view them.
 

ottonis

Reputable
Jun 10, 2020
166
133
4,760
[...]

And if we want to talk firsts, neither AMD nor Intel had the first 64-bit CPU. Between the two of them, though, Intel did have a 64-bit CPU before AMD; it was just never mass-produced for the general populace. And Intel's was a true 64-bit CPU, not extensions bolted onto the existing design. In fact, AMD is the reason we are still stuck with the aging x86 design.
[...]

To my knowledge, AMD has had a significantly smaller market share than Intel most of the time. And for the larger part of the last decade, AMD CPUs trailed Intel with regard to performance and features.
Without market dominance, how can an "underdog" like AMD magically force its bigger competitor and the entire market to stick with a certain CPU architecture?

So, I would be very grateful if you could elaborate on your statement that "AMD is the reason we are still stuck with the aging x86 design".

Thanks in advance!
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
And where is FireWire now?

USB 4 is going to integrate Thunderbolt because Thunderbolt is superior to USB.

And do you really think Intel wasn't involved or didn't have SCSI earlier? ANSI is the body that published SCSI. Guess who is a member of that group? Intel. Guess who isn't? AMD. SCSI was more heavily used in servers than in consumer drives, and by the time it trickled down to consumers, SATA was here and a better option, especially when not having to worry about the chain.

What company doesn't love money? All companies love money and will find the best way to make it.

Eh, FireWire is old today, yes, but Intel was way behind at the time - for like ten long years.

As for SCSI, you are wrong about it being mostly for servers. All high-end gaming PCs had SCSI add-in cards before WD introduced the Raptor drives and before SATA was around. Enthusiasts never touched IDE drives; people who spent $3,000 on a gaming machine got 10-15k RPM SCSI drives, not stupid IDE ones, used RAID as well, and got the performance they needed for a top machine. And I was one of them...

And finally, yes, all companies love to make money, but some give you something REAL in return, and others like Intel keep selling you the old tech again and again and again...
 
To my knowledge, AMD has had a significantly smaller market share than Intel most of the time. And for the larger part of the last decade, AMD CPUs trailed Intel with regard to performance and features.
Without market dominance, how can an "underdog" like AMD magically force its bigger competitor and the entire market to stick with a certain CPU architecture?

So, I would be very grateful if you could elaborate on your statement that "AMD is the reason we are still stuck with the aging x86 design".

Thanks in advance!

It was all about the market and the way it wanted to go. The problem in the PC world is that major changes are hard to push. Intel's IA-64 was a pure 64-bit design, which meant it had to emulate x86 and thus took a performance hit on existing code. AMD64 was just x86 with 64-bit extensions added, which meant it ran existing x86 code the same as AMD's other products while also running 64-bit code.

Because AMD's option was the easy way to go, that's where the market decided to go. If we had gone the IA-64 route, it would have been a rough transition, as software would have had to be recoded for the new ISA.

In short, the market was too lazy to move in the direction that would have taken work, and instead moved in the direction that took little to no work. Thus we are still on the x86 ISA, which has inherent limitations that are harder to work around.

Eh, FireWire is old today, yes, but Intel was way behind at the time - for like ten long years.

As for SCSI, you are wrong about it being mostly for servers. All high-end gaming PCs had SCSI add-in cards before WD introduced the Raptor drives and before SATA was around. Enthusiasts never touched IDE drives; people who spent $3,000 on a gaming machine got 10-15k RPM SCSI drives, not stupid IDE ones, used RAID as well, and got the performance they needed for a top machine. And I was one of them...

And finally, yes, all companies love to make money, but some give you something REAL in return, and others like Intel keep selling you the old tech again and again and again...

The PC gaming market back in the SCSI days was even smaller than it is now, and more niche. Yes, gamers used more exotic parts - we always have and still do - but the vast majority did not and had little to no need for them.

You cannot use a niche market to claim something was mainstream.

And until Intel ran into process-tech issues, it wasn't the same story. In fact, they had a well-working plan that provided decent uplifts in performance. I am sure that if AMD hadn't sold their fabs, they would have run into issues as well, especially if you look at how ambitious Intel's original 10nm plans were.

And how much process tech have you studied? I did in college, and Intel had scholarships for people there. We got to visit their fabs once (along with a Texas Instruments fab), and I can tell you a chip is a complex product. It's insanely mind-blowing seeing it in real life versus just reading about it.