News Intel Demos Tiger Lake's Xe Graphics on Early Laptop Sample with Battlefield V

Shadowclash10

Prominent
May 3, 2020
184
46
610
Correct me if I'm wrong, but don't AMD's Renoir APUs run BF5 at more than 30 FPS at 1920x1080 high settings?
Yep. I mean, Renoir iGPUs are apparently just a little behind an Nvidia MX250 - roughly 5-10 frames ahead of the Tiger Lake GPUs. Not way ahead, but still ahead. You can basically play any recent game @720p with a Renoir iGPU, and some games @1080p.
 
"showing an early laptop (possibly a software development system) "
"Early drivers/sw "

It's promising at least; hopefully they will improve performance further before launch, but even as it stands it's a huge improvement over what Intel has had until now.
 

excalibur1814

Distinguished
Sep 12, 2009
212
61
18,670
"30 frames per second, occasionally going as high as 32 fps. "

Good lord, 32fps!? If it's not hitting 60fps then it's a bit pointless. Plus, more importantly, what's the MINIMUM frame rate? Ideally, I'd want an absolute minimum of 30fps at the low end and something as close to 60+fps as possible. This is PC gaming, not console gaming. :)
 

pudubat

Commendable
Feb 8, 2018
8
2
1,515
"30 frames per second, occasionally going as high as 32 fps. "

Good lord, 32fps!? If it's not hitting 60fps then it's a bit pointless. Plus, more importantly, what's the MINIMUM frame rate? Ideally, I'd want an absolute minimum of 30fps at the low end and something as close to 60+fps as possible. This is PC gaming, not console gaming. :)
You are missing the point: this is a thin-and-light laptop, not a gamer-oriented laptop. What this actually says is: "We're bringing gaming to your work laptop."
 

thGe17

Reputable
Sep 2, 2019
70
23
4,535
Correct me if I'm wrong, but don't AMD's Renoir APUs run BF5 at more than 30 FPS at 1920x1080 high settings?

Yes, you are probably wrong. ;-)
For example, the Acer Swift 3 with its 4700U only manages 32 fps in FC5 at 1366x768 on "normal" quality.
In Rise of the Tomb Raider it achieves 40 fps at 1366x768, DX12 "Medium", w/o AA.
Additionally, most BF5 Full HD benchmarks on YouTube use "Low" settings (sometimes with a few selected options set to "Medium"), so the result from this iGPU is quite good.
Beyond that, an iGPU is most likely irrelevant here anyway, because you would want to use a dedicated mobile chip.
 
Apparently I was mistaken, as AdoredTV claims the 4900HS was 20% slower - although it was running in DirectX 11 and not 12, so how the FPS was affected is questionable, plus a single game does not a true comparison make.

Still, if Xe is coming along this well, it should spur AMD to get their butts in gear and invest heavily in their graphics division so RDNA (gaming) cards can actually be on par with Nvidia again, both in terms of PRICE and performance - not to mention fixing their garbage software...

 
Apparently I was mistaken, as AdoredTV claims the 4900HS was 20% slower - although it was running in DirectX 11 and not 12, so how the FPS was affected is questionable, plus a single game does not a true comparison make.
DX12 on an AMD APU is probably 5% faster, perhaps 10% at most. It varies by game, but usually DX12 is only a major boost to performance in a few games, and more on high-end AMD dedicated GPUs than on integrated graphics.
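For a rough sense of how far an uplift in that range would go toward closing a 20% DX11 deficit, here's a minimal sketch taking both figures at face value (all numbers hypothetical, nothing here is measured):

```python
# Hypothetical sketch: how far would a 5-10% DX12 uplift go toward
# closing a 20% DX11 deficit? Numbers taken at face value from the
# posts above, not from any benchmark.
tgl_fps = 30                   # Tiger Lake demo result (fps)
apu_dx11 = tgl_fps * 0.80      # APU ~20% slower under DX11

for uplift in (0.05, 0.10):    # plausible DX12 gains per the reply above
    apu_dx12 = apu_dx11 * (1 + uplift)
    deficit = 1 - apu_dx12 / tgl_fps
    print(f"+{uplift:.0%} DX12 uplift -> still {deficit:.0%} slower")
# +5% -> still 16% slower; +10% -> still 12% slower
```

So even granting the full 10%, the claimed gap would only narrow to roughly 12%.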
 

rtoaht

Reputable
Jun 5, 2020
119
124
4,760
Correct me if I'm wrong, but don't AMD's Renoir APUs run BF5 at more than 30 FPS at 1920x1080 high settings?
This guy simulated a Ryzen 7 4800HS to perform like a Ryzen 7 4700U, with 1080p low-medium settings and 60% resolution scale, and he is getting around 55 fps on average. These settings are nowhere near the settings Tiger Lake was running: 1080p, all high, 100% resolution scale. With a Ryzen 7 4800U you would get maybe 4-5 more fps. P.S. The drivers are also at an early development stage.


https://www.youtube.com/watch?v=JuzDJMiSn_o&t=37s
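For perspective on that 60% resolution scale, here's a quick sketch of the actual pixel load, assuming BF5 applies the scale to each axis (as most DICE titles do) - rough numbers only:

```python
# Effective render resolution at a 60% resolution scale, assuming the
# scale applies per axis (typical for DICE games) -- rough numbers only.
native_w, native_h = 1920, 1080
scale = 0.60

render_w, render_h = int(native_w * scale), int(native_h * scale)
pixel_ratio = (render_w * render_h) / (native_w * native_h)

print(f"Render resolution: {render_w}x{render_h}")        # 1152x648
print(f"Pixel load vs native 1080p: {pixel_ratio:.0%}")   # 36%
```

Under those assumptions, the 55 fps run was pushing only about a third of the pixels of the Tiger Lake demo.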
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
There are a couple of motherboards that do "unofficially" offer it. The problem is that Thunderbolt is an Intel specification and requires certification by Intel for the official "Thunderbolt" stamp of approval, and the licensing fees for non-Intel equipment I understand to be "extreme".
Thunderbolt 4 - which is USB4 plus Thunderbolt 3 - requires certification from Intel. Thunderbolt 3 is license-free and can be integrated on any board - AMD or Intel.

https://bit-tech.net/news/tech/peripherals/intel-thunderbolt-3-licensing/1/

Thunderbolt 4 is a certification - and there are no licensing fees involved there either. Intel wants to make sure that TB4 meets their specification - there may be a one-time fee. So far only one AMD motherboard offers TB4 - that was announced a few months ago, and I can't find the link.
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
Apparently I was mistaken, as AdoredTV claims the 4900HS was 20% slower - although it was running in DirectX 11 and not 12, so how the FPS was affected is questionable, plus a single game does not a true comparison make.

Still, if Xe is coming along this well, it should spur AMD to get their butts in gear and invest heavily in their graphics division so RDNA (gaming) cards can actually be on par with Nvidia again, both in terms of PRICE and performance - not to mention fixing their garbage software...


I am not sure Adored is or was ever a good source of info - he has some good guesses about a few things, but I suspect his Talking Elmo doll is his source for most of it. Benchmarks are great - but neither he nor anyone else has production silicon. Just like ICL, I would imagine that the 1165G7 would also contain the top-end (/7) graphics - which is 96EU on Tiger Lake.

I am glad to see Xe LP doing well - not sure which chip this was - I know the 28W part is coming to NUCs. Even the Gen11 in Ice Lake is 2x faster than Gen9.5 - and Xe LP should be 2x as fast as Gen11. Whether it's running on DX11 or DX12 is the next foxhole - since most of the other foxholes have been OBE (overtaken by events) as more and more info about Xe is released. I am not looking at an ultralight for gaming - this is more a demo ahead of the release of Xe HP into the compute and consumer graphics markets later this year.
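Taking the quoted 2x figures at face value, the implied cumulative uplift works out like this (a back-of-the-envelope sketch, not a benchmark):

```python
# Implied relative iGPU performance if both quoted 2x claims hold.
gen9_5 = 1.0          # Gen9.5 (e.g., UHD 620) baseline
gen11 = gen9_5 * 2    # Ice Lake Gen11: claimed ~2x Gen9.5
xe_lp = gen11 * 2     # Tiger Lake Xe LP: claimed ~2x Gen11

print(f"Xe LP vs Gen9.5: {xe_lp / gen9_5:.0f}x")  # -> 4x
```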

R&D is virtually stalled at AMD - their lack of significant profits means that they have not been able to invest. The thing about R&D is that saving money now means significant pain later. Both their consumer and data center parts are behind the curve, and it's next to impossible to dislodge Nvidia from their throne - they invented the GPU compute concept and plowed resources - hardware, software and expertise - into the community, so now CUDA is a de facto standard...

I agree that AMD has garbage software - from BIOS to graphics drivers - total garbage.
 
I am not sure Adored is or was ever a good source of info - he has some good guesses about a few things, but I suspect his Talking Elmo doll is his source for most of it. Benchmarks are great - but neither he nor anyone else has production silicon. Just like ICL, I would imagine that the 1165G7 would also contain the top-end (/7) graphics - which is 96EU on Tiger Lake.

I am glad to see Xe LP doing well - not sure which chip this was - I know the 28W part is coming to NUCs. Even the Gen11 in Ice Lake is 2x faster than Gen9.5 - and Xe LP should be 2x as fast as Gen11. Whether it's running on DX11 or DX12 is the next foxhole - since most of the other foxholes have been OBE (overtaken by events) as more and more info about Xe is released. I am not looking at an ultralight for gaming - this is more a demo ahead of the release of Xe HP into the compute and consumer graphics markets later this year.

R&D is virtually stalled at AMD - their lack of significant profits means that they have not been able to invest. The thing about R&D is that saving money now means significant pain later. Both their consumer and data center parts are behind the curve, and it's next to impossible to dislodge Nvidia from their throne - they invented the GPU compute concept and plowed resources - hardware, software and expertise - into the community, so now CUDA is a de facto standard...

I agree that AMD has garbage software - from BIOS to graphics drivers - total garbage.

Salty and behind the times much?

The 4900HS blows away Intel laptops on just about any standard test that is computationally expensive. The videos from professional reviewers are all over the web. They just offer more cores and consume less power in the process. The cost of offering more cores is that there is less silicon/power available for graphics. AMD's next-gen APU for laptops will address this shortcoming and switch over to Navi, which will blow away anything Intel has for some time to come (until Intel's fabs can get their head out of their tail and start making progress on smaller nodes, <10nm... which doesn't appear to be happening any time soon).

NVIDIA will continue to be a problem for AMD. I will admit that. But there's a "synergy" or "halo effect" growing around AMD's high-end products, and that's carrying over to their GPU division. The 5700 XT still offers the best bang for the buck in its category. The 2070 is slower. The 2070 Super is $100 more for <5% difference.
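A quick perf-per-dollar sketch of that claim, using launch MSRPs ($399 for the 5700 XT, $499 for the 2070 Super - prices assumed here, not from the post) and the ~5% gap stated above:

```python
# Rough value comparison; prices are launch MSRPs (assumed), performance
# is normalized to the 5700 XT with the ~5% gap stated above.
cards = {
    "RX 5700 XT": {"price": 399, "perf": 1.00},
    "RTX 2070 Super": {"price": 499, "perf": 1.05},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 100:.3f} perf per $100")
# 5700 XT ~0.251 vs 2070 Super ~0.210 -> ~19% better value for the 5700 XT
```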

The super high end (2080 Tis) is <3% of the AIB market (let alone all chips, including APUs/iGPUs). That said, having a high-end topper is critical to a brand's image. NVIDIA does very well here. However, AMD is dumping a tremendous amount of money back into R&D (including RTG) and still turning an excellent profit. Their 3000 series is selling better than they hoped. Their debt is quickly dwindling.
 
I am not sure Adored is or was ever a good source of info - he has some good guesses about a few things, but I suspect his Talking Elmo doll is his source for most of it. Benchmarks are great - but neither he nor anyone else has production silicon. Just like ICL, I would imagine that the 1165G7 would also contain the top-end (/7) graphics - which is 96EU on Tiger Lake.

I am glad to see Xe LP doing well - not sure which chip this was - I know the 28W part is coming to NUCs. Even the Gen11 in Ice Lake is 2x faster than Gen9.5 - and Xe LP should be 2x as fast as Gen11. Whether it's running on DX11 or DX12 is the next foxhole - since most of the other foxholes have been OBE (overtaken by events) as more and more info about Xe is released. I am not looking at an ultralight for gaming - this is more a demo ahead of the release of Xe HP into the compute and consumer graphics markets later this year.

R&D is virtually stalled at AMD - their lack of significant profits means that they have not been able to invest. The thing about R&D is that saving money now means significant pain later. Both their consumer and data center parts are behind the curve, and it's next to impossible to dislodge Nvidia from their throne - they invented the GPU compute concept and plowed resources - hardware, software and expertise - into the community, so now CUDA is a de facto standard...

I agree that AMD has garbage software - from BIOS to graphics drivers - total garbage.
I was with you until the last two paragraphs, where you jumped the shark. AMD is certainly behind Nvidia when it comes to the GPU market, particularly in the pro space and supercomputers. But I don't think AMD is stalled ... just trailing as usual.

I will say I've had more issues with AMD GPU BIOS and drivers (especially the RX 5600 XT) than Nvidia cards -- heck, I've got a 5600 XT that seems to refuse to work properly on some PCs for whatever reason. But I've got other 5600 XT cards that work just fine in the same system. It's more a problem with the specific card, which was originally a 12 Gbps, lower-clocked card that got a VBIOS update.
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
I was with you until the last two paragraphs, where you jumped the shark. AMD is certainly behind Nvidia when it comes to the GPU market, particularly in the pro space and supercomputers. But I don't think AMD is stalled ... just trailing as usual.

I will say I've had more issues with AMD GPU BIOS and drivers (especially the RX 5600 XT) than Nvidia cards -- heck, I've got a 5600 XT that seems to refuse to work properly on some PCs for whatever reason. But I've got other 5600 XT cards that work just fine in the same system. It's more a problem with the specific card, which was originally a 12 Gbps, lower-clocked card that got a VBIOS update.
You can look at revenues at AMD, and also look at what Nvidia and Intel are spending on R&D - it's more appropriate to compare with Nvidia, due to their limited product line. The thing about R&D spending, or the lack thereof, is that the results are revealed over the longer term. Nvidia is clearly outspending AMD, and maybe even Intel, on R&D specifically for GPUs - and Ampere shows that investment.

The comment about garbage software was in response to the OP trying to imply that Nvidia software is garbage - maybe it is, maybe it isn't. My experience with Nvidia software is in the context of their behemoth graphics driver / GeForce Experience package - and not their CUDA-based tools.

I was pleasantly surprised at your most recent article - a little surprised you could not source a DG1 - as it is the 96EU graphics from the Tiger Lake G7 line. I know they are not available to the public - but Tiger Lake laptops are coming - the Acer Swift (likely the machine Shrout used for his demo) should be shipping soon.

I have the Radeon VII and a Gigabyte 5700XT - and had a few small issues with the drivers, but also didn't go as in-depth as a review would require. My takeaway from those two cards was that they were not going to dislodge Nvidia from my machines - and certainly not my game machines - but in the case of the 5700XT (which, like most of my unused tech, ends up with my bro-in-law) it is a good choice for the mid/high range - the price is attractive and performance is good. It is significantly more powerful than his nonexistent discrete GPU.
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
Salty and behind the times much?

The 4900HS blows away Intel laptops on just about any standard test that is computationally expensive. The videos from professional reviewers are all over the web. They just offer more cores and consume less power in the process. The cost of offering more cores is that there is less silicon/power available for graphics. AMD's next-gen APU for laptops will address this shortcoming and switch over to Navi, which will blow away anything Intel has for some time to come (until Intel's fabs can get their head out of their tail and start making progress on smaller nodes, <10nm... which doesn't appear to be happening any time soon).

NVIDIA will continue to be a problem for AMD. I will admit that. But there's a "synergy" or "halo effect" growing around AMD's high-end products, and that's carrying over to their GPU division. The 5700 XT still offers the best bang for the buck in its category. The 2070 is slower. The 2070 Super is $100 more for <5% difference.

The super high end (2080 Tis) is <3% of the AIB market (let alone all chips, including APUs/iGPUs). That said, having a high-end topper is critical to a brand's image. NVIDIA does very well here. However, AMD is dumping a tremendous amount of money back into R&D (including RTG) and still turning an excellent profit. Their 3000 series is selling better than they hoped. Their debt is quickly dwindling.

This is about graphics, and I'm still not convinced that an 8-core laptop APU is viable - 6 cores plus better graphics would probably have been more viable.

"(Until Intel's fabs can get their head out of their tail and start making progress on smaller nodes. <10nm...which doesn't appear to happening any time soon.)"

Salty and behind the times much? You can bet that there are more than a few parts in Intel's labs manufactured at 7nm (TSMC 4/5nm equivalent). Pretty sure there are no AMD parts smaller than a 10nm-class product (TSMC 7nm).

The 10nm issues at Intel provided a golden opening for AMD - which was unable to significantly capitalize on those issues. So regardless of process/lithography (which honestly is an interesting thing to ponder, but makes up zero percent of decisions to buy one product over another - it may matter to people like us, but we are basically a statistically insignificant market), Intel has still kept its lead - that's not conjecture or speculation, it's demonstrable fact in the quarterly reports. Not fanboy wishful thinking or unicorns and rainbows - quarterly reports are facts.

Zen 3 may set the world on fire - and Big Navi may absolutely overshadow Ampere. But history shows that ALL of the Zen product launches were supposed to be the final nail in Intel's coffin - people took the already aggressive AMD marketing message, filtered it through the fever dreams of the most adamant AMD fanboys, and decided that Intel would be obliterated... I am sure AMD wishes the community of AMD supporters would tone it down - you can never live up to fever-dream specs.

Not that different from Camaro vs Mustang or Samsung vs LG or any of the other rivalries - we all have our preferred team - but when that Mustang beats your Camaro in 4 out of 5 races, you can talk about "the sun was in my eyes, the rules weren't clear, I thought the end was the 4th telephone pole, not the 5th." It does not change the fact that in those 5 races, the Mustang won 4.

"NVIDIA will continue to be a problem for AMD. I will admit that. But there's a "synergy" o r "Halo effect" growing around AMD's high end products and that's carrying over to their GPU division."

Not sure about a Halo effect but not important.

" However AMD is dumping a tremendous amount of money back into R&D (including RTG) and still turning an excellent profit."

Yeah, no - on both dumping a "tremendous" amount of money into R&D and on turning an excellent profit. Don't confuse year-over-year or quarter-over-quarter increases with excellent profit. The quarterly reports are pretty easy to read - and nowhere are there enough revenues to allow dumping a tremendous amount of money into R&D or turning an excellent profit. AMD has made great strides financially from the previous architecture to Ryzen 1 - retiring the long-standing debt is a positive step.

In Q1 2020 (Jan-March) AMD had a net income (profit) of $162M. In the same period Nvidia was on track for $2.8B in R&D spending for 2020, so assume ~$700M on R&D for the first quarter. Intel is on track to spend ~$13.5B on R&D in 2020 - so ~$3.375B in Q1. So even if AMD put that entire $162M into R&D, Nvidia is putting more than 4x AMD's total quarterly profit into R&D in the same period (to be fair, Intel is not a fair comparison, since it plays in so many more segments than CPU and GPU) - and Q1 2020 was AMD's 2nd strongest quarter since the Ryzen 1 launch; Q4 2019 was $170M on $2.11B in revenues. And AMD is splitting that meager R&D budget between CPU and GPU, while Nvidia is almost completely laser-focused on GPU.
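To make the scale gap concrete, here is the arithmetic behind those figures (annual R&D divided evenly across quarters, which is a simplification):

```python
# Quarterly scale comparison using the figures cited in the post above
# (annual R&D split evenly across four quarters -- a simplification).
amd_q1_profit = 162e6      # AMD Q1 2020 net income
nvidia_rd_2020 = 2.8e9     # Nvidia 2020 R&D run rate
intel_rd_2020 = 13.5e9     # Intel 2020 R&D run rate

nvidia_q_rd = nvidia_rd_2020 / 4   # ~$700M per quarter
intel_q_rd = intel_rd_2020 / 4     # ~$3.38B per quarter

ratio = nvidia_q_rd / amd_q1_profit
print(f"Nvidia quarterly R&D = {ratio:.1f}x AMD's quarterly profit")
# -> ~4.3x, i.e. "more than 4x" as stated above
```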

This is more about the scale of their competitors. All these numbers can be found with a Google search.

Dr. Su has done an excellent job of executing the basics (which her predecessor did not do) and has greatly improved AMD's future - but they are nowhere near where she expected them to be at this point, after launching several generations... This is a tough space - it was always going to be an uphill struggle, with Intel on one end and Nvidia on the other... It doesn't matter what the company is - going into a mature space with one or more large entrenched competitors was never going to be easy.
 
You can look at revenues at AMD, and also look at what Nvidia and Intel are spending on R&D - it's more appropriate to compare with Nvidia, due to their limited product line. The thing about R&D spending, or the lack thereof, is that the results are revealed over the longer term. Nvidia is clearly outspending AMD, and maybe even Intel, on R&D specifically for GPUs - and Ampere shows that investment.

The comment about garbage software was in response to the OP trying to imply that Nvidia software is garbage - maybe it is, maybe it isn't. My experience with Nvidia software is in the context of their behemoth graphics driver / GeForce Experience package - and not their CUDA-based tools.

I was pleasantly surprised at your most recent article - a little surprised you could not source a DG1 - as it is the 96EU graphics from the Tiger Lake G7 line. I know they are not available to the public - but Tiger Lake laptops are coming - the Acer Swift (likely the machine Shrout used for his demo) should be shipping soon.

I have the Radeon VII and a Gigabyte 5700XT - and had a few small issues with the drivers, but also didn't go as in-depth as a review would require. My takeaway from those two cards was that they were not going to dislodge Nvidia from my machines - and certainly not my game machines - but in the case of the 5700XT (which, like most of my unused tech, ends up with my bro-in-law) it is a good choice for the mid/high range - the price is attractive and performance is good. It is significantly more powerful than his nonexistent discrete GPU.
I'm sure Intel will source us some Xe Graphics stuff for testing once they launch. Everything before that they'll put under NDA, and even if we got one, we couldn't report on it without getting into legal trouble. We'd have to get drivers as well as hardware, which is another hurdle. But Ryan Shrout's demo of BFV on TGL was quite surprising to me compared to ICL doing the same test. It definitely gives me hope that the dedicated Xe cards won't completely suck -- but pricing on those will be key, and Intel is clearly more interested in the compute space for Xe than in consumer GPUs.