Intel Lists Overclockable Core i7-8809G With Vega Graphics

Status
Not open for further replies.

razamatraz

Honorable
Feb 12, 2014
Interesting. 100 W puts this above even today's typical mainstream desktop processors, although that's perfectly reasonable for discrete-GPU-like performance (assuming it actually delivers that).
 

bit_user

Polypheme
Ambassador
This is such a weird product. The only case I can see for bundling them together would be to share power management (which they do). Perhaps the socket/board will be a little simpler, as well, but bundling them will probably just move cost from the motherboard to the CPU/GPU package, and make the whole thing harder to cool than if they were two separate packages.

I'm more puzzled than ever. The only way I can truly make sense of it is if it's just a stunt to show off their EMIB integration platform.
 

bit_user

Polypheme
Ambassador
Does anyone have TDP specs on the GPUs that typically appear in high-end gaming laptops and "mobile workstations"?

The CPU specs seem to be a match for the i7-7920HQ:

https://ark.intel.com/products/97462/Intel-Core-i7-7920HQ-Processor-8M-Cache-up-to-4_10-GHz

If it's configured down to 35 W TDP, then that would leave at least 65 W for GPU + HBM2. Or more, when the CPU isn't cranking so hard. Assuming we're talking about running on AC power, of course.

BTW, if the HD Graphics aren't being used, then you could easily lop about 10 W off the CPU die's TDP. That means limiting it to 35 W wouldn't entail much sacrifice, leaving the 4.1 GHz turbo possibly still within reach.
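Just to make the arithmetic explicit, here's a back-of-envelope sketch in Python. All figures are the thread's assumptions (the 100 W package TDP, the cTDP-down and iGPU estimates), not official specs:

```python
# Rough power-budget split for the rumored 100 W package.
# All numbers are assumptions from this thread, not confirmed specs.
PACKAGE_TDP = 100      # W, total for CPU + GPU + HBM2
CPU_CTDP_DOWN = 35     # W, i7-7920HQ-class CPU configured down
IGP_SAVINGS = 10       # W, rough saving if the HD Graphics sit idle

gpu_budget = PACKAGE_TDP - CPU_CTDP_DOWN
cpu_effective = CPU_CTDP_DOWN - IGP_SAVINGS

print(f"GPU + HBM2 budget: {gpu_budget} W")        # 65 W
print(f"CPU draw with iGPU idle: {cpu_effective} W")  # 25 W
```

So even under that conservative split, the Vega side of the package would have at least 65 W to play with, and more whenever the CPU isn't cranking.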
 

alextheblue

Distinguished

It would allow OEMs to build something more compact than a comparable system with an MXM GPU. I imagine that there's some net TDP savings too. If they're serious about building it, I would bet they're seeing some demand for such a design. Maybe Apple or other OEMs requested a high-power GPU on-package. Either way this is a stopgap until their own nascent high-performance graphics efforts bear fruit.

As for cooling... it's 100 W, but with a fairly large surface area there's room for more than enough heat pipes. They don't all need to end up in the same pile of fins, either; you can divide the heat if you want to. Don't get me wrong, it isn't going to end up in true ultrathins. But it doesn't need to be a DTR, either. Check out the ROG GL502VS:

https://www.asus.com/us/Laptops/ROG-GL502VS/gallery/

That's a 15.6-inch machine, not a massive DTR, and it has a 1070 and a 45 W HQ CPU. The mobile variant of the 1070 has a 115 W TDP by itself; the combined potential heat load with that HQ CPU is ~160 W. Now imagine cutting your cooling requirement down to just the 1070's. The cooling setup they're using for the 1070 would be sufficient by itself, and, as I mentioned before, you could even split it up if your design would benefit. I think it's pretty neat, even if you won't be able to cram it into an ultrathin.
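For comparison, here's the same back-of-envelope math in Python, using the stock TDP figures quoted above (the ~160 W combined load is an upper bound, since real systems throttle and rarely peg both chips at once):

```python
# Combined heat load of the discrete setup vs. the rumored package.
# Stock TDP figures as quoted in the thread.
GTX1070_MOBILE_TDP = 115   # W, mobile GTX 1070
HQ_CPU_TDP = 45            # W, 45 W "HQ" quad-core
PACKAGE_TDP = 100          # W, rumored CPU + Vega package

discrete_load = GTX1070_MOBILE_TDP + HQ_CPU_TDP
cooling_savings = discrete_load - PACKAGE_TDP

print(f"Discrete combined load: {discrete_load} W")   # 160 W
print(f"Cooling saved vs. package: {cooling_savings} W")  # 60 W
```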

What's going to hurt adoption is cost. Not only the cost of the APU itself (hah, it's got AMD graphics onboard, so I can call it an APU!), but I'd imagine it requires a different socket, which means a new board design exclusive to these chips. Correct me if I'm wrong, but I don't think their current mobile sockets provide enough current.

Also, does anyone remember NexGen's PF100 using IBM's MCM technology before it was cool? Heh...
 

bit_user

Polypheme
Ambassador

Once you add enough cooling, I don't see the space savings. I also don't get where the TDP savings would come from, unless you're assuming their power-management protocol can only function within the package.


That's got to be a maximal figure, and not supported by all implementations. I'm skeptical that laptop really dissipates 160 W without more serious fans. I know the power brick is rated for 180 W of output, but a good chunk of that should be for battery charging.

Anyway, I'm not saying the idea of a 100 W laptop is ridiculous. On AC power, it seems reasonable for a mobile workstation or gaming laptop. I'm just saying I can't see a good reason why they should be in the same package. Even if we accept it delivers some small amount of space savings in board area, you're not going to use such a powerful setup with a tiny screen, so probably board area is not a limiting factor.

I'm sticking to the idea that Intel waved a wad of cash in the face of cash-poor AMD (remember, this project probably had its inception ~1 year ago), wanting to use it as a stunt to sell the industry on using EMIB.
 
The only place I see this working is with people who don't care about having a beefy, hot-running laptop. Or in an Intel NUC-style HTPC, a portable gaming computer, or even a console to replace the current Xbox or PlayStation.
 

bloodroses

Distinguished


I'm guessing you've never owned a gaming laptop before. With the last one I owned (Toshiba X205-SLI), the battery would slowly discharge, even on AC power, if I was running it at full load. The power brick on that was 180 W as well. The fans weren't that beefy, either; it just had well-designed heat pipes.
 

alextheblue

Distinguished

Sorry I didn't mean to downvote you. You're right about cooling requirements. I would also note that if a gaming laptop discharges on AC, they should have included a beefier power brick. Normally it would make more sense to start power throttling ever so slightly, and both of those issues are on Toshiba.
 

alextheblue

Distinguished

I must not have been clear. You save space vs. a 160 W combined-TDP system like the one I linked: you can use less cooling (~60 W less) and eliminate the provisions and space for the MXM module. Space savings. On to power management. There are infinite variations of power management; it's not a check-box feature. I'd be surprised to hear they weren't performing more advanced power-budget balancing (like all modern APUs do) than what you get when an OEM slaps in an MXM module.

The brick is rated at 180W so it can run at peak power without discharging the battery. Also even when gaming you're not always 100% pegged on CPU and GPU all the time, so there's probably some room for slow charging while gaming. Regardless, it does most of its charging when you're NOT fully loading the system. That's not unusual for a gaming laptop, really.

Oh, and the stock TDP of that GPU is definitely 115W, just as the stock TDP of the HQ CPU is 45W. Of course they can underclock them. They can down-TDP this 100W APU too, if they want. It still won't come CLOSE to needing the power and cooling of that 160W machine.
 

alextheblue

Distinguished

Vega's efficiency actually increases massively at low TDP. They pushed it too hard on desktop. Look at the small Vega configs in the 15W Ryzen Mobile chips and extrapolate from there. So I don't know how powerful it will really turn out to be in the end, but I hope you're basing your expectations more off their existing mobile Vega rather than the desktop power demons.
 

bit_user

Polypheme
Ambassador

FWIW, I couldn't see any fans on the ASUS he linked, but their gallery doesn't show many angles. Any fans it does have are certainly not beefy.


Fixed.
 

alextheblue

Distinguished
That was why I used that model as an example. It's relatively slim for the kind of horsepower it packs. Many "true" gaming laptops stick to a real DTR form factor, but not always because it's strictly necessary for that hardware. A lot of it comes down to demand: many gaming laptop buyers want behemoths with multiple drive bays and overkill cooling, sometimes even overclocking. But there is some direct appeal for OEMs too. OEMs can easily reuse the same board, chassis, etc. and offer multiple models (or customized options) with a range of processors and GPUs, even desktop CPUs. So for the skinnier models they give up flexibility, but it's more impressive to look at, for sure. That was one I had seen before, and it sprang to mind. I've since found their GX501VI, which is even more impressive: it's got a 1080 and yet is slimmer still.
Thanks.
 