News Intel GPU Head: Our Core Audience Wants One Power Connector

bit_user

Polypheme
Ambassador
while a plan to build a high-end graphics board that would consume around 200W may sound like an unachievable dream given today's standards for gaming-grade graphics cards
He didn't say anything about what performance tier he was targeting. The way I interpret his statements is that he's preparing the market for another round of low/mid-range products.

So far, Intel is lagging on GPU perf/W. They're not going to magically leapfrog the efficiency of AMD and Nvidia in a single generation (if ever).

"And [we're working on] landing more partners in India who can ship good volumes here at good price points."
That's interesting. I wonder if it's a response to any pushback they've gotten from Taiwanese board partners. Or, maybe they just want to diversify their supply chain.

So, what are some Indian GPU card makers?
 

SyCoREAPER

Honorable
Jan 11, 2018
Core audience wants Intel to disrupt AMD and Nvidia.

So far these dedicated cards are just like their integrated ones; 10 years behind. And the drivers? Embarrassing.

Less talk, more do.
 

TechieTwo

Notable
Oct 12, 2022
Obviously, if Intel's GPU cards only use 225 W, there is no need for anything more than a single 8-pin socket. Beyond that power draw they will need a second socket or the single screwball ATX 3.0 cluster socket.
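For context, those numbers line up with the usual PCIe power limits, taking the standard spec values of 75 W from the x16 slot and 150 W from each 8-pin connector:

75 W (slot) + 150 W (one 8-pin) = 225 W
75 W (slot) + 2 × 150 W (two 8-pin) = 375 W

So a single 8-pin covers exactly the 200-225 W range Koduri mentions, and anything past that needs either a second 8-pin or the 12VHPWR (ATX 3.0) connector, which is rated for up to 600 W.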
 

Co BIY

Splendid
Core audience wants Intel to disrupt AMD and Nvidia.

So far these dedicated cards are just like their integrated ones; 10 years behind. And the drivers? Embarrassing.

Less talk, more do.

Arc cards don't appear to be 10 years behind. They're close to parity in the low-to-mid market.

I suspect that Intel is focused on OEM partners who want to sell basic gaming boxes at a basic price. No drama and low RMA rates. The whole system costing less than a 4070 and available off the shelf as an impulse purchase at a big-box store.

Core i5 (OEM cooler) plus Arc A770 on an H-chipset with an easy-to-reach power target for a low-end PSU. That's a real sweet spot for an integrated-system provider.
 

Giroro

Splendid
The GTX 1080 Ti was a 250 W card, so it's not like Intel is targeting some particularly low power envelope.

As Nvidia struggles with "Bad Grandpa" Jensen's overwhelming greed and diminishing sanity, Intel is smart enough to understand that it's hard to sell a 4-slot computer part that's physically incapable of fitting inside most computers. Forget about even trying to find a way to power that toaster.

Bonus points if Intel makes a card that can run in an HP Piece of Crap™ without the whole system catching fire. Although, HP and Dell have gotten so downmarket and proprietary that I'd be surprised if they even have a real PCIe slot or a single spare power connector.
 

bit_user

Polypheme
Ambassador
The GTX 1080 Ti was a 250 W card, so it's not like Intel is targeting some particularly low power envelope.
Depending on how far back you want to go, we can find top-end cards that used far less. So, that doesn't really count for a whole lot.

Bonus points if Intel makes a card that can run in an HP Piece of Crap™ without the whole system catching fire. Although, HP and Dell have gotten so downmarket and proprietary that I'd be surprised if they even have a real PCIe slot or a single spare power connector.
If you bought a Dell or HP that didn't already have a PCIe power cable, then the PSU is probably too small and has no extra connectors for one. I think it's common for people with Dells to upgrade the PSU to one from an Alienware when they want to install a beefier graphics card.
 

SethNW

Reputable
Jul 27, 2019
Of course, if you can't compete on performance, then you definitely want to find some other excuse to sell cards. So yeah, it might take Intel a few generations to properly compete. And considering that power is their biggest thing, I don't even think their midrange will be that good.
 

ManDaddio

Reputable
Oct 23, 2019
The GTX 1080 Ti was a 250 W card, so it's not like Intel is targeting some particularly low power envelope.

As Nvidia struggles with "Bad Grandpa" Jensen's overwhelming greed and diminishing sanity, Intel is smart enough to understand that it's hard to sell a 4-slot computer part that's physically incapable of fitting inside most computers. Forget about even trying to find a way to power that toaster.

Bonus points if Intel makes a card that can run in an HP Piece of Crap™ without the whole system catching fire. Although, HP and Dell have gotten so downmarket and proprietary that I'd be surprised if they even have a real PCIe slot or a single spare power connector.
I didn't know that you worked at Nvidia and know all the ins and outs of their business.
Name-calling and blanket accusations really don't show much intelligence.
 

ManDaddio

Reputable
Oct 23, 2019
If Intel wants to stay around 250 watts, then they will never get to the point where they're going to achieve 4090 performance.
Unless there are some major architectural changes across the whole GPU industry, that is. Simply shrinking everything isn't going to keep doubling performance, I don't think.
AMD is onto something, but nothing is proven yet.
I think Nvidia is in a tough spot right now because they have to make some major architectural changes. Cost is becoming a factor for them. And the cost isn't just simply producing the chip. Research and development costs a lot of man-hours.
I don't think they're charging high prices because they want to. The RTX 4090 proves that. It's just that the 4090 gives them much more headroom when it comes to return on their investment.
 

bit_user

Polypheme
Ambassador
I think Nvidia is in a tough spot right now because they have to make some major architectural changes. Cost is becoming a factor for them. And the cost isn't just simply producing the chip. Research and development costs a lot of man-hours.
I think Nvidia has been using gaming product revenue to help underwrite R&D for new markets, like their push into self-driving SoCs. Also, that huge new HQ building in California can't be cheap.

I don't think they're charging high prices because they want to. The RTX 4090 proves that.
How does it prove that?
 

jp7189

Distinguished
Feb 21, 2012
It seems to me that the last few GPU generations have been all about more cores, more cache, more transistors, more clock speed. The consequence of that is more power. Sure, there are still some architectural efficiencies to implement, and new manufacturing nodes can count for a lot, but more and more I get the feeling that performance gains are going to come with commensurate power increases.

Are we going to come to a place where performance stagnates because it's not reasonable to increase power anymore?
 

rluker5

Distinguished
Jun 23, 2014
Consoles have one power connector.

But to get back to the point, let's look at the quote in question: "My priority at this point is getting that core audience, with one power connector," Intel Graphics Head Raja Koduri said in an interview with Gadgets360, an Indian tech site. "And that can get you up to 200W – 225W. If you nail that, and something a little above and a little below, all that falls into the sweet spot."

So mainstream resolutions and settings. Not halo flagship cards. How many people game at higher than 4K60 high anyway? Anybody here think 8K gaming is in any way worth it? How much would you spend to go from 4K60 to 4K120?

I don't think Intel is in a position where it would be remotely profitable to challenge Nvidia's flagship. But they can bring good mainstream performance at reasonable power draw and cost to the market (and prebuild makers) and likely make a profit.

And that last driver reduced a lot of driver overhead across a lot of games. I pulled out my 3080 to check my A750, and it isn't going back for a while. It would be nice if somebody did a ton of work and retested Arc as some kind of follow-up, maybe at 6 months or a year, to see how well it has aged. Too much work for me, though.
 

Elusive Ruse

Commendable
Nov 17, 2022
Not sure why people are so disappointed by this piece of news; Intel was never going to compete in the immediate future. Their GPU chief sounds like he wants to keep his feet on the ground and make sure the cards can compete at the mid and low tiers. It's a solid mindset: the mid-range-priced cards ($300-$500) are going to sell best, especially considering the biggest player has decided to vacate that market. If Intel manages to get things right, they will be on their way to grabbing a good chunk of their target market next generation or the one after.
 

kjfatl

Reputable
Apr 15, 2020
I see a lot of people who seem to be upset that Intel is aiming at the bottom 80% of the discrete GPU market, sucking away 50% of the profit and R&D dollars from Nvidia and AMD in this area. I'm one of those people who doesn't want a PC that pulls as much power as a space heater. Two or three years from now, when Intel is running on its 18A process and pushing 180 frames/sec on a 4K monitor, the complaint will be about trying to run 240 frames/sec on dual 8K monitors. There is a place for $3,000 video cards and PCs with 1500W power supplies, but it is a shrinking market on the consumer side.
 
Apr 1, 2020
No, Raja Koduri, home users and gamers don't care about power as long as it's under 350 W. We've been using dual 8-pin GPUs for over a decade, even on cards like the HD 2900 XT, which was a massive heater, and in the last few years it's gotten even better, since many of the Chinese firecracker PSUs have been replaced by decent-quality ~700 W PSUs at low price points.

What they actually care about is price to performance. An entry-level card like the A770 at $350 is just as insane as a high-end RTX 3080 at $1,200 and an upper-midrange RTX 4070 Ti at $900.
 

rluker5

Distinguished
Jun 23, 2014
No, Raja Koduri, home users and gamers don't care about power as long as it's under 350 W. We've been using dual 8-pin GPUs for over a decade, even on cards like the HD 2900 XT, which was a massive heater, and in the last few years it's gotten even better, since many of the Chinese firecracker PSUs have been replaced by decent-quality ~700 W PSUs at low price points.

What they actually care about is price to performance. An entry-level card like the A770 at $350 is just as insane as a high-end RTX 3080 at $1,200 and an upper-midrange RTX 4070 Ti at $900.
You are in a very small group of people who place a very high priority on performance in a competitive sense. I undervolt/underclock my 3080 for most games. And I used to run SLI 780 Tis, then 1080 Tis, and even ran a dedicated power line from my main breaker to my PC power outlet during a remodel.
It is no longer worth it to me to put up with excessive noise or the hassles of watercooling to go from 4K60 high to 4K60 ultra RT.

Most are happy with 1440p60, high. An A380 is an entry-level card, as is a 6400 or 1650. The A770 is midrange, like the 3060 Ti or 6700 XT, and the market has decided that $350 is quite sane. There are many more people in the world than you. Some have different ways of measuring how much performance they want to pay for and how much quality of life they are willing to sacrifice for it.
 

Giroro

Splendid
If you bought a Dell or HP that didn't already have a PCIe power cable, then the PSU is probably too small and has no extra connectors for one. I think it's common for people with Dells to upgrade the PSU to one from an Alienware when they want to install a beefier graphics card.

That's where you run into the problem with Dell's proprietary power supplies. You're going to pay inflated prices and not really have much choice. You also have to know how to source and install a power supply.

I think we've all been there once. But there are a whole lot of people who are inexperienced and paying the nerd squad to drop in a GPU. I don't know if Best Buy even sells a Dell-compatible PSU.
 

SyCoREAPER

Honorable
Jan 11, 2018
I didn't realize the 3060 Ti came out 10 years ago. My, how time flies.

Dramatic effect obviously went over this one's head.

Arc cards don't appear to be 10 years behind. They're close to parity in the low-to-mid market.

I suspect that Intel is focused on OEM partners who want to sell basic gaming boxes at a basic price. No drama and low RMA rates. The whole system costing less than a 4070 and available off the shelf as an impulse purchase at a big-box store.

Core i5 (OEM cooler) plus Arc A770 on an H-chipset with an easy-to-reach power target for a low-end PSU. That's a real sweet spot for an integrated-system provider.

My mockery might have been excessive at 10 years; maybe I should have joked with 5. Anyway, the cards compete with the 1650 and 3060, which are 3 and 1.5 years old, respectively.

They are basically the Nintendo of the GPU world: a product line that's one to two generations behind.
 

InvalidError

Titan
Moderator
Most are happy with 1440p60, high. An A380 is an entry-level card, as is a 6400 or 1650. The A770 is midrange, like the 3060 Ti or 6700 XT, and the market has decided that $350 is quite sane.
For me, "mid-range" means that at least 50% of the potential market can afford it. Based on the Steam survey, only ~50% of people are willing to pay more than $250 for a dGPU, which would peg $350 as already one or two rungs beyond what the mid-range should be.