News Intel GPU Head: Our Core Audience Wants One Power Connector

You probably don't want to pull the whole 75W from PCIe since there have been some reports of melty PCIe slots on GPUs that did that.
Got proof? There have been some 75 W cards that took all their power from the slot.

Also, when AMD first launched Polaris, there was a mini-scandal when Tom's found the card was sometimes pulling significantly more than 75 W from the slot. AMD later issued a firmware fix which addressed that.
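To put rough numbers on it: the standard PCIe limits are 75 W from the x16 slot, 75 W per 6-pin, and 150 W per 8-pin. A minimal sketch of the resulting budget for a single-connector card (the slot_share figure below is purely illustrative):

# Standard PCIe power limits
SLOT_W = 75         # x16 slot
SIX_PIN_W = 75      # 6-pin auxiliary connector
EIGHT_PIN_W = 150   # 8-pin auxiliary connector

def board_power_budget(six_pins=0, eight_pins=0, slot_share=1.0):
    # slot_share = how much of the 75 W slot allowance the card is designed to draw
    return SLOT_W * slot_share + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A "one power connector" card with a single 8-pin, leaning lightly on the slot:
print(board_power_budget(eight_pins=1, slot_share=0.6))  # 195.0 W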
 
It seems to me that the last few GPU generations are all about more cores, more cache, more transistors, more clock speed. The consequence to that is more power.
Not necessarily. If the process node improves as well, that gives you budget for some of those things. What's happening is that node shrinks aren't delivering the same degree of efficiency improvement as they used to (e.g. due to things like leakage). And because the GPU market has lately become more competitive and higher-stakes, graphics cards are pushing beyond the budget afforded by the node improvement and stretching the power envelope.

Are we going to come to a place where performance stagnates because it's not reasonable to increase power anymore?
It depends a lot on how manufacturing continues to progress. Perhaps the more pressing concern about silicon fabrication is that new nodes are getting more expensive. That's bad for power consumption, because it incentivizes making smaller dies and clocking them higher (i.e. even further past their peak-efficiency point).
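As a toy illustration of that tradeoff: dynamic power scales roughly with C * V^2 * f, and higher clocks usually need higher voltage, so power grows much faster than performance. The operating points below are made up:

# Made-up operating points for a hypothetical GPU die
baseline    = {"freq_ghz": 2.0, "volts": 0.90}
overclocked = {"freq_ghz": 2.6, "volts": 1.10}  # pushed past the peak-efficiency point

def relative_power(point, ref):
    # dynamic power ~ V^2 * f (capacitance cancels when comparing the same die)
    return (point["volts"] / ref["volts"]) ** 2 * (point["freq_ghz"] / ref["freq_ghz"])

perf_gain  = overclocked["freq_ghz"] / baseline["freq_ghz"]   # 1.30x
power_gain = relative_power(overclocked, baseline)            # ~1.94x
print(f"{perf_gain:.2f}x performance for {power_gain:.2f}x power")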
 
For me, "mid-range" means that at least 50% of the potential market can afford it. Based on the Steam survey, only ~50% of people are willing to pay more than $250 for a dGPU, which would peg $350 as already one or two rungs beyond what the mid-range should be.
I'm not sure I agree with the 50% threshold. I'd probably bracket midrange at like 65th percentile to 40th, or so. Even though it's going to be at the upper end of that range, a figure of $350 probably sounds somewhat reasonable because we're still in a mindset of crypto/pandemic-era pricing.

Going back a couple generations, I think of midrange as GTX 1060 to GTX 1070, where the latter was indeed in the ballpark of $350.
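To make the percentile framing concrete, here's a minimal sketch of the calculation with made-up prices (not actual Steam survey data):

import statistics

# Hypothetical dGPU prices paid by a sample of gamers (made-up numbers)
prices = [120, 150, 180, 200, 220, 250, 250, 280, 300, 330,
          350, 400, 450, 500, 600, 700, 800, 1000, 1200, 1600]

cuts = statistics.quantiles(prices, n=100)   # 99 percentile cut points
low, high = cuts[39], cuts[64]               # 40th and 65th percentiles
print(f"midrange bracket: ${low:.0f} to ${high:.0f}")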
 
Even though it's going to be at the upper end of that range, a figure of $350 probably sounds somewhat reasonable because we're still in a mindset of crypto/pandemic-era pricing.
I'm using the Steam survey market share as the basis of my "mid-range" assessment because it relies on what people are actually using, instead of whatever "mindset" the market is attempting to put its target audience / involuntary wallet-siphon victims into at any given time.
 
Core audience wants Intel to disrupt AMD and Nvidia.

So far these dedicated cards are just like their integrated ones; 10 years behind. And the drivers? Embarrassing.

Less talk, more do.
I didn't realize the 3060Ti came out 10 years ago. My how time flies.

Yup my Arc A770 16GB that's 10 years behind, just like their integrated ones. It's happily purring along at a solid 144FPS+ in ray traced ultra nightmare in Doom Eternal at 1080P. I'm really feeling down about this $350 purchase right now.

May as well laugh now while their drivers are still improving, because the 500 lb gorilla has a full-stack solution working its way into iGPUs, dGPUs, NUCs, laptops, and datacenter GPUs. The free ride is over.
 
For me, "mid-range" means that at least 50% of the potential market can afford it. Based on the Steam survey, only ~50% of people are willing to pay more than $250 for a dGPU, which would peg $350 as already one or two rungs beyond what the mid-range should be.
NVIDIA has forgotten the magic price point that made the GTX 1060 the MVP legend of midrange GPU launches.
 
Yup my Arc A770 16GB that's 10 years behind, just like their integrated ones. It's happily purring along at a solid 144FPS+ in ray traced ultra nightmare in Doom Eternal at 1080P. I'm really feeling down about this $350 purchase right now.

May as well laugh now while their drivers are still improving, because the 500 lb gorilla has a full-stack solution working its way into iGPUs, dGPUs, NUCs, laptops, and datacenter GPUs. The free ride is over.

Go reread the next few comments I made.

And Doom Eternal could run on a literal potato as the CPU and lemons as battery sources. It's an incredibly well-optimized game. Other devs should take notes.
 
Go reread the next few comments I made.

I didn't read the other posts but wouldn't be surprised if you walked it back. I was too busy laughing at the first and couldn't let it pass.

And Doom Eternal could run on a literal potato as the CPU and lemons as battery sources. It's an incredibly well-optimized game. Other devs should take notes.

I don't think you get 144FPS+ solid with RT/UN in Doom Eternal on a potato.

The point is the power of Intel's Xe cores. It's definitely a laugh now, cry later situation for Nvidia fanboys, since their company is now going to be competing with full-stack solutions from Intel and AMD in all markets. They may still be around in 10 years. They've cornered the $1,600 gaming GPU market, after all.
 
I didn't read the other posts but wouldn't be surprised if you walked it back. I was too busy laughing at the first and couldn't let it pass.



I don't think you get 144FPS+ solid with RT/UN in Doom Eternal on a potato.

The point is the power of Intel's Xe cores. It's definitely a laugh now, cry later situation for Nvidia fanboys, since their company is now going to be competing with full-stack solutions from Intel and AMD in all markets. They may still be around in 10 years. They've cornered the $1,600 gaming GPU market, after all.

So negative. I said I exaggerated. The 1650 was 3 years old, so I should have joked 5, and it's obviously higher end compared to the 3060.

So I'll post it again, since you can't be bothered to read before making condescending comments: I said Intel is a generation behind and still doesn't have its drivers sorted out.

So are the cards good value? Once the drivers are fixed, sure. Is it enough to upset the duopoly between Nvidia and AMD? No.

And I have AMD and Nvidia. I don't believe in fanboyism. I don't get paid to blindly defend any company. I buy what works best. The 4090 is incredibly expensive but wrecks everything, including the 7 series based on AMD's own event, and will even more so once DLSS 3 comes to more games.

As for my AMD card, it's perfect for my laptop, doesn't get hot like Nvidia and provides perfect on the go or couch performance.
 
I said Intel is a generation behind

They're not, though. Once the 4060 arrives, I'd agree. The RT and AI upscaling are there, and both are well regarded as essentially equal to Nvidia's. Late, but not behind.
The 4060 should perform like a 3070 did. Arc definitely competes in the 3060-3070 range today, though, depending on the title. Nvidia has such an oversupply of GPUs that they have to keep leaving an opening for their competitors; the rumor is June '23 for the 4060.

So are the cards good value? Once the drivers are fixed, sure.

If you actually shop what's available at places like Newegg, the market is still pretty tough right now. Considering prices and availability, the only cards I'd buy today for myself are the 6700XT and the A770. The Intel cards do get a free point for their novelty and unique features (16GB, strong RT, good upscaling, AV1 encoding, Intel Deep Link, all at $350). And there's only a single SKU of the 6700XT that I'd buy right now (a sub-$400 model). Speaking as someone who has a 3080 Ti and an A770.

Is it enough to upset the duopoly between Nvidia and AMD? No.

The duopoly is already broken. The cards work today and are selling. Not sure what "breaking the duopoly" requires other than a valid 3rd option. Especially in this market.

I referred to Doom Eternal because I was in a game while checking Toms and reading these comments. This card is getting used, and contrary to reviews, it's actually impressive. If you just use it like a normal person, it just works. And this is their 1st generation? I've had no problems with it. I was going to buy a 4070 but I'm not sure I'm even going to bother now. I've been using the A770 daily for over a month now, and can definitely use it every single day going forward till Battlemage. It's just as reliable as my 3080 Ti. Probably more reliable so far, as Nvidia is always pushing the edge and breaks things in drivers often.

My buddy who uses a 6900XT also picked up an A770 on a whim and has had the same experience; he says he could use it as his daily GPU as well. Sample size of two, but the majority of Newegg reviews are 5-star and positive, so reality is hitting these "pro" reviewers pretty hard right now.

I couldn't disagree more with the "reddit" narrative that you shared here on Arc being such a sad case. Popular sentiment? Yeah. But I actually have a card getting punished daily, and that reddit narrative doesn't line up with my experience.
 
I couldn't disagree more with the "reddit" narrative that you shared here on Arc being such a sad case. Popular sentiment? Yeah. But I actually have a card getting punished daily, and that reddit narrative doesn't line up with my experience.

Unlike redditors, I know what an opinion is, and if it holds up in your lifestyle, I'm happy for you and glad to hear it. I'm not here to change your mind, nor do I care to. You enjoy it; that's great.

I can only go first-hand based on what you mentioned regarding the 6900. I had a 3080 and a 6950 and it didn't meet my needs, so I'm only speaking comparatively, not from the BS bandwagon.