News: Intel Shares Potential Fix for High Idle Power Consumption on Arc GPUs

The reason I stick with Nvidia is that by buying an Intel or AMD GPU you're basically a beta tester; they simply lack the features Nvidia's software has.

Yesterday I was talking to someone who had screen tearing in their game, and I said I just set the game to the desired framerate through Nvidia's control panel. He uses an AMD GPU, and his reaction was one of disbelief at the fact that I could lock the framerate per application, even my browser's FPS.
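For context on what a cap like that is actually doing: conceptually, a framerate limiter just paces frame presentation to a fixed time budget. A rough sketch of the idea in Python (the render_frame callback and the sleep-based pacing are illustrative only, not how Nvidia's driver implements it):

```python
import time

def frame_limited_loop(render_frame, fps_cap=60):
    """Call render_frame repeatedly, never faster than fps_cap frames per second."""
    frame_budget = 1.0 / fps_cap  # seconds allotted to each frame
    while True:
        start = time.perf_counter()
        render_frame()  # draw one frame
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Burn off the leftover budget so the loop never exceeds the cap.
            time.sleep(frame_budget - elapsed)
```

A driver-level cap does this below the application, which is why it works on anything that presents frames, browser included.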

He is now considering an Nvidia GPU.

Competing with Nvidia will require far better software and drivers than the garbage AMD and Intel are putting out.

Neither AMD nor Intel even has native DX9 support; they're not even beta products, they're just missing core features. It's an afterthought for AMD and Intel. With Nvidia, the GPU is their core business, and it shows.

People said Intel selling dedicated GPUs would be good for the consumer. No, it isn't; it has been nothing but issues, and Intel is releasing half-assed drivers just like AMD.

I consider the only real competition to Nvidia to be ARM with PowerVR, Qualcomm, and Mali in general (ST Micro / Samsung, etc.).
 
I wish I could have this problem and whine about it online, but I can't find the darn Arc cards anywhere in the UK xD!

EBuyer, which seems to be the only retailer listing them, still has them as "coming soon", and there's nowhere else they're being sold here. And they're not even cheap at £400.

What the hell, Intel? Was MLiD right on the money?

Regards.
 
My ASRock A380 goes from a bit under 17 W to a bit over 14 W, though with notably more fluctuation, between 17 W and 12 W.
Not a ton of savings, but it is doing something. It does seem somewhat dependent on resolution; maybe only some of the silicon, like the 3D part, is getting the savings.
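If anyone wants to quantify that fluctuation rather than eyeballing a monitoring tool, polling the power sensor and summarizing the samples works. A quick sketch, assuming a Linux hwmon node that reports power in microwatts; the hwmon4 path is a guess and varies per machine, and some GPU drivers don't expose a power sensor at all:

```python
import statistics
import time

# Hypothetical path: the hwmon index differs per machine, and not every
# GPU driver exposes a power1_input sensor.
POWER_FILE = "/sys/class/hwmon/hwmon4/power1_input"  # hwmon reports microwatts

def sample_idle_power(seconds=60, interval=0.5):
    """Poll the sensor for a while and summarize the idle draw."""
    samples = []
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        with open(POWER_FILE) as f:
            samples.append(int(f.read()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval)
    print(f"min {min(samples):.1f} W, max {max(samples):.1f} W, "
          f"mean {statistics.mean(samples):.1f} W")

if __name__ == "__main__":
    sample_idle_power()
```

Run it at idle before and after applying the fix and the min/max spread shows exactly the kind of 12-17 W swing described above.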
It is nice to see the drop on the A750. I have a reference one on the way, and it's going into my living-room ITX build, where quiet matters.

And Newegg is usually out of them. The A380 seemed like a normal purchase before it went on backorder, but the others were only available for a bit at launch; then I saw A750s in stock when I checked yesterday, and they're back out of stock right now. Better availability than the other brands at launch, but I imagine demand for these is probably lower.
 
The reason I stick with Nvidia is that by buying an Intel or AMD GPU you're basically a beta tester...

You are spewing a bunch of nonsense. The AMD drivers suck mantra is quite tired.
 
The AMD drivers suck mantra is quite tired.
Tired but still true; my 5700 XT still has graphical glitches from time to time (Windows / Lightroom / HandBrake workloads). Nvidia is still king of drivers, as long as you stay within n-2 generations of cards; my GT 1030 started to get buggy on newer driver releases...