Intel Arc B570 review featuring the ASRock Challenger OC: A decent budget option with a few deep cuts

Yes, it does, though it is disabled by default. TBH I have not fiddled with the setting because I was not sure what it did (presumably nothing for my aging RX480).
KitGuru shows how to improve idle power draw on Arc; they achieved decent idle power on the B580 (15W), but not as good on the B570 (22W). They're really the only reviewer to cover this from what I have seen.
Steps:
  1. Enable ASPM in the BIOS.
  2. Tweak Windows power settings: change PCI Express → Link State Power Management → Maximum power savings (see the sketch after this list).
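For anyone who would rather script step 2 than click through Control Panel, something like this should work on Windows. It's a minimal sketch, assuming powercfg's documented scheme_current / sub_pciexpress / aspm aliases, and it needs an elevated prompt:

```python
# Sketch: set PCI Express Link State Power Management via Windows' powercfg.
# Run from an elevated (administrator) prompt.
import subprocess

def set_pcie_link_power_savings(value: int = 2) -> None:
    """Set Link State Power Management for the active power plan.

    value: 0 = Off, 1 = Moderate power savings, 2 = Maximum power savings.
    """
    for flag in ("/setacvalueindex", "/setdcvalueindex"):  # plugged in + battery
        subprocess.run(
            ["powercfg", flag, "scheme_current",
             "sub_pciexpress", "aspm", str(value)],
            check=True,
        )
    # Re-apply the current scheme so the change takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

if __name__ == "__main__":
    set_pcie_link_power_savings(2)  # "Maximum power savings"
```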

Without this, both Battlemage cards idle at 35W, which still beats Alchemist, which I believe idles at 37-40W. Idle power use also depends on resolution and the number of displays.
 
Oof. My existing build pulls ~45 watts from the wall while sitting on the desktop. Disappointing.
Yes, still not great, but it is understandably a low priority for Intel discrete GPUs. Just look at all the hullabaloo over driver overhead... There are more pressing issues for them in making the card marketable and desirable than idle power consumption. But given how well Intel's idle works on iGPUs, I believe they can eventually get it to pull mere watts when it's doing nothing.

My Nvidia GTX 1080 is quite efficient with a single display, around 5W idle, and it's now ancient (coming up on 9 years old!). Running in my SFF with a vanilla i9-13900, I am sub-30W from the wall (SFF build 13900). In these winter months, though, I wouldn't mind a little more.
 
The RX480 pulls 7-8 watts on the core doing nothing; I have no idea about the VRAM, but probably not much.

Ryzen's dirty little secret is the SoC idle power draw, which accounts for almost half of my system's idle power usage. It only gets worse with XMP, too.

The thought of 70+ watts being pulled with only a Word document open...well...hmm.

Oh, well.
 
After reading a lot of reviews, some showcasing XeSS 2 enabled, I am a little torn on what may be missing from the playability side of the review. A lot of the sub-40 FPS results in this review, for example, give the impression that the card cannot handle some AAA games well. Reviews looking at pure rasterization focus on comparing competing products, but it would be nice to be left with an impression of whether the card can run AAA games at high-to-ultra settings comfortably once the upscaling software is enabled. Some video reviews show the B570 cruising through these games at 60-100 FPS. Maybe add a paragraph on the playability of the card with XeSS 2 enabled, so we can understand 1) whether the game becomes more than playable and 2) whether XeSS 2 is mature enough to use. There is no need to comment on the competing products' software, since rasterization tells us which cards are more powerful. Looking at the software complement of a specific card, however, further informs whether enabling it improves the experience.
 
If your motherboard has working ASPM support, then the idle power should be good. Check the BIOS.
Edit: if not, then it's still high (though better than Alchemist, still not at RDNA 3 or Lovelace levels).
I haven't taken the time to really dig into idle and low-load power for a while, but I do know that at my current settings (which do NOT have ASPM enabled!), idle power on the Arc B-series is in the 30W range. Turning on ASPM can reduce performance, and since I'm primarily interested in higher performance rather than lower power, I haven't worried about it.

It's something I do want to investigate in the future, though. Including just how much (or how little) ASPM impacts gaming performance. Maybe it doesn't hurt at all and it should be left enabled! But that's for a day when I'm not trying to benchmark one GPU per day, every day, nights and weekends included, rushing for the 5090 and 5080 launch windows....
 
I plan on revisiting Arc in a month or two when things have calmed down, and trying out XeSS / FSR 2/3 in all games that support it. This is one of the many things I just don't have the bandwidth to get done right now.
 
Hey Jarred, just wanted to point out a typo right under the specs graph:
“As noted already, the Arc B5780 takes”

Anyway, I always love your hard work, and this article is no exception. Thank you for thoroughly reviewing all the GPUs as they come out!
 
I have never been able to figure out why my PC has such high idle power consumption.

I have always had high wattage from the wall for my setup, even while idle. My setup includes a 55-inch LG CX OLED, a ViewSonic Elite XG270QG, and a Gigabyte AORUS FO32U2P for monitors; the PC itself is a 5800X3D with an EVGA 3080 FTW3. Completely idle after a clean boot, the setup pulls well over 150 watts from the wall, though that also includes my Logitech Z906, which draws about 25 watts idle. My 5800X3D without PBO idles at around 32 watts, and the 3080 idles at around 36 watts, but only if my LG CX TV is not connected to it. With all three monitors in use, the 3080 idles at 88 watts.
 
Understandable, makes sense. Honestly, I didn't think ASPM had any impact on gaming performance at all, though I could be mistaken.
 
When you're running multi-monitor, GPU idle power goes up a lot. Your CPU also has an I/O die that cannot physically go below a certain power threshold (IIRC it's around 15W).
 
Unsure if this will help you, but I did some video card power usage testing with my setup after getting a 4K/120Hz TV (I have three displays total: 3440x1440/144Hz, 2560x1440/120Hz, and the TV):

primary two on without TV:
38-41W

primary two on with TV set to 4K/60Hz under the TV resolutions in NVCP:
38-41W

primary two on with TV set to 4K/120Hz under the PC resolutions in NVCP (or 100Hz as these are my only two options):
110-120W

only UW monitor on with TV set to 4K/120Hz:
38-41W
What's happening with the spike in power usage is that the VRAM isn't clocking down. I leave my TV disabled in NVCP unless I'm using it. (You can verify the memory clock behavior yourself; see the sketch below.)
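If you want to confirm that diagnosis rather than eyeball the wattage, the memory clock is easy to poll. A minimal sketch, assuming nvidia-smi (which ships with the Nvidia driver) is on your PATH; the query fields below are standard nvidia-smi options:

```python
# Sketch: poll board power and memory clock to see if VRAM downclocks at idle.
import subprocess

def gpu_idle_snapshot():
    """Return (board power in W, current memory clock in MHz) for GPU 0.

    Note: power.draw may read "[N/A]" on some boards/drivers, which
    would make the float conversion fail; this is only a quick check.
    """
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=power.draw,clocks.mem",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    power_w, mem_mhz = (float(v) for v in out.split(","))
    return power_w, mem_mhz

if __name__ == "__main__":
    power, mem = gpu_idle_snapshot()
    print(f"Board power: {power:.1f} W, memory clock: {mem:.0f} MHz")
    # If the memory clock sits at its full rated speed while the desktop
    # is idle (instead of dropping to a few hundred MHz or less), that's
    # the VRAM-not-downclocking behavior described above.
```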
 
That makes sense. Thanks for sharing.