News Intel Arc AV1 Encoder Is Faster Than Nvidia NVENC H.264

shady28

Distinguished
This isn't really surprising when you consider that Arc came from HPC data center GPUs. One of the biggest uses for those data center GPUs is real-time streaming of many different encoding types to end users consuming that content - millions of streams.

It's good to note that its performance holds up there, but that small win for Intel was already pretty well known.

It might eventually be good for Twitch streamers and such, but they have to get the basic 3D performance and driver issues fixed first, especially in DX10/11, where most e-sports titles sit.
 
Comparing AV1 to H.264 is like comparing H.265 to VP8 and/or VP9. It's just apples to oranges in terms of the generation of each technology.

AMD's and Nvidia's HEVC encoders are the proper match for AV1, and we already know, at least on the AMD side, that HEVC is quite good.

While I haven't tested Nvidia's NVENC in a good while, I've been toying around with AMD's (Vega64, RX6800M, RX6900XT) for streaming and recording video. HEVC/H.265 is hands down better than H.264, by a lot.

Regards.
 

InvalidError

Titan
Moderator
cool story bro - so when is arc coming out lol?
It is available now, if you are willing to jump through some hoops to get one shipped from China :)

The drivers being what they are even with the launch 6+ months behind the original plans, I'd recommend waiting another six months for them to mature before considering it. GN's post-review impressions of the software package make it look like Intel has little to no software QA.
 

Soaptrail

Distinguished
It is available now, if you are willing to jump through some hoops to get one shipped from China :)

The drivers being what they are even with the launch 6+ months behind the original plans, I'd recommend waiting another six months for them to mature before considering it. GN's post-review impressions of the software package make it look like Intel has little to no software QA.
I think this is one bright spot for Arc GPUs. But until they stabilise the drivers, I won't bother paying extra for any Arc GPU.

You make it sound like you would buy one. I am sure 95% of Tom's readers will not buy an Intel GPU anytime soon.
 

InvalidError

Titan
Moderator
You make it sound like you would buy one.
I actually might if Intel solves its driver and performance issues. The RX6500 is missing too many features I want, has extremely questionable longer-term viability, and I refuse to pay $200 for that performance level. I'm still using a GTX1050, so the A380 even in its current state would be a somewhat decent upgrade for $140 were it not for its show-stopping bugs.
 

Eximo

Titan
Ambassador
I kind of wanted an A380 to tinker with, but since it NEEDS Resizable BAR, my target 4th-gen system would not be good enough. I've been considering a lightweight i3-12100 build to replace it.
I don't really game on it, but I would test some things out to see how it goes. And it would be mostly DX11 and earlier titles.
 
I actually might if Intel solves its driver and performance issues. The RX6500 is missing too many features I want, has extremely questionable longer-term viability, and I refuse to pay $200 for that performance level. I'm still using a GTX1050, so the A380 even in its current state would be a somewhat decent upgrade for $140 were it not for its show-stopping bugs.
They will never fix the performance for older DX versions, except maybe for a few extremely popular, long-lived titles; it's a bottomless money pit that Intel won't even start to try and fill.
Even Nvidia or AMD, with all of their experience, couldn't redo all of that past work if they had to release a completely new architecture.
Performance in newer titles is pretty decent, so the only thing that will get fixed is the stability and compatibility issues - if it really is that bad, because outside of Gamers Nexus, who else did any video on this? Is it really that bad, or is he just making it look bad for clicks?
I mean, SmoothSync is supposed to mess with the graphics around tearing lines, and in his video he makes it out like it's a super severe bug... it just doesn't work for that game (yet, or maybe forever).
 

InvalidError

Titan
Moderator
I mean, SmoothSync is supposed to mess with the graphics around tearing lines, and in his video he makes it out like it's a super severe bug... it just doesn't work for that game (yet, or maybe forever).
When you compare the GN vs. Intel SmoothSync samples, the GN ones look like garbage is randomly getting mixed in: black and yellow pixels in areas that were blue in both the old and new (and even the next-new) frames, in samples where there are two frame-buffer transitions within one screen refresh cycle.

From watching the Intel SmoothSync samples, it is obvious that it works by transitioning an increasing fraction of pixels from the old frame to the new frame. It is fundamentally triple-buffering with a blended 7-8 line transition from the old to the new front buffer. That shouldn't cause the garbage GN was seeing, and it should work the same regardless of the game being played. I'm guessing Intel's drivers were randomly scrapping the old buffer prematurely for some stupid reason, such as forgetting that it needs to hold onto it for 7-8 lines beyond the buffer flip.
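To put that mechanism in concrete terms, here's a minimal sketch of how such a blended flip could be composed. The 8-line band, the numpy buffers, and the function name are my own illustration of the idea described above, not Intel's actual implementation:

```python
import numpy as np

BAND = 8  # blend band width in scanlines, roughly the 7-8 lines described above

def smooth_sync_scanout(old_frame: np.ndarray, new_frame: np.ndarray,
                        flip_line: int) -> np.ndarray:
    """Compose one scanned-out image from two frame buffers.

    Rows already scanned before the buffer flip come from the old frame, rows
    after it come from the new frame, and the BAND rows at the seam are
    cross-faded so there is no hard tear line.
    """
    height = old_frame.shape[0]
    out = old_frame.astype(np.float32).copy()   # old buffer above the flip
    out[flip_line:] = new_frame[flip_line:]     # new buffer below the flip
    for i in range(min(BAND, height - flip_line)):
        row = flip_line + i
        weight = (i + 1) / (BAND + 1)           # 0 = all old pixels, 1 = all new
        # The old buffer must still be valid here, BAND rows past the flip;
        # freeing it early is exactly the kind of bug that would inject garbage.
        out[row] = (1 - weight) * old_frame[row] + weight * new_frame[row]
    return out.astype(old_frame.dtype)
```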
 

shady28

Distinguished
Let's keep in mind that the SmoothSync pixelation issues GN noted were only on titles running DX10 and DX11. They said DX12 didn't have issues.

This is why I've been saying Intel didn't know their target market.

Great encoding performance, higher quality than NVENC, Quick Sync, and AMD AMF - that's a clean sweep, they beat all competitors.

As the article notes:
"AV1 is an open-source, royalty-free video coding format developed by the Alliance for Open Media, a consortium founded in 2015. AV1 is a free, state-of-the-art codec that anyone can use on the internet. AV1 provides massively better compression performance with up to 50 percent smaller file sizes compared to H.264."

But most Twitch / streaming players I see are playing e-sports.

E-sports games tend to be older titles. Not all of them, but many. Those players are going to be sorely disappointed unless the titles they play use DX12.

Also, the more fundamental issues with driver install, driver patching, and driver uninstall are pretty much unforgivable right now. That absolutely has to be fixed.

I can't imagine the driver install/uninstall issues taking more than a month or two to fix - it's just bad that they are having those issues at all at this point - but I bet the more complex issues with SmoothSync on DX10/11 will linger for some time.
 

Eximo

Titan
Ambassador
I've watched a few different A380 reviews, and they all come to roughly the same conclusion: the drivers aren't there yet, and performance is mostly worse than the 1650 and 6400 across titles, with a few DX12 and Vulkan exceptions. It really comes down to how much they will retail for. The A770 is the more interesting prospect as a mid-range card. If they keep their promise and price it at the claimed ~$400, it should be pretty compelling as a direct competitor to the RTX3060 and 6600XT. (At least for someone who knows they are going to stick to DX12 and Vulkan titles.)

Sad that their first flagship comes in at effectively last-gen mid-range, but it is their first try. And if the rumors are true that there is a flaw in the silicon, maybe they will have a minor refresh when the next batch of GPUs is built.
 

cfbcfb

Reputable
What is this article even about, quality or speed? Seems the author is getting very confused between the two.

Both. The Intel encoder makes videos very fast, and the AV1 codec, despite using a low bitrate (so less data has to be streamed), has excellent picture quality.

So creators can make use of the maligned Intel cards to produce high-quality content faster - content that's easier to stream and looks better to the end user.

The fly in the ointment is the home devices used for streaming (Roku, Google TV, TiVo, etc.) and whether they natively support decoding/displaying the AV1 codec.
 

cfbcfb

Reputable
This is why I've been saying Intel didn't know their target market.

Intel knows their target market. The problem is that reviewers have tried to push them into a market segment they never had any serious interest in.

The products are a large step up from integrated graphics, they aren't going to cost much, and they'll provide everyday performance up to a 4K desktop environment without wheezing. The high-speed, high-quality encoders are perfect for content creators. They can also do some gaming here and there at 1080p in modern DX12 titles. Most computer users don't game, and most of the ones that do get by with something in the 1050-1650 performance range.

Nobody at Intel expected to match Nvidia's or AMD's decades of driver maturity extending back to older titles on day one, nor do they anticipate competing in the 450-600 W range with the other companies, for the home user.

Oh, and did I mention that they aren't going to cost very much? The price to an OEM is ridiculously low. It's created a value proposition where it's not that much more expensive for an OEM to add discrete graphics to their products where they would have normally gone with integrated.

AMD is going in the other direction with RDNA 2 embedded in their APUs. The problem with that, from a regular AMD user's perspective, is power delivery to the APU and how challenging it is to overclock an APU versus overclocking a CPU and a GPU separately. I have a 5800H system I'm fiddling with right now that I'd love to run at a lower CPU TDP, and it'll go down to 15 W. But if I lower it below 45 W, the GPU performance gets scalped by 75%.
 

TheOtherOne

Distinguished
AV1 was created in 2019, and Arc does it faster than Nvidia cards that have been doing this for 18 and 10 years on the old codecs - that's the whole point of the article.
My point was: why the heck are we comparing this to decades-older tech when its upgraded version has already been out for years and doing much better? Let's see how BIG and HUGE the difference is between AV1 and H.265/x265, and then they can brag about being faster by a HUGE margin.
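For what it's worth, that comparison is straightforward to run yourself. Here's a rough sketch that encodes the same clip as HEVC and AV1 at the same bitrate with ffmpeg's software encoders and scores both against the source with VMAF. It assumes an ffmpeg build with libx265, libsvtav1, and libvmaf; the filename and the 3 Mbps target are just placeholders:

```python
import re
import subprocess

SOURCE = "source.mp4"   # placeholder clip
BITRATE = "3M"          # same target bitrate for both codecs

def encode(codec: str, out_file: str) -> None:
    # Encode video only, at a fixed average bitrate, with the given software encoder.
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec,
                    "-b:v", BITRATE, "-an", out_file], check=True)

def vmaf_score(distorted: str) -> float:
    # ffmpeg builds with libvmaf print "VMAF score: NN.NN" in the log.
    result = subprocess.run(["ffmpeg", "-i", distorted, "-i", SOURCE,
                             "-lavfi", "libvmaf", "-f", "null", "-"],
                            capture_output=True, text=True)
    match = re.search(r"VMAF score: ([\d.]+)", result.stderr)
    return float(match.group(1)) if match else float("nan")

encode("libx265", "hevc.mp4")     # H.265 / HEVC
encode("libsvtav1", "av1.mp4")    # AV1 (SVT-AV1)
print("HEVC VMAF @ 3 Mbps:", vmaf_score("hevc.mp4"))
print("AV1  VMAF @ 3 Mbps:", vmaf_score("av1.mp4"))
```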
 
The graph makes it pretty clear: image quality vs. 3.5/6/8 Mbps bitrate for each implementation.

Looks like they've quietly edited the article to remove all the references to encoding speed, since it was originally a confusing mess. Makes you wonder if there's a reason they exclude their entire site from the Wayback Machine...
 
