News Intel Claims 71% Arc GPU Performance Boost in Cyberpunk 2077

Colif

Win 11 Master
Moderator
Nice, maybe we can make newer games look better instead of making a two-year-old game look better.

Maybe it's a sign of a general lack of good games coming out that we have to improve on games from the past.
 

Eximo

Titan
Ambassador
Maybe it's a sign of a general lack of good games coming out that we have to improve on games from the past.

Development time for impactful games has become years long, and in terms of popularity it is often franchises that win out over new titles. Elden Ring is maybe the standout new title?

Hogwarts Legacy, Spider-Man, and your usual CoD and Battlefield seem to be the high-graphics titles of note (I am sure I am missing some).

How many years is Witcher 4 going to take?
 
Nice, maybe we can make newer games look better instead of making a two-year-old game look better.

Maybe it's a sign of a general lack of good games coming out that we have to improve on games from the past.
It's probably just CD Projekt Red pushing out the DLSS 3 and Overdrive update and saying, hey, let's toss in XeSS support as well. Since the game already has DLSS 2 and FSR 2.1 support, adding XeSS should have been pretty trivial.
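
For what it's worth, the reason that kind of addition tends to be cheap is that DLSS 2, FSR 2.x, and XeSS all consume essentially the same per-frame inputs (jittered color, depth, motion vectors, jitter offsets, exposure), so once an engine has one temporal upscaler wired in, the others mostly slot in behind the same interface. Here's a minimal illustrative sketch in Python pseudocode; the class and method names are hypothetical and are not CDPR's actual code:

```python
# Hypothetical sketch of why adding another temporal upscaler is cheap once one exists.
# None of these names come from CDPR's code; they only illustrate the shared interface.
from dataclasses import dataclass

@dataclass
class UpscalerInputs:
    color: object           # jittered, lower-resolution color buffer
    depth: object           # depth buffer
    motion_vectors: object  # per-pixel motion vectors
    jitter: tuple           # sub-pixel jitter offset for this frame
    exposure: float

class TemporalUpscaler:
    def evaluate(self, inputs: UpscalerInputs) -> object:
        raise NotImplementedError

class DLSS2Upscaler(TemporalUpscaler):
    def evaluate(self, inputs):
        ...  # would call into Nvidia's SDK

class FSR2Upscaler(TemporalUpscaler):
    def evaluate(self, inputs):
        ...  # would call into AMD's FidelityFX SDK

class XeSSUpscaler(TemporalUpscaler):
    def evaluate(self, inputs):
        ...  # would call into Intel's XeSS SDK

def render_frame(upscaler: TemporalUpscaler, inputs: UpscalerInputs):
    # The renderer only ever sees the common interface, so adding a vendor
    # is mostly plumbing plus that vendor's SDK setup and QA.
    return upscaler.evaluate(inputs)
```

Presumably the real cost is the per-vendor SDK setup and testing, not changes to the render loop itself.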
 

atomicWAR

Glorious
Ambassador
Nice, maybe we can make newer games look better instead of making a two-year-old game look better.

Maybe it's a sign of a general lack of good games coming out that we have to improve on games from the past.
While I appreciate devs supporting games years after launch, there does seem to be a lot of an 'add RT/DLSS/FSR/XeSS support to older games' attitude to create buzz for these newer features when new games are slow to adopt them or use flawed and/or poor implementations (most typical). I'll never forget Quake 2 being the killer app for RTX 2000 GPUs, lol. Things have improved, but they are still typically low-quality implementations or path-tracing moonshots that don't run at playable frame rates on 99% of new GPUs.
 

Deleted member 1353997

Guest
I think the title should make it clear that this is about XeSS and not just about Intel Arc GPUs.
 

Eximo

Titan
Ambassador
Yes. Having a third competitor in the GPU space could disrupt prices. Right now, Nvidia and AMD are easily able to ask almost whatever they want for high-end GPUs, unless someone comes along and says, no, you can have that performance for 30% less money.

The A770 didn't quite make it this go-around (it's a 16GB card for $400, or under $300 if you get the 8GB card), but it is priced according to its performance, not its features. If Battlemage or Celestial can get anywhere near the RTX 4070 or RX 6800 for $500, they will have a winner.

The RX 6800 is there right now at $470 (after lowered pricing following the 4070 launch), and that is what Nvidia could be doing with the RTX 4070, but they choose to ask $600.

AMD chose to wait six months after the launch of the 7900 XTX and 7900 XT before launching any of the lesser cards (if we believe the June launch date). Nvidia has spaced out the 40 series more tightly at the high end (one card per month), but waited an additional three months for the 4070 (though that's somewhat understandable: the mobile 4080 and desktop 4070/Ti share a GPU, so they've really launched three products with that chip). AMD has launched all their mobile 7000 series parts with Navi 33, with the two desktop cards being Navi 31.

If the RX 7800 isn't reasonably priced, and there is no reason to suspect it will be given the 7900 XT's $900 price, there is room for a disruptor. Though Intel looks to still be about a year out from its next launch.
 

Colif

Win 11 Master
Moderator
Yes. Having a third competitor in the GPU space could disrupt prices. Right now, Nvidia and AMD are easily able to ask almost whatever they want for high-end GPUs, unless someone comes along and says, no, you can have that performance for 30% less money.
Expecting Intel to save us... lol.

We'd be better off like it was 23+ years ago, with a lot more than just three. Actual competition, instead of companies that let their purchasers fight but do deals with each other. Intel LAN chips on AMD motherboards... it's all fake competition.

More makers might even wake up a large percentage of the population to the fact that Nvidia doesn't make all the GPUs.
 
It's probably just CD Projekt Red pushing out the DLSS 3 and Overdrive update and saying, hey, let's toss in XeSS support as well. Since the game already has DLSS 2 and FSR 2.1 support, adding XeSS should have been pretty trivial.
I wonder if adding DLSS 3 or XeSS had anything to do with them also breaking the visual quality of moving objects in-game with v1.62. Whatever they did to the anti-aliasing (TAA?) used in the game has introduced awful ghosting that makes FSR 2.1 look far worse than it did in v1.61.

Turning off AA through a config file "confirms" the issue is very likely related to AA, but it also makes the game run at lower FPS with no working upscaling tech.
 

Eximo

Titan
Ambassador
Expecting Intel to save us... lol.

We'd be better off like it was 23+ years ago, with a lot more than just three. Actual competition, instead of companies that let their purchasers fight but do deals with each other. Intel LAN chips on AMD motherboards... it's all fake competition.

More makers might even wake up a large percentage of the population to the fact that Nvidia doesn't make all the GPUs.

I don't expect Intel to save us, just to offer an alternative. Once they start a price war, it will be noticeable. They pretty much have to in order to gain any significant market share. If they price themselves the same and offer a worse gaming experience, they won't move anything. The A770/A750 offers a relatively poor gaming experience at a discounted price, but you still get the full feature set of a GPU. If they can even come close to the rasterization performance of mid-tier Radeon 7000 or Nvidia 50 series cards, that would be enough. I don't expect a price war for high-end GPUs; they can price those at whatever, and people will buy them regardless, as they always have.

AMD drops its prices often in response to Nvidia, and Nvidia did price drops to get rid of 30 series inventory (which didn't entirely work). Once there is a third option, if Nvidia keeps up its high premium buy-in and starts seeing a loss of revenue on mid-tier products, they will have to do something in response.

Not just custom/DIY builders either. Intel is likely to offer discounts to OEMs that buy their processors, so we might see very aggressively priced OEM desktops.
 

InvalidError

Titan
Moderator
I don't expect Intel to save us, just to offer an alternative. Once they start a price war, it will be noticeable. They pretty much have to in order to gain any significant market share.
While a price war would do a nice job of resetting greedflation, I don't expect Intel to aggressively pursue market share. If Intel really wanted to do that, I think they should be aiming for the best GPU they can put together for $200, where buyers would be willing to forgive minor inconveniences for great bang-per-buck that nobody else is bothering to offer, instead of aiming at $300+, where many more competing options are available and people are far more critical of quirks, no matter how small.
 

Eximo

Titan
Ambassador
That $200-300 range is exactly where they are now with the A750. The RX 6600 is just the more reliable option and generally a little cheaper, and the RTX 3050 is very overpriced at $260.

I hope Battlemage comes in around the same: a B750 (or hopefully a B550, though they might not use that name; maybe B575) for $250 with performance similar to an RX 7600 or RTX 4050. That is where it is really needed. They can still do the big B770 or whatever for $500; enthusiasts might buy it just because.

That $600 gaming PC we all got used to is really hard to do now. PSU prices are way up, GPU price-per-performance is up, and motherboard prices are up a little. About the only consistent thing has been CPU prices at the low/mid range, while SSDs have been dropping a lot.
 

InvalidError

Titan
Moderator
That $200-300 range is exactly where they are now with the A750. The RX 6600 is just the more reliable option and generally a little cheaper, and the RTX 3050 is very overpriced at $260.
I'm not talking about the $200-300 range, I'm talking a hard $199.99. A750 sales are apparently still not doing too well at $250, or $230 for the ASRock model. As you wrote earlier, Intel needs to do a lot better than inconsistently competing on same-price performance if it wants to actually move units.
 

Eximo

Titan
Ambassador
The occasional sale gets the RX 6600 down to $199.99, but it's usually around $220, and for the most part I don't think we can expect to see $200 fully featured GPUs next generation. The launch price for the RX 6600 was $329...
 

InvalidError

Titan
Moderator
The occasional sale gets the RX 6600 down to $199.99, but it's usually around $220, and for the most part I don't think we can expect to see $200 fully featured GPUs next generation. The launch price for the RX 6600 was $329...
The RX6600 launched during peak COVID with peak component shortages and crypto-mining in effect.

DRAM prices have gone down 50+% since then, and a 200 mm² die is still only $40-50 at current wafer prices. A decent $200 GPU with 8GB of VRAM is very much feasible should a manufacturer decide to make a serious bid for entry-level market share.
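
As a rough sanity check on that die cost, here's a back-of-the-envelope sketch in Python; the wafer price and yield below are my assumptions for illustration, not published foundry figures:

```python
import math

# Back-of-the-envelope die cost estimate. Wafer price and yield are assumptions
# for illustration, not published foundry numbers.
wafer_diameter_mm = 300.0
die_area_mm2 = 200.0
wafer_cost_usd = 10_000.0   # assumed price for a mature 6/7nm-class wafer
yield_fraction = 0.85       # assumed yield for a die this size

# Standard gross-dies-per-wafer approximation (area term minus edge-loss term).
r = wafer_diameter_mm / 2
gross_dies = (math.pi * r**2 / die_area_mm2
              - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
good_dies = gross_dies * yield_fraction

print(f"gross dies per wafer: {gross_dies:.0f}")                 # ~306
print(f"good dies per wafer:  {good_dies:.0f}")                  # ~260
print(f"cost per good die:    ${wafer_cost_usd / good_dies:.2f}") # ~$38
```

Even at an assumed $13,000 per wafer it works out to roughly $50 per good die, so the $40-50 figure looks plausible.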

If Intel gets its i4 process in working order, it gets to cut TSMC out of the loop, which should make it that much easier to turn a profit on cheaper GPUs.
 

Eximo

Titan
Ambassador
I suppose it depends on whether they are going to bother making any 4GB cards that aren't just there for simple display outputs.

The 6500 XT is pretty bad value compared to the A380. The GTX 1650 rarely gets down to that price range (there is one right now, though, at $162).