Alchemist outperformed Ampere by between 1.04X and 1.56X.
1.56X RT performance might sound impressive until you realize they are talking about running Fortnite at a "cinematic" 36fps. : P
The Arc A770 is much more interesting, especially since Intel is backing up its previous claim that Arc's RT hardware is competitive with, or even a bit better than, Nvidia's second-generation RT cores inside Ampere.
Considering they have suggested the A770 should be competitive with a 3060 Ti in modern APIs, falling back closer to 3060-level performance in RT implies that their RT performance is a bit worse than Nvidia's, not better. Their chart suggests the A770 may get around 10% better RT performance than a 3060 on average, but the 3060 Ti tends to get around 30% more RT performance than a 3060. So the RT performance does look competitive, just not at quite the same level in terms of the performance hit from enabling it. Of course, if they price the card like a 3060, that could still be a win.
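Put as a quick back-of-the-envelope calculation (a sketch using the rough figures quoted above from Intel's chart, not measured benchmark data):

```python
# Rough relative RT performance, normalized to an RTX 3060 = 1.0.
# Both multipliers below are assumptions taken from the discussion,
# not independently verified numbers.
rtx_3060 = 1.00
arc_a770 = rtx_3060 * 1.10      # A770: ~10% faster than a 3060 in RT (per Intel's chart)
rtx_3060_ti = rtx_3060 * 1.30   # 3060 Ti: typically ~30% faster than a 3060 in RT

# How far the A770 would trail a 3060 Ti under these numbers:
deficit = 1 - arc_a770 / rtx_3060_ti
print(f"A770 trails a 3060 Ti by ~{deficit:.0%} in RT")  # ~15%
```

In other words, under these assumed figures the A770 would sit roughly 15% behind a 3060 Ti in RT despite being pitched as 3060 Ti-class in modern APIs, which is the gap being described.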
There's also the question of how well XeSS will compare to DLSS, considering most of these games will require upscaling for optimal performance with RT. If DLSS can get away with a lower render resolution for similar output quality, that could easily eat up that ~10% difference between the 3060 and A770 in RT.
However, new clues suggest that the Arc A770 could cost around the $400 mark. That wouldn't make much sense, though, considering the official MSRP for the RTX 3060 is only $330.
I would suspect it to be priced less than $400. If they are directly comparing it to the 3060 here, then they will likely price it to be competitive with the 3060. They have stated that the cards will be priced according to their performance in "tier 3" games on older APIs, and the A770 will probably perform more like a 3060 in those titles, even if it can be competitive with the 3060 Ti in newer APIs (at least when RT isn't involved).
Out of curiosity... Has Nvidia or AMD/ATI ever done this much marketing for one of their own GPU releases in this manner?
I personally don't recall them doing this, at all, ever. I understand Intel needs to keep some of the interest alive, but... What good does it do if they won't just release the darn cards and let independent reviewers take it from there? This is now officially bordering on the pathetic.
Nvidia did a lot of pre-release hype for their 20-series when they knew the only real selling points were going to be RT and DLSS, neither of which would be usable in actual games for many months after launch. That Star Wars: Reflections demo was showcased six months before the cards came out. So far I haven't seen Intel spending millions hiring Lucasfilm to make a promotional tech demo for their cards. : P
Intel most likely planned to release their cards earlier in the year but decided they needed to spend more time on drivers than originally expected, so their promotion of the cards got dragged out a bit. And of course, unlike AMD and Nvidia, Intel is a new entrant in this market, so it's only natural that they need more promotion than their competitors to get the word out about their new cards.
And the delays ultimately don't matter that much, so long as they release the cards at competitive price points. Intel is likely willing to sell the hardware at cost to help establish a presence in the market, and delaying the release until the drivers are in a mostly usable state should make for a better first impression than launching with broken drivers half a year ago. Having the cards mostly get scooped up by mining operations and resellers probably wouldn't have helped them establish much of a presence either. Considering the still-inflated prices of mid-range graphics hardware, they still have an opportunity to make a good impression in this segment.