Tech equivalent for me means primarily equivalent resources turned into performance, which for a GPU design is mostly the RAM, in size and bandwidth, and the electric power to burn.
There the B580 and the RTX 4070 get to use the same means, 12 GB of 192-bit VRAM at ~500 GByte/s and 200 Watts of power, but wind up in very different performance classes.
Pretty near all tech reviews of the B580 chose to stick with the 'Intel recommended price equivalent' RTX 4060, which delivers somewhat less performance but uses a 128-bit bus instead of 192-bit, 277 GByte/s of bandwidth instead of 500, 8 GB of RAM instead of 12, and 115 Watts of power instead of 200: significantly fewer resources, for results that punch much higher than a linear scaling of those resources would predict.
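To put a rough number on that, here's a back-of-the-envelope sketch in Python; the performance index is purely my assumption (B580 normalized to 100, the RTX 4060 a bit below it, roughly in line with the reviews), while the resource figures are the ones quoted above:

# Back-of-the-envelope: performance per unit of resource, B580 vs. RTX 4060.
# The perf index is an assumption (B580 = 100, RTX 4060 slightly below);
# the resource figures are the ones quoted above.
cards = {
    "B580":     {"perf": 100, "watts": 200, "bw_gbs": 500, "vram_gb": 12},
    "RTX 4060": {"perf": 90,  "watts": 115, "bw_gbs": 277, "vram_gb": 8},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['watts']:.2f} perf/Watt, "
          f"{c['perf'] / c['bw_gbs']:.2f} perf/(GB/s), "
          f"{c['perf'] / c['vram_gb']:.1f} perf/GB")

Even if you shift that assumed perf index a few points either way, the 4060 comes out around 1.5x ahead per Watt and per GB/s of bandwidth, which is the point.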
I made a mistake there: since the only one who compared the B580 against an RTX 4070 was Phoronix (and the first review to come out at all), I looked at those Phoronix overall results, where it is 200 vs. 96, which qualifies as "double" (200/96 ≈ 2.08), but failed to notice that he compared it against an RTX 4070 Super, which gets 220 Watts instead of 200 and a slightly better bin of the same chip. But even at 180 vs. 96 it wouldn't be far off (≈ 1.88).
Too many cards, too easy to miss, I'm sorry for that mistake!
Yet it doesn't change the overall picture much: in terms of technology, the B580 proves how woefully behind Intel is compared to NVidia, which sounds like "envy" in Spanish (envidia) for good reason.
Intel may try to save face by offering it at that price, but current listings in Europe are at €325, which isn't €250.
The RTX 4070 is €550, the RTX 4070 Super is €650, and you can choose which one to compare it to.
The first may fall slightly short of "double", the second hits it per Phoronix; but perhaps Linux isn't the same story as Windows, even if it's the same games.
I consider throwing "dishonest" at me rather harsh and would argue that it is Intel who is trying very hard to tilt the scales.
And unless that price stays at half of what NVidia charges for a technically similar card, buying a card that converts the same Wattage into significantly less performance is an uneasy decision. And that's not counting software aspects.
Of course, if it's good enough, it's good enough; at least you can't fall into the trap of buying more than you need within the current Battlemage family.
But from a tech angle, Battlemage remains just shockingly bad, very much a Bulldozer vs a Core. And that means Intel stays far away from being a serious contender, which I'd like it to be.
But does that matter? Especially if energy efficiency isn't that high on your list?
It mostly depends on whether Intel can make enough money from Battlemage to continue, and perhaps become more efficient, too.
The price of the external resources, VRAM, PCBs, etc., and power is the same for everyone; the ASIC is the main differentiator. The B580 and the RTX 4070 don't differ vastly in transistor count, 19B vs. effectively 25B for the 4070 bin, but even if Intel could get theirs for half the price NVidia has to pay, that doesn't mean they can maintain the advertised price gap if NVidia decides to squeeze a bit. Nor does half price seem very likely, not after Pat got in a spat with TSMC's founder.
If NVidia were to lower prices ever so slightly, Intel would turn from blue to red numbers while NVidia is still raking it in, and we all know who can afford what.
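To make that squeeze concrete, here's a toy margin model; every number in it is an assumption I made up for illustration, not actual BOM data: an identical external bill of materials, an assumed ASIC cost for each side (with Intel generously paying half of what NVidia does), and the street prices from above:

# Toy margin model for the price-squeeze argument.
# ALL numbers are assumptions for illustration, not actual BOM data.
EXTERNAL_BOM = 110   # assumed: VRAM, PCB, cooler, VRMs - the same for both

def margin(street_price, asic_cost, channel_cut=0.25):
    # Rough per-card margin after an assumed 25% channel/board-partner cut.
    return street_price * (1 - channel_cut) - asic_cost - EXTERNAL_BOM

print(f"B580 at €250:     €{margin(250, asic_cost=60):.0f}")    # ~€18
print(f"RTX 4070 at €550: €{margin(550, asic_cost=120):.0f}")   # ~€180
# Now let NVidia squeeze a bit, forcing Intel to keep the half-price gap:
print(f"RTX 4070 at €450: €{margin(450, asic_cost=120):.0f}")   # ~€108
print(f"B580 at €225:     €{margin(225, asic_cost=60):.0f}")    # ~€-1

In that sketch a modest €100 cut from NVidia still leaves them over €100 per card while Intel goes red trying to hold the gap; shift my made-up numbers around and the asymmetry stays.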
I bought an A770 perhaps six months after launch, because I wanted to replace a GTX 1060 in a 24x7 box that is crammed with storage and needs something compact and low-power. That A770 mostly just failed to work with my dual-DP KVM, or with DisplayPort generally: only HDMI was ok. So I sent it back after benchmarking and got the compact and efficient PNY RTX 4070 instead, which just works fine, but unfortunately at twice the money.
Perhaps a year later I bought an A770m in an i7-12700H NUC when that NUC became so cheap the A770m with 16 GB VRAM and 500 GB/s bandwidth was basically included for free (€700 for the Serpent Canyon). At that price it just wasn't a risk, more of a freebie.
It turned out ok for gaming, and it's still used in the family.
This summer I got a Lenovo LOQ ARP9 laptop for €750 that includes an RTX 4060m as well as a Rembrandt-R 8-core Zen 3. This laptop tends to use half the Wattage at the wall that the NUC uses, yet runs even Unreal 5 games like ARK: Survival Ascended enjoyably enough with software magic, where the Serpent Canyon simply fails to exceed single-digit FPS, perhaps only because ASA doesn't support XeSS, but who knows?
I'd say I try my best to give Intel a fair chance, in CPUs and GPUs. And if they fail it typically means I have to spend more, so why would I want to tilt scales?
If the B580 really drops to €250, I might get one to test, and even keep it if it gets the job done. But if I can get another RTX 4070 for not that much more, used or whatever, I'd pick that, because when the price for NVidia is right, the value is better. And with all those boxes in the home, efficiency is money, too.
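On that last point, the arithmetic for a 24x7 box, with an assumed average draw difference and an assumed European electricity price, both just illustrative:

# Rough annual electricity cost of a power delta on a 24x7 box.
# Both the delta and the price per kWh are assumptions for illustration.
delta_watts = 60          # assumed average draw difference between two cards
eur_per_kwh = 0.30        # assumed European household rate
hours_per_year = 24 * 365

kwh = delta_watts * hours_per_year / 1000
print(f"{kwh:.0f} kWh/year -> €{kwh * eur_per_kwh:.0f}/year")
# ~526 kWh -> ~€158/year: a couple of years of that covers a good
# part of the price gap between two cards.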