News Maxsun unveils Intel dual-GPU Battlemage graphics card with 48GB GDDR6 to compete with Nvidia and AMD

The most important piece of information, how the two GPUs are interconnected, is missing.

And that's most likely because it's bad news: the GPUs communicate over the same two x8 links that are also used to talk to the host.

And with that it's really a 2-in-1 card with two distinct 24GB GPUs, not a double-sized GPU with 48GB; a mere space saver.

The value of that approach is extremely limited; you'd have to compensate with model-design manpower to exploit it.

In a normal market that wouldn't work, but we are far from that.
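To put numbers on why a host-routed x8 link is bad news, here is a back-of-envelope sketch. It assumes a PCIe 5.0 x8 link per GPU (the B60's reported link; treat that as an assumption) and the B580's 456 GB/s GDDR6 figure as a stand-in for on-card memory bandwidth:

```python
# Rough comparison: GPU-to-GPU traffic over the host's PCIe link vs.
# each GPU's local GDDR6. All figures are estimates, not measurements.

def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    """Usable GB/s for a PCIe link with 128b/130b encoding."""
    return gt_per_s * (128 / 130) / 8 * lanes

link = pcie_gbps(32.0, 8)  # PCIe 5.0: 32 GT/s per lane, x8 link
print(f"PCIe 5.0 x8: {link:.1f} GB/s")             # ~31.5 GB/s
print(f"Local GDDR6: 456 GB/s, ~{456 / link:.0f}x the link speed")
```

Any workload split across the two GPUs pays that order-of-magnitude penalty whenever tensors cross the card's internal boundary, which is why a real interconnect would have mattered.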
 
The most important piece of information, how the two GPUs are interconnected, is missing.

And that's most likely because it's bad news: the GPUs communicate over the same two x8 links that are also used to talk to the host.

And with that it's really a 2-in-1 card with two distinct 24GB GPUs, not a double-sized GPU with 48GB; a mere space saver.

The value of that approach is extremely limited; you'd have to compensate with model-design manpower to exploit it.

In a normal market that wouldn't work, but we are far from that.
Sorry, I'm not understanding it that well: so this will perform the same as (maybe worse than) just buying the two cards separately, and the only real benefit is motherboard space, since they share a PCIe slot? And maybe it might be cheaper than two separate cards?
 
And with that it's really a 2-in-1 card with two distinct 24GB GPUs, not a double-sized GPU with 48GB; a mere space saver.
That's definitely the point as it enables the top configuration of Intel's "Project Battlematrix":
[attached image: Project Battlematrix top configuration]
 
GN did a teardown of the pre-production sample, and it looks like it's just two GPUs, each attached to its own x8 lanes.
I don't see any traces or a chip interconnecting the two, so it seems like a space-saving design.
 
Kind of cool to have two GPUs in one system. If some software doesn't work with dual graphics, you can choose which GPU will drive it.

It's like a dual-CPU motherboard, maybe with some kind of NUMA memory pool :)
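Picking which GPU drives an application can be done today with Intel's oneAPI runtime. A minimal sketch, assuming the two GPUs enumerate as Level Zero devices 0 and 1 (verify the ordering with `sycl-ls` on your own system; `ONEAPI_DEVICE_SELECTOR` is a real oneAPI environment variable, but the indices here are an assumption):

```shell
# Pin a workload to the first of the card's two GPUs by restricting
# which devices the oneAPI runtime exposes to the process.
export ONEAPI_DEVICE_SELECTOR="level_zero:0"   # first GPU only
echo "selector=$ONEAPI_DEVICE_SELECTOR"
# Now launch the application normally; it will only see device 0.
# Use "level_zero:1" in another shell to steer a second app to the other GPU.
```

This mirrors the CUDA_VISIBLE_DEVICES pattern on the Nvidia side: no multi-GPU support needed in the application itself, just one process per GPU.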
 
Well, maybe some techtuber will get one to test with and run Ashes of the Singularity, would be interesting at the least.

I can't think of any other titles that actually support multi-GPU.
 
This dual Maxsun card is not proven and is made by a Chinese company, so they must be cutting corners. If it is priced around 700 bucks, it could be a novelty purchase.

However, I’m going to move forward with a B60 24 GB card, maybe buy two to start things off.

The Nvidia Empire is collapsing, Intel is quietly becoming the platform for creative work. Get in now while prices are good.
 
This dual Maxsun card is not proven and is made by a Chinese company, so they must be cutting corners. If it is priced around 700 bucks, it could be a novelty purchase.
Not counting tariffs, $700 actually sounds like a small markup, with the only cut corner, and the real risk, being production scale. I really can't imagine them selling like hotcakes.
However, I’m going to move forward with a B60 24 GB card, maybe buy two to start things off.
If you have the space, that should be just as good (or bad), but it offers the flexibility to repurpose the two separately.
The Nvidia Empire is collapsing, Intel is quietly becoming the platform for creative work. Get in now while prices are good.
Collapsing is where I believe Intel is far ahead in the lead, and that could become an issue in terms of support and long-term value.

After some initial evaluation, my B580 sits in storage, mostly as a spare. At €300 in January it was fair value for the price, but not a great GPU. I fear margins aren't where Intel needs them to carry on. It still retains some of the A770's issues around power consumption, and there's nearly no XeSS (and AI) support in games (apps).

If I were really hard pressed on money, I'd probably go for a 5060 instead, even at 8GB. Otherwise I'd aim for a 5060 Ti with 16GB or save for something bigger.

If I could swap the B580 for a 24GB variant at €50 extra, I'd probably do that, mostly for curiosity's sake. But I wouldn't buy a 24GB variant new: the value of the additional RAM may be much less than what you're hoping for.

I have a 24GB RTX 4090, and there aren't that many off-the-shelf, home-use models that fit that exact RAM size. Most of the 70B stuff really needs much more, and even 35B models struggle. Designing and training your own size-matched variant, even if you can use an existing open-source dataset and model as a base, is way beyond what even Intel seems ready to invest: don't expect anyone else to do that, unless it were their only choice.
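A rough sizing sketch makes the "awkward 24GB" point concrete. This estimates weight memory only, with a flat ~20% guess for KV cache and runtime overhead; all numbers are estimates, not measurements of any particular model:

```python
# Rough VRAM needed to hold an LLM's weights at a given quantization.
# overhead is a crude allowance for KV cache and runtime buffers.

def weights_gb(params_b: float, bits: int, overhead: float = 0.2) -> float:
    return params_b * bits / 8 * (1 + overhead)

for params in (8, 14, 35, 70):
    for bits in (16, 4):
        need = weights_gb(params, bits)
        verdict = "fits" if need <= 24 else "does not fit"
        print(f"{params:>3}B @ {bits:>2}-bit: ~{need:5.1f} GB, {verdict} in 24 GB")
```

Under these assumptions a 70B model misses 24GB even at 4-bit, while 35B at 4-bit squeezes in with little room for context, which matches the "struggle" above.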

I can't see that B60 card fitting any workload I care about. The base B580 isn't great at 1440p gaming, and 24GB doesn't improve that. But even for AI/ML, 24GB isn't a very useful RAM size, while the lack of software support may mean it delivers much less than a CUDA-capable card with similar hardware specs would.

Model design and training for denser models might be a niche, but even there you'd much rather use the CUDA ecosystem. For anything LLM even huge piles of these won't do any good, even if you could afford to write your own software stack like DeepSeek did: not for design, not even for training, and not for inference.

I'd be happy to hear your experiences if you dare to try.

In cases like this I like to ensure that I have a 30-day free return window available.