"Curious if these will require ReBar as Alchemist did."

GN said that Intel said it would require ReBar to work properly, just like the first generation.
"GN said that Intel said it would require ReBar to work properly, just like the first generation."

Looks like they got it from the Sparkle specifications rather than Intel, but I'd say it's as reliable.
"They announced just in time to completely miss Black Friday, and by extension, the holiday shopping season."

Nope, they timed it so that they could clear old inventory during Black Friday, and now early adopters will have a chance to pay full price right before or after Christmas, ahead of Nvidia or AMD releasing new cards. If anything, they timed it pretty well.
I bet somebody lost their job over that silly blunder...
"I don't see the point of the B570 at 220 USD; it makes more sense at 200 USD. Everyone is going to stretch and get the B580. Hope the driver updates boost performance to more than 10% above the 4060. With AMD's RDNA4 launch, Intel's win would be short-lived, 2 or 3 months max. And 2 more months after that, you'll have the 5060."

Looking at the x060 performance and the TPU performance charts, and using the 2060 Super as the baseline.
I wouldn't expect too much from the 5060, especially if it is as VRAM-starved as the 4060 was.
"The 4060 wasn't VRAM starved; the 16GB version that was floating around proved as much. It's memory-bandwidth starved: that 128-bit interface just isn't able to move data fast enough. If you look, you'll see that these Intel dGPUs are 192 and 160 bits wide, so they don't look to be starved. With current DRAM densities it's 2GB of VRAM for each 32 bits of memory bus. You can daisy-chain DRAM chips for double capacity at the same bandwidth, which is what the 16GB 4060 did."

Memory bandwidth is about a lot more than the bus width; the 4060 is not "starved", and in fact it outperforms many GPUs with more memory in many games and applications. I'm almost 100% sure the 5060 will have more memory bandwidth overall than the B580, regardless of bus width.
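For anyone who wants to sanity-check the bus-width argument, here's a minimal sketch of the arithmetic, assuming the commonly cited spec-sheet data rates (17 Gbps GDDR6 on the 4060, 19 Gbps on the B580 and B570); treat the figures as illustrative, not measured:

```python
# Back-of-the-envelope GDDR6 math for the bandwidth-vs-capacity point above.
# Peak bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# Data rates here are commonly cited spec-sheet values, not measurements.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

def capacity_gb(bus_bits: int, gb_per_chip: int = 2) -> int:
    """VRAM with one 2GB GDDR6 chip per 32-bit channel. Clamshell or
    daisy-chained chips double this without adding any bandwidth."""
    return bus_bits // 32 * gb_per_chip

for name, bits, rate in [("RTX 4060", 128, 17.0),
                         ("Arc B580", 192, 19.0),
                         ("Arc B570", 160, 19.0)]:
    print(f"{name}: {capacity_gb(bits)} GB, ~{bandwidth_gb_s(bits, rate):.0f} GB/s")
# RTX 4060: 8 GB, ~272 GB/s
# Arc B580: 12 GB, ~456 GB/s
# Arc B570: 10 GB, ~380 GB/s
```

That's why doubling a card's VRAM via clamshell (as on the 16GB card mentioned) changes the capacity column but leaves the bandwidth column untouched.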
Isn't that the point? AKA using the classic cinema-style bait price.

Nice write-up. I will be an early adopter, just like with the A750. 12GB of VRAM will be nice. Let's hope the drivers are mature enough.
"This card looks like the new Polaris. Or, more recently, a more efficient 6700 XT that does good ray tracing and upscaling and costs less than a used 6700 XT. I was really hoping for the 32-core version, and I didn't like how the fans looked on the LE at first, but after seeing the assembly animation clip in this video they are starting to grow on me.

View: https://youtu.be/Dl81n3ib53Y?t=563

Especially from a volume-efficiency perspective, I really want the 2-slot version, so it looks like the LE for me.

It will be nice seeing how well it does vs my A750."

I love the way the A750 LE looks, and this one looks pretty nice as well. I am looking forward to it.
I don't see the point of the B570 at 220 USD; it makes more sense at 200 USD. Everyone is going to stretch and get the B580.

Hope the driver updates boost performance to more than 10% above the 4060. With AMD's RDNA4 launch, Intel's win would be short-lived, 2 or 3 months max. And 2 more months after that, you'll have the 5060.
The B570 was listed at 150 W. Assuming all power comes from PCIe connectors, a single power connector would suffice for the B570, while the B580 would need two. That might not bother enthusiasts, but it is likely important to system builders.

Other than that, I agree: if these two were the only choices, I would go for the B580.
"I love the way the A750 LE looks, and this one looks pretty nice as well. I am looking forward to it."

In the video, Intel claims the B580 runs at half the noise level of the previous LE cards.
"The B570 was listed at 150 W. Assuming all power comes from PCIe connectors, a single power connector would suffice for the B570, while the B580 would need two."

Two connectors aren't required for the B580 (stock 190 W), and most of the models I've seen only have one. Between slot power (75 W max) and a PCIe 8-pin (150 W max), there's 225 W maximum available to the card. Personally, I'd prefer having two connectors because I'd really like to see how far the card could go, but it shouldn't be necessary for daily-driver type operation.
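To spell out the headroom arithmetic, a quick sketch using the PCIe spec maximums (75 W from the slot, 150 W per 8-pin) and the stock board-power figures mentioned above:

```python
# PCIe power budget: slot power plus auxiliary connectors versus rated draw.
SLOT_W = 75    # PCIe x16 slot maximum
PIN8_W = 150   # one 8-pin auxiliary connector maximum

def headroom_w(board_power_w: float, n_8pin: int) -> float:
    """Spare watts between available power and the card's rated board power."""
    return SLOT_W + n_8pin * PIN8_W - board_power_w

print(headroom_w(190, 1))  # B580 stock with one 8-pin: 225 - 190 = 35 W spare
print(headroom_w(150, 1))  # B570 at 150 W with one 8-pin: 75 W spare
print(headroom_w(190, 2))  # two 8-pins: 375 - 190 = 185 W of overclocking room
```

So a single 8-pin covers the stock B580 with a little margin, which is why a second connector only matters if you want to push the card well past its rated draw.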
"If NV had any sense, they'd drop a 16GB 5060 with a little performance nudge, while keeping power requirements low, at the $250 price point, and so long as it wipes the floor with the 4060, it will wipe out Intel's and AMD's market share. But as we know with NV, they will just hike the prices and try to milk the customer."

If Nvidia wiped out Intel and AMD, then they would be in trouble. It is in Nvidia's best interest to keep "milking" the customer and pissing them off; that way, people will buy Intel or AMD.
"I don't see the point of the B570 at 220 USD; it makes more sense at 200 USD. Everyone is going to stretch and get the B580. Hope the driver updates boost performance to more than 10% above the 4060."

The price on the B570 will fall eventually. Same strategy as AMD: don't price too low at launch, let it drop unofficially later.
"So gen on gen (A580, $179), this delivers roughly 40-45% more performance for exactly 40% more money, and adds an extra 4GB of VRAM? Do I get it right? 😬"

They might not have made money on the A580 at that price (or on most of the Arc cards at any price point). And it was launched extremely late, a whole year after the A750/A770. How many A580s were even made?
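For what it's worth, the "exactly 40% more money" part checks out against a $179 A580 and a $249 B580; a quick sketch, where the 40-45% uplift is the poster's estimate rather than a benchmark result:

```python
# Gen-on-gen value check: A580 ($179, 8GB) vs B580 ($249, 12GB).
a580_price, b580_price = 179, 249
price_increase = b580_price / a580_price - 1
print(f"Price increase: {price_increase:.1%}")  # 39.1%, i.e. roughly "40% more money"

# Perf-per-dollar change at the quoted 40-45% performance uplift.
for perf_uplift in (0.40, 0.45):
    value_change = (1 + perf_uplift) / (1 + price_increase) - 1
    print(f"{perf_uplift:.0%} faster -> {value_change:+.1%} perf per dollar")
# Roughly flat (+0.6%) to +4.2%: most of the gain is priced in, the VRAM is the bonus.
```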
"This card looks like the new Polaris."

RDNA4 will be the new Polaris; Battlemage is the last gasp. I would check out a B380/B310 for SFF, though.