News: Sapphire Nitro+ RX 9070 XT hides its 16-pin connector inside, offers cableless look

This just begs the question: why? The RX 9070 XT isn't going to pull that kind of power. Let's even say the Nitro+ overclocks enough to run consistently 60W over the 304W standard.

At 364W, you're still within the envelope of 2x 8-pin plus the PCIe slot. How much power are they planning for this card to pull?
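To put rough numbers on that, here's a quick sketch. It assumes the usual PCIe spec limits (75W from the slot, 150W per 8-pin); the 60W overclock figure is the hypothetical scenario above, not a measured draw:

```python
# Quick envelope check, assuming the PCIe spec limits:
# 75 W from the slot, 150 W per 8-pin connector.
SLOT_W = 75
EIGHT_PIN_W = 150

dual_8pin_envelope = SLOT_W + 2 * EIGHT_PIN_W  # 375 W total
hypothetical_oc_draw = 304 + 60                # 364 W, the scenario above

print(f"2x 8-pin + slot envelope: {dual_8pin_envelope} W")
print(f"Hypothetical OC draw:     {hypothetical_oc_draw} W")
print(f"Headroom left:            {dual_8pin_envelope - hypothetical_oc_draw} W")
```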
 
That's fair, though it looks as if this could've easily accommodated dual 8-pin connectors. And, if I'm recalling correctly, the 16-pin connector's cable tends to be stiffer, doesn't it? That suggests to me that the dual 8-pin would work at least as easily.

And, I think even with dual 8-pin, there would still be room for the LED cable.

EDIT: I should admit that the aesthetics are quite appealing on this card. I'm just not convinced that moving to the 16-pin connector was necessary for it, nor do I think it's a good idea.
 
I'm a big Sapphire fan, but this is a HARD no for me. AMD is specifically trying to avoid the 16-pin issues, but Sapphire has now made the Radeon brand vulnerable, even if they are the only AIB that does this.

BTW, this must be on the 340W TBP models that AMD was talking about in their presentation.
 
Nobody has noticed that this places the connector right under an active fan? Smart actually, for a connector that likes to melt itself.

Would've been neat to see a small heatspreader plate connected to the heatsink fins where the connector sits also!
 
I've perused the specs of all of the board partners that have their 9070 XTs listed. ASRock's Taichi SKU also uses the 12V-2x6 connector, but the connection isn't hidden like Sapphire's Nitro+. Looking at the other manufacturers, it seems the hot-clocked cards get either three 8-pins or the 12-pin, while the moderately or stock-clocked cards use two 8-pins.
The highest published clock is the Taichi's, which boosts to 3100 MHz. That's a 4.4% OC. Can they successfully get a big premium for an improvement that will likely fall within the margin of error in most testing? I would be happy being closer to stock: eating only 300W, measuring less than 300mm long, and maybe only taking up two slots.
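For reference, here's where that 4.4% comes from. The 2970 MHz reference boost is my assumption based on AMD's published spec for the 9070 XT:

```python
# Where the 4.4% comes from: the Taichi's published boost clock
# against the 9070 XT reference boost (assumed 2970 MHz per AMD's spec).
reference_boost_mhz = 2970
taichi_boost_mhz = 3100

oc_percent = (taichi_boost_mhz / reference_boost_mhz - 1) * 100
print(f"Factory OC over reference boost: {oc_percent:.1f}%")  # ~4.4%
```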
 
It almost looks like this configuration forces you to violate the connector's minimum bend radius (unless you have a horizontal, 90-degree connector, but I'm not sure I've ever seen a 12-pin connector like that). Without dimensions, I'm just speculating, though.

But as said above, the fact that it's right next to the heatsink fins and in the path of the fans probably helps, plus it's not a super power-hungry card.
 
Nobody has noticed that this places the connector right under an active fan? Smart actually, for a connector that likes to melt itself.

Would've been neat to see a small heatspreader plate connected to the heatsink fins where the connector sits also!
Alright, that's an excellent point. I still don't feel that this is overall a good idea, but it seems they're trying to mitigate the consequences with more than just the fuses, so, kudos on that.
 
Nobody has noticed that this places the connector right under an active fan? Smart actually, for a connector that likes to melt itself.

Would've been neat to see a small heatspreader plate connected to the heatsink fins where the connector sits also!
It's not exactly great there; they put the connector behind the fins, so it gets hot air blown at it.

That said, for the power this card pulls, I don't think that connector will be an issue.
 
This just begs the question: why? The RX 9070 XT isn't going to pull that kind of power. Let's even say the Nitro+ overclocks enough to run consistently 60W over the 304W standard.

At 364W, you're still within the envelope of 2x 8-pin plus the PCIe slot. How much power are they planning for this card to pull?
It's the same question I once had for Android phones, when certain manufacturers decided it was the cool thing to do and put a notch on their phones to make people think they looked like iPhones.
 
The 16-pin connector has less cable clutter compared to 2x 8-pin side by side, and that improves airflow. Just a guess.

Also, seeing as that 16-pin's "600W" max rating is bunk and more like "450W", I'd say it's the perfect card to pair it with.
So long as they didn't bridge the six 12V rails into one, and used at least three shunt resistors across three rails...
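Back-of-envelope sketch of why the balancing matters more than the headline rating. The ~9.5A per-pin figure is the commonly cited terminal rating for this connector, not anything from Sapphire:

```python
# Per-pin current for the 12V-2x6 connector's six 12 V conductors,
# using the commonly cited ~9.5 A per-pin terminal rating.
PINS = 6
VOLTS = 12.0
PIN_RATING_A = 9.5

def per_pin_amps(total_watts, pins_carrying=PINS):
    """Current per pin if the load splits across `pins_carrying` wires."""
    return total_watts / VOLTS / pins_carrying

for watts in (364, 450, 600):
    amps = per_pin_amps(watts)
    status = "OK" if amps <= PIN_RATING_A else "over rating"
    print(f"{watts} W balanced across {PINS} pins: {amps:.2f} A/pin ({status})")

# The failure mode when the rails are bridged with no per-rail shunts:
# nothing stops most of the load from going down a couple of good contacts.
print(f"600 W over only 2 contacts: {per_pin_amps(600, 2):.1f} A/pin")
```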
 
This just begs the question: why? The RX 9070 XT isn't going to pull that kind of power. Let's even say the Nitro+ overclocks enough to run consistently 60W over the 304W standard.

At 364W, you're still within the envelope of 2x 8-pin plus the PCIe slot. How much power are they planning for this card to pull?
Perhaps it's for testing the cable at normal power draws, to see where the problem is. It shouldn't melt; if it does, the problem lies in the cable itself or in the power delivery setup. Or something like that.
 
There's no real good reason to use that defective connector other than aesthetics. Placing it where the cable blocks airflow, while bending it into a questionable position, doesn't look like a good idea.
They could have just placed the 2x 8-pins at the bottom front of the GPU like on server cards, and it would have been a better solution.
 
Nobody has noticed that this places the connector right under an active fan? Smart actually, for a connector that likes to melt itself.

Would've been neat to see a small heatspreader plate connected to the heatsink fins where the connector sits also!
And you see no problem with the insanity that you have to use active/passive cooling for a stupid connector? Just because it supposedly looks better?
Are we Apple?
 
That's a mighty big-looking bit of hardware. So, when do we stop calling them graphics "cards" and start calling them graphics bricks? Not saying there's anything wrong with them, just pointing out how un-card-like they've become.

Also, those fuses...are they thermal fuses? And are they easily replaced? Or are they just a self-destruct for the $600+ graphics brick designed to save your $16 cable? Just curious.

Dang. I just read that back to myself in Mariner's voice. Been watching too much Lower Decks lately.
 
I'm a big Sapphire fan, but this is a HARD no for me. AMD is specifically trying to avoid the 16-pin issues, but Sapphire has now made the Radeon brand vulnerable, even if they are the only AIB that does this.

BTW, this must be on the 340W TBP models that AMD was talking about in their presentation.
High-end Nvidia RTX 40 and 50 cards burn because Nvidia has continually simplified/shrunk the power delivery on the graphics card side, removing load balancing and protection features that were there for the 3090 series.

Buildzoid did a video recently analyzing the routings (and ranting as he's wont to do).
 
It's the same question I once had for Android phones when certain mfg's decided it was the cool thing to do and put a phone notch on their phones to make people think they looked like iPhone's.
It was there for a very short period of time to increase the screen-to-body ratio. It's been gone and almost forgotten for a while now; nowadays you have one or two camera holes, or the camera is completely hidden.
 
I won't buy any product that uses this connector for more than 350W. So, I guess this is an interesting use for it.
350W is still 29A if it all ends up going down a single wire. The connector is nowhere near as important as the current balancing mechanism for any multi-wire solution. (Over both the 12V and ground wires!)
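A rough sketch of that worst case. The 16 AWG wire resistance (~13.2 mOhm/m) and the 0.3 m run are my assumptions, just to put a scale on the heating:

```python
# The 29 A figure: 350 W through a single 12 V conductor.
# Assumptions (mine, not from the post): 16 AWG copper at ~13.2 mOhm/m,
# and a 0.3 m run carrying the entire load.
watts, volts = 350, 12
amps = watts / volts
print(f"{watts} W / {volts} V = {amps:.1f} A")  # ~29.2 A

resistance_per_m = 0.0132  # ohms per metre, ~16 AWG copper
wire_len_m = 0.3
heat_w = amps ** 2 * resistance_per_m * wire_len_m  # P = I^2 * R
print(f"I^2R heating in that one wire: {heat_w:.1f} W")  # ~3.4 W
```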
 