News Moore Threads MTT S70: A GPU with 7GB of GDDR6 Memory

Stesmi

Commendable
Not that weird, really.

I haven't looked into the architecture, so I could be off base here, but look at the Stream Processors, 3584 vs 4096. If the arrangement is one memory channel per 512 Stream Processors, 512*7 = 3584 and 512*8 = 4096.


| # 32-bit memory channels | Memory interface width | Stream processors |
|--------------------------|------------------------|--------------------|
| 1                        | 32-bit                 | 512                |
| 6                        | 192-bit                | 3072               |
| 7                        | 224-bit                | 3584               |
| 8                        | 256-bit                | 4096               |

I'm guessing here, but it does seem plausible.
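
If you want to sanity-check the pattern, here's a quick back-of-the-envelope sketch; the 512-SPs-per-channel ratio is purely my speculation, not anything MTT has confirmed:

```python
# Back-of-the-envelope check: ASSUMES (unconfirmed) 512 stream processors
# per 32-bit GDDR6 memory channel, as speculated above.
SP_PER_CHANNEL = 512
BITS_PER_CHANNEL = 32

for channels in (1, 6, 7, 8):
    bus_width = channels * BITS_PER_CHANNEL       # total memory interface width
    stream_processors = channels * SP_PER_CHANNEL
    print(f"{channels} channels -> {bus_width}-bit bus, {stream_processors} SPs")
```

With 7 channels you land exactly on the S70's 224-bit/3584 figures, and 8 gives the S80's 256-bit/4096.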

Another thing: odd memory sizes are back. People said the same when the first non-power-of-two capacity came to market; it seemed odd at the time, but now it's normal. Same goes here.
 

bit_user

Polypheme
Ambassador
Not that weird, really.

I haven't looked into the architecture, so I could be off base here, but look at the Stream Processors, 3584 vs 4096. If the arrangement is one memory channel per 512 Stream Processors, 512*7 = 3584 and 512*8 = 4096.
Nice analysis, but I think there could be a simpler explanation. My guess is they had a bin full of dies where either one memory channel was bad or some bank of stream processors was bad. Instead of creating two different lower-tier products, they just put them all in the same bin and that's how they ended up at 7/8ths.
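
To put numbers on that, here's a hypothetical salvage-bin calculation; the full-die figures are just the S80's published specs, and the assumption that SP banks track memory channels is speculative:

```python
# Hypothetical salvage bin: a full (S80-class) die has 8 memory channels
# and 4096 stream processors. Fusing off one channel and one SP bank,
# whichever happens to be defective, yields the same 7/8 part either way.
FULL_CHANNELS = 8
FULL_SPS = 4096
SP_BANK = FULL_SPS // FULL_CHANNELS  # 512 SPs per bank, if banks track channels

channels = FULL_CHANNELS - 1   # one bad (or disabled) memory channel
sps = FULL_SPS - SP_BANK       # one bad (or disabled) SP bank

print(f"{channels * 32}-bit bus, {sps} SPs")  # -> 224-bit, 3584: the S70
```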

That is mostly only true for pixel snobs who insist on playing everything at mostly maxed-out settings, which is rarely an option on what is probably a $200-ish GPU no matter how much VRAM you throw at it.
Right. I mean, considering its current performance level, in actual games, you probably can't play anything on settings that require > 7 GB, anyway.

Perhaps their next generation of products will start to be "interesting", at least from a value perspective.
 
  • Like
Reactions: gg83

Stesmi

Commendable
Nice analysis, but I think there could be a simpler explanation. My guess is they had a bin full of dies where either one memory channel was bad or some bank of stream processors was bad. Instead of creating two different lower-tier products, they just put them all in the same bin and that's how they ended up at 7/8ths.
Oh yes, absolutely. They could be connected, but don't have to be; I was speculating. It would make economic sense to have an out, as you describe: one bad bank and/or one bad memory channel => S70.
 

Stesmi

Commendable
You forgot to mention one major change/downgrade by MTT.

The S70 only supports a PCIe Gen4 x16 interface, unlike the S80, which ironically was the first GPU to get PCIe Gen5 support! :rolleyes:
I wonder if that's fused in hardware or not, as I would guess they are the same dies, just fused off parts. Nice catch.
 

bit_user

Polypheme
Ambassador
the S80, which ironically was the first GPU to get PCIe Gen5 support!
They certainly said so, but then it's highly comical to read their various announcements prior to the launch of that GPU. I honestly wonder if it even worked at PCIe 5.0 speeds.

I had a 1st gen PCIe 3.0 motherboard and GPU. For the longest time, it would often auto-negotiate down to PCIe 2.0. I never figured out why, but I didn't really care.
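
These days, on Linux at least, you can check what the link actually negotiated straight from sysfs. A minimal sketch; the PCI address below is just an example, substitute your own card's:

```python
# Read the negotiated vs. maximum PCIe link speed of a device via sysfs.
# The device address 0000:01:00.0 is an example; find yours with lspci.
from pathlib import Path

dev = Path("/sys/bus/pci/devices/0000:01:00.0")
print("current speed:", (dev / "current_link_speed").read_text().strip())
print("max speed:    ", (dev / "max_link_speed").read_text().strip())
print("link width:   x" + (dev / "current_link_width").read_text().strip())
```

If "current" reads lower than "max" under load, the link has negotiated down, like my old board did.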
 
Also, the Chunxiao architecture platform lacks proper support for the DX12, Vulkan, and OpenGL programming interfaces, which severely holds the arch back; that's why the S80 only matches the performance of a GT 1030, which is very low by today's standards.

MTT needs to focus more on drivers, firmware, and software to get the full benefit from the S70/S80.
 
  • Like
Reactions: Roland Of Gilead
They certainly said so, but then it's highly comical to read their various announcements prior to the launch of that GPU. I honestly wonder if it even worked at PCIe 5.0 speeds.

Technically speaking, MTT was the first company to announce that the S80 supports a full PCIe Gen5 x16 interface, but whether the card worked at its full potential is highly doubtful. I don't think the S80 really took full advantage of the PCIe Gen5 x16 interface in real-life benchmarks/apps.

Although it's the first PCIe 5.0 graphics card, it's unlikely to leverage that bandwidth potential.
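
For a sense of scale, the theoretical link numbers work out like this (rough math; real-world throughput is lower still):

```python
# Theoretical PCIe x16 bandwidth: transfer rate * lanes * encoding efficiency.
# Gen4 runs at 16 GT/s per lane, Gen5 at 32 GT/s.
LANES = 16
ENCODING = 128 / 130  # 128b/130b line coding overhead (Gen3 and later)

for gen, gts in (("Gen4", 16), ("Gen5", 32)):
    gbytes = gts * LANES * ENCODING / 8  # GT/s -> GB/s
    print(f"{gen} x16: ~{gbytes:.1f} GB/s per direction")
```

So Gen5 x16 is roughly 63 GB/s each way versus roughly 31.5 GB/s for Gen4, far more than a GT 1030-class workload could ever saturate.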


But hey, this card can at least play CRYSIS, so that's something for MTT to brag about! (y) :D It appears the "Can it play Crysis?" meme lives on forever!

View: https://www.youtube.com/watch?v=1R8dQWw7ldM&t=1s
 

Stesmi

Commendable
Technically speaking, MTT was the first company to announce that the S80 supports a full PCIe Gen5 x16 interface, but whether the card worked at its full potential is highly doubtful. I don't think the S80 really took full advantage of the PCIe Gen5 x16 interface in real-life benchmarks/apps.

Although it's the first PCIe 5.0 graphics card, it's unlikely to leverage that bandwidth potential.

But hey, this card can at least play CRYSIS, so that's something for MTT to brag about! (y) :D It appears the "Can it play Crysis?" meme lives on forever!

View: https://www.youtube.com/watch?v=1R8dQWw7ldM&t=1s
I'm sure it'll make an awesome text console in Linux as well :)
 
  • Like
Reactions: bit_user
This is actually very interesting because we have a previously semi-unknown company making GPUs with more capability than a GeForce RTX 3050 (at least in theory). Considering the number of people who game on Polaris cards, GTX 1xxx cards, etc., there's definitely a market for GPUs like this, as long as they're priced correctly and have good drivers.

That's a beautiful-looking card, too:
[Image: the MTT S70 graphics card]
 
  • Like
Reactions: bit_user

bit_user

Polypheme
Ambassador
This is actually very interesting because we have a previously semi-unknown company making GPUs with more capability than a GeForce RTX 3050 (at least in theory). Considering the number of people who game on Polaris cards, GTX 1xxx cards, etc., there's definitely a market for GPUs like this, as long as they're priced correctly and have good drivers.
They are rumored to be using IP from Imagination/PowerVR, which makes this all the more interesting as they're not the only ones. So, if they get it to work reasonably well, that portends an interesting future for the GPU landscape.