Sapphire Launches Cryptomining Radeon RX 570 With 16GB VRAM


I don't think VRAM is really much of a limitation in today's games at 4K, at least so long as the card has 8GB. I don't recall seeing any benchmarks where the 2080's 8GB caused any performance issues relative to the 1080 Ti's 11GB. Even the 2060's 6GB seems to be enough to handle current games just as well as the 1070 Ti's 8GB at 4K, though I could see that potentially impacting performance more within the next couple of years. It seems like 8GB should be enough for a while though, so it probably wouldn't be worth using these cards in a multi-card setup for gaming, since they undoubtedly cost more than 8GB RX 570s.

As for 5K, it likely won't be relevant for gaming for quite a while, and I can't see even three RX 570s handling it well. Considering how poorly supported SLI and CrossFire tend to be, you would be stuck running a single RX 570 in any game that didn't work well with multi-GPU, meaning you would fall back to middling, 1080p-class performance in those titles. I can't see multi-card support getting better any time soon either, seeing as Nvidia, at least, is moving away from it, only supporting SLI on their $700+ cards this generation.

And even if multi-card setups were widely supported by game developers, it still probably wouldn't be a very good idea to run these cards in CrossFire at 4K or higher. Even with good multi-card scaling, a single 2060 or 1070 Ti would likely outperform two of these cards in most cases, at around half the power draw and heat output. And even those cards only provide mediocre performance at 4K, requiring graphics settings to be lowered to maintain reasonable frame rates in newer games. Three RX 570s might draw close to 500 watts when fully utilized, making such a setup even less practical.
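To put a rough number on that last point, here's a quick back-of-the-envelope estimate in Python. The 150 W figure is AMD's board-power spec for the reference RX 570; the 100 W for the rest of the system is just an assumed placeholder, and a 16GB partner card under full mining load could draw somewhat more.

    # Rough power estimate for a triple RX 570 setup (assumed figures).
    RX_570_BOARD_POWER_W = 150  # AMD's reference board-power spec; partner cards vary
    NUM_CARDS = 3
    REST_OF_SYSTEM_W = 100      # assumed CPU/motherboard/drive draw under load

    gpu_draw_w = RX_570_BOARD_POWER_W * NUM_CARDS
    total_draw_w = gpu_draw_w + REST_OF_SYSTEM_W
    print(f"GPUs alone: ~{gpu_draw_w} W, whole system: ~{total_draw_w} W")
    # -> GPUs alone: ~450 W, whole system: ~550 W

So even before system overhead, three cards are already in the 450 W range, which is why I'd put the practical total close to or above 500 watts.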
 

TJ Hooker

Titan
Ambassador

There are always new coins and algorithms coming out, some of which are ASIC-resistant (at least for a time). I hadn't heard of the "grin coin" mentioned in the article before, but apparently it uses a new proof-of-work algorithm that's supposed to be mostly resistant to existing ASICs, making it friendly for CPU/GPU mining. I'm sure there are other coins out there that can be GPU-mined as well; I think people are still GPU mining Ethereum, though I have no idea how profitable it is.
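For anyone who hasn't looked at how proof-of-work mining actually operates, here's a minimal hashcash-style sketch in Python. To be clear, this is not Grin's algorithm (Grin's Cuckoo Cycle is memory-bound, which is the property that's supposed to frustrate ASICs); it's just the generic "search for a nonce until the hash meets a target" loop that most proof-of-work schemes are variations on.

    import hashlib

    def mine(header: bytes, difficulty: int) -> int:
        # Generic hashcash-style proof of work: find a nonce such that
        # SHA-256(header + nonce) starts with `difficulty` zero hex digits.
        # Illustrative only; Grin's actual PoW (Cuckoo Cycle) works differently.
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(header + str(nonce).encode()).hexdigest()
            if digest.startswith(target):
                return nonce
            nonce += 1

    print(mine(b"example block header", 4))  # ~65k attempts on average

ASICs dominate this kind of pure compute hashing, which is why memory-hard designs try to shift the bottleneck to memory bandwidth instead, where GPUs (and to some extent CPUs) stay competitive.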
 

bloodroses

Distinguished


Hmm, interesting that they're trying to make them ASIC-resistant. It seems kind of counterproductive to making a 'standardized' coin that gets universally adopted, though. I can see other 'grey area' uses for it. Either way, as you said, it's interesting as a proof-of-work concept.

As for mining, honestly I see the fad completely dying out as far as being useful/profitable goes, since businesses/governments will just spin up their own currency if they choose to switch to crypto. This article is a perfect example of what Japan is trying to accomplish:

https://www.technologyreview.com/s/611656/will-people-ditch-cash-for-cryptocurrency-japan-is-about-to-find-out/
 

InvalidError

Titan
Moderator

I can't imagine governments switching to crypto for currency; that's far too inefficient. It's much simpler to stick to regular currency and existing banking infrastructure, which doesn't require massive computing power per transaction.

Keep in mind that one of the core reasons behind crypto-currencies' massive overheads is the lack of trust, since there are no trusted central authorities to back them up. Crypto-currencies are little more than computer science experiments.
 

TJ Hooker

Titan
Ambassador

Well yeah, that's kind of the whole idea behind cryptocurrencies, that you don't need to trust a centralized authority in order for the currency to have value.
 
