News: Thanks to Miners and Scalpers, eBay Pricing for Ampere, RDNA2 GPUs Continues to Rise

They were doing fine before these fictitious puff-of-smoke coins came along to be mined. There was a time before Bitcoin and co., you know? They still did well enough on R&D...
You mean, back in the days when designing a new chip cost $50M, not $5 billion, when chips were built on 90nm processes, and when they could be produced at any of a dozen different fabs instead of having to compete for extraordinarily expensive capacity at two or three? Yeah, back then.

Look, it's the real world out there. Stop deluding yourself. GPUs are no longer just toys for kids. Significantly more than half of NVidia's revenue comes from non-gamers. Without that extra money funding development, NVidia would be announcing the 1000-series next year, not the 4000 series, and you'd be giggling in glee at the prospect of paying $700 for a card that outperforms your 960.

It won't do any good no matter how many times you try - it still makes me vomit (and I'm not the only one).
Some people are allergic to facts, I know. But forced exposure can mitigate the symptoms.
 
Why not do at least a little math before making off-the-cuff statements?

2.2 billion gamers worldwide * 200 watts average * 16 hrs/wk * 52 wks/yr / 1,000 (W to kW) / 8,141 kWh per ton of coal = 45 million tons of coal. Per year. Every year.

Some systems may not use 200 watts, but then again there are many gamers using 400+ watt systems for far more than 16 hours a week. I ignored energy-mix considerations, but I used the US figure for TCE, whereas the figure for anywhere in Asia and Africa is going to be lower. So I think this estimate is far closer than your "dozens".
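
For illustration, here is that back-of-envelope arithmetic spelled out in Python; every input is the assumed figure from above, not measured data.

```python
# Back-of-envelope check of the gaming-energy estimate above.
# All inputs are the assumed figures from the post, not measured data.
gamers = 2.2e9            # assumed number of gamers worldwide
avg_watts = 200           # assumed average system draw while gaming (W)
hours_per_week = 16       # assumed gaming hours per week
weeks_per_year = 52
kwh_per_ton_coal = 8141   # assumed US figure for kWh generated per ton of coal

kwh_per_year = gamers * avg_watts * hours_per_week * weeks_per_year / 1000
tons_of_coal = kwh_per_year / kwh_per_ton_coal
print(f"{kwh_per_year:.2e} kWh/yr -> {tons_of_coal / 1e6:.0f} million tons of coal per year")
# -> 3.66e+11 kWh/yr -> 45 million tons of coal per year
```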

So why not responsibly give up gaming? And posting to message boards also? The CO2 you save may save a life one day.... 🙄
I think you are taking what I said too literally. There have been articles published recently suggesting bitcoin mining uses more power than many small countries.
 
Well, hopefully in about a year all those mining GPUs will flood the used market and we can get them?
Is that the silver lining in this story? lol
 
The real-world facts, however, agree with me. The Bitcoin hash rate at the end of September was running 150 exahashes per second (EH/s). As of last night, it was 149 EH/s. A 30-day moving average shows a slight increase since October, but nothing extraordinary. Yes, some miners are buying some 3000-series GPUs to mine with, but there hasn't been a 350% surge in mining, nor anywhere near it. Your statements that "miners will buy them all, no matter how many are made" and "the more they buy, the more they make" are both fallacious. When the hash rate climbs, ROI declines, and the payback period for a new card quickly becomes unattractive.
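
To make that ROI point concrete, here is a rough sketch of how the payback period on a mining card stretches as the network hash rate climbs. Every number in it is an illustrative placeholder for a hypothetical coin, not a quoted market figure.

```python
# Rough sketch: payback period on a mining card vs. network hash rate.
# All figures below are made-up placeholders for a hypothetical coin.
def payback_days(card_price, card_hashrate, network_hashrate,
                 daily_network_revenue, card_watts, usd_per_kwh):
    share = card_hashrate / network_hashrate          # card's slice of the network
    daily_revenue = share * daily_network_revenue
    daily_power_cost = card_watts / 1000 * 24 * usd_per_kwh
    daily_profit = daily_revenue - daily_power_cost
    return float("inf") if daily_profit <= 0 else card_price / daily_profit

# Same $700 card at 100 MH/s and $0.10/kWh; only the network hash rate doubles.
before = payback_days(700, 100e6, 100e12, 10_000_000, 300, 0.10)
after = payback_days(700, 100e6, 200e12, 10_000_000, 300, 0.10)
print(f"payback: {before:.0f} days -> {after:.0f} days once the hash rate doubles")
# -> payback: 75 days -> 164 days once the hash rate doubles
```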
You're actually looking at the wrong thing. GPUs are not used for Bitcoin mining -- they're used on NiceHash and people get paid in Bitcoin, yes, but the hash rates for SHA256 on GPUs are abysmal compared to ASICs. Some ASICs can do about 30 TH/s for SHA256 while consuming 1000W. A GPU can maybe do ... I don't even know these days. 10 GH/s? Or maybe it's less than that. Orders of magnitude difference, though, and so mining of BTC is only done by ASICs (or perhaps viruses mining on someone else's hardware).
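
To put that gap in concrete terms, here is the hashes-per-joule comparison for SHA-256 using the rough figures above; the GPU numbers are the post's own ballpark guess, not benchmarks.

```python
# Rough SHA-256 efficiency comparison (hashes per joule) using the figures above.
# The GPU numbers are this post's ballpark guess, not benchmarks.
asic_hashrate, asic_watts = 30e12, 1000   # ~30 TH/s SHA-256 ASIC at ~1000 W
gpu_hashrate, gpu_watts = 10e9, 250       # guessed ~10 GH/s SHA-256 on a ~250 W GPU

asic_hpj = asic_hashrate / asic_watts     # hashes per joule
gpu_hpj = gpu_hashrate / gpu_watts
print(f"ASIC: {asic_hpj:.1e} H/J, GPU: {gpu_hpj:.1e} H/J, ~{asic_hpj / gpu_hpj:.0f}x gap")
# -> ASIC: 3.0e+10 H/J, GPU: 4.0e+07 H/J, ~750x gap
```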

You need to look at Ethereum and some of the other coins. On July 2, 2020 the total ETH hash rate was about 187 Petahash/s (PH/s). Right now, the hash rate is nearly 400 PH/s (394.5 PH/s to be exact, and climbing very rapidly). That's largely due to the influx of GPU miners with new Ampere and other GPUs, and I think a bunch of older GPUs are being started up mining as well.

Do ASICs exist for ETH? Yes, but the best ones are 500MH/s at 1000W, which is only about 5-10 times more efficient than GPUs, and there aren't enough ASICs to go around. There may be long-term issues with the ASICs as well, as the DAG increases in size and potentially requires more RAM. Basically, GPUs are readily available (compared to the custom ASICs -- which, incidentally, are also hard to make because they use TSMC and TSMC is totally tapped out on capacity).
 

But doesn't that mean it should create a smaller GPU shortage than what Bitcoin caused two years ago? Because it's not as big as Bitcoin?
Also, will the 3050 and 3050 Ti with 4/6 GB of VRAM be used by miners just as much as their big brothers? Or will the lower VRAM make them unappealing?
 
Direct Bitcoin mining via GPUs hasn't been profitable in ages -- since 2014 at least. That's when ASICs showed up and suddenly there was a mad scramble to create alternative coins that could be mined via GPUs (and even CPUs). The GPU shortages are caused by all the altcoin mining, not Bitcoin mining, but the price of BTC going up pulls most altcoins up alongside it, which makes it more profitable to mine.

The 4GB cards can't effectively mine Ethereum these days and so they are not likely to be impacted -- though demand for GPUs from other sectors can still be an issue. 6GB and above are all viable for ETH mining right now, and I think it will be a couple of years before the DAG goes beyond 6GB.
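
A rough sketch of that DAG estimate, using the standard Ethash constants (1 GiB starting size, 8 MiB of growth per 30,000-block epoch); the current epoch number and ~13-second block time are approximations, so treat the output as a ballpark only.

```python
# Rough Ethash DAG growth sketch (ignores the small prime-size adjustment).
# Constants are the standard Ethash parameters; the current epoch and block
# time are approximations, so the result is a ballpark only.
GIB, MIB = 2**30, 2**20
DAG_INIT = 1 * GIB          # DATASET_BYTES_INIT
DAG_GROWTH = 8 * MIB        # DATASET_BYTES_GROWTH per epoch
EPOCH_BLOCKS = 30_000
BLOCK_TIME_S = 13           # assumed average block time

def dag_size_gib(epoch):
    return (DAG_INIT + DAG_GROWTH * epoch) / GIB

current_epoch = 390         # roughly where mainnet sat in early 2021 (assumption)
target = next(e for e in range(current_epoch, 10_000) if dag_size_gib(e) > 6)
years = (target - current_epoch) * EPOCH_BLOCKS * BLOCK_TIME_S / (365 * 24 * 3600)
print(f"now ~{dag_size_gib(current_epoch):.1f} GiB; >6 GiB around epoch {target}, "
      f"roughly {years:.1f} years out")
# -> now ~4.0 GiB; >6 GiB around epoch 641, roughly 3.1 years out
```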
 
It's more than just mining.

Corona (increased consumer demand AND production shortages), scalpers, AND mining.
Scalpers don't reduce availability, they raise the prices. I think the Covid impact is being overstated as well. In certain areas, like cheap laptops, business computers, and webcams, the impact has been significant. For $500+ video cards? I doubt Covid has significantly driven up demand for high-end GPU purchases.
 
I don't buy this kind of gear from eBay, and I certainly don't consider scalpers to be official barometers of component prices. Only idiots are paying scalper prices currently. It's unreal.
 
Scalpers don't reduce availability, they raise the prices. I think the Covid impact is being overstated as well. In certain areas, like cheap laptops, business computers, and webcams, the impact has been significant. For $500+ video cards? I doubt Covid has significantly driven up demand for high-end GPU purchases.

Yes...Covid excuses fall apart when you consider that AMD had its best year yet in the year of "Covid" excuses...😉
 
It's not this, it's not that, blah blah.
We have at least one person stating this is "intentional" on the manufacturers' part, but offering no reason why they would do that.

Yet, here we are.
Little or nothing available at actual retail price, or from the typical retail outlets.

I'm sure some of you geniuses will espouse the real reason. But I haven't seen it yet.
 
I think it's a broader problem than just the newer cards (although they're certainly one of the causes). A while back I was looking at a GTX 970 Founders Edition to SLI in one of my rigs for fun. They were running about $120 USD on eBay. Now they're sitting at $170+. GTX 1080s were listing for $320, and are now sitting at $450+.

And remember, these are used cards a few generations old, and they're all up by about 40% or so. Ugh. Hopefully these prices come back down to earth.
 
It can reduce availability as well.

Not all warranties are transferable. Buying from that fool on fleabay, you are not the original purchaser.
Would YOU buy a product at 50% over retail, with no warranty? I know I wouldn't.
I wouldn't buy at 50% over retail even with a warranty. That doesn't mean the card isn't still on the market for sale.
 
Why not do at least a little math before making off-the-cuff statements?

2.2 billion gamers worldwide * 200 watts average * 16 hrs/wk * 52 wks/yr / 1,000 (W to kW) / 8,141 kWh per ton of coal = 45 million tons of coal. Per year. Every year.
Sure, the math can work in your favor when you are just making up random numbers out of thin air.

There might be over 2 billion "gamers" worldwide, but a majority of them are doing their gaming on mobile devices. The number of consoles sold each generation, for example (specifically performance consoles, not low-powered devices like the Switch or Wii that only draw 10-15 watts), only adds up to around 150 million units in total. The number of those gaming regularly on relatively high-powered gaming PCs is harder to ascertain with any degree of accuracy, but I think it's safe to say those numbers are grossly overestimated. And the total power used by an individual graphics card in those scenarios is still only a small fraction of the power that would be used for mining on the card 24/7. A system might draw more power while playing a demanding game, but that's typically only for a small portion of any given day.
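
To illustrate the "small fraction" point, here is a quick comparison of the same card gaming part-time versus mining around the clock; the wattage and hours are illustrative assumptions.

```python
# The same card gaming part-time vs mining around the clock, at an assumed draw.
card_watts = 200
gaming_hours_per_week = 16
mining_hours_per_week = 24 * 7

gaming_kwh = card_watts * gaming_hours_per_week / 1000
mining_kwh = card_watts * mining_hours_per_week / 1000
print(f"gaming: {gaming_kwh:.1f} kWh/wk, mining 24/7: {mining_kwh:.1f} kWh/wk "
      f"(~{mining_kwh / gaming_kwh:.1f}x more)")
# -> gaming: 3.2 kWh/wk, mining 24/7: 33.6 kWh/wk (~10.5x more)
```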

Also, will the 3050 and 3050 Ti with 4/6 GB of VRAM be used by miners just as much as their big brothers? Or will the lower VRAM make them unappealing?
Even if something like a 3050 4GB managed to be less attractive to miners, it would still have its pricing and availability heavily affected by mining, since lots of people building systems or otherwise in the market for a graphics card would resort to one of those in the absence of anything higher-end within their budget. Just look at current lower-end 4GB cards like the GTX 1650 and 1650 SUPER for an idea of what to expect. Their MSRPs are $150 and $160 respectively, but they are mostly selling for around double that currently from third-party sellers, even pre-owned cards. If a 3050-class card comes out soon with performance close to that of a 2060, I would expect it to be hard to find for a price much lower than what those are selling for now, that is, upward of $500.
 
It's not them. The root cause is limited capacity at TSMC and Samsung. Why? Because of mobile phones. The bulk of the capacity goes into manufacturing mobile phone chips and memory.

That is the story they tell us, yes. But is it true? What was the story with the 1000 series cards? I am starting to think more and more that both companies are doing this on purpose, to increase revenue and extend the lifecycle of their products.
 
I am just wondering: with all the miners scooping up cards left and right, is it really necessary to have articles on this site on a daily basis talking about how to optimize and mine crypto?
To me it's like throwing gasoline onto a fire.

Yes, yes, it attracts revenue. But is it the kind of revenue you want to attract?
 
What was the story with the 1000 series cards?
Basically the same story as it is today: AMD and Nvidia were maxed out on their wafer orders.

The only difference between then and now is that back then, AMD and Nvidia were skeptical of how sustainable crypto-mining was going to be, so they ramped wafer orders slowly instead of rushing to get every possible wafer made. Today, TSMC and Samsung are maxed out, so there are no more wafers available for either of them to get more chips made even if they wanted to.
 
You're actually looking at the wrong thing. GPUs are not used for Bitcoin mining..
Yes, a point I've made in other threads. However I didn't even want to open up the GPU/FPGA/ASIC can of worms.

On July 2, 2020 the total ETH hash rate was about 187 Petahash/s (PH/s). Right now, the hash rate is nearly 400 PH/s
400 PH/s equates to 0.4 EH/s, meaning Ethereum is running about 0.26% of the hash rate that Bitcoin is. It's not consuming 100% of Ampere production capacity; it and all other mining combined might be consuming half. Also, Ethereum's value is up 450% over the period you mention, and by your figures, the hash rate is already at roughly 210% of its July level, meaning it's not far from balancing out that rise. Obviously we could continue to see similar rises in value, but barring that unlikely event, the demand for new hash capacity is going to return to normal in the next 3-4 months.
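
Spelling that comparison out with the figures quoted in this thread (keeping in mind, as noted above, that Ethash and SHA-256 hashes aren't directly comparable):

```python
# The comparison above, using the figures quoted in this thread.
btc_hashrate_ehs = 150        # Bitcoin network, EH/s (SHA-256)
eth_hashrate_phs = 394.5      # Ethereum network, PH/s (Ethash)
eth_hash_july_phs = 187       # Ethereum network on July 2, 2020, per the post above

share = (eth_hashrate_phs / 1000) / btc_hashrate_ehs
growth = eth_hashrate_phs / eth_hash_july_phs
print(f"ETH hash rate = {share:.2%} of BTC's (different algorithms, so not directly comparable)")
print(f"ETH hash rate = {growth:.0%} of its July 2020 level")
# -> 0.26% of BTC's; 211% of the July 2020 level
```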
 
