News: Thanks to Miners and Scalpers, eBay Pricing for Ampere, RDNA2 GPUs Continues to Rise

There is not [sic] anywhere near 2.2 billion people who game an average of 832 hours a year
You still miss the point. Use whatever reasonable estimates you wish: you'll still reach an energy usage figure comparable to that used by mining. And each and every joule of it wasted for no good purpose whatsoever, save personal gratification. I'm not a big fan of crypto-currency (in fact I've spent countless posts exposing its flaws) but the fact remains it can at least claim some socially-useful benefit. Gaming has none, no matter how shrilly you scream otherwise. And yes, I myself average more than that 832 hrs/year gaming. I don't allow that to excuse absurdly puerile justifications for my hobby, however.

Those are not the numbers per-console. That is the combined total between Microsoft and Sony's consoles worldwide over the course of each generation, including enhanced and updated versions of the consoles.
Utterly false:

PS5: 5M
PS4: 115M
PS3: 88M
PS2: 155M
Xbox / Xbox 360: 106M
Xbox One: 51M
Wii/Wii U: 114M

Even with the above, I'm leaving off several hundred million consoles between the 1st-gen PS and the older Nintendo units, simply because their present footprint and/or power consumption is too low to count significantly. As for your belief that anything but the most recent console generation is "sitting around collecting dust", you are seriously underestimating the percentage of these units that are resold when the original owner upgrades. In North America, that percentage is low, but in Asia and Eastern Europe in particular, it tops 75%.
 
Utterly false:

PS5: 5M
PS4: 115M
PS3: 88M
PS2: 155M
Xbox / Xbox 360: 106M
Xbox One: 51M
Wii/Wii U: 114M

Even with the above, I'm leaving off several hundred million consoles between the 1st-gen PS and the older Nintendo units, simply because their present footprint and/or power consumption is too low to count significantly. As for your belief that anything but the most recent console generation is "sitting around collecting dust", you are seriously underestimating the percentage of these units that are resold when the original owner upgrades. In North America, that percentage is low, but in Asia and Eastern Europe in particular, it tops 75%.
Those numbers are pretty much exactly what I wrote. Around 170 million units per generation between Microsoft and Sony combined, with overlap and replaced hardware likely bringing the number of households down closer to 150 million. It's weird that you would combine Xbox and Xbox 360 numbers together though, since those were two completely different generations of consoles, with the original not selling particularly well, and the vast majority of sales going to the PS2 that generation. The same goes for the Wii combined with the Wii U. I guess that fits with the trend of manipulating numbers to make them appear higher than they really are though. : P

And again, the Wii draws less than 15 watts when gaming, so I'm not sure why you included it. I'm specifically referring to hardware that is capable of drawing something vaguely in the vicinity of that 200 watts you suggested. And that's still a fair amount higher than what many of these devices use.

I don't see evidence suggesting that a large portion of those older consoles are still seeing regular daily use averaging over 800 hours a year either. You are basing calculations on guesswork that seems a bit out of touch. The reason I pointed these things out is because you were calling someone out for making up numbers, but then did the same yourself under the guise of having data to back them up. It's not that I missed your point, just that you are using bad data to support your point.

You still miss the point. Use whatever reasonable estimates you wish: you'll still reach an energy usage figure comparable to that used by mining. And each and every joule of it wasted for no good purpose whatsoever, save personal gratification. I'm not a big fan of crypto-currency (in fact I've spent countless posts exposing its flaws) but the fact remains it can at least claim some socially-useful benefit. Gaming has none, no matter how shrilly you scream otherwise. And yes, I myself average more than that 832 hrs/year gaming. I don't allow that to excuse absurdly puerile justifications for my hobby, however.
And also, your point is kind of questionable. Gaming has "no socially-useful benefit" but cryptocurrency does? Cryptocurrencies are designed primarily as a means of making their creators and earliest adopters/investors money, and little else. As currencies, they have proven to be hugely unstable, with their value based mostly on speculation and market manipulation. Compare that to something like the dollar, or the euro, that tend to remain relatively stable from one year to the next, providing some amount of confidence in their value. The massive energy waste of cryptocurrency compared to those other monetary systems just makes the whole thing worse.

Gaming has been shown to provide many benefits, being a form of entertainment, and in many cases a means of socialization. As far as "wasting energy" goes, there are many other forms of entertainment and socialization that use more. Just driving somewhere nearby will tend to use more energy than a lengthy gaming session. And really, having a person be able to buy a single graphics card for gaming (or something like content creation) just seems like a less greedy use of the hardware than having someone buy up as many cards as they can get their hands on, either for burning through power for the sole purpose of generating a currency of questionable usefulness, or for reselling at prices that put them out of the reach of others.
 
Gaming has been shown to provide many benefits, being a form of entertainment, and in many cases a means of socialization
It's also been shown to cause severe addictive behavior, impairment to interpersonal relationships and educational/employment opportunities, increased anti-social behavior, and many more negatives, to the point where gaming disorder is classified as a mental health disorder by the US and the World Health Organization, with essentially every medical organization in the world recommending that parents limit their children's exposure to it. I don't see such warnings about coin mining. I won't even mention the long-term physical problems from the lack of exercise caused by heavy gaming.

Your points about the indeterminate utility of crypto-currency are accurate. However, it unquestionably provides at least some benefit to non-miners, such as those living in authoritarian regimes such as Venezuela, using it to protect themselves from hyperinflation.

I also note your hypersensitivity to the effect non-gaming uses of GPUs have on the price of gaming cards. But you utterly ignore the reverse effect. Every wafer NVidia allots to gaming GPUs is one less going to pro Ampere chips. Meaning your desire for a l33t gaming card is reducing availability of cards used for everything from cancer research to life-saving weather prediction. How can you be so callous? 😢

Just driving somewhere nearby will tend to use more energy than a lengthy gaming session.
We're comparing gaming to mining here, not driving. Don't deflect. And high-power systems use more energy than you admit. A 500 W system (people on this forum brag about systems pulling twice that) consumes, with PSU and line losses added, about 2.1 million joules per hour, or roughly 13 million joules over a long gaming session. That'll take a compact car several miles.

It's weird that you would combine Xbox and Xbox 360 numbers together ... I guess that fits with the trend of manipulating numbers to make them appear higher than they really are though. : P
Now you're simply being absurd. Let's try it both ways:

Xbox / Xbox 360: 106M
or
Xbox: 24M, Xbox 360: 86M

Neither is misleading ... to anyone with rudimentary math and language skills.
 
Every wafer NVidia allots to gaming GPUs is one less going to pro Ampere chips. Meaning your desire for a l33t gaming card is reducing availability of cards used for everything from cancer research to life-saving weather prediction.
Considering Nvidia makes far more money per professional chip they sell, I doubt they allow gaming cards to cut into their supply of professional ones. The main thing limiting access to those pro parts is the unnecessarily high prices they put on them, preventing research budgets from going as far.

And it's the unpredictable market created by crypto that's causing shortages of consumer graphics cards, not the far more predictable gaming market that manufacturers can plan for. That's not only hurting gamers, but also content creators and other professionals who use these cards for more constructive purposes. In any case, we are not talking about cards being used to create anything particularly useful in the case of cryptocurrency. It's all about cashing in on as much money as one can before the crypto market crashes and less-knowledgeable investors end up footing the bill.

The other points I'm not going to even respond on, since they should be pretty obvious, and have already been covered.
 
And it's the unpredictable market created by crypto that's causing shortages of consumer graphics cards, not the far more predictable gaming market that manufacturers can plan for.
No, it is the industry-wide supply chain disruption combined with simultaneous launches of multiple high-volume products using chips from the same fabs and increased demand from covid that caused the shortage long before crypto-mining bounced back. Crypto is the cherry on the sundae ensuring GPUs will remain scarce for the foreseeable future.
 
No, it is the industry-wide supply chain disruption combined with simultaneous launches of multiple high-volume products using chips from the same fabs and increased demand from covid that caused the shortage long before crypto-mining bounced back. Crypto is the cherry on the sundae ensuring GPUs will remain scarce for the foreseeable future.
It's undoubtedly some combination of factors at play, but it seems to largely be mining ramping up the prices at this point. When old used cards that were around $150 new for quite some time are selling for twice as much now with no warranty, there's obviously something more going on than just the usual gaming market. Relatively few people are spending that kind of premium on cards for gaming, but rather because they are hoping to make money off of them through mining. The same thing happened a few years back, and there was no major "supply chain disruption" happening then to affect prices. It was pretty much all mining, and that likely accounts for most of the current pricing situation as well. You don't see other components seeing similar shortages and massive price hikes across all product lines currently. There were some shortages for things like power supplies and motherboards due to unexpected demand and supply-chain issues immediately following the early stages of the lockdowns, but by this point, that should have much less of an impact.
 
So Mr. Moore did a piece on this on his Moore's Law channel.

Mr. Moore, if you are reading, you made several erroneous assumptions about why mining could not be controlled.

Assumption 1) You have to develop two unique GPUs

No, you do not. You only need to develop one GPU. The difference between the mining GPUs and the non-mining ones is a burnt internal fuse which prevents mining.

Assumption 2) Mining farms don't need official drivers, as they create their own, allowing them to bypass all security measures

First off, the drivers are not 100% open source, not even on AMD. NVIDIA is even worse. There are proprietary pieces of code in there. Second, not all security is baked into the drivers. That is just one prong.

Let me emphasize: Security can be baked into GPU ROM or circuit pathways that cannot be rewritten. This requires absolutely no drivers and is built into every chip that would be released at a minimal cost of development.

Let me give you several examples of how this would work:
  1. GPU boots up and enumerates and polls the bus to see how many cards are present. If it sees more than one card, it will shut down.
  2. Upon boot-up, if a GPU is designated a gaming-ONLY GPU, it will issue a 512-bit cipher challenge-response to the OFFICIAL drivers. The drivers are digitally signed and certified, and this is used with the internal key against the random 512-bit GPU challenge to return an answer. If the answer is incorrect, the GPU shuts down to prevent mining. Since AMD drivers are used for gaming systems, and have to be certified with the proper signature to return the proper answer, they can only be used on gaming systems and NOT mining farms (a rough sketch of this idea follows below). With official drivers in place, they can look for specific code sequences common to mining operations DURING COMPUTE COMPILE TIME with near-zero impact (only an initial delay of pattern matching using PAULA trees or skip-match algorithms)
 
Assumption 1) You have to develop two unique GPUs

No you do not. You only need to develop one GPU. The difference between the mining GPU's and the non mining ones are a burnt internal fuse which prevents mining
You cannot "disable mining" with fuses because mining algorithms running on GPUs are shaders just like everything else that runs on GPUs and any arbitrary restrictions to what the GPU is able to run is almost guaranteed to have unintended side effects on other things when games and other programs use GPUs to run mesh shaders, vertex shaders, texture shaders, AI, physics, CODECs and arbitrary GPGPU workloads.

GPU boots up and enumerates and polls the bus to see how many cards are present. If it sees more than one card, it will shut down.
Kind of pointless when crypto profitability has reached the point where miners are willing to buy whole laptops to add single GPUs to their farm. Even if AMD and Nvidia wanted to make their GPUs enumerate the system to prevent multiples, that could easily be bypassed by setting up virtualization. You could also boot a system with the extra GPUs powered down and bring them up afterward.

With official drivers in place, they can look for specific code sequences common to mining operations DURING COMPUTE COMPILE TIME with near-zero impact (only an initial delay of pattern matching using PAULA trees or skip-match algorithms)
Hashing algorithms consist mainly of multiplications, additions and lookup tables, just like many other everyday shaders. A detection algorithm sensitive enough to detect all possible permutations of a given hash algorithm would be likely to also trip on many ordinary shaders as games and applications use GPU acceleration for more things. For example, with DirectStorage, games may end up using shaders to decrypt, decompress and hash-check things as they get loaded into VRAM.
 
You cannot "disable mining" with fuses because mining algorithms running on GPUs are shaders just like everything else that runs on GPUs and any arbitrary restrictions to what the GPU is able to run is almost guaranteed to have unintended side effects on other things when games and other programs use GPUs to run mesh shaders, vertex shaders, texture shaders, AI, physics, CODECs and arbitrary GPGPU workloads.


Kind of pointless when crypto profitability has reached the point where miners are willing to buy whole laptops to add single GPUs to their farm. Even if AMD and Nvidia wanted to make their GPUs enumerate the system to prevent multiples, that could easily be bypassed by setting up virtualization. You could also boot a system with the extra GPUs powered down and bring them up afterward.


Hashing algorithms consist mainly of multiplications, additions and lookup tables, just like many other everyday shaders. A detection algorithm sensitive enough to detect all possible permutations of a given hash algorithm would be likely to also trip on many ordinary shaders as games and applications use GPU acceleration for more things. For example, with DirectStorage, games may end up using shaders to decrypt, decompress and hash-check things as they get loaded into VRAM.
Okay, invalid.

Do you want me to diagram it out for you via control blocks and flow diagrams?

The recognition software uses common PAULA trees and match algorithms like a lot of antivirus signatures do. I'll also include the links to these techniques via Knuth.
 
The recognition software uses common PAULA trees and match algorithms like a lot of antivirus signatures do.
Despite all of the fancy AV algorithms that have been invented, AV signatures still need to get updated daily due to virus writers making changes just large enough that algorithms cannot reliably detect them as known variants anymore. I doubt AMD and Nvidia want to commit to updating their mining-detection algorithms on a daily basis, and miners obviously won't update their drivers to versions that interfere with their mining software.

At the end of the day, it'll most likely be gamers that will get screwed over the worst.
 
Despite all of the fancy AV algorithms that have been invented, AV signatures still need to get updated daily due to virus writers making changes just large enough that algorithms cannot reliably detect them as known variants anymore. I doubt AMD and Nvidia want to commit to updating their mining-detection algorithms on a daily basis, and miners obviously won't update their drivers to versions that interfere with their mining software.

At the end of the day, it'll most likely be gamers that will get screwed over the worst.

Mining algs on websites are currently blocked daily by most AV companies now.

And if AMD and NVIDIA make $1,000+ per chipset on mining-specific parts, then that more than pays for the one employee needed to write new signatures every couple of days. AMD and Nvidia have the capability to run all the same cryptocurrencies in a lab. And it will likely take less time than a recoding and redistribution of mining algs. That's expensive downtime.

AMD wins, Nvidia wins, with much fatter margins. Miners have a throttled market, so demand doesn't outstrip supply. Demand goes down because AMD and Nvidia control how much miners make at the end, based on their chip pricing. If demand gets too low, the mining chips become lower in price due to natural market demand.

The only people that would be affected are bigger mining farms, as AMD and Nvidia control distribution.

Gamers get scraps but at least they get them at reasonable prices and slightly greater availability.

Your argument about game performance getting screwed over by antivirus-type algorithms is demonstrably false for obvious reasons. If that were the case, we would see serious fps losses in games today because of antivirus signatures on CPU code. Viruses use the same uops as games. You catch it during the compile phase, using PAULA and skip matches along with statistical heuristics for morphological changes. If it fails, you simply refuse to load the shader with the appropriate confirmation signature. Raw dumps without an approved signature (challenge-response part 2) will be rejected by the GPU.

Do you want the flow control blocks in a Visio format of how all of it would work?

I wrote low-level control systems in firmware and security for over 15 years. I also reverse engineered stuff for over 15 years. I have now augmented that knowledge with training from pentesting groups and AI. I see how they break these systems. And I know how to prevent it.

Although my main job is designing thermodynamic systems and teaching factory machines to build customer-specific engineering hardware, I have led internal classes on AI and IoT systems and have led several new initiatives for the company I work for based on my independent research.
 
Mining algs on websites are currently blocked daily by most AV companies now.
AMD and Nvidia won't be updating their anti-mining detection on a daily basis, since it costs millions of dollars for them to put a WHQL driver together, and as I wrote earlier, miners can simply stick with older drivers that won't block their current mining algorithm; no amount of fancy driver or shader signing will change that.

The entire premise of using software updates to block mining falls flat on its face when it can be bypassed by simply freezing driver updates to a version compatible with whatever mining software you are using.
 
AMD and Nvidia won't be updating their anti-mining detection on a daily basis, since it costs millions of dollars for them to put a WHQL driver together, and as I wrote earlier, miners can simply stick with older drivers that won't block their current mining algorithm; no amount of fancy driver or shader signing will change that.

The entire premise of using software updates to block mining falls flat on its face when it can be bypassed by simply freezing driver updates to a version compatible with whatever mining software you are using.
Bull. You don't need a full WHQL driver for signature files.

"Nvidia explained in today's announcement: "RTX 3060 software drivers are designed to detect specific attributes of the Ethereum cryptocurrency mining algorithm, and limit the hash rate, or cryptocurrency mining efficiency, by around 50 percent." The company didn't say whether or not it plans (we'll get to that in a moment), but Nvidia's latest RTX offerings are currently at the top of our GPU benchmarks hierarchy, and it would be nice if more of them were actually used to play games"

Well would you look at that.
 
Bull. You don't need a full WHQL driver for signature files.
That is of absolutely no material importance: if I know that drivers v390+ have an anti-mining thing in them but v389 does not, then I simply use v389 and whatever Nvidia does at any future point in time has absolutely zero effect on my mining farm. If Nvidia releases new GPUs that are only compatible with drivers v400, then I update my mining software with algorithms that v400 cannot detect and run v400 drivers only, whatever Nvidia does in the future has zero effect on my mining farm.

The only ones getting prevented from mining are gamers running the latest drivers.
 
That is of absolutely no material importance: if I know that drivers v390+ have an anti-mining thing in them but v389 does not, then I simply use v389 and whatever Nvidia does at any future point in time has absolutely zero effect on my mining farm. If Nvidia releases new GPUs that are only compatible with drivers v400, then I update my mining software with algorithms that v400 cannot detect and run v400 drivers only, whatever Nvidia does in the future has zero effect on my mining farm.

The only ones getting prevented from mining are gamers running the latest drivers.
Is it solely in the drivers though? In the other article about the guy that already tested a 3060, he wasn't even using 3060 drivers. He installed publicly available 3000 series drivers and told Windows he was using a 3070, yet he still saw reduced mining performance. Currently available Nvidia drivers don't handicap mining performance for any available 3000 series card, yet they somehow did for this guy's 3060 while being detected as a 3070 by Windows, which would tell us this isn't purely a driver implementation. It would be interesting to see whether mining performance would still be affected if this guy used older 3000 series drivers.
 
Is it solely in the drivers though? In the other article about the guy that already tested a 3060, he wasn't even using 3060 drivers. He installed publicly available 3000 series drivers and told Windows he was using a 3070, yet he still saw reduced mining performance.
Nothing too surprising about performance being wonky when the drivers are tricked into detecting hardware as something different from what is actually installed. I would expect that to actually crash the GPU/drivers outright when drivers try to send work to shader units that should have been disabled.
 
Is it solely in the drivers though? In the other article about the guy that already tested a 3060, he wasn't even using 3060 drivers. He installed publicly available 3000 series drivers and told Windows he was using a 3070, yet he still saw reduced mining performance. Currently available Nvidia drivers don't handicap mining performance for any available 3000 series card, yet they somehow did for this guy's 3060 while being detected as a 3070 by Windows, which would tell us this isn't purely a driver implementation. It would be interesting to see whether mining performance would still be affected if this guy used older 3000 series drivers.

It's a combination of hardware and software that work together. If the hardware is written to be gaming-only, then it can query the software driver to make sure it's a certified driver. If the hardware doesn't get the proper response, then nerf it.

I'm not saying this is what nvidia does but it does make sense to do it this way.

And it's pure coincidence that this article came out when it did. But it does pretty much prove it is more than possible.

Clearly Moore was incorrect. I'm not looking down my nose at him. We all make mistakes. But I've learned with age that people can come up with elegant solutions to complex problems. So just because you can't think of a solution doesn't mean others can't.

I just saw his channel today and he's making excuses, trying to shift blame and redirecting. "This is a masterful move by Nvidia to screw over gamers." *sighs* Gamers were already getting screwed. If you use your PC for gaming and mining, you are STILL a miner. The vast majority of gamers/researchers/content creators ARE NOT miners. And if you are a gamer & miner, you aren't using your GPU the most effectively. In fact, you risk losing money based on decreased efficiency as well as decreased GPU life (depending on the current coin price).
 
Nothing too surprising about performance being wonky when the drivers are tricked into detecting hardware as something different from what is actually installed. I would expect that to actually crash the GPU/drivers outright when drivers try to send work to shaders units that should have been disabled.
It has been confirmed by Nvidia that the limiter isn't purely in the driver and involves the vBIOS.

https://www.tomshardware.com/news/nvidia-vbios-lock-hashing-rate-rtx3060
 
It's not so much a surprise as it is a reveal of actual hard data from eBay. Prices today are 10-35% higher for the same parts compared to prices in early January. And it's likely to get worse before it gets better.

I sold my 1080 Ti (with EVGA hybrid cooler) in early December for $400... and I thought I was giving someone a great deal given the $699 + $179 retail I paid back in April 2017.

Guess I should have waited... 😆

Seriously though... we are talking chump change in price difference as far as I'm concerned... I'm just amazed that people are this desperate that they will continue to keep scalpers in business by paying absurd prices.

Exactly. It's frustrating, no doubt. But at the end of the day, miners are just another customer base, and the idiots buying from scalpers are the bigger problem. I was planning on upgrading my PC (7 years old) with an RTX 3080 / Ryzen 5900X, but I'd rather wait for a year or more if needed than pay scalpers.

My new build in sig was at 100% retail price... the only thing is I had to go with Intel 10th gen instead of AMD, which wasn't available outside of scalpers. Not a huge deal... I don't care how good the 5900 series is on paper... they aren't worth double the price of Intel 10th gen, which is what the scalpers were charging a couple of months ago.

I got the 3090 with a lucky BB purchase in November... and put the rest of the parts together over the next month and then built the PC. Outside of slow holiday delivery times it wasn't a big deal.
 
What's your opinion on scalpers, but for products that are not scarce?

My brother is super lazy and buys stuff off Amazon Prime at inflated prices just because it's easier.

It really winds me up how stupid he is, but he likes the easiness of it all. This is not for PS5s or other rare stuff. I am on about everyday stuff that he could visit his local shop for.
 
Why would eBay want to stop sales of things where it makes a 10% profit? It's the entire business model eBay was founded on: Get people to sell stuff and take a modest cut. eBay likes to pretend it wants to stop scalpers, but really it's just trying to stop sales that will end up in returns (or bogus sales in the first place). If anything, the only real change eBay is likely to make is a ban of anyone caught trying to sell an image or box of a hot commodity. People actually selling GPUs at higher than usual prices? More money for eBay!

That's true. We are doomed, haha.
 
At some point this <Mod Edit> up market ends.
All these high-end cards will make their way back onto the market as decently cared-for, used options for those who just have to have the latest GPU. In January 2020 I bought an MSI 1070 Ti (a favorite of miners a couple of years ago) for $250 on eBay in excellent condition. Cards used for mining, having been run in a steady state at 65 degrees C, have a lot of life left. Now used EVGA-level Tis are selling for $500 on eBay, basically the NIB price 3 years ago. That same card is over $1,000 new on Amazon, Newegg, etc., for a 3-year-old card that released at $450! As for the 3000 series cards, gamers should sit this one out.
 
I don't buy this kind of gear from eBay, and I certainly don't consider scalpers to be official barometers of component price. Only idiots will buy and pay scalper prices currently. It's unreal.
What is a "scalper price"?
My guess is that NIB price for a 3-year-old used card qualifies. Retailers selling the same new card at 30% above release price seems like scalping too, but maybe it isn't.
I'd pay half price, but that's gonna be a while, so I'll wait.