All Eyes On Graphics Card Shortage, Few Answers Forthcoming

Status
Not open for further replies.

why_wolf

Honorable
Most likely the miners were hitting up the other brands first and depleting their stock before moving on to EVGA. Unless, for some reason, EVGA placed a much larger order for GPU cores ahead of time than the other OEMs did.
 

valeman2012

Distinguished
Apr 10, 2012
They are forgetting that gamers still use these cards for their main purpose.

AMD and Nvidia need to prevent mining on these cards somehow and force miners to use the special mining cards instead once those are released.
 

clonazepam

Distinguished
Jul 10, 2010


Nobody is being forgotten. Many card manufacturers are watching their peripheral sales slow. Fewer cards bought by gamers means slower adoption of G-Sync and FreeSync monitors, as well as of the many other higher-margin branded items that get attached to graphics card sales. Miners simply aren't interested in the extras.

They can't really gimp card performance for mining without major repercussions, and they can't get enough chips for their current SKUs, let alone make special cards aimed at mining.
 

ARICH5

Distinguished
Oct 13, 2009
That raises an interesting question: is there something GPU makers could OMIT from their GPUs that would make the card unattractive to miners but still benefit the gamer?
What specifically in the GPU does the miner use, and the gamer not use, that could be omitted?
 

SinxarKnights

Distinguished


Really, there is nothing you could take out that would benefit the gamer. Miners use these cards for the parallel processing power of CUDA (and OpenCL on AMD cards). There is absolutely no reason to gimp hardware because of a short-term spike in demand, regardless of the reason.
 

jsantosv

Prominent
Jul 7, 2017
Miners don't use anything specific in the card. Every cryptocurrency has an algorithm that is optimized to run on certain hardware. Dagger-Hashimoto (Ethereum's algorithm) is designed to be bound by GPU memory: the faster the memory, the better. The GeForce 10 series is pretty good for this, especially because it's energy efficient. However, the GTX 1080's memory is comparatively weak for ETH mining and 1070s are expensive, which is why the 1060 3GB hits the sweet spot. That is, until the ETH DAG no longer fits in 3GB of memory, at which point all these cards immediately become obsolete for ETH mining (this is predicted to happen around Dec/Jan next year, but nobody is certain). Only cards with 4GB or more will likely keep working for ETH mining.
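The DAG-size cutoff mentioned above can be estimated from the Ethash spec constants: the dataset starts at roughly 1 GiB and grows by roughly 8 MiB per 30,000-block epoch. A rough back-of-the-envelope sketch (this ignores the spec's small prime-number adjustment, so sizes are approximate, and block timing is not modeled):

```python
# Approximate Ethash DAG growth from the spec constants.
DATASET_BYTES_INIT = 2**30      # ~1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23    # ~8 MiB added per epoch
EPOCH_LENGTH = 30000            # blocks per epoch

def dag_size_bytes(epoch: int) -> int:
    """Approximate DAG size at a given epoch (ignores prime adjustment)."""
    return DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch

def first_epoch_exceeding(limit_bytes: int) -> int:
    """First epoch whose DAG no longer fits in limit_bytes of memory."""
    epoch = 0
    while dag_size_bytes(epoch) <= limit_bytes:
        epoch += 1
    return epoch

epoch = first_epoch_exceeding(3 * 2**30)   # 3 GiB card
print(epoch)                    # 257
print(epoch * EPOCH_LENGTH)     # approximate block height where 3GB cards drop out
```

In practice 3GB cards drop out slightly earlier than this, since the driver and OS reserve some of the card's memory.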

However, there are hundreds of crypto coins out there that can be mined with the GPU core. ETH just happens to be the most profitable at the moment.

To control supply against mining, I think GPU makers would have to develop some kind of API designed specifically for mining hardware and make all the mining software developers comply with that standard. Given the open-source nature of cryptocurrencies, this is unlikely to happen. They could also block mining software via GPU drivers on some models, but somebody would eventually come along and modify the drivers.

Stores are restricting the number of cards you can buy at one time, but you can buy cards of different models in the same order. Miners only care about the chip, the amount of memory, and the cooling. An MSI Gaming 6GB does exactly the same work as a GTX Gaming X 6GB, and you can buy both (I just did on Newegg).

So yeah, it's a tough one. I'm sure they will come up with something to balance things out. Gaming and crypto mining aren't going away, so they must find a solution.
 

nzalog

Respectable
BANNED
Jan 2, 2017


The more I hear these "gamers" talk, the more I dislike them, and I'm actually a gamer myself.

When mining slows down and every manufacturer has a huge abundance of stock, leading them to significantly lower prices, will you and these other "gamers" stop crying then?

There are positives and negatives to everything, even if it may not be immediately apparent...
 

falchard

Distinguished
Jun 13, 2008
There is nothing to really worry about here. Once these cards have outlived their usefulness, the crypto miners will dump them back on the market at large discounts to recoup costs.
There is no such thing as a crypto-mining-specific card. The only thing manufacturers can do is leave superfluous components off the board, since they would not be needed, reducing the board's cost. There is no crypto-specific workload; the workload is specific to the currency, and it typically favors GPUs due to their computational performance.
 

SinxarKnights

Distinguished


That's what I'm saying as well. Gimping graphics cards because of a short-term shortage is extremely short-sighted. Thankfully, Nvidia and AMD are focused on profits (weird saying that, but in this case it is a good thing), meaning they don't care who uses their cards or for what purpose.
 


This almost certainly will not happen. Please read the article more carefully. One company has already said it will not increase production because it does not want an overabundance of stock when people lose interest in cryptocurrency mining again. It is highly doubtful AMD or Nvidia will either, for the reasons stated in the article. And GPUs on the resale market will also be next to useless, as they will be extremely worn out from mining.
 

hahmed330

Reputable
Apr 6, 2014
Why not release fully built 8-GPU and 16-GPU mining servers for the general masses? (They already did for those greedy miners running hundreds of GPUs.) It would further ease the pressure on demand; a complete package would be especially attractive for its ease of use and cost.
 

nzalog

Respectable
BANNED
Jan 2, 2017




Right, the all-knowing article, with definitive statements like "One company explicitly told us that it does not plan to increase production" and "It's notable that some OEMs refused to comment on the shortage at all." I guess we now know what the entire industry will do. Solid assumptions...

I really doubt this won't affect anyone's decisions about how to profit from the situation, with demand eventually being overshot when it starts to ramp down.

When people can buy a "worn out" GPU, replace some thermal pads and thermal paste, and pay half the price of a new one, prices will fall. Hardware does not wear out the way you describe.
 
Computer hardware does not last forever. The harder a piece of hardware has been used, the closer it is to dying. For example, GPUs that have been used 24/7 for several months at full capacity are likely close to the point where they will no longer function, even if they appear to work fine now.

If you don't believe that is true, then you must be under the impression that computer hardware lasts forever. If that is the case, I don't know what to say to you.

As for what companies will do, why would they choose the path with the most risk? Right now AMD, Nvidia, and the various board partners have two real choices in regards to the matter.
1.) Keep production and costs about the same and run the risk of a shortage.
2.) Increase production and financial investment to create more GPUs, which runs the risk of having far too many GPUs in stock.

If they choose option 2 and become overstocked on GPUs, they then need either to sell them at a lower price that may mean a loss or a decreased profit margin, or to pay to store the GPUs until they can be sold off at regular prices. Either option increases costs for the companies while reducing overall profit margins.

Option 1, however, doesn't see an increase in spending to produce GPUs, doesn't run the risk of selling products at lower prices and reduced profit margins, and doesn't force the companies to pay to store excessive amounts of products. Plus, the shortage is likely good for all of the companies involved, as they are able to sell all of their products with virtually nothing left on the shelf, and they are selling at higher prices.

If you were a company, which would you do? Obviously the one that offers the greatest financial return for the least investment.
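The two options above can be put into a toy expected-value sketch. Every number here (the probability of a demand crash, the per-unit margin, the per-unit overstock loss) is made up purely for illustration; the point is only the shape of the trade-off being argued, not a forecast:

```python
# Toy expected-value comparison of the two options described above.
P_DEMAND_CRASHES = 0.5      # assumed chance mining demand collapses

def expected_extra_profit(extra_units: int, margin: float, overstock_loss: float) -> float:
    # If demand holds, each extra unit earns its margin; if demand
    # crashes, each extra unit costs money (fire sale or storage).
    good = extra_units * margin
    bad = -extra_units * overstock_loss
    return (1 - P_DEMAND_CRASHES) * good + P_DEMAND_CRASHES * bad

hold_steady = expected_extra_profit(0, 50.0, 80.0)        # option 1: no extra risk
ramp_up = expected_extra_profit(100_000, 50.0, 80.0)      # option 2: ramp production
print(hold_steady, ramp_up)   # 0.0 vs a negative expected value with these numbers
```

With an overstock loss larger than the margin and an even-odds crash, ramping production has negative expected value, which matches the argument that option 1 is the safer play.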
 

SinxarKnights

Distinguished


I don't understand your reasoning, though. The cards are designed and tested to run at their max TDP for years, but somehow mining for a few months makes them break? Of course there are defects that cause premature failures. Are you saying they purposely design their products to fail from mining, which doesn't even max out the card? If that is the case, again, that is a huge defect and justification for a class-action lawsuit against both AMD and Nvidia. If you have proof that mining damages graphics cards, or that they are designed to fail rapidly (I can't think of any reason they would do this other than financial gain), then speak up.

Assuming ignorance because I don't agree with your conjecture is quite asinine, especially coming from a Tom's staff member.
 


That comment was in regard to the question of how a GPU gets worn out. Your statement directly asked, "how does a gpu get 'worn out'. It either works or it doesn't." My post explains that: a GPU gets worn out by use until it eventually breaks. That is something everyone should be aware of. I did not assume you or anyone else was ignorant for disagreeing with what I said about avoiding GPUs that were used for mining, nor would I assume that if someone disagreed with me about how companies will react to the shortage. If someone does not acknowledge that computer components eventually break, however, then it is clear they need to do a little more research.
 

nzalog

Respectable
BANNED
Jan 2, 2017
541
0
2,160
61
Hardware doesn't last forever, but it generally lives WAY past its relevance. That's why things like server virtualization are viable. As long as there were no inherent flaws in the hardware design, it will last. I honestly can't remember the last time I saw a piece of tech wear out before it was time to upgrade.
 

nzalog

Respectable
BANNED
Jan 2, 2017


Sorry, that reply was not directed at you and was a bit unclear. I'm actually agreeing with you. I'm just saying that I'd rather buy a used piece of hardware at a huge discount and then replace the thermal grease and pads. I don't think that is even necessary in most cases, though, unless the card is obviously running too hot.
 
A GPU can die: typically a voltage regulator gives out, or (rarer, but it happens) a shader unit fries due to electromigration, or some other component gets too hot and fries, especially when it's not "cooked" properly (i.e., not allowed to reach higher temperatures progressively). Mining cards are packed together in often badly cooled open cases and run at max power for weeks on end. While your typical gaming GPU can endure four years of running at max load 6 hours a day, these cards accumulate that much wear in 6-8 months. When resold, they are thus very close to their limit.
 

nzalog

Respectable
BANNED
Jan 2, 2017
I guess it depends on what you're mining. Ether does not benefit from extreme GPU speeds, so people actually tend to under-volt and even under-clock their cards. Cards that a gamer would run at 200W+ end up using ~90W.
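The under-volting logic can be made concrete with an efficiency-per-watt comparison. The hashrates and wattages below are illustrative assumptions, not measurements; the point is that a memory-bound workload like Ethash loses little hashrate when core power is cut, so efficiency roughly doubles:

```python
# Hypothetical stock vs under-volted settings for an Ethash miner.
stock = {"watts": 200, "mh_s": 24.0}   # assumed gaming-style power target
tuned = {"watts": 90,  "mh_s": 22.0}   # assumed under-volted mining target

def efficiency(cfg: dict) -> float:
    """Hashrate per watt (MH/s/W), the number miners optimize for."""
    return cfg["mh_s"] / cfg["watts"]

print(round(efficiency(stock), 3))   # 0.12 MH/s per watt
print(round(efficiency(tuned), 3))   # 0.244 MH/s per watt
```

Since electricity is the miner's main recurring cost, the tuned profile wins even though its raw hashrate is slightly lower.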
 
I think the point being made is about how hard the cards are run when used for mining, which is usually 24/7. Graphics cards are generally designed to last for years, and they do, but that's under more normal circumstances. There's no set 'average' for how much time a gamer spends playing games; it's too vague. Some gamers only play a few hours a week; some may play 20 hours over a weekend.

Just throwing a rough figure out, let's say 20 hours a week; that's 2 hours a day on weekdays and 5 hours each day on the weekend. Compared to a GPU being used 24/7 in a mining machine, after just 8 weeks of mining the GPU will have seen the same use as that gamer would give it in almost a year and a half. Twelve weeks of 24/7 mining would use the GPU at the same rate that gamer would in almost two years.
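The arithmetic in that rough figure checks out:

```python
# Equivalent-wear comparison: a gamer at 20 hours/week vs a card
# mining 24/7, using the figures from the post above.
GAMER_HOURS_PER_WEEK = 20
MINER_HOURS_PER_WEEK = 24 * 7   # 168

def gamer_weeks_equivalent(mining_weeks: float) -> float:
    """How many weeks of 20h/week gaming equal this many weeks of 24/7 mining."""
    return mining_weeks * MINER_HOURS_PER_WEEK / GAMER_HOURS_PER_WEEK

print(gamer_weeks_equivalent(8) / 52)    # ~1.3 years of gaming-equivalent wear
print(gamer_weeks_equivalent(12) / 52)   # ~1.9 years
```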

It doesn't mean a mining card will necessarily be 'dead' by the time a person gets hold of it, but various internal components could be heavily worn. There's also no 'health meter' or 'odometer' for a GPU. At least when you buy a used vehicle you can see how used it is: 40k miles or 350k miles? There's a big difference, and it's at least some indication of the useful life left. Nothing like that exists for a GPU, and if it's been run hard and someone buys a $380 GPU at a nice discount for $200 but it fails in some way two weeks later, that's an expensive paperweight.

Just my personal opinion, but buying used GPUs after they've been used for mining is right up there with buying a used car from a shady auto auction, or buying a taxi, a decommissioned public safety vehicle, or a rental car. It's a pretty safe bet they got heavily abused.
 

papality

Honorable
Dec 1, 2012
a) What gives with the autoplay videos on your articles now, possibly the most reviled thing on the internet?
b) Why are they covering up portions of the text?
 