The Ethereum Effect: Graphics Card Price Watch (Updated)

Status
Not open for further replies.
I wanted to buy a 1080 Ti. Two days ago it cost $1,200-something (Canadian) at Memory Express; today it's $1,400-something. Not buying a new video card. Going to stick with my 680, which cost me $600-something Canadian about 5 years ago.
 
Give it a few months. Nvidia, AMD, and their AIBs have been reluctant to ramp up production because of what happened in 2013, when the crypto market crashed and left them with stale inventory and thousands of used cards for sale. Now that it appears crypto is back for the foreseeable future, production will ramp up.
 
A solution where everyone (manufacturers, miners, and gamers) would be happy only exists if manufacturers split their products into two different lines: one for miners and one for gamers. Architecturally, the two lines could be the same; the main difference would be hardware implementation of SHA-1, SHA-2 (256), SHA-3, and similar mining algorithms in the core instruction set of the miners' GPUs. A hardware implementation would mean roughly 100x greater hash rates compared with a software implementation on a regular GPU. Of course, those mining cards could be stripped down by removing the texture units, video output (DAC), AI/tensor cores, and even the PCIe 3.0 interface, which could be replaced by cheaper USB 3.0, since miners already use risers to increase the number of cards per motherboard. Those cards would also get bigger heatsinks, with the emphasis on speed and cooling.

The appearance of such cards on the market would increase the overall hash power across all cryptocurrencies, but we know that every cryptocurrency algorithm implements a difficulty level that is auto-adjusted to preserve a constant generation rate of new blocks (coins) when the overall hash rate changes. In other words, those cards would make GPUs without hardware implementations of SHA and other algorithms unprofitable for mining. And that would solve the current problem: prices of regular graphics cards would recover, and prices of mining cards would be set independently, dictated by hash rates and cryptocurrency prices.
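To make the difficulty-adjustment point concrete, here's a toy Python sketch. The 2016-block window, 10-minute target, and 4x clamp are Bitcoin's numbers, assumed here for illustration; other coins retarget differently:

```python
# Illustrative difficulty retarget, loosely modeled on Bitcoin's rule:
# every 2016 blocks, scale difficulty so blocks keep averaging 10 minutes.

TARGET_BLOCK_TIME = 600        # seconds (10 minutes)
RETARGET_WINDOW = 2016         # blocks between adjustments

def retarget(old_difficulty: float, actual_window_seconds: float) -> float:
    """Scale difficulty by how fast the last window was actually mined."""
    expected = TARGET_BLOCK_TIME * RETARGET_WINDOW
    ratio = expected / actual_window_seconds
    # Bitcoin clamps the adjustment to a factor of 4 either way.
    ratio = max(0.25, min(4.0, ratio))
    return old_difficulty * ratio

# If total hash power doubles, the window is mined in half the time,
# so difficulty doubles and block production returns to ~10 minutes.
print(retarget(1000.0, (TARGET_BLOCK_TIME * RETARGET_WINDOW) / 2))  # 2000.0
```

This is exactly why a flood of fixed-function mining cards wouldn't mint coins faster overall: the difficulty would rise to absorb the extra hash rate, squeezing out the less efficient hardware instead.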
 


They aren't going to invest in a whole new architecture just for mining when the existing product already works. Mining is volatile, so they are reluctant to ramp up production and get stuck with 3 months' worth of stock, with the next generation rolling out in a few months.

 


Implementing several new instructions in hardware shouldn't be a big deal. The GPU on a 1080 and the GPU on a 1080 Ti differ only in the number of cores; in the same way, a 1080 Ti could differ from a "1080crypt" by having a crypto engine instead of texture units. Intel has already done this in their processors, though of course the number of threads executing in parallel is an order of magnitude lower than in a modern GPU.
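For reference, the workload such a crypto engine would accelerate is just a tight hashing loop. A toy Bitcoin-style sketch using Python's standard hashlib (the header bytes and one-zero-byte "difficulty" are made-up values for illustration; real mining compares the digest against a full 256-bit target):

```python
# Toy illustration of the loop a fixed-function "crypto engine" would run:
# Bitcoin-style proof-of-work is double SHA-256 over a header plus a nonce.
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, difficulty_prefix: bytes) -> int:
    """Increment the nonce until the digest starts with the given prefix."""
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1

# Find a nonce whose hash starts with a single zero byte (trivial difficulty).
nonce = mine(b"example-block-header", b"\x00")
print(nonce)
```

Everything inside that loop is fixed-function arithmetic, which is precisely what dedicated silicon (whether on-die instructions or a full ASIC) does orders of magnitude more efficiently than shader cores.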
 

If you're going to put together a "GPU" that throws out nearly everything related to graphics, it isn't a GPU anymore and you're competing directly with ASIC miner designers/manufacturers at that point.

GPUs are only practical for mining during the transition period between new crypto-currency which has no ASIC miner yet (and may never achieve sufficient market cap to be worth investing the millions of dollars necessary to design an ASIC with supporting ecosystem) and crypto-currency that gains enough popularity to end up dominated by ASIC miners.

GPU-mining will continue to dominate new crypto-currencies indefinitely due to the high barrier to entry on the ASIC side and year+ engineering and manufacturing overhead between deciding to design an ASIC and having the first working production prototype.
 


The first point is true: if you remove everything that defines a graphics card, it's not a graphics card anymore, just some card. But don't forget, products like the Teslas found their market without a video port 10 years ago, and they still exist today as compute products that don't need video output by design. Why couldn't the same happen to these (let's not call them video cards, but) crypto cards?

Some newer currencies' algorithms implement strategies to be ASIC-resistant: heavy memory utilization and similar techniques make them hard or impossible to implement in ASICs. For those currencies, GPU mining will always be profitable unless manufacturers do something to separate gaming (regular) GPUs from mining ones. The other possibility would be the cryptocurrency market completely collapsing and disappearing, which in my opinion is not going to happen. In other words, if GPUs don't change inside and remain profitable for mining, we will have a problem in the market. Manufacturers could also forcibly increase GPU power consumption without increasing compute power to make mining unprofitable, but that would be a step backward that nobody wants.

 

Mining-specific cards "without outputs" have already been tried before and have been a failure: they merely leave GPU manufacturers stranded with unsellable stock when crypto-currency rushes slow down and also reduce the number of GPUs available on the market. It is a lose-lose scenario for GPU manufacturers.

As for "resistant" algorithms, ASICs can have more memory channels, more aggregate memory bandwidth, lower memory latency, more memory and better power efficiency than any CPU or GPU can ever have. The only problem is the large up-front overhead and lead time to get started on a new currency when existing hardware cannot be re-purposed efficiently to mine it. Making the algorithm more memory-intensive merely shifts the ASIC design effort from compute to memory IO. If people could know years in advance which crypto-currencies are going to win, there would be ASIC miners for all of them regardless of what they're focusing on as their bottleneck. Nobody wants to spend $10-50M on ASIC and platform R&D while there is a strong possibility that the currency they choose to target might not survive long enough to make a profit.
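The memory-hard structure being argued about can be sketched in a few lines. This toy is loosely inspired by the scrypt/Ethash pattern (sequential fill, then data-dependent random reads); the buffer and round counts are shrunk far below realistic sizes purely for illustration:

```python
# Toy "memory-hard" hash: fill a buffer, then perform data-dependent random
# reads over it, so hashing fast requires fast access to the whole buffer.
import hashlib

def memory_hard_hash(seed: bytes, buf_items: int = 4096, rounds: int = 1024) -> bytes:
    # 1. Sequentially fill a buffer derived from the seed.
    buf = []
    h = seed
    for _ in range(buf_items):
        h = hashlib.sha256(h).digest()
        buf.append(h)
    # 2. Mix with data-dependent lookups: each index depends on the running
    #    state, defeating small caches and fixed compute pipelines.
    state = h
    for _ in range(rounds):
        idx = int.from_bytes(state[:4], "big") % buf_items
        state = hashlib.sha256(state + buf[idx]).digest()
    return state

print(memory_hard_hash(b"seed").hex())
```

Note the design supports both sides of this debate: the random reads do punish compute-only ASICs, but nothing stops an ASIC from pairing the same hash core with wide, low-latency memory, which is the "shifts the effort to memory IO" point above.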

Using GPUs eliminates risk for as long as GPUs are economically viable for mining the currency-of-the-day, and you can sell those GPUs when they aren't viable to mine with anymore. You can't do that with output-less mining-specific cards. That's another reason miners prefer gaming GPUs over "mining GPUs", which have no resale value.
 


I agree that everything you said about ASICs is true, but ASIC-resistant algorithms really do exist. ASICs exist only for primitive algorithms with fixed functions, like any of the SHAs. Consider a program of several gigabytes with lots of branches dependent on the data you are encrypting, using shift and rotate instructions where the shift amount also depends on the input data. For example, an if-then block only 32 levels deep branches the program at 4 billion different points where execution can continue. If every such point contains just one of those functions (ROL, SHR, XOR, etc.), with the shift amount dependent on data from the previous iteration, building an ASIC for such an algorithm would require trillions and trillions of transistors, which is out of the question.
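A miniature version of the construction described above, where both the operation chosen and the rotation amount depend on the data produced by the previous iteration (the constants are arbitrary mixing values picked for this sketch, not from any real currency):

```python
# Toy of the "branchy, data-dependent" hash idea: which operation runs,
# and by how much values rotate, both depend on the previous iteration.

def rotl32(x: int, n: int) -> int:
    """Rotate a 32-bit value left by n bits."""
    n &= 31
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def branchy_mix(x: int, rounds: int = 64) -> int:
    for _ in range(rounds):
        op = x & 3                 # data selects the operation
        amt = (x >> 2) & 31        # data selects the rotation amount
        if op == 0:
            x = rotl32(x, amt) ^ 0x9E3779B9
        elif op == 1:
            x = (x + rotl32(x, amt)) & 0xFFFFFFFF
        elif op == 2:
            x = rotl32(x ^ 0xA5A5A5A5, amt)
        else:
            x = (x * 0x01000193) & 0xFFFFFFFF  # FNV-prime-style multiply
    return x

print(hex(branchy_mix(0x12345678)))
```

Whether this actually defeats ASICs is exactly what's disputed in the replies below: a barrel shifter handles any data-dependent rotation amount in one cycle, so the hardware cost is in the number of distinct operations and the memory footprint, not the rotations themselves.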

I don't agree with your view of the future. Maybe the current cryptocurrency market looks like a battlefield, but why wouldn't many cryptocurrencies coexist in the future? Like you have gold, silver, copper, titanium, etc.: you use one metal or another depending on what you're going to do with it. Some are more corrosion-resistant, some are more conductive, some are heavier, and so on. Likewise with cryptocurrencies: some will provide more privacy, some will be more convenient for everyday use in shops and restaurants, some will be more appropriate for business and large transactions, some will be for kids with extra levels of authorization (parents), some will be fee-free, etc. But all of them have to offer decentralization, security, and exchange.

I'm sure that anyone who mines seriously, very seriously, would trade 1000 GTX 1080 Ti cards for 10 "1080crypto" cards if those 10 provided the same hash power with 100 times less energy and without the room-cooling system that 1000 GPUs require, regardless of whether he could resell them if the crypto market breaks. For someone who spent half a million dollars on GPUs, such an investment is a calculated risk he can afford to lose.

For manufacturers, if they really want to solve the market's problems, they have to provide different hardware for miners. Samsung has realized that:
https://www.theverge.com/2018/1/31/16954366/bitcoin-cryptocurrency-mining-asic-chips-samsung

ADDENDUM:
Samsung's strategy seems to be a kind of punishment for Nvidia, because they know that if they succeed with their hardware and offer better hash performance per hardware dollar, it will instantly replace GPUs in mining rigs. As a consequence, the overall hash rate of all cryptocurrencies will increase, followed by rising difficulty levels across all the algorithms to maintain a constant block (coin) generation rate. That difficulty level will make GPUs unprofitable for mining, and the consequence is a market flooded with used GPUs (mostly 1080 Ti, 1080, and 1070), which will slow down Nvidia's release of the new Volta architecture. Who would buy a Volta for several thousand dollars if the market is full of cheap 1080 Ti cards? That is one more reason Nvidia needs an immediate solution.


 


And the crypto market is nowhere near that level of maturity yet.
That is years away, if ever.

Which does nothing for Johnny and Jane wanting to buy a gaming level GPU today.
 

Bit-wise operations, shifts and other things like that are one area where ASICs and FPGAs excel, well beyond general-purpose CPUs and GPUs. Claiming that those are somehow complex and cost-prohibitive on ASIC proves how little you know about how digital circuits work.

As for making the hash algorithm arbitrarily complex by making it stupidly large and convoluted, that would work, but such arbitrary complexity also means ending up with an algorithm that is nigh impossible to maintain and troubleshoot.

Also, since your original argument was about producing cut-down GPUs specifically for mining, a cut-down GPU would be completely useless for your massively branchy/indirect example since GPUs don't do branch prediction, speculative execution, out-of-order execution or indirections, which means even a Tesla GPU would already be horrible assuming the convoluted algorithm even fits in local RAM.

Another problem with making the algorithm arbitrarily complex is that many people simply may not have enough CPU-power and RAM to run it. At that point, your crypto-currency already ran itself out of the market due to being too costly to mine. Ethereum could easily have been designed with a 14GB starting PRNG block size to rule out near-future mainstream GPUs but then most people wouldn't have been able to mine it even on CPU due to having less than 16GB of system RAM. To gain market traction, it had to start with a block size most PCs could cope with, so it started at 1GB instead.
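The Ethereum block-size trade-off mentioned above can be put in rough numbers. This is a linearized estimate of Ethash dataset ("DAG") growth — about 1 GiB at launch, growing roughly 8 MiB per 30,000-block epoch; the real algorithm rounds sizes to primes, so these constants are an approximation for illustration only:

```python
# Rough, linearized estimate of Ethash dataset ("DAG") growth.
# Real Ethash rounds sizes to primes; constants here are approximate.

EPOCH_LENGTH = 30_000          # blocks per epoch
DATASET_INIT = 1 << 30         # ~1 GiB at epoch 0
DATASET_GROWTH = 1 << 23       # ~8 MiB added per epoch

def approx_dag_bytes(block_number: int) -> int:
    epoch = block_number // EPOCH_LENGTH
    return DATASET_INIT + DATASET_GROWTH * epoch

# By around block 5,000,000 the DAG is already past 2 GiB, which is why
# older small-VRAM cards (2GB models) gradually fell out of Ethereum mining.
print(approx_dag_bytes(5_000_000) / 2**30)
```

So the 1 GiB starting point wasn't arbitrary: it was small enough for nearly any PC at launch, while the steady growth slowly raises the hardware floor over time.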
 


Yes, it was about the future, as I said.
As for Johnny and Jane: if they really, really need to buy a GPU right now, I'm sure they can find two or three 1050 Ti / 1060 cards, put them in SLI, and have playable games. Maybe someone on this site has benchmarked such a configuration against 1070/1080 cards. But if they can wait, then...
 


3 x 1060s = $1,500.
Yeah, right.

And SLI does not scale nearly as well as you might think.
 

SLI in the GTX10xx series isn't supported on anything less than the GTX1070, is no longer supported beyond two cards, only works properly in SOME games, yields far less than linear gain (20-60%) in most games where it actually works, causes performance issues (mainly stutter) in many, doesn't work at all with others and is likely to get dropped altogether in the future.
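The sub-linear scaling numbers above translate directly into worse price per frame. A quick sketch with made-up card prices and frame rates (the 60% gain is the optimistic end of the range quoted above):

```python
# Why sub-linear SLI scaling is a poor value: cost doubles, frames don't.

def sli_value(base_fps: float, scaling_gain: float, card_price: float):
    """Return (FPS with two cards, $/FPS for one card, $/FPS for two)."""
    fps_two_cards = base_fps * (1 + scaling_gain)
    cost_per_fps_one = card_price / base_fps
    cost_per_fps_two = (2 * card_price) / fps_two_cards
    return fps_two_cards, cost_per_fps_one, cost_per_fps_two

# Hypothetical 60 FPS card at $500, with a generous 60% SLI gain:
fps, one, two = sli_value(60, 0.60, 500)
print(fps, round(one, 2), round(two, 2))  # 96.0 8.33 10.42
```

Even in the best case, each frame costs about 25% more than with a single card, before counting the stutter and compatibility problems.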
 


If you analyze what I said more carefully, maybe you'll conclude it's not about the complexity of the operations but the number of different operations. In 16 GB of RAM you could store a program with several billion branches, where every branch finishes with a complex function. Looping such a program would produce a "logical function pattern" that is not reproducible by fixed logic gates, since it changes with every input datum and every cycle. The code for such a program is not written line by line by some programmer but is generated by software. The point is the sheer number of different operations, which is not feasible for ASICs since it would require an enormous number of transistors. My knowledge of digital architecture is much better than my English is :) and I wouldn't claim such a thing if I didn't understand it well.



ASIC-resistant algorithms have nothing to do with a modified GPU. I only mentioned that some cryptocurrencies use them to prevent ASIC implementations. In my opinion, a modified GPU — call it a "CryptoU" — should implement known functions like SHA-1/2/3, NIST standards, etc., and it would find its place on the market since many cryptocurrencies are based on those. These functions can easily be implemented in hardware. For example, a single SHA-1 iteration implemented in hardware requires fewer transistors than a hardware 64-bit multiplier, because shifts and rotations don't require any dedicated electronics; they are just output lines from one stage on the substrate connected to the appropriate (shifted) inputs of the next stage...
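To illustrate the point about rotations costing nothing in hardware, here is one SHA-1-style round step in Python. The ROTL calls below are free in silicon (just routed wires), while the 32-bit additions need real adder circuits. This is a simplification: the f-function and constant shown are for SHA-1's rounds 0-19 only, not the full algorithm:

```python
# One SHA-1-style round step (rounds 0-19 variant only).

def rotl32(x: int, n: int) -> int:
    """32-bit left rotate — in hardware this is just wiring."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def sha1_round_0_19(a, b, c, d, e, w_t):
    f = (b & c) | (~b & d)                 # SHA-1 "Ch" function, rounds 0-19
    k = 0x5A827999                         # SHA-1 round constant, rounds 0-19
    temp = (rotl32(a, 5) + f + e + k + w_t) & 0xFFFFFFFF
    return temp, a, rotl32(b, 30), c, d    # new (a, b, c, d, e)

# One round on SHA-1's initial state with a zero message word:
state = (0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0)
print([hex(x) for x in sha1_round_0_19(*state, 0)])
```

In software each `rotl32` costs instructions; in an ASIC or a GPU "crypto engine" the same rotation is a zero-transistor rewiring between pipeline stages, which is the transistor-count comparison being made above.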


I agree with that. But that's on the algorithm designers. If I were paid to produce an efficient algorithm that fits in a gigabyte of RAM and is ASIC-resistant yet fast enough even on Celerons, I'm sure I could do it. There would be plenty of obstacles, like antivirus software screaming, since you'd have to generate a large block of code in a data segment and then force it to execute...

 


Then you get this:
[image: QAMzRnZ.jpg]
 


Not only are miners clueless about security (cryptocurrency wallet services don't care about security), they are also taking the GPUs that gamers and non-miners use, causing prices to rise severely.

Sure, it supports the GPU companies' business, but these miners are taking from non-miners.

GPU makers should prevent consumers from mining on non-specialized GPUs, forcing miners to buy specialized mining GPUs and deal with their own supply shortage. Right now the miners are taking from non-miners who just want to play games, do some media creation, or run VMs.
 

Doesn't work. They could make GPUs 30 cm thick so only one fits inside a PC, and miners could still put 6-8 per rig using PCIe risers. Linus did a video a few months ago to find out how long a PCIe riser chain can get before losing signal integrity; by daisy-chaining risers, he ended up with his GPU across the room from the host PC before it quit working. So even if GPUs were 49" full-rack size, you could still hook six of them up to a single PC if you wanted to.
 


I'm in the group that dislikes mining, but I still have to appreciate that this setup is kinda cool.
 
GPUs are usually tuned primarily for performance over efficiency. Mining cards value efficiency first, since electricity costs determine profitability. If manufacturers custom-tune their hardware (as opposed to something you can BIOS-flash) to be more efficient than gaming GPUs, it would increase miners' profitability compared to a gaming GPU. So it would depend on how much more efficient, and how much of a competitive disadvantage miners are willing to risk their investments on.
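The efficiency-first argument is easy to see with a toy daily-profit model. All rates and prices below are made-up assumptions for illustration, not real market figures:

```python
# Toy daily-profit model: revenue scales with hash rate, but power draw
# is a recurring cost, so efficiency can beat raw speed.

def daily_profit(hashrate_mhs: float, watts: float,
                 usd_per_mhs_day: float = 0.10,
                 usd_per_kwh: float = 0.12) -> float:
    revenue = hashrate_mhs * usd_per_mhs_day
    power_cost = (watts / 1000) * 24 * usd_per_kwh
    return revenue - power_cost

# Hypothetical card, at an expensive-electricity location ($0.25/kWh):
stock = daily_profit(30, 180, usd_per_kwh=0.25)   # stock gaming tune
tuned = daily_profit(27, 110, usd_per_kwh=0.25)   # -10% speed, -40% power
print(round(stock, 2), round(tuned, 2))  # 1.92 2.04
```

With these assumed numbers, giving up 10% of the hash rate to cut power 40% nets more profit per day, which is why an efficiency-tuned card could out-earn a faster gaming card even at the same hash rate per dollar.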
 


You can already do that yourself: you can undervolt the card manually or with a custom BIOS. We don't need special cards, we just need more cards. But cryptocurrency needs to prove itself less volatile before we see that.
 


That's why I said custom-tune their hardware, as opposed to something you can BIOS-flash. Most of the features that differentiate 3rd-party cards deal with power delivery and cooling, in an attempt to maximize performance or reduce noise.
 