Nvidia Releases GTX 1050 3GB To Stave Off Crypto-Miners

Ethereum isn't the only crypto to mine.

According to this giant wall of text:
https://whattomine.com/coins?utf8=%E2%9C%93&adapt_q_280x=0&adapt_q_380=0&adapt_q_fury=0&adapt_q_470=0&adapt_q_480=0&adapt_q_570=0&adapt_q_580=0&adapt_q_vega56=0&adapt_q_vega64=0&adapt_q_750Ti=0&adapt_q_1050Ti=1&adapt_1050Ti=true&adapt_q_10606=0&adapt_q_1070=0&adapt_q_1070Ti=0&adapt_q_1080=0&adapt_q_1080Ti=0&eth=true&factor%5Beth_hr%5D=13.9&factor%5Beth_p%5D=70.0&grof=true&factor%5Bgro_hr%5D=14.5&factor%5Bgro_p%5D=75.0&phi=true&factor%5Bphi_hr%5D=6.0&factor%5Bphi_p%5D=75.0&cn=true&factor%5Bcn_hr%5D=300.0&factor%5Bcn_p%5D=50.0&cn7=true&factor%5Bcn7_hr%5D=300.0&factor%5Bcn7_p%5D=50.0&eq=true&factor%5Beq_hr%5D=180.0&factor%5Beq_p%5D=75.0&lre=true&factor%5Blrev2_hr%5D=14500.0&factor%5Blrev2_p%5D=75.0&ns=true&factor%5Bns_hr%5D=420.0&factor%5Bns_p%5D=75.0&factor%5Btt10_hr%5D=7.0&factor%5Btt10_p%5D=75.0&factor%5Bx16r_hr%5D=4.0&factor%5Bx16r_p%5D=50.0&factor%5Bskh_hr%5D=11.5&factor%5Bskh_p%5D=75.0&factor%5Bn5_hr%5D=15.0&factor%5Bn5_p%5D=70.0&factor%5Bxn_hr%5D=1.0&factor%5Bxn_p%5D=75.0&factor%5Bcost%5D=0.06&sort=Profitability24&volume=0&revenue=24h&factor%5Bexchanges%5D%5B%5D=&factor%5Bexchanges%5D%5B%5D=binance&factor%5Bexchanges%5D%5B%5D=bitfinex&factor%5Bexchanges%5D%5B%5D=bittrex&factor%5Bexchanges%5D%5B%5D=cryptobridge&factor%5Bexchanges%5D%5B%5D=cryptopia&factor%5Bexchanges%5D%5B%5D=hitbtc&factor%5Bexchanges%5D%5B%5D=poloniex&factor%5Bexchanges%5D%5B%5D=yobit&dataset=Main&commit=Calculate

If we base our calculations on the GeForce 1050 Ti, the top crypto to mine is indeed Ethereum at 62 cents a day, assuming $0.06 per kWh as the power cost.

If we eliminate Ethereum from the list due to the reduced VRAM on the new GeForce 1050, then the next crypto that can be mined effectively would be Bitcoin Gold and/or Zcash at 55 cents a day.

Never heard of ZenCash, but you could mine it for $0.59 a day.

55/62 works out to roughly an 11% loss if miners decided to buy this card and mine Bitcoin Gold on it instead of mining Ethereum on a normal GeForce 1050 Ti.

This card by itself won't stave off cryptominers if they really want to buy it, especially if its mining revenue per month divided by the price of the card works out to better efficiency than the GeForce 1060's.

You could put 8 of them in a Biostar TB250-BTC D+ for a low-powered rig earning $4.40 a day, or $132 a month.
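If anyone wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch (the $0.55/day per card is the Bitcoin Gold estimate from above; the 8 slots are the TB250-BTC D+'s; the rest is just arithmetic):

Code:
# Rough rig revenue, assuming ~$0.55/day per card (Bitcoin Gold on a
# 1050 Ti-class card, with power at $0.06/kWh already factored in).
cards_per_rig = 8            # PCIe slots on a Biostar TB250-BTC D+
revenue_per_card_day = 0.55  # USD/day

daily = cards_per_rig * revenue_per_card_day
monthly = daily * 30
print(f"${daily:.2f}/day, ${monthly:.2f}/month")  # $4.40/day, $132.00/month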

What actually staves off cryptominers is the imminent release of the GTX 11xx or GTX 20xx series, which is supposedly much more efficient than the 10xx series.
 

bit_user


Bandwidth usually refers to the actual data rate, since it's a measure of frequency.

A better way to phrase it would be that they reduced the aggregate data bus width (each memory channel has a separate address bus and data bus). To be even more precise, I think they actually cut it down from 4x 32-bit channels to 3x.
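If the 3x 32-bit guess is right, the arithmetic lines up with the reported 84 GB/s. This sketch assumes the same 7 Gbps GDDR5 as the existing GTX 1050/1050 Ti, which is unconfirmed:

Code:
# Peak memory bandwidth = channels * bus width per channel * data rate / 8
channels = 3          # assumed: one of four 32-bit channels disabled
bits_per_channel = 32
gbps_per_pin = 7      # assumed: 7 Gbps GDDR5, same as the other 1050s

bandwidth_gb_s = channels * bits_per_channel * gbps_per_pin / 8
print(bandwidth_gb_s)  # 84.0 GB/s, vs. 112 GB/s with all 4 channels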
 

longjohn119

The GTX 1050 was already limited by its memory bus width, and making it even smaller doesn't make a lot of sense to me... The reason I say this is that when I increased the memory clock speed I didn't gain squat in performance, which tells me the memory bus width is already at full capacity...
 

bit_user


I think you have it backwards. If increasing the clock doesn't gain you anything, it shows memory was not a bottleneck (i.e. not maxed).

Whether it's memory-bottlenecked will depend on the game, the options, and the resolution, however.
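One way to picture it is a toy min() model of frame rate: you're capped by whichever resource runs out first, so raising a clock that isn't the current limiter does nothing. All the numbers here are made up purely for illustration:

Code:
# Toy bottleneck model: fps is capped by the slower of compute and memory.
def fps_estimate(core_rate, mem_rate, work_per_frame, bytes_per_frame):
    compute_limit = core_rate / work_per_frame
    memory_limit = mem_rate / bytes_per_frame
    return min(compute_limit, memory_limit)

# Raising mem_rate only helps while memory_limit < compute_limit:
print(fps_estimate(2.0, 1.0, 0.01, 0.02))  # 50.0  (memory-bound)
print(fps_estimate(2.0, 2.0, 0.01, 0.02))  # 100.0 (now compute-bound)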
 

bit_user

Here's the ratio of compute to memory bandwidth, for all the different models (source: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_10_series ):

Code:
Model          GFLOPS  GB/sec  FLO/B
------------------------------------
GTX 1050         1733    112    15.5
GTX 1050 3 GB    2138     84    25.5
GTX 1050 Ti      1981    112    17.7
GTX 1060         3855    192    20.1
GTX 1070         5783    256    22.6
GTX 1070 Ti      7816    256    30.5
GTX 1080         8228    320    25.7
GTX 1080 Ti     10609    352    30.1
Titan Xp        10790    384    28.1
Titan V         12288    653    18.8

I used base clocks, meaning this is a lower-bound.
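The ratio is just GFLOPS divided by GB/s, so it's easy to redo with boost clocks if anyone cares to (a few rows shown; numbers straight from the table above):

Code:
# FLO/B = single-precision GFLOPS / memory bandwidth (GB/s).
# GFLOPS = 2 * CUDA cores * clock (GHz), counting an FMA as 2 ops.
specs = {
    "GTX 1050":      (1733, 112),
    "GTX 1050 3 GB": (2138,  84),
    "GTX 1050 Ti":   (1981, 112),
    "GTX 1080":      (8228, 320),
}
for model, (gflops, gb_s) in specs.items():
    print(f"{model:14s} {gflops / gb_s:5.1f} FLO/B")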

So, the GTX 1070 Ti and GTX 1080 Ti should be the most bandwidth-starved, whereas the GTX 1050 is at the other end of the spectrum. From this perspective, their move makes a lot of sense. It puts the GTX 1050 3 GB on par with cards like the GTX 1080.
 


GeForce 1050 MX, Nvidia poking fun at themselves.
https://en.wikipedia.org/wiki/GeForce_2_series#GeForce_2_MX
(Showing my age)

GeForce 1050 Atom / Molecular Edition, an Intel jab

GeForce 1050 Trash Compactor: enough time has passed for us to make fun of the AMD Bulldozer line of processors.

 


Nobody has caused more nonsensical consumer confusion in GPU nomenclature than Nvidia. Their product marketing department has Tourette Syndrome. They did something similar to this with the 3GB GTX 1060.

Many people, albeit less informed than most Tom's Community members here, thought it was just a 3GB version of the 6GB 1060. Of course, they had no idea Nvidia disabled one of the 1060's ten streaming multiprocessors, which cut the CUDA core count from 1280 to 1152. That could make the difference between playable 60 FPS and sub-60 FPS gaming (https://images.techhive.com/images/article/2016/09/1060-3gb-fcp-1080-100680479-orig.png).
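The arithmetic behind that cut, for anyone curious (Pascal packs 128 CUDA cores per streaming multiprocessor, and the full GP106 chip has ten of them):

Code:
cores_per_sm = 128        # CUDA cores per Pascal SM
print(10 * cores_per_sm)  # 1280: full GP106 (GTX 1060 6GB)
print(9 * cores_per_sm)   # 1152: one SM disabled (GTX 1060 3GB)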

In the case of the two 1060s, Nvidia should have made the 6GB variant the 1060 Ti and the 3GB one the 1060. Just stupid, considering they have a 1050 Ti and 1050, and a 1080 Ti and 1080 (as they have had in previous-gen cards too).

Regarding AMD's Bulldozer, that failure was bulldozed into the landfill of history, so it's a fitting name for that generation of CPUs. Ryzen rose from its ashes to fame and glory.
 


At least now we know that if AMD ever names another line of CPU after heavy machinery, we should avoid those processors.

AMD Phoenix? Check.
AMD Blue Jay? Check.
AMD Concrete Mixer? ...NO, AVOID THIS.

 

King_V




I'm not sure that, on the low end of things, a higher GFLOPS/bandwidth ratio is desirable. This just makes me more suspicious (though it's always a vague suspicion) that it will hurt the card's gaming performance.


I guess we'll have to see actual gaming benchmarks to be sure, but it seems like this will hurt things.
 
I know for sure that algorithms like Ethereum's are highly sensitive to memory bandwidth, hence the massive memory overclocks Ethereum miners run, and are almost unaffected by the core clock. (You have to really downclock the hell out of the core to affect the Ethereum mining rate; for me it was around -250 on the core clock.)

Other algorithms like Equihash are not very sensitive to memory bandwidth and do benefit greatly from core clock.

Looking at the above numbers, I predict the GeForce 1050 3GB may mine Equihash faster than a GeForce 1050 Ti due to the increased core clock speed and the same number of CUDA cores, at least at stock settings.

The reduced throughput of the RAM would hinder Ethereum, but you may not even be able to mine Ethereum on 3 gigabytes for very long, as the above posts mentioned.
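To put a rough number on the Ethereum hit, here's a crude estimate using the 13.9 MH/s 1050 Ti figure from the whattomine link above, and assuming Ethash scales roughly linearly with memory bandwidth (a simplification, but it matches how sensitive the algorithm is to memory overclocks):

Code:
# Crude Ethash estimate: hashrate ~ memory bandwidth.
ti_hashrate = 13.9     # MH/s on a GTX 1050 Ti (whattomine setting above)
ti_bandwidth = 112     # GB/s, 128-bit bus
new_bandwidth = 84     # GB/s, rumored 96-bit bus on the 1050 3GB

estimate = ti_hashrate * new_bandwidth / ti_bandwidth
print(f"{estimate:.1f} MH/s")  # ~10.4 MH/s, a 25% haircut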

This almost feels like Nvidia saying: use this cheap card to mine Bitcoin Gold and Zcash; we don't like Ethereum.

Real benchmarks would indeed settle this.


Even if it is an efficiency god at mining Bitcoin Gold, you would need a ton of them to make any real money.

That's 55 cents a day, or $16.50 a month, as mentioned earlier for the GeForce 1050 Ti.

Even if your power were unlimited (it isn't, but let's pretend it is), you are still limited by the available PCIe slots for graphics cards on the motherboard, along with the physical space to put said motherboards/computers.

Buying a ton of GeForce 1050s and setting them up is more of a proof of concept than a functional mining rig.

For comparison, a GeForce 1060 is about $300 and roughly 80% more powerful than a GeForce 1050 Ti at mining (80% more money per month).

Meaning you could fit 80% more hashes (sol/s) in the same footprint.

Moving from a GeForce 1060 to a GeForce 1070 gets you about 40% more hashes/second.

Moving from a GeForce 1070 to a GeForce 1080 is about 34% more hashes/second.

Moving from a GeForce 1080 to a GeForce 1080 Ti is about 33% more hashes/second.

As you can see, the performance gains are shrinking, making the GeForce 1060 the most economical of the series; it pays for itself the quickest.
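Chaining those rough percentages together against the $16.50/month 1050 Ti baseline (these are my extrapolations from the figures above, not measured rates):

Code:
# Monthly revenue per card, scaling the $16.50 1050 Ti baseline by the
# approximate hashrate steps quoted above.
monthly = 16.50  # GeForce 1050 Ti, USD/month
steps = {"1060": 1.80, "1070": 1.40, "1080": 1.34, "1080 Ti": 1.33}

for model, gain in steps.items():
    monthly *= gain
    print(f"GTX {model}: ~${monthly:.2f}/month")
# 1060: ~$29.70, 1070: ~$41.58, 1080: ~$55.72, 1080 Ti: ~$74.10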

This obviously doesn't include the undisclosed price and hash rate of the GeForce 1050 3GB.


Prices are coming down; I currently see a GeForce 1080 Ti for $850 on Newegg, where it used to be over $1,000.

Miners are most likely waiting on the next series for their next purchase.
 

bit_user


All else being equal, more bandwidth is certainly better.

I computed the reciprocal of what we want to maximize, since it's a nicer number. But I tend to agree that you want GFLOPS high and FLO/B low. Notice how the Titan V has the highest GFLOPS and one of the lowest FLO/B (floating-point ops per byte) figures.


Again, it will certainly depend on the game, resolution, and settings. But this should give you some perspective that the new card is not outside the range of existing products.

For it to be bandwidth-starved, the following cards would have to be even more so (i.e., at the same settings, with no upper bound on framerate): GTX 1080, Titan Xp, GTX 1080 Ti, and GTX 1070 Ti.

So, at this superficial level, I think we can say the new card is a pretty safe bet. We still need to confirm via benchmarks, but I expect performance mostly on-par with GTX 1050 Ti. Getting that at a cheaper price = win. The biggest downside (both from memory size & bandwidth) will be at higher resolutions. So, not advisable for much above 1080p.
 

King_V



Hmm, I can see that, though admittedly the high-end cards are only bandwidth-starved because they have overwhelming power in terms of GFLOPS. GDDR6 and/or HBM will hopefully help alleviate that in the next-gen cards. "I CAN'T BREATHE THROUGH THIS COFFEE STRAW!!!" lol


And, agreed, it's definitely still a medium-details 1080p card, though it'll be interesting to see where it falls compared to the standard 1050, 1050 Ti, and 3GB 1060 in various games (particularly in the few games where the 1050 Ti can outperform the 3GB 1060).
 