System Builder Marathon, Q4 2013: A $2400 PC That Costs $2700

"It’s a shame that a digital gold rush is taking these out of the hands of so many gamers."

Seeing that it is impossible to break even doing Bitcoin mining with GPUs, I expect that sooner rather than later a flood of barely used cards will hit the used market.
 
We'll see how this all ends up for the Litecoin and Bitcoin miners. Either they were smart for getting ahead of the game, or they were idiots for believing they could make back more than they spent.

But in the case of the Bitcoiners, there's already a better way to mine, so why bother with GPUs at all? Seems to me they lose out no matter how it ends up.
 
You can't break even due to power bills, let alone hardware prices. But wait, there's more!
Forum members often call a machine that burns far too much energy for the amount of useful work we get out of it a "space heater". But if you compare THIS machine to an ACTUAL space heater, you can clearly see the benefit of using THIS machine RATHER than an actual space heater to heat your workspace. Let mining pools pay a portion of this winter's heating bill!
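
Since essentially every watt a mining rig pulls from the wall ends up as heat in the room anyway, the heating-offset math is simple. Here's a minimal back-of-the-envelope sketch in Python; the wattage, electricity rate, and pool payout are hypothetical placeholders, not numbers from this build.

```python
# Back-of-the-envelope heating offset: a mining rig dumps the same heat into
# the room as a resistive space heater drawing the same wattage, but a mining
# pool pays back part of the electricity bill. All figures are hypothetical.

system_draw_kw = 0.7            # assumed wall draw of the rig, in kW
electricity_rate = 0.12         # assumed price per kWh, in USD
hours_per_day = 24
pool_payout_per_day = 5.00      # assumed gross daily payout from a pool, USD

heat_delivered_kwh = system_draw_kw * hours_per_day
power_cost = heat_delivered_kwh * electricity_rate
net_heating_cost = power_cost - pool_payout_per_day

print(f"Heat delivered:   {heat_delivered_kwh:.1f} kWh/day")
print(f"Electricity cost: ${power_cost:.2f}/day")
print(f"Net cost to heat the room with this rig: ${net_heating_cost:.2f}/day")
```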

I'm completely against the CONCEPT of crypto-currency mining because it produces no USEFUL data. We're producing GARBAGE data of increasing difficulty, generation by generation, and wasting all those resources to do it. It's worse than raising cattle for the leather and throwing away the meat. It's more akin to raising cattle for photographs of the cow and throwing away the cow!

These machines might actually benefit society if they were running a program like F@H, and we'd at least have a solid argument weighing their cost to society against their benefit to society. Someone should have beaten the Bitcoin guy to the punch and developed F@H coins.

Or take a look at cloud servers: large companies are renting out their excess computing resources during low-traffic periods. Now look at PC-based, peer-to-peer platforms like Skype. The per-user cost is low but the number of users is high, so hosting the service across those same "clients" makes sense.

Why don't we have companies knocking down our doors begging for our excess computing resources? Someone with a great marketing plan AND excellent technical knowledge should set up a distributed computing platform that pays individuals for their contributions. Environmentalists should praise that move for reducing the number of data centers needed world-wide, but me?

I'm just trying to reduce waste. I even collect my small bits of scrap metal (broken car parts, etc.) and give them away to scrap metal collectors, because it would cost me more to haul them in than they're worth. Those guys collect enough small batches to make the 15-mile trip worthwhile. And you don't need to be a tree hugger to see that everyone benefits from that type of effort.
 
While it is true that most miners won't break even due to electricity costs, that doesn't mean a profit can't be made. One big draw of crypto-currency is the black market: Silk Road was, and is, huge. If you want to launder money, then crypto mining is a great way to go. If laundering money currently returns 75 cents on the dollar and crypto mining returns 85, why not? Furthermore, it's safer than keeping piles of cash around in a safe house. For the average user, mining won't make a return, but for others it can be better than the alternatives.
 
Crash, you forgot your soapbox! :)

If we're to believe what we're told and crypto-currency mining is to blame for retail price spikes on the highest-tier AMD cards, then I expect to see AMD make some changes in its next generation of cards, especially if AMD isn't cashing in on the rush for its cards and the price hikes are due solely to merchant mark-ups. Considering AMD's business struggles over recent years, I don't expect AMD to make any such profitability mistake ever again. Instead, I think AMD will follow nVidia's example.

When nVidia capped GPGPU performance on the majority of its cards, then went on to produce the Titan and Tesla cards without such GPGPU restriction at higher prices, I was OK with that. It meant gamers could buy cards built for gaming at a reasonable price, people who used their cards for both gaming and GPGPU-related tasks could buy a card built for both for a premium, and researchers could buy cards that were fully-optimized for GPGPU use for an even higher premium. If AMD had done that with the R9-series, we'd have quite a few more gamers sporting brand new AMD cards this holiday season.

And back to the article... Heckuva build! It's an improvement over the previous build in just about every way, with the exception of its current cost.
 
The article's my "soapbox". At any rate, I've given AMD's options a few considerations too. It's made a commitment to end users, and the only way to profiteer without having people call you out on it is to sell these through a "back channel". The other problem is supply and demand: they can't ramp up production very quickly, and who's to say that this expanded market wouldn't evaporate before they had the extra cards to fill it? The BEST thing for AMD to do is stick to its guns and let retailers take the blame for profiteering.

I figure there will be a flood of used cards on the market in three months as it gets more difficult to mine the most profitable currencies, but someone mentioned that before I responded. It would be REALLY REALLY bad for AMD to spend six weeks increasing production volume, only to see a flood of cheap used cards knock the market out from under its new card sales. Once again, AMD is probably doing best to stick to its plans. Nobody remembers when Intel blamed overproduction by AMD for the CPU market collapse of 1999; in fact, those news articles were buried within three months. But I remember :)

 
I think we're all forgetting that there are FAR more digital currencies out there than Bitcoin, and almost all of them have a hash algorithm and mining method that prevents ASIC mining rigs from having an advantage. Litecoin is a popular one right now because the exchange rate is low, the total number of possible LC is significantly higher than the BC max, and it's optimized for APU/GPU mining.
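
For anyone wondering what "a hash algorithm and mining method" boils down to in practice, here's a toy proof-of-work loop in Python. It's only a sketch of the general idea: Bitcoin actually hashes an 80-byte block header with double SHA-256, while Litecoin swaps in scrypt (a memory-hard function that, at the time, kept GPUs competitive against SHA-256 ASICs). The header bytes and difficulty target below are made up purely for illustration.

```python
import hashlib

def mine(header: bytes, target: int, use_scrypt: bool = False, max_nonce: int = 500_000):
    """Toy proof-of-work: find a nonce so that hash(header + nonce) falls below target."""
    for nonce in range(max_nonce):
        data = header + nonce.to_bytes(4, "little")
        if use_scrypt:
            # Litecoin-style: scrypt(1024, 1, 1) is memory-hard, which blunted early ASICs.
            digest = hashlib.scrypt(data, salt=data, n=1024, r=1, p=1, dklen=32)
        else:
            # Bitcoin-style: double SHA-256.
            digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "little") < target:
            return nonce, digest.hex()
    return None

# Made-up header and an easy target so this toy version finishes in seconds.
header = b"example block header"
target = 1 << 248
print(mine(header, target))                    # SHA-256d: what Bitcoin ASICs chew through
print(mine(header, target, use_scrypt=True))   # scrypt: the GPU-friendly Litecoin variant
```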

That said, I think this build really missed the mark. My current build would come fairly close to your BF3 numbers, and yet my system can easily be had for around $1300. While it might not compete on some of the other compute tasks, it still does pretty damn well.

Give me $2400 and I'm sure I could smoke this rig. I'd expect more from Tom's.
 
I think excessive heat and noise should be factored into overall performance/$. The Antec solution in this build borders on ridiculous, IMO, at this price point. GPU fans reaching normal-conversation decibel levels would also kill it for me. I have a close setup with the same case, SLI 770s, and a push/pull H220, so I know how quiet this setup could be. I understand the initial 780-vs-290 argument; hopefully vendor coolers will improve the 290's cooling considerably and prices will settle back down.

There is a heat/power/efficiency section, but in the conclusion I would just like to see Qx vs Qy decibels and system temperature alongside the % performance gain, for a more big-picture view. Anyone else have ideas on this?
 
@Crashman
"When nVidia capped GPGPU performance on the majority of its cards... It meant gamers could buy cards built for gaming at a reasonable price"
Uhm, what? No one got a price drop; gamer cards didn't become cheaper. A reduction in power consumption? Yes. Cheaper? No.
 


Protein Folding Coins do exist.
 
I have to wonder how much of the system noise is really attributable to the PSU...

I have a similar-model Seasonic PSU, an i5-3570K (stock), and a pair of Superclocked EVGA GTX 570s in an old InWin Q500 case. I've experimented with the various fan speeds once the system had been on for a while, and the PSU fan was (subjectively) the noisiest *by far*. Even setting both blowers and the case/CPU fans to max (when the system was cold) didn't generate nearly as much noise as the PSU alone...
 
Should have been a 4770K at 5 GHz and not a six-core at lower clock rates. I've seen this over and over: take a 4770K, disable HyperJunk, clock the bastard to 5 GHz, and watch it eat any other CPU for lunch at gaming. Simple as that. If you want threads, then you can get Socket 2011. But for gaming, the 4770K wins. Come on, Tom's, you just wrote that 2 months ago?
 

Do Tell. No really, I'm lucky to see $20 a day from a pair of R9 290s, and that's gross (not net) in Litecoin. And I really am just using that machine as a space heater.
Very little. You can't hear these PSU fans because the GPU fans are so loud. One of the themes of this build was to take a pair of top-performing good-value cards that Chris rejected for noise, and make a livable system of them. And that effort succeeded at stock speeds. BTW, the cards were set to Uber mode at stock.
Sorry, but you thought this was a gaming build? You must be reading what the other guys said. I never agreed to make my build a gaming build: I wanted to beat my $2550 system in everything INCLUDING games, not games to the exclusion of everything else.
 
Call me a noob, but why is Tom's still using BF3 as a benchmark if BF4 is out? Is it because the previous PCs were benchmarked using BF3 too?
 