High-End Personal Workstation Guidance



In my opinion, things have changed a lot recently. It certainly used to be true that a workstation card made a big difference. The software was written using OpenGL, and the workstation cards were designed to take advantage of this. Now Autodesk has (very recently) changed their tune (and their promise) and switched to D3D. The gamer cards are designed for D3D.

I never did ANY 3D programming for any API, so I am no expert. I started thinking about this new machine a couple of years ago, and at that time a workstation card was mandatory. So the fact that a gaming card might be the way to go now TOOK ME BY SURPRISE, to say the least.

As far as the cost difference goes, don't overlook the importance of support. Unlike you, institutions MUST have support. At this time, I don't believe Autodesk or any hardware manufacturer will provide support for gamer cards used in a workstation environment.

If all goes according to plan, pretty soon WE WILL know for sure how the two cards compare. The GTX-580 should be here on Thursday. I'll have some numbers shortly thereafter. I'll let you know as soon as I run the benchmarks.

As far as keeping both cards: NO WAY! No way can I afford to do that. Ain't happenin'...
 
I am RUSHING out the door to get advance tickets to 3D IMAX TRON, showing a block from my house ... I'm gonna camp out in line, and my daughter is going to meet me there. YAY!

To me? ... This is a little slice of heaven ... Seeing the new TRON (in all its intended glory) AND a dinner date with my "babe" daughter ... Life is great today!

(Camp Granada ... if yesterday is taken into account)

http://www.youtube.com/watch?v=D2Hx_X84LC0&feature=fvw

 


I called Newegg and got an RMA. I'm sending the whole thing back. 😉
 



Oh, it's OK. The re-stocking fee would have only been $450.
 

 
Hello, everyone. I was thinking about making a new thread for this, but the wealth of knowledge in this one (Alvin 😛) is astounding and I couldn't resist.

My dad bought me this computer years and years ago when I was around ten years old. It's been a fine machine, always on since then, but it just can't keep up anymore. It's gotten to the point where I can barely play Flash games on the web, and even then some won't work. I was playing World of Warcraft at bare-minimum settings and pulling 10 frames/s, but Cataclysm's updates killed that altogether. It's time for an upgrade from this ever-loyal SR1500NX (completely stock, barring the 512MB of RAM I added).

Anyhow, I'm somewhat technically competent, and I decided to take a shot at building a new computer for myself. I would have done it a lot sooner, but it's taken me this long to save up enough cash to do anything. I'm shooting for something that can handle WoW and Starcraft II and what have you at maximum settings and still put out a decent frame rate.

Here is the build I came up with:

[image: screenshot of the proposed parts list]


I started by picking out the processor and graphics card. I chose the motherboard by filtering for a compatible socket with SATA 3 and USB 3, then sorting by highest rated. The Gigabyte board looked good. The price on the RAM seemed good, and it appeared to have a decent clock. The rest of the build just sort of fell in after that.

However, I was up until seven in the morning last night reading not only this thread but several articles from Tom's Hardware. I found out two things: one, that I'd be better off with a GTX 570; and two, that there's something called Sandy Bridge.

The eight-hour marathon of research, and mainly this thread, has sown a lot of doubt in me. This is the most money I've ever spent in my life, and I don't really want to mess it up.

A little background on the financials: I've been saving my entire life, never made a withdrawal, and I have $1500. $1k is the max I'd like to take out of this. The parentals are willing to chip in $500, which brings us to a total of a $1.5k budget for the rig. I already have the keyboard that came with the Compaq, a wireless mouse, speakers, and a monitor (a TV actually, a 32" Vizio plasma).

I need some guidance from people who actually know something about hardware. I've been flying by the seat of my pants, and I think I did pretty well for myself, but I don't really know.

Also, would I be better off just buying a computer off of Dell or the like? What's the benefit to doing it myself?

Thank you for taking the time to read this. I hope you all have some insight.
 
Taylor,
I think you did a fairly decent job on the build (more than halfway to my certification) ... I hope you will be pleased to hear that it represents about 100% overkill, and we can give you more than you were "asking for" at nearly half the price.

Recommend:

Phenom-II 965 with a Hyper 212+ cooler.
4GB (2GBx2) of RipJaws 1600c7 memory (kit)
Gigabyte UD3H mobo with an 890 chipset
One GTX 570 or 580 [at least 1GB GDDR] (Sapphire, PNY, EVGA, GIGABYTE, ASUS)
One Vertex-2 ~120GB SSD (boot/apps/games)
One Samsung Spinpoint F3 HDD @ 1TB
One SATA DVDR (ASUS, SAMSUNG, SONY, LG)
One Antec 300 (*ILLUSION* model ... comes with fan)
One Corsair 750 Watt PSU
One 23" ASUS or SAMSUNG LCD Displays (6ms or less response ... 1920x1200)
... One (appropriate) video cable and an extra (matched RPM/CFM) CM 120mm fan (push-pull cooling).
*** I will dig this matching fan part out for you, if you should opt for this route (Hyper 212+).

DO start a NEW thread, once you have this system mostly spec'd out ...
... (Follow the "How to ask for help" sticky form at the top of this forum's main page.)

Come back here and post your new thread link, or PM me.

NO ... You cannot SLI on an AMD mobo ... but ... you WILL NOT need to !!

PS: Don't let the combo deals (or rebates) lead you around, by the nose. Spec the best parts!

Also see HAF 922 Case.

If you were to go with a 570 and the Antec 300 and swap the SSD for a 2nd 1TB F3 HDD ... also dropping down to a Phenom-II 955 (will clock the same) ... We can prolly shave a few hundred more bucks, without you feeling ANY drag.

AMD/ATi also makes some really decent gaming cards which COULD Crossfire on this system (but I DEFINITELY recommend *ONE* really good GPU, not two).


... And ... We could go even cheaper than THAT (and still game like a demon) by going with the next-to-fastest Athlon-II X4 proc, a MicroATX 7xx chipset, 4GB of DDR2, two 500GB spinners, and a Centurion mini tower ... with a SCREAMING ATi GPU, and you will STILL be WAY WAY better off than you are right now.

(Caveat: MicroATX mobos only have ONE GPU slot ... Crossfire is not possible. Who cares?!)






 



I wouldn't have believed that you could run well at ultra for that. I'm impressed. However, I'm trying to future-proof this a little bit, because I don't really see myself having the cash for a new rig for another several years. That's why I went with the build I did: it should be able to hang on for years to come, and I've got the option of going SLI.

I'm somewhat brain dead at the moment, and I'm going to turn in for the night, but I'll hit you up tomorrow.

I REALLY, REALLY appreciate your input. I don't know if you can imagine how many circles I've spun myself around in; it's been a hell of a ride. Thank you.
 



*** DO NOT HIJACK THIS THREAD ***
*** START YOUR OWN OR PM ME ***

*** YOU WILL BE IGNORED, ON *THIS* THREAD, FROM THIS POINT FORWARD ***

I understand your philosophy, but I must recommend against any attempt to "future-proof" a new build for the next 6 months.

If you can build a system that is almost four times as powerful as your current build for "half the cost", that means you will be in a position to upgrade much sooner, and will be in a better position (financially) to build a "dream system" at a time when it will really count.

There is a lot going on right now, in the bizz ... This is not just another product cycle that we are in ... THIS is a major platform shift ... Only happens every 4 years.

There are FANTASTIC values in perfectly good (screaming fast) AMD kit ... except for the cheapest build that I spec'd.

>>> Either of the Phenom-II X4 builds WILL get you WAY down the road (WITH OPTIONS!). <<<


 


Well ... If you liked the first one ... You will like the sequel.

I am actually more excited about FUTURE sequels, now that FLYNN's son has brought his digital GF into the REAL WORLD ... THAT "line" has some cool possibilities.

The prob with the original, and with this sequel, is that there is a dearth of emotional engagement ... A few moments of tension and anger between father and son, but no sizzlin' love interest and no warm fuzzies of any kind ... Emotionally, it's kind of bland.

Visually? Very cool, though.

I was neither thrilled nor disappointed. Totally worth our time and money, tho.

We took photos of ourselves wearing the new (super stoopid looking) 3D glasses ... very "special" ... and ... NO! ... not gonna post those photos ... In fact, I offered my daughter $20 to delete them ... FAT CHANCE !!

 


I'd say the original Tron is one of those movies you just cannot watch nowadays because of the dated technology. I wonder if that is how it will seem 30 years from now? Nobody will want to watch a movie like Avatar because the special effects are SO bad?

Anyway, I see the GTX-580 is "out for delivery". Should know pretty soon how much difference there is.

What about monitor resolution? I don't have her big monitor here. All I have hooked up to it now is an old Sony 19" CRT @ 1280x1024. How does that affect the benchmarks?
 
I'm getting significant differences between BIOS settings and what CPU-Z is reporting, for both the CPU and the memory. My clock multiplier is set to 23X, but CPU-Z is reporting 24. If I set it back to 22 in the BIOS, then CPU-Z reports it correctly. But if I set it back to 23X, again CPU-Z reports it as 24.

For memory, BIOS auto has them at the correct 7-7-7-20, but CPU-Z is reporting 9-9-9-23!!!

I assume CPU-Z is the one that is accurate?
 
OK, sorry, I figured it out. I needed to turn Turbo off to get the 23X clock multiplier. Duh...

And I needed to turn off "auto" in the BIOS memory settings and set them all manually to 7-7-7-20.

Now CPU-Z is reporting what the BIOS settings state. I'll re-run the benchmarks to see how much difference the faster memory timings make.
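While those run, here's the back-of-the-envelope arithmetic I'm going by (a rough sketch; the 133 MHz base clock and DDR3-1600 figures are assumptions typical for this platform, so adjust if your BIOS says otherwise):

```python
# Rough arithmetic for the multiplier and memory-timing numbers above.
# Assumed values: 133 MHz base clock, DDR3-1600 (800 MHz I/O clock).

bclk_mhz = 133.0

# Core clock = base clock x multiplier. Turbo bumps the multiplier by one,
# which is why CPU-Z showed 24x while the BIOS was set to 23x.
for mult in (22, 23, 24):
    print(f"{mult}x multiplier -> {bclk_mhz * mult / 1000:.2f} GHz")

# First-word latency in nanoseconds = CAS cycles / memory clock.
mem_clock_mhz = 800.0  # DDR3-1600 runs its I/O bus at 800 MHz
for cl in (7, 9):
    print(f"CL{cl} -> {cl / mem_clock_mhz * 1000:.2f} ns first-word latency")
```

So CL7 vs CL9 works out to roughly 8.75 ns vs 11.25 ns per access: measurable in a memory benchmark, modest everywhere else.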
 


"ON" ... absolutely ... "ON" ... definitely ... "ON" ... indubitably ... "ON" !

Leave HT set to "ON" at all times ... at any clock.

While there are some instances where turning HT off would allow you to clock faster (i.e., run cooler), those would only include situations which would not (ever) address more than the four physical cores (4 threads max).

For any kind of render/transcode ... HT enables 8 threads ... Along with some extra heat.

Since you are not pushing your clocks ... and since you have a great cooler, you can easily do 3.8GHz with HT on "all the time", without shortening the life (MTBF) of your expensive CPU (very much).

Gamers would want to turn it off (imo) because games use 4 or fewer cores anyway, and gamers want/like to push their clocks.

... Same goes for a Business VP "Power-User"/"Multi-Tasker" (MAY-be).
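If you want to sanity-check that HT is actually on from inside Windows, here's a quick sketch (it assumes the third-party psutil package is installed; Task Manager's logical-processor count tells you the same thing):

```python
# HT check: logical CPUs should be double the physical cores
# (8 vs 4 on this quad-core chip).
import psutil  # third-party: pip install psutil

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
print(f"{physical} physical cores, {logical} logical CPUs -> "
      f"HT {'ON' if logical == 2 * physical else 'OFF'}")
```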



Turbo is a different story.

 


On, got it.

By the way, I started having instability problems after I manually set the memory timings to 7-7-7-20. I figure it's because I upped the B-clock. I'm sure there's a way to slow the memory back down so I can use the faster timings, but I DON'T HAVE ANY TIME to figure anything out anymore, so I set them back to AUTO for now, and they went back to 9-9-9-20-something.
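Here's my back-of-the-envelope math for why that happened (the memory ratio and the raised BCLK are assumptions for illustration; the BIOS has the real values):

```python
# Raising the base clock overclocks the RAM too, because the memory
# data rate is derived from it: data rate = BCLK x memory ratio.
mem_ratio = 12  # assumed: the ratio that gives DDR3-1600 at stock BCLK

for bclk_mhz in (133, 145):
    print(f"BCLK {bclk_mhz} MHz -> DDR3-{bclk_mhz * mem_ratio} effective")

# DDR3-1600 CL7 sticks pushed to ~1740 MT/s are past their rated spec,
# so the tight 7-7-7-20 timings may no longer hold without dropping the
# memory ratio or adding a little voltage.
```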

I HAVE THE 580! Wow, is it impressive! (To look at, and to hold; it's VERY heavy. I haven't installed it yet.)

I'm madly trying to run benchmarks on the Quadro before I uninstall it and its drivers.

OK, back to work. Let's see, did he say HT on, or off...


 


Yes ... I should have explicitly instructed you to MANUALLY enter the PUBLISHED (data-sheet) timings for YOUR specific modules, in ANY system and with ANY sticks (AND "SAVE" those BIOS settings on exit).

And, almost in the same breath ... that more sticks (or more DDR per stick) will suck more juice, and that it is "likely" you may have to rotate modules and boost RAM voltage (by only hundredths of a volt at a time) in order to get it to run PRIME95 *STABLE* for days on end.

Otherwise, the sys may hang during the most taxing ops (longer renders and greedy games).

Keep that in the back of your mind(s).


 
Oh, Alvin, I am *SO* glad I didn't send that Quadro back!!!

I'm trying to think of a movie where a guy ALMOST does something REALLY stupid, but then at the last minute something happens to save him so I can post a Youtube link. But I'm not you, so I can't think of one. Sorry.

Yes, the GTX-580 runs 3DMark 11 a lot faster. Three times faster (although that still seems way too slow). But Specviewperf 11 will HARDLY even RUN! Not only that, but you can't see anywhere near the same amount of detail as you could with the Quadro. It would be useless in a design environment. (Yes, I have BOTH PCIe power connectors connected, one 8-pin and one 6-pin.)

The difference is ASTONISHING. The Quadro beats the GTX-580 by a MILE, more actually.

I never bothered to use the driver off the CD; I got it straight from the EVGA driver download site, same as NVIDIA's, 263.09.

So, I don't know. Something is wrong. The 3dMark 11 score (P6061) is too low and it is not even close to being usable according to the Specviewperf 11 test.

Talk to me, Goose.




 


HHAw-ONK ... Haw-ONK !

I am rather stupefied, ATM, to be honest.

You are the one with the flush test lab.

You know, tho, one possible kink that does spring to mind is that some config file was "constructed" for ONE of those cards and then applied to the other ... i.e., the card was changed, but some aspect of the software configuration was not changed (as well) to reflect the presence of the new card.

Mannnn ... Am I spinnin' yarn, here, or what ???

How 'bout some of that "world-famous" Quadro telephone/email support??

:heink:
 


A completely different analogy than you had stipulated, but (perhaps) "a bit more like it" ...

... You'll have to watch the whole thing, to "get it".


http://www.youtube.com/watch?v=qZxDh3hdIyw

 


Here is the procedure I used to remove all traces of the old drivers:

Section A Control Panel

1. Uninstall any video card overclocking programs that are tied to your drivers. (Rivatuner, ATi Tool, etc.)

2. Uninstall video drivers. Generally this is done through the Control Panel (Add/Remove Programs for XP, Programs and Features for Vista) in Windows. For ATi it's called "ATi - Software Uninstall Utility"; for Nvidia it's usually labeled as "Nvidia Display Drivers". Restart. When restarted, don't allow Windows to install any drivers for your cards yet; just cancel out. Re-check Add/Remove Programs / Programs and Features for any remaining ATi/Nvidia display-associated drivers.


Section B Registry
1. Click Start->Run. Type: regedit

2. In the registry editor, click File->Export. Save to your desktop. (Now you have a backup of your registry.)

3. Expand the folder "HKEY_CURRENT_USER" (by clicking the "+" sign), then expand the folder "Software".

4. Find and delete any files/folders pertaining to your old video drivers in the Windows folder and Program Files folder(s). BE CAREFUL NOT TO DELETE ANY CHIPSET FILES OR FOLDERS.

5. Follow steps 3 and 4 for the folder "HKEY_LOCAL_MACHINE". Click File->Exit and you are done.


Section C Windows
1. Delete the ATi/Nvidia folder on your system drive in the "Program Files" folder or wherever you saved it.

2. Go to the drive that Windows is stored on and double-click on the Windows folder.

3. Delete everything possible in the Temp and Prefetch folders.

4. In Vista/7: Start-->Search-->For Files and Folders. Type ATI in the search box, go to advanced and be sure to tick the box to search unindexed drives (if indexing is off). Two ATi folders should pop up under C:\Users\"yourname"\appdata. Delete both.

Or, you can find the folders manually. These files can be located, but first you need to make sure that you have "Show Hidden Files/Folders" selected under "Folder Options" in the Control Panel. Once done, go to My Documents->AppData->Local and delete the ATi folder. You may find another folder to delete in My Documents->AppData->Roaming.

5. Run Driver Sweeper

6. Empty the Recycle Bin. Restart.
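If you're comfortable with Python, here's a rough sketch for double-checking leftovers before you install the new drivers (the paths are the usual Vista/7 defaults, so treat them as assumptions; it only lists candidates and deletes nothing, per the chipset warning above):

```python
# List leftover ATi/Nvidia folders after an uninstall. Deliberately
# skips "AMD" so chipset files are never flagged.
import os
from pathlib import Path

VENDOR_NAMES = ("ati", "nvidia")
SEARCH_ROOTS = (
    Path(os.environ.get("ProgramFiles", r"C:\Program Files")),
    Path(os.environ["USERPROFILE"]) / "AppData" / "Local",
    Path(os.environ["USERPROFILE"]) / "AppData" / "Roaming",
)

for root in SEARCH_ROOTS:
    if not root.is_dir():
        continue
    for entry in root.iterdir():
        if entry.is_dir() and entry.name.lower() in VENDOR_NAMES:
            print("leftover candidate:", entry)
```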


Install New Drivers


I followed it to the letter.

I already called Quadro tech support at the beginning of all this. They told me not to worry about 3DMark 11 and to use Specviewperf, which of course I did, and my scores were pretty much in line with the published norms.

So where am I? It's the day before Christmas Eve. I can't load up 3ds Max, CS5, et al. until Christmas Day. She and I will do that, and THEN we'll go from there, I guess. Does that sound like a plan, or is the low 3DMark 11 score telling us I've got a hardware/software problem somewhere that STILL needs to get fixed?