He doesn't know that more RAM will improve the OS performance.
That's really only true up until a certain point, dependent upon your setup and usage. There is some caching/preloading of commonly-used applications in RAM, but it's not quite as much as you might think. Your computer will run a word processor, Web browser, or e-mail client at the same speed whether you have 512MB of RAM or 128GB.
Sorry but when it comes to Vista, you're wrong about that. And I'm not talking about a subtle wrong, or a maybe we're splitting hairs wrong, or maybe it's a subjective thing wrong. You're wrong, wrong, absolutely positively way the hell wrong here.
Plus Dvorak talks about scoring movies with his Vista system. Video editing LOVES memory. Lots and lots of it. The more the better. Don't take my word for it (even though it's a sincere statement drawn from personal experience, and never mind the fact that I work for a television network) - go ahead and
Google it for yourself.
In this age of VM software he thinks dual-boot is a practical solution, and it seems like he has no idea what the MCE to Vista upgrade path is.
VMs are good, I'll tell you that. I run 64-bit Linux and it's nice not to have to reboot to open some %$*! Windows-only thing (usually some ActiveX-dependent website or DRM'ed file). But they do have their shortcomings, particularly that they can't handle accelerated graphics, access all of the host machine's hardware directly, or load drivers. Windows has to restart a lot anyway, [...]
No argument about that. I wish I knew what you folks were doing that causes you to restart Windows so often, though. Y'all running Windows ME or something? I've got XP Pro, 2000 Server, 2003 Server, etc. scattered all over my house and it's all been rock solid. At the office we had a funny incident recently: an NT4 box runs a middleware app with a known memory leak, and because of that leak it was supposed to be on a weekly reboot schedule. It crashed. Turns out nobody in the data center had been recycling it, and it had run untouched for a year and a half before the leak finally caused the kernel to barf.
His "perfect" PC uses RAID-5?
RAID 5 is not bad, truth be told, but it's not really optimal for straight desktop usage. It's expensive: you need at least three HDDs, plus a RAID controller if your motherboard or OS doesn't support RAID 5. Write speed is also much slower on RAID 5 than on a single drive, since every small write turns into a read-modify-write of the parity.
Exactly. And what's another drive (for RAID 0+1 or RAID 10) if you're going to spend that much cash anyway?
While Dvorak didn't say what RAID level he was planning, any solution with three drives is going to be silly for desktop use.
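To put a rough number on that write penalty: here's a back-of-the-envelope sketch (my own illustration, not anything from Dvorak's column) assuming the classic RAID 5 small-write cost of four disk I/Os per logical write (read data, read parity, write data, write parity) and a hypothetical ~75 IOPS desktop drive.

```python
def raid5_write_iops(n_drives, drive_iops):
    """Effective small random-write IOPS for an n-drive RAID 5 array.

    Assumes every small write costs 4 I/Os (read-modify-write of
    data block and parity block), spread across all drives.
    """
    return n_drives * drive_iops / 4

# Three hypothetical ~75 IOPS desktop disks in RAID 5:
print(raid5_write_iops(3, 75))  # ~56 IOPS -- worse than one bare 75 IOPS drive
```

So for small random writes, the three-drive array can actually land *below* a single disk, which is exactly why RAID 5 is a strange pick for a desktop.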
He accuses Windows of bottlenecking LAN performance. Oh, well yeah, you're right, this is the same guy who doesn't know RAM will improve OS performance.
And Dvorak's 100% right that it's hard to get more than 500 Mbps out of GbE adapters under Windows.
Buy better NICs and better switches. I've had no such issues. We have servers that are entirely capable of saturating Cisco gig switch ports.
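The wire itself certainly isn't the 500 Mbps bottleneck. A quick sketch of the theoretical TCP goodput ceiling on gigabit Ethernet, assuming standard 1500-byte frames and textbook per-frame overheads (this is my own arithmetic, not a figure from the thread):

```python
def gbe_tcp_goodput_mbps(mtu=1500):
    """Theoretical TCP payload throughput over gigabit Ethernet.

    Assumes standard framing: 20 B IP + 20 B TCP headers inside the
    frame; 14 B Ethernet header, 4 B FCS, 8 B preamble, and a 12 B
    inter-frame gap on the wire, with no TCP options.
    """
    payload = mtu - 20 - 20        # bytes of actual application data
    wire = mtu + 14 + 4 + 8 + 12   # bytes each frame occupies on the wire
    return 1000 * payload / wire   # line rate is 1000 Mbps

print(round(gbe_tcp_goodput_mbps(), 1))  # ~949 Mbps
```

So roughly 950 Mbps is achievable in principle; if you're stuck at 500, blame the NIC, its driver, or the host, not Ethernet.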
Wireless as a LAN backup??
I think he means also being able to use wireless to connect to the local network. This is handy if your house isn't wired for Ethernet and your desktop doesn't sit within cable reach of the modem or router.
Since he said "I like having two network access systems. The first would be a gigabit controller, which should be on the motherboard by now. [...] Let's also add a wireless backup", that tells me his primary connectivity would be copper. While I'm having trouble envisioning the old fart moving his perfect seventy-pound desktop all over his hovel, perhaps I'm wrong.
It also reminds me of how clueless Dvorak is since it's nearly impossible to find a motherboard that, if it integrates a NIC, doesn't integrate a gig NIC.
And moving on to that AMD mule and prostitute show, when the benchmarks I have seen show the E6300 and the X2 4200 to be fairly close in overall performance, one has to wonder about the alleged performance difference. Although I'm not surprised a GeForce chipset would generally trounce those Intel controllers.
The E6300 pulls roughly even with the X2 4200+ in SSE-heavy applications thanks to its 128-bit SSE execution engine; it's much faster in integer math but slower in non-SSE floating point than an equivalently-clocked K8.
Yes, the AMDs are almost universally superior with the FP work. Somehow, though, I'm having trouble envisioning Mr. Dvorak doing laboratory-variety number crunching while editing his videos, composing his ill-informed articles, and playing whatever game he thinks he needs four video cards for. But who knows, maybe I'm wrong. You think AMD's mule and prostitute routine showcased math apps? Beats me.
-Brad