Review of 6 New KT133A Boards

IntelConvert

Distinguished
Jan 6, 2001
I found today's new review by Patrick to be very good (and most importantly, it was to the point and objective). I appreciated the Win98-Win2K comparison in the BAPCo SYSmark 2000; however, I wish Patrick had also included Content Creation benches (with both O/S's), which were glaringly missing.

That should complete the KT133A board evaluations, so now TomsHardware should move on to examining the new DDR boards (and especially those based on VIA's highly touted KT266 chipset).

Edited by IntelConvert on 02/21/01 09:19 PM.
 

pvsurfer

Distinguished
Jan 4, 2001
While it was good coverage of the 6 new boards (and I too appreciated seeing the Win98-Win2K performance deltas), I have a problem with this and almost all reviews nowadays. The benchmarks place way too much emphasis on gaming! Now, before I get flamed for that remark, I want to say that I fully recognize gaming's popularity, and there certainly isn't anything wrong with reviews reflecting that popularity. I just find the reviews are too heavily weighted to that end (test setups and benchmarks), to the extent that the interests of those of us who use our PCs every day for productivity-oriented tasks are not well served.

Just one guy's opinion; I welcome yours...
 

IntelConvert

Distinguished
Jan 6, 2001
"The benchmarks place way too much emphasis on gaming!"... Wow, I'm surprised that remark didn't get you torched!

BTW, I don't know of any review site that doesn't heavily weight games/3D in its benchmark testing. Hopefully, TomsHardware (as well as our gamer friends out there) will understand our call for more balanced testing.

Tom/Patrick: Please accept these remarks constructively!
 

Guest

Guest
The benchmarks usually DO put too much emphasis on gaming. Both sets of these reviews have been fairly balanced, but I am wondering why he avoided complimenting the Epox 8KTA-3 board in the first review. All he did was call overclockers freaks for wanting to use it! In this new review, he updated the BIOS on some of the other members of the first review, but not the Epox. It still performed above average, but how much better would it have been? This is one of the best motherboards I've had the pleasure of using in years. I highly recommend it.
 

Zenthar

Distinguished
I must agree with you. I don't remember if it was on tomshardware, but I used to see a Linux kernel compilation benchmark, and the first time I saw it I thought, "Hey, what a great idea for a benchmark!" But now I don't see it as much. All we see are benchmark scores "simulating" high-end PC usage.
 

pvsurfer

Distinguished
Jan 4, 2001
I'm OK with (as you put it) "simulating" high-end PC usage; I consider my PC interests to be high-end. I just have a problem with so much emphasis on gaming benchmarks, and I am trying to make a case for more balance by having Tom include more productivity benchmarks.
 

stable

Distinguished
Feb 13, 2001
The reason gaming is so heavily weighted is that most productivity applications have already capped out at 600MHz-800MHz with a 133MHz FSB. Gaming puts more torture on video, RAM and CPU utilization, while also demonstrating a larger compatibility range.

The real problem for general application software is its inability to multi-task effectively. I would say that more people crash from multi-tasking than from any other issue. This is more an interrupt/cycle and incompatible-multitasking issue than anything else, and while it doesn't show a great deal of single-application load, it does give a good representation of GENERAL stability for the largest audience. The bummer here is that you really can't show the power of the processor and chipset by demonstrating "crashability through multi-tasking." At best you can only reveal weaknesses in programming code and device drivers. That is also why you are seeing more and more tests measuring MPEG4 compression and rendering, as well as results from other applications that require a great deal of vertical-market power yet still fall into a larger audience group for measurable performance.

Gaming testing actually reveals quite a bit, in that it also helps to identify just how well some of this programming thoroughly exploits the hardware designs and presents a better performance picture across multiple architectures. That's something Microsoft just doesn't think about when making an office application (okay, any application).

Gaming uses pretty much all of your devices as well, so problems show up much sooner. Your video, sound, CD-ROM, network, modem and hard drives are all working away, whereas problems with measuring productivity for office applications are lucky to even get mentioned.

Steve Benoit

Stable Technologies
'The way IT should be!'
 

pvsurfer

Distinguished
Jan 4, 2001
Steve: I don't think that I understand the basis of your statement "The reason why gaming is so heavily weighed is due to the fact that most productivity applications have already capped out at 600MHz-800MHz with a 133FSB".

To me, productivity apps mean a lot more than just a Word document or a PowerPoint presentation. Those of us who do extensive work with a relational database or an image editor know they will make very good use of all the CPU crunching power you have (not to mention their appetite for RAM and fast HD I/O)! If your statement means that such apps would "cap out at 600MHz-800MHz with a 133FSB", then I can assure you that's not true.

There are benchmarks that are very demanding and more representative of true productivity. One such benchmark is PS5bench. It generates a 10MB, 20MB, or 50MB test image, can invoke up to 21 Photoshop operations, and has a timing feature to record how long each operation takes to complete.
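A benchmark like PS5bench boils down to a per-operation timing harness. Here is a minimal Python sketch of that idea; the two "operations" are stand-ins I made up for illustration, not actual Photoshop filters or PS5bench internals.

```python
import time

def rotate(image):
    # Stand-in operation: transpose the pixel grid.
    return [list(row) for row in zip(*image)]

def invert(image):
    # Stand-in operation: invert 8-bit pixel values.
    return [[255 - px for px in row] for row in image]

def run_benchmark(image, operations):
    """Time each (name, operation) pair and return {name: seconds}."""
    timings = {}
    for name, op in operations:
        start = time.perf_counter()
        op(image)
        timings[name] = time.perf_counter() - start
    return timings

if __name__ == "__main__":
    # Small synthetic "test image" (a grid of 8-bit values).
    test_image = [[(x * y) % 256 for x in range(512)] for y in range(512)]
    results = run_benchmark(test_image,
                            [("rotate", rotate), ("invert", invert)])
    for name, secs in results.items():
        print(f"{name}: {secs:.4f}s")
```

The per-operation breakdown is what makes this style of benchmark useful for comparing parts: a board that wins on one operation may lose on another that stresses memory bandwidth instead of the CPU.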

I fail to see how gaming benchmarks, such as Mercedes Benz Truck Racing or Quake III Arena, can possibly provide an accurate indication as to which of the parts being tested would do a better job in my productivity environment!
 

IntelConvert

Distinguished
Jan 6, 2001
Glad to see a few people in my corner - re inclusion of more productivity benchmarks in Tom's testing. While there are numerous benchmark tests out there, IMHO inclusion of the Ziff Davis benchmarks, especially ZD Media Business Winstone 2000/2001 and ZD Media Content Creation Winstone 2001, would get it done!
 

pvsurfer

Distinguished
Jan 4, 2001
At least Ace's Hardware recognizes that some people use their PCs more for productivity than for play! You might want to check out yesterday's article "The KT133A Decision - Part 2" - it's a good read...

http://www.aceshardware.com/Spades/read.php?article_id=25000186

The only flaw that I found in an otherwise excellent KT133A motherboard comparison was the sole use of Win98 in their test setup. Considering that Win2K is the preferred O/S for business and professional use, that was a real goof-up on their part, possibly affecting the results!
 

stable

Distinguished
Feb 13, 2001
I believe my post stands accurately for itself.

The first line in the second paragraph qualifies my first paragraph by beginning with, "The real problem for general application software". That's about as far as I will go as the comments were never intended for "other" applications, which were covered later in the post.

As for image rendering, I believe I covered that issue, as does Tom's Hardware by now including MPEG4 benchmarks.

As for large RDBMS testing, I would agree that this doesn't fit into the general office application arena. However, I believe anyone with common sense realizes that anyway.

As most RDBMS software is client/server, there are many more issues at play beyond the workstation configuration: primarily the configuration and specifics of the software's deployment on the server, its use of index buffers, cache buffers and RAID on the SERVER (not the client), and too many other issues to list here.

As for the client side, I would agree that specifications should be beefier than a standard workstation; however, for any of these configurations you wouldn't be using ANY of the normal consumer configurations offered by most of Tom's reviews. You have to keep in mind what the largest audience is looking for, and it isn't a solution for crunching their 2Gig Oracle or Sybase RDBMS.

For these needs, you must automatically assume that the user would always be using registered DIMMs and NOT unbuffered DIMMs. Secondly, serious CAD, MPEG4 and large-RDBMS users should always be using SCSI drives and NOT IDE, as this technology provides better performance by LEAPS AND BOUNDS; since time is the primary concern, the cost of these devices is always assumed in the purchase. Just ask anyone who does graphic design or CAD for a living and they'll tell you they don't care that SCSI costs 3 times more; they spend that money because THEY MUST have the performance gains it offers. TIME IS MONEY to them.

It should also be obvious to anyone with a 1GB+ Microsoft Access database that the 800/133 cap I mentioned isn't the best you could get; however, the 'general' user community doesn't have that scenario, and therefore my comments would be accurate. They, again, should find the existing benchmarks more than suitable.

Finally, I wouldn't assume (as you did) that gaming doesn't use databases. I think you'll find there are hundreds of programmers spending thousands of hours hunched over their keyboards writing number-crunching DBs that run those games. While the code may look different, you have to remember that every time you choose to turn right instead of left in Doom, you are enacting "if/then" actions, and there is some serious math going on in there. It's not all graphics, but even in that field there are major calculations being performed, not just some artist's freehand drawing being sent up to your screen. Thus, again as mentioned, these benchmarks are VERY useful for measuring how the system will perform compared to other systems (including the one that the user has on his or her desk now!)

Again, I would qualify all of my remarks by stating that it appears as though the benchmarks used by Tom's Hardware are targeted toward the largest audience, which is general consumers. Specific application requirements (let's say for a 10Gig Oracle DB) should be addressed by satisfying the needs on a component by component basis as this is the only accurate way to satisfy the needs of that specific scenario. In other words, if you have a beefy DB requirement, you shouldn't even be wasting your time reading Tom's Review of a KT133 chipset with a 45 Gig IDE drive and 128MB of PC133 Crucial RAM.

I mean, let's talk common sense here. You guys have to take (and make) these posts using some HONEST common sense and be more practical. For Tom to satisfy every person who came to him and said, "I have this 'xyz' requirement and your benchmarks don't cover that" would result in his benchmark section being 100 pages long, and the testing would take him 200 more hours than he already spends. (Which is already a heck of a lot more than most!) This just isn't realistic to ask for, nor practical to deliver. If you have a suggestion for a SPECIFIC standardized benchmark that accurately represents a clear picture of a vertical market to a large audience, I would suggest that you show its benefits to Tom in an email and maybe, just maybe (like us), he would include it in future testing.

Steve Benoit


Stable Technologies
'The way IT should be!'
 

Guest

Guest
I'm not sure that your statement is the definitive statement about business or for that matter serious personal use:

"For these needs, you must automatically assume that the user would always be using Registered DIMMs and NOT unbuffered DIMMs. Secondly, ... large RDBMS users should always be using SCSI drives and NOT IDE as this technology will provide better performance in LEAPS AND BOUNDS and as time is measured as the primary concern, the cost of these devices is always assumed in the purchase. Just ask anyone that does graphic design or CAD for a living and they'll tell you that they don't care that SCSI costs 3 times more, they spend that money because THEY MUST have the performance gains that it offers. TIME IS MONEY to them."

Money is money, and very often getting money for hardware is harder than getting money for staff. I have seen the same story in many companies: they would rather spend 100 times more on staff costs than get a) more disk space, b) more CPU, c) another computer, etc. Hence we are stuck with making our workstations work like servers. We create our pilots on our workstations, and after a year or so these get put onto a production box. Very often the pilot needs to be proven before any monetary or staff resources will be forthcoming.

I created a report server on my workstation to prove the concept and to create much-needed reports for finance. I pulled some extracts from a production AIX DB2 database and loaded them into MS Access, with forms and reports finance could use. The total cost was my time to create this system, which was about a month elapsed and about two weeks of effort. Then IS came in and took this over. It took two people full time just to run my report server, with no development. Then they were going to put this on a real server (much like what you describe), and the cost was immediately $250,000. Then they needed to put it into an IS-"approved" database, and it took 6 people 8 months to do it. I guess the total cost came to about $1M.

With this kind of experience, our department is much more hesitant about letting IS do another project, because it means a lot of money. We lean much more toward having a couple of cheap machines act as servers and doing the job ourselves. Hence, the comments others made about non-gaming reviews and applications make sense in this context.

I find more and more people are buying computers not just with a gaming interest, but putting Linux on them and running some real applications.

So while I feel for the considerable effort various people around the world are putting in to create these reviews, I agree that there should be some effort put into non-gaming applications as we don't always play games.

As for your statement, "I mean let's talk common sense here. You guys have to take (and make) these posts using some HONEST common sense and be more practical" - I think that's exactly what these guys are doing and saying. You should respect that, and if you don't agree, don't equate your own view with common sense and cast others as lacking it.

They are expressing their opinion just as you are yours.
 

Guest

Guest
I'll add one more vote to the "Less emphasis on gaming benchmarks" camp.

I thought we might be moving in the right direction with the Linux kernel compile benchmark (I spend a fair bit of time watching 'make' do its job), but Tom doesn't seem to use it anymore :-(

How about some simple integer and floating-point benchmarks (SPEC?)? The "real world" benchmarks don't mean much to me. I rarely use typical office apps, and when I do, they're plenty snappy on a low-end system. But I'd give a lot to shave a few minutes off my half-hour FPU-intensive computation runs (or a few days off my multi-day runs :).
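For what it's worth, the kind of raw floating-point timing being asked for here is easy to sketch. The workload below is an arbitrary illustration I chose (summing an alternating series), not a SPEC test or anything Tom uses:

```python
import time

def fp_workload(n=1_000_000):
    """Accumulate n terms of an alternating floating-point series.

    Purely a synthetic FPU exercise; the partial sums converge
    toward ln(2), which makes the result easy to sanity-check.
    """
    total = 0.0
    sign = 1.0
    for i in range(1, n + 1):
        total += sign / i
        sign = -sign
    return total

start = time.perf_counter()
result = fp_workload()
elapsed = time.perf_counter() - start
print(f"sum = {result:.6f}, time = {elapsed:.3f}s")
```

A fixed workload like this gives a single repeatable number per machine, which is exactly what makes it useful for comparing boards, unlike a scripted office-app run whose result depends on a dozen other components.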

How about the time to complete a SETI work unit, or some SPEC benchmarks? I think Anandtech has a good idea with their stability "torture tests" as well, but I'd like some stats on how repeatable they are.
My $0.02
Eric


In theory, there is no difference between theory and practice.
In practice, there is.