X800GT Benchmarked

Yep, link's been snuffed.

Now says <A HREF="http://www.hkepc.com/" target="_new"><font color=red>(Coming Soon)</font color=red></A>

LOL!

Just change the link to page 2 and you're good to go for the rest of the review; they only locked the front door:

<A HREF="http://www.hkepc.com/hwdb/x800gt-2.htm" target="_new">http://www.hkepc.com/hwdb/x800gt-2.htm</A>

:evil: 😎 :evil:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
I saw it when you first linked it, but I still can not read it any better.

<pre><font color=red>°¤o,¸¸¸,o¤°`°¤o \\// o¤°`°¤o,¸¸¸,o¤°
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign</pre><p></font color=red>
 
I have a 6800 128MB with 16 pipes unlocked, and my card is not crushed when SM3.0 is enabled
First of all, an unlocked card isn't mid-range anymore; it's something close to a de-tuned GT.

on games there is a performance dip, but I average 30fps in Chronicles of Riddick,
First, Riddick is OGL, but what resolution do you have to drop down to in order to get the same playable frames? Look at the GF7800GTX's performance and you don't see nearly the same drops. And be sure you're using OGL2.0++; that's where the performance hit is felt, not in the normal OGL2.0 setting:

<A HREF="http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/page6.asp" target="_new">http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/page6.asp</A>

To me that's just enormous, far worse than SM3.0 in FartCry and SplinterCell.

and the graphics look a bit crisper/smoother and better looking on the 6800 than the X800XT.
A bit doesn't make it a killer app, just in the same way that DX8.1 didn't make the R8500 a better choice than the GF4ti.

and the performance decrease is not what you say it is
The drop in FartCry is so large that doing HDR on the X850XT using 3 passes would bring it to about 80-90% of the GF6800U's performance (51fps down to 19 @ 16x12). To me that's a crippling impact, and the same goes for other resolutions.

<A HREF="http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page13.asp" target="_new">http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page13.asp</A>
<A HREF="http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp" target="_new">http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp</A>

This is all based on my opinions and what I have seen. I trust reviews as much as I trust my neighbour: he's OK, but I would not leave him my car keys.
And my position is that I trust reviewers with reputations to lose over other people's perceptions of what may or may not be. At least they use FRAPS, rather than just a gut feel for what may or may not be playable, because everyone's acceptable levels are different.

If you can show me a 'Killer App' then I'd promote it, but right now it's a look-ahead feature, and the non-modded GF6800s can't handle it, let alone the GF6600GTs. Of course that's just my opinion, so you can disagree with it as you see fit.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
LOL! 😎

Use a translator. :tongue:



- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
Well, your version of crippling is far different from my version; a normal PAL TV runs at 25 FPS. I get a 30-35 fps minimum in COR (at OGL2.0++ settings). I play a lot of games and I tinker with the settings to always get the best visuals, everything to max basically; if it can't run on max then it's time to upgrade.

As for FartCry, it runs in SM3.0 (fully patched) perfectly fine with a playable frame rate of 25+. I hear a few people with no experience with an SM3.0 card talk about this performance drop too much; when I send my 6800 128MB to you, then you will see.

I have a Hyundai L90D+ so my res is always 1280x1024 with AA & AF at 4x/4x. I run COR at 0x/0x.

The only area where this card sucks is that it only has 128MB, so I cannot run Doom 3 on the Ultra setting meant for 512MB cards, and you need RivaTuner to unlock all the pipes, but all other settings are good.

Oh, I never said SM3.0 cards are killer; I said they're better than SM2.0, they have that extra box with a tick that says enable, lol!

No offence, but GGA, if you trust every review you read you will never buy anything new. Are you saying you trust www.firingsquad.com to use every up-to-date driver etc. to get the best results? And not all hardware is the same.

Just seen it, this review was done in February; new drivers work magic, you would be surprised what an increase of 5+ fps does to a game.


It's kind of funny, my vanilla 6800 decimates a 6800 Ultra according to the results of this review. I have a 6800, but my card magically does better than this review says, lol.

All of the above are from my actual experiences with my own setup. (This is not an attack on you GGA, I value your opinions and enjoy the discussions you bring up.)

But I do agree with you, the mid-range 6600GTs and standard 6800 cannot handle SM3.0 games at all, unless they have at least 16 pipelines open with a 256-bit memory interface.

Also, the specs on the machine need to be considered; my rig is not magic but it performs well in games.

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
As a 6800U owner who tends to think [H] paints a good playable-settings picture, you have me curious. If anything, I'd say other reviews giving average fps are almost misleading as to actual best playable settings. Yet you are getting better playable results than what FS painted?

It's kind of funny, my vanilla 6800 decimates a 6800 Ultra according to the results of this review
How is that? Please explain. At 1280x1024 4X/16X they averaged 70 fps in Training. What do you get at 4X/4X?

It drops to under 33 fps average with HDR on. Are you saying you run farcry each day at 1280x1024 with HDR on and stay above 25 fps all the time?

I have not tried HDR in any game after my latest driver update. I tend to demo HDR for fun because I can, but a 6800U isn't up to the task of using it IMO. I'd rather keep the resolution up and framerates up, and AA enabled.

Have you tried this <A HREF="http://downloads.guru3d.com/download.php?det=830" target="_new">Farcry Benchmark Utility</A>? It's a fun little tool for comparisons, although in the end it will take FRAPS benchmarks of actual gameplay to get the real picture. 1280x1024 4X/16X would be my preferred gaming settings, but I feel that in some of the most demanding levels I need to reduce it to 2X AA in FarCry or the framerates take away from the gameplay. Even with 2X AA there are areas where it can't stay above 30fps. (I tend to always leave FRAPS running while gaming.) But in most maps I can get away with 4X AA.
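If you're already logging with FRAPS, something like this little Python sketch will pull the average and the worst one-second window out of a frametimes log instead of eyeballing the overlay (only an illustration: the filename and the "frame number, cumulative time in ms" line format are my assumptions, so adjust them to whatever your logging tool actually writes):

<pre>
# Sketch: average fps and worst 1-second window from a per-frame timestamp log.
# Assumes each data line looks like "frame_number, cumulative_time_ms";
# the filename is just a placeholder, adjust to whatever your logger writes.

def fps_stats(path="frametimes.csv", window_ms=1000.0):
    stamps = []
    with open(path) as f:
        for line in f:
            parts = line.strip().split(",")
            if len(parts) < 2:
                continue
            try:
                stamps.append(float(parts[1]))
            except ValueError:
                continue  # header or junk line

    avg_fps = 1000.0 * (len(stamps) - 1) / (stamps[-1] - stamps[0])

    worst = None
    start = 0
    for end in range(1, len(stamps)):
        while stamps[end] - stamps[start] > window_ms:
            start += 1
        elapsed = stamps[end] - stamps[start]
        if elapsed > 0:
            fps = 1000.0 * (end - start) / elapsed
            worst = fps if worst is None else min(worst, fps)
    return avg_fps, worst

avg, worst = fps_stats()
print(f"average {avg:.1f} fps, worst 1s window {worst:.1f} fps")
</pre>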



<A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
 
The "25FPS must be fine because that's what TV is" is commonly trotted out, but IMO it doesn't apply to games. In most games I can 'feel' Lag if they're running at sub-60FPS. It's obviously down to the way they have the game set up to take your inputs or something, as DOOMIII stays responsive even if it dips below 30FPS (although you can see it doesn't <i>look</i> smooth, it <i>feels</i> it). In most games the responsiveness of the controls seem to be tied into the FPS you're getting, and 30 simply isn't enough. HALO was absolutely terrible in this regard. But it was crap anyway, so who cares...

Obviously this is just my own opinion, but it does lend support to GGA's argument that we all have different limits on what is acceptable.

---
<font color=red>"Life is <i>not</i> like a box of chocolates. It's more like a jar of jalapeńos - what you do today might burn your a<b></b>ss tommorrow."
 
How can I explain the results of my system, apart from saying they rock?

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
Halo was pretty good on my FX5600, and that averaged 30fps. I still play Halo online today under the name 7 sins with the 6800-spec machine. I still think Halo is an excellent game and I never experienced lag; it could be your processor which lets you down. I have noticed anything under 20fps gets too laggy; 25fps is the minimum to stop lag in games. That's just my opinion, and it's what I have noticed when playing every game on the PC under the sun.

Mouse smoothing is a b1tch, so always switch that off; that reduces lag a lot. And start Windows with the bare minimum processes running; this increases FPS a LOT.

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
X800GT = 6600GT

But GeForce still have the SM3.0 advantage...

-
GA-K8NF-9 / <b><font color=green>Athlon 64 3200+</font color=green> @ 3800+</b>
Infineon DDR400 (CL2.5) 2x512Megs
<font color=green>GeForce 6600GT 128Megs</font color=green>
<A HREF="http://www.getfirefox.com" target="_new">Get Firefox!</A>
 
I had a Ti4600 at the time, and I found it frankly ridiculous how poorly the game performed. Though it was over 30FPS most of the time, it just felt horribly unresponsive. I tried loads of stuff, tweaks and drivers, before I suddenly realised how vastly over-rated the game is, and went and played something more worthwhile.

If you like bland graphics, boring levels, stupid AI, unsatisfying weapons and overall woeful performance, then I can see how HALO would appeal :tongue:

If they'd not completely fubar'd the conversion, maybe I would've got more into it, but it simply didn't seem worth the effort to me.

---
<font color=red>"Life is <i>not</i> like a box of chocolates. It's more like a jar of jalapeńos - what you do today might burn your a<b></b>ss tommorrow."
 
Well I disagree; it's not just FS that shows those drops in performance. And I would never go on just one reviewer, that makes less sense (almost like people who post Anandtech stuff here).

As for what's fluid and acceptable, we've discussed this a few times, and my position is that it depends on the game. Creepers like SplinterCell and some parts of FartCry don't need high sustained framerates, but games like UT2K4 or HL2/CS:S need good elevated framerates to ensure fluidity in things like jump-strafing, 180-degree turns, etc.

When playing single player I also think it matters less because you won't know what you're missing compared to the guy who's fraggin' yer a$$ in multiplayer.

I don't disagree it's a nice feature, but if enabling it on even the GF6800U using FP16 is almost the same as running it on the X850XT with 3 passes using FP24, then really, the feature and its support on this generation is not that 'impressive', and that keeps it from being anything but a tie-breaker IMO.

Now, as games/engines start appearing that were designed around SM3.0 features, we may see a true advantage; until then the scenes where it's used are limited, and the impact isn't worth it IMO.

Only when a feature is worth taking a performance hit for, like choosing a GF6600GT over an X800XL or an X850XT over a GF7800GTX, does it become a must-have when talking about things like this. Just like FP32 couldn't save the FX line when the performance just wasn't there.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
I disagree with you totally, it does not lag; it's just your machine setup or something you have done wrong somewhere, lol. I bet you have mouse smoothing switched on. Also, Halo relies heavily on the CPU, and to be honest the GF4 Ti4600 was not the cutting edge at the time of this game's release; that was the 5900 Ultra and the super-cool 9800XT (which I used to dream about).

Halo is a resource hog and the single-player missions are OK, but multiplayer is excellent; make sure you have a low ping and a good broadband supplier.

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
What can I say, I agree with you on all points said there. But if you look at the 7800GTX benchmarks from that site whose name I forget, the results just do not look consistent enough and I could not see a reason why; I would say there is a driver problem. Other sites which have reviewed it had some better results, but two systems never act the same.

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
X800GT = 6600GT
That's being generous to the GF6600 IMO.

the X800GT 256MB > X800GT 128MB (256-bit) > GF6600GT.

Only in D3 was there any change of positioning, and that's not enough to make a tie.

If you can explain the GF6600GT 'advantage' you speak of, then fine; otherwise it's the typical checkbox feature, because when it's enabled the GF6600GT would have trouble competing with an X600XT or plain X700, the number of games that support it is minimal, and even then it's tacked on for show. Things may change with the advent of those games that are optimized for it from the ground up, but otherwise there's no 'advantage' yet that falls outside of the 'FX advantage' I mention above.

To me, from the initial review, a better positioning of the X800GT would be (all unmodded/un-OC'd):

GF6800LE < X700PRO < GF6600GT < X800GT < GF6800 < X800 < X800PRO < X800XL/GF6800GT, etc.

From that +/- your preferences for features.

Just my two frames' worth as always.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
The GF7800GTX isn't the issue, and I'd agree on immature drivers for that; I was really focusing on the hit to the GF6800 series, whose drivers should be mature by now, even for relatively new games like Riddick.

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
I agree the X800GT will easily outpace the 6600GT, and if ATI is marketing this kit at the same price as a 6600GT, nVidia will lose the mainstream crown.

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
When I looked at the date for the 6800 review, it said Feb 2005 at the top; can you double-check for me please?

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
 
I based my "X800GT = 6600GT" on the fact that in the HKEPC review, most benchmark numbers show a very close race between the two GPUs.

But there are only 2 real-world benches in this review: D3 and HL2. I agree that at high res/FSAA the X800GT seems to have a small lead, but I doubt we will really notice the difference while playing. I can't wait to read the HardOCP review of this new X800 flavor.

And since this X800 is more like a stripped-down X850, will it be matched with an X800 or X850 CrossFire edition? Another confusing choice ahead for buyers...

-
GA-K8NF-9 / <b><font color=green>Athlon 64 3200+</font color=green> @ 3800+</b>
Infineon DDR400 (CL2.5) 2x512Megs
<font color=green>GeForce 6600GT 128Megs</font color=green>
<A HREF="http://www.getfirefox.com" target="_new">Get Firefox!</A>
 
Yeah the Riddick review (one of the only ones with OGL 2.0 vs 2.0++) is that old, but I don't know of many other reviews that show the difference. Most review the OGL2.0 path alone because they want to compare nV to ATi not nV to nV.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
LOL!

They've updated the benchmarks again with the 128MB and 256MB on the same list (and page 4 leads to page 4 again).

To get to the benchies now you must go directly;

<A HREF="http://www.hkepc.com/hwdb/x800gt-5.htm" target="_new">http://www.hkepc.com/hwdb/x800gt-5.htm</A>

And while you say they are close, in HL2 the X800GT has a 15-20+% lead, while the D3 scores only truly favour the GF6600GT by less than 10%, and only in one test is it above the margin of error. Max playable is the same for both in D3, unless people think avg ~40fps is playable. And max playable for HL2 goes to the X800GT. If experience is anything to go by, D3 will remain the exception to the rule of the X800GT > GF6600GT. But as you say, the margin may be very little depending on the settings and application. For me the true test will be minimum FPS, since that's usually what's affected most by both faster memory bandwidth and added memory. That may be noticeable, but we'll have to wait and see what reviews like [H]'s reveal with their histograms.

As for Crossfire, it'll probably be possible, heck it's possible with about every card now, but whether there would be any benefit is more of a question. Two X800GTs will likely cost noticeably more than a single X800XL, and adding an X800GT to an XL would have no net effect. Even for plain X800 buyers I doubt there'd be the motivation. As for confusion, like was mentioned before, no more than already exists, and those interested in SLi and Crossfire should do their research first, otherwise they'll get burnt just like all those people who bought FX5200XTs and R9200SEs just because they had a whole kick-ass 256MB of memory.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
I'm not even going to bother to mention that whenever "ATI's" so-called crown-stealer mainstream card is out, we might already be looking at the mainstream and budget-end cards for the 7000 series. GGA, you mentioned something about SiS cards previously? If it was to make a point, it really didn't, because the implication that newer things always come out doesn't affect anyone unless they're coming out relatively soon. And to me, it would be stupid of nVidia to have their 7800 GTX card (and the GT) out competing with the entire R520 series.

By that, I am implying that when ATI *does* release their R520, it will be with the other cards of the series, with the exception of perhaps the high-end card. Why do I think this? It just seems stupid that they'd *only* release the R520 card... I mean, nVidia could pull it off since it was before any competition, but now that the competition is there, I think ATI should go full force.

But of course that's just my speculation. But despite everything I've seen, I personally wouldn't invest in either an X800XT or a 6600GT. Although investing in a 6600GT would be more feasible for me, since it contains SM3, which is the same feature as the "next gen." cards have. Even if performance isn't as good, if I were looking for a card to last me longer, that's the one I would personally choose. And I want it to be clear that I'm not putting down the X800XT. I'm simply stating that with SM2, it's not something I'm particularly interested in investing in now.

You might want to, though, as well as many others. But I personally do not. So please don't tell me how my personal preferences are flawed.
 
ATi's mentioned that the companion parts (RV530, and others) will appear close to the R520's launch. Expect the R520 to come in whatever flavours it comes in, and ATi will likely launch whatever it can to fill the gaps (although don't expect an RV530-like card immediately, even if it were ready to ship).

As for SM3.0, I go back to the FX statement: features mean little if the accompanying power doesn't make them worthwhile. So far no one can show a single thing with SM3.0 that can't be replicated with ATi's feature set in any real-world app.

Once again it's great in theory, just not as impressive until the future titles ship, and buying a card for that far into the future is a risky proposition.



- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil:
 
ATi's mentioned that the companion parts (RV530, and others) will appear close to the R520's launch. Expect the R520 to come in whatever flavours it comes in, and ATi will likely launch whatever it can to fill the gaps (although don't expect an RV530-like card immediately, even if it were ready to ship).
Yay, we finally agree! So once this takes place, I think we will be seeing more budget-oriented cards from nVidia, such as a 7800, 7600, 7200 (I don't think that's 100% confirmed yet, though, so don't quote me on it).

In other words, I would probably wait to buy a mainstream card until after they come out with the 7000 series or R520. If not to see whether a better card of that series shows up for around the same price as this gen., then at least to see this gen. decrease in price.

As for SM3.0, I go back to the FX statement: features mean little if the accompanying power doesn't make them worthwhile. So far no one can show a single thing with SM3.0 that can't be replicated with ATi's feature set in any real-world app.
Doom 3 certainly seems to show the advantage...

Once again it's great in theory, just not as impressive until the future titles ship, and buying a card for that far into the future is a risky proposition.
What? The X800XT, the 6600GT, or both? Because yeah, I agree with ya.

But when even the next-gen. cards are only going to be SM3, it somewhat shows that it will do us good for a while. Not necessarily years and years from now, as SM4 is to come out with LDDM for Vista late next year, but at least until then.

Overall, price is another huge factor, or at least it would be for me. In fact, price is probably the most important factor for the budget and mainstream cards... if it wasn't, then you'd easily find yourself dishing out hundreds of dollars for the high-end. Now don't confuse my statement; I'm not saying performance isn't important, just that price is heavily considered.
 
Kinney, have you been going to anger management classes? No joke, you have come a long way in a short period of time. Keep up the good work!

ASUS P5WD2 Premium
Intel 3.73 EE @ 5.6Ghz
XMS2 DDR2 @ 1180Mhz

<A HREF="http://valid.x86-secret.com/records.php?PHPSESSID=792e8f49d5d9b8a4d1ad6f40ca029756" target="_new">#2 CPUZ</A>
SuperPI 25secs