ATi have just been caught cheating.

So far you've proved you can quote from other sources (it's the only time your English is coherent, so it's obviously NOT yours), you can screw up the most common, well-known benchmark in the biz, go on ad nauseam about your own posts, replying to yourself 3-4 times, and fumble even the most common concepts and terms like Unreal 3, but I doubt you'd know the actual name without first hitting Google or somewhere else so that you could once again try and defend and cover up your major cock-ups.

I've proved my point that your initial statement was BS and that your proof was based on the kind of mistake a PhD and/or Game Engine 'Developer' should never make. I'm sure that GED from the Manhattan Institute for the Totally Inept will serve you well, but your inability to put forward your point other than by an avalanche of BS tells me that you never had to defend any thesis, let alone one at a PhD level from a reputable college, except perhaps if it was in PhysEd!

As I've said, the NV40 has a stronger skill set but may never get to exploit it, while the R42X may have fewer skills but employs them very well right now. Neither of these characteristics describes your situation here, as you have neither the skillz nor the ability to employ whatever it is that you have in any other way than a disjointed mosaic of prosaic truisms, obviously culled from other sources and put together with no understanding of how they relate to the discussions at hand. The fact that you post in segments tells me you have to go searching for your information, then come back and post it once you've found a new source to quote (without actually quoting your REAL source, of course).

I'll now return you to your regularly scheduled monologue so that others here may have something more to read tomorrow instead of only contenting themselves with another Bizarro, Dilbert or Natural Selection with their morning coffee and Sugar Corn Pops.

G'night Y'all, hope he gives you something GOOD to keep ya' entertained! :wink:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
Mr. Entium:

<b>ON 24-bit VS 32-bit:</b>

Jesus Christ, put this to bed until a SINGLE GODDAMN APP can show a difference between the two, will you?
Yes, 32-bit is great. Is it relevant right now? Holy crap, of course not!

You have permission to rehash this in 6 months to a year when it might possibly show up as an issue (probably won't though.)

Right now, this is like arguing whether or not we have the right to cold-fusion air conditioning systems when they get invented. Give it a break, man; yes, 32-bit is a wonderful idea, and yes, right now it's as useless as tits on a bull.

The one thing 32-bit is good for is slowing down Geforce cards when they're forced to calculate at that precision. I wouldn't classify that as a strength for Nvidia right now.
More of a weakness at this point, really.

<b>ON your lying about having an X800 PRO and overclocking it to 520/1100 (and other things):</b>

You obviously don't have an X800 PRO at your disposal (you still haven't posted a 3dMark03 compare link with your overclocks even though you said you would).
This brings everything else you've said into question. A software developer? More likely a copy-and-paste troll pretending to be one.

I'd venture to say you're probably hineigger, or a bitter kinney (I hope not), or perhaps a new Nvidiot who feels like spamming crap.

I don't think anyone here actually believes anything you say anymore, but I thought I'd pop in and let you know that I will be pointing out your inconsistencies to others when given the chance. Just because it gives me a little satisfaction to be a thorn in your side.

Can't abide liars.

If you're older than 16 and spend your time lying on tech boards, you probably have a mountain of real-life problems that you could better spend your time on.
Then again, if you're older than 16 and spend your time lying on tech boards, you probably don't have what it takes to be a real man and do something about them.

________________
<b>Radeon <font color=red>9700 PRO</b></font color=red> <i>(o/c 333/343)</i>
<b>AthlonXP <font color=red>~2600+</b></font color=red> <i>(2400+ @ 2145 Mhz)</i>
<b>3dMark03: <font color=red>4,876</b></font color=red>
 
mmmmm sugar corn pops

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy 😀
 
I just bought a Box of Strawberry Blasted HoneyCombs! 😎

Gonna go to bed so I can wake up sooner and Eat THEM! :lol:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
damn youuuuuu
*shakes fist*

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy 😀
 
i won't stay in this forum; i left it long ago, but i got called back.

one thing, entium: if you were a real developer, you'd know that most of your statements are simply FUD. or else, i don't understand your employer.

good luck trying to fool people with your false statements next time.

don't you feel stupid staying in here, genericweapon? 😀

"take a look around" - limp bizkit

www.google.com
 
It's Dave!

Dave, I hope you understand that, unlike entium, everyone on this board respects your knowledge and your opinions. You have been a great resource to us. I hope you haven't decided to leave these boards for good, because I think everyone here would love to have you around.

I hope you'll continue to post in the graphics cards section and enlighten us with your knowledge and insider information!

Me: are you saying I can't provide?
Me: cause I know I can provide.
Me: oh and I can provide money too😉
Rachel:): why do we need money when we can just stay in our room and have sex all day?
 
Dave :lol: ....glad you got my message. I've been seeing you post more and more at Beyond3D...always good stuff. I've been there just over a year but read more than I post (I'm still under 1,000 posts).
Thanks for coming over to say hi to yer old friends.....

<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2216718" target="_new"><b>3DMark03</b></A>
 
I miss DAVE!! *cries*


RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy 😀
 
Hello, I'm new here at the forum, so I hope you don't flame me for asking maybe some silly questions..😉

I have read all the pages in this thread now, and I am considering buying the X800 XT card. I mean, the card still looks great to me, but I'm not a pro or anything... So what would you recommend I do, the X800 XT or the new nVidia card?

And by the way, entium, you're a really colorful person...not in a good way...

And also, I'm sorry for my poor English...hehe 😀

I'm here to learn 😀
 
It's completely up to you. If ya want a card that will perform well but only last ya till late next year, the X800.

If ya don't want to spend money once Unreal 3 is out, the GF 6800 Ultra (if this is your choice, wait some time till the 6800 drops in price; I don't think I've seen it for less than $650).
 
He's a moderator at <A HREF="http://opengl.nutty.org/forum/index.php" target="_new">this</A> OpenGL forum. And he posts <A HREF="http://www.beyond3d.com/forum/viewtopic.php?t=12486&postdays=0&postorder=asc&start=1020" target="_new">here</A> a lot.
Go hang out with him :lol:

<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2216718" target="_new"><b>3DMark03</b></A>
 
jihn...welcome to the forum :smile:
I don't think you could go wrong with either card. Be sure to let us know when you've made your decision though :)

<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2216718" target="_new"><b>3DMark03</b></A>
 
Hell, I chew through cards about every 3 to 4 months. I always try and buy the one behind the fastest due to cost, i.e. a 9800 Pro instead of the 9800 XT, that sort of thing.

....................
<font color=red>AMD 2700 XP</font color=red>
<font color=red>Gecube 9800 pro Extreme</font color=red>
<font color=blue>ASUS A7N8X deluxe</font color=blue>
<font color=blue>1024 meg Dual Channel DDR 3200</font color=blue>
....................
 
The 6800 will run Unreal 3 like a GeForce 1 runs Doom 3...

Athlon 2700xp+ (oc: 3200xp+ with 200fsb) , Radeon 9800pro (oc: 410/360) , 1024mb pc3200 (5-3-3-2), Asus A7N8X-X
 
No, it will do it at around 30 fps; that's what all the tech documentation was about. Although their polycounts are from 200k to 1 million polys, their fill rates are at around 30 million polygons per sec.

Fill rate here is polygons on the screen x fps.
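(A rough sanity check, assuming the top of that polycount range: 1,000,000 polys per frame x 30 fps = 30,000,000 polys per sec, which is where that ~30 million polygons per second figure comes from.)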

There probably won't be a huge performance leap like in this batch of cards for a while.

GF3 to GF4 wasn't that big. The 9700 was big because it finally allowed more than 4 textures to be sent through the pipelines at the same time. But GPU-wise it was equal to the GF4 Ti 4600.
 
blah blah blah

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy 😀
 
It's not the polygons that slow down GPUs much anymore; it's the number of textures used.

Typically, Unreal 3 uses 4 textures for its new bump-mapping system, and there won't be any change in that. They have already stated they will only go up to a million polygons, so it's not going to change from that.
 
w/e dude

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy 😀
 
And so at the end of the day, which is better: the X800 XT or the GF 6800 Ultra? Now, I don't care whether I buy ATI or Nvidia; what I want is the most bang for my buck, and a card that has the capability to "grow" through driver enhancements as time goes on. I get the feeling that ATI, by milking all it can out of the current-generation core, is at the end of its limits, with next to no growth available, and is in fact waiting to see how Nvidia does, or else planning on putting all their eggs in the PCI Express basket; whilst Nvidia, with a next-generation core, will allow for some growth as they write and re-write the drivers as time goes on.

Or am I just pissing in the wind here ?

I would be very interested to hear some thoughts on this, as I wouldn't mind forking out for one of these puppies sometime soon.

Gary

(Be nice to be able to play Duke Nukem in VGA mode, and ROTT would run pretty sweet too, I'd imagine; not sure about Descent though...)



....................
<font color=red>AMD 2700 XP</font color=red>
<font color=red>Gecube 9800 pro Extreme</font color=red>
<font color=blue>ASUS A7N8X deluxe</font color=blue>
<font color=blue>1024 meg Dual Channel DDR 3200</font color=blue>
....................
 
If ya don't want to spend money once Unreal 3 is out, the GF 6800 Ultra (if this is your choice, wait some time till the 6800 drops in price; I don't think I've seen it for less than $650).

No, it will do it at around 30 fps.
Where do you come up with this??? Tim Sweeney has said that Unreal 3 technology will be powering games in 2006.

How can you possibly say that a video card today will run a game, no wait, not even a game, a particular engine, at 30 fps at least 2 years from now??? No one can say for certain how any of today's cards are going to run Doom 3 or HL2, yet you tell this stranger that he should buy a 6800 so that he can play a game based on the Unreal 3 engine in 2 years.

Here's what Sweeney said about how he thinks today's cards will run the Unreal 3 engine...
By DirectX9 minimum spec, we mean we're going to make a game that brings today's GeForce FX's and Radeon 9700+'s to their knees at 640x480! 🙂 We are targetting next-generation consoles and the kinds of PC's that will be typical on the market in 2006, and today's high end graphics cards are going to be somewhat low end then, similar to a GeForce4MX or a Radeon 7500 for today's games.
You can read the full interview <A HREF="http://www.beyond3d.com/interviews/sweeney04/" target="_new">here</A>.