ATI Inferior to NVidia?

Page 9 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
Damn GW: throw that GS on ebay, I'm sure it would sell in a hot minute for that price. Hell, if I was still rockin AGP i'd buy it....

/agree thread = lame lame = please kill
 
They do suck, a whole load of air. ATI cards sound like jet engines.
ATI cards run hot, turning cases into easy-bake ovens.
ATI drivers are just plain horrible.

Lol. Look at the Nvidiots come out of the woodwork.

Well, I review both Ati and Nvidia cards for Tom's Hardware, and I can tell you that both Ati and Nvidia cards and drivers are excellent for Windows gaming.

They both have small irritations as far as the driver panel goes, but they are both stable and game fast, and that's what matters. Anybody who says differently is living in the past, or is a fanboy tool.

ROFLMAO someone obviously forgot about the FX5800 ultra ;)
 
To date, we don't know that information. So I think it's rather useless to throw those values out until we know what's changed.

Yet you want to argue based on your admitted fact that you don't know any more than Fuad; heck, the fact that the members at B3D are debating his info despite their unreserved contempt for the InQ is because it's one of the only places for info, especially now that WaveyDave is with ATi. Funny how you gripe about the InQ info, yet you have nothing better yourself other than the umpteenth guess at die size based on equally spurious information.

The difference is more than just "helping G70/G80 in texture-heavy" games. It's also a "potential bottleneck" when you have a ratio that doesn't jibe with the architecture (which is what I associate as being the X1600's problem).

Bottleneck only if you consider it a true 12 fully fledged pipeline design, not if you consider it a very capable smaller design. Those 'bottlenecks' you speak of are exactly what the difference is between texture-heavy and pixel-heavy tasks, so you're just restating my point.

But it wouldn't flip to 64 vertex, as that would prove catastrophic (in games) with nothing for the pixel shaders. So that 64 vs. 16 is merely theoretical.

Perhaps you should do more research, because there are games out there that ONLY use vertex shaders, so it wouldn't be catastrophic; it would be perfect for those games. And it's not just games we're talking about here; there are many other applications, and the workstation market looks to benefit greatly from this potential. Of course, neither of those is 'common knowledge', so you wouldn't be able to copy it from someone else; you'd actually have to think about it on your own.

And again, I think it will depend on how those 16 vertex shaders in G80 (granted that figure is accurate) compare fundamentally to just one shader in R600.

Well, barring some major redesign, which would require even more transistors, it'll probably closely match the current ALU pairing. So unless you have something concrete otherwise, it still doesn't change the fact that you have nothing more to stand on than anyone else, and are in no more of a position to comment.

You have nothing to base it on either; that's my point. And I think it's completely pointless that you're now comparing R580 to G80 (didn't you just agree we know very little about G80 a second ago?) 0_o

No, you're the one who's questioning my generalized statement about this marketplace looking very interesting in the near future due to possible dynamics (nV doing well doesn't keep them from selling to AMD or Intel, but I guess you missed how that's relevant). So without anything more, you're simply FUDing your way through defending something you admit you know nothing about, and definitely no more than myself or anyone else. My statement is about the market aspect of what will undoubtedly be an interesting dynamic should that be the case; your statements are Fanboi whining about "don't say bad things about the G80!" :roll:

So I guess what you're saying is, let you make your personal speculation theories, but when someone calls you out on it, we'll just say they don't know enough. You should have known those kinds of actions don't sit well with me.

No, that's certainly NOT what you did. You didn't state opinion; you started off with the FanATic comment, so obviously you either once again forgot to READ what was written or had your typical knee-jerk, or just plain JERK, reaction. Show me how this was a valid counterpoint:

I know fanATIcs are so desperately trying to paint this R600>>>>G80 picture (with unified shaders), but don't you think when G80 does debut, and isn't an FX failure, that it's just going to make G80 look even better than it actually is (since people were so convinced otherwise)?

You go so far as to say that if the G80 isn't a complete flop, like the FX you mentioned in your own thread, that somehow that's admirable. S3 makes a lot of products that aren't FX calibre; that doesn't help them. The only people who'd care that it's better than expected, and not the BEST they could buy, would be people who aren't looking for the best product; they're just looking for reasons to justify their purchase of their favourite company. Anyone who buys a product that's inferior to the competition, just because it isn't as crap as they thought it would be, is an idiot. Unless there's some compelling reason to that person, like Linux support or specialization unavailable elsewhere, where that one reason outweighs the poorer global performance. That's a compelling reason; your 'well, it's no FX' reason is just justification for people who are more concerned about not buying the competition than buying that product, something a Fanboi would do. So it looks like you're the one putting forth the weak argument about the future products.

Also, please present the "c/p" I've done in our discussion.

Well, I'd actually have to spend more time reading B3D than here replying to your nonsense now, wouldn't I? Based on your inability to take your argument beyond the single stage, it's obvious they aren't your own, and even if the wording's different, it looks like a lot of trumphsiao's stuff.

And the nature of my refutations to you isn't to prove you wrong in any sense,

Then why challenge my statement with a Fanboi comment like yours?
Utter BS on your part!

It's a shame you couldn't just be objective, allowing others to chime in to what could be a very productive discussion (like we have at B3D).

Like GW said, you're nothing @ B3D; you're simply a cling-on like you are here, probably posting equally trite BS simply to say you posted. I've been participating @ B3D for much, MUCH longer, but by no means am I a heavily active participant, because it's harder to keep a conversation going when you post every 2 days there. I know GW and a few others who are far more active, and if they remember my posts for anything, it's for them being insightful and contributing to the discussion @ B3D, not some crap like yours. That you think your simply reading/posting @ B3D makes you special is laughable; I doubt anyone there even knows who you are, which is not the case with GW and a few others who could be included among the 'we @ B3D'. Speaking of which, I prefer Uttar's comments on the whole launch business:

"If the NV40 pre-launch era taught me anything, it has to be that NVIDIA's misinformation tactics are even more bullshit-ish and efficient than their PR tactics. Next thing we hear, they might have decided to do a few false tape-outs, costing them millions of dollars, just to confuse everyone, including ATI, hah! The next step in sheer insanity, of course, would have had to be to do them at different foundries, processes, and times.
(okay, NV40's early "false" tape-outs weren't for that reason, but they did confuse everyone thanks to them anyway!)"


The sooner you learn that you can't trust any of the pre-release info, the better for you. Especially for someone who claims to 'know' so much, but provides little in the way of supporting evidence or countering arguments. The 'well, it could be anything, we don't know' argument doesn't refute what I said, and definitely isn't a basis for your BS statement.

All in all it's going to heat up, and the way it looks now is interesting because Intel looks to have the best CPU for a while, and ATI looks to have the best VPU in the R600.

But as always, only time will tell.


NOW REFUTE THAT A$$H0LE, or else BLOW! :evil:
I already have. :)

No, you haven't, far from it in fact, because in order to have a rebuttal you need proof, and that's lacking in this argument. So until you find some, you'll still be sitting there bitching about how I did your G80 wrong by ever saying that the R600 might be the leading VPU, thus making for an interesting system-builder conflict. Oh no!

At least arguing that nV being in the lead, creating a situation where AMD had to optimize and launch with nV and not themselves, would be a far better argument than you've made.

I supported my argument, and you still have nothing left but regurgitated pap to counter with. So I guess it still looks like the R600 will be the VPU all eyes are focused on in the fall.

And until they actually meet in the PC ring, then the R600>G80, simply because

R>G

and

600>80

It's that simple.
 

Dahak

Distinguished
Mar 26, 2006
1,267
0
19,290
Both ATI and NVIDIA supply awesome graphics cards. I really can't say from a performance standpoint just who is better. However, I know I've had severe driver issues with ATI-based cards and was able to get very little help from them regarding these issues. So I made the switch to NVIDIA and I've been very happy with all the cards I've bought. I am a gamer and I am constantly upgrading my system when finances permit. Anyhow, what company you use is up to you. And some do work better than others on different games. Good luck whatever direction you go.

Dahak


EVGA NFR4-SLI READY MB
ACEPOWER 520WATT PSU
AMD 4400+ DUAL CORE S-939
NVIDIA 7800GT(soon to be 2)
WD 300GB HD 7200 RPM
EXTREME 19IN.MONITOR
 
This thread is lame :roll:

Yep, agreed.
 

Gamer_369

Distinguished
May 29, 2005
183
0
18,680
Yet you want to argue based on your admitted fact that you don't know any more than Fuad; heck, the fact that the members at B3D are debating his info despite their unreserved contempt for the InQ is because it's one of the only places for info, especially now that WaveyDave is with ATi. Funny how you gripe about the InQ info, yet you have nothing better yourself other than the umpteenth guess at die size based on equally spurious information.
Equally "spurious" information?

G71 = 196mm^2 - http://www.anandtech.com/video/showdoc.aspx?i=2717&p=2
R580 = 352mm^2 - http://www.beyond3d.com/misc/chipcomp/?view=chipdetails&id=105
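For what it's worth, those two die-area figures can be sanity-checked with trivial arithmetic (a throwaway sketch; the 196 mm² and 352 mm² values are simply the ones from the links above):

```python
# Die areas quoted above, in mm^2 (G71 from AnandTech, R580 from Beyond3D).
g71_area = 196.0
r580_area = 352.0

ratio = r580_area / g71_area
print(f"R580 is {ratio:.2f}x the die area of G71")  # roughly 1.8x
```

So whatever the guesses about next-generation dies are worth, the gap between the current chips themselves is not in dispute.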


Bottleneck only if you consider it a true 12 fully fledged pipeline design, not if you consider it a very capable smaller design. Those 'bottlenecks' you speak of are exactly what the difference is between texture-heavy and pixel-heavy tasks, so you're just restating my point.
But it's not even a 12 fully fledged pipeline design. Surely you know about the decoupled TMUs and ROPs, which only give it a total of 4.
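To illustrate why that decoupling matters (a toy per-clock model, not a real simulator; the 12 shader units and 4 TMUs are the X1600 figures under discussion):

```python
# Toy throughput model for an X1600-style part: 12 pixel shader units,
# but only 4 decoupled TMUs. Whichever unit saturates first sets the pace.
PIXEL_SHADERS = 12
TMUS = 4

def pixels_per_clock(tex_fetches_per_pixel: float) -> float:
    """Pixels processed per clock under this simplified two-resource model."""
    shader_limit = float(PIXEL_SHADERS)           # ALU-bound ceiling
    if tex_fetches_per_pixel <= 0:
        return shader_limit
    tmu_limit = TMUS / tex_fetches_per_pixel      # TMU-bound ceiling
    return min(shader_limit, tmu_limit)

# Math-heavy shaders keep all 12 units busy; texture-heavy ones
# collapse onto the 4 TMUs, which is the "bottleneck" in question.
print(pixels_per_clock(0.25))  # 12.0 (shader-limited)
print(pixels_per_clock(2.0))   # 2.0  (TMU-limited)
```

Under this simplification, the same chip reads as either "12 capable shader pipes" or "4 texture pipes" depending entirely on the workload, which is the disagreement in a nutshell.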


Perhaps you should do more research, because there are games out there that ONLY use vertex shaders, so it wouldn't be catastrophic; it would be perfect for those games. And it's not just games we're talking about here; there are many other applications, and the workstation market looks to benefit greatly from this potential. Of course, neither of those is 'common knowledge', so you wouldn't be able to copy it from someone else; you'd actually have to think about it on your own.
This is the seventh time you've made that allegation, yet still can't prove it. Come on, I even told you the boards I frequent! Knowing your love to trash people, surely you could've come up with something by now. :roll:

Well, barring some major redesign, which would require even more transistors, it'll probably closely match the current ALU pairing. So unless you have something concrete otherwise, it still doesn't change the fact that you have nothing more to stand on than anyone else, and are in no more of a position to comment.
Wait a minute. "Barring some major redesign, which would require even more transistors" is a very sketchy statement. If you're referring to "even more transistors" than G71, we already know that's true (if you take the 500 mil. nVidia stated). In fact, it's roughly 80% more than G71, so does this suffice for your "even more transistors"? (And again, transistor density is likely different.)

Secondly, you speak as if an R600 shader equals an NV40 vertex shader by default, which again is not true. Thirdly, even a minor redesign would make a difference, and arguing that point would essentially be to say there's no difference at all between one shader and its modified second.

So again, your logic isn't adding up.
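As an aside, that "roughly 80%" figure is easy to check; assuming the commonly cited ~278 million transistors for G71 and the 500 million nVidia stated for G80:

```python
# Sanity check of the transistor-count comparison above.
# 278 million for G71 is the commonly cited figure; 500 million is the
# number nVidia stated for G80, per the post.
g71 = 278e6
g80 = 500e6

increase = (g80 - g71) / g71
print(f"G80 would carry about {increase:.0%} more transistors than G71")
```

That comes out just under 80%, so the claim holds, with the caveat already noted that transistor density differs between processes.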


No, you're the one who's questioning my generalized statement about this marketplace looking very interesting in the near future due to possible dynamics (nV doing well doesn't keep them from selling to AMD or Intel, but I guess you missed how that's relevant). So without anything more, you're simply FUDing your way through defending something you admit you know nothing about, and definitely no more than myself or anyone else. My statement is about the market aspect of what will undoubtedly be an interesting dynamic should that be the case; your statements are Fanboi whining about "don't say bad things about the G80!" :roll:
Yes, this marketplace is looking to get very interesting, but you went so far as to actually declare winners. I'm still going to wait for K8L, because I think this will give AMD another good chance to near Intel in performance and value. But it won't be easy, as Conroe has launched, been reviewed, tested, and looks mighty attractive.

Also, I think my whining is "don't attack me for saying something good about G80." Although, it doesn't really bother me if you attack me or not. It's actually quite amusing. :)


No, that's certainly NOT what you did. You didn't state opinion; you started off with the FanATic comment, so obviously you either once again forgot to READ what was written or had your typical knee-jerk, or just plain JERK, reaction. Show me how this was a valid counterpoint:

I know fanATIcs are so desperately trying to paint this R600>>>>G80 picture (with unified shaders), but don't you think when G80 does debut, and isn't an FX failure, that it's just going to make G80 look even better than it actually is (since people were so convinced otherwise)?
When you overhype one product and it doesn't live up to its preconceived standards, usually it becomes disappointing. Likewise, if you underhype a product and it comes out doing pretty well, people tend to be caught off guard and think more highly of it. Common sense...


Like GW said, you're nothing @ B3D; you're simply a cling-on like you are here, probably posting equally trite BS simply to say you posted. I've been participating @ B3D for much, MUCH longer, but by no means am I a heavily active participant, because it's harder to keep a conversation going when you post every 2 days there. I know GW and a few others who are far more active, and if they remember my posts for anything, it's for them being insightful and contributing to the discussion @ B3D, not some crap like yours. That you think your simply reading/posting @ B3D makes you special is laughable; I doubt anyone there even knows who you are, which is not the case with GW and a few others who could be included among the 'we @ B3D'.
lol, for the second time, I'm not bragging or even boasting about my reputation on B3D. I think you must feel threatened or something, though, to actually attack me on a different board!

And what's even funnier is all the people that've criticized me for being a "boasting member" from B3D have come out and said, "Oh yeah! Well, I've been at B3D for x years, and I'm a senior member, etc.!" All I did was give my name - when asked.

Heh, but if you are going to trash my reputation, I'd at least expect you to back it up. Although I realize it's probably easier to just take the low route. ;)

The sooner you learn that you can't trust any of the pre-release info, the better for you. Especially for someone who claims to 'know' so much, but provides little in the way of supporting evidence or countering arguments. The 'well, it could be anything, we don't know' argument doesn't refute what I said, and definitely isn't a basis for your BS statement.
G80 thus far looks to be on schedule for its September-timeframe launch, but things could change; you never know.

No, you haven't, far from it in fact, because in order to have a rebuttal you need proof, and that's lacking in this argument.
That's become your default closing. I don't think you even read my posts anymore, as long as you get to trash me at the end, you're good. lol :roll:

I supported my argument, and you still have nothing left but regurgitated pap to counter with. So I guess it still looks like the R600 will be the VPU all eyes are focused on in the fall.
Well, we'll let Tom (or whoever does the reviewing) decipher that. It's obvious you don't know enough to make victor statements quite yet. Too bad Fuad wasn't there to help you, though. I know his arguments are quite persuasive. :lol:

And until they actually meet in the PC ring, then the R600>G80, simply because

R>G

and

600>80

It's that simple.
8800>2800 (unless of course you take the x to mean a value, in which case that doesn't work out, lol)
Green>Red (bigger word)

:p

Also, your rhetoric looks awfully similar to that at DriverHeaven. Hmm....*ponders :wink:

Ape, I'd be happy to sit it out and actually wait for the launch of G80 and R600, as then the reviews will bear it all out. And while you can biotch at me about facts you don't want to hear, it won't be so easy when it's someone doing the reviewing.

*Gamer
 
Equally "spurious" information?

So you're going to guess based on the G71 and R580; we all know how effective that's been. :roll:

But it's not even a 12 fully fledged pipeline design. Surely you know about the decoupled TMUs and ROPs, which only give it a total of 4.

So you basically can't read again. Re-read and understand what I said: I said only someone who considers it a 12-pipe design would see it as a bottleneck, i.e. YOU.

Perhaps you should do more research, because there are games out there that ONLY use vertex shaders, so it wouldn't be catastrophic; it would be perfect for those games. And it's not just games we're talking about here; there are many other applications, and the workstation market looks to benefit greatly from this potential. Of course, neither of those is 'common knowledge', so you wouldn't be able to copy it from someone else; you'd actually have to think about it on your own.
This is the seventh time you've made that allegation, yet still can't prove it.

Why would I have to prove that, when you haven't proven anything in this thread so far, including your claim that full vertex dedication would be catastrophic? In some games it would be exactly what they need. Of course, you ignore that because it further weakens your argument, but like I said, your regurgitation and recombining of other people's ideas leaves you without the ability to address your own mistakes. But since you think it's the exact same thing, I'll just say I don't have proof, but it LOOKS LIKE you plagiarize, and as such, by your standards, that's proof enough. Or did you want to re-examine your argument that looks like = declare? :twisted:

Wait a minute. "Barring some major redesign, which would require even more transistors" is a very sketchy statement.

No more sketchy than your attack and then BS defence. And no, I wasn't talking about your truism that G80 will have more transistors than the G71, but of course it allows you to answer without saying anything, yet again. Guess spelling out the ALUs to you didn't help you. Simply put, 'without a major redesign' is no more favourable to the G80 design, as you would try to argue, since the R520's and R500's vertex functionality is very similar, and if anything the R600 would simply refine that, since a major change would cost a lot more transistors, like I said.

Yes, this marketplace is looking to get very interesting, but you went so far as to actually declare winners.

Once again you have reading/comprehension issues. I didn't declare a winner; I handicapped the favourites. Perhaps you should once again read some more and understand the difference. You, on the other hand, essentially are sulking that I picked the Conroe as a 2:1 favourite, and the R600 as a 5:4 favourite. So once again your lack of comprehension has steered you wrong.

I'm still going to wait for K8L, because I think this will give AMD another good chance to near Intel in performance and value. But it won't be easy, as Conroe has launched, been reviewed, tested, and looks mighty attractive.

And it will come out after the launch of the next-gen graphics cards by their current predicted timetables (once again, remember, LOOKS like). So waiting for the K8L is all well and good, but by current 'common knowledge' it won't be a factor until after the start of the card war. Or didn't you get the memo from those who keep hold of the common knowledge?

Also, I think my whining is "don't attack me for saying something good about G80."

BS! You weren't simply saying consider the G80; you took umbrage at something that has very little to do with nV other than being a 3rd party to this. So stop the BS. I didn't ATTACK the G80; no, if anything I picked Conroe and R600 as the strongest contenders in the fall, and YOU attacked that. Really, learn how to read and write, because you're really mixing up what I said and what you said.

lol, for the second time, I'm not bragging or even boasting about my reputation on B3D. I think you must feel threatened or something, though, to actually attack me on a different board!

Yeah, how did I ever confuse the 'we at B3D' comment? Aww, dude, I couldn't care less what forum it's on; you still confuse yourself, the hardware, and your arguments regardless of venue. The thing is, you chose this one to spout your BS and mistakes. BTW, wonder if the members of B3D think the X1800XL is the R520 chip, or whether they would agree with you that it is not? Should we ask?

And what's even funnier is all the people that've criticized me for being a "boasting member" from B3D have come out and said, "Oh yeah! Well, I've been at B3D for x years, and I'm a senior member, etc.!" All I did was give my name - when asked.

And GW's earned that right. I never did, because unlike you I don't name-drop to try and give myself credibility; I use the weight of my arguments. Perhaps you should try that; then no one would know anything about you at B3D, which would likely be to your betterment, because the last thing you need is more enemies, because I doubt you make many friends.

Heh, but if you are going to trash my reputation, I'd at least expect you to back it up. Although I realize it's probably easier to just take the low route. ;)

That you think I'd even bother shows you think too highly of yourself. Nah, I wouldn't do it. Nah, I'd call in favours to have other people do it for me... if I were so inclined. :twisted:

G80 thus far looks to be on schedule for its September-timeframe launch, but things could change; you never know.

'Looks', so here it is: you're once again declaring the G80 will be here for September. Or did you want to once again revisit your argument about the way something "looks"?

That's become your default closing. I don't think you even read my posts anymore, as long as you get to trash me at the end, you're good. lol :roll:

Well, if I stopped READING, then at least I'd be at your level, because you keep ignoring what is actually written there and go astray to try and save yourself.

Well, we'll let Tom (or whoever does the reviewing) decipher that. It's obvious you don't know enough to make victor statements quite yet.

And you obviously don't know enough about the English language to make statements yet with regard to the difference between VICTOR and top CONTENDER. I'll take Fuad, but you'll need Webster or Roget.
Oh yeah, and TOM doesn't do the reviews and hasn't for a while; even back in the day when Tom did contribute, LARS was the principal reviewer, and you'd know him by another name if you actually were enough of a B3D member to matter.

8800>2800 (unless of course you take the x to mean a value, in which case that doesn't work out, lol)

So now you're declaring model numbers too? Wow, you're a font of speculation, yet you have trouble with my statements. Incredible. X can have any value, and since you didn't pick it, I pick infinity. Yeah, I win!

Green>Red (bigger word)

RED>green, bigger wavelength, which is universal, because as we know, ROUGE>vert. And it's irrelevant anyway, as soon they'll both be green. So they cancel each other out.

Also, your rhetoric looks awful similiar to that at DriverHeaven. Hmm....*ponders :wink:

I've posted about 5 times at DriverHeaven to troubleshoot driver issues, so really, ponder all you want; the difference is I can speak to my argument, you can't.

Ape, I'd be happy to sit it out and actually wait for the launch of G80 and R600, as then the reviews will bear it all out. And while you can biotch at me about facts you don't want to hear, it won't be so easy when it's someone doing the reviewing.

'Facts I don't want to hear'? You haven't presented any yet. The only concrete statements you made so far were all proven to be false, just like your R520 statements. BTW, did you want to post that transcript where GEO gives an exact time in September, or even something that is half as assured as you were, or was that BS too, and you once again not reading things properly?

Still, after all this, it still looks like the Conroe will be the best, and the harder you argue against it, the more I'm 110% certain the R600 is going to be far FAR better than the G80. In fact, you are so defensive about the G80 that it must really REALLY suck. The checkbox advantages of the R600 are going to be enough to fill an entire retail box. OMG, it's gonna be a Monster, at least better to the power of infinity! :twisted:
 
Equally "spurious" information?

So you're going to guess based on The G71 and R580, we all know how effective that's been. :roll:

But it's not even a 12 fully fledged pipeline design. Surely you know about the decoupled TMUs and ROPs, which only give it a total of 4.

So you basically can't read again. Re-read understand what I said, I said only someone who considers it a 12 pipe design would see it as a bottleneck, ie YOU.

Perhaps you should do more research, because there are games out there that ONLY used vertex shaders so it wouldn't be catastrophic, it would be perfect for those games, and it's not just games we're talking about here, and there are many other applications, and the workstation market looks to benifit greatly from this potential. Of course neither of those are 'common knowledge' so you wouldn't be able to copy it from somneone else, you'd actually have to think about it on your own.
This is the seventh time you've made that allegation, yet still can't prove it.

Why would I have to prove that, when you haven't proven anything in this thread sofar, including your claim that full vertex dedication would be catastrophic. In some games it would be exactly what they need. Of course you ignore that because it further weakens your argument, but like I said your regurgitation and recombining of other people's ideas leaves you wihtout the ability to address your own mistakes. Gurdd since you think it's the exact same thing, I'll just say I don't have proof, but it LOOKS LIKE your plagiarize, and as such by your standards, that's proof enough. Or did you want to re-examine your argument that looks like = declare? :twisted:

Wait a minute. "Baring some major redesign, which would require even more transistors" is a very sketchy statement.

No more sketchy than your attack and then BS defence. And no I wasn't talking about your truism that G80 will have more transistors than the G71, but of course it allows you to answer without saying anything, yet again. Guess spelling out the ALUs to you didn't help you, simply put, 'without a major redesign' it no more favourable to the G80 design as you would try to argue since the R520 and R500's vertex functionality is very similar, and if anything the R600 would simply refine that, since a major change would cost alot more transistors like I said.

Yes, this market place is looking to get very interesting, but you went as so far to actually declare winners.

Once again you have rading/comprehension issues, I didn't declare a winner, I handicapped the favourites, perhaps you should once again read some more and understand the difference. You on the other hand essential are sulking that I picked the Conroe as a 2:1 favourite, and the R600 as a 5:4 favourite. So once again your lack of comprehension has steered you wrong.

I'm still going to wait out for K8L, because I think this will give AMD another good chance to near Intel in performance and value. But it won't be easy, as Conroe has launched, been reviewed, tested, and looks mightly attractive.

And will come out after the launch of the next gen graphics cards by their current predicted timetables (once again remember LOOKS like). So waiting for the K8L is all well and good but by current 'common knowledge' won't be a factor until after the start of the card war, of didn't you get the memo from those who keep hold of the common knowledge?

Also, I think my whining is "don't attack me for saying something good about G80."

BS! You weren't simply saying consider the G80, you took umbrage of something that as very little to do with nV other than being a 3rd party to this. So stop the BS, I didn't ATTACK the G80, no if anything I picked Conroe and R600 as the strongest contenders in the fall, and YOU attacked that. Really learn how to read and write, because you're really mixing up what I said and what you said.

lol, for the second time, I'm not bragging or even boasting my reputation on B3D. I think you must feel threatned or something though to actually attack me on a different board!

Yeah how did I ever confuse the 'we at B3D' comment. Auew dude, and I couldn't care less what forum it's on you still confuse yourself, the hardware and your arguments regardless of venue. The thing is you chose this one to spout your BS and mistakes. BTW, wonder if the members of B3D thnk the X1800XL is the R520 chip, or whether they would agree with you that it is not? Should we ask?

And what's even more funny is all the people that've criticized be for being a "boasting member" from B3D have come out and said "Oh yea! Well I've been at B3D for x years, and I'm a senior member, etc!" All I did was give my name - when asked.

And GW's earned that right, I never did because unlike you I don't name drop to try and give myself credability, I use the weight of my arguments. Perhaps you should try that, then no one would know anything about you at B3D, which would likely be to your betterment, because the last thing you need is more enemies, because I doubt you make many friends.

Heh, but if you are going to trash my reputation, I'd atleast expect you to back it up. Although I realize it's probably easier to just take the low route. ;)

That you think I'd even bother, show you think to highly of yourself. Nah I wouldn't do it. Nah, I'd call in favours to have other people do it for me... if I were so inclined. :twisted:

G80 thus far looks to be on schedule to it's September timeframe launch, but things could change; you never know.

'Looks', so here it is you're once again declaring the G80 will be here for September. Or did you want to once again revisit your argument about the way something "looks".

That's become your default closing. I don't think you even read my posts anymore, as long as you get to trash me at the end, you're good. lol :roll:

Well, if I stopped READING then at least I'd be at your level, because you keep ignoring what is actually written there and going astray to try and save yourself.

Well, we'll let Tom (or whoever does the reviewing) decipher that. It's obvious you don't know enough to make victor statements quite yet.

And you obviously don't know enough about the English language to make statements yet with regard to the difference between VICTOR and top CONTENDER. I'll take Fuad, but you'll need Webster or Roget.
Oh yeah, and TOM doesn't do the reviews and hasn't for a while; even back in the day when Tom did contribute, LARS was the principal reviewer, and you'd know him by another name if you actually were enough of a B3D member to matter.

8800>2800 (unless of course you take the x to mean a value, in which case that doesn't work out, lol)

So now you're declaring model numbers too? Wow, you're a fount of speculation, yet you have trouble with my statements. Incredible. X can have any value, and since you didn't pick it I pick infinity. Yeah, I win!

Green>Red (bigger word)

RED>green, bigger wavelength, which is universal, because as we know ROUGE>vert. And it's irrelevant anyways as soon they'll both be green. So they cancel each other out.

Also, your rhetoric looks awfully similar to that at DriverHeaven. Hmm....*ponders :wink:

I've posted about 5 times at driverheaven to troubleshoot driver issues, so really ponder all you want, the difference is I can speak to my argument, you can't.

Ape, I'd be happy to sit it out and actually wait for the launch of G80 and R600, as then the reviews will bear it all. And while you can biotch at me about facts you don't want to hear, it won't be so easy when it's someone doing the reviewing.

'Facts I don't want to hear'? You haven't presented any yet. The only concrete statements you've made so far were all proven to be false, just like your R520 statements. BTW, did you want to post that transcript of Geo giving an exact time in September, or even something half as assured as you were, or was that BS too and you once again not reading things properly?

Still, after all this, it still looks like the Conroe will be the best, and the harder you argue against it the more I'm 110% certain the R600 is going to be far FAR better than the G80; in fact you are so defensive about the G80 that it must really REALLY suck. The checkbox advantages of the R600 are going to be enough to fill an entire retail box. OMG it's gonna be a Monster, at least better to the power of infinity! :twisted:

.............so...... hows...... stuff.......?

there seems to be some serious ownage here

whether it be Intel vs AMD or ATI vs Nvidia, it's always a leapfrog
 

Gamer_369

Distinguished
May 29, 2005
183
0
18,680
So you're going to guess based on The G71 and R580, we all know how effective that's been. :roll:
You can use mere math to find the theoretical size of G80 based off G71's size and transistor count, granted transistor density is the same. Because it likely isn't, we only get a ball-park range, as I've stated time and time again. But it's closer than anything your friend Fuad is going to conjure up, and you take his words with such high credibility, so mine should look even better.
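For what it's worth, the "mere math" being argued here is just linear scaling. A minimal sketch, using the approximate/rumored transistor counts thrown around in this thread (278M for G71, 500M for G80 - not confirmed specs) and assuming identical transistor density, which the posters themselves admit is unlikely:

```python
# Ball-park die-area scaling, assuming identical transistor density.
# Transistor counts are the rumored figures from this thread, not confirmed specs.
G71_TRANSISTORS = 278e6   # approximate, per this thread
G80_TRANSISTORS = 500e6   # rumored

def scaled_area(base_area_mm2, base_count, new_count):
    """Estimate a new die's area from a known die, assuming equal density."""
    return base_area_mm2 * new_count / base_count

growth = G80_TRANSISTORS / G71_TRANSISTORS - 1
print(f"G80 would carry roughly {growth:.0%} more transistors than G71")
# prints: G80 would carry roughly 80% more transistors than G71
```

Of course, a process shrink or density change breaks the assumption entirely, which is exactly why this only yields a ball-park range and not a real die size.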

So you basically can't read again. Re-read and understand what I said: I said only someone who considers it a 12-pipe design would see it as a bottleneck, ie YOU.
Maybe you need to re-read what I said. 0_o

But it's not even a 12 fully fledged pipeline design.

That says a lot when you criticize others for not "properly reading" in the same post where you seem to make that very same blatant mistake. :roll:

Why would I have to prove that, when you haven't proven anything in this thread so far, including your claim that full vertex dedication would be catastrophic? In some games it would be exactly what they need. Of course you ignore that because it further weakens your argument, but like I said, your regurgitation and recombining of other people's ideas leaves you without the ability to address your own mistakes.
Or the fact that you omit any comment on the fact that 1 ps + 1 vs is more capable than two unified shaders. IMO, that's the epitome of what we're debating, yet you sweep it under the rug so discreetly.

And it's funny that my rhetoric is so advanced that it must be written by someone else; then, when you realize that argument failed, you'll just pretend that what I said was all false and irrelevant. Nice back-door you have there. ;)


No more sketchy than your attack and then BS defence. And no, I wasn't talking about your truism that G80 will have more transistors than the G71, but of course it allows you to answer without saying anything, yet again. Guess spelling out the ALUs to you didn't help; simply put, 'without a major redesign' it's no more favourable to the G80 design.
Oh, but Grape, you said it so clearly ;)

Well, barring some major redesign, which would require even more transistors


as you would try to argue, since the R520 and R500's vertex functionality is very similar, and if anything the R600 would simply refine that, since a major change would cost a lot more transistors, like I said.
And both have similar transistor counts between 300-350 million, which you should have known by your own theory. G80 with 500 million is, again, roughly 80% more than G71 with 278. So all you did was use R520 and R500 to further prove my point. 0_o

Once again you have reading/comprehension issues. I didn't declare a winner, I handicapped the favourites; perhaps you should once again read some more and understand the difference. You, on the other hand, are essentially sulking that I picked the Conroe as a 2:1 favourite, and the R600 as a 5:4 favourite. So once again your lack of comprehension has steered you wrong.
Well, it would have been convenient of you to actually stick those ratios in, but unfortunately you failed to do so, and instead came up with:

"but the G80 doesn't stand a chance remaining @ 90nm with this stuck hybrid design, when the R600 will launch @ 80nm with the unified design that will allow it to push more shaders per clock by a long shot."

Now, the term "doesn't stand a chance" would at least be a 5:1, not 5:4, wouldn't you say? ;)


And it will come out after the launch of the next-gen graphics cards by their current predicted timetables (once again, remember, LOOKS like). So waiting for the K8L is all well and good, but by current 'common knowledge' it won't be a factor until after the start of the card war, or didn't you get the memo from those who keep hold of the common knowledge?
Lol, that's a pretty weak point when your argument doesn't apply to anyone who somehow thinks K8L wouldn't launch during next year.

BS! You weren't simply saying consider the G80, you took umbrage at something that has very little to do with nV other than being a 3rd party to this. So stop the BS, I didn't ATTACK the G80; no, if anything I picked Conroe and R600 as the strongest contenders in the fall, and YOU attacked that. Really learn how to read and write, because you're really mixing up what I said and what you said.
Actually, you're once again displaying that "let me imply what I want to, then when you call me out on it, I'll play dumb." :roll:


Yeah, how did I ever confuse the 'we at B3D' comment. Aww dude, and I couldn't care less what forum it's on, you still confuse yourself, the hardware and your arguments regardless of venue. The thing is you chose this one to spout your BS and mistakes. BTW, wonder if the members of B3D think the X1800XL is the R520 chip, or whether they would agree with you that it is not? Should we ask?
Oh, I'm sorry. I should have known you would have taken the term "we at B3D" and deciphered it as me partaking in discussions at B3D. What a strong argument you have there. :roll:

R520 is x1800XT with x1800XL being built on that core, lower clocked, etc.

Are you that lost in everything else we're discussing that you know you'll at least gain a win if you can get somebody to acknowledge what we already knew? :roll:


And GW's earned that right. I never did, because unlike you I don't name-drop to try and give myself credibility, I use the weight of my arguments. Perhaps you should try that; then no one would know anything about you at B3D, which would likely be to your betterment, because the last thing you need is more enemies, since I doubt you make many friends.
Actually, both of you have yet to identify who you actually are on those boards via this discussion.

That you think I'd even bother shows you think too highly of yourself. Nah, I wouldn't do it. Nah, I'd call in favours to have other people do it for me... if I were so inclined. :twisted:
It just makes me think even lower of you.


'Looks', so here it is you're once again declaring the G80 will be here for September. Or did you want to once again revisit your argument about the way something "looks".
This is you focussing on the word look, because again, debating "looks" guarantees you a safe way out of the argument. :roll:


So now you're declaring model numbers too? Wow, you're a fount of speculation, yet you have trouble with my statements. Incredible. X can have any value, and since you didn't pick it I pick infinity. Yeah, I win!
Geforce 8 has already been confirmed by nVidia. Maybe you missed that memo?


Dang Grape, it ain't no fun debating when you're this uninformed on the topic. ;)
Sure you can debate rhetoric and the word "look," but I was honestly expecting a little better from you. Although I'm a little surprised how slimy apes can get, especially those purple ones. :roll:

Anywho, I am 120% confident that G80 will be nVidia's gem, and that both G80 and R600 will put up a tough fight.

*Gamer 8)
 
At this point in time, Lamer, I just don't care what you think, but I'm not going to waste much more time on this, and after this will simply ridicule you and your statements.

I'll summarize a bit, because too much time is wasted on you and your off-road pursuit of tangents.

You haven't proven any points;

You argue that there's a bottleneck in a design that has no equals to compare it to (like I said, it's only seen as a bottleneck by ignorant people like yourself who consider it to be something it's not [to which you reply that I consider it to be that which it is not, nice retort, 'I know you are but what am I' comes to mind]); therefore you cannot prove it to be any more of a bottleneck than alternate choices within the same design/cost constraints.

You argue that somehow the G80's potential 16 unified v+g units may have more complexity and that we can't equate the NV40 design to the NV50; then you mention the R580 as being 16 pixel shaders with 3 ALUs (when in fact it's 48 with 1+1 ALUs if you wanna get as anal as you seem to want to, or did you confuse your core with shader units again?), and that change cost ATi more transistors. So like I said, the G80 isn't going to make their geometry engines any more capable or functional without greatly increasing transistor count. This flummoxed you, and you decided to repeat my statement over and over again instead of addressing that glaring reality, which defeats your mathematical theory of die size if they add more functionality per unit. So like I said, barring any major redesign of their current vertex engine;

http://www.beyond3d.com/previews/nvidia/g70/index.php?p=02

it will not perform any miracles against a similarly capable R500/520/580 design likely to be found in the R600 (with or without vertex texturing capabilities);
http://www.beyond3d.com/reviews/ati/r520/index.php?p=02

So once again you'd need to explain this MAGIC you speak of that will increase performance and capability without increasing transistor count. If they do change the number and layout of the ALUs, that's a different story, but since you don't know that, your argument using it as a basis is no better than saying it'll use pixie dust. :roll:

Especially not if you're trying to argue fact versus prediction. Which, remember, was the whole point of your post, wasn't it? Some declared winner; oh wait, yeah, let's see, you complain that I pick a winner, and then you go on to not comment on the part where I say it looks like, but on my response to your Fanboi post calling me a FanATic;

Well, it would have been convenient of you to actually stick those ratios in, but unfortunately you failed to do so, and instead came up with

Now, the term "doesn't stand a chance" would at least be a 5:1, not 5:4, wouldn't you say? ;)

Where I mention the G80 doesn't stand a chance is in my reply to your attack. So from your first post with the name calling to my reply that the G80 doesn't stand a chance at 90nm, somehow it's the second post that caused your first reply. Cool, time travel now too, guess that's how the G80 will win, it'll suck SO hard that it'll warp space time in such a way as to slow down the rest of the universe and therefore make it look like it's performing faster. Wow that is Magic!

Now that you let me think of it, I'd like to upgrade the R600 to an 8:5 favourite because of the likelihood of a black hole in the G80 core causing geometry issues and artifacts.

You bring up the unrelated K8L as if I give a rat's A$$ about your green AMD fanboi-dom; it ain't gonna be a player until long after this stuff launches. How about you talk about the K10 with the R700 vs G100+. :roll:

Try and keep your arguments based on the relevant, I know it's hard for you.

Actually, you're once again displaying that "let me imply what I want to, then when you call me out on it, I'll play dumb." :roll:

No implication needed, people can read, and they can easily see your first post, which was an attack from the start. What motivated it is obviously that I said good things about Miss Ecuador instead of your girlfriend Mr U-r-u-guay! My post said nothing about the G80, yet you came rushing in to defend it. Why not the S3 or Intel processors if you were trying to be as balanced as you claim?

Then in reply to your major blunder about the R520 your reply is;

Are you that lost in everything else we're discussing that you know you'll at least gain a win if you can get somebody to acknowledge what we already knew? :roll:

Well, obviously you don't already know it, because you spent 2 posts trying to refute that the X1800XL was indeed the R520. See, that's a FACT; the rest of us KNOW that the X1800XL IS the R520, everything else is speculation. Yet for some reason you think we should take your speculation seriously when you can't even get a simple FACT that IS 'common knowledge' straight and feel the need to argue it. Based on that alone your credibility is shot, and it's obvious you're just here to troll and argue. Then when confronted, it's my fault you made the mistake of your own choosing; I didn't pick R520, YOU did.

Geforce 8 has already been confirmed by nVidia. Maybe you missed that memo?

Yet you still don't know the name of what it's up against because there is no finalized name for the R600, and even the early one may change, because as we've discussed here already, now would be the perfect time for AMD/ATi to change their naming/numbering scheme.

Dang Grape, it ain't no fun debating when you're this uninformed on the topic. ;)

What, like your Magical features and your misinformed regurgitation of other people's statements? BTW, I'll ask again, where's Geo's transcript with the exact dates again?

Anywho, I am 120% confident that G80 will be nVidia's gem,

And a magical enchanted +5 on rendering gem too I bet.
Who else's gem would it be, 3DfX's?

Sure you can debate rhetoric and the word "look," but I was honestly expecting a little better from you.

Wow, I'm actually debating my use of the word 'looks', which you took umbrage at, against your use of the very same term. Guess I should've come up with some other tangent like you.

What still stands is what I said in the first sentence of my first reply to you: "Once again you miss the point of the greater picture and the future of the industry and come to spout your uninformed PR pomp. You miss the whole point behind AMD with the best VPU and Intel with the best CPU, but being an nV supporter, I'm not surprised."

Although I'm a little surprised how slimy apes can get, especially those purple ones. :roll:

Whatever LAMER, if that's the best you got it's pretty weak dude, and I'm not surprised based on your previous comments.

That you think I'd even bother shows you think too highly of yourself. Nah, I wouldn't do it. Nah, I'd call in favours to have other people do it for me... if I were so inclined. :twisted:
It just makes me think even lower of you.

That's fine, I don't value your opinion anyways, so I'll go as low as you if you'd like. It's obvious that you targeted my post here based on my previously correcting your mistakes; little else could cause such a Fanboi response based on what I said, which is far from the FanATic post you claim it to be. Needless to say you went out of your way in this thread to participate, and specifically only with my statements, not the many before that which were far more 'FanATical' to say the least. Now you want me to let it slide and not do the same to you? I'd consider it based on what I can only assume is the ignorance of youth, but if you want to continue wasting my time, then I'll make sure it's worth it for me. I don't hold grudges; even well-known forum members who were banned by others I forgive and forget and chat with, and I've never advocated banning someone or following them to other posts/forums because it's against my nature, but for you I might make an exception.
 
.............so...... hows...... stuff.......?

LOL!
There's time for stuff now? :lol:

whether it be Intel vs AMD or ATI vs Nvidia, it's always a leapfrog

Definitely.

The way I see it is that the next two releases of cards will determine AMD's future in the market. I think there's a good chance that, regardless of success, they either have a strategy of just profiting in the market and forgetting about the terribly expensive, low-margin high end, the same way Intel does, or they figure they could make ATi's margins slightly better than before because of reduced fab costs and some shared R&D at the silicon level; thus if they can get +$1 net out of the venture they may still pursue it, especially if it helps defer some of their core CPU R&D costs (it's not like these are unrelated products). Either is a possibility, and only the former would really hurt gamers.

In my discussions with a few nV boosters (in a far more amicable way than here), the thing that could hurt the ATi graphics wing of AMD is if AMD really loses focus on both and shifts all their efforts to just CPUs. I don't see this short term, but if the R600 is delayed enough, or a failure, or too costly, or a myriad of other hitches, it could spell doom for them, and then you'd see AMD do what Intel does and fight for the profitable low-mid range, and thus kill everyone's profits (including nV and S3/VIA). That would leave less money for development of this ever-commoditized segment, and the pace of development and advancement would slow considerably, since no one is going to spend money to develop exotic high ends when they have no money from the formerly profitable middle. If AMD sees success or failure in the R600 series it will likely colour their future: either pull back, or dedicate more resources to try to capitalize. Of course, if they fail and 'think' they could get success with the next gen afterwards, they could also say one more chance and then we're out, and in so doing dedicate more resources to try and succeed.

I don't see them rolling over and out of the high-end GPU market, but I definitely don't see them pursuing aggressive competition with anyone if there's no serious benefit from it, and ATi and nV kinda competed themselves into very low margins. That might change if the margin is more important than the competition, and the marquee cards are no longer seen as the way to survival, but instead they look to Intel's model of success. Let's hope not.

It's going to be an interesting 12-18 months IMO, and I just hope that whatever happens we have more competition and not less, because that wouldn't benefit us in the least.
 

Gamer_369

Distinguished
May 29, 2005
183
0
18,680
In all of your rumbled-jumbled clash of rhetoric, I think it's evident you really don't know as much on R600 as you thought you did, nor on G80 either. I suppose that's why you've shifted the conversation so far away from that topic with your random references and irrelevant FUD.

But on the topic of our discussion (even though you tried to diverge from it many times), it was funny watching you try to use theoretical figures and values to attempt to draw conclusions; and your assumptions based on those "conclusions" were even more pathetic. I guess we found your weak point, Grape (at least it amused me :lol: ).

As I say time and time again, it's useless arguing with a fanboy, because they will see what they want, hear what they want, and acknowledge what they want - usually everything that makes them feel good about their affirmed company. It was obvious when someone challenged the thought of R600>>>>G80, you became very defensive, and unfortunately didn't have the bite to back the bark (a behavior becoming quite prevalent from you during our small, little debates.)

I gave you just about every reason in the book to back up why I think G80 has just as much of a chance as R600, and gave you every chance to refute those claims, yet every time you attempted to, you were either contradicting yourself or not making an argument at all. But most of the time you let your one-sided opinions get the best of you, which really was the damaging factor in this debate, on your part.

You even came out and verified the validity of my points, but it was so hard that you had to say they weren't really mine. So it's very clear to me now why you made those allegations, but unfortunately, when it came time to present the facts to prove those accusations, you came up short as usual.

I'm not going to get petty arguing with your rhetoric claims and such. I already made my point that G80 will have as much chance as R600, if not more. It's not my problem that you're just a big ATI cheerleader who has everything counting on R600 right now. (Perhaps because you realize the success of R600 will have a small impact on AMD's overall choice to remain in the discrete market).

I'll try not to gloat too much when the results come out, but I'll definitely be saving all of your little remarks and seeing how they hold up come this fall. ;)

Anyhow, I'll conclude this debate as I see you're just plain out of valid arguments to make besides throwing in random FUD. Good day to you :)

*Gamer 8)
 
In all of your rumbled-jumbled clash of rhetoric...

YADDA, yadda, yadda, whatever. The topic is best summed up in the reply to Apache: your posts were an exercise in the futility of demanding facts to support prognostication, and then providing nothing more than speculation in return. If that makes you feel you KNOW either the R600 or G80 better, fine, but your glaring mistakes prove you don't even know the current designs, so future ones must be more comfortable since any wild guess is as good as the next. :roll:

I'll try not to gloat too much when the results come out, but I'll definitely be saving all of your little remarks and seeing how they hold up come this fall. ;)

IF it is this fall, right? :roll:
And so you end with what I said all along, coming full circle to the same closing words, ONLY TIME WILL TELL.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Anyhow, I'll conclude this debate

while I thank you for ending... it should have been given up long ago. Aside from one clearly thought-out argument (I commented on it earlier and Grape shut it down w/ good responses), the majority of this stuff has been Gamer: attack; Grape: present argument and rebut attack w/ logic; Gamer: attack rebuttal w/ diversion and repetition; Grape: rebut w/ logic and point out Gamer's illogic. Wash, rinse, repeat.

If this were an actual debate then you got served. No, Grape's comments and responses were not perfect logic and yes, they had some circularity in them, but frankly he did not need it. Never did you rebut any weak points he had; instead you were satisfied w/ arguing over nonsensical things and repeating single idiotic arguments as "responses" to Grape's points when they had no relevance. Even my comment on the one good argument you made garnered a similar backwards response. It would have been better (for you) if you had either asked someone to help you w/ logic or just bowed out of the competition.

What you've been saying in this thread is some of the most insanely idiotic things I have ever read. At no point in your rambling, incoherent responses were you even close to anything that could be considered rational thoughts. Everyone on this forum is now dumber for having read your responses. I award you no points, and may God have mercy on your soul. ;)




oh, and yes... that last paragraph? That was virtually a verbatim plagiaristic rip-off from Billy Madison, yet it somehow fits here. lol
source is cited. ;)
 

pojomofo

Distinguished
May 19, 2006
38
0
18,530
I really think the only time nVidia clearly held the performance crown was when the 7800GT and 7800GTX came out, and the X1800XT came out like 6 months later. ATI clearly had an edge when it was 9xxx series vs FX series.

Other than those 2 short stints, they have always been competitive. I think it is ridiculous to say that nVidia is head-and-shoulders above ATI. If you prefer nVidia then that is great, but no way can you say that nVidia owns ATI, or vice versa.