They do suck, a whole load of air. ATI cards sound like jet engines.
ATI cards run hot, turning cases into easy-bake ovens.
ATI drivers are just plain horrible.
To date, we don't know that information. So I think it's rather useless to throw those values out until we know what's changed.
The difference is more than just "helping G70/G80 in texture-heavy games." It's also a "potential bottleneck" when you have a ratio that doesn't jibe with the architecture (which is what I associate with the X1600's problem).
But it wouldn't flip to 64 vertex shaders, as that would prove catastrophic (in games) with nothing for the pixel shaders. So that 64 vs. 16 is merely theoretical.
And again, I think it will depend on how those 16 vertex shaders in G80 (granted that figure is accurate) compare fundamentally to just one shader in R600.
You have nothing to base it on either, that's my point. And I think it's completely futile that you're now comparing R580 to G80 (didn't you just agree we know very little about G80 a second ago?) 0_o
So I guess what you're saying is: let you make your personal speculation theories, but when someone calls you out on it, we'll just say they don't know enough. You should have known those kinds of actions don't sit well with me.
Also, please present the "c/p" I've done in our discussion.
And the nature of my refutations to you isn't to prove you wrong in any sense.
It's a shame you couldn't just be objective, allowing others to chime in on what could be a very productive discussion (like we have at B3D).
I already have.

All in all it's going to heat up, and the way it looks now is interesting, because Intel looks to have the best CPU for a while, and ATI looks to have the best VPU in the R600.
But as always, only time will tell.
NOW REFUTE THAT A$$H0LE, or else BLOW!
Equally "spurious" information?

Yet you want to argue based on your admitted fact that you don't know any more than Fuad; heck, the fact that the members at B3D are debating his info despite their unreserved contempt for the InQ is because it's one of the only places for info, especially now that WaveyDave is with ATi. Funny how you gripe about the InQ info, yet you have nothing better yourself other than the umpteenth guess at die size based on equally spurious information.
But it's not even a 12 fully fledged pipeline design. Surely you know about the decoupled TMUs and ROPs, which only give it a total of 4.

Bottleneck only if you consider it a true 12 fully fledged pipeline design, not if you consider it a very capable smaller design. The 'bottlenecks' you speak of are exactly the difference between texture-heavy and pixel-heavy tasks, so you're just restating my point.
This is the seventh time you've made that allegation, yet still can't prove it. Come on, I even told you the boards I frequent! Knowing your love of trashing people, surely you could've come up with something by now. :roll:

Perhaps you should do more research, because there are games out there that ONLY use vertex shaders, so it wouldn't be catastrophic; it would be perfect for those games. And it's not just games we're talking about here; there are many other applications, and the workstation market looks to benefit greatly from this potential. Of course, neither of those is 'common knowledge', so you wouldn't be able to copy it from someone else; you'd actually have to think about it on your own.
Wait a minute. "Barring some major redesign, which would require even more transistors" is a very sketchy statement. If you're referring to "even more transistors" than G71, we already know that's true (if you take the 500 mil. nVidia stated). In fact, it's roughly 80% more than G71, so does this suffice for your "even more transistors"? (And again, transistor density is likely different.)

Well, barring some major redesign, which would require even more transistors, it'll probably closely match the current ALU pairing. So unless you have something concrete otherwise, it still doesn't change the fact that you have nothing more to stand on than anyone else, and are in no more of a position to comment.
Yes, this marketplace is looking to get very interesting, but you went so far as to actually declare winners. I'm still going to wait for K8L, because I think this will give AMD another good chance to near Intel in performance and value. But it won't be easy, as Conroe has launched, been reviewed, tested, and looks mighty attractive.

No, you're the one who's questioning my generalized statement about this marketplace looking very interesting in the near future due to possible dynamics (nV doing well doesn't keep them from selling to AMD nor Intel, but I guess you missed how that's relevant), and so without anything more you're simply FUDing your way through defending something you admit you know nothing about, and definitely no more than myself or anyone else. My statement is about the market aspect of what will undoubtedly be an interesting dynamic should that be the case; your statements are fanboi whining about "don't say bad things about the G80!" :roll:
When you overhype one product and it doesn't live up to its preconceived standards, it usually becomes disappointing. Likewise, if you underhype a product and it comes out doing pretty well, people tend to be caught off guard and think more highly of it. Common sense...

No, that's certainly NOT what you did. You didn't state opinion; you started off with the fanATIc comment, so obviously you either once again forgot to READ what was written, or had your typical knee-jerk, or just plain JERK, reaction. Show me how this was a valid counterpoint:
I know fanATIcs are so desperately trying to paint this R600>>>>G80 picture (with unified shaders), but don't you think when G80 does debut, and isn't an FX failure, that it's just going to make G80 look even better than it actually is (since people were so convinced otherwise)?
lol, for the second time, I'm not bragging or even boasting about my reputation on B3D. I think you must feel threatened or something, though, to actually attack me on a different board!

Like GW said, you're nothing @ B3D; you're simply a klingon like you are here, probably posting equally trite BS simply to say you posted. I've been participating @ B3D for much MUCH longer, but by no means am I a heavily active participant, because it's harder to keep a conversation going when you post every 2 days there. I know GW and a few others who are far more active, and if they remember my posts for anything, it's for being insightful and contributing to the discussion @ B3D, not some crap like yours. That you think simply reading/posting @ B3D makes you special is laughable; I doubt anyone there even knows who you are, which is not the case with GW and a few others who could be included among the 'we @ B3D'.
G80 thus far looks to be on schedule for its September launch timeframe, but things could change; you never know.

The sooner you learn that you can't trust any of the pre-release info, the better for you. Especially for someone who claims to 'know' so much, but provides little by way of supporting evidence or countering arguments. The 'well, it could be anything, we don't know' argument doesn't refute what I said, and definitely isn't a basis for your BS statement.
That's become your default closing. I don't think you even read my posts anymore; as long as you get to trash me at the end, you're good. lol :roll:

No you haven't, far from it in fact, because in order to have a rebuttal you need proof, and that's lacking in this argument.
Well, we'll let Tom (or whoever does the reviewing) decipher that. It's obvious you don't know enough to make victor statements quite yet. Too bad Fuad wasn't there to help you, though. I know his arguments are quite persuasive. :lol:

I supported my argument, and you still have nothing left but regurgitated pap to counter with. So I guess it still looks like the R600 will be the VPU all eyes are focused on in the fall.
8800>2800 (unless of course you take the x to mean a value, in which case that doesn't work out, lol)

And until they actually meet in the PC ring, the R600>G80, simply because
R>G
and
600>80
It's that simple.
I don't know how to eBay.

Damn GW: throw that GS on eBay, I'm sure it would sell in a hot minute for that price.
We would be a forum of dumbasses without you...
Equally "spurious" information?
But it's not even a 12 fully fledged pipeline design. Surely you know about the decoupled TMUs and ROPs, which only give it a total of 4.
This is the seventh time you've made that allegation, yet still can't prove it.

Perhaps you should do more research, because there are games out there that ONLY use vertex shaders, so it wouldn't be catastrophic; it would be perfect for those games. And it's not just games we're talking about here; there are many other applications, and the workstation market looks to benefit greatly from this potential. Of course, neither of those is 'common knowledge', so you wouldn't be able to copy it from someone else; you'd actually have to think about it on your own.
Wait a minute. "Barring some major redesign, which would require even more transistors" is a very sketchy statement.
Yes, this marketplace is looking to get very interesting, but you went so far as to actually declare winners.
I'm still going to wait for K8L, because I think this will give AMD another good chance to near Intel in performance and value. But it won't be easy, as Conroe has launched, been reviewed, tested, and looks mighty attractive.
Also, I think my whining is "don't attack me for saying something good about G80."
lol, for the second time, I'm not bragging or even boasting about my reputation on B3D. I think you must feel threatened or something, though, to actually attack me on a different board!
And what's even funnier is that all the people who've criticized me for being a "boasting member" from B3D have come out and said "Oh yea! Well I've been at B3D for x years, and I'm a senior member, etc!" All I did was give my name - when asked.
Heh, but if you are going to trash my reputation, I'd at least expect you to back it up. Although I realize it's probably easier to just take the low route.
G80 thus far looks to be on schedule for its September launch timeframe, but things could change; you never know.
That's become your default closing. I don't think you even read my posts anymore, as long as you get to trash me at the end, you're good. lol :roll:
Well, we'll let Tom (or whoever does the reviewing) decipher that. It's obvious you don't know enough to make victor statements quite yet.
8800>2800 (unless of course you take the x to mean a value, in which case that doesn't work out, lol)
Green>Red (bigger word)
Also, your rhetoric looks awfully similar to that at DriverHeaven. Hmm....*ponders
Ape, I'd be happy to sit it out and actually wait for the launch of G80 and R600, as then the reviews will bear it all. And while you can biotch at me about facts you don't want to hear, it won't be so easy when it's someone doing the reviewing.
You can use mere math to find the theoretical size of G80 based off G71's size and transistor count, granted transistor density is the same. Because it likely isn't, we only get a ball-park range, as I've stated time and time again. But it's closer than anything your friend Fuad is going to conjure up, and you take his words with such high credibility, so mine should look even better.

So you're going to guess based on the G71 and R580; we all know how effective that's been. :roll:
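For what it's worth, that "mere math" is a one-liner. A quick sketch, using NVIDIA's stated 278M G71 count and the rumored 500M G80 figure from above; the G71 die area below is an illustrative approximation, and the whole thing assumes identical transistor density, which (as noted) it likely isn't:

```python
# Ball-park die-size scaling, ASSUMING identical transistor density.
g71_transistors = 278e6   # NVIDIA's stated G71 count
g80_transistors = 500e6   # rumored G80 count

# transistor-count increase: 500/278 - 1 ~= 0.80, i.e. the "roughly 80% more"
increase = g80_transistors / g71_transistors - 1
print(f"G80 transistor increase over G71: ~{increase:.0%}")

g71_die_mm2 = 196.0  # approximate 90nm G71 die area (illustrative assumption)
g80_die_mm2 = g71_die_mm2 * g80_transistors / g71_transistors
print(f"ball-park G80 die at equal density: ~{g80_die_mm2:.0f} mm^2")
```

A different process node or layout changes the density and blows up the estimate, which is exactly why this only gives a ball-park range, not a prediction.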
Maybe you need to re-read what I said. 0_o

So you basically can't read again. Re-read and understand what I said: I said only someone who considers it a 12-pipe design would see it as a bottleneck, ie YOU.
Or the fact that you omit any comment on the fact that 1 ps + 1 vs is more capable than two unified shaders. IMO, that's the epitome of what we're debating, yet you sweep it under the rug so discreetly.

Why would I have to prove that, when you haven't proven anything in this thread so far, including your claim that full vertex dedication would be catastrophic? In some games it would be exactly what they need. Of course you ignore that because it further weakens your argument, but like I said, your regurgitation and recombining of other people's ideas leaves you without the ability to address your own mistakes.
Oh, but Grape, you said it so clearly.

No more sketchy than your attack and then BS defence. And no, I wasn't talking about your truism that G80 will have more transistors than the G71, but of course it allows you to answer without saying anything, yet again. Guess spelling out the ALUs to you didn't help; simply put, 'without a major redesign' is no more favourable to the G80 design.
And both have similar transistor counts between 300-350 million, which you should have known by your own theory. G80 with 500 million is, again, roughly 80% more than G71 with 278. So all you did was use R520 and R500 to further prove my point. 0_o

As you would try to argue, since the R520 and R500's vertex functionality is very similar, and if anything the R600 would simply refine that, since a major change would cost a lot more transistors, like I said.
Well, it would have been convenient of you to actually stick those ratios in, but unfortunately you failed to do so, and instead came up with:

Once again you have reading/comprehension issues. I didn't declare a winner; I handicapped the favourites. Perhaps you should once again read some more and understand the difference. You, on the other hand, essentially are sulking that I picked Conroe as a 2:1 favourite and the R600 as a 5:4 favourite. So once again your lack of comprehension has steered you wrong.
Lol, that's a pretty weak point when your argument doesn't apply to anyone who somehow thinks K8L wouldn't launch during next year.

And it will come out after the launch of the next-gen graphics cards by their current predicted timetables (once again, remember, LOOKS like). So waiting for the K8L is all well and good, but by current 'common knowledge' it won't be a factor until after the start of the card war. Or didn't you get the memo from those who keep hold of the common knowledge?
Actually, you're once again displaying that "let me imply what I want to, then when you call me out on it, I'll play dumb." :roll:

BS! You weren't simply saying consider the G80; you took umbrage at something that has very little to do with nV other than being a 3rd party to this. So stop the BS. I didn't ATTACK the G80; if anything, I picked Conroe and R600 as the strongest contenders in the fall, and YOU attacked that. Really, learn how to read and write, because you're really mixing up what I said and what you said.
Oh, I'm sorry. I should have known you would have taken the term "we at B3D" and deciphered it as me partaking in discussions at B3D. What a strong argument you have there. :roll:

Yeah, how did I ever confuse the 'we at B3D' comment? Aww, dude, I couldn't care less what forum it's on; you still confuse yourself, the hardware, and your arguments regardless of venue. The thing is, you chose this one to spout your BS and mistakes. BTW, wonder if the members of B3D think the X1800XL is the R520 chip, or whether they would agree with you that it is not? Should we ask?
Actually, you both have yet to identify who you actually are on those boards via this discussion.

And GW's earned that right. I never did, because unlike you I don't name-drop to try and give myself credibility; I use the weight of my arguments. Perhaps you should try that; then no one would know anything about you at B3D, which would likely be to your betterment, because the last thing you need is more enemies, because I doubt you make many friends.
It just makes me think even lower of you.

That you think I'd even bother shows you think too highly of yourself. Nah, I wouldn't do it. Nah, I'd call in favours to have other people do it for me... if I were so inclined. :twisted:
This is you focusing on the word "look", because again, debating "looks" guarantees you a safe way out of the argument. :roll:

'Looks', so here it is: you're once again declaring the G80 will be here for September. Or did you want to once again revisit your argument about the way something "looks"?
Geforce 8 has already been confirmed by nVidia. Maybe you missed that memo?

So now you're declaring model numbers too? Wow, you're a font of speculation, yet you have trouble with my statements. Incredible. X can have any value, and since you didn't pick it, I pick infinity. Yeah, I win!
Well, it would have been convenient of you to actually stick those ratios in, but unfortunately you failed to do so, and instead came up with
Now, the term "doesn't stand a chance" would at least be a 5:1, not 5:4, wouldn't you say?
Actually, you're once again displaying that "let me imply what I want to, then when you call me out on it, I'll play dumb." :roll:
Are you that lost in everything else we're discussing that you know you'll at least gain a win if you can get somebody to acknowledge what we already knew? :roll:
Geforce 8 has already been confirmed by nVidia. Maybe you missed that memo?
Dang Grape, it ain't no fun debating when you're this uninformed on the topic.
Anywho, I am 120% confident that G80 will be nVidia's gem.
Sure, you can debate rhetoric and the word "look," but I was honestly expecting a little better from you.
Although I'm a little surprised how slimy apes can get, especially those purple ones. :roll:
.............so...... hows...... stuff.......?
Whether it be Intel vs AMD or ATI vs Nvidia, it's always a leapfrog.
In all of your jumbled clash of rhetoric...
I'll try not to gloat too much when the results come out, but I'll definitely be saving all of your little remarks and seeing how they hold up come this fall.
Anyhow, I'll conclude this debate