Is Charlie right?

So, does every person have to research? Is that now a requirement for the average Joe? Or would it be better to have some reliability from a company, so that when they release a new product, it performs the way their new products always have in the past?

Here's an example. Take a midrange card from six renamings ago versus one from four renamings ago. The performance difference is quite noticeable, as are the DX updates/capabilities. Now take one from three namings ago versus the present day. They have the same performance, they haven't moved to the latest DX available, yet they have this brand spankin' new name.

If I'd been a buyer of nVidia cards for several gens, and the last I'd bought was an 88 series, I'd be pretty happy with the results of each new purchase I'd made. But things haven't changed since then. If I were to purchase another "new" card from them, having never had to "research" before because they'd always come through in the past, and ended up with the same card I'd already had, don't you think this person might get a little mad? Discouraged, and finally do the research, find out what's going on, and then be upset even more?
 
Research isn't fun. You do it to either save yourself money, or go for quality, or both, or power. Ignorance isn't my forte.

Life isn't a pattern, Jaydee; it's new every time. What came before might not come next time, and you can't rely on it. Look at the economy. Honestly, I don't know how you can argue this.

Are you trying to say people shouldn't have to research? That it should just be given to them? Come on, give me a break.

Avg joe....pffft

Every1 starts off as an avg joe most of the time. It's what you make of it.

I research. I learn. I came into these forums not knowing what nm stood for, or FSB. Yet here I am, speaking this weird language.

Avg joe my fat ass:)

Please, let's not take this a step further. If there's anything you'll convince me of, it's not to promote ignorance (not saying you are). :hello:
 
I was hoping for more from nVidia than just buyer beware. But that's just me. Maybe some people find it OK, but I don't, as we've seen a stall in the midrange, maybe longer than the reign of the 88 series; only time will tell. This is bad for nVidia, since the people in the "know", who do the research, also make recommendations to the average Joe, so he doesn't have to research and can trust someone or something. Those "in the know" are dissatisfied with nVidia's current naming scheme, as well as other things they've done recently. So in no shape or form is this a positive for nVidia, and people shouldn't fool themselves thinking it's good for their books, because burning customers and disheartening former fans isn't good in my book.
 
Well, it makes no difference in mine. Say Nvidia makes an awesome GPU that blows ATI out of the water next season. What are you going to do? Prices are the same, like the 2900 XT vs the 8800 GTX, and you're looking for a top GPU. You have to choose.

Which is it?

You know you'll choose Nvidia. This is how I am. I don't care what the company does. As long as I know what I'm buying, WOOOOT!

I have nothing against ATI; I was with them up until the summer of 2007. They just don't tickle my fancy, which is top of the line.

I was fair this year and bought the 4870 X2 to see what it offers. Not happy about it, and not sad. I mean, if I had to choose between a 4870 X2 and a 9800 GX2...it goes without saying, 4870 X2.

As for the 280 GTX vs the 4870 X2, if you'd asked me before, I would've said 4870 X2, but after seeing some of the min frame rates...meh. If it's around the 280 GTX's, what's really the point of having those extra couple of average frames for another $150?

That's my honest opinion. I really don't think dual GPU is worth it for only up to 30% over a single 280 GTX.

The only reason I bought the 9800 GX2 (back in March) was because it almost DOUBLED the 8800 GTX and 8800 GTS, etc. That was really appealing. But 30%? Meh.

Don't get me wrong, that card rips, just not to my standards😛
 
That's good to know. That's your opinion, and I'm sure it'll help others decide what their future buying options are. Just like I've been saying all along: after all this disappointment over nVidia and their renaming scheme, people's opinions will help shape other people's future purchases, and this is what I'm saying. You say it doesn't matter, that people should research, but we all give advice here on other people's possible purchases, on whether something is a good buy or not. Again, this is my point: you, being only one of a few who isn't tired of this renaming thing, will have a much lesser impact on those people, and to me, that could be a good thing, or not, depending on the scenario.

Speaking from one's personal experience is what it's all about when it comes to helping others, and it carries weight, as it should. Maybe you're starting to understand where I'm coming from in this, hopefully. I don't want harm to come to nVidia; it just seems this thing they're doing has a lot of potential for self-infliction, and bottom line, that's not good for them.
 
Dude, Nvidia can crash and burn for all I care. I want what's top. OFC I'll help people get what they need and not exaggerate. But for me, I'll always buy top😛

Anyways, I passed my exam, so I'm not in an arguing mood anymore LOL😛
 


What kind of WOOOOT did you get with the FX5800?
 
What are you talking about, buying decisions? We're talking about flaming, not buying. I base MY decisions on performance. I base OTHERS' purchases on what they need and their budget.

I don't understand where you get that these two series were my decisions. I said that I started at the 8 series because that was the dominant series at the time.

Even the 9 series was stronger than what ATI had out at the time, if you remember. The 3870 X2 barely kept up with the 9800 GTX (an 8800 GTS) and the 8800 Ultra...yes, it was more expensive than both.


That still doesn't mean I base my opinion on them. They've just had more luck with their GPUs. If there's no competition at the time, there's no point in advancing.

Thank you :hello:
 


BTW ape, I did say it was a mistake that I did not put in a link.


I don't think Charlie just makes this stuff up; he obviously has contacts in the business that he cannot divulge.
 
Oh

BTW, I heard that Nvidia and ATI just got married; they're making NVitia.

I also have contacts I don't want to divulge😀 😛

Super spies!!!! Like in Hollywood:)

Awesome!
 
No, you started on the 8 series because it was the first time you were old enough to go out and buy something without mummy there to hold your hand.
S-A-N-D-R-A
 


Moving on, I think this has come to its end; Rangers is too immature for this:)

I don't even know how that made any sense.

BTW

Why are you using sexism to make your point? Are women not able to have a say in hardware?

What would change if I were a woman? Are my statements any less valid, you prick?

Honestly, completely uncalled for.
 
Do I have to spell it out for you? You getting uptight, women getting uptight... is it that time of the month again, SANDRA? But my humor must fly right over that pretty blond skull of yours, thicko.
 
As always, you have nothing but insults coming out of your mouth. Whether that's because you're a big man on campus in forums and a small man in real life...I don't know, and I don't point fingers.

Can you try to make it a habit of not insulting people to get your post across? There's a thick line between insulting, being a prick, and joking.

Smart jokes are the ones everyone can enjoy. The ones that come out as a direct sexist insult, I'm sorry if I don't like all that much.

Now move on, little boy, and let the grown-ups talk.

No point in flaming the rest here, if your E-penis has something to say you can flame my msg box. No point in wasting every1's time.
 
The big crybaby can't take it; maybe it's time you went home to your mummy. As for the insults, they're only for you, a wee Christmas present from me to you.
 
I think there's a thick line between crying and unnecessary... Seeing as you won't drop the joke, you don't have closure yet:) What I notice from your posts is that when you have nothing to say, you insult, simple as that😀

As always, you walk out looking like the star😉

Good luck with that

Thanks for the Christmas present though; I'm glad you're so generous.

Here's mine:) --------------------------> :hello:
 
I would rather resort to violence, but name calling will have to do. Plain and simple, I don't like you. I don't like the way you try to hide that you're an nVidia fanboy. Come on, just come out with it; it will make you feel better and help you gain some respect (which you badly need, I may add), so just shout it out instead of hiding it.
 
I vote for a thread lock. What was a questionable post has gone on to become this? Is Charlie right? Probably about some of it. (I think a lot, but who am I?) Is Liquid an Nvidia fanboy? Looks like it to some/most people. Did Rangers violate posting rules and is he no longer contributing to the thread in a meaningful way? Seriously, if this is the way the thread is going, just lock it now...
 
I'd agree with that; this is just pointless now.

And I did admit I lean more towards Nvidia because I'm an enthusiast, but I don't let it cloud my judgement when telling people what they should buy.

For example, do you want the fastest setup? Three 280 GTXs.

Do you want the closest setup to match it at a good price? 4870 X2 quad.

After that would be 260 GTX tri, then 4850 X2 quad,

etc.

Either way, here's a related question: any news on Nvidia doing DX 10.1, or just jumping to DX 11? Or are they still a DX 10 company? For this GX2 card or for anything to come?

I'll agree with that much; the 200 series' lack of DX 10.1 was kinda a shock.
 
OK, let's get this thread back on track.

Good question, L1qu1d. It'll be really interesting to see how both companies are going to handle this. There's still lots of speculation about whether DX11 will arrive first, or the cards. The hints from ATI that they already have their Win7 drivers going may mean they'll be going to DX11 at 40nm, but then again, they may not, and the same goes for nVidia.
The timing's off between the 40nm release and when DX11 arrives. Normally they might fudge a little and wait some for 40nm, but nVidia can't wait, and thus ATI can't either. Who knows? I'm just hoping it won't turn out to be another DX10 fiasco, where they changed DX10 and then later finished it with DX10.1.