I think I'm an ATIer for life now.


microterf

Distinguished
Feb 19, 2009
After seeing the $379.99 price for the 5870, I think I'll stick with ATI from here on out. I was pretty impressed with the Eyefinity reviews on the 10th (I've been looking for a way to game on my extra monitors), and the performance of the 5870, now that we can see it, is right where most of us expected it to be.

The last Nvidia card I bought was a GeForce 4 Ti4600 for $400. It burnt out 2 months later, so I bought a Radeon 9700 Pro for the same price. One thing we know for sure is that no matter what we buy now, 8 years from now it will be little more than a paperweight. That being said, I cannot believe how cheap this card is compared to how much Nvidia charges for their single-GPU cards when they are on top.

My next build was going to have whatever card was the fastest, but now I don't care what NV brings out; I'm getting two 5870 X2s when they hit the egg.

Thank you AMD/ATI!

 
Solution
Never say never 😉 If we leave out cards for a sec and just look at the companies, I respect AMD more than Nvidia, but that can also change. For example, if JHH would finally step down, their internal culture and practices might change; the same can be said about AMD.


No it's not; you've been saying they couldn't raise the price and MIXING your contradictory statements as if they were reasoned replies. You even spell it out further later:

Increase the price = sell less = overall less profit

Which is wrong: increase the price when there is excess demand and you INCREASE profit, because you reduce the excess demand. Manufacturers want to reach equilibrium, so at launch both the HD5870 and 5850 could've been more expensive because, as you noted, there were shortages and more demand than there was supply. By having the price just a bit too low, ATi/AMD left money on the table. They could then adjust as they move along, dropping prices even in the second week, etc.
That you don't get that from what you yourself wrote, without my input, means you don't understand it to begin with and are probably parroting someone else's statements as if they were even semi-logical from an economic perspective.

And before you confuse yourself again, I'm not saying they would maximize profits by charging $600/card, but they are leaving money on the table by underestimating the value of their cards to early adopters.
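To put some rough numbers on the "money on the table" point, here's a toy Python sketch. Every figure in it is made up purely for illustration (the demand curve, the 20,000-card supply, and the $250 unit cost are assumptions, not real ATi numbers); it just shows why, during a shortage, a below-equilibrium price gives up margin without gaining any sales:

# Toy model of "money left on the table" -- all numbers are hypothetical.
def weekly_demand(price):
    # Assumed launch-week demand: 60,000 buyers at $0, minus 100 buyers per $1.
    return max(0, 60_000 - 100 * price)

SUPPLY = 20_000   # assumed cards available in week one (shortage)
UNIT_COST = 250   # assumed cost per card

for price in (380, 400, 420):
    demanded = weekly_demand(price)
    sold = min(demanded, SUPPLY)              # can't sell more than you shipped
    profit = sold * (price - UNIT_COST)
    print(f"${price}: demand {demanded:,}, sold {sold:,}, profit ${profit:,}")

# At $380, demand (22,000) exceeds supply (20,000), so every card sells anyway;
# moving toward the market-clearing price (~$400 here) raises margin without
# losing a single sale. That's the excess-demand argument in one loop.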

I'm no Nvidia fanboy considering all of my current GPUs are ATI. I'm sure you're just another ATI fanboy though judging from the way you're rabidly attacking Nvidia...

If you have a problem with the facts, then fix them, but you still haven't corrected or supported your cheating statements, and you continue to post things counter to the facts, which is my problem with nVidia and their fanbois right now, since they have nothing better to do than FUD the forums. IF the G100/G300 comes out and is good, then I will give it the respect it deserves, like I did the G80, but I'm not about to back off confronting the fanbois or people like yourself who distort the facts (either willfully or through ignorance).
 
Just to interject for a moment about ATi vs NVidia marketshare (not including Intel): you're both wrong by about the same margin (~10%). As it stands as of Q2 '09 (the most recent I could find in a hurry):

ATi: 18.4 million units
NVidia: 29.2 million units

Total: 47.6 million units

ATi: ~38.66%
NVidia: ~61.34%

That's a gap of roughly 22.7 percentage points. Not necessarily crushing, but also not the position you want to be in. Looking at it simply, it's about 40/60, which really isn't all that bad. Please note that ATi saw a 41.5% increase in volume while NVidia saw a lesser 23.6% gain. Regardless of what the marketshare figures say, ATi and NVidia are both household names (at least to computer enthusiasts), which tells us that neither will be going anywhere soon.

Source
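For anyone who wants to check that math, here's the arithmetic spelled out in a few lines of Python (the only inputs are the JPR Q2 '09 unit figures quoted above, which I'm reading as millions):

ati_units = 18.4          # JPR Q2 '09 shipments quoted above (millions, assumed)
nvidia_units = 29.2
total = ati_units + nvidia_units               # 47.6
ati_share = 100 * ati_units / total            # ~38.66%
nvidia_share = 100 * nvidia_units / total      # ~61.34%
gap = nvidia_share - ati_share                 # ~22.7 percentage points
print(f"ATi {ati_share:.2f}%  NVidia {nvidia_share:.2f}%  gap {gap:.1f} pts")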
 


At this point, you're just cherry picking certain lines and misrepresenting my quotes. I suggest you re-read that post in its greater context since it was a reply to rhys.

Let me simplify and sum it up for you since you don't seem to get it:
If ATI overcharges, they are above equilibrium. They may make greater profit per card, but their demand will be lower. If ATI undercharges, like now, their demand will be high but they make less profit. If they charge slightly higher than now, at the equilibrium price, they won't benefit much either, since Nvidia has the greater market share and people (your average non-tech-savvy consumer) are more likely to get the GTX285 over the 5850 if they are the same price, since most people prefer Nvidia.




Ironic that you're accusing me of distorting the facts... that's the pot calling the kettle black. It's pretty obvious that you're an ATI fanboy. You've fallen into the typical fanboy trap of thinking that the company you worship respects you and lowers their prices to make you, their loyal customer, feel special.
 
@BlueScreenDeath: His main point is that the 5870 is realistically worth more than it is selling for. People would most likely still buy it in the same relative amount if the price were higher. Last time I checked (a few days ago) the gap between a GTX 295 and an HD 5870 was about $80 on Newegg. That's some breathing room. It's worth, conservatively, around $400 at least. They're not keeping the price low to be nice to their consumers per se, but they sure aren't doing it to be mean either.
 
Anyone buying ATI now for the first time will be left with a good impression, which sets up further purchases in the future. This is a seeding of the market, unlike last time, when the 4xxx series came in at very low pricing aimed mainly at nVidia's true money maker, the G92.
Since that's no longer a factor, and the new cards outperform the old G200 series, they should be priced above them on a per-performance basis, which they aren't, and that's where you need to look.
 


Then why didn't you write 70% (over-exaggerated to begin with) of discrete GPUs instead? First, it's not a correct number; second, it was talking about the wrong thing. Why don't you use common sense before replying and get your facts correct first? The onus is on YOU when posting figures to get them right, or do you not understand the concept of supporting your statements? Instead of admitting your mistake and saying 'I meant this, and I exaggerated the number', you want to make it my fault; well then go back to your original reply to MikeinBC and delete it! He was just being hyperbolic and loose with the facts like you!

The very fact that Nvidia owns a much larger segment of the market gives them MORE power to set prices. ATI does not have this same luxury.

WTF are you talking about? Do you think that because Ford makes more Tauruses they have more control over pricing than Ferrari? The two are totally unrelated. The only way that relates to pricing might be the fanboi base they can draw on for excessive pricing, and nV proved that too is fleeting with the whole GTX280 experiment, where the underdog controlled pricing, not the 2:1 volume leader. And since nV has no equivalent product, their pricing influence in the HD5K launch is even further reduced, where their only response is moving parts at a loss (not an economic motivation if it's new production, but a marketing one).

Nvidia is able to overcharge for the GTX280 since they are more well known and have a larger market base. ATI cannot do the same for the 5850 and achieve the same level of sales that the GTX280 did.

That has little to nothing to do with it; it's about perceived value far more than being well known or having a larger previous market base. If the HD5870 were twice as fast as the GTX295, they'd have no problem moving many times more parts than the GTX280 regardless of their notoriety or fan base. It's about perceived value (even for $600 cards), and like any new launch, unless it's incredibly better than previous generations there's nothing that will make people move en masse, even if it's a great card (GF8800GTX) or a mediocre one (GTX280) comparatively. And I doubt the GTX280 sold anywhere near as well as the HD5870 over the same period, considering how poorly it did (even with production shortfalls) before price drops.

And ever since the pricing of 3x00 series, the general trend was Nvidia releases, ATI releases, Nvidia drops prices, ATI drops prices, etc

And it's ATi that set the bar for pricing, not nVidia; nVidia priced well above the ATi cards, and nV could've achieved the same pricing with a dart-board. ATi's pricing was far more influenced by cost than nV's, so your argument about them doing the same thing was proven wrong with the HD5K launch, where ATi could have set the price higher and didn't.

If they beat nV a long time ago, why does nV still have the majority of the market? You really should do your research. The G92s beat the 3x00 hands down.

Do you even remember what you wrote, and what I specifically replied to, or are you that out of touch you can't remember that, let alone your revisionist history?

To recap;
Back in the ATI 3x00 vs Nvidia 8x00 battle, Nvidia beat ATI hands down with performance and manufacturing cost.

To which I replied after correcting you with regards to the HD2K vs 3K;
...for the HD3K it was a better situation, where against the G92 it had much better yields since the G92 ran into early problems, and was cheaper again against the G80. It was the performance that was still a bit lacking, but not manufacturing cost.

Which clearly states that the G92's performance was better than the HD3K's, but the HD3K had better yields/costs. Get it?
As to why nV still has market share, it's pretty easy to guess if you bothered to follow the industry, which you obviously haven't... the G92 was a better performer but cost more, so nV lost money propping up the card to maintain sales, which kept their market share high, but it was still and is still falling. This is nothing new and has been discussed here many times. YOU should really do YOUR research; your posts are full of errors, either ignorance on your part or outright lies.

The 4x00 did beat the GT200 (especially after price drops), but overall, ATI is still catching up to Nvidia in terms of market shares.

No one disputes that ATi is recovering from their massive loss from the R600, but that is the same situation nV faces now, where they are relying on their past success (like ATi did on the X1900 series) to help them weather the storm of a potential major setback. But that's not what you said; it's your point B about cost:

B. Their manufacturing is just different. The GT200 cards had a large memory bandwidth, which I believe was expensive. ATI going with a smaller bandwidth but faster memory saved them money in the long run. Back in the ATI 3x00 vs Nvidia 8x00 battle, Nvidia beat ATI hands down with performance and manufacturing cost.
ATI caught up quickly with the 4x00 series and now it's beginning to beat Nvidia.


So it's very relevant to what you were saying about costs, and ATi is not just beginning to catch up; it had beaten Nvidia on that front before. And if you're talking about performance, then it's not beginning to beat nVidia, it already did.

Seriously, you're all over the map with your mistakes, and given the exceptions you took with what Mike wrote, you're not in a position to get any additional leeway as if you're someone who accidentally walked in on this discussion, especially when you tell people to stop kissing IHVs' a$$es.

You clearly don't understand I'm differentiating ATI vs Nvidia in terms of overall market as opposed to ATI vs Nvidia for each generation.

I understand that, probably better than you do, and thus I also understand that the way you're using it for other purposes is incorrect. Overall market means next to nothing, no more than nV's Mobo market share mattered leading up to their announcement that they're pretty much done there now too.

Like the caveat says: past performance is no guarantee of future success.
 


Actually you're posting the same information I was, except I have more recent JPR info as well from their overall figures, but here's the better source for your info, the source itself (as JDJ could probably confirm, I was posting this before it was at Xbit or Xtreview):

http://jonpeddie.com/press-releases/details/amd-soars-in-q209-intel-and-nvidia-also-show-great-gains/

If you actually follow the numbers, that's still the 50%+ I was talking about for Intel.

I don't know where my margin of error is in that, since it's based on the slight improvement due to netbooks, which actually increased the number, as I alluded to with my '+'.

Edit/PS: here are the add-in numbers for that same period you were looking at:
http://jonpeddie.com/press-releases/details/jon-peddie-research-more-positives-than-negatives-in-q209-for-the-graphics-/
 


I read it the first time and have no need to re-read it. If you wish to say something different, re-write it, but as you said it, and as it remains unchanged in writing or supporting evidence, you don't understand the concept of excess demand and leaving money on the table (or in consumers' pockets).

Let me simplify and sum it up for you since you don't seem to get it:
If ATI overcharges, they are above equilibrium. They may make greater profit per card, but their demand will be lower. If ATI undercharges, like now, their demand will be high but they make less profit. If they charge slightly higher than now, at the equilibrium price, they won't benefit much either, since Nvidia has the greater market share and people (your average non-tech-savvy consumer) are more likely to get the GTX285 over the 5850 if they are the same price, since most people prefer Nvidia.

You see, you don't need to simplify it for me, but you obviously need it simplified for you.

The first part you start getting right, describing the basic microeconomic concepts of supply and demand equilibrium, and then you end it with a marketing concept, ignoring your entire first portion.

As rhys already told you, "please don't quote the most basic of facts as if you know something everyone else in the world doesn't know already", especially when you don't understand more than the Wiki definition.

I have spent enough time in both fields to know the difference, and to know you're trying to cop out. ATi could charge more, because they have excess demand for the card; how much more is questionable, not whether they could charge more to actually meet equilibrium rather than sit below it.
Your second assumption is that demand is simply and only about Bungholiomarks and performance in the handful of games in which the GTX2xx can compete, not the 5+ areas where nV has nothing. As for 'most people prefer nVidia', nice try. Most people prefer Intel, so when Larrabee comes out I guess nVidia will be gone, right? And in the HPC market, I guess nVidia should just roll over and die since they have little market share compared to the competition, and people just like Intel better. Same lame argument, and equally irrelevant, since for both the F100 in the HPC market and the HD5K in the gaming market, people want performance/price, not just some fanboi favourite. If all that mattered was previous preference, then there would be no market share changes, right?
Read the JPR numbers: the most significant increase in a quarter, with Intel taking second place and nV taking third. And even in the discrete market, ATi is up and nV is down; that's because the numbers change. And you seem to think that the demand curves for the GTX2xx and HD5K are the same, when even you should know that they are completely different curves for both supply and demand, at opposite ends of the product life cycle.

Ironic that you're accusing me of distorting the facts... that's the pot calling the kettle black. It's pretty obvious that you're an ATI fanboy. You've fallen into the typical fanboy trap of thinking that the company you worship respects you and lowers their prices to make you, their loyal customer, feel special.

Talk about not reading what was written.
Unlike you, I know what I am talking about, both in graphics and economics, because I have vastly more experience in both than you do. I disparage ATi when they deserve it and nV when they deserve it, and like I said, PROVE ME WRONG!
I don't have any loyalty to any company, not even my favoured Matrox. At work I've provisioned and used nV cards far more than ATi and Intel; at home I've swapped between both. What I don't like is the fanboi revisionism that comes out during launches; that's what you did with your cheating statements, and also with your 'errors' in the previous card history and what it means for this generation.

Unlike you, I pick my statements carefully, and I'd laughingly admit my error or my hyperbole if shown so, but you seem to acknowledge neither in what you wrote.
 
Yea, the new ones just came out.
As far as the overall market goes, if you remember, the K8 also commanded high pricing along with Intel, the clear dominant leader in CPUs, but the perceived value of the K8 verified its worth, the same as the 4xxx series did for ATI, as its performance made nVidia realign its pricing on the G200s.
 


After looking it over, I don't actually think you made a comment on any marketshare besides Intel's (which I factored out, and did the math for everyone, because it wasn't just an ATi/NVidia marketshare graph), so I digress. Intel definitely holds the majority share, though.
 
My first PC had an ATI Rage LT Pro, 33MHz in a 2x AGP slot, no heatsinks, no fans. I learned that modes changed dynamically and saved power, and I kicked butt in Half-Life 1. I got the CD in the first group ever on planet earth. Back then it was a time when you could call ATI for support, and I got it. Today it's millions and millions of sales, and hopefully a forum is honest, to gain replies.
I have had a dozen cards since then, 8 PCs, all my own builds...

The only reason I stay with ATI is the thought that there's something else besides hemi heads; 4 miles a gallon and 550 cubic inches just to go get some bread and milk is way unnecessary. Today I still have fanless ATI at 725MHz, unheard of at 55nm, and AVIVO, as it is called now, had a version even in the late 90s...

Video cards are about more than gaming; they're part of an evolution, and pretending challengers and rivals like Nvidia don't count only adds to the lack of thrill of it all.

I'm here at 1280x960, the card at 110MHz and less than one volt. Since when does Nvidia go that far? I'm too old to be called a "fanboy" of ATI; I stick with the facts. And if you're burning out a card in 2 months, you ain't learned a damn thing about cooling; the OE cooling is the first thing to scrap on all cards today. That is the evil of both manufacturers, and has been for nearly a decade, or more for some enthusiasts.
 


First off, I buy whatever gives me the best performance for the least money. I don't care which brand I'm buying.

However...

1) Performance tweaking gives a competitive edge. Any company is going to do this to some degree.

2) Capitalism: giving producers an incentive to use/sell your product is part of our capitalist market.

3) Capitalism (If you don't like the price, don't buy. Then they'll reduce price)

4) Capitalism.

5) Capitalism, and potentially to reduce complaints of instability due to potential driver issues?

6) I usually don't buy video games based on movies, so no idea on this one.

7) No idea.

8) DX11 is worthless, nothing uses it yet.

Half of your complaints should be directed at every major business in this country. :)
 


I don't really care about the rest of your post, but I don't understand the reasoning behind #8. Ever since DirectX 9, people have been shouting that any further DirectX evolution is useless. I don't get it. It adds new, better, and faster-performing eye candy, not to mention that since DirectX 10 they completely redid the entire thing to be more streamlined for game developers (e.g. people saying OpenGL is less convoluted to write for). That may sound counter-intuitive (because then no one knows how to use it, whether it's simpler or not), but one of these days people are going to realize DirectX 9 wasn't some kind of gift from the gods; it's just what they got used to.

And the whole "nothing uses it yet" is like not buying a graphics card that runs everything at 120 FPS, instead opting for the same price as one that runs everything at 60 FPS because they both "look the same." Sure, they both willl probably look the same now, but what happens when the games start stressing that 60 FPS down and the guy with 120 is still over its max? Sound investment, sure... :pt1cable: I justify that comparison because neither the 5870 or 5850 are more expensive than GTX 2 series cards and perform above or at their tier now, even without the Direct3D 11 optimizations and improvements.
 
No wonder Derek Perez quit; he didn't want to have to use these lines:
[attached image: infoxj.jpg]