ATi trying too hard?

As TGGA said, this is the direction graphics is heading, and nVidia's only option.
Because AMD/ATI have the better option: they can do both, while nVidia has to chase this new direction, possibly diminishing their product for desktop graphics.
It's not dirty pool; it's simply pointing out what's going on, without telling nVidia's side of things, or their need to change.
Having the first cards out is important in many ways, as it leads to developer support in games, as well as strong early sales, especially coming in with a DX change. So promoting them to the max is only natural.
In a way, the OP did ask a noobish question which, because of the way it was phrased, could be construed as nothing but flame bait. But some, I'm sure, don't have a clue about the changes or nVidia's position, so it's somewhat good, somewhat bad. Still, choosing to single anyone out confrontationally, instead of better defining their points, makes the poster look even more noobish, and more like flame bait.
You blew what could have been a good thread, with more people contributing, by the way you responded and the tone of your questions.
 

The majority of people who buy cards like the GTX295/4870X2/5870 run at 1920x1200; the minority run at 2560x1600.
 



Actually, you're quite correct. My statement was simply comparing the top-end single-GPU cards from ATI and NVDA: 285 > 4890. Similarly, the 380 will likely be slightly stronger than the 5890. Is this a fair comparison? Hardly. For the price, the 5890 will trounce the 380.

It comes down to this: who *will* have the most powerful card? Probably NVDA. Who will pay 2x the price for a ~10-20% performance increase? Not me.
 


How do you know the pricing on the 3xx series?
 


QFT

To be honest... going back to the ATI X8xx series, I can't remember ATI ever being the more expensive option for the performance. (This is of course disregarding the HD2xxx debacle, and even then the highest-end ATI was about the same price as the 8800GTS.)
 


One thing ATI has never done (to the extent of my [probably small T.T] knowledge) is take advantage of the situation by idiotically raising prices. The 8800 Ultra was what, $800!? I'm sure any CrossFire configuration at that time under $500 would easily have crushed it.
 


Die size.


ATi have much more room to play with in terms of prices while turning a profit.


Nvidia don't. I'm also not sure about Nvidia's power and thermal envelopes for ramping up into an X2 variant.
 


Well, you cannot assume anything till we get proof. Of course, if we go by past experience we can all say that Nvidia will overprice, but this time it is different due to AMD's 5 series. If Nvidia doesn't get their act together, there will be no point in spending mad coin on a 3 series when we can get a 5 series for much less without sacrificing performance. From what we have heard, it looks like the prices will be slightly higher, but nothing to get worried about, hopefully...
 


That's a fair assumption, but looking at what has been happening, ATi has been closing the performance gap generation after generation while NVidia has stayed on roughly the same linear increase, which is what is letting ATi catch up. This, of course, happened because instead of reworking the architecture, NVidia just kept reusing 8800 variants.

Now that NVidia is pressed to come up with something new, we are seeing that they stumbled a little and have changed direction completely, toward general-purpose graphics computing. I'll grant the new cards will probably stomp anything ATi has on that front, but, to reinforce your point, I find it incredibly hard to believe they'll do anything more than match gaming performance at a higher price. Those cards are going to be notably more expensive to make with no graphical advantage to show for it; the extra money pays for the CUDA cores, which are useless to most gamers. No one is going to have their cake and eat it too when it costs the company more to make it.

So let's assume you're right that they may come out with a more powerful card; I think in that case it has to be at 8800 Ultra pricing, especially if they had to produce all this other nonsense that has nothing to do with gaming. They should keep all that crap on their workstation cards; why blur the line between gaming and general processing when it's gonna end up turning away customers because it's more expensive? I doubt most of the people who are gonna buy it will even use it. Like you, I'm not paying more for a bunch of crap on a card I'm not gonna use. I think NVidia is gonna start bleeding money if they don't come up with some brilliant pricing scheme for these new chips. Their whole direction just seems completely convoluted to me, I don't know.

All I do know is that I can rest assured they'll probably both be around when I need to update my HD 4890, and may the best man win. 😉
 



Proof tells us that nVidia makes bigger chips, and bigger chips cost more money.
Facts tell us nVidia always has the highest-priced card.

Though you do present good points.
 


So what? That's not contrary to what I said. I didn't say the majority were 25x16; I said 25x16 would be more common on such a setup than on single or mid-range cards. Which is the point of why that resolution is relevant when talking about the performance crown.

1280x1024 might be a more common resolution for the HD5850, GTX285, 275, 4870, etc., but it wouldn't be acceptable if they were unplayable at 1920x1080/1200.

Like I said, what would your defense of that major performance hole be?

You criticize him as a Fanboi, saying it's a myth, yet multiple reviews show exactly the problem he was pointing out, to which you simply say 'well, those big panels aren't that common', as if GTX295s were the norm. Then later, after calling him a Fanboi, you agree that the GTX295 sucks at high res, but you come up with an incorrect excuse as to why (memory).

It's not like anyone would expect the most expensive card out there to handle the higher standard resolutions better, right?

Seriously, for calling someone a Fanboi and saying what they said was myth, you haven't done much to back up your statement (other than a memory myth), and little to keep from looking like a Fanboi yourself, other than back-tracking and qualifying your statement with 'Oh, I didn't mean at that resolution, only by my own rules'. :pfff:

You should take the advice of your avatar on this one. 😛
 
nV's low yields on early chips also don't bode well for the price of the base chip (likely a lower yield rate than ATi's, and on fewer candidate dies per wafer to begin with, given the larger die), plus a more expensive and complex PCB to support more pins, more layers, and more memory (with the asymmetric bus the total memory will be above 1GB, as it's unlikely they would go below 1GB again, so expect 1.5GB, which will cost more too; the price spread from slow to fast memory is about 20%, not 50%).

Also add to that a mature HD5K board versus a new G300 board, and it's likely not going to be anywhere near the same cost initially, and therefore unlikely to be very close in price either, so it needs to be enough faster to be worth the difference.
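To put rough numbers on why the bigger die hurts so much, here's a back-of-the-envelope sketch in Python; the die areas, wafer size, and defect density are all assumptions for illustration, not confirmed figures for either chip:

```python
import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Standard rough dies-per-wafer approximation (area term minus edge loss)."""
    r = wafer_diameter_mm / 2.0
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_mm2):
    """Simple Poisson yield model: a bigger die catches more defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Assumed: 300 mm wafers, 0.003 defects/mm^2, ~330 mm^2 vs ~530 mm^2 dies.
for name, area in [("smaller die (~330 mm^2)", 330.0),
                   ("bigger die (~530 mm^2)", 530.0)]:
    good = gross_dies(300.0, area) * yield_rate(area, 0.003)
    print(f"{name}: ~{good:.0f} good dies per wafer")
```

Under those assumed numbers the smaller die comes out to roughly three times as many good dies per wafer, which is the cost argument in a nutshell.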
 
People need to be sensible about it.

What makes anybody think that Fermi will be much better than Evergreen? Looking at the stats, there is no reason to believe that.

What makes anybody think that Fermi will be cheaper than Evergreen? There is practically no hope of that happening, and if it did, AMD would just slash prices anyway.

Nvidia cannot make miracles happen. All they have done is double up the shaders etc., same as ATI doubled theirs. Why would anyone believe that this will lead to a crushing Nvidia victory?

Fermi might be faster than Evergreen, but it will cost more. Fermi might also be slower, and it will still cost more. The way I see it, Nvidia are beaten already anyway.
 
It doesn't matter. How much faster can it be, Random? It's a doubled G200, same as Evergreen doubled R700. The top Fermi can only be as good versus the GTX260 as the 5870 is compared to the 4870, assuming everything else is equal (i.e. Nvidia also manage a 10-15% clock speed increase).
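As a sanity check on that doubling arithmetic, here's the naive math (a sketch only; the 12% clock bump is an assumed value mirroring the 10-15% figure above):

```python
# Naive upper bound: performance ~ shader count x clock speed.
# Both vendors doubled shaders; assume both get a ~12% clock bump.
gtx260_to_fermi = 2.0 * 1.12   # doubled G200 shaders + assumed clock gain
hd4870_to_5870  = 2.0 * 1.12   # doubled R700 shaders + similar clock gain
print(gtx260_to_fermi / hd4870_to_5870)  # 1.0 -> relative standing unchanged
```

If both vendors scale the same way, the gap between them stays roughly where it was last generation.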

To me that says both cards will be as close together in gaming performance as the current GTX260 and 4870 are. Once again, Nvidia have a chip that is 40-50% bigger, too.

I don't see any reason why anybody would believe Fermi will be better than what the paper specs suggest. It looks like G200 vs R700 all over again; the only difference this time is that ATI are out much earlier.

Even if Fermi turns out to be 20-30% faster... we know that AMD will just slash prices on these and *still* make a profit, just like with R700. Nvidia are in a horrible situation here.
 


I'll take a crack at this. There's a decent argument either way on whether ATI really is trying too hard to market their new lineup.

On the one hand, if someone's looking for a DX11 card, how hard can it be to sell them something from the only DX11 lineup? Marketing to these people is unnecessary. Likewise, ATI historically has been ahead on the price/performance curve too, so the red team also has most of the bargain hunters in their corner.

On the other hand, demand does appear to be greater than the retail supply. With most cards in the hands of marked-up name brand system builders, and the 'waiting list' for 5xxx retail cards so long, Nvidia's next thing could be on the market before everyone currently in the market for a new GPU, at least the ones not fiercely loyal to either red or green, can get a Radeon. Looking at it that way, ATI marketing makes perfect sense.


But in all honesty, until there's a single-GPU card that can hammer Crysis on all max settings at 60+ fps while only drawing 150W (...the holy grail? When are we getting that, 2012?), who cares? Seriously, is BattleForge THAT good?

 
About the DX11 argument:

If you believe DX11 isn't important, and you believe it will be ages before DX11 becomes standard, and you hold off on buying DX11 cards because of that... what do you expect the games companies to do?

There has never been a better time to give the thumbs up to progress, and thumbs down to the tired status quo. ATI have the best cards in any DX; buy one and make the game devs believe that we want more than what Nvidia have been trying to keep us boxed into.
 
DX11 is a strong selling point; the only problem for the consumer is deciding whether to wait for a new card until both companies are directly competing and there are more games with DX11 support than the shitty BattleForge and Stalker (mostly Stalker).

Frankly, I can wait until more cards and more games come out; the price of the 5770 shouldn't go anywhere but down in the meantime.

DX11 is only a strong selling point when you need a new card, not really if you just feel like replacing an aging one. IMO.

Actually, it's not that great even if you need a card, now that I think about it.
I live in the US, so I can just buy EVGA/BFG and do a step-up/trade-up if Nvidia or ATI makes a better card within 3 months (and by Nvidia I mean actually releases one, but I guess a card sold and given back is still counted as a card sold).
 
I think a few of us are missing the point here.
Though it sucks to see cards sold out in the channel, they'll come, and are coming fast enough.
But the really important thing is that when nVidia comes out with their cards, the OEMs will have already been satiated, and though OEM sales come at a lower profit per unit, the unit numbers are much larger, and you also have the OEMs' attention down the road for repeat orders.
So, by the time nVidia does come out, ATI's cards will be fine in the channel for numbers, they'll have a lot of dev support from working with them on their DX11 projects, the OEMs will have been selling large numbers of ATI/PC units and be ready to order more, and ATI's drivers will be much more mature.
So, like jennyh says, nVidia is in a world of hurt. The best they can hope for is that Fermi is a killer card, priced reasonably and able to scale well, with that performance carried down through the market segments, heading into a slower purchasing period where a lot of sales have already been made.
Will this happen?
More likely, we will see the same perf/price we had with the 200 vs the 4xxx series. nVidia's disadvantages have already been covered by TGGA: initial costs either passed on to consumers or eating away profits, and again, in a previously well-traveled and satiated market, during a slow economic period.
 
I agree with Jaydee, and I also think the pricing on Fermi is going to be pretty high. If Fermi can best the 5xxx series in performance, it's going to cost a lot. We've already witnessed how high Nvidia will go on price: just one 295 card at its lowest on Newegg is, I think, $500. So... if Fermi can perform, Fermi can price.
 


The fact is that the GTX295 gets higher minimum fps than every single card.
Seriously, why do you try so hard to argue that the GTX295 sucks so much?
The reality is that the GTX295 has been the fastest card out there for almost 10 months!!! and some people refuse to see the truth.