Nvidia vs ATI 2010



According to whom?
Most review sites had them pegged @ $249/399 prior to launch day.

The 4800 had 800 shaders vs 320 in the 3800. They didn't even double the die size.
3800 -> 4800 = 55nm -> 55nm
4800 -> 5800 = 55nm -> 40nm
They put 1600 SPs in there, but the die size more than doubled! I'll give them 30% of the die size for the tessellators/DX11 hardware (the 4800/3800 had a tessellator too), but that still leaves the ~89% density gain they got from moving fab process.
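For reference, here's the back-of-envelope behind that 89% figure: if transistor density scales with the inverse square of the feature size, a 55nm to 40nm shrink ideally packs (55/40)^2 ≈ 1.89x the transistors into the same area. Real layouts never shrink this perfectly, so treat it as an upper bound:

```python
# Back-of-envelope only: ideal density gain from a 55 nm -> 40 nm shrink,
# assuming transistor density scales with the inverse square of feature size.
density_gain = (55 / 40) ** 2 - 1
print(f"~{density_gain:.0%} more transistors in the same area")  # prints ~89%
```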

Good thing you'll 'give' them that. :sarcastic:
So what else changed in the 3800->4800? Maybe if you use your MEMORY you'll be able to figure out where a lot of those savings came from; it might RING a bell. 😗

The 4870 killed the 3870X2 more decisively than the 5870 did the 4870X2.

Not really, except in the situations where the 3870X2 had terrible scaling and acted similar to or worse than a single HD3870.
Which isn't saying much. Like the GF8800's improvement over the GF7900 series (not over the X1900/50), the HD4K was a more capable previous generation than either of those two were. Even so, the improvements were in line with previous generations, approaching two of the best of the previous generation, not two of the weakest.
 


But the end result is still simply 'making explosions bigger', 'more debris', 'more smoke', etc. It's unfortunately still not game physics, just effects physics. It looks nice, but it still doesn't affect the game itself.

Hopefully, as we move towards a more powerful and more vendor-agnostic future, they'll add more interaction physics, and hopefully it will work on more devices.

 
I'm sure it won't be long (especially after DX11) 'til Havok begins work on GPU-accelerated physics, and let's face it, Nvidia won't have a chance there. Havok have been the kings of physics since back when it was just a gimmick.

Now just about every game is expected to have real-world physics. It's almost always done by Havok, and when they work on GPU acceleration, it will be the mainstream solution.
 

Don't forget the "open" command 😀
 
Well, that's what nV thought when they first backed HavokFX SLI physics, before they decided to buy Ageia.

I think it will still be a while after Havok enters the GPU/OpenCL fray before we have good interactive physics, but I could see them making a good first step since it would have a better CPU tie-in and they do have very good game physics to start with.
 


Yeah, I never used it. I mainly did the save/reload cheat; I didn't want to use the console open command because I wanted some challenge... up until I could make an enchanted amulet to 'open all'.
 
ATI have a huge lead in production capability and Nvidia will never close the gap with their current business model.

Absolutely correct.


ATi's engineering team can design solely for graphics.


Nvidia's engineering team is forced by upper management to design for graphics and compute.



One is focussed, the other is compromised. The compromise is never going to win. Hence why I can only see the gap growing, not shrinking*.



*At least until (if) Nvidia management have a major rethink about their strategic direction.
 


Why bother?


Multi-core = multiple physics threads while utilising the CPU more effectively.


We are forever complaining that coders should make software more SMP/SMT friendly. Physics is an ideal opportunity, yet people advocate the use of PhysX instead, which of course takes performance away from the bottleneck... the GPU.


Sensible? Not in the slightest.
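To illustrate the argument, here's a minimal sketch of the kind of data-parallel, multi-core physics step being advocated; all the names here (step_chunk, step_world) are made up for illustration, and this is not any engine's actual API:

```python
# Minimal sketch: splitting a particle physics update across CPU cores.
# Function names are hypothetical; this is not any engine's real interface.
from concurrent.futures import ProcessPoolExecutor

GRAVITY, DT = -9.81, 1.0 / 60.0  # m/s^2, 60 Hz timestep

def step_chunk(chunk):
    # Integrate one chunk of (position, velocity) pairs independently.
    out = []
    for (x, y), (vx, vy) in chunk:
        vy += GRAVITY * DT
        out.append(((x + vx * DT, y + vy * DT), (vx, vy)))
    return out

def step_world(particles, workers=4):
    # Partition the particle list and farm the chunks out to worker processes.
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return [p for chunk in pool.map(step_chunk, chunks) for p in chunk]

if __name__ == "__main__":
    world = [((0.0, 10.0), (1.0, 0.0))] * 10_000
    world = step_world(world)
```

Each chunk is independent, so the work spreads across however many cores are available, which is exactly the SMP-friendly pattern being asked for here.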
 
ATI will win out in PC graphics cards in 2010, but I suspect nVidia expected this to happen. I don't think Fermi is being developed for the PC gaming market. nVidia needs to create a complete CPU/GPU solution and this is their first attempt. They're willing to take a loss on Fermi at first if it gives them a shot at future devices like the next-generation game consoles, phones, etc.

nVidia can afford to lose money in 2010 and maybe 2011. AMD much less so.

The real question is not who will do better in 2010. It's who will be better positioned going into 2012.
 
You think nvidia's shareholders are going to support that sort of behaviour?

It's a moot point. They've already developed Fermi (almost). nVidia had no choice, what with AMD/ATI offering a complete solution and Intel building Larrabee (although Larrabee's future is now in doubt).

Think about it. If you were Nintendo/Sony/MS, would you prefer one vendor build you a complete solution for the Xbox 720 or PS4, or would you want IBM to build the CPU and nVidia the GPU and hope they work well together?

I'm not saying everything has gone exactly according to plan for nVidia. I'm just saying Fermi's success won't entirely be judged by how many GT300 series graphics cards are sold in 2010. The success or failure of Fermi won't be known until a couple of years from now. nVidia still has a long haul ahead of them. Convincing vendors to use Fermi as a GPU is going to be a lot tougher than making a good graphics card out of it.
 
I want software physics so my i7 has something to do. I have ~60 GFLOPS of SSE power sitting idle, and that's on my lowly stock 920. Imagine the new 6+ core CPUs coming out in the next few years.
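As a rough sanity check on that figure, here's the peak-throughput arithmetic under the usual assumptions (4 cores at 2.66 GHz, each retiring one 4-wide SSE add plus one 4-wide SSE multiply per cycle); sustained throughput lands well below these peaks, which is roughly where a ~60 GFLOPS working figure sits:

```python
# Back-of-envelope peak-FLOPS estimate for a stock Core i7 920.
# Assumptions: 4 cores @ 2.66 GHz, one 4-wide SSE add + one 4-wide SSE mul
# issued per cycle (8 single-precision FLOPs per core per cycle).
cores, ghz, flops_per_cycle = 4, 2.66, 8
peak_sp = cores * ghz * flops_per_cycle  # single-precision GFLOPS
print(f"~{peak_sp:.0f} GFLOPS SP peak, ~{peak_sp / 2:.0f} GFLOPS DP peak")
```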
 

GF100 (Fermi) is, and always has been, the Tesla workstation card, and I'm pretty sure I read somewhere that the desktop cards will be based on Fermi. As such, all the news and speculation on the upcoming release of GF100 cards may have very little bearing on the cards that the general public will eventually get to purchase.
 
Nvidia has always been like 2 years ahead of ATI. Nvidia is far better. And will always be.
I think I know what he is trying to say: Nvidia started designing Fermi 2 years before ATI designed the 5xxx, so in essence and in theory Fermi is ahead of ATI by 2 years, just being released 3 years late.
 


Fermi has gone out of fashion, just like huge GPU cores.

Also, how do you know when they started designing Fermi & RV870? If this is true, it's a bigger win for ATI: they beat a chip that had 2 extra years to respin & redesign.
 


I think he was trying to be somewhat sarcastic by coming up with a possible way that the fanboy could have a valid point.
 


And why in the world did you honestly buy an i7?

e-peen perhaps?
 
You know, it's always back and forth with who's better, because they have different release cycles. They usually don't release cards at the same time as each other. I would just go with whatever card is the best at the time in your price range.

ATI will come out with something that beats everything, then 3 to 4 months later Nvidia will come out with something that beats everything, and then the cycle repeats. And I don't believe in waiting for the "next" series to come out because of improvements, because you will always be waiting. By the time you wait for Nvidia's next series, there will already be word out of what ATI plans on releasing 3 to 4 months after that. And I hope that ATI and Nvidia will always continue this trend, competing with each other, because it usually keeps prices competitive. If for whatever reason either company went under and stopped making cards, that would be bad for the end consumer: you would have little choice when buying a GPU.

 


Actually, when they do, they are usually relatively close (R8500/GF3, X800/GF6800, GTX285/HD4K), and when they don't, usually the first to launch is the far better card (R9700/FX5800, GF8800/HD2900 [the GF7800 having a brief advantage over the X1800, but then swapping in the X1900/GF7900 releases, which also swapped launch order]). So how does that bode for the Fermi GPU based on past release patterns? Or does the past have little influence on the present? I would say it's not the different release cycles that matter, especially since nV had every intention of releasing right near the HD5K, so much as whether they missed their release target due to issues, like the FX, X1800, and HD2900, all of which fell short of expectations.

I would just go with whatever card is the best at the time in your price range.

Which should really always be the way people buy, see what is available in your price range and then compare features you need/value and what works best with the apps you want to run.

...And I hope that ATI and Nvidia will always continue this trend, competing with each other, because it usually keeps prices competitive.

Yep, although really parity means higher prices. When one company is a little behind on performance or features, price usually becomes the weaker product's method of competition; with parity you usually get equally healthy prices.
 
Nvidia has always been like 2 years ahead of ATI. Nvidia is far better. And will always be.


As much as I hate to agree with Nvidia, he is right. ATi just now released a card that can keep up with and surpass the GTX295, guys... the GTX295 has been out forever. Sure, ATi has better price/performance, but Nvidia will always beat them as far as raw performance goes. It's like Intel vs. AMD... It is stupid to buy Nvidia cards because of the $$$ count, but some people have that money. If I had to choose between Fermi and the 5870, give me the Fermi. You seen the transistor count on that biotch? It is going to kick the crap out of any video card we have seen. Nvidia is fighting Intel at this point; that means more sophisticated GPU technology, not just gaming and Blu-ray rendering crap.
 


The GTX295 wasn't even out a year before it got dethroned (it hasn't even been out a year YET), far from 'forever' which is how long it took to dethrone the R9700 and X800 and GF8800.

Sure, ATi has better price/performance but Nvidia will always beat them as far as raw performance goes.

Right now they don't beat them, so that kinda puts an end to the idea of 'always will', and they definitely haven't 'always' in the past either, as anyone familiar with the history of the cards knows. Heck, even in the other thread's list, based on the way you two calculate your winners, the GTX280 never led, but I doubt either of you would figure that out when making your myopic statements.

If I had to choose between Fermi and the 5870, give me the Fermi. You seen the transistor count on that biotch?

So you would take Fermi A1 silicon with all its massive # of transistors? :heink: Yeah, it performs so well it can't be used in public. :pfff:

Transistor count alone means nothing, or else the HD2900 would've outperformed the GF8800; it's how you use them that matters. Just like another size adage. 😗

And we still don't know if the graphics version of Fermi will get the full complement of active transistors in the design, let alone the die.

It is going to kick the crap out of any video card we have seen. Nvidia is fighting Intel at this point; that means more sophisticated GPU technology, not just gaming and Blu-ray rendering crap.

More sophisticated compute technology; the GPU technology is the stuff involved in the gaming and Blu-ray rendering 'crap', as you put it, and ATi is ahead of nV there even in the proposed Fermi design. It's also the main reason people buy graphics cards: not to model atomic interactions or run economic models in MATLAB.

Most importantly, whatever it is that they want to do with them, they currently can't get it to do it well enough to launch a product.
 

+1 :hello:



This one cracks me up almost just as much. :lol:
First off, the GTX 295 was Nvidia's counter to the 4870 X2, which came out first, so 'being out forever'... wrong.
ATI's refresh kicked the crap outta Nvidia's 295, and Nvidia's answer is... more delays... wonder why...

Higher transistor count = more fab failures. Good luck seeing Fermi; sure, the few they are able to produce will kick the crap outta whoever, but will it ever make it to market?
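To put a number on that intuition: a simple Poisson yield model, Y = exp(-A * D0), shows why a bigger die loses disproportionately more chips to random defects. The defect density below is a made-up illustration, not TSMC's actual 40nm figure, and the die areas are rough ballpark sizes:

```python
# Illustrative only: Poisson yield model Y = exp(-A * D0).
# D0 is a hypothetical defect density, not TSMC's real 40 nm number.
from math import exp

D0 = 0.004  # assumed defects per mm^2
for name, area_mm2 in [("~330 mm^2 (Cypress-class die)", 330),
                       ("~530 mm^2 (Fermi-class die)", 530)]:
    print(f"{name}: ~{exp(-area_mm2 * D0):.0%} of dies defect-free")
```

Roughly 27% versus 12% with those numbers: same wafer, same defect rate, but the bigger die eats failures much faster.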
 
I love both ATI and Nvidia GPUs; I am no Nvidia fanboy or ATI fanboy, but watch this video, and I mean the whole video, so you can see the final fps score. The 4870X2 was made to surpass the GTX295, but the new 5870 can't even keep up with the GTX295. We will never see the true power of the 5870, or any ATI card for that matter. Why? Because the ATI driver development team has, and always has had, their head up their butt.
The new GT300 cards will wipe the floor with the new ATI cards. Oh, and look at the 3DMark scores at the end of the video; everything is higher.

http://www.youtube.com/watch?v=zlL4aombWyo
 