GT300 wins over 5870

It would be really interesting to know what they (Nvidia) have got. They've been sending so many mixed signals about their cards and plans that I really don't know what to expect. I'd be shocked if they really flopped as miserably as some are predicting, but I suppose we may very well be heading back into an era of ATI domination, which would be nice, since these things should oscillate. I guess we'll just have to wait and see.
 
Smoggy, hardware PhysX has supposedly been 'the next big thing' in games for almost half a decade now, and it still isn't big yet.

I don't know how this is going to help them make up for all the other shortcomings, any more than ATi's entertaining but far from compelling SmartShader effects made much of an impact on their sales.

Relying heavily on PhysX to improve their fortunes may lead them to an outcome similar to Ageia's if they aren't careful and lose focus on the MAIN raison d'être of their company: graphics. If they can't make the graphics compelling, how are these sideshows supposed to matter?

Be it CUDA/Tesla, the PhysX PPU, or the Acceleware video transcoder, none of those items is enough to move anywhere near the same number of chips as their graphics potential, and none of them is enough to entice gamers and non-gamers to pay for GPUs that don't do graphics as well as the competition. They're great as tie-breakers, but you need to start at parity, not in a deep hole.
 
Yeah, but I've seen more bumf on that and the mobile PDA HD thing at the moment, and heck all about PC graphics other than the downplaying of DX11, which could be just one of the usual smoke-and-no-trousers things, or something worse.
 
I'm pretty sure Tegra is the only thing keeping them alive right now.

I just can't understand why they would abandon what made them in the first place. Nvidia have rebadged a lot more than they have created recently, and that doesn't make any sense. What caused them to change everything, or did they really believe that AMD buying out ATI was the end of competition in graphics? They celebrated AMD's acquisition as if it was a victory, but three years later they look like they have been completely outmatched.

How did that happen? In any other company, heads would be rolling.
 
Nvidia have rebadged a lot more than they have created recently and that doesn't make any sense.

I had a sense this was coming when the GTX280/260 was released. The GTX260 192SP was faster than the 9800GTX, but only by 15-20%. If they had disabled another chunk of SPs, the odds are that it would have been the same speed, or possibly slightly slower. The problem is that the die of the GTX chips is so much larger than the 9800GTX's. It would be cheaper for Nvidia to make the 9800GTX than to make a GTX250 168SP chip.

The same will continue with the new series if the rumors are true. If the GT300 is as large as it's rumored to be, even making the "giant" GT200 would be cheaper. Something I've been wondering about is what Nvidia has been doing with the defective cores. AMD cut parts and put them out as other chips, a la the 4830. Why did we never see ANY "GTX250s"? The chips must have gone somewhere.
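To put rough numbers on why a cut-down big die still costs more than a whole small one, here's a quick back-of-the-envelope Python sketch. The die areas (roughly 324 mm² for the 9800GTX's G92 and roughly 576 mm² for the 65nm GT200) are ballpark figures as commonly reported, and the defect density is a pure guess, so treat this as an illustration of the argument, not real fab economics.

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard gross-dies-per-wafer approximation: wafer area over die
    # area, minus an edge-loss term proportional to the circumference.
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, defects_per_cm2=0.3):
    # Simple Poisson yield model; the defect density here is a guess.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area_mm2 in [("G92 (9800GTX), ~324 mm^2", 324),
                       ("GT200 (GTX260/280), ~576 mm^2", 576)]:
    gross = dies_per_wafer(area_mm2)
    good = gross * yield_fraction(area_mm2)
    print(f"{name}: ~{gross} gross dies/wafer, ~{good:.0f} fully good")

Under these made-up assumptions the small die yields several times more good chips per wafer, and every salvaged "GTX250" would still burn a full GT200-sized slice of silicon, which would explain why selling G92s made more sense than shipping a hypothetical GTX250.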
 


Crysis 2 is on the way with the new engine, so that will be what pushes the 5870 (X2) to the limit. Also, for many people it doesn't matter if it is more power than they will ever use; they don't care if there is NO GAME that can stand up to the plain 5870, they will still run quad-CrossFire 5870s.

I mean, come on, who needs to 4-way SLI 285s? But they do it, because this is a hobby; it's about more than playing games, it's about building computers. It's the same reason you have a GT Mustang that can do 150 mph even if you are never going to get it up that fast.

Also, MANY people do Folding@home or simply love to get the best benchmark scores, and they will get as much power as possible to do it.
 


Crysis 2's engine is actually a downgrade from the original Crysis engine. YouTube vids of Crysis 2 so far show it is actually a graphical downgrade as well... it makes sense, since they're trying to make it work on the 360 and PS3... =(
 


I wouldn't call it a downgrade per se... it is certainly more scalable, but it should be able to push graphics on the PC to levels a bit higher than the original Crysis, given what Crytek has said about it. But it is too bad that the game that is 'the' PC game of today will be dumbed down to the console level... imagine what they could do with a purely DX11 engine... /cry
 


Sorry, meant the REAL one, which I count as the full CryEngine 3 for PC. I am not really going to count the console version as real, as it simply seems to go against the point of Crysis (at least the point all the fans made of it), which is to be able to play it at the best the engine can offer.


When Crysis 2 hits PCs, or whatever spin-off Crysis name they want to call it (Warhead 2 or whatever), I am guessing we will really see what they can offer, and likely with DX11.
 
Don't forget, guys, CryEngine 3 was demoed at the Eyefinity event on theatre-wide surround-view gaming, so it's going to be playable there too (at what setting I don't know, but they showed the DX11 path [dunno: high, very high, ultra, megaful, whatever]). It seems a little chunky in this demo, but the single-screen experience was liquid-smooth, and you wonder if there was any CPU lag, considering it looks like texture swaps or some other loading are slowing it down briefly:
http://www.youtube.com/watch?v=04dMyeMKHp4

And like Darkvine said, for many it's a hobby just building an awesome rig. I spend about $5,000/year skiing, and about half that much playing hockey, and I still spend about $2,000+ a year on PC stuff even though I get a ton free at work. Some people don't have other hobbies (or their wives won't let them anymore), so this is what they do.
Now while it may be as useful as having a machine drag you up a hill on a cold day just so you can fall back down, it's not about utility, it's about fun, and I'm sure that an Xfire HD5Kx2 rig would provide a lot of fun to many people, even if it was just to watch the power meter spin around like a vinyl record. :)

However, that's not to excuse them should they come here to whine or brag about some aspect of their setup. :kaola:
 
Hey SS, looks like volume shipments to the OEMs, ODMs and system integrators for November;

http://www.fudzilla.com/content/view/15568/1/

So, maybe we should see actual laptops around February or March, a little sooner than I expected, but a little later than I had hoped (I had hoped they were capitalizing on their recent regaining of the mobile crown, so I could capitalize on that with a sweet new laptop for Xmas 🙁).
 
Dude, what did you expect to find in a thread about the G300 & HD5870? Neither has had a quality review or test done yet; so far it's rumour and PR vs. rumour and PR.

The place for your comment is after Sep 23, when we should have some actual facts and data for one part, to start with.
 


While it isn't a solid test, at least there is something on the HD5870, unlike the zip on the G300.