GeForce GTX 570 Review: Hitting $349 With Nvidia's GF110

Status
Not open for further replies.

thearm

Distinguished
Dec 18, 2008
276
0
18,780
Grrrr... Every time I see these benchmarks, I'm hoping Nvidia has taken the lead. They'll come back. It's alllll a cycle.
 

xurwin

Distinguished
Mar 7, 2010
459
0
18,810
At $350, beating the 6850s in CrossFire? I could say this is a pretty good deal, but why no 6870s in CrossFire? Still, even with a narrow margin, and especially if you need CUDA, this would be a pretty sweet deal. I'd also wait for the 6900s, but for now... we have a winner?
 

sstym

Distinguished
Feb 16, 2009
118
0
18,680
[citation][nom]thearm[/nom]Grrrr... Every time I see these benchmarks, I'm hoping Nvidia has taken the lead. They'll come back. It's alllll a cycle.[/citation]

There is no need to root for either one. What you really want is a healthy and competitive Nvidia to drive prices down. With Intel shutting them out of the chipset market and AMD beating them on their own turf with the 5xxx cards, the future looked grim for Nvidia.
It looks like they've still got it, and that's what counts for consumers. Let's leave fanboyism to 12-year-old console owners.
 

nevertell

Distinguished
Oct 18, 2009
335
0
18,780
It's disappointing to see the freaky power/temperature numbers the card puts up when driving two different displays. I was planning on using a display setup similar to the one in the test; now I'm in doubt.
 

reggieray

Distinguished
Nov 4, 2010
454
0
18,780
I always wonder why they use the overpriced Ultimate edition of Windows. I understand the 64-bit part because of memory support; that's what I bought too, but I purchased the OEM Home Premium and saved some cash. For games, Ultimate adds no extra value.
Or am I missing something?
 

theholylancer

Distinguished
Jun 10, 2005
1,953
0
19,810
hmmm more sexual innuendo today than usual, new GF there chris? :D

EDIT:

Love this gem:
Before we shift away from HAWX 2 and onto another bit of laboratory drama, let me just say that Ubisoft’s mechanism for playing this game is perhaps the most invasive I’ve ever seen. If you’re going to require your customers to log in to a service every time they play a game, at least make that service somewhat responsive. Waiting a minute to authenticate over a 24 Mb/s connection is ridiculous, as is waiting another 45 seconds once the game shuts down for a sync. Ubi’s own version of Steam, this is not.

When a reviewer, not of your game but of hardware that merely uses your game, comments on how bad the DRM is, you know it's time to stop doing that, or expect "get your game elsewhere" warnings.
 

nevertell

Distinguished
Oct 18, 2009
335
0
18,780
I was planning on doing so, but I didn't get enough money from the work I was doing, so I'll stick with just a new monitor. I will definitely get a new card during the next year, but not for now :( And by then, there might be new great cards out there.
 

phantomtrooper

Distinguished
Apr 17, 2008
194
0
18,690
Why is the dual-GPU 5970 in this comparison? Why is the 6850 in CrossFire? You have two ATI dual-GPU solutions, but no Nvidia dual-GPU solutions. Biased much?
 

anacandor

Distinguished
Mar 29, 2008
123
0
18,690
While the 5xx series is looking decent so far, it seems to me (pure speculation here) that Nvidia held back with this series and is possibly putting more resources into Kepler. I feel this way because they aren't trying to kill AMD for market share; instead they put up a perfectly reasonable product that doesn't vastly excel beyond last gen, but provides enough performance to justify a new product. That said, I'm looking forward to their 2011 lineup.

Also, it would have been interesting to see Metro 2033 tested with max instead of medium settings. All the cards are able to play medium at all resolutions with no AA... push them to their limits? :)

Thoroughly enjoyable review though. Thanks, Chris!
 

gxpbecker

Distinguished
Apr 23, 2009
50
0
18,630
I LOVE seeing Nvidia and AMD trading blows back and forth. Keeps prices in check lol and gives more options for buyers!!!
 

tronika

Distinguished
Oct 19, 2010
26
0
18,530
[citation][nom]ReggieRay[/nom]I always wonder why they use the overpriced Ultimate edition of Windows? I understand the 64 bit because of memory, that is what I bought but purchased the OEM home premium and saved some cash. For games the Ultimate does no extra value to them.Or am I missing something?[/citation]
Noticed that too. I really can't think of any reason other than the language support for the Tom's engineers. 99% of the gamer market would be better off with Home Premium 64-bit. The other 1% who actually run and maintain a domain in their house should get Professional or the bloated "Ultimate". I mean, who really uses BitLocker on their gaming machine anyway? Great article though! I jumped on the 580 and haven't looked back. I used to run a 5850, but now that I've seen Fermi in all of its glory, I'm really interested in Nvidia's future "vision".
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]theholylancer[/nom]hmmm more sexual innuendo today than usual, new GF there chris? EDIT:Love this gem:When a reviewer of not your game, but of some hardware using your game comments on how bad it is for the DRM, you know it's time to not do that, or get your game else where warning.[/citation]

Wait, innuendo? Where? :)
 

kyleguy298

Distinguished
Oct 16, 2010
123
0
18,680
OMG, this is exactly what I was thinking a while ago:

"I hope the GTX 570 will be a better version of an older higher-end card."

Then I actually saw this post, and I'm like, am I a fortune teller?
 

jgutz2006

Distinguished
Jul 7, 2009
473
0
18,810
Keep the competition rocking so I can continue my trend of upgrading every second generation and having each upgrade be substantial. I've been 100% Nvidia since my GeForce 256 and heavily considered making the switch on my last two upgrades, but I always remained loyal only because I had SLI boards. Things changed this time around, though: I switched over to a Sapphire Eyefinity 6 5870 2GB and am running four 24" displays wonderfully (without needing an extra slave card!).
 