GeForce GTX 295 Performance: Previewed

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.

daveloft

Distinguished
Jun 23, 2008
36
0
18,530
[citation][nom]scook9[/nom]SO..........i see nvidia still has not figured out how to use GDDR5?!?! If they could get that worked into the GTX295, i dont think ANYONE would be able to top that for a while.Still dissappointed that nvidia has said nothing about GDDR5...Toms already did an article a while ago on how Hynix is now making 1gb DDR5 low latency chips somewhat cheap..whats the hold up Nvidia[/citation]

Nvidia doesn't need to use GDDR5; their design pairs a much wider bus (448-bit and 512-bit) with slower memory. ATI went with a much slimmer 256-bit bus and needed the extra speed GDDR5 offered.

I really don't think the current chips would benefit all that much from the extra bandwidth without the processing power to go with it. GDDR5 may be coming down in price, but so is GDDR3.
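The bus-width trade-off above is simple arithmetic: peak bandwidth = (bus width in bytes) × (effective transfer rate). A rough sketch, using ballpark effective data rates for these cards (approximate figures, not exact board specs):

```python
def bandwidth_gbs(bus_width_bits, effective_rate_mts):
    """Peak memory bandwidth in GB/s:
    (bus width in bytes) * (effective transfer rate in GT/s)."""
    return bus_width_bits / 8 * effective_rate_mts / 1000

# Approximate effective data rates; check exact clocks per board model.
cards = {
    "GTX 260 (448-bit GDDR3, ~2000 MT/s)": bandwidth_gbs(448, 2000),
    "GTX 280 (512-bit GDDR3, ~2214 MT/s)": bandwidth_gbs(512, 2214),
    "HD 4870 (256-bit GDDR5, ~3600 MT/s)": bandwidth_gbs(256, 3600),
}
for name, gbs in cards.items():
    print(f"{name}: {gbs:.1f} GB/s")
```

The wide-bus GDDR3 designs land in the same ballpark as (or above) the 256-bit GDDR5 card, which is the point being made: the bus width compensates for the slower memory.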

I'm sure they will 'figure' it out when they need the extra bandwidth and the price is where they want it.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
[citation][nom]Swixi[/nom]That is only in Far Cry 2 for some strange reason. Check the other game benchmarks on that site; don't just link one game, or you create an unfavorable bias equivalent to that of an ATI fanboy (most of the people on here).[/citation]

Agree with your reasoning on using only a subset of examples... but your conclusion that most people on here are ATI fanboys? I don't see any more ATI love than Nvidia love. There are irrelevant comments about bad ATI drivers and lame comments about ATI still winning because Nvidia has to use two PCBs, but you find that on any forum.

If anything, there is a nice balance, and MOST people here are fans of getting the best bang for the buck at the moment, regardless of brand... so I don't know where you come off with a comment like that. Curious.
 

sparky2010

Distinguished
Sep 14, 2008
74
0
18,630
Well, putting that driver thing aside, I just remembered the sideport that ATI had disabled from the beginning, ever since they released the 4870 X2. I never found out whether they enabled it in a driver update (again, drivers). If they did, so be it; but if they didn't, I'm sure the X2 would still have some extra bite in it somewhere, no? (Don't hit me if I'm wrong; I said I don't know whether it's been enabled or not.)
 

Pei-chen

Distinguished
Jul 3, 2007
1,298
9
19,285
Yorkfield > Kentsfield > Agena @ same clock speed.

A castrated Agena couldn't possibly be faster than half a Yorkfield. I don't know how the website quoted ran its tests, but if the result is Kuma >= Wolfdale, they are clearly off target.
 

kelfen

Distinguished
Apr 27, 2008
690
0
18,990
[citation][nom]sparky2010[/nom]Well, putting that driver thing aside, i just remembered of the sideport that ATI had disabled from the beginning.. since they released the 4870 x2.. i never found out if they ever did enable it in a driver update (again drivers), if they did, so be it.. but if they didn't.. well, i'm sure that the x2 would still have some extra bite in it somewhere no? (although don't hit me if i'm wrong.. i said i don't know if it's been enabled or not)[/citation]
That could put them on par if they do it right, and it would also benefit other X2 chips.
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
Thanks for the preview. I'm a bit disappointed right now: another 4870/GTX 260 shoot-out... Just as I thought I would go insane from the 4870/GTX 260 threads, now I'll go insane with them and the new GTX 295/4870 X2 threads!
 

randomizer

Champion
Moderator
[citation][nom]The_Blood_Raven[/nom]Thanks for the preview. I'm a bit disappointed right now, another 4870/GTX 260 shoot off... Just as I thought I would go insane from the 4870/GTX 260 threads, now I'll go insane with them and the new GTX295/4870 X2 threads![/citation]
With no clear winner, a price war could happen. But NVIDIA would be the one hurting from that.
 

Lans

Distinguished
Oct 22, 2007
46
0
18,530
Nice preview! :)

Too bad we'll have to wait to see whether the power numbers hold true for all or most games/apps, or whether that's really just Far Cry 2 (and under certain settings). Just wondering why the GTX 295 has a higher TDP than a 4870 X2...

The $500 target seems possible given the GTX 260 is at about $230 now, and the GTX 295 looks closer to a GTX 260 except for the 240x2 SPs. Still, I can't believe Nvidia is willing to do that, because it'll probably force their own prices down as well (the GTX 280 is at ~$380). And Nvidia's margins must be pretty thin already (big chips for similar prices)... I suppose they could be like AMD/ATI and hungry for market share, but I'll believe it when I see it. It would definitely be a very sweet deal at that price with the numbers seen thus far.
 

chmod000

Distinguished
May 20, 2008
16
0
18,510
[citation][nom]Cleeve[/nom]Having said that, the 295 looks like a killer deal at $500.[/citation]

I don't care if it cooks your breakfast for you, $500 for a video card is NOT a "killer deal". The only price point offering any kind of value for money right now is in the $100-200 range.

With the way the economy is going, both AMD/ATI and Nvidia would be better off fighting for future brand loyalty by optimizing drivers for these value segments (and I don't just mean for games; I'm talking stability, video/texture quality, pulldown and deinterlacing detection/quality, and newer GPU tasks like video encoding and physics handling) than beating their virtual chests about 15-20 FPS gains over their competitor when the average FPS is already above 60.

For $500 you can build a brand-new PC that will do pretty d*mn well in the resolution range most people play at (1024-1650), if you spend the money where it really counts and have a few parts left over from a previous build.

If you're going to claim that $500 for a video card is a killer deal, then at least preface it with something like "If you've got the money to waste" because, quite frankly, I doubt the majority of your readers do. I'd even go so far as to say that there are those of us that, even if we DID have the money to waste, STILL wouldn't do it.

Do you honestly view $500 as a deal, Cleeve? Considering this card offers NOTHING new to the G90 line, it comes off as nothing but a fleecing of the customer in an effort to appease shareholders and VPs who don't like seeing their vested interests one-upped by a company that is supposed to be on its deathbed.
 

cleeve

Illustrious
[citation][nom]chmod000[/nom]I don't care if it cooks your breakfast for you, $500 for a video card is NOT a "killer deal"...
...Considering this card offers NOTHING new to the G90 line, it comes off as nothing but a fleecing of the customer in an effort to appease shareholders and VP's who don't like seeing their vested interests one-upped by a company who is supposed to be on their death bed.[/citation]

Whoa! Chill out dude! Is there a reason you're taking my opinion about a videocard as a personal insult?

As far as the GTX 295 goes, I would disagree that it's 'nothing new'. This card offers two GTX 280s (the most powerful single GPUs available) for less than the price of one and a half GTX 280s. That's something new to me.

Clearly the phrase 'killer deal' is somewhat subjective. But, yes, I do find the GTX 295 a 'killer deal'. I think the performance bar has been sufficiently raised over the 4870 X2 at the same price point, and I reserve the right to call that a 'killer deal' at my capricious whim. We both have the right to our own opinions, actually. Free speech and whatnot, yay!

I don't care if it's based on old tech. That doesn't affect the bottom line to me, which is price/performance. Hell, if they made a card cobbled together with twenty-four 6600 GT GPUs glued with chewing gum, charged $50 for it, and it could beat out a GTX 280... well, it wouldn't really matter to me as long as it performs well for the price. I would also call that a 'killer deal'. I would likely buy one.
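That price/performance bottom line can be made concrete as a frames-per-dollar ratio. A minimal sketch; the cards and all figures below are hypothetical placeholders, not benchmark results:

```python
# Price/performance expressed as average FPS per dollar.
# All numbers here are illustrative placeholders, not measured results.
def fps_per_dollar(avg_fps, price_usd):
    return avg_fps / price_usd

# A hypothetical cheap card with modest FPS can still beat a
# hypothetical flagship on this metric:
budget = fps_per_dollar(40.0, 150.0)    # imaginary $150 card at 40 FPS
flagship = fps_per_dollar(90.0, 500.0)  # imaginary $500 card at 90 FPS
print(f"budget: {budget:.3f} FPS/$, flagship: {flagship:.3f} FPS/$")
```

This is exactly why "killer deal" is subjective: the metric rewards cheap cards unless the expensive one scales its performance proportionally.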

If it turns out the 295 is a publicity stunt and the card is nigh-impossible to find at retail after release, I'll agree with you wholeheartedly. But it's a little early to dismiss what appears to be a successful product based on... well, what ARE you basing it on?

What problem do you have with a cheaper, more powerful card hitting the market, exactly? That it costs $500, and you don't personally think any card should cost that much? Capitalism dictates otherwise, my friend: if nobody would pay that much, they wouldn't offer the product at that price. Your beef is with the buying public who feel it's worth the price to buy the best tech, not with ATI or Nvidia. If people didn't buy $500 cards, they would stop selling them. Indeed, $200 cards would also perform much worse, because without the R&D driving the upper-echelon cards, we wouldn't have Radeon 4850s and 9800 GTX+ cards under $200, would we?

As much as you hate the high price points, they make the low price point performance you seem to covet a reality. Without them, it wouldn't exist, not in its current form. Think about it, bro.

In any case, I wasn't trying to upset you, just voicing my take on the card. Peace out!
 

pocketdrummer

Distinguished
Dec 1, 2007
1,090
37
19,310
So, what does this mean for the mainstream cards? What do you expect the prices to do when it comes out?

I'm thinking of getting a GTX 260 for Christmas, but if the price is about to plummet, I should wait.

Anyone care to speculate?
 

thepinkpanther

Distinguished
Nov 24, 2004
289
0
18,780
Too little, too late. Nvidia pulls out their dual-card solution too late, and look, now we have a new series! If it had come a month after the 4870 X2, perhaps it would be worth spending $500 on (I never think you should buy a card at that price anyway). CrossFire 4850 FTW!
 

thepinkpanther

Distinguished
Nov 24, 2004
289
0
18,780
On a lighter note, I thought Nvidia renamed their cards so the consumer wouldn't be confused by the numbers... well, 'GTX 260' and 'GTX 260 Core 216' sound confusing.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]pocketdrummer[/nom]So, what does this mean for the mainstream cards? What do you expect the prices to do when it comes out?I'm thinking of getting a GTX260 for christmas, but if the price is about to plummet, I should wait.Anyone care to speculate?[/citation]

If that's the card you're going for, my only advice would be to make sure you get the Core 216 version, as the 192-core version is simply being phased out.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]zodiacfml[/nom]too expensive for a poor guy like me. i'm actually more impressed with the combined core i7 and 4870.[/citation]

Also a good combo.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
[citation][nom]sparky2010[/nom]Well, putting that driver thing aside, i just remembered of the sideport that ATI had disabled from the beginning.. since they released the 4870 x2.. i never found out if they ever did enable it in a driver update (again drivers), if they did, so be it.. but if they didn't.. well, i'm sure that the x2 would still have some extra bite in it somewhere no? (although don't hit me if i'm wrong.. i said i don't know if it's been enabled or not)[/citation]

Sideport is on the mobo, not the video card; it's meant to be "dedicated" video memory for onboard chips and budget setups. Tom's even did a review of one mobo with it here a bit ago... Regardless, on-card memory and its faster bus speed will always win on an enthusiast system like we're talking about here... that sideport (in its current implementation) will gain nothing for an X2, AFAIK.
 

CptTripps

Distinguished
Oct 25, 2006
361
0
18,780
$500 is not a great deal, but if you can afford it, right on. I'm looking to build a Core i7 machine, and I think this would be a perfect card for it. Then in a few months, maybe throw in another one. Hell yeah.
 

madogre

Distinguished
Dec 7, 2008
26
0
18,530
The low FPS in Crysis is due to the frame buffer being smaller than ATI's.
I'm sure some of it is drivers; just wait till NV has had six months of driver updates, then match them against ATI with the drivers they use today.
Yeah, I'm sure it will be "but, but, but"; no buts. If you want to be fair, you have to use drivers from the same time frame.
 