GeForce GTX 295 Performance: Previewed

Status
Not open for further replies.

madogre

Distinguished
Dec 7, 2008
26
0
18,530
[citation][nom]madogre[/nom]The low FPS in Crysis is due to the frame buffer being smaller than ATI's. I'm sure some of it is drivers; just wait till NV has had 6 months of drivers, then match them vs ATI with the drivers they use today. Yeah, I'm sure it will be but but but, no buts: if you want to be fair then you have to use the drivers from the same time frame.[/citation]
 

cleeve

Illustrious
[citation][nom]madogre[/nom]The low FPS in Crysis is due to the frame buffer being smaller than ATI's. I'm sure some of it is drivers; just wait till NV has had 6 months of drivers, then match them vs ATI with the drivers they use today. Yeah, I'm sure it will be but but but, no buts: if you want to be fair then you have to use the drivers from the same time frame.[/citation]

1. The frame buffer is negligibly smaller. It's only a ~10% difference, not enough to have a real-world impact. Crysis targets 512MB for its texture memory, so this isn't the issue.

2. Reasoning that, to be fair, we should wait 6 months for drivers is ludicrous. Are you advocating that in 6 months we use Nvidia's newest drivers against the current Catalyst 8.12s to be 'fair'? Where does it end? Is it unfair to compare a Ford Mustang to a Dodge Challenger because one was released a number of years before the other?

Cards are compared price/performance when they are on sale, when they are available to the public. That's as fair as it gets, because that's when people are paying their hard-earned dollars for them, and using them to play games in the real world - not in some theoretical best-case scenario.

This is a preview, but when the 295 is available at retail, it's a real-world comparison... you don't have to wait 6 months for drivers to 'catch up' to do it, unless Nvidia is going to wait 6 months to sell them...

 
Can someone explain to me why we're talking about power consumption for a yet-to-be-authorized, yet-to-be-launched product? Other than, of course, to help nVidia's Xmas PR plans?

I understand the idea of not comparing Rebate prices, but I was under the impression that there was an embargo on paper launch reviews too. Guess that only applies when Christmas sales figures are on the line.
Don, what happens if the GTX 295 is nowhere to be seen for $499? You can't go back and undo the PR on this one.

I'd like a review of the yet-to-be-launched S3 DX11 product, along with Larrabee and the R800 series with fantastical predictions too.

Considering the delay of reviews of currently shipping products like the HD4870, I'm disappointed that this is one of the few reviews to arrive in a timely fashion, and only because a review coming after Xmas, like the product itself, wouldn't benefit the strategy. And they have the whole 'Beta this, not finalized product that' statement to cover any variation that doesn't put a positive light on things.

First there was the rush to review the effect of drivers (just 2 short releases after the new hardware) to proclaim drivers did nothing, just before the big boost drivers arrived; now we get previews not of the technology, but of PR marks. I doubt I would've seen such a review under Lars or Darren. I doubt they would've needed this in their '(P)reviews'.

"We’ve presented the results from six games. Five of them were mandated by Nvidia as a sample of the most-anticipated titles for the 2008 holiday season. Four of those five are part of Nvidia’s The Way It’s Meant to Be Played program."

The excuses to follow that section would make me laugh if it weren't for cracked ribs from skiing. I especially loved the part about TWIMTBP being of no consequence. Thanks Chris, did you right that section yourself or was it mandated too? Seriously, if it was so benign you wouldn't have felt the need to explain it, and there also wouldn't be the suspicions out there about the actions in games like FartCry2, Assassin's Creed and Oblivion (only HDR for Xbox, eh?!?). C'mon, puff piece at best, bought and Mandated at worst.

This (p)review seems to have been delivered with a bow on it from Santa, just in time for Xmas sales.
 
[citation][nom]sojrner[/nom]sideport is on the mobo, not the video card and is meant to be "dedicated" video memory for onboard chips and budget setups. Toms even did a review on one mobo with it here a bit ago... regardless, on-card memory and its ensuing faster bus speed will always win on an enthusiast system like we're talking here... that sideport (in its current implementation) will gain nothing for an X2 afaik.[/citation]

Actually he's right, sideport was meant for on-bus communication in the X2 as well (it's also used for mobo communication for IGPs), but it was an OPTIONAL feature for X2s, and last time I heard it had not been utilized/enabled by any of the AIBs.

Anand pointed this out in his review, as previously mentioned by Hexus, B3D, etc., but it's nice and clean here:
http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
[citation][nom]TheGreatGrapeApe[/nom]Actually he's right, sideport was meant for on bus communication in the X2 as well (it's also used for Mobo communication for IGPs as well), but it was an OPTIONAL feature for X2s, and last time I heard it had not been utilized/enabled by any of the AIBs. Anand pointed this out in his review, which was previously mentioned by Hexus, B3D, etc as well, but it's nice and clean here;http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3[/citation]

thanks for the heads up ape, I missed that portion of the sideport spec.

and sorry at the other dude I responded to.

game on.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
Sure, I’ll take a stab TheGreat.

This product will not be available for Christmas, so any reference to bows, holiday sales figures, or Santa unfortunately doesn’t really apply.

This preview is not a review. Says so right in the title. Moreover, it’s stated in the story itself that this is engineering sample hardware, which could very well change by the time it hits retail, hence the disclaimer.

I would also like a review of S3’s DX11 part, Larrabee, XGI’s upcoming DX12 component, and 3DLabs’ next workstation board. Please forward all samples to the office and I’ll make sure those are previewed as well! Further, I would have loved to have previewed the 4870 X2 or the 4870 1GB when those were being launched. Unfortunately, AMD hadn’t given me a heads-up to expect anything from FedEx. Post-launch, it happens when it happens.

It was with the anticipation that the tin foil hats would be out in full force that I added a page explaining the testing situation: why I sat back to think about it editorially and decided that, instead of running a bunch of games that had been out for years and were useful as tests (but probably not as much as actual titles to play), I’d be willing to test using a new round of games, adding Crysis as the torture test. Had I felt compromised editorially, the story wouldn’t have gone up. Guess what? Nvidia didn’t want to see the low Crysis 2560x1600 AA/AF score in there, but as part of our data set, all benchmarks were included! I “righted” that section all by myself, thank you! ;-) You’ve interpreted my effort at transparency as a list of excuses, which it isn’t.

Not sure how I can say this any clearer, but here goes: the only way I am able to do my job is to maintain objectivity to the very best of my ability. My assessment of Nvidia’s branded software development efforts is just that—and nobody needs to pay me for my opinion except Tom’s Hardware. The data provided in this piece was run and provided in order to be informational. If you didn’t get anything out of it, I’m sorry, but as a hardware enthusiast myself, I am *always* interested in getting more insight into whatever is just over the horizon. Hopefully you’ll feel differently when it’s AMD under the microscope rather than Intel/Nvidia.

Hope your ribs feel better soon; careful of those trees.
 

tallguy1618

Distinguished
Nov 14, 2007
344
1
18,780
Why does it matter if the minimum FPS is bad on a card due to driver issues a month before it comes out? And why did you keep posting it? Do you think we will listen more if you keep reposting?
 

gxsolace

Distinguished
Mar 28, 2008
160
0
18,680
[citation][nom]tallguy1618[/nom]Why does it matter if the minimum FPS is bad on a card due to driver issues a month before it comes out? And why did you keep posting it? DO you think we will listen more if you keep reposting?[/citation]

What exactly is being reposted? As far as I can tell, this story is brand new. RTFA?
 

romioforjulietta

Distinguished
Sep 5, 2008
12
0
18,510
3 FPS for the GTX 295 at 2560x1600, LOL.
The 4870X2 got 15 FPS, which means 5 times faster at 2560x1600, all max, 4xAA + 8xAF. This is what I call bull power.

Yeah, I think now a lot of NV fanboys would say no, the DRIVERS have yet to get MATURE. OH REALLY? I don't think so.
The 4870X2 is the man.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
I'm looking forward to ATI cards arriving by June. They should do quite nicely as competition for the dual-card monster Nvidia keeps pushing against the best single card. Now, if Nvidia reengineered and put two GPUs on one PCB, then that would make a difference.

Having the top performance crown might trickle down to people who don't do research, but the 3870x2 is the last $450 card I'll ever buy at launch. It's good, but was matched by ATI's own 4850 at sub $200 prices six months later. Nvidia and ATI make their real money at the mainstream and you can't beat the 4830, 4850 and 4870 at their price points.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
[citation][nom]TheGreatGrapeApe[/nom]...I was under the impression that there was an embargo on paper launch reviews too.[/citation]

you know, it seems not that long ago that the site made that stand publicly. Of course, looking at the changes being made at other pieces of the tom's pie it's a wonder we haven't just gone to reviewing how the different hardware runs flash games better. ;)
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
I keep hearing this fanboy argument about ATI making more money per card because of GPU size differences. But then the reds never seem to want to mention the expense of GDDR5 memory.
What's up with that small gpu guys ?
I also find it quite amazing that suddenly, as long as fanboyism is all the rage, the sourpusses about prices flip-flop 180 degrees and literally (although absolutely UNCONSCIOUSLY) declare "ATI is ripping us off by gouging the crap out of us with their cheaper-to-make GPU!"
What's up with that small gpu boys ?
I guess that's what happens when raging red fumes invade the mind.
Smaller is better, and getting ripped off is a big plus.
LOL
Yes, it's SO PATHETIC.
The tiny gpu peeps may now swirl about in red rage then "come up with the idea all on their own" that they aren't getting bent over because "GDDR5 is more expensive than GDDR3". Gosh, and sure as red sunshine, they're going to start saying it.
So you've been flim flammed and ATI stole your money, because they could, the fanboyism made you brag their corporate profits are sky high, and boy that really hurt the green nvidia owners... LOL
Yesiree, the smarter alecky it gets, the dumber it is, but boy oh brother it sure plays ten thousand times over at all the enthusiast sites. The repetition without a thought of one's own pocketbook is utterly amazing.
Congratulations, you're just so #1, we all want to be like you. NOT !
 

baracubra

Distinguished
Jan 24, 2008
312
0
18,790
Lol silicondoc, that was real interesting :p
Anyway, thanks Chris, I really enjoyed this preview, but now I can't wait to hear about Nvidia's 3D project!!!
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]chmod000[/nom]I don't care if it cooks your breakfast for you, $500 for a video card is NOT a "killer deal". The only price point offering any kind of value for money right now is in the $100-200 range.With the way the economy is going both AMD/ATI and nVidia would be better off trying to fight for future brand loyalty by optimizing drivers (and I don't just mean for games, I'm talking stability, video/texture quality, pulldown and deinterlacing detection/quality and newer GPU tasks like video encoding and physics handling)..... [/citation]
Gee, I hate to induce endless red rage, but of course NVidia drivers are supremely stable when compared to ati drivers; you don't have to hack out the CCC, jam in the .NET 2.0, diddle endlessly with CF, curse feverishly about greyed-out CCC tabs, and generally tell lies every ten seconds some benchmark is put up. You also have asked for what? OHHHH, a great NVidia control panel easily accessible right-click desktop with html-like features and ALREADY THERE game-specific tweaks, and oh yeah, that PhysX thing you mentioned - YESSIREE! NVidia ONLY - oh, and doing the dirty with the videos and dvd movies - why, that Badaboom is a g-darn SCREAMER of an application... for conversion encoding.
I guess it's everything you said you wanted... oh and don't forget, the new squishy soft body PhysX, which I predict is right now and will in the future create a MASSIVE market segment in "sexy games" because jiggly squishy body parts makes for anime + ...
I am really, so, so sick of the red lies, really sick of them. (Not you, person I am replying to - nor anyone here, of course.)
I have never seen such a bunch of endusers screaming their "company" that raped them on price and is making huge profit because of it is " so great for doing it" - how exactly does that work for them ?
I know how it's "supposed to work", but when you're bragging that "your company" (meaning the one you emptied your wallet for and handed it all over to that doesn't know or care about you or tom dick or harry) just swiped up huge profits from you because it cost so little to make what they sold you, and then you're putting down your imaginary "enemy" with that, from the "other company" because they got such huge silicon for the same price and it was so expensive to make - why you've ALREADY LOST TOUCH WITH THE REAL WORLD - because you're sitting on it, the empty wallet of the real world, and the gigantic greedy money guzzling con artist you love so much is LAUGHING at you - but you don't know it, not clue one!
Quite a FEAT I must say having watched you twist reality 180 degrees.
(No not you specifically OF COURSE. )
But talk about full blown insanity...I've never seen anything like it - especially at similar price points.
I mean if you were the real traditional braggart, you could wail loudly your card cost so much more because you're daddy geekbucks -and wow that could actually play for you.
Instead, we've got "useful corporate lackey" of the I don't know what variety - hey they "ripped me off good" and made a lot of money because of it ! ( ?? What is that exactly ? Is that a Gen X problem ? )
Ok, you know, I'm really sorry, fellow enthusiasts, but somebody HAD TO SAY IT, TWICE! How can those types of red fan lines go on ten thousand times, left in the mired stupidity they reside in, and NO ONE ever bring it up?
Whew, man I'm telling you. I'm trying to finish here... quickly.
Thank you NVidia for giving all your end users a gigantic value with expensive big silicon, and ati you greedy corporate pigs, cut your prices down where they belong since it's so cheap to make your tiny chip - stop RIPPING OFF FAN BOYS ! You just KNOW they can't help themselves - it's not right.
I dearly hope that clears things up a bit.
NO, I don't believe all you who have been doing it are deeply steeped in AMD/ATI stocks and that's why you handed your money over... NO... don't even try it.
Now please carry this message forth to all the other forums - because they all have the same pollution.
Thank you.

 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
Ok, now my turn to learn. Here is a list of Ageia PhysX-enabled games: http://www.nzone.com/object/nzone_physxgames_home.html
That's quite a long list so claiming there is "hardly anything" just doesn't cut it.
Since NVidia acquired Ageia, one can easily say all those games are still on the PhysX-enhanced list for NVidia. I seriously doubt they trashed that capability.
My question is: with the 295, does the Big Bang driver version allow enabling PhysX on the 295 in SLI mode? (As I recall, the Big Bang driver didn't do this for other NV cards - but did allow a 3rd card to run PhysX with SLI on 2 others.)
I think it's very safe to say it no doubt can be enabled in single-GPU mode on the singly used GPU of the 295.
I'd also think they might be able to implement PhysX on the second GPU when in single mode, but thought that was not the case before (with the 9800GX2, for example).
 
G

Guest

Guest
Are company buyouts retroactive? If next week Microsoft bought Nintendo, would we remember back in '85 when Microsoft released the NES? ...
 

cleeve

Illustrious
[citation][nom]silicondoc[/nom]Gee, I hate to induce endless red rage, but of course, NVidia drivers are supremely stable when compared to ati drivers, you don't have to hack out the CCC, jam in the dot net2.0, diddle endlessly with CF, curse feverishly about greyed out CCC tabs, and generally tell lies every ten seconds some benchmark is put up. [/citation]

I have to respond to this as a hardware reviewer; as far as my experience goes, it's absolutely false.

I test Nvidia cards and Ati cards on a regular basis, side by side. I can't say I've run into a show-stopping driver bug from either camp in years. YEARS.

Both driver sets have a few irritating nuances, mind you. But honestly, the whole "Ati drivers are crap" thing is *so* 1998. Let's move on. Or can you specify a specific problem I'm missing that's so terribly bad?
 
[citation][nom]cangelini[/nom]This product will not be available for Christmas, so any reference to bows, holiday sales figures, or Santa unfortunately doesn’t really apply.[/citation]

Uh, yes it does. You're helping to affect Xmas sales of a competing product by (p)reviewing a non-existent product that you know won't arrive in time for Xmas sales, just before the other company is ready to sell their product. EVERYONE knows that the 3 major sales periods for graphics cards are just before summer break, at the return to school, and Xmas. Telling people, 'HEY HEY, WE have something coming soon, don't spend your money now!' lets nVidia influence the market without actually having a product on the shelf, just the same as the rumours of their 55nm part in the fall season which never came. People wait, and then they put off their buying just when companies are planning their launches, and regardless of actual pricing and real deals, people say, 'Oh, but something else is around the corner, and they're already reviewing it, so it must be coming soon and in volume.'

[citation][nom]cangelini[/nom]This preview is not a review. Says so right in the title. Moreover, it’s stated in the story itself that this is engineering sample hardware, which could very well change by the time it hits retail, hence the disclaimer.[/citation]

How many people who read yours or any (p)review do you think will appreciate that it's just a preview, and not just jump to the results pages? Even in comprehensive new hardware reviews people don't read squat; they flip to the pretty pictures/benchies. This should not come as a surprise; we see it every day.
How many of the people reading will know what a paper launch is (how many even know of the old days when both companies would paper launch at each other [the stopping of that practice blamed for the fall of the pulp industry as its biggest buyer of paper])? It got so bad people would include pictures of many boxes ready for shipping to prove that the products were there on official launch day, not weeks/months later. It's like Intel's multi-year DX10 driver promise to keep people hoping/expecting instead of buying something else. Always extend people hope and promises if you don't have actual product to sell them.

[citation][nom]cangelini[/nom]the only way I am able to do my job is to maintain objectivity to the very best of my ability. My assessment of Nvidia’s branded software development efforts is just that—and nobody needs to pay me for my opinion except Tom’s Hardware. The data provided in this piece was run and provided in order to be informational. If you didn’t get anything out of it, I’m sorry, but as a hardware enthusiast myself, I am *always* interested in getting more insight into whatever is just over the horizon. Hopefully you’ll feel differently when it’s AMD under the microscope rather than Intel/Nvidia.[/citation]

No I wouldn't feel better whomever it was.
You miss the point: it's not that your objectivity is in question, it's your critical thinking and situational awareness that seem questionable, in publishing a paper-launch preview into one of the largest (if not the largest) buying seasons of the year. People don't need to be purposely biased; they can be unwitting tools of the mfr who knows how to manipulate their pawns with free stuff and previews. And both ATi and nVidia are savvy with that form of PR manipulation.

Understand it's less to do with the content of the review than the timing of a paper-launch preview of a product most of us have known is 'eventually coming' once the 55nm parts start appearing, but we also understand it will be in extremely low quantity and will also be met with a price cut similar to every other launch by ATi and nVidia.

I don't question your objectivity in reviewing, but I do question your judgment in posting this (p)review, which is light on technical information and heavy on benchmarks that are covered in asterisks and achieve just one thing: to put PR out there when there's no physical product to advertise at this time.

Ask yourself just WHY you got a preview part for this buying season from a company that, like ATi, clams up and says "we don't comment on unreleased products" unless it's strategically advantageous for them to 'leak' and early-(p)review parts. They both do this, and it lessened for a while because people called them on it. Regardless of the naming of it, it's a paper launch, and it has one role, the same as it ever was: to stifle the competition's sales. And yes, the timing is meant to hurt the competition's sales more than give people a peek at what's behind the curtain. That's why Xmas is an issue, because nV is trying to be the Grinch to ATi's sales (which is funny 'cause it would be the first Xmas in 4 years that would have ATi on top of the single card list [X1900 launch in January; the low-volume GF7800Ultra {*cough* (ouch) GTX-512} was launched for that season]). You may have been unaware of the significance of the season and the motivation of this otherwise rare (p)review of an unreleased product, but I'm certain nVidia was fully aware of their timing.

[citation][nom]cangelini[/nom]Hope your ribs feel better soon; careful for those trees.[/citation]

Thanks, it wasn't the trees (trees are our friends, glades are great), it was a ledge at Revelstoke that spontaneously 'grew' out of the mountain and jumped in front of me. Flat light sucks in deep powder!
 
*dang* just saw the quote stream above, would LOVE to be able to edit comments to clean that up. Hopefully closing quotes works this time.

[citation][nom]sojrner[/nom]you know, it seems not that long ago that the site made that stand publicly.[/citation]

Exactly.

They did it twice in my recollection, once under Lars (it was a last-minute, no-time-to-review stand essentially) and once under Darren (who likely got as sick of the shenanigans as Lars).

[citation][nom]sojrner[/nom]Of course, looking at the changes being made at other pieces of the tom's pie it's a wonder we haven't just gone to reviewing how the different hardware runs flash games better.[/citation]

LOL! *ouch* (can't laugh)

Yeah, I feel that too. There are still some in-depth looks, but they are rarer, and the site seems more geared towards marketing and selling than investigating and learning.
 

cleeve

Illustrious
[citation][nom]TheGreatGrapeApe[/nom]PS, it's not a 'Killer Deal'...[/citation]

In your humble opinion it isn't, in my humble opinion it is.

Getting an extra GTX 280 GPU for $125 over the price of a single GTX 280 qualifies for 'killer deal' status to me. I can understand why others might disagree, but I'm certainly entitled to the opinion, Grape.
 