GeForce GTX 295 Performance: Previewed


randomizer

Champion
Moderator
[citation][nom]Grunch[/nom]ATI is going to reales soon a new DRIVER "SIDEPORT" its gonna improve a lot of FPS in games.[/citation]
No, they will never release that driver. SIDEPORT doesn't exist; it's just a myth. That, or AMD doesn't know how to make it work.
 

constonce

Distinguished
Dec 9, 2008
I really didn't see any huge benefit from SIDEPORT. It's kind of like Hybrid SLI: a cool feature for PC manufacturers to list on the box, but beyond that, nothing. I do like the notion of GPU manufacturers crossing over into the CPU field, though. New GPU cores tend to offer more raw throughput than even quad-core CPUs. I know it would be difficult for ATI since they're a subsidiary of AMD, but still.
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]TIndytim[/nom]First!?Why do I get the feeling AMD is already working on something to bust Nvidia again?[/citation]

TRANSLATION: "ATI really needs something more now; they got stomped by virgin NVIDIA drivers."

Yes, that would be why that feeling came over you...
(Yes, there are leaks on the web about the next ATI card too.)
 

randomizer

Champion
Moderator
[citation][nom]constonce[/nom]I really didnt see any huge benefit from SIDEPORT.[/citation]
Are you referring to the IGP version? That isn't the same thing, since it connects the IGP to onboard memory rather than acting as a GPU-to-GPU interconnect.
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]scook9[/nom]SO..........i see nvidia still has not figured out how to use GDDR5?!?! If they could get that worked into the GTX295, i dont think ANYONE would be able to top that for a while.Still dissappointed that nvidia has said nothing about GDDR5...Toms already did an article a while ago on how Hynix is now making 1gb DDR5 low latency chips somewhat cheap..whats the hold up Nvidia[/citation]
The rumor is already out on the web that the GTX 214 will use GDDR5. I didn't see the specific bus width, or don't recall it; one assumes 256-bit with GDDR5, though something between 256 and 384 is possible. NVIDIA has shipped 192, 256, 384, 448, and 512-bit buses (I may have missed a few, and I skipped 128), so they have shown flexibility there where ATI has not.
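For anyone wondering where those odd bus widths come from, here's a rough back-of-the-envelope sketch (Python). It assumes each ROP/memory-controller partition is 64 bits wide, carries two 32-bit GDDR3 chips, and uses 512 Mbit (64 MB) devices; these are illustrative assumptions, not spec-sheet figures.
[code]
# Back-of-the-envelope sketch: how 64-bit memory partitions produce
# NVIDIA's assorted bus widths and frame buffer sizes.
# Assumptions (illustrative, not official specs): each partition is 64 bits
# wide, hosts two 32-bit GDDR3 chips, and each chip is 512 Mbit (64 MB).

PARTITION_BITS = 64   # one ROP/memory-controller partition
CHIP_BITS = 32        # one GDDR3 device
CHIP_MB = 64          # 512 Mbit density

for partitions in range(2, 9):          # 2 to 8 enabled partitions
    bus_bits = partitions * PARTITION_BITS
    chips = bus_bits // CHIP_BITS
    framebuffer_mb = chips * CHIP_MB
    print(f"{partitions} partitions -> {bus_bits}-bit bus, "
          f"{chips} chips, {framebuffer_mb} MB")

# 3 -> 192-bit/384 MB, 4 -> 256-bit/512 MB, 5 -> 320-bit/640 MB,
# 7 -> 448-bit/896 MB (GTX 260), 8 -> 512-bit/1 GB (GTX 280)
[/code]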
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]enyceckk101[/nom]wow , this card doesn't consumption any power when u playing far cry 2 ? Im getting this card ![/citation]
I do wonder why you got -3 on your post. YES, it is LESS power than the 4870X2, about 50 watts less overall according to the review. I guess the red ragers gave you three minuses, and one of them will chime in that it was your spelling, or the idea of "no power" instead of less power.
Oh well, enjoy the GTX 295 that STOMPS the 4870X2. I KNOW I'm going to enjoy all the red fan excuses... they're already rolling out like molasses: Moaning Orwellian Lamerz...
I'll plus you up +1, to a minus two, because if no one buys it, competition suffers and prices go higher. Don't let them take you down, man! They're all over 40 and part of the establishment!
Thank you for having the guts to say you want it while the red fever rages all about and you're living dangerously... lol
(I wonder why so few people said "I want it!" Gee, what could that be? Maybe I haven't read far enough.)
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]jaydeejohn[/nom]Read my link. The minimum fps sucks on this card, the 295. Its a driver issue. It happens, even when youre working with devs early on in the dev of a new game, like nVidia does with its TWIMTBP program[/citation]
You know, maybe that's not such a thing to whine about, since the 4870 gets stomped by the GTX 260 when it comes to minimum framerates. I know, the high-end, super-rez, two-grand 2560x1600 monitor with all the AA and AF that no one I know personally ever plays at is really cool as it hangs just above... NO! WAIT!... as it hangs below 30 frames on the minimum a LOT, and therefore isn't playable...
Yeah man, I'm going to get a card for a resolution my monitors can't even display...
No wait, I'm going to spend $3,000.00 on some OK flat panels that can do 2560x1600 so I can buy a $400-500 card (4870X2) while I MOAN about the price of the competition's card that blows it away... or even about the GTX 280, which runs at a framerate high enough for my non-existent dual 2560x1600 monitors... and claim it's the one, when I can't even run that rez on one monitor, let alone two...
I'll just DUMP my two monitors...
Yeah, so much FUD.
I think the real deal is that most people run at less than 1600x1200 (or its close equivalents), SO...
THEY ARE GETTING DECENT, PLAYABLE FRAMERATES out of the top-end cards, the top four or more on down the scale (depending on which NVIDIA or ATI card), because they are generally NOT playing at 2560x1600.
I think that's one of the worst things about so many reviews and discussions lately. If you consider the rest of the hardware, say a Pentium 4 HT on up, even with PCIe 1.0, and how cheap 2 GB of RAM and hard drive space are, the SMARTEST thing a gamer could do is BUY A TOP-END CARD...
and use it on their current rig. No, you won't be maxing it out (not even an i7 maxes out the real high-end setups), BUT you will get very comfortable framerates in ALL YOUR GAMES, at every resolution your current monitors can manage. Yes, it will tax your CPU, but so does your current card, most likely.
So that's my two cents on the issue.
No, people should NOT go out and buy immensely expensive 2560x1600 flat panels just to run a 4870X2 or a GTX 295 or a GTX 260 or 4850 CF or whatever.
They SHOULD just GET THE CARD...
and enjoy the framerate headroom and all the eye candy at the highest rez their current monitor(s) and system can handle.
That should be PUSHED MUCH MORE.
It is better for the end user.
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]mbbs20[/nom]the 4870 x2 are selling for around $460- $470 on newegg...why doesnt it reflect correctly in this article because it says 4870 x2 ~$500http://www.newegg.com/Product/Prod [...] 6814103066[/citation]
Oh gee, the original price is still there: $549.00 for the 4870X2. I guess NVIDIA is driving down ATI's flagship money card ALREADY with their early announcement (denounced as dastardly by the red fans); they'll probably destroy ATI and force them into bankruptcy! How much lower can ATI go, with that $549.00 Newegg release price? Might be the end of 'em. The bell is tolling...
So NVIDIA is going to release at $499.00, and ATI will be the SCALPER then, with their $549.00 release price and much smaller silicon costs. SHAME ON YOU, ATI! Ripping off your fans... SHAME ON YOU!
Thank God NVIDIA is coming to market to stop the fleecing of gamers by ATI. They have the top single card (even though it's third when CrossFire doesn't work in the many titles that don't support it), and they get a big fat greedy head and come in at $549!
I'm so mad at the scalping, greedy ATI corporate machine!
Thank you, NVIDIA, for DRIVING THEM DOWN from their GREED-DRIVEN FRENZY!
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]hellwig[/nom]Lets see, at release the GTX 280 was what, $650? ATI released the 4870 at $300 (or was it $400) and completely decimated Nvidia's sales that month, forcing Nvidia to drop prices down by almost 40%. That's huge. To target their pricepoint at the 4870X2's current $500 mark is definately a gamble, they can't be making much money at that price. Whats to say ATI couldn't drop the 4870X2 price 10% to $450, and maintain the price/performance lead its R700 series has held for so many months?Nvidia loves that "worlds fastest single card" feather in their cap, but they might be sacrificing too much to get it back this time. I guess we'll have to wait and see what the final price and performance specs are.[/citation]
YOU KNOW, THAT'S A NICE RINGER, "price/performance lead," but what is there to back it up in actual SALES NUMBERS?
Why wouldn't someone still want the GTX 280 over the 4870X2? If you don't want to HASSLE with game support and driver issues, you get the GTX 280 or even the GTX 260/216, and every resolution you play at is FINE, with plenty of framerate... (do YOU own TWO 2560x1600 flat panels? Even ONE?)
So it's a great catchphrase, but the X2's issues make me wonder whether more 4870X2s have actually been sold than GTX 280s.
The same will of course apply to the GTX 295, BUT I can hardly see a $50 difference making a person go for the 4870X2 over the GTX 295 that beats it so soundly even with virgin drivers, while the 4870 drivers are "matured" and have already squeezed all the blood out of the turnip.
I wonder if anyone has "the numbers" on units sold, because I haven't seen them anywhere. That would be quite interesting in any case; the numbers could be quite instructive.
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]Swixi[/nom]That is only in Far Cry 2 for some strange reason. Check the other game benchmarks on that site; don't just link one game, or you create an unfavorable bias equivalent to that of an ATI fanboy (most of the people on here).[/citation]
I agree with you; the slant always goes that way from so many people. I'd like to ask all the red fanboys if they're using a 2560x1600 flat panel (which cost them far upwards of a grand, and more than two grand if it's a good one), because they seem to LOVE that "winning AA/AF resolution" and scream that it settles it...
I think the LACK of their fantasy $2,000.00 monitor settles it in favor of NVIDIA almost 100% of the time.
(I know, they all have TWO gigantic low-latency 2560x1600 flat panels...)
Hmm... and someone "doesn't see" what you mean about red fan domination...
I know I should cool it, but I can't take a whole year of this CRAP!
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]randomizer[/nom]With no clear winner, a price war could happen. But NVIDIA would be hurting from that.[/citation]
Do you really give a ding dang whether NVIDIA is "hurting from that" just because you heard their wafer is bigger?
I mean, do you own their stock? No, you must own red stock?
No, I think you don't own either stock, but just couldn't help yourself.
Do you think maybe both companies go for "the crown" with a single-card X2 because it more than pays for itself by driving their lower-end sales?
GEE, you think that might be WHY?
So isn't it POSSIBLE, at least, that the crown holder gets a huge selling point from being "the best company," and that translates into higher sales across all their tiers?
Isn't it possible it MIGHT actually BOOST NVIDIA's market share?
Hmmm...
Why is the thinking so simpleminded?
Maybe NVIDIA is just a bunch of crazed wackos who can't stand being number 1?
Of course, if that's true, it would also apply to the red ATI wacko freakdom and the 4870X2...
Gee, what if taking the top spot actually improves NVIDIA's bottom line even if they slash the GTX 295's price?
How would that make you feel?
Be honest, let's have a group inner-child session...
LOL
NVIDIA corp HQ, arrogant CEO screams: "By golly, I don't give a BLEEPITY BLEEP how much this slams our bottom line into the dirt! Make that two-core 295! If we have to go under slashing its price, we'll sell it at a loss! Anything to go down dying as the King! DO IT NOW!"
_________________________________________________

Gee, ya think all that conventional "wisdom" the little ankle biters spew all the time might just be WRONG!?
I don't know, maybe you're so arrogant you spend yourself and your loved ones into bankruptcy trying to be #1?
Ya think?
 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]thepinkpanther[/nom]on a lighter note i thought nvidia renamed their cards so the comsumer wouldnt be confused with numbers...well the gtx260 and gtx260 core 216 sounds confusing.[/citation]
OK, Mr. Pink (or Miss or Mrs., as the case may be): how, for the love of GOD, is 260/192 vs. 260/216 any worse than 4870/512 vs. 4870/1024?
__________________________
HERE WE HAVE A PERFECT EXAMPLE OF TOTALLY UNCONSCIOUS RED FANBOY BLOVIATING, FOREVER, AND NO ONE POINTS IT OUT... TILL I DO!
______________________________________

In other words, sir or madam, the 260 ALWAYS has 896 MB of RAM on it, so THERE ARE EXACTLY TWO KINDS: 192 and 216 shaders. TWO KINDS.

In other words, sir or madam, the 4870 ALWAYS has the same core on it, so THERE ARE EXACTLY TWO KINDS: 512 MB and 1024 MB of memory. TWO KINDS.
__________________________________________________________________

Is it clear yet WHAT A GIGANTIC BIAS HAS BEEN PERPETRATED ON NVIDIA FOR HOW MANY GODFORSAKEN MONTHS BY THE MINDLESS DROOLERS?
__________________________________________________________________

"I'm getting a 4870!" Oh really? Well, that's confusing... which one are you getting? Whatever do you mean? YES, WHAT A LYING SCREW JOB THE REDS HAVE PERPETRATED!

This has been a public service announcement.
Please cut the crap, now that you know how STUPID it is.




 

silicondoc

Distinguished
Feb 7, 2008
[citation][nom]randomizer[/nom]Why all the crap against a fellow fanboy?[/citation]
I love you, fellow fanboy. I'm just helping fanboys get to a point of HONESTY, so they can really hammer that fan syndrome home with no possible rebuttal or effective objection. We're talking about winning clean, no fuzzy edges of FUD.
 
[citation][nom]constonce[/nom]I still fail to see the glamor of owning a card that has 1600 Stream Processing Units and is being one up'd by a card with less than half that.. blah blah .. And, didnt have to use the same amount of hardware to do so.[/citation]

Actually, dude, nVidia used MORE hardware, both in silicon area and in transistor count, so your statement is the exact opposite of reality: nV uses more hardware, they just chose to spend less of it on SPUs and more on TMUs, ROPs, and memory buses.

As Randomizer points out, it's not about the number of SPUs, it's about the architecture as a whole.

But your statement is simply wrong, and only further illustrates that SPUs alone aren't everything, just like nV's much higher VPU speed or much wider memory bus means little on its own, which silly hasn't figured out yet.
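To put a number on that "bus width isn't everything" point, here's a rough sketch (Python) of peak memory bandwidth, which is roughly bus width times effective data rate. The clocks below are approximate launch-era figures, used purely for illustration.
[code]
# Rough sketch: memory bandwidth = bus width (bits) x effective data rate / 8.
# Clock figures are approximate launch-era numbers, for illustration only.

def bandwidth_gbs(bus_bits: int, effective_mtps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and effective transfer rate."""
    return bus_bits * effective_mtps / 8 / 1000

cards = {
    "GTX 280 (512-bit GDDR3, ~2214 MT/s)": (512, 2214),
    "HD 4870 (256-bit GDDR5, ~3600 MT/s)": (256, 3600),
}

for name, (bits, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gbs(bits, rate):.0f} GB/s")

# ~142 GB/s vs ~115 GB/s: half the bus width gets most of the way there
# once the memory itself is fast enough.
[/code]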
 
[citation][nom]sillydoc[/nom]The rumor is already out on the WEB - the GTX214 will use GDDR5 - I did not see the specific bit width, though, or don't recall it - one assumes 256 with the GDDR5[/citation]

Haha, and the rumour is already out on the web that AMD is already shipping its 40nm refresh parts to replace the HD 4800s, but that's equally spurious and I give it the same weight: zero! Mainly because people like yourself confuse real rumours about lesser parts with something huge coming.
As for a GTX 214: point me to anything credible on that. You obviously don't get how nV's naming works; a GTX 214 would sit near the bottom of the scale (the GTX 295 being toward the top), so it's very unlikely to warrant a 256-bit memory interface, which is probably why you conveniently forget the details: they're as over-inflated as the rest of your info.
As for the G(T)212, G214, G216, and G218, they're a long way off (40nm parts) and of little consequence here; the G214 would be competition for the HD 4830 and a replacement for the aging 9600GT, nothing special, and certainly not worth the cost of supporting a memory bus wider than 256-bit.

[citation][nom]sillyconedork[/nom]Do you really give a dign dang if Nvidia is "hurting from that" just because you heard their wafer is bigger ?[/citation]

Dude, once again you don't know WTF you're talking about, and for someone with "silicon" in their name, that's laughable. nV uses the same size TSMC wafer as ATi (300mm); neither is bigger or smaller in that department. What differs is the size of their chips (no need to take anyone's word for it, they're easily visible on the card), and thus the yield of chips per EQUALLY SIZED 300mm wafer. Seriously, you just look more ignorant the more you post about such things, because you're light on information and heavy on fanboy misinformation.
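If anyone wants to see why chip size (not wafer size) is what drives cost, here's a back-of-the-envelope sketch (Python) using the standard gross-dies-per-wafer approximation. The die areas are the commonly cited ballpark figures for GT200 (~576 mm^2) and RV770 (~256 mm^2); treat them, and the formula, as rough illustration rather than a yield model.
[code]
# Back-of-the-envelope sketch: gross dies per 300mm wafer for two die sizes.
# Uses the common approximation:
#   dies ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
# where d is wafer diameter and A is die area. Ignores defects, scribe
# lines, and edge exclusion, so the output is illustrative only.

import math

WAFER_MM = 300  # both companies buy the same 300mm TSMC wafers

def gross_dies(die_area_mm2: float, wafer_mm: float = WAFER_MM) -> int:
    r = wafer_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_mm / math.sqrt(2 * die_area_mm2))

# Commonly cited ballpark die areas (approximate):
for name, area in [("GT200 (~576 mm^2)", 576), ("RV770 (~256 mm^2)", 256)]:
    print(f"{name}: ~{gross_dies(area)} candidate dies per 300mm wafer")

# Roughly 90-odd vs 230-odd candidates per wafer before any yield losses,
# which is why die size, not wafer size, is what matters for cost.
[/code]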

[citation][nom]sillyconedork[/nom] They have 192, 256, 384, 448, 512 - I may have missed a few - but NVIDIA has shown flexibility there when ATI has not. (I skipped 128 )[/citation]

That's not a demonstration of flexibility, that's a demonstration of the limits of their memory controllers. They pushed to 512-bit AFTER ATi and still have yet to adopt GDDR4 or GDDR5 in any major way. Those unbalanced numbers are indications of their limitations, not their flexibility, just like the NVIO is not a demonstration of flexibility but a sign that they ran out of die space without making the GPU even more ginormous. An NVIO-style I/O chip makes a lot of sense on a multi-GPU solution, but not on a single one; likewise, the unbalanced memory interfaces are a sign of their limitations, not a benefit.

It doesn't take much work to disable memory lanes. Disable a ROP/controller partition and voila, you can cut the pathway in whatever increments that gives you, which is exactly what nV did: cripple their chips, not make them any more flexible than ATi's. ATi just limited its crippling to standard increments like 512-bit down to 256-bit, because the R6xx series was never so much memory-limited as TMU- or ROP-limited, and sticking to standard buffer-size increments is much easier than using 320MB and 640MB, which offer no advantage over 256MB and 512MB but cost more, and are nowhere near as capable as the much larger sizes.

[citation][nom]sillydoc[/nom]HERE WE HAVE A PERFECT EXAMPLE...
... the 260 ALWAYS has 896 ram on it - so THERE ARE EXACTLY 2 KINDS, 192 and 216 shaders. TWO KINDS[/citation]

That's not correct; there are actually three types of GTX 260 SO FAR: the original 65nm GTX 260, the 65nm GTX 260 with 216 shaders, and now the 55nm GTX 260s. All of them will say GTX 260 and 896MB of RAM on them; whether the AIBs actually put the rest of the information on the box, or the customer even knows the difference to begin with, remains to be seen. This is very similar to the GF8800GTS situation, where it was to nV's benefit to confuse people so as not to hurt their efforts to dump the old parts on the ignorant while selling the new ones to the savvy buyer. Savvy?

Seriously, you really just aren't very informed, are you, even about your favourite company's own lineup? Although your ability to combine a lack of style with a lack of substance is pretty impressive.
 
[citation][nom]randomizer[/nom]Look at all the lies AMD has spewed about Sideport. It either doesn't exist or they haven't got a clue about how to make it work.[/citation]

I wouldn't go that far, but it's obvious that for whatever reason they just can't get it working. It's there, you can see the region on the chip, but if you can't get it to work, it might as well not be there.

It wouldn't be the first time either company has added an X factor only to find out that in the real world it just doesn't work well enough.

If there were a time for ATi to 'unlock' support for the CrossFire Sideport, it would be now, but as so many have mentioned, only a very select few would benefit if the AIBs decided not to bother supporting the feature on their boards. It would be like including a turbo/blower in a car but not hooking it up, and then, when they finally let you hook it up, your local dealer decided not to stock any models with the body that supports the air intake.

I doubt we'll see it this round, mainly because it's not absolutely needed the way it would have been for the R600s, and because they can't get it working well enough. Would you bother with something that's really only targeted at less than 1% of your market, who would be just as mad about stability issues as they would be happy about a few percentage points of gain?

I wouldn't be surprised if the main reason for the CrossFire Sideport this round is to test it so that next time it's a usable feature. That, to me, would be a better idea than launching something that amounts to ATi's half-a$$ed X850 CrossFire solution, which got as much or more criticism for its flaws and limitations than it got praise for its benefits.

It'd be nice to see it working (even as a tech demo) for the theory and the technical aspect, but as a wide-release product, I don't think we'll ever see the Sideport used in the HD4K series.
 

randomizer

Champion
Moderator
Well, the way I see it, it's just an added cost in the manufacture of the board. If they don't intend to enable it, they should stop adding it and save themselves (and the consumer) some money. Sure, add it to engineering samples and let a few reviewers play with it, but don't boost the cost of the board unnecessarily, even if only by a small margin.
 
Yep, I think that's exactly why they didn't add it to most boards. If ATi can't prove it useful right up front, it's just an added cost to the AIBs, who are already forced into these low, low prices by the low-ball approach the launch took.

I doubt ATi would show the CrossFire Sideport working to anyone until they have it widely implemented in the next generation. If they can't use it for marketing benefit right now, letting other people see it in action would simply give away information to their competition. If they keep it under wraps and locked down in the drivers, then only they see the benefits and limitations of their unique option, and that leaves Intel, nV, and S3 guessing as to whether it's worthwhile pursuing.

At least, that's what I'd do in that situation.
 

randomizer

Champion
Moderator
For all we know, AMD could be doing what they did with RV770. Remember how it was supposed to have 480 SPUs? Downplaying the significance of this could mean it really is pointless or unworkable and they just want to tease us until we forget about it, or it could mean they're saving up something big. I hope it's the latter; they've done it in the past, so let's hope that's the case. At least they're not pulling a Barcelona PR stunt and claiming it's the best thing since tap water.
 
Guest
@TIndyTim: Because they are a failure.
@NarwhaleAU: It's 260s, dumba$$.
 

loeric

Distinguished
Nov 2, 2008
Just did a further slice and dice on the raw data in this preview; here is how the GTX 295 fares against the 4870X2 (with the 4870X2 as a baseline of 1.00):

Game          1920x1200               2560x1600
              No AA/AF   4xAA/8xAF    No AA/AF   4xAA/8xAF
Crysis        1.15       1.03         1.06       0.15
WaW           1.28       1.10         1.21       1.27
Dead Space    4.40       1.88         1.94       1.69
Fallout 3     1.02       1.04         1.03       1.03
Far Cry 2     0.97       1.07         0.89       1.17
Left 4 Dead   1.01       1.15         1.08       1.24

It appears that both cards perform comparably most of the time at 1920x1200, unless one wants to play at 2560x1600 with AA and AF turned on, or seriously cares about Crysis, WaW, or Dead Space. If I had a 4870X2, I wouldn't sweat it and chase after the GTX 295. After all, ATI's R800 could show up this summer anyway.
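If anyone wants to redo this kind of slice-and-dice on their own numbers, here's a minimal sketch (Python) of the normalization used above: divide each GTX 295 result by the 4870X2 result for the same game and setting, so the 4870X2 is the 1.00 baseline. The FPS values in the example are made-up placeholders, not the preview's data.
[code]
# Minimal sketch of the normalization above: GTX 295 FPS divided by
# 4870X2 FPS per game/setting, so the 4870X2 is the 1.00 baseline.
# The FPS numbers below are placeholders, NOT figures from the preview.

raw_fps = {
    # (game, setting): (gtx295_fps, hd4870x2_fps)
    ("Far Cry 2", "1920x1200 4xAA/8xAF"): (58.0, 54.0),
    ("Left 4 Dead", "2560x1600 No AA/AF"): (108.0, 100.0),
}

for (game, setting), (gtx295, x2) in raw_fps.items():
    ratio = gtx295 / x2
    print(f"{game} @ {setting}: {ratio:.2f}x the 4870X2")
[/code]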

ATI and nVidia release their top dogs almost exactly out of phase, about six months apart, so it's no surprise they take turns capturing the top GPU crown. It's all business. It could get even more interesting when Intel joins the battle. At least consumers should be happy to see ATI and nVidia compete rather than conspire. Just my $0.02.
 