GeForce GTX 660 Ti Review: Nvidia's Trickle-Down Keplernomics

Status
Not open for further replies.
[citation][nom]bigcyco1[/nom]If this card was $250.00 it'd be worth buying, not $300.00 lol, that's ridiculous[/citation]

I'd recommend a 7850 over the 660 Ti at that price. The 7850 is slower at stock, but it can overclock to nearly 7870-level performance, maybe match it, depending on the model, and both the 7870 and the 7850 can overclock further than a 660 Ti can go. The 7850 also keeps its compute advantage and memory bandwidth advantage in this usage, among other advantages. Having CUDA/PhysX on the 660 Ti doesn't help too greatly, considering Kepler tends to be weak at them compared to Fermi, and until TXAA is better supported, Nvidia's AA woes are too severe to overlook.
 
Guest
BestJinjo,
I completely agree that Nvidia is playing dirty with the GTX 660 Ti, because it works like a GTX 670 in some cases and like an "HD 7850 OC" in others.

You wrote:
1) They aren't taking into consideration that future games will use DirectCompute for lighting / global illumination models and incorporate contact-hardening shadows. ALREADY 3 GAMES do it and Kepler tanks in them:

- Dirt Showdown
- Sniper Elite V2
- Sleeping Dogs

Kepler may "tank" in those games, but every game which supports PhysX is tested with PhysX DISABLED.
If they used it, I'll let you guess the performance-index hit on every ATI/AMD product.
At the end of the day, people buying a GTX 660 Ti may seriously consider the HD 7950, because it feels like a whole product with overclocking headroom and consistent performance, but in my case I would just stick with a GTX 670.
 

BestJinjo

Distinguished
Feb 1, 2012
play222,

Yes, PhysX is definitely an NV-specific feature. Games that use PhysX are few and far between, though. Batman and Alice do it well, but the other games that use PhysX often incorporate an unrealistic level of debris (Mafia II) or hardly make any noticeable changes in the game (Metro 2033). Of course there will be PhysX in Borderlands 2 as the next major game. What would happen if more and more games started to use DirectCompute, though?

BTW, AMD is officially dropping prices this week on the HD 7850 to $209, the 7870 to $249, and the 7950 to $319, and is now bundling Sleeping Dogs in place of Dirt Showdown.

For $50 less, the 7870 provides maybe 10% less performance, and for $20-30 more the HD 7950 provides a lot more overclocking headroom. If you do want to use PhysX, then an NV card is the way to go, but after you finish with Batman: AC and Borderlands 2, you'll probably wait another 6 months before any new game uses PhysX, and maybe longer for a game that uses PhysX and does it well.
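The value math there is easy to sanity-check. A quick Python sketch, where the performance index is an illustrative placeholder (660 Ti = 100, 7870 = 90 per the "maybe 10% less" remark), not measured benchmark data:

```python
# Rough price/performance check using the new prices quoted above.
# perf is an illustrative index, NOT a measured result.
cards = {
    "HD 7870":    {"price": 249, "perf": 90},
    "GTX 660 Ti": {"price": 299, "perf": 100},
}

for name, c in cards.items():
    value = c["perf"] / c["price"]       # performance points per dollar
    print(f"{name}: {value:.4f} perf/$")
```

Even granting the 660 Ti its full 10% stock lead, the 7870 comes out ahead per dollar at the new prices.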

Personally, I am more impressed by the physics in Red Faction: Guerrilla than by just about anything NV has done with their closed PhysX implementation.
 

Captain_Kickass

Distinguished
Sep 18, 2011
The fact that the GTX 660 Ti struggles at 1920x1080 does not surprise me given the 24 ROPs and 192-bit bus; many of us saw that coming based on speculated rumors. Nvidia needs to release a 32-ROP, 256-bit version by X-mas or soon after if they really want to see the numbers from the GTX 660 Ti line, IMO.
 
[citation][nom]Captain_Kickass[/nom]The fact that the GTX 660 Ti struggles at 1920x1080 does not surprise me given the 24 ROPs and 192-bit bus; many of us saw that coming based on speculated rumors. Nvidia needs to release a 32-ROP, 256-bit version by X-mas or soon after if they really want to see the numbers from the GTX 660 Ti line, IMO.[/citation]
See
[citation][nom]outlw6669[/nom]They already have that, it is called the GTX 670...[/citation]
 

BestJinjo

Distinguished
Feb 1, 2012
New AMD prices end of this week w/ Sleeping Dogs:

$209 7850
$249 7870
$319 7950 GPU Boost

Price wars FTW!

We shall soon see HD7850 under $200 with rebates.
 
[citation][nom]play222[/nom]BestJinjo, I completely agree that Nvidia is playing dirty with the GTX 660 Ti, because it works like a GTX 670 in some cases and like an "HD 7850 OC" in others. You wrote: 1) They aren't taking into consideration that future games will use DirectCompute for lighting / global illumination models and incorporate contact-hardening shadows. ALREADY 3 GAMES do it and Kepler tanks in them: Dirt Showdown, Sniper Elite V2, Sleeping Dogs. Kepler may "tank" in those games, but every game which supports PhysX is tested with PhysX DISABLED. If they used it, I'll let you guess the performance-index hit on every ATI/AMD product. At the end of the day, people buying a GTX 660 Ti may seriously consider the HD 7950, because it feels like a whole product with overclocking headroom and consistent performance, but in my case I would just stick with a GTX 670.[/citation]

Kepler sucks at DirectCompute because Nvidia decided that they didn't care about consumer compute performance. AMD/ATI sucks at PhysX because it is a closed physics implementation owned completely by Nvidia and not shared with AMD, so AMD isn't even legally allowed to support it properly. The situations aren't the same, and both are Nvidia's fault.

Also, with overclocking considered, a 7850 can be a considerably better card than an overclocked 660 Ti.
 

Captain_Kickass

Distinguished
Sep 18, 2011
[citation][nom]outlw6669[/nom]See[/citation]
lol yeah, at 100-150 dollars more, when there really is no need for the GTX 670. Pity the suckers that bought it. The GTX 660 Ti is practically the same card without the 32 ROPs and 256-bit bus. Can you say 670 gyp? lolz... oh, don't tell me you bought one?
 

Captain_Kickass

Distinguished
Sep 18, 2011
Honestly, the step up to the GTX 670 is a lie. The GTX 660 Ti core would get practically the same results with the 32 ROPs and 256-bit bus. There's really not much difference: same clocks, same shaders... do you really think that many GTX 660 Tis are ALL failed GTX 670s? Really? lolz. Think about it... The GTX 670 is just another gimmick for them to squeeze an extra 100-150 bucks, when the GTX 660 Ti would do just as well if it weren't gimped on purpose.
Joke. The GTX 670 is for suckers. I'll wait till they up the GTX 660 Ti someday.
 
[citation][nom]Captain_Kickass[/nom]lol yeah, at 100-150 dollars more, when there really is no need for the GTX 670. Pity the suckers that bought it. The GTX 660 Ti is practically the same card without the 32 ROPs and 256-bit bus. Can you say 670 gyp? lolz... oh, don't tell me you bought one?[/citation]

Those extra ROPs and the wider bus are worth the extra $100, not that I'd buy the 670 anyway.

[citation][nom]Captain_Kickass[/nom]Honestly, the step up to the GTX 670 is a lie. The GTX 660 Ti core would get practically the same results with the 32 ROPs and 256-bit bus. There's really not much difference: same clocks, same shaders... do you really think that many GTX 660 Tis are ALL failed GTX 670s? Really? lolz. Think about it... The GTX 670 is just another gimmick for them to squeeze an extra 100-150 bucks, when the GTX 660 Ti would do just as well if it weren't gimped on purpose. Joke. The GTX 670 is for suckers. I'll wait till they up the GTX 660 Ti someday.[/citation]

The GTX 660 Ti with a 256-bit GDDR5 memory interface, and the benefits that such an interface would grant (such as the additional ROPs), is a GTX 670. Nvidia used GK104s that had damaged memory interfaces and/or GK104s that were good enough for a 670, but they didn't have enough GK104s with damaged memory interfaces to keep up stock of GTX 660 Tis, so they had to use some of the GK104s that would have gone to GTX 670s and maybe even 680s. Regardless of which GK104s end up in the 660 Ti, it takes a 25% memory bandwidth drop from a card (the 670) that is already memory bandwidth bottle-necked.
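The 25% figure follows directly from the bus widths, since both cards ship with the same 6008 MT/s effective GDDR5 speed. A quick sketch of the arithmetic:

```python
def bandwidth_gbs(bus_bits, data_rate_mtps):
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_bits / 8 * data_rate_mtps / 1000

gtx670   = bandwidth_gbs(256, 6008)  # 256-bit bus -> ~192 GB/s
gtx660ti = bandwidth_gbs(192, 6008)  # 192-bit bus -> ~144 GB/s
drop = 1 - gtx660ti / gtx670
print(f"660 Ti has {drop:.0%} less bandwidth than the 670")
```

The GPU clocks being equal, the only lever left is the bus, so the bandwidth deficit scales exactly with the 192/256 ratio.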

The GTX 660 Ti will never have a better memory interface. Its memory interface is what makes it a GTX 660 Ti. If you wait for a GTX 660 Ti with a 256-bit memory interface, then you wait for something that will never exist except as the GTX 670.

If you want to argue about step-ups that are stupid, then the 670 to the 680 and the 7950 to the 7970 are stupid, but the 660 Ti to the 670 is not. The 680 is so memory bandwidth bottle-necked that its considerably faster GPU can't stretch its performance by much compared to the 670, and the 7970's advantage over the 7950 is almost purely in its GPU clock frequency. Overclocking either the 670 or the 7950 will let them perform no worse than their larger brothers, granted the 7950 is underclocked more than the 670 is and simply needs a larger GPU frequency boost to match the 7970 than the 670 needs to match the 680. At the same frequencies as their larger brothers, both the 7950 and the 670 are effectively indistinguishable from them in performance.

The same is not true for the 660 Ti. Overclocking will never let it top, nor even match, the 670. The 670 has more headroom because its better memory interface is not as much of a bottle-neck for the GPU as the 660 Ti's is. The 660 Ti was gimped to make a card below the 670 in performance and price, just as all graphics cards that aren't at the top are gimped compared to the top in order to fill the roles of lower-end cards. Is it really a lie just because, instead of gimping the GPU (which the 670 shows to be very ineffective, since the huge memory bandwidth bottle-neck stops the lower GPU performance from making much of a difference), it was the memory interface and related hardware that was gimped? As much as I dislike the result, Nvidia wanted a card below the 670 in performance and not just price, and gimping the memory interface was the only way to do this within reason, short of not using the GK104.
 

army_ant7

Distinguished
May 31, 2009
Just like BestJinjo has pointed out over and over again. I just caught wind of AMD's aggressive price cuts, thanks to AMD Gaming's Facebook page. It's a TechReport article: http://techreport.com/discussions.x/23447 Though they're one of the sites whose review put the GTX 660 Ti in a favorable light, not that I'm saying they're evil; as one commenter here on TH (maybe blazorthon or the article writer "Cleeve") mentioned, the results vary with the games and settings used in the tests.

I remember you got an HD 7850 or two as well, blaz. Kinda sucks for us, eh? Hehehe... But what can you do... Prices really just have to drop sometime. :D
That's really something though, being able to buy the HD 7870 for (I think) the old price of the HD 7850. But you did mention, blaz, put another way, that when properly overclocked the performance difference between those two cards may be negligible. Or did I misunderstand what you meant by "...can overclock almost as far in performance ... maybe just as far"?
 


The 7950 can overclock just as far as the 7970 can. The 7850 can't quite match the 7870 in overclocking performance, but it can get pretty close. The 7870 has a much more significant hardware advantage over the 7850 than the 7970 has over the 7950, and it shows when you do a good comparison, granted I don't think the price hike is quite worth the performance hike.

Yes, I do have a 7850. However, I got to use it for quite a while before now, so I suppose that I got my money's worth with it.
 

BestJinjo

Distinguished
Feb 1, 2012
army_ant7,

My biggest complaint this round is that most of the hardware press (besides a few websites such as Tom's) abided by NV's reviewer's guide, cranked tessellation to extreme, and forwent MSAA for FXAA in many reviews online. That positioned the GTX 660 Ti in the best possible light. A reviewer's job isn't to sell videocards, but to give us a fair and balanced overview of the card's strengths and weaknesses. Places such as Tom's and Xbitlabs did just that, and it showed problems for the GTX 660 Ti.

Further, so many reviewers like AnandTech put a bunch of factory pre-overclocked GTX 660 Tis against a reference HD 7950 "B" type card. However, that's not helping the consumer, who now has 5-6 after-market HD 7950 cards to buy on Newegg for $300-320, none of which have the 7950 B overvolted BIOS installed. As such, the 7950 reviews showed it consuming way more power than it does, and most never even bothered to explore its overclocking! In fact, some of them compared a GTX 660 Ti @ max OC vs. a stock 7950.

As blazorthon noted, if someone wanted to save $, they'd simply get an HD 7850 or even a 7870 and overclock them. The GTX 660 Ti isn't much faster than those cards with MSAA, unless a person hates MSAA for some reason. And if someone wants to overclock, well, the 7950 offers 25-40% headroom for the same amount of $.
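That 25-40% range is just achievable core clock relative to stock. A sketch with hypothetical clocks (800 MHz is the reference 7950 core clock; the overclock targets are assumptions, and real results vary by board, BIOS, and cooling):

```python
# Headroom = fractional clock gain over stock.
stock_mhz = 800                  # HD 7950 reference core clock
for oc_mhz in (1000, 1100):      # assumed air-cooled overclock targets
    headroom = (oc_mhz - stock_mhz) / stock_mhz
    print(f"{oc_mhz} MHz is {headroom:.1%} over stock")
```

So a 1000-1100 MHz overclock corresponds to roughly 25-37.5% headroom, in line with the range quoted above.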

What NV did was use its marketing again to set up reviews masterfully, somehow getting reviewers to include factory pre-overclocked cards that GPU-boosted to 1200 MHz+, despite even reference 660 Ti cards being on sale. Talk about skewing the #s against a stock 7950!
 

army_ant7

Distinguished
May 31, 2009
@BestJinjo,
You did mention some of that above, I think, and I did read it. And yes, I do agree that the reviews can be skewed in Nvidia's favor, against the true interests of consumers.

Something I need some of your (and anyone else's) thoughts on, though. I think I also read this in the conclusion of the article this forum thread is for. Cranking up tessellation to extreme makes Nvidia's cards look better?

I'm wondering about this because I thought Kepler's geometry shader supposedly sucked, thus the low GPGPU performance and the bigger performance hit when turning on tessellation compared to the GCN cards. (Please tell me if I'm mistakenly mixing together two different things: GPGPU performance and geometry shader (tessellation) performance.)

Or is tessellation turned to extreme so as to sort of falsely advertise the GTX 660 Ti's "strength" and divert people's eyes from its weakness with MSAA?

Thanks! :)
 

BestJinjo

Distinguished
Feb 1, 2012
It is Kepler's compute performance that is weak. Its geometry/tessellation performance is as good as the GTX 670's. That is why in games that use compute for lighting and high ambient occlusion, the GTX 660 Ti is so weak (Dirt Showdown, Sniper Elite, Sleeping Dogs). But for games that use extreme tessellation, it will do alright.

The Kepler architecture is actually faster at tessellation than GCN. However, since games aren't made entirely of tessellation and have compute shaders, particle shaders, volumetric shaders, and high-resolution textures, other aspects of the videocard come into play. In a purely synthetic tessellation benchmark, Kepler would win, though.

The last point you made - exactly. When you play the game you can decide what you want to enable. Higher tessellation, higher AA, switch SSAO to HDAO for lighting model, etc. A review should cover strengths and weaknesses of the product. Only testing GTX660Ti at extreme tessellation with FXAA is not representative of how all gamers play.

Just run Unigine Heaven Demo on Normal vs. Extreme tessellation. To me the Normal setting looks great and Extreme looks artificial. The bricks, cobblestones and other things on Extreme stop looking natural because they are too bulbous. That's why extreme tessellation also can appear more fake than "average" tessellation usage.
 
[citation][nom]BestJinjo[/nom]It is Kepler Compute performance that is weak. Its geometry/tessellation performance is as good as GTX670. That is why in games that use Compute for lighting and High Ambient Occlusion, GTX660Ti is so weak (Dirt Showdown, Sniper Elite, Sleeping Dogs). But for games that use extreme tessellation, it will do alright. Kepler architecture is actually faster at tessellation than GCN is. However, since games aren't just made entirely of tessellation and have compute shaders, particle shaders, volumetric shaders and high resolution textures, other aspects of the videocard come into play. In a purely synthetic tessellation benchmark, Kepler would win though.The last point you made - exactly. When you play the game you can decide what you want to enable. Higher tessellation, higher AA, switch SSAO to HDAO for lighting model, etc. A review should cover strengths and weaknesses of the product. Only testing GTX660Ti at extreme tessellation with FXAA is not representative of how all gamers play.Just run Unigine Heaven Demo on Normal vs. Extreme tessellation. To me the Normal setting looks great and Extreme looks artificial. The bricks, cobblestones and other things on Extreme stop looking natural because they are too bulbous. That's why extreme tessellation also can appear more fake than "average" tessellation usage.[/citation]

http://media.bestofmicro.com/U/P/336769/original/tessellation%20scaling.png

I'd expect the 660 Ti to do tessellation more efficiently than the 670 and the 680. Its GPU is almost equal to the 670's, but its memory bandwidth hurts its performance, so tessellation has a lot of hardware to work with and might have less of an impact on performance than it does on the 670 and 680: it works the GPU, a part that is largely wasted by the memory bandwidth bottle-neck, so there's a lot of performance that can't be shown except maybe by tessellation. However, Kepler still seems to have inferior tessellation performance to GCN, at least according to Tom's tests. Their memory bandwidth might hide this, but the GPU itself seems to handle tessellation worse than Fermi did, and especially worse than GCN does. I'm still looking for a similar tessellation performance comparison that includes the 660 Ti to prove/disprove this.
 

army_ant7

Distinguished
May 31, 2009
@BestJinjo,
blaz beat me to it (hehe...), but I was gonna say that I do remember Tom's Hardware's review articles on Kepler showing that it has weaker tessellation performance compared to GCN. I may have mixed up the hardware components responsible for a major GPGPU performance contribution (though the geometry shader may have a big part in GPGPU performance; I don't really know, hehe...), but I do remember one of their articles, maybe their Kepler release article, stating that Nvidia reduced the geometry shaders to 1 section or something. (It would be best to see the article to be more accurate/sure.) I think the premise mentioned was that Nvidia was targeting performance with the features most commonly used at its release, though this doesn't seem to be the case with the GTX 660 Ti. Kinda ironic, really, if that were the case.

That's not to say that the GTX 680 performed worse than the HD 7970 with tessellation on (those were the two cards compared, as I remember/think). The 680 just had a bigger performance drop when tessellation was turned on, relative to when it was off, compared to the HD 7970.
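That distinction (absolute frame rate vs. relative drop) is worth making precise. A sketch with made-up frame rates chosen purely to illustrate it, not taken from any benchmark:

```python
def tess_cost(fps_off, fps_on):
    """Fractional frame-rate loss from enabling tessellation."""
    return 1 - fps_on / fps_off

# Hypothetical numbers: the first card stays faster in absolute fps
# (70 vs 72 is close, 100 vs 90 off) yet takes the larger relative hit.
card_a = tess_cost(fps_off=100, fps_on=70)  # 30% drop
card_b = tess_cost(fps_off=90,  fps_on=72)  # 20% drop
print(card_a > card_b)  # bigger relative drop despite the faster baseline
```

So "bigger performance drop" and "slower with tessellation on" are two different claims, and only the first is what the Tom's charts showed.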

Is this what you remember seeing, BestJinjo, or did you really see performance benchmarks where Kepler excelled when tessellation was turned on? :)
 
[citation][nom]Anonymous[/nom]BestJinjo, I completely agree that Nvidia is playing dirty with the GTX 660 Ti, because it works like a GTX 670 in some cases and like an "HD 7850 OC" in others. You wrote: 1) They aren't taking into consideration that future games will use DirectCompute for lighting / global illumination models and incorporate contact-hardening shadows. ALREADY 3 GAMES do it and Kepler tanks in them: Dirt Showdown, Sniper Elite V2, Sleeping Dogs. Kepler may "tank" in those games, but every game which supports PhysX is tested with PhysX DISABLED. If they used it, I'll let you guess the performance-index hit on every ATI/AMD product. At the end of the day, people buying a GTX 660 Ti may seriously consider the HD 7950, because it feels like a whole product with overclocking headroom and consistent performance, but in my case I would just stick with a GTX 670.[/citation]
Well, I wouldn't go so far as to recommend a 7850 over a 660 Ti lol, but I agree a 7950/7970 or GTX 670 is the best bang for the buck!
 
[citation][nom]bigcyco1[/nom]Well, I wouldn't go so far as to recommend a 7850 over a 660 Ti lol, but I agree a 7950/7970 or GTX 670 is the best bang for the buck![/citation]

Get a 7850 with actively cooled memory, and it can easily beat the 660 Ti in overclocking performance with some tessellation and AA in use.
 

army_ant7

Distinguished
May 31, 2009

Actively cooled memory? I don't think I've seen cards that advertise that. Does any card with two fans pretty much encompassing the whole card qualify as having actively cooled memory? Also, does the temperature in Overdrive include the VRAM's temp? Just wondering if people need to worry about overclocking their VRAM. Thanks! :)
 