GTX 660 Ti / GTX 660



I don't see why not, unless there are Nvidia release restrictions in effect. But any custom cooler on the 670s should work on the 660s, if the information above remains correct.
 
Hopefully so. VideoCardz.com indicates that there is only one version of the 660 Ti, however, and that is the 256-bit one.
 

And then they sell these games with names like "Half-Life" and "Oblivion" that ironically also describe your social life after you make the purchase.
 

And all those energy drinks and snacks - when you go to the bathroom it's Blood Bowl: Legendary Edition. :sweat:
 


So more rumors come out, eh? I think we'll get a definitive answer in two or three weeks.
 

Beats the 7950 hands down and seems to trade blows with the 7970? Not bad for a mid-$200 card.
 

Here's what TweakTown has to say on the subject:
"The problem for AMD, though, is that while this card is priced at levels around the HD 7850 and HD 7870, its performance is around that of the HD 7950 and HD 7970. You combine this with the fact that AMD really don't have any more room to move on price due to recent price drops. We really do wonder what AMD can do. NVIDIA do this excellent job of taking on multiple AMD models with just one video card and in this case, we see that happening again."
http://www.tweaktown.com/reviews/4869/nvidia_geforce_gtx_660_ti_2gb_reference_video_card_review/index14.html
 


I read that entire review and am hoping this doesn't come true. I know the 7800-series silicon is much cheaper to make than the GK104 Nvidia is re-harvesting, so maybe AMD can cut the MSRP again...
 

And reinforce any rumours that they've been price gouging? I'm sure that would go down like a fart in a lift with the company faithful. :lol:
 

I recall an article from way back that postulated the reaction of the AMD faithful if the leaks proved true and Nvidia's mid-range Kepler GPU turned out to be more than a match for AMD's finest. Your post makes me think of that time. Why is that? :lol:
 
The point here is, it was Nvidia that complained about the costs of 28nm, no one else.
They no longer enjoy good die-per-wafer numbers, and have to absorb the higher costs of an expensive node.
Using a larger die will cost more.
 

The point I was trying to make was about the negative feedback they may receive if they were to make any price cuts, but feel free to read what you want. 😉
 


GPU Die sizes from published reviews and so on...

GK104 -> 294 mm^2
Pitcairn -> 212 mm^2

Smaller die size means more dies per silicon wafer, and more dies per wafer means a cheaper cost per die. Both Nvidia and AMD are paying per wafer now (AMD has always paid per wafer, I think), last I read. A rough sketch of the math is below.

Also, I won't be surprised if the yields are better on AMD's Pitcairn. It came to market much earlier, and initial yields of the GK104 were supposedly not satisfactory.
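
Just to put rough numbers on it (my own back-of-the-envelope, not from any review): assuming a standard 300 mm wafer, the classic dies-per-wafer approximation, and a purely hypothetical wafer price, the die areas above translate into cost per die roughly like this:

import math

WAFER_DIAMETER_MM = 300.0   # standard 300 mm wafer (assumption)
WAFER_COST_USD = 5000.0     # hypothetical 28nm wafer price, illustration only

def dies_per_wafer(die_area_mm2, diameter_mm=WAFER_DIAMETER_MM):
    # Wafer area divided by die area, minus a correction term for the
    # partial dies lost around the circular edge of the wafer.
    radius = diameter_mm / 2.0
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(gross - edge_loss)

for name, area in [("GK104", 294.0), ("Pitcairn", 212.0)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${WAFER_COST_USD / n:.0f} per die before yield")

That works out to roughly 200 GK104 dies versus roughly 290 Pitcairn dies per wafer, so Pitcairn comes in meaningfully cheaper per die even before yield is counted, and defects hit larger dies harder, which only widens the gap.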
 


Price gouging are the words. The graphics market has shrunk so much over the years, and now it is essentially a two-party system at the mid-to-high performance levels and a different two-party system at the low, integrated performance level...

I remember the old days when S3, Matrox, 3dfx, PowerVR, Trident, and Rendition were still competitive in consumer graphics.
 


Ah, ye be an old fart too...? :ange: :lol:
 


Yet more hope for AMD cutting prices to the bone. That AMD's $200-300 part was priced that way due to *cough* price gouging... I don't know AMD's original intentions, but I suspect the die was originally sized for the sub-$200 market, and they are raking in money every time someone buys a 7800-series card.
 


Too old for the computer world, I suppose. I've noticed my interest in high-performance computing fading, along with my motivation to try new games, now that I've played through a historical list of great games...
 

I'm glad I'm not the only one. There are a few games around that are a bit of a giggle, but I'm still waiting for something to grab me by the family jewels the way Half-Life did back in '98.