Nvidia Desktop GPUs Hit 40-nm

Guys, 40nm does not give you a "better" graphics solution for gaming. All 40nm does is use less power and generate less heat, allowing the entire package to be smaller and quieter.
These are HTPC cards and useful only in places where power, heat and space are concerns.
Compare the stream processors, bus width, and memory on these to other cards in the Nvidia lineup and they are not too bad. Comparing them to ATI cards is useless unless you have a reference card to bench with.

What I am interested in is how they will perform on the bench: low power, low noise (less heat means less fan needed).
 
It's about time we see 40nm... I hope Taiwan Semiconductor's yields are better now. I'll be expecting something better from AMD. GeForce cards suck for DVD playback.
 
The GT220 doesn't seem all that bad. It is basically half of a 9600GT on a 40nm process.

I bet it could still run Crysis and Far Cry 2 at respectable (mostly medium, some high) settings @ 1280x1024.

The GT210, though.... what a crappy card.
 
Where's our 40nm GTX300, please? Actually, I'm really glad to see these OEM cards come out; it's giving us a preview of what the GTX300 lineup will actually be, so we can safely assume DX10.1 (DX11 is very unlikely).

My current G92-based 65nm 8800GTS 512MB is great, but I'm looking for a little more kick than what the GTX285 offers and I don't want to break the bank. A nice $200 street-priced 40nm GTX3xx sounds like just what I'm waiting for.
 
This is an uber-entry-level card, why all this excitement? The 4770 was hot because it was 40nm and on par with the 4850 performance-wise. This ought to give the Radeon 4350 a run for its money...
 
[citation][nom]IzzyCraft[/nom]10.1 is ati territory? It's more like Nvidia didn't feel it was going to catch on so didn't really put it on their chips. Mostly it's on there as DX11 Nears as a show of we can do it.[/citation]

More like Nvidia couldn't get their hardware to adhere to the 10.1 standard and still beat ATI in benchmarks, so they pretended it didn't matter and pressured game developers NOT to support 10.1 through their BS "The Way It's Meant to Be Played" campaign. Assassin's Creed being the most glaringly obvious example.

Nice try though.

-But, for the record, I still consider myself neutral between ATI and Nvidia, and generally purchase Nvidia cards since I have a hard-on for EVGA. Three separate graphics cards, not so much as a minor fan issue, nice step-up program... it's a damn shame they don't make ATI cards too. Oh well.

This is still good news. If Nvidia can get the 40nm process down and make it more reliable using experience from these OEM cards, it might at the very least let them build some more efficient high-end cards.
 
Interesting to see, but I still want the 4770... along with a 4.0GHz i7 and DDR3-1800 memory.

I hope this is a sign of things to come. Not of shitty cards, but cheap, efficient, and moving forward.
 
ATI is releasing 4 DX11 cards by October. Nvidia is getting farther and farther behind... death is imminent.

Nvidia has a hard time fighting AMD and they know Intel is coming, so yeah, it's just a matter of time before they die as far as computer GPUs are concerned. There's a reason why they didn't show many new computer-related products at Computex and focused on external device solutions instead.

I bet pretty soon we won't see any "Nvidia, the way it's meant to be played" but rather "powered by Nvidia" on cell phones, digital cameras, and stuff like that.
 
Sooo... BASICALLY... all told... it brings absolutely NOTHING NEW to the table...

Well, all things considered... at least it isn't a recycled G92 part... *nods*
 
[citation][nom]Sushi Warrior[/nom]Just a little FYI to people wondering why low-end is first, do you realize how many enthusiasts there are? Not many. The majority (AKA a huge percent) of the market is low-end, OEM cards.[/citation]

Oh I don't think anyone can argue your logic. What makes this telling is that this is upside down from any previous release Nvidia has done. To me, that indicates problems. Whatever the problems may be, they must not be as apparent on the lower end cards. If everything were going right, this would be an article about the 300 series.
 
[citation][nom]themike[/nom]Nvidia has a hard time fighting AMD and they know Intel is coming, so yeah, it's just a matter of time before they die as far as computer GPUs are concerned. There's a reason why they didn't show many new computer-related products at Computex and focused on external device solutions instead. I bet pretty soon we won't see any "Nvidia, the way it's meant to be played" but rather "powered by Nvidia" on cell phones, digital cameras, and stuff like that.[/citation]

A smart person would hope otherwise. That would leave one high-end graphics manufacturer (owned by a company that has operated in the black for how long?), so it would negate the need for R&D to develop new cards and they could price existing ones as high as they desired. The same argument applies to the Intel/AMD wars. Only a fool would hope for one or the other to fail.
 
Well, if they released 40nm high-end parts, they would land on top of an already exhausted lineup of GT200 chips; besides, it'll be a nicer surprise when we hit GT300 (hopefully). We already have the GTX280 (65nm), the GTX285 (55nm), both 65nm and 55nm GTX260 216s (not to mention the GTX260 192 models as well), and still more: the GTX275s (55nm all around as far as I know). If they did release it: A) not a lot of people would care all that much, and B) they would most likely get bashed for any renaming/remodeling/pricing they did. This makes for a good test trial for 40nm.
 