Kepler news and discussion

Status
Not open for further replies.

Well dang. If those are the official specs of the GTX 670... I might as well just get a GTX 680, unless the price tag is better.
 



Might as well, I don't think it'll be much cheaper. :ouch:
 

Well, with specs like that it had better be a 670 Ti; otherwise it would seem like they're trying to make a "448-core GTX 560 Ti" alongside a GTX 570, when the 448-core card is really no different from a 570...

I want to see those specs one more time and compare the official 680 against the rumored 670. I'd like to see the RAM count and everything else.
 
Yeah, I would have to say the 670 looks like a slightly crippled version of the GTX 680.
I would bet money they're taking bad chips from the 680 process that can't quite handle the RAM bus or meet the other 680 qualifications.
 



Essentially that's what the 560 was to the 570... just disabled cores and limited the bus.
 

Not trying to steer away from NVIDIA, but would you bet AMD does this for the 7000 series, like they did with the 6970 and the 6950?
 



Sure they do... it's all about the bottom line and cutting costs. nVidia and AMD aren't much different from one another.
 
Read the article?
It only gets worse: no savings at 20nm, and even less going forward.
Now that Kepler has abandoned GPGPU mastery, my guess is big Kepler will be rare and expensive.
Losing guaranteed-yield pricing is going to hurt them; they've gone way out of their way talking about this.
 

Getting more outta the same piece of material = more profit any way you slice it.
 
Well, here's what I interpret from this (and thanks for sharing the link, jd): NVIDIA isn't just sadly having to raise price points; they're doing it because TSMC is being greedy and charging extra at every new transistor node and wafer manufacturing process. Which I can understand, since it costs loads of money to design the machinery to mass-produce them. But the way I look at it now, TSMC gets the most money out of the entire line: TSMC > NVIDIA > consumer.

NVIDIA should be pissed at TSMC for doing this.
 
http://semimd.com/blog/tag/umc/
The foundries are seeing a clear trend at the leading-edge. "The number of tape outs is decreasing, but the volumes are much higher," said GlobalFoundries' Capodieci during his keynote at ISQED.

Citing International Business Strategies Inc. (IBS), a research firm, Cadence’s Beckley said at the 32/28nm nodes, a fab runs $3 billion, process R&D is $1.2 billion, IC design costs ranges from $50 million to $90 million, and mask costs are from $2 million to $3 million.

Citing the same research firm, he said at the 22/20nm nodes, a fab will cost $4 billion to $7 billion, process R&D runs from $2.1 billion to $3 billion, design costs run from $120 million to $500 million, and mask costs are from $5 million to $8 million.



It's apparent that neither AMD nor nVidia is to blame for higher prices, especially early in the process.
 

$ figures apply only to poor people like you and me, not huge corporations like Nvidia.
 
Exactly. You know, it's funny how big companies think that switching up every year saves them money... but only in theory.

Sometimes it's best to just pick a company and stick with it.
 
Understand that the relationship between us and the corporations trickles down as varying costs to us; that's my point. It also suggests a potential slowdown of node generations, because they need higher volumes that simply aren't there, and with fewer tape-outs the cost has to be spread among those who do them, which again trickles down to the consumer.
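The "fewer tape-outs, spread the cost" point is really just amortization. A back-of-the-envelope sketch using the IBS design and mask figures quoted above; the volume numbers and the model itself (NRE divided evenly over shipped chips) are made up purely for illustration:

```python
# Rough sketch: how design + mask NRE (non-recurring engineering cost)
# amortizes over shipped chips. Design/mask figures are the IBS estimates
# quoted earlier in the thread; the 10M-chip volume is an assumed number.

def nre_per_chip(design_cost, mask_cost, volume):
    """Dollars of NRE carried by each shipped chip."""
    return (design_cost + mask_cost) / volume

# 32/28nm node: $50M design + $2M masks, amortized over 10M chips
old_node = nre_per_chip(50e6, 2e6, 10e6)

# 22/20nm node: $120M design + $5M masks, same assumed volume
new_node = nre_per_chip(120e6, 5e6, 10e6)

print(f"32/28nm NRE per chip: ${old_node:.2f}")
print(f"22/20nm NRE per chip: ${new_node:.2f}")
```

Even at the low end of the design-cost range, the per-chip NRE more than doubles between nodes unless volume rises to match, which is exactly the trickle-down to the consumer described above.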
 

"Freedom is when you have nothing left to lose." Graphs, numbers, specs, brands, percentages, "the market share": I can't give a you-know-what about those. Give me more for my hard-earned dollars, that's the bottom line, and I don't care what it costs nvidia's shareholders. If they don't give me what I want, I go to Radeon, and if they don't have what I want, then screw them both; I'll go golfing and enjoy the great outdoors LOL.
 


Just an idea, actually. My current GTX460 can handle most games well above 60FPS at my resolution, but with heavy hitters such as The Witcher 2 or Metro, performance is awful if I try to crank the settings to max. In Metro I can't play with DX11 enabled even at my resolution. That's why I'm thinking of such an idea, though a highly overclocked CPU might be needed to keep up with the GPU at that resolution.



:lol: 60fps is enough for me. If I can keep the minimum frame rate above 60fps, I'll enable v-sync to prevent tearing. On another note, I'm very interested in nvidia's new adaptive v-sync. Does the feature only apply to Kepler, or will older cards get the treatment in a future driver update?



AFAIK that design uses closed-loop cooling. The water-cooling loop only cools the GPU core; the other parts are still cooled by air, hence the fan. If I'm not mistaken, Asetek made something similar for the GTX580 before.



There's too much unknown about it right now, but the big GK is definitely coming from nvidia. As for the naming, I don't know what they'll call it. If they decide to put the big GK in the 600 series, we might see a GTX685 Ti, GTX685, and GTX685 SE; if they put it into the 700 series, it might be a GTX780. As for a dual-GPU card, maybe nvidia won't make one based on the big GK, since dual GK104 should have enough power to rival the 7990.
 
Well, the PC market is going to get thinner, because when people have to pay more for parts, some will start dropping PC gaming, even though it will always be around. With the economy in most countries heading down, it gets harder to afford to "pay to play." But the enthusiasts and the professionals will always be able to afford the hardware, because they use it for more than just gaming.
 


Have they ever had an "in between" card like that? I always remember just the even-numbered x50, x60, x70, x80, x90. So how will this "in between" card compare, since there has never been anything like it before? Do you think it will be better than the 680 but not as good as the 690, or will it be about equal to the 680? Also, how do you think it will be priced?
 

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-680/performance

Look at the 200 series and the 400 series.
 
I think PC gaming is going to make a comeback. For years, most of these Dell-type machines haven't been at all capable of playing modern games. Now AMD's Trinity chips will be legitimate for modern games at reasonable settings, and Intel isn't all that far behind either. That adds up to mean most computers sold will be capable of playing modern games again. Also, gaming as a whole is much more popular/mainstream than in the past.

If I'm not mistaken, AMD is going to smaller dies for their 77xx and 78xx cards, which, once the process matures, should help reduce manufacturing costs. The cost of moving to a new node is going up, though, and it looks like it will stay that way, unfortunately.
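The smaller-die argument can be sketched with the usual dies-per-wafer approximation plus a simple Poisson yield model. The wafer price and defect density below are made-up illustrative values, not TSMC's actual numbers, and the die sizes are only roughly those of GK104 versus a Pitcairn-class chip:

```python
import math

# Sketch of why smaller dies cut manufacturing cost: more candidate dies
# fit on a wafer, and yield rises because fewer dies catch a defect.
# Wafer cost and defect density are assumed, illustrative values.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation for gross dies, accounting for edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost=5000, defect_density=0.4):
    """Cost per working die under a simple Poisson yield model.

    defect_density is in defects per cm^2, die area in mm^2."""
    yield_frac = math.exp(-defect_density * die_area_mm2 / 100)
    good_dies = dies_per_wafer(die_area_mm2) * yield_frac
    return wafer_cost / good_dies

# ~294 mm^2 (roughly GK104 / GTX 680) vs a ~212 mm^2 mid-range die
print(f"294 mm^2: ${cost_per_good_die(294):.0f} per good die")
print(f"212 mm^2: ${cost_per_good_die(212):.0f} per good die")
```

Under these assumptions, the smaller die is roughly half the cost per working chip, which is why mid-range parts can absorb rising wafer prices better than big flagship dies.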
 