Report: 20nm Nvidia Maxwell Possibly Delayed


kyle382

Distinguished
Jun 15, 2010
577
0
19,010
What is the point of even posting something like this? A vague rumor that already seemed like a likely scenario... could it BE?!?! lol. Where do they dig this crap up?
 

AngryCorgi

Distinguished
Feb 20, 2008
26
0
18,530
TSMC is one huge disappointment... Apple was supposed to shift to TSMC for all its Cyclone and newer CPUs, but has had to rely on Samsung more than planned. TSMC has a lot of business but isn't getting anything done.
 


Isn't TSMC considered second best after Intel? Anyway, from what I know so far, TSMC's 20nm for SoCs is not much of a problem. The problem is with the high-performance process that is usually used to manufacture GPUs.
 

toddybody

Distinguished


I bought my 780s in early February... with rumors of Maxwell hitting March. I always take the rumors with a grain of salt (especially given it's a new lithography). I'm happy with my cards for now, and will probably use Step Up for the 6GB 780s... I'd like to get reference coolers this time over ACX.
 


Interestingly enough, Intel announced a move to 450mm wafers a while back, saying it was the next logical step toward better yields and lower overall costs.

I am honestly surprised TSMC didn't look into that already.
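
For a sense of how much 450mm actually buys you: wafer area scales with the square of the diameter, so a 450mm wafer has about 2.25x the area of a 300mm one, and slightly more than 2.25x the dies once edge losses are counted. A rough sketch using the common dies-per-wafer approximation (the 100 mm² die size is an assumed example, not a real GPU):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Common approximation: wafer area / die area, minus an
    edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

die_area = 100  # mm^2 -- assumed example die size, not a real GPU
for d in (300, 450):
    print(f"{d}mm wafer: {dies_per_wafer(d, die_area)} gross dies")
# ~640 gross dies on 300mm vs ~1490 on 450mm,
# i.e. a bit better than the raw 2.25x area ratio.
```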
 

Vixzer

Distinguished
Apr 5, 2014
12
3
18,515
That is what AMD does to us poor consumers when it fails to deliver competitive products... now Nvidia has the luxury of producing the 800 series at their own pace... 2015... Noooooo...
 


Uh... if TSMC has problems with their 20nm node, then it will affect AMD as well. It doesn't matter if AMD is competitive or not.
 

Textfield

Honorable
Jun 23, 2013
70
0
10,660
I'm impressed with Maxwell's power efficiency so far, but we'll have to see how far this goes. I'm sure that Maxwell is a more efficient core, but I also have to assume part of the 750's efficiency comes from the maturity of the 28nm manufacturing Nvidia has long been employing with their Kepler GPUs (and getting quite impressive thermal and power results from). The move to 20nm could definitely further this power advantage, but could it also backfire? Could the immaturity of the 20nm process cause higher defect density and lower ASIC quality, and actually increase prices while power usage doesn't improve?
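
To put the defect-density worry in numbers: a common first-order yield model is the Poisson model, where yield ≈ exp(-defect_density × die_area). A quick sketch (die sizes and defect densities are made-up illustration values, not TSMC figures) shows how a large die on an immature process gets hit much harder than a small one:

```python
import math

def poisson_yield(defect_density_per_cm2, die_area_cm2):
    """First-order Poisson yield model: probability a die has zero defects."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Assumed illustrative values, not real process data.
small_die, big_die = 1.5, 4.0   # die area in cm^2 (mid-range vs high-end GPU)
mature, immature = 0.1, 0.5     # defects per cm^2

for d0 in (mature, immature):
    print(f"D0={d0}: small die {poisson_yield(d0, small_die):.0%}, "
          f"big die {poisson_yield(d0, big_die):.0%}")
# D0=0.1: small die ~86%, big die ~67%
# D0=0.5: small die ~47%, big die ~14%
```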
 


It would be very unlikely for the move from 28nm to 20nm not to lower power usage. Only if the 20nm process had major leakage issues would it increase power usage.

http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/20

Ivy Bridge was a 22nm shrink from the 32nm process Sandy Bridge was on, and it used way less power under load than SB.

Low yields can cause a price increase for sure, though. At first it will be a bit more expensive, but prices should drop fast as the process matures, and if they follow Intel and move to 450mm wafers they could cut prices a lot.
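
Rough per-die cost math ties both points together: cost per good die is wafer cost divided by (dies per wafer × yield), so low yield on an immature node pushes prices up, while a bigger wafer pushes them back down by spreading the wafer cost over more dies. A sketch with invented numbers (the wafer costs and yields are assumptions, not foundry quotes):

```python
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    """Wafer cost spread over only the dies that actually work."""
    return wafer_cost / (dies_per_wafer * yield_fraction)

# All numbers below are assumptions for illustration only.
print(cost_per_good_die(5000, 640, 0.9))    # mature 28nm, 300mm wafer -> ~$8.7
print(cost_per_good_die(6000, 600, 0.5))    # early 20nm, 300mm wafer  -> ~$20.0
print(cost_per_good_die(9000, 1400, 0.5))   # early 20nm, 450mm wafer  -> ~$12.9
```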
 