Is Charlie right?

rangers

NVIDIA IS SET to trickle out the latest batch of 55nm parts. Expreview has some pictures and tidbits about the latest 55nm GT200/GT200b here, and some GX2 info here.

It looks like the on-again, off-again GT200 GX2 is on again, and it is called the GTX295. Yay. The 55nm parts, internally code-named GT206, are finally trickling out like we said they would, with no speed increases and no power gains. What should have been a simple optical shrink is turning into a totally botched job, with the 'real' 55nm parts unlikely to come out until late January at the earliest, following yet another spin.

Given the lack of gains with the B2 stepping, the GX2/GTX295 still seems unmakeable in volume, but such trifling concerns have never stopped Nvidia in the past. We hear they are going to launch it even though they can't make it, along with shipping the requisite 19 parts to Newegg so they can claim it is on sale. Real volume won't happen until (if?) they can fix the power problems.

We hear that the 'launch' is likely going to happen at a shindig on the 12th of December so they can claim the win they promised before the end of the year. One has to wonder whether cherry-picking parts in an attempt to use tame press to snow the public is the definition of 'Whoop-ass'. I am sure they will claim a stunning victory in any case.

One way you can tell how screwed up the chip is is the use of a heat spreader and a stiffener (the metal ring around the chip). If you have a big die, you need mechanical support for it, or it can crack or break bumps. A stiffening ring is usually the cheapest and most efficient way to go, but in many cases, a heat spreader will do the same job.

The problem with a heat spreader is that it introduces two additional thermal barriers, the paste under the lid and the lid itself, to the cooling of the silicon. Each one makes cooling incrementally less efficient, not to mention material and assembly costs. You don't do this unless you have to.
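To put rough numbers on that, here is a minimal sketch of the series thermal-resistance math; the power and resistance values are illustrative assumptions, not GT200 measurements.

# Junction temperature with a stack of series thermal resistances:
#   T_junction = T_ambient + P * sum(R_i)
# Every number below is an illustrative guess, not GT200 data.
P_WATTS = 180.0    # assumed GPU power dissipation
T_AMBIENT = 40.0   # assumed case air temperature, deg C

def t_junction(resistances_c_per_w):
    """Return junction temperature for a list of series thermal resistances."""
    return T_AMBIENT + P_WATTS * sum(resistances_c_per_w)

bare_die = [0.05, 0.25]              # TIM + heatsink only
with_lid = [0.05, 0.05, 0.05, 0.25]  # extra paste layer + spreader lid added

print(round(t_junction(bare_die), 1))  # 94.0
print(round(t_junction(with_lid), 1))  # 112.0

Same chip, same heatsink, but the two extra layers under the lid push the junction temperature noticeably higher.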

If you are wondering why every modern CPU out there has one, the answer is simple: so ham-handed monkeys like most DIY people don't crack the die when they clamp the heatsink on. Think AMD K8 here. CPU makers think the cost of a spreader, and the reduction in performance it brings, is worth the protection it gives.

GPUs, however, come assembled. Factory robots don't break chips, so the mechanical protection is not an issue, but the cost remains. So why did Nvidia do it on the GT200? They can't control hot spots. The lid is a heat spreader, and it helps keep chips with poor hot spot control alive and working.

When you see a heat spreader on a part that comes assembled, it is a pretty sure sign something is wrong thermally; it simply is not worth the cost and performance drop otherwise. Make no mistake, the spreader and stiffener combo on the GT200b is a bad, bad sign.

Why is the GT200b such a clustered filesystem check? We heard the reason, and it took us a long time to actually believe it: they used the wrong DFM (Design For Manufacturing) tools for making the chip. DFM tools are basically a set of rules from a fab that tell you how to make things on a given process.

These rules can be specific to a single process node, say TSMC 55nm, or they can cover a bunch of them. In this case, the rules basically said what you can or can not do at 65nm in order to have a clean optical shrink to 55nm, and given the upcoming GT216, likely 40nm as well. If you follow them, going from 65nm to 55nm is as simple as flipping a switch.

Nvidia is going to be about six months late with flipping that switch. After three jiggles (GT200-B0, -B1 and -B2), it still isn't turning on the requested light, but given the impending 55nm 'launch', it is now at least making sparking sounds.

The real question is, with all the constraints and checks in place, how the heck did Nvidia do such a boneheaded thing? Sources told us that the answer is quite simple: arrogance. Nvidia 'knew better', and no one is going to tell them differently. It seems incredible unless you know Nvidia; then it makes a lot of sense.

If it is indeed true, they will be chasing GT200 shrink bugs long after the supposed release of the 40nm/GT216. In fact, I doubt they will get it right without a full relayout, something that will not likely happen without severely impacting future product schedules. If you are thinking that this is a mess, you have the right idea.

The funniest part is what is happening to the derivative parts. Normally you get a high end device, and shortly after, a mid-range variant comes out that is half of the previous part, and then a low end SKU that is 1/4 of the big boy. Anyone notice that there are all of zero GT200 spinoffs on the roadmap? The mess has now officially bled over into the humor column.


Looks like Nvidia is having problems again.
 
The very fact that Nvidia hasn't released its 55nm G200s shows he's at least partially right. I remember rumors of both 65 and 55nm being taped out before the G200 release. I'm thinking that the G200's efficiency, or lack of it, in transistors versus die size has caused some of this. It took ATI three shrinks to get theirs right, going from the 2xxx series to the 4xxx. The 2xxx series leaked badly, the 3xxx was better but brought no performance boost, and it took the 4xxx series to actually open it up.

Reading Anand's tribute to the engineers on the 4xxx team, and the 4870 et al., in his article he says we won't see midrange and lower cards using the G200 design until 2010, so maybe there's a lot more to it than what's being accepted. Shooting for the top spot with such a large die prohibits going to the lower end in a cost-versus-performance scenario, especially when faced with a competitor that has already gone small and aimed its top chip not at the halo end but at the performance end. That leaves a huge gap, which Nvidia was hoping the G9x series would fill, as the G200 just couldn't be competitive at the smaller, more cutthroat pricing we have at the lower end of cards.
 
There is no reason ATM for Nvidia to rush everything and make a mistake. They have the top single-GPU solution out, and seeing as their approach to dual GPUs is always as easy as a sandwich, we should see something from them soon.
As it stands for me, I'm never going quad SLI again, just too many problems with it. Even with tri-SLI I'm a little jumpy. I'd say regular SLI, single GPU, or nothing in my case from now on... but we'll see what the drivers have to say.
 
@ rangers,
How dare you just copy and paste like that? Without a link we are all denied a visit to the Inq and are robbed of a chance to bask in its myriad advertising :lol:
No, seriously, you used a hell of a lot of space for something a simple link would have done. I know forums where they scream blue murder if you use a whole link and not a TinyURL.

Mactronix :)
 
In many ways I do feel he's right. As mentioned, there have been ZERO lower-end products based on the new design. They haven't managed to trickle down to lower price points at all. He and I can guess all day long as to why that is; it might be a problem or not. AMD has done a fantastic job moving the 4 series out to EVERY price point. They had to use CF in order to hit the very high end, but it's there and it can compete.

I'm very interested in the GX2 part. Because the GPU die is so large, it's been speculated that they will have to do another dual-PCB card, unlike the X2. This means increased costs due to the two PCBs. You also have to wonder a bit about power. The 4870X2 only needs a single 8-pin and a single 6-pin to work. Will the same be true of the GTX295? Or will each board need both, requiring a monster of a PSU? Last, the chunk of copper/other metal must cost more than a shiny penny as well.
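For rough context, here is a back-of-the-envelope sketch of the connector power budgets involved, assuming the standard PCIe limits of 75 W from the slot, 75 W per 6-pin and 150 W per 8-pin; the connector counts are just the scenarios mentioned above, not confirmed GTX295 specs.

# Rough PCIe power-budget sketch (assumed limits: 75 W slot,
# 75 W per 6-pin connector, 150 W per 8-pin connector).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget(six_pins: int, eight_pins: int) -> int:
    """Maximum board power allowed by a given connector configuration."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(six_pins=1, eight_pins=1))  # 300 W ceiling (4870X2-style)
print(board_budget(six_pins=2, eight_pins=2))  # 525 W if each board needed both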

In many ways I see little wrong with what he wrote. Nvidia mis-stepped with these cards. The question now is, what do they do?
 
Looks like AMD is getting lucky this year with the introduction of the 4xxx series and Nvidia's poor-performing 9xxx cards, plus the GTX200s. Nvidia does have the best single-GPU solution, but with a few weaknesses (price/performance, power, and heat), which makes AMD's 4870 look the best overall.

I have little faith in Nvidia and its GTX295, nor in any next-gen GTX300 cards. Nvidia had its 15 minutes of fame, but it's now ATI's turn for the spotlight. Hopefully Nvidia doesn't fall too far behind and can still compete with ATI.


Yes I did, get a life.
This is really childish. You should have put a link, or could have just said "sorry, here is the link" instead of arguing with strangestranger. You both should get a life.
 
Can't see Nvidia standing still; they will pull every underhanded stroke in the book to regain the top spot. Either way, we're in for a pretty exciting six months.
 
The way you speak of it, it's like AMD is an angel and Nvidia is a devil. I mean, both companies pull their own schemes, and both have made their own mistakes :)

Either way, I think that as consumers we have only to gain.
 
I feel sorry for anyone that gives Charlie any relevance whatsoever. The guy has a deep hatred for Nvidia and loves to spread FUD. He is like a fanboi evolved into a tabloid writer.
 
These are all previews, even if they are real. I mean, the 4870 was supposed to double the 9800 GX2 according to the previews, and the 9800 GX2 was supposed to consistently more than double the 8800 GTX, yet we saw the GX2 get crippled when resolutions were increased, while the 8800 GTX managed to survive because of the extra RAM.

We'll see where this goes. Be happy either way; all of the other cards will go down in price 😀
 
Oh definitely. I do not want any dual-GPU card at this point. I just find posts that circle back to Charlie about the same as posting an article from the Weekly World News or The National Enquirer: maybe a reference to something real, then smeared with crap.
 
Well, Charlie can be a bit overzealous with his hatred of anything Nvidia, but he gets it right 9 times out of 10, and I find him quite insightful.
 
Yeah, he seemed a bit harsh. I mean, I agree that the card is a bit absurd, but if it is a total flop, why would NVidia release it?

Also, I think he is too cruel to Nvidia's design team. I mean, these are the same people who created the G80. They aren't idiots. And that Anandtech article about AMD gives some insight into this very difficult process.
 
I'd say the biggest problem is Nvidia thought they could go to a bigger die like the GTX200's, and they hit a wall a size/generation before they thought they would. In turn they have incredible power requirements, high heat, and low yields. It just seems like they really need to get something new out the door and try to cut the GTX200 out of their history (maybe the 9k series, too).

I just hope their next design (not shrink) is radically different, even if it ends up slower and in need of tweaking (Thinking HD 2k here).
 
You know what I still don't understand: it's the strongest single GPU, and it almost doubles the 8800 GTX in some cases when it beats out the GX2. I'm still at a loss when it comes to these heat levels I keep hearing about. I leave the fan at 33% speed, and the cards idle at 55 and go to 65 on heavy load.

So all this heat talk, not that I'm denying it, I just don't see it. Everyone is calling the G200 series a flop, yet we have a card that can keep up with the 4870 X2 with a single GPU...

I don't understand what people's expectations are. I mean, the 4870 X2 is just two 4870s in Crossfire... yet it results in up to a 30% gain compared to one 280 GTX.


Can someone explain?

I'm not defending either company, I just don't like it when words are thrown out there. I mean, not even a couple of weeks ago (when I was consistently on the THG forums) everyone was bitching and complaining about the heat levels in the 4000 series. What's going on now? Nvidia launches the GX2, and all these heat problems spring up out of nowhere :S

Someone enlighten me.

280 GTX OC'd thermals: 50-55 °C

http://www.guru3d.com/article/bfg-geforce-gtx-280-ocx-review/4

4870 1 GB (after the fan fix): 72-85 °C

http://www.guru3d.com/article/amd-ati-radeon-hd-4870-1024mb-review/4

All of these thermal points still affect the overall temps of the card.
 
Yeah, people who whine about heat need to take a step back and look at what they are saying. Why? Because the truth is that, to date, the worst card when it comes to heat production has been the 4870X2. Now, I own a 4870X2, so I can speak from experience. These cards dump out HUGE amounts of heat and require constant fan variations to keep the heat on the actual core down.

Now, I love my X2; it kicks butt in games, and I am a big fan of the underdogs (ATI/AMD). But the truth remains that this card heats up my room when it is IDLING.

The 9800GX2, on the other hand, which was my previous card and was touted as a "heat monster" and a "dual PCB failure" by the INQ, was a great card that actually ran VERY cool, overclocked well, and still holds the title in some games (think Crysis).

Simply put, ATI and Nvidia are not going to release dual chip cards unless there is a place for them in the market. No matter what the INQ or anyone else says I fully believe that the GX2 260 or whatever will be an attractive offering.

Heat is just a scapegoat for ulterior motives.
 


I believe that what people are referring to is that the GT200 generates more heat, regardless of what temperature it runs at. It is perfectly possible, after all, that Nvidia is installing a higher-quality cooler on its cards compared to the ones installed by ATI on the HD 4800 series. Remember that those power-hungry beasts need to dissipate all that thermal energy one way or the other, and even if the current cooler is enough for a single GT200, I'd like to see what it will take for two of those sharing the same cooler.

To put it into perspective, it is my firm belief that with the current GT200 chip that we all know and love (sarcasm 😛) we would need something along the lines of a triple-slot cooler to keep the temps low enough for those alarmists scared of burning out their GPUs at 65 C idle.

Basically, what I'm trying to say is that while they might be installing decent enough heatsinks on their single-GPU cards, there's a physical limit to how much they'll be able to cool with a dual-slot solution. That's just how it works.


EDIT@annisman: What people are referring to as a heat monster is not how hot the card runs, but how much heat it generates. If that doesn't make sense, the rough sketch below should explain it a little better 😀
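A quick sketch of that distinction; the wattages and cooler resistances are assumptions for illustration only, not measured card data.

# The temperature a card reports depends on power draw AND cooler quality:
#   T_core = T_ambient + P * R_cooler
# The heat dumped into the room depends on the power draw alone.
T_AMBIENT = 40.0  # assumed case air temperature, deg C

def core_temp(power_w: float, r_cooler_c_per_w: float) -> float:
    """Steady-state core temperature for a given power and cooler resistance."""
    return T_AMBIENT + power_w * r_cooler_c_per_w

# A higher-power chip with a beefier (lower-resistance) cooler can read cooler...
print(core_temp(180, 0.125))  # 62.5
print(core_temp(130, 0.25))   # 72.5
# ...yet the first card still dumps 180 W of heat into the room either way.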
 
The 9800 GX2 was the most in-demand card, and still is. Companies have that card back-ordered. It's a late bloomer. The 7950 GX2 was Nvidia's first dual card, when SLI was still something new to the public. Now, if a game doesn't support SLI/Crossfire, it's a surprise.

The 7950 GX2 shouldn't be compared; that was the beginning of SLI and of how the market shifted because of it (of course it's NOT the literal first, but that's when it went mainstream like it is today, where every game supports it).

For my 280 GTX, BFG told me that 100 degrees should be the max the card can take, and that if it stays around 90 degrees altogether it's fine, though not recommended. So this whole thing about the heat, I personally think it's BS. No triple fan, no special cooling. Stock cooling.

So please, no more of this heat nonsense; it's just not a factor this card falls under. My 280 GTXs make a three-way sandwich, and barely any air circulates in and out of my case because of the water cooling.