GF100 (Fermi) previews and discussion

Well, I guess if I have to, I could dig up a few threads saying the same thing about Charlie here, and about the same things, though saying he was wrong again: about the wood screws/fake Fermi, about it coming in Oct, about it coming in Dec, about it coming in Jan, etc.
I could point out a few threads where people said he was wrong about Fermi's first yields, about its power consumption, etc., but then again, what's the point, it's all 20/20 hindsight.
 


Don't point out threads started by John_ to me.... I already said there are some devout people who refuse to acknowledge obvious facts. You already knew that and so do I.

Secondly, the wood screws joke is somewhat of a red herring..... nVidia demonstrated their new Fermi arch running, so what does it matter if the card they actually showed people was a dummy? ATI has done the same numerous times, so saying it was a fake hardly proved anything on Charlie's part.

Thirdly, saying the card wasn't coming out in 2009, again, wasn't any big surprise.

Lastly, he was wrong about their first yields. He stated a specific number which was incorrect. The yields were low, duh, we already knew that, hence it being late, but he rattled off a VERY specific percentage which he had no evidence to support.... other than his... "sources".

As for power consumption, everyone knew it would be high, it is a huge die..... he still doesn't know the specifics nor does anyone else except nVidia employees, so saying "omg it uses a lot of power" alone means absolutely nothing without knowing the performance.

If Charlie has cold hard facts to report and evidence to support it, on something that isn't common knowledge, I would love for him to share them, I really would, knowledge is power.

I don't quite understand why you are defending Charlie like you're his knight in shining armour though, jaydee; it would be like defending that tard Fuad, 'cause he's got a piss-poor track record as well....
 
He deserves mention in this thread because of his 2% yields claim, and his early report that the card shown was a fake, first denied by nVidia and later admitted.
His power estimate of 280 watts for the 512-core chip, etc., can be verified soon.
What's funny here is, he's been on the cutting edge of info from the beginning, and though some hate him for doing what he's done, what he's done so far is fairly right on the mark.
I've been following this along quite closely, day to day, and I know who's written what and when, and like I said, he's been first most of the time, and right most of the time, on issues affecting either Fermi or nVidia PR.
 
He's admittedly tried to resolve this feud, but they won't back down, and he makes a point of it in almost every article he writes about them.
He says they're run on fear, and make their partners fear them as well. Won't tell the truth, even when confronted with evidence, a la the wooden screws, etc.
I take it all with a grain of salt, and time will prove one right over the other, but so far, I see Charlie as winning.

It doesn't do a company any good to tick off the media, and just because he isn't Anand or Tom's, nVidia thinks they can slap him down like so many others, according to Charlie.
He also wrote about the partner drops and their defections before anyone else.
Makes you wonder, even taken with salt.
 


Except, JDJ, Charlie's reputation is not one of an impartial journalist. We all know his hate for nVidia, and to be honest, his recent articles just can't erase that so easily. I am fully aware that he has been right so far on almost all Fermi news, but 1 right doesn't make up for 20 wrongs.
In my opinion the new hierarchy will be something like this:
5970>GTX380>5870~GTX360. I agree this looks pretty bad for nVidia, but I won't go as far as Charlie depicts the situation.
 
I agree, and so does Charlie; even he sees a possible G380 beating the 5970.
We aren't chasing rabbits here (pun intended); we're talking about nVidia, and Fermi to be exact.
What we do know is nVidia's margins will be tight in a price/perf scenario. The card will use a lot of power and will most likely be hot, or hotter than previous gens.
Let's look at heat, for example. The R600 was a large, hot, power-consuming card. The 3870, same perf, no increases, wasn't; it was much smaller and cooler as well.
Look at the 8800s: hot, large, and lots of power needed; by the time G92b came along, it wasn't anything close to this.
The 4xxx series? Lots of perf increases, same concept, hotter, more power, as the compute density went up.
Now, the G200 was hot, with low yields for the 280, higher failure rates, etc. The 200b? A little better perf, better power/cooling, and better yields.
Now we get to Fermi. Much higher compute density, same size as G200, much higher perf. This leads to power and heat concerns again, and I haven't even gotten into costs, or yields, which should be lower than the G200 280's.
Add in the fact that Tesla will be using the top bin for both iterations, the creme de la creme, on a lower-yielding chip, and the 512s will be very rare, or expensive, or both, on top of the overall higher cost of each card's needs: more layers on the PCB, more traces, etc.
This is what Charlie has said, it makes sense, and if this is beyond possibility and hedging more towards anti-nVidia, show me where, 'cause I don't see it.
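Side note on why die size punishes yields so hard: here's a minimal sketch using the textbook Poisson yield model, Y = e^(-A*D). The die areas and the defect density below are illustrative guesses for the sake of the arithmetic, not nVidia's or TSMC's actual numbers:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Textbook Poisson yield model: Y = exp(-A * D)."""
    area_cm2 = die_area_mm2 / 100.0  # convert mm^2 to cm^2
    return math.exp(-area_cm2 * defects_per_cm2)

# Illustrative die areas (mm^2) and a hypothetical 40nm defect density.
dies = {
    "small midrange (~180 mm^2)": 180,
    "Cypress-class (~330 mm^2)": 330,
    "Fermi-class (~530 mm^2)": 530,
}
D = 0.4  # hypothetical defects per cm^2

for name, area in dies.items():
    print(f"{name}: ~{poisson_yield(area, D):.0%} of dies defect-free")
```

Crude as it is (it ignores redundancy and binning, which is exactly how salvage parts ship), it shows yield falling off exponentially with die area, which is the core of the G200/Fermi cost argument above.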
 
I agree the chip will be big, a power hog, and hot as hell (that is why I won't buy it, plus my hate for nVidia's way of doing business), but I won't go as far as Charlie and say the chip is unmarketable. It will probably struggle with low yields for the launch period, but seeing that the chip is pretty scalable, we might see nVidia producing more midrange cards from the failed high-end chips. nVidia will probably take the single-GPU crown, and that will help PR and help the lower/mid chips sell like hot cakes.
We all know that the majority of consumers believe the PR and don't bother to check at review sites; add in the diehard fanboys and we already have a semi-successful launch. Improve the yields and the BS from PR, and voila, nVidia is gaining market share.
 
That's the other problem. Currently, there's nothing on the mid cards and lower. Now, this only backs up the low yields, being late, having trouble with clocks, and so forth.
I've seen a few comments here and there about nVidia's way of doing tessellation, and how Fermi's laid out, and it may just be that the lower you go in scaling, the more the tessellation ability goes down in hardware. Nothing to confirm this, but that's the speculation (rough arithmetic below).
If they're hanging their hat on tessellation as it is, say doing half a Fermi for a decent card in perf, close to a 280 or so, if it only has half the tessellation units in hardware, well, you can see where this is going. So far it's speculation only, but this could hamper its abilities as it scales downwards.
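For what that speculation amounts to in numbers, here's the back-of-the-envelope version; the cluster count and the one-tessellation-unit-per-cluster tie are assumptions for illustration, not confirmed Fermi specs:

```python
# Sketch of the speculation above: if the tessellation hardware is tied to
# the shader clusters (one tessellation unit per cluster is an assumption
# here, not a confirmed spec), then cutting the chip down cuts tessellation
# throughput by the same ratio.
FULL_CLUSTERS = 16  # hypothetical full-Fermi cluster count

for clusters in (16, 8, 4):  # full chip, half-Fermi, quarter-Fermi
    scale = clusters / FULL_CLUSTERS
    print(f"{clusters} clusters -> {scale:.0%} of full tessellation throughput")
```

If the assumption holds, a half-Fermi midrange part starts the tessellation race with half the hardware, which is exactly the worry being raised.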
 


JDJ, I hate to burst your bubble, but... please tell me who is going to care about tessellation from an average consumer's point of view? Does your average Joe even know what tessellation is? He will see it on the box and go 'oh, I have a card capable of tessellation, neat!' and buy the freaking card. Tell me I am totally wrong and maybe I will gain back some faith in humanity 😛
 
Oh, I'm not saying it's a deal breaker, yet. Though it could be, it's too early, unless we see a Crysis-type game come out with its full features. So, no bubbles to be burst here; anyways, I was referring to nVidia's claims from their slides, and their tessellation benches.
I wasn't promoting it so much as questioning it.

I agree, unless it truly stands out, and again, who knows? It's nothing yet; all we really have is a fast-paced game where it's being rendered at high speeds, where we aren't looking anyways. Wait until a few more games come out, then we will know if it's got a lot of potential or not.
I'm betting, if done right, it'll make a few heads turn.
 
Eh, I don't think tessellation is ever going to make heads turn, to be honest. It isn't really something that ever stands out, especially if you don't know what to look at.

All people will ever notice is a frame rate drop and wonder why.

And considering the resolution on most people's monitors is 1280x1024 or 1680x1050, it's even less significant.

So the fact that nVidia says they can do it "better" than ATI means almost nothing. Problem is, the DX11 standard doesn't really leave either of them much room to stand out at all, so things like "we tessellate better" are going to be pretty commonplace with DX11.

Either that or ATI/nVidia are going to have to start busting out 15-monitor wrap-around setups, with 3D and their own integrated high-definition 24-bit/192kHz positional sound outputs, that also do some tessellation 😛. Man, that would be cool......

EDIT: Just found this....
http://www.techeye.net/chips/nvidia-beats-amd-even-without-fermi

Looks like ATI hasn't gained any market share with their 5800s, or even in the last year; Intel has taken 5% from nVidia, though, through integrated graphics solutions... interesting, very interesting.

Intel is still a giant I guess..... wait, they own over 50% market share now... oh crap, Intel is slowly taking over the industry... goodbye, cheap computing.

"that’s even with the temptations of EyeFinity and DX11. Perhaps people don’t care about flashy features as much as AMD thinks they should." - Told ya multi-monitor gaming doesn't matter yet 😛 Only to extreme enthusiasts with lots of money (I'll admit I still want it though >.<).
 


How many people play games on movie screens 😛 Not many.



... I'm eating my strawberry yogurt with a spoon right now............. wtf you talkin' 'bout Willis?
 

If we could, we would. Imagine characters as big as us? I'd hate to see the AA levels; even 32xAA wouldn't cut it.
 


Nah, if we could, SOME people would... like 1% of the population, because it would probably cost you your life just to use it one time.

And it wouldn't be called anti-aliasing anymore. It would be called "omgwtfbbq-anti-aliasing" and would take at least 256 ATI or nVidia cards CrossFired/SLI'd together, industrial-grade hovercraft turbines to cool them all, and a small nuclear reactor to power them.

I can only dream of such things 😛

Btw, you're also kind of limited in a house by how big your house is. You can't have a theatre-sized screen in a normal house now, could ya 😛 And the expense to build a house for that.... would be enormous.

So yeah, a lot of people would do multi-monitor right now if it worked at more than like 5fps and didn't cost $700 for a card and like $200x3 for the monitors. You could build a whole new computer with that money, a damn nice computer....
 


If the house will limit your omfgwtfbbq experience I bet there will be nVidia, ATI approved houses on the market :sol:
 


The image in my head of a 5 story residential house with gigantic "SLI Ready" and "TWIMTBP" stickers on the front of it just made me cry from laughing so hard....

I bet you those houses will also have a giant and useless plastic air duct for Fermi cards 😛, which will also serve as the heating for the house and as the toaster/oven/microwave.... oh and the BBQ as well...
 


Then a fanboy will never be a true fanboy if he doesn't have the TWIMTBP house or the RED batmobile ATI residence. That might reduce clutter in forum threads to be honest 😀
 
There will be dozens of divorce complaints citing EyeInsanity as a reason.
Quote:
"I came home from work and my husband had spent two months of our mortgage money on 3 computer screens and a bright red steering wheel. He seemed manic and insisted the kids and I watch as he repeatedly tried to get a game going on this THING. He didn't come to bed for the next week, etc......."
The husband's counterclaim:
Best purchase EVER!
 