NVIDIA GTX 350

Well, you know they don't want to relinquish the crown. And a 260x2 at 55nm could work. But here's the thing: do we really know the full potential of the 4870X2? Sure, we got previews with inklings as to how it'd do, but I still think there's more to it. ATI said it'd be 15% faster, and for the most part it was, but in a lot of areas it wasn't, so I'm still expecting more before launch. The question is, will nVidia be able to do anything about it with an even higher hill to climb?
 

jcorqian

Distinguished
May 7, 2008
nottheking:

At the rate nVidia's going, I'd wager that a lot of us enthusiasts might happen to know nVidia better than nVidia's own people. :pt1cable: (I'd note that a number of engineers are among the enthusiasts; while not working as an engineer, I have a bit of education as one.)

Personally, I think this is a very misguided statement. While it may be true that a number of engineers are among the enthusiast crowd (I, myself, am currently working as an engineer for John Deere over the summer before going back to school), that does not mean we could possibly know more about Nvidia's products than they do. We simply do not have enough firsthand knowledge. We can get information from various sources, and we can speculate all we want, but that certainly does not qualify us as "knowing enough" to dictate what Nvidia should or should not do better than Nvidia themselves. Both Nvidia and ATI are great and clever companies; they have to be to last in their cutthroat industry. To say that a lot of enthusiasts know what they are doing better than Nvidia or ATI is simply doing the companies and their people a disservice.

I'm not trying to say that what a lot of people are saying on these boards is wrong or misguided. Plenty of people are knowledgeable and have offered great opinions and insights. That doesn't make these people more qualified to make Nvidia's decisions than Nvidia's own people.
 

spathotan

Distinguished
Nov 16, 2007
It doesn't make sense to push out a revision so quickly, at least not to an average consumer. The G200 would be COMPLETELY for naught - a total waste of time, development, money, resources, EVERYTHING. Not to mention completely pissing on the fanboys and consumers by making the cards they paid $600 for obsolete.

They are going to take a massive hit in the bank with this whole thing. It would be better to just set the cards at X amount and ride the storm out; there is no way they are gonna beat ATI in the price-performance department this round.
 

one-shot

Distinguished
Jan 13, 2006
Here is a little electrical theory - Ohm's Law: I = current, E = voltage, P = power in watts. Most household circuits are 15 A; 15 A * 120 V = 1800 VA, or 1800 watts, which is the max for a single bedroom/main-room/bathroom circuit without tripping the breaker. I've seen PSUs in the 1.4 kW / 1400 watt range, and that is ridiculous.

If you are running 400 watts idle at 8 cents per kWh on a 24/7 rig, that adds up to about 3.2 cents/hr; 30 days/month * 24 hrs = 720 hrs, and 720 * $0.032 = about $23/month, or roughly $276/year. That doesn't sound too bad, and even on load for 6 hours a day you might run up to $25 - $30 a month, or $300 - $360 a year. I double-checked my math a few times, but feel free to look it over. 400 watts idle is a lot, but considering the speculation it could be a reality. That's if you care about electricity; either way, a circuit can only handle so much before the breaker trips - rather than starting a fire.
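For anyone who wants to plug in their own wattage and electricity rate, here is a small Python sketch of the same arithmetic. The 400 W idle figure and the 8 cents/kWh rate come straight from the post above; the 700 W full-load figure in the mixed scenario is just an assumption for illustration.

# Electricity cost for a constant load; the defaults mirror the post above.
def monthly_cost(watts, price_per_kwh=0.08, hours_per_day=24, days=30):
    """Dollars spent running `watts` for `hours_per_day` hours over `days` days."""
    kwh = watts / 1000 * hours_per_day * days   # energy consumed in kWh
    return kwh * price_per_kwh

idle = monthly_cost(400)                        # 400 W idle, 24/7
print(f"Idle 24/7: ${idle:.2f}/month, ${idle * 12:.2f}/year")   # ~$23/month, ~$276/year

# Mixed use: 6 hours/day at an assumed 700 W load, 18 hours/day at 400 W idle.
mixed = monthly_cost(700, hours_per_day=6) + monthly_cost(400, hours_per_day=18)
print(f"Mixed use: ${mixed:.2f}/month")         # lands in the $25 - $30 range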
 

nottheking

Distinguished
Jan 5, 2006

First-hand knowledge is overrated; it's mostly an excuse. If nVidia knew what they were doing, then why didn't they see that GT200 was going to be a lemon as far out as most enthusiasts seemed to? nVidia and ATi are not all-knowing; what is truly misguided is to believe that they know everything about GPU design and creation, because they do not. They only know enough to keep up the pace of producing new designs at a constant rate; you can look at their successes and failures and tell that it's VERY much a constant learning experience for them.

And yes, if you're learning something, that means that chances are that there's someone else out there that already knows what you're learning.


Since I believe that was directed at me, I'll shoot back with some elementary computer-construction knowledge, so you know that my comment on the maximum power you can ram into a video card has NOTHING to do with what a household circuit can handle. Rather, it's that those individual yellow cords coming from a PC's PSU can only handle roughly 2.1 amps of +12v current apiece (75w for a 6-pin PCIe connector, 100w for an 8-pin, with half the pins as grounds), because they are FAR thinner cords than what runs in a house. The electrical bottleneck lies not in what the building can supply, but in just how many cables you can run from the PSU.

Given that modern graphics cards are designed to be used in tandem with each other, you're going to need multiple cables to power all your cards. Already, if you use a pair of GTX 280s in SLi, you need 2 6-pin and 2 8-pin connectors. To the best of my knowledge, no existing power supply features more than 2 8-pin connectors, so right away, tri-SLi is out of the question unless you have two PSUs... and cases that can accommodate two PSUs are very, very rare when you can find them at all, and that's ignoring the extra price you'd have to pay for 'em (since we're continuing on the line of thought that money is the only limiter on computer design, period). If you go any higher, then you lose the ability to have SLi at all, because you only have enough 8-pin plugs for a single card. It's an open question whether it would even be possible to cram three power plugs onto a single card.
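To make the connector bookkeeping concrete, here is a rough Python sketch of the counting argument above. The per-card plug requirements (one 6-pin plus one 8-pin for a GTX 280) come from the post; the PSU connector counts are hypothetical examples, not the spec of any particular unit.

# Count the PCIe power plugs an SLI setup needs versus what a PSU offers.
CARD_PLUGS = {
    "GTX 280": {"6-pin": 1, "8-pin": 1},   # the 8+6 arrangement discussed above
    "GTX 260": {"6-pin": 2, "8-pin": 0},
}

def plugs_needed(card, count):
    """Total plugs of each type for `count` identical cards."""
    return {kind: n * count for kind, n in CARD_PLUGS[card].items()}

def psu_can_power(psu, need):
    """True if the PSU has enough connectors of every required type."""
    return all(psu.get(kind, 0) >= n for kind, n in need.items())

psu = {"6-pin": 4, "8-pin": 2}                 # hypothetical high-end PSU
for cards in (1, 2, 3):
    need = plugs_needed("GTX 280", cards)
    print(f"{cards}x GTX 280 needs {need} -> fits: {psu_can_power(psu, need)}")
# With only two 8-pin plugs on the PSU, the three-card case comes up short,
# which is the tri-SLi problem described in the post.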
 
Guest
^ yes... but you COULD have 2 6 pin and 1 8 pin in a card...

I think my PS comes with like 500 6 pin adapters lol
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
^ yes... but you COULD have 2 6 pin and 1 8 pin in a card...
Yeah, as I said, it was an open question as to whether a card could accommodate a third plug; at the very least it'd be inconvenient to route three separate bundles of cables to each video card. It's something I questioned when I saw, for instance, the GTX 280 take an 8+6 arrangement when that comes very close to the maximum for the setup, hardly leaving any room for increasing power (236/250w, or 94.4% capacity), as well as eliminating the chance for 3-way SLi under normal circumstances. Of course, as I understand it, a number of PSU designs don't actually have outright 8-pin connectors, but rather include a couple of extra 2-pin "breakaway" connectors that are used in conjunction with a 6-pin to accomplish the same thing, while allowing it to be used as a 6-pin plug when not needed.

Of course, I can't be entirely positive on this, since I've not taken a look at a whole lot of 8-pin PSUs... And not to mention, my 600w Tagan's old enough that it only has a single pair of 6-pin plugs. :p
 
Guest
They should make a 20 pin connector... like the one on the motherboard... then we don't have to deal with all these power requirements :kaola:
 

nottheking

Distinguished
Jan 5, 2006
Just remember, as a basic rule of thumb, as I mentioned: every yellow cable (the universal standard for those carrying +12v current is to color them bright yellow) can carry up to 25 watts of power. So that's 2 pins (1 for +12v and 1 for ground) for every 25 watts. A 20-pin connector, if it were all allocated to +12v pairs, would carry approximately 250 watts of power, bringing the total for a card up to 325w.
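Here is a quick Python sketch of that rule of thumb; it simply follows the post's estimate of roughly 25 watts per +12v/ground pin pair, plus the 75 watts a PCIe x16 slot supplies on its own. This is the post's per-wire model, not the official PCIe connector ratings.

# The post's rule of thumb: each +12v/ground pin pair carries ~25 W.
WATTS_PER_PAIR = 25    # ~2.1 A at +12 V on each yellow wire, per the post
SLOT_POWER = 75        # watts a PCIe x16 slot can deliver by itself

def connector_watts(total_pins):
    """Power for a connector where half the pins are +12v and half are grounds."""
    return (total_pins // 2) * WATTS_PER_PAIR

for pins in (6, 8, 20):
    print(f"{pins}-pin connector: ~{connector_watts(pins)} W")
# 6-pin -> 75 W, 8-pin -> 100 W, and the hypothetical 20-pin -> 250 W,
# which together with slot power gives the 325 W per-card total above.
print("Card with one 20-pin plug:", connector_watts(20) + SLOT_POWER, "W")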

Of course, as spathotan hinted at, there's little reason to make a dedicated, monolithic plug for that when you can accomplish the same thing with multiple smaller plugs. Plus, it would save money: mid-range cards that only call for a single 6-pin plug are produced in MUCH higher volume than the GTX 200 series, so sticking with smaller plugs would shave a small chunk off the assembly cost of each card and, likewise, off the cost of each PSU, since they wouldn't have to include a plug that maybe a couple percent of the people buying the PSU would ever use.
 

dos1986

Distinguished
Mar 21, 2006
Latest news

Leaked ForceWare 177.66

NVIDIA_GT200.DEV_05E0.1 = "NVIDIA GeForce GT200-400"

NVIDIA_GT200.DEV_05FD.1 = "NVIDIA GT200-875-GL"

NVIDIA_GT200.DEV_05FE.1 = "NVIDIA GT200-850-GL"

Basically, all these cards will be faster than the GTX 280; the 875 will be dual-GPU, based on 55nm.

The GT200-400 will be the slowest; it will be a GTX 280 on 55nm with much higher clocks - shrinking the die will make that possible.

 

So they're going to have an X2 Quadro? And you knew this, yet you posted that ^? Not sure where you're coming from.
 

dos1986

Distinguished
Mar 21, 2006
John, I know as much about its architecture and so on as you do :D

For the first time in the last 9 years that I have been gaming heavily, I have heard a bit of inside info. I know it's dual-based and its performance is higher than ATI's competitor (not by a huge amount, but enough), and it will eat Crysis for breakfast @ 1080p.

Think playable on one single slot card @ 2560x1600 VERY HIGH
 
Common sense tells me that a 4870X2 - and we still haven't seen how good it's really going to be - can't eat Crysis for lunch at 12x10, so if this barely beats whatever the real scores of the 4870X2 turn out to be, and the 4870X2 can't do it, what makes you think this can? I still don't get it. I can read. I also know some of the people predicting how good this card is also said the G200s would kill the R700s, which hasn't been the case. Patience is one thing; talking like they know something after they fail is something else. That same person wouldn't release what they found with the AC benches they made, DX10 vs DX10.1, using a 3xxx series card, and then they claim they're not biased. I'd take it slow with info heard at this point, since, like I've said, we really don't know how good the 4870X2 will be.
 

jcorqian

Distinguished
May 7, 2008
"First-hand knowledge is over-rated; it's mostly an excuse given out. If nVidia knew what they were doing, then how come they didn't see that GT200 was going to be a lemon anywhere near as far out as most enthusiasts seemed to realize? nVidia and ATi are not all-knowing; what is truly misguided is to believe that they know everything about GPU design and creation, because they do not. They only know enough to keep up the pace of producing new designs at a constant rate; you can look at their successes and failures and tell that it's VERY much a constant learning experience for them. "

@ nottheking:

So you are telling me that an enthusiast such as yourself (who, as you stated, is not working as an engineer) can know and understand as much about Nvidia's GPUs, and is in a better position to make decisions, than Nvidia's own engineers?

I never argued that Nvidia and ATI were "all-knowing"; I simply stated that their engineers know more about their GPUs than the vast majority of enthusiasts.

As for the GT200 being a lemon - is it really? If ATI's 4800 series weren't so amazingly good, no one would be bashing the GT200. Nvidia's folly was underestimating ATI and unappealing pricing, not necessarily designing a bad card.
 

ovaltineplease

Distinguished
May 9, 2008
Yeah, but JDJ, that's something of a silly statement, because 9800 GX2s microstutter all over the place if they are pressured, especially in quad SLI.

That alone made the GTX 260 and GTX 280 a far superior solution to the 9800 GX2 - the minimum framerates were just much better and the gameplay was more consistent.

That's why just looking at "avg fps" will make a product seem to have either bloated performance or deflated standards.

All that said, I think the GTX 280 is still a disappointing card - but if they hadn't overpriced it, it wouldn't have been criticized nearly as much. The GTX 280 still delivered nearly twice the performance of an 8800 Ultra at the same AA settings, which is very good performance - but the pricing is just all wrong. You can get a GTX 260 or an AMD 4870 for $200 less than the GTX 280, and the performance of either card is within 5%-10% of the GTX 280.

That's what really killed the GTX 280 for me, anyway; it just lacks for its price point. I could've bought them instead of the GTX 260s, but they are just not worth it, as the performance increase is too small, especially in a dual-GPU comparison.

The 4870X2 is going to be an excellent card, but the best thing about this solution is the AA performance and the QuadFire scaling. AMD really hit it off with this, and I think Nvidia should go back to the drawing board if they want to best AMD.
 
I never said MS wasn't an issue with the X2, nor would I say the GTX 280 is a bad card. Actually it's a great card - just not at the price it launched at, nor the price it's currently at. I agree that the pricing was part of the disappointment, not ATI's response; that came later and made it worse. Look, I'm looking forward to this "x2 killer" from nVidia, just as I'm looking forward to the final X2 performance, once we see something other than ES samples on beta drivers.
 

dos1986

Distinguished
Mar 21, 2006
Across the board its performance is a bit higher than the HD 4870X2, John.

The 4870X2 is a hell of a card too; its performance figures for Crysis are not a measure of how good the card is, as ATI are supposedly working very heavily on optimization for that title.

Results so far in that game, and a few others to a lesser degree, are in no way an indication of how good it is - hence Nvidia releasing a super card.

Crysis on the 4870X2 will be at GTX 280 SLI level at launch; at the moment it's at 9800 GTX SLI level. From what I have heard, the GTX 200 X2 will run Crysis a good bit past GTX 280 SLI level, and thus a lot faster than the 4870X2.
 
Guest
Now the prices on the GTX 200 series are great... GTX 260s for under $300, and you can get GTX 280s for $420 on Newegg.
 