nForce 4


georgebeee

Distinguished
Dec 11, 2003
404
0
18,780
I just got a 3-piece sound set from Yamaha, which does surprisingly well with a SoundBlaster Live! 5.1.

But I have to agree with Grape's assessment of the airflow. The power usage is also something I prefer (I feel the same way about the heat and power draw of my graphics card).

So, with the reading just provided to me, it's VIA all the way unless things change. A 6600GT for me when I build a new system, and then a second one in an SLI solution later, instead of buying a power-hungry, heat-producing solution.

But we'll have to see; that's just my thought of the moment. Time will tell.

I prefer to buy my products after they've been out a few months: I don't have to pay the extra premium (which I don't have to begin with), they've been tested by the enthusiasts already (many thanks), and the drivers have been improved and tested.

Well, that's just my 1 1/2 cents' worth, which is all I've got at the moment anyway.
 

Johanthegnarler

Distinguished
Nov 24, 2003
895
0
18,980
Hah. I pretty much figured there would be no AGP support with an SLI board. But I'm definitely wishing.

I would really like to upgrade to an AMD64 solution, but where I'm at right now it isn't really worth it. I've used 3 machines I've built for others now and damn, it's so much nicer. I can actually see how much smoother the 3D graphics are with AMD64 chips. Well... in comparison to my 2500@3200 Barton.

I really hope DDR1 can stay somewhat mainstream, but we know Intel's influence on the market. This complete hardware change every year is getting bad. There's no such thing as upgrading anymore, really. But then again... AMD somewhat made DDR1 mainstream first, while Intel was going RDRAM.

<A HREF="http://arc.aquamark3.com/arc/arc_view.php?run=277124623" target="_new">http://arc.aquamark3.com/arc/arc_view.php?run=277124623</A>
46,510, movin' on up. 48k new goal. Maybe not... :/
 

eden

Champion
I'm curious about something I just realized: if SLI is nVidia's offering, what exactly is VIA offering with its dual PCIe claim? How did they go about that technology?

<i>This may only impact Workstation solutions, but it would be interesting to look at from a Quadro SLI + ProApp standpoint. I think for most of us here it will have little impact at all.</i>
Actually, why would it help? If both cards are giving their maximum and HAVE to be synchronized, there isn't any way one can go further than the other if both run at max; and if not, then synchronizing so one can give out more once both have finished their job wouldn't work either, to my knowledge.
<i>As for the rest, the performance figures as noted by [H] are not impressive, and as such the things that would interest me in the NF4 over VIA, SiS and ATI's solutions are slowly being dropped one by one. The RAID support is interesting, but we'll have to see its impact too before deciding.</i>
Agreed. The performance is more disappointing than anything. It is overall weaker than the VIA, from what I've seen, and when it DOES do better, it's by a very small margin. Really, where did the old dual-channel optimizations go, nVidia? Why did you put so much emphasis on firewall processor-usage optimizations instead of making the chipset itself a better alternative to the NF3?
I like nForce, honestly; it's a great concept and is attractive, but the fact is, nForce2 was what defined the series. It had major performance to bring. It WAS the reason behind the sudden competition from AMD with the launch of the AthlonXP 2800+. (It never was about the dual channel itself, but the optimization enabled by using it, kinda like the extra 8 GPRs in AMD64 which don't necessarily relate to 64-bit.) And the features were rich too. NF3 advanced that, but really, the VIA solution has been immensely competitive, and overall the features on that chipset were also just as competitive, save for the audio everyone is touting.

<i>Right now VIA's my top contender, despite Spitfire and Crashman's obvious misgivings (which I respect, but which do not reflect my experiences with VIA). Gigabyte and VIA are my current solution and work well, and if they are otherwise the same, unlike Spitfire, I'd pick VIA over nV.</i>
My sentiments exactly. I don't see the fuss about VIA anymore.
I would actually ask the anti-VIA people here: have you actually found VIA to have been very buggy recently, or as bad as during the KT133 days? In fact, have we even heard that many stability complaints against the K8T800 Pro?

--
<font color=blue>Ede</font color=blue>
 
SLI doesn't appear to be just for nV mobos (wise, since restricting it would just limit their options); SLI is for selling cards more than mobos, IMO. And I'm not sure how much proprietary tweaking is required for SLI on a mobo; perhaps it just needs two fully functioning PCIe graphics slots (even if 8x).

SLI with Quadros would help with the very large textures, and it's the only place I would expect to see differences now. Many of these apps are also geared for scalability even to the extent of large arrays. Really if we would see significant difference in 8X SLI versus 16x SLI I would expect it there if anywhere.

But that's really a guess right now; there have been no tests so far to show anything, let alone limitations, bottlenecks or differences. Personally I still think Tetris is the one thing that will push this solution off the edge. Spinning that _|_ shaped piece is gonna drive that thing nuts! :wink:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
One of my friends has a KT333; it works OK. Recently another friend bought a KM266 mobo, and it's terrible. I'm not talking about performance or IGP performance: not all 4-in-1 drivers work well, IDE performance is poor, overall responsiveness is crappy, etc. etc.

But all the nForce(1) and nForce2 boards that my friends and I used were great.


------------
<A HREF="http://www.foood.net" target="_new">FOOOD's Icons</A>

<A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig</A> & <A HREF="http://geocities.com/spitfire_x86/benchmark.html" target="_new">3DMark score</A>
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
BTW, when you install VIA drivers, you have to install the 4-in-1, audio, LAN and USB 2.0 drivers separately.

nForce = all drivers in one unified package.

------------
<A HREF="http://www.foood.net" target="_new">FOOOD's Icons</A>

<A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig</A> & <A HREF="http://geocities.com/spitfire_x86/benchmark.html" target="_new">3DMark score</A>
 
Interesting pics/specs of IWILL's solution: they show the CK8-04 (nF4) having 2 x 16x PCIe (it also includes pics/specs of the Intel solution):

<A HREF="http://www.iwill.net/sppage/2004_10/2004_10.htm" target="_new">http://www.iwill.net/sppage/2004_10/2004_10.htm</A>

I think it's a typo based on the early info about the nF4 boards, but it would be interesting to see if there is a workstation-class board that does have 2 x 16x support while the gamer boards only have 2 x 8x, as that would somewhat indicate a feeling that workstations will get more use from the higher throughput.

I still think it's a typo, and that it's two slots that accept 16x cards, which likely run at 8x when used in an SLI configuration.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

eden

Champion
<i>SLI with Quadros would help with the very large textures, and it's the only place I would expect to see differences now. Many of these apps are also geared for scalability even to the extent of large arrays. Really if we would see significant difference in 8X SLI versus 16x SLI I would expect it there if anywhere.</i>
Oh, that's what you meant. I thought at first you were saying that SLI with one slot at 16X and the other at 8X would make a big difference when Quadros are used. I was wondering how you could even get more out of one slot than the other, given that this is a synchronized rendering setup.

<i>perhaps it just needs two fully functioning PCIe graphics slots (even if 8x),</i>
That would kinda discredit the claim that nVidia spent years making SLI.

--
<font color=blue>Ede</font color=blue>
 

eden

Champion
I didn't ask you about old VIA chipsets, I said RECENTLY; in fact, I even mentioned the K8T800 Pro.

That's why I set the past aside when speaking about VIA as it is now, if I wanted to buy from them. I asked to see who could show me that the VIA stigma remains in the new stuff.

--
<font color=blue>Ede</font color=blue>
 

eden

Champion
<i>I still think it's a Typo, and that it's two slots that accept 16x and likely they too run at 8x when used in SLI configuration.</i>
I'm curious what the purpose would be in dropping both slots' speeds to 8X, really?

--
<font color=blue>Ede</font color=blue>
 
<i>I'm curious as to know what would be the purpose in dropping both slots' speeds to 8X, really?</i>
Well, there are a limited number of PCIe lanes, usually 20-24 in most mobos/chipsets, and twin 16x slots would saturate even heartier solutions. People have been saying that the nForce4 has 32 lanes, but that's a false assumption based on the 2 x 16x claim (leaving nothing for the other 1x slots and the southbridge?). The nF4 is 20 + 2, like just about every other one out there. They need to do the 2 x 8x to fit within that 20: 16 for the SLI and 4 x 1x for other items.
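That lane budget is easy to sanity-check with a little arithmetic. A toy sketch (purely illustrative; the 20-lane figure is from nVidia's spec sheet, while the slot splits are just this thread's reasoning, not an official allocation):

```python
# Toy check of the nForce4 PCIe lane budget discussed above.
# Assumption: 20 general-purpose lanes, as nVidia's spec sheet states.
TOTAL_LANES = 20

def fits(slot_widths):
    """True if the requested slot widths fit within the lane budget."""
    return sum(slot_widths) <= TOTAL_LANES

# Two full 16x slots would need 32 lanes -- doesn't fit:
print(fits([16, 16]))            # False

# SLI as 2 x 8x plus four 1x slots uses exactly 20 lanes:
print(fits([8, 8, 1, 1, 1, 1]))  # True
```

Which is exactly why "32 lanes" can't be right on a 20-lane chip, and why both slots drop to 8x in SLI mode.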

Here's a <A HREF="http://images.anandtech.com/reviews/chipsets/nvidia/ck804/nvidia_sli.gif" target="_new">PIC of the nV solution</A> which uses a bridge chip to switch between slots, and only the bridge chip gets the full 16X.

Which is why I think the ad is a typo, but it would be interesting if they used a separate solution for workstations to maximize performance.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

eden

Champion
The number of lanes, though: is that up to the motherboard makers, or the chipset makers' decision?

It sounds like IWill is able to choose their own number there.

--
<font color=blue>Ede</font color=blue>
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
Nvidia responds to SoundStorm-related mail
<A HREF="http://www.theinquirer.net/?article=19265" target="_new">http://www.theinquirer.net/?article=19265</A>

:tongue: <A HREF="http://www.geocities.com/priyajeet/fing.jpg" target="_new"><i><font color=red>Very funny, Scotty.</font color=red><font color=blue> Now beam down my clothes.</font color=blue></i></A> :tongue:
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
So basically he says: "It's not there, but we haven't permanently ditched it".

They surely must be tweaking the design in preparation for a sound card launch.

I suspect the days of such a great sound solution coming for free with the motherboard are long gone. :frown:

---
Epox 8RDA+ V1.1 w/ Custom NB HS
XP1700+ @200x11 (~2.2Ghz), 1.55 Vcore
2x256Mb Corsair PC3200LL/1x512Mb Corsair XMS PC4000 2.5-3-3-7
Sapphire 9800Pro (VGA Silencer Rev3) @400/730
 

c0d1f1ed

Distinguished
Apr 14, 2003
266
0
18,780
<i>Those aren't benchmarks, that's nV PR. I mean actual in someone's hands, no partial precision, no 'special drivers' benchmarks. Would anyone trust ATI's Supa-Doopa R480 results if they said it gets 10,000 in 3Dmk05? Of course not, so why is SLI any different, probably because we've seen working mock-ups, but no real benchmarks, and therefore people will believe anything.</i>
NVIDIA's SLI can give you 32 pixel pipelines. ATI is limited to 16. That's not a matter of belief, it's a fact. It allows you to double the resolution and get the same performance as with one card. Staying at the same resolution doesn't really double performance, because most vertex processing work is done by both cards. That's more efficient than each doing half of the work and sending the results over the bus.
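That last point about duplicated vertex work can be put into a toy scaling model (numbers are made-up illustrations, not measurements; it just assumes a frame splits into vertex and pixel time, with the vertex work repeated on both GPUs under split-frame rendering):

```python
# Toy model of split-frame SLI: each card redoes the vertex work but
# shades only half of the pixels. Numbers are made up for illustration.
def sli_speedup(vertex_ms, pixel_ms):
    single = vertex_ms + pixel_ms      # one card does everything
    sli = vertex_ms + pixel_ms / 2.0   # vertex work is duplicated
    return single / sli

# Pixel-shader-bound frame: scaling approaches 2x.
print(round(sli_speedup(vertex_ms=1.0, pixel_ms=19.0), 2))   # 1.9

# Vertex-heavy frame: well short of 2x.
print(round(sli_speedup(vertex_ms=10.0, pixel_ms=10.0), 2))  # 1.33
```

So same-resolution speedup depends heavily on where a game's bottleneck sits, which matches the argument above.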

In this perspective, I think their results with the current beta drivers are impressive. And I'm sure it's not just PR. They won't risk getting everybody excited and then disappointing them with benchmarks done by other reviewers. On the contrary, I expect them to do even better once the SLI drivers mature...
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
No, it's not 32. It's 16 for each portion of the screen. I will only call it 32 if the whole screen gets 32. It would be 32 if the cards were running serially; parallel doesn't signify doubling.

:tongue: <A HREF="http://www.geocities.com/priyajeet/fing.jpg" target="_new"><i><font color=red>Very funny, Scotty.</font color=red><font color=blue> Now beam down my clothes.</font color=blue></i></A> :tongue: <P ID="edit"><FONT SIZE=-1><EM>Edited by priyajeet on 10/25/04 02:28 PM.</EM></FONT></P>
 
<i>NVIDIA's SLI can give you 32 pixel pipelines. ATI is limited to 16. That's not a matter of belief, it's a fact.</i>

That's not a FACT, that's FUD/BS: <A HREF="http://www.darkcrow.co.kr/News/News_Content.asp?board_idx=2657" target="_new">ATI can do XxX = 32 also</A> (and has been able to for LONGER than the PCIe SLI engineering samples). But still, it's not the same; as priyajeet pointed out above, those are theoretical pipes. Even if it were 'true 32', it still wouldn't sanctify their PR: these are their own tests, not independent tests. So it remains PR. They didn't even trust the reviewers enough to invite them to run the tests with them.

<i>It allows you to double the resolution and get the same performance as with one card.</i>
Well, when is it ever a doubling of resolution? I can't think of any current 'standard' resolutions that achieve doubling (the closest would be going from 1280x1024 to 1920x1440; even going from 1024x768 to 1280x1024 is undershooting, and 1600x1200 is overshooting). This also assumes that you will get a near-100% increase by adding the extra card, and even nV doesn't expect that kind of increase, regardless of whether it's vertex-shader-heavy or pixel-shader-heavy titles.
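A quick pixel-count check bears this out (the modes are the ones named above; the helper is just for illustration):

```python
# Compare raw pixel counts to see which mode changes come close to 2x.
def ratio(base, target):
    """How many times more pixels `target` has than `base` ((w, h) tuples)."""
    return (target[0] * target[1]) / (base[0] * base[1])

# From 1280x1024, 1920x1440 is the closest common mode to a doubling:
print(round(ratio((1280, 1024), (1920, 1440)), 2))  # 2.11

# From 1024x768, the neighbouring modes under- and over-shoot 2x:
print(round(ratio((1024, 768), (1280, 1024)), 2))   # 1.67
print(round(ratio((1024, 768), (1600, 1200)), 2))   # 2.44
```

None of the standard mode jumps lands on an exact 2x, so "double the resolution at the same performance" is hard to even test cleanly.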

<i>And I'm sure it's not just PR.</i>
And I'm not. Why not allow others to play with their cards? Anyone taking these numbers at face value is naive.
As FiringSquad laments: <i>"Unfortunately, we don't have an nForce4 SLI motherboard to test with just yet</i>, but <b>we were given the following numbers by NVIDIA</b>:"

I don't doubt they are <i>potentially</i> valid in some justified/sanitized/'playing by our rules' kind of way, but I'd prefer to get figures from someone who is considered objective (like FiringSquad), and that would exclude ATI, nVidia, XGI, etc. Heck, XGI is a perfect example of how factual-but-limited, PR-filtered numbers don't reflect reality: the Volari V8 DUO has MASSIVE fillrate potential, which is confirmed by results in selected benchmarks, yet the card sucks in actual games. Awesomegra!

<i>They won't risk getting everybody excited and then disappointing them with benchmarks done by other reviewers.</i>
Where have you been for the last year and a half? Both ATI and nV have paper-launched stuff at each other, and you think they are going to care about being exposed by reviewers? Hah! nV and its launch partner not too long ago produced literature trying to say that the FX 5200 and 5500 were superior choices for gamers, based on one benchmark. Sure, sure, they wouldn't take an opportunity to spread FUD; they're far too noble for that. Ha, right!

<i>On the contrary, I expect them to do even better once the SLI drivers mature...</i>
Could be; however, I won't take their word for it. I'll wait for someone who doesn't have a history of fudging their numbers for PR purposes.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
Well, there is also a limit based on what the chipset can handle. The basic nForce4 layout is 20 lanes according to <A HREF="http://www.nvidia.com/page/pg_20041015917263.html" target="_new">their spec sheet</A>, which really is no different from the competition (probably 20+2).

Now that doesn't mean there couldn't be a workstation version that is different, but if it is based on the same chip, then it's 20 lane too.

Tumwater seems to be the powerhouse here with 24 full PCIe lanes, which IIRC are divided up into 3 sets of 8: 8 dedicated to each of the two PCIe graphics slots, and 8 used for peripherals, be they PCIe or PCI/PCI-X (using a bridge).

Still, Tumwater is not a viable solution for us, and the way its lanes are controlled seems to give it little advantage over the nF4 SLI, whose major advantage may be that SLI bridge that does the task of splitting/joining the information to/from the cards. And that advantage may be significant, especially if it saves even more CPU cycles.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

eden

Champion
Well, the only time I was amazed at nVidia's honesty in their PR benches was the 6600GT preview performance for DOOM 3. The results were exactly on target, if not slightly better. I never thought their PR would live up to its claims.

--
<font color=blue>Ede</font color=blue>
 
True, but then again, that was DOOM3.

I just don't trust any of the companies. They have too much money riding on this to be 100% open about things. Not that they are always lying, but they will selectively pick and choose what shows them in the best light. You might say, "Well, duh! We all do that to some extent!" True, but it's simply wise to keep that in mind.

I expect the stock GF6800U SLI combo to blow the stock X800XTPE-F'N'EH away, and even the OC'd cards to blow an OC'd X-etc away, but I'm still not simply going to accept nV's word for it. Why are they so eager to give us performance figures so early (figures they kept so secret for the original NV40), yet so unwilling to give the reviewers a crack at testing the cards? It just sounds like they are trying to manipulate the news by selectively releasing the good news while making sure no one finds out the whole story. Even if it's something as small as there being a glitch in one aspect or game, it's obviously enough to keep control of the testing. It reminds me of war coverage: show all the good stuff, but don't you dare show anything that might not make us look good.

Let's also say that the track record of FUD for this generation (from all sides, including XGI), and even the last one to a great extent, isn't something that engenders respect for company-sponsored info.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

eden

Champion
Good grief, Ape, I really WASN'T telling you otherwise! I was agreeing, and just mentioning a side note about something rather surprising from nVidia!

:eek:

I feel sad now, good bye, I'll understand if you don't want me! No no, don't cry!

:lol:

--
<font color=blue>Ede</font color=blue>
 
Funny, re-reading that, it seems a lot edgier after your comments than intended. My original intent was saying: damn right, and $crew 'em all!

It just really bothers me, all this paper from both sides.

The funny thing is, most mfrs are hitting the numbers THEY give us; it's the things we don't know about that disappoint us. And really, that goes for nV, ATI, AMD, Intel, Creative, etc.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: