immagikman

Distinguished
Sep 29, 2006
264
0
18,780
WTF? Why would Intel support Xfire but not SLI? So does this mean that we will have to wait for Nvidia to come up with a new chipset to really match the new Intel line-up? This seems like a huge gaffe to me.

Was wondering if someone could fill me in on what I am missing... I just do not get why Intel would stiff such a large market segment, especially by supporting a CPU competitor.
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
It is not Intel that is to blame for this, it is Nvidia. The X38 hardware is completely compatible with SLI, but Nvidia writes its drivers in such a way that they detect the presence or absence of Nvidia chips on the motherboard and, if there are none there, SLI is disabled. Nvidia figures they can thereby force you to buy an Nvidia-chipset motherboard if you want to use SLI and they can make more money out of you.

The question you should be asking is: "WTF? Why would Nvidia not allow SLI on Intel hardware?"
 

R0B0T

Distinguished
Oct 11, 2006
66
0
18,630
If the slots and bandwidth are there, couldn't someone (and I believe I've heard of this already for dual x8 platforms) write hacked drivers to allow SLI support on an Intel config?
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
Well, yes and no: hacked drivers are available for SLI on (say) the 975X platform, but they're not good hacked drivers. I'm not sure they even support G80-series cards at all (they certainly didn't for a long time, and may well still not). The hacking is also likely to introduce bugs, and they will be based on older driver versions so they won't have all of the latest bugfixes and performance improvements from Nvidia. Using hacked drivers is very much an "at your own risk" phenomenon - I wouldn't want to take that risk myself.

On the plus side, if the rumours are correct, RV670 vs G92 will be a much more equal struggle than R600 vs G80 was: performance similar this time round and ATI's chip possibly being cheaper.

So the inability to support SLI may only matter to people who already own a pair of Nvidia cards - other people may not have a problem with going Crossfire instead. In fact it's not impossible that Nvidia will actually lose money because of this decision: it's possible that there will be more X38 owners who don't buy Nvidia GPUs than there will be Nvidia GPU owners who don't buy X38 mb's. I really hope that happens!
 

sailer

Splendid
Nvidia seems to be going farther and farther out on a limb. They won't support Intel chipsets because Intel is their rival. They won't support AMD/ATI chipsets because AMD/ATI is their rival. One of these days, both Intel and AMD/ATI may decide they don't need Nvidia at all for their CPUs. Then Nvidia may find itself with video cards that only fit Nvidia chipsets, but Nvidia chipsets don't support CPUs from either of the two major companies.

Something similar to that happened to 3DFX many years ago, when 3DFX supported Glide only, while the rest of the world was going to Microsoft's DX. 3DFX went from being a premier video card company to being, well, has anyone seen any 3DFX video cards for sale recently? Nvidia needs to be careful, in my opinion.
 

gwolfman

Distinguished
Jan 31, 2007
782
0
18,980

I read some articles where some of the people testing the boards were able to run SLI, but they had special drivers and such just to test it; they mentioned that we probably won't see it as consumers.
 
It's a joke, isn't it. When Intel released C2D, the only mobo available for it was Intel's own, which they charged a premium for ($260+), and it didn't support SLI either. Makes you wonder why they need to gouge people like that, knowing that the majority of them will want to upgrade that overpriced mobo to SLI eventually.
Kinda funny how the Intel X38 supports AMD/ATI Crossfire, but not Nvidia SLI. I guess that dispels the rumor of Intel buying Nvidia.


MB - It has nothing to do with Intel not wanting to provide SLI. It's nVidia refusing to license SLI to Intel. Why? Because then they sell more nVidia chipsets to go with the video cards instead of merely collecting a smaller licensing fee. nVidia (probably rightly) are asserting their #1 performance position in the marketplace by refusing to play along with Intel. We can all be sure that Intel would rather pay someone *other* than AMD/ATI for the ability to run dual (or more) cards. However, Intel is taking the pragmatic approach of paying AMD/ATI for Crossfire rights because it's still cheaper than having to develop their own solution and thereby having to support nVidia *and* ATI cards by themselves.
 

Dogsnake

Distinguished
All this only matters to those running 2 video cards. For most builders on a budget of some type, they can choose the Intel X38 and have a fine system with 1 card, be it NV or ATI.

As to hacked drivers, I would say watch out. I have had a number of friends get burned when they had modded drivers running and had to do any type of XP re-install. The OS install just hangs up when it gets to the point of installing the drivers.
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
3dfx did a lot of things wrong, but an over-dependence on Glide was certainly not one of them. The games industry focused on Glide for a long time because 3dfx hardware was so much better than anybody else's that no serious gamer ever used anything else, and so there was no point in using an "open" API: the only effect would have been to make the games run slower. This situation persisted for quite some time, partly because the Unreal engine was so widely used and worked very well with Glide.

3dfx came unglued for a number of reasons, including:

- Rash involvement in the abortive "Voodoo Rush" project.
- Banshee underperforming in 3D when compared to Voodoo 2.
- Ill-advised decision to purchase STB (AIB manufacturer) and stop selling chips to anyone else.
- Delayed product releases.
- Misreading what consumers wanted. They did this first when Voodoo 3 didn't support 32-bit colour and then (much more seriously) when Voodoo 5 didn't support geometry acceleration.

It was actually missed deadlines that did the most damage. Voodoo 5 was originally intended to go against the original GeForce 256, and could have been a very serious rival to it: Nvidia cocked up their design and had to drop the clock speed from 200MHz to 120, and the geometry acceleration was too weak to be of any practical use. But, as it was, Voodoo 5 didn't ship until it was going up against GeForce 2, which was very powerful and had useful geometry acceleration, which people decided they wanted a lot more than they wanted a T-Buffer.
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
On the other hand, if you're not using 2 video cards, it may be hard to justify choosing X38 over P35.
 

sailer

Splendid


I realize that there were more problems with 3DFX than just Glide, but I mainly remember having 3DFX cards and finding fewer and fewer games that supported it. As I recall, and my memory may not be perfect, Sierra was the last game maker that supported Glide. Glide was great at the time, but it got left behind.

Most assuredly, the initial lack of support for 32-bit color and the later lack of geometry acceleration hurt as well, along with the high prices being asked. Those were what drove me to buying Nvidia cards. The point I was trying to make was not about Glide itself, but the possibility that Nvidia could dig itself into a hole similar to the one 3DFX did. For those, including myself, who only use one card, this is no problem of course. Then again, I'm saving for a bigger monitor which will need two cards, especially using Vista, so where does that leave me except in ATI's camp?
 

SiriusStarr

Distinguished
Oct 1, 2007
73
0
18,630


Hear hear... Nvidia, your chipsets suck. Give the people what they want!!!! :non:
 

sailer

Splendid


Exactly, and this is what nicolasb wrote in reference to 3DFX: "Misreading what consumers wanted". If a consumer, like myself, wants two cards and Nvidia won't allow SLI on Intel hardware, then I have to buy two ATI cards. It's all nice and good to stand on principle, but if they shoot themselves in the foot, it gets hard to stand.
 

dashbarron

Distinguished
Sep 9, 2007
187
0
18,680
No point beating around the bush that Nvidia is certainly doing much better in the GPU war right now. So, does Nvidia have a new chipset coming out soon? I'd hate to choose the X38 and not be able to use SLI with Nvidia. I want to get a new computer when Penryn comes out, so I'm stuck with the 680i unless something brand new from Nvidia comes out soon.

Suggestions from anyone? I won't be going dual graphics right away, but I'm going to have to choose my alliance by the new year or so.
 

immagikman

Distinguished
Sep 29, 2006
264
0
18,780
Well, the 680i isn't a bad board; my favorite builder uses them :)

More important to me, though, is 3 full x16 PCIe slots: two high-end cards in SLI and a third mid-range card for multi-monitor support.
I'll be running the SLI cards to a 40" 1920x1080 display and the third card supporting two 30" side monitors. I should be well set for years to come. :) I'll have to look at the specs of the 7xx series chips from Nvidia... ATI got on my bad side years ago and I just don't have an interest in their Crossfire product. (When I say years ago, Matrox was a big player in the video market... yes, I'm old.) Age and eyesight being what they are :) big monitors are more important to me than higher resolutions on smaller screens.
 

javimars

Distinguished
May 16, 2006
217
0
18,680


my hero :pt1cable:
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
What the hell are you talking about? Are you trying to suggest that 3dfx cards couldn't run Direct3D or OpenGL? Of course they could! And a great deal better than any other card could, at least up until the Riva TNT2 came out.
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
Yeah, but wait till the end of November, and things may look very different.

Yes, but it's pretty much exactly the same as the old one: the north bridge chip is identical.
 

sailer

Splendid


I know they could run Direct3D and OpenGL. Glide was simply considered better at one time. And they were a good card for a time. I owned a couple of them. Even did their version of SLI. But time moved on and they didn't. So I eventually bought Nvidia cards. And I've had a couple ATI cards in the past, when ATI was at the top of the heap.

My main wish is that both Nvidia and ATI could be used on the same motherboards so that a person didn't have to choose the motherboard based on which video card he was planning to buy, at least if he's planning on buying two cards. This proprietary stuff when it comes to using two cards is a total pain.
 

bydesign

Distinguished
Nov 2, 2006
724
0
18,980
Doesn't really matter; the X38 was DOA. No real advantages over the G35, it isn't any better an overclocker, and it costs more. Intel will be changing sockets for next-gen chips, and DDR3 is not worth the money. I waited on the X38 before buying a Blitz Formula; it has every feature under the sun and runs around $250. It will support a 500MHz FSB on air (chipset), and companies are almost giving away DDR2 RAM.

Besides, Intel was clear from the beginning that it would be Crossfire only. All the rumors about SLI support came from the usual sources. Let's see what happens next year.
 

sailer

Splendid
I read on TechPowerUp and a couple of other sites that an X48 chipset was being introduced in a couple of months, and that is one reason there have not been many X38 boards produced. For those who aren't in a hurry, the X48 might well be worth waiting for. From what I understand, it's mainly an updated X38, with a couple of the bugs fixed.
 

rubix_1011

Contributing Writer
Moderator
Something similar to that happened to 3DFX many years ago, when 3DFX supported Glide only, while the rest of the world was going to Microsoft's DX. 3DFX went from being a premier video card company to being, well, has anyone seen any 3DFX video cards for sale recently? Nvidia needs to be careful, in my opinion.

nVidia bought out 3dfx. Now does this make more sense?