[quotemsg=6572607,105,167461]I wonder if this forum is populated entirely by kids.
With few exceptions, I see childish comments everywhere, with no clue about the subject.
1) The gamer market is only a tiny part of the whole IT market. You may be surprised to learn that most of the income of the graphics board makers comes from the cheapest segment of their products. This is because, for each GTX or 3870x2 sold, they sell at least 100-1000 computers to the business world. And those do not need DirectX 10.x support to run Word or Excel (at least not yet, given the latest moves from M$).
The race for the fastest card is purely for marketing purposes, not for actually selling many of those parts.
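The revenue-mix point above can be sketched with some back-of-the-envelope arithmetic. The unit counts and prices below are invented purely for illustration (not real market data): even one enthusiast card per a few hundred cheap business GPUs leaves the cheap segment dominating income.

```python
# Hypothetical figures, made up to illustrate the argument only.
enthusiast_units, enthusiast_price = 1, 600    # one GTX/3870x2 sold...
business_units, business_price = 300, 40       # ...per ~100-1000 cheap business GPUs

enthusiast_rev = enthusiast_units * enthusiast_price
business_rev = business_units * business_price
total = enthusiast_rev + business_rev

print(f"enthusiast share: {enthusiast_rev / total:.0%}")  # a small slice
print(f"business share:   {business_rev / total:.0%}")    # the bulk of income
```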
2) The notebook market is as profitable as the desktop one, so nVidia is well positioned with its offering of high-performance integrated cards. They probably still offer the fastest notebook graphics cards, despite the chipset war they are suffering.
3) 2 slots, 2 CPUs, 1 lane, 2 fans, n sockets... huh? Fact is, the 3870x2 is a crude attempt at creating the fastest graphics card (for marketing and image reasons). The 3870x2 is faster only in games that support Crossfire; otherwise you get the same performance as the standard 3870 (while wasting double the power). When Crossfire is supported, this card is only marginally faster than the Ultra, which just shows how ridiculous the grab for the fastest-card crown is at the moment. Yet price has its importance. So, if you need SLI/Crossfire support anyway, I would buy 2x9600 in SLI for faster performance at a lower price than this single board with 2 GPUs. Two cheaper cards are no worse than one more expensive card if they do the same work and require the same support (SLI/Crossfire).
If you make the comparison seriously, you will see that the 3870 is a loser in terms of both price and performance. If you want to believe you spent your money on the right product, well, you certainly made a better move than going for the much more expensive 8800GTX Ultra.
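One way to make that comparison concrete is dollars per average frame per second. The prices and FPS figures below are hypothetical placeholders, not real benchmarks; substitute current prices and your own benchmark averages before drawing conclusions.

```python
# Hypothetical street prices (USD) and average FPS, for illustration only.
cards = {
    "8800GTX Ultra": {"price": 700, "fps": 60},
    "HD 3870x2":     {"price": 450, "fps": 62},
    "HD 3870":       {"price": 220, "fps": 45},
    "2x 9600GT SLI": {"price": 360, "fps": 65},
}

def dollars_per_fps(price, fps):
    """Lower is better: what each average frame per second costs."""
    return price / fps

for name, c in sorted(cards.items(),
                      key=lambda kv: dollars_per_fps(kv[1]["price"], kv[1]["fps"])):
    print(f"{name:15s} ${dollars_per_fps(c['price'], c['fps']):.2f} per fps")
```

With numbers like these, the SLI pair comes out ahead of both single-board flagships, which is the whole point of the argument above.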
No, I'm not an nVidia fanboy. I could not care less about who provides the best card (I'm out of the market anyway, with my puny GF6600 AGP8x). But "best" for me means performance at a decent price (and the Ultra is not decently priced). More professional cards are available if you want to spend a few thousand dollars.
Now, going a bit technical. Hoping for nVidia to run into trouble because of its competitors' marketing maneuvers is quite dumb. Even if it came true, it would just mean we wouldn't get the best possible product.
If nVidia is forced to cut its research investments due to financial problems, that won't make ATi/Intel products better in absolute terms. It will just flatten the curve toward the bottom. The consumer is the loser in the end.
I would really hope for ATi (and AMD on the CPU front) to come up with better products on their own. As it stands, nVidia can mess with the market at will: they could even release a faster card (8800GT 512) cheaper than their previously ultra-priced ones (8800GTS 320/640) without caring about the confusion that brings (yet those cards are still rare on the market, probably because they still have to clear out the now-obsolete GTS stock lying around).
Meanwhile, ATi can only hope to gain performance by moving to 55nm and raising the clock well above the previous version (removing the need for a nuclear power plant to feed the 2900XT). I wouldn't call that an R&D breakthrough, seeing that nVidia chips obtain better performance at lower clocks and are produced at 90nm (now 65nm, but the fastest is still 90nm).
Again, I'm not an nVidia fanboy. I'm a bit too old to be called a boy at all, BTW. I'm just describing what I currently see on the market. And I can foresee, as anyone who looks at the whole picture can, that nVidia still has not expressed its full potential.
The newly released 65nm chips are clocked just as fast as (or even a bit slower than) the old 90nm ones. So what is keeping nVidia from releasing an 8800GT Ultra card based on the G92 chip? Given what this chip delivers at stock clocks compared to the G80, such a card would outperform the current Ultra by a lot. It would also annihilate the 3870x2's tiny gains where it has any. The answer is that ATi is not much of a competitor right now, and nVidia has no wish to outperform and obsolete its own cards, which are still selling like hot cakes.
And this doesn't even account for how the new G94 performs in its heavily cut-down version: the one mounted on the 9600 has 30% fewer transistors than the G92 and performs almost as well. Given the extra transistors needed for ATi's R600 (which is the real record ATi can claim) and its successive variants, it is clear this is not state-of-the-art design.
More clock, more transistors (which mean production costs, even at 55nm), but less performance.
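The efficiency argument above can be framed as performance per transistor. The transistor counts below are the commonly quoted figures for these chips (G92 ~754M, G94 ~505M, R600 ~700M); the relative performance numbers are hypothetical placeholders for a real benchmark average, normalized so that G92 = 100.

```python
# Transistor counts are the publicly quoted figures; rel_perf values
# are made-up placeholders standing in for a real benchmark average.
chips = {
    "G92":  {"transistors_m": 754, "rel_perf": 100},
    "G94":  {"transistors_m": 505, "rel_perf": 90},   # ~30% fewer transistors
    "R600": {"transistors_m": 700, "rel_perf": 80},
}

efficiency = {}
for name, c in chips.items():
    # performance units per million transistors (higher = leaner design)
    efficiency[name] = c["rel_perf"] / c["transistors_m"]
    print(f"{name}: {efficiency[name]:.3f} perf units per million transistors")
```

Under these assumptions the cut-down G94 is the leanest design and R600 the least efficient, which is exactly the claim being made.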
Other details make me think ATi screwed up this development cycle: one was the attempt at a 512-bit memory bus on the first R600 models, which clearly shows they expected better performance at higher resolutions. Given the actual results, it was clearly a waste of money.
In the end I really hope ATi comes back with a new chip (R700?) that pushes nVidia to release something new in turn (I think that without ATi's 3xxx series we probably would never have seen nVidia's 9600 release so soon, at least). Otherwise we may be stuck with the 8800GTX Ultra as nVidia's top card for another whole year. That is not simply ATi's market problem; it becomes a problem for everyone, because prices won't drop any time soon (and they have already been flat for too long, IMHO).
Realistic as I am, I don't expect ATi to overthrow nVidia with its next chip (unless miracles happen), as I'm quite sure nVidia is keeping its war dogs well hidden, but I hope it will at least be able to stir the waters a bit again.
If you find this a bit long, don't read it at all. It contains only the bullshit of a probably too-old guy.

Huh? Looking at the benchmarks done on this very site (as well as on others), there are some modern games that do not support SLI/Crossfire, or support it very poorly. See how the 3870x2 is much slower than the Ultra in some games. That would not be the case if all games supported multiple GPUs properly, would it?
Likewise, there are some acclaimed games that do not scale with quad cores at all.
About the lack of the whole picture: I do not think Intel can create a GPU competitive with ATi's or nVidia's. Past experience has shown that Intel lacks the technical resources for that, and the interest too (seeing that most of its gains come from mainstream, cheap pieces of "old technology"). You may have tons of money, but if you don't have the know-how, you either have to invest a lot of it over a long time or buy people who already know. GPU development is a complex business that requires long-term investment with highly risky results: you don't know where you'll be at the end of development relative to the competitors, especially if you haven't been in the market before and don't acquire the right "spies".
Maybe lately many ATi engineers have fallen into Intel's hands. But I suspect Intel is probably just making a "simple" card with DX10 support for notebooks in order to counter nVidia. If so, Intel's entry into the market doesn't really change anything; it just keeps nVidia from expanding into high-end notebooks with an exclusive DX10 offering.
In the long run, nVidia will have problems with its desktop chipsets only if both AMD and Intel close their doors to third-party chipset development. And I suspect that is not something a truly free market can tolerate, despite each CPU company having its reasons for defending its own business. Intel is already in a legal storm, so it probably won't want to start a new one that would just confirm how badly it behaves toward competitors (though Intel may have the financial resources to pay for these actions later, once the competitors are crushed). I think AMD's worst problem is with CPUs, not GPUs, so they need all the help they can get to sell more of them. nVidia can help here, as it did in the past when SLI was available only for AMD CPUs, making them the best gaming platforms. You can still see plenty of AM2 and AM2+ motherboards with nVidia chipsets despite the latest AMD chipsets. It is also possible that nVidia will finally acquire the license to build Crossfire solutions, relieving AMD of the burden of creating its own chipsets for its CPUs.
If you want to speculate on the far future, there are lots of equally plausible endings based on nothing (or BS, if you prefer). What remains is that nVidia has proven quite good at marketing in the past (crushing 3DFX, 3DLabs, Matrox, S3, and a bunch of other well-established video and chipset manufacturers) and probably isn't sleeping at this moment. Still, we have no elements for speculating on this, as it is more a marketing question than a technical one.
As you can see, the whole interaction between these companies was thrown into disarray by AMD's acquisition of ATi, which broke the established equilibrium. And it is quite funny as well.
nVidia used to build chipsets exclusively for AMD, while Intel had ATi's support. Now AMD builds chipsets that support only ATi and sells them to Intel, which in turn sees nVidia providing SLI support for its CPUs. Intel probably doesn't want to support Crossfire, as that would raise its CPU competitor's income, and AMD no longer wants SLI support, as that would lower its graphics sales. However, AMD needs to sell CPUs; they know nVidia cards sell well and cannot afford to push nVidia buyers onto Intel-only platforms. Intel cannot let nVidia build too-good chipsets or it won't be able to sell its own (and I suspect this is the problem in the notebook market, where Intel can offer its own complete CPU/chipset/GPU set, which is not true on desktops).
Now, given these considerations, any small move can topple the castle to one side or the other. Since any scenario is possible, one could even suspect nVidia of buying VIA and starting to produce its own x86 CPUs, so it can offer complete CPU/chipset/GPU sets like the others. That would be interesting, indeed.
Anyone want to add more BS to mine?
[/quotemsg]
Damn!! Skip skip...