Nvidia's in trouble

Page 3 - Tom's Hardware community forum
Status
Not open for further replies.

Amiga500

Distinguished
Jul 3, 2007
631
0
18,980
[quotemsg=6572250,50,158300]idiot its called x2 because it has 2 processors incorporated on it .. comparing the 3870x2 with a single processing unit card is stupid.[/quotemsg]

And how many PCI-E slots does it take up on your MB you tool? :non:




Better to not speak and let people think you are stupid, than open your mouth and remove all doubt. :sarcastic:
 

hairycat101

Distinguished
Jul 7, 2007
895
0
18,980
[quotemsg=6572252,51,127900]And how many PCI-E slots does it take up on your MB you tool? :non:




Better to not speak and let people think you are stupid, than open your mouth and remove all doubt. :sarcastic:[/quotemsg]

Unfortunately none. I don't have one :pt1cable:
 

coltezeu

Distinguished
Dec 22, 2007
19
0
18,510
[quotemsg=6572252,51,127900]And how many PCI-E slots does it take up on your MB you tool? :non:




Better to not speak and let people think you are stupid, than open your mouth and remove all doubt. :sarcastic:[/quotemsg]
and how many pcie slots does the 8800 ultra take up ?

Your quote should fit nicely here -->
Better to not speak and let people think you are stupid, than open your mouth and remove all doubt.
You just removed mine... hope you feel proud :)
 

Amiga500

Distinguished
Jul 3, 2007
631
0
18,980
[quotemsg=6572255,53,158300]and how many pcie slots does the 8800 ultra take up ?
[/quotemsg]

The exact same as an X2.

(If you want to count the cooler size as 'slots' - the standard for both is 2, with certain manufacturers doing singles for both)


So, can you justify why:

"comparing the 3870x2 with a single processing unit card is stupid."


Of course you can't.


 

dev1se

Distinguished
Oct 8, 2007
483
0
18,780
[quotemsg=6572249,49,160672]

Lol... fl4m1n l33t n00b p0wnx0r hax0r fanb0yz[/quotemsg]


you talkin bout me there n00b :lol:
 

jakemo136

Distinguished
Jan 10, 2008
46
0
18,530
[quotemsg=6572261,56,118937]Good dual GPU based cards are out (for a while) and making a difference/or not........now lets wait for the Dual Core GPUs (2GPUs on 1die, cant imagine the power use!) to come out :D :D[/quotemsg]


That will be wicked awesome, especially since GPUs are already on a 55nm process.
 

andrei3333

Distinguished
Feb 1, 2008
97
0
18,630
Listen, game boy: why don't you judge the 9 series when it comes out, instead of basing your whole essay on assumptions... because you know what they say happens when you assume too much, don't you? (You make an A$$ out of yourself.)
 

FrozenGpu

Distinguished
Dec 8, 2007
986
0
18,990
[quotemsg=6572250,50,158300]idiot its called x2 because it has 2 processors incorporated on it .. comparing the 3870x2 with a single processing unit card is stupid.[/quotemsg]

What would you compare it to, when Nvidia doesn't have their GX2 counterpart ready?

idiota!

Obviously the Ultra, you moron... It doesn't even matter how great the X2 is, because it's still based on ATI's quasi-320 stream processors, which behave more like 64 stream processors, which is so lame... Nvidia's architecture is obviously better; it makes better use of their own hardware to produce better results in games (where it matters). ATI should be working more closely with game developers so that their hardware would accomplish the same end: better results in games!

Not to pick on anyone, but these arguments are pretty dumb; they accomplish so little. Just tell the truth, and that's all anybody needs to hear...
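The 320-vs-64 stream processor point above is really just arithmetic: the RV670's 320 ALUs are grouped into 64 five-wide VLIW units, so effective throughput depends on how many of the five slots per unit the shader compiler can fill. A rough sketch in Python (illustrative numbers only, not measured benchmarks):

```python
# Assumed layout (as widely reported for RV670): 320 "stream processors"
# are really 64 VLIW units, each 5 ALUs wide. Per-unit slot utilization
# decides how many of the 320 ALUs do useful work each cycle.

VLIW_UNITS = 64       # independent shader units
ALUS_PER_UNIT = 5     # 5-wide VLIW -> 64 * 5 = 320 "stream processors"

def effective_alus(avg_slots_filled: float) -> float:
    """ALUs doing useful work per cycle, given the average number of
    the 5 VLIW slots the compiler manages to fill."""
    return VLIW_UNITS * avg_slots_filled

print(effective_alus(5.0))  # perfect packing: all 320 ALUs busy
print(effective_alus(1.0))  # worst case: behaves like 64 scalar units
```

Purely scalar shader code sits near the bottom of that range, which is where the "more like 64 stream processors" jab comes from.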
 
[quotemsg=6572129,2,124383]i smell a fan boy!! flame him !![/quotemsg]How can you accuse him of being a fanboy when your avatar consists of you displaying an Nvidia card in front of your face? Tell me you are not an Nvidia fanboy.
 

kellytm3

Distinguished
Dec 2, 2007
122
0
18,680
Didn't Nvidia get named company of the year? AMD/ATI has never won that honor. How long has it taken AMD/ATI to make a card (barely a cod hair faster) than the Ultra? When the smoke clears it all boils down to market share, and Nvidia has a much larger share than ATI.
 

ZOldDude

Distinguished
Apr 22, 2006
1,251
1
19,280
"Secondly, the Geforce 9 series is just a rebadged Geforce 8 series."

Looks like you have no idea what you're talking about.
My G92 kicks ass at a low power/price (they sold for $258 three weeks ago).
I even returned an -unopened- 3870 for a refund.
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780
[quotemsg=6572157,16,154240]http://www.tomshardware.co.uk/ATI-HD-3870,review-30103-12.html

HD3870X2 (one card) > 8800Ultra (one card)[/quotemsg]

No, ATI's card is Crossfire on one PCB ("Crossfire on a stick") with two GPUs; hence the drivers that run the 3870X2 implement Crossfire in the drivers instead of the chipset. Don't think for a second that you can fool me with that one-card, one-card ********; you can't compare one card with two GPUs against one card with only one GPU. That's like comparing a single-core processor with a dual-core or quad-core processor.
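The dual-GPU-on-one-card argument can be made concrete: with alternate-frame rendering, two GPUs on one board rarely deliver a full 2x over a single GPU, because driver overhead and inter-frame dependencies eat into the gain. A minimal sketch with purely illustrative efficiency numbers (no measured data):

```python
# Illustrative sketch: a dual-GPU card using alternate-frame rendering
# adds a second GPU's worth of frames only at some scaling efficiency
# (1.0 would be a perfect doubling, which drivers rarely achieve).

def dual_gpu_fps(single_gpu_fps: float, scaling_efficiency: float) -> float:
    """Frame rate of two GPUs on one card; the second GPU contributes
    only scaling_efficiency of a full GPU's worth of extra frames."""
    return single_gpu_fps * (1 + scaling_efficiency)

print(dual_gpu_fps(60, 1.0))  # ideal scaling: a clean 2x
print(dual_gpu_fps(60, 0.7))  # a more typical ~70% gain over one GPU
```

Which is why a one-GPU card and a two-GPU card at the same price point can still be closer in real games than the spec sheets suggest.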


"What do you mean? I'm not being irrational, I'm stating what we know."

Are you thick headed?!? You're not stating facts with most of your statements; they're purely guesswork (fantasy) at best.

"Yes, Nvidia. Despite the 3x market cap advantage and the fact Nvidia's Geforce 9 series is coming out very soon, I believe they will be the losers over the next two years."

Your believing it doesn't make it fact; it makes it fiction. Go play with your Game Boy.
 

ZOldDude

Distinguished
Apr 22, 2006
1,251
1
19,280
[quotemsg=6572136,5,154240]No; the HD3870X2, a single card by any definition, beats a single 8800GTS-G92, 8800Ultra or 9600GT.[/quotemsg]
And it costs a truckload more money that I would rather have sitting in a CD earning interest.
A 3870 does not beat an 8800GTS G92... if you're going to pit GFX cards against one another, at least have them use the same number of GPU cores.

My 2.5-3 year old systems (seven on my LAN, in my profile) used 7900GTs... the new system I put together about three weeks ago uses a 6000+ ($111) and an 8800GTS 512 G92 (now selling for $258 and dropping) and -stock- scores just short of 12K in 3DMark06.

 

gomerpile

Distinguished
Feb 21, 2005
2,293
0
19,810
Price will be the deciding factor; however, looking at Nvidia's profit for this year shows that Nvidia will be able to maintain a price reduction for the performance. Nvidia's drivers have always been satisfactory and far superior to ATI's. Drivers won't be an issue, but there will always be problems with DX10.1 drivers; eventually the programs will have updates, games will have updates and graphics cards will have updates, and finally, after 200 updates, the cards' drivers will work.
Every ATI card I've known has had some major issues that still carry over to the next.
I would hold off on buying the Nvidia if the price is $600 and stay with what you have now, or a mid-range Nvidia or ATI. Next year GDDR5 and the PPU will be here. The GDDR5 white paper points to a huge boost.
 

ZOldDude

Distinguished
Apr 22, 2006
1,251
1
19,280
[quotemsg=6572277,64,128596]no my daddy can beat yours up 100 times lmao :pt1cable:[/quotemsg]
LOL!
I like the bumper sticker that says "My 8th grade slacker can kick your 8th grade Honor Roll student's ass!"
 

IndigoMoss

Distinguished
Nov 30, 2007
571
0
18,980
I totally agree with you on the chipset side of things. I'm so pissed that I can't run the 45nm processors. Do they really expect me to spend another $200 just to get that support with almost no new features? Come on, that's ridiculous. I should have just waited a few more months and bought an X38 board. Then I'd be able to Crossfire my 3850 512MB.

As far as the graphics card side is concerned, unless Nvidia has something up their sleeve, it sounds like they are going to have a tough time beating the R700 if the rumors about it are true.

For those citing Intel versus AMD as an example: Intel keeps releasing significantly higher-performance parts, and their plans for their new architecture look to be kicking ass, whereas so far it seems Nvidia might be twiddling their thumbs while the GPUs fly off the shelves, à la AMD pre-Core 2.

I'm really hoping for ATI to pull out a win with the R700; I want to see the days of the 9800 Pro again, and more importantly the 9550, which was a beast of a card for its time.
 

trooper1947

Distinguished
Apr 11, 2007
251
0
18,790
[quotemsg=6572219,39,115127]Crysis says that Nvidia is the way its meant to be played! ATI is not. End of story![/quotemsg]


They'll put anything at the beginning of a game's bootup for enough money... Nvidia should have spent more on development instead of paying off game developers for a stupid popup.
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
[quotemsg=6572142,8,150022]The 9800GX2, a single card by any definition, WILL beat a single HD3870 X2.

So where's your argument?

HD3870 X2 Costs £250+

2 x 9600GT's Cost £240

Which does best in benchmarks? The 9600GT SLI setup.
[/quotemsg]

That 9600GT SLI test was done using two FPS titles. Big deal. It's irrational anyway, because the markets are different. Only fanboys care about ultimate FPS wins between cards that aren't competing. The 9600GT in SLI is of interest to SLI board owners who have an older 8xxx series card, like an 8800GTS 320.

It's the 9800gx2 that competes against the 3870x2. Will it win? Probably by a small margin. The 4870x2 will be out by this summer, which will beat the 9800gx2. That's the way it goes. I won't care that a dual GPU Nvidia card that's not on a single PCB and has heat issues can beat a 3870x2 by a small margin in a FPS. My 3870x2's a better card. It's overclocked, it runs cooler and new driver releases will improve performance in the CRPGs I play. It already does great in The Witcher and Oblivion. Too bad LOTR Online went with the Nvidia bribery program (i.e. "the way it's meant to be paid").

Nvidia's hurting because they can't buy AMD. They need a CPU and fast. What will Nvidia do when Swift and Larrabee are out? Nvidia will be left out of the notebook market, which is the big moneymaker nowadays, not enthusiast gaming rigs. Nvidia's boards will probably end up supporting only some AMD or Intel processors, and not any desktop processors that provide hybrid Crossfire via Swift.

If they can make enough money in the enthusiast end, more power to them. What I foresee is another Nvidia FX generation. Their fans kept them alive back then, so they'll probably survive as a niche player competing against AMD and Intel at a time when both have GPU cores in CPUs, both have great chipsets and one standard of Crossfire. Plus, Intel will have their own discrete GPUs, so it will be a 3-way race.

Competition is good.

[quotemsg=6572221,40,68270]And with the resources that Intel has, do you think theyll just sit by and let nVidia get the first try at new games?[/quotemsg]

Yes, Intel has more money to bribe game developers than Nvidia has. All the "The way it's meant to be played" program comes out to is Nvidia paying developers to provide pre-release code so Nvidia cards can be optimized in time for the first benchies of a new game (usually FPS). ATI eventually catches up, but only after the damage is done.

Maybe it's time for Nvidia to enter the console market by buying a CPU from Sony or IBM, but using their own chipsets and GPU designs? That may be their only route out of a mess. 3dfx had all the cash and great Voodoo cards, but they could not compete and were bought by Nvidia. It's just that no one right now wants to buy Nvidia, but would rather compete.
 

systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780
[quotemsg=6572350,73,124995]They'll put anything at a beginning of a game bootup for enough money .... nvidia should have spent more on development instead of paying off game manufacturers for a stupid popup .[/quotemsg]

How is advertising stupid? I don't hear you criticizing Intel or AMD, and Nvidia has PLENTY of $$$$ to go around with their market share being so high. Fact: the 8800GTX has been selling like hotcakes since its release. You can't debunk that.


@yipsl How is Nvidia hurting because they can't buy AMD? :heink:
 
My question is, will the graphics market be the same two years from now? Won't Fusion/Larrabee create a whole new approach? And if so, how will nVidia be able to compete? OK, maybe both Intel and AMD will let nVidia make compatible chips for them, but it'll always be a catch-up situation. I think nVidia is going to need an x86 CPU solution to be able to compete. I may be wrong (not like I haven't been before, heheh), but I see trouble on the horizon for nVidia, though I DON'T welcome it.
 