GeForce 3, Mac support in April [-peep-]!!!

rcf84

Splendid
Dec 31, 2007
3,694
0
22,780
Thanks NVIDIA, you [-peep-] suck!!! I hope you like filling Steve Jobs' POCKETS FULL OF CASH!!!

<A HREF="http://maccentral.macworld.com/news/0102/22.geforce3kit.shtml" target="_new">http://maccentral.macworld.com/news/0102/22.geforce3kit.shtml</A>

:cool: First person to get a topic banned. ABIT BP6 Lives FOREVER!!! :cool:
 

rcf84

Splendid
Dec 31, 2007
3,694
0
22,780
76 GIGAFLOPS... WOW, like I care. More Mac support? We don't need this. The GeForce should have been a PC-only chip. I'm not worried, though; the x86 will never die....

:cool: First person to get a topic banned. ABIT BP6 Lives FOREVER!!! :cool:
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
This is interesting: nVidia was able to break into a whole new platform in what looked like overnight success. But the GF3 is very new technology, and I wonder if the drivers will really be stable and smooth on both the PC and the Mac. Plus, no reviews anywhere? Why not? It makes me wonder if it is ready for release; what better publicity can you get than great reviews to get the buyers ready? In any case, I find this an exciting time. I believe ATI has its Radeon II design and card ready and is waiting on nVidia to see whether it has to go back to the drawing board. I don't think so.

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 02/26/01 04:18 PM.</EM></FONT></P>
 

rcf84

Splendid
Dec 31, 2007
3,694
0
22,780
Well, nVidia is GREEDY (even more than Intel). To them, why not support the Mac? MORE MONEY FOR US.

And why not use the same chip from the Xbox in the PC? More money, less research.

I hope the Radeon whips their A$$....

:cool: First person to get a topic banned. ABIT BP6 Lives FOREVER!!! :cool:
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
nVidia has definitely moved into ATI territory (OEM, laptops, Mac) and other territory (Linux), and basically killed 3dfx. Wow!!! Now we are looking at >$500 video cards, and many people are jumping up and down for it???? nVidia even influenced DirectX 8 itself, with routines written to bend DirectX 8 around their new GF3 chip. I wonder if ATI had a heads-up on this sufficiently in advance to design new hardware around it, or if it was more like this: "Hey ATI, here is the code for DirectX 8, which was designed some time ago around your competitor's chipset that will be released shortly. Good luck." Microsoft going with an nVidia chipset for its Xbox gives nVidia a powerful ally, one that could influence the graphics industry through its operating system's exclusive software support of hardware. Do we want just an nVidia graphics-card world, or an S3 (dead), Matrox (wounded), ATI, 3dfx (dead), nVidia graphics-card world?

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 02/26/01 02:50 PM.</EM></FONT></P>
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
The folks at overclockers.com share your sentiments, rcf84. Since not too many people are reading this, I will give you some links and quotes you can find there. The nVidia propaganda is interesting; if you have some time, you might want to check out some of these links about the NV20, a.k.a. the GF3, and its price. If you are not rcf84, you will probably not be interested, or might get agitated by the links below, so I recommend you just skip this part of the thread. Sorry for the length, rcf84, but I thought you might want to debate or talk over some of this stuff. Let me know. Some of us are concerned over how fast video card prices are inflating.

<A HREF="http://www.overclockers.com/" target="_new">http://www.overclockers.com/</A>
<b>Recommended reading from above link:</b>

*2/23/01 - Leave NVidia alone: It's not NVidia's fault for charging so much; it's your fault for not wanting to pay it. That (and some more legitimate points) from a letter I got, and my response. Ed
*2/22/01 - We'll leave NVidia alone today
*2/21/01 - I'll talk more about the NV20 and pricing later on today, but this analysis of likely NV20 performance should give you pause. Ed (The NV20 performance link: http://www.geocities.com/tnaw_xtennis/NV20-5.htm )

Quote from Overclockers.com:
2/21/01 - Smaller print, more words Why NVidia's skyrocketing pricing is bad news for you, even if you won't pay more than X for a video card.....

<font color=red><b>"The Case Against NVidia"</b></font>
Ed Stroligo - 2/21/01


--------------------------------------------------------------------------------


<font color=blue>Are you paying twice as much for a computer now than you did two years ago?

Do processors cost twice as much? Memory? Hard drives? Of course not. In fact, the price trend is decidedly downward, not up.

Except for video cards.

If Intel doubled the price of its CPUs, would you like that? AMD? Anybody else?

So why is NVidia unique?

Sure, the equipment has gotten better. So have the CPUs and memory and hard drives.

So why is NVidia unique?

To most of you, it probably doesn't matter if NVidia charges $700 or $7,000 or $7 million initially; you're not going to pay until it hits your price level.

However, as the price of the top-end equipment goes up and up and up, it's going to take longer and longer and longer for it to get down to your price level.

Given some level of technological improvement, the prices of some components will go down over time, but have you considered how the same card selling for $500 can cost just $200 a year later?

Let me put it this way: price shifts like that are rarely, if ever, due to the actual cost of materials in the computer industry, simply because we are dealing with more-or-less organized sand.

In accounting terms, computer component manufacture is usually a high fixed-cost, low variable-cost industry. In English, it's the equipment that costs the big bucks, not the actual making of the product.

When that is the situation, you have two possible pricing strategies:

If you want to make a lot of money, you set the price high and try to amortize the cost of the equipment on the backs of the early buyers. After the equipment's been paid for, you've got the ability to lower the price quite a bit, but you try to milk it for all it's worth.

Or you can try to spread out those amortization costs among a whole lot of buyers by setting the price relatively low and making up in volume what you lose in profit-per-sale.

The first approach is pretty much what Intel does. The second is what AMD is doing now. NVidia is clearly trying to follow the Intel pattern.
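
To make the two strategies concrete, here is a toy Python sketch of the amortization math. Every number in it (the fixed cost, the unit cost, the prices, the volumes) is a made-up illustration, not an actual NVidia figure:

<pre>
# Toy model of the two amortization strategies (all numbers are illustrative).
FIXED_COST = 50_000_000   # hypothetical equipment/R&D cost to amortize
UNIT_COST = 60            # hypothetical variable cost to build one card

def profit(price, units_sold):
    # Total profit after amortizing the fixed cost over units_sold cards.
    return (price - UNIT_COST) * units_sold - FIXED_COST

print(profit(price=700, units_sold=100_000))  # "Intel pattern": 14000000
print(profit(price=250, units_sold=400_000))  # "AMD pattern":   26000000
</pre>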

What are you getting for your money?

The difference between a 32MB DDR and a 64MB DDR GeForce2 GTS card looks to be about $100. $100 is quite a bit for 32MB of RAM. Yes, I realize this is higher-speed RAM than even PC2100, and I'm going to poke around and find out just how much it does cost, but I suspect a fat chunk of that $100 is profit.

Let's look at a 64MB Ultra. Same amount of RAM, just a bit faster. Same essential GPU, just a bit faster. Cost? Not just a bit more. It's about $200 more. Pretty hard to believe the additional cost comes anywhere near $200.

The GeForce3 will follow the same road: some improvement, much higher price.

Paying a mint for tweaks

Video cards have pretty much hit a bottleneck: memory bandwidth. It's just not going to get a whole lot faster any time soon. Going from 4ns RAM to 3.8ns RAM is no great leap.
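
To put rough numbers on that, here is a back-of-the-envelope Python sketch; the 128-bit DDR bus width is an assumption, typical of GeForce2-class cards:

<pre>
# Rough memory bandwidth from RAM cycle time.
# Assumes a 128-bit (16-byte) DDR bus, typical of GeForce2-class cards.
def bandwidth_gb_s(cycle_ns, bus_bytes=16, ddr=2):
    clock_hz = 1.0 / (cycle_ns * 1e-9)        # 4 ns -> 250 MHz
    return clock_hz * ddr * bus_bytes / 1e9   # bytes/sec -> GB/s

old, new = bandwidth_gb_s(4.0), bandwidth_gb_s(3.8)
print(f"{old:.1f} GB/s -> {new:.1f} GB/s ({100 * (new / old - 1):.0f}% gain)")
# prints: 8.0 GB/s -> 8.4 GB/s (5% gain)
</pre>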

True, GPUs can play more and more tricks to make the most of the situation, just as CPUs have. But there are just so many tricks that can be played, and we have already reached the point of diminishing returns. Might not be so bad if the price remained constant, but you're paying more and more for those diminishing returns. Yes, even if you have a strict budget, you're paying more, but more on that in a moment.

Paying a mint for useless tweaks

You might say, "It has new features," or "It throws out more frames."

Features are useless if they're not used. If NVidia has some whiz-bang feature, programmers have to program for it. They're not ready at day one, or fifty, or even a hundred. If it takes half a year or a year for games to actually use the feature, what good does it do you until then?

So what is your brag, "I have a great feature I can't use?" Maybe it is a great feature. So good, buy it when you can actually use it; don't pay a lot more for it before you can.

Mine Is Six Feet Long, and Eight During Blue Moons Falling On Leap Years

Wanting more FPS makes sense up to the point where it continues to improve the gaming experience. Anything beyond that is a waste. For many games, we are well beyond the point where it makes a difference. If you are playing one of those games, size doesn't matter anymore.

Movies have somehow managed to exist, and pretty well, running at 24-30 fps for most of a century. If it really made a difference in quality, don't you think the standard would have been raised by now?

I realize that framerates can differ dramatically in a game. I can see wanting a nice big buffer so that the game runs smoothly even at worst. But you don't need a 200fps buffer to achieve that.

Can any of you say with a straight face, "I can see a definite difference between 120fps and 80?" If you can't, then why bother?
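
For what it's worth, the frame-time arithmetic behind that question, as a quick Python sketch:

<pre>
# Frame time at each rate: the gap between 80 and 120 fps is about 4 ms.
for fps in (80, 120):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")  # 12.5 ms, then 8.3 ms
</pre>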

Current benchmarking on video cards is absurd. If you benchmark using a high resolution, the video card chokes at a certain point due to fill rate or memory bandwidth bottlenecks. There is no reason to upgrade when that is the bottleneck.

Of course, we can't have this, so instead we see benchmarking done at absurd resolutions like 512X384, which are often the only places where there is a (meaningless) improvement.

So what do we end up with? "Look at me! I have an improvement I can't see at a resolution I never use!" It's like being six feet long; more is not always better. You might get bragging rights within equally deluded circles, but try putting it to use.

Presumed Foolish

What is NVidia saying to you? First, it's repeating the old Intel swan song, which is:

"We're going to make loads of money because we know there's a certain proportion of fools with more shekels than sense who'll shell out plenty to (often) sustain delusions of grandeur or social status, no matter how absurd. They're begging to be taken, and we're taking."

This frontal assault on wallets is easily rebuffed by showing an empty one, but it does distract most from noticing the important message, which is:

"We will raise the average price paid for a video card throughout our product line."

Two hundred dollars used to get you a top-of-the-line video card not too much after introduction. Now it gets you third or fourth place. Third or fourth place didn't cost you $200 two years ago.

You don't like third or fourth best when you used to be able to get the best for the same, do you? Might make a lot of you eventually decide to lay out another fifty or a hundred to move up a notch or two, doesn't it?

That's how this hurts those of you who would never lay out $500 or $700. Let's face it, a big chunk of the people buying computer equipment aren't known for patience, and NVidia expects to take advantage of that.

The Eternal Deathmatch: Buyer Vs. Seller

Nvidia has the perfect right to try to charge as much as it likes for its products. We have the equal right to try to pay as little as possible for them. Neither goal by itself is particularly reasonable, but it's only when the two interact that you get a market.

It's true that they can only sell what we buy, but if we don't pay enough, eventually there won't be anything around for sale.

A free market bounces between those two incompatible extremes. If one side goes too far, the other has to take action to restore the balance. If sellers ask too much, buyers need to refuse to buy. If buyers will pay too little, sellers need to stop selling at that price.

"Something for nothing" and "Nothing for something" are but two side of the same coin. Neither is good.
</font>



<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 02/26/01 04:36 PM.</EM></FONT></P>
 
G

Guest

Guest
First of all, am I happy that the new video card will cost $700? No. However, the way I see it, the reason Nvidia can charge so much for this graphics card is that they don't have any serious competition at the high end. In my mind, being the best earns them the right to charge the highest prices. I don't like it, but I do think that it's fair. After all, Nvidia is there to make money just like Intel, Microsoft, AMD (er... does AMD actually make money? <g>)
Of course I (and probably most people) am not going to pay that much for a video card; I will wait a year and pay $80 for a Geforce3 MX. However, this pricing scheme does have one interesting consequence: For the first time, game developers will be working with video cards one or two generations ahead of the consumer. Thus, when the Geforce3 hits mainstream pricing levels, software will likely already be available.

John
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
Maybe those advanced games written exclusively for the GF3 will cost $100 a pop, eh? With such advanced programming needed for the GF3, why, of course they will cost more. And of course, after someone gets suckered into spending $700 on a video card, they're more than likely going to spend $100 for a game. Besides, to access the real abilities of the card you will need the software; what is the use of playing all those other games that play great on the Radeon? You need that special software. Just think: five GF3 titles costing $500. Now who is really going to write the software for this beast that only the GF3 can handle? Meaning everything else won't be powerful enough, except maybe the Radeon. And where are the reviews for this card? Isn't it supposed to be out by now? Well, March will be here.
 
G

Guest

Guest
The features of the GF3 will be available through DirectX 8, so I don't think they'll be all that hard to program.

>Now who is really going to write the software for this beast that only the GF3 can handle?

My point was that it takes so long to program a game - some are taking three years at this point - that games have always been behind in terms of implementing the latest hardware features. People will write games that only this card can handle because, by the time the game is finished, the GF3 will be mainstream. That is to say, perhaps for the first time, the software and hardware will become accessible at about the same time.

John

"We will firewall Napster at source... We will firewall it at your PC." - Sony VP Steve Heckler
 
G

Guest

Guest
Well, it's hard to predict, for several reasons. The number of transistors in GPUs has been increasing dramatically - much faster than in CPUs - and this obviously cannot continue. Also, it probably depends on whether they have any significant competition. So, best guess:

9/2001:
GF3 Ultra $700
GF3 $350-$400

3/2002:
GF3 'Supra' $700
GF3 Ultra $350-$400
GF3 $250

9/2002:
NV30 $700
GF3 Supra $350-$400
GF3 Ultra $250
GF3 $150

So you see, I am predicting two refreshes of the GF3. They probably clocked the GF3 lower than the GF2 Ultra so that they could raise the speed twice. The NV30 Ultra will probably come out 3/2003. If ATI comes out with significant competition, this schedule may be accelerated. Also, the wild card is when they will release the GeForce3 MX. We just need to wait and see...

John

"We will firewall Napster at source... We will firewall it at your PC." - Sony VP Steve Heckler
 

HolyGrenade

Distinguished
Feb 8, 2001
3,359
0
20,780
:eek: Hey, I was the first to post info on nVidia-Apple.
Why is nobody replying to my posts? Anyway, I'm not one to hold grudges.

Well, here's a Q&A.

<b>Why is GeForce3 expensive?</b>

The GeForce3 is the result of enormous amounts of research and innovation (unlike Intel's claims of innovation from just reiterating their 6th-gen core bit by bit).

This research costs a lot. Then there is, of course, the huge amount of silicon per chip: 57 million transistors, about a third more than Intel's latest P4 (42 million). The price is likely to come down quite a bit, 'cos Microsoft did pay 'em a lot towards research. Also, the same research can be used to produce motherboard chipsets, comms chipsets, and audio chipsets. In other words, they now have the technology to produce a full computer where the only non-nVidia parts need be the memory chips, HDD, DVD, and external peripherals. So, like Tom said, Intel and AMD had better beware.

And finally, its performance will justify the price. SSE2 will be useless for processing graphics, as routines specialized for it are way better off being processed by the nVidia GPU. So you won't need a hyper-expensive processor; any current-generation processor will suffice.




<b>Why a slower Clock than the predecessor?</b>

57 Million Transistors = A LOTTA HEAT+POWER!!!

A lower frequency helps reduce both.




<b>Why will it be popular with developers?</b>

A programmable vertex shader - much easier to work with, I think. I might get the nVidia SDK and have a bash.
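
To show the idea (this is a conceptual sketch only, not the actual nVidia SDK, which exposes the feature through DirectX 8 vertex shader assembly), here is what "programmable" means in practice: you supply a small routine that runs once per vertex.

<pre>
# Conceptual sketch of a programmable vertex shader, in plain Python.
# A real GeForce3 shader is DirectX 8 vertex assembly run on the GPU;
# this only illustrates the idea of user-supplied code executed per vertex.
def my_vertex_shader(pos, mvp):
    # User-supplied program: transform a homogeneous position by a 4x4 matrix.
    x, y, z, w = pos
    return tuple(m[0] * x + m[1] * y + m[2] * z + m[3] * w for m in mvp)

identity = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]
triangle = [(0, 0, 0, 1), (1, 0, 0, 1), (0, 1, 0, 1)]
print([my_vertex_shader(v, identity) for v in triangle])
</pre>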



<b>Will it be popular with gamers?</b>

Hell yeah!

Well, with hardcore gamers. Others will take up the card slowly.




<b>What new features will be in store with the GeForce3?</b>

A fully integrated physics engine, no longer needing the system CPU???
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
I like looking at something from many different angles to get a clearer view of it. I was hoping to debate a little over Overclockers.com's standpoint of not doing any reviews of the GF3 chip based on its price, so that I could rip a new A$$hole in their fixated stance. It's like judging something without even looking at, testing, or evaluating the evidence. Like not mentioning that nVidia doesn't make graphics cards, so who is making the profit? nVidia is taking a chance here and is trying to define a whole new era in gaming and graphics ability for everybody; in other words, if they make the best product, hey, they make money, you're happy, and life goes on. Plus, no one is forced to buy the GF3 line of chips at any given price; some may desire the advanced features, such as FSAA that screams and will work on any game now! Others like myself don't see the need to upgrade yet, but at least Tom's Hardware researches, analyzes, tests, dishes out the facts, and then gives well-thought-out opinions. Overclockers.com on this reminds me of someone cursing out the jeweler because the 60-carat diamond necklace is way more expensive than the pearl necklace from the picture in the newspaper, without even looking at the two.