Well, are people forgetting something?


noko

Distinguished
Jan 22, 2001
2,414
1
19,785
Is he the one that said that the MX was virtually as fast as a GF2 in 3ds Max, which appears to be 100% correct? I liked that guy; I wish he would return and give us more good info.

Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>
 
G

Guest

Guest
Yeah, that was him. Glad he told me that, or I would have bought a card that performed the same for $200 more.

I think I have his contact info on the other hard drive. I can pick up my hardware tomorrow!
 

rcf84

Splendid
Dec 31, 2007
3,694
0
22,780
Well, I wonder how a dual-chip Radeon 2 would work in 3D Studio MAX. Like I said, I think the Radeon 2 MAXX will be called the FireGL 5.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
 
G

Guest

Guest
Yikes, that'd be the machine to have. I'm going to have to start saving up now. If ATI could have the tessellation done on the GPU and sent back to 3ds Max, that'd be cool, although the bandwidth from AGP to CPU would be just as slow as from RAM to CPU anyway.

Oh, and sorry for crapping up your thread; tonestar just doesn't care where he picks his fights... I think every post I've made this week has had him reply with something stupid like "your still learning to make folders." What a goof.

(Edited by m_kelder on 06/06/01 06:18 PM.)
 

Ncogneto

Distinguished
Dec 31, 2007
2,355
53
19,870
Well, as interesting as it sounds, the first dual-GPU offering from ATI was a miserable failure, to the point that they had to disable 85% (my estimate) of the second GPU.

A little bit of knowledge is a dangerous thing!
 

rcf84

Splendid
Dec 31, 2007
3,694
0
22,780
Well, I think the Radeon 2 MAXX will work like this: if the first GPU can't take the load, the second GPU will help it out; if not, the second does nothing. That should eliminate the bottlenecks of the Radeon 2.

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
 
G

Guest

Guest
What Discreet and other 3D app makers need to do is team up with video card manufacturers and make PCI cards, each responsible for one viewport, although I can already see a problem with that: the CPU would get too big a workout sending everything to each card via PCI. A better approach would be an AGP card that has separate GPUs for each viewport. Each GPU could have its own memory bank as well as a larger shared one. I think that'd work great, though I'm no electronics designer.
 

rcf84

Splendid
Dec 31, 2007
3,694
0
22,780
What would work better:

- sharing one big 64MB bank of RAM

- having two separate 32MB banks, one for each GPU

- having two separate 16MB banks for the GPUs, plus a shared 32MB bank
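Purely as a back-of-envelope sketch of the three layouts above (mine, not from the thread), and assuming, hypothetically, that a dual-GPU board has to duplicate textures and geometry in any fully private per-GPU bank, the unique storage each layout offers works out like this:

```python
# Hypothetical back-of-envelope comparison: assume any fully private per-GPU
# bank holds a duplicate copy of the working set, so only one private bank's
# worth counts as unique storage, while a shared bank holds a single copy
# visible to both GPUs.

def unique_capacity_mb(shared_mb: int, per_gpu_mb: int) -> int:
    """Unique (non-duplicated) storage for a two-GPU board."""
    return shared_mb + per_gpu_mb

layouts = {
    "one shared 64MB bank":            unique_capacity_mb(64, 0),
    "two private 32MB banks":          unique_capacity_mb(0, 32),
    "two private 16MB + shared 32MB":  unique_capacity_mb(32, 16),
}

for name, mb in layouts.items():
    print(f"{name}: ~{mb} MB unique")
```

Under that (debatable) duplication assumption, the fully shared bank stores the most unique data and the fully split design the least, with the hybrid in between; the trade-off is that private banks avoid two GPUs contending for one bus.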

Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
 
G

Guest

Guest
They could probably do better with each having 32MB; they could tie it closer to the GPU, but then again I think they'd need some way to communicate with each other. Hell, just make a card that has a GPU with 64MB of cache! The cache would be the size of the chip, but it would run fast :)
 

Ncogneto

Distinguished
Dec 31, 2007
2,355
53
19,870
Huh? Why not just use two video cards? Actually, this is where the nForce starts to get interesting. Rumour has it that it can support two AGP ports (one internal and one external). This is also further fueling speculation about a possible SMP solution in the future with this chipset, as the internal AGP could theoretically be substituted for another CPU channel.

<A HREF="http://images.anandtech.com/reviews/chipsets/nvidia/nforce/preview/twinbank.jpg" target="_new">http://images.anandtech.com/reviews/chipsets/nvidia/nforce/preview/twinbank.jpg</A>

A little bit of knowledge is a dangerous thing!
 
G

Guest

Guest
Oh kelder, for god's sake... what do we have to do to make you believe it? The article titled "Professional Affair" in THG under "Graphics Guide"... do you not believe the reviewer? He clearly points out the difference in ground between the GeForce cards and the pro cards. Visit any 3D or CAD site and have a look at their reviews; some even include the games cards to clear up the myths that people like you have come to believe.
The reason you can live with a games card is because you don't create complex models or scenes; if you did, you'd realise that it takes forever to move and manipulate those objects.
You're still trying to pretend you know more than the reviewer. If the GeForce cards were just as good, nvidia would rename and repackage the same card, but even in their own range they have modified the card. Why bother if the GeForces are just as good? Why then do the Quadros benchmark better? You are soooooo stupid!!!

If anyone would like to see what kind of a deceitful little bitch kelder is, go to a post called "Graphic designer computer (help)" posted by maiden_hell in the CPU forum. See:

http://forumz.tomshardware.com/modules.php?name=Forums&file=faq&notfound=1&code=1

The conversation gets onto professional graphics cards, and kelder is unable to defend his position on gaming cards, so he decides not to return. How pathetic. How funny!!!
So kelder... does it make you feel good to come here and preach to some ignorants? You don't know what you're talking about, so keep your mouth shut!!!

Gee, what does it take to get through your thick skull?

"no kelder....you can't shoot goblins in wordpad"
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
What are you talking about?? Seems like m_kelder picked out the best performance for the buck anyone could purchase for 3ds Max. Check out this report about 3ds Max and different video cards:

<A HREF="http://www.xbitlabs.com/video/3dmax/index2.html" target="_new">http://www.xbitlabs.com/video/3dmax/index2.html</A>

By the way, how much do those professional cards sell for? And which ones are you talking about?

Well to eat your <b>C :smile: :smile: kie</b> and have it too, gotta get <b>Rade :smile: n II</b>

(Edited by noko on 06/06/01 10:34 PM.)
 
G

Guest

Guest
we wuz just suposen

I'll have to do some testing, but I'm sure I at least tripled the performance of my older system.

(Edited by m_kelder on 06/06/01 10:36 PM.)
 
G

Guest

Guest
The article titled "Professional Affair" in THG under "Graphics Guide" is over a year old, moron! It is outdated and obsolete.

"The reason you can live with a games card is because you don't create complex models or scenes, if you did you'd realise that it takes forever to move and manipulate these objects."

Just look at some recent benchmarks and you'll see that when it comes to high poly counts that grind a system to a halt, both the GeForce and any 3Dlabs card perform the same!

"You're still trying to pretend you know more than the reviewer. If the GeForce cards were just as good, nvidia would rename and repackage the same card, but even in their own range they have modified the card. Why bother if the GeForces are just as good? Why then do the Quadros benchmark better? You are soooooo stupid!!!"

Ummm, the Quadro has no changes over a GeForce besides clock speed! The only reason a Quadro outperforms a GeForce is the driver: Nvidia disables many things in the GeForce driver so it doesn't perform quite as well as a Quadro. Obviously you don't know what you're talking about, and I've researched it...

Oh, and about the other thread: I told him to wait for the Northwood and dual Athlons to come into retail; what is bad about that? And I told him that he would be better off buying a GeForce2 instead of a 3Dlabs card because there is hardly a difference in performance and a huge difference in price.

Sit down, tonestar.

(Edited by m_kelder on 06/06/01 10:45 PM.)
 
G

Guest

Guest
Oh noko, you have no idea, do you? Did you read that article in THG? I think it was in May 2000.

All the cards in that test are games cards. All of them. Have a look at any of those cards compared to 3Dlabs, Elsa or FireGL cards... they don't even remotely compare.

Yes, these cards are often a lot more expensive, but for a REAL designer they are essential. That's why any real person can pick kelder for a fraud from a mile away.

And noko... why not read my links before you post back crazy stuff?

"no kelder....you can't shoot goblins in wordpad"
 
G

Guest

Guest
You freakin' dumb bastard... have a look at these cards side by side before you keep spilling garbage...

"no kelder....you can't shoot goblins in wordpad"
 
G

Guest

Guest
lol man, I've shown you benchmarks before! Don't you remember, or are you just stupid? I'll post them again; guess you're thick!

Oh, and thanks for putting me in your sig; it shows how little of a life you have that every day you try to attack me some way or another.
 
G

Guest

Guest
Here we go...
These numbers were given to me at the Discreet web board when I asked which card I should get. Those people know many times more about this than anyone here, just to let you know.

Test                    FireGL2  FireGL3  Oxygen VX1 PRO  Oxygen GVX420  Gloria III  GeForce2
raster                     77.5     77.5            16.0           37.7       100.2      85.5
geo2 (high poly)            4.4      4.4             2.1            2.2         2.5       2.4
wireframe (lower poly)     11.8     11.9             5.7            6.0         7.6       4.4
4views                     12.5     12.6             4.8            5.0         7.8       3.7
light1                     74.6     72.7             4.5           10.9        50.7      41.8
light2                     74.5     72.9            20.7           31.3        52.1      51.2
light3                     78.8     72.9            13.1           15.9        52.7      51.3
text1                      68.7     70.0             ???           49.2       114.0      81.0
text2                      41.7     42.5            10.4           18.2        42.7      40.5

Whew, that took too long :)
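One way to read those numbers (my sketch, not part of the original post) is to compute, for each test, the ratio of the FireGL2 score to the GeForce2 score, using values copied from the table above:

```python
# Per-test ratio of the posted FireGL2 score to the GeForce2 score
# (scores are frames/sec style, higher is better). Ratios above 1 are
# tests where the pro card pulls ahead of the gaming card.
scores = {
    # test: (firegl2, geforce2)
    "raster":    (77.5, 85.5),
    "geo2":      (4.4,  2.4),
    "wireframe": (11.8, 4.4),
    "4views":    (12.5, 3.7),
    "light1":    (74.6, 41.8),
    "text2":     (41.7, 40.5),
}

ratios = {test: fg / gf for test, (fg, gf) in scores.items()}
for test, r in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{test:9s} FireGL2/GeForce2 = {r:.2f}x")
```

On these numbers the multi-viewport and wireframe tests favour the FireGL by roughly 2-3x while raw raster throughput actually favours the GeForce2, which is more or less the split both sides of this argument keep pointing at.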

As you can see, tonestar, I speak from benchmarks; you just speak from your ass.
 
G

Guest

Guest
He doesn't do it; he never said he did. Read this thread again and you'll see what he said to me before. It is quite funny.

Oh, and by the way tonestar, "dumbest" is slang, which implies your stupidity. Heh, everything you've said so far just makes you seem more and more retarded; keep it up!

(Edited by m_kelder on 06/06/01 11:02 PM.)
 
G

Guest

Guest
OK kelder... 'cos you're such a freakin' mule and you never listen to me, I am going to quote some other people:

Kelledin:
For 3D, the GeForce series cards are gaming cards. The Quadro series cards (beefed-up GeForce series chips) are still "wannabes" in the professional 3D world. Cards like the Elsa Gloria series cards and the Permedia series chipsets (used in the Diamond FireGL cards) still trounce the Quadro for professional 3D (geometry processing power is more important than fill rate in this scenario).

Makaveli:
Don't listen to these children. The GeForce 3 is far from a professional 2D/3D card for graphic design. What I would suggest you do is list the software you will be using, and perhaps, if you know anyone in the same field, find out what they are using and how it works for them. Don't listen to these kids' poor advice. When it comes to professional video cards for that kind of usage, Nvidia ain't smack! Next time you post a topic like this, ask for responses from people in the same profession; their advice will be far more valuable to you.

I can't speak for everyone, but there are a lot of children on this forum, and they sometimes give a lot of uneducated advice and opinions.

Makaveli was referring to you, idiot. This is what people have to do to try to tame an idiot like you.

!!!!BANG!!!! You're dead.....

"no kelder....you can't shoot goblins in wordpad"
 
G

Guest

Guest
In reply to Kelledin's post: he failed to mention that the only cards that beat nvidia chips are Wildcats and FireGLs, which are in the $1000s, and if you can afford them then, as I have always said, go right ahead.

To Makaveli's post: the GeForce 3 hasn't followed nvidia's previous releases in that brute force is the only thing gamers want, and brute force is what 3D artists want too.

Glad I could clear that up for you, tonestar.

Oh, and I remembered where the guy who gave me those benchmarks said they came from. They were in a 3D artist mag, though I can't remember which one, so it isn't hearsay.

(Edited by m_kelder on 06/06/01 11:10 PM.)
 
G

Guest

Guest
Wow kelder, thanks for that... look at those numbers!!!

The pro cards certainly have a huge lead in the geometry, wireframe and lighting tests, don't they? And as I have attempted to explain to you, this is the way people design; they don't render every couple of seconds, which negates the "work time" relevance of the raster and texture marks. Hardware shading doesn't measure up against raytracing, diminishing those marks again.

Anyone who knows what the marks are all about must be crying from laughter... oh kelder... talk about incriminating evidence!!

What a fool you are; you even proved yourself an idiot.... thank god you're not a doctor!!

Suck the whopper!!!

"no kelder....you can't shoot goblins in wordpad"
 
G

Guest

Guest
Are you delirious? I think you need to read those numbers again, because you don't seem to understand that the Oxygen cards don't perform better.

Denial is a natural defence your brain uses to protect you from things that hurt you, so it is OK that you said that, tonestar.