3DS Render Benchmark

Guest
1. Reply to my comments regarding the way Max works.
What comments? I'm not shuffling through your crap to find whatever it is you want to talk about.

2. Talk about rendering in the viewport.
Sure, we can do that. Since the topic is CPU rendering in Max, I stuck to that most of the time. Why not start another thread?

I didn't say that viewport performance doesn't matter, just that the topic wasn't about that. You say you're trying to tell me how the real 3D world works. I know how important viewport performance is, and I never said someone shouldn't buy a system on that basis, but there are underlying issues that may prove more important than you blindly choose to ignore. When you are texture mapping or configuring an effect, you will need to do hundreds of renderings. Nearly everything in Max relies solely on the CPU, and almost everything could always be done faster. You can't have everything in the world, but you can choose trade-offs that give the best performance you can get. For most people, getting the fastest rendering time possible is a much higher priority than having slightly slow viewports. All pro cards, even the Wildcats, wait for the CPU to finish its calculations and then render to the viewport. What do you want, a $200 card that waits for the CPU to finish, or a $2000 card that does a little better but still waits for the CPU to finish?

3. Go research why there is a great difference in the price of cards and talk about that. And talk about some of the hardware on the cards.

Why such a high price? Because they can. Price doesn't reveal a thing. Unless you are talking Wildcats, there isn't a reason to pay more than what the GeForces cost. Pro cards that cost $600 retail today are two years old and are beaten by today's gaming cards that cost $200-$300. The only thing today's game cards are missing is a more powerful geometry processor; otherwise their fill rates are fine. They even handle textures better than most pro cards out there. Before you rant about how most professional animators use pro cards, you need to understand that they still do because of budgets. They bought those pro cards for $2000+ and they haven't upgraded yet. They will most likely pay top dollar for a pro video card again, but that doesn't mean the GeForce 2s suck. I have benchmarks of six cards, including the GeForce 2. Unfortunately there aren't any Wildcats in there, but it does have the FireGL2 and FireGL3, which are better than all 3Dlabs cards except the Wildcat line.
 
Guest
Jesus, Kelder, I knew you were stupid; just quite how stupid, I couldn't believe. Now I know why my boss does his best to keep me at work: because he probably came across people like you in his time.

M Kelder: "For most people, getting the fastest rendering time possible is a much higher priority than having slightly slow viewports. All pro cards, even the Wildcats, wait for the CPU to finish its calculations and then render to the viewport. What do you want, a $200 card that waits for the CPU to finish, or a $2000 card that does a little better but still waits for the CPU to finish?"

No, moron, halfwit dickbreath. For most people, viewport speed is critical. Real designers and animators (not you, by your own admission) need to work with large scenes, many polygons, and view transforms in real time. Mapping is left for the end. Even then you turn off other objects (see the Hide Objects command in 3D Max, for example) and quickly render your object until you are happy with your material. Whenever you see the "how it was made" scenes from feature films, they are always in wireframe or shaded mode. See, you say this crap because you have no idea what you're talking about.

3. Go research why there is a great difference in the price of cards and talk about that. And talk about some of the hardware on the cards.

M kelder: "Why such a high price? Because they can. Price doesn't reveal a thing. Unless you are talking Wildcats, there isn't a reason to pay more than what the GeForces cost. Pro cards that cost $600 retail today are two years old and are beaten by today's gaming cards that cost $200-$300. The only thing today's game cards are missing is a more powerful geometry processor; otherwise their fill rates are fine. They even handle textures better than most pro cards out there. Before you rant about how most professional animators use pro cards, you need to understand that they still do because of budgets. They bought those pro cards for $2000+ and they haven't upgraded yet. They will most likely pay top dollar for a pro video card again, but that doesn't mean the GeForce 2s suck. I have benchmarks of six cards, including the GeForce 2. Unfortunately there aren't any Wildcats in there, but it does have the FireGL2 and FireGL3, which are better than all 3Dlabs cards except the Wildcat line."

You begin by saying "Why such a high price? Cause they can. Price doesn't reveal a thing." Gee, what a f>uckin moron you are. Think about this: no gamer is going to buy a new Elsa Gloria or Synergy card, are they? They're not gonna buy a FireGL or a Wildcat either. So who the f>uck is gonna buy them? Real people, you dumb bastard. Not you, but REAL animators, REAL designers. And do you think these people, with years of experience and knowledge, are gonna hand over those sorts of dollars for a glossy box? As for talking about long-term investments, there have always been other cards for designers to consider. It's all equivalent. It's like saying "OK, I'll buy a Wildcat now, but I know that in 3 years when the GeForce 6 comes out it won't perform well against it." Course not, cock head. The pro cards are still beating the game cards in professional apps. And if the day comes when the game cards beat the pro cards, then there will be no market for them, will there? You fu>ckhead. I can't believe I have to explain this kinda basic sh.it to you. But hey, it's not like you understood any of the other threads. You f>uckin dumb arselicking whore. Think a bit.

You go on to tell us that the FireGL cards are better than the 3Dlabs cards. When was this a debate over who makes the best cards? Isn't that why we come here to read Tom's guide? This is an issue about your ignorance and about your determination to keep putting bullsh.it up on this forum. Before you make another dumb-arse post, go read an article in the graphics guide dated December 13, 2000. They talk about the most important features required for real designers (not you) (never will be you) and then compare 3 cards including the GeForce. You'll find that where it matters (for professional designers, not gamers) the pro card really kicks butt. They actually spoke to professionals when doing the benchmark, and this is what they had to say:

"We interviewed several different companies that use professional OpenGL applications on a daily basis. During talks, we noticed a certain trend: While working on a graphics project, rendering of large textures is much less important than in 3D games, where it has become most essential. However, the modeling of objects requires very high polygon rates. As long as objects and scenes are still "under construction", most designers are handling them as wire frame models. Even looking at single views or scenes is done with simple shading models during production phase. Only at the final stage of a project the designer requires some kind of high rendering performance and thus fill rate. It shows that the user profile of a construction designer is completely different from that of a 3D-gamer."

So, you fu>cking dumb crack whore, even the reviewer agrees with me. Are you going to try and post some bullsh>it dismissing his argument too? And notice that he says "It shows that the user profile of a construction designer is completely different from that of a 3D-gamer." That's why I knew straight away you were a fraud, and the same with your boyfriend m tedesco. Because I have spoken to REAL people who had the GeForce 2 when it first came out, and it crawled under the polygon count. It went back to the shop a day later and they went back to their pro cards. But don't believe me; read about it in the article. That way you'll see a bit of what the design world is on about. It compares the GeForce 2 GTS, the Quadro 2 and the FireGL. Maybe then you'll understand why people buy pro cards and not say sh.it like:

"For most people, getting the fastest rendering time possible is a much higher priority than having slightly slow viewports" and

"Unless you are talking Wildcats, there isn't a reason to pay more than what the GeForces cost."

God, you are one dumb bitch. How many times do I have to prove you're an idiot before you'll admit it? I expect an apology for your ignorant remarks and a thank you for an education you may never have got. So don't come back here until you have read some reviews and done some research into the way a REAL designer works, maybe even a few times (just in case it doesn't sink in). OK? Now f>uck off, you dumb crack whore, dirty slut, scum-suckin maggot.



"Cock-a-doodle-do" is what I say to my girl when I wake her UP in the morning!!
 
Guest
Once again you misread what I type. You take one sentence out of a paragraph and just go on that. I'll have to remind you again that I am speaking as a person who does animation as a hobby on one PC. If you were a professional then you'd of course have rendering farms, so it would all be about viewport performance on your workstation. And on top of that, you would get the best pro cards, since you personally wouldn't have to pay for them, so you obviously wouldn't give a [-peep-] about any game cards.

You really need to get off your pedestal, calm down and try to have a normal conversation. You keep resorting to immature personal attacks, which really have zero effect against someone with at least half a brain. It has been a week now and you haven't calmed down a bit. Or are you one of those people who rant and rave in forums because it means nothing in the long run and it makes you feel big? That is my guess about you.

If you want to talk about having the best machines in a network for an animation house then almost everything I have said would be changed. I know things would work differently there and you don't need to point anything like that out to me. You have a very bad problem with jumping to conclusions and you can't read between the lines at all. This is hampering your ability to hold an intelligent conversation, but I must admit I get a kick out of some of the stuff you say as an insult, like the last line of your message back to me. You must go through a lot of keyboards by smashing them.

I never ever said that you don't need good wireframe performance. I just added that part about mapping because it is nice to have. Ask any animator if they would like the ability to play back scenes with the maps on and not wireframe all the time. I know 90% of the time you need to be in wireframe. Pro and game cards alike wait for the CPU to finish its calculations before doing their job.

"no gamer is going to buy a new Elsa Gloria or Synergy card, are they"
Of course not, and I never said they would. Pro cards are only high priced because animation houses don't mind forking out lots of money. You again take one sentence out of a paragraph and base your insults on that. I said price doesn't reveal a thing because the pro card market is set at high prices. Even a 2-year-old card that can't perform better than a GeForce 2 costs 3x more.

"It shows that the user profile of a construction designer is completely different from that of a 3D-gamer."
I NEVER SAID IT WASN'T DIFFERENT!!! All I am stating is that if you don't have thousands of dollars for the highest-end pro cards, then a GeForce 2 is a good choice.

“I expect an apology for your ignorant remarks and a thank you for an education you may have never got”

Man, I had a good laugh when you said that. You think you know so much more than me when in reality you don't. As I said before, I am talking as a person who has one machine and is forking out the money himself. I can't believe you thought I was saying that the pros out there should only use GeForces and that viewports don't take no. 1 priority. If I were a professional and had a rendering farm and a good budget to go with it, then I'd of course go for a dual P3 setup with a Wildcat, because there wouldn't be a reason not to. But for me personally, I can't spend $6000-$8000 on a personal system, so I have to make cuts in places. First to go are the Intel CPUs; they cost an arm and a leg, and an Athlon isn't too far behind in performance and is way lower in price. Also, its FPU is far superior to that of the P3, and even more so compared to the P4. There is no point in having dual CPUs when I'd have to wait longer for renders and pay more. Second to go would be the pro card. Of course I'd love to have a newer one, but I can't pay that much, and the older ones aren't nearly worth their price.

And yes, FireGL cards are better than all 3Dlabs cards except Wildcats. Here are some benchmarks I was given on Discreet's forum.

         FireGL2  FireGL3  Oxygen VX1 Pro  Oxygen GVX420  Gloria 3  GeForce 2
geom2       4.4      4.4         2.1             2.2          2.5       2.4
4views     12.5     12.6         4.8             5.0          7.8       3.7
wirefrm    11.8     11.9         5.7             6.0          7.6       4.4
light1     74.6     72.7         4.5            10.9         50.1      41.8
light2     74.5     72.9        20.7            31.3         52.1      51.2
light3     78.8     72.9        13.1            15.9         52.7      51.3
text1hi     6.7     52.6         8.4            18.1         14.2       3.4
text2      41.7     42.5        10.4            18.2         42.7      40.5

As you can see, I was correct in saying that the FireGLs are better than the 3Dlabs cards, except for the Wildcats.
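Those flat numbers are easier to compare if you summarize them. Here's a quick sketch (my own illustration, not part of the original post) that ranks the six cards by their plain average across the eight sub-tests, assuming higher scores are better:

```python
# Benchmark scores exactly as posted above (rows: geom2, 4views, wirefrm,
# light1-3, text1hi, text2; higher is assumed to be better).
scores = {
    "FireGL2":        [4.4, 12.5, 11.8, 74.6, 74.5, 78.8,  6.7, 41.7],
    "FireGL3":        [4.4, 12.6, 11.9, 72.7, 72.9, 72.9, 52.6, 42.5],
    "Oxygen VX1 Pro": [2.1,  4.8,  5.7,  4.5, 20.7, 13.1,  8.4, 10.4],
    "Oxygen GVX420":  [2.2,  5.0,  6.0, 10.9, 31.3, 15.9, 18.1, 18.2],
    "Gloria 3":       [2.5,  7.8,  7.6, 50.1, 52.1, 52.7, 14.2, 42.7],
    "GeForce 2":      [2.4,  3.7,  4.4, 41.8, 51.2, 51.3,  3.4, 40.5],
}

# Rank the cards by mean score across all eight sub-tests.
ranking = sorted(scores, key=lambda c: sum(scores[c]) / len(scores[c]), reverse=True)
print(ranking)
# → ['FireGL3', 'FireGL2', 'Gloria 3', 'GeForce 2', 'Oxygen GVX420', 'Oxygen VX1 Pro']
```

A plain average is crude (it lets the big lighting numbers dominate), but even so the two FireGLs land on top and the GeForce 2 sits ahead of both Oxygens, which is the claim being made.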
 
Guest
Kelder, why waste your time with this jackass?
Pro cards are more expensive because there aren't as many customers in the marketplace buying them as there are for gamer cards. Yes, those people have deeper pockets, but if they didn't, then no one could survive selling pro cards cheap, because it's such a limited market. They must pay for the R&D, so they jack up the price of the cards. None of the hardware discussed on this forum costs a lot to manufacture; it's simply about R&D and the marketplace. Look at how much Intel can slash their prices overnight. Look at RAM prices bounce around.
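The R&D-amortization point can be put in numbers. A minimal sketch, with made-up figures (the function, costs, and unit counts are all hypothetical, chosen only to show the effect):

```python
def unit_price(marginal_cost, rd_cost, units_sold, margin=1.3):
    """Price per card needed to recover fixed R&D over the market size,
    with a 30% margin on top. All inputs are hypothetical."""
    return (marginal_cost + rd_cost / units_sold) * margin

# Same $80 silicon, same $20M R&D budget -- only the market size differs.
gamer_card = unit_price(marginal_cost=80, rd_cost=20e6, units_sold=1_000_000)
pro_card   = unit_price(marginal_cost=80, rd_cost=20e6, units_sold=20_000)
print(round(gamer_card), round(pro_card))  # → 130 1404
```

With a million buyers the R&D adds $20 per card; with twenty thousand it adds $1000, and the sticker price follows.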
This forum comes off as a breeding ground for blowhards like tonestar. I don't understand why people with intelligence like egeorge even stick around. I know I'll be moving on. I don't know why Tom even set this thing up if he didn't care enough to have it moderated.
-ZAMEUS
 
Guest
Well, kelder, finally you have had nowhere to run, so now you're starting to agree with me. You actually didn't say too much in your last post other than attempt to disguise your mistakes in previous posts. Man, it almost sounds like what I've been saying for the last 5 pages is starting to sink in. You were unable to refute anything I had said previously, and your attempts to cover your own ass are quite amusing. Your last line was in fact very funny:

"As you can see, I was correct in saying that the firegl’s are better than 3dlabs except for the wildcats"

Hahaha... now you're trying to turn this post into a benchmark for cards when it never has been. Anyone who has been reading this post will know that I never once even remotely suggested anything about FireGLs vs the 3Dlabs cards. This is an attempt by a fraudulent wanker like you to save some face. Well, kelder, YOU F>UCKIN HAVEN'T!!! Ah yes, my workmates have been reading this post and have had a good laugh watching you try and clutch at straws in an attempt to disguise your stupidity. Thanks for amusing us. OK, kelder, since you have not brought up any technical issues in your last post, let me tell you this.

There are many people who attempt to get into animation and CG. You are way, way, way down the bottom. Not even on the list, in fact. Your limited perspective, inability to comprehend facts and total lack of knowledge make it an impossibility for you to have any success in this field. Unless you are, say, 12 years old, of course. Then your ignorance can be forgiven. My words may sound harsh; however, they will save you from wasting years of your life attempting to reach an unattainable goal. Then again, you may have realised the reality of your situation, even though I doubt it. I'm sure these words will ring true for you and you will wish you had listened. Hey, kelder, it's not all your fault; some of it is genetics.


 
Guest
"You go on to tell us that the FireGL cards are better than the 3Dlabs cards. When was this a debate over who makes the best cards?"

You wanted to debate my claims about which hardware is the best choice. You claimed earlier that the FireGLs weren't good. You're really stupid, man. I haven't changed my view at all. The only thing that changes is where the scenario is at. As you can see from the benchmarks before, the GeForce 2 (yes, a game card, which I'd mod into a Quadro anyway) performs just as well as an Oxygen in wireframe. The CPU still holds the cards back. Look at geom2. It isn't the video cards' fault they are so low. What is additionally good is that the GeForce handles textures and lights better than the Oxygens. Looking at those benchmarks, tell me again which you'd want?

You keep trying to discourage me but you can't. You say I know nothing about hardware for animation when in fact I know as much as you. Besides, you don't have to know [-peep-] about hardware to be an animator. Age is also irrelevant.
No, I'm not trying to "cover my own ass". I still stand where I did at the beginning of this thread. You're either trying to give me a hard time or you don't know [-peep-]. I've always said that what I call good for animation is based on price/performance for the person who has to pay for his own machine. I've shown the benchmarks that prove the GeForce 2 isn't a card to pass on and that it doesn't suck all that badly in 3D. I've proven that the Athlon is a good choice too, since it won't be far behind Intel CPUs for viewport performance and it kills Intel when rendering, which IS more important than viewports sometimes if you only have one machine to work on.

To Zameus: I really don't know why I keep coming back here. Everyone who comes by this thread reads it and realizes that I have valid arguments and that this guy is a total jackass. His posts are full of ignorance, self-worship, blatant disrespect and futile attempts to make himself look special. Every time he insults me about something, I come back with proof or restate what I mean. He constantly puts words into my mouth or misinterprets what I say. If that wasn't bad enough, he comes back to say that I've changed my view, which I haven't, and he has the gall to call me a fake when all he has to do is ask what I mean or discuss things in a cool, mature manner.

Tonestar, you're such an ass. You have really tunneled ideas and thoughts. You think that 3Dlabs is the best because the price is high, which I disproved and which someone else followed up on. You think Intel CPUs are many times better than AMD's CPUs when in fact they aren't. I'd hate to keep pointing out your bad traits as a human, but I think it is time you understood why no one likes you. I think I'm pretty much done with you now. Just keep living your self-proclaimed Mr. Know-It-All life, and I know you're going to dismiss everything I say, as you do to everyone who doesn't share your ideas.
 
Guest
kelder, you deceitful slut, this was never a debate about graphics card benchmarks, and like I told you before, do not attempt to turn it into one. You still haven't addressed any of the REAL issues. Let me remind you of my previous post:

Jesus!!! I wonder if there's anything in your skull sometimes. You've posted the rendering/video bullshi>t a dozen times now. I tried to explain to you how life works in a 3D environment, did I not? Didn't we already talk about rendering in the viewport vs rendering to, say, a .BMP or .TIF or .MOV? I told you most people render when they leave work, didn't I? So then what do you think I'm saying, you dumb c>unt? Jesus, I said all this back in the graphics card post even before this post was here.

Listen, you dumb whore, and get this into your f>ucking thick skull: why do you suppose they quote fill rates when they advertise a card? Why do you suppose cards cost between $50 and $5000? I'll say it again: when you have many objects (hence many polygons) in a scene, your ultimate goal is to be able to work with them in real time. Often these objects are lit as well. You may use material wrapping in your animations; it's nice to get an idea of what it might look like. This, you dumb turd, is what I talk about when I say rendering.

1. If my machine at work renders while I'm at home why would I care about the graphics card?
2. Why is there such a huge price gap between pro cards and cheap cards?

Why don't you actually start working with an animation package before you come here and open up your cock-sucking gob!! You play games, boy. I don't even know why you're comparing the GeForce 2 and the GVX1; you don't need either card, you're so lame. Just buy something cheap; it'll be enough for you. So do you finally f.ucking get it?

If you're gonna reply, do this:
1. Reply to my comments regarding the way Max works.
2. Talk about rendering in the view port.
3. Go research why there is a great difference in the price of cards and talk about that. And talk about some of the hardware on the cards.

DON'T GO ON ABOUT OTHER CRAP. YOUR CREDIBILITY IS DEAD. STICK TO THE TOPICS AT HAND!!!!

Well, kelder? Do you have anything to say?
I went on to disprove your comments. You said:

"For most people, getting the fastest rendering time possible is a much higher priority than having slightly slow viewports. All pro cards, even the Wildcats, wait for the CPU to finish its calculations and then render to the viewport. What do you want, a $200 card that waits for the CPU to finish, or a $2000 card that does a little better but still waits for the CPU to finish?"
"Why such a high price? Because they can. Price doesn't reveal a thing. Unless you are talking Wildcats, there isn't a reason to pay more than what the GeForces cost. Pro cards that cost $600 retail today are two years old and are beaten by today's gaming cards that cost $200-$300. The only thing today's game cards are missing is a more powerful geometry processor; otherwise their fill rates are fine. They even handle textures better than most pro cards out there. Before you rant about how most professional animators use pro cards, you need to understand that they still do because of budgets. They bought those pro cards for $2000+ and they haven't upgraded yet. They will most likely pay top dollar for a pro video card again, but that doesn't mean the GeForce 2s suck. I have benchmarks of six cards, including the GeForce 2. Unfortunately there aren't any Wildcats in there, but it does have the FireGL2 and FireGL3, which are better than all 3Dlabs cards except the Wildcat line."


I replied with a quote from Tom's Graphics guide:

"We interviewed several different companies that use professional OpenGL applications on a daily basis. During talks, we noticed a certain trend: While working on a graphics project, rendering of large textures is much less important than in 3D games, where it has become most essential. However, the modeling of objects requires very high polygon rates. As long as objects and scenes are still "under construction", most designers are handling them as wire frame models. Even looking at single views or scenes is done with simple shading models during production phase. Only at the final stage of a project the designer requires some kind of high rendering performance and thus fill rate. It shows that the user profile of a construction designer is completely different from that of a 3D-gamer"

So are you going to admit you were wrong, or just keep changing the subject and lying? You are either smarter than the reviewer or you are wrong. Which is it? You will never be an animator. These words will ring in your ears 10 years from now when you are flipping burgers at McDonald's. Remember that.


 
Guest
If you read, you'd see that I have answered those already. Not talk about graphics cards? That's half your questions. Btw, why quote everything I say? LoL. I think this is over; you can't understand anything.
 
Guest
You have answered nothing. I have no questions about graphics cards. Why would I need to ask you anything? Please prove anything you say from here on. You remain an idiot.

 
Guest
>We can see how an Athlon stacks up against Intel 1GHz SMP.<

There is another 3D rendering/animation program called Cinema 4D made by Maxon (http://www.maxon.de/). On their site Maxon provides a free benchmark program to see how well PCs and Macs (yeah, yeah, I know) do with viewscreen (i.e., shading) and raytrace rendering. Cinema 4D (and the benchmark) is a multi-CPU capable program. Here are some links to posted results:

http://code.c4dzone.com/html/cinebench.html
http://www.vimagic.de/vm/cinebench/

Also, if you go here (http://www.postforum.com/), enter the Cinema 4D forum, and do a search on “cinebench” you can pull many more benchmark results. Here are the results of my AMD rig:

Cinebench 2000 V1.0 Performance
****************************************************

Tester : P4Fool

Processor : AMD 1.33G
Number of CPUs : 1
Physical Memory : 512 PC133 cas2
Operating System : WinMe

Graphic Card : Leadtek GF2 MX DH Pro
Resolution : 1280X1024
Color Depth : 32

****************************************************

Shading (CINEMA 4D) : 12.25 CB
Shading (OpenGL) : 18.55 CB
Raytracing (Single CPU): 17.90 CB
Raytracing (Multiple CPU): --- CB

OpenGL Shading is 1.51 times faster than CINEMA 4D Shading!

These results were with a 1.2G 200 MHz FSB Tbird overclocked to 1.33G (10X133) on an IWill K266 (no, not even DDR) motherboard. The thing that totally blows me away is the excellent OpenGL performance (viewport, or shading results) with a low end 32M SDR GeForce2 MX card!
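The "times faster" line Cinebench prints appears to be just the ratio of the two shading scores; reproduced here from the numbers above (my own illustration):

```python
# CB scores from the Cinebench 2000 result block above.
c4d_shading = 12.25      # software (CINEMA 4D) shading
ogl_shading = 18.55      # hardware (OpenGL) shading

ratio = ogl_shading / c4d_shading
print(f"OpenGL Shading is {ratio:.2f} times faster than CINEMA 4D Shading!")
# → OpenGL Shading is 1.51 times faster than CINEMA 4D Shading!
```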

Below is a dual PIII rig:

Cinebench 2000 V1.0 Performance
****************************************************

Tester : XXX

Processor : PIII 1ghz
Number of CPUs : 2
Physical Memory : 512
Operating System : Win 2000

Graphic Card : Fire GL2
Resolution : 1280*1024
Color Depth : Truecolor

****************************************************

Shading (CINEMA 4D) : 8.85 CB
Shading (OpenGL) : 14.62 CB
Raytracing (Single CPU): 11.91 CB
Raytracing (Multiple CPU): 20.23 CB

OpenGL Shading is 1.65 times faster than CINEMA 4D Shading!
2 CPUs are 1.70 times faster than 1 CPU !

****************************************************

As the benchmark shows, the dual 1G PIII rig beats the 1.33G AMD system in multiple-CPU raytracing. The FireGL2 video card helps with the OpenGL performance!

The amazing thing is to see how (not so) well an expensive P4 system (loaded with 1G[!!!] of RDRAM and a GF2 Ultra) does in this benchmark, compared with either of the above cheaper, yet better performing, platforms...

****************************************************
Cinebench 2000 V1.0 Performance
****************************************************

Tester : XXX

Processor : Pentium 4 1.5GHz
Number of CPUs : 1
Physical Memory : 1GB RDRAM
Operating System : Windows2000 Pro

Graphic Card : nVidia GeForce 2 Ultra 64Mb DDR
Resolution : 1280 x 1024
Color Depth : 32 Bit

****************************************************

Shading (CINEMA 4D) : 12.05 CB
Shading (OpenGL) : 17.43 CB
Raytracing (Single CPU): 14.59 CB
Raytracing (Multiple CPU): --- CB

OpenGL Shading is 1.45 times faster than CINEMA 4D Shading!

Yep, I am not going to rush out and buy me a P4 anytime soon... Anyway, for what it is worth, a (relatively) bias-free presentation of benchmark results for your enjoyment. Dual 1G PIIIs are faster in raytracing than a 1.33G AMD (but don't even go there with a single PIII setup!).
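Pulling the raytracing numbers from the three result blocks together makes the comparison explicit (scores copied from above; higher CB is better, and the dual PIII gets its multi-CPU score; my own illustration):

```python
# Best raytracing score for each rig, taken from the posted Cinebench results.
raytrace = {
    "Athlon 1.33G (1 CPU)":  17.90,
    "Dual PIII 1G (2 CPUs)": 20.23,
    "P4 1.5G (1 CPU)":       14.59,
}
print("Fastest:", max(raytrace, key=raytrace.get))   # → Fastest: Dual PIII 1G (2 CPUs)
print("Slowest:", min(raytrace, key=raytrace.get))   # → Slowest: P4 1.5G (1 CPU)

# Dual-CPU scaling on the PIII rig: multi-CPU score vs single-CPU score.
speedup = 20.23 / 11.91
print(f"2-CPU speedup: {speedup:.2f}x")              # → 2-CPU speedup: 1.70x
```

The second CPU delivers a 1.70x speedup (85% parallel efficiency), which matches the figure the benchmark itself reports.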


Edited by P4Fool on 04/24/01 11:02 AM.
 

Raystonn
Distinguished, Apr 12, 2001
Another example of software made last year that was optimized for the Athlon and P3, but not the P4. A recompile would definitely change those results. Don't expect those results to last with the next version of their software.

-Raystonn

-- The center of your digital world --
 
Guest
>Another example of software made last year that was optimized for Athlon and P3, but not P4. A recompile would definitely change those results.<

Ray, the point of your post must be that I should buy a dead-end .18 micron P4 that can't perform today's tasks as well as a cheaper, yet better performing, system.

Face it. Right now the P4 is not a good purchase for the *thinking* consumer. How many forummers have admitted to buying a P4? In the future, the P4 may be a great performer worthy of the $$ Intel demands for it (much like the good old celery 300a). And when (and if) the P4 rules the roost, I'd be the first to consider buying a P4. But right now the P4 is just not an option.

You and Cyberguy crack me up about how great the current P4 incarnation is, and what a great buy it is. Meanwhile, I'll just sit back and see how this all plays out.
 

Raystonn
"the point of your post must be that I should by a"

I wasn't telling you to buy anything. What CPU you buy depends on many factors. Two of these factors are how good the CPU is and what kind of software is available for it at the time of purchase. We were only discussing the processor itself, not current software market conditions. These conditions will only improve with time anyway. Most people hang onto systems for 2-3 years at least. They'd get more bang for the buck in the future with a P4 purchase in the present.

-Raystonn

 
Guest
>They'd get more bang for the buck in the future with a P4 purchase in the present.

Since we're doing rampant speculation here, I'll speculate that the next generation of AMD chip with SSE2 support will be the better bang for the buck in the future.

And my speculation is just as irrelevant as yours.


In theory, there is no difference between theory and practice.
In practice, there is.
 

Raystonn
"I'll speculate that the next generation of AMD chip with SSE2 support will be the better bang for the buck in the future"

That's fine but you can't buy one of those right now. The P4 is available for purchase.

-Raystonn

 
Guest
>That's fine but you can't buy one of those right now. The
>P4 is available for purchase.

But you can't get P4-optimized software yet in most cases, which is exactly my point. I'll believe it when I see it. Or, more likely, when I try it.



 

khha4113
Distinguished, Dec 31, 2007
<b>They'd get more bang for the buck in the future with a P4 purchase in the present.</b>
So wouldn't it make more sense to wait until there is <b>SSE2-optimized software</b> than to buy right now? And the P4's price is dropping!
 

Raystonn
Of course it makes sense to wait. You always get better technology in the future. However, some people need a new computer _now_, and they need this computer to last them for 3-5 years.

-Raystonn

 
Guest
I usually ignore these P4 opinion posts, but I have the following comments:

1. I agree Aces is a civilised, knowledgeable site with mature posts compared with THG.

2. Take comfort in the fact that hysteria, as demonstrated by CBERIMAGE and others, is the final refuge of the inadequate. Don't ignore them... refute them logically; then they only win in their own minds.


Edited by frikkie on 04/25/01 01:06 PM.
 
Guest
>We were only discussing the processor itself, not current software market conditions<

No, Ray, actually I was discussing how the PIII, Athlon, and P4 benched in a popular mid-level 3D animation program. You, on the other hand, were discussing the processor (P4), ignoring current software market conditions (i.e., no targeted P4 optimizations), and presenting as fact that the next software update for Cinema 4D would be specifically optimized for the P4. Quite a bit different from your quote above, wouldn't you say?

If you had kept to a strictly processor-related discussion, as your quote alludes to, your only possible conclusion would have been that the 1.5G P4 lost to a dual 1G PIII setup in raytracing, and lost to a 1.33G Athlon in both raytracing and viewport rendering.
 

Fltsimbuff
Distinguished, Mar 30, 2001
Does this forum have moderators? I'm getting sick of this kind of crap. Personal attacks and insults on other forum members resulting from a difference of opinion are a pathetic way to prove a point... This is a CPU HARDWARE discussion, on the Tom's HARDWARE forum.
Tonestar: You really ought to take a look at your posts, and then decide if anyone is actually going to believe that you are a mechanical engineer rather than a perverted schoolboy. Comments like those you have made have no place here, and they have strayed WAY off topic. Please show some respect for other members like myself, who are sick of you and your mudslinging. No matter what "side" you are on, I think most can agree that your posts are worthless to the forum, because they have very little worthwhile info.

--Fltsimbuff
 

Fltsimbuff
Well, considering that the current P4 socket is about to change in a few months, I seriously doubt that any P4 they buy now would last them 3-5 years... You just pretty much stated that the P4 is the CPU of the future. Well, *Northwood* may be, but the current Willamette will soon be obsolete. That means that if someone wants a CPU of the future, they ought not go for the current P4... Better to go for an Athlon rig and make less of a dent in their $$.

--Fltsimbuff
 
Guest
tonestar hasn't posted in two days.
I wonder if Fredi shut him down?


 

Raystonn
"considering that the current P4 Socket is about to change in a few months, I seriously doubt that any P4 they buy now would Last them 3-5 years"

What does the socket have to do with how long they keep their current computer system before upgrading? Absolutely nothing. It has more to do with whether the processor, as is, can execute the latest software without falling miserably behind.

"Better to go for an Athlon Rig"

Athlons and P3s are at the end of their lifecycles. Future software will only perform better and better on the P4, and worse and worse on Athlons and P3s.

-Raystonn
