Thoroughbred to Barton, then what for x86?

Hmm, I remember reading somewhere that the original 3DNow! instruction set actually included some of the SSE instructions. However, 3DNow! Pro does have more instructions than SSE, so that would mean SSE is a subset of 3DNow! Pro, and 3DNow! Pro probably contains some SSE2 instructions, hmmmmm..... 😱

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
 
They will check for 3DNow! support and load that module if it is present. If not, they will then check for SSE support and load that module instead. This is simply the way it has always been done.
How exactly was LightWave optimized for Intel processors?

If it was compiled using the Intel compiler, it should perform some 20-30% better on the Athlon, just like every other application compiled with the Intel compiler, whether the Athlon processor supports SSE or not.

If it was rewritten in hand-coded assembly according to Intel guidelines, and SSE performs better in this scenario, then why did Intel not advise prioritizing the SSE/SSE2 optimizations? Did they specifically advise that (hypothetically) slower 3DNow! optimizations be used for AMD processors?

And out of all this, note that no one has <i>ever</i> publicly mentioned 3DNow! support in Lightwave, so the effect of possible 3DNow! support modules is probably not even relevant. The fact of the matter is, the Athlon should perform a fair sight better with the newer Lightwave, regardless of the Athlon's SSE capabilities. In all probability, the only reason it doesn't is that the program was specifically written to get no performance gain on an Athlon.

<i>If a server crashes in a server farm and no one pings it, does it still cost four figures to fix?</i>
 
3DNow! and SSE are fundamentally different. However, the original Athlon and T-bird did carry SSE instructions as part of their enhanced 3DNow! instruction set. They simply could not advertise themselves as fully supporting SSE (or set the appropriate bit in the feature register) without an SSE license.

<i>If a server crashes in a server farm and no one pings it, does it still cost four figures to fix?</i>
 
Actually, it depends on how you define 3DNow! Unlike Intel's naming scheme, the 3DNow! name covers all of the SIMD floating-point instruction sets the AMD Athlon (XP) processor has.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
 
Well, aside from what others said, the Athlon XP has SSE, Ray, so your point about Athlons underperforming for 3DNow!-related reasons is not valid. I find it laughable that Newtek would be in cahoots with Intel. If SSE is in the Athlon XP now, then how do you explain Celerons beating Athlon XPs?
I agree with Kelledin and whoever else realizes the program is faking it. Simply put, they are rejecting AMD processors, because Lightwave 7 was fully Athlon dominated. And if the Celeron has no SSE2, then that calls for even further investigation.

--
For the first time, Hookers are hooked on Phonics!!
 
Eden, if the application looks for 3DNow! extensions and uses the 3DNow! module instead of the SSE module, then SSE is not going to be used. In essence, many applications will be harmed by the fact that the processor _does_ support 3DNow!. Many apps are designed to choose between 3DNow! and SSE. None are currently designed to use both.

-Raystonn


= The views stated herein are my personal views, and not necessarily the views of my employer. =
 
Then that is a cheap attempt by Newtek indeed. Why on earth would a perfectly well-performing processor suddenly be put on trial by being forced to use 3DNow! one version later, if it was doing so damn well before?
Big mistake by Newtek, and again, this shows Intel is behind it. I am sure of it, because in no way do you damage a processor's performance one version later just to see the Intel family win.

--
For the first time, Hookers are hooked on Phonics!!
 
Then that is a cheap attempt by Newtek indeed. Why on earth would a perfectly well-performing processor suddenly be put on trial by being forced to use 3DNow! one version later, if it was doing so damn well before?
Were Athlon XP processors doing better with the prior version of Lightwave 3D, or are you talking about the non-XP Athlons?


Big mistake by Newtek and again, this shows Intel behind it.
Sorry, not true. Intel has nothing to do with Newtek.


-Raystonn


= The views stated herein are my personal views, and not necessarily the views of my employer. =
 
Of course, but since I could not find any benchmarks from before version 7, I can only rely on those who had seen it previously, in the main version.
It's 11 PM here, school tomorrow, sorry I have to give up searching for now!
Gnight all.

--
For the first time, Hookers are hooked on Phonics!!
 
Of course, but since I could not find any benchmarks from before version 7
<A HREF="http://www.blanos.com/benchmark/" target="_new">http://www.blanos.com/benchmark/</A>

<i>If a server crashes in a server farm and no one pings it, does it still cost four figures to fix?</i>
 
I don't know which IA-64 paradise you live in, but by the time the Itanium, which sells for around 2500-3000 USD, drops to three digits (if ever), a fast Duron or Celeron will offer more performance than it.
Even TODAY the Northwood 2.2GHz is rated higher on SPEC_FP, at around a fifth of the cost and half the power consumption...
The same probably goes for the Athlon XP 2100+.

Bringing its integer performance into the picture: a GHz-level Celeron probably outperforms it on integer tasks, which ironically makes the Celeron a better SERVER CPU than the Itanium (servers often depend on integer performance, with no use for floating point), if you don't need the 64-bit addressing, that is... and if you do need it, there are still much better solutions than the Itanium; many RISC processors offer 64 bits.

Don't hype its clockspeed/performance ratio either; the current Itanium core can't go any faster... and if you look at other RISC processors, there are 4 or 5 that perform better than the Itanium at around the same clockspeed (I think Sun's UltraSPARC, two Alphas, and IBM's Power4).

No wonder almost no one bought the Itanium anyway (well, around 200 were sold), and no one will continue to buy it.

That's Intel for ya...


This post is best viewed with common sense enabled<P ID="edit"><FONT SIZE=-1><EM>Edited by iib on 03/18/02 08:22 PM.</EM></FONT></P>
 
The Itanium is Intel's first try at a 64-bit processor, however. Intel will most definitely have a faster, more efficient 64-bit processor in the future. I predict the Hammer will eventually force Intel to enter the x86-64 desktop market in 2003 or 2004. Then we'll truly have some excellent 64-bit apps.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
 
I don't know if *force* is an appropriate term here. I think that if the Hammer catches on and becomes a force to be reckoned with from the outset, Intel may have the necessary impetus to pursue their own x86-64 chip more aggressively. However, if Microsoft doesn't put their weight behind the Hammer to some extent, or if the Linux kernel update supporting x86-64 doesn't catch on, I really don't see a compelling reason for Intel to pursue their own variant any faster than they already are.

While the Hammer will supposedly perform about 15% better in 64-bit mode, if it is able to ramp as expected and performs as expected out of the gate, AMD should be in fine shape either way.

In other words...there's a lot of *ifs* involved.

Mark-

When all else fails, throw your computer out the window!!!
 
If it's their first try, I still find it laughable that they didn't go further, given that they have 10 times AMD's resources.
Not only that, but the Hammer is also AMD's first try at x86-64, and so far it seems very nice and steady, yet they have a tenth of Intel's resources and money, probably less.
They are being forced, by the way; Yamhill, anyone?
The real question is: will AMD reserve all rights to x86-64, and thus control whether or not Intel can use it? Will AMD play the devil and not let Intel get a license for it? Don't forget, AMD will have x86-64 out far before Intel, so we'll see how AMD feels like playing.

--
For the first time, Hookers are hooked on Phonics!!<P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 03/18/02 04:26 PM.</EM></FONT></P>
 
I believe AMD has made the x86-64 instruction set specs open, so any CPU developer can develop their own CPU to work with the x86-64 spec. I think that's what the rumored Yamhill is doing.

Mark-

When all else fails, throw your computer out the window!!!
 
Not licensing x86-64 is bad! It will kill the future of 64-bit desktop computing. We need competition in the desktop 64-bit market and IA64 is too different to make the transition smoothly.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
 
If it's their first try, I still find it laughable they didn't go further
So you are saying you could have done better given the same resources. Quite the ego.


Not only this but Hammer is also AMD's first try at x86-64
The 64-bit extensions in the Hammer are just that, extensions. AMD did not design a new instruction set that is optimized for 64-bit operations.


The real question is, will AMD reserve all rights for x86-64, and thus can control whether or not Intel can use that? Will AMD play the devil and not let Intel get the license for that? Don't forget, AMD will have x86-64 far before Intel, so we'll see how AMD feels to play.
x86-64 was released as an open standard. It does not have to be licensed. AMD will not get a penny if others release x86-64 processors.

-Raystonn


= The views stated herein are my personal views, and not necessarily the views of my employer. =
 
My mistake then, but I remember someone saying something about it being licensable, like SSE. Although AMD was able to get SSE from Intel easily enough.

About the Itanium, I'd say that with the resources they have, I would have developed something that at least doesn't run this hot, doesn't draw this much power, performs much better, and isn't priced at $3000 or more. I'm surprised you don't find it odd that Intel didn't go further than they usually would have.

--
For the first time, Hookers are hooked on Phonics!!
 
Well, you'd need a degree in computer engineering too. :wink: Ahh, I still don't know if I want to study computer engineering (hardware) or computer science (software) at university.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
 
*sigh*
Right now, I am not enjoying computer systems engineering... I think I might like it better later on... when I actually get to do something computer-oriented.

:wink: Engineering is the science of making life simple, by making it more complicated.
 
Are you in your first year? May I ask what you are learning right now then?

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
 
1st year, 2nd term... mostly generic engineering stuff this term...
Multivariable Calculus
Linear Algebra
Physics: Electromagnetics and Waves
Computer Problem Solving (C++)
Applied Language Studies (Technical Writing)

You can see my coming years <A HREF="http://www.carleton.ca/cuuc/programs/Computer_Systems_Engineering_Program.html" target="_new">here</A>

... I am starting to think I should have gone to Waterloo... but I didn't like it there... and it was too close to home... plus they gave me lots of money to go to Carleton...

:wink: Engineering is the science of making life simple, by making it more complicated.