
The Return of Intel's Pentium MMX

  • Thread starter: Guest
Status: Not open for further replies.
They should have based it off Core 2 Duo =P

Then it would own even more
 
Interesting that Intel is going back almost 20 years to try and compete with Nvidia and AMD/ATI's current (or near-future) technologies. It would be great to get one more player in the GPU market. AMD is forcing Nvidia to lower their prices; can you imagine what could happen if Intel became a third player in the GPU market? SWEET!
 
thogrom, Core 2 is based on Core, which goes all the way back to the Pentium Pro, which had its roots in the Pentium MMX. So in a way it is based on the same technology that Core 2 is based on.

Funny thing, kinda related: I have an old Pentium w/MMX sitting on my desk. It's a reminder of where we used to be (75 whole MHz, YAY!!!!)
 
This is very good news. For years I've been saying that Intel should dust off the 386 and put dozens of them on a single die. It looks like they've done me one better and gone with the Pentium. This is a very smart move for Intel because it accomplishes three things:

1) They don't have to develop a new core for the cGPU.
2) The cGPU will use x86/x64 instructions, making it far easier for developers to write and debug code targeting it.
3) Developers will learn to employ proper threading techniques to utilize 32 cores. This means they'll learn the skills necessary to truly take advantage of modern CPUs (see the sketch after this list).
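To make point 3 concrete, here's a minimal sketch of that kind of data-parallel threading, in plain C++ with std::thread; nothing Larrabee-specific is assumed, just the general pattern of splitting one big job across however many cores you have:

[code]
// A minimal sketch: split one big array across however many hardware
// threads the machine reports -- the basic pattern a 32-core part rewards.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<float> data(1 << 20, 1.0f);   // 1M elements to sum
    std::vector<double> partial(cores, 0.0);  // one slot per worker, no locks
    std::vector<std::thread> workers;

    const std::size_t chunk = data.size() / cores;
    for (unsigned t = 0; t < cores; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == cores) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            // Each core sums its own contiguous slice, independently.
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("sum = %.0f\n", total);  // expect 1048576
    return 0;
}
[/code]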

A side effect of all this is that a computer could be built solely around Larrabee. Over on my blog at ITtoolbox I've pontificated about the possibility of building GPU-based computers, and this could be a great place to start.
 
It will definitely run x86, because it was developed on the x86 architecture. If Intel manages to turn it into a powerful GPU, it simply means a powerful GPU was already developed two decades ago but was only used for general-purpose computing.
 
LMAO, I still have my old Pentium 233MHz MMX Socket 7 CPU. Can you believe I got that thing to overclock to 333MHz back in 1998? I ran a duct from outside, where it was 20 degrees F, to the CPU. It would scream through Quake 2 and Tomb Raider 2, which was fast back then. But today the CPU is obsolete unless you want to run Windows 98 or 2000. Who would have thought that Intel would use the design for a GPU.
 
I remember my first PC was a P200MMX, with Windows 95 and a Tseng Labs ET6000 PCI video card. I could not find anything at the time that would not run on that machine. In fact, I still have it today and it still works!

Very interesting idea to put 32 Pentium cores on a single die/package. Parallel processing is definitely the direction to go; it's just a shame that even today SO MANY applications out there suck on multi-core systems.
 
Pentium MMX!?! That was my first computer (now my mother's 😛), with a nice Voodoo 2 video card lol 😉

Seriously, if it delivers the performance they're hoping for, I'll be the first one surprised.

 
Wow... just... wow. I really don't expect much from this. It is a combination of everything that Intel falls horribly short in:


Good multi core design
64bit performance/compatibility
Graphics performance
Actually innovative design, period.

When they were designing the Core 2 Duo/Quad chips, for which they now churn out a new flavor, socket, and chipset every 3 or 4 months at obscene prices, yes, they went back to the PIII and improved on that. But those of you tracing the connections from Core 2 to Core to Pentium Pro back to the Pentium MMX and saying "that powerful GPU was developed 20 years ago but only used for general-purpose computing"... are you guys serious? Are you actually impressed by those analogies, or just trying to make Intel sound better than they are?

First off, it's pretty much a given that each new generation of CPUs is in a minor or major way based on an earlier design. Working off that logic, AMD and Intel have ties going back to chips that ran in warehouse-sized computers.

You could also make the claim that AMD has ties going back to those chips, since the AMD K5 and the Intel Pentium 75MHz-133MHz shared the Socket 5 platform, and Socket 7 was shared by the Intel Pentium, Pentium MMX, and AMD K6. It could also be said that Intel's 64-bit chips are based on AMD designs... since Intel had to license the instructions from AMD to even get into the 64-bit game.

But then of course AMD's 64-bit chips were a direct result of Alpha designs, Alpha having been the first 64-bit RISC CPU back in 1992. Yes, that's how long 64-bit CPUs have been around: since '92. The AMD Socket A series of CPUs was actually based on the 64-bit Alpha, only with increased cache sizes and clock speeds, using 32-bit extensions. How could that be? Because the guys who designed the 64-bit chips at DEC's Alpha group went to work for AMD. Socket A was a placeholder while they focused on the Opteron, based even more closely on the 64-bit Alpha.

Sure, I suppose you could look at the Core/Core 2 series as a testament to the longevity of Intel's design skill and innovation. But more realistically, it should be screaming the fact that between February 26th, 1999, with the release of the Pentium III 550MHz, and the release of the Core 2 (January 5th of 2006, I believe), they didn't release a product with ANY staying power.

That's nearly 7 years without progress... or, if you want to nitpick, between November 2000 with the release of the P4 and January 2006, everything they designed was crap. Which is pretty much true. They had to go all the way back to the last millennium to find something worth building on. Am I the only one who gets how sad that is? Endlessly deep pockets, vastly greater resources than AMD, and 7 years of work that wasn't worth paying attention to. This is what the AMD/Intel war has been like going back 15 years or more.

Intel makes some groundbreaking chip, revitalizing the fading platform generation.
Everybody wants it.
They muscle out the competition.
They milk their good idea for all it's worth while ignoring the competition.
They wake up to find the guy 1/10th their size has leapt light years ahead of them.
They rely on muscle and the power of marketing to sustain their sales for the next several years while they come up with another great product to revitalize the fading platform.

Intel is the reason there is such shoddy multicore application support, and 64-bit application support. Multicore was intended for 64-bit. But Intel couldn't play on that level; they knew it, the competition knew it, and the hardware vendors knew it. So they stalled. They came out with Core 2, which is meant for 32-bit, because really, no one needs 64-bit stuff, it's all just hype. So what if we say it's hype because we can't make stuff like that? So what if we say it's hype because the little guy we laughed at for so many years, who has been beating our skulls in at 32-bit stuff as it is, gets another 15-20% performance in 64-bit applications? Why should you upgrade your hardware to take advantage of new software?

Instead you can upgrade to new hardware that makes all the software you know and love run REALLY REALLY FAST! We think you'll love it so much we'll let you spend money on it every 3-6 months, and lots of it!
 
Pentium MMX 75MHz, I had one until I sold it to a student who wanted to use it for school and study (in our area, of course). Imagine, it's still working and running to this day, with Windows 98! We're nearing the age where the technology shown in science fiction movies comes to life!
 
AMD/ATI vs Intel vs Nv

It's gonna be very interesting; it looks like Nv is left out in the cold as the technology moves toward CPU/GPU fusion.

That won't happen soon, but I totally hope for a PC makeover: remake the motherboard completely, make it small and powerful, lose the weight of a desktop yet gain more power, with less energy consumption, please.

MOVE FORWARD >>>>>> The PC hasn't really MOVED in terms of design, and Windows needs to MOVE FORWARD as well; hopefully Windows 7 isn't "Windows Me 3".

So AMD/ATI, Intel & Nv: good luck.

Key point: no graphics card is worth more than USD $250; anything higher is just nonsense and a rip-off.
 
[citation][nom]cruiseoveride[/nom]great, but does it work on Linux?what are windows users going to do with 32cores anyways..../hides[/citation]
If I'm not mistaken, this will be an add-on card, so it will work anywhere (with a PCIe slot). Intel is likely to provide some reference "software" renderer in the drivers, but you should be able to write and use your own renderer with this card (raster/raytracing/whatever). So it's just a matter of writing Linux drivers that let you communicate with the card over the bus (upload your code and run it).
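Purely as illustration of that "upload your code and run it" idea: every name below is hypothetical (Intel has published no such driver interface), with the "driver" calls stubbed out so the sketch at least compiles and runs:

[code]
// Hypothetical sketch of the "upload your renderer, run it over the bus"
// flow described above. None of these names come from Intel; the "driver"
// calls are stubs standing in for whatever a real kernel interface exposes.
#include <cstddef>
#include <cstdio>
#include <vector>

struct LrbDevice {};  // imaginary device handle

LrbDevice* lrb_open(const char*) { static LrbDevice d; return &d; }    // open device node
bool lrb_upload(LrbDevice*, const void*, std::size_t) { return true; } // pretend to DMA the code
bool lrb_run(LrbDevice*) { return true; }                              // pretend to start the cores
void lrb_close(LrbDevice*) {}

int main() {
    // Your own renderer (raster, raytracer, whatever), built for the
    // card's x86 cores and shipped as a flat binary blob.
    std::vector<unsigned char> renderer(4096, 0x90);  // placeholder bytes

    LrbDevice* dev = lrb_open("/dev/larrabee0");      // hypothetical node
    if (dev && lrb_upload(dev, renderer.data(), renderer.size()) && lrb_run(dev))
        std::puts("renderer uploaded and running");
    lrb_close(dev);
    return 0;
}
[/code]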
 
LOL, I have a Pentium 120@133 as a file server, and one Pentium MMX 233 which is currently unplugged but going to be plugged in very soon. :-D
 
Sorry for the second post but I forgot what I wanted to say. ^_^;

The most fascinating thing is: you take 32 Pentium MMX processors, do some adjustments (make them 64-bit), and can compete with ATI and Nvidia?
What exactly were these companies doing all those years (just inflating the chip)? Or has Intel been working on this for a long time already (several years) and making really big changes to the old core?
 
Moriarity, I believe you could get high performance because you can do a hell of a lot of parallel computing during raytracing: one or a few rays per core at a time, all being calculated without depending on any other data that could bottleneck it all.

Stream processors work in the same way, but you still have to do a lot of presorting, culling, z-buffering, et cetera. Maybe raytracing can do without all those prerequisites and dependencies.
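To make the "one or a few rays per core" point concrete, here's a minimal sketch, again in plain C++ with std::thread (the one-sphere scene and scanline interleaving are my own simplifications, nothing Larrabee-specific): each pixel's ray is computed with no data shared between threads, which is exactly why raytracing parallelizes so naturally:

[code]
// Each pixel's ray is tested against one sphere, independently of every
// other ray -- the embarrassingly parallel property described above.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

struct Vec { float x, y, z; };

// Ray-sphere intersection test via the quadratic discriminant (a == 1
// because the ray direction is unit length).
static bool hitSphere(Vec origin, Vec dir, Vec center, float radius) {
    Vec oc{origin.x - center.x, origin.y - center.y, origin.z - center.z};
    float b = 2.0f * (oc.x * dir.x + oc.y * dir.y + oc.z * dir.z);
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - radius * radius;
    return b * b - 4.0f * c >= 0.0f;
}

int main() {
    const int W = 256, H = 256;
    std::vector<unsigned char> image(W * H);
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < cores; ++t) {
        pool.emplace_back([&, t] {
            for (int y = (int)t; y < H; y += (int)cores)  // interleaved scanlines
                for (int x = 0; x < W; ++x) {
                    // Camera at the origin, ray through this pixel.
                    float px = (x - W / 2.0f) / W, py = (y - H / 2.0f) / H;
                    float len = std::sqrt(px * px + py * py + 1.0f);
                    Vec dir{px / len, py / len, -1.0f / len};
                    image[y * W + x] =
                        hitSphere({0, 0, 0}, dir, {0, 0, -3}, 1.0f) ? 255 : 0;
                }
        });
    }
    for (auto& th : pool) th.join();
    std::printf("center pixel: %d\n", image[(H / 2) * W + W / 2]);  // expect 255
    return 0;
}
[/code]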
 
Wow, even if they fail miserably I feel this could still be a huge achievement. Think how much better and easier it will be for companies limited to using onboard graphics! Instead of finding some motherboard which is EXTREMELY limited and costs a crapload, you could just buy any old two-bob motherboard and chuck one of these babies in it. Saves money, time, and possibly electricity.
I couldn't see it beating discrete graphics in any games for a long time. But there will be a time when discrete graphics is obsolete. A LOOOONG time from now... when we all have supercomputers implanted in our brains and telephones in our shoes.
 
Man, that must be exciting for Intel engineers. Hey guys, let's see if we can get this 10-year-old design to be a decent GPU! Anyone else remember reading about Intel dumping its high-priced engineering talent about 6 years ago? Seems it was true.
 
Re: Iocedmyself's wall of text

I think everyone is jumping the gun a bit here. If I said I designed a car engine by going back to Faraday's 1821 designs, I don't think those who knew what I was talking about would assume I was using obsolete technology. Sometimes it's best to just clean up, go back to a simple design, and improve on it.

You have to consider the design challenges they addressed between that generation and the current one. They weren't thinking about scaling up core counts in that time frame; they were adding extensions that won't be used by Larrabee, coming up with cache access algorithms, and adding logic circuits to handle the increasing size of the chip and the latency it caused. These are all problems that don't need to be addressed in a slimmed-down core.

They're simplifying: less potential for problems, less overhead, more room for more cores. And I doubt they ignored any advances from current architectures that could be applied.
 