I'm still amazed that you aren't handed a book of matches when entering this forum, and I only skimmed a few pages.
So first and foremost... did anyone ever figure out the increasing popularity of the French language? I find it hard to picture people who aren't from France (or Canada) waking up one morning and proclaiming "I want to be more like... the French."
Second, I know this was from a while ago, but my take on the i5 series' unpopularity (at least at first; I haven't been paying much attention to tech circles the past few months) is that it's the ongoing trend of "new entry-level chip... new socket requirement that eats up the price difference versus the chips that still have Hyper-Threading enabled!" Intel makes great strides in a wide variety of tech fields (SSDs, Ethernet, bus/interface improvements), and they produce ungodly huge volumes of silicon, but their business model amounts to "GIMME THAT, IT'S MINE; that's not fair, we can't figure out how to do that yet, so why can they?; die shrinks = new socket designs; if you play with those other guys we won't share our toys with you, Mr. OEM," plus, of course, putting all their products on the lowest shelf so their customers have to bend over, making it easier for Intel to "service" them. They accept that the majority of their customer base doesn't know the difference between a Mac and a Dell, and they embrace it with half-truth marketing and shiny distractions.
AMD, on the other hand, has almost no marketing department and a twentieth of the resources, yet still manages to be the more innovative of the two. They consider what's best for the majority of their customer base in their designs and are genuinely more interested in advancing the field, no matter how many times they nearly go broke successfully breaking new ground where Intel failed.
We've had 64-bit hardware and OSes at the consumer level for seven years. Intel couldn't compete in that field, so they denounced it, dismissed it, and focused their effort on distracting everyone with shiny marketing gimmicks. Vista was supposed to be exclusively 64-bit, and DX10 was supposed to include tessellation; it was a selling point, in fact, that MS coordinated with AMD/ATI to ensure the HD 2900 card would be ready for the Vista launch, tessellation and all, and that Vista would be optimized around the A64. Intel's chipsets can barely render XP, let alone meet new and improved standards, so they bitched until tessellation and a couple of other features were pulled from DX10 so their chipsets could keep limping along. Same deal with why we got a 32-bit version of Vista: Intel couldn't break through the 64-bit barrier, so they had to be accommodated yet again, even after doing all they could to keep people running XP on their 32-bit chips. Vista got earmarked as horrible because the 32-bit version sucked and blew simultaneously, since it was never designed with those limitations in mind, and DX10 ended up useless thanks to insignificant feature improvements and the 80-some percent of users still on XP, which didn't support it.
Yet somehow Intel is praised for taking the Pentium Pro and P3 designs and turning them into the C2/Q. It was a great 32-bit chip, and it made AMD look horrible, since A64 chips took a 20% performance hit in 32-bit OSes... but I can't get around how much worse it makes EVERY chip between the P3 and the C2 look, because even Intel had to admit there was no salvaging the numerous designs they'd marketed the hell out of for the better part of a decade with their fingers crossed all the while.
The i7 is an impressive chip performance-wise... if someone gave me an i7 system I'd probably use it, but not if it means accepting that it'll be bested by the next tock in the cycle at a fraction of the price in another 6-9 months and require a system overhaul. Besides, it's overkill for pretty much anyone who doesn't do rendering, extensive video editing, or complex scientific simulations. Nice to have, but not justifiable for the majority. Still, they made a great chip... and all they needed was to license x86-64, the IMC design, and HyperTransport (to tinker with and rename QuickPath), plus detailed instructions from AMD on how to make a single piece of silicon host four CPU cores. They did throw in that extra memory channel, and they brought Hyper-Threading back from the P4 hall of shame. But still... a very impressive display of ingenuity. Baffling that the higher-performing i7-based server chips, limited to dual-socket/8-core servers, got overlooked in favor of AMD's 48-core, 8-socket configuration, which was backwards compatible with existing motherboards, to the point of AMD selling out of the six-core chips three weeks before the official launch.
Itanium, Timna, the P4, and Larrabee: Intel's biggest CPU investments. The first bombed, the second failed through multiple revisions, the third performed half as well as it should have and clocked a third as high as promised, and the most recent had one of their longest-serving employees, the man in charge of the project, throw up his hands after multiple hardware delays, poor yields, design complications, and performance issues, saying in effect, "It's hopeless and never going to work, and I don't want my name attached to it when it crashes and burns at launch. I quit."
It was never going to be a successful GPU; anyone with basic math skills should have figured that out after the first round of articles on the specs. It has great potential as a GPGPU, but for now its consumer launch is postponed indefinitely.
Meanwhile, AMD is "rumored" to have a cloud-computing prototype built on an 8-socket board: four sockets occupied by 12-core (6x2 in a single package) server chips, and the other four by specially designed low-power quad-core 4870 or 4890 GPUs, for 25+ teraflops of GPU computing power across four sockets. But I'm sure Larrabee could catch up... years too late, I'm sure, but still.
People can only get screwed by Intel so much before they end up pounded a bit too raw. But it doesn't much matter, so long as AMD keeps the trend going with their GPU designs.
"in soviet russia, screws intel you!