jennyh :
You mean like getting Crytek on board with CryEngine 3 Eyefinity, and Codemasters holding back DiRT 2 for DX11, that kind of thing?
Eyefinity support is just that: support. It means NOTHING in terms of optimization work on the game's engine. It just means the game will support Eyefinity.
The last game I knew to have the ATI logo on it was Valve's Half-Life 2, and that was back in 2004. It's also why Source-engine games tend to do better on an ATI GPU even when nVidia has the better card, and it's also why I continue to go ATI, besides the fact that I get better performance for the price.
And DiRT 2 being held back for DX11 is not because of ATI. It's because of Microsoft; ATI just has the hardware support for it now. Microsoft is the creator of DX11, and the company wants to see it utilized. Hell, what if DiRT 2 is held all the way until nVidia releases its G300-series chips? Does that mean Codemasters is working with them? Nah. They just want to be able to utilize DX11 without patching it in later, because that usually ends in a mess.
Lol.
Upendra09 :
Neither did I, but Fazers and Jenny are in their own worlds, so it doesn't matter.
A world of who knows what.
Jelly.
iocedmyself :
Ah, nothing like the endless bickering of forum-goers of varying degrees of tech savvy. It's been a while since I've felt like talking about how all these projects are really faring and what new ones are on the way, but I was delighted by the departure of Intel's most senior engineer, who, faced with being attached to an "incredible" RT demo topping out at 25 fps at flip-book quality, decided, "Well, thanks, it's been fun, but I'm not staying on this project for another year."
No, there aren't any planned layoffs at AMD, though the upper crust has had to tighten their belts.
AMD is... 5 months or so ahead of schedule for its double-digit-core chips. I thought it was impressive that they got their six-core server chip out when they did; so did everyone else, apparently, since it sold out three weeks before launch. That's probably when they realized that no matter how much faster Intel's quad server chips may be in single-socket configs compared to AMD's six-core, i7-based server chips are limited to two sockets... eight cores in a server... while the six-core AMD chips were drop-in replacements for existing 8-socket server boards, for a 48-core system.
How many GPUs can AMD hope to fit in a CPU package? Four. Not surprisingly, having 12 CPU cores and upwards of... let's say 15 teraflops of rendering power in one system lets 8-10 people do quite well with collaborative work on that one system. Bet the die shrink will do wonders.
Now, as for Larrabee: as I said from the announcement, and as simple math in the architecture reviews last Christmas showed, it's not going to be here anytime soon, and it's not going to perform anywhere near what was claimed. That was showcased quite well a few days ago.
Yes, JDJ, before you explain it again: I do get what Larrabee will end up being, something that does decently at specialized linear computing tasks. But what no one seems to acknowledge is that not only has it been three years, they were also pushing over 300 W with their 24-core design. Then, whoops, they realized that wasn't going to compete in the graphics-card market, so they upped it to 32 cores, which, even if they clocked the cores at the 2 GHz they claimed they would, would still have left it a good 15-20% shy of the processing power of the 4870X2.
But oh no! They still can't get all those cores clocked to 1.4 GHz successfully, so now they're going to need to bump it up to 48 cores by moving to 32 nm. Surely that will go well when, above all else, they've been suffering from horrible yields at 45 nm.
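The "15-20% shy of the 4870X2" figure above can be sanity-checked with back-of-envelope math. This is only a sketch: it assumes Larrabee's published 16-lane vector unit with fused multiply-add (32 single-precision FLOPs per core per cycle), and it compares raw peak throughput only, ignoring efficiency, memory bandwidth, and everything else that decides real game performance.

```python
# Back-of-envelope peak single-precision throughput comparison.
# Assumption: Larrabee core = 16-lane vector unit with FMA,
# i.e. 16 lanes x 2 FLOPs = 32 FLOPs per core per cycle.
def peak_tflops(cores, clock_ghz, flops_per_core_cycle):
    """Peak TFLOPS = cores x clock (GHz) x FLOPs per core per cycle / 1000."""
    return cores * clock_ghz * flops_per_core_cycle / 1000.0

# 32-core Larrabee at the claimed 2 GHz
larrabee = peak_tflops(cores=32, clock_ghz=2.0, flops_per_core_cycle=32)

# Radeon HD 4870 X2: 2 GPUs x 800 stream processors at 750 MHz,
# multiply-add = 2 FLOPs per SP per cycle
hd4870x2 = 1600 * 0.750 * 2 / 1000.0

shortfall = 1 - larrabee / hd4870x2
print(f"Larrabee: {larrabee:.2f} TFLOPS, 4870X2: {hd4870x2:.2f} TFLOPS, "
      f"shortfall: {shortfall:.0%}")
```

On those assumptions the 32-core part lands at roughly 2.0 vs. 2.4 TFLOPS, about 15% short, which is where the quoted 15-20% range comes from.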
As I saw mentioned before, yes, AMD does license x86 from Intel... and Intel has a cross-licensing agreement to use x86-64... as well as HyperTransport and the IMC, which is probably why the silicon looks so much like the Phenom/II's.
I don't really care how fast the i7s are, and with the i5s needing a new motherboard, its cost may as well be included in the CPU price. AMD, on the other hand, continues to offer backwards compatibility for yet another CPU design when six-core desktop chips come out in a few months.
Intel has always been great at designing hardware standards (SATA, USB, PCI-E, Ethernet, SSDs)... but they have failed at innovation in the CPU field for a good long time... they'd rather stall market growth and wait to cash in on what the competition is doing. Which is why we didn't get tessellation with DX10 and the HD 2900: Intel couldn't meet that standard. But hey, so what if it means being able to take 4890-class performance at 1080p gaming resolution and more or less maintain that performance running a 5870 at 7680x3200?
We have no idea if AMD plans any layoffs. Normally it's not announced until it's about to happen. And if it does happen, I wouldn't be surprised.
As far as we know, AMD's 12-core is not planned for a while, and it's set to be an MCM like Intel's C2Q was. Gotta love the hypocrisy there: hate on Intel's MCM approach, then use it. Woot.
And dude, having 48 x86-based cores on one piece of silicon clocked at 1.4 GHz is quite amazing. Does no one remember their 80-core Terascale chip? Eighty cores, all clocked at 2.5 GHz and then easily pushed to 5 GHz. That's why I laugh when people doubt that Intel can get 48 cores above 1 GHz: they have done it before, with more cores.
And horrible yields at 45 nm? For whom? Intel? You must be crazy. Last time I checked, their yields were good enough that they could release chips at a pretty low starting price compared to their 65 nm quads and duals.
And the last time I looked at a Phenom/II and a Core i7, they looked quite different. Intel's IMC started with DDR3, but we can still thank the DEC Alpha for the IMC. And if you actually look at QPI vs. HTT, they are quite different in how they work.
And we shall see how backwards compatible AMD's six-core will be. As far as I know, if you get an AM3 mobo you need a new CPU; if you get an AM2+ mobo you can use either, but you will miss out on newer tech such as SATA 6Gb/s, USB 3.0, DDR3, etc. It seems to me that in order to keep your other PC parts from bottlenecking you, you do need a new mobo every 2-3 years. Just like with Intel.
And as I said, Eyefinity looks great. But not many people have more than one monitor.
Overall, each company innovates; you can't have one without the other. But no single company thinks of it all first. In technology there is a starting point (like x86), and then it expands from there through multiple companies.