Tim Sweeney Laments Intel Larrabee Demise

I don't understand why everyone thinks this is shelved. They aren't selling it at retail. They are selling (giving?) generation 1 as a stream-processor add-on (like CUDA) to Intel partners so they can learn it. They will then try to get gen 2 up to spec and sell it at retail.
 
"No quantity of Teraflops can compensate for a lack of support for dynamic dispatch, a full C++programming model, a coherent memory space, etc."
Isn't the whole point of DirectX and OpenGL to make graphics cards and all their teraflops easier to access and utilize by making standard programming APIs?
 
[citation][nom]LORD_ORION[/nom]I don't understand why everyone thinks this is shelved. They aren't selling it at retail. They are selling (giving?) generation 1 as a stream-processor add-on (like CUDA) to Intel partners so they can learn it. They will then try to get gen 2 up to spec and sell it at retail.[/citation]
Well, Larrabee in all its glory has been canned by the board of directors, and having a dedicated piece of hardware replaced by a software development platform brings a certain finality to this fiasco. Not saying the specs didn't look good on paper, but if we only praised blueprints, then nVidia should take home the crown for the best current graphics card... Sorry to say, but that's not how things work.
 
[citation][nom]omnimodis78[/nom]It was a joke from day one! Intel graphics...it even sounds comical.[/citation]
I think an on-die GPU working as one of the CPU cores is a genius idea and would allow amazing graphical capabilities.
 
[citation][nom]omnimodis78[/nom]It was a joke from day one! Intel graphics...it even sounds comical.[/citation]
I really don't get why people expect a $7 piece of silicon to perform the same as a ~$15 or even a $100 piece from ATI & NVIDIA. Grab the list from DjEaZy and divide the performance of each card by the silicon size, cost or power requirements and you'll see why they have about 90% of the laptop market. It's not good enough to play games or do anything intensive (by a LONG shot) but I think it does a lot with very little.
 
It's just logic. Larrabee has instructions for working with mixed scalar/vector code. Object-oriented C++ always costs more under this model (it's an object): the CPU has to describe the object to the GPGPU, and the GPGPU has to understand it, before it can apply scalar/vector operations to that object.
So your CPU/GPGPU does 2-3 times more work running that kind of expression (a complex object could need 10 times more).

These are interesting features for a little revolution (at low CPU cost), but PC games won't see the difference... you use ATI or Nvidia.
So this just opens the door for ATI and Nvidia to offer scalar/vector functionality at low cost.
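 
To make the cost being described concrete, here is a minimal C++ sketch (purely illustrative, not Larrabee code; all names are hypothetical): an object-oriented path where every element hides behind a virtual call that must be resolved before the data can be touched, versus the same update over flat arrays that a compiler can turn straight into scalar/vector instructions.

[code]
#include <cstddef>
#include <vector>

// Object-oriented path: each particle is an object behind a virtual call.
struct Particle {
    virtual ~Particle() = default;
    virtual void advance(float dt) = 0;
    float x = 0.0f;
    float v = 1.0f;
};

struct SimpleParticle : Particle {
    void advance(float dt) override { x += v * dt; }
};

void advance_objects(std::vector<Particle*>& particles, float dt) {
    for (Particle* p : particles)
        p->advance(dt);              // one virtual dispatch per object
}

// Data-oriented path: the same update as plain arrays, with no dispatch cost.
void advance_flat(std::vector<float>& x, const std::vector<float>& v, float dt) {
    for (std::size_t i = 0; i < x.size(); ++i)
        x[i] += v[i] * dt;           // a plain loop a compiler can vectorize
}
[/code]

The flat version is the kind of loop a wide vector unit chews through cheaply; the object version pays the per-object overhead the comment above is pointing at.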
 
[citation][nom]Honis[/nom]Isn't the whole point of DirectX and OpenGL to make graphics cards and all their teraflops easier to access and utilize by making standard programming APIs?[/citation]

Not when it is possible to eliminate the middlemen, DirectX and OpenGL, and write directly to the chip. Right now, developers have their game engine, then DirectX or OpenGL, and then it gets to the chip. If they were able to code to something like the x86 standard, trust me, your graphics processor would be doing a lot more with less coding. I never thought this thing would kill Nvidia or ATI, but I was hoping it would see the light of day.
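 
As a rough, hypothetical illustration of what "writing directly to the chip" means, here is a minimal C++ sketch: the per-pixel work is just an ordinary x86 function looping over a framebuffer, with no DirectX or OpenGL call in between. The Framebuffer struct and shade function are made up for the example.

[code]
#include <cstddef>
#include <cstdint>
#include <vector>

struct Framebuffer {
    int width = 0;
    int height = 0;
    std::vector<std::uint32_t> pixels;   // packed 0xAARRGGBB, width * height entries
};

// The "pixel shader" is just an ordinary C++ function.
std::uint32_t shade(int x, int y, const Framebuffer& fb) {
    const int r = 255 * x / fb.width;
    const int g = 255 * y / fb.height;
    return 0xFF000000u | (static_cast<std::uint32_t>(r) << 16)
                       | (static_cast<std::uint32_t>(g) << 8);
}

// Fill the framebuffer directly; assumes the caller sized fb.pixels already.
void render(Framebuffer& fb) {
    for (int y = 0; y < fb.height; ++y)
        for (int x = 0; x < fb.width; ++x)
            fb.pixels[static_cast<std::size_t>(y) * fb.width + x] = shade(x, y, fb);
}
[/code]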
 
Really?? Get the hell outta here! Like Tim Sweeney, Mr. "We're designing the next Unreal Engine specifically to console standards... PC considerations to come later... in fact, we don't have a single DX11 game in the pipeline since we're still concentrating on our precious DX9 console releases" Really gives a flying flip about innovative trends in PC graphics technology...
 
Sorry it's not coming to market, or sorry that it was a bad idea and the performance sucks? Because if it was as great as he makes it out to be, then it wouldn't have been cancelled...

He sounds like one of the people defending Global Warming theory after Climategate happened. The truth doesn't need lies to support it, and if an Atom CPU does 3 gigaflops, then 30 of them on one die don't do 2000 gigaflops...
 
[citation][nom]reichscythe[/nom]Really?? Get the hell outta here! Like Tim Sweeney, Mr. "We're designing the next Unreal Engine specifically to console standards... PC considerations to come later... in fact, we don't have a single DX11 game in the pipeline since we're still concentrating on our precious DX9 console releases" Really gives a flying flip about innovative trends in PC graphics technology...[/citation]

If Valve said it, I would care...
 
Tim Sweeney... why should WE care what YOU think?

Your company (Epic) and Unreal are going to consoles. With NO (or few) games left for PC, what does it matter? DX11 or DX12 will be meaningless.

AMD and Nvidia need to see the writing on the wall... because the GPU card market could simply end in a few years. NO games = NO NEED for a GAMING video card! The onboard graphics on AMD/ATI chipsets already do very well.

Also, Tim Sweeney... you totally screwed up the latest UT3 game. Bad menu GUI, horrible maps with so few good choices that most people got BORED with the game before the fans could learn to make good maps - and because the maps are so huge, few servers actually host them. So UT3 servers are EMPTY of humans. Vehicle controls are worse too - for some stupid reason, was the UT2004 version too good?

Tim Sweeney's opinion isn't worth that much in my book.

But I'm just a gamer. So I don't matter.

 
Awwwww, I hope this doesn't hold things back. The ability to program your own pipelines was going to be the next gaming revolution.
 
[citation][nom]DjEaZy[/nom]... just look on the graphics card hierarchy chart... http://www.tomshardware.com/review [...] 491-7.html[/citation]

[citation][nom]omnimodis78[/nom]It was a joke from day one! Intel graphics...it even sounds comical.[/citation]


You are all missing the point! It's meant to be a competitor to GPGPU, NOT GPUs!! GPGPU is about doing anything and everything OTHER than graphics with the graphics card, so Larrabee's graphics performance is irrelevant!!

Honestly, they should never have tried to take on GPUs at their OWN game - trying to be a graphics card!! It should have been marketed AND released as an ultrathreaded co-processor, which is what it really is! Like the SPUs in the PS3! They should have taken that as a warning: Sony was originally going to use two Cell processors - one for graphics - and not have a GPU, but even they knew better! A CPU will never be as good as a GPU at what a GPU is SPECIFICALLY meant to do!!

Their strategy should have been accentuating its EASE OF USE, which is what has all of us game devs excited. Being x86, I could have imagined using all my existing C++ knowledge and using it as a tool to really see proper multithreaded programming take off. The trick would have been releasing it ASAP, while GPGPU was still in its infancy. As GPGPU has matured, with DX11 and such, it may be too late to play the 'EASE OF USE' card. Personally, I'd still buy one.
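 
For what that "ease of use" pitch might look like in practice, here is a minimal, hypothetical sketch in plain standard C++: the same per-element work fanned out across however many hardware threads are available, with no CUDA kernels or shader languages involved. The function names and the workload are made up for the example.

[code]
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Stand-in for real per-element work on one slice of the data.
void process_chunk(std::vector<float>& data, std::size_t begin, std::size_t end) {
    for (std::size_t i = begin; i < end; ++i)
        data[i] = std::sqrt(data[i]) * 0.5f;
}

// Fan the work out across the available cores using only standard C++.
void process_parallel(std::vector<float>& data) {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (data.size() + cores - 1) / cores;

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end = std::min(data.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(process_chunk, std::ref(data), begin, end);
    }
    for (std::thread& t : workers)
        t.join();
}
[/code]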
 
[citation][nom]google_climategate[/nom]He sounds like one of the people defending Global Warming theory after Climategate happened. The truth doesn't need lies to support it[/citation]

All this "Climategate" crap is absolute nonsense. Some hacker found a couple emails out of thousands over 13 years that when certain phrases are taken out of context "help" the politicians who say global warming is a lie (while getting payed to say that by those in industry who would have to submit to climate change laws were they to pass).

The emails, when read in their entirety, have absolutely nothing that contradicts the MOUNTAINS of evidence that global warming is influenced by humans and is harmful. This is the warmest decade in recorded history! Ocean levels are rising, land is disappearing because of the rising oceans, icecaps are melting, record high global temps, etc.

200 of the top scientists with mountains of undeniable evidence vs industry-paid politicians with a few phrases taken out of context from a few emails. Who is more likely to be telling the truth?

Take a look at this: http://www.youtube.com/watch?v=YctV731kS8I
 
[citation][nom]ravewulf[/nom]All this "Climategate" crap is absolute nonsense. Some hacker found a couple of emails out of thousands spanning 13 years that, when certain phrases are taken out of context, "help" the politicians who say global warming is a lie (while getting paid to say that by those in industry who would have to submit to climate change laws were they to pass). The emails, when read in their entirety, have absolutely nothing that contradicts the MOUNTAINS of evidence that global warming is influenced by humans and is harmful. This is the warmest decade in recorded history! Ocean levels are rising, land is disappearing because of the rising oceans, icecaps are melting, record high global temps, etc. 200 of the top scientists with mountains of undeniable evidence vs industry-paid politicians with a few phrases taken out of context from a few emails. Who is more likely to be telling the truth? Take a look at this: http://www.youtube.com/watch?v=YctV731kS8I[/citation]
You sound awfully sure of yourself, and dangerously so. There is insurmountable evidence that the planet is actually getting colder, yet you still sound fully assured that it is getting warmer. George Orwell, anyone?
 
[citation][nom]FoShizzleDizzle[/nom]You sound awfully sure of yourself, and dangerously so. There is insurmountable evidence that the planet is actually getting colder, yet you still sound fully assured that it is getting warmer. George Orwell, anyone?[/citation]

Do you see the icecaps and glaciers growing and extending over more of the planet like when we get an ice age, or are they shrinking?

It doesn't take a genius to figure it out, but apparently people like to ignore the scientists who are experts in this area.
 
[citation][nom]matt87_50[/nom]You are all missing the point! It's meant to be a competitor to GPGPU, NOT GPUs!! GPGPU is about doing anything and everything OTHER than graphics with the graphics card, so Larrabee's graphics performance is irrelevant!! Honestly, they should never have tried to take on GPUs at their OWN game - trying to be a graphics card!! It should have been marketed AND released as an ultrathreaded co-processor, which is what it really is! Like the SPUs in the PS3! They should have taken that as a warning: Sony was originally going to use two Cell processors - one for graphics - and not have a GPU, but even they knew better! A CPU will never be as good as a GPU at what a GPU is SPECIFICALLY meant to do!! Their strategy should have been accentuating its EASE OF USE, which is what has all of us game devs excited. Being x86, I could have imagined using all my existing C++ knowledge and using it as a tool to really see proper multithreaded programming take off. The trick would have been releasing it ASAP, while GPGPU was still in its infancy. As GPGPU has matured, with DX11 and such, it may be too late to play the 'EASE OF USE' card. Personally, I'd still buy one.[/citation]
I don't think we're missing the point at all. Firstly, we have the technology today - it's called CUDA (not saying it's the same thing, but from an end-user perspective it is pretty much on the same level). I remember telling my friend, who thought that Larrabee would change the world, that Larrabee was nothing more than a PR stunt, and it apparently worked because it shook the industry to the core. Yes, both nVidia and ATI were crapping their pants, not because Larrabee was such a great thing but because people like you bought into the same old Intel story... The facts spoke and speak for themselves. I use CUDA today, now - but Larrabee was nowhere to be seen, and it got canned. How are we missing the point?
 
[citation][nom]FoShizzleDizzle[/nom]You sound awfully sure of yourself, and dangerously so. There is insurmountable evidence that the planet is actually getting colder, yet you still sound fully assured that it is getting warmer. George Orwell, anyone?[/citation]

Er... your city may be getting colder as weather patterns change. But satellite photos, research, and ski resorts around the world all show that there is less snow, less cold, less ice.

It'll snowball into more heat as we lose ICE. Standard science, man: white reflects heat, dark absorbs it. Less polar ice = more dark blue water absorbing the sun's rays, generating... more heat.

Compare that to a white car vs a black car.

DUH
 
From a developer's point of view, one thing has nothing to do with the other. You can still have the DirectX API and a different implementation for each chip. The point is that GPUs have all that power, but it can only be used in a certain way. It's like having a super-high-speed train that departs every hour with all its wagons attached. You could have an alternative at the same speed but with the flexibility to send each wagon out on its own as it's needed. That means you control the power; the opposite is what currently happens with other GPUs.
I still hold the opinion that we all knew this chip wasn't going anywhere for at least one more year.
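 
As a minimal sketch of that "same API, different implementation" idea (classes and names entirely hypothetical), the engine can talk to one renderer interface while the backend decides whether the frame goes through a driver-level API or is scheduled as ordinary code on the cores - which is where the "you control the power" flexibility would come from.

[code]
#include <iostream>
#include <memory>

// The engine only ever sees this interface.
struct Renderer {
    virtual ~Renderer() = default;
    virtual void drawFrame() = 0;
};

// Backend that would hand the frame to a driver-level API (DirectX/OpenGL).
struct ApiRenderer : Renderer {
    void drawFrame() override { std::cout << "submitting draw calls to the driver\n"; }
};

// Backend that schedules the same frame as ordinary code on the cores,
// choosing for itself which "wagons" leave and when.
struct SoftwareRenderer : Renderer {
    void drawFrame() override { std::cout << "rasterizing the frame in software\n"; }
};

int main() {
    std::unique_ptr<Renderer> renderer = std::make_unique<SoftwareRenderer>();
    renderer->drawFrame();   // engine code is identical for either backend
}
[/code]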
 