What Nvidia Had to Say About Larrabee's Delay

Guest
Intel has great ideas, but they might just get too far ahead of everybody else. I don't see what future there is for two separate processing units, a CPU and a GPU, in one computer. Wouldn't it be nice and unified to have one processing unit do everything (computing and graphics)? With a discrete card configuration, half the resources are wasted whenever one side is idling. For example, if you are doing heavy computing, most of the graphics pipes are not used = wasted silicon. On the other hand, if the app/game is graphics-intensive but not CPU-intensive, then most cores of the CPU are idling and wasted.

But what if the CPU were designed to handle both heavy computing and graphics? In the near future, we might see 256+ cores in a CPU that rivals today's supercomputers. The CPU could then shift its resources/cores based on the app or game: during heavy computation, most cores would be dedicated to computing and fewer to display graphics; during intense gaming, the CPU would allocate most cores to rendering. In both cases the entire CPU is utilized and very little is wasted. This kind of dynamic resource allocation might be the future, but of course it would require changes to both our hardware and software.
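
If it helps picture the idea, here is a minimal, purely hypothetical C++ sketch of that "shift cores by workload" scheme: a fixed budget of cores is re-split between compute and render work each frame, based on how much of each is queued. All the names are made up, and the 256-core budget just mirrors the figure in the post.

[code]
// Purely hypothetical sketch of the "shift cores by workload" idea above:
// a fixed budget of hardware cores is re-split between compute work and
// rendering work each frame, based on how much of each is queued.
#include <algorithm>
#include <cstdio>

struct Workload {
    int compute_jobs;   // pending physics/AI/simulation tasks
    int render_jobs;    // pending draw/shade tasks
};

// Decide how many of `total_cores` go to rendering this frame.
int cores_for_rendering(const Workload& w, int total_cores) {
    int total_jobs = w.compute_jobs + w.render_jobs;
    if (total_jobs == 0) return total_cores / 2;               // idle: even split
    int share = (w.render_jobs * total_cores) / total_jobs;    // proportional split
    return std::clamp(share, 1, total_cores - 1);              // keep both sides alive
}

int main() {
    const int total_cores = 256;                               // the "256+ core" CPU from the post
    Workload heavy_compute{900, 100};
    Workload heavy_graphics{100, 900};
    std::printf("compute-bound frame: %d cores for rendering\n",
                cores_for_rendering(heavy_compute, total_cores));
    std::printf("graphics-bound frame: %d cores for rendering\n",
                cores_for_rendering(heavy_graphics, total_cores));
}
[/code]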
 

masterjaw
Surely they can't and shouldn't make any extravagant comment about Larrabee, because they're in no position at all to talk about GPUs right now. Nvidia is currently the underdog, and the only response we're getting from them right now is the rebadged GT 310.
 
[citation]Yesterday, AMD predictably told us that it was proud of having established both a CPU and GPU business.[/citation]

Wait a second... when did AMD establish a GPU business? Last time I checked, they just took over ATI and claimed whatever glory ATI had already planned out (everything up to R900, in fact, predates the AMD takeover).

[citation][nom]sstym[/nom]I think they also said "That'll teach you to shut us off the chipset market, d*cks".[/citation]

That's one side of it. Look at the other:

nVidia wasn't allowing any SLI support outside of their own chipsets, which were actually inferior to Intel's chipsets and even AMD's. They were trying to create an SLI monopoly, so to speak. They knew that if they didn't do it this way they would lose a lot of sales to Intel and AMD.

I know plenty of people who only went with nVidia because their GPUs were better at the time and they wanted SLI. If they could have had an Intel/AMD chipset with SLI, they would have. It makes you wonder why AMD was willing to let Intel have a CrossFireX license. Maybe because they knew they could make more money off of Intel chipsets...

[citation][nom]jacobdrj[/nom]Wouldn't it be funny if Intel tomorrow actually released Larrabee, and it was awesome?[/citation]

It would be.

But my guess is that the delay happened because at some point Intel revamped the program and basically started from scratch. That won't stop them from continuing to find ways to make it better; for now they will focus on the software development and have it ready to go.

Next will be the process: they'll probably move it to 32nm so they can get better performance and a lower TDP.
 

back_by_demand
Maybe, one day, after Intel has spunked a few more billions, they might produce a graphics solution that is actually good. The problem is that nVidia and ATI have already established brand recognition and a history of producing fine products.

Intel have a history of producing integrated graphics that are, frankly, shit.

Intel don't have the money to actually produce a good product AND the advertising budget to convince us that it isn't more of the same old crap.
 

rebturtle
For an article title offering nVidia's position, I expected more than a one-sentence quote padded out with the last three articles' worth of Larrabee news.
 

rooseveltdon
Honestly, I can say this because I know a lot of friends who work for Intel's engineering division here in Hudson, MA: Intel has a tendency to squeeze everything it can from its engineers even when the resources for the products it wants are not fully up to par. It happened in the days of the P III and P4, and they are doing it now with Larrabee. They probably expect Larrabee to be some sort of revolutionary product, at once capable of performing like a GTX 295 or better while having CPU-like abilities close to the early Core 2 series chips. On paper this looks really juicy, but realistically, designing a chip like that is extremely hard, and getting those kinds of results is not likely, at least not right away.

Nvidia and ATI have a lot of experience with GPU technology and are fully aware of such difficulties, so they are not rushing to make something like that (Nvidia's PhysX and Stream are not really close to the abilities Larrabee is supposed to have). In many ways Nvidia can be just as bad, and in the past it has engaged in very questionable business practices. That being said, I think Intel needs to take its time and make sure it gets this right once the resources are available, and keep quiet about it until more progress is made. Hopefully, through the 32 nm manufacturing process and new ways to manage space on the die, they might be able to achieve the performance they want. Right now I am more intrigued by Nvidia's Fermi lineup; I wonder what kind of changes we are going to see in the GPU market. 2010 is going to be an interesting year.
 

tjar
[citation][nom]tacoslave[/nom]ha ha ha anyone else a redman? AMD KNYUKKA's!!!!!seriuosly though amd has been the only one innovating now a days i mean in both graphics AND cpu arenas. i7 = better use of phenom architecture, multicore, 64bit, the list goes on but needless to say they are the only ones that can laugh at intel now nvidia cant even complete their gpu on time.[/citation]

This is the dumbest thing I have ever read, seeing as how AMD and Intel have used entirely different architectures to handle x86 instructions since the days of the K6. Just because they share common features, such as 64-bit and an on-die memory controller, doesn't make the i7 an improved Phenom architecture. You seem to forget that a few key innovations that make CPUs as fast as they are today were first done by Intel, such as out-of-order execution and on-die cache. x86-64 is the only true innovation that AMD pioneered here, as the on-die memory controller definitely didn't help AMD against the Core 2 generation of Intel silicon. Intel themselves even said that you can only integrate the memory controller once, and unfortunately for AMD they pretty much stopped innovating right there and took a nap, since the P4 was no match for the Athlon.

Now, as for the article, I was quite disappointed with the shoddy attempt at journalism here, since a one-line comment does not an article make. They should have prodded nVidia for more feedback instead of coughing up yesterday's AMD/ATI comments. Or maybe they could have set the stage better by mentioning that Intel is the world's largest manufacturer of graphics chips and then thrown in a mention of Intel's previous foray into discrete graphics cards. Instead of just relying on the spat between the two companies to get readers to click and then be disappointed with the article, they might have mentioned that it wouldn't be too terribly difficult for Intel to make a traditional graphics card, seeing as they do have experience in this area. They could also have fanned the Intel/nVidia flames by throwing Intel's ability to make a standard graphics card in nVidia's face and then mentioning nVidia's current inability to get its current-gen hardware off the ground, or they could have taken the tame approach and just pointed out that Intel isn't taking a standard approach to video cards; they're trying something that hasn't been done before. I mean, come on, all this article has done is fan the fanboy flames.

I'll stop now, since I have rambled on plenty. I'm just upset with the poor quality of news articles from THG these days.
 

Dkz
It would be nice to have those chips on the market; netbooks and notebooks all around would hopefully gain extra power. As it stands, the best graphics cards are in ATI's and nVidia's hands, and that won't change any time soon.
Even if Intel has the economic power to put its own invention on the market, who will dare to switch all of a sudden to a new GPU maker? Maybe within six years they may compete with the low-end graphics cards, but I don't really think their aim right now is to beat Nvidia/ATI in that war.
 

razorblaze42
@Tjar Tacoslave was only off a little bit:
Intel is a huge company with a massive bottom line, but 3 billion dollars on the Larrabee paperweight was a colossal blunder. Why in the hell would Intel think it was a good idea to compete with ATI/NVidia at building graphics cards? Greed, I suppose. Only a fool would buy a first-generation, completely "experimental" graphics card from Intel. This is an "EPIC FAILURE" considering that the R&D on Larrabee is rumored to exceed the cost of the ATI 5000 series and NVidia Fermi combined... BRILLIANT!

Intel is far better at cloning the work of others than at actually coming up with something completely new. If you take a look at Intel's almighty Nehalem architecture you'll find there's nothing essentially new about Nehalem's core design; most of it was borrowed, wink wink, from AMD. The basic specs of the Nehalem "Opti-clone" are exactly the same as AMD's Barcelona (K10) architecture: it is natively quad-core and has three levels of cache, a built-in memory controller, and a high-performance system of point-to-point interconnects for communicating with peripherals and other CPUs in multiprocessor configurations. Intel has also added SSE instructions to Nehalem, specifically SSE 4.2, components which also appear to be "borrowed" from AMD's K10 micro-architecture. Next, Nehalem's integrated memory controller was "borrowed" from AMD, which has been using integrated memory controllers for years, since the K8. Intel followed AMD's K10 roadmap; of course they avoided the potholes. Yes, they improved the design, but improving something isn't the same as developing it. So it's not surprising to me that Intel completely failed with Larrabee; after all, they didn't have anyone's test paper to copy from... lol
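
For what it's worth, the SSE 4.2 point is easy to check on your own box. Here's a tiny, hypothetical C++ snippet using the GCC/Clang x86 CPU-feature builtins; there's nothing Nehalem-specific about it, it simply reports what the CPU it runs on supports.

[code]
// Hypothetical quick check for the Nehalem-era features mentioned above.
// Uses GCC/Clang x86 builtins; compile with g++ or clang++ on an x86 machine.
#include <cstdio>

int main() {
    __builtin_cpu_init();  // initialize the feature cache before querying
    std::printf("SSE4.1: %s\n", __builtin_cpu_supports("sse4.1") ? "yes" : "no");
    std::printf("SSE4.2: %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "no");
    return 0;
}
[/code]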
 

tjar
@Razorblaze42 I wasn't really off; I just didn't mention the other similarities to what AMD does.

For starters, Intel started using a three-tier cache back with the P4 EEs. Yes, Intel now uses an interconnect similar to HyperTransport, but you totally missed where I said that the AMD and Intel ways of processing the x86 instruction set are totally different. Yes, Nehalem isn't a new architecture; it is essentially an enhanced Core 2. But ever since the days of the K6, AMD CPUs have translated the x86 instruction set into RISC-like micro-instructions.
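
To illustrate that decode point (a toy, not a real decoder): a single memory-destination x86 instruction like "add [mem], reg" splits into simpler load/add/store micro-ops inside the core. A hypothetical C++ sketch with made-up names:

[code]
// Toy illustration of the point above: one complex x86-style instruction
// such as "ADD [mem], reg" splits into simpler RISC-like micro-ops
// (load, add, store) inside the core. Not a real decoder.
#include <cstdio>
#include <string>
#include <vector>

struct MicroOp { std::string op; };

// Hypothetical decode of a memory-destination add.
std::vector<MicroOp> decode_add_mem_reg(const std::string& mem, const std::string& reg) {
    return {
        {"LOAD  tmp, [" + mem + "]"},   // read the memory operand
        {"ADD   tmp, " + reg},          // do the arithmetic in a register
        {"STORE [" + mem + "], tmp"},   // write the result back
    };
}

int main() {
    for (const auto& u : decode_add_mem_reg("rax", "rbx"))
        std::printf("%s\n", u.op.c_str());
}
[/code]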

Now, you also missed what I was saying about Intel and graphics cards, but so has everybody else. Larrabee, while it might be a huge blunder on Intel's part, and while they might have dumped more than the GNP of most small countries into R&D, is not a GPU in the same sense as nVidia's or AMD/ATI's GPUs. What I said was that Intel could have easily developed a video card to compete with nVidia and ATI with little effort, whereas Larrabee is an entirely different way of doing things, so it's going to be difficult to develop.
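
As a rough, hypothetical illustration of that "different way of doing things" (graphics rendered in software across many x86 threads rather than by fixed-function hardware), here is a toy C++ sketch that splits scanline shading across however many hardware threads the host has; a real renderer would obviously rasterize triangles per tile rather than fill a gradient.

[code]
// Rough, hypothetical sketch of "graphics rendered in software on many x86
// threads": each worker shades its own subset of scanlines into a shared
// framebuffer. A real renderer would rasterize triangles per tile, of course.
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

int main() {
    const int width = 640, height = 480;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<uint32_t> framebuffer(width * height);

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            // Each "core" takes every cores-th scanline (simple gradient fill here).
            for (int y = static_cast<int>(c); y < height; y += static_cast<int>(cores))
                for (int x = 0; x < width; ++x)
                    framebuffer[y * width + x] =
                        (x * 255 / width) << 16 | (y * 255 / height) << 8;
        });
    }
    for (auto& t : workers) t.join();   // all scanlines shaded entirely in software
    return 0;
}
[/code]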
 