Intel's 'Larrabee' on Par With GeForce GTX 285

ATI's problem for years was not hardware but drivers. Up until the X1000 series, they hadn't really hit the nail on the head.

I don't think hardware will be Intel's problem. It will be trying to create drivers for a market they've never been in before: discrete graphics. Their "GTX 285" performance was probably in one game they spent a month hand-coding drivers for. I'll wait for a comprehensive performance analysis before I give Intel any credit.
 
[citation][nom]zerapio[/nom]Not when I've seen the card with my own eyes.[/citation]


Running Crysis?


Where is it on Amazon or Newegg?

As a former owner of an Intel i740, I say: don't believe it again.
 
Intel's partial roadmap:
Atom processor
Dual core (or more) Atoms
Larrabee (built basically of many Atom-like cores with some extra glue :)
GPU/processor integration
Processors with extra GPU glue that can switch-hit depending on the needs of the user at the time

While I have to admit that I'm shocked that Intel has actually managed to earn a good profit on the intermediate steps (Atoms, and possibly Larrabee too... which I'm guessing they will nearly give away to get the CPUs better accepted when they are released), this should also mean that a pretty decent integrated GPU will come with all Intel chips and help PC gaming.

(And of course AMD/ATI will be doing the same thing so there shouldn't be any x86 exceptions)
 
Yeah, I'm not going to put much stock in what is being said at this point. Besides, who knows, maybe Larrabee could also be 2x more powerful than anything around this time next year. It would be a shame to see Nvidia have to play catch-up for the next five years. lol :)
 
After all the empty hype over the years about Itanium in the high-end server market, success that never really materialized, I've adopted a wait-and-see attitude toward Larrabee. Intel can sometimes fall behind schedule and scale things down at the last minute to get a product out the door. Don't get me wrong, I own many Intel chips, but I'm just hesitant until I see the reality of things.

As others have stated, I'm all for the extra competition that would drive the other companies to improve, but that all remains to be seen.
 
[citation][nom]marraco[/nom]Running Crysis? Where is it on Amazon or Newegg? As a former owner of an Intel i740, I say: don't believe it again.[/citation]
I added < Krosty
 
OK, so like the way CUDA is a GPU hacked to be like a CPU, this is a CPU acting like a GPU? It's almost "software" rendering at the same speed as a GTX 285? Sounds AWESOME!!
 
[citation][nom]matt87_50[/nom]OK, so like the way CUDA is a GPU hacked to be like a CPU[/citation]
It's not "hacked"; it just uses a different API. Have a good look at OpenCL.

[citation][nom]matt87_50[/nom]this is a CPU acting like a GPU? It's almost "software" rendering at the same speed as a GTX 285? Sounds AWESOME!![/citation]
Pretty much.
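For anyone curious what "just a different API" means, here's a rough sketch of an OpenCL C kernel (the kernel name and arguments are made up purely for illustration, not taken from any real driver or SDK). The per-pixel math is ordinary C; the runtime fans it out across whatever cores or vector lanes the hardware offers, which is basically what software rendering on a pile of small x86 cores would amount to.

[code]
// Illustrative OpenCL C kernel (hypothetical names): shades one pixel per work-item.
// It's the same per-pixel math a "software" renderer would do in a plain CPU loop,
// just expressed through a compute API rather than any kind of hack.
__kernel void shade_gradient(__global uchar *framebuffer,
                             const int width)
{
    int x = get_global_id(0);   // pixel column this work-item handles
    int y = get_global_id(1);   // pixel row this work-item handles
    framebuffer[y * width + x] = (uchar)((x * 255) / (width - 1));  // simple horizontal gradient
}
[/code]

Enqueue that with a 2D global work size of width x height and every pixel gets computed in parallel; whether that lands on GPU shader units or on Larrabee-style x86 vector units is the hardware's problem, not the programmer's.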
 
If Intel manages to integrate Larrabee into their upcoming 32nm processors (like the plan in their roadmap), that would be very good.

I wouldn't mind having integrated graphics with power on par with a GTX 285 on my CPU next year :-D
 
[citation][nom]goose man[/nom]I wouldn't mind having integrated graphics with power on par with a GTX 285 on my CPU next year :-D[/citation]
I would mind.

More crap on-die means more heat, eating thermal headroom that could be better spent on CPU performance. Not to mention, if a new game came out and my GPU was the only bottleneck in the system, I'd have to replace both my GPU and CPU and get a higher-end combo that Intel had decided had the "right" ratio of CPU power to GPU power.
 
If their first foray into the modern discrete graphics market is as fast as the current fastest single-GPU card, Intel can claim an amazing success. That's faster than anything ATI has out currently, and it's their first attempt. It would suggest that, with further R&D, Intel can compete head-to-head with ATI and Nvidia in the long term.
 
What he really meant to say was "Larrabee blows."
But seriously, wasn't it supposed to be some super awesome alien technology that could do real-time ray tracing and all that sexy mysterious voodoo stuff? From what I understood, the announcement was so awesome it made that Nvidia guy cry. What happened since then!?
 
[citation][nom]philologos[/nom]Given the current market, Larrabee was rumored to be poised to blow everything else out of the water.[/citation]
[citation][nom]Tindytim[/nom]I would mind. More crap on-die means more heat, eating thermal headroom that could be better spent on CPU performance. Not to mention, if a new game came out and my GPU was the only bottleneck in the system, I'd have to replace both my GPU and CPU and get a higher-end combo that Intel had decided had the "right" ratio of CPU power to GPU power.[/citation]

Given the option of a powerful integrated GPU for all, discrete cards for dedicated gamers, and "Turbo" modes for the onboard GPU or CPU as needed (auto-overclocking of the CPU or GPU when the other is underutilized)... well, if you were GPU-bound, PUT IN A DISCRETE GPU (like you have ALWAYS had to do).

But this will mean that even a baseline computer will be guaranteed (hopefully, and eventually, of course) a higher level of baseline graphics. Also, for those of us who encode video, or anyone else with highly parallel tasks, it means there will be a standard parallel processing unit in every Intel chip.
 
[citation][nom]Kary[/nom]Given the option of a powerful integrated GPU for all, discrete cards for dedicated gamers, and "Turbo" modes for the onboard GPU or CPU as needed (auto-overclocking of the CPU or GPU when the other is underutilized)... well, if you were GPU-bound, PUT IN A DISCRETE GPU (like you have ALWAYS had to do)[/citation]
I buy a CPU for a CPU. Adding a GPU increases heat, and lowers how well the CPU overclocks. Why would that benefit me? It doesn't.

[citation][nom]Kary[/nom]But this will mean that even a baseline computer will be guaranteed (hopefully, and eventually, of course) a higher level of baseline graphics[/citation]
"Baseline"? That's bull. People will still be using their old computers from 5-7 years ago, and if you want wide appeal you need to work on machines that old.

Computers are varied; that's the beauty of them. Choice and customization: you'd be stunting both.
 
[citation][nom]Tindytim[/nom]I buy a CPU for a CPU. Adding a GPU increases heat, and lowers how well the CPU overclocks. Why would that benefit me? It doesn't. "Baseline"? That's bull. People will still be using their old computers from 5-7 years ago, and if you want wide appeal you need to work on machines that old. Computers are varied; that's the beauty of them. Choice and customization: you'd be stunting both.[/citation]
Very few computers don't have a GPU in them... yeah, they add heat, but THEY PUT A PICTURE ON THE MONITOR FOR YOU. Of course, gamers are going to upgrade to something decent through an add-on card... ideally (yeah, I hate that word, too) once you add a PCIe graphics card, the onboard GPU will shut down and not pull ANY power (or it will work in hybrid mode and the PCIe card will be cut off when not needed... maybe :)
How would it benefit you as a gamer? The GPU on the processor die being used for physics is one example (I'm just assuming you are a gamer, since you seem to disapprove of any onboard graphics).

Another example: if little Tommy next door (and all the little Tommys) who buy the cheapest Dell they can get their hands on (and we all know this happens) then go out to buy a PC game and... IT ACTUALLY WORKS (with all the graphics turned WAY down :), then maybe more PC games will get released.
 
[citation][nom]Kary[/nom]Very few computers don't have a GPU in them... yeah, they add heat, but THEY PUT A PICTURE ON THE MONITOR FOR YOU.[/citation]
DURRRR!

But that heat is elsewhere on the motherboard. Putting it directly on-die increases the heat on the processor and decreases overclocking headroom.
 
I want to see it in an actual product on the shelf before I dare to get excited. Drivers are a huge problem for Intel, even if they get the hardware sorted out. We've heard big things before - remember the X3100 and X4500? Yeah, they sucked, too. Graphics, for Intel, are an excuse to make stronger CPUs and give you a reason to buy a totally new machine. They don't want them to be too good; they want them to suck. It's their business model.
 