Larrabee vs GeForce 9800 GT vs Radeon 5770 vs GT 250

Page 2 - Tom's Hardware community

cheesesubs

Distinguished
Oct 8, 2009
459
0
18,790
As we know, Larrabee will be released in January 2010. For a mid-range graphics card, which do you think offers the best cost/performance? Will Larrabee win the mid-range GPU war?
 
I'd welcome you to my thread, which shows a 21% increase on the new beta drivers, putting it even with the 295 in just one game shown so far, but there's more to come.
Like I said, you haven't a clue; you probably don't even game.
http://hardocp.com/article/2009/11/10/need_for_speed_shift_gameplay_performance_iq/10
And as for your 2-out-of-3 claim, it shows you'd never run high end, as those wins are generally at lower resolutions, with little or no eye candy, which is not what the top cards are for.
 

ubernoobie

Distinguished
May 29, 2009
886
0
19,010
lol, Intel GMA crushes AMD/Nvidia? Nope. AMD has the HD 4200, which can at least run a Source engine game at max settings (no AA) at 30+ fps. The GMA X4500 would only get about 25 fps at the same settings, and Nvidia's 9400 chipset gets 50+ fps.
 

chedrz

Distinguished
Aug 7, 2006
290
0
18,790
Wait... is this guy seriously saying that it's an achievement for Nvidia's dual-chip card to beat AMD's best single-chip card roughly 60% of the time? Especially considering that AMD's drivers ALWAYS take a few months to be properly optimized? And even if a GTX 285 can hang with an HD 5850, it's still ~$60 more. Add in that neither of the Nvidia cards is capable of DX11 (even though that may not matter much yet), and... why are we having this argument?

Dude, come back and argue about Fermi after Nvidia figures out how to mass-produce a working chip that's on the market and has been properly benchmarked. And just to add to that, there's no way a high-end Fermi card will be at the same price point as AMD's cards, since Nvidia doesn't know how to sell something at a price average people can afford.

Oh, and S3 makes me lol.
 
Intel Corp.'s Larrabee graphics processor, which is expected to challenge Nvidia Inc. and ATI Corp. in the high-performance desktop and gaming PC market, will arrive early next year, the chipmaker's CEO said during a quarterly earnings call on Tuesday.


From your link:
it's dated April, and it says first half. You do know what first half means, don't you?
In your thread there's no mention of how many cores, what speeds, etc. Pie in the sky.
 

cheesesubs

Distinguished
Oct 8, 2009
459
0
18,790


They do exist... Intel holding it back and pushing the release past the original schedule was down to optimization. Intel planned to unleash it in Q3 2009 (they announced it in 2008), but Cypress (5800) ruined the party and forced Intel to refine the architecture and push the date back. The same thing happened to Fermi: they were caught by surprise by Evergreen's low-profile release. You can't blame them for it.

btw, Intel is going with 32nm (or 22nm?) instead of 40nm, which is the main reason they delayed Larrabee. And since 32nm processors are already shipping on schedule (Core i9/i3, Pentium E7000s), I believe Larrabee will be released sometime in early 2010, and definitely sooner than just "first half". There are only two sources claiming a first-half or April release. And since Evergreen was announced for a December 2009 release but arrived earlier than consumers expected, Larrabee might be just like Evergreen.
 
Last I heard, the silicon was pretty much ready, they had a few exclusive tests going on, and the drivers were waaaaay off, delaying it further.
You're acting as if you have exclusive info here, which I assure you you don't.
No one knows its performance. The IDF show was unimpressive to many, so let's wait and see what they bring before we start thumping our chests, ok?
 

chedrz

Distinguished
Aug 7, 2006
290
0
18,790
You seriously think that Evergreen and Cypress had a low-profile release? Sure, that's why in June AMD was saying it would be out "sooner than you think."
http://www.anandtech.com/showdoc.aspx?i=3573

Maybe Intel pushed back Larrabee because, while roadmaps are effective for planning in the long run, they don't account for problems in the R&D cycle. Did you think about that?

Give Intel a few years, and then they might be able to shoot for the performance crown with Larrabee. It won't happen in January, though (or whenever it actually gets released).
 

cheesesubs

Distinguished
Oct 8, 2009
459
0
18,790


It took them by surprise. And the heat caused the delay too: a mainstream LRB would consume 130W TDP, too much for a mid-range GPU... not to mention the extreme multichip + multicore version... that is why they're refining the chip. However, even without the refinement (32nm), the multicore design and the L1/L2 cache + instruction set will still beat Evergreen; even Hemlock isn't certain to match the "extreme" version of Larrabee. Intel holding it back shows they aren't hotheads like Nvidia, only dancing in AMD's shadow.

They will make a perfect chip that can compete with a Fermi 395.

You're right this once... the driver issue is the major reason (besides TDP) they can't get the chips out into the sunlight. They've been isolated from the GPU industry for a decade, so they need time to write a perfect driver. Because of this, even a long-time driver expert like Nvidia can make a disaster. Why don't we give them a chance, like the one you gave AMD back with that failure of a 2900 XT?
 

chedrz

Distinguished
Aug 7, 2006
290
0
18,790


Yep. And you'll pass a grammar course.
 

cheesesubs

Distinguished
Oct 8, 2009
459
0
18,790


offtopic: yes!! I did not pass it, as you can see from how bad I am in that post! Guess my typing skills need to improve...

whatever, as long as you can "barely" read it, it will be fine...

 
I'm right, which makes you...? January release? This beats that; well, I have one HUGE problem with LRB: you can't find it anywhere!!! heheh
That's my point, and your mistake. Until we have benches, we have nothing. Same for Fermi. Unlike the 5 series from ATI, where we could sort of project its performance, but only to a point, ballpark.
You can do neither with LRB nor Fermi, since both are brand-new architectures, so again, come back after you've learned a few basics.
 
Well, even for Fermi we have some benchmark: it should be faster than their last card, the GTX 285. As for Intel, all we have to compare against is their IGP, and beating that would be no great achievement.

Edit: I meant benchmark as in 'bottom end of where it could perform', not an actual FPS benchmark.
 

xaira

Distinguished
GT300 "Fermi" Architecture GPU Specifications

* 3.0 billion transistors
* 40nm GPU by TSMC
* 384-bit memory interface (6x64-bit memory controllers)
* 512 shader cores (renamed to CUDA Cores)
* 32 CUDA cores per shader cluster
* 1MB L1 cache memory [divided into 16KB Cache - Shared Memory]
* 768KB L2 unified cache memory
* Up to 6GB GDDR5 memory (1.5GB for GeForce and up to 6GB for Quadro/Tesla)
* Half Speed IEEE 754 Double Precision
* 16 Streaming Multiprocessors (new name for the former Shader Cluster) containing 32 cores each
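For what it's worth, the headline numbers in that list are internally consistent; here's a quick back-of-the-envelope check (plain arithmetic on the figures quoted above, nothing official):

```python
# Cross-check the quoted GT300 "Fermi" figures against each other.

sm_count = 16        # Streaming Multiprocessors (former Shader Clusters)
cores_per_sm = 32    # CUDA cores per cluster

total_cores = sm_count * cores_per_sm
print(total_cores)   # 512, matching the "512 shader cores" line

controllers = 6      # memory controllers on the 384-bit interface
bits_each = 64       # width of each controller
print(controllers * bits_each)  # 384, matching the quoted bus width
```

So the 512-core and 384-bit figures follow directly from the cluster and controller counts; whether shipping silicon actually has all 16 SMs enabled is another question entirely.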

That 1MB L1 and 768KB L2, that's not cache? Then what is it?

And I never said that the GPU will take over from CPUs; it's just that waiting 4 hours for an i7 to convert a video is not exactly attractive when a 5750/GTS 250 can do it in 1. The GPU is being used to help the CPU, not replace it. And Nvidia, Intel, and AMD are all pumping huge amounts of money into this application.
 

Which is true, and only further shows the OP's total neglect of fact, and a propensity toward branding.
All the specs truly mean nothing, as we don't know how they perform. It'll all be done in SW, which could be an advantage for GPGPU usage if CUDA doesn't take off well, but for games? Nothing there, sorry. Again, at least we have a base for Fermi; for LRB there's nothing on gaming at all.
 
So we have a card where no one knows when it'll come out, whether the drivers are mature or even work for the game you play, with no history of performance at all, no die size, no power draw, fabbed at Intel instead of TSMC, no knowledge of the cores, whether they'll perform or not, whether certain games cause problems for them or not, and yet you say it's a monster? Please understand, you're playing to the wrong crowd here.
And if it's delayed too long, it'll have to face refresh parts (which it will anyway) before it has much chance to take off. Then it'll be a new process for the GPUs before Intel can make any changes, then it'll be HKMG, which will be like another process for free, and of course there'll be improvements all along the way, and maybe by then we'll see the second iteration of LRB. Like I said, give it a rest; let's all wait.
 

jennyh

Splendid
Larrabee will not be released in January 2010 lol.

How can anybody believe that? We have seen nothing except a tiny, insignificant demo at IDF. No leaks, no benchmarks, no idea what the performance is like, except in ancient games like Quake, where I heard it isn't bad actually!

We've been getting leaked Clarkdale benches for how long now? 4+ months? Yet we still haven't seen a single Larrabee leak, still no clue about its performance. 32nm? You've got to be kidding me.

I'll say he is half correct though. It should be out in January, just in 2011, not 2010.
 

Dekasav

Distinguished
Sep 2, 2008
1,243
0
19,310


I would bet that cheesesubs has no idea whatsoever what anyone's saying.
 

yannifb

Distinguished
Jun 25, 2009
1,106
2
19,310


Indeed.... indeed.

Anyway back to GPUs!

Honestly, I don't know what some of you are talking about when you say the GTX 295 is soooo much better than the 5870... I have one, and honestly I'm kind of regretting it (partly because I don't have an SLI board and because I want Eyefinity)... but it's not too late to fix that. Anyone want my 295? ;)