BestJinjo :
@ Sakkura "The 7970 only gave a 37 FPS average in Far Cry 3 @ Ultra 1920x1080. Crysis 3 is on the way."
You must be looking at early benchmarks. Since then the game has received more patches, and AMD/NV have released newer drivers that improved performance. The HD7970GE hits almost 40 fps at 2560x1600 Ultra with HDAO in FC3:
http://www.hardocp.com/article/2012/12/17/far_cry_3_video_card_performance_iq_review/4#.URabF6VZUeo
At 1080p, it manages 45 fps with 4xMSAA and 68 fps without MSAA:
http://gamegpu.ru/images/stories/Test_GPU/Action/Far%20Cry%203%20v.%201.0.2/fc3%201920%20ss%204x.png
http://gamegpu.ru/images/stories/Test_GPU/Action/Far%20Cry%203%20v.%201.0.2/fc3%201920%20ss.png
Crysis 3 takes a nearly 35% performance hit with 4xMSAA because the engine uses a deferred lighting model, which makes MSAA expensive. Drop down to 2x SMAA (medium) and you get 55-60 fps on an 1100MHz HD7970, with superior IQ. Also, for many people it's not worth spending $500 to upgrade from a GTX680/7970 just to go from no AA to 4xAA. People tend to want a much more significant increase if they're spending $500 on an upgrade.
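To put rough numbers on that trade-off, here's an illustrative Python sketch. Only the ~35% MSAA hit comes from above; the no-AA baseline and the ~10% SMAA cost are assumptions for illustration:

# Illustrative arithmetic, not a benchmark result.
no_aa_fps = 63                     # assumed no-AA baseline on an 1100MHz 7970
msaa4x = no_aa_fps * (1 - 0.35)    # ~41 fps with the ~35% 4xMSAA hit
smaa2x = no_aa_fps * (1 - 0.10)    # ~57 fps, inside the quoted 55-60 range
print(f"4xMSAA: {msaa4x:.0f} fps, 2x SMAA: {smaa2x:.0f} fps")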
@ hasten "Try again. $400 in the US (us.ncix.com). Canadian does not matter to most of us."
A $350 XFX 7970 bundled with BioShock Infinite and Crysis 3:
http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=3242918&CatId=7387
@ ojas "Nvidia's official road-map...Kepler for 2011 and Maxwell for 2013."
Incorrect. Nvidia clarified long ago that Kepler was for 2012 and Maxwell for 2014. This was in their own slide decks before Kepler even launched last year. The rumors of Kepler for 2011 and Maxwell for 2013 go back to 2010, before Fermi even launched. You are way behind on the data.
http://i.imgur.com/uYIe8.jpg
@ bit-user "I think a more likely explanation is that AMD knows their lineup won't stand up well to the next generation of Keplers, so they went back to the drawing board to squeeze a little more performance of their GCN2's."
Except the GTX700 series is also rumored to be pushed back to Q3-Q4 2013, based on similar sources (Sweclockers, Videocardz, etc.). Since the HD7970GE is already 10%+ faster than the GTX680, even if GK114 is 25% faster than a GTX680, that would only make it about 14% faster than the HD7970GE. If GK114 launches at $499, AMD could just lower the price of the HD7970GE to $349.

They have bigger things to address, like rewriting the memory management subsystem of all GCN parts, which in itself could improve the performance/smoothness of their GPUs. It makes little sense to release HD8000 parts when the HD7000 parts are not even fully optimized by the drivers, given that HD8000 is based on the same GCN 1.0 architecture. Might as well maximize performance from GCN 1.0 so that it carries over to the HD8000 series. AMD rushed the HD7000 series to market and then spent 7-8 months optimizing it; due to the drivers, the 7970 was initially uncompetitive at $550 against a $500 GTX680, and the result was price drops. If they ship the new driver that resolves the memory management issues and gives HD8000 another 10% boost, they could launch a more competitive card at $499-549 and actually make money.
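As a quick sanity check on that 14% figure, here's the back-of-the-envelope math in Python (assuming the 10% and 25% numbers above hold):

# Relative-performance check using the percentages quoted above.
gtx680 = 1.00                 # baseline
hd7970ge = gtx680 * 1.10      # ~10% faster than a GTX680
gk114 = gtx680 * 1.25         # rumored ~25% faster than a GTX680
print(f"GK114 vs HD7970GE: {gk114 / hd7970ge - 1:.1%}")   # ~13.6%, i.e. roughly 14%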
@ gurg "I made my decision and bought a higher performing Gigabyte 680 yesterday for my single monitor. Wanted a Gigabyte or XFX 7970 ghz"
Umm, no. The Gigabyte HD7970 GHz is faster than a Gigabyte GTX680, considering it also ups the clocks to 1100MHz. Newegg has that card for $429.99 with no rebates:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125439
What's more, the regular Gigabyte HD7970 is $379.99 on the Egg, and it will max out at the same clocks as the GHz edition since it's the identical GPU. You can just flash the BIOS from the GHz edition onto that card.
Then there is the Sapphire Vapor-X 7970 GHz for $440, which is also faster than the GTX680 you bought:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202001
Just because you don't buy stuff online doesn't mean there aren't better deals to be had.
@nikoli707, "sorry to burst your bubbles but the msi lightning gtx 680 is the fastest single gaming gpu on the planet."
Nope. NCIX also pitted the GTX680 Lightning against the Asus Matrix HD7970 Platinum, both OC'd to the max. The 680 Lightning lost:
http://www.youtube.com/watch?v=sJaoY0-kfk8
That's not even considering that HD7970GE dominates the 680 in multi-monitor gaming:
http://www.legionhardware.com/articles_pages/gigabyte_geforce_gtx_680_4gb,6.html
Or in titles that use DirectCompute for graphical effects:
http://www.xbitlabs.com/articles/graphics/display/his-iceq-x2-7970-7950-7850_11.html#sect5
HD7970GE also delivers superior FPS/smoothness compared to 680 in BF3 now:
http://www.youtube.com/watch?v=qR3ewLMbywY
You cannot make a case that GTX680 Lightning is the single fastest GPU unless all you play is AC3, WOW, BL2 and Project CARS.
It's a bit laughable that you point to multi-monitor gaming when the games are running at single-digit frame rates. I call that a tie, since I'd never run there with EITHER card. Look at the games in your legionhardware link: even at the lower resolution they used (just looking at Alan Wake), the top card only hits 40 fps average (and what's the minimum?). Up the resolution and only one card hits 19 fps AVERAGE. So maybe 10 fps minimum?...LOL. You can't call a victory for ANY card when ALL the cards are unplayable at the resolutions you're quoting as wins. Correct? Unfortunately, I see reviews like Legion's as a waste of time. I'm only interested in MINIMUM fps, because that dictates my fun factor. It's nice to have averages in there too, but without minimum fps the results don't mean much. Card X beats Card Y, but both end in a slide show?
Your xbitlabs link shows many games hitting under 30 fps minimum at 2560x1440 (never mind your multi-monitor link at resolutions far above this). I call no victory when a card can't hit 30 fps minimum or better. I call it unplayable on EITHER card if you can't get above 30 fps, no matter which one you're using or what you're playing.
http://www.xbitlabs.com/picture/?src=/images/graphics/his-iceq-x2-7970-7950-7850/zfulltable.png
Metro 2033 & Crysis 2 don't even reach 20 fps minimum at 2560x1440. So how can you claim it's faster when nobody in their right mind would play a game that dips to 10 fps? Actually, it's UNDER 10 with AA on. If the 7970 GHz IceQ2 edition (a $400 card) is hitting 9-10 fps and NOT 30 fps EVEN ON AVERAGE, why the heck would I play that game at that resolution, and knowing this, who cares who wins a race at 10 fps?...LOL. I call BS on your analysis. Even Sniper Elite only hits 25 fps minimum on the 7970, and that's with NO AA. Turn on AA and the 7970 GHz hits 9 fps minimum...LOL. Yeah, I like to play slide shows all day, it's so enjoyable. Even the average with AA in Sniper is only 20 fps...ROFL. These cards are NOT meant for anything over 1920x1200 in all but a few games. I wish sites would just leave out any result where a game dips under 30 fps minimum. Those numbers mean nothing.
Only Battlefield 3 and F1 2011 were playable (above 30 fps minimum) on both cards at 2560x1440 with AA ON. Every other game in the xbitlabs review drops below 20 fps (most into the low teens) with AA on at 2560x1440. So arguing about who won at this resolution is pointless when you can't play there. It's like saying I can run faster than you off this 5000-foot cliff and I'll hit the dirt first! The end result is we both die. Does either of us want to run off the cliff to prove the point? Is it any more fun for me if I win? NOPE. We're both still dead, but I win, so neener neener neener...LOL. Who cares? We both lose; neither of us can play this game of running off the cliff. Do you get it?
Please refrain from quoting anything as a win if the card can't do it at a PLAYABLE 30 fps all day. The cards are fairly evenly matched depending on the game you play. I'm so sick of people claiming AMD is better in settings where neither card can hold 30 fps, or trotting out the usual (and also tired) line that NV is bandwidth-starved, when you have to run at a resolution nobody uses to show it. It's extremely difficult to find a game where NV is bandwidth-starved while AMD stays over 30 fps minimum. That's what I call PERFECT engineering on NV's part: the only places their cards hurt are situations that don't exist in actual game play.
"You cannot make a case that GTX680 Lightning is the single fastest GPU unless all you play is AC3, WOW, BL2 and Project CARS. "
Don't forget to add Diablo 3, StarCraft 2, Far Cry 3 (a tie until you kick in the OC'd NV cards), COD: Black Ops 2 (and others I can't be bothered to look up)...It cuts both ways depending on the game. The list of losers for AMD is longer than you suggest here, and I'd say the ones they lose in are more popular to boot. For instance, who plays Sniper Elite V2? It scores 66 at Metacritic (users hate it too), while something like Borderlands 2 scores 89, with 1100 users rating it 8.1 (vs. 6.7 for Sniper, and only 227 even rating it). I think 11 million still play WOW, 6 million bought D3 (not me, but still), etc. Sleeping Dogs has half the users rating it on Metacritic (500) vs. Borderlands, while StarCraft 2 has 2100+ rating it....See my point? The games AMD seems to win in draw less interest from users.

Metro 2033 in the xbitlabs test hits 16 fps minimum on the 7970 GHz IceQ2 edition at 1920x1080! Is that really a win? That's with NO AA, even. Turn AA on and you barely get out of single digits for the minimum...LOL. It's UNDER 10 fps minimum above 1080p! You playing there? Again, I ask: you like slide shows? The average was only 25 fps, for crying out loud. You can't play there! It's not a victory to win at 9 fps minimum...ROFL. Have fun playing there, pal.
I'd also remind you that the 7970 GHz uses 55 watts more than a GTX680, as shown here:
http://www.guru3d.com/articles_pages/asus_ares_ii_2_review,8.html
People seem to forget this. 228W is a lot higher than 173W, correct? That's a LOT of heat in your room if you're in a state like AZ, as I am. It's also NOISY compared to NV cards (well, duh, you can't cool an extra 55W without at least some noise). So it costs more to run (leaving an extra 50W bulb burning isn't free, last I checked), it heats up your room more, and it makes more noise. That's three losses I don't like in any case.
In the xbitlabs review you yourself linked, it's a 70W difference:
http://www.xbitlabs.com/articles/graphics/display/his-iceq-x2-7970-7950-7850_8.html#sect1
OUCH. Worse than my example: 483W vs. 414W for the GTX 680, and even the 7950 Boost ran at 468W.
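For what that extra draw actually costs, here's a rough Python sketch; the electricity rate and daily gaming hours are assumptions, and the wattage delta comes from the numbers above:

# Rough yearly cost of the extra power draw.
extra_watts = 55          # guru3d delta; the xbitlabs delta was ~70W
hours_per_day = 4         # assumed gaming time
rate_per_kwh = 0.12       # assumed US-average electricity rate, USD
kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/yr -> ${kwh_per_year * rate_per_kwh:.2f}/yr")
# ~80 kWh/yr -> ~$9.64/yr, and every one of those watts ends up as room heat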
Also note that most people benchmark NV cards at REFERENCE clocks when comparing performance (like Tom's and HardOCP these days). I wouldn't touch a reference-clocked NV card EVER. You can get another 100-150MHz for free, OC'd out of the box. Why buy a 915/980 card (which everyone loves to bench against AMD's GHz editions...LOL) when Newegg offers 1033/1111 (Zotac AMP), 1032/1111 (Gigabyte) & 1046/1124 (EVGA) for near the same prices ($279/289) in, say, the 660 Ti? It's the same across all models; see the quick math after this post. Who buys reference-clocked NV cards? What for? Note the fine print on the test setup pages at Tom's, where they tell you they're using reference clocks. I still don't know why. There's a big difference between 915/980 and 1046/1124. Sites should be doing hardware reviews with what we BUY, not reference clocks. We keep seeing benchmarks of AMD's GHz and Boost editions vs. REFERENCE NV cards, which is not realistic. I really wish NV would release their own GHz editions so this would stop. They are, after all, the only cards you'd buy if buying NV, unless you purposely want to skip the extra 10-15% gain the factory-overclocked cards have offered since launch.

When you consider the out-of-box experience with these cards, other games become victories, or at worst you get a lot more ties. There is no clear victor this generation. NV had the clear lead until around October, when AMD's 12.11 drivers arrived and made things pretty much a wash in most games (BL2 & D3 being exceptions for NV). Titan will change this, no doubt, but for very few people (who has ~$900 for a card? They could shock us with $400-500, but I doubt it).
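The size of that factory-OC gap is easy to put in percentage terms. Quick arithmetic on the 660 Ti clocks quoted above (listed clock speeds only, nothing measured):

# Percent clock uplift of a factory-OC card over reference clocks.
ref_base, ref_boost = 915, 980      # reference GTX 660 Ti core/boost, MHz
oc_base, oc_boost = 1046, 1124      # EVGA factory-OC example from above
print(f"base: +{oc_base / ref_base - 1:.0%}, boost: +{oc_boost / ref_boost - 1:.0%}")
# base: +14%, boost: +15%, right in line with the 10-15% "free" gain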
Having said all that, I'd rather buy a card that uses fewer watts and puts out less heat and noise, from a company owning ~65% of the market that made $500 million last year, than from a company owning ~25% that lost $1.8 billion. It took AMD most of the year to catch NV via the 12.11 drivers. That's either NV getting it right from the get-go, or market share causing developers to optimize for NV first.
Hardocp sums this up well here:
http://hardocp.com/article/2013/01/21/2012_nvidia_video_card_driver_performance_review/6
NV dominated most of the year with better drivers & game optimizations as new games appeared. Meanwhile, AMD has to give away games to keep from losing even more market share, and they're still losing $1.8B. They'd better quit giving away games to drive sales, or this will just keep happening. I'm half glad they delayed their next cards, as they really can't afford to go toe to toe in R&D with NV ($3.5B in cash and no debt on NV's side means AMD loses that war). They need to slow down and make some money off current R&D (milk the cow, so to speak).

Console sales may help this dynamic in 2014+, but only if they succeed, and I doubt this. Consoles are no longer alone (phones, tablets, Ouya, Steambox, etc., etc....the list is growing). NV's own Project Shield brings your PC's power to your TV. Then what do I need a console for? If my PC can be anywhere in the house and its GPU can put a game on my TV, a console is pointless for a lot of people. All the other devices mentioned will add up to a loss in console sales, IMHO. It's simple math.

This game is already over; we're simply watching AMD bleed to death. Unless they get bought, they'll continue to get weaker and weaker (first they gave up the CPU race, now they're delaying the GPU race, which no doubt will prompt an announcement next week from NV delaying Maxwell, since slowing down when no competition is pushing you is just good business, as Intel has shown). See the writing on the wall yet? AMD was the worst-performing stock in the semiconductor index last year. I don't see that changing this year.
http://www.forbes.com/sites/greatspeculations/2012/11/30/why-amds-stock-collapsed-and-how-it-can-recover/
The recovery scenario is a pipe dream. No ARM chip until 2014, etc. Nothing on the horizon for a GPU gain, CPUs under attack from INTEL & ARM (many ARM competitors entering servers and even desktop CPUs shortly): this is getting uglier by the minute. They are chasing the crowd instead of leading it. NV will have five revisions of Tegra under its belt by the time AMD debuts its first ARM chip (whatever they call it). Project Denver/Boulder will also be out attacking their CPUs (and NV isn't alone in doing this). There is NO good AMD news on the horizon, I'm afraid. AMD has to get bought, or we'll all be paying NV $1000 for cards in 2014-2015. OUCH. I fear that shortly $500 will be our mid-range again. That sucks. I hope they're still shopping the company via JP Morgan, etc.
Sell or die.