Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked



But here's the thing: Hawaii is faster clock for clock, with a wider memory bus and more VRAM. So if both the 290X and the 780 Ti are overclocked to 1250 MHz, the 290X will pull ahead, as seen in Fire Strike Extreme. It's the exact same situation as Tahiti versus GK104 Kepler; AMD's architecture scales better.
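For context on the memory-bus point, peak GDDR5 bandwidth is just bus width times effective memory clock; a quick back-of-the-envelope sketch in Python, using the launch specs (512-bit at 5.0 Gbps for the 290X, 384-bit at 7.0 Gbps for the 780 Ti):

    # Peak GDDR5 bandwidth in GB/s = (bus width in bits / 8) * effective clock in Gbps
    def bandwidth_gbs(bus_bits, gbps):
        return bus_bits / 8 * gbps

    print(bandwidth_gbs(512, 5.0))  # 320.0 GB/s -- R9 290X
    print(bandwidth_gbs(384, 7.0))  # 336.0 GB/s -- GTX 780 Ti

Note the wider bus also means each extra Gbps of memory overclock buys Hawaii more bandwidth (64 GB/s per Gbps vs. 48 GB/s on the 384-bit bus).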

The reference cooler on the R9 is crap; just by adding aftermarket cooling and leaving clocks at stock, performance jumps 10-13%. Furthermore, the R9 is on beta drivers, and Mantle is approaching fast with wide dev support. The future is HSA and OpenCL!
 





A refresh of Hawaii isn't needed; AMD's mistake was the terrible reference cooling. Hawaii is faster clock for clock, we know this, and aftermarket cooling will only prove it further. Lol, but yes, Mouse, at stock the 780 Ti is less than 8% faster even with its superior cooler!
 


http://www.youtube.com/watch?v=EIPggCgYK38
 


Ohooo! Shocked that hasn't gone viral yet...

I have no preference between AMD/ATI cards and nVidia. I just want the best card for the money per resolution that I can afford. But, ericjohn, as much as you say AMD doesn't have the best cards, there are other factors involved than just performance.

Now, I have no idea what the 290X can do versus the 780 Ti with stock or aftermarket cooling solutions. But the way I see it, if the 290X gets close to it, and for less money to boot, that's a win for AMD.

Because you're not going to see, say, the difference between 112 FPS and 108 FPS between the cards at the same resolution; the only things you'll likely notice are the minor stutters/micro-stuttering at full speed, and maybe some other qualities that bug the hell out of you, like tearing and such.

But the person who can live with those issues is going to feel like the winner when all is said and done. And that's usually where people form their opinions, based on their experiences. If that AMD card doesn't display those annoyances very much, nobody can say he was wrong for choosing it over another card.

For me, the best buys from both of these companies are...
AMD: R9 280X
nVidia: GTX 780

I could go either way, but I will admit, nVidia has clearly shown that their cards are more dependable at higher resolutions. AMD needs to do a better job with frame-time variance; nVidia's variances look like a tightrope compared to the "seismic waves" of AMD.

And yeah, that damn reference fan sucks too (proven). But they usually rely on the likes of MSI and Sapphire to find better solutions to cool these cards. AMD simply wants to get their product on the shelf and into the hands of the customer, and it's obvious from that crappy fan, which they haven't bothered to improve since its inception.
 


We can argue all day about the 8% (I say BS, it's faster than that in a LOT of games), but it's well known that the 780 Ti also comes with 10 dB less noise and 10°C lower temps. If AMD could have made a better cooler and charged the same, they would have. But they COULDN'T. So they DIDN'T... Right? Do you think they purposely chose a turd for cooling and figured "ahh, f&ck it, who cares if it brings 1000 crap reviews with people making fun of the noise/heat, etc."?

I also think they thought they had better chips than they actually did (fire the guy that made that mistake), or Asus wouldn't be shipping an OUT-of-spec fan after AMD upped things 300 RPM. It's clear Asus was told "our chips will come in at 95°C and have no issues with your ~1800 RPM fan." Then chips started popping off the line that couldn't do this, so AMD said "raise everything 300 RPM." I'm sure Asus said a quick "WTF?", as all their cards are out of spec now, right (according to Tom's they are not made for ~2200+ RPM)? I don't see Asus saying "oops, some stupid engineer guy can't read, we shipped the wrong fans on retail units." AMD overestimated the "coolness" of their chips... LOL.

When that happens, none of your chips do 900 MHz-1 GHz all day without talking water cooling/warranty destruction, etc., right? 😉 "Wait for us to release the REAL version" is not a good response. Your margins won't be as high after you change the cooling either, if you're selling at the same price, and AMD can't really raise the price at this point... ROFL. That move would surely make matters even worse.

https://www.youtube.com/watch?v=m1JOhT015ww&html5=1
A 780 Ti unboxing with LinusTechTips. The 780 Ti, 290, and 290X are all maxed out of the box, warranties intact. This is nowhere near 8% in any game. Skip to 8:35 in the vid for all the benchmarks. I call that getting smacked around big time.
BioShock + BF4: ~15% (upping to 1600p it's still 20%/10%+, but many games are under 30 fps min there anyway)
Tomb Raider: ~19.2% (same at 1600p)
Far Cry 3: 20% at 1600p
Also note at 10:15 in the vid he explains why they say the 780 Ti isn't overkill for 1080p YET :) This comment is for anyone about to jump in saying "1080p and a 780 Ti, are you freaking crazy?" Apparently not yet, as engines keep getting tougher on GPUs (CryEngine will no doubt be tougher in Star Citizen next year than in Crysis 3 itself, as the LinusTechTips unboxing noted). The Unreal 3 engine of today is far more graphically taxing than it was at release. They don't remove effects as an engine evolves, they ADD TO THEM, right?
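For anyone checking those numbers against the video, the percentages are just relative fps deltas; a minimal sketch of the arithmetic, with made-up fps values rather than the actual benchmark figures:

    # How much faster card A is than card B, as a percentage
    def pct_faster(fps_a, fps_b):
        return (fps_a - fps_b) / fps_b * 100

    print(round(pct_faster(86, 72), 1))  # hypothetical 86 vs 72 fps -> 19.4%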

In order to say 8%, you have to run reference speeds all day on the 780 Ti (who does that?) and find NV's worst games (Crysis Warhead, anyone?... Oh wait, that hasn't been played in 5 years or more), while at the same time getting a LUCKY AMD GPU, and you probably have to live in ALASKA where your GPU becomes a room heater. Also note the 290 won most benchmarks vs. the 290X; AMD's bump in fan speeds makes the 290X pointless... Clearly a bit of desperation after the 780 Ti launched (you've killed your own KING card in weeks), and neither will do anything more than catch NV. I'm not even sure that will happen, knowing you can OC quite well out of the box on NV, and they have aftermarket solutions too (out already, BTW). There is no game in the above vid under 10%. So saying 8% across the board is a bit of a pipe dream on your part, considering not ONE game was below 10% at any res above (unless you're already below 30 fps min anyway; you can't win if nobody can play there), right?

MOH Warfighter, Assassin's Creed 3 (over 10%, but less at 1600p), COD Black Ops 2 (over 10%), Diablo 3 (over 10%), StarCraft 2 (over 10%), World of Warcraft (over 20%), Splinter Cell Blacklist (over 20%), Guild Wars 2 (over 10%), Far Cry 3 (10%+), Lost Planet 2 (TweakTown shows a loss to the lowly 770 in this game, never mind Titan/780 Ti). All from 780 Ti articles at TechSpot, TechPowerUp (most games came from here, they test more than most), HardOCP, Guru3D, TechReport, TweakTown, etc. I do not count any victory in a res or game that won't do 30 fps min, and if you dip below that I automatically discount the score no matter who wins the game (so anyone about to start quoting 4K crap, don't bother if it's under 30 fps MIN, not avg or max, MIN!). As Tom's says of Metro LL at 4K, "the win is largely symbolic though." Yeah, winning at 30 fps avg is kind of pointless; the game can't be played at 30 fps avg (and remember that's not a min, you may see TEENS for min fps). Not much changes at 4K other than a bunch of games becoming unplayable unless you like making them ugly first. I play games AS designed, and turning stuff off is against my religion. :)

Don't forget none of these OC cards (or any modified card) will be $400/$550. They will ALL be above AMD's reference prices. Well, that may not be the case if it takes another month or two to get them out the door; we are already one month in, while the 780 Ti came out of the gate with OC cards and water block announcements from EVGA on day one. We aren't even talking G-Sync or drivers here, which are already worth a premium over AMD for me. So I end up seeing drivers, perf, heat, noise, G-Sync (among other things, streaming to Shield/TV, etc.) and no reason to purchase AMD even if they fix the fan shortly. Fix it and give a price drop and maybe I'm pondering you, but to do that they would be losing money (if they aren't already after the launch issues, probably paying some to change fans; Asus clearly needed a new one to handle the extra 300 RPM).

We also aren't talking about the better games bundle (3 AAA titles, all very new), though no game causes me to buy a card I have to live with for 3-6 years. It's all about the card for me, but the games are there too for people who care. There just is no way to slice this as a winner for AMD, no matter how much a person loves the company. You take 8 months to catch a guy, come lacking drivers, come hot and noisy, come lacking games, come with more watts, and then he beats you again ~2 weeks later anyway, on top of all your launch problems/fixes.

We are still waiting to see what AMD was supposed to be able to do. Well, we've seen the card modified and it won't catch the 780 Ti even then, as Tom's already added a $75 mod (which alone would make me go 780 Ti knowing everything else here). A mod won't bring them to 83°C while running the 1 GHz needed to even come close to the 780 Ti, and do it all under quiet-mode dB levels. Only a poor person or a fanboy would go AMD at this point :) I can OC the 780 Ti out of the box and get speeds that were NOT used to get the scores above (only the linustechtips vid is OC'd for everything; the rest were just reference scores). At best a mod will move the goal posts, but it won't change the outcome if you OC the 780 Ti, which is all a fan mod does for the 290/290X anyway.

The 290/290X can't run as designed, so it's essentially an OC if you need a mod to hit the advertised clocks, and even then you're 10°C hotter and noisy. Semantics here: you say they needed a better fan, I say the card can't run as they claimed. Either way anyone sees it, the result is that AMD can't run as advertised. No, saying "UP TO X MHz" doesn't work; that only tells me I may not get what I paid for. Who would buy a brand new Mustang that does UP TO 150 mph, MAYBE, if the stars align, you pray to god, perhaps slaughter a chicken, etc., but it depends on how good the engine is, which varies by 30% in perf? NO FREAKING WAY that car sells to anyone but a Mustang fan.

If I have a GPU that draws 10,000 watts and runs at full speed doing it in open air (never throttles, maxed all day), can I really claim victory if you can't do this in ANY normal condition? If I throw it in a normal box, but it needs to run at 250W and 1/4 speed to do so, then gets smashed in all benchmarks by my competitor, you don't keep claiming I'm faster, right? You don't keep saying "all I need is a great heatsink and fan and this is a whole new race," right? That is dumb and not true. It's like benchmarking at 4K and claiming a victor when nobody hit 20 fps avg (never mind mins, you're already unplayable). Who cares? Even if you could claim this crap, as the OP already said, you can mod NV cards too. You don't say NV is faster and all you have to do is void your warranty and use liquid nitrogen to prove it... ROFL.

I don't know about anyone else, but I couldn't care less how much work you get done clock for clock. I only care that whatever clock you run at, YOU WIN, and I don't expect the card to slow down and no longer run as advertised over time during play. If you sell me a 1 GHz card, it better run at 1 GHz all day or ABOVE (like NV, guaranteed speeds out of the box). NEVER below. I don't expect a video card's performance to be all over the map depending on how hot my case is, airflow, etc. Whatever you say it does, it should do all day no matter what I do (that's the point of QA, right?). I don't care if you run 100 MHz or 2000 MHz; I care which of these is the WINNER, and can you do it at or below the other guy's heat/noise/watts? In this case the winner is Nvidia (perf), Nvidia (heat), Nvidia (noise) and Nvidia (watts), and did I mention the games winner: Nvidia (3 AAA games). Will a magical fan cure SOME of this? Probably... But will those cards be $400 and $550 (290/290X)? Because so far these weren't the REAL models, right? So those were just FAKE prices, correct? Because the REAL card that can actually do 900 MHz-1 GHz all day (well, closer anyway... LOL) isn't out yet, right (are Asus OC 290Xs in stock yet? Maybe, but it took two months, so you get the point)?

As a consumer I don't care about die size, IPC or clock speeds (I'm more interested in PERF, noise, heat, and watts, pretty much in that order). How fast would NV be if they released a driver or BIOS tomorrow that upped the temp target to 95°C instead of 83°C and raised fan speeds to whatever matches AMD's average noise in reviews of current retail cards? I'm sure AMD lovers would freak and call NV cheaters... ROFL. Does clock-for-clock crap matter if you're using more watts and still losing? Pound for pound, 8 cores should be better than 4. But that isn't always true either, and even 5 GHz doesn't help AMD on the FX-9590.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/62166-amd-fx-9590-review-piledriver-5ghz-13.html
Some games, like Skyrim, are still 30% slower even at 5 GHz with 8 cores vs. the i7-4770. The next page has Torchlight 2 showing pretty much duplicate results (25% slower on AMD). They didn't win a SINGLE game in the whole review (does AMD win any game today on CPUs?). Civ 5 is over 30%. There are many more showing AMD far behind. Again, PERF is most important here, and INTEL does it with less watts (far less; a 220W CPU? WTF?), heat and noise. What is comic to me is AMD thinks this chip is worth nearly an i7-4770. Put the crack pipe down, please, AMD, and quit spending on consoles (which personally I just want to see die) instead of GPUs/CPUs/drivers. Consoles are the sole reason you don't seem to have enough money to spend on proper R&D for a card launch, release 220W CPUs that can't catch an 84W Intel with half the cores (that, mind you, also runs 1 GHz+ lower), and are still fixing drivers for 2-year-old cards (phase 2 coming late Jan now... ROFL). The only way this could get worse is if they screw up Mantle on BF4, which, if delayed for months, would be a screw-up, right? It was supposed to land in Nov; now Dec is nearly gone and it looks like even further out, as DICE has likely stopped Mantle optimization in favor of fixing their BROKEN game. Surely DICE has put everyone onto fixing a game that may cause a lawsuit if they don't throw everything they have at it.

http://www.forbes.com/sites/briansolomon/2013/12/12/did-ea-lie-about-battlefield-4-now-under-investigation/
You shouldn't have attached your new feature to EA. If BF4 sales are already off 70% from BF3, and there's talk of a pending lawsuit now, how late will Mantle be, and does it even matter that it's in there if nobody wants the game now?
"Of further worry may be that EA has permanently damaged its reputation among gamers, which could affect its sales of other games in the future."

I thought the reputation was already damaged... LOL. They are the most hated company on the planet, aren't they?
http://www.gamespot.com/articles/refunds-available-for-battlefield-4-premium-after-server-woes/1100-6416663/
Microsoft is refunding Xbox BF4 owners now? I'm hoping Carmack is reading this crap and realizing he was right about "done when it's done," as that is CLEARLY the best model to follow. Shipping a beta and fixing it at retail is a bad idea. Why would you hitch your wagon to the worst company in America two years running (and with BF4 they'll likely win the 2014 award too... ROFL)? I digress...
 


Hey, you seem to have a very strong opinion on this, so I am just going to throw this out there to see what you think, even though a few people on a forum are probably not going to change your mind. If you look at most of the benchmarks (i.e., Tomb Raider, Crysis 3, Hitman: Absolution, almost all of the first 2 pages of 1080p benchmarks), the fps are either way over 60 or, for the Current Games section, within the margin of error. And if you think the tests are accurate 100% of the time, the 9590 is on average (for the "4770K vs. FX 9590 (Current Games)" page 1) only 4.422 frames per second behind an i7-4770K, which runs for 300 dollars, whereas the 9590 goes for around 280 without any cooling solution (let's be honest, if you are going for a high-end system with any of these CPUs, you probably are not using the stock cooler anyway).

As for the PS4/Xbox One/Wii U being a waste of AMD's time and money, that is quite obviously wrong. AMD is going to be reaping profit from Sony, Microsoft, and Nintendo, so regardless of which console wins the most market share, AMD gets a slice of the pie. Maybe that has an effect on investors too, with AMD's Q3 2013 financials being the first time in a while they have made a net profit, which will probably carry over into Q4 as well. This boost will give them more resources to work on CPU, GPU, and APU technology, providing bigger performance increases between generations and better/more advertising and distribution.
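That 4.422 fps figure is just the mean per-game gap; a minimal sketch of the arithmetic with placeholder fps numbers, not the actual review data:

    # Average fps deficit across a set of games (placeholder numbers)
    fps_4770k = [112, 96, 88, 140, 75]
    fps_9590 = [108, 92, 85, 135, 70]
    gaps = [a - b for a, b in zip(fps_4770k, fps_9590)]
    print(sum(gaps) / len(gaps))  # mean gap in fps, here 4.2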

On the GPU side of things, Hawaii is a hit; the cooler is a different story. The GPUs can safely operate up to 95°C but will clock down as temperatures go up, and this doesn't mean that non-reference cards like premium ASUS models (think Matrix or Platinum cards) or Sapphire Toxic edition cards will be plagued with the same temperature problems.

Also, I was at AMD APU13 and I played a Battlefield 4 demo on 2 identical systems, except one had 2 GTX 780s in SLI and the other had 2 R9 290s in CrossFire, both with the cards at stock settings (I checked that the graphics settings were all the same). The Nvidia cards had clear stutter and difficulty keeping even 30 fps, dipping even lower during action, while the 290s were able to stay within 45-50 fps. Both systems were built by Maingear with their cooling scheme, with all-in-one CPU liquid coolers keeping heat away from the GPUs. A pre-built system at stock settings is what most people play on, because the overclocking and enthusiast community is not as big as forums would make it seem. Most people don't feel comfortable building their own PC and overclocking; they would rather have something shipped to them which they can use to play games right out of the box with little to no effort on their part. Despite AMD optimizations in BF4, you can't ignore a 10+ fps difference between two cards in the same price tier.

Hawaii really flexes at higher resolutions, but right now 4K isn't mainstream; most people play at 1080p, with some enthusiasts going out to 1440p or 1600p, and according to Steam's hardware survey only 3.63% of gamers have monitor resolutions greater than 1920x1080. The thing is, most people who game at 1080p or below are not going to be buying the latest 500+ dollar GPUs. When you look at these companies you have to realize more people have Intel HD 2000 graphics than GTX 670s and 680s combined (again from the latest Steam survey, so that is only people who have Steam, and it doesn't include anyone who might play casual browser games or on other platforms like Origin or Battle.net). For these guys the real battles are at the entry level, like the GT 610 or HD 7570, and mid-tier cards like the GT 640, GTX 650, HD 7750, and HD 7790.

For what it's worth, with APUs picking up speed and of course their involvement with the consoles bringing publicity to both AMD as a brand and to APUs, things are looking up. Are they going to take over Intel's market share any time soon? Probably not. But are they getting out of the pit they dug themselves into with Bulldozer? Slowly but steadily, and their pace of recovery will only get faster. Intel and AMD both understand that desktop CPUs will still be around as long as people use computers, but they also know that more and more casual users, and people who do not play games or don't care about high detail settings at high resolutions, are switching to tablets and laptops, and that is why they are investing in low-TDP parts and processors intended for those platforms.
 


These days we're not seeing that much more performance out of water cooling compared to what we used to... Haswell runs into voltage limits (4.5 to 4.7 GHz most often) not that far past high-end air cooler (4.4 GHz) territory... and the GPUs, though they run a lot cooler under water (mine at 39°C), also run into voltage limits or artifacts not far beyond where air takes them.

The 290X is aggressively overclocked out of the box... I haven't been able to get more, and I haven't seen a review getting more, than 7% over the Uber setting on the 290X. With a decent 780, not the Ti (e.g., the Asus DCII), I have been able to get a conservative 25% OC on the core and 20% on the memory, resulting in max boost clocks in the range of 40% over stock speeds.
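In case the 25%-vs-40% sounds off: the offset is applied to the base clock while the gain is measured against the rated boost. A rough sketch using the reference GTX 780 clocks (863 MHz base, 900 MHz boost) and a hypothetical observed max boost:

    # Reference GTX 780: 863 MHz base, 900 MHz rated boost
    base, boost = 863, 900
    oc_base = base * 1.25    # 25% core offset -> ~1079 MHz set clock
    observed_max = 1254      # hypothetical max boost seen under GPU Boost 2.0
    print(round(oc_base))                           # 1079
    print(round((observed_max / boost - 1) * 100))  # 39 (% over rated boost)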
 


You're missing the most important part of the debate, though... AMD's cards, when not being snuffed up the noses of Bitcoin miners, are usually much cheaper than Nvidia cards. And I do like Nvidia cards; I have a 550 Ti in my HTPC and a 7850 in my main gaming rig. If I had to choose between the two I would hands-down pick the R9, mainly due to Mantle coming out, and it comes with Battlefield for free, which to me is worth it. The only problem I face is that the market is so screwed right now the cards cost $100+ over what they're supposed to be because of the whole Bitcoin thing... lame!
 


If AMD made something like a 290-B for Bitcoin mining, stripped to the bare minimum of everything that is not needed for mining Bitcoin or any of the other scrypt/crypto currencies, they would make boatloads of money, and it would at least help keep some of those hawks away from the gaming cards.
 


Oh, I totally agree, but that could take months, and we know that AMD would get screwed on that deal because who knows how much those currencies could drop overnight... too much risk, not enough reward, and that's why we'll never see them do it...
 


Well, I guess board partners could technically do that; they can make a non-reference card built just for mining and call it the 290-B, like when PowerColor made a 7990 before AMD had even announced it.
 


Good luck getting Mantle into any game AMD doesn't pay for (no different from PhysX). It is NOT a reason to buy a card. Nobody will write for this for FREE. It gains a dev no extra cash on your purchase (they can't charge an extra $10-15 to AMD buyers who use Mantle; it is just wasted resources), so why support it unless AMD pays you? Do you like wasting time for no profit? Having ENGINE support for Mantle doesn't mean your game has Mantle support; it just means that if you WRITE for Mantle it will work, but you still have to do the writing. Every dev that uses Frostbite 3, etc. still has to do the work. Of course the engine needs support first, or you can't write for it at all. But having that support doesn't magically mean all games using Frostbite use Mantle; it just means that if a dev decides to devote some extra time to Mantle, they CAN. PhysX is in the Unreal 3 engine, but only rarely, when a dev decides to use it, do we get a PhysX game (Borderlands 2, etc.). If NV doesn't pay, a dev completely ignores that part of the Unreal 3 engine.

At best, and only confirmed by ONE SINGLE dev, you MIGHT get 20%, and even AMD only claims "we wouldn't bother for 5%." So why not claim "we wouldn't bother for 15%"? Because you will get 10-20% most of the time, or less. I can get that with a driver update from either side in some games, or two driver updates in others (many games gain 10% per update from either side). At worst I can get it with a refresh card a year later, and that affects ALL games with NO dev support because the card is just FASTER. E.g., the 780 Ti is ~20% faster than the reference 780 nine months later, and that perf affects all games. Having said that, I'd rather buy for G-Sync, which again affects almost all games with no dev cost; I buy the hardware and pretty much all games get better. AMD can't afford to make Mantle widely used. How many times do you think they can pay a dev $8 mil to launch a beta product?... LOL. BF4 is a disaster leading to lawsuits and doesn't even have Mantle out yet. Mantle will always get the LEAST attention. Look at the number of games supporting PhysX and divide by 5-10; that's the number of games that will support Mantle, as AMD has no funds. Devs won't waste the time for even 20% when most of the market will NEVER see it (most of AMD's lineup and ALL Intel/NV solutions won't see a single % of the work you do, so why do it?). Also, even fewer Mantle-capable cards are in the hands of REAL GAMERS, as the miners have been buying everything in sight. So it will be even longer before you have a reasonable audience that will even see your work, and even on that day, it won't be many people.

GameWorks from NV (a direct response to AMD's proprietary Mantle crap) will have a better shot at surviving, and I hope that fails TOO! We don't want either side to get this proprietary crap working; it splits dev time and takes away from actual game design to improve things for a few people. Every ounce of time spent on Mantle or GameWorks crap means less time on the actual GAME for all players. If devs could charge an extra $10-20 for Mantle users (or PhysX, etc.) you'd see everyone writing for it. But you can't; you're stuck charging the same for everyone, so nobody will use it unless paid. Period. This might have had a better chance if consoles would use it too, but only a fool would think MS would let ANY other API compete with DirectX... LOL. "I won't help you kill my own platform; that would be stupid." Microsoft will do everything they can to kill Mantle (just like OpenGL, etc.).

I just don't understand anyone buying a card for Mantle. You'll be disappointed by the fact that AMD can't afford to get more than a few launch titles (BF4, maybe a few more) out from the get-go, and possibly a title or two each year after that, just like PhysX, etc. PhysX is actually in the consoles and support still sucks, and that is backed by a company with $2.7B in cash and NO DEBT. How far do you think Mantle gets with a near-bankrupt company that has lost $6 billion+ over the last 10 years (yeah, not made a dime in 10 years, only losses), has billions in debt, and is running out of cash? NOT FAR. AMD needs to start creating products and tech that sell themselves, not tech (Mantle) you have to PAY people to use because it gains them NOTHING in profits. G-Sync sells monitors at a premium, so monitor makers have a reason to push it (it sells monitors). Mantle, PhysX, etc. get nothing but a few happy people, with no extra money.

You get what you pay for. AMD has cheaper products right now, but for that you get phase 1, mentioned last July, that arrives in Nov; phase 2 that you're still waiting for; and who knows how many phases will actually be needed to fix everything (phase 2 doesn't cover any DX9/OpenGL people, etc.; I could be wrong on OpenGL, we'll see). You're now 2 months past the release and still waiting for NON-REFERENCE cards. Phase 2 has slipped from November to Jan/Feb. I would gladly pay more to avoid that nightmare that seems to never end. Phase 2 is the Eyefinity release, I think, so don't expect DX9/OpenGL fixes. How many phases are needed to fix stuff that should work out of the box? I guess we each decide how much pain we are willing to put up with.

I prefer NV's drivers at this point, with G-Sync on top as a bonus. I say that as a Radeon 5850 owner :) AMD will have to do a lot of convincing to get me again. My card has worked out perfectly, but I wouldn't want to be an owner dealing with all the problems of the last few years with AMD, nor waiting on updates for pretty much every game during that time, updates that sometimes come 6 months later. NV updates for games usually come before the game itself, or on the title's release date at least. That is what you get when a company has the cash to spend on a REAL driver team. Not that AMD's isn't real, but it seems they need to double it to cover all their products and games as they come out. Their reputation can't take many more delays.
 
As worked up as you seem, it is true that AMD has had losses, but last quarter AMD was in the black and the trends point to continued growth. No one can know for sure until the next quarterly financials are released, but things look good as of now. Mantle will need work, no doubt, and they do provide developer incentives for using Mantle to big-name studios; but Crytek has not (to my knowledge) announced they will be using Mantle, and the Star Citizen devs have said they will be using it in a CryEngine game. AMD does have driver (and CPU) issues to work out, but with the new CEO, hiring back some old staff, and bringing in some big names in the industry across the board, including on the driver team, they seem to be taking a big step in the right direction. They are not going to take back lost market share overnight, and rebuilding is slow for a company of that scale, but I would not count them out just yet.
 
AMD is out of the gaming CPU market and has bet big on the console market, and nVidia is betting the higher performance of today's higher-end GPUs will keep gamers building gaming PCs. Only time will tell who is right, but I prefer a system I can build myself and know what goes into it, plus I can upgrade it when something better comes out. With a console you can't really do that, and besides, it spies on you and sends the data to MS or Sony depending on which one you have, lol.
 



Ya lost me on this one... the 780 is cheaper than the R9 290X even before the vendor price add-ons, and while the 290X leads out of the box, its 7% max OC hurts when compared against the 780, which can OC 25% and more.

Skip to the 8:30 mark
http://www.youtube.com/watch?v=djvZaHHU4I8

The BF4 game is a nice incentive, but I bought two 780s just before Xmas and used the six games that came with them as Xmas presents... the three $50 games with each card made my net "out of pocket" just over $350 per card and saved me some Xmas shopping.
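For anyone checking that math, a quick sketch; the ~$500 street price is an assumption on my part, since the post doesn't state the purchase price:

    card_price = 500        # assumed street price per GTX 780 at the time
    bundle_value = 3 * 50   # three $50 games per card
    print(card_price - bundle_value)  # 350 net "out of pocket"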
 


I think AMD is gonna learn what nVidia learned last time around... the console market is a huge source of revenue but not a huge source of profit. With that kind of volume, Sony and MS get to really squeeze the suppliers, and while ya sell a lotta stuff, ya have a very small margin, if any. It keeps people employed, factories open and the lights on, but there's very little money to put back into R&D or other endeavors, and it puts a substantial drain on company resources, hurting other markets.

If AMD can turn Mantle into a synergistic undertaking and save money by developing for consoles and PCs at the same time, they have a shot at turning recent fortunes around. But they really, really need to get more than four cards into the Top Twenty-Five.

http://store.steampowered.com/hwsurvey/videocard/



 
I am pretty sure that AMD and Nvidia both know that, between the two of them, the enthusiast and extreme market segments are locked up. The real threat to them both as discrete GPU designers is trying to convince an OEM that it is worth putting a 7750 or GT 630 in their computer instead of Intel's integrated solution. AMD has the APU as a counter to that, but it needs to gain a lot more traction before Dell will risk confusing customers with AMD vs. Intel, when Intel is plastered all over the place and has TV ads and AMD is not really that well known. The enthusiast market will be there as long as there are enthusiasts willing to buy the products, but without a doubt the mainstream market is larger and more apathetic toward things like Mantle or PhysX. No matter how large the community seems from forums or blogs, anyone going out and buying a graphics card above an HD 7770/GTX 650 is (as of now) not the norm.
 
NEED HELP: Why does this 780 Ti have higher fps than my 780 Ti in Arma 3, by a huge difference of 50 fps? I have like 27 fps average while this benchmark shows 75 fps. I know that my CPU is good enough (i7-4820K). Could somebody please help me?
 


Please post questions like this in the appropriate section of the Forums. It also depends on what resolution and detail settings you are running, any other programs running in the background, what version of the Nvidia drivers you have, etc. There are too many factors, and it would derail the purpose of this comment thread.

 

You really nailed a point that hurts AMD even more than Intel practically outgunning them: AMD sucks when it comes to advertising their products.

You don't see commercials for their processors or GPUs. For that matter, I remember seeing Intel commercials years before AMD even came to mind as a possibility for a PC. Now here they are with an APU that their investors are hoping will be the meal ticket for the entire company, so to speak.

Has anyone seen a commercial yet for this APU? How about their GPUs?

I don't hate AMD. But I do wish they'd get their backsides in gear and properly market their products as if they actually want to win people over.
 


Marketing is everything; otherwise people wouldn't pay outrageous prices for lower-cost components that go into Apple products.
 
The mobile version of the forums doesn't let you quote messages, but to continue the thread: AMD has been getting more coverage with the Xbox One, PS4, and Steam Machines, and the CES coverage of Kaveri was not completely left out of more mainstream tech sites like Engadget; I think there was a piece on Gizmodo too. So things are getting better. They are finally starting serious reconstruction, and if the Q3 and Q4 '13 financials show anything, it is working.
 