GeForce GTX 750 Ti Review: Maxwell Adds Performance Using Less Power

I also agree with a sentiment that Cleeve expressed a while back now, that there are no longer any bad cards, just bad prices. Considering that it beats a GTX650Ti and comes with 2GB of RAM, the $150 price is not "bad," but I wouldn't call it compelling either. For $120 however, it would fly off the shelves.
 
Seriously wondering what the SLI capabilities of this thing are. Also really want to build a green gaming computer now. If only I wasn't a broke-ass college student.
 


No, I think he meant sound cards. They are usually this small, and video cards usually are not. It is about the size of an Audigy, etc.
 
" ...we’re excited to think about what Nvidia might do with this architecture and a 250 W power budget. " little tweakup (H265, Direct X 11.x ??) and a die-shrink to 20nm!Wow The 800 Series should be a pretty massive performance boost even over current lineup !
 


Yea, the price is really what is keeping me from getting one, much less the 3 I want.
 


You won't be SLIing them. The only thing would be Folding.
 


I didn't intend to SLI them. I was going to replace the HD 5850 in my file server, the 5850 in one FX 8320 rig and the HD 4870 in the other FX 8320 rig. I would probably only fold on the file server one, though.
 
When a single-slot, low-profile version is produced, I will buy it (assuming no competition exists that needs to be considered) to replace an HD 7750. It is a tempting upgrade for the GTX 650 Ti I'm using in Omega now, but that would be penny wise and pound foolish, since I've got an HD 7970 on a shelf I can use.
 


http://www.xbitlabs.com/news/cpu/display/20131113225841_AMD_Cans_Plans_to_Introduce_Next_Gen_FX_Microprocessors_Next_Year.html
Well, not showing up on any 2014 roadmap at least says they canned it for this year. Even if the roadmap changes, it's tough to get it out this year given AMD's 20nm tape-out comment. They don't even have a 20nm SoC coming this year (Seattle is server-only and 28nm; with no modem in-house this makes sense, but it's still sad), and an SoC is much simpler to design and manufacture than a large CPU. So if they can't fund a 20nm SoC while funding 28nm Seattle, I don't hold out any hope for a 20nm FX CPU this year, or probably even next year, since they already have too much on their plate for their balance sheet.

This is why you have phase 3 drivers still coming, FreeSync that isn't a product, a CPU race that is over, an SoC race that isn't even hitting 20nm as everyone else gears up 20nm SoCs (and even then Seattle is server only), no modem so phones are miles away (everyone wants that IN HOUSE now), video card launches that are soft rather than hard and lacking a decent reference design (retail 290/290X cards could fall below rated speeds), Mantle still in beta after 2+ years of working with DICE, etc. Consoles created all the current R&D funding issues and diverted attention from their CORE products and customers. Bummer. AMD just keeps getting weaker. Even NV said they couldn't afford consoles without taking R&D from other areas, so how the heck can AMD afford it without everything else suffering, as we see from the list of problems? Laying off 30% of your engineers can't help with R&D either. All of it adds up to a "get ready for crap products going forward" situation for at least a while.

R&D funding gets you a card like the 750 Ti: 43% more transistors in a die only 25% bigger, and far less power hungry at the same time. Those numbers should only be possible with a die shrink, which truly makes you wonder what 20nm high-end will look like. I think we're in for a holy-crap moment, with no answer from AMD but another hot, high-watt, high-noise part at best, and we'll keep going further down if they don't get something that massively adds to their bottom line.
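For what it's worth, the commonly cited die figures line up with those percentages (these are assumed numbers, not figures from this thread): the GTX 650's GK107 is roughly 1.3B transistors on ~118 mm², and GM107 is roughly 1.87B on ~148 mm², both on TSMC 28nm. A quick sanity check:

```python
# Back-of-the-envelope check of the "43% more transistors, only 25% bigger" claim,
# using commonly cited die figures (assumptions, not numbers from this thread).
gk107_transistors, gk107_area_mm2 = 1.30e9, 118.0   # GTX 650's GK107, 28nm
gm107_transistors, gm107_area_mm2 = 1.87e9, 148.0   # GTX 750 Ti's GM107, 28nm

print(f"transistors: +{gm107_transistors / gk107_transistors - 1:.0%}")   # ~ +44%
print(f"die area:    +{gm107_area_mm2 / gk107_area_mm2 - 1:.0%}")         # ~ +25%

density_gain = (gm107_transistors / gm107_area_mm2) / (gk107_transistors / gk107_area_mm2) - 1
print(f"density on the same 28nm node: +{density_gain:.0%}")              # ~ +15%
```

If those figures are right, the density gain on the same node is about 15%, which backs up the point that this is layout and architecture work rather than a shrink.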

GF will take any profits from last Q and most (or all?) from this Q, as AMD owes them $200 million from Dec 31 and will likely get burned with another fine this year, since they opted NOT to produce console chips at GF and instead chose TSMC. Producing those at GF was really the only way to meet the new WSA contracts. So we have another fine coming unless something magical happens, with the only question being how big a fine they'll owe. Basically the first six months of this year's profits (assuming they make some next Q) are gone and wasted. ARM is coming for x86 (it already has 21% of ALL notebooks), Intel is now coming for the low end, and NV's R&D on core products looks about to bury AMD in GPUs (along with the ARM Denver chips coming after notebooks/desktops soon and Boulder coming for server, probably next year). I don't see how AMD makes money on anything but consoles, and they won't break $480 million for the year even if they repeat last Christmas quarter four quarters in a row (which only netted AMD ~$45 million anyway last Q).

http://www.extremetech.com/computing/174980-its-time-for-amd-to-take-a-page-from-intel-and-dump-steamroller/2
"During its Q3 2013 conference call, AMD CEO Rory Read noted that the company would begin taping out new 20nm designs “in the next couple of quarters.” That means no Excavator core until 2015 or so."

Design times don't lie, and are very difficult to speed up much. So, as I said: no new core until 2015, if ever again. If you're only taping out 20nm in the next few quarters, we're not likely to see anything from the FX line even in 2015. With funding for lots of projects not really being possible for AMD, their first 20nm chips would be SoCs, GPUs, and APUs, not an FX line (even Intel has backed away from the enthusiast CPU in favor of APU-style models). They have already given up on the CPU performance race with Intel in public statements, so the enthusiast FX line would be the last priority, if it isn't ZERO priority already. Unless you have data that says otherwise, everything we know says don't expect an FX chip any time soon. Roadmaps ALONE might not tell the full story, but the roadmaps plus statements from the company, the financial data, and what we know about production all say the roadmaps are telling the truth.

Worst info of the day?:
http://www.extremetech.com/computing/176919-amd-leak-confirms-that-excavator-apu-will-be-28nm-and-that-some-production-is-moving-back-to-globalfoundries
A 28nm Excavator APU. OUCH. So this won't be leaps and bounds above Kaveri, with no 20nm either (I'm guessing Intel may catch AMD's APU graphics this year or next as AMD delays a 20nm APU at least somewhat). The 65 W figure suggests possibly a better process, but it's still 2015, and as I said, the SoC/APU/GPU will come first for 20nm, not an FX line, and what we see here for 2015 is Carrizo. The Toronto server CPU is possibly 2015 as well, but that's server. There is no point in trying to compete with 14nm Broadwell using a 28nm Carrizo, right? I'd only think something is coming if their Q3 comments had said "we just taped out a 20nm FX last quarter and we'll have 20nm to compete with 14nm Broadwell," and even then it would likely be no different from the current Haswell vs. FX story.
 
Does this card have ZeroCore like AMD's cards? Does it turn off the fan when the monitor is off? That would be really important for me.
 
While I find the naming of this card a bit off and the price/performance not too great, the power ceiling and the size of this thing are astonishing.

Got my hands on one today and damn, a low-profile version of this card would be brilliant for older prebuilts and HTPCs, and it will work on pretty much any PSU.
Excellent card when that is considered.
 
There is a serious problem with the Litecoin hash rates for the Radeon cards given on page 17 ( http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-17.html ). For example, the Radeon R9 270X hashrate is 420-480 kH/s depending on the configuration ( https://litecoin.info/Mining_Hardware_Comparison has some real-world figures). You did not configure cgminer properly (or at all, apparently), because the Radeon cards in your comparison perform 25% to 50% better when properly configured.
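For reference, the kind of per-card tuning cgminer 3.7.x usually needed to reach those numbers looks roughly like the sketch below. The pool URL and credentials are placeholders, and the intensity/thread-concurrency values are just typical starting points for a 270X-class card, not the article's setup.

```python
import subprocess

# Minimal sketch, not the article's test configuration: typical scrypt tuning
# flags for cgminer 3.7.x on a 270X-class Radeon. Pool, credentials, and the
# exact values are placeholders; the right settings vary per card and driver.
cmd = [
    "cgminer",
    "--scrypt",                                   # Litecoin is scrypt, not SHA-256
    "-o", "stratum+tcp://pool.example.com:3333",  # placeholder pool URL
    "-u", "worker.1", "-p", "x",                  # placeholder credentials
    "-I", "13",                                   # GPU intensity; defaults are far lower
    "--thread-concurrency", "8192",               # sized to the GPU's shader count
    "-w", "256",                                  # worksize
    "-g", "1",                                    # GPU threads per card
]
subprocess.run(cmd, check=True)
```

Left at defaults (no intensity or thread-concurrency set), GCN cards land well below their potential, which would explain the gap in the chart.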
 
The Litecoin chart contains false info.
Proof: https://litecoin.info/Mining_hardware_comparison
The GTX 660 and 650 Ti hashrate values are too high:

660: the max real hashrate is around 153 kH/s, not the 209 specified in the article.
650 Ti: the max real hashrate is around 99 kH/s, not the 152 specified in the article.

On the other hand, Radeon 270/270X at 300 kH/s? For these cards, the max hashrate is around 450/470.
It's quite possible to get hashrates lower than the max values, but getting higher hashrates seems unrealistic to me.

For example, I own two HD 6870s.
The hardware comparison lists 330 and 355 as the max hashrates for this card.
But my best result is 330, in an overclocked state (990/940 against the 915/1050 base clocks).
 
Once again, it didn't take long for people to mention the price.

I'm going to be honest; this card has poor price/performance compared to the R7 265, and most of us should get that one instead. People on Tom's don't realise that price/performance isn't the be-all and end-all of how good something is. These cards are excellent for people who are just entering PC gaming and need something that will run with the crappy PSUs and cases that OEMs use.

This card is competitive because it targets a market that AMD doesn't have any footing in, because APUs' GPU performance is still relatively poor and the 7750 is barely a gaming GPU anymore. Also, the fact that a 60 W, GM107 (which will be Maxwell's weakest GPU), 28nm part can even come close to a mid-range R7 265 with roughly 2.5x the TDP of the 750 Ti is pretty damn good if I say so myself...
 
Actually, this card has an excellent price/performance ratio for those looking to upgrade an OEM machine. Anything better, and in some cases slower, would require the purchase of a new power supply as well, increasing the overall cost of the upgrade. Nvidia did this one right. They have a card that can lock in the OEM user base without having to fight a price war with AMD to do it. Hopefully this will light a fire under AMD and they can answer with something of their own.
 
Not bad performance for that power level... That said, I'd never buy it; not enough performance. And 60 watts is a lie when it's >60 watts. Sure, 62 is close to 60, but I'm not a fan of companies understating power requirements.
 


3.3% is negligible (62 W is about 3.3% over the 60 W rating), and TDP is an estimate anyway.
I wouldn't buy it either, because I don't like Nvidia. But it is a lot of performance for the power usage. If the other Maxwell cards are the same, it could change things significantly.
 
It's just amazing how much power you can have in a tiny card right now. I just want one for a NES-to-PC mod. I was thinking about an A10-7850K and an R7 equivalent, but I was concerned about real performance and immediate usability over all the features (HSA and sh!t). I'm looking at an i3 or an i5-4440, an H81 ITX board at minimum, 8 to 16GB of RAM, and this beauty of a video card. Surely I'm able to fit the motherboard, card, SSD, and heatsink in a NES; I just need the low-profile/non-dual-slot design. It would be a beautiful portable console/PC. I have to figure out if this is the most I can get out of a PicoPSU 160-XT.
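A rough back-of-the-envelope sum suggests the i5-4440 route is right at the edge of a picoPSU-160-XT while an i3 leaves real headroom. All the wattages below are ballpark TDP/board-power figures (assumptions, not measurements), and the 160 W rating is taken at face value.

```python
# Ballpark worst-case power sums for a picoPSU-160-XT build (assumed figures:
# Intel TDPs, ~60 W board power for the 750 Ti, ~25 W for board/RAM/SSD).
PSU_BUDGET_W = 160  # nominal picoPSU-160-XT rating

builds = {
    "i5-4440 + GTX 750 Ti": {"cpu": 84, "gpu": 60, "board/ram/ssd": 25},
    "i3 (54 W) + GTX 750 Ti": {"cpu": 54, "gpu": 60, "board/ram/ssd": 25},
}

for name, parts in builds.items():
    total = sum(parts.values())
    print(f"{name}: ~{total} W worst case, {PSU_BUDGET_W - total:+} W headroom")
# i5-4440 + GTX 750 Ti: ~169 W worst case, -9 W headroom
# i3 (54 W) + GTX 750 Ti: ~139 W worst case, +21 W headroom
```

In practice a game rarely pins the CPU and GPU at full TDP at the same moment, so the i5 might still squeak by, but it leaves no margin for the DC-DC board or the power brick; the i3 is the safer fit.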
 
Galaxy is releasing a LP form factor. Pretty excited, but I would like one from a more reputable brand. http://www.galaxytech.com/__EN_GB__/Product2/ProductDetail?proID=517&isStop=0&isPack=False&isPow=False
 