GeForce GTX 750 Ti Review: Maxwell Adds Performance Using Less Power



Definitely! Although nVidia has no incentive to do so; AMD has nothing that competes on performance per watt in its class.

 

Sorry to get you all so riled up. My point is that, for most buyers, this card really means an AIB OC version, which now runs more like $160-170. That's pricey for something that can only deliver medium settings with no AA in most titles at 1080p, only slightly bettering an R7 260X. It's a really nice entry card for a novice who buys a stock boxed i5: a little short on grunt to be mainstream, but plug-and-play. With 1920x1080 now the norm for most displays, anteing up that much cash should buy at least some AA more often than not. If you're building your own low-priced gaming machine, a good 400+W PSU is already on your list, and then this comes in a little short for such a build; the GTX 660 is a better buy.
 

I think this is a pretty big deal, because NVidia was notably lacking in decent low-power cards before.
 


This power envelope was dominated by AMD for some time.

There's always some back and forth in this segment.
 
Nvidia and AMD have both held the performance-per-watt crown at different times, and Nvidia seems to be moving in that direction again with Maxwell. Nvidia did this one right. The OEM crowd gets a decent-performing GPU that won't require a PSU upgrade, and they don't have to worry about card length issues either. As soon as they can make some low-profile versions, they will dominate the OEM bracket. AMD needs an answer if they care at all about OEM PC owner market share.
 
AMD's 7750 was by far the top low profile/no aux power video card for something like 18 months (an eternity in GPU markets) and nVidia wasn't even close. This card is a 40-50% performance improvement over the 7750, completely trumping it. All that without a node shrink is very impressive. I can't wait to see what they can do once their process does shrink this fall.

If anyone hasn't noticed, there's a pretty big push towards small form factor and quiet computing. Low wattage GPUs are crucial to making that segment true gaming PCs and not just HTPCs.
 
If/when a single-slot, low-profile version of this card becomes available, I'll be looking to buy it for my mini-ITX box. Until then, the HD7750 it already has remains the most powerful card that will fit.
Considering the plethora of games available on Steam that aren't all that demanding, that lowly HD7750 is sufficient for great settings even at 1920x1080. The most demanding game I play is Guild Wars 2, and it looks quite nice on that card; the GTX750Ti would just make it a lot smoother, and maybe let me increase view distance.
 
I would still like three 750 Tis: one to replace the HD 5850 in my file server, one for the 5850 in my FX 8320 @ 4.0GHz rig, and one for my other FX 8320 rig that is currently sitting with a GTS 450 I'm testing out for a friend. Normally that rig has an HD 4870.
 
I really think AMD botched the sub-$100 market with the R7 250. The 250 gives you inferior performance compared to the 7750, but you don't get a price drop. Yes, the 750 Ti is now the performance king, but you're also paying $160 for it. I think there's still a market for people that want a decent mainstream gaming GPU for $100 or less that doesn't require a power cable. I mean, they could release a 255 that's a rebranded 7750, but that makes things confusing with current 250 pricing. And who would want a 250 when you can get a 255 for $5-$10 more?
 
Whoever set up the mining benchmarks for this article doesn't have a clue what they're doing... I get 450kH/s out of my R9 270X at stock clocks. Learn how to configure CGMiner, thanks.
 


I've read a lot of people claiming 450kH/s from a card with 1,280 shaders, but I have yet to see proof. Until then, it's BS, like a lot of the entries on the Litecoin wiki page: no validation, and for some reason people want to lie about it.

It's a 300-350kH/s card with matching WUs, or close to it (~0.9). Getting 450kH/s with only ~300 WUs means the displayed hashrate is junk.

I can see people getting 400+kH/s with the 7870 XT and its Tahiti LE chip, but not with Pitcairn. Needless to say, every card is different, from the components used to its firmware. A simple Google search will show that the average hashrate for Pitcairn GPUs with 1,280 shaders is right around 300kH/s.
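For what it's worth, the WU point above gives a quick way to sanity-check claimed hashrates: the work utility (WU/min) cgminer reports should land near ~0.9x the kH/s figure, so a big gap suggests the displayed number is inflated. A minimal sketch of that check, using only the figures quoted in this thread rather than measurements of any particular card:

```python
# Rough sanity check for scrypt mining claims: work utility (WU/min) should
# track the claimed hashrate at roughly a 0.9 ratio (per the thread above).
# A large shortfall suggests the displayed kH/s is inflated by rejects/errors.

def hashrate_is_plausible(khps, wu_per_min, expected_ratio=0.9, tolerance=0.15):
    """Return True if reported WU roughly matches the claimed hashrate."""
    expected_wu = khps * expected_ratio
    return abs(wu_per_min - expected_wu) <= expected_wu * tolerance

# Illustrative numbers from the discussion, not benchmarks:
print(hashrate_is_plausible(300, 270))  # True  - 300 kH/s with ~270 WU checks out
print(hashrate_is_plausible(450, 300))  # False - 450 kH/s with only ~300 WU looks inflated
```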
 


I am assuming you are not on Aurora Glade? :)


 
So, a question I've been wanting to ask for ages: why are they deliberately hobbling the cards when it comes to memory transfer? My GTX260 had 448mb's and now my GTX670 Ti OC has 128mb's. Isn't it true that the games industry hit the time/profit ceiling about 5-6 years ago, but the card companies want you to think you need better cards to run what you can already run on Ultra with a 260? Not to mention the whole SLI scam: two cards at half the PCI-e multiplier? And they have to communicate, which slows performance. Something I've tested. Why can't I get an affordable card like a GTX670 Ti with 448mb's of memory transfer? It's not like it can't be done. I only run one screen, though. I understand that SLI and Crossfire, along with top-priced cards, will run three screens for that surround look, but who the hell really needs it? I still play brand new games on Ultra with my shitty GTX670 Ti OC, and the frame rate is perfectly fine. I guess it's what you don't know, rather than what you know. Like the $400 HDMI cables that came out when Blu-ray players did, haha.
 
They never made a 670Ti...

There have been substantial improvements. Look at the fact that a 6990 is about equal to an R9 280, for example.

Also, there's been some testing on the PCIe thing. There's no bottleneck from the interface until you get down to about PCIe 2.0 x4, from memory.
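For reference, the raw link numbers line up with that: PCIe throughput per lane per direction is roughly 250MB/s for 1.x, 500MB/s for 2.0, and ~985MB/s for 3.0, so even a 2.0 x4 slot still offers about 2GB/s each way. A rough back-of-the-envelope sketch (the point where bottlenecks appear comes from the testing mentioned above, not from this snippet):

```python
# Approximate usable PCIe bandwidth per lane, per direction (after encoding overhead).
MB_PER_LANE = {"1.x": 250, "2.0": 500, "3.0": 985}

def pcie_bandwidth_gb_s(gen, lanes):
    """Rough one-direction link bandwidth in GB/s."""
    return MB_PER_LANE[gen] * lanes / 1000

print(pcie_bandwidth_gb_s("2.0", 4))   # ~2.0 GB/s - around where bottlenecks start to show
print(pcie_bandwidth_gb_s("2.0", 16))  # ~8.0 GB/s
print(pcie_bandwidth_gb_s("3.0", 16))  # ~15.8 GB/s
```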
 


Thanks for clearing that up; you're right about my card, it's a 660 Ti OC. I never paid much notice, considering I never noticed any performance increase either. I only bought it because my father-in-law wanted to buy me something for Christmas, so I just lumped it in with an Asus Phoebus. You can notice the difference with that, though.
What's an R9 280? I could look it up, but I'm about to go out. Is it a GeForce or an ATI? Or is there a new kid on the block?
About the memory transfer, I still don't see why they can't just leave them at higher bandwidths. It's not like it costs them anything; the whole card is probably only $5 to have made in China.
And the games-ceiling comment still stands. I haven't come across a game I can't play on Ultra at 1920 x 1080.
Anyway, all the best. Since Win 8 came out I've lost all interest in PCs anyway; I've switched to consoles and Linux.
 
R9 280 is a rebranded 7950.

I'm not sure about your units - they completely don't make sense.

Lots of people play at higher resolutions, or want framerates above 60Hz.

I'll have to have a look, but I think it's a LOT more than that. Even just the metal in the card is probably worth more than $5 at the moment. Semiconductor fabs are multi-billion dollar facilities, and these GPUs are at the bleeding edge of manufacturing techniques.

The 260 had a 448-bit interface because it used GDDR3 and was a fairly high-end card. It had a bandwidth of 111.9GB/s; the GTX760 is 192.2GB/s. The naming scheme has also changed somewhat: the 260 was the second-fastest nVidia card of its generation, but the 760 is #4 (not counting the Titan/Titan Black), and there may be more above it.
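For anyone wondering where those bandwidth figures come from, it's just the bus width (in bytes) multiplied by the effective memory data rate. A quick sketch using the reference clocks (1998MT/s GDDR3 on the 260, 6008MT/s GDDR5 on the 760; treat these as stock values, since partner cards vary):

```python
# Memory bandwidth = (bus width in bytes) x (effective data rate in GT/s).
def mem_bandwidth_gb_s(bus_bits, effective_mt_s):
    return (bus_bits / 8) * effective_mt_s / 1000

# GDDR3 is double data rate; GDDR5 is effectively quad-pumped.
print(mem_bandwidth_gb_s(448, 1998))  # GTX 260: ~111.9 GB/s on a 448-bit bus
print(mem_bandwidth_gb_s(256, 6008))  # GTX 760: ~192.2 GB/s on a narrower 256-bit bus
```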
 
About the memory transfer, I still don't see why they can't just leave them at higher bandwidths. It's not like it costs them anything; the whole card is probably only $5 to have made in China.
NVidia loves deliberately gimping their products in various ways. No idea why.
And the games-ceiling comment still stands. I haven't come across a game I can't play on Ultra at 1920 x 1080.
Anyway, all the best. Since Win 8 came out I've lost all interest in PCs anyway; I've switched to consoles and Linux.

No one says you have to use 8.
 


You don't need to check my units, hehe, I was being facetious. Have a great day buddy.
 




It's a 100% fanless, 0dB, passively cooled DAW that's just for MIDI sequencing and recording. It doesn't need a powerful CPU, just a cool, quiet, and energy-efficient one. For everyday use I'll probably have it clocked more reasonably, between 1.2GHz and 1.6GHz.

The CPU-Z shot was just for testing its threshold limits, and for the banner. I actually wired my PC case's reset button to the BIOS jumpers because I got sick of resetting it while trying to figure out the sweet spot between POST and hang-up with the underclock; the same hassles as overclocking apply.
 
Even underclocked, it would still only bottleneck things circumstantially to some extent anyway, and I'm clearly aiming more for low heat and low-wattage efficiency.

Honestly, I'll probably wait on a die shrink and/or a fanless solution, but this generation of GPUs is one to look out for. It actually appears compelling enough to consider the upgrade from an 8800GT or GTX 260: even though the performance isn't really much different, the efficiency and heat output are night and day.
 
1080p is on the verge of being replaced by 4K; in probably 3 to 4 years it will start to become much more standard. That, of course, will change GPU requirements a fair amount. For the record, I do agree with you about memory bus widths being deliberately crippled by both Nvidia and AMD.

I'm not sure whether power requirements play a role in that or not, but back around when the 8800GT launched I was expecting high-end GPUs to transition toward 1024-bit memory buses; instead they trended back in the other direction entirely, gradually getting narrower each generation.
 
No, but Tom's Hardware keeps not-so-gently prodding XP users to engage the kill switch at Microsoft's mercy. Not that it's doing much good; they still haven't developed a proper replacement for it.

The best they've come up with is Windows 7, which is far from perfect and really, at best, what I'd deem a middle-of-the-road compromise.
 


I SERIOUSLY disagree. What is wrong with Win7 compared to XP?

 
I keep an XP machine around for very old software, but 7 is quite good. Even Vista is tolerable with decent hardware.

Really no reason to keep a main machine on XP at this point. It is over 12 years old.
About like using Win 95 in 2007!
 