GeForce GTX 750 Ti Benchmarked: Slower Than GTX 660?



This is the first leak, and probably with beta drivers. You don't know whether these numbers even represent what will ship.

AMD's reference cards shipped running 740 MHz instead of 1 GHz, and it took a few months to fix that, with AIB partners building better coolers onto non-reference boards, right? So even if these numbers are real, it would still be better than where AMD is at this point with Hawaii. I think the numbers can change, but it doesn't matter if they deliver this with far less heat and fewer watts. The first round is just the teaser anyway: they'll get near or match the performance of older cards at low power. Then 20nm arrives and we get the real deal, where they deliver that power at normal TDPs, which will amp up the performance of course. Then again, until we actually have shipping parts, nobody knows anything, as Hawaii proved.
 


yeah, more performance at the same power is even better!
 


Well, he did say 'runs', which could mean anything.
 
I have a feeling that Maxwell is going to be a bunch of same performance GPUs with a much much lower power consumption.
I don't think this is going to be true. If the GTX 750 Ti actually uses a GM107 and achieves performance similar to GK106-based cards, I think that would be quite impressive, and it should give you some indication of the performance potential of higher-end Maxwell GPUs. Just compare the specs in Nvidia's current and previous-generation lineups. As a brief example, GK106 has 2.5x the shader cores and 1.5x the ROPs of GK107.

... let your imagination run wild.

This would also fall in line with what you would expect to see from a generational performance jump at comparable positions in Nvidia's GPU lineup. If these benchmarks are accurate, GM107 will offer around a 2x performance jump over GK107-based cards; a rough sketch of that arithmetic is below.
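As a rough sanity check on that extrapolation, here's a back-of-the-envelope sketch. The Kepler figures are the published full-chip specs; the GM107 numbers are only the rumored ones from this leak, so treat them as assumptions rather than confirmed specs.

```python
# Back-of-the-envelope scaling estimate. Kepler specs are the published
# full-chip configurations; the GM107 figures are rumored and unconfirmed.
specs = {
    "GK107": {"shaders": 384, "rops": 16},   # e.g. GTX 650
    "GK106": {"shaders": 960, "rops": 24},   # e.g. GTX 660
    "GM107": {"shaders": 640, "rops": 16},   # rumored 750 Ti config (assumption)
}

def ratio(chip_a, chip_b, key):
    """How many times more of a given resource chip_a has versus chip_b."""
    return specs[chip_a][key] / specs[chip_b][key]

print(f"GK106 vs GK107 shaders: {ratio('GK106', 'GK107', 'shaders'):.1f}x")  # 2.5x
print(f"GK106 vs GK107 ROPs:    {ratio('GK106', 'GK107', 'rops'):.1f}x")     # 1.5x

# If the leaked benchmarks hold, GM107 lands near GK106-level performance,
# i.e. roughly a 2x jump over GK107 despite far fewer raw shaders --
# which is the per-shader efficiency gain being claimed for Maxwell.
leaked_gm107_vs_gk107_perf = 2.0  # assumption taken from the leak
print(f"Implied GM107 per-shader gain vs GK107: "
      f"{leaked_gm107_vs_gk107_perf / ratio('GM107', 'GK107', 'shaders'):.2f}x")
```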
 


I love how in almost every thread there's some kid who claims that their 3-4-year-old midrange card can still handle everything they throw at it. Really?? What games are you playing? At what settings? At what resolution? It's just really funny, in an oblivious self-therapy sort of way. More than anything else, it just sounds like they're trying to reassure themselves that their older hardware is still up to date and performance competitive.
 
If it's anything like what both AMD and Nvidia have done for previous generations, it will more than likely be a last-generation card rebranded as a new one. That way, unknowing customers will replace last year's model with a new one that is pretty much the exact same thing; profit for AMD and Nvidia.
 
Competitive? No. I can't run everything I play on max settings anymore, and I have no idea how demanding Mechwarrior Online and the other games I play are. However, I'm still enjoying 40+ fps at 1920x1200 (I think that's it, it's just a smidgen higher than 1080p) with settings that I can barely differentiate from max settings. Perhaps I just need to play more demanding games to see my GPU's deficiencies, but that's not going to be for a while because I'm in the middle of playing through a bunch of PS2 titles on an emulator, and my GPU is more than enough to keep those slammed to the 60fps limit I set in the options while rendering at 3 times the original resolution.

I've been watching the charts at Tom's Hardware, and up until very recently it would have been over my price limit to move more than one tier in an upgrade.
And don't assume everyone who finds their hardware sufficient for their current needs is a child. It makes you look ignorant of how the responsible population of the world lives and how they value the hardware they have.
 
Same performance for less power is great, but it makes it hard to justify buying a new card. Sell me more performance at the same price, no matter how you get it, and then I'll be interested. I have a GTX 460 that runs everything I throw at it... give me a real reason to upgrade!

If a GTX 760 that is approximately twice as fast isn't a reason to upgrade, I don't know what's going to sway you. You may just be playing games that don't need much.

I have a 460 still too, overclocked to 850 MHz, and I see plenty of room for improvement.
 
I agree, daekar. Some gamers like to get the latest and greatest titles, "beat" them quickly (while enjoying the highest possible settings), and then move on. Others, such as myself (and daekar you may be another) find enjoyable games with extremely high replay value (e.g. Guild Wars for me) and play them for years. For these latter gamers, constant upgrades simply aren't needed.
 
Oxide, the performance is admirable, but a quick check on Newegg reveals those cards are all $250-$300+, which is way above my price range. I have never paid more than $200 for a GPU and will not until the dollar is inflated to considerably lower than its current value. $300+ could buy some very nice car parts, a .22 rifle for my wife, or a not-insignificant amount of materials to maintain or improve our house and property. The value proposition, for me, just isn't there.
 
The non-Ti 660 is a 192-bit part. I'm sure they will release some overclocked or more fully unlocked version not too long down the road to fix this, kind of like the 650 / 650 Ti / 650 Ti Boost.
They're both 192-bit.

I sure hope they don't end up having to release a Boost edition. Whenever you see an inelegant naming convention like that, you know the resulting product exists out of necessity to fill a performance gap and wasn't part of the original plans. Hopefully they get this 'Ti' right the first time.
 


That would make sense. I'm not sure what the basis for some people's expectations is, but for some reason a lot of people were expecting the 750 Ti to perform close to the 660 Ti, or in other words similar to a 760... which wouldn't make much sense. Although I suppose that in itself wouldn't completely rule out the possibility of a senseless product lineup; I mean, just look at the 290 series.

In addition, I think something a lot of people aren't factoring into their performance expectations is the GPU itself, rumored to be a GM107; in other words, the lowest-end dGPU of the Maxwell architecture.
 


Give you a real reason to upgrade? Well, if a 750 Ti at a sub-$200 price point (which is likely to be its MSRP) isn't enough of a reason to upgrade, then it sounds like you just don't want to upgrade. Which is perfectly fine if you don't need the additional performance, but I'm just saying it doesn't sound like it has anything to do with the performance rationale you gave in your original comment. A 750 Ti will perform substantially better than a 460 and will likely settle into a very competitive price point, much like the 460 did.
 


Agreed. Turning stuff off doesn't count as really playing it, IMHO. Most of my games won't run at 1920x1200 on my Radeon 5850 (many can't even manage 1680x1050 on my smaller 22-inch, versus the Dell 2407 24-inch I also have, which I guess is why I have two monitors and probably always will). Upgrading to Maxwell will merely raise my resolution on both monitors (replacing the 22-inch with a 27-inch 1440p, or hopefully 1600p, at some point), not eliminate the need for the smaller native res.

I half don't believe 20nm will bring a SINGLE-GPU card that satisfies all games at 1440p, and certainly not for long as engines progressively ratchet up their graphics. Maxed out, you'll probably be dipping below 30 fps on more than one occasion. Can you still have fun turning stuff off? Of course. Would I want to? Heck no 😉 I'll buy a better card to get there. I'm hoping Maxwell gets me there, but I won't be holding my breath until 14nm for this story to come true, IMHO. And as engines get more graphically intense, even that may no longer hold. I'm guessing second-generation Unreal Engine 4 games, Star Citizen 2 (maybe even the first?), etc. will tax most single-GPU cards at 1440p.

I LOL every time I see someone saying AMD's APUs can play at 1080p. With how much crap turned off to do it? You're not going to be playing at 1080p and seeing the game as the developer intended, that's for sure.
 
I see Nvidia and Intel both focusing on these low-power components for desktop PCs. Am I the only one who doesn't care if my power bill goes up $2-3 a month because of my PC? I would rather buy a PC built around actual FPS and performance than around power consumption.
 
Barantos, many more PCs are sold into corporate markets than are bought by guys like us, and quantities may be measured in hundreds or thousands of units. On that scale, power saving is a big deal, especially since corporate power rates may be higher than residential.
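To put rough numbers on that, here's a quick fleet-level estimate; every figure below is an illustrative assumption, not measured data.

```python
# Illustrative fleet-level power-cost estimate; every number here is an
# assumption chosen for the example, not measured data.
watts_saved_per_pc = 30        # assumed average savings per machine
hours_per_day      = 10        # assumed office duty cycle
work_days_per_year = 250
fleet_size         = 1000      # "hundreds or thousands of units"
rate_per_kwh       = 0.12      # assumed commercial electricity rate, USD

kwh_saved_per_year = (watts_saved_per_pc / 1000) * hours_per_day \
                     * work_days_per_year * fleet_size
annual_savings = kwh_saved_per_year * rate_per_kwh

print(f"{kwh_saved_per_year:,.0f} kWh saved per year")   # 75,000 kWh
print(f"${annual_savings:,.0f} per year for the fleet")  # ~$9,000
```

A couple of dollars a month per box looks trivial, but multiplied across a fleet it adds up to real money, before even counting the reduced cooling load.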
 


That's a really strange way of looking at it.

Efficiency improvements translate to more performance at the same TDP, or the same performance at a lower TDP, and they provide more headroom to scale performance at the high end.
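Put another way, for a TDP-limited part, performance is roughly perf-per-watt times the power budget. A minimal sketch with made-up numbers:

```python
# Toy model of a TDP-limited GPU: performance ~ efficiency * power budget.
# The 1.5x efficiency gain and the wattages are made-up illustrative values.
baseline_perf_per_watt = 1.0   # normalized
efficiency_gain        = 1.5   # assumed architectural improvement
new_perf_per_watt      = baseline_perf_per_watt * efficiency_gain

tdp = 150  # watts, same power budget as the old part
old_perf = baseline_perf_per_watt * tdp
new_perf = new_perf_per_watt * tdp

print("Same TDP, performance gain:", new_perf / old_perf)          # 1.5x
print("Same performance, power needed:", old_perf / new_perf_per_watt, "W")  # 100 W
```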

That being said, I think the situation is quite a bit different in the CPU and dGPU markets. The effects of Intel's focus on efficiency have been far more apparent at lower TDPs and in high-core-count server/workstation environments, but they haven't really translated into big performance gains for desktop enthusiasts for a while now. This is because scaling threads is a more effective way of scaling performance in servers/workstations than relying on clock or IPC scaling.

But the effects of efficiency improvements in GPU architectures have without a doubt benefited PC enthusiasts, where these improvements have translated into significant performance scaling across pretty much every market. GPUs are often TDP-constrained, even in desktops, so efficiency improvements are a great way to scale performance, especially when you can't rely on consistent advances in the fabrication process.

... so what's the problem?
 