GeForce GTX 650 Ti Boost Review: A Budget-Oriented GK106-Based Boss


belardo

Splendid
Nov 23, 2008
3,540
2
22,795
Another stupid name for a product? Why not just call it the 655?
Why have "Ti" cards?
Why have "GTX" in the card names at all... are there actually both GTX 660 and GT 660 cards out there? The reason we had the GT / PRO / GTX suffixes was because the cards were all called "GeForce 2 xxx".

Waiting for the new "Nvidia GeForce GTX 769 Boost Ti Ultra" to come out... it's better than the "Nvidia GeForce GTX 769 Boost Ti Pro".
 
[citation][nom]belardo[/nom]Another stupid name for a product? Why not call it the 655?Why have "TI" cards?Why have "GTX" named into the cards... are there any GTX660 and GT660 cards out there? The reason we had GT / PRO / GTX was because they were called "Geforce 2 xxx"Waiting for the new "Nvidia Geforce GTX 769 Boost TI Ultra" to come out... its better than the "Nvidia Geforce GTX 769 Boost TI Pro"[/citation]

You didn't mention how Nvidia also adds the GPU core count to some model names, e.g. the GTX 560 Ti 448 Cores ;)
 
[citation][nom]tomfreak[/nom]I am not going to change my plan, GCN win the war in this whole generation. Except the 7870LE/XT +7900 series, the entire line up is more power efficient. Then the whole GCN line up is better price/performance + free AAA games even u dont like the games, u can always sell the keys to get ur card cheaper. There is simply no reason to opt Nvidia now. I have been using exclusively Nvidia cards from TNT2 all the way to Fermi. It looks like Kepler is going to be a miss for me.[/citation]

I wouldn't go that far. AMD has several power-efficiency losses, too. The Radeon 7770, the Radeon 7970, and the Radeon 7970 GHz Edition are all less efficient than their competition: the GTX 650 Ti (granted, that card is now better described as competing with the Radeon 7790, which is competitive in power efficiency), the GTX 670, and the GTX 680.
 

abitoms

Distinguished
Apr 15, 2010
81
0
18,630
Slightly OT.

"This could be specific to our X79-based platform (we've been seeing an increasing number of X79-oriented issues lately)"

Can you do an article on this, please? It looks like an issue that needs a real, hard look.
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
[citation][nom]blazorthon[/nom]I wouldn't go that far. Also, AMD has several power efficiency losses. The Radeon 7770, the Radeon 7970, and the Radeon 7970 GHz Edition are less efficient than the competition such as the GTX 650 Ti (granted that is now better said to compete with the Radeon 7790 which is competitive in power efficiency), GTX 670, and the GTX 680.[/citation]

Did you actually read my post? I said the 7870 XT and the 7900 series are less power efficient. The 7770 is priced against the GTX 650 non-Ti; that's the card it competes with.

The entire GCN lineup is more power efficient than Nvidia's (except the Tahiti chips, as I stated earlier).
 


My bad, I missed where you said 7900. Regardless, my point about the 7770 stands. Also, the GTX 650 competes with the Radeon 7750, not the Radeon 7770. Up until the Radeon 7790 launched, the Radeon 7770 competed with the GTX 650 Ti. The GTX 650 is also more efficient than the Radeon 7770 IIRC, and the same is true against the Radeon 7750. So, really, Pitcairn and Bonaire are AMD's only all-around efficiency wins.
 
[citation][nom]masmotors[/nom]good hope amd drops price on 7950 i want that i have the 7850 2 gb and love it i would get gtx 670 if i could sell my gpu but still too high[/citation]

Not likely. The 7950 900MHz OC and the 7950 Boost Edition are both currently priced properly.
 

ericjohn004

Honorable
Oct 26, 2012
651
0
11,010
No, I got the 7870 LE PowerColor Myst Edition Tahiti card. You know, the one EVERYONE gets. Not the XT BS. I got the same one Tom's Hardware recommends. It's not like I was disappointed. I mean, there's gotta be a reason a 7870 LE with a Tahiti is so cheap. And that reason is that it's hot, uses lots of power, it's loud, and it's made cheaply. However, I like the card's performance; I don't have to look at it or see it drawing power, but I do have to hear that thing. And the BioShock game that came with it, I planned on buying anyway.

Anyone who says AMD cards are a far better value and that they wouldn't even consider an Nvidia card is just kidding themselves. Clearly Tom's knows what they're talking about, so if you don't agree with them, take it up with them; don't just come in here backing AMD because you're in bed with them. Clearly from this review, a 650 Ti Boost is a better value than a 7850 1GB or 2GB, because it's only going to cost 150 bucks. AMD WILL have to lower their prices to compete. You should be happy, not bitching.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
One day, maybe they'll remember that NV doesn't care about OpenCL at all, because nobody tries to make money with this stuff. You run CUDA in Adobe CS6 (video or photo, etc.), Blender, 3ds Max, V-Ray, and so on. Pro apps use CUDA (with perhaps an option for OpenGL or OpenCL).

Start running CUDA vs. whatever AMD runs fastest with (OpenGL or OpenCL) when testing GPGPU in apps. Nobody buys an NV card to run OpenCL stuff. NV doesn't care about an open standard when they can already do the same things faster in CUDA or OpenGL in EVERY pro app that makes you money. Testing GPGPU without showing CUDA is pointless. A real pro-app user wouldn't act like seven years of CUDA groundwork doesn't exist. IT DOES exist, and it makes a HUGE difference in some things. E.g., your own PS CS5 tests shown here:
http://www.tomshardware.com/reviews/adobe-cs5-cuda-64-bit,2770-8.html
3 minutes vs. 12 minutes (six-core CPU)... WOW.
http://www.blenderguru.com/4-easy-ways-to-speed-up-cycles/
Only 12x faster from ticking a box for CUDA... 12x faster sucks, right? ;)

I would only turn on OpenCL if there were NO other app that did the same job with CUDA. But that situation doesn't exist, so... Which explains why NV pushes CUDA and OpenGL and basically ignores OpenCL. I mean, why would they want to foster AMD optimizations in apps? Helping OpenCL means AMD gets into the game as apps add support, and they don't want that. Not to mention CUDA in the same app would (should) never be beaten by OpenCL, since OpenCL is, so to speak, an abstraction layer above talking directly to the hardware with CUDA.
http://www.heatonresearch.com/node/2487
"If you want to use nVidia, ATI and Intel, then you will use OpenCL. Sounds like a slam-dunk to use OpenCL? Right? CUDA is much more advanced that OpenCL. In OpenCL you are programming your graphics kernels in C (actually C99, but still C). In CUDA, you are using C/C++. In OpenCL, you are dealing with a higher-level abstraction. If you are on an nVidia card, OpenCL is essentially compiling to CUDA. Because of these reasons, I decided to go directly with CUDA for Encog's GPU implementation. This limits me to nVidia cards, but that is something I am willing to accept. At least for now. I may add an OpenCL version at some point in the future, but that is not planned at this point."

That quote is an example of what many devs think, especially when NV funds help (as with all the big apps) to code in CUDA. Besides this, why do I care how fast I can help someone solve cancer, etc.? Bitcoin mining is over for all but large botnets now that the easy blocks are gone.
https://community.rapid7.com/community/infosec/blog/2012/12/06/skynet-a-tor-powered-botnet-straight-from-reddit
Just an example... Oh look, it uses OpenCL... LOL. Well, a botnet would have to be cross-hardware compatible, right? :)

So what is it with testing open-source JUNK? I get that a lot of it is free and works on a lot of hardware, but does anyone use this stuff to make money? Based on the 163,000 registrations on the folding@home site, folding@home is pointless to most people, and so is bitmining. There are 352 MILLION computers sold each year and you're benchmarking for 163,000 people (and why the heck would I want to run up my electric bill for that? Will they pay me when it's solved?). How about some meaningful REAL MONEY-MAKING apps start getting tested to show off GPGPU? You can use LuxRender plugins for things like 3ds Max, Blender 2.6, Cinema 4D, etc., but again, as soon as you do that I turn on CUDA. OpenCL should always be a last resort on NV, used only when there's no CUDA way to do the same thing. You guys are always forcing NV cards into the worst GPGPU situation they could be in, which nobody would actually do, then acting like that's normal instead of ridiculous :( This amounts to this lame excuse:
"Previously, when the GPGPU universe was divided into CUDA (Nvidia) and Stream (AMD), we faced the problem that most applications supported only one of the two environments, and could thus not be directly compared to each other."
http://www.tomshardware.com/reviews/graphics-card-benchmarks-charts-review,3154-8.html
It's AMD's tough luck that most apps skip them because they haven't invested for seven years like NV has, right? If that's the excuse you'll give, take the best app from each side and compare them rendering the same scene. Then again, everyone uses Adobe CS, so it's hard to imagine not using it as a benchmark while instead showing a bunch of stuff nobody uses in real life. Odd... If Stream has fallen so far behind CUDA that it's useless, that isn't NV's problem, nor should you act as though the situation doesn't exist. 500 universities in 26 countries teach CUDA. How many teach Stream or OpenCL? You can use OpenCL in Adobe, but anyone with NV hardware will immediately turn on CUDA. The number of people using Adobe CS vs. all of what you've shown combined is ridiculous. They sell $4 billion worth of the stuff yearly. Surely you can do better than this platform-agnostic stuff. :)
 

EzioAs

Distinguished


2-Way. According to Nvidia's website and pictures of the card ;)
 


OpenCL is rapidly replacing CUDA as the go-to language for GPU compute, and Direct3D replaced OpenGL almost completely several years ago. You're so far out of the loop that it's not even funny. Also worth mentioning: DirectCompute tends to beat both OpenCL and CUDA while, like OpenCL, not being limited to Nvidia cards.

It's also worth mentioning that Adobe is one of the biggest companies ditching sole CUDA support and switching to OpenCL/DirectCompute.

Tom's tests a lot of compute workloads that many people use. It's generally Anand who tests a lot of unimportant things, such as years-outdated application suites.
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
[citation][nom]blazorthon[/nom]My bad, I missed where you said 7900. Regardless, my point about the 7770 stands. Also, the GTX 650 competes with the Radeon 7750, not the Radeon 7770. Up until the Radeon 7790 launched, the Radeon 7770 competed with the GTX 650 Ti. The GTX 650 too is more efficient than the Radeon 7770 IIRC anyway and the same is true compared to the Radeon 7750. So, really, Pitcairn and Bonair are AMD's only all-around efficiency wins.[/citation]

Your stance on the 7770 vs. the 650 Ti is incorrect. The 7770 is priced against the 650; you cannot buy a 650 Ti for the price of a 7770. The 650 Ti is a 110W GPU, about 40% more power-hungry than the 7770, but it is not 40% faster, so the 7770 is more efficient.

http://www.anandtech.com/show/6838/nvidia-geforce-gtx-650-ti-boost-review- check it yourself: the 650 non-Ti is priced against the 7770.
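
The efficiency claim here is just arithmetic on performance per watt. A back-of-the-envelope sketch, taking the thread's "~110W vs. ~80W, so ~40% more power" figures as given; the 25% performance delta below is a hypothetical placeholder, not a benchmark result:

[code]
# Back-of-the-envelope performance-per-watt comparison using this
# thread's rough claims: 650 Ti at ~110 W vs 7770 at ~80 W (~40%
# more power), with the 650 Ti assumed less than 40% faster.

def perf_per_watt(relative_perf, power_w):
    """Efficiency as relative performance divided by watts drawn."""
    return relative_perf / power_w

hd7770   = perf_per_watt(1.00, 80)   # baseline card
gtx650ti = perf_per_watt(1.25, 110)  # 25% faster (assumed), 110 W (claimed)

print(f"7770:   {hd7770:.4f} perf/W")
print(f"650 Ti: {gtx650ti:.4f} perf/W")
# 0.0125 vs ~0.0114 perf/W: a card drawing 40% more power must also be
# at least 40% faster just to match the baseline's efficiency.
[/code]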
 
A card that consumes just a hair under 200W at load and produces sweltering 65-70°C thermals is a perversion of the budget market, where low-cost, low-power, silent parts are needed. This attempt at sub-$200 is a little like bringing a shotgun to a paintball party, and the results, in all honesty, are not that impressive. Consider the sub-$200 market as the kind of builder who doesn't want to spend more than $450-500 on a system while maximising performance. Not only does this part need a bigger GPU die than the competitor it was intended to joust, namely the 7790, it also consumes around 60% more power and produces double the thermal envelope for around 12% more performance. I am sure a 350W PSU would suffice, but running the PSU and GPU that hard produces heat and noise that is unbearable; with the 7790 you can get away with a 350W unit very comfortably, even in an SFF build, while the 650 Ti Boost may require a 450W unit just to play it safe, and that costs more too (see the sketch below for the rough sizing arithmetic).

Of the five gaming benchmarks run, it beat the 7850 in FPS in two of the five; in frame times it beat the 7850 in one of five, and the 7790 in one of five. This part's superior clock rates and memory speeds are about the only things driving it into a semi-acceptable position, but ultimately this is Nvidia selling off its unsold 660-family backlog of silicon, basically re-harvesting. Overall, the 7850 1GB at $179 remains the champion of this division and will do so as long as stocks last.
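
On the PSU-sizing point, a minimal sketch of the usual headroom arithmetic; every wattage below is a hypothetical placeholder, not a measured figure:

[code]
# Hypothetical PSU headroom estimate for a budget build. All component
# draws are illustrative placeholders, not measured figures.

def recommended_psu_watts(components, headroom=0.30):
    """Sum worst-case component draw, then add headroom so the PSU
    never runs near its limit (where efficiency drops and fans get loud)."""
    return sum(components.values()) * (1 + headroom)

base = {"cpu": 95, "board_ram_disks": 50}  # placeholder draws

build_7790 = dict(base, gpu=85)          # ~85 W-class card (assumed)
build_650ti_boost = dict(base, gpu=135)  # ~50 W hungrier card (assumed)

print(f"7790 build:         ~{recommended_psu_watts(build_7790):.0f} W PSU")
print(f"650 Ti Boost build: ~{recommended_psu_watts(build_650ti_boost):.0f} W PSU")
# ~300 W vs ~364 W with these inputs -- the gap is why a 7790 pairs
# comfortably with a 350 W unit while the hungrier card wants margin.
[/code]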

 

Mathos

Distinguished
Jun 17, 2007
584
0
18,980
Oh, for god's sake... Who buys the 1GB 7850 over the 2GB model? Really, if you're going to compare the 650 Ti Boost 2GB to the 7850, at least use the model with the same amount of RAM.
 


You're still not even paying attention to what I said. Furthermore, the GTX 650 Ti doesn't consume anywhere near 110W. Not even the Radeon 7850 usually consumes that much power, and the 7850 consumes a lot more than the 650 Ti. The GTX 650 Ti consumes a similar amount of power to the Radeon 7770 and is, in fact, more power efficient than it.


http://www.tomshardware.com/reviews/geforce-gtx-650-ti-benchmark-gk106,3318-18.html

You also ignore two major facts. One, the Radeon 7770 and Radeon 7750 currently sit in much the same price range as the GTX 650, thanks to the Radeon 7770's recent price drop (there was no corresponding drop on the Radeon 7750 or the GTX 650). Two, the GTX 650 Ti will get a price drop because of the GTX 650 Ti Boost, and until the recent drop on the 7770, it was only marginally more expensive than the Radeon 7770, which was its only competition. Pulling up AnandTech links instead of links to actual average pricing, such as PCPartPicker's, is just wasting time with irrelevant links that don't prove your point at all.

http://pcpartpicker.com/parts/video-card/#c=115,118,80,79,109&sort=a5&qq=1
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
The price drop on the 650 Ti is not going to get anywhere near 7770 range; it would cannibalize 650 sales. You are banking on a future price drop and hoping the 650 Ti ends up priced well enough to compete with the 7770.

Nvidia did not position the 650 Ti against the 7770. It was slotted into the gap between the 7770 and the 7850, which the current 7790 now fills, though the 7790 is slightly more expensive. The 650 Ti competes with no one, as the table shows.
 


Yeah, I think you're right. PowerColor is not great either.
 

pettudor

Honorable
Jan 24, 2013
3
0
10,510
If the 650 Ti Boost is just a slower GTX 660, why isn't it called the GTX 660 LE? And I don't think this card is a threat to the 7790, because of the power gap between them. 50W is actually a lot if you take into consideration the 33W the Wii U consumes :) . If you're on a budget, I wouldn't spend money on a new power supply, and lastly, I certainly don't trust Nvidia when it comes to pre-launch pricing (too good to be true).
 

brimur

Distinguished
Jan 21, 2011
21
0
18,510
It used to be the case that a newer, more powerful graphics card came out every six months. I like that this is no longer the case and that graphics card makers have realised they need to stick with a card and bring out variants of it to suit customers' needs. I bought a GTX 680 a year ago, and it's nice to know it's still a relevant card and will be for some time.
 


The reported price drops put the 650 Ti right alongside the 7770, so you're speculating more than I am.

The 650 Ti's competition was the 7770 until the 7790 came out, because the 7770 was AMD's closest card to it in performance. It's still fairly close, even though the 7790 has taken its place as the 650 Ti's competitor. Heck, Tom's chart shows the improvement from the 7770 to the 650 Ti to be only around 10% or so on average in these tests, and even if we look at lower-quality tests where the difference is greater, it's still not enough on average for the 650 Ti to have competed with anything other than the 7770 before the 7790 arrived.
 


The 7770 was not direct competition for the 650 Ti... the 650 Ti was placed right in the middle between the 7770 and the 7850, in both price and performance. If the 650 Ti comes down to 7770 prices, then the 7770 becomes overpriced. The direct competitor for the 650 Ti from AMD was meant to be the 7790.

At a slightly lower price it offered better performance, and it was meant to be the 650 Ti killer.

The 650 Ti Boost is clearly aimed straight at the 7850, and by extension the 7790. Priced about the same as the 7790, and performing about as well as a 7850, it's a formidable challenge to both cards, which will require AMD to cut a number of their card prices to remain competitive.
 