Please lecture me!

benzi118

Hey guys... hope you are all well. Thanks for dropping by.

I am looking to upgrade my little GTX 650 Ti graphics card. I don't think it's ready for the latest games, let alone the upcoming ones.
I was wondering if you guys would like to share your computing knowledge, like how to pick a decent graphics card.

I was looking at a Sapphire R9 270X 4GB OC on the internet, which is 60% off at the moment.
But how do you guys tell a good graphics card from a bad one? I assume it's not as simple as 'the bigger the memory, the better'. Should I avoid any cards that have DirectCU II?

Thank you for your support in advance!
 
OK, so DirectCU II is simply ASUS's cooling solution for the graphics card (and it's a very good one, mind).

Bigger memory is important, but what you want to look at even more than that is memory bandwidth and type. E.g. the Sapphire r9 280x Toxic edition has the Toxic cooler (3 fans, black and yellow colours) and 384-bit GDDR5 memory. GDDR5 is the standard memory type for modern GPUs, and 384-bit is the width of the memory bus on this particular card, which, together with the memory clock, determines its bandwidth.
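
If you want to put a rough number on it, peak bandwidth is just bus width times effective memory clock. A quick sketch (the 6000 MHz effective clock here is an assumed, typical figure for a stock r9 280x, just for illustration):

```python
# Peak theoretical memory bandwidth:
# (bus width in bits / 8) bytes per transfer, times the effective clock.
def memory_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_clock_mhz / 1000

# 384-bit GDDR5 at an assumed 6000 MHz effective clock (typical for a
# stock r9 280x) works out to 288 GB/s.
print(memory_bandwidth_gbs(384, 6000))  # 288.0
```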

To make things a little more basic, I'll put it this way. For the "current series" of graphics cards from nvidia and AMD you have the GTX 700 series cards from nvidia, and the Radeon r7 and r9 series cards from AMD. Basically, the 2nd number determines how good a card is within that series, e.g.:

nvidia: the GTX 780ti is better than the GTX 780, which is better than the GTX 770, which is better than the 760 and so on. (ti is short for "Titanium"; basically, if 2 cards have the same number but one has ti, that one is normally faster)

AMD: the r9 295x2 is better than the r9 290x/290, which is better than the r9 280x/280, which is better than the r9 270x/270, which is better than the r7 260x/260 etc.

The comparison between the 2 brands is tricky, as the VRAM differs on comparable cards. For the main 3 AMD cards, the 270(x), 280(x) and 290(x), the comparable cards are the GTX 760, GTX 770 and GTX 780 respectively. The cards in each pairing yield similar performance figures, but one may be better than the other in some games.

If you need any more info let me know :)

EDIT: As far as I know, the x in the AMD card names just means higher clocks; it's worth checking when comparing 2 cards of the same name from 2 different vendors (e.g. Sapphire Dual-X r9 280x vs ASUS DirectCU II r9 280x).
 
If you're looking at the r9 270x 4gb, then you should also look at the GTX 760. To get an idea of pricing, which country are you in, and do you have a link to the 270x you were looking at?
 
Hi. Thank you for your feedback! It's really helpful. Please see my comments below! Thanks!

OK, so DirectCU II is simply ASUS's cooling solution for the graphics card (and it's a very good one, mind).
My comment- I will include DirectCU II in my shopping list from now on.

Bigger memory is important, but what you want to look at even more than that is memory bandwidth and type. E.g. the Sapphire r9 280x Toxic edition has the Toxic cooler (3 fans, black and yellow colours) and 384-bit GDDR5 memory. GDDR5 is the standard memory type for modern GPUs, and 384-bit is the width of the memory bus on this particular card, which, together with the memory clock, determines its bandwidth.
My comment- So I guess the bigger the memory, the higher the graphics detail I can go, but bigger bandwidth also helps the card actually make use of that memory. Am I correct?

To make things a little more basic, I'll put it this way. For the "current series" of graphics cards from nvidia and AMD you have the GTX 700 series cards from nvidia, and the Radeon r7 and r9 series cards from AMD. Basically, the 2nd number determines how good a card is within that series, e.g.:
nvidia: the GTX 780ti is better than the GTX 780, which is better than the GTX 770, which is better than the 760 and so on. (ti is short for "Titanium"; basically, if 2 cards have the same number but one has ti, that one is normally faster)
My comment- I think I get the idea. Basically, the higher the number, the better the card. Speaking of dual graphics cards in one system, could I use two cards from different series? Are there any settings needed for the PC to recognise that there are two graphics cards it can use?

AMD: the r9 295x2 is better than the r9 290x/290, which is better than the r9 280x/280, which is better than the r9 270x/270, which is better than the r7 260x/260 etc.
My comment- The Toxic one runs at 1100 MHz, with a 384-bit bus & 6400 MHz effective memory, which is pretty amazing. Compared to that, the SAPPHIRE DUAL-X R9 270 I picked only has a 256-bit bus & 5600 MHz effective.
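
Plugging both sets of figures into that bus-width times clock arithmetic shows the gap plainly (just a rough sketch using the numbers quoted above):

```python
# Same bandwidth arithmetic as above, with the figures quoted for the two cards.
cards = {
    "Sapphire R9 280X Toxic": (384, 6400),  # bus width (bits), effective MHz
    "Sapphire Dual-X R9 270": (256, 5600),
}
for name, (bus_bits, eff_mhz) in cards.items():
    print(f"{name}: {(bus_bits / 8) * eff_mhz / 1000:.1f} GB/s")
# Sapphire R9 280X Toxic: 307.2 GB/s
# Sapphire Dual-X R9 270: 179.2 GB/s
```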

The comparison between the 2 brands is tricky, as the VRAM differs on comparable cards. For the main 3 AMD cards, the 270(x), 280(x) and 290(x), the comparable cards are the GTX 760, GTX 770 and GTX 780 respectively. The cards in each pairing yield similar performance figures, but one may be better than the other in some games.
My comment- It was difficult to choose one when I first looked at the model numbers. They all look very cool (GTX, OC & DirectCU II) and all seem like they will give amazing performance. I think I need to look at the specifications as well.

If you need any more info let me know :)

EDIT: As far as I know, the x in the AMD card names just means higher clocks; it's worth checking when comparing 2 cards of the same name from 2 different vendors (e.g. Sapphire Dual-X r9 280x vs ASUS DirectCU II r9 280x).
 


Hi CGurell

I recently upgraded my system; these are the new components I bought:

Intel i5 4670k
Gigabyte G1.SNIPER Z87 Motherboard
G Skill Ripjaws 8GB 1333MHz (From old system)
EVGA GeForce GTX650Ti 1GB
600W PSU

However, my GTX 650 Ti won't be able to handle games like Watch Dogs, Battlefield 4 and more on high graphics settings!
I might as well replace it & sell it to someone. I'm from the UK & looking for something around £200.
After receiving your feedback, I have amended my wishlist:
1st choice- Sapphire R9 270X 2GB Toxic 1100MHz
2nd choice- Sapphire Dual-X Radeon R9 270X OC 2GB GDDR5 Graphics Card with Boost
3rd choice- Sapphire R9 280X 3GB Toxic 1100MHz GDDR5 (Slightly out of my budget, but I assume it will last at least 3-4 years)

Will an Intel CPU work better with NVIDIA, & an AMD CPU work better with AMD Radeon?

Thank you for your time & helpful advice!
 


CPU and GPU brands don't really affect each other. Some AMD CPUs' integrated graphics can run in CrossFire with AMD GPUs, but I wouldn't worry about that, as it's mainly for lower-end/older cards.

The 280x you've chosen is a great card. It most probably won't run everything at max settings at 1920x1080 for 3-4 years (and remember, we're geeks, not psychics, so we don't know what's coming in terms of games and new hardware over the next 3-4 years), but it should run everything at max right now except for some games like Watch Dogs and Far Cry 3, and it will at least be able to run everything for 3-4 years, though I'm not sure at what settings. At any higher resolution it probably won't perform extremely well, but it should be alright.

Just a note: what make/model is your power supply? Wattage isn't everything; we need to know if it is reliable, efficient, and above all, safe.

Oh and Novatech have this offer on a Dual-X r9 280x right now (The card I have) http://www.novatech.co.uk/products/components/amdradeongraphicscards/amdr9280xseries/11221-00-20g.html?utm_source=facebook&utm_medium=socialmedia&utm_campaign=facebook
 


nvidia are a little stricter about SLI (their dual-graphics solution) than AMD are about CrossFire. Basically, nvidia will only let you use 2 cards in SLI if both cards are EXACTLY the same. I'm not saying SLI won't work outside of using 2 identical cards, but it's only guaranteed to work with identical cards.

AMD allow you to CrossFire any related cards: any r7 260x with any r7 260/r7 260x/HD 7790 (AMD's 7000 series cards were basically renamed and sold as the r7/r9 200 series cards, but for less money), any r9 270 with any r9 270/r9 270x/HD 7870, any r9 280 with any r9 280/280x/HD 7970, and any r9 290 with any r9 290/290x. (This works even if you get one card with more memory, although only the minimum memory of the two cards will be used; so if you have a 2gb r9 270x and a 4gb r9 270x, then only 2gb will be used.)
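
To make that memory rule concrete, here's a tiny sketch (the function is just illustrative, not any real API):

```python
# Under CrossFire (and SLI), each GPU mirrors the frame data, so the
# usable VRAM is the smallest card's memory, not the sum of both.
def usable_vram_gb(*cards_gb):
    return min(cards_gb)

print(usable_vram_gb(2, 4))  # a 2gb + 4gb r9 270x pair still behaves like 2gb
```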
 
If I were to chime in, it would be... [ Assuming you're a person who lives in the real world and money matters to you 😀 ]
effective clock speed > all.

Get something from a few series back that has a higher model number. So basically... we're on the 7-series or whatever of nvidia cards... Get a 480, or 2 480s in SLI... and you should be outperforming the latest, greatest £500 nvidia card (or at least keeping up) for a fraction of the price.
[ So a 480 is better than a... I dunno, '510'. Or a '680' is better than a 760. Don't take this for gospel. It's very easy to do a YouTube or Google search for '680 vs 760', and none of this is concrete -- because different games obviously have different requirements and use the drivers differently ]


In this PC, I'm running 2 GTX 275s [2-series cards -- we're now on 7, I think] in SLI... and I play, for example, Arma 3 @ 5040 * 1080.
They go on eBay for about £50 each...

eBay's very luck-of-the-draw... but still, so is dropping £500 on something that's going to go down in price in a few months, when nvidia comes out with some new 'must have' technology that turns out to be not as revolutionary [gasp] as their marketing department is telling everyone.

Just sayin'
 


This is a very compelling argument, but a couple of things to note:

AMD cards have recently been used to "mine" bitcoins, which essentially means they are left running at 100% 24/7 for weeks, even months, severely reducing the lifetime of the card.

Especially for nvidia cards, anything 400 series and under won't have any of the features that, say, a 600 or 700 series card will have (e.g. GeForce Experience)

There are a LOT of counterfeit cards on the market these days. If you see anything like this: http://www.ebay.co.uk/itm/GeForce-GTX650-2GB-384Bit-DDR3-PCI-Express-2-0x16-GK106-Video-Graphics-Card-HK-/351094611141?pt=PCC_Video_TV_Cards&hash=item51bede9cc5 (note the cooler is off and it doesn't come in manufacturer packaging; oh, and no version of this card exists with DDR3 memory like this person has listed), you know it's fake. Make sure to check the memory types (always go GDDR5) and simply be careful. If the person says the cards have been used in a mining rig, then stay away. Also note that you will probably have no warranty with a used card, compared to a new one.
 
To counter your point... GeForce Experience is software, not 'features' :)
And really? GeForce Experience? "The easiest way to update your drivers, optimise your games, and ... "

2-series cards support overclocking via the drivers, lol. GeForce Experience? I'll pass... But then I don't install all that other useless 'stuff' with the drivers - and wouldn't even if it was 'supported' on my card...

Personally, I'm capable of changing in-game visual settings / updating drivers myself -- :) I do agree/realise that this software may be of help to people unwilling to update drivers manually / tweak their settings, though. (But again, I disagree that convenience is worth several hundreds of £s)


You are correct about 'features', though, but the 'features' you speak of are mostly [I'd argue] either marketing blurb to get people to spend money, or supported via software [backwards compatibility, yay -- which, to me, means it makes no difference and comes down to raw pixel-pushing power again].

To clarify my point:
The old 2-series cards do not natively support DirectX 11 [but what has that stopped me doing? - nothing]. I can still play all the latest games in DX10 mode... and if I -really- wanted to, I could mod them to support it... because they get it through driver support. I'm also quite certain I've played DirectX 11-only games on this rig... there's a spacey one by Futuremark.


The newer cards have better architecture and native DX11 support, but [comparably] lack the raw power of the older series...
This is evident when you compare the series' power consumption.

Furthermore, the architecture refinements are less apparent [and effective/useful] on older games.


So, yes, a £1000 graphics card may outperform a £50 one at 1920*1080 on max settings...
...But we were expecting that, right?
And what about when you compare them both running an older game at an actually 'high' resolution, like, for example... 5040*1080?

Also, the real question [for me, when upgrading] is: do refinements in architecture justify the expense and increase performance, compared to older, cheaper cards in, say... SLI?

I'd disagree and say no... unless of course you don't care about money. [ In which case, I wish I were you. ]
 


Thing is, he's not looking at a £1000 GPU... he's looking at a £170 one. Yeah, £170 is more than £100, but £70 for better game support, Mantle support and a warranty? I know what I would pay...
 


Pardon my skepticism, but while you may be "running it" at that resolution, you're certainly not doing so with good performance or visuals, and that's not a very high bar to set.

What you've got is equivalent to two of today's R7 250s in CrossFire, which are worth about £50 brand new and so low-end they can get all the power they need from your PCIe slot.

If you get sucked into the marketing hype, that's your problem. Nothing is ever "revolutionary." But the performance-per-watt gain over the previous generation is all that really matters. As long as you're upgrading to a similarly-priced product at a cadence of about once every 2-4 years, you'll be keeping up.
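
Performance per watt is also easy to sanity-check yourself from review figures; a sketch with made-up numbers (the fps and wattage values below are purely hypothetical placeholders, not benchmarks):

```python
# Performance per watt = average fps / board power draw.
# All figures below are hypothetical placeholders, not measured results.
old_high_end = {"fps": 45, "watts": 220}
new_mid_range = {"fps": 60, "watts": 150}

for name, card in [("old high-end", old_high_end), ("new mid-range", new_mid_range)]:
    print(f"{name}: {card['fps'] / card['watts']:.2f} fps per watt")
# old high-end: 0.20 fps per watt
# new mid-range: 0.40 fps per watt
```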
 


Actually I am, & I do pardon your skepticism...

As I've pointed out... Arma at that resolution requires GPUs with raw rendering power for the large 3D scenes... not jazzy shader tricks and 2014 energy efficiency...


But you should know I run the games at, at least, high graphics settings across the board... I could list a long list of games that I can play at that resolution on high, but what's the point when I'm not here to prove anything to anyone, just to offer the OP a real, feasible alternative.

Again though, this is purely to my tastes... You should also know, I don't run anti-aliasing or any other post-processing effect applied after the game's been rendered, because I play all my games in competitive environments, where speed and pixel-perfect visual accuracy are more important to me than it looking like a blurry photograph.

Also, the people saying 'you need at least 4 gigs of graphics RAM' clearly don't know what they're talking about and have "bought into the marketing hype".

In my experience, more video memory lets people run with less texture pop, but I can still run the game on very high texture settings, and my cards have a comparably abysmal 800 MB of video memory... (ultra, if I were willing to put up with popping, which, again, because of my competitive nature... I am not.)



Is the above from experience? It's just that I'm actually shocked... It'll be nice to know when I do come to upgrade...
I get 50+ fps in Arma 3 on the previously mentioned settings... Will those 2 x R7 250s push that out at 5040*1050?
 


Indeed, perhaps you are right then... Still, it's not as if he needed another person saying 'oh, just get the latest generation of card', because he's probably already going to get 10 people copy+pasting your message. 😱) I'm offering an alternative if money _is_ an issue.

...And you don't have to go down the eBay route. If you shop around, there will be uneducated people practically giving those cards away because they're not 'latest generation'. I picked up an extra-cheap ex-display one when mine went... *gasp, no manual* 🙁
 


It's not from MY experience, but it's from the benchmarking done to formulate the following chart:
http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html

If two GTX 275s in SLI will do 50 FPS at 5040x1050 in Arma 3, then I don't see why two R7 250s in CrossFire wouldn't perform similarly. And, like I said, they're about the same price. So if we're considering £50 eBay cards, it seems better to buy £50-60 brand-new ones that perform about the same.
 


Thanks, that's all I wanted to know!
I'm, like you, very skeptical of what 'some guy' says on the internet... :)

Although it does make me wonder why _anyone_ on a single monitor needs to run more than an R7 250...
Still, my friend has recently ordered a latest-gen ATI card, so I'll be able to see it first-hand... which I hope will dispel my skepticism and save my wallet some 😱)

(sorry for hi-jacking your thread op)
 


It's true that we do often overestimate how much we actually "need" for 1080p gaming, especially at 60 Hz, but keep in mind that back when your SLI GTX 275s were new, that was a pretty high-end graphics array. If you were to spend the same today*, you'd have enough for a GTX 780, or almost two 770s in SLI. You'd be maxing out games at 1440p. That's twice what I'd recommend anyone spend today if they were gaming at 1080p.

* Caveat: I know US prices way, way better.