If it runs Skyrim, why not Witcher 2?

skyrimmer69

Jun 21, 2012
So I'm trying to enjoy playing my newly downloaded Witcher 2. I don't get it! I firmly believe that Skyrim is the game by which you can judge your computer's gaming capabilities: if you can play Skyrim on medium to high settings, you can play any game out there. So what's the deal with the lag and turtle-slow save loading in Witcher 2? I have a Radeon HD 4300/4500 Series card and it runs Skyrim perfectly. The minimum requirements for Witcher 2 say a Radeon HD 3850. Well, I have better than that, right? So why won't it run like Skyrim and my other Steam games?
 
You have it the wrong way round, m8: if you can run Witcher 2 on medium-high, you can play anything on medium. Skyrim was never used as a yardstick for measuring a PC's ability, for two reasons. First, it was seriously bugged at release, so much so that the community ended up patching it long before the original developers did. Second, it isn't a graphically demanding game; most mid-range cards can easily max it out. The Witcher 2, on the other hand, has ubersampling, which no graphics card can max.
As for getting it running: turn the settings down. Ubersampling is a system killer. Turn off most of the high-end graphics options and the game will run on a mid-to-low-end card, which is what you have.
Oh, and the 3850 is actually better than your 4300/4500, because it's an x8xx-tier card while yours is an x3xx/x5xx card.
 

skyrimmer69

Jun 21, 2012
That's new info to me. I always thought the higher the number, the better, whether it's a CPU, sound card, video card, whatever. But I hear you all saying that's not so, that the numbers are, well, just numbers and mean very little for judging which is the latest (and therefore the better) version. Huh. Go figure. I'll try turning everything down. Thanks again.
 
With Radeons, the first digit is the generation, the next digit shows where the card sits within that generation's lineup, and the digit after that does the same within that tier. So a 38xx is an older generation than a 45xx, but an x7xx sits a tier below an x8xx in its own lineup, which is why your low-tier 4300/4500 loses to the 3850. There are exceptions from renaming, too: the 6770, for example, is just a rebadged 5770.
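If it helps to see that digit breakdown spelled out, here's a tiny Python sketch of how I read those HD model numbers. The generation/tier/minor split just follows the convention described above; actual performance still depends on the chip, and rebrands like the 5770/6770 break the pattern.

```python
# Rough decoder for Radeon HD 4-digit model numbers (e.g. "3850", "4550").
# First digit  = generation, second digit = tier within that generation,
# remaining digits = minor position within the tier.
def decode_radeon(model: str):
    generation = int(model[0])
    tier = int(model[1])
    minor = int(model[2:])
    return generation, tier, minor

for model in ("3850", "4350", "4550", "5770"):
    gen, tier, minor = decode_radeon(model)
    print(f"HD {model}: generation {gen}, tier {tier}, minor {minor}")

# The point: the HD 3850 is an older generation (3) than the 4350/4550 (4),
# but its tier digit (8 = high end of its line) beats their 3/5 (low end),
# which is why it's the stronger card despite the smaller first digit.
```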
 

gunzrx

Jul 15, 2012


It used to be like that (even then, I very much doubt it).

Nowadays, all companies pretty much re-release their cards for a few generations, especially at the lower end.

The famous 8800 GT was also the 9800 GT, I think, and a 200-series card!

It usually goes like this: the first digit (for both AMD and Nvidia) is the generation number, and the digits after it tell you the tier.

Usually the top few cards offer better power draw, a few new features (which mostly just pull money out of your wallet), etc.

The rest are refreshes. Nvidia also likes suffixes: a GT is the lowest, then GTS, then GTX at the top.

A few in the mid-range also get a Ti label, which is better than the regular GTX of the same number, e.g. the GTX 560 vs the GTX 560 Ti.
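To make that suffix pecking order concrete, here's a small Python sketch. The GT < GTS < GTX ordering and the Ti bump follow the naming convention described above; it's only a naming heuristic, not a benchmark.

```python
# Rough ranking of Nvidia card names by suffix and model number.
# GT < GTS < GTX, and a "Ti" version sits above the plain card of the same name.
SUFFIX_RANK = {"GT": 0, "GTS": 1, "GTX": 2}

def rank_nvidia(name: str):
    parts = name.upper().split()          # e.g. ["GTX", "560", "TI"]
    suffix, number = parts[0], int(parts[1])
    has_ti = "TI" in parts[2:]
    # Compare model number first, then suffix, then the Ti bump.
    return number, SUFFIX_RANK.get(suffix, -1), has_ti

cards = ["GT 440", "GTS 450", "GTX 560", "GTX 560 Ti"]
for a, b in zip(cards, cards[1:]):
    print(f"{a} < {b}: {rank_nvidia(a) < rank_nvidia(b)}")
```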

It can get confusing, but usually the more money, the better. Use this chart, though:

www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html

And you can safely assume that the higher-numbered cards in the same category are better.

One more thing, though: you can get previous-gen cards really cheap, to the point where the value is just self-evident.

An example would be a GTX 580 for 170, almost new, on eBay.

To get significantly better performance you'd have to go for a GTX 670 or a higher-end Radeon HD 7000-series card, both costing upwards of twice as much!
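Just to spell out the value argument with rough numbers: the prices below are ballpark figures based on this thread, and the relative performance values are placeholders I made up purely to show the arithmetic, not benchmark results.

```python
# Back-of-envelope value comparison with illustrative numbers only:
# a used GTX 580 around 170 vs a new GTX 670 at roughly 2x that price.
# "relative_perf" values are made-up placeholders, NOT benchmarks.
cards = {
    "GTX 580 (used)": {"price": 170, "relative_perf": 1.00},
    "GTX 670 (new)":  {"price": 400, "relative_perf": 1.25},
}

for name, c in cards.items():
    per_cost = c["relative_perf"] / c["price"]
    print(f"{name}: {c['relative_perf']:.2f}x perf at {c['price']} "
          f"-> {per_cost:.4f} perf per unit of cost")
```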