22" 1680x1050 or 22" 1920x1080?

Lordegor3

I currently own a 22" LG LED monitor at 1680x1050, and I have the option to get a 22" BenQ GW2250 LED at 1920x1080.
I am a gamer, currently running a GTX 670 OC'd to 1267/3454.
If I switch to 1920x1080, what would the performance impact be? How big an FPS drop should I expect?
On the 1680x1050 I can max out every game (Crysis 3: 34-40 fps, Tomb Raider: 50-60 fps, AC3: stable 60 fps).
And would switching give me a visibly better picture while playing? I mean, is it worth the performance impact, since both are 22"?
 
The picture quality would be a nice improvement, but you will lose some frames for it. Possibly 10-15 FPS on average, depending on how graphically demanding the game is. I play AC3 with a 1080p monitor and a GTX 670 and have no performance issues (max settings, 60 fps with vsync), but Crysis 3 may take a hit as it is pretty intense (I don't own Crysis 3, but that's what benchmarks and videos suggest). Not sure about Tomb Raider though.
 
It won't be very much. I'd say 5 fps is being optimistic. You are going from about 1.76M pixels to 2.07M, roughly 18% more. It won't be anything you'll notice.
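If you want to sanity-check that pixel math, here's a rough back-of-the-envelope sketch. It assumes FPS scales inversely with pixel count when you're fully GPU-bound, which is a worst-case simplification; real games rarely scale that cleanly:

```python
# Worst-case assumption: FPS scales inversely with pixel count
# when the GPU is the bottleneck (real games usually drop less).
old_pixels = 1680 * 1050   # 1,764,000
new_pixels = 1920 * 1080   # 2,073,600

ratio = new_pixels / old_pixels   # ~1.18x more pixels to render
for fps in (60, 45, 35):
    print(f"{fps} fps -> ~{fps / ratio:.0f} fps")
```

So even in the worst case, 60 fps only drops to roughly 51; in practice the drop is usually smaller, since not every frame is purely fill-rate limited.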

I didn't feel like going through a lot of benchmarks because it's tiring, but Google is your friend, and here are two for Crysis 3.

[Benchmark chart: Crysis 3 at 1680x1050 (1680-Med-A.png)]

[Benchmark chart: Crysis 3 at 1920x1080 (1920-Med-A.png)]
 
There will definitely be a performance hit, but if you're using it to play games, I suggest getting a 144 Hz monitor with the LightBoost hack, since you already have an Nvidia card.


You'll always be able to drop the resolution if you want, and for games that run well past 120 fps, you'll get amazing clarity and buttery-smooth gameplay.
 
I have an i5-4670K, CL9/1866 RAM, and a 7870 GHz Edition, and I get about 42-46 fps in Tomb Raider on Ultimate with FXAA, all shadows on Ultra, and TressFX.
Everything I have is at stock speeds, at 1080p.
A 670 could handle it.
If under 60 fps is unacceptable, then stick with sub-1080p.
If you want better quality with a slight performance hit, then upgrade; I would not pay full price just to move up from 1680x1050.
Anyone correct me, but I thought ASUS monitors were better?
 


40 fps and up is acceptable, even 35+ sometimes. I played Crysis 3 with all settings on Ultra and TXAA x4 at 34-60 fps on 1680x1050; good enough.
 


From those pics I see that I should upgrade to 1080p; a 5 fps hit in Crysis 3 is nothing.
But if I won't notice any difference, it's not worth it.
 


Once I get an Ethernet cable run to the room where my PC is, I will be able to tell you what numbers I get in Crysis 3 at 1080p on a card slower than an OC'd GTX 670.
 


If you mean actually going into a store, I'd be very careful about what I look at. Display models usually have their settings jacked up to the brightest, most eye-popping levels. Personally, I hate Dynamic Contrast; it drowns everything out, and things don't look as sharp and detailed to me.
 
Well, after googling, I found that the higher resolution is better: since the higher resolution (1080p) sits on the same 22" panel (like my 1680x1050), the pixel density is higher, so I would not need as much AA, which means more FPS. Correct me if I'm wrong.
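If you want to put a number on that density difference, here's a quick sketch. It takes the 22" diagonal for both panels from the thread, which is a simplification, since many "22-class" 1080p monitors are actually 21.5" (which would push the density slightly higher):

```python
# Quick pixel-density comparison; assumes both panels are true
# 22-inch diagonals as stated in the thread.
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a given resolution and panel diagonal."""
    return hypot(width_px, height_px) / diagonal_inches

print(f'1680x1050 @ 22": {ppi(1680, 1050, 22):.0f} PPI')  # ~90
print(f'1920x1080 @ 22": {ppi(1920, 1080, 22):.0f} PPI')  # ~100
```

About 10% more pixels per inch does look a bit crisper at normal viewing distance, but in my experience it's not enough to skip AA entirely; you'd probably still want at least FXAA at 1080p.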
 


Well, I already own the monitor. My parents got it really cheap for their new PC, so I can give them mine, the 16:10; they don't have any problem with that. I was just wondering if the performance impact is worth it.
 
