Gaming At 3840x2160: Is Your PC Ready For A 4K Display?

Page 2 - Tom's Hardware community discussion
Status
Not open for further replies.

Isaiah4110

Distinguished
Jan 12, 2012
603
0
19,010
Even given the fact that I will never be able to dump $6,000 into a new PC/monitor setup, I still find this article incredibly impressive for two reasons:

1) It shows how much room there still is for graphics to advance.

And (more importantly)

2) It shows undeniably how well the Nvidia GeForce GTX Titan scales in a dual-card configuration. I've always heard that while the theoretical gain from any dual-card setup is double the performance, in practice you realistically get at best a 50% improvement from the second card. Amazingly though, in five of the seven titles benchmarked at settings capable of stressing a GTX Titan, the frame rates of the SLI configuration almost exactly doubled those of the single card.
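A quick way to quantify the scaling being described is to express the second card's contribution as a fraction of a full card. This is an illustrative sketch with made-up frame rates, not the article's actual benchmark numbers:

```python
# Illustrative SLI scaling check, using hypothetical FPS figures.
def sli_scaling(single_fps: float, dual_fps: float) -> float:
    """Return the second card's contribution as a fraction of a full card.

    1.0 means perfect (2x) scaling; 0.5 means the second card adds
    only half a card's worth of performance.
    """
    return (dual_fps - single_fps) / single_fps

# Near-perfect scaling, as seen in 5 of the 7 titles:
print(round(sli_scaling(30.0, 59.0), 2))  # -> 0.97
# The "50% at best" rule of thumb the poster mentions:
print(round(sli_scaling(30.0, 45.0), 2))  # -> 0.5
```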
 

BigMack70

Distinguished
Mar 16, 2007
141
3
18,715
I am dying to hear more news about ASUS' 39" 4K VA panel that is supposed to come out early next year...

Can't wait to get gaming at 4K. Even if it means turning some settings down on my 780s, resolution > settings any day in my book.

Also, what's with the weird scaling issues in Crysis 3? That's probably the game that needs SLI scaling the most, and that's some of the worst SLI scaling I've seen in years.
 

merikafyeah

Honorable
Jun 20, 2012
264
0
10,790
My dream monitor is still a 5:4 display which is exactly double my current 1280x1024 monitor, so a 2560x2048 display, at about 30". Sadly, most video cards max out at 2560x1600 per single panel, but that will change soon once 4K gets going. I'm really hoping 5:4 makes a big comeback.

Even a 1920x1536 display would be better than any 16:9 display.
2560x2048 displays actually do exist, but they're all medical diagnostic monitors with several drawbacks (deal-killers):
1. They only come in 19"-21" varieties.
2. They cost $3,000-$12,000.
3. They're grayscale only. FML.

The human visual field is almost exactly 4:3, but 5:4 is the cleaner ratio, being an even 1.25 as opposed to 1.33333.... If 4:3 and 5:4 are the more natural ratios, though, how did 16:9 become so ubiquitous? Because of an arbitrary film ratio that came about when the soundtrack was added to the reel, clipping vertical space that should've been there. Curse you, outdated film! Oh, what could've been...
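For illustration, the ratios in question can be compared directly; the specific resolutions below are just examples of each aspect ratio, not claims about any particular product:

```python
# Compare the aspect ratios discussed above (illustrative resolutions).
resolutions = {
    "5:4  (2560x2048)": (2560, 2048),
    "4:3  (2048x1536)": (2048, 1536),
    "16:9 (3840x2160)": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    # 5:4 works out to an even 1.25; 4:3 to 1.3333...; 16:9 to 1.7778
    print(f"{name}: ratio {w / h:.4f}, {w * h / 1e6:.2f} Mpx")
```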
 

Plusthinking Iq

Honorable
Sep 11, 2013
547
1
11,060
"4k" gaming with aa??? is this site serius......its barely useful for low ress monitors, get it into your head aa its not a quality setting..

to my main point against these "4k" monitors, input laag, and not to mention only 60hz,
 

CaptainTom

Honorable
May 3, 2012
1,563
0
11,960
It's really weird a single 7970 wasn't used. I have overclocked mine to the point that it trades blows with a GTX 780 (for half the price).

Hell, an HD 7850 on medium settings would have been nice too, so we could really see the usability of a mid-range card...
 

godfather666

Distinguished
Aug 10, 2011
132
0
18,680
I would appreciate it if someone could clarify this issue for me. Are these games actually running at native 4K, or are they just being upscaled to that resolution? Do developers really create 4K textures? How does this work?
I feel like when I play a really old game, from 2004 for example, it doesn't look any different at 1080p than it does at 720p. I can understand you may no longer need anti-aliasing, but are you really seeing a higher level of detail? How is that possible?
Please explain.
Thanks!
 

gunbust3r

Distinguished
Nov 23, 2011
33
0
18,530
I think the sweet spot here is going to be the Chinese 4K "TVs" once they get them accepting 60 Hz input. The Seiki 39" that does 30 Hz is under $600 on sale. Asus is smoking crack if they think that $3,500 price is going to hold up on a 31" screen when 39" 60 Hz TVs are available.
 

hero1

Distinguished
May 9, 2012
841
0
19,060
Good article as usual. However, I'd like to know why you have AA and such enabled when it is not needed at such a resolution. Anandtech did a review and said it is not necessary to have it turned on, and that is the way I would play if I had a 4K monitor. I am sure that 2x GTX 780 running at a 1250 MHz GPU clock and 7400 MHz on the VRAM will be more than enough to get 60 FPS without the need for AA and such.
 
The reason 120-144 Hz is not catching on is that we lack CPUs that can drive games that fast. Heck, no matter what resolution you run Crysis 3 and Tomb Raider at on Ultra, you still see dips to around 40 fps here and there - on >4 GHz i7s.
 

cmi86

Distinguished
Strange you didn't let the 7970 and 7990 come play; it would have been fun to see. All the same, this is completely unrealistic and applies to no one except those with copious amounts of money and nothing to spend it on.
 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
I agree that the anti-aliasing option becomes academic once you hit 4K resolutions. Even at 5760x1080, I usually turn it off or set it to a minimal 2x in most games, since I prefer a sharper image over a blended one.
 

Ssateneth

Distinguished
Apr 28, 2007
4
0
18,510
Where can I get the updated PQ321Q firmware? I have a PQ321Q but still have problems booting/POSTing with my monitor in 4K mode. I'm hoping a firmware update can fix it, but I can't find one.
 

mapesdhs

Distinguished


What do you reckon, then, might be the cause of the lack of scaling with Crysis 3?

Ian.



 

Grandmastersexsay

Honorable
May 16, 2013
332
0
10,780
I can't see any pixels on my current displays. Chances are you can't either.

For 4K displays to make sense, I would have to get a display twice as large as the one I have now and sit just as close to it. Even if I could ever afford a 50"+ 4K monitor, I wouldn't want to put it on my desk two feet from my face. And if I actually had to lower the refresh rate to 30 Hz, games would look worse at any distance.

4K makes sense for advertising companies and display manufacturers, and that's about it.
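One way to put numbers on the "can you see the pixels?" question is pixels per degree of visual angle; roughly 60 px/deg corresponds to 20/20 acuity, and above that extra resolution is hard to perceive. A minimal sketch with illustrative sizes and viewing distances (not measurements from this thread):

```python
import math

# Rough pixels-per-degree calculator. All figures are illustrative.
def pixels_per_degree(diag_in: float, res_w: int, res_h: int,
                      distance_in: float) -> float:
    width_in = diag_in * res_w / math.hypot(res_w, res_h)  # panel width
    pitch_in = width_in / res_w                            # one pixel's width
    pixel_deg = math.degrees(2 * math.atan(pitch_in / (2 * distance_in)))
    return 1 / pixel_deg

# A 24" 1920x1200 monitor vs a 39" 3840x2160 panel, both viewed from 24":
print(round(pixels_per_degree(24, 1920, 1200, 24), 1))  # -> 39.5
print(round(pixels_per_degree(39, 3840, 2160, 24), 1))  # -> 47.3
```

Both values fall below the ~60 px/deg acuity limit at that distance, which is why the "you can't see the pixels anyway" claim depends heavily on how far away you sit.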
 


mapesdhs

Distinguished


It'll be interesting to see how Crysis 3 scales with the AMD cards once the drivers are sorted out.

Ian.

 

mapesdhs

Distinguished


Lots of CRTs can do native 2048x1536 (any based on the same 22" Sony Trinitron tube), and
a unit in decent condition will look nicer than most flat panels even now, especially any
TN model. The downsides, of course, are that such CRTs are large, heavy, use more power, are
generally no bigger than 22", and probably won't last that long after purchase (at best a
year or two), but they are dirt cheap if you want to try one, typically less than $130 at fixed
price, much less via auction or whatever. Examples include the Dell P1130 and all its
Dell/HP variants, and the SGI 5411; there's at least a dozen models which all use the same tube.
Those sold as "A-grade" are the only ones worth considering.

Having said that, one problem with such monitors is the stupid default Windows drivers,
which prevent one from accessing the 2K mode without some .ini fiddling; very annoying.
This was easy to do in XP, but I've not yet worked out an equivalent fix for Win7.

I used to play games at 2048x1536 on a 22" CRT (Dell P1130). I waited ages for flat panels
to be, IMO, finally good enough to justify the switch without being annoyed by the drop in
resolution or fidelity, eventually replacing the CRT with a 24" HP H-IPS 1920x1200 LP2475W,
which does look very good. I do miss the extra pixel height sometimes (Oblivion looked
great on the old P1130), but I certainly don't miss the hefty desktop footprint, etc.


Customer demand determines what the market goes with, but such behaviour is often
self-reinforcing. It wasn't that long ago that 1200-height monitors were pretty much the same
price and as easy to obtain as 1080 monitors, but then people just started going
with 1080p more and more, and pricing followed suit. For a while it was hard to find a
good 1200 display, though the influx of cheaper IPS models solved this, e.g. the Dell
U2412M is quite nice and well priced. The same happened with 2560x1440 vs. 2560x1600;
the latter always cost more, but now the 1600 models cost massively more. I wanted to
get a 1600 model for benchmarking, but I couldn't justify the cost; I bought a Dell 1440
instead, which does look OK. Thankfully, review articles now seem to be sticking more
with 1440 testing anyway.

Personally, I'd rather the PQ321Q wasn't referred to as a 4K display, because it's not.
If it were 4096x2300, then that'd be fine. Perhaps using 3840 just makes the tech easier
to sort out, and no doubt it makes the marketing easier, it being exactly twice the
width and height of 1080p.

Ian.

PS. There are 24" CRTs such as the Sony FW900, but the weight is insane. Needs two
people to move one.

 