Gaming At 3840x2160: Is Your PC Ready For A 4K Display?





There is still a problem with AMD cards at 4K. The good news is that AMD finally answered at one site:
We did finally get some feedback from AMD on the subject after quite a bit of back and forth. Essentially, AMD claims the problems we are seeing are due only to synchronization issues and NOT to bandwidth limitations. Since the file copies are done over the PCIe bus, only an instance of near-100% utilization on it would cause degradation, and the only instances of that would come from heavy system memory access. However, if you are accessing main system memory with the GPUs in your PC, then other performance bottlenecks are going to creep up before CrossFire scaling.

If that is the case, then AMD should be able to fix the CrossFire + Eyefinity issues in the coming weeks or months. A bandwidth issue would be much harder to deal with and could mean a fix never arrives for HD 7000-series users.

So there is no point in benchmarking AMD cards at 4K for now.
 

merikafyeah

Honorable
Jun 20, 2012


Uh, no. A 1920x1536 (5:4) display and a 1920x1080 (16:9) display give you an identical FOV on the x-axis, i.e. the horizontal image information is the same, but the 5:4 display also gives you more vertical image information (y-axis), so you can spot enemies dropping in from the sky or shoot down choppers and the like much more easily.

Also, the absolute image area is larger with 5:4 for any given monitor size; e.g., a 20" 5:4 display will have a larger viewing surface area than a 20" 16:9 display.
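
For anyone who wants to check those numbers, here is a quick back-of-the-envelope sketch in Python (it assumes exact aspect ratios and ignores bezels):

```python
import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    """Width, height, and area (inches / square inches) for a given diagonal and aspect ratio."""
    ratio = aspect_w / aspect_h
    height = diagonal_in / math.sqrt(1 + ratio ** 2)
    width = ratio * height
    return width, height, width * height

# Same horizontal pixel count, different aspect ratios:
h_pixels = 1920
print("5:4  vertical pixels:", h_pixels * 4 // 5)    # 1536
print("16:9 vertical pixels:", h_pixels * 9 // 16)   # 1080

# Same 20" diagonal, different aspect ratios:
for w, h in [(5, 4), (16, 9)]:
    width, height, area = panel_dimensions(20, w, h)
    print(f"{w}:{h} -> {width:.1f} x {height:.1f} in, about {area:.0f} sq in")
# 5:4  -> 15.6 x 12.5 in, about 195 sq in
# 16:9 -> 17.4 x 9.8 in, about 171 sq in
```

At the same 20" diagonal, the squarer panel really does come out with roughly 14% more surface area.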
 
Correct me if I'm wrong, but isn't 1080p plus strong AA (from SweetFX, which impacts FPS a lot less) good enough?
I mean, I use a 32-inch TV at those settings and sure, things aren't perfect, but how far are we going with this?
 

cypeq

Distinguished
Nov 26, 2009


You are a moron, aren't you? Just had to say it... screen width resolution does not equal viewing angle; it's only one variable in the equation. Your typical game engine will keep the vertical view angle constant to avoid stretching the image and will modify the horizontal FOV accordingly, so a wider display gets a higher FOV to keep the image proportional.

In old games, when monitors were predominantly 4:3, both FOV values were constant; launching a game like that (try StarCraft as an example) would get you stretching: vertical on 5:4 and horizontal on 16:9.

Very few games let you tamper with both H-FOV and V-FOV, leaving your 5:4 screen at a heavy disadvantage when playing, for example, FPS games, since by default it gets a smaller H-FOV than a regular 4:3 screen would.
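
A small sketch of that "hold vertical FOV, derive horizontal FOV" behavior (often called Hor+). The 60-degree vertical FOV is just an illustrative default; engines vary, and not all of them scale this way:

```python
import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    """Horizontal FOV when the vertical FOV is held constant (Hor+ scaling)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

vfov = 60  # illustrative vertical FOV in degrees
for name, aspect in [("5:4", 5 / 4), ("4:3", 4 / 3), ("16:9", 16 / 9)]:
    print(f"{name:>4}: horizontal FOV ~ {horizontal_fov(vfov, aspect):.1f} deg")
# 5:4  ~ 71.6 deg
# 4:3  ~ 75.2 deg
# 16:9 ~ 91.5 deg
```

Which is the point above: with the vertical FOV fixed, the 5:4 screen ends up with the narrowest horizontal view.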
 

flong777

Honorable
Mar 7, 2013
Wow, the next big thing in gaming. A rig to run this resolution will easily cost 7K. If you go with a high-end screen, you could be looking at a 10K budget. How many gamers are going to have a 10K budget for a gaming computer?
 

mohit9206

Distinguished

In most cases the games are upscaled. The only games that will benefit from this are newer, modern AAA titles, not old games from 2004 or indie games and the like. Also, games that have the option to enable high-res textures, either in-game or through mods, will benefit the most from this type of upscaling.
 

kloot

Distinguished
Nov 6, 2010
Hi, what does the 4K scenario look like for non-gamers? Which hardware would be required to run a resolution of 4x 1920x1200 for standard applications (no gaming, no video editing)? Keen to hear your thoughts!
 
Is it just me, or is the image that compares the FHD and UHD quality zoomed in to exaggerate the effect?
I mean, I barely notice a difference from 1600x1050 to 1920x1080, and I mean that I have to look for the changes to actually notice any.

Though it would be nice to take a look at a few games at those resolutions.
 

hixbot

Distinguished
Oct 29, 2007
How close would you need to sit to a 27" 4K monitor in order to benefit from the resolution? Your face would need to be less than a couple of feet from the screen if you have 20/20 vision.

4k interests me, but only on projectors and 100" televisions.
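
A rough way to estimate that distance, assuming 20/20 vision corresponds to resolving about one arcminute (a common rule of thumb rather than a hard limit):

```python
import math

def acuity_distance_inches(diagonal_in, h_pixels, v_pixels, arcmin=1.0):
    """Distance at which one pixel subtends `arcmin` arcminutes.
    Sit farther away than this and the individual pixels (and the extra
    resolution) stop being resolvable to a 20/20 eye."""
    ppi = math.hypot(h_pixels, v_pixels) / diagonal_in  # pixels per inch
    pixel_pitch = 1.0 / ppi                             # inches per pixel
    return pixel_pitch / math.tan(math.radians(arcmin / 60.0))

d = acuity_distance_inches(27, 3840, 2160)
print(f"27 in 4K: about {d:.0f} in ({d / 12:.1f} ft)")  # roughly 21 inches
```

Plugging a 100-inch 4K screen into the same formula gives roughly 6.5 feet.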
 

coolronz

Distinguished
Apr 20, 2008
Nice article, Chris. I'd have loved to see even just a single HD 7970... From my understanding of other articles, it has a lot to do with memory bandwidth too. Maybe AMD will fix their drivers, maybe not. Honestly, everything is fine in Eyefinity for me with 3-way CFX; only Crysis 3 and Metro 2033/Last Light go under 40 FPS, unless of course I run a single monitor on my 27", which honestly is still doable... May I suggest, if you do a follow-up article including AMD, that you also include a 2.5K monitor? That's more likely the sweet spot for most users here. Thanks.
 

geok1ng

Distinguished
Jun 25, 2008
The article takes on the challenge of explaining to non-experts why we don't have 4K @ 60 Hz today. TH is right when it states that the problem is that there is no internal monitor hardware (the part called the TCON) on the market today that can drive that many pixels simultaneously.

The article also shows the many issues with tiled displays, which seem worse than the 30 Hz single-display solutions from Chinese TVs (the Seiki and Skyworth brands). With custom resolutions and reduced blanking intervals, geeks have reached 38-45 Hz on these TVs, and we are very close to the magical 48 Hz for better 4K@24p movie playback on these devices. I thought these TVs worked with dual HDMI inputs, not a single HDMI input, when connected to a PC. After reading your article I come away with the impression that these TVs offer a better 4K experience as a monitor today, even with the lower refresh rates.

On the "minimum" graphics power for 4K, I beg to differ. TPU has done an impressive review of SLI on the 700 series, whose data made me recommend triple 4 GB 770s instead of SLI 780s. The price is the same, the frame buffer is bigger, and 3x 770s only lag behind 2x 780s in titles that do not scale beyond 2-way SLI.
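
For context on where those odd refresh rates come from: single-link HDMI 1.4 tops out at roughly a 340 MHz pixel clock, and the achievable refresh rate is that clock divided by the total (active plus blanking) pixels per frame. A rough sketch; the reduced-blanking frame size is illustrative, not an exact monitor timing:

```python
HDMI_14_LIMIT = 340e6  # Hz, approximate single-link HDMI 1.4 pixel-clock ceiling

def pixel_clock(h_total, v_total, refresh_hz):
    """Pixel clock needed for a given total frame size and refresh rate."""
    return h_total * v_total * refresh_hz

def max_refresh(h_total, v_total, clock_hz):
    """Highest refresh rate a given pixel clock can drive for that frame size."""
    return clock_hz / (h_total * v_total)

# The standard 4K30 timing uses a 4400 x 2250 total frame (3840x2160 active + blanking):
print(f"4K30 standard timing: {pixel_clock(4400, 2250, 30) / 1e6:.0f} MHz")  # 297 MHz

# Shrink the blanking intervals (illustrative 4000 x 2220 total) and the same
# 340 MHz link stretches to roughly 38 Hz; getting to 45-48 Hz also means pushing
# the pixel clock past the official limit, which is what the custom-resolution crowd does.
print(f"Reduced blanking at 340 MHz: {max_refresh(4000, 2220, HDMI_14_LIMIT):.1f} Hz")  # ~38 Hz
```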
 

Isaiah4110

Distinguished
Jan 12, 2012

Switching resolutions on the same monitor will simply get you fuzziness at the lower, non-native resolution.

The image of the eye used to compare the resolutions is zoomed to show what the difference in pixel size would be on two similarly sized monitors. So using your comparison, you would have to take a 24" monitor with a native 1600x900 resolution and place it side-by-side with a 24" monitor with a 1920x1080 native resolution. Then you would see the pixel size difference. You could make a similar picture to compare the screens on the iPad 2 and iPad 3 or 4.

P.S. I type this on a 24" monitor with a native 1920x1200 resolution and, if I look closely but still not nose-to-glass, I can make out the individual pixels.
 

merikafyeah

Honorable
Jun 20, 2012


Uh, no again. I was talking geometry, and suddenly you bring up game engine limitations. Those are two separate issues. The fact remains that a 5:4 display, given the same horizontal resolution (in pixels) as a 16:9 display, will give you more image information. See this:
http://en.wikipedia.org/wiki/Aspect_ratio_%28image%29#Visual_comparisons
Look at the last comparison. The image on the right is what you get when you play current games on 16:9 displays, whereas the image on the left is what you COULD be getting if you had a 5:4 display with the same horizontal pixel count as the 16:9 monitor.

Any properly made game today takes into account both fullscreen and widescreen displays, so there would never be any squeezing or stretching, only cropping. Image data can be lost, but not distorted. On the matter of distortions, it is far more likely for a game to be designed for fullscreen and not widescreen than vice versa. Far fewer games would be squeezed on fullscreen monitors than would be stretched on widescreen monitors, hence this is a win for 5:4. Any distortion going from 4:3 to 5:4 would be almost inconsequential compared to the distortion of going from 4:3 to 16:9.

You also ignored my second point about the absolute area of 5:4 vs 16:9 for a given diagonal, which is how all monitors are measured. The panel area of a 20" 16:9 display is mathematically smaller than the panel area of a 20" 5:4 display.
A smaller area means less image information available at the same PPI. Manufacturers prefer widescreen because it takes less material to reach a given diagonal and is therefore cheaper, but the end result is that you get a smaller display.
 

Isaiah4110

Distinguished
Jan 12, 2012


I think what cypeq is trying to say is that that is not how games, movies, TV, etc. have been done since the change from 4:3 to 16:9.

Video started with film. Once they started adding sound to film, they had to reduce the amount of data carried in the picture so there was room for the audio, which led to them "cutting off" the top and bottom strips of the frame. So movie theaters have long been widescreen format. Once TVs became prevalent, the video portion of a movie had to be modified to broadcast it over the TV airwaves, so they cut off the edges to make it fit a 4:3 aspect ratio. This is (obviously) why you always get a "this film has been reformatted to fit your screen" message along with the "edited for time and content" message when you watch a movie on TV. I remember every Rambo movie getting smashed horizontally when the credits started scrolling at the end.

Once DVDs came along, you had the option of buying the "widescreen" version, which gave you the full movie-theater picture but added black bars at the top and bottom of the screen to avoid stretching or squishing it. This made everything appear smaller.

Then along came HD TVs, channels, and video. Now, instead of just seeing the line of scrimmage and the linebackers when watching football on TV, you get to see all the way from the safeties to the deepest tailback. Now you get to see the full movie without the black bars. The thing is, they aren't taking the original movie everyone is used to and cropping out the top and bottom to make it fit; rather, they are taking the old TV versions and "adding back" the sides that had been cropped out to make them fit the 4:3 aspect ratio.

For a visual representation of what is happening you need to compare the two pictures marked "Two aspect ratios compared with images using the same height (vertical size):" in your wikipedia article.
 

soldier44

Honorable
May 30, 2013
I've been gaming at 2560x1600 for 3 years now. It's going to be at least another year or even two before I consider moving to a 4K display and the cards to support it.
 

merikafyeah

Honorable
Jun 20, 2012

Everything you've told me I've already known intrinsically for several years now. But it only applies to film and other fixed-image media, which have a pre-rendered boundary. Games are different in that there is no fixed image boundary, since everything is rendered in real time; hence the last comparison in the wiki article applies to what I've been saying, whereas the third comparison applies to film, as you've stated. If an FPS game had a fixed boundary, you wouldn't be able to look in any direction you wanted, and it wouldn't be much of a game.

Films look squeezed or stretched because the image doesn't fit. Games look squeezed or stretched because the renderer doesn't know how to properly handle the desired resolution. Same symptoms, different causes.
 

acortez

Honorable
Jul 29, 2013
Unlike the image shown at 1080p and again at 4K that illustrates the additional detail between the two formats, what games have real 4K content that we would now be seeing for the first time?
My guess is none, since all we are doing (for now) is upsampling original content.
So what are the "original" resolutions of the games tested?
Do we even know?
 


Quick test: type 'lt' or 'ix' or 'ly' in a response box on your screen. Chances are the letters are touching each other, causing annoyance to writers and code monkeys around the world. Then go to a nice high-resolution cell phone and type the same thing. *Bam*, there is a blank pixel between the letters, allowing proper separation between characters and much easier reading, even though the characters are physically smaller and even if you hold the device farther from your face. More pixels mean you can introduce scaling, which means you can more properly vectorize things like text so that it is always clean and crisp no matter how small it may be.

Or another test: take a normal 1080p screen, display an 8 MP image on it, and then print the same image on a regular 8.5x11 sheet of paper with a laser printer. A 1080p screen only has about 2 MP of resolution, while a laser printer can print at much higher densities. Hold the printout up next to your computer screen and look at how much more detail is visible on the paper compared to the image displayed on the screen. There is a HUGE difference in clarity and detail between the two. 4K may not get rid of this entirely, but it will bring a much more 'print-like' clarity and detail to the screen, much like high-resolution screens do for cell phones.
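
To put rough numbers on that comparison (the 600 dpi print density and the 3264x2448 sensor dimensions below are assumptions for illustration):

```python
# Rough pixel/dot counts behind the screen-vs-print comparison above.
sources = {
    "1080p display":          1920 * 1080,                  # ~2.1 MP
    "4K display":             3840 * 2160,                  # ~8.3 MP
    "8 MP photo":             3264 * 2448,                  # ~8.0 MP
    "8.5x11 print @ 600 dpi": int(8.5 * 600) * (11 * 600),  # ~33.7 million dots
}
for name, count in sources.items():
    print(f"{name:<24} {count / 1e6:5.1f} million pixels/dots")
```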

Last is my greatest annoyance ever. Take a good-quality 1080p Blu-ray of a good old film from before the 1970s and watch it critically on a good-quality 1080p monitor (not a TV, because TVs deliberately blur the image to hide this issue). If you are paying attention, you will notice two types of conflicting grain in the image. One is the film grain from the film stock they used... but then there is another, inconsistent grain from the digitization process! It is an issue of fine gradients of color where the digitizing cannot pick an exact color. This is highly prevalent in dark scenes, or scenes with a lot of grey. In 4K this still happens, but the digital grain becomes so small that your eye cannot differentiate individual pixels, which smooths out the image for you. That's right, not being able to differentiate individual pixels is a GOOD thing, not a waste.

And another thing: you would not want a 50" 4K monitor 2' from your face, as that would offer no greater clarity than a standard monitor. At 2' away you should be looking at a ~35-42" monitor in 4K, which is large to be sure, but should still be within your field of vision while offering higher pixel density and clarity.

And above all else: if you think it is a waste, then don't buy one. But claiming that it is not an improvement over the crap we are forced to deal with every day and have grown accustomed to only shows your own literal and figurative blindness. It is going to be a good five years or so before I can afford or justify buying one, but it is definitely a step forward, and not something to put off once it becomes readily available.
 

Bondfc11

Honorable
Sep 3, 2013
If you cannot game at 120 Hz, it is a waste. Sorry, but 4K at even 60 Hz is not going to look as smooth and clean as a 120 Hz 2560x1440 monitor. My Overlord Tempest will blow this screen away thanks to its smooth action and clean look. 4K is years away from being a proper alternative, since low refresh rates limit your FPS, and FPS is king.
 

freedom fighter

Honorable
Aug 3, 2013


The higher the resolution, the more visible the ugly steps along lines positioned close to the horizontal or vertical direction on a display. Moreover, you need to perform AA on transparent textures too... I can't go without AA anymore, sorry.
 

Raheel Hasan

Honorable
Apr 17, 2013


No, that is not right: the LOWER the resolution, the more visible the ugly steps along lines. When you increase the resolution, the size of each pixel shrinks and you cannot see the tiny edges, so AA is not needed at such a high resolution. Even at 2560x1440 it is very difficult to notice the difference between 2x AA and 4x AA.
 