Rumor: PlayStation 4 Will Support 4K Resolution

Status
Not open for further replies.
Gigantor21 -- the difference between 1080p and 4K would be somewhere between minimal and barely noticeable on a screen that size. I'd guess the manufacturers wouldn't even bother producing 4K screens below ~40" until it becomes just as cheap to make them as 1080p screens [in which case, why not do it just to be able to print 4K on the box, whether the user will see a difference or not].
 
That's silly. Even decent desktops costing over $2,000 won't play recent games at 2160p/60 fps, so how is a relatively cheap console (it's supposed to cost nearly $600) going to reach that kind of performance?
 
Sure it will support 4k. Until SONY decides to pull support for it from the console a few hardware revisions down the line. Either because it is "still" too expensive, or because adoption of higher-than-1080p displays will be abysmal, or because the "technology wasn't ready for prime time", meaning performance will be as crappy or crappier than people are expecting.
 
just like how the PS3 can run games in 1080p...

(Probably one day they will actually release a game running in 1080p that isn't a crappy 2D title with crappy graphics.)
 
I think it's pretty much a given that next-gen consoles will be capable of rendering at the 4k resolution.

Current graphics cards are capable of doing that, so there's no reason the next-gen consoles won't be. Remember, when the Xbox 360 first came out, 1080p was still in its infancy; only then did we start seeing 1080p TVs, and they were extremely expensive at the time.

Of course, most games never ran at 1080p, and most next-gen console games won't run at 4k. But the hardware will almost certainly be capable of 4k in theory. It'd be very surprising if it wasn't.
 
Listen up, people: just try to ignore the corporations as much as possible. They come up with all this new stuff all the time, trying to make you feel like the technology you have now is not good enough. I'm not saying you should never buy anything new (I like to buy new stuff every once in a while), but don't feel like you have to waste your money on this new technology just because it's "new". What you have now is still pretty good.
 
[citation][nom]techguy911[/nom]4K is for video playback, not gaming. As for the price: minimum $30,000 for a 4K display ATM, too much for the average person; I don't see how anyone will pay that much for a display. Even 4K monitors at smaller sizes are in that price range. Movie theaters are using 4K ATM, but it's so expensive that small cities only have one 4K projector.[/citation]

I've seen 4K displays at *only* a little over $10,000.
 
[citation][nom]blazorthon[/nom]I've seen 4K displays at *only* a little over $10,000.[/citation]
I have seen those; they're not true 4K resolution, which is why they're $11K.

Toshiba 55ZL2: 3840 x 2160 pixels
True 4K: 4096 × 3112 pixels (4K projector)

And while 4K is not new, in Asia 8K has just been released.


 
[citation][nom]techguy911[/nom]I have seen those; they're not true 4K resolution, which is why they're $11K. Toshiba 55ZL2: 3840 x 2160 pixels. True 4K: 4096 × 3112 pixels (4K projector). While 4K is not new, in Asia 8K has just been released.[/citation]

3840x2160 is considered a 4K resolution (the lowest and widest 4K resolution, if I remember correctly), so your point doesn't work. It is a *true* 4K resolution, granted it has a different aspect ratio. There are several 4K resolutions.
 
[citation][nom]aragis[/nom]It won't matter if it's 1080p or 4K or even 8K unless you're looking at a very large TV (50+ inch) from just 1.5 meters away, and 95% of us DON'T do that.[/citation]
That's not true. A pixel spacing of 0.003 arc-degrees still looks better than a pixel spacing of 0.008 arc-degrees, even though the acuity limit of a healthy person is a pixel spacing of ~0.005 arc-degrees.
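For anyone who wants to plug in their own setup, here's a rough sketch of the geometry behind those arc-degree figures: the angular size of one pixel given a panel's diagonal, resolution, and viewing distance. The 50"/2 m numbers below are just example values, not anything from the thread.

```python
import math

def pixel_spacing_degrees(diag_inches, horiz_px, vert_px, distance_m):
    """Angular size of one pixel, in degrees, viewed head-on from distance_m."""
    aspect = horiz_px / vert_px
    # Panel width from the diagonal and the aspect ratio.
    width_m = (diag_inches * 0.0254) * aspect / math.sqrt(aspect ** 2 + 1)
    pitch_m = width_m / horiz_px  # center-to-center pixel pitch
    return math.degrees(2 * math.atan(pitch_m / (2 * distance_m)))

# Example (assumed numbers): a 50" set viewed from 2 m.
print(pixel_spacing_degrees(50, 1920, 1080, 2.0))  # 1080p
print(pixel_spacing_degrees(50, 3840, 2160, 2.0))  # 2160p: half the spacing
```

At these tiny angles the spacing scales almost exactly inversely with resolution, so a same-sized 2160p set simply halves the 1080p figure.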
 
[citation][nom]gigantor21[/nom]Random noob question: will 4k even be worth it for games on a 23-25" screen? I do most of my gaming and movie viewing in my bedroom, so...[/citation]
Depends on the viewing distance. From that, we can compare the angular resolution of a same-sized 1080p set and a 2160p set.
 
I'm kinda kidding with this comment, but with a resolution like that, you might be able to do away with anti-aliasing. Haha! Though I think it's true. The gargantuan cost of the screen itself is enough to rule that out, for now.

Not to mention the performance-to-quality ratio compared to just turning on 4x MSAA. I'm not sure which would look better at a given pixel density: 4x MSAA at 1080p, or 4x the 1080p pixels (i.e. 4K). I'm thinking the latter, but how about 4x SSAA (a big performance-eater) or 8x MSAA? I wonder.

Higher resolutions use up more VRAM due to a larger frame buffer, and also higher-resolution texture files and maybe even more polygons for models, I believe. Well, unless the game scales the field of view up with resolution, I think, but even then it may have to load more textures and models into a frame.
I remember learning that SSAA and MSAA (not sure about other AA technologies) eat up a lot of VRAM because the frame has to be rendered at a higher resolution into a buffer (e.g. a 4K-sized buffer for a 1080p frame at 4xAA) to get the samples needed to "blur" edges.
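A quick back-of-the-envelope illustration of why supersampled buffers get expensive. This assumes a simple 4-bytes-per-pixel color buffer only; a real renderer also carries depth/stencil and other render targets, so treat it as a lower bound:

```python
def color_buffer_mib(width, height, scale=1, bytes_per_pixel=4):
    """Color buffer size in MiB when rendering at `scale`x per axis."""
    return width * scale * height * scale * bytes_per_pixel / 2 ** 20

print(color_buffer_mib(1920, 1080))           # plain 1080p frame
print(color_buffer_mib(1920, 1080, scale=2))  # 4x SSAA: 2x per axis = 4x samples
print(color_buffer_mib(3840, 2160))           # native 2160p: same size as above
```

Which lines up with the point above: a 1080p frame at 4x supersampling occupies exactly the same buffer as a native 3840x2160 frame.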

Graphics cards have been getting more and more VRAM as time passes, it seems, but as for memory bandwidth, we may have to wait for the DDR4-based versions of GDDR (maybe GDDR6/7) or maybe Rambus' XDR2 (I just remembered rumors from last year that the HD 7970, I think, might've had XDR2).
I'm not sure how the performance of different kinds and levels of AA at 1080p compares to just rendering at 4K, but from what I've been reading in the comments, it seems like a lot of people think the latter is way more performance-intensive, and I bet there are benchmarks to prove it.

Sorry, just wanted to share some thoughts I had. Don't you guys think it's interesting? Hehe... I may be wrong about some things, especially how those AA modes work, but you're free to look into it more if interested. 🙂
 


4K is somewhere between about 8 MP and about 12 MP, depending on the exact resolution (remember, there are several 4K resolutions), so its performance can probably be simulated with Eyefinity/Surround setups that have similar pixel counts. It would be an interesting test to see what level of AA at 1080p and other resolutions would be similar in performance to 4K without AA.
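To put numbers on that 8-12 MP range, here are the pixel counts of the resolutions mentioned in this thread (the labels are my own shorthand for the variants discussed above):

```python
resolutions = {
    "1080p":        (1920, 1080),
    "UD/QFHD (4K)": (3840, 2160),
    "DCI 4K":       (4096, 2160),
    "4K projector": (4096, 3112),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")  # megapixels
```

The widest variant lands around 8.3 MP and the tallest around 12.7 MP, which is the spread behind the "8 to 12 MP" figure.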
 
I think I'd stick with the UD definition, 3840x2160, since it's exactly 4 times 1080p, though I'm not really sure whether a 16:10 4K would be more pleasing. One thing for sure, I think, is that it would have more pixels, which I guess I'd like. Another thing is how they say 16:10 follows the Golden Ratio and is more aesthetically pleasing and such, but that's personal preference, I would think.

Anyway, it (UD) may be simulated with four 1080p monitors in a 2x2 landscape Eyefinity, assuming the use of Eyefinity has no or negligible amounts of processing overhead involved. Doing so with 4x1 landscape and portrait (same number of pixels), which I'm not sure is really used due to crosshair and other center-screen-related issues, may skew the results a bit due to more CPU workload for some reason, especially for the landscape mode. I can only guess why, but I read this in a review of something I can't quite remember here in TH. It was mentioned that wider aspect ratios cause more of a CPU performance hit, and this was shown when they tried 4:3 and/or 5:4 ratios (I think) and 16:9 and/or 16:10 ratios. So I'm assuming that 64:9 and, to a lesser extent, 36:16 (or 9:4) would show this CPU performance hit more so than 16:9/10.

Maybe you, blaz, or someone else, may confirm this and elaborate. 🙂
 


If the 1080p displays are in a 2x2 setup, then altogether they could support a 3840x2160 resolution and should perform the same as it, because it would have the same aspect ratio and pixel count. Well, that's my theory; I'm not saying I'm correct until someone tests it.
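The arithmetic behind the 2x2 idea checks out at least on paper: four 1080p panels tile exactly into one UD frame with no change in aspect ratio. A quick sanity check (this says nothing about Eyefinity's actual driver overhead, which is the part that would need real testing):

```python
w, h = 1920, 1080               # one 1080p panel
grid_w, grid_h = 2 * w, 2 * h   # 2x2 landscape Eyefinity group

assert (grid_w, grid_h) == (3840, 2160)  # exactly the UD/QFHD resolution
assert grid_w * grid_h == 4 * w * h      # four times the pixels
assert grid_w / grid_h == w / h          # aspect ratio unchanged (16:9)
print("2x2 of 1080p tiles exactly into", grid_w, "x", grid_h)
```

A 4x1 group has the same pixel count but a 64:9 (landscape) or 9:4 (portrait) shape, which is where the wider-aspect CPU-hit question above comes in.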

As for which aspect ratio is best, I'd agree that 16:10 is better than 16:9 and 4:3, IMO.
 
The games for the next gen will be 1080p upscaled to Quad HD. It's just doubling in two directions, which is easier than the 1.5x scale from 720p to 1080p.
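A tiny sketch of why that 2x doubling is so cheap: each source pixel just becomes a 2x2 block, with no fractional sampling needed. This shows plain nearest-neighbor duplication; a real console scaler would typically filter as well:

```python
def upscale_2x(frame):
    """Nearest-neighbor 2x upscale: every pixel becomes a 2x2 block."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in (0, 1)]  # double horizontally
        out.append(doubled)
        out.append(doubled[:])                        # repeat row: double vertically
    return out

print(upscale_2x([[1, 2],
                  [3, 4]]))
```

By contrast, a 1.5x scale like 720p to 1080p maps source pixels onto fractional destination positions, so it has to interpolate rather than just duplicate.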
 
[citation][nom]bigdog44[/nom]BTW... AA probably wont be necessary at 4k resolutions...[/citation]
Compared to a same-sized 1080p monitor/TV that needed 4xAA, probably not, but it would still depend on the monitor/TV's screen size (or its DPI, more specifically) and your viewing distance, and most importantly on your personal preferences (including your visual acuity and aesthetic taste). 🙂
 
There's no way the PS4 games will be 4K resolution. Maybe while playing movies, but most definitely not the games. It would cost way too much.
 
[citation][nom]cheeba hawk[/nom]There's no way the PS4 games will be 4K resolution. Maybe while playing movies, but most definitely not the games. Would cost way too much.[/citation]

Upscaling. It wouldn't cost anything more than making the games for 1080p, because that's what they'd be made for. The graphics hardware wouldn't be able to handle actual 4K gaming even if the games were made for it.
 
[citation][nom]Anonymous[/nom]PS4 > PC, and it's $4,500 cheaper[/citation]

I can build a $500 PC that is better than the PS4 is supposed to be. Anyone with a PC costing beyond the $300-2,000 range is wasting money if it's a mere gaming rig. Furthermore, most PC games are cheaper than console games while having better graphics quality and frame rates thanks to the superior hardware, so playing on the PC is usually both cheaper and a better experience. There may be noteworthy exceptions where the game interface is different, such as motion-sensing setups, but that's a difference of preference, not necessarily quality. The PC can still arguably be the more *hardcore* gaming platform simply because mouse plus keyboard is still the most effective interface for most *hardcore* games. That doesn't necessarily mean more fun, since fun is still predominantly determined by personal preference and other qualities, but it does show that although consoles have their inherent advantages, so does the PC.

The fact that PC emulators for consoles keep coming out, letting the PC's superior performance show even in console games, puts even the above in hot water. For example, if you have even a decent PC, Wii emulators let you play Wii games with better graphics quality and performance than the Wii itself is capable of, and you can choose between Wiimotes and the mouse/keyboard interface, or use both. All the better, such emulators are available for free.
 